
Pollution, Concentration and Mortality

by Clifford E Carnicom
Mar 19 2016

A preliminary analytical model has been developed to estimate the impact of increased concentrations of atmospheric fine particulate pollution (PM 2.5) upon mortality rates. The model is a synthesis between an analysis of measured pollution levels (PM 2.5) and published increased mortality estimates. The model is based, in part, upon previous investigations as published in the paper “The Obscuration of Health Hazards: An Analysis of EPA Air Quality Standards”, Mar 2016.

Models for both concentration levels and visibility have now been developed; for a related model in terms of visibility, please see the paper entitled Pollution, Visibility and Mortality, Mar 2016.

Preliminary Concentration – Exposure – Mortality Model

A substantial database of direct field measurements of atmospheric fine particulate matter in the southwestern United States, acquired during the winter of 2015-2016, has been assembled. The measurements reveal clear relationships between the quality of the air, the PM 2.5 concentration levels, the visibility of the surrounding territory, and the existence or absence of airborne aerosol operations.

The field data shows that repeated instances of PM 2.5 counts in the range of 30-60 ug/m3 are not unusual in combination with active atmospheric aerosol operations; visibility and health impacts are obvious under these conditions. The PM 2.5 count will invariably be less than 10 (or even 5) ug/m3 under good air quality conditions.

Additional studies based upon this acquired data may be conducted in the future. Numerous published studies document relationships between small increases in PM 2.5 pollution and increased mortality.

Measured PM 2.5 Count, 44 ug/m3.

As an example of the use of this model, if the PM 2.5 count is 44 ug/m3, as shown in the example above, and if the number of days of exposure at this level is approximately 50, then the estimated increase in annual mortality is approximately 17%. This is an extreme increase in mortality, but under observed conditions in various locales it is not beyond the range of consideration. It is thought that reasonably conservative approaches have been adopted within the modeling process.
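For readers who wish to check this arithmetic, a minimal sketch follows (Python; illustrative only). It uses the combined linear-log relation derived in “The Obscuration of Health Hazards” later in this document, with the per-10 ug/m3 increase scaled linearly with concentration, as is done in that paper; the function name is simply a convenience of this sketch.

import math

def mortality_increase_pct(days, concentration_ug_m3):
    # Combined linear-log relation (per 10 ug/m^3 of PM 2.5), scaled linearly with concentration.
    per_10_ug = 1.65 + 0.007 * days + 0.48 * math.log(days)
    return per_10_ug * (concentration_ug_m3 / 10.0)

print(round(mortality_increase_pct(days=50, concentration_ug_m3=44), 1))   # ~17.1 % increase in annual mortality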

The field data that has been collected and this model further highlight the serious deficiencies in the Air Quality Index (AQI) currently in use by the U.S. Environmental Protection Agency (EPA). In light of the current understanding of the health impacts of small changes in PM 2.5 counts (e.g., 10 ug/m3), a scale that gives equal prominence to values as high as 500 ug/m3 (catastrophic conditions) is an incredible disservice to the public. Please see the earlier referenced papers for a more thorough discussion of the schism between public health needs and the reporting systems that are in place.

This researcher advocates the availability of direct and real-time fine particulate matter concentration levels (PM 2.5) to the public; this information should be as readily available as current weather data is.  Cost and technology are no longer major barriers to this goal.

 

Active Aerosol Operation
City of Rocks, Southern N.M.

Demonstration of the Impact of Aerosol Banks Upon Visibility.
Concentration Levels and Subsequent Visibility Changes
Directly Impact Mortality.

As an incidental note, it may be recalled from earlier work that there is a strong conceptual basis for the development and application of surveillance systems that are dependent upon atmospheric aerosol concentrations. This application is only one of many that have been proposed over a period of many years, and readers may refer to additional details on this subject within the research library. Documentaries produced by this researcher (Aerosol Crimes, Cloud Cover) during the last decade also elaborate on those analyses. The principles of LIDAR apply here.

Current field observations continue to reinforce this hypothesis. Observations in the southwestern U.S. indicate that two locale types appear to be preferred targets for application: large urban areas and the border region between the U.S. and Mexico. These locations, considered jointly, suggest that both people and the monitoring or tracking of those same people within an area may be a technical and strategic priority of the project. A citizen-based, systematic and sustained nationwide monitoring program of PM 2.5 concentrations over a sufficient time period could clarify this issue further.

The recent papers on the subject of air quality are intended to raise the awareness and involvement of the public with respect to environmental and health conditions. There are very real relationships between how far you can see, the concentration levels of particulates in the atmosphere, and ultimately our mortality. It is our responsibility as stewards, as well as in our own best interest, to not deliberately and wantonly contaminate the planet.

Clifford E Carnicom
Mar 19, 2016


Pollution, Visibility and Mortality
by
Clifford E Carnicom
Mar 12 2016

A preliminary empirical model has been developed to estimate the impact of diminished visibility and fine particulate pollution upon mortality rates.  The model is a synthesis between an analysis of measured pollution levels (PM 2.5), observed visibility levels and published increased mortality estimates.  The model is based, in part, upon previous investigations as published in the paper “The Obscuration of Health Hazards: An Analysis of EPA Air Quality Standards”, Mar 2016.

 

Preliminary Visibility – Exposure – Mortality Model

Air pollution has many consequences.  One of the simplest of these consequences to understand is that of mortality and the degradation of health.  It would be prudent for each of us to be aware of the sources of pollution in the atmosphere, and their subsequent effects upon our well-being.  Measurement, monitoring and auditing of airborne pollution is within range of the general public, and it is increasingly imperative that citizens participate in these actions.  The role of public service agencies to act on behalf of public health needs and interests has not been fulfilled, and we must all understand and react to the consequences of that neglect.

This particular model places the emphasis upon what can be directly observed with no special means, and that is the visibility of the surrounding sky.  Visibility levels are a direct reflection of the particulate matter that is in the atmosphere, and relations between what can be seen (or not seen, for that matter) and the concentration of pollution in the atmosphere can be established.  The relationships are observable, verifiable and are well known for their impacts upon human health, including that of mortality.

All models are idealized representations of reality.  Regardless of variations in the modeling process, it can be confidently asserted that there are direct physical relationships between particulate matter in the atmosphere, the state of visibility, and your health.   There are, of course, many other relationships of supreme importance, but the objective of this article is a simple one.  It is: to look, to be aware of your surroundings, to think, to act, and to participate. The luxury of perpetual ignorance, and the damage that results from it, cannot be dismissed or excused.

The call for awareness is a fairly simple one here.  I encourage you to become engaged, if for nothing else than the sake of your own health.  When this has been achieved, you are in a position of strength to help others and to improve our world.  This generation has no right or privilege to deny the depths of nature to those that will follow us.


 

Models are one thing, real life is another.  It is time to assume your place.

Sincerely,

Clifford E Carnicom
Mar 12, 2016


The Obscuration of Health Hazards:
An Analysis of EPA Air Quality Standards

by
Clifford E Carnicom
Mar 12 2016

A discrepancy between measured and observed air quality in comparison to that reported by the U.S. Environmental Protection Agency under poor conditions in real time has prompted an inquiry into the air quality standards in use by that same agency. This analysis, from the perspective of this researcher, raises important questions about the methods and reliability of the data that the public has access to, and that is used to make decisions and judgements about the surrounding air quality and its impact upon human health. The logic and rationale inherent within these same standards are now also open to further examination. The issues are important as they have a direct influence upon the perception by the public of the state of health of the environment and atmosphere. The purpose of this paper is to raise honest questions about the strategies and rationales that have been adopted and codified into our environmental regulatory systems, and to seek active participation by the public in the evaluation process.  Weaknesses in the current air quality standards will be discussed, and alternatives to the current system will be proposed.

Particulate Matter (PM) has an important effect upon human health.  Currently, there are two standards for measuring the particulate matter in the atmosphere, PM 10 and PM 2.5.  PM 10 consists of material less than 10 microns in size and is often composed of dust and smoke particles, for example.  PM 2.5 consists of materials less than 2.5 microns in size and is generally invisible to the human eye until it accumulates in sufficient quantity.  PM 2.5 material is considered to be a much greater risk to human health as it penetrates deeper into the lungs and the respiratory system.  This paper is concerned solely with PM 2.5 pollution.

As an introduction to the inquiry, curiosity can certainly be called to attention with the following statement by the EPA in 2012, as taken from a document (U.S. Environmental Protection Agency 2012, 1) that outlines certain changes made relatively recently to air quality standards:

“EPA has issued a number of rules that will make significant strides toward reducing fine particle pollution (PM 2.5). These rules will help the vast majority of U.S. counties meet the revised PM 2.5 standard without taking additional action to reduce emissions.”

Knowing and studying the “rule changes” in detail may serve to clarify this statement, but on the surface it certainly conveys the impression of a scenario whereby a teacher changes the mood in the classroom by letting the students know that more of them will be passing the next test.  Even better, they won’t need to study any harder and they will still get the same result.

In contrast, the World Health Organization (WHO) is a little more direct (World Health Organization 2013, 10) about the severity and impact of fine particle pollution (PM 2.5):

“There is no evidence of a safe level of exposure or a threshold below which no adverse health effects occur. The exposure is ubiquitous and involuntary, increasing the significance of this determinant of health.”

We can, therefore, see that there are already significant differences in the interpretation of the impact of fine particle pollution (especially from an international perspective), and that the U.S. EPA is not exactly setting a progressive example toward improvement.

Another topic of introductory importance is that of the AQI, or “Air Quality Index” that has been adopted by the EPA (“Air Quality Index – Wikipedia, the Free Encyclopedia” 2016).  This index is of the “idiot light” or traffic light style, where green means all is fine, yellow is to exercise caution, and red means that we have a problem.  The index, therefore, has the following appearance:

[EPA Air Quality Index (AQI) color-coded scale]
There are other countries that use a similar type of index and color-coded scheme.  China, for example, uses the following scale (“Air Quality Index – Wikipedia, the Free Encyclopedia” 2016):

[China air quality index color-coded scale]

As we continue to examine these scale variations, it will also be of interest to note that China is known to have some of the most polluted air in the world, especially over many of the urban areas.

Not all countries, jurisdictions or entities, however, use the idiot-light approach that employs an arbitrary scaling method removed from the actual PM 2.5 pollution concentrations, such as the scales shown from the United States and China above.  For example, the United Kingdom uses a scale (“Air Quality Index – Wikipedia, the Free Encyclopedia” 2016) that is dependent upon actual PM 2.5 concentrations, as is shown below:

[United Kingdom air quality index scale]
Notice that the PM 2.5 concentration for the U.K. index is directly accessible and that the scaling for the index is dramatically different from that for the U.S. or China.  In the case of the AQI used by the U.S. and China (and other countries as well), a transformed scale runs from 0 to 300-500, with concentration levels that are generally more obscure and ambiguous within the index.  In the case of the U.K. index, the scale reports a specific PM 2.5 concentration level directly, with a maximum (i.e., ~70 ug/m^3) that is far below that incorporated into the AQI (i.e., 300 – 500 ug/m^3).

We can be assured that if a reading of 500 ug/m^3 is ever before us, we have a much bigger problem on our hands than discussions of air quality.  The EPA AQI is heavily biased toward extreme concentration levels that are seldom likely to occur in practical affairs; the U.K. index gives much greater weight to the lower concentration levels that are known to directly impact health, as reflected by the WHO statement above.

Major differences in the scaling of the indices, as well as their associated health effects, are therefore hidden within the various color schemes that have been adopted by various countries or jurisdictions.  Color has an immediate impact upon perception and communication; the reality is that most people will seldom, if ever, explore the basis of such a system as long as the message is “green” under most circumstances that they are presented with.  The fact that one system acknowledges serious health effects at a concentration level of  50 – 70 ug/m^3 and that another does not do so until the concentration level is on the order of 150 – 300 ug/m^3 is certainly lost to the common citizen, especially when the scalings and color schemes chosen obscure the real risks that are present at low concentrations.

The EPA AQI system appears to have its roots in history rather than in simplicity and directness in describing the pollution levels of the atmosphere, especially as it relates to the real-time known health effects of even short-term exposure to lower-concentration PM 2.5 levels.  The following statement (“Air Quality Index | World Public Library” 2016) acknowledges weaknesses in the AQI since its introduction in 1968, yet the methods have nevertheless been perpetuated for more than 45 years.

“While the methodology was designed to be robust, the practical application for all metropolitan areas proved to be inconsistent due to the paucity of ambient air quality monitoring data, lack of agreement on weighting factors, and non-uniformity of air quality standards across geographical and political boundaries. Despite these issues, the publication of lists ranking metropolitan areas achieved the public policy objectives and led to the future development of improved indices and their routine application.”


A system of color coding scaled to extreme and rarefied levels, built upon an averaged and biased scale rather than one that directly reports the PM 2.5 concentration levels in real time, is an artifact that is divorced from current observed measurements and from the knowledge of the impact of fine particulates upon human health.

The reporting of PM 2.5 concentrations directly along with a more realistic assessment of impact upon human health is hardly unique to the U.K. index system. With little more than casual research, at least three other independent systems of measurement have been identified that mirror the U.K. maximum scaling levels along with the commensurate PM 2.5 counts. These include the World Health Organization, a European environmental monitoring agency, and a professional metering company index scale (World Health Organization 2013, 10) (“Air Quality Now – About US – Indices Definition” 2016) (“HHTP21 Air Quality Meter, User Manual, Omega Engineering” 2016, 10).

As another example to gain perspective between extremes and maximum “safe” levels of PM 2.5 concentrations, we can recall an event that occurred in Beijing, China in January 2013, as reported by the New York Times (Wong 2013).  During this extreme situation, the U.S. Embassy monitoring equipment registered a PM 2.5 reading of 755, and the story certainly made news as the levels blew out any scale imaginable, including those that set maximums at 500.

A statement later in the article, referencing World Health Organization standards, may be the lasting impression that we should carry forward from this horrendous event:

“The World Health Organization has standards that judge a score above 500 to be more than 20 times the level of particulate matter in the air deemed safe.”

Notwithstanding the fact that the WHO also states that there is no evidence of any truly “safe” level of particulate matter in the atmosphere, we can nevertheless back out from this statement a maximum “safe” level for the PM 2.5 count, as assessed by the WHO, of approximately 25 ug/m^3 (i.e., 500 / 20).  This statement alone should convince us that we must pay close attention to the lower levels of pollution that enter into the atmosphere, and that public perception should not be distorted by scales and color schemes that only register concern when the numbers run into the hundreds.

Let us gain a further understanding of how low concentration levels and small changes affect human health and, dare I say, mortality. The case for low PM 2.5 concentrations being seriously detrimental to human health is strong and easy to make.  Casual research on the subject will uncover a host of research papers that quantify increased mortality rates in direct relationship to small changes in PM 2.5 concentrations, usually expressing a change in mortality per 10 ug/m^3.  Such papers are not operating in the arena of scores to hundreds of micrograms per cubic meter, but on the order of TEN micrograms per cubic meter.  This work underscores the need to update the air quality standards, methods and reporting to the public based upon current health knowledge, instead of continuing a system of artifacts based upon decades-old postulations.

These papers will refer to both daily mortality levels as well as long-term mortality based upon these “small” increases in PM 2.5 concentrations.  The numbers are significant from a public health perspective.  As a representative article, consider the following paper published in Environmental Health Perspectives in June of 2015, under the auspices of the National Institute of Environmental Health Sciences (Shi et al. 2015):

 

[Shi et al. 2015: title and abstract]

 

with the following conclusions:

 

[Shi et al. 2015: conclusions]

 

as based upon the following results:

 

[Shi et al. 2015: results]

 

Let us therefore assume a more conservative increase of 2% in mortality for a short-term exposure (i.e., 2-day) per TEN (not 12, not 100, not 500 per AQI scaling) micrograms per cubic meter.  Let us assume a mortality increase of 7% for long-term exposure (i.e., 365 days).

Let us put these results into further perspective.  A sensible question to ask is, given a certain level of fine particulate pollution introduced into the air for a certain number of days within the year, how many people would die as a consequence of this change in our environment?  We must understand that the physical nature of the particulates is being ignored here (e.g., toxicity, solubility, etc.) other than that of the size being less than 2.5 microns.

The data results suggest a logarithmic form of influence, i.e., a relatively large effect for short-term exposures, followed by a more gradual impact for long-term exposure.  A linear model is the simplest approach, but it is also likely to be too modest in modeling the mortality impact. For the purpose of this inquiry, a combined linear-log form will be adopted as a reasonably conservative approach.

The model developed, therefore, is of the form:

Mortality % Increase (per 10 ug/m^3) = 1.65 + 0.007 * (days) + 0.48 * ln(days)

The next step is to choose the activity level and time period for which we wish to model the mortality increase.  Although any scenario within the data range could be chosen, a reasonably conservative approach will also be adopted here.  The scenario chosen will be to introduce 30 ug/m^3 of fine particulate matter into the air for 10% of the days within a year.

The model will therefore estimate a 3.6% increase in mortality for 10 ug/m^3 of introduced PM 2.5 materials (36.5 days).  For 30 ug/m^3, we will then have a 10.9% increase in mortality.  As we can see, the numbers can quickly become significant, even with relatively low or modest PM 2.5 increases in pollution.

Next we transform this percentage into real numbers. For the year 2013, the Centers for Disease Control (CDC) reports that 2,596,993 people died from all causes combined (“FastStats” 2016).  Applying the 10.9% increase to this number results in 283,072 additional projected deaths per year.
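The scenario arithmetic above can be reproduced with the same relation; a brief sketch follows (Python; the 2013 CDC total is taken directly from the text).

import math

def mortality_increase_pct(days, concentration_ug_m3):
    # Same combined linear-log relation as in the sketch given earlier in this document.
    per_10_ug = 1.65 + 0.007 * days + 0.48 * math.log(days)
    return per_10_ug * (concentration_ug_m3 / 10.0)

days = 0.10 * 365                        # elevated levels for 10% of the year (~36.5 days)
pct = mortality_increase_pct(days, 30)   # 30 ug/m^3 of introduced PM 2.5
deaths_2013 = 2_596_993                  # CDC, all causes, 2013
print(round(pct, 1), round(deaths_2013 * pct / 100))   # ~10.9 %, ~283,000 additional projected deaths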

Continuing to place this number into perspective, it exceeds the number of deaths that result from stroke, Alzheimer’s, and influenza and pneumonia combined (i.e., the 5th, 6th, and 8th leading causes of death) during that same year.  The number is also much higher than the death toll from Chronic Obstructive Pulmonary Disease (COPD), which is now curiously the third leading cause of death.

We should now understand that PM 2.5 pollution levels are a very real concern with respect to public health, even at relatively modest levels.  Some individuals might argue that such a scenario could never occur, as the EPA has lowered the annual PM 2.5 standard to 12 ug/m^3.  The enforcement and sensitivity of that measurement standard is another discussion that will be reserved for a later date.  Suffice it to say that the scenario chosen here is not unrealistic for consideration, and that it is in the public’s interest to engage in this discussion and examination.

 


 

The next issue of interest to discuss is that of a comparison between different air quality scales in some detail.  In particular, the “weighting”, or influence, of lower concentration levels vs. higher concentration levels will be examined.  This topic is important because it affects the interpretation by the public of the state of air quality, and it is essential that the impacts upon human health are represented equitably and with forthrightness.

The explanation of this topic will be considerably more detailed and complex than the former issues of “color coding” and mortality potentials, but it is no less important.  The results are at the heart of the perception of the quality of the air by the public and its subsequent impact upon human health.

To compare the different scales of air quality that have been developed, we must first equate them.  For example, if one scale ranges from 1 to 6, and another from 0 to 10, we must “map”, or transform, them such that the scales are of equivalent range.  Another need in the evaluation of any scale is to look at the distribution of concentration levels within that same scale, and to compare this on an equal footing as well.  Let us get started with an important comparison between the EPA AQI and alternative scales that deserve equal consideration in the representation of air quality.

Here is the structure of the EPA AQI in more detail (U.S. Environmental Protection Agency 2012, 4).

 

AQI Category  AQI Numeric Range  AQI Rank  PM 2.5 (ug/m^3), 24-hr avg.
Good  0-50  1  0-12
Moderate  51-100  2  12.1-35.4
Unhealthy for Sensitive Groups  101-150  3  35.5-55.4
Unhealthy  151-200  4  55.5-150.4
Very Unhealthy  201-300  5  150.5-250.4
Hazardous  301-500  6  250.5-500

 

Now let us become familiar with three alternative scaling and health assessment scales that are readily available and that acknowledge the impact of lower PM 2.5 concentrations to human health:

 

United Kingdom Index U.K. Nomenclature PM 2.5 ug/m3 24 hr avg.
1 Low 0-11
2 Low 12-23
3 Low 24-35
4 Moderate 36-41
5 Moderate 42-47
6 Moderate 48-53
7 High 54-58
8 High 59-64
9 High 65-70
10 Very High >=71

 

Now for a second alternative air quality scale, this being from Air Quality Now, a European monitoring entity:

 

Air Quality Now EU Rank  Nomenclature  PM 2.5 (1 Hr)  PM 2.5 (24 Hr)
1 Very Low 0-15 0-10
2 Low 15-30 10-20
3 Medium 30-55 20-30
4 High 55-110 30-60
5 Very High >110 >60

 

And lastly, the scale from a professional air quality meter manufacturer:

 

Professional Meter Index Nomenclature PM 2.5 ug/m^3 Real Time Concentration
0 Very Good 0-7
1 Good 8-12
2 Moderate 13-20
3 Moderate 21-31
4 Moderate 32-46
5 Poor 47-50
6 Poor 52-71
7 Poor 72-79
8 Poor 80-89
9 Very Poor >90

 

We can see that the only true common denominator between all scaling systems is the PM 2.5 concentration.  Even with the acceptance of that reference, there remains the issue of “averaging” a value, or acquiring maximum or real time values.  Setting aside the issue of time weighting as a separate discussion, the most practical means to equate the scaling system is to do what is mentioned earlier:  First, equate the scales to a common index range (in this case, the EPA AQI range of 1 to 6 will be adopted).  Second, inspect the PM 2.5 concentrations from the standpoint of distribution, i.e., evaluate these indices as a function of PM 2.5 concentrations.  The results of this comparison follow below, accepting the midpoint of each PM 2.5 concentration band as the reference point:

PM 2.5 (ug/m^3) EPA AQI UK EU (1hr) Meter
1-10 1 1 1 1
10-20 2 1.6 1 2.1
20-30 2 2.1 2.2 2.7
30-40 2 2.1 3.5 3.2
40-50 3 3.2 3.5 3.2
50-60 3 4.3 3.5 4.3
60-80 4 5.4 4.8 4.9
80-100 4 6 4.8 6
100-150 4 6 6 6
150-200 4 6 6 6
200-250 5 6 6 6
250-300 5 6 6 6
300-400 6 6 6 6
400-500 6 6 6 6

 

This table reveals the essence of the problem; the EPA AQI’s skew toward high concentrations, which diminishes awareness of the health impacts from lower concentrations, can be seen within the tabulation.
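The mapping procedure described above can be made concrete with a short sketch (Python; the band boundaries are transcribed from the four tables above, each native index is rescaled linearly onto the common 1 to 6 range, and a single band midpoint is evaluated). For 45 ug/m^3, the midpoint of the 40-50 band, it returns the values shown in that row of the tabulation.

def band_index(pm, bands):
    # bands: list of (upper_bound, native_index); returns the index of the first band
    # whose upper bound covers the given PM 2.5 concentration.
    for upper, idx in bands:
        if pm <= upper:
            return idx
    return bands[-1][1]

def rescale(idx, lo, hi):
    # Map a native index range [lo, hi] linearly onto the common 1-6 range.
    return 1 + (idx - lo) * 5.0 / (hi - lo)

EPA   = [(12, 1), (35.4, 2), (55.4, 3), (150.4, 4), (250.4, 5), (500, 6)]
UK    = [(11, 1), (23, 2), (35, 3), (41, 4), (47, 5), (53, 6), (58, 7), (64, 8), (70, 9), (float("inf"), 10)]
EU1HR = [(15, 1), (30, 2), (55, 3), (110, 4), (float("inf"), 5)]
METER = [(7, 0), (12, 1), (20, 2), (31, 3), (46, 4), (50, 5), (71, 6), (79, 7), (89, 8), (float("inf"), 9)]

pm = 45  # ug/m^3, midpoint of the 40-50 band in the comparison table
print(band_index(pm, EPA),                               # 3    (EPA AQI rank, already on 1-6)
      round(rescale(band_index(pm, UK),    1, 10), 1),   # 3.2  (U.K. index)
      round(rescale(band_index(pm, EU1HR), 1, 5),  1),   # 3.5  (EU one-hour index)
      round(rescale(band_index(pm, METER), 0, 9),  1))   # 3.2  (professional meter index)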

This same conclusion will be demonstrated graphically at a later point.

Now that all air quality scales are referenced to a common standard (i.e., the PM 2.5 concentration), the general nature of each series can be examined via a regression analysis.  It will be found that a logistic function is a favored functional form in this case, and the results of that analysis are as follows:

EPA Index (1-6) = 5.57 / (1 + 2.30 * exp(-0.016 * PM 2.5))
Mean Square Error = 0.27

Mean (UK – EU – Meter) Index (1-6) = 6.03 / (1 + 5.65 * exp(-0.046 * PM 2.5))
Mean Square Error = 0.01

The information now of value in evaluating the weighting distribution applied to the various concentration levels is the integration of the logistic regression curves as a function of bandwidth.  The result of the integration process (Int.) applied to the above regressions is as follows:

PM 2.5 Band  EPA AQI (Int.) [Index * PM 2.5]  Mean (UK-EU-Meter) Index (Int.) [Index * PM 2.5]  % Relative Overweight or Underweight of PM 2.5 Band Contribution Between EPA AQI and Mean Alternative Air Quality Index Scale (Endpoint Bias Removed)
1-10 16.1 10.1 +42%
10-20 19.8 15.8 +27%
20-30 21.9 21.6 +8%
30-40 24.1 28.3 -10%
40-50 26.3 35.2 -27%
50-60 28.5 41.5 -39%
60-80 63.6 98.0 -47%
80-100 72.1 110.4 -46%
100-150 211.7 295.0 -32%
150-200 243.7 300.8 -16%
200-250 261.7 301.4 -8%
250-300 270.7 301.5 -4%
300-400 551.8 603.0 -2%
400-500 555.9 603.0 0%
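The two integral columns above can be checked numerically. The sketch below (Python; a simple midpoint-rule quadrature of the two logistic fits given earlier, with an arbitrary step count) reproduces those columns to the one-decimal rounding shown; the final percentage column additionally depends on the endpoint-bias normalization noted in its heading and is not recomputed here.

import math

def epa_fit(pm):
    # Logistic fit to the EPA AQI rank (1-6) as a function of PM 2.5, from the regression above.
    return 5.57 / (1 + 2.30 * math.exp(-0.016 * pm))

def mean_fit(pm):
    # Logistic fit to the mean UK-EU-Meter index (1-6), from the regression above.
    return 6.03 / (1 + 5.65 * math.exp(-0.046 * pm))

def band_integral(f, lo, hi, n=1000):
    # Midpoint-rule quadrature of an index curve over a PM 2.5 band.
    h = (hi - lo) / n
    return h * sum(f(lo + (i + 0.5) * h) for i in range(n))

bands = [(1, 10), (10, 20), (20, 30), (30, 40), (40, 50), (50, 60), (60, 80),
         (80, 100), (100, 150), (150, 200), (200, 250), (250, 300), (300, 400), (400, 500)]

for lo, hi in bands:
    print(lo, hi, round(band_integral(epa_fit, lo, hi), 1), round(band_integral(mean_fit, lo, hi), 1))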

 

A graph of a regression curve fit to the % Relative Overweight/Underweight data in the final column of the table above is as follows (band interval midpoints selected; standard error = 4.1%).

 

[Graph: EPA Underweight Function, Feb 09 2016]

 

And thus we are led to another interpretation regarding the demerits of the EPA AQI.  The EPA AQI scaling system unjustifiably under-weights the harmful effects of PM 2.5 concentrations that are most likely to occur in real-world, real-time, daily circumstances.  The scale over-weights the impacts of extremely low concentrations that have little to no impact upon human health.  And lastly, when the PM 2.5 concentrations are at catastrophic levels and the viability of life itself is threatened, all monitoring sources, including the EPA, are in agreement that we have a serious situation.  One must seriously question the public service value of such a distorted and disproportionate representation of this important monitor of human health, the PM 2.5 concentration.

 


 

Let us proceed to an additional serious flaw in the EPA air quality standards, and this is the issue of averaging the data. It will be noticed that the current EPA standard for PM 2.5 air quality is 12 ug/m^3, as averaged over a 24-hour period. On the surface, this value appears to be reasonably sound, cautious and protective of human health. A significant problem, however, occurs when we understand that the value is averaged over a period of time, and is not reflective of real-time dynamic conditions that involve “short-term” exposures.

To begin to understand the nature of the problem, let us present two different scenarios:

Scenario One:

In the first scenario, the PM 2.5 count in the environment is perfectly even and smooth, let us say at 10 ug/m^3. This is comfortably within the EPA air quality standard “maximum” per a 24 hour period, and all appears well and good.

Scenario Two:

In this scenario, the PM 2.5 count is 6 ug/m^3 for 23 hours out of 24 hours a day. For one hour per day, however, the PM 2.5 count rises to 100 ug/m^3, and then settles down back to 6 ug/m^3 in the following hour.

Instinctively, most of us will realize that the second scenario poses a significant health risk, as we understand that maximum values may be as important as (or even more important than) an average value. One could equate this to a dosage of radiation, for example, where a short-term exposure could produce a lethal result, but an average value over a sufficiently long time period might persuade us that everything is fine.

And this, therefore, poses the problem that is before us.

In the first scenario, the weighted average PM 2.5 count over a 24 hour period is 10 ug/ m^3.

In the second scenario, the weighted average PM 2.5 count over a 24 hour period is 10 ug/m^3.

Both scenario averages are within the current EPA air quality maximum pollution standards.

Clearly, this method has the potential for disguising significant threats to human health if “short-term” exposures occur on any regular basis. Observation and measurement will show that they do.
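The equality of the two averages is easy to verify; a minimal sketch follows (Python; hourly values exactly as described in the two scenarios above).

scenario_one = [10] * 24          # a perfectly even 10 ug/m^3 for every hour of the day
scenario_two = [6] * 23 + [100]   # 6 ug/m^3 for 23 hours, 100 ug/m^3 for one hour

avg_one = sum(scenario_one) / 24
avg_two = sum(scenario_two) / 24  # 9.92 before rounding
print(round(avg_one), round(avg_two))   # 10 10 -> both sit within the 12 ug/m^3 averaged standard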

Now that we have seen some of the weaknesses of the averaging methods, let us look at an additional scenario based upon more realistic data, but that continues to show a measurable influence upon human health. The scenario selected has a basis in recent and independently monitored PM 2.5 data.

The situation in this case is as follows:

This model scenario will postulate that the following conditions occur for approximately 10% of the days in a year. For that period, let us assume that for 13.5 hours of the day the PM 2.5 count is essentially nil at 2 ug/m^3. For the remaining 10.5 hours of the day during that same 10% of the year, let us assume the average PM 2.5 count is 20 ug/m^3. The range of the PM 2.5 count during the 10.5-hour period is from 2 to 60 ug/m^3, but the average of 20 ug/m^3 (representing a significant increase) will be the value required for the analysis. For the remainder of the year, very clean air will be assumed at a level of 2 ug/m^3 for all hours of the day.

A more extended discussion of the nature of this data is anticipated at a later date, but suffice it to say that the energy of sunlight is the primary driver for the difference in the PM 2.5 levels throughout the day.

The next step in the problem is to determine the number of full days that correspond to the concentration level of 20 ug/m^3, and also to provide for the fact that the elevated levels will be presumed to exist for only 10% of the year.  The value that results is:

0.10 * (365 days) * (10.5 hrs / 24 hrs) = 16 full days of 20 ug/m^3 concentration level.

As a reference point, we can now estimate the increase in mortality that will result for an arbitrary 10 ug/m^3 (based upon the relationship derived earlier):

Mortality % Increase (per 10 ug/m^3) = 1.65 + 0.007 * (16 days) + 0.48 * ln(16 days)

and

Mortality % Increase (per 10 ug/m^3) = 3.1%

The increase in this case is 18 ug/m^3 (20 ug/m^3 – 2 ug/m^3), however, and the mortality increase to be expected is therefore:

Mortality % Increase (per 18 ug/m^3 increase) = 1.8 * 3.1% = 5.6%.

Once again, to place this number into perspective, we translate this percentage into projected deaths (as based upon CDC data, 2013):

0.056 * (2,596,993) = 145,431 projected additional deaths.
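The full chain of arithmetic can be reproduced in a few lines (Python; the same linear-log relation as in the earlier sketches, with the CDC 2013 total taken from the text).

import math

def mortality_increase_pct(days, concentration_ug_m3):
    # Same combined linear-log relation as in the earlier sketches in this document.
    per_10_ug = 1.65 + 0.007 * days + 0.48 * math.log(days)
    return per_10_ug * (concentration_ug_m3 / 10.0)

effective_days = 0.10 * 365 * (10.5 / 24)           # ~16 full days at the elevated level
pct = mortality_increase_pct(effective_days, 18)    # 18 ug/m^3 net increase (20 - 2)
print(round(effective_days), round(pct, 1))                 # 16 days, ~5.6 %
print(round(2_596_993 * round(pct, 1) / 100))               # ~145,400 projected additional deaths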

This value is essentially equivalent (again, curiously) to that of the third leading cause of death, namely Chronic Obstructive Pulmonary Disease (COPD), with a reported death toll for 2013 of 149,205.

It is understood that a variety of factors ultimately determine mortality rates; however, this value may help to put the significance of “lower” or “short-term” exposures to PM 2.5 pollution into perspective.

It should also be recalled that the averaging of PM 2.5 data over a 24 hour period can significantly mask the influences of such “short-term” exposures.

A remaining issue of concern with respect to AQI deficiencies is its accuracy in reflecting real world conditions in a real-time sense. The weakness in averaging data has already been discussed to some extent, but the issue in this case is of a more practical nature. Independent monitoring of PM 2.5 data over a reasonably broad geographic area has produced direct visible and measurable conflicts in the reported state of air quality by the EPA.

After close to twenty years of public research and investigation, there is no rational denial that the citizenry is subject to intensive aerosol operations on a regular and frequent basis. These operations are conducted without the consent of that same public. The resulting contamination and pollution of the atmosphere is harmful to human health.  The objective here is to simply document the changes in air quality that result from such a typical operation, and the corresponding public reporting of air quality by the EPA for that same time and location.

Multiple occasions of this activity are certainly open to further examination, but a representative case will be presented here in order to disclose the concern.

 

Typical Conditions for a Non-Operational Day.
Sonoran National Monument – Stanfield AZ


Aerosol Operation – Early Hours
Jan 19 2016 – Sonoran National Monument – Stanfield AZ


Aerosol Operation – Mid-Day Hours
Jan 19 2016 – Sonoran National Monument – Stanfield AZ

 


EPA Website Report at Location and Time of Aerosol Operation.
Jan 19 2016 – Sonoran National Monument – Stanfield AZ
Air Quality Index : Good
Forecast Air Quality Index : Good
Health Message : None

Current Conditions : Not Available
(“AirNow” 2016)

 

The PM 2.5 measurements that correlate with the above photographs are as follows:

With respect to the non-operational day photograph, clean air can and does exist at times in this country, especially in the more remote portions of the southwestern U.S. under investigation.  It is quite typical to have PM 2.5 counts from 2 to 5 ug/m^3, which fall under the category of very good air quality by any index used.  Low PM 2.5 counts are especially prone to occur after periods of heavier rain, as the materials are purged from the atmosphere.  The El Niño pattern has been particularly influential in this regard during the earlier portion of this winter season.  Visibility conditions of the air are a direct reflection of the PM 2.5 count.

On the day of the aerosol operation, the PM 2.5 counts were not low and the visibility down to ground level was highly diminished.  The range of values throughout the day was from 2 to 57 ug/m^3, with the low values occurring prior to sunrise and after sundown.  The highest value of 57 occurred during mid-afternoon.  A PM 2.5 value of 57 ug/m^3 is considered poor air quality by many alternative and contemporary air quality standards, and the prior discussions on mortality rates for “lower” concentrations should be consulted above.  This high value has no counterpart, thus far, during non-aerosol-operational days.  From a common sense point of view, the conditions recorded by both photograph and measurement were indeed unhealthy.  Visibility was diminished from a typical 70+ miles in the region to a level of approximately 30 miles during the operational period.  Please refer to the earlier papers (Visibility Standards Changed, March 2001 and Mortality vs. Visibility, June 2004; also additional papers) for additional discussions related to these topics.

The U.S. Environmental Protection Agency reports no concerns, no immediate impact, nor any potential impact to health or the environment during the aerosol operation at the nearest reporting location.

 


Summary:

This paper has reviewed several factors that affect the interpretation of the Air Quality Index (AQI) as it has been developed and is used by the U.S. Environmental Protection Agency (EPA). In the process, several shortcomings have been identified:

1. The use of a color scheme strongly affects the perception of the index by the public. The colors used in the AQI are not consistent with what is now known about the impact of fine particulate matter (PM 2.5) to human health. The World Health Organization (WHO) acknowledges that there are NO known safe levels of fine particulate matter, and the literature also acknowledges the serious impact of low concentration levels of PM 2.5, including increased mortality.

2. The scaling range adopted by the AQI is much too large to adequately reveal the impact of the lower concentration levels of PM 2.5 to human health. A range of 500 ug/m^3 attached to the scale when mortality studies acknowledge significant impact at a level of 10 ug/m^3 is out of step with current needs by the public.

3. The underweighting of the lower PM 2.5 concentration levels, relative to more contemporary scales that adequately emphasize their effects, obscures health impacts which deserve more prominent exposure.

4. The AQI numeric scale is divorced from actual PM 2.5 concentration levels. The arbitrary scaling has no direct relationship to existing and actual concentrations expressed as mass-to-volume ratios. The actual conditions of pollution are therefore hidden by an arbitrary construct that obscures the impact of pollution to human health.

5. The AQI is a historic development that has been maintained in various incarnations and modifications since its origin more than 45 years ago. The method of presentation and computation is obscure and appears to exist as a legacy of the past rather than directly portraying pollution health risks.

6. The averaging of pollution data over a time period that filters out short-term exposures of high magnitude is unnecessary, and it hinders public awareness of the actual conditions of exposure.

7. Air quality information presented through the authorized portal appears at times to conflict with actual field observations, data and measurements.

Recommendations:

In the opinion of this researcher the AQI, as it exists, should be revamped or discarded. Allowing for catastrophic pollution in the development of the scale is commendable, but not if it interferes with the presentation of useful and valuable information to the public on a practical and daily basis.

There is a partial analogy here with the scales used to report earthquakes and other natural events, as they are of an exponential nature and they provide for extreme events when they occur. It is now known, however, that very low levels of fine particulate matter are very harmful to human health. Any scaling chosen to represent the state of pollution in the atmosphere must correspondingly emphasize and reveal this fact. This is what matters on a daily basis in the practical affairs of living; the extreme events are known to occur but they should not receive equal (or even greater) emphasis in a daily pollution reporting standard. It is primarily a question of communicating to the public directly in real-time with actual data, versus the adherence to decades old legacies and methods that do not accurately portray modern pollution and its sources.

It seems to me that a solution to the problem is fairly straightforward; the issue is whether or not such a transformation can be made on a national level and whether or not it has strong public support. Many other scaling systems have already made the switch to emphasize the impact of lower-level concentrations to human health; this would seem to be admirable based upon the actual needs of society.

It is a fairly simple matter to reconstruct the scale for an air quality index. THE SIMPLEST SOLUTION IS TO REPORT THE CONCENTRATION LEVELS DIRECTLY, IN REAL-TIME MODE. If the PM 2.5 pollution level at a particular location is, for example, 20 ug/m^3, then report it as such. This is not hard to do, and technology fully supports this direct change and access to data. We do not average our rain when it rains, we do not average our sunlight when we report how clear the sky is, we do not average the cloud cover, and we do not average how far we can see. The environmental conditions exist as they are, and they should be reported as such. There is no need to manipulate or “transform” the data, as is being done now. A linear scale can also be matched fairly well to the majority of daily life needs, and the extreme ranges can also be accommodated without any severe distortion of the system. The relationship between visibility and PM 2.5 counts will be very quickly and readily assimilated by the public when the actual data is simply available in real-time mode, as it needs to be and should be. Of course, greater awareness by the public of the actual conditions of pollution may also lead to a stronger investigation of their source and nature; this may or may not be as welcome in our modern society. I hope that it will be, as the health of our generation, succeeding generations, and of the planet itself is dependent upon our willingness to confront the truths of our own existence.

Clifford E Carnicom
Mar 12, 2016

Born Clifford Bruce Stewart
Jan 19, 1953

 

References

“AirNow.” 2016. Accessed March 13. https://www.airnow.gov/.

“Air Quality Index | World Public Library.” 2016. Accessed March 13. http://www.worldlibrary.org/articles/air_quality_index.

“Air Quality Index – Wikipedia, the Free Encyclopedia.” 2016. Accessed March 13. https://en.wikipedia.org/wiki/Air_quality_index.

“Air Quality Now – About US – Indices Definition.” 2016. Accessed March 13. http://www.airqualitynow.eu/about_indices_definition.php.

“FastStats.” 2016. Accessed March 13. http://www.cdc.gov/nchs/fastats/deaths.htm.

“HHTP21 Air Quality Meter, User Manual, Omega Engineering.” 2016.

Shi, Liuhua, Antonella Zanobetti, Itai Kloog, Brent A. Coull, Petros Koutrakis, Steven J. Melly, and Joel D. Schwartz. 2015. “Low-Concentration PM2.5 and Mortality: Estimating Acute and Chronic Effects in a Population-Based Study.” Environmental Health Perspectives 124 (1). doi:10.1289/ehp.1409111.

U.S. Environmental Protection Agency. 2012. “Revised Air Quality Standards for Particle Pollution and Updates to the Air Quality Index (AQI).”

Wong, Edward. 2013. “Beijing Air Pollution Off the Charts.” The New York Times, January 12. http://www.nytimes.com/2013/01/13/science/earth/beijing-air-pollution-off-the-charts.html.

World Health Organization. 2013. “Health Effects of Particulate Matter, Policy Implications for Countries in Eastern Europe, Caucasus and Central Asia.”


A Global Warming Model
Clifford E Carnicom
Santa Fe, NM
Apr 13 2007

[Graph: global warming model – lower-atmosphere temperature rise as a function of aerosol concentration and time]


From a Special Report of April 1, 2007 from CBS 60 Minutes, entitled “The Age of Warming”:

“Over the past 50 years, this region, the Antarctic peninsula, the northwestern part and the islands around it, has been going up in temperature about one degree every decade and that makes the region the fastest warming place on earth. …And it’s not unique. More than 90 percent of the world’s glaciers are retreating….”


A study has been done to examine the role of the aerosol operations with respect to global warming. It has long been proposed [1,2,3] that the aerosol operations have the effect of aggravating the heating condition of the planet, and that they show no prospect for cooling the earth as many have claimed. This is in direct contradiction to many of the popular notions that commonly circulate regarding the operations, i.e., that these operations are somehow intended for our benefit, but it is best that their true nature remain undisclosed and closed to fair examination by the public. Whether or not such popular theories are intended to mislead the public is open to question; the facts, however, speak of an opposite end result.  The aerosols are being dispersed into the lower atmosphere, and it can be shown from this fact that they will indeed heat up the lower portion of the atmosphere.  Global warming itself is defined as the heating of the lower atmosphere and earth [4]. The notion that the aerosols are in some way cooling the planet is contradictory to direct observation and the examinations of physics.  To cool the planet, the intentionally dispersed aerosols would have to be in the upper regions of the atmosphere or in space; readers interested in that conclusion may wish to read more closely the proposals of Edward Teller that are often cited in the claims of supposed mitigation.  It will be found that any claims of aerosols cooling the planet will usually require those materials to be at the upper reaches of the atmosphere to the boundaries of space; aerosols in the lower atmosphere will usually be shown to be heating the planet.  These facts must be considered by any of those individuals that continue to promulgate claims of anonymous and beneficial mitigation in conjunction with the aerosol operations.

The current model examines the effects of deliberately introducing barium particulates into the lower atmosphere, and the subsequent contribution to the global warming problem.  The results are not encouraging.  The results indicate that these particulates, even at rather modest concentration levels, can contribute in a real and significant way to the heating of the lower atmosphere.  The magnitude appears to be quite on par with any of the more popularly discussed contributions, such as carbon dioxide increase and greenhouse gases.  It is recommended that the public be willing to consider some of the more direct, visible and palpable alterations to our planet and atmosphere within the pursuit of the global warming issue,  namely the aerosol operations as they have been imposed upon the public without informed consent for more than 8 years now.

The graph above shows the expected interactions from 3 variables that relate to the global warming issue; these are: aerosol concentration, time and rise in temperature.  On one axis, relatively modest concentrations of barium particulates in the atmosphere are shown.  The magnitudes shown are not at all unreasonable with respect to the numerous analyses that have been made by this researcher in the past, e.g., visibility studies available on this site. As a point of reference, the EPA air quality standard for particulates of less than 2.5 microns in size has been recently lowered [5] to 35 ugms (micrograms) per m3 (cubic meter).  It will be seen from the graph, for example, that even a 10% level of this standard (i.e., 3.5 ugms/m3) can produce a noticeable heating of the lower atmosphere.  As has been stated previously, the candor and accountability of the EPA have been sorely lacking over the past decade, and this agency has failed miserably in its duty to the public to maintain environmental safeguards.  It can no longer be assured or assumed that minimal air quality standards are being honored in any way, and the integrity of the EPA to serve the public interest can no longer be upheld.  It is quite possible, and unfortunately somewhat expected, that enforceable and accountable air quality standards were sacrificed some time ago with the advent of the aerosol operations.

A second axis on the graph is that of time in years. A point of zero time would be one that assumes no such artificial and increased concentration of barium particulates exists in the lower atmosphere.  The graph is marked in intervals of 5 year periods, from 0 to 50 years.  The time period of 50 years has been chosen only to demonstrate that the effects of these particulates upon heating is of serious and immediate concern; within a matter of decades the effects are pronounced and have measurable global impact.  The variables of aerosol concentration and time can now be considered mutually with the above graph and model.  Presumably, humans have a vested interest in protecting the welfare of the planet beyond the immediate future of a few decades, and the problem would be only more pronounced if a century of time had been presented versus a fifty year period.

The third axis is that of temperature rise presented in degrees centigrade.  This is the variable that should elicit the greatest concern.  To give an example of usage, a concentration of 5 ugms/m3 over an interval as short as 20 years would lead to heating of the lower atmosphere on the order of 0.6 degrees centigrade.  This corresponds to approximately 1 degree Fahrenheit.  This is found by locating the intersection of 5 ugms/m3 along the concentration axis with 20 years of elapsed time on the second axis.  This point is then projected horizontally upon the temperature increase axis, where it will be found to intersect at approximately 0.6 degrees.  This is a very real and measurable result in terms of global impact.  Nobel Prize Winner Paul Crutzen, in Atmosphere, Climate and Change [6], writes in 1997 that even conservative estimates of global planetary surface temperature change are on the order of 1 to 3 degrees centigrade over a 50-year interval.  This temperature change will produce sea level changes on the order of 10 to 30 centimeters. It is stated, furthermore, that “much of Earth’s population would find it inordinately difficult to adjust to such changes”.

Readers may now notice that the recent CBS special report referred to above demonstrates that the rate of  heating in Antarctica is already approximately 1.5 times greater than the predictions from the 1997 era.

It can be seen from this model that the results of artificial aerosol introduction into the lower atmosphere can be of a magnitude quite on par with the extraordinary impacts projected by even modest and conservative global warming models upon humans in the near future.   As the model presented herein is intended to be reasonably conservative, the impact of the aerosol operations could be much greater than these results show.  It is advised that the citizens consider the viability and merit of this model in the examination of the global warming issue, and that they openly take aggressive action to halt the intentional aerosol operations.

This paper is late in its offering, as my availability for continued research at this level is limited.  I am nevertheless hopeful that the information can be evaluated and assimilated into the many rationales and arguments that have developed over the last decade to cease the intentional alteration of the atmosphere of our planet.

Clifford E Carnicom
April 13, 2007

Additional Notes: The model can easily be extended to other elements of concern; however, a focus on barium has taken place due to the unique physical properties of that element, along with the evidence for its existence at unexpected levels in the atmosphere. The mathematics and physics of the model are presented in a separate paper.

References:

1. CE Carnicom, Drought Inducement, https://carnicominstitute.org/wp/drought-inducement/, April 2002
2. Carnicom, Global Warming and Aerosols, https://carnicominstitute.org/wp/global-warming-aerosols/, Jan 2004
3. Carnicom, Global Warming and Aerosols, Further Discussion, https://carnicominstitute.org/wp/global-warming-aerosols-ii/, Feb 2004
4. Wikipedia, Global Warming, http://en.wikipedia.org/wiki/Global_warming
5. EPA,  EPA Strengthens U.S. Air Quality Standards,
http://yosemite.epa.gov/opa/admpress.nsf/a8f952395381d3968525701c005e65b5/92771013f7dda087852571f00067873d!opendocument
6. Crutzen, Paul, Atmosphere, Climate and Change, (Scientific American Library, 1997), p141.


A Global Warming Model
Part II
Clifford E Carnicom
Santa Fe, NM
Apr 10 2007

The details of the Global Warming Model  are presented on this page.

The model has the following final form:

del (del T) = [ ( EF * t ) / ( mair * cpair ) ] * [ ( cpair – ( [ mair / ( mair + maer ) ] * cpair + [ maer / ( mair + maer ) ] * cpaer ) ) / cpair ] * Average Solar Radiation

The model is developed in the following manner: (text form)

The definition of heat capacity is given as [1]:

C = dQ/dT

which states that the heat capacity of a substance is defined as the instantaneous change in the quantity of heat (joules) with respect to an instantaneous change in temperature (degrees Kelvin or centigrade).  The units of C are J / K, or joules per degree Kelvin.

The specific heat capacity is furthermore defined as [2]:

c = del Q / (m * del T)

where Q is in joules, m is the mass in kilograms (kg) and T is in degrees Kelvin or centigrade, and del is the change operator.

Specific heats are measured values that are commonly available, and they indicate how much energy is required to raise a unit mass of material by one unit of temperature (centigrade or Kelvin).

Specific heats can be measured at constant pressure (cp) or constant volume (cv). Specific heats for gases do not vary significantly over large temperature variations [3], and they may therefore usually be treated as constants.  A suitable value of cp for air is 1.003 kJ/kg K [4]. For solids and liquids, the difference between cp and cv is usually quite small [5] and can usually be ignored; values for cp are readily available.

As the definition of specific heat results from a differential form, this paper will focus on the change in a small volume of air, namely 1 cubic meter of air under ideal gas conditions.

The specific heat can be rearranged to:

del T = del Q /  ( m * cp )

this is equivalent to:

del T = ( Watts / ( m * cp ) ) * t

where t is time in seconds, and Watts is the incoming energy in joules /second.

The model under consideration examines the above change from a differential standpoint, i.e., what is the effect upon temperature change with respect to an incremental change in input energy for a unit mass of air?  The incremental change in input energy will result from the change in specific heat of a mixture, i.e., air vs. air with aerosolized particulates.  Developing further, our model now has the form:

del (del T) = ( t / ( m * cp ) ) * del (Watts)

The model will also be permitted to include an efficiency factor (EF), as not all of the energy coming into the system (i.e., solar energy) will be absorbed.  A current estimate for this efficiency factor is set at 50 percent [6].

or

del (del T) = ( EF  * t ) / ( m * cp ) * del ( Watts )

The next problem is to determine a value of cp for the modified atmosphere, i.e., air with aerosolized particulates added to the cubic meter of air under examination.  The specific heat capacity of a mixture is given [7] as:

cp(air+aerosol) = sum ( mfi * cpi)

where mfi is the mass fraction of the ith component of mixture, and cpi is the specific heat capacity of the ith component of the mixture.

mfi is defined as mi / m

where mi is the mass of the ith component and m is the total mass of the mixture.

Let us now refer to:

mair = mass of 1 cubic meter of air in kg

maer = mass of aerosols added to 1 cubic meter of air in kg

cpair = specific heat of air in J /kg  K

cpaer = specific heat of aerosol in J /kg K

cp(air+aerosol) = [mair / (mair + maer) ] * cpair + [maer / (mair + maer)] * cpaer

It can be proposed that del (Watts) can be adequately represented by:

del (Watts ) = [ del (cp) / cpair ] * Average Solar Radiation

and that

del (cp) = cpair – cp(air+aerosol)

or that

del (del T) = [ ( EF * t ) / ( mair * cpair ) ] * [ ( cpair – cp(air+aerosol) ) / cpair ] * Average Solar Radiation

or that

del (del T) = [ ( EF * t ) / ( mair * cpair ) ] * [ ( cpair – ( [ mair / (mair + maer) ] * cpair + [ maer / (mair + maer) ] * cpaer ) ) / cpair ] * Average Solar Radiation

which is equivalent to the model presented above.

The average incoming solar radiation (insolation) to the earth will be taken as 342 W / m2.8

The mass of air will be taken as 1.2 kg / m3.

The specific heat capacity of barium, cpaer, is 0.19 kJ/kg K.9,10

The specific heat capacity of air, cpair, is 1.003 kJ/kg K.

The efficiency factor is selected as .50.

In the model proposed, the mass of the aerosol varies from 0 to 50 ugms (micrograms) per cubic meter, or from 0 to 50E-9 kg/m3.

Time is measured in seconds, and varies from 0 to 50 years (one year = 31536000 seconds).

The model evaluated with respect to variations in time and mass concentration of the aerosol will produce the graphic result of this report.  The final units of the model are in degrees centigrade per m2, which corresponds to the differential element of air chosen as 1 cubic meter.  A more complete partial differential model of change with respect to both del (Watts) and del (cp) may be pursued in the future if warranted. The model is not intended in any respect to be all-inclusive of the global warming issue; it is intended to introduce, in a quantitative sense, the consideration of heating of the lower atmosphere from the artificial introduction of particulates.
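
As an illustrative aid only, the short Python sketch below evaluates the model as developed above, using the parameter values listed on this page (with the specific heats expressed in J/kg K rather than kJ/kg K so that they are consistent with Watts, i.e., joules per second). The function and variable names are illustrative and are not part of the original derivation.

# Minimal sketch of the temperature-change model developed above, in SI units.
# Parameter values are those adopted on this page; names are illustrative only.
EF = 0.50                      # efficiency factor (dimensionless)
SOLAR = 342.0                  # average incoming solar radiation, W/m2
M_AIR = 1.2                    # mass of 1 cubic meter of air, kg
CP_AIR = 1003.0                # specific heat of air, J/(kg K)  (1.003 kJ/kg K)
CP_AER = 190.0                 # specific heat of barium aerosol, J/(kg K)  (0.19 kJ/kg K)
SECONDS_PER_YEAR = 31536000

def cp_mixture(m_aer):
    """Specific heat of 1 cubic meter of air plus m_aer kg of aerosol (mass-fraction rule)."""
    m_total = M_AIR + m_aer
    return (M_AIR / m_total) * CP_AIR + (m_aer / m_total) * CP_AER

def delta_temperature(aerosol_ug_per_m3, years):
    """Estimated temperature change (degrees centigrade) for a given aerosol loading and elapsed time."""
    m_aer = aerosol_ug_per_m3 * 1e-9              # micrograms -> kilograms
    t = years * SECONDS_PER_YEAR                  # years -> seconds
    delta_watts = ((CP_AIR - cp_mixture(m_aer)) / CP_AIR) * SOLAR
    return (EF * t) / (M_AIR * CP_AIR) * delta_watts

# Example: 50 micrograms of aerosol per cubic meter sustained for 50 years
print(delta_temperature(50, 50))   # degrees centigrade for the 1 cubic meter differential element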

References:

1. Walter Benenson, Handbook of Physics, (Springer-Verlag, 2002), 684.
2. Benenson, 687.
3. Merle C. Potter, Thermodynamics for Engineers, (McGraw-Hill, 1983), 55.
4. Potter, 289.
5. Potter, 56.
6. National Snow and Ice Data Center,  University of Colorado, Boulder,  Arctic Climate and Meteorology, http://nsidc.org/arcticmet/factors/radiation.html
7. Potter, 251.
8. Wikipedia, Solar Radiation, http://en.wikipedia.org/wiki/Solar_radiation
9. C.E. Carnicom, Drought Inducement, https://carnicominstitute.org/wp/drought-inducement/
10. David Lide, Editor, Handbook of Chemistry and Physics, (CRC Press, 2001-2002), 12-219.

POTASSIUM INTERFERENCE IS EXPECTED

POTASSIUM INTERFERENCE
IS EXPECTED
Clifford E Carnicom
May 15 2005

 

It is to be expected that specific ions that are important to human health are in the process of being affected on a large scale geographic basis.  This premise is based upon the principle of “cyclotronic resonance”, a phenomenon which occurs when charged particles are subjected to low frequency radiation in the presence of a magnetic field.  Each of these mechanisms is in place, and the documentation of ambient Extremely Low Frequency (ELF) radiation at a fundamental frequency of 4Hz, along with several harmonics,1 adds a critical component.  This radiation,  in addition to the earth’s magnetic field,  provides the physical mechanisms to induce this resonance for specific nutrient ions within the human body.

 

The emphasis in this report is upon the potassium ion, which is of fundamental importance to human health.  Although there are additional ions which deserve discussion at a later point, the primary ranking of potassium in human biology is of special concern.

 

The reader is referred to the following statement by Dr. Robert O. Becker:

 

“Cyclotronic resonance is a mechanism of action that enables very low strength electromagnetic fields, acting in concert with the Earth’s geomagnetic field, to produce major biological effects by concentrating the energy in the applied field upon specific particles, such as the biologically important ions of sodium, calcium, potassium and lithium.”2

 

It is shown within this report that the potassium ion is specifically expected to incur biological interference within people over large regions of the earth’s surface.  This is due to the fact that the fifth harmonic of the ELF that has been repeatedly measured over a period of several years corresponds to the cyclotronic resonant frequency of potassium.  This fifth harmonic, along with numerous other harmonics, is a regular component of the ELF radiation that is under measurement at this time.  This expected interference, whether intentional or not, can be shown to exist based upon the principles and physics of cyclotronic resonance, a phenomenon well established3 in classical electromagnetic theory.

 

Measured 20Hz ELF Signal – Spectral Analysis
12 May 2005

Additional Notes:

 

The physical conditions required to achieve cyclotronic resonance with a certain ion are very specific.  They require a unique combination of charge on the particle, a specific mass of the particle, a specific magnetic field strength and a specific introduced electromagnetic frequency into the environment.  The current examination shows such combinations to be few in number, but potentially very important if they are found to exist.  The importance of the potassium ion to human health has prompted this initial disclosure so that the process can be examined more fully.  Additional ions are under examination.  The consideration of an additional resonance phenomenon, that of nuclear magnetic resonance, is also underway.

 

With respect to cyclotronic resonance, the equation for the resonant condition occurs at4:

 

whz = ( q * B ) / (2 * pi * m)

 

where

 

whz is the cyclotronic resonant frequency in cycles per second (Hz)

 

q is the electric charge on the particle in Coulombs

 

B is the strength of the magnetic field in Teslas

 

and m is the mass of the particle in kilograms.

 

To approach this problem, we are interested in the special cases where the resonant frequency to be determined is 4 Hz, or a multiple of 4 Hz (a harmonic).  The harmonics have been measured up to 28-32 Hz (7th and 8th harmonics) with regularity.

 

Therefore we can set up the problem as:

 

(4 * n) = ( Z * q * B ) / (2 * pi * m)

 

where n is the harmonic under consideration (n = 1 is the fundamental frequency of 4 Hz), and Z is the valence of the ion.

 

Our end goal will be to determine what atomic masses correspond to the 4hz frequency multiples, as that will identify any specific ions of concern.  The charge on the particle will be the product of the valence of the ion with the charge of an electron (e).  The charge of an electron is 1.6E-19 coulombs, and the valences of some common ions under investigation are K+1, Mg+2, Ca+2, Ba+2, for example.

 

The magnetic field strength varies to some degree across the earth’s surface. A reasonable estimate for the strength of the magnetic field in the United States is approximately .5 gauss, or 5E-5 Teslas.  Values of the magnetic field strength over the earth’s surface can be estimated with geomagnetic models that are available to the public5.  A current estimate for the magnetic field strength in the Santa Fe, NM area is 5.06E-5 Teslas.

 

The mass of an atom is equal to the atomic mass in grams per mole times 1E-3, divided by Avogadro’s number, 6.02E23.

 

Therefore, the equation can be rewritten in more convenient terms as:

 

atomic mass number in gms = ( Z * q * B * 6.02E23 ) / (4 * n * 2 * pi * 1E-3)

 

Our problem is to find those combinations of Z (valence) and n (harmonic multiple) that result in an atomic mass number corresponding to a known element.

 

It will be found that the K+ ion satisfies this equation very closely for the magnetic field strength of much of the mid-latitude regions of the globe.

 

Specifically, if Z = +1, B = 5.06E-5 Teslas, and n = 5 (fifth harmonic corresponding to 20 Hz), the atomic mass number that results is 38.8 gms.

 

The atomic mass number for the most common isotope of potassium is 39.0 gms6.  Equivalently, the cyclotronic frequency that corresponds to the atomic mass number of 39.0 gms in the magnetic field examined is 19.9 Hz.  Such unique combinations are not common, but they do occur.  They are of concern with respect to human biological function and interference, as these ions will absorb energy and can lead to the disruption of cellular ion exchange processes.
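
As a numerical check of the relationships above, the short Python sketch below evaluates both directions of the cyclotron equation with the constants stated in this report (electron charge 1.6E-19 coulombs, Avogadro's number 6.02E23, B = 5.06E-5 Teslas). The function and variable names are illustrative only.

# Minimal sketch of the cyclotron resonance relationships described above.
import math

Q_ELECTRON = 1.6e-19       # charge of an electron, coulombs
AVOGADRO = 6.02e23         # particles per mole

def atomic_mass_for_harmonic(n, Z=1, B=5.06e-5, fundamental=4.0):
    """Atomic mass (gms per mole) whose cyclotron resonance falls on the nth harmonic of the fundamental."""
    f = fundamental * n                                 # resonant frequency, Hz
    m_kg = (Z * Q_ELECTRON * B) / (2 * math.pi * f)     # particle mass from whz = (q * B) / (2 * pi * m)
    return m_kg * AVOGADRO / 1e-3                       # kilograms -> grams per mole

def cyclotron_frequency(atomic_mass_gms, Z=1, B=5.06e-5):
    """Cyclotron resonant frequency (Hz) for an ion of the given atomic mass and valence."""
    m_kg = atomic_mass_gms * 1e-3 / AVOGADRO
    return (Z * Q_ELECTRON * B) / (2 * math.pi * m_kg)

print(atomic_mass_for_harmonic(n=5))     # approximately 38.8 gms, close to potassium (39.0)
print(cyclotron_frequency(39.0))         # approximately 19.9 Hz, near the fifth harmonic of 4 Hz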

 

It can also be expected that variations in the magnetic field of the earth can lead to other potential resonance conditions in various regions or latitudes.  It is therefore not unexpected to find large regional health issues that will correlate with variations in the magnetic field strength of the earth.  Certain ions are expected to be disrupted in some areas of the globe more than others.

 

It should be remembered that the true cause for concern here is introduced Extremely Low Frequency radiation that makes this condition possible in the first place.  

 

On a more personal note, it may be found that supplementing the diet with chelated forms of magnesium, calcium and potassium can be beneficial, especially in relation to allergic conditions. Readers may wish to examine further the relationships between positive ion increases and allergic conditions that have been reported7. A deficiency of potassium is known to cause fatigue.8 The body’s ability to manage both potassium and magnesium levels appears to be strongly linked; additional symptoms of potassium deficiency include depression, cognitive impairment, nervousness and insomnia9.  An excess of potassium can also lead to significant complications.

 

More complex resonance conditions, such as those involving the 60Hz power grid and the modified atmosphere, are also under examination and may be reported on in the future.  The primal importance of the potassium ion has prompted the issuance of this report.

 

Clifford E Carnicom
May 15, 2005

References:

1. Carnicom, ELF 2005 : Positive Identification, https://carnicominstitute.org/wp/elf-2005-positive-identification/, May 2005.
2. Robert O. Becker, MD, Cross Currents, The Perils of Electropollution, The Promise of Electromedicine, (Penguin Putnam, 1990), 235.
3. Phillip R. Wallace, Mathematical Analysis of Physical Problems, (Dover, 1984), 330.
4. Research and Education Association, Physics (Research and Education Association, Inc, 2005), 436.
5. National Geophysical Data Center, International Geomagnetic Reference Field, http://www.ngdc.noaa.gov/seg/geomag/magfield.shtml
6. John Emsley, The Elements, (Oxford University Press, 1998), 161.
7. Carnicom, Calcium and Potassium, https://carnicominstitute.org/wp/potassium-interference-is-expected/, March 2005.
8. Life Clinic, Potassium, http://www.lifeclinic.com/focus/nutrition/potassium.asp
9. The Bartter Site, Potassium Dosing Page, http://www.barttersite.com/potassium_dosing_page.htm

CONDUCTIVITY: The Air, The Water, and The Land

CONDUCTIVITY:
The Air, The Water, and The Land
Clifford E Carnicom
April 15, 2005

A rainfall laboratory test recently received from a rural location in the Midwestern United States has refocused attention on the electrolytic, ionic and conductive properties of environmental samples in connection with the aerosol operations.  These “interesting characteristics” of solids in our atmosphere have a more direct and down-to-earth impact as their nature is better understood.  This is nothing less than the changing of the air, the water and the soil of this planet.  All life is eventually to be affected as it continues.

A laboratory report has been received that documents unusually high levels of calcium and potassium within a rain sample.1  Previous work has demonstrated unexpected levels of barium and magnesium.  The continuous presence of easily ionizable salts at higher concentrations within atmospheric samples has many ramifications upon the environment.  A brief introduction to the severe health impact of this category of particulates has also been made on this site. Current work is now dedicated to the impact that these materials are having not only upon the atmosphere, but upon the water and soil as well.  All inhabitants of this planet will eventually confront, voluntarily or not, the consequences of the actions that are being allowed to degrade the viability and habitability of our home.

The burden of testing for the problems underway does not fall upon any private citizen, as the resources are not available to support it.  Nevertheless, testing and analysis does continue in whatever way is possible.  Accountability must eventually fall to those public servants and agencies entrusted with protection of the general welfare and environment.  It should not be assumed that there is infinite time available to ponder the strategies of improvement and the solutions for remedy.  We shall all bear the final price for condoning what has been allowed to pass.

Now, for the more immediate particulars:

A series of conductivity tests have been conducted with recent heavy snowfall samples collected in New Mexico and Arizona. Conductivity is a means to measure the ionic concentration within a solution. These tests have been performed with the use of a calibrated conductivity meter in conjunction with calibrated seawater solutions. A series of electrolysis tests have also been completed with these same samples and calibrated solutions.

These tests demonstrate conclusively the presence of reactive metal hydroxides (salts) in concentrations sufficient to induce visible electrolysis in all recent snowfall samples encountered2.  

Precipitates result if reactive electrodes are used; air filtration tests have produced these same results in even more dramatic fashion from the solids that have been collected.  Highly significant electrolytic reactions occur when the solid materials from the atmosphere are concentrated and then placed into solution.  Rainfall is expected to be one of the purest forms of water available, especially in the rural and high mountain sites that have been visited.  Rainfall from such “clean” environments is not expected to support electrolysis in any significant fashion3, and conductivity is expected to be on the order of 4-10uS4. Current conductivity readings are in the range of approximately 15 to 25uS. These values may not appear to be extraordinarily large; however, any increase in salt content, especially with the use of remote samples, will need to be considered with respect to the cumulative effect upon the land.  These results do indicate an increase in conductivity on the order of 2-3 times, and the effects of increased salinity on plant life will merit further discussion.

Beyond the indicated increase in conductivity levels of sampled precipitation, there are two additional important results from the current study. The first is the ability to make an analytic estimate of the concentration of ionic salts within the regional atmosphere.  The results do appear to be potentially significant from an air quality perspective and with respect to the enforcement (or lack thereof) of existing standards.   The second is the introduction of the principle of “ohmic heating”, which in this case allows for increased conductivity of the atmosphere as a result of an introduced current.

First, with respect to estimated concentrations of ionic salt forms in the atmosphere, the principle is as follows.  The methods demonstrate that our focus is upon reactive metal hydroxide forms (barium hydroxide, for example).  Conductivity is proportional to ionic concentration.  Although a conductivity meter is especially useful over a wide range of concentrations, special care is required when dealing with the weak saline forms of precipitation as they now exist.  It has been found that current flow as measured by a sensitive ammeter (µamps) appears to be useful in assessing the conductivity of the weak saline solution.  The results have been confirmed and duplicated with the use of the calibrated conductivity meter. The use of an ohmmeter to measure resistance is found, from both experience and the literature, to be unreliable without much caution, due to complications of heating and/or polarization.  Weak saline solutions appear to have their own interesting characteristics with respect to introduced currents, and this topic will come to the forefront when ohmic heating is discussed.

A series of weak sea saltwater solutions have been carefully prepared for use in calibrating both the conductivity meter and the ammeter.  These solutions are in strengths of 0.56%, 1.51% and 3.01% respectively.  Many tests have also been completed with refined water samples as well as seawater equivalents.  Conductivity is proportional to concentration levels, especially as it has been bracketed with a variety of solutions in the range of expected measurements.  Measurements currently estimate the saline concentration of the precipitation samples at approximately 0.041%.  Salt concentrations in any amount are extremely influential to conductivity.  

Assuming an equivalency in density of the precipitation salts to sea salts, this results in an expected concentration level of approximately 15 milligrams per liter.  For comparison purposes, rainwater in Poker Flats, Alaska is reported as approximately 1mg/liter for all dissolved ions; the contribution from reactive metal compounds is a small fraction of that total.  Highly polluted rain over Los Angeles CA is reported at approximately 4mg/liter, with approximately 1mg/liter composed of the reactive metals.5  Simulated rainfall samples report concentration levels of approximately 4 and 21 mg/liter, presumed to reflect reasonably clean and polluted samples respectively6.  In all cases cited, the contribution from reactive metal ions is quite small relative to the whole, and sulfate, nitrate and chloride ions are the largest contributors to the pollutants.  Testing here indicates the composition of the precipitate pollutants may be biased toward the reactive metal ion concentrations.

The next objective is to translate the measured and estimated concentration level to an equivalent density, or particulate count, within the atmosphere.  This method is based upon saturation levels for moisture within the atmosphere.  Air at a given temperature can only hold so much water.

From the Smithsonian Meteorological Tables, the saturation density is given as:7

saturation density = 216.68 * (ew / (Cv * T) )

where ew is the saturation vapor pressure in millibars, T is temperature in Kelvin, and Cv is the compressibility factor.  Cv is 1.0000 to the level of precision required.

From Saucier8, the saturation vapor pressure in millibars with respect to water is estimated as:

es = 6.11 * 10^( (a * t) / (t + b) )

where a = 7.5
b = 237.3

and t is degrees Centigrade.

Therefore, the saturation density can be stated as:

saturation density (gms / m3) = 216.68 * ( es / T )

and the density in gms / m3 of salt particulate in the air can be estimated as:

gms / m3 = Conductivity Estimate of Solids (in gms per liter) * (RH% / 100) * Saturation Density * 1E-3

and in µgms:

µgms / m3 = (gms / m3) * 1E6

and as an example, if the solid density is .015 gms / liter and the temperature is 15 deg centigrade and the humidity is 50%, the estimate of particulate concentration from the salts is 96 µgms / m3.  This concentration will vary directly with altitude (temperature) and humidity levels.
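
For illustration, the short Python sketch below reproduces the estimate just described using the saturation vapor pressure and saturation density formulas given above; the function and variable names are illustrative only.

# Minimal sketch of the airborne salt-concentration estimate described above.
# Constants follow the Saucier approximation and the Smithsonian saturation
# density formula as quoted in the text; names are illustrative only.

def saturation_vapor_pressure_mb(t_celsius):
    """Saturation vapor pressure over water, in millibars: es = 6.11 * 10^((a*t)/(t+b))."""
    a, b = 7.5, 237.3
    return 6.11 * 10 ** ((a * t_celsius) / (t_celsius + b))

def saturation_density_gms_per_m3(t_celsius):
    """Saturation water-vapor density, gms/m3: 216.68 * es / T, with T in degrees Kelvin."""
    return 216.68 * saturation_vapor_pressure_mb(t_celsius) / (t_celsius + 273.15)

def salt_concentration_ugms_per_m3(solids_gms_per_liter, t_celsius, rh_percent):
    """Estimated airborne salt concentration, in micrograms per cubic meter."""
    water_gms_per_m3 = saturation_density_gms_per_m3(t_celsius) * rh_percent / 100.0
    water_liters_per_m3 = water_gms_per_m3 * 1e-3     # one gram of water is roughly one milliliter
    return solids_gms_per_liter * water_liters_per_m3 * 1e6

# Example from the text: 0.015 gms/liter of dissolved solids, 15 deg centigrade, 50% relative humidity
print(salt_concentration_ugms_per_m3(0.015, 15, 50))   # approximately 96 µgms / m3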

The estimates show that at ground levels and temperatures it is quite possible that the EPA air quality standards for particulate matter are no longer being met.  This determination will also depend on the size of the particles in question, as EPA standards vary according to size (PM2.5 and PM10 respectively).  All analyses indicate that the size of the aerosols under examination is sub-micron, and if so, this makes the problem more acute.  Air quality standards for comparison to various scenarios are available9 to examine the relationship that has been developed. Unfortunately, the failures of United States government agencies now require the independent audit of EPA data and presentation.  The U.S. Environmental Protection Agency is especially culpable in this regard, and the enforcement of existing standards is a serious topic of controversy.

Finally, let us introduce the subject of ohmic heating.  The behavior of electric currents within weak saline solutions has many points of interest.  During the testing for this report, it was observed that the conductivity of weak saline solutions noticeably increased over time when these solutions were subjected to a weak electric current. It appears that the most likely source of this conductivity increase is a phenomenon known as ohmic heating.  In plasma physics, ohmic heating is the energy imparted to charged particles as they respond to an electric field and make collisions with other particles.  A classic definition would be the heating that results from the flow of current through a medium with electrical resistance.  Please recall the difficulty of using an ohmmeter to measure conductivity in a solution; this difficulty was realized in the trials of this report.

Metals are known to increase their resistance with the introduction of an electric current.  As the metal becomes hotter, resistance increases and conductivity decreases.  Salt water and plasmas are quite interesting in that the opposite effect occurs.  The conductivity of salt water increases when temperature increases.  The same effect occurs within a plasma; an increase in temperature will result in a decrease of the resistance10, i.e., the conductivity increases.  Introduction of an electric current into the plasma, or salt water for that matter, will increase the temperature and therefore the conductivity will also increase.  This is in opposition to our normal experience with metals and conductors.

In the past, conductivity studies have focused on the ability of the reactive metals to lose ions through the photoionization process.  This remains a highly significant aspect of the aerosol research.

The importance of this study is that a second factor has now been introduced into the conductivity equation, and that is the introduction of electric current itself into the plasma state. This research, through direct observation and analysis, has inadvertently turned attention once again to the HAARP facility, where ohmic heating is stated within the Eastlund patent to be a direct contributor to atmospheric conductivity increase.  All evidence indicates that this plasma is saline based, which further supports the hypothesis of increased conductivity in the atmosphere with the introduction of electric current, in addition to that provided by photoionization.

A future presentation will examine the changes in the conductivity of our soil, in addition to that of our air and water.

1. CE Carnicom, Calcium and Potassium, https://carnicominstitute.org/wp/calcium-and-potassium/, March 2005.
2. Andrew Hunt, A-Z Chemistry, (McGraw Hill, 2003), 125.
3. Dr. Rana Munns, The Impact of Salinity Stress, http://www.plantstress.com/Articles/salinity_i/salinity_i.htm.
4. Steven Lower, Ion Bunk, http://www.chem1.com/CQ/ionbunk.html.
5. Peter Hobbs, Introduction to Atmospheric Chemistry, (Cambridge University Press, 2000), 137.
6. Water Standards, Simulated Rainwater, http://www.hps.net/simrain.html
7. Smithsonian Meteorological Tables, Table 108, (Smithsonian Institution Press, 1984), 381.
8. Walter J. Saucier, Principles of Meteorological Analysis, (Dover, 1989), 9.
9. National Ambient Air Quality Standards, http://www.tceq.state.tx.us/compliance/monitoring/air/monops/naaqs.html
10. S. Eliezer and Y. Eliezer, The Fourth State of Matter, An Introduction to Plasma Science, (Institute of Physics Publishing 2001), 124-125.

MORTALITY VS. VISIBILITY

MORTALITY VS. VISIBILITY
Clifford E Carnicom
Santa Fe, New Mexico
Jun 03 2004

MORTALITY VS. VISIBILITY
Distance to Mountain Range : Approximately 15 miles
3 % of U.S. Population : Approximately 8 million people

A model has been developed to depict the estimated increase in the mortality rate as a function of the decrease in visibility. The results of this model in a graphical form are shown above. It can be observed that mortality increases as visibility decreases, and that the effect is highly significant. This model does not consider the additional negative health effects that occur from the toxic nature of particulate matter1.

Additional Notes:

The American Heart Association establishes that an increase in the density of particulate matter will cause an increase in mortality. The expected increase is expressed in differential form: an increase of 1% in mortality for each increase of 10ug (micrograms) per cubic meter.2 Additional sources3 refer to a 3.4% increase in mortality per equivalent density change; however, the more conservative approach will be adopted within this model.
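
For illustration only, the short Python sketch below applies the differential relationship quoted above (1% added mortality per 10ug/m3 of added fine particulate matter, with the 3.4% figure available as an alternative); the function name and structure are illustrative and are not part of the published sources.

# Minimal sketch of the mortality-concentration relationship cited above.
# percent_per_10ug defaults to the conservative 1% figure; 3.4 may be substituted.

def mortality_increase_percent(pm_increase_ug_m3, percent_per_10ug=1.0):
    """Estimated percent increase in mortality for a given rise in particulate concentration."""
    return (pm_increase_ug_m3 / 10.0) * percent_per_10ug

print(mortality_increase_percent(30))        # 3.0 percent, using the conservative figure
print(mortality_increase_percent(30, 3.4))   # 10.2 percent, using the higher published figure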

 

TO BE CONTINUED

References:

1. Clifford E Carnicom, Barium Tests are Positive, (https://carnicominstitute.org/wp/barium-tests-are-positive/), May 24, 2004.
2. American Heart Association, Air Pollution, Heart Disease and Stroke, (http://www.americanheart.org), Jun 1, 2004; see also Carnicom, Mortality Requires Examination, (https://carnicominstitute.org/wp/mortality-requires-examination/), Mar 22, 2004.
3. Laden F, Neas LM, Dockery DW, Schwartz J., Association of Fine Particulate Matter from Different Sources with Daily Mortality in Six U.S. Cities, (Environmental Health Perspectives), 2000 Oct; 108 (10), 941-7. Abstract available from the U.S. National Institutes of Health.
4. Carnicom, Air Quality Data Requires Public Scrutiny, (https://carnicominstitute.org/wp/air-quality-data-requires-public-scrutiny/), Aug 27, 2001.
5. Carnicom, Microscopic Particle Count Study, New Mexico 1996-1999, (https://carnicominstitute.org/wp/microscopic-particle-count-study-new-mexico-1996-1999/), Mar 23, 2000.
6. Carnicom, The Theft of Sunlight, (https://carnicominstitute.org/wp/the-theft-of-sunlight/), Oct 25, 2003.
7. Carnicom, Visibility Standards Changed, (https://carnicominstitute.org/wp/visibility-standards-changed/), Apr 01, 2001.
8. Carnicom, The Extinction of the Stars, (https://carnicominstitute.org/wp/the-extinction-of-the-stars/), Jun 23, 2003.
9. American Lung Association, Particulate Matter, (http://www.lungusa.org), Apr 2000.