CI Research Paper Categories
Clifford E Carnicom
Aug 22 2016
Preliminary Note: A journalist of professional standing recently contacted Carnicom Institute requesting comments with respect to a recently published paper by the University of California. The paper claims to issue an authoritative edict as a denial of geoengineering activities that are now actively practiced and that are detrimental to the global environment. The following comments were provided to that journalist and they are made available to the public as follows:
The body of scientific work on geoengineering and bioengineering issues by Carnicom Institute spans close to twenty years. The library of work, approximately 350 original research papers, encompasses a variety of scientific disciplines. The methods and results, essentially with no exception, are reproducible and adhere to scientific protocols. This evidence-based (not survey-based) work is available for your review at:
In addition, documentary summaries are available at:
(2011 abbreviated version):
The remainder of this response will be necessarily brief: we can pursue further discussion later, should you choose.
Specifically, in reference to the UC “peer-review study” and the presentation on the UC website, I will make the following comments at this time:
Let us begin with what appears to be the motive for the study; it speaks more strongly of the desire to influence public behavior than it does to seek observational and evidence-based data to substantiate the scientific method.
“Meanwhile, a growing number of studies have shown that quantifying and communicating the scientific consensus on contested issues such as vaccine safety and climate change can help lower public misperceptions and uncertainty (Myers et al 2015, van der Linden et al 2015, van der Linden et al 2015).
Here, therefore, we report the results of an expert survey in which we asked experts on atmospheric chemistry and atmospheric deposition to scientifically evaluate the claims of SLAP theorists.”
The first assumption implicit within this statement is that for some “unknown” reason, the public is in a state of “misperception” and “uncertainty.” Why would such an assumption need to exist for the scientific method to proceed? This type of bias is a discredit to the acumen of the public. Even casual research will reveal that the concern by the public regarding the geoengineering issue is now elevated to a global level. By what right and upon what basis must we start our endeavor by assuming that this global population is ill-informed?
Notice the phrase “Here, therefore, we report the results of an expert survey….” This phrase continues the mis-advised logic from above and it states the true motive for the project. It is to “correct” the misguided ways of the global public in their “growing public distrust of elites and social institutions.”
The project is flawed from the beginning. It does not embody or represent the scientific method; it is not based upon direct observation, direct collection of evidence, the testing of hypotheses, and the fair and honest assessment of bona fide data to reach accurate and truthful conclusions. None of the work or research in the paper is original. This so-called “peer-review study” is an orchestrated and manipulative social engineering project; it is not science.
Where is the cry and demand for the data? Not a trailing and vague ending to the most critical questions at hand, but real data, impartial data, independent data, accountable data. The lack of accountability on this global environmental issue is preposterous.
“We therefore offer the first peer-reviewed expert response on SLAP data.” … “The evidence as evaluated here does not point to a ….”
What a perfectly loaded and crafted phrase. It is everything that the social engineers need to achieve their goals of manipulating and affecting public perception. Sarcasm aside, it is even more impressive because it is the “first.” This statement is a masterful conclusion of an incomplete and questionable process that avoids the hard-hitting realities and confrontations that come forth from TRUE science. Finally, I would claim that this paper does not present evidence; it presents a series of ambiguous and incomplete responses to the reasonable demands from an alert and aware global population that is truly and genuinely concerned about our environment.
This is only a partial response to a purported accredited and authoritative study. My hope is that readers will pursue honesty and thoroughness in these affairs and that they will be guided by their moral conscience toward truth.
1. Having attended the University of California at the onset of my higher education pursuits approximately 45 years ago, I must say that I am embarrassed and sorry for the state of education as it now exists in this country. What was once considered to be an honor and privilege of attendance must now be accepted with a level of disgrace to the nobler goals that were once served. I encourage each member of that institution, student, faculty and administrator, to reclaim the powers and benefits that come forth from comprehensive investigation and critical thinking to reach honest conclusions and assessments of the state of our world.
2. As of this date, the journalist referred to has not acknowledged receipt of the comments above. This statement will be revised as circumstances warrant.
Clifford E Carnicom
May 14 2016
To give the public some insight into the practical affairs of work that transpires at Carnicom Institute on a daily basis, the following diary of a representative week is offered:
Saturday, May 07, 2016:
Back in the laboratory in Idaho after a magnificent winter in the southwest portion of the country. Field studies and research continue throughout the year regardless of location, but it is good to have access to a laboratory again for several months. Spent the first half of the morning unpacking, packing and regrouping for the summer activities.
Forming a daily outline is a common task; this time the outline is for a longer time scale. Within about 20 minutes, 32 projects have been identified as topics in immediate need of research. The list is actually only partial, but it represents a good start. It is understood that completion of the list can never be achieved from this sole standpoint, but it is still valuable to lay out what needs to be done. If nothing else, it serves to help prioritize so that I can isolate the half dozen projects that are likely to show some progress during the upcoming season.
|Lab Notes, May 07 2016
The list now includes, for example:
That is a partial list. Give me a few more lifetimes and I will try to manage it. We are running out of time to assume that business as usual is OK. Do not assume anything and we will still be behind. The list only grows, but CI keeps chipping away. The MRP is collecting data nicely.
Colorado snow sample processing started, including concentration work. Initial tests on pH, oxidation reduction potential, conductivity, TDS, and UV measurements completed. pH shows up alkaline again; seems to be a pretty strong pattern there. See the work from years ago. Relationships between oxides and hydroxides to be explored. Trace metal determinations should be helpful here. ORP an interesting topic, but less commonly used and understood. Might be valuable, but no significant result there as of yet. Index of refraction comparison also a tame issue, it seems. UV work produces highly repeatable results; leads to method to determine dissolved aromatic organic content in solution. Run references with phenols. Gilson time drift can be compensated for, as required. Only slight detection in sample apparent at this point, but method developed is extremely valuable regardless of what develops here.
The internet here is the pits, it is essentially non-functional. This has been going on since the move to the new location. It is hindering and interfering with the work, I need information now and always, lots of it, and I need it fast. This is a problem. Running on a hotspot card as best we can. Once again, the issue is money on that one; a commercial internet account is needed here – long overdue.
That is a round for today. A couple of calls and this report will tie up a good part of tomorrow. Shutdown near 2100, on the early side no less.
Sunday, May 08, 2016:
Two extended calls today related to work projects and collaborative efforts. The calls were productive but they do consume a good chunk of the daylight hours.
|Lab Notes, May 08 2016
A day of DOC (Dissolved Organic Carbon) investigations and methods development. Literature shows a strong linear relationship between UV 254 nm absorption and DOC. Standards developed at different concentrations; very strong linear relationship established and duplicated (r² = 0.999, n = 4). Good work there, UV 254 has some real value, even with limited and old equipment.
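As an aside, a calibration of this sort can be sketched in a few lines. The standard concentrations and absorbance readings below are hypothetical illustrations only; the actual lab standards are not published in these notes.

```python
import numpy as np

# Hypothetical calibration standards: DOC concentration (ppm) vs.
# measured UV 254 nm absorbance. Illustrative values, not the lab's data.
doc_ppm = np.array([0.0, 2.0, 5.0, 10.0])
abs_254 = np.array([0.001, 0.042, 0.103, 0.205])

# Ordinary least-squares fit: absorbance = slope * DOC + intercept
slope, intercept = np.polyfit(doc_ppm, abs_254, 1)

# Coefficient of determination (r^2) for the linear calibration
r = np.corrcoef(doc_ppm, abs_254)[0, 1]
r_squared = r ** 2

# Invert the calibration to estimate DOC from a sample's absorbance
sample_abs = 0.025
sample_doc = (sample_abs - intercept) / slope
print(f"slope = {slope:.4f} /ppm, r^2 = {r_squared:.4f}, "
      f"sample DOC = {sample_doc:.2f} ppm")
```

With a strongly linear standard set such as this, r² comes out very close to 1, and the inverted fit places an unknown sample on the calibration line in one step.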
CO snowmelt sample analyzed per the standards comparison. The number is coming in low, on the order of 1–2 ppm. The Dissolved Organic Carbon topic is quite a bit more relevant than we might expect, but in some unexpected ways. DOC ends up being heavily tied into the ‘global warming’ issue and the so-called carbon footprint topic – but not in a way that many of us might think. The value determined is in line with the low values tracing back to 1995 research papers on the DOC of 88 different natural water sources. We all get to ask: what does a low DOC value in rainfall actually tell us? Suggest we start by reading the Clash of Evidence paper…
Also a strong tie-in of DOC with the acid-rain issue. Again, all may not be as it appears to be or as many of us think. pH data is consistently alkaline with the rainfall samples. High DOC values apparently will sway towards the acidic side. Low DOC values (as being found here on first run) are therefore more consistent with an alkaline sway (as is also being found). Twenty to thirty years ago we would all be talking about the ‘acid-rain’ issue; notice that it does not seem to be so popular anymore? The winds of environmental politics, snapchats and selfies…
As usual, the plot thickens, and DOC may end up being a more valuable tool than originally perceived. Issues of the Clash of Evidence paper, ‘global warming’ (emphasis upon quotes here), acid rain, may all come into play as we learn more here. The evidence drives the truth, not our perceptions or what we are told is the state of affairs.
The DOC project is a young one, so it is not time to read too much into the picture. Collect the data, as usual. Will reopen the Nov rainfall sample case, and examine that sample further.
CO snow melt sample continues to be concentrated, should be ready within a couple more days.
End at 0230, the wee hours.
Monday, May 09, 2016:
Morning hours occupied with the editing of two documents; the first related to a European interview and the second with respect to a collaboration project document. Time devoted to planting the seeds of future awareness and involvement; I hope that they will be fruitful in the years ahead.
|Lab Notes, May 09 2016
A revisit to the Idaho rain sample of Nov 2015. There is now the advantage of two distinct precipitation samples of sufficient quantity collected under known conditions over a period of time and distance. It takes approximately 3-4 days to prepare and concentrate each sample; accelerated evaporation must be done carefully and with good measurements for volume.
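The volume bookkeeping behind careful concentration can be sketched simply. The volumes below are hypothetical assumptions; the notes do not publish the actual starting and final volumes for these samples.

```python
# Illustrative concentration-factor arithmetic for an evaporated sample.
# All figures below are assumed for illustration only.
initial_volume_ml = 100.0   # precipitation sample as collected
final_volume_ml = 4.0       # after measured, accelerated evaporation

concentration_factor = initial_volume_ml / final_volume_ml  # 25x

# A solute level measured in the concentrate must be divided by this
# factor to estimate the level in the original precipitation.
measured_in_concentrate_ppm = 30.0
estimated_in_original_ppm = measured_in_concentrate_ppm / concentration_factor
# 30 ppm in the concentrate implies about 1.2 ppm in the original sample
```

This is why the volume measurements must be good: any error in the recorded volumes propagates directly into the back-calculated concentration of the original sample.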
The DOC calibration standards are further adjusted by the carbon fraction within the phenol standard.
DOC measurement for the Idaho sample takes place. Measurements also extended to include pH, Total Dissolved Solids (TDS), conductivity, and Oxidation Reduction Potential (ORP). First indications are that the Idaho sample has foreign materials on the order of 3 times the level of the CO sample; filtration of the snow sample may also be a factor here. The CO sample evaporation process should be complete by tomorrow, slow and steady. We definitely have some soluble metallic salts in the ID sample; time will tell with CO after concentration is complete. It is difficult for commercial lab tests to be productive unless you know what to look for first; this work is geared heavily toward that end. From the November rainfall work, the list is already rather long, including metallic salts, organophosphates, pesticides, and biologicals, for example.
Beginning emission spectroscopy work in the evening hours. Development of the setup, lighting, flame source, etc. A couple of metallic salt references explored, including sodium and strontium. Variation of the spectrum with respect to molecular form quickly comes to light (more puns here without much effort) as strontium introduces the first puzzle of the process. Investigation of research papers within the fireworks world (pyrotechnics) shows how the oxide forms can vary dramatically from the elemental and solution forms. Every field of study is its own ballgame, but time to get the feet wet on emission spectroscopy also. Application of the method is likely to be limited, namely because of difficulties of acquiring sufficient materials in solid form, but it is another good tool to get under the belt for certain circumstances.
Approx. shutdown at 0200, call it another day, the tortoise keeps marching…
Tuesday, May 10, 2016:
Detailed measurements today of the simpler parameters (pH, conductivity, TDS, ORP, DOC) of the three precipitation samples, one from CO, one from ID, and one from NM. Sample set is collected over 6 months of time, spans a considerable distance, and now begins to provide some valuable redundant data for comparison.
|Lab Notes, May 10 2016
The results are quite similar with one another in a general sense. The portrait being created is one of further investigation as to metallic salt composition as well as the basis for alkalinity across all samples. The dissolved organic carbon levels are quite low in all cases; further relationships between this finding and climate/acid rain issues will be of interest if time permits.
Assessment of the rainfall samples to date indicates the following potential areas of concern and further investigation:
Next on tap with this work: Visual analysis of concentrated samples, microscopic study, trace metal investigations (electrochemical), infra red work, and emission spectroscopy (if samples can be developed).
Certainly this one project with these three samples alone could tie me up for the next week or two; there are no shortcuts here, and the work is detailed and demanding.
Wednesday, May 11, 2016:
Significant and time-consuming administrative paperwork tended to today. Not sure if any time for lab will be available yet or not. Also an extended CI call again tonight. I also need to ride my bicycle soon, as I am overdue for putting around town and trail. Methinks that the ID sun has to be delivering again soon…
… It was a veeery long call tonight.
In the late hours, started the initial rainfall electrochemistry studies via voltammetry.
|Lab Notes, May 11 2016
Thursday, May 12, 2016:
Electrochemistry (voltammetry) work all day and well into the night. Lots of steady and patient work ahead of me now. Don’t expect any daily drama, as it is all happening in the notes for the foreseeable future.
|Lab Notes, May 12 2016
Another benefit in methods over the last couple of days. The original volume of rainfall sample required for analysis was on the order of about 100 ml (maybe a half cup or so). This was, over time, reduced to about 20 ml, which made a huge improvement in efficiency. There is now a method in place that reduces the sample size to 4 ml; this is a major improvement. Samples take time and effort and cost to collect and ship; reducing this to 4 ml makes life much easier for everyone involved. It also aids the concentration process dramatically. Not quite to micro-electrodes yet, but certainly a lot closer. Just for your info, electrodes are now small enough that individual biological cells are being investigated via electrochemistry. I am a bit more coarse than that, but major strides in sample prep reduction have now been implemented.
Friday, May 13, 2016:
And there we have it, another most un-ordinary week in the lab of CI. Most every day here is a day of discovery, questions, problems, mysteries, patience, serendipity, pursuit and, most definitely, Capricorn persistence. Leeuwenhoek did not look at things one time to be convinced, and neither do I. It is over and over and over, as there are a million ways to be tricked by what we think is real and what depends upon our senses. Anyone who thinks that science is a rote and ordained path reliant solely upon rules, dogma and certifications has missed out on the real affair. There are wonder, challenge and creativity at every step. It is a shame that much of science has lost its original spirit of openness, discovery and expression. Here, it has not.
Another day deep in the bowels of voltammetry study today. The first consolidated pass of the Idaho rain sample has been completed; three more samples to review. When that is done, I will start it all over again and repeat until reproducibility is in hand.
Most textbook examples of voltammetry usually have one, or at most two, components in the solution to examine. If only real life were so simple. The rainwater under examination indicates close to a dozen components dissolved within, and it is most certainly a very complex affair that is underway there. There are some surprises showing up, and the situation seems to be more involved than many of us might be aware of. When you consider the activity described within the preliminary reports published on this same sample, maybe that is a clue as to what lies ahead. Trace metals, inorganics, organics, organophosphates, neonicotinoids, halogens, radiation and biologicals make for a tall order to investigate thoroughly. The overall indications are not favorable to environmental and human health. Incidentally, a radiation meter would also be helpful to CI. I will do my best with what has been granted to me, and hopefully you will get to see another paper on another day.
And by the way, I still have plans to ride my bicycle for a spell.
My best to you,
Clifford E Carnicom
May 14 2016
Born Clifford Bruce Stewart
Jan 19 1953
There are many that serve to champion human rights with no return of fame and glory; they do so passionately and unselfishly to make this world a better place for us all. In many cases, we never even know who they are, as the winds of public knowledge and awareness are often transient, and they soon pass us by. This page calls attention to only a few such individuals that I have come to know over the years; those that follow below are entitled to honor and appreciation from each of us. Each of them has contributed in significant ways to the betterment of mankind. May they rest in peace.
Clifford E Carnicom
Dr. Mike Castle
July 2, 2015
April 17, 2009
A. C. Griffith “Griff”
June 19, 2012
Dr. Lorraine Hurley MD
October 2, 2015
Carl Lewis McBrayer
September 6, 2013
Dr. Ilya Sandra Perlingieri
October 6, 2013
Dr Gwen Scott, N.D.
March 15, 2015
Alfred Wyant Stites
April 11, 2016
by Clifford E Carnicom
Apr 06 2016
Edit May 14 2016
Edit Jul 05 2016
(A Partial Editorial)
There are many environmental activists who assume a certain cause and relationship between active geoengineering programs and those projects that fall under the term of “Solar Radiation Management” (SRM). This paper will reiterate the basic fallacy of that assumption, and it will direct the reader towards a more comprehensive inquiry of the true nature of the forces and agendas that are likely to be involved.
For those that do not wish to engage in the full length of this article, the Solar Radiation Management principle is one of interfering with solar heat transfer to the earth. There are various schemes for accomplishing this which will be discussed later; the most modest of the choices requires the introduction of certain types of particulates into the middle of the stratosphere (from about 7 to 30 miles above sea level).
The essential problem here is that geoengineering activity as it is currently practiced (and, for that matter, bioengineering as well) is operational in the troposphere (from ground level to an average of about 7 miles above sea level), and not the mid-stratosphere. There is a world of difference between the two, but for that discussion you will have to read further into this paper.
Before going further, however, it will be beneficial to provide a brief historical context for the issues and the language involved. There is a track record of controversy and confusion, information and misinformation, official responses and denials, organization and disorganization, research and speculation, and authorities and personalities that now span close to two decades. Unfortunately, the progress of society coming to terms and truthfulness with the deliberate modification of the atmosphere, and ultimately the planet itself, has been slow.
So first, a little history of language and personalities. The journalistic rise of the geoengineering issue began, to my best recollection, in the last few weeks of the year 1998. A certain Canadian journalist came to prominence quite rapidly on a nationally syndicated radio show, with coined language and defined agendas to let the world know that something very different and important was about to affect it. It is fair to say that I have never been at ease with either the language or the a priori “agenda” that was introduced, as they always seemed to be supported with substantial fanfare and attention, but without any basic science to support the claims being made. The issue was, essentially, outlined and served to the public without proper investigation and discussion.
It is worthwhile to investigate that history a bit, as it represents a good portion of why we are where we are today. Most of us may not be aware that generational forces are now at play in our understanding of the geoengineering issue. The language introduced at that time was the use of the term “chemtrail”, a term that never did have a formal, accurate, or scientific definition then, and it still does not today. That deficiency alone has been enough to interfere with the proper investigation of environmental pollution and contaminants, and it remains moderately successful to this day. Whether such language of derision and denial, but of popular appeal, was a product of personal creativity or design of influence I may never be able to state with certainty; I do, however, have my opinions on the matter and I see no benefits from the choice. My separation and disdain for populist and ill-defined terminology that is used in vain to seek legal standing is known, and I shall not be party to perpetuate this dubious origin. Only those words that will stand up in a court of law have merit here, and you are the one that will need to make your case.
The second great coup of the early journalistic ‘work’ was to define, in the eyes of the public, the very reason for the existence of geoengineering programs before any science was in place to justify the claim. Again, it was all far, far “too easy” for one of my persuasion. Check your internet history books, but you will find that a global and covert operation of unprecedented scale was claimed, by use of a curious combination of implication and certainty, to be for the purpose of “reducing global warming.”
History will show that there has been an incredible level of success in strategy and influence upon public perception with these implants. They are, however, in reality travesties and injustices to the public cause.
What the public was ‘given’, therefore, was an unsubstantiated agenda, ill-defined language of popular attraction, and a host of ready-made and supported ‘detractors’ that raised a commotion, provided distraction and dispute; all of these set the stage to successfully avoid journalistic integrity, scientific investigation, and accountability by public representatives. The obstacles were all provided at little cost, but at great expense to the needs and interests of the public.
This strategy of framing public perception and discussion under the guise of potential benefit was generally effective for more than a decade. Hard hitting journalism never did take place, thorough investigations were not launched, scientific work was not supported, and public officials were not held accountable.
The problem that developed was that the claim of ‘cooling the planet’ by using aircraft to disperse aerosols did not fit the facts of observation. They did not fit them then, and they do not fit them now. It has taken some time for this truth to become evident; I presented my first paper on this topic (Drought Inducement, Apr. 2002) in the early part of the last decade. This work was followed by additional papers (Global Warming and Aerosols, Jan. 2004, A Global Warming Model, Apr. 2007, and A Geoengineering and Climate Change Model, Jan. 2015) during the course of the successive decades. The tenets of that investigative work are also confirmed on a broader level by documents issued by, for example, the Intergovernmental Panel on Climate Change (IPCC 1999, 17) and NASA (“Clouds & Radiation Fact Sheet : Feature Articles” 2016) on the net heating effects from “thin, high clouds.”
High, thin “clouds”, including those that originate from an introduced aerosol base, do not cool the planet; they heat it up.
The next piece of the puzzle that we must fit into the picture is Edward Teller, and specifically the paper by him entitled, “Global Warming and Ice Ages: Prospects for Physics-Based Modulation of Global Climate Change.” This paper, authored in part by the developer of the hydrogen bomb, is often cited by activists themselves as one of the holy grails that proves that geoengineering operations are in place, and that they are indeed “cooling the planet” and “combating global warming” (albeit covertly, for some unknown reason). There are some important portions of the paper that have not been paid attention to; this omission inappropriately supports a culture of popular belief that lacks scientific foundation.
Edward Teller does indeed propose various schemes for cooling the earth’s temperature, including the introduction of aerosols or particulates into the atmosphere. The issue, however, is WHERE in the atmosphere he proposes to do this, and the answer to this question is very relevant to the cause and purpose of this paper. It is even more revealing to point out the additional options that are both proposed and preferred by Edward Teller in his paper, as they help to place his atmospheric aerosol proposal into a better perspective.
Let us spend a brief time with the proposals of Edward Teller, as they are outlined in the paper cited above. Please note that even within the introductory notes Teller uses the phrase of introducing “scatterers” (i.e., of light and heat) “into space from the vicinity of the earth”; this should give some indication of what the thrust of the thinking process is. Teller proposes to introduce the scatterers into three different locations to artificially cool the earth (Teller 1997, 7):
1. Into the middle of the stratosphere (NOT the troposphere). The stratosphere is in the upper atmosphere, and the troposphere is the lower atmosphere. This important difference will be discussed in more detail a little later in this paper.
2. In orbit, in SPACE, approximately 4000 miles above the earth.
3. Deep in SPACE, approximately 400,000 miles from the center of the earth.
An obvious pattern of diverting the heat to locations distant from the earth should be apparent to us; it is one that has not been disclosed sufficiently within the current discussions taking place with respect to both geoengineering and climate control.
The reason the materials are proposed to be so distant from the earth is two-fold:
1. Most of the materials considered will absorb heat.
2. It is desired to have the captured heat radiate into space; not into the earth and its lower atmosphere.
The principles of the approach should not be difficult to grasp here, but they most certainly have been misrepresented in most discussions that are taking place with respect to current and active geoengineering (and bioengineering) operations.
If you hold a parasol over your head on a hot sunny day, it might keep you cooler. The air around you will still absorb that heat, however. The color and material of the umbrella is going to be another factor (i.e., albedo, specific heat, etc.) that you will want to consider. If you want to cool the planet, you are going to have to move the umbrella a lot further away – into space, for example. This is the essence of the Teller paper, and it is important to understand this proposal before certain terms of “solar radiation management” with respect to current geoengineering practices are bandied about. WHERE the material is injected into the atmosphere makes a big difference in the net heat effect, and this topic has largely been ignored within the popular circles of discussion on geoengineering. This discussion should lead one to think much more deeply about what the definition of geoengineering actually is, and how that definition compares to the realities of the projects and operations AS THEY ARE CURRENTLY AND ACTIVELY PRACTICED. Climate modification strategies, or more appropriately, environmental control strategies, are only one part of a much bigger picture.
The Teller paper has gained a lot of mileage in the geoengineering circles, and it is my opinion that much of this mileage is without merit and in ignorance. I must credit the Canadian journalist again for the majority of that progress, as the seed was planted very early in the game with a great deal of supposed ‘alternative media’ support. The Teller paper never explained the physics or consequences of introducing massive amounts of specific aerosol types into the lower atmosphere. The reason for this is simple; the paper was never intended to explain it because this act is not a viable way to cool down the earth. The Teller paper was inappropriately supported and attached to the observation of and media coverage of geoengineering (and bioengineering) operations as they are currently in place and operational.
Now let’s discuss some of the differences between the troposphere and the stratosphere in more detail. The distinction between what is real and hypothetical will never take place until we put at least some effort in that direction.
The troposphere is where weather is made. The troposphere is where airplanes generally fly. The troposphere is where the air is more dense, and it is where pollution has a more immediate impact upon us. It is where the majority of the earth’s atmosphere is, and consequently it is where we can breathe and live. The troposphere has a profound and immediate impact upon our very existence on this planet. Roughly ¾ of the mass of the entire atmosphere is contained within the troposphere; the average height is about seven miles (a trip to the grocery store), and it is a veritable delicate eggshell of life for this planet. The troposphere is delicate and crucial to all life on this planet, and disturbance or pollution within it threatens our very existence; it cannot sustain serious damage without immediate consequence.
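The “roughly ¾” figure can be cross-checked with a simple isothermal barometric approximation. The 8.5 km scale height used below is a commonly quoted round value and an assumption here, not a figure from the text, and the isothermal model is only a first-order sketch.

```python
import math

# First-order check: fraction of atmospheric mass below a given altitude,
# using an isothermal barometric model. Assumed values, not measured data.
scale_height_km = 8.5            # commonly quoted pressure scale height
tropopause_km = 7.0 * 1.609      # ~7 miles converted to kilometres

# In an isothermal atmosphere, pressure (and hence overlying mass) falls
# off as exp(-h / H), so the mass fraction below altitude h is:
fraction_below = 1.0 - math.exp(-tropopause_km / scale_height_km)
# comes out on the order of 0.73, consistent with "roughly three quarters"
```

Even this crude model lands near the quoted fraction, which is why the troposphere can fairly be described as holding most of the atmosphere's mass.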
The stratosphere is where the air is very thin, centering closer to an average height of 20 miles above the earth. Airplanes cannot and do not regularly fly in the mid-stratosphere, as there is not enough air to support them; only specialized, high-performance aircraft can occasionally visit this transitional zone to space. Geoengineering (and bioengineering) operations, in a practical aviation sense with current technology, cannot be practiced there. Teller makes clear that the preferred target for his ideas is generally in space, where the heat can feasibly be diverted or managed AWAY from the earth.
Readers may also wish to review an interview from several years past on this and related subjects; it is available via Freedom Free For All TV, which is based in Canada (“Freedom Free For All TV: Clifford Carnicom Interview – YouTube” 2016).
It is now that we can understand a portion of the dilemma that is before us. If we accept that aviation is a primary tool actively being used to artificially modify the atmosphere, then we know that this is occurring within the troposphere, not the mid-stratosphere. But we also know, at least based upon Teller’s models, that mid-stratospheric operations would be required to effect any practical mitigation of global climate warming. Teller also lets us know that long-term climate control by aircraft is hardly a preferred method, as it requires specialized high-performance aircraft and continual renewal to maintain its effectiveness. What is known, therefore, is that geoengineering (and bioengineering) operations AS THEY ARE NOW PRACTICED IN THE LOWER ATMOSPHERE, i.e., the troposphere, are not directed and motivated primarily toward climate control, including the purported mitigation of “global warming”.
The forces behind the implementation of active and current geoengineering operations have always understood this, and it never has been a logical motive for the current operations. This is the case regardless of popular conceptions with popular appeal that have been circulated for far too long without contest.
It is certainly past time for the citizens of the world to understand this as well, including many of the well intended environmental individuals and organizations that affect this same citizenry.
The language may have changed some over the recent decades, but the confusion and obfuscation remains as strong as ever now. It is past time to play the cards straight and to force each of us to confront the truths of the matter.
We must now pay some attention to the language that is in vogue and how it changes. The terms ‘chemtrails’ and ‘global warming’ were foisted upon us in earlier days; ‘aerosols’ and ‘particulates’ were always favored from my position, but those terms do not exactly have popular Twitter appeal. They do, however, remain valid and accurate as to the substance of the matter.
We have now transitioned to the more socially acceptable terms of climate change, geoengineering, and “solar radiation management”. Unfortunately, the confusion behind the terms remains as dysfunctional as ever. We can be assured that the definitions of geoengineering (and bioengineering) as I understand them are not at all in agreement with many popularly held notions of those same terms. Environmental modification and control is simply one small slice of the bigger pie, as far as I am concerned. I will reiterate my scope of consideration for the term near the end of this paper.
We should, however, at least seek out the definition of the popular term (by many environmental activists as well) “Solar Radiation Management”. This term refers to the management of climate control issues through a modification of the earth’s heat balance; only one option of which includes the introduction of particulate matter into the stratosphere (NOT the troposphere).
Specifically, from the Royal Society:
“Solar Radiation Management (SRM) [are] techniques which reflect a small percentage of the sun’s light and heat back into space.”
Again, I will make the case here that the term cannot and does not apply to current and active geoengineering (and bioengineering) operations as they are currently practiced in the lower atmosphere (troposphere). The stratosphere is not the troposphere, and the troposphere is not the stratosphere. The physics of each layer within the atmosphere are completely different from one another and they cannot, in general, be “used” for the same purposes. You cannot talk about them or treat them as though there is no difference of importance.
You cannot rely on methods and definitions that have physical principles, meaning and application within a certain domain (i.e., the stratosphere) and then apply those same methods and principles to a different domain (i.e., the troposphere).
To further assume that the practitioners of active geoengineering (and bioengineering) operations are active within the mid-stratosphere when they are not (as determined by direct observation) further undermines the case for protest of the actual modification of the lower atmosphere (i.e., the troposphere) that is taking place. Talk about misrepresentation and obfuscation of a global environmental and health issue; there is plenty of fodder to work with here.
To claim further that the motives of the geoengineering practitioners are beneficial and well-intended (i.e., “solar radiation management” and the curtailment of “global warming”), but that the operations are now known to actually cause harm because of a net heating effect, is equally misguided. The operations as they are practiced are not an experiment of beneficent intent; the developers understand the physics and the applications quite well (within their sphere of interest). Rest assured that the web of deployment is not centered on, or confined to, the principles of “Solar Radiation Management”.
Current operations directly impact and affect the lower atmosphere (troposphere) in which we all live and breathe; this assertion is now supported directly by field measurements. The particulate counts are real and observable, and they have been made. The measurements referred to are not relevant solely to “climate control” considerations; they are of immediate impact and detriment to your health and well-being. Gravity works, and the materials do ultimately reach ground level and are measurable in direct correspondence to activity levels. You may wish to think a little closer to home, in some respects, and become active on that front.
Incidentally, attention should probably be called to a particular segment of a particular interview from several years past; my recollection is that a Mr. George Knapp from the Coast to Coast network moderated the affair. It is another part of the social history, “alternative” media, and social impressionability that precedes us. You may or may not choose to investigate the affair as I report it here.
It was not made clear beforehand that multiple parties would participate in the interview, and fair representation of the sides of an issue can always be a topic of debate. What remains of interest to me is a particular response evoked from a particular Canadian journalist on the panel when I introduced the subject of “biological operations” (e.g., bioengineering) into the discussion. I think it is fair to say that I must have struck a nerve in the flow or agenda of the conversation. After the claim that biological operations are indeed an active component of the aerosol operations as they are now practiced, the particular response from this “Canadian journalist” was:
“There is not! There is not! I repeat there is not any evidence of biological operations available!” (to my best recollection). The response was immediate, emphatic and unqualified.
The show’s host then immediately switched to a commercial break after this statement was made. You may judge for yourself what dynamics transpired at that moment, but the forceful response certainly struck me as out of balance within a purported discussion of important environmental issues.
In the time made available, I refuted the unsubstantiated claim then. I refute it now as well.
I am only one researcher, and I hardly claim to know all shades of an operation that I am not party to. Over the years, however, a “list of applications” has been developed which remains internally consistent with all known and observed data. The list has not changed in any significant fashion for more than a decade. I will continue to voice the claim that no discussion of geoengineering (or bioengineering) is of adequate scope unless it delves into each of the following domains:
1. Environmental modification and control (of broader scope than global temperature issues).
2. Military applications
3. Electromagnetic operations
4. Biological operations (including bioengineering)
5. Geophysical considerations
6. Surveillance System Development (LIDAR applications)
7. Exotic technology system monitoring
The prime-time audience may not be ready for the realities and implications of the various aspects itemized above, but they ultimately deserve to know them.
There are parties that continue to promulgate the thesis that Solar Radiation Management, i.e., the attempted mitigation of “global warming” via stratospheric modification, is at the crux of active geoengineering operations. There frequently remains the implication that the motives for operation are of good intention, even if the observations of consequence contradict that claim. Edward Teller’s paper is frequently cited as the basis for the implementation of theoretical concepts into actual operation, regardless of the physics or details involved. There are seldom, if ever, references to the differences between the impact of operations in the troposphere (lower atmosphere) vs. the stratosphere (upper atmosphere). There frequently is the assumption that the agendas of operation are known and defined by popular perceptions. For close to two decades, the evidence has not supported these claims, and the misrepresentation remains in place.
I would encourage that each of us seek common ground and understanding of the forces and applications that are likely operative within the spheres of active and practiced geoengineering (and bioengineering) operations. There is some value in review and observation of the social history and assumptions that accompany our evolution in the pursuit of truth. It is also wise to force good science and reason continuously into our deliberations and debates, and to admit our mistakes so that we may rise above them. If information, analyses and representations are inconsistent we must each be willing to confront those positions. I believe that the phrase has already been coined for us – “The Truth is Out There”, and it is the job of each one of us to help find it.
Clifford E Carnicom
April 06, 2016
Edit May 14, 2016
Edit July 05, 2016
Readers may also wish to become familiar with a model document that proposes an international ban on geoengineering (and bioengineering) practices. Please refer to StopGlobalGeoengineering.org for additional information (“Global Ban on Geoengineering – Stop Global Geoengineering” 2016).
Appreciation is extended to Harold Saive for a note of clarification within this paper.
“Clouds & Radiation Fact Sheet : Feature Articles.” 2016. Accessed March 24. http://earthobservatory.nasa.gov/Features/Clouds/.
“Freedom Free For All TV: Clifford Carnicom Interview – YouTube.” 2016. Accessed April 6. https://www.youtube.com/watch?v=Z1islqA3QNo.
“Geoengineering the Climate: Science, Governance and Uncertainty | Royal Society.” 2016. Accessed March 29. https://royalsociety.org/topics-policy/publications/2009/geoengineering-climate/.
“Global Ban on Geoengineering – Stop Global Geoengineering.” 2016. Accessed April 6. http://stopglobalgeoengineering.org/global-ban-on-geoengineering/.
“Image: The Stratosphere – Overview | UCAR Center for Science Education.” 2016. Accessed March 29.
Intergovernmental Panel on Climate Change (IPCC). 1999. “Aviation and the Global Atmosphere.”
Teller, Edward. 1997. “Global Warming and Ice Ages: Prospects for Physics-Based Modulation of Global Climate Change.”
“The Stratosphere – Overview | UCAR Center for Science Education.” 2016. Accessed April 6. http://scied.ucar.edu/shortcontent/stratosphere-overview.
“The Troposphere – Overview | UCAR Center for Science Education.” 2016. Accessed April 6. http://scied.ucar.edu/shortcontent/troposphere-overview.
Clifford E Carnicom
Mar 12 2016
A discrepancy between measured and observed air quality in comparison to that reported by the U.S. Environmental Protection Agency under poor conditions in real time has prompted an inquiry into the air quality standards in use by that same agency. This analysis, from the perspective of this researcher, raises important questions about the methods and reliability of the data that the public has access to, and that is used to make decisions and judgements about the surrounding air quality and its impact upon human health. The logic and rationale inherent within these same standards are now also open to further examination. The issues are important as they have a direct influence upon the perception by the public of the state of health of the environment and atmosphere. The purpose of this paper is to raise honest questions about the strategies and rationales that have been adopted and codified into our environmental regulatory systems, and to seek active participation by the public in the evaluation process. Weaknesses in the current air quality standards will be discussed, and alternatives to the current system will be proposed.
Particulate Matter (PM) has an important effect upon human health. Currently, there are two standards for measuring the particulate matter in the atmosphere, PM 10 and PM 2.5. PM 10 consists of material less than 10 microns in size and is often composed of dust and smoke particles, for example. PM 2.5 consists of materials less than 2.5 microns in size and is generally invisible to the human eye until it accumulates in sufficient quantity. PM 2.5 material is considered to be a much greater risk to human health as it penetrates deeper into the lungs and the respiratory system. This paper is concerned solely with PM 2.5 pollution.
As an introduction to the inquiry, curiosity can certainly be called to attention with the following statement by the EPA in 2012, as taken from a document (U.S. Environmental Protection Agency 2012, 1) that outlines certain changes made relatively recently to air quality standards:
“EPA has issued a number of rules that will make significant strides toward reducing fine particle pollution (PM 2.5). These rules will help the vast majority of U.S. counties meet the revised PM 2.5 standard without taking additional action to reduce emissions.”
Knowing and studying the “rule changes” in detail may serve to clarify this statement, but on the surface it certainly conveys the impression of a scenario whereby a teacher changes the mood in the classroom by letting the students know that more of them will be passing the next test. Even better, they won’t need to study any harder and they will still get the same result.
In contrast, the World Health Organization (WHO) is a little more direct (World Health Organization 2013, 10) about the severity and impact of fine particle pollution (PM 2.5):
“There is no evidence of a safe level of exposure or a threshold below which no adverse health effects occur. The exposure is ubiquitous and involuntary, increasing the significance of this determinant of health.”
We can, therefore, see that there are already significant differences in the interpretation of the impact of fine particle pollution (especially from an international perspective), and that the U.S. EPA is not exactly setting a progressive example toward improvement.
Another topic of introductory importance is that of the AQI, or “Air Quality Index” that has been adopted by the EPA (“Air Quality Index – Wikipedia, the Free Encyclopedia” 2016). This index is of the “idiot light” or traffic light style, where green means all is fine, yellow is to exercise caution, and red means that we have a problem. The index, therefore, has the following appearance:
There are other countries that use a similar type of index and color-coded scheme. China, for example, uses the following scale (“Air Quality Index – Wikipedia, the Free Encyclopedia” 2016):
As we continue to examine these scale variations, it will also be of interest to note that China is known to have some of the most polluted air in the world, especially over many of the urban areas.
Not all countries, jurisdictions or entities, however, use the idiot-light approach, which employs an arbitrary scaling method removed from the actual PM 2.5 pollution concentrations, such as those shown from the United States and China above. For example, the United Kingdom uses a scale (“Air Quality Index – Wikipedia, the Free Encyclopedia” 2016) that depends directly upon actual PM 2.5 concentrations, as shown below:
Notice that the PM 2.5 concentration for the U.K. index is directly accessible and that the scaling for the index is dramatically different from that for the U.S. or China. In the case of the AQI used by the U.S. and China (and other countries as well), a transformed scale runs from 0 to 300-500, with concentration levels that are generally more obscure and ambiguous within the index. In the case of the U.K. index, the scale directly reports a specific PM 2.5 concentration level, with a maximum (i.e., ~70 ug/m^3) that is far below that incorporated into the AQI index (i.e., 300-500 ug/m^3).
We can be assured that if a reading of 500 ug/m^3 is ever before us, we have a much bigger problem on our hands than discussions of air quality. The EPA AQI is heavily biased toward extreme concentration levels that are seldom likely to occur in practical affairs; the U.K. index gives much greater weight to the lower concentration levels that are known to directly impact health, as reflected by the WHO statement above.
Major differences in the scaling of the indices, as well as their associated health effects, are therefore hidden within the various color schemes that have been adopted by various countries or jurisdictions. Color has an immediate impact upon perception and communication; the reality is that most people will seldom, if ever, explore the basis of such a system as long as the message is “green” under most circumstances that they are presented with. The fact that one system acknowledges serious health effects at a concentration level of 50 – 70 ug/m^3 and that another does not do so until the concentration level is on the order of 150 – 300 ug/m^3 is certainly lost to the common citizen, especially when the scalings and color schemes chosen obscure the real risks that are present at low concentrations.
The EPA AQI system appears to have its roots in history as opposed to simplicity and directness in describing the pollution levels of the atmosphere, especially as it relates to the real-time known health effects of even short-term exposure to lower PM 2.5 concentrations. The following statement (“Air Quality Index | World Public Library” 2016) acknowledges weaknesses in the AQI since its introduction in 1968, yet the methods have nevertheless been perpetuated for more than 45 years.
“While the methodology was designed to be robust, the practical application for all metropolitan areas proved to be inconsistent due to the paucity of ambient air quality monitoring data, lack of agreement on weighting factors, and non-uniformity of air quality standards across geographical and political boundaries. Despite these issues, the publication of lists ranking metropolitan areas achieved the public policy objectives and led to the future development of improved indices and their routine application.”
A system that color-codes to extreme and rarefied levels, using an averaged and biased scale, rather than one that directly reports PM 2.5 concentration levels in real time, is an artifact divorced from current observed measurements and from the knowledge of the impact of fine particulates upon human health.
The reporting of PM 2.5 concentrations directly along with a more realistic assessment of impact upon human health is hardly unique to the U.K. index system. With little more than casual research, at least three other independent systems of measurement have been identified that mirror the U.K. maximum scaling levels along with the commensurate PM 2.5 counts. These include the World Health Organization, a European environmental monitoring agency, and a professional metering company index scale (World Health Organization 2013, 10) (“Air Quality Now – About US – Indices Definition” 2016) (“HHTP21 Air Quality Meter, User Manual, Omega Engineering” 2016, 10).
As another example to gain perspective between extremes and maximum “safe” levels of PM 2.5 concentrations, we can recall an event that occurred in Beijing, China during November 2010, and that was reported by the New York Times in January of 2013 (Wong 2013). During this extreme situation, the U.S. Embassy monitoring equipment registered a PM 2.5 reading of 755, and the story certainly made news as the levels blew out any scale imaginable, including those that set maximums at 500.
A subsequent statement within the article that references the World Health Organization standards may be the lasting impression that we should carry forward from the horrendous event, where it is stated that:
“The World Health Organization has standards that judge a score above 500 to be more than 20 times the level of particulate matter in the air deemed safe.”
Notwithstanding the fact that the WHO also states that there is no evidence of any truly “safe” level of particulate matter in the atmosphere, we can nevertheless back out from this statement a maximum “safe” level for the PM 2.5 count, as assessed by the WHO, of approximately 25 ug/m^3 (i.e., 500 / 20). This statement alone should convince us that we must pay close attention to the lower levels of pollution that enter the atmosphere, and that public perception should not be distorted by scales and color schemes that usually only raise an alarm when concentrations number into the hundreds.
Let us gain a further understanding of how low concentration levels and small changes affect human health and, shall I daresay, mortality. The case for low PM 2.5 concentrations being seriously detrimental to human health is strong and easy to make. Casual research on the subject will uncover a host of research papers that quantify increased mortality rates in direct relationship to small changes in PM 2.5 concentrations, usually expressed as a change in mortality per 10 ug/m^3. Such papers are not operating in the arena of scores to hundreds of micrograms per cubic meter, but on the order of TEN micrograms per cubic meter. This work underscores the need to update the air quality standards, methods and reporting to the public based upon current health knowledge, instead of continuing a system of artifacts based upon decades-old postulations.
These papers refer to both daily mortality levels as well as long-term mortality based upon these “small” increases in PM 2.5 concentrations. The numbers are significant from a public health perspective. As a representative article, consider the following recent paper published in Environmental Health Perspectives in June of 2015, under the auspices of the National Institute of Environmental Health Sciences (Shi et al. 2015):
[The paper’s conclusions, and the results upon which they are based, were quoted here in the original; they report percentage increases in mortality per 10 ug/m^3 increase in PM 2.5 concentration.]
Let us therefore assume a more conservative increase of 2% in mortality for a short-term exposure (i.e., 2 days) per TEN (not 12, not 100, not 500 per AQI scaling) micrograms per cubic meter. Let us assume a mortality increase of 7% for long-term exposure (i.e., 365 days).
Let us put these results into further perspective. A sensible question to ask is, given a certain level of fine particulate pollution introduced into the air for a certain number of days within the year, how many people would die as a consequence of this change in our environment? We must understand that the physical nature of the particulates is being ignored here (e.g., toxicity, solubility, etc.) other than that of the size being less than 2.5 microns.
The data results suggest a logarithmic form of influence, i.e. a relatively large effect for short term exposures, and a subsequently more gradual impact for long term exposure. A linear model is the simplest approach, but it also is likely to be too modest in modeling the mortality impact. For the purpose of this inquiry, a combined linear-log approach will be taken as a reasonably conservative approach.
The model developed, therefore, is of the form:
Mortality % Increase (per 10 ug/m^3) = 1.65 + 0.007 * (days) + 0.48 * ln(days)
The next step is to choose the activity level and time period for which we wish to model the mortality increase. Although any scenario within the data range could be chosen, a reasonably conservative approach will also be adopted here. The scenario chosen will be to introduce 30 ug/m^3 of fine particulate matter into the air for 10% of the days within a year.
The model therefore estimates a 3.6% increase in mortality for 10 ug/m^3 of introduced PM 2.5 materials (36.5 days). For 30 ug/m^3, we will therefore have a 10.9% increase in mortality. As we can see, the numbers can quickly become significant, even with relatively low or modest PM 2.5 increases in pollution.
Next we transform this percentage into real numbers. For the year 2013, the Centers for Disease Control (CDC) reports that 2,596,993 people died from all causes combined (“FastStats” 2016). Applying the 10.9% increase to this number results in 283,072 additional projected deaths per year.
Continuing to place this number into perspective, this number exceeds the number of deaths that result from stroke, Alzheimer’s, and influenza and pneumonia combined (i.e, 5th, 6th, and 8th leading causes of death) during that same year. The number is also much higher than the death toll for Chronic Pulmonary Obstructive Disease (COPD), which is now curiously the third leading cause of death.
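The arithmetic above can be checked in a few lines. The sketch below (variable names are mine) evaluates the combined linear-log model at 36.5 days of exposure, scales linearly to 30 ug/m^3, and applies the result to the CDC’s 2013 all-cause death count:

```python
from math import log

def mortality_pct_increase(days):
    """Combined linear-log model: % mortality increase per 10 ug/m^3 of PM 2.5."""
    return 1.65 + 0.007 * days + 0.48 * log(days)

days = 0.10 * 365                           # exposure on 10% of the days in a year
per_10 = mortality_pct_increase(days)       # ~3.6% per 10 ug/m^3
pct = per_10 * (30 / 10)                    # linear scaling to 30 ug/m^3 -> ~10.9%
deaths_2013 = 2_596_993                     # CDC all-cause deaths, 2013
excess = deaths_2013 * round(pct, 1) / 100  # ~283,072 additional projected deaths
```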
We should now understand that PM 2.5 pollution levels are a very real concern with respect to public health, even at relatively modest levels. Some individuals might argue that such a scenario could never occur, as the EPA has lowered the annual PM 2.5 standard to 12 ug/m^3. The enforcement and sensitivity of that measurement standard is another discussion that will be reserved for a later date. Suffice it to say that the scenario chosen here is not unrealistic for consideration, and that it is in the public’s interest to engage in this discussion and examination.
The next issue of interest to discuss is that of a comparison between different air quality scales in some detail. In particular, the “weighting”, or influence, of lower concentration levels vs. higher concentration levels will be examined. This topic is important because it affects the interpretation by the public of the state of air quality, and it is essential that the impacts upon human health are represented equitably and with forthrightness.
The explanation of this topic will be considerably more detailed and complex than the former issues of “color coding” and mortality potentials, but it is no less important. The results are at the heart of the perception of the quality of the air by the public and its subsequent impact upon human health.
To compare the different scales of air quality that have been developed, we must first equate them. For example, if one scale ranges from 1 to 6, and another from 0 to 10, we must “map”, or transform, them such that the scales are of equivalent range. Another need in the evaluation of any scale is to look at the distribution of concentration levels within that same scale, and to compare this on an equal footing as well. Let us get started with an important comparison between the EPA AQI and alternative scales that deserve equal consideration in the representation of air quality.
Here is the structure of the EPA AQI in more detail (U.S. Environmental Protection Agency 2012, 4):
|AQI Index||AQI Arbitrary Numeric||AQI Rank||PM 2.5 (ug/m^3) 24 hr avg.|
|Good||0-50||1||0.0-12.0|
|Moderate||51-100||2||12.1-35.4|
|Unhealthy for Sensitive Groups||101-150||3||35.5-55.4|
|Unhealthy||151-200||4||55.5-150.4|
|Very Unhealthy||201-300||5||150.5-250.4|
|Hazardous||301-500||6||250.5-500.4|
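For reference, the EPA converts a 24-hour PM 2.5 average into an AQI value by linear interpolation within each concentration band. A minimal sketch of that calculation, using the standard 2012 breakpoints (the function name is illustrative), is:

```python
# (AQI low, AQI high, PM2.5 low, PM2.5 high, category) -- standard 2012 EPA breakpoints
BREAKPOINTS = [
    (0,   50,  0.0,   12.0,  "Good"),
    (51,  100, 12.1,  35.4,  "Moderate"),
    (101, 150, 35.5,  55.4,  "Unhealthy for Sensitive Groups"),
    (151, 200, 55.5,  150.4, "Unhealthy"),
    (201, 300, 150.5, 250.4, "Very Unhealthy"),
    (301, 500, 250.5, 500.4, "Hazardous"),
]

def pm25_to_aqi(c):
    """Linearly interpolate a 24-hour PM 2.5 average (ug/m^3) to an AQI value."""
    for i_lo, i_hi, c_lo, c_hi, category in BREAKPOINTS:
        if c_lo <= c <= c_hi:
            aqi = (i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo
            return round(aqi), category
    raise ValueError("concentration beyond AQI scale")
```

Note how wide the upper bands are: the entire “Unhealthy” category spans nearly 95 ug/m^3 of concentration, while the first two AQI categories together cover only about 35 ug/m^3.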
Now let us become familiar with three alternative scaling and health assessment scales that are readily available and that acknowledge the impact of lower PM 2.5 concentrations to human health:
|United Kingdom Index||U.K. Nomenclature||PM 2.5 ug/m^3 24 hr avg.|
|1||Low||0-11|
|2||Low||12-23|
|3||Low||24-35|
|4||Moderate||36-41|
|5||Moderate||42-46|
|6||Moderate||47-53|
|7||High||54-58|
|8||High||59-64|
|9||High||65-70|
|10||Very High||71 or more|
Now for a second alternative air quality scale, this being from Air Quality Now, a European monitoring entity:
|Air Quality Now EU Rank||Nomenclature||PM 2.5 Hr||PM 2.5 24 Hrs.|
And lastly, the scale from a professional air quality meter manufacturer:
|Professional Meter Index||Nomenclature||PM 2.5 ug/m^3 Real Time Concentration|
We can see that the only true common denominator between all scaling systems is the PM 2.5 concentration. Even with the acceptance of that reference, there remains the issue of “averaging” a value, or acquiring maximum or real time values. Setting aside the issue of time weighting as a separate discussion, the most practical means to equate the scaling system is to do what is mentioned earlier: First, equate the scales to a common index range (in this case, the EPA AQI range of 1 to 6 will be adopted). Second, inspect the PM 2.5 concentrations from the standpoint of distribution, i.e., evaluate these indices as a function of PM 2.5 concentrations. The results of this comparison follow below, accepting the midpoint of each PM 2.5 concentration band as the reference point:
|PM 2.5 (ug/m^3)||EPA AQI||UK||EU (1hr)||Meter|
This table reveals the essence of the problem; the skew of the EPA AQI index toward high concentrations that diminishes awareness of the health impacts from lower concentrations can be seen within the tabulation.
This same conclusion will be demonstrated graphically at a later point.
Now that all air quality scales are referenced to a common standard (i.e., the PM 2.5 concentration), the general nature of each series can be examined via a regression analysis. It will be found that a logistic function is a favored functional form in this case, and the results of that analysis are as follows:
EPA Index (1-6) = 5.57 / (1 + 2.30 * exp(-0.016 * PM2.5))
Mean Square Error = 0.27
Mean (UK - EU - Meter) Index (1-6) = 6.03 / (1 + 5.65 * exp(-0.046 * PM2.5))
Mean Square Error = 0.01
The information that will now be of value in evaluating the weighting distribution applied to various concentration levels is the integration of the logistic regression curves as a function of bandwidth. The result of the integration process (Int.) applied to the above regressions is as follows:
|PM 2.5 Band||EPA AQI (Int.) [Index * PM 2.5]||Mean Index (Int.) [Index * PM 2.5]||% Relative Over/Underweight of PM 2.5 Band Contribution Between EPA AQI and Mean Alternative Air Quality Index Scale (Endpoint Bias Removed)|
A graph of a regression curve to the % Relative Overweight/Underweight data in the final column of the table above is as follows (band interval midpoints selected; standard error = 4.1%).
And, thus, we are led to another interpretation regarding the demerits of the EPA AQI. The EPA AQI scaling system unjustifiably under-weights the harmful effects of PM 2.5 concentrations that are most likely to occur in real world, real time, daily circumstances. The scale over-weights the impacts of extremely low concentrations that have little to no impact upon human health. And lastly, when the PM 2.5 concentrations are at catastrophic levels and the viability of life itself is threatened, all monitoring sources, including the EPA, are in agreement that we have a serious situation. One must seriously question the public service value under such distorted and disproportionate representation of this important monitor of human health, the PM 2.5 concentration.
Let us proceed to an additional serious flaw in the EPA air quality standards: the issue of averaging the data. It will be noticed that the current EPA standard for PM 2.5 air quality is 12 ug/m^3, as averaged over a 24-hour period. On the surface, this value appears to be reasonably sound, cautious and protective of human health. A significant problem, however, occurs when we understand that the value is averaged over a period of time, and is not reflective of real-time dynamic conditions that involve “short-term” exposures.
To begin to understand the nature of the problem, let us present two different scenarios:
In the first scenario, the PM 2.5 count in the environment is perfectly even and smooth, let us say at 10 ug/m^3. This is comfortably within the EPA air quality standard “maximum” per a 24 hour period, and all appears well and good.
In the second scenario, the PM 2.5 count is 6 ug/m^3 for 23 of the 24 hours in a day. For one hour per day, however, the count rises to 100 ug/m^3 before settling back to 6 ug/m^3 in the following hour.
Instinctively, most of us will realize that the second scenario poses a significant health risk, as we understand that maximum values may be as important as (or even more important than) an average value. One could equate this to a dose of radiation, for example, where a short-term exposure could produce a lethal result, while an average value over a sufficiently long time period might persuade us that everything is fine.
And this, therefore, poses the problem that is before us.
In the first scenario, the weighted average PM 2.5 count over a 24 hour period is 10 ug/m^3.
In the second scenario, the weighted average over the same period is (23 × 6 + 1 × 100) / 24 ≈ 9.9 ug/m^3, essentially the same value.
Both scenario averages fall within the current EPA air quality maximum pollution standard.
Clearly, this method has the potential for disguising significant threats to human health if “short-term” exposures occur on any regular basis. Observation and measurement will show that they do.
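The two scenarios above can be checked with a short calculation (a sketch; the hour-by-hour profiles are the illustrative values given in the text):

```python
# Compare 24-hour average PM 2.5 for the two illustrative scenarios.
# Scenario 1: a constant 10 ug/m^3 for all 24 hours.
scenario_1 = [10.0] * 24

# Scenario 2: 6 ug/m^3 for 23 hours, with a one-hour spike to 100 ug/m^3.
scenario_2 = [6.0] * 23 + [100.0]

avg_1 = sum(scenario_1) / len(scenario_1)   # 10.0
avg_2 = sum(scenario_2) / len(scenario_2)   # (23*6 + 100)/24 ≈ 9.92

EPA_24H_STANDARD = 12.0  # ug/m^3, per the 2012 EPA PM 2.5 standard

# Both averages fall under the 24-hour standard, yet scenario 2 contains a
# short-term exposure ten times higher than scenario 1's peak.
print(avg_1 <= EPA_24H_STANDARD, avg_2 <= EPA_24H_STANDARD, max(scenario_2))
```

The averaging step erases exactly the information (the one-hour spike to 100 ug/m^3) that distinguishes the two exposure profiles.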
Now that we have seen some of the weaknesses of the averaging methods, let us look at an additional scenario based upon more realistic data, but that continues to show a measurable influence upon human health. The scenario selected has a basis in recent and independently monitored PM 2.5 data.
The situation in this case is as follows:
This model scenario will postulate that the following conditions occur for approximately 10% of the days in a year. For that period, let us assume that for 13.5 hours of the day the PM 2.5 count is essentially nil at 2 ug/m^3. For the remaining 10.5 hours of the day during that same 10% of the year, let us assume the average PM 2.5 count is 20 ug/m^3. The range of the PM 2.5 count during the 10.5 hour period is from 2 to 60 ug/m^3, but the average of 20 ug/m^3 (representing a significant increase) will be the value used for the analysis. For the remainder of the year, very clean air will be assumed at a level of 2 ug/m^3 for all hours of the day.
A more extended discussion of the nature of this data is anticipated at a later date, but suffice it to say that the energy of sunlight is the primary driver for the difference in the PM 2.5 levels throughout the day.
The next step in the problem is to determine the number of full days that correspond to the concentration level of 20 ug/m^3, and also to provide for the fact that the elevated levels will be presumed to exist for only 10% of the year. The value that results is:
0.10 * (365 days) * (10.5 hrs / 24 hrs) ≈ 16 full days at the 20 ug/m^3 concentration level.
As a reference point, we can now estimate the increase in mortality that will result for an arbitrary 10 ug/m^3 (based upon the relationship derived earlier):
Mortality % Increase (per 10 ug/m^3) = 1.65 + 0.007(16 days) + 0.48 * ln(16 days)
Mortality % Increase (per 10ug/m^3) = 3.1%
The increase in this case is 18 ug/m^3 (20 ug/m^3 – 2 ug/m^3), however, and the mortality increase to be expected is therefore:
Mortality % Increase (per 18ug/m^3 increase) = 1.8 * 3.1% = 5.6%.
Once again, to place this number into perspective, we translate this percentage into projected deaths (as based upon CDC data, 2013):
0.056 * (2,596,993) ≈ 145,432 projected additional deaths.
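The arithmetic of this scenario can be retraced end to end (a sketch; the regression coefficients and the CDC total-deaths figure are those stated in the text):

```python
import math

# Full-day equivalents at the elevated concentration:
# 10% of the year, 10.5 of 24 hours per affected day.
days_equivalent = 0.10 * 365 * (10.5 / 24)   # ≈ 15.97
full_days = round(days_equivalent)           # 16, as used in the text

# Mortality % increase per 10 ug/m^3, from the relationship derived
# earlier in the paper.
pct_per_10 = 1.65 + 0.007 * full_days + 0.48 * math.log(full_days)  # ≈ 3.1 %

# The actual increase is 18 ug/m^3 (20 - 2), i.e. 1.8 times the
# 10 ug/m^3 reference step.
pct_per_18 = round(1.8 * pct_per_10, 1)      # ≈ 5.6 %

# Translate into projected deaths using total U.S. deaths (CDC, 2013).
us_deaths_2013 = 2_596_993
projected = (pct_per_18 / 100) * us_deaths_2013   # ≈ 145,400
```

Note that the final figure depends on rounding the intermediate percentages, as the text does; carrying full precision throughout shifts the projection by under one percent.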
This value is essentially equivalent (again, curiously) to the third leading cause of death, namely Chronic Obstructive Pulmonary Disease (COPD), with a reported 149,205 deaths in 2013.
It is understood that a variety of factors will ultimately lead to mortality rates, however, this value may help to put the significance of “lower” or “short-term” exposures to PM 2.5 pollution into perspective.
It should also be recalled that the averaging of PM 2.5 data over a 24 hour period can significantly mask the influences of such “short-term” exposures.
A remaining issue of concern with respect to AQI deficiencies is its accuracy in reflecting real world conditions in a real-time sense. The weakness of averaging data has already been discussed to some extent, but the issue in this case is of a more practical nature. Independent monitoring of PM 2.5 data over a reasonably broad geographic area has produced directly visible and measurable conflicts with the state of air quality reported by the EPA.
After close to twenty years of public research and investigation, there is no rational denial that the citizenry is subject to intensive aerosol operations on a regular and frequent basis. These operations are conducted without the consent of that same public. The resulting contamination and pollution of the atmosphere is harmful to human health. The objective here is to simply document the changes in air quality that result from such a typical operation, and the corresponding public reporting of air quality by the EPA for that same time and location.
Multiple occasions of this activity are certainly open to further examination, but a representative case will be presented here in order to disclose the concern.
Typical Conditions for Non-Operational Day.
Sonoran National Monument – Stanfield AZ
Aerosol Operation – Early Hours
Jan 19 2016 – Sonoran National Monument – Stanfield AZ
Aerosol Operation – Mid-Day Hours
Jan 19 2016 – Sonoran National Monument – Stanfield AZ
EPA Website Report at Location and Time of Aerosol Operation.
Jan 19 2016 – Sonoran National Monument – Stanfield AZ
Air Quality Index : Good
Forecast Air Quality Index : Good
Health Message : None
Current Conditions : Not Available
The PM 2.5 measurements that correlate with the above photographs are as follows:
With respect to the non-operational day photograph, clean air can and does exist at times in this country, especially in the more remote portions of the southwestern U.S. under investigation. It is quite typical to have PM 2.5 counts from 2 to 5 ug/m^3, which fall under the category of very good air quality by any index used. Low PM 2.5 counts are especially prone to occur after periods of heavier rain, as particulates are purged from the atmosphere. El Niño has been especially influential in this regard during the earlier portion of this winter season. Visibility conditions are a direct reflection of the PM 2.5 count.
On the day of the aerosol operation, the PM 2.5 counts were not low and visibility down to ground level was highly diminished. The range of values throughout the day was from 2 to 57 ug/m^3, with the low values occurring before sunrise and after sundown. The highest value of 57 occurred during mid-afternoon. A PM 2.5 value of 57 ug/m^3 is considered poor air quality by many alternative and contemporary air quality standards, and the prior discussions on mortality rates for “lower” concentrations should be consulted above. This high value has no corollary, thus far, during non-aerosol-operational days. From a common sense point of view, the conditions recorded by both photograph and measurement were indeed unhealthy. Visibility was diminished from a typical 70+ miles in the region to approximately 30 miles during the operational period. Please refer to the earlier papers (Visibility Standards Changed, March 2001 and Mortality vs. Visibility, June 2004; also additional papers) for additional discussions related to these topics.
The U.S. Environmental Protection Agency reports no concerns, no immediate impact, nor any potential impact to health or the environment during the aerosol operation at the nearest reporting location.
This paper has reviewed several factors that affect the interpretation of the Air Quality Index (AQI) as it has been developed and is used by the U.S. Environmental Protection Agency (EPA). In the process, several shortcomings have been identified:
1. The use of a color scheme strongly affects the perception of the index by the public. The colors used in the AQI are not consistent with what is now known about the impact of fine particulate matter (PM 2.5) to human health. The World Health Organization (WHO) acknowledges that there are NO known safe levels of fine particulate matter, and the literature also acknowledges the serious impact of low concentration levels of PM 2.5, including increased mortality.
2. The scaling range adopted by the AQI is much too large to adequately reveal the impact of the lower concentration levels of PM 2.5 to human health. A range of 500 ug/m^3 attached to the scale when mortality studies acknowledge significant impact at a level of 10 ug/m^3 is out of step with current needs by the public.
3. The underweighting of the lower PM 2.5 concentration levels relative to more contemporary scales that adequately emphasize lower level health impacts obscures health impacts which deserve more prominent exposure.
4. The AQI numeric scale is divorced from actual PM 2.5 concentration levels. The arbitrary scaling has no direct relationship to existing and actual concentrations of mass to volume ratios. The actual conditions of pollution are therefore hidden by an arbitrary construct that obscures the impact of pollution to human health.
5. The AQI is a historic development that has been maintained in various incarnations and modifications since its origin more than 45 years ago. The method of presentation and computation is opaque and appears to exist as a legacy of the past rather than directly portraying pollution health risks.
6. The averaging of pollution data over a time period that filters out short term exposures of high magnitude is unnecessary and it hinders the awareness of the actual conditions of exposure to the public.
7. Presentation of air quality information through the authorized portal appears to present potential conflicts between reported information and actual field condition observation, data and measurement.
In the opinion of this researcher the AQI, as it exists, should be revamped or discarded. Allowing for catastrophic pollution in the development of the scale is commendable, but not if it interferes with the presentation of useful and valuable information to the public on a practical and daily basis.
There is a partial analogy here with the scales used to report earthquakes and other natural events, as they are of an exponential nature and they provide for extreme events when they occur. It is now known, however, that very low levels of fine particulate matter are very harmful to human health. Any scaling chosen to represent the state of pollution in the atmosphere must correspondingly emphasize and reveal this fact. This is what matters on a daily basis in the practical affairs of living; the extreme events are known to occur but they should not receive equal (or even greater) emphasis in a daily pollution reporting standard. It is primarily a question of communicating to the public directly in real-time with actual data, versus the adherence to decades old legacies and methods that do not accurately portray modern pollution and its sources.
It seems to me that a solution to the problem is fairly straightforward; the issue is whether such a transformation can be made on a national level and whether it has strong public support. Many other scaling systems have already made the switch to emphasize the impact of lower concentration levels on human health; adopting the same emphasis would seem advisable given the actual needs of society.
It is a fairly simple matter to reconstruct the scale for an air quality index. THE SIMPLEST SOLUTION IS TO REPORT THE CONCENTRATION LEVELS DIRECTLY, IN REAL-TIME MODE. If the PM 2.5 pollution level at a particular location is, for example, 20 ug/m^3, then report it as such. This is not hard to do, and technology fully supports this direct change and access to data. We do not average our rain when it rains, we do not average our sunlight when we report how clear the sky is, we do not average the cloud cover, and we do not average how far we can see. The environmental conditions exist as they are, and they should be reported as such. There is no need to manipulate or “transform” the data, as is being done now. A linear scale can also be matched fairly well to the majority of daily life needs, and the extreme ranges can be accommodated without any severe distortion of the system. The relationship between visibility and PM 2.5 counts will be quickly and readily assimilated by the public when the actual data is simply available in real-time mode, as it needs to be and should be. Of course, greater public awareness of the actual conditions of pollution may also lead to a stronger investigation of their source and nature; this may or may not be as welcome in our modern society. I hope that it will be, as the health of our generation, succeeding generations, and of the planet itself is dependent upon our willingness to confront the truths of our own existence.
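For contrast, the transformation being objected to can be sketched. The EPA converts a PM 2.5 concentration into an index value by linear interpolation within fixed breakpoint bands; the breakpoints below are the published 2012 PM 2.5 values, reproduced here as an illustration of the indirection involved:

```python
# EPA AQI piecewise-linear transform for PM 2.5 (24-hour average).
# Breakpoint tuples: (C_lo, C_hi, I_lo, I_hi), per the 2012 revision.
BREAKPOINTS = [
    (0.0, 12.0, 0, 50),
    (12.1, 35.4, 51, 100),
    (35.5, 55.4, 101, 150),
    (55.5, 150.4, 151, 200),
    (150.5, 250.4, 201, 300),
    (250.5, 350.4, 301, 400),
    (350.5, 500.4, 401, 500),
]

def pm25_to_aqi(conc: float) -> int:
    """Map a PM 2.5 concentration (ug/m^3) onto the 0-500 AQI scale."""
    for c_lo, c_hi, i_lo, i_hi in BREAKPOINTS:
        if c_lo <= conc <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (conc - c_lo) + i_lo)
    raise ValueError("concentration beyond the scale")

# Direct reporting, as proposed in the text, would simply publish the
# measured value (e.g. 20 ug/m^3) rather than the derived index.
print(pm25_to_aqi(20.0))   # 68, the "Moderate" band
```

A reader given the index value 68 cannot recover the underlying 20 ug/m^3 without the breakpoint table, which is the author's point about the arbitrary construct standing between the public and the measurement.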
Clifford E Carnicom
Mar 12, 2016
Born Clifford Bruce Stewart
Jan 19, 1953
“AirNow.” 2016. Accessed March 13. https://www.airnow.gov/.
“Air Quality Index | World Public Library.” 2016. Accessed March 13. http://www.worldlibrary.org/articles/air_quality_index.
“Air Quality Index – Wikipedia, the Free Encyclopedia.” 2016. Accessed March 13. https://en.wikipedia.org/wiki/Air_quality_index.
“Air Quality Now – About US – Indices Definition.” 2016a. Accessed March 13. http://www.airqualitynow.eu/about_indices_definition.php.
“FastStats.” 2016. Accessed March 13. http://www.cdc.gov/nchs/fastats/deaths.htm.
“HHTP21 Air Quality Meter, User Manual, Omega Engineering.” 2016.
Shi, Liuhua, Antonella Zanobetti, Itai Kloog, Brent A. Coull, Petros Koutrakis, Steven J. Melly, and Joel D. Schwartz. 2015. “Low-Concentration PM2.5 and Mortality: Estimating Acute and Chronic Effects in a Population-Based Study.” Environmental Health Perspectives 124 (1). doi:10.1289/ehp.1409111.
U.S. Environmental Protection Agency. 2012. “Revised Air Quality Standards for Particle Pollution and Updates to the Air Quality Index (AQI).”
Wong, Edward. 2013. “Beijing Air Pollution Off the Charts.” The New York Times, January 12. http://www.nytimes.com/2013/01/13/science/earth/beijing-air-pollution-off-the-charts.html.
World Health Organization. 2013. “Health Effects of Particulate Matter, Policy Implications for Countries in Eastern Europe, Caucasus and Central Asia.”
This page provides notice that renewed efforts of computer hacking directed toward Carnicom Institute may be in effect. There are strong indications that research data has been selectively retrieved. There appear to be multiple infiltrations of email accounts. Computer failure and software disturbance coincident with these events have occurred. There is precedent for such actions in the history of the research that has been conducted; recent years, however, had provided a hiatus in this regard. The security of information acquired through research and its presentation cannot be guaranteed. The methods and strategies used appear to be sophisticated.
If this statement is found to be in error it will be corrected.
Clifford E Carnicom
CI Collaborates with NHFC
Late on the night of September 23rd, Clifford Carnicom, founder and president of Carnicom Institute, set out by train from Spokane, Washington to attend the 2014 United States Health Freedom Congress in St. Paul, Minnesota. He was invited by Diane Miller, JD, Director of Law and Public Policy for the National Health Freedom Coalition (NHFC) and National Health Freedom Action. These organizations bring together leaders from across the country who are working toward health freedoms and the legislation that can secure these freedoms for people in the United States. Clifford was honored to be invited and to offer a view of his work to people unfamiliar with it. At the same time, he was eager to learn about the work of the other members. I interviewed Clifford upon his return.
KW: Clifford, you seem very eager to share your experience of the Health Freedom Congress. What happened there? What’s it all about?
CEC: I am excited by my experience there. It was so encouraging to see that many other organizations are working along similar lines as Carnicom Institute. We each have our different focus areas, of course, but generally we are working to provide a climate in which people have access to health care of their choosing, rather than being forced into accepting health care that does not fit with their values and preferences. You’ve probably heard the stories about people losing their children because they refused a certain course of prescribed and mandated treatment. The National Health Freedom Coalition is at the forefront of a movement that will give people the rights to decide for themselves what kind of healthcare they want without being penalized for their choices. The NHFC also created the legislation to protect healthcare practitioners in what are called Safe Harbor Laws. A safe harbor provides protection so that those providing alternative healthcare will not be penalized for practicing medicine which falls outside the domain of conventional medicine.
KW: That’s impressive, Clifford. I hear about people whose practices have been closed down by the authorities for practicing medicine without a license, even though these people did have licenses or certifications in their chosen field of alternative medicine.
CEC: Conventional medical professionals have, in many cases, been given powers that far exceed what is reasonable and it was never intended to be so. It all depends on how you define medicine. Nine states have passed health freedom legislation that was spearheaded by the NHFC. This legislation is sweeping the nation and more than two dozen other states are working on similar legislation. These laws define the scope of licensing, as well as support the freedoms to which people are entitled. People have a right to informed consent and a right to choose.
KW: Why did they ask you to come?
CEC: They recognize that CI and scores of other organizations are working along the same lines. The focus of the organizations which make up the NHFC encompass a broad range of issues, including vaccines and GMOs. They are gradually increasing the scope of issues they are bringing to the table; this year they decided it was important to begin learning about geoengineering. CI extended the discussion to include the full range of the research, which includes bioengineering as well.
KW: How did you bring this issue to the people attending?
CEC: The conference was structured to allow for discussion in smaller groups. There were no presentations by voting members of the congress, of which I was one. But though I was not one of the speakers, I was working and making connections whenever I could… in the hallways, at the breaks, and in the small groups. Many connections and understandings were reached, but not to the depths that we will seek in the future. Many people showed an interest in our organization and would like to learn what we do. Likewise, I had an equal interest in understanding the other organizations and what they are working toward. We are all sharing the same interests here. What they didn’t know when they invited me was whether CI was working along the same lines. They found out that we share common goals and a seriousness about the depth of the issues involved with health freedoms. However, our work is international in scope, while that of the other organizations is at the national or state level. It became overwhelming to many of them because it was outside of their normal turf.
KW: So now what, Clifford? Where do we go from here?
CEC: I want to forge a collaboration between CI and the other organizations that are part of the Coalition, and I want to bring the public into an understanding of all the issues at stake here. There is CI, there are the members of the Coalition, and there is the public. I am looking toward increased awareness, involvement, and action between all parties. I want to openly declare Carnicom Institute’s advocacy and active support for the National Health Freedom Coalition. Additionally, the public has a responsibility to become educated in the shared principles of CI and the Coalition, such as informed consent, and to become aware of the work that is being done to benefit the public. This awareness will present the work of Carnicom Institute in relationship to the larger themes that involve the violations of basic human rights and the freedom of choice. What excites me is the potential for a more powerful network of public involvement through the collaboration of Carnicom Institute and the National Health Freedom Coalition.