This week the good people of Portland voted down fluoridating their water supply. The kids there have rotten teeth, or at least teeth more rotten than those of kids in, say, Seattle. For some reason the far left and the far right have come together on this issue.
So what's the problem with Americans and science?
The potent combination of our powerful intelligence with our massive reality denial has led to a dangerous world. Less obvious, but in the long term more dangerous, are threats resulting directly or indirectly from technological developments that have permitted us to increase our numbers well beyond the carrying capacity of the natural world. More efficient agriculture and the invention of artificial fertilizers allowed humans to produce enough food to support numbers that would be unthinkable for other animals of our physical size. Public health measures, vaccinations, antibiotics, and other medical advances also permitted population numbers to explode. The world is overpopulated already and is becoming more so at an alarming rate. And although we pay lip service to the resulting problems, we do relatively little to address their root causes. Indeed, some religions continue to promote the unrestrained propagation of their flocks.

Planet Earth is sick, with a bad case of "infection by humans." In fact, as far as the other species on the planet are concerned, we humans are like the rapaciously invasive collective of aliens called the Borg in the television series "Star Trek: The Next Generation" — a race that indiscriminately assimilates and takes over anything and everything it encounters. The motto of the Borg is "Resistance is futile." And indeed, for all other species on Planet Earth, resistance is futile when faced with humans! The exceptions, of course, are the microbes that infect us (such as those that cause tuberculosis, AIDS, and malaria), which are also spreading just fine, thank you. As environmental activist Paul Gilding explains in "The Great Disruption": "We have now reached a moment where four words—the earth is full—will define our times. This is not a philosophical statement; this is just science based in physics, chemistry and biology … To keep operating at our current level, we need 50 percent more Earth than we've got."
The most dramatic consequence, of course, is the effect we are having on the atmosphere and the climate. Besides the very public efforts of Al Gore, many writers have spoken out about this vital issue, including Scientific American editor Fred Guterl, whose "The Fate of the Species: Why the Human Race May Cause Its Own Extinction and How We Can Stop It" describes climate change as one of the most pressing dangers to the human species. But while this has become a popular topic of discussion, few are willing to make the major lifestyle changes necessary to reverse it. The government of one of the biggest per-capita culprits (the United States) now at least acknowledges the reality of global climate change—but still refuses to face up to the problem. The energy platforms of major candidates for national office in the United States carefully ignore or minimize this politically charged issue. The same is true of the other major contributors to the problem, such as the companies that extract ever more fossil fuels from the earth yet run misleading advertising campaigns claiming that they really do care about the environment. Even worse, the melting of Arctic ice has given many countries the impetus to prospect for still more fossil fuels in that pristine wilderness.
Why is it that ordinary citizens do not sit up and take notice of the danger? Unfortunately, the focus remains mostly on "global warming" instead of on the bigger concern—that we are disrupting the planet's climate in completely unpredictable ways. The significant scientific uncertainty inherent in climate prediction has allowed skeptics to gain the upper hand and even to corner some expert scientists into difficult positions. A friend in the climate research field privately admits that he and most of his colleagues are afraid to stand up and speak out because of the vituperative attacks and massive smear campaigns they would inevitably suffer—as Michael Mann and others did. But much research indicates that as forests disappear, polar ice caps melt, and so on, unpredictable feedback mechanisms will make global warming increasingly difficult to tackle. Even more worrisome, there will likely be a tipping point beyond which continued warming becomes irreversible, no matter what we do. Of course, other scenarios are also possible; it is even plausible that we could instead tip the planet into an ice age. The Hollywood movie "The Day After Tomorrow" took a climate change scenario that is considered possible over a sixty-year span and compressed it into six weeks. That dramatization made it easy for viewers to dismiss the possibility that anything like it could ever actually happen.
During Bill Clinton's successful bid for the presidency, one of his campaign aphorisms was "It's the economy, stupid!" The point was that while strategists were weighing many diverse and important political issues, the state of the economy was the single factor that would actually determine the outcome of the election. In like manner, we humans are focusing on the wrong issue in the debate over global warming. The slogan should be "It's local climate destabilization, stupid!" One does not need to be an expert to find convincing evidence that global temperatures are indeed rising and that the climate is changing, likely due to human activities. Every one of the more than 150 national scientific academies in the world, every professional scientific society with members in relevant fields, and more than 98 percent of all scientists who study climate agree on this point. There is increasing agreement that the Industrial Revolution ushered in a new climatic period, which Nobel Prize winner Paul Crutzen has called the Anthropocene. The frequency of extreme weather events across the world is increasing at a rate not seen since climate records began to be kept—and an ice-free Arctic sea may be years, not decades, away. However, despite the overwhelming body of data, it is unclear exactly what is going to happen in the future. Thus the mild-sounding term "warming" is too easy to dismiss as irrelevant to any given individual ("So what? I will just turn up my air conditioner!").
Instead of allowing complacency based on uncertainty, we need to look back at the history of climate on this planet and consider the potential consequences of human interference. Data from sources such as the Greenland ice cores (from which it is possible to determine historical temperatures over relatively short time spans) indicate that the period until around ten thousand years ago was prone to wildly oscillating temperatures (similar data are available going back almost a million years). Twelve thousand years ago, for example, thick ice sheets likely covered much of northern North America as well as most of northern Europe. Even the less frequent, cyclical warm periods of the past were marked by temperature fluctuations, variations in sea level, and so on. In contrast, the last ten thousand or so years have been one of the uncommon periods of relatively stable, warm climate. Years ago I used to joke that this must be a consequence of humans having spread all across the planet at approximately that time, reaching all the way into South America and most other habitable parts of the world. In fact, this idea is now the basis of a serious hypothesis—that the diaspora of humans, with the associated burning of forests, the initiation of agriculture, and the elimination of most large animal species (beginning around ten thousand years ago), may have stabilized the climate and prevented its usual wanderings. Whatever the mechanism, we are living in an unusually stable period called the Holocene epoch. It is only because of this stable climate that we have so successfully populated the world while optimizing our homes, facilities, and agriculture to suit a relatively predictable local climate in each location.
So what the average human should fear is not global warming per se but rather local climate destabilization, i.e., a change in the relative stability and predictability of his or her own local situation. This does not mean only unusually severe hurricanes and tornadoes or unexpected droughts and flooding; such catastrophes, sad as they are, affect only a small part of the world at any one time. Of even greater concern should be local changes that may seem trivial yet have a huge impact on living conditions and economies. Try a casual poll of friends across the planet who have lived in the same place for a while, asking, "How's the weather been lately in your neck of the woods?" There is a high probability that the answer will include words such as "strange," "unusual," and "weird." In some cases your friends will mention not warming but unusual cold spells, or rain and snow that fell with unusual frequency at unexpected times. The general trend seems to be increasing dryness in previously dry areas and increasing wetness in previously wet areas. It will not take many more such changes to disrupt local economies and agriculture in a manner that destabilizes local societies. And the impact of local events can be global. For example, the very high temperatures in Russia in 2010 were unpleasant in themselves, but the bigger consequences were forest fires and the loss of wheat production. The unprecedented 2011 floods in Thailand raised the cost of computer hard disks worldwide because some key local factories were damaged. And in 2012, the great drought in North America devastated the corn crop and fueled wildfires that destroyed many homes.
Remarkably, the subject of climate change was never brought up by the moderators of the three 2012 U.S. presidential debates—and it was mentioned only briefly by Barack Obama in his acceptance speech upon reelection (though he did expand on the theme in his 2013 State of the Union address). It remained the elephant in the room that everyone was conveniently ignoring. But as this is being written, the northeastern seaboard of the United States is still struggling to recover from the devastation wrought by Hurricane Sandy. This so-called Frankenstorm was unprecedented in its timing, path, and site of landfall, apparently unique in the annals of American weather history, and many climate scientists consider it the latest manifestation of human-induced climate change.
But the tragedy of Hurricane Sandy may have a silver lining. Politicians whose constituents were directly affected began speaking out. New York's governor, Andrew Cuomo, said in a news briefing the day after the hurricane hit: "There has been a series of extreme weather incidents. That is not a political statement. That is a factual statement … Anyone who says there's not a dramatic change in weather patterns, I think, is denying reality." The next day he added, "I think part of learning from this is realizing that climate change is a reality." And the politically independent New York City mayor, Michael Bloomberg, said: "Our climate is changing. And while the increase in extreme weather we have experienced in New York City and around the world may or may not be the result of it, the risk that it may be—given the devastation it is wreaking—should be enough to compel all elected leaders to take immediate action." Fittingly, the contemporaneous cover story of Bloomberg Businessweek magazine was entitled "It's Global Warming, Stupid."
While the public and politicians may choose to ignore climate change, the insurance industry cannot afford to do so. Using its comprehensive NatCatSERVICE database of natural catastrophes, Munich Re (one of the world's largest reinsurance conglomerates) analyzes the frequency and loss trends of various events from an insurance perspective. In an October 2012 report prepared for underwriters and clients in North America, the world's largest insurance and reinsurance market, Munich Re published its analysis of all kinds of weather perils and trends. Ironically, this region (also one of the world's largest producers of greenhouse gases) has been the most affected by extreme weather-related events in recent decades. The North American continent is already vulnerable to all types of weather hazards—hurricanes, winter storms, tornadoes, wildfires, drought, and floods (one reason is that it has no east–west mountain range to separate hot air masses from cold ones). The study shows a nearly fivefold increase in the number of weather-related loss events in North America between 1980 and 2011, compared with roughly a fourfold increase in Asia, 2.5-fold in Africa, twofold in Europe, and 1.5-fold in South America. The overall loss burden from weather catastrophes in the United States during this time frame was $1.06 trillion (in 2011 values), and some thirty thousand people lost their lives.
Just a few weeks before the Munich Re report appeared, James Hansen and other scientists at NASA's Goddard Institute for Space Studies, in New York, published a study on the apparent increase in extreme summertime heat waves. Such events, which just a few decades ago affected less than 1 percent of the earth's land surface, "now typically cover about 10 percent of the land area," the paper stated. "It follows that we can state, with a high degree of confidence, that extreme anomalies"—i.e., heat waves—"such as those in Texas and Oklahoma in 2011 and Moscow in 2010 were a consequence of global warming because their likelihood in the absence of global warming was exceedingly small."
Meanwhile, in a Kafkaesque twist, the late-2012 United Nations climate conference (held under the UN Framework Convention on Climate Change) took place in Qatar, the country that tops all others in per-capita production of greenhouse gases! The official meeting report sounded superficially encouraging: "Countries … agreed on a firm timetable to adopt a universal climate agreement by 2015 … They further agreed on a path to raise the necessary ambition [italics mine] to respond to climate change, endorsed the completion of new institutions, and agreed on ways to deliver scaled-up climate finance and technology to developing countries. The Kyoto Protocol … under which developed countries commit to cutting greenhouse gases … will continue … the length of the second commitment period will be eight years." But Greenpeace International responded: "Which planet are you on? Clearly not the planet where people are dying from storms, floods, and droughts … The talks in Doha … failed to live up to even the historically low expectations. Where is the urgency? … It appears governments are putting national short-term interest ahead of long-term global survival."
Even more worrisome, it was hard to find any headline or front-page news about the all-important Doha talks.
Indeed, the mainstream media seem to have generally lost interest in this story, perhaps because it has become so commonplace. This despite the fact that the National Oceanic and Atmospheric Administration (NOAA) has just officially declared that 2012 was the hottest year on record for the continental United States and the second-worst for “extreme” weather conditions, such as hurricanes, droughts, and floods.
Seven of the ten hottest years in U.S. records (dating back to 1895), and four of the hottest five, have followed 1990, according to NOAA figures. The year 2012 also saw Arctic sea ice hit a record low, based on more than thirty years of satellite observations. And at a global level, according to NOAA scientists, all twelve years of the twenty-first century so far (2001–2012) rank among the fourteen warmest in the 133-year period since records have been kept.
Despite all the facts, there is a deafening silence on the part of world leadership about the plausible relationship between extreme weather events and human-induced climate disruption. The analogy that comes to mind is the emperor Nero fiddling while Rome burned! But it is too late now, anyway—the "Planet Earth Climate Destabilization Experiment," as I call it, is under way, and we just have to wait to see how bad things are going to get. We have poked a tiger in the eye and can only hope to get away without suffering too many consequences. We should be very concerned about disrupting the relatively predictable weather that we currently enjoy, which is critical not only for human living conditions in most parts of the world but also for the relative stability of our local economies and livelihoods. Nevertheless, it is more convenient to simply deny the problem. Indeed, the news media seem to have grown weary of reporting extreme weather events and no longer bother unless the events are close to home. Were readers in America aware, for example, that Pakistan had a second megaflood in 2011, a year after the first? Both of these catastrophic floods struck the same region, washed away vital crops, forced almost two million people to flee their homes, and left them suffering from malaria, hepatitis, and other diseases. And did you hear that Beijing had record-breaking floods in 2012? The heaviest rainfall to hit China's capital in sixty years left many dead, stranded thousands at the main airport, and flooded major roads. Almost two million people were affected, and economic losses were estimated at $1.5 billion. But this kind of event is no longer considered international news—it has become far too common.
In the worst-case scenario, we could even find ourselves enduring the same wild weather that plagued times past, swinging between ice ages and warm periods. The skeptic can argue that current climate models might turn out to be wrong after all, and that climate destabilization might not be as severe as the alarmists suggest. And as Matt Ridley points out in "The Rational Optimist," some prior doomsday predictions turned out to be overstated. For example, Rachel Carson's 1962 book, "Silent Spring," documented the detrimental effects of pesticides on the environment, particularly on birds; the title was meant to evoke a future spring in which no bird could be heard singing, because all had vanished due to pesticide abuse. Given the state of knowledge at the time, Carson's additional concerns about the effects of synthetic chemicals on human health were reasonable, and many have since been borne out. Decades later, however, some suggest that overreaction to this seminal book also did some harm—for example, by curtailing the vital role that DDT played in killing malaria-carrying mosquitoes. On the other hand, the environmental cleanup of toxic lead and mercury that the movement helped inspire appears to have had a positive effect on the brain development of children. Another example Ridley cites is Paul Ehrlich's 1968 book, "The Population Bomb," whose original edition began with the statement, "The battle to feed all of humanity is over … nothing can prevent a substantial increase in the world death rate." Ehrlich and his wife, Anne, still stand by the basic ideas in the book, believing that it achieved their goals because "it alerted people to the importance of environmental issues and brought human numbers into the debate on the human future." There is no doubt that many issues related to population growth were addressed only because of such warnings. But the book also made a number of specific predictions that did not come to pass. Ridley argues that, guided by our ingenuity and ability to adjust to change, we took action to fix some of these problems, so that the predictions turned out to be worse than the reality. His comforting notion is that humanity can and will fix the climate problem when the time comes to really deal with it.
But there is one big difference between pesticide pollution and population growth on the one hand and global climate change on the other: climate destabilization is an issue we simply cannot afford to get wrong the first time around. The fact is that we humans are conducting a very dangerous experiment with the climate of the one and only planet we have. Whether or not the recent spate of extreme weather events is a harbinger of things to come, there is no way of turning back once major climate destabilization has been set in motion. And if that happens, the consequences will be devastating for all of humanity, at both the local and the global level. So unlike almost any other policy issue, about which we can afford to debate the pros and cons and change our minds later as new information comes in, this one leaves no margin for error. There is only one planet, one biosphere, and one Anthropocene epoch, and we must err on the side of caution!
Many approaches have been taken to try to convince individual citizens to take climate change seriously. These include the arguments that we have an ethical and moral obligation to the less fortunate on the planet and to future generations, that we are conducting a risky experiment with the planet, that the global economy will suffer, and so on. However, none of these has really had an effect, because of our all-powerful capacity for reality denial. The resulting optimism bias makes most people (even those who believe the science) just go on with life and hope for the best. Between the time the writing of this book began, in 2007, and the time it was completed, toward the end of 2012, much changed, including a spate of extreme weather events. Regrettable as they are, these events have at least made many people begin to think that climate change may be real. However, as long as there is no certainty regarding the future of the climate, the average person will continue to hope for the best, and the naysayers and profiteers will take advantage of that uncertainty.
Even appeals based on the fact that local climate disruption will affect individuals' lifestyles and pocketbooks generate the response "How can you be certain?" So let me offer an analogy that the average person can perhaps better relate to. Imagine you are about to take a long airplane flight and you're told there is a 10 percent chance that things will go badly wrong and the plane will crash. Would you get on that plane? Of course not: most people would want at least 99 percent certainty that the flight is safe. Well, the same logic should apply to climate disruption.
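To make the arithmetic behind that analogy concrete, here is a minimal sketch in Python (the probabilities are illustrative assumptions, not climate projections) showing how even a small chance of catastrophe compounds when the risky "flight" is repeated year after year:

```python
# Toy calculation: how a small per-trial probability of catastrophe
# compounds over repeated trials. All numbers are illustrative
# assumptions, not estimates drawn from climate science.

def cumulative_risk(per_trial_prob: float, trials: int) -> float:
    """Probability of at least one catastrophe over `trials`
    independent trials, each with probability `per_trial_prob`."""
    return 1.0 - (1.0 - per_trial_prob) ** trials

# The single flight from the analogy: a 10% crash risk.
print(f"One flight at 10% risk: {cumulative_risk(0.10, 1):.0%}")

# A seemingly small annual risk of triggering irreversible climate
# damage, if the "experiment" runs for 50 years:
for p in (0.01, 0.02, 0.05):
    print(f"{p:.0%} annual risk over 50 years: {cumulative_risk(p, 50):.0%}")
```

Under these assumed numbers, even a 1 percent annual risk accumulates to roughly a 40 percent chance over fifty years, and a 5 percent annual risk to over 90 percent. And unlike airline flights, the planetary experiment offers no second trial if the catastrophic outcome occurs.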
By the time you're reading this, a majority of skeptics may finally have begun to admit that something bad is happening to our climate and that we humans may be contributing to it. But given the extreme polarization surrounding the debate, it is unlikely that any consensus on action will be reached soon. In that case, there are still sensible alternative approaches that can be pursued. Durwood Zaelke, founder of the Institute for Governance and Sustainable Development, and Veerabhadran Ramanathan, a professor at the Scripps Institution of Oceanography, suggest a short-term strategy that involves cutting emissions of four climate pollutants: "black carbon, a component of soot; methane, the main component of natural gas; lower-level ozone, a main ingredient of urban smog; and hydrofluorocarbons, or HFCs, which are used as coolants. They account for as much as 40 percent of current warming. Unlike carbon dioxide, these pollutants are short-lived in the atmosphere. If we stop emitting them, they will disappear in a matter of weeks to a few decades. We have technologies to do this, and, in many cases, laws and institutions to support these cuts."
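A back-of-the-envelope model helps show why this strategy pays off so quickly. The sketch below assumes simple first-order (exponential) decay and uses rough, commonly cited atmospheric lifetimes (my approximations for illustration, not authoritative values): days to weeks for black carbon, a few weeks for lower-level ozone, roughly a decade for methane, and on the order of fifteen years for a typical HFC.

```python
import math

# Rough atmospheric lifetimes in years. These are illustrative
# approximations, not authoritative figures.
LIFETIMES = {
    "black carbon": 0.02,        # days to weeks
    "lower-level ozone": 0.05,   # a few weeks
    "methane": 12.0,             # roughly a decade
    "HFC (typical)": 15.0,       # on the order of 15 years
}

def remaining_fraction(lifetime_years: float, elapsed_years: float) -> float:
    """Fraction still in the atmosphere `elapsed_years` after emissions
    stop, assuming simple first-order exponential decay."""
    return math.exp(-elapsed_years / lifetime_years)

for pollutant, tau in LIFETIMES.items():
    left = remaining_fraction(tau, 30.0)
    print(f"{pollutant}: {left:.1%} remaining 30 years after emissions stop")
```

Under this toy model the soot and ozone are gone almost immediately, and most of the methane and HFCs disappear within a few decades, which matches Zaelke and Ramanathan's point. Carbon dioxide, by contrast, is not well described by any single short lifetime; a substantial fraction of it persists for centuries, which is why cutting CO2 remains essential in the long run even if the short-lived pollutants buy us time.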
Another reasonable approach is suggested by Peter Byck in his movie "Carbon Nation." Byck points out that even if you agree to disagree with those who deny human-caused climate disruption, most such denialists are still very much in favor of clean air, clean water, and clean sources of energy. One can even point out that there is good money to be made in several of the new approaches to alternative energy. Perhaps this is the way to bypass our human denial of climate change and deal with the problem.
Again, let us hope it is not too late to turn things around, so that we can continue to enjoy the relatively stable climate we have had for the last ten thousand years. But as of this writing, we simply continue to deny the limits to which we can push the planet.
One can build a logical argument that our innate reality denial, coupled with our runaway technological achievements, virtually guarantees that we will be facing global calamities on a scale never before seen. Many different scenarios can be constructed around resource depletion, climate change, disease pandemics, etc., that will lead to a breakdown in modern civilization, war, and human death and suffering on a massive scale. History suggests that we will not learn any long-term lessons from the first few of these disasters, in large part because of our nemesis, reality denial. Indeed, it is arguable that we are destined ultimately to destroy ourselves as a species—or, at the very least, to continue to cycle between well-developed civilization and catastrophic collapse, never reaching a technological state much beyond what we currently enjoy. We hope that these words do not prove to be prophetic. But we may well be in for a cycle of catastrophic collapses and have to rebuild ourselves, much as many civilizations have done in the past.
Even those of us who agree that human nature and technology are essentially incompatible would like to think that eventually, perhaps after a disaster or two, we would shape up and come to grips with our basic problems. One can be an optimist in this manner and still accept all the arguments here about self-awareness and denial. Indeed, it is probably essential for our long-term success that we embrace the idea that reality denial is a fundamental part of human nature. For only by knowing this enemy can we consciously change our innate, destructive behavioral tendencies. Ironically, many readers of this statement are likely to deny the important point we are making. In other words, we are in a state of denial about our denial of reality, and this is not an easy problem to overcome on a daily basis.
It is only by understanding reality denial as an enemy within that we might be able to overcome it. An alcoholic is not necessarily a drunk. But in order to avoid becoming one, it is necessary for the alcoholic to acknowledge his innate tendencies and to actively fight the impulses that drive a behavior that is satisfying in the short term but self-destructive over the long haul. Just as an alcoholic must sometimes hit psychological or emotional bottom in order to come to grips with his problem, it may require a small nuclear war or a major climatic disruption in order for us to see the light. We haven’t seen it yet, nor have we even acknowledged the underlying trait (denial) that makes it so difficult for us to deal with our critical issues.
Many of our everyday problems also have a component of denial—whether we’re investing in a risky scheme, deciding to stay with an abusive partner, or whatever. Those who give advice professionally or otherwise can easily detect the denial component in a person’s situation and advise the subject to escape from the danger. What they may fail to recognize is that reality denial is such a fundamental part of being human that one cannot easily just shed its clutches. To continue the alcoholic analogy, we can’t abandon denial any more than we can change our fundamental personality traits. What we can do is to recognize this trait and try to manage its deleterious effects, just as the alcoholic manages his disease. Indeed, we don’t want to completely escape from our state of denial, even if we could—it’s the only thing that keeps us sane in the face of rational realization of mortality. We just need to recognize and manage its pathological consequences. We need psychotherapy on a societal and global scale.
The big question is, then, are we capable of controlling denial sufficiently to solve our current dilemmas? Can we create a spiritual construct (individually, or as a new formal religion) that can satisfy our acceptance of mortality without letting it drive our lives and society to oblivion? As always, the first step is recognition of the problem. The next step may require a process that is every bit as unlikely as the convergence of self-awareness and self-deception that allowed us to break through the wall so many years ago. It required millions of years for the latter event to occur. We don’t have that much time to solve our current dilemma. Humans may be products of chance events that allowed full theory of mind (ToM) and intentionality to emerge, but we were able to come into existence only because we simultaneously developed the ability to deny our own mortality and reality. But as a by-product, we also deny many of our other problems, despite having the ability to understand them.
The twentieth century was the era in which the greatest store of human knowledge was accumulated, building on the knowledge generated by many past civilizations. This knowledge base (which is, in essence, the understanding and appreciation of aspects of reality) generated enormous scientific and technological progress, and there was great hope that the twenty-first century would see further steps in this direction. However, a variety of sociopolitical doctrines and agendas seem to have caused a marked regression in public appreciation for the value of scientific knowledge.
In 2007 I had the opportunity to participate in a historic meeting at which some of the oldest and best-known American societies focused on the value of knowledge (the American Philosophical Society, the American Academy of Arts and Sciences, and the National Academy of Sciences) came together to discuss the situation in America. The meeting, entitled The Public Good: Knowledge as the Foundation for a Democratic Society, brought together scientists, humanists, and leaders in business and public affairs from around the country for a two-and-a-half-day series of panel discussions, conversations, and dinner programs focused on "some of the most pressing issues facing the nation." The underlying premise was simple: for a democracy to succeed, it should be based on real knowledge of the facts of the world around us. Following the meeting, several tomes were published to explicate this important though obvious idea. But the effort has been largely ignored, and reality denial has continued to gain ground. Why is this so? Perhaps it is because reality is unpleasant for most people, and our built-in mechanisms for denial allow us to pick whatever line of reasoning we find most comforting in the face of competing realities, however flawed that reasoning may be. But let us hope that other efforts like this continue, so that we can once again base our future on our understanding of reality.
Sadly, following a century of intense focus on the value of science for society, we are now facing a growing and dangerous antiscience movement that appears to originate from adherence to a variety of social, political, and religious doctrines that favor alternate realities. New Yorker staff writer Michael Specter addresses this new and widespread fear of science and the consequences of this reality denial for individuals and for the planet in his 2009 book, "Denialism." He expresses concern over the fact that both political leaders and the public seem to mistrust science more than ever before. So irrational and unfounded fears about everything from childhood vaccines to genetically modified grains abound, even while dietary supplements and "natural" cures with no proven value are gaining many followers. As Specter sees it, this war against science amounts to a war on progress itself, and it's occurring at a time when we actually need science more than ever to chart our future in a rational fashion.
Why is it that so many humans are attracted to these illogical doctrines? Is it simply a reflection of the fact that most humans would prefer not to face reality? After all, science is all about revealing factual realities. But is it possible to hold a full ToM along with a complete recognition of reality? Yes, of course it is, but it requires a large amount of reliable information and rational thinking about what all that information means. We mentioned earlier the excellent book by Richard Dawkins entitled "The Magic of Reality." In this wonderfully written and beautifully illustrated volume, Dawkins explicates the reality of who we are, starting with the Big Bang, proceeding through human evolution, and going all the way down to atoms and subatomic structures. As Dawkins correctly points out, this true reality (as revealed by the methods of science) is indeed magical beyond belief, at least from our perspective as humans. However, it is easy for someone of Dawkins's knowledge and intellect to appreciate reality's magic. For the average human, the enormity of this reality can actually be rather unpleasant, and the book could just as well be read as The Horror of Reality. Thus most people prefer to ignore or rationalize away the realities they do not like to think about. On the other hand, even the extreme realist cannot get through a day without ignoring some realities and taking some risks. The ideal, therefore, would be to relish and use our full ToM while also harnessing our ability to deny reality, allowing ourselves optimism. And indeed, there are many benefits to reality denial.