Space Exploration and the American Character

Historian Dr. Michael Robinson, of the University of Hartford, opened his talk with a William Faulkner quote that helped frame the 1961-1981: Key Moments in Human Spaceflight conference, held in Washington DC on April 26th and 27th: "The past is never dead. It's not even past." The first talks of the day dealt extensively with the narrative and drive behind space travel and exploration, painting it as much a major cultural undertaking within the United States as one of scientific discovery and military necessity. In a way, we went to space because it was something that we've always done as Americans.

The Past

Dr. Robinson started with a short story of a great endeavor that captured the imagination of the public: one that stoked rivalry between nations on a global scale, advanced our scientific knowledge, and saw high-tech equipment carry valiant explorers to the extremes. Several disasters followed, and the government pulled back its support, yielding part of the field to private companies. If asked, most people would describe the space race of the twentieth century, and while they would be right, what Robinson was describing was the race for the North Pole. In 1909, American explorer Robert Peary claimed to have reached the North Pole, becoming the first known man to do so. While there are reasons both to doubt and to support Peary's claim, Robinson makes some interesting points in comparing the polar expeditions to the space expeditions.

Robinson described a culture of exploration that has existed in the United States since its inception, but took pains to distinguish between the frontier motif that has permeated science fiction and the realities that we've come to expect from going into orbit. Television shows have undoubtedly fed the excitement for space research and exploration, but they've incorporated elements that have great significance for American audiences: Star Trek, for example, was pitched as a 'Wagon Train to the stars', while Firefly has likewise been described as a 'Western in space', to say nothing of films like Outland, Star Wars, and numerous other examples. In his 2004 address that helped outline America's space ambitions, President George W. Bush noted that "the desire to explore and understand is part of our character". Other presidents have said similar things, and the idea clearly resonates with the American voter.

It makes sense, considering the United States' history over the past centuries: Americans are all newcomers, and as Robinson said, the west was a place to settle. The arctic, and space, really aren't, and that distinction matters. Historically, both space and the arctic have seen a much smaller footprint of human activity. Each is difficult to reach, and once people are there, it's an incredibly hostile environment that discourages casual visits.

The American West, on the other hand, serves very different purposes as imagery for space travel. During the great migrations of the 1800s, it was relatively cheap for a family to travel out to vast untapped territory: around $500. Additionally, once people reached the west, they found a place that readily supported human life, providing land, food, and raw materials. The American west was transformed by mass migration, which vastly expanded the U.S. economy while also leading to a massive expansion of the federal government and to the Civil War. Space, on the other hand, isn't so forgiving, and like the arctic, doesn't yield the benefits that the west provided.

[Image: painting by Caspar David Friedrich]

The explorations into the arctic give us a sense of where space can go, and of how the expectations of the public and the scientific community can come into line with one another. The polar explorations absolutely captured the imagination of the public: art exhibits toured the country, while one of the first science fiction novels, Frankenstein, was partially set in the North. The lesson we can learn from the arctic is fairly simple: abandon the idea of development in the short to medium term. Like the arctic, space is an extreme for human life, and the best lessons we can glean for it will come from our past experiences at other such extremes: exploration in areas where people don't usually go. This isn't to say that people shouldn't, or can't, go to the ends of the Earth and beyond, but that they should prepare accordingly, in all respects.

The arctic provides a useful model for what our expectations should be for space, and some historical context for why we go there. Nor should we discount the west and the country's history of exploration and settlement as factors in going into space.

The Space Age

James Spiller, of SUNY Brockport, followed with a talk about the frontier analogy in space travel, noting that the imagery conformed to people's expectations, and that notable figures in the field, such as Wernher von Braun, liked the comparison because it helped to promote public interest in space. The west connected and resonated with a public whose history and mythos are steeped in exploration. This runs deep in our metaphorical, cultural veins, linking the ideas of US exceptionalism and individualism that came from the colonization of the American continent. The explorations to the west, to the arctic, and eventually to space came about because they appealed to our character: they were part of our identity.

The launch of Sputnik in 1957 undermined much of what Americans believed: not just on a technical level, it seemed to confirm that a country with vastly different values could do what we, with everything going for us, could not. In the aftermath of the launch, President Eisenhower moved slowly on an American response, to the great dismay of the public. It was a shock to the entire country, one that prompted fast action and pushed up the urgency of getting a red-blooded American into space. How could individual, exceptional Americans fall behind the socialists, whose values ran completely counter to our own? There had already been numerous examples of individuals who had conquered machines and territories, such as Charles Lindbergh and Robert Peary, and the Mercury astronauts followed in that mold. Indeed, for all the reasons the West feels important to Americans, the space program exemplified those traits in the people we selected to represent us in space.

Spiller noted that the frontier of the west seems to have vanished: the culture of the late 1960s and early 1970s fractured society and the idea of American exceptionalism, the Civil Rights movement discredited parts of it, and the United States seemed to lose its lead in the global economy as other countries overtook it. As a result, the message of space changed, looking not out, but in. President Ronald Reagan worked to revisit the message, as did President George H.W. Bush. There have been further changes since the first space missions: a new global threat that actively seeks to curtail modernism, terrorism, has preoccupied our attention and pushed our priorities elsewhere.

Going Forward

The last speaker was former NASA Historian Steven Dick, who looked at the relationship between exploration, discovery and science within human spaceflight, pointing out distinctions between the three: exploration implies searching, discovery implies finding something, and science leads to explanation. The distinctions are important because they are fundamental to the rhetoric, he explained, and the last program to really accomplish all three was the Apollo program.

Going into the future, NASA appears to be at a crossroads, and its actions now will help to define where it goes from here on out. The original budget that put men on the moon was unsustainable, but only just, and as a result, NASA at the age of fifty is still constrained by actions taken when it was only twelve. The space shuttle, he argued, was not part of a program that was a robust agent of exploration, discovery or science. He pointed out that where programs like Apollo and the Hubble Space Telescope have their dramatic top-ten moments, the space shuttle really doesn't, because it's a truck: it was designed with indeterminate, multiple functions, ranging from a science platform to a delivery vehicle for satellites. This isn't to discredit the advances made because of the shuttle, but it doesn't measure up against those other programs. The space station, on the other hand, was well worth the money, but people don't respond as well to pure science as they do to exploration; as Apollo demonstrated, science alone isn't enough to sustain public interest.

As he put it, "exploration without science is lame, discovery without science is blind, and exploration without discovery or science is unfulfilled." Going forward, any endeavors beyond our planet should encapsulate all three elements to capture the public's imagination and make the efforts to go beyond orbit worthwhile for all. Moreover, manned spaceflight can accomplish things that robotic probes and satellites cannot, especially in fulfilling the frontier motif that helps to define our interest in going into space: it's hard to embody the traits that have inspired people to go further when it's someone, or something, else doing the exploring.

Space, the final frontier, is an apt frame for how manned spaceflight programs are perceived, and it certainly captures the imagination of people from around the world. While some of the direct imagery is misplaced, it's not a bad thing for people to latch onto, though it does help to remember the bigger, and more realistic, picture when it comes to the goals and expectations for space. NASA, going forward, will have to take some of these lessons to heart, reexamining its core mission and the goals that it's working to put forward. Nobody in the room doubted that the advances that have come from space travel were worth the cost and risks involved, but they want it to continue far into the future. To do otherwise would mean giving up a significant part of who we are, because the traits that have come to define our exploration beyond the horizon, to the North and high above us, are worth celebrating: the drive to discover, to explore and to explain are all essential for the future.

The Original Mad Scientist: Nikola Tesla

When looking at the roots of the modern world, one need look no further than Nikola Tesla for a notable example. A bright mind from an early age, Tesla defines the term 'genius': he demonstrated a talent for innovation and invention early on, and would later go on to enlighten the world, literally.

Born in July of 1856 in Croatia (then part of the Austrian Empire), Tesla exhibited his intellect at an early age. In his autobiography, My Inventions, he noted that he "suffered from a peculiar affliction due to the appearance of images, often accompanied by strong flashes of light, which marred the sight of real objects and interfered with my thought and action. They were pictures of things and scenes which I had really seen, never of those I imagined." He attributed this ability to strongly conceptualize and visualize as a key element in how he was able to invent, though early on he was frightened by it. From an early age, he began to invent various objects: a hook to catch frogs and air-powered guns; he dismantled and reassembled clocks, and at one point fixed a fire engine's hose during a demonstration to the town.

He later attended the Higher Real Gymnasium in Karlovac, finishing his time there in three years instead of the four generally required. After he had finished, Tesla was stricken with cholera. The incident encouraged his parents to send him to school for science and engineering, where they had previously hoped that he would join the clergy. After recovering, Tesla enrolled at the Austrian Polytechnic in Graz in 1875, where he excelled and grew more interested in physics and engineering, particularly in creating motors, an early step in his work on alternating current.

In 1880, he relocated to Prague, Bohemia, to study at the Charles-Ferdinand University, before realizing that his academic pursuits were putting a strain on his parents. Leaving the school, he sought work at the National Telephone Company, before moving in 1882 to Paris, where he worked on electrical equipment for the Continental Edison Company. Two years later, he travelled to the United States, seeking to work for Thomas Edison. In a letter of recommendation, Charles Batchelor, a former employer and friend of Tesla's, wrote: "I know two great men and you are one of them; the other is this young man". It was a rather positive start to a relationship that would quickly sour. Tesla went to work for Edison, who had promised him $50,000 to upgrade and repair generators, but shortly after the work was done, Edison claimed that he had been joking, and Tesla, furious, left the company.

Tesla then formed his own company, Tesla Electric Light & Manufacturing, where he began to work on his system of alternating current, which he believed was far cheaper and safer than the direct current then in use, but due to disputes within the company, he was soon removed. In 1888, he began working with George Westinghouse at the Westinghouse Electric & Manufacturing Company, where he continued developing alternating current and studied what were later understood to be x-rays. Over the next several years, Tesla continued his work in electronics and physics.

During this time, he and Edison became adversaries, with Edison invested in his direct current technology while Tesla and Westinghouse backed alternating current. Edison mounted a public campaign against AC power, touting accidents and the fact that AC was used for the first electric chair. The tide turned, however, when Westinghouse's company was awarded a contract to harness the power of Niagara Falls to generate electricity, a positive, highly public and practical test of AC power, while the 1893 Chicago World's Fair likewise showcased Tesla's system. The result was a shift from DC power to AC power, which could be transmitted over far greater distances, and over the next century, DC distribution was phased out.
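As a back-of-the-envelope illustration of why AC won on range (a sketch with assumed numbers, not historical figures): power lost in a transmission line is I²R, and since delivered power is P = IV, stepping the voltage up with a transformer cuts the current, and the loss falls with the square of that cut. Transformers only work with alternating current, which was the heart of AC's advantage.

    # Back-of-the-envelope sketch in Python of why AC's transformers beat DC
    # for range. All figures here are illustrative assumptions.

    def line_loss(power_w, volts, resistance_ohms):
        """Resistive loss (I^2 * R) when delivering power_w at volts."""
        current = power_w / volts        # P = I * V  ->  I = P / V
        return current ** 2 * resistance_ohms

    POWER = 100_000   # 100 kW to deliver (assumed)
    R_LINE = 5        # ohms of line resistance (assumed)

    for volts in (1_000, 10_000, 100_000):
        loss = line_loss(POWER, volts, R_LINE)
        print(f"{volts:>7} V: {loss:>10.1f} W lost ({100 * loss / POWER:.3f}%)")

    # 1 kV loses 50 kW (half the power); 100 kV loses just 5 W. Stepping
    # voltage up and down like this required transformers, and thus AC.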

In 1899, Tesla moved to Colorado Springs (where he was later portrayed by David Bowie in Christopher Nolan's film The Prestige), and continued his experiments with electricity. He devised several methods for transmitting power wirelessly, and in 1900, with funds from J.P. Morgan, he began work on the Wardenclyffe Tower, a wireless transmission tower. The tower itself was built, but the project ran short of funds and was eventually abandoned. Tesla lost several patents around the same time, and in the years that followed, he continued his research, designing things such as a directed energy weapon, but found little support for his plans. In the last decades of his life, he suffered from mental illness, and he passed away in 1943 at his home in New York City.

Tesla is a figure who has captured the imagination of the geek community, and at the same time someone almost single-handedly responsible for the system of power transmission that covers the nation, a necessity of modern life. In fiction, he has been portrayed several times (the aforementioned appearance in The Prestige is a good example), and he is known for his intellect and forward thinking in science and technology. Several of his inventions, such as a death ray and wireless power, still belong to the realm of science fiction.

What is most astonishing, reading over Tesla's 36-page autobiography, is his ability to conceive of projects and carry them out, understanding them almost completely in advance. He appears to have had a very rare gift, one that borders on the supernatural, or, to some, a mental condition that granted him unprecedented abilities. In the truest sense of how I see geekdom, Tesla fits all the marks: a textbook case of following a passion extensively, and changing the world in doing so.

Defining Geek History

Before looking at exactly what 'geek history' is, the term must be defined, both to give it relevance and to bound the content that should be examined. With those elements in mind, an examination of the history behind geekdom becomes much easier, and it also allows someone to ask how, exactly, Geek History is in any way important.

A couple of years ago, Ben Nugent published a book titled American Nerd: The Story of My People, a short work that was part biography, part history and part cultural examination. While I wasn't particularly impressed with the book as a whole, it contained a number of very good ideas, particularly in how he defined a geek or nerd-type person. It boiled down to a fairly simple concept: a geek/nerd (minus the social connotations) is someone who is extremely passionate about a given subject, learning all that they can about it. They tend to be readers, and because of this attention, there's a tendency to miss out on some social elements that most people take for granted. The subject itself doesn't necessarily matter, and I've generally assumed that geeks/nerds gravitate toward the science fiction and fantasy realms because the content appeals to them.

By this definition, education, literacy and attention to detail are the paramount, defining elements of geeks and nerds. In a country where education sometimes seems to count against an individual, it's all the more important to understand the role such things play with the public, and to recognize the importance of individuals in the past whose actions and knowledge have helped to define the present we know today.

In a large way, looking at geekdom in history is akin to looking at the major historical figures who had the largest impact through conception, rather than just action. These are people who helped develop ideas at a number of different stages: formulating designs, concepts or plans, or seeing some major undertaking through. With the geek definition in mind, such people also tend to be very hands-on with their work, directly involved with their projects, or singlehandedly putting something together that changes how people think about the world afterwards. In some cases, this person is simple to pick out: the author of a notable book, or the director of a film. In other instances, where science and industry are involved, it is more difficult, given the collaborative nature of those projects.

Looking at Geek History, then, means looking at the people who changed the future through their ideas, not merely through implementing those changes themselves. These creators were instrumental in putting things in place that changed how people interact with the world, and in addition to examining the people behind the advances, it's also important to look at how their works, whether inventions, novels, films or even events, helped to transform the world into a much different place.

Geek History largely comes down to the history of knowledge and ideas. Given the general rise in popularity of geek things, I tend to think of this style of history as one that looks to the past hundred to hundred and fifty years, simply because of that period's proximity to the modern day and its relevance to the modern geek movement. However, there are elements of this line of thinking that extend far deeper into the past, mixing science and social histories in a way that can likely be traced back to the beginning of the examination of thought itself.

The study and appreciation of the modern geek movement should look at the roots and elements that make up the modern geek, from the tools that are used to the entertainment that we soak up to the way that we think and approach the world. It's far more than the stereotypes; it's everything that makes up those stereotypes.

Rant: Education

As someone who studied to become a historian, one of the most frustrating things to watch unfold is the debate over textbook content now playing out in Texas. School boards have opted to revise curriculum criteria in light of modern political happenings, injecting their own preferences to combat the 'liberal version' of history. The politics of how this will impact education aside, this strikes me as a dangerous shift in how we will educate our younger generations.

In college, I studied both history and geology, and came away with a deep appreciation for context. While vastly different subjects, the study of prior human events and the study of geological happenings are linked by a couple of very basic things: both are about actions, and how those actions affect other things down the line. Listening to the radio this afternoon, Vermont Edition covered a recent landslide that consumed a home in Canada, and geologists on the show noted a direct link between what happened there over ten thousand years ago and today. Actions, in both nature and human history, tend to have both short-term and long-term effects. Thus, the context of whatever one is studying is just as important as the individual figures and events that make up the present day.

History is the interpretation of the past. When I talk about my degree, an M.A. in Military History, I usually have to preface it with an explanation that I'm not an expert in the specifics of World War II, Vietnam, the Napoleonic Era or the American Civil War. The degree was designed to teach someone how to think like a historian, how to research like a historian, and how to put together an argument backed up with evidence like a historian. I can confidently say that I can talk about any number of military concepts, battles and figures, but more importantly, I know how to research those things and how to examine them within the context of history.

The founder of my alma mater, Alden Partridge, conceived of the school at a time when practical achievements were just as important as the theory behind the words, and as such, sought to educate the first Norwich University cadets in ways that encouraged them to see their teachings in practice, and to formulate their own thinking based on what they observed. While Partridge looked to the more practical studies, such as engineering, the same line of thinking applies to the social sciences, which is where the worry about the Texas Board of Education comes into play.

History is not a static field, but one that is constantly growing and changing as different minds enter it. Nor is history a mere record of the past: history is the examination of the past, and the interpretation of events as they happened. Thus, removing an important figure such as Thomas Jefferson from mention as a founding father, based on some of the positions he pushed, eliminates the chance to examine some of the context, and the arguments, that have helped to shape the present. While teaching a correct account of what happened in the past is far preferable to teaching something that is ultimately incorrect, the problem in this instance isn't about correcting past mistakes; it's about re-framing the past with a modern mindset, and patently ignoring the context of past events to suit modern political thought.

Removing elements of the past is harmful in a number of ways, going far beyond the individual figures: it impacts a student's understanding not only of what events happened, but of why they happened. Removing Thomas Jefferson as a figure who pushed for the separation of church and state leaves a void in a student's understanding of why the founders placed such a restriction within the Constitution. Rewriting history in this manner leaves a flawed understanding of the past, which in turn impacts how we view and act in the present.

While that, in and of itself, is frightening, what bothers me far more is the growing trend towards intellectual backwardness and the restriction of thought. Often, there are arguments against spending on scientific endeavors because a practical use might not materialize, or because someone cannot imagine how a given study could be useful. But the progress of science and thinking cannot be directed, channeled or bent toward convenient conclusions: science will ultimately find what it finds. Oftentimes the answers exist, but only through searching will they be found. The same applies to education, and restricting what people learn simply for the sake of political convenience is short-sighted, ignorant and downright offensive to anybody who wants to see this country grow intellectually, politically and economically in the future.

Thar She Blows

The recent eruptions of the Eyjafjallajökull volcano this past week have caused havoc for European air carriers, bringing everything to a virtual stop. Something along the lines of 60,000 to 80,000 flights have been disrupted, stranding passengers and cargo in place and taking a huge toll on the economies of numerous countries. And to think, this is a pretty minor eruption, in a region where the historical record shows follow-up eruptions in surrounding volcanoes after ones like this.

Volcanoes are among the world's most powerful forces of nature, literally fire from the Earth itself, a force that has proved incredibly devastating throughout planetary history. During my college years, I minored in geology (an interest I seem to have inherited from my father, who is a professional geologist), and it remains a field that I continue to find fascinating, beautiful and awe-inspiring. In 2005 and 2006, I travelled to the American Southwest with the geology department on two separate trips to study the regional characteristics of the rock beds below the surface.

While most of my geologic interests centered on sedimentology and stratigraphy (studying sedimentary rocks, and interpreting the conditions in which they were laid down, respectively), there are some parallels with studying igneous rocks and the larger structures that form around volcanoes. Walking in and around volcanoes is an awe-inspiring thing to do, and it's an experience that I would like to repeat sometime in the future.

Volcanic activity occurs when molten rock from the Earth's mantle pushes its way up into the crust and onto the surface. It generally takes one of three forms: shield volcanoes, cinder cones and stratovolcanoes. There are a few others out there, but those are the general types. The formation of each depends greatly on the surrounding environment in the crust in which it forms, and there is one key element that helps to dictate the type of volcano that erupting magma builds: silica.

The explosive nature of a volcano depends greatly on the viscosity of the magma, which in turn governs how easily gas escapes from it. From Princeton University: [Viscosity is the] resistance of a liquid to shear forces (and hence to flow). In a nutshell, something with a high viscosity has a higher resistance to flow: it's thicker. Something with a low viscosity has less resistance. The less viscous the magma, the more easily it releases the gases trapped within it; the more gas that stays trapped in the magma, the more explosive potential within the volcano.

This is why features such as the ones that created Hawaii constantly erupt with little disruption to anyone outside the lava flows: the runny magma lets its gases escape, and the result is a series of very smooth flows of molten rock spreading out from the origin, building what is called a shield volcano, after the shape it forms. Here, the magma is classified as mafic, with a lower silica content in the minerals that compose the flow; the resulting rocks tend to be rich in pyroxenes and olivines, and are darker in color. The other major class of volcano is the stratovolcano, which forms over subduction zones, such as those ringing the Pacific rim. The magma here tends to be classified as felsic, with a much higher silica content, which makes it more viscous and allows more gas to be trapped within it. These volcanoes tend to be very tall, with high peaks composed of alternating flows and debris from prior eruptions. Cinder cones can be found on both types of volcano, and are usually one-time events that build mounds of basalt to some impressive heights.
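To put that relationship in runnable form, here's a toy classifier. It is a sketch on my part: the silica thresholds of roughly 52% and 63% are commonly cited approximations, and real petrology uses full geochemistry rather than a single number.

    # Toy sketch: silica content -> magma class -> typical eruption style.
    # Thresholds (~52% and ~63% SiO2) are rough, commonly cited figures.

    def classify_magma(silica_pct):
        if silica_pct < 52:
            # Runny magma: gas escapes easily, so eruptions stay gentle.
            return ("mafic", "low viscosity", "effusive (shield volcano)")
        elif silica_pct < 63:
            return ("intermediate", "moderate viscosity", "mixed style")
        else:
            # Sticky magma: gas stays trapped, so pressure builds explosively.
            return ("felsic", "high viscosity", "explosive (stratovolcano)")

    for name, silica in [("Hawaiian basalt", 49), ("andesite", 58), ("rhyolite", 72)]:
        magma_class, viscosity, style = classify_magma(silica)
        print(f"{name} ({silica}% SiO2): {magma_class}, {viscosity}, {style}")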

[Image: lithograph of the 1883 Krakatoa eruption]

The stratovolcanoes are the problematic ones, because their effects stretch far beyond their immediate vicinity, as we've been seeing with Eyjafjallajökull in Iceland, and more notably with the Krakatoa eruption of 1883. That was one of the most violent eruptions in human-recorded history (about 13,000 times the strength of the Hiroshima atomic bomb), and it had profound, long-term effects on global climate: following the eruption was a marked drop in global temperature (1.2 degrees C, according to Wikipedia). Eruptions of this nature do far more than throw lava from the vents: pent-up energy within the magma builds, then explodes, vaporizing rock and throwing up a massive plume of ash, debris and dust. The larger particles come down quickest, given their mass, so the further from the volcano you go, the smaller the debris. The fine dust thrown up in such an event rises into the stratosphere, where it can be carried around the globe. The eruption also pumps gases into the atmosphere, which help deflect sunlight from the planet, allowing a cooling event to occur. The dust and gases in the atmosphere have the added effect of filtering sunlight, leading to some spectacular sunsets.
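That size-sorting of falling debris can be made concrete with Stokes' law, which gives the settling speed of a small sphere in a fluid. The sketch below uses rough assumed values for ash density and air viscosity, and the law strictly holds only for the finest particles, so read it as the trend rather than a real ashfall model.

    # Stokes' law sketch: v = 2 * (rho_p - rho_f) * g * r^2 / (9 * mu).
    # Rough assumed values; valid only for small particles at low Reynolds
    # number, so the largest size below is indicative at best.

    G = 9.81          # gravity, m/s^2
    RHO_ASH = 2500.0  # ash particle density, kg/m^3 (assumed)
    RHO_AIR = 1.2     # air density, kg/m^3
    MU_AIR = 1.8e-5   # dynamic viscosity of air, Pa*s

    def settling_velocity(radius_m):
        return 2 * (RHO_ASH - RHO_AIR) * G * radius_m ** 2 / (9 * MU_AIR)

    for microns in (1, 10, 100):
        v = settling_velocity(microns * 1e-6)
        print(f"{microns:>4} um particle: ~{v:.1e} m/s")

    # A 1 um grain falls at fractions of a millimeter per second -- slow
    # enough to ride stratospheric winds around the globe, while coarse
    # debris drops out near the volcano.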

Another notable event was the 1816 'Year Without a Summer', caused by the eruption of Mount Tambora the year before, itself one of the most powerful eruptions in known history, at roughly four times the strength of Krakatoa. In that instance, massive global cooling occurred, affecting the Northern Hemisphere by destroying crops and precipitating a famine. Here in Vermont, snow fell in every month of the year, and the eruption affected the planet's climate for years to come.

Most eruptions in recent history have been relatively minor, with explosions on the scale of Krakatoa and Mount Tambora occurring long before the advent of modern society and globalization. The dust thrown into the air by an explosive eruption is very fine, and can completely ruin jet engines, which is what has grounded air traffic around Europe, and soon, most likely, Canada. Keeping in mind that this was a relatively small and localized eruption, imagine what will happen when there is another on the scale of those. In that instance, we will have quite a lot more to worry about than stranded passengers.

The Weather Outside

It's finally snowing again in Vermont, and we're expected to get up to a foot in some places, though not necessarily in central Vermont, which has lost a lot of its snow and taken on a spring-like atmosphere, something that will hopefully be changing. Meanwhile, the rest of the country has gotten all the snow that should rightfully be Vermont's, feet at a time, exhausting the budgets of state highway departments just two months into the year.

With the snow came a quick outcry from conservative pundits that the storms invalidated the theory of global climate change, on the grounds that if there is snow on the ground, clearly there can't be any warming in the atmosphere, and that the liberal lies concerning man's impact on the planet have been unraveled by the white stuff on the ground. Just as quickly, liberal commentators slammed, rightly, the thinking behind these short-sighted arguments.

There are a number of different theories about how the climate of the world has interacted with humanity over the past ten thousand years of our existence. Scholarly evidence points irrefutably to the conclusion that the planet has been heating up, both the atmosphere and the oceans (which are a major component of the Earth's climate system), and that this trend largely tracks the rise of industrialization around the world. By and large, there is an assumption that these two trends are inextricably linked. This may or may not be the case, but it does present a compelling case that humanity is indeed responsible, at least in part, for some of the changes in the atmosphere.

Numerous scientific groups from around the world use general circulation models (which attempt to mathematically link the atmosphere, the oceans and the life of the planet into a representation of the world) to see what is happening. While their methods differ, there is a general consensus that humanity has contributed CO2 to the atmosphere in a way that is likely to raise global temperatures between 0.05 and 1.5 degrees Celsius. (Brian Skinner, Stephen C. Porter and Jeffrey Park, Dynamic Earth: An Introduction to Physical Geology, 5th Edition, 518)

While a single degree doesn't seem like a lot, and is even welcomed by some (I can't begin to count how many people I've heard say, with each new snowfall, that they'll welcome Global Warming), that sort of rise in temperature does more than just heat up the planet. With increases in temperature, minute changes in atmospheric patterns occur: increased evaporation from water sources leads to more precipitation elsewhere, which has effects on other areas, which have effects of their own in turn. This is why the term Global Warming has shifted in recent years to Climate Change: not for politically correct reasons, but simply because Global Warming does not cover the entire story. Global Warming, in a way, is one component of Global Climate Change.
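For a sense of how a few watts per square meter of extra forcing translates into roughly a degree of warming, consistent with the range quoted above, here is a zero-dimensional energy-balance sketch. It is a drastic simplification of the circulation models just mentioned, and the ~3.7 W/m² forcing for doubled CO2 is the commonly cited figure, used here as an assumption.

    # Zero-dimensional energy balance: absorbed sunlight = emitted infrared,
    # i.e. sigma * T^4 = (S / 4) * (1 - albedo) + forcing.
    # A drastic simplification of a general circulation model -- a sketch only.

    SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
    S = 1361.0        # solar constant, W/m^2
    ALBEDO = 0.3      # fraction of sunlight reflected back to space

    def equilibrium_temp(forcing=0.0):
        absorbed = (S / 4) * (1 - ALBEDO) + forcing
        return (absorbed / SIGMA) ** 0.25   # solve sigma * T^4 = absorbed

    t0 = equilibrium_temp()             # ~255 K effective temperature
    t1 = equilibrium_temp(forcing=3.7)  # ~3.7 W/m^2: cited doubled-CO2 figure
    print(f"baseline {t0:.1f} K, forced {t1:.1f} K, delta {t1 - t0:.2f} K")

    # The delta is about 1 K before feedbacks -- within the textbook range
    # above. Water vapor, ice and cloud feedbacks are what the real models
    # spend their effort on.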

While wide-scale reporting of the weather did not really exist for much of the world prior to the Second World War, meaning accurate data is relatively recent, other sources of information can be found in the geologic record. Warming and climate-changing events are nothing new in the Earth's history, and numerous locations around the world help to pinpoint what happened in the past. On each continent, large formations of limestone topped with glacial deposits point to long periods of warming followed by global cooling events. Ancient ocean-bed deposits littered with dropstones provide concrete, tangible evidence that these sorts of events happened time and time again, over the course of thousands of years. With the most recent indications pointing to new climate change, and with the possibility of humans speeding up what might be a natural process, the real question becomes not what we can do about it, but what we do next.

When looking through the geologic column, it becomes readily apparent that these sorts of changes occur often, and that the planet's climate has changed drastically throughout the billions of years of its existence. On both the liberal and conservative sides of the argument, there exists a certain stupidity and oversimplification of the issue at hand. I don't think human society should be vilified for essentially doing what life generally does when left to its own devices: expand and make it easier to reproduce. Nor should we blindly close our eyes to the changes that are clearly happening in the world. While there is snowfall in Washington DC - in the middle of winter, I might add - there are countless other problems around the world as global weather patterns shift. Our atmosphere is fickle, and our memory only extends so far, but we have become comfortable with what we remember and what we are used to.

What I dislike most is the timing of many of the arguments against global climate change: with allegations leveled at respected scientific bodies, resignations, and the recent row over the sudden weather, the entire theory of climate change has been thrown into question, with TV pundits talking back and forth and instant viewer polls being broadcast as real news. The notion that climate change is human-made is certainly open to debate, but there is irrefutable evidence that the planet's temperature is rising. The idea that polling data taken from average Americans can be put toe to toe with decades of scholarly, peer-reviewed evidence is just ridiculous. I would hardly expect the average person to understand the science behind how our climate works, let alone the analysis of such a study, and when viewers are fed information and doubt by the media, the comparison is even more ridiculous. I, as someone educated in geology and the scientific method, can hardly grasp the implications and vast scope of such science.

What scares me the most is that the television pundits who go on screen, or stand before a wide audience at a convention, to dispute the existence of such a phenomenon most likely know that they are playing to the fears and uncertainty of the public to fulfill some larger agenda: whether painting climate change legislation as an over-reach of the federal government or as elite liberalism gone wrong. And in reaction, the left overreacts, mocking or coming across as arrogant in its rebuttals, rather than explaining the background of the science involved. In the end, it just reinforces the caricatures on both sides of political thought, all the while adding to the hot air around the world.

The problem with all of this is the dismissal of the scientific method, and it demonstrates that much of the mentality that existed under the Bush administration still persists within a large segment of the United States. There seems to be an irrational fear of academics, of learning and of knowledge, in favor of gut instinct and what one can see. The principles behind science are sound: any phenomenon can be replicated and tested. But the thinking behind science seems to elude much of the population, something that is then exploited when something out of the ordinary occurs, such as the storms that have recently blanketed the United States.

In the meantime, I wouldn't mind if the weather patterns would shift back to normal, so I can get a proper winter back to the places where it can be appreciated.

(In the time that I wrote this last night and the time that I posted this, we got a foot of snow.)

Happy Birthday Hubble

Today is the 19th birthday of the Hubble Space Telescope, which was launched into a high orbit on this day in 1990 by Space Shuttle Discovery. The satellite remains one of the most important instruments ever launched: the images it has taken have vastly increased our knowledge of the surrounding universe, capturing some of its most beautiful sights.

The Hubble Space Telescope was an important project for NASA, which was still reeling from the destruction of the Challenger orbiter just four years earlier, after faulty parts and negligence contributed to the deaths of its crew. NASA's public image was tarnished by the accident, and hopes were riding high on Hubble's success. The launch, STS-31, brought Hubble 380 miles up, the second-highest shuttle orbit, and twice the Shuttle's normal range.

After its deployment on April 25th, scientists found that the images they received weren't as sharp as they'd expected. The telescope's primary mirror had been ground incorrectly, 2.3 micrometers out of the correct shape. NASA's image was once again tarnished, and scientists worked quickly to devise a solution. They were aided by the design of Hubble, which was the only satellite built to be serviced in orbit. The first of four servicing missions brought up a corrective optics package that allowed Hubble's vision to be fixed. The mission was an overwhelming success (except that the astronauts couldn't get one of the doors closed, and had to ratchet it shut). Five spacewalks were performed, and with the new images from the telescope, the agency's public image rebounded. Three additional servicing missions were conducted, in 1997, 1999 and 2002, each of which upgraded equipment or repaired faulty parts.

The last servicing mission, STS-125, is scheduled for next month; it will install a new camera and spectrograph and repair several other instruments that have failed. Following this mission, the telescope should continue operating through 2013, when it will be succeeded by the James Webb Space Telescope.

Hubble's website. Follow it on twitter.

Top Geek Things of 2008

It's coming up on the end of the year, and looking back, 2008 has been a very fun year for geeks everywhere - in books, television and films, among other things. Over the past couple of days, I've been thinking back over the year to pick out the best and worst of 2008.

The Best:

Starbuck returned from the Grave; The Fleet reaches Earth. (Battlestar Galactica Season 4)

The third season of Battlestar Galactica was a little rocky in the middle, but the last episodes set things up with a real bang. Starbuck was presumably killed, only to turn up during a major confrontation between the human and Cylon fleets. Season 4 opens even bigger, with one of the best space battles that I've ever seen. Our four new Cylons are freaking out, Starbuck's back, and everything culminates in the discovery of Earth in episode 10. Galactica has long been one of my favorite shows, and with a definite end point in mind, season four was where it got back on track, with a fairly tight story arc, only to leave us with another long wait for the final ten episodes. It's been well worth it, though.

Pushing Daisies... back from the Grave, and back to it

After a long hiatus due to the writers' strike (more about that in a bit), my favorite show of 2007-2008 came back with a new set of episodes. There are not enough good things that I can say about this show. We left off last year with Chuck learning that it was Ned who killed her father, only to end this season with her father being awoken. It was another season of fantastic storytelling, character development and sharp dialogue. Unfortunately, the show has been axed due to low ratings. Fortunately, Bryan Fuller will be going to Heroes for the latter half of its third season.

Lost Gets Better - Again.

Here's the situation: LOST season 1 blew everyone away. Season 2 drove them away. Season 3 brought some people back, and in season 4, everything got interesting again. This season was the best since the first, in my opinion. We got several new characters (my favorite was Daniel Faraday, the physicist), and a couple of people were killed off. We started seeing flash-forwards, where Jack has a beard and is addicted to pain pills, Hurley's in a mental institution and Sayid is channeling Abrams' Alias. Oh, and they get off the island. Then the island vanishes.

I have Leonard Nimoy's DNA? (The Big Bang Theory)

This show started in 2007, when I was annoyed by its laugh track and grating characters. But this year, I started watching it and enjoying it. While it's certainly a very stereotypical portrayal of nerds and geeks, it's fun, because the creators have put in place a cast of fun characters, and the writers make some jokes that are actually funny. This week's episode, in which Sheldon gets a napkin signed by Leonard Nimoy, was absolutely priceless. Now, if they'll just ditch the laugh track. This show's likely to be around for a while longer - its ratings have gotten better and better as the year goes on.

Back in a Nick of Time (Life on Mars)

One of my absolute favorite shows of all time is Life on Mars. Up until this year, it was only a BBC drama, until ABC picked it up and made a pilot. That pilot sucked, horribly, so the cast was ditched, except for Jason O'Mara, and the show was redone: set in New York City, given a good cast and started up. The result? A solid TV series that has mirrored the original (though it's starting to diverge a bit now), a wonderful soundtrack of classic rock and a story that's actually interesting. I can't wait for its return in 2009.

The Joker raises worldwide GDP. (The Dark Knight)

First, there was excitement when it was announced that the Joker would be the villain. Then Heath Ledger signed on for the role. Then he died earlier this year, after filming was completed, leaving some to wonder whether the film would be released on schedule. Then Warner Brothers covered every surface they could find with Dark Knight ads. When the film was released, it went on to gross $996,680,514 in theaters. It was a huge success, and a fantastic film at that: a comic book movie with true darkness, some real symbolism and good storytelling throughout. It's a pity that we won't see Heath Ledger reprise the role, because his Joker is the best portrayal of a villain in recent film memory.

I am Iron Man (Iron Man)

Before The Dark Knight blew the doors off the box office, there was Iron Man. Iron Man has long been a favorite Marvel superhero of mine, and everything fell into place for this film: good story, well directed, fantastic casting (Robert Downey Jr. as Tony Stark was brilliant) and, of course, the Mark II armor. Marvel proved that it could make a good superhero movie, one that was relevant and not stuck in the low humor that characterized other comic book adaptations. Already, I can't wait for Iron Man 2. And Iron Man 3. And The Avengers.

Eeeeevvvvvaaaaaa (Wall-E)

Pixar has released what is possibly its best film to date (except maybe Toy Story and The Incredibles). Following a robot far from home, Andrew Stanton has presented a cute, romantic science fiction story with some social commentary (said to be unintentional) woven into the CGI. Wall-E is easily the most appealing robot since R2-D2 hit the big screen in 1977, and his antics as he's pulled along for the ride (literally) are cute, heartbreaking and funny. And all with very little real dialogue.

Roar. Crunch. Repeat. (Cloverfield)

Monster movie meets social-networking video, and America gets its own monster. This film was brilliantly shot, with an extremely fun concept: a monster comes and plays t-ball with the Statue of Liberty, and it's all caught on camera by a bunch of twenty-somethings as they escape. The project was conceived by LOST creator J.J. Abrams, and his fingerprints are all over it. From the lack of explanation to the weird stuff, this is a very fun film to watch. Rumor has it a Cloverfield 2 is being talked about.

With My Freeze Ray I Will Stop... The World (Dr. Horrible's Sing Along Blog)

This project was a huge success for Joss Whedon & Co. Conceived during the writers' strike, it presents an aspiring supervillain, Dr. Horrible (Neil Patrick Harris), his buddies, and his quest to finish his freeze ray, avoid Captain Hammer (Nathan Fillion) and win over Penny (Felicia Day). We're treated to musical numbers, crazy plots and a fantastic venture that proves the internet is a viable place to release content. Take a look here.

Up, up and away! (When We Left Earth/NASA)

This year was NASA's 50th year in operation, and the Discovery Channel released a fantastic documentary entitled When We Left Earth that recounted the agency's major achievements and failures throughout the years, bringing viewers some of the most incredible footage of space that I've ever seen, and telling a fantastic story of how NASA came to be, with interviews with astronauts and support personnel. I get chills when I watch it, and wonder when we'll return to the moon and beyond.

Hobbit's Labyrinth (The Hobbit)

After long rumors, production problems and drama with Peter Jackson (who directed Lord of the Rings), Guillermo del Toro has signed on to direct the upcoming Hobbit film and its prequel. (Or two Hobbit films?) This is extremely good news, because the people who can adequately fill Jackson's shoes after LOTR are few and far between. del Toro is the perfect director for this project, having already proven that he can do fantasy brilliantly with his masterpiece Pan's Labyrinth. Plus, he can play in other people's universes, as per his work on the Hellboy films (which weren't as good, but fun).

Watchmen Trailer (Watchmen)

What's been called the greatest graphic novel ever is coming to the big screen, much to the annoyance of its creator, and of FOX, apparently. A trailer for Watchmen aired with The Dark Knight, and it made fanboys everywhere sit up and take notice. There are still complaints that the book is unfilmable and that the film will be too short or too long, but to my eyes? This looks like THE comic book film to see next year. It seems to capture the feel of the comic pretty well, embellished a bit to look badass. Plus, Rorschach looks dead on. Just like I imagined him.

Large Hadron Collider (Science)

The Large Hadron Collider was turned on on September 10th, amid many worries about the world ending. Contrary to popular opinion, the earth didn't vanish into a tiny black hole. The machine was set to uncover the mysteries of the universe, but it broke down nine days later and won't be back online until 2009. But it's still cool!

Geeks in Politics (Obama [spiderman, conan, superman] Patrick Leahy [Batman Cameo])

There's been a lot of geekiness in politics this year. No lightsaber-waving from McCain this time around, but President-elect Obama has claimed to be a big Spider-Man and Conan fan, and did a Superman pose in Metropolis, IL. In addition, VT senator Patrick Leahy, a huge Batman fan, had a cameo in The Dark Knight. He's also the head of the Senate Judiciary Committee. Ironic.

Superheroes: Fashion and Fantasy (Costumes)

The New York Metropolitan Museum of Art hosted an exhibit earlier this year (it's since closed) called Superheroes: Fashion and Fantasy. It featured costumes from classic films, such as the original Superman and Wonder Woman, alongside releases as recent as The Dark Knight and Iron Man. The fashion section was a bit of a miss for me, but the exhibit as a whole was outstanding. Plus, they had original issues of Superman, Batman, Spider-Man and Iron Man comics on display. Covered in a plastic shield, of course...

Star Wars Encyclopedia (Star Wars)

Del Rey released a new and expanded Star Wars Encyclopedia this year, one that is not only comprehensive but still remarkably up to date. That's not likely to last long, given how fast LFL churns out canon material, but it's a beautiful repository of information on the universe. I can spend hours just paging through it.

"Anathem" By Neal Stephenson

I actually have yet to read this book, but it's caught my eye, and it's made a splash in the sci-fi literary world. All I really know about it is that it takes place on an earth-like world and doubles as a philosophical text on knowledge and religion. I'll have to pick it up, and expand my to-read list even further.

A Game of Thrones picked up by HBO (A Song of Ice and Fire)

Another book that I have yet to read, though I actually own this one. HBO has picked up the book for a series. If there's one thing that HBO does well, it's TV shows, because they can pour money into them and get a good result. They also have a good track record with adaptations, such as Band of Brothers and John Adams. I'll be watching when it's released.

Wess'Har Wars End (Karen Traviss)

Several years in the making, Karen Traviss has finally finished her Wess'Har Wars series with book 6, Judge. Starting back in 2003, she introduced readers to a fantastic first-contact story filled with alien races, political commentary and expert storytelling. While Judge didn't deliver quite as well as I'd have liked (it certainly wasn't the strongest of the series), it carried the momentum well and proved to be a good read, one that finished one of my favorite series satisfactorily. Hopefully, Karen will be back to writing hard sci-fi again, because she's incredible at it.

Trooping (501st)

This year I got back into trooping with the 501st Legion. All in all, I did a total of 30 or so events, ranging from small affairs here in VT to much larger ones. The most memorable were the Boston St. Patrick's Day Parade, Burlington Kid's Day, the Weird Al concert, the St-Jean-sur-Richelieu Balloon Festival, the Walk for Autism, and the 2008 Woburn Halloween Parade. All my events are listed here.

With all the good things that have happened this year, there's the other side of the coin: letdowns, disappointments and pure flops.

Worst:

Writer's Strike

Okay, this started in 2007, but it messed up television for the foreseeable future, ending some shows and putting others on a long hiatus that has really hurt ratings. Pushing Daisies was one casualty, Terminator was almost another, and LOST was put off for a year, as was 24. Already, we're on the eve of another major strike over pretty much the same issues - internet distribution. Hopefully, some lessons will be learned.

Surviving a Nuclear Detonation (Indiana Jones)

Indiana Jones came back, and he came back bland. Indiana Jones and the Kingdom of the Crystal Skull faced the impossible task of living up to twenty years of fans' hopes. While it's not a horrible film, it's nowhere near the quality of Raiders or Crusade (although I did like it better than Doom). There was no passion, a crazy storyline and some annoying characters. It has its moments, but they are few and far between.

Skyguy/Snips/Roger Roger (The Clone Wars)

Star Wars was another big LFL franchise that came back this year, and while The Clone Wars certainly had its moments, even high points, the film just extends the image of money-grubbing that LFL has become involved with, which is a shame. There's too much bad dialogue, and too many bad characters and situations, for it to be a good part of the Star Wars universe, but the TV show has been making improvements. The animation is stunningly good and some of the stories are actually good, but every time the battle droids start talking, I want to throw something at my TV.

Michael Crichton Eaten by Cyborg T-Rex and Flesh eating Space Bacteria from the Past.

While my interest in Michael Crichton waned over the years as he began to write crappier books (such as Prey and State of Fear), there's no doubt that he shaped my reading. I'm still a huge fan of Jurassic Park, The Andromeda Strain, The Terminal Man and a number of his older novels. He was one of the most popular sci-fi authors out there (although he resisted the genre label), and most of his works were made into films. It's a shame that he's passed - I was always hoping for another good story from him.

Gary Gygax failed his saving throw

Geek-god Gary Gygax likewise passed away this year, leaving behind a legacy that has shaped nerd culture in the US forever. His creation, Dungeons and Dragons, developed with co-creator Dave Arneson, became one of the defining features of geeks everywhere, and something that I got into back in 2001. Along with giving geeks something to do in groups, it has helped define a generation's activities, reading material and conceptions of fantasy through to this day.

Arthur C Clarke becomes the Space Child

Arguably one of the greatest science fiction authors ever, Clarke's death hit the world hard. He helped to define the literary genre, and the actual science behind it, and was responsible for such classics as 2001: A Space Odyssey, Rendezvous with Rama and Childhood's End, as well as the concept of the telecommunications satellite. He will be sorely missed; he was one of the last authors of the golden age of science fiction still with us. (Today would have been his 91st birthday.)

CNN Hologram technology

On election night, CNN touted their newest thing in newscasting: a 'hologram' of will.i.am. It looked cool, and it looked like a hologram, but it was nothing more than a lot of cameras, empty space, and some CGI. Blah. Let's see some real technology in action, please.

Close the Iris! (Stargate Atlantis)

I was a huge fan of Stargate SG-1, and of Atlantis for its first couple of seasons, but this season has just plain sucked. It's a shame, because there's a good concept there, amidst the horrible characters, stories, and situations. It won't be around much longer, either: Atlantis has been canceled, and will be replaced with Stargate Universe next year.

Even more Confusing and Confounding! (Heroes Season 3)

Heroes Season 1 was brilliant. It introduced a new spin on superheroes, only to fall victim to its own success with a fairly slow and boring second season. (To be sure, the writer's strike had something to do with that, because it did get better.) Season 3 was promised to be bigger and better, and it was certainly bigger, with heroes coming back from the grave and more time travel and action, but none of it made the same impression that Season 1 did. I'm still behind on episodes, but apparently it's been improving. Now that Bryan Fuller's returning to the show, can we PLEASE start off really good and get better? Please?

Weird Science (Fringe)

I was really excited for Fringe, the latest show from JJ Abrams. It was a fun concept, and it had a good couple of episodes at first, but it became so dull that I stopped following it. I might pick it up again at some point, but only when I can marathon the entire thing at once.

Forrest J. Ackerman Dies

Forrest J. Ackerman, one of the first science fiction fans out there, recently passed away. He was a key element in the spread of science fiction fandom, and he helped found the Los Angeles Science Fantasy Society, among many other achievements, as well as influencing numerous authors over his long life.

Borders Downsizes SciFi Sections

I ranted about this earlier, as did a number of authors. Borders has been downsizing its sci-fi sections. While it's understandable that they have to sell what moves and can't put everything on the shelf, you can't predict what the next big hit will be, and you won't know until you actually start selling things.

That's it for this year. Next year, there's already quite a bit coming up. Should be a fun year.

Sci-Fictional

A while ago, I wrote about a show that was coming out that I was pretty excited for - Fringe. The show's been out, and it's pretty much what I expected, and certainly a fun program to watch. The main thing, though, is that you really can't take it too seriously. Popular Mechanics went and did a feature on the bad science in the show. In both episodes that I've seen, they're really taking liberties with what's going on, and they've acknowledged that - J.J. Abrams has said that they would pretty much jump the shark each episode, which makes me think that the creators just want to have as much fun as possible before the ratings plummet.

One of the readers on the PM website left this comment:

" It's science fiction, not science fact. there's no point in wasting time and effort to debunk something that isn't real in the first place"

This made me think a bit: to what extent is science fiction about made-up science? Quite a bit, when you look at some of the things that sci-fi has covered over the years. We see aliens from Mars, aliens from other star systems, wormholes, cloning, robotics, robots that look like people, robots that look like people and want to be people, hyperspace, and so forth - much of it with no real-life counterpart, unless you subscribe to the aliens-landed-at-Roswell thing. So a lot of science fiction utilizes made-up items in order to tell its story.

But how much of this is merely a plot device, and how much is just technobabble? This, in my mind, is what separates the good science fiction from the bad. The best science fiction stories that I've read and watched have some of the more absurd things happen to their characters. Takeshi Kovacs is a super-soldier trained to switch bodies by means of a Stack, a small carbon device implanted in his brain (and in much of the rest of the population's) to prolong life. Shan Frankland was infected with a parasite that allowed her to survive a trip into the vacuum of space for months before being revived. Martin Springfield is an agent for a superintelligence known as the Eschaton, and works to prevent causality violations designed to eliminate the Eschaton. Dr. Susan Calvin is a robopsychologist for U.S. Robots and Mechanical Men, and ... you get the idea.

In each instance, the science is a secondary element, although it's generally very well thought out, given the level and sophistication of knowledge at the time of the book's publication. The characters and story are the primary movers. The same goes for two of my favorite TV shows, Firefly and Battlestar Galactica, where a lot of the science that could have been, and traditionally has been, dropped in as technobabble has been eliminated in favor of a character-driven story.

To me, this is what really makes or breaks a story: when an author or creator can place people in improbable or impossible situations and make them react in a way that entertains or enlightens us, rather than offering a useless explanation for something that doesn't exist.

This isn't to say that all science fiction utilizes fake science, and with time, science catches up to the literature. Charles Stross's Halting State (reviewed here) uses MMORPGs and social networking as part of its storyline, showing off a near future that's quite frightening. Karen Traviss's Wess'Har series features some likely technology throughout the story, presents some very real problems, such as global warming and climate change, several hundred years from the present day, and provides a fairly realistic-seeming future for society after that happens. The film Minority Report actually used a think tank to try to figure out where technology would go, and in the years since it was released, much of what we saw seems likely. The list goes on and on.

The big question is, when do some of the more fantastic things, like cloning, artificial intelligence, flying cars, and jet packs, become non-fictional? We've already had a couple of those things happen.

In short, there's a lot of science in fiction that will be perceived as fake, but it's necessary. In Fringe's case, it's the fantastic explanation that's undermined by bad science. That really doesn't set the show apart from things like The X-Files or Star Trek, but it is fun to watch.

Rant: Large Hadron Collider

"I'm scared if we're going to destroy the world, because if it does, I'll blame the scientists." - Child interviewed on BBC Global News

The LHC went online under the French/Swiss border earlier today. In a nutshell, the LHC is a large scientific instrument designed to re-create conditions that might have existed at the beginning of the universe, during the Big Bang. The device directs beams of particles at each other at 99.999999% of the speed of light, smashing them together; from the wreckage of the collisions, physicists hope to learn something about the fundamental building blocks of matter.
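
Just for fun, here's a back-of-the-envelope sketch of what that speed means for a single proton. The arithmetic is my own, not anything from CERN; the only inputs are the quoted speed and the proton's rest-mass energy:

```python
import math

BETA = 0.99999999                   # the quoted speed, as a fraction of c
PROTON_REST_ENERGY_GEV = 0.938272   # proton rest-mass energy, in GeV

# Lorentz factor: gamma = 1 / sqrt(1 - v^2/c^2)
gamma = 1.0 / math.sqrt(1.0 - BETA**2)
energy_tev = gamma * PROTON_REST_ENERGY_GEV / 1000.0

print(f"Lorentz factor: {gamma:,.0f}")             # ~7,071
print(f"Energy per proton: {energy_tev:.1f} TeV")  # ~6.6 TeV
```

That works out to roughly 6.6 TeV per proton, which squares nicely with the 7 TeV per beam the machine was designed to reach.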

From what I've been able to read and comprehend, one of the big questions they're looking to answer is why matter exists at all. Wired has an article here that talks about some of the other answers they're hoping for.

Why the rant? See the quote up top - there have been a lot of complaints, legal suits, and fears that the LHC could spell the end of the world. Some theorize that the device could create micro black holes that last for nanoseconds, which could grow into bigger ones and thus end the world. Personally, I think this is nonsense, because of the life-spans of said black holes, but also, as Wired states:

Q: Is the Large Hadron Collider a threat to human civilization and the existence of the Earth? A: No. Einstein's relativity says it's impossible. And, just in case, studies of highly-energetic cosmic rays hitting earth rule it out, too.
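
To put numbers on those life-spans: the standard semiclassical Hawking formula gives an evaporation time of t = 5120πG²M³/(ħc⁴). Here's a rough sketch of my own, assuming a hypothetical black hole that somehow absorbed the LHC's entire 14 TeV collision energy (and ignoring any exotic extra-dimension physics):

```python
import math

# Physical constants, SI units
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.0546e-34  # reduced Planck constant, J s
C = 2.9979e8       # speed of light, m/s
EV = 1.602e-19     # joules per electron-volt

# Hypothetical worst case: a black hole carrying the full 14 TeV
# collision energy, converted to mass via E = m c^2
mass = (14e12 * EV) / C**2    # ~2.5e-23 kg

# Semiclassical Hawking evaporation time: t = 5120*pi*G^2*M^3 / (hbar*c^4)
lifetime = 5120 * math.pi * G**2 * mass**3 / (HBAR * C**4)
print(f"Evaporation time: {lifetime:.1e} seconds")  # ~1.3e-84 s
```

Even that absurdly generous case evaporates some 75 orders of magnitude faster than a nanosecond - no time to swallow anything.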

A more reasonable complaint might be the $10 billion spent on the project, which some argue could go to more practical scientific issues and pressing problems. I say that while it's a lot of money, this is possibly one of the most exciting projects to come to fruition in a long time - learning how the universe was created is just mind-blowing.

What annoys me is the absolute lack of vision some people seem to have for the possibilities that lie before us with this device, and that some people are choosing to ignore the more obvious facts and studies regarding the possibility of cataclysmic failure. Cosmic rays from space carry far more energy than these collisions, and we've never had a problem with them.

Of course, if the world does end, well, that's that.

Near-Hit?

I bought a newspaper the other day and, while reading it, came across an interesting column headline: Asteroid Could Collide With Mars, Scientists Say

According to the very brief article, the asteroid, called 2007 WD5, was discovered in November and is calculated to pass very close to Mars. Earlier odds of an impact were given at 1 in 75, but further observations of the asteroid's path have bumped that up to 1 in 25. The current prediction is that if that 1-in-25 chance comes up, the asteroid will hit Mars on January 30th.

Those are close odds - really, really, really close. If this were something coming towards Earth, I'd stock up on food and water and bring along my copy of Cormac McCarthy's The Road.

An asteroid is any rocky body orbiting the sun that's smaller than about 600 miles across; the largest one, at around that size, is called Ceres. Most asteroids orbit the sun between Mars and Jupiter, and both planets have moons that are thought to be captured asteroids.

This isn't the first time that something's plowed into one of the planets - it's most likely a common occurrence. One of the coolest impact craters sits on one of Saturn's moons, Mimas: formed billions of years ago, it leaves the moon looking almost exactly like the Death Star. In 1994, 21 fragments of comet Shoemaker-Levy 9 slammed into Jupiter, punching huge scars into the atmosphere and releasing an incredible amount of energy.

2007 WD5 is about the size of a football field. That doesn't sound too big, but an impact from an object that size would have huge ramifications for the planet. The Meteor Crater in Arizona was formed when an object around a hundred feet across - a third of the size of the one heading towards Mars - hit Earth, leaving a crater a mile in diameter. The last major impact on Earth is thought to have been in 1908, when an asteroid exploded several miles above an uninhabited area of Siberia, causing widespread devastation. Currently, it's theorized that a six-mile-wide asteroid would be enough to cause a mass extinction that would kill off the human race. Sixty-five million years ago, a large asteroid hit what's now Mexico, and is believed to have played a large part in killing off the dinosaurs. An earlier extinction event, at the Permian-Triassic boundary, was even larger, killing off 95% of the planet's life. 95%!
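
For a sense of scale, here's a rough kinetic-energy estimate for a football-field-sized rock. The diameter, density, and speed below are all illustrative assumptions on my part, not figures from the article:

```python
import math

DIAMETER_M = 100.0   # roughly the length of a football field, in meters
DENSITY = 3000.0     # kg/m^3, typical for a rocky asteroid
SPEED = 17_000.0     # m/s, a typical asteroid impact velocity

radius = DIAMETER_M / 2.0
mass = DENSITY * (4.0 / 3.0) * math.pi * radius**3  # ~1.6e9 kg
energy_joules = 0.5 * mass * SPEED**2               # kinetic energy

MEGATON_TNT = 4.184e15  # joules per megaton of TNT
print(f"Impact energy: ~{energy_joules / MEGATON_TNT:.0f} megatons of TNT")
# ~54 megatons: on the order of the largest nuclear device ever detonated
```

Tens of megatons delivered to a single spot - "huge ramifications" is putting it mildly.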

Overall, asteroid impacts are probably a fairly common occurrence when you take geologic time into account (think big). The moon is pockmarked with major impacts that've left bright scars across the surface. Mars and Jupiter, which are closer to the asteroid belt, have probably been hit numerous times in their histories. As I mentioned before, Earth has been hit as well, and will probably be hit again. Currently, there are around 200 known asteroids that cross Earth's path around the sun, and there are probably a lot more that we don't know about. If I remember correctly, our next near-hit will be around 40 years from now, when one will come close to the planet.

I think this sort of thing should really put the fragility of our planet into focus. Life isn't something that should be taken for granted here - in the larger scheme of things, it hasn't been around long, only a few billion years, and it's come very close to being erased, and that's just nature out to get us. A small thing like global warming is pretty good evidence that we can screw things up pretty badly on our own.

Despite all the doom and gloom surrounding these sorts of events (there's really nothing upbeat about planetary extinction), 2007 WD5 (they have to find a better name for this rock) coming close to Mars, and possibly hitting it, is really interesting. Scientifically, it would allow researchers at NASA and around the world to watch what happens when a huge rock slams into a planet, as it happens. I have no idea what it would mean for the existing equipment we have on the planet, such as the Viking landers and the Mars rovers, but we do have orbiters up there, and the amount of data they'd be able to collect would be immense - planetary geologists will have a field day. This is all assuming this isn't one of the 24 times in 25 that the asteroid misses. I, for one, will be following this with great interest.

In the meantime, let's have a marathon of Deep Impact, Armageddon, and the latest craptacular SciFi movie of the month, and watch the skies.

An interesting, related sidenote - I got a copy of an article entitled "An Asteroid Breakup 160 Myr Ago as the Probable Source of the K/T Impactor," which came out in the September issue of Nature. It's really interesting, and discusses where the K/T boundary asteroid (the one from 65 million years ago) came from.

Yes, there is Science, no matter how you cut it.

I'm getting back into the regular writing habit, although I'll still update on what I'm doing.

Climate Expert Says NASA Tried to Silence Him

When will the Bush Administration get it into their heads that global warming is a very real and decidedly problematic thing for everyone in the world? Or that science does, in fact, exist, and that it doesn't conform to any political agenda or party? Having a NASA public affairs officer state that his job is to make the President and the Republican party look good is inexcusable, especially when you look up what NASA's motto is. Beyond that, requiring administration clearance before scientists can talk to reporters is completely out there.

"Where scientists' points of view on climate policy align with those of the administration, however, there are few signs of restrictions on extracurricular lectures or writing."

That shouldn't happen. At all. The entire thing reminds me of soldiers in the Middle East when interviewed by the press: they can give out their name and an extremely general description of what they're doing, but it's so heavily censored that they might as well be handed a script. Since when did science become a classified thing? It's simply not right for the government to pick and choose which information to use; you can't sugarcoat everything, no matter how much you ignore. Science encompasses good and bad, and you must look at both, especially before using it to dictate policy.

Diamond Planets?

I found this interesting article on CNN just a couple of minutes ago. The geology of it all makes sense, but I seriously doubt it holds any serious interest beyond that. And diamonds wouldn't be a treasure there: if silicon, a dominant component of the Earth's crust, is absent, then quartz would most likely become the really valuable thing. Diamonds really aren't that valuable anyway, mainly because their prices have been artificially inflated by dealers throughout human history.


Other Planets in Galaxy May Have Layer of Diamonds

WASHINGTON (Reuters) - Some planets in our galaxy could harbor an unexpected treasure: a thick layer of diamonds hiding under the surface, astronomers reported on Monday. No diamond planet exists in our solar system, but some planets orbiting other stars in the Milky Way might have enough carbon to produce a diamond layer, Princeton University astronomer Marc Kuchner said in a telephone news conference. That kind of planet would have to develop differently from Earth, Mars and Venus, so-called silicate planets made up mostly of silicon-oxygen compounds. Carbon planets might form more like some meteorites than like Earth, which is believed to have condensed from a disk of gas orbiting the sun.

In gas with extra carbon or too little oxygen, carbon compounds like carbides and graphite could form instead of silicates, Kuchner said at a conference on extrasolar planets in Aspen, Colorado.

Any condensed graphite would change into diamond under the high pressures inside carbon planets, potentially forming diamond layers inside the planets many miles thick.

Carbon planets would be made mostly of carbides, although they might have iron cores and atmospheres. Carbides are a kind of ceramic used to line the cylinders of motorcycle engines, among other things, Kuchner said.

Planets orbiting the pulsar PSR 1257+12 may be carbon planets, possibly forming from the disruption of a star that produced carbon as it aged, he said.

Other good candidates for carbon planets might be those located near the galaxy's center, where stars have more carbon than the sun. In fact, the galaxy as a whole is becoming richer in carbon as it gets older, raising the possibility that all planets in the future may be carbon planets, Kuchner said.