Antarctic Sea Ice Growing Despite Global Warming Warnings

Sunday, 29 Jun 2014 10:37 AM

By Sandy Fitzgerald

Eavesdropping on ET: Two New Programs Launching to Listen for Aliens

The 305-meter telescope at Arecibo is just one of a collection that SETI will use to search nearby stars for electronic signals that could indicate intelligent life. If such a civilization were using a similar dish to image exoplanets, SETI’s team should be able to detect it.
CREDIT: Arecibo Observatory

SETI is stepping up its search for alien lifeforms on far-off worlds.

The Search for Extraterrestrial Intelligence (SETI) program recently announced two new methods to search for signals that could come from life on other planets. In the Panchromatic SETI project, multiple telescopes will scan a variety of wavelengths from 30 stars near the sun; the project will look for powerful signals beamed into space, potentially by intelligent extraterrestrials. SETI is also launching an interplanetary eavesdropping program that is expected to search for messages beamed between planets in a single system.

“If we are polluting space, perhaps other extraterrestrials are leaking signals,” Dan Werthimer, director of the Berkeley SETI Research Center, told an audience during the Smithsonian Magazine’s “The Future is Here” Festival in May. “Maybe they’re sending something our way.”


‘Everything we’ve got’

Since humans made their first FM radio and television transmissions, signals from Earth have been spilling out into space, announcing the presence of intelligent life to any group that might be searching for it. According to Werthimer, signals from the 1950s television show “I Love Lucy” have reached thousands of stars, while the nearest suns have already enjoyed “The Simpsons.”

If Earth has unintentionally leaked signs of its presence, other alien civilizations may have done the same thing. SETI’s new Panchromatic project will utilize a variety of telescopes covering a range of frequencies to scour the nearest stars.

“We’re going to throw everything we’ve got at it,” Werthimer added.

The Panchromatic project will examine a sample of 30 stars that lie within 5 parsecs (16 light-years) of the sun. The list includes 13 single stars, seven binary systems and one triple system. Most of the stars are smaller than the sun, but the project will also examine two white dwarfs and one moderately evolved F star. No confirmed exoplanets have been found around any of the stars.

By setting distance as the criterion, the SETI team hopes to alleviate any bias that might otherwise result from focusing on systems similar to that of Earth. The team selected stars for study based only on how far they lie from the sun.

According to SETI-Berkeley’s Andrew Siemion, chief scientist of the eavesdropping project, the search will also probe a diverse stellar population already well studied at many wavelengths.

“In the event of a non-detection, these attributes of the sample will allow us to place strong and broadly applicable limits on the presence of technology,” Siemion said via email.

Observations from the Low Frequency Array (LOFAR) telescope in Europe and the Green Bank Telescope (GBT) in West Virginia will begin over the summer and fall of 2014. Instrument development and commissioning is still in progress for the Infrared Spatial Interferometer (ISI) at Mount Wilson Observatory and the Nickel Telescope at Lick Observatory, both in California. But according to Siemion, the pair should be ready at about the same time. The Nickel Telescope will conduct the first-ever SETI observations done in the near-infrared.

The project has also proposed time on the William E. Gordon telescope at Arecibo Observatory in Puerto Rico, and hopes to piggyback on observations obtained at the Keck Telescope in Mauna Kea, Hawaii.

“Pending availability, we also intend to observe our initial panchromatic target list with other telescopes,” Siemion said.

The close distances of the stars selected for the Panchromatic project should make potential signals from intelligent civilizations easier to detect, Siemion said.

“Within a couple of parsecs, E.T. wouldn’t have to have technology much more advanced than ours in order for us to detect it,” Siemion said.

Signals from E.T.’s rovers

The second SETI project will make use of the observations of multi-planet systems gathered by NASA’s Kepler mission as it attempts to eavesdrop on signals broadcast from one planet to another.

The Kepler telescope detects planets as they pass in front of their stars, causing a dip in the stars’ brightness. If two planets lie in the same orbital plane, pointed toward Earth, they will occasionally line up. If an intelligent species originated on one planet in a system, then went on to explore or inhabit a second planet, signals sent from one planet to the other should be detectable when the two are lined up facing the Earth.
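The alignment idea described above can be sketched with a toy model: given each planet's transit ephemeris (a reference transit epoch, an orbital period, and a transit duration), scan a time range for moments when both planets are transiting at once as seen from Earth. All numbers here are hypothetical round values for illustration, not actual Kepler ephemerides.

```python
# Toy sketch of finding times when two transiting planets line up as seen
# from Earth. Epochs, periods, and durations are hypothetical, in days.

def in_transit(t, epoch, period, duration):
    """True if time t (days) falls within a transit window of this planet."""
    phase = (t - epoch) % period
    # A transit is centered on phase 0; allow half the duration on either side.
    return phase < duration / 2 or phase > period - duration / 2

def mutual_events(t_start, t_end, p1, p2, step=0.01):
    """Scan a time range for moments when both planets transit at once."""
    t = t_start
    events = []
    while t < t_end:
        if in_transit(t, *p1) and in_transit(t, *p2):
            events.append(round(t, 2))
        t += step
    return events

# Hypothetical system: inner planet (epoch 0.0, period 10 d, duration 0.2 d),
# outer planet (epoch 0.0, period 15 d, duration 0.3 d).
inner = (0.0, 10.0, 0.2)
outer = (0.0, 15.0, 0.3)
overlap = mutual_events(-0.5, 31.0, inner, outer)
print(bool(overlap))  # -> True (both planets transit near t = 0 and t = 30)
```

In a real search the windows would come from the published Kepler ephemerides, and the signal hunt would target only those overlap intervals.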

So far, the team has observed about 75 of these events in multi-planet systems using the Green Bank Telescope. The range of radio frequencies includes those used on Earth to communicate with craft sent to other planets.

“Our detection algorithms are sensitive to communications like those used by NASA’s Deep Space Network to communicate with spacecraft, so if E.T. broadcasts something similar at sufficient power, we could hear it,” Siemion said.

Detecting such signals doesn’t necessarily mean researchers will be able to translate them. Scientists may not be able to determine if the communication is to an outpost or a rover. However, that won’t make the discovery any less exciting.

Though a signal between planets should be detectable, Siemion said that it is more likely that a broad signal would be intercepted. Terrestrial television is broadcast in large beams, but these would be too weak to detect with the current experiments. Instead, scientists would be looking for something like the U.S. Air Force’s “sky fence,” a high-frequency radar used in an attempt to track space junk in orbit.

Distance poses one of the biggest problems in eavesdropping on extraterrestrials. The required power for a transmitter to be detected increases with the square of the distance. A transmitter 150 light-years away would need to be 100 times as powerful as one 15 light-years away, if everything else remains the same.
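That inverse-square scaling can be written out directly. The sketch below simply restates the article's 15 versus 150 light-year comparison; the reference power is an arbitrary unit, since only the ratio matters.

```python
# Required transmitter power grows with the square of the distance.
# The reference values restate the article's example; units are arbitrary.

def required_power(distance_ly, reference_power=1.0, reference_distance_ly=15.0):
    """Power a transmitter needs to appear equally strong at a greater distance."""
    return reference_power * (distance_ly / reference_distance_ly) ** 2

# Ten times the distance means one hundred times the power.
print(required_power(150.0))  # -> 100.0
```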

Most of the Kepler planets and planetary candidates lie at significant distances from Earth, making it difficult for scientists to detect weaker signals like those emitted by spacecraft communication. However, if alien civilizations used something akin to Arecibo, Siemion said, scientists would stand a far better chance of detecting it.

“The flood of multiplanet systems discovered by Kepler and the high precision of the planetary ephemerides the Kepler team publishes has directly made this experiment possible,” Siemion said. Ephemerides are tables that provide the positions of astronomical bodies at a given time.

He expressed his excitement about NASA’s planned Transiting Exoplanet Survey Satellite (TESS) mission, set to launch in 2017.


“TESS will find lots of multiplanet systems as well, but they will be closer to Earth,” said Siemion.

Astronomers also look forward to using the Square Kilometre Array (SKA), which could be more than an order of magnitude more sensitive than current systems and thousands of times faster.

The explosion in discoveries of planets and planetary candidates over the past two decades has provided strong encouragement for SETI’s search for intelligent life, Siemion said.

“If there is one message from exoplanet research in the last two decades, it is that, simply, planets are everywhere,” he said. “Moreover, rocky, lukewarm planets appear to be very common. We shouldn’t have to look very far, statistically speaking, to find planets where life could develop.”

NASA launches challenges using OpenNEX data

By GCN Staff

NASA is launching two challenges to give the public an opportunity to create innovative ways to use data from the agency’s Earth science satellites.

The open data challenges will use the Open NASA Earth Exchange (OpenNEX), an Amazon Web Services data and supercomputing platform where users can share knowledge and expertise.

A component of the NASA Earth Exchange, OpenNEX also features a large collection of climate and Earth science satellite data sets, including global land surface images, vegetation conditions, climate observations and climate projections.

“OpenNEX provides the general public with easy access to an integrated Earth science computational and data platform,” said Rama Nemani, principal scientist for the NEX project at NASA’s Ames Research Center in Moffett Field, Calif.

“These challenges allow citizen scientists to realize the value of NASA data assets and offers NASA new ideas on how to share and use that data.”

To educate citizen scientists on how the data on OpenNEX can be used, NASA is releasing a series of online video lectures and hands-on lab modules.

The first stage of the challenge offers as much as $10,000 in awards for ideas on novel uses of the data sets. The second stage, beginning in August, will offer between $30,000 and $50,000 for the development of an application or algorithm that promotes climate resilience using the OpenNEX data, and based on ideas from the first stage of the challenge. NASA will announce the overall challenge winners in December.

OpenNEX is hosted on the Amazon Web Services cloud and available to the public through a Space Act Agreement.

Posted by GCN Staff on Jun 25, 2014 at 12:18 PM

Global warming of the Earth’s surface has decelerated (Viewpoint)

The recently released National Climate Assessment (NCA) from the U.S. government offers considerable cause for concern about climate calamity, but downplays the decelerating trend in global surface temperature in the 2000s, which I document here.

Many climate scientists are currently working to figure out what is causing the slowdown, because if it continues, it would call into question the legitimacy of many climate model projections (and conversely offer some good news for our planet).

An article in Nature earlier this year discusses some of the possible causes of what some have referred to as the global warming “pause” or “hiatus.” Explanations include the quietest solar cycle in over a hundred years, increases in Asian pollution, more effective oceanic heat absorption, and even volcanic activity. Indeed, a peer-reviewed paper published in February estimates that about 15 percent of the pause can be attributed to increased volcanism. But some have questioned whether the pause or deceleration is even occurring at all.

Verifying the pause

You can see the pause (or deceleration in warming) yourself by simply grabbing the freely available data from NASA and NOAA. For the chart below, I took the annual global temperature difference from average (or anomaly) and calculated the change from the prior year. So the very first data point is the change from 2000 to 2001 and so on. One sign of data validation is that the trends are the same on both datasets. Both of these government sources show a slight downward slope since 2000:

(Matt Rogers)
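The year-over-year calculation behind the chart can be sketched as follows. The anomaly values here are hypothetical placeholders for illustration, not the actual NASA or NOAA figures.

```python
# Hypothetical annual temperature anomalies (deg C vs. a baseline) for 2000-2013.
# Placeholder numbers for illustration only, NOT the real NASA/NOAA data.
anomalies = {
    2000: 0.42, 2001: 0.54, 2002: 0.63, 2003: 0.62, 2004: 0.54,
    2005: 0.68, 2006: 0.64, 2007: 0.66, 2008: 0.54, 2009: 0.64,
    2010: 0.72, 2011: 0.61, 2012: 0.64, 2013: 0.68,
}

years = sorted(anomalies)
# Year-over-year change: the first data point is the change from 2000 to 2001.
changes = [(y, round(anomalies[y] - anomalies[y - 1], 2)) for y in years[1:]]

mean_change = sum(c for _, c in changes) / len(changes)
print(changes[0])             # -> (2001, 0.12)
print(round(mean_change, 3))  # -> 0.02
```

Plotting the `changes` series against the year reproduces the kind of chart shown above.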

You can see some of the spikes associated with El Niño events (when heat was released into the atmosphere from warmer than normal ocean temperatures in the tropical Pacific) that occurred in 2004-05 and 2009-10. But the warm changes have generally been decreasing while cool changes have grown.

To be sure, both sets of data points show a mean annual change of +0.01C during the 2000s. But if current trends continue for just a few more years, the mean change for the 2000s will shift to negative; in other words, the warming would really stop. The current +0.01C mean increase in temperatures is insufficient to verify the climate change projections for major warming (even the low-end +1-2C) by mid-to-late century. A peer-reviewed study in Nature Climate Change published in 2013 drew the same conclusion: “Recent observed global warming is significantly less than that simulated by climate models,” it says.

Addressing objections

Whenever this surprising result (that warming has slowed) is pointed out, it raises some objections. Here are a few (feel free to add your own in the comments section!):

“You are cherry-picking your start and end times.”

This is a common argument when any data are shown. The recently released National Climate Assessment used 1901 to 1960 as its baseline for “normal” weather in a number of its benchmark analyses. Other reports use the entire century-long mean, while yet others use the National Weather Service conventional 1981-2010 climatology. All of this is cherry-picking one way or another. The key here is to see if the data are behaving as they should.

For the chart I show above, I could have easily chosen the very warm 1998 as my starting point to amplify my trend line, but instead I cleanly chose the 2000s. Another point everyone should agree on is that I am plotting temperatures coinciding with the highest global atmospheric CO2 concentrations on record. Therefore, no matter what you believe the climate sensitivity is, the warming impact should be strongest in these recent years versus any others.

“The last decade was still the warmest of all time.”

This is true per the data sets I am using (NASA and NOAA), so no dispute there. However, in order for climate change projections to verify, we need to continue breaking records more often than not. In the NASA data set, 2013 broke only one monthly record (2012 only tied one), meaning that most of the time we are not moving upward. Without breaking new warm records, we continue to flatline and fall further behind projections each year.

“Your sample size is too small.”

Critics deem my thirteen data points from the 2000s insufficient to make any case at all. I could have expanded the window to 1998 to raise the count to 15, and I readily admit that the more data, the better in these situations. The question then becomes: what sample size would you need to see before getting concerned that the climate models might be too warm? The trend line for either data set suggests the mean change could shift negative within the next few years. Would that be sufficient?
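To see how such a trend line extrapolates, here is a minimal least-squares sketch. The change values are again hypothetical placeholders standing in for the thirteen year-over-year data points, not the real series.

```python
# Fit a linear trend to year-over-year changes and extrapolate a few years out.
# The change values are hypothetical placeholders, not the real data series.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

years = list(range(2001, 2014))                    # thirteen data points
changes = [0.12, 0.09, -0.01, -0.08, 0.14, -0.04,  # hypothetical year-over-year
           0.02, -0.12, 0.10, 0.08, -0.11, 0.03,   # changes in deg C
           0.04]

slope, intercept = linear_fit(years, changes)
# Extrapolate the fitted line forward to see when the trend crosses zero.
for year in (2014, 2016, 2018):
    print(year, round(slope * year + intercept, 3))
```

Whether such an extrapolation is meaningful with so few points is exactly the sample-size question raised above.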

Every person, and every scientist, may have a different definition here. I will say that the global annual temperature is not just one figure, but a culmination of thousands of data points, which is a very large sample size in itself. If the deceleration in warming continues, it is inconsistent with climate model projections. You can agree with that statement and still add the usual caveat that we need more data. I’m fine with that.

“The data are not accurate.”

This has become my new favorite, because for years and years, key figures in the climate change research community have used these data points to support the view that warming is occurring at an alarming pace. Now, we hear from some scientists that these data are “masking” reality, such that the real global warming is buried in the deep oceans in areas that are difficult to measure.

For example, climate scientist Kevin Trenberth notes the slowdown in warming may just mean that it is “manifested in different ways” now. Trenberth accurately describes the Pacific Decadal Oscillation, which has slowed warming trends during prior negative states (like its current condition). But climate change modeling fails to show this, which suggests it is not capturing important oceanic processes and could well be overdoing climate sensitivity to CO2 increases.

Another “data are inaccurate” argument is that about 15 percent of the planet is not being counted, including large sections of the Arctic that have warmed markedly in recent years. One recent study (Cowtan and Way, 2013) suggests that if those areas were measured (they are estimated in this study using satellite data), the warming would be much stronger and no pause would be seen. Assuming this study is correct, it would not undo the pause in the warming outside the Arctic, where most people live. Furthermore, the rate of warming it estimates globally (factoring in the steeper Arctic warming) is still at the very low end of climate model projections. The study appears to be a valuable contribution, but further work is needed to confirm its results.

“Your assessment is accurate, but it doesn’t matter.”

The main point of this objection is that yes, we are indeed seeing this slowdown, and it is real, but it is only temporary. The recently released NCA acknowledges the slowdown in Appendix 3 and even shows a chart of it (see below).

(National Climate Assessment)

However, it notes that these periods are temporary, driven by natural-variability modifications to the climate system (factors such as the El Niño-La Niña cycles). All of this may indeed be true, but note that the current pause is longer than the prior ones indicated on the chart, so again the question becomes (and they don’t answer this): how long is too long? You can see their red line outlining the latest pause on the right side of the chart, but it does not extend through the last three years, which would make this pause look even longer than its predecessors.

(Editor’s note: For alternative perspectives, see: Faux Pause: Ocean Warming, Sea Level Rise And Polar Ice Melt Speed Up, Surface Warming To Follow and Global Warming Since 1997 Underestimated by Half)

Update (6/21/2014): NASA GISS revises its historical data monthly, so there have indeed been some changes since I produced the top graph earlier this spring. The chart below shows these changes, which do not alter the fundamental argument of deceleration, but do adjust the mean change from +0.01C to +0.02C.

You can see the changes here by year:


