Filled Under: NASA
An Advance in Tractor-Beam Technology
The term “tractor beam” is thought to have made its first appearance in “Spacehounds of IPC,” a sci-fi novel by Edward E. Smith published in 1947. Smith, whose work has been cited as an influence by the likes of Arthur C. Clarke, George Lucas, and J. Michael Straczynski, the creator of the show “Babylon 5,” worked as the chief chemist for a Michigan flour mill (his specialty was doughnut mixes). His best-known works, the Lensman and Skylark series, are full of imagined technologies that, like the tractor beam, were far beyond the reaches of contemporary science but nevertheless based on seemingly sound principles.
Scientists first began working on making tractor beams a reality in the nineteen-nineties, after the Russian ceramics engineer Eugene Podkletnov reported that certain small objects, when placed above a superconducting disk supported on a rotating magnetic field, lost up to two per cent of their weight. His experiment—the results of which were met with widespread, albeit somewhat knee-jerk, skepticism in the physics community—seemed to indicate that it was possible to neutralize the force of gravity, at least in part. Further experiments followed; in 2001, Podkletnov and the Italian physicist Giovanni Modanese built what they called an impulse gravity generator, a device that emitted a beam of focussed radiation in a “short repulsive force.”
Until recently, no one had managed to move anything bigger than a particle. (There was brief excitement earlier this year, when researchers from Australia and Spain successfully moved a plastic sphere fifty nanometres across—around a thousand times thinner than a human hair—by splitting a beam of light in two and using it to press in on the sphere from each side, like a pair of tweezers.) Even NASA has tried to get in on the action, although their vision seems somewhat lacking when compared with the many tractor-beam scenarios already laid out in science fiction: the team of scientists tasked with the job are supposed to come up with more efficient ways of clearing “orbital debris,” i.e., space garbage. (And they don’t look happy about it.)
Now scientists from the University of Dundee, in Scotland, have created something with a bit more muscle. While most of the documented experiments with tractor-beam technology so far have involved light waves, the team from Dundee used sound waves to manipulate a half-inch triangular prism made of metal and rubber, successfully pulling the target toward the source of the acoustic beam. Half an inch may not sound like much, but it’s a vast improvement on fifty nanometres. The experiment was part of a larger project across four U.K. universities—Bristol, Southampton, Glasgow, and Dundee—and took nine months to complete. The results have been published in Physical Review Letters.
The Dundee tractor beam is not entirely dissimilar from those in “Star Wars” and “Star Trek,” in that it draws an object toward it without making physical contact. The device works by taking advantage of an acoustic wave’s natural push effect, called radiation pressure. (Photons also exert radiation pressure, which is part of the reason comet tails always point away from the sun.) What the Dundee team was able to demonstrate was an example of negative radiation pressure, otherwise known as pull. According to Christine Démoré, a senior research fellow at the Institute for Medical Science and Technology, at Dundee, and a co-author of the paper, one of the team’s main reasons for staging the experiment was to show how easily it could be done. “It’s a relatively simple concept, but it’s just obscured by complex math,” she told me. “By shaping a beam of energy so that it goes around an object in some way, hitting it in the back, it’s possible to then pull the object instead of push it.”
To do this, the team used a commercial ultrasound-surgery machine to generate two Bessel beams, a type of acoustic radiation that remains focussed as it travels rather than spreading out. They fired these beams from either side of the target; when the beams hit the sloped sides of the prism, they were deflected upward, like cue balls bouncing off the rail of a billiards table. Because momentum is conserved, the upward deflection of the beams imparted an equal and opposite downward momentum to the target, drawing it toward the energy source.
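The momentum bookkeeping behind that deflection argument can be sketched in a few lines. The numbers below are arbitrary illustrative values, not measurements from the Dundee experiment:

```python
# Illustrative momentum accounting for the beam-deflection argument.
# Units and values are made up; only the geometry matters here.

# Each beam arrives traveling horizontally toward the prism from one side,
# then leaves deflected upward off the prism's sloped face.
incoming = [(+1.0, 0.0), (-1.0, 0.0)]   # (px, py) for the left and right beams
outgoing = [(+0.2, 0.8), (-0.2, 0.8)]   # both beams gain upward momentum

# Net momentum change of the beams:
dpx = sum(o[0] - i[0] for i, o in zip(incoming, outgoing))
dpy = sum(o[1] - i[1] for i, o in zip(incoming, outgoing))

# Momentum is conserved, so the target recoils with the opposite change:
# the horizontal kicks from the two beams cancel, and the net push on the
# target is downward, toward the acoustic source.
target_dp = (-dpx, -dpy)
print(target_dp)
```

With symmetric beams the horizontal components cancel exactly, which is why the target moves straight back toward the source rather than sideways.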
The immediate applications of the Dundee tractor beam are medical. Démoré and her colleagues hope to improve the efficacy of focussed ultrasound surgery, a noninvasive treatment for tumors that works by heating and destroying unwanted tissue. Another potential application is targeted drug delivery, achieved via tiny capsules in the bloodstream. “What we’ve shown in our tractor-beam experiment is that it may be possible to push, drag, or hold the drug capsules at a specific location in the body, improving the targeting of the released drugs,” Démoré told me. And if Dundee’s device could be made to work on larger objects, it could also prove useful for collecting geological samples from parts of the planet currently impossible to reach—volcanic vents, the deep sea, perhaps even space. “Some of this may be a fair way off,” Démoré said. “But we’ve demonstrated the physics that make it conceivable.”
By GCN Staff
NASA launches challenges using OpenNEX data
NASA is launching two challenges to give the public an opportunity to create innovative ways to use data from the agency’s Earth science satellites.
The open data challenges will use the Open NASA Earth Exchange (OpenNEX), an Amazon Web Services data and supercomputing platform where users can share knowledge and expertise.
A component of the NASA Earth Exchange, OpenNEX also features a large collection of climate and Earth science satellite data sets, including global land surface images, vegetation conditions, climate observations and climate projections.
“OpenNEX provides the general public with easy access to an integrated Earth science computational and data platform,” said Rama Nemani, principal scientist for the NEX project at NASA’s Ames Research Center in Moffett Field, Calif.
“These challenges allow citizen scientists to realize the value of NASA data assets and offer NASA new ideas on how to share and use that data.”
To educate citizen scientists on how the data on OpenNEX can be used, NASA is releasing a series of online video lectures and hands-on lab modules.
The first stage of the challenge offers as much as $10,000 in awards for ideas on novel uses of the data sets. The second stage, beginning in August, will offer between $30,000 and $50,000 for the development of an application or algorithm that promotes climate resilience using the OpenNEX data, and based on ideas from the first stage of the challenge. NASA will announce the overall challenge winners in December.
OpenNEX is hosted on the Amazon Web Services cloud and available to the public through a Space Act Agreement.
Posted by GCN Staff on Jun 25, 2014 at 12:18 PM
The recently released National Climate Assessment (NCA) from the U.S. government offers considerable cause for concern about climate calamity, but downplays the decelerating trend in global surface temperature in the 2000s, which I document here.
Many climate scientists are currently working to figure out what is causing the slowdown, because if it continues, it would call into question the legitimacy of many climate model projections (and, conversely, offer some good news for our planet).
An article in Nature earlier this year discusses some of the possible causes of what some have referred to as the global warming “pause” or “hiatus”. Explanations include the quietest solar cycle in over a hundred years, increases in Asian pollution, more effective oceanic heat absorption, and even volcanic activity. Indeed, a peer-reviewed paper published in February estimates that about 15 percent of the pause can be attributed to increased volcanism. But some have questioned whether the pause or deceleration is even occurring at all.
Verifying the pause
You can see the pause (or deceleration in warming) yourself by simply grabbing the freely available data from NASA and NOAA. For the chart below, I took the annual global temperature difference from average (or anomaly) and calculated the change from the prior year. So the very first data point is the change from 2000 to 2001, and so on. One check on the data is that both datasets show the same trend. Both of these government sources show a slight downward slope since 2000:
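The calculation described above can be reproduced in a few lines of Python. The anomaly values here are illustrative placeholders, not the actual NASA or NOAA figures; substitute the annual series you download:

```python
# Sketch of the chart's calculation: year-over-year changes in annual global
# temperature anomalies, plus the mean change and the trend of those changes.
# The anomaly values below are placeholders, not real NASA/NOAA data.

anomalies = {
    2000: 0.42, 2001: 0.54, 2002: 0.63, 2003: 0.62, 2004: 0.54,
    2005: 0.68, 2006: 0.64, 2007: 0.66, 2008: 0.54, 2009: 0.64,
    2010: 0.72, 2011: 0.61, 2012: 0.64, 2013: 0.68,
}

years = sorted(anomalies)
# Change from the prior year: the first data point is 2000 -> 2001, and so on.
changes = [anomalies[y] - anomalies[y - 1] for y in years[1:]]

mean_change = sum(changes) / len(changes)

# Ordinary least-squares slope of the changes over time, to check for the
# downward tilt described in the text.
n = len(changes)
xs = list(range(n))
x_mean = sum(xs) / n
y_mean = mean_change
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, changes)) / \
        sum((x - x_mean) ** 2 for x in xs)

print(f"mean annual change: {mean_change:+.3f} C")
print(f"trend of changes:   {slope:+.4f} C/yr")
```

Running the same arithmetic on both the NASA and NOAA series is the consistency check mentioned above: the two slopes should agree in sign and rough magnitude.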
You can see some of the spikes associated with El Niño events (when heat was released into the atmosphere from warmer than normal ocean temperatures in the tropical Pacific) that occurred in 2004-05 and 2009-10. But the warm changes have generally been decreasing while cool changes have grown.
To be sure, both sets of data points show a mean annual change of +0.01C during the 2000s. But if current trends continue for just a few more years, the mean change for the 2000s will shift to negative; in other words, the warming would really stop. The current +0.01C mean increase in temperatures is insufficient to verify the climate change projections for major warming (even the low-end +1-2C) by mid-to-late century. A peer-reviewed study in Nature Climate Change published in 2013 drew the same conclusion: “Recent observed global warming is significantly less than that simulated by climate models,” it says.
Whenever this surprising result (that warming has slowed) is pointed out, it raises some objections. Here are a few (feel free to add your own in the comments section!):
“You are cherry-picking your start and end times.”
This is a common argument when any data are shown. The recently released National Climate Assessment used 1901 to 1960 as its definition of “normal” weather in a number of its benchmark analyses. Other reports use the entire century-wide mean, while yet others use the National Weather Service's conventional 1981-2010 climatology. All of this is cherry-picking one way or another. The key here is to see if the data are behaving as they should.
For the chart I show above, I could easily have chosen the very warm 1998 as my starting point to amplify my trend line, but instead I simply chose the 2000s. However, one point everyone will agree with is that I’m plotting temperature coinciding with the highest global atmospheric CO2 concentration. Therefore, no matter what you believe the sensitivity is, the impact should be strongest in these recent years versus any others.
“The last decade was still the warmest of all time.”
This is true per the data sets that I am using (NASA and NOAA), so no dispute there. However, in order for climate change projections to verify, we need to continue breaking records more often than not. In the NASA data set, 2013 broke only one monthly record (2012 only tied one), meaning that most of the time we are not moving upward. Without breaking new warm records, we continue to flatline and, each year, fall further and further behind projections.
“Your sample size is too small.”
Critics contend that my thirteen data points from the 2000s are not enough to make any case at all. I could have expanded back to 1998 to raise the sample to fifteen, but I readily admit that the more data the better in these situations. The question then becomes: what sample size would you need to see to start getting concerned that the climate models might be too warm? The trend line for either data set suggests the mean change could shift negative in just the next few years. Would that be sufficient?
Every person, every scientist, may have a different definition here. I will say that the global annual temperature is not just one figure but a culmination of thousands of data points, a very large sample size in itself! If the deceleration in warming continues, it is inconsistent with climate model projections. You can choose to agree with that statement but caveat it with the usual “but we need more data.” I’m fine with that.
“The data are not accurate.”
This has become my new favorite, because for years and years key figures in the climate change research community have used these very data points to support the view that warming is occurring at an alarming pace. Now we hear from some scientists that these data are “masking” reality, such that the real global warming is buried in the deep oceans, in areas that are difficult to measure.
For example, climate scientist Kevin Trenberth notes the slowdown in warming may just mean that it is “manifested in different ways” now. Trenberth accurately describes the Pacific Decadal Oscillation, which has slowed warming trends during prior negative states (like its current condition). But climate change modeling fails to show this, which suggests it is not capturing important oceanic processes and could well be overdoing climate sensitivity to CO2 increases.
Another “data are inaccurate” argument is that about 15 percent of the planet is not being counted, including large sections of the Arctic, which have warmed markedly in recent years. One recent study (Cowtan and Way, 2013) suggests that if those areas were measured (they are estimated in the study using satellite data), the warming would be much stronger and no pause would be seen. Assuming this study is correct, it would not undo the pause in the warming outside the Arctic, where most people live. Furthermore, the rate of warming it estimates globally (factoring in the steeper Arctic warming) is still at the very low end of climate model projections. The study appears to be a valuable contribution, but further work is needed to confirm its results.
“Your assessment is accurate, but it doesn’t matter.”
The main point here is that yes, we are indeed seeing this slowdown, and it is real, but it is only temporary. The recently released NCA acknowledges the slowdown in Appendix 3 and even shows a chart of it (see below).
However, it notes that these periods are temporary, driven by natural-variability-induced modifications to the climate system (factors such as the El Niño-La Niña cycles). All of this may indeed be true, but note that the current pause is longer than prior ones indicated on the chart, so again the question becomes (and they don’t answer this): how long is too long? You can even see their red line outlining the latest pause on the right side of the chart, but it does not extend through the last three years, which would make the current pause look even longer than its predecessors.
(Editor’s note: For alternative perspectives, see: Faux Pause: Ocean Warming, Sea Level Rise And Polar Ice Melt Speed Up, Surface Warming To Follow and Global Warming Since 1997 Underestimated by Half)
Update (6/21/2014): NASA GISS revises its historical data monthly, so there have indeed been some changes since I produced the top graph earlier this spring. The chart below shows these changes, which do not alter the fundamental argument of deceleration but do adjust the mean annual change from +0.01C to +0.02C.
You can see the changes here by year:
Climate models wildly overestimated global warming, study finds
By Maxim Lott
Published September 12, 2013
Can you rely on the weather forecast? Maybe not, at least when it comes to global warming predictions over short time periods.
That’s the upshot of a new study in the journal Nature Climate Change that compared 117 climate predictions made in the 1990s to the actual amount of warming. Out of 117 predictions, the study’s author told FoxNews.com, three were roughly accurate and 114 overestimated the amount of warming. On average, the predictions forecast twice as much global warming as actually occurred.
Some scientists say the study shows that climate modelers need to go back to the drawing board.
“It’s a real problem … it shows that there really is something that needs to be fixed in the climate models,” climate scientist John Christy, a professor at the University of Alabama in Huntsville, told FoxNews.com.
But other scientists say that’s making a mountain out of a molehill.
“This is neither surprising nor particularly troubling to me as a climate scientist,” Melanie Fitzpatrick, a climate scientist with the Union of Concerned Scientists, told FoxNews.com. “The work of our community is constantly to refine our understanding of the climate system and improve models based on that,” she added.
The climate models, Fitzpatrick said, will likely be correct over long periods of time. But there are too many variations in climate to expect models to be accurate over two decades.
But John Christy says that climate models have had this problem going back 35 years, to 1979, the first year for which reliable satellite temperature data exists to compare the predictions to.
“I looked at 73 climate models going back to 1979 and every single one predicted more warming than happened in the real world,” Christy said.
Many of the overestimations also made their way into the popular press. In 1989, the Associated Press reported: “Using computer models, researchers concluded that global warming would raise average annual temperatures nationwide 2 degrees by 2010.”
But according to NASA, global temperature has increased by less than half that — about 0.7 degrees Fahrenheit — from 1989 to 2010.
And in 1972, the Christian Science Monitor reported: “Arctic specialist Bernt Balchen says a general warming trend over the North Pole is melting the polar ice cap and may produce an ice-free Arctic Ocean by the year 2000.” That also proved wrong.
But people should still be concerned about global warming, Fitzpatrick says.
“The paper in no way diminishes the extensive body of observations that global warming is happening and that it is largely due to human activity,” she added.
“Global surface temperature is still rising … 2012 was in the top ten warmest years on record. The period 2001-2010 was the warmest on record since instrumental measurements began,” she added.
Christy agrees that there has been some warming over time, but says man-made greenhouse gases are not as big a driver of climate change as many think, and that many scientists are in denial about their mistakes.
“I think in one sense the climate establishment is embarrassed by this, and so they’re trying to minimize the problem,” he said. “The fundamental thing a climate model is supposed to predict is temperature. And yet it gets that wrong.”
The study authors did not answer questions from FoxNews.com about the policy implications of their research.
Why were the predictions off? The study authors list many possible reasons, from solar irradiation and incorrect assumptions about the number of volcanic eruptions to bad estimates of how CO2 affects cloud patterns.
Christy said he believes the models overestimate warming because of the way they handle clouds.
“Most models assume that clouds shrink when there is CO2 warming, and that lets in more sun, and that’s what heats up the planet – not so much the direct effect of CO2, but the ‘feedback effect’ of having fewer clouds. In the real world, though, the clouds aren’t shrinking,” he said.
The study also says that an overestimate of the power of CO2 as a greenhouse gas could be why the models over-predict, but that they do not know why the models are wrong at this point.
Christy said he is not optimistic about the models being fixed.
“The Earth system is just too complex to be represented in current climate models. I don’t think they’ll get it right for a long time.”
Read more: http://www.foxnews.com/science/2013/09/12/climate-models-wildly-overestimated-global-warming-study-finds/print
Did You Know: Buzz Aldrin Took Holy Communion on the Moon But an Atheist Activist Allegedly Thwarted a Live Broadcast of the Sacrament
Billy Hallowell Science, Social Science, & Humanities
more detail at http://swampland.time.com/2013/07/20/the-secret-communion-on-the-moon-the-44-year-anniversary/
For those who don’t know, there’s a fascinating story surrounding Edwin “Buzz” Aldrin’s “secretive” activities in space. The second man to walk on the moon, Aldrin is also the first, and only known, individual to take Holy Communion there. The act, which has gained scattered coverage in the past, was both a covert and a controversial one.
Just before Aldrin set foot on the moon, he took the Eucharist, using wafers and a bottle of wine he had brought into space from the Webster Presbyterian Church in Webster, Texas. In the decades since the 1969 moon landing, he has openly discussed the experience in detail.
“I poured the wine into the chalice our church had given me. In the one-sixth gravity of the moon the wine curled slowly and gracefully up the side of the cup,” Aldrin told Guideposts Magazine in 1970. “It was interesting to think that the very first liquid ever poured on the moon, and the first food eaten there, were communion elements.”
But this intriguing display, according to the Daily Mail, was purportedly kept secret by the U.S. government, despite the astronaut’s original plans to broadcast the Christian act on the radio.
See a dramatized version of his communion in the 1998 HBO miniseries, “From the Earth to the Moon,” below:
NASA apparently decided not to allow Aldrin to showcase the sacrament, as Madalyn Murray O’Hair, the now-deceased founder of American Atheists, had threatened to sue the U.S. government over a previous religious broadcast from Apollo 8, an earlier mission (Aldrin flew to the moon on Apollo 11).
According to PBS, “She tried to prevent the reading of Genesis on the Apollo 8 space mission, arguing that the astronauts were government employees and thus prohibited from reading the Bible. (The Supreme Court declined jurisdiction.)” So the government was allegedly hesitant to allow Aldrin to broadcast the communion for fear that it would again rile O’Hair.
The Guardian adds more about the famed atheist’s opposition to NASA personnel exhibiting overt faith on the job:
After the Apollo 8 crew had read out the Genesis creation account in orbit, O’Hair wanted a ban on Nasa astronauts practising religion on earth, in space or “around and about the moon” while on duty. She believed it violated the constitutional separation between church and state.
(Photo: Astronaut Buzz Aldrin displays a copy of his new book as he speaks at the launch of the PayPal Galactic initiative at the SETI Institute in Mountain View, Calif., Thursday, June 27, 2013. The program aims to bring together leaders in the space industry working on the issues around the commercialization of space, including what currency will be used, how banking systems will adapt, managing risk and fraud, and customer support. Credit: AP)
While Americans didn’t hear the communion live as Aldrin had originally hoped, those at the Houston Space Center Mission Control were invited to join him in giving thanks as he read a section from the Book of John.
“I am the vine, you are the branches. Whoever remains in me, and I in him, will bear much fruit; for you can do nothing without me,” the astronaut read from a written card (John 15:5).
Despite this great fanfare, Aldrin later wondered if Christian communion was the right move after all. While he said that it was incredibly fulfilling, in his memoir, “Magnificent Desolation,” he seemed to admit that it was, in a sense, divisive.
“Perhaps if I had it to do over again, I would not choose to celebrate communion,” he wrote. “Although it was a deeply meaningful experience for me, it was a Christian sacrament, and we had come to the moon in the name of all mankind–be they Christians, Jews, Muslims, animists, agnostics, or atheists.”