Vladimir Putin on Climate Change

We need to take into consideration all the bombs Loree McBride is dropping! That is a LOT OF POLLUTION. I do think we need to be concerned, first and foremost, with Loree McBride’s space fleet dropping bombs filled with deadly germs on the population. THAT IS THE MOST IMPORTANT ENVIRONMENTAL DISASTER OF OUR TIMES.


I am uncertain how I feel about the claim made by scientists that emissions cause global warming. But I certainly feel that if we can make less pollution, that is always a good thing, even if the pollution does not cause global warming. I also know that Jesuits have in the past put contaminants in gasoline to deliberately pollute the air to make their enemies sick. Our most urgent environmental hazard is Jesuits who willingly and knowingly pollute as a form of biological/chemical warfare! So, for this reason, I appreciate leaders who care about the environment, because they will probably have my passion to STOP LOREE MCBRIDE’S BOMBS! Trump seems not to care about this AT ALL.

Putin says climate change not caused by emissions: https://phys.org/news/2017-03-putin-climate-emissions.html

AND https://www.france24.com/en/20170331-russian-president-vladimir-putin-says-humans-not-responsible-climate-change

CHRISTIAN SCIENTIST WHO BELIEVES IN CLIMATE CHANGE: http://www1.cbn.com/cbnnews/healthscience/2015/July/Christians-Who-Believe-in-Climate-Change


Right-click on links to open them in a new page.

  • Sunday fun: personality testing
    by Judith Curry And now for something different. A sociological experiment for the Denizens. Please take the Enneagram personality test (with instinctual variant) and report your results in a comment. I scored a strong 1 (reformer).  Hard to argue with … Continue reading →
  • Week in review – climate science edition
    by Judith Curry A few things that caught my eye these past few weeks. New research verifies that sea level rise is causing #carbon burial rates to increase on some Florida coasts. [link] Rare ozone hole opens in the Arctic … Continue reading →
  • Imperial College UK COVID-19 numbers don’t seem to add up
    By Nic Lewis Introduction and summary A study published two weeks ago by the COVID-19 Response Team from Imperial College (Ferguson20[1]) appears to be largely responsible for driving UK government policy actions. The study is not peer reviewed; indeed, it … Continue reading →
  • CoV discussion thread II
    by Judith Curry Time for a new thread. To kick things off, here are some interesting articles that I’ve spotted recently. Let me know when you are ready for a climate thread, week in review or something. It’s time to … Continue reading →
  • COVID-19: Updated data implies that UK modelling hugely overestimates the expected death rates from infection
    By Nic Lewis Introduction There has been much media coverage about the danger to life posed by the COVID-19 coronavirus pandemic. While it is clearly a serious threat, one should consider whether the best evidence supports the current degree of … Continue reading →
  • CoV discussion thread
    by Judith Curry Some articles I’ve flagged, plus emails I’ve received. Here are some articles I’ve flagged for discussion; I am not personally endorsing anything here: Coronavirus: The Hammer and the Dance NYT:  Harsh Steps Are Needed to Stop the … Continue reading →
  • Coronavirus uncertainty
    by Judith Curry My thoughts on coronavirus and deep uncertainty. I and my family are in isolation, in relatively comfortable, well-stocked and safe circumstances (solar power with Tesla power wall). My community (Reno NV) has relatively few cases and … Continue reading →
  • Coronavirus technical thread
    by Judith Curry A thread devoted to technical topics, e.g. epidemiology, immunology, treatments.  A more general thread will be coming shortly.
  • Coronavirus discussion thread
    by Judith Curry Discuss.
  • Week in review – science edition
    by Judith Curry A few things that caught my eye this past week. Rates of sea level rise along the N. American E. Coast have decelerated in recent decades (5 of 6). SLR rates were “only slightly lower” in the … Continue reading →
  • Australian fires: Climate ‘truth bomb’?
    by Alan Longhurst Recipe for Australia’s climate ‘truth bomb’:  dubious manipulations of the historical temperature record, ignorance of the climate dynamics of the Southern Hemisphere, and ignorance of Australia’s ecological and social history. A correspondent of The Guardian newspaper writes … Continue reading →
  • Week in review – science edition
    by Judith Curry A few things that caught my eye this past week. On the climate sensitivity and historical warming evolution in recent coupled model ensembles [link] Greenland’s largest glacier (Jakobshavn) has rapidly thickened since 2016. Thickening has been so … Continue reading →
  • Plausible scenarios for climate change: 2020-2050
    by Judith Curry A range of scenarios for global mean surface temperature change between 2020 and 2050, derived using a semi-empirical approach. All three modes of natural climate variability – volcanoes, solar and internal variability – are expected to act … Continue reading →
  • Inconsistency between historical and future CMIP5 simulations
    by Kenneth Fritsch Identification of significant differences between the historical and future CMIP5 simulations for intrinsic climate sensitivities. Introduction There are a number of climate science articles that refer to the potential for climate modelers to select from parameter variables … Continue reading →
  • Economic impact of energy consumption change caused by global warming
    by Peter Lang and Ken Gregory A new paper ‘Economic impact of energy consumption change caused by global warming’ finds global warming may be beneficial. In this blog post we reproduce the Abstract, Policy Implications and Conclusions and parts of … Continue reading →
  • Analysis of a carbon forecast gone wrong: the case of the IPCC FAR
    by Alberto Zaragoza Comendador The IPCC’s First Assessment Report (FAR) made forecasts or projections of future concentrations of carbon dioxide that turned out to be too high. From 1990 to 2018, the increase in atmospheric CO2 concentrations was about 25% … Continue reading →
  • Week in review – science edition
    by Judith Curry A few things that caught my eye this past week UK Met Office on the Australian fires [link] Attribution science and the Australian fires [link] Pielke Jr: The inconvenient facts on Australian bushfires [link] Another take on … Continue reading →
  • Explaining the Discrepancies Between Hausfather et al. (2019) and Lewis&Curry (2018)
    by Ross McKitrick Challenging the claim that a large set of climate model runs published since the 1970s are consistent with observations for the right reasons. Introduction Zeke Hausfather et al. (2019) (herein ZH19) examined a large set of climate model … Continue reading →
  • Climate sensitivity in light of the latest energy imbalance evidence
    by Frank Bosse Equilibrium climate sensitivity computed from the latest energy imbalance data. The Earth Energy Imbalance (EEI) is a key issue for estimating climate sensitivity. If EEI is positive then the Earth’s climate system gains energy; if it’s negative … Continue reading →
  • Why the CO2 reduction pathways are too stringent
    by Jacques Hagoort Why the IPCC carbon budgets in SR1.5 are overly conservative, and the CO2 reduction pathways are too stringent. Abstract Carbon Budgets specify the total amount of CO2 that can be emitted before global warming exceeds a certain … Continue reading →
  • Week in review – science edition
    by Judith Curry A few things that caught my eye this past week. Decadal changes of the reflected solar radiation and the Earth’s energy imbalance [link] Over the 2000–2018 period the Earth Energy Imbalance (EEI) appears to have a downward … Continue reading →
  • 2020
    by Judith Curry Happy New Year! An end of year post.  Not that I have much to say at the moment, but I think we need a new thread. Here are a few ‘end of year’ articles, looking back and … Continue reading →
  • Two more degrees by 2100!
    by Vaughan Pratt An alternative perspective on 3 degrees C? This post was originally intended as a short comment questioning certain aspects of the methodology in JC’s post of December 23, “3 degrees C?”. But every methodology is bound to … Continue reading →
  • 3 degrees C?
    by Judith Curry Is 3 C warming over the 21st century now the ‘best estimate’?  A reframing of how we think about climate change over the 21st century, and my arguments for 1 C. There has been much discussion over … Continue reading →
  • Week in review – science edition
    by Judith Curry A few things that caught my eye this past week Global average mortality and loss rates down 5x over last decade. Effect is strongest for poorest populations. [link] Important new research on the underlying topography of Antarctica … Continue reading →
  • Comment by Cowtan & Jacobs on Lewis & Curry 2018 and Reply: Part 2
    By Nic Lewis In an earlier article here I discussed a Comment on Lewis and Curry 2018 (LC18) by Kevin Cowtan and Peter Jacobs (CJ20), and a Reply from myself and Judith Curry recently published by Journal of Climate (copy … Continue reading →
  • Comment by Cowtan & Jacobs on Lewis & Curry 2018 and Reply: Part 1
    By Nic Lewis A comment on LC18 (recent paper by Lewis and Curry on climate sensitivity)  by Cowtan and Jacobs has been published, along with our response. Introduction In an earlier article here I discussed the Lewis and Curry (2018) … Continue reading →
  • The toxic rhetoric of climate change
    by Judith Curry “I genuinely have the fear that climate change is going to kill me and all my family, I’m not even kidding it’s  all I have thought about for the last 9 months every second of the day. … Continue reading →
  • Week in review – science edition
    by Judith Curry A few things that caught my eye this past week. Tim Palmer and Bjorn Stevens: The challenge of understanding and estimating climate change [link] A new paper indicates “present” temperatures are the coldest of the last 4 … Continue reading →
  • Madrid
    by Judith Curry The UN Climate Change Conference this week in Madrid provides an important opportunity to reflect on state of the public debate surrounding climate change. The UN Climate Conference (COP25) is beginning today in Madrid.  I’ve been invited … Continue reading →
  • Week in review – science edition
    by Judith Curry A few things that caught my eye this past week Winter weather whiplash: impact of meteorological events in seasonally snow covered regions [link] Coral records of variable stress impacts and possible acclimatization to recent marine heat wave … Continue reading →
  • Legacy of Climategate – 10 years later
    by Judith Curry My reflections on Climategate 10 years later, and also reflections on my reflections of 5 years ago. Last week, an email from Rob Bradley reminded me of my previous blog post The legacy of Climategate: 5 years … Continue reading →
  • Week in review – science edition
    by Judith Curry A few things that caught my eye this past week. Effects of atmospheric CO2 enrichment on net photosynthesis and dark respiration rates [link] Identifying key driving processes of major recent heat waves [link] Reassessing Southern Ocean air-sea … Continue reading →
  • Escape from model land
    by Judith Curry “Letting go of the phantastic mathematical objects and achievables of model-land can lead to more relevant information on the real world and thus better-informed decision-making.” – Erica Thompson and Lenny Smith The title and motivation … Continue reading →
  • Week in review – science edition
    by Judith Curry A few things that caught my eye this past week. Coupled modes of North Atlantic ocean-atmosphere variability and the onset of the Little Ice Age [link] Marine Ice Cliff Instability mitigated by slow removal of ice shelves … Continue reading →
  • Reflections on energy blogging
    by Planning Engineer Five years ago today I started guest blogging on Climate Etc., focusing on energy related issues. My initial goal was to share some insights in a more formal fashion on energy related issues being discussed in the … Continue reading →
  • Gregory et al 2019: Unsound claims about bias in climate feedback and climate sensitivity estimation
    By Nic Lewis The recently published open-access paper “How accurately can the climate sensitivity to CO2 be estimated from historical climate change?” by Gregory et al.[i] makes a number of assertions, many uncontentious but others in my view unjustified, misleading … Continue reading →
  • Climate ‘limits’ and timelines
    by Judith Curry Some thoughts in response to a query from a reporter. I received the following questions today from a reporter, related to climate change and ‘timelines.’   These questions are good topics for discussion. My answers are provided below … Continue reading →
  • Week in review – science edition
    by Judith Curry A few things that caught my eye this past week. Was Common Era glacier expansion in the Arctic Atlantic region triggered by unforced atmospheric cooling? [link] The amplitude and origin of sea level variability during the Pliocene … Continue reading →
  • Resplandy et al. Part 5: Final outcome
    By Nic Lewis The editors of Nature have retracted the Resplandy et al. paper. Readers may recall that last autumn I wrote several articles critiquing the Resplandy et al. (2018) ocean heat uptake study in Nature, which was based on … Continue reading →
  • Week in review – science edition
    by Judith Curry A few things that caught my eye this past week. Indian Ocean warming can strengthen the Atlantic meridional overturning circulation [link] Indian ocean warming trend reduces Pacific warming response to anthropogenic greenhouse gases: an interbasin thermostat mechanism … Continue reading →
  • A philosopher’s reflections on AGW denial
    by Dr. Paul Viminitz Of the things I care most about, AGW is near the bottom. But because, as George W. Bush put it, either you’re with us or you’re against them, I think I’d rather be interestingly wrong than … Continue reading →
  • Don’t overhype the link between climate change and hurricanes
    by Judith Curry Doing so erodes scientific credibility — and distracts from the urgent need to shore up our vulnerability to storms’ impacts. Here is the link to my op-ed in the National Review. Full text below. In the aftermath … Continue reading →
  • ‘Alarmism enforcement’ on hurricanes and global warming
    by Judith Curry I used to be concerned about ‘consensus enforcement’ on the topic of climate change.  Now I am concerned about ‘alarmism enforcement.’ Ever since Hurricane Katrina in 2005, any hurricane causing catastrophic damage has been seized upon  by … Continue reading →
  • ENSO predictions based on solar activity
    by Javier By knowing or estimating where in the solar cycle we are we can get an estimate of the chances of a particular outcome even years ahead. El Niño Southern Oscillation (ENSO) is the main source of interannual tropical … Continue reading →
  • Week in review – science edition
    by Judith Curry A few things that caught my eye this past week. Several papers of fundamental importance: *Important new paper by Peter Minnett:  The response of the ocean thermal skin layer to variations in incident infrared radiation [link] *A … Continue reading →
  • Climate Change: What’s the Worst Case?
    by Judith Curry My new manuscript is now available. A link to my new paper ‘Climate Change: What’s the Worst Case?’ is provided here [worst case paper final (1)] A few words on the intended audience and motivation for writing … Continue reading →
  • Re-evaluating the manufacture of the climate consensus
    by Judith Curry A new book by Oppenheimer, Oreskes et al. entitled ‘Discerning Experts: The Practices of Scientific Assessment for Environmental Policy‘ makes a case against consensus seeking in climate science assessments. I have long railed against the consensus-seeking process … Continue reading →
  • The latest travesty in ‘consensus enforcement’
    The latest travesty in consensus ‘enforcement’, published by Nature. There is a new paper published in Nature, entitled Discrepancies in scientific authority and media visibility of climate change scientists and contrarians. Abstract. We juxtapose 386 prominent contrarians with 386 … Continue reading →
  • Week in review – science edition
    by Judith Curry A few things that caught my eye this past week. ‘modern climate sensitivity is relatively low in the context of the geological record, as a result of relatively weak feedbacks due to a relatively low CO2 baseline, … Continue reading →



  • Flooding Stunted 2019 Cropland Growing Season, Resulting in More Atmospheric Carbon Dioxide
    Severe flooding throughout the Midwest—which triggered a delayed growing season for crops in the region—led to a reduction of 100 million metric tons of net carbon uptake during June and July of 2019, according to a new study. For reference, the massive California wildfires of 2018 released an estimated 12.4 million metric tons of carbon into the atmosphere. And although part of this deficit due to floods was compensated for later in the growing season, the combined effects are likely to have resulted in a 15% reduction in crop productivity relative to 2018, the study authors say. The study, published March 31, 2020, in the journal AGU Advances, describes how the carbon uptake was measured using satellite data. Researchers used a novel marker of photosynthesis known as solar-induced fluorescence to quantify the reduced carbon uptake due to the delay in the crops' growth. Independent observations of atmospheric carbon dioxide (CO2) levels were then employed to confirm the reduction in carbon uptake. "We were able to show that it's possible to monitor the impacts of floods on crop growth on a daily basis in near real time from space, which is critical to future ecological forecasting and mitigation," says Yi Yin, research scientist at Caltech and lead author of the study. Record rainfalls soaked the Midwest during the spring and early summer of 2019. For three consecutive months (April, May, and June), the National Oceanic and Atmospheric Administration reported that 12-month precipitation measurements had hit all-time highs. The resulting floods not only damaged homes and infrastructure but also impacted agricultural productivity, delaying the planting of crops in large parts of the Corn Belt, which stretches from Kansas and Nebraska in the west to Ohio in the east. To assess the environmental impact of the delayed growing season, scientists at Caltech and JPL, which Caltech manages for NASA, turned to satellite data. 
As plants convert CO2 and sunlight into oxygen and energy through photosynthesis, a small amount of the sunlight they absorb is emitted back in the form of a very faint glow. The glow, known as solar-induced fluorescence, or SIF, is far too dim to see with the naked eye, but it can be measured through a process called satellite spectrophotometry. The Caltech-JPL team quantified SIF using measurements from a European Space Agency (ESA) satellite-borne instrument to track the growth of crops with unprecedented detail. They found that the seasonal cycle of the 2019 crop growth was delayed by around two weeks and the maximum seasonal photosynthesis was reduced by about 15%. The stunted growing season was estimated to have led to a reduction in carbon uptake by plants of around 100 million metric tons from June to July 2019. "SIF is the most accurate signal of photosynthesis by far that can be observed from space," says Christian Frankenberg, professor of environmental science and engineering at Caltech. "And since plants absorb carbon dioxide during photosynthesis, we wanted to see if SIF could track the reductions in crop carbon uptake during the 2019 floods." To find out, the team analyzed atmospheric CO2 measurements from NASA's Orbiting Carbon Observatory-2 (OCO-2) satellite as well as from aircraft from NASA's Atmospheric Carbon and Transport America (ACT-America) project. "We found that the SIF-based estimates of reduced uptake are consistent with elevated atmospheric CO2 when the two quantities are connected by atmospheric transport models," says Brendan Byrne, co-corresponding author of the study and a NASA postdoc fellow at JPL. "This study illuminates our ability to monitor the ecosystem and its impact on atmospheric CO2 in near real time from space. These new tools allow for global sensing of biospheric uptake of carbon dioxide," says Paul Wennberg, the R. 
Stanton Avery Professor of Atmospheric Chemistry and Environmental Science and Engineering, director of the Ronald and Maxine Linde Center for Global Environmental Science, and founding member of the Orbiting Carbon Observatory project. Wennberg is also the principal investigator of the Resnick Sustainability Institute's Climate Science Research Initiative at Caltech. The paper is titled "Cropland carbon uptake delayed and reduced by 2019 Midwest floods." Co-authors at Caltech include Junjie Liu, visiting associate in environmental science and engineering; Philipp Köhler, research scientist; Liyin He (MS '18), Resnick Sustainability Institute fellow; Rupesh Jeyaram, undergraduate student; and Vincent Humphrey, postdoctoral scholar. Other co-authors include Troy Magney of UC Davis; Kenneth J. Davis, Tobias Gerken, and Sha Feng of Pennsylvania State University; and Joshua P. Digangi of NASA. This research was funded by NASA. This article was originally published on Caltech.edu.
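For scale, the two quantities reported in the article can be compared directly. A minimal sketch (the figures are taken from the article text above; the comparison itself is ours):

```python
# Compare the flood-driven loss of net carbon uptake (June-July 2019)
# with the carbon released by the 2018 California wildfires, using the
# two figures quoted in the article.
flood_uptake_loss_mt = 100.0  # million metric tons of carbon
wildfire_release_mt = 12.4    # million metric tons of carbon

ratio = flood_uptake_loss_mt / wildfire_release_mt
print(f"Uptake loss ≈ {ratio:.1f}x the wildfire carbon release")
```

By this rough comparison, the missing mid-summer uptake is on the order of eight times the wildfire figure, which conveys the scale of the flood's effect on the regional carbon budget.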
  • NASA, University of Nebraska Release New Global Groundwater Maps and U.S. Drought Forecasts
    [Image captions: "Thirty-, 60- and 90-day forecasts of dry and wet conditions, relative to the historic record for the lower 48 states, use GRACE-FO satellite groundwater data for the initial conditions. Credit: NASA/Scientific Visualization Studio" | "This illustration shows the two GRACE-FO satellites in orbit around Earth. The spacecraft track the movement of water around the world. Credit: NASA/JPL-Caltech"] NASA researchers have developed new weekly, satellite-based global maps of soil moisture and groundwater wetness conditions and one- to three-month U.S. forecasts of each product. While maps of current dry/wet conditions for the United States have been available since 2012, this is the first time they have been available globally. "The global products are important because there are so few worldwide drought maps out there," said hydrologist and project lead Matt Rodell of NASA's Goddard Space Flight Center in Greenbelt, Maryland. "Droughts are usually well known when they happen in developed nations. 
But when there's a drought in central Africa, for example, it may not be noticed until it causes a humanitarian crisis. So it's valuable to have a product like this where people can say, 'Wow, it's really dry there and no one's reporting it.'" [Video caption: Using measurements from two satellite missions assimilated into a computer model, researchers have created global maps of terrestrial water around the planet. In addition, they can forecast water availability in the United States up to three months out. Credits: NASA's Goddard Space Flight Center/Scientific Visualization Studio.] These maps are distributed online by the National Drought Mitigation Center at the University of Nebraska-Lincoln (UNL) to support U.S. and global drought monitoring. "Being able to see a weekly snapshot of both soil moisture and groundwater is important to get a complete picture of drought," said professor Brian Wardlow, director for the Center for Advanced Land Management Information Technologies at UNL, who works closely with Rodell on developing remote sensing tools for operational drought monitoring. Monitoring the wetness of the soil is essential for managing agricultural crops and predicting their yields, because soil moisture is the water available to plant roots. Groundwater is often the source of water for crop irrigation. It also sustains streams during dry periods and is a useful indicator of extended drought. But ground-based observations are too sparse to capture the full picture of wetness and dryness across the landscape like the combination of satellites and models can.
    A Global Eye on Water
    Both the global maps and the U.S. forecasts use data from NASA and the German Research Centre for Geosciences' Gravity Recovery and Climate Experiment Follow-On (GRACE-FO) satellites, a pair of spacecraft that detect the movement of water on Earth based on variations of Earth's gravity field. 
GRACE-FO succeeds the highly successful GRACE satellites, which ended their mission in 2017 after 15 years of operation. With the global expansion of the product, and the addition of U.S. forecasts, the GRACE-FO data are filling in key gaps for understanding the full picture of wet and dry conditions that can lead to drought. The satellite-based observations of changes in water distribution are integrated with other data within a computer model that simulates the water and energy cycles. The model then produces, among other outputs, time-varying maps of the distribution of water at three depths: surface soil moisture, root zone soil moisture (roughly the top three feet of soil) and shallow groundwater. The maps have a resolution of 1/8th degree of latitude, or about 8.5 miles (13.7 kilometers), providing continuous data on moisture and groundwater conditions across the landscape. The GRACE and GRACE-FO satellite-based maps are among the essential datasets used by the authors of the U.S. Drought Monitor, the premier weekly map of drought conditions for the United States that is used by the U.S. Department of Agriculture and the Federal Emergency Management Agency, among others, to evaluate which areas may need financial assistance due to losses from drought. "GRACE [provided and GRACE-FO now provides] a national scope of groundwater," said climatologist and Drought Monitor author Brian Fuchs, at the drought center. He and the other authors use multiple datasets to see where the evidence shows conditions have gotten drier or wetter. For groundwater, that used to mean going to individual states' groundwater well data to update the weekly map. "It's saved a lot of time having that groundwater layer along with the soil moisture layers all in one spot," Fuchs said. "The high-resolution data that we're able to bring in allows us to draw those contours of dryness or wetness right to the data itself." 
One of the goals of the new global maps is to make the same consistent product available in all parts of the world—especially in countries that do not have any groundwater-monitoring infrastructure. "Drought is really a key [topic] ... with a lot of the projections of climate and climate change," Wardlow said. "The emphasis is on getting more relevant, more accurate and more timely drought information, whether it be soil moisture, crop health, groundwater, streamflow - [the GRACE missions are] central to this," he said. "These types of tools are absolutely critical to helping us address and offset some of the impacts anticipated, whether it be from population growth, climate change or just increased water consumption in general." Both the Center for Advanced Land Management and the National Drought Mitigation Center are based in UNL's School of Natural Resources, and they are working with international partners, including the U.S. Agency for International Development and the World Bank, to develop and support drought monitoring using the GRACE-FO global maps and other tools in the Middle East, North Africa, South Africa, South East Asia and India.
    U.S. Forecast Maps for the Lower 48
    Droughts can be complex, both in timing and extent. At the surface, soil moisture changes rapidly with weather conditions. The moisture in the root zone changes a little slower but is still very responsive to weather. Lagging behind both is groundwater, since it is insulated from changes in the weather. But for longer-term outlooks on drought severity—or, conversely, flood risk in low-lying areas—groundwater is the metric to watch, said Rodell. "The groundwater maps are like a slowed-down, smoothed version of what you see at the surface," Rodell said. "They represent the accumulation of months or years of weather events." That smoothing provides a more complete picture of the overall drying or wetting trend going on in an area. 
Having an accurate accounting of groundwater levels is essential for accurately forecasting near-future conditions. The new forecast product that projects dry and wet conditions 30, 60 and 90 days out for the lower 48 states uses GRACE-FO data to help set the current conditions. Then the model runs forward in time using the Goddard Earth Observing System, Version 5 seasonal weather forecast model as input. The researchers found that including the GRACE-FO data made the resulting soil moisture and groundwater forecasts more accurate. Since the product has just been rolled out, the user community is only just beginning to work with the forecasts, but Wardlow sees a huge potential. "I think you'll see the GRACE-FO monitoring products used in combination with the forecasts," Wardlow said. "For example, the current U.S. product may show moderate drought conditions, and if you look at the forecast and the forecast shows next month that there's a continued drying trend, then that may change the decision versus if it was a wet trend." The U.S. forecast and global maps are freely available to users through the drought center's data portal. GRACE-FO is a partnership between NASA and the German Research Centre for Geosciences (GeoForschungsZentrum [GFZ]). Both spacecraft are being operated from the German Space Operations Center in Oberpfaffenhofen, Germany, under a GFZ contract with the German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt). NASA's Jet Propulsion Laboratory manages the mission for the agency's Science Mission Directorate at NASA Headquarters in Washington. Caltech in Pasadena, California, manages JPL for NASA. The GRACE-FO mission was launched in early 2018. GRACE was implemented as a joint mission of NASA and the German Aerospace Center. JPL managed the mission's implementation and operations. The GRACE mission was decommissioned in late 2017. 
Development of the drought/wetness products was funded by NASA's Applied Sciences-Water Resources, Terrestrial Hydrology, and GRACE-FO Science Team programs. To download the maps, visit: https://nasagrace.unl.edu/ To learn more about GRACE and GRACE-FO, visit: https://gracefo.jpl.nasa.gov/
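The grid spacing quoted in the article (1/8th degree of latitude, stated as about 8.5 miles or 13.7 kilometers) can be sanity-checked with a quick conversion. A minimal sketch; the ~111 km-per-degree figure is the standard spherical-Earth approximation, our assumption rather than the article's:

```python
# Convert the product's 1/8th-degree grid spacing to ground distance,
# assuming ~111 km per degree of latitude.
KM_PER_DEG_LAT = 111.0
KM_PER_MILE = 1.609344

spacing_km = (1.0 / 8.0) * KM_PER_DEG_LAT
spacing_mi = spacing_km / KM_PER_MILE
print(f"1/8 degree ≈ {spacing_km:.1f} km ≈ {spacing_mi:.1f} miles")
```

This gives about 13.9 km (8.6 miles), consistent with the article's rounded 13.7 km / 8.5 miles, which implies a slightly smaller per-degree constant.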
  • Huge East Antarctic Glacier Especially Susceptible to Climate Impacts
    Image: This photograph shows ripples in the surface of Denman Glacier in East Antarctica that throw shadows against the ice. The glacier is melting at a faster rate now than it was from 2003 to 2008. Credit: NASA
Image: This illustration shows a vertically exaggerated image of the ground under Denman Glacier in East Antarctica, including a deep trough (blue area in the center) beneath its eastern flank. Credit: NASA's Scientific Visualization Studio
Denman Glacier in East Antarctica retreated 3.4 miles (5.4 kilometers) from 1996 to 2018, according to a new study by scientists at NASA's Jet Propulsion Laboratory and the University of California, Irvine. Their analysis of Denman—a single glacier that holds as much ice as half of West Antarctica—also shows that the shape of the ground beneath the ice sheet makes it especially susceptible to climate-driven retreat. Until recently, researchers believed East Antarctica was more stable than West Antarctica because it wasn't losing as much ice compared to the glacial melt observed in the western part of the continent.
"East Antarctica has long been thought to be less threatened, but as glaciers such as Denman have come under closer scrutiny by the cryosphere science community, we are now beginning to see evidence of potential marine ice sheet instability in this region," said Eric Rignot, project senior scientist at JPL and professor of Earth system science at UCI. "The ice in West Antarctica has been melting faster in recent years, but the sheer size of Denman Glacier means that its potential impact on long-term sea level rise is just as significant," Rignot added. If all of Denman melted, it would result in about 4.9 feet (1.5 meters) of sea level rise worldwide. Using radar data from four satellites, part of the Italian COSMO-SkyMed mission that launched its first satellite in 2007, the researchers were able to discern the precise location where the glacier meets the sea and the ice starts to float on the ocean, or its grounding zone. The scientists were also able to reveal the contours of the ground beneath portions of the glacier using data on ice thickness and its speed over land. Denman's eastern flank is protected from exposure to warm ocean water by a roughly 6-mile-wide (10-kilometer-wide) ridge under the ice sheet. But its western flank, which extends about 3 miles (4 kilometers) past its eastern part, sits over a deep, steep trough with a bottom that's smooth and slopes inland. This configuration could funnel warm seawater underneath the ice, making for an unstable ice sheet. The warm water is increasingly being pushed against the Antarctic continent by winds called the westerlies, which have strengthened since the 1980s. "Because of the shape of the ground beneath Denman's western side, there is potential for the intrusion of warm water, which would cause rapid and irreversible retreat, and contribute to global sea level rise in the future," said lead author Virginia Brancato, a scientist at JPL. 
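The 4.9-foot figure can be cross-checked against the commonly used conversion of roughly 361.8 gigatonnes of meltwater per millimeter of global sea level rise (this factor, derived from the ocean's surface area, is an assumption not stated in the article):

```python
# Back-of-envelope: ice mass implied by ~1.5 m of global sea level rise.
GT_PER_MM = 361.8    # Gt of meltwater per mm of sea level (assumed factor)
KM3_PER_GT = 1.091   # volume of 1 Gt of glacial ice (density ~917 kg/m^3)

slr_mm = 1.5 * 1000                 # the article's ~1.5 m, in millimeters
mass_gt = slr_mm * GT_PER_MM        # implied ice mass
volume_km3 = mass_gt * KM3_PER_GT   # implied ice volume

print(f"~{mass_gt:,.0f} Gt of ice (~{volume_km3:,.0f} km^3)")
```

The result, on the order of half a million gigatonnes, conveys the scale behind the statement that Denman alone holds as much ice as half of West Antarctica.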
It will also be important, her colleague Rignot noted, to monitor the part of Denman Glacier that floats on the ocean, which extends for 9,300 square miles (24,000 square kilometers) and includes the Shackleton Ice Shelf and Denman Ice Tongue. Currently, that extension is melting from the bottom up at a rate of about 10 feet (3 meters) annually. That's an increase over its annual melt average of 9 feet (2.7 meters). It's also greater than the average melt rate for East Antarctic ice shelves between 2003 and 2008, which was roughly 2 feet (0.7 meters) per year. The team published their assessment on March 23 in the American Geophysical Union journal Geophysical Research Letters. This project was funded by NASA's Cryosphere Program and received support from the Italian Space Agency and the German Space Agency. Data and bed topography maps are publicly available. News Media Contacts Jane Lee / Ian J. O'Neill Jet Propulsion Laboratory, Pasadena, Calif. 818-354-0307 / 818-354-2649 jane.j.lee@jpl.nasa.gov / ian.j.oneill@jpl.nasa.gov Brian Bell University of California, Irvine 949-824-8249 bpbell@uci.edu
  • New 3D View of Methane Tracks Sources and Movement around the Globe
    NASA’s new 3-dimensional portrait of methane concentrations shows the world’s second largest contributor to greenhouse warming, the diversity of sources on the ground, and the behavior of the gas as it moves through the atmosphere. Combining multiple data sets from emissions inventories, including fossil fuel, agricultural, biomass burning and biofuels, and simulations of wetland sources into a high-resolution computer model, researchers now have an additional tool for understanding this complex gas and its role in Earth’s carbon cycle, atmospheric composition, and climate system. Since the Industrial Revolution, methane concentrations in the atmosphere have more than doubled. After carbon dioxide, methane is the second most influential greenhouse gas, responsible for 20 to 30% of Earth’s rising temperatures to date. “There’s an urgency in understanding where the sources are coming from so that we can be better prepared to mitigate methane emissions where there are opportunities to do so,” said research scientist Ben Poulter at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. Credit: NASA/Scientific Visualization Studio. This video can be downloaded at NASA’s Scientific Visualization Studio. A single molecule of methane is more efficient at trapping heat than a molecule of carbon dioxide, but because the lifetime of methane in the atmosphere is shorter and carbon dioxide concentrations are much higher, carbon dioxide still remains the main contributor to climate change. Methane also has many more sources than carbon dioxide, which include the energy and agricultural sectors, as well as natural sources from various types of wetlands and water bodies. “Methane is a gas that’s produced under anaerobic conditions, so that means when there’s no oxygen available, you’ll likely find methane being produced,” said Poulter.
In addition to fossil fuel activities, primarily from the coal, oil and gas sectors, sources of methane also include the ocean, flooded soils in vegetated wetlands along rivers and lakes, agriculture, such as rice cultivation, and the stomachs of ruminant livestock, including cattle. “It is estimated that up to 60% of the current methane flux from land to the atmosphere is the result of human activities,” said Abhishek Chatterjee, a carbon cycle scientist with Universities Space Research Association based at Goddard. “Similar to carbon dioxide, human activity over long time periods is increasing atmospheric methane concentrations faster than the removal from natural ‘sinks’ can offset it. As human populations continue to grow, changes in energy use, agriculture, rice cultivation and livestock raising will influence methane emissions. However, it’s difficult to predict future trends due to both a lack of measurements and an incomplete understanding of the carbon-climate feedbacks.” Researchers are using computer models to try to build a more complete picture of methane, said research meteorologist Lesley Ott with the Global Modeling and Assimilation Office at Goddard. “We have pieces that tell us about the emissions, we have pieces that tell us something about the atmospheric concentrations, and the models are basically the missing piece tying all that together and helping us understand where the methane is coming from and where it’s going.” To create a global picture of methane, Ott, Chatterjee, Poulter and their colleagues used methane data from emissions inventories reported by countries, NASA field campaigns like the Arctic Boreal Vulnerability Experiment (ABoVE), and observations from the Japanese Space Agency’s Greenhouse Gases Observing Satellite (GOSAT Ibuki) and the Tropospheric Monitoring Instrument aboard the European Space Agency’s Sentinel-5P satellite.
They combined the data sets with a computer model that estimates methane emissions based on known processes for certain land-cover types, such as wetlands. The model also simulates the atmospheric chemistry that breaks down methane and removes it from the air. Then they used a weather model to see how methane traveled and behaved over time while in the atmosphere. The data visualization of their results shows methane’s ethereal movements and illuminates its complexities both in space over various landscapes and with the seasons. Once methane is lofted into the atmosphere, high-altitude winds can transport it far beyond its sources. When they first saw the data visualized, several locations stood out. Credit: NASA's Scientific Visualization Studio In South America, the Amazon River basin and its adjacent wetlands flood seasonally, creating an oxygen-deprived environment that is a significant source of methane. Globally, about 60% of methane emissions come from the tropics, so it’s important to understand the various human and natural sources, said Poulter. Credit: NASA's Scientific Visualization Studio Over Europe, the methane signal is not as strong as over the Amazon. European methane sources are influenced by the human population and the exploration and transport of oil, gas and coal from the energy sector. Credit: NASA's Scientific Visualization Studio In India, rice cultivation and livestock are the two driving sources of methane. “Agriculture is responsible for about 20% of global methane emissions and includes enteric fermentation, which is the processing of food in the guts of cattle, mainly, but also includes how we manage the waste products that come from livestock and other agricultural activities,” said Poulter.
Credit: NASA's Scientific Visualization Studio China’s economic expansion and large population drive the high demand for oil, gas and coal for industry, as well as agricultural production, which are its underlying sources of methane. Credit: NASA's Scientific Visualization Studio The Arctic and high-latitude regions are responsible for about 20% of global methane emissions. “What happens in the Arctic doesn’t always stay in the Arctic,” Ott said. “There’s a massive amount of carbon that’s stored in the northern high latitudes. One of the things scientists are really concerned about is whether or not, as the soils warm, more of that carbon could be released to the atmosphere. Right now, what you’re seeing in this visualization is not very strong pulses of methane, but we’re watching that very closely because we know that’s a place that is changing rapidly and that could change dramatically over time.” “One of the challenges with understanding the global methane budget has been to reconcile the atmospheric perspective on where we think methane is being produced versus the bottom-up perspective, or how we use country-level reporting or land surface models to estimate methane emissions,” said Poulter. “The visualization that we have here can help us understand this top-down and bottom-up discrepancy and help us also reduce the uncertainties in our understanding of the global methane budget by giving us visual cues and a qualitative understanding of how methane moves around the atmosphere and where it’s produced.” The model data of methane sources and transport will also help in the planning of both future field and satellite missions. NASA’s planned GeoCarb satellite, expected to launch around 2023, will provide geostationary space-based observations of methane in the atmosphere over much of the western hemisphere.
  • Solar Energy Tracker Powers Down After 17 Years
    After nearly two decades, the Sun has set for NASA’s SOlar Radiation and Climate Experiment (SORCE), a mission that continued and advanced the agency’s 40-year record of measuring solar irradiance and studying its influence on Earth’s climate. The SORCE team turned off the spacecraft on February 25, 2020, concluding 17 years of measuring the amount, spectrum and fluctuations of solar energy entering Earth’s atmosphere — vital information for understanding climate and the planet’s energy balance. The mission’s legacy is continued by the Total and Spectral solar Irradiance Sensor (TSIS-1), launched to the International Space Station in December 2017, and TSIS-2, which will launch aboard its own spacecraft in 2023. Monitoring Earth’s “Battery” The Sun is Earth’s primary power source. Energy from the Sun, called solar irradiance, drives Earth’s climate, temperature, weather, atmospheric chemistry, ocean cycles, energy balance and more. Scientists need accurate measurements of solar power to model these processes, and the technological advances in SORCE’s instruments allowed more accurate solar irradiance measurements than previous missions. “These measurements are important for two reasons,” said Dong Wu, project scientist for SORCE and TSIS-1 at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “Climate scientists need to know how much the Sun varies, so they know how much change in the Earth’s climate is due to solar variation. Secondly, we’ve debated for years, is the Sun getting brighter or dimmer over hundreds of years? We live only a short period, but an accurate trend will become very important. 
If you know how the Sun is varying and can extend that knowledge into the future, you can then put the anticipated future solar input into climate models together with other information, like trace gas concentrations, to estimate what our future climate will be.” Since 1750, the warming driven by greenhouse gases coming from the human burning of fossil fuels is over 50 times greater than the slight extra warming coming from the Sun itself over that same time interval. SORCE’s four instruments measured solar irradiance in two complementary ways: total and spectral. NASA's Solar Radiation and Climate Experiment, or SORCE, collected these data on total solar irradiance (TSI), the total amount of the Sun’s radiant energy, throughout Sept. 2017. Sunspots (darkened areas on the Sun’s surface) and faculae (brightened areas) create tiny TSI variations that show up as measurable changes in Earth’s climate and systems. Credit: NASA / Walt Feimer Total solar irradiance, or TSI, is the total amount of solar energy that reaches the Earth’s outer atmosphere in a given time. Sunspots (darkened areas on the Sun’s surface) and faculae (brightened areas) create tiny TSI variations that show up as measurable changes in Earth’s climate and systems. From space, SORCE and other solar irradiance missions measure TSI without interference from Earth’s atmosphere. SORCE’s TSI values were slightly but significantly lower than those measured by previous missions. This was not an error — its Total Irradiance Monitor (TIM) was ten times more accurate than previous instruments. This improved the solar irradiance inputs available to Earth climate and weather models.
“The big surprise with TSI was that the amount of irradiance it measured was 4.6 watts per square meter less than what was expected,” said Tom Woods, SORCE’s principal investigator and senior research associate at the University of Colorado’s Laboratory for Atmospheric and Space Physics (LASP) in Boulder, Colorado. “That started a whole scientific discussion and the development of a new calibration laboratory for TSI instruments. It turned out that the TIM was correct, and all the past irradiance measurements were erroneously high.” “It’s not often in climate studies that you make a quantum leap in measurement capability, but the tenfold improvement in accuracy by the SORCE / TIM was exactly that,” said Greg Kopp, TIM instrument scientist for SORCE and TSIS at LASP. SORCE’s other measurements focused on spectrally-resolved solar irradiance (SSI): The variation of solar irradiance with wavelength across the solar spectrum, covering the major wavelength regions important to Earth’s climate and atmospheric composition. Besides the familiar rainbow of colors in visible light, solar energy also contains shorter ultraviolet and longer infrared wavelengths, both of which play important roles in affecting Earth’s atmosphere. Earth’s atmospheric layers and surface absorb different wavelengths of energy — for example, atmospheric ozone absorbs harmful ultraviolet radiation, while atmospheric water vapor and carbon dioxide absorb longer-wavelength infrared radiation, which keeps the surface warm. SORCE was the first satellite mission to record a broad spectrum of SSI for a long period, tracking wavelengths from 1 to 2400 nanometers across its three SSI instruments. “For public health, ozone chemistry and ultraviolet radiation are very important, and visible light is important for climate modeling,” Wu said. 
“We need to know the solar variability at different wavelengths and compare these measurements with our models.” SORCE observed the Sun across two solar minima (periods of low sunspot activity), providing valuable information about variability over a relatively short 11-year period. But a longer record is needed to improve long-term predictions, Wu said. Buying Time for an Aging Mission SORCE was originally designed to collect data for just five years. Extending its lifespan to 17 required creative and resourceful engineering, said Eric Moyer, SORCE’s mission director at Goddard. SORCE’s battery began to degrade in its eighth year of operations, no longer providing enough power to support consistent data collection. Unfortunately, the NASA instrument designed to take up its TSI measurements, Glory, was lost shortly after its 2011 launch, and the next instrument, the NOAA / U.S. Air Force Total solar irradiance Calibration Transfer Experiment (TCTE), would not launch until 2013. If SORCE could no longer operate, the ongoing solar irradiance record could be interrupted. Because the Sun changes very slowly — its sunspots and faculae follow an 11-year cycle, and some changes span decades or even centuries — a long, continuous record is essential for understanding how the Sun behaves. The SORCE team turned off the spacecraft on February 25, 2020, concluding 17 years of measuring the amount, spectrum and fluctuations of solar energy entering Earth’s atmosphere. Credit: NASA / Walt Feimer The engineering team switched to daytime-only solar data collection, powering down the instruments and part of the spacecraft during the night part of the SORCE orbit. This plan effectively allowed the satellite to run with no functioning battery, Woods said — a groundbreaking engineering achievement. 
“The operation and science teams at our partner organizations developed and implemented a completely new way to operate this mission when it appeared it was over because of battery capacity loss,” said Moyer. LASP and Northrop Grumman Space Systems led the development of new operational software in order to continue the SORCE mission. “The small, highly dedicated team persevered and excelled when encountering operational challenges. I am very proud of their excellent accomplishment and honored to have had the opportunity to participate in managing the SORCE mission.” Continuing a Bright Legacy As SORCE’s time in the Sun ends, NASA’s solar irradiance record continues with TSIS-1. The mission measures TSI and SSI with even more advanced instruments that build on SORCE’s legacy, said Wu. They have already enabled advances like establishing a new reference for the “quiet” Sun when there were no sunspots in 2019, and for comparing this to SORCE observations of the previous solar cycle minimum in 2008. TSIS-2 is scheduled to launch in 2023 with identical instruments to TSIS-1. Its vantage point aboard its own spacecraft will give it more flexibility than TSIS-1’s data collection aboard the ISS. “We are looking forward to continuing the groundbreaking science ushered in by SORCE, and to maintaining the solar irradiance data record through this decade and beyond with TSIS-1 and 2,” said LASP’s Peter Pilewskie, principal investigator for the TSIS missions. “SORCE set the standard for measurement accuracy and spectral coverage, two attributes of the mission that were key to gaining insight into the Sun's role in the climate system. TSIS has made additional improvements that should further enhance Sun-climate studies.” “Solar irradiance measurements are very challenging, and the SORCE team proposed a different way, a new technology, to measure them,” said Wu.
“Using advanced technology to advance our science capability, SORCE is a very good example of NASA’s spirit.” For more information on SORCE, visit http://lasp.colorado.edu/home/sorce/.
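The practical weight of the 4.6 W/m² TSI revision discussed above is easier to see in global-mean terms: averaged over the whole rotating sphere, top-of-atmosphere sunlight is TSI divided by 4, because a sphere's surface area is four times its cross-sectional disc. The pre-SORCE value of about 1365.4 W/m² used below is the commonly cited older standard, stated here as an assumption:

```python
# Global-mean insolation = TSI / 4, since Earth intercepts sunlight over a
# disc (pi * r^2) but averages it over its full surface (4 * pi * r^2).
OLD_TSI = 1365.4            # W/m^2, pre-SORCE accepted value (assumption)
NEW_TSI = OLD_TSI - 4.6     # TIM's ~4.6 W/m^2 downward revision

shift = (OLD_TSI - NEW_TSI) / 4   # change in global-mean insolation
print(f"global-mean shift: {shift:.3f} W/m^2")   # ~1.15 W/m^2
```

A shift of roughly 1.15 W/m² in the global mean is of the same order as estimated human-caused radiative forcings, which is why absolute calibration accuracy matters so much for climate modeling.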
  • GRACE, GRACE-FO Satellite Data Track Ice Loss at the Poles
    Image: Greenland's Steenstrup Glacier, with the midmorning sun glinting off the Denmark Strait in the background. The image was taken during a NASA IceBridge airborne survey of the region in 2016. Credit: NASA/Operation IceBridge
Image: Crevasses in southern Greenland are visible from a 2017 Operation IceBridge airborne survey of the region. Credit: NASA/Operation IceBridge
During the exceptionally warm Arctic summer of 2019, Greenland lost 600 billion tons of ice—enough to raise global sea levels by nearly a tenth of an inch (2.2 millimeters) in just two months, a new study shows. Led by scientists at NASA's Jet Propulsion Laboratory and the University of California, Irvine, the study also concludes that Antarctica continues to lose mass, particularly in the Amundsen Sea Embayment and the Antarctic Peninsula on the western part of the continent; however, those losses have been partially offset by gains from increased snowfall in the northeast.
"We knew this past summer had been particularly warm in Greenland, melting every corner of the ice sheet," said lead author Isabella Velicogna, senior project scientist at JPL and a professor at UCI. "But the numbers really are enormous." For context, last summer's losses are more than double Greenland's 2002-2019 yearly average. "In Antarctica, the mass loss in the west proceeds unabated, which will lead to an even further increase in sea level rise," Velicogna said. "But we also observe a mass gain in the Atlantic sector of East Antarctica caused by an uptick in snowfall, which helps mitigate the enormous increase in mass loss that we have seen in the last two decades on other parts of the continent." She and her colleagues came to these conclusions in the process of establishing data continuity between the recently decommissioned Gravity Recovery and Climate Experiment (GRACE) satellite mission and its successor, GRACE Follow-On. As mission partnerships between NASA and the German Aerospace Center, and NASA and the German Research Centre for Geosciences, respectively, the GRACE and GRACE-FO satellites were designed to measure changes to Earth's gravitational pull that result from changes in mass, including water. As water moves around the planet—flowing ocean currents, melting ice, falling rain and so on—it changes the gravitational pull ever so slightly. Scientists use the precise measurements of these variations to monitor Earth's water reserves, including polar ice, global sea levels and groundwater availability. The first GRACE mission was launched in 2002 and decommissioned in October 2017. GRACE-FO, based on similar technology and designed to continue the data record of its predecessor, launched in May 2018. Because of this brief gap, the study team used independent data to test and confirm that the GRACE and GRACE-FO data over Greenland and Antarctica were consistent. Velicogna was pleased with the results. 
"It is great to see how well the data line up in Greenland and Antarctica, even at the regional level," she said. "It is a tribute to the great effort by the project, engineering and science teams to make the mission successful." The study, titled "Continuity of Ice Sheet Mass Loss in Greenland and Antarctica From the GRACE and GRACE Follow-On Missions," was published March 18 in Geophysical Research Letters. In addition to scientists from JPL and UCI, the GRACE and GRACE-FO data continuity project involved researchers from University of Grenoble in France, University of Utrecht in the Netherlands, and the Polar Ice Center at the University of Washington in Seattle. JPL managed the GRACE mission and manages the GRACE-FO mission for NASA's Earth Science Division in the Science Mission Directorate at NASA Headquarters in Washington. Caltech in Pasadena, California, manages JPL for NASA. More information on GRACE and GRACE-FO can be found here: https://www.nasa.gov/mission_pages/Grace/index.html https://gracefo.jpl.nasa.gov/mission/overview/
  • Greenland, Antarctica Melting Six Times Faster Than in the 1990s
    Image: An aerial view of the icebergs near Kulusuk Island, off the southeastern coastline of Greenland, a region that is exhibiting an accelerated rate of ice loss. Credit: NASA Goddard Space Flight Center
Image: Pools of meltwater in southwestern Greenland's ice sheet as observed by a NASA satellite in 2016. Credit: NASA Goddard Space Flight Center
Observations from 11 satellite missions monitoring the Greenland and Antarctic ice sheets have revealed that the regions are losing ice six times faster than they were in the 1990s. If the current melting trend continues, the regions will be on track to match the "worst-case" scenario of the Intergovernmental Panel on Climate Change (IPCC) of an extra 6.7 inches (17 centimeters) of sea level rise by 2100. The findings, published online March 12 in the journal Nature from an international team of 89 polar scientists from 50 organizations, are the most comprehensive assessment to date of the changing ice sheets.
The Ice Sheet Mass Balance Intercomparison Exercise team combined 26 surveys to calculate changes in the mass of the Greenland and Antarctic ice sheets between 1992 and 2018. The assessment was supported by NASA and the European Space Agency. The surveys used measurements from satellites including NASA's Ice, Cloud, and land Elevation Satellite and the joint NASA-German Aerospace Center Gravity Recovery and Climate Experiment. Andrew Shepherd at the University of Leeds in England and Erik Ivins at NASA's Jet Propulsion Laboratory in Southern California led the study. The team calculated that the two ice sheets together lost 81 billion tons per year in the 1990s, compared with 475 billion tons of ice per year in the 2010s—a sixfold increase. In total, Greenland and Antarctica have lost 6.4 trillion tons of ice since the 1990s. The resulting meltwater boosted global sea levels by 0.7 inches (17.8 millimeters). Together, the melting polar ice sheets are responsible for a third of all sea level rise. Of this total sea level rise, 60 percent resulted from Greenland's ice loss and 40 percent resulted from Antarctica's. "Satellite observations of polar ice are essential for monitoring and predicting how climate change could affect ice losses and sea level rise," said Ivins. "While computer simulations allow us to make projections from climate change scenarios, the satellite measurements provide prima facie, rather irrefutable, evidence." The IPCC in its Fifth Assessment Report issued in 2014 predicted global sea levels would rise 28 inches (71 centimeters) by 2100. The Ice Sheet Mass Balance Intercomparison Exercise team's studies show that ice loss from Antarctica and Greenland tracks with the IPCC's worst-case scenario. Combined losses from both ice sheets peaked at 552 billion tons per year in 2010 and averaged 475 billion tons per year for the remainder of the decade.
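The figures above are internally consistent under the standard conversion of roughly 361.8 gigatonnes of meltwater per millimeter of global sea level (a factor derived from the ocean's surface area of about 3.618 × 10⁸ km²; it is an assumption here, not stated in the article):

```python
GT_PER_MM = 361.8   # Gt of meltwater per mm of global sea level (assumed)

total_loss_gt = 6_400        # 6.4 trillion tons lost since the 1990s
slr_mm = total_loss_gt / GT_PER_MM
print(f"implied sea level rise: {slr_mm:.1f} mm")   # ~17.7 mm vs 17.8 mm quoted

rate_1990s_gt, rate_2010s_gt = 81, 475   # Gt per year
print(f"rate increase: {rate_2010s_gt / rate_1990s_gt:.1f}x")   # ~5.9, "sixfold"
```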
The peak loss coincided with several years of intense surface melting in Greenland, and last summer's Arctic heat wave means that 2019 will likely set a new record for polar ice sheet loss, but further analysis is needed. IPCC projections indicate the resulting sea level rise could put 400 million people at risk of annual coastal flooding by the end of the century. "Every centimeter of sea level rise leads to coastal flooding and coastal erosion, disrupting people's lives around the planet," said Shepherd. As for what is driving the ice loss, Antarctica's outlet glaciers are being melted by the ocean, which causes them to speed up. Ocean melting accounts for the majority of Antarctica's ice loss but only half of Greenland's; the rest is caused by rising air temperatures melting the surface of its ice sheet. For more information about the Ice Sheet Mass Balance Intercomparison Exercise, visit: http://imbie.org/ News Media Contacts Ian J. O'Neill Jet Propulsion Laboratory, Pasadena, Calif. 818-354-2649 ian.j.oneill@jpl.nasa.gov Jane Lee Jet Propulsion Laboratory, Pasadena, Calif. 818-354-0307 jane.j.lee@jpl.nasa.gov
  • Visualizing the Quantities of Climate Change
    How big is just one gigatonne? Satellite data show that Greenland and Antarctica are steadily losing mass. So how big is just one gigatonne?
Show me the math:
1 gigatonne of ice = 1.091 km³
Central Park dimensions (from OpenStreetMap): 4 km long, 0.8 km wide
ice height = 1.091 km³ / (4 km * 0.8 km) = 0.3409 km = 340.9 meters
National Mall dimensions (from OpenStreetMap): 2 km long, 0.2 km wide
ice height = 1.091 km³ / (2 km * 0.2 km) = 2.7275 km = 2,727.5 meters
A fully loaded Nimitz-class aircraft carrier weighs 100,000 tonnes, according to the Naval Institute Guide to the Ships and Aircraft of the U.S. Fleet, so one gigatonne is the weight of 10,000 such carriers.
How much is 5,000 gigatonnes of ice? This is the amount of ice lost from the polar ice caps that NASA’s original GRACE mission observed from 2002 to 2017. During the 15-year lifetime of the original GRACE mission (2002-2017), 5,641 gigatonnes of ice were lost in Greenland and Antarctica. Ninety-nine percent of the world’s freshwater ice is located in these ice sheets. This is enough to cover Texas in a sheet of ice 26 feet high.
Show me the math:
Antarctica lost 1,870 gigatonnes since 2002; Greenland lost 3,771 gigatonnes since 2002; 5,641 gigatonnes total combined.
Area of Texas: 696,241 km² (Texas Almanac)
Ice volume: 1.091 km³ per gigatonne
ice height = (5,000 gigatonnes * 1.091 km³/gigatonne) / 696,241 km² = 0.00783 km = 7.83 meters ≈ 25.7 feet
How much is 49,000 gigatonnes of ice? This is our best estimate of how much Greenland and Antarctic ice has melted into the ocean since the start of the 20th century.
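The "show me the math" arithmetic above can be reproduced in a few lines (using the section's own figure of 1.091 km³ per gigatonne of ice, which follows from an ice density of roughly 917 kg/m³):

```python
KM3_PER_GT = 1.091   # volume of one gigatonne of glacial ice, per the text

def ice_height_m(gigatonnes, area_km2):
    """Height in meters of the slab formed by spreading an ice mass over an area."""
    return gigatonnes * KM3_PER_GT / area_km2 * 1000   # km -> m

print(f"Central Park (4 x 0.8 km): {ice_height_m(1, 4 * 0.8):.1f} m")          # ~340.9
print(f"National Mall (2 x 0.2 km): {ice_height_m(1, 2 * 0.2):.1f} m")         # ~2727.5
print(f"Texas (696,241 km^2), 5,000 Gt: {ice_height_m(5000, 696_241):.2f} m")  # ~7.83
```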
  • NASA Satellite Offers Urban Carbon Dioxide Insights
    A new NASA/university study of carbon dioxide emissions for 20 major cities around the world provides the first direct, satellite-based evidence that as a city's population density increases, the carbon dioxide it emits per person declines, with some notable exceptions. The study also demonstrates how satellite measurements of this powerful greenhouse gas can give fast-growing cities new tools to track carbon dioxide emissions and assess the impact of policy changes and infrastructure improvements on their energy efficiency. Cities account for more than 70 percent of global carbon dioxide emissions associated with energy production, and rapid, ongoing urbanization is increasing their number and size. But some densely populated cities emit more carbon dioxide per capita than others. To better understand why, atmospheric scientists Dien Wu and John Lin of the University of Utah in Salt Lake City teamed with colleagues at NASA's Goddard Space Flight Center in Greenbelt, Maryland; Universities Space Research Association (USRA) in Columbia, Maryland; and the University of Michigan in Ann Arbor. They calculated per capita carbon dioxide emissions for 20 urban areas on several continents using recently available carbon dioxide estimates from NASA's Orbiting Carbon Observatory-2 (OCO-2) satellite, managed by the agency's Jet Propulsion Laboratory in Pasadena, California. Cities spanning a range of population densities were selected based on the quality and quantity of OCO-2 data available for them. Cities with minimal vegetation were preferred because plants can absorb and emit carbon dioxide, complicating the interpretation of the measurements. Two U.S. cities were included: Las Vegas and Phoenix. 
Many scientists and policy makers have assumed the best way to estimate and understand differences in carbon dioxide emissions in major cities is to employ a "bottom-up" approach, compiling an inventory of fossil fuel emissions produced by industrial facilities, farms, road transport and power plants. The bottom-up method was the only feasible approach before remote-sensing data sets became available. This approach can provide estimates of emissions by fuel type (coal, oil, natural gas) and sector (power generation, transportation, manufacturing) but can miss some emissions, especially in rapidly developing urban areas. A spatial map of the amount of carbon dioxide (CO2) present in columns of the atmosphere below NASA's Orbiting Carbon Observatory-2 (OCO-2) satellite as it flew over Las Vegas on Feb. 8, 2018. Warmer colors over the city center indicate higher amounts of carbon dioxide. Credit: NASA/JPL-Caltech/University of Utah But for this study, researchers instead employed a "top-down" approach to inventory emissions, using satellite-derived estimates of the amount of carbon dioxide present in the air above an urban area as the satellite flies overhead. "Other people have used fuel statistics, the number of miles driven by a person or how big people's houses are to calculate per capita emissions," Lin said. "We're looking down from space to actually measure the carbon dioxide concentration over a city." Published Feb. 20 in the journal Environmental Research Letters, the study found that cities with higher population densities generally have lower per capita carbon dioxide emissions, in line with previous bottom-up studies based on emissions inventories. But the satellite data provided new insights. "Our motivating question was essentially: When people live in denser cities, do they emit less carbon dioxide? 
The general answer from our analysis suggests, yes, emissions from denser cities are lower," said Eric Kort, principal investigator and associate professor of climate and space sciences and engineering at the University of Michigan. "It isn't a complete picture, since we only see local direct emissions, but our study does provide an alternative direct observational assessment that was entirely missing before." The Density Factor, and Exceptions Scientists have hypothesized that more densely populated urban areas generally emit less carbon dioxide per person because they are more energy efficient: That is, less energy per person is needed in these areas because of factors like the use of public transportation and the efficient heating and cooling of multi-family dwellings. Satellite data can improve our understanding of this relationship because they describe the combined emissions from all sources. This information can be incorporated with more source-specific, bottom-up inventories to help city managers plan for more energy-efficient growth and develop better estimates of future carbon dioxide emissions. The OCO-2 data show that not all densely populated urban areas have lower per capita emissions, however. Cities with major power generation facilities, such as Yinchuan, China, and Johannesburg, had higher emissions than their population density alone would suggest. "The satellite detects the carbon dioxide plume at the power plant, not at the city that actually uses the power," Lin said. "Some cities don't produce as much carbon dioxide, given their population density, but they consume goods and services that would give rise to carbon dioxide emissions elsewhere," Wu added. Another exception to the higher population density/lower emissions observation is affluence. A wealthy urban area, like Phoenix, produces more emissions per capita than a developing city like Hyderabad, India, which has a similar population density. 
The researchers speculate that Phoenix's higher per capita emissions are due to factors such as higher rates of driving and larger, better air-conditioned homes. This animation shows the Orbiting Carbon Observatory-2, the first NASA spacecraft dedicated to studying carbon dioxide in Earth's atmosphere. Credit: NASA/JPL-Caltech Looking Ahead The researchers stress there's much more to be learned about urban carbon dioxide emissions. They believe new data from OCO-2's successor, OCO-3 — which launched to the International Space Station last year — along with future space-based carbon dioxide-observing missions, may shed light on potential solutions to mitigating cities' carbon emissions. "Many people are interested in carbon dioxide emissions from large cities," Wu said. "Additionally, there are a few places with high emissions that aren't necessarily related to population. Satellites can detect and quantify emissions from those locations around the globe." Launched in 2014, OCO-2 gathers global measurements of atmospheric carbon dioxide - the principal human-produced driver of climate change - with the resolution, precision and coverage needed to understand how it moves through the Earth system and how it changes over time. From its vantage point in space, OCO-2 makes roughly 100,000 measurements of atmospheric carbon dioxide over the globe every day. JPL manages OCO-2 for NASA's Science Mission Directorate, Washington. While OCO-2 wasn't optimized to monitor carbon emissions from cities or power plants, it can observe these targets if it flies directly overhead or if the observatory is reoriented to point in their direction. In contrast, OCO-3, which has been collecting daily measurements of carbon dioxide since last summer, features an agile mirror-pointing system that allows it to capture "snapshot maps." 
In a matter of minutes, it can create detailed mini-maps of carbon dioxide over areas of interest ranging from an individual power plant to a large urban area of up to 2,300 square miles (6,400 square kilometers), such as the Los Angeles Basin, something that would take OCO-2 several days to do. For more information on OCO-2 and OCO-3, visit: https://www.nasa.gov/oco2 https://ocov3.jpl.nasa.gov/ News Media Contacts Jane Lee Jet Propulsion Laboratory, Pasadena, Calif. 818-354-0307 janelee@jpl.nasa.gov Paul Gabrielsen University of Utah, Salt Lake City 801-505-8253 paul.gabrielsen@utah.edu
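The density-emissions relationship the study examines is typically summarized as a power-law fit in log-log space. A minimal sketch of that kind of fit, using made-up city numbers (not the study's data), looks like this:

```python
# Illustrative sketch (synthetic numbers, NOT the OCO-2 study's data):
# fitting a power-law trend between population density and per-capita CO2
# emissions, the kind of relationship the study examines.
import math

# (density in people/km^2, per-capita emissions in tonnes CO2/yr) - made up
cities = [(500, 18.0), (1500, 11.0), (4000, 7.5), (9000, 5.0), (20000, 3.4)]

# Least-squares fit of log(emissions) = a + b * log(density)
xs = [math.log(d) for d, _ in cities]
ys = [math.log(e) for _, e in cities]
n = len(cities)
b = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / \
    (n * sum(x * x for x in xs) - sum(xs) ** 2)
a = (sum(ys) - b * sum(xs)) / n

# A negative exponent b means denser cities emit less CO2 per person.
print(round(b, 2))
```

Outliers above or below such a fit would correspond to the exceptions the study flags, such as power-plant-heavy or affluent cities.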
  • Milankovitch (Orbital) Cycles and Their Role in Earth's Climate
    Our lives literally revolve around cycles: series of events that are repeated regularly in the same order. There are hundreds of different types of cycles in our world and in the universe. Some are natural, such as the change of the seasons, annual animal migrations or the circadian rhythms that govern our sleep patterns. Others are human-produced, like growing and harvesting crops, musical rhythms or economic cycles. Cycles also play key roles in Earth’s short-term weather and long-term climate. A century ago, Serbian scientist Milutin Milankovitch hypothesized that the long-term, collective effects of changes in Earth’s position relative to the Sun are a strong driver of Earth’s long-term climate, and are responsible for triggering the beginning and end of glaciation periods (Ice Ages). Specifically, he examined how variations in three types of Earth orbital movements affect how much solar radiation (known as insolation) reaches the top of Earth’s atmosphere as well as where the insolation reaches. These cyclical orbital movements, which became known as the Milankovitch cycles, cause variations of up to 25 percent in the amount of incoming insolation at Earth’s mid-latitudes (the areas of our planet located between about 30 and 60 degrees north and south of the equator). The Milankovitch cycles include: The shape of Earth’s orbit, known as eccentricity; The angle Earth’s axis is tilted with respect to Earth’s orbital plane, known as obliquity; and The direction Earth’s axis of rotation is pointed, known as precession. Let’s take a look at each (further reading is available on why Milankovitch cycles can't explain Earth's current warming). Credit: NASA/JPL-Caltech Eccentricity – Earth’s annual pilgrimage around the Sun isn’t perfectly circular, but it’s pretty close. Over time, the pull of gravity from our solar system’s two largest gas giant planets, Jupiter and Saturn, causes the shape of Earth’s orbit to vary from nearly circular to slightly elliptical. 
Eccentricity measures how much the shape of Earth’s orbit departs from a perfect circle. These variations affect the distance between Earth and the Sun. Eccentricity is the reason why our seasons are slightly different lengths, with summers in the Northern Hemisphere currently about 4.5 days longer than winters, and springs about three days longer than autumns. As eccentricity decreases, the length of our seasons gradually evens out. The difference in the distance between Earth’s closest approach to the Sun (known as perihelion), which occurs on or about January 3 each year, and its farthest departure from the Sun (known as aphelion) on or about July 4, is currently about 5.1 million kilometers (about 3.2 million miles), a variation of 3.4 percent. That means each January, about 6.8 percent more incoming solar radiation reaches Earth than it does each July. When Earth’s orbit is at its most elliptic, about 23 percent more incoming solar radiation reaches Earth at our planet’s closest approach to the Sun each year than at its farthest departure from the Sun. Currently, Earth’s eccentricity is near its least elliptic (most circular) and is very slowly decreasing, in a cycle that spans about 100,000 years. The total change in global annual insolation due to the eccentricity cycle is very small. Because variations in Earth’s eccentricity are fairly small, they’re a relatively minor factor in annual seasonal climate variations. Credit: NASA/JPL-Caltech Obliquity – The angle Earth’s axis of rotation is tilted as it travels around the Sun is known as obliquity. Obliquity is why Earth has seasons. Over the last million years, it has varied between 22.1 and 24.5 degrees perpendicular to Earth’s orbital plane. The greater Earth’s axial tilt angle, the more extreme our seasons are, as each hemisphere receives more solar radiation during its summer, when the hemisphere is tilted toward the Sun, and less during winter, when it is tilted away. 
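Returning to the eccentricity numbers above: insolation falls off with the square of the Earth-Sun distance, which is why a ~3.4 percent distance variation produces roughly double that swing in insolation. A quick check, using today's eccentricity of about 0.0167:

```python
# Insolation varies with the inverse square of Earth-Sun distance. With
# eccentricity e, perihelion distance ~ a(1 - e) and aphelion ~ a(1 + e),
# so a small e produces roughly twice as large a swing in insolation.

def insolation_ratio(eccentricity):
    """Perihelion-to-aphelion insolation ratio from inverse-square falloff."""
    return ((1 + eccentricity) / (1 - eccentricity)) ** 2

# Current orbit (e ~ 0.0167): ~6.9% more insolation at perihelion than at
# aphelion, consistent with the ~6.8% figure quoted above.
print(round((insolation_ratio(0.0167) - 1) * 100, 1))
```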
Larger tilt angles favor periods of deglaciation (the melting and retreat of glaciers and ice sheets). These effects aren’t uniform globally -- higher latitudes receive a larger change in total solar radiation than areas closer to the equator. Earth’s axis is currently tilted 23.4 degrees, or about half way between its extremes, and this angle is very slowly decreasing in a cycle that spans about 41,000 years. It was last at its maximum tilt about 10,700 years ago and will reach its minimum tilt about 9,800 years from now. As obliquity decreases, it gradually helps make our seasons milder, resulting in increasingly warmer winters, and cooler summers that gradually, over time, allow snow and ice at high latitudes to build up into large ice sheets. As ice cover increases, it reflects more of the Sun’s energy back into space, promoting even further cooling. Credit: NASA/JPL-Caltech Precession – As Earth rotates, it wobbles slightly upon its axis, like a slightly off-center spinning toy top. This wobble is due to tidal forces caused by the gravitational influences of the Sun and Moon that cause Earth to bulge at the equator, affecting its rotation. The trend in the direction of this wobble relative to the fixed positions of stars is known as axial precession. The cycle of axial precession spans about 25,771.5 years. Axial precession makes seasonal contrasts more extreme in one hemisphere and less extreme in the other. Currently perihelion occurs during winter in the Northern Hemisphere and in summer in the Southern Hemisphere. This makes Southern Hemisphere summers hotter and moderates Northern Hemisphere seasonal variations. But in about 13,000 years, axial precession will cause these conditions to flip, with the Northern Hemisphere seeing more extremes in solar radiation and the Southern Hemisphere experiencing more moderate seasonal variations. 
Axial precession also gradually changes the timing of the seasons, causing them to begin earlier over time, and gradually changes which star Earth’s axis points to at the North Pole (the North Star). Today Earth’s North Stars are Polaris and Polaris Australis, but a couple of thousand years ago, they were Kochab and Pherkad. There’s also apsidal precession. Not only does Earth’s axis wobble, but Earth’s entire orbital ellipse also wobbles irregularly, primarily due to its interactions with Jupiter and Saturn. The cycle of apsidal precession spans about 112,000 years. Apsidal precession changes the orientation of Earth’s orbit relative to the elliptical plane. The combined effects of axial and apsidal precession result in an overall precession cycle spanning about 23,000 years on average. A Climate Time Machine The small changes set in motion by Milankovitch cycles operate separately and together to influence Earth’s climate over very long timespans, leading to larger changes in our climate over tens of thousands to hundreds of thousands of years. Milankovitch combined the cycles to create a comprehensive mathematical model for calculating differences in solar radiation at various Earth latitudes along with corresponding surface temperatures. The model is sort of like a climate time machine: it can be run backward and forward to examine past and future climate conditions. Milankovitch assumed changes in radiation at some latitudes and in some seasons are more important than others to the growth and retreat of ice sheets. In addition, it was his belief that obliquity was the most important of the three cycles for climate, because it affects the amount of insolation in Earth’s northern high-latitude regions during summer (the relative role of precession versus obliquity is still a matter of scientific study). He calculated that Ice Ages occur approximately every 41,000 years. 
Subsequent research confirms that they did occur at 41,000-year intervals between one and three million years ago. But about 800,000 years ago, the cycle of Ice Ages lengthened to 100,000 years, matching Earth’s eccentricity cycle. While various theories have been proposed to explain this transition, scientists do not yet have a clear answer. Milankovitch’s work was supported by other researchers of his time, and he authored numerous publications on his hypothesis. But it wasn’t until about 10 years after his death in 1958 that the global science community began to take serious notice of his theory. In 1976, a study in the journal Science by Hays et al. using deep-sea sediment cores found that Milankovitch cycles correspond with periods of major climate change over the past 450,000 years, with Ice Ages occurring when Earth was undergoing different stages of orbital variation. Several other projects and studies have also upheld the validity of Milankovitch’s work, including research using data from ice cores in Greenland and Antarctica that has provided strong evidence of Milankovitch cycles going back many hundreds of thousands of years. In addition, his work has been embraced by the National Research Council of the U.S. National Academy of Sciences. Scientific research to better understand the mechanisms that cause changes in Earth’s rotation and how specifically Milankovitch cycles combine to affect climate is ongoing. But the theory that they drive the timing of glacial-interglacial cycles is well accepted.
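The kind of spectral analysis Hays et al. applied to deep-sea sediment cores can be illustrated with a toy signal: build a synthetic "climate record" from the three Milankovitch periods plus noise, then recover the periodicities with a Fourier transform. The amplitudes and noise level here are illustrative only, not values from any real core.

```python
# Toy spectral analysis: embed the ~100, ~41 and ~23 kyr Milankovitch
# periods in a noisy synthetic record, then recover them with a DFT.
import cmath
import math
import random

random.seed(0)
N = 1024            # number of samples
dt = 1.0            # thousand years (kyr) per sample
periods = [100.0, 41.0, 23.0]  # kyr: eccentricity, obliquity, precession

signal = [sum(math.cos(2 * math.pi * t * dt / p) for p in periods)
          + random.gauss(0, 0.3) for t in range(N)]

# Naive O(N^2) discrete Fourier transform is plenty for a sketch this size
power = [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / N)
               for t in range(N))) ** 2 for k in range(1, N // 2)]

top3 = sorted(range(len(power)), key=lambda i: power[i], reverse=True)[:3]
recovered = sorted(N * dt / (i + 1) for i in top3)  # bin index -> period
print([round(p, 1) for p in recovered])  # peaks near 23, 41 and 100 kyr
```

Real analyses use much longer records and more careful spectral estimators, but the principle is the same: the orbital periods stand out as peaks in the power spectrum.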
  • NASA Prepares for New Science Flights Above Coastal Louisiana
    Delta-X, a new NASA airborne investigation, is preparing to embark on its first field campaign in the Mississippi River Delta in coastal Louisiana. Beginning in April, the Delta-X science team, led by Principal Investigator Marc Simard of NASA's Jet Propulsion Laboratory in Pasadena, California, will be collecting data by air and by boat to better understand why some parts of the delta are disappearing due to sea-level rise, while other parts are not. "Millions of people live on, and live from services provided by, coastal deltas like the Mississippi River Delta. But sea-level rise is causing many major deltas to lose land or disappear altogether, taking those services with them," Simard said. "We hope to be able to predict where and why some parts of the region will disappear and some are likely to survive." Deltas typically form where large rivers enter the ocean or other bodies of water. As a river flows downstream, it carries with it sediment—small particles of silt, gravel and clay. By the time the river meets the other body of water, it is moving more slowly, allowing the sediment to sink to the bottom and accumulate to form a landmass, or delta. A Gulfstream III aircraft equipped with NASA's UAVSAR instrument, one of several instruments to be deployed for Delta-X. Credit: NASA Deltas protect inland areas from wind and flooding during storms, they serve as a first line of defense against sea-level rise, and they are home to many species of plants and wildlife. The Mississippi River Delta, one of the world's largest, also helps to drive local and national economies via the shipping, fishing and tourism industries. But it is quickly losing land area: Over the last 80 years, coastal Louisiana has lost some 2,000 square miles (about 5,000 square kilometers) of wetlands—roughly an area the size of the state of Delaware. 
When deltas don't accumulate sediment fast enough to offset sea-level rise and ground sinking—a result of extracting subterranean water, petroleum and natural gas—they essentially drown. Enter Delta-X. Over the course of two field campaigns, one in April and one in the fall, the Delta-X science team will investigate how and why sediment accumulates in some areas and not in others. They'll also work to determine what areas are most susceptible to disappearing beneath rising seas. Specifically, they'll focus on two key locations: the Atchafalaya Basin and northwest of Terrebonne Bay, which have gained and lost land, respectively. During both campaigns, the science team will fly over the region simultaneously in three aircraft, each equipped with specialized remote-sensing instruments. They'll measure how much water flows through the river's channels and how much of it overflows to the wetlands. They'll also detect the amount of sediment in the water and how much of it gets deposited to build land. The team will fly four times for each campaign, collecting data at both high and low tides to better understand how the tides impact the exchange of water and sediment between river channels and wetlands. In addition, they will collect water samples and measurements by boat. After processing the data, which is expected to take about nine months, the science team will use it to provide detailed models of the delta region and how it works. "These models will empower local communities and resource managers with the information and prediction capabilities they need to make the necessary decisions to save and restore the delta," said Simard. But for now, with just over a month to go, Simard and the Delta-X team are hard at work poring over spreadsheets, tide tables and logistics. 
"Right now, we're determining exactly where and when the aircraft will fly, coordinating field teams to be deployed on boats and making sure everything is in order for a successful campaign," Simard said. Earth Venture investigations, including Delta-X, are part of NASA's Earth System Science Pathfinder program, managed at NASA's Langley Research Center in Hampton, Virginia, for the agency's Science Mission Directorate in Washington. Competitively selected orbital missions and field campaigns in this program provide innovative approaches to address Earth science research with frequent windows of opportunity to accommodate new scientific priorities. In addition to investigators from JPL, the Delta-X team includes co-investigators from Louisiana State University, Florida International University, University of North Carolina, Boston University, University of Texas-Austin, Woods Hole Oceanographic Institute and Caltech, which manages JPL for NASA. More information on Delta-X can be found here: https://deltax.jpl.nasa.gov/ News Media Contact Jane Lee Jet Propulsion Laboratory, Pasadena, Calif. 818-354-0307 Jane.j.lee@jpl.nasa.gov
  • NASA Flights Detect Millions of Arctic Methane Hotspots
    The Arctic is one of the fastest warming places on the planet. As temperatures rise, the perpetually frozen layer of soil, called permafrost, begins to thaw, releasing methane and other greenhouse gases into the atmosphere. These methane emissions can accelerate future warming—but to understand to what extent, we need to know how much methane may be emitted, when and what environmental factors may influence its release. That's a tricky feat. The Arctic spans thousands of miles, many of them inaccessible to humans. This inaccessibility has limited most ground-based observations to places with existing infrastructure—a mere fraction of the vast and varied Arctic terrain. Moreover, satellite observations are not detailed enough for scientists to identify key patterns and smaller-scale environmental influences on methane concentrations. In a new study, scientists with NASA's Arctic Boreal Vulnerability Experiment (ABoVE) found a way to bridge that gap. In 2017, they used planes equipped with the Airborne Visible Infrared Imaging Spectrometer - Next Generation (AVIRIS-NG), a highly specialized instrument, to fly over some 11,583 square miles (30,000 square kilometers) of the Arctic landscape in the hope of detecting methane hotspots. The instrument did not disappoint. "We consider hotspots to be areas showing an excess of 3,000 parts per million of methane between the airborne sensor and the ground," said lead author Clayton Elder of NASA's Jet Propulsion Laboratory in Pasadena, California. "And we detected 2 million of these hotspots over the land that we covered." The paper, titled "Airborne Mapping Reveals Emergent Power Law of Arctic Methane Emissions," was published Feb. 10 in Geophysical Research Letters. Within the dataset, the team also discovered a pattern: On average, the methane hotspots were mostly concentrated within about 44 yards (40 meters) of standing bodies of water, like lakes and streams. 
After the 44-yard mark, the presence of hotspots gradually became sparser, and at about 330 yards (300 meters) from the water source, they dropped off almost completely. The scientists working on this study don't have a complete answer as to why 44 yards is the "magic number" for the whole survey region yet, but additional studies they've conducted on the ground provide some insight. "After two years of ground field studies that began in 2018 at an Alaskan lake site with a methane hotspot, we found abrupt thawing of the permafrost right underneath the hotspot," said Elder. "It's that additional contribution of permafrost carbon - carbon that's been frozen for thousands of years—that's essentially contributing food for the microbes to chew up and turn into methane as the permafrost continues to thaw." Scientists are just scratching the surface of what is possible with the new data, but their first observations are valuable. Being able to identify the likely causes of the distribution of methane hotspots, for example, will help them to more accurately calculate this greenhouse gas's emissions across areas where we don't have observations. This new knowledge will improve how Arctic land models represent methane dynamics and therefore our ability to forecast the region's impact on global climate and global climate change impacts on the Arctic. Elder says the study is also a technological breakthrough. "AVIRIS-NG has been used in previous methane surveys, but those surveys focused on human-caused emissions in populated areas and areas with major infrastructure known to produce emissions," he said. "Our study marks the first time the instrument has been used to find hotspots where the locations of possible permafrost-related emissions are far less understood." More information on ABoVE can be found here: https://above.nasa.gov/ News Media Contact Jane Lee Jet Propulsion Laboratory, Pasadena, Calif. 818-354-0307
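The drop-off pattern described above, dense within about 40 meters of water, sparse beyond, and nearly absent past about 300 meters, can be sketched by binning hotspot-to-water distances. The distances below are synthetic, drawn from an assumed distribution, not the AVIRIS-NG dataset:

```python
# Illustrative sketch (synthetic data, NOT the AVIRIS-NG survey): binning
# methane hotspots by distance from the nearest water body to show the
# sharp drop-off pattern reported in the study.
import random

random.seed(1)
# Hypothetical hotspot distances (meters), drawn so most fall near water
distances = [random.expovariate(1 / 50.0) for _ in range(10_000)]

bins = [(0, 40), (40, 150), (150, 300), (300, 1000)]
counts = [sum(lo <= d < hi for d in distances) for lo, hi in bins]
print(counts)  # counts fall off sharply with distance from water
```

Fitting the tail of a distribution like this is what leads to the "emergent power law" named in the paper's title.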
  • Probing the Hazy Mysteries of Marine Clouds
    Sea salt, soot, sulfate — probably not the first words that come to mind when you think of clouds. But as these and many other microscopic aerosol particles rise through the atmosphere, they act as nuclei on which water vapor can condense to form cloud droplets. The HU-25 Falcon, pictured here, will fly under and through the clouds, where a suite of instruments will take samples directly from the surrounding air to measure everything from aerosol properties to droplet composition to gas concentrations. Instrument probes are visible here protruding from the top of the aircraft. Credit: NASA/David C. Bowman Because different kinds of aerosol particles affect the formation and evolution of clouds in ways that aren't entirely understood, and because more data on that process will help researchers refine climate and weather models, it's a phenomenon ripe for an intensive field study. A new NASA airborne science mission will pick up that gauntlet by taking researchers on coordinated flights above, through and below the clouds over the western North Atlantic Ocean. The Aerosol Cloud Meteorology Interactions Over the Western Atlantic Experiment (ACTIVATE) is scheduled to begin the first of six flight campaigns this week at NASA's Langley Research Center in Hampton, Virginia. Researchers load an instrument onto the HU-25 Falcon. Credit: NASA/David C. Bowman Using two aircraft, a King Air and an HU-25 Falcon, ACTIVATE scientists will amass nearly 1,200 hours of coordinated flight data over the course of those campaigns. ACTIVATE data will help climate and weather modelers better understand how aerosol particles and meteorological processes affect cloud properties. In addition, modelers will use these data to better characterize how the clouds themselves, in turn, affect aerosol particle properties and atmospheric lifetime as well as the meteorological environment. 
Two-month-long flight campaigns will be spread out over the next three years, across different seasons and covering a slew of atmospheric conditions. ACTIVATE will focus on marine boundary layer clouds — meaning clouds in the layer of the atmosphere closest to the ocean's surface. Those could range from thin stratiform clouds to thicker, deeper and more convective cumulus clouds. "One big advantage of the western North Atlantic Ocean is its meteorological set up," said Armin Sorooshian, ACTIVATE principal investigator and an atmospheric scientist at the University of Arizona. "That's one of the important reasons we picked this region. It's got a wide range of weather conditions, which leads to different cloud types." Equally important is that it's a region with a rich array of aerosol particles on which cloud droplets can form. Some of the possible aerosol sources include smoke from agricultural fires and wildfires in the U.S. and Canada; biogenic emissions from plants, trees and ocean-dwelling microorganisms; urban outflow from cities on the East Coast; and even dust blown over from the Sahara Desert. The coordinated flights will measure these aerosol particles and cloud processes from just about every imaginable angle. The Falcon will fly under and through the clouds, where a suite of instruments will take samples directly from the surrounding air to measure everything from aerosol properties to droplet composition to gas concentrations. Up above the clouds, researchers on the King Air will employ two remote sensors — an active sensor that fires a pulsed laser and measures a vertical profile of the atmosphere, and a passive sensor that measures light scattering from the sun at many different angles and wavelengths. These complementary measurements provide properties of cloud droplets and aerosol particles, such as the size and number concentration. 
In addition, researchers will use dropsondes to measure atmospheric conditions such as humidity, temperature, air pressure and winds. Dropsondes are small instruments that are ejected from a tube in the aircraft and take readings as they parachute through the atmosphere. "Although it is more challenging to coordinate and execute, using a two-aircraft approach offers a unique opportunity to sample both the horizontal and vertical changes in the aerosol-cloud system simultaneously," said John Hair, ACTIVATE project scientist at Langley. "In addition, with this approach, we expect the abundance of planned flights to sample a range of atmospheric conditions and aerosol types that will reveal how weather processes and aerosol particles impact the formation, size and lifetime of cloud droplets." Through this coordinated flight approach, laser-focused on one particular region of the planet that offers a rich variety of meteorological and aerosol conditions, ACTIVATE scientists plan to build an unprecedented catalog of data that will fill an important gap in the knowledge of climate and weather forecasters. "To my knowledge there's never been a campaign or a mission of this magnitude that has focused so much on just building statistics in one region," said Sorooshian. "The data we collect will be a treasure trove for modelers, who need this kind of information to better evaluate the effects of differing atmospheric conditions on model outcomes." The ACTIVATE science team includes researchers from NASA, the National Institute of Aerospace, universities, Brookhaven National Laboratory, Pacific Northwest National Laboratory, the National Center for Atmospheric Research and the German Aerospace Center. The current flight campaign is the first of two in 2020, with two more to follow in 2021, and another two in 2022. ACTIVATE is one of five NASA Earth Venture campaigns taking to the field in 2020. 
To learn more visit https://www.nasa.gov/feature/goddard/2019/nasa-embarks-on-us-cross-country-expeditions.
  • Climate Change Could Trigger More Landslides in High Mountain Asia
    More frequent and intense rainfall events due to climate change could cause more landslides in the High Mountain Asia region of China, Tibet and Nepal, according to the first quantitative study of the link between precipitation and landslides in the region. The model shows landslide risk for High Mountain Asia increasing in the summer months in the years 2061-2100, thanks to increasingly frequent and intense rainfall events. Summer monsoon rains can destabilize steep mountainsides, triggering landslides. Credit: NASA's Earth Observatory/Joshua Stevens High Mountain Asia stores more fresh water in its snow and glaciers than any place on Earth outside the poles, and more than a billion people rely on it for drinking and irrigation. The study team used satellite estimates and modeled precipitation data to project how changing rainfall patterns in the region might affect landslide frequency. The study team found that warming temperatures will cause more intense rainfall in some areas, and this could lead to increased landslide activity in the border region of China and Nepal. More landslides in this region, especially in areas currently covered by glaciers and glacial lakes, could cause cascading disasters like landslide dams and floods that affect areas downstream, sometimes hundreds of miles away, according to the study. The study was a collaboration between scientists from NASA’s Goddard Space Flight Center in Greenbelt, Maryland; the National Oceanic and Atmospheric Administration (NOAA) in Washington; and Stanford University in Palo Alto, California. High Mountain Asia stretches across tens of thousands of rugged, glacier-covered miles, from the Himalayas in the east to the Hindu Kush and Tian Shan mountain ranges in the west. As Earth’s climate warms, High Mountain Asia’s water cycle is changing, including shifts in its annual monsoon patterns and rainfall. 
Heavy rain, like the kind that falls during the monsoon season in June through September, can trigger landslides on the steep terrain, creating disasters that range from destroying towns to cutting off drinking water and transportation networks. In summer 2019, monsoon flooding and landslides in Nepal, India and Bangladesh displaced more than 7 million people. In order to predict how climate change might affect landslides, researchers need to know what future rainfall events might look like. But until now, the research making the landslide predictions has relied on records of past landslides or general precipitation estimate models. “Other studies have either addressed this relationship very locally, or by adjusting the precipitation signal in a general way,” said Dalia Kirschbaum, a research scientist at NASA’s Goddard Space Flight Center. “Our goal was to demonstrate how we could combine global model estimates of future precipitation with our landslide model to provide quantitative estimates of potential landslide changes in this region.” The study team used a NASA model that generates a “nowcast” estimating potential landslide activity triggered by rainfall in near real-time. The model, called Landslide Hazard Assessment for Situational Awareness (LHASA), assesses the hazard by evaluating information about roadways, the presence or absence of nearby tectonic faults, the types of bedrock, change in tree cover and the steepness of slopes. Then, it integrates current precipitation data from the Global Precipitation Measurement mission. If the amount of precipitation in the preceding seven days is abnormally high for that area, then the potential occurrence of landslides increases. NASA’s Global Landslide Catalog contains more than 1,000 records of landslides in High Mountain Asia between 2007 and 2017. Some of these events caused hundreds or thousands of fatalities. 
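The LHASA-style logic described above, a static susceptibility map combined with a check on whether recent rainfall is abnormally high, can be sketched roughly as follows. This is a hypothetical simplification for illustration only: the function name, score, weights and thresholds are invented and do not reflect LHASA's actual implementation.

```python
# Simplified, hypothetical sketch of a rainfall-triggered landslide "nowcast":
# combine a precomputed static susceptibility score with a 7-day rainfall
# anomaly check, as the model described above does conceptually.

def landslide_nowcast(susceptibility: float,
                      past_7day_rain_mm: float,
                      rain_95th_percentile_mm: float) -> str:
    """Return 'high', 'moderate', or 'low' landslide hazard for one grid cell.

    susceptibility: 0-1 static score derived from slope steepness, nearby
                    faults, bedrock type, tree-cover change and roadways.
    past_7day_rain_mm: accumulated rainfall over the preceding 7 days
                       (e.g. from the Global Precipitation Measurement mission).
    rain_95th_percentile_mm: local climatological 95th percentile for
                             7-day rainfall accumulations.
    """
    # Rainfall is "abnormally high" when it exceeds the local climatology.
    rain_is_extreme = past_7day_rain_mm > rain_95th_percentile_mm
    if rain_is_extreme and susceptibility > 0.7:
        return "high"
    if rain_is_extreme and susceptibility > 0.4:
        return "moderate"
    return "low"

print(landslide_nowcast(0.8, 310.0, 250.0))  # steep, saturated cell -> high
print(landslide_nowcast(0.3, 120.0, 250.0))  # gentle, dry cell -> low
```

The key design point mirrored from the text is that rainfall alone does not raise the hazard; it must coincide with terrain that is already susceptible.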
Credit: NASA's Earth Observatory/Joshua Stevens The study team first ran LHASA with NASA precipitation data from 2000-2019 and NOAA climate model data from 1982-2017. They compared the results from both data sets to NASA’s Global Landslide Catalog, which documents landslides reported in the media and other sources. Both data sets compared favorably with the catalog, giving the team confidence that using the modeled precipitation data would yield accurate forecasts. Finally, the study team used NOAA’s model data to take LHASA into the future, assessing precipitation and landslide trends in the future (2061-2100) versus the past (1961-2000). They found that extreme precipitation events are likely to become more common in the future as the climate warms, and in some areas, this may lead to a higher frequency of landslide activity. Most significantly, the border region of China and Nepal could see a 30 to 70 percent increase in landslide activity. The border region is not currently heavily populated, Kirschbaum said, but is partially covered by glaciers and glacial lakes. The combined impacts of more frequent intense rainfall and a warming environment could affect the delicate structure of these lakes, releasing flash floods and causing downstream flooding, infrastructure damage, and loss of water resources. The full human impact of increasing landslide risk will depend on how climate change affects glaciers and how populations and communities change. When they evaluated their model projections in the context of five potential population scenarios, the team found that most residents in the area will be exposed to more landslides in the future regardless of the scenario, but only a small proportion will be exposed to landslide activity increases greater than 20 percent. The study demonstrates new possibilities for research that could help decision-makers prepare for future disasters, both in High Mountain Asia and in other areas, said Kirschbaum. 
“Our hope is to expand our research to other areas of the world with similar risks of landslides, including Alaska and Appalachia in the United States,” said Sarah Kapnick, physical scientist at NOAA’s Geophysical Fluid Dynamics Laboratory and co-author on the study. “We’ve developed a method, figured out how to work together on a specific region, and now we’d like to look at the U.S. to understand what the hazards are now and in the future.”
  • Arctic Ice Melt Is Changing Ocean Currents
    A major ocean current in the Arctic is faster and more turbulent as a result of rapid sea ice melt, a new study from NASA shows. The current is part of a delicate Arctic environment that is now flooded with fresh water, an effect of human-caused climate change. Using 12 years of satellite data, scientists have measured how this circular current, called the Beaufort Gyre, has precariously balanced an influx of unprecedented amounts of cold, fresh water — a change that could alter the currents in the Atlantic Ocean and cool the climate of Western Europe. The Beaufort Gyre keeps the polar environment in equilibrium by storing fresh water near the surface of the ocean. Wind blows the gyre in a clockwise direction around the western Arctic Ocean, north of Canada and Alaska, where it naturally collects fresh water from glacial melt, river runoff and precipitation. This fresh water is important in the Arctic in part because it floats above the warmer, salty water and helps to protect the sea ice from melting, which in turn helps regulate Earth's climate. The gyre then slowly releases this fresh water into the Atlantic Ocean over a period of decades, allowing the Atlantic Ocean currents to carry it away in small amounts. But since the 1990s, the gyre has accumulated a large amount of fresh water — 1,920 cubic miles (8,000 cubic kilometers) — or almost twice the volume of Lake Michigan. The new study, published in Nature Communications, found that the cause of this gain in freshwater concentration is the loss of sea ice in summer and autumn. This decades-long decline of the Arctic's summertime sea ice cover has left the Beaufort Gyre more exposed to the wind, which spins the gyre faster and traps the fresh water in its current. Persistent westerly winds have also dragged the current in one direction for over 20 years, increasing the speed and size of the clockwise current and preventing the fresh water from leaving the Arctic Ocean. 
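As a quick arithmetic check on the freshwater volume quoted above, the cubic-mile to cubic-kilometer conversion can be verified directly (a minimal sketch; the function name is ours, and the conversion uses the standard 1 mile = 1.609344 km):

```python
# Verify that 1,920 cubic miles is approximately 8,000 cubic kilometers.
MILE_IN_KM = 1.609344  # exact definition of the international mile

def cubic_miles_to_km3(volume_mi3: float) -> float:
    """Convert a volume from cubic miles to cubic kilometers."""
    return volume_mi3 * MILE_IN_KM ** 3

print(round(cubic_miles_to_km3(1920)))  # -> 8003, matching the ~8,000 km³ figure
```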
This decades-long western wind is unusual for the region, where previously, the winds changed direction every five to seven years. Scientists have been keeping an eye on the Beaufort Gyre in case the wind changes direction again. If the direction were to change, the wind would reverse the current, pulling it counterclockwise and releasing the water it has accumulated all at once. "If the Beaufort Gyre were to release the excess fresh water into the Atlantic Ocean, it could potentially slow down its circulation. And that would have hemisphere-wide implications for the climate, especially in Western Europe," said Tom Armitage, lead author of the study and polar scientist at NASA's Jet Propulsion Laboratory in Pasadena, California. Fresh water released from the Arctic Ocean to the North Atlantic can change the density of surface waters. Normally, water from the Arctic loses heat and moisture to the atmosphere and sinks to the bottom of the ocean, where it drives water from the north Atlantic Ocean down to the tropics like a conveyor belt. This important current is called the Atlantic Meridional Overturning Circulation and helps regulate the planet's climate by carrying heat from the tropically-warmed water to northern latitudes like Europe and North America. If slowed enough, it could negatively impact marine life and the communities that depend on it. "We don't expect a shutting down of the Gulf Stream, but we do expect impacts. That's why we're monitoring the Beaufort Gyre so closely," said Alek Petty, a co-author on the paper and polar scientist at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The study also found that, although the Beaufort Gyre is out of balance because of the added energy from the wind, the current expels that excess energy by forming small, circular eddies of water. 
While the increased turbulence has helped keep the system balanced, it has the potential to lead to further ice melt because it mixes layers of cold, fresh water with relatively warm, salt water below. The melting ice could, in turn, lead to changes in how nutrients and organic material in the ocean are mixed, significantly affecting the food chain and wildlife in the Arctic. The results reveal a delicate balance between wind and ocean as the sea ice pack recedes under climate change. "What this study is showing is that the loss of sea ice has really important impacts on our climate system that we're only just discovering," said Petty. News Media Contacts Rexana Vizza / Matthew Segal Jet Propulsion Laboratory, Pasadena, Calif. 818-393-1931 / 818-354-8307 rexana.v.vizza@jpl.nasa.gov / matthew.j.segal@jpl.nasa.gov
  • NASA, Partners Name Ocean-Observing Satellite for Noted Earth Scientist
    NASA and several partners announced Tuesday that they have renamed a key ocean-observing satellite launching this fall in honor of Earth scientist Michael Freilich, who retired last year as head of NASA's Earth Science division, a position he held since 2006. NASA—along with ESA (European Space Agency), the European Commission (EC), the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT), and the National Oceanic and Atmospheric Administration (NOAA)—made the announcement during a special event at the agency's headquarters. "This honor demonstrates the global reach of Mike's legacy," said NASA Administrator Jim Bridenstine. "We are grateful for ESA and the European partners' generosity in recognizing Mike's lifelong dedication to understanding our planet and improving life for everyone on it. Mike's contributions to NASA—and to Earth science worldwide—have been invaluable, and we are thrilled that this satellite bearing his name will uncover new knowledge about the oceans for which he has such an abiding passion." The Sentinel-6A/Jason-CS satellite, scheduled to launch this fall from Vandenberg Air Force Base in California, will now be known as Sentinel-6 Michael Freilich. The mission aims to continue high-precision ocean altimetry measurements in the 2020-2030 timeframe using two identical satellites launching five years apart—Sentinel-6A Michael Freilich and Sentinel-6B. NASA and its partners are developing the mission with support from the Centre National d'Etudes Spatiales (CNES), France's space agency. Project management is being provided by NASA's Jet Propulsion Laboratory in Pasadena, California. ESA is developing the new Sentinel family of missions specifically to support the operational needs of the European Union's Copernicus program, the EU's Earth observation program managed by the European Commission. 
They will replace older satellites nearing the end of their operational lifespan to ensure there are no gaps in ongoing land, atmosphere and ocean monitoring, as well as introduce new monitoring capabilities. A key ocean-observing satellite launching this fall has been named after Earth scientist Michael Freilich, as announced Jan. 28 by NASA, ESA (European Space Agency), the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT), and the National Oceanic and Atmospheric Administration (NOAA). Credit: NASA "Together, with other missions of the European Union's Earth Observation Programme Copernicus, Sentinel-6 Michael Freilich will contribute to improved knowledge and understanding of the role of the ocean in climate change and for mitigation and adaptation policies in coastal areas," said Mercedes Garcia Perez, head of the Global Issues and Innovation of the European Union Delegation to the United States. "It will have a large societal impact worldwide as it supports applications in the area of operational oceanography, including ship routing, support for off-shore and other marine industries, fisheries, and responses to environmental hazards. This new satellite within the Copernicus constellation will be an additional tool for implementing the European Green Deal to transition the EU to a carbon-neutral economy." A secondary objective of the mission is to collect high-resolution vertical profiles of temperature, using the Global Navigation Satellite Sounding Radio-Occultation sounding technique, which derives atmospheric information from analyses of signals from international Global Positioning System satellites. Sentinel-6 measurements of temperature changes in the troposphere and stratosphere will be used by weather agencies worldwide to improve the accuracy of global forecasts produced by their complex, state-of-the-art computer models. 
The Sentinel-6 Michael Freilich satellite also will continue the existing 28-year data set of sea level changes measured from space. Before his retirement, Freilich was instrumental in advancing the collaborative mission to a critical stage of development and helping to strengthen its essential international partnerships. "This mission demonstrates what the United States and Europe can achieve as equal partners in such a large space project. Our suggestion to rename the mission to 'Sentinel-6 Michael Freilich' is an expression of how thankful we are to Mike. Without him, this mission as it is today would not have been possible," said Josef Aschbacher, ESA director of Earth Observation Programmes. Freilich's career as an oceanographer spanned nearly four decades and integrated research on Earth's ocean, leading satellite mission development, and helping to train and inspire the next generation of scientific leaders. His training was in ocean physics, but his vision encompasses the full spectrum of Earth's dynamics. "Earth Science shows perhaps more than any other discipline how important partnership is to the future of this planet," said Thomas Zurbuchen, NASA associate administrator for Science. "Mike exemplifies the commitment to excellence, generosity of spirit and unmatched ability to inspire trust that made so many people across the world want to advance big goals on behalf of our planet and all its people by working with NASA. The fact that ESA and the European partners have given him this unprecedented honor demonstrates that respect and admiration." During Freilich's NASA tenure, the agency increased the pace of Earth science mission launches and in 2014 alone sent five missions to space to study our home planet. The missions balanced many objectives from research to applications and technology development activities. 
Freilich also led NASA's response to the National Academy of Sciences' first-ever Earth Science and Applications from Space decadal survey in 2007, which expanded NASA's innovative Earth-observing programs and continues to guide the agency's global Earth observation efforts. "My NOAA colleagues and I enthusiastically support renaming Sentinel-6A after Mike," said Stephen Volz, assistant administrator for NOAA's Satellite and Information Service. "This is a fitting honor for a man who helped transform space-based Earth observation and has brought together the best contributions from our global Earth science community to improve our collective understanding of how our planet is changing." NOAA uses data from missions such as Sentinel-6 in a variety of ways, from monitoring the rate of global sea-level rise to producing more accurate weather forecasts. Michael Freilich, who served as director of NASA’s Earth Science division from 2006-2019. Credit: NASA Freilich also established the sustained Venture Class program of low-cost space and airborne science missions that is now a central feature of the NASA Earth Science Division's portfolio. He pioneered the broad use of the International Space Station as a platform for Earth-observing instruments, a unique observing platform for the Earth system. Unlike many of the traditional Earth observation platforms, the space station orbits the Earth in an inclined equatorial orbit that is not Sun-synchronous. This means that the space station passes over locations between 52 degrees north and 52 degrees south latitude at different times of day and night, and under varying illumination conditions. This is particularly important for collecting imagery of unexpected natural hazard and disaster events such as volcanic eruptions, earthquakes, flooding and tsunamis, as well as for cross-calibrating other satellites in Sun-synchronous polar orbits. 
Freilich also inaugurated a NASA activity to use data products from private sector, small-satellite constellations and commercial partners to supplement traditional government data sources. Under Freilich's leadership, NASA looked at new ways to carry out its critical mission and established cutting-edge programs to use small satellites and payloads hosted on commercial satellites to advance Earth science research and to demonstrate new technologies. All told, during Freilich's time at NASA Headquarters, he oversaw 16 successful major mission and instrument launches and eight CubeSat/small-satellite launches. The agency's Earth Science Division has 14 Earth-observing missions in development for launch by 2023, which includes eight major hosted instruments on other nations' satellites. NASA uses the unique vantage point of space and suborbital platforms to better understand Earth as an interconnected system for societal benefit. The agency also develops new technologies and approaches to observe and study Earth with long-term data records, research, modeling, and computer analysis tools to quantify how our planet is changing. NASA shares this knowledge with the global community, including managers and policymakers domestically and internationally to understand and protect our home planet. https://www.nasa.gov/earth News Media Contacts Grey Hautaluoma / Steve Cole Headquarters, Washington 202-358-0668; 202-358-0918 grey.hautaluoma-1@nasa.gov / stephen.e.cole@nasa.gov John Leslie NOAA's Office of Communications for Satellites 301-713-0214 john.leslie@noaa.gov Matthew Segal Jet Propulsion Laboratory, Pasadena, Calif. 818-354-8307 matthew.j.segal@jpl.nasa.gov
  • NASA, NOAA Analyses Reveal 2019 Second Warmest Year on Record
    According to independent analyses by NASA and the National Oceanic and Atmospheric Administration (NOAA), Earth's average global surface temperature in 2019 was the second warmest since modern record-keeping began in 1880. Globally, 2019's average temperature was second only to that of 2016 and continued the planet's long-term warming trend: the past five years have been the warmest of the last 140 years. This past year was 1.8 degrees Fahrenheit (0.98 degrees Celsius) warmer than the 1951 to 1980 mean, according to scientists at NASA’s Goddard Institute for Space Studies (GISS) in New York. Credit: NASA’s Scientific Visualization Studio/Kathryn Mersmann “The decade that just ended is clearly the warmest decade on record,” said GISS Director Gavin Schmidt. “Every decade since the 1960s clearly has been warmer than the one before.” The average global surface temperature has risen since the 1880s and is now more than 2 degrees Fahrenheit (a bit more than 1 degree Celsius) above that of the late 19th century. For reference, the last Ice Age was about 10 degrees Fahrenheit colder than pre-industrial temperatures. Using climate models and statistical analysis of global temperature data, scientists have concluded that this increase has been driven mostly by increased emissions into the atmosphere of carbon dioxide and other greenhouse gases produced by human activities. This plot shows yearly temperature anomalies from 1880 to 2019, with respect to the 1951-1980 mean, as recorded by NASA, NOAA, the Berkeley Earth research group, the Met Office Hadley Centre (UK), and the Cowtan and Way analysis. Though there are minor variations from year to year, all five temperature records show peaks and valleys in sync with each other. All show rapid warming in the past few decades, and all show the past decade has been the warmest. Credit: NASA GISS/Gavin Schmidt “We crossed over into more than 2 degrees Fahrenheit warming territory in 2015 and we are unlikely to go back. 
This shows that what’s happening is persistent, not a fluke due to some weather phenomenon: we know that the long-term trends are being driven by the increasing levels of greenhouse gases in the atmosphere,” Schmidt said. Because weather station locations and measurement practices change over time, the interpretation of specific year-to-year global mean temperature differences has some uncertainties. Taking this into account, NASA estimates that 2019’s global mean change is accurate to within 0.1 degrees Fahrenheit, with a 95 percent certainty level. Weather dynamics often affect regional temperatures, so not every region on Earth experienced similar amounts of warming. NOAA found the 2019 annual mean temperature for the contiguous 48 United States was the 34th warmest on record, giving it a “warmer than average” classification. The Arctic region has warmed slightly more than three times faster than the rest of the world since 1970. Rising temperatures in the atmosphere and ocean are contributing to the continued mass loss from Greenland and Antarctica and to increases in some extreme events, such as heat waves, wildfires and intense precipitation. NASA’s temperature analyses incorporate surface temperature measurements from more than 20,000 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations. These in-situ measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heat island effects that could skew the conclusions. These calculations produce the global average temperature deviations from the baseline period of 1951 to 1980. NOAA scientists used much of the same raw temperature data, but with a different interpolation into the Earth’s poles and other data-poor regions. NOAA’s analysis found 2019's average global temperature was 1.7 degrees Fahrenheit (0.95 degrees Celsius) above the 20th century average. 
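Because the NASA and NOAA figures above are anomalies (differences from a baseline mean, not absolute temperatures), converting them between Celsius and Fahrenheit uses only the 9/5 scale factor, with no 32-degree offset. A minimal check of the numbers reported in this article:

```python
def anomaly_c_to_f(delta_c: float) -> float:
    """Convert a temperature *difference* from Celsius to Fahrenheit.

    Anomalies are intervals relative to a baseline (here, the 1951-1980
    mean), so only the 9/5 scale factor applies; the +32 offset is for
    absolute readings only.
    """
    return delta_c * 9.0 / 5.0

print(round(anomaly_c_to_f(0.98), 2))  # NASA: 0.98 °C -> 1.76, reported as 1.8 °F
print(round(anomaly_c_to_f(0.95), 2))  # NOAA: 0.95 °C -> 1.71, reported as 1.7 °F
```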
NASA’s full 2019 surface temperature dataset and the complete methodology used for the temperature calculation and its uncertainties are available at: https://data.giss.nasa.gov/gistemp GISS is a laboratory within the Earth Sciences Division of NASA’s Goddard Space Flight Center in Greenbelt, Maryland. The laboratory is affiliated with Columbia University’s Earth Institute and School of Engineering and Applied Science in New York. NASA uses the unique vantage point of space to better understand Earth as an interconnected system. The agency also uses airborne and ground-based measurements, and develops new ways to observe and study Earth with long-term data records and computer analysis tools to better see how our planet is changing. NASA shares this knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet. For more information about NASA’s Earth science activities, visit: https://www.nasa.gov/earth The slides for the Jan. 15 news conference are available at: https://www.ncdc.noaa.gov/sotc/briefings/20200115.pdf NOAA’s Global Report is available at: https://www.ncdc.noaa.gov/sotc/global/201913 News Media Contacts Steve Cole Headquarters, Washington 202-358-0918 stephen.e.cole@nasa.gov Peter Jacobs Goddard Space Flight Center, Greenbelt, Md. 301-286-0535 peter.jacobs@nasa.gov
  • NASA, NOAA to Announce 2019 Global Temperatures, Climate Conditions
    Climate experts from NASA and the National Oceanic and Atmospheric Administration (NOAA) will release their annual assessment of global temperatures and discuss the major climate trends of 2019 during a media teleconference at 12:15 p.m. EST Wednesday, Jan. 15. The briefing will take place at the 100th American Meteorological Society Annual Meeting in Boston. The teleconference participants are: Gavin Schmidt, director of NASA's Goddard Institute for Space Studies in New York Deke Arndt, chief of the global monitoring branch of NOAA’s National Centers for Environmental Information in Asheville, North Carolina Media can participate in the teleconference by calling 800-369-2090 (toll-free in the United States and Canada) or 1-203-827-7030 (international) and use the passcode CLIMATE. Audio of the briefing with supporting graphics will stream live at: http://www.nasa.gov/live The supporting graphics will also be available at: http://www.ncdc.noaa.gov/sotc/briefings NASA and NOAA are two keepers of the world's temperature data and independently produce a record of Earth's surface temperatures and changes based on historical observations over oceans and land. For more information about NASA's Earth science programs, visit: https://www.nasa.gov/earth News Media Contacts Steve Cole Headquarters, Washington 202-358-0918 stephen.e.cole@nasa.gov Peter Jacobs Goddard Space Flight Center, Greenbelt, Md. 301-286-0535 peter.jacobs@nasa.gov John Bateman NOAA Satellite and Information Service, Silver Spring, Md. 202-424-0929 nesdis.pa@noaa.gov
  • Study Confirms Climate Models are Getting Future Warming Projections Right
    An animation of a GISS (Goddard Institute for Space Studies) climate model simulation made for the United Nations' Intergovernmental Panel on Climate Change Fourth Assessment Report, showing five-year averaged surface air temperature anomalies in degrees Celsius from 1880 to 2100. The temperature anomaly is a measure of how much warmer or colder it is at a particular place and time than the long-term mean temperature, defined as the average temperature over the 30-year base period from 1951 to 1980. Blue areas represent cool areas and yellow and red areas represent warmer areas. The number in the upper right corner represents the global mean anomaly. Credit: NASA’s Goddard Institute for Space Studies There’s an old saying that “the proof is in the pudding,” meaning that you can only truly gauge the quality of something once it’s been put to a test. Such is the case with climate models: mathematical computer simulations of the various factors that interact to affect Earth’s climate, such as our atmosphere, ocean, ice, land surface and the Sun. For decades, people have legitimately wondered how well climate models perform in predicting future climate conditions. Based on solid physics and the best understanding of the Earth system available, they skillfully reproduce observed data. Nevertheless, they have a wide response to increasing carbon dioxide levels, and many uncertainties remain in the details. The hallmark of good science, however, is the ability to make testable predictions, and climate models have been making predictions since the 1970s. How reliable have they been? Now a new evaluation of global climate models used to project Earth’s future global average surface temperatures over the past half-century answers that question: most of the models have been quite accurate. Models that were used in the IPCC 4th Assessment Report can be evaluated by comparing their approximately 20-year predictions with what actually happened. 
In this figure, the multi-model ensemble and the average of all the models are plotted alongside the NASA Goddard Institute for Space Studies (GISS) Surface Temperature Index (GISTEMP). Climate drivers were known for the ‘hindcast’ period (before 2000) and forecast for the period beyond. The temperatures are plotted with respect to a 1980-1999 baseline. Credit: Gavin Schmidt In a study accepted for publication in the journal Geophysical Research Letters, a research team led by Zeke Hausfather of the University of California, Berkeley, conducted a systematic evaluation of the performance of past climate models. The team compared 17 increasingly sophisticated model projections of global average temperature developed between 1970 and 2007, including some originally developed by NASA, with actual changes in global temperature observed through the end of 2017. The observational temperature data came from multiple sources, including NASA’s Goddard Institute for Space Studies Surface Temperature Analysis (GISTEMP) time series, an estimate of global surface temperature change. The results: 10 of the model projections closely matched observations. Moreover, after accounting for differences between modeled and actual changes in atmospheric carbon dioxide and other factors that drive climate, the number increased to 14. The authors found no evidence that the climate models evaluated either systematically overestimated or underestimated warming over the period of their projections. “The results of this study of past climate models bolster scientists’ confidence that both they as well as today’s more advanced climate models are skillfully projecting global warming,” said study co-author Gavin Schmidt, director of NASA’s Goddard Institute of Space Studies in New York. 
“This research could help resolve public confusion around the performance of past climate modeling efforts.” Scientists use climate models to better understand how Earth’s climate changed in the past, how it is changing now and to predict future climate trends. Global temperature trends are among the most significant predictions, since global warming has widespread effects, is tied directly to international target agreements for mitigating future climate warming, and has the longest, most accurate observational record. Other climate variables are forecast in the newer, more complex models, and those predictions too will need to be assessed. To successfully match new observational data, climate model projections have to encapsulate the physics of the climate and also make accurate predictions about future carbon dioxide emission levels and other factors that affect climate, such as solar variability, volcanoes, other human-produced and natural emissions of greenhouse gases and aerosols. This study’s accounting for differences between the projected and actual emissions and other factors allowed a more focused evaluation of the models’ representation of Earth’s climate system. Schmidt says climate models have come a long way from the simple energy balance and general circulation models of the 1960s and early ‘70s to today’s increasingly high-resolution and comprehensive general circulation models. “The fact that many of the older climate models we reviewed accurately projected subsequent global temperatures is particularly impressive given the limited observational evidence of warming that scientists had in the 1970s, when Earth had been cooling for a few decades,” he said. 
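At its simplest, the kind of retrospective evaluation described above amounts to comparing a model's projected warming trend against the observed trend over the same years. The sketch below uses invented data and a plain least-squares slope; the actual study went further, adjusting for differences between assumed and observed climate forcings before comparing.

```python
# Minimal sketch of a projection-vs-observation trend comparison.
# All data below are invented for illustration.

def linear_trend(years, temps):
    """Least-squares slope of temperature vs. year, in degrees per year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(temps) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, temps))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

years = list(range(2000, 2018))
# Hypothetical model projection: steady 0.020 degrees C per year warming.
projected = [0.020 * (y - 2000) for y in years]
# Hypothetical observations: 0.018 degrees C per year plus small
# year-to-year variability.
observed = [0.018 * (y - 2000) + 0.01 * ((-1) ** y) for y in years]

print(round(linear_trend(years, projected), 3))  # -> 0.02
print(round(linear_trend(years, observed), 3))   # -> 0.018
```

A close match between the two slopes, after accounting for forcing differences, is the sense in which the study judged past projections "quite accurate."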
The authors say that while the relative simplicity of the models analyzed makes their climate projections functionally obsolete, they can still be useful for verifying methods used to evaluate current state-of-the-art climate models, such as those to be used in the United Nations’ Intergovernmental Panel on Climate Change Sixth Assessment Report, to be released in 2022.

“As climate model projections have matured, more signals have emerged from the noise of natural variability that allow for retrospective evaluation of other aspects of climate models — for instance, in Arctic sea ice and ocean heat content,” Schmidt said. “But it’s the temperature trends that people still tend to focus on.”

Other participating institutions included the Massachusetts Institute of Technology in Cambridge and Woods Hole Oceanographic Institution in Woods Hole, Massachusetts. For more information on GISS and GISTEMP, visit: https://www.giss.nasa.gov/ and https://data.giss.nasa.gov/gistemp/
  • Clouds, Arctic Crocodiles and a New Climate Model
    Key Points
    ◆ Understanding how greenhouse gases will affect clouds is crucial to forecasting climate change. But current computer climate models can’t handle the high resolution needed to simulate cloud dynamics worldwide.
    ◆ A recent study suggests that if greenhouse gases raise the atmosphere’s temperature enough, stratocumulus clouds could disappear, causing a large spike in global temperature.
    ◆ JPL is partnering with Caltech, MIT and the Naval Postgraduate School to build a new climate model that can take full advantage of Earth-observing satellites, with thousands of sub-models capable of simulating the dynamics of clouds and other small-scale phenomena.

Crocodile bones, 50 million years old, have shown up on the Arctic island of Ellesmere, and that’s a problem. Scientists have been unable to explain how the Arctic could have warmed up enough to host those tropical creatures. Tapio Schneider, a senior research scientist at NASA’s Jet Propulsion Laboratory and professor at Caltech, thinks the answer may lie in the clouds.

Given that they are so insubstantial—all the moisture that constitutes clouds planetwide at any given moment would amount to no more than a micro-thin film of water if spread out over Earth’s surface—clouds have an outsized impact on our climate. They cool Earth's surface by reflecting much of the Sun’s energy back into space. They warm like blankets by preventing the planet from radiating away all of the heat it absorbs each day. And of course, they are key components of the water cycle, storing water that has evaporated from the oceans and other bodies and then returning it as rain and snow.
This crocodile has probably never been north of Africa, but its cousins lived in the Arctic 50 million years ago. Could clouds explain how that icy region got warm enough? Credit: Leigh Bedford/Wikimedia Commons

It stands to reason that any computer model that hopes to explain past climates or forecast how ours will change would have to take clouds into account. And that’s primarily where climate models fall short.

Climate models are digital simulations of Earth’s climate that run on supercomputers. Each model works by dividing the atmosphere into a three-dimensional grid and calculating what goes on within each box of that grid. You input things like temperature, humidity and pressure, and the model uses physics to compute how weather will develop within each box, what effects those developments will have on neighboring boxes, what effects those neighboring boxes will have on their neighbors, and on and on all over the planet and into the future.

The problem that climate models have with clouds is that cloud behavior operates on a much finer scale than the grid boxes that the models employ. You would need to divide the atmosphere into boxes a thousand times smaller to capture those physics. And you would need a supercomputer vastly more powerful than today’s best in order to run the calculations for the entire planet in a useful amount of time.

When the Cooling Clouds Disappear

But today’s supercomputers can handle that kind of resolution for a much more limited region, and the cloud dynamics observed there can help us understand what happens over much larger regions. So Schneider led a study at Caltech, using a little more than 12 cubic miles (52 cubic kilometers) of simulated atmosphere, divided into 2 million grid boxes that were small enough to capture cloud dynamics.
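The grid scheme described above can be illustrated with a deliberately tiny toy, not a real climate model: a 2-D grid of "boxes" in which each step nudges every interior box toward the average of its four neighbors, mimicking how conditions in one box influence adjacent boxes.

```python
# Toy grid-box simulation (illustrative only): each interior box relaxes
# toward the mean of its four neighbors, a crude stand-in for how real
# models propagate heat and moisture between adjacent grid boxes.

def step(grid, rate=0.1):
    """One update over all interior boxes; edges are held fixed."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            neighbor_mean = (grid[i - 1][j] + grid[i + 1][j] +
                             grid[i][j - 1] + grid[i][j + 1]) / 4.0
            new[i][j] = grid[i][j] + rate * (neighbor_mean - grid[i][j])
    return new

# A 5x5 grid of "temperatures" with one hot box in the middle.
grid = [[0.0] * 5 for _ in range(5)]
grid[2][2] = 10.0
for _ in range(50):
    grid = step(grid)
# After many steps, the hot spot has cooled and warmed its neighbors.
```

The real problem, as the article notes, is scale: capturing clouds would need boxes roughly a thousand times smaller, and the cost of this box-by-box update grows with the number of boxes, which is why global cloud-resolving runs are out of reach for current supercomputers.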
The goal was to see how increasing the concentration of greenhouse gases in the atmosphere is likely to affect the development of stratocumulus clouds over tropical oceans, and how global temperature could change as a result. Low-lying stratocumulus clouds are the most common type on Earth, and they’re the heavyweight champs among clouds at cooling the planet. If the buildup of greenhouse gases were to increase even slightly how much of Earth’s surface these clouds cover, global warming would slow down significantly. But if cloud coverage were to decrease, the world would heat up faster.

In Schneider’s high-resolution simulation, global temperature rose and stratocumulus clouds thinned at a fairly steady pace as CO2 concentration increased—until the concentration reached roughly triple today’s level.

Climate models divide the atmosphere into a 3-D grid. A new model will nest high-resolution sub-models, capable of realistically simulating cloud behavior, within the larger model. Credit: Schneider et al., Nature Climate Change

At that point, the clouds broke up. And without their cooling effect, global temperature jumped by 8°C (about 14°F). If this is what happened 50 million years ago, Schneider said, the surge of heat from the loss of the clouds, on top of the warming from the increase in greenhouse gases, could have raised temperatures enough to make the Arctic suitable for crocodiles. If we continue releasing greenhouse gases at our current rate, he added, we could reach that point again somewhere around the beginning of the next century.

Further, returning to our current “normal” might not be easy. In Schneider’s model, once the stratocumulus clouds were lost, greenhouse-gas concentration had to fall to about half of today’s level before the clouds returned to their present state.

America’s Next Supermodel

This small-scale simulation might or might not show what really could happen all over the world.
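The hysteresis in that result can be captured in a small state-machine sketch. The thresholds below (clouds lost near 3x today's CO2, recovered near 0.5x) are rounded from the figures quoted in the article, and the code is a toy illustration of the on/off behavior, not Schneider's simulation:

```python
# Toy sketch of the cloud-loss hysteresis described above.
LOSS_THRESHOLD = 3.0      # clouds break up above roughly 3x today's CO2
RECOVERY_THRESHOLD = 0.5  # clouds reform only below roughly 0.5x today's CO2

def update_cloud_state(co2_ratio, clouds_present):
    """co2_ratio is CO2 relative to today; returns the new cloud state."""
    if clouds_present and co2_ratio >= LOSS_THRESHOLD:
        return False
    if not clouds_present and co2_ratio <= RECOVERY_THRESHOLD:
        return True
    return clouds_present  # otherwise the current state persists

# Ramp CO2 up past the tipping point, then back down to today's level.
state = True
for ratio in [1.0, 2.0, 3.0, 2.0, 1.0]:
    state = update_cloud_state(ratio, state)
# state is now False: back at today's CO2, the clouds have not returned.
```

The asymmetry between the two thresholds is the whole story: crossing the tipping point is easy, but undoing it requires going far below where the trouble started.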
But incorporating thousands of such simulations, representing key spots throughout the planet, into a global model could dramatically improve the model's ability to calculate cloud behavior worldwide and the future of Earth's climate. Schneider hopes to make such a global climate model a reality through the work of the Climate Modeling Alliance, a coalition of scientists, engineers and mathematicians from JPL, Caltech, the Massachusetts Institute of Technology (MIT) and the Naval Postgraduate School.

And the new model promises other important improvements as well. “We live in a golden age of Earth observations from space,” Schneider said, and the new model is being designed to automatically incorporate the flood of data continually pouring down from our Earth-observing satellites, as well as from ocean buoys and other sources. That, Schneider said, means the new model will be able to take advantage of much more observational data than current models can employ.

The new model also aims to automate and improve the current process of tweaking a model to bring it into closer agreement with actual measurements. “Right now, there’s manual calibration of just a few points, a few parameters, relative to very few pieces of data,” Schneider said. “We want to do the same thing, but with all imprecisely known parameters of the model rather than just a few, and using data much more massively.”

This level of accuracy stands to benefit everyone who needs to plan for the challenges ahead, from the leaders of governments and businesses to individuals. “Just as you now have weather-forecasting apps on the phone in your pocket, helping you make decisions about the future, we want to make sure that in a few years, you can have climate-prediction apps on your phone,” Schneider said. “So, for example, when you buy a house, you can know how likely it is that the forest behind the house will burn or that the neighborhood will be flooded. I would say within 5 years, we’ll have it.”
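The automated calibration Schneider describes, fitting a model's uncertain parameters against observations rather than tuning them by hand, can be sketched in miniature. Everything here is hypothetical: a one-parameter model and made-up observations, with the best parameter chosen by least-squares error over a grid of candidates.

```python
# Minimal sketch of automated calibration (illustrative only): pick the
# parameter value that minimizes squared error against all observations,
# instead of hand-tuning against a few numbers.

def calibrate(param_grid, model, observations):
    """Return the candidate parameter with the lowest total squared error."""
    def error(p):
        return sum((model(p, x) - y) ** 2 for x, y in observations)
    return min(param_grid, key=error)

# Hypothetical one-parameter model: predicted value = p * x.
model = lambda p, x: p * x

# Made-up observations generated near p = 2 with a little noise.
observations = [(1, 2.1), (2, 3.9), (3, 6.2)]

best = calibrate([i / 10 for i in range(0, 41)], model, observations)
# best lands at 2.0, close to the "true" slope behind the noisy data.
```

A real climate model has many interacting parameters and far more data, so brute-force grid search gives way to optimization and machine-learning methods, but the objective (minimize mismatch with observations over all uncertain parameters at once) is the same idea.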

Copyright © 2019 Gail Chord Schuler. All Rights Reserved.