Everything posted by SpaceMan

  1. This June 5, 2024, image shows lysozyme crystals aboard the International Space Station. Lysozyme is a protein found in bodily fluids such as tears, saliva, and milk, and is used as a control compound to demonstrate well-formed crystals. Lysozyme plays a vital role in innate immunity, protecting against bacteria, viruses, and fungi. The crystals were grown with Redwire’s PIL-BOX in a study of the effect of microgravity on the production of various types of crystals. Image credit: Redwire
  2. Map the Earth’s Magnetic Shield with the Space Umbrella Project

A stream of charged particles known as the solar wind flows from the Sun toward Earth. Here, it meets the Earth’s magnetic fields, which shield our planet like a giant umbrella. The Space Umbrella project needs your help investigating this dynamic region, where NASA’s Magnetospheric Multiscale (MMS) mission has been collecting data since 2015. The MMS mission investigates how the Sun’s and Earth’s magnetic fields connect and disconnect, explosively transferring energy from one to the other in a process that is important to the Sun, other planets, and everywhere in the universe. With the Space Umbrella project, you will help identify when the MMS spacecraft has observed the strongest interactions between the Earth’s magnetosphere and the solar wind. While these interactions can result in beautiful auroras, they also release energy that could disrupt GPS and communications systems and endanger astronauts. Your work will also help scientists better understand solar storms, which in turn can help keep our astronauts and technology safe. To get started, visit the Space Umbrella project website and complete the tutorial. The tutorial will teach you everything you need to know, including how to tell when the satellite is inside Earth’s magnetic field and when the magnetosphere is interacting with the Sun’s particles. Everyone is welcome to participate — no prior experience needed!

Left: An artist’s drawing of Earth’s magnetic field (blue lines) interacting with the Sun’s charged particles (yellow lines). The Earth’s magnetosphere (orange crescent) is created by Earth’s magnetic field. It deflects those particles like an umbrella. Right: NASA MMS mission observations like those volunteers would see while participating in the Space Umbrella project.
NASA/Johns Hopkins Applied Physics Laboratory

Learn More and Get Involved: visit Space Umbrella to use data from NASA’s Magnetospheric Multiscale mission to shed light on solar storms.

Last Updated: Feb 19, 2026
  3. The International Space Station orbits above the Atlantic Ocean off the coast of Portugal. A small but mighty piece of lab equipment, about the size of a cellphone, has arrived at the International Space Station after launching with NASA’s SpaceX Crew-12 mission. NASA aims to use the off-the-shelf device, called a microplate reader, to conduct vital biological research in space and get real-time access to data. Demonstrations like this are part of NASA’s Commercially Enabled Rapid Space Science (CERISS) initiative, which partners with industry to develop transformative research capabilities and increase the pace and productivity of space science. NASA’s Biological and Physical Sciences Division is leading the demonstration in collaboration with the agency’s International Space Station Program.

Potential to speed up access to research results

The immediate benefit of using a microplate reader for space science is speed. Scientists can get data as soon as testing is complete, rather than waiting for samples to be stored, returned to Earth, and analyzed in ground labs. In-situ analysis like this — testing done on-site rather than after sample return — could reduce the delays, complications, and costs of bringing materials back to Earth. Traditional microplate readers on the ground are typically much larger — often bigger than a microwave — but NASA’s tests will use a version that is not much larger than a cellphone. For now, the microplate reader device requires a trained astronaut to run tests. But proving commercial lab equipment can work in low Earth orbit could open doors for future automation and even more advanced testing capabilities. In the future, scientists could test astronaut samples for various molecules during long-duration missions to monitor crew health in deep space. The microplate reader is adaptable: different test kits could support a range of measurements wherever humans explore in space.
Shining light on space biology

The microplate reader uses a wavelength of light to detect color in biological tests. When a target molecule is present in a sample, the test produces a color change. The intensity of that change tells researchers how much of a particular molecule is present. NASA will initially use samples from the Microgravity Associated Bone Loss-B (MABL-B) investigation — which explores potential ways to prevent bone loss in space — to test the microplate reader on the space station. For this demonstration, the microplate reader will measure a protein called interleukin-6 in samples from the MABL-B investigation. Scientists suspect this protein may contribute to astronaut bone loss. Operating the device is straightforward. It connects to a tablet or laptop via USB and uses standard 96-well plates — the same format many labs use on Earth. An astronaut runs the test using software to operate the device and gets results immediately. Scientists can monitor the experiment in real time via video and visually observe the initial readouts. If researchers have instructions for the crew, those are relayed via space station ground personnel communicating with the crew. Additionally, a detailed data file can be downlinked quickly from the station and shared with the researchers.

Testing commercial lab equipment in the ultimate laboratory

A microplate reader arrived at the orbiting laboratory Feb. 14 with Crew-12. The test kit and samples will launch aboard a future mission to the space station. Once all materials are aboard the station, NASA will run the demonstration and compare the results with identical tests conducted on Earth. “The microplate reader hardware and the kit to measure a protein called interleukin-6 are both off the shelf — we’re testing these commercially available products in space to accelerate the pace of doing research in orbit,” said Dan Walsh, CERISS program executive for NASA.
“Our CERISS effort is building the capabilities and infrastructure needed for a thriving low Earth orbit research economy. Demonstrations like this show how commercial tools can integrate into space station operations and help grow the commercial space industry.”
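The colorimetric approach the article describes can be illustrated with a standard curve: wells containing known concentrations of the target molecule establish the relationship between measured color intensity (absorbance) and concentration, and unknown samples are then read off the inverted curve. A minimal sketch in Python, with made-up absorbance values and a simple least-squares line (real IL-6 kits specify their own standards and curve-fitting procedures):

```python
# Build a standard curve (absorbance vs. known concentration), then invert it
# to estimate the concentration of an unknown sample.
# Hypothetical values for illustration only.

# Known standards: (concentration in pg/mL, measured absorbance).
standards = [(0.0, 0.05), (50.0, 0.30), (100.0, 0.55), (200.0, 1.05)]

# Least-squares fit of absorbance = slope * concentration + intercept.
n = len(standards)
sx = sum(c for c, _ in standards)
sy = sum(a for _, a in standards)
sxx = sum(c * c for c, _ in standards)
sxy = sum(c * a for c, a in standards)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

def concentration(absorbance: float) -> float:
    """Invert the standard curve to estimate an unknown sample."""
    return (absorbance - intercept) / slope

print(round(concentration(0.80), 1))  # unknown well, estimated in pg/mL
```

With perfectly linear standards like these, the fit recovers the underlying line exactly; real assay data are noisy, which is why multiple standards are run on every plate.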
  4. Northern Glow Spans Iceland and Canada

NASA Earth Observatory, February 16, 2026

Although the aurora borealis, or northern lights, is most often observed in March and September, it can appear at other times of the year if conditions are right. For instance, in February 2026, a minor geomagnetic storm produced a striking display of light swirling across northern skies. The VIIRS (Visible Infrared Imaging Radiometer Suite) on the Suomi NPP satellite acquired these images in the early morning hours of February 16. The VIIRS day-night band detects nighttime light in a range of wavelengths from green to near-infrared and uses filtering techniques to observe signals such as city lights, reflected moonlight, and auroras. While these satellite data are displayed in grayscale, auroras appear in various colors to observers on the ground, from green (the most common) to purple to red. The first image (top) shows ribbons of light that shimmered over the Denmark Strait and Iceland at 04:45 Universal Time (4:45 a.m. local time in Reykjavík). The second image shows the view farther west, where the lights danced above the Canadian provinces of Québec and Newfoundland and Labrador at about 06:30 Universal Time (1:30 a.m. local time in Montreal). According to the NOAA Space Weather Prediction Center, a minor geomagnetic storm was in progress during this period. Classified as a G1 — the lowest level on a scale that goes up to G5 — such storms typically make the aurora visible at high latitudes.
G1 storms can also cause slight disruptions, including weak fluctuations in power grids and minor impacts on satellite operations. Later that day, conditions intensified to a G2 storm, likely associated with a coronal hole and a high-speed stream of solar wind. G2 storms are considered moderate in strength and can occasionally push auroral displays as far south as New York and Idaho. About a week earlier, on February 10, a NASA rocket mission launched from the Poker Flat Research Range near Fairbanks, Alaska, to study the electrical environment of an aurora. The GNEISS (Geophysical Non-Equilibrium Ionospheric System Science) mission’s two sounding rockets gathered data that will help scientists create a 3D reconstruction of the electrical currents flowing from the northern lights. Combined with observations from the ground and space, this information can help researchers better understand the system that drives space weather near Earth.

NASA Earth Observatory images by Michala Garrison, using VIIRS day-night band data from the Suomi National Polar-Orbiting Partnership. Story by Kathryn Hansen.

Downloads: Iceland, February 16, 2026, JPEG (654.50 KB); Canada, February 16, 2026, JPEG (1.79 MB)

References & Resources
NASA Science (2025, February 27) Electrojet Zeeman Imaging Explorer. Accessed February 18, 2026.
NASA Science (2025, January 23) Aurorasaurus. Accessed February 18, 2026.
NASA’s Wallops Flight Facility (2026, February 10) NASA Rocket to Conduct ‘CT Scan’ of Auroral Electricity. Accessed February 18, 2026.
NOAA Space Weather Prediction Center via X (2026, February 16) EXTENDED WARNING: Geomagnetic K-index of 5 expected. Accessed February 18, 2026.
University of Alaska Fairbanks (2026, February) Launches x4: Multiple missions kept everyone busy at Poker Flat. Accessed February 18, 2026.
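The NOAA G-scale mentioned above maps directly onto the planetary K-index (Kp) cited in the references: Kp 5 corresponds to G1 (minor), and each step up in Kp adds one storm level, through Kp 9 for G5 (extreme). A minimal sketch of that published mapping:

```python
# NOAA Space Weather Prediction Center geomagnetic storm scale:
# Kp 5 -> G1 (minor) through Kp 9 -> G5 (extreme); below Kp 5 is
# not considered storm level.

def storm_level(kp: int) -> str:
    """Map a planetary K-index value to its NOAA G-scale level."""
    levels = {5: "G1", 6: "G2", 7: "G3", 8: "G4", 9: "G5"}
    return levels.get(kp, "below storm level")

print(storm_level(5))  # the Kp-5 conditions warned of on February 16 -> G1
print(storm_level(6))  # the later, moderate conditions -> G2
```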
  5. Digital Surface and Terrain Models from Vantor’s Precision3D Product Line Added to Satellite Data Explorer

NASA’s Commercial Satellite Data Acquisition (CSDA) Program announces the addition of three digital elevation and digital terrain products from Vantor’s Precision3D Product Line to its Satellite Data Explorer (SDX) data access and discovery tool. The products include:

Digital Surface Model (DSM) at 1-meter spatial resolution
The DSM is a 3D elevation model derived from imagery captured by Vantor’s constellation of WorldView satellites. It provides precise measurements across all surfaces and terrains and is available in standard formats to facilitate integration into a range of workflows and analyses. It is suitable for a range of applications requiring detailed elevation data, such as urban planning, environmental monitoring, disaster mitigation and response, and terrain mapping.

Digital Terrain Model (DTM) at 1-meter spatial resolution
The DTM is a 3D elevation model derived from the DSM that offers bare-earth elevation data by removing above-ground features like vegetation and buildings, and is designed for analyzing terrain and topography. Created with automated processing techniques, the DTM ensures consistency across all terrain types and is available in a variety of user-friendly formats.

Elevation Bundle (DSM + DTM) at 1-, 2-, and 4-meter spatial resolution
The Elevation Bundle, which combines the DSM and DTM products, provides a detailed view of both above-ground features and the underlying bare earth. With global coverage and high-resolution data at 1-, 2-, and 4-meter resolution, this product offers reliable elevation information in all types of terrain, making it a suitable tool for a range of applications from slope analysis to flood modeling.
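Because the DTM is the DSM with above-ground features removed, differencing the two rasters yields a normalized surface model: the per-cell heights of buildings and vegetation. A minimal sketch with hypothetical elevation grids (real Precision3D rasters would be read with a GIS library such as rasterio, which is not shown here):

```python
# Subtracting a bare-earth DTM from a DSM gives a normalized surface model
# (nDSM): the height of above-ground features in each cell.
# 2x3 grids of elevations in meters, hypothetical values for illustration.

dsm = [[105.0, 112.0, 104.5],
       [103.0, 103.2, 118.0]]   # surface: ground plus trees/buildings
dtm = [[104.0, 104.5, 104.0],
       [103.0, 103.0, 104.0]]   # bare-earth terrain

# Per-cell difference: height of above-ground features.
ndsm = [[s - t for s, t in zip(srow, trow)]
        for srow, trow in zip(dsm, dtm)]
print(ndsm)
```

Cells where the two models agree (open ground) come out near zero, while the taller differences mark canopy or structures, which is why DSM/DTM pairs are useful for applications like flood modeling and slope analysis.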
“Digital Elevation Models are foundational geospatial infrastructure for NASA’s science community, and including them in the CSDA program ensures broad, consistent access to high‑quality commercial terrain data that sharpen geometric accuracy, support Earth system and hazard modeling, and extend NASA’s capabilities in support of Earth action priorities,” said Dana Ostrenga, Project Manager for the CSDA.

About SDX

The SDX allows users to search, discover, and access data acquired through the CSDA program. The web tool offers streamlined data download, automated quota tracking, and a new coverage map that provides a high-level overview of the regions covered by the data discoverable through the SDX for any specified month and year. Currently, SDX offers access to the EarthDEM digital elevation model created by the Polar Geospatial Center at the University of Minnesota and, now, data from Vantor (formerly Maxar). For a summary of the NASA commercial partner datasets available in SDX, visit the SDX website. Researchers interested in accessing these data in SDX can use their Earthdata Login for authentication and initiate data download requests. Data will be made available for download upon approval and acceptance of the end user license agreement (EULA). The use of these digital elevation and digital terrain products is governed by a United States government End User License Agreement (USG EULA). To order data from SDX, users must create a NASA Earthdata account and be logged in. (The initial attempt to use SDX will redirect users to Earthdata Login, where they will be prompted to enter their Earthdata credentials and accept the terms of the EULA.) Users must agree to the terms of the EULA before any data can be requested. Note: All data requests must be approved by CSDA data managers.
About the CSDA Program

NASA’s Earth Science Division (ESD) established the CSDA Program to identify, evaluate, and acquire data from commercial providers that support NASA’s Earth science research and applications. NASA recognizes the potential of commercial satellite constellations to advance Earth System Science and applications for societal benefit and believes commercially acquired data can augment the Earth observations acquired by NASA, other U.S. government agencies, and NASA’s international partners. All data from CSDA contract-awarded vendors are evaluated by investigator-led CSDA project teams that assess the value of adding a vendor’s data to CSDA’s data holdings based on their quality and how they might benefit NASA Earth science research and applications. To learn about the program, its commercial partners, data evaluation process, and more, visit the CSDA website.

Learning Resources

For more information on the CSDA Program’s SDX, see the SDX user guide.

Last Updated: Feb 18, 2026
  6. Vantor Archive Imagery Added to Satellite Data Explorer

A high-resolution multispectral image of Washington, DC from Vantor. Visible are the Washington Monument (left), Tidal Basin (the body of water in the center-right), and the Jefferson Memorial (right). Credit: Vantor

NASA’s Commercial Satellite Data Acquisition (CSDA) Program announces the addition of imagery from Vantor to its Satellite Data Explorer (SDX) data access and discovery tool. The imagery, which was obtained by Vantor’s Legion satellites, comes from Vantor’s 125-plus petabyte imagery archive, which dates back to 1999. The imagery from this archive contains a mix of panchromatic (black/white) and color imagery (up to 18 multispectral bands) and offers global coverage at up to 30 cm resolution. There are three types of imagery available from this archive in SDX:

System-Ready Level 1B Data
This data is ideal for users who want to apply their own tools and models to fully process the data and extract the information they need. It comes with all bands and full bit depth, and requires further processing before it is ready for deriving downstream analytics. This basic processing offers an imagery product ready for custom orthorectification.

View-Ready Level 2A Data
This processing level is intended for users who want to get straight to using the data to extract downstream analytical information. It provides a basis for deriving downstream analytics and has been orthorectified against a coarse digital elevation model (DEM). It comes with all bands and full bit depth.

Map-Ready 3-D
This data product offers standardized, orthorectified (i.e., corrected to remove distortion caused by terrain variations and sensor angle) imagery that has been radiometrically calibrated and geo-rectified to produce a highly accurate imagery product ready for seamless integration into workflows.
Map-ready data is ideal for image viewing and locational referencing and offers a high degree of cartographic accuracy. Vantor’s Legion satellites offer 8-band visible and near-infrared multispectral imagery at a resolution of up to 30 centimeters for use in a wide variety of applications ranging from agriculture and natural resources monitoring to disaster response and environmental surveillance. Further, the addition of these datasets to the CSDA Program’s SDX enhances the tool’s utility, helping users within NASA’s larger Earth observation community find high-resolution data that meets their needs. “NASA established the CSDA Program to identify, evaluate, and acquire data from commercial sources that support NASA’s Earth science research and application goals,” said CSDA Project Manager Dana Ostrenga. “The inclusion of these Vantor data products in SDX is an example of our focus on realizing that mission and marks yet another step toward our goal of bringing high-quality data from NASA’s commercial partners to users within the Earth observation science community.”

About SDX

The SDX allows users to search, discover, and access a variety of Global Navigation Satellite System (GNSS), digital elevation model (DEM), synthetic aperture radar (SAR), multispectral, and precipitation radar data acquired through the CSDA program. It also provides streamlined data download, automated quota tracking, and a new coverage map that provides a high-level overview of the spatial coverage of the data discoverable through the SDX for any specified month and year. For a summary of the NASA commercial partner datasets available in SDX, visit the SDX website. Researchers interested in accessing these data in SDX can use their Earthdata Login for authentication and initiate data download requests. Data will be made available for download upon approval and acceptance of the end user license agreement (EULA).
To order data from SDX, users must create a NASA Earthdata account and be logged in. (The initial attempt to use SDX will redirect users to Earthdata Login, where they will be prompted to enter their Earthdata credentials and accept the terms of the EULA.) Users must agree to the terms of the EULA before any data can be requested. Note: All data requests must be approved by CSDA data managers.

About the CSDA Program

NASA’s Earth Science Division (ESD) established the CSDA Program to identify, evaluate, and acquire data from commercial providers that support NASA’s Earth science research and applications. NASA recognizes the potential of commercial satellite constellations to advance Earth System Science and applications for societal benefit and believes commercially acquired data can augment the Earth observations acquired by NASA, other U.S. government agencies, and NASA’s international partners. All data from CSDA contract-awarded vendors are evaluated by investigator-led CSDA project teams that assess the value of adding a vendor’s data to CSDA’s data holdings based on their quality and how they might benefit NASA Earth science research and applications. To learn more about the program, its commercial partners, data evaluation process, and more, visit the CSDA website.

Learning Resources

For more information on the CSDA Program’s SDX, see the tool’s user guide.
Last Updated: Feb 18, 2026
  7. CSDA Releases New Data Acquisition Request System

This screen capture of the SDX dashboard shows a map of Earth’s surface and, on the right, the search filters SDX users can manipulate to find the imagery they need. Credit: CSDA

NASA’s Commercial Satellite Data Acquisition (CSDA) Program released a new Data Acquisition Request System, which lets authorized users submit proposals for yet-to-be-collected data from CSDA’s commercial partners and track their requests through an easy-to-use dashboard. “With the Data Acquisition Request System, approved users will be able to ‘task,’ meaning to request future data from, a CSDA commercial partner’s satellite,” said Aaron Kaulfus, CSDA Data Management Team Lead. “The process begins with a user submitting a proposal that is subject to an approval process. If approved, the proposal will be processed by a CSDA commercial partner in accordance with the user’s parameters.” The Data Acquisition Request System has been incorporated into the CSDA Program’s Satellite Data Explorer (SDX), an online tool for searching, discovering, and accessing the commercial satellite data acquired by NASA. (Note: Although anyone can browse the CSDA’s data holdings, only authorized data users can log into the SDX and request data. Information on the user authentication and authorization process is provided below.) “The dashboard shows users the proposals they’ve submitted and informs them of each proposal’s status and whether it’s been approved. In the case a proposal is partially approved, the dashboard will also include information supporting that decision,” said Kaulfus. “After approval, the proposal will be processed by the vendor, and the requested data will be collected and delivered to the system for download.
This means that users can now request data from a vendor, track the status of their proposal, and download the data all in one place.” By providing these services in a single, centralized system, the CSDA aims to make the process of requesting future data from CSDA vendors more efficient and user-friendly. “Currently, the proposal process relies on users filling in a PDF-type form about their data needs followed by a series of email exchanges among users, CSDA Program staff, and vendors,” Kaulfus said. “The Data Acquisition Request System consolidates all of these interactions into a single, streamlined system, which allows users’ proposals to move through the [proposal review] process as quickly and efficiently as possible.” That process includes in-depth proposal reviews by CSDA staff to ensure the requested data fall within the program’s budget and the vendor’s capabilities. Therefore, the program’s response to users’ proposals won’t be immediate. Still, Kaulfus says the Data Acquisition Request System’s dashboard will help CSDA staff stay abreast of each proposal’s status and any actions required to keep it moving through the evaluation process. In addition to expediting users’ proposals, the Data Acquisition Request System will help the program address CSDA data users’ needs over the long term by providing the program with information it can use to expand its catalog of commercial satellite data. “We’ve realized that, through the Data Acquisition Request System, we can collect and catalog our users’ requests to inform future CSDA initiatives and add to our current capabilities,” said Kaulfus. “For example, in regard to fire applications, we really don’t have vendors that will support hotspot detection right now.
But if a large number of users submit proposals requesting hotspot detection data, then that points to a need that we’ve not addressed.” This ability to zero in on unmet user needs supports the program’s goal of expanding the use of commercial data within NASA’s data-user community. “Expanding the use of commercial data is a big part of this effort,” said Kaulfus. “We want to grow the audience of people who use our data and we want to do it efficiently, but for that to happen, we need information about the data that users need. Along with direct feedback from users themselves, the Data Acquisition Request System will help us get it.”

Learning Resources

For more information on the CSDA Program’s SDX, see the SDX user guide.

Last Updated: Feb 18, 2026
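The proposal lifecycle described here (submit, review, approve or partially approve, vendor tasking, data delivery) is essentially a small state machine. A minimal sketch in Python; the status names are hypothetical, since the article does not specify the system’s internal states:

```python
# Hypothetical state machine for a data acquisition proposal, following the
# workflow described in the article: submitted -> under review ->
# approved / partially approved -> tasked by the vendor -> data delivered.

ALLOWED = {
    "submitted":          {"under_review"},
    "under_review":       {"approved", "partially_approved", "declined"},
    "approved":           {"tasked"},
    "partially_approved": {"tasked"},
    "tasked":             {"delivered"},
}

def advance(status: str, new_status: str) -> str:
    """Move a proposal to a new status, enforcing the review order."""
    if new_status not in ALLOWED.get(status, set()):
        raise ValueError(f"cannot go from {status} to {new_status}")
    return new_status

# Walk one proposal through a full approval path.
s = "submitted"
for step in ("under_review", "approved", "tasked", "delivered"):
    s = advance(s, step)
print(s)
```

Modeling the workflow this way makes the dashboard behavior in the article concrete: each proposal carries exactly one status, and only the transitions the review process permits are possible.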
  8. CSDA Program Announces Eight New Data Agreements

This Spotlight Mode SAR image from Capella Space shows a portion of the city of Pittsburgh, Pennsylvania, on August 21, 2021. Credit: Capella Space

NASA’s Commercial Satellite Data Acquisition (CSDA) Program announced eight new agreements with seven of its commercial partners — Airbus Defense and Space GEO Inc (Airbus U.S.), Capella Space Corporation, ICEYE US, MDA Space, Planet Labs, Umbra, and Vantor (formerly Maxar) — to give users more access to near‑global multispectral and synthetic aperture radar (SAR) data. With these agreements, the CSDA Program further advances its mission to acquire data from commercial providers that supports NASA’s Earth science research and applications, and expands the quality, coverage, and range of Earth observation data NASA offers to the scientific community. “These new agreements will provide users with a range of high-quality multispectral and SAR data that can be used in a variety of applications from environmental monitoring to surface deformation,” said CSDA Project Manager Dana Ostrenga. “In addition, they exemplify the CSDA Program’s commitment to acquiring data that enhances and supports the agency’s application and research objectives.”

New Near-Global, Multispectral Imagery

In support of NASA programs and stakeholders, the CSDA Program enacted three agreements with Planet, Airbus, and Vantor (formerly Maxar) to provide near‑global multispectral and pan‑sharpened electro‑optical satellite imagery of nearly all global land and coastal surfaces. This imagery has a spatial resolution of approximately 30 centimeters, 1 meter, or up to 10 meters (depending on the product) and is suitable for applications including environmental monitoring, agriculture, and urban planning. Data products will include Top of Atmosphere radiances and surface reflectance across the visible and near‑infrared spectrum.
New SAR Data

In response to NASA’s and users’ needs for SAR data, and following rigorous technical and programmatic evaluation, CSDA executed five agreements for high‑resolution SAR imagery, including tasked Spotlight, StripMap, Scan, Wide/Extended Spotlight, and Long‑Dwell modes, with Capella, ICEYE, MDA, Umbra, and Airbus. These SAR capabilities provide all‑weather, day‑night imaging that complements the electro‑optical agreements and enhances NASA’s ability to monitor dynamic processes such as flooding, land deformation, sea‑ice motion, and infrastructure impacts. Further, under these agreements, each commercial partner will meet specific data requirements consistent with their respective sensor capabilities and performance, and will provide tasking and archive access.

Increased Access and User Eligibility

The data acquired under these agreements will be made available to authorized commercial satellite data users in accordance with the CSDA Program’s End User License Agreements (EULAs). EULAs generally pertain to NASA‑funded investigators and designated collaborators and outline established mechanisms for accessing CSDA data, such as the CSDA Satellite Data Explorer (SDX) and related portals. Users can contact the CSDA Program at *****@*****.tld to obtain additional information about user agreements, detailed product specifications, and procedures for requesting and accessing these commercial datasets for their research and application activities.

About the CSDA Program

NASA’s Earth Science Division (ESD) established the CSDA Program to identify, evaluate, and acquire data from commercial providers that support NASA’s Earth science research and applications. NASA recognizes the potential of commercial satellite constellations to advance Earth System Science and applications for societal benefit and believes commercially acquired data can augment the Earth observations acquired by NASA, other U.S.
government agencies, and NASA’s international partners. All data from CSDA contract-awarded vendors are evaluated by the investigator-led CSDA project teams that assess the value of adding a vendor’s data to CSDA’s data holdings based on their quality and how they might benefit in the context of NASA Earth science research and applications. To learn more about the program, its commercial partners, data evaluation process, and more, visit the CSDA website. View the full article
  9. 2 min read Notes from the Field Looking at Chlorophyll from Space By Compton “Jim” Tucker Tucker began his ground studies using a handheld instrument built by one of his classmates. “The instrument was literally held together by masking tape and rubber bands.” NASA scientists are able to study plants from space, but this wasn’t always the case. “I love using satellite data to study the Earth,” says Dr. Compton “Jim” Tucker. When Tucker was a graduate student, he and some friends discovered a new way to study photosynthesis. “We realized that there was a really strong connection between the plant pigment, chlorophyll, and certain wavelengths of light. We figured out that if you wanted to study photosynthesis you needed to study chlorophyll.” Tucker learned that you could figure out plant health by measuring how much visible and near-infrared light a plant reflects. “We call this light-type comparison the Normalized Difference Vegetation Index (NDVI). Really it is just a simple ratio of these two wavelengths or bands.” Tucker in 1971. Tucker first became interested in the world around him and began to look at it more closely when a friend’s older brother took them both exploring around the Pecos River in New Mexico. “He really helped to raise my awareness and my interest in the natural wonders of Earth. I really enjoy doing field work.” This was groundbreaking science. Tucker also learned that this observation and comparison could be done from space. In 1981 the first NDVI instrument flew in space as part of the Advanced Very High Resolution Radiometer (AVHRR) mission. “It is the same instrument from my working-in-the-field days, literally, just *******.” Later in 1983, Tucker met Piers Sellers. This meeting began a decades-long friendship and scientific collaboration. Sellers came up with a way to scale Tucker’s photosynthesis measurements. 
This made it possible to get detailed information about plant health around the globe — from a single leaf to plants covering a field, a forest, or a continent, and all from space. “People are always asking me when I plan to retire,” Tucker says. “And I always say that I really like what I am doing. I am going to do it for as long as I can because it is fun. Most people look at me and think ‘Are you crazy?’ I am not. It is true: I really love my work.” About the Author Compton “Jim” Tucker Compton “Jim” Tucker is a Senior Scientist in the Earth Sciences Division at NASA’s Goddard Space Flight Center (GSFC). Tucker has been able to travel to some pretty exciting places to do research. This image was taken while in the field in the Amazon. Jim’s beard, usually white, appears red in this picture. He used a special native Amazonian fruit to dye his hair red for fun. Share Details Last Updated Feb 18, 2026 Related Terms Explorer Keep Exploring Discover More Topics From NASA Jet Propulsion Laboratory Earth Your home. Our Mission. And the one planet that NASA studies more than any other. Explore NASA’s History Get Your Daily Dose of NASA History Explorer 1 America’s first satellite, Explorer 1. America joined the space race with the launch of this small, but important spacecraft. View the full article
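Tucker’s NDVI, described above, reduces to a one-line ratio of near-infrared and red reflectance. A minimal sketch of that calculation (the sample reflectance values below are invented for illustration, not real AVHRR band data):

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

# Healthy vegetation reflects strongly in the near-infrared and absorbs
# red light for photosynthesis, so its NDVI is high; sparse or stressed
# vegetation scores lower, and bare soil or water sits near or below zero.
print(ndvi(nir=0.50, red=0.08))   # dense canopy: high NDVI
print(ndvi(nir=0.30, red=0.20))   # sparse vegetation: low NDVI
```

Because NDVI is a normalized ratio, it is bounded between -1 and 1 and partially cancels out illumination differences between scenes, which is part of why it works across sensors and decades.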
  10. 5 min read 42 Years of Measuring the Sun, the Earth and the Energy in Between By Denise Lineberry NASA’s Earth Radiation Budget Satellite (ERBS), part of NASA’s three-satellite Earth Radiation Budget Experiment (ERBE), was designed to investigate how energy from the Sun is absorbed and re-emitted by the Earth. On Jan. 31, 1958, Explorer 1 became the first satellite launched by the United States. Its primary science instrument, a cosmic ray detector, was designed to measure the radiation environment in Earth orbit. Though its final transmission was in May 1958, it continued to revolve around Earth more than 58,000 times. As those looping orbits continued, NASA was busy building other ground-breaking instruments to observe and better understand Earth’s systems. By 1975, just five years after Explorer 1 burned up as it entered Earth’s atmosphere, NASA’s first Nimbus instrument launched, providing the first global, direct observations of the amount of solar radiation entering and exiting Earth. This helped confirm and improve the earliest climate models and laid the groundwork for NASA’s Earth Radiation Budget Experiment (ERBE). By the 1970s, the ERBE team was beginning to plan for the next phase of Earth Radiation Budget measurements. Retired experiment scientist for ERBE, Bruce Barkstrom, recalled that the very first ERBE science team meeting involved a full day of attempting to determine exactly where the top of the atmosphere was. After much debate, they assigned one person at NASA’s Langley Research Center in Hampton, Virginia, to develop the number, which ended up being about 18 miles (30 kilometers) above Earth’s surface. “That was the level of detail we had to get into as a science team,” Barkstrom said. In October 1984, ERBE launched aboard NASA’s Earth Radiation Budget Satellite (ERBS) from the space shuttle Challenger (STS-41G). “We had to get up at 3:30 a.m. 
to watch the ERBS launch at 7:30 a.m., and what I remember about that particular morning was that we had an overcast sky. And when the shuttle lit up, it was such a bright exhaust that it lit up the whole sky from underneath,” Barkstrom recalled. “And then, of course, the shuttle went through the clouds, and the light dimmed, and probably about a minute later the sky lit up again because the sun was reflected off the exhaust. “It’s impossible for me to describe this without getting a little emotional.” Early leaders in NASA’s CERES (Clouds and the Earth’s Radiant Energy System) mission, including former Principal Investigators Bruce Wielicki and Bruce Barkstrom, used knowledge gathered from Nimbus and ERBE to formulate and execute a long-term satellite-based study of the role that clouds play in Earth’s radiant energy system. The seventh and final CERES Flight Model-6 achieved ‘first light’ in January 2018. For 10 years, ERBE provided invaluable data for scientists studying the energy interactions between the Sun, clouds and Earth. Its satellite measurements have provided new information on Earth’s radiation at the top of the atmosphere, including the important radiative effects of clouds on incoming and outgoing energy in the overall process. In the late 1980s, satellite instruments provided the first direct observation that clouds cooled Earth’s climate. Former CERES Principal Investigator Bruce Wielicki developed an algorithm to apply to Nimbus and ERBE models to help quantify cloud forcing — the difference between the radiation budget components for average cloud conditions and cloud-free conditions. With new knowledge about the important role that clouds play in Earth’s energy budget, the science team was anxious to gather more data. In 1997, the first in a new series of instruments, the Clouds and the Earth’s Radiant Energy System (CERES), launched, extending the important ERBE measurements. 
Six other CERES instruments have since been activated in space to measure the solar energy reflected by Earth, the heat the planet emits, and the role of clouds in that process. “The CERES instrument is small, it’s very elegant, it’s probably the most accurate radiometry that NASA has flown,” said CERES Principal Investigator Kory Priestley. “We’re trying to build the next generation of instrument now to meet the same requirements.” The seventh and final CERES instrument launched aboard NOAA’s Joint Polar Satellite System (JPSS)-1 in November 2017. It has since been activated, and first light was expected in January 2018. For 42 years, NASA has observed Earth’s energy budget. NASA Langley’s Earth Radiation Budget Science Team is the only group producing ERB data globally. Though our understanding of Earth’s energy budget and the technology used to gather data has taken massive strides since Explorer 1 and Nimbus, that understanding is ever-evolving. “With Earth observations, you never complete your understanding, so you’re always at the mercy of somebody discovering some new things,” Barkstrom said. “If you’re dealing with observational science, you never have that final escape into absolute certainty where you’ll never have to change things.” Why Measure Earth’s Energy Budget? According to Barkstrom, attempts to understand the radiation budget started in about 1880. Earth’s energy budget refers to the delicate equilibrium between energy arriving from the Sun and energy radiated back into space. Continuous, stable and accurate data records over decades are critical to understanding Earth’s energy balance. The data collected improve models that provide seasonal and longer-term forecasts, helping industry and policy makers better plan for the future. The Latest NASA’s Total and Spectral Solar Irradiance Sensor (TSIS)-1 is currently on the International Space Station in a mission to measure the Sun’s energy input to Earth. 
Various satellites have captured a continuous record of this solar energy input since 1978. TSIS-1 sensors advance previous measurements, enabling scientists to study the Sun’s natural influence on Earth’s ozone layer, atmospheric circulation, clouds and ecosystems. These observations are essential for a scientific understanding of the effects of solar variability on the Earth system. About the Missions: ERBE and CERES ERBE The radiation budget represents the balance between incoming energy from the Sun and outgoing thermal (longwave) and reflected (shortwave) energy from the Earth. In the 1970s, NASA recognized the importance of improving our understanding of the radiation budget and its effects on Earth’s climate. Langley Research Center was charged with developing a new generation of instrumentation to make accurate regional and global measurements of the components of the radiation budget. The Goddard Space Flight Center built the Earth Radiation Budget Satellite (ERBS), on which the first Earth Radiation Budget Experiment (ERBE) instruments were launched by the Space Shuttle Challenger in 1984. ERBE instruments were also launched on two National Oceanic and Atmospheric Administration weather monitoring satellites, NOAA 9 and NOAA 10, in 1984 and 1986. CERES The Clouds and Earth’s Radiant Energy System (CERES) experiment is one of the highest priority scientific satellite instruments developed for NASA’s Earth Observing System (EOS). The first CERES instrument was launched in December 1997 aboard NASA’s Tropical Rainfall Measuring Mission (TRMM). CERES instruments are now collecting observations on three separate satellite missions, including the EOS Terra and Aqua observatories, the Suomi National Polar-orbiting Partnership (S-NPP) observatory, and soon, the Joint Polar Satellite System, a partnership between NASA and the National Oceanic and Atmospheric Administration (NOAA). 
In fall 2017, CERES FM6 launched on JPSS-1, becoming the last in a generation of successful CERES instruments that help us to better observe and study Earth’s interconnected natural systems with long-term data records. 
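The energy budget that ERBE and CERES measure is, at its simplest, a balance between absorbed sunlight and emitted heat. The textbook zero-dimensional model below illustrates that balance (a generic physics sketch, not an ERBE or CERES algorithm; the solar irradiance and albedo values are standard approximations, not figures from the article):

```python
# Absorbed sunlight, averaged over the whole sphere, balances emitted
# thermal radiation:  S/4 * (1 - albedo) = sigma * T^4.
# Solving for T gives the planet's effective emitting temperature.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0               # total solar irradiance, W m^-2 (approximate)
ALBEDO = 0.30            # fraction of sunlight reflected (approximate)

absorbed = S / 4 * (1 - ALBEDO)        # W m^-2, spread over the sphere
t_eff = (absorbed / SIGMA) ** 0.25     # effective temperature, kelvin
print(f"absorbed {absorbed:.0f} W/m^2 -> T_eff {t_eff:.0f} K")
```

The result, roughly 255 K, is well below Earth’s actual surface temperature; the difference is the greenhouse effect, and the cloud radiative effects described above shift both sides of this balance, which is why decades-long, stable measurements matter.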
  11. 6 min read The Sky Belongs to All of Us By Hashima Hasan How did a little girl, born in India soon after its independence from the British Empire, become a program scientist for NASA’s Hubble Space Telescope, and the first female program scientist for the James Webb Space Telescope, Stratospheric Observatory for Infrared Astronomy (SOFIA), Gravity Probe B, and other astrophysics flight missions? The story starts in October 1957, when I was 7 years old, and my grandmother ordered the entire family, including my 3-year-old sister, all the servants and their families, to collect at dawn in the backyard of the home and watch Sputnik pass overhead in the clear night sky of Lucknow. That morning, as I saw Sputnik and the dark, starry sky, I dreamt the impossible dream that one day I would be a space scientist. The path was not easy. With determination and encouragement from my mother and school teachers, I forged ahead, won a scholarship to the University of Oxford, from where I earned a doctorate in theoretical nuclear physics in 1976. The path to a traditional academic career for a female scientist was fraught with challenges, exacerbated by social pressures. After pursuing post-doctoral research, a university faculty position, crisscrossing three continents and making a home across the Atlantic three times, I found myself again on the shores of the U.S. (1985) ― this time with a husband and two infant sons. My research career had oscillated between nuclear physics and environmental science, preparing me for yet another scientific challenge, when I was offered a research position at the Space Telescope Science Institute (STScI), Baltimore, to write the software to simulate the optics of NASA’s newest (now legendary) telescope, the Hubble Space Telescope, and its science instruments. I boldly accepted the job, and under the guidance of Dr. Christopher Burrows, wrote the Telescope Image Modeling (TIM) software. 
Little did we know that after the launch of Hubble, TIM would be instrumental in our analysis of the first images, the identification and characterization of the spherical aberration, monitoring the focus of the telescope, and image simulations to enable scientists to analyze their aberrated data. I was appointed as the Optical Telescope Assembly (OTA) scientist, and have the dubious distinction of being the first and only OTA scientist whose task was to keep the Hubble “in focus” until a fix could be designed. I regularly monitored the images to learn about the health of the telescope optics, degradation of filters in the Faint Object Camera, and image characteristics. The flaw in the primary mirror, caused by shaving off a layer of glass from its edges no thicker than about a human hair, not only produced blurry images but also had a dramatic effect whenever there were minute movements of the mirror. We learned that the graphite epoxy truss that supported the primary and secondary mirrors desorbed water faster and longer than calculations had predicted, causing minute shrinkage in the truss. This meant that approximately every 3 months the mirror had to be moved to bring it back to the “best focus” established by the science community. I also participated in the design and optical testing phase of the Corrective Optics Space Telescope Axial Replacement (COSTAR). During the first servicing mission, I did a final image analysis and focused the telescope before COSTAR was deployed. I had been allowed three attempts to focus the telescope, but I achieved it in one attempt and COSTAR was deployed ahead of schedule. The following 2 years, I continued to work on the Hubble optics, a concept for an Advanced Camera for the Hubble, and astronomical research on barred galaxies. I am proud to be a part of the NASA team that turned adversity to victory. The story of Hubble is a tribute to NASA’s “can do” attitude. 
The entire scientific, technology and human space flight community rallied around Hubble in the true “Explore as One” spirit to fix Hubble. The brave astronauts, who undertook the life-threatening job of servicing Hubble five times, helped make the observatory what it is today. In 1994, I was ready for a new challenge and accepted a job as visiting senior scientist at NASA Headquarters, under the wing of the fabled Dr. Edward Weiler. Under his tutelage, I rapidly learned how to manage flight missions and research programs, lead community working groups, and handle strategic planning, international negotiations, and other skills. By 1999, I had gained sufficient skills and experience to be appointed as a civil servant. During my 23 years at NASA, there have been numerous memorable moments. I would like to mention some. In 1999, I was appointed as the program scientist for the Hubble, a position that I held till 2004. I provided scientific oversight to the science instruments, Wide Field Camera 3, and the Space Telescope Imaging Spectrograph (STIS), making strategic decisions to enable development within cost and schedule. I participated in two servicing missions, SM3A and SM3B. My involvement with the James Webb Space Telescope (JWST) started in 1995, when it was a mere concept referred to as the Next Generation Space Telescope (NGST), and Ed Weiler asked me to send a research grant to John Mather at Goddard Space Flight Center (GSFC) to study the concept for NGST. I was appointed NGST program scientist from 1999-2001 (and JWST program scientist from 2011-2015), and led the solicitation and selection of early technology development. I led the appointment of an Interim Science Working Group to develop the science requirements for NGST science instruments, and wrote the solicitation for the science instruments and Science Working Group. 
A particularly contentious negotiation with our partners, the European Space Agency (ESA) and the Canadian Space Agency (CSA), concerning the partnership on the Mid-InfraRed Instrument (MIRI), ended amicably. I developed a strategy for selecting a NASA center for management of the MIRI instrument. We were conducting a review of proposals for MIRI management on the fateful day, Sept. 11, 2001. Again, we did not let adversity stop us, and today MIRI and all the other science instruments are installed on JWST. Lessons learned from Hubble development have been applied to JWST development, including complete optical testing in a specially modified chamber at Johnson Space Center (JSC). The building of JWST is another example of “Explore as One,” where scientists, engineers, private industry and non-U.S. space agencies have come together with the ambitious goal of learning how the first stars and galaxies were born. I would like all readers to follow their dreams as I have and not to get discouraged, as we continue exploring the Universe. The sky belongs to all of us, and NASA’s tremendous scientific journey can be followed through our space missions on [Hidden Content]. About the Author Hashima Hasan Hashima Hasan is the NASA program scientist for the Keck Observatory, the SOFIA mission, ADCAR and is deputy program scientist for the James Webb Space Telescope. She also serves as the education lead for Astrophysics. Dr. Hasan has been the program scientist for many NASA missions, and from 2001-2006, she served as the lead for Astronomy and Physics Research and Analysis programs. Dr. Hasan received her Ph.D. from the University of Oxford, U.K., in theoretical nuclear physics. 
She was the optical telescope assembly scientist at Space Telescope Science Institute, Baltimore, until 1994, when she joined NASA Headquarters. 
  12. 4 min read Measuring the Big Bang with the COBE satellite By John Mather The Cosmic Background Explorer satellite (COBE) went up on a Delta rocket on Nov. 18, 1989, into a polar sun-synchronous orbit 900 km up. Our team at NASA Goddard Space Flight Center (GSFC), Ball Aerospace, the Jet Propulsion Laboratory (JPL) and universities built it to look at the cosmic microwave and infrared background light that comes to us from the distant universe, so far away that it seems to be a nearly uniform glow. With it, we started the new subject of precision cosmology; before COBE, very little was known except the general idea of an expanding universe, misnamed the Big Bang. (It’s misnamed because the name conjures up the image of a firecracker, happening at a place and a time. Astronomers see an infinite universe expanding into itself, with no center, no edge and no first moment.) Our team measured the spectrum of the cosmic heat ― more precisely, the cosmic microwave background radiation ― left over from times when the universe was compressed and hot, with a precision of 50 parts per million. The prediction was for a nearly perfect blackbody spectrum, and it matched. No other story of the universe was ever able to explain that. We also found the hot and cold spots of the heat radiation, known as anisotropy (Greek for not the same in every direction). Stephen Hawking said that was the most important scientific discovery of the century, if not of all time. Now we know that: a.) the spots are responsible for our existence, because gravity acting on the regions of higher density was able to stop the matter from expanding; b.) most of the spots are caused by dark matter; and c.) if we ever know what made the spots, we might understand quantum gravity. In 2006, I got a call from Stockholm, and the Nobel Prize in Physics went to me and to George Smoot in recognition of the work of our team. Now the entire world knows what we know: it was really important. 
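The blackbody spectrum that COBE confirmed is described by Planck’s law. The sketch below is generic physics, not the COBE analysis code; the 2.725 K temperature of the cosmic microwave background is the standard measured value, not a figure stated in the article:

```python
import math

# Planck's law for the spectral radiance of a blackbody,
#   B_nu(T) = 2 h nu^3 / c^2 * 1 / (exp(h nu / k T) - 1),
# the curve the COBE spectrum measurement matched to high precision.
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck(nu_hz: float, t_kelvin: float) -> float:
    """Spectral radiance B_nu(T) in W m^-2 Hz^-1 sr^-1."""
    x = H * nu_hz / (K * t_kelvin)
    return 2 * H * nu_hz**3 / C**2 / math.expm1(x)

# Scan frequencies (in GHz) to locate the peak of a 2.725 K spectrum.
t_cmb = 2.725
peak = max(range(50, 500), key=lambda ghz: planck(ghz * 1e9, t_cmb))
print(f"CMB spectrum peaks near {peak} GHz")
```

A single free parameter, the temperature, fixes the entire curve, which is why a precise match to this shape is such strong evidence for a hot, compressed early universe.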
We started in 1974, just 5 years after the first Apollo landing on the Moon, when NASA announced opportunities to propose new satellite missions. I had just finished my thesis project in January and taken a job with NASA’s Goddard Institute for Space Studies in New York City to become a radio astronomer. My thesis project at the University of California, Berkeley, was intended to measure that cosmic background radiation, but the instrument failed to function properly. Yet only months after my arrival in New York, NASA announced the opportunity. My advisor Pat Thaddeus knew what to do: call up our friends and write a proposal. (One of those friends is Rainer Weiss of the Massachusetts Institute of Technology, who was also working on gravitational wave detection. He shared the 2017 Nobel Prize for detecting gravitational waves from merging black holes.) I never expected our proposal to be chosen, but it was, thanks to far-seeing people at Headquarters like Nancy Boggess, and NASA created a new science team including people from two competing teams. Anticipating that choice, Mike Hauser recruited me to Goddard in Greenbelt, Maryland, and I was hoping to become the lead scientist. Soon Goddard assigned a brilliant team of engineers, who were just completing the IUE observatory, to help us along. We built up a team that eventually included 1,500 contributors, including a science team of 19 spread around the country. The project was extraordinarily challenging, and became the largest in-house project Goddard has ever done. We brought the work in-house because we were pushing so far beyond known engineering that it was impossible to write a contract specification; I spent much of my life in the offices of engineers seeking approaches to doing the impossible. I trusted my future to them, and they to me. In the end, our mission worked beautifully, after many changes, including a redesign after the Challenger loss made it clear we would not be launched on the shuttle. 
NASA and its partner agencies like the European Space Agency and Canadian Space Agency are the only places in the known universe where space science and space engineering meet so intimately, where engineers build what has never been built before, so scientists may discover what has never been known before. I can only marvel at the works we have done, and imagine what we may yet do together. About the Author John C. Mather John C. Mather is a senior astrophysicist in the Observational Cosmology Laboratory at NASA’s Goddard Space Flight Center (GSFC). His research centers on infrared astronomy and cosmology. As an NRC postdoctoral fellow at the Goddard Institute for Space Studies, New York City, he led the proposal efforts for the Cosmic Background Explorer (1974-1976), and came to GSFC to be the study scientist (1976-1988), project scientist (1988-1998), and also the principal investigator for the Far IR Absolute Spectrophotometer (FIRAS) on COBE. As senior project scientist (1995-present) for the James Webb Space Telescope, Dr. Mather leads the science team and represents scientific interests within the project management. He has received many awards including the 2006 Nobel Prize in Physics for his precise measurements of the cosmic microwave background radiation using the COBE satellite. 
  13. 7 min read Peering Homeward, 1972 By Laura Rocchio The scientists and engineers at NASA Goddard examining the first MSS images were looking at just one band of data, so the images appeared black and white to them. The image shows the area on that July 25, 1972 image that initially had them concerned that something was wrong with the imagery (an area in the Ouachita Mountains). NASA/USGS On July 23, 1972, the first civilian satellite designed to image Earth’s land surfaces was launched from Vandenberg Air Force Base in California. On board the satellite, originally named the Earth Resources Technology Satellite (ERTS), but later known as Landsat 1, were two sensors. The primary sensor, called the Return Beam Vidicon (RBV), used three shuttered cameras to take photographs; the secondary sensor, the Multispectral Scanner System (MSS), was an experimental instrument. Both sensors were packed onto a “butterfly-shaped” spacecraft repurposed from the successful Nimbus weather missions. There were strict size and weight limitations for the sensors, especially the experimental MSS, which weighed less than the primary RBV sensor and the data recorder. (At over 150 pounds, the data recording system onboard was the biggest recording device ever orbited.) A color composite (MSS bands 6,7,5) showing the first cloud-free land image acquired by the Landsat 1 multispectral scanner system (MSS), on July 25, 1972, including the Ouachita Mountains in southeastern Oklahoma. The dark stripe above the image center results from several dropped MSS scanlines. NASA/USGS The MSS technology was a novel way of looking at Earth. It used a scanning mirror to build up an image pixel-by-pixel with six scan lines sweeping across the satellite’s ground path 13.62 times per second as the satellite hurtled around Earth at over 14,400 mph. 
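The scan geometry above can be cross-checked with simple arithmetic: six lines per mirror sweep at 13.62 sweeps per second must keep pace with the spacecraft’s motion over the ground. The ~79-meter along-track line spacing used below is the nominal MSS pixel size, an assumed figure not stated in the article:

```python
# Each mirror sweep lays down six image lines; the satellite must advance
# exactly six line-widths along its ground track per sweep, or the image
# would have gaps or overlaps.
LINES_PER_SWEEP = 6
SWEEPS_PER_SEC = 13.62
LINE_SPACING_M = 79.0    # nominal MSS along-track resolution (assumed)

ground_speed_m_s = LINES_PER_SWEEP * SWEEPS_PER_SEC * LINE_SPACING_M
ground_speed_mph = ground_speed_m_s * 3600 / 1609.344
print(f"implied ground speed: {ground_speed_mph:.0f} mph")
```

The implied ground speed comes out just over 14,400 mph, consistent with the orbital velocity quoted in the article, which shows how tightly the scan rate, line count, and pixel size had to be matched to the orbit.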
Because the MSS was the first civilian imaging scanner to orbit Earth, many of the scientists and engineers outside the small cadre of scanner enthusiasts wondered whether it would be able to successfully produce an image while traveling at such a high velocity. This made for a harrowing day when the first imagery was transmitted back to Earth two days after launch. A group of Landsat scientists and engineers gathered in the Landsat data processing facility at NASA’s Goddard Space Flight Center as the first MSS digital transmission was translated onto 70-mm film by an electron beam recorder and then displayed. As they watched the first imagery scroll by, they saw clouds, more clouds, and finally land… but the black and white image had irregular wavy lines on it. “It’s terrible. It has moiré patterns,” a technician lamented. Quickly those in the room figured out where the image was showing geographically—the Ouachita Mountain region of southeastern Oklahoma. Then the geologists in the room realized that they were seeing the curvilinear outcrops of the ancient mountains. Landsat 1’s Return Beam Vidicon (RBV) cameras, built by RCA. NASA Anxiety transformed into excitement. NASA geologist Nicholas Short, who had been unconvinced of the utility of land remote sensing for geology, turned to the NASA Deputy Associate Administrator for Space Applications and said, “I was so wrong about this. I’m not going to eat crow. Not big enough. I’m going to eat raven.” USGS cartographer Alden Colvocoresses, who had been cynical about any cartographically accurate data being collected with “a little mirror in space,” turned to his colleagues in the room and said simply, “Gentlemen, that’s a map.” To the surprise of many, it was the ride-along secondary instrument of Landsat 1, the experimental Multispectral Scanner System, that became the mission’s imaging powerhouse. 
The MSS instrument represented many “firsts.” It was the first space-based sensor to digitally encode and transmit Earth surface data, and the first Earth-observing instrument to obtain in-orbit calibration data, which meant it was the first instrument Earth scientists could use to make robust comparisons of changes to Earth’s surface over time and across geographies. It quickly proved itself better than the primary Return Beam Vidicon instrument—and a good thing, too, because just 15 days after launch a major electrical short associated with the RBV’s power-switching circuit caused enough problems that the RBV was shut down for the rest of the satellite’s mission. The MSS data’s accurate geometric fidelity made it a major cartographic tool, and the low sun angle of Landsat’s mid-morning acquisition time accentuated shadows of topographic features, making the images especially valuable to geologists; but many fields including agriculture, forest management and marine studies found the data useful. A diagram of a Multispectral Scanner System (MSS) instrument. NASA/Hughes Santa Barbara Research Center The Explorer 1 mission had begun U.S. forays into space, yet a striking realization that came from the space-bound missions that followed Explorer 1 in quick succession (Mercury, Gemini, Apollo) was that space offered a distinctive vantage point for observing our home planet. A few months prior to the Landsat 1 launch, Secretary of the Interior and Landsat champion Stuart Udall had explained to The New York Times, “I thought an Earth applications program was a perfect means of bringing the benefits of space back to Earth.” Once Landsat and its MSS instrument had proved themselves after launch, NASA Administrator James C. Fletcher confirmed Udall’s belief, remarking that Landsat was “a second giant stride for mankind” because of the new technology’s potential to improve the understanding of environmental issues. 
He went on to say that Landsat had a “profound effect on the thinking of the world, particularly on our approach to emerging problems of protecting our environment and maintaining the quality of life for all of Earth’s people…not just clean air and water, but clean land.” The First Space-Based GPS Satellite Tracking Experiment, 1982 On July 16, 1982, the fourth Landsat satellite—carrying “the most complex and pioneering Earth viewing instrument ever proposed for a NASA program” at the time—took to the sky. Nearly everything about this second-generation Earth observation satellite had been upgraded from its Landsat 1, 2, and 3 predecessors. In addition to an MSS sensor, Landsat 4 carried a second-generation Earth-observing sensor, called the Thematic Mapper or TM instrument. The TM, a more advanced version of the MSS, was only one aspect of the mission’s radical redesign. Artist’s concept of the Landsat 4 satellite in position for repair in the Space Shuttle cargo bay. NASA/Hughes Santa Barbara Research Center The Landsat 4 spacecraft was a custom-designed platform, not the repurposed Nimbus weather satellite platform used for the first three Landsats. But the mission requirements were many—the satellite was required to be Space Shuttle rendezvous-ready (for the concept of Shuttle-based repairs); to carry a large antenna (at the end of a long 12.5-foot *****) for communicating with NASA’s Tracking and Data Relay Satellite System (TDRSS); and to carry a GPS receiver. Schematic showing the Landsat 1 satellite in orbit and how the MSS used a scan mirror to build an image six lines at a time as it traveled over its ground path. NASA Landsat 4 was the very first civilian satellite to carry a spaceborne GPS receiver package and to use GPS signals to calculate its position. 
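Computing a position from ranges to satellites at known locations can be sketched as a toy two-dimensional trilateration (real GPS solves in three dimensions and must also estimate the receiver clock offset; the coordinates and ranges below are invented for illustration):

```python
import math

def trilaterate(anchors, ranges):
    """Solve for (x, y) from three known anchor positions and ranges.

    Each range gives a circle (x - xi)^2 + (y - yi)^2 = ri^2; subtracting
    the first circle equation from the other two cancels the quadratic
    terms, leaving two linear equations in the two unknowns."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21          # solve the 2x2 system by Cramer's rule
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Invented scenario: three "satellites" at known positions, a receiver
# at (30, 40), and the measured ranges derived from geometry.
sats = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
truth = (30.0, 40.0)
rng = [math.dist(s, truth) for s in sats]
print(trilaterate(sats, rng))  # recovers the receiver position
```

With only a partial constellation aloft, as described below, there were stretches of the orbit with too few satellites in view to form such a solution at all, which is why the Landsat 4 position errors grew between fixes.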
The concept of GPS was so new at this time that in Landsat 4 press communications, the acronym “GPS” had to be written out and described as “a new US Air Force satellite navigation system involving orbiting navigational satellites to triangulate the exact position of other satellites which require navigation information as part of their data communication to Earth Stations.” GPS receivers were used on both the Landsat 4 and 5 satellites to assess whether GPS could deliver more accurate position-location data than traditional methods (ground-predicted ephemeris, or mathematically modeled locations). GPS was in its infancy, and only 4 of the planned 24 GPS constellation satellites were in orbit at the time of Landsat 4’s launch, so there were often times during Landsat 4’s orbit when no GPS satellites were in range. Two researchers at NASA’s Goddard Space Flight Center, Howard Heuberger and Leonard Church, presented a paper on the Landsat 4 GPS navigation results demonstrating that GPS could establish Landsat 4’s position to within 50 meters, and its velocity to within six centimeters per second—when the GPS satellites were in view. Though these error margins grew exponentially when GPS satellites were out of reach (because of lapses between measurements), Heuberger and Church concluded that GPS was a good alternative for supplying onboard ephemeris to future spacecraft systems even before the full GPS constellation was in orbit.
Drawing showing a breakout diagram of the instrument’s individual components. NASA
The experiment was largely a success, but was deemed not ready for operational use. It was not until the launch of Landsat 8 in 2013—more than three decades after the Landsat 4 GPS experiment—that GPS receivers would become a routine part of Landsat spacecraft. For an exhaustive technical history of the Landsat program, see the new book: Landsat’s Enduring Legacy: Pioneering Global Land Observations from Space.
About the Mission: Landsat
This joint NASA-U.S.
Geological Survey program provides the longest continuous space-based record of Earth’s land in existence. Every day, Landsat satellites provide essential information to help land managers and policy makers make wise decisions about our resources and our environment. For over 40 years, the Landsat program has collected spectral information from Earth’s surface, creating a historical archive unmatched in quality, detail, coverage, and length. Landsat sensors have a moderate spatial resolution: you cannot see individual houses in a Landsat image, but you can see large man-made objects such as highways. This resolution is important because it is coarse enough for global coverage, yet detailed enough to characterize human-scale processes such as urban growth. Share Details Last Updated Feb 18, 2026 Related Terms Explorer Keep Exploring Discover More Topics From NASA Jet Propulsion Laboratory Earth Your home. Our Mission. And the one planet that NASA studies more than any other. Explore NASA’s History Get Your Daily Dose of NASA History Explorer 1 America’s first satellite, Explorer 1. America joined the space race with the launch of this small, but important spacecraft. View the full article
14. 4 min read My NASA Experience By Marcia J. Rieke The development of infrared detector arrays is intertwined with my experiences working on NASA projects. As an astronomer at a university, my interactions with NASA all start with a proposal in response to an opportunity. In 1983, near-infrared detector arrays were beginning to attract the attention of astronomers. At the suggestion of Nancy Boggess at NASA Headquarters, we wrote a proposal to the NASA Research and Analysis Program to obtain an array and test it. At the time, I was a member of the Infrared Astronomy Group working with George Rieke using a single light-sensing element (i.e., a 1-pixel array!) on ground-based telescopes, and I was only starting to become cognizant of astronomy opportunities with NASA. In this initial proposal, we wrote that the array we were contemplating acquiring from what was then called Rockwell International (now Teledyne Imaging Systems) would potentially be useful for infrared instruments on HST. We were not thinking of proposing such an instrument ourselves, as we were preoccupied with proposing an instrument for SIRTF, which was later renamed Spitzer. Our proposal was selected, and we purchased a 32×32 HgCdTe array (wow, a whole kilopixel!). Taking a device to the telescope where one could actually take an infrared picture, rather than creating a picture by scanning a single pixel back and forth, made me feel even happier than a kid in a candy store. Some of my colleagues called it my “toy” camera, but it was so much fun. I remember characterizing the performance of this array, since performance would be of obvious great importance if such arrays were to be used on future NASA missions. During testing, I discovered that the dark current of our first device was orders of magnitude less than what Rockwell had quoted.
This needed to be understood, because if my result was correct, then this class of infrared array would be a candidate for second-generation HST instruments. I called Rockwell and quizzed the staff about how they had measured the dark current on the array that they had sent us. The Rockwell test engineer explained that he had put a piece of aluminum foil over the dewar window to ensure that the array was in the dark. Well, that was the answer. Yes, the aluminum foil prevented visible light from entering the test dewar, but since it was at room temperature, it was emitting loads of infrared photons. Based on this discovery, we decided to propose a second-generation HST instrument, which eventually became “NICMOS.” As part of the development funding for that instrument, we moved all the way up to a 256×256 pixel array – 65.5 kilopixels, but still not even a 1-megapixel camera. As a result of my involvement in the early steps of working with HgCdTe arrays, I became the Deputy PI for NICMOS and became deeply involved in a NASA project. NICMOS was the first use in space of the style of near-infrared array that has now become the standard for infrared arrays. Near the end of my involvement with NICMOS, and before Spitzer was launched, another opportunity presented itself. People were discussing a “Next Generation Space Telescope” that would push the limits of detectability back to the first stage of galaxy formation. I replied to a letter soliciting members, and I set out to work on this new project. I stuck with it, responded to the Announcement of Opportunity in 2001, and this triggered a chain of events that has led to my being PI of the NIRCam instrument on the James Webb Space Telescope. The detector arrays in NIRCam are each 2048×2048 pixels (i.e., 4 megapixels), with the entire camera holding 40 megapixels, a long way from my first 1-kilopixel array camera! About the Author Marcia J.
Rieke is a professor of Astronomy at the University of Arizona and is the principal investigator for the near-infrared camera (NIRCam) on the James Webb Space Telescope. Rieke came to the University of Arizona (UA) in 1976 and has made seminal contributions to infrared astronomy. She has served as the deputy principal investigator on the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) for the Hubble Space Telescope, and as the outreach coordinator for the Spitzer Space Telescope. A fellow of the American Academy of Arts and Sciences, Rieke received her undergraduate and graduate degrees in physics from the Massachusetts Institute of Technology, Cambridge, Massachusetts. View the full article
15. 14 min read The Gestation of the Hubble By Nancy Grace Roman Looking through the atmosphere is like looking through a piece of old stained glass. The glass has defects that distort the image. The atmosphere also has defects that distort the image, but the defects in the atmosphere move, thus blurring the image as well. The glass is colored, so only some colors get through. Until the mid-20th century, that did not appear to be a major problem. Stars primarily radiated like black bodies, and their temperatures were such that their radiation came through the atmosphere and our eyes adapted to seeing it. The development of radio astronomy, as a result of the technology stimulated by World War II, proved that the universe was far more complex and far more interesting than the staid view in the visible. This made astronomers eager to detect colors that do not come through the atmosphere. In addition, the glass is dusty. The dust scatters light, making the background brighter and harder to see through. The molecules in the atmosphere also scatter light. This is why we cannot see stars in the daytime. It also keeps us from seeing the faintest stars at night. Finally, unlike the glass, the atmosphere shines faintly, making the faintest objects invisible from the ground. For these reasons, astronomers had been anxious for decades to put telescopes above the atmosphere, and they jumped at the opportunity provided by the opening of the Space Age. The first NASA astronomy missions hunted for high-energy radiation in the gamma ray and X-ray regions of the spectrum. These searches relied on techniques that had been developed over decades for the measurement of cosmic rays and for studying high-energy phenomena in laboratories. We knew from rocket observations that the Sun displayed interesting effects in the ultraviolet that changed continuously. This was an impetus behind the Orbiting Solar Observatories. Stellar astronomers were also interested in the ultraviolet.
Young, massive stars emit most of their energy in that region. In addition, the strongest and simplest lines of common, light elements are in the ultraviolet. Without observations of these lines, it was impossible to analyze the compositions of stars. This led to the development of the Orbiting Astronomical Observatories with their emphasis on the ultraviolet of stars. We were less interested in the infrared at that time, and detector technology was too primitive to make this region easily accessible. These instruments provided an exciting introduction to space astronomy, but astronomical objects are very distant. That makes them appear faint and tiny. A large mirror is required to collect enough light to analyze any but the brightest stars. The fineness of the detail that is discernible is a direct function of the size of the mirror. Thus, to take advantage of the dark sky and steady images above the atmosphere requires a large mirror. For decades, astronomers had longed for a large space telescope. In 1946, Lyman Spitzer wrote a short paper for the Rand Corporation describing the science that could be learned with a 4-meter telescope in space. This is generally considered the impetus for such a telescope in the U.S. From time to time, NASA asks the National Academy of Sciences (NAS) for advice on its science program. In the summer of 1962, the Academy assembled a group of scientists at the University of Iowa, dividing the group into various committees representing different areas of science, including one for astronomy. One astronomer had studied the characteristics of the Saturn rocket and determined that it could carry a 3-meter telescope. The entire astronomy committee jumped on the idea. That is what they really wanted. I thought that it was too early to start work on such a project. I knew how much trouble we were having trying to develop a satellite and instrumentation for a 6-inch telescope. This telescope was not successful until 1968. 
Thus, I essentially ignored the idea. At that time, NASA’s Langley Research Center (LaRC) was responsible for NASA’s human space program. Some of the engineers there jumped on the idea of developing a large, manned orbiting telescope. The NAS conducted another study in the summer of 1965. By this time, the astronomers only argued about whether the telescope should be in orbit or on the Moon. The latter would provide a stable base, making the telescope less sensitive to the motion of parts, and also provide a reference system for the pointing controls. Connected to a manned base, it could be used much as ground-based telescopes are used. There were also disadvantages with the Moon. Perhaps the most serious one was that it was unclear how soon such an installation would be feasible. The Moon appeared to be undesirably dusty. Moreover, its motion is complex, making the guidance difficult before modern computers were well developed. Nevertheless, the issue remained alive until the early 1970s. Several aerospace companies were intrigued by the LaRC idea and presented designs for a manned, large space telescope. This was the last thing astronomers wanted! Aside from the fact that research had not been done by a person looking through a telescope for almost a century, with one small exception, a man needed an atmosphere, and that was what we were trying to get away from. In addition, a man would wiggle during long exposures and that would cause the telescope floating in orbit to wiggle in the opposite direction, blurring the image. I still thought it was too early to design a satellite for a 3-meter telescope, but decided that if companies were going to spend money designing such a satellite system, they might as well design a usable one. A major problem at this stage was to win the support of the general astronomical community, many of whom had no interest in observations from space. 
One facet of attacking this problem was to set up a working group under the auspices of the National Academy of Sciences (NAS) on the uses of a Large Space Telescope (LST), under the direction of Lyman Spitzer. The committee held an early meeting in Pasadena, California, to discuss the use of such a telescope for studies of galaxies, cosmology, and interstellar matter. Numerous West Coast astronomers attended the meeting, increasing their understanding of the possibilities and, hence, somewhat decreasing their antipathy. Although the members of the working group were supporters, the cachet of the NAS gave their report, which was published in 1969, special importance. I met with many astronomers to discuss the promise of a 3-meter telescope above the atmosphere. In addition, I gave many illustrated public talks on the questions that we expected such a telescope to answer, although I also emphasized that the most important results would be those we could not predict. The Astronomy Working Group that had been established to advise me on the entire astronomy program also started to discuss what was really needed for a successful LST and the engineering problems that required solution. By 1971, I assembled an LST Science Steering Group to work only on the LST. For this, I assembled a group of astronomers from all over the country, representing the various interests that could be served by a large space telescope, and some NASA engineers to sit down and outline a design that would meet the needs of the astronomers and that the engineers thought would be doable. Purposely, I included several who were not really enthusiastic about the project but whose science could benefit from the program. Together, we sketched the system that would become the basis for the Hubble. After about 2 years, a more detailed design was needed. NASA’s Marshall Space Flight Center was assigned the responsibility for turning our sketch into a design.
I maintained a general overview of the continued developments as program scientist, but Robert O’Dell was hired in September 1972 as the project scientist, with the detailed responsibility for keeping the scientific requirements at the center of the planning. At one point, there was a strong push to decrease the diameter of the mirror, probably to make use of facilities that existed for other purposes. We were asked to consider mirror sizes of 2.4 m and 1.8 m. A primary objective of the telescope was to determine the brightness of Cepheid variables in the Virgo cluster of galaxies. Hubble had shown that the velocity of recession of distant galaxies was proportional to their distance. However, the proportionality constant was uncertain by a factor of two. Galaxies have random motions. For distant galaxies, these random velocities are small compared with the velocity caused by expansion, but for nearby galaxies, the random motions overwhelm the general expansion. Moreover, the nearby galaxies are in a group in which they interact gravitationally. To determine the proportionality constant, it was necessary to determine the distance of a cluster of galaxies not interacting with nearby galaxies and distant enough that the random velocities are not significant on average. The nearest suitable cluster is the Virgo cluster of galaxies, at a distance of about 54 million light-years. Henrietta Leavitt had shown that the brightness of a particular class of variable stars, called Cepheids, was an accurate function of their periods of variation. We could calibrate this relation for Cepheids in the Milky Way galaxy. Thus, if we could observe these variables in the Virgo cluster, we could determine the distance of the cluster. Measuring the velocity of the expansion was easy. I and, independently, several others determined that with the available detectors, we could reach the Cepheid variables in the Virgo cluster with a 2.4 m mirror but that we could not do so with a 1.8 m mirror.
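The arithmetic behind that mirror-size argument can be sketched with the distance modulus and Hubble’s law. The numbers below (a bright Cepheid of absolute magnitude about −6, a Virgo recession velocity of roughly 1,200 km/s) are illustrative assumptions, not the figures Roman’s group used:

```python
import math

LY_PER_PC = 3.2616                       # light-years per parsec

def apparent_magnitude(abs_mag, distance_pc):
    # distance modulus: m - M = 5 * log10(d / 10 pc)
    return abs_mag + 5.0 * math.log10(distance_pc / 10.0)

d_pc = 54e6 / LY_PER_PC                  # Virgo cluster at ~54 million light-years
m = apparent_magnitude(-6.0, d_pc)       # an assumed bright Cepheid, M ~ -6
print(f"apparent magnitude ~ {m:.1f}")   # ~25th magnitude: hence the push for a big mirror

v_km_s = 1200.0                          # assumed mean Virgo recession velocity
h0 = v_km_s / (d_pc / 1e6)               # Hubble constant in km/s per Mpc
print(f"H0 ~ {h0:.0f} km/s/Mpc")
```

A 25th-magnitude star is hopelessly faint from the ground but within reach of a 2.4 m mirror above the atmosphere, which is why the Cepheid requirement set the floor on mirror size.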
Dropping the mirror diameter to 2.4 m also made the design of a satellite that would fit the space shuttle easier. As the early design developed, it was necessary to make a place for the project in the NASA plans. It was relatively easy to convince my superiors in NASA that such a telescope would be worth the cost. Convincing the political community, which had little understanding of science, was more difficult. James Webb, the administrator of NASA at that time, gave a series of dinners for men with political power. After each dinner, three of us presented a “dog and pony show.” Jesse Mitchell discussed the engineering and its feasibility, Dick Halpern presented the management plans, and I described the scientific research we expected to do with the telescope. I never testified before Congress, but I did write congressional testimony to justify the Large Space Telescope for about 10 years. I also pitched the case for the telescope to representatives of the Bureau of the Budget (now the Office of Management and Budget), the agency that prepares the budget the president sends to Congress. At some point, for political reasons, the word “Large” was dropped from the name, with the satellite simply becoming the Space Telescope until launch. In spite of these efforts, Congress continuously postponed approval for construction. Even after construction was started, Congress cut the budget below an optimum level. Of course, this increased the final cost of the mission. By the early to mid-1970s, astronomers organized major lobbying efforts. This finally led to the approval of the project. At one point, then-Sen. William Proxmire (D-Wisconsin), noted for ridiculing government funding that he considered frivolous, asked NASA why the American taxpayer should support an expensive telescope. I did a back-of-the-envelope calculation and determined that for the cost of one night at the movies, every American would have 15 years of exciting discoveries.
I was probably off by a factor of four or five, depending on how launch and servicing costs are allocated, but we shall probably have 25 years of discoveries. Even at a cost of a night at the movies once a year, which would more than cover costs by any accounting, I believe that most Americans believe that the expenditure has been worth it. At the time the Hubble was being designed, NASA was pitching the space shuttle as a cheap way to launch spacecraft. To lower the costs, a busy launch schedule was required. Therefore, all satellites were designed to be launched by the shuttle, and several were designed to be serviceable. The Hubble was scheduled to be launched on the next flight after the Challenger accident. That catastrophe cancelled all shuttle launches for 3 years, during which the satellite was kept in storage and a knowledgeable group of engineers was kept on the payroll until the 1990 launch. These 3 wasted years also added significantly to the cost of the mission. The Challenger experience caused NASA to rethink its use of the shuttle for most missions. Most payloads had to be redesigned for robotic launches. Fortunately, the Hubble was too far along to be changed. The ability to service it with the shuttle not only saved the basic mission after the mirror problem was discovered, but also provided the possibility of replacing instruments from time to time with more modern versions, thus greatly increasing the capability of the telescope. As mentioned earlier, I started funding development of detectors early in the program. A major portion of the funding for ultraviolet detectors went to Princeton University, which subcontracted to Westinghouse for the development of an intensified vidicon for the telescope camera. The Steering Group, and later the Working Group, assumed that this detector was already chosen. As the time approached for the selection of the scientific instruments for the telescope, I was dissatisfied with the progress on the intensified vidicon.
At a Steering Group meeting shortly before the selection of the instruments, I arranged a presentation of various types of detectors. Charge-coupled devices (CCDs) had clear advantages in resolution, sensitivity, and stability. These are arrays of tiny, solid-state elements (pixels), each sensitive to photons. At the conclusion of an exposure, the intensity recorded by each element is read sequentially down a column, and then the sums are read across. In this way, a map of intensity as a function of position (that is, a picture) is obtained. Commercial establishments were strongly interested in supporting their development. (They are the basis of the modern digital camera and are also used for TV cameras.) A problem is that a bare CCD is not sensitive in the ultraviolet. Nevertheless, as a result of this presentation, the Working Group decided to open the choice of detector for the camera. When a proposal from Jim Westphal solved the ultraviolet-sensitivity problem by coating the CCD surface with an organic substance that fluoresced in the visible when hit with ultraviolet light, the vidicon lost the competition. Many in the astronomical community were unhappy with NASA management of the Space Telescope. They wanted it in the hands of astronomers, with a management contractor, in the way that the National Optical and Radio Observatories were handled. This overlooked the fact that the scope of the LST construction and operation was far larger than that of the ground-based observatories. Nevertheless, there was one area in which the community’s insistence on operation by scientists was non-negotiable: the scientific management of the operation. This nearly cost me my job. Goddard badly wanted the scientific operation of the telescope. After considering this, I decided that it was much too big a job for the small astronomy group at Goddard, even if the astronomical community would have stood still for such an arrangement.
As a result, the scientific and astronomy leaders at Goddard talked Noel Hinners into transferring me to a different job. I decided that I did not want the other job and stayed put for a year or so. I took advantage of an early-out offer to retire in 1979, but continued for 9 months longer as the Space Telescope program scientist in order to participate on the Source Selection Board for the Space Telescope Science Institute, which would manage the scientific operations of the Space Telescope. I found this an interesting experience. There were five proposals, four of which based the Institute at Princeton University. The proposals from Associated Universities Incorporated, which managed the National Radio Astronomy Observatories, and from the Association of Universities for Research in Astronomy, which managed the National Optical Astronomy Observatories, were highly competitive, and the decision between them was difficult. The latter placed the Institute at Johns Hopkins University in Baltimore. Many people believed that it was selected because Baltimore is closer to Goddard. That has helped over time, but it did not enter our deliberations. I left the project before substantial management problems arose, leaving their solution to my successor, Ed Weiler. He also had to handle the discovery of the mirror problem. It was clear from his actions in these major fiascos that I had left the project in good hands. About the Author Nancy Grace Roman Nancy Grace Roman received her Ph.D. in astronomy from the University of Chicago in 1949. She joined NASA in 1959 and became the first chief of astronomy in the Office of Space Science, where she had oversight for the planning and development of programs including the Cosmic Background Explorer and the Hubble Space Telescope. Dr. Roman finished her NASA career at the Goddard Space Flight Center, retiring as manager of the Astronomical Data Center in 1979, and continued to work at Goddard as a contractor.
The first woman to hold a leadership position at NASA, Dr. Roman has been an advocate for women in the sciences throughout her career. View the full article
16. The Moon rises behind NASA’s Artemis II SLS (Space Launch System) rocket and Orion spacecraft atop a mobile launcher at Launch Complex 39B at NASA’s Kennedy Space Center in Florida on Sunday, Feb. 1, 2026. The Artemis II test flight will take Commander Reid Wiseman, Pilot Victor Glover, and Mission Specialist Christina Koch from NASA, and Mission Specialist Jeremy Hansen from the CSA (Canadian Space Agency), around the Moon and back to Earth. NASA/Ben Smegelsky
As NASA continues preparations for the Artemis II test flight, the agency will provide coverage Thursday, Feb. 19, of its next wet dress rehearsal, a fueling test of the SLS (Space Launch System) rocket, and hold a news conference on Friday, Feb. 20. Teams are counting down to the opening of a simulated launch window at 8:30 p.m. EST on Feb. 19, and the test could extend up to four hours. At 11 a.m. on Feb. 20, agency leadership will participate in a news conference to provide details about the outcome of the rehearsal. NASA participants include:
Lori Glaze, acting associate administrator, Exploration Systems Development Mission Directorate
John Honeycutt, chair, Artemis II Mission Management Team
Representative, Exploration Ground Systems
The agency will stream the news conference live on its YouTube channel. A 24/7 live stream of the rocket at the pad continues online, and NASA will provide a separate feed capturing wet dress activities and share real-time blog posts during the fueling day. Look for individual streams for these events to watch on YouTube. Learn how to stream NASA content through a variety of online platforms, including social media. This is the second wet dress rehearsal, following a previous rehearsal that concluded Feb. 3. Media previously credentialed for launch may join the news conference in person.
To participate virtually, media should contact the newsroom at NASA’s Kennedy Space Center in Florida no later than one hour prior to the beginning of the news conference at: ksc*****@*****.tld. As part of a Golden Age of innovation and exploration, Artemis will pave the way for new U.S. crewed missions on the lunar surface in preparation to send the first astronauts to Mars. To learn more about the Artemis campaign, visit: [Hidden Content] -end- Rachel Kraft / Jimi Russell Headquarters, Washington 202-358-1600 rachel.h*****@*****.tld / *****@*****.tld Tiffany Fairley Kennedy Space Center, Florida 321-747-8306 *****@*****.tld Share Details Last Updated Feb 18, 2026 Location: NASA Headquarters Related Terms: Artemis, Artemis 2, Humans in Space View the full article
17. 1 min read Commodity Classic 2026 Hyperwall Schedule Commodity Classic, February 25 – 27, 2026 Join NASA in the Exhibit Hall (Booth #3481) for Hyperwall Storytelling by NASA experts. Full Hyperwall agenda below.
WEDNESDAY, FEBRUARY 25
4:30 – 4:50 PM NASA: Your Space and Science Agency – Karen St Germain
THURSDAY, FEBRUARY 26
11:00 – 11:20 AM Informing Water and Agricultural Management Apps with NASA Modeling and Remote Sensing – Sujay Kumar
11:20 – 11:40 AM Turning Agricultural Needs into Satellite Solutions – Emily Adams
11:40 – 12:00 PM NASA’s Applied Remote Sensing Training Program (ARSET) – Brock Blevins/Sean McCartney
12:00 – 12:20 PM NASA Acres: Down to Earth Information for U.S. Agriculture – Alyssa Whitcraft
12:20 – 12:40 PM Farmers and NASA Acres Co-Create a New Farm Innovation Ambassador Team – Panel
FRIDAY, FEBRUARY 27
11:00 – 11:20 AM How Landsat Helps: Monitoring Crop Health & Water Use from Space – Allison Nussbaum
11:20 – 11:40 AM NASA Data for Drought Resilience in Alabama – Brent Roberts
11:40 – 12:00 PM Using NASA Data for Irrigation Management: The OpenET Farm and Ranch Management Support Tools – Forrest Melton
12:00 – 12:20 PM NASA’s AVAIL Program: Combining Multiple Perspectives on Agriculture to Support US Farmers – Alex Ruane
12:20 – 12:40 PM From the Ground Up with STELLA: Affordable Open-Source Handheld Instruments – Mike Taylor
View the full article
18. NASA/Zena Cardman Fishing boats illuminate the Arabian Sea along India’s west coast with green lights designed to attract squid, shrimp, sardines, and mackerel in this nighttime photograph from the International Space Station, orbiting 259 miles above Earth on Dec. 25, 2025. Studying nighttime light offers a unique perspective for investigations into human behaviors, such as tracking the expansion of urban areas or assessing power outages caused by natural disasters like hurricanes, as well as for biological and ecological studies of how artificial light influences nature. Crew members aboard the orbital lab have produced hundreds of thousands of images of the land, oceans, and atmosphere of Earth, and even of the Moon, through Crew Earth Observations. Their photographs of Earth record how the planet changes over time due to human activity and natural events, allowing scientists to monitor disasters, direct response efforts on the ground, and study phenomena from the movement of glaciers to urban wildlife. Image credit: NASA/Zena Cardman View the full article
19. 3 Min Read Mars Global Localization Pinpoints Perseverance’s Location PIA26705 Credits: NASA/JPL-Caltech
Description
These images were part of the first successful use of a new technology called Mars Global Localization, developed at NASA’s Jet Propulsion Laboratory. Using its navigation cameras, NASA’s Perseverance captured a 360-degree view of the surrounding terrain that was matched to orbital imagery, enabling the rover to pinpoint its location on Mars on Feb. 2, 2026, the 1,762nd day, or sol, of the mission. The navcam images were turned into an overhead view called an orthomosaic, forming a circle around the rover. In this animation, the orthomosaic is superimposed on imagery from NASA’s Mars Reconnaissance Orbiter (MRO). Contrast and hue have been enhanced to increase the visibility of terrain features, which align in the ground and orbital imagery. The rover took the five stereo pairs of navcam images in this relatively featureless location, dubbed “Mala Mala,” an area on the rim of Jezero Crater. The blank area in the upper right of the orthomosaic is where the back of the rover blocked the cameras’ view of the surrounding landscape. Mars Global Localization features an algorithm that rapidly compares panoramic navcam shots to MRO orbital imagery. Running on a powerful processor that Perseverance originally used to communicate with the now-retired Ingenuity Mars Helicopter, the algorithm takes about two minutes to pinpoint the rover’s location to within some 10 inches (25 centimeters).
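The comparison step can be pictured as sliding the rover’s small overhead mosaic across the orbital map and scoring each candidate offset. The sketch below does this with normalized cross-correlation on synthetic terrain; it is a toy illustration of map matching, not JPL’s flight algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)
orbital_map = rng.random((60, 60))            # synthetic "orbital" terrain
true_row, true_col = 23, 41
patch = orbital_map[true_row:true_row + 9, true_col:true_col + 9].copy()
patch += rng.normal(0.0, 0.02, patch.shape)   # rover's view: same terrain plus noise

def locate(template, search_map):
    # exhaustive normalized cross-correlation; returns the best-matching offset
    th, tw = template.shape
    t = (template - template.mean()) / template.std()
    best, best_score = (0, 0), -np.inf
    for r in range(search_map.shape[0] - th + 1):
        for c in range(search_map.shape[1] - tw + 1):
            w = search_map[r:r + th, c:c + tw]
            score = np.sum(t * (w - w.mean()) / w.std())
            if score > best_score:
                best, best_score = (r, c), score
    return best

print(locate(patch, orbital_map))             # recovers (true_row, true_col)
```

Normalizing each window makes the score insensitive to overall brightness and contrast differences between the ground-derived mosaic and the orbital image, which is why correlation-style matching tolerates different lighting and sensors.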
Like NASA’s previous Mars rovers, Perseverance tracks its position using what’s called visual odometry, analyzing geologic features in camera images taken every few feet while accounting for wheel slippage. As tiny errors in the process add up over the course of each drive, the rover becomes increasingly unsure about its exact location. On long drives, the rover’s sense of its position can be off by more than 100 feet (up to 35 meters). Believing it could be too close to hazardous terrain, Perseverance may prematurely end its drive and wait for instructions from Earth. After each drive comes to a halt, the rover sends a 360-degree panorama to Earth, where mapping experts match the imagery with shots from MRO. The team then sends the rover its location and instructions for its next drive. That process can take a day or more, but with Mars Global Localization, the rover can compare the images itself, determine its location, and roll ahead on its pre-planned route. Managed for NASA by Caltech, JPL built and manages operations of the Perseverance rover. JPL also manages MRO for the agency’s Science Mission Directorate in Washington as part of its Mars Exploration Program portfolio. View the full article
  20. 2 Min Read Perseverance Pinpoints Its Location at ‘Mala Mala’ PIA26704 Credits: NASA/JPL-Caltech Description Using its navigation cameras, NASA’s Perseverance Mars rover captured the five stereo pairs of images that make up this panorama on Feb. 2, 2026, the 1,762nd day, or sol, of the mission. A new technology called Mars Global Localization matched this 360-degree view to onboard orbital imagery from the agency’s Mars Reconnaissance Orbiter (MRO), enabling the rover to pinpoint its location on the Red Planet for the first time without human help. The rover is in a relatively featureless area dubbed “Mala Mala” on the rim of Jezero Crater. NASA’s Jet Propulsion Laboratory developed Mars Global Localization, which features an algorithm that rapidly compares panoramic navcam shots to MRO orbital imagery. Running on a powerful processor that Perseverance originally used to communicate with the now-retired Ingenuity Mars Helicopter, the algorithm takes about two minutes to pinpoint the rover’s location to within some 10 inches (25 centimeters). Like NASA’s previous Mars rovers, Perseverance tracks its position using what’s called visual odometry, analyzing geologic features in camera images taken every few feet while accounting for wheel slippage. As tiny errors in the process add up over the course of each drive, the rover becomes increasingly unsure about its exact location. On long drives, the rover’s sense of its position can be off by more than 100 feet (up to 35 meters). Believing it could be too close to hazardous terrain, the rover may prematurely end its drive and wait for instructions from Earth. 
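The drift described above, where tiny odometry errors compound until the rover's position estimate is tens of meters off, can be illustrated with a toy simulation. The error model and all numbers here are illustrative assumptions, not mission values:

```python
import random

def simulate_drive(steps, step_error=0.05, fix_interval=None):
    """Accumulate a random position error of up to `step_error` meters
    per imaging step; optionally apply an absolute localization fix
    (as Mars Global Localization provides) every `fix_interval` steps."""
    error, worst = 0.0, 0.0
    for i in range(1, steps + 1):
        error += random.uniform(-step_error, step_error)  # dead-reckoning drift
        if fix_interval and i % fix_interval == 0:
            error = 0.0  # absolute fix collapses the accumulated uncertainty
        worst = max(worst, abs(error))
    return worst

random.seed(1)
drift_only = simulate_drive(2000)
with_fixes = simulate_drive(2000, fix_interval=20)
print(f"worst error, odometry only: {drift_only:.2f} m")
print(f"worst error, with periodic fixes: {with_fixes:.2f} m")
```

Because a fix bounds how long errors can accumulate, the worst-case error in the second run typically stays far smaller, which mirrors why an onboard absolute fix lets the rover keep driving instead of stopping to ask Earth where it is.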
After each drive comes to a halt, the rover sends a 360-degree panorama to Earth, where mapping experts match the imagery with shots from MRO. The team then sends the rover its location and instructions for its next drive. That process can take a day or more. With Mars Global Localization, the rover can compare the images itself, determine its location, and roll ahead on its pre-planned route. Managed for NASA by Caltech, JPL built and manages operations of the Perseverance rover. JPL also manages MRO for the agency’s Science Mission Directorate in Washington as part of its Mars Exploration Program portfolio. View the full article
  21. 7 min read There is no GPS at the Red Planet, but a new technology called Mars Global Localization lets Perseverance determine precisely where it is — without human help. Imagine you’re all alone, driving along in a rocky, unforgiving desert with no roads, no map, no GPS, and no more than one phone call a day to tell you exactly where you are. That’s what NASA’s Perseverance rover has been experiencing since landing on Mars five years ago. Though it carries time-tested tools for determining its general location, the rover has needed operators on Earth to tell it precisely where it is — until now. A new technology developed at NASA’s Jet Propulsion Laboratory in Southern California enables Perseverance to figure out its whereabouts without calling humans for help. Dubbed Mars Global Localization, the technology features an algorithm that rapidly compares panoramic images from the rover’s navigation cameras with onboard orbital terrain maps. Running on a powerful processor that Perseverance originally used to communicate with the Ingenuity Mars Helicopter, the algorithm takes about two minutes to pinpoint the rover’s location within some 10 inches (25 centimeters). Mars Global Localization was first used successfully in regular mission operations on Feb. 2, then again Feb. 16. “This is kind of like giving the rover GPS. Now it can determine its own location on Mars,” said JPL’s Vandi Verma, chief engineer of robotics operations for the mission. “It means the rover will be able to drive for much longer distances autonomously, so we’ll explore more of the planet and get more science. And it could be used by almost any other rover traveling fast and far.” The upgrade is especially valuable given how well Perseverance’s AutoNav self-driving system has been working. 
Enabling the rover to re-plan its path around obstacles en route to a preestablished destination, AutoNav has proved so capable that the distance Perseverance can drive without instructions from Earth is largely limited by the rover’s uncertainty about its whereabouts. Now that it can stop and determine its exact location, Perseverance can be commanded to drive to potentially unlimited distances without calling home. Implementation of Mars Global Localization comes on the heels of another innovation from the Perseverance team: the first use of generative artificial intelligence to help plan a drive route by selecting waypoints for the rover, which are normally chosen by human rover operators. Both technologies enable Perseverance to travel farther and faster while minimizing team workload. This panorama from Perseverance is composed of five stereo pairs of navigation camera images that the rover matched to orbital imagery in order to pinpoint its position on Feb. 2, 2026, using a technology called Mars Global Localization.NASA/JPL-Caltech Beyond visual odometry Unlike on Earth, there is no network of GPS satellites in deep space to locate spacecraft on planetary surfaces. So missions — whether robotic or crewed — must come up with other ways to determine their location. As with NASA’s previous Mars rovers, Perseverance tracks its position using what’s called visual odometry, analyzing geologic features in camera images taken every few feet while accounting for wheel slippage. But as tiny errors in the process add up over the course of each drive, the rover becomes increasingly unsure about its exact location. On long drives, the rover’s sense of its position can be off by more than 100 feet (up to 35 meters). Believing it may be too close to hazardous terrain, Perseverance may prematurely end its drive and wait for instructions from Earth. “Humans have to tell it, ‘You’re not lost, you’re safe. Keep going,’” Verma said. 
“We knew if we addressed this problem, the rover could travel much farther every day.” The new technology called Mars Global Localization enables NASA’s Perseverance to pinpoint its location using an onboard algorithm that matches terrain features in navigation camera shots (the circular image, called an orthomosaic) to those in orbital imagery (the background). NASA/JPL-Caltech After each drive comes to a halt, the rover sends a 360-degree panorama to Earth, where mapping experts match the imagery with shots from NASA’s Mars Reconnaissance Orbiter (MRO). The team then sends the rover its location and instructions for its next drive. That process can take a day or more, but with Mars Global Localization, the rover is able to compare the images itself, determine its location, and roll ahead on its preplanned route. “We’ve given the rover a new ability,” said Jeremy Nash, a JPL robotics engineer who led the team working on the project under Verma. “This has been an open problem in robotics research for decades, and it’s been super exciting to deploy this solution in space for the first time.” The small team began working in 2023, testing the accuracy of the algorithm they’d developed using data from 264 previous rover stops. The algorithm compared rover panoramic photos to MRO imagery and correctly pinpointed the rover’s location for every single stop. How Ingenuity helped Key to Mars Global Localization is the rover’s Helicopter Base Station (HBS), which Perseverance used to communicate with the now-retired Ingenuity Mars Helicopter. Equipped with a commercial processor that powered many consumer smartphones in the mid-2010s, the HBS runs more than 100 times faster than the rover’s two main computers, which, built to survive the radiation-heavy Martian environment, are based on hardware introduced in 1997. 
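The core of the matching step, locating a small ground-derived image inside a larger orbital map, can be sketched as a template search. This minimal version scores candidate offsets by brute-force sum of squared differences in NumPy; JPL's actual algorithm is certainly more sophisticated, so treat this purely as an illustration of the idea:

```python
import numpy as np

def locate(template, search_map):
    """Return the (row, col) offset in `search_map` where `template`
    matches best, scored by sum of squared differences (lower is better)."""
    th, tw = template.shape
    mh, mw = search_map.shape
    best_score, best_pos = float("inf"), (0, 0)
    for r in range(mh - th + 1):
        for c in range(mw - tw + 1):
            patch = search_map[r:r + th, c:c + tw]
            score = float(np.sum((patch - template) ** 2))
            if score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Synthetic stand-in for an orbital map, with a "rover orthomosaic"
# cut from a known offset so the search has a ground truth to recover.
rng = np.random.default_rng(42)
orbital_map = rng.random((60, 60))
orthomosaic = orbital_map[20:30, 15:25]
print(locate(orthomosaic, orbital_map))  # recovers the true offset (20, 15)
```

Real matchers use correlation scores or feature descriptors that tolerate lighting and perspective differences between ground and orbital views, but the search structure — slide a template over a map and take the best-scoring offset — is the same.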
The Mars Global Localization algorithm runs on a fast commercial processor in the Helicopter Base Station — the upper, gold-colored box that was integrated into NASA’s Perseverance rover in a clean room. Perseverance used the base station to communicate with the now-retired Ingenuity Mars Helicopter.NASA/JPL-Caltech As a technology demonstration designed to test capabilities, the Ingenuity mission was able to risk employing more powerful commercial chips in the HBS and the helicopter even though they hadn’t been proven in space. It paid off: Expected to fly no more than five times, the rotorcraft completed 72 flights. The power of the HBS processor inspired Verma to look for ways the Perseverance mission might harness it. “It’s almost like a gift. Ingenuity blazed the trail, proving we could use commercial processors on Mars,” Verma said. Tapping into the HBS computer has had its challenges. To address reliability, the team developed a “sanity check”: The algorithm runs on the HBS multiple times before one of the rover’s main computers checks to ensure the results match. During testing, the team repeatedly found the rover’s position was off by 1 millimeter. They discovered damage to about 25 bits — a minuscule fraction of the processor’s 1 gigabyte of memory — and developed a solution to isolate those bits while the algorithm runs. Alongside the broader Mars Global Localization process, the team’s sanity check and memory solutions are expected to find new uses as faster commercial processors are employed in future missions. In the meantime, the team has already turned their sights to the Moon, where difficult lighting conditions and long, cold lunar nights make knowing exactly where spacecraft are located all the more critical. 
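The "sanity check" described above, where the algorithm runs several times on the fast commercial processor and one of the rover's trusted main computers verifies the answers agree, is a standard redundancy pattern for unproven hardware. The sketch below is a generic version of that idea; the agreement rule, run count, and tolerance are assumptions, not JPL's actual scheme:

```python
def checked_run(compute, runs=3, tolerance=1e-9):
    """Run `compute` several times on the fast (less trusted) unit and
    accept the result only if every run agrees within `tolerance`,
    as a reliable computer would verify before trusting the answer."""
    results = [compute() for _ in range(runs)]
    reference = results[0]
    if all(abs(r - reference) <= tolerance for r in results[1:]):
        return reference, True   # consistent: accept the result
    return None, False           # mismatch: flag a fault, fall back

# A deterministic computation passes the check.
value, ok = checked_run(lambda: 25.0)

# A run corrupted mid-sequence (say, by a damaged memory bit) is caught.
corrupted = iter([25.0, 25.0, 31.0])
bad_value, bad_ok = checked_run(lambda: next(corrupted))
print(ok, bad_ok)  # True False
```

The cost is running the computation multiple times, but on a processor 100 times faster than the radiation-hardened main computers, repetition is far cheaper than trusting a single possibly corrupted answer.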
More about Perseverance NASA’s Jet Propulsion Laboratory, which is managed for the agency by Caltech, built and manages operations of the Perseverance rover on behalf of NASA’s Science Mission Directorate in Washington, as part of NASA’s Mars Exploration Program portfolio. To learn more about Perseverance: [Hidden Content] News Media Contacts Melissa Pamer / DC Agle Jet Propulsion Laboratory, Pasadena, Calif. 626-314-4928 / 818-393-9011 *****@*****.tld / *****@*****.tld Karen Fox / Molly Wasser NASA Headquarters, Washington 202-358-1600 *****@*****.tld / *****@*****.tld 2026-012 Share Details Last Updated Feb 18, 2026 Related Terms: Perseverance (Rover), Jet Propulsion Laboratory, Mars, Planetary Science Division View the full article
  22. 3 Min Read I Am Artemis: Katie Oriti Katie Oriti manages the Orion European Service Module Integration Office, working closely with commercial and international partners to ensure the module is ready to safely support NASA’s Artemis II mission around the Moon. Credits: NASA/Jef Janis Growing up in rural America, Katie Oriti could only dream of working for NASA. Not because she wasn’t inspired by the dark, star-filled skies of her hometown Shelby, Ohio, but because it felt out of reach. “I think NASA was always in the back of my mind because I had an interest in space,” said Oriti, manager for the Orion European Service Module Integration Office. “It was something that felt unattainable, and I just didn’t think it was in the cards for me.” Oriti originally had her sights set on becoming a doctor. She studied mechanical engineering in college and minored in biomedical engineering, intending to apply to medical school. However, as graduation approached, she shifted course and applied to roles that sparked her curiosity. That leap led her to NASA’s Glenn Research Center in Cleveland as a support service contractor helping to build and maintain hardware for cryogenic testing, a process that exposes materials to extremely low temperatures. A career at NASA, previously a dream, suddenly felt real. Oriti became a civil servant, working as a thermal analyst for Orion, the spacecraft carrying astronauts to the Moon through NASA’s Artemis campaign. “I loved any opportunity I had to hear what was going on at the spacecraft and the program level. I knew if I wanted to grow and be part of that ******* conversation, I had to expand my knowledge base.” 
Katie Oriti Orion European Service Module Integration Office manager Mentors played a critical role at every step, helping her translate her technical skills into a leadership role. Oriti leads integration efforts for the European Service Module, the powerhouse for Orion, working closely with ESA (European Space Agency) and Airbus to ensure the module is ready to safely support NASA’s Artemis II mission around the Moon. She sets the framework for decisions and determines priorities to ensure her team has the resources they need to succeed. “I feel very privileged every day to lead this team,” Oriti said. “It’s the highest functioning team I’ve ever been a part of, and everyone on the team is an A-player. They know their stuff and are very dedicated to the mission.” Front row, from left, General David *********, Katie Oriti, and Artemis II crew members Jeremy Hansen and Christina Koch visit the stainless-steel vacuum chamber in the In-Space Propulsion Facility at NASA’s Neil Armstrong Test Facility in Sandusky, Ohio. This is the world’s only facility capable of testing full-scale upper stage launch vehicles and rocket engines under simulated high-altitude conditions. NASA/Sara Lowthian-Hanna Oriti is excited to support the Artemis II launch from the Launch Control Center at NASA’s Kennedy Space Center in Florida as part of the mission management team, providing expertise for the European Service Module. After launch, she will travel to NASA’s Johnson Space Center in Houston to support from mission control, assisting with program-level decisions and monitoring the European Service Module’s performance throughout the flight. “I think the flyby of the Moon will be awesome. It was cool when we did the powered flybys on Artemis I and came very close to the surface of the Moon, but now we’ll have crew members who will be looking out the window and able to tell us what they see.” 
Katie Oriti Orion European Service Module Integration Office manager While Oriti looks forward to the thrill of launch day, she’s even more inspired by the impact this mission could have on the next generation. She takes pride in knowing that she has become the role model she once searched for, showing others that a dream as big as aiming for the Moon can be within reach. Katie Oriti manages the Orion European Service Module Integration Office, working closely with commercial and international partners to ensure the module is ready to safely support NASA’s Artemis II mission around the Moon. NASA/Jef Janis About the Author: Jacqueline Minerd, Public Affairs Specialist Share Details Last Updated Feb 18, 2026 Related Terms: General, Artemis, Artemis 2, Glenn Research Center, Humans in Space, I Am Artemis, Missions, Orion Multi-Purpose Crew Vehicle, Orion Program View the full article
  23. Share Details Last Updated Feb 18, 2026 Editor Andrea Gianopoulos Location NASA Goddard Space Flight Center Contact Media Claire Andreoli NASA’s Goddard Space Flight Center Greenbelt, Maryland *****@*****.tld Christine Pulliam Space Telescope Science Institute Baltimore, Maryland Related Terms Hubble Space Telescope Astrophysics Astrophysics Division Dark Matter Galaxies Globular Clusters Goddard Space Flight Center The Universe Related Links and Documents The science paper by D. Li et al. University of Toronto Press Release
  24. The NASA Engineering and Safety Center (NESC) performed an assessment to characterize the effects of abnormal grain growth (AGG) within the metallic liner of a composite overwrapped pressure vessel (COPV). This effort focused on evaluating the mechanical response of the liner material, including the strain amplification factor (SAF), using a series of custom-designed coupons that incorporated both metal and composite overwrap. The study demonstrated that this approach is an effective and practical way to characterize strain localization under various conditions, and it showed strong correlation with modeling results. Additionally, preliminary investigations of phase coherence imaging (PCI), an ultrasonic technique, showed promise for detecting AGG microstructures, but further development is needed. Download PDF: Effects of Large Grain Size in Composite Overwrapped Pressure Vessel View the full article
  25. Earth Observatory A Second Cyclone Slams Madagascar February 10, 2026 For the second time in two weeks, a powerful tropical cyclone struck Madagascar. On January 31, Fytia battered the remote northwestern coast of the island with destructive winds and torrential rains that displaced thousands of people. Less than two weeks later, Gezani made a direct hit on one of the island’s largest cities before sweeping past areas that Fytia had just flooded. The MODIS (Moderate Resolution Imaging Spectroradiometer) on NASA’s Aqua satellite captured this image of Gezani as it neared Madagascar on February 10, 2026. At the time, the storm was undergoing rapid intensification. Its sustained winds peaked at 200 kilometers (125 miles) per hour before making landfall at Category 3 hurricane strength. According to meteorologists with the Joint Typhoon Warning Center, the storm developed amid conditions “highly favorable” to strengthening, including sea surface temperatures above 28 degrees Celsius (82 degrees Fahrenheit), wind shear below 20 kilometers (12 miles) per hour, and an unusually moist atmosphere. As the storm passed near Toamasina, Madagascar’s second-largest city, satellites that contribute to NASA’s IMERG (Integrated Multi-satellite Retrievals for GPM) product measured rain rates up to 4 centimeters (1.6 inches) per hour. The deluge caused widespread flooding in Toamasina and several other parts of the island. 
Preliminary damage assessments from Madagascar’s National Office for Risk and Disaster Management linked the storm to dozens of deaths, hundreds of injuries, and damage to more than 27,000 homes. Reports from news outlets and humanitarian groups described chaotic conditions in Toamasina, with widespread power outages, numerous collapsed roofs, and a lack of clean water. In a false-color image acquired on January 29, 2026, before the flooding, the Rianila and Rongaronga rivers merge near the town of Brickaville. River water appears dark blue against a bright green background of farmland and savanna forest. In a second false-color image, acquired on February 14, 2026, after the flooding, waterways appear much wider, and floodwater covers large portions of the landscape west of the two rivers, both north and south of Brickaville. NASA Earth Observatory / Lauren Dauphin The OLI (Operational Land Imager) on Landsat 8 captured the February 14 image of severe flooding near Brickaville, just south of Toamasina. For comparison, the January 29 image shows the same area before the storm. Villages and farmland along the Rongaronga River appear particularly hard hit. 
Crops commonly grown in this area include rice, vanilla, lychees, ****** pepper, cloves, and cinnamon, according to researchers from the French Agricultural Research Centre for International Development. Madagascar is one of the most cyclone-prone countries in Africa, with about six storms typically affecting the island each year and two making direct landfall. The cyclone season generally runs from November through April, with peak activity between January and March. NASA Earth Observatory image by Lauren Dauphin, using MODIS data from NASA EOSDIS LANCE and GIBS/Worldview and Landsat data from the U.S. Geological Survey. Story by Adam Voiland. References & Resources ACAPS (2024) Cyclone exposure and vulnerabilities. Accessed February 17, 2026. Associated Press (2026, February 12) Cyclone Gezani destroys 18,000 homes and causes at least 36 deaths in Madagascar. Accessed February 17, 2026. BNGRC (2026, February 16) Cyclone Gezani: Preliminary Assessment of Recorded Damage. Accessed February 17, 2026. BNGRC, via Facebook (2026) Posts. Accessed February 17, 2026. CIMSS Satellite Blog (2026, February 10) Cyclone Gezani makes landfall on Madagascar as a Category 3 storm. Accessed February 17, 2026. France24 (2026, February 11) Cyclone Gezani ravages Madagascar, leaving dozens dead and much of second city in ruins. Accessed February 17, 2026. Global Disaster Awareness and Coordination System (2026, February 17) Overall Red alert Tropical Cyclone for GEZANI-26. Accessed February 17, 2026. NASA Earthdata Tropical Cyclones. Accessed February 17, 2026. The Washington Post (2026, February 11) Death toll rises to 31 after Tropical Cyclone Gezani hits Madagascar and crushes houses. Accessed February 17, 2026. World Meteorological Organization (2026, February 13) Tropical cyclone Gezani hits Madagascar and threatens Mozambique. Accessed February 17, 2026. 
Yahoo News (2026, February 13) Madagascar cyclone death toll rises to 40, water, power still out. Accessed February 17, 2026. View the full article
