Aditya Madanapalle and Kavya Narayanan | Oct 01, 2019 16:50:28 IST
Editor’s Note: On 1 October, NASA celebrates 61 years since it opened for business (in 1958). The American space agency, apart from sending several hundred missions into space and orbit, also invests a lot of manpower, time and money in everyday technologies that are useful to the general public. On its 61st anniversary, we take a look at ten technological innovations brought about by research at NASA.
NASA showcases how its technologies benefit life on Earth in an annual volume called Spinoff. The 2017 edition of the catalogue highlights a series of technologies that have come out of NASA research and been put to Earthly purposes.
The books have been released every year since 1976. They catalogue an array of seemingly random technologies that NASA has engineered through its six-decade-long mission to understand and explore the Earth, the solar system and the universe. They range from self-driving tractors that use advanced GPS tracking to harvest food, to CMOS sensors used in action cameras, to cooling pipes for use in brain surgery.
“The stories published in Spinoff represent the end of a technology transfer pipeline that begins when researchers and engineers at NASA develop innovations to meet mission needs,” says Stephen Jurczyk, the associate administrator of the agency’s Space Technology Mission Directorate in Washington. “They are innovations that make people more productive, protect the environment, and much more.”
If you’ve ever wondered how the space race helps the human race directly, here are ten selected uses of NASA technologies on Earth that might offer an answer.
Self-driving tractors

A partnership between John Deere and NASA’s Jet Propulsion Laboratory has resulted in significant savings for farmers who use automated tractors in agriculture. While the world is only now seeing the first fully-automated vehicles on its roads, agricultural fields, where regulations are laxer, have been ploughed, turned and harvested by automated vehicles for over ten years.
Over this period, the accuracy of GPS guidance was improved from 30 feet to an inch. The tractors are used for planting seeds, distributing fertiliser, spraying pesticides and harvesting the crop.
The increased accuracy meant less overlap in the areas covered automatically, resulting in significant savings for farmers. The automated tractors can also take inputs from moisture sensors on the field and combine them with the GPS data in a process known as yield mapping, which tells farmers how much of their harvest comes from which part of the field. John Deere has since weaned itself off NASA's systems, and now uses its own to guide its tractors.
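The yield-mapping idea can be sketched in a few lines of Python. This is a simplified illustration, not John Deere's actual system: GPS-tagged harvest readings are binned into grid cells laid over the field, so the farmer can see which patch produced how much. The coordinates, cell size and harvest figures below are invented for the example.

```python
from collections import defaultdict

def yield_map(readings, cell_size=10.0):
    """Aggregate GPS-tagged harvest readings into a coarse yield map.

    `readings` is a list of (x_metres, y_metres, kg_harvested) tuples;
    each reading is binned into a square grid cell of `cell_size` metres.
    """
    grid = defaultdict(float)
    for x, y, kg in readings:
        cell = (int(x // cell_size), int(y // cell_size))
        grid[cell] += kg
    return dict(grid)

# Example: three readings, two of which fall in the same 10 m cell.
readings = [(3.0, 4.0, 120.0), (7.5, 2.0, 95.0), (12.0, 4.0, 110.0)]
print(yield_map(readings))  # {(0, 0): 215.0, (1, 0): 110.0}
```

A real system would, as the article notes, also fold in moisture-sensor readings, but the binning principle is the same.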
LiDAR for space and archaeology

NASA scientists have been using light detection and ranging (LiDAR) technology since the Apollo moon missions of 1971. A Canadian company called Teledyne Optech was contracted to engineer tiny, lightweight LiDAR devices for NASA’s use; one such device detected snow falling in the atmosphere of Mars. A breadbox-sized LiDAR is aboard the OSIRIS-REx spacecraft, which is surveying the asteroid Bennu for any clues it can find to the origin of life.
In modern-day archaeology, the small, portable LiDAR devices come in extremely useful to peer into the past of the planet without having to dig.
Research teams flew over suspected bison-hunting sites and scanned the ground with LiDAR devices. These instruments can identify features on the ground, peering through layers of thick vegetation to produce three-dimensional maps of natural and man-made topography. Archaeologists then went on foot to the corresponding features on the ground and looked for signs of pre-historic activity.
Where promising signs were found, scientists moved in to dig, uncovering bison bones and pre-historic artifacts associated with bison hunting and cooking. The legendary lost settlement of Ciudad Blanca in Honduras was pinpointed using LiDAR. Beyond archaeology, LiDAR is also useful in mapping natural disasters like wildfires and earthquakes, to see how structures on the surface were affected.
Heat pipes for brain surgery
NASA has been using heat pipes to dissipate heat since its earliest spacecraft. Any satellite that doesn’t rotate around its own axis and faces the sun much of the time is susceptible to a build-up of heat that can damage its electronic components. Copper pipes funnel heat away from the exposed side of the satellite to the back of the spacecraft, protecting the delicate equipment inside from damage.
More recently, NASA has been using heat pipes to passively cool fuel cells, which generate heat as they operate. Technology developed by NASA partner Thermacore is now used by surgeons in open brain surgery.
Open brain surgery involves electric bipolar forceps, which generate heat to cauterise the wound (burning part of the injured tissue to stop bleeding or prevent infection of healthy tissue). However, any excess heat can damage perfectly healthy brain tissue. Miniature heat pipes on the forceps carry this excess heat away, increasing the precision of the instruments and leading to better outcomes for brain-surgery patients.
Blood warmers are another important medical application of heat pipes. These devices warm blood evenly, without creating local hotspots, before it is administered to patients. In hospitals and medical institutes, such fluid warmers are used to prevent hypothermia in patients.
As the technology is improved and developed further, medical practitioners are finding increasing applications for these heat pipes.
Baby blankets with spacesuit-like insulation
Spacesuits are made using some of the most advanced materials in the world to insulate astronauts’ bodies against extreme temperatures. Technology originally developed for spacesuits is now also used to prevent newborn deaths in many developing nations, including India. These materials, called phase-change materials (PCMs), are no longer widely used to manage heat distribution in spacesuits, as more advanced insulating materials (like Mylar) have come along.
PCMs absorb heat when they are warm, turning from solid to liquid, and release heat in cold temperatures, in which they refreeze. While they didn’t catch on for use in spacesuits, the technology was licensed for commercial use, after which it was picked up by a Stanford University graduate to create inexpensive incubating blankets for infants.
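The heat-buffering effect comes from latent heat: while melting, a PCM soaks up energy at a near-constant temperature, and gives it back as it refreezes. A toy calculation, with made-up figures rather than the properties of Embrace's actual material, shows the scale:

```python
def pcm_heat_stored(mass_kg, latent_heat_j_per_kg):
    """Heat (in joules) a phase-change material absorbs while melting
    completely, and releases again as it refreezes."""
    return mass_kg * latent_heat_j_per_kg

# Hypothetical figures: 0.5 kg of a wax-like PCM with a latent heat
# of 200 kJ/kg buffers 100 kJ across each melt/freeze cycle.
print(pcm_heat_stored(0.5, 200_000))  # 100000.0
```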
A prototype baby incubator was developed by Stanford graduate Jane Chen, co-founder of a social enterprise startup called Embrace. The startup aims to help the millions of premature and low birth-weight babies born each year with low-cost infant warmers, which could go a long way in improving their odds of survival; the warmers cost a fraction of what traditional incubators do. Embrace was formed and moved to Bangalore, as India records among the highest numbers of infant deaths in the world.
After a round of funding from a Kickstarter campaign, Embrace now makes a range of products like swaddles, sleeping bags and blankets with PCMs at their core. The PCM in these products absorbs heat when babies are too warm, and releases heat when they get too cold. Embrace’s products are available on its website, and for every product purchased, an infant in the developing world gets one for free.
Radios to track airplanes from orbit
With the advent of space-based tracking systems, NASA developed a method of using radio signals to track every airborne plane without a single tracker going off the grid. This is, in part, due to a reconfigurable radio designed by NASA.
NASA initially hesitated to use reconfigurable radio technology on its satellites, since testing it is highly complex and the radio needs to perform in a variety of challenging situations. It is tedious to test radios for use cases that aren’t even known at the time the radio is being developed. That said, NASA has embraced reconfigurable radios as the future of space-based communication for the flexibility and potential they offer space missions, including the ability to return data at higher rates than previously possible. While reconfigurable radios are certainly superior for space communication, they are also used for another purpose: the worldwide tracking of aeroplanes.
The American company Iridium launched a constellation of low-Earth-orbit satellites providing communication and data services to the remotest parts of the Earth. But the effort was ill-fated in more ways than one.
“Surely you remember Iridium, Motorola Corporation’s USD 5 billion low-Earth-orbit debacle,” recalled a 2004 report in Air & Space Magazine. Plans for the satellite network (and the business plan to support it) were drawn up in the mid-1980s, and had grown rapidly archaic by the time the satellites were deployed in 1998. Until the company went bankrupt in August 1999, Iridium offered a brick-sized, USD 3,000 satellite phone with calling charges ranging from USD 6 to 30 a minute. The constellation’s coverage, though, was impressive, thanks to the satellites' well-choreographed dance in orbit.
The original Iridium constellation was 95 satellites strong, launched over twenty-two missions (nine in 1997, ten in 1998, one in 1999 and two in 2002). It has since been replaced by the new and upgraded Iridium NEXT constellation, the first satellites of which were launched in 2017. All of Iridium’s upgraded satellites are equipped with Harris AppSTAR reconfigurable radios, programmed to receive signals from satellite phones, pagers and integrated transceivers over the entire surface of the Earth, as well as from a new generation of aircraft transponders broadcasting ADS-B signals.
What this means is that in future, no aircraft should mysteriously go off radar and out of communication, as in the perplexing disappearance of Malaysia Airlines Flight 370.
A high-speed camera for the Orion capsule

For the landing test of the Orion crew capsule (still under development, to carry astronauts to the moon and back), NASA needed a camera that didn’t yet exist. The camera had to be housed in a rugged, self-contained casing that could survive the vacuum of space, a hot and turbulent fall through the atmosphere, and immersion in water. It also had to capture high-speed, high-resolution video to help scientists diagnose what went wrong, and how, if the mission were to fail.
NASA approached digital imaging firm Integrated Design Tools Inc (IDT), a manufacturer of cameras for industrial and scientific research, for the project. The camera technology developed for NASA’s Orion capsule is now used in IDT’s commercial cameras, which have seen a remarkable increase in sales as a result. Previously, high-resolution cameras saved data in volatile memory, where it could be lost if there was a disruption in the camera's power supply.
NASA couldn’t afford that risk, and IDT’s camera for Orion was developed to rectify the flaw. Ultimately, the camera could transfer 10 to 12 gigabits of data per second to a hard drive. This technology has yielded not just a powerful camera tailor-made for the task at hand, but also high-bandwidth pathways for other high-speed cameras to write data onto hard disks.
Rail monitoring sensors
To ensure the competitiveness of the USA’s helicopter industry, NASA launched an initiative called the Subsonic Rotary Wing Project in 2006. Under the project, contractors developed a sensor called RotoSense: a rotating vibration sensor meant to predict failures in helicopter transmissions. Wireless accelerometers constantly monitor the circular movement of the rotors and analyse their health and condition in real time. Three such accelerometers measure G-forces in three different directions. RotoSense can also be used on fast-moving parts of other vehicles, the wheels of trains, for instance.
By monitoring vibrations in the axle of a train, engineers can detect flaws in large stretches of the railway tracks on which the train moves. An American company, Ridgetop, repurposed the RotoSense sensor meant for helicopters into a product called the RailSafe Integrity Analysis System. The eventual aim of the device is to pre-emptively detect problems and predict failures before they happen, with long-term monitoring of these track vibrations.
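The principle behind such long-term vibration monitoring can be sketched as follows. This is a simplified illustration, not Ridgetop's actual RailSafe algorithm: accelerometer samples are grouped into windows, and any window whose root-mean-square (RMS) amplitude exceeds a threshold is flagged for inspection. The sample values and threshold are invented for the example.

```python
import math

def rms(window):
    """Root-mean-square amplitude of a window of accelerometer samples (in g)."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def flag_anomalies(samples, window_size=4, threshold=1.5):
    """Return the start indices of windows whose RMS vibration exceeds the threshold."""
    flagged = []
    for i in range(0, len(samples) - window_size + 1, window_size):
        if rms(samples[i:i + window_size]) > threshold:
            flagged.append(i)
    return flagged

# A smooth stretch of track followed by a burst of high vibration.
samples = [0.2, -0.3, 0.25, -0.2, 2.1, -2.4, 2.2, -1.9]
print(flag_anomalies(samples))  # [4]
```

In practice, the trend of such flags over months of running is what lets engineers predict a failure before it happens.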
Apart from preventing catastrophe, the sensors also reduce unnecessary maintenance and routine manual checks, thereby cutting costs. Devices like these could help prevent incidents like the derailment of the Patna-Indore Express.
Nanofibre water purifiers
Water scarcity is a global problem and one that continues to intensify with every passing year. Over a billion people in the world don’t have easy access to clean drinking water. Even in developed countries, water sources are getting contaminated or cut off during natural disasters, and groundwater is fast-depleting.
Some of the finest water filtration systems in the world were NASA-designed, for use in a different environment where water is quite scarce — space. Astronauts on the International Space Station consume water that is purified after it is collected from every available source (and yes, that includes sweat and urine).
NASA has collaborated with a number of companies to fund or transfer the technologies needed to create cheap, portable water purifiers. These efforts have resulted in solar-powered devices, hand-cranked devices, under-sink filters and filters for use in remote locations. The problem with traditional filters that remove viruses and bacteria was that water had to be pushed through tiny holes, a slow process. Nanofibre instead attracts the bacteria and viruses, allowing more than 10 litres of water to be filtered in under a minute.
Ski goggles that filter blue light

Human eyes are more sensitive to colours in the middle of the visible spectrum (the blues and greens) and less sensitive to colours at both extremities (like red and violet).
NASA Ames researcher Len Haslim was exploring ways to counter the effects of this phenomenon, called “centre-loading sensitivity”, which can interfere with visibility in situations like spotting targets or judging distances. He developed a simple, low-cost optical filter that allows colours such as red to stand out in a field of green, helping detect foreign objects in camouflaged environments such as forests. The lenses were specially developed to help spot targets and gauge distances better (presumably for the military), but they had another interesting application.
The lenses could reveal distress in plants, allowing them to be saved before their condition deteriorated. Robert Brock, an optical scientist, recognised the value of these lenses for other applications and founded the company now known as NASTEK.
Blue light is a hindrance to skiers when it comes to perceiving terrain. Ski goggles on the market today can therefore filter out between 79 and 80 percent of blue light. NASTEK’s lenses, however, manage a neat 95 percent blue-light filtration, and were first used in goggles by Optik Nerve in 2015.
Today, ski goggles come in different designs for different light conditions: low, intense and direct light. NASTEK’s ski goggles handle all these conditions equally well and are much cheaper than the competition.
CMOS image sensors

Would you believe it if I told you digital photography was conceptualised at NASA’s Jet Propulsion Laboratory alongside rockets and rovers, way back in 1960?
JPL hired charge-coupled device (CCD) expert Eric Fossum to work on digital cameras. Fossum believed that complementary metal-oxide-semiconductor (CMOS) technology could be turned into a commercially viable image sensor. CMOS sensors suffered from various issues, including noise. Fossum borrowed a CCD technique, measuring the voltage of a pixel before and after an exposure, to reduce this noise. The result opened up a range of commercial applications, including consumer cameras with low power requirements and low noise, capable of being housed in tiny casings.
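The noise-cancelling trick, known as correlated double sampling, can be illustrated with a toy example (the pixel values below are invented): reading each pixel before the exposure and again after it, then subtracting the two, cancels the per-pixel offset noise and leaves only the true signal.

```python
def correlated_double_sample(reset_frame, exposed_frame):
    """Subtract each pixel's pre-exposure (reset) reading from its
    post-exposure reading, cancelling per-pixel offset noise."""
    return [[exp - rst for rst, exp in zip(r_row, e_row)]
            for r_row, e_row in zip(reset_frame, exposed_frame)]

# Each pixel has a different reset-level offset; the exposure adds
# the same true signal (20 units) on top of each.
reset   = [[5, 7], [6, 4]]
exposed = [[25, 27], [26, 24]]
print(correlated_double_sample(reset, exposed))  # [[20, 20], [20, 20]]
```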
The CMOS sensors invented at NASA’s JPL are perhaps the most commonly used technologies to be spun off from NASA. These sensors are in action cameras, mobile phones, and dedicated cameras as well. Image sensors for automotive, surveillance and industrial applications also use CMOS sensors. The devices are small, light, and have a reduced energy footprint compared to other technologies. Cell phone cameras are the most commonly used cameras now, and these would not have been possible without CMOS sensors.
NASA's Spinoff 2017 book lists many more such space technologies that made it to practical applications on Earth. Readers can download the book as a PDF for free, view an HTML version, or even request a free physical copy from NASA. There is also a dedicated iPad app with shortened versions of the text, plus image galleries and associated video content.