A NASA engineer is developing software that would let a machine navigate across the lunar surface by recognizing features on the Moon's horizon, much as people use GPS on their smartphones to know where they are headed.
This "GPS for the Moon" will be called LunaNet.
Internet-like Features
NASA is developing a communications and navigation infrastructure for the Moon in collaboration with businesses and international organizations.
LunaNet will give the Moon "internet-like" features such as location services.
To stay safe when communication signals are unavailable, explorers in certain areas of the lunar surface may need overlapping navigation solutions drawn from different sources.
This innovative project was primarily driven by Alvin Yew, a research engineer at NASA's Goddard Space Flight Center in Greenbelt, Maryland.
Yew began with data from the Lunar Orbiter Laser Altimeter (LOLA) on NASA's Lunar Reconnaissance Orbiter. LOLA creates detailed topographic maps of the Moon and measures surface roughness and slopes.
Using LOLA's digital elevation models, Yew is training artificial intelligence software to recreate what the Moon's horizon would look like to an astronaut standing on the lunar surface.
By correlating known boulders and ridges in these computerized panoramas with features visible in photographs taken by a rover or an astronaut, the software can pinpoint the observer's location.
Conceptually, Yew explained, it is like going outside and trying to determine your location by looking at the horizon and nearby landmarks.
"While a ballpark location estimate might be easy for a person, we want to demonstrate accuracy on the ground down to less than 30 feet (9 meters). This accuracy opens the door to a broad range of mission concepts for future exploration," Yew said in a statement.
GIANT's Capabilities
To save memory, a handheld device could be loaded with only a local subset of LOLA's topography and elevation data.
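As a rough illustration of what that could look like, the sketch below (an assumption, not a NASA data format) carves a square tile out of a global elevation grid around a coarse position estimate, so only nearby terrain rides along on the device.

```python
# Illustrative sketch: store only a local tile of the global elevation grid.
# The array layout and tile size are assumptions, not a NASA data product.
import numpy as np

def local_tile(global_dem, row, col, half_width_cells):
    """Square subset of the global DEM centered on (row, col), clipped at the edges."""
    r0 = max(row - half_width_cells, 0)
    r1 = min(row + half_width_cells + 1, global_dem.shape[0])
    c0 = max(col - half_width_cells, 0)
    c1 = min(col + half_width_cells + 1, global_dem.shape[1])
    return np.ascontiguousarray(global_dem[r0:r1, c0:c1])

# Example: at 100 m grid posting, half_width_cells=3000 keeps roughly 300 km
# of terrain in each direction around the estimated position.
# tile = local_tile(global_dem, est_row, est_col, half_width_cells=3000)
```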
A lunar explorer can see at most about 180 miles (300 kilometers) from any unobstructed spot on the Moon, according to a study published by Goddard researcher Erwan Mazarico.
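That figure comes from an analysis of actual LOLA topography; as a back-of-the-envelope check (not Mazarico's method), the smooth-sphere horizon distance d = sqrt(2Rh + h^2) shows why kilometers of relief are needed for sight lines that long.

```python
# Back-of-the-envelope geometric horizon on a smooth sphere (not Mazarico's
# topography-based analysis): d = sqrt(2*R*h + h^2) for an observer at height h.
import math

MOON_RADIUS_M = 1_737_400.0

def horizon_distance_km(height_m):
    return math.sqrt(2.0 * MOON_RADIUS_M * height_m + height_m ** 2) / 1000.0

print(horizon_distance_km(2.0))      # eye level on flat ground: ~2.6 km
print(horizon_distance_km(2000.0))   # atop a 2 km rise: ~83 km
```

Sight lines approaching 300 kilometers are only possible when kilometers of elevation sit at both ends of the line, so the figure describes a best case rather than a typical view.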
Yew's positioning technology may even be useful to explorers on Earth if their route is in an area where GPS signals are blocked or interfered with.
Yew's geolocation tech will also make use of the capabilities of GIANT (the Goddard Image Analysis and Navigation Tool).
Unlike radar or laser ranging instruments, which pulse radio signals or light at a target and examine the reflected signals, GIANT rapidly and precisely analyzes photographs to measure the distances between, and to, visible landmarks.
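The sketch below is not GIANT's API; it only illustrates the underlying geometry, assuming a calibrated pinhole camera: the pixel positions of two landmarks convert to bearing vectors, and the angle between those bearings, combined with known landmark positions, constrains where the camera must be.

```python
# Illustrative only (not GIANT's API): angular separation between two landmarks
# seen by a calibrated pinhole camera.
import numpy as np

def pixel_to_ray(px, py, fx, fy, cx, cy):
    """Unit bearing vector in the camera frame for pixel (px, py), given focal
    lengths (fx, fy) and principal point (cx, cy), all in pixels."""
    v = np.array([(px - cx) / fx, (py - cy) / fy, 1.0])
    return v / np.linalg.norm(v)

def angular_separation_rad(p1, p2, fx, fy, cx, cy):
    """Angle between the bearings to two landmarks at pixel positions p1 and p2."""
    r1 = pixel_to_ray(*p1, fx, fy, cx, cy)
    r2 = pixel_to_ray(*p2, fx, fy, cx, cy)
    return float(np.arccos(np.clip(np.dot(r1, r2), -1.0, 1.0)))
```

Measuring several such angles to landmarks with known coordinates turns positioning into a triangulation problem, the same geometry the horizon-matching approach exploits.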
A portable derivative of GIANT, called cGIANT, is part of Goddard's autonomous Navigation Guidance and Control system (autoGNC), which offers mission autonomy capabilities for all phases of spacecraft and rover activities.
Future Moon explorers stand to benefit greatly from this project: one day, an AI-powered tool could help them find their way across the lunar surface.