The Pittsburgh-based company Astrobotic, in partnership with Carnegie Mellon University, has won a NASA Phase II Small Business Innovation Research (SBIR) Award to develop this 4.4-lb. (2 kg) rover platform, capable of small-scale science and exploration on the Moon and other planetary surfaces.
There will be a 3-8 second delay between when the rover transmits data and when we receive it. In addition, the downlink is limited to 40 kbps, so a single image takes at least 10 seconds to transmit on top of that delay. Finally, our rover is very small: it may struggle to traverse the terrain and can easily lose sight of the lander.
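As a quick sanity check on those numbers, the cost of downlinking one image can be estimated directly. This is a back-of-envelope sketch; whether "KB" means 1000 or 1024 bytes, and any protocol overhead, are assumptions on my part:

```python
# Rough link-budget check using the figures quoted in the text:
# a 41 KB compressed image over a 40 kbps downlink, plus 3-8 s of latency.
IMAGE_BYTES = 41 * 1024   # compressed image size (assuming 1 KB = 1024 bytes)
LINK_KBPS = 40            # downlink rate, kilobits per second
LATENCY_S = (3, 8)        # one-way delay range, seconds

transmit_s = IMAGE_BYTES * 8 / (LINK_KBPS * 1000)
total_s = (transmit_s + LATENCY_S[0], transmit_s + LATENCY_S[1])

print(f"transmit: {transmit_s:.1f} s, "
      f"image-to-screen: {total_s[0]:.1f}-{total_s[1]:.1f} s")
```

That works out to roughly 8-9 seconds of pure transmission per image, consistent with the ~10 seconds quoted once overhead is included.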
The rover compresses and crops images of the lander to reduce the file size to 41 KB (about 10 s of transmission). It then measures a feature on the lander of known size and extrapolates from that to estimate the width of the camera's field of view (FOV) at the lander's distance. The FOV width yields an estimate of the rover's distance from the lander, which is then combined with known or estimated rotation data to find an absolute location.
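The distance step above can be sketched with a simple pinhole-camera model. This is an illustrative sketch only; the function name, the 1.0 m lander feature, and the camera parameters are assumptions, not values from the flight software:

```python
def estimate_rover_distance(known_width_m, measured_px,
                            image_width_px, focal_length_px):
    """Estimate rover-to-lander distance from a feature of known size.

    The pixel span of a known-size feature on the lander gives the pixel
    scale at the lander's range; scaling up to the full image width gives
    the FOV width there, and the pinhole model gives the distance.
    All parameters here are illustrative assumptions.
    """
    # Meters per pixel at the lander's distance
    m_per_px = known_width_m / measured_px
    # Width of the full field of view at that distance
    fov_width_m = m_per_px * image_width_px
    # Pinhole model: distance = focal length (px) * meters-per-pixel
    distance_m = focal_length_px * m_per_px
    return fov_width_m, distance_m

# Example: a 1.0 m lander feature spanning 200 px in a 1024 px-wide
# image, with a focal length of ~800 px (all hypothetical values)
fov, dist = estimate_rover_distance(1.0, 200, 1024, 800.0)
print(f"FOV width at lander: {fov:.2f} m, distance: {dist:.2f} m")
# → FOV width at lander: 5.12 m, distance: 4.00 m
```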
On Earth, feature recognition is performed on a series of two or more overlapping images sent back after each move the rover makes. From those features, a Simultaneous Localization and Mapping (SLAM) algorithm determines how far, and in what direction or rotation, the rover moved. Each move is chained onto the last to give a best estimate of the current location relative to the lander.
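The chaining of per-move estimates can be illustrated by composing 2D rigid-body transforms. This is a minimal sketch of the accumulation step only, assuming each SLAM output is a translation and rotation in the rover's own frame; it is not the SLAM algorithm itself:

```python
import math

def accumulate_pose(pose, move):
    """Compose one estimated move (dx, dy, dtheta, in the rover's body
    frame) onto the current pose (x, y, theta relative to the lander).
    Names and frame conventions are illustrative assumptions."""
    x, y, theta = pose
    dx, dy, dtheta = move
    # Rotate the body-frame translation into the lander frame, then add
    x += dx * math.cos(theta) - dy * math.sin(theta)
    y += dx * math.sin(theta) + dy * math.cos(theta)
    return (x, y, theta + dtheta)

# Start at the lander, facing along +x; apply three estimated moves:
# 1 m forward while turning 90 degrees, then 1 m and 0.5 m forward.
pose = (0.0, 0.0, 0.0)
for move in [(1.0, 0.0, math.pi / 2), (1.0, 0.0, 0.0), (0.5, 0.0, 0.0)]:
    pose = accumulate_pose(pose, move)
# pose is now approximately (1.0, 1.5, pi/2)
```

Because each estimate is stacked on the last, small per-move errors accumulate, which is one reason the lander-based absolute fix described earlier matters.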
These use cases cover 6 core features the tele-operations team will require to carry out a remote mission successfully.
Users can mark hazards or highlight areas of interest. The map updates in real time to show live progress as the localization software continuously estimates the rover's position.
We have a lot of work ahead of us. I will release updates as soon as they become available for public sharing. See you soon on the Moon!