Regulus Astrogazer for the Blind

Yuma Antoine Decaux
October 15, 2020
I first got a telescope for my birthday at age 7. It was a small one that magnified up to 50x, and it came with a book showing every planet of the solar system. I used to devour the pictures and descriptions of each planet and its satellites, the sun, all the information I could gather. The telescope, before ending up as a long-range reconnaissance tool I used to spy on neighbouring towns, gave me dreams and a fertile imagination. Who didn't get high with excitement upon first distinguishing Saturn's rings through the lens? I then spent hours mapping various constellations and important stars: Orion's belt, the North Star, Mars, and so on.

Today there are myriad tools, software packages and apps that provide a clear view of the sky with its stars and planets. They are probably really cool, well thought out and super inspiring for young astronomers, cosmonauts and space enthusiasts. Yet those tools just have not been conceived with accessibility for the blind in mind. So I took to the task of creating an app that covers this, if not for me, then for all the other blind users who share the same interest and curiosity about what shines above our heads 24 hours a day.

The app is still at a very early stage, but it can already fetch what is called an "ephemeris", that is, astronomical information about an object in the solar system, and place that object in a 3D environment: it takes your geo-location and uses your phone's accelerometer to orient you towards the object, emitting a tone whose frequencies are distinctive of that object.

Here is a brief description in audio. The 3D positioning is there, but you won't hear it well through the phone's speaker; it requires earphones.

My first task was to figure out what data I could gather to start my project. Thanks to Brisbane's code network and its many knowledgeable members, I was given a link to NASA's Horizons system, a telnet service providing up-to-date ephemeris information for both large and small bodies within the confines of the solar system. It is very comprehensive, providing coordinates and magnitudes for all the planets and their satellites, man-made satellites, spacecraft and even bits of spacecraft lost in the vacuum. The small-body database alone covers an impressive 720k+ objects, monitored with geodesic, velocity and time-stamped information. All of these bodies have cycles and rotations, velocities and other information that Horizons can output in tabular form, over periods spanning from millennia back in time to the far future.

To say the least, it was initially a bit intimidating, but once the commands were understood, pulling up information about a celestial body was quite interesting and informative.

It first gives an overview of the body being examined: dimensions, atmospheric pressure, main geochemical composition, temperatures, oscillations and so on.

It then asks you how you want the data fed, with various parameters which are prompted for input.

There are several methods for retrieving the data for selected bodies: via email, through the telnet service, or via an FTP service that generates the ephemeris data for later download.

Since my app uses this information to provide 3D sound in a spherical vector space, comparing the user's current location on Earth with the local coordinates of the target body, I needed to extract the information by scanning the output of the Horizons terminal and making sense of it. Finding local coordinates first requires opening a geo-location service available on the Mac or iOS, then feeding the result into the headless terminal session. There are also quite a few prompts to follow when pulling up info on a body: the reference frame, the type of table to output, whether to apply error corrections and deltas, the time span and increment for each snapshot, and whether the output should be Cartesian vectors, velocities, latitude/longitude, magnitudes or a mix of each.
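To give a feel for that scanning step, here is a minimal sketch of pulling Cartesian X/Y/Z components out of a Horizons vector table, whose data rows sit between the `$$SOE` and `$$EOE` markers. The regular expression and the sample row format are simplified assumptions; real Horizons output carries more fields (velocities, light-time and so on), and this is not the app's actual parser.

```swift
import Foundation

struct StateVector { let x, y, z: Double }

// Scan a Horizons VECTORS table: data rows live between $$SOE and $$EOE
// and look roughly like " X = 1.0E+08 Y =-2.0E+07 Z = 3.0E+06".
func parseVectors(from output: String) -> [StateVector] {
    let regex = try! NSRegularExpression(
        pattern: #"X\s*=\s*(-?[0-9.]+E?[+-]?[0-9]*)\s*Y\s*=\s*(-?[0-9.]+E?[+-]?[0-9]*)\s*Z\s*=\s*(-?[0-9.]+E?[+-]?[0-9]*)"#)
    var vectors: [StateVector] = []
    var insideData = false
    for line in output.components(separatedBy: .newlines) {
        if line.contains("$$SOE") { insideData = true; continue }
        if line.contains("$$EOE") { break }
        guard insideData else { continue }
        let range = NSRange(line.startIndex..., in: line)
        guard let match = regex.firstMatch(in: line, range: range) else { continue }
        // Capture groups 1-3 hold the X, Y and Z components as text.
        func component(_ i: Int) -> Double {
            Double((line as NSString).substring(with: match.range(at: i))) ?? 0
        }
        vectors.append(StateVector(x: component(1), y: component(2), z: component(3)))
    }
    return vectors
}
```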

It is a very well-thought-out system, and though some parts, by the system's own admission, require advanced math knowledge, the bulk of the information has to do with 3D vector spaces on spherical objects. So if you know anything about latitudes/longitudes or Right Ascension and Declination in astronomical terminology, and can convert one astronomical unit (the distance between the sun and our planet) and certain time-based proportionalities, you are doing well. More advanced knowledge is required when it comes to the change in occlusion bodies show during eclipses relative to one another, especially for smaller bodies, which absorb most of the light and require background spectrographic analysis.
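As a small worked example of those conversions (not taken from the app itself): Right Ascension and Declination, here both in degrees, map onto a Cartesian unit vector, which is the form a 3D audio engine ultimately consumes, and the astronomical unit is just a fixed scale factor.

```swift
import Foundation

// One astronomical unit (mean Earth-sun distance) in kilometres.
let kmPerAU = 149_597_870.7

// Convert Right Ascension / Declination (degrees) to a Cartesian unit vector:
// RA sweeps around the celestial equator, Dec lifts towards the poles.
func unitVector(raDegrees: Double, decDegrees: Double) -> (x: Double, y: Double, z: Double) {
    let ra = raDegrees * .pi / 180
    let dec = decDegrees * .pi / 180
    return (x: cos(dec) * cos(ra),
            y: cos(dec) * sin(ra),
            z: sin(dec))
}
```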


As usual, the app is written in Swift, and provides stable telnet communication with Horizons, with the following capabilities:

• Input and output streams using a modern implementation of CFStream: opening and closing connections, sending commands and parsing data

• Pulling up the list of all small and major bodies, with their information stored in CelestialBody.swift. This includes unique IDs, names and other descriptions.

• Pulling up the ephemeris data of a specific body

• Configuring parameters when retrieving ephemeris data

The CelestialBody.swift class holds most of the information on the body itself, and can be used to request its ephemeris. The coordinates are pulled out using pre-configured settings, but you can play with them and add more properties to the class file, such as introductory information about the body where it exists.
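A hypothetical sketch of the shape such a type might take, going by the description above; the property and method names here are illustrative, not the repository's actual API.

```swift
// Illustrative sketch of a celestial-body model; names are assumptions.
struct CelestialBody {
    let id: Int               // Horizons unique ID, e.g. 499 for Mars
    let name: String          // human-readable name
    var overview: String?     // optional introductory text about the body
    var ephemeris: [String] = []  // raw table rows returned by Horizons

    // The command string a telnet session would send to select this body.
    func ephemerisRequest() -> String { "\(id)\n" }
}
```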

The Telnet.swift class can be instantiated and assigned a delegate (ViewController.swift) to parse the information and place it where you need it.
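The wiring might look something like the following; this is an illustrative sketch assuming a simple delegate protocol, and the actual declarations in Telnet.swift and ViewController.swift may differ.

```swift
import Foundation

// Assumed delegate protocol: the session hands each chunk of output
// to whoever owns it, which then parses and places the data.
protocol TelnetDelegate: AnyObject {
    func telnet(_ telnet: Telnet, didReceive line: String)
}

final class Telnet {
    weak var delegate: TelnetDelegate?

    // Stand-in for data arriving on the input stream.
    func receive(_ line: String) {
        delegate?.telnet(self, didReceive: line)
    }
}

final class ViewController: TelnetDelegate {
    var received: [String] = []
    func telnet(_ telnet: Telnet, didReceive line: String) {
        received.append(line)   // parse the information and place it as needed
    }
}
```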

The Commands.swift file is a set of arrays used to identify where in the telnet session you are, and is limited to the principal commands of the Horizons service. A CommandType enum switches between requests when buffers are available for both input and output: Standby does nothing, while Connect, MB, SB, Ephemeris and EphemData step through the sequences for listing major bodies, listing small bodies, walking a selected body's ephemeris commands, and finally parsing the data output into a 3D vector format.
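A sketch of that state machine; the case names follow the post, but the transition order shown in `next` is an illustrative assumption, not necessarily how Commands.swift sequences them.

```swift
// Illustrative CommandType state machine; transitions are assumptions.
enum CommandType: Equatable {
    case standby    // idle: no request in flight
    case connect    // initial telnet handshake with Horizons
    case mb         // request the major-body listing
    case sb         // request the small-body listing
    case ephemeris  // walk the prompts for a selected body
    case ephemData  // parse the returned table into 3D vectors

    // Assumed next state once the current request's output buffer is consumed.
    var next: CommandType {
        switch self {
        case .connect:   return .mb
        case .mb:        return .sb
        case .sb:        return .ephemeris
        case .ephemeris: return .ephemData
        default:         return .standby
        }
    }
}
```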

Future development

After the first iteration, I would like to add a comet shower notifier: small bodies include comets and asteroids whose paths cross our planet's, most small enough to turn into shooting stars as they burn through the atmosphere. On receiving a notification, the user points up at the sky, and as the comets shoot past, audible 3D feedback illustrates the path of each object. This would require a server-side service that periodically scans for small bodies close enough to the planet, compares updated trajectory estimates and, given a high confidence level, pushes the notification to users within a given geo-location.

Furthermore, I would like to use another system that catalogues celestial bodies beyond the solar system, to map out the sky with constellations and with categories of systems, whether quasars, binary or multiple-star systems, black holes, galaxies or other far-out objects, to make the experience richer for blind kids and enthusiasts alike.

Finally, since this information is real, I have been thinking about a multiplayer exploration game for blind users, with stocks of resources placed wherever one is able to create them. It would be an awesome experience, and a catalyst for blind students in STEM subjects to fully participate in the space-age era, with flight paths set using real data, collision reports, gravitational phenomena and other dynamic systems within which to play, using audio only.

The API is available here

If you happen to use it for something interesting, please give us a nudge or contact us. We'd love to check it out.