Client: Urban Screens
Agency: None
Date: August 6, 2015
I was asked to create a visual representation of the International Space Station (ISS) for Science Week, to be shown on large public screens. It was to show the location of the ISS and give some idea of what was happening on board - a way of reminding people that the station, and the astronauts aboard doing research, were up there right now.

NASA provide a live data feed from the ISS via the internet, and with a bit of reverse engineering and some calculations involving orbital mechanics, it seemed possible to realistically show where the ISS was, along with a flood of other data.
I collaborated with UI designer Adam Ferns to create a functional interface that drew heavily on cinematic sci-fi UIs. It was important to us that every element was meaningful and not faked/random. I remembered being a (nerdy) kid and always being disappointed when I found out films or other media had cut corners with how they represented the world/science. It was easy to get it right, so why did they so often just make things up that were totally wrong? Now that I'm old(er) I understand story or budget trumping accuracy, but I decided to build the ISS Science Week visualiser for kids like I was, who wanted every flashing light and ticking number to be showing something real.
The information shown on screen includes the position and power generation of the solar panels, the stabilisation gyros, the velocity and position of the station, the cooling and life support systems, and the astronauts' daily schedule in realtime. The visualiser cycles through a few layouts with different info and graphs, which can be controlled by an operator or left on automatic. The pages share common elements so that the audience can tell what it is at a glance, but those who want to watch longer will see changing elements over time.
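To give an idea of the cycling logic, here's a minimal sketch of how an auto-advancing page rotation with an operator override might be structured in Unity. All names here are illustrative, not the project's actual code:

```csharp
using UnityEngine;

// Hypothetical sketch: pages rotate on a timer in automatic mode,
// or an operator steps through them manually with the arrow keys.
public class PageCycler : MonoBehaviour
{
    public GameObject[] pages;          // one root object per layout
    public float secondsPerPage = 30f;  // dwell time in automatic mode
    public bool automatic = true;

    int current;
    float timer;

    void Update()
    {
        // Operator can step pages at any time.
        if (Input.GetKeyDown(KeyCode.RightArrow)) Show(current + 1);
        if (Input.GetKeyDown(KeyCode.LeftArrow))  Show(current - 1);

        if (!automatic) return;

        timer += Time.deltaTime;
        if (timer >= secondsPerPage) Show(current + 1);
    }

    void Show(int index)
    {
        timer = 0f;
        current = (index + pages.Length) % pages.Length;
        for (int i = 0; i < pages.Length; i++)
            pages[i].SetActive(i == current);
    }
}
```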
The ISS had a good-quality video stream that ran on Ustream, delivering majestic shots of the earth with the ISS in the foreground. Unfortunately the stream was quite unpredictable - it would be offline when out of range of NASA's downlink dishes, and would also cut in and out at random (probably depending on other factors like interference from weather fronts). Ultimately, the sight of the earth spinning beneath the station was the most compelling visual element (and also the fastest way for people to recognise what the visualiser was showing at a glance). It would be a shame if it was a blue failed-connection box more than 50% of the time.
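Detecting those dropouts amounts to a simple watchdog. A hedged sketch, assuming a decoder that can report when a new frame arrives (the callback name is hypothetical):

```csharp
using UnityEngine;

// If the decoder stops delivering frames for a while, flag the feed
// as offline so the visualiser can fall back to the simulated camera.
public class StreamWatchdog : MonoBehaviour
{
    public float timeoutSeconds = 10f;
    public bool StreamIsLive { get; private set; }

    float lastFrameTime;

    // The video decoder would call this whenever a new frame arrives.
    public void OnFrameReceived()
    {
        lastFrameTime = Time.time;
        StreamIsLive = true;
    }

    void Update()
    {
        if (Time.time - lastFrameTime > timeoutSeconds)
            StreamIsLive = false;
    }
}
```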
Getting the livestream running in Unity3D was a struggle in itself (maybe one for another post), but I decided we needed a simulated camera to deliver similar shots. At first we considered a greatly simplified wire-frame view of the planet, but it lacked the visual impact of the actual video, so I decided to aim for photorealism despite the tight deadline.

Eventually, after a lot of research, I managed to get a high-res version of the globe working, with a cloud layer hovering over the terrain and an atmosphere shader providing accurate Rayleigh scattering (the effect that brings out the colours at sunrise/sunset, which you see pretty often on the ISS as it happens every 45 minutes). I also added a map of city lights at night, to give some realistic interest when on the dark side of the planet.
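As an illustration of the globe dressing (not the actual shader setup), a script like this could drift the cloud sphere and feed the sun direction to the earth material, letting a shader blend in the night-lights map on the dark side. The `_SunDir` property name is an assumption:

```csharp
using UnityEngine;

// Illustrative sketch: rotate the cloud layer slowly and pass the sun
// direction to a hypothetical earth material, whose shader lerps from
// the day albedo to the city-lights map where the surface faces away
// from the sun.
public class EarthDresser : MonoBehaviour
{
    public Transform cloudSphere;   // slightly larger sphere over the terrain
    public Transform sun;
    public Material earthMaterial;  // assumed to expose "_SunDir"
    public float cloudDriftDegPerSec = 0.2f;

    void Update()
    {
        // Clouds move a little relative to the surface for realism.
        cloudSphere.Rotate(Vector3.up, cloudDriftDegPerSec * Time.deltaTime);

        Vector3 sunDir = (sun.position - transform.position).normalized;
        earthMaterial.SetVector("_SunDir", sunDir);
    }
}
```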
Using my background in VFX, I also added some post-processing FX to match the look and quality of the livestream on the virtual camera. Most game programmers are horrified at the idea of intentionally blurring and obscuring a carefully rendered image, but in this case it makes the transition between real and simulated cameras less jarring. I also added a manual control so that, if the operator noticed the live video was down for a long period, only simulated cameras would be used.
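Combining the watchdog sketched earlier with that manual override, the switch itself could be as simple as this (the view objects are hypothetical):

```csharp
using UnityEngine;

// Sketch of the real/simulated camera switch, building on the
// StreamWatchdog sketch above.
public class CameraSwitcher : MonoBehaviour
{
    public StreamWatchdog watchdog;
    public GameObject liveView;       // quad showing the Ustream texture
    public GameObject simulatedView;  // virtual camera rig
    public bool forceSimulated;       // operator override for long outages

    void Update()
    {
        bool useLive = watchdog.StreamIsLive && !forceSimulated;
        liveView.SetActive(useLive);
        simulatedView.SetActive(!useLive);
    }
}
```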
I was originally using a vector-based system with a fixed height to determine the location of the ISS from the streaming data. Comparing my simulation with online trackers, I saw the accuracy was sub-par. To fix this, I dropped the NASA data stream and instead relied on the approach astronomers use to track satellites and other orbital bodies - TLEs (Two-Line Element sets). This approach relies on the predictable behaviour of an object orbiting in near-vacuum - we give the system a time, a position and a velocity for the object, then an algorithm gives an estimated position for any point in the past or future. It took a while to convert the required code into C#, but once I did, some exciting imagery popped out:
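The prediction itself is done by the SGP4 model that TLEs are defined against, which is far too long to reproduce here. Purely to give a flavour of the maths, here's a toy two-body (pure Kepler) propagation from the mean elements on a TLE's second line - a heavily simplified sketch, not the production approach, since it ignores drag and the earth's oblateness:

```csharp
using System;

// Toy two-body propagation from a TLE's mean elements. A real ISS
// tracker needs the full SGP4 model.
static class KeplerToy
{
    const double Mu = 398600.4418;               // earth GM, km^3/s^2

    static double Deg(string s) => double.Parse(s) * Math.PI / 180.0;

    // Returns an ECI position (km) dtSeconds after the TLE's epoch.
    public static double[] Propagate(string tleLine2, double dtSeconds)
    {
        // Fixed-column fields of TLE line 2.
        double inc  = Deg(tleLine2.Substring(8, 8));    // inclination
        double raan = Deg(tleLine2.Substring(17, 8));   // RA of ascending node
        double ecc  = double.Parse("0." + tleLine2.Substring(26, 7).Trim());
        double argp = Deg(tleLine2.Substring(34, 8));   // argument of perigee
        double m0   = Deg(tleLine2.Substring(43, 8));   // mean anomaly at epoch
        double n    = double.Parse(tleLine2.Substring(52, 11))
                      * 2.0 * Math.PI / 86400.0;        // mean motion, rad/s

        double a = Math.Pow(Mu / (n * n), 1.0 / 3.0);   // semi-major axis, km

        // Advance the mean anomaly, then solve Kepler's equation by Newton.
        double m = m0 + n * dtSeconds;
        double E = m;
        for (int i = 0; i < 10; i++)
            E -= (E - ecc * Math.Sin(E) - m) / (1.0 - ecc * Math.Cos(E));

        // True anomaly and radius, then rotate perifocal -> ECI axes.
        double nu = 2.0 * Math.Atan2(Math.Sqrt(1 + ecc) * Math.Sin(E / 2),
                                     Math.Sqrt(1 - ecc) * Math.Cos(E / 2));
        double r = a * (1.0 - ecc * Math.Cos(E));
        double u = argp + nu;                           // argument of latitude

        return new[]
        {
            r * (Math.Cos(raan) * Math.Cos(u) - Math.Sin(raan) * Math.Sin(u) * Math.Cos(inc)),
            r * (Math.Sin(raan) * Math.Cos(u) + Math.Cos(raan) * Math.Sin(u) * Math.Cos(inc)),
            r * Math.Sin(u) * Math.Sin(inc)
        };
    }
}
```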
After getting that first image, I found a better library that someone had built in C# for handling TLEs, and switched to that for ISS position prediction. The new approach was accurate enough that I could now confidently put a label on screen showing which country the ISS was over in each shot (pulled from a web geolocation API using the latitude/longitude worked out from the TLE).
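For illustration, here's roughly how latitude/longitude can be recovered from an ECI position like the one above - a geocentric (spherical-earth) approximation using the standard GMST formula; the country name then comes from whichever geolocation API you query with the result:

```csharp
using System;

// Sketch of ECI -> latitude/longitude for the country lookup.
// GMST gives the earth's rotation angle at the given time.
static class GroundTrack
{
    public static (double latDeg, double lonDeg) FromEci(
        double x, double y, double z, DateTime utc)
    {
        // Greenwich Mean Sidereal Time in degrees, from days since J2000.
        double d = (utc - new DateTime(2000, 1, 1, 12, 0, 0, DateTimeKind.Utc))
                   .TotalDays;
        double gmst = (280.46061837 + 360.98564736629 * d) % 360.0;
        if (gmst < 0) gmst += 360.0;

        double lon = Math.Atan2(y, x) * 180.0 / Math.PI - gmst;
        lon = ((lon + 540.0) % 360.0) - 180.0;   // wrap to [-180, 180)
        double lat = Math.Atan2(z, Math.Sqrt(x * x + y * y)) * 180.0 / Math.PI;
        return (lat, lon);
    }
}
```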
Although the ISS position is accurate in relation to the size and placement of the earth, I cheated a bit with the Sun. Originally I took a realistic approach whereby the distance to the Sun was accurate, but game engines are not designed to handle such huge scales. I got around this for the ISS by rendering the close-up station with a separate camera from the one rendering the earth. This gets rid of the "juddering" you would otherwise see as the numbers representing the ISS position are rounded to the nearest representable values rather than moving smoothly. Doing the same for the Earth and Sun seemed like overkill, so the sun is dangerously close to the earth - but luckily only about 3x the size of the planet.

The day/night effect of the sun is accurate. I was going to implement the Moon as well, but it turns out doing multiple layers of TLE prediction is harder than it looks, and it didn't seem worth taking the time out of the remaining schedule for the minimal visual impact.
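A minimal sketch of the two-camera layering trick described above, with illustrative layer names (not the project's actual setup): the station sits near the scene origin on its own layer and is drawn by a second camera on top of the far camera's view of the earth.

```csharp
using UnityEngine;

// Two-camera layering: the far camera renders the earth and sun, the
// near camera renders only the ISS model, drawn afterwards with its
// own depth buffer so it avoids far-scene precision problems.
public class LayeredCameras : MonoBehaviour
{
    public Camera farCamera;   // renders the earth, sun, atmosphere
    public Camera nearCamera;  // renders only the ISS model

    void Start()
    {
        farCamera.cullingMask  = LayerMask.GetMask("FarScene");
        nearCamera.cullingMask = LayerMask.GetMask("NearISS");

        // Draw the near camera after the far one, clearing only depth
        // so the earth stays visible behind the station.
        nearCamera.depth = farCamera.depth + 1;
        nearCamera.clearFlags = CameraClearFlags.Depth;
    }
}
```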
We had to cut back a bit of the visual flair to run on the PC available to us on-site (limited VRAM and GPU power meant the globe texture isn't as hi-res as I would like, and the post-processing is dialled back a little), but ultimately I'm very happy with this project. Adam and I had a shared visual treatment that worked as intended, I solved some tough but interesting problems, the product exceeded the client's expectations, and it was delivered on a spectacularly tight deadline.
Note: the lighting in the screenshots doesn't match the lighting in the live video, because the earth's rotation was set manually for these captures.