Shuttle Commander, launching today on PS4, puts you in the pilot’s seat for some of America’s most important missions following Neil Armstrong’s first steps on the moon.
Now you can embark on a journey of discovery with this modern-age marvel and take part in the Hubble Space Telescope missions first-hand, with accurate recreations of the Hubble, its space missions, and the shuttle cockpit.
A passage of time and space
The year: 1990. For perspective: Windows 3, the debut of Photoshop, the first web page.
But 1990 also kicked off a series of events unlike any other. On April 24, the Hubble Space Telescope was launched aboard Space Shuttle Discovery from Kennedy Space Center. This launch marked one of the most significant advances in astronomy since Galileo’s telescope. It would allow astronomers and researchers to ponder questions that had never been asked, to find answers to the unknown.
For almost 30 years, the Hubble has sparked a sense of wonder, letting us explore the expansion of the universe and expand our exploration of it. Even after a lengthy development and problematic beginnings, the Hubble Space Telescope has left a solid legacy of ground-breaking discoveries and stellar imagery. And so begins our story: our reason for this virtual journey, our passage through time and space.
Metaverse: artwork, audio and animation
But just how do you create a universe inside a metaverse?
We knew we had an overwhelming task ahead of us. Recreating the servicing missions of the Hubble Telescope and its discoveries was going to be no mean feat for a start-up studio. After all, Shuttle Commander is based on one of the longest-serving and most successful spacecraft ever flown, as well as the most famous telescope. Accuracy and detail were very important to us.
Thankfully, NASA is very transparent and open to sharing its data, providing the public with a massive database which spans decades of the shuttle’s service.
Schematics, images, camera footage and thousands of pages of operations manuals were pored over to ensure we made as accurate a recreation as possible, building a solid foundation for our art, animation and audio departments to begin their work.
However, achieving that accuracy inside a virtual medium isn’t easy. We had to strike a balance, ensuring players have a smooth gaming experience while still delivering visuals as realistic as possible.
We found ourselves cutting back on some details we’d have preferred to keep, simplifying meshes and textures. Below is a simple example of our workflow, showing the various stages of creating our assets: research; modelling and texturing; and finally integration into the Unity game engine.
We also worked with the developers of the highly popular F-Sim to create an accurate physics-based model of the Space Shuttle landings. Players can land at real locations such as White Sands, Kennedy Space Centre or Edwards Air Force Base.
Variables such as weather, time of day, approach and turbulence can be adjusted by players, so that even seasoned pilots will be challenged to make a perfect landing. This detail and accuracy are crucial to making this experience as close to the real thing as possible.
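To give a sense of how that kind of adjustable landing scenario might be structured, here is a minimal, hypothetical sketch in Python. The class name, fields, units and the toy difficulty score are all illustrative assumptions, not the game’s actual data or API.

```python
from dataclasses import dataclass

@dataclass
class LandingScenario:
    # Illustrative fields only; the real game's variables are not public.
    site: str               # e.g. "Edwards Air Force Base"
    time_of_day: float      # local hour, 0-24
    wind_speed_kts: float   # surface wind speed in knots
    turbulence: float       # 0.0 (calm) to 1.0 (severe)
    approach_alt_ft: float  # altitude at which the player takes control

    def difficulty(self) -> float:
        """A rough, illustrative score: stronger wind, more turbulence
        and a lower handover altitude all raise the challenge."""
        wind_factor = min(self.wind_speed_kts / 30.0, 1.0)
        alt_factor = max(0.0, 1.0 - self.approach_alt_ft / 50000.0)
        return round((wind_factor + self.turbulence + alt_factor) / 3.0, 2)

calm = LandingScenario("Kennedy Space Centre", 9.0, 5.0, 0.1, 50000.0)
rough = LandingScenario("White Sands", 18.0, 25.0, 0.8, 20000.0)
```

Bundling the scenario into one value like this makes it easy to save presets, share challenges, or ramp difficulty for seasoned pilots.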
But it isn’t just the visual representations that make this experience feel authentic. As with the development of our other VR experiences, such as Apollo 11 VR and Titanic VR, we also focus on the quality of our audio presentation in order to fully immerse the user.
Although a musical score was originally drafted for this scene, it was discarded in order to give the user a more immersive and authentic experience.
The launch animation, which recreates in real time what astronauts experience aboard this remarkable spacecraft, uses 27 audio tracks alone. These include a washing machine, a hyperbaric oxygen chamber, several rattling gates and window frames, a respirator/dust mask, a lift in a department store, an underwater depth charge, a garage door opening, and a man doing push-ups.
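The idea of layering many recorded sources into one sound can be sketched very simply: each track is a stream of samples, and the mix is their weighted per-sample sum, clamped back into range. This is a generic illustration of audio layering, not the studio’s actual mixing tooling.

```python
def mix_tracks(tracks, gains):
    """Sum each sample across tracks (scaled by a per-track gain),
    then clamp so the combined layers stay in the valid [-1, 1] range."""
    length = max(len(t) for t in tracks)
    mix = []
    for i in range(length):
        s = sum(g * (t[i] if i < len(t) else 0.0)
                for t, g in zip(tracks, gains))
        mix.append(max(-1.0, min(1.0, s)))
    return mix

# Toy stand-ins for two of the 27 layers described above.
rumble = [0.4, 0.5, 0.6, 0.7]         # e.g. the washing-machine layer
rattle = [0.2, -0.3, 0.2, -0.3, 0.1]  # e.g. rattling window frames
mixed = mix_tracks([rumble, rattle], [0.8, 0.5])
```

In a real mixing session each layer would of course also be filtered, panned and automated over time; the clamp here simply guards against the summed layers clipping.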
As very few people have ever flown onboard a shuttle, we relied heavily on what on-board footage we could find. The best quality came later, during the digital age. The final launch of Atlantis, with its onboard cameras, was a key source. We also relied on the astronauts’ own accounts of the experience of a launch, from the effects on the body to the emotions they felt before, during and after.
The audio for the Hubble launch and service missions contains original NASA radio communications. In order to make the experience more user-friendly, we added additional voice-over. Special thanks to ‘Houston Mike’ here in Immersive VR Education.
Mike and our sound engineer Karen had to become familiar with the lingo and acronyms used by NASA while also trying to make it understandable for the user. This was the biggest challenge as we needed to research what every part was, where it went and what its function was. Hubble is an incredibly complicated instrument and repairing it was even more challenging.
The goal of our animation pipeline is to create realistic digital characters that can capture and express an actor’s performance.
The process starts in our scanning studio. We photograph the actor with our camera array. This series of photos is used to generate a 3D model. The actor goes through a defined series of poses capturing all the muscle movements necessary to recreate any facial expression. This can be a tiring process for the actor, but it’s worth it for the final effect!
The data is then passed to our character artists who use the set of 50-60 3D scans as the basis for their characters and morph shapes.
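The morph shapes described above can be illustrated with the standard blend-shape formula: an expression is the base mesh plus a weighted sum of each scanned pose’s offset from that base. The sketch below uses plain Python lists in place of real mesh data, and the pose names are purely illustrative.

```python
def morph(base, targets, weights):
    """Blend shapes: v = base_v + sum_i w_i * (target_i_v - base_v).
    `base` and each entry of `targets` are flat lists of vertex
    coordinates; `weights` holds one blend weight per target pose."""
    result = list(base)
    for target, w in zip(targets, weights):
        for i, (b, t) in enumerate(zip(base, target)):
            result[i] += w * (t - b)
    return result

neutral = [0.0, 0.0, 0.0]   # base (neutral) vertex positions, flattened
smile   = [0.0, 1.0, 0.0]   # a scanned "smile" pose
brow_up = [0.5, 0.0, 0.0]   # a scanned "brow raise" pose

# Half-weight on the smile, brow untouched.
half_smile = morph(neutral, [smile, brow_up], [0.5, 0.0])
```

Because each of the 50-60 scanned poses contributes only an offset from the neutral scan, poses can be mixed freely, which is what lets a captured facial performance drive any in-between expression.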
Once this is completed, we move to performance capture. In most scenes we simultaneously capture the actor’s facial and full-body motion. But this isn’t always possible; in those cases we opted for “face overs”, capturing the facial performance separately from the body motion.
Now that we have facial expressions and body motion, we can look at animating the character. This is where all the work of the character artists, actors, sound engineers and animators comes together, crafted into one seamless performance. At this stage we have a clear idea of how all the elements work together.
The final stage is integration and testing: optimising all the assets so that we strike a balance between performance and aesthetics, giving users an enjoyable, convincing experience.