Building the Universe: The Holy Grail of Virtual Reality
With an estimated worth of almost $100 billion, the gaming industry has become a strong contender for the entertainment crown. But despite its accomplishments, a number of developers are beginning to wipe the board clean, stripping the process right back to the start in order to accomplish what has always been the ultimate programmer’s goal: a fully virtual world.
One such developer is Robert Sugar, a 34-year-old Hungarian with a ravenous enthusiasm for computer graphics. Having begun his career as a games developer, Sugar is currently attempting to push the boundaries of virtual possibility in his role as head researcher for Crocotta Research and Development.
Traditionally, Sugar tells me, virtual graphics are built using a mesh system that produces a ‘hollow shell’ onto which detail can then be laid: “The mesh system is surface-based, it’s a polygonised form,” he says in his thick caricature of an accent. “Why people did that in the past was because the hardware wasn’t quick enough. It’s going back to the ages when Doom and Quake came to the surface.”
This May saw the twentieth anniversary of Wolfenstein 3D, the first-person shooter that effectively invented the genre. Whilst the detail in modern gaming visuals makes Wolfenstein look more akin to an Etch A Sketch than an Xbox game, the underlying system modern games use is still remarkably similar.
Over the years developers have added increasing levels of detail, creating ever more tactile surfaces for a player to run-and-gun around in. But, as Sugar explains, there’s only so far you can take this before you hit the wall of limitations: “The next step in evolution is that you need to go beyond the surface. To do this you handle it in a similar way to the real world. Like matter.”
What if we wanted to create a virtual copy of the universe? That sounds crazy, but it’s our job.
Sugar shows me a programme he’s made in order to visualise what he’s trying to do. Whipping out a standard Dell laptop, he pops up on the screen a simple image of a purple cube, suspended in blackness and gently spinning. Out of the corner of his eye I see him snigger at my wholly unimpressed expression. “Let’s take a closer look,” he laughs.
Using the trackpad he zooms us towards the shape, the motion on the screen surprisingly cumbersome with the programme only running at a couple of frames per second. As it gets closer, I notice that the surface of the cube is not smooth, as it first appears to be, but fragmented. It’s in fact made up of millions of individual spheres, all so closely spaced that, from a distance, they seem as one.
Sugar then drives us into the middle of the cube, immersing us in this strange world of translucent balls, explaining that the most impressive feature of this programme is that he’s able to run it off such a basic set-up (for the well informed: around 20 million ray-traced objects on a single-core processor). Building objects using a particle-based system is something that Crocotta is focusing heavily on in its research. For what seems like an inordinate amount of time we fly through this realm of spheres whilst Sugar explains how an increasing number of industries are beginning to shift development towards this set-up, applying physical laws to particle representations in order to create a virtual reality that accurately simulates real life.
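The idea Sugar is demonstrating can be sketched in a few lines. The code below is a toy illustration (not Crocotta’s actual software): instead of modelling only a cube’s outer surface, it fills the whole volume with a regular grid of particle centres, exactly the kind of solid-through-and-through representation the purple cube uses.

```python
import itertools

def cube_as_particles(side, spacing):
    """Fill a cube of the given side length with a regular grid of
    particle centres, rather than modelling only its outer surface."""
    n = round(side / spacing)
    return [(i * spacing, j * spacing, k * spacing)
            for i, j, k in itertools.product(range(n), repeat=3)]

particles = cube_as_particles(side=1.0, spacing=0.1)
print(len(particles))  # 10 x 10 x 10 = 1000 particles for a coarse cube
```

Even at this toy scale the volume-versus-surface trade-off is visible: a mesh cube needs six faces, while the particle version needs the cube of its resolution, which is why Sugar’s single-core demo juggling tens of millions of objects is the impressive part.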
“This is actually really not a big thing,” Sugar teases. “What if we wanted to create a virtual copy of the universe? That sounds crazy, but it’s our job. This is one of the holy grails in science.” He stops and frowns at the screen, and then apologises. He has got lost in the cube.
Creating the universe sounds like a stretch, but Sugar seems unfazed by the task: “a lot of people believe we could do this in twenty to thirty years maximum.” I ask if he feels mimicking the totality of existence within twenty years is perhaps a little ambitious – especially as we just got lost in a relatively small box. He pauses for a moment before conceding: “Okay, let’s call it thirty.”
One of the main benefits of working with particle-based graphics is that, when the laws of physics are written into the programmes, the need for pre-determined paths is removed, so that programmers can sit back and watch events unfold. This gives scientists a high-probability insight into future events and takes us to places real-world exploration currently cannot reach.
Sugar recently witnessed one such programme – a real-time simulation of the merger of two galaxies, millions of years ahead of the present day. “They simulated over 200,000 stars,” he says, clearly impressed, “as well as the gravitational effects they have on each other.”
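The core of a simulation like the galaxy merger is a gravitational N-body step. The sketch below is a deliberately naive two-dimensional version (real astrophysics codes use tree or grid methods rather than these O(n²) pairwise forces, and the units here are arbitrary); it shows the principle Sugar describes, where no paths are scripted and motion simply falls out of the physical law.

```python
import math

G = 1.0  # gravitational constant in arbitrary simulation units

def gravity_step(positions, velocities, masses, dt):
    """Advance every body one time step under mutual Newtonian gravity
    (naive pairwise forces; no trajectory is pre-determined)."""
    n = len(positions)
    accels = []
    for i in range(n):
        ax = ay = 0.0
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - positions[i][0]
            dy = positions[j][1] - positions[i][1]
            r = math.hypot(dx, dy) + 1e-9  # softening to avoid blow-ups
            a = G * masses[j] / (r * r)
            ax += a * dx / r
            ay += a * dy / r
        accels.append((ax, ay))
    new_vel = [(vx + ax * dt, vy + ay * dt)
               for (vx, vy), (ax, ay) in zip(velocities, accels)]
    new_pos = [(x + vx * dt, y + vy * dt)
               for (x, y), (vx, vy) in zip(positions, new_vel)]
    return new_pos, new_vel

# two bodies: a heavy central mass and a light 'star' set moving past it
pos = [(0.0, 0.0), (1.0, 0.0)]
vel = [(0.0, 0.0), (0.0, 1.0)]
masses = [1.0, 1e-3]
for _ in range(100):
    pos, vel = gravity_step(pos, vel, masses, dt=0.01)
print(pos[1])  # the light body has been pulled into a curved path
```

Scale the same loop up to 200,000 stars and you have, in essence, the merger simulation Sugar saw, with the gravitational interactions between every pair of bodies doing all the choreography.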
Although the displays are only as accurate as the data inputted, they do allow scientists to test theories surrounding areas such as galactic interactions, events that we are normally unable to witness. With ever-increasing accuracy in physics models and algorithms, the potential for virtual reality’s role in space theory could increase exponentially, unlocking the secrets of Earth’s demise and, possibly, its birth.
But astrophysics is not the only branch of science beginning to explore the potential of virtual reality; practical implementation in the medical profession has also been on the rise in recent years. But the programmes in use are mainly of the old-fashioned variety – creating basic mesh environments in order to enhance cognitive development or crudely simulate operations for a trainee surgeon. “None of the medical tests, even in the experimental phase, are dealing with any kind of interaction beyond, say, the MRI image,” Sugar says, appalled at this level of primitive technology. “It’s just for seeing inside the body.”
A shaft of light filters down upon the box, catching the surface and bursting out into fragmented rays.
Take, for example, the tumour. Doctors can currently identify the types of cells and tissues that create a tumour, as well as the ones in the surrounding area, and identify with reasonable accuracy the physical, biological and chemical properties that define them. Sugar suggests that this information can be built into particle systems to benefit the patient through high-accuracy predictions: “We can run a little simulation to predict the growth of the tumour,” he says. Whilst this would not be 100% accurate, it could offer a significant improvement to current methods of growth prediction.
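To make the idea concrete, here is a toy cellular model of the kind of growth prediction Sugar has in mind. It is purely illustrative (not medical software, and far simpler than a real particle system): each occupied cell on a grid has some chance per step of seeding an empty neighbouring site, and the simulation is simply run forward to see how far the region spreads.

```python
import random

def grow_tumour(grid, division_rate, steps, rng):
    """Toy cellular model: each occupied site may seed one empty
    neighbouring site per step, so the occupied region spreads outward."""
    size = len(grid)
    for _ in range(steps):
        occupied = [(r, c) for r in range(size)
                    for c in range(size) if grid[r][c]]
        for r, c in occupied:
            if rng.random() < division_rate:
                empty = [(r + dr, c + dc)
                         for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                         if 0 <= r + dr < size and 0 <= c + dc < size
                         and not grid[r + dr][c + dc]]
                if empty:
                    nr, nc = rng.choice(empty)
                    grid[nr][nc] = True
    return sum(row.count(True) for row in grid)

rng = random.Random(0)  # fixed seed so the run is repeatable
grid = [[False] * 21 for _ in range(21)]
grid[10][10] = True  # a single seeded cell in the centre
print(grow_tumour(grid, division_rate=0.3, steps=20, rng=rng))
```

A real version would replace the coin-flip with measured cell properties – the physical, biological and chemical data doctors already collect – which is exactly the information Sugar proposes feeding into the particle system.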
Sugar boots up another programme; this one he is particularly excited about. It apparently tackles one of the bigger problems that developers like him face: light. He holds up a glass whilst it loads and explains how emulating photons is generally a complete mess, that “when the first light rays hit the surface, they scatter all around,” making simulation a nightmare. He laughs hard at this, at this most simple of complexities, before turning my attention back to the screen.
He points excitedly at another cube, this one dark blue in colour. He doesn’t zoom in on this one, but I can already see that all is not as it seems. The surface of the cube crawls with thick liquid-like slugs, constantly moving in what appear to be random three-dimensional motions. When two slugs come into contact they merge into each other before splitting off in unequal quantities. But this is not what he is excited about. From the top corner, a shaft of light filters down upon the box, catching the surface of the slugs and bursting out into fragmented rays.
The slugs are a construct of Sugar’s latest algorithm: the result of millions of particles, all full constructs, all existing, moving and interacting through programmed physical laws. At a glance they might not look like much, their jerky little motions drifting round the screen, but this is a building block, the DNA of something bigger. This is the start of true virtual reality, and it’s running on a Dell laptop.
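The scattering problem Sugar laughs about – one ray in, countless possible rays out – can be sketched with a single function. This is a standard diffuse-scattering trick from ray tracing, not Crocotta’s algorithm: when light hits a matte surface, the outgoing direction is drawn at random from the hemisphere above the surface, which is why simulating it faithfully means tracing enormous numbers of rays.

```python
import math
import random

def scatter_diffuse(normal, rng):
    """Pick a random outgoing direction in the hemisphere around the
    surface normal -- the 'scatter all around' behaviour that makes
    light so expensive to simulate (one ray in, many possible out)."""
    while True:
        # rejection-sample a point inside the unit sphere
        x, y, z = (rng.uniform(-1.0, 1.0) for _ in range(3))
        if x * x + y * y + z * z <= 1.0:
            break
    # flip the direction into the hemisphere the normal points into
    if x * normal[0] + y * normal[1] + z * normal[2] < 0:
        x, y, z = -x, -y, -z
    length = math.sqrt(x * x + y * y + z * z)
    return (x / length, y / length, z / length)

rng = random.Random(1)
d = scatter_diffuse((0.0, 0.0, 1.0), rng)
print(d[2] >= 0)  # the outgoing ray leaves the surface, never enters it
```

Every bounce multiplies the number of paths to follow, which is why the fractured shafts of light playing over Sugar’s slugs – computed on a laptop – are the part he is proudest of.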
LIAM DESROY is a writer, former editor of The Journal of Wild Culture and currently the Editor of Science Prophet. He lives in London.
Image Credit: Crocotta Research.