Masters of Pie have teamed up with our pals at Lumacode to form team LumaPie after being chosen to participate in the exciting Big Data VR Challenge, a joint initiative from the Wellcome Trust and Epic Games. The event is designed to harness ‘some of the brightest minds in interactive entertainment’ to tackle an ever-growing problem: how to navigate, and find connections, trends and solutions within, the huge data sets being gathered every minute by industries around the world.
We were assigned the specific task of working with the researchers at ALSPAC, the Avon Longitudinal Study of Parents and Children, on their ‘Children of the 90s’ project. The study follows 14,000 children born in the 90s, along with their parents and offspring, until they reach the age of 70. This large-scale study records a vast amount of data, including everyday characteristics such as diet, lifestyle, socioeconomic status and parent-child contact, as well as tens of thousands of samples of urine, blood and DNA: a considerable amount of data to manage, search and study. Our challenge is to explore how Virtual Reality could aid and augment researchers’ use of this data set.
To kick off, we looked at the sample data provided and decided that our first task was to get this data (10,000 data points) into Unreal to see how it performed. It was not as easy as it sounded, and it took some creative thinking from Pascal Auberson before we had anything to visualise. Eventually, we had our 10,000 data points generating coloured cubes of varying scales (heights, in practice) representing different aspects of the data.
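For the curious, the mapping was roughly along these lines. This is a simplified, engine-agnostic C++ sketch; the field names, grid layout and normalisation are illustrative rather than the actual ALSPAC schema:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// One record from the sample set, reduced to the fields we visualise.
// Field names here are illustrative placeholders.
struct DataPoint {
    float value;     // drives cube height
    int   category;  // drives cube colour
};

struct CubeTransform {
    float x, y;      // position on a flat grid
    float height;    // vertical scale of the cube
    int   colourId;  // index into a colour palette
};

// Lay out N points on a square grid, scaling each cube's height by its
// field value (normalised to [0, 1]; clamped defensively here).
std::vector<CubeTransform> LayoutCubes(const std::vector<DataPoint>& points,
                                       float spacing, float maxHeight) {
    const int side = static_cast<int>(std::ceil(std::sqrt(points.size())));
    std::vector<CubeTransform> cubes;
    cubes.reserve(points.size());
    for (size_t i = 0; i < points.size(); ++i) {
        cubes.push_back({
            (i % side) * spacing,   // column
            (i / side) * spacing,   // row
            std::clamp(points[i].value, 0.0f, 1.0f) * maxHeight,
            points[i].category
        });
    }
    return cubes;
}
```

Feeding transforms like these into the engine is then a small step; something like Unreal’s instanced static meshes keeps the draw calls manageable at 10,000 cubes.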
Now that we were confident our sample data could be used to generate geometry, we turned to creative thinking and set to work designing a first pass at a VR interactive search tool. This is where we are at the time of writing, and the design will likely change as we prototype and tweak based on user feedback. It will be interesting for future me to look back at this and see how far we got in the seven weeks that follow. Meanwhile, I will explain our initial designs here, adding to this blog as we develop.
Big Data
The cube experiment proved that we could generate geometry from the data, but it also proved that lots of bits of geometry are hard to organise in a navigable way. So we looked to nature for inspiration; namely, DNA. Coiling is a fantastic way to pack more data into an otherwise straight line, making it a useful tool for organising our geometry. The 360-degree canvas offered by VR is another bonus we could exploit, so we looked at wrapping the data around the user.
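The coiling maths is pleasingly simple: each point gets an angle around the user and a height that creeps upward, so a straight run of data becomes a helix. A rough sketch (plain C++; the parameter names are our own and purely illustrative):

```cpp
#include <cmath>
#include <vector>

struct Position { float x, y, z; };

// Wrap `count` points into a helix around the user (assumed to stand at
// the origin). pointsPerTurn controls how tightly the data coils;
// turnHeight is the vertical gap between successive loops.
std::vector<Position> HelixLayout(int count, float radius,
                                  int pointsPerTurn, float turnHeight) {
    const float kTwoPi = 6.2831853f;
    const float step   = kTwoPi / pointsPerTurn;
    std::vector<Position> out;
    out.reserve(count);
    for (int i = 0; i < count; ++i) {
        const float angle = i * step;  // sweep around the user
        out.push_back({
            radius * std::cos(angle),
            radius * std::sin(angle),
            turnHeight * (static_cast<float>(i) / pointsPerTurn)  // slow climb
        });
    }
    return out;
}
```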
User Interface
We love making VR tools at Masters of Pie, so we drew upon our past experiments when designing the experience and user interface. We have previously used a look-based system that takes the user’s gaze direction and triggers secondary information to appear, in what we term ‘data peeling’. This lets us filter the view and avoid overwhelming the user with information at any given time. By adding a slight delay to this process, we can be fairly confident that the information only appears when you are genuinely interested in seeing more, rather than as annoying false hits from an overly sensitive system. Another aspect of our past work proved useful for the interaction design: the Razer Hydra 3D mouse. This has been one of the most accurate and intuitive input systems for VR that we have encountered so far (though this is a rapidly advancing area, so we are hopeful the hardware will continue to improve). The device lets researchers ‘point’ at, grab, move and click on even the smallest piece of geometry in the virtual environment, and should support fine control of data interactions.
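The dwell logic behind the peel is straightforward. A sketch of the idea (engine-agnostic C++; the half-second threshold is an illustrative guess, not our tuned value):

```cpp
// Tracks how long the user's gaze has rested on one item and fires the
// 'data peel' only after a dwell threshold, filtering out idle glances.
struct GazeDwellTrigger {
    int   focusedItem  = -1;    // id of the item under the gaze ray (-1 = none)
    float dwellSeconds = 0.0f;
    float threshold    = 0.5f;  // illustrative; tuned from user feedback

    // Call once per frame. Returns true on the frame the peel should open.
    bool Update(int itemUnderGaze, float deltaSeconds) {
        if (itemUnderGaze != focusedItem) {
            focusedItem  = itemUnderGaze;  // gaze moved: restart the clock
            dwellSeconds = 0.0f;
            return false;
        }
        if (focusedItem < 0) return false;  // looking at nothing
        const bool wasBelow = dwellSeconds < threshold;
        dwellSeconds += deltaSeconds;
        return wasBelow && dwellSeconds >= threshold;  // fire exactly once
    }
};
```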
The data collected by the study is vast and varied, meaning we needed a visual language that would work across different types of data: nothing too specific, and versatile enough to reuse. This part is ongoing and will likely evolve over the course of our prototypes, but the current system is based on spheres textured with heat maps and scaled according to their field value. The spheres are organised into ‘helix rings’ that slowly rotate around the user. The coil size and number are dynamic, depending on the sample count in the current search, with more rings added for very large result sets. The idea is that researchers can quickly ‘eyeball’ the data and make judgements about its values without scrolling down an endlessly long spreadsheet of numbers.
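In code terms, each sphere’s scale comes straight from its normalised field value, and the ring count grows with the result set. Something like this sketch (the per-ring capacity and cap are illustrative, not our tuned numbers):

```cpp
#include <algorithm>

// Scale a sphere between minScale and maxScale according to its field
// value, assumed pre-normalised to [0, 1].
float SphereScale(float normalisedValue, float minScale, float maxScale) {
    const float t = std::clamp(normalisedValue, 0.0f, 1.0f);
    return minScale + t * (maxScale - minScale);
}

// Decide how many helix rings to wrap around the user: one ring per
// `samplesPerRing` results, capped so the view never becomes a wall.
int RingCount(int sampleCount, int samplesPerRing, int maxRings) {
    const int needed = (sampleCount + samplesPerRing - 1) / samplesPerRing;
    return std::clamp(needed, 1, maxRings);
}
```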
Filters
Often researchers will begin without a clear idea of exactly what they are looking for, so our approach is to support this ‘meandering’ search technique by letting them filter their search and then repopulate the environment with the new results, which they can filter again and repopulate, and so on… This too will likely change over time, but our initial idea centres on a floor-based menu, again navigated with ‘data peel’ and/or the Hydra. The filters would be simple, such as ‘group similar’, ‘collapse search’ and ‘isolate variable’, allowing researchers to strip out unwanted data as they go, refine what is presented and ‘meander’ towards their target. In future we would like to test voice recognition for this menu search, to further simplify and speed up the experience. We will add that to the ‘need to test’ pile (a large but exciting pile, such as it is).
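Under the hood, this meandering loop is just repeated filtering of the current working set. A minimal sketch; the filter wording echoes the menu ideas above, but the types and signatures are our own invention:

```cpp
#include <functional>
#include <vector>

// A sample record, trimmed to illustrative fields.
struct Sample { int id; float value; };

using Filter = std::function<bool(const Sample&)>;

// Apply one filter to the current working set and return the survivors,
// which then repopulate the environment for the next round of filtering.
std::vector<Sample> Refine(const std::vector<Sample>& current,
                           const Filter& keep) {
    std::vector<Sample> next;
    for (const Sample& s : current)
        if (keep(s)) next.push_back(s);
    return next;
}

// Example round-trip: isolate a variable range, then refine again.
// std::vector<Sample> view = Refine(allSamples,
//     [](const Sample& s) { return s.value > 0.8f; });
// view = Refine(view, [](const Sample& s) { return s.id % 2 == 0; });
```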
Next steps
In a word – prototype. Using further data samples, we will test this concept by building as much of it as we need to get useful feedback. Read the Phase 2 update now.