Virtual Reality - Peter 2.0

The Project
In 2017, Dr Peter Scott-Morgan, a visionary roboticist, was diagnosed with motor neuron disease (MND). This degenerative illness would eventually paralyze him completely except for the movement of his eyes. However, Peter chose to fight back with technology and become Peter 2.0, the “world’s first human cyborg.”

The task at hand was to create a proof-of-concept (POC) application within two weeks that would demonstrate the idea of a virtual reality environment allowing Dr Scott-Morgan to create his own artwork. Peter wanted his virtual reality environment to be based on science fiction, and so the concept of an art gallery set in outer space was created.

The POC was then presented at a build-a-thon and shown to the BBC. During the build-a-thon, further research was conducted into how the application could be developed further and integrated with an AI.
The full virtual reality experience can be found here: https://vr-cyborg-studio-stable.glitch.me/
Storyboarding and Wireframing
Before developing the application, it was important to create wireframes in the form of a storyboard that would serve as a visual guide throughout the development process. There are three important things to specify when wireframing 3D VR interactions: Scene, Spatial and Affordances.

1) Scene - The scene refers to what the user will see once the VR headset has been put on.
2) Spatial - This refers to spatial elements within the scene, such as the distance of objects from the user, the direction the user will initially be facing, and whether objects are in the user's field of vision or in their periphery.
3) Affordance - Affordances refer to what the user is able to do within the environment, how much control they have and which elements are interactable.
Research was also conducted in order to put together metrics such as an ideal field of vision for positioning objects within the scenes.
Scene 1: The user starts off by spawning within a sci-fi-themed moon base. The user can look around and see the surrounding environment: the Earth, other planets in the distance, and the moon's rocky surface. A red button hovers in front of the user. Affordance - The user can interact with the floating button (a sketch of this scene follows the storyboard below).

Scene 2: Once the user interacts with the button, a virtual screen will appear in front of them, consisting of a UI that allows the user to select a number of images to combine in order to create a new work of art. Affordance - The user can interact with a number of images on the screen.

Scene 3: Once the required images have been selected, an arrow will appear; clicking it takes the user to the front of the moon base, where a final button can be clicked in order to view the new artwork. Affordance - The user can interact with an arrow in order to move in the direction it is pointing, in addition to a final floating button.

Scene 4: Once the final button is selected, a large screen will appear in front of the user displaying the newly created artwork. Affordance - None.
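To make the storyboard concrete, below is a minimal A-Frame sketch of Scene 1. The geometry, positions, colours and the start-button id are illustrative placeholders rather than the production assets; the gaze cursor uses the three-second fuse described under Outcome.

<html>
  <head>
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- Rocky lunar surface and a dark starfield sky -->
      <a-plane rotation="-90 0 0" width="200" height="200" color="#6b6b6b"></a-plane>
      <a-sky color="#000010"></a-sky>

      <!-- The Earth, placed far away but inside the initial field of view -->
      <a-sphere position="-20 15 -60" radius="6" color="#2a6fdb"></a-sphere>

      <!-- Floating red button roughly two metres in front of the spawn point -->
      <a-sphere id="start-button" class="interactable"
                position="0 1.6 -2" radius="0.15" color="#ff0000"></a-sphere>

      <!-- Gaze reticle: holding it on an interactable object for three
           seconds (the fuse) fires a 'click' event on that entity -->
      <a-camera position="0 1.6 0">
        <a-cursor fuse="true" fuse-timeout="3000"
                  raycaster="objects: .interactable"></a-cursor>
      </a-camera>
    </a-scene>
  </body>
</html>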
Outcome
The creation of the artwork would ultimately be operated by the AI; as a result, this POC only simulated that feature and contained pre-identified images that the user would select in order to create a new piece of art. The final outcome was a WebVR-based application written in A-Frame for the Oculus Go. Being only a proof of concept, the aim of the project was simply to portray how the idea could work. The Oculus Go, however, does not support eye-tracking technology, so the idea would have to be adapted to devices such as the Vive Pro Eye VR headset.

The POC featured a reticle at the centre of the screen that followed the user's head movements and allowed them to interact with numerous objects within the environment. Objects that could be interacted with would constantly flash red in order to notify the user. To interact with an object, the user would hold the reticle on it for three seconds.

Movement within the virtual environment was a concern, as Dr Scott-Morgan would not be able to operate a joystick or controller, limiting his freedom and overall experience. As a result, movement was achieved through the reticle, which would trigger an animation moving the camera towards the area it was focused on.
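As a rough illustration of these interactions, the sketch below shows a gaze reticle with a three-second fuse, an interactable object that flashes red, and a hypothetical go-to waypoint component that glides the camera rig towards a gazed-at location. The component name, ids and positions are assumptions for illustration rather than the project's actual code, and the page is assumed to load A-Frame as in the Scene 1 sketch above.

<script>
  // Illustrative waypoint component: when the user holds the reticle on this
  // entity for the fuse duration, the cursor fires 'click' and the camera rig
  // glides to the stored destination, replacing joystick-based locomotion.
  AFRAME.registerComponent('go-to', {
    schema: {type: 'vec3'},
    init: function () {
      var destination = this.data;
      this.el.addEventListener('click', function () {
        document.querySelector('#rig').setAttribute('animation__move', {
          property: 'position',
          to: destination.x + ' ' + destination.y + ' ' + destination.z,
          dur: 2000,
          easing: 'easeInOutQuad'
        });
      });
    }
  });
</script>

<a-scene>
  <!-- Interactable objects pulse red so the user knows they can be selected -->
  <a-box class="interactable" position="2 1 -4" color="#ff0000"
         animation__flash="property: material.color; type: color;
                           from: #ff0000; to: #550000;
                           dir: alternate; loop: true; dur: 800"></a-box>

  <!-- Gaze-selectable waypoint that moves the camera towards the gallery front -->
  <a-cylinder class="interactable" go-to="10 0 -15"
              position="4 0.05 -6" radius="0.5" height="0.1"
              color="#ff0000"></a-cylinder>

  <!-- Camera rig animated by go-to; the child reticle's three-second fuse
       stands in for a controller click -->
  <a-entity id="rig" position="0 0 0">
    <a-camera>
      <a-cursor fuse="true" fuse-timeout="3000"
                raycaster="objects: .interactable"></a-cursor>
    </a-camera>
  </a-entity>
</a-scene>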
"An inspiring journey of rebellion, resilience, and a man who rather than leave the one he loved chose to change the world instead."