5 DoF VR Movement System
As virtual, augmented, and mixed reality (VR/AR/XR) systems become more available and more widely used across sports, entertainment, business, medical, and educational industries, their movement systems need to accommodate the diverse accessibility needs of their users - from youth to the elderly, including those with disabilities or injuries.
Background
I created a custom VR movement system in Unity with 5 degrees of freedom: translational movement along all three axes and rotational movement in yaw and roll, with no way to control pitch directly. I created it as a way to move around a simple Unity landscape that I made as an assignment in one of my graduate courses at the University of Oregon. Although it was developed as part of the scene I used for the assignment, it had no relevance to the actual assignment - I created it because I am interested in novel movement systems for XR, and it seemed like a neat idea to try.
My desire to create this movement system stemmed from two thoughts. The first was that the standard VR movement scheme, which uses the joysticks on the controllers to navigate, is not very immersive and can often feel awkward. The second was that most VR games and experiences are designed with the assumption that users will be standing and able to move freely; this is not possible or comfortable for everyone or in all circumstances. These two issues compound each other, since joystick-based movement is generally steered by head rotation, and people sitting down cannot quickly snap around to look in every direction. My movement system was designed for users to be either sitting or standing.
This isn't the first custom VR movement system that I have designed. My previous design used the direction each controller was facing and applied a force in that direction proportional to how far the user pulled that controller's trigger. If they had both controllers facing straight forward with both triggers fully pulled, they would move forward at twice the speed of using one controller alone; if they had the controllers facing opposite directions with the triggers pulled an equal amount, they wouldn't move at all. While I think this movement system would work well for movement in 3D space (especially if I added the ability to go in reverse), I wanted a more unique system that would feel better in a game.
I based this movement system on the Umbaran fighter ships in Star Wars: The Clone Wars. In the show, these ships are controlled by the pilot's hand movements, and while there doesn't seem to be robust logic regarding which hand movements translate to ship movements, I thought it would be an interesting way of moving in VR and a fun challenge.
Before a user can move, they have to calibrate the controllers to a control point, which involves pressing one of the buttons on the top of the controller (A or X) once they have moved their hand into a comfortable starting position. They can recalibrate by pressing the other button on the top of the controller (B or Y). Once a hand is calibrated, moving that hand away from its control point will move the user in that direction. For example, moving a hand directly to the right of its control point will move the user to the right. However, I designed this with both hands in mind, so each hand only controls its respective side, allowing the user to rotate. If the user pushes both hands forward, they will move forward, but if one hand is in front of its control point and the other is behind, they will rotate. The same goes for up and down: moving the hands in opposite directions vertically will cause the user to roll. Since there are only two controllers, one for each side, only two rotational axes can be controlled - yaw (rotation about Y) and roll (rotation about Z) - leaving no direct control over pitch. If the user wants to pitch up or down, they have to use a combination of yaw and roll.
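To make that mapping concrete, here is a rough conceptual sketch of how the two hand offsets decompose into translation, yaw, and roll. It is an illustration only - the real system is physics-based, as described under Technical Details - and every name in it is hypothetical:

using UnityEngine;

// Conceptual sketch only - the real system applies physics forces at
// eight points around the ship (see Technical Details).
public static class HandMappingSketch
{
    // offsetL/offsetR: each hand's displacement from its calibrated
    // control point, expressed in the ship's local space.
    public static void Decompose(Vector3 offsetL, Vector3 offsetR,
        out Vector3 translation, out float yaw, out float roll)
    {
        translation = offsetL + offsetR; // hands agree -> move that way
        yaw  = offsetL.z - offsetR.z;    // one forward, one back -> turn
        roll = offsetL.y - offsetR.y;    // one up, one down -> roll
        // No pitch term: with one controller per side, nothing
        // distinguishes a pitch input from vertical translation.
    }
}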
Technical Details
The course assignment required us to implement SteamVR in our Unity scenes, so all of the code I wrote and the inputs I used are specific to SteamVR. Still, it would not be hard to convert it to Unity's XR Interaction Toolkit or a different XR package. The whole movement system is contained within a single script attached to a prefab. The prefab contains the model of the ship (which could be replaced with any model or removed entirely), the SteamVR Player prefab, a control point for each hand, models for the hands (small spheres, to make their positions easier to see), an empty game object marking where the player sits, and a set of eight empties marking where force is applied.
[Image: The ship that the user controls. Spheres were added to visualize the locations where force is applied.]
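As a rough sketch of how a script might reference those pieces, the fields below mirror the prefab's contents. The names are illustrative, not necessarily those in my actual script:

using UnityEngine;

// Hypothetical field layout mirroring the prefab described above.
public class ShipMovement : MonoBehaviour
{
    public Rigidbody ship;               // Rigidbody on the prefab root

    public Transform leftHand;           // small sphere models tracking the controllers
    public Transform rightHand;

    public GameObject leftControlPoint;  // calibrated "rest" position for each hand
    public GameObject rightControlPoint;

    public Transform[] leftForcePoints;  // 4 cube-corner empties on the left side
    public Transform[] rightForcePoints; // 4 cube-corner empties on the right side

    public float speed = 10f;            // shared movement/rotation speed
    public float translationBoost = 5f;  // extra translation-only speed
}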
The first thing users interact with in this movement system is calibration, which runs in Unity's Update function (called once every frame). I created GameObject variables in the script for the two control points, and then a boolean for each control point that tracks whether that hand is currently being calibrated. While a side is being calibrated, the position of its control point is continuously set to the hand's position. Pressing A or X (depending on the hand) sets that hand's boolean to false, locking the control point's position. When it is false, pressing B or Y sets it back to true, making the control point follow the hand again.
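A minimal sketch of that calibration loop might look like the following, assuming two SteamVR boolean actions bound to A/X (lock) and B/Y (unlock) in the SteamVR binding UI; the action and variable names are assumptions, not the script's real ones:

using UnityEngine;
using Valve.VR; // SteamVR Input

public class CalibrationSketch : MonoBehaviour
{
    public Transform leftHand, rightHand;
    public GameObject leftControlPoint, rightControlPoint;

    public SteamVR_Action_Boolean lockAction;   // bound to A / X
    public SteamVR_Action_Boolean unlockAction; // bound to B / Y

    private bool calibratingLeft = true;
    private bool calibratingRight = true;

    void Update()
    {
        // While a side is calibrating, its control point follows the hand.
        if (calibratingLeft)
            leftControlPoint.transform.position = leftHand.position;
        if (calibratingRight)
            rightControlPoint.transform.position = rightHand.position;

        // A/X locks the control point in place; B/Y releases it again.
        if (lockAction.GetStateDown(SteamVR_Input_Sources.LeftHand))
            calibratingLeft = false;
        if (unlockAction.GetStateDown(SteamVR_Input_Sources.LeftHand))
            calibratingLeft = true;

        if (lockAction.GetStateDown(SteamVR_Input_Sources.RightHand))
            calibratingRight = false;
        if (unlockAction.GetStateDown(SteamVR_Input_Sources.RightHand))
            calibratingRight = true;
    }
}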
The actual movement is physics-based, so I used the FixedUpdate function, which runs at a fixed time interval rather than every frame, at the same frequency as Unity's physics system, making the movement consistent even if the frame rate changes. I added a Rigidbody component to the prefab so I could apply physics to it, then created a variable in the script to access it. Subtracting each control point's position from its hand's position gives a vector for the distance and direction of each hand from its control point.

Using the Rigidbody function AddForce, these vectors could be used to apply force to the ship and move it. However, AddForce applies force at the center of the Rigidbody, which produces translational movement but never rotates the user. To get around this, I created eight empties around the user as force points, arranged at the corners of a cube. I then used AddForceAtPosition instead, applying the right hand's vector at the four right-side points and the left hand's vector at the four left-side points. Because each hand's force is applied off-center on its own side, the two hands can push unevenly and produce torque, which is what lets each hand control its side and allows the user to rotate.

For control over the speed of movement, I created a float variable that could be adjusted in the editor for quick changes and testing; when adding force at the force points, each hand's vector is multiplied by this speed. Testing different speeds showed that a single variable affecting both movement and rotation equally gave either a good movement speed with incredibly fast rotation, or a good rotation speed with sluggish movement. To fix this, I created a second adjustable speed variable and, at every force point, applied an additional force equal to the sum of the two hand vectors multiplied by it. Since this extra force is identical at all eight points, it produces no net torque: it increases translational speed without affecting rotational speed, allowing for a movement speed that feels good without the dizzying rotation.
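Putting those pieces together, the FixedUpdate step could look roughly like this sketch, reusing the hypothetical fields from the earlier snippets; it illustrates the approach rather than reproducing the actual script:

using UnityEngine;

public class MovementSketch : MonoBehaviour
{
    public Rigidbody ship;
    public Transform leftHand, rightHand;
    public GameObject leftControlPoint, rightControlPoint;
    public Transform[] leftForcePoints, rightForcePoints; // 4 empties per side
    public float speed = 10f;            // scales each hand's force
    public float translationBoost = 5f;  // extra translation-only speed

    void FixedUpdate()
    {
        // Each hand's displacement from its calibrated control point.
        Vector3 leftVec  = leftHand.position  - leftControlPoint.transform.position;
        Vector3 rightVec = rightHand.position - rightControlPoint.transform.position;

        // Apply each hand's force on its own side of the ship, so uneven
        // input produces torque (yaw/roll) as well as translation.
        foreach (Transform point in leftForcePoints)
            ship.AddForceAtPosition(leftVec * speed, point.position);
        foreach (Transform point in rightForcePoints)
            ship.AddForceAtPosition(rightVec * speed, point.position);

        // An identical extra force at every point cancels as torque, so it
        // raises translational speed without changing rotation speed.
        Vector3 boost = (leftVec + rightVec) * translationBoost;
        foreach (Transform point in leftForcePoints)
            ship.AddForceAtPosition(boost, point.position);
        foreach (Transform point in rightForcePoints)
            ship.AddForceAtPosition(boost, point.position);
    }
}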