
Surgeon Simulator

A graphic of a virtual reality environment. In the environment is a 3D modeled operating room, with a surgery table, three carts, an overhead light, and two monitors. There are also four interface panels with various buttons, dropdowns, and sliders.

Introduction & Purpose

I’m consistently captivated by emerging technologies, and virtual reality is currently the wild west for me. There’s still so much to learn and explore, and every time I’ve put on a headset I’ve felt an immediate rush of excitement to design something with the technology.

One of the great things about virtual reality is its practicality, mostly due to the incredible level of immersion it delivers. Despite rendering a virtual world, it has many real-world applications, like:

- Education, where it offers a highly immersive, interactive experience that can accommodate many different styles of learning.

- Fitness, where the heightened excitement makes exercise fun. Some companies, like FitXR, are doing great things in this space.

- Therapy, specifically exposure therapy and cognitive behavioral therapy [1][2][3]

OK, so now that I’ve got you sold on virtual reality, let’s get into the meat and potatoes. For this case study I decided to create a virtual operating room for the purpose of educating surgeons through a mock surgery experience.

Tools

Unity3D Engine

C# Programming Language

Asus Windows Mixed Reality Headset

Figma & FigJam

VR Rigging

The purpose of this case study was to explore and to create a proof of concept. Normally my UX design work is full of iterative cycles of user testing, data and feedback gathering, and problem solving. This case study placed more emphasis on exploring UX concepts in the space and on implementing an interface to create the bones of a product or service.

To get started, I used the Unity3D engine to power the whole scene, connected to my Asus Mixed Reality headset. The two don’t naturally interface with each other right out of the box, but after configuring some asset packages, like the great Unity3D XR Interaction Toolkit [4] and the Oculus Integration SDK [5], I was able to get a user/character controller and head camera working in a test environment.

It’s not much but here’s what this looks like in 3D space at this step:

An image of mostly empty 3D space, with a 2D camera icon in the middle and a 3D axis inside the camera.
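Before building anything on top of this, a quick way to confirm the head camera is actually tracking is to log the headset position each frame. Here’s a minimal sketch, assuming the XR packages above are installed (the HeadTrackingCheck name is just illustrative, not something shipped with the toolkit):

using UnityEngine;
using UnityEngine.XR;

// Illustrative sanity-check script: logs the headset position each frame
// so you can confirm the head camera is actually being tracked.
public class HeadTrackingCheck : MonoBehaviour
{
    void Update()
    {
        // Ask the XR subsystem for the device mapped to the head node
        InputDevice headset = InputDevices.GetDeviceAtXRNode(XRNode.Head);

        if (headset.isValid &&
            headset.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position))
        {
            Debug.Log("Head position: " + position);
        }
    }
}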

Next was implementing the controllers, getting models for them so the user has some visible feedback, and then configuring raycasts [6][7]. A raycast is a ray cast out from each controller; rendered as a red laser-like line, it acts as signage that tells the user what they might be selecting inside an interface or in the scene around them.

They are really important in this kind of setting: without them, and without good aim, the user can find themselves accidentally selecting one button while intending to select another.

An image of empty 3D space that is lightly colored. In the space is a 3D modeled motion controller for virtual reality, with a large red laser/raycast aiming outwards from the controller into the empty 3D space.
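A minimal sketch of how a pointer like this can be wired up (the ControllerPointer name is illustrative; the XR Interaction Toolkit ships its own ray interactor, so this is only to show the idea):

using UnityEngine;

// Illustrative, simplified pointer: casts a ray forward from the controller,
// draws it with a LineRenderer, and reports whatever collider it hits.
public class ControllerPointer : MonoBehaviour
{
    public LineRenderer line;        // assigned in the inspector (2 positions by default)
    public float maxDistance = 10f;

    void Update()
    {
        Vector3 origin = transform.position;
        Vector3 direction = transform.forward;
        Vector3 end = origin + direction * maxDistance;

        // Physics.Raycast returns true if the ray hits a collider [6]
        if (Physics.Raycast(origin, direction, out RaycastHit hit, maxDistance))
        {
            end = hit.point;
            Debug.Log("Pointing at: " + hit.collider.name);
        }

        // The "red laser" the user sees is just a line from the controller to the hit point
        line.SetPosition(0, origin);
        line.SetPosition(1, end);
    }
}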

After getting the basic movement, “look”, and controller mechanics down, I used some free 3D model packages [8] to start modeling the operating room.

A moving .gif of a 3D space with 3D modeled walls, floor, and ceiling. Inside the room, the last wall panel is being placed into a gap.

Immersion is one of the greatest facets of virtual reality, so creating a setting that looked like an operating room was an obvious goal from the outset. The rest of the models (surgical bed, carts, TV monitors) are either open-source models or ones that I created. (See the references at the end of this case study for exact links.)

An image of a 3D modeled operating room, with surgery bed, carts, and two monitors in the middle.

Once the setting was there, I moved on to designing the interface.

Creating the Interface

As with typical 2D UX work, I started the design process with sketches and very low-fidelity ideas. Lately I’ve been skipping paper sketches and starting right in a wireframing app like Figma (I’m just as quick with mouse and keyboard as I am with pen and paper these days, and just as quick to iterate and move on).

Two images of low fidelity wireframe user interface sketches. There are several small grey elements representing UI interactions.
An image of low fidelity wireframe user interface sketches. There are several small grey elements representing UI interactions.

During this sketching process I started to research surgery tools and operating room items. I found many good resources, but I based the instruments mostly on this chart from the University of Nebraska Medical Center [9].

To help better organize the interface and sort the information architecture, I also created a basic diagram with FigJam that would incorporate all interface elements.

An image with a white background and several post-it notes in two colors: pink notes for "interact-able" elements and yellow notes for "Content/Purpose".

I wanted to cover as many common UI elements as possible. As for the actual implementation, I used Unity3D’s integrated UI tools [10], along with their VR tools [11] and some light C# coding. Most of this work was done inside Unity3D, and the only time I had to write code was when GUI elements needed to interact with each other, like toggles showing and hiding other UI interfaces (what Unity3D calls “canvases”).
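Here’s a hedged sketch of that toggle-to-canvas pattern (the CanvasToggle script and the diagnosticCanvas field name are illustrative, not the exact names from my scene):

using UnityEngine;
using UnityEngine.UI;

// Example of one UI element controlling another: a Toggle shows or hides a
// second canvas (e.g. a Diagnostic Manual panel) when its value changes.
public class CanvasToggle : MonoBehaviour
{
    public Toggle toggle;               // the UI toggle, assigned in the inspector
    public GameObject diagnosticCanvas; // the canvas this toggle shows/hides

    void Start()
    {
        // Sync the canvas with the toggle's starting state,
        // then listen for changes instead of polling every frame.
        diagnosticCanvas.SetActive(toggle.isOn);
        toggle.onValueChanged.AddListener(isOn => diagnosticCanvas.SetActive(isOn));
    }
}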

And here is the final product, or the first full iteration.

A graphic of a virtual reality environment. In the environment is a 3D modeled operating room, with a surgery table, three carts, an overhead light, and two monitors. There are also four interface panels with various buttons, dropdowns, and sliders.
An image showing the right side of the interface in the operating room.

To help with immersion, I added a light switch slider atop the middle/right canvas. Tying a slider to a light source was a fun and simple way to give this UI element a purpose.

The basic script that ties the light to the interface (see the comments in green for descriptions of the code):

An image of code for a light inside of the 3D operating room. The text paragraph underneath this image displays the same code.

---------------------------------------------------------------------

Here's the code from the image above in text format:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

public class lightSlider : MonoBehaviour
{
    public Slider slider;
    public Light sceneLight; // this allows me to attach the light object to the script via the Unity3D inspector
    public GameObject haloeffect1; // this is an additional fun "halo" effect light that attempts to mimic a realistic light source

    void Update()
    {
        sceneLight.intensity = slider.value; // this sets the light object's intensity to whatever the slider is changed to in real time

        if (sceneLight.intensity > 0)
        {
            haloeffect1.SetActive(true);
        }
        else
        {
            haloeffect1.SetActive(false);
        }
    }
}

---------------------------------------------------------------------
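Polling in Update() works fine at this scale. An equivalent, more event-driven variant would subscribe to the slider's onValueChanged event instead, so the light only updates when the slider actually moves. A minimal sketch of that alternative, replacing Update() in the class above:

void Start()
{
    // Alternative: react to slider changes instead of polling every frame
    slider.onValueChanged.AddListener(value =>
    {
        sceneLight.intensity = value;
        haloeffect1.SetActive(value > 0);
    });
}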

Here is the inspector element inside Unity3D, where you can see the attached script and the public variables that allow the script to be connected to other objects in the scene.

An image of an inspector element inside of Unity3D, with the script above attached.

To help with interactivity, I also made it so that you can grab the tools and manipulate them in 3D space. These would be most useful when performing the stages of the mock surgery.
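A rough sketch of how tools can be made grabbable with the XR Interaction Toolkit [4] (the ToolSetup script and its field are illustrative, and this kind of setup can also be done entirely in the Unity editor rather than in code):

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative setup script: makes each assigned tool model grabbable
// by giving it a Rigidbody and an XRGrabInteractable component.
// Assumes each tool model already has a Collider.
public class ToolSetup : MonoBehaviour
{
    public GameObject[] tools; // tool models assigned in the inspector

    void Start()
    {
        foreach (GameObject tool in tools)
        {
            Rigidbody body = tool.AddComponent<Rigidbody>();
            body.useGravity = false; // keep tools from falling off the cart while testing

            // XRGrabInteractable lets the motion controllers pick the object up and move it
            tool.AddComponent<XRGrabInteractable>();
        }
    }
}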

At this point the basic scene and elements are all there to build upon. Using 3D models I could further build out the experience of actually performing a surgery, but this is as far as I wanted to take the concept. There are, however, some key UX takeaways that I learned from this project, along with some ideas on how to develop it further.

User Experience Takeaways

- Curved UI improves visual comfort and takes advantage of the immersive nature of the technology. The user is turning their body in this virtual space, so the interface should curve around them. Instead of having the user move their position inside VR space, just have the interface wrap around them. This is becoming the norm in many VR applications.

- Designing for comfort. Originally I had the idea of attaching a surgical mask to the 3D camera so it would move with your head for some extra immersion, but I found that it sat right on the edge of my peripheral vision and was very distracting. Having worn a mask throughout the COVID pandemic, I felt comfortable with one on my face in real life, but in VR that comfort level was vastly different. Simulator sickness is also a real thing [12], and there’s an obvious point where added immersion is not worth the tradeoff in comfort.

- 3D space affords more input types. There are definitely limits to what is a comfortable number of inputs. The controllers I’m using have five buttons and a thumbstick each. The majority of the inputs should be relatively straightforward: A. look at the element, B. aim the red raycast at it, C. press the primary button on the controller. The sliders follow the same sequence, except step C becomes holding the button down and dragging the slider by moving the controller left/right. Of course, in surgery there are plenty of variables that require further consideration, like simulating the stress of a severe operation and the tedium often involved. These are hard to emulate in VR, but the concept is there, and there is room to expand on it even with the primary inputs being mock hands.

- Haptics, and as much signage as possible, like high-contrast color states, are important for the user to know exactly what they are selecting. Having everything that is interactable show a highlight color and a different “pressed” or “selected” color is very important (a small sketch of configuring these color states follows this list). In the gif below, it is obvious what the user is selecting because there is a mouse pointer on top of the UI element. In VR with motion controllers, this isn’t as obvious, and the more haptic-like features the better.

A moving .gif of the 3D interface from a centered viewpoint. It shows the UI in various hovered/selected states and how the buttons change color to highlight what is being selected.

- Height and boundaries are tricky inside mixed reality space. I’m using a Windows Mixed Reality headset, and testing it myself was a different experience than it was for someone shorter. This became a problem with some UI canvases I had made rather tall: they would get obscured behind other objects when a tester a foot shorter than me tried the application. This taught me that height and physical boundaries should be a design consideration.

- Information architecture in 3D interfaces isn’t more nuanced than in 2D, but it is different. Buttons with shallow amounts of information and shallow branching trees beat extensive ones that take several micro-interactions, BUT there is a balance here. I don’t want to overload the user with a huge number of buttons, and I also don’t want them to click through several button containers and interfaces/canvases to get to something simple. This seems obvious, but in VR the real estate of interface space is a different beast than that of a computer monitor or mobile device screen.

This also helped validate the idea of having some menu toggles, like the Diagnostic Manual and Trainer Module. If these turn out to be unnecessary for a surgeon learning a specific step of a procedure, they can simply be toggled off. There isn’t deeply ingrained mouse-and-keyboard muscle memory here, and each interface button takes intentional effort to use: you have to physically move your head to look at the button, physically aim a controller at it, and then activate a button on that controller. This ties back to comfort, in that I don’t want to overburden a user with an ineffectively designed information architecture.
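On the color-state point above: Unity’s UI buttons expose those states through a ColorBlock, so a minimal sketch of giving a button loud hover/pressed colors might look like this (the color values are arbitrary examples, not the ones used in my scene):

using UnityEngine;
using UnityEngine.UI;

// Example: give a button high-contrast hover and pressed colors so the
// user can always tell which element the controller raycast is currently on.
public class HighContrastButton : MonoBehaviour
{
    void Start()
    {
        Button button = GetComponent<Button>();

        // ColorBlock is a struct, so read, modify, and write it back
        ColorBlock colors = button.colors;
        colors.normalColor = Color.white;
        colors.highlightedColor = Color.yellow; // hovered by the controller ray
        colors.pressedColor = Color.red;        // while the controller button is held
        button.colors = colors;
    }
}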

Further Development & Wrap-up

- Have surgeons & medical educators look at it and get their insights into how to improve it. I only had three people test it, and one was a doctor but not a surgeon.

- Further research into operating room environments, and medical education in general. This is a proof of concept and establishes the bones of a larger product/service, but I’m sure that in the current iteration there are some unrealistic elements or unexpected interface interactions, simply because it hasn’t had extensive rounds of research and testing.

Thanks for reading! Interested in chatting about VR Design using similar methods, or even trying the scene out? See my contact page to shoot me an email. I’m happy to discuss and/or send over an executable!

For more information on how to design user interfaces inside of VR, please see the following link (or send me a message through my contacts page):
https://docs.unity3d.com/Manual/XR.html

References

1. Wu, J., Sun, Y., Zhang, G., Zhou, Z., & Ren, Z. (2021). Virtual Reality-Assisted Cognitive Behavioral Therapy for Anxiety Disorders: A Systematic Review and Meta-Analysis. Frontiers In Psychiatry, 12. doi: 10.3389/fpsyt.2021.575094 

2. Strickland, J. (2022). Virtual Reality Cognitive Behavioral Therapy (VRCBT). Retrieved 12 October 2022, from https://cbtpsychologicalassociates.com/virtual-reality-cognitive-behavioral-therapy-vrcbt/

3. Boeldt, D., McMahon, E., McFaul, M., & Greenleaf, W. (2019). Using Virtual Reality Exposure Therapy to Enhance Treatment of Anxiety Disorders: Identifying Areas of Clinical Adoption and Potential Obstacles. Frontiers In Psychiatry, 10. doi: 10.3389/fpsyt.2019.00773
 
4. XR Interaction Toolkit | XR Interaction Toolkit | 2.2.0. (2022). Retrieved 12 October 2022, from https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@2.2/manual/index.html  

5. Oculus Developer Center | Downloads. (2022). Retrieved 12 October 2022, from https://developer.oculus.com/downloads/package/unity-integration/ 

6. Unity Technologies. (2022). Unity - Scripting API: Physics.Raycast. Retrieved 12 October 2022, from https://docs.unity3d.com/ScriptReference/Physics.Raycast.html

7. French, J. (2022). Raycasts in Unity, made easy - Game Dev Beginner. Retrieved 12 October 2022, from https://gamedevbeginner.com/raycasts-in-unity-made-easy/ 

8. 3D Free Modular Kit. Unity Asset Store. https://assetstore.unity.com/packages/3d/environments/3d-free-modular-kit-85732

9. Visenio, M. (2022). Common surgical instruments module [PDF]. Retrieved 12 October 2022, from https://www.facs.org/media/wgcmalet/common_surgical_instruments_module.pdf

10. Unity Technologies. (2022). Unity - Manual: Create user interfaces (UI). Retrieved 12 October 2022, from https://docs.unity3d.com/Manual/UIToolkits.html

11. Unity Technologies. (2022). Unity - Manual: Getting started with VR development in Unity. Retrieved 12 October 2022, from https://docs.unity3d.com/Manual/VROverview.html

12. McIntosh, C. (2022). Understanding Simulator Sickness. Retrieved 12 October 2022, from https://injury.research.chop.edu/blog/posts/understanding-simulator-sickness 

3D Models and Stock Images:

https://www.shutterstock.com/image-vector/patient-monitor-displays-vital-signs-ecg-418321423

https://www.dreamstime.com/heart-ultrasound-image-computer-screen-heart-ultrasound-image-image109815134

https://www.cgtrader.com/free-3d-models/science/medical/pinza-59f9b5f6-1d23-4504-bbed-d97a7c8f562c

https://www.cgtrader.com/free-3d-models/science/medical/pinza-a781811f-92a7-4193-b3b5-e6cac4ceb8bb

https://www.cgtrader.com/free-3d-models/science/medical/surgery-set

https://www.cgtrader.com/free-3d-models/science/medical/medical-trolley-1c2a1372-1ea8-4b1b-ba5b-4f9778939fa7

https://assetstore.unity.com/packages/3d/tv-arm-mount-73139

https://assetstore.unity.com/packages/3d/environments/3d-free-modular-kit-85732

SFX:

https://freesound.org/people/hadescolossus/sounds/627541/