Remote Surgery using Virtual Reality – Final Year Project
I was finally in my final year of university, the year I had been waiting for. Nothing could hold back my excitement about building a final year project unlike anything I had built before.
The first idea was, of course, the Augmented Reality smartphone accessory I had been working on since my 3rd semester. But something struck me and pushed me towards building something more philanthropic, and that is where the idea of remote surgery came into being. I was well aware that this was just a project and not a product, and would in no way end up in hospitals doing surgery. It was a proof of concept, an idea, which would make me feel like I was contributing towards a better world in a more effective way.
The 7th semester passed by doing the boring documentation & refining the idea itself. Usually a project's scope shrinks in this phase, but the opposite happened here: the project kept feeling simpler & simpler, so we had to increase the scope and keep adding features. And not to forget the criticism and heartbreak from the evaluators; the amount of discouragement we received from them was just too much.
To understand this system, imagine the two sides of the scenario: a patient side & a surgeon side. The surgeon is supposed to operate on/diagnose the patient over the internet.
On the patient side there is a prosthetic robotic arm, replacing the surgeon's hand, & a camera, replacing the surgeon's eyes.
On the surgeon side, there is a VR headset, through which the surgeon can see the patient-side environment, and a Leap Motion device, which tracks the surgeon's hands and makes the prosthetic robotic arm on the patient side mirror their movement.
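To make the hand-to-arm idea concrete, here is a minimal, hypothetical sketch (in Python for brevity, not the project's actual Node.js or Unity3d code) of mapping a tracked palm position onto servo angles and sending them to the patient side. The coordinate ranges, joint names, host address, and the UDP/JSON wire format are all my illustrative assumptions, not the real protocol.

```python
import json
import socket

def clamp(value, low, high):
    """Keep a value inside the servo's safe range."""
    return max(low, min(high, value))

def hand_to_servo_angles(palm_x_mm, palm_y_mm):
    """Map a palm position (millimetres, Leap-style coordinates)
    linearly onto 0-180 degree angles for a 2-joint arm.
    Assumes the hand moves within +/-150 mm horizontally and
    100-400 mm vertically above the sensor (an assumption)."""
    base = clamp((palm_x_mm + 150) / 300 * 180, 0, 180)
    elbow = clamp((palm_y_mm - 100) / 300 * 180, 0, 180)
    return {"base": round(base), "elbow": round(elbow)}

def send_angles(angles, host="192.168.1.50", port=9000):
    """Fire-and-forget UDP datagram to the (hypothetical)
    patient-side arm controller."""
    payload = json.dumps(angles).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))

# Hand centred over the sensor at mid height maps to mid-travel.
print(hand_to_servo_angles(0, 250))  # {'base': 90, 'elbow': 90}
```

In a real loop you would call `hand_to_servo_angles` on every tracking frame and rate-limit `send_angles` so the servos aren't flooded.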
Further, the surgeon's head movements make the camera on the patient side move as well, and the live video from the patient side is streamed over to the surgeon side via an RTSP server.
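The head-to-camera mapping is conceptually simple: mirror the headset's yaw and pitch onto a pan/tilt camera mount, clamped to its mechanical limits. A hypothetical sketch, with made-up gimbal ranges:

```python
# Illustrative mechanical limits; the real mount's ranges will differ.
PAN_RANGE = (-90, 90)    # camera can swing 90 degrees either way
TILT_RANGE = (-45, 45)   # and tilt 45 degrees up or down

def clamp(value, low, high):
    return max(low, min(high, value))

def head_to_camera(yaw_deg, pitch_deg):
    """Mirror head orientation (degrees, zero when looking straight
    ahead) onto pan/tilt angles, clamped to the gimbal's limits."""
    pan = clamp(yaw_deg, *PAN_RANGE)
    tilt = clamp(pitch_deg, *TILT_RANGE)
    return pan, tilt

# Looking right and sharply down: tilt saturates at the limit.
print(head_to_camera(30, -60))  # (30, -45)
```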
This system was first developed in Node.js years back, for the NASA Space Apps Challenge in 2014. While refining the project in the 7th semester we had to pick a different platform with which we could easily develop the simulator and the VR part, so we chose the easiest & most usable tool on the market: a game engine, Unity3d. Porting everything over to Unity3d seemed like quite a challenge.
During the winter vacations in January, I decided to finish the project before the 8th semester started so I could keep myself busy with other productive things during that semester. It took me around 2 days to finish everything, 2 days minus about 16 hours of sleep and rest. Bayan, my brother, helped a lot with the hand; in fact, he did most of the work on the prosthetic arm.
The biggest challenge was streaming live images over the internet using an RTSP server. Unity3d, being a gaming engine, lacked support for such a feature, so we had to develop our own plugin to make it work.
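The actual plugin was native code for Unity3d, but the underlying idea of moving live frames over a socket can be sketched independently. Here is a minimal, hypothetical example (in Python, purely illustrative) of a length-prefixed frame protocol: each encoded frame is sent as a 4-byte big-endian length followed by its raw bytes, so the receiver always knows where one frame ends and the next begins.

```python
import struct

def pack_frame(jpeg_bytes):
    """Prefix a frame with its byte length for transmission."""
    return struct.pack(">I", len(jpeg_bytes)) + jpeg_bytes

def unpack_frames(stream_bytes):
    """Split a received byte stream back into individual frames."""
    frames, offset = [], 0
    while offset + 4 <= len(stream_bytes):
        (length,) = struct.unpack_from(">I", stream_bytes, offset)
        offset += 4
        frames.append(stream_bytes[offset:offset + length])
        offset += length
    return frames

# Round-trip two fake "frames" through the wire format.
fake_frames = [b"frame-one", b"frame-two-longer"]
wire = b"".join(pack_frame(f) for f in fake_frames)
print(unpack_frames(wire) == fake_frames)  # True
```

Real RTSP adds session negotiation, RTP packetisation, and timing on top of this, which is exactly the machinery a game engine does not ship with out of the box.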
Basically, one very important lesson we learnt from this project was that reinventing the wheel is a thing of the past, yet it is still encouraged by our colleges, at least here in our country. Nobody, until the last day, figured out that our project was more about joining pieces & using libraries than actually building from scratch. (Hence the two days 😉)
Below are two videos: one of the camera moving with the VR headset & the other of the prosthetic robotic arm.