Welcome to the MMVR17 Blog!

October 22, 2008

In this blog, we highlight some of the presentations at the upcoming conference. Berci Meskó, a member of the Organizing Committee, is interviewing presenters to give you an introduction to their research prior to their conference presentations. Your questions and comments are invited.

Please enjoy our first interviews! 


The Well: Virtual Reality Becomes Real

January 20, 2009

This is the second day of the MMVR17 conference, and I spent hours in The Well, which seems to be a huge success.

The Well is a space for one-on-one, laptop-based demos and select, large-scale technology displays. The Well complements the traditional commercial and academic exhibits, expanding the forum of ideas and devices. Demos in The Well will be scheduled and impromptu.

Here are a few images, links, and descriptions of the devices and tools we could see there.



Philip Weber & Jacopo Annese, PhD (The Brain Observatory, University of California, San Diego & Calit2)

Installation: The Digital Light Box is a scalable visualization environment for radiological and pathological examinations that enables researchers to visualize and inspect high resolution (gigabyte size) images created by multiple imaging modalities, including virtual microscopy.


It was my personal favourite device today. It makes it quite easy to perform a proper intubation, as the camera helps you navigate. More information here.


The Virtual Reality Medical Center presented its next-generation injury creation science. Such models can make simulations as realistic as possible: you can trigger bleeding or simulate different types of injuries. Click here for more information.


The project of Albert Rizzo demonstrates how post-traumatic stress disorder could be treated with virtual reality therapy. You see a military scene with a weapon; if you stand on the designated square, you can feel the bombings and hear gunfire. Read more about it here.


Forterra Systems created a new environment for medical simulations. This platform is quite different from Second Life, as it is a closed and secure system, and many simulations have already been implemented in it. Communication is easy but, of course, it isn’t free.


A prosthetic arm from Hanger.com. Read more about their microprocessor-controlled hydraulic knee in our interview.


An Xbox-based bronchoscopy simulation. If you are good at video games, you will find the device easy to handle. Future surgeons should start with military games and other video games that demand serious skills.

That’s it for today. Stay tuned for more images tomorrow from The Salon.

Live from the conference

January 20, 2009

The first day of the MMVR17 conference is just over, and we’re very happy to have so many participants and prestigious presentations and posters. I’ll post a few images soon, but if you want to follow the conference live, please follow my account on Twitter or the #mmvr17 hashtag.

Feel free to join the discussion on Twitter and share your slideshows on Slideshare.net.

The Salon Interview: Gaze-Driven Head-Mounted Camera

January 14, 2009

Johannes Vockeroth (University Hospital Munich) will present the Gaze-Driven Head-Mounted Camera project in the Salon at this year’s Medicine Meets Virtual Reality 17 conference.

Check the video report.

1) Please tell us more about the Gaze-Driven Head-Mounted Camera and what you will present at this year’s MMVR conference.

EyeSeeCam is a novel head-mounted camera controlled by the user’s eye movements. It allows you, for the first time, to literally see the world through somebody else’s eyes. EyeSeeCam is based on the combination of two technologies: an eye tracker and a camera motion device that operates as an artificial eye. The challenges in designing such a system are mobility, high bandwidth, and low total latency. These challenges are met by a newly developed lightweight eye tracker that synchronously measures binocular eye positions at up to 650 Hz, and by novel piezo actuators: the camera is driven by ultra-fast motors that are not based on conventional electro-mechanical transduction.

At this year’s MMVR we applied the new camera to a dental treatment: filling a tooth. In the video footage you can clearly see how well you can look into the cavity of the tooth via the mirror. The EyeSeeCam consists of two individual cameras, one for the wide-angle scene and one for the gazed detail. You’ll see that the area around the mouth is overexposed in the scene image, but the gaze-driven camera provides an individual exposure for this area. As a result, the EyeSeeCam provides not only a higher spatial resolution but also a higher dynamic range.


2) Can the camera record what it actually sees? Is there a chance it could record a whole surgery?

Yes. A very fast eye tracker continuously directs the camera towards the user’s point of gaze, so that the camera captures exactly what the user’s eyes see. The delay between measured eye movements and the corresponding camera rotation is down to 10 ms. This is quite an impressive number if one considers that it includes eye movement measurement (acquisition, transmission, and processing of eye images) as well as the mechanical actuation of a video camera. The camera can thereby reproduce the whole range of human eye movements. This video can be recorded, transmitted, and displayed in real time.
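The numbers quoted here (650 Hz sampling, a roughly 10 ms latency budget) can be pictured as a simple sense-and-actuate control loop. The sketch below is purely illustrative and is not the EyeSeeCam implementation: the function names and the no-op sensor/actuator stand-ins are assumptions.

```python
# Illustrative sketch (NOT EyeSeeCam code): a gaze-following control loop
# in which a high-rate eye tracker drives camera actuators. Only the
# 650 Hz rate and ~10 ms latency budget come from the interview.
import time

TRACKER_RATE_HZ = 650      # binocular eye-position sampling rate
LATENCY_BUDGET_S = 0.010   # end-to-end delay reported: ~10 ms


def read_gaze():
    """Stand-in for the eye tracker: returns (azimuth, elevation) in degrees."""
    return (0.0, 0.0)


def drive_camera(azimuth, elevation):
    """Stand-in for the piezo actuators aligning the camera with gaze."""
    pass


def control_loop(cycles):
    """Run the loop for a number of cycles; return the worst per-cycle work time."""
    period = 1.0 / TRACKER_RATE_HZ
    worst = 0.0
    for _ in range(cycles):
        t0 = time.perf_counter()
        az, el = read_gaze()       # acquisition + image processing
        drive_camera(az, el)       # mechanical actuation
        elapsed = time.perf_counter() - t0
        worst = max(worst, elapsed)
        time.sleep(max(0.0, period - elapsed))  # wait for the next sample
    return worst
```

Each cycle's work (acquisition, processing, actuation) must finish well inside the latency budget for the camera to reproduce eye movements faithfully; in the real device that budget also has to cover the mechanical settling time of the actuators.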

A surgeon, for example, wore the 850 g device during a surgery that lasted 2.5 hours. He documented the whole surgery from his point of view without missing any crucial event. Thanks to EyeSeeCam’s mobility and lightweight setup, the surgeon reported that the system did not restrict him.


3) Do you think doctors will use it in the future? Are there any legal aspects they should consider?

If EyeSeeCam succeeds in offering an additional benefit compared to conventional video documentation techniques, doctors will likely use it in the future. It’s already very common to document surgeries with video cameras above the operating table or with an external camera operator, but the interesting parts are often covered by hands or arms. Essentially, the motivation for a head-mounted camera is similar to the motivation for a surgical headlight.

We are not lawyers, but we think doctors have to consider the same legal aspects as with conventional video documentation techniques.

4) What are your plans for the next few years? Are you working on a new device or still improving EyeSeeCam?

EyeSeeCam still needs further improvements. The cameras, for example, are still too big. We will also expand possible uses of EyeSeeCam to other fields, such as the examination of natural visual exploration in humans. In some side projects I’m using the camera to make short movies or to support media art projects. So we are both looking for new applications and improving EyeSeeCam.

Further Salon interviews:

The Salon Interview: VOXEL-MAN visualization system

January 13, 2009

Karl Heinz Hoehne (Medical Informatics; University Medical Center Hamburg-Eppendorf & Voxel Man) will present the VOXEL-MAN project in the Salon at this year’s Medicine Meets Virtual Reality 17 conference.


1) Please tell us more about what you will present in the Salon at this year’s MMVR conference.

I am a professor emeritus, and the work presented is a collection of earlier work, more illustrative than scientific, that has not so far been published to a broader audience.

One of the two presentations is a movie, created in 1995 on the occasion of the 100th anniversary of Roentgen’s discovery, that illustrates the history of medical imaging and image computing. It was generated completely with the tools of the VOXEL-MAN visualization system. Pictures on the walls of a virtual room lead to the different highlights: the discovery of X-rays, CT and MR imaging, 3D models for surgery and training, virtual endoscopy, and more. While today’s virtual body models show much higher spatial resolution and interactivity, the presented visualization techniques are still state of the art and might inspire the viewers.

The other exhibit is a poster showing compositions of some of the famous anatomical drawings of Leonardo da Vinci and today’s computer models.

2) What was the main concept behind launching the Leonardo meets VOXEL-MAN project?

Leonardo da Vinci (1452–1519) was both an ingenious painter and an engineer. We can imagine that, were he living today, he would have creatively used methods of virtual reality. Besides his paintings and his visions in mechanical engineering, his anatomical drawings became famous. I was always intrigued by these drawings and have created, just as a hobby, compositions with VOXEL-MAN body models.

3) How can VOXEL-MAN be used in everyday medicine?

The VOXEL-MAN project dates back to the early eighties, and many of the algorithms developed in it (especially for realistic visualization) have long been state of the art in many applications. The VOXEL-MAN software is still used for the creation of 3D atlases of anatomy and radiology and for the development of surgical simulators.

4) Are your body models used in medical education? If so, do you get feedback from students?

We have three interactive atlases of anatomy and radiology (“VOXEL-MAN 3D Navigators”) available, covering the brain and skull, the inner organs, and the upper limb, as well as abdominal ultrasound. They are very well accepted by students and teachers.

5) What are your plans for the near future?

As the founder of the VOXEL-MAN group, I work as a senior advisor and part-time collaborator on the group’s surgery simulator projects. I am convinced that training and planning based on virtual reality techniques will soon be indispensable in surgery.


The Salon Interview: Microprocessor-controlled hydraulic knee

January 9, 2009

The Well at this year’s Medicine Meets Virtual Reality 17 conference will include a demonstration by Hanger Orthotics & Prosthetics of a microprocessor-controlled hydraulic knee.

The responses are from Katy, a bilateral above-knee amputee who wears C-Legs. She will be the patient working in The Well with Coryn Reich, the prosthetist.


1) Please tell us more about what you will present in the Well at this year’s MMVR conference.

We will be presenting microprocessor knees and showing what a difference they can make for amputees. We will be demonstrating how they help amputees walk down inclines, sit and recover from stumbling.

2) How is a microprocessor-controlled hydraulic knee different from the artificial knees that have been used so far?

The main difference is the computer inside the knee. The microprocessor constantly monitors what you are doing physically (e.g. walking, sitting, going down stairs) and gives you more or less resistance. For example, I can be walking along at a regular pace, approach a ramp, and, without slowing down or needing to hold on to a railing, walk down the incline with my knees slightly bent. This gives a more ‘natural’ appearance to walking.
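As a rough illustration of the idea Katy describes (the knee continuously classifying the current activity and adjusting hydraulic resistance), here is a toy sketch. The activity names, sensor thresholds, and resistance values are all invented for illustration; this is in no way the actual firmware of a C-Leg or any other prosthetic knee.

```python
# Hypothetical sketch: classify activity from sensor readings, then look
# up a resistance setting. All names, thresholds, and values are invented.

RESISTANCE = {
    "level_walking": 0.3,    # low resistance for a free swing phase
    "incline_descent": 0.8,  # high resistance to support a bent-knee descent
    "sitting": 0.9,          # high resistance to lower the user slowly
    "stumble": 1.0,          # maximum resistance to catch a fall
}


def classify_activity(knee_angle_deg, shank_accel_g):
    """Toy classifier standing in for the knee's real-time gait analysis."""
    if shank_accel_g > 2.0:      # sudden acceleration: likely a stumble
        return "stumble"
    if knee_angle_deg > 70:      # deep flexion: sitting down
        return "sitting"
    if knee_angle_deg > 15:      # moderate flexion while loaded: descending
        return "incline_descent"
    return "level_walking"


def update_resistance(knee_angle_deg, shank_accel_g):
    """One control-cycle update: sensor readings in, resistance setting out."""
    return RESISTANCE[classify_activity(knee_angle_deg, shank_accel_g)]
```

In a real knee this update would run many times per second from strain and inertial sensors, which is what lets the wearer walk down a ramp without slowing down, as described above.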

3) What kind of feedback do you receive from patients?

The patients I know who use these knees are very happy. As for myself, I fall a lot less than I did, and I have a feeling of freedom and confidence. There is a bit of a learning curve in that you have to trust that the knee is there and going to support you, but after that you are able to walk with much more ease.

4) What is your opinion about the attempt of Oscar Pistorius to compete at the Beijing Olympic Games? Can such a medical device lead to unfair advantage over able-bodied runners?

As a member of the US Paralympic team at the time, I was thrilled that the IOC would even have the conversation. Oscar is a fantastic athlete and would be doing amazing things with or without prosthetics. If there were an advantage in using carbon fiber running feet, many challenged athletes would be attempting to do what Oscar is doing. The microprocessors are not used in running; they are for everyday walking around.


The Salon Interviews: The Use of Virtual Reality in Addiction Medicine

January 7, 2009

One of the Salon demonstrations at this year’s Medicine Meets Virtual Reality 17 conference is by Chris Culbertson, a Neuroscience Ph.D. student, who will present how virtual reality can be used in addiction medicine:

1) Please tell us more about the “Use of VR in Addiction Medicine” demonstration you will present at this year’s MMVR conference.

The demonstration presented in the Salon portion of MMVR17 will allow attendees to observe and interact within the virtual world we have created using Second Life. Participants will be able to freely navigate the environment and interact with drug paraphernalia and animated avatars while observers watch on an external monitor. My talk will cover the initial testing of this environment in methamphetamine users, highlight some of our current projects, and discuss future applications.


2) As a Neuroscience Ph.D. student, how did you get involved in this project?

I became involved in this project through a combination of adversity, ingenuity, and luck. A handful of researchers around the globe had demonstrated the applicability of virtual reality (VR) in nicotine addicts (i.e. smokers) when I began working with Drs. Newton and De La Garza at UCLA. We discussed the possibility of applying this approach to our methamphetamine studies, since our previous method of inducing craving (videos) had demonstrated only modest success. Our initial attempts to collaborate with researchers already using VR proved unsuccessful, so we returned to the drawing board to determine the feasibility of developing such a system on our own. I contacted the Experiential Technologies Center (ETC) at UCLA, which specializes in 3-D modeling, and proposed the idea. With their technical expertise and my scientific vision, we were able to create an initial environment for testing in a relatively short amount of time. Since I had my hands in both the technical development and the scientific application, I naturally became heavily involved in the project. More recently, Drs. Newton and De La Garza have relocated to Baylor College of Medicine in Houston, TX, allowing me to lead the VR development and application at UCLA under the mentorship of Dr. Arthur Brody while continuing my collaboration with their research group.

3) Why did you choose Second Life for this purpose?

We decided to develop our research tool on Second Life (SL) for a few decisive reasons: 1) time; 2) money; 3) accessibility; 4) adaptability. SL provided a preexisting, user-friendly platform on which to immediately apply our ideas using just a few PCs, which allowed us to hit the ground running. Additionally, for a small membership fee we could rely on SL to maintain and improve our virtual world. Since SL only requires an Internet connection to access, we realized that opportunities for mobility and future collaboration would be seamless. Lastly, SL can be easily adjusted in real time, allowing researchers to individualize environments without any knowledge of scripting or modeling. I have since realized that the greatest benefit of SL is the opportunity for outsourcing: thousands of individuals around the world are creating new environments, objects, and animations daily and are willing to provide their products and services for little to no cost.

4) How can such a 3-D environment be used in behavioral pharmacology research, and how can you examine drug-taking behavior?

Previous preclinical and clinical research has demonstrated the importance of environment on the response elicited by a drug. Currently, phase I clinical trials assessing novel treatments for drug addiction are conducted in hospitals for safety purposes (i.e. in case there is an adverse reaction between the medication and the drug of abuse). However, hospital environments tend to be very constrictive and completely irrelevant to the average drug user. Since we cannot take the drug addict to their environment, we must bring the environment to them. Virtual worlds allow a drug user to interact freely in a naturalistic environment while in the safety of a clinical setting. Additionally, such an environment allows researchers to closely track the drug user’s behavior on a medication versus a placebo, much as one would track a rodent in a maze. Researchers may also give the drug user access to their drug of choice under temporally controlled conditions (i.e. by simply initiating an animated drug-taking behavior in the 3-D environment, a single dose would be automatically administered) and monitor their drug seeking and taking behavior, similar to a rodent in a self-administration chamber. We expect that by creating a more realistic environment we will obtain more realistic responses, allowing for better assessment of treatment efficacy. This in turn may increase the predictability of treatment success in larger, outpatient phase II trials and aid in the early and efficient identification of potential therapies for addictive disorders.
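The temporally controlled dosing and rodent-style behavior tracking described above can be sketched schematically. Everything below (the class and method names, the lockout interval) is a hypothetical illustration, not the authors' actual Second Life scripting.

```python
# Hypothetical sketch: log a participant's movement through a virtual
# environment and administer a dose only when the avatar initiates the
# drug-taking animation, under a temporal lockout. All names and the
# 300 s lockout are invented for illustration.

class SelfAdministrationSession:
    def __init__(self, lockout_s=300.0):
        self.lockout_s = lockout_s   # minimum time between doses (seconds)
        self.last_dose_t = None
        self.events = []             # behavioral log, like a rodent's trace

    def log_position(self, t, x, y):
        """Record the avatar's location for later behavioral analysis."""
        self.events.append((t, "move", (x, y)))

    def request_dose(self, t):
        """Called when the avatar initiates the drug-taking animation."""
        if self.last_dose_t is None or t - self.last_dose_t >= self.lockout_s:
            self.last_dose_t = t
            self.events.append((t, "dose", None))
            return True              # dose administered
        self.events.append((t, "dose_denied", None))
        return False                 # within the lockout window
```

The resulting event log plays the role of the cumulative record in an animal self-administration chamber: dose requests, denials, and movement can be compared between medication and placebo conditions.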


5) Do you plan to extend the research (e.g. different drugs)?

We have recently started a cigarette smoking cessation trial using a smoking virtual environment. 3-D environments have been successfully utilized for anxiety and phobic disorders through repeated exposure to aversive stimuli (e.g. heights, spiders, public speaking, war-like environments) in the safety of a clinical setting with a therapist on hand. Along these lines, we hope to help smokers “extinguish” smoking behavior or relearn more positive behavioral responses to the smoking cues they will inevitably come across while trying to quit. If successful, this type of treatment may be easily applied to additional addictive and psychiatric disorders.

The Salon and The Well: Interviews

January 1, 2009

This year’s MMVR will feature expanded spontaneous interactive environments. The Well will merge formal exhibits with casual demonstrations. The Salon will mingle the visual arts, science, and medicine.

Kóan Jeff Baysa asked several of the Well & Salon artists:

Q: The Salon was created, in part, to present the works of invited artists to stimulate thinking outside the box. How do you anticipate that your project proposal will achieve this with the attendees of MMVR?

Jiayi and Shih-wen Young respond about their work, “Sampling Rate in Audible and Visual Perception”:

A: The audible perception part of our project explores the visual representation of sound at different sampling rates. It aims to provide the MMVR community with a different way of perceiving sound outside the conventional waveform format.

The visual perception part of our project challenges the brain with a multitasking game, demanding that the brain process more and more information simultaneously while being given less and less cohesive information.


Click here to learn more about the project

Alessandro Marianantoni and Marcos Lutyens respond regarding “The Excarnation Machine (BETA)”

A: Our project is a playful way of representing the human body through new techniques and mechanisms that specifically take into account the body’s adaptation and conditions of survival within the contextual world. We hope to inspire attendees to stretch the conceptual envelope of the use of the tools showcased at MMVR.

On closer inspection, at a time of radical changes in the earth’s climate and the mass extinction of over half the known species on the planet, perhaps the scientific techniques showcased at MMVR will actually be necessary to help us respond and adapt to the changes around us, at a pace much faster than that afforded by natural genetic adaptation to the planet’s changing climatic conditions.

The tools and technologies that are on view at the MMVR are aimed at certain medical interventions, with specific applicable goals, which are usually to do with curing diseases and treating negative health conditions. We believe that it is important for us to think of these tools as possible vehicles for increasing wellness in the context of adaptation and survival. By juggling around with how the body is fundamentally designed, we may pause for a moment to think about how the body may be better adapted to deal with increased UV exposure, drier terrain, flooding, interplanetary travel and other challenges.

We are currently working on several projects that bring virtuality into the context of hospital environments for therapeutic ends and hope to exchange ideas with and learn from exhibitors at the MMVR.


Virgil Wong and Philip Forget reply about their “Phineasmap” project:

A: Stephen Hawking wrote: “Science fiction serves a serious purpose, that of expanding the human imagination. We can explore how the human spirit might respond to future developments in science, and we can speculate on what those developments might be.” The technological applications visualized by the Phineasmap patient portal and my fictitious medical institution, RYT Hospital, may appear as fanciful speculation, but I hope that they will serve as deliberate scenario-mapping tools for interested physicians, surgeons, educators, engineers, and data technologists.


Click here to read more about the project

Alan Liu: The World of Medical Simulators

December 12, 2008

Our fourth interviewee is Dr. Alan Liu who is the Director of the Virtual Medical Environments Laboratory at the National Capital Area Medical Simulation Center (SimCen). He is involved in defining the SimCen’s strategic research goals, and directing the development of the SimCen’s computer-based medical training systems.


Click here to see how Dr. Liu answered these questions:

  • 1) What do you consider the indicators of “success” in the development of a medical simulator?
  • 2) What types of communication between engineers, educators, and physicians increase success in simulator development?
  • 3) What role does patient opinion or experience have in the development of simulators?
  • 4) Please tell us more about the workshop you will organize at the MMVR 17 conference.
  • 5) What are the major mistakes one can make while building medical simulators?
  • 6) Which medical specialties will benefit the most from medical simulators in the near future?


Interview with Dr. Dennis Wood: Virtual Reality Graded Exposure Therapy

November 28, 2008

Our third interviewee is Dr. Dennis Wood who is a Clinical Psychologist at the Virtual Reality Medical Center in San Diego, CA. He is an expert in the field of virtual reality exposure therapies.


(Two of the three computer systems, with HMD and headphones, utilized by VRMC with various VR research and clinical projects.)

Click here to see how Dr. Wood answered these questions:

  • 1) Dr. Wood, what key advances in your research will you be presenting at this year’s meeting?
  • 2) Your therapy appears to focus on immersing patients in a simulated war environment. How does this immersive experience reduce depression and PTSD? Is there a possibility the patient’s condition could be worsened by immersion?
  • 3) Please tell us some details about the VRGET therapy.
  • 4) Do you use any kind of web 2.0 tools in your research (e.g. blogs, wikis, social bookmarking, etc.)?
  • 5) At MMVR17, will you demonstrate how your virtual reality system actually works?
  • 6) What developments do you predict will be most noteworthy in the future of the gaming/simulation industry/technology?


Interview with Dr. James Kinross: Simulation in Second Life

November 27, 2008

Our second interviewee is Dr. James Kinross who is working at the Imperial College of London (Division of Surgery, Oncology, Reproductive Biology and Anaesthetics) and is a pioneer in conducting medical simulations in the virtual world of Second Life.


Click here to see how Dr. Kinross answered these questions:

  • 1) Mr. Kinross, you and your colleagues have developed a virtual teaching environment for medical students. Why did you choose Second Life for this purpose?
  • 2) How realistic can objects and operating rooms be in Second Life? Does your simulation focus on interactive learning or animation-based training?
  • 3) Participants in many educational meetings in SL find interaction difficult because of the lag users face due to technical problems. Have you experienced something similar?
  • 4) Who can access the virtual site now? Is there a teleport link we could use?
  • 5) What do you think about other medical teaching opportunities, such as the Ann Myers Medical Center? Are you open to cooperating with them?
  • 6) Do you plan to construct other educational resources in Second Life in the near future?
