Home

NeuroVR is a cost-free virtual reality platform based on open-source software. It provides the clinical professional with a free virtual environment (VE) editor that allows non-expert users to easily modify a virtual scene to best suit the needs of the clinical setting.

The NeuroVR platform includes two main components, the Editor and the Player. Both are implemented using open-source components that provide advanced features, including an interactive rendering system based on OpenGL that allows for high-quality images.

The new NeuroVR Editor is completely different from previous releases: it is built with the Qt libraries, a cross-platform application development framework widely used for developing GUI programs.

Thanks to this open-source approach, clinicians and researchers have the freedom to run, copy, distribute, study, change, and improve the NeuroVR Editor software, so that the whole VR community benefits.

Using the NeuroVR Editor, the psychological stimuli/stressors appropriate for any given scenario can be chosen from a rich database of 2D and 3D objects, and easily placed into the pre-designed virtual scenario by using an icon-based interface (no programming skills are required). 

In addition to static objects, the NeuroVR Editor allows video composited with a transparent alpha channel to be overlaid on the 3D scene.
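As an illustration only (not NeuroVR's actual code), the sketch below shows the general idea of "over" compositing: where the video frame's alpha channel is transparent, the rendered 3D scene shows through; where it is opaque, the video covers it. The Pixel structure and function name are hypothetical.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical RGBA pixel: 'a' is the alpha (transparency) channel of the video frame.
struct Pixel { uint8_t r, g, b, a; };

// Straight-alpha "over" compositing of a video frame onto the rendered scene.
void compositeVideoOverScene(const std::vector<Pixel>& video,
                             std::vector<Pixel>& scene)
{
    for (std::size_t i = 0; i < scene.size() && i < video.size(); ++i) {
        const float alpha = video[i].a / 255.0f;
        scene[i].r = static_cast<uint8_t>(video[i].r * alpha + scene[i].r * (1.0f - alpha));
        scene[i].g = static_cast<uint8_t>(video[i].g * alpha + scene[i].g * (1.0f - alpha));
        scene[i].b = static_cast<uint8_t>(video[i].b * alpha + scene[i].b * (1.0f - alpha));
    }
}
```

In practice this kind of blending is usually performed on the GPU, for example with OpenGL's GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA blend function, rather than per pixel on the CPU.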

The other component of NeuroVR is the Player, which allows users to navigate and interact with the VEs created using the NeuroVR Editor.

When running a simulation, the system offers a set of standard features that help increase the realism of the simulated scene. These include collision detection to control movements in the environment, realistic walk-style motion, advanced lighting techniques for enhanced image quality, and streaming of video textures with an alpha channel for transparency.
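As a generic sketch only (not NeuroVR's actual collision code), a simple collision test of the kind used to constrain movement approximates the viewer with a sphere and obstacles with axis-aligned boxes; all types and names here are assumptions.

```cpp
#include <algorithm>

// Hypothetical types: the viewer is approximated by a sphere, obstacles by axis-aligned boxes.
struct Vec3 { float x, y, z; };
struct AABB { Vec3 min, max; };

// Returns true if the sphere (viewer at 'center', radius 'r') intersects the box:
// clamp the center to the box and compare the squared distance with r^2.
bool collides(const Vec3& center, float r, const AABB& box)
{
    const float cx = std::clamp(center.x, box.min.x, box.max.x);
    const float cy = std::clamp(center.y, box.min.y, box.max.y);
    const float cz = std::clamp(center.z, box.min.z, box.max.z);
    const float dx = center.x - cx, dy = center.y - cy, dz = center.z - cz;
    return dx * dx + dy * dy + dz * dz <= r * r;
}
```

A walk-style navigation loop can simply reject, or slide along, any step that would make such a test succeed.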

 

NEW FEATURE: VIRTUAL BIOFEEDBACK

NeuroVR 2.0 uses biosensors to measure, amplify, and feed back physiological information to the individuals being monitored. Thanks to a graphic visualization of physiological parameters, users can see in real time their bodily functions and how they react to different anxiety-inducing stimuli.
These physiological data directly modify specific features of the virtual environment in real time: for example, an object can get bigger or smaller depending on the level of the physiological signal.
This capability is used in the INTERSTRESS Project, funded by the European Commission, to train for anxiety reduction...
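As a purely illustrative sketch (not the actual NeuroVR 2.0 implementation), mapping a biosensor reading to a visual property of an object could look like the following; the function name, baselines, and ranges are assumptions.

```cpp
#include <algorithm>

// Hypothetical example: map a biosensor reading (e.g. heart rate in beats per minute)
// to the scale of an object in the virtual environment, so the object grows as
// arousal increases and shrinks as the user relaxes.
float scaleFromHeartRate(float bpm,
                         float restingBpm = 60.0f,   // assumed relaxed baseline
                         float stressedBpm = 120.0f, // assumed stressed upper bound
                         float minScale = 0.5f,
                         float maxScale = 2.0f)
{
    // Normalise the reading to [0, 1] between the resting and stressed baselines.
    float t = (bpm - restingBpm) / (stressedBpm - restingBpm);
    t = std::clamp(t, 0.0f, 1.0f);
    // Linearly interpolate the object's scale factor.
    return minScale + t * (maxScale - minScale);
}
```

In such a scheme, the Player would read the current sensor value each frame and apply the resulting scale factor to the target object before rendering.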


Messages

  • CALL FOR REQUIREMENTS! We are planning the evolution of our software. Many of our users have already sent their opinions on this issue. Please let us know what features you missed when you ...
    Posted Sep 6, 2012, 3:31 AM by cinzia vigna
  • NeuroVR STABLE VERSION 2.0.1 NOW AVAILABLE! After our debugging activities we have released a stable version of our software! The main bugs were fixed, thanks also to your contributions. Now it's biofeedback ready! It means you ...
    Posted Jul 11, 2012, 3:18 AM by cinzia vigna
  • New videos (transparent background) available During the preparation of the INTERSTRESS scenarios we prepared some more videos to help you customize your environment. For example: a new set of people sitting on a chair, taking notes ...
    Posted May 9, 2012, 8:56 AM by cinzia vigna
  • Success for the presentation of the INTERSTRESS Project at the Innovation Convention Around 100 delegates tried the immersive virtual reality stress situation and the positive technology application for iPad to learn and train how to relax, and had the opportunity to see ...
    Posted Dec 12, 2011, 3:11 AM by cinzia vigna
  • NeuroVR 2.0 and the INTERSTRESS Project at Innovation Convention 2011, Brussels The INTERSTRESS project has been selected from among 450 projects and will be one of the 50 projects on display at the Innovation Convention. The Innovation Convention exhibition will ...
    Posted Nov 30, 2011, 6:01 AM by cinzia vigna

Visualization Modality

The Player can be configured for two basic visualization modalities: immersive and non-immersive.


The immersive modality allows the scene to be visualized using a head-mounted display, either in stereoscopic or in mono mode; compatibility with head-tracking sensors is also provided.


In the non-immersive modality, the virtual environment can be displayed using a desktop monitor or a wall projector. The user can interact with the virtual environment using keyboard commands, a mouse, or a joypad, depending on the hardware configuration chosen.