Dev #01: Scenes, Avatars & Spatial Audio

This week marks the start of our devblog, in which we will try to keep you up to date with all the exciting (and the necessarily mundane) things we are working on. Since this is the first time we are sharing our current activities with you, we would also like to talk about the work we have done so far.

Status Quo

The development of Figments.nrw began at the end of 2020, shortly after the kick-off of the AR/VR.nrw project. Initially, the focus was on creating functional prototypes as quickly as possible; our current focus is the first evaluation prototype, slated for the end of 2021.

To this end, a whole series of smaller sub-steps has already been tackled:

  • Selection of a suitable version control system (VCS), in our case a GitLab instance hosted by the Institute of Visual Computing at Bonn-Rhein-Sieg University of Applied Sciences.
  • Establishment of suitable communication channels for the scientific staff working on the project, realized by means of Mattermost.
  • Selection of a suitable integrated development environment, taking into account the requirements and needs of all stakeholders involved in the project. After ample consideration, Unity was chosen over Unreal Engine and will be used in its 2020 Long Term Support (LTS) version.
  • Building the basic software framework for the realization of augmented and virtual reality, here using OpenXR in Unity (see the sketch after this list).
  • Adoption of a suitable coding style for C#.
  • Development of a website for communication about the activities in the project.
  • Establishment of regular operational and strategic meetings of the project management and the project staff.
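
To give an impression of what the OpenXR-based framework looks like on the Unity side, here is a minimal sketch that checks at startup whether an XR headset is actually present and falls back to desktop controls otherwise. The class and method names (DeviceBootstrap, EnableXRControls, EnableDesktopControls) are placeholders for illustration, not the actual Figments.nrw code.

    using UnityEngine;
    using UnityEngine.XR;

    // Minimal sketch: decide at startup whether to run in XR or desktop mode.
    // Class and method names are placeholders, not the actual Figments.nrw code.
    public class DeviceBootstrap : MonoBehaviour
    {
        void Start()
        {
            // Ask the XR subsystem (provided via the OpenXR plugin) for the head-mounted display.
            InputDevice hmd = InputDevices.GetDeviceAtXRNode(XRNode.Head);

            if (hmd.isValid)
            {
                Debug.Log($"XR headset detected: {hmd.name}");
                EnableXRControls();
            }
            else
            {
                Debug.Log("No XR headset found, falling back to desktop controls.");
                EnableDesktopControls();
            }
        }

        void EnableXRControls() { /* set up XR rig, controllers, teleportation ... */ }
        void EnableDesktopControls() { /* set up mouse/keyboard camera rig ... */ }
    }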

Current Prototype & Features

This work has led to the current prototype, which allows several participants to simultaneously enter a virtual learning environment via VR-HMD, PC or Android devices. So far, the following is (more or less) possible in this environment (non-exhaustive list):

  • Import of 3D objects and environments
  • Moving of objects (e.g. grab, drop, bump); see the sketch after this list
  • Navigation via room-scale VR and virtual teleportation
  • “Holographic” user interface that is worn on the virtual wrist and can be used for more complex (system) settings.
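
To illustrate how grabbing and dropping can be wired up in Unity, the following sketch makes an imported object grabbable. It assumes Unity's XR Interaction Toolkit as the interaction layer on top of OpenXR; this is a simplified illustration rather than the exact setup used in Figments.nrw.

    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    // Sketch: make an imported 3D object grabbable and throwable.
    // Assumes the XR Interaction Toolkit package; names are illustrative only.
    public static class GrabSetup
    {
        public static void MakeGrabbable(GameObject imported)
        {
            // Physics body so the object can be dropped and bumped.
            Rigidbody body = imported.AddComponent<Rigidbody>();
            body.useGravity = true;

            // A collider is required for interaction raycasts and overlaps.
            if (imported.GetComponent<Collider>() == null)
                imported.AddComponent<BoxCollider>();

            // XRGrabInteractable lets XR controllers pick the object up.
            XRGrabInteractable grab = imported.AddComponent<XRGrabInteractable>();
            grab.throwOnDetach = true; // keep momentum when released
        }
    }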

Experimental Features

Experimental features include, for example:

  • Integration of PowerPoint slides via a virtual projection surface (see the sketch after this list)
  • Mirroring of the desktop (PC clients only)
  • Drawing on virtual surfaces
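
As a rough illustration of the projection surface idea: a slide exported as an image can simply be loaded into a texture and assigned to the material of a quad in the scene. The actual PowerPoint integration may end up working differently; the names below are illustrative only.

    using System.IO;
    using UnityEngine;

    // Sketch: show a slide (exported as PNG/JPG) on a virtual projection surface.
    // The real PowerPoint integration may differ; this only illustrates the idea.
    public class SlideSurface : MonoBehaviour
    {
        public Renderer projectionSurface; // e.g. a quad in the classroom scene

        public void ShowSlide(string imagePath)
        {
            byte[] data = File.ReadAllBytes(imagePath);

            // LoadImage decodes PNG/JPG data and resizes the texture accordingly.
            Texture2D slide = new Texture2D(2, 2);
            slide.LoadImage(data);

            projectionSurface.material.mainTexture = slide;
        }
    }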

Dev #01: Current Activities

As mentioned in our roadmap, the first user study is planned for the coming semester. The focus of our current efforts is to develop the necessary functions for this.

We will publish more information on the subject focus of this first study (which is located in the area of crystalline structures) at a later date. Besides the basic functions of Figments.nrw, we will use the opportunity to test the first modular extensions (import of subject-specific file formats). The main focus will be on investigating the usability of the software and the suitability of different device types (especially VR-HMDs) for various teaching scenarios.
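
Conceptually, such a modular extension can be thought of as a small importer interface that subject-specific modules implement, for example for crystallographic file formats such as CIF. The sketch below uses placeholder names (IModelImporter, CifImporter) and illustrates the idea rather than the actual Figments.nrw API.

    using UnityEngine;

    // Conceptual sketch of a modular import interface for subject-specific
    // file formats. Interface and class names are placeholders, not the
    // actual Figments.nrw API.
    public interface IModelImporter
    {
        // File extensions this module can handle, e.g. ".cif" for crystal structures.
        string[] SupportedExtensions { get; }

        // Parse the file and return a GameObject that can be placed in the scene.
        GameObject Import(string filePath);
    }

    // A subject-specific module would then implement the interface:
    public class CifImporter : IModelImporter
    {
        public string[] SupportedExtensions => new[] { ".cif" };

        public GameObject Import(string filePath)
        {
            GameObject root = new GameObject(System.IO.Path.GetFileName(filePath));
            // ... parse atom positions and instantiate spheres/bonds here ...
            return root;
        }
    }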

The project team discusses the current and planned developments in weekly meetings. In the past week, the following points were addressed, among others.

Scenes & Settings

Currently we are using a general demo environment to test new features. For the upcoming user study, various settings are envisaged, including a virtual representation of a conventional classroom. Having developed a reference board, we are now working on the 3D modeling of such a virtual environment. We also plan to investigate this and the other environments created in the project from the perspective of room design and its influence on learning processes. We hope to gain new insights especially in the context of non-traditional learning environments, which explicitly do not try to recreate existing classrooms or lecture halls.

Virtual Avatars

Especially in situations where several learners use a virtual space at the same time, player avatars become important. Not only do avatars allow us to see our counterparts, they also give us the opportunity to interact with others and work together on problem-solving tasks. However, humanoid avatars in particular require an enormous amount of design effort if they are to look realistic without ending up in the Uncanny Valley. For this reason, we will mainly work with stylized avatars, as we are currently doing with very simple ones. In the meantime, there are also free and open solutions (e.g. www.readyplayer.me) that allow users to design avatars themselves, and we are investigating the extent to which such services can be integrated into Figments.nrw.
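
To give an idea of what such an integration could look like: services like readyplayer.me typically deliver the finished avatar as a glTF/GLB file behind a URL, which can be downloaded at runtime and handed to a glTF loader. The sketch below is only a rough outline; LoadGlbIntoScene stands in for whichever runtime glTF library we end up choosing.

    using System.Collections;
    using UnityEngine;
    using UnityEngine.Networking;

    // Rough sketch of pulling a user-created avatar (e.g. from readyplayer.me)
    // into the scene at runtime. LoadGlbIntoScene is a placeholder for a real
    // runtime glTF loader; this is not the final Figments.nrw integration.
    public class AvatarDownloader : MonoBehaviour
    {
        public IEnumerator LoadAvatar(string avatarUrl)
        {
            using (UnityWebRequest request = UnityWebRequest.Get(avatarUrl))
            {
                yield return request.SendWebRequest();

                if (request.result != UnityWebRequest.Result.Success)
                {
                    Debug.LogWarning($"Avatar download failed: {request.error}");
                    yield break;
                }

                byte[] glbData = request.downloadHandler.data;
                LoadGlbIntoScene(glbData); // plug in a glTF runtime importer here
            }
        }

        void LoadGlbIntoScene(byte[] glbData)
        {
            // ... hand the binary glTF data to the chosen importer ...
        }
    }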

Spatial Audio & Voice Chat

In addition to avatars, which enable non-verbal communication in virtual space by giving each participant a visible, observable counterpart, we are currently developing further interaction, collaboration and communication tools.

Exchange via the spoken word is a very important aspect here, and one we wanted to address as early as possible in the development of Figments.nrw. Since there is a lot to consider, from auditory factors and audio quality to synchronization over the network, we are devoting a lot of attention to this point.

Virtual reality offers an advantage here: the positions of all participants (as well as any other sound sources in the room) are known precisely and can be exploited via spatial audio. Signals are rendered binaurally and can thus contribute significantly to orientation in the virtual space. In addition, this opens up possibilities for auditory feedback, e.g. in learning processes.
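
On the Unity side, the basics are straightforward: an AudioSource attached to a remote participant's avatar can be switched to fully spatialized playback, so that the voice is attenuated with distance and rendered binaurally by the active spatializer. The snippet below shows the relevant settings; feeding the received voice data into the AudioSource over the network is omitted here.

    using UnityEngine;

    // Sketch: configure an AudioSource on a remote participant's avatar for
    // spatial (binaural) voice playback. Feeding the networked voice data into
    // the source is omitted here.
    [RequireComponent(typeof(AudioSource))]
    public class SpatialVoice : MonoBehaviour
    {
        void Awake()
        {
            AudioSource voice = GetComponent<AudioSource>();

            voice.spatialBlend = 1f;  // 1 = fully 3D, 0 = plain 2D playback
            voice.spatialize = true;  // hand the signal to the active spatializer plugin
            voice.rolloffMode = AudioRolloffMode.Logarithmic;
            voice.minDistance = 1f;   // full volume within 1 m
            voice.maxDistance = 15f;  // inaudible beyond ~15 m
        }
    }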

Drawing & Writing in 3D

Another way to enable communication in virtual space is through writing and drawing. Via controller or hand tracking, a natural interaction is possible that comes surprisingly close to writing in reality (e.g., using a pen or brush).
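
For the three-dimensional case, this essentially boils down to sampling the position of the controller (or fingertip) while drawing is active and appending it to a line. A stripped-down sketch of that idea, not the exact implementation used in Figments.nrw:

    using System.Collections.Generic;
    using UnityEngine;

    // Stripped-down sketch of 3D drawing: while drawing is active, sample the
    // pen (controller/fingertip) position and append it to a LineRenderer.
    // Not the exact Figments.nrw implementation.
    public class StrokeDrawer : MonoBehaviour
    {
        public Transform penTip;          // controller tip or tracked fingertip
        public LineRenderer strokePrefab; // thin line material, world space

        private LineRenderer currentStroke;
        private readonly List<Vector3> points = new List<Vector3>();

        public void BeginStroke()
        {
            currentStroke = Instantiate(strokePrefab);
            points.Clear();
        }

        public void UpdateStroke()
        {
            // Only add a point if the pen moved far enough, to avoid dense duplicates.
            if (points.Count == 0 || Vector3.Distance(points[points.Count - 1], penTip.position) > 0.005f)
            {
                points.Add(penTip.position);
                currentStroke.positionCount = points.Count;
                currentStroke.SetPositions(points.ToArray());
            }
        }
    }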

While we have already successfully implemented three-dimensional drawing in virtual reality, we are currently working on being able to write on two-dimensional virtual surfaces (e.g. a whiteboard) as well. We see potential here both in how naturally users can pick up these drawing tools and in the possibility of saving and exporting such two-dimensional drawings in conventional formats (e.g. JPG/PNG) for later use.
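
For the export, the two-dimensional surface can be kept as a RenderTexture into which the strokes are drawn; saving it is then a matter of reading the pixels back and encoding them as PNG. A minimal sketch, with illustrative names and paths:

    using System.IO;
    using UnityEngine;

    // Sketch: save the contents of a whiteboard RenderTexture as a PNG file.
    // Paths and names are illustrative only.
    public static class WhiteboardExport
    {
        public static void SaveAsPng(RenderTexture board, string filePath)
        {
            RenderTexture previous = RenderTexture.active;
            RenderTexture.active = board;

            // Copy the GPU texture into a readable Texture2D.
            Texture2D readable = new Texture2D(board.width, board.height, TextureFormat.RGBA32, false);
            readable.ReadPixels(new Rect(0, 0, board.width, board.height), 0, 0);
            readable.Apply();

            RenderTexture.active = previous;

            File.WriteAllBytes(filePath, readable.EncodeToPNG());
        }
    }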
