Immersive Mixed Reality Technologies in the Classroom

By Isac Artzi, PhD
Faculty, College of Science, Engineering and Technology

Posted on April 06, 2018  in  [ Engineering & Technology ]

The current classroom experience across colleges and universities is typically based on one or more of the following: lecture, discussions, hands-on activities, laboratory experiments and knowledge assessment activities. Our applied research team (S.M.U.R.F) in the Department of Computer Science at GCU is investigating the potential of adding a new dimension to the classroom experience: mixed reality, i.e. the combination of virtual reality, augmented reality and interaction with real objects. The use of virtual reality technology while interacting with physical objects in class provides multiple opportunities for enhancing learning activities from pedagogical and student engagement perspectives. Our team’s current research focuses on the following question: “What is the feasibility of virtual/mixed reality environments as experiential learning environments?”

Method

We embarked on a series of experiments intent on assessing the feasibility of transforming objects commonly found in a regular classroom (tables, floors and walls) into an interactive physics lab. The enabling technology is the Microsoft HoloLens™ mixed reality device and Unity™ 3D programming environment.

We were able to create a proof-of-concept demonstration focused on experiments typically used in teaching Newton’s Second Law of Motion and projectile motion. The environment was created by producing an immersive, virtual reality-based layer which fully interacts with the surrounding world. For example, a student wearing the Microsoft HoloLens sees a virtual ball bounce off the real table and onto the real floor, while the system calculates distance, velocity and acceleration. The student can interact with the ball using gestures and voice commands. Calculations associated with the experiment can be performed in virtual space using voice commands. Figure 1 depicts the initial capturing of environmental information, an essential first step towards building the immersive, mixed reality environment.

Figure 1. The HoloLens recognizes and captures the dimensions of the floor
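The bouncing-ball experiment described above can be sketched in a few lines of code. The following is an illustrative Python simulation, not the team’s actual Unity/C# implementation; it shows how the ball’s motion follows from Newton’s Second Law, with an assumed restitution constant modeling the energy lost at each bounce:

```python
# Illustrative sketch only: a minimal simulation of the virtual ball's
# motion under gravity, with energy loss on each bounce off the floor.

G = 9.81           # gravitational acceleration, m/s^2
RESTITUTION = 0.7  # fraction of speed retained after a bounce (assumed)

def simulate_bounce(height, dt=0.001, t_max=5.0):
    """Drop a ball from `height` metres and record the time of each bounce."""
    y, v, t = height, 0.0, 0.0
    bounces = []
    while t < t_max:
        v -= G * dt              # Newton's Second Law: constant downward acceleration
        y += v * dt
        if y <= 0.0 and v < 0.0:  # the ball hit the floor
            y = 0.0
            v = -v * RESTITUTION  # reverse and damp the velocity
            bounces.append(round(t, 2))
        t += dt
    return bounces

print(simulate_bounce(1.0))
```

Dropping the ball from one metre yields a first bounce at roughly 0.45 seconds, matching the analytic value of the square root of 2h/g.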

The capturing of visual data employs a bilinear interpolation algorithm applied to each surface (Fig. 2). A three-dimensional image is composed of many two-dimensional surfaces. The HoloLens scans and samples the surfaces in view and fills in the missing data points based on the ones recorded.

Figure 2. Bilinear interpolation

The depth-sensing cameras in the HoloLens provide the coordinates of the triangles in Figure 1, while the interpolation method computes the remaining points to convert the triangles into solid surfaces. The interpolation along the x-axis is performed independently of the y-axis; the two results are then combined to produce the (x, y) coordinates of the inner point.
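This two-step procedure can be illustrated with a short sketch (in Python for readability, not the project’s C# code): each pair of corner samples is interpolated along x independently, and the two intermediate values are then combined along y.

```python
# Hedged sketch of bilinear interpolation as described above.

def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

def bilinear(q00, q10, q01, q11, tx, ty):
    """q00..q11 are sampled values at the four corners of a unit cell;
    (tx, ty) is the fractional position of the inner point."""
    lower = lerp(q00, q10, tx)    # along x at the lower edge
    upper = lerp(q01, q11, tx)    # along x at the upper edge
    return lerp(lower, upper, ty)  # combine the two results along y

# The centre of a cell is the average of its four corner samples:
print(bilinear(0.0, 1.0, 2.0, 3.0, 0.5, 0.5))  # → 1.5
```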

For example, if the coordinates of two points A(x1, y1) and B(x3, y3) are measured, then the point C(x2, y2) can be obtained using the formula in Figure 3:

Figure 3. Interpolation calculation
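Figure 3 appears as an image in the original post; written out, the standard linear-interpolation formula it presumably depicts, using the points defined above, is:

```latex
y_2 = y_1 + \frac{(x_2 - x_1)\,(y_3 - y_1)}{x_3 - x_1}
```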

The above process is tediously repeated to scan every single object in a room and calculate the internal 3D representations of all objects. Thus, the virtual environment is created. The creation of the immersive, virtual environment that enables embedding virtual objects into real ones is possible by inserting a layer of visual information in the field of view of the person wearing the HoloLens (Figure 4).

Figure 4. A virtual ball appears in real space

The HoloLens recognizes a range of hand gestures which can be leveraged to interact with virtual objects. For example, an air tap in combination with a voice command can be used to “grab” a virtual object (e.g. the ball) and place it on a real table. The HoloLens, being a full-fledged computer, can be programmed to calculate values using mathematical formulas and display those results as another layer in the user’s field of view (Figure 5).
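As a schematic illustration of this interaction model (all names below are hypothetical; the actual project uses Unity’s C# event APIs, not these functions), gesture and voice inputs can be thought of as events dispatched to handlers that update a small piece of state:

```python
# Hypothetical sketch of gesture/voice dispatch; illustrative names only,
# not the HoloLens or Unity API.

held_object = None

def on_air_tap(target):
    """An air tap 'grabs' the virtual object currently being gazed at."""
    global held_object
    held_object = target
    return f"grabbed {target}"

def on_voice_command(command):
    """A voice command places the currently held object, if any."""
    global held_object
    if command == "place" and held_object is not None:
        placed, held_object = held_object, None
        return f"placed {placed} on table"
    return "ignored"

print(on_air_tap("ball"))         # → grabbed ball
print(on_voice_command("place"))  # → placed ball on table
```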

Figure 5. Display of various calculations
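One way values like those in Figure 5 could be derived is by finite differences over positions sampled each frame. The sketch below is an assumption about the approach rather than the team’s actual code: velocity is the change in position per time step, and acceleration the change in velocity.

```python
# Illustrative sketch: estimate distance, velocity and acceleration from
# positions sampled every dt seconds, using finite differences.

def kinematics(positions, dt):
    """Return total distance travelled plus per-interval velocity and
    acceleration estimates for a list of sampled positions."""
    distance = sum(abs(b - a) for a, b in zip(positions, positions[1:]))
    velocities = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    accelerations = [(v2 - v1) / dt for v1, v2 in zip(velocities, velocities[1:])]
    return distance, velocities, accelerations

# A freely falling ball: positions follow y = -0.5 * g * t^2 with g = 9.8
d, v, a = kinematics([0.0, -0.049, -0.196, -0.441], dt=0.1)
print(d)  # ≈ 0.441 m travelled
print(a)  # both estimates ≈ -9.8 m/s^2
```

For positions generated by uniform acceleration, the finite-difference estimates recover g exactly, which is what the headset would display for the falling ball.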

Outcome

A demonstration of the HoloLens is available here:

While this is a conceptual demo, it shows both what is possible and the considerable effort required to create a useful application. A team of four junior computer science students (Shaun Wang, Chandler Van Dyke, Tommy Fowler and Joshua Lee) developed this project over 12 weeks. The skills employed included programming in C# in Unity 3D and a solid foundation in linear algebra, computer graphics and physics. The programming and mathematical effort was necessary to build the environment and to communicate information about the physical surroundings to the HoloLens. Fortunately, the device handles the heavy lifting of image processing, decoding human gestures and parsing voice commands. In addition to the effort of building the system, there is the need to develop the content (physics in this case), a teaching methodology, an assessment methodology and an overall approach to curricular integration.

The effort was successful and worthwhile. We firmly believe that in the near future, virtual reality and mixed reality will become as ubiquitous as any computer application currently in use. This shift will change the way we consume visual content and interact with objects in a 3D environment.

Benefits

We demonstrated several key benefits of using this technology:

  1. The ability to create a physics experimentation lab anywhere, eliminating the need for a dedicated lab or instrumentation.
  2. Active student engagement and interaction while showing how the laws of physics apply to real objects.
  3. The potential for adding virtual instruction, assessment and custom learning pathways for a personalized learning experience.
  4. The possibility of offering science courses to remote locations and online students.
  5. Elimination of expenses associated with the procurement, storage and maintenance of expensive lab equipment, while at the same time providing the most modern type of lab instrument: a simulated one.
  6. Creation of a highly stimulating and engaging environment, in which students with a wide range of learning styles can thrive.

To learn more about how Grand Canyon University’s Department of Computer Science in the College of Science, Engineering and Technology is providing students with modern and relevant curricula to give them an edge in the STEM field, visit our website or click the Request More Information Button on this page.

About College of Science, Engineering and Technology

The College of Science, Engineering and Technology offers degree programs that prepare students for high-demand professions in science, technology, engineering and math (STEM) fields. With an emphasis on Grand Canyon University’s Christian worldview, our college believes in instilling social awareness, responsibility, ethical character and compassion. Our blog, Brain STEM, focuses on topics related to science, engineering and technology, with engaging contributions from students, staff and faculty. 

