We embarked on a series of experiments intended to assess the feasibility of transforming objects commonly found in a regular classroom (tables, floors, and walls) into an interactive physics lab. The enabling technologies are the Microsoft HoloLens™ mixed reality device and the Unity™ 3D programming environment.
We were able to create a proof of concept demonstration, focused on experiments typically used in the context of teaching Newton’s Second Law of Motion and projectile motion. The environment was created by producing an immersive, virtual reality-based layer which fully interacts with the surrounding world. For example, when a student wears the Microsoft HoloLens, he or she sees a virtual ball bounce off the real table and onto the real floor, while the system calculates distance, velocity, and acceleration. The student can interact with the ball using gestures and voice commands. Calculations associated with the experiment can be performed in virtual space using voice commands. Figure 1 depicts the initial capture of environmental information, an essential first step towards building the immersive, mixed reality environment.
Figure 1. The HoloLens recognizes and captures the dimensions of the floor
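The quantities reported during the bounce experiment follow directly from elementary kinematics. As a minimal sketch (written in Python for illustration; the actual application performs these calculations in Unity scripts, and all names and parameters here are hypothetical), the distance fallen, speed, and acceleration of a dropped ball can be computed as:

```python
# Minimal free-fall kinematics sketch for a ball released above a table.
# Illustrative only: names and parameters are hypothetical, not the
# application's actual code.

G = 9.81  # gravitational acceleration, m/s^2

def free_fall_state(h0, t):
    """Return (distance fallen, speed, acceleration) t seconds after
    release from height h0, taking downward as positive."""
    distance = 0.5 * G * t ** 2   # distance fallen, m
    velocity = G * t              # instantaneous speed, m/s
    return min(distance, h0), velocity, G

# State 0.5 s into a 1.5 m drop:
d, v, a = free_fall_state(1.5, 0.5)
print(f"fallen {d:.2f} m, speed {v:.2f} m/s, acceleration {a} m/s^2")
```

The same relations, evaluated each frame, supply the values the headset overlays next to the moving ball.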
The capture of visual data employs a bilinear interpolation algorithm applied to each surface (Fig. 2). A three-dimensional image is composed of many two-dimensional surfaces. The HoloLens scans and samples the surfaces in view and fills in the missing data points based on those it has recorded.
Figure 2. Bilinear interpolation
The depth-sensing cameras in the HoloLens provide the coordinates of the triangle vertices in Figure 1, while the interpolation method computes the remaining points to convert the triangles into solid surfaces. Interpolation along the x-axis is performed independently of the y-axis; the two results are then combined to produce the (x, y) coordinates of each inner point.
For example, if the coordinates of two points A(x1, y1) and B(x3, y3) are measured, then a point C(x2, y2) lying between them can be obtained using the formula in Figure 3:
Figure 3. Interpolation calculation
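The two-step scheme described above, interpolating along x first and then combining along y, can be sketched as follows. This is an illustrative Python version only; the function and variable names are hypothetical, not taken from the HoloLens SDK.

```python
# Bilinear interpolation sketch: given depth samples at the four
# corners of a grid cell, estimate the value at an interior point by
# interpolating along x first, then combining along y.
# Names are hypothetical, for illustration only.

def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]
    (the one-dimensional step shown in Figure 3)."""
    return a + (b - a) * t

def bilinear(q00, q10, q01, q11, tx, ty):
    """q00..q11 are values at corners (x0,y0), (x1,y0), (x0,y1),
    (x1,y1); tx and ty are fractional positions within the cell."""
    top = lerp(q00, q10, tx)      # interpolate along x at y0
    bottom = lerp(q01, q11, tx)   # interpolate along x at y1
    return lerp(top, bottom, ty)  # combine the two along y

# Value at the centre of a cell whose corner values are known:
print(bilinear(1.0, 2.0, 3.0, 4.0, 0.5, 0.5))  # → 2.5
```

Applied across every cell of the sampled mesh, this fills the triangles of Figure 1 into continuous surfaces.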
The above process is repeated for every object in the room to compute internal 3D representations of all objects; the virtual environment is thus created. Embedding virtual objects into real ones then becomes possible by inserting a layer of visual information into the field of view of the person wearing the HoloLens (Figure 4).
Figure 4. A virtual ball appears in real space
The HoloLens recognizes a range of hand gestures which can be leveraged to interact with virtual objects. For example, an air tap in combination with a voice command can be used to “grab” a virtual object (e.g. the ball) and place it on a real table. The HoloLens, being a full-fledged computer, can be programmed to calculate values using mathematical formulas and display those results as another layer in the user’s field of view (Figure 5).
Figure 5. Display of various calculations
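The calculations displayed in Figure 5 can be driven by the ball's sampled positions. As a sketch of the kind of per-frame computation involved (again in Python for illustration; the function name and sample values are hypothetical), speed and acceleration can be estimated from successive positions by finite differences:

```python
# Sketch: estimate per-interval speeds and accelerations from ball
# positions sampled at a fixed time step, using finite differences.
# Names and sample data are hypothetical, for illustration only.

def finite_differences(positions, dt):
    """Given positions sampled every dt seconds, return the list of
    per-interval velocities and per-interval accelerations."""
    velocities = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    accelerations = [(b - a) / dt for a, b in zip(velocities, velocities[1:])]
    return velocities, accelerations

# Heights fallen (m) by a ball sampled every 0.1 s (~0.5 * 9.81 * t^2):
h = [0.0, 0.049, 0.196, 0.441]
v, a = finite_differences(h, 0.1)
print(v)  # per-interval speeds, ≈ 0.49, 1.47, 2.45 m/s
print(a)  # per-interval accelerations, ≈ 9.8 m/s^2 (≈ g)
```

Results such as these are what the application renders as an overlay in the user's field of view.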