2025-Group 3

Virtual Labyrinth

Project team members: Katie Kubiatko, Paul Portmann, Michelle Yao, Allen Luo

This project aims to create a haptic device that realistically renders a marble rolling through a maze and to explore the benefits of haptic feedback on user performance when navigating the maze. The project was inspired by the potential of VR to integrate games and sensations that are easily enjoyed in the real world. By using controls-based torque feedback and vibration feedback on the axles, the mass and movement of the marble can be conveyed realistically to the user. Visual rendering on a screen display, along with rolling sounds, further improves the realism of the experience. Overall, the implemented controls were very successful, with participants rating both realism and noticeability highly, especially when all of the feedback modalities were used in tandem.


Team photo with the labyrinth.

Introduction

Our project integrates torque feedback and vibration to provide haptic information to the user about the position and movement of a virtual marble as it moves through a maze. Our haptic device provides indirect haptic sensation rather than direct interaction with a virtual object, and we wanted to investigate whether this kind of touch sensation would be possible. The Labyrinth game is a classic puzzle game that demands precise balance and hand-eye coordination to move the marble through the maze, and we felt this would be an interesting way to test indirect perception of a virtual object. Virtual versions of this classic puzzle game exist, using built-in IMUs on tablets, phones, controllers, etc., but due to the limitations of these devices, they cannot render the actual torques and forces that result from the weight of the ball and rely purely on visual or auditory cues. We wanted to see whether these additional haptic cues make it easier or harder to guide the virtual ball, and whether they could allow players to guide the ball through the maze without the use of sight.

Background

Due to the multifaceted nature of our project, prior work on vibrational feedback, kinesthetic feedback, and auditory assistance is all relevant. While working on integrating each of these components to give users the most realistic feeling of movement possible, our group researched previous work in each of these areas to find the best approaches.

In reviewing past work relevant to our project, we looked for examples of how others had haptically recreated moving masses. Frissen et al.'s 2022 paper on rolling objects in handheld containers tests how well humans can sense a rolling object inside a container without being able to see it. Their experiments varied the rolling angle and sound feedback and compared different forms of feedback, including collisions and vibrations. From this, we learned that simply recreating the noise vibrations of the rolling ball is sufficient to give the user a convincing illusion of a rolling ball. The paper's method of producing the rolling noise, sampling a sine wave to simulate surface texture with a frequency proportional to ball velocity, is a good starting point for designing the vibrational feedback the user should feel. We also learned that rendering impacts is less important, so we do not have to simulate impacts perfectly to be effective. The paper further suggests that users' underestimation of distance could result from the absence of torque due to the shifting mass of the ball, which means that integrating this rolling noise with our two-axis torque feedback could yield good results.

In line with the sound effects that could augment this experience, Hermes' paper on a method for synthesizing the sound of a rolling ball is also relevant prior work on the effect of sound in combination with haptics. Their results could be used to produce vibrational and/or auditory feedback associated with marble rolling, enhancing the haptic illusion. The materials of both the ball and the surface it rolls on affect the sound that is produced. The study theorizes that rolling sounds are vibrations caused by impacts between small textural irregularities of the ball and the rolling surface. In summary, different materials and textures, with their corresponding auditory effects, can create different perceptions. This method of synthesizing the sound of a rolling ball could be helpful for producing accurate auditory and vibrational feedback for our project.

Park et al. (2019) demonstrate the effectiveness of combined vibrotactile feedback and kinesthetic feedback for haptic rendering of collisions. Each of these modalities is limited but supplemented by the other. Park et al. (2019) designed a hybrid device that implements both high-frequency vibrational feedback using voice coils and low-frequency “impact” feedback using solenoids and permanent magnets. The device was tested in a user study that sought to assess the subjective quality of haptic rendering via vibrotactile versus kinesthetic feedback for several simulated materials. For haptic rendering of steel and wood, a combination of both modalities was reportedly most realistic, felt the least unnatural, and was most liked in comparison to one type of feedback on its own. For the rendering of rubber, vibrational feedback significantly outperformed “impact” feedback and barely outperformed combined modality feedback across the same categories. This suggests that vibrations are most important for rendering rubber-like materials, whereas the rendering of non-polymeric materials can benefit from a combination of kinesthetic and vibrational haptic feedback. These results justify the implementation of both types of feedback in our device, especially since we hope to provide the sensation of playing a classic tabletop Labyrinth game made out of wood with a metal/glass marble.

Methods

Hardware Design and Implementation

Our design takes the original Labyrinth game and augments its handles and axles. To accommodate the additional electronics, sensors, and motors required to achieve the haptic sensations we wanted, we created a custom base design.


Diagram of the 2 degree-of-freedom platform control mechanism used in many traditional tabletop Labyrinth games. Our project implements this design with the addition of motors and capstan drives used to provide kinesthetic feedback.

The original design of the game includes an outer casing, a middle frame, and an inner frame. All of these frames were designed in CAD, laser cut from birch plywood, and assembled using wood glue and nails. The two inner frames sit on wooden dowel rod axles in the length- and width-wise directions respectively, which allows for dual-axis tilt motion. Because the handles should not move with the platforms, the handles attach to two separate axles lower than the platforms, which tie to their respective frames with strings or wire wrapped under each axle. This creates a motion translation similar to how a capstan works, with the string having high friction against the axles. To improve the friction, we wrapped neoprene around the long axles where the string is in contact. 3D-printed shaft collars and knobs are attached to both sides of the axles to secure them in place.


CAD model of the hardware.

Since we intended to fit a tablet into the inner frame instead of having the actual physical maze, the inner frame is empty, with a cut-out to make fixes easier to access. Additionally, cut-outs on the walls of the outer box were necessary so that wiring, adjusting the capstans, and pressing the reset button on the Arduinos would be accessible without removing the tablet.

Electronic Components

For our haptic additions, the first part involved attaching a sector to each of the long axles that the knobs turn. Each sector is very similar to the Hapkits used throughout the course, but scaled down to a 2-inch radius. A stand for the motor and Arduino was 3D printed out of PLA and screwed into each of the walls for stability. When the knob turns, it turns the sector, which in turn rotates the motor shaft; this shaft carries the magnet, and the magnetometer reads the resulting position of the knob.

The second part of our electronics system involves adhering two small coin motors to the insides of the handles. These motors are soldered to wires, which route through the knob and into the Arduinos. The purpose of these motors is to vibrate to simulate wall collisions.

Both Arduinos are wired to each other, with one primary board and one secondary board used for its magnetometer. The primary board handles all communication with Processing and powers and controls both motors as well as the coin motors embedded in the handles. The secondary board's magnetometer pin is directly connected to one of the analog input pins on the primary Arduino, and the secondary board executes no code. The wiring diagram can be seen below:


Wiring diagram.


System Analysis and Control

This system was designed using a combination of Processing software and Arduino code. The torque outputs, communication to the motors, and the angle measurements were handled by the Arduinos, which communicated via Serial with the Processing script running on the computer. From these angles, the position, velocity, and acceleration of the ball were calculated using a physics model. The position and velocity were sent back to the Arduino to modulate the torque and the vibration respectively, and collision data was sent to provide additional force feedback.
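As a concrete illustration of the physics model described above, the per-frame update can be sketched as follows. This is a minimal Python sketch, not the project's actual Processing code; the damping factor and function names are illustrative assumptions.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def step_ball(pos, vel, tilt_x, tilt_y, dt, damping=0.999):
    """Advance the virtual ball one timestep. tilt_x and tilt_y are the
    plane's tilt angles (radians) read from the handle magnetometers;
    the in-plane component of gravity accelerates the ball. The damping
    factor is an illustrative stand-in for rolling losses."""
    ax = G * math.sin(tilt_x)
    ay = G * math.sin(tilt_y)
    vx = (vel[0] + ax * dt) * damping
    vy = (vel[1] + ay * dt) * damping
    return (pos[0] + vx * dt, pos[1] + vy * dt), (vx, vy)
```

With zero tilt the ball stays put; tilting the plane about one axis accelerates the ball along that axis, and the resulting position and velocity are what get sent back to the Arduino each frame.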


Diagram of the electronics system structure.

Arduino (Haptic Rendering)

To achieve our goal of accurately rendering the weight of the virtual ball, we used a simple model of two perpendicular lever arms, each simply supported by a pivot point at the center. The torque at each motor is therefore the moment arm of the virtual ball from the centerline multiplied by the component of gravity perpendicular to the labyrinth plane. We calculate the angle of the plane using the magnetometers attached to the Arduinos and the magnets on the capstan drives. The motors are then driven with PWM from the Arduino to render this torque.
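A minimal sketch of this torque computation, in Python for illustration. The ball mass, the motor's maximum torque, and the PWM mapping are assumed values, not the project's tuned numbers.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def gravity_torque(offset_m, ball_mass_kg, tilt_rad):
    """Torque about one axis: the ball's moment arm from the centerline
    times the component of its weight perpendicular to the plane."""
    return offset_m * ball_mass_kg * G * math.cos(tilt_rad)

def torque_to_pwm(torque, max_torque=0.02):
    """Map a signed torque to a direction flag and an 8-bit PWM duty,
    clamped at an assumed maximum motor torque (N*m)."""
    duty = int(255 * min(abs(torque) / max_torque, 1.0))
    return (1 if torque >= 0 else -1), duty
```

A ball sitting on the centerline produces zero torque, and as it rolls toward an edge the rendered torque grows until the PWM duty saturates at 255.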

For the vibrational tactile feedback of collisions of the ball with the walls of the maze, we have Processing send a binary signal when it detects either an x or y-axis collision with a maze wall. The Arduino then triggers a fixed pulse to the coin motors embedded in the handles when this binary signal switches from not colliding to colliding. There is additional logic to prevent false pulses due to noise using a time lockout between pulsed collisions.
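The edge-triggered pulse with a time lockout can be sketched as follows (a Python illustration of the Arduino-side logic; the 50 ms pulse width and 100 ms lockout are assumed values, not the project's constants).

```python
class CollisionPulser:
    """Edge-triggered vibration pulses with a lockout window, mirroring
    the debounce logic described above. All times are in milliseconds."""

    def __init__(self, pulse_ms=50, lockout_ms=100):
        self.pulse_ms = pulse_ms
        self.lockout_ms = lockout_ms
        self.prev_colliding = False
        self.last_pulse_start = -10**9  # "long ago"

    def update(self, colliding, now_ms):
        """Return True while the coin motors should be driven."""
        rising_edge = colliding and not self.prev_colliding
        self.prev_colliding = colliding
        # Start a new pulse only on a not-colliding -> colliding transition,
        # and only if the lockout window since the last pulse has elapsed.
        if rising_edge and now_ms - self.last_pulse_start >= self.lockout_ms:
            self.last_pulse_start = now_ms
        return now_ms - self.last_pulse_start < self.pulse_ms
```

The lockout suppresses rapid re-triggers when noisy collision flags flicker on and off near a wall.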

Processing (Physics and Rendering)

For the rendering of the walls in the maze, a Wall class was created, which holds the dimensions of the wall, the display function, as well as collision detection methods (isXColliding/isYColliding) that determine collisions based on the current position of the ball compared to each wall. These functions are later called in succession to check for collisions with each wall individually. System dynamics were simulated using simple particle physics equations.
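A sketch of such a Wall class is shown below in Python for illustration; the method names mirror the isXColliding/isYColliding functions described above, but the exact geometry tests are our own assumptions.

```python
class Wall:
    """Axis-aligned wall with per-axis collision tests against a circular
    ball of radius r. Coordinates are in the maze's units."""

    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h

    def is_x_colliding(self, bx, by, r, vx):
        """Ball overlaps a vertical face while moving into the wall."""
        overlap_y = self.y - r < by < self.y + self.h + r
        hit_left = vx > 0 and self.x - r <= bx <= self.x
        hit_right = vx < 0 and self.x + self.w <= bx <= self.x + self.w + r
        return overlap_y and (hit_left or hit_right)

    def is_y_colliding(self, bx, by, r, vy):
        """Ball overlaps a horizontal face while moving into the wall."""
        overlap_x = self.x - r < bx < self.x + self.w + r
        hit_top = vy > 0 and self.y - r <= by <= self.y
        hit_bottom = vy < 0 and self.y + self.h <= by <= self.y + self.h + r
        return overlap_x and (hit_top or hit_bottom)
```

Checking the velocity sign as well as the overlap keeps a ball resting against a wall from re-triggering collisions every frame.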

Sound Effect Implementation

The rolling and collision sound effects were implemented in Processing using the Sound library. We found publicly available sound clips of a ball rolling and colliding and edited them in Audacity to adjust volume and pitch, as well as to ensure that the rolling sound was convincing when looped. The edited clips were imported into Processing and dynamically adjusted using tools in the Processing Sound library.

For rolling noise, when the velocity magnitude rises above a threshold, we begin looping a sound file, and we stop it when the velocity falls below this threshold. We also use the magnitude of the virtual ball's velocity to modulate the playback speed.

For the collision sound effects, we play the audio clip whenever a collision above a certain velocity is detected. The volume is adjusted such that it is proportional to the ball’s collision velocity.
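The two sound behaviors described above can be sketched as small mapping functions (Python for illustration; the threshold, rate mapping, and maximum speed are assumed values, not the project's tuned parameters).

```python
def rolling_sound(speed, threshold=0.05):
    """Return (playing, rate): whether the looped rolling clip should play,
    and a playback rate that grows with the ball's speed."""
    playing = speed > threshold
    rate = 0.5 + speed  # faster ball -> faster, higher-pitched loop
    return playing, rate

def collision_volume(impact_speed, max_speed=2.0):
    """Collision clip volume proportional to impact speed, clamped to [0, 1]."""
    return max(0.0, min(impact_speed / max_speed, 1.0))
```

In Processing, `rate` would be fed to the Sound library's playback-rate control on the looped clip, and `collision_volume` would set the amplitude of the one-shot collision clip.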

Demonstration and Application

For our demonstration, a tablet will be placed in the inner frame of the box. Then, using a video calling service, the computer running the Processing program will screen share Processing’s visual rendering to the tablet to use it as a display. For general demonstrations, the maze will be visible on the screen, and participants may move the handles to attempt the maze.

For testing, participants will be asked to rate the feeling of the marble rolling on a scale of 0 to 5 in terms of qualitative noticeability and realism. Additionally, we aim to do a true test of haptics by turning off the visuals. By turning this visual aid off, we aim to see if the participants can actually feel the “virtual” marble rolling around on the surface, without any visual guidance that might prime them to feel feedback. Other than these tests, we also collected general feedback and comments on the project.


Photo of the virtual labyrinth as presented at the project demonstration.

Results

Overall, the project is successful in creating convincing haptic effects that can be felt by the participants. Participants who tried the game during the open house noted that they could feel the movement of the ball across the surface, although a few of them mentioned that the feeling of the tablet’s mass was also apparent. Below is a chart of how participants rated the effect based on noticeability and realism on a scale of 0 (not noticeable/realistic) to 5 (very noticeable/realistic).


Histograms displaying the reported noticeability and realism of the haptic feedback.

Most users at the open house rated the noticeability and realism quite high, with averages of about 4.33 out of 5 for noticeability and 4.13 out of 5 for realism. Among other comments, users noted that the illusion of the marble's weight was especially apparent near the edges of the maze. Additionally, when the visuals were turned off, the movement of the marble could still be felt, and the general behavior of the marble could be detected through haptics alone. Some participants noted a delay that lowered realism, but this did not have a significant negative impact.

Future Work

For future testing, different elements of the maze could be added and removed in separate timed trials to see whether the effect relies more on one form of aid than another. In particular, turning off the haptic feedback would be a good test of whether participants can still accurately move the ball with no forces present to help them. To account for practice bias, we would randomize which trial each participant attempts first.

Another potential improvement is adding vibrations that simulate the ball rolling, in addition to the current collision vibrations. This would require eccentric mass motors (or piezoelectric actuators) capable of producing much higher frequencies; these vibrations would be paired with the current rolling sound effects. Another future improvement is using a display better suited for this device, or replacing the platform tablet with a standing display. Even though the video call screen share had fairly low latency, the delay between the computer and tablet was occasionally noticeable.

Files

Code Files: https://github.com/13strings/me327

List of Materials: g3_list.pdf

References

C. Park, J. Park, S. Oh and S. Choi, "Realistic Haptic Rendering of Collision Effects Using Multimodal Vibrotactile and Impact Feedback," 2019 IEEE World Haptics Conference (WHC), Tokyo, Japan, 2019, pp. 449-454, doi: 10.1109/WHC.2019.8816116.

Frissen, I., Yao, H.-Y., Guastavino, C., & Hayward, V. (2022). "Humans sense by touch the location of objects that roll in handheld containers." Quarterly Journal of Experimental Psychology, 76(2), 381-390. https://doi.org/10.1177/17470218221086458

Hermes, D.J. "Synthesis of the Sounds Produced by Rolling Balls." IPO-Rapport, vol. 1226, 9 Aug. 2000, pp. 1-9. Accessed 15 May 2025.


Appendix: Project Checkpoints

Checkpoint 1

The first checkpoint goal for our group was to create a plan for the physical assembly of our mechanism, as we expected it to be somewhat more physically complicated than the simple Hapkit. The goal was to have a first iteration of the labyrinth CAD ready for group feedback, along with a list of materials. Our group's second goal was to begin the Arduino code for our project. By having a framework for our code, we would be able to recalibrate and better understand the limits of our resources in creating a realistic and noticeable virtual effect. We were able to create a rough draft CAD model of the physical project, with help from existing CAD models of the labyrinth game and from online tutorials. Additionally, we outlined pseudo-code for the Arduino portion of the project, which provides a strong basis for how we'll approach the rest of the code. We have also completed preliminary physics simulations of the 2D tilting of the platform and the ball's response. Code is on the repo.

Currently, we have a few changes to our plans. One change is that we realized the tablet used for the graphics rendering may be heavier than intended. This could affect the user's sense of realism, since the mass of the tablet is not part of the original game and could be comparable to the mass of the marble. To address this, we intend to increase the virtual density of the marble, so that the user may still feel the weight of the tablet, but less so than the marble. Another approach could be implementing controls to counteract the weight, but we decided that this could lead to complications, and that priming the user to expect the weight of the tablet might be sufficient in this case.

Another change that we've discussed concerns possible lag between the rendering screen (tablet surface) and the actual Processing software. The lag between Arduino and Processing is not very noticeable on the same computer, but because we wanted to render the graphics on the tablet, we needed a solution for this connection. Testing the AirPlay feature between a MacBook and an iPad will be necessary to establish whether this lag is significant enough to cause problems; if it is, our backup plan is to remove the tablet and render on a laptop screen, attaching a static maze to the platform in its place.

Finally, we discussed removing the holes from the classic labyrinth game. In the classic game, pitfalls exist for the ball to fall through when the player makes a mistake, whereupon the ball exits the box and the game restarts. We realized that accounting for this mechanic would not only add code and complexity, but the haptic feedback for the falling motion might also be confusing for the user. As a compromise, our team decided to remove pitfalls from the virtual maze and focus on wall collisions as the priority for this project. If time permits, the pitfall mechanic may be explored after that first priority has been fulfilled.

Checkpoint 2

Our second checkpoint goals were:

- Complete manufacturing of the components.

- Assemble the preliminary maze.

- Complete maze collision code.

- Complete haptic feedback code.

- Complete basic wiring diagram.

Mechanical updates:

We have successfully completed our CAD design and achieved our goal of laser cutting and 3D printing the necessary components from our design. Images of our manufactured components can be seen below.

"

In terms of final project assembly, our manufacturing group has begun to assemble the final design using the components we 3D printed and laser cut. We were unable to completely finish assembly because of issues found during the component manufacturing process, mainly due to tolerancing.

Images of the current state of assembly can be seen below.

Software Updates:

For the software, we successfully achieved our goal of completing the preliminary maze collision code. Code for the ball rolling inside the maze, with wall collisions and physics-based dynamics, has been created and tested. Currently, the code is being debugged for ball collisions getting stuck in corners. Creating the communication between the two Arduinos was somewhat challenging because the SoftwareSerial library only runs reliably at a low baud rate, which was necessary for the secondary Arduino to successfully send values to the primary board.

An image of the current render can be seen below.

We were not able to complete the secondary goal of finishing the haptic feedback code because of manufacturing delays. However, we have created pseudo-code for our expected controller and are confident we can implement it quickly once assembly has finished.

The remaining goals are integrating motor vibration upon contact as well as generating the torque responses upon ball rolling.

Electronics Updates:

Our goals for electronics were relatively simple because we expected some delays in manufacturing. A KiCad schematic of the project's hardware was created for ease of understanding; it can be seen below.

Challenges and Changes:

The most notable challenge we identified this week was screen casting from a computer to the iPad. We found a couple applications that provided wireless screen sharing but they came with non-negligible latency. We have decided to create a backup plan of only displaying the maze without the ball.

Example Video: https://www.youtube.com/watch?v=i_aLBql4Ufo