2024-Group 17
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///CeOm.gif)
3D Whack a Mole - 3 DoF HapKit Rendering Platform
Project team member(s): Stanley Wang, Steven Salah-Eddine, Lingqi Li, Souparna Gangopadhyay
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///group17photo2.jpg)
Caption:
Team with 3 DoF Haptic Gaming Setup
at the 2024 Haptics Open House
Our 3 DoF HapKit rendering platform provides users with enhanced interactive gaming experiences through haptic feedback. This versatile device is constructed from three HapKits arranged in a delta configuration, which enables users to navigate precisely in a 3D workspace. With real-time haptic feedback, users can feel and explore virtual environments in a more immersive and engaging way. To demonstrate possible applications of our 3 DoF HapKit rendering platform, we implemented a haptically augmented "Whack-a-Mole" game that utilizes both 3D movement and realistic haptic feedback. We received positive gameplay feedback from more than 60 users during hands-on demonstrations at the Haptics Open House. With its capabilities and flexibility, our 3 DoF haptic device can be readily adapted to educational and recreational applications based on developers' needs.
Introduction
Our project seeks to design a low-cost 3-DOF haptic device based on a Delta configuration to enhance interactive experiences across an array of different applications. Understanding the capabilities of haptic technology, we aim to develop our low-cost haptic device to create immersive experiences in areas such as gaming, textural rendering for artwork, and remote haptic board games.
The decision to design a 3-DOF haptic device is driven by both the mechanical novelty of such a system (utilizing existing Hapkits in an exciting new way) as well as the broad range of potential applications. The wide range of motion and interaction capabilities of a Delta mechanism make it suitable for numerous applications. Our goal is to design the device with a modular architecture, allowing users to customize and upgrade specific features to suit their particular needs -- from professional settings involving precise artwork operations to personal and recreational uses like gaming and interacting with family members virtually.
With respect to the controls and dynamics of the device, we plan to focus on developing a robust control algorithm that can adapt to a wide variety of applications. Each application demands its own force feedback control scheme, and users will also be able to program their own applications on this versatile haptic device in the future.
Background
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///3-DOF.jpeg)
Caption:
3-DOF Haptic Device
Delta Configuration.
The concept of a 3-DoF delta parallel mechanism built from existing Hapkits originated with Zhang et al. [1] at Johns Hopkins University. The authors demonstrated the capabilities of such a system, motivating its use for 3D force rendering. A key feature of this paper lies in the implementation: one Hapkit is configured as the "leader" with two "follower" Hapkits, and observed positions and desired torque outputs are communicated between the Hapkits via an I2C protocol. This is closely related to our dual-Hapkit teleoperation setup from a prior assignment and may be useful to implement for our system as well. In the industrial setting, we additionally see the delta mechanism as a promising architecture for haptic interfaces; the delta.3 robot from Force Dimension is an excellent example. Olsson's thesis [2] provides an excellent reference on this platform and details the advantages of the delta system (large workspace, modularity and reconfigurability, etc.). Additionally, the manipulator kinematics and Jacobians are derived in this reference, which we will utilize for our control implementation. Lastly, on the application side of our project, Johnson et al. [3] provide a compelling argument for haptic technology in board games and entertainment media. The authors studied the effects of audio and haptic cues in enhancing the ability of visually impaired people to play board and card games, and noted a substantial increase in accessibility and playability. This motivates the use of our device as a low-cost, open-source technology capable of helping many people in society.
Methods
In this project, we aim to build a low-cost 3-DOF haptic device based on the delta parallel mechanism and expand the range of its applications in both recreational and professional settings, potentially assisting people with special needs in participating in social activities. We will follow the foundational design presented by Zhang et al. [1] to enhance our device's range of motion and sensitivity. This setup will allow for precise control and detailed force feedback. We plan to integrate the haptic device with gaming and virtual art sculpting environments where players can feel and interact with virtual objects. The device will provide realistic tactile feedback corresponding to the texture and resistance of different materials, making the user experience more immersive. If time permits, we will implement real-life setups for the applications.
Hardware Design and Implementation
We modified the three Hapkits we have and 3D printed the linkages and base to construct the delta configuration.
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///3DOF_device.png)
Caption:
Assembled 3-DOF Hapkit Device
Mechanical Components: 3D printed parts for device structure, screws and fasteners, and assembly tools.
Electronic Components: 3 Hapkits (Arduino Uno boards, motors, MR sensors), cable and connectors, breadboard.
System Analysis and Control
Forward Kinematics
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///3-DOF_link_length.png)
Caption:
3-DOF Hapkit Link Length
The forward kinematics for our 3-DOF HapKit based on a delta configuration involves calculating the position of the end-effector (handle) in 3D space based on the angles of the three arms. The kinematic equations are derived from the geometric relationships between the arm lengths L1, L2, the base radius R, and the end-effector radius r. Each arm's position is first calculated in a local coordinate system.
For each arm i (i = 1, 2, 3), the coordinates are shown below. The differences between the arm coordinates help set up the system of equations needed to solve for the global x, y, z coordinates of the handle. The equations are combined to form a system that can be solved using methods for nonlinear equations, yielding the x, y, z position of the handle.
These equations are implemented in the control software, where the actual angles are input from the HapKit's sensors, and the resulting position is calculated in real-time to render the appropriate haptic feedback based on the user's interaction with the virtual environment.
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///3DOF_eqns.png)
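As a concrete illustration of the numerical solution described above, below is a minimal, hedged sketch of how the handle position can be recovered by iterating Newton's method on the three sphere constraints |x − p_i|² = L2², where p_i is the "virtual elbow" of arm i (the elbow shifted inward by the end-effector radius). The link-length values, the angle convention, and the function names are illustrative assumptions, not our exact implementation.

```cpp
#include <math.h>

// Link lengths and radii -- placeholder values, not our actual CAD dimensions.
const float L1    = 0.10f;  // upper (driven) arm length [m]
const float L2    = 0.20f;  // lower (parallel) arm length [m]
const float RBASE = 0.08f;  // base radius to the driven joints [m]
const float REFF  = 0.03f;  // end-effector radius to the lower-arm joints [m]

// "Virtual elbow" of arm i: the elbow shifted inward by the effector radius,
// so the handle can be treated as a single point at distance L2 from each elbow.
// theta is the driven joint angle, measured from the horizontal base plane.
void virtualElbow(int i, float theta, float p[3]) {
  float phi = i * 2.0943951f;                   // arms spaced 120 degrees apart
  float rad = (RBASE - REFF) + L1 * cos(theta); // radial distance from the center
  p[0] = rad * cos(phi);
  p[1] = rad * sin(phi);
  p[2] = -L1 * sin(theta);                      // handle hangs below the base
}

// Newton iteration on the three sphere constraints |x - p_i|^2 = L2^2.
bool forwardKinematics(const float theta[3], float x[3]) {
  float p[3][3];
  for (int i = 0; i < 3; i++) virtualElbow(i, theta[i], p[i]);

  // Initial guess: straight below the centroid of the three virtual elbows.
  for (int k = 0; k < 3; k++) x[k] = (p[0][k] + p[1][k] + p[2][k]) / 3.0f;
  x[2] -= L2;

  for (int iter = 0; iter < 10; iter++) {
    float f[3], J[3][3];
    for (int i = 0; i < 3; i++) {
      float dx = x[0] - p[i][0], dy = x[1] - p[i][1], dz = x[2] - p[i][2];
      f[i] = dx * dx + dy * dy + dz * dz - L2 * L2;
      J[i][0] = 2 * dx;  J[i][1] = 2 * dy;  J[i][2] = 2 * dz;
    }
    // Solve J * d = f with Cramer's rule, then take the Newton step x -= d.
    float det = J[0][0] * (J[1][1] * J[2][2] - J[1][2] * J[2][1])
              - J[0][1] * (J[1][0] * J[2][2] - J[1][2] * J[2][0])
              + J[0][2] * (J[1][0] * J[2][1] - J[1][1] * J[2][0]);
    if (fabs(det) < 1e-9f) return false;        // near-singular configuration
    float d0 = f[0] * (J[1][1] * J[2][2] - J[1][2] * J[2][1])
             - J[0][1] * (f[1] * J[2][2] - J[1][2] * f[2])
             + J[0][2] * (f[1] * J[2][1] - J[1][1] * f[2]);
    float d1 = J[0][0] * (f[1] * J[2][2] - J[1][2] * f[2])
             - f[0] * (J[1][0] * J[2][2] - J[1][2] * J[2][0])
             + J[0][2] * (J[1][0] * f[2] - f[1] * J[2][0]);
    float d2 = J[0][0] * (J[1][1] * f[2] - f[1] * J[2][1])
             - J[0][1] * (J[1][0] * f[2] - f[1] * J[2][0])
             + f[0] * (J[1][0] * J[2][1] - J[1][1] * J[2][0]);
    x[0] -= d0 / det;  x[1] -= d1 / det;  x[2] -= d2 / det;
  }
  return true;
}
```

A few Newton iterations are enough at control-loop rates, since the handle moves only slightly between consecutive samples and the previous solution can seed the next call.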
The 3 DOF Delta Mechanism and Communication Setup
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///3DOF_leader_follower.png)
The 3 DOF Delta mechanism consists of one leader robot and two follower robots. The leader controls the motion of the system; in our case, this involves reading the magnetoresistive sensor data and sending commands to the two followers to achieve the desired motion or behavior.
To integrate the leader and follower code, we need to ensure that they communicate effectively and synchronize their actions. Both the leader and the two followers use Arduino-compatible boards for processing and control. Each board receives power (VCC and ground) from the power supply, and the ground pins (GND) of all three boards are tied together to provide a common ground reference. The leader and follower microcontrollers communicate over I2C: the I2C pins (A4 for SDA, A5 for SCL) of all three boards are connected together, forming a shared I2C bus. One board acts as the leader, while the other two act as followers. The leader initiates communication and controls data transfer on the bus, and each follower has a unique address, allowing the leader to address and communicate with each one individually.
The leader code reads the angular-displacement data from the magnetoresistive sensor attached to its board to determine the desired motion or behavior of the system. Based on the sensor data or user input, the leader calculates the necessary control commands and sends them to the followers over I2C; these commands instruct the followers on how to move or behave.
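Below is a minimal, hedged sketch of this leader/follower exchange using the Arduino Wire library. The follower addresses, the analog sensor pin, and the one-value packet format are illustrative assumptions rather than our exact code; the real sketches also carry position data and run the full force-rendering loop.

```cpp
// Hedged sketch of the shared-bus leader/follower exchange using the Wire library.
// Addresses, sensor pin, and packet format are assumptions, not our project code.
#include <Wire.h>

#define ROLE_LEADER                       // comment out when flashing a follower

const uint8_t FOLLOWER_ADDR[2] = {8, 9};  // assumed I2C addresses of the followers
const int SENSOR_PIN = A2;                // assumed analog pin for the MR sensor

#ifdef ROLE_LEADER

void setup() {
  Wire.begin();                           // leader joins the bus with no address
}

void loop() {
  int raw = analogRead(SENSOR_PIN);       // leader's own joint-angle reading
  // ... forward kinematics and force rendering would run here ...
  int16_t cmd = (int16_t)(raw - 512);     // placeholder torque command

  for (int i = 0; i < 2; i++) {           // address each follower individually
    Wire.beginTransmission(FOLLOWER_ADDR[i]);
    Wire.write(lowByte(cmd));
    Wire.write(highByte(cmd));
    Wire.endTransmission();
  }
}

#else  // follower board

volatile int16_t torqueCmd = 0;

void receiveCommand(int numBytes) {       // runs when the leader writes to us
  if (numBytes >= 2) {
    uint8_t lo = Wire.read();
    uint8_t hi = Wire.read();
    torqueCmd = (int16_t)((hi << 8) | lo);
  }
}

void setup() {
  Wire.begin(FOLLOWER_ADDR[0]);           // second follower would use FOLLOWER_ADDR[1]
  Wire.onReceive(receiveCommand);
}

void loop() {
  // ... apply torqueCmd to this Hapkit's motor driver (PWM + direction pin) ...
}

#endif
```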
Force Diagram
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///FD.png)
Caption:
Force Diagram
The diagram above shows the interaction forces with the virtual ground and holes that we created. The force on the hammer (end-effector), highlighted in red, points upward when the ground is hit from the top and downward when it is hit from below. There is no force in the hole region, except when the mole pops up through the hole: the user then feels an upward force when hitting the mole.
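The sketch below illustrates one way this force profile can be computed each control-loop iteration. The stiffness values, hole geometry, and mole state variable are assumed placeholders, not our tuned game parameters, and only the more common top-side contact is shown (distinguishing top from bottom contact in general requires tracking the previous, uncollided "proxy" position).

```cpp
// Hedged sketch of the virtual-ground force model (gains and geometry assumed).
const float K_GROUND = 400.0f;   // ground stiffness [N/m]
const float K_MOLE   = 250.0f;   // mole stiffness [N/m]
const float Z_GROUND = 0.0f;     // height of the ground plane in device coordinates
const float HOLE_R   = 0.02f;    // hole radius [m]

// True if the hammer's (x, y) position lies inside the hole centered at (hx, hy).
bool insideHole(float x, float y, float hx, float hy) {
  float dx = x - hx, dy = y - hy;
  return dx * dx + dy * dy < HOLE_R * HOLE_R;
}

// Vertical force on the hammer, given its position and the current mole state.
float groundForceZ(float x, float y, float z,
                   float holeX, float holeY,
                   bool moleUp, float moleTopZ) {
  if (!insideHole(x, y, holeX, holeY)) {
    // Solid ground: a spring pushes the hammer back up when it penetrates from above.
    if (z < Z_GROUND) return K_GROUND * (Z_GROUND - z);
    return 0.0f;   // hammer is above the ground, no contact
  }
  // Inside the hole: free motion, unless the mole has popped up and is being struck.
  if (moleUp && z < moleTopZ) return K_MOLE * (moleTopZ - z);
  return 0.0f;
}
```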
Graphical Interface
The graphics for our game were developed in Processing. We first created a background of green meadows with a blue sky and mountains on the horizon. All of the kinematic calculations and game logic run on the Arduino, and the results are sent to Processing over the serial connection. This information includes the position of the end-effector, which the human operator holds as a hammer, the positions of the ground and the holes, the randomized position of the mole, the remaining time, and the score of the game. To keep the loop time short and the communication fast, we ran the serial link between the Arduino and Processing at a 115200 baud rate.
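The snippet below sketches the Arduino side of this hand-off: one comma-separated line per loop iteration carrying the end-effector position, the active mole, the remaining time, and the score, which the Processing sketch can parse with bufferUntil('\n') and split(','). The field order and the placeholder values are assumptions, not our exact packet.

```cpp
// Hedged sketch of the Arduino -> Processing serial packet (field order assumed).
void setup() {
  Serial.begin(115200);                     // matches the baud rate on the Processing side
}

// Send one CSV line per control-loop iteration.
void sendState(float x, float y, float z, int moleIndex, float timeLeft, int score) {
  Serial.print(x, 4);        Serial.print(',');
  Serial.print(y, 4);        Serial.print(',');
  Serial.print(z, 4);        Serial.print(',');
  Serial.print(moleIndex);   Serial.print(',');
  Serial.print(timeLeft, 1); Serial.print(',');
  Serial.println(score);                    // newline terminates the packet
}

void loop() {
  // ... kinematics and game logic would update these values each iteration ...
  sendState(0.01f, -0.02f, 0.05f, 2, 17.5f, 9);   // placeholder values
}
```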
The authenticity of the experience was greatly enhanced by the quality of the graphics. During the initial stages of design and testing, we faced challenges in generating the proper force and visual feedback when hitting the ground and when hitting a mole. We kept adjusting the stiffness of both, as well as the mole's radius in the graphics, until the visual feedback and the force feedback of hitting a mole were properly synchronized. Adding a virtual background of a green meadow with brown moles popping up from the ground increased the realism of the game.
Demonstration / Application
Please click on the picture below to see a short video of the demonstration of our system.
The delta mechanism used in our device has many potential applications. First, haptic game simulations, which are becoming increasingly popular, can serve as low-cost, easily portable alternatives to existing forms of these games. The virtual nature of the games also has enormous potential to enable gameplay between two players at different locations. Beyond gaming, the delta mechanism can be used in industrial automation for tasks such as pick-and-place, assembly, and packaging, and in automotive manufacturing for tasks such as welding, painting, and assembling components on car production lines.
Results
More than 60 users tested our haptic device running the "Whack-a-Mole" game, and players were consistently impressed with the quality of the haptic rendering. We received very positive feedback on the rendering, realism, and smoothness of our 3D haptic device's handle movement as users explored during the game demonstration. We gave each user 30 seconds to play Whack-a-Mole, and one point was recorded each time they successfully whacked the moving mole; the total points counted as their final score.
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///openhouse_tryout.png)
Based on the 47 players who reported their scores to us, the mean score was 18.23 with a standard deviation of 7.83, and the highest score was 37.
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///Histogram.jpeg)
To conclude, the user experience with our novel haptic "Whack-a-Mole" gaming device was overwhelmingly positive. Besides the haptic device itself, participants enjoyed a 30-second game timer, a scoreboard, a leaderboard, and two projected screens showcasing the game progress, which enhanced the competitive and engaging atmosphere. The immersive haptic feedback and smooth gameplay were particularly praised, confirming that the device's rendering was successful.
Future Work
Use of better sensors (encoders)
Sometimes, when users moved the end-effector very fast, they felt a slight delay between the graphics simulation and the force feedback. This is due to de-calibration of the sensing: the system uses a magnetoresistive sensor to track the rotation of a magnet mounted concentrically on the motor shaft, but vibration in the system and wobbling of the motor shaft introduce an offset between the magnet and the sensor, and when the magnet moves too fast between Arduino loop cycles, the board can fail to register the motion. Employing encoders read with interrupts instead of magnetoresistive sensors would offer more precise position tracking and would enable the delta mechanism to operate at higher speeds.
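As a sketch of this proposed upgrade, a quadrature encoder can be read with hardware interrupts so that fast motion is never missed between loop iterations. The pin numbers and counts-per-revolution below are assumptions, not a chosen part.

```cpp
// Hedged sketch of interrupt-driven quadrature encoder counting (pins assumed).
const int ENC_A = 2;                 // external-interrupt-capable pins on an Uno
const int ENC_B = 3;
volatile long encoderCount = 0;      // updated inside the interrupt handler

void onEncoderA() {
  // On each edge of channel A, channel B tells us the direction of rotation.
  if (digitalRead(ENC_A) == digitalRead(ENC_B)) encoderCount++;
  else encoderCount--;
}

void setup() {
  pinMode(ENC_A, INPUT_PULLUP);
  pinMode(ENC_B, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(ENC_A), onEncoderA, CHANGE);
}

float jointAngleRad() {
  const float COUNTS_PER_REV = 2 * 48.0f;   // 48 CPR encoder, both edges of A (assumed)
  noInterrupts();                            // copy the shared counter atomically
  long c = encoderCount;
  interrupts();
  return (c / COUNTS_PER_REV) * 2.0f * 3.14159265f;
}

void loop() {
  float theta = jointAngleRad();             // position no longer depends on loop timing
  // ... forward kinematics and force rendering as before ...
}
```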
Cleaner game implementation
First, we would like to clamp the system to a rigid surface to avoid movement and vibration of the device while in use. This would give the user a much more rigid feel, since the user would not be able to tip the system over when using it at higher speeds. Secondly, we would implement serial communication over WiFi between the Arduino boards of the leader and the followers. This would reduce wire tangling, which can sometimes restrict the manipulator's workspace after repeated use. Finally, we could design and render 3D moles and hammers in Processing to give the user a more realistic experience while playing the game.
More accurate force rendering and dynamics
In future work, we would also like to implement more accurate force rendering. Right now, the forces users feel come from the plate and the mole in the XY plane. We would like to add forces from more directions so the gaming environment can have a more complex structure. We would also like to simulate more realistic dynamics when users hit the mole, such as damping and sound effects.
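For the damping mentioned above, a simple starting point would be a velocity-dependent term added to the spring force during mole contact, as in the hedged sketch below; the gains and the finite-difference velocity estimate are illustrative assumptions.

```cpp
// Hedged sketch: spring-damper contact force for the mole (gains assumed).
const float K_MOLE = 250.0f;   // contact stiffness [N/m]
const float B_MOLE = 2.0f;     // contact damping [N*s/m]

// zPrev is the hammer height from the previous loop; dt is the loop period [s].
float moleContactForce(float z, float zPrev, float dt, float moleTopZ) {
  if (z >= moleTopZ) return 0.0f;              // no contact with the mole
  float vel = (z - zPrev) / dt;                // finite-difference velocity estimate
  float f = K_MOLE * (moleTopZ - z) - B_MOLE * vel;
  return (f > 0.0f) ? f : 0.0f;                // never pull the hammer into the mole
}
```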
Acknowledgments
Thank you Dani, Ankitha, and Yiyang for your help in setting up our booth.
Files
CAD Drawings, Arduino Leader & Follower Code, and Processing Code can be found in the google drive link below:
https://drive.google.com/drive/folders/15tt2RkrzUspbd0qf6Yt5WnJQQ1dYeAe9
References
[1] Zhang, H.; Bartels, J. U.; and Brown, J. D. "3D Hapkit: A Low-Cost, Open-Source, 3-DOF Haptic Device Based on the Delta Parallel Mechanism." World Haptics Conference, 2023. https://2023.worldhaptics.org/wp-content/uploads/2023/06/1070-doc.pdf
[2] Olsson, A. "Modeling and Control of a Delta-3 Robot." MSc thesis, 2009. https://lup.lub.lu.se/luur/download?func=downloadFile&recordOId=8847521&fileOId=8859315
[3] Johnson, G. M., and Kane, S. K. "Game Changer: Accessible Audio and Tactile Guidance for Board and Card Games." Proceedings of the 17th International Web for All Conference, 2020. https://dl.acm.org/doi/abs/10.1145/3371300.3383347
Appendix: Project Checkpoints
Checkpoint 1
As we started our project to develop a 3-DOF haptic device, we carried out an extensive literature review (summarized above), and our initial focus was to successfully assemble the framework of the haptic device. This included sourcing all necessary materials, assembling the 3-DOF haptic device, and establishing communication between the sensors and the Arduinos to track motion. We are currently working on the last part.
We gathered all required mechanical and electronic components. Then we moved into the assembly phase, where these parts were put together to form the initial structure of our haptic device. We 3D printed several custom parts needed for the device. The assembly process required careful integration of these parts with the electronic components, to make sure everything aligned properly and functioned as intended.
We are currently focused on refining the communication protocols between the sensors and the Arduino boards.
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///3dof_assembly.png)
Caption:
3-DOF Haptic Device Assembly
Checkpoint 2
We have successfully established Arduino-based motion tracking and I2C communication between a leader Hapkit and two follower Hapkits. We developed the forward kinematics to capture precise movements, and then developed a Processing sketch to simulate the interactions in a virtual environment.
With this setup, we simulated a virtual wall with 2 holes and rendered haptic interactions with the surface.
Based on our testing results, the force calculation needs to be more precise, and we plan to render more applications in the virtual environment, such as other games (e.g., Operation).
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///3DOF_device.png)
Caption:
Assembled 3-DOF Hapkit Device
CAD Model & Forward Kinematics
Here is an annotated 3D CAD model showing the link lengths L1, L2, R, and r.
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///3-DOF_link_length.png)
Caption:
3-DOF Hapkit Link Length
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///3DOF_eqns.png)
The 3 DOF Delta Mechanism
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///3DOF_leader_follower.png)
Caption:
Circuit Diagram
The leader/follower architecture and the I2C communication setup shown in this circuit diagram are described in detail in the Methods section above.
3 DOF Haptic Device Demonstration
The motion tracking, haptic feedback, and virtual environment work seamlessly as shown below. Here are demonstrations of translation in the x and y directions, insertion into the hole in the surface, and motion along the surface.
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///translation.gif)
Caption:
Translation Demonstration
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///hole_insert.gif)
Caption:
Hole Insertion Demonstration
![](https://web.stanford.edu/group/charm/cgi-bin/pmwiki/uploads///virtual_floor.gif)
Caption:
Virtual Floor Demonstration