2023-Group 6

Caption:
Teleoperated Haptic Gripper.
Teleoperated Haptic Gripper
Project team members: Blake Jones, Ashwin Vangipuram, Danny Blackburn, Ankitha Durvasula
We designed a 1 DoF teleoperated haptic gripper where a user can control the position of the gripper with their fingers, and force felt during grasping is fed back to the user through bilateral position control. Through visual cues from an LED, users can determine the minimum amount of force required for a stable grasp on an object. Haptic force feedback during teleoperation is important because it allows the user to feel the grasping of remote objects. This method only requires a position sensor to simulate force feedback to the user. The results of our study showed that with a little practice, users were able to detect the minimum force needed to grasp an object while blind. With the addition of vision, the majority of users were able to successfully grasp an object on the first try. Overall, the project was a successful demonstration of a teleoperated haptic system.
Introduction
Teleoperation is an important field in robotics because it allows users to remain a safe distance from a dangerous task, or to manipulate objects that are much smaller or larger than human scale. There are many other use cases for robotic teleoperation, but in all cases the goal is to combine the accuracy, precision, and strength of robots with the dexterity of humans. Haptic feedback is therefore essential for a user to modulate the position and force applied by a robot on an object. To that end, we designed a 1-DoF teleoperated haptic gripper that allows the user to perceive force feedback while grasping objects. By using position data from our device, we avoid the need for expensive force sensors. We also required only one degree of freedom to control the gripper because the fingers move symmetrically.
Background
Teleoperation is widely used in haptics to give users the feeling of touching and manipulating objects at a distance. Humans use many different cues to perceive objects during manipulation, such as texture, hardness, and contours, and haptic feedback allows us to transfer some of this perception through robots and mechanisms at a distance [1]. This haptic feedback can provide cues that guide a user or convey meaning [2]. The most important feedback a user needs for gripping tasks, though, is gripper force, so that the user can detect when they have grasped an object and can feel how much pressure they are applying [3]. This allows users to firmly grasp and move objects while ensuring that they don't break them. However, it is difficult for users to know how much force to apply to an object without visual cues. Humans are easily tricked into perceiving the weight of an object incorrectly, and with haptic feedback alone they struggle to judge and correct the force required to grasp an object [4]. This is even more true when the force feedback from the gripper is new or unknown to the user, as it can be unclear exactly how the teleoperated gripper moves, manipulates, and grips with respect to the user side [5]. Therefore, for a gripper to enable effective teleoperation, the user should receive both force feedback that lets them discern manipulation and visual feedback that helps them tune and correct their movements.
Methods
We created a 1 DoF device that would allow us to study teleoperated gripping and run an experiment on users to see how visual feedback would affect their grasping perception and performance. A description of how we created this device and applied position following for force feedback is given below.
Hardware design and implementation
We modified our existing Hapkit to create a gripper driven by its motor. We kept the Hapkit's capstan drive to convert motor torque into rotational motion, but we replaced the handle with a small gear that rotates with the device. We then meshed this gear with a larger gear to step up the torque and allow for higher gripping forces, using a 3:5 ratio between the sector-side gear and the finger gear. The same gearing was used on a mirrored finger so that the two fingers pinch together. An acrylic mount was created so that the device could be oriented sideways for gripping.
Both our gripper side and user side used these exact same designs for 1:1 movement. On the user side, we attached small fingers with cups on them to allow the user to pinch their gripper back and forth. This would apply positional movement that was picked up by the Hapkit to send to the gripper. The motor would then apply force back to the user based on the position sent by the gripper. On the gripper side, we created a 4-bar linkage that allowed for parallel gripping. This movement followed the same principle as the user side, with the object serving as a positional obstacle that would be sent back to the user and the user position leading to motor torque for gripper movement. The system design can be seen below.

Caption:
User and gripper side leading to object grasping.

Caption:
Gripper design CAD.

Caption:
Image of User Side.
We then connected our system together using long wires so that the system could be teleoperated from across a long table. An image of the system is seen below.

Caption:
Image of full system.
A video of the system moving can be seen below.
System analysis and control
To do the system analysis, the gripper can be idealized to a single finger. The force of this idealized finger represents the total force of the gripper. In the nominal case, the total force of the gripper is equal to ½ the force on each real finger.
Caption:
Free body diagram of idealized gripper finger.
To transform between motor torque and force at the finger, we need to find the Jacobian. To do this, we first derived an equation for torque at the capstan sector to torque at the pulley.
τ_s = r_s / r_p * τ_p
We also determined an equation for torque at the finger to torque at the sector. This is simply equal to the gear ratio of the two gears.
τ_f = GR * τ_s = GR * r_s / r_p * τ_p
Once torque of the finger was found, we could easily calculate the force of the finger in terms of pulley torque. This relationship represents the Jacobian.
F = GR * r_s / (r_f * r_p) * τ_p
J^(-T) = GR * r_s / (r_f * r_p)
J = r_f * r_p / (GR * r_s)
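As a quick numerical sketch of this force/torque transformation, the functions below apply the scalar Jacobian derived above in both directions. The radii and gear ratio are illustrative placeholders, not the actual hardware dimensions.

```python
# Convert between fingertip force and pulley (motor) torque using the
# scalar Jacobian J = (r_f * r_p) / (r_s * GR) derived above.
# All dimension values below are illustrative placeholders.

def force_to_pulley_torque(force_n, r_s=0.075, r_p=0.005, r_f=0.05, gear_ratio=5/3):
    """tau_p = J * F: pulley torque needed for a desired finger force."""
    jacobian = (r_f * r_p) / (r_s * gear_ratio)
    return jacobian * force_n

def pulley_torque_to_force(tau_p, r_s=0.075, r_p=0.005, r_f=0.05, gear_ratio=5/3):
    """F = J^(-T) * tau_p: finger force produced by a pulley torque."""
    return gear_ratio * r_s / (r_f * r_p) * tau_p
```

Because the system is 1-DoF, the Jacobian is a scalar and the two mappings are exact inverses of each other.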
The Jacobian allows us to trivially convert between force at the finger and pulley torque. For our design, pulley torque is equal to motor torque because they are directly connected. We then implemented a teleoperated bilateral position controller. The desired force was calculated using the following control laws for the leader and follower.
F_1 = k_p1 * (x_2 - x_1) + k_d1 * (x'_2 - x'_1)
F_2 = k_p2 * (x_1 - x_2) + k_d2 * (x'_1 - x'_2)
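A minimal simulation sketch of these coupled control laws is shown below: each device renders a PD force toward the other's position, so the two positions converge. The gains, masses, and time step are illustrative, not the tuned hardware values.

```python
# Bilateral position control sketch: two 1-DoF devices exchange
# positions and each applies F = kp*(x_other - x_self) + kd*(v_other - v_self).
# Gains, mass, and time step are illustrative placeholders.

def simulate_bilateral(steps=5000, dt=0.001, kp=50.0, kd=2.0, mass=0.05):
    x1, v1 = 0.0, 0.0    # leader (user side)
    x2, v2 = 0.02, 0.0   # follower (gripper side), initially displaced
    for _ in range(steps):
        f1 = kp * (x2 - x1) + kd * (v2 - v1)  # force rendered to leader
        f2 = kp * (x1 - x2) + kd * (v1 - v2)  # force rendered to follower
        v1 += f1 / mass * dt; x1 += v1 * dt   # semi-implicit Euler step
        v2 += f2 / mass * dt; x2 += v2 * dt
    return x1, x2

x1, x2 = simulate_bilateral()
print(x1, x2)  # both settle near the midpoint of the initial positions
```

Since the two forces are equal and opposite, the simulated pair settles at the midpoint of the starting positions; on the real device, an object in the gripper blocks the follower, so the position error instead renders a grasp force back to the user.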
Demonstration / Study
We demonstrated this device at the ME327 open house on June 6th. We conducted a study with 30 participants: half were asked to test the device blind and half with their eyes open, with groups selected randomly. Each user was asked to grip a small penguin toy with the minimum force required to hold it without it falling. Users with eyes open could see an LED turn on when too much force was applied. Users with eyes closed were told when the LED turned on or the object fell, and were asked to try again until the grasp was achieved. The object was held in place within the gripper until the user stated that they believed they had grasped it; the object was then released to see whether it would drop. For each participant we recorded the number of times the penguin dropped or the LED turned on. Users continued attempting the task until they successfully grasped the object without dropping it or squeezing too hard.
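The LED cue can be sketched as a simple threshold on the commanded grip force. The threshold value and names below are assumptions for illustration; the actual study code is not reproduced here.

```python
# Hypothetical sketch of the "too much force" LED cue: light the LED
# when the commanded grip force exceeds a calibrated threshold.
# FORCE_THRESHOLD_N is a placeholder, not the calibrated study value.

FORCE_THRESHOLD_N = 1.5

def led_should_light(commanded_force_n):
    """Return True when the grip force exceeds the excess-force threshold."""
    return commanded_force_n > FORCE_THRESHOLD_N
```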
A video is attached below.
Attach:gripping.mp4
Results
Caption:
Blind vs. Not Blind Haptic Gripper Study.
The results of our study showed that when able to see the object, users easily found a minimum-force grasp on the penguin on the first try, with only two subjects needing to try again. When unable to see the object, users required up to three tries to find a stable grasp. Failures were split almost evenly between dropping the object and applying too much force, with excess force slightly more common. In conclusion, while the haptic feedback helped users detect a grasped object, significantly stronger force feedback is needed to perform the task accurately while blind. This also demonstrated that when users can use visual cues to calibrate their haptic cues, they are much better able to complete teleoperated tasks, even with low force feedback that saturated quickly. The visual cues combined with the force feedback are enough to give the user control.
Future Work
There are several areas for improvement of the haptic gripper. Better motors would allow for stronger force feedback before reaching stall torque; this would likely significantly improve the feel for users and help them grasp objects while blind. A better position sensor would improve the fidelity of the position control and the accuracy of the velocity calculation. The serial connection between the two Hapkit boards occupied the same pins used for USB communication with a laptop, which prevented any real-time transmission of data from the boards and thus prevented debugging or creating a graphical interface. Mechanically, the 3D-printed gears had some backlash that could be felt on the user side; the system could instead be designed around a cable drive for a smoother feel. The motor also felt notchy on the user side, and a smoother motor would make free space feel more free.
We would also want to add another degree of freedom to our gripper so that we can test a user's ability to both grip and lift. In our study, we held the object and had users tell us when they believed they were grasping it. However, if users were able to grab and lift the object themselves, while also receiving force feedback for its weight, they might be better able to tell whether they had fully grasped it. This would also be another step toward users achieving teleoperated grasping tasks.
Files
Cost Breakdown
3D printed parts: Free (if we paid for filament, it would amount to only a few dollars)
Laser Cut Sheet: $10.60
Hardware (Screws, nuts, inserts, etc.): ~$10
References
[1] Shinya Takamuku, G. Gomez, Koh Hosoda and R. Pfeifer, "Haptic discrimination of material properties by a robotic hand," 2007 IEEE 6th International Conference on Development and Learning, London, UK, 2007, pp. 1-6, doi: 10.1109/DEVLRN.2007.4354057. <https://ieeexplore.ieee.org/abstract/document/4354057>
[2] Walker, Julie M., et al. "Holdable haptic device for 4-dof motion guidance." 2019 IEEE World Haptics Conference (WHC). IEEE, 2019. <https://arxiv.org/pdf/1903.03150.pdf>
[3] Rahul Kumar, Utkal Mehta, Praneel Chand, "A Low Cost Linear Force Feedback Control System for a Two-fingered Parallel Configuration Gripper." IEEE International Symposium on Robotics and Intelligent Sensors. IEEE, Tokyo, Japan, 2016, pp. 264-269, doi: 10.1016/j.procs.2017.01.220. <https://www.sciencedirect.com/science/article/pii/S1877050917302430>
[4] Buckingham G, Goodale MA, "Lifting without Seeing: The Role of Vision in Perceiving and Acting upon the Size Weight Illusion." PLOS ONE 5(3): e9709. 2010. https://doi.org/10.1371/journal.pone.0009709
[5] Carol E. Reiley, Takintope Akinbiyi, Darius Burschka, David C. Chang, Allison M. Okamura, David D. Yuh, "Effects of visual force feedback on robot-assisted surgical task performance." The Journal of Thoracic and Cardiovascular Surgery, Volume 135, Issue 1, 2008, pp. 196-202, ISSN 0022-5223, https://doi.org/10.1016/j.jtcvs.2007.08.043.
Appendix: Project Checkpoints
Checkpoint 1
Our initial goals for checkpoint 1 were
- Develop and test our position control algorithm using our existing Hapkits to see if we can emulate force feedback through teleoperation.
- Design and print our initial gripper designs for both the robot and user
We were able to finish almost all of our tasks. First, we created a preliminary CAD model showing how to convert the existing Hapkit into a gripper by replacing the handle with a gear extrusion. We started with a simple one-finger design rigidly rotated by the gear, and we chose a gear ratio of 1.5:2.5 = 3:5 between the sector gear and the gripper gear to increase the torque output of the motor. We also accounted for the roughly 80° range of the Hapkit sector, which converts to a 48° range of motion for the gripper fingers. With this initial model, we hope to test whether we can feel force feedback through teleoperation. For now, the user model will be identical except that the finger directions are inverted.
Caption:
Teleoperated Gripper.
Caption:
User Interface.
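The range-of-motion figure quoted above follows directly from the gear radii. As a quick check (radii in the same arbitrary units as the CAD):

```python
# A 3:5 ratio between the sector gear (radius 1.5) and the finger gear
# (radius 2.5) scales the sector's ~80 deg travel down by r_sector / r_finger,
# while stepping the torque up by the inverse factor.

def finger_range_deg(sector_range_deg=80.0, r_sector=1.5, r_finger=2.5):
    return sector_range_deg * r_sector / r_finger

print(finger_range_deg())  # 48.0
```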
We have begun printing and assembling the parts but haven't fully finished yet. The gears were printed, and the gear used with the Hapkit was assembled as shown below.
Caption:
3-D printed gears for gripper
Caption:
3-D printed gears for gripper
Additionally, we wrote the code (https://github.com/danwblackburn/grip) for performing bilateral teleoperation between the gripper and user, adapting our Assignment 5 code to our new dynamical system. Shown below is the sketch and calculations used for modeling. Note that the force in this sketch is the total grip force, not the force on one finger.
Caption:
Calculations for haptic gripper rendering and teleoperation.
In terms of future plans for the project, we have decided that actual force control is beyond the scope of this project. We also are developing a parallel linkage gripper to replace the initial design so that our gripper will always face straight ahead, and thus be able to grasp a wider variety of objects.
Caption:
4-bar-linkage design for future gripper.
With the additional design changes we are making, our adjusted goals for checkpoint 2 are to finish assembling the gripper and user interface and to test our code for an initial proof of teleoperation.
Checkpoint 2
Our goals for checkpoint 2 were:
- to redesign the gripper (if necessary)
- to assemble the gripper
- to test bilateral teleoperation between the user and the gripper
This week we accomplished all of our tasks. First, we successfully assembled a gripper along with the user interface. This required us to design a mount for our Hapkit, which we laser cut out of acrylic. We also designed and 3D printed a mounting block for the gears that provides stability to the system while allowing the gears to rotate freely. We also iterated on the distance between gears to reduce backlash without adding too much friction. Below is a photo of the assembled gripper.
Caption:
Assembled gripper.
This gripper used the original two-arm gripper design. Additional work was done to design and assemble a parallel-linkage gripper and user interface. This new design, while not necessary, is a nice-to-have that makes our system a bit more robust and lets the user better feel the push-back from the object. The CAD is shown below. We printed the parts for this redesign, but due to some tolerance issues and small design flaws, the gripper has a tendency to rotate when grasping objects farther out. We are continuing to develop this design, but are moving forward with the assembled gripper we have because we know that it works.
Caption:
Parallel gripper design (user interface).
Caption:
Parallel gripper design (gripper interface).
In order not to delay the progress of the project, we went ahead and assembled the user-interface side of the two-arm gripper. Then, using the code we wrote last week for bilateral teleoperation, we began testing our system. We were able to control the gripper by moving the user-side arms; however, the user side was unable to move when the gripper side was adjusted. We believe this is due to the additional friction between the gears on the user side. Moving forward, we plan to adjust the gains in the code to allow bilateral control and potentially reduce the friction in the system if necessary. If there is time, we may switch to the parallel-linkage gripper for improved control, but for now we want to ensure we get our system working. Below is an image of the two gripper devices connected.
Caption:
Image of the two gripper devices connected.
A video of the system moving is here: https://drive.google.com/file/d/1ggFRxrTzhBfw3omPBW23M7RW-BA6BUpH/view?usp=sharing
When building our system, we discovered some tolerance issues with our 3D prints. These ranged from large (gears not meshing correctly) to minor (backlash and friction), but we needed to work them out to ensure the system was clean enough for user feedback. Once the system moved well on its own, we set up the bilateral teleoperation code for initial testing. We found that some friction remains in the system, so we plan to continue iterating on our gear tolerances to get the system running smoothly. We also plan to shift our focus toward the code so we can tune our gains well enough for the user to feel object-stiffness feedback. Our final goal is to test different objects and record the force felt on the user side to demonstrate experimentally how different objects give different force feedback.

