Intelligent Control and Automation
Vol.10 No.03(2019), Article ID:94251,17 pages
10.4236/ica.2019.103006

Practical Application of a Tongue-Operated Joystick Device with Force Feedback Mechanism

Shinya Kajikawa1, Taku Ohba2

1Department of Mechanical Engineering and Intelligent Systems, Tohoku Gakuin University, Tagajo, Japan

2THK CO., LTD, Shibaura, Tokyo, Japan

Copyright © 2019 by author(s) and Scientific Research Publishing Inc.

This work is licensed under the Creative Commons Attribution International License (CC BY 4.0).

http://creativecommons.org/licenses/by/4.0/

Received: June 27, 2019; Accepted: August 9, 2019; Published: August 12, 2019

ABSTRACT

The human tongue has superior mobility and tactile sensitivity. For individuals with severe disabilities, a tongue-operated interface device can be used to operate life-support equipment, such as powered wheelchairs and robotic manipulators. A joystick-type device can directly translate various tongue motions into the behavior of external equipment. In addition, tactile feedback allows the user to communicate interactively with the equipment, which helps the user control it safely and skillfully. Considering these factors, in a previous study [1] we developed a novel tongue-operated joystick device with a reaction-force feedback mechanism. We described the design process, including an analysis of human tongue movement and tactile sensation, and demonstrated the fundamental performance of reaction-force feedback with the prototype device. In this study, we discuss the shape of the operational part manipulated by the tongue. Two types of operational tools are prepared, and their operability and the perception of reaction-force feedback are compared. Furthermore, we confirm the effectiveness of reaction-force feedback in operating the joystick device safely and skillfully while controlling a mobile robot in an unknown environment.

Keywords:

Tongue, Joystick, Tactile Sensation, Mobility, Reaction Force Feedback

1. Introduction

There are many devices, such as powered wheelchairs and robotic manipulators, that assist persons with severe disabilities in performing daily activities. Interface devices recognize the requirements of the user and send suitable commands to this external assistive equipment. Persons with paralysis of all limbs caused by spinal cord injuries, traumatic brain injuries, or strokes need interface devices that allow them to interact with and freely operate life-support equipment. The effective use of assistive equipment will increase their motivation to perform daily activities and help them become independent.

However, individuals with severe disabilities often cannot operate typical interface devices, such as joysticks, buttons, and switches, which are designed to be operated by the limbs or digits. Several interface devices can be operated by individuals with disabilities using other body movements, such as head motion [2], eye movement [3], and eye blinking [4]. Electro-biosignals such as the electroencephalogram [5], electrooculogram [6], and electromyogram [7] [8] are other approaches for operating assistive equipment with minimal effort. In addition to these interface devices, several tongue-computer interfaces have been proposed in recent years. The tongue comprises various muscles and can perform complex and quick movements. Furthermore, it is known that the tongue escapes damage in spinal cord injuries [9]. These features have encouraged researchers to develop tongue-based interface devices. A shooting game using real-time measurement of tongue motion, captured with a Kinect camera, has been proposed [10]. While the primary aim of this system is to train the oral muscles, it has the potential to control life-support equipment. Orthodontic dental retainers embedded with infrared optical sensors have been proposed to sense complex tongue movements [11]. An intraoral electrode array system measures the position and movement of the tongue from the contact impedance between the tip of the tongue and the array [12]. An inductive tongue-computer interface comprising an activation unit made of magnetic material and a sensor with multiple coils placed on a palatal plate has also been proposed [13]. The position of the activation unit attached to the tongue is estimated from the output voltages of the inductors. Using this type of interface with 18 inductive sensors, tongue-based control of an assistive robotic arm and gripper system has been demonstrated, and 14 movement types have been successfully generated [14]. A powered wheelchair system driven by tongue movement has also been developed [15] [16]. In this system, a magnet is mounted on the user’s tongue, and inductive sensors placed around the mouth detect the tongue position. This system can generate analog commands for smooth control of powered wheelchairs. These studies reveal the significant potential of tongue-based interfaces for controlling external equipment skillfully through manual operation. In addition, the tongue has a further significant function as a tactile organ [17]. A tongue display unit [18] that provides electro-tactile stimulation to the tongue has been developed and widely used for vision substitution [19], balance substitution [20], and as an augmentative information display [21].

As described above, the mobility and tactile sensitivity of the tongue have been used separately in the fields of control and sensing, respectively. In a previous study, we proposed a novel joystick-type interface with a reaction-force feedback mechanism [1]. The joystick interface can generate detailed instructions for the equipment by taking advantage of the numerous degrees of freedom of the tongue. Furthermore, through tactile stimulation of the tongue, the equipment can alert the user to changes in the environment and guide them toward a safer situation. In [1], we described the design process of the joystick device in detail and investigated its fundamental performance with respect to reaction-force feedback (the adjustable range of the reaction force, the manner of presenting reaction-force changes, and the user's perception of the reaction force). In particular, we confirmed that the user can perceive the changes in the reaction force presented by the proposed device. This result shows that the user can potentially control external equipment interactively using this device. However, the operability and the effectiveness of force feedback in practical applications, such as the control of a powered wheelchair or mobile robot, were not discussed.

In this paper, we describe a modification of the operational part that enables it to follow the dynamic motion of the tongue and to deliver tactile stimulation reliably in practical use. Furthermore, we show the effectiveness of the joystick device by controlling the navigation of a mobile robot in an unknown environment.

2. Prototype Interface Device

Here we describe the design concept of the proposed joystick device and explain the mechanism of reaction force feedback adopted in the prototype device.

2.1. Design Concept

We aim to realize interaction between the external equipment and the user through reaction-force control during stick operation. The external equipment alerts the user to an abnormal situation by adjusting the reaction force and thereby helps the user execute the operation safely. Therefore, the reaction force must be adjusted rapidly when an environmental change is detected. We adopt a simple mechanism comprising an elastic plate and a slider-crank mechanism that can actuate rapidly. The mechanism is shown in Figure 1.

Figure 1. Force feedback mechanism.

In this mechanism, the joystick tip is operated by the tongue. During the operation, the joystick movement is transmitted to the contact bar attached to the rotational axis of the gimbal. The contact bar in turn makes contact with the elastic plate while moving. The shock or reaction force at contact is sensed by the user’s tongue. The reaction force magnitude can be controlled by adjusting the elastic plate length with the slider-crank mechanism. Using this function, the user can operate the external equipment interactively and safely.

2.2. Reaction Force Feedback Mechanism

Figure 2 shows the prototype of the proposed joystick device. In [1], we reported that the tongue can generate a maximum operating force of approximately 4.0 N. The prototype has therefore been designed to present a reaction force of up to approximately 4.0 N to the tongue. The elastic plate, which is the key element of the reaction-force feedback mechanism, was made of a super-elastic metal alloy (Ni-Ti SMA, Yoshimi Inc.). Its width, length, thickness, and Young's modulus were 7.25 mm, 70.0 mm, 0.5 mm, and 54 GPa, respectively. The effective length was adjusted by the slider-crank mechanism actuated by a servomotor (ASG, Tower Pro Pte Ltd., stall torque 0.18 N·m, operating speed 600 deg/s, weight 9.0 g). The supporting part could move vertically over a range of 45.0 mm, giving an effective length of 5.0 - 50.0 mm. The length of the contact bar, L_lower, was 25.0 mm, and its rotational angle was measured with a potentiometer (SV01A103E01B00, Murata Manufacturing Co., Ltd., total resistance 10 kΩ, rotational angle 333.3 deg, rotational torque 0.21 × 10⁻² N·m).

Here we describe the operational stiffness, which is directly related to the reaction force, as a function of the effective length of the elastic plate. Figure 3 shows the simplified model.

Figure 2. Prototype of joystick with reaction force feedback mechanism.

Figure 3. Generation of reaction force: the relationship between the operational force, F, and the pushing force of the elastic plate, P (left). Reaction force from the plate, having the same magnitude as P, is exerted to the tongue. A cantilever beam model of the elastic plate (right).

When the rotational angle of the joystick is small, we can assume that the operational force applied by the tongue, F, acts on the elastic plate as a translational force, P, via the contact bar. The relationship between the two forces is expressed as follows:

$$ F = \frac{L_{\text{lower}}}{L_{\text{upper}}} P , \qquad (1) $$

where L_upper is the length of the joystick from its tip to the rotation center and L_lower is the length of the contact bar. The stiffness sensed by the user's tongue, K, is defined as the ratio of the operational force, F, to the displacement of the joystick tip, δx:

$$ K = \frac{F}{\delta x} = \frac{1}{\delta x} \frac{L_{\text{lower}}}{L_{\text{upper}}} P . \qquad (2) $$

The pushing force, P, is equivalent to the reaction force from the elastic plate, which varies with its effective length. We express the relationship between the reaction force and the effective length by modeling the elastic plate as a simple cantilever beam, as shown in Figure 3. The beam is pushed by the contact bar at point A while being supported at point B. The pushing force, P, and the supporting force, R, are applied at points A and B, respectively. Under these conditions, the displacement of the contact point A, Δ0, is expressed as follows:

$$ \Delta_0 = \frac{a^2 (3l + a)}{12 E I} P , \qquad (3) $$

where l is the length of the elastic plate between its base, C, and the contact position, A, and a is the effective length between the contact position, A, and the supporting position, B. Furthermore, E and I are the Young's modulus and the second moment of area of the elastic plate, respectively. The geometric relationship between the displacement of the joystick tip, δx, and Δ0 is as follows:

$$ \delta x = \frac{L_{\text{upper}}}{L_{\text{lower}}} \Delta_0 . \qquad (4) $$

From Equations (2)-(4), the presented stiffness, K, is obtained as follows:

$$ K = \left( \frac{L_{\text{lower}}}{L_{\text{upper}}} \right)^{2} \frac{P}{\Delta_0} = \left( \frac{L_{\text{lower}}}{L_{\text{upper}}} \right)^{2} \frac{12 E I}{a^{2} (3l + a)} . \qquad (5) $$

This model indicates that the presented stiffness, K, can be controlled by the effective length of the elastic plate, a, and is magnified by the squared ratio of L_lower to L_upper. Figure 4 shows the adjustable range of stiffness obtained theoretically and experimentally.

This figure confirms that the experimental results almost coincide with the theoretical values. Thus, the stiffness, i.e., the reaction force, can be controlled accurately through the effective length of the elastic plate.
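As an illustration of Equation (5), the following minimal sketch computes the presented stiffness over the adjustable range of effective lengths. The plate dimensions and Young's modulus are those listed above, whereas L_upper and l are not specified in the text, so the values used here are assumptions chosen only for illustration.

```python
# Minimal sketch of the stiffness model in Eq. (5).
# Assumptions: L_upper and l are hypothetical values (not given in the paper);
# the plate properties and L_lower follow Section 2.2.

def presented_stiffness(a, l, L_upper, L_lower, E, width, thickness):
    """Stiffness K [N/m] felt at the joystick tip for an effective length a [m]."""
    I = width * thickness**3 / 12.0          # second moment of area of the plate
    return (L_lower / L_upper)**2 * 12.0 * E * I / (a**2 * (3.0 * l + a))

E = 54e9                              # Young's modulus of the Ni-Ti plate [Pa]
width, thickness = 7.25e-3, 0.5e-3    # plate cross-section [m]
L_lower = 25e-3                       # contact-bar length [m]
L_upper = 40e-3                       # joystick lever arm [m] (assumed)
l = 20e-3                             # base-to-contact distance [m] (assumed)

# Sweep the effective length over the adjustable range (5 - 50 mm),
# the same lengths used for the measurements in Figure 4.
for a_mm in (5, 10, 20, 30, 40, 50):
    K = presented_stiffness(a_mm * 1e-3, l, L_upper, L_lower, E, width, thickness)
    print(f"a = {a_mm:2d} mm -> K = {K / 1000:.2f} N/mm")
```

As expected from Equation (5), shorter effective lengths give a stiffer response, in line with the trend shown in Figure 4.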

3. Evaluation of Performance for Practical Application

The tool operated by the tongue is important for close interaction with the joystick device. Therefore, we prepared two operational tools, A and B, and compared their performance with respect to operability and the perception of force feedback. Tool A is a cylindrical stick with a diameter of 4.0 mm, while Tool B is a stick with four blocks (width: 12.0 mm, thickness: 10.0 mm, height: 20.0 mm, made of acrylic resin) that maintain contact with the tongue. The tip of Tool A is located inside the mouth and is pushed by the tongue in the desired direction, as shown in Figure 5(a). With Tool B, the tip of the tongue is sandwiched between the four blocks so that it remains inside them during movement, as shown in Figure 5(b).

Therefore, Tool B can follow the tongue motion easily. The position of each block is adjusted manually before operation to ensure steady contact with the tongue, as shown in Figure 6. In the following sections, we compare these operational tools with regard to operability during fast motion and the perception of the reaction force.

Figure 4. Adjustable stiffness range: the experimental data were calculated from the force and displacement measured while the joystick tip was pushed by a force sensor at six effective lengths of the elastic plate: 5, 10, 20, 30, 40, and 50 mm.

Figure 5. Two kinds of operational tools: When operating Tool A, the user bites the holding part, which is a cubic shell attached to the top of the joystick device body, and Tool A is operated inside the mouth. In contrast, when operating Tool B, the user opens the mouth and thrusts out the tongue. (a) Operating Tool A; (b) Operating Tool B.

Figure 6. Operational tool B.

3.1. Operability

Operability was investigated by operating the joystick tip with the tongue in a circular motion. In this experiment, we instructed the users to move the joystick tip rapidly to draw a circle as large as possible.

They attempted to draw the circle in a counterclockwise direction several times. A typical result with Tool A is shown in Figure 7. As seen in Figure 7, the user was unable to draw a circle because of the difficulty in smoothly turning the stick with the tongue.

When operating the joystick with Tool A, the user typically pushed the tip using the side of the tongue in the direction of the movement. Therefore, when turning was required, they changed the contact surface of the tongue and moved to the opposite side. This prevented the user from executing smooth and seamless operations, resulting in an irregular trajectory. This means that the user cannot operate the joystick rapidly and accurately when the desired operation requires a change in direction.

On the other hand, Tool B did not require any additional tongue movement to re-establish contact with the stick when changing the operational direction, because each block was always in contact with the tongue during operation. This enabled the user to operate Tool B freely. Figure 8 shows the circular trajectory drawn using Tool B. In this experiment, we gave the subject the same instructions as in the previous experiment.

Figure 7. Trajectories of tip of the joystick operated by tongue (with Tool A).

Figure 8. Trajectories of tip of the joystick operated by tongue (with Tool B).

The trajectory indicates that the user could operate the joystick more easily and draw an almost circular shape. The user took approximately 2 s to draw one circle.

3.2. Guidance by Tactile Stimulation

In this section, we examine the ability of users to operate the joystick device when the stiffness is changed during operation.

In this experiment, one direction was randomly selected every 15 s to have reaction forces smaller than those in other directions. The subjects were instructed to return the joystick to the center position when they perceived a rapid increase in stiffness (reaction force) in the operational direction and then to find the direction with decreased stiffness by pushing the stick arbitrarily.

When the stiffness was changed from low to high, the supporting point moved to a higher position, as shown in Figure 3, so that the effective length of the elastic plate was reduced and its bending decreased.

As a result, the subjects sensed that the joystick pushed the tongue slightly backward and perceived the change in stiffness. While performing this task, the subjects wore an eye mask and headphones to eliminate visual or sound information from the actuation of the slider-crank mechanism.

The experimental results are shown in Figure 9 and Figure 10. The tables in Figure 9 and Figure 10 present the combination of stiffness settings in each direction, where H and L indicate high and low stiffness values in each direction, respectively.

In the first half of the experiment, from 0 to 60 s, we set the high and low stiffness values to 2.5 N/mm and 0.2 N/mm, respectively. Beyond 60 s, we changed the low stiffness value to 0.8 N/mm.
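For reference, the stiffness schedule used in this experiment can be sketched as follows. The set of operational directions and the way each setting would be sent to the slider-crank are assumptions; the 15 s interval, the 60 s change point, and the stiffness values follow the description above.

```python
# Minimal sketch of the stiffness schedule in Section 3.2.
# Assumptions: the four-direction set is hypothetical; the interval and
# stiffness levels are taken from the text.
import random

DIRECTIONS = ("front", "back", "left", "right")   # assumed direction set
HIGH = 2.5                                        # high stiffness [N/mm]

def stiffness_pattern(t_s):
    """Return the per-direction stiffness setting at elapsed time t_s [s]."""
    low = 0.2 if t_s < 60 else 0.8                # low level changes after 60 s
    soft = random.choice(DIRECTIONS)              # one randomly selected compliant direction
    return {d: (low if d == soft else HIGH) for d in DIRECTIONS}

# Example: one pattern every 15 s over a 120 s trial.
for t in range(0, 120, 15):
    print(t, stiffness_pattern(t))
```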

The lower plots in Figure 9 and Figure 10 show the operational patterns with Tools A and B, respectively. Irregular movements can be seen in both cases where the stiffness was changed. This indicates that the subject checked the stiffness by touching the stick with their tongue. In the latter half of the experiment, the searching movement to find the direction with decreased force tended to increase as the difference between the low and high stiffness values decreased. These similarities are seen in both Figure 9 and Figure 10.

On comparing these results, we noticed that the irregular movement with Tool A tended to continue for a longer time than that with Tool B.

Figure 9. Guidance of operational direction by tactile stimulation (with Tool A).

Figure 10. Guidance of operational direction by tactile stimulation (with Tool B).

Moreover, the user with Tool A could not perceive the change in stiffness immediately in the latter half of the experiment and continued pushing the stick in the prohibited direction for a few seconds (at around 90 s or 105 s in Figure 9).

In contrast, the user with Tool B could stop the operation promptly in response to the stiffness change. The operational patterns approximately matched the stiffness patterns described in the table. Therefore, we can confirm that the subject could perceive the change in stiffness and adjust the stick operation quickly.

From these comparisons, we confirmed that Tool B is superior to Tool A in operability and perception of reaction force feedback. This indicates that the operational part should keep steady contact with the tongue during operation in order to establish interactive communication between the tongue and joystick device.

4. Mobile Robot Control Navigated by Proposed Joystick Device

The mobile robot was controlled in an unknown environment that had many obstacles. In this experiment, the joystick with Tool B was used by the subject to avoid collisions and select a safer route while navigating.

4.1. Experimental Environment and Condition

We believe that a mobile robot system can be used by persons with disabilities to transport daily living goods or to communicate with others. Our interface device can help the operator control the robot interactively and successfully when unknown obstacles exist on the robot's route.

Figure 11 shows the experimental environment. The start position was to the right of the subject, and the goal position was in front of the door located 2.5 m ahead of the subject.

Figure 11. Experimental environment.

Because many objects were placed between the start and end positions, a green line was drawn around the obstacles to indicate the route to be traced by the mobile robot. As Obstacle #1 was tall, the subject could not see behind it; in this region, the subject had to operate the robot by predicting the route. Furthermore, an additional obstacle, Obstacle #3, was placed on the route; therefore, when the subject controlled the robot to follow a simple predicted route, there was a high probability of collision. The subject was not informed of the presence of Obstacle #3.

4.2. Experimental System

Figure 12 shows the experimental setup. The system primarily comprises the joystick, the mobile robot, and a host computer. The mobile robot has three position-sensitive detectors (PSDs) (GP2Y0A21YK, SHARP, measurable range 100 - 800 mm) on its front. These sensors detect objects and measure the distance to them. The host computer, PC#1, functions as a relay between the mobile robot and the joystick and transfers the control input, calculated from the joystick output, to the mobile robot. The computer also receives the PSD distance data and determines the required stiffness level for each joystick direction. These levels are sent to a microcomputer (Arduino UNO), and the joystick adjusts its stiffness to alert the subject through tactile stimulation to the tongue.
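A schematic sketch of the relay program running on PC#1 is given below, using the pyserial package. The serial ports, baud rates, and message formats are assumptions; only the overall data flow (joystick tilt to robot velocity, PSD distances to a stiffness command for the Arduino) follows the description above. The mapping from sensor patterns to stiffness levels is sketched in Section 4.3.

```python
# Schematic sketch of the relay loop on the host PC (PC#1).
# Assumptions: serial ports, baud rates, and message formats are hypothetical;
# only the data flow described in Section 4.2 is taken from the text.
import time
import serial  # pyserial

robot = serial.Serial("/dev/ttyUSB0", 115200, timeout=0.1)  # mobile robot link (assumed port)
stick = serial.Serial("/dev/ttyACM0", 115200, timeout=0.1)  # Arduino UNO of the joystick

NEAR_MM = 200  # a PSD reading below this distance is treated as "obstacle near"

while True:
    # 1) Joystick tilt angles -> velocity command for the robot.
    line = stick.readline().decode().strip()             # e.g. "12.5,-3.0"
    if line:
        vx, wz = (float(v) for v in line.split(","))
        robot.write(f"VEL {vx:.2f} {wz:.2f}\n".encode())

    # 2) PSD distances from the robot -> bit pattern -> stiffness command.
    reply = robot.readline().decode().split()             # e.g. "350 180 620" [mm]
    if len(reply) == 3:
        bits = [1 if int(d) < NEAR_MM else 0 for d in reply]  # (left, middle, right)
        stick.write(f"STIFF {bits[0]} {bits[1]} {bits[2]}\n".encode())

    time.sleep(0.02)  # relay cycle of roughly 50 Hz (assumed)
```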

4.3. Navigation

We defined simple bit patterns to express the distance and direction of obstacles detected by the three PSD sensors on the mobile robot.

Figure 12. Experimental setup.

The output of each sensor was set to 1 when the measured distance was below 200 mm, and to 0 otherwise. An example of the bit pattern is shown in Figure 13. Table 1 shows the programmed bit patterns and the stiffness setting assigned to each pattern; in this table, H and L indicate the high and low stiffness levels, respectively. The stiffness settings for each direction were defined to allow the subject to avoid collisions and move in safer directions while navigating, as in the sketch below.
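As a concrete illustration, the following minimal sketch shows one plausible mapping from the PSD bit pattern to per-direction stiffness levels. The exact entries of Table 1 are not reproduced, so this mapping is an assumption; it simply stiffens every direction whose sensor reports a nearby obstacle, which is consistent with the behavior described in Section 4.4.

```python
# Hedged reconstruction of a Table-1-style mapping (assumed, not the exact table):
# bits = (left, middle, right), where 1 = obstacle closer than 200 mm.
H, L = 2.5, 0.8  # high and low stiffness levels [N/mm] used in this experiment

def stiffness_from_bits(bits):
    """Map the PSD bit pattern to stiffness levels for the (left, forward, right) directions."""
    left, middle, right = bits
    return {
        "left":    H if left   else L,
        "forward": H if middle else L,
        "right":   H if right  else L,
    }

# Example: obstacle detected on the left and ahead, clear on the right.
print(stiffness_from_bits((1, 1, 0)))
# -> {'left': 2.5, 'forward': 2.5, 'right': 0.8}
```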

The subject was instructed to operate the robot to follow the green line and to modify their operation to follow the guidance of the joystick when perceiving a change in the reaction force. This assistance was essential for the subject to arrive at the end point safely as they were not aware of Obstacle #3.

The low and high stiffness values were set to 0.8 N/mm and 2.5 N/mm, respectively, for two reasons. The first is that these two levels can be distinguished sufficiently well, as shown in Figure 10. The second is that the switching time between the two stiffness levels can be shortened, because the difference between the corresponding effective lengths of the elastic plate is smaller than for the combination of 0.2 N/mm and 2.5 N/mm.
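A worked check of this second point, reusing the hypothetical geometry assumed in the sketch after Section 2.2 (L_upper = 40 mm, l = 20 mm), inverts Equation (5) numerically to find the effective length for each stiffness level. The specific lengths depend on those assumptions, but they illustrate why the 0.8/2.5 N/mm pair requires a shorter slider travel than the 0.2/2.5 N/mm pair.

```python
# Invert Eq. (5) for the effective length a by bisection (K decreases as a grows).
# Geometry assumptions: L_upper = 40 mm and l = 20 mm are hypothetical values.
E, W, T = 54e9, 7.25e-3, 0.5e-3        # plate properties [Pa, m, m]
I = W * T**3 / 12.0                    # second moment of area [m^4]
RATIO2 = (25e-3 / 40e-3) ** 2          # (L_lower / L_upper)^2, L_upper assumed
l = 20e-3                              # base-to-contact distance [m] (assumed)

def stiffness(a):
    """Eq. (5): stiffness K [N/m] for effective length a [m]."""
    return RATIO2 * 12.0 * E * I / (a**2 * (3.0 * l + a))

def length_for(K_target, lo=5e-3, hi=50e-3):
    """Effective length a [m] that yields the target stiffness K_target [N/m]."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if stiffness(mid) > K_target else (lo, mid)
    return 0.5 * (lo + hi)

for K_nmm in (0.2, 0.8, 2.5):          # the stiffness levels discussed in the text
    a = length_for(K_nmm * 1000.0)
    print(f"K = {K_nmm} N/mm -> a = {a * 1000:.1f} mm")
```

With these assumed values, the length change between 0.8 and 2.5 N/mm is roughly one third of that between 0.2 and 2.5 N/mm, so the slider-crank can reach the new setting correspondingly faster.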

The subject wore an ear plug to mute sounds that could influence the operation.

4.4. Experimental Results

Figure 14 shows scenes from the experiment. The seven images in this figure show the behavior of the robot during navigation. As seen, the robot arrived at the end position without any collisions. During the period from Scenes #3 to #6, the subject could not recognize the position and posture of the robot correctly and was unaware of the presence of Obstacle #3.

In this situation, the joystick allowed the subject to navigate and avoid collisions because of the reaction force changes. The reaction force in each direction was adjusted on the basis of the mapping between sensor outputs and stiffness settings shown in Table 1.

Figure 15 shows the sensor outputs, stiffness settings, and operation by the subject from Scenes #3 to #6.

Table 1. Stiffness patterns according to sensor output.

Figure 13. An example of sensor output pattern according to the distance to the obstacle.

Figure 14. Experimental environment and scenes of the mobile robot.

Figure 15. Sensor output, stiffness setting, and joystick operation in the period from scene #3 to #6.

At approximately 23 s, in Scene #3, while the robot continued to rotate to the left, the left PSD sensor detected Obstacle #3 first, followed by the middle and right sensors.

At this point, the pattern of the sensor outputs changed from (1,0,0) to (1,1,0) and then from (1,1,0) to (1,1,1). According to these changes, the stiffness in each direction was adjusted. First, the stiffness on the forward and left sides was increased to stop the rotational movement and prevent the approach to the obstacle. The subject sensed that their tongue was pushed back slightly as the stiffness on the forward and left sides increased, and inputs in these directions were restricted.

After perceiving the changes in the reaction forces, the subject stopped the rotational operation and immediately returned the joystick to the initial position. They then moved the joystick in the direction with lower stiffness, thus avoiding the obstacle.

A similar situation can be seen in Scene #5. The subject again avoided collision with the wall with the aid of the joystick. These results indicate that the stiffness control could work effectively to guide users to safer routes. However, we noticed that the subject could not react immediately to changes in the stiffness. In Scenes #3 and #5, the subject began to change the operational direction approximately 500 ms after the force feedback was applied to the tongue.

This unavoidable delay was caused by the delay and dead time in signal transfer within the central nervous system and in the actuation of the muscular system. In many cases, this delay is longer than the timescale of dynamic changes in the environment. Therefore, forcibly suspending the movement of the robot, and refusing inputs until an adequate input to recover from the critical situation is generated, would be a risk-reduction solution.

Furthermore, the stick operation resembled a bang-bang control input pattern. The joystick system is designed to generate velocity commands in proportion to the tilt angle of the joystick; however, in this experiment, the maximum speed of the robot was decreased for safety reasons, so the subject always input the maximum velocity.

These results indicate that a safe and effective control system that meets user demands can be realized by designing the robot behavior with consideration of the entire process, from sensing and perception to muscle actuation.

5. Conclusions

We proposed a novel joystick system for tongue operation in which force feedback is provided by a simple mechanism. The original concept was to utilize not only the high mobility but also the significant tactile sensing capability of the tongue to realize a cooperative control system between humans and external equipment. Force feedback was used effectively to alert the user to changes in the operated equipment or the environment, thus assisting the user to perform safer and more skillful operations. A simple slider-crank mechanism and a super-elastic metal plate provided a wide adjustable range of operational stiffness.

Furthermore, for effective use of the proposed function, we discussed the shape of the part to be operated by the tongue. Through experiments with two types of operational tools, we confirmed that operability and the perception of tactile stimulation were improved by stabilizing the contact with the tongue.

Finally, we confirmed that the joystick system could assist the user in controlling the mobile robot safely in an unknown environment by adjusting the operational stiffness in each direction according to the outputs of the distance sensors on the robot. However, the reaction of the tongue was not sufficiently fast to change the input command and recover from the situation. Therefore, we have to design a method to cope with changes in the situation until the operation has been modified effectively.

In addition, in this study we adopted stiffness adjustments from low to high, or high to low, as the simplest solution. However, other patterns, such as high-frequency repetitive changes between low and high levels, should be examined. In future studies, we will examine the possibility of increasing the types of information using several combinations of stiffness adjustment patterns and realize more effective interaction systems with the external equipment.

Acknowledgements

This work was supported by JSPS KAKENHI Grant Numbers JP16K01562, JP19K12901.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

Cite this paper

Kajikawa, S. and Ohba, T. (2019) Practical Application of a Tongue-Operated Joystick Device with Force Feedback Mechanism. Intelligent Control and Automation, 10, 90-106. https://doi.org/10.4236/ica.2019.103006

References

1. Ohba, T. and Kajikawa, S. (2017) Tongue-Operated Joystick Device with Reaction Force Feedback Mechanism. 2017 IEEE International Conference on Advanced Intelligent Mechatronics, Munich, 3-7 July 2017, 207-212. https://doi.org/10.1109/AIM.2017.8014019

2. Rudigkeit, N., Gebhard, M. and Graser, A. (2015) Evaluation of Control Modes for Head Motion-Based Control with Motion Sensors. 2015 IEEE International Symposium on Medical Measurements and Applications, Turin, 7-9 May 2015, 135-140. https://doi.org/10.1109/MeMeA.2015.7145187

3. Barea, R., Boquete, L., Mazo, M. and Lopez, E. (2002) System for Assisted Mobility Using Eye Movements Based on Electrooculography. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 10, 209-218. https://doi.org/10.1109/TNSRE.2002.806829

4. Krolak, A. and Strumillo, P. (2008) Vision-Based Blink Monitoring System for Human-Computer Interfacing. 2008 IEEE International Conference on Human System Interactions, Krakow, Poland, 25-27 May 2008, 994-998. https://doi.org/10.1109/HSI.2008.4581580

5. Iturrate, I., Antelis, J.M., Kubler, A. and Minguez, J. (2009) A Noninvasive Brain-Actuated Wheelchair Based on a P300 Neurophysiological Protocol and Automated Navigation. IEEE Transactions on Robotics, 25, 614-627. https://doi.org/10.1109/TRO.2009.2020347

6. Sasaki, M., Suhaimi, M.S.A.B., Ito, S. and Rusydi, M.L. (2015) Robot Control System Based on Electrooculography and Electromyogram. Journal of Computer and Communications, 3, 113-120. https://doi.org/10.4236/jcc.2015.311018

7. Shima, K., Fukuda, O. and Tsuji, T. (2012) EMG-Based Control for a Feeding Support Robot Using a Probabilistic Neural Network. 2012 IEEE RAS/EMBS International Conference on Biomedical Robotics and Biomechatronics, Rome, 24-27 June 2012, 1788-1793. https://doi.org/10.1109/BioRob.2012.6290876

8. Wei, L. and Hu, H. (2011) A Hybrid Human-Machine Interface for Hands-Free Control of an Intelligent Wheelchair. International Journal of Mechatronics and Automation, 1, 97-111. https://doi.org/10.1504/IJMA.2011.040040

9. Kandel, E.R., Schwartz, J.H. and Jessell, T.M. (2000) Principles of Neural Science. McGraw-Hill, New York.

10. Kimura, T. (2012) User Interface Detects Tongue Movement with Kinect. https://www.youtube.com/watch?v=jWIl3CtH6SE

11. Saponas, T.S., Kelly, D., Parvix, B.A. and Tan, D.S. (2009) Optically Sensing Tongue Gestures for Computer Input. 22nd Annual ACM Symposium on User Interface Software and Technology, Victoria, 4-7 October 2009, 177-180. https://doi.org/10.1145/1622176.1622209

12. Draghici, O., Batkin, I., Bolic, M. and Chapman, I. (2013) The MouthPad: A Tongue-Computer Interface. 2013 IEEE International Symposium on Medical Measurements and Applications, Gatineau, 4-5 May 2013, 315-319. https://doi.org/10.1109/MeMeA.2013.6549759

13. Struijk, L.N.S.A. (2006) An Inductive Tongue Computer Interface for Control of Computers and Assistive Devices. IEEE Transactions on Biomedical Engineering, 53, 2594-2597. https://doi.org/10.1109/TBME.2006.880871

14. Struijk, L.N.S.A., Egsgaard, L.L., Lontis, R., Gaihede, M. and Bentsen, B. (2017) Wireless Intraoral Tongue Control of an Assistive Robotic Arm for Individuals with Tetraplegia. Journal of NeuroEngineering and Rehabilitation, 14, 110. https://doi.org/10.1186/s12984-017-0330-2

15. Lund, M.E., Christiense, H.V., Caltenco, H.A., Lontis, E.R., Bentse, B. and Andreasen, L.N.S.A. (2010) Inductive Tongue Control of Powered Wheelchairs. IEEE 32nd Engineering in Medicine and Biology Society Conference, Buenos Aires, Argentina, 31 August-4 September 2010, 3361-3364. https://doi.org/10.1109/IEMBS.2010.5627923

16. Kim, J., Huo, X., Minocha, J., Holbrook, J., Laumann, A. and Gjovanloo, M. (2012) Evaluation of a Smartphone Platform as a Wireless Interface between Tongue Drive System and Electric-Powered Wheelchairs. IEEE Transactions on Biomedical Engineering, 59, 1787-1796. https://doi.org/10.1109/TBME.2012.2194713

17. Tang, H. and Beebe, D.J. (1999) Tactile Sensitivity of the Tongue on Photolithographically Fabricated Patterns. Proceedings of the First Joint BMES/EMBS Conference Serving Humanity, Advancing Technology, Atlanta, GA, 13-16 October 1999, 633.

18. Kaczmarek, K.A. (2011) The Tongue Display Unit (TDU) for Electrotactile Spatiotemporal Pattern Presentation. Scientia Iranica, 18, 1476-1485. https://doi.org/10.1016/j.scient.2011.08.020

19. Sampaio, E., Maris, S. and Bach-y-Rita, P. (2001) Brain Plasticity: Visual Acuity of Blind Persons via the Tongue. Brain Research, 908, 204-207. https://doi.org/10.1016/S0006-8993(01)02667-1

20. Vuillerme, N., Pinsault, N., Chenu, O., Fleury, A., Payan, Y. and Demongeot, J. (2009) A Wireless Embedded Tongue Tactile Biofeedback System for Balance Control. Pervasive and Mobile Computing, 5, 268-275. https://doi.org/10.1016/j.pmcj.2008.04.001

21. Droessler, N.J., Hall, D.K., Tyler, M.E. and Ferrier, N.J. (2001) Tongue-Based Electrotactile Feedback to Perceive Objects Grasped by a Robotic Manipulator: Preliminary Results. 2001 23rd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Istanbul, 25-28 October 2001, 1404-1407.