Tongue Motion-based Operation of Support System for Paralyzed Patients
Junji TAKAHASHI, Satoru SUEZAWA, Yasuhisa HASEGAWA, Yoshiyuki SANKAI
Graduate School of Systems and Information Engineering
University of Tsukuba
Ibaraki, 305-8573, JAPAN
Email: [email protected]
Abstract—This paper proposes a new control device based on tongue motions to control and communicate with a support system for a paralyzed patient. We focus on tongue movements as one output of human intentions, because the tongue is one of the most capable body parts for producing motions and it is not affected by spinal cord damage. Tongue motion is easily observed from inside the mouth; it is, however, hard to observe from outside. We therefore propose a detection algorithm for tongue motions using a multiple-electrode array attached to the skin surface around the neck. The tongue motions are detected based on the center position of the distributions of muscle electric potentials measured by the electrodes. We investigated the precision of the detection algorithm through experiments and confirmed that the discrimination accuracy is more than 70 % for six tongue movements: left, right, forward, back, up, and down. Additionally, we quantitatively evaluated the operability of the proposed algorithm using a Fitts' law based test bed GUI, and the performance of the proposed interface was compared with that of other available interfaces.
I. INTRODUCTION
Around two million disabled persons live in Japan, and more in the world. Paralyzed patients account for 50 % of physically challenged persons. The medical causes of paralysis have not been revealed thoroughly yet, and even in cases where a cause is revealed, there is, in most cases, no medical coping technique. We are aiming to develop a support device for helping their daily life. To date, several types of devices have been developed: electric wheelchairs for mobility; robotic manipulators or hands for carrying actions, especially in meal support systems, for which there has been strong demand to improve the QOL of patients and reduce the burden on caregivers; and mouth sticks or head-mounted sticks for operating a PC.
Paralyzed patients need an alternative interface system in order to output their intentions and control these support devices. The demanded number of signals differs depending on the support device type. For example, an electric wheelchair has two degrees of freedom for locomotion in 2D space, and pick-up manipulators, including meal support devices, need three degrees of freedom for indicating the position of the end effector. In many cases, two distinguishable signals are required per degree of freedom. For instance, two motions, such as moving a manipulator to the right and to the left, belong to one degree of freedom. Therefore, six distinguishable signals are required for controlling a manipulator. Moreover, each device needs at least one additional input signal for a special action, such as the gripping action of a manipulator or the emergency stop of a wheelchair.
In many cases, a paralyzed patient needs a wheelchair for mobility as much as he/she needs a meal support device. If feasible, these several support devices should be controlled by only one sophisticated alternative interface so that the user can easily master how to use them. From this point of view, the number of input signals demanded of a sophisticated interface system should accommodate the maximum demand among the fundamental support devices. Concretely, a manipulator-type support device needs seven distinguishable signals to be completely controlled. Consequently, seven distinguishable signals (right, left, up, down, forward, back, and action) are necessary for a sophisticated alternative interface to control fundamental support devices. Though BMI is one of the most attractive HMI technologies for operating alternative robotic devices, many unsolved issues, such as reliability, accuracy, and price, remain before commercial use [1], [2].
A good interface system has good usability and, in most cases, can be used intuitively. Except for the hands, the most capable organ is the tongue. The tongue is able to take various postures and has the potential to output multiple distinguishable signals. Furthermore, the tongue has a good chance of remaining functional even if a person suffers serious damage, such as a spinal cord injury.
Therefore, alternative interface systems using the tongue have been researched and developed by many researchers. Huo et al. proposed the "Tongue Drive" [3], which detects six commands by tracking the movements of a permanent magnetic tracer attached to the tongue. Ichinose et al. proposed tongue-palate contact pressure sensors to control an electric wheelchair [4]. Though these systems are useful to some degree, inserting artificial materials into the oral cavity is quite a burden for the user. Vaidyanathan et al. proposed a unique HMI system that measures airflow pressure changes in the ear canal due to tongue movements [5]. A good point of that system is that it only requires putting an earphone into the ear to detect tongue movements. However, it has not yet achieved the detection of six distinguishable signals from tongue motion.
In this paper, we propose a tongue motion based interface which can detect six distinguishable signals noninvasively, without inserting any artificial materials into the oral cavity.
2011 IEEE International Conference on Rehabilitation Robotics Rehab Week Zurich, ETH Zurich Science City, Switzerland, June 29 - July 1, 2011
978-1-4244-9861-1/11/$26.00 ©2011 IEEE 257
Fig. 1. (1) Anatomical figure of muscles related to the tongue (thyrohyoid, sternothyroid, sternohyoid, omohyoid); (2) electrode positions (ch1-ch4, ch9-ch12)
Fig. 2. ZeroWire EMG developed by AURION SRL
Our approach is to measure bioelectric potentials (BEPs) from outside the mouth and detect tongue motions. Using BEPs for detecting muscle activity has been studied for a long time, and many studies have demonstrated its effectiveness [6], [7], [8]. However, a method for measuring BEPs and detecting tongue motion has not been reported.
Additionally, we developed a Fitts' law based test bed GUI, extended to a two-dimensional model, to quantitatively evaluate our proposed method. The experimental results show the validity of our proposed tongue motion based interface.
II. METHOD
A. Measurement of BEPs related to tongue muscles
Our approach is to observe bioelectric potentials (BEPs) from electrodes attached on the skin surface so as to discriminate tongue motions. Ordinarily, when a device observes BEPs to estimate muscle activity, the electrodes have to be attached as close as possible to the target muscle so as to reduce noise. In this case, however, because the tongue itself is a complex of several muscles, it is impossible to observe the BEPs of the tongue muscles directly, as long as our approach forbids inserting any material into the mouth. Therefore, we try to observe and utilize the BEPs of muscles related to tongue movements. These muscles, which are the sternohyoid, omohyoid, sternothyroid, and thyrohyoid (Fig. 1 (1)), are located in the anterior neck region, and their BEPs can be observed from electrodes attached on the neck surface.
These muscles related to tongue movements are too small for their BEPs to be observed individually due to cross talk. Although the individual BEP of each muscle cannot be detected, it is considered that the spatiotemporal patterns of BEPs observed from an electrode array carry information about tongue movements. In this research, we utilize an electrode array to observe the patterns of BEPs. Figure 1 (2) shows the positions of the electrodes. Since the fibers of the muscles
Fig. 3. Intensity distribution of observed electric potential on each electrode according to tongue movements (panels: up, neutral, down, right, left, forward, back)
TABLE I
THE 7 TONGUE MOTIONS

index  motion    way to move
n      neutral   relax and do nothing
u      up        press the upper jaw with the whole tongue
d      down      push down the whole tongue
r      right     press the right tooth with the tongue tip
l      left      press the left tooth with the tongue tip
f      forward   press the front tooth with the tongue tip
b      backward  pull back the whole tongue
related to the tongue extend in the vertical direction, the difference of potential between two vertically adjacent electrodes is measured.
The measurement system, named ZeroWire EMG (developed by AURION SRL), which consists of electrodes and an amplifier device and was utilized for the experiments, is shown in Fig. 2. The sampling frequency is 2000 Hz, and a 10-1000 Hz band pass filter is applied before the time series data are transmitted to the PC. Let BEP_n(t) denote the data received at the PC at each instant of time t, τ denote the integral time, and n denote the channel number; then the integrated BEP for smoothing is defined as follows,

\mathrm{iBEP}_n(t) = \int_{t-\tau}^{t} \left| \mathrm{BEP}_n(t') \right| \, dt'. \quad (1)
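As a minimal discrete-time sketch of how eq. (1) might be computed on the sampled, band-passed data (the paper gives only the continuous-time definition; the function and variable names here are our own):

```python
import numpy as np

FS = 2000.0  # sampling frequency [Hz] of the ZeroWire EMG system

def integrated_bep(bep, tau_samples):
    """Discrete version of eq. (1): rectify each channel and integrate
    over the last tau seconds (tau_samples = tau * FS).

    bep: array of shape (n_channels, n_samples) of band-passed BEPs.
    Returns an array of the same shape holding iBEP_n(t) per channel;
    samples earlier than tau_samples integrate over a shorter window.
    """
    dt = 1.0 / FS
    kernel = np.ones(int(tau_samples))
    rectified = np.abs(bep)
    # Causal moving sum: output sample i sums samples (i - tau + 1) .. i,
    # scaled by the sample period to approximate the integral.
    return np.array([np.convolve(ch, kernel, mode="full")[:ch.size] * dt
                     for ch in rectified])
```

The rectangular window is the simplest choice consistent with the integral in eq. (1); the actual smoothing window used on the PC side is not specified in the paper.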
Exploratory experiments conducted to search for a reliable relationship between iBEP patterns and tongue movements indicated that seven tongue movements, including the neutral motion, can be detected separately. The tongue motions, including the way to perform them, are described in Table I, and the intensity distribution of BEPs corresponding to each tongue motion is shown in Fig. 3.
Fig. 4. The center position of each movement: the classes Φ(u), Φ(d), Φ(r), Φ(l), Φ(f), and Φ(b) plotted in the x-y plane
B. Motion classification method
A good classification method has a high success rate in discrimination and works with few parameters or indexes. Our proposed classification method utilizes two indexes. One is the ensemble average of the iBEP values of all channels, iBEP_a(t), defined as follows,

\mathrm{iBEP}_a(t) = \frac{1}{N} \sum_{n=1}^{N} \mathrm{iBEP}_n(t), \quad (2)

where N is the number of channels. The other is the position of the gravity center, which is the positional average of all channel locations weighted by their iBEP values. Let p_n = (x_n, y_n) denote the position of each channel; then the gravity center p_g is defined as follows,

p_g = \frac{1}{N} \sum_{n=1}^{N} \mathrm{iBEP}_n(t) \, p_n. \quad (3)
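The two indexes can be sketched directly from eqs. (2) and (3); note that eq. (3) as printed normalizes by 1/N rather than by the sum of the weights, and the sketch follows the printed formula (function names are ours):

```python
import numpy as np

def ensemble_average(ibep):
    """Eq. (2): mean of the smoothed iBEP values over all N channels."""
    return float(np.mean(ibep))

def gravity_center(ibep, positions):
    """Eq. (3) as printed: p_g = (1/N) * sum_n iBEP_n * p_n.

    ibep:      array of shape (N,), smoothed iBEP value per channel.
    positions: array of shape (N, 2), electrode coordinates p_n = (x_n, y_n).
    Returns the weighted center position (x_g, y_g).
    """
    n = len(ibep)
    return (ibep[:, None] * positions).sum(axis=0) / n
```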
As a pilot experiment, the p_g position corresponding to each motion was measured while a user (subject 1) took each tongue posture for 10 seconds in order. The plotting rate was 0.1 seconds throughout the experiment. Figure 4 shows a typical distribution of p_g for the respective motions. Although some points overlap, it is obvious that most of the plotted points can be classified into distinct areas.
The plotted p_g is categorized by a threshold-based process. The six classes are defined as in Fig. 4. These classes are defined mathematically as follows,

\Phi(\theta) \triangleq \left\{ p_g \;\middle|\; x_{\min}(\theta) \le x_g \le x_{\max}(\theta), \; y_{\min}(\theta) \le y_g \le y_{\max}(\theta) \right\}, \quad (\theta = u, d, r, l, f, b). \quad (4)
When p_g falls in Φ(f) and Φ(u) at the same time, it is classified into either class by an additional parameter, iBEP_a. Figure 5 shows that the iBEP_a of the up motion tends to be higher than that of the forward motion. Therefore, an additional threshold, z_min(u), is set to divide the overlapping area. Each threshold parameter is set manually before the
Fig. 5. The center position (x_g, y_g) and the ensemble average of iBEPs on each electrode, iBEP_a, with the threshold z_min(u) separating the up and forward clusters
experiment. Developing an auto-calibration algorithm is one of our future works.
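The box test of eq. (4) together with the z_min(u) tie-break can be sketched as follows; the threshold boxes here are hypothetical placeholders and must be calibrated per user, as described in the protocol below:

```python
def classify(pg, ibep_a, thresholds, zmin_u):
    """Threshold-based classification per eq. (4), with the iBEP_a
    tie-break applied when p_g falls in both Phi(f) and Phi(u).

    pg:         (x_g, y_g) gravity-center position.
    ibep_a:     ensemble average of iBEPs, eq. (2).
    thresholds: dict mapping a motion index ('u','d','r','l','f','b')
                to a box (x_min, x_max, y_min, y_max).
    zmin_u:     manually set intensity threshold dividing 'u' from 'f'.
    Returns the motion index, or 'n' (neutral) outside every box.
    """
    x, y = pg
    hits = [m for m, (xmin, xmax, ymin, ymax) in thresholds.items()
            if xmin <= x <= xmax and ymin <= y <= ymax]
    if set(hits) == {"f", "u"}:
        # Overlapping area: the up motion shows the higher iBEP_a (Fig. 5).
        return "u" if ibep_a >= zmin_u else "f"
    return hits[0] if hits else "n"
```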
C. Evaluation of classification accuracy
The proposed classification method was evaluated by the following protocol:
(i) An experimenter set the electrode array on the anterior neck of the subject (Fig. 1 (2)).
(ii) The system calculated and plotted p_g while the user took each tongue posture for five seconds. Then the experimenter decided each threshold.
(iii) The subject tried to take each tongue posture for Ttest = 20 seconds again. Let Tc denote the time the system identified the motion correctly; the correct identification rate Qc is calculated as follows,

Q_c = \frac{T_c}{T_{test}} \times 100. \quad (5)

(iv) The subject also took actions (chewing, drinking, and talking) for Ttest2 = 20 seconds to evaluate the erroneous identification rate Qw. Let Tw denote the time the system identified a motion wrongly; Qw is given by the following equation,

Q_w = \frac{T_w}{T_{test2}} \times 100. \quad (6)
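Worked numerically, eqs. (5) and (6) are simple time-fraction percentages (the function name is ours):

```python
def identification_rates(t_correct, t_test, t_wrong, t_test2):
    """Eqs. (5)-(6): Qc is the percentage of the test time during which
    the classifier output matched the instructed posture; Qw is the
    false-trigger percentage during chewing, drinking, and talking."""
    qc = t_correct / t_test * 100.0
    qw = t_wrong / t_test2 * 100.0
    return qc, qw
```

For example, 17 s of correct output in a 20 s test gives Qc = 85 %, and 4 s of spurious output during a 20 s action period gives Qw = 20 %.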
Table II shows the results of these experiments. Except for the "up" motion, Qc is higher than 85 %. The erroneous identification rate Qw, however, is not low enough. For now, a user should refrain from actions such as chewing, drinking, and talking while using this system.
TABLE II
ACCURACY AND ERRONEOUS RATE OF IDENTIFICATION

Tongue motion  Qc       Action    Qw
up             74.3%    chewing   33.3%
down           94.9%    drinking  64.8%
right          86.6%    talking   19.9%
left           96.3%
forward        89.4%
backward       85.0%
III. OPERABILITY TEST
A. Fitts’ law based operability test
To evaluate the performance of our proposed alternative interface system, we developed a Fitts' law based operability test GUI (Fig. 6), which extends the traditional one-dimensional model to a two-dimensional model. Fitts' law is a model of human movement in HCI and ergonomics which predicts that the time required to rapidly move to a target area is a function of the distance to and the size of the target. Fitts' law is used to model the action of pointing, either physically, by touching an object with a hand or finger, or virtually, by pointing to an object on a computer screen using a pointing device. It is, therefore, useful for assessing the operability of a new computer pointing device and comparing it with other pointing devices. According to Fitts' law, the movement time (MT) of the cursor to a target and the task difficulty (ID: index of difficulty) have the following linear relationship,
MT = α + βID, (7)
where α represents the start/stop time of the device (intercept) and β stands for the inherent speed of the device (slope). The reciprocal of β is called the index of performance (IP), in bits per second; the IP represents how quickly pointing and clicking can be performed with the pointing device. The IP is also affected by the mouse cursor speed, which can easily be adjusted under general operating systems such as Windows, Mac, and Linux. If the cursor speed, however, is set too high, the pointing accuracy decreases. So the mouse cursor speed has to be set considering the trade-off between speed and accuracy. The IP therefore represents the overall operability of a pointing device. The equation shows that an interface with a high IP is better than one with a lower IP, because a high IP indicates that the device performance is less affected by a high ID.
The ID depends on the width (W) of the target, which is equal to the diameter of the target circle in our model, and the distance (D) between the cursor and the target. The unit of ID is the "bit", and it is defined mathematically as follows,

\mathrm{ID} = \log_2 \left( 1 + \frac{D}{W} \right). \quad (8)
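Putting eqs. (7) and (8) together, the ID of each target can be computed and the IP recovered as the reciprocal slope of a least-squares fit of MT against ID (a sketch; function names are ours):

```python
import numpy as np

def index_of_difficulty(d, w):
    """Eq. (8): ID = log2(1 + D/W), in bits, for cursor-target
    distance d and target diameter w."""
    return np.log2(1.0 + d / w)

def fit_fitts(ids, mts):
    """Least-squares fit of eq. (7), MT = alpha + beta * ID.

    ids: array of per-click index-of-difficulty values [bit].
    mts: array of the corresponding movement times [s].
    Returns (alpha, beta, ip), where IP = 1/beta [bit/s].
    """
    beta, alpha = np.polyfit(ids, mts, 1)  # slope first, then intercept
    return alpha, beta, 1.0 / beta
```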
Fig. 6. Snapshot of the two-dimensional test bed GUI (1280 x 768 pixels) used to evaluate the operability of the computer interface; D denotes the cursor-target distance and W the target diameter
Fig. 7. Resulting trajectories of the curve tracing experiment (500-pixel circle with start/stop point) at low and medium cursor speeds
TABLE III
RESULTS OF CURVE TRACING EXPERIMENTS

                          low speed  medium speed
time [s]                  56.8       22.8
maximum deviance [pixel]  22.5       45
Thus, it is obvious that the task becomes more difficult as D increases or W decreases. The operability parameter IP and the parameter α are determined experimentally in the following section.
B. Experimental condition and subject information
The two-dimensional Fitts' law based test bed GUI is shown in Fig. 6. The screen size is WXGA (1280 x 768) and the background is colored black. One hundred cyan-colored circle targets, whose sizes and locations are random (30 < W < 300 pixels), appear in order on the screen, each after the former target is clicked correctly. The MT is measured as the time interval from the moment the former target is clicked to the moment the next target is clicked.
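The random target generation described above can be sketched as follows; the clamping that keeps each circle fully on screen is our assumption, since the paper does not state how off-screen placements are handled:

```python
import random

def make_target(width=1280, height=768, w_min=30, w_max=300):
    """One random circular target, as in the test bed GUI: the diameter
    W is drawn from (30, 300) pixels, and the center is drawn so that
    the circle stays fully on the 1280 x 768 screen (our assumption).
    Returns (x, y, w): center coordinates and diameter in pixels."""
    w = random.uniform(w_min, w_max)
    r = w / 2.0
    x = random.uniform(r, width - r)
    y = random.uniform(r, height - r)
    return x, y, w
```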
To determine the cursor speed, a pilot experiment in which a user traces a target curve was done. While subject 1 tried to trace a circle on the screen, the deviation from the target curve and the required time were measured. Figure 7 shows the resulting trajectories, and Table III shows the experimental results. From these results, the mouse cursor speed was set to medium speed in the Windows manner. All subjects used the same cursor speed in both experiment sessions.
The test was conducted in two sessions. The first session used a mouse, which is a standard computer interface tool, and the second used our proposed interface system. Three subjects (S1-S3) with intact limbs (three males, average age 22.7 years) volunteered and sat comfortably in front of the computer screen on which the test bed GUI was displayed. The subjects were instructed to point to and click a circle target by moving the cursor. In this experiment, four tongue motions (right, left, up, and down) correspond to the mouse cursor movement directions (right, left, up, and down), and the forward motion of the tongue corresponds to the mouse click action.
One trial requires the subject to click the target one hundred times. Each subject did 10 trials with our proposed tongue motion based interface and 3 trials with the mouse interface. Every trial was done in one day per subject. The system was then evaluated by the index of performance (IP).
Fig. 8. The relationships between the MT [s] and the ID [bit] (Subject 1: Mouse, trial 8; Subject 1: Tongue, trial 8; Subject 2: Tongue, trial 10; Subject 3: Tongue, trial 8)
Fig. 9. Transition of the IP [bit/s] over the number of trials (Subjects 1-3)
TABLE IV
RESULTS OF THE EXTENDED FITTS' LAW

                                    S1     S2     S3     Overall
Tongue motion  Max IP [bit/s]       0.569  0.384  0.367  0.440
Tongue motion  Average IP [bit/s]   0.450  0.191  0.158  0.266
Mouse          Average IP [bit/s]   8.172  7.833  7.010  7.700
C. Experimental results
The relationships between the MT and the ID with the developed tongue motion based interface and with the mouse, according to equation (7), from the experiment on each subject, are shown in Fig. 8. The transition of each IP is shown in Fig. 9. The average values of the IP for each subject, and their overall averages, with the developed tongue based system and the mouse, are shown in Table IV. Although the resulting overall average was 0.266 bit/s in this experiment, it can be considered that a user attains better performance after enough training, because Fig. 9 shows that the trend of the IP is upward. Through the whole set of experiments, subject 1 earned a good score, and his maximum IP was 0.569 bit/s.
IV. SUMMARY AND DISCUSSION
We developed a classification algorithm for a sophisticated alternative interface using the BEPs of muscles related to tongue motions. This algorithm utilizes two indexes, one of which is the ensemble average of the iBEP values and the other the gravity center of the electrode positions weighted by the iBEP values. Utilizing these two indexes, 7 types of motions including neutral were distinguished with more than 70 percent accuracy in the classification experiment.
Additionally, the operability of the developed interface system was quantitatively evaluated using a Fitts' law based GUI test bed, which was extended to a two-dimensional model, and the performance of the proposed interface was compared with that of other available interfaces. The IP of the proposed interface was 0.440 bit/s under good conditions, compared with the reported IP (0.386 bit/s) [2] of the commercial assistive pointing device called Brainfinger. Although 0.440 bit/s is lower than that of the method using forearm BEPs reported in [6], our proposed method still has an advantage in scope of applicability, because the proposed system can be applied for SCIs at the C3-C4 functional levels, whereas the method of [6] needs the C6 functional level. Moreover, the proposed interface system is non-invasive and simple to apply, because the user just attaches the electrode array to the skin surface of the anterior neck region.
However, some issues still remain to be solved. The biggest problem is that the robustness of the classification algorithm against donning and doffing is low. Since the skin condition is sensitive and varies from day to day, it is difficult to develop an algorithm which can fully compensate for disturbances in the electrical condition of the skin. Therefore, this system always needs a calibration process at the beginning. The burden of the calibration process, however, can be reduced by developing an auto-calibration method. For this purpose, we will utilize canonical correlation analysis to refine the classification method, including the calibration process. Another issue is to test our proposed system on a larger number of subjects in order to investigate the effect of learning.
ACKNOWLEDGMENT
This study was supported in part by the Global COE
Program on ”Cybernics: fusion of human, machine, and in-
formation systems.”
REFERENCES
[1] B. Rebsamen, C. L. Teo, Q. Zeng, M. H. Ang Jr., E. Burdet, C. Guan, H. Zhang, and C. Laugier, "Controlling a wheelchair indoors using thought," IEEE Intelligent Systems, pp. 18–24, 2007.
[2] A. Pino, E. Kalogeros, E. Salemis, and G. Kouroupetroglou, "Brain computer interface cursor measures for motion-impaired and able-bodied users," in Proc. Int. Conf. Human-Comput. Interact., Crete, Greece, 2003, vol. 4, pp. 1462–1466.
[3] X. Huo, J. Wang, and M. Ghovanloo, "A magneto-inductive sensor based wireless tongue-computer interface," IEEE Trans. Neural Systems and Rehab. Eng., vol. 16, no. 5, pp. 497–504, 2008.
[4] Y. Ichinose, M. Wakumoto, K. Honda, T. Azuma, and J. Satou, "Human interface using a wireless tongue-palate contact pressure sensor system and its application to the control of an electric wheelchair," IEICE Trans. Inf. & Syst., vol. 86, pp. 364–367, 2003 (in Japanese).
[5] R. Vaidyanathan, B. Chung, L. Gupta, H. Kook, and Kota, "Tongue-movement communication and control concept for hands-free human–machine interfaces," IEEE Trans. Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 37, no. 4, pp. 533–546, 2007.
[6] C. Choi, S. Micera, J. Carpaneto, and J. Kim, "Development and quantitative performance evaluation of a noninvasive EMG computer interface," IEEE Trans. Biomedical Eng., vol. 56, no. 1, Jan. 2009.
[7] C. Fleischer and G. Hommel, "A human–exoskeleton interface utilizing electromyography," IEEE Trans. Robotics, vol. 24, no. 4, pp. 872–882, Aug. 2008.
[8] H. Kawamoto, S. Lee, S. Kanbe, and Y. Sankai, "Power assist method for HAL-3 using EMG-based feedback controller," in Proc. IEEE Int. Conf. Systems, Man and Cybernetics, vol. 2, pp. 1648–1653, 2003.