
A preliminary investigation assessing the viability of classifying hand postures in seniors

Abstract

Background

Fear of frailty is a major concern for seniors. Surface electromyography (sEMG) controlled assistive devices for the upper extremities could potentially be used to augment seniors' force while training their muscles and reduce their fear of frailty. In fact, these devices could both improve self-confidence and facilitate independent living in domestic environments. The successful implementation of sEMG controlled devices for the elderly strongly relies on the capability of properly determining seniors' actions from their sEMG signals. In this research, we investigated the viability of classifying hand postures in seniors from sEMG signals of their forearm muscles.

Methods

Nineteen volunteers, including seniors (70 years old on average) and young people (27 years old on average), participated in this study, and sEMG signals from four of their forearm muscles (i.e. Extensor Digitorum, Palmaris Longus, Flexor Carpi Ulnaris and Extensor Carpi Radialis) were recorded. The feature vectors were built by extracting features from each sEMG channel, including autoregressive (AR) model coefficients, waveform length and root mean square (RMS). A multi-class support vector machine (SVM) was used as the classifier to distinguish between fifteen different essential hand gestures, including finger pinching.

Results

Classification of hand gestures in both the pronation and supination positions of the arm was possible. The classified hand gestures were: rest, ulnar deviation, radial deviation, grasp and four different finger pinching configurations. The obtained average classification accuracy was 90.6% for the seniors and 97.6% for the young volunteers.

Conclusions

The obtained results show that pattern recognition of sEMG signals in seniors is feasible for both the pronation and supination positions of the arm, and that the use of only four sEMG channels is sufficient. The outcome of this study therefore supports the hypothesis that, although significant neurological and physical changes occur in humans while ageing, sEMG controlled hand assistive devices could potentially be used by older people.

Background

Improving independent living of seniors and maintaining their autonomy are compelling research goals for our society. Some simple activities of daily living, such as opening and closing the screw cap of a bottle or turning a tap handle, can be difficult tasks for a senior. With increasing age, skeletal muscles lose their strength [1]. To perform simple everyday operations, seniors could benefit from assistive devices that provide additional force for their hand movements and also train their muscles [2].

A compelling challenge in the development of assistive devices is how to acquire input signals that provide information regarding the action the user is undertaking. Acquiring input signals from the neurological activity of the user can provide this information. sEMG is a suitable technique for evaluating and measuring the electrical activity produced by skeletal muscles and can also provide important information regarding neuromuscular disorders [3]. Using sEMG, we are able to detect the electrical signals generated by muscle cells when they are neurologically or electrically activated; if this information is interpreted correctly, it can reveal the intention of the user [2, 3].

EMG signals have been considered for controlling prosthetic hands and assistive devices. Different prosthetic hands have been prototyped, including the Smart Hand [4] and the Cyber Hand [5]. Some EMG driven prostheses have also been commercialised; examples are Otto Bock's Sensor Hand Speed [6] and the iLimb [7]. In the aforementioned studies, the goal was to obtain a prosthetic hand that could perform movements similar to a human hand. A challenging part in the development of these prosthetic hands is the design of an intuitive control achieved by detection and interpretation of the user's neurological activity [8, 9]. Whether used for controlling prosthetic, rehabilitative or assistive devices, sEMG signals must be processed to identify the intention of the user.

One of the main challenges related to the processing and classification of sEMG is related to the synergistic use of upper extremity muscles. For example, raising the shoulder to lift the forearm results in forearm signal changes [9]; similarly, contracting the index finger results in co-contraction of forearm muscles [10–12].

Different pattern recognition techniques have been used for the classification of sEMG [2, 3] and the identification of hand gestures in young volunteers [13, 14]. For example, multilayer perceptron [15, 16], SVM [9, 17–20], hidden Markov model [21], neural network [22], Bayesian classifier [23] and fuzzy classifier [24–26] techniques have been proposed. Multiple features have been investigated, including AR model coefficients [22, 24, 26, 27], mean absolute value [27, 28], slope sign changes [29, 30], zero crossings [27–29], waveform length [29, 30] and wavelet packet transform [15, 31].

Most of this research has been performed with populations involving young healthy volunteers and amputees. Little research has, however, been carried out to assess whether aging prevents successful sEMG classification, which is needed to control assistive devices developed to augment force and reduce fear of frailty in older people. It should be noted that significant neurological and physical changes occur in humans while ageing [32]. This study therefore focuses on assessing the viability of classifying hand postures in seniors.

Methods

Data collection

A custom rig was used to measure hand force and torque exerted by the volunteers. The rig (see Figure 1) consisted of a force sensor (Futek LCM-300) which measured contraction force. This sensor was placed between two plastic halves, which formed together a semi-sphere to enable the volunteers to comfortably hold the rig with their hand. These two plastic halves were connected to a metallic platform through a torque sensor (Transducer Techniques TRT-100) that recorded torque produced by the volunteer while performing ulnar or radial deviation movements.

Figure 1. Custom rig.

Guidelines presented in the surface EMG for the non-invasive assessment of muscles (SENIAM) project [33] were followed to obtain good skin contact with the electrodes. According to these guidelines, the skin was cleaned with an alcohol swab and electrodes were placed at the locations shown in Figure 2. The sEMG electrodes were attached to the volunteers' forearms using medical adhesive bands that made the electrodes' active faces adhere to the skin.

Figure 2. Location of surface electrodes on the forearm.

sEMG signals were recorded from the following four muscles in order to detect movement of the wrist and fingers [34]: Extensor Digitorum (ED), Palmaris Longus (PL), Flexor Carpi Ulnaris (FCU) and Extensor Carpi Radialis (ECR). The function of each muscle is summarized in Table 1. sEMG signals were acquired through a Noraxon system (Myosystem 1400L). A data acquisition board from National Instruments (USB-6289) was used to acquire both the sEMG signals and the data obtained from the custom rig used to measure hand force and torque. Since the EMG signal has usable energy in the 0-500 Hz range [35], the acquired sEMG signal was digitized at 1024 samples per second and stored on a computer through an application developed in LabVIEW. The LabVIEW application also had a graphical interface that enabled volunteers to visualize the force they were exerting during the tests. For each participant, the maximum force exerted on the rig was used to define the participant's maximum voluntary contraction (MVC). According to [36], the applied force should not exceed 40-50% of the MVC in order to prevent upper extremity musculoskeletal injuries. For this reason, all the protocols were defined to prevent exceeding this limit.

Table 1 Muscle function

Protocol

Twelve seniors (70 years old on average) and seven young volunteers (27 years old on average) participated in this study. The Office of Research Ethics, Simon Fraser University, approved this study and each senior signed a consent form. Each volunteer followed the eight predefined protocols summarized in Table 2. These protocols were defined to simulate simple activities of daily living involving the wrist and fingers, such as opening and closing the screw cap of a jar or grasping an object. The identified protocols considered a combination of several hand movements including grasping, finger pinching, wrist ulnar/radial deviation and forearm pronation/supination. Each volunteer started at the rest position shown in Figure 3-a.

Table 2 Protocols
Figure 3. Hand gestures and motions chosen for classification in the pronation position of the arm: (a) rest, (b) grasp, (c) ulnar deviation, (d) radial deviation, (e) finger pinching: index finger, (f) finger pinching: middle finger, (g) finger pinching: ring finger, (h) finger pinching: little finger.

In protocol A, as shown in Figure 3-b, the volunteer was asked to squeeze the custom rig twice with maximum force in the pronation position of the arm. The recorded maximum force was used to define the MVC for squeezing.

In protocol B, as shown in Figures 3c-d, the volunteer was asked to apply maximum torque in ulnar and radial deviation twice (pronation position of the arm). The maximum torques for ulnar and radial deviations were used to identify the ulnar/radial MVCs.

In protocol C, the volunteer was asked to squeeze the custom rig at 50% of her/his MVC for 5 seconds (pronation position of the arm). The volunteer repeated this protocol three times. Using the graphical interface of the developed LabVIEW application, the volunteer had visual feedback for the force applied to the custom rig.

In protocol D, the volunteer was asked to alternate radial and ulnar deviation for 5 seconds at 50% of MVC (pronation position of the arm). The volunteer repeated this procedure three times.

In protocol E, as shown in Figures 3e-h, the volunteer pinched the force sensor first with the thumb and index finger, then with the thumb and middle finger, then with the thumb and ring finger, and finally with the thumb and little finger (pronation position of the arm). The pinching was repeated twice for each combination of fingers.

In protocols FC, FD and FE (see Figures 4a-h), each volunteer started at the rest position and repeated protocols C, D and E, but with the arm in the supinated position. Figure 5 presents the output recorded by the force and torque sensors for one of the volunteers following protocols A, B, C and D. Figure 6 presents a sample output of the force and torque sensors related to protocols E, FC, FD and FE.

Figure 4. Hand gestures and motions chosen for classification in the supination position of the arm: (a) rest, (b) radial deviation, (c) ulnar deviation, (d) grasp, (e) finger pinching: index finger, (f) finger pinching: middle finger, (g) finger pinching: ring finger, (h) finger pinching: little finger.

Figure 5. Forces and torques representing predefined protocols A, B, C and D: (a) Protocol A, (b) Protocol B, (c) Protocol C and (d) Protocol D.

Figure 6. Forces and torques representing predefined protocols E, FC, FD and FE: (a) Protocol E, (b) Protocol FC, (c) Protocol FD and (d) Protocol FE.

Protocols A and B (see Table 2) were followed to record the maximum force and torque produced by the user. Protocols C, D, E, FC, FD, and FE were instead used to generate data for the formation of the different hand gesture classes summarized in Table 3. Specifically, protocols C, D and E enabled extracting data for classification in the pronation position of the arm (classes 2-8 in Table 3), whereas protocols FC, FD and FE were used to extract data for classification in the supination position of the arm (classes 9-15 in Table 3).

Table 3 Class Definition

Feature extraction and classification

The proposed sEMG signal classification scheme is presented in Figure 7. As shown in this figure, signals recorded from the Noraxon measurement system were processed in MATLAB R2009a for feature extraction in order to reduce the dimensionality of the raw sEMG input.

Figure 7. The proposed sEMG signal classification scheme.

Pattern recognition accuracy is influenced by the selection of extracted features, and features cannot be extracted from individual samples as the structural detail of the signal would be lost [37]. Instead, the features need to be calculated by segmenting the raw sEMG signal and computing a set of features from each segment. For this reason, the recorded data was segmented into 250 ms intervals, corresponding to 256 samples per segment, and features were extracted from each segment. For the next feature extraction, the segment window was then advanced by 125 ms, i.e. 128 samples.
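As an illustration of this windowing scheme, the following sketch segments one sEMG channel into 256-sample windows advanced by 128 samples. It is not taken from the original study (which was implemented in MATLAB); the array names are illustrative.

```python
import numpy as np

FS = 1024      # sampling rate in Hz, as reported in the paper
WIN = 256      # 250 ms analysis window
STEP = 128     # 125 ms window increment (50% overlap)

def segment_channel(emg, win=WIN, step=STEP):
    """Split a 1-D sEMG channel into overlapping analysis windows.

    Returns an array of shape (n_segments, win)."""
    n_segments = 1 + (len(emg) - win) // step
    return np.stack([emg[i * step:i * step + win] for i in range(n_segments)])

# Example: 6 s of synthetic data at 1024 Hz -> 47 overlapping segments
dummy = np.random.randn(6 * FS)
print(segment_channel(dummy).shape)   # (47, 256)
```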

Waveform length, time windowed RMS and AR models were used to extract six features for each of the four sEMG channels. Specifically, waveform length and RMS provided one feature each, whereas AR models provided four features in total as explained in the following paragraphs.

The waveform length, which measures the waveform complexity in each segment, was computed as:

y = \sum_{r=1}^{N} |\Delta t_r| = \sum_{r=1}^{N} |t_r - t_{r-1}|

(1)

where t_r is the amplitude of the r-th sample and N is the number of samples.
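A direct translation of Eq. (1), assuming the segment is available as a NumPy array (the original processing was carried out in MATLAB):

```python
import numpy as np

def waveform_length(segment):
    """Waveform length of one analysis window (Eq. 1): the sum of the
    absolute differences between consecutive samples."""
    return float(np.sum(np.abs(np.diff(segment))))
```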

The time windowed RMS value of the raw sEMG signal was used in order to provide information regarding the amplitude of the signal. This feature is mathematically presented as:

m_{rms} = \sqrt{\frac{m_1^2 + m_2^2 + \cdots + m_n^2}{n}}

(2)

where m_i is the amplitude of the i-th sample in the time domain, and n is the number of samples. In our case n was equal to 256.
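The corresponding one-line sketch for Eq. (2), again assuming a NumPy array as input:

```python
import numpy as np

def rms(segment):
    """Time-windowed RMS of one analysis window (Eq. 2); here n is the
    segment length (256 samples)."""
    return float(np.sqrt(np.mean(np.square(segment))))
```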

The last features used in this study were based on AR models. An AR model expresses the current sample as a linear combination of previous samples plus noise. The mathematical representation of the current value is given by (3):

t_n = \sum_{i=1}^{p} q_i^{p}\, t_{n-i} + w_n

(3)

where w_n is the additive noise and q_i^p, for i = 1, ..., p, are the AR model coefficients. Four AR model coefficients (p = 4) were selected as adequate for modelling EMG signals, as discussed in [38].
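The paper does not state which estimator was used for the fourth-order AR coefficients; the sketch below uses the Yule-Walker equations as one common choice and then assembles the six features per channel described above (waveform_length and rms are the helpers sketched earlier).

```python
import numpy as np

def ar_coefficients(segment, order=4):
    """Estimate the coefficients of a 4th-order AR model (Eq. 3) via the
    Yule-Walker equations (an assumed estimator, not specified in the paper)."""
    x = np.asarray(segment, dtype=float)
    x = x - x.mean()
    n = len(x)
    # biased autocorrelation estimates up to lag `order`
    r = np.array([np.dot(x[:n - k], x[k:]) for k in range(order + 1)]) / n
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

def channel_features(segment):
    """Six features for one sEMG channel: waveform length, RMS and four
    AR coefficients. Concatenating the four channels yields the
    24-dimensional feature vector used for classification."""
    return np.concatenate(([waveform_length(segment), rms(segment)],
                           ar_coefficients(segment)))
```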

Six seconds of data per person per protocol were extracted. In order to train and test the pattern recognition model, the gathered data was divided into training and testing sets (see Figure 7) [39]. The training set was limited to 3807 data segments, namely 90% of the gathered data, as the use of a higher number of segments did not significantly improve the classification accuracy. The remaining 10% of the gathered data, corresponding to 423 data segments, was used as the testing set.
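The paper does not describe how segments were assigned to the two sets; the sketch below assumes a simple random 90/10 split over the pooled feature vectors of one participant.

```python
import numpy as np

def split_90_10(features, labels, seed=0):
    """Randomly assign 90% of the segments to the training set and the
    remaining 10% to the testing set (3807 and 423 segments in the paper).
    `features` is an (n_segments, 24) array, `labels` an (n_segments,) array."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(features))
    cut = int(round(0.9 * len(features)))
    tr, te = idx[:cut], idx[cut:]
    return features[tr], labels[tr], features[te], labels[te]
```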

An SVM [40] was chosen as the classifier in this study. The SVM was selected among the possible pattern recognition tools as it is a well-known robust classifier that has been extensively and successfully used to process bio-signals [41–43]. In addition, SVMs work well in high dimensional spaces and have shown good classification results in many practical applications [44–49].

In its general formulation, the SVM [40] requires solving the following optimization problem:

\min_{w,\,\xi} \; \frac{1}{2}\|w\|^2 + c \sum_{n=1}^{N} \xi_n \quad \text{subject to} \quad a_n z(x_n) \ge 1 - \xi_n, \;\; \xi_n \ge 0, \;\; n = 1, \ldots, N

(4)

where w is the vector of adaptive model parameters, c > 0 is the penalty factor, N is the total number of data points, a_n is the label associated with a data point, ξ_n is the slack variable, z is the learned model, x_n is the vector representing a data point, and n is the index associated with a data point.

In this study, the LibSVM tool [50] was used in the MATLAB R2009a environment. LibSVM provides an implementation of multi-class SVM using the one-versus-one strategy, whose details are presented in [51]. LibSVM supports well-known kernels such as the linear, polynomial, sigmoid and radial basis function (RBF, i.e. Gaussian) kernels.

Following the guidelines presented in [52], the RBF kernel was selected as it nonlinearly maps the samples and has a limited number of hyperparameters, thus reducing the complexity of model selection. The mathematical representation of the RBF kernel is:

k(x_i, x_j) = \exp\left(-\gamma \|x_i - x_j\|^2\right), \quad \gamma > 0

(5)

Eight-fold cross validation along with a grid search was used to select the optimal pattern recognition parameters c and γ. Figure 8 shows an illustrative example of the results obtained for a single participant. It can be seen that the highest cross validation accuracy occurs within the interval (0, 100) for c and (0, 3) for γ. These intervals were therefore used to identify the optimal parameters for all participants.
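The study used LibSVM from MATLAB; a roughly equivalent sketch in Python is shown below using scikit-learn, whose SVC is built on LibSVM and applies the same one-versus-one strategy to the 15 classes. The grid values and the added feature standardization are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def fit_participant_model(X_train, y_train):
    """Select c and gamma by 8-fold cross validation with a grid search,
    then return the RBF-kernel SVM fitted with the best parameters."""
    param_grid = {
        "svc__C": np.linspace(1, 100, 20),        # c searched within (0, 100]
        "svc__gamma": np.linspace(0.1, 3.0, 15),  # gamma searched within (0, 3]
    }
    # StandardScaler is a common addition, not described in the paper
    pipeline = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    search = GridSearchCV(pipeline, param_grid, cv=8, scoring="accuracy")
    search.fit(X_train, y_train)
    return search.best_estimator_, search.best_params_
```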

Figure 8. Cross validation accuracy based on the c and γ parameters.

Results and discussion

The optimal values of the parameters c and γ were selected according to the highest cross validation accuracy for each individual volunteer. Table 4 presents the selected c and γ parameters for each of the twelve seniors (denoted with capital letters A-Q) who participated in this study. Each pair of c and γ parameters was used to build a model for classifying the hand gestures of the corresponding participant. The classification accuracies for the 12 seniors are presented in Table 5. An average accuracy of 90.62% was observed.

Table 4 The senior cross validation accuracy and model parameters c and γ
Table 5 The senior pattern recognition accuracy

The accuracy was over 95% for senior Q and below 85% for senior L (see Table 5). Senior Q controlled the hand functions well, which resulted in an accurate separation between torque patterns. As an illustrative example, the torque output recorded for senior Q is shown in Figure 9-a. It is clear from this figure that senior Q was executing protocol FD (three repetitions of alternating radial and ulnar deviation). On the other hand, senior L controlled hand functions poorly, which resulted in a small separation between torque patterns. The torque output recorded for senior L is shown in Figure 9-b; it is clear that this senior was not able to correctly follow protocol FD. It should be noted that, although the classification accuracy was lower for senior L (see Table 5), it was still acceptable (above 83%).

Figure 9. The output recorded by the torque sensor for seniors: (a) senior Q following protocol FD correctly and (b) senior L following protocol FD incorrectly.

The system was therefore able to accurately classify the action of the seniors' hands with minimal misclassification, which occurred mainly for finger pinching. Figure 10 shows, for example, the sEMG signals extracted from the ECR, ED, PL and FCU muscles of senior A (Figures 10a-d), the "predicted classes" identified by our classification system (Figure 10-e) and the "actual classes" corresponding to the different protocols (Figure 10-f). It can be seen that misclassification occurred for consecutive classes related to finger pinching (see highlighted boxes in Figure 10-e). Specifically, class 7 (ring finger pinching in pronation position) was confused with class 6 (middle finger pinching in pronation position) and class 14 (ring finger pinching in supination position) was confused with class 13 (middle finger pinching in supination position) (see Table 3). It should be noted that this misclassification, which probably resulted from co-contraction of the forearm muscles, is believed to be acceptable for future potential devices assisting finger movements, as the middle, ring and little fingers generally have synergistic patterns during functional grasping [53].

Figure 10. System performance: (a) ECR muscle activation, (b) ED muscle activation, (c) PL muscle activation, (d) FCU muscle activation, (e) class predicted by the system, (f) actual class.
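The confusion between adjacent pinching classes can be quantified with a confusion matrix; this is not part of the paper's pipeline, only one way to reproduce the inspection described above, assuming the class labels are the integers 1-15 of Table 3.

```python
from sklearn.metrics import confusion_matrix

def pinch_confusions(y_true, y_pred):
    """15 x 15 confusion matrix over the classes of Table 3. For example,
    cm[6, 5] counts class-7 segments (ring finger, pronation) predicted as
    class 6 (middle finger, pronation)."""
    return confusion_matrix(y_true, y_pred, labels=list(range(1, 16)))
```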

Table 5 also reports the maximum force and the maximum torque each senior was able to exert. The average maximum force was 3.11 N and the average maximum torque was 5.92 Nm. No clear relationship was identified between classification accuracy and the maximum force or maximum torque exerted by the volunteers. For example, volunteers D and N had equal classification accuracy but their maximum force and torque were respectively the highest and the lowest of the entire group of seniors.

Table 6 and Table 7 respectively present the selected c and γ parameters and the corresponding classification accuracies for the group of young volunteers. An average classification accuracy of 97.6% was obtained. Table 7 also reports the maximum force and maximum torque each young volunteer was able to exert. The average maximum force was 4.20 N and the average maximum torque was 3.37 Nm. In this case, the data suggest a linear relationship between classification accuracy and maximum force and maximum torque, as shown in Figure 11. It should, however, be noted that the number of young volunteers participating in this study was limited to seven.

Table 6 The young volunteer cross validation accuracy and model parameters c and γ
Table 7 The young volunteer pattern recognition accuracy
Figure 11. The relationship between the maximum force/torque and the classification accuracy.

A comparison between the results obtained for the seniors and the young volunteers shows that, while the maximum force decreased by about 26%, the classification accuracy decreased by only 7% with age. Although there are major physical changes occurring in humans while ageing [32], successful sEMG classification is therefore possible in seniors.

Conclusions

The possibility of associating forearm sEMG patterns to seniors' hand postures was investigated. Results support the hypothesis that successful pattern recognition can be performed to distinguish different hand gestures of seniors in vital activities of daily living.

The classes identified in this study were grasping, radial/ulnar deviation and four different finger pinching configurations, in both the pronation and supination positions of the seniors' arm. The use of only four sEMG channels proved suitable for classifying the fifteen different hand gestures considered in this study. In fact, the implemented pattern recognition strategy was able to identify the different hand gestures with an accuracy greater than 90%, independently of age and gender. The difference (7%) in classification accuracy observed between the young and older people could be attributed to aging. Misclassification occurred especially in seniors with reduced hand function. Such misclassification was, however, acceptable as it was mainly related to the ring finger, whose use is generally coupled to that of the middle and little fingers during functional grasping.

References

1. Morley JE: The top ten hot topics in aging. Journal of Gerontology: Medical Sciences 2004, 59: 24–33. 10.1093/gerona/59.1.M24
2. Khokhar ZO, Xiao ZG, Sheridan C, Menon C: A novel wrist/rehabilitation device. Proceedings of the 13th IEEE International Multitopic Conference: 14–15 Dec. 2009; Islamabad 2009, 80–85.
3. Reaz MBI, Hussain MS, Yasin FM: Techniques of EMG signal analysis: detection, processing, classification and applications. Biological Procedures Online 2006, 8: 11–35. 10.1251/bpo115
4. The SmartHand project 2007. [http://www.elmat.lth.se/~smarthand]
5. The CyberHand project 2007. [http://www.cyberhand.org]
6. Otto Bock SensorHand hand prosthesis 2010. [http://www.ottobock.com/cps/rde/xchg/ob_com_en/hs.xsl/3652.html]
7. The iLimb prosthetic hand 2007. [http://www.touchbionics.com]
8. Henry M, Sheridan C, Khokhar ZO, Menon C: Towards the development of a wearable rehabilitation device for stroke survivors. Proceedings of the IEEE Toronto International Conference: 26–27 Sep. 2009; Toronto 2009.
9. Castellini C, Smagt PVD: Surface EMG in advanced hand prosthetics. Biological Cybernetics 2008, Springer-Verlag.
10. Maier MA, Raymond MCH: EMG activation patterns during force production in precision grip. 1. Contribution of 15 finger muscles to isometric force. Exp Brain Res 1995, 103: 108–122. 10.1007/BF00241969
11. Cuevas FJV, Zajac FE, Burgar CG: Large index-fingertip forces are produced by subject independent patterns of muscle excitation. J Biomechanics 1998, 31: 693–703. 10.1016/S0021-9290(98)00082-7
12. Cuevas FJV: Predictive modulation of muscle coordination pattern magnitude scales fingertip force magnitude over the voluntary range. J Neurophysiology 2000, 83: 1469–1479.
13. Parker P, Englehart K, Hudgins B: Myoelectric signal processing for control of powered limb prostheses. Journal of Electromyography and Kinesiology 2006, 16: 541–548. 10.1016/j.jelekin.2006.08.006
14. Clancy EA, Hogan N: Theoretic and experimental comparison of root-mean-square and mean-absolute-value electromyogram amplitude detectors. Proc. 19th Annu. Int. Conf. IEEE Engineering in Medicine and Biology Society (EMBS '97) 1997, 3: 1267–1270.
15. Chu JU, Moon I, Lee YJ, Kim SK, Mun MS: A supervised feature-projection-based real-time EMG pattern recognition for multifunction myoelectric hand control. IEEE/ASME Transactions on Mechatronics 2007, 12: 282–290.
16. Englehart K, Hudgins B, Parker PA, Stevenson M: Classification of the myoelectric signal using time-frequency based representations. Medical Eng & Physics 1999, 21: 431–438. 10.1016/S1350-4533(99)00066-1
17. Yoshikawa M, Mikawa M, Tanaka K: A myoelectric interface for robotic hand control using support vector machine. Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems: 29 October–2 November 2007; San Diego 2007, 2723–2728.
18. Saunders C, Stitson MO, Weston J, Bottou L, Schölkopf B, Smola A: Support vector machine reference manual. Technical Report CSD-TR-98-03, Royal Holloway, University of London, London 1998.
19. Liu YH, Huang HP, Weng CH: Recognition of electromyographic signals using cascaded kernel learning machine. IEEE/ASME Trans Mechatronics 2007, 12: 253–264.
20. Bitzer S, Smagt PVD: Learning EMG control of a robotic hand: towards active prosthesis. Proceedings of the IEEE International Conference on Robotics and Automation: 15–19 May 2006; Orlando 2006, 2819–2823.
21. Chan ADC, Englehart KB: Continuous myoelectric control for powered prosthesis using hidden Markov models. IEEE Trans Biomed Eng 2005, 52: 121–124. 10.1109/TBME.2004.836492
22. Soares A, Andrade A, Lamounier E, Carrijo R: The development of a virtual myoelectric prosthesis controlled by an EMG pattern recognition system based on neural networks. Journal of Intelligent Information Systems 2003, 21: 127–141. 10.1023/A:1024758415877
23. Englehart K, Hudgins B, Parker P: A wavelet-based continuous classification scheme for multifunction myoelectric control. IEEE Trans Biomed Eng 2001, 48: 302–311. 10.1109/10.914793
24. Karlik B, Tokhi MO, Alci M: A fuzzy clustering neural network architecture for multifunction upper-limb prosthesis. IEEE Transactions on Biomedical Engineering 2003, 50: 1255–1261. 10.1109/TBME.2003.818469
25. Inoue T, Abe S: Fuzzy support vector machines for pattern classification. Proceedings of the International Joint Conference on Neural Networks (IJCNN '01) 2001, 2: 1449–1454.
26. Park SH, Lee SP: EMG pattern recognition based on artificial intelligence techniques. IEEE Trans on Rehab Eng 1998, 6: 400–405. 10.1109/86.736154
27. Khezri M, Jahed M: Real-time intelligent pattern recognition algorithm for surface EMG signals. Biomedical Engineering Online 2007, 6: 45. 10.1186/1475-925X-6-45
28. Chan FHY, Yang YS, Lam FK, Zhang YT, Parker PA: Fuzzy EMG classification for prosthesis control. IEEE Trans on Rehab Eng 2000, 8: 305–311. 10.1109/86.867872
29. Chu JU, Lee YJ: Conjugate-prior-penalized learning of Gaussian mixture models for multifunction myoelectric hand control. IEEE Trans Neural Sys and Rehab Eng 2009, 17: 287–297.
30. Englehart K, Hudgins B: A robust, real-time control scheme for multifunction myoelectric control. IEEE Trans Biomed Eng 2003, 50: 848–854. 10.1109/TBME.2003.813539
31. Chu JU, Moon I, Mun MS: A real-time EMG pattern recognition system based on linear-nonlinear feature projection for a multifunction myoelectric hand. IEEE Trans Biomed Eng 2006, 53: 2232–2239. 10.1109/TBME.2006.883695
32. Carmeli E, Patish H, Coleman R: The aging hand. Journal of Gerontology: Medical Sciences 2003, 58A: 146–152.
33. SENIAM project. [http://www.seniam.org]
34. Lew HL, Tsai SJ: Pictorial guide to muscles and surface anatomy. In Johnson's Practical Electromyography. 4th edition. Edited by: Pease WS, Lew HL, Johnson EW. Lippincott Williams & Wilkins; 2007: 145–212.
35. Luca CJD: Surface electromyography: detection and recording. DelSys Incorporated; 2002.
36. Mital A, Pennathur A: Musculoskeletal overexertion injuries in the United States: mitigating the problem through ergonomics and engineering interventions. Journal of Occupational Rehabilitation 1999, 9: 115–149. 10.1023/A:1021318204926
37. Hudgins B, Parker P, Scott RN: A new strategy for multifunction myoelectric control. IEEE Trans Biomed Eng 1993, 40: 82–94. 10.1109/10.204774
38. Huang HP, Chen CY: Development of a myoelectric discrimination system for a multi-degree prosthetic hand. Proceedings of the International Conference on Robotics and Automation: May 1999; Detroit 1999, 2392–2397.
39. Tavakolan M, Khokhar ZO, Menon C: Pattern recognition for estimation of wrist torque based on forearm surface electromyography signals. Proceedings of the IEEE/RA/EMB/IFMBE International Conference on Applied Bionics and Biomechanics: October 2010; Venice.
40. Vapnik V: The support vector method of function estimation. In Nonlinear Modelling: Advanced Black-Box Techniques. Edited by: Suykens JAK, Vandewalle J. Kluwer Academic Publishers, Boston; 1998: 55–85.
41. Khandoker AH, Palaniswami M, Karmakar CK: Support vector machines for automated recognition of obstructive sleep apnea syndrome from ECG recordings. IEEE Trans Inf Tech Biomedicine 2009, 13: 37–48.
42. Guler I, Ubeyli ED: Multiclass support vector machines for EEG-signals classification. IEEE Trans Inf Tech Biomedicine 2007, 11: 117–126.
43. Cristianini N, Shawe-Taylor J: An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge University Press; 2000.
44. Kreßel UHG: Pairwise classification and support vector machines. In Advances in Kernel Methods: Support Vector Learning. Edited by: Schölkopf B, Burges CJC, Smola AJ. The MIT Press, Cambridge, MA; 1999: 255–268.
45. Burges CJC: A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery 1998, 2: 121–167. 10.1023/A:1009715923555
46. Burges CJC, Schölkopf B: Improving the accuracy and speed of support vector learning machines. In Advances in Neural Information Processing Systems 9. MIT Press; 1997: 375–381.
47. Cortes C, Vapnik V: Support vector networks. Machine Learning 1995, 20: 273–297.
48. Naqa IE, Yang Y, Wernik M, Galatsanos N, Nishikawa R: A support vector machine approach for detection of microcalcifications. IEEE Trans Med Imag 2002, 21: 1552–1563. 10.1109/TMI.2002.806569
49. Joachims T: Text categorization with support vector machines: learning with many relevant features. Proceedings of the 10th European Conference on Machine Learning (ECML): Chemnitz 1998, 137–142.
50. Chang CC, Lin CJ: LIBSVM: a library for support vector machines. 2001. [http://www.csie.ntu.edu.tw/~cjlin/libsvm]
51. Bishop CM: Sparse kernel machines. In Pattern Recognition and Machine Learning. Edited by: Jordan M, Kleinberg J, Schölkopf B. Springer; 2006: 325–358.
52. Hsu CW, Chang CC, Lin CJ: A practical guide to support vector classification. Department of Computer Science, National Taiwan University, Taipei, Taiwan; 2009.
53. Santello M, Soechting JF: Force synergies for multifingered grasping. Exp Brain Res 2000, 133: 457–467. 10.1007/s002210000420


Acknowledgements

This study was supported by the BC Network for Aging Research (BCNAR) and the Canadian Institutes of Health Research (CIHR). The authors would like to thank Mr. Amirreza Ziai for helping in the design of the custom-made rig, and Ms. Lulu Chavez for her collaboration and support in the data collection.

Author information

Corresponding author

Correspondence to Carlo Menon.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors' contributions

MT designed and implemented the feature selection and classification and drafted the manuscript. ZGX collected the data and developed the measurement systems. CM supervised the project, contributed to discussions and analysis and participated in manuscript revisions. All authors read and approved the final manuscript.

Rights and permissions

Open Access. This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Tavakolan, M., Xiao, Z.G. & Menon, C. A preliminary investigation assessing the viability of classifying hand postures in seniors. BioMed Eng OnLine 10, 79 (2011). https://doi.org/10.1186/1475-925X-10-79
