Research Article

Discrete-Time Sliding Mode Control Coupled with Asynchronous Sensor Fusion for Rigid-Link Flexible-Joint Manipulators

Guangyue Xue,1 Xuemei Ren,2 Kexin Xing,3 and Qiang Chen3

1 China Transport Telecommunications & Information Center, Chaoyang District, Beijing 100011, China
2 School of Automation, Beijing Institute of Technology, Beijing 100081, China
3 College of Information Engineering, Zhejiang University of Technology, Hangzhou 310023, China

Correspondence should be addressed to Guangyue Xue; [email protected]

Received 4 March 2021; Revised 11 May 2021; Accepted 15 July 2021; Published 27 July 2021

Academic Editor: Xue-bo Jin

Complexity, Volume 2021, Article ID 9927850, 12 pages, https://doi.org/10.1155/2021/9927850

Copyright © 2021 Guangyue Xue et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

This paper proposes a novel discrete-time terminal sliding mode controller (DTSMC) coupled with an asynchronous multirate sensor fusion estimator for rigid-link flexible-joint (RLFJ) manipulator tracking control. A camera is employed as an external sensor to observe the RLFJ manipulator's state, which cannot be directly obtained from the encoders since gear mechanisms or flexible joints exist. The extended Kalman filter- (EKF-) based asynchronous multirate sensor fusion method deals with the slow sampling rate and the latency of the camera by using motor encoders to cover the missing information between two visual samples. In the proposed control scheme, a novel sliding mode surface is presented by taking advantage of both the estimation error and the tracking error. It is proved in the theoretical derivation that the proposed controller achieves convergence of the tracking errors. Simulation and experimental studies are included to validate the effectiveness of the proposed approach.

1. Introduction

In many high-performance robotic manipulator applications, positioning of the end effector (or/and links) is critical, as the ultimate goal is to track the desired trajectory. However, the challenge of achieving high accuracy and dynamic performance is increased by the nonlinear flexibilities found in rigid-link flexible-joint (RLFJ) manipulators. Some researchers have designed observers to estimate states in the robot model, since link positions/velocities are typically not measured in industrial robot systems. Nicosia and Tomei [1, 2] designed a controller combined with an observer which estimates motor positions or/and motor velocities for RLFJ manipulators. Dixon et al. [3] developed a globally adaptive output-feedback tracking controller for the RLFJ dynamics, which is based on a link velocity filter. Global output-feedback methods are not easily implemented in real systems because they require link position measurements, which are rarely available in practice. A controller was designed in [4, 5] by using a neural network (NN) observer to estimate link and motor positions/velocities and dynamic parameters, but NN observer-based controllers do not take advantage of the motor positions. The Kalman filter (KF) and extended Kalman filter (EKF) have been utilized to estimate the positions of joints or/and end effectors, the driving torque, and the dynamic parameters of manipulators [6]. Lightcap and Banks [7] designed an EKF-RLFJ controller by using the EKF to estimate link and motor positions/velocities. García et al. [8] proposed compliant robot motion controllers by using the EKF to fuse wrist force sensors, accelerometers, and joint position sensors. EKF-based sensor fusion was presented by Jassemi-Zargani and Necsulescu [9] to estimate the acceleration for operational space control of a robotic manipulator. However, these reported EKF-based control methods do not discuss the case of asynchronous measurements from multirate sensors for RLFJ manipulators.

Observer-based sliding mode control (SMC) is one of the most important approaches to handle systems with uncertainties and nonlinearities [10]. For the RLFJ manipulator system, observer-based SMC of robot manipulators has been widely studied, since the state of the manipulators (e.g.,
acceleration and velocity of joints) need not always be measured directly [11]. The terminal SMC (TSMC) is used in rigid manipulator control (e.g., robust TSMC and finite-time control), since it has some superior properties compared with SMC, such as better tracking precision and fast error convergence [12–14]. In particular, the singularity problem of the TSMC was addressed in [15, 16]. However, most of these papers use the continuous-time dynamic model of the manipulator, whereas discrete-time models are what actually run in digital control systems. Discrete-time SMC has proved advantageous in digital electronics, computer control, and robotic systems. Corradini and Orlando [17] presented a robust DSMC coupled with an uncertainty estimator designed for planar robotic manipulators. However, the flexibilities of the joints are not involved in those controller designs.

To remedy such limitations, this paper proposes a novel controller, AMSFE-DTSMC, which is implemented based on DTSMC coupled with an asynchronous multirate sensor fusion estimator. The robotic multirate sensor unit contains vision and non-vision-based sensors whose sampling rates and processing times are different. In the proposed scheme, the delayed, slowly sampled vision measurement is treated as a kind of periodic "out-of-sequence" measurement (OOSM) [18], which is used to update the non-vision-based state estimation in an EKF-based asynchronous multirate sensor fusion algorithm. Using the position and velocity estimation from the sensor fusion estimator, the DTSMC is designed with a novel sliding surface that accounts for both the estimation error and the tracking error. The main contributions of this work are summarized as follows:

(i) We propose a novel tracking control scheme, AMSFE-DTSMC, which is based on the DTSMC coupled with the sensor fusion estimator for an RLFJ manipulator. The sliding surface of AMSFE-DTSMC is designed by utilizing both the estimation error and the tracking control error.

(ii) We construct an asynchronous multirate measurement model for the robotic sensors and design a sensor fusion estimator to fuse such asynchronous multirate data for robotic state estimation.

This paper is organized as follows. Section 2 gives the problem formulation. In Section 3, the multirate sensor data fusion algorithm is presented. Section 4 designs a novel DTSMC for tracking control. The simulation and experimental studies are presented in Section 5. The paper ends with a conclusion about the proposed approach.

2. Problem Formulation

In this paper, a robotic manipulator system is considered with a sensor unit including joint motor encoders and cameras fixed in the workspace. The tracking control scheme for RLFJ manipulators can be developed by using the robotic state estimate obtained via multisensor fusion. The state of the robot is estimated by these sensors directly and indirectly; however, there are limitations on a single sensor obtaining precise information. To fuse asynchronous multirate data from different sensors, the dynamic and sensor models are formulated in this section.

2.1. Discrete Rigid-Link Flexible-Joint Robot Model. The discrete dynamic model of an n-link RLFJ manipulator can be obtained by the minimization of the action functional suggested by Nicosia [2] as follows:

$$q(k+1) = q(k) + T\dot{q}(k), \tag{1a}$$

$$\dot{q}(k+1) - \dot{q}(k) = M^{-1}(q,\theta)\,T K_s \delta(k) - F_L(q,\bar{q},\dot{q},\theta)\,T, \tag{1b}$$

$$M_m\left[\dot{q}_m(k+1) - \dot{q}_m(k)\right] + F_m \dot{q}_m(k)\,T = T u(k) - K_s \delta(k)\,T, \tag{1c}$$

$$\delta(k) = q_m(k) - q(k), \tag{1d}$$

where $q(k)$, $\dot{q}(k)$ and $q_m(k)$, $\dot{q}_m(k)$ denote the position and velocity of the link and motor angles at time $k$, respectively; $\bar{q}(k) = q(k) + T\dot{q}(k)$; $T$ is the sampling interval; $M(\cdot)$ is the invertible inertia matrix, which satisfies $\xi_{\min} \le \|M\| \le \xi_{\max}$; $F_L(q,\bar{q},\dot{q},\theta) = -M^{-1}(q,\theta)\left[M(q,\theta)\dot{q} - M(\bar{q},\theta)\dot{q} - T f(q,\dot{q},\theta)\right]$, where $f(\cdot)$ represents the centrifugal, Coriolis, and gravitational forces; $K_s$, $M_m$, and $F_m$ are constant diagonal positive-definite matrices representing the joint stiffness, motor inertia, and motor viscous friction, respectively; the joint deflection $\delta(k)$ is defined as the difference between the motor and link positions; $u(k)$ denotes the motor torque; the unknown or varying dynamic parameter in the robotic model is defined as $\theta(k)$, which satisfies $\theta(k+1) - \theta(k) = T w_\theta(k)$; and the dynamic uncertainties of the links and motors are modeled by the random variables $w_l(k)$ and $w_m(k)$.

Define a state vector:

$$x(k) = \left[q(k)^T,\; \dot{q}(k)^T,\; q_m(k)^T,\; \dot{q}_m(k)^T,\; \theta(k)^T\right]^T. \tag{2}$$

The dynamics in equations (1a)–(1d) can be transformed into a state-space representation:

$$x(k+1) = F_{k+1,k}(x(k), u(k)) + \Gamma_k\,\omega(k+1,k), \tag{3}$$

where


$$F_{k+1,k}(x(k),u(k)) = \begin{bmatrix} q(k) + T\dot{q}(k) \\ \dot{q}(k) - F_L(k)\,T + M^{-1}(q)\,K_s\delta(k)\,T \\ q_m(k) + T\dot{q}_m(k) \\ \dot{q}_m(k) + M_m^{-1}\left[T u(k) - F_m\dot{q}_m(k)\,T - K_s\delta(k)\,T\right] \\ \theta(k) \end{bmatrix},$$

$$\Gamma_k = \begin{bmatrix} 0 & 0 & 0 \\ M(q,\theta)^{-1} & 0 & 0 \\ 0 & 0 & 0 \\ 0 & M_m^{-1} & 0 \\ 0 & 0 & IT \end{bmatrix}, \qquad \omega(k+1,k) = \left[w_l(k)^T,\; w_m(k)^T,\; w_\theta(k)^T\right]^T. \tag{4}$$

2.2. Measurement Model. The observation vector is given by

$$h(x(k)) = \left[q_m(k)^T,\; \dot{q}_m(k)^T,\; y(k)^T\right]^T, \tag{5}$$

where $q_m(k)$ and $\dot{q}_m(k)$ are the position and velocity of the motor angles and $y(k)$ represents the position of the end effector in image space observed by a fixed camera.

By using the standard pinhole camera model, the mapping from the Cartesian space to the image space is given by

$$y(k) = I_m(r(k)), \tag{6}$$

where $r(k)$ denotes the position of the end effector in the Cartesian space and $I_m(\cdot)$ is a transformation function from the task space to the image space. From the forward kinematics, the position relationship between the robotic joints and the end effector is described by

$$r(k) = F_k(q(k)), \tag{7}$$

where $r(k)$ is the position of the end effector at the $k$th time instant and $F_k(\cdot)$ is a transformation function from the joint space to the task space.

From equations (6) and (7), the joint position can be observed by the fixed camera:

$$y(k) = I_k(q(k)), \tag{8}$$

where $I_k(\cdot) = I_m(F_k(\cdot))$ is the mapping from the joint space to the image space. With the random noise $\nu(k) = \left[\nu_1^T(k),\; \nu_2^T(k),\; \nu_3^T(k)\right]^T$, the measurement equation is given by

$$z^i(k) = h_k^i(x(k)) + \nu^i(k), \quad i = c, m, \tag{9}$$

where $z^i(k)$ is the measurement of the state from sensor $i$; $i = c$ denotes the camera measurement and $i = m$ represents the measurement from the motors. We assume that the process noise $\omega(k)$ and the measurement noise $\nu(k)$ are independent, identically distributed white Gaussian noises satisfying the following equations at time $t_k$:

$$\begin{cases} E\left\{\omega(k+1,k)\right\} = 0, \\ E\left\{\nu(k)\right\} = 0, \\ E\left\{\nu(j)\,\omega^T(k+1,k)\right\} = 0, \\ E\left\{\omega(j+1,j)\,\omega^T(k+1,k)\right\} = Q(k)\,\delta_{jk}, \\ E\left\{\nu(j)\,\nu^T(k)\right\} = R(k)\,\delta_{jk}. \end{cases} \tag{10}$$

The measurement of the vision sensor at time $t_k$ is obtained from the visual image taken at time step $t_\kappa = t_k - t_d$, where $t_d$ denotes the delay time. The relation between the different sensors' sampling rates is given by

$$s_m = l\,s_c, \tag{11}$$

where $s_m$ and $s_c$ denote the sampling rates of the motor encoders and the visual sensors, respectively, and $l$ is a positive integer.

We show the sampling rate difference between vision and non-vision measurements in Figure 1, where the $l$-step lags of the vision measurements are described. According to the characteristics of the visual sensor, we treat the vision measurements as periodic $l$-step-lag out-of-sequence measurements.
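To make the timing concrete, the following short Python sketch (an illustration, not part of the original experimental setup; the rate ratio and sampling interval are assumed values) enumerates encoder ticks and marks the ticks at which a delayed vision frame arrives, together with the earlier tick it actually depicts.

```python
# Illustrative multirate timing: encoders sample every tick; a vision frame
# arrives every l ticks and shows the scene from l ticks earlier (l-step lag).
l = 8        # assumed ratio s_m = l * s_c from equation (11)
T = 0.001    # assumed encoder sampling interval (s)

for k in range(1, 25):
    t_k = k * T
    if k % l == 0:
        kappa = k - l   # the arriving frame was taken at t_kappa = t_k - t_d with t_d = l*T here
        print(f"tick {k:2d} (t={t_k:.3f} s): encoder sample + delayed vision frame of tick {kappa}")
    else:
        print(f"tick {k:2d} (t={t_k:.3f} s): encoder sample only")
```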

Remark 1. The velocity of the joints (or/and the end effector) can also be observed by vision sensors: $\dot{y}(k) = J_{ik}(q(k))\dot{q}(k)$, with the Jacobian matrix $J_{ik}(q) = J_i(r)J_k(q)$ mapping from the joint space to the image space, where $J_i(r)$ is the image Jacobian matrix and $J_k(q)$ denotes the Jacobian matrix from the joint space to the task space. However, the velocity measurement is always impaired by the noisy image data, the slow sampling rate, and the dynamic uncertainties in the RLFJ manipulator system. Therefore, the velocity measurement in the image space is not utilized in the measurement model.

Remark 2. Assume that the camera can cover the entire workspace of the robot. With prior knowledge of the motion planning of the robot, it is assumed that there is a one-to-one mapping from the image space to the joint space in the real robotic system.


3. Asynchronous Multirate Sensor Fusion Estimation

According to the system model described in Section 2, the robotic link position can be estimated by using the multisensor system. For asynchronous multirate sensors, the sensor fusion method is designed to use the late measurements to update the current estimated state and obtain a more accurate estimation in two steps:

Step 1: when the vision measurement is unavailable, the robotic state is estimated using the non-vision-based sensors, which keep the estimate real time by recovering the missing information between two vision samples.

Step 2: as the delayed vision measurement arrives at every vision sample at time $t = t_{lk}$ ($k = 1, 2, \ldots$), the state is re-estimated to cope with the limitations of the other sensors in the absolute position measurement.

3.1. Estimation by Using Non-Vision Sensor Measurements. From Figure 1, before the $(k-1)$th vision frame is available, we have the estimation of the state at time $t_{lk}$ using the non-vision sensor measurements:

$$\hat{x}(lk|lk) = E\left\{x(lk)\,|\,Z_m^{lk}\right\}, \qquad P(lk|lk) = \operatorname{cov}\left\{x(lk)\,|\,Z_m^{lk}\right\}, \tag{12}$$

where $Z_m^{lk} = \left\{z^m(1), z^m(2), \ldots, z^m(lk)\right\}$ represents all motor encoder measurements up to time $t_{lk}$. Using the extended Kalman filter (EKF), the estimation at time $t_{lk}$ via the motor encoder measurements is given as follows:

$$\hat{x}^-(lk|lk) = F_{lk,lk-1}\left(\hat{x}(lk-1|lk-1),\, u(lk)\right), \tag{13}$$

$$P(lk|lk-1) = A_{lk}\,P(lk-1|lk-1)\,A_{lk}^T + W_{lk}\,Q(lk-1)\,W_{lk}^T, \tag{14}$$

$$K_{lk} = P(lk|lk-1)\,H_{lk}^T\left[H_{lk}\,P(lk|lk-1)\,H_{lk}^T + R(lk)\right]^{-1}, \tag{15}$$

$$\hat{x}(lk|lk) = \hat{x}^-(lk|lk) + K_{lk}\left(z^m(lk) - h^m_{lk}\left(\hat{x}^-(lk|lk), 0\right)\right), \tag{16}$$

$$P(lk|lk) = \left(I - K_{lk}H_{lk}\right)P(lk|lk-1), \tag{17}$$

where $A_{lk}$, $W_{lk}$, $H_{lk}$, and $V_{lk}$ are Jacobian matrices and $K_{lk}$ is the correction gain. According to the above equations, the state and covariance estimates are updated with the fast-sampling-rate measurement.
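A minimal numerical sketch of the encoder-only EKF recursion (13)–(17) is given below. It is illustrative only: the Jacobians are approximated by finite differences rather than computed analytically, the noise shaping matrix $W_{lk}$ is folded into $Q$, and the callables `f_step` and `h_m` are placeholders for the robot-specific transition and encoder measurement functions.

```python
import numpy as np

def numerical_jacobian(func, x, eps=1e-6):
    """Finite-difference Jacobian of func at x (stand-in for the analytic A_lk, H_lk)."""
    fx = np.atleast_1d(func(x))
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (np.atleast_1d(func(x + dx)) - fx) / eps
    return J

def ekf_encoder_step(x_est, P, u, z_m, f_step, h_m, Q, R):
    """One EKF recursion with the motor-encoder measurement, following (13)-(17)."""
    # Prediction, equations (13)-(14)
    x_pred = f_step(x_est, u)
    A = numerical_jacobian(lambda x: f_step(x, u), x_est)
    P_pred = A @ P @ A.T + Q
    # Correction, equations (15)-(17)
    H = numerical_jacobian(h_m, x_pred)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (z_m - h_m(x_pred))
    P_new = (np.eye(x_est.size) - K @ H) @ P_pred
    return x_new, P_new
```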

3.2. Re-Estimation via Vision Sensor Measurements. Suppose that at time $t_{lk}$ the vision measurement taken at time $t_\kappa = t_{lk-l}$ is obtained. A new estimation $\hat{x}(lk|\kappa^+)$ is calculated using the information of the delayed $(k-1)$th vision measurement $z^c(\kappa)$. With $Z^{lk} = \left\{Z_m^{lk}, z^c(\kappa)\right\}$, $\hat{x}(lk|\kappa^+)$ is defined as

$$\hat{x}(lk|\kappa^+) = E\left\{x(lk)\,|\,Z^{lk}\right\}, \qquad P(lk|\kappa^+) = \operatorname{cov}\left\{x(lk)\,|\,Z^{lk}\right\}. \tag{18}$$

The delayed measurement $z^c(\kappa)$ observed by the vision sensor is used to correct the accumulated estimation errors caused by the fast-sampling-rate sensors. The equations for updating the estimation $\hat{x}(lk|lk)$ with the delayed vision measurement $z^c(\kappa)$ are given by

$$\hat{x}(lk|\kappa^+) = \hat{x}(lk|lk) + P_{xz}(lk,\kappa|lk)\,P_{zz}(\kappa|lk)^{-1}\left[z(\kappa) - \hat{z}(\kappa|lk)\right], \tag{19}$$

$$P(lk|\kappa^+) = P(lk|lk) - P_{xz}(lk,\kappa|lk)\,P_{zz}(\kappa|lk)^{-1}\,P_{xz}(lk,\kappa|lk)^T, \tag{20}$$

where $P_{xz}(lk,\kappa|lk)$ represents the cross covariance between $\hat{x}(lk|lk)$ and $\hat{z}(\kappa|lk)$, and $P_{zz}(\kappa|lk)$ is the covariance of $\hat{z}(\kappa|lk)$, which denotes the estimate of the measurement at time $t_\kappa$:

$$P_{xz}(lk,\kappa|lk) = E\left\{\tilde{x}(lk)\,\tilde{z}^c(\kappa)^T\right\}, \qquad P_{zz}(\kappa|lk) = E\left\{\tilde{z}^c(\kappa)\,\tilde{z}^c(\kappa)^T\right\}, \qquad \hat{z}(\kappa|lk) = E\left\{z(\kappa)\,|\,Z^{lk}\right\}, \tag{21}$$

with $\tilde{x}(lk) = x(lk) - \hat{x}(lk|lk)$ and $\tilde{z}^c(\kappa) = z^c(\kappa) - \hat{z}(\kappa|lk)$.

Using the EKF, $P_{xz}(lk,\kappa|lk)$, $P_{zz}(\kappa|lk)$, and $\hat{z}(\kappa|lk)$ are obtained by assuming that the function $F_{k,\kappa}(\cdot)$ is invertible. Define $F^{-1}_{lk,\kappa}(\cdot) = F_{\kappa,lk}(\cdot)$, which denotes the backward transition function used to estimate the state back from $t_{lk}$ to $t_\kappa$. Since the previous state $x(\kappa)$ is not affected by the present input signal $u(lk)$, the state relationship between $t_\kappa$ and $t_{lk}$ is given by

$$x(\kappa) = F_{\kappa,lk}\left(x(lk) - \omega(lk,\kappa)\right), \tag{22}$$

where the process noise $\omega(lk,\kappa)$ and its covariance $Q(lk,\kappa)$ are calculated by

Figure 1: The schematic diagram of the relation between vision and non-vision measurements.


$$\omega(lk,\kappa) \approx \Gamma_{lk}\,\omega(lk,lk-1) + A_{lk,lk-1}\Gamma_{lk-1}\,\omega(lk-1,lk-2) + \cdots + A_{lk,lk-1}A_{lk-1,lk-2}\cdots A_{lk-l+2,lk-l+1}\Gamma_{lk-l+1}\,\omega(lk-l+1,lk-l),$$

$$Q(lk,\kappa) = E\left\{\tilde{\omega}(lk,\kappa)\,\tilde{\omega}(lk,\kappa)^T\right\}, \qquad \tilde{\omega}(lk,\kappa) = \omega(lk,\kappa) - \hat{\omega}(lk,\kappa). \tag{23}$$

The estimate $\hat{z}(\kappa|lk)$ in equation (19) can be determined by

$$\hat{z}(\kappa|lk) = H^m_{lk}(\kappa)\,\hat{x}(\kappa|lk), \qquad \hat{x}(\kappa|lk) = F_{\kappa,lk}\left(\hat{x}(lk|lk) - \hat{\omega}(lk,\kappa|lk)\right). \tag{24}$$

To estimate the process noise, as in equations (13)–(17), we have

$$\hat{\omega}(lk,\kappa|lk) = Q(lk,\kappa)\,H_{lk}^T\,P_{zz}(lk|lk-1)^{-1}\left(z(lk) - z(lk-1)\right). \tag{25}$$

Then $P_{xz}(lk,\kappa|lk)$ and $P_{zz}(\kappa|lk)$ are obtained as follows:

$$P_{xz}(lk,\kappa|lk) = \left(P(lk,\kappa|lk) - P_{x\omega}(lk,\kappa|lk)\right)A_{\kappa,lk}^T\,H(\kappa)^T, \qquad P_{zz}(\kappa|lk) = H(\kappa)\,P(\kappa|lk)\,H(\kappa)^T + R(\kappa), \tag{26}$$

where the covariances $P(\kappa|lk)$ and $P_{x\omega}(lk,\kappa|lk)$ are derived as follows:

$$P(\kappa|lk) = A_{\kappa,lk}\left(P(lk|lk) + P_{\omega\omega}(\kappa|lk) - P_{x\omega}(lk,\kappa|lk)^T\right)A_{\kappa,lk}^T - P_{x\omega}(lk,\kappa|lk),$$

$$P_{\omega\omega}(\kappa|lk) = Q(lk,\kappa) - Q(lk,\kappa)\,H_{lk}^T\,P_{zz}(lk|lk-1)^{-1}\,H_{lk}\,Q(lk,\kappa)^T,$$

$$P_{x\omega}(lk,\kappa|lk) = Q(lk,\kappa) - P(lk|lk-1)\,H_{lk}^T\,P_{zz}(lk|lk-1)^{-1}\,H_{lk}\,Q(lk,\kappa)^T. \tag{27}$$

3.3. Summary of the Fusion Estimate Method. The state of an RLFJ manipulator is estimated using the indirect measurements from asynchronous multirate sensors. The sensor fusion estimate can be implemented in a practical application using a switching mechanism in accordance with the sampling time. As shown in Figure 1, the update equations are chosen at the different sampling times. The state estimation is given by

$$\hat{x}_F(k|k) = \begin{cases} F_{k,k-1}\left(\hat{x}^-(k|k)\right) + K_k\left[z^m(k) - h^m_k\left(\hat{x}^-(k|k), 0\right)\right], & t \ne t_{lk}, \\[4pt] \hat{x}(k|k) + P_{xz}(k,k-l|k)\,P_{zz}^{-1}(k-l|k)\left(z^c(k-l) - \hat{z}(k-l|k)\right), & t = t_{lk}. \end{cases} \tag{28}$$
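For orientation, the following Python sketch shows one way the switching rule (28) can be organized in code. It is illustrative only: the helper callables `ekf_encoder_step` and `oosm_terms` are assumed to implement equations (13)–(17) and (21)–(27), respectively, and no claim is made that this matches the authors' implementation.

```python
import numpy as np

def fused_estimate(k, l, x_est, P, u, z_m, z_c_delayed, ekf_encoder_step, oosm_terms):
    """Switching fusion estimate in the spirit of equation (28)."""
    # Every encoder tick: the fast EKF step keeps the estimate real time.
    x_est, P = ekf_encoder_step(x_est, P, u, z_m)

    # At vision ticks: re-estimate with the delayed (l-step-lag) vision measurement.
    if z_c_delayed is not None and k % l == 0:
        P_xz, P_zz, z_hat = oosm_terms(x_est, P)     # quantities from equations (21)-(27)
        G = P_xz @ np.linalg.inv(P_zz)               # gain applied to the delayed innovation
        x_est = x_est + G @ (z_c_delayed - z_hat)    # equation (19)
        P = P - G @ P_xz.T                           # equation (20)
    return x_est, P
```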

Remark 3. The exponential convergence of the sensor fusion estimate can be proved in a way similar to that presented in [19], which gives a more detailed stability analysis.

4. Rigid-Link Flexible-Joint Tracking Control

In this section, the discrete-time terminal sliding mode tracking controller based on the fusion estimation (AMSFE-DTSMC) is presented for rigid-link flexible-joint manipulators whose state is estimated by the sensor fusion method described in the previous section. The controller is designed by using both the position and velocity of the link obtained from the sensor fusion estimate. To design the controller, the model in (1a)–(1d) can be written as

$$q(k+1) = q(k) + T\dot{q}(k), \tag{29a}$$

$$\dot{q}(k+1) = \dot{q}(k) + M^{-1}(q,\theta)\,T u(k) - F\left(q,\bar{q},\dot{q}_m,\bar{\dot{q}}_m,T,\theta\right), \tag{29b}$$

where $F\left(q,\bar{q},\dot{q}_m,\bar{\dot{q}}_m,T,\theta\right) = F_L(q,\bar{q},\dot{q},\theta)\,T + M^{-1}(q,\theta)\left[M_m\left(\dot{q}_m(k+1) - \dot{q}_m(k)\right) + F_m\dot{q}_m(k)\right]$ denotes the lumped term that includes the dynamic parameters of the links and motors.

In order to formulate the tracking control, define the tracking error $\tilde{q}_t(k)$ and the estimation error $\tilde{q}_e(k)$ at time $t_k$ as

$$\tilde{q}_t(k) = q(k) - q_d(k), \qquad \tilde{q}_e(k) = q(k) - \hat{q}(k), \tag{30}$$

where $q_d(k)$ is the desired position and $\hat{q}(k)$ denotes the estimated position.

Define the reference velocities for tracking and estimation:

$$\dot{q}_t(k) = \dot{q}_d(k) + \Lambda_t\left(q_d(k) - \hat{q}(k)\right) = \dot{q}_d(k) - \Lambda_t\left(\tilde{q}_t(k) - \tilde{q}_e(k)\right),$$

$$\dot{q}_e(k) = \dot{\hat{q}}(k) - \Lambda_e\left(q(k) - \hat{q}(k)\right) = \dot{\hat{q}}(k) - \Lambda_e\,\tilde{q}_e(k), \tag{31}$$

where $\dot{\hat{q}}(k)$ denotes the estimate of $\dot{q}(k)$, and $\Lambda_t$, $\Lambda_e$ are constant diagonal matrices.


Define the filtered variable, which includes the estimation error:

$$\tilde{\dot{q}}_t(k) = \dot{q}(k) - \dot{q}_t(k) = \dot{\tilde{q}}_t(k) + \Lambda_t\left(\tilde{q}_t(k) - \tilde{q}_e(k)\right). \tag{32}$$

Consider the discrete terminal sliding surface as follows

$$s(k) = \tilde{\dot{q}}_t(k) + \lambda\,\tilde{q}_t(k)^p, \tag{33}$$

where $\lambda$ is a positive constant diagonal parameter matrix and $p = p_1/p_2$, in which $p_1$ and $p_2$ are positive odd integers satisfying $p_2 > p_1$. Motivated by the reaching law presented by Gao et al. in [20], we use the reaching law for exponential discrete sliding mode control as follows:

$$s(k+1) = (I - Th)\,s(k) - T\varepsilon\,\operatorname{sgn}(s(k)), \tag{34}$$

where $\operatorname{sgn}(\cdot)$ is the signum function, $\varepsilon > 0$, and $h > 0$.

Since the system states cannot be measured directly, the terms that contain the variables $q(k)$, $\dot{q}(k)$, and $\theta$ need to be estimated in the controller design. Assume that the estimation error and the uncertainties are bounded; then we have

$$M(q,\theta) = \hat{M}(q,\theta) + \Delta M(q,\theta), \qquad F\left(q,\bar{q},\dot{q}_m,\bar{\dot{q}}_m,T,\theta\right) = \hat{F}\left(q,\bar{q},\dot{q}_m,\bar{\dot{q}}_m,T,\theta\right) + \Delta F\left(q,\bar{q},\dot{q}_m,\bar{\dot{q}}_m,T,\theta\right), \tag{35}$$

where $\bar{\dot{q}}_m$ denotes $\dot{q}_m(k+1)$; $\hat{M}(q,\theta) = M(\hat{q},\hat{\theta})$ and $\hat{F}\left(q,\bar{q},\dot{q}_m,\bar{\dot{q}}_m,T,\theta\right) = F\left(\hat{q},\hat{\bar{q}},\hat{\dot{q}}_m,\hat{\bar{\dot{q}}}_m,T,\hat{\theta}\right)$ represent the estimates of the dynamic terms; and $\Delta M(q,\theta)$ and $\Delta F\left(q,\bar{q},\dot{q}_m,\bar{\dot{q}}_m,T,\theta\right)$ are bounded variables including the estimation error and the system uncertainties, which satisfy

$$\left\|\Delta M(q,\theta)\right\| \le \delta_M, \qquad \left\|\Delta F\left(q,\bar{q},\dot{q}_m,\bar{\dot{q}}_m,T,\theta\right)\right\| \le \delta_F, \tag{36}$$

where δM and δF are constants

Theorem 1. Consider the rigid-link flexible-joint manipulator system described by equations (29a) and (29b) and the discrete sliding manifold described by equation (33). Using the reaching law in equation (34), a stable control law is designed as

$$u(k) = \frac{\hat{M}(q,\theta)}{T}\Big[\hat{F}\left(q,\bar{q},\dot{q}_m,\bar{\dot{q}}_m,T,\theta\right) + \dot{q}_t(k) - \dot{q}_t(k-1) + \lambda\tilde{q}_t^{\,p}(k) - \lambda\left[T\left(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\right) + \tilde{q}_t(k)\right]^p - T\left(h\,s(k) + \varepsilon(k)\operatorname{sgn}(s(k))\right)\Big], \tag{37}$$

where $\varepsilon$ and $h$ are positive diagonal matrices which satisfy the following inequalities:

$$\varepsilon(k) > \frac{1}{T}\xi_M\left(\Delta\dot{q}_t(k) - \lambda\left[T\left(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\right) + \tilde{q}_t(k)\right]^p - \lambda\tilde{q}_t^{\,p}(k)\right) + \frac{1}{T}\xi_F + h\,s(k), \tag{38}$$

$$T < \frac{2s(k) - \Delta F}{h\,s(k) + \varepsilon(k)}. \tag{39}$$
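As a concrete illustration of how the terminal sliding surface (33), the reaching law (34), and the control law (37) fit together, a simplified single-joint (decoupled, scalar-gain) Python sketch is given below. It treats the estimated dynamics terms as given inputs and is a sketch under those assumptions, not the authors' implementation.

```python
import numpy as np

def frac_pow(x, p):
    """Signed fractional power x^p with p = p1/p2 (p1, p2 odd), valid for negative x."""
    return np.sign(x) * np.abs(x) ** p

def dtsmc_step(q_hat, dq_hat, q_tilde_e, q_d, dq_d, dq_t_prev, M_hat, F_hat, par):
    """One control step sketching equations (30)-(34) and (37) for a decoupled joint."""
    T, lam, Lt, h, eps, p = par["T"], par["lam"], par["Lambda_t"], par["h"], par["eps"], par["p"]

    q_tilde_t = q_hat - q_d                          # tracking error (30), built from the fused estimate
    dq_t = dq_d - Lt * (q_tilde_t - q_tilde_e)       # reference velocity (31)
    dq_tilde_t = dq_hat - dq_t                       # filtered variable (32)
    s = dq_tilde_t + lam * frac_pow(q_tilde_t, p)    # terminal sliding surface (33)

    inner = T * (dq_tilde_t - Lt * q_tilde_t + Lt * q_tilde_e) + q_tilde_t   # bracketed term in (37)
    u = (M_hat / T) * (
        F_hat + dq_t - dq_t_prev + lam * frac_pow(q_tilde_t, p)
        - lam * frac_pow(inner, p)
        - T * (h * s + eps * np.sign(s))             # reaching-law action from (34)
    )
    return u, dq_t, s
```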

Proof. Substituting the control law (37) into the rigid-link flexible-joint system equations (29a) and (29b), the error dynamics are obtained:

$$\tilde{q}_t(k+1) = \tilde{q}_t(k) + T\left(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\right), \tag{40}$$

$$\begin{aligned} \tilde{\dot{q}}_t(k+1) ={}& \tilde{\dot{q}}_t(k) + \lambda\tilde{q}_t(k)^p - \lambda\left[T\left(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\right) + \tilde{q}_t(k)\right]^p \\ & + \left(M^{-1}\hat{M} - I\right)\left(\hat{F} + \Delta\dot{q}_t(k) - \lambda\left[T\left(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\right) + \tilde{q}_t(k)\right]^p - \lambda\tilde{q}_t^{\,p}(k)\right) \\ & + M^{-1}\hat{M}\left[\Delta F - T\left(h\,s(k) + \varepsilon(k)\operatorname{sgn}(s(k))\right)\right]. \end{aligned} \tag{41}$$


For simplicity, define $\overline{M} = M^{-1}\hat{M}$, which is invertible, so that $M^{-1} = \overline{M}\hat{M}^{-1}$. Substituting (40) and (41) into (34) yields

$$\begin{aligned} s(k+1) ={}& s(k) + \overline{M}\left[-T\left(h\,s(k) + \varepsilon(k)\operatorname{sgn}(s(k))\right)\right] + \overline{M}\hat{F} - F \\ & + \left(\overline{M} - I\right)\left(\Delta\dot{q}_t(k) - \lambda\tilde{q}_t^{\,p}(k) - \lambda\left[T\left(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\right) + \tilde{q}_t(k)\right]^p\right). \end{aligned} \tag{42}$$

Stability conditions for discrete-time sliding mode control are given by Sarpturk et al. [21]:

$$s^T(k)\left(s(k+1) - s(k)\right) < 0, \tag{43a}$$

$$s^T(k)\left(s(k+1) + s(k)\right) > 0. \tag{43b}$$

Combining (42) and (43a) we have

$$s^T\Big[-\overline{M}T\left(h\,s(k) + \varepsilon(k)\operatorname{sgn}(s(k))\right) + \overline{M}\hat{F} - F + \left(\overline{M} - I\right)\left(\Delta\dot{q}_t(k) - \lambda\left[T\tilde{\dot{q}}_t(k) - T\Lambda_t\tilde{q}_t(k) + T\Lambda_t\tilde{q}_e(k) + \tilde{q}_t(k)\right]^p - \lambda\tilde{q}_t^{\,p}(k)\right)\Big] < 0. \tag{44}$$

If $s(k) > 0$, then $\operatorname{sgn}(s(k)) = 1$ and $s(k+1) - s(k) < 0$, so the lower bound of the sliding gain can be given as

$$\varepsilon(k) > \frac{1}{T}\left(I - \overline{M}^{-1}\right)\left(-\lambda\left[T\left(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\right) + \tilde{q}_t(k)\right]^p - \lambda\tilde{q}_t^{\,p}(k)\right) + \frac{1}{T}\left(I - \overline{M}^{-1}\right)\Delta\dot{q}_t(k) + \frac{1}{T}\left(\hat{F} - \overline{M}^{-1}F\right) - h\,s(k). \tag{45}$$

Employing $\xi_M$ and $\xi_F$, which satisfy $\left\|I - \overline{M}^{-1}\right\| < \xi_M$ and $\left\|\hat{F} - \overline{M}^{-1}F\right\| < \xi_F$, we can obtain $\varepsilon(k)$ as follows:

$$\varepsilon(k) > \frac{1}{T}\xi_M\left(\Delta\dot{q}_t(k) - \lambda\left[T\left(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\right) + \tilde{q}_t(k)\right]^p - \lambda\tilde{q}_t^{\,p}(k)\right) + \frac{1}{T}\xi_F + h\,s(k). \tag{46}$$

If $s(k) < 0$, then $\operatorname{sgn}(s(k)) = -1$ and $s(k+1) - s(k)$ can be calculated by

$$\begin{aligned} s(k+1) ={}& s(k) + \overline{M}T\left[\varepsilon(k) - h\,s(k)\right] + \overline{M}\hat{F} - F \\ & + \left(\overline{M} - I\right)\left(\Delta\dot{q}_t(k) - \lambda\tilde{q}_t^{\,p}(k) - \lambda\left[T\left(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\right) + \tilde{q}_t(k)\right]^p\right). \end{aligned} \tag{47}$$

Employing $\varepsilon(k)$ given by equation (46), we have

$$\begin{aligned} s(k+1) - s(k) >{}& \overline{M}Th\left[s(k) - s(k)\right] + \overline{M}\xi_F - \overline{M}\hat{F} + F \\ & + \overline{M}\xi_M\left(\Delta\dot{q}_t(k) - \lambda\left[T\tilde{\dot{q}}_t(k) - T\Lambda_t\tilde{q}_t(k) + T\Lambda_t\tilde{q}_e(k) + \tilde{q}_t(k)\right]^p - \lambda\tilde{q}_t^{\,p}(k)\right) \\ & - \left(\overline{M} - I\right)\left(\Delta\dot{q}_t(k) - \lambda\left[T\tilde{\dot{q}}_t(k) - T\Lambda_t\tilde{q}_t(k) + T\Lambda_t\tilde{q}_e(k) + \tilde{q}_t(k)\right]^p - \lambda\tilde{q}_t^{\,p}(k)\right), \end{aligned} \tag{48}$$


where $s(k+1) - s(k) > 0$ is obtained by adopting $\varepsilon(k)$ according to equation (46). Therefore, equation (46) satisfies the Sarpturk stability condition (43a).

Assuming $\overline{M} = I$, the Sarpturk condition (43b) is examined by combining equations (42) and (43b):

$$s(k+1) + s(k) = 2s(k) - \Delta F - T\left[h\,s(k) + \varepsilon(k)\operatorname{sgn}(s(k))\right]. \tag{49}$$

If $s(k) < 0$ and $s(k+1) + s(k) < 0$, we have

$$2s(k) - T\left(h\,s(k) - \varepsilon(k)\right) - \Delta F < 0. \tag{50}$$

Else, if $s(k) > 0$ and $s(k+1) + s(k) > 0$, we have

$$2s(k) - T\left(h\,s(k) + \varepsilon(k)\right) - \Delta F > 0. \tag{51}$$

The sampling time $T$ that satisfies the following inequality can guarantee the condition in (43b):

$$0 < T < \frac{2s(k) - \Delta F}{h\,s(k) + \varepsilon(k)}. \tag{52}$$

From (52), we know that the stability conditions of the DTSMC given by (43a) and (43b) are guaranteed by the chosen parameters $\varepsilon$ and $h$ satisfying (38) and (39). Theorem 1 is proved.

5. Simulation and Experimental Studies

5.1. Simulation Study. The results obtained from the simulation of the proposed control scheme on the two-link RLFJ manipulator shown in Figure 2 are presented in this section. In the simulation, the aim is to make the RLFJ manipulator track the desired trajectories $q_{1d} = \sin(\pi t)$ and $q_{2d} = \cos(\pi t)$ from the initial position $[q_{1i}, q_{2i}] = [0.5, 0.5]$. The robotic dynamic parameters are given by

$$M(q) = \begin{bmatrix} M_{11} & M_{12} \\ M_{21} & m_2 l_2^2 \end{bmatrix}, \qquad g(q) = \begin{bmatrix} g_1 \\ -m_2 g l_2 \sin\left(q_1 + q_2\right) \end{bmatrix}, \qquad C = \begin{bmatrix} -m_2 l_1 l_2 \sin\left(q_2\right)\left(2\dot{q}_1\dot{q}_2 + \dot{q}_2^2\right) \\ -m_2 l_1 l_2 \sin\left(q_2\right)\dot{q}_1\dot{q}_2 \end{bmatrix},$$

$$M_{11} = \left(m_1 + m_2\right)l_1^2 + m_2 l_2^2 + 2 m_2 l_1 l_2 \cos\left(q_2\right), \qquad M_{12} = M_{21} = m_2 l_2^2 + m_2 l_1 l_2 \cos\left(q_2\right),$$

$$g_1 = -\left(m_1 + m_2\right) g l_1 \sin\left(q_1\right) - m_2 g l_2 \sin\left(q_1 + q_2\right), \tag{53}$$

where $m_i$ and $l_i$ ($i = 1, 2$), the mass and length of the $i$th link, are given in Table 1; $k_m = \operatorname{diag}\{10, 10\}\ \mathrm{N\,m/rad}$, $M_m = \operatorname{diag}\{0.5, 0.5\}\ \mathrm{kg\cdot m^2}$, and $F_m = \operatorname{diag}\{4, 0.5\}\ \mathrm{N\,m\cdot s/rad}$.
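For reference, the simulated link-side matrices of equation (53) can be coded directly. The short Python sketch below (illustrative only; the gravitational acceleration value is an assumption) evaluates $M(q)$, the Coriolis/centrifugal vector $C$, and the gravity vector $g(q)$ for the parameters of Table 1.

```python
import numpy as np

m1, m2 = 2.5, 3.0     # link masses from Table 1 (kg)
l1, l2 = 0.35, 0.20   # link lengths from Table 1 (m)
g0 = 9.81             # gravitational acceleration (assumed, m/s^2)

def link_dynamics(q, dq):
    """Evaluate M(q), C(q, dq), and g(q) of the two-link model in equation (53)."""
    q1, q2 = q
    dq1, dq2 = dq
    M11 = (m1 + m2) * l1**2 + m2 * l2**2 + 2 * m2 * l1 * l2 * np.cos(q2)
    M12 = m2 * l2**2 + m2 * l1 * l2 * np.cos(q2)
    M = np.array([[M11, M12],
                  [M12, m2 * l2**2]])
    C = np.array([-m2 * l1 * l2 * np.sin(q2) * (2 * dq1 * dq2 + dq2**2),
                  -m2 * l1 * l2 * np.sin(q2) * dq1 * dq2])
    g_vec = np.array([-(m1 + m2) * g0 * l1 * np.sin(q1) - m2 * g0 * l2 * np.sin(q1 + q2),
                      -m2 * g0 * l2 * np.sin(q1 + q2)])
    return M, C, g_vec
```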

In order to demonstrate the influence of the estimation on the tracking performance, a DTSMC controller without estimation is also simulated. The DTSMC control law is given by

$$u(k) = \frac{\hat{M}}{T}\left[\hat{F} + \dot{q}_d(k) - \dot{q}_d(k-1) + \lambda\tilde{q}_t^{\,p}(k) - \lambda\left[T\dot{\tilde{q}}_t(k) + \tilde{q}_t(k)\right]^p\right] - h\,s(k) - \varepsilon(k)\operatorname{sgn}(s(k)), \tag{54}$$

with a sliding surface $s(k) = \dot{\tilde{q}}_t(k) + \lambda\tilde{q}_t(k)^p$, which is different from the proposed sliding mode surface.

Parameters of the proposed controller and the DTSMC are selected as $T = 0.001\,\mathrm{s}$, $h = \operatorname{diag}\{2, 2\}$, $\lambda = \operatorname{diag}\{1.5, 1.5\}$, $\Lambda_t = \operatorname{diag}\{5, 5\}$, and $\varepsilon = \operatorname{diag}\{25, 25\}$; $\theta(k) = 0.15$ is chosen as a constant model parameter whose initial estimate is set to $0$; $p_1 = 3$, $p_2 = 5$, $\xi_M = \operatorname{diag}\{3, 3\}$, and $\xi_F = \operatorname{diag}\{1, 1\}$. In the simulation, as shown in Figure 3, we assume that the end effector is observed by a fixed camera whose delayed measurements are used directly to calculate the joint position. The delay time of the slow measurement is 0.125 s. The process and measurement noise covariances are chosen as $Q = 0.01I$ and $R = 0.1I$. The initial joint positions (joint 1, joint 2) for tracking control are (0.5 rad, 0.5 rad), and for the fusion estimator they are (0.9 rad, 0.9 rad). The initial velocities for tracking control are (0, 0), and for the estimation they are (0.25 rad/s, 0).

The estimation errors of position and velocity are plotted in Figure 4, and the estimation error of the parameter $\theta$ is shown in Figure 5. The position tracking of the proposed method for the two links is shown in Figures 6 and 7, where the comparative results of the SMC without the fusion estimator are also plotted. The simulation results clearly indicate that the proposed approach guarantees the convergence of the tracking errors and achieves better tracking accuracy.

5.2. Experimental Study. To validate the applicability of the proposed control scheme, a single-link flexible-joint manipulator with a fixed camera is employed as the experimental plant, which is shown in Figure 8.

Figure 2: The schematic diagram of the robotic manipulator with kinematic parameters.

Table 1: The manipulator parameters.

Link i    m_i (kg)    l_i (cm)    q_i
1         2.5         35          q_1 (0° ~ 180°)
2         3           20          q_2 (−160° ~ 160°)


The aim is to make the end effector move along the desired trajectory $x_d = l_1\cos(q_d)$, $y_d = l_1\sin(q_d)$, with $q_d = (0.4 - t)/7.5$, and the initial position of the joint is $q_i = 0.35$ rad. The parameters of the single-link robot are measured offline: $l_1 = 0.6\,\mathrm{m}$ and $m_1 = 10\,\mathrm{kg}$; the measurement errors are $\Delta l_1 = 0.05\,\mathrm{m}$ and $\Delta m_1 = 0.1\,\mathrm{kg}$.

To observe the state of the end effector, the calibrated camera is fixed perpendicular to the robot motion plane. The coordinate relation of the Cartesian space and the image space is shown in Figure 9. Camera measurements are obtained from the image sequences shown in Figure 10. We define the position in the image coordinates as $y(k) = [x_m(k), y_m(k)]$, given in Figure 11, and the position in the Cartesian coordinates as $r(k) = [x_c(k), y_c(k)]$.

Figure 3: Asynchronous multirate sensor measurements (delayed and noisy measurements of joint 1 and joint 2).
Figure 4: The estimation errors of $q_1$, $\dot{q}_1$, $q_2$, $\dot{q}_2$.
Figure 5: The estimation error of $\theta$.
Figure 6: Comparative control performance of joint 1 between the proposed method and SMC.
Figure 7: Comparative control performance of joint 2 between the proposed method and SMC.


By using the pinhole camera model, we calculate the mapping $I_k(\cdot)$ from the joint space to the image space in equation (8) as follows:

$$\frac{f}{H_c - f}\begin{bmatrix} x_m(k) \\ y_m(k) \end{bmatrix} = \begin{bmatrix} x_c(k) \\ y_c(k) \end{bmatrix} = \begin{bmatrix} l_1\cos q(k) \\ l_1\sin q(k) \end{bmatrix}, \tag{55}$$

where $H_c$ is the perpendicular distance between the camera and the robot motion plane and $f$ denotes the focal length of the camera. In the experiment, the camera parameters are $f = 0.035\,\mathrm{m}$, $640 \times 512$ pixels, and $H_c = 0.8\,\mathrm{m}$. According to the mapping $I_k(\cdot)$, the joint position can be obtained by calculating the inverse circular trigonometric functions $\arccos\left[x_m(k)f/\left(l_1\left(H_c - f\right)\right)\right]$ or $\arcsin\left[y_m(k)f/\left(l_1\left(H_c - f\right)\right)\right]$. The delayed vision measurements $[x_m(k), y_m(k)]$ are converted to the joint measurements $\arccos\left[x_m(k)f/\left(l_1\left(H_c - f\right)\right)\right]$, which are shown in Figure 12, where the encoder measurements are also plotted.
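The conversion from a delayed image measurement to a joint-angle measurement is simple enough to sketch directly. The Python snippet below (illustrative only, following equation (55) and the arccos inverse exactly as printed above, with the quoted experimental parameters) implements the forward mapping and its inverse.

```python
import numpy as np

f_len = 0.035   # focal length (m), as given in the experiment
H_c = 0.8       # camera-to-motion-plane distance (m)
l_1 = 0.6       # link length (m)

def joint_to_image(q):
    """Forward mapping of equation (55): (f / (H_c - f)) * [x_m, y_m] = l_1 * [cos q, sin q]."""
    scale = (H_c - f_len) / f_len
    return np.array([scale * l_1 * np.cos(q), scale * l_1 * np.sin(q)])

def image_to_joint(x_m, y_m):
    """Inverse mapping: q = arccos[x_m * f / (l_1 * (H_c - f))]; arcsin on y_m can resolve the sign."""
    c = np.clip(x_m * f_len / (l_1 * (H_c - f_len)), -1.0, 1.0)
    return np.arccos(c)

# round-trip example at the initial joint angle of 0.35 rad
x_m, y_m = joint_to_image(0.35)
print(image_to_joint(x_m, y_m))
```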

Figure 8: The experiment setup (servo motor, servo drive, PMAC-2A controller, gear mechanism, encoder, end effector, camera, image processing, and fusion estimate).
Figure 9: The coordinate relation of Cartesian space and image space.
Figure 10: The image sequences of 10 frames.


The position estimate of the joint is calculated by using the proposed fusion estimate method and is shown as the red solid line in Figure 12. To show the performance of the fusion estimate method in detail, Figure 13 shows the error of the joint position estimation.

To validate the performance of the fusion-estimate-based DTSMC, comparative experiments are implemented. The proposed controller and a DTSMC without the estimator are employed in this test. Parameters are selected as $T = 0.01\,\mathrm{s}$, $h = 3$, $\lambda = 2$, $\Lambda_t = 4$, $\varepsilon = 20$, $\theta(k) = 0.05$, $p_1 = 3$, $p_2 = 5$, $\xi_M = 0.2$, and $\xi_F = 2$. According to the comparative position tracking performances given in Figure 14, it is obvious that the proposed controller provides superior behavior.

6. Conclusion

A novel RLFJ manipulator tracking controller, AMSFE-DTSMC, is proposed in this paper based on DTSMC coupled with asynchronous multirate sensor fusion. The states of the manipulator are estimated by an EKF-based sensor fusion algorithm which combines asynchronous multirate measurements from visual and non-vision-based sensors. In contrast to the non-vision measurements, the visual measurements are treated as periodic out-of-sequence measurements which are used to re-estimate the state. With the state estimation, the DTSMC is designed by using a novel sliding surface that includes both the tracking error and the estimation error. By using the Sarpturk inequalities, the boundedness of the controlled variables is proved. The effectiveness of the proposed approach is shown in simulation and experimental studies.

Data Availability

No data were used to support this study

Figure 11: The trajectories of the end effector in image space (position measurements in image space).
Figure 12: The encoder and visual measurements, together with the position estimation of the joint.
Figure 13: The position estimation error of the joint.
Figure 14: Comparative position tracking errors of the end effector along the X and Y axes for the proposed method and the DTSMC.


Disclosure

An earlier version of the multirate sensor fusion-based DTSMC was presented at the 10th International Conference on Control and Automation (ICCA) [12]; however, a completely new controller is designed in this paper by using a new sliding surface that includes both the tracking error and the estimation error, and the effectiveness of the proposed approach is demonstrated in both simulation and experimental studies.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] S. Nicosia and P. Tomei, "A tracking controller for flexible joint robots using only link position feedback," IEEE Transactions on Automatic Control, vol. 40, no. 5, pp. 885–890, 1995.
[2] S. Nicosia, P. Tomei, and A. Tornambe, "A nonlinear observer for elastic robots," IEEE Journal on Robotics and Automation, vol. 4, no. 1, pp. 45–52, 1988.
[3] W. E. Dixon, E. Zergeroglu, D. M. Dawson, and M. W. Hannan, "Global adaptive partial state feedback tracking control of rigid-link flexible-joint robots," Robotica, vol. 18, no. 3, pp. 325–336, 2000.
[4] F. Abdollahi, H. A. Talebi, and R. V. Patel, "A stable neural network-based observer with application to flexible-joint manipulators," IEEE Transactions on Neural Networks, vol. 17, no. 1, pp. 118–129, 2006.
[5] J. Na, B. Jing, Y. Huang, G. Gao, and C. Zhang, "Unknown system dynamics estimator for motion control of nonlinear robotic systems," IEEE Transactions on Industrial Electronics, vol. 67, no. 5, pp. 3850–3859, 2020.
[6] S. Jean, M. Tomizuka, and T. Katou, "Kinematic Kalman filter (KKF) for robot end-effector sensing," Journal of Dynamic Systems, Measurement, and Control, vol. 131, no. 2, pp. 1–8, 2009.
[7] C. A. Lightcap and S. A. Banks, "An extended Kalman filter for real-time estimation and control of a rigid-link flexible-joint manipulator," IEEE Transactions on Control Systems Technology, vol. 18, no. 1, pp. 91–103, 2010.
[8] J. G. García, A. Robertsson, J. G. Ortega, and R. Johansson, "Sensor fusion for compliant robot motion control," IEEE Transactions on Robotics, vol. 24, no. 2, pp. 430–441, 2008.
[9] R. Jassemi-Zargani and D. Necsulescu, "Extended Kalman filter-based sensor fusion for operational space control of a robot arm," IEEE Transactions on Instrumentation and Measurement, vol. 51, no. 6, pp. 1279–1282, 2002.
[10] S. Wang, L. Tao, Q. Chen, J. Na, and X. Ren, "USDE-based sliding mode control for servo mechanisms with unknown system dynamics," IEEE/ASME Transactions on Mechatronics, vol. 25, no. 2, pp. 1056–1066, 2020.
[11] K. Jezernik, B. Curk, and J. Harnik, "Observer-based sliding mode control of a robotic manipulator," Robotica, vol. 12, no. 5, pp. 443–448, 1994.
[12] G. Xue, X. Ren, K. Xing, and Q. Chen, "Discrete-time sliding mode control coupled with asynchronous sensor fusion for rigid-link flexible-joint manipulators," in Proceedings of the 2013 10th IEEE International Conference on Control and Automation (ICCA), pp. 238–243, Hangzhou, China, June 2013.
[13] Y. Gao, S. S. Majidabad, and H. T. Shandiz, "Discrete-time based sliding-mode control of robot manipulators," International Journal of Intelligent Computing and Cybernetics, vol. 5, no. 3, pp. 340–358, 2012.
[14] M. L. Corradini, V. Fossi, A. Giantomassi, G. Ippoliti, S. Orlando, and G. Orlando, "Discrete time sliding mode control of robotic manipulators: development and experimental validation," Control Engineering Practice, vol. 20, no. 8, pp. 816–822, 2012.
[15] L. Wang, T. Chai, and L. Zhai, "Neural-network-based terminal sliding-mode control of robotic manipulators including actuator dynamics," IEEE Transactions on Industrial Electronics, vol. 56, no. 9, pp. 3296–3304, 2009.
[16] Y. Feng, X. Yu, and Z. Man, "Non-singular terminal sliding mode control of rigid manipulators," Automatica, vol. 38, no. 12, pp. 2159–2167, 2002.
[17] M. L. Corradini and G. Orlando, "Linear unstable plants with saturating actuators: robust stabilization by a time varying sliding surface," Automatica, vol. 43, no. 1, pp. 88–94, 2007.
[18] Y. Bar-Shalom, H. Chen, and M. Mallick, "One-step solution for the multistep out-of-sequence-measurement problem in tracking," IEEE Transactions on Aerospace and Electronic Systems, vol. 40, no. 1, pp. 27–37, 2004.
[19] K. Reif, F. Sonnemann, and R. Unbehauen, "An EKF-based nonlinear observer with a prescribed degree of stability," Automatica, vol. 34, no. 9, pp. 1119–1123, 1998.
[20] W. Gao, Y. Wang, and A. Homaifa, "Discrete-time variable structure control systems," IEEE Transactions on Industrial Electronics, vol. 42, no. 2, pp. 117–122, 1995.
[21] S. Z. Sarpturk, Y. Istefanopulos, and O. Kaynak, "On the stability of discrete-time sliding mode control systems," IEEE Transactions on Automatic Control, vol. 32, no. 10, pp. 930–932, 1987.

12 Complexity

Page 2: Discrete-TimeSlidingModeControlCoupledwithAsynchronous

acceleration velocity of joints) need not always be measureddirectly [11] )e terminal SMC (TSMC) is used in rigidmanipulator control (eg robust TSMC and finite-timecontrol) since it has some superior properties compared withSMC such as better tracking precision and fast error con-vergence [12ndash14] In particular the singularity problem ofthe TSMC was addressed in [15 16] However most of thesepapers use the continuous-time dynamic model of themanipulator Indeed discrete-time models exist widely inthe real digital control system It has been proved that thediscrete-time SMC has technological advances in digitalelectronics computer control and robotic system Corra-dinia and Orlando [17] presented a robust DSMC coupledwith an uncertainty estimator designed for planar roboticmanipulators However the flexibilities of joints are notinvolved in those controller designs

To remedy such limitations the paper proposes a novelcontroller AMSFE-DTSMC which is implemented basedon DTSMC coupled with an asynchronous multirate sensorfusion estimator )e robotic multirate sensor unit containsvision and non-vision-based sensors whose sampling rateand processing time are different In the proposed schemethe delayed slow sampling visionmeasurement is treated as akind of periodic ldquoout-of-sequencerdquo measurement (OOSM)[18] which is used to update the non-vision-based stateestimation in an EKF-based asynchronous multirate sensorfusion algorithm Using the position and velocity estimationfrom sensor fusion estimator the DTSMC is designed byusing a novel sliding surface which is implemented byconsidering estimation error and tracking error )e maincontributions of this work are summarized as follows

(i) We propose a novel tracking control schemeAMSFE-DTSMC which is based on the DTSMC

coupled with the sensor fusion estimator for a RLFJmanipulator)e sliding surface of AMSFE-DTSMCis designed by utilzing estimation error and trackingcontrol error

(ii) We construct an asynchronous multirate measure-ment model for robotic sensors and design a sensorfusion to fuse such asynchronous multirate data forrobotic state estimation

)is paper is organized as follows Section 2 gives theproblem formulation In Section 3 the multirate sensor datafusion algorithm is presented Section 4 designs a novelDTSMC for tracking control )e simulation and experi-mental studies are implemented in Section 5)e paper endswith conclusion about the proposed approach

2 Problem Formulation

In this paper a robotic manipulator system is given with thesensor unit including joint motor encoders and camerasfixed in the workspace)e tracking control scheme for RLFJmanipulators can be developed by using the robotic stateestimate via multisensor fusion )e state of the robot isestimated by these sensors directly and indirectly howeverthere are some limitations for a single sensor to obtainprecise information To fuse asynchronous multirate datafrom different sensors the dynamic and sensor models areformulated in this section

21 Discrete Rigid-Link Flexible-Joint Robot Model )ediscrete dynamic model of an nminus link RLFJ manipulator canbe obtained by the minimization of the action functionalsuggested by Nicosia [2] as follows

q(k + 1) q(k) + T _q(k) (1a)

_q(k + 1) minus _q(k) Mminus 1

(q θ)TKsδ(k) minus FL(q q _q θ)T (1b)

Mm _qm(k + 1) minus _qm(k)1113858 1113859 + Fm _qm(k)T Tu(k) minus Ksδ(k)T (1c)

δ(k) qm(k) minus q(k) (1d)

where q(k) _q(k) and qm(k) _qm(k) denote the positionvelocity of the link and motor angles at k time respectivelyq(k) q(k) + T _q(k) T is the sampling interval M(middot) is theinvertible inertia matrix which satisfies ξmin le Mle ξmaxFL(q q _q θ) minus Mminus 1(q θ)[M(q θ) _q minus M(q θ) _q minus Tf(q

_q) θ] f(middot) represents centrifugal Coriolis and gravitationalforces Ks Mm and Fm are constant diagonal positive-definite matrices representing joint stiffness motor inertiamotor viscous friction respectively the joint deflection δ(k)

is defined as the difference between the motor and linkposition and u(k) denotes the motor torque the unknownor varying dynamic parameter in the robotic model is

defined as θ(k) which satisfy θ(k + 1) minus θ(k) Twθ(k) thedynamics uncertainties of the links and motors are modeledwith random variables wl(k) and wm(k)

Define a state vector

x(k) q(k)T _q(k)T qm(k)T _qm(k)T θ(k)T1113960 1113961T (2)

)e dynamics in equations (1a)ndash(1d) can be transformedto a state space representation

x(k + 1) Fk+1k(x(k) u(k)) + Γkω(k + 1 k) (3)

where

2 Complexity

Fk+1k(x(k) u(k))

q(k) + T _q(k)

_q(k) minus FL(k)T + Mminus 1

(q) Ksδ(k)T( 1113857

qm(k) + T _qm(k)

Mminus 1m _qm(k) minus Fm _qm(k)T minus Ksδ(k)T + Tu(k)1113858 1113859

θ(k)

⎡⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎣

⎤⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎦

Γk

0 0 0

M(q θ)minus 1 0 0

0 0 0

0 Mminus 1m 0

0 0 IT

⎡⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎣

⎤⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎦

ω(k + 1 k) wl(k)T wm(k)

T wθ(k)

T1113960 1113961

T

(4)

22MeasurementModel )e observation vector is given by

h(x(k)) qm(k)T

_qm(k)T

y(k)T

1113872 1113873T (5)

where qm(k) _qm(k)T are the position and velocity of motorangles and y(k) represents the position of the end effector inimage space observed by a fixed camera

By using the standard pinhole camera model themapping from the Cartesian space to image space is given by

y(k) Im(r(k)) (6)

where r(k) denotes the position of the end effector in theCartesian space and Im(middot) is a transformation function fromtask space to image space From the forward kinematics theposition relationship between robotic joints and end effectoris described by a transform relationship as follows

r(k) Fk(q(k)) (7)

where r(k) is the position of the end effector at kth time andFk(middot) is a transformation function from the joint space totask space

From the equations (6) and (7) )e joint position can beobserved by a fixed camera

y(k) Ik(q(k)) (8)

where Ik(middot) Im(Fk(middot)) is the mapping from the jointspace to image space With the random noise ](k) v

T11113960

vT2 (k)v

T3 (k)(k)] the measurement equation is given by

zi(k) h

ik(x(k)) + ]i

(k) i c m (9)

where zi(k) is the measurement of the state from differentsensor i c denotes the camera measurement and i m

represents the measurement from motors We assume thatthe process noise ω(k) and measurement noise ](k) aresampled from the independent and identically distributedwhite Gaussian noise which satisfies following equations attime tk

E ω(k + 1 k) 0

E ](k) 0

E ](j)ωT(k + 1 k)1113966 1113967 0

E ω(j + 1 j)ωTN(k + 1 k)1113966 1113967 Q(k)δjk

E ](j)]T(k)1113966 1113967 R(k)δjk

⎧⎪⎪⎪⎪⎪⎪⎪⎪⎪⎨

⎪⎪⎪⎪⎪⎪⎪⎪⎪⎩

(10)

Since the measurement of the vision sensor at time tk isobtained from the visual image taken at time steptκ tk minus td td denotes the delay time )e relation of dif-ferent sensorsrsquo sampling rates is given by

sm lsc (11)

where sm and sc denote the sampling rate of motor encodersand visual sensors respectively and l is a positive integer

We show the sampling rate difference between visionand non-vision measurements in Figure 1 where l step lagsof vision measurements are described According to thecharacteristics of visual sensor we treat the vision mea-surements as the periodic l-step lag out-of-sequencemeasurements

Remark 1 )e velocity of joints (orand end effector) alsocan be observed by vision sensors _y(k) Jik(q(k)) _q(k)with the Jacobian matrix Jik(q) Ji(r)Jk(q) mapping fromthe joint space to the image space Ji(r) is the image Jacobianmatrix and Jk(q) denotes the Jacobian matrix from jointspace to task space However the measurement of velocity isalways impaired by the noisy image data with slow samplingrate and the dynamic uncertainties in the RLFJ manipulatorsystem )erefore the velocity measurement in the imagespace is not utilized in the measurement model

Remark 2 Assume that the camera can cover the entireworkspace of the robot With the prior knowledge of motionplanning of the robot it is basically assumed that there is anone-to-one mapping from the image space to joint space inthe real robotic system

Complexity 3

3 Asynchronous Multirate SensorFusion Estimation

According to the system model described in Section 2 therobotic link position can be estimated by using the multi-sensor system For asynchronous multirate sensors thesensor fusion method is designed to use the late measure-ments for updating the current estimated state to get a moreaccurate estimation in two steps

Step 1 when the vision measurement is unavailable therobotic state is estimated using the non-vision-basedsensors which keep the real-time estimate by recoveringthe missing information between two vision samplesStep 2 as the delayed vision measurement arrives atevery vision sample at time t tlk l 1 2 the stateshould be re-estimated to cope with the limitations ofother sensors in the absolute position measurement

31 Estimation by Using Non-Vision Sensor MeasurementsFrom Figure 1 before the (k minus 1)th vision frame is availablewe have the estimation of the state at time tlk using the non-vision sensor measurements

1113954x(lk|lk) E x(lk)|Zlkm1113966 1113967

P(lk|lk) cov x(lk)|Zlkm1113966 1113967

(12)

where Zmlk zm(1) zm(2) zm(lk) represents all en-

coder motor measurements up to time tlk Using the ex-tended Kalman filter (EKF) the estimation at time tlk via themotor encoder measurement is given as follows

1113954xminus

(lk|lk) Flklkminus 1(1113954x(lk minus 1|lk minus 1) u(lk)) (13)

P(lk|lk minus 1) AlkP(lk minus 1|lk minus 1)ATlk + WlkQ(lk minus 1)W

Tlk

(14)

Klk P(lk|lk minus 1)HTlk Hlk times P(lk|lk minus 1)H

Tlk + R(lk)1113960 1113961

minus 1

(15)

1113954x(lk|lk) 1113954xminus

(lk|lk) + Klk zm

(lk) minus hmlk 1113954x

minus(lk|lk) 0( )( 1113857

(16)

P(lk|lk) I minus KlkHlk( 1113857P(lk|lk minus 1) (17)

where Alk Wlk Hlk and Vlk are Jacobian matrices and Klk isthe correction gain vector According to above equationsthe state and covariance estimates are updated with themeasurement with a fast sampling rate

32 Re-Estimation via Vision Sensor MeasurementsSuppose that at the time tlk the vision measurement at timetκ tklminus l is obtained A new estimation 1113954x(lk|κ+) is calculatedusing the information about the delayed (k minus 1)th visionmeasurement zc(κ) 1113954x(lk|κ+) is defined withZlk Zm

lk zc(κ)1113864 1113865

1113954x lk|κ+( 1113857 E x(lk)|Zlk1113864 1113865

P lk|κ+( 1113857 cov x(lk)|Zlk1113864 1113865

(18)

)e delayed measurement zc(κ) observed by visionmeasurement will be used to correct the accumulated es-timation errors which are caused by sensors with fastsampling rate )e equations for updating the estimation1113954x(lk|lk) with the delayed vision measurement zc(κ) aregiven by

1113954x lk|κ+( 1113857 1113954x(lk|lk) + Pxz(lk κ|lk) times Pzz(κ|lk)

minus 1[z(κ) minus 1113954z(κ|lk)]

(19)

P lk|κ+( 1113857 P(lk|lk) minus Pxz(lk κ|lk) times Pzz(κ|lk)

minus 1Pxz(lk κ|lk)

T

(20)

where Pxz(lk κ|lk) represents cross covariance between1113954x(lk|lk) and 1113954z(κ|lk) Pzz(κ|lk) is the covariance of 1113954z(κ|lk)

which denotes the estimation of measurement at time tκ and

Pxz(lk κ|lk) E 1113957x(lk)1113957zc(κ)

T1113966 1113967

Pxz(lk κ|lk) E 1113957zc(κ)1113957z

c(κ)

T1113966 1113967

1113954z(κ|lk) E z(κ)|Zlk1113864 1113865

(21)

with 1113957x(lk) x(lk) minus 1113954x(lk|lk) 1113957zc(κ) zc(κ) minus 1113954z(κ|lk)]TUsing the EKF Pxz(lk κ|lk) Pxz(lk κ|lk) and 1113954z(κ|lk)

are obtained by assuming that the function Fkκ(middot) is in-vertible Define Fminus 1

lkκ(middot) Fκlk(middot) which denotes the back-ward transition function to estimate the state back from tlk totκ Since the previous state x(κ) is not affected by presentinput signal u(lk) we give the state relationship between tκand tlk by

x(κ) Fκlk(x(lk) minus ω(lk κ)) (22)

where the process noise ω(lk κ) and covariance Q(kl κ) arecalculated by

lkth encoder measurement

kth visionmeasurement

(k minus 1) th visionmeasurement

(k minus 2) th visionmeasurement

Nonminusvision measurementsVision measurementsSystem states

lowast

Time tlkminusl tlkminus1 tlk tlk+1 tlk+l

Figure 1)e schematic diagram of the relation between vision andnon-vision measurements

4 Complexity

ω(lk κ) asymp Γlkω(lk lk minus 1) + Alklkminus 1Γlkminus 1ω(lk minus 1 lk minus 2)

+ middot middot middot + Alklkminus 1Alkminus 1lkminus 2 times middot middot middot times Alkminus l+2lkminus l+1Γlkminus l+1ω(lk minus l + 1 lk minus l)

Q(kl κ) E 1113957ω(lk κ)1113957ω(lk κ)T

1113966 1113967

1113957ω(lk κ) ω(lk κ) minus 1113954ω(lk κ)

(23)

)e estimation in equation (19) can be determined by

1113954z(κ|lk) Hmlk(κ)1113954x(κ|lk)

1113954x(κ|lk) Fκlk(1113954x(lk|lk) minus 1113954ω(lk κ|lk))(24)

To estimate the process noise as in equations (13)ndash(17)we have

1113954ω(lk κ|lk) Q(lk κ)HTlkPzz(lk|lk minus 1)

minus 1times(z(lk) minus z(lk minus 1))

(25)

)en Pxz(lk κ|lk) and Pzz(κ|lk) are obtained as follows

$P_{xz}(lk,\kappa|lk) = \big(P(lk,\kappa|lk) - P_{x\omega}(lk,\kappa|lk)\big)\,A^{T}_{\kappa,lk}\,H(\kappa)^{T}$
$P_{zz}(\kappa|lk) = H(\kappa)\,P(\kappa|lk)\,H(\kappa)^{T} + R(\kappa)$  (26)

where the covariances $P(\kappa|lk)$ and $P_{x\omega}(lk,\kappa|lk)$ are derived as follows:

$P(\kappa|lk) = A_{\kappa,lk}\big(P(lk|lk) + P_{\omega\omega}(\kappa|lk) - P_{x\omega}(lk,\kappa|lk)^{T}\big)A^{T}_{\kappa,lk} - P_{x\omega}(lk,\kappa|lk)$
$P_{\omega\omega}(\kappa|lk) = Q(lk,\kappa) - Q(lk,\kappa)\,H^{T}_{lk}\,P_{zz}(lk|lk-1)^{-1}\,H_{lk}\,Q(lk,\kappa)^{T}$
$P_{x\omega}(lk,\kappa|lk) = Q(lk,\kappa) - P(lk|lk-1)\,H^{T}_{lk}\,P_{zz}(lk|lk-1)^{-1}\,H_{lk}\,Q(lk,\kappa)^{T}$  (27)
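As a minimal sketch, the out-of-sequence correction (19)-(20) reduces to a single gain computation once $P_{xz}$, $P_{zz}$, and $\hat{z}(\kappa|lk)$ have been built from (24)–(27); the helper names below are illustrative, not from the paper:

import numpy as np

def delayed_vision_update(x_lk, P_lk, z_c_kappa, z_hat_kappa, Pxz, Pzz):
    # (19) correct the current estimate with the delayed vision innovation
    gain = Pxz @ np.linalg.inv(Pzz)
    x_new = x_lk + gain @ (z_c_kappa - z_hat_kappa)
    # (20) corresponding covariance reduction
    P_new = P_lk - gain @ Pxz.T
    return x_new, P_new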

3.3. Summary of the Fusion Estimate Method. The state of a RLFJ manipulator is estimated using the indirect measurements from asynchronous multirate sensors. The sensor fusion estimate can be implemented in a practical application using a switching mechanism in accordance with the sampling time. As shown in Figure 1, the update equations are chosen at the different sampling times. The state estimation can be given by

$\hat{x}_{F}(k|k) = \begin{cases} F_{k,k-1}\big(\hat{x}^{-}(k|k)\big) + K_{k}\big[z^{m}(k) - h^{m}_{k}(\hat{x}^{-}(k|k), 0)\big], & t \ne t_{lk} \\ \hat{x}(k|k) + P_{xz}(k,k-l|k)\,P^{-1}_{zz}(k-l|k)\big(z^{c}(k-l) - \hat{z}(k-l|k)\big), & t = t_{lk} \end{cases}$  (28)

Remark 3. The exponential convergence of the sensor fusion estimate can be proved in a way similar to that presented in [19], which gives a more detailed conclusion on the stability.
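A possible realization of the switching mechanism in (28) is sketched below, assuming the two update routines above are wrapped as callables; the vision branch fires every $l$-th sample when a (delayed) frame is available:

def fusion_estimate_step(k, l, x_est, P, ekf_step, vision_step, delayed_frame=None):
    # t != t_lk branch of (28): the encoder-rate EKF runs at every sample
    x_est, P = ekf_step(x_est, P)
    # t == t_lk branch of (28): re-correct with the delayed vision frame
    if k % l == 0 and delayed_frame is not None:
        x_est, P = vision_step(x_est, P, delayed_frame)
    return x_est, P

In this form the re-correction is simply skipped whenever a vision frame is unavailable, which matches the treatment of the vision data as periodic out-of-sequence measurements.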

4. Rigid-Link Flexible-Joint Tracking Control

In this section, the fusion-estimation-based discrete-time terminal sliding mode tracking control (AMSFE-DTSMC) is presented for rigid-link flexible-joint manipulators whose state is estimated by the sensor fusion method described in the previous section. The controller is designed by using both the position and velocity of the link from the sensor fusion estimate. To design the novel controller, the model in (1a)–(1d) can be written as

$q(k+1) = q(k) + T\,\dot{q}(k)$  (29a)

$\dot{q}(k+1) = \dot{q}(k) + M^{-1}(q,\theta)\,T\,u(k) - F\big(q, \dot{q}, q_{m}, \dot{q}_{m}, T, \theta\big)$  (29b)

where $F(q, \dot{q}, q_{m}, \dot{q}_{m}, T, \theta) = F_{L}(q, \dot{q}, \theta)\,T + M^{-1}(q,\theta)\big[M_{m}\big(\dot{q}_{m}(k+1) - \dot{q}_{m}(k)\big) + F_{m}\,\dot{q}_{m}(k)\big]$ denotes the term which collects the dynamic parameters of the links and motors.

In order to formulate the tracking control, define the tracking error $\tilde{q}_{t}(k)$ and the estimation error $\tilde{q}_{e}(k)$ at time $t_{k}$ as

$\tilde{q}_{t}(k) = q(k) - q_{d}(k)$
$\tilde{q}_{e}(k) = q(k) - \hat{q}(k)$  (30)

where $q_{d}(k)$ is the desired position and $\hat{q}(k)$ denotes the estimated position.

Define the reference velocities for tracking and estimation:

$\dot{q}_{t}(k) = \dot{q}_{d}(k) + \Lambda_{t}\big(q_{d}(k) - \hat{q}(k)\big) = \dot{q}_{d}(k) - \Lambda_{t}\big(\tilde{q}_{t}(k) - \tilde{q}_{e}(k)\big)$
$\dot{q}_{e}(k) = \dot{\hat{q}}(k) - \Lambda_{e}\big(q(k) - \hat{q}(k)\big) = \dot{\hat{q}}(k) - \Lambda_{e}\,\tilde{q}_{e}(k)$  (31)

where $\dot{\hat{q}}(k)$ denotes the estimation of $\dot{q}(k)$, and $\Lambda_{t}$, $\Lambda_{e}$ are constant diagonal matrices.


Define the filtered variable including the estimation error:

$\tilde{\dot{q}}_{t}(k) = \dot{q}(k) - \dot{q}_{t}(k) = \dot{\tilde{q}}_{t}(k) + \Lambda_{t}\big(\tilde{q}_{t}(k) - \tilde{q}_{e}(k)\big)$  (32)

Consider the discrete terminal sliding surface as follows:

$s(k) = \tilde{\dot{q}}_{t}(k) + \lambda\,\tilde{q}_{t}(k)^{p}$  (33)

where λ is a positive constant diagonal parameter matrix and $p = p_{1}/p_{2}$, in which $p_{1}$ and $p_{2}$ are positive odd integers satisfying $p_{2} > p_{1}$. Motivated by the reaching law presented by Gao et al. in [20], we use the reaching law for exponential discrete sliding mode control as follows:

$s(k+1) = (I - Th)\,s(k) - T\varepsilon\,\mathrm{sgn}(s(k))$  (34)

where sgn(·) is the signum function, $\varepsilon > 0$, and $h > 0$. Since the system states cannot be measured directly, the parameters which contain the variables $q(k)$, $\dot{q}(k)$, and θ need to be estimated in the controller design. Assume that the estimation error and the uncertainties are bounded, and we have

$M(q,\theta) = \hat{M}(q,\theta) + \Delta M(q,\theta)$
$F(q, \dot{q}, q_{m}, \dot{q}_{m}, T, \theta) = \hat{F}(q, \dot{q}, q_{m}, \dot{q}_{m}, T, \theta) + \Delta F(q, \dot{q}, q_{m}, \dot{q}_{m}, T, \theta)$  (35)

where $\dot{q}_{m}$ denotes $\dot{q}_{m}(k+1)$, $\hat{M}(q,\theta) = M(\hat{q}, \hat{\theta})$ and $\hat{F}(q, \dot{q}, q_{m}, \dot{q}_{m}, T, \theta) = F(\hat{q}, \hat{\dot{q}}, \hat{q}_{m}, \hat{\dot{q}}_{m}, T, \hat{\theta})$ represent the estimations of the dynamic parameters, and $\Delta M(q,\theta)$ and $\Delta F(q, \dot{q}, q_{m}, \dot{q}_{m}, T, \theta)$ are bounded variables including the estimation error and system uncertainties, which satisfy

$\|\Delta M(q,\theta)\| \le \delta_{M}$, $\|\Delta F(q, \dot{q}, q_{m}, \dot{q}_{m}, T, \theta)\| \le \delta_{F}$  (36)

where $\delta_{M}$ and $\delta_{F}$ are constants.
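As a numerical note, the terminal term $\tilde{q}_{t}(k)^{p}$ with $p = p_{1}/p_{2}$ must be evaluated elementwise with the sign preserved so that it stays real for negative errors. A minimal Python sketch of the surface (33) and reaching law (34), with illustrative defaults $p_{1} = 3$, $p_{2} = 5$ and diagonal gains passed as matrices:

import numpy as np

def frac_power(x, p1=3, p2=5):
    # signed elementwise power x^(p1/p2), well defined for negative entries
    return np.sign(x) * np.abs(x) ** (p1 / p2)

def sliding_surface(qt_dot_err, qt_err, lam, p1=3, p2=5):
    # discrete terminal sliding surface of equation (33)
    return qt_dot_err + lam @ frac_power(qt_err, p1, p2)

def reaching_law_next(s, T, h, eps):
    # right-hand side of the reaching law (34)
    return (np.eye(len(s)) - T * h) @ s - T * (eps @ np.sign(s))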

Theorem 1. Consider the rigid-link flexible-joint manipulator system described by equations (29a) and (29b) and the discrete sliding manifold described by equation (33); by using the reaching law in equation (34), a stable control law is designed as

$u(k) = \dfrac{\hat{M}(q,\theta)}{T}\Big[\hat{F}(q, \dot{q}, q_{m}, \dot{q}_{m}, T, \theta) + \dot{q}_{t}(k) - \dot{q}_{t}(k-1) + \lambda\,\tilde{q}^{p}_{t}(k) - \lambda\big[T\big(\tilde{\dot{q}}_{t}(k) - \Lambda_{t}\tilde{q}_{t}(k) + \Lambda_{t}\tilde{q}_{e}(k)\big) + \tilde{q}_{t}(k)\big]^{p} - T\big(hs(k) + \varepsilon(k)\,\mathrm{sgn}(s(k))\big)\Big]$  (37)

where ε and h are positive diagonal matrices which satisfy the following inequalities:

$\varepsilon(k) > \dfrac{1}{T}\xi_{M}\Big\|\Delta\dot{q}_{t}(k) - \lambda\big[T\big(\tilde{\dot{q}}_{t}(k) - \Lambda_{t}\tilde{q}_{t}(k) + \Lambda_{t}\tilde{q}_{e}(k)\big) + \tilde{q}_{t}(k)\big]^{p} - \lambda\,\tilde{q}^{p}_{t}(k)\Big\| + \dfrac{1}{T}\xi_{F} + hs(k)$  (38)

$T < \dfrac{2s(k) - \Delta F}{hs(k) + \varepsilon(k)}$  (39)

Proof. Substituting the control law (37) into the rigid-link flexible-joint system equations (29a) and (29b), the error dynamics are obtained:

$\tilde{q}_{t}(k+1) = \tilde{q}_{t}(k) + T\big(\tilde{\dot{q}}_{t}(k) - \Lambda_{t}\tilde{q}_{t}(k) + \Lambda_{t}\tilde{q}_{e}(k)\big)$  (40)

$\tilde{\dot{q}}_{t}(k+1) = \tilde{\dot{q}}_{t}(k) + \lambda\,\tilde{q}_{t}(k)^{p} - \lambda\big[T\big(\tilde{\dot{q}}_{t}(k) - \Lambda_{t}\tilde{q}_{t}(k) + \Lambda_{t}\tilde{q}_{e}(k)\big) + \tilde{q}_{t}(k)\big]^{p} + \big(M^{-1}\hat{M} - I\big)\Big[\hat{F} + \Delta\dot{q}_{t}(k) - \lambda\big[T\big(\tilde{\dot{q}}_{t}(k) - \Lambda_{t}\tilde{q}_{t}(k) + \Lambda_{t}\tilde{q}_{e}(k)\big) + \tilde{q}_{t}(k)\big]^{p} - \lambda\,\tilde{q}^{p}_{t}(k)\Big] + M^{-1}\hat{M}\big[\Delta F - T\big(hs(k) + \varepsilon(k)\,\mathrm{sgn}(s(k))\big)\big]$  (41)


For simplicity, define $\bar{M} = M^{-1}\hat{M}$, which is invertible, so that $M^{-1} = \bar{M}\hat{M}^{-1}$. Substituting (40) and (41) into (34) yields

$s(k+1) = s(k) + \bar{M}\big[-T\big(hs(k) + \varepsilon(k)\,\mathrm{sgn}(s(k))\big)\big] + \bar{M}\hat{F} - F + \big(\bar{M} - I\big)\Big[\Delta\dot{q}_{t}(k) - \lambda\,\tilde{q}^{p}_{t}(k) - \lambda\big[T\big(\tilde{\dot{q}}_{t}(k) - \Lambda_{t}\tilde{q}_{t}(k) + \Lambda_{t}\tilde{q}_{e}(k)\big) + \tilde{q}_{t}(k)\big]^{p}\Big]$  (42)

Stability conditions for the discrete sliding mode control are given by Sarpturk [21]:

$s^{T}(k)\big(s(k+1) - s(k)\big) < 0$  (43a)

$s^{T}(k)\big(s(k+1) + s(k)\big) > 0$  (43b)

Combining (42) and (43a) we have

$s^{T}(k)\Big[-\bar{M}T\big[hs(k) + \varepsilon(k)\,\mathrm{sgn}(s(k))\big] + \bar{M}\hat{F} - F + \big(\bar{M} - I\big)\Big[\Delta\dot{q}_{t}(k) - \lambda\big[T\tilde{\dot{q}}_{t}(k) - T\Lambda_{t}\tilde{q}_{t}(k) + T\Lambda_{t}\tilde{q}_{e}(k) + \tilde{q}_{t}(k)\big]^{p} - \lambda\,\tilde{q}^{p}_{t}(k)\Big]\Big] < 0$  (44)

If $s(k) > 0$, then $\mathrm{sgn}(s(k)) = 1$ and $s(k+1) - s(k) < 0$; the lower bound of the sliding gain can be given as

$\varepsilon(k) > \dfrac{1}{T}\big(I - \bar{M}^{-1}\big)\Big[-\lambda\big[T\big(\tilde{\dot{q}}_{t}(k) - \Lambda_{t}\tilde{q}_{t}(k) + \Lambda_{t}\tilde{q}_{e}(k)\big) + \tilde{q}_{t}(k)\big]^{p} - \lambda\,\tilde{q}^{p}_{t}(k)\Big] + \dfrac{1}{T}\big(I - \bar{M}^{-1}\big)\Delta\dot{q}_{t}(k) + \dfrac{1}{T}\big(\hat{F} - \bar{M}^{-1}F\big) - hs(k)$  (45)

Employing $\xi_{M}$ and $\xi_{F}$ which satisfy $\|I - \bar{M}^{-1}\| < \xi_{M}$ and $\|\hat{F} - \bar{M}^{-1}F\| < \xi_{F}$, we can obtain $\varepsilon(k)$ as follows:

$\varepsilon(k) > \dfrac{1}{T}\xi_{M}\Big\|\Delta\dot{q}_{t}(k) - \lambda\big[T\big(\tilde{\dot{q}}_{t}(k) - \Lambda_{t}\tilde{q}_{t}(k) + \Lambda_{t}\tilde{q}_{e}(k)\big) + \tilde{q}_{t}(k)\big]^{p} - \lambda\,\tilde{q}^{p}_{t}(k)\Big\| + \dfrac{1}{T}\xi_{F} + hs(k)$  (46)

If $s(k) < 0$, then $\mathrm{sgn}(s(k)) = -1$, and $s(k+1) - s(k)$ can be calculated by

$s(k+1) = s(k) + \bar{M}T\big[\varepsilon(k) - hs(k)\big] + \bar{M}\hat{F} - F + \big(\bar{M} - I\big)\Big[\Delta\dot{q}_{t}(k) - \lambda\,\tilde{q}^{p}_{t}(k) - \lambda\big[T\big(\tilde{\dot{q}}_{t}(k) - \Lambda_{t}\tilde{q}_{t}(k) + \Lambda_{t}\tilde{q}_{e}(k)\big) + \tilde{q}_{t}(k)\big]^{p}\Big]$  (47)

Employing $\varepsilon(k)$ given by equation (46), we have

$s(k+1) - s(k) > \bar{M}Th\big[s(k) - s(k)\big] + \big[\bar{M}\xi_{F} - \bar{M}\hat{F} + F\big] + \bar{M}\xi_{M}\Big\|\Delta\dot{q}_{t}(k) - \lambda\big[T\tilde{\dot{q}}_{t}(k) - T\Lambda_{t}\tilde{q}_{t}(k) + T\Lambda_{t}\tilde{q}_{e}(k) + \tilde{q}_{t}(k)\big]^{p} - \lambda\,\tilde{q}^{p}_{t}(k)\Big\| - \big(\bar{M} - I\big)\Big[\Delta\dot{q}_{t}(k) - \lambda\big[T\tilde{\dot{q}}_{t}(k) - T\Lambda_{t}\tilde{q}_{t}(k) + T\Lambda_{t}\tilde{q}_{e}(k) + \tilde{q}_{t}(k)\big]^{p} - \lambda\,\tilde{q}^{p}_{t}(k)\Big]$  (48)


where $s(k+1) - s(k) > 0$ is obtained by adopting $\varepsilon(k)$ according to equation (46). Therefore, equation (46) satisfies the Sarpturk stability condition (43a).

Assuming $\bar{M} = I$, the Sarpturk condition (43b) is formulated by combining equations (42) and (43b):

$s(k+1) + s(k) = 2s(k) - \Delta F - T\big[hs(k) + \varepsilon(k)\,\mathrm{sgn}(s(k))\big]$  (49)

If $s(k) < 0$ and $s(k+1) + s(k) < 0$, we have

$2s(k) - T\big(hs(k) - \varepsilon(k)\big) - \Delta F < 0$  (50)

Else, if $s(k) > 0$ and $s(k+1) + s(k) > 0$, we have

$2s(k) - T\big(hs(k) + \varepsilon(k)\big) - \Delta F > 0$  (51)

The sampling time T which satisfies the following inequality can guarantee the condition in (43b):

$0 < T < \dfrac{2s(k) - \Delta F}{hs(k) + \varepsilon(k)}$  (52)

From (52), we know that the stability conditions of DTSMC given by (43a) and (43b) are guaranteed by the chosen parameters ε and h defined in (38) and (39). Theorem 1 is proved.
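A minimal sketch of how the control law (37) can be evaluated from the fused estimates is given below; variable names mirror the paper's symbols, the diagonal gains are passed as matrices, and this is an illustration rather than the authors' implementation:

import numpy as np

def amsfe_dtsmc_control(M_hat, F_hat, qdot_ref, qdot_ref_prev,
                        qt_dot_err, qt_err, qe_err, s,
                        T, lam, Lam_t, h, eps, p=3/5):
    # signed fractional power used by the terminal terms
    fp = lambda x: np.sign(x) * np.abs(x) ** p
    # bracketed term [T(q~'_t - Lam_t q~_t + Lam_t q~_e) + q~_t] of (37)
    inner = T * (qt_dot_err - Lam_t @ qt_err + Lam_t @ qe_err) + qt_err
    return (M_hat / T) @ (F_hat + qdot_ref - qdot_ref_prev
                          + lam @ fp(qt_err) - lam @ fp(inner)
                          - T * (h @ s + eps @ np.sign(s)))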

5. Simulation and Experimental Studies

5.1. Simulation Study. The results obtained from the simulation of the proposed control scheme on the two-link RLFJ manipulator shown in Figure 2 are presented in this section. In the simulation, the aim is to make the RLFJ manipulator track the desired trajectories $q_{1d} = \sin(\pi t)$ and $q_{2d} = \cos(\pi t)$ from the initial position $[q_{1i}, q_{2i}] = [0.5, 0.5]$. The robotic dynamic parameters are given by

$M(q) = \begin{bmatrix} M_{11} & M_{12} \\ M_{21} & m_{2}l_{2}^{2} \end{bmatrix}$,
$g(q) = \begin{bmatrix} g_{1} \\ -m_{2}gl_{2}\sin(q_{1}+q_{2}) \end{bmatrix}$,
$C = \begin{bmatrix} -m_{2}l_{1}l_{2}\sin(q_{2})\big(2\dot{q}_{1}\dot{q}_{2} + \dot{q}_{2}^{2}\big) \\ -m_{2}l_{1}l_{2}\sin(q_{2})\,\dot{q}_{1}\dot{q}_{2} \end{bmatrix}$,
$M_{11} = (m_{1} + m_{2})l_{1}^{2} + m_{2}l_{2}^{2} + 2m_{2}l_{1}l_{2}\cos(q_{2})$,
$M_{12} = M_{21} = m_{2}l_{2}^{2} + m_{2}l_{1}l_{2}\cos(q_{2})$,
$g_{1} = -(m_{1} + m_{2})gl_{1}\sin(q_{1}) - m_{2}gl_{2}\sin(q_{1}+q_{2})$  (53)

where $m_{i}$ and $l_{i}$ are given in Table 1. For $i = 1, 2$, $m_{i}$ and $l_{i}$ denote the mass and length of the $i$th link; $k_{m} = \mathrm{diag}\{10, 10\}$ N·m/rad, $M_{m} = \mathrm{diag}\{0.5, 0.5\}$ kg·m², and $F_{m} = \mathrm{diag}\{4, 0.5\}$ N·m·s/rad. In order to demonstrate the influence of the estimation on the tracking performance, a DTSMC controller without estimation is also simulated. The DTSMC control law is given by

$u(k) = \dfrac{\hat{M}}{T}\Big[\hat{F} + \dot{q}_{d}(k) - \dot{q}_{d}(k-1) + \lambda\,\tilde{q}^{p}_{t}(k) - \lambda\big[T\dot{\tilde{q}}_{t}(k) + \tilde{q}_{t}(k)\big]^{p} - hs(k) - \varepsilon(k)\,\mathrm{sgn}(s(k))\Big]$  (54)

with a sliding surface $s(k) = \dot{\tilde{q}}_{t}(k) + \lambda\,\tilde{q}_{t}(k)^{p}$, which is different from the proposed sliding mode surface.

Parameters of the proposed controller and the DTSMC are selected as $T = 0.001$ s, $h = \mathrm{diag}\{2, 2\}$, $\lambda = \mathrm{diag}\{15, 15\}$, $\Lambda_{t} = \mathrm{diag}\{5, 5\}$, and $\varepsilon = \mathrm{diag}\{25, 25\}$; $\theta(k) = 0.15$ is chosen as a constant model parameter, whose initial estimated value is 0; $p = 3$, $q = 5$, $\xi_{M} = \mathrm{diag}\{3, 3\}$, and $\xi_{F} = \mathrm{diag}\{1, 1\}$. In the simulation, as shown in Figure 3, we assume that the end effector is observed by a fixed camera whose delayed measurements are used directly to calculate the joint position. The delay time of the slow measurement is 0.125 s. The process and measurement noise covariances are chosen as $Q = 0.01I$ and $R = 0.1I$. The initial joint position (joint 1, joint 2) for tracking control is given by (0.5 rad, 0.5 rad), and for the fusion estimator it is given by (0.9 rad, 0.9 rad). The initial velocity for tracking control is given by (0, 0), and for estimation it is given by (0.25 rad/s, 0).

The estimation errors of position and velocity are plotted in Figure 4. The estimation error of the parameter θ is shown in Figure 5. The position tracking of the proposed method for the two links is shown in Figures 6 and 7, where the comparative result of SMC without the fusion estimator is also plotted. In conclusion, the simulation results clearly indicate that the proposed approach guarantees the convergence of the tracking errors and achieves better tracking accuracy.
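For reference, a minimal sketch of the link-side dynamic terms written in equation (53) is given below; the gravity constant $g = 9.81$ m/s² is an assumption, and the parameter values would follow Table 1:

import numpy as np

def two_link_terms(q, qdot, m1, m2, l1, l2, g=9.81):
    q1, q2 = q
    qd1, qd2 = qdot
    # inertia matrix M(q) of (53)
    M11 = (m1 + m2) * l1**2 + m2 * l2**2 + 2 * m2 * l1 * l2 * np.cos(q2)
    M12 = m2 * l2**2 + m2 * l1 * l2 * np.cos(q2)
    M = np.array([[M11, M12],
                  [M12, m2 * l2**2]])
    # Coriolis/centrifugal vector C of (53), as written in the paper
    C = np.array([-m2 * l1 * l2 * np.sin(q2) * (2 * qd1 * qd2 + qd2**2),
                  -m2 * l1 * l2 * np.sin(q2) * qd1 * qd2])
    # gravity vector g(q) of (53)
    grav = np.array([-(m1 + m2) * g * l1 * np.sin(q1) - m2 * g * l2 * np.sin(q1 + q2),
                     -m2 * g * l2 * np.sin(q1 + q2)])
    return M, C, grav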

5.2. Experimental Study. To validate the applicability of the proposed control scheme, a single-link flexible-joint manipulator with a fixed camera is employed as the experimental plant, which is shown in Figure 8.

Figure 2: The schematic diagram of the robotic manipulator with kinematic parameters (link lengths $l_{1}$, $l_{2}$ and joint angles $q_{1}$, $q_{2}$ in the X–Y plane; the end effector is at the tip of link 2).

Table 1: The manipulator parameters.

Link i    m_i (kg)    l_i (cm)    q_i
1         2.5         35          q_1 (0° ~ 180°)
2         3           20          q_2 (−160° ~ 160°)


The aim is to make the end effector move along the desired trajectory $x_{d} = l_{1}\cos(q_{d})$, $y_{d} = l_{1}\sin(q_{d})$ with $q_{d} = (0.4 - t)/7.5$, and the initial position of the joint is $q_{i} = 0.35$ rad. Parameters of the single-link robot are measured offline as $l_{1} = 0.6$ m and $m_{1} = 1.0$ kg; the measurement errors are $\Delta l_{1} = 0.05$ m and $\Delta m_{1} = 0.1$ kg.

To observe the state of the end effector, the calibrated camera is fixed perpendicular to the robot motion plane. The coordinate relation between the Cartesian space and the image space is shown in Figure 9. Camera measurements are obtained from the image sequences shown in Figure 10. We define the position in the image coordinates, given in Figure 11, as $y(k) = [x_{m}(k), y_{m}(k)]$,

Figure 3: Asynchronous multirate sensor measurements (delayed vision measurements and noisy encoder measurements for joint 1 and joint 2 over time).

Figure 4: The estimation errors of $q_{1}$, $\dot{q}_{1}$, $q_{2}$, $\dot{q}_{2}$.

Figure 5: The estimation error of θ.

Figure 6: Comparative control performances of joint 1 between the proposed method and SMC (reference trajectory, proposed method, and SMC).

Figure 7: Comparative control performances of joint 2 between the proposed method and SMC (reference trajectory, proposed method, and SMC).


and the position in the Cartesian coordinates as $r(k) = [x_{c}(k), y_{c}(k)]$. By using the pinhole camera model, we calculate the mapping from the joint space to the image space $I_{k}(\cdot)$ in equation (8) as follows:

$\dfrac{f}{H_{c} - f}\begin{bmatrix} x_{m}(k) \\ y_{m}(k) \end{bmatrix} = \begin{bmatrix} x_{c}(k) \\ y_{c}(k) \end{bmatrix} = \begin{bmatrix} l_{1}\cos q(k) \\ l_{1}\sin q(k) \end{bmatrix}$  (55)

where $H_{c}$ is the perpendicular distance between the camera and the robot motion plane and $f$ denotes the focal length of the camera. In the experiment, the camera parameters are $f = 0.035$ m, $640 \times 512$ pixels, and $H_{c} = 0.8$ m. According to the mapping $I_{k}(\cdot)$, the joint position can be obtained by calculating the inverse trigonometric functions $\arccos\big[x_{m}(k)f/(l_{1}(H_{c} - f))\big]$ or $\arcsin\big[y_{m}(k)f/(l_{1}(H_{c} - f))\big]$. The delayed vision measurements $[x_{m}(k), y_{m}(k)]$ are converted to joint measurements $\arccos\big[x_{m}(k)f/(l_{1}(H_{c} - f))\big]$, which are shown in Figure 12, where the encoder measurements are also plotted.

Figure 8: The experiment setup (servo motor, servo drive, PMAC-2A controller, encoder, gear mechanism, end effector, camera, image processing, fusion estimate, and control algorithm).

Figure 9: The coordinate relation of the Cartesian space and the image space.

Figure 10: The image sequences of 10 frames.


The position estimation of the joint is calculated by using the proposed fusion estimation method and is shown as the red solid line in Figure 12. To show the performance of the fusion estimation method in detail, Figure 13 shows the error of the joint position estimation.

To validate the performance of the fusion estimate-based DTSMC, comparative experiments are implemented. The proposed controller and a DTSMC without the estimator are employed in this test. Parameters are selected as $T = 0.01$ s, $h = 3$, $\lambda = 2$, $\Lambda_{t} = 4$, and $\varepsilon = 20$; $\theta(k) = 0.05$, $p = 3$, $q = 5$, $\xi_{M} = 0.2$, and $\xi_{F} = 2$. According to the comparative position tracking performances given in Figure 14, it is obvious that the proposed controller provides superior behavior.

6. Conclusion

A novel RLFJ manipulator tracking controller, AMSFE-DTSMC, is proposed in this paper based on DTSMC coupled with asynchronous multirate sensor fusion. The states of the manipulator are estimated by an EKF-based sensor fusion algorithm which combines asynchronous multirate measurements from visual and non-vision-based sensors. Compared with the non-vision measurements, the visual measurements are treated as periodic out-of-sequence measurements which are used to re-estimate the state. With the state estimation, the DTSMC is designed by using a novel sliding surface that includes both the tracking error and the estimation error. By using the Sarpturk inequalities, the boundedness of the controlled variables is proved. The effectiveness of the proposed approach is shown in the simulation and experimental studies.

Data Availability

No data were used to support this study

Figure 11: The trajectories of the end effector in the image space (position measurements in image space).

Figure 12: The encoder and visual measurements (vision measurements, encoder measurements, and position estimation of the joint over time).

Figure 13: The position estimation error of the joint.

Figure 14: Comparative position tracking errors of the end effector (X-axis and Y-axis tracking errors for the proposed method and DTSMC).


Disclosure

An earlier version of the multirate sensor fusion-based DTSMC was presented at the 10th IEEE International Conference on Control and Automation (ICCA) [12]; however, a totally new controller is designed in this paper by using a new sliding surface including both the tracking error and the estimation error, and the effectiveness of the proposed approach is proved in both the simulation and experimental studies.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] S. Nicosia and P. Tomei, "A tracking controller for flexible joint robots using only link position feedback," IEEE Transactions on Automatic Control, vol. 40, no. 5, pp. 885–890, 1995.

[2] S. Nicosia, P. Tomei, and A. Tornambe, "A nonlinear observer for elastic robots," IEEE Journal on Robotics and Automation, vol. 4, no. 1, pp. 45–52, 1988.

[3] W. E. Dixon, E. Zergeroglu, D. M. Dawson, and M. W. Hannan, "Global adaptive partial state feedback tracking control of rigid-link flexible-joint robots," Robotica, vol. 18, no. 3, pp. 325–336, 2000.

[4] F. Abdollahi, H. A. Talebi, and R. V. Patel, "A stable neural network-based observer with application to flexible-joint manipulators," IEEE Transactions on Neural Networks, vol. 17, no. 1, pp. 118–129, 2006.

[5] J. Na, B. Jing, Y. Huang, G. Gao, and C. Zhang, "Unknown system dynamics estimator for motion control of nonlinear robotic systems," IEEE Transactions on Industrial Electronics, vol. 67, no. 5, pp. 3850–3859, 2020.

[6] S. Jean, M. Tomizuka, and T. Katou, "Kinematic Kalman filter (KKF) for robot end-effector sensing," Journal of Dynamic Systems, Measurement, and Control, vol. 131, no. 2, pp. 1–8, 2009.

[7] C. A. Lightcap and S. A. Banks, "An extended Kalman filter for real-time estimation and control of a rigid-link flexible-joint manipulator," IEEE Transactions on Control Systems Technology, vol. 18, no. 1, pp. 91–103, 2010.

[8] J. G. García, A. Robertsson, J. G. Ortega, and R. Johansson, "Sensor fusion for compliant robot motion control," IEEE Transactions on Robotics, vol. 24, no. 2, pp. 430–441, 2008.

[9] R. Jassemi-Zargani and D. Necsulescu, "Extended Kalman filter-based sensor fusion for operational space control of a robot arm," IEEE Transactions on Instrumentation and Measurement, vol. 51, no. 6, pp. 1279–1282, 2002.

[10] S. Wang, L. Tao, Q. Chen, J. Na, and X. Ren, "USDE-based sliding mode control for servo mechanisms with unknown system dynamics," IEEE/ASME Transactions on Mechatronics, vol. 25, no. 2, pp. 1056–1066, 2020.

[11] K. Jezernik, B. Curk, and J. Harnik, "Observer-based sliding mode control of a robotic manipulator," Robotica, vol. 12, no. 5, pp. 443–448, 1994.

[12] G. Xue, X. Ren, K. Xing, and Q. Chen, "Discrete-time sliding mode control coupled with asynchronous sensor fusion for rigid-link flexible-joint manipulators," in Proceedings of the 2013 10th IEEE International Conference on Control and Automation (ICCA), pp. 238–243, Hangzhou, China, June 2013.

[13] Y. Gao, S. S. Majidabad, and H. T. Shandiz, "Discrete-time based sliding-mode control of robot manipulators," International Journal of Intelligent Computing and Cybernetics, vol. 5, no. 3, pp. 340–358, 2012.

[14] M. L. Corradini, V. Fossi, A. Giantomassi, G. Ippoliti, S. Orlando, and G. Orlando, "Discrete time sliding mode control of robotic manipulators: development and experimental validation," Control Engineering Practice, vol. 20, no. 8, pp. 816–822, 2012.

[15] L. Wang, T. Chai, and L. Zhai, "Neural-network-based terminal sliding-mode control of robotic manipulators including actuator dynamics," IEEE Transactions on Industrial Electronics, vol. 56, no. 9, pp. 3296–3304, 2009.

[16] Y. Feng, X. Yu, and Z. Man, "Non-singular terminal sliding mode control of rigid manipulators," Automatica, vol. 38, no. 12, pp. 2159–2167, 2002.

[17] M. L. Corradini and G. Orlando, "Linear unstable plants with saturating actuators: robust stabilization by a time varying sliding surface," Automatica, vol. 43, no. 1, pp. 88–94, 2007.

[18] Y. Bar-Shalom, H. Chen, and M. Mallick, "One-step solution for the multistep out-of-sequence-measurement problem in tracking," IEEE Transactions on Aerospace and Electronic Systems, vol. 40, no. 1, pp. 27–37, 2004.

[19] K. Reif, F. Sonnemann, and R. Unbehauen, "An EKF-based nonlinear observer with a prescribed degree of stability," Automatica, vol. 34, no. 9, pp. 1119–1123, 1998.

[20] W. Gao, Y. Wang, and A. Homaifa, "Discrete-time variable structure control systems," IEEE Transactions on Industrial Electronics, vol. 42, no. 2, pp. 117–122, 1995.

[21] S. Z. Sarpturk, Y. Istefanopulos, and O. Kaynak, "On the stability of discrete-time sliding mode control systems," IEEE Transactions on Automatic Control, vol. 32, no. 10, pp. 930–932, 1987.


Page 3: Discrete-TimeSlidingModeControlCoupledwithAsynchronous

Fk+1k(x(k) u(k))

q(k) + T _q(k)

_q(k) minus FL(k)T + Mminus 1

(q) Ksδ(k)T( 1113857

qm(k) + T _qm(k)

Mminus 1m _qm(k) minus Fm _qm(k)T minus Ksδ(k)T + Tu(k)1113858 1113859

θ(k)

⎡⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎣

⎤⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎦

Γk

0 0 0

M(q θ)minus 1 0 0

0 0 0

0 Mminus 1m 0

0 0 IT

⎡⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎣

⎤⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎦

ω(k + 1 k) wl(k)T wm(k)

T wθ(k)

T1113960 1113961

T

(4)

22MeasurementModel )e observation vector is given by

h(x(k)) qm(k)T

_qm(k)T

y(k)T

1113872 1113873T (5)

where qm(k) _qm(k)T are the position and velocity of motorangles and y(k) represents the position of the end effector inimage space observed by a fixed camera

By using the standard pinhole camera model themapping from the Cartesian space to image space is given by

y(k) Im(r(k)) (6)

where r(k) denotes the position of the end effector in theCartesian space and Im(middot) is a transformation function fromtask space to image space From the forward kinematics theposition relationship between robotic joints and end effectoris described by a transform relationship as follows

r(k) Fk(q(k)) (7)

where r(k) is the position of the end effector at kth time andFk(middot) is a transformation function from the joint space totask space

From the equations (6) and (7) )e joint position can beobserved by a fixed camera

y(k) Ik(q(k)) (8)

where Ik(middot) Im(Fk(middot)) is the mapping from the jointspace to image space With the random noise ](k) v

T11113960

vT2 (k)v

T3 (k)(k)] the measurement equation is given by

zi(k) h

ik(x(k)) + ]i

(k) i c m (9)

where zi(k) is the measurement of the state from differentsensor i c denotes the camera measurement and i m

represents the measurement from motors We assume thatthe process noise ω(k) and measurement noise ](k) aresampled from the independent and identically distributedwhite Gaussian noise which satisfies following equations attime tk

E ω(k + 1 k) 0

E ](k) 0

E ](j)ωT(k + 1 k)1113966 1113967 0

E ω(j + 1 j)ωTN(k + 1 k)1113966 1113967 Q(k)δjk

E ](j)]T(k)1113966 1113967 R(k)δjk

⎧⎪⎪⎪⎪⎪⎪⎪⎪⎪⎨

⎪⎪⎪⎪⎪⎪⎪⎪⎪⎩

(10)

Since the measurement of the vision sensor at time tk isobtained from the visual image taken at time steptκ tk minus td td denotes the delay time )e relation of dif-ferent sensorsrsquo sampling rates is given by

sm lsc (11)

where sm and sc denote the sampling rate of motor encodersand visual sensors respectively and l is a positive integer

We show the sampling rate difference between visionand non-vision measurements in Figure 1 where l step lagsof vision measurements are described According to thecharacteristics of visual sensor we treat the vision mea-surements as the periodic l-step lag out-of-sequencemeasurements

Remark 1 )e velocity of joints (orand end effector) alsocan be observed by vision sensors _y(k) Jik(q(k)) _q(k)with the Jacobian matrix Jik(q) Ji(r)Jk(q) mapping fromthe joint space to the image space Ji(r) is the image Jacobianmatrix and Jk(q) denotes the Jacobian matrix from jointspace to task space However the measurement of velocity isalways impaired by the noisy image data with slow samplingrate and the dynamic uncertainties in the RLFJ manipulatorsystem )erefore the velocity measurement in the imagespace is not utilized in the measurement model

Remark 2 Assume that the camera can cover the entireworkspace of the robot With the prior knowledge of motionplanning of the robot it is basically assumed that there is anone-to-one mapping from the image space to joint space inthe real robotic system

Complexity 3

3 Asynchronous Multirate SensorFusion Estimation

According to the system model described in Section 2 therobotic link position can be estimated by using the multi-sensor system For asynchronous multirate sensors thesensor fusion method is designed to use the late measure-ments for updating the current estimated state to get a moreaccurate estimation in two steps

Step 1 when the vision measurement is unavailable therobotic state is estimated using the non-vision-basedsensors which keep the real-time estimate by recoveringthe missing information between two vision samplesStep 2 as the delayed vision measurement arrives atevery vision sample at time t tlk l 1 2 the stateshould be re-estimated to cope with the limitations ofother sensors in the absolute position measurement

31 Estimation by Using Non-Vision Sensor MeasurementsFrom Figure 1 before the (k minus 1)th vision frame is availablewe have the estimation of the state at time tlk using the non-vision sensor measurements

1113954x(lk|lk) E x(lk)|Zlkm1113966 1113967

P(lk|lk) cov x(lk)|Zlkm1113966 1113967

(12)

where Zmlk zm(1) zm(2) zm(lk) represents all en-

coder motor measurements up to time tlk Using the ex-tended Kalman filter (EKF) the estimation at time tlk via themotor encoder measurement is given as follows

1113954xminus

(lk|lk) Flklkminus 1(1113954x(lk minus 1|lk minus 1) u(lk)) (13)

P(lk|lk minus 1) AlkP(lk minus 1|lk minus 1)ATlk + WlkQ(lk minus 1)W

Tlk

(14)

Klk P(lk|lk minus 1)HTlk Hlk times P(lk|lk minus 1)H

Tlk + R(lk)1113960 1113961

minus 1

(15)

1113954x(lk|lk) 1113954xminus

(lk|lk) + Klk zm

(lk) minus hmlk 1113954x

minus(lk|lk) 0( )( 1113857

(16)

P(lk|lk) I minus KlkHlk( 1113857P(lk|lk minus 1) (17)

where Alk Wlk Hlk and Vlk are Jacobian matrices and Klk isthe correction gain vector According to above equationsthe state and covariance estimates are updated with themeasurement with a fast sampling rate

32 Re-Estimation via Vision Sensor MeasurementsSuppose that at the time tlk the vision measurement at timetκ tklminus l is obtained A new estimation 1113954x(lk|κ+) is calculatedusing the information about the delayed (k minus 1)th visionmeasurement zc(κ) 1113954x(lk|κ+) is defined withZlk Zm

lk zc(κ)1113864 1113865

1113954x lk|κ+( 1113857 E x(lk)|Zlk1113864 1113865

P lk|κ+( 1113857 cov x(lk)|Zlk1113864 1113865

(18)

)e delayed measurement zc(κ) observed by visionmeasurement will be used to correct the accumulated es-timation errors which are caused by sensors with fastsampling rate )e equations for updating the estimation1113954x(lk|lk) with the delayed vision measurement zc(κ) aregiven by

1113954x lk|κ+( 1113857 1113954x(lk|lk) + Pxz(lk κ|lk) times Pzz(κ|lk)

minus 1[z(κ) minus 1113954z(κ|lk)]

(19)

P lk|κ+( 1113857 P(lk|lk) minus Pxz(lk κ|lk) times Pzz(κ|lk)

minus 1Pxz(lk κ|lk)

T

(20)

where Pxz(lk κ|lk) represents cross covariance between1113954x(lk|lk) and 1113954z(κ|lk) Pzz(κ|lk) is the covariance of 1113954z(κ|lk)

which denotes the estimation of measurement at time tκ and

Pxz(lk κ|lk) E 1113957x(lk)1113957zc(κ)

T1113966 1113967

Pxz(lk κ|lk) E 1113957zc(κ)1113957z

c(κ)

T1113966 1113967

1113954z(κ|lk) E z(κ)|Zlk1113864 1113865

(21)

with 1113957x(lk) x(lk) minus 1113954x(lk|lk) 1113957zc(κ) zc(κ) minus 1113954z(κ|lk)]TUsing the EKF Pxz(lk κ|lk) Pxz(lk κ|lk) and 1113954z(κ|lk)

are obtained by assuming that the function Fkκ(middot) is in-vertible Define Fminus 1

lkκ(middot) Fκlk(middot) which denotes the back-ward transition function to estimate the state back from tlk totκ Since the previous state x(κ) is not affected by presentinput signal u(lk) we give the state relationship between tκand tlk by

x(κ) Fκlk(x(lk) minus ω(lk κ)) (22)

where the process noise ω(lk κ) and covariance Q(kl κ) arecalculated by

lkth encoder measurement

kth visionmeasurement

(k minus 1) th visionmeasurement

(k minus 2) th visionmeasurement

Nonminusvision measurementsVision measurementsSystem states

lowast

Time tlkminusl tlkminus1 tlk tlk+1 tlk+l

Figure 1)e schematic diagram of the relation between vision andnon-vision measurements

4 Complexity

ω(lk κ) asymp Γlkω(lk lk minus 1) + Alklkminus 1Γlkminus 1ω(lk minus 1 lk minus 2)

+ middot middot middot + Alklkminus 1Alkminus 1lkminus 2 times middot middot middot times Alkminus l+2lkminus l+1Γlkminus l+1ω(lk minus l + 1 lk minus l)

Q(kl κ) E 1113957ω(lk κ)1113957ω(lk κ)T

1113966 1113967

1113957ω(lk κ) ω(lk κ) minus 1113954ω(lk κ)

(23)

)e estimation in equation (19) can be determined by

1113954z(κ|lk) Hmlk(κ)1113954x(κ|lk)

1113954x(κ|lk) Fκlk(1113954x(lk|lk) minus 1113954ω(lk κ|lk))(24)

To estimate the process noise as in equations (13)ndash(17)we have

1113954ω(lk κ|lk) Q(lk κ)HTlkPzz(lk|lk minus 1)

minus 1times(z(lk) minus z(lk minus 1))

(25)

)en Pxz(lk κ|lk) and Pzz(κ|lk) are obtained as follows

Pxz(lk κ|lk) P(lk κ|lk) minus Pxω(lk κ|lk)( 1113857ATκlkH(κ)

T

Pzz(κ|lk) H(κ)P(κ|lk)H(κ)T

+ R(κ)

(26)

where the covariances P(κ|lk) and Pxω(lk κ|lk) are derivedas follows

P(κ|lk) Aκlk P(lk|lk) + Pωω(κ|lk) minus Pxω(lk κ|lk)T

1113872 1113873ATκlk minus Pxω(lk κ|lk)

Pωω(κ|lk) Q(lk κ) minus Q(lk κ)HTlk times Pzz(lk|lk minus 1)

minus 1HlkQ(lk κ)

T

Pxω(lk κ|lk) Q(lk κ) minus P(lk|lk minus 1)HTlk times Pzz(lk|lk minus 1)

minus 1HlkQ(lk κ)

T

(27)

33 Summary of the Fusion Estimate Method )e state of aRLFJ manipulator is estimated using the indirect mea-surements from asynchronous multirate sensors )e sensorfusion estimate can be implemented in the practical

application using a switching mechanism in accordance withsampling time As shown in Figure 1 the update equationsare chosen at the different sampling times )e state esti-mation can be given by

1113954xF(k|k) Fkkminus 1 1113954x

minus(k|k)( + Kk z

m(k) minus h

mk 1113954x

minus(k|k) 0( )1113858 1113859 tne tlk

1113954x(k|k) + Pxz(k k minus l|k)Pminus 1zz(k minus l|k) z

c(k minus l) minus 1113954z(k minus l|k)( 1113857 t tlk

⎧⎨

⎩ (28)

Remark 3 )e exponential convergence of the sensorfusion estimate can be proved in the similar way which ispresented in [19] which has more detailed conclusion onthe stability

4 Rigid-Link Flexible-Joint Tracking Control

In this section the discrete-time terminal sliding modetracking control-based fusion estimation (AMSFE-DTSMC)is presented for rigid-link flexible manipulators whose stateis estimated by the sensor fusion method described in theprevious section )e controller is designed by using bothposition and velocity of the link from the sensor fusionestimate To design the novel controller the model in(1a)ndash(1d) can written as

q(k + 1) q(k) + T _q(k) (29a)

_q(k + 1) _q(k) + Mminus 1

(q θ)Tu(k) minus F q q _qm _qm T θ1113872 1113873

(29b)

where F(q q _qm _qm T θ) FL(q q _q θ)T + Mminus 1(q θ)[Mm

( _qm(k + 1) minus _qm(k)) + Fm _qm(k)] denotes the variable whichincludes dynamic parameters of links and motors

In order to formulate the tracking control define thetracking error 1113957qt(k) and estimation error 1113957qe(k) at time tk as

1113957qt(k) q(k) minus qd(k)

1113957qe(k) q(k) minus 1113954q(k)(30)

where qd(k) is the desired position and 1113954q(k) denotes theestimation position

Define the reference velocity for tracking and estimation

_qt(k) _qd(k) + Λt qd(k) minus 1113954q(k)( 1113857

_qd(k) minus Λt 1113957qt(k) minus 1113957qe(k)( 1113857

_qe(k) _1113954q(k) minus Λe(q(k) minus 1113954q(k))

_1113954q(k) minus Λe1113957qe(k)

(31)

where _1113954q(k) denotes the estimation of _q(k) and ΛtΛe areconstants and diagonal matrices

Complexity 5

Define the filtered variables including the estimationerror

1113957_qt(k) _q(k) minus _qt(k) _1113957qt(k) + Λt 1113957qt(k) minus 1113957qe(k)( 1113857 (32)

Consider the discrete terminal sliding surface as follows

s(k) 1113957_qt(k) + λ1113957qt(k)p (33)

where λ is a positive constant diagonal parameter matrix andp p1p2 in which p1 and p2 are positive odd integerssatisfying p2 gtp1 Motivated by the reaching law presented

by Weibing Gao et al in [20] we use the reaching law forexponential discrete sliding mode control as follows

s(k + 1) (I minus Th)s(k) minus Tεsgn(s(k)) (34)

where sgn(middot) is the signum function εgt 0 and hgt 0Since system states cannot be measured directly pa-

rameters which contain variables q(k) _q(k) and θ need tobe estimated in the controller design Assume that the es-timate error and uncertainties are bounded and we have

M(q θ) 1113954M(q θ) + ΔM(q θ)

F q q _qm _qm T θ1113872 1113873 1113954MF q q _qm _qm T θ1113872 1113873 + ΔF q q _qm _qm T θ1113872 1113873(35)

where _qm denotes _qm(k + 1) 1113954M(q θ) M(1113954q 1113954θ) and1113954F(q q _qm _qm T θ) F(1113954q 1113954q 1113954_qm 1113954_qm T 1113954θ) represent the es-timation of dynamic parameters and ΔM(q θ) andΔF(q q _qm _qm T θ) are bounded variables including theestimation error and system uncertainties which satisfy

ΔM(q θ)le δM

ΔF q q _qm _qm T θ1113872 1113873

le δF (36)

where δM and δF are constants

Theorem 1 Consider the rigid-link flexible-joint manipu-lator system described by equations (29a) and (29b) and the

discrete sliding manifold described by equation (33) by usingthe reaching law in equation (34) stable control law isdesigned as

u(k) 1113954M(q θ)

T1113954F q q _qm _qm T θ1113872 1113873 + _qt(k) minus _qt(k minus 1) + λ1113957q

pt (k)1113960

minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 + 1113957qt(k)1113960 1113961p

minus T(hs(k) + ε(k)sgn(s(k)))]

(37)

where ε and h are positive diagonal matrices which satisfy thefollowing inequations

ε(k)gt1TξM Δ _qt(k) minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 + 1113957qt(k)1113960 1113961

pminus λ1113957q

pt (k)

+1TξF + hs(k)

(38)

Tlt2s(k) minus ΔF

hs(k) +ε(k) (39)

Proof Substituting control law (37) in the rigid-link flex-ible-joint system equations (29a) and (29b) the error dy-namics are obtained

1113957qt(k + 1) 1113957qt(k) + T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 (40)

1113957_qt(k + 1) 1113957_qt(k) + λ1113957qt(k)p

minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 + 1113957qt(k)1113960 1113961p

+ Mminus 1 1113954M minus I1113872 1113873 1113954F + Δ _qt(k) minus λ T 1113957_qt(k)111387211139601113960

minus Λt1113957qt(k) + Λt1113957qe(k)1113857 + 1113957qt(k)1113859p

minus λ1113957qpt (k)1113961

+ Mminus 1 1113954M[ΔF minus T(hs(k) + ε(k)sgn(s(k)))]

(41)

6 Complexity

For simplicity define M Mminus 1 1113954M which is invertibleand Mminus 1 1113954M

minus 1M Substituting (40) and (41) into (34)

yields

s(k + 1) s(k) + M[minus T(hs(k) minus ε(k)sgn(s(k)))]

+ M1113954F minus F +(M minus I) Δ _qt(k) minus λ1113957qpt (k)1113858

minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 + 1113957qt(k)1113960 1113961p1113961

(42)

Stability conditions for the discrete sliding mode controlare given by Sarpturk [21]

sT(k)(s(k + 1) minus s(k))lt 0 (43a)

sT(k)(s(k + 1) + s(k))gt 0 (43b)

Combining (42) and (43a) we have

sT[minus MT[hs(k) + ε(k)sgn(s(k))] + M1113954F minus F

+(M minus I) Δ _qt(k) minus λ T1113957_qt(k) minus TΛt1113957qt(k)11138721113960

+ TΛt1113957qe(k) + 1113957qt(k)1113857p

minus λ1113957qpt (k)11139611113961lt 0

(44)

If s(k)gt 0 sgn(s(k)) 1 and s(k + 1) minus s(k)lt 0 thesliding gain lower bound can be given as

ε(k)gt1T

I minus Mminus 1

1113872 1113873 minus λ T 1113957_qt(k) minus Λt1113957qt(k)111387211139601113960

+Λt1113957qe(k)1113857 + 1113957qt(k)1113859p

minus λ1113957qpt (k)1113961 +

1T

I minus Mminus 1

1113872 1113873Δ _qt(k) +1T

1113954F minus Mminus 1F1113872 1113873 minus hs(k)

(45)

Employ ξM and ξF which satisfy I minus Mminus 1lt ξM1113954F minus Mminus 1Flt ξF we can obtain ε(k) as follows

ε(k)gt1TξM Δ _qt(k) minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 11138731113960

+ 1113957qt(k)1113859p

minus λ1113957qpt (k)

+1TξF + hs(k)

(46)

If s(k)lt 0 then sgn(s(k)) minus 1 and s(k + 1) minus s(k) canbe calculated by

s(k + 1) s(k) + MT[ε(k) minus hs(k)] + M1113954F minus F

+(M minus I) Δ _qt(k) minus λ1113957qpt (k) minus λ T 1113957_qt(k)111387211139601113960 minus Λt1113957qt(k) + Λt1113957qe(k)1113857 + 1113957qt(k)1113859

p1113961

(47)

Employ ε(k) given by equation (46) and we have

s(k + 1) minus s(k)gtMTh[s(k) minus s(k)]

+ MξF minus M1113954F + F1113960 1113961

+ MξM Δ _qt(k) minus λ T1113957_qt(k) minus TΛt1113957qt(k)1113960

+ TΛt1113957qe(k) + 1113957qt(k)1113859p

minus λ1113957qpt (k)

minus (M minus I) Δ _qt(k) minus λ T1113957_qt(k) minus TΛt1113957qt(k)11138721113960

+ TΛt1113957qe(k) + 1113957qt(k)1113857p

minus λ1113957qpt (k)1113961

(48)

Complexity 7

where s(k + 1) minus s(k)gt 0 is obtained by adopting ε(k)

according to equation (46))erefore equation (46) satisfiedthe Sarpturk stability (43a)

Assuming M I Sarpturk condition (3) is formulatedby combining equations (42) and (43b)

s(k + 1) + s(k) 2s(k) minus ΔF

minus T[hs(k) + ε(k)sgn(s(k))](49)

If s(k)lt 0 s(k + 1) + s(k)lt 0 we have

2s(k) minus T(hs(k) minus ε(k)) minus ΔF lt 0 (50)

Else if s(k)gt 0 s(k + 1) + s(k)gt 0 we have

2s(k) minus T(hs(k) + ε(k)) minus ΔF gt 0 (51)

)e sampling time T which satisfies the followingequation can guarantee the condition in (43b)

0ltTlt2s(k) minus ΔF

hs(k) +ε(k) (52)

From (52) we know that the stability conditions ofDTSMC given by (43a) and (43b) are guaranteed by chosenparameters ε and h defined in (38) and (39) )eorem 1 isproved

5 Simulation and Experimental Studies

51 Simulation Study )e results obtained from the sim-ulation of proposed control scheme on a two-link RLFJmanipulators shown in Figure 2 are presented in this sectionIn the simulation the aim is make the RLFJ manipulatortrack desired trajectories q1d sin(πt) and q2d cos(πt)

from the initial position [q1i q2i] [05 05] )e roboticdynamic parameters are given by

M(q) M11 M12

M21 m2l22

1113890 1113891

g(q) g1

minus m2gl2 sin q1 + q2( 111385711138571113890 1113891

C minus m2l1l2 sin q2( 1113857 2 _q1 _q2 + _q

221113872 1113873

minus m2l1l2 sin q2( 1113857 _q1 _q2

⎡⎢⎣ ⎤⎥⎦

M11 m1 + m21113857l21 + m2l

22 + 2m2l1l2 cos q2( 1113857

M12 M21 m2l22 + m2l1l2 cos q2( 1113857

g1 minus m1 + m2( 1113857gl1 sin q1( 1113857 minus m2gl2 sin q1 + q2( 1113857

(53)

where mi and li are given in Table 1 For i 1 2 mi and lidenote mass and length of the ith linkkm diag 10 10 Numrad Mm

diag 05 05 kg middot m2 Fm diag 4 05 Nm middot secradIn order to demonstrate the influence of estimation for

tracking performance a DTSMC controller without esti-mation is also simulated)eDTSMC control law is given by

u(k) 1113954M

T1113954F + _qd(k) minus _qd(k minus 1) + λ1113957q

pt (k)(1113960

minus λ T _1113957qt(k) + 1113957qt(k)]p

1113960 1113961 minus hs(k) minus ε(k)sgn(s(k))

(54)

with a sliding surface s(k) _1113957qt(k) + λ1113957qt(k)p which is dif-ferent from proposal sliding mode surface

Parameters of the proposed controller and DTSMC areselected as T 0001 s h diag 2 2 λ diag 15 15

Λt diag 5 5 and ε diag 25 25 θ(k) 015 is chosen asa constant model parameter whose initial value is θ(k) 0p 3 q 5 ξM diag 3 3 and ξF diag 1 1 In thesimulation as shown in Figure 3 we assume that the endeffector is observed by a fixed camera whose delayedmeasurements are used directly to calculate the joint po-sition )e delayed time of slow measurement is 0125 s )eprocess and measurement noise are chosen as Q 001I andR 01I )e initial position value of joints (joint1 joint2)for tracking control is given by (05 rad 05 rad) and forfusion estimator it is given by (09 rad 09 rad) )e initialvelocity value for tracking control is given by (0 0) and forestimation it is given by (025 rads 0)

)e estimate errors of position and velocity are plotted inFigure 4 )e estimate errors of parameter θ are shown inFigure 5 )e position tracking of the proposed method fortwo links is shown in Figures 6 and 7 where the comparativeresult of SMC without fusion estimator is also plotted Inconclusion the simulation results clearly indicate that theproposed approach guarantees the convergence of trackingerrors and has better tracking accuracy

52 Experimental Study To validate the applicability of theproposed control schemes a single-link flexible-joint

X

Y

l1

l2

q1

q2

End effector

Figure 2 )e schematic diagram of robotic manipulator withkinematics parameters

Table 1 )e manipulator parameters

Linki mi(kg) li(cm) qi

1 25 35 q1(0∘ sim 180∘)2 3 20 q2(minus 160∘ sim 160∘)

8 Complexity

manipulator with a fixed camera is employed as the ex-perimental plant which is shown in Figure 8 )e aim is tomake the end effector move along the desired trajectoryxd l1 cos(qd) yd l1 sin(qd) with qd (04 minus t)75 andthe initial position of joint is qi 035 rad Parameters of thesingle-link robot are measured offline l1 06m andm1 10 kg the measurement errors are Δl1 005m andΔm1 01 kg

To observe the state of end effector the calibrated camerais fixed perpendicular to the robot notion plane )e co-ordinate relation of Cartesian space and image space isshown in Figure 9 Camera measurements are obtained fromthe image sequences shown in Figure 10 We define theposition in the image coordinate y(k) [xm(k) ym(k)]

given in Figure 11 and the position in Cartesian coordinate

Join

t1

2

1

0

ndash1

ndash2

Join

t2

2

1

0

ndash1

ndash2

0 05 1 15 2 25 3 35 4 45Time (s)

Delayed measurementsNoisy measurements

0 05 1 15 2 25 3 35 4 45Time (s)

Figure 3 Asynchronous multirate sensor measurements

0 05 1 15 2 25 3 35 4 45minus03

minus02

minus01

0

01

02

03

04

Time (s)

Estim

atio

n er

ror

Velocity estimation error of link1Position estimation error of link1Velocity estimation error of link2Position estimation error of link2

Figure 4 )e estimation errors of q1 _q1 q2 _q2

0 05 1 15 2 25 3 35 4 45minus03

minus02

minus01

0

01

02

03

04

Time (s)

Estim

atio

n er

ror

Figure 5 )e estimation error of θ

0 05 1 15 2 25 3 35 4 45

minus1minus08minus06minus04minus02

002040608

1

Time (s)

Posit

ion

trac

king

of j

oint

1 (r

ad)

Reference trajectoryProposed methodSMC

Figure 6 Comparative control performances of joint1 between theproposed method and SMC

0 05 1 15 2 25 3 35 4 45

minus1minus08minus06minus04minus02

002040608

1

Time (s)

Posit

ion

trac

king

of j

oint

2

Reference trajectoryProposed methodSMC

Figure 7 Comparative control performances of joint2 between theproposed method and SMC

Complexity 9

is r(k) [xc(k) yc(k)] By using the pinhole camera modelwe calculate themapping from the joint space to image spaceIk(middot) in equation (14) as follows

f

Hc minus f

xm(k)

ym(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦

xc(k)

yc(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦

l1 cos q(k)

l2 sin q(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦ (55)

where Hc is the perpendicular distance between the cameraand the robot motion plane and f denotes the focal length of

the camera In the experiment camera parameters aref 0035m 640 times 512 pixels and Hc 08m According tothe mapping Ik(middot) the joint position can be obtained bycalculating the inverse circular trigonometric functionsarccos[xm(k)f(l1(Hc minus f))] or arcsin[ym(k)f(l1(Hcminus

f))] )e delayed vision measurements [xm(k) ym(k)] areconverted to joint measurements arccos[xm(k)f(l1(Hc minus

f))] which are shown in Figure 12 where the encodermeasurements are also plotted )e position estimation of

Fusionestimate

Imageprocessing

CameraGear

mechanism

End effector

Encoder

Controlalgorithm

PMAC-2A

Controller

Servodrive

Servomotor

Figure 8 )e experiment setup

Image space

He

Robot

Yc

Cartesian space

Xi

Xc

l1

Yi

q

Camera

Figure 9 )e coordinate relation of Cartesian space and image space

Frame 1 Frame 2 Frame 3 Frame 4 Frame 5

Frame 6 Frame 7 Frame 8 Frame 9 Frame 10

Figure 10 )e image sequences of 10 frames

10 Complexity

the joint is calculated by using the proposed fusion estimatemethod shown as red solid line in Figure 12 To show the

performance detail of the fusion estimate method Figure 13shows the error of joint position estimation

To validate the performance of the fusion estimate-basedDTSMC comparative experiments are implemented in thispaper )e proposed controller and a DTSMC without es-timator are employed in this test Parameters are selected asT 001 s h 3 λ 2 Λt 4 and ε 20 θ(k) 005p 3 q 5 ξM 02 and ξF 2 According to the com-parative position tracking performances given in Figure 14it is obvious that the proposed controller provides a superiorbehavior

6 Conclusion

A novel RLFJ manipulator tracking controller AMSFE-DTSMC is proposed in this paper based on DTSMCcoupled with asynchronousmultirate sensor fusion States ofthe manipulator are estimated by EKF-based sensor fusionalgorithm which combines asynchronous multirate mea-surements from visual and non-vision-based sensorsCompared with the non-vision measurements visualmeasurements are considered as periodic out-of-sequencemeasurements which are used to re-estimate the state Withthe state estimation DTSMC is designed by using a novelsliding surface that includes tracking error and estimationerror By using the Sarpturk inequalities boundedness of thecontrolled variables is proved )e effectiveness of theproposed approach is shown in simulation and experimentalstudies

Data Availability

No data were used to support this study

Position measurements in image space

Figure 11 )e trajectories of end effector in image space

Vision measurementsEncoder measurementsPosition estimation

04

035

03

025

02

015

01

005

0

Posit

ion

of jo

int (

rad)

0 5 10 15 20 25 30Time (times100ms)

Figure 12 )e encoder and visual measurements

0 5 10 15 20 25 30minus002

minus0015

minus001

minus0005

0

0005

001

Time (times100ms)

Posit

ion

estim

atio

n er

ror (

rad)

Estimation error

Figure 13 )e position estimation error of the joint

minus2

0

2

4 times10minus3

X ax

is tr

acki

ng er

orr

0 5 10 15 20 25 30minus001

minus0005

0

0005

001

Time (times100ms)

0 5 10 15 20 25 30Time (times100ms)

Y ax

is tr

acki

ng er

orr

Proposed method DTSMC

Proposed method DTSMC

Figure 14 Comparative position tracking errors of the endeffector

Complexity 11

Disclosure

An earlier version of multirate sensor fusion-based DTSMChas been presented in the 10th International Conference onControl and Automation (ICCA) [12] however a totallynew controller is designed by using a new sliding surfaceincluding both tracking error and estimation error in thispaper and the effectiveness of the proposed approach isproved both in the simulation and experimental studies

Conflicts of Interest

)e authors declare that they have no conflicts of interest

References

[1] S Nicosia and P Tomei ldquoA tracking controller for flexiblejoint robots using only link position feedbackrdquo IEEETransactions on Automatic Control vol 40 no 5 pp 885ndash890 1995

[2] S Nicosia P Tomei and A Tornambe ldquoA nonlinear observerfor elastic robotsrdquo IEEE Journal on Robotics and Automationvol 4 no 1 pp 45ndash52 1988

[3] W E Dixon E Zergeroglu D M Dawson andM W Hannan ldquoGlobal adaptive partial state feedbacktracking control of rigid-link flexible-joint robotsrdquo Roboticavol 18 no 3 pp 325ndash336 2000

[4] F Abdollahi H A Talebi and R V Patel ldquoA stable neuralnetwork-based observer with application to flexible-jointmanipulatorsrdquo IEEE Transactions on Neural Networks vol 17no 1 pp 118ndash129 2006

[5] J Na B Jing Y Huang G Gao and C Zhang ldquoUnknownsystem dynamics estimator for motion control of nonlinearrobotic systemsrdquo IEEE Transactions on Industrial Electronicsvol 67 no 5 pp 3850ndash3859 2020

[6] S Jean M Tomizuka and T Katou ldquoKinematic kalman filter(KKF) for robot end-effector sensingrdquo Journal of DynamicSystems Measurement and Control vol 131 no 2 pp 1ndash82009

[7] C A Lightcap and S A Banks ldquoAn extended kalman filter forreal-time estimation and control of a rigid-link flexible-jointmanipulatorrdquo IEEE Transactions on Control Systems Tech-nology vol 18 no 1 pp 91ndash103 2010

[8] J G Garcła A Robertsson J G Ortega and R JohanssonldquoSensor fusion for compliant robot motion controlrdquo IEEETransactions on Robotics vol 24 no 2 pp 430ndash441 2008

[9] R Jassemi-Zargani and D Necsulescu ldquoExtended kalmanfilter-based sensor fusion for operational space control of arobot armrdquo IEEE Transactions on Instrumentation andMeasurement vol 51 no 6 pp 1279ndash1282 2002

[10] S Wang L Tao Q Chen J Na and X Ren ldquoUSDE-basedsliding mode control for servo mechanisms with unknownsystem dynamicsrdquo IEEEASME Transactions onMechatronicsvol 25 no 2 pp 1056ndash1066 2020

[11] K Jezernik B Curk and J Harnik ldquoObserver-based slidingmode control of a robotic manipulatorrdquo Robotica vol 12no 5 pp 443ndash448 1994

[12] G Xue X Ren K Xing and Q Chen ldquoDiscrete-time slidingmode control coupled with asynchronous sensor fusion forrigid-link flexible-joint manipulatorsrdquo in Proceedings of the2013 10th IEEE International Conference on Control andAutomation (ICCA) pp 238ndash243 Hangzhou China June2013

[13] Y Gao S S Majidabad and H T Shandiz ldquoDiscrete-timebased sliding-mode control of robot manipulatorsrdquo Inter-national Journal of Intelligent Computing and Cyberneticsvol 5 no 3 pp 340ndash358 2012

[14] M L Corradini V Fossi A Giantomassi G IppolitiS Orlando and G Orlandob ldquoDiscrete time sliding modecontrol of robotic manipulators development and experi-mental validationrdquoControl Engineering Practice vol 20 no 8pp 816ndash822 2012

[15] L Wang T Chai and L Zhai ldquoNeural-network-based ter-minal sliding-mode control of robotic manipulators includingactuator dynamicsrdquo IEEE Transactions on Industrial Elec-tronics vol 56 no 9 pp 3296ndash3304 2009

[16] Y Feng X Yu and Z Man ldquoNon-singular terminal slidingmode control of rigid manipulatorsrdquo Automatica vol 38no 12 pp 2159ndash2167 2002

[17] M L Corradini and G Orlando ldquoLinear unstable plants withsaturating actuators robust stabilization by a time varyingsliding surfacerdquo Automatica vol 43 no 1 pp 88ndash94 2007

[18] Y Bar-Shalom H Huimin Chen and M Mallick ldquoOne-stepsolution for the multistep out-of-sequence-measurementproblem in trackingrdquo IEEE Transactions on Aerospace andElectronic Systems vol 40 no 1 pp 27ndash37 2004

[19] K Reif F Sonnemann and R Unbehauen ldquoAn EKF-basednonlinear observer with a prescribed degree of stabilityrdquoAutomatica vol 34 no 9 pp 1119ndash1123 1998

[20] W Gao Y Wang and A Homaifa ldquoDiscrete-time variablestructure control systemsrdquo IEEE Transactions on IndustrialElectronics vol 42 no 2 pp 117ndash122 1995

[21] S Sarpturk S Y Istefanopulos Y Istefanopulos andO Kaynak ldquoOn the stability of discrete-time sliding modecontrol systemsrdquo IEEE Transactions on Automatic Controlvol 32 no 10 pp 930ndash932 1987

12 Complexity

Page 4: Discrete-TimeSlidingModeControlCoupledwithAsynchronous

3 Asynchronous Multirate SensorFusion Estimation

According to the system model described in Section 2 therobotic link position can be estimated by using the multi-sensor system For asynchronous multirate sensors thesensor fusion method is designed to use the late measure-ments for updating the current estimated state to get a moreaccurate estimation in two steps

Step 1 when the vision measurement is unavailable therobotic state is estimated using the non-vision-basedsensors which keep the real-time estimate by recoveringthe missing information between two vision samplesStep 2 as the delayed vision measurement arrives atevery vision sample at time t tlk l 1 2 the stateshould be re-estimated to cope with the limitations ofother sensors in the absolute position measurement

31 Estimation by Using Non-Vision Sensor MeasurementsFrom Figure 1 before the (k minus 1)th vision frame is availablewe have the estimation of the state at time tlk using the non-vision sensor measurements

1113954x(lk|lk) E x(lk)|Zlkm1113966 1113967

P(lk|lk) cov x(lk)|Zlkm1113966 1113967

(12)

where Zmlk zm(1) zm(2) zm(lk) represents all en-

coder motor measurements up to time tlk Using the ex-tended Kalman filter (EKF) the estimation at time tlk via themotor encoder measurement is given as follows

1113954xminus

(lk|lk) Flklkminus 1(1113954x(lk minus 1|lk minus 1) u(lk)) (13)

P(lk|lk minus 1) AlkP(lk minus 1|lk minus 1)ATlk + WlkQ(lk minus 1)W

Tlk

(14)

Klk P(lk|lk minus 1)HTlk Hlk times P(lk|lk minus 1)H

Tlk + R(lk)1113960 1113961

minus 1

(15)

1113954x(lk|lk) 1113954xminus

(lk|lk) + Klk zm

(lk) minus hmlk 1113954x

minus(lk|lk) 0( )( 1113857

(16)

P(lk|lk) I minus KlkHlk( 1113857P(lk|lk minus 1) (17)

where Alk Wlk Hlk and Vlk are Jacobian matrices and Klk isthe correction gain vector According to above equationsthe state and covariance estimates are updated with themeasurement with a fast sampling rate

32 Re-Estimation via Vision Sensor MeasurementsSuppose that at the time tlk the vision measurement at timetκ tklminus l is obtained A new estimation 1113954x(lk|κ+) is calculatedusing the information about the delayed (k minus 1)th visionmeasurement zc(κ) 1113954x(lk|κ+) is defined withZlk Zm

lk zc(κ)1113864 1113865

1113954x lk|κ+( 1113857 E x(lk)|Zlk1113864 1113865

P lk|κ+( 1113857 cov x(lk)|Zlk1113864 1113865

(18)

)e delayed measurement zc(κ) observed by visionmeasurement will be used to correct the accumulated es-timation errors which are caused by sensors with fastsampling rate )e equations for updating the estimation1113954x(lk|lk) with the delayed vision measurement zc(κ) aregiven by

1113954x lk|κ+( 1113857 1113954x(lk|lk) + Pxz(lk κ|lk) times Pzz(κ|lk)

minus 1[z(κ) minus 1113954z(κ|lk)]

(19)

P lk|κ+( 1113857 P(lk|lk) minus Pxz(lk κ|lk) times Pzz(κ|lk)

minus 1Pxz(lk κ|lk)

T

(20)

where Pxz(lk κ|lk) represents cross covariance between1113954x(lk|lk) and 1113954z(κ|lk) Pzz(κ|lk) is the covariance of 1113954z(κ|lk)

which denotes the estimation of measurement at time tκ and

Pxz(lk κ|lk) E 1113957x(lk)1113957zc(κ)

T1113966 1113967

Pxz(lk κ|lk) E 1113957zc(κ)1113957z

c(κ)

T1113966 1113967

1113954z(κ|lk) E z(κ)|Zlk1113864 1113865

(21)

with 1113957x(lk) x(lk) minus 1113954x(lk|lk) 1113957zc(κ) zc(κ) minus 1113954z(κ|lk)]TUsing the EKF Pxz(lk κ|lk) Pxz(lk κ|lk) and 1113954z(κ|lk)

are obtained by assuming that the function Fkκ(middot) is in-vertible Define Fminus 1

lkκ(middot) Fκlk(middot) which denotes the back-ward transition function to estimate the state back from tlk totκ Since the previous state x(κ) is not affected by presentinput signal u(lk) we give the state relationship between tκand tlk by

x(κ) Fκlk(x(lk) minus ω(lk κ)) (22)

where the process noise $\omega(l_k,\kappa)$ and its covariance $Q(l_k,\kappa)$ are calculated by
$$\omega(l_k,\kappa) \approx \Gamma_{l_k}\omega(l_k,l_k-1) + A_{l_k,l_k-1}\Gamma_{l_k-1}\omega(l_k-1,l_k-2) + \cdots + A_{l_k,l_k-1}A_{l_k-1,l_k-2}\cdots A_{l_k-l+2,l_k-l+1}\Gamma_{l_k-l+1}\omega(l_k-l+1,l_k-l),$$
$$Q(l_k,\kappa) = E\{\tilde{\omega}(l_k,\kappa)\,\tilde{\omega}(l_k,\kappa)^T\}, \qquad \tilde{\omega}(l_k,\kappa) = \omega(l_k,\kappa) - \hat{\omega}(l_k,\kappa). \tag{23}$$

Figure 1: The schematic diagram of the relation between vision and non-vision measurements (system states, encoder measurements, and the delayed vision measurements at $t_{l_k-l}$, $t_{l_k-1}$, $t_{l_k}$, $t_{l_k+1}$, $t_{l_k+l}$).

The estimates in equation (19) can be determined by
$$\hat{z}(\kappa|l_k) = H^m_{l_k}(\kappa)\,\hat{x}(\kappa|l_k), \qquad \hat{x}(\kappa|l_k) = F_{\kappa,l_k}\big(\hat{x}(l_k|l_k) - \hat{\omega}(l_k,\kappa|l_k)\big). \tag{24}$$
To estimate the process noise, as in equations (13)–(17), we have
$$\hat{\omega}(l_k,\kappa|l_k) = Q(l_k,\kappa)\, H_{l_k}^T\, P_{zz}(l_k|l_k-1)^{-1}\big(z(l_k) - z(l_k-1)\big). \tag{25}$$

Then, $P_{xz}(l_k,\kappa|l_k)$ and $P_{zz}(\kappa|l_k)$ are obtained as follows:
$$P_{xz}(l_k,\kappa|l_k) = \big(P(l_k,\kappa|l_k) - P_{x\omega}(l_k,\kappa|l_k)\big) A_{\kappa,l_k}^T H(\kappa)^T, \qquad P_{zz}(\kappa|l_k) = H(\kappa) P(\kappa|l_k) H(\kappa)^T + R(\kappa), \tag{26}$$
where the covariances $P(\kappa|l_k)$ and $P_{x\omega}(l_k,\kappa|l_k)$ are derived as follows:
$$P(\kappa|l_k) = A_{\kappa,l_k}\big(P(l_k|l_k) + P_{\omega\omega}(\kappa|l_k) - P_{x\omega}(l_k,\kappa|l_k)^T\big) A_{\kappa,l_k}^T - P_{x\omega}(l_k,\kappa|l_k),$$
$$P_{\omega\omega}(\kappa|l_k) = Q(l_k,\kappa) - Q(l_k,\kappa) H_{l_k}^T P_{zz}(l_k|l_k-1)^{-1} H_{l_k} Q(l_k,\kappa)^T,$$
$$P_{x\omega}(l_k,\kappa|l_k) = Q(l_k,\kappa) - P(l_k|l_k-1) H_{l_k}^T P_{zz}(l_k|l_k-1)^{-1} H_{l_k} Q(l_k,\kappa)^T. \tag{27}$$
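As a sketch of how the delayed-frame correction of equations (19) and (20) can be applied once the retrodicted quantities of equations (24)–(27) are available, consider the following Python fragment. The cross covariance `P_xz`, the innovation covariance `P_zz`, and the retrodicted measurement `z_pred` are assumed to be computed beforehand; this is an illustration, not the authors' implementation.

```python
import numpy as np

def oosm_vision_correction(x_cur, P_cur, z_c, z_pred, P_xz, P_zz):
    """Re-estimation with a delayed vision measurement, equations (19)-(20).

    x_cur, P_cur : current fused estimate x_hat(l_k|l_k) and covariance P(l_k|l_k)
    z_c          : delayed vision measurement z_c(kappa)
    z_pred       : retrodicted measurement z_hat(kappa|l_k) from equation (24)
    P_xz, P_zz   : cross covariance and innovation covariance from (26)-(27)
    """
    gain = P_xz @ np.linalg.inv(P_zz)
    x_new = x_cur + gain @ (z_c - z_pred)   # (19)
    P_new = P_cur - gain @ P_xz.T           # (20), since gain = P_xz P_zz^{-1}
    return x_new, P_new
```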

3.3. Summary of the Fusion Estimate Method. The state of an RLFJ manipulator is estimated using the indirect measurements from asynchronous multirate sensors. The sensor fusion estimate can be implemented in a practical application using a switching mechanism driven by the sampling time. As shown in Figure 1, the update equations are chosen according to the sampling instant. The state estimate is given by

$$\hat{x}_F(k|k) = \begin{cases} F_{k,k-1}\big(\hat{x}^-(k|k)\big) + K_k\big[z^m(k) - h^m_k(\hat{x}^-(k|k), 0)\big], & t \ne t_{l_k},\\[4pt] \hat{x}(k|k) + P_{xz}(k,k-l|k)\, P_{zz}^{-1}(k-l|k)\big(z^c(k-l) - \hat{z}(k-l|k)\big), & t = t_{l_k}. \end{cases} \tag{28}$$
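A hedged sketch of the switching mechanism in equation (28) is given below; `encoder_update` and `vision_update` stand for the two routines sketched above, and `vision_arrival` is an assumed predicate that signals when a delayed camera frame becomes available.

```python
def run_fusion(n_steps, vision_arrival, encoder_update, vision_update, x0, P0):
    """Switching mechanism of equation (28), written as a loop over samples.

    vision_arrival(k) -> True when a delayed camera frame arrives at step k.
    encoder_update(x, P, k) and vision_update(x, P, k) wrap the routines above.
    """
    x, P = x0, P0
    history = []
    for k in range(n_steps):
        x, P = encoder_update(x, P, k)      # fast encoder-rate EKF step (first row of (28))
        if vision_arrival(k):               # t = t_{l_k}: delayed frame z_c(k - l) available
            x, P = vision_update(x, P, k)   # out-of-sequence re-estimation (second row of (28))
        history.append(x)
    return x, P, history
```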

Remark 3. The exponential convergence of the sensor fusion estimate can be proved in a way similar to that presented in [19], which gives a more detailed stability analysis.

4. Rigid-Link Flexible-Joint Tracking Control

In this section, the discrete-time terminal sliding mode tracking controller based on the fusion estimate (AMSFE-DTSMC) is presented for rigid-link flexible-joint manipulators whose state is estimated by the sensor fusion method described in the previous section. The controller is designed by using both the position and the velocity of the link obtained from the sensor fusion estimate. To design the novel controller, the model in (1a)–(1d) can be written as

$$q(k+1) = q(k) + T\dot{q}(k), \tag{29a}$$
$$\dot{q}(k+1) = \dot{q}(k) + M^{-1}(q,\theta)\,T u(k) - F\big(q, \dot{q}, \dot{q}_m, \dot{q}_m', T, \theta\big), \tag{29b}$$
where $F(q, \dot{q}, \dot{q}_m, \dot{q}_m', T, \theta) = F_L(q, \dot{q}, \ddot{q}, \theta)\,T + M^{-1}(q,\theta)\big[M_m\big(\dot{q}_m(k+1) - \dot{q}_m(k)\big) + F_m\dot{q}_m(k)\big]$ denotes the lumped term that collects the dynamic parameters of the links and motors.
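Read this way, one step of the discrete model (29a) and (29b) can be sketched as follows. The inverse inertia matrix and the lumped term $F(\cdot)$ are assumed to be evaluated externally, and the placement of $F(\cdot)$ outside the $M^{-1}$ term follows the reconstruction above, so this should be taken as an interpretation rather than the authors' code.

```python
import numpy as np

def rlfj_discrete_step(q, dq, u, T, M_inv, F_term):
    """One step of the discrete model (29a)-(29b).

    M_inv  : inverse inertia matrix M^{-1}(q, theta) at the current state
    F_term : lumped term F(.) collecting link and motor dynamics (see text)
    """
    q_next = q + T * dq                       # (29a)
    dq_next = dq + M_inv @ (T * u) - F_term   # (29b)
    return q_next, dq_next
```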

In order to formulate the tracking control, define the tracking error $\tilde{q}_t(k)$ and the estimation error $\tilde{q}_e(k)$ at time $t_k$ as
$$\tilde{q}_t(k) = q(k) - q_d(k), \qquad \tilde{q}_e(k) = q(k) - \hat{q}(k), \tag{30}$$
where $q_d(k)$ is the desired position and $\hat{q}(k)$ denotes the estimated position.

Define the reference velocities for tracking and estimation as
$$\dot{q}_t(k) = \dot{q}_d(k) + \Lambda_t\big(q_d(k) - \hat{q}(k)\big) = \dot{q}_d(k) - \Lambda_t\big(\tilde{q}_t(k) - \tilde{q}_e(k)\big),$$
$$\dot{q}_e(k) = \dot{\hat{q}}(k) - \Lambda_e\big(q(k) - \hat{q}(k)\big) = \dot{\hat{q}}(k) - \Lambda_e\tilde{q}_e(k), \tag{31}$$
where $\dot{\hat{q}}(k)$ denotes the estimate of $\dot{q}(k)$, and $\Lambda_t$, $\Lambda_e$ are constant diagonal matrices.


Define the filtered variable, which includes the estimation error:
$$\tilde{\dot{q}}_t(k) = \dot{q}(k) - \dot{q}_t(k) = \dot{\tilde{q}}_t(k) + \Lambda_t\big(\tilde{q}_t(k) - \tilde{q}_e(k)\big). \tag{32}$$
Consider the discrete terminal sliding surface as follows:
$$s(k) = \tilde{\dot{q}}_t(k) + \lambda\tilde{q}_t(k)^p, \tag{33}$$

where $\lambda$ is a positive constant diagonal parameter matrix and $p = p_1/p_2$, in which $p_1$ and $p_2$ are positive odd integers satisfying $p_2 > p_1$. Motivated by the reaching law presented by Gao et al. in [20], we use the reaching law for exponential discrete sliding mode control:
$$s(k+1) = (I - Th)s(k) - T\varepsilon\,\mathrm{sgn}(s(k)), \tag{34}$$

where $\mathrm{sgn}(\cdot)$ is the signum function, $\varepsilon > 0$, and $h > 0$. Since the system states cannot be measured directly, the terms containing the variables $q(k)$, $\dot{q}(k)$, and $\theta$ need to be estimated in the controller design. Assume that the estimation errors and uncertainties are bounded; then we have
$$M(q,\theta) = \hat{M}(q,\theta) + \Delta M(q,\theta), \qquad F\big(q,\dot{q},\dot{q}_m,\dot{q}_m',T,\theta\big) = \hat{F}\big(q,\dot{q},\dot{q}_m,\dot{q}_m',T,\theta\big) + \Delta F\big(q,\dot{q},\dot{q}_m,\dot{q}_m',T,\theta\big), \tag{35}$$

where $\dot{q}_m'$ denotes $\dot{q}_m(k+1)$, $\hat{M}(q,\theta) = M(\hat{q},\hat{\theta})$ and $\hat{F}(q,\dot{q},\dot{q}_m,\dot{q}_m',T,\theta) = F(\hat{q},\hat{\dot{q}},\hat{\dot{q}}_m,\hat{\dot{q}}_m',T,\hat{\theta})$ represent the estimates of the dynamic terms, and $\Delta M(q,\theta)$ and $\Delta F(q,\dot{q},\dot{q}_m,\dot{q}_m',T,\theta)$ are bounded variables including the estimation error and the system uncertainties, which satisfy
$$\|\Delta M(q,\theta)\| \le \delta_M, \qquad \big\|\Delta F\big(q,\dot{q},\dot{q}_m,\dot{q}_m',T,\theta\big)\big\| \le \delta_F, \tag{36}$$
where $\delta_M$ and $\delta_F$ are constants.
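For illustration, the terminal sliding surface (33) and the reaching law (34) can be evaluated per joint as in the following sketch. Scalar gains are used for brevity, and the sign-preserving fractional power is equivalent to the odd-integer-ratio power used in the paper; this is a sketch under those assumptions.

```python
import numpy as np

def sliding_surface(dq_tilde_t, q_tilde_t, lam, p1=3, p2=5):
    """Discrete terminal sliding surface (33): s = qdot_tilde_t + lambda * q_tilde_t^p,
    with p = p1/p2 a ratio of positive odd integers (p2 > p1)."""
    p = p1 / p2
    # sign-preserving fractional power keeps the surface well defined for negative errors
    return dq_tilde_t + lam * np.sign(q_tilde_t) * np.abs(q_tilde_t) ** p

def reaching_law(s, T, h, eps):
    """Exponential discrete reaching law (34): s(k+1) = (I - T h) s(k) - T eps sgn(s(k))."""
    return (1.0 - T * h) * s - T * eps * np.sign(s)
```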

Theorem 1. Consider the rigid-link flexible-joint manipulator system described by equations (29a) and (29b) and the discrete sliding manifold described by equation (33). Using the reaching law in equation (34), a stable control law is designed as
$$u(k) = \frac{\hat{M}(q,\theta)}{T}\Big[\hat{F}\big(q,\dot{q},\dot{q}_m,\dot{q}_m',T,\theta\big) + \dot{q}_t(k) - \dot{q}_t(k-1) + \lambda\tilde{q}_t^p(k) - \lambda\big[T\big(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\big) + \tilde{q}_t(k)\big]^p - T\big(hs(k) + \varepsilon(k)\,\mathrm{sgn}(s(k))\big)\Big], \tag{37}$$

where $\varepsilon$ and $h$ are positive diagonal matrices which satisfy the following inequalities:
$$\varepsilon(k) > \frac{1}{T}\xi_M\Big\|\Delta\dot{q}_t(k) - \lambda\big[T\big(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\big) + \tilde{q}_t(k)\big]^p - \lambda\tilde{q}_t^p(k)\Big\| + \frac{1}{T}\xi_F + hs(k), \tag{38}$$
$$T < \frac{2s(k) - \Delta F}{hs(k) + \varepsilon(k)}. \tag{39}$$
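A per-joint evaluation of the reconstructed control law (37) can be sketched as follows; all gains are treated as scalars and all error signals are assumed to come from the fusion estimate of Section 3. This is an illustrative sketch of the expression above, not the authors' code.

```python
import numpy as np

def amsfe_dtsmc_control(M_hat, F_hat, dqt, dqt_prev, q_tilde_t, dq_tilde_t, q_tilde_e,
                        s, T, lam, Lam_t, h, eps, p=3.0 / 5.0):
    """Control law (37), written per joint with scalar gains (illustrative)."""
    frac_pow = lambda x: np.sign(x) * np.abs(x) ** p          # odd-ratio power, sign preserving
    inner = T * (dq_tilde_t - Lam_t * q_tilde_t + Lam_t * q_tilde_e) + q_tilde_t
    u = (M_hat / T) * (F_hat + dqt - dqt_prev
                       + lam * frac_pow(q_tilde_t)
                       - lam * frac_pow(inner)
                       - T * (h * s + eps * np.sign(s)))
    return u
```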

Proof. Substituting the control law (37) into the rigid-link flexible-joint system equations (29a) and (29b), the error dynamics are obtained:

$$\tilde{q}_t(k+1) = \tilde{q}_t(k) + T\big(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\big), \tag{40}$$
$$\begin{aligned}\tilde{\dot{q}}_t(k+1) ={}& \tilde{\dot{q}}_t(k) + \lambda\tilde{q}_t(k)^p - \lambda\big[T\big(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\big) + \tilde{q}_t(k)\big]^p\\ &+ \big(M^{-1}\hat{M} - I\big)\Big[\hat{F} + \Delta\dot{q}_t(k) - \lambda\big[T\big(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\big) + \tilde{q}_t(k)\big]^p - \lambda\tilde{q}_t^p(k)\Big]\\ &+ M^{-1}\hat{M}\big[\Delta F - T\big(hs(k) + \varepsilon(k)\,\mathrm{sgn}(s(k))\big)\big]. \end{aligned} \tag{41}$$


For simplicity, define $\bar{M} = M^{-1}\hat{M}$, which is invertible, so that $M^{-1} = \bar{M}\hat{M}^{-1}$. Substituting (40) and (41) into (34) yields
$$s(k+1) = s(k) + \bar{M}\big[-T\big(hs(k) + \varepsilon(k)\,\mathrm{sgn}(s(k))\big)\big] + \bar{M}\hat{F} - F + \big(\bar{M} - I\big)\Big[\Delta\dot{q}_t(k) - \lambda\tilde{q}_t^p(k) - \lambda\big[T\big(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\big) + \tilde{q}_t(k)\big]^p\Big]. \tag{42}$$

Stability conditions for discrete sliding mode control are given by Sarpturk et al. [21]:
$$s^T(k)\big(s(k+1) - s(k)\big) < 0, \tag{43a}$$
$$s^T(k)\big(s(k+1) + s(k)\big) > 0. \tag{43b}$$

Combining (42) and (43a), we have
$$s^T(k)\Big[-\bar{M}T\big[hs(k) + \varepsilon(k)\,\mathrm{sgn}(s(k))\big] + \bar{M}\hat{F} - F + \big(\bar{M} - I\big)\Big[\Delta\dot{q}_t(k) - \lambda\big[T\tilde{\dot{q}}_t(k) - T\Lambda_t\tilde{q}_t(k) + T\Lambda_t\tilde{q}_e(k) + \tilde{q}_t(k)\big]^p - \lambda\tilde{q}_t^p(k)\Big]\Big] < 0. \tag{44}$$

If $s(k) > 0$, then $\mathrm{sgn}(s(k)) = 1$ and $s(k+1) - s(k) < 0$, so the lower bound on the sliding gain can be given as
$$\varepsilon(k) > \frac{1}{T}\big(I - \bar{M}^{-1}\big)\Big[-\lambda\big[T\big(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\big) + \tilde{q}_t(k)\big]^p - \lambda\tilde{q}_t^p(k)\Big] + \frac{1}{T}\big(I - \bar{M}^{-1}\big)\Delta\dot{q}_t(k) + \frac{1}{T}\big(\hat{F} - \bar{M}^{-1}F\big) - hs(k). \tag{45}$$

Employing $\xi_M$ and $\xi_F$, which satisfy $\|I - \bar{M}^{-1}\| < \xi_M$ and $\|\hat{F} - \bar{M}^{-1}F\| < \xi_F$, we can obtain $\varepsilon(k)$ as follows:
$$\varepsilon(k) > \frac{1}{T}\xi_M\Big\|\Delta\dot{q}_t(k) - \lambda\big[T\big(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\big) + \tilde{q}_t(k)\big]^p - \lambda\tilde{q}_t^p(k)\Big\| + \frac{1}{T}\xi_F + hs(k). \tag{46}$$

If $s(k) < 0$, then $\mathrm{sgn}(s(k)) = -1$ and $s(k+1) - s(k)$ can be calculated by
$$s(k+1) = s(k) + \bar{M}T\big[\varepsilon(k) - hs(k)\big] + \bar{M}\hat{F} - F + \big(\bar{M} - I\big)\Big[\Delta\dot{q}_t(k) - \lambda\tilde{q}_t^p(k) - \lambda\big[T\big(\tilde{\dot{q}}_t(k) - \Lambda_t\tilde{q}_t(k) + \Lambda_t\tilde{q}_e(k)\big) + \tilde{q}_t(k)\big]^p\Big]. \tag{47}$$

Employing $\varepsilon(k)$ given by equation (46), we have
$$\begin{aligned} s(k+1) - s(k) >{}& \bar{M}Th\big[s(k) - s(k)\big] + \big[\bar{M}\xi_F - \bar{M}\hat{F} + F\big]\\ &+ \bar{M}\xi_M\Big[\Delta\dot{q}_t(k) - \lambda\big[T\tilde{\dot{q}}_t(k) - T\Lambda_t\tilde{q}_t(k) + T\Lambda_t\tilde{q}_e(k) + \tilde{q}_t(k)\big]^p - \lambda\tilde{q}_t^p(k)\Big]\\ &- \big(\bar{M} - I\big)\Big[\Delta\dot{q}_t(k) - \lambda\big[T\tilde{\dot{q}}_t(k) - T\Lambda_t\tilde{q}_t(k) + T\Lambda_t\tilde{q}_e(k) + \tilde{q}_t(k)\big]^p - \lambda\tilde{q}_t^p(k)\Big], \end{aligned} \tag{48}$$


where $s(k+1) - s(k) > 0$ is obtained by adopting $\varepsilon(k)$ according to equation (46). Therefore, equation (46) satisfies the Sarpturk stability condition (43a).

Assuming $\bar{M} = I$, the second Sarpturk condition is formulated by combining equations (42) and (43b):
$$s(k+1) + s(k) = 2s(k) - \Delta F - T\big[hs(k) + \varepsilon(k)\,\mathrm{sgn}(s(k))\big]. \tag{49}$$

If $s(k) < 0$ and $s(k+1) + s(k) < 0$, we have
$$2s(k) - T\big(hs(k) - \varepsilon(k)\big) - \Delta F < 0. \tag{50}$$
Else, if $s(k) > 0$ and $s(k+1) + s(k) > 0$, we have
$$2s(k) - T\big(hs(k) + \varepsilon(k)\big) - \Delta F > 0. \tag{51}$$
A sampling time $T$ that satisfies the following inequality guarantees the condition in (43b):
$$0 < T < \frac{2s(k) - \Delta F}{hs(k) + \varepsilon(k)}. \tag{52}$$

From (52), we know that the stability conditions of the DTSMC given by (43a) and (43b) are guaranteed by choosing the parameters $\varepsilon$, $h$, and $T$ according to (38) and (39). Theorem 1 is thus proved.
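As a quick numerical illustration of condition (52), with hypothetical values of $s(k)$ and $\Delta F$ that are not taken from the paper:

```python
def sampling_time_ok(T, s, delta_F, h, eps):
    """Check the sampling-time condition (52): 0 < T < (2 s - Delta_F) / (h s + eps)."""
    bound = (2.0 * s - delta_F) / (h * s + eps)
    return 0.0 < T < bound

# Illustrative numbers only: with s = 0.1, delta_F = 0.05, h = 2, eps = 25
# the bound is 0.15 / 25.2, roughly 0.006 s, so the simulation step T = 0.001 s
# used in Section 5.1 satisfies the condition.
print(sampling_time_ok(0.001, s=0.1, delta_F=0.05, h=2.0, eps=25.0))
```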

5. Simulation and Experimental Studies

5.1. Simulation Study. The results obtained from the simulation of the proposed control scheme on the two-link RLFJ manipulator shown in Figure 2 are presented in this section. In the simulation, the aim is to make the RLFJ manipulator track the desired trajectories $q_{1d} = \sin(\pi t)$ and $q_{2d} = \cos(\pi t)$ from the initial position $[q_{1i}, q_{2i}] = [0.5, 0.5]$. The robot dynamic parameters are given by

$$M(q) = \begin{bmatrix} M_{11} & M_{12}\\ M_{21} & m_2 l_2^2 \end{bmatrix}, \qquad g(q) = \begin{bmatrix} g_1\\ -m_2 g l_2\sin(q_1 + q_2) \end{bmatrix}, \qquad C = \begin{bmatrix} -m_2 l_1 l_2\sin(q_2)\big(2\dot{q}_1\dot{q}_2 + \dot{q}_2^2\big)\\ -m_2 l_1 l_2\sin(q_2)\,\dot{q}_1\dot{q}_2 \end{bmatrix},$$
$$M_{11} = (m_1 + m_2)l_1^2 + m_2 l_2^2 + 2m_2 l_1 l_2\cos(q_2), \qquad M_{12} = M_{21} = m_2 l_2^2 + m_2 l_1 l_2\cos(q_2),$$
$$g_1 = -(m_1 + m_2)g l_1\sin(q_1) - m_2 g l_2\sin(q_1 + q_2), \tag{53}$$

where $m_i$ and $l_i$ ($i = 1, 2$) denote the mass and length of the $i$th link and are given in Table 1, $k_m = \mathrm{diag}\{10, 10\}$ N·m/rad, $M_m = \mathrm{diag}\{0.5, 0.5\}$ kg·m², and $F_m = \mathrm{diag}\{4, 0.5\}$ N·m·s/rad. In order to demonstrate the influence of the estimation on tracking performance, a DTSMC controller without estimation is also simulated. The DTSMC control law is given by

$$u(k) = \frac{\hat{M}}{T}\Big[\hat{F} + \dot{q}_d(k) - \dot{q}_d(k-1) + \lambda\tilde{q}_t^p(k) - \lambda\big[T\dot{\tilde{q}}_t(k) + \tilde{q}_t(k)\big]^p - hs(k) - \varepsilon(k)\,\mathrm{sgn}(s(k))\Big], \tag{54}$$

with the sliding surface $s(k) = \dot{\tilde{q}}_t(k) + \lambda\tilde{q}_t(k)^p$, which differs from the proposed sliding mode surface.

Parameters of the proposed controller and the DTSMC are selected as $T = 0.001$ s, $h = \mathrm{diag}\{2, 2\}$, $\lambda = \mathrm{diag}\{15, 15\}$, $\Lambda_t = \mathrm{diag}\{5, 5\}$, and $\varepsilon = \mathrm{diag}\{25, 25\}$; $\theta(k) = 0.15$ is chosen as a constant model parameter whose initial value in the estimator is set to 0; $p = 3$, $q = 5$, $\xi_M = \mathrm{diag}\{3, 3\}$, and $\xi_F = \mathrm{diag}\{1, 1\}$. In the simulation, as shown in Figure 3, we assume that the end effector is observed by a fixed camera whose delayed measurements are used directly to calculate the joint position. The delay of the slow measurement is 0.125 s. The process and measurement noise covariances are chosen as $Q = 0.01I$ and $R = 0.1I$. The initial joint positions (joint1, joint2) for tracking control are (0.5 rad, 0.5 rad), while for the fusion estimator they are (0.9 rad, 0.9 rad). The initial joint velocities for tracking control are (0, 0), while for estimation they are (0.25 rad/s, 0).
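For reference, the simulated two-link dynamics of equation (53) can be coded directly from the matrices above. The default parameters below follow the reading of Table 1 used in this rewrite (masses of 2.5 kg and 3 kg, lengths of 0.35 m and 0.20 m) and should be treated as assumptions rather than the authors' exact values.

```python
import numpy as np

def two_link_dynamics(q, dq, m1=2.5, m2=3.0, l1=0.35, l2=0.20, g=9.81):
    """Inertia matrix M(q), Coriolis vector C, and gravity vector g(q) of
    equation (53) for the simulated two-link arm (parameters per Table 1)."""
    c2, s2 = np.cos(q[1]), np.sin(q[1])
    M11 = (m1 + m2) * l1**2 + m2 * l2**2 + 2 * m2 * l1 * l2 * c2
    M12 = m2 * l2**2 + m2 * l1 * l2 * c2
    M = np.array([[M11, M12],
                  [M12, m2 * l2**2]])
    C = np.array([-m2 * l1 * l2 * s2 * (2 * dq[0] * dq[1] + dq[1]**2),
                  -m2 * l1 * l2 * s2 * dq[0] * dq[1]])
    g1 = -(m1 + m2) * g * l1 * np.sin(q[0]) - m2 * g * l2 * np.sin(q[0] + q[1])
    gvec = np.array([g1, -m2 * g * l2 * np.sin(q[0] + q[1])])
    return M, C, gvec
```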

The estimation errors of position and velocity are plotted in Figure 4, and the estimation error of the parameter $\theta$ is shown in Figure 5. The position tracking of the proposed method for the two links is shown in Figures 6 and 7, where the comparative result of the SMC without the fusion estimator is also plotted. In conclusion, the simulation results clearly indicate that the proposed approach guarantees the convergence of the tracking errors and achieves better tracking accuracy.

5.2. Experimental Study. To validate the applicability of the proposed control scheme, a single-link flexible-joint manipulator with a fixed camera is employed as the experimental plant, as shown in Figure 8.

Figure 2: The schematic diagram of the robotic manipulator with kinematic parameters.

Table 1: The manipulator parameters.
Link i   m_i (kg)   l_i (cm)   q_i
1        2.5        35         q_1 (0° ~ 180°)
2        3          20         q_2 (−160° ~ 160°)


The aim is to make the end effector move along the desired trajectory $x_d = l_1\cos(q_d)$, $y_d = l_1\sin(q_d)$ with $q_d = 0.4 - t/7.5$ rad, starting from the initial joint position $q_i = 0.35$ rad. Parameters of the single-link robot are measured offline as $l_1 = 0.6$ m and $m_1 = 10$ kg; the corresponding measurement errors are $\Delta l_1 = 0.05$ m and $\Delta m_1 = 0.1$ kg.

To observe the state of the end effector, the calibrated camera is fixed perpendicular to the robot motion plane. The coordinate relation between Cartesian space and image space is shown in Figure 9. Camera measurements are obtained from the image sequences shown in Figure 10. We define the position in image coordinates as $y(k) = [x_m(k), y_m(k)]$, given in Figure 11, and the position in Cartesian coordinates as $r(k) = [x_c(k), y_c(k)]$.

Figure 3: Asynchronous multirate sensor measurements (delayed and noisy measurements of joint1 and joint2).
Figure 4: The estimation errors of $q_1$, $\dot{q}_1$, $q_2$, $\dot{q}_2$.
Figure 5: The estimation error of $\theta$.
Figure 6: Comparative control performance of joint1 between the proposed method and SMC.
Figure 7: Comparative control performance of joint2 between the proposed method and SMC.

By using the pinhole camera model, we calculate the mapping from the joint space to the image space, $I_k(\cdot)$, as follows:
$$\frac{f}{H_c - f}\begin{bmatrix} x_m(k)\\ y_m(k) \end{bmatrix} = \begin{bmatrix} x_c(k)\\ y_c(k) \end{bmatrix} = \begin{bmatrix} l_1\cos q(k)\\ l_1\sin q(k) \end{bmatrix}, \tag{55}$$
where $H_c$ is the perpendicular distance between the camera and the robot motion plane and $f$ denotes the focal length of the camera. In the experiment, the camera parameters are $f = 0.035$ m, $640 \times 512$ pixels, and $H_c = 0.8$ m. According to the mapping $I_k(\cdot)$, the joint position can be obtained by evaluating the inverse trigonometric functions $\arccos\!\big[x_m(k)f/(l_1(H_c - f))\big]$ or $\arcsin\!\big[y_m(k)f/(l_1(H_c - f))\big]$. The delayed vision measurements $[x_m(k), y_m(k)]$ are converted to joint measurements $\arccos\!\big[x_m(k)f/(l_1(H_c - f))\big]$, which are shown in Figure 12, where the encoder measurements are also plotted.
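A small sketch of this image-to-joint conversion, using the camera parameters quoted above, is given below. The clipping of the arccos/arcsin arguments is an added numerical safeguard, not part of the paper.

```python
import numpy as np

def image_to_joint(x_m, y_m, f=0.035, H_c=0.8, l1=0.6):
    """Convert a (delayed) image-plane measurement to a joint angle using the
    pinhole mapping of equation (55) and the inverse described in the text."""
    scale = f / (H_c - f)                 # maps image coordinates to Cartesian coordinates
    x_c, y_c = scale * x_m, scale * y_m
    # either inverse trigonometric form can be used; the arccos form is used in Figure 12
    q_from_x = np.arccos(np.clip(x_c / l1, -1.0, 1.0))
    q_from_y = np.arcsin(np.clip(y_c / l1, -1.0, 1.0))
    return q_from_x, q_from_y
```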

Figure 8: The experimental setup (servo motor, servo drive, gear mechanism, encoder, end effector, fixed camera, image processing, fusion estimate, and the control algorithm running on a PMAC-2A controller).
Figure 9: The coordinate relation between Cartesian space and image space.
Figure 10: The image sequence of 10 frames.

The position estimate of the joint is calculated using the proposed fusion estimation method and is shown as the red solid line in Figure 12. To show the performance of the fusion estimation method in detail, Figure 13 plots the joint position estimation error.

To validate the performance of the fusion estimate-based DTSMC, comparative experiments are carried out. The proposed controller and a DTSMC without the estimator are employed in this test. Parameters are selected as $T = 0.01$ s, $h = 3$, $\lambda = 2$, $\Lambda_t = 4$, $\varepsilon = 20$, $\theta(k) = 0.05$, $p = 3$, $q = 5$, $\xi_M = 0.2$, and $\xi_F = 2$. According to the comparative position tracking performances given in Figure 14, it is obvious that the proposed controller provides superior behavior.

6. Conclusion

A novel RLFJ manipulator tracking controller, AMSFE-DTSMC, is proposed in this paper based on DTSMC coupled with asynchronous multirate sensor fusion. The states of the manipulator are estimated by an EKF-based sensor fusion algorithm which combines asynchronous multirate measurements from visual and non-vision sensors. Compared with the non-vision measurements, the visual measurements are treated as periodic out-of-sequence measurements which are used to re-estimate the state. With the state estimate, the DTSMC is designed using a novel sliding surface that includes both the tracking error and the estimation error. Boundedness of the controlled variables is proved by using the Sarpturk inequalities. The effectiveness of the proposed approach is shown in simulation and experimental studies.

Data Availability

No data were used to support this study.

Figure 11: The trajectories of the end effector in image space (position measurements in image space).
Figure 12: The encoder measurements, the delayed vision measurements, and the fused position estimate of the joint.
Figure 13: The position estimation error of the joint.
Figure 14: Comparative position tracking errors of the end effector (X and Y axes) for the proposed method and the DTSMC.

Disclosure

An earlier version of the multirate sensor fusion-based DTSMC was presented at the 10th IEEE International Conference on Control and Automation (ICCA) [12]; however, a completely new controller is designed in this paper by using a new sliding surface that includes both the tracking error and the estimation error, and the effectiveness of the proposed approach is demonstrated in both simulation and experimental studies.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] S. Nicosia and P. Tomei, "A tracking controller for flexible joint robots using only link position feedback," IEEE Transactions on Automatic Control, vol. 40, no. 5, pp. 885–890, 1995.
[2] S. Nicosia, P. Tomei, and A. Tornambe, "A nonlinear observer for elastic robots," IEEE Journal on Robotics and Automation, vol. 4, no. 1, pp. 45–52, 1988.
[3] W. E. Dixon, E. Zergeroglu, D. M. Dawson, and M. W. Hannan, "Global adaptive partial state feedback tracking control of rigid-link flexible-joint robots," Robotica, vol. 18, no. 3, pp. 325–336, 2000.
[4] F. Abdollahi, H. A. Talebi, and R. V. Patel, "A stable neural network-based observer with application to flexible-joint manipulators," IEEE Transactions on Neural Networks, vol. 17, no. 1, pp. 118–129, 2006.
[5] J. Na, B. Jing, Y. Huang, G. Gao, and C. Zhang, "Unknown system dynamics estimator for motion control of nonlinear robotic systems," IEEE Transactions on Industrial Electronics, vol. 67, no. 5, pp. 3850–3859, 2020.
[6] S. Jean, M. Tomizuka, and T. Katou, "Kinematic Kalman filter (KKF) for robot end-effector sensing," Journal of Dynamic Systems, Measurement, and Control, vol. 131, no. 2, pp. 1–8, 2009.
[7] C. A. Lightcap and S. A. Banks, "An extended Kalman filter for real-time estimation and control of a rigid-link flexible-joint manipulator," IEEE Transactions on Control Systems Technology, vol. 18, no. 1, pp. 91–103, 2010.
[8] J. G. García, A. Robertsson, J. G. Ortega, and R. Johansson, "Sensor fusion for compliant robot motion control," IEEE Transactions on Robotics, vol. 24, no. 2, pp. 430–441, 2008.
[9] R. Jassemi-Zargani and D. Necsulescu, "Extended Kalman filter-based sensor fusion for operational space control of a robot arm," IEEE Transactions on Instrumentation and Measurement, vol. 51, no. 6, pp. 1279–1282, 2002.
[10] S. Wang, L. Tao, Q. Chen, J. Na, and X. Ren, "USDE-based sliding mode control for servo mechanisms with unknown system dynamics," IEEE/ASME Transactions on Mechatronics, vol. 25, no. 2, pp. 1056–1066, 2020.
[11] K. Jezernik, B. Curk, and J. Harnik, "Observer-based sliding mode control of a robotic manipulator," Robotica, vol. 12, no. 5, pp. 443–448, 1994.
[12] G. Xue, X. Ren, K. Xing, and Q. Chen, "Discrete-time sliding mode control coupled with asynchronous sensor fusion for rigid-link flexible-joint manipulators," in Proceedings of the 2013 10th IEEE International Conference on Control and Automation (ICCA), pp. 238–243, Hangzhou, China, June 2013.
[13] Y. Gao, S. S. Majidabad, and H. T. Shandiz, "Discrete-time based sliding-mode control of robot manipulators," International Journal of Intelligent Computing and Cybernetics, vol. 5, no. 3, pp. 340–358, 2012.
[14] M. L. Corradini, V. Fossi, A. Giantomassi, G. Ippoliti, S. Longhi, and G. Orlando, "Discrete time sliding mode control of robotic manipulators: development and experimental validation," Control Engineering Practice, vol. 20, no. 8, pp. 816–822, 2012.
[15] L. Wang, T. Chai, and L. Zhai, "Neural-network-based terminal sliding-mode control of robotic manipulators including actuator dynamics," IEEE Transactions on Industrial Electronics, vol. 56, no. 9, pp. 3296–3304, 2009.
[16] Y. Feng, X. Yu, and Z. Man, "Non-singular terminal sliding mode control of rigid manipulators," Automatica, vol. 38, no. 12, pp. 2159–2167, 2002.
[17] M. L. Corradini and G. Orlando, "Linear unstable plants with saturating actuators: robust stabilization by a time varying sliding surface," Automatica, vol. 43, no. 1, pp. 88–94, 2007.
[18] Y. Bar-Shalom, H. Chen, and M. Mallick, "One-step solution for the multistep out-of-sequence-measurement problem in tracking," IEEE Transactions on Aerospace and Electronic Systems, vol. 40, no. 1, pp. 27–37, 2004.
[19] K. Reif, F. Sonnemann, and R. Unbehauen, "An EKF-based nonlinear observer with a prescribed degree of stability," Automatica, vol. 34, no. 9, pp. 1119–1123, 1998.
[20] W. Gao, Y. Wang, and A. Homaifa, "Discrete-time variable structure control systems," IEEE Transactions on Industrial Electronics, vol. 42, no. 2, pp. 117–122, 1995.
[21] S. Z. Sarpturk, Y. Istefanopulos, and O. Kaynak, "On the stability of discrete-time sliding mode control systems," IEEE Transactions on Automatic Control, vol. 32, no. 10, pp. 930–932, 1987.

12 Complexity

Page 5: Discrete-TimeSlidingModeControlCoupledwithAsynchronous

ω(lk κ) asymp Γlkω(lk lk minus 1) + Alklkminus 1Γlkminus 1ω(lk minus 1 lk minus 2)

+ middot middot middot + Alklkminus 1Alkminus 1lkminus 2 times middot middot middot times Alkminus l+2lkminus l+1Γlkminus l+1ω(lk minus l + 1 lk minus l)

Q(kl κ) E 1113957ω(lk κ)1113957ω(lk κ)T

1113966 1113967

1113957ω(lk κ) ω(lk κ) minus 1113954ω(lk κ)

(23)

)e estimation in equation (19) can be determined by

1113954z(κ|lk) Hmlk(κ)1113954x(κ|lk)

1113954x(κ|lk) Fκlk(1113954x(lk|lk) minus 1113954ω(lk κ|lk))(24)

To estimate the process noise as in equations (13)ndash(17)we have

1113954ω(lk κ|lk) Q(lk κ)HTlkPzz(lk|lk minus 1)

minus 1times(z(lk) minus z(lk minus 1))

(25)

)en Pxz(lk κ|lk) and Pzz(κ|lk) are obtained as follows

Pxz(lk κ|lk) P(lk κ|lk) minus Pxω(lk κ|lk)( 1113857ATκlkH(κ)

T

Pzz(κ|lk) H(κ)P(κ|lk)H(κ)T

+ R(κ)

(26)

where the covariances P(κ|lk) and Pxω(lk κ|lk) are derivedas follows

P(κ|lk) Aκlk P(lk|lk) + Pωω(κ|lk) minus Pxω(lk κ|lk)T

1113872 1113873ATκlk minus Pxω(lk κ|lk)

Pωω(κ|lk) Q(lk κ) minus Q(lk κ)HTlk times Pzz(lk|lk minus 1)

minus 1HlkQ(lk κ)

T

Pxω(lk κ|lk) Q(lk κ) minus P(lk|lk minus 1)HTlk times Pzz(lk|lk minus 1)

minus 1HlkQ(lk κ)

T

(27)

33 Summary of the Fusion Estimate Method )e state of aRLFJ manipulator is estimated using the indirect mea-surements from asynchronous multirate sensors )e sensorfusion estimate can be implemented in the practical

application using a switching mechanism in accordance withsampling time As shown in Figure 1 the update equationsare chosen at the different sampling times )e state esti-mation can be given by

1113954xF(k|k) Fkkminus 1 1113954x

minus(k|k)( + Kk z

m(k) minus h

mk 1113954x

minus(k|k) 0( )1113858 1113859 tne tlk

1113954x(k|k) + Pxz(k k minus l|k)Pminus 1zz(k minus l|k) z

c(k minus l) minus 1113954z(k minus l|k)( 1113857 t tlk

⎧⎨

⎩ (28)

Remark 3 )e exponential convergence of the sensorfusion estimate can be proved in the similar way which ispresented in [19] which has more detailed conclusion onthe stability

4 Rigid-Link Flexible-Joint Tracking Control

In this section the discrete-time terminal sliding modetracking control-based fusion estimation (AMSFE-DTSMC)is presented for rigid-link flexible manipulators whose stateis estimated by the sensor fusion method described in theprevious section )e controller is designed by using bothposition and velocity of the link from the sensor fusionestimate To design the novel controller the model in(1a)ndash(1d) can written as

q(k + 1) q(k) + T _q(k) (29a)

_q(k + 1) _q(k) + Mminus 1

(q θ)Tu(k) minus F q q _qm _qm T θ1113872 1113873

(29b)

where F(q q _qm _qm T θ) FL(q q _q θ)T + Mminus 1(q θ)[Mm

( _qm(k + 1) minus _qm(k)) + Fm _qm(k)] denotes the variable whichincludes dynamic parameters of links and motors

In order to formulate the tracking control define thetracking error 1113957qt(k) and estimation error 1113957qe(k) at time tk as

1113957qt(k) q(k) minus qd(k)

1113957qe(k) q(k) minus 1113954q(k)(30)

where qd(k) is the desired position and 1113954q(k) denotes theestimation position

Define the reference velocity for tracking and estimation

_qt(k) _qd(k) + Λt qd(k) minus 1113954q(k)( 1113857

_qd(k) minus Λt 1113957qt(k) minus 1113957qe(k)( 1113857

_qe(k) _1113954q(k) minus Λe(q(k) minus 1113954q(k))

_1113954q(k) minus Λe1113957qe(k)

(31)

where _1113954q(k) denotes the estimation of _q(k) and ΛtΛe areconstants and diagonal matrices

Complexity 5

Define the filtered variables including the estimationerror

1113957_qt(k) _q(k) minus _qt(k) _1113957qt(k) + Λt 1113957qt(k) minus 1113957qe(k)( 1113857 (32)

Consider the discrete terminal sliding surface as follows

s(k) 1113957_qt(k) + λ1113957qt(k)p (33)

where λ is a positive constant diagonal parameter matrix andp p1p2 in which p1 and p2 are positive odd integerssatisfying p2 gtp1 Motivated by the reaching law presented

by Weibing Gao et al in [20] we use the reaching law forexponential discrete sliding mode control as follows

s(k + 1) (I minus Th)s(k) minus Tεsgn(s(k)) (34)

where sgn(middot) is the signum function εgt 0 and hgt 0Since system states cannot be measured directly pa-

rameters which contain variables q(k) _q(k) and θ need tobe estimated in the controller design Assume that the es-timate error and uncertainties are bounded and we have

M(q θ) 1113954M(q θ) + ΔM(q θ)

F q q _qm _qm T θ1113872 1113873 1113954MF q q _qm _qm T θ1113872 1113873 + ΔF q q _qm _qm T θ1113872 1113873(35)

where _qm denotes _qm(k + 1) 1113954M(q θ) M(1113954q 1113954θ) and1113954F(q q _qm _qm T θ) F(1113954q 1113954q 1113954_qm 1113954_qm T 1113954θ) represent the es-timation of dynamic parameters and ΔM(q θ) andΔF(q q _qm _qm T θ) are bounded variables including theestimation error and system uncertainties which satisfy

ΔM(q θ)le δM

ΔF q q _qm _qm T θ1113872 1113873

le δF (36)

where δM and δF are constants

Theorem 1 Consider the rigid-link flexible-joint manipu-lator system described by equations (29a) and (29b) and the

discrete sliding manifold described by equation (33) by usingthe reaching law in equation (34) stable control law isdesigned as

u(k) 1113954M(q θ)

T1113954F q q _qm _qm T θ1113872 1113873 + _qt(k) minus _qt(k minus 1) + λ1113957q

pt (k)1113960

minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 + 1113957qt(k)1113960 1113961p

minus T(hs(k) + ε(k)sgn(s(k)))]

(37)

where ε and h are positive diagonal matrices which satisfy thefollowing inequations

ε(k)gt1TξM Δ _qt(k) minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 + 1113957qt(k)1113960 1113961

pminus λ1113957q

pt (k)

+1TξF + hs(k)

(38)

Tlt2s(k) minus ΔF

hs(k) +ε(k) (39)

Proof Substituting control law (37) in the rigid-link flex-ible-joint system equations (29a) and (29b) the error dy-namics are obtained

1113957qt(k + 1) 1113957qt(k) + T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 (40)

1113957_qt(k + 1) 1113957_qt(k) + λ1113957qt(k)p

minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 + 1113957qt(k)1113960 1113961p

+ Mminus 1 1113954M minus I1113872 1113873 1113954F + Δ _qt(k) minus λ T 1113957_qt(k)111387211139601113960

minus Λt1113957qt(k) + Λt1113957qe(k)1113857 + 1113957qt(k)1113859p

minus λ1113957qpt (k)1113961

+ Mminus 1 1113954M[ΔF minus T(hs(k) + ε(k)sgn(s(k)))]

(41)

6 Complexity

For simplicity define M Mminus 1 1113954M which is invertibleand Mminus 1 1113954M

minus 1M Substituting (40) and (41) into (34)

yields

s(k + 1) s(k) + M[minus T(hs(k) minus ε(k)sgn(s(k)))]

+ M1113954F minus F +(M minus I) Δ _qt(k) minus λ1113957qpt (k)1113858

minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 + 1113957qt(k)1113960 1113961p1113961

(42)

Stability conditions for the discrete sliding mode controlare given by Sarpturk [21]

sT(k)(s(k + 1) minus s(k))lt 0 (43a)

sT(k)(s(k + 1) + s(k))gt 0 (43b)

Combining (42) and (43a) we have

sT[minus MT[hs(k) + ε(k)sgn(s(k))] + M1113954F minus F

+(M minus I) Δ _qt(k) minus λ T1113957_qt(k) minus TΛt1113957qt(k)11138721113960

+ TΛt1113957qe(k) + 1113957qt(k)1113857p

minus λ1113957qpt (k)11139611113961lt 0

(44)

If s(k)gt 0 sgn(s(k)) 1 and s(k + 1) minus s(k)lt 0 thesliding gain lower bound can be given as

ε(k)gt1T

I minus Mminus 1

1113872 1113873 minus λ T 1113957_qt(k) minus Λt1113957qt(k)111387211139601113960

+Λt1113957qe(k)1113857 + 1113957qt(k)1113859p

minus λ1113957qpt (k)1113961 +

1T

I minus Mminus 1

1113872 1113873Δ _qt(k) +1T

1113954F minus Mminus 1F1113872 1113873 minus hs(k)

(45)

Employ ξM and ξF which satisfy I minus Mminus 1lt ξM1113954F minus Mminus 1Flt ξF we can obtain ε(k) as follows

ε(k)gt1TξM Δ _qt(k) minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 11138731113960

+ 1113957qt(k)1113859p

minus λ1113957qpt (k)

+1TξF + hs(k)

(46)

If s(k)lt 0 then sgn(s(k)) minus 1 and s(k + 1) minus s(k) canbe calculated by

s(k + 1) s(k) + MT[ε(k) minus hs(k)] + M1113954F minus F

+(M minus I) Δ _qt(k) minus λ1113957qpt (k) minus λ T 1113957_qt(k)111387211139601113960 minus Λt1113957qt(k) + Λt1113957qe(k)1113857 + 1113957qt(k)1113859

p1113961

(47)

Employ ε(k) given by equation (46) and we have

s(k + 1) minus s(k)gtMTh[s(k) minus s(k)]

+ MξF minus M1113954F + F1113960 1113961

+ MξM Δ _qt(k) minus λ T1113957_qt(k) minus TΛt1113957qt(k)1113960

+ TΛt1113957qe(k) + 1113957qt(k)1113859p

minus λ1113957qpt (k)

minus (M minus I) Δ _qt(k) minus λ T1113957_qt(k) minus TΛt1113957qt(k)11138721113960

+ TΛt1113957qe(k) + 1113957qt(k)1113857p

minus λ1113957qpt (k)1113961

(48)

Complexity 7

where s(k + 1) minus s(k)gt 0 is obtained by adopting ε(k)

according to equation (46))erefore equation (46) satisfiedthe Sarpturk stability (43a)

Assuming M I Sarpturk condition (3) is formulatedby combining equations (42) and (43b)

s(k + 1) + s(k) 2s(k) minus ΔF

minus T[hs(k) + ε(k)sgn(s(k))](49)

If s(k)lt 0 s(k + 1) + s(k)lt 0 we have

2s(k) minus T(hs(k) minus ε(k)) minus ΔF lt 0 (50)

Else if s(k)gt 0 s(k + 1) + s(k)gt 0 we have

2s(k) minus T(hs(k) + ε(k)) minus ΔF gt 0 (51)

)e sampling time T which satisfies the followingequation can guarantee the condition in (43b)

0ltTlt2s(k) minus ΔF

hs(k) +ε(k) (52)

From (52) we know that the stability conditions ofDTSMC given by (43a) and (43b) are guaranteed by chosenparameters ε and h defined in (38) and (39) )eorem 1 isproved

5 Simulation and Experimental Studies

51 Simulation Study )e results obtained from the sim-ulation of proposed control scheme on a two-link RLFJmanipulators shown in Figure 2 are presented in this sectionIn the simulation the aim is make the RLFJ manipulatortrack desired trajectories q1d sin(πt) and q2d cos(πt)

from the initial position [q1i q2i] [05 05] )e roboticdynamic parameters are given by

M(q) M11 M12

M21 m2l22

1113890 1113891

g(q) g1

minus m2gl2 sin q1 + q2( 111385711138571113890 1113891

C minus m2l1l2 sin q2( 1113857 2 _q1 _q2 + _q

221113872 1113873

minus m2l1l2 sin q2( 1113857 _q1 _q2

⎡⎢⎣ ⎤⎥⎦

M11 m1 + m21113857l21 + m2l

22 + 2m2l1l2 cos q2( 1113857

M12 M21 m2l22 + m2l1l2 cos q2( 1113857

g1 minus m1 + m2( 1113857gl1 sin q1( 1113857 minus m2gl2 sin q1 + q2( 1113857

(53)

where mi and li are given in Table 1 For i 1 2 mi and lidenote mass and length of the ith linkkm diag 10 10 Numrad Mm

diag 05 05 kg middot m2 Fm diag 4 05 Nm middot secradIn order to demonstrate the influence of estimation for

tracking performance a DTSMC controller without esti-mation is also simulated)eDTSMC control law is given by

u(k) 1113954M

T1113954F + _qd(k) minus _qd(k minus 1) + λ1113957q

pt (k)(1113960

minus λ T _1113957qt(k) + 1113957qt(k)]p

1113960 1113961 minus hs(k) minus ε(k)sgn(s(k))

(54)

with a sliding surface s(k) _1113957qt(k) + λ1113957qt(k)p which is dif-ferent from proposal sliding mode surface

Parameters of the proposed controller and DTSMC areselected as T 0001 s h diag 2 2 λ diag 15 15

Λt diag 5 5 and ε diag 25 25 θ(k) 015 is chosen asa constant model parameter whose initial value is θ(k) 0p 3 q 5 ξM diag 3 3 and ξF diag 1 1 In thesimulation as shown in Figure 3 we assume that the endeffector is observed by a fixed camera whose delayedmeasurements are used directly to calculate the joint po-sition )e delayed time of slow measurement is 0125 s )eprocess and measurement noise are chosen as Q 001I andR 01I )e initial position value of joints (joint1 joint2)for tracking control is given by (05 rad 05 rad) and forfusion estimator it is given by (09 rad 09 rad) )e initialvelocity value for tracking control is given by (0 0) and forestimation it is given by (025 rads 0)

)e estimate errors of position and velocity are plotted inFigure 4 )e estimate errors of parameter θ are shown inFigure 5 )e position tracking of the proposed method fortwo links is shown in Figures 6 and 7 where the comparativeresult of SMC without fusion estimator is also plotted Inconclusion the simulation results clearly indicate that theproposed approach guarantees the convergence of trackingerrors and has better tracking accuracy

52 Experimental Study To validate the applicability of theproposed control schemes a single-link flexible-joint

X

Y

l1

l2

q1

q2

End effector

Figure 2 )e schematic diagram of robotic manipulator withkinematics parameters

Table 1 )e manipulator parameters

Linki mi(kg) li(cm) qi

1 25 35 q1(0∘ sim 180∘)2 3 20 q2(minus 160∘ sim 160∘)

8 Complexity

manipulator with a fixed camera is employed as the ex-perimental plant which is shown in Figure 8 )e aim is tomake the end effector move along the desired trajectoryxd l1 cos(qd) yd l1 sin(qd) with qd (04 minus t)75 andthe initial position of joint is qi 035 rad Parameters of thesingle-link robot are measured offline l1 06m andm1 10 kg the measurement errors are Δl1 005m andΔm1 01 kg

To observe the state of end effector the calibrated camerais fixed perpendicular to the robot notion plane )e co-ordinate relation of Cartesian space and image space isshown in Figure 9 Camera measurements are obtained fromthe image sequences shown in Figure 10 We define theposition in the image coordinate y(k) [xm(k) ym(k)]

given in Figure 11 and the position in Cartesian coordinate

Join

t1

2

1

0

ndash1

ndash2

Join

t2

2

1

0

ndash1

ndash2

0 05 1 15 2 25 3 35 4 45Time (s)

Delayed measurementsNoisy measurements

0 05 1 15 2 25 3 35 4 45Time (s)

Figure 3 Asynchronous multirate sensor measurements

0 05 1 15 2 25 3 35 4 45minus03

minus02

minus01

0

01

02

03

04

Time (s)

Estim

atio

n er

ror

Velocity estimation error of link1Position estimation error of link1Velocity estimation error of link2Position estimation error of link2

Figure 4 )e estimation errors of q1 _q1 q2 _q2

0 05 1 15 2 25 3 35 4 45minus03

minus02

minus01

0

01

02

03

04

Time (s)

Estim

atio

n er

ror

Figure 5 )e estimation error of θ

0 05 1 15 2 25 3 35 4 45

minus1minus08minus06minus04minus02

002040608

1

Time (s)

Posit

ion

trac

king

of j

oint

1 (r

ad)

Reference trajectoryProposed methodSMC

Figure 6 Comparative control performances of joint1 between theproposed method and SMC

0 05 1 15 2 25 3 35 4 45

minus1minus08minus06minus04minus02

002040608

1

Time (s)

Posit

ion

trac

king

of j

oint

2

Reference trajectoryProposed methodSMC

Figure 7 Comparative control performances of joint2 between theproposed method and SMC

Complexity 9

is r(k) [xc(k) yc(k)] By using the pinhole camera modelwe calculate themapping from the joint space to image spaceIk(middot) in equation (14) as follows

f

Hc minus f

xm(k)

ym(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦

xc(k)

yc(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦

l1 cos q(k)

l2 sin q(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦ (55)

where Hc is the perpendicular distance between the cameraand the robot motion plane and f denotes the focal length of

the camera In the experiment camera parameters aref 0035m 640 times 512 pixels and Hc 08m According tothe mapping Ik(middot) the joint position can be obtained bycalculating the inverse circular trigonometric functionsarccos[xm(k)f(l1(Hc minus f))] or arcsin[ym(k)f(l1(Hcminus

f))] )e delayed vision measurements [xm(k) ym(k)] areconverted to joint measurements arccos[xm(k)f(l1(Hc minus

f))] which are shown in Figure 12 where the encodermeasurements are also plotted )e position estimation of

Fusionestimate

Imageprocessing

CameraGear

mechanism

End effector

Encoder

Controlalgorithm

PMAC-2A

Controller

Servodrive

Servomotor

Figure 8 )e experiment setup

Image space

He

Robot

Yc

Cartesian space

Xi

Xc

l1

Yi

q

Camera

Figure 9 )e coordinate relation of Cartesian space and image space

Frame 1 Frame 2 Frame 3 Frame 4 Frame 5

Frame 6 Frame 7 Frame 8 Frame 9 Frame 10

Figure 10 )e image sequences of 10 frames

10 Complexity

the joint is calculated by using the proposed fusion estimatemethod shown as red solid line in Figure 12 To show the

performance detail of the fusion estimate method Figure 13shows the error of joint position estimation

To validate the performance of the fusion estimate-basedDTSMC comparative experiments are implemented in thispaper )e proposed controller and a DTSMC without es-timator are employed in this test Parameters are selected asT 001 s h 3 λ 2 Λt 4 and ε 20 θ(k) 005p 3 q 5 ξM 02 and ξF 2 According to the com-parative position tracking performances given in Figure 14it is obvious that the proposed controller provides a superiorbehavior

6 Conclusion

A novel RLFJ manipulator tracking controller AMSFE-DTSMC is proposed in this paper based on DTSMCcoupled with asynchronousmultirate sensor fusion States ofthe manipulator are estimated by EKF-based sensor fusionalgorithm which combines asynchronous multirate mea-surements from visual and non-vision-based sensorsCompared with the non-vision measurements visualmeasurements are considered as periodic out-of-sequencemeasurements which are used to re-estimate the state Withthe state estimation DTSMC is designed by using a novelsliding surface that includes tracking error and estimationerror By using the Sarpturk inequalities boundedness of thecontrolled variables is proved )e effectiveness of theproposed approach is shown in simulation and experimentalstudies

Data Availability

No data were used to support this study

Position measurements in image space

Figure 11 )e trajectories of end effector in image space

Vision measurementsEncoder measurementsPosition estimation

04

035

03

025

02

015

01

005

0

Posit

ion

of jo

int (

rad)

0 5 10 15 20 25 30Time (times100ms)

Figure 12 )e encoder and visual measurements

0 5 10 15 20 25 30minus002

minus0015

minus001

minus0005

0

0005

001

Time (times100ms)

Posit

ion

estim

atio

n er

ror (

rad)

Estimation error

Figure 13 )e position estimation error of the joint

minus2

0

2

4 times10minus3

X ax

is tr

acki

ng er

orr

0 5 10 15 20 25 30minus001

minus0005

0

0005

001

Time (times100ms)

0 5 10 15 20 25 30Time (times100ms)

Y ax

is tr

acki

ng er

orr

Proposed method DTSMC

Proposed method DTSMC

Figure 14 Comparative position tracking errors of the endeffector

Complexity 11

Disclosure

An earlier version of multirate sensor fusion-based DTSMChas been presented in the 10th International Conference onControl and Automation (ICCA) [12] however a totallynew controller is designed by using a new sliding surfaceincluding both tracking error and estimation error in thispaper and the effectiveness of the proposed approach isproved both in the simulation and experimental studies

Conflicts of Interest

)e authors declare that they have no conflicts of interest

References

[1] S Nicosia and P Tomei ldquoA tracking controller for flexiblejoint robots using only link position feedbackrdquo IEEETransactions on Automatic Control vol 40 no 5 pp 885ndash890 1995

[2] S Nicosia P Tomei and A Tornambe ldquoA nonlinear observerfor elastic robotsrdquo IEEE Journal on Robotics and Automationvol 4 no 1 pp 45ndash52 1988

[3] W E Dixon E Zergeroglu D M Dawson andM W Hannan ldquoGlobal adaptive partial state feedbacktracking control of rigid-link flexible-joint robotsrdquo Roboticavol 18 no 3 pp 325ndash336 2000

[4] F Abdollahi H A Talebi and R V Patel ldquoA stable neuralnetwork-based observer with application to flexible-jointmanipulatorsrdquo IEEE Transactions on Neural Networks vol 17no 1 pp 118ndash129 2006

[5] J Na B Jing Y Huang G Gao and C Zhang ldquoUnknownsystem dynamics estimator for motion control of nonlinearrobotic systemsrdquo IEEE Transactions on Industrial Electronicsvol 67 no 5 pp 3850ndash3859 2020

[6] S Jean M Tomizuka and T Katou ldquoKinematic kalman filter(KKF) for robot end-effector sensingrdquo Journal of DynamicSystems Measurement and Control vol 131 no 2 pp 1ndash82009

[7] C A Lightcap and S A Banks ldquoAn extended kalman filter forreal-time estimation and control of a rigid-link flexible-jointmanipulatorrdquo IEEE Transactions on Control Systems Tech-nology vol 18 no 1 pp 91ndash103 2010

[8] J G Garcła A Robertsson J G Ortega and R JohanssonldquoSensor fusion for compliant robot motion controlrdquo IEEETransactions on Robotics vol 24 no 2 pp 430ndash441 2008

[9] R Jassemi-Zargani and D Necsulescu ldquoExtended kalmanfilter-based sensor fusion for operational space control of arobot armrdquo IEEE Transactions on Instrumentation andMeasurement vol 51 no 6 pp 1279ndash1282 2002

[10] S Wang L Tao Q Chen J Na and X Ren ldquoUSDE-basedsliding mode control for servo mechanisms with unknownsystem dynamicsrdquo IEEEASME Transactions onMechatronicsvol 25 no 2 pp 1056ndash1066 2020

[11] K Jezernik B Curk and J Harnik ldquoObserver-based slidingmode control of a robotic manipulatorrdquo Robotica vol 12no 5 pp 443ndash448 1994

[12] G Xue X Ren K Xing and Q Chen ldquoDiscrete-time slidingmode control coupled with asynchronous sensor fusion forrigid-link flexible-joint manipulatorsrdquo in Proceedings of the2013 10th IEEE International Conference on Control andAutomation (ICCA) pp 238ndash243 Hangzhou China June2013

[13] Y Gao S S Majidabad and H T Shandiz ldquoDiscrete-timebased sliding-mode control of robot manipulatorsrdquo Inter-national Journal of Intelligent Computing and Cyberneticsvol 5 no 3 pp 340ndash358 2012

[14] M L Corradini V Fossi A Giantomassi G IppolitiS Orlando and G Orlandob ldquoDiscrete time sliding modecontrol of robotic manipulators development and experi-mental validationrdquoControl Engineering Practice vol 20 no 8pp 816ndash822 2012

[15] L Wang T Chai and L Zhai ldquoNeural-network-based ter-minal sliding-mode control of robotic manipulators includingactuator dynamicsrdquo IEEE Transactions on Industrial Elec-tronics vol 56 no 9 pp 3296ndash3304 2009

[16] Y Feng X Yu and Z Man ldquoNon-singular terminal slidingmode control of rigid manipulatorsrdquo Automatica vol 38no 12 pp 2159ndash2167 2002

[17] M L Corradini and G Orlando ldquoLinear unstable plants withsaturating actuators robust stabilization by a time varyingsliding surfacerdquo Automatica vol 43 no 1 pp 88ndash94 2007

[18] Y Bar-Shalom H Huimin Chen and M Mallick ldquoOne-stepsolution for the multistep out-of-sequence-measurementproblem in trackingrdquo IEEE Transactions on Aerospace andElectronic Systems vol 40 no 1 pp 27ndash37 2004

[19] K Reif F Sonnemann and R Unbehauen ldquoAn EKF-basednonlinear observer with a prescribed degree of stabilityrdquoAutomatica vol 34 no 9 pp 1119ndash1123 1998

[20] W Gao Y Wang and A Homaifa ldquoDiscrete-time variablestructure control systemsrdquo IEEE Transactions on IndustrialElectronics vol 42 no 2 pp 117ndash122 1995

[21] S Sarpturk S Y Istefanopulos Y Istefanopulos andO Kaynak ldquoOn the stability of discrete-time sliding modecontrol systemsrdquo IEEE Transactions on Automatic Controlvol 32 no 10 pp 930ndash932 1987

12 Complexity

Page 6: Discrete-TimeSlidingModeControlCoupledwithAsynchronous

Define the filtered variables including the estimationerror

1113957_qt(k) _q(k) minus _qt(k) _1113957qt(k) + Λt 1113957qt(k) minus 1113957qe(k)( 1113857 (32)

Consider the discrete terminal sliding surface as follows

s(k) 1113957_qt(k) + λ1113957qt(k)p (33)

where λ is a positive constant diagonal parameter matrix andp p1p2 in which p1 and p2 are positive odd integerssatisfying p2 gtp1 Motivated by the reaching law presented

by Weibing Gao et al in [20] we use the reaching law forexponential discrete sliding mode control as follows

s(k + 1) (I minus Th)s(k) minus Tεsgn(s(k)) (34)

where sgn(middot) is the signum function εgt 0 and hgt 0Since system states cannot be measured directly pa-

rameters which contain variables q(k) _q(k) and θ need tobe estimated in the controller design Assume that the es-timate error and uncertainties are bounded and we have

M(q θ) 1113954M(q θ) + ΔM(q θ)

F q q _qm _qm T θ1113872 1113873 1113954MF q q _qm _qm T θ1113872 1113873 + ΔF q q _qm _qm T θ1113872 1113873(35)

where _qm denotes _qm(k + 1) 1113954M(q θ) M(1113954q 1113954θ) and1113954F(q q _qm _qm T θ) F(1113954q 1113954q 1113954_qm 1113954_qm T 1113954θ) represent the es-timation of dynamic parameters and ΔM(q θ) andΔF(q q _qm _qm T θ) are bounded variables including theestimation error and system uncertainties which satisfy

ΔM(q θ)le δM

ΔF q q _qm _qm T θ1113872 1113873

le δF (36)

where δM and δF are constants

Theorem 1 Consider the rigid-link flexible-joint manipu-lator system described by equations (29a) and (29b) and the

discrete sliding manifold described by equation (33) by usingthe reaching law in equation (34) stable control law isdesigned as

u(k) 1113954M(q θ)

T1113954F q q _qm _qm T θ1113872 1113873 + _qt(k) minus _qt(k minus 1) + λ1113957q

pt (k)1113960

minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 + 1113957qt(k)1113960 1113961p

minus T(hs(k) + ε(k)sgn(s(k)))]

(37)

where ε and h are positive diagonal matrices which satisfy thefollowing inequations

ε(k)gt1TξM Δ _qt(k) minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 + 1113957qt(k)1113960 1113961

pminus λ1113957q

pt (k)

+1TξF + hs(k)

(38)

Tlt2s(k) minus ΔF

hs(k) +ε(k) (39)

Proof Substituting control law (37) in the rigid-link flex-ible-joint system equations (29a) and (29b) the error dy-namics are obtained

1113957qt(k + 1) 1113957qt(k) + T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 (40)

1113957_qt(k + 1) 1113957_qt(k) + λ1113957qt(k)p

minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 + 1113957qt(k)1113960 1113961p

+ Mminus 1 1113954M minus I1113872 1113873 1113954F + Δ _qt(k) minus λ T 1113957_qt(k)111387211139601113960

minus Λt1113957qt(k) + Λt1113957qe(k)1113857 + 1113957qt(k)1113859p

minus λ1113957qpt (k)1113961

+ Mminus 1 1113954M[ΔF minus T(hs(k) + ε(k)sgn(s(k)))]

(41)

6 Complexity

For simplicity define M Mminus 1 1113954M which is invertibleand Mminus 1 1113954M

minus 1M Substituting (40) and (41) into (34)

yields

s(k + 1) s(k) + M[minus T(hs(k) minus ε(k)sgn(s(k)))]

+ M1113954F minus F +(M minus I) Δ _qt(k) minus λ1113957qpt (k)1113858

minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 1113873 + 1113957qt(k)1113960 1113961p1113961

(42)

Stability conditions for the discrete sliding mode controlare given by Sarpturk [21]

sT(k)(s(k + 1) minus s(k))lt 0 (43a)

sT(k)(s(k + 1) + s(k))gt 0 (43b)

Combining (42) and (43a) we have

sT[minus MT[hs(k) + ε(k)sgn(s(k))] + M1113954F minus F

+(M minus I) Δ _qt(k) minus λ T1113957_qt(k) minus TΛt1113957qt(k)11138721113960

+ TΛt1113957qe(k) + 1113957qt(k)1113857p

minus λ1113957qpt (k)11139611113961lt 0

(44)

If s(k)gt 0 sgn(s(k)) 1 and s(k + 1) minus s(k)lt 0 thesliding gain lower bound can be given as

ε(k)gt1T

I minus Mminus 1

1113872 1113873 minus λ T 1113957_qt(k) minus Λt1113957qt(k)111387211139601113960

+Λt1113957qe(k)1113857 + 1113957qt(k)1113859p

minus λ1113957qpt (k)1113961 +

1T

I minus Mminus 1

1113872 1113873Δ _qt(k) +1T

1113954F minus Mminus 1F1113872 1113873 minus hs(k)

(45)

Employ ξM and ξF which satisfy I minus Mminus 1lt ξM1113954F minus Mminus 1Flt ξF we can obtain ε(k) as follows

ε(k)gt1TξM Δ _qt(k) minus λ T 1113957_qt(k) minus Λt1113957qt(k) + Λt1113957qe(k)1113872 11138731113960

+ 1113957qt(k)1113859p

minus λ1113957qpt (k)

+1TξF + hs(k)

(46)

If s(k)lt 0 then sgn(s(k)) minus 1 and s(k + 1) minus s(k) canbe calculated by

s(k + 1) s(k) + MT[ε(k) minus hs(k)] + M1113954F minus F

+(M minus I) Δ _qt(k) minus λ1113957qpt (k) minus λ T 1113957_qt(k)111387211139601113960 minus Λt1113957qt(k) + Λt1113957qe(k)1113857 + 1113957qt(k)1113859

p1113961

(47)

Employ ε(k) given by equation (46) and we have

s(k + 1) minus s(k)gtMTh[s(k) minus s(k)]

+ MξF minus M1113954F + F1113960 1113961

+ MξM Δ _qt(k) minus λ T1113957_qt(k) minus TΛt1113957qt(k)1113960

+ TΛt1113957qe(k) + 1113957qt(k)1113859p

minus λ1113957qpt (k)

minus (M minus I) Δ _qt(k) minus λ T1113957_qt(k) minus TΛt1113957qt(k)11138721113960

+ TΛt1113957qe(k) + 1113957qt(k)1113857p

minus λ1113957qpt (k)1113961

(48)

Complexity 7

where s(k + 1) minus s(k)gt 0 is obtained by adopting ε(k)

according to equation (46))erefore equation (46) satisfiedthe Sarpturk stability (43a)

Assuming M I Sarpturk condition (3) is formulatedby combining equations (42) and (43b)

s(k + 1) + s(k) 2s(k) minus ΔF

minus T[hs(k) + ε(k)sgn(s(k))](49)

If s(k)lt 0 s(k + 1) + s(k)lt 0 we have

2s(k) minus T(hs(k) minus ε(k)) minus ΔF lt 0 (50)

Else if s(k)gt 0 s(k + 1) + s(k)gt 0 we have

2s(k) minus T(hs(k) + ε(k)) minus ΔF gt 0 (51)

)e sampling time T which satisfies the followingequation can guarantee the condition in (43b)

0ltTlt2s(k) minus ΔF

hs(k) +ε(k) (52)

From (52) we know that the stability conditions ofDTSMC given by (43a) and (43b) are guaranteed by chosenparameters ε and h defined in (38) and (39) )eorem 1 isproved

5 Simulation and Experimental Studies

51 Simulation Study )e results obtained from the sim-ulation of proposed control scheme on a two-link RLFJmanipulators shown in Figure 2 are presented in this sectionIn the simulation the aim is make the RLFJ manipulatortrack desired trajectories q1d sin(πt) and q2d cos(πt)

from the initial position [q1i q2i] [05 05] )e roboticdynamic parameters are given by

M(q) M11 M12

M21 m2l22

1113890 1113891

g(q) g1

minus m2gl2 sin q1 + q2( 111385711138571113890 1113891

C minus m2l1l2 sin q2( 1113857 2 _q1 _q2 + _q

221113872 1113873

minus m2l1l2 sin q2( 1113857 _q1 _q2

⎡⎢⎣ ⎤⎥⎦

M11 m1 + m21113857l21 + m2l

22 + 2m2l1l2 cos q2( 1113857

M12 M21 m2l22 + m2l1l2 cos q2( 1113857

g1 minus m1 + m2( 1113857gl1 sin q1( 1113857 minus m2gl2 sin q1 + q2( 1113857

(53)

where mi and li are given in Table 1 For i 1 2 mi and lidenote mass and length of the ith linkkm diag 10 10 Numrad Mm

diag 05 05 kg middot m2 Fm diag 4 05 Nm middot secradIn order to demonstrate the influence of estimation for

tracking performance a DTSMC controller without esti-mation is also simulated)eDTSMC control law is given by

u(k) 1113954M

T1113954F + _qd(k) minus _qd(k minus 1) + λ1113957q

pt (k)(1113960

minus λ T _1113957qt(k) + 1113957qt(k)]p

1113960 1113961 minus hs(k) minus ε(k)sgn(s(k))

(54)

with a sliding surface s(k) _1113957qt(k) + λ1113957qt(k)p which is dif-ferent from proposal sliding mode surface

Parameters of the proposed controller and DTSMC areselected as T 0001 s h diag 2 2 λ diag 15 15

Λt diag 5 5 and ε diag 25 25 θ(k) 015 is chosen asa constant model parameter whose initial value is θ(k) 0p 3 q 5 ξM diag 3 3 and ξF diag 1 1 In thesimulation as shown in Figure 3 we assume that the endeffector is observed by a fixed camera whose delayedmeasurements are used directly to calculate the joint po-sition )e delayed time of slow measurement is 0125 s )eprocess and measurement noise are chosen as Q 001I andR 01I )e initial position value of joints (joint1 joint2)for tracking control is given by (05 rad 05 rad) and forfusion estimator it is given by (09 rad 09 rad) )e initialvelocity value for tracking control is given by (0 0) and forestimation it is given by (025 rads 0)

)e estimate errors of position and velocity are plotted inFigure 4 )e estimate errors of parameter θ are shown inFigure 5 )e position tracking of the proposed method fortwo links is shown in Figures 6 and 7 where the comparativeresult of SMC without fusion estimator is also plotted Inconclusion the simulation results clearly indicate that theproposed approach guarantees the convergence of trackingerrors and has better tracking accuracy

52 Experimental Study To validate the applicability of theproposed control schemes a single-link flexible-joint

X

Y

l1

l2

q1

q2

End effector

Figure 2 )e schematic diagram of robotic manipulator withkinematics parameters

Table 1 )e manipulator parameters

Linki mi(kg) li(cm) qi

1 25 35 q1(0∘ sim 180∘)2 3 20 q2(minus 160∘ sim 160∘)

8 Complexity

manipulator with a fixed camera is employed as the ex-perimental plant which is shown in Figure 8 )e aim is tomake the end effector move along the desired trajectoryxd l1 cos(qd) yd l1 sin(qd) with qd (04 minus t)75 andthe initial position of joint is qi 035 rad Parameters of thesingle-link robot are measured offline l1 06m andm1 10 kg the measurement errors are Δl1 005m andΔm1 01 kg

To observe the state of end effector the calibrated camerais fixed perpendicular to the robot notion plane )e co-ordinate relation of Cartesian space and image space isshown in Figure 9 Camera measurements are obtained fromthe image sequences shown in Figure 10 We define theposition in the image coordinate y(k) [xm(k) ym(k)]

given in Figure 11 and the position in Cartesian coordinate

Join

t1

2

1

0

ndash1

ndash2

Join

t2

2

1

0

ndash1

ndash2

0 05 1 15 2 25 3 35 4 45Time (s)

Delayed measurementsNoisy measurements

0 05 1 15 2 25 3 35 4 45Time (s)

Figure 3 Asynchronous multirate sensor measurements

0 05 1 15 2 25 3 35 4 45minus03

minus02

minus01

0

01

02

03

04

Time (s)

Estim

atio

n er

ror

Velocity estimation error of link1Position estimation error of link1Velocity estimation error of link2Position estimation error of link2

Figure 4 )e estimation errors of q1 _q1 q2 _q2

0 05 1 15 2 25 3 35 4 45minus03

minus02

minus01

0

01

02

03

04

Time (s)

Estim

atio

n er

ror

Figure 5 )e estimation error of θ

0 05 1 15 2 25 3 35 4 45

minus1minus08minus06minus04minus02

002040608

1

Time (s)

Posit

ion

trac

king

of j

oint

1 (r

ad)

Reference trajectoryProposed methodSMC

Figure 6 Comparative control performances of joint1 between theproposed method and SMC

0 05 1 15 2 25 3 35 4 45

minus1minus08minus06minus04minus02

002040608

1

Time (s)

Posit

ion

trac

king

of j

oint

2

Reference trajectoryProposed methodSMC

Figure 7 Comparative control performances of joint2 between theproposed method and SMC

Complexity 9

is r(k) [xc(k) yc(k)] By using the pinhole camera modelwe calculate themapping from the joint space to image spaceIk(middot) in equation (14) as follows

f

Hc minus f

xm(k)

ym(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦

xc(k)

yc(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦

l1 cos q(k)

l2 sin q(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦ (55)

where Hc is the perpendicular distance between the camera and the robot motion plane, and f denotes the focal length of the camera. In the experiment, the camera parameters are f = 0.035 m, a resolution of 640 × 512 pixels, and Hc = 0.8 m. According to the mapping Ik(·), the joint position can be obtained by calculating the inverse trigonometric functions arccos[xm(k)f/(l1(Hc − f))] or arcsin[ym(k)f/(l1(Hc − f))]. The delayed vision measurements [xm(k), ym(k)] are converted to joint measurements arccos[xm(k)f/(l1(Hc − f))], which are shown in Figure 12, where the encoder measurements are also plotted.
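The conversion from a delayed image-plane sample to a joint measurement can be sketched as follows. The placement of the fractions in the garbled expressions above is reconstructed, so the exact formulas should be read as an assumption kept consistent with equation (55).

```python
import math

# Camera parameters and link length from the experiment.
f, Hc, l1 = 0.035, 0.8, 0.6             # focal length (m), camera height (m), link length (m)

def image_from_joint(q):
    """Forward mapping of equation (55): joint angle -> image-plane coordinates."""
    xm = l1 * math.cos(q) * (Hc - f) / f
    ym = l1 * math.sin(q) * (Hc - f) / f
    return xm, ym

def joint_from_image(xm, ym):
    """Inverse mapping used to turn a delayed vision sample into a joint measurement."""
    q_from_x = math.acos(xm * f / (l1 * (Hc - f)))
    q_from_y = math.asin(ym * f / (l1 * (Hc - f)))
    return q_from_x, q_from_y

xm, ym = image_from_joint(0.35)          # initial joint position of the experiment (rad)
print(joint_from_image(xm, ym))          # both expressions recover approximately 0.35 rad
```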

Figure 8: The experimental setup (camera, image processing, and fusion estimate; control algorithm running on a PMAC-2A controller; servo drive, servo motor, gear mechanism, encoder, and end effector).

Figure 9: The coordinate relation of Cartesian space and image space (camera at height Hc above the robot plane; image axes Xi, Yi; Cartesian axes Xc, Yc; link l1 at joint angle q).

Figure 10: The image sequences of 10 frames.

The position estimation of the joint is calculated by using the proposed fusion estimation method and is shown as the red solid line in Figure 12. To show the performance of the fusion estimation method in detail, Figure 13 plots the joint position estimation error.

To validate the performance of the fusion estimate-based DTSMC, comparative experiments are implemented. The proposed controller and a DTSMC without the estimator are employed in this test. Parameters are selected as T = 0.01 s, h = 3, λ = 2, Λt = 4, ε = 20, θ(k) = 0.05, p = 3, q = 5, ξM = 0.2, and ξF = 2. According to the comparative position tracking performances given in Figure 14, it is obvious that the proposed controller provides superior behavior.
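For intuition about the boundedness argument, the toy scalar iteration below applies the discrete reaching relation s(k+1) = s(k) − T[h s(k) + ε sgn(s(k))] − ΔF that follows from equation (49) with M = I, using the experimental gains T = 0.01 s, h = 3, and ε = 20. The constant ΔF here is an arbitrary small value chosen only for illustration. The sliding variable decays geometrically and then chatters inside a bounded quasi-sliding band, which is the kind of bounded behavior the Sarpturk conditions guarantee.

```python
import numpy as np

T, h, eps = 0.01, 3.0, 20.0          # experimental DTSMC gains from the text
dF = 0.01                            # small lumped uncertainty, chosen only for illustration

s = 1.0                              # initial value of the sliding variable (arbitrary)
trace = []
for k in range(400):
    s = s - T * (h * s + eps * np.sign(s)) - dF   # reaching relation from (49) with M = I
    trace.append(s)

print("first samples :", np.round(trace[:5], 3))
print("band after 2 s:", round(min(trace[200:]), 3), "to", round(max(trace[200:]), 3))
```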

6. Conclusion

A novel RLFJ manipulator tracking controller, AMSFE-DTSMC, is proposed in this paper based on DTSMC coupled with asynchronous multirate sensor fusion. The states of the manipulator are estimated by an EKF-based sensor fusion algorithm which combines asynchronous multirate measurements from visual and non-vision-based sensors. Compared with the non-vision measurements, the visual measurements are treated as periodic out-of-sequence measurements that are used to re-estimate the state. With this state estimate, the DTSMC is designed by using a novel sliding surface that includes both the tracking error and the estimation error. By using the Sarpturk inequalities, boundedness of the controlled variables is proved. The effectiveness of the proposed approach is shown in simulation and experimental studies.

Data Availability

No data were used to support this study.

Figure 11: The trajectories of the end effector in image space (position measurements in image space).

Figure 12: The encoder and visual measurements (vision measurements, encoder measurements, and the fused position estimation of the joint, in rad, versus time in ×100 ms).

Figure 13: The position estimation error of the joint (rad) versus time (×100 ms).

Figure 14: Comparative position tracking errors of the end effector (X-axis and Y-axis tracking errors versus time in ×100 ms; proposed method versus DTSMC).


Disclosure

An earlier version of the multirate sensor fusion-based DTSMC was presented at the 10th IEEE International Conference on Control and Automation (ICCA) [12]; however, a completely new controller is designed in this paper by using a new sliding surface that includes both the tracking error and the estimation error, and the effectiveness of the proposed approach is validated in both simulation and experimental studies.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] S. Nicosia and P. Tomei, "A tracking controller for flexible joint robots using only link position feedback," IEEE Transactions on Automatic Control, vol. 40, no. 5, pp. 885–890, 1995.

[2] S. Nicosia, P. Tomei, and A. Tornambe, "A nonlinear observer for elastic robots," IEEE Journal on Robotics and Automation, vol. 4, no. 1, pp. 45–52, 1988.

[3] W. E. Dixon, E. Zergeroglu, D. M. Dawson, and M. W. Hannan, "Global adaptive partial state feedback tracking control of rigid-link flexible-joint robots," Robotica, vol. 18, no. 3, pp. 325–336, 2000.

[4] F. Abdollahi, H. A. Talebi, and R. V. Patel, "A stable neural network-based observer with application to flexible-joint manipulators," IEEE Transactions on Neural Networks, vol. 17, no. 1, pp. 118–129, 2006.

[5] J. Na, B. Jing, Y. Huang, G. Gao, and C. Zhang, "Unknown system dynamics estimator for motion control of nonlinear robotic systems," IEEE Transactions on Industrial Electronics, vol. 67, no. 5, pp. 3850–3859, 2020.

[6] S. Jeon, M. Tomizuka, and T. Katou, "Kinematic Kalman filter (KKF) for robot end-effector sensing," Journal of Dynamic Systems, Measurement, and Control, vol. 131, no. 2, pp. 1–8, 2009.

[7] C. A. Lightcap and S. A. Banks, "An extended Kalman filter for real-time estimation and control of a rigid-link flexible-joint manipulator," IEEE Transactions on Control Systems Technology, vol. 18, no. 1, pp. 91–103, 2010.

[8] J. G. García, A. Robertsson, J. G. Ortega, and R. Johansson, "Sensor fusion for compliant robot motion control," IEEE Transactions on Robotics, vol. 24, no. 2, pp. 430–441, 2008.

[9] R. Jassemi-Zargani and D. Necsulescu, "Extended Kalman filter-based sensor fusion for operational space control of a robot arm," IEEE Transactions on Instrumentation and Measurement, vol. 51, no. 6, pp. 1279–1282, 2002.

[10] S. Wang, L. Tao, Q. Chen, J. Na, and X. Ren, "USDE-based sliding mode control for servo mechanisms with unknown system dynamics," IEEE/ASME Transactions on Mechatronics, vol. 25, no. 2, pp. 1056–1066, 2020.

[11] K. Jezernik, B. Curk, and J. Harnik, "Observer-based sliding mode control of a robotic manipulator," Robotica, vol. 12, no. 5, pp. 443–448, 1994.

[12] G. Xue, X. Ren, K. Xing, and Q. Chen, "Discrete-time sliding mode control coupled with asynchronous sensor fusion for rigid-link flexible-joint manipulators," in Proceedings of the 2013 10th IEEE International Conference on Control and Automation (ICCA), pp. 238–243, Hangzhou, China, June 2013.

[13] Y. Gao, S. S. Majidabad, and H. T. Shandiz, "Discrete-time based sliding-mode control of robot manipulators," International Journal of Intelligent Computing and Cybernetics, vol. 5, no. 3, pp. 340–358, 2012.

[14] M. L. Corradini, V. Fossi, A. Giantomassi, G. Ippoliti, S. Longhi, and G. Orlando, "Discrete time sliding mode control of robotic manipulators: development and experimental validation," Control Engineering Practice, vol. 20, no. 8, pp. 816–822, 2012.

[15] L. Wang, T. Chai, and L. Zhai, "Neural-network-based terminal sliding-mode control of robotic manipulators including actuator dynamics," IEEE Transactions on Industrial Electronics, vol. 56, no. 9, pp. 3296–3304, 2009.

[16] Y. Feng, X. Yu, and Z. Man, "Non-singular terminal sliding mode control of rigid manipulators," Automatica, vol. 38, no. 12, pp. 2159–2167, 2002.

[17] M. L. Corradini and G. Orlando, "Linear unstable plants with saturating actuators: robust stabilization by a time varying sliding surface," Automatica, vol. 43, no. 1, pp. 88–94, 2007.

[18] Y. Bar-Shalom, H. Chen, and M. Mallick, "One-step solution for the multistep out-of-sequence-measurement problem in tracking," IEEE Transactions on Aerospace and Electronic Systems, vol. 40, no. 1, pp. 27–37, 2004.

[19] K. Reif, F. Sonnemann, and R. Unbehauen, "An EKF-based nonlinear observer with a prescribed degree of stability," Automatica, vol. 34, no. 9, pp. 1119–1123, 1998.

[20] W. Gao, Y. Wang, and A. Homaifa, "Discrete-time variable structure control systems," IEEE Transactions on Industrial Electronics, vol. 42, no. 2, pp. 117–122, 1995.

[21] S. Sarpturk, Y. Istefanopulos, and O. Kaynak, "On the stability of discrete-time sliding mode control systems," IEEE Transactions on Automatic Control, vol. 32, no. 10, pp. 930–932, 1987.



Λt diag 5 5 and ε diag 25 25 θ(k) 015 is chosen asa constant model parameter whose initial value is θ(k) 0p 3 q 5 ξM diag 3 3 and ξF diag 1 1 In thesimulation as shown in Figure 3 we assume that the endeffector is observed by a fixed camera whose delayedmeasurements are used directly to calculate the joint po-sition )e delayed time of slow measurement is 0125 s )eprocess and measurement noise are chosen as Q 001I andR 01I )e initial position value of joints (joint1 joint2)for tracking control is given by (05 rad 05 rad) and forfusion estimator it is given by (09 rad 09 rad) )e initialvelocity value for tracking control is given by (0 0) and forestimation it is given by (025 rads 0)

)e estimate errors of position and velocity are plotted inFigure 4 )e estimate errors of parameter θ are shown inFigure 5 )e position tracking of the proposed method fortwo links is shown in Figures 6 and 7 where the comparativeresult of SMC without fusion estimator is also plotted Inconclusion the simulation results clearly indicate that theproposed approach guarantees the convergence of trackingerrors and has better tracking accuracy

52 Experimental Study To validate the applicability of theproposed control schemes a single-link flexible-joint

X

Y

l1

l2

q1

q2

End effector

Figure 2 )e schematic diagram of robotic manipulator withkinematics parameters

Table 1 )e manipulator parameters

Linki mi(kg) li(cm) qi

1 25 35 q1(0∘ sim 180∘)2 3 20 q2(minus 160∘ sim 160∘)

8 Complexity

manipulator with a fixed camera is employed as the ex-perimental plant which is shown in Figure 8 )e aim is tomake the end effector move along the desired trajectoryxd l1 cos(qd) yd l1 sin(qd) with qd (04 minus t)75 andthe initial position of joint is qi 035 rad Parameters of thesingle-link robot are measured offline l1 06m andm1 10 kg the measurement errors are Δl1 005m andΔm1 01 kg

To observe the state of end effector the calibrated camerais fixed perpendicular to the robot notion plane )e co-ordinate relation of Cartesian space and image space isshown in Figure 9 Camera measurements are obtained fromthe image sequences shown in Figure 10 We define theposition in the image coordinate y(k) [xm(k) ym(k)]

given in Figure 11 and the position in Cartesian coordinate

Join

t1

2

1

0

ndash1

ndash2

Join

t2

2

1

0

ndash1

ndash2

0 05 1 15 2 25 3 35 4 45Time (s)

Delayed measurementsNoisy measurements

0 05 1 15 2 25 3 35 4 45Time (s)

Figure 3 Asynchronous multirate sensor measurements

0 05 1 15 2 25 3 35 4 45minus03

minus02

minus01

0

01

02

03

04

Time (s)

Estim

atio

n er

ror

Velocity estimation error of link1Position estimation error of link1Velocity estimation error of link2Position estimation error of link2

Figure 4 )e estimation errors of q1 _q1 q2 _q2

0 05 1 15 2 25 3 35 4 45minus03

minus02

minus01

0

01

02

03

04

Time (s)

Estim

atio

n er

ror

Figure 5 )e estimation error of θ

0 05 1 15 2 25 3 35 4 45

minus1minus08minus06minus04minus02

002040608

1

Time (s)

Posit

ion

trac

king

of j

oint

1 (r

ad)

Reference trajectoryProposed methodSMC

Figure 6 Comparative control performances of joint1 between theproposed method and SMC

0 05 1 15 2 25 3 35 4 45

minus1minus08minus06minus04minus02

002040608

1

Time (s)

Posit

ion

trac

king

of j

oint

2

Reference trajectoryProposed methodSMC

Figure 7 Comparative control performances of joint2 between theproposed method and SMC

Complexity 9

is r(k) [xc(k) yc(k)] By using the pinhole camera modelwe calculate themapping from the joint space to image spaceIk(middot) in equation (14) as follows

f

Hc minus f

xm(k)

ym(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦

xc(k)

yc(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦

l1 cos q(k)

l2 sin q(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦ (55)

where Hc is the perpendicular distance between the cameraand the robot motion plane and f denotes the focal length of

the camera In the experiment camera parameters aref 0035m 640 times 512 pixels and Hc 08m According tothe mapping Ik(middot) the joint position can be obtained bycalculating the inverse circular trigonometric functionsarccos[xm(k)f(l1(Hc minus f))] or arcsin[ym(k)f(l1(Hcminus

f))] )e delayed vision measurements [xm(k) ym(k)] areconverted to joint measurements arccos[xm(k)f(l1(Hc minus

f))] which are shown in Figure 12 where the encodermeasurements are also plotted )e position estimation of

Fusionestimate

Imageprocessing

CameraGear

mechanism

End effector

Encoder

Controlalgorithm

PMAC-2A

Controller

Servodrive

Servomotor

Figure 8 )e experiment setup

Image space

He

Robot

Yc

Cartesian space

Xi

Xc

l1

Yi

q

Camera

Figure 9 )e coordinate relation of Cartesian space and image space

Frame 1 Frame 2 Frame 3 Frame 4 Frame 5

Frame 6 Frame 7 Frame 8 Frame 9 Frame 10

Figure 10 )e image sequences of 10 frames

10 Complexity

the joint is calculated by using the proposed fusion estimatemethod shown as red solid line in Figure 12 To show the

performance detail of the fusion estimate method Figure 13shows the error of joint position estimation

To validate the performance of the fusion estimate-basedDTSMC comparative experiments are implemented in thispaper )e proposed controller and a DTSMC without es-timator are employed in this test Parameters are selected asT 001 s h 3 λ 2 Λt 4 and ε 20 θ(k) 005p 3 q 5 ξM 02 and ξF 2 According to the com-parative position tracking performances given in Figure 14it is obvious that the proposed controller provides a superiorbehavior

6 Conclusion

A novel RLFJ manipulator tracking controller AMSFE-DTSMC is proposed in this paper based on DTSMCcoupled with asynchronousmultirate sensor fusion States ofthe manipulator are estimated by EKF-based sensor fusionalgorithm which combines asynchronous multirate mea-surements from visual and non-vision-based sensorsCompared with the non-vision measurements visualmeasurements are considered as periodic out-of-sequencemeasurements which are used to re-estimate the state Withthe state estimation DTSMC is designed by using a novelsliding surface that includes tracking error and estimationerror By using the Sarpturk inequalities boundedness of thecontrolled variables is proved )e effectiveness of theproposed approach is shown in simulation and experimentalstudies

Data Availability

No data were used to support this study

Position measurements in image space

Figure 11 )e trajectories of end effector in image space

Vision measurementsEncoder measurementsPosition estimation

04

035

03

025

02

015

01

005

0

Posit

ion

of jo

int (

rad)

0 5 10 15 20 25 30Time (times100ms)

Figure 12 )e encoder and visual measurements

0 5 10 15 20 25 30minus002

minus0015

minus001

minus0005

0

0005

001

Time (times100ms)

Posit

ion

estim

atio

n er

ror (

rad)

Estimation error

Figure 13 )e position estimation error of the joint

minus2

0

2

4 times10minus3

X ax

is tr

acki

ng er

orr

0 5 10 15 20 25 30minus001

minus0005

0

0005

001

Time (times100ms)

0 5 10 15 20 25 30Time (times100ms)

Y ax

is tr

acki

ng er

orr

Proposed method DTSMC

Proposed method DTSMC

Figure 14 Comparative position tracking errors of the endeffector

Complexity 11

Disclosure

An earlier version of multirate sensor fusion-based DTSMChas been presented in the 10th International Conference onControl and Automation (ICCA) [12] however a totallynew controller is designed by using a new sliding surfaceincluding both tracking error and estimation error in thispaper and the effectiveness of the proposed approach isproved both in the simulation and experimental studies

Conflicts of Interest

)e authors declare that they have no conflicts of interest

References

[1] S Nicosia and P Tomei ldquoA tracking controller for flexiblejoint robots using only link position feedbackrdquo IEEETransactions on Automatic Control vol 40 no 5 pp 885ndash890 1995

[2] S Nicosia P Tomei and A Tornambe ldquoA nonlinear observerfor elastic robotsrdquo IEEE Journal on Robotics and Automationvol 4 no 1 pp 45ndash52 1988

[3] W E Dixon E Zergeroglu D M Dawson andM W Hannan ldquoGlobal adaptive partial state feedbacktracking control of rigid-link flexible-joint robotsrdquo Roboticavol 18 no 3 pp 325ndash336 2000

[4] F Abdollahi H A Talebi and R V Patel ldquoA stable neuralnetwork-based observer with application to flexible-jointmanipulatorsrdquo IEEE Transactions on Neural Networks vol 17no 1 pp 118ndash129 2006

[5] J Na B Jing Y Huang G Gao and C Zhang ldquoUnknownsystem dynamics estimator for motion control of nonlinearrobotic systemsrdquo IEEE Transactions on Industrial Electronicsvol 67 no 5 pp 3850ndash3859 2020

[6] S Jean M Tomizuka and T Katou ldquoKinematic kalman filter(KKF) for robot end-effector sensingrdquo Journal of DynamicSystems Measurement and Control vol 131 no 2 pp 1ndash82009

[7] C A Lightcap and S A Banks ldquoAn extended kalman filter forreal-time estimation and control of a rigid-link flexible-jointmanipulatorrdquo IEEE Transactions on Control Systems Tech-nology vol 18 no 1 pp 91ndash103 2010

[8] J G Garcła A Robertsson J G Ortega and R JohanssonldquoSensor fusion for compliant robot motion controlrdquo IEEETransactions on Robotics vol 24 no 2 pp 430ndash441 2008

[9] R Jassemi-Zargani and D Necsulescu ldquoExtended kalmanfilter-based sensor fusion for operational space control of arobot armrdquo IEEE Transactions on Instrumentation andMeasurement vol 51 no 6 pp 1279ndash1282 2002

[10] S Wang L Tao Q Chen J Na and X Ren ldquoUSDE-basedsliding mode control for servo mechanisms with unknownsystem dynamicsrdquo IEEEASME Transactions onMechatronicsvol 25 no 2 pp 1056ndash1066 2020

[11] K Jezernik B Curk and J Harnik ldquoObserver-based slidingmode control of a robotic manipulatorrdquo Robotica vol 12no 5 pp 443ndash448 1994

[12] G Xue X Ren K Xing and Q Chen ldquoDiscrete-time slidingmode control coupled with asynchronous sensor fusion forrigid-link flexible-joint manipulatorsrdquo in Proceedings of the2013 10th IEEE International Conference on Control andAutomation (ICCA) pp 238ndash243 Hangzhou China June2013

[13] Y Gao S S Majidabad and H T Shandiz ldquoDiscrete-timebased sliding-mode control of robot manipulatorsrdquo Inter-national Journal of Intelligent Computing and Cyberneticsvol 5 no 3 pp 340ndash358 2012

[14] M L Corradini V Fossi A Giantomassi G IppolitiS Orlando and G Orlandob ldquoDiscrete time sliding modecontrol of robotic manipulators development and experi-mental validationrdquoControl Engineering Practice vol 20 no 8pp 816ndash822 2012

[15] L Wang T Chai and L Zhai ldquoNeural-network-based ter-minal sliding-mode control of robotic manipulators includingactuator dynamicsrdquo IEEE Transactions on Industrial Elec-tronics vol 56 no 9 pp 3296ndash3304 2009

[16] Y Feng X Yu and Z Man ldquoNon-singular terminal slidingmode control of rigid manipulatorsrdquo Automatica vol 38no 12 pp 2159ndash2167 2002

[17] M L Corradini and G Orlando ldquoLinear unstable plants withsaturating actuators robust stabilization by a time varyingsliding surfacerdquo Automatica vol 43 no 1 pp 88ndash94 2007

[18] Y Bar-Shalom H Huimin Chen and M Mallick ldquoOne-stepsolution for the multistep out-of-sequence-measurementproblem in trackingrdquo IEEE Transactions on Aerospace andElectronic Systems vol 40 no 1 pp 27ndash37 2004

[19] K Reif F Sonnemann and R Unbehauen ldquoAn EKF-basednonlinear observer with a prescribed degree of stabilityrdquoAutomatica vol 34 no 9 pp 1119ndash1123 1998

[20] W Gao Y Wang and A Homaifa ldquoDiscrete-time variablestructure control systemsrdquo IEEE Transactions on IndustrialElectronics vol 42 no 2 pp 117ndash122 1995

[21] S Sarpturk S Y Istefanopulos Y Istefanopulos andO Kaynak ldquoOn the stability of discrete-time sliding modecontrol systemsrdquo IEEE Transactions on Automatic Controlvol 32 no 10 pp 930ndash932 1987

12 Complexity

Page 8: Discrete-TimeSlidingModeControlCoupledwithAsynchronous

where s(k + 1) minus s(k)gt 0 is obtained by adopting ε(k)

according to equation (46))erefore equation (46) satisfiedthe Sarpturk stability (43a)

Assuming M I Sarpturk condition (3) is formulatedby combining equations (42) and (43b)

s(k + 1) + s(k) 2s(k) minus ΔF

minus T[hs(k) + ε(k)sgn(s(k))](49)

If s(k)lt 0 s(k + 1) + s(k)lt 0 we have

2s(k) minus T(hs(k) minus ε(k)) minus ΔF lt 0 (50)

Else if s(k)gt 0 s(k + 1) + s(k)gt 0 we have

2s(k) minus T(hs(k) + ε(k)) minus ΔF gt 0 (51)

)e sampling time T which satisfies the followingequation can guarantee the condition in (43b)

0ltTlt2s(k) minus ΔF

hs(k) +ε(k) (52)

From (52) we know that the stability conditions ofDTSMC given by (43a) and (43b) are guaranteed by chosenparameters ε and h defined in (38) and (39) )eorem 1 isproved

5 Simulation and Experimental Studies

51 Simulation Study )e results obtained from the sim-ulation of proposed control scheme on a two-link RLFJmanipulators shown in Figure 2 are presented in this sectionIn the simulation the aim is make the RLFJ manipulatortrack desired trajectories q1d sin(πt) and q2d cos(πt)

from the initial position [q1i q2i] [05 05] )e roboticdynamic parameters are given by

M(q) M11 M12

M21 m2l22

1113890 1113891

g(q) g1

minus m2gl2 sin q1 + q2( 111385711138571113890 1113891

C minus m2l1l2 sin q2( 1113857 2 _q1 _q2 + _q

221113872 1113873

minus m2l1l2 sin q2( 1113857 _q1 _q2

⎡⎢⎣ ⎤⎥⎦

M11 m1 + m21113857l21 + m2l

22 + 2m2l1l2 cos q2( 1113857

M12 M21 m2l22 + m2l1l2 cos q2( 1113857

g1 minus m1 + m2( 1113857gl1 sin q1( 1113857 minus m2gl2 sin q1 + q2( 1113857

(53)

where mi and li are given in Table 1 For i 1 2 mi and lidenote mass and length of the ith linkkm diag 10 10 Numrad Mm

diag 05 05 kg middot m2 Fm diag 4 05 Nm middot secradIn order to demonstrate the influence of estimation for

tracking performance a DTSMC controller without esti-mation is also simulated)eDTSMC control law is given by

u(k) 1113954M

T1113954F + _qd(k) minus _qd(k minus 1) + λ1113957q

pt (k)(1113960

minus λ T _1113957qt(k) + 1113957qt(k)]p

1113960 1113961 minus hs(k) minus ε(k)sgn(s(k))

(54)

with a sliding surface s(k) _1113957qt(k) + λ1113957qt(k)p which is dif-ferent from proposal sliding mode surface

Parameters of the proposed controller and DTSMC areselected as T 0001 s h diag 2 2 λ diag 15 15

Λt diag 5 5 and ε diag 25 25 θ(k) 015 is chosen asa constant model parameter whose initial value is θ(k) 0p 3 q 5 ξM diag 3 3 and ξF diag 1 1 In thesimulation as shown in Figure 3 we assume that the endeffector is observed by a fixed camera whose delayedmeasurements are used directly to calculate the joint po-sition )e delayed time of slow measurement is 0125 s )eprocess and measurement noise are chosen as Q 001I andR 01I )e initial position value of joints (joint1 joint2)for tracking control is given by (05 rad 05 rad) and forfusion estimator it is given by (09 rad 09 rad) )e initialvelocity value for tracking control is given by (0 0) and forestimation it is given by (025 rads 0)

)e estimate errors of position and velocity are plotted inFigure 4 )e estimate errors of parameter θ are shown inFigure 5 )e position tracking of the proposed method fortwo links is shown in Figures 6 and 7 where the comparativeresult of SMC without fusion estimator is also plotted Inconclusion the simulation results clearly indicate that theproposed approach guarantees the convergence of trackingerrors and has better tracking accuracy

52 Experimental Study To validate the applicability of theproposed control schemes a single-link flexible-joint

X

Y

l1

l2

q1

q2

End effector

Figure 2 )e schematic diagram of robotic manipulator withkinematics parameters

Table 1 )e manipulator parameters

Linki mi(kg) li(cm) qi

1 25 35 q1(0∘ sim 180∘)2 3 20 q2(minus 160∘ sim 160∘)

8 Complexity

manipulator with a fixed camera is employed as the ex-perimental plant which is shown in Figure 8 )e aim is tomake the end effector move along the desired trajectoryxd l1 cos(qd) yd l1 sin(qd) with qd (04 minus t)75 andthe initial position of joint is qi 035 rad Parameters of thesingle-link robot are measured offline l1 06m andm1 10 kg the measurement errors are Δl1 005m andΔm1 01 kg

To observe the state of end effector the calibrated camerais fixed perpendicular to the robot notion plane )e co-ordinate relation of Cartesian space and image space isshown in Figure 9 Camera measurements are obtained fromthe image sequences shown in Figure 10 We define theposition in the image coordinate y(k) [xm(k) ym(k)]

given in Figure 11 and the position in Cartesian coordinate

Join

t1

2

1

0

ndash1

ndash2

Join

t2

2

1

0

ndash1

ndash2

0 05 1 15 2 25 3 35 4 45Time (s)

Delayed measurementsNoisy measurements

0 05 1 15 2 25 3 35 4 45Time (s)

Figure 3 Asynchronous multirate sensor measurements

0 05 1 15 2 25 3 35 4 45minus03

minus02

minus01

0

01

02

03

04

Time (s)

Estim

atio

n er

ror

Velocity estimation error of link1Position estimation error of link1Velocity estimation error of link2Position estimation error of link2

Figure 4 )e estimation errors of q1 _q1 q2 _q2

0 05 1 15 2 25 3 35 4 45minus03

minus02

minus01

0

01

02

03

04

Time (s)

Estim

atio

n er

ror

Figure 5 )e estimation error of θ

0 05 1 15 2 25 3 35 4 45

minus1minus08minus06minus04minus02

002040608

1

Time (s)

Posit

ion

trac

king

of j

oint

1 (r

ad)

Reference trajectoryProposed methodSMC

Figure 6 Comparative control performances of joint1 between theproposed method and SMC

0 05 1 15 2 25 3 35 4 45

minus1minus08minus06minus04minus02

002040608

1

Time (s)

Posit

ion

trac

king

of j

oint

2

Reference trajectoryProposed methodSMC

Figure 7 Comparative control performances of joint2 between theproposed method and SMC

Complexity 9

is r(k) [xc(k) yc(k)] By using the pinhole camera modelwe calculate themapping from the joint space to image spaceIk(middot) in equation (14) as follows

f

Hc minus f

xm(k)

ym(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦

xc(k)

yc(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦

l1 cos q(k)

l2 sin q(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦ (55)

where Hc is the perpendicular distance between the cameraand the robot motion plane and f denotes the focal length of

the camera In the experiment camera parameters aref 0035m 640 times 512 pixels and Hc 08m According tothe mapping Ik(middot) the joint position can be obtained bycalculating the inverse circular trigonometric functionsarccos[xm(k)f(l1(Hc minus f))] or arcsin[ym(k)f(l1(Hcminus

f))] )e delayed vision measurements [xm(k) ym(k)] areconverted to joint measurements arccos[xm(k)f(l1(Hc minus

f))] which are shown in Figure 12 where the encodermeasurements are also plotted )e position estimation of

Fusionestimate

Imageprocessing

CameraGear

mechanism

End effector

Encoder

Controlalgorithm

PMAC-2A

Controller

Servodrive

Servomotor

Figure 8 )e experiment setup

Image space

He

Robot

Yc

Cartesian space

Xi

Xc

l1

Yi

q

Camera

Figure 9 )e coordinate relation of Cartesian space and image space

Frame 1 Frame 2 Frame 3 Frame 4 Frame 5

Frame 6 Frame 7 Frame 8 Frame 9 Frame 10

Figure 10 )e image sequences of 10 frames

10 Complexity

the joint is calculated by using the proposed fusion estimatemethod shown as red solid line in Figure 12 To show the

performance detail of the fusion estimate method Figure 13shows the error of joint position estimation

To validate the performance of the fusion estimate-basedDTSMC comparative experiments are implemented in thispaper )e proposed controller and a DTSMC without es-timator are employed in this test Parameters are selected asT 001 s h 3 λ 2 Λt 4 and ε 20 θ(k) 005p 3 q 5 ξM 02 and ξF 2 According to the com-parative position tracking performances given in Figure 14it is obvious that the proposed controller provides a superiorbehavior

6 Conclusion

A novel RLFJ manipulator tracking controller AMSFE-DTSMC is proposed in this paper based on DTSMCcoupled with asynchronousmultirate sensor fusion States ofthe manipulator are estimated by EKF-based sensor fusionalgorithm which combines asynchronous multirate mea-surements from visual and non-vision-based sensorsCompared with the non-vision measurements visualmeasurements are considered as periodic out-of-sequencemeasurements which are used to re-estimate the state Withthe state estimation DTSMC is designed by using a novelsliding surface that includes tracking error and estimationerror By using the Sarpturk inequalities boundedness of thecontrolled variables is proved )e effectiveness of theproposed approach is shown in simulation and experimentalstudies

Data Availability

No data were used to support this study

Position measurements in image space

Figure 11 )e trajectories of end effector in image space

Vision measurementsEncoder measurementsPosition estimation

04

035

03

025

02

015

01

005

0

Posit

ion

of jo

int (

rad)

0 5 10 15 20 25 30Time (times100ms)

Figure 12 )e encoder and visual measurements

0 5 10 15 20 25 30minus002

minus0015

minus001

minus0005

0

0005

001

Time (times100ms)

Posit

ion

estim

atio

n er

ror (

rad)

Estimation error

Figure 13 )e position estimation error of the joint

minus2

0

2

4 times10minus3

X ax

is tr

acki

ng er

orr

0 5 10 15 20 25 30minus001

minus0005

0

0005

001

Time (times100ms)

0 5 10 15 20 25 30Time (times100ms)

Y ax

is tr

acki

ng er

orr

Proposed method DTSMC

Proposed method DTSMC

Figure 14 Comparative position tracking errors of the endeffector

Complexity 11

Disclosure

An earlier version of multirate sensor fusion-based DTSMChas been presented in the 10th International Conference onControl and Automation (ICCA) [12] however a totallynew controller is designed by using a new sliding surfaceincluding both tracking error and estimation error in thispaper and the effectiveness of the proposed approach isproved both in the simulation and experimental studies

Conflicts of Interest

)e authors declare that they have no conflicts of interest

References

[1] S Nicosia and P Tomei ldquoA tracking controller for flexiblejoint robots using only link position feedbackrdquo IEEETransactions on Automatic Control vol 40 no 5 pp 885ndash890 1995

[2] S Nicosia P Tomei and A Tornambe ldquoA nonlinear observerfor elastic robotsrdquo IEEE Journal on Robotics and Automationvol 4 no 1 pp 45ndash52 1988

[3] W E Dixon E Zergeroglu D M Dawson andM W Hannan ldquoGlobal adaptive partial state feedbacktracking control of rigid-link flexible-joint robotsrdquo Roboticavol 18 no 3 pp 325ndash336 2000

[4] F Abdollahi H A Talebi and R V Patel ldquoA stable neuralnetwork-based observer with application to flexible-jointmanipulatorsrdquo IEEE Transactions on Neural Networks vol 17no 1 pp 118ndash129 2006

[5] J Na B Jing Y Huang G Gao and C Zhang ldquoUnknownsystem dynamics estimator for motion control of nonlinearrobotic systemsrdquo IEEE Transactions on Industrial Electronicsvol 67 no 5 pp 3850ndash3859 2020

[6] S Jean M Tomizuka and T Katou ldquoKinematic kalman filter(KKF) for robot end-effector sensingrdquo Journal of DynamicSystems Measurement and Control vol 131 no 2 pp 1ndash82009

[7] C A Lightcap and S A Banks ldquoAn extended kalman filter forreal-time estimation and control of a rigid-link flexible-jointmanipulatorrdquo IEEE Transactions on Control Systems Tech-nology vol 18 no 1 pp 91ndash103 2010

[8] J G Garcła A Robertsson J G Ortega and R JohanssonldquoSensor fusion for compliant robot motion controlrdquo IEEETransactions on Robotics vol 24 no 2 pp 430ndash441 2008

[9] R Jassemi-Zargani and D Necsulescu ldquoExtended kalmanfilter-based sensor fusion for operational space control of arobot armrdquo IEEE Transactions on Instrumentation andMeasurement vol 51 no 6 pp 1279ndash1282 2002

[10] S Wang L Tao Q Chen J Na and X Ren ldquoUSDE-basedsliding mode control for servo mechanisms with unknownsystem dynamicsrdquo IEEEASME Transactions onMechatronicsvol 25 no 2 pp 1056ndash1066 2020

[11] K Jezernik B Curk and J Harnik ldquoObserver-based slidingmode control of a robotic manipulatorrdquo Robotica vol 12no 5 pp 443ndash448 1994

[12] G Xue X Ren K Xing and Q Chen ldquoDiscrete-time slidingmode control coupled with asynchronous sensor fusion forrigid-link flexible-joint manipulatorsrdquo in Proceedings of the2013 10th IEEE International Conference on Control andAutomation (ICCA) pp 238ndash243 Hangzhou China June2013

[13] Y Gao S S Majidabad and H T Shandiz ldquoDiscrete-timebased sliding-mode control of robot manipulatorsrdquo Inter-national Journal of Intelligent Computing and Cyberneticsvol 5 no 3 pp 340ndash358 2012

[14] M L Corradini V Fossi A Giantomassi G IppolitiS Orlando and G Orlandob ldquoDiscrete time sliding modecontrol of robotic manipulators development and experi-mental validationrdquoControl Engineering Practice vol 20 no 8pp 816ndash822 2012

[15] L Wang T Chai and L Zhai ldquoNeural-network-based ter-minal sliding-mode control of robotic manipulators includingactuator dynamicsrdquo IEEE Transactions on Industrial Elec-tronics vol 56 no 9 pp 3296ndash3304 2009

[16] Y Feng X Yu and Z Man ldquoNon-singular terminal slidingmode control of rigid manipulatorsrdquo Automatica vol 38no 12 pp 2159ndash2167 2002

[17] M L Corradini and G Orlando ldquoLinear unstable plants withsaturating actuators robust stabilization by a time varyingsliding surfacerdquo Automatica vol 43 no 1 pp 88ndash94 2007

[18] Y Bar-Shalom H Huimin Chen and M Mallick ldquoOne-stepsolution for the multistep out-of-sequence-measurementproblem in trackingrdquo IEEE Transactions on Aerospace andElectronic Systems vol 40 no 1 pp 27ndash37 2004

[19] K Reif F Sonnemann and R Unbehauen ldquoAn EKF-basednonlinear observer with a prescribed degree of stabilityrdquoAutomatica vol 34 no 9 pp 1119ndash1123 1998

[20] W Gao Y Wang and A Homaifa ldquoDiscrete-time variablestructure control systemsrdquo IEEE Transactions on IndustrialElectronics vol 42 no 2 pp 117ndash122 1995

[21] S Sarpturk S Y Istefanopulos Y Istefanopulos andO Kaynak ldquoOn the stability of discrete-time sliding modecontrol systemsrdquo IEEE Transactions on Automatic Controlvol 32 no 10 pp 930ndash932 1987

12 Complexity

Page 9: Discrete-TimeSlidingModeControlCoupledwithAsynchronous

manipulator with a fixed camera is employed as the ex-perimental plant which is shown in Figure 8 )e aim is tomake the end effector move along the desired trajectoryxd l1 cos(qd) yd l1 sin(qd) with qd (04 minus t)75 andthe initial position of joint is qi 035 rad Parameters of thesingle-link robot are measured offline l1 06m andm1 10 kg the measurement errors are Δl1 005m andΔm1 01 kg

To observe the state of end effector the calibrated camerais fixed perpendicular to the robot notion plane )e co-ordinate relation of Cartesian space and image space isshown in Figure 9 Camera measurements are obtained fromthe image sequences shown in Figure 10 We define theposition in the image coordinate y(k) [xm(k) ym(k)]

given in Figure 11 and the position in Cartesian coordinate

Join

t1

2

1

0

ndash1

ndash2

Join

t2

2

1

0

ndash1

ndash2

0 05 1 15 2 25 3 35 4 45Time (s)

Delayed measurementsNoisy measurements

0 05 1 15 2 25 3 35 4 45Time (s)

Figure 3 Asynchronous multirate sensor measurements

0 05 1 15 2 25 3 35 4 45minus03

minus02

minus01

0

01

02

03

04

Time (s)

Estim

atio

n er

ror

Velocity estimation error of link1Position estimation error of link1Velocity estimation error of link2Position estimation error of link2

Figure 4 )e estimation errors of q1 _q1 q2 _q2

0 05 1 15 2 25 3 35 4 45minus03

minus02

minus01

0

01

02

03

04

Time (s)

Estim

atio

n er

ror

Figure 5 )e estimation error of θ

0 05 1 15 2 25 3 35 4 45

minus1minus08minus06minus04minus02

002040608

1

Time (s)

Posit

ion

trac

king

of j

oint

1 (r

ad)

Reference trajectoryProposed methodSMC

Figure 6 Comparative control performances of joint1 between theproposed method and SMC

0 05 1 15 2 25 3 35 4 45

minus1minus08minus06minus04minus02

002040608

1

Time (s)

Posit

ion

trac

king

of j

oint

2

Reference trajectoryProposed methodSMC

Figure 7 Comparative control performances of joint2 between theproposed method and SMC

Complexity 9

is r(k) [xc(k) yc(k)] By using the pinhole camera modelwe calculate themapping from the joint space to image spaceIk(middot) in equation (14) as follows

f

Hc minus f

xm(k)

ym(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦

xc(k)

yc(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦

l1 cos q(k)

l2 sin q(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦ (55)

where Hc is the perpendicular distance between the cameraand the robot motion plane and f denotes the focal length of

the camera In the experiment camera parameters aref 0035m 640 times 512 pixels and Hc 08m According tothe mapping Ik(middot) the joint position can be obtained bycalculating the inverse circular trigonometric functionsarccos[xm(k)f(l1(Hc minus f))] or arcsin[ym(k)f(l1(Hcminus

f))] )e delayed vision measurements [xm(k) ym(k)] areconverted to joint measurements arccos[xm(k)f(l1(Hc minus

f))] which are shown in Figure 12 where the encodermeasurements are also plotted )e position estimation of

Fusionestimate

Imageprocessing

CameraGear

mechanism

End effector

Encoder

Controlalgorithm

PMAC-2A

Controller

Servodrive

Servomotor

Figure 8 )e experiment setup

Image space

He

Robot

Yc

Cartesian space

Xi

Xc

l1

Yi

q

Camera

Figure 9 )e coordinate relation of Cartesian space and image space

Frame 1 Frame 2 Frame 3 Frame 4 Frame 5

Frame 6 Frame 7 Frame 8 Frame 9 Frame 10

Figure 10 )e image sequences of 10 frames

10 Complexity

the joint is calculated by using the proposed fusion estimatemethod shown as red solid line in Figure 12 To show the

performance detail of the fusion estimate method Figure 13shows the error of joint position estimation

To validate the performance of the fusion estimate-basedDTSMC comparative experiments are implemented in thispaper )e proposed controller and a DTSMC without es-timator are employed in this test Parameters are selected asT 001 s h 3 λ 2 Λt 4 and ε 20 θ(k) 005p 3 q 5 ξM 02 and ξF 2 According to the com-parative position tracking performances given in Figure 14it is obvious that the proposed controller provides a superiorbehavior

6 Conclusion

A novel RLFJ manipulator tracking controller AMSFE-DTSMC is proposed in this paper based on DTSMCcoupled with asynchronousmultirate sensor fusion States ofthe manipulator are estimated by EKF-based sensor fusionalgorithm which combines asynchronous multirate mea-surements from visual and non-vision-based sensorsCompared with the non-vision measurements visualmeasurements are considered as periodic out-of-sequencemeasurements which are used to re-estimate the state Withthe state estimation DTSMC is designed by using a novelsliding surface that includes tracking error and estimationerror By using the Sarpturk inequalities boundedness of thecontrolled variables is proved )e effectiveness of theproposed approach is shown in simulation and experimentalstudies

Data Availability

No data were used to support this study

Position measurements in image space

Figure 11 )e trajectories of end effector in image space

Vision measurementsEncoder measurementsPosition estimation

04

035

03

025

02

015

01

005

0

Posit

ion

of jo

int (

rad)

0 5 10 15 20 25 30Time (times100ms)

Figure 12 )e encoder and visual measurements

0 5 10 15 20 25 30minus002

minus0015

minus001

minus0005

0

0005

001

Time (times100ms)

Posit

ion

estim

atio

n er

ror (

rad)

Estimation error

Figure 13 )e position estimation error of the joint

minus2

0

2

4 times10minus3

X ax

is tr

acki

ng er

orr

0 5 10 15 20 25 30minus001

minus0005

0

0005

001

Time (times100ms)

0 5 10 15 20 25 30Time (times100ms)

Y ax

is tr

acki

ng er

orr

Proposed method DTSMC

Proposed method DTSMC

Figure 14 Comparative position tracking errors of the endeffector

Complexity 11

Disclosure

An earlier version of multirate sensor fusion-based DTSMChas been presented in the 10th International Conference onControl and Automation (ICCA) [12] however a totallynew controller is designed by using a new sliding surfaceincluding both tracking error and estimation error in thispaper and the effectiveness of the proposed approach isproved both in the simulation and experimental studies

Conflicts of Interest

)e authors declare that they have no conflicts of interest

References

[1] S Nicosia and P Tomei ldquoA tracking controller for flexiblejoint robots using only link position feedbackrdquo IEEETransactions on Automatic Control vol 40 no 5 pp 885ndash890 1995

[2] S Nicosia P Tomei and A Tornambe ldquoA nonlinear observerfor elastic robotsrdquo IEEE Journal on Robotics and Automationvol 4 no 1 pp 45ndash52 1988

[3] W E Dixon E Zergeroglu D M Dawson andM W Hannan ldquoGlobal adaptive partial state feedbacktracking control of rigid-link flexible-joint robotsrdquo Roboticavol 18 no 3 pp 325ndash336 2000

[4] F Abdollahi H A Talebi and R V Patel ldquoA stable neuralnetwork-based observer with application to flexible-jointmanipulatorsrdquo IEEE Transactions on Neural Networks vol 17no 1 pp 118ndash129 2006

[5] J Na B Jing Y Huang G Gao and C Zhang ldquoUnknownsystem dynamics estimator for motion control of nonlinearrobotic systemsrdquo IEEE Transactions on Industrial Electronicsvol 67 no 5 pp 3850ndash3859 2020

[6] S Jean M Tomizuka and T Katou ldquoKinematic kalman filter(KKF) for robot end-effector sensingrdquo Journal of DynamicSystems Measurement and Control vol 131 no 2 pp 1ndash82009

[7] C A Lightcap and S A Banks ldquoAn extended kalman filter forreal-time estimation and control of a rigid-link flexible-jointmanipulatorrdquo IEEE Transactions on Control Systems Tech-nology vol 18 no 1 pp 91ndash103 2010

[8] J G Garcła A Robertsson J G Ortega and R JohanssonldquoSensor fusion for compliant robot motion controlrdquo IEEETransactions on Robotics vol 24 no 2 pp 430ndash441 2008

[9] R Jassemi-Zargani and D Necsulescu ldquoExtended kalmanfilter-based sensor fusion for operational space control of arobot armrdquo IEEE Transactions on Instrumentation andMeasurement vol 51 no 6 pp 1279ndash1282 2002

[10] S Wang L Tao Q Chen J Na and X Ren ldquoUSDE-basedsliding mode control for servo mechanisms with unknownsystem dynamicsrdquo IEEEASME Transactions onMechatronicsvol 25 no 2 pp 1056ndash1066 2020

[11] K Jezernik B Curk and J Harnik ldquoObserver-based slidingmode control of a robotic manipulatorrdquo Robotica vol 12no 5 pp 443ndash448 1994

[12] G Xue X Ren K Xing and Q Chen ldquoDiscrete-time slidingmode control coupled with asynchronous sensor fusion forrigid-link flexible-joint manipulatorsrdquo in Proceedings of the2013 10th IEEE International Conference on Control andAutomation (ICCA) pp 238ndash243 Hangzhou China June2013

[13] Y Gao S S Majidabad and H T Shandiz ldquoDiscrete-timebased sliding-mode control of robot manipulatorsrdquo Inter-national Journal of Intelligent Computing and Cyberneticsvol 5 no 3 pp 340ndash358 2012

[14] M L Corradini V Fossi A Giantomassi G IppolitiS Orlando and G Orlandob ldquoDiscrete time sliding modecontrol of robotic manipulators development and experi-mental validationrdquoControl Engineering Practice vol 20 no 8pp 816ndash822 2012

[15] L Wang T Chai and L Zhai ldquoNeural-network-based ter-minal sliding-mode control of robotic manipulators includingactuator dynamicsrdquo IEEE Transactions on Industrial Elec-tronics vol 56 no 9 pp 3296ndash3304 2009

[16] Y Feng X Yu and Z Man ldquoNon-singular terminal slidingmode control of rigid manipulatorsrdquo Automatica vol 38no 12 pp 2159ndash2167 2002

[17] M L Corradini and G Orlando ldquoLinear unstable plants withsaturating actuators robust stabilization by a time varyingsliding surfacerdquo Automatica vol 43 no 1 pp 88ndash94 2007

[18] Y Bar-Shalom H Huimin Chen and M Mallick ldquoOne-stepsolution for the multistep out-of-sequence-measurementproblem in trackingrdquo IEEE Transactions on Aerospace andElectronic Systems vol 40 no 1 pp 27ndash37 2004

[19] K Reif F Sonnemann and R Unbehauen ldquoAn EKF-basednonlinear observer with a prescribed degree of stabilityrdquoAutomatica vol 34 no 9 pp 1119ndash1123 1998

[20] W Gao Y Wang and A Homaifa ldquoDiscrete-time variablestructure control systemsrdquo IEEE Transactions on IndustrialElectronics vol 42 no 2 pp 117ndash122 1995

[21] S Sarpturk S Y Istefanopulos Y Istefanopulos andO Kaynak ldquoOn the stability of discrete-time sliding modecontrol systemsrdquo IEEE Transactions on Automatic Controlvol 32 no 10 pp 930ndash932 1987

12 Complexity

Page 10: Discrete-TimeSlidingModeControlCoupledwithAsynchronous

is r(k) [xc(k) yc(k)] By using the pinhole camera modelwe calculate themapping from the joint space to image spaceIk(middot) in equation (14) as follows

f

Hc minus f

xm(k)

ym(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦

xc(k)

yc(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦

l1 cos q(k)

l2 sin q(k)

⎡⎢⎢⎢⎣ ⎤⎥⎥⎥⎦ (55)

where Hc is the perpendicular distance between the cameraand the robot motion plane and f denotes the focal length of

the camera In the experiment camera parameters aref 0035m 640 times 512 pixels and Hc 08m According tothe mapping Ik(middot) the joint position can be obtained bycalculating the inverse circular trigonometric functionsarccos[xm(k)f(l1(Hc minus f))] or arcsin[ym(k)f(l1(Hcminus

f))] )e delayed vision measurements [xm(k) ym(k)] areconverted to joint measurements arccos[xm(k)f(l1(Hc minus

f))] which are shown in Figure 12 where the encodermeasurements are also plotted )e position estimation of

Fusionestimate

Imageprocessing

CameraGear

mechanism

End effector

Encoder

Controlalgorithm

PMAC-2A

Controller

Servodrive

Servomotor

Figure 8 )e experiment setup

Image space

He

Robot

Yc

Cartesian space

Xi

Xc

l1

Yi

q

Camera

Figure 9 )e coordinate relation of Cartesian space and image space

Frame 1 Frame 2 Frame 3 Frame 4 Frame 5

Frame 6 Frame 7 Frame 8 Frame 9 Frame 10

Figure 10 )e image sequences of 10 frames

10 Complexity

the joint is calculated by using the proposed fusion estimatemethod shown as red solid line in Figure 12 To show the

performance detail of the fusion estimate method Figure 13shows the error of joint position estimation

To validate the performance of the fusion estimate-basedDTSMC comparative experiments are implemented in thispaper )e proposed controller and a DTSMC without es-timator are employed in this test Parameters are selected asT 001 s h 3 λ 2 Λt 4 and ε 20 θ(k) 005p 3 q 5 ξM 02 and ξF 2 According to the com-parative position tracking performances given in Figure 14it is obvious that the proposed controller provides a superiorbehavior

6 Conclusion

A novel RLFJ manipulator tracking controller AMSFE-DTSMC is proposed in this paper based on DTSMCcoupled with asynchronousmultirate sensor fusion States ofthe manipulator are estimated by EKF-based sensor fusionalgorithm which combines asynchronous multirate mea-surements from visual and non-vision-based sensorsCompared with the non-vision measurements visualmeasurements are considered as periodic out-of-sequencemeasurements which are used to re-estimate the state Withthe state estimation DTSMC is designed by using a novelsliding surface that includes tracking error and estimationerror By using the Sarpturk inequalities boundedness of thecontrolled variables is proved )e effectiveness of theproposed approach is shown in simulation and experimentalstudies

Data Availability

No data were used to support this study

Position measurements in image space

Figure 11 )e trajectories of end effector in image space

Vision measurementsEncoder measurementsPosition estimation

04

035

03

025

02

015

01

005

0

Posit

ion

of jo

int (

rad)

0 5 10 15 20 25 30Time (times100ms)

Figure 12 )e encoder and visual measurements

0 5 10 15 20 25 30minus002

minus0015

minus001

minus0005

0

0005

001

Time (times100ms)

Posit

ion

estim

atio

n er

ror (

rad)

Estimation error

Figure 13 )e position estimation error of the joint

minus2

0

2

4 times10minus3

X ax

is tr

acki

ng er

orr

0 5 10 15 20 25 30minus001

minus0005

0

0005

001

Time (times100ms)

0 5 10 15 20 25 30Time (times100ms)

Y ax

is tr

acki

ng er

orr

Proposed method DTSMC

Proposed method DTSMC

Figure 14 Comparative position tracking errors of the endeffector

Complexity 11

Disclosure

An earlier version of multirate sensor fusion-based DTSMChas been presented in the 10th International Conference onControl and Automation (ICCA) [12] however a totallynew controller is designed by using a new sliding surfaceincluding both tracking error and estimation error in thispaper and the effectiveness of the proposed approach isproved both in the simulation and experimental studies

Conflicts of Interest

)e authors declare that they have no conflicts of interest

References

[1] S Nicosia and P Tomei ldquoA tracking controller for flexiblejoint robots using only link position feedbackrdquo IEEETransactions on Automatic Control vol 40 no 5 pp 885ndash890 1995

[2] S Nicosia P Tomei and A Tornambe ldquoA nonlinear observerfor elastic robotsrdquo IEEE Journal on Robotics and Automationvol 4 no 1 pp 45ndash52 1988


the joint is calculated by using the proposed fusion estimate method, shown as the red solid line in Figure 12. To show the performance of the fusion estimate method in detail, Figure 13 shows the error of the joint position estimation.

To validate the performance of the fusion estimate-based DTSMC, comparative experiments are implemented in this paper. The proposed controller and a DTSMC without the estimator are employed in this test. Parameters are selected as T = 0.01 s, h = 3, λ = 2, Λt = 4, ε = 20, θ(k) = 0.05, p = 3, q = 5, ξM = 0.2, and ξF = 2. According to the comparative position tracking performances given in Figure 14, it is obvious that the proposed controller provides superior behavior.
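To make the comparison concrete, the following minimal Python sketch shows how one sampling instant of a terminal-sliding-mode-type control law could be organized around the listed parameters. It is only an assumed illustrative form: the paper's actual sliding surface, reaching law, and the roles of h, θ(k), ξM, and ξF are not reproduced here.

import numpy as np

# Assumed roles of the experiment parameters (illustration only).
T = 0.01                 # sampling period [s], used by the caller's fixed-rate loop
lam, Lam_t = 2.0, 4.0    # weights on tracking and estimation error (assumed roles)
eps = 20.0               # reaching-law gain (assumed role)
p, q = 3, 5              # exponent pair of the terminal term (assumed role)


def dtsmc_step(q_d, qdot_d, q_hat, qdot_hat, q_meas):
    """Return an illustrative control action for one sampling instant."""
    e = q_d - q_hat                  # tracking error based on the fused estimate
    edot = qdot_d - qdot_hat
    e_est = q_meas - q_hat           # estimation error (measurement vs. estimate)

    # Terminal-type sliding variable: fractional power of the tracking error
    # plus a weighted estimation-error term (illustrative form only).
    s = edot + lam * np.sign(e) * np.abs(e) ** (p / q) + Lam_t * e_est

    # Discrete reaching law with saturation instead of sign() to limit chattering.
    return eps * np.clip(s, -1.0, 1.0)


# Example call with arbitrary values (illustration only):
u_k = dtsmc_step(q_d=0.3, qdot_d=0.0, q_hat=0.28, qdot_hat=0.01, q_meas=0.279)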

6. Conclusion

A novel RLFJ manipulator tracking controller, AMSFE-DTSMC, is proposed in this paper based on DTSMC coupled with asynchronous multirate sensor fusion. States of the manipulator are estimated by an EKF-based sensor fusion algorithm which combines asynchronous multirate measurements from visual and non-vision-based sensors. Compared with the non-vision measurements, the visual measurements are considered as periodic out-of-sequence measurements which are used to re-estimate the state. With the state estimation, the DTSMC is designed by using a novel sliding surface that includes both the tracking error and the estimation error. By using the Sarpturk inequalities, boundedness of the controlled variables is proved. The effectiveness of the proposed approach is shown in simulation and experimental studies.
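As an illustration of the re-estimation step summarized above, the following Python sketch fuses fast encoder samples at every control period and treats a delayed, slow-rate camera sample as an out-of-sequence measurement by rolling the filter back, fusing it, and re-propagating through the buffered encoder samples. The class, method names, and the linear measurement models are assumptions made for brevity, not the paper's EKF implementation.

import numpy as np
from collections import deque


class MultirateFusion:
    """Sketch of asynchronous multirate fusion with delayed camera samples."""

    def __init__(self, x0, P0, F, Q, H_enc, R_enc, H_cam, R_cam, buffer_len=50):
        self.x, self.P = x0, P0
        self.F, self.Q = F, Q
        self.H_enc, self.R_enc = H_enc, R_enc
        self.H_cam, self.R_cam = H_cam, R_cam
        self.history = deque(maxlen=buffer_len)  # (x, P, z_enc) per fast step

    def _update(self, x, P, z, H, R):
        # Standard Kalman measurement update.
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        return x + K @ (z - H @ x), (np.eye(len(x)) - K @ H) @ P

    def encoder_step(self, z_enc):
        # Fast-rate predict + encoder update, then buffer the result.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        self.x, self.P = self._update(self.x, self.P, z_enc, self.H_enc, self.R_enc)
        self.history.append((self.x.copy(), self.P.copy(), z_enc))

    def camera_step(self, z_cam, delay_steps):
        # Out-of-sequence camera update: roll back to the frame's time stamp,
        # fuse the visual sample, then re-propagate with the buffered encoder data.
        delay_steps = max(1, min(delay_steps, len(self.history)))
        replay = list(self.history)[-delay_steps:]
        self.x, self.P, _ = replay[0]
        self.x, self.P = self._update(self.x, self.P, z_cam, self.H_cam, self.R_cam)
        for _, _, z_enc in replay[1:]:
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q
            self.x, self.P = self._update(self.x, self.P, z_enc,
                                          self.H_enc, self.R_enc)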

Data Availability

No data were used to support this study.

Figure 11: The trajectories of the end effector in image space (position measurements in image space).

Figure 12: The encoder and visual measurements (position of joint in rad versus time in ×100 ms; legend: vision measurements, encoder measurements, position estimation).

Figure 13: The position estimation error of the joint (estimation error in rad versus time in ×100 ms).

Figure 14: Comparative position tracking errors of the end effector (X-axis and Y-axis tracking errors versus time in ×100 ms; legend: proposed method, DTSMC).


Disclosure

An earlier version of the multirate sensor fusion-based DTSMC was presented at the 10th IEEE International Conference on Control and Automation (ICCA) [12]; however, a totally new controller is designed in this paper by using a new sliding surface that includes both the tracking error and the estimation error, and the effectiveness of the proposed approach is verified in both simulation and experimental studies.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] S. Nicosia and P. Tomei, "A tracking controller for flexible joint robots using only link position feedback," IEEE Transactions on Automatic Control, vol. 40, no. 5, pp. 885–890, 1995.

[2] S. Nicosia, P. Tomei, and A. Tornambe, "A nonlinear observer for elastic robots," IEEE Journal on Robotics and Automation, vol. 4, no. 1, pp. 45–52, 1988.

[3] W. E. Dixon, E. Zergeroglu, D. M. Dawson, and M. W. Hannan, "Global adaptive partial state feedback tracking control of rigid-link flexible-joint robots," Robotica, vol. 18, no. 3, pp. 325–336, 2000.

[4] F. Abdollahi, H. A. Talebi, and R. V. Patel, "A stable neural network-based observer with application to flexible-joint manipulators," IEEE Transactions on Neural Networks, vol. 17, no. 1, pp. 118–129, 2006.

[5] J. Na, B. Jing, Y. Huang, G. Gao, and C. Zhang, "Unknown system dynamics estimator for motion control of nonlinear robotic systems," IEEE Transactions on Industrial Electronics, vol. 67, no. 5, pp. 3850–3859, 2020.

[6] S. Jean, M. Tomizuka, and T. Katou, "Kinematic Kalman filter (KKF) for robot end-effector sensing," Journal of Dynamic Systems, Measurement, and Control, vol. 131, no. 2, pp. 1–8, 2009.

[7] C. A. Lightcap and S. A. Banks, "An extended Kalman filter for real-time estimation and control of a rigid-link flexible-joint manipulator," IEEE Transactions on Control Systems Technology, vol. 18, no. 1, pp. 91–103, 2010.

[8] J. G. García, A. Robertsson, J. G. Ortega, and R. Johansson, "Sensor fusion for compliant robot motion control," IEEE Transactions on Robotics, vol. 24, no. 2, pp. 430–441, 2008.

[9] R. Jassemi-Zargani and D. Necsulescu, "Extended Kalman filter-based sensor fusion for operational space control of a robot arm," IEEE Transactions on Instrumentation and Measurement, vol. 51, no. 6, pp. 1279–1282, 2002.

[10] S. Wang, L. Tao, Q. Chen, J. Na, and X. Ren, "USDE-based sliding mode control for servo mechanisms with unknown system dynamics," IEEE/ASME Transactions on Mechatronics, vol. 25, no. 2, pp. 1056–1066, 2020.

[11] K. Jezernik, B. Curk, and J. Harnik, "Observer-based sliding mode control of a robotic manipulator," Robotica, vol. 12, no. 5, pp. 443–448, 1994.

[12] G. Xue, X. Ren, K. Xing, and Q. Chen, "Discrete-time sliding mode control coupled with asynchronous sensor fusion for rigid-link flexible-joint manipulators," in Proceedings of the 2013 10th IEEE International Conference on Control and Automation (ICCA), pp. 238–243, Hangzhou, China, June 2013.

[13] Y. Gao, S. S. Majidabad, and H. T. Shandiz, "Discrete-time based sliding-mode control of robot manipulators," International Journal of Intelligent Computing and Cybernetics, vol. 5, no. 3, pp. 340–358, 2012.

[14] M. L. Corradini, V. Fossi, A. Giantomassi, G. Ippoliti, S. Orlando, and G. Orlando, "Discrete time sliding mode control of robotic manipulators: development and experimental validation," Control Engineering Practice, vol. 20, no. 8, pp. 816–822, 2012.

[15] L. Wang, T. Chai, and L. Zhai, "Neural-network-based terminal sliding-mode control of robotic manipulators including actuator dynamics," IEEE Transactions on Industrial Electronics, vol. 56, no. 9, pp. 3296–3304, 2009.

[16] Y. Feng, X. Yu, and Z. Man, "Non-singular terminal sliding mode control of rigid manipulators," Automatica, vol. 38, no. 12, pp. 2159–2167, 2002.

[17] M. L. Corradini and G. Orlando, "Linear unstable plants with saturating actuators: robust stabilization by a time varying sliding surface," Automatica, vol. 43, no. 1, pp. 88–94, 2007.

[18] Y. Bar-Shalom, H. Chen, and M. Mallick, "One-step solution for the multistep out-of-sequence-measurement problem in tracking," IEEE Transactions on Aerospace and Electronic Systems, vol. 40, no. 1, pp. 27–37, 2004.

[19] K. Reif, F. Sonnemann, and R. Unbehauen, "An EKF-based nonlinear observer with a prescribed degree of stability," Automatica, vol. 34, no. 9, pp. 1119–1123, 1998.

[20] W. Gao, Y. Wang, and A. Homaifa, "Discrete-time variable structure control systems," IEEE Transactions on Industrial Electronics, vol. 42, no. 2, pp. 117–122, 1995.

[21] S. Sarpturk, Y. Istefanopulos, and O. Kaynak, "On the stability of discrete-time sliding mode control systems," IEEE Transactions on Automatic Control, vol. 32, no. 10, pp. 930–932, 1987.
