
MOTION TRACKING OF BRACHIATION ROBOT BASED ON LABVIEW

USING SEAMLESS INTEGRATION TECHNOLOGY

1 Henry Liu (劉又豪), 2 Chiou-Shann Fuh (傅楸善)

1 Department of Mechanical Engineering,

2 Department of Computer Science and Information Engineering,

National Taiwan University, Taipei, Taiwan

E-mail: [email protected] [email protected]

ABSTRACT

We report on the realization of real-time synchronous image acquisition and image processing for detecting the motion of a brachiation robot. To obtain the best performance, we combine two platforms: LabVIEW, which provides the environment to control the brachiation robot, and MATLAB, which has powerful image-processing functions. Furthermore, by using COM technology, the slow processing speed caused by integrating the two platforms can be eliminated. Finally, we conducted experiments to investigate the performance of the proposed method. The webcam successfully captured the motion of the robot in the real-time system.

Keywords: LabVIEW, Real-time visual processing,

MATLAB, COM technology

1. INTRODUCTION

The brachiation robot is a typical bio-inspired robot modeled on the gibbon. When analyzing the dynamic motion of a brachiation robot, certain measurements, such as angle, acceleration, and center of mass, are required for the calculation. To collect these data, equipment such as an inertial measurement unit (IMU) is commonly used.

However, several difficulties need to be addressed when utilizing this device. First of all, an IMU consists of two elements: an accelerometer and a gyroscope.

Fig. 1 presents several crucial parameters of the brachiation robot model. We need the angles θ1 and θ3 to derive the current posture of the brachiation robot: θ1 refers to the angle between the hand and the horizontal branch, and θ3 refers to the angle between the arm and the body.

Fig. 1 Brachiation robot model.

These angles are obtained by integrating the angular velocity from the gyroscope and by computing the angle between two acceleration vectors [1]. Nevertheless, the results are generally not ideal due to the intense sensitivity of the two instruments.
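In equation form (a standard formulation; the exact reference vectors depend on the sensor mounting, which is not detailed here), the two estimates are

θ_gyro(t) = θ(0) + ∫₀ᵗ ω(τ) dτ,    θ_acc = arccos( (a₁ · a₂) / (‖a₁‖ ‖a₂‖) ),

where ω is the angular velocity reported by the gyroscope and a₁, a₂ are the two acceleration vectors being compared.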

Filters are implemented to improve the poor performance caused by this sensitivity, but becoming familiar with a filter can take even more time. Take the Kalman filter for example: although it does enhance the results, several experiments must be done to derive the parameters needed by the filter equations, which is a painstaking process [2]. Furthermore, after the basic requirements are met, only a few I/O pins are left on myRIO, the embedded device on the brachiation robot. Since an IMU takes five I/O pins to connect with myRIO, the measurements we can acquire are limited.


Therefore, we propose a method to facilitate the whole process. Real-time image acquisition is realized with the virtual instrument design software LabVIEW, which is already the software used to control the robot, so additional development time for other software is saved. On the other hand, the image-processing algorithm is designed in MATLAB and integrated into LabVIEW with COM technology, instead of the MATLAB Script Node, a commonly used method in LabVIEW, to enhance the processing speed.

2. LABVIEW OVERVIEW

LabVIEW (Laboratory Virtual Instrument Engineering Workbench), developed by National Instruments, is a system design platform and development environment. It uses a graphical programming language that is easy to understand and simplifies the design process. LabVIEW gives a more accurate picture of the design concept because it implements a data-flow paradigm in which the code is not written but drawn, represented graphically like a flowchart (Fig. 2) [3]. Functions in LabVIEW

are stored in virtual instruments (VIs), each of which has two essential parts: the front panel, for displaying or controlling the parameters, and the block diagram, where the algorithms are designed using graphical components.

Fig. 3 Example of NI-IMAQdx.

Due to the user-friendly graphical interface, the design process is much more straightforward,

reducing the programming time. Furthermore, with

plenty of image processing and machine vision functions

or toolkits, LabVIEW supports both image acquisition and image processing.

3. SEAMLESS INTEGRATION BETWEEN

LABVIEW AND MATLAB

3.1. Comparison of Different Approaches

There are two approaches for LabVIEW to acquire and process images: NI-IMAQdx and using another programming language. NI-IMAQdx is the built-in toolkit that offers image-acquisition and processing functions in the virtual instrument environment. The toolkit covers the complete image-processing workflow: from controlling the camera to processing images, NI-IMAQdx has functions to handle the requirements (Fig. 3).

Since some image-processing algorithms are quite complicated and computationally demanding, a graphical programming language can make the program difficult to manage. Thus, using another programming language is a preferable way for LabVIEW to achieve the target, and among all languages, MATLAB is a common choice. MATLAB can cooperate with LabVIEW in three ways: (1) using the MATLAB Script Node, (2) directly calling the MATLAB ActiveX server, and (3) using COM technology.

However, [4] shows that to realize a real-time system,

which requires faster response speed and higher

application efficiency, COM technology is the best

choice.

As for the NI-IMAQdx approach versus COM technology, [5] concludes that COM technology is more flexible, has better performance, and can accomplish image acquisition and camera control excellently. Furthermore, COM technology takes less time and memory than the NI-IMAQdx method. Therefore, it is the better choice for our measurements.

Fig. 2 Example of LabVIEW. (a) Front panel. (b) Block diagram.

3.2. COM Technology Overview

The Component Object Model (COM), created by Microsoft, is a system that connects various platforms by defining binary software components that can interact. It is also the foundation of the ActiveX functions we call in LabVIEW. The main idea of COM is that it specifies the rules that regulate how a component is formed. Designers are asked to provide well-defined interfaces while hiding the details of the implementation. Through reference counting, the component itself handles its memory management, which is a major difference from several languages. In this application binary interface (ABI), the data and machine code are packaged and operated in a specific way according to the structure of COM. Since a component is implemented with several interfaces, a variety of platforms can execute the code and access its functionality. Dynamic interface negotiation in COM is handled by QueryInterface.

3.3. Realization of COM Method

Fig. 4 shows the system flow chart of the COM method. The whole process is divided into two parts: the first half executes in MATLAB, and the second half is accomplished in LabVIEW. The primary task in MATLAB is to design the image-processing algorithm for motion tracking. Complicated image processing is simplified since MATLAB has a built-in Image Processing Toolbox. Furthermore, with COM technology, the MATLAB script files are compiled into binary COM components, so this advantage is preserved in LabVIEW.

To compile the MATLAB script file into a COM component, open the Library Compiler by entering "deploytool" in the MATLAB command line; the compiler dialog then appears (Fig. 5). Choose the m-file that stores the image-processing algorithm for packaging. Note that an m-file for a COM component must be written as a function, and the name of the function must be identical to the name of the m-file. Also, it is recommended to double-check the image-processing code to ensure correct results, since it cannot be adjusted after being integrated into LabVIEW.

Fig. 5 COM compiler.
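As a minimal sketch of the required form (the function name, inputs, and outputs here are hypothetical; the actual m-file is not reproduced in this paper), a file named trackAngle.m must define a function of the same name:

% trackAngle.m -- hypothetical skeleton of a compilable m-file.
% A COM-packaged m-file must be a function (not a script), and the
% function name must be identical to the file name.
function [angleDeg, found] = trackAngle(R, G, B)
    % ... image-processing algorithm goes here (see Section 4.2) ...
    angleDeg = 0;    % placeholder output
    found = false;   % placeholder output
end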

Compiling the m-file into a COM component produces a folder that contains two main files: a BAT file and a DLL file. The BAT file initializes the installation, and the DLL file is the COM component itself. Both files are used in the process. First of all, the component must be registered on the computer before the file can be loaded into LabVIEW. Find "_install.bat" in the folder and double-click it. The registration can also be accomplished with the "regsvr32" command: open the "Start/Run" interface, type "regsvr32 file_path\function_name.dll", and wait for the installation. Note that the file path of the COM component must not contain Chinese characters, or errors will appear afterward. Furthermore, the file path of the COM component must not change after installation, or it will not be found in LabVIEW.

In the final part of the COM-component application, we call the COM component in LabVIEW. Fig. 6 gives a brief block diagram of calling a COM component. The process and principle are similar to those in a Visual Basic environment.

Fig. 4 System flow of COM technology.


Fig. 6 Block diagram of calling COM.

Fig. 7 Complete block diagram.

However, some specific differences must be noted to avoid compile errors. A loaded DLL file does not appear on the Front Panel, so the file must be called in the Block Diagram. Open the Functions palette and find the "ActiveX" block in the "Communication" category. "Automation Open.vi" is the function in which we choose the ActiveX class. Nevertheless, the class appears in the list only if the DLL file was installed in the previous step.

The next step is to operate the function we designed. Use "Invoke Node.vi" in the ActiveX block to invoke the function. This VI provides both input and output terminals as well as the variable "nargout," which indicates the number of output parameters. If the MATLAB function returns output variables, the value of nargout must not be zero. When the COM component is called, the parameters input from LabVIEW are converted into MATLAB array form for the inner function to operate on. The output of the MATLAB function returns to LabVIEW in a particular form called a "Variant." Using "Variant to Data.vi," LabVIEW transfers the results into an assigned data type such as double, integer, or array. Note that the MATLAB function determines the type of the output data; if one selects the wrong type for the Variant conversion, LabVIEW shows an error message. Finally, after the whole process is finished, the sub-VI "Close Reference.vi" is responsible for disconnecting from the DLL file and cleaning up the temporary pointer to the object. The complete program diagram of the motion tracking, which contains the functions for acquiring the image from the webcam and processing the frame through the MATLAB function, is shown in Fig. 7.

Fig. 8 System flowchart.

Fig. 9 Hardware. (a) Brachiation robot. (b) Logitech C922 Pro Stream Webcam.

4. SYSTEM DESIGN

4.1. Hardware System

The system is presented in Fig. 8. The user PC runs LabVIEW to control the brachiation robot's motion. The brachiation robot receives the signals from the PC, which are handled by the microprocessor (myRIO-1900, National Instruments, Fig. 9 (a)). Meanwhile, myRIO translates the signals into commands for the actuators, and the brachiation robot starts to swing. Since the orientation of the whole robot cannot be measured by the encoders of the actuators, an extra sensor is needed. However, instead of the IMU generally used in robots, we propose a real-time system that uses a webcam to grab images of the robot as feedback. The webcam we chose is the Logitech C922 Pro Stream Webcam, a high-quality webcam that satisfies our needs (Fig. 9 (b)). By applying computer vision, the feedback sensor is not influenced by the sensitive fluctuations resulting from the brachiation robot's movements, which ensures the stability and accuracy of the measurements. Also, the brachiation robot saves at least the five I/O pins that an IMU needs, so these pins are available for other purposes.

Fig. 10 Algorithm of image processing.

4.2. The Algorithm of Motion Tracking

To detect the motion, we place a bright red mark on the brachiation robot's arm. Furthermore, a white background is necessary to reduce environmental noise. Fig. 10 shows the flowchart of image processing. We design the algorithm in the following steps; some steps are processed in LabVIEW, and others are performed by the MATLAB function. A MATLAB sketch of the processing steps follows the list.

1. Grab the image frame through the Logitech C922 webcam with LabVIEW IMAQdx.
2. Split the frame into three independent color planes.
3. Transfer the three color planes into MATLAB array format.
4. To detect the target linkage, give the three layers different weights and subtract them from each other, deriving a single layer.
5. Use a median filter to reduce the noise and enhance the target.
6. Calculate the global image threshold with Otsu's method to binarize the grayscale picture.
7. Execute the morphology process. Here we apply opening, which is the dilation of the erosion: a rectangular kernel erodes small regions, and the dilation fills cracks within the target.
8. Read the region properties with the MATLAB function "regionprops." The results list the properties of all regions found; the largest region is the final target we focus on.
9. Read the orientation of the target region through regionprops and derive corresponding properties, such as the center position, for plotting.
10. If external noise interferes with the results significantly, a simple filter can be applied again to reduce the effect.
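A minimal MATLAB sketch of steps 4 through 9 is given below; the channel weights, filter window, kernel size, and function name are our own assumptions for illustration rather than the exact parameters used in this work:

% Hypothetical MATLAB sketch of steps 4-9 (parameter values assumed).
% R, G, B are the color planes received from LabVIEW in step 3.
function [theta1, center] = locateRedMark(R, G, B)
    w = 2*double(R) - double(G) - double(B);     % step 4: weighted subtraction
    w = max(w, 0) / max(max(w(:)), eps);         % normalize to [0, 1]
    w = medfilt2(w, [3 3]);                      % step 5: median filter
    bw = imbinarize(w, graythresh(w));           % step 6: Otsu threshold
    bw = imopen(bw, strel('rectangle', [5 5]));  % step 7: opening
    s = regionprops(bw, 'Area', 'Orientation', 'Centroid');  % step 8
    [~, k] = max([s.Area]);                      % largest region is the target
    theta1 = s(k).Orientation;                   % step 9: orientation (deg)
    center = s(k).Centroid;                      % center position for plotting
end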

5. EXPERIMENTS AND RESULTS

The experiments are presented from two major perspectives. In the first part, we compare the measurements obtained from the IMU and the webcam. In the second part, we examine whether the whole process is accelerated by using COM technology.

5.1. Examination of Accuracy

To ensure that the result of the algorithm above is precise and better than that of the IMU, a preliminary examination is designed and applied before we start the official experiment. First, we mount the camera on a shelf and make sure its direction of view corresponds correctly to the red mark on the robot by setting both the camera and the mark to the same height and using a spirit level for fine-tuning. Note that even if we use another camera, as long as the heights of the camera and the robot remain the same, the distance between the camera and the robot does not affect the orientation, since this feature is scale-invariant. Next, a series of values is assigned as the angle by which the robot arm tilts from the initial position, i.e., θ1. We capture images at different θ1 and measure the actual angle with a protractor (shown in Fig. 11). In the meantime, the VI that contains the algorithm is also executed, giving the instant


angle measurement. The comparison of the two measurements is shown in Table 1; the absolute errors are 0.48, 0.65, and 0.42 degrees, i.e., around 0.5 degree, which indicates the high accuracy of the algorithm.

5.2. Comparison between the IMU and the Webcam

After confirming the accuracy of the algorithm, we examine whether the performance of the webcam is better than that of the IMU. The discussion focuses on two situations: normal brachiation and brachiation with undesired disturbance. The robot needs to swing in a regular pattern so that we can record and compare the data. We set a continuous sinusoidal wave as the command to the actuator. The frequency of this wave is 0.75 Hz, which is close to the natural frequency of the robot. Therefore, when the robot swings with this command, it gradually swings higher and remains stable. We run and record the experiment for 60 seconds and capture the data in a specific time section for comparison.

Fig. 12 shows two charts for the different situations. Chart (a) refers to normal brachiation, while chart (b) presents brachiation under disturbance. Each chart plots the angle θ1 against time and consists of three types of data. The green dotted line represents the raw data received from the IMU, while the blue line refers to the IMU data filtered with a complementary filter. The complementary filter is the most common filter applied to IMUs, since it requires no complicated calculation and is easy to understand: it combines the data from the gyroscope and the accelerometer and performs high-pass and low-pass filtering on each [6]. Last but not least, the red line refers to the results from the webcam.
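A minimal sketch of such a filter is given below (a standard formulation; the weighting factor alpha and the names are assumptions, not the values used in this experiment):

% Hypothetical complementary-filter update, called once per sample period dt.
% thetaPrev: previous angle estimate (deg); gyroRate: gyroscope rate (deg/s);
% accAngle: angle derived from the accelerometer vector (deg).
function theta = complementaryUpdate(thetaPrev, gyroRate, accAngle, dt)
    alpha = 0.98;                                    % assumed weighting factor
    theta = alpha * (thetaPrev + gyroRate * dt) ...  % high-pass path: integrated gyro
          + (1 - alpha) * accAngle;                  % low-pass path: accelerometer
end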

During normal brachiation, as chart (a) shows, the unfiltered IMU data is quite noisy, fluctuating throughout the whole process; however, its transient response is fast and able to capture the peak value. The filtered IMU data has a better waveform because it eliminates most fluctuations. Nevertheless, this data has one major problem: the transient response is not fast enough for this system because of the time spent on the operation of the complementary filter. Even though we use the Timed Loop structure, which can specify how fast a loop iterates, to speed up our program, it is not sufficient for the requirement. Therefore, in chart (a), we can see that the rise time of the wave is too large, missing the moment when the robot reaches the peak, so the peak value of the IMU data is always lower than the actual one. We can reduce the error by adjusting the weighting factors in the complementary filter; however, the measurement never reaches the actual value.

Fig. 11 Snapshots of three angles.

Method                  Small   Medium   Large
Hand measure            2.0     18.8     36.1
Algorithm calculation   2.48    18.15    36.52

Table 1 Angle comparison (deg.).

Fig. 12 Performance of the two situations. (a) Normal brachiation. (b) Brachiation under disturbance.


Fig. 13 Coordinates of the brachiation robot.

As for the measurement from the webcam, it performs excellently under normal brachiation. Its waveform is the smoothest because the webcam is mounted on the shelf instead of on the robot, which excludes the effect of most fluctuations. Furthermore, with the help of the computer's computing power, the transient response is sufficiently fast, and the peak value of the robot is captured perfectly.

When it comes to disturbance, there are two types of disturbances in our robot. The first type is fluctuation of the motion caused by the mechanisms. For example, while stretching the sliding mechanism, the robot experiences a drop force. The magnitude of this force varies over a wide range of commands, influencing the system to various degrees.

This type of disturbance has a dominant impact on the IMU. Since the IMU measures the angle from instantaneous acceleration and angular-speed data, every time the drop force occurs, the value increases (or decreases) dramatically, affecting the accuracy. On the contrary, this kind of disturbance has little effect on the webcam data, since the webcam is not mounted on the robot. Also, we paint the red mark on the first linkage, the one connected to the branch, so it receives less impact when the disturbance occurs at the lower linkage. In chart (b), there are two moments when the disturbance occurs, and both IMU data sets are affected acutely. The filter cannot operate well in this situation because the period is too short, and the gyroscope value, whose response speed is higher than the accelerometer's, dominates the results. However, the webcam data shows excellent immunity to the disturbance: the waveform is still as smooth as normal, and the values during the period of the disturbance remain unchanged.

Fig. 14 MATLAB Script Node.

Secondly, for simplicity, we assume that our robot swings in pure brachiation on a single plane (the x-y plane) that is perpendicular to the view of the webcam (Fig. 13). This assumption implies a constant distance between the red mark and the webcam. However, from the perspective of biology, instead of keeping the body on a single plane, gibbons swing with a slightly tilted posture, because they hold the branch with one hand at a time during brachiation. The asymmetric body tends to tilt to keep its center of mass on the x-y plane. This phenomenon results in a periodic displacement along the z-coordinate, which violates our assumption. It also occurs in our robot, affecting the performance of both methods.

The displacement along the z-coordinate influences the accelerometer in the IMU. The accelerometer measures acceleration in three dimensions; if the brachiation motion is not purely on the x-y plane, a bias appears in the composite acceleration vector, reducing the accuracy. Also, the gyroscope is sensitive enough to detect the undesired angular velocity coming from the displacement. Therefore, the results are underestimated. As for the webcam, the periodic displacement forces the surface of the red mark out of parallel with the x-y plane. The red mark that the webcam actually captures becomes the projection of the real mark, changing its shape and area. In our algorithm, this deformation of the red mark may lead to errors in the calculation. However, due to the hand design, the displacement along the z-coordinate is limited by the mechanism constraints, so the deformation has little effect on the accuracy, about 0.5 to 1 degree.

5.3. Performance with and without COM Technology

The previous experiments confirm the advantage of using the webcam. In this section, we examine the power of COM technology. As mentioned above, there are three ways for LabVIEW to cooperate with MATLAB, and the MATLAB Script Node is the most popular since it is very convenient to use: just call the function, type the MATLAB code into the frame, and set the input and output terminals for other functions to connect (shown in Fig. 14). However, this convenient method implies a higher workload for the system, reducing the operating efficiency.

To compare the efficiency of these two methods, we create two VIs with identical structure: one uses COM technology, while the other uses the MATLAB Script Node. To acquire information about the processing speed, we use a built-in function called "Calculate Frames per Second.vi" and run each VI separately. The test format is 320×240 pixels at 30 fps with MJPG. Fig. 15 shows the comparison of the two methods. The fps with COM technology remains at 30 after a few iterations, while the fps with the MATLAB Script Node drops to half of the acquired rate. Therefore, we conclude that COM technology works faster than the MATLAB Script Node.

Fig. 15 Comparison of frames per second.

6. CONCLUSION

In this paper, we propose a method to realize real-time synchronous image acquisition and image processing. By using MATLAB, we take advantage of powerful image-processing functions instead of the built-in LabVIEW VIs. Furthermore, to enhance the processing speed for real-time operation, we use COM technology as the connection between LabVIEW and MATLAB. The experiments confirm the better performance of this method compared to others such as the MATLAB Script Node. The next stage is to detect the whole body with the webcam and reconstruct the model using the designed system. Furthermore, with COM technology, we will refine the algorithm in MATLAB to enhance its ability to resist disturbances and reduce the false-detection rate.

REFERENCES

[1] Pengfei Gui, Liqiong Tang, and Subhas Mukhopadhyay, "MEMS based IMU for tilting measurement: Comparison of complementary and Kalman filter based data fusion," IEEE 10th Conference on Industrial Electronics and Applications (ICIEA), 2015.

[2] Viktor P. Semenov, Vladimir V. Chernokulsky, and Natalya V. Razmochaeva, "Investigation of the Kalman filter in problems of increasing the accuracy of measurements using MEMS (9 axis) motion sensors," IEEE Conference of Russian Young Researchers in Electrical and Electronic Engineering (EIConRus), 2018.

[3] Chance Elliott, Vipin Vijayakumar, Wesley Zink, and Richard Hansen, "National Instruments LabVIEW: A Programming Environment for Laboratory Automation and Measurement," 2007.

[4] Yupeng Xu and Feihong Yu, "Research on the image acquisition and camera control of machine vision camera based on LabVIEW," Fifth International Conference on Intelligent Human-Machine Systems and Cybernetics, 2013.

[5] Wuni Xu, Lanxiang Zhong, and Dingyuan Wang, "Image Processing Based on Seamless Integration Technology Between LabVIEW and MATLAB," International Conference on Information, Networking and Automation (ICINA), 2010.

[6] S. Colton, "The Balance Filter," 2007.
