
The functional model of a servovisual robot system.


Tarca, Radu; Tarca, Ioan; Tripe Vidican, Aron et al.


Abstract: This paper presents the models of the components of a robot system with sensory control. The information regarding the position of a target, materialized by a moving light source and provided by a visual subsystem, is used to control the robot.

Key words: visual servoing, CCD camera, MATLAB Simulink, real time driving

1. Introduction

The functional model of a robot system with visual servoing control is presented in this paper. The robot control is visual, based on the target position measured by two CCD cameras in the feedback loop. The robot block scheme is presented in figure 1.

[FIGURE 1 OMITTED]

[FIGURE 2 OMITTED]

Because the block diagram of the robot system has a global regulator which compares the real position of the end-effector with the prescribed one, one can say that the robot control is a global control. The kinematic diagram of the system realized by the authors is presented in figure 2. The robot system contains a Cartesian robot (Translation-Translation-Translation structure) and a visual subsystem materialized by two charge-coupled device (CCD) cameras. The visual subsystem has to follow the robot end-effector (point G in figure 2). The CCD cameras each have two DOF (pan and tilt). The robot joints are actuated by direct current (DC) motors, and the visual devices are actuated by three step motors (one of them generates the motion along the Ox axis, and the other two generate the motions along the Oy_v and Oy_v' axes).
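As a reading aid for the block models that follow, the structure described above can be summarized in a few lines of code. This is a sketch for orientation only; the labels of the second and third translation axes are assumptions, since the paper does not name them explicitly.

    # Summary of the system structure (sketch only; T2/T3 axis labels assumed).
    from dataclasses import dataclass

    @dataclass
    class Axis:
        name: str       # which motion the axis produces
        actuator: str   # the actuator type named in the paper

    robot_axes = [Axis("T1 translation (Ox)", "DC motor"),
                  Axis("T2 translation", "DC motor"),
                  Axis("T3 translation", "DC motor")]

    camera_axes = [Axis("visual subsystem motion along Ox", "step motor"),
                   Axis("camera 1 motion along Oy_v", "step motor"),
                   Axis("camera 2 motion along Oy_v'", "step motor")]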

2. The aim of the functional model

The aim of the modeling and simulation of the robot system is to foresee how the system works before its actual realization.

Because the robot system is complex and needs to be dynamically controlled, and because its components are heterogeneous (some are described by continuous parameters while others need discrete ones), a model had to be made for each conceived component of the system.

The authors generated a model for each component of the robot system in order to establish:

* the acquisition time of the visual subsystem;

* the command execution time;

* the response time of the computer system,

which are needed to establish whether the robot system works in real time or not. In order to model the robot system behavior, a program capable of reproducing the dynamic behavior of the robot subsystems is needed.

MATLAB® Simulink® possesses all the attributes necessary to model and simulate systems with time-varying states.

3. The system components functional model

3.1. The DC motor functional model

[FIGURE 3 OMITTED]

The DC motor's mathematical model can be found in (Bogdanov, 1989). It contains two transfer functions, "Electric" and "Load", which represent the Laplace transforms of the DC motor's differential equations. The "Load" block output is the motor angular velocity, which after integration becomes the angular coordinate of the corresponding joint. The supply voltage is the "Electric" block input. Friction during motor operation is modeled by the "Dry and viscous friction" block, which captures the dependency of the friction torque on speed.

The DC motor model block diagram is presented in figure 3.

The main components of the DC motor model block diagram are listed below (a simulation sketch follows the list):

1. "Electric" represents the transfer function of the DC motor electric component having as characteristics the inductance L and the resistance R of the winding (for the considered DC motor L = 0.0015 [mH], R = 1.8 [[OMEGA]]). The block input is the supply voltage [V] while the output is the torque [Nm].

2. "Load" represents the transfer function of the DC motor mechanical component loaded with a resistant torque. Its characteristics are: moment of inertia J=0,99x10-6[kg x m2], global viscous friction coefficient B=4,57x10-6 [Nms/rad]. The block input is the torque [Nm], while the output is the angular speed [rad/s].

3. "Dry and viscous friction" block deals with viscous friction which depends on speed and dry friction. The input of the block is the angular speed and the output is the friction torque which has to be substituted from the effective torque.

4. "Integration" is the block that realizes the integration of the angular speed necessary to obtain the generalized parameter of the corresponding cinematic couple [rad].

5. "Sum" is the block for two inputs addition.

6. "Supply voltage" is the input for the D.C motor [V].

7. "Position" represents the angular position of the motor shaft [rad].

8. "Real speed" the angular speed of the motor shaft [rad/s].

3.2. The step motor functional model

[FIGURE 4 OMITTED]

The angular displacements of the visual sensor are made using step motors. The mathematical model of the step motor is presented in (Tarca, 2001).

The step motors used in the experimental device are 4-phase motors, with the principle diagram represented in figure 4.

In this figure the rotor is symbolically represented by a single pair of poles. In reality the rotor has 50 poles, resulting in an angular step of 2π/(50×4) = 2π/200 radians for a single impulse applied to the windings.

The relations previously established for a single winding are replicated to obtain the 4-phase step motor model. The model for one winding is presented in figure 5.

[FIGURE 5 OMITTED]

The following components were used:

1. "In1"--command signal of the winding (step voltage impulses)

2. "Winding 1 parameters"--the transfer function of the winding having as parameters the inductance L=0.017[mH] and resistance R=5[OMEGA]]. Parameters determination is presented in (Tarca, 2001).

3. "Kmpp"--motor constant

4. "Kgpp"--mechanical constant of the transmission

5. "Winding 1 position"--function which models the winding position. It's expression is sin(n*u-v) where u is the input (the instant position of the rotor) and v is the winding position.

6. "[+ or -]"--summing block

7. "x"--multiplying block

8. "Out1"--motor torque [daNm]

9. "In2"--instant position of the rotor [rad]

10. "In3"--instant speed of the rotor [rad/s]

The previously presented model is replicated four times, as can be seen in the complete motor model presented in figure 6. In addition, the following blocks were added (a torque-model sketch follows the list):

1. "Load"--the transfer function which models the motor shaft loading behavior. It contains the inertia moment J, and the dry friction coefficient B. Their calculus is presented in [4].

2. "Wear"--the transfer function which models the viscous friction

3. "Integrator"--integration block having the same role as the one presented at the d.c. motor model.

4. "Angular position MPP"--output of the step motor [rad].

5. "Command signal"--port which transmits the command impulse to the windings phase-shifted for each winding. Command impulses can be generated by a subsystem shown in (Tarca, 2001) or separate generating blocks for each winding can be used.

6. "Winding 'n'"--where n represents the winding number is the subsystem obtained from the model presented in figure 5.

7. "Transmission"--represents the mechanical module of the step motor

8. "MPP transmission"--represents a reduction gear with a single pair of tooth gear, having the gear ratio of 1/3.

[FIGURE 6 OMITTED]

3.3 Functional model of the visual sensor

Two major aspects were taken into account while building the visual sensor model:

* the perspective transformation

* the discrete character of the image acquisition

The following characteristic values were defined for the perspective transformation:

* the distance from the lens focus point of the visual sensor to the plane of the end-effector characteristic point: "D" ("dist" in the functional model);

* the angle between the instant position of the optical axis of the visual sensor and the perpendicular from the lens focus to the robot axis: "α" ("alfa" in the functional model);

* the distance from the lens focus to the CCD plane: "f" ("dist_focal" in the functional model);

* the instant position of the characteristic point relative to an absolute reference mark (the foot of the perpendicular from the sensor lens focus to the plane of the characteristic point): x'_r;

* the instant position of the intersection of the sensor optical axis with the characteristic point plane: x_opt;

* the instant position of the characteristic point relative to the intersection of the optical axis of the sensor with the characteristic point plane: x_r (the input "u" of equation (6));

* the instant position of the characteristic point relative to the optical axis in the image plane: x_p (the value returned by equation (6), expressed in pixels).

[FIGURE 7 OMITTED]

With these notations and using figure 7, the following equations can be written:

B = D / cos α (1)

x_opt = D · tan α (2)

x'_r = D · tan(α + θ) (3)

x_r = x'_r − x_opt = D · [tan(α + arctan(x_p / f)) − tan α] (4)

x_p = f · tan(arctan(x_r / D + tan α) − α) (5)
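Equations (4) and (5) are inverse to each other, which can be verified numerically. In the sketch below the values of D, f and α are illustrative assumptions; only the two formulas themselves come from the paper.

    # Round-trip check of relations (4) and (5) (D, f, alpha assumed).
    from math import tan, atan

    dist, f, alfa = 1.0, 0.008, 0.2     # D [m], focal distance [m], alpha [rad]

    def x_p_from_x_r(x_r):              # equation (5)
        return f * tan(atan(x_r / dist + tan(alfa)) - alfa)

    def x_r_from_x_p(x_p):              # equation (4)
        return dist * (tan(alfa + atan(x_p / f)) - tan(alfa))

    x_r = 0.05
    assert abs(x_r_from_x_p(x_p_from_x_r(x_r)) - x_r) < 1e-9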

The angular position of the visual sensor is modified by the step motor at the moment when the characteristic point image exits the limits of the "interest rectangle". This behavior is graphically described in figure 8.

[FIGURE 8 OMITTED]

Consider the initial moment 1, in which the target image is located in the center of the image plane and the viewing direction of the sensor intersects the moving target plane in a point having the ordinate x_1.

When the target moves to the left, the corresponding displacement of the target image is to the right. As long as the target image is inside the interest rectangle, the instant position of the characteristic point relative to the intersection of the optical axis with the characteristic point plane can be calculated according to equations (2) and (3) (case 2 in figure 8).

If the target image passes the interest rectangle perimeter, a command is transmitted to the step motor in order to reposition the target image inside the interest rectangle (case 3 in figure 8).

The algorithm repeats, each time calculating the instant position of the characteristic point inside the interest rectangle using equations (2) and (3).
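The repositioning algorithm can be sketched as a simple loop: compute the image coordinate from equation (5), and whenever it leaves the interest rectangle, advance the step motor by one step, which updates α. The distances, the rectangle half-width and the target trajectory below are assumed values used only to exercise the logic.

    # Sketch of the interest-rectangle tracking logic of figure 8 (assumed values).
    from math import tan, atan, pi

    dist, f = 1.0, 0.008            # D and focal distance [m] (assumed)
    half_rect = 0.0005              # half-width of the interest rectangle [m] (assumed)
    angular_step = 2 * pi / 200     # one full step of the step motor [rad]

    alfa, steps = 0.0, 0
    for x_abs in [i * 0.001 for i in range(300)]:    # target position along the axis [m]
        # equation (5), written with the absolute position x_abs = x_r + x_opt:
        x_p = f * tan(atan(x_abs / dist) - alfa)
        while abs(x_p) > half_rect:                  # image left the interest rectangle
            steps += 1 if x_p > 0 else -1            # command one step motor step
            alfa = steps * angular_step              # equation (9)
            x_p = f * tan(atan(x_abs / dist) - alfa)
    print(f"{steps} steps executed while tracking the target")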

The functional model of the visual sensor is shown in figure 9. Its component blocks are:

1. "Noticed position"--input port which transmits the characteristic point position evolution

2. "Optical axis position"--input port which transmits the evolution of the intersection point of the optical axis with the robot axis moving direction

3. "Perspective transformation"--algebraic calculus port which uses previously defined equations having in simulation language the expression:

u=tan(atan(u/dist+tan(alfa))-alfa) * dist_focal * n_pixeli (6)

in which ,"u" represents the input of the (u=[x.sub.r]) block.

[FIGURE 9 OMITTED]

4. "Sampler"--block for modeling the discreet characteristic of the image acquisition board. Every image is acquired at a rate of 28 frames per second, images being preserved between two consecutive acquisitions. Maximum acquisition frame rate for a 640x480x8 (8 is the number of colors) is 30 frames/s.

5. "Quantifier"--is a quantifier of the [x.sub.r] value defined at the beginning of the paragraph, which is a real value expressed in an integer number that represents the number of pixels that corresponds to [x.sub.r].

6. "Position in pixels"--is the output port of the subsystem which transfers xp to the command subsystem.

3.4. Functional modeling of the computation system

[FIGURE 10 OMITTED]

The computation system generates the command values for the robot actuators and for the positioning motors of the tracking visual system. Figure 10 presents the computation system model. In this figure the "Sensor signal" and "Limit reach signal" blocks can be noticed; their role is to display the signals they receive, in order to track the simulation evolution. The mentioned command values are computed using the value x_p transmitted by the visual sensor as input.

The command signal of the step motor is generated when the limits of the image plane interest rectangle are overstepped, as shown in figure 8. Inside the model this is realized through the following blocks:

1. "Memory initialization"--defines a space of memory in which system writes and reads the number of steps realized by the step motor.

2. "Sensor signal"--system input port.

3. "Space limit"--contains the relation:

u(1)-x_frame *(u(2)+1) (7)

where:--x_frame is the border of the limitation rectangle;

--u(1) is the input value of the visual sensor block

--u(2) is the number of steps executed by the step motor

4. "Limit detection"--the block that initiates the function of the "Step motor command signal generator" at the moment of the border overstep.

5. "Steps counted"--read from memory the number of steps realized by the step motor

6. "Step motor command signal"--subsystem having the structure shown in figure 11.

This subsystem is triggered by the "Trigger" block under the condition mentioned above and receives the input signal from the "Previously counted steps" port. The input signal is incremented by 1, after which the function:

Mod(u+1,8) (8)

is applied. Function (8) computes the remainder of the incremented signal divided by 8, thus obtaining the index of the column of the table memorized in the "Signal values table" block. The values memorized in this block are shown in table 1.

Depending on the index value (the position of the winding which is to be actuated), four signals are generated, one for each winding. These signals are transmitted through the "Signals" port to the computing system.
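Relation (7), function (8) and the signal table combine into a small command generator, sketched below. The x_frame value is an assumption; the table rows are the values from table 1.

    # Sketch of the step motor command generator (x_frame assumed).
    SIGNAL_TABLE = [                 # table 1: rows = windings, columns = index 0..7
        [-1, 0, 0, 0, 1, 0, 0, 0],
        [0, -1, 0, 0, 0, 1, 0, 0],
        [0, 0, -1, 0, 0, 0, 1, 0],
        [0, 0, 0, -1, 0, 0, 0, 1],
    ]

    def step_motor_command(sensor_value, steps_done, x_frame=100):
        # relation (7): positive when the image oversteps the current border
        if sensor_value - x_frame * (steps_done + 1) <= 0:
            return steps_done, None              # still inside the interest rectangle
        steps_done += 1                          # one more step will be executed
        index = steps_done % 8                   # function (8): Mod(u+1,8)
        return steps_done, [row[index] for row in SIGNAL_TABLE]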

7. "Computed angular displacement of the step motor" is a block which calculates the "alfa" parameter using the equation:

alfa=step_number * angular_step (9)

where:

--step_number is the number of steps realized by the step motor;

--angular_step is the value of a single step of the step motor [rad], which is constant and equal to 2π/200.

"Alfa" parameter is needed to compute the level of the d.c. motor command value.

[FIGURE 11 OMITTED]

8. "Step counting" is a subsystem which counts the steps realized by the step motor and transfers the information in memory. The block diagram of the subsystem is presented in figure 12.

The meanings of the figure 12 notations are:

* "Signal"--input system port;

* "abs1", ... "abs4"--returns the absolute value of the signals on the 4 channels, corresponding to the 4 windings;

* "counter1", ..., "counter4"--counters having as input "Clock" (clk) and as output "Count" (cnt) block;

* "Sum"--computes the sum of the steps executed by each one of the 4 windings.

* "Realized number of steps storage" writes in memory the computed value.

9. "Step motor command"--output port from the command system which transmits the generated signals toward step motor

[FIGURE 12 OMITTED]

10. "[x.sub.r] from [x.sub.p] computing"--the d.c. motor signal command component is computed with equation (4) which in modeling language can be written:

dist * (tan(alfa+atan(u/dist_focal))-tan(alfa)) (10) 11. "D.C. motor command"--is the output port of the command system which transmits the "[x.sub.r]" parameter value to the d.c. motor actuating devices of the robot.
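Relation (10) can be read as a plain function that recovers x_r from the sensor reading. In the sketch below the conversion from pixels back to image-plane length is an added assumption (the inverse of the n_pixeli scaling in equation (6)); dist, dist_focal and alfa would come from the blocks described above.

    # Relation (10) as a function (pixel-to-length conversion assumed).
    from math import tan, atan

    def x_r_from_pixels(x_p_pixels, alfa, dist=1.0, dist_focal=0.008,
                        pixel_pitch=9.9e-6):
        u = x_p_pixels * pixel_pitch            # pixels -> image-plane length [m]
        return dist * (tan(alfa + atan(u / dist_focal)) - tan(alfa))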

3.5 Functional model

The functional models of the component elements of the robot system were presented in the previous paragraphs. These elements must now be integrated into the complete schema of the model. The functional model of the robot was realized for a one-axis targeting system (figure 13). In order to study the dynamic behavior of a command system with a visual sensor, the functional model of the system was realized for one robot kinematic pair. The previously described elements are used in this model.

3.6 Functional model of the system

"Commanded position" block generates a step signal which represents the programmed position of the axles. At the moment of receiving the signal "Chopper" block generates the signal that represents the necessary supply voltage for the d.c. motor. "MCC" block has as output the signal that represents the relative position of the modeled cinematic couple elements. This position is detected by the "visual sensor" and evaluated by the "computing system", being afterward compared with the computed value. The comparison result is the position error and represents the new level of command. As the value that characterizes the achieved position increases, the difference between the programmed values and the real value decreases, and as a fact the command voltage of the d.c. motor decreases, in the end the d.c. motor stops at the programmed position.

[FIGURE 13 OMITTED]

Because the visual field of the sensor does not cover the entire moving domain of the modeled axis, the sensor is moved in such a way that the characteristic point permanently remains in its visual field. The displacement is achieved using a step motor commanded by the computing system, in such a manner that it executes a step each time the characteristic point approaches within a defined distance of the sensor's visual field limit.

The structure and the functioning mode of each of the modules have been previously described. To evaluate the behavior of the modeled system, graphical display elements for different signals as functions of time ("Real position", "Optic axis position", etc.) were introduced in the block diagram.

[FIGURE 14 OMITTED]

[FIGURE 15 OMITTED]

The results of the robot system simulation for one kinematic pair with relative movement along the O_0x_0 axis were obtained for a step signal having the shape presented in figure 14. This signal represents the prescribed position of the lighting target along the O_0x_0 axis. The command signal was generated with a delay of 0.01 [s]. The signal is applied to the chopper and thus generates a command voltage having the shape presented in figure 15. After an abrupt rise, the signal begins to decrease as the real position approaches the prescribed one. The staircase character of the signal appears due to the visual sensor behavior which, because of the digital acquisition of the frames, generates discrete signals downstream.

Figure 16 presents the robot system response to the step command signal previously presented. This signal represents the time variation of the target position and is collected at the output port of the MCC block. The robot system response to the step command signal offers information about its settling time. It was observed that the settling time after which the robot system reaches the prescribed position of 0.2 [m] with an error of 2.5% is 0.4 [s]. If the prescribed positioning error needs to be under 1%, the settling time increases to 0.5 [s].

[FIGURE 16 OMITTED]

The angular speed variation with time, measured at the d.c. motor shaft, is presented in figure 17 and is collected from the "Load" block output. As seen in figure 17, the angular speed of the d.c. motor rotor was not limited during simulation, in order to obtain a trapezoidal shape for the speed variation with time. The visual sensor response, which at the same time is an input for the computing system, is presented in figure 18 and is collected from the "Sensor signal" block output.

[FIGURE 17 OMITTED]

[FIGURE 18 OMITTED]

[FIGURE 19 OMITTED]

"Computing system" block generates the signal necessary to the step motor command and also the command signal for the d.c. motor. These two signals are presented in figures 20 and respectively 21. During target displacement "Computing system" block concluded the necessity of two command signals for the step motor so that the target image should be placed inside the interest rectangle from the image plane. The step motor output signal is transformed in "Sensor optical axis position modification" block into a signal corresponding to [x.sub.opt] computed with equation (2), having the evolution with time presented in figure 20. Over this signal the signal presented in figure 21 is superimposed, thus the position detected by the mean of the visual sensor (real position) being obtained. Figure 22 presents this signal.

[FIGURE 20 OMITTED]

[FIGURE 21 OMITTED]

The robot system reaction time can be determined from figure 20. It can be seen that the tracking visual system realizes two consecutive steps in an interval of 0.0332 [s], the time in which the visual sensor acquires a frame. Consequently, the time necessary for the robot system to travel through a complete loop (including the step motor command) is at least 0.0332 [s]. The computed position represents in fact the feedback of the robot system; superimposed on the reference signal, it generates the correction command for the kinematic pair.

Because the simulation results of the robot system for the kinematic pair that moves along the O_0y_0 axis are similar, they are not presented in the paper.

[FIGURE 22 OMITTED]

4. Conclusions

The results of the functioning simulation of the robot system that uses information from a visual tracking system validate the conceived models. As can be seen from the robot system response to a step signal input (figure 16), the time necessary to reach the prescribed position with a 1% error is 0.53 [s], a time in which the adjustment loop of the system was run 16 times. Consequently, the average time for one pass through the loop is 0.033 [s]. The time for generating the d.c. and step motor commands is under 0.01 [s].

The corresponding time for the image acquisition results from the Matrox® board acquisition frequency, which is 30 frames per second. Consequently, the limitation regarding the loop execution time comes from the visual sensor. Table 2 presents the times obtained in the robot system functioning simulation.

The simulation results must still be validated on the real sensor-driven robot system, which uses information from the visual tracking system.

In conclusion, considering the times obtained in the simulation process, one can say that the sensor-driven robot system using information from a visual tracking system functions in real time.

5. References

Bogdanov, I. (1989). Microprocesorul in comanda actionarilor electrice, Editura Facla, Timisoara.

Kovacs, F.V. & Radulescu, C. (1992). Roboti Industriali, vol. I-II, Lito. Universitatea "Politehnica" Timisoara.

Tarca, R.C. (2001). Utilizarea informatiilor cu privire la situarea efectorului final achizitionate prin senzori in vederea conducerii in timp real a robotilor, PhD Thesis, Timisoara.

Corke, P.I. (1994). High-Performance Visual Closed-Loop Robot Control, PhD Thesis, University of Melbourne.

Hashimoto, K. (1991). Model-Based Control of Manipulator with Image-Based Dynamic Visual Servo, PhD Thesis, Osaka University.

Authors' data: Prof. PhD. Tarca R.[adu], Assoc. Prof. PhD. Tarca I.[oan], Prof. PhD. Tripe Vidican A.[ron], lecturer Tocut P.[avel] D.[anut], lecturer Tripe Vidican C.[alin], University of Oradea, Romania

This Publication has to be referred as: Tarca, R.; Tarca, I.; Tripe Vidican, A.; Tocut, P.D. & Tripe Vidican, C. (2006). The Functional Model of a Servovisual Robot System, Chapter 48 in DAAAM International Scientific Book 2006, B. Katalinic (Ed.), Published by DAAAM International, ISBN 3-901509-47-X, ISSN 1726-9687, Vienna, Austria

DOI: 10.2507/daaam.scibook.2006.48
Table 1. Values memorized in "Signal values table"

Index 0 1 2 3 4 5 6 7

Winding 1 -1 0 0 0 1 0 0 0
Winding 2 0 -1 0 0 0 1 0 0
Winding 3 0 0 -1 0 0 0 1 0
Winding 4 0 0 0 -1 0 0 0 1

Table 2. Times obtained during simulation [s]

Simulated process                          Time obtained [s]

D.C. motor signal generation               0.01
Image acquisition                          0.03
Computation in "Computing system" block    0.01
Step motor signal generation               0.01