
Article Information

  • Title: Remote robot control via internet using augmented reality
  • Authors: Tarca, Radu; Pasc, Ildiko; Tarca, Naiana
  • Journal: Annals of DAAAM & Proceedings
  • Print ISSN: 1726-9679
  • Year: 2007
  • Issue: January
  • Language: English
  • Publisher: DAAAM International Vienna
  • Keywords: Augmented reality; Internet; Remote control; Robots

Remote robot control via internet using augmented reality.


Tarca, Radu; Pasc, Ildiko; Tarca, Naiana


Abstract: This paper presents remote robot control via the Internet using an augmented reality interface. The Mitsubishi telerobot project demonstrates how much an improved AR interface can increase the performance of a telerobotic system without changing any of the telerobot's technical features.

Key words: remote control, Internet, augmented reality

1. INTRODUCTION

In his AR survey, Azuma (1997) defines Augmented Reality as any system with the following three characteristics:

* Combines real and virtual;

* Is interactive in real time;

* Is registered in three dimensions.

Although AR systems may also augment other human senses, like the auditory or haptic sense, most current systems only implement the visual channel.

This paper presents the possibilities of using augmented reality (AR) to control a robot system via the Internet.

In 1998 Harald Friz developed in his diploma thesis an AR tool used to specify the position and orientation of the robot's end-effector (Friz, 1998). In October 2003 a research team from the University of Western Australia in Perth released an AR tool (version 1.0) for the UWA Telerobot, which allows operators to model objects for easier robot manipulation (Palmer, 2003).

Our research team offers another solution for telerobot control. In the first step we built the telerobot system; we then developed an AR interface that allows the operator to create a 3D model of any piece in the visual field and to overlay this model on the real object, thus obtaining the position of the object's mass centre and its orientation. With this information it is easy to command the robot via the Internet to pick the object and place it anywhere in the workspace.

Our AR interface is based on a new concept and makes it possible to manipulate any kind of object, not only prismatic ones (as in the previous approaches).

The next step in the development of our telerobot system is to extend the AR interface beyond the visual sense to the haptic sense, using haptic gloves and a head-mounted display (HMD) to command and control the process.

2. THE TELEROBOT SYSTEM

2.1 The System Structure

The concept of "human supervisory control" (Sheridan, 1992) that underlies a telerobot is illustrated in figure 1. The human operator interacts with the human-interactive computer (HIC). It should provide the human with meaningful and immediate feedback. The subordinate task-interactive computer (TIC) that accompanies the controlled robot receives commands, translates them into executable command sequences, and controls command execution.
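
As an illustration of this division of labour, the following minimal Python sketch separates an HIC front end from a TIC that expands a single supervisory command into an executable step sequence; the class names, the command format and the pick-and-place steps are illustrative assumptions, not part of Sheridan's formulation or of the system described in section 2.1.

# Minimal sketch of the HIC/TIC separation in supervisory control.
# All names and the command format are illustrative.

class TaskInteractiveComputer:
    """Sits beside the robot: expands a supervisory command into executable steps."""

    def execute(self, command):
        if command["op"] == "pick_and_place":
            steps = [
                ("move_to", command["pick_pose"]),
                ("close_gripper", None),
                ("move_to", command["place_pose"]),
                ("open_gripper", None),
            ]
        else:
            raise ValueError("unknown command: %r" % command["op"])
        for name, argument in steps:
            print("TIC executing:", name, argument)  # stand-in for controller calls
        return "done"


class HumanInteractiveComputer:
    """Sits beside the operator: forwards the plan and returns feedback."""

    def __init__(self, tic):
        self.tic = tic

    def submit(self, command):
        status = self.tic.execute(command)  # crosses the Internet in the real system
        print("HIC feedback to operator:", status)


hic = HumanInteractiveComputer(TaskInteractiveComputer())
hic.submit({"op": "pick_and_place",
            "pick_pose": (120.0, 40.0, 0.0),
            "place_pose": (200.0, -60.0, 0.0)})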

In a supervisory control system the human supervisor has the following functions (Sheridan, 1992):

* Planning what task to do and how to do it.

* Teaching the computer what was planned.

* Monitoring the automatic action to make sure all is going as planned and to detect failures.

* Intervening, which means that the operator supplements ongoing automatic control activities, takes over control entirely after the desired goal has been reached satisfactorily, or interrupts the automatic control to teach a new plan.

* Learning from experience so as to do better in the future.

The role of computers in telerobotics can be classified according to how much task-load is carried compared to what the human operator alone can carry (Sheridan, 1992). They can trade or share control. Trading control includes the following cases:

* The computer replaces the human. It has full control over the system.

* The computer backs up the human.

* The human backs up the computer.

The most common case in telerobotics is sharing control, meaning that the human and the computer control different aspects of the task:

* The computer relieves the human operator from certain tasks. This is very common in telerobotics when the remote system performs subtasks according to the plans specified by the human operator.

* The computer extends the human's capabilities. This typically occurs in telerobotics when high precision of movements and applied forces is required.

[FIGURE 1 OMITTED]

[FIGURE 2 OMITTED]

[FIGURE 3 OMITTED]

The telerobot system developed by our research team is presented in figure 2. We have used a Mitsubishi Movemaster RV-M1 robot with five axes.

Different kinds of objects are placed on a table in its workspace. The scene is observed by a CCD camera (figure 3). As can be seen, different kinds of objects (prisms, screws, nuts and bushes) are placed on a rectangular grid in the robot workspace. The images acquired by the CCD camera are compressed and transferred over the Internet to the human operator's computer, where the operator, using the AR interface, establishes the position and orientation of each object. Using this information, a command is generated by the software and transferred over the Internet to the telerobot in order to execute the task.
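
A minimal operator-side sketch of this round trip is given below, assuming a plain TCP link with length-prefixed messages, JPEG-compressed frames and a JSON command format; the host name, port number and message fields are illustrative assumptions, not the actual protocol of the system.

# Operator-side sketch of the image/command round trip; protocol details are assumed.
import json
import socket
import struct

ROBOT_SIDE = ("robot-side-pc.example.org", 5005)   # hypothetical remote computer


def send_msg(sock, payload):
    sock.sendall(struct.pack(">I", len(payload)) + payload)   # length-prefixed frame


def recv_msg(sock):
    (length,) = struct.unpack(">I", sock.recv(4))
    data = b""
    while len(data) < length:
        data += sock.recv(length - len(data))
    return data


with socket.create_connection(ROBOT_SIDE) as link:
    # 1. Request one compressed CCD frame to load into the AR interface.
    send_msg(link, json.dumps({"op": "get_frame"}).encode())
    jpeg_frame = recv_msg(link)

    # 2. Once the operator has established the object's pose with the AR interface,
    #    send a pick command back (pose values here are placeholders).
    send_msg(link, json.dumps({"op": "pick", "x_mm": 120.0, "y_mm": 40.0,
                               "theta_deg": 30.0}).encode())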

2.2 The AR Interface

The AR interface has been realized using LabVIEW 7.1 software. In the first step the system is calibrated in order to improve the accuracy and usability of the AR interface. The purpose of this module is to map the two-dimensional coordinates shown in the captured image to three-dimensional coordinates in the real space around the grid. The algorithm which estimates the third coordinate (depth) is based on a single vanishing point model.
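
As an illustration only, one common single-vanishing-point depth model for points lying on the table plane can be sketched as follows; the pinhole assumptions, parameter names and numerical values are ours and not necessarily those of the LabVIEW implementation.

# Sketch of a single-vanishing-point depth model for points on the table plane,
# assuming a pinhole camera with a roughly horizontal optical axis at known height.
def depth_on_table(y_pixel, y_vanish, focal_px, camera_height):
    """Distance of a table-plane point from the camera, from its image row.

    y_pixel        image row of the point (pixels, increasing downwards)
    y_vanish       image row of the table plane's vanishing line
    focal_px       focal length expressed in pixels
    camera_height  height of the camera centre above the table
    """
    dy = y_pixel - y_vanish          # points lower in the image are closer
    if dy <= 0:
        raise ValueError("point lies on or above the vanishing line")
    return focal_px * camera_height / dy


# Example: f = 800 px, camera 0.6 m above the table, vanishing line at row 150;
# a point imaged at row 350 lies 800 * 0.6 / 200 = 2.4 m away.
print(depth_on_table(350, 150, 800, 0.6))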

[FIGURE 4 OMITTED]

After that, a wireframe model is generated for each type of object using geometric primitives. Using 3D transformations (translation, rotation and scaling), the wireframe models can be moved to the desired location.
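
A minimal sketch of this placement step, using homogeneous 4 x 4 matrices and a unit cube as a stand-in for one of the geometric primitives, is given below; the dimensions and the grid position are illustrative.

# Sketch of placing a wireframe primitive with 3D transformations
# (translation, rotation about the vertical axis, scaling); values are illustrative.
import numpy as np


def translation(tx, ty, tz):
    T = np.eye(4)
    T[:3, 3] = (tx, ty, tz)
    return T


def rotation_z(theta_rad):
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    R = np.eye(4)
    R[:2, :2] = [[c, -s], [s, c]]
    return R


def scaling(sx, sy, sz):
    return np.diag([sx, sy, sz, 1.0])


# A unit cube as a stand-in for one of the geometric primitives.
cube = np.array([[x, y, z, 1.0] for x in (0, 1) for y in (0, 1) for z in (0, 1)]).T

# Scale it to 40 x 40 x 20 mm, rotate it 30 degrees and place it at grid cell (120, 80) mm.
M = translation(120, 80, 0) @ rotation_z(np.radians(30)) @ scaling(40, 40, 20)
placed = (M @ cube)[:3]          # 3 x 8 array of transformed vertex coordinates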

The dimensions of the object model in the image plane are computed through 3D-to-2D transformations that take the vanishing point into account, resulting in a model in the image plane which is overlaid on the object's image (figure 4).
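
The projection can be sketched as a standard pinhole 3D-to-2D transformation, under which parallel model edges converge towards the vanishing point; the intrinsic parameters below are assumptions, not the calibrated values of the actual camera.

# Sketch of projecting the placed wireframe vertices into the image plane so the
# model can be drawn over the video frame; intrinsic parameters are assumed.
import numpy as np


def project(points_3d, focal_px, cx, cy):
    """points_3d: 3 x N array in camera coordinates (Z forward). Returns 2 x N pixels."""
    X, Y, Z = points_3d
    u = focal_px * X / Z + cx        # perspective division makes distant edges
    v = focal_px * Y / Z + cy        # shrink towards the vanishing point
    return np.vstack([u, v])


# Joining the projected vertices with line segments gives an overlay like figure 4.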

The position of the object's mass centre and its orientation are computed by a software procedure and are used to command the robot.
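
The exact procedure is implementation-specific; one straightforward reading, sketched below, takes the mass centre as the fitted transform applied to the primitive's own centroid and the orientation as the rotation angle used in the fit (the uniform-density assumption and all names are illustrative).

# Sketch of recovering the grasp pose from a fitted wireframe model; assumes a
# uniform solid, so the mass centre coincides with the centroid of the primitive.
import numpy as np


def grasp_pose(local_vertices, fit_matrix, theta_deg):
    """local_vertices: 4 x N homogeneous vertices of the primitive in its own frame.
    fit_matrix: the 4 x 4 transform used to overlay the model on the object.
    theta_deg: the rotation angle about the vertical axis used in that transform."""
    centroid_local = local_vertices.mean(axis=1)        # homogeneous (x, y, z, 1)
    centre_world = (fit_matrix @ centroid_local)[:3]    # object's mass centre position
    return centre_world, theta_deg                      # position and gripper yaw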

2.3 The Robot Control

With this information, the human operator transfers a command via the Internet to the remote computer, which passes it to the robot controller through the parallel port. The telerobot then executes the task.
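
A minimal sketch of the robot-side receiver is given below, matching the illustrative framing used in the sketch of section 2.1; the parallel-port write is left as a stub because the actual handshake with the Movemaster controller is not described here.

# Robot-side sketch: accept the operator's messages over the Internet link and hand
# pick commands to the robot controller; the parallel-port write is a placeholder.
import json
import socketserver
import struct


def write_to_parallel_port(frame):
    # Placeholder: in the real system the command bytes go out through the PC's
    # parallel port to the Movemaster RV-M1 controller.
    print("to controller:", frame)


def latest_jpeg():
    # Placeholder for grabbing the most recent compressed CCD frame.
    return b""


class OperatorHandler(socketserver.StreamRequestHandler):
    def handle(self):
        while True:
            header = self.rfile.read(4)
            if len(header) < 4:
                break                                  # operator closed the connection
            (length,) = struct.unpack(">I", header)
            msg = json.loads(self.rfile.read(length))
            if msg["op"] == "get_frame":
                frame = latest_jpeg()
                self.wfile.write(struct.pack(">I", len(frame)) + frame)
            else:
                cmd = "{op} {x_mm:.1f} {y_mm:.1f} {theta_deg:.1f}".format(**msg)
                write_to_parallel_port(cmd.encode())


if __name__ == "__main__":
    with socketserver.TCPServer(("", 5005), OperatorHandler) as server:
        server.handle_request()                        # serve one operator connection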

3. CONCLUSION

The Mitsubishi telerobot project demonstrates how much an improved AR interface can increase the performance of a telerobotic system without changing any of the telerobot's technical features.

The project was successful in developing the AR interface for the Mitsubishi telerobot. The objective of the project was therefore met.

4. REFERENCES

Azuma, R. T. (1997): A Survey of Augmented Reality. Presence, Vol. 6, No. 4, August 1997, pp. 355-385.

Sheridan, T. B. (1992): Telerobotics, automation and human supervisory control. Cambridge, MA: MIT Press.

Friz, H. (1998): Design of an Augmented Reality User Interface for an Internet based Telerobot using Multiple Monoscopic Views. Diploma thesis, Institute for Process and Production Control Techniques, Technical University of Clausthal, Clausthal-Zellerfeld, Germany. Available at: http://telerobot.mech.uwa.edu.au Accessed: 2007-07-22.

Palmer, R. (2003): Augmented Reality and Telerobots. Honours thesis, University of Western Australia, Perth.