
Article information

  • Title: Functional Safety Concept for a Handling Robot Built on Optical Systems
  • Authors: Radinger, Thomas; Stuja, Kemajl; Wolfel, Walter
  • Journal: Annals of DAAAM & Proceedings
  • Print ISSN: 1726-9679
  • Year: 2018
  • Issue: January
  • Publisher: DAAAM International Vienna

Functional Safety Concept for a Handling Robot Built on Optical Systems.


Radinger, Thomas; Stuja, Kemajl; Wolfel, Walter



1. Introduction

In the past, a typical safety concept to protect humans from the movements of industrial robots was to place the robots in guarded robotic cells. Guarded robotic cells prevent not only mechanical interaction between humans and robots, but also between mobile and stationary robots. For that reason, this kind of safety system is incompatible with future industrial concepts such as Industry 4.0. A main task of a new safety concept is to allow mobile and stationary robots to share work areas while avoiding mechanical contact collisions between humans and non-collaborative robots. Differentiating between humans and mobile robots can currently be achieved only by adding very expensive third-party hardware and software components, which would be an immense disadvantage for small and medium-sized companies. In order to offer a low-cost solution for such companies, two theses [1], [2] on this subject were written at the University of Applied Sciences Technikum Wien.

The knowledge from these two works is used in this research to build a new, cost-effective optical safety system concept for a pick & place robotic cell, which, among other things, makes it possible to recognize humans entering the hazardous area of the stationary robots. The system uses the Kinect sensor from Microsoft, which can recognize humans before they enter this area.

2. Problem statement and requirements

For the digital factory of the UAS Technikum Wien, a safety work system based on an optical system for three-dimensional workspace monitoring, capable of recognizing (among others) humans, should be built and implemented. The safety system must take into account all relevant safety-related standards as well as the state of the art (German Institute for Standardization, 2010). In addition, the system status should be indicated by changing signal states and by activating acoustic devices and warning lights. By default, when no person or object enters the warning or protection area, a green lamp should light up. On entering the warning area, the light should switch from green to orange. When a human enters the protection area, the orange light should switch to red, and the speed of the robot should be reduced to 0 mm/s. If, on the other hand, the mobile robot enters the safety area, the robot should continue to operate at its programmed speed. A flashing orange light should indicate a failure of the safety system. The request of this work/project was therefore to find, on the local market, a suitable, inexpensive real-time monitoring system capable of differentiating humans from other moving components in the digital factory at Technikum Wien.

3. Solution and methods

The problem was solved by combining two optical monitoring systems: the core part, "SafetyEye", provided by Pilz GmbH, and the add-on part, "Kinect", provided by Microsoft. The core part is responsible for the protected area; the second part monitors the warning area.

3.1. Components of the safety system

Using the "SafetyEye" library, it is possible to create several virtual warning and protection areas. The hardware of this system is shown in Fig. 1 and contains [3]:

* the sensor unit, consisting of three cameras

* the processing unit for processing the image data

* the safety controller for interacting with the robot and the periphery. The safety controller is connected to the input module of the robot controller.

The second part of the safety system is the Kinect sensor shown in Fig. 2. The system contains [5]:

* an RGB camera for capturing a colour image, stored at a resolution of 1280x960

* an infrared (IR) emitter, which emits infrared light beams

* an IR depth sensor, which reads the IR beams reflected from objects back to the sensor and converts them into depth information, measuring the distance between an object and the sensor

* a multi-array microphone for capturing sound, as well as for locating the sound source and the direction of the audio wave

* a 3-axis accelerometer, designed for a 2G range (G being the acceleration due to gravity) and used to determine the current orientation of the Kinect.

The 3-axis accelerometer is the most important sensor of the Kinect. The accuracy of the accelerometer is 1 degree. The accuracy of the Kinect is slightly temperature sensitive [6], with up to 3 degrees of drift. If required, the user can compensate for this drift by comparing the accelerometer vertical (the y-axis in the accelerometer's coordinate system) with floor-plane depth data.
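The orientation check described above can be sketched as follows. This is a minimal illustration, not the Kinect SDK API: the function name and the idea of reporting the tilt of the sensor's y-axis relative to the measured gravity vector are this example's assumptions.

```python
import math

def tilt_from_vertical(ax, ay, az):
    # Angle (degrees) between the measured gravity vector and the
    # sensor's y-axis, which points vertically when the Kinect is level.
    # Comparing this angle against the floor plane found in the depth
    # data is one way to detect and compensate accelerometer drift.
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        raise ValueError("no acceleration reading")
    return math.degrees(math.acos(abs(ay) / g))
```

For example, a level sensor reading of (0, -1 g, 0) yields a tilt of 0 degrees, while (1 g, 0, 0) yields 90 degrees.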

3.2. Calculation of the minimum distance for mounting of "SafetyEye"

The calculation of the minimum distance S depends on various statically predetermined as well as dynamic parameters. The dynamic parameters depend on factors such as the mounting and the configuration of the system. The minimum distance S was calculated using formula (1) with the following parameters (as in [1] and [3]):

S = K x (t1+t2) + C + Zg (1)

Where:

* the approach speed K = 1600 [mm/s] (Deutsches Institut für Normung e. V., 2010)

* the response time of the SafetyEye t1 = 0.365 [s]

* the response and stopping time of the robot and the robot control t2 = 0.4 [s]

* the addition for the mounting height of the sensor unit in the present case C = 850 [mm]

* the addition for system-specific measurement tolerances Zg = 235 [mm].

Using the given formula and parameters, the minimum distance S is obtained as in (2):

S = 1600 x (0.365 + 0.4) + 850 + 235 = 2309 [mm] (2)
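The calculation in (1) and (2) can be reproduced directly; the function name below is this example's own convention:

```python
def minimum_distance(k, t1, t2, c, zg):
    # S = K * (t1 + t2) + C + Zg  -- formula (1)
    # k: approach speed [mm/s]; t1, t2: response times [s];
    # c, zg: mounting-height and tolerance additions [mm]
    return k * (t1 + t2) + c + zg

# Parameters from Section 3.2:
s = minimum_distance(k=1600, t1=0.365, t2=0.4, c=850, zg=235)
# s is approximately 2309 mm, matching result (2)
```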

3.3. Minimum distance for mounting of "Kinect"

The Kinect (in default mode) can recognize humans [5] standing between 0.8 m and 4.0 m away. The practical range (shown in Fig. 3) is between 1.2 m and 3.5 m; high recognition accuracy can only be achieved within this range. Technically, it is possible to extend the range of the safety area by using a second camera. However, if more than one Kinect sensor is used to cover the safety area, a reduction in tracking accuracy and precision will be noticed due to interference between the light sources.
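The range limits above can be expressed as a simple classifier. The thresholds are the ranges quoted in the text; the three quality labels are just this sketch's convention:

```python
def tracking_quality(distance_m):
    # Classify Kinect (default mode) person tracking by distance [m].
    if distance_m < 0.8 or distance_m > 4.0:
        return "none"       # outside the default detection range
    if 1.2 <= distance_m <= 3.5:
        return "reliable"   # practical high-accuracy range (Fig. 3)
    return "degraded"       # detectable, but lower accuracy
```

For instance, a person at 2.0 m is tracked reliably, while one at 1.0 m is detectable but with reduced accuracy.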

4. Results

The final implementation of the safety system is presented in the layout (Figure 4) with the components used in [7] and [8]. The layout shows the pick & place robot cell. In normal/working mode (see the flow chart in Figure 5), when persons or objects are moving outside the warning, safety, and robot areas, a green lamp lights up. When a human enters the warning area, the industrial robot decelerates to a reduced speed of max. 250 mm/s, and an acoustic/light signal is activated in order to warn the threatened person. When the robot/safety area is violated by a human, the SafetyEye controller sends a signal to the robot controller, which forces the manipulator to stop its operation; the system is then in emergency mode. If, on the other hand, the mobile robot enters the warning area, an Arduino board connected to the Kinect sensor sends a signal to the SafetyEye controller in order to deactivate the robot area and enable the working mode. The safety area remains active.
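The decision logic described above can be sketched as a small state function. This is an illustration of the behaviour, not the actual controller code: the parameter names and the (lamp, speed limit) return convention are this example's assumptions; the 250 mm/s and 0 mm/s limits come from the text.

```python
def safety_state(human_in_warning=False, human_in_safety=False,
                 mobile_robot_in_warning=False, fault=False):
    # Returns (lamp, speed_limit_mm_s); None means the programmed speed.
    if fault:
        return "orange flashing", 0   # safety-system failure
    if human_in_safety:
        return "red", 0               # emergency mode: manipulator stops
    if human_in_warning:
        return "orange", 250          # reduced speed, warning signal active
    if mobile_robot_in_warning:
        return "green", None          # robot area deactivated, working mode
    return "green", None              # normal mode, programmed speed
```

Note that a human in the safety area takes priority over every other condition, mirroring the flow chart's ordering.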

Furthermore, persons moving in different orientations toward the camera in the warning area were observed. Four samples, each with 84 tests, were carried out, with the following results:
Table 1. Recognition percentage of a moving human toward the Kinect sensor.

Human orientation moving toward Kinect sensor           Recognition [%]

Frontal                                                     100.00
Backward                                                     39.29
Lateral left (eyes along negative x-axis of Kinect)          28.57
Lateral right (eyes along positive x-axis of Kinect)         32.14


As can be seen in Table 1, lateral movement of humans toward the Kinect is very difficult to sense, although the probability of this kind of movement is very low.

5. Conclusion

For the digital factory of the University of Applied Sciences Technikum Wien in Vienna, a safety system for the robot handling cell should be built and implemented. The safety system must take into account all relevant safety-related standards as well as the state of the art (German Institute for Standardization, 2010). Among others, this system must be capable of monitoring the work areas in real time in order to prevent human casualties. Another very important demand on this system is that it be inexpensive. In summary, the main task of this research was to find a suitable, inexpensive real-time monitoring system on the market capable of differentiating humans from other moving components for the robot handling system in the digital factory.

The problem was solved by combining two optical monitoring systems: the core part, "SafetyEye", provided by Pilz GmbH, and the add-on part, "Kinect", provided by Microsoft. The core part is responsible for recognizing any kind of object entering the safety and robot areas. The second part is responsible for recognizing humans entering the warning area. The designed safety system was implemented in the digital factory of the University of Applied Sciences Technikum Wien and was tested and analysed for functionality, failures, and difficulties. Data such as the reaction rate and the different orientations of humans entering the safety area were collected and presented above. Deficiencies of this system are its sensitivity to lighting conditions, which affects both cameras, and the weak detection of lateral movement of humans toward the Kinect camera. The first issue can be mitigated mainly by providing constant lighting conditions and, to a lesser extent, by reducing reflective surfaces of the objects. The second deficit can be resolved by mounting a second Kinect camera crosswise to the first one.

Further work on this project is to increase the accuracy of the safety system under poor lighting conditions, as well as to implement it within the entire manufacturing system. In order to achieve a high level of recognition of human actions and behaviours, the design and implementation of an advanced algorithm based on artificial intelligence for the new Kinect Version 2 sensor would be of great interest.

DOI: 10.2507/28th.daaam.proceedings.022

6. References

[1] Angel, N. (2017). Roboterüberwachung mittels Kinect Sensorsystem, University of Applied Sciences Technikum Wien, Department Robotics and Mechatronics, Vienna.

[2] Walter, R. (2017). Arbeitsraumüberwachung eines Sechs-Achs-Knickarmroboters mittels Pilz SafetyEye, University of Applied Sciences Technikum Wien, Department Robotics and Mechatronics, Vienna.

[3] Pilz GmbH & Co. KG (2011). SafetyEye: Bedienungsanleitung/Dokumentation. [Datenblatt] Ostfildern, Deutschland: Pilz GmbH & Co. KG: <https://www.pilz.com/download/restricted/PSENse_Operat_Man_21743-DE-11.pdf> Accessed on 21.03.2017.

[4] Pilz GmbH & Co. KG (2017). Pilz--Sichere Automation. [Online]: <https://www.pilz.com/de-AT> Accessed on 21.03.2017.

[5] Kataumo, B. (2014). Kinect for Windows Sensor Components and Specifications. [Online]: https://msdn.microsoft.com/en-us/library/jj131033.aspx [Accessed on 01.09.2017], Microsoft, 2014.

[6] Microsoft (2014). Kinect for Windows Sensor Components and Specifications, Accelerometer. [Online]: <https://msdn.microsoft.com/en-us/library/jj663790.aspx> [Accessed on 01.09.2017].

[7] Stuja, K.; Bruqi, M.; Markl, E. & Aburaia, M. (2016). Lightweight 4-Axis Scara Robot for Education and Research, Proceedings of the 27th DAAAM International Symposium, B. Katalinic (Ed.), Published by DAAAM International, ISBN 978-3-902734-08-2, ISSN 1726-9679, Vienna, Austria.

[8] Aburaia, B.; Markl, E. & Stuja, K. (2014). New concept for design and control of 4 axis robot using the additive manufacturing technology, 25th DAAAM International Symposium on Intelligent Manufacturing and Automation, Elsevier, pp. 1364-136, ISSN 1877-7058, Vienna.

[9] Meisner, J. (2013). Collaboration, expertise produce enhanced sensing in Xbox One. Microsoft. [Online]: <https://blogs.technet.microsoft.com/microsoft_blog/2013/10/02/collaboration-expertise-produce-enhanced-sensing-in-xbox-one> [Accessed on 22.10.2017].

Caption: Fig. 1. SafetyEye--Pilz GmbH. [4]

Caption: Fig. 2. Microsoft Kinect [5]

Caption: Fig. 3. Kinect recognizing range: horizontal view (a) and vertical view (b) like in [6]

Caption: Fig. 4. Layout of robot Cell and monitoring areas

Caption: Fig. 5. Decision logic of safety system.
COPYRIGHT 2018 DAAAM International Vienna
No portion of this article can be reproduced without the express written permission from the copyright holder.
