Abstract — Early approaches to robotic system safety employed mechanical barriers to physically isolate robotic systems from their human operators. Modern collaborative applications, however, are driving the replacement of these physical barriers with other approaches to safety. This paper begins with a brief discussion of the elements comprising a robotic system and an admission that the robotics engineer can no longer rely on the ever-increasing density of microprocessors to solve problems through sheer computing power. The paper then discusses an automatic pedestrian door example system. Though perhaps not immediately obvious, automatic pedestrian doors are examples of robotic systems. People and automatic doors have operated in the same workspace for decades, and there are lessons to be learned from this experience. One of these lessons is that even single-axis collaborative robotic systems present complex safety challenges; systems with multiple robots and tens, if not hundreds, of axes will be exponentially more complex. Finally, the paper overviews a layered approach to collaborative robotic system safety. This approach is rich enough to address the complexity of the problem domain, yet still computationally manageable because it distributes the computing load across multiple layers.
Introduction - This paper begins with a basic definition of a robotic system. In devising the definition, the paper examines the origin of the word robot and discusses the state of the art in robotics technology today in the context of the word's original meaning. The main point of this discussion, as it relates to collaborative robotics and safety, is that the robotics engineer can no longer rely on the ever-increasing density of microprocessors described by Moore's law to solve problems through sheer computing power. The paper then discusses an automatic pedestrian door example system. Though perhaps not immediately obvious, automatic pedestrian doors are examples of robotic systems: they have motors, drives, sensors, moving parts and embedded computers. People have operated in the same workspace as automatic doors for decades, and there are valuable lessons to be learned from this early experience in what is now known as collaborative robotics. These lessons show that deploying even the most basic collaborative robotic systems presents complex safety challenges; systems with multiple robots and tens, if not hundreds, of axes will be exponentially more complex. Finally, the paper presents a layered approach to safety that addresses the collaborative robotics challenge given the complexity of the problem and the reality of available computing power. The layers are deployed across different physical components, operate over different spans of time, and draw on multi-modal sensing across different sensing technologies. The approach is rich enough to address the problem, yet still computationally manageable because it distributes the computing load across multiple layers.
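To make the layered idea concrete, the following is a minimal sketch, not the paper's implementation: each safety layer lives on a different physical component, runs on its own time scale, and uses its own sensing modality. All layer names, rates, and the `SafetyLayer` interface are illustrative assumptions.

```python
# Hypothetical sketch of a layered safety check pipeline.
# Each layer runs at its own rate, so the computing load is
# distributed: fast local checks run often, slow global checks rarely.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class SafetyLayer:
    name: str                        # physical component hosting the layer (assumed)
    period_ms: int                   # layers operate over different spans of time
    check: Callable[[Dict], bool]    # True if this layer deems the state safe

def evaluate(layers: List[SafetyLayer], state: Dict, now_ms: int) -> bool:
    """The system is safe only if every layer due at this instant agrees."""
    due = [layer for layer in layers if now_ms % layer.period_ms == 0]
    return all(layer.check(state) for layer in due)

# Illustrative multi-modal layers: a torque check at the drive, a lidar
# check at the cell, and a path-conflict check at the planner.
layers = [
    SafetyLayer("joint-drive", 1,   lambda s: s["torque"] < 5.0),      # fast, local
    SafetyLayer("cell-lidar",  10,  lambda s: s["min_dist_m"] > 0.5),  # slower, wider
    SafetyLayer("planner",     100, lambda s: not s["path_conflict"]), # slowest, global
]
state = {"torque": 2.0, "min_dist_m": 1.2, "path_conflict": False}
print(evaluate(layers, state, now_ms=100))  # all three layers due; prints True
```

The design choice mirrored here is that no single layer needs a complete world model; each contributes a cheap, partial judgment, and only their conjunction declares the system safe.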
Definition of a Robotic System - Because this is a paper about robotic system safety, it is important to define the term robotic system. As is widely known, the word robot was first used in print by the Czech playwright Karel Čapek. Čapek used the word to mean human-like creatures created by humans. These robots looked like humans and acted very much like humans; indeed, some might even have been mistaken for humans [1]. If this paper used the word robot as Čapek intended, it would be a very short paper because (as discussed below) robots like these do not exist. Instead, this paper uses the term robotic system to mean "a system with moving parts, a drive mechanism to drive the moving parts and a computer to choreograph and control the system." This definition intentionally excludes purely software constructs such as internet robots.
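The three-part definition above can be sketched as a small data model. This is an illustrative sketch only; the class and field names (`MovingPart`, `Drive`, `RoboticSystem`, the door example) are assumptions introduced here, not the paper's.

```python
# Sketch of the definition: moving parts, a drive mechanism to drive
# them, and a computer to choreograph and control the system.
from dataclasses import dataclass
from typing import List

@dataclass
class MovingPart:
    name: str
    position: float = 0.0   # e.g. door leaf: 0.0 = closed, 1.0 = open

@dataclass
class Drive:
    max_speed: float        # maximum position change per control tick

@dataclass
class RoboticSystem:
    parts: List[MovingPart]
    drive: Drive

    def step_toward(self, target: float) -> None:
        """The 'computer' role: choreograph the drive to move each part."""
        for part in self.parts:
            delta = target - part.position
            # Clamp the commanded motion to what the drive can deliver.
            delta = max(-self.drive.max_speed, min(self.drive.max_speed, delta))
            part.position += delta

# A single-axis example in the spirit of the automatic pedestrian door.
door = RoboticSystem(parts=[MovingPart("leaf")], drive=Drive(max_speed=0.25))
for _ in range(4):
    door.step_toward(1.0)
print(door.parts[0].position)  # → 1.0
```

Even this toy single-axis system hints at the safety question the paper pursues: the controller decides motion, but nothing in the definition itself prevents that motion from intersecting a person's path.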