4th RTL Workshop: Space Robotics: an Experimental Set-up based on RTAI-Linux
Dec 19, 1997 — by LinuxDevices Staff — from the LinuxDevices Archive

Abstract
Thanks to constant research and development in robotics and, more generally, in automation, reliable and relatively low-cost machines that can substitute for or assist human operators in repetitive or dangerous tasks are now available. In robotics, this is particularly true for applications in structured environments, i.e. when the workspace in which the robot operates is known with great accuracy and precision. On the other hand, the development of applications in unstructured or unknown environments is obviously of interest. In this case, control algorithms that can give a sort of human behavior to the machines are needed: in other words, the machines should be characterized by functional autonomy, i.e. they should be able to modify their behavior on the basis of information acquired in real time from the environment. In space applications, it is of great interest to develop autonomous or semi-autonomous robotic devices that can substitute for the astronauts in routine operations, in order to free them from repetitive tasks and reduce mission costs.

In this work, an experimental setup based on a 6 degrees of freedom (dof) manipulator with a 3 dof gripper, designed for a possible application within PaT, the Payload Tutor proposed by ASI (the Italian Space Agency), is presented. The system consists of a robotic arm, a vision system, and a gripper. Since the gripper has to interact with free-floating and irregular objects, the vision subsystem provides all the information needed for grasping unknown objects in an optimal way.

The robotic arm is a Comau SMART 3-S robot, a standard industrial 6 dof anthropomorphic manipulator with a non-spherical wrist, equipped with the standard C3G-9000 controller. Each joint is actuated by a DC brushless motor, and its angular position is measured by a resolver. In our setup, the controller is used only as an interface between the resolvers and drives on the robot and a PC running RTAI-Linux.
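As a rough illustration, the work the PC must do in every sampling period can be reduced to a single control step. The function and gain names below are hypothetical stand-ins: in the real system the loop runs as a hard real-time RTAI task woken once per sampling period, the joint data come from the C3G-9000 interface, and the actual control law is certainly more elaborate than this textbook PD regulator.

```c
#include <stddef.h>

#define N_JOINTS 6  /* the Comau SMART 3-S has 6 dof */

/* Hypothetical per-sample state; in the real setup the angles would be
 * read from the resolvers through the C3G-9000 interface. */
typedef struct {
    double q[N_JOINTS];      /* measured joint angles [rad]       */
    double q_ref[N_JOINTS];  /* reference joint angles [rad]      */
    double q_prev[N_JOINTS]; /* previous sample, for velocity     */
} joint_state;

/* One control cycle: PD law u = Kp*(q_ref - q) - Kd*qdot.
 * Gains are illustrative, not those of the actual system. */
void control_step(joint_state *s, double dt,
                  double kp, double kd, double u[N_JOINTS])
{
    for (size_t i = 0; i < N_JOINTS; ++i) {
        /* finite-difference estimate of the joint velocity */
        double qdot = (s->q[i] - s->q_prev[i]) / dt;
        u[i] = kp * (s->q_ref[i] - s->q[i]) - kd * qdot;
        s->q_prev[i] = s->q[i];
    }
    /* in the real loop, u[] would now be sent back to the C3G-9000 */
}
```

In the real-time task, a call like this would sit between the acquisition of the resolver data and the transmission of the new actuator commands, once per controller-generated sampling period.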
In each sampling period generated by the controller, the real-time control system running on the PC must acquire the data from the resolvers, compute the new control input for the actuators, and send the resulting values to the C3G-9000.

Both the video camera and the gripper are installed on the robot wrist. The vision system provides visual information about the environment and, in particular, about objects within the workspace of the robot. This information is needed to track a moving object in the workspace with the robot, to move the robot to a desired position in order to grasp an object with the gripper, and to automatically calculate the optimal grasping configuration. At the moment, the vision system consists of a monocular camera connected to a frame-grabber board installed in the same PC that implements the robot control algorithms. By properly moving the robot arm and, at the same time, acquiring images of the object from different points of view, the vision algorithms can give a good estimate of the distance of the object from the robot wrist. This information is essential to correctly move the robot in order to grasp the object with the gripper.

Moreover, by means of the vision system, the shape of the object is recognized in order to calculate the best grasping points (target points). The object is grasped at these points and the contact forces are properly controlled. Generally, target points are selected on the basis of a kinematic analysis of a first-order model: the resulting points depend neither on the shape of the object nor on the geometric characteristics of the gripper. In our case, the target points are calculated by means of a kinematic analysis of a second-order model that also takes into account the shape of the object and of the gripper's fingers: in this manner, the resulting grasping configuration is more stable.

The third component of our setup is the A.S.I. Gripper.
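As an aside, the distance estimate obtained by moving the wrist-mounted monocular camera between two known poses reduces, in the simplest case, to standard stereo triangulation: the robot motion itself provides the baseline. A minimal sketch follows; the formula is the textbook pinhole-stereo relation Z = f·b/d, offered as an assumption about the underlying geometry, not necessarily the algorithm used in the paper.

```c
#include <math.h>

/* Depth of an object point from two camera positions separated by a
 * known baseline. With the camera rigidly mounted on the wrist, the
 * robot motion provides the baseline b between the two viewpoints.
 *   f      focal length, in pixels
 *   b      baseline between the two camera positions, in metres
 *   x1, x2 image x-coordinates (pixels) of the same object point
 * Returns the depth Z in metres, or -1.0 when the disparity is
 * (numerically) zero, i.e. the point is at effectively infinite depth. */
double depth_from_disparity(double f, double b, double x1, double x2)
{
    double d = x1 - x2;     /* disparity in pixels */
    if (fabs(d) < 1e-9)
        return -1.0;
    return f * b / d;
}
```

For example, a 20-pixel disparity seen with an 800-pixel focal length across a 5 cm baseline corresponds to an object 2 m away; repeating the measurement from several wrist poses would let the vision algorithms refine the estimate.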
The A.S.I. Gripper has three degrees of freedom and is particularly suited for zero-gravity manipulation tasks (i.e. space applications), since it can interact with free-floating and irregularly shaped objects. Its control algorithms are executed on a custom DSP board (based on the TMS320C32 chip). For this board, a loader and a DSP monitor have been developed under Linux, together with some drivers. Once the object distance and the target points have been calculated, the gripper is positioned in the workspace with respect to the object. Then the fingers are closed and the object is grasped at the target points. At this point, the DSP board executes the control algorithm in order to ensure proper contact forces. In the final version of the paper, a detailed description of all the subsystems will be presented, together with some experimental results.
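To fix ideas, a contact-force regulator of the kind the DSP board runs might, in its simplest form, look like the discrete PI step below. The PI structure, the per-finger decoupling, and the gains are illustrative assumptions, not the controller actually implemented on the TMS320C32.

```c
/* Illustrative discrete PI regulator for one finger's contact force.
 *   f_ref   desired contact force [N]
 *   f_meas  measured contact force [N]
 *   integ   integrator state, owned by the caller
 *   dt      sampling period [s]
 *   kp, ki  proportional and integral gains (made-up values in tests)
 * Returns a motor command in arbitrary drive units. */
double force_pi_step(double f_ref, double f_meas,
                     double *integ, double dt,
                     double kp, double ki)
{
    double e = f_ref - f_meas;   /* force tracking error */
    *integ += e * dt;            /* accumulate for the integral term */
    return kp * e + ki * (*integ);
}
```

One such step would run for each finger in every DSP sampling period after the fingers close on the target points, holding the contact forces at their references while the object floats freely.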
Read full paper (PDF download)
This article was originally published on LinuxDevices.com and has been donated to the open source community by QuinStreet Inc. Please visit LinuxToday.com for up-to-date news and articles about Linux and open source.