Vision-based velocity control on a Philips Experimental Robot Arm


Abstract

The challenge in this thesis is to find out whether an off-the-shelf embedded system can replace an off-the-shelf laptop or desktop computer for vision-based velocity control of a robotic arm using inverse kinematics. An algorithm was developed that was to be tested in simulation and to run (semi-)autonomously on an embedded system, but no conclusive test results were obtained. Developing and testing the algorithm against an existing simulation proved very problematic, as the simulation software is highly complex and is no longer supported by its developers. Although the embedded system was chosen because it is equipped with a digital signal processor, its proprietary driver turned out to be mutually exclusive with the robot-messaging middleware in terms of operating-system kernel support: the choice was between the driver (requiring an old kernel) and the middleware (requiring a new kernel). The latter was chosen. A real-time kernel patch, necessary to communicate with the robotic arm, was unfortunately still in development in the final stage of this work. Porting the inverse kinematics algorithm from Matlab to C++ and adapting the trajectory-generation algorithm to the middleware went well, but these could not be tested thoroughly because of the simulation and real-time issues; the same holds for the velocity control algorithm. The conclusion of this report is that future work is needed to determine whether the developed algorithm for vision-based velocity control actually works.
