This journal is maintained by Winlab summer interns Vamsi and Aniket, who are working on [wiki:robotmobility]. Unless otherwise noted, the entries are typed up by me (Aniket), and I refer to myself in the first person for simplicity.

[wiki:robotmobilityweek1 Week 1] [wiki:robotmobilityweek2 Week 2]

= Week 1 =

== 6/4/07 - Day 1 ==

All day we wrestled with the ERSP installation. Over and over we got a license error, which appeared immediately after the installer asked for the product id (312). It turns out the installer uses the ''less'' command to display the EULA. For future reference: ERSP will not install unless the ''less'' package is available. Many hours were spent on this problem.

Once this was done, the packages ''ersp-license'' and ''ersp-config'' installed successfully. The main package, ''ersp'', faced a chain of unmet dependencies. Tracing them to their root led to the packages ''libraw1394-5'' and ''libdc1394-11'', libraries involved in firewire data transfer. Once these packages were downloaded from the Debian site and installed via dpkg, the ersp package installed without further error.

We have been working on Mobile 33, which after the day's work has a working copy of Debian Sarge with Gnome. The ''/opt/evolution_robotics'' directory is populated by the software package. The package ''ersp-sample-code'' installed easily. I created a folder in our home directory in which I untarred the sample code. I attempted to run the classic ''triangles.py'' example script and received an !ImportError. The same error appears whether or not the thinkpad is connected to the chassis, leading me to believe the problem is a dead battery. I tried both batteries with identical results. I would like to charge the batteries and see if we can make this thing move tomorrow, but I'm unsure which of the many adapters lying around is the correct one. Ivan is characteristically unfindable.

== 6/5/07 - Day 2 ==

We downgraded from python 2.4 to 2.3 on Mobile 33.
The laptop can now power the chassis via python! Great success.

We have begun setting up Debian on Mobile 37. After the minimal install, we upgraded the kernel to ''2.6.8-2-386''. We then attempted to install the package ''xserver-xorg''. We copied the file xorg.conf from the working installation on Mobile 33 using ssh. We still had issues with fonts; the needed font package was ''xfonts-base''. Once this was done, the X server launched. We then installed the Gnome window manager.

Python is a bit of an issue, as the scripts seem to expect python 2.3 whereas the software itself installs python 2.4. After much reconfiguring, some of which included switching window managers on Mobile 33 (now running KDE), the solution was found in the form of a symlink called ''libpython2.3.so.1.0'' pointing to the corresponding python 2.4 file in ''/usr/lib''.

Some more packages were required in order to successfully compile the sample c++ code, namely ''build-essential'' and ''libstdc++5''. Once compiled, the example 01-simple executed and turned the robot's wheels. This is the first time I've successfully controlled the robot using c++ code. One observation I've made is that the code will not compile when passed to gcc on its own; the ''make'' command must be used. This makes it necessary for us to generate makefiles for our own programs, something I currently know nothing about.

I spent a large part of the afternoon trying, unsuccessfully, to test the camera. ERSP comes with a binary called ''test_camera'', which gives several hints at the problem. I don't know the specific model name of the camera we are using, and google is not being very helpful either. One goal for tomorrow is to obtain printouts of the User's Guide and Programming Tutorials.

== 06/06/07 - Day 3 ==

For reasons unknown, Mobile 36 (gnome) crashed X when booting for the first time this morning. We found the problem to be a missing line, ''mousedev'', in the file ''/etc/modules''.
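A small helper along these lines could make such fixes repeatable. This is a hypothetical convenience script of our own (not part of ERSP) that adds module names to ''/etc/modules'' only if they are not already listed; it would need to be run as root against the real file:

```python
"""Idempotently add kernel module names to a modules file (e.g.
/etc/modules) so they are loaded on every boot, as we needed for
mousedev. Hypothetical helper, not part of ERSP; run as root."""

def ensure_modules(path, modules):
    """Append each module to the file unless already listed.
    Returns the list of modules actually added."""
    try:
        with open(path) as f:
            # Ignore blank lines and comments when checking what's present.
            present = {line.strip() for line in f
                       if line.strip() and not line.lstrip().startswith("#")}
    except FileNotFoundError:
        present = set()
    added = [m for m in modules if m not in present]
    with open(path, "a") as f:
        for m in added:
            f.write(m + "\n")
    return added
```

Running it twice is safe: the second call finds everything already present and adds nothing.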
A big accomplishment for today was getting printouts of both the User's Guide and the Tutorials. This will undoubtedly help us a great deal when we start programming.

Based on a pdf file from Evolution found online, the robot's text-to-speech capabilities are restricted to the Windows version of the software. This is lame, and we will look into it more in the future. Debian does have some text-to-speech packages in its repositories, so we can probably achieve that functionality even if the ERSP software doesn't support it.

Getting the camera to work took a while. We needed to load the module ''ov511'' (it was available) using ''modprobe''. Once this was done, the ERSP test application ''test_camera'' successfully snapped 5 images and stored them as jpegs.

An attempt to enable sound on Mobile 33 (KDE) was wildly unsuccessful. I learned some useful commands I'd like to remember though:
 * ''pcimodules'' - guesses which modules should be loaded based on ''lspci''
 * ''xargs [command]'' - performs [command] on every line of the data piped into it

So, for example,
{{{
# pcimodules | xargs modprobe
}}}
attempts to load every module that ''pcimodules'' believes should be loaded.

== 06/07/07 - Day 4 ==

Sound is working! The necessary module was ''snd-intel8x0''. Once it was installed (along with ''alsa-base'', ''alsa-utils'', and ''alsa-tools''), sound worked great. The tools ''aplay'' and ''amarok'' are able to play wav files and audio CDs.

We have installed some text-to-speech applications, one called ''festival'' and another called ''flite''. Both can generate audio speech from text. Neither is yet linked with the ERSP software, however, and ''triangles.py'' continues to give errors when it tries to speak.

Getting wireless to work was a small nightmare. The madwifi driver had a few dependencies unavailable in our repository, but we were able to find them online.
Once the drivers were all installed we still struggled to get connected, but we think this might be due to interference in winlab.

A new discovery: if instead of running the script ''run_client_gui'' we run the program ''navtool'' directly, the camera works. The mechanics of this are still under investigation. At the end of the day, the ''navtool'' program and ''run_client_gui'' script are still a mystery. The program ''navtool'' outputs countless warnings that "The audio device has not been opened yet," which follow a failed attempt to make a directory. The program appears to be searching for something in ''/home/builder'', a directory which does not exist at all, as there is no user named builder. Also, opening the vSlam applet, which previously just produced an error in the debug output, now causes an immediate crash of the application (a step backwards?).

An additional problem I have come across is that the tool ''composer.sh'', which is supposed to be a graphical tool for composing robot behaviors, outputs an error regarding ''libpthread.so.0''. This is the first tool we have attempted so far that uses java.

We tested a second chassis today. All is in working order. I need an orbit account.

== 06/08/07 - Day 5 ==

I got an orbit account.

We spent all morning trying to get ''navtool'' to work. We had some success with Mobile 33, which began a preliminary exploration of the orbit room. Unfortunately there are still some kinks that need to be worked out before the robot can be controlled remotely via SSH. Mobile 33, controlled locally (as opposed to over ssh), has a functioning ''navtool'' application. The camera and software joystick both work well. The IR sensors appear not to be doing anything, however, or at least the robot is not responding when it crashes into walls. One of the steps in getting this software to work was creating an obscure symlink.
The ''navtool'' application searches for the file USB.xml in all the wrong places, one of which is a directory that appears to reflect an ERSP software developer's home directory.

We are currently working on ghosting the image of Mobile 33 to Mobile 36.

Weekly meetings for summer interns will be held on Wednesdays at 2pm.

One feature I would like to add is battery monitoring for the laptops. As things stand, we have no way of knowing how much battery is left. Twice so far I've had my computer abruptly switch off on me.

= Week 2 =

== 6/11/07 - Day 6 ==

Today we used Ghost for Linux to obtain an image of Mobile 33. We were able to set up an ftp server and upload the image there.

It has become clear that there is some sort of internal conflict with USB devices that is preventing all of our peripherals from operating simultaneously. I noticed the package ''udev'' fails to launch on boot. The daemon can still be started afterwards, however, using the command ''udevd''. Studying dmesg output reveals that the USB hub we are using registers devices as low speed USB. There are only two USB ports on the laptop, and we have three devices to attach: camera, motor, and IR sensors. Our working configuration has the IR sensors and motor plugged into the hub (low speed USB) and the camera plugged directly into the laptop.

I have been able to successfully run ''run_client_gui'' a few times now, with all three peripherals functioning. I believe udev was a large part of the problem, but it is also important that the devices are plugged into the ports as described above. I've now used the navtool application successfully several times, and believe the trick is all in the order of testing peripherals.
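As an aside on the low-speed USB registrations mentioned earlier today: a quick way to spot them is to scan the kernel log. A minimal sketch of our own (the message format is assumed from the lines dmesg printed on our laptops, not from any documentation):

```python
"""Scan kernel log output for low-speed USB device registrations.
Hypothetical helper, not part of ERSP; the log line format is an
assumption based on what dmesg showed on our laptops."""
import re

# Matches lines like "usb 1-1: new low speed USB device using address 2".
LOW_SPEED = re.compile(r"new low.?speed usb device", re.IGNORECASE)

def low_speed_lines(dmesg_output):
    """Return the kernel log lines that register a low-speed device."""
    return [line for line in dmesg_output.splitlines()
            if LOW_SPEED.search(line)]
```

Piping the output of ''dmesg'' through this should list one line per device that came up at low speed, e.g. anything attached via the hub.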
As far as I can tell, if this sequence is followed, the navigation tool will work:
 * Plug the motor and IR sensors into the hub
 * Plug the hub into the side USB port
 * Plug the camera into the back USB port
 * Run ''test_battery'' and ''test_range_sensors''
 * Run ''test_camera''
 * Run ''run_slam_explore''
 * Simultaneously, run ''run_client_gui''

At the present time, Mobile 33 works well when these steps are followed. Mobile 37 is temporarily out of commission, but we hope to have a clone of Mobile 33's drive installed early tomorrow. The disk image will be created overnight.

One of the next things I would like to do is calibrate the camera and odometry systems. The odometry calibration demands a joystick, which we do not have yet, but Ivan has given the go-ahead to order one. There are two calibration tools for the camera, an intrinsic one and an extrinsic one. The intrinsic calibration deals with parameters of the camera itself, whereas the extrinsic calibration involves the camera's positioning on the robot.

I experimented with ''calibrate_camera_intrinsic'' today. The procedure involves taping a checkerboard printout to a flat surface (one of the pillars in Orbit) and positioning the camera in such a way that the software can recognize the squares and corners. Several pictures from different angles need to be taken to calibrate the camera well. We experienced much difficulty getting the software to recognize the shapes. This seems to be at least partially due to the room's lighting. After playing with different distances between the camera and the image, and different ambient lighting scenarios, I was able to take one successful calibration image. I hope to take several more tomorrow.

I'd like to set some goals for this week. First off, I would like to complete calibration for both robots and have them wandering around successfully using ''navtool''. Additionally, I would like for us to write some of our own code, either in c++ or python, that uses the obstacle avoidance feature successfully.
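The startup sequence listed at the top of this entry could be wrapped in a small launcher so we stop getting the order wrong. A sketch (the tool names are the ERSP binaries, assumed to be on the PATH; the strict ordering is our empirical finding, not anything documented):

```python
#!/usr/bin/env python3
"""Sketch of a launcher encoding our working startup sequence:
peripheral tests in order, then the slam server, then the client GUI.
The ordering is our own empirical finding, not documented behavior."""
import subprocess

# Hardware checks, in the order that has worked for us.
CHECKS = ["test_battery", "test_range_sensors", "test_camera"]

def start_navigation(run=subprocess.run, spawn=subprocess.Popen):
    """Run the hardware checks in order, then server and client."""
    for tool in CHECKS:
        run([tool], check=True)           # stop immediately if a peripheral fails
    server = spawn(["run_slam_explore"])  # server must start (and grab the camera) first
    run(["run_client_gui"])               # client GUI runs in the foreground
    return server                         # caller can wait() or terminate() the server
```

The `run` and `spawn` parameters exist only so the sequencing can be tested without the robot attached; on the robot they default to the real subprocess calls.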
== 6/12/07 - Day 7 ==

Image ghosting was successful. Not only did we make a duplicate of Mobile 33 onto Mobile 37, we also made a third copy on our recently acquired Mobile 40. For now I will refer to each robot by the fictional robot on its KDE wallpaper, so Mobile 33 = Terminator, Mobile 37 = Bender, and Mobile 40 = Artoo. Maybe we'll come up with better names for them soon.

We had some success calibrating the Terminator's camera with ''calibrate_camera_intrinsic''. The best place we found to do it was the lobby right next to the yellow doors of Winlab, where the light is dim and non-fluorescent. We intend to complete the now-mundane task of calibrating the remaining two cameras soon.

We accidentally ripped one of the cords that connects the battery to the motors today. This was an unfortunate setback, and it took some time to solder new connectors together.

I played around with the teleoperation applet of ''navtool'' for a little while. The premise of this software is to let the user click on the floor somewhere ahead of the robot, in a panel representing the camera's point of view, and have the robot navigate to the point clicked. The applet has an odd behavior, in that the camera picture does not appear in the panel where it is supposed to. It does, however, appear in the adjacent panel labeled "camera local." The applet still tries to work, and if I click somewhere in the blank panel the robot takes off, attempting (presumably) to move to the point where I would have clicked had the picture shown up. I hope to debug this error soon and have the teleoperation applet functioning properly.

I installed Sun's Java SDK on Bender. This bypasses the java errors we were having before, and I can now run the Behavior Composer using a modified script I called ''composer.sh.2''. The functionality of this program is still a little elusive, but I'm sure it will be useful once we get deeper into programming.
I feel like today we are close to completing the ''Getting Started Guide'' portion of the documentation. We have yet to calibrate all three robots and still need to figure out teleoperation. In addition, for all three robots we need to manually edit the ''resource-config.xml'' file, which keeps track of the spatial positioning of the cameras, sensors, and wheels of each robot (all three are shaped a bit differently).

== 6/13/07 - Day 8 ==

We completed intrinsic camera calibration for all three robots and extrinsic calibration for two of them (the Terminator's camera is not mounted properly, and we cannot perform calibration until we have some glue to fix it).

A potentially serious problem we've realized is that the navigation tool is not remembering any landmarks. This is a big problem because landmarks are the way the robots remember obstacles they've encountered in a room and build a map of their surroundings. Originally we thought lack of calibration was the problem, but having calibrated the robots we find that at least Bender still does not catalog any landmarks in Navtool. I now suspect the culprit may be a broken communication link between the vSlam applet and the camera. An error in the logs warns us to check permissions on /dev/video0, which are set to 666, as I believe they should be.

Today, magically, the teleop applet was working with camera support on Bender. When we clicked a destination for the robot, it moved in generally the right direction, but passed the target and spent a long time colliding with the wall until we switched off the battery. This is disappointing, especially considering that the camera is supposed to be calibrated.

This afternoon I figured out the landmarks problem. The very simple solution: run ''run_slam_explore'' before ever running ''run_client_gui''.
The theory: When we first started playing with navtool, we realized that if we started the server first, we did not see any video in the side panel labeled "camera local." We took this lack of video to mean that things were not working properly, and soon discovered that if the client was launched first and the server second, the video appeared. What we didn't understand was that whichever program is launched first hijacks the camera and prevents the other from accessing it. Therefore, when we launch the client first, we see the video, but the slam server cannot take pictures and so cannot create landmarks. If the server is launched first, we cannot see the video, but the slam server can take pictures and does create landmarks. Now, when I run the client, if I see the video right away I know something is wrong. (Often I had simply forgotten to turn on the battery.)

I spent part of the afternoon reassembling the chassis for Artoo. The previous design was pretty abysmal; there was no place for the motor battery, and the laptop couldn't even sit on it unless it was partially closed. I reworked it to solve those two problems, but I fear that the motors might be installed backwards as well. I'm not sure if this is the case, but we'll check it out early tomorrow and resolve the issue if it is one.

== 6/14/07 - Day 9 ==

As it stands, only Bender is fully operational. Artoo's chassis is looking good, but for some reason the robot still refuses to make landmarks. When in explore mode in Navtool, it moves in a peculiar pattern that made us wonder if there was something wrong with the left motor making it sluggish. The robot tends to travel in a curved path. However, using a python script (triangles.py) we found that the robot is quite capable of moving in a straight line.
I suspect that the lack of landmarks is a result of this erratic movement rather than any disconnection between the software and the camera; I remember reading somewhere in the guide that it is difficult for the robot to create landmarks while moving along curved paths. (This is by no means certain.) The thing keeping the Terminator from working for now is that the camera is unmounted. We need some epoxy to mount the camera, and then we will test its navigation system.

We learned how to calibrate the IR sensors today. The task involves measuring the position of the sensors with respect to the coordinate system of the robot, which is defined as follows: the origin is located on the floor directly between the two back wheels and directly below their common axis. Positive X is the forward direction of the robot, positive Y is leftwards, and positive Z is up. The parameters roll, pitch, and yaw are angles defined as rotations about the axes X, Y, and Z respectively. To calibrate the IR sensors, all we need to do is measure these 6 quantities (position and rotation) for each sensor and declare them in the file ''resources-config.xml''.

We turned our full attention to the Tutorials Guide for the first time today. The first tutorial, 01-simple, is a program that makes the robot move a given distance forward, take a picture, and display the picture momentarily. We got the code to make the robot move and snap a picture, but the code creating the GUI picture display refused to compile. The compiler complains about the type ''!IImageWindow'' being undefined, although the !API indicates that it is a class in the Evolution namespace. We will investigate this more tomorrow.

It took quite a while before I felt like I understood this first bit of code. The programming is very high level, and many objects need to be created for apparently elusive reasons. For example, one method takes two arguments. Before these arguments can be passed to the method, they must be stored in a !TaskArgs object.
To complicate things further, one of the arguments is a !DoubleArray. A !DoubleArray is an array of doubles, but stored in an object defined by Evolution. Rather than simply creating an array of doubles, we need to create an array of doubles and pass it as an argument to a new !DoubleArray object. This instance of !DoubleArray is then passed to !TaskArgs, which in turn gets passed to a new !TaskContextPtr, a sort of smart pointer containing the data needed to execute a task. To execute a method, we must pass the result of a !TaskContextPtr's get() method to a task functor's run() method. The whole process seems a bit convoluted, but we're hopeful that we'll be able to accept it and use it.

I read ahead a bit in the tutorials guide. Among the many features Evolution provides is an object which handles the processing of multiple tasks simultaneously. This will certainly come in handy when we write our own code.

We've gotten a few suggestions for code we should write, and I think we should get started on it soon. First off, I would very much like to see obstacle avoidance working with our code, as I feel it is in many ways fundamental if the robots are ever to be useful in Winlab. Once this is figured out, on Ivan's suggestion we should write some code that makes the robots move about essentially randomly (except for obstacle avoidance). Additionally, I would like to make some robust and easy-to-use !MoveToCoordinate-type commands which, through SSH or whatever other means, will cause the robots to design a smart path and travel to a desired location. With these goals accomplished reliably, we'll be able to start using the robots to study wireless communication.

== 6/15/07 - Day 10 ==

It's been an exciting day. After a small amount of work, we've confirmed that all three robots can happily wander around Orbit, making landmarks. Something we realized today was that the sensors on Bender's chassis were mixed up.
The East sensor was on the left side and vice versa. We corrected this by modifying the sensor locations in resource-config.xml, because it would have taken quite a bit of disassembly to physically switch the two sensors.

We're having bad luck compiling our own c++ code. The sample code shipped with ERSP comes with ''Makefile''s, which reduce the job of compiling to issuing the command ''make'' in the correct directory. We have not been able to successfully write our own makefile to compile our code, and are finding that running gcc or g++ directly does not work at all; many flags and parameters are needed by the compiler. I'm trying to learn how to use the commands ''autoscan'' and ''automake'' to generate makefiles automatically, but the going is slow and many errors are appearing along the way.

In the meantime, Vamsi has been playing with python scripting with some great success. He discovered a command which causes the robot to wander around with obstacle avoidance enabled. It seems python has access to much of the ERSP and may actually be enough for what we need from the robots. However, it is not nearly as well documented and might be much more difficult for us to learn.

= Week 3 =

== 6/18/07 - Day 11 ==

Today was full of reading: reading the API, reading the User's Guide, reading sample code, and reading the tutorials.

We started using the behavior composer for the first time, and find it has some pretty nifty features. The essential command is ''behave''. A behavior is something the robot does constantly, as compared to a task, which is generally a one-time assignment. For example, a task might be to move across a room, whereas a behavior is to repeatedly check for obstacles and change course to avoid them.

We still cannot figure out how to compile our own code or generate makefiles.
Instead, we have been writing small code snippets using the same filenames as those in the sample code (e.g. ''simple.cpp'') and using the provided makefiles with our fingers crossed.

Right now I believe the most important thing for us to understand is the last three tutorials in the navigation section. These are the tutorials about using vSlam and creating landmarks. Understanding this will be, in my opinion, the critical step towards successfully using the robots in wifi experiments. Unfortunately (though not surprisingly), the code resists compiling. Navigation Tutorial !#4 is the introduction to vSlam, and consists of 4 programs, only the first of which actually works. This first program causes the robot to move in a "bow-tie" pattern, but does not demonstrate any of the map-building capabilities of the robot. The remaining programs give compile errors, often this one:
{{{
error: extra qualification 'Evolution::OccupancyGrid::' on member 'distance_to_obstacle'
}}}
I sincerely hope we can overcome our compiler issues shortly. In the meantime, we have python to work in.

A complication we keep running into is that we do not have a joystick here at winlab to operate these robots, and while a joystick is not strictly required, much of the sample code assumes you have one. It is hard for us to study the sample code as we cannot see it in action. Additionally, a joystick is required to run the provided odometry calibration scripts (something we probably should have done by now). We plan to bring in some of our home gaming equipment tomorrow; Vamsi has a USB joystick and I have a USB xbox controller. Hopefully at least one of those will work and allow us to explore the sample code better.
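As a note to ourselves on the task-vs-behavior distinction from today's reading, here is a sketch of our mental model in python. Nothing here is ERSP API; it just captures the difference between a one-time assignment and something re-evaluated continuously:

```python
"""Sketch of our mental model of ERSP's task/behavior distinction.
None of this is ERSP API: a task runs once to completion, while a
behavior is a loop that keeps re-evaluating for as long as the robot
is active."""

def run_task(task):
    """A task is a one-time assignment: run it once and return its result."""
    return task()

def run_behavior(step, should_continue):
    """A behavior runs repeatedly (e.g. check for obstacles each cycle).
    Returns the number of cycles executed."""
    cycles = 0
    while should_continue(cycles):
        step()
        cycles += 1
    return cycles
```

In these terms, "move across the room" is a `task`, while "avoid obstacles" is a `behavior` whose `step` checks the IR sensors and nudges the course.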