**Project 1: Line Following -- Due 9/28** **[ENGR 28 Fall 2023](index.html#schedule1_2023-9-14)** # Overview In this lab, you will learn about the Open Source Robotics Foundation's ROS software, and get to know your TurtleBot 2 robot. You will also program your TurtleBot to follow a dashed red tape line on the floor of the lab. # Tasks ## Computer setup This semester, your group will be assigned one of our TurtleBot 2 robots to work with. Each robot has a laptop associated with it, with the robot name labeled on the lid. In the future, it will be useful to have another computer to work with while you are completing the labs, in order to run software remotely on the robots. Your personal laptops will probably suffice (but you won’t need them today). Robots will be shared between two teams. Your group should choose a login password, and share it with me. I will let you know which robot to use, and the username of the login account on your robot. Once you write your names and your login password on the slip of paper I give you, I’ll get you up and running with the computer. You will be using the Linux command line extensively when operating your TurtleBot, so you should familiarize yourself with it as you complete this project. One tutorial you might want to look over later on is http://www.ee.surrey.ac.uk/Teaching/Unix/. We will be using the **terminator** software to access the command line. The icon (on the left of your laptop’s screen) looks like this:  As the icon suggests, **terminator** conveniently allows you to split its window into a bunch of tiled terminal sessions. The shortcut keys to do this are `Ctrl + Shift + O` to split the current pane horizontally (one terminal stacked above another), and `Ctrl + Shift + E` to split it vertically (two terminals side by side). When the tutorials linked below ask you to open a new window, consider just splitting the current one instead. !!! 
Tip You can often use tab completion in terminal windows by typing the first few letters of a command or a command-line argument and then hitting tab to automatically complete the rest. ## ROS Tutorials Load the ROS tutorial page at http://www.ros.org/wiki/ROS/Tutorials. Along with your group, and taking time to make sure that everyone is following what is going on, complete at least the following tutorials: - [2. Navigating the ROS Filesystem](https://wiki.ros.org/ROS/Tutorials/NavigatingTheFilesystem) - [3. Creating a ROS Package](https://wiki.ros.org/ROS/Tutorials/CreatingPackage) - [5. Understanding ROS Nodes](https://wiki.ros.org/ROS/Tutorials/UnderstandingNodes) - [6. Understanding ROS Topics](https://wiki.ros.org/ROS/Tutorials/UnderstandingTopics) - [7. Understanding ROS Services and Parameters](https://wiki.ros.org/ROS/Tutorials/UnderstandingServicesParams) - [8. Using rqt_console and roslaunch](https://wiki.ros.org/ROS/Tutorials/UsingRqtconsoleRoslaunch) - [12. Writing a Simple Publisher and Subscriber (Python)](https://wiki.ros.org/ROS/Tutorials/WritingPublisherSubscriber%28python%29) - [13. Examining the Simple Publisher and Subscriber](https://wiki.ros.org/ROS/Tutorials/ExaminingPublisherSubscriber) **Keep these in mind when completing the tutorials:** !!! Warning For all of the tutorials above, choose the *catkin* build system as opposed to *rosbuild*. !!! Warning You do not need to run any commands in these tutorials that begin with **`sudo`** -- these commands have already been run on your laptop for you, and they only need to be run once. !!! Warning For tutorial 3, part 2, you will not need to follow the “Creating a workspace for catkin” tutorial, since your account was already set up with a workspace. It’s probably a good idea to take turns reading over each tutorial, performing the steps, and explaining the material to your group members. 
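As a quick review before the quiz, the talker from tutorial 12 boils down to the pattern sketched below. This is only a condensed sketch, not a substitute for the tutorial code; consult the wiki page for the authoritative version.

```python
#!/usr/bin/env python
# Sketch of the tutorial-12 talker pattern (see the ROS wiki for the
# real thing).

def make_greeting(count):
    # Build the string the talker publishes, e.g. 'hello world 123'.
    return 'hello world %d' % count

def main():
    # ROS imports live inside main() so that make_greeting() can be
    # tried out on a machine without a ROS installation.
    import rospy
    from std_msgs.msg import String

    rospy.init_node('talker')
    pub = rospy.Publisher('chatter', String, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz, as in the tutorial
    count = 0
    while not rospy.is_shutdown():
        pub.publish(String(data=make_greeting(count)))
        count += 1
        rate.sleep()
```

Note that the publisher is created once, outside the publishing loop; the same principle comes up again in the reverser task below.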
There will be an in-class quiz on Thursday, Sep 21 on the material in the tutorials, so make sure that everyone in the group understands what’s going on! ## Modify the simple publisher and subscriber In the same directory you created for tutorial 12, make a new program, **`reverser`**, that subscribes to messages on the `chatter` topic, and publishes the same message, backwards, on a topic named `rchatter`. For instance, if the talker program publishes the message ~~~ none hello world 123 ~~~ then the **`reverser`** program should publish the message ~~~ none 321 dlrow olleh ~~~ You should be able to verify that the **`reverser`** program is working by running the command ~~~ none rostopic echo /rchatter ~~~ in the terminal. !!! Warning When writing ROS programs, it is inefficient (and generally incorrect) to create a publisher more than once (e.g. every time a subscriber callback is called). Instead, you should prefer to make publishers [global variables](https://stackoverflow.com/questions/423379/using-global-variables-in-a-function) or class member variables, or find some other way of using an already-defined publisher in your subscriber callback. ## Get to know your TurtleBot Once you’re familiar with ROS, you should get to know your TurtleBot. Make sure the robot is powered up, make sure the robot and Kinect are plugged into the USB ports, and then start the robot control software by typing the command ~~~ none roslaunch turtlebot_bringup minimal.launch ~~~ !!! Tip You may occasionally see an error in this window that looks something like `[ERROR] Kobuki: malformed sub-payload detected.` This is normal and can be ignored. You should hear a rising series of beeps from the robot to indicate that it is connected and ready to go. In a separate terminal, run ~~~ none roslaunch turtlebot_bringup 3dsensor.launch ~~~ !!! 
Tip When you start up the 3D sensor launch file, you may see warnings about not finding a compatible depth output mode and/or a camera calibration file not being found. These are normal and can be ignored. Next, you can launch the visualizer tool **`rviz`**: ~~~ none roslaunch turtlebot_rviz_launchers view_robot.launch ~~~ If the **`rviz`** program crashes (this sometimes happens, sadly), just try the command again. When it launches, you will see the **`rviz`** interface. Maximize the window, and then try bringing up the "Image" and "Registered DepthCloud" displays by clicking the check boxes on the left. You might need to wait a second for either to appear. You might also want to rotate the 3D display (by clicking and dragging) to see what’s going on with the DepthCloud. Finally, close the rviz window (pick the “Close without saving” option if prompted whether to save the settings), go back to the terminal, and run ~~~ none roslaunch turtlebot_teleop keyboard_teleop.launch ~~~ Gently place the robot on the floor, with the laptop on top of it and the lid still open. Being careful to catch the laptop if it falls, try using the keyboard to control the robot and drive it around for a bit. Please do not increase the speeds from the default. To safely shut down your robot, hit `Ctrl + C` in the terminal windows with running ROS processes, in the reverse order from which you started them (i.e. kill the minimal.launch process last). When the robot control software is shut down, you will hear a series of falling beeps from the robot to let you know it’s ready to be safely powered off. ## Color blob detection Now you will prototype some simple color-seeking behaviors for your TurtleBot. Make sure the `minimal.launch` and `3dsensor.launch` files are running in their own terminal windows (as described in the previous task), and then run ~~~ none rosrun image_view image_view image:=/camera/rgb/image_raw ~~~ It should bring up a window that shows you what the camera is seeing. 
Be patient when bringing it up for the first time – if it still doesn’t work after a while, ask me for help. Now that you know the camera is working, close **`image_view`** and you can run two more applications. In one window, run ~~~ none roslaunch blobfinder2 2d.launch ~~~ and in a fourth window, run ~~~ none rosrun image_view image_view image:=/blobfinder2/debug_image ~~~ Now you should be able to position the robot to look at red tape and cones on the floor, and see them detected as colored blobs. You can also get information about detected blobs by running ~~~ none rostopic echo /blobfinder2/blobs ~~~ You can see the message definitions for blob messages [here](https://github.com/swatbotics/blobfinder2/tree/main/msg). You can also see some sample code for dealing with these messages in Python in the [Project 1 starter code](https://github.com/swatbotics/e28-project1-f2023/blob/main/scripts/blobfinder2_example.py). ## Make a line follower There is a directory called `project1` in your e28 labs directory, which you can access via ~~~ none roscd project1 ~~~ There is starter code in [`project1/scripts/starter.py`](https://github.com/swatbotics/e28-project1-f2023/blob/main/scripts/starter.py), which implements the skeleton of a state-machine-based controller. This node will cause your robot to wander randomly, while observing the safety sensors such as bump and cliff sensors. After you read the code and run it (with the robot safely on the ground), copy the file to a new file called `line_follower.py` and use it as a starting point for this task. !!! Tip Note that the state machine implementation for this controller never immediately changes state or control outputs in the subscriber callbacks. Instead, the software simply updates some internal variables and lets the main control callback handle state changes and actions. I suggest you preserve this scheme when you add in a subscription to blob messages! 
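The record-then-act scheme described in the tip can be sketched in plain Python. Everything below is hypothetical for illustration: the class and field names (including the blob's `area` and `cx` fields) are stand-ins, not the actual starter code or blobfinder2 message fields, and the gain value is a placeholder you would tune by experiment.

```python
from collections import namedtuple

# Stand-in for a blobfinder2 blob message; the real message's field
# names may differ -- check the message definitions linked above.
Blob = namedtuple('Blob', ['area', 'cx'])

IMAGE_CENTER_X = 320   # half of the 640-pixel image width
KP = 0.005             # proportional gain -- placeholder; tune by experiment

class LineFollower:
    """Hypothetical controller illustrating the record-then-act scheme."""

    def __init__(self):
        self.last_blobs = []   # written only by the subscriber callback

    def blob_callback(self, blobs):
        # Just record the data; never command the robot from here.
        self.last_blobs = blobs

    def dominant_blob(self):
        # Largest-area blob, or None if nothing has been seen yet.
        if not self.last_blobs:
            return None
        return max(self.last_blobs, key=lambda b: b.area)

    def control_callback(self):
        # Called at a fixed rate; returns the commanded turning rate.
        blob = self.dominant_blob()
        if blob is None:
            return 0.0   # a real controller would switch to another state
        return KP * (IMAGE_CENTER_X - blob.cx)
```

Keeping all decisions in the control callback means the robot's behavior depends only on the most recently recorded data, no matter how often (or rarely) blob messages arrive.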
Your new node should control the robot to follow the tape line on the floor. It should look at the most recent set of blob messages to find a "dominant" blob corresponding to the red tape – perhaps the one with the largest area. Then, a good basic control law to start out with is to set the robot’s turning rate proportional to the horizontal offset of the dominant blob from the center of the image (the camera image is 640 pixels wide, so its center is at $x = 320$): $$ \dot{\theta} = k_p (320 - x)$$ You will need to use some basic physical intuition and experimentation to discover a good coefficient $k_p$. As you develop your line-following behavior, you should preserve the functionality of the bump and cliff sensors -- that is, when following the red line, the robot should stop immediately when picked up, and should back up and/or turn when bumped. The best way to do this is to add one or more additional states to the state machine. Once you get basic line following working, try to get your robot to do something intelligent when it loses the line (e.g. turn in place until it sees the line again, or look for another line). At the very least, your robot should return to the `wander` state once it loses the line. Going further: can you make a robot that can still follow a particular line accurately if there are a number of them next to each other? What about a continuous spiral? # What to turn in You should demonstrate to me: * completion of the ROS tutorials, and knowledge of their contents * your finished **`reverser`** node * your finished **`line_follower.py`** node You will submit your code via the [Swarthmore github](https://github.swarthmore.edu/) (instructions coming soon). Please also submit a short (one- to three-page) lab report describing how you developed your program, including: * who did what * what problems you ran into, and how you solved them * what functionality you added beyond the minimal line following