I am Jerry Webb, creator of Graffiti Bot. I built a distance detector using ultrasound waves, and I love to produce music and build computers.
The first song that my robot sings is "Yuck" by 2 Chainz ft. Lil Wayne. The second is a song I composed and produced myself, called Rainy Reality. The robot uses the command robot.beep(time, frequency): the frequency determines the pitch of the beep, and the time determines how long the beep lasts. Instead of entering a raw frequency for each note, I used the notes of an actual piano to represent each frequency, and I used a tempo in bpm (beats per minute) to set the pace of the song.

Graffiti Bot loves the holiday season, so it decided to draw a Christmas tree. I programmed each movement individually and had to calculate the radius of each tree branch so that the higher branches would be smaller than the lower ones. Here the robot.motors and robot.forward functions were used.

Graffiti Bot can also do one of three things: sing the other song that I composed, spin around aimlessly, or refuse to continue performing. Each of these actions is assigned a number from one to three, and a randomly generated number selects which one to perform.

Since my robot is called Graffiti Bot, it likes to run around and draw graffiti all over the floor. To do this, it uses a combination of IR, line, light, and obstacle sensors to detect objects that might interrupt its drawing session. The IR sensor detects what is behind Graffiti Bot, and the obstacle sensor detects what is in front of it. The line sensor detects how dark or light the surface Graffiti Bot is driving on is; while moving, the robot constantly checks whether it is on a dark surface. Because Graffiti Bot likes to tag at night, the light sensors check whether the light value is above 65,000. If so, the robot beeps to alert other Graffiti Bots to run away.

Check out just a few of my robot's talents: it can sing, draw, sense objects in front of and behind it, and sense light. This is the code for my robot's performance.
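The note-name idea above can be sketched roughly as follows, assuming a Myro-style robot.beep(duration_seconds, frequency_hz) call. The NOTES table, the play_song helper, and the stub robot are illustrative stand-ins, not the original performance code.

```python
# Map piano note names to frequencies so songs can be written as
# (note, beats) pairs instead of raw Hz values. Equal-tempered
# pitches around middle C; the selection here is just an example.
NOTES = {
    "C4": 261.63, "D4": 293.66, "E4": 329.63, "F4": 349.23,
    "G4": 392.00, "A4": 440.00, "B4": 493.88, "C5": 523.25,
}

BPM = 120            # assumed tempo for this sketch
BEAT = 60.0 / BPM    # seconds per beat

def play_song(robot, melody):
    """Play a melody given as a list of (note_name, beats) pairs."""
    for name, beats in melody:
        robot.beep(beats * BEAT, NOTES[name])

class StubRobot:
    """Stands in for the Scribbler; records each beep call."""
    def __init__(self):
        self.calls = []
    def beep(self, duration, frequency):
        self.calls.append((round(duration, 3), frequency))

bot = StubRobot()
play_song(bot, [("C4", 1), ("E4", 1), ("G4", 2)])
print(bot.calls)   # [(0.5, 261.63), (0.5, 329.63), (1.0, 392.0)]
```

With this setup, changing BPM rescales every note's duration at once, which is how a single tempo value can control the whole song.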
Today, 9/26/2012, I gave my robot personalities. I gave him five: alive, coward, aggressive, love, and explorer. Alive moves when light is shined onto the robot. Coward moves away from the light, while aggressive moves toward it. Love drives toward the light and explorer moves away from it, but Graffiti Bot moves backwards for love and explorer.
Behavior Codes
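Two of these personalities can be sketched as simple light-reaction rules, assuming a Myro-style light reading (on the Scribbler, lower values mean brighter light) and motors(left, right) speeds in [-1, 1]. The threshold and speeds below are made-up values, not the original tuning.

```python
# Each "personality" maps a light reading to a (left, right) motor
# pair. BRIGHT is an assumed cutoff for "light detected".
BRIGHT = 400

def coward_step(light_value):
    """Coward: back away when a bright light is shined on the robot."""
    if light_value < BRIGHT:    # lower reading = brighter light
        return (-0.5, -0.5)     # both motors reversed: retreat
    return (0.0, 0.0)           # no light: stay put

def aggressive_step(light_value):
    """Aggressive: charge toward the light instead."""
    if light_value < BRIGHT:
        return (0.5, 0.5)       # both motors forward: advance
    return (0.0, 0.0)

print(coward_step(100))      # (-0.5, -0.5)
print(aggressive_step(100))  # (0.5, 0.5)
print(coward_step(900))      # (0.0, 0.0)
```

Love and explorer would follow the same pattern with the motor signs flipped, which is why Graffiti Bot drives backwards for those two.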
My group members for the behavior lab were Kristen McNeal, Zach Zeff, and Mike Ciesielka. 9/26/2012
Today, 10/5/2012, Graffiti Bot competed in the Robot Games, a series of challenges that each robot had to complete. My teammates were Brice, Teresa, and Taylor. Our team was responsible for creating and compiling code for four different events; my event was the opening ceremony. I had to create a program that could control the robot with commands from a computer keyboard and play the USC fight song. I also created the main command console that the user uses to select which of the four programs to execute.

One of the tasks our robots had to complete was to follow a black line on the ground. Our robot had to use all four line sensors on its underside to constantly scan for a dark surface. When one of the sensors detected a white surface, the robot would turn away from it. Graffiti Bot had to move very slowly in order to accurately detect the line.

The next task was to navigate through a maze made of cardboard boxes. Our code made use of the IR sensors on the back of the robot instead of the obstacle sensor on the front, because we found that the IR sensors are more accurate at measuring the distance to an object.

Graffiti Bot also had to draw a series of triangles in the shortest time possible. For this algorithm, we used a series of commands that controlled each motor individually, and we told the robot to draw the picture without repeating a line.

Here is the code for the games.
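The line-following decision can be sketched like this, assuming four line sensors ordered left to right that read True on a dark surface, and motors(left, right) speeds. The speeds are illustrative; the original tuning is not recorded here.

```python
# One step of the line follower: keep driving while centered on the
# dark line, and turn away from whichever side sees white.
SLOW = 0.2   # Graffiti Bot had to move slowly to track the line

def line_follow_step(sensors):
    """sensors: [far_left, left, right, far_right], True = dark."""
    left_dark = sensors[0] and sensors[1]
    right_dark = sensors[2] and sensors[3]
    if left_dark and right_dark:
        return (SLOW, SLOW)        # centered on the line: go straight
    if not left_dark:              # white under the left side:
        return (SLOW, SLOW / 2)    # slow the right wheel, turn right
    return (SLOW / 2, SLOW)        # white on the right: turn left

print(line_follow_step([True, True, True, True]))    # (0.2, 0.2)
print(line_follow_step([False, False, True, True]))  # (0.2, 0.1)
```

Because the turn only begins after a sensor has already crossed onto white, a slow base speed gives the correction time to pull the robot back before it loses the line entirely, which matches why Graffiti Bot had to crawl through this event.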
Today, 1/16/12, Graffiti Bot was used for search and rescue. We were instructed to move into an unknown area with only our robots and search for other lost Scribblers within it. The catch to this mission was that we were not able to see our robot while operating it. Because of this, we had to program our robot to send back information using its IR and obstacle sensors. Graffiti Bot also has a forward-facing camera that can take pictures of whatever is in front of the robot, but the latency between taking a picture and displaying it to the user is too high for the camera to be used for streaming.

My code asks the user for an input command to make the robot move forward, backward, left, or right, or to take a picture. Before executing a move command, the robot scans its surroundings using the appropriate sensors. If there is an object close by, it displays the distance of that object from the Scribbler and then asks the user whether to continue.

Unfortunately, two of the four team members, including myself, were not able to run the code with their robots (my VirtualBox data was erased, which destroyed ALL of the code in my virtual machine). This left our team (Brice, Taylor, Theresa, and myself) with only two robots, forcing our strategy to "just go and take pictures". We were able to find two of the three lost Scribblers.

Here is the code for the rescue.
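The check-before-move logic from the rescue run can be sketched as a single decision function. The command letters, the NEAR threshold, and the action names are stand-ins for illustration; the real code also printed the measured distance before asking the user.

```python
# Decide what to do with one keyboard command, given the latest
# obstacle reading (assumed: higher reading = object closer).
NEAR = 1000   # assumed "object close by" cutoff

def handle_command(cmd, obstacle_reading):
    """Map a command to an action, blocking moves near obstacles.

    cmd is one of 'f', 'b', 'l', 'r', 'p' (forward, backward,
    left, right, picture). Pictures are always allowed: the camera
    is too slow to stream, so it is used shot-by-shot on demand.
    """
    if cmd == "p":
        return "take_picture"
    if obstacle_reading > NEAR:
        return "ask_user"        # report the distance, confirm first
    return {"f": "forward", "b": "backward",
            "l": "turn_left", "r": "turn_right"}[cmd]

print(handle_command("f", 200))    # 'forward'
print(handle_command("f", 5000))   # 'ask_user'
print(handle_command("p", 5000))   # 'take_picture'
```

Gating every move on a fresh sensor reading is what let the operator drive blind: the robot, not the unseen driver, was responsible for noticing walls.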