What is the Killdozer? The Killdozer is the robot that's gonna beat up your robot. It's gonna punch your robot so hard its little robot kids are gonna feel it. I mean, look at this thing:
That is one lean, mean, robot killing machine. It's killed more robots than John Connor. More robots than paradoxes. It can launch more predator drones than Kyle Negrete. It's gonna fly to mars and beat up the Curiosity for daring to play crappy Will.i.am albums. It's going to retroactively destroy the works of Isaac Asimov and erase the movie Utopia from history. If you're a robot, you better watch your back.
The Killdozer can move so fast it goes back in time. It can punt for 200 yards. As for me, its mere handler, I like Computer Science because of the ability to get tangible results quickly. During the summer I stayed at my summer home in the Bahamas that I let Mitt Romney rent out when I'm not using it. Sipping mojitos with playboy models does get old eventually though, so I decided to go to USC. In my spare time I do nothing but read the great classics in my smoking jacket with an attendant to top off my drink. I'm passionate about robots, websites, and robot websites.
Here is where I write some words about what we did in class and also my deepest fears maybe:
9/4/12 So my group of dudes and I used a robot to draw this weird triangle thing from a video game because we are lonely nerds and will never know the touch of a woman. Here's a picture if you want to see it, but why would you unless you were some sort of professor grading things or something:
9/6/12 My group and I tested out our sensors today. My group being Jeffrey Chau (robot = Crank), Timothy So (LGM), and Carmen Tan (Castille). Seeing as how I can't connect to my robot yet, I got Jeffrey to connect to it and check the sensors. For the line sensors we got a 1 when the robot was on a dark surface and a 0 when it was on a light one. The end.
9/7/12 My group, consisting of Carmen Tan, Tim So, and Jeffrey Chau, tested our robots' sensors in varying positions today. First we tested our light sensors, which display high values for darkness and low values for light, which probably only makes sense to the guy who coded it. There are three light sensors, so each test gives three result values.
| | Condition 1 | Condition 2 | Condition 3 |
| --- | --- | --- | --- |
| Me | 65284, 65400, 65287 | 65408, 65402, 65406 | 3343, 3138, 3435 |
| Tim So | 64879, 64609, 64829 | 65408, 65408, 65408 | 2197, 2815, 1940 |
| Jeffrey Chau | 64777, 64825, 65149 | 65155, 65059, 65276 | 1408, 1720, 1710 |

| | Distance |
| --- | --- |
| Tim So | 9 inches |
| Jeffrey Chau | 15 inches |
| Carmen Tan | 11 inches |

| | 1 Foot | 2 Feet | 3 Feet |
| --- | --- | --- | --- |
| Me | 6400, 6400, 6400 | 3200, 3200, 4480 | 1920, 1280, 2560 |
| Tim So | 1280, 1920, 640 | 5760, 5760, 5120 | 6400, 6400, 6400 |
| Jeffrey Chau | 6400, 6400, 6400 | 5760, 6400, 6400 | 2560, 2560, 3200 |
| Carmen Tan | 6400, 6400, 6400 | 5120, 5120, 5120 | 5120, 4480, 6400 |
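For the curious, readings like the rows above could be collected with a short loop. This is just a sketch: the stub below stands in for Myro's `getLight()` call (which returns the three light sensor values) so the snippet runs without a robot plugged in.

```python
# Stub standing in for the robot's light-sensor call; with the real robot
# this would be Myro's getLight(), returning [left, center, right] values.
def getLight():
    return [65284, 65400, 65287]  # a sample "dark" reading from the table

def read_light_sensors(trials=3):
    """Collect several three-sensor readings for one lighting condition."""
    return [getLight() for _ in range(trials)]

print(read_light_sensors())
```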
My group, consisting of Carmen Tan, Tim So, and Jeffrey Chau, drew a Fibonacci spiral with our robot today. I coded a loop to generate the Fibonacci sequence and they found the math to draw the spiral.
The code can be found here
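In case the link goes stale, the loop part is easy to sketch. The move list below is only an illustration: the `unit` time scale and the 90-degree quarter-turns are my assumptions, not necessarily the math my group worked out.

```python
def fib(n):
    """First n Fibonacci numbers: 1, 1, 2, 3, 5, ..."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

def spiral_moves(n, unit=0.5):
    """(drive_time, turn_degrees) pairs approximating a Fibonacci spiral:
    drive a distance proportional to each Fibonacci number, then quarter-turn."""
    return [(f * unit, 90) for f in fib(n)]

print(spiral_moves(6))
```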
Robot Talent Show Assignment: Pre-programming Questions
1) The robot will beep the melody of the 1812 Overture by Tchaikovsky when the obstacle sensor is tripped. There will be no special control algorithm, just a series of .beep commands.
2) The robot will draw a stick figure when either IR sensor is tripped. There will be no algorithm, just a series of movement commands.
3) The robot will draw a regular polygon with a random number of sides when a light sensor is covered up. It will use a for loop to do this.
4) The entire code will be in a while loop that exits when the line sensors no longer detect solid ground or the battery drops too low. The robot will use if statements to check whether the sensors have been triggered to start the song, the obstacle routine, or the surprise round.
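The for loop in part 3 hinges on one fact: a regular polygon's exterior angles sum to 360 degrees, so each turn is 360/n. A sketch (the 3-to-8 side range is my own choice for illustration):

```python
import random

def polygon_turns(n_sides):
    """Turn angles for drawing a regular polygon: n equal exterior angles
    of 360/n degrees each (one forward move between turns on the robot)."""
    return [360.0 / n_sides] * n_sides

n = random.randint(3, 8)  # a random number of sides, per the plan
print(n, polygon_turns(n))
```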
A video of my talent show can be found on YouTube here. Here's a link to my code.
My group, consisting of Carmen Tan, Tim So, and Jeffrey Chau, wrote code to make the robot perform different behaviors based on user input. I wrote the indecisive behavior, which goes forward when the robot is in light and backward when it is in shadow. When it finds a divide between light and shadow, it will move back and forth perpetually. My code can be found here.
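The decision part of the indecisive behavior fits in a few lines. A sketch: the threshold is made up, and per our earlier sensor tests I assume low readings mean light and high readings mean shadow.

```python
THRESHOLD = 30000  # hypothetical cutoff; our light sensors read low in
                   # bright light and high in shadow

def indecisive_step(light_value):
    """One step of the behavior: drive forward in light, backward in shadow."""
    return "forward" if light_value < THRESHOLD else "backward"

# At a light/shadow divide the two cases alternate, so the robot
# shuffles back and forth across the boundary forever.
print(indecisive_step(3200), indecisive_step(65000))
```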
My group, consisting of Carmen Tan and Tim So, did the Robot Games stuff. Here's a link to the code.
1. The robot runs a loop in which the user provides inputs that make it move and play the fight song. When the loop ends, control returns to the main function. This uses no sensors.
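A sketch of that input loop, with made-up key bindings (the real code maps each input to a movement or beep call; "q" ends the loop here):

```python
def teleop(inputs):
    """Run the command loop over a sequence of user inputs and return the
    actions taken; on the robot each action would be a move or song call."""
    actions = {"w": "forward", "s": "backward", "a": "turn left",
               "d": "turn right", "f": "play fight song"}
    taken = []
    for key in inputs:
        if key == "q":            # quit: fall back to the main function
            break
        taken.append(actions.get(key, "ignored"))
    return taken

print(teleop(["w", "f", "q", "s"]))
```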
2. I will use the robot's line sensors to detect whether it is on a black or a white surface. I will use if...else statements to control its actions: when both sensors detect white it will keep moving, it will turn right if only the left sensor detects the black line, and it will turn left if only the right sensor detects the black line.
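That if...else chain looks like this (1 = black, 0 = white; the both-black case isn't covered in the plan, so treating it as "keep moving" is my assumption):

```python
def line_follow_action(left, right):
    """Steering decision from the two line sensors (1 = black, 0 = white)."""
    if left == 1 and right == 0:
        return "turn right"   # line under the left sensor only
    elif left == 0 and right == 1:
        return "turn left"    # line under the right sensor only
    return "forward"          # both white (or both black): keep moving

print(line_follow_action(0, 0), line_follow_action(1, 0))
```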
3. The algorithm makes use of a while loop to have the robot go forward while the sensors register no obstacle. When the sensors register an obstacle, it has the robot turn once and check for obstacles. If there is an obstacle after the turn, it will turn around 180 degrees and try again. This process allows it to proceed in any of the four cardinal directions without hitting an obstacle.
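The turn logic can be modeled on headings alone. A sketch: I assume the single turn is a right turn, which the plan leaves unspecified.

```python
HEADINGS = ["N", "E", "S", "W"]

def next_heading(heading, blocked):
    """One avoidance step: keep the heading if clear; otherwise turn once
    (right here), and if that is blocked too, reverse 180 from there."""
    if heading not in blocked:
        return heading
    turned = HEADINGS[(HEADINGS.index(heading) + 1) % 4]
    if turned not in blocked:
        return turned
    return HEADINGS[(HEADINGS.index(turned) + 2) % 4]

print(next_heading("N", {"N"}), next_heading("N", {"N", "E"}))
```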
4. I drew a step by step diagram of how I would draw the Sierpinski Triangles. I first tried only to prevent redrawing over lines. After I was able to do so, I then tried to minimize the amount of time the robot spends turning. I found a way to draw the triangles in an efficient manner, but my robot may have problems with making precise enough turns. The program will not use any sensors.
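One known way to draw a Sierpinski triangle without retracing any line is the Sierpinski arrowhead curve, an L-system with 60-degree turns in which every edge is drawn exactly once. My actual diagram may differ, but the sketch shows the idea ("F" = draw forward, "+"/"-" = turn 60 degrees):

```python
def arrowhead(order):
    """Sierpinski arrowhead curve commands. Rules: A -> B-A-B, B -> A+B+A,
    with every symbol drawing forward; no edge is ever drawn twice."""
    a, b = "F", "F"
    for _ in range(order):
        a, b = b + "-" + a + "-" + b, a + "+" + b + "+" + a
    return a

print(arrowhead(2))
```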
The main function is a loop that lets the user input a string to pick which part of the code they want to use. It will also use the battery sensor to check whether the robot has enough battery left.
Today I made a playlist program that uses arrays to allow the user to make a playlist of songs or drawings. Then it prints out that list, and then executes the actions. Here is a link to my code.
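The shape of that playlist program, with hypothetical entry names (each name would map to a real song or drawing routine on the robot):

```python
def run_playlist(requests):
    """Build a playlist from user requests, print it, then perform each
    entry in order; unknown names are skipped."""
    actions = {"fight song": lambda: "beeped the fight song",
               "square":     lambda: "drew a square"}
    playlist = [name for name in requests if name in actions]
    print("Playlist:", playlist)
    return [actions[name]() for name in playlist]

print(run_playlist(["fight song", "square", "macarena"]))
```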
Homework 3: Urban Search and Rescue
My group, consisting of Carmen Tan, Tim So, and Jeffrey Chau worked on the Urban Search and Rescue lab.
1. Each member will start at a corner of the grid and control their robot through the maze using a for loop to
2. The robot will be manually controlled through the maze using the camera.
3. The robot will analyze pictures that I think the lost Scribbler is in, scan them for red pixels, and then highlight the red pixels on the picture.
4. The team will develop a mapping system once we get a good look at the course and understand its size and shape. We will probably use a grid.
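The red-pixel pass in part 3 might look like this on a plain grid of (r, g, b) tuples. The threshold and the "red means strong R, weak G and B" rule are my guesses, and on the robot Myro's picture functions would replace the list indexing.

```python
def highlight_red(image, threshold=150):
    """Return a copy of `image` with red-looking pixels painted pure red."""
    out = []
    for row in image:
        new_row = []
        for r, g, b in row:
            if r > threshold and g < threshold and b < threshold:
                new_row.append((255, 0, 0))   # highlight the hit
            else:
                new_row.append((r, g, b))     # leave everything else alone
        out.append(new_row)
    return out

pic = [[(200, 40, 40), (20, 20, 20)]]
print(highlight_red(pic))
```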
My code can be found here.