Tuesday, 18 January 2011

Final project


All terrain search vehicle




Date: 14-January 2011
Duration of activity: Project week 1-3, documented on the blog
Group members:
Brian Hansen 20097765
Michael Kaalund 20097660

Goal:
The project idea is described in http://teamiha.blogspot.com/2010/12/final-project-ideas.html where the different project ideas are stated.
The project description defines an all terrain vehicle that can find the location of a certain item by using the built-in navigator in leJOS.
The goals from the project description:
- Robot that detects a red brick tower and display the location on computer
- Avoid objects and walls
- Predefined yellow tower locations help the robot make error correction on the location class
- Wall detection by its color and by sonic sensor
- Belt drive and gearing implementation research and practical usage
From the beginning of the project it was expected that the belt drive would cause problems for the SimpleNavigator, because the math behind the navigation does not match how a belt behaves.
Through tests, this project will show some of these problems and how the issue can be handled by knowing the location of certain objects.
figure 1: Belt driven robot with sensor platform in front
The sub goals and the expected steps needed were:
- Coordinate system for belt drive
- Bluetooth debug channel
- RGB color sensor test
- Sonic sensor sweeping system
- Search algorithm
- Uplink and display red tower location
- Coordinate correction by known yellow tower locations

Plan
Construction of device:
Starting from the device described in figure 1, several experiments with the belt drive were carried out, ending with the setup in figure 2.
figure 2: version 1 of the robot, important observations (sensors, gear)
The sweeping platform in the first construction was mounted as in figure 2.
A few changes turned out to be necessary during development, as the planned tests showed. A complete description of the setup is given in the results section. Among the changes are a new sensor platform and a new NXT main device location for better weight distribution.
RGB test, color input and distance test
The purpose of the RGB sensor test is to figure out what output the driver gives. Furthermore, we need to detect a tower at a distance, so a distance test of the RGB sensor is also needed.
Debug connection to computer
During the course we have had issues with how to debug the code when testing. A solution is provided by the leJOS system: a debug console that uses Bluetooth. This system is set up in our programs and can be turned on and off.
Menu system on NXT
We wanted control of the debug channel (turning it on and off) and the option of configuring the different systems, such as the sweeping platform, whose direction is calibrated through the menu system. The leJOS class TextMenu is used.
figure 3: the intended menu system and sensor platform calibration
Navigator test
The goal of the SimpleNavigator test is to find a best fit for the wheel size and track width, which are not expected to match the measured values exactly because of the belt drive. A best fit is found by two tests:
Distance covered in a straight line: gives a fit for the wheel size
Rotation angle: determines the track width
figure 4: coordinate test
Combined: The last test is a combination where the device drives in a square, figure 4. This test is also used in the presentation to show the problem on a small scale.
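The two fixes can be computed directly from the test runs. A minimal sketch of the calculation (the method names and example numbers are ours, not measured values):

```java
// Sketch: deriving corrected SimpleNavigator parameters from the two
// calibration runs (names and numbers are illustrative, not measured values).
public class NavigatorCalibration {

    // Straight-line test: if the robot was told to drive 'commanded' cm but
    // actually covered 'measured' cm, scale the wheel diameter accordingly.
    static double correctedWheelDiameter(double wheelDiameter,
                                         double commanded, double measured) {
        return wheelDiameter * measured / commanded;
    }

    // Rotation test: if a commanded turn of 'commandedDeg' produced an actual
    // turn of 'measuredDeg', scale the track width the opposite way.
    static double correctedTrackWidth(double trackWidth,
                                      double commandedDeg, double measuredDeg) {
        return trackWidth * commandedDeg / measuredDeg;
    }

    public static void main(String[] args) {
        // Example: commanded 100 cm, drove 95 cm with an assumed 5.6 cm wheel.
        System.out.println(correctedWheelDiameter(5.6, 100.0, 95.0));
        // Example: commanded 360 deg, turned 330 deg with 12 cm track width.
        System.out.println(correctedTrackWidth(12.0, 360.0, 330.0));
    }
}
```

The corrected values are then handed to SimpleNavigator in place of the physically measured wheel diameter and track width.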
Main goal solution idea
To solve the main goal (detection of the red tower), the sensors sweep from side to side. The idea is taken from the project ideas [RADAR] and [Explorer]. Thanks to the sweeping we only need one sensor (the sonic sensor) to detect obstacles and the red tower, and the RGB sensor to determine whether it is the tower or something else.
figure 5: coordinate calculation based on direction and rotor angle.
Figure 5 describes the implemented sweeping and the math needed to determine the coordinate of a detected object. To determine it, you need the coordinate and direction of the NXT (which give the rotor coordinate), the rotation angle of the sweep, and the distance from the rotor to the object; from these the X,Y coordinate of the detected object can be estimated. This system was not implemented or tested.
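Although the system was never implemented, the math from figure 5 can be sketched as follows (all names and the 12 cm rotor offset are illustrative assumptions):

```java
// Sketch of the coordinate math from figure 5 (never implemented on the NXT;
// all names and the 12 cm rotor offset are illustrative assumptions).
public class SweepMath {

    // Robot pose (x, y, headingDeg), rotor mounted rotorOffset cm in front,
    // sweep angle sweepDeg relative to the heading, object at 'distance' cm.
    static double[] objectCoordinate(double x, double y, double headingDeg,
                                     double rotorOffset,
                                     double sweepDeg, double distance) {
        double heading = Math.toRadians(headingDeg);
        double rotorX = x + rotorOffset * Math.cos(heading);
        double rotorY = y + rotorOffset * Math.sin(heading);
        double beam = Math.toRadians(headingDeg + sweepDeg);
        return new double[] { rotorX + distance * Math.cos(beam),
                              rotorY + distance * Math.sin(beam) };
    }

    public static void main(String[] args) {
        // Robot at the origin heading along x, rotor 12 cm ahead,
        // object 30 cm away at +90 degrees (straight to the left).
        double[] p = objectCoordinate(0, 0, 0, 12, 90, 30);
        System.out.println(p[0] + ", " + p[1]); // roughly (12, 30)
    }
}
```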
To find the towers the device needs to know the size of the environment to search in. In the project description we have defined the test environment to be a square table with a raised edge we can detect by the sonic sensor.
The NXT should run 4 modes, where the first mode estimates the outline and from there gives four coordinates (one for each corner). The 4 modes are described in figure 6.
Figure 6: the 4 modes for determining outline and search
Result
Final construction
From the project description we have a setup that needs 3 motors and 2 sensors. The setup used during the project is shown in figure 7.
Figure 7: hardware setup
The first requirement for the construction is the belt drive. During the start of the project, different types of belt drive setup were discussed and constructed; the chosen design is shown in figure 8. The actual drive consists of a big gear wheel mounted on the motor and a smaller one mounted on the wheel that drives the belt. This construction was needed to give better ground clearance and resulted in a vehicle that can drive fast. The gear ratio is 40/12.
Figure 8: belt drive with extra support
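Since SimpleNavigator only knows about a wheel diameter, the fixed 40/12 gearing can be folded into that one parameter. A small sketch (the 3.0 cm drive-wheel diameter is an assumed value):

```java
// Sketch: folding the 40/12 gearing into the wheel diameter that the
// navigator is given (the 3.0 cm drive-wheel diameter is an assumed value).
public class GearedWheel {
    static double effectiveWheelDiameter(double wheelDiameter,
                                         int motorGearTeeth, int wheelGearTeeth) {
        // One motor rotation turns the wheel motorGearTeeth/wheelGearTeeth times.
        return wheelDiameter * motorGearTeeth / wheelGearTeeth;
    }

    public static void main(String[] args) {
        System.out.println(effectiveWheelDiameter(3.0, 40, 12)); // 10.0
    }
}
```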
The 2 extra wheels are only for support, especially the middle one, which also improved the rotation accuracy somewhat. The rod connecting the 3 wheels strengthens the design, because the belt will try to squeeze the 3 wheels together, resulting in bent rods. The rods shown in figure 9 go all the way through to make sure that the weight of the NXT is supported; otherwise the 2 belt drives would be spread further apart.

figure 9: underside of robot, smaller gear wheel visible.
While implementing and testing the sensors, we encountered a problem with the sweeping functionality. When the platform is in the 90 degree position, i.e. looking to the side, the RGB sensor in the original design would always report black because it could not reach far enough out.

figure 10: top of the sweep platform, 90 degree angle.
By adding an extra brick to the sweeping arm, shown in figures 10 and 11, the RGB sensor can now reach out to the side of the device, and given its maximum reading length of 4 cm it can now detect objects.

figure 11: RGB and Sonic sensor mounted with the extra brick
Later in the project we encountered a problem when the device started and, especially, stopped. When stopping, the robot would tip back and forth, resulting in a small change in direction angle. The problem seemed to be that having the sweeping motor so far in front gives a bad weight distribution. It was not possible to move the motor back, so instead the NXT brick was moved further back to balance the robot. This is shown in figure 12.

figure 12: support system for the NXT to move weight back

The final result is shown in figure 13. The claws mounted on the side are only for show. An extra note for anyone reconstructing our device: make a better solution for the sensor cables. Because of their stiffness, the cables tend to act as springs, making the sweeping platform imprecise.
figure 13: final result that was tested
RGB and Sonic sensor test

The RGB sensor we used is from the NXT 2.0 series, and one of the differences between this sensor and the other Lego sensors is that it uses the CRC that the class SensorPort generates. This gave problems with the leJOS system (version 0.85 beta): sometimes it would put garbage on the display and not read from the sensor. After some searching on the leJOS forum, we found out that we were not the only ones who had problems with this RGB sensor; in one of the replies a user said that the problem could be the CRC in the SensorPort.java file. We then looked at the SVN version of leJOS and found that a change had been made to the CRC.
figure 14: Picture shows the difference between 0.85 beta and the SVN version
We implemented this change in our version, recompiled Classes.jar and our RGB sensor test, and this fixed the problem.
Our distance test of the RGB sensor yielded a minimum reading distance of about 0.5 mm from the sensor to the object.
The maximum object distance that gives a correct color measurement was measured to be 40 mm; anything further away and the RGB sensor returns the value black.
The sonic sensor's performance needs to be known because it is mounted close to the floor, which could limit the maximum distance.
The leJOS documentation for the sonic sensor says the maximum range is about 170 cm; our measurements with our setup yielded:
figure 15: Sonic sensor measurements
When there is no object in front of the sensor, it reads 24. We would expect a maximum readout of 255 with no object in front, but, as expected given its height above the floor, the ultrasonic sensor picks up reflections from the ground.
Sweeping platform

The idea with the sweeping platform is to have it take continuous distance readings of the environment in front of and to the sides of the NXT. When it detects something closer than a desired distance, it should take an RGB reading. The RGB reading determines whether it is the wall or a brick tower.
Because of the ultrasonic sensor, the sweep platform needs to be stationary while it takes a reading, so we had to determine how many degrees to rotate per reading. After a discussion it was decided that 10 degrees was plenty, which gives 18 readings for the 180 degree sweep.

figure 16: sweeping rotation output
Figure 16 describes the rotation degree reading used and returned from the motor class. As described in the figure, the sweep goes +-90 degrees.
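The stop-and-measure sweep can be sketched as a simple list of stop angles (the comment marks where the motor and sensor calls would go on the NXT):

```java
// Sketch of the stop-and-measure sweep: 10 degree steps over +-90 degrees.
// On the NXT, each stop angle would be followed by a sonic (and possibly
// RGB) reading while the platform stands still.
import java.util.ArrayList;
import java.util.List;

public class SweepPlan {
    static List<Integer> sweepAngles(int limitDeg, int stepDeg) {
        List<Integer> angles = new ArrayList<>();
        for (int a = -limitDeg; a <= limitDeg; a += stepDeg) {
            angles.add(a); // motor.rotateTo(a); then take a reading here
        }
        return angles;
    }

    public static void main(String[] args) {
        List<Integer> angles = sweepAngles(90, 10);
        // 18 ten-degree steps, i.e. 19 stop positions including both ends.
        System.out.println(angles.size());
    }
}
```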
On system startup the tacho counter will be at zero, even if the platform is not actually centered. We cannot just turn the platform manually, because the tacho counter will not register it. Therefore the menu lets us align the sweep platform; when it is aligned, the tacho counter is reset to 0.
In one of the first attempts at an alignment function we used rotateTo. This gave problems because the tacho counter thinks it is in a certain position. The alignment function therefore uses rotate instead: as the names indicate, rotate turns a relative number of degrees, while rotateTo turns to an absolute angle. During development of the menu correction function we discovered that rotate has a minimum turn; our trials showed that it needs an angle of at least 5 degrees.
For the actual sweeping function, rotateTo has to be used because of accumulating error. If rotate is used, the platform will not always return to our defined 0 degree position; tests showed a misalignment of more than 10 degrees, increasing over time. rotateTo has error correction implemented that works nearly perfectly.
Outline, no correction

When the NXT starts it does not know how big the environment is, so the idea is to have it trace the outline of the environment as seen in figure 6. When it comes to the first corner it will define this as corner number one, setting the coordinate to 0,0. Afterwards it will turn in the direction with no obstructions and drive until it gets to the next corner. The reason it looks in both directions is that the NXT does not know which way around the track it is going, so it needs to decide which way to turn.


Figure 17: straight moving outline detection with debug mode on
Here we assumed that the robot would drive straight when told to drive forward, but it did not. This could indicate a problem in the leJOS TachoPilot, in SimpleNavigator, or in the hardware; further investigation is needed to determine where the problem lies. We tried to correct the course by implementing a PID regulator that uses the outline of the table.
PD regulation
Our PID regulator works on two coordinates, X and Y. If we want to move in the X direction while keeping a certain distance to the wall, it regulates the Y coordinate; the two coordinates are then fed into SimpleNavigator.goTo, which takes an X and a Y coordinate. After a turn, when we want to go in the Y direction, it regulates the X coordinate instead. This may seem like a complex way to steer the device, but it is necessary because SimpleNavigator is the class that has control of the motor class.
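A minimal sketch of this regulation step (the gains, the sign convention with the wall at lower Y, and the 10 cm waypoint spacing are illustrative; the actual implementation is shown in figure 18):

```java
// Sketch of the wall-following PD idea: correct the Y coordinate of the
// next waypoint while driving along X. Gains and the sign convention
// (wall at lower Y) are illustrative assumptions, not our tuned values.
public class WallFollowPD {
    static final double KP = 0.8; // assumed proportional gain
    static final double KD = 0.3; // assumed derivative gain
    private double lastError = 0;

    // Keep 'wallDistance' cm to the wall: returns the corrected Y for the
    // next 10 cm waypoint, which would be fed to SimpleNavigator.goTo.
    double nextY(double currentY, double measuredDistance, double wallDistance) {
        double error = wallDistance - measuredDistance;
        double correction = KP * error + KD * (error - lastError);
        lastError = error;
        return currentY + correction;
        // on the NXT: navigator.goTo(currentX + 10, nextY)
    }

    public static void main(String[] args) {
        WallFollowPD pd = new WallFollowPD();
        // Too close to the wall (measured 12 cm, want 15 cm): steer away.
        System.out.println(pd.nextY(0.0, 12.0, 15.0));
    }
}
```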

figure 18: the implemented PID code
During development of the PID regulator we discovered a new problem. The device has to stop while taking a reading towards the wall, and when it stops, the back of the NXT skids out. This was quite a serious problem, as it put the NXT further off course and made it difficult to get the PID regulator to regulate correctly. We were told that shifting the weight to the back might help, which led to the construction change described in "Results: final construction". We eventually found that lowering the speed of the NXT helped, but not enough, and it also resulted in a very slowly moving robot.
Figure 19 movie 2: wall following PID regulation
Presentation
The final presentation showed the problem of SimpleNavigator together with the belt drive. The first result was a nearly perfect 20 by 20 square. From this test program the error facing the next step was not obvious, because over short distances the system seems to work perfectly.
The next presentation was the test we used to determine the distance detection of the RGB sensor and its color output. During the project, the distance tests resulted in construction changes to make sure the RGB sensor could reach the object when the sensor platform was in the 90 degree position. With this presentation there was also a presentation of the menu system and the debug channel.
The final test was to present the SimpleNavigator issue on a large scale and to show the progress so far on the main goals. The program should follow the table outline to define the corner coordinates. The first test showed a large error because of a motor issue: one of the motors did not put out enough power, and the robot therefore could not drive straight. The next step was the idea of using a PID regulator to make sure we kept the same distance to the side wall. The issue here is again the difference between the motors; it seems like one of the motors is broken.
Finally we had a discussion of the research into using SimpleNavigator for belt drive.
Future work
Further research into belt drive is needed to calculate the correct angle of rotation. The problem lies in the friction of long tracks: if the tracks have a lot of contact with the ground because of the length of the vehicle, there will be more friction than with a wheel. The math needs to take the actual track length into account. Some error will remain, because the friction also depends on the floor type, which changes the friction constant.
In the project, the search drive and the outline detection did not work as planned. With the outline and search we had hoped to make course correction calculations to minimize the error. The solution could work in theory, but some improvement to SimpleNavigator is needed to reach the necessary positions, which was not possible.
Because we could not detect the outline, we could not get the object detection and sweep tested and implemented in this project. Especially the calculation of tower positions has high priority in future work, because it is the basis of the course correction.
When the different systems work as planned, the code should be optimized using threads for more fluent sweeping and searching, because the stop, search and drive cycle takes too long.
Conclusion

We did not achieve the main goal of the project, which was detection of the red tower and presenting its position. We did, however, conclude from the tests that the SimpleNavigator in leJOS 0.85 beta cannot be used for belt drive with the precision needed for this project.
During the project several new items appeared, documented in this blog. The menu feature was not mentioned in the early phases of the project, but we found it necessary. The implemented debug channel was used extensively, and our hope is that future projects will use it earlier in the lab work.
We looked at fixed gearing in the project mostly because of the construction, which got more ground clearance with the gears mounted. Fixed gearing was easy to implement in the code by using it in the wheel size calculation.
The bad result at the end of the project, shown in the wall following video (figure 19), could be because one of the motors seemed to be broken. If you analyze one of the videos closely, you will see that one of the motors is totally locked when it should have gone into reverse.
Overall the project has given insight into the use of sensors and motors in the real world. There is still a lot of work to be done before we can call this project completed.
References
The developed code can be found here:
http://code.google.com/p/teamiha/source/browse/trunk

Project Week 1

Project Week 2

Project Week 3

Radar, NXT project
http://www.nxtprograms.com/radar/index.html

Explorer, NXT project
http://www.nxtprograms.com/NXT2/explorer/index.html

RGB CRC solution
http://lejos.sourceforge.net/forum/viewtopic.php?t=2165&highlight=colorsensor

Text menu class
http://lejos.sourceforge.net/nxt/nxj/api/lejos/util/TextMenu.html

Debug Console
http://lejosnxt.blogspot.com/2009/01/powerful-rconsole-class.html

PID regulator

Saturday, 15 January 2011

Project week 3

Saturday, Sunday, Monday, Wednesday (08/01, 09/01, 10/01, 12/01)
From 10 o’clock until 17:00
Participants: Michael Kaalund and Brian Hansen

Schedule of the week
- Get the robot to define outline
- Search drive with and without RGB sensing
- Find at least one tower.
- Send coordinates for tower
- Construct behavior diagram

Progress (notes)
- Constructed flowchart for the different states
- We had to implement a wall following function because the direction of the NXT was very imprecise
- The simple wall following was not enough, so a PD regulator was implemented
- The system is designed to get a faraway destination coordinate and, by knowing its own coordinate, define all the necessary coordinates on the route with a grid of 10 cm.
- For every 10 cm it corrects its coordinate drive if the wall is near, and it makes a search sweep for the red tower.
- Debug PC connection used for wall distance measurements and applied coordinate corrections
- Tacho precision for the sensor platform is very imprecise and needs to be manually calibrated for every try. The problem is that the platform does not return to the original position when a sweep is performed. The motor control does not register the error; it is only seen visually.
- Encountered a PD error: after some time the PD results go berserk. The only solution for now is to average 10 sonic readings at every wall correction. This is time consuming!
- Problem: if the start position is too far from the wall, it will turn e.g. 45 degrees, resulting in a collision with the wall.
- Still problems, but now the device turns 45 degrees away from the wall when it gets too close, and it works perfectly when too far away.
- Trying to solve problems by inserting a statement that splits the 10 cm forward drive into two steps of 5 cm if the calculated PD error is large.
- The sweep platform issue seems to be fixed by using rotateTo instead of rotate. rotateTo had been used before but was buggy, meaning it sometimes turned in the wrong direction.
- Because we use the leJOS goTo we have no direct control of the motor power; that might be why the motor connected to port C behaves badly, by not stopping at the same time as A or not giving the same amount of power.
- The idea of moving the weight back to stop the jumping when the robot stops does not seem to work. Changing to motor.flt does not help with the problem either.
- One idea about the start-value distance calculation was that the rotation of the platform had not completed before we calculated the distance. We therefore tried to use the function isRotating to make sure the rotation has completed before the distance measurement starts. Even though it is documented, it is not implemented, so we implemented it on our own. The result was that this was not the source of the error.
- After some trial and error the wall following is improved by decreasing driving speed and turning speed.
- Implemented a function that decides which way to turn when the robot gets to a corner. If both left and right are blocked it will stop and write game over to the console.
- Problem at corners with rotation speed/power: the robot cannot turn enough, and the garbage collector seems to run often, which might be because of the constant Bluetooth connection.
- Tower detection does not work, and there is no search drive, only the wall follower and corner detection.
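The averaging fix mentioned in the notes is simple but costly in time, since the platform must stand still for all 10 readings; a sketch (the readings are illustrative):

```java
// Sketch of the averaging fix from the notes: smooth the noisy sonic
// readings before they reach the PD regulator (readings are illustrative).
public class SonicAverage {
    static double average(int[] readings) {
        int sum = 0;
        for (int r : readings) sum += r;
        return (double) sum / readings.length;
    }

    public static void main(String[] args) {
        // One outlier among ten readings no longer dominates the distance.
        int[] readings = {13, 13, 14, 13, 13, 24, 13, 14, 13, 13};
        System.out.println(average(readings));
    }
}
```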

Next time 13/01
- Make presentation program for the sonic/RGB distance sensing with debug mode
- Make presentation program for tacho drive to show coordinate errors
- Get the corner detection and drive to work.
- Presentation at 15:00

Saturday, 8 January 2011

Project week 2

Wednesday and Thursday (08/12 and 09/12)
From 9 o’clock until 16:00
Participants: Michael Kaalund and Brian Hansen

Schedule of the week
-Get RGB readings and calibration
-RGB test, measuring color over distance
-Get sonic readings and calibration
-Distance testing: can we detect Lego color towers (a bunch of matching bricks)?
-Construct a simple environment for testing


Progress (notes)
We did ...
RGB sensor:

We had some problems with the RGB sensor in our leJOS: we couldn't get it to read from the sensor because of some bugs in "Classes.jar". At first we found some chatter saying you need to recompile the "Classes.jar" package and reflash the NXT (source: http://lejos.sourceforge.net/forum/viewtopic.php?t=2165&highlight=colorsensor).
After some additional searching we found another thread in the leJOS forum where they had found the problem.
The problem is that the new RGB sensor actually uses the checksum. The error is in "SensorPort.java", and as the checksum isn't really used by the other sensors, it hadn't been found before. The lines that create the checksum are lines 452 to 454 (see the picture).

After we recompiled "Classes.jar" and our test program, we ran some tests of the RGB sensor, where we discovered that we couldn't measure further than a couple of centimeters away. For our robot this means that it either has to turn (the whole robot) around and drive to the object, or we have to make some construction changes so that the sensor arm is extended and the sensors reach to the sides. Update: We made a quick measurement of the RGB sensor; the minimum distance to an object is 0.5 mm and the maximum distance is 40 mm.

UltraSonic:
We did a test of the ultrasonic sensor; the distances are measured from the sensor.
Real distance : Measurement from ultrasonic sensor.
--------------------------------------------------------------
1 cm : 4
2 cm : 5
3 cm : 5
4 cm : 6
5 cm : 7
6 cm : 8
7 cm : 9
8 cm : 10
9 cm : 12
10 cm : 13
11 cm : 13
12 cm : 14
13 cm : 15
14 cm : 16

It gives 24 when there is no object in front of it. As can be seen from our measurements, there are instances where two different real distances give the same reading.
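A least-squares fit over the table suggests the reading grows roughly one-to-one with the real distance plus an offset of about 2.5 cm; a sketch of the fit (this calibration was not part of our NXT code):

```java
// Sketch: fitting reading = a * real + b to the table above by least
// squares, so a raw reading can be converted back to a real distance.
public class SonicFit {
    static double[] fit(double[] real, double[] reading) {
        int n = real.length;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += real[i]; sy += reading[i];
            sxx += real[i] * real[i]; sxy += real[i] * reading[i];
        }
        double a = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        double b = (sy - a * sx) / n;
        return new double[] { a, b };
    }

    public static void main(String[] args) {
        double[] real    = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14};
        double[] reading = {4, 5, 5, 6, 7, 8, 9, 10, 12, 13, 13, 14, 15, 16};
        double[] ab = fit(real, reading);
        // The reading grows roughly one-to-one with an offset of ~2.5 cm.
        System.out.println("a=" + ab[0] + " b=" + ab[1]);
    }
}
```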

Environment:
The area is 237 x 104.5 cm in its dimensions; furthermore, the environment is defined to have no obstructions within 30 cm of the edge.

Behavior:
This is the way we imagine the robot accomplishing the main tasks.

We have decided that, to begin with, there are two "modes".
The first mode it must run is "define outline", where it finds the edge around the track. While driving, it swings the sensor to find the edge; if it finds a red or a yellow pad it will just "remember" where it is and find its way back there after it has driven all the way around.
The second mode starts if it has not found the red or the yellow brick in its "define outline" mode: the "search" mode, where the goal is to find the red block and then send its coordinates to the computer.
When running in search mode it will still swing the sensor back and forth in a circular motion, to identify possible obstacles and the red and yellow bricks.
For this "search" mode there will be various search methods, described below.

Next week
- Get the robot to define outline
- Search drive with and without RGB sensing
- Find at least one tower.
- Send coordinates for tower

Wednesday, 8 December 2010

Project week 1

Wednesday and Thursday (01/12 and 02/12)
From 9 o’clock until 16:00
Participants: Michael Kaalund and Brian Hansen

Schedule of the week
-Begin testing the belt drive to get it working with the built-in coordinate system.
-Build rotational sensor platform
-Build some code for the sensor platform rotation
-Make debug Bluetooth connection between NXT and computer
-Construct menu system for calibration and debug purposes
figure: The robot at the end of the day

Progress (notes)
We did some tests with the SimpleNavigator to get an x,y coordinate system.
We saw that the system didn't turn 90 degrees on most occasions, and we believe there isn't really anything we can do to make up for it... but we can use it as a rough coordinate.
By strengthening the tracks we saw some improvements in the navigation.

figure: The beltdrive with gearing and an external beam for strength.

A platform was constructed in the front so the vehicle can turn the sonic and RGB sensors +-90 degrees.
But there is an issue with the turning of the platform: if the rotation exceeds a certain point, the motor won't run in the opposite direction.
figure: The sensor platform with RGB sensor on top


Bluetooth
We wanted a debugging channel so we could see if something goes wrong, or check that the program is in the right place. At first we tried to use the Bluetooth class in the API, but that is more for communication between NXTs.
After some searches on Google and the leJOS forum, we found that there was a class already implemented in the leJOS API.
This class is called "RConsole", and we can use "nxjconsole" to get the output from the NXT. We won't go into how to use the RConsole class, but we will link to the blog post that we used to get it working. [url]

Menu system
We found that there was a need for some calibration of the motor that holds the RGB sensor and the sonic sensor.
The reason is that we want the sensors to point dead ahead at the start of the program, so it will be easier to implement a sweeping motion.
From past experience we know that a sensor like the RGB probably needs some start-up calibration; furthermore, it would be a nice feature to turn the debug channel on and off.
So after some digging in the leJOS API, we found a class that makes it easy to implement a menu system.
We used the "TextMenu" class. The way you use it is to first make a String array with your menu items.
The class will display the menu and handle all the button calls.
To get the selected menu item, we use the "select()" function, which returns the index in the array.
Then we use a "switch" to go into the submenus. [1]
Here is a little example:
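A sketch of the pattern described above (the menu items are examples; select() only runs on the NXT, so the choice is simulated here):

```java
// Sketch of the TextMenu pattern: a String array of items, select() for
// the chosen index, then a switch for dispatch. The leJOS calls only work
// on the NXT, so they are commented out and a choice is simulated.
public class MenuSketch {
    static String handle(int selection) {
        switch (selection) {
            case 0:  return "debug toggled";
            case 1:  return "platform calibration";
            default: return "exit";
        }
    }

    public static void main(String[] args) {
        String[] items = {"Toggle debug", "Calibrate sweep", "Exit"};
        // lejos.util.TextMenu menu = new lejos.util.TextMenu(items, 1, "Main menu");
        // int selection = menu.select();
        int selection = 1; // simulated button choice for this sketch
        System.out.println(handle(selection));
    }
}
```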

Next week
-Get RGB readings and calibration
-RGB test, measuring color over distance
-Get sonic readings and calibration
-Distance testing: can we detect Lego color towers (a bunch of matching bricks)?
-Construct a simple environment for testing
-Construct behavior diagram

Wednesday, 1 December 2010

Final project ideas

0. Navigate and find the red tower (the chosen one)
The Goal

- Robot that detects a red brick tower and display the location on computer
- Avoid objects and walls
- Predefined yellow tower locations help the robot make error correction on the location class
- Wall detection by its color and by sonic sensor
- Belt drive and gearing implementation research and practical usage

The Robot
This project includes 1 robot, an all terrain search vehicle.


The robot uses belt drive and is geared to make it go fast. In front, a sensor platform is mounted that can sweep +-90 degrees from center. The ultrasonic sensor makes sure it keeps a certain distance from objects, and the RGB sensor detects the different colors; the color of the wall is predefined in the robot. A search algorithm determines the direction of the robot, and by using the built-in coordinate system in leJOS we know where the robot is, so it can make course corrections. The robot uses Bluetooth to communicate with the computer for debug purposes and location uplink.

Evaluation
The project involves a lot of existing lab work and some new tests for the belt drive and the new RGB sensor. By spending some of the time on a computer connection, we will have a great debugging tool for future development. By making this search robot we will have a great platform for future development of search robots in hazardous areas.

The steps needed
- Coordinate system for belt drive
- Bluetooth debug channel
- RGB color sensor test
- Sonic sensor sweeping system
- Search algorithm
- Uplink and display red tower location
- Coordinate correction by known yellow tower locations

1. The Hunter and the poor NXT
The Goal
- Robot that can detect another robot and “kill” it
- Make some robot interaction, like an animal hunting another
- If possible be able to shoot something
The Robot
This project includes 2 robots, a hunter and the prey.
The hunter detects the prey by color diodes, and when the prey is "killed" the diode color is changed so the hunter knows the prey is dead, and a victory signal is given.
The prey follows a predefined area and uses a coordinate-system drive to maneuver around. The hunter does not know the environment and has to use all of its sensors to adapt.
The hunter can be fitted with some kind of shooting mechanism, of which there are many designs on the net. To use a shooting mechanism we have to be able to calculate the distance to an object and be sure it is the right object.
Evaluation
Robot interaction, with a lot of the old lab exercises to build from
By using belt drive instead of regular wheels we also have an opportunity to extend the original motor and coordinate classes to be able to handle the belt drive.
The steps needed
- Coordinate system for belt drive
- Hunter tracker, track different light sources and detect color.
- Bump sensor on both robots to register a kill
- Shooting mechanism.
- Prey avoidance: detect the hunter and speed up like a real animal
2. Find my coffee
The Goal
- A robot that from a start position drives out to find a heat source
- Bring the coffee cup back to start position or give a sound indicating it has achieved the goal.
- Avoid objects by scanning the area, like a radar
The Robot
The robot consists of a belt driven robot; the belt is for stability to handle the weight.
To achieve the main goal the robot has a temperature sensor mounted on top to seek out the heat source. When the heat source is found, the robot maneuvers around it so it can lift the coffee cup, like a fork lift. The area it should search in is filled with obstacles that it has to maneuver around and detect with a sonic sensor radar.
Evaluation
The robot can be seen as a version of existing robots used in hazardous areas to find exposed pipes underground. This robot should do the same tasks but without remote control, and therefore there is a massive amount of robot-environment interaction.
By using belt drive instead of regular wheels we also have an opportunity to extend the original motor and coordinate classes to handle the belt drive.
Altogether, the robot implements a few of the existing lab exercises and contains new areas that need to be explored.
The steps needed
- Coordinate system for belt drive
- Find Heat source
- Victory sound
- Avoid objects
- Lifting coffee cup
- Bring back coffee
3. Walk on stairs (up and down)
The Goal
- A robot that has the ability to go nearly anywhere.
- The main goal is to walk up and down stairs.
- Detect the stairs and measure whether the height can be managed.
The Robot
A search on the need for robots that can climb stairs shows 2 obvious models.
The first model is a robot with legs. A massive number of motors is needed for the task, and some of them seem to be hardcoded and do not know about the surroundings.
Another model is a belt-driven vehicle that uses levers to lift and push the vehicle upstairs. The robot uses a tacho counter to know how much it has lifted itself and a sensor to measure whether the obstacle is too high.
Evaluation
The project is a spin-off of some of the problems people in wheelchairs have when they cannot get upstairs because there is no ramp. This project could therefore help solve a social problem.
The negative aspect of this project is that it is more a mechanical project than a robot interaction project. Therefore only a few of the previous lab exercises can be implemented.