Tuesday, January 18, 2011

Final project


All terrain search vehicle




Date: 14 January 2011
Duration of activity: Project weeks 1-3, documented on the blog
Group members:
Brian Hansen 20097765
Michael Kaalund 20097660

Goal:
The project idea is described in http://teamiha.blogspot.com/2010/12/final-project-ideas.html where the different project ideas are stated.
The project description defines an all-terrain vehicle that can find the location of a certain item by using the built-in navigator in leJos.
The goals from the project description:
- Robot that detects a red brick tower and display the location on computer
- Avoid objects and walls
- Predefined yellow tower locations help the robot make error correction on the location class
- Wall detection by its color and by sonic sensor
- Belt drive and gearing implementation research and practical usage
From the beginning of the project it was expected that the belt drive would cause problems for the SimpleNavigator, because the math behind the navigation differs based on how the belt behaves.
This project will, through tests, show some of these problems and how the issue can be handled by knowing the locations of certain objects.
figure 1: Belt driven robot with sensor platform in front
The sub-goals and the expected steps needed were:
- Coordinate system for belt drive
- Bluetooth debug channel
- RGB color sensor test
- Sonic sensor sweeping system
- Search algorithm
- Uplink and display red tower location
- Coordinate correction by known yellow tower locations

Plan
Construction of device:
Starting from the device described in figure 1, there were several experiments with the belt drive, which in the end resulted in the setup in figure 2.
figure 2: version 1 of the robot, important observations (sensors, gear)
The sweeping platform in the first construction was mounted as in figure 2.
A few changes were necessary during development, prompted by what the planned tests showed. A complete description of the setup is given in the results section. Some of the changes concern the sensor platform and the location of the NXT main device, moved for better weight distribution.
RGB test, color input and distance test
The purpose of a test of the RGB sensor is to figure out what output the driver gives. Furthermore, we need to detect a tower at a distance, so a distance test of the RGB sensor is also needed.
Debug connection to computer
During the course we have had issues with how to debug the code when testing. A solution is given by the leJos system: a debug console that uses Bluetooth. This system is set up in our programs and can be turned on and off.
Menu system on NXT
We wanted control of the debug channel, turning it on and off, and the option of configuring the different systems, such as the sweeping platform. The sweeping platform's direction is to be configured through the menu system. The leJos class TextMenu is used.
figure 3: the intended menu system and sensor platform calibration
Navigator test
The goal of the SimpleNavigator test is to find a best fit for the wheel size and track width, which are expected not to match the measured values exactly because of the belt drive. A best fit is found by two tests:
Distance covered in a straight line: gives a fix for the wheel size
Rotation angle: determines the track width
figure 4: coordinate test
Combined: The last test is a combination where the device drives in a square, figure 4. This test is also used in the presentation to show the problem in small scale.
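The two tests can be turned into correction factors for the SimpleNavigator parameters. The sketch below shows the idea; all numbers in it are hypothetical placeholders, not our measured values:

```java
// Sketch: deriving best-fit SimpleNavigator parameters from the two tests.
// All numbers below are hypothetical placeholders, not measured values.
public class NavigatorCalibration {

    // Straight-line test: if the robot was commanded to drive `commanded` cm
    // but actually covered `actual` cm, the effective wheel diameter scales
    // by actual/commanded (a larger real wheel covers more ground per rotation).
    static double fitWheelDiameter(double nominalDiameter, double commanded, double actual) {
        return nominalDiameter * actual / commanded;
    }

    // Rotation test: if a commanded in-place turn of `commandedDeg` produced
    // `actualDeg`, the effective track width scales by commandedDeg/actualDeg
    // (a wider track turns less for the same wheel travel).
    static double fitTrackWidth(double nominalTrack, double commandedDeg, double actualDeg) {
        return nominalTrack * commandedDeg / actualDeg;
    }

    public static void main(String[] args) {
        // Hypothetical example: commanded 100 cm but drove 95 cm;
        // commanded a 360 degree turn but only turned 330 degrees.
        double wheel = fitWheelDiameter(5.6, 100.0, 95.0);
        double track = fitTrackWidth(12.0, 360.0, 330.0);
        System.out.println("wheel = " + wheel + ", track = " + track);
    }
}
```

The fitted values would then be passed to the SimpleNavigator constructor in place of the measured ones.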
Main goal solution idea
To solve the main goal (detection of the red tower), the sensors sweep around. The idea is taken from the projects [RADAR] and [Explorer]. By means of the sweeping we only need one sensor (the sonic sensor) to detect obstacles and the red tower, plus the RGB sensor to determine whether it is the tower or something else.
figure 5: coordinate calculation based on direction and rotor angle.
Figure 5 describes the implemented sweeping and the math needed to determine the coordinate of a detected object. From the coordinate and direction of the NXT, the rotor coordinate can be determined; combined with the rotation angle and the distance from the rotor coordinate to the object, the X,Y coordinate of the detected object can be estimated. This system was not implemented or tested.
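Since the system was never implemented, the following is only a sketch of the geometry in figure 5 as we understand it; the rotor offset is a construction-dependent value that is assumed here, not measured:

```java
// Sketch of the coordinate estimation from figure 5 (never implemented on
// the robot): from the NXT pose, the rotor offset, the sweep angle and the
// measured distance, estimate the X,Y coordinate of the detected object.
public class ObjectLocator {

    // robotX/robotY: NXT coordinate; headingDeg: NXT direction;
    // rotorOffset: distance from the robot reference point to the rotor axis
    //              (construction-dependent, assumed known);
    // sweepDeg: platform angle relative to the heading (positive = left);
    // distance: sonic sensor reading from the rotor to the object.
    static double[] estimate(double robotX, double robotY, double headingDeg,
                             double rotorOffset, double sweepDeg, double distance) {
        double heading = Math.toRadians(headingDeg);
        // Rotor coordinate: offset along the robot's heading.
        double rotorX = robotX + rotorOffset * Math.cos(heading);
        double rotorY = robotY + rotorOffset * Math.sin(heading);
        // Object: `distance` along (heading + sweep), measured from the rotor.
        double beam = Math.toRadians(headingDeg + sweepDeg);
        return new double[] {
            rotorX + distance * Math.cos(beam),
            rotorY + distance * Math.sin(beam)
        };
    }

    public static void main(String[] args) {
        // Robot at the origin facing +X, rotor 5 cm ahead, platform turned
        // 90 degrees left, object 10 cm away: expected near (5, 10).
        double[] p = estimate(0, 0, 0, 5, 90, 10);
        System.out.println(p[0] + ", " + p[1]);
    }
}
```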
To find the towers the device needs to know the size of the environment to search in. In the project description we defined the test environment to be a square table with a raised edge that we can detect with the sonic sensor.
The NXT should run 4 modes, where the first mode estimates the outline and from there gives four coordinates (one per corner). The 4 modes are described in figure 6.
Figure 6: the 4 modes for determining outline and search
Result
Final construction
From the project description we have a setup that needs 3 motors and 2 sensors. Figure 7 describes the setup used during the project.
Figure 7: hardware setup
The first requirement for the construction is the belt drive. At the start of the project different types of belt drive setups were discussed and constructed, but the chosen design is shown in figure 8. The actual drive consists of a big gear wheel mounted on the motor and a smaller one mounted at the wheel that drives the belt. This construction was needed to give better ground clearance and resulted in a vehicle that could drive fast. The gear ratio is 40/12.
Figure 8: belt drive with extra support
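The 40/12 gearing affects how the navigator must be configured: the tacho counter measures motor rotations, but each motor rotation turns the drive wheel 40/12 times. A minimal sketch of folding the ratio into the wheel size, as mentioned in the conclusion (the physical diameter below is a placeholder, not a measured value):

```java
// Sketch: folding the fixed 40/12 gearing into the wheel size. The tacho
// counter counts motor rotations, but each motor rotation turns the drive
// wheel 40/12 times, so the navigator should be given an effective wheel
// diameter scaled by the gear ratio.
public class GearedWheel {
    static final double GEAR_RATIO = 40.0 / 12.0; // driver teeth / driven teeth

    static double effectiveDiameter(double physicalDiameter) {
        return physicalDiameter * GEAR_RATIO;
    }

    public static void main(String[] args) {
        // Hypothetical 3 cm wheel: the navigator sees it as a 10 cm wheel.
        System.out.println(effectiveDiameter(3.0));
    }
}
```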
The 2 extra wheels are only for support, especially the middle one, which was added for support purposes and improved the rotation accuracy somewhat. The rod connecting the 3 wheels strengthens the design, because the belt will try to squeeze the 3 wheels together, resulting in bent rods. The rods shown in figure 9 go all the way through to make sure that the weight of the NXT is supported; otherwise the two belt drives would be spread further apart.

figure 9: underside of robot, smaller gear wheel visible.
With the implementation and testing of the sensors we encountered a problem with the sweeping functionality. When the platform is in the 90 degree position, meaning it looks to the side, the RGB sensor in the original design would always report black because it could not reach far enough out.

figure 10: top of the sweep platform, 90 degree angle.
By adding an extra brick to the sweeping arm, shown in figures 10 and 11, the RGB sensor could now reach to the side of the device, and given the maximum RGB reading distance of 4 cm it could now detect objects.

figure 11: RGB and Sonic sensor mounted with the extra brick
Later in the project we encountered a problem when the device started and especially when it stopped. When stopping, the robot would tip back and forth, resulting in a small change in the direction angle. The design seemed to have a problem with the sweeping motor sitting so far in front, giving a bad weight distribution. It was not possible to move the motor back, so the NXT brick was instead moved further back to balance the robot. This is shown in figure 12.

figure 12: support system for the NXT to move weight back

The final result is shown in figure 13. The claws mounted on the side are only for show. An extra note for reconstruction of our device: make a better solution for the cables to the sensors. The cables have a tendency to act as springs, making the sweeping platform imprecise because of their stiffness.
figure 13: final result that was tested
RGB and Sonic sensor test

The RGB sensor we used is from the NXT 2.0 series, and one of the differences between this sensor and the other Lego sensors is that it uses the CRC that the class SensorPort generates. This gave problems with the leJos system (version 0.85 beta): sometimes it would put garbage on the display and not read from the sensor. After some searches on the leJos forum, we found out that we were not the only ones with problems with this RGB sensor; in one of the replies a user said the problem could be the CRC in the SensorPort.java file. Afterwards we looked at the SVN version of leJos and found that a change had been made to the CRC.
figure 14: Picture shows the difference between 8.5 beta and the SVN version
We implemented this change in our version, recompiled classes.jar, recompiled our RGB sensor test, and this fixed our problem.
Our distance test of the RGB sensor yielded that the minimum distance we could get a reading at was about 0.5 mm from the sensor to the object.
The maximum object distance that could give a correct color measurement was measured to 40 mm; anything further away, and the RGB sensor returns the value black.
The performance of the sonic sensor needs to be known, because it is mounted close to the floor, which could limit the maximum distance.
The documentation of the sonic sensor in leJos says the maximum range is about 170 cm, and the measurements with our setup yielded:
figure 15: Sonic sensor measurements
When there is no object in front of the sensor, it reads 24. We would expect a maximum readout of 255 with no object in front, but as expected, because of its height above the floor the ultrasonic sensor picks up reflections from the ground.
Sweeping platform

The idea with the sweeping platform is to have it take continuous distance readings of the environment in front of and beside the NXT. When it detects something closer than a desired distance, it should take an RGB reading. The RGB reading determines whether it is the wall or a brick tower.
Because of the ultrasonic sensor, the sweep platform needs to be stationary while taking a reading, so we need to determine how many degrees to rotate per reading. After a discussion it was determined that 10 degrees was plenty, which gives 18 readings for the 180 degree sweep.

figure 16: sweeping rotation output
Figure 16 describes the rotation degree readings used and returned from the motor class. The sweep, as described in the figure, goes ±90 degrees.
On system startup the tacho counter will be at zero, even if this does not match the real position. We cannot just turn the platform manually, because the tacho counter will not register it. Therefore the menu lets us align the swing platform; when aligned, the tacho counter is reset to 0.
In one of the first attempts at an alignment function, we used rotateTo. This gave problems because the tacho counter believes it is in a certain position. The alignment therefore uses rotate instead; as the names indicate, rotate turns by a number of degrees, while rotateTo rotates to an absolute angle. During development of the menu correction function it was discovered that rotate has a minimum turn, and our trials showed it needs a minimum angle of 5 degrees.
Because of this error, rotateTo has to be used in the actual sweeping function. If rotate is used, the platform will not always return to our defined 0 degree position; tests showed a misalignment of more than 10 degrees, increasing over time. rotateTo has some error correction implemented that works nearly perfectly.
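The drift we saw with rotate can be illustrated with a small simulation. The per-move error of -0.5 degrees below is an assumption for illustration, not a measured motor property; the point is only that relative moves accumulate the error while absolute targets do not:

```java
// Small simulation of why relative moves (rotate) accumulate alignment error
// while absolute moves (rotateTo) do not. The per-move error of -0.5 degrees
// is an assumption for illustration, not a measured motor property.
public class SweepDriftDemo {
    static final double ERROR_PER_MOVE = -0.5; // hypothetical undershoot per move

    // Relative mode: each sweep is +180 then -180 degrees; errors add up.
    static double relativeAfterSweeps(int sweeps) {
        double pos = 0;
        for (int i = 0; i < sweeps; i++) {
            pos += 180 + ERROR_PER_MOVE;  // rotate(+180)
            pos += -180 + ERROR_PER_MOVE; // rotate(-180)
        }
        return pos; // drift from the defined 0-degree position
    }

    // Absolute mode: the controller corrects toward the target each move,
    // so the error never exceeds one move's worth.
    static double absoluteAfterSweeps(int sweeps) {
        double pos = 0;
        for (int i = 0; i < sweeps; i++) {
            pos = 90 + ERROR_PER_MOVE;  // rotateTo(90)
            pos = -90 + ERROR_PER_MOVE; // rotateTo(-90)
            pos = 0 + ERROR_PER_MOVE;   // rotateTo(0)
        }
        return pos;
    }

    public static void main(String[] args) {
        System.out.println("relative drift: " + relativeAfterSweeps(12));
        System.out.println("absolute drift: " + absoluteAfterSweeps(12));
    }
}
```

With these assumed numbers the relative mode drifts a full degree per sweep, which matches the order of misalignment we observed, while the absolute mode stays within one move's error.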
Outline, no correction

When the NXT starts, it does not know how big the environment is, so the idea is to have it trace the outline of the environment as seen in figure 6. When it reaches the first corner it defines this as the first corner, setting the coordinate to 0;0. Afterwards it turns in the direction with no obstructions and drives until it reaches the next corner. The reason it looks in both directions is that the NXT does not know which way around the track it is going, so it needs to decide which way to turn.


Figure 17: straight moving outline detection with debug mode on
In this case we assumed it would drive straight when driving forward, but we encountered problems, as it would not drive straight. This could indicate a problem in the leJos TachoPilot, in the SimpleNavigator, or in the hardware; further investigation is needed to determine where the problem lies. We tried to correct the course by implementing a PID regulator that uses the outline of the table.
PID regulation
Our PID regulator works on two coordinates, X and Y. If we want to move in the X direction while keeping a distance to the wall, it regulates on the Y coordinate; the two coordinates are then fed into SimpleNavigator.goTo, which takes an X and a Y coordinate. If the robot turns and we now want to go in the Y direction, it regulates on the X coordinate instead. This may seem like a complex way to steer the device, but it is necessary because the SimpleNavigator is the class that has control of the motor class.

figure 18: the implemented PID code
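The core of the idea can be sketched in plain Java. Only the proportional term is shown here, and all gains and distances are hypothetical placeholders, not the values from figure 18:

```java
// Sketch of the wall-following idea: regulate the cross-track coordinate and
// feed the result to a goTo-style call. Only the proportional term is shown;
// the gain and distances are hypothetical placeholders.
public class WallFollower {
    static final double KP = 0.8;           // proportional gain (assumed)
    static final double DESIRED_DIST = 15;  // desired wall distance in cm (assumed)
    static final double STEP = 10;          // advance per iteration in cm (assumed)

    // Moving along +X with the wall on the left: correct Y by the distance
    // error while advancing in X.
    // measuredDist: sonic reading to the wall; currentX/currentY: navigator pose.
    static double[] nextTarget(double currentX, double currentY, double measuredDist) {
        double error = measuredDist - DESIRED_DIST;
        double targetY = currentY + KP * error; // steer toward the desired distance
        double targetX = currentX + STEP;       // keep advancing along X
        // On the real robot this pair would be passed to SimpleNavigator.goTo.
        return new double[] { targetX, targetY };
    }

    public static void main(String[] args) {
        // 5 cm too far from the wall: the next target nudges Y toward the wall.
        double[] t = nextTarget(0, 0, 20);
        System.out.println(t[0] + ", " + t[1]);
    }
}
```

When the robot turns a corner, the same calculation is applied with the roles of X and Y swapped, as described above.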
During development of the PID regulator, we discovered a new problem. The device has to stop while taking a reading to the wall, and the problem was that the back of the NXT skids out. This presented a quite serious problem, as it put the NXT further off course, and it became difficult to get the PID regulator to regulate correctly. We were advised to shift the weight to the back of the construction, as described in “Results: final construction”. We eventually found that changing the speed of the NXT helped, but not enough, and it also resulted in a very slow-moving robot.
Figure 19 movie 2: wall following PID regulation
Presentation
The first presentation showed the problem of the SimpleNavigator together with the belt drive. The result was a nearly perfect 20 by 20 square. From this test program the error facing the next step was not clear, because over small distances the system seems to work perfectly.
The next presentation was the test we used to determine the distance detection of the RGB sensor and the color output. The distance tests resulted, during the project, in construction changes to make sure the RGB sensor could reach the object when the sensor platform was in the 90 degree position. With this presentation there was also a presentation of the menu system and the debug channel.
The final test was to present the SimpleNavigator issue at large scale and to show the progress so far on the main goals. The program should follow the table outline to define the corner coordinates. The first test showed large errors because of a motor issue, where one of the motors did not put out enough power, so the robot could not drive straight. The next step was the idea of using a PID regulator to make sure we kept the same distance to the side wall. The issue here again was the difference between the motors, and it seems one of the motors is broken.
Finally we had a discussion about the research on using the SimpleNavigator for belt drive.
Future work
Further research into belt drive is needed to calculate the correct angle of rotation. The problem lies in the friction of long tracks: if the tracks have a lot of ground contact because of the length of the vehicle, there will be more friction than with a wheel. The math needs to take the actual track length into account. There will still be errors, because the friction issue also means the floor type, which changes the friction constant, must be taken into account.
In the project, the search drive and the outline detection did not work as planned. With the outline and search we had hoped to make course correction calculations to minimize the error. The solutions could work in theory, but improvements need to be made to the SimpleNavigator to be able to reach the necessary positions, which was not possible.
Because we could not detect the outline, we could not get the object detection and sweep tested and implemented in this project. Especially the calculation of tower positions has high priority in future work, because it is the basis of the course correction.
When the different systems work as planned, the code should be optimized with threads for more fluent sweeping and search movement, because the stop, search and drive cycle takes too long.
Conclusion

We did not achieve the main goal of the project, which was detection of the red tower and presenting its position. We did, however, conclude from the tests that the SimpleNavigator in leJos 0.85 beta cannot be used for belt drive with the precision needed for this project.
During the project several new items appeared, documented in this blog. The menu feature was not mentioned in the early phases of the project, but we found it necessary. The debug channel we implemented was used extensively, and the hope is that future projects will use it earlier in the lab work.
We looked at fixed gearing in the project, mostly because the construction gave more ground clearance with the gears mounted. Fixed gearing was easy to handle in the code by including it in the wheel size calculation.
The bad result at the end of the project, shown in the wall-following video in figure 19, could be because one of the motors seemed to be broken. If you analyze one of the videos closely, you will see that one of the motors is totally locked when it should have gone into reverse.
Overall, the project has given insight into the use of sensors and motors in the real world. There is still a lot of work to be done before the project can be called complete.
References
The developed code can be found here:
http://code.google.com/p/teamiha/source/browse/trunk

Project Week 1

Project Week 2

Project Week 3

Radar, NXT project
http://www.nxtprograms.com/radar/index.html

Explorer, NXT project
http://www.nxtprograms.com/NXT2/explorer/index.html

RGB CRC solution
http://lejos.sourceforge.net/forum/viewtopic.php?t=2165&highlight=colorsensor

Text menu class
http://lejos.sourceforge.net/nxt/nxj/api/lejos/util/TextMenu.html

Debug Console
http://lejosnxt.blogspot.com/2009/01/powerful-rconsole-class.html

PID regulator
