Wednesday, December 8, 2010

Project week 1

Wednesday and Thursday (01/12 and 02/12)
From 9:00 until 16:00
Participants: Michael Kaalund and Brian Hansen

Schedule for the week
-Begin testing the belt drive to get it working with the built-in coordinate system.
-Build the rotating sensor platform
-Write some code for the sensor platform rotation
-Make a debug Bluetooth connection between the NXT and the computer
-Construct a menu system for calibration and debug purposes
figure: The robot at the end of the day

Progress (notes)
We did some tests with the SimpleNavigator, so we get an x,y coordinate system.
We saw that the system didn't turn exactly 90 degrees on most occasions, and we believe there isn't really anything we can do to make up for it... but we can still use it as a rough coordinate.
By strengthening the tracks we saw some improvement in the navigation.
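The turn test can be sketched like this. It is a minimal sketch: on the NXT the navigator comes from the leJOS API (class names changed between leJOS versions), and it tracks the pose via the motor tachometers, which is exactly where the turn error creeps in. A tiny stand-in class keeps the example runnable off-robot.

```java
// Stand-in for the leJOS navigator so the sketch runs off-robot.
// The real navigator estimates pose from the motor tachometers; this
// stand-in is the ideal, error-free model of the same bookkeeping.
class FakeNavigator {
    private double x = 0, y = 0, heading = 0; // heading in degrees

    void rotate(double degrees)  { heading = (heading + degrees) % 360; }
    void travel(double distance) {
        double rad = Math.toRadians(heading);
        x += distance * Math.cos(rad);
        y += distance * Math.sin(rad);
    }
    double getX()     { return x; }
    double getY()     { return y; }
    double getAngle() { return heading; }
}

public class NavigatorTest {
    public static void main(String[] args) {
        FakeNavigator nav = new FakeNavigator();
        // Drive a 10 cm square: in the ideal model we end back at (0, 0)
        // facing 0 degrees. On the real belt drive the turns under-shoot
        // 90 degrees, so the physical robot does not close the square.
        for (int i = 0; i < 4; i++) {
            nav.travel(10);
            nav.rotate(90);
        }
        System.out.printf("x=%.1f y=%.1f angle=%.1f%n",
                          nav.getX(), nav.getY(), nav.getAngle());
    }
}
```

This is also why the coordinates are only usable as a rough estimate: every turn error is folded into all later position updates.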

figure: The belt drive with gearing and an external beam for strength.

We constructed a platform in the front so the vehicle can turn the sonic and RGB sensors ±90 degrees.
But there is an issue with the turning of the platform: if the rotation exceeds a certain point, the motor won't run in the opposite direction.
figure: The sensor platform with RGB sensor on top
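A simple software workaround is to clamp the requested platform angle before it ever reaches the motor, so we never command the platform past the point where it sticks. This is a sketch under our own assumptions: the 90-degree limit matches our platform, and on the NXT the clamped value would be fed to an absolute-angle motor command such as leJOS's `Motor.A.rotateTo`.

```java
public class PlatformLimiter {
    // The platform can physically turn +-90 degrees; past that point the
    // motor refused to run back the other way, so we never ask it to go
    // there in the first place.
    static final int PLATFORM_LIMIT = 90;

    static int clamp(int requestedAngle) {
        if (requestedAngle > PLATFORM_LIMIT)  return PLATFORM_LIMIT;
        if (requestedAngle < -PLATFORM_LIMIT) return -PLATFORM_LIMIT;
        return requestedAngle;
    }

    public static void main(String[] args) {
        // On the NXT this would be: Motor.A.rotateTo(clamp(target));
        System.out.println(clamp(120));   // capped to 90
        System.out.println(clamp(-45));   // within range, unchanged
    }
}
```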


Bluetooth
We wanted a debugging channel so we could see when something goes wrong, or check that the program has reached the right place. At first we tried to use the Bluetooth class in the API, but that is more for communication between NXTs.
After some searches on Google and the leJOS forum, we found a class that was already implemented in the leJOS API.
This class is called "RConsole", and we can use the "nxjconsole" tool to get the output from the NXT. We won't go into how to use the RConsole class, but we will link to the blog post that we used to get it working. [url]
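For completeness, the gist is only a couple of calls. This is a sketch from memory, assuming leJOS NXJ's `lejos.nxt.comm.RConsole`; a stand-in class keeps it compilable off-robot, and the debug messages are our own invented examples.

```java
// Stand-in for lejos.nxt.comm.RConsole so the sketch compiles off-robot.
// On the brick you would import the real class and run "nxjconsole" on
// the PC to receive the output over Bluetooth or USB.
class RConsole {
    static void open()            { /* on the NXT: waits for nxjconsole to connect */ }
    static void println(String s) { System.out.println(s); }
    static void close()           { }
}

public class DebugDemo {
    public static void main(String[] args) {
        RConsole.open();                        // pair with the PC console first
        RConsole.println("entering calibration"); // hypothetical debug message
        RConsole.close();
    }
}
```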

Menu system
We found that there was a need for some calibration of the motor that holds the RGB sensor and the sonic sensor.
The reason is that we want the sensors to point dead ahead at the start of the program, so it will be easier to implement a sweeping motion.
And from past experience we know that sensors like the RGB probably need some startup calibration; furthermore, it would be a nice feature to turn the debug channel on and off.
So after some digging in the leJOS API, we found a class that made it easy for us to implement a menu system.
We used the "TextMenu" class. The way you use it is to first make a String array with your menu items.
The class will display the menu and handle all the button presses.
To get the menu item that has been selected, we use the "select()" function, which returns the index in the array.
Then we use a "switch" to go into the submenus. [1]
Here is a little example:
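A minimal sketch of the pattern described above: a String array of items, `select()` returning the chosen index, and a `switch` routing into the submenus. The leJOS `TextMenu` is replaced by a stand-in so the sketch runs off-robot (the real class draws the items on the NXT LCD and blocks in `select()` until a button press, and its constructor takes different arguments), and the menu items are our own assumptions.

```java
// Stand-in for leJOS's TextMenu: instead of waiting for a button press
// on the NXT, select() just returns a preset choice so the dispatch
// logic below can run anywhere.
class TextMenu {
    private final int preset;
    TextMenu(String[] items, int preset) { this.preset = preset; }
    int select() { return preset; }  // on the NXT: blocks until ENTER is pressed
}

public class MenuDemo {
    static String dispatch(int selection) {
        // The index returned by select() matches the String array,
        // so a switch routes us into the right submenu.
        switch (selection) {
            case 0:  return "calibrating sensor platform";
            case 1:  return "calibrating RGB sensor";
            case 2:  return "toggling debug channel";
            default: return "exit";
        }
    }

    public static void main(String[] args) {
        String[] items = { "Calibrate platform", "Calibrate RGB",
                           "Debug on/off", "Exit" };
        TextMenu menu = new TextMenu(items, 1);      // pretend the user picked item 1
        System.out.println(dispatch(menu.select())); // prints "calibrating RGB sensor"
    }
}
```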

Next week
-Get RGB readings and calibration
-RGB test, measuring color over distance
-Get sonic readings and calibration
-Distance testing: can we detect Lego color towers (a bunch of matching bricks)?
-Construct a simple environment for testing
-Construct a behavior diagram
