Quote:
Originally Posted by JamesTerm
I do have a question for you... what kind of tracking setup are you going to use (e.g. Raspberry pi, m1011, m1013 etc.). Will it be on-board or over the network and processed on the driver station?
After much (semi-)heated discussion between me, another student, our mentor who is an EE, a mentor who is a biomedical engineer working in biotech, and our teacher sponsor, we decided on the ODROID-XU. Last year we used the ODROID-X2. It will be on board and will relay info from the XU to the LabVIEW side of things via UDP messages. We might have multiple XUs; we are not sure how much computational power we will need this year. As of right now, the tape tracking program runs at ~27-28 fps without uncapping the 30 fps limit or threading it.
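For anyone curious what the UDP relay amounts to, here's a minimal sketch. The message layout (a small comma-separated datagram per frame) and the driver-station address/port are placeholders I made up, not our actual protocol:

```python
# Sketch of relaying one vision result per frame from the on-board
# ODROID to the driver station over UDP. Field layout, IP, and port
# below are hypothetical placeholders.
import socket

DS_IP = "10.0.0.2"   # placeholder driver-station address
DS_PORT = 5800       # placeholder port

def encode_target(frame_id, x, y, distance):
    """Pack one tracking result as a small ASCII datagram."""
    return f"{frame_id},{x:.1f},{y:.1f},{distance:.2f}".encode()

def send_target(sock, msg):
    # UDP is fire-and-forget: if a packet drops, the next frame's
    # result simply replaces it, which is fine at ~30 fps.
    sock.sendto(msg, (DS_IP, DS_PORT))

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_target(sock, encode_target(1, 120.0, 48.5, 2.37))
```

On the LabVIEW side a UDP Read VI would just parse the same comma-separated string back apart.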
We are going the route of three 120-degree cameras for a complete view of the field. A case-based pose calculation will compute x, y, and z displacement plus pitch, roll, and yaw, with respect to the middle of the wall on the floor. There are three cases, depending on what's visible: left corner, right corner, and both.
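To give a flavor of the yaw/pitch/roll end of that calculation: once you have a rotation (in practice from something like OpenCV's solvePnP against the known target corner geometry), the Euler angles fall out of the rotation matrix. The axis conventions and ZYX decomposition below are my assumptions for the sketch, not our actual code:

```python
# Sketch: extract (pitch, roll, yaw) in radians from a 3x3 rotation
# matrix (row-major nested lists), using the common ZYX convention.
# The rotation itself would come from a pose solver like cv2.solvePnP.
import math

def euler_from_rotation(R):
    sy = math.hypot(R[0][0], R[1][0])
    if sy > 1e-6:
        roll  = math.atan2(R[2][1], R[2][2])
        pitch = math.atan2(-R[2][0], sy)
        yaw   = math.atan2(R[1][0], R[0][0])
    else:  # gimbal-lock fallback
        roll  = math.atan2(-R[1][2], R[1][1])
        pitch = math.atan2(-R[2][0], sy)
        yaw   = 0.0
    return pitch, roll, yaw

# Facing the wall head-on (identity rotation) gives all zeros.
```

The three visibility cases would just swap in different sets of known 3D corner points before solving.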
We are also going to have an ASUS Xtion to do robot and ball detection, maybe two; we're still unsure, since we are prioritizing our tasks in case we run out of time. Once the Xtions give us detections, we can measure the velocity of whatever we see (assuming we ourselves aren't moving).
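The velocity part is just finite differences between successive detections. A sketch, assuming hypothetical (x, y, z) centroids in meters out of the depth camera and a stationary robot:

```python
# Sketch: per-axis velocity (m/s) of a tracked object from two
# timestamped depth detections. Positions are hypothetical (x, y, z)
# centroids in meters; dt is the time between the two frames.
# Assumes the camera (our robot) is stationary.
def velocity(p_prev, p_curr, dt):
    return tuple((c - p) / dt for p, c in zip(p_prev, p_curr))

# A ball moving 0.5 m closer along z over half a second:
# velocity((0.0, 0.0, 3.0), (0.0, 0.0, 2.5), 0.5) -> (0.0, 0.0, -1.0)
```

In practice you'd average over a few frames to smooth out depth noise rather than trust a single frame pair.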
I'm going to be outsourcing some of the code that tracks the vision tape in the next few days. I taught another student some computer vision, and he is starting to work on depth tracking with the Xtion. One concern we have with the Xtion is lighting in the area: the depth works by projecting a pattern of dots in IR, but if there is too much ambient light washing out that pattern, the depth won't work, so it's a gamble.
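One cheap way to hedge that gamble at runtime: when the dot pattern gets washed out, the sensor reports a spike of invalid (zero-depth) pixels, so you can check the invalid fraction per frame and fall back to 2D tracking. This is a sketch, not our code; the 0-means-invalid convention matches OpenNI-style depth output, and the threshold is a guess:

```python
# Sketch: decide whether a depth frame is trustworthy by counting
# invalid pixels. `frame` is a flat list of depth values in mm, where
# 0 means the sensor got no return (OpenNI-style convention). The
# 40% threshold is an assumed tuning value, not measured.
def depth_frame_usable(frame, max_invalid_fraction=0.4):
    invalid = sum(1 for d in frame if d == 0)
    return invalid / len(frame) <= max_invalid_fraction

# Mostly valid readings -> usable; mostly zeros -> fall back to 2D.
```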