Quote:
Originally Posted by Jared341
Easy, it's because I am not the only person who writes software for our team! Java is what is taught to our AP CS students, and is a lot friendlier to our students (in that it is a lot harder to accidentally shoot yourself in the foot).
|
Ah ok.
Quote:
Originally Posted by Jared341
But if this is the part of the code that engenders the most discussion, then I'm a bit disappointed
|
Don't be disappointed... this discussion has taught (or reminded) us of something we rarely use in C++, and it indirectly helped my co-worker fix a bug today. I do know how you feel, though, as a *lot* of effort goes into this! I had most of my vision code written as well, and unfortunately it is all going straight into the bit bucket, as we could not get the deliverables needed to make it work in time. I do want to look your code over in more detail and post what I did as well, and hopefully by then the discussion will have more meat to it, as I want some closure on the work I have done so far.
I will reveal one piece now with this video:
http://www.termstech.com/files/RR_LockingDemo2.mp4
When I first saw the original video, it screamed high saturation levels of red and blue in the alliance colors, and this turned out to be true. The advantage is that there is a larger line to track at a higher point, so I could use particle detection alone. The goal then was to interpret the line in perspective and use that to determine my location on the field. From the location I had everything I needed, as I then go to an error-correction grid (an array lookup table) with linear interpolation from one point to the next. (The grid, among other tweaks, is written in Lua; more on that later too.)
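My actual grid code isn't posted yet, but for anyone curious what "an error-correction grid with linear interpolation from one point to the next" looks like in principle, here is a minimal C++ sketch. The struct and function names are hypothetical, not from my real code: the table maps a measured value to a corrected one at a handful of calibration points, and anything in between is linearly interpolated.

```cpp
#include <algorithm>
#include <vector>

// Hypothetical calibration point: what the sensor/vision code measured,
// and what the value should actually be at that point.
struct GridPoint { double measured; double corrected; };

// Look up x in a grid sorted by 'measured', linearly interpolating
// between the two surrounding calibration points. Values outside the
// table are clamped to the first/last entry.
double correct(const std::vector<GridPoint>& grid, double x) {
    if (x <= grid.front().measured) return grid.front().corrected;
    if (x >= grid.back().measured)  return grid.back().corrected;
    // First grid point whose measured value exceeds x.
    auto hi = std::upper_bound(grid.begin(), grid.end(), x,
        [](double v, const GridPoint& p) { return v < p.measured; });
    auto lo = hi - 1;
    double t = (x - lo->measured) / (hi->measured - lo->measured);
    return lo->corrected + t * (hi->corrected - lo->corrected);
}
```

The nice part of this scheme is that tuning is just editing table entries (which is why pushing the table out to Lua is attractive), not recompiling C++.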
more to come...
There is one question that I would like to throw out there now, though... Does anyone at all work with the UYVY color space (a packed YUV 4:2:2 format)? We work with this natively at NewTek, and it would be nice to see who else does.
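For anyone who hasn't touched it: UYVY stores two horizontally adjacent pixels in four bytes, in the order U0 Y0 V0 Y1, so the pixel pair shares one chroma sample. A minimal sketch of unpacking a UYVY scanline into per-pixel Y/U/V (the struct and function names here are illustrative, not any NewTek API):

```cpp
#include <cstddef>
#include <cstdint>

struct YUV { uint8_t y, u, v; };

// Unpack one UYVY scanline into per-pixel samples, duplicating the
// shared chroma for each pixel of the pair. Width is assumed even,
// since UYVY always stores pixels in pairs.
void unpack_uyvy(const uint8_t* src, YUV* dst, size_t width) {
    for (size_t x = 0; x < width; x += 2) {
        uint8_t u  = src[2 * x + 0];
        uint8_t y0 = src[2 * x + 1];
        uint8_t v  = src[2 * x + 2];
        uint8_t y1 = src[2 * x + 3];
        dst[x]     = {y0, u, v};
        dst[x + 1] = {y1, u, v};
    }
}
```

One reason the format is pleasant for vision work: luma is already separated out, so brightness-based particle detection can walk every other byte without any RGB conversion.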