#1
Team 2073 presents: DoubleVision
First disclaimer: this thread is inspired by, and was requested in, the CheesyVision thread. We do not want to derail or hijack that thread, so here it is.
Second disclaimer: our original inspiration for this code was provided by Spectrum 3847 and this white paper. As you look through the code, you will see a lot of their original code in there. We intentionally left all our lumps and warts in there so they can be utilized in the future, should the need arise.

The goal of this thread is to share ideas and approaches to different ways of doing object tracking. Please feel free to share code if you feel led to do so. Don't worry about sidetracking this thread, but try to keep it to object tracking and the various methods that have worked for your team. Ask questions, propose alternatives.

The purpose of DoubleVision is to allow the DS to tell the robot which color ball to track. (It can be dynamic if you wish, allowing you to change mid-match, but I have not yet seen a game where that would help.) The selection of the color to track is done by reading a switch on the DS and setting a DIO pin on the robot to the corresponding state. That pin is connected to a GPIO pin on the PCDuino. (WARNING: GPIO pins on PCDuinos are 3.3 VDC. You will need a transistor or other method to convert the 5 VDC of the DSC's DIO pins down to 3.3 VDC.) You may want to modify your robot code to use the FMS information to set the color for you automatically. We chose to do it manually with a switch to allow this to work while not connected to FMS.

This code has been configured to run on a PCDuino v1 and should run just as well on a PCDuino v2. We used a USB web cam on the robot for image acquisition. As you look through the code, you will see that it would be easy to do this with a network camera, such as the Axis cameras, instead. In fact, with just minor modifications, both could work at the same time for various purposes, and all image processing would be removed from the cRIO.

In the attached zip file you will find three files; their names explain their purpose:

Dual_colorTracking.py is the file that actually runs on the board during competition.

Dual_Cal_Targets.py is used to determine the upper and lower threshold, dilate, and approx values to edit into the Dual_colorTracking.py file.

Dual_colorTracking_DEMO.py is the file that allowed us to demonstrate visually to the Judges what our code was doing. It was used on a demo PCDuino setup in the pit with a monitor. With it you can see the tracking change from a red ball to a blue ball with the flip of a switch. (This is part of how we won the Innovation in Control Award at both regionals we attended.)
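For anyone who wants the flavor of the approach without opening the zip, here is a minimal sketch of the same idea, not the competition code itself: read the color-select GPIO pin, threshold the webcam image in HSV, and find the largest blob. The GPIO sysfs path, the pin number, and the HSV ranges are all placeholders; use Dual_Cal_Targets.py to find real values for your lighting.

Code:
# Minimal sketch of the DoubleVision idea (placeholders, not the real code).
import cv2
import numpy as np

# Placeholder sysfs path; the actual GPIO file depends on the PCDuino image.
GPIO_PIN_FILE = "/sys/devices/virtual/misc/gpio/pin/gpio8"

# Placeholder HSV thresholds; calibrate these with Dual_Cal_Targets.py.
# (Note: red hue wraps around 0/180, so real red often needs two ranges.)
RED_LO,  RED_HI  = np.array([170, 100, 100]), np.array([179, 255, 255])
BLUE_LO, BLUE_HI = np.array([100, 100, 100]), np.array([120, 255, 255])

def read_color_select():
    """Return 'red' or 'blue' from the GPIO pin driven by the robot's DIO."""
    with open(GPIO_PIN_FILE) as f:
        return "red" if f.read().strip() == "1" else "blue"

cap = cv2.VideoCapture(0)  # USB web cam
while True:
    ok, frame = cap.read()
    if not ok:
        continue
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    lo, hi = (RED_LO, RED_HI) if read_color_select() == "red" else (BLUE_LO, BLUE_HI)
    mask = cv2.dilate(cv2.inRange(hsv, lo, hi), None, iterations=2)
    # OpenCV 2.x returns (contours, hierarchy); 3.x prepends the image.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        (x, y), radius = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
        # Report (x, y, radius) back to the robot here (serial, UDP, etc.).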
#2
Re: Team 2073 presents: DoubleVision
Bill, thanks for posting this...
I want to open this thread with some of the difficulties we had pulling off a task like this... as posted in the CheesyVision thread, this was our attempt at it. Getting the tracking detection proved to be somewhat of a challenge. If you look at my avatar you'll see I have last year's bumper around my waist... I was on my way to test this bumper against a blue ball to see how the color would impact the geometry detection of the ball. We found a solution that could separate the bumper from the ball, but doing it with IMAQ calls from NI Vision was expensive in performance.

Here is our project, which has been ongoing since 2012. As I write now it is due for an update in the source, so I'll get this updated soon and post back when it has the ball tracking.

OK, so that is one issue. For the most part we can track the ball, but then how to move to the angle... this is not so trivial, at least for us. We have been pushing the idea of moving to h.264 to keep the bandwidth down, and finally this year it has been successful: as shown here, we get about 1.5 megabits per second or less pushing 600x800 at unlimited framerate on the M1013, as shown in these video clips. The only drawback is the latency, which for camera tracking makes potential oscillations challenging to overcome. So the solution is to wait out the latency in bursts. I'm not sure if it is the best solution, but it was effective in the tests. I think the most attractive piece of this solution is the ability to re-use the same resources that were otherwise used for live streaming feedback to the drive team.

Even if there were no latency, it is tricky to tell a robot to turn a given delta angle... for us in this demo it was open loop, with no gyro and no encoders; all turning was "timed" to a curve-fitted poly that was empirically tuned at 90, 45, 22.5 degrees, etc. Once again... trying to work with the minimal amount of sensors. It's funny how there are many ways to solve the same problem, but we didn't want this feature to have any overhead, mechanically speaking, so we worked with what was available otherwise.

One other footnote: we have been using NetworkTables for all communication between robot and driver station (i.e. sending ball tracking information), so this may introduce some other latency, but it should be minimal. The good news is that we can reuse the same code for all network traffic, including feedback of voltages issued, other sensor feedback, hotspot detection, autonomous tweaking, and the ball count to perform in autonomous... etc.
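To make those two tricks concrete — the curve-fitted timed turn and waiting out the stream latency in bursts — here is a rough sketch. The drive callbacks, the latency figure, and the calibration timings are made-up stand-ins, not the numbers from our robot:

Code:
# Rough sketch of 'timed open-loop turns + wait out latency in bursts'.
# All numbers and the drive interface below are hypothetical placeholders.
import time
import numpy as np

# Empirical calibration: degrees turned vs. seconds of full-power turning.
cal_angles  = [22.5, 45.0, 90.0]   # tuning points, per the post
cal_seconds = [0.18, 0.33, 0.62]   # made-up timings; tune on your robot
poly = np.polyfit(cal_angles, cal_seconds, 2)  # curve-fitted polynomial

def turn_time(delta_deg):
    """Seconds of open-loop turning for a given delta angle (no gyro)."""
    return max(0.0, float(np.polyval(poly, abs(delta_deg))))

CAMERA_LATENCY = 0.17  # seconds; roughly 4-5 frames at 30 fps over h.264

def burst_turn_to_target(get_error_deg, set_turn_power, tolerance_deg=2.0):
    """Turn in short open-loop bursts, pausing after each burst so the
    delayed video catches up before the next correction (this avoids the
    oscillation you get from reacting to stale frames)."""
    while True:
        error = get_error_deg()          # degrees off-center, from vision
        if abs(error) <= tolerance_deg:
            set_turn_power(0.0)
            return
        set_turn_power(1.0 if error > 0 else -1.0)
        time.sleep(turn_time(error))     # timed burst
        set_turn_power(0.0)
        time.sleep(CAMERA_LATENCY)       # wait out the latency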
#3
Re: Team 2073 presents: DoubleVision
If you're processing through the cRIO (or at least communicating through it), can't you just use this?
Code:
DriverStation.getInstance().getAlliance()

Don't get me wrong, I love the idea. It just seems like you reinvented the wheel a little bit for this application of it.
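For what it's worth, wiring that call to DoubleVision's color-select pin is only a few lines. A sketch in Python, assuming a RobotPy-style wpilib binding whose names mirror the Java API (the DIO channel is a placeholder, and the Alliance enum spelling may differ between wpilib versions):

Code:
import wpilib

class Robot(wpilib.IterativeRobot):
    def robotInit(self):
        # Placeholder DIO channel: the pin feeding the PCDuino's
        # color-select GPIO (through a 5V -> 3.3V level shift).
        self.color_pin = wpilib.DigitalOutput(1)

    def teleopPeriodic(self):
        # Drive the pin high for red, low for blue (pick your convention).
        alliance = wpilib.DriverStation.getInstance().getAlliance()
        self.color_pin.set(alliance == wpilib.DriverStation.Alliance.kRed)

if __name__ == "__main__":
    wpilib.run(Robot)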
#4
Re: Team 2073 presents: DoubleVision
Quote:
In regards to getAlliance(), it is cool that people actually use that function call and have verified that it works. There are several more cool methods like that. For C++ they are in DriverStation.h; the method is the same name, just a different case: GetAlliance().
#5
Re: Team 2073 presents: DoubleVision
Quote:
The reason we went with a switch is, well, because we don't always trust FMS to work correctly every time. Can you think of any examples from this year? Additionally, we want to be able to set the color to track when not on FMS, like at demos and off-season competitions.
#6
Re: Team 2073 presents: DoubleVision
Thanks for posting this! It is awesome that everyone seems to be in a sharing mood.
#7
Re: Team 2073 presents: DoubleVision
Was this a hint of some incident? If so please let me know... thanks.
... Sometimes I get in the mood to fix other people's bugs.
#8
Re: Team 2073 presents: DoubleVision
Quote:
The biggest errors with FMS this year are timing related. The timing of hot goal activation, dynamic reflectors, hot goal duration, and even tele-op duration has been, well, let's just say, a bit suspect this year. FIRST appears to have improved it, but I don't believe they have made it "as advertised" just yet. Honestly, I have not heard of any issues with alliance assignment, but if it did get messed up, it could potentially cause your robot to pursue the wrong ball and thus incur a HUGE penalty this year.
#9
Re: Team 2073 presents: DoubleVision
When not on FMS, this parameter is set by choosing the Team Station in the bottom center of the Operation tab of the DS.
#10
Re: Team 2073 presents: DoubleVision
Quote:
Dang, there we go, overthinking again! Thanks, I guess we have a small bit of re-coding to do.
#11
Re: Team 2073 presents: DoubleVision
Quote:
#12
Re: Team 2073 presents: DoubleVision
Quote:
Using this would make Trussing to an HP MUCH easier, too!! So, is 2.2 available, or is that happening after St. Louis?
#13
Re: Team 2073 presents: DoubleVision
Quote:
I am a bit sad as I release it... because it has yet to be used to its fullest potential. I am hoping somebody will take it and run with it. If that happens... then maybe I won't feel that it was a wasted effort.
#14
Re: Team 2073 presents: DoubleVision
Quote:
Thanks, and awesome job.
#15
Re: Team 2073 presents: DoubleVision
Quote:
This is a targeting reticle, as such, where the spheres are calibrated to the actual ball's size at its apex and at the place it will land. When we had our manipulator working it shot 27 feet, so there are 27 segments in the path align, each measuring one foot. This video can hopefully illustrate how these work a little better.

Our strategy was to look for robots that could catch, and use this system to deliver balls to them over the truss. That, and being able to score with the aid of the apex marking. If you watch the video you may notice the path align moves... it moves when the robot moves, but I had that demo not connected, so it moved at 1 fps by default. The idea is that when connected and fully moving, it almost looks like the robot is riding on train tracks, with the rails 25 inches apart (i.e. one ball diameter). This makes it easier for the eyes to predict where the ball is going to land and to anticipate hitting the mark for a motion shot.

Thanks for the feedback... these compositing tools are highly configurable and should be able to be used in future games as well.

There is one other note worth mentioning: on the M1013 we stream 600x800 at 30 fps using h.264, running 1.5 Mbps or less, with 4-5 frames of latency (only one frame of latency more than using mjpeg, with the benefit of low bandwidth).

Last edited by JamesTerm : 04-18-2014 at 07:59 PM.
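For anyone curious about the geometry behind an overlay like the path align, the core of it is just the pinhole projection: apparent size scales as focal length over distance. A toy sketch, not JamesTerm's actual compositor (the focal length here is a made-up calibration value):

Code:
# Toy sketch of the projection math behind a 'path align' overlay:
# how far apart the 25-inch rails appear at each one-foot tick.
FOCAL_PX = 700.0             # placeholder focal length, in pixels
RAIL_WIDTH_FT = 25.0 / 12.0  # rails one ball diameter (25 in) apart
SHOT_RANGE_FT = 27           # the manipulator shot 27 feet; one tick per foot

def apparent_width_px(width_ft, distance_ft):
    """Pinhole model: on-screen width of an object at a given distance."""
    return FOCAL_PX * width_ft / distance_ft

for d in range(1, SHOT_RANGE_FT + 1):
    print("tick at %2d ft: rails %6.1f px apart"
          % (d, apparent_width_px(RAIL_WIDTH_FT, d)))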