Hot Goal Targeting Help

I am part of FRC Team 3293 (the Otterbots), and we are trying to program our robot to detect the hot goal. I have followed several online tutorials and have been getting nowhere. I am wondering if anyone knows the best way to go about vision targeting for the hot goal.

Without a specific question, I can’t give you an answer with the degree of specificity I think you’re looking for. Would you mind elaborating?

I’ve found the WPI documentation to be useful (though lackluster in more advanced/custom stuff). They have a section on Vision for 2014, if you haven’t looked at it already.

As far as the best way to go about hot goal targeting, I can say that our team is not looking at the LEDs on the goals. We found it much easier to segment the dynamic horizontal target. See here (and scroll down about halfway) for how a horizontal target behaves during autonomous.
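If it helps, here is a rough sketch, in Java rather than LabVIEW, of what deciding “hot” from the horizontal target boils down to. It assumes an earlier vision step already gave you bounding boxes for the bright particles; the Box class and the 3.0 cutoff are placeholders for illustration, not anything from a real library.

```java
// Rough sketch (Java, not LabVIEW): decide "hot" from particle bounding boxes.
// Assumes an earlier vision step already produced bounding boxes for the
// retroreflective particles; the Box class and the 3.0 cutoff are illustrative
// only, not part of any FRC library.
import java.util.List;

public class HotGoalCheck {

    /** Simple stand-in for a particle's bounding box, in pixels. */
    public static class Box {
        public final double width;
        public final double height;
        public Box(double width, double height) {
            this.width = width;
            this.height = height;
        }
    }

    /**
     * The 2014 horizontal target is much wider than it is tall (roughly
     * 23.5" x 4"), while the static vertical target is tall and narrow.
     * If any particle looks "wide", the horizontal target is down and the
     * goal is hot.
     */
    public static boolean isHot(List<Box> particles) {
        for (Box b : particles) {
            double aspect = b.width / Math.max(b.height, 1.0); // avoid divide by zero
            if (aspect > 3.0) {  // cutoff is a guess; tune it on your own images
                return true;
            }
        }
        return false;
    }
}
```

Tune the cutoff against your own camera images before trusting it.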

I have looked at those sites.

We are targeting the retroreflective tape. I have used the 2014 target detection example VI in the LabVIEW tutorials, and I have managed to get that code to identify hot and not-hot targets. My problem has been that I can’t figure out how to integrate the target detection code into our robot code.

Thanks for being more specific. I’m not familiar with LabVIEW, but I assume it has the ability to abstract certain code fragments into subroutines.

I would implement it as some sort of subroutine/function that returns targeting information. You can then feed what that function returns into an if statement in your autonomous routine.
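Since I can’t draw a block diagram in a post, here is the shape of that structure in Java-style pseudocode. Every name here (TargetInfo, getTargetInfo, shoot, waitForHotGoal) is a placeholder for whatever your code actually does.

```java
// Sketch of the structure only (Java, not LabVIEW): a routine that returns
// targeting info, and an autonomous routine that branches on it.
public class AutonomousSketch {

    /** What the vision subroutine hands back to the rest of the code. */
    public static class TargetInfo {
        public final boolean hot;
        public final double distanceFeet;
        public TargetInfo(boolean hot, double distanceFeet) {
            this.hot = hot;
            this.distanceFeet = distanceFeet;
        }
    }

    /** Placeholder for the vision subroutine (the part the example VI does). */
    static TargetInfo getTargetInfo() {
        // ... run the target detection here and fill in real values ...
        return new TargetInfo(false, 0.0);
    }

    static void runAutonomous() {
        TargetInfo target = getTargetInfo();
        if (target.hot) {
            // Our goal is hot right now: shoot immediately.
            shoot();
        } else {
            // Not hot yet: wait for the goals to swap (around 5 s in), then shoot.
            waitForHotGoal();
            shoot();
        }
    }

    static void shoot() { /* placeholder */ }
    static void waitForHotGoal() { /* placeholder */ }
}
```

In LabVIEW, I’d expect the equivalent to be a subVI whose outputs feed a case structure in your autonomous code.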

I haven’t figured out how to do that. I know that it goes into the vision processing section of the code, but I haven’t been able to figure out which code to place in it. I have tried several different strings from the target detection VI, but I have been unsuccessful so far.

Have you looked at tutorial number 8?

Greg McKaskle

I have tried that. It did not work; I didn’t get any hot/not-hot data on the dashboard.

I’m not familiar with LabVIEW, but try adding diagnostic information. It sounds cool and can be very helpful! Put descriptive printouts all over the code so you can find out what’s hanging it up. Also, look for something called the aspect ratio. It should help a lot.
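To make the “printouts everywhere” suggestion concrete, here is a tiny Java-style sketch of the kind of report I mean; in LabVIEW the equivalent would be probes or front-panel indicators on the wires you care about. All the numbers and names are made up.

```java
// Illustrative only (Java, not LabVIEW): print what each stage of the
// pipeline produced so you can see where it goes wrong.
public class VisionDiagnostics {
    public static void report(int particleCount, double width, double height, boolean hot) {
        System.out.println("particles found:   " + particleCount);
        System.out.println("largest box (px):  " + width + " x " + height);
        System.out.println("aspect ratio:      " + (height > 0 ? width / height : 0));
        System.out.println("hot goal detected: " + hot);
    }

    public static void main(String[] args) {
        // Example numbers only, to show what a healthy report might look like.
        report(2, 120.0, 24.0, true);
    }
}
```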

Tutorial 8 has directions for doing the processing on the robot or on the dashboard, and it has many steps and screenshots. Which part didn’t work? Do you need more help with steps 13 and 14?

Please let me know which steps you didn’t understand and I’ll go into more detail.

Greg McKaskle

I went through all the steps and I thought I was successful. However, when I ran the code, no information about distance or whether the target was hot showed up on the dashboard.

The steps I’d recommend are to get the vision portion working first, then send the information using SD, receive it on the robot, and make use of it to control the robot.
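In rough pseudocode (written as WPILib Java here only because block diagrams don’t paste into a post), that hand-off looks something like the sketch below. The “HotGoal” and “TargetDistance” key names are arbitrary; they just have to match on the sending and receiving ends.

```java
// Rough sketch of the hand-off in WPILib Java (this thread is LabVIEW, so
// treat it as pseudocode for the same idea). SmartDashboard keys are plain
// strings; "HotGoal" and "TargetDistance" are arbitrary names.
import edu.wpi.first.wpilibj.smartdashboard.SmartDashboard;

public class HotGoalLink {

    // Vision/dashboard side: publish what the vision code found.
    public static void publish(boolean hot, double distanceFeet) {
        SmartDashboard.putBoolean("HotGoal", hot);
        SmartDashboard.putNumber("TargetDistance", distanceFeet);
    }

    // Robot side: read the values back and branch on them in autonomous.
    public static void autonomousStep() {
        boolean hot = SmartDashboard.getBoolean("HotGoal", false);
        double distance = SmartDashboard.getNumber("TargetDistance", 0.0);
        System.out.println("hot=" + hot + " distance=" + distance); // diagnostic

        if (hot) {
            // ... shoot now, using distance to pick shooter power ...
        } else {
            // ... wait for the goal to become hot, then shoot ...
        }
    }
}
```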

Do you still have the working vision code? Was it on the dashboard or robot? Did you probe or verify that the data was being flattened and sent to an SD variable with no errors? Where did you place the SD read on the robot?

Break it down into smaller elements that you can verify. That is useful now as you work your way towards a solution, and it will be useful later when things stop working and you are expected to fix them.

Greg McKaskle