Detecting trailer position

Due to the differences in maneuverability when the trailer is at different positions, I realized that some sensor to determine what angle the trailer is at could be quite useful. However, being unable to modify the trailer, and the trailer hitch being outside the maximum robot dimensions, I was unsure of how to do this.
Any suggestions?
(I was considering using SONAR on one side, but I wanted to know if there was a better way)

Your camera can detect trailer position relative to your robot easily. Unless you are using it for something else :]

Of course, that’s always the problem.
Unless I find it beneficial to have the turret pointed at the trailer the whole time, I’ll be using the camera to aim the turret.

Using an ultrasonic sensor was the first thing that came to my mind.

Using an ultrasonic or sonar sensor wouldn’t give you the best results, however, because if a ball flies in the way or another robot comes close, it would block the reading. I would try to put something like an encoder inside the trailer hitch, connected to the pin.

Two 30 cm Sharp IR sensors could do it.

A pair of sharp IR sensors should work well for this.

That’s what I’d tend to go with. Maybe have them on servos so that the left one is slightly offset to the left and the right one is slightly offset to the right, so you can tell which way it’s moving and track it. It would probably work best to point them at the bar that goes to the trailer hitch, because that’s less likely to have a ball fly past it.
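To make the two-sensor idea concrete, here is a minimal sketch (not from the thread) of how a pair of distance readings aimed at the hitch bar could give you a rough angle. The sensor spacing, mounting geometry, and function names are all assumptions for illustration:

```python
import math

# Assumed spacing between the two IR sensors, in meters (illustrative value)
SENSOR_BASELINE_M = 0.20

def hitch_angle_rad(left_dist_m, right_dist_m):
    """Rough hitch-bar angle from two IR distance readings.

    Positive when the bar is farther from the left sensor,
    i.e. swung toward the right side of the robot.
    """
    return math.atan2(left_dist_m - right_dist_m, SENSOR_BASELINE_M)

# Example: bar is 2 cm farther from the left sensor than the right,
# so the result is a small positive angle.
angle = hitch_angle_rad(0.27, 0.25)
```

This only resolves left/right swing, not the full trailer pose, but that may be enough to decide which way to lead the turret.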

Just curious, what will you do with the trailer position information?

The problem I see with IR sensors is that you can’t attach them to the trailer, unless you thought of something I completely missed.

The thing that came to my mind was actually a photocell stuck in the trailer hitch; the angle would determine how much light it gets, and you’d estimate position from that. Unless there is a rule saying you can’t mount things inside the trailer hitch, I didn’t see anything against it.

I was hoping that we would mount the camera far enough back that it would cover the majority of the field we could fire into. Position would be detected by the change in the target’s apparent size. Velocity could be calculated from 2 frames, and acceleration from 3. The acceleration data would let us figure out where the trailer will be in the couple of tenths of a second it takes a ball to get there, and where it will be for a second ball, so we could lead the target enough for 7 or so balls to reach the trailer. Lots of trig to do this. The cRIO can do floating point, right?
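The lead calculation described above can be sketched with finite differences: velocity from two frames, acceleration from three, then a constant-acceleration prediction for where the trailer will be when the ball arrives. The frame rate and 2-D position tuples here are assumptions for the example, not the poster’s actual implementation:

```python
# Assumed time between camera frames (10 fps, per the thread's estimate)
FRAME_DT = 0.1

def predict_position(p0, p1, p2, lead_time):
    """Predict trailer position lead_time seconds after the newest frame.

    p0, p1, p2: (x, y) trailer positions from three consecutive frames,
    oldest first. Uses backward differences for velocity and acceleration,
    then a constant-acceleration extrapolation.
    """
    vx = (p2[0] - p1[0]) / FRAME_DT
    vy = (p2[1] - p1[1]) / FRAME_DT
    ax = (p2[0] - 2 * p1[0] + p0[0]) / FRAME_DT ** 2
    ay = (p2[1] - 2 * p1[1] + p0[1]) / FRAME_DT ** 2
    t = lead_time
    return (p2[0] + vx * t + 0.5 * ax * t ** 2,
            p2[1] + vy * t + 0.5 * ay * t ** 2)

# Trailer moving at a steady 1 m/s in x; predict 0.5 s ahead.
pred = predict_position((0.0, 0.0), (0.1, 0.0), (0.2, 0.0), 0.5)
```

With only three noisy frames at 10 fps, the acceleration term will be very sensitive to measurement jitter, so in practice some smoothing over more frames would likely be needed.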

Put some sort of rotation sensor on the bolt the hitch attaches to, or something similar.

Due to the trajectory of the balls, I would estimate half a second for one to reach a trailer. Unfortunately, this allegedly 30 fps camera seems to have been reduced to 10 fps (frames per second), with severe motion blurring.
LabVIEW defaults to 64-bit floating-point numbers (DBL), with 32-bit floating-point (SGL) available as an option.

I don’t believe it is legal to attach anything to the trailer or trailer hitch, as this would likely be outside the allowable limits for the robot. Assuming the trailer hitch tongue is steel, you could possibly use a very strong neodymium magnet on a spring-loaded potentiometer to detect which side it is on. More likely, though, you would be looking down the tongue and dealing with reflectivity. I’m rather surprised FIRST didn’t provide a reliable way for us to detect this.