pic: Team 341 presents Miss Daisy XI

Our 13th robot, and 11th to bear the name Miss Daisy. After a long and interesting build season, we are VERY happy with how she turned out.

CHASSIS

  • 119.8 lbs
  • 27.5" x 37" x 53"
  • Wide drive base for ease of fit on bridges
  • Actual Center of Gravity is 4" off the ground - the robot must tip MORE than 90 degrees forward or back before it will not return to its wheelbase

DRIVE

  • 8 Wheel, 4 CIM Drive
  • 6" Performance Wheels with Blue Nitrile Tread
  • 2 speeds: 5 ft/sec and 12 ft/sec
  • Effectively climbs the barrier and bridges

INTAKE

  • 37" wide because we really like picking up balls
  • Deploys over the bumper to manipulate the bridge
  • Powered by AM 9015 motor
  • 2 stage funneling system (3->2 in intake, 2->1 in hopper) eliminates jams

SHOOTER

  • Fixed shooter on opposite side of intake (I wonder why?)
  • Dual 6" Skyway wheels powered by dual FP 0673 motors
  • Can make shots anywhere in the offensive zone

SENSORS

  • Encoders and dual axis (pitch, yaw) gyro on the drive
  • Optical photosensor for accurately measuring shooter wheel speed
  • Camera and dual LED rings for detecting the top target

SOFTWARE

  • Completely automatic targeting via laptop-based DaisyCV image processing application
  • Target detection code can run at more than 200 frames per second and tracks the target in real time regardless of lighting, even under partial occlusion (such as from the rim/net)
  • Camera system outputs are fused with onboard telemetry to perform high-speed positioning and shooter spinup (camera commands azimuth to the gyro and RPMs to the shooter)
  • Software-assisted balancing for an easy endgame (a rough sketch follows this list)
  • A few different autonomous routines that we hope the crowd will enjoy :smiley:
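
As a taste of the balancing assist, here is a minimal sketch of the idea; this is illustrative rather than our actual code, and the gain, deadband, and power clamp below are invented numbers.

```java
// Hypothetical software-assisted balancing on the bridge: drive "uphill"
// proportionally to the gyro pitch until the bridge is close to level.
public class BridgeBalancer {
    private static final double KP = 0.03;          // drive power per degree of pitch (made up)
    private static final double DEADBAND_DEG = 2.0; // treat the bridge as level inside this band
    private static final double MAX_POWER = 0.3;    // clamp so we don't overshoot the tip point

    // Returns a forward/reverse drive command from the measured pitch.
    public double calculate(double pitchDegrees) {
        if (Math.abs(pitchDegrees) < DEADBAND_DEG) {
            return 0.0; // level enough; stop and let the bridge settle
        }
        double power = KP * pitchDegrees;
        return Math.max(-MAX_POWER, Math.min(MAX_POWER, power));
    }
}
```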

Looks like another awesome Miss Daisy machine. That intake system is really cool, and I look forward to seeing it in action. From the competition videos I have seen and the practice we have done, it is becoming clear how difficult picking up balls is going to be, and your intake looks like it has more than solved that issue.

This is definitely a Daisy machine. I’m sure that come Championships, Jared and the team will be seasoned and ready to make what they wanted to achieve last year really happen. Really impressive, and I can’t wait to see it in action soon. Good luck and see you at Champs!

Wow, that COG is impressive. This looks like an awesome robot; can’t wait to see it in action.

Daisy makes extrusion look sexy. Can’t wait to see Miss Daisy XI at MAR. It’s been a pleasure to watch her come together, can’t wait to watch her run.

Best of luck Daisy!

Great-looking bot! Can’t wait to see it in action at the MAR district events. See you all at Chestnut Hill!

Great robot Team 341! I really love the intake system on the robot, very unique and cool. See you at Chestnut Hill!

This frame rate is very impressive.

  1. What camera are you using?

  2. What are the specs of your laptop?

  3. How do you transfer information between the laptop and the cRIO?

  4. Do you have an OpenCV backboard tracking example available?

  1. The Axis M1011 camera. This does limit us to 30 frames per second when operating on the robot (at 640x480 resolution), but the code itself has been shown to process upwards of 200 frames per second when streaming images from disk. In practice, 30 frames per second is more than enough, since the gyro handles our feedback control anyhow. At 30 FPS we utilize about 15% of our CPU.

  2. It’s a Core i5 with 6GB of RAM.

  3. Camera data goes robot -> laptop through the wireless link to the Driver Station. Computed outputs go back through the link to the cRIO using WPI Network Tables.

  4. I’ll post our full code after the competition season has begun. For achieving basic throughput between the camera, laptop, and cRIO, you can use the example square tracker that comes with the SmartDashboard installer. Here is our basic algorithm (a rough code sketch follows the list):

  1. Convert the image to HSV color space

  2. Perform thresholding in HSV (enforce minimum and maximum hue, minimum saturation, minimum value)

  3. Find contours

  4. Take the convex hull of the contours (this is the step that helps ensure that partial obscuration from the rim/basket doesn’t kill us)

  5. Perform polygon fitting to the contours

  6. Filter polygons based on (a) number of vertices, (b) aspect ratio, (c) angles of horizontal and vertical lines

  7. Select the highest remaining polygon as the top target

  8. Compute the angle from the camera center to the target center and add it to the current robot heading

  9. Command the new heading to the robot (it uses the gyro and encoders to achieve this heading)

  10. Compute range via trigonometry, comparing the camera’s optical axis to the center height of the target

  11. Command the shooter RPM setpoint based on linear interpolation of a lookup table
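
For the curious, here is a rough sketch of steps 1-7 in code. It is not our DaisyCV source (that comes after the season, as promised): it uses the plain OpenCV Java bindings rather than the WPIJavaCV wrappers we actually build on, the HSV bounds are placeholders, and the line-angle check from step 6 is omitted for brevity.

```java
import java.util.ArrayList;
import java.util.List;

import org.opencv.core.*;
import org.opencv.imgproc.Imgproc;

// Illustrative only: plain OpenCV Java bindings, not our WPIJavaCV-based code.
public class TargetDetector {
    // Placeholder HSV bounds for a green LED ring; tune for your own lighting.
    private final Scalar hsvMin = new Scalar(60, 100, 100);
    private final Scalar hsvMax = new Scalar(100, 255, 255);

    // Work buffers reused across frames to avoid reallocating per call.
    private final Mat hsv = new Mat();
    private final Mat bin = new Mat();

    // Returns the highest polygon that passes the filters, or null.
    public MatOfPoint2f findTopTarget(Mat bgrFrame) {
        Imgproc.cvtColor(bgrFrame, hsv, Imgproc.COLOR_BGR2HSV);     // step 1
        Core.inRange(hsv, hsvMin, hsvMax, bin);                     // step 2

        List<MatOfPoint> contours = new ArrayList<>();              // step 3
        Imgproc.findContours(bin, contours, new Mat(),
                Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);

        MatOfPoint2f best = null;
        double bestY = Double.MAX_VALUE; // image y grows downward, so smallest y = highest

        for (MatOfPoint contour : contours) {
            // Step 4: the convex hull fills the notch the rim/net carves out of the target.
            MatOfInt hullIdx = new MatOfInt();
            Imgproc.convexHull(contour, hullIdx);
            Point[] pts = contour.toArray();
            int[] idx = hullIdx.toArray();
            Point[] hullPts = new Point[idx.length];
            for (int i = 0; i < idx.length; i++) hullPts[i] = pts[idx[i]];

            // Step 5: fit a polygon to the hull.
            MatOfPoint2f hull2f = new MatOfPoint2f(hullPts);
            MatOfPoint2f poly = new MatOfPoint2f();
            double arc = Imgproc.arcLength(hull2f, true);
            Imgproc.approxPolyDP(hull2f, poly, 0.02 * arc, true);

            // Step 6 (partial): vertex count and aspect ratio checks only.
            if (poly.total() != 4) continue;
            Rect box = Imgproc.boundingRect(new MatOfPoint(poly.toArray()));
            double aspect = (double) box.width / (double) box.height;
            if (aspect < 1.0 || aspect > 2.0) continue;

            // Step 7: keep the highest survivor as the top target.
            if (box.y < bestY) {
                bestY = box.y;
                best = poly;
            }
        }
        return best;
    }
}
```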

The code has been very carefully optimized to reduce the allocation/deallocation of dynamic memory between subsequent calls, which is what lets us operate at breakneck speed. This also involved a lot of debugging to hunt down latent memory leaks hiding somewhere in the layering of the OpenCV/JavaCV/WPIJavaCV APIs.
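
To make steps 8-11 concrete, here is a hedged sketch of the aiming math. Every constant below (field of view, camera height and pitch, target height, and the RPM table) is an invented placeholder, not our real calibration.

```java
// Illustrative aiming math for steps 8-11; all constants are made-up placeholders.
public class AimingMath {
    static final double HFOV_DEG = 47.0;                  // assumed horizontal field of view
    static final double IMAGE_WIDTH_PX = 640.0;
    static final double CAMERA_HEIGHT_IN = 33.0;          // assumed lens height above the floor
    static final double CAMERA_PITCH_DEG = 20.0;          // assumed upward camera tilt
    static final double TARGET_CENTER_HEIGHT_IN = 104.0;  // placeholder target center height

    // (range inches, shooter RPM) pairs; values invented for illustration.
    static final double[][] RPM_TABLE = { {120, 2500}, {180, 2900}, {240, 3400} };

    // Steps 8/9: pixel offset -> bearing error -> absolute heading setpoint.
    static double headingSetpoint(double targetCenterX, double currentHeadingDeg) {
        double errorDeg = (targetCenterX - IMAGE_WIDTH_PX / 2) * (HFOV_DEG / IMAGE_WIDTH_PX);
        return currentHeadingDeg + errorDeg; // gyro/encoder loop drives to this heading
    }

    // Step 10: vertical pixel offset -> elevation angle -> range by trigonometry.
    static double rangeInches(double targetCenterY, double imageHeightPx, double vfovDeg) {
        double elevDeg = CAMERA_PITCH_DEG
                + (imageHeightPx / 2 - targetCenterY) * (vfovDeg / imageHeightPx);
        return (TARGET_CENTER_HEIGHT_IN - CAMERA_HEIGHT_IN) / Math.tan(Math.toRadians(elevDeg));
    }

    // Step 11: linear interpolation between the two nearest table entries.
    static double shooterRpm(double rangeIn) {
        if (rangeIn <= RPM_TABLE[0][0]) return RPM_TABLE[0][1];
        for (int i = 1; i < RPM_TABLE.length; i++) {
            if (rangeIn <= RPM_TABLE[i][0]) {
                double t = (rangeIn - RPM_TABLE[i - 1][0]) / (RPM_TABLE[i][0] - RPM_TABLE[i - 1][0]);
                return RPM_TABLE[i - 1][1] + t * (RPM_TABLE[i][1] - RPM_TABLE[i - 1][1]);
            }
        }
        return RPM_TABLE[RPM_TABLE.length - 1][1]; // beyond the table: clamp
    }
}
```

The key design choice is that the camera only ever produces a heading setpoint and an RPM setpoint; the fast gyro and shooter-speed loops on the robot do the rest.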

How did you solve the potential networking issues and latency? Do you have some interpolation for delayed frames? Do you have timestamps on the frames to know how long they took to transmit? When you compute your position and transmit it back to the robot, does it take into account the latency on this end? How did you work around or simulate the FMS-locked stress, and did you test against the typical network traffic of a match?

All of these questions scared me away from this kind of solution. I hope you have some good answers for them. :wink:

I guess you can always sit still for a few seconds, and that should solve that.

The short answer is that if we discover network latency/dropout to be a significant problem, we will move our image processing application to an onboard laptop. Failing that, our next fallback is to reduce resolution and/or framerate. To be frank, we auto-target just fine at 5 FPS (because the gyro loop is closed at 200 Hz); we do 30 because we can :slight_smile:

However, I do not expect this to be a major concern. In past seasons, teams have streamed live camera data directly to their dashboards with few problems. The only difference is that now we are cutting the cRIO out of the loop altogether. While we haven’t run simulations against an “FRC network simulator” (if you know of a tool that could be used for this purpose, I would be interested in trying it), in theory there is PLENTY of bandwidth to go around. With reasonable compression settings these images are only on the order of 10-20 kilobytes apiece; even at 30 FPS, 15 KB frames work out to roughly 3.6 Mbps.

We don’t timestamp the images, but we do transmit our heading synchronously with new camera images being available. That way, the results returned by the vision application do not go “out of date” if they are received late. Out of order packets would be a bigger problem (it’s UDP under the hood). But absolute worst case - like you said - this would be a transient problem and would straighten itself out within a second or two.
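
To illustrate the trick, here is a minimal sketch of that heading latching; the class and method names are invented for this example, not taken from our actual code.

```java
// Latch the gyro heading when a frame is grabbed so a late vision result
// still produces a valid absolute setpoint, even if the robot has turned since.
public class LatencyTolerantAimer {
    private double headingAtCapture; // gyro heading latched at frame capture time

    // Called on the robot when a new frame is sent to the laptop.
    public void onFrameCaptured(double currentGyroHeadingDeg) {
        headingAtCapture = currentGyroHeadingDeg;
    }

    // Called when the (possibly late) vision result arrives. The bearing error
    // is added to the heading from capture time, so the answer never goes stale.
    public double absoluteHeadingSetpoint(double cameraBearingErrorDeg) {
        return headingAtCapture + cameraBearingErrorDeg;
    }
}
```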

EDIT: Forgot to add, we also do low pass filtering of both outputs from the vision system to help smoothness (and to reject momentary disturbances like when we occlude the vision target with a flying basketball :)). This should help with occasional frame drops as well.
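
For reference, the filtering is nothing exotic; a single-pole low-pass like the sketch below does the job (the smoothing constant shown is a made-up starting point, not our tuned value).

```java
// Simple exponential (single-pole) low-pass filter for the vision outputs.
public class LowPassFilter {
    private final double alpha; // 0..1; smaller = smoother but laggier
    private Double state;       // null until the first sample arrives

    public LowPassFilter(double alpha) {
        this.alpha = alpha;
    }

    public double filter(double sample) {
        state = (state == null) ? sample : state + alpha * (sample - state);
        return state;
    }
}

// e.g. one filter each on the heading setpoint and the shooter RPM, alpha ~0.3
```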

Clean looking machine!

It’s nice to see another robot with a wide pickup! :slight_smile:

See you at Chestnut Hill!

Nice job there Jared and Al!

Awesome. Can’t wait to play with 341 at Hatboro-Horsham. I’d volunteer to use your opposite-loader, but I’m thinking we want to shoot our balls in auto. :cool:

What part number did you use for the optical photosensor / tachometer?

Ed

It is the Allen-Bradley 42EF-D1MNAK-A2 from last year’s Kit of Parts.

I know why. :wink: Nice looking robot.

Thank you Jared and Team 341. Do you have an up-close picture of your shooter and encoder setup?

Here is a closeup of the shooter.

The sensor is on the left side of the picture, and two retroreflectors are mounted 180 degrees apart on one of the shooter wheels.
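
For anyone who wants to replicate it, the wheel speed measurement reduces to timing the pulses. Here is a rough sketch assuming the sensor feeds a digital input read with WPILib’s Counter class; the DIO channel is a placeholder and our actual code may differ.

```java
import edu.wpi.first.wpilibj.Counter;

// Hypothetical shooter tachometer: two retroreflectors per wheel revolution
// means two pulses per rev, so RPM = 60 / (2 * seconds-per-pulse).
public class ShooterTach {
    private static final double PULSES_PER_REV = 2.0;
    private final Counter counter = new Counter(1); // placeholder DIO channel

    public ShooterTach() {
        counter.start(); // required in the 2012-era WPILib API
    }

    public double getRpm() {
        double secondsPerPulse = counter.getPeriod();
        return 60.0 / (PULSES_PER_REV * secondsPerPulse);
    }
}
```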

Wonderful bot.

A couple of questions:
1> When your manipulator flips out, it looks like it breaks the front plane of the bot in more than one location. Are you concerned that it might be ruled two appendages?
2> When your manipulator is deployed, it appears to obscure your numbers. Is that just the angle of the picture?

Otherwise, the bot looks gorgeous.