#2 | 11-01-2017, 02:59
wsh32
The Nerdiest of the Nerd Herd
AKA: Wesley Soo-Hoo
FRC #0687 (The Nerd Herd)
Team Role: Leadership
 
Join Date: Sep 2014
Rookie Year: 2014
Location: SoCal
Posts: 16
Re: Lost and Confused about Grip, Java, roboRIO, and Network Tables

Check out team 254's vision seminar at champs last year: https://www.youtube.com/watch?v=rLwOkAJqImo

To answer your question: if you want to see the data GRIP publishes while you're debugging on your laptop, point NetworkTables at localhost or 127.0.0.1.
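
If it helps, here's a minimal sketch of reading GRIP's contours report from a desktop Java program, assuming the 2017-era WPILib NetworkTables classes, GRIP's default publish path of GRIP/myContoursReport, and GRIP (or a local NetworkTables server) running on the same machine; swap 127.0.0.1 for the roboRIO's address once GRIP is feeding the robot.

[code]
import edu.wpi.first.wpilibj.networktables.NetworkTable;

public class GripDebugClient {
    public static void main(String[] args) throws InterruptedException {
        // Connect as a client to the NetworkTables server running locally while debugging.
        NetworkTable.setClientMode();
        NetworkTable.setIPAddress("127.0.0.1");
        NetworkTable table = NetworkTable.getTable("GRIP/myContoursReport");

        while (true) {
            // GRIP publishes parallel arrays, one entry per contour it found.
            double[] areas = table.getNumberArray("area", new double[0]);
            double[] centerXs = table.getNumberArray("centerX", new double[0]);
            for (int i = 0; i < areas.length; i++) {
                System.out.println("contour " + i + ": area=" + areas[i]
                        + ", centerX=" + centerXs[i]);
            }
            Thread.sleep(100);
        }
    }
}
[/code]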

A couple of key points about processing the video stream:
1. Underexpose your camera. 254 goes into more detail about why in the video, but it does wonders for isolating the retroreflective tape.
2. You can also eliminate noise by applying a Gaussian blur and then running cv erode followed by cv dilate (with the same number of iterations). I'd also use the Filter Contours block and filter by area; there's a rough code sketch of that chain right after this list.
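
If you end up writing (or generating) the pipeline in Java instead of running it purely inside GRIP, a rough OpenCV sketch of that blur / threshold / erode / dilate / filter-by-area chain looks like this; the HSV bounds, kernel size, iteration count, and area threshold are all placeholder numbers you'd tune for your own camera and lighting.

[code]
import org.opencv.core.*;
import org.opencv.imgproc.Imgproc;
import java.util.ArrayList;
import java.util.List;

public class TargetPipeline {
    /** Returns the contours that survive the blur -> threshold -> erode -> dilate -> area filter chain. */
    public static List<MatOfPoint> process(Mat bgrFrame) {
        // 1. Gaussian blur knocks down single-pixel sensor noise before thresholding.
        Mat blurred = new Mat();
        Imgproc.GaussianBlur(bgrFrame, blurred, new Size(5, 5), 0);

        // 2. HSV threshold for the (underexposed) retroreflective glow. Placeholder bounds.
        Mat hsv = new Mat();
        Imgproc.cvtColor(blurred, hsv, Imgproc.COLOR_BGR2HSV);
        Mat mask = new Mat();
        Core.inRange(hsv, new Scalar(50, 100, 100), new Scalar(90, 255, 255), mask);

        // 3. Erode then dilate with the same iteration count removes speckle
        //    without shrinking the blobs that survive.
        Mat kernel = Imgproc.getStructuringElement(Imgproc.MORPH_RECT, new Size(3, 3));
        Imgproc.erode(mask, mask, kernel, new Point(-1, -1), 2);
        Imgproc.dilate(mask, mask, kernel, new Point(-1, -1), 2);

        // 4. Find contours and keep only the ones above a minimum area,
        //    like GRIP's Filter Contours block.
        List<MatOfPoint> contours = new ArrayList<>();
        Imgproc.findContours(mask, contours, new Mat(),
                Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);
        List<MatOfPoint> filtered = new ArrayList<>();
        for (MatOfPoint contour : contours) {
            if (Imgproc.contourArea(contour) > 100.0) {
                filtered.add(contour);
            }
        }
        return filtered;
    }
}
[/code]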

Also, make sure your camera is mounted in the center of your robot; it helps enormously with alignment. Less important, but keeping your camera behind your center of rotation makes the perceived alignment angle smaller, which helps eliminate oscillation.
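
As a concrete example of why alignment gets easier with a centered camera: the target's centerX from GRIP converts directly into the angle the drivebase needs to turn. A small sketch, assuming a hypothetical 640-pixel-wide image with roughly a 60 degree horizontal field of view:

[code]
public class TargetYaw {
    // Hypothetical camera numbers; measure your own resolution and field of view.
    private static final double IMAGE_WIDTH_PX = 640.0;
    private static final double HORIZONTAL_FOV_DEG = 60.0;

    /** Converts a contour's centerX (from GRIP) into a yaw error in degrees from image center. */
    public static double yawErrorDegrees(double centerX) {
        double focalLengthPx =
                (IMAGE_WIDTH_PX / 2.0) / Math.tan(Math.toRadians(HORIZONTAL_FOV_DEG / 2.0));
        double offsetPx = centerX - IMAGE_WIDTH_PX / 2.0;
        // With a center-mounted camera this is directly the angle the robot must turn.
        return Math.toDegrees(Math.atan(offsetPx / focalLengthPx));
    }
}
[/code]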

On 687 last year, a few problems like these kept our vision system from working. First, the camera had a lot of latency, so make sure you take camera latency into account when you write your vision code. Second, make sure your drive PID loop is well tuned; a badly tuned loop will make your drivebase spin around randomly.
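
One simple way to take the latency into account is to latch the gyro heading from when the frame was captured and apply the vision angle relative to that, instead of to wherever the robot is pointing when the result finally shows up. A rough sketch with made-up names and a proportional-only turn loop:

[code]
public class TurnToTarget {
    // Illustrative gain; tune on the real drivebase (and add I/D terms if P alone oscillates).
    private static final double kP = 0.02;

    private double targetHeadingDegrees;

    /**
     * Called whenever a new vision result arrives. headingAtCaptureDegrees should be the
     * gyro reading from when the frame was captured (e.g. looked up from a short history
     * buffer using a measured camera latency), not from when processing finished.
     */
    public void onVisionResult(double yawErrorDegrees, double headingAtCaptureDegrees) {
        targetHeadingDegrees = headingAtCaptureDegrees + yawErrorDegrees;
    }

    /** Called every control-loop iteration with the current gyro heading; returns a turn command. */
    public double turnOutput(double currentHeadingDegrees) {
        double error = targetHeadingDegrees - currentHeadingDegrees;
        return kP * error;
    }
}
[/code]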

I hope this helps, good luck this season!
__________________


FRC 687 - Lead Programmer (2015-2016), Team Captain (2016-present)