My team is trying to use vision processing to position our robot during hybrid mode, but we can’t figure out how to get the camera to send the images back to the classmate for the processing to occur.
Any help possible would be greatly appreciated.
The camera is an HTTP server, and it can serve about five different sessions at once. That means the dashboard program can open an mjpg session independent of the cRIO, and the images will arrive periodically over TCP. The default dashboard does this already. It is written in LV, and the source is provided. You can build the code from the wizard on the LV Getting Started window.
Greg McKaskle
If you choose to forgo the dashboard (which I think you do, seeing as you are trying to get images back to the classmate for processing), you can configure the Ethernet camera to work over one of the open ports outlined in the documentation (which I believe includes port 80, specifically for the camera). Personally, I found better response times by fetching a static image in quick succession than by accessing the mjpg stream.
Here’s a code listing for a crudely hacked-together dashboard I made last year (and haven’t had much of a chance to work on since):
package edu.wpi.first.team._1984.dashboard;
import java.awt.image.BufferedImage;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import javax.imageio.ImageIO;
/**
* Class which encapsulates access to the robot's ethernet camera.
*
* @author sid
*/
public class AxisCamera
{
protected String host;
protected String res = "320x240";
protected String grabPage = "/axis-cgi/jpg/image.cgi?resolution=";
protected String streamPage = "/mjpg/video.mjpg";
protected BufferedImage lastImg;
protected int[] lastImgArr;
public AxisCamera(String h)
{
host = h;
grabPage += res;
}
public AxisCamera(String h, String r)
{
host = h;
grabPage += r;
lastImg = null;
lastImgArr = null;
}
public AxisCamera(String h, String gp, String r, String sp)
{
host = h;
grabPage = gp + r;
streamPage = sp;
lastImg = null;
lastImgArr = null;
}
/** Fetches a single JPEG frame over HTTP and caches it. */
public BufferedImage grab() throws IOException
{
return lastImg = ImageIO.read(new URL(host + grabPage));
}
/** Fetches a frame and returns its pixels as a packed-ARGB int array. */
public int[] grabArray() throws IOException
{
BufferedImage img = grab();
// One int per pixel, packed ARGB.
int[] data = new int[img.getWidth() * img.getHeight()];
img.getRGB(0, 0, img.getWidth(), img.getHeight(), data, 0, img.getWidth());
lastImg = img;
lastImgArr = data;
return data;
}
public InputStream openMjpgStream() throws IOException
{
return new URL(host + streamPage).openStream();
}
}
Usage would be as:
AxisCamera cam = new AxisCamera("http://10.19.84.10", "320x240");
BufferedImage img = cam.grab();
Alternatively, you could try to pick frames out of the mjpg stream, but if you are going to process the data, it would be more economical of bandwidth to only request a frame when required.
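If you do want to go the stream route, here's a rough sketch of how you might pull individual frames out of the mjpg stream by hand. This is not from my dashboard above; the class name is mine, and it assumes each multipart chunk carries a Content-Length header before the image data (which Axis cameras provide), so treat it as a starting point rather than tested code:

```java
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

/**
 * Sketch: pulls single JPEG frames out of an MJPG multipart stream.
 * Assumes each part has a Content-Length header before the image data.
 */
public class MjpgFrameReader
{
    private final DataInputStream in;

    public MjpgFrameReader(InputStream stream)
    {
        in = new DataInputStream(stream);
    }

    /** Returns the next frame's raw JPEG bytes, or null at end of stream. */
    public byte[] nextFrame() throws IOException
    {
        int contentLength = -1;
        String line;
        // Skip the boundary and headers until the blank line before the image data.
        while ((line = readLine()) != null)
        {
            if (line.toLowerCase().startsWith("content-length:"))
            {
                contentLength = Integer.parseInt(line.substring(15).trim());
            }
            else if (line.isEmpty() && contentLength > 0)
            {
                byte[] frame = new byte[contentLength];
                in.readFully(frame);
                return frame;
            }
        }
        return null;
    }

    // Reads one LF-terminated header line as ASCII, trimming the CR.
    private String readLine() throws IOException
    {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        int b;
        while ((b = in.read()) != -1)
        {
            if (b == '\n')
                return buf.toString("US-ASCII").trim();
            buf.write(b);
        }
        return buf.size() > 0 ? buf.toString("US-ASCII").trim() : null;
    }
}
```

You'd hand it the InputStream from openMjpgStream() and call nextFrame() in a loop, passing each byte array to ImageIO.read wrapped in a ByteArrayInputStream.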
You can indeed request individual JPEGs. The M1011 doesn’t seem to have a great framerate with that technique, so setting up an MJPG stream at the requested rate doesn’t necessarily waste bandwidth, especially if you shut down the stream when you no longer need it.
I mentioned using the dashboard because you’ll find a number of them to choose from, and they can give the driver something to look at. You can of course use a different program, but you also need to make sure to launch it from the DS, from the dashboard, or by hand before each match.
Greg McKaskle
We have the Axis 206, how is the frame rate for grabbing individual JPEGs from it? Also, to grab images we would use something similar to this, possibly with a different resolution?
AxisCamera cam = new AxisCamera("http://10.19.84.10", "320x240");
BufferedImage img = cam.grab();
Well, I wouldn’t really recommend you use my code verbatim, but to request a different size, replace “320x240” with “640x480” or any other valid size (my team has the Axis 206, which I believe only has those two sizes). If you follow what I posted, you’ll see it merely concatenates the options into a valid URL.
As for framerate, I remember getting 29 or 30 fps with processing at 320x240 resolution; it was closer to 13 or 14 at the larger 640x480. Part of the reason I reverted to grabbing single JPEGs was that I couldn’t get an mjpeg-decoding library working at the time – also, since we didn’t really want a camera readout, the image processing only grabbed frames as it needed them. That didn’t end up speeding anything up, but I like to feel… optimized!
The Axis 206 and M1011 both support 160x120, 320x240, and 640x480. The typical IP address for the camera is .11 instead of .10, and the 206 has much better response serving up JPEGs than the M1011.
Other than that, I’d say give it a shot.
Greg McKaskle
In what format does the image need to be to feed it into NIVision? I haven’t had much of a chance to work with it yet, as my team is still doing much more theoretical work on our robot right now.
JPEGs and MJPGs use essentially the same image encoding: an MJPG is a stream of JPEG images under a single header, while a standalone JPEG is more often used for file-system storage. NI-IMAQ has functions for loading images from disk and others for loading from a memory stream. The second set makes more sense for the FRC camera case.
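On the Java side, the memory-stream form is just the JPEG held as a byte array. A minimal sketch of producing one (the class and method names here are mine, not part of any FRC library):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

/**
 * Sketch: encode a BufferedImage to an in-memory JPEG byte array,
 * the form a memory-stream image loader would consume.
 */
public class JpegMemory
{
    public static byte[] toJpegBytes(BufferedImage img) throws IOException
    {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(img, "jpg", out);
        return out.toByteArray();
    }
}
```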
Greg McKaskle