View Full Version : New Camera Class
TheDominis
26-01-2010, 08:22
BETA
I've heavily modified some camera code. GetJpegImageBlocking in smash_camera.cpp behaves slightly differently from the original: it still blocks, but it will not always succeed. You can restore the original behavior by copying the original method from PCVideoServer.cpp. Also, for now, you must manually set the IP address of the computer that will receive the video. You do this in the VideoServerHelper method on line 65 of smash_video_server.cpp.
Required Code:
#include "smash_camera.h"
smash::AxisCamera& camera = smash::AxisCamera::getInstance();
camera.writeResolution(smash::k160x120);
camera.writeBrightness(0);
The Dashboard fills a drop-down box with all available IPv4 addresses. It will automatically detect the data and start the server. If you want to stop the server, close the program. There is a 250 ms timeout period, after which the server status shows "waiting". The dashboard supports any JPEG size; however, it seems to be slightly buggy at 320x240 and very buggy at 640x480. I'll be working on this.
I didn't try the 2Go PC, but I got 20+ FPS at 160x120.
I'll upload the dashboard later... must go!
slavik262
26-01-2010, 11:34
I'm sorry if I'm not getting this, but what does this do? Does it give an increased framerate or is it just designed to let the user customize what IP they want to use to receive video data?
Does this run on a LabVIEW dashboard, or some custom implementation of your own?
TheDominis
26-01-2010, 12:36
I didn't know that the zip size limit was greater than the rar limit. The upload failed for that reason.
The Dashboard frame rate at 160x120 was 20+ FPS on the laptop I tested it on (not the 2Go computer). The other sizes were slower, but I think that was due to an error in the dashboard.
You will need the .NET Framework 3.5 for this to work, perhaps even 3.5 SP1.
slavik262
26-01-2010, 16:42
You still didn't really answer my question. Is this just a custom video dashboard? What advantages does this have over the standard code?
TheDominis
26-01-2010, 17:25
You still didn't really answer my question. Is this just a custom video dashboard? What advantages does this have over the standard code?
As opposed to the stock camera server, my server gets 20+ FPS at 160x120. Other image sizes are slightly buggy, but I'll fix that soon. More features coming later.
slavik262
26-01-2010, 17:46
Awesome! Without poring through the code, how did you manage to do this?
TheDominis
26-01-2010, 18:30
Awesome! Without poring through the code, how did you manage to do this?
Server
Step 1. Acquire Image
Step 2. Calculate how many packets will be required to send the image
Step 3. Send the packet(s) via UDP (1032 bytes each), refreshing any outdated data
Step 4. Repeat 1 - 3
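The server steps above can be sketched roughly like this. Everything below is an assumption for illustration: the 8-byte header layout, the field names, and the 1024-byte payload split are guesses at one reasonable way to fill a 1032-byte datagram, not the actual contents of smash_camera.cpp.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical layout: each 1032-byte datagram carries an 8-byte header
// followed by up to 1024 bytes of JPEG data.
constexpr std::size_t kPacketSize  = 1032;
constexpr std::size_t kHeaderSize  = 8;
constexpr std::size_t kPayloadSize = kPacketSize - kHeaderSize;  // 1024

// Step 2: how many packets a JPEG of a given size will need.
std::size_t PacketCount(std::size_t imageBytes) {
    return (imageBytes + kPayloadSize - 1) / kPayloadSize;  // ceiling division
}

// Step 3: split the image into kPacketSize datagrams. Header fields are
// written big-endian (network byte order) so the receiver can translate.
std::vector<std::vector<uint8_t>> Packetize(const std::vector<uint8_t>& jpeg,
                                            uint16_t frameId) {
    std::vector<std::vector<uint8_t>> packets;
    const uint16_t total = static_cast<uint16_t>(PacketCount(jpeg.size()));
    for (uint16_t index = 0; index < total; ++index) {
        std::vector<uint8_t> pkt(kPacketSize, 0);
        const std::size_t offset = static_cast<std::size_t>(index) * kPayloadSize;
        const std::size_t chunk  = std::min(kPayloadSize, jpeg.size() - offset);
        // Assumed big-endian header: frame id, packet index, total, payload length.
        pkt[0] = static_cast<uint8_t>(frameId >> 8);
        pkt[1] = static_cast<uint8_t>(frameId);
        pkt[2] = static_cast<uint8_t>(index >> 8);
        pkt[3] = static_cast<uint8_t>(index);
        pkt[4] = static_cast<uint8_t>(total >> 8);
        pkt[5] = static_cast<uint8_t>(total);
        pkt[6] = static_cast<uint8_t>(chunk >> 8);
        pkt[7] = static_cast<uint8_t>(chunk);
        std::memcpy(pkt.data() + kHeaderSize, jpeg.data() + offset, chunk);
        packets.push_back(std::move(pkt));
    }
    return packets;
}
```

Each datagram would then go out through a normal UDP sendto() call.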
Receiver
Step 1. Acquire a 1032 byte packet
Step 2. Translate the packet into usable information (e.g., big-endian to little-endian)
Step 3. Populate "Status" with new information
Step 4. If the image is finished, present it
Step 5. Repeat 1 - 4
This is, of course, a simplified overview.
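A minimal receiver-side sketch of steps 1-4: decode the big-endian header into host-order values, copy each payload into place, and report when the frame is complete. The 8-byte header layout and the `FrameAssembler` class are assumptions for illustration; a real receiver would also track which packet indices have arrived rather than just counting them.

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <vector>

// Assumed header fields, all 16-bit big-endian on the wire.
struct PacketHeader {
    uint16_t frameId, index, total, payloadLen;
};

// Step 2: translate the big-endian (network order) header into host values.
PacketHeader ParseHeader(const uint8_t* pkt) {
    auto be16 = [](const uint8_t* p) {
        return static_cast<uint16_t>((p[0] << 8) | p[1]);
    };
    return { be16(pkt), be16(pkt + 2), be16(pkt + 4), be16(pkt + 6) };
}

// Steps 3-4: accumulate payloads; return true once the frame is complete.
class FrameAssembler {
public:
    bool Add(const uint8_t* pkt) {
        const PacketHeader h = ParseHeader(pkt);
        if (h.frameId != frameId_) {          // new frame: discard the old one
            frameId_  = h.frameId;
            received_ = 0;
            data_.assign(h.total * 1024u, 0); // 1024 = assumed payload size
        }
        std::memcpy(data_.data() + h.index * 1024u, pkt + 8, h.payloadLen);
        ++received_;                          // simplified: duplicates not handled
        return received_ == h.total;          // frame finished: present it
    }
private:
    uint16_t frameId_ = 0xFFFF;
    std::size_t received_ = 0;
    std::vector<uint8_t> data_;
};
```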
slavik262
26-01-2010, 18:36
So you segment the image into several packets and send it over UDP instead of TCP? I assume you have error-checking built in? I'll have to check it out.
TheDominis
26-01-2010, 18:38
So you segment the image into several packets and send it over UDP instead of TCP? I assume you have error-checking built in? I'll have to check it out.
I care very little about errors. I throw bad packets out and wait for a new image.
Tom Line
26-01-2010, 20:17
I'm curious - how does this affect the total bandwidth used by a single cRIO and dashboard?
In other words, have you increased 20-fold (or is it 20^2 for an image?) the number of packets being sent in order to increase your framerate 20 times?
For some reason I thought the communication protocols were sacrosanct - that is, I thought we weren't allowed to touch them because of bandwidth concerns. I had thought that was why they limited the dashboard to a 1 Hz update, though I might be wrong: you know what they say about assuming.
TheDominis
26-01-2010, 20:53
I'm curious - how does this affect the total bandwidth used by a single cRIO and dashboard?
In other words, have you increased 20-fold (or is it 20^2 for an image?) the number of packets being sent in order to increase your framerate 20 times?
For some reason I thought the communication protocols were sacrosanct - that is, I thought we weren't allowed to touch them because of bandwidth concerns. I had thought that was why they limited the dashboard to a 1 Hz update, though I might be wrong: you know what they say about assuming.
I'm not sure either. The TCP stream tries to send images quite fast, but fails. TCP also fragments implicitly, though I don't know how that affects socket I/O. I know that I use ~1 MB/s for 640x480 at 30 Hz with JPEGs drawn on the computer, but those are much larger than the camera's JPEGs at no compression. I'll post bandwidth usage for each image size tomorrow.
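As a rough sanity check on that figure, bandwidth divided by frame rate gives the bytes available per frame: 1 MB/s at 30 Hz is about 33 KB per JPEG. A hypothetical helper, just to make the arithmetic explicit:

```cpp
// Bytes available per frame at a given stream bandwidth and frame rate.
// E.g. 1 MB/s at 30 frames/s leaves roughly 33 KB for each JPEG.
double BytesPerFrame(double bytesPerSecond, double framesPerSecond) {
    return bytesPerSecond / framesPerSecond;
}
```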
TheDominis
28-01-2010, 19:39
Bandwidth Usage:
160x120: 100 kb/s
320x240: 400 kb/s (peak; the average was around 300 kb/s)
640x480: Not Tested
The Lucas
28-01-2010, 22:53
Good work. However, before you train drivers with this video feed available, you might want to ask FIRST Q&A if UDP traffic will be blocked by FMS at competition. Unfortunately, I assume it will be blocked, just like the PCVideoserver port was blocked last year. FIRST is obviously concerned about the bandwidth usage at competition. It would be nice if QoS was used so video could be sent at a lower priority but utilize all available bandwidth.
slavik262
29-01-2010, 08:19
Good work. However, before you train drivers with this video feed available, you might want to ask FIRST Q&A if UDP traffic will be blocked by FMS at competition. Unfortunately, I assume it will be blocked, just like the PCVideoserver port was blocked last year. FIRST is obviously concerned about the bandwidth usage at competition. It would be nice if QoS was used so video could be sent at a lower priority but utilize all available bandwidth.
Already being done. Yesterday we submitted a question to the Q&A forums. This system won't use any more bandwidth than the WPI version (in fact, it will probably use less, due to the lighter-weight nature of UDP and the fact that this system doesn't re-transmit packets that get dropped or lost).
TheDominis
29-01-2010, 21:54
The source code can be found by using svn.
svn checkout http://frc-video-collab.googlecode.com/svn/trunk/ frc-video-collab-read-only
The C++ code will be updated tomorrow morning at around 10:30 EST.
slavik262
01-02-2010, 11:28
The GDC got back to us, and their answer is, in a word, no:
http://forums.usfirst.org/showthread.php?t=14284
We'll have to see what this update brings before moving forward.
It looks like what they really said is no to UDP... can you make it work over TCP?
slavik262
01-02-2010, 14:14
It looks like what they really said is no to UDP... can you make it work over TCP?
Sure, but we think that a lot of the slowness was caused by TCP. For example, if a frame of the video gets corrupted in transit, it's not a big deal over UDP, especially at a high framerate: you just drop the frame and wait for the next one (no big deal if you're getting a good 20 of them every second - the user will barely notice). If the same thing happens to a TCP packet, the receive() call blocks the receiving thread, a request is sent to retransmit the packet, and everything waits for it to arrive again, all for just one frame (or even just one part of a frame). The entire system, which could be better spending its time displaying the next frames, waits on one frame instead of moving on.
We're also aware that some of the latency is caused by the Classmate not being fast enough to keep up with the cRIO, but TCP doesn't help in this respect.
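The drop-a-frame policy described above boils down to a few lines: keep only the newest complete frame and discard anything stale, instead of letting one lost packet stall the stream the way a blocking TCP receive would. This `LatestFrame` class is purely illustrative and not part of the posted code:

```cpp
#include <cstdint>
#include <optional>

// Purely illustrative "latest frame wins" policy for a UDP video receiver:
// late or out-of-order frames are dropped instead of stalling the display.
class LatestFrame {
public:
    // Called when a frame finishes reassembling.
    void Deliver(uint16_t frameId) {
        if (latest_ && frameId <= *latest_) return;  // stale frame: drop it
        latest_ = frameId;                           // newer frame: show it
    }
    std::optional<uint16_t> Current() const { return latest_; }
private:
    std::optional<uint16_t> latest_;
};
```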
TheDominis
01-02-2010, 14:26
They said no to UDP on port 1180. This makes perfect sense, as our packets would corrupt the driver station's packets. Currently, the video server runs on port 1234 (25 FPS at 160x120, resized to 640x480). However, I see a lot of erroneous packets as the robot gets farther away.
TCP hates not being acknowledged...
The GDC did not say that UDP can't be used on another port. They do say custom TCP protocols are allowed, but not UDP; they don't specify whether that applies to all ports or just 1180.
slavik262
01-02-2010, 14:36
We'd need to have clearance to use a port though, because the FMS firewalls off any ports not cleared for use by the robot and driver station.
Also, the video feed and the rest of the driver station data run on different ports. The video uses a TCP connection to port 1180, while the rest of the dashboard data sends 1018-byte packets through UDP on port 1165. They wouldn't corrupt each other at all.
I'll have to post the whitepaper I made about the rest of the dashboard data some time.
The GDC did not say that UDP can't be used on another port. They do say custom TCP protocols are allowed, but not UDP; they don't specify whether that applies to all ports or just 1180.
You should ask for official clarification on this, but my understanding is that UDP is not provisioned on any port on the FMS except for the official control and status packets, and you certainly can't use those.
I'll have to post the whitepaper I made about the rest of the dashboard data some time.
That would be great!
Sure, but we think that a lot of the slowness was caused by TCP. For example, if a frame of the video gets corrupted in transit, it's not a big deal over UDP, especially at a high framerate: you just drop the frame and wait for the next one (no big deal if you're getting a good 20 of them every second - the user will barely notice). If the same thing happens to a TCP packet, the receive() call blocks the receiving thread, a request is sent to retransmit the packet, and everything waits for it to arrive again, all for just one frame (or even just one part of a frame). The entire system, which could be better spending its time displaying the next frames, waits on one frame instead of moving on.
I believe you (or someone else) mentioned sending video over the UserData dashboard data mechanism last year... it is implemented as UDP transfers, with only marginally smaller packets than you get with raw datagrams. I'm not aware of any changes that would have made this stop working. Is this method no longer feasible or desirable for some reason? It has about 47 kBytes/s of throughput.
-Joe
TheDominis
01-02-2010, 16:56
I believe you (or someone else) mentioned sending video over the UserData dashboard data mechanism last year... it is implemented as UDP transfers, with only marginally smaller packets than you get with raw datagrams. I'm not aware of any changes that would have made this stop working. Is this method no longer feasible or desirable for some reason? It has about 47 kBytes/s of throughput.
-Joe
Our team implemented this last year. This year we were trying to get a good video stream at high FPS with good quality. Each frame at 0 compression takes 5 packets; that's 10 FPS. We tried a high compression setting, around 70-80, and got 16 FPS, but the quality wasn't usable for image tracking.
The stream uses 100 KB/s at 160x120, and we resize it to 640x480. It actually looks good.
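The arithmetic behind the 10 FPS figure: the UserData channel sends one chunk per control packet, so at the standard 50 Hz control rate a frame that needs 5 packets tops out at 50 / 5 = 10 FPS. A hypothetical helper, assuming the 1018-byte packet size mentioned earlier in the thread:

```cpp
#include <cstddef>

// Packets needed per frame, given frame size and per-packet payload capacity.
std::size_t PacketsPerFrame(std::size_t frameBytes, std::size_t payloadBytes) {
    return (frameBytes + payloadBytes - 1) / payloadBytes;  // ceiling division
}

// Maximum frame rate when exactly one packet is sent per control cycle.
double MaxFps(double packetRateHz, std::size_t packetsPerFrame) {
    return packetRateHz / static_cast<double>(packetsPerFrame);
}
```

For example, a ~5 KB uncompressed-quality JPEG split over 1018-byte payloads needs 5 packets, which at 50 packets/s is exactly the 10 FPS ceiling described above.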
slavik262
01-02-2010, 17:01
I believe you (or someone else) mentioned sending video over the UserData dashboard data mechanism last year... it is implemented as UDP transfers, and only marginally smaller packets that you get with raw datagrams. I'm not aware of any changes that would have made this stop working. Is this method no longer feasible or desirable for some reason? This method has about 47kBytes/s throughput.
-Joe
We could definitely try using the user data to send video; the obvious drawback is it leaves less room for other data. From the testing my team has done with the user data stream, the data it sends is often redundant, so the actual useful bytes/second value is lower than the potential.
We could definitely try using the user data to send video; the obvious drawback is it leaves less room for other data. From the testing my team has done with the user data stream, the data it sends is often redundant, so the actual useful bytes/second value is lower than the potential.
I only see redundant data sent if the user data is updated by a task that is not synchronized with the control packets.
slavik262
02-02-2010, 08:37
We'll have to check it out. Right now we're putting this project on the backburner for a week or so until this supposed update comes out.
We'll have to check it out. Right now we're putting this project on the backburner for a week or so until this supposed update comes out.
Have you tried out the new update yet?
TheDominis
05-02-2010, 18:13
I get about 150-250ms delay on the new camera.
I get about 150-250ms delay on the new camera.
Is that something that you are happy with? It seems to be a lot better than you were getting before. How about frame rate?
Also, if you look at the code and have any suggestions for improvement, please let me know.
Joe
PranavSathy
06-02-2010, 18:57
Pretty nice. I don't mean to be the killjoy here, but I hope you know you are not ALLOWED to use UDP to send packets. I believe that post is found here on Chief Delphi somewhere; search around. Either way, I am sure you cannot use UDP.
TheDominis
06-02-2010, 22:59
Pretty nice. I don't mean to be the killjoy here, but I hope you know you are not ALLOWED to use UDP to send packets. I believe that post is found here on Chief Delphi somewhere; search around. Either way, I am sure you cannot use UDP.
We posted the question to the GDC, so we know :*(.
vBulletin® v3.6.4, Copyright ©2000-2017, Jelsoft Enterprises Ltd.