Network Tables

Yeah, I know everybody has asked for this before, but I really need a NetworkTables library for C++. I’d used pynetworktables and had fully functioning code in Python, but the latency while using an Axis IP cam (even while connected to my laptop’s Ethernet port) is too dang high! When I tried C++, it worked like a charm, but there’s no NetworkTables implementation for sending simple values such as x and y coordinates. Network sockets are sort of intimidating, and I haven’t the faintest idea how to implement them on our DS (with C++ and the OpenCV library) and the robot (using Java). I would be truly grateful if someone could send me part of their sockets code, just to get an idea of how to write it.
Thanks in advance! :smiley:

How high is the latency? We haven’t had any significant problems with it, and all of our code is implemented in python (robot + image processing + custom dashboard).

I’d reckon about 3 seconds of latency… my FPS is great, it’s just that there has to be some problem with the buffering :frowning:


import cv2
import urllib 
import numpy as np

stream=urllib.urlopen('http://10.25.76.11/mjpg/video.mjpg')
bytes=''
while True:
    bytes += stream.read(16384)
    a = bytes.find('\xff\xd8')   # JPEG start-of-image (SOI) marker
    b = bytes.find('\xff\xd9')   # JPEG end-of-image (EOI) marker
    if a != -1 and b != -1:
        jpg = bytes[a:b + 2]     # one complete JPEG frame
        bytes = bytes[b + 2:]    # keep the remainder for the next read
        i = cv2.imdecode(np.fromstring(jpg, dtype=np.uint8), cv2.CV_LOAD_IMAGE_COLOR)
        cv2.imshow('i', i)
        if cv2.waitKey(1) == 27:
            exit(0)
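A quick aside on the marker-scanning in the snippet above: every JPEG frame in an MJPEG stream begins with the SOI marker 0xFFD8 and ends with the EOI marker 0xFF D9, so complete frames can be carved out of the raw byte buffer as chunks arrive. A minimal standalone illustration (no camera needed; the frame payloads here are made up):

```python
# Each JPEG frame in an MJPEG stream is bracketed by the SOI (0xFFD8)
# and EOI (0xFFD9) markers, so complete frames can be cut out of the
# raw byte buffer as it fills.
buf = b'garbage\xff\xd8<frame one>\xff\xd9\xff\xd8<partial fra'
a = buf.find(b'\xff\xd8')          # start of the first complete frame
b = buf.find(b'\xff\xd9')          # end of the first complete frame
jpg = buf[a:b + 2]                 # the frame, markers included
buf = buf[b + 2:]                  # keep the partial tail for the next read
print(jpg)
print(buf)
```

Anything before the first SOI (stale bytes, half a frame from a previous read) is discarded along with the extracted frame, which is what keeps the buffer from growing without bound.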


I got the code from StackOverflow. I had tried using a simple VideoCapture(ip), and got it to work on C++, but Python just throws up some errors:


Traceback (most recent call last):
  File "C:\Users\Lucas\Documents\opencv\opencv\webcam.py", line 7, in <module>
    cv2.imshow("window", img)
error: C:\slave\WinInstallerMegaPack\src\opencv\modules\core\src\array.cpp:2482: error: (-206) Unrecognized or unsupported array type

And when I print out the value of “ret” (the first value returned by VideoCapture::read()), I get False, which indicates that there is no image being captured (duh).

Any ideas?

I use this:


        vc = cv2.VideoCapture()
        vc.set(cv2.cv.CV_CAP_PROP_FPS, 1)
        
        if not vc.open('http://%s/mjpg/video.mjpg' % self.camera_ip):
            return

        while True:
            retval, img = vc.read(buffer)
            ... 

One bug present in OpenCV that hasn’t been fixed can be found on their bug tracker here: http://code.opencv.org/issues/2877 . If you compile your own OpenCV you can patch the bug. I’ve been meaning to patch it in a better way but haven’t done so yet, as I don’t have an axis camera easily available for testing.

Ok so I tried this:


import cv2
import numpy as np
import time

camera_ip = "10.25.76.11"

vc = cv2.VideoCapture()
vc.set(cv2.cv.CV_CAP_PROP_FPS, 1)
        
if not vc.open('http://%s/mjpg/video.mjpg' % camera_ip):
    time.sleep(0)

while True:
    retval, img = vc.read(buffer)
    cv2.imshow("img", img)
    if cv2.waitKey(20) == 27:
        break

cv2.destroyAllWindows()
exit(0)

And got this:


Traceback (most recent call last):
  File "C:/Users/Lucas/Desktop/cdch/render_stream3.py", line 14, in <module>
    retval, img = vc.read(buffer)
TypeError: <unknown> is not a numpy array

So truth be told, I’m not quite sure what’s going on… :confused:

And yeah, I’d read somewhere that ffmpeg could sometimes be the source of the problem, but I’m on windows and haven’t the faintest idea on how to compile from source…

I’m very sorry, Dustin, if the problem is too obvious, but I’ve been struggling with this for weeks and my team REALLY needs it for the championships…

Sorry, I copy/pasted that incorrectly. You should remove the buffer from the vc.read() call for initial testing. buffer happens to be a Python built-in, and I was using it as a variable name so I wasn’t allocating a new image buffer each time. It was actually something like…



            h = vc.get(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT)
            w = vc.get(cv2.cv.CV_CAP_PROP_FRAME_WIDTH)

            capture_buffer = np.empty(shape=(int(h), int(w), 3), dtype=np.uint8)

            while True:
                retval, img = vc.read(capture_buffer)

Yeaaaaah…same error…


Traceback (most recent call last):
  File "C:\Users\Lucas\Desktop\cdch\render_stream3.py", line 19, in <module>
    cv2.imshow("img", img)
error: C:\slave\WinInstallerMegaPack\src\opencv\modules\core\src\array.cpp:2482: error: (-206) Unrecognized or unsupported array type

Using this code:


import cv2
import numpy as np
import time

camera_ip = "10.25.76.11"

vc = cv2.VideoCapture()
vc.set(cv2.cv.CV_CAP_PROP_FPS, 1)
        
if not vc.open('http://%s/mjpg/video.mjpg' % camera_ip):
    time.sleep(0)

h = vc.get(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT)
w = vc.get(cv2.cv.CV_CAP_PROP_FRAME_WIDTH)
capture_buffer = np.empty(shape=(int(h), int(w), 3), dtype=np.uint8)

while True:
    retval, img = vc.read(capture_buffer)
    cv2.imshow("img", img)
    if cv2.waitKey(20) == 27:
        break

cv2.destroyAllWindows()
exit(0)

Interesting. Don’t pass it a capture buffer then, and see what happens.

Wait, the error is on the imshow. Odd. What type/shape is the image?


print img
print img.shape
print img.dtype

With or without the buffer, there’s a problem with the capture: retval comes back False and img is None, so I can’t print img.shape or img.dtype.


AttributeError: 'NoneType' object has no attribute 'shape'

Odd. That’s very strange. Must be something with the way your ffmpeg is compiled.

Well, if Windows 8.1 or OpenCV doesn’t include ffmpeg by default, then I should go and install it…

Oh that’s right! You have to copy the ffmpeg DLL to C:\Python27 for it to work correctly. It should be included with the opencv binary distribution. It’s rather odd to me that they’re separate. It’ll be called something like ‘opencv_ffmpeg2xx.dll’

Did you try SmartCppDashboard? It’s on FIRST Forge with full source for NetworkTables using Winsock2. It also uses a special build of ffmpeg to support H.264 and minimize latency… I can get the link later once I get to a PC.


Ok Dustin, so I copied the .dll and discovered that I had an old .dll in there (246. I’m using 248). So it’s still not working, but I did discover in the source folder that there’s an ffmpeg folder in the 3rdparty folder which include a make file and dll files. So I’m gonna give that a try.


James, that would be awesome! Please post the link as soon as you can. Thanks man!

Here’s a demo of everything… and in it is a link to the FIRST Forge project. I’ll include it here for convenience:
http://firstforge.wpi.edu/sf/projects/smartcppdashboard

Now… this code is due for an update, but we are just about ready to head out to the Lone Star regional… so after that I may get the code all updated, probably in the next two weeks, but in its current state it should get you going pretty well.

Here is a sneak peek of what the newer stuff can do (not yet checked in there): https://www.youtube.com/watch?v=fccxxlvMqY0

I think he’s talking about this: http://firstforge.wpi.edu/sf/sfmain/do/viewProject/projects.smartcppdashboard

The opencv files should be structured something like so:


C:\Python27\opencv_ffmpeg2xx.dll
C:\Python27\lib\site-packages\cv2.pyd

I custom compiled my version of ffmpeg, but at this point I don’t recall if I had to do something special to get MJPG support. I’ll have to go find the source tree if I still have it…

We had to install ffdshow as well to get loading from a file working on windows.

So it really seems as though you were in the same boat as me three weeks ago. There is one very easy fix to the problem you are facing, but it requires kinda out-of-the-box thinking :wink:
Through my VC++ OpenCV adventure, I learned that processor-intensive routines can run slowly enough that the processing rate drops below the capture rate. I believe I understand your setup: you are using an MJPEG stream with VideoCapture. I have two solutions for you that should be relatively easy to apply.
If you must stay with the network MJPEG stream approach:
–Decrease the capture rate of the camera to the MAX rate your vision software can sustain. This keeps lag from piling up, and if for some reason the software runs a bit faster, it will just wait for the next frame.
–THREAD YOUR GRABBER
The second option is what saved me. I was getting 30 seconds of lag even though I was running at 5 fps. The idea is to run the grabber in parallel. You seem pretty decent at C++, so you should be able to figure it out; try <thread> and <mutex>.
What you want to do is wait for the next frame to become available from the camera and download it immediately. Do not do this on the main thread, because it will make the entire program wait and brick up your effort. After the frame grab, hand the data off to a shared Mat. In your processing loop, copy that shared Mat into a local Mat so the grabber can reuse the resource as quickly as possible! Then perform the operations you want. This way, old frames are constantly discarded as new frames arrive.
Use both of the options together. The first alone will make no difference; using both is when you get efficient and get rid of the lag.
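To make the threaded-grabber idea concrete, here is a rough sketch in Python (to match the rest of the thread; the same structure works in C++ with std::thread and std::mutex). The FrameGrabber class and the fake_read function are my own names for illustration — in a real vision loop you would pass cv2.VideoCapture’s read method instead of the fake camera:

```python
import threading
import time

class FrameGrabber(object):
    """Grabs frames on a background thread, keeping only the newest one."""

    def __init__(self, read_fn):
        # read_fn is any callable returning (ok, frame) --
        # e.g. a cv2.VideoCapture instance's .read method.
        self._read = read_fn
        self._lock = threading.Lock()
        self._frame = None
        self._running = True
        self._thread = threading.Thread(target=self._loop)
        self._thread.daemon = True
        self._thread.start()

    def _loop(self):
        while self._running:
            ok, frame = self._read()
            if ok:
                with self._lock:
                    # Overwrite the previous frame: stale frames get
                    # discarded instead of queueing up as lag.
                    self._frame = frame

    def latest(self):
        with self._lock:
            return self._frame

    def stop(self):
        self._running = False
        self._thread.join()

# Demo with a fake ~100 fps "camera" so the sketch is self-contained;
# with OpenCV you would construct FrameGrabber(vc.read) instead.
counter = [0]

def fake_read():
    time.sleep(0.01)
    counter[0] += 1
    return True, counter[0]

grabber = FrameGrabber(fake_read)
time.sleep(0.2)          # the processing loop would do real work here
frame = grabber.latest() # always the most recent frame, never a backlog
grabber.stop()
print(frame)
```

The key design point is that the grabber thread never blocks on the processing code: it keeps draining the camera as fast as frames arrive, so the slow consumer always sees the freshest frame rather than one from many seconds ago.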

I have been paying attention to your posts lately and I believe you have a PandaBoard. Snag an inexpensive USB webcam and that should eliminate most of the lag you are experiencing!

I just gave you my two cents, so good luck!

Also, you might find it easier to implement a UDP bidirectional socket instead of NetworkTables. There are libraries for this in C++, Java and even LabView!
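As a sketch of that UDP approach, here is a minimal send/receive of x/y coordinates in Python (the same idea translates to C++ or Java socket APIs). The struct layout is my own choice for illustration, and both ends run in one process over loopback here just to keep it self-contained:

```python
import socket
import struct

# Receiver side (this would live on the robot; shown in Python for brevity).
# Binding to port 0 lets the OS pick a free port for this loopback demo --
# in a real setup you'd use a fixed port your network configuration allows.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(('127.0.0.1', 0))
rx.settimeout(2.0)
port = rx.getsockname()[1]

# Sender side (driver station): pack the x/y target as two
# network-byte-order doubles in one datagram.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
x, y = 123.5, -42.0
tx.sendto(struct.pack('!dd', x, y), ('127.0.0.1', port))

# One datagram = one complete (x, y) update; no framing needed.
data, addr = rx.recvfrom(1024)
rx_x, rx_y = struct.unpack('!dd', data)
print(rx_x, rx_y)

tx.close()
rx.close()
```

Because UDP is connectionless, a lost packet just means the robot acts on the next coordinate update instead — usually fine for streaming targeting data, unlike commands that must not be dropped.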

I am not sure of yash101’s setup, but if UDP is used from robot to driver station, beware: it costs more bandwidth by nature… which shouldn’t matter if it is a direct connection, but if the robot is receiving packets it will need a dedicated thread listening for them from startup, due to the VxWorks issue. This thread elaborates on that and also talks about a bug fix in NetworkTables.