
Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   Programming (http://www.chiefdelphi.com/forums/forumdisplay.php?f=51)
-   -   LifeCam USBCamera changing settings from Java (http://www.chiefdelphi.com/forums/showthread.php?t=142633)

ahartnet 08-02-2016 16:02

Re: LifeCam USBCamera changing settings from Java
 
Just wanted to say thanks for reporting your findings on all of this. I've been busy taking care of some other mentor duties, and this is a lifesaver in letting some students with limited software experience still get help troubleshooting what is going on.

medofbr 13-02-2016 15:03

Re: LifeCam USBCamera changing settings from Java
 
Quote:

Originally Posted by robert1356 (Post 1536919)
Using the NIVision IMAQ commands, the min/max values returned are 5 and 20,000 respectively. I duplicated the USBCamera class (too bad they made all the variables private instead of protected) and replaced setExposureManual() with code that allowed me to set the value explicitly. It turns out that anything above about 40 or 50 will give you quite a bright image. I'm actually using 10, I think, for the exposure and 10 for the brightness.

As for losing the settings - as long as you don't power the camera off, you will not lose the settings. It's tricky, but you can use Robot code to configure the camera, then disconnect, then run GRIP and it will have the setting you just set. There are some real caveats in all this and I think I finally have all the cases worked out. I'm probably going to post the code when I get it working.

Caveat #1 - The robot code cannot use CameraServer - if it does, GRIP will not be able to publish a stream. Unfortunately, while you CAN disconnect from the USBCamera, you CANNOT kill the CameraServer stream without rebooting the robot or manually killing the robot code.

Caveat #2 - don't forget that during development, if you use your robot code to set the settings and then launch GRIP, reloading robot code while GRIP is still running will cause your robot code to throw an exception trying to connect to the USBCamera - you have to either reboot your roboRIO or kill the GRIP process. I've written code to kill the GRIP process if the USBCamera open() method throws an exception.

When you say you can use the robot code to configure the camera, then disconnect, then run GRIP, what do you mean by "then disconnect"? Does that mean calling the closeCamera() function?
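
For reference, here is roughly what I think that setExposureManual() replacement looks like - just a sketch pieced together from reading the 2016 USBCamera source, and the IMAQdx calls and attribute names are from memory, so treat them as assumptions to verify:

Code:

// Sketch of a raw exposure setter to drop into a copy of the 2016 USBCamera class.
// Assumes com.ni.vision.NIVision and these LifeCam attribute names exist as shown -
// verify against the real USBCamera.java before trusting any of it.
private static final String ATTR_EX_MODE  = "CameraAttributes::Exposure::Mode";
private static final String ATTR_EX_VALUE = "CameraAttributes::Exposure::Value";

// id is the IMAQdx session the class already opened with IMAQdxOpenCamera()
public static void setExposureRaw(int id, long value) {
    NIVision.IMAQdxSetAttributeString(id, ATTR_EX_MODE, "Manual");
    // IMAQ reports a raw range of roughly 5 to 20,000 for the LifeCam;
    // anything above ~40-50 is already quite bright, ~10 was reported to work well.
    NIVision.IMAQdxSetAttributeI64(id, ATTR_EX_VALUE, value);
}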

kmckay 26-02-2016 20:33

Re: LifeCam USBCamera changing settings from Java
 
Quote:

Originally Posted by robert1356 (Post 1530848)
We are trying to use the LifeCam and the USBCamera class for our vision processing. One thing we need is the ability to control exposure and brightness so we get a good image going into our image processing pipeline.

So far I have been able to get an image on the SmartDashboard, provide a way to enter Exposure and Brightness values on the SmartDashboard and send them to the USBCamera class. The problem is, when I change the values, the image doesn't change brightness or exposure. I know the image is updating because I see the motion in new frames.

Here is what we're currently doing:

Startup:
Code:

USBCamera targetCam = new USBCamera("cam0");  // create connection to camera
NIVision.Image frame = NIVision.imaqCreateImage(NIVision.ImageType.IMAGE_RGB,0); // create frame buffer
targetCam.openCamera(); // open the camera connection
targetCam.startCapture(); // start the frame capturing process (internal to USBCamera)

Loop:
Code:

targetCam.getImage(frame);  // retrieve a frame from the USBCamera class
CameraServer.getInstance().setImage(frame);  // push that frame to the SmartDashboard using the CamServer class

The above works as expected, and is pretty cool to boot.

Now adding the brightness/exposure control to the loop
Loop:
Code:

int exposure = Preferences.getInstance().getInt("camExposure", 50);
int brightness = Preferences.getInstance().getInt("camBrightness", 50);

if (brightness <= 100 && brightness >=0)
  targetCam.setBrightness(brightness);

if (exposure <=100 && exposure >= 0)
  targetCam.setExposureManual(exposure);

targetCam.updateSettings();

int updatedBrightness = targetCam.getBrightness();
SmartDashboard.putNumber("Current Brightness", updatedBrightness);

targetCam.getImage(frame);  // retrieve a frame from the USBCamera class
CameraServer.getInstance().setImage(frame);  // push that frame to the SmartDashboard using the CamServer class

The updated code, with brightness and exposure added, still gives me updated frames from the camera. I can also change the brightness and exposure from the SmartDashboard, and getBrightness() will return the NEW value and display it on the SmartDashboard, so I know the USBCamera class THINKS the brightness is changing, BUT the actual brightness and exposure of the video does NOT change.

Does anyone have any experience getting this to work? Any hints, tips, or suggestions? We may have to punt on the USBCamera and switch to the Axis camera, which would be a shame since the LifeCam is so compact and "simple".

I don't suppose anyone has done something similar in C++? I tried to translate it, but it crashes the roboRIO.

robert1356 26-02-2016 22:57

Re: LifeCam USBCamera changing settings from Java
 
I gave up trying to get image processing working on the roboRIO. I ended up having really bad problems - either GRIP or the robot code was crashing and generating a core dump (a HUGE file). This core dump was taking up all the storage space on the roboRIO, which in turn would prevent the robot code from launching (you'd see exceptions that it couldn't write certain files because there was no room left on the device). I ended up moving everything to a Raspberry Pi. We now have a very stable system that works quite well and gives us all the control we need. I intend to post detailed documentation when I get a chance.

As for doing the above in C++, I did get that part of the process working in Java. C++ should be quite similar. Where is your code crashing?

kmckay 26-02-2016 23:36

Re: LifeCam USBCamera changing settings from Java
 
Quote:

Originally Posted by robert1356 (Post 1547773)
I gave up trying to get image processing working on the roboRIO. I ended up having really bad problems - either GRIP or the robot code was crashing and generating a core dump (a HUGE file). This core dump was taking up all the storage space on the roboRIO, which in turn would prevent the robot code from launching (you'd see exceptions that it couldn't write certain files because there was no room left on the device). I ended up moving everything to a Raspberry Pi. We now have a very stable system that works quite well and gives us all the control we need. I intend to post detailed documentation when I get a chance.

As for doing the above in C++, I did get that part of the process working in Java. C++ should be quite similar. Where is your code crashing?

Our code isn't crashing, but we get a whitewashed image. I can adjust the camera settings in the Microsoft camera utility, but they don't reliably stick. With the brightness turned down, we have great results, but after some random number of power cycles or code reloads, the settings revert.

jwatson12 03-03-2016 09:47

Re: LifeCam USBCamera changing settings from Java
 
Hello, any more updates on this thread? I was reading in another post that GRIP will not work with the LifeCam. Were you able to get past this without using the Axis cam and get targeting data into NetworkTables?

robert1356 03-03-2016 10:23

Re: LifeCam USBCamera changing settings from Java
 
Quote:

Originally Posted by jwatson12 (Post 1550790)
Hello, any more updates on this thread? I was reading in another post that GRIP will not work with the LifeCam. Were you able to get past this without using the Axis cam and get targeting data into NetworkTables?

On the roboRIO, it would work with the LifeCam. However, I moved to the RPi using the instructions from the GRIP wiki and a lot of my own discovery. I need to post everything, but haven't had the time. Bottom line - it works pretty well on the Pi.

I have not actually tested connecting to a USB cam - I just assumed the instructions were correct that GRIP on the Pi does not work with USB cams. I set up mjpg-streamer and configured GRIP to connect to port 5800 (I configured the streamer to stream on 5800, not 1180 like the instructions say, because the publish module in GRIP publishes on 1180 and creates a conflict). You can use v4l2-ctl to adjust all of the camera settings - lots of control, and it does it on the fly, without having to stop the stream. This means I can look at the stream in a browser and adjust the settings exactly as needed.

I have some improvements to make - I would like to get the USB cam working because I'd like to eliminate the lag of the streamer. That's a summary - if you have any specific questions, post them and I'll try to answer them.
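
To give a concrete idea of the setup, the streamer launch and the on-the-fly tweaks look roughly like this - treat it as a sketch, since the exact flags, device node, resolution, and control names are from memory (check v4l2-ctl --list-ctrls for what your camera actually reports):

Code:

# Start mjpg-streamer on port 5800 so it doesn't collide with GRIP's publish port (1180)
mjpg_streamer -i "input_uvc.so -d /dev/video0 -r 320x240 -f 15" \
              -o "output_http.so -p 5800" &

# See which controls the LifeCam exposes, then adjust them while the stream is running
v4l2-ctl -d /dev/video0 --list-ctrls
v4l2-ctl -d /dev/video0 --set-ctrl=brightness=30
v4l2-ctl -d /dev/video0 --set-ctrl=exposure_auto=1        # 1 = manual exposure
v4l2-ctl -d /dev/video0 --set-ctrl=exposure_absolute=10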

jwatson12 03-03-2016 10:50

Re: LifeCam USBCamera changing settings from Java
 
Thanks for the quick reply. We are using the LifeCam with the roboRIO and having trouble getting NetworkTables to update after publishing from GRIP. We're getting "HSL Threshold needs a 3-channel input". When we open the Outline Viewer we see the report but no coordinates. We are publishing the contour report with images and trying with the webcam. Both result in no NetworkTables updates. Any feedback is appreciated.

robert1356 03-03-2016 11:15

Re: LifeCam USBCamera changing settings from Java
 
Quote:

Originally Posted by jwatson12 (Post 1550817)
Thanks for the quick reply. We are using the LifeCam with the roboRIO and having trouble getting NetworkTables to update after publishing from GRIP. We're getting "HSL Threshold needs a 3-channel input". When we open the Outline Viewer we see the report but no coordinates. We are publishing the contour report with images and trying with the webcam. Both result in no NetworkTables updates. Any feedback is appreciated.

I don't understand the HSL threshold error - I still see that, but it works fine. I think it's a sequencing bug (starting the threshold process before they provide a valid input image). You should see a message later in the traces that indicates that errors have been cleared and everything is normal.

What do you mean "publishing the contour report WITH IMAGES"?

I did see that sometimes I would have to restart the Outline Viewer to get it to see the GRIP changes. Network Table weirdness.

Step back and do things one step at a time. Make a GRIP pipeline that just pulls in the Lifecam, publish the video and the framerate, make sure that works. Add pieces and publish, etc.

BE CAREFUL ABOUT BANDWIDTH and CPU UTILIZATION.
320x240 was definitely slow on the roboRIO (all we did was an HSL threshold and contour), so we dropped to 160x120. One problem with GRIP is that you can't get the camera to generate a smaller frame size; you have to do a resize in GRIP, and this is wasteful and bad. GRIP is pulling in a 720p or 640x480 image (not sure which), and then you resize in GRIP to something that you can operate on and transmit to the driver station. Resize is an expensive operation, especially if you do one of the interpolations.

You can check CPU utilization by logging into the roboRIO (from a terminal / command line) and typing:
top

Look at the Java process running GRIP.

On the RPi I had the CPU pegged - GRIP was taking 2/3 of the processor and mjpg-streamer was taking 1/3. I dropped the frame rate and frame size to get the CPU utilization down to about 80%.

What we tried on the roboRIO ---

What I was originally doing was:
HSL threshold
publish frame rate right out of the source (that let me know that it was actually generating frames)
contour
contour filter
publish contour report (reading it back on the robot side is sketched after this list)
created a mask with original image and contours
published the mask
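
Once the contour report is publishing, reading it back on the robot side is just a NetworkTables read - roughly like this (a sketch; "GRIP/myContoursReport" and the key names are the GRIP defaults, so adjust them to match whatever you named your publish step):

Code:

// Sketch: read the GRIP contours report on the robot (2016 NetworkTables API).
// Needs edu.wpi.first.wpilibj.networktables.NetworkTable; table/key names assume GRIP defaults.
NetworkTable grip = NetworkTable.getTable("GRIP/myContoursReport");

double[] none = new double[0];
double[] centerX = grip.getNumberArray("centerX", none);
double[] centerY = grip.getNumberArray("centerY", none);
double[] area    = grip.getNumberArray("area", none);

// Each index is one contour; pick the largest area as the target
int best = -1;
for (int i = 0; i < area.length; i++) {
    if (best < 0 || area[i] > area[best]) {
        best = i;
    }
}
if (best >= 0 && best < centerX.length && best < centerY.length) {
    SmartDashboard.putNumber("Target X", centerX[best]);
    SmartDashboard.putNumber("Target Y", centerY[best]);
}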

It worked fine when we would run manually and deploy robot code (basically during testing). But when I added the code to have the robot code launch GRIP, we started seeing memory issues and, fairly regularly, it would crash the JVM, generating a core-dump file in the process. Core-dump files are HUGE and they would eat up all the device storage space. When the robot code tried to relaunch, it would hang because it attempts to create some files (preferences, for example) and it couldn't because the file system was out of space. This is a VERY bad situation - robot code won't run. When this happened, I had to manually delete the core-dump files to get robot code running again. This is why I switched to the Pi - too risky to have the robot code / JVM crash in the middle of a match.
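
For anyone wondering what "have the robot code launch GRIP" looked like, it was basically a ProcessBuilder call in robotInit(), along these lines - a sketch, and the JRE and jar paths here are from memory, so check your own deploy locations:

Code:

// Sketch: launch GRIP from robotInit() - this is the setup that eventually caused
// the JVM crashes / core dumps described above, so use with care.
// The JRE and jar paths are assumptions; verify where the FRC JRE and grip.jar live.
try {
    new ProcessBuilder("/usr/local/frc/JRE/bin/java", "-jar",
            "/home/lvuser/grip.jar", "/home/lvuser/project.grip")
        .inheritIO()   // send GRIP's output to the robot program's console
        .start();
} catch (java.io.IOException e) {
    e.printStackTrace();
}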

robert1356 03-03-2016 11:17

Re: LifeCam USBCamera changing settings from Java
 
BTW, make sure you have the latest version of GRIP. They're up to 1.3.1 now. We are currently using 1.2.1. I know 1.1.1 had problems.

jwatson12 03-03-2016 12:10

Re: LifeCam USBCamera changing settings from Java
 
Thanks again. When I said I updated the Rio with images, I meant the input source in GRIP was all of the Stronghold images. I tried to publish this way and then with the LifeCam as the source. Both gave me issues. Once we set up the hue, etc., are we supposed to publish from GRIP with the images or the webcam as the source? As you can tell, this is our first time using GRIP.

brk 20-03-2016 22:23

Re: LifeCam USBCamera changing settings from Java
 
Quote:

Originally Posted by jwatson12 (Post 1550817)
Thanks for the quick reply. We are using the LifeCam with the roboRIO and having trouble getting NetworkTables to update after publishing from GRIP. We're getting "HSL Threshold needs a 3-channel input". When we open the Outline Viewer we see the report but no coordinates. We are publishing the contour report with images and trying with the webcam. Both result in no NetworkTables updates. Any feedback is appreciated.

I found that problem as well. We received good info when connected to a laptop, but then nothing on the roboRIO. I found I had to hack into the project.grip file and update the settings there and, with trial and error, got it working. I think solidity was one of the keys... make it 0-100 (full scale).

brk 20-03-2016 22:26

Re: LifeCam USBCamera changing settings from Java
 
Quote:

Originally Posted by robert1356 (Post 1550831)
I don't understand the HSL threshold error - I still see that, but it works fine. I think it's a sequencing bug (starting the threshold process before they provide a valid input image). You should see a message later in the traces that indicates that errors have been cleared and everything is normal.

What do you mean "publishing the contour report WITH IMAGES"?

I did see that sometimes I would have to restart the Outline Viewer to get it to see the GRIP changes. Network Table weirdness.

Step back and do things one step at a time. Make a GRIP pipeline that just pulls in the Lifecam, publish the video and the framerate, make sure that works. Add pieces and publish, etc.

BE CAREFUL ABOUT BANDWIDTH and CPU UTILIZATION.
320x240 definitely was slow on the roboRIO. (all we did was a HSL threshold and contour)
we dropped to 160x120. One problem with GRIP is that you can't get the camera to generate a smaller frame size, you have to do a resize in GRIP - this is wasteful and bad. Grip is pulling in a 720p or 640x480 image (not sure which) and they you resize in GRIP to something that you can operate on and transmit to the driver station. Resize is an expensive operation, especially if you do one of the interpolations.

You can check CPU Utilization by logging into the roboRIO (from a terminal / command line) and typing:
top

look at the java process with GRIP.

On the RPi I had the CPU pegged, GRIP was taking 2/3 processor, jpg-streamer was taking 1/3. I dropped the frame rate and frame size to get the CPU utilization down to about 80%

What we tried on the roboRIO ---

What I was originally doing was:
HSL threshold
publish frame rate right out of the source (that let me know that it was actually generating frames)
contour
contour filter
publish contour report
created a mask with original image and contours
published the mask

It worked fine when we would run manually and deploy robot code (basically during testing). But when I added the code to have the robot code launch GRIP, we started seeing memory issues and fairly regularly, it would crash the JVM, generating a core-dump file in the process. Core dump files are HUGE and it would eat up all the device storage space. When the robot code tried to relaunch, it would hang because it attempts to create some files (preferences for example) and it couldn't because the file system was out of space. This is a VERY bad situation - robot code won't run. When this happened, I had to manually delete the core-dump files to get robot code running again. This is why I switched to the Pi - too risky to have the robot code / JVM crash in the middle of a match.


This makes me nervous... I haven't filled the filesystem, but the executable crashes with an out-of-memory error. I didn't see a core file. Before GRIP, I see only 25 MB free, so I know it's close.

brk 20-03-2016 22:29

Re: LifeCam USBCamera changing settings from Java
 
Does anyone know how long it takes to process a pipeline? Is there a way to find out the publish rate to NetworkTables?

