Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   NI LabVIEW (http://www.chiefdelphi.com/forums/forumdisplay.php?f=182)
-   -   Processed Image > Robot Movement Help Needed (http://www.chiefdelphi.com/forums/showthread.php?t=124709)

BenGrapevine 15-01-2014 17:03

Processed Image > Robot Movement Help Needed
 
I have spent the last week or so modifying the dashboard to be able to process the vision system for this year's game.



As you can see above, the dashboard is able to take our live incoming image and give back the specific scores pertaining to left, right, etc. We can also see the distance to the wall (not pictured because it's currently cut off, and the math is also a little off).

With that information in hand, I'm completely stumped on how to integrate it into the teleop or autonomous VIs and make the robot act on it (such as move to 10 feet from the wall, turn 60 degrees to the left, etc.) based on the score or distance. If anyone considers themselves a vision guru, any help with pictured examples, a PDF, or something like that would be fantastic.

faust1706 15-01-2014 17:26

Re: Processed Image > Robot Movement Help Needed
 
Quote:

Originally Posted by BenGrapevine (Post 1327720)
I have spent the last week or so modifying the dashboard to be able to process the vision system for this year's game. [...]

To my understanding, you are getting distance to the wall like you said. My team has used closed-loop control (PID) to do similar tasks, such as moving a certain distance. A somewhat older CD thread discusses a mentor's book about LabVIEW, and in it he goes into detail about PID control. I think that would be a great starting point for the problem at hand.

http://www.chiefdelphi.com/forums/sh....php?p=1297945
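
In text form, the closed-loop idea is roughly this minimal proportional-only sketch (Python, since LabVIEW diagrams can't be pasted inline; the gain, setpoint, and tolerance are made-up numbers you would tune on the robot):

Code:

KP = 0.15            # proportional gain (tune on the robot)
TARGET_FT = 10.0     # desired distance from the wall, in feet
TOLERANCE_FT = 0.25  # stop when within a quarter foot of the setpoint

def drive_command(measured_distance_ft):
    """Return a forward/backward motor command in [-1, 1] from the vision distance."""
    error = measured_distance_ft - TARGET_FT
    if abs(error) < TOLERANCE_FT:
        return 0.0                       # close enough: stop
    command = KP * error                 # too far -> drive forward, too close -> back up
    return max(-1.0, min(1.0, command))  # clamp to the valid motor range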

BenGrapevine 15-01-2014 17:28

Re: Processed Image > Robot Movement Help Needed
 
Alright, thanks for the link; looking into it now.

BenGrapevine 15-01-2014 17:44

Re: Processed Image > Robot Movement Help Needed
 
Okay, to be more specific:

How do I take information created in one VI project and pass it to another?

e.g., take the scores from dashboard.lvproj and transmit them to robotcode.lvproj (teleop.vi and the like).

I get that once the information is in the VI, the possibilities with PID and so on are wide open, but I can't seem to get there yet.

Phalanx 15-01-2014 23:02

Re: Processed Image > Robot Movement Help Needed
 
You can take your targeting cluster, flatten it to a string, and UDP Write it (port 1130) to the robot. On the robot, you have a UDP listener on port 1130 that reads that string and unflattens it back into a matching cluster in your robot code.
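
Sketched in text (Python standing in for the LabVIEW Flatten To String and UDP nodes; the address and the field layout of the cluster are just for illustration, port 1130 as above):

Code:

import socket
import struct

ROBOT_IP = "10.12.34.2"   # example 10.TE.AM.2-style address; use your robot's
PORT = 1130               # the team-use UDP port mentioned above

def send_target(distance_ft, is_hot, is_left):
    """Dashboard side: 'flatten' the targeting data to bytes and send it."""
    payload = struct.pack(">d??", distance_ft, is_hot, is_left)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (ROBOT_IP, PORT))

def listen_for_targets():
    """Robot side: read each packet and 'unflatten' it back into the same fields."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", PORT))
        while True:
            payload, _addr = sock.recvfrom(64)
            distance_ft, is_hot, is_left = struct.unpack(">d??", payload)
            print(distance_ft, is_hot, is_left)   # hand off to the drive code here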

Alan Anderson 15-01-2014 23:46

Re: Processed Image > Robot Movement Help Needed
 
You can transfer a value or cluster via UDP as Michael suggested. You can also use SmartDashboard variables (via the Network Tables infrastructure) to share the values between the Dashboard and the robot program.
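
For illustration only, the same idea with the pynetworktables bindings (the actual code in this thread is LabVIEW, and the key name here is invented):

Code:

from networktables import NetworkTables

# Dashboard / laptop side: connect to the robot's NetworkTables server and publish.
NetworkTables.initialize(server="10.12.34.2")   # your robot's address
sd = NetworkTables.getTable("SmartDashboard")
sd.putNumber("WallDistanceFt", 9.4)             # publish the vision result

# Robot side (a separate program): read the same key; the 0.0 default is returned
# until the dashboard has actually written something.
distance_ft = sd.getNumber("WallDistanceFt", 0.0)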

Greg McKaskle 16-01-2014 07:00

Re: Processed Image > Robot Movement Help Needed
 
Part 3 of the tutorial, step 5, shows a picture of flattening and sending the string via network tables.

To compare the choices:
1. Unbundle and write individual elements using network tables/smart dashboard:
a. This may be the simplest approach.
b. You will be restricted to the types supported by network tables.
c. It is possible that the elements do not arrive at the same time. If you write ten elements, five may be delivered now and the other five in 100ms.
2. Flatten the cluster and write it using SD/NT.
a. This is similarly simple, but involves flattening and unflattening data.
b. You must match the types exactly. An int and a float aren't the same, and order matters (see the sketch after this list).
3. Format or flatten to a string and write using TCP.
a. This is still not hard, but you need to use the TCP nodes in addition to flattening or string formatting functions. You also need to unflatten or parse the string on the other end.
b. You need to specify the correct port that will be opened on an official field.
c. This is how NT/SD works under the hood.
4. Format or flatten and write using UDP.
a. Similar to TCP, but some find UDP easier to understand.
b. UDP isn't guaranteed to arrive, though for repetitive writes, you can typically ignore this.
c. Be sure to use the correct ports for UDP, not the TCP ones.
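
To make the type-matching caveat concrete, here is a sketch with Python's struct module standing in for Flatten/Unflatten From String (the field layout is invented):

Code:

import struct

# "Flatten": pack a distance (double) and a hot-goal flag (boolean) in a fixed order.
flat = struct.pack(">d?", 9.4, True)

# "Unflatten" only works if the receiving side uses the exact same types in the
# exact same order; this ">d?" must match the ">d?" used above.
distance_ft, is_hot = struct.unpack(">d?", flat)

# Using ">i?" (int instead of double) or ">?d" (swapped order) would either raise
# an error or silently produce garbage -- the same failure mode as mismatched
# Flatten/Unflatten types in LabVIEW.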

Once the data is sent, you need to read it on the robot. You can put the SD/NT reads almost anywhere, but I'd suggest Periodic tasks. TCP and UDP should really go there and not in auto or tele code. Once you have it in periodic, reform your cluster and write it to a global. Read the global in auto or tele.
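
Sketched in text (Python, with a thread and a module variable standing in for the Periodic Tasks loop and the robot global; all names are invented):

Code:

import threading
import time

# Stand-in for the robot global: a shared value plus a lock for atomic updates.
_latest_distance_ft = 0.0
_lock = threading.Lock()

def periodic_task(read_distance):
    """Stand-in for Periodic Tasks: poll the network, then update the 'global'."""
    global _latest_distance_ft
    while True:
        value = read_distance()      # e.g. the SD/NT read plus the unflatten
        with _lock:
            _latest_distance_ft = value
        time.sleep(0.05)             # ~50 ms delay so the loop doesn't max out the CPU

def teleop_iteration(joystick_y):
    """Teleop only reads the global; it never touches the network directly."""
    with _lock:
        distance_ft = _latest_distance_ft
    return joystick_y, distance_ft   # use the vision value however teleop needs to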

Greg McKaskle

Invictus3593 16-01-2014 12:49

Re: Processed Image > Robot Movement Help Needed
 
I agree with Greg; Periodic Tasks > Global is the way to go for sure.

BenGrapevine 16-01-2014 16:51

Re: Processed Image > Robot Movement Help Needed
 
Alright, I've flattened the distance information to a string and done some poking around with SD/NT, and I'm getting pretty confused. I've got it writing a string array to a SmartDashboard table entry which, like the tutorial shows, is called "Distances". I was seeing some Network Tables VIs that were asking for the cRIO IP address to start communicating, but I know absolutely nothing about that, and I don't know how SD and NT relate to each other, if at all. Though I think I have a handle on reading it on the robot once the information is received (as far as I know; I haven't done it before), what's the next step for establishing a TCP or UDP connection between the Dashboard and the Robot? I feel like it's right there in the Network Tables palette, but I can't figure it out.

EDIT: I apologize if my battery of questions is annoying. I've spent 4 years looking into vision, it's a miracle the dashboard can see shapes as of Tuesday, and I'm excited but know little more about the subject.

Animal Control 16-01-2014 16:52

Re: Processed Image > Robot Movement Help Needed
 
This is unrelated, but do you know if we can put color sensors on the robot?

BenGrapevine 16-01-2014 17:09

Re: Processed Image > Robot Movement Help Needed
 
UPDATE: I found UDP Write, but I don't know what the inputs on the VI mean. What is the difference between Connection ID and Address?

EDIT: Man, I'm stupid; 'Detailed Help' is a lifesaver.

EDIT 2: It's not as much of a lifesaver as I thought; I still don't know what they mean.

Joe Ross 16-01-2014 17:44

Re: Processed Image > Robot Movement Help Needed
 
Quote:

Originally Posted by BenGrapevine (Post 1328193)
Alright, I've flattened the distance information to a string and done some poking around with SD/NT, and I'm getting pretty confused. [...]

Both the Dashboard and the Robot code have all of the behind-the-scenes "stuff" to send/receive SmartDashboard data. All you need to do is write on the dashboard side and read on the robot side.

Greg McKaskle 16-01-2014 18:11

Re: Processed Image > Robot Movement Help Needed
 
As Joe said, NT and SD are already in the template code. So that it isn't magic, let me answer a few of your questions.

Network Tables is a name for the networked variables. The variables have a path that organizes them hierarchically and you can think of the elements at the same level of hierarchy as a table of variables.

Smart Dashboard is really the same thing, but we put all of the variables into the /SmartDashboard/ table. This simply isolates them from some bookkeeping areas and avoids a bit of clutter.

The LabVIEW Dashboard has a loop that calls the NT Binding function. It is loop 3. It takes in a list of controls and indicators that are then bound to the identically named NT variables. The loop keeps it running even if the robot disconnects and reappears.

The LabVIEW robot template has Robot Main start the NT Server. It is just below the loop towards the bottom of the window.

With those two pieces running, you should simply need to use SD VIs to read and write data. Make sure the name and datatype matches.

On the dashboard, there is a second feature where you don't even need to use a read or write. Placing your control in the Operation or Auto tab and naming it will create a variable of that name and keep it synced with the robot, where you can read or write it. This is more useful for simple checkboxes and numerics, but is not sufficient for more complex types like the cluster you are using.

Greg McKaskle

BenGrapevine 17-01-2014 12:56

Re: Processed Image > Robot Movement Help Needed
 
Okay, I have the dashboard flattening the Distance information to a string and then sending it to SD Write String, which should put it into its hierarchy or whatever. Now in PeriodicTasks.vi I have placed an SD Read String and made it unflatten the string like this:



What do I do with this unflattened data with regard to getting it into a global and out to multiple VIs? I've never used globals before, and I can't find any documentation that covers this step of the vision processing system.

My end goal for this is to be able to see boxes (done), identify which boxes mean left and right by using a score system (done), calculate the distance the robot is from the wall (done), send that information from the dashboard to the robot code, interpret the score values it gets, and on a button press move to a specific x-y location to line up a shot. I am just starting with distance so I can at least make the robot go to a certain distance from the goal. After that, I would like to look into turning the robot to face the goal straight on, or moving to an exact spot on the field.


Any help with moving forward on this is, and will be, appreciated.

Greg McKaskle 17-01-2014 14:43

Re: Processed Image > Robot Movement Help Needed
 
The input to the unflatten node called "type" is used to describe what was flattened in the string. You need to wire the same datatype to type as you wired into the flatten node on the dashboard. It is often easiest to drag and drop directly from one window to the other to ensure they are the same. If you envision this changing, you should read up on typedefs as a means to share types between projects.

The robot project contains a file called robot global.vi. If you open that you'll see that there are already a few in use. To create another, drag your type into robot global and name it. You can now drag robot global from the project into tele op and periodic tasks and select the global to read or write.

Another thing you should do in the loop you show that reads the SD, unflattens, and updates the global is to add a delay. As written, this parallel loop requests that it run as fast as possible. This will max out the CPU. In reality, this loop probably needs to wait for 30ms or 50ms in parallel with the operations it contains. Notice that the other loops do this already with 10ms and 100ms wired up.

For more details on globals, please search the general LabVIEW help documentation.

Greg McKaskle

BenGrapevine 18-01-2014 10:31

Re: Processed Image > Robot Movement Help Needed
 
(dashboardmain.vi)


(periodictasks.vi)


Looking at the flatten to string, it appears the data coming in is a 1D array. I have no clue what you mean by dragging and dropping stuff from there to the 'type' in the unflatten from string.

Greg McKaskle 19-01-2014 10:41

Re: Processed Image > Robot Movement Help Needed
 
Actually, I'd suggest right-clicking on the Distances indicator terminal. Choose Create >> Constant. This creates a constant of the same datatype. Use the mouse to drag the constant from your dashboard VI's block diagram to the periodic VI's block diagram. If this is awkward, you can copy and paste. You can then delete the constant from the dashboard diagram.

Then wire the constant to the Type input. Next, you can unbundle the data from the unflatten, or otherwise share the data via globals.

Greg McKaskle

BenGrapevine 21-01-2014 17:32

Re: Processed Image > Robot Movement Help Needed
 
Okay, so I followed your instructions and set up the distances to work on a joystick button, like so:



On button 3, once it has checked that the distance is >= 6 feet, drive forward. This seems like it would work fine, but I have no idea why it's not calculating distance information:



As you can see on the dashboard, in both places the distance is indicated as 0. What am I doing wrong?

BenGrapevine 21-01-2014 18:04

Re: Processed Image > Robot Movement Help Needed
 
I've modified the dashboard so that the smaller "distances" panel next to the HSV control now displays hot/not hot and lights up when it is "left". What still doesn't show up is distance and target-to-center. Do I need to do something with the computing-distances VI to allow it to calculate?

Greg McKaskle 21-01-2014 21:06

Re: Processed Image > Robot Movement Help Needed
 
I'd suggest opening additional subVI panels to see what is causing the zero distance. The Calc distance is based on the camera type and is pretty simple trig. Does it make more sense now?
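
For reference, the usual trig looks roughly like this (Python sketch; the resolution, field of view, and target width below are illustrative placeholders, not necessarily what the example dashboard uses):

Code:

import math

def distance_to_target_ft(target_width_px, image_width_px=320,
                          target_width_ft=2.0, horizontal_fov_deg=47.0):
    """Rough pinhole-camera distance estimate from a detected rectangle.

    At the target's distance, the full image spans 2 * d * tan(fov/2) feet,
    and the target occupies target_width_px out of image_width_px pixels.
    """
    half_fov_rad = math.radians(horizontal_fov_deg) / 2.0
    return (target_width_ft * image_width_px) / (2.0 * target_width_px * math.tan(half_fov_rad))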

Another thing looks odd in the joystick button code you attached. It unbundles a double and compares it to 6. The double numeric seems to be coming from a constant. I suspect that should be the distance value that is being communicated and unflattened.

Greg McKaskle

BenGrapevine 25-01-2014 11:03

Re: Processed Image > Robot Movement Help Needed
 
What would you recommend doing? I redid the vision processing to make sure distance works now (which it does, thanks for the help on that).



But now I'm working on integrating it into something I can use in teleop. I would like it to read the distance value and, while both the button is pressed and the distance is greater than a certain number, drive forward until that is no longer true.

Here is what I have for teleop.vi at this point with regard to vision (now that I've started over):


Greg McKaskle 25-01-2014 20:58

Re: Processed Image > Robot Movement Help Needed
 
1 Attachment(s)
I attached an image with a few edits.

You were heading in the right direction, but making a few wrong turns.

Your code was reading "Left and Right Motors" from both RobotDrive and also from the Motors list. While this is supported, I suspect you really meant to update RobotDrive again. Please correct me if I'm wrong.

The next thing to think about is what the result is when a motor is given multiple commands during a teleop execution or in parallel portions of the app. The better approach is to combine the logic and update the motors or other actuators just once. That is what I attempted to show in my modified image. You bring the joystick and the distance info into a single case structure, and you can then update the motors in a common location using the outputs.

A slight variation that you may see or decide to use is to bring the values together and use a ternary ?: operator to select one of them to go to the motors.
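
In text form, that select/ternary variation is roughly the following (values invented; on the diagram this would be a Select node or case structure):

Code:

def motor_command(joystick_y, vision_button, distance_ft, target_ft=6.0):
    """One place per teleop iteration that decides the drive command."""
    vision_wants_control = vision_button and distance_ft > target_ft
    # Ternary/Select style: vision drives forward only while the button is held
    # and the robot is still too far away; otherwise the driver's stick wins.
    return 0.4 if vision_wants_control else joystick_y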

Greg McKaskle

BenGrapevine 27-01-2014 12:23

Re: Processed Image > Robot Movement Help Needed
 
I fixed the code and ran it on the robot, but I kept getting "Error Code 74: Unflatten from String in periodic task.vi > robotmain.vi "Memory corrupt"" whenever I moved the joystick. Am I forgetting to do something with the unflatten?




EDIT: Whoops, I wasn't sending the right data from the dashboard to the robot (I was tinkering around with different ways to send it and forgot to undo it).

EDIT 2: This is the error I'm getting when running it now:
Quote:

ERROR <Code> -44061 occurred at "Left and Right Motors" in the VI path: Robot Main.vi
<time>00:00:58 01/01/1970
FRC: The loop that contains RobotDrive is not running fast enough. This error can occur if the loop contains too much code, or if one or more other loops are starving the RobotDrive loop.

Greg McKaskle 28-01-2014 07:08

Re: Processed Image > Robot Movement Help Needed
 
That message is telling you that your RobotDrive wasn't updated for over 100ms. So the safety mechanism set it to zero for you. This is a good thing if you happen to be debugging your code or you slow it down accidentally.

Do you know which loop it is referring to? It may just be in your teleop. Do you see any reason why teleop would take a long time to finish? A common issue is to place a delay to sequence an action directly into teleop when a button is pressed or a target is seen. This causes that iteration of teleop to take a long time and the message is printed.

If you see many of these messages, it may be worth instrumenting teleop and other loops to see how often they are running. To do this, the project window has a folder called support code. The Elapsed Times VI can be placed into teleop or other loops. Then run the code on the robot in debug mode and open the Elapsed Times panel. It will show you the delta between each call. Between this and the CPU usage charts, you can identify whether loops are running faster than you need, slower than you want, etc.

Greg McKaskle

BenGrapevine 29-01-2014 21:51

Re: Processed Image > Robot Movement Help Needed
 
It works!

Greg McKaskle 29-01-2014 21:59

Re: Processed Image > Robot Movement Help Needed
 
Congrats. Care to share what the issue was or what settings you changed to get it working?

Greg McKaskle

BenGrapevine 29-01-2014 22:03

Re: Processed Image > Robot Movement Help Needed
 
Quote:

Originally Posted by Greg McKaskle (Post 1334556)
Congrats. Care to share what the issue was or what settings you changed to get it working?

Greg McKaskle

Yeah, I plan on putting together a start-to-finish tutorial ASAP. That was my main issue with doing vision: there are (great) whitepapers out there on how to complete certain steps, but I haven't seen a complete guide, along with options for things to do with your robot once it can see.

The issue was that the actual drive code was being starved of information due to a case structure that filtered the flow of data to it. The workaround was to let the drive code constantly receive information, even if it's 0. The problem was not an overload of code or something running too slowly (e.g. too much data analysis in teleop.vi). Another thing that helped was to send through only the distance value as a double, instead of the entire distance + hot? + left? + xy coords as a flattened and then unflattened string. That is still possible; we just don't see a need for all of that right now.
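
For what it's worth, the "send one double and always feed the drive something" idea looks roughly like this (pynetworktables again, purely as an illustration; the key name is invented):

Code:

from networktables import NetworkTables

sd = NetworkTables.getTable("SmartDashboard")

def publish_distance(distance_ft):
    # Dashboard side: send only the one double the robot actually needs.
    sd.putNumber("WallDistanceFt", distance_ft)

def read_distance():
    # Robot side: the 0.0 default means the drive loop gets a value on every
    # iteration, even before the first dashboard update arrives, so RobotDrive
    # is never starved.
    return sd.getNumber("WallDistanceFt", 0.0)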

