Processed Image > Robot Movement Help Needed
I have spent the last week or so modifying the dashboard so it can do the vision processing for this year's game.

*[screenshot: dashboard showing the processed image with left/right target scores]*

As you can see above, the dashboard takes our live incoming image and gives back the specific scores for left and right targets, etc. We can also see the distance to the wall (not pictured because it's currently cut off, and the math is also a little off). With that information handy, I'm completely stumped on how to integrate it into the teleop or autonomous VIs and make the robot act on the score or distance (such as moving to 10 feet from the wall, or turning 60 degrees to the left). If anyone considers themselves a vision guru, any help with pictured examples or a PDF would be fantastic.
Re: Processed Image > Robot Movement Help Needed
Quote:
http://www.chiefdelphi.com/forums/sh....php?p=1297945
Re: Processed Image > Robot Movement Help Needed
Alright, thanks for the link; looking into it now.
Re: Processed Image > Robot Movement Help Needed
Okay, to be more specific:

How do I take information created in one VI project and hand it to another? For example, taking the scores from dashboard.lvproj and transmitting them to robotcode.lvproj (Teleop.vi and the like). I get that once the information is in the VI, a world of possibilities opens up with PIDs and so on, but I can't seem to get there yet.
Re: Processed Image > Robot Movement Help Needed
You can take your targeting cluster, flatten it to a string, and UDP Write it (port 1130) to the robot. On the robot, run a UDP listener on port 1130 that reads that string and unflattens it back into an identical cluster in your robot code.
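LabVIEW is graphical, so the flatten-send-unflatten round trip can't be shown as text here, but the same idea is easy to sketch in Python. This is a minimal illustration only: the field names and values are made up, and `struct.pack` stands in for LabVIEW's Flatten To String. Port 1130 is the dashboard-to-robot UDP port mentioned above; both ends run on loopback so the sketch is self-contained.

```python
import socket
import struct

# Hypothetical targeting cluster: (distance_ft, left_score, right_score).
PORT = 1130
FMT = "!ddd"  # network byte order, three doubles (analog of Flatten To String)

# "Robot" side: listen on the agreed port.
listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", PORT))

# "Dashboard" side: flatten the cluster to bytes and send it.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
payload = struct.pack(FMT, 10.0, 87.5, 92.3)
sender.sendto(payload, ("127.0.0.1", PORT))

# "Robot" side: read the datagram and unflatten with the SAME format.
data, _addr = listener.recvfrom(1024)
distance, left, right = struct.unpack(FMT, data)
print(distance, left, right)  # prints: 10.0 87.5 92.3

listener.close()
sender.close()
```

The key point carries over directly to LabVIEW: both ends must agree on the port and on the exact shape of the flattened data.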
Re: Processed Image > Robot Movement Help Needed
You can transfer a value or cluster via UDP as Michael suggested. You can also use SmartDashboard variables (via the Network Tables infrastructure) to share the values between the Dashboard and the robot program.
Re: Processed Image > Robot Movement Help Needed
Part 3 of the tutorial, step 5, shows a picture of flattening and sending the string via network tables.
To compare the choices:

1. Unbundle and write individual elements using Network Tables / SmartDashboard:
   a. This may be the simplest approach.
   b. You are restricted to the types supported by Network Tables.
   c. The elements may not arrive at the same time. If you write ten elements, five may be delivered now and the other five 100 ms later.
2. Flatten the cluster and write it using SD/NT:
   a. Similarly simple, but involves flattening and unflattening data.
   b. You must match the types exactly. An int and a float aren't the same, and order matters.
3. Format or flatten to a string and write using TCP:
   a. Still not hard, but you need the TCP nodes in addition to the flattening or string-formatting functions. You also need to unflatten or parse the string on the other end.
   b. You need to specify a port that will actually be opened on an official field.
   c. This is how NT/SD works under the hood.
4. Format or flatten and write using UDP:
   a. Similar to TCP, but some find UDP easier to understand.
   b. UDP delivery isn't guaranteed, though for repetitive writes you can typically ignore this.
   c. Be sure to use the correct UDP ports, not the TCP ones.

Once the data is sent, you need to read it on the robot. You can put the SD/NT reads almost anywhere, but I'd suggest Periodic Tasks. TCP and UDP reads should really go there, not in the auto or tele code. Once you have the data in Periodic Tasks, reform your cluster and write it to a global. Read the global in auto or tele.

Greg McKaskle
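Point 2b above (types must match exactly, and order matters) is worth seeing concretely. Here is a small Python sketch of the same hazard using `struct` as a stand-in for LabVIEW's Flatten/Unflatten From String; the values are arbitrary:

```python
import struct

# Flatten an (int, float) pair on the sender...
payload = struct.pack("!id", 42, 3.5)

# ...unflattening with the identical format string recovers the values.
n, x = struct.unpack("!id", payload)
print(n, x)  # prints: 42 3.5

# Unflattening with a different type assumption fails outright here
# (the byte sizes no longer line up); in other cases it silently
# produces garbage, which is worse.
try:
    struct.unpack("!dd", payload)  # wrong types: expects 16 bytes, got 12
except struct.error as err:
    print("type mismatch:", err)
```

In LabVIEW the failure mode is the same in spirit: Unflatten From String will error (or return nonsense) unless the wired type matches what was flattened, element for element and in the same order.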
Re: Processed Image > Robot Movement Help Needed
I agree with Greg; Periodic Tasks > global is definitely the way to go.
Re: Processed Image > Robot Movement Help Needed
Alright, I've flattened the distance information to a string and done some poking around with SD/NT, and I'm getting pretty confused. I've got it writing a string array to a SmartDashboard table which, like the tutorial shows, I've called "Distances". I was seeing some Network Tables VIs that ask for the cRIO IP address to start communicating, but I know absolutely nothing about that, and I don't know how SD and NT relate to each other, if at all. I think I have a handle on reading the data on the robot once it's received (as far as I know; I haven't done it before), but what's the next step for establishing a TCP or UDP connection between the Dashboard and the Robot? I feel like it's right there in the Network Tables palette, but I can't figure it out.

EDIT: I apologize if my battery of questions is annoying. I've spent four years looking into vision, it's a miracle the dashboard can see shapes as of Tuesday, and I'm excited but know nothing more about the subject.
Re: Processed Image > Robot Movement Help Needed
This is unrelated, but do you know if we can put color sensors on the robot?
Re: Processed Image > Robot Movement Help Needed
UPDATE: I found UDP Write, but I don't know what the information on the VI means. What's the difference between Connection ID and Address?

EDIT: Man, I'm stupid; 'Detailed Help' is a life saver. EDIT 2: It's not as much of a life saver as I thought; I still don't know what they mean.
Re: Processed Image > Robot Movement Help Needed
As Joe said, NT and SD are already in the template code. So that it isn't magic, let me answer a few of your questions.

Network Tables is a name for the networked variables. The variables have a path that organizes them hierarchically, and you can think of the elements at the same level of the hierarchy as a table of variables. SmartDashboard is really the same thing, but we put all of the variables into the /SmartDashboard/ table. This simply isolates them from some bookkeeping areas and avoids a bit of clutter.

The LabVIEW Dashboard has a loop (loop 3) that calls the NT Binding function. It takes in a list of controls and indicators that are then bound to the identically named NT variables. The loop keeps the binding running even if the robot disconnects and reappears. The LabVIEW robot template has Robot Main start the NT server; it is just below the loop, toward the bottom of the window. With those two pieces running, you should simply need to use the SD VIs to read and write data. Make sure the name and datatype match.

On the dashboard there is a second feature where you don't even need an explicit read or write: placing your control in the Operation or Auto tab and naming it will create a variable of that name and keep it synced with the robot, where you can read or write it. This is most useful for simple checkboxes and numerics, but it is not sufficient for more complex types like the cluster you are using.

Greg McKaskle
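The hierarchy Greg describes can be modeled in a few lines of Python. This is only a toy illustration of the naming scheme, not the real NT protocol: the point is that the SmartDashboard helpers are just Network Tables variables whose paths are prefixed with `/SmartDashboard/`. The key and value below are made up.

```python
# Toy model of Network Tables naming: every variable lives at a
# slash-separated path; the SmartDashboard read/write VIs simply
# prefix the name with "/SmartDashboard/".
tables = {}

def sd_put(name, value):
    """Stand-in for SD Write: store under the SmartDashboard path."""
    tables["/SmartDashboard/" + name] = value

def sd_get(name):
    """Stand-in for SD Read: fetch from the SmartDashboard path."""
    return tables["/SmartDashboard/" + name]

sd_put("Distances", "10.0,87.5,92.3")
print(sd_get("Distances"))   # prints: 10.0,87.5,92.3
print(sorted(tables))        # shows the full hierarchical key
```

This is why the SD Read on the robot and the SD Write on the dashboard talk to each other as long as the name and datatype match: both are addressing the same path in the same shared table.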
Re: Processed Image > Robot Movement Help Needed
Okay, I have the dashboard flattening the distance information to a string and sending it to SD Write String, which should put it into its hierarchy or whatever. Now in PeriodicTasks.vi I've placed an SD Read String and made it unflatten the string like this:

*[screenshot: PeriodicTasks.vi block diagram with SD Read String wired to Unflatten From String]*

What do I do with this unflattened data to make it global and get it into multiple VIs? I've never used globals before, and I can't find any documentation that covers this step of the vision processing pipeline. My end goal is to be able to see boxes (done), identify which boxes mean left and right using a score system (done), calculate the robot's distance from the wall (done), send that information from the dashboard to the robot code, interpret the score values, and on a button press move to a specific x-y location to line up a shot. I'm starting with distance so I can at least make the robot drive to a set distance from the goal. After that, I'd like to look into turning the robot to face the goal straight on, or moving to an exact spot on the field. Any help with moving forward on this is appreciated.
Re: Processed Image > Robot Movement Help Needed
The input to the Unflatten node called "type" describes what was flattened into the string. You need to wire the same datatype to type as you wired into the Flatten node on the dashboard. It is often easiest to drag and drop directly from one window to the other to ensure they match. If you envision this changing, read up on typedefs as a means to share types between projects.

The robot project contains a file called robot global.vi. If you open it you'll see there are already a few globals in use. To create another, drag your type into robot global and name it. You can then drag robot global from the project into Teleop and Periodic Tasks and select the global to read or write.

Another thing you should do in the loop you show (the one that reads the SD, unflattens, and updates the global) is add a delay. As written, this parallel loop requests to run as fast as possible, which will max out the CPU. In practice, the loop probably needs to wait 30 or 50 ms in parallel with the operations it contains. Notice that the other loops already do this, with 10 ms and 100 ms wired up.

For more details on globals, please search the general LabVIEW help documentation.

Greg McKaskle
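The periodic-task-updates-a-global pattern Greg describes maps onto any language. Here is a Python sketch of the shape, with a thread standing in for the Periodic Tasks loop and a module-level variable standing in for the LabVIEW global; the 50 ms delay plays the same CPU-saving role as the wired wait. All names and values are illustrative.

```python
import threading
import time

# Stand-in for the LabVIEW global variable.
latest_distance = 0.0
lock = threading.Lock()

def periodic_task(stop):
    """Stand-in for the Periodic Tasks loop: read/unflatten, update global."""
    global latest_distance
    while not stop.is_set():
        # ...SD Read + Unflatten would happen here; we fake a reading...
        with lock:
            latest_distance = 10.0
        time.sleep(0.05)  # the 50 ms delay that keeps the loop from pegging the CPU

stop = threading.Event()
worker = threading.Thread(target=periodic_task, args=(stop,))
worker.start()

time.sleep(0.1)           # give the periodic task time to run
with lock:
    value = latest_distance   # teleop-style read of the global

stop.set()
worker.join()
print(value)
```

The teleop code never blocks on the network: it just reads whatever the periodic task last wrote, which is exactly the decoupling the Periodic Tasks > global recommendation buys you.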