#1
Re: Using Target Distance to Move Robot
The example project has code in the computer target section for running on a laptop, and code in the cRIO section for running on the cRIO.
The primary task would be to integrate the LV loop into the dashboard and to get the data back to the robot -- I'd recommend using UDP.

Greg McKaskle
#2
Re: Using Target Distance to Move Robot
...he says, as though this were the simplest thing :|
Another potential hiccup: whenever something obstructs the camera view, or even when the camera is simply stationary, the index of a particular target in the Target Info array changes. That could be a problem if we're trying to narrow in on a particular target. My mentor says we should be able to make the code remember certain position coordinates and assign them to a specific index, so that if a target at index 1 has position (-0.5, 0.5), for argument's sake, and we obstruct and then unobstruct the camera view, the target is still reassigned to index 1. Any idea how to implement this?
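The "remember positions and reassign indices" idea can be sketched outside LabVIEW. Here is a minimal Python sketch of the logic (the function name, the data layout, and the 0.2 matching tolerance are all made up for illustration): each newly detected target is greedily matched to the remembered slot whose last known position is nearest.

```python
import math

def reassign_indices(stored, detected, max_dist=0.2):
    """Match newly detected targets back to remembered index slots.

    stored:   dict index -> (x, y), remembered position of each target
    detected: list of (x, y) positions found in the current frame
    Returns a dict index -> (x, y) with each detection assigned to the
    slot whose remembered position is nearest (greedy nearest-neighbor).
    Detections farther than max_dist from every slot are left unassigned.
    """
    assigned = {}
    remaining = list(detected)
    for idx, pos in stored.items():
        if not remaining:
            break
        # Pick the detection closest to this slot's remembered position.
        best = min(remaining, key=lambda p: math.dist(p, pos))
        if math.dist(best, pos) <= max_dist:
            assigned[idx] = best
            remaining.remove(best)
    return assigned
```

After an obstruction clears, a target that reappears near (-0.5, 0.5) is handed back to whichever index last held that position, regardless of where it landed in the new Target Info array.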
#3
Re: Using Target Distance to Move Robot
The dashboard code already has independent loops that do UDP. For example, the loop that reads the Kinect Server data is towards the top of the Dashboard diagram. The important part is shown below.
It reads from port 1166 about once a second, or whenever data arrives. It reads at most 1018 bytes as a string, and then interprets it as the agreed datatype. In the situation we are considering, a similar loop would be placed on your robot and run in parallel with everything else -- I'd suggest doing it in Periodic Tasks.

The second image shows the code that needs to run on the dashboard to send the data. You need to change the team number, and you need to make the data constant your own data, either formatted or flattened to a string. The final piece is to identify the UDP port to use and use it for both the read and the write.

As for the index problem, I'm pretty sure that is currently based on the particle size. I'd probably try to sort the targets by location and label them as top, left, right, and bottom. You could then store them in a cluster or an array with a unique cell for each of top, left, right, and bottom. You should be able to identify them with simple sorting techniques.

Greg McKaskle
#4
Re: Using Target Distance to Move Robot
So then, would we have to run the Vision Processing code on the dashboard, write to the robot using UDP, then have the robot read the UDP packets using code in the Periodic Tasks VI?
Never mind that; I wasn't reading correctly. I guess the challenge is finding out how to convert the image to a string (the Get Image Data String VI requires a CameraDevRef input).

Last edited by Pirate programe : 06-02-2012 at 16:02.
#5
Re: Using Target Distance to Move Robot
Provided the camera is connected to the D-Link, you don't need the cRIO to do anything with the image.
The dashboard reads the image directly from the camera. The dashboard processes it. The dashboard sends any target info to the robot via UDP string. The robot reads the UDP string and updates setpoints, which ultimately move the robot.

You don't have to do it this way, but if you want to use the laptop to do the processing, to allow the cRIO CPU to do other things, this is the way I'd approach it.

Greg McKaskle
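The "target info flattened to a string" step of that pipeline might look like this in Python (the comma-separated format, the field names, and the steering gain are all invented for the sketch; the real LabVIEW code would use Flatten To String or a format of its own).

```python
def flatten_target_info(distance_m, offset_x):
    """Dashboard side: flatten target info into the UDP payload.
    Hypothetical format: 'distance,offset' as fixed-point decimals."""
    return f"{distance_m:.3f},{offset_x:.3f}".encode()

def parse_target_info(packet):
    """Robot side: recover the numbers and derive a setpoint.
    Steers proportionally to the horizontal offset (gain is made up)."""
    distance_m, offset_x = (float(v) for v in packet.decode().split(","))
    steer_setpoint = 0.8 * offset_x
    return distance_m, steer_setpoint
```

Keeping the format dead simple (one delimiter, fixed field order, agreed on by both ends) makes the robot-side parsing a one-liner and easy to debug with any packet sniffer.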
#6
Re: Using Target Distance to Move Robot
Also, it'd be nice if you could direct your attention here, if it's not too much trouble?
#7
Re: Using Target Distance to Move Robot
Yes, the dashboard is quite capable of doing vision processing. The Classmate and typical laptops are quite a bit more powerful than the cRIO -- less capable at I/O, but with a more powerful CPU.
I'll look at the other thread as well. Thanks for bringing it to my attention.

Greg McKaskle