Re: Team 1124's Pet Robot
We used the MJPEG loop and the Get Image VI in our LabVIEW code. We tuned the image processing in NI Vision Assistant, then used the same VIs and threshold values in our robot code so it reproduced what we saw in the tests. That gave us the fabric's center, and the center's offset from the middle of the image told the robot how far to drive left or right (rough sketch of the idea below).
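
For anyone not working in LabVIEW, here's a minimal text-code sketch of the same idea (not our actual code, and the color bounds are just placeholders for whatever you tune in Vision Assistant): threshold the frame for the fabric's color, take the centroid of the matching pixels, and steer based on how far that centroid is from the image center.

```python
import numpy as np

def steering_from_frame(rgb, lower, upper):
    """Return a steering value in [-1, 1] from the fabric's horizontal offset.

    rgb   : HxWx3 uint8 image (one decoded MJPEG frame)
    lower : (r, g, b) lower color bounds from threshold tuning
    upper : (r, g, b) upper color bounds from threshold tuning
    """
    # Color threshold: True where the pixel falls inside the tuned bounds.
    mask = np.all((rgb >= lower) & (rgb <= upper), axis=2)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return 0.0  # fabric not found; drive straight (or stop)

    # Centroid x of the fabric blob.
    center_x = xs.mean()

    # Offset from image center, normalized to [-1, 1]:
    # negative -> fabric is to the left, positive -> to the right.
    half_width = rgb.shape[1] / 2.0
    return (center_x - half_width) / half_width
```

On the robot side that value just feeds into the drive code as a turn command, which is essentially what we do with the value coming out of the vision VIs.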
__________________
Justin Yost
Systems Team
President