#3   10-01-2017, 08:21
JamesBrown
Back after 4 years off
FRC #5279
Team Role: Engineer
 
Join Date: Nov 2004
Rookie Year: 2005
Location: Lynchburg VA
Posts: 1,284
Re: Where do I begin with the Jetson TK1?

Quote:
Originally Posted by Lesafian View Post
Hi everyone, as many of you know, this year's FIRST challenge is one that vision tracking may be useful for.

My team is 100% planning on using vision tracking.

Aside from that, I am clueless as to where I should start. We do not have any programming mentors, and I'm the only programmer on our team (8 people).

Also, if it's not as easy as programming it in the same project as the rest of the robot code, how would I go about using the other code in the main robot project?

Thank you very much for reading and any responses you leave!
A suggestion: if you are the only one programming the robot, and there is a good chance you will get pulled into other jobs (since there are only 8 people on your team), stay away from doing vision on the Jetson. It is not a small undertaking, and the resources are limited. It makes a great offseason project, though. I mentor a team with multiple programmers and programming mentors, and we just got a TX1; we are currently leaning away from using it. It is definitely possible to do this on your own, but it will be a lot of work. I don't want to stop you from learning something new, but we did a cost/benefit analysis for our own team and are having trouble committing to the project.

I am nearly certain at this point that a Pixy cam (see here) will be capable of doing the vision tracking you want to do. Interfacing with it can be as simple as using analog I/O on the RIO. The cost is low, around $60, and it saves you from having to do most of the programming yourself. Some teams used it to great success last year, and the majority of the top teams used an older version of this camera as far back as 2006 to build auto-aiming shooters. I strongly recommend it for a team with limited programming resources.
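As a rough illustration of how simple the analog route can be, here is a minimal RobotPy sketch of reading a Pixy-style analog signal on the roboRIO. The analog channel, the 0-3.3 V scaling, and the class/method names for your particular WPILib/RobotPy version are assumptions for illustration; you would configure the Pixy to output the tracked target's x position as an analog voltage and adjust to match.

Code:
import wpilib


class PixyRobot(wpilib.TimedRobot):
    """Sketch: read a Pixy-style analog target signal on the roboRIO."""

    def robotInit(self):
        # Assumption: camera outputs target x position as 0-3.3 V on analog channel 0.
        self.pixy = wpilib.AnalogInput(0)

    def teleopPeriodic(self):
        voltage = self.pixy.getVoltage()
        # Map 0-3.3 V to a -1..+1 steering error (0 = target centered).
        error = (voltage / 3.3) * 2.0 - 1.0
        wpilib.SmartDashboard.putNumber("target_error", error)


if __name__ == "__main__":
    wpilib.run(PixyRobot)

From there the error term can feed a simple proportional turn command in the drive code, which is about as much "vision code" as the analog approach requires.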


Quote:
Originally Posted by andrewthomas View Post
I would recommend using Python, as it is very easy to pick up and use for vision processing. It is also very easy to run on the Jetson.
Correct me if I am wrong, but all of the documentation I have found says that OpenCV does not ship Python bindings for its GPU module. So if you use Python, you are limited to the regular CPU functions, and I think the main advantage of the Jetson is the Tegra-accelerated GPU OpenCV functions. If a team is committed to using Python, then I would recommend other boards that are cheaper and better supported/documented.
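For context, the kind of pipeline Python does give you is the plain CPU path, e.g. an HSV threshold plus contour finding. Here is a minimal sketch; the HSV range and camera index are placeholder values to tune for your own target and lighting, not anything specific to this thread.

Code:
import cv2
import numpy as np

# Placeholder HSV range for a green-lit retroreflective target (tune for your camera).
LOWER = np.array([55, 100, 100])
UPPER = np.array([85, 255, 255])


def find_target_center(bgr_frame):
    """Return the (x, y) centroid of the largest matching blob, or None."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    # Index [-2] keeps this working across OpenCV versions that return 2 or 3 values.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # default USB camera
    ok, frame = cap.read()
    if ok:
        print(find_target_center(frame))
    cap.release()

Every call above runs on the CPU; the GPU-accelerated equivalents live in a separate module (cv2.gpu in the 2.x releases) that, as noted, is not exposed to Python.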
__________________
I'm Back


5279 (2015-Present)
3594 (2011)
3280 (2010)
1665 (2009)
1350 (2008-2009)
1493 (2007-2008)
1568 (2005-2007)

Last edited by JamesBrown : 10-01-2017 at 08:33.