paper: 1706's 2012 Computer Vision, final copy

Thread created automatically to discuss a document in CD-Media.

1706’s 2012 Computer Vision, final copy
by: faust1706

The FINISHED scientific paper describing how 1706 implemented the Microsoft Kinect in the FRC challenge Rebound Rumble.

This paper presents a method to track, organize, and view reflective squares in real time using the Microsoft Kinect sensor. Before tracking could occur, image processing was required. Images were captured from the Kinect IR camera as grayscale and then passed through several filters, such as erosion, dilation, and thresholding. After the image was processed, calculations could be made to determine how far away the squares were, in Cartesian coordinates, and their degree of rotation with respect to the X, Y, and Z axes relative to a fixed position. To do this, image and object point pairs are created, with the center of the top target as the origin. The image feed from the Kinect defines the image points, and vector calculus is used to describe how the vectors change relative to the standard image. The object pose of a given set of object points is estimated, the translation and rotation vectors between the two images are found, and the eigenvalues are converted to eigenvectors; the final result is a set of eigenvectors that tells how far away the squares are from the Kinect, as well as their pitch, roll, and yaw.
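For readers who want a concrete picture of the pipeline the abstract describes, here is a minimal sketch in Python using OpenCV. It is not the team's code from the paper: the function name find_target_pose, the 24" x 18" target size, and the camera matrix and distortion coefficients passed in are illustrative assumptions, and a real implementation would need to sort the detected corners so they match the object-point order.

```python
# Minimal sketch of the threshold/erode/dilate + pose-estimation pipeline
# described in the abstract. Assumes OpenCV 4.x and a grayscale frame from
# the Kinect IR camera. Not the team's actual code.
import cv2
import numpy as np

def find_target_pose(ir_frame, camera_matrix, dist_coeffs):
    # 1. Image processing: threshold the IR image, then clean it up with
    #    erosion and dilation.
    _, binary = cv2.threshold(ir_frame, 200, 255, cv2.THRESH_BINARY)
    kernel = np.ones((3, 3), np.uint8)
    binary = cv2.erode(binary, kernel, iterations=1)
    binary = cv2.dilate(binary, kernel, iterations=2)

    # 2. Find the largest bright contour and approximate its four corners.
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    target = max(contours, key=cv2.contourArea)
    corners = cv2.approxPolyDP(target, 0.02 * cv2.arcLength(target, True), True)
    if len(corners) != 4:
        return None
    # NOTE: in practice these corners must be sorted (e.g. top-left,
    # top-right, bottom-right, bottom-left) to match object_points below.
    image_points = corners.reshape(4, 2).astype(np.float64)

    # 3. Object points: the target's corners in real-world units, with the
    #    origin at the center of the target, as in the paper. The 24" x 18"
    #    outer size is an assumption for illustration.
    w, h = 24.0, 18.0
    object_points = np.array([[-w / 2,  h / 2, 0],
                              [ w / 2,  h / 2, 0],
                              [ w / 2, -h / 2, 0],
                              [-w / 2, -h / 2, 0]], dtype=np.float64)

    # 4. Estimate the object pose: solvePnP returns the rotation and
    #    translation vectors between the object points and image points.
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None

    # 5. Convert the rotation vector to a rotation matrix and read off
    #    pitch, roll, and yaw (one common ZYX Euler convention);
    #    tvec gives the Cartesian offset from the camera.
    R, _ = cv2.Rodrigues(rvec)
    pitch = np.degrees(np.arctan2(-R[2, 0], np.hypot(R[2, 1], R[2, 2])))
    yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    return tvec.ravel(), (pitch, roll, yaw)
```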

Real Time Multi-Square Detection and Tracking.doc (832 KB)