|
Re: Collaborate on LabView virtual Kalman robot?
Thank you, Joe Ross.
I didn't know you had to tune Kalman filters. Probably should have known that though, since you have to fiddle with other kinds of filters.
I don't think this is all that ambitious, considering we have a pretty well-written guide. Besides, if it proves to be too difficult... well, that is something worth knowing BEFORE the next build cycle.
Has anyone read the paper yet? I had sort of thought about just following the paper section by section and seeing what happens.
Sections 1 and 2 of the paper are descriptions that seem perfectly reasonable and close enough to FRC robots to seem applicable.
Section 3 starts with:
"We define perception as: the process of maintaining an internal description of the external environment." Um, well, it is a research paper.
Then it kind of goes on a bit as research papers do, but finally settles down and starts making some really good points:
"
Principle 1) Primitives in the world model should be expressed as a set of properties.
Principle 2) Observation and Model should be expressed in a common coordinate system.
Principle 3) Observation and model should be expressed in a common vocabulary.
Principle 4) Properties should include an explicit representation of uncertainty.
Principle 5) Primitives should be accompanied by a confidence factor."
If you get the PowerPoint above, I've proposed the layout and defined the primitives to follow this model. The 'common vocabulary' is meters. So the next thing that needs to happen is to create some arrays to hold the data collected by the sensors.
Specifically, the paper suggests:
Model: M(t) = { P1(t), P2(t), ... , Pm(t) } -- That means the model is a collection of primitives, each with a timestamp, and:
Primitive: P(t) = { ID, X^(t), CF(t) } -- A primitive is a single data reading from a sensor. Also included with the sensor's measurement is an ID (so you can tell which sensor it came from), a timestamp of when the measurement was taken, and a confidence factor.
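Since LabView is graphical, here is a rough text-language sketch (Python, purely illustrative; the field names are my own guesses, not from the paper) of what a primitive and the model might look like as data structures:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Primitive:
    """One sensor reading: P(t) = {ID, X^(t), CF(t)}."""
    sensor_id: int      # which sensor the reading came from
    timestamp: float    # seconds; when the measurement was taken
    estimate: float     # X^(t): the measured/estimated value, in meters
    confidence: float   # CF(t): confidence factor, 0.0 .. 1.0

@dataclass
class Model:
    """The world model M(t): a collection of primitives."""
    primitives: List[Primitive] = field(default_factory=list)

# Example: an ultrasonic reading of 1.25 m at t = 3.2 s,
# started with a low initial confidence
m = Model()
m.primitives.append(
    Primitive(sensor_id=2, timestamp=3.2, estimate=1.25, confidence=0.1))
```

In LabView terms each Primitive would just be a cluster, and the Model an array of those clusters.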
Somewhere later it says we're going to use the Kalman filter to adjust the confidence factor. You always start with a low confidence number for any new reading. As you take more measurements, the filter 'decides' if the data is good and increases the confidence or if the data is bad and it gets thrown out.
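To make that concrete, here is a toy confidence update in Python. This is NOT the paper's exact rule (the real adjustment comes out of the Kalman filter's innovation and covariance); it just shows the general idea of starting low, growing confidence when readings agree with the current estimate, and shrinking it (toward being thrown out) when they don't. The gate and step values are made up:

```python
def update_confidence(cf, innovation, gate=0.5, step=0.2):
    """Toy confidence update.
    innovation = difference between the new measurement and the
    current estimate. Close agreement raises CF, disagreement
    lowers it; a primitive whose CF hits 0 would get dropped."""
    if abs(innovation) < gate:         # reading agrees with estimate
        return min(1.0, cf + step)
    return max(0.0, cf - step)         # reading disagrees

cf = 0.1                               # low confidence for a new reading
for innovation in (0.1, 0.05, 0.2):    # three consistent readings
    cf = update_confidence(cf, innovation)
print(round(cf, 2))                    # 0.7 -- confidence has grown
```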
So far all this seems simple enough and really quite workable. So what we need next are some containers to hold the sensor data in a way that is going to be easy to work with. LabView will be able to get the sensor data, timestamps, etc. and write them into arrays. And the LabView math routines should make processing the arrays straightforward.
Is anyone out there a wizard of array processing? It would be nice at this point to have a good logical data structure for the primitives. Considering so far we've only defined 6 sensors, I'm thinking it would be possible to keep a dozen or so primitives in the robot's world model before letting the oldest primitive drop off into the bit bucket.
Does anyone know of a good way to store the primitives and then cycle them in such a way that we are not constantly shifting a bunch of data around or running weird routines, so we don't end up tangling ourselves in loops? It would be nice to have an easy-to-understand way of storing the primitives in chronological order so we don't have to worry about which data is which.
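The standard answer to "oldest drops off, no shifting" is a circular (ring) buffer: a fixed-size array plus a rotating write index, so a new primitive just overwrites the oldest slot. That maps nicely onto a fixed LabView array with an index you increment modulo the size. Here is the idea sketched in Python (class and method names are mine):

```python
class RingBuffer:
    """Fixed-capacity buffer: pushing when full overwrites the oldest
    entry in place -- no data ever gets shifted."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = [None] * capacity
        self.head = 0            # next slot to write
        self.count = 0           # how many slots are filled

    def push(self, item):
        self.data[self.head] = item
        self.head = (self.head + 1) % self.capacity   # wrap around
        self.count = min(self.count + 1, self.capacity)

    def oldest_to_newest(self):
        """Return the contents in chronological order."""
        start = (self.head - self.count) % self.capacity
        return [self.data[(start + i) % self.capacity]
                for i in range(self.count)]

buf = RingBuffer(3)
for t in [1, 2, 3, 4]:               # pretend these are timestamped primitives
    buf.push(t)
print(buf.oldest_to_newest())        # [2, 3, 4] -- the oldest (1) dropped off
```

(In Python itself you'd normally just use collections.deque with maxlen set, but the explicit index arithmetic above is what you'd actually wire up in LabView.)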
Regards,
KHall
|