Interesting cRIO delays

So while testing on bag&tag day I noticed some odd behavior coming from our robot.

First I will explain our setup. We have two separate axle shafts, each connected to an encoder. The encoder signals feed into four DMA channels, which return the rates of those shafts; I used modified versions of the DMA encoder example code to accomplish this. The rates feed into PID functions that control the motor speeds. This code runs in its own loop in Periodic Tasks with a 10 ms period. (Is that the problem?)

I noticed that immediately after deploying the code, the delay between spinning the wheel and the encoder response was unobservable. However, as testing continued the delay grew until it was about a second after a few minutes. Stopping and redeploying the code brought temporary relief, but after a few minutes the delay returned. I also noted that as the evening progressed, the delay became larger and larger.

Additionally, at some point in the evening several motors, including some on the drivetrain and the shooter wheels, began to twitch (turn on momentarily (<0.5 s) and then stop).

Some of our mentors believe that the problem might be the ribbon cable that we were using to connect the DIO modules to the DS…

Does anyone have any explanations or workarounds for these problems?

Does this happen because your control loop isn’t keeping up with the DMA rate? If you are consuming a 100 Hz stream with a 10 ms loop, you are only approximately keeping up: each little delay in the FPGA producer adds a bit of extra data to the buffer that the consumer never gets to. This would probably improve enough if you switch from a regular loop to a Timed Loop; the Timed Loop runs on a different scheduler and at an elevated priority. The other solution is to run the consumer loop until the buffer is empty and act only on the latest data from the buffer.

Greg McKaskle
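The drain-the-buffer idea above can be sketched in code. This is a hedged illustration, not actual cRIO code (LabVIEW is graphical, and the real buffer is read through the DMA Read VI); the names here are purely illustrative. The point is that the consumer reads everything queued and keeps only the newest sample, so the backlog — and the growing lag — can never accumulate.

```python
from collections import deque

def read_latest(buffer: deque):
    """Drain a DMA-style FIFO and return only the newest sample.

    Returns None if the buffer is empty. In the real system the
    producer is the FPGA filling the DMA buffer at ~100 Hz; the
    consumer is the 10 ms control loop.
    """
    latest = None
    while buffer:                  # consume every queued sample...
        latest = buffer.popleft()
    return latest                  # ...but act only on the most recent

# Simulated backlog: the producer got four samples ahead of the loop.
samples = deque([10.0, 10.5, 11.0, 11.2])
print(read_latest(samples))        # 11.2 — newest rate; buffer now empty
```

Because the PID only needs the current rate, discarding the stale samples is harmless here; it would only be wrong if you needed every historical value (the usual reason to use DMA in the first place).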

It is possible.

  1. How do I implement a timed loop?

  2. How do I know 100Hz?

  3. Would this make my teleop controls laggy?

-Patrick

Why do you care about encoder rate in the past?

I don’t? I am measuring the current encoder rate.

The only reason I can think of to use DMA is if you care about past measurements. If you only care about the most recent measurement, you can just use the rate output from Encoder Get.

DMA is typically used when you want to guarantee that you see all the values that have been generated. If you only care about the most recent value, you don’t need to bother with DMA.

The problem we ran into with the regular code is that we were not seeing all of the data. Our encoders have 256 pulses per rotation, and at ~1000 rpm the cRIO was losing pulses. Therefore we figured DMA would be a good choice for not losing any pulses (which would throw off the rate). We are just having a delay problem, which may or may not be related to the DMA. Does anyone have an explanation for the twitchy robot?
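For reference, the pulse rate implied by those numbers is easy to check. A quick calculation (Python, using the figures from the post above) shows the signal is nowhere near the ~40,000 counts/s the decoder can handle, so lost pulses shouldn't be a bandwidth problem:

```python
PULSES_PER_REV = 256   # encoder resolution from the post
RPM = 1000             # approximate shaft speed from the post

# Convert rev/min to pulses/s.
pulses_per_sec = PULSES_PER_REV * RPM / 60
print(round(pulses_per_sec))   # 4267 pulses/s, well under ~40,000/s
```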

The encoder should be good for close to 40,000 counts per second, far more than you describe. Additionally, the DMA cannot go faster than the decoding, so it will not help speed up the encoders.

Greg McKaskle

Is there a downside to using the DMA though?

Robot twitching?

Do you have any other threads sharing time with or stealing time from your encoder thread? We don’t use NI on the robot. We use C++ and would just raise the relative priority (by lowering the priority number) of the encoder thread/task to make sure it runs when needed. Is there a way to do that in LabView? Greg?

HTH
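In a textual environment, raising a task's priority as described above looks roughly like the sketch below. On the cRIO's VxWorks the call would be `taskPrioritySet` (lower number = higher priority); this Python/Linux analogue is an assumption-laden illustration only, and real-time scheduling classes generally require elevated privileges, which is why the sketch degrades gracefully when permission is denied.

```python
import os

def raise_scheduling_priority(rt_priority: int = 10) -> bool:
    """Try to move the current process into a real-time scheduling
    class (SCHED_FIFO), the rough Linux analogue of lowering a
    VxWorks priority number. Returns False if the platform or our
    privileges don't allow it, instead of crashing the control code."""
    try:
        os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(rt_priority))
        return True
    except (AttributeError, PermissionError, OSError):
        return False

# A control task would call this once at startup, then fall back to
# normal scheduling (and log a warning) if it returns False.
elevated = raise_scheduling_priority()
print("real-time priority granted:", elevated)
```

As Greg notes later in the thread, priorities can starve other tasks, so a design like this should elevate only the one loop that truly needs determinism.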

You’ll spend a lot more time processing data, without any improvement to your problem.

Alright, it just seemed that our encoder data was much cleaner with DMA than without, but I am willing to give the other way a try. How does one elevate the thread priority of the encoder code?

Timed loops have a priority value and by default are higher priority than normal code anyway. But as with DMA, be careful playing with priorities; they are an advanced feature and can starve important processing tasks if used inappropriately.

The other place to change priority is on a VI. You go to VI Properties>Execution and you can adjust the priority.

I’m not sure what the original issue was, but reading encoders gives the instantaneous value and most recently computed rate. The DMA will give the sequence of values and/or the sequence of whatever else you are adding to the DMA. This is awesome when trying to understand the relationship or trend of data coming from a sensor, but not helpful for your case unless you are smoothing the data or something.

Before playing with priorities, I’d turn off DMA, put the code back to the simple form, and see if the issue is back. If so, post some symptoms so that we can determine what may be causing it.

Greg McKaskle

Alright, I will work on that this afternoon.

It worked! Using just the regular encoder function in the Timed While Loop produces clean results. Thanks, everyone!