OK, I’ve been working on this for a week and I haven’t managed to fix any problems, just find ways around them. It’s time to get some help. I’ve been building a speed controller based on a PIC 12F683. I chose it because it has hardware PWM: I set the frequency and duty cycle (period and high time, rather) and I get a nice signal to send to my H-bridge. I got this PWM output working correctly by reading a pot, then decided to move on and try to read a servo pulse. (To avoid confusion between the motor-control PWM and hobby PWM, I’m going to call the latter a servo pulse or the signal; the former will be called the PWM output.)
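For context, my PWM setup looks roughly like this. It’s a minimal sketch with example numbers for the 4 MHz internal oscillator, not my exact values:

    ; PWM on GP2/CCP1: period = (PR2+1) * 4 * Tosc * Timer2 prescale
    ; With Fosc = 4 MHz and a 1:4 prescale, PR2 = 249 gives a 1 kHz period.
        banksel TRISIO
        bcf     TRISIO, 2       ; GP2/CCP1 as an output
        banksel PR2
        movlw   .249
        movwf   PR2             ; PWM period
        banksel CCP1CON
        movlw   b'00001100'     ; PWM mode, active-high, DC1B<1:0> = 00
        movwf   CCP1CON
        movlw   .125            ; upper 8 bits of the 10-bit duty (~50% here)
        movwf   CCPR1L
        movlw   b'00000101'     ; Timer2 on, 1:4 prescale
        movwf   T2CON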
In my early attempts, I was polling for the signal. This worked, but my decoding algorithm still needed some fine-tuning. After rooting around in the datasheet, I discovered that I actually did have free interrupts (apparently every GPIO pin can be set to interrupt on change) and also found a 16-bit timer (Timer1). I decided it would be more elegant to use these. This is my current scheme (there’s a sketch of it right after the list):
-when an interrupt is triggered (state change on input), check the signal’s state
-if it’s high, clear the timer and start it
-if it’s low, stop the timer and decode the measured pulse width
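Here’s a stripped-down sketch of how I understand that scheme should look. The pin assignment is a placeholder (GP5 for the signal in this example), not a copy of my attached code:

    ; Setup: interrupt-on-change on the signal pin, Timer1 from Fosc/4
        banksel IOC
        bsf     IOC, 5          ; watch GP5 for changes
        banksel T1CON
        clrf    T1CON           ; Timer1 off, internal clock, 1:1 prescale
        clrf    TMR1H           ; (1 count = 1 us at Fosc = 4 MHz)
        clrf    TMR1L
        movlw   b'10001000'     ; GIE + GPIE: enable GPIO-change interrupts
        movwf   INTCON

    ; Called from the ISR on a state change of the input
    gpio_change:
        btfss   GPIO, 5         ; reading GPIO here also ends the mismatch
        goto    sig_low
    sig_high:                   ; rising edge: clear the timer and start it
        clrf    TMR1H
        clrf    TMR1L
        bsf     T1CON, TMR1ON
        goto    gc_done
    sig_low:                    ; falling edge: stop the timer and decode
        bcf     T1CON, TMR1ON   ; TMR1H:TMR1L now holds the pulse width
        call    procsig
    gc_done:
        bcf     INTCON, GPIF    ; clear the change flag
        return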
In the interrupt handler, I set an output pin so that I know execution made it inside. Essentially, this output follows the servo pulse (if the signal is high, the output goes high), and that’s how I know the interrupt is firing properly. Now, however, I’m having some trouble with the timer.
For some reason, the time it measures is too short. I’m sure there are some bugs in my decoding algorithm, so I bypassed it by setting the duty cycle of the PWM output to the lower byte of what the timer measured, then measured the PWM output with a scope. Assuming I did all the math correctly, the low byte is about 50. I did the same with the upper byte and it’s 0. Neither value changes no matter what the input signal is. If I don’t clear the timer bytes, however, the count keeps climbing and overflows, so the timer is definitely counting something. I’ve gone over all the code that initializes the timer and checked the datasheet many, many times already, but I can’t spot the problem.
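For reference, here’s the sanity math and the debug hack, assuming Timer1 is clocked from Fosc/4 at 4 MHz (1 us per count); if that arithmetic is right, the timer is only running for a tiny fraction of the pulse:

    ; 1.0 ms pulse -> 1000 counts = 0x03E8 (high byte 3, low byte 0xE8)
    ; 2.0 ms pulse -> 2000 counts = 0x07D0 (high byte 7, low byte 0xD0)
    ; A high byte of 0 with a low byte near 50 means Timer1 only ran ~50 us.
        bcf     T1CON, TMR1ON   ; timer is stopped, so a two-byte read is safe
        movf    TMR1L, w
        movwf   CCPR1L          ; debug: PWM duty = low byte of the measurement
    ;   movf    TMR1H, w        ; (swap this pair in to scope the high byte)
    ;   movwf   CCPR1L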
As a separate problem, for some reason I can’t set any pins to be inputs. I set the appropriate bits in TRISIO, but it doesn’t work. The only input that works is GP3, and that’s only because it’s input-only and can’t be changed. I can’t remember whether the chip reads the “inputs” as high or low, but whichever it is, they never change no matter what I apply to the pins.
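In case it helps anyone spot what I’m missing, here’s the full digital-input recipe as I understand it from the datasheet: the analog-select and comparator bits come up in analog mode at reset, and a pin configured as analog always reads ‘0’. This is a sketch, not my exact code:

    ; TRISIO alone isn't enough on pins shared with analog functions
        banksel ANSEL
        clrf    ANSEL           ; ANS<3:0> = 0: GP0/GP1/GP2/GP4 digital
        movlw   b'00111011'     ; GP0,GP1,GP3,GP4,GP5 inputs; GP2 (CCP1) output
        movwf   TRISIO
        banksel CMCON0
        movlw   b'00000111'     ; CM<2:0> = 111: comparator off
        movwf   CMCON0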
GP4 does something strange, however. No matter what I set TRISIO<4> to, GP4 outputs some strange signal. It’s almost square, but it oscillates between transitions (when it’s supposed to be a flat high or low). …Actually, I take that back: the header I’m using has some bad numbers, and the config word ends up setting GP4 to output the clock. But that doesn’t explain why the other pins don’t work.
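The clock on GP4 at least makes sense now; the oscillator bits in the config word select between CLKOUT on GP4 and plain I/O. Something like this (the exact symbol names depend on the include-file version):

    ; _INTOSCIO = internal oscillator, GP4 free as I/O
    ; _INTOSC   = internal oscillator with Fosc/4 out on GP4 (what I'm seeing)
        __CONFIG _INTOSCIO & _WDT_OFF & _PWRTE_ON & _MCLRE_OFF & _CP_OFF & _BOD_OFF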
Of course, I’ve attached the code. You can ignore my signal-decoding routine (procsig); I’m sure there are still bugs in it, and there aren’t many comments (I worked a lot of it out on paper and copied over the code, but not the comments). The interrupts are handled in gpio_change, called from my interrupt service routine at the top. That’s where the timer is stopped and started.
sc_servo.asm.txt (4.71 KB)