08-02-2008, 19:56
bomber7
Humanoid
FRC #0585
Team Role: Programmer
 
Join Date: Feb 2007
Rookie Year: 2007
Location: CA - Tehachapi
Posts: 20
Serial Port Interrupts

OK, I'm using Kevin Watson's serial port code. It has a few simple, easy-to-use functions: three main functions for each serial port. I will only be talking about the second serial port (TTL). There's one function for initializing the serial port, which comes already implemented. Then there's a function for determining how many bytes are in the serial port's buffer. Finally, there's a function for reading a single byte from that buffer.

I built a large segment of code for communicating with the AVR cam. It reads data off the serial port's buffer until it hits the finish/escape character or it has read all the data in the buffer. When it receives a complete packet (it gets the \r or 0xFF character), it sends the packet off to another piece of code I built to process the packet.

I've had it explained to me that the serial port code works by interrupts, meaning that every time bytes arrive on the serial port, the processor stops everything to add them to a buffer, then returns to whatever it was doing before. The problem is that while my code is reading the buffer, an interrupt can fire and add more data. I didn't build my system to handle the buffer changing underneath it. For example, I built a safeguard at the beginning that prevents my local buffer from overflowing: if the number of bytes in the serial port's buffer exceeds the number of bytes my array can hold, it stops the read process. But if a whole bunch of bytes get added to the buffer after that check, there's still a chance my local array/buffer overflows.

That's just one of a few dozen different errors that can happen if the buffer is being updated while I'm trying to read from it.

I have only thought of one solution, but since I haven't looked through the serial port code I don't know if it's feasible. I'm thinking that if I set a flag at the beginning of my read code and clear it at the end, I can add a check for that flag in the interrupt. If the flag is set when the interrupt fires, it writes the byte to a different buffer. Then, on the first write after the flag is cleared, everything from the temporary buffer gets written to the start of the actual buffer.

What does everyone else think? This problem is very aggravating.