I2C implementation on the cRIO

Hi all,

If you’re not familiar with the I2C standard, Wikipedia has a good explanation here: http://en.wikipedia.org/wiki/I²C

If you feel comfortable with the protocol, please read on.

I’m trying to get an ATtiny26 microcontroller to talk to the cRIO using I2C, the cRIO being the master. (I’m not sure if it’s possible for the cRIO to be a slave, but we can get into that later.)

What I’d like to do is have the cRIO read a single byte from the ATtiny, and hold the clock low until it determines how many bytes it wants to read.

Once it determines that, it should either release the bus, or read more bytes.
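Roughly what I mean, as a sketch in C. The bus primitives here are simulated stand-ins (the length-prefix convention and all the function names are mine, not anything from the WPI library); real code would bit-bang GPIO instead:

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

/* Simulated slave: the first byte is a length prefix telling the master
   how many payload bytes follow. */
static const uint8_t slave_data[] = {3, 0xAA, 0xBB, 0xCC};
static size_t slave_pos = 0;

static uint8_t i2c_read_byte(int ack) {
    (void)ack; /* real hardware: clock in 8 bits, then drive ACK or NACK */
    return slave_data[slave_pos++];
}

/* Read the length byte, then that many more bytes. Between the two reads
   the master just leaves SCL low -- it owns the clock, so no special
   "hold" step is needed. ACK every byte except the last, which gets NACK. */
static size_t read_message(uint8_t *buf, size_t max) {
    uint8_t n = i2c_read_byte(1);
    if (n > max) n = (uint8_t)max;
    for (uint8_t i = 0; i < n; i++)
        buf[i] = i2c_read_byte(i + 1 < n);
    return n;
}
```

The point being: the "hold the clock low and decide" step falls out for free when the master drives SCL, as long as the FPGA doesn't force a fixed-size transaction.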

I’m curious how I2C is implemented on the cRIO, so that I can make sure this works. From looking at the LabVIEW code, it appears that it always writes 8 bytes, then reads 8 bytes, and throws away everything except the number of bytes you asked for.

Now, I2C is a low-speed bus. I’ll probably be running it at 100 kHz - maybe a little faster if my ATtiny supports it. By my understanding of how it’s implemented on the cRIO, it would take over 150 clock pulses just to get one or two bytes. Reading one byte should only take 18 clock pulses from start to stop.
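Here’s the arithmetic behind that 18-clock figure: each byte on the wire costs 9 SCL pulses (8 data bits plus the ACK/NACK bit), a read is one address byte plus the data bytes, and the start/stop conditions don’t add clock pulses.

```c
#include <assert.h>

/* 9 SCL pulses per byte (8 data bits + 1 ACK/NACK); a read is one
   address byte plus n data bytes. Start/stop add no clock pulses. */
static unsigned clocks_for_read(unsigned n_bytes) {
    return 9u * (1u + n_bytes);
}

/* At 100 kHz each SCL period is 10 microseconds. */
static unsigned micros_at_100khz(unsigned n_bytes) {
    return clocks_for_read(n_bytes) * 10u;
}
```

So a one-byte read is 18 clocks (180 µs at 100 kHz), and even the 8-write-plus-8-read pattern I described above would be 9 × 17 = 153 clocks - which is where my "over 150" number comes from.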

Anyways, I’ll be doing some tests soon, but I was curious whether someone with a better understanding of the WPI library and FPGA image could share some knowledge so I know what to expect.

Well, folks, I’m getting funny results.
I’m still not sure if the cRIO writes 8 bytes and reads 8 bytes every time. I suspect that it only writes and reads as much as it needs to.

The I2C driver I wrote for the tiny26 works when I test it by hand. (I use pushbuttons to generate the signal and LEDs to display it; there’s some debouncing logic in there as well.) However, when I hook it up to the cRIO, it only works intermittently. Sometimes the cRIO reads a bit or two offset from what the microcontroller is sending. Sometimes it gives me gibberish (i.e. it does something else I don’t yet understand). Most of the time it fails and returns an error.

Unfortunately, until I have access to a newer oscilloscope and more than one set of leads, I can’t really look at the signal - unless I monitor it using two GPIOs on the digital sidecar. I haven’t looked into that yet, but if the digital module is fast enough to output this signal, it should be fast enough to read it back in as well.

My best guess of what is happening is that the cRIO is running the I2C clock a bit too fast for the tiny26, and so most of the time it doesn’t work.

Any other guesses?
Does anyone have info on how fast the bus is (as implemented here), and what options like “I2C register” and “compatibility mode” actually do?
I think “compatibility mode” means it deals with ACK and NACK. I have no clue why you WOULDN’T want these features - they are your error checking.
The I2C register input - and I’m only guessing here - might be a message that is sent immediately prior to the read request. I don’t know if it’s actually two separate messages, or some hybrid read/write transaction that isn’t in the I2C spec. Maybe this is something from SMBus?
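If my guess is right, it would be the standard SMBus-style register read: write the register pointer, then a repeated start (not a stop) to reverse direction and read. Sketching the on-wire sequence - the enum and function names are just mine for illustration:

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

typedef enum { OP_START, OP_ADDR_W, OP_WRITE, OP_RESTART,
               OP_ADDR_R, OP_READ, OP_STOP } Op;

/* Hypothetical plan for a register read: write the register pointer,
   then a repeated start (no stop in between) to switch to reading. */
static size_t plan_register_read(uint8_t reg, size_t n_bytes, Op *out) {
    (void)reg;              /* the register byte would go out in OP_WRITE */
    size_t k = 0;
    out[k++] = OP_START;
    out[k++] = OP_ADDR_W;   /* slave address with R/W = write */
    out[k++] = OP_WRITE;    /* register pointer byte */
    out[k++] = OP_RESTART;  /* repeated start keeps the bus claimed */
    out[k++] = OP_ADDR_R;   /* slave address again, R/W = read */
    for (size_t i = 0; i < n_bytes; i++)
        out[k++] = OP_READ; /* NACK the last byte in real hardware */
    out[k++] = OP_STOP;
    return k;
}
```

The repeated start is the interesting part: it’s still one transaction, so another master can’t sneak in between the pointer write and the read.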

Joe Hershberger can give you better details than I can, but I’ll take a whack at it until he finds this thread.

The short of it is that the I2C implementation is limited by the speed of the digital IO and the bi-directional nature of the protocol.

A 100% compliant implementation requires the master to check whether the slave is holding the bus low on every single bit. Checking for clock stretching like this takes a few extra IO cycles, and is usually unnecessary. For speed, it is by default omitted. Compatibility mode checks for stretching in more places.
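For concreteness, the per-bit stretching check looks something like this (my own sketch, not the actual FPGA logic; scl_read() stands in for sampling the open-drain clock line, and the simulated "slave" here stretches for three polls):

```c
#include <assert.h>

/* Simulated open-drain SCL: the slave holds the line low for three polls
   before releasing it. Real code would sample the SCL pin instead. */
static int stretch_polls = 3;
static int scl_read(void) { return stretch_polls-- <= 0; }

/* After the master releases SCL, wait until the line actually reads high
   before timing the bit. Returns how many polls were spent waiting, or
   -1 on timeout. These extra IO reads on every bit are the cost that the
   default (non-compatibility) mode skips. */
static int wait_scl_high(int timeout_polls) {
    int polls = 0;
    while (!scl_read()) {
        if (++polls >= timeout_polls)
            return -1; /* slave is hanging the bus */
    }
    return polls;
}
```

If the master skips this check and the slave is still stretching, the master counts a clock edge the slave never saw - which would neatly explain reads that come back shifted by a bit or two.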

What I’m not clear on is where stretching is checked in which mode.

Well that could certainly cause some issues.
I haven’t checked yet what the response time is of my program.
I thought it was implemented in the FPGA, which could use a transparent latch on a shift register so the processor could deal with things four bytes at a time.