Hi folks,
I'm working on firmware that uses an external interrupt and measures how much time passes between successive interrupts.
My interrupt handler just subtracts the previous micros() timestamp from the current micros() reading to get the difference. It sets a flag high, and that's all.
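Here's roughly what the ISR boils down to, stripped to a minimal sketch (the pin and variable names are just placeholders, not my exact code):

```cpp
volatile unsigned long lastMicros = 0;   // timestamp of the previous interrupt
volatile unsigned long interval   = 0;   // measured period in microseconds
volatile bool newReading = false;        // flag set by the ISR, cleared in loop()

void handleInterrupt() {
    unsigned long now = micros();
    interval   = now - lastMicros;       // unsigned subtraction of timestamps
    lastMicros = now;
    newReading = true;                   // tell loop() a fresh reading is ready
}

void setup() {
    Serial.begin(9600);
    pinMode(D2, INPUT_PULLUP);                     // D2 is just an example pin
    attachInterrupt(D2, handleInterrupt, FALLING);
}

void loop() {
    if (newReading) {
        newReading = false;
        Serial.println(interval);        // print the latest measured period
    }
}
```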
I've tested this at different speeds (cycle times). With a 60-millisecond cycle, for example, it works and reports the right difference (around 60,000 microseconds). But when I slow things down or speed them up, I get weird values, like 900 microseconds on one cycle and then something like 40,000,000 on the next.
I've stripped everything else out of my code and it still does this. Is this related to the bug in micros() that someone discovered a while back? I'm on version 0.0.12.
My variables are all unsigned integers, and I don't think the problem is the values rolling over or going negative.
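As far as I understand it, unsigned subtraction wraps modulo 2^32, so the difference comes out right even across a micros() rollover. A quick worked example (the values are made up to straddle the wrap point):

```cpp
unsigned long before = 4294967000UL;    // just before the 32-bit micros() rollover
unsigned long after  = 500UL;           // micros() has wrapped past zero
unsigned long diff   = after - before;  // 796, thanks to mod-2^32 arithmetic
```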
I just can't figure out what's going on... (scratches head).
Thanks for the help