Thursday September 30, 2010
Hello,
Let me try to flesh out my idea some more. As I said in my first post, I do not have an immediate need; I am just looking for more ways to benchmark and debug the Maple software.
Let me be clear about what I am NOT proposing. I was NOT hoping the DLL could run on the Maple (too complicated and not enough memory); it is not clear how a DLL written for an x86 processor could easily run on a RISC processor such as the STM32. I was NOT hoping to reprogram, update, or patch a sketch running on the Maple (but that would be an interesting trick if it could be done).
When I named the initial post "Processing of Maple data using a DLL", I was focusing on the potential ability of the Maple to acquire DATA (via its ADC) faster (i.e., more samples/sec) and at a greater resolution (i.e., 12-bit vs. 10-bit) than the Arduino.
So I was imagining this sequence of events:
1) the Maple acquires data by reading analog pins (at a high rate & 12-bit resolution)
2) the Maple sends the analog data to the system (via some method). I will be honest and say I still do not fully appreciate the abilities, roles, and limitations of DMA, I2C, SPI, and the serial communication options available to the Maple/Arduino. The analog data might be stored temporarily in the Maple's SRAM or flash if the data can be sent more efficiently (e.g., faster) to the system in large bundles, instead of sending each individual analog reading one at a time, which is what I am doing now with my LabVIEW/Arduino code (see the sketch after this list).
3) a separate program running on the system (e.g., Processing) simply reads the data sent from the Maple and relays it to a DLL (e.g., a graphing program). This program might reformat the data a little, but not much else is done.
4) the DLL running on the system processes or interprets the data (e.g., graphing the data and saving the interpreted results in a file)
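To make items 1 and 2 concrete, here is roughly what I imagine the Maple side looking like. This is just a minimal, untested sketch: I am assuming the Maple's Wiring-style API (pinMode with INPUT_ANALOG, analogRead returning 0-4095, and SerialUSB with a buffer-write method), and the pin number and bundle size are only placeholders.

    // Minimal sketch (untested): read one analog pin repeatedly,
    // buffer the 12-bit samples in SRAM, and send each bundle to
    // the PC over the USB serial port in one go.

    #define ANALOG_PIN  0        // placeholder; use any ADC-capable pin
    #define BUF_SAMPLES 256      // placeholder bundle size (512 bytes of SRAM)

    uint16 buf[BUF_SAMPLES];     // each 12-bit reading fits in a 16-bit slot

    void setup() {
        pinMode(ANALOG_PIN, INPUT_ANALOG);   // configure the pin for the ADC
    }

    void loop() {
        // item 1: acquire a bundle of samples (each 0-4095 on the Maple)
        for (int i = 0; i < BUF_SAMPLES; i++) {
            buf[i] = analogRead(ANALOG_PIN);
        }
        // item 2: send the whole bundle at once instead of one reading at a time
        SerialUSB.write((const uint8*)buf, sizeof(buf));
    }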
So the Maple is just acquiring the data and sending it somehow to the system.
Assuming this can already be done with the slower Arduino, the "added value" is knowing whether the program running on the system (item #3) can read the data at the Maple's faster rate and whether the DLL (item #4) can also keep up.
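For items 3 and 4, I picture the PC side looking roughly like the sketch below: the Maple shows up as a virtual COM port, the program reads one bundle at a time, and hands each bundle straight to a function exported by the DLL. The DLL name "grapher.dll", the exported function "process_samples", and COM3 are all made up for the example; only the Win32 calls (CreateFile, ReadFile, LoadLibrary, GetProcAddress) are real, and the serial-port setup (baud rate, timeouts) is left out to keep it short.

    #include <windows.h>
    #include <stdio.h>

    // Hypothetical signature exported by the graphing DLL: a buffer of
    // 16-bit samples plus the number of samples in the buffer.
    typedef void (*process_samples_t)(const unsigned short *samples, int count);

    int main() {
        // Open the Maple's virtual COM port (the port name is just an example).
        HANDLE port = CreateFileA("\\\\.\\COM3", GENERIC_READ, 0, NULL,
                                  OPEN_EXISTING, 0, NULL);
        if (port == INVALID_HANDLE_VALUE) { printf("could not open port\n"); return 1; }

        // Load the (hypothetical) graphing DLL and look up its entry point.
        HMODULE dll = LoadLibraryA("grapher.dll");
        if (dll == NULL) { printf("could not load DLL\n"); return 1; }
        process_samples_t process_samples =
            (process_samples_t)GetProcAddress(dll, "process_samples");
        if (process_samples == NULL) { printf("function not found\n"); return 1; }

        unsigned short bundle[256];   // matches the bundle size sent by the Maple
        DWORD got = 0;
        for (;;) {
            // Read one bundle from the Maple and relay it to the DLL.
            if (!ReadFile(port, bundle, sizeof(bundle), &got, NULL)) break;
            process_samples(bundle, (int)(got / 2));   // 2 bytes per 12-bit sample
        }
        CloseHandle(port);
        return 0;
    }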
Of course, in an ideal world the original code for the DLL would be available and could easily be modified to interpret the larger 12-bit range (0-4095) of data values. Perhaps the person who wrote the DLL thought ahead and created a flexible method for informing the DLL of the resolution of the incoming data (e.g., 10-bit vs. 12-bit).
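By a "flexible method" I mean something as simple as one extra exported function the caller uses to declare the ADC resolution before sending any samples. The names here are invented purely for illustration:

    // Hypothetical DLL interface: the caller declares the ADC resolution
    // (10 for Arduino, 12 for Maple) up front, and the DLL scales its
    // graph axes / saved results accordingly.
    extern "C" {
        void set_adc_resolution(int bits);   // e.g. 10 or 12
        void process_samples(const unsigned short *samples, int count);
    }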
Where does an FPGA enter this conversation? How is an FPGA used? I know one will be on the Oak, but I have never used an FPGA before.
I hope this makes more sense.
Thanks!
Stephen from NYC