RS-232 communication, general timing question
Posted by Sunny Dee on Stack Overflow, 2010-04-29
Hi,
I have a piece of hardware that sends out one byte of data, representing a voltage signal, at a rate of 100 Hz over the serial port.
I want to write a program that reads in the data so I can plot it. I know I need to open the serial port and get an input stream from it, roughly as in the sketch below. But the next part is confusing me, and I'm having trouble understanding the process conceptually.
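For reference, this is roughly the setup step as I understand it. It's only a minimal sketch using the RXTX library (gnu.io); the port name "COM1", the baud rate, and the app name are placeholder assumptions, not values from my actual hardware:

    import gnu.io.CommPortIdentifier;
    import gnu.io.SerialPort;
    import java.io.InputStream;

    public class SerialSetup {
        // Opens the serial port and returns its InputStream.
        // "COM1" and 9600 baud are assumptions; substitute whatever
        // the hardware actually uses.
        static InputStream openPort() throws Exception {
            CommPortIdentifier id = CommPortIdentifier.getPortIdentifier("COM1");
            SerialPort port = (SerialPort) id.open("VoltagePlotter", 2000);
            port.setSerialPortParams(9600,
                    SerialPort.DATABITS_8,
                    SerialPort.STOPBITS_1,
                    SerialPort.PARITY_NONE);
            return port.getInputStream();
        }
    }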
I create a while loop that reads the data from the input stream one byte at a time. How do I time the while loop so that a byte is always available whenever it reaches the read call? I'm guessing I can't just put a sleep inside the loop and try to match it to the hardware's sample rate. Is it instead just a matter of reading from the input stream continuously in the loop, so that if my loop is too fast the read simply waits (since there's no new data yet), and if it's too slow the bytes accumulate in the input stream's buffer until I get to them?
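Here's roughly the loop I have in mind, as a sketch. It assumes "in" is the InputStream from the setup above and that read() follows the standard blocking InputStream contract (waits until a byte arrives):

    // Sketch of the read loop. InputStream.read() blocks until a byte
    // is available, so the loop naturally paces itself to the hardware's
    // 100 Hz rate with no explicit sleep. It returns -1 on end of stream.
    int b;
    while ((b = in.read()) != -1) {
        int sample = b;  // raw unsigned sample value (0-255), one per tick
        // ... hand the sample off to the plotting code ...
    }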
Like I said, I'm only trying to understand this conceptually, so any guidance would be much appreciated! I'm guessing the idea is independent of which programming language I use, but if not, assume it's for use in Java.
Thanks!