Quote:
Originally Posted by Tom Line
That's great information.
Ok - next question. If we place the polling blocks in "parallel" (I assume that simply means next to each other rather than in a series string), can the hardware actually poll the sensors in parallel, or will it be queued up and poll each piece of hardware in series?
Yes, next to each other. You can have common inputs going into the loops, common outputs coming from the loops, but if one loop feeds the other through a dataflow wire, the downstream loop cannot begin until the upstream loop completes and hands over the data.
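Since LabVIEW diagrams are graphical, here is a rough textual analogy of that rule in Python (the function names are hypothetical stand-ins, not anything from LabVIEW): two independent "loops" submitted to a pool can run concurrently, but a "downstream" loop that needs the upstream result must wait for it.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for two LabVIEW loop bodies.
def upstream():
    return [1, 2, 3]          # produces data on its output "wire"

def downstream(data):
    return sum(data)          # consumes the upstream result

with ThreadPoolExecutor() as pool:
    # Independent loops: both submitted at once, free to run in parallel.
    a = pool.submit(upstream)
    b = pool.submit(upstream)

    # "Wired" loops: downstream cannot start until upstream's data exists.
    result = downstream(a.result())

print(result)
```

The key point is the same as on the diagram: parallelism comes for free when there is no data dependency, and a dataflow wire is exactly a data dependency.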
The parallel loops are internally scheduled using a pool of threads, typically 4 per processor in the system. This means that if a loop calls into some code that suspends the thread waiting for an operation to complete, a parallel loop can be scheduled on another thread to fill the void. Additionally, most of the I/O in LV is built using nonblocking APIs, which means that even the original thread is available to do work while the first loop's operation is in progress. A good example of this is network communications. Doing a read with a timeout of 100ms means that nodes dependent on the read will wait for its results; it doesn't mean that parallel code can't borrow the thread and make progress while the TCP stack does its job.
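A small sketch of that scheduling behavior, again as a Python analogy with made-up function names: one task blocks for roughly 100 ms (standing in for a timed-out read), while a second task does CPU work on another pool thread during the wait.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def slow_read():
    # Stand-in for a network read with a 100 ms timeout: the calling
    # thread suspends, but other pool threads keep running.
    time.sleep(0.1)
    return "payload"

def busy_loop():
    # CPU work that proceeds while the other task waits.
    return sum(range(1000))

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as pool:
    read = pool.submit(slow_read)
    work = pool.submit(busy_loop)
    results = (read.result(), work.result())
elapsed = time.perf_counter() - start

# Total time is roughly the 100 ms wait, not the sum of both tasks,
# because the suspended thread did not stall the parallel one.
print(results)
```

Only the node that actually consumes the read's result has to wait; everything else keeps moving.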
And of course on truly multicore machines, the loops really can run in parallel, and even the contents of a single loop can run in parallel to the extent that dataflow allows.
Greg McKaskle