Quote:
Originally Posted by Greg McKaskle
On the LV compilation topic, the LV source code is a dataflow graph of objects -- diagrams contain nodes connected by wires, with the occasional node containing other diagrams.
The objects are visited over several passes in order to perform compilation tasks.
1. Data types are propagated after each edit.
2. Nodes are validated and syntax errors identified after each edit.
3. An algorithm performs what we call clumping -- coloring the graph based upon asynchronous operation.
4. Another algorithm improves inplaceness, reordering nodes to execute in an order which minimizes data copies.
5. Nodes allocate data storage.
6. Nodes emit code into clumps.
Clumps are blocks of memory that contain machine instructions in binary form. You can disassemble the instructions if you like and display them in text.
|
Thanks for this. It is good information to know.
That said, it's not the compilation I was asking about.
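Still, so that we're working from the same picture, here is roughly how I read the pass sequence you describe, written out as a deliberately over-simplified C sketch. Every name and structure in it is my own invention for the sake of discussion -- I am not claiming this is how NI actually implements any of it.

#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

typedef struct Node {
    const char *op;       /* what the icon/node does, e.g. "add"         */
    int         clump_id; /* set by the clumping pass                    */
    size_t      storage;  /* bytes of data storage assigned to this node */
} Node;

typedef struct Diagram {
    Node  *nodes;
    size_t n_nodes;
} Diagram;

/* A clump, per the description above: a block of memory that ends up
   holding machine instructions.                                        */
typedef struct Clump {
    uint8_t code[256];
    size_t  len;
} Clump;

/* Stand-in for passes 1-2: a trivial validity check (real type
   propagation and syntax checking would go here).                      */
static int check_diagram(const Diagram *d) {
    for (size_t i = 0; i < d->n_nodes; i++)
        if (d->nodes[i].op == NULL)
            return 0;  /* the equivalent of a broken wire */
    return 1;
}

/* Stand-in for pass 3 (clumping): trivially colors every node into
   clump 0; the real pass partitions by asynchronous operation.         */
static void clump_diagram(Diagram *d) {
    for (size_t i = 0; i < d->n_nodes; i++)
        d->nodes[i].clump_id = 0;
}

/* Stand-in for passes 4-6: inplaceness reordering is skipped entirely;
   each node gets a fixed storage size and emits one placeholder byte
   where real machine instructions would go.                            */
static void emit_diagram(Diagram *d, Clump *c) {
    for (size_t i = 0; i < d->n_nodes && c->len < sizeof c->code; i++) {
        d->nodes[i].storage = 8;
        c->code[c->len++] = 0x90;  /* placeholder, not a real emitted op */
    }
}

int main(void) {
    Node nodes[] = { { "add", 0, 0 }, { "multiply", 0, 0 } };
    Diagram d = { nodes, 2 };
    Clump clump = { { 0 }, 0 };

    if (!check_diagram(&d))
        return 1;
    clump_diagram(&d);
    emit_diagram(&d, &clump);
    printf("emitted %zu placeholder bytes into clump %d\n",
           clump.len, nodes[0].clump_id);
    return 0;
}

The point of the sketch is only to fix terminology for what follows, not to capture anything about the real passes.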
Assuming that a node is what I was calling an icon and graphically represents some logical or computational operation, there must exist some sequence of machine instructions to implement that operation.
You referred to these as clumps.
The assertion in question was that there is no traditional textual source code behind what you call clumps -- though that term was not part of the original discussion.
My claim is simply that the machine instructions contained in those clumps were almost certainly produced by a traditional compiler from a traditional text-based programming language -- quite likely C.
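To make that claim concrete, here is the kind of thing I have in mind -- a purely hypothetical sketch with names of my own choosing, not anything from LabVIEW's actual runtime. Somewhere, the behavior of a "+" node has to bottom out in machine instructions, and the most plausible origin of those instructions is ordinary compiled C:

#include <stdio.h>

/* The "+" icon, as ordinary text source: */
static double prim_add(double a, double b) {
    return a + b;  /* a traditional compiler reduces this to a few machine instructions */
}

/* The graphical layer then only needs a table from node types to the
   already-compiled primitives:                                          */
typedef double (*binary_prim)(double, double);

static binary_prim lookup_primitive(const char *op) {
    if (op != NULL && op[0] == '+')
        return prim_add;
    return NULL;
}

int main(void) {
    binary_prim f = lookup_primitive("+");
    if (f != NULL)
        printf("3 + 4 = %g\n", f(3.0, 4.0));
    return 0;
}

Whether the generated code in a clump calls such a function or copies its instructions inline, the instructions themselves started life as text that went through a conventional compiler.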
For the record, this didn't start out as a LabVIEW discussion, and I didn't lead it here -- nor did I want to.
The discussion began with a claim that iconic programming was replacing text-based programming.
I claimed that, on the contrary, text-based programming is the foundation on which iconic programming is built.
Icons (nodes if you prefer) graphically represent machine code.
AFAIK, other than compiling and/or assembling text files, we have no spiffier way of producing the machine code.