I’ve been admiring LabVIEW, a visual programming language created by a company called National Instruments. To be precise, the language itself is actually called “G”. National Instruments didn’t set out to solve the multicore problem; in fact, LabVIEW was introduced way back in 1986, before anyone was worried about multicore. It’s used for data acquisition, instrument control, and industrial automation tasks.
Jim Kring called my attention to LabVIEW in a couple of different ways. First, I came across his great book, LabVIEW for Everyone, in a Borders Books. It’s a wonderful introduction to the potential of this language, but you won’t find it in the programming section; it’s over by the engineering books. Second, Jim and I have corresponded about the Multicore Crisis for a little while, and he recently posted on his great blog about how LabVIEW is one potential answer to the problem.
Why is LabVIEW able to scale more easily than conventional curly-braced languages? The answer is simple, and perhaps surprising: the language says very little about the sequence of execution. It is what’s called a Dataflow Language. You’re almost certainly familiar with the world’s most common dataflow language already: spreadsheets. Like spreadsheets, LabVIEW has a very simple way of specifying sequence of execution. An object in LabVIEW executes whenever all of its inputs are ready. This corresponds to a spreadsheet, where a formula may be recalculated as soon as all of the cells it depends on have been recalculated. So, in theory, every cell that feeds an input into a given cell may be computed in parallel. Likewise with the objects in LabVIEW. Conventional languages, by contrast, consist of lists of steps that must be evaluated strictly in order.
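To make the dataflow idea concrete, here’s a minimal sketch in Python (the node names are hypothetical; this is an illustration of the execution model, not LabVIEW’s actual runtime): each node fires as soon as all of its inputs are ready, so nodes with no dependency on each other can run in parallel.

```python
# Dataflow-style execution sketch: independent "cells" run concurrently,
# and a downstream node fires only once all of its inputs are available.
from concurrent.futures import ThreadPoolExecutor

def node_a():          # an independent cell -- no inputs
    return 2

def node_b():          # another independent cell
    return 3

def node_sum(a, b):    # fires only when both inputs are ready
    return a + b

with ThreadPoolExecutor() as pool:
    fa = pool.submit(node_a)   # node_a and node_b are scheduled concurrently
    fb = pool.submit(node_b)
    # .result() blocks until each input is ready -- the dataflow firing rule
    result = node_sum(fa.result(), fb.result())

print(result)
```

Notice that nothing in the code says whether `node_a` or `node_b` runs first; like cells in a spreadsheet, the order of independent computations is simply left unspecified, which is exactly the freedom a parallel runtime needs.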
Here is the other amazing thing about these two. People may complain about how hard it is to learn parallel languages like Erlang, but who complains about learning spreadsheets? They’re easy. LabVIEW is so easy that National Instruments has managed to build an almost $700M-a-year business around it. Their EBITDA income was a little over $100M on that, and they are growing at about 20% a year. Now, Sun won’t break out Java’s revenues, but this company is nearly 10% of Sun’s size. I have to figure that if Java were close to $700M, they’d want to talk about it. How’s that for the biggest language company you’ve never heard of?
When we compare LabVIEW to something like Erlang, it stacks up pretty well. Recursion is a fundamental construct in many parallel languages like Erlang, but it isn’t necessarily a fundamental construct for many human minds. Yet the ideas of delegation (one form of parallelism) and the assembly line (another form, often called pipelining) are very natural for people, and both are inherent in dataflow languages such as spreadsheets and LabVIEW.
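The assembly-line intuition can be sketched in a few lines (a toy two-stage pipeline of my own invention, not anything from LabVIEW or Erlang): one stage produces items while the next stage is already consuming them, the two connected by a queue just as stations on an assembly line are connected by a conveyor belt.

```python
# Pipeline (assembly-line) parallelism sketch: two stages run concurrently,
# handing work items from one to the next through a queue.
import queue
import threading

q = queue.Queue()
results = []

def stage_one():
    for item in range(5):
        q.put(item * 2)        # stage one: double each item
    q.put(None)                # sentinel: signal end of the stream

def stage_two():
    while True:
        item = q.get()
        if item is None:       # stop on the sentinel
            break
        results.append(item + 1)  # stage two: add one to each item

t1 = threading.Thread(target=stage_one)
t2 = threading.Thread(target=stage_two)
t1.start(); t2.start()
t1.join(); t2.join()

print(results)
```

The point is that stage two doesn’t wait for stage one to finish; it starts working on the first item as soon as it arrives, which is the pipelining that dataflow diagrams express so naturally.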
There are other languages like this floating around too. Yahoo’s Pipes, designed for building mashups, is a very cool one. The ease of use seems to carry over into this domain as well, judging by various examples I’ve read:
- SOA is easier in Yahoo Pipes.
- Blog filtering
- Preprocess data for use in a Flash application
- A better “Web 2.0” answer to BPEL?
The list goes on, but there seems to be a lot of potential unleashed when you quit thinking about things as a linear list to be solved strictly in order and break loose a bit more parallelism. Like I said, it’s something completely different to think about.