In early August, I had the pleasure of participating in National Instruments’ annual NIWeek user group event. NI’s customers are scientists and engineers who are experts in a wide array of disciplines, from nanotechnology and photo-optics, to the design of alternative-energy power supplies in automobiles, to the control of robots and other manufacturing processes, to the design of signal processing systems on programmable embedded chips in today’s cell phones.
These engineers and scientists use NI’s virtual instrumentation software toolkit, LabVIEW, to design, prototype, and deploy applications that measure real-world phenomena—analog signals and physical movement—analyze these signals, describe the actions that need to be taken, send out the signals to execute those actions (usually in parallel), analyze the results, and take additional actions. Whether the device being programmed is a nanorobot being used to splice genes or a spectrum analyzer being used to measure radio-frequency interference, the scientist is dealing with real-world phenomena in real time.
Hanging out with these real-world scientists and engineers got me thinking about the future of programming as we know it today. The future of programming is a topic to which NI’s top executives have also been giving a lot of thought.
As faithful readers know, LabVIEW is now the graphical programming environment used by elementary-school-age kids who can design, build, and manipulate robots using LEGO Mindstorms NXT. Jeff Kodosky is the original inventor of the LabVIEW environment, which was designed to be the equivalent of a spreadsheet for engineers and scientists—a simple tool that lets them model their world and do their jobs. At NIWeek, Jeff Kodosky gave a compelling peek into the future of LabVIEW.1
Parallel Programming for a Parallel World
“The world is parallel. We always do things at the same time, like walking and chewing gum. So why do we constrain ourselves to sequential thinking when we write computer programs?” Jeff Kodosky said. “The fundamental problem is that we have the wrong model of computation in a parallel world. Von Neumann’s stored program model in which a program is a sequence of instructions acting against a global state is an embarrassingly simple model of computation.”
“Programming models need to be more human-centric,” Kodosky went on to say, “and to allow programs to describe concurrency naturally. Stop thinking of programs as sequences of instructions and think of them as parallel flows of data.” This inherently parallel data flow model is at the core of the LabVIEW toolkit. Back when LabVIEW was originally designed, Jeff explains, “We knew that measurement apps were inherently parallel. Physical signals exist as parallel phenomena.”
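Because LabVIEW’s G language is graphical, the dataflow idea is easiest to suggest here in a text language. The following is a minimal sketch in Python, with invented function names and sample values (none of it is NI code): the two acquisition branches share no data, so nothing forces one to wait for the other, and a runtime is free to execute them concurrently, much as two unconnected wires on a block diagram can.

```python
# A minimal sketch of dataflow-style concurrency in Python (not LabVIEW's
# graphical G language). The functions and values are invented for
# illustration; real acquisition would talk to hardware.
from concurrent.futures import ThreadPoolExecutor


def acquire_temperature():
    # Stand-in for reading an analog temperature channel.
    return [20.1, 20.3, 20.2]


def acquire_vibration():
    # Stand-in for reading an accelerometer channel.
    return [0.02, 0.05, 0.03]


def average(samples):
    return sum(samples) / len(samples)


# The two branches below have no data dependency on each other, so a
# runtime may execute them at the same time.
with ThreadPoolExecutor() as pool:
    temperature_future = pool.submit(acquire_temperature)
    vibration_future = pool.submit(acquire_vibration)
    temperature_avg = average(temperature_future.result())
    vibration_avg = average(vibration_future.result())

print(temperature_avg, vibration_avg)
```

The point is the shape of the program: the dependencies, not the order of the statements, determine what has to wait for what.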
The Challenge of Deploying Single Applications Across Multicore Processors
The LabVIEW graphical programming language and environment has been in use now for more than 20 years. NI’s founder and president, Dr. James Truchard, or “Dr. T” as he’s affectionately called, has a vision for LabVIEW as a more general-purpose programming language for the future of computing. “As multicore processors become standard, there is an increased need for parallel programming languages that can take advantage of ever-quickening multiprocessor speeds,” Dr. T said in his opening keynote. “LabVIEW, with its inherent multithreaded architecture, fits the bill.”
The computer industry is already moving to multicore processors. “You can’t increase performance by increasing clock speed,” Jeff Kodosky explained. “Two- and four-core machines are common today... Multicore machines allow you to run multiple programs in parallel. But getting a single app to run faster on a multicore machine is a completely different and much more challenging problem. Every app that needs to run across multiple processors will need to be rewritten as a multithreaded application, and this is a daunting task. In fact, Microsoft’s chief research and strategy officer said about the multicore challenge, ‘I personally think this is one of the most disruptive things the industry will have to go through.’ Good tools for multicore programming are not expected to be available until 5 to 10 years from now, and they’ll need to be able to deal with processors that have 16 times the power of today’s machines.”
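To make that “daunting task” concrete, here is a rough Python sketch with an invented, CPU-bound workload (it does not reflect any particular vendor’s tooling). The sequential loop is trivial to write but uses a single core; making the same computation use all the cores means partitioning the work across worker processes and recombining the partial results, and that restructuring is the easy case; real applications also have shared state and ordering constraints to untangle.

```python
# A rough Python illustration of the single-application multicore problem.
# The per-sample "score" computation is invented for the example.
from multiprocessing import Pool


def score(reading):
    # Placeholder for a CPU-heavy per-sample computation.
    return sum(i * reading for i in range(10_000))


def main():
    readings = list(range(1_000))

    # Sequential version: simple to write, but it runs on one core.
    serial_total = sum(score(r) for r in readings)

    # Multicore version: the same work has to be partitioned across
    # worker processes and the partial results recombined.
    with Pool() as pool:
        parallel_total = sum(pool.map(score, readings))

    assert parallel_total == serial_total


if __name__ == "__main__":
    main()
```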
“We have a successful parallel language for multicore machines today. You can exploit the performance of multicore machines now. The ultimate architecture for parallel programming is the FPGA (Field Programmable Gate Array) and, of course, LabVIEW is already there,” Jeff Kodosky exclaimed.
Dr. T highlighted real-world examples of how LabVIEW, combined with multicore processors, provides high-performance computation and I/O capabilities for test, control, and design today. He described how researchers at the Max-Planck-Institut für Plasmaphysik obtained a 20-fold processing speed increase on an eight-core machine over a single processor.
Jeff Kodosky went on to say, “As the rest of the industry struggles to come to grips with multicore machines, you LabVIEW programmers can just continue to do what you’ve always done. You focus on your application, drawing in as much parallel activity as you want.” He explained that it’s NI’s job to be sure that the parallelism expressed in your diagram translates and executes well on today’s computational engines.
Using a Graphical Programming Language
The other area in which NI is pushing the envelope with its development plans for LabVIEW is its evolution as a graphical programming language. One of the original ideas behind LabVIEW was to enable engineers to draw block diagrams of the systems they are designing and to have those diagrams come to life as running code. LabVIEW has come pretty far down that path. Traditionally, block diagrams and data flow diagrams, such as the ones we all draw in Visio and other diagramming tools, are high-level abstractions with no real rules. “The graphical depiction was a form of system documentation, not a language for system design,” Jeff Kodosky explained. We wanted “a way to turn these engineering block diagrams into a rigorous design tool. We started with data flow diagrams,” Jeff explained, “then added structured programming. Structured data flow is the heart of LabVIEW. LabVIEW’s block diagrams are rigorously defined with precise semantics, which allows them to be executed. The diagram is the code.”
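As a loose analogy, and emphatically not how LabVIEW is implemented, you can treat a block diagram as data: nodes plus wires, where a node may fire as soon as all of its inputs have arrived, and nodes with no dependency between them may fire in parallel. The toy diagram and scheduler below, in Python, are invented purely to illustrate that “the diagram is the code” has a precise, executable meaning.

```python
# A toy "diagram as code" executor in Python (an invented analogy, not
# LabVIEW's implementation). Each node is a function plus the names of
# the nodes that feed it; a node fires once all of its inputs exist.
from concurrent.futures import ThreadPoolExecutor

diagram = {
    "sensor_a": (lambda: 3.0, []),
    "sensor_b": (lambda: 4.0, []),
    "scale_a": (lambda a: a * 10, ["sensor_a"]),
    "scale_b": (lambda b: b * 10, ["sensor_b"]),
    "combine": (lambda a, b: a + b, ["scale_a", "scale_b"]),
}


def run(diagram):
    results, pending = {}, dict(diagram)
    with ThreadPoolExecutor() as pool:
        while pending:
            # Every node whose inputs are all available can fire now,
            # and independent nodes fire in parallel.
            ready = [name for name, (_, deps) in pending.items()
                     if all(d in results for d in deps)]
            futures = {
                name: pool.submit(pending[name][0],
                                  *(results[d] for d in pending[name][1]))
                for name in ready
            }
            for name, future in futures.items():
                results[name] = future.result()
                del pending[name]
    return results


print(run(diagram)["combine"])  # 70.0
```

Here the wiring alone determines that the two sensor nodes, and then the two scaling nodes, can run concurrently, while the combine node waits for both of its inputs.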
At NIWeek, the R&D team offered a sneak peek into the future of LabVIEW as a combined system-diagramming and graphical programming environment. They are adding multiple layers of abstraction, the ability to monitor and simulate process flows, the ability to see interrelationships among system components that are interacting with one another, and much, much more. One way to think of LabVIEW in the future is as a design and simulation tool. You’ll be able to design complex systems, model and simulate them, monitor and test actual physical systems, and make adjustments to them based on your real-time measurements and simulations.
And, best of all, our kids and grandkids, at least those who have had experience using LabVIEW in the form of LEGO Mindstorms NXT or in the classroom, will already know how to design and model their worlds!
Insights into Innovation and Invention
My real purpose for attending NIWeek was to participate in a panel of experts on innovation. Although the panel was entitled “Breakthrough Innovations,” all of us agreed that many innovations are actually evolutionary in nature.
The panelists included:
- Scott Jordan, Director of Nano-Automation Technologies at Physik Instrumente, one of the first 100 LabVIEW customers. Scott became infected with the LabVIEW virus and brought LabVIEW to the field of opto-photonics. He has been part of the LabVIEW ecosystem for more than 20 years.
- Suchit Jain, VP of Product Strategy at SolidWorks. Suchit’s goal is to work with innovative customers to figure out growth opportunities for SolidWorks.
- Dr. Jim Truchard, “Dr. T,” president of National Instruments, who is working on LabVIEW for the next 20 years.
- Andrew Hargadon, Professor at the University of California at Davis Graduate School of Management and author of *How Breakthroughs Happen*. His take: most innovations are actually new combinations of old ideas.
You are welcome to watch the video of our panel discussion. I found it pretty interesting. You’ll find it at http://www.ni.com/niweek
Making Innovation a Core Competency
We’ve engaged in a lot of discussions over the past several months with clients who are interested in tapping the power of their lead customers and lead users. This week, we offer an assessment survey tool you can use to determine where your firm sits on the continuum between closed, proprietary development processes and open, innovative development processes.
*Footnote*
1) You can view a video of Jeff Kodosky’s presentation at this URL: http://www.ni.com/niweek
*Comments*
Let me clarify my point on LabVIEW and touch ready... I mean that the concept of direct manipulation in LabVIEW, of picking blocks and placing them and then wiring them together, is more amenable to "touch-based development" than using a touch-based text editor to write code. I am optimistic that, for many use cases, developing code on a pure touch interface with LabVIEW will be preferable to mouse and keyboard.
Posted by: labview programming | December 01, 2014 at 11:08 AM
Patty, I spotlighted a few of your comments in my own blog post: http://carpenano.blogspot.com/2007/08/patricia-seybold-on-breakthrough.html
Posted by: Scott Jordan | August 30, 2007 at 01:49 PM
Thank you, Patty, for this insightful and detailed post. I enjoyed serving with you on the NI Week "Industry Experts" panel on Breakthrough Innovations, and learning more about your work with organizations seeking to optimize the innovation process by putting the customer in the driver's seat.
Posted by: Scott Jordan | August 30, 2007 at 12:55 PM