In early August, I had the pleasure of participating in NIWeek, National Instruments' annual user group event. NI's customers are scientists and engineers who are experts in a wide array of disciplines: from nanotechnology and photo-optics, to alternative-energy power supplies for automobiles, to the control of robots and other manufacturing processes, to the design of signal processing systems on the programmable embedded chips in today's cell phones.
These engineers and scientists use NI's virtual instrumentation toolkit, LabVIEW, to design, prototype, and deploy applications that measure real-world phenomena such as analog signals and physical movement, analyze those signals, determine what actions need to be taken, send out the signals to execute those actions (usually in parallel), analyze the results, and take further action. Whether the device being programmed is a nanorobot splicing genes or a spectrum analyzer measuring radio frequency interference, the scientist is dealing with real-world phenomena in real time.
Hanging out with these real-world scientists and engineers got me thinking about the future of programming as we know it today. It is a topic to which NI's top executives have also been giving a lot of thought.
As faithful readers know, LabVIEW is now the graphical programming environment that elementary-school-age kids use to design, build, and manipulate robots with LEGO Mindstorms NXT. Jeff Kodosky, the original inventor of LabVIEW, designed it to be the equivalent of a spreadsheet for engineers and scientists: a simple tool that lets them model their world and do their jobs. At NIWeek, Kodosky gave a compelling peek into the future of LabVIEW [1].