Archive for the ‘labview_news’ Category

The vast majority of LabVIEW applications, particularly in the broad test and measurement market, start with hardware configuration. Whether the issue is installing software components in the wrong order, default device name errors, or device discovery, the initial process of configuring the hardware is often an intrusive and unnecessary barrier to starting the application.

Among the many market research studies, focus groups, and customer feedback engagements that our group drives, strong hardware integration continues to be one of the top two benefits for LabVIEW users (along with the productivity of graphical programming). I can talk—or I guess write—for days about the benefit of our driver APIs, consistency in programming paradigms across hardware families, and the sheer number of devices that LabVIEW can communicate with, but I’d rather focus on where we can improve.


One of the aspects of my role in Product Marketing that I immensely enjoy is teaching our sales engineers how to demo LabVIEW. Every one of these engineers is an expert in the product and can teach it all day long with their eyes closed, but demoing a product is a completely different skill than knowing it or even teaching it. It’s maddening to see how many of these demos actually start by opening Measurement and Automation Explorer. Gasp! As great as LabVIEW is at hardware integration, it’s actually quite lacking in the system configuration aspect of system design, so let’s focus there.


There are three key areas of system design where we’re innovating:


Initial Device Discovery

Some of the most consistent feedback I hear on LabVIEW typically sounds something like “I have to write code to do anything.” Even minuscule tasks, such as verifying that a hardware target is discovered, require wires to be connected and the run button pressed. My initial response to this feedback was a nice, professional way of saying “Duh, it’s a programming language.” But let’s be disciplined about separating how you accomplish a task in G from how you accomplish a task in LabVIEW. For software that carries the torch for “world-class hardware integration,” you should expect to discover your connected devices without writing code. Sure, doing so in G requires coding, but the LabVIEW environment should empower this simple task.




Real-Time Feedback on Signal Connectivity

Human physiology is an amazing, beautiful thing. Our brain chemically rewards us with dopamine, a neurotransmitter tied to motivation and reward, for things like being right, completing a task, or working out. This is one of the fundamental reasons that engineers derive emotional satisfaction from LabVIEW. Because LabVIEW is constantly compiling code, instead of requiring a large, explicit compile step at the end, you continually get these dopamine hits as you build out the individual components of your VI.

We should be driving this instant gratification within the configuration of hardware. Beyond just knowing the hardware is connected and recognized, you should also be able to validate its intended functionality quickly and easily—and without that pesky requirement of writing code.



To the right is a snippet of a screenshot from a super-secret prototype version of LabVIEW. This shows some of the concepts we’re driving towards, like integrating some “liveness” into the environment. What you see is a view within the environment itself confirming the signal input with a preview visualization.




System Visualization

Particularly for a graphical environment housing a graphical language, we can surely find a better way than a vertically organized tree to manage and display hardware. Imagine a world where you plug in a device and open the software to see a physical representation that reinforces not only what the hardware looks like, but also visualizes the connectivity and functional organization of the device.




This level of integration is made possible by NI’s platform approach and the natural synergy between our hardware and software. From plug-in hardware to deployed modular platforms, this level of insight would drastically simplify system visualization and configuration.

Now, if only this could display the next level of connections for sensors or DUTs.






Today’s Featured Author

Jeff Phillips considers LabVIEW as essential to his day as food, water, and oxygen. As senior group manager for LabVIEW product marketing at NI, Jeff focuses on how LabVIEW can meet the changing needs of users. You can follow him on Twitter at @TheLabVIEWLion.


I’m going to contradict myself in this month’s blog post. I don’t use clichés much; however, I (usually) find them to be accurate and descriptive. (Note: because most clichés have become trite or irritating, we often forget that their origins were based in truth.) Let me take you back to when I started at NI.


In the late ‘90s, reprogrammable silicon was considered mainstream across consumer, automotive, and industrial applications. Following the critical invention of the XC2064 FPGA by Freeman and Vonderschmitt, the FPGA was becoming a coveted technology for its compute power, field-upgradability, and performance. However, the tools for programming FPGAs kept domain experts locked out, making the technology seem too good to be true. Or so I thought.


In 2001, I began working with an in-development product we had demoed at NIWeek a few years earlier, but hadn’t released yet. This not-so-secret project, code named “RVI” for reconfigurable virtual instrumentation, was a graphical design approach to programming an FPGA. With a computer science and math background, I find abstract, software-centric work more comfortable and familiar than meticulous hardware design. So the idea that you (or even a CS person like me) could abstract away a ton of silicon details and program the hardware with a productive tool like LabVIEW (rather than an HDL) seemed impossible.


This is where the contradiction begins. It wasn’t too good to be true; the cliché was wrong. It was good AND it was true. Luckily, I could rely on another well-known phrase used at NI to describe the innovation taking place: “the genius of the AND” inspired by author Jim Collins. With productive, graphical programming; system abstraction; AND hardware design for dedicated determinism including 25 ns I/O response, protocol customization, and rapid prototyping, LabVIEW FPGA breaks the cliché.


I’m not the only geek who gets excited about this capability. Stijn Schacht of T&M Solutions took advantage of the control accuracy of an FPGA to lift 20-metric-ton unbalanced trays of uncured concrete more than 6 meters while maintaining a strict accuracy of two millimeters. Because he used LabVIEW to get that precision from the FPGA, his team developed an application in only two months and was able to reuse the modular code for their next project.


Kurt Osborne at Ford Motor Company is a believer as well. Ford used LabVIEW FPGA to design and implement a real-time embedded control system for an automotive fuel cell system.



The LabVIEW Communications environment enables an entire design team to map an idea from algorithm to FPGA using a single high-level representation.


So what’s next? I encourage you to explore the latest cliché contradiction that takes FPGA design to the next level – LabVIEW Communications System Design Suite.


LabVIEW Communications is a complete design flow (with bundled software defined radio hardware) for wireless communications algorithms. This suite includes everything from an integrated FPGA flow, to an HLS compiler, to a new canvas (Multirate Diagram) for streaming algorithms, and an innovative way to explore your hardware system with the NI System Designer. The genius of the AND lives on in LabVIEW Communications.


Explore the latest cliché contradiction today at




Today’s Featured Author

Shelley Gretlein is a self-proclaimed software geek and robot aficionado. As NI’s director of software marketing, you can find Shelley championing LabVIEW from keynote stages to user forums to elevator conversations. You can follow her on Twitter at @LadyLabVIEW.

Today’s post is part of a monthly series exploring areas of focus and innovation for NI software.

The term “user interface” doesn’t adequately capture the depth of experiences we have while interacting with today’s mobile and desktop computing platforms that serve as portals into the evolving virtual world. Personally, I have never been more motivated by the trends in computing platforms with the web and cloud and radical innovations in UI ranging from touch to advanced visualizations of massive amounts of data. However, the fracturing of the Wintel monopoly creates a challenge for software developers of all kinds to be able to develop solutions that can reach all customers on all of the screens they use.

There isn’t really one technology that can deliver native, first-class experiences on all mobile platforms, all web platforms (aka browsers), and all operating systems. We are faced with a clear cost versus reach versus “native-ness” tradeoff.

It’s interesting to watch as key technologies like HTML5 rapidly adapt their core capabilities and browser vendors continue to optimize the performance of their JavaScript engines. There is massive potential in these rapidly improving Web technologies with strong use cases that not only provide traditionally-oriented controls and indicators for data visualization, but also have the power to support full “sovereign” capabilities like rendering and even editing LabVIEW diagrams themselves.

You can’t really talk about Web front ends without acknowledging the back-end server or cloud infrastructure with which the front end communicates. When NI looks at the world through the lens of acquire, analyze, and present in the context of the Web, we naturally map our presentation layer of VIs and controls and indicators to elements within the browser. However, the acquisition, storage, and analysis functions need to run server-side, and at scale, with Big Data, or as we like to say, Big Analog Data. We are creating IT-friendly, server-side middleware that can manage the acquisition and storage of Big Analog Data and then provide Web-friendly APIs to that data so customers can quickly create VIs to visualize and analyze the data.

We are not only seeing massive shifts on the Web UI front, but also with mobile and tablet experiences, specifically around touch. For a language like LabVIEW, predicated on direct manipulation and a rich visual and spatial interaction model, we see a clear match between touch-enabled platforms, graphical programming, and interacting with your data.

We feel LabVIEW is the most touch-ready language on the planet and we think the best way to interact with controls and indicators for data visualization is also touch-based. Thus, we want to enable touch-based features for LabVIEW and touch-based features for your VIs. 

Today, the NI R&D team, in collaboration with LEGO, is developing a simplified version of the LabVIEW editor that is touch-ready and tablet-friendly. The early prototypes in the lab are a delight to use, and when you see and use them, I hope you agree. NI is well suited to bring the beneficial evolution of UI capabilities on the Web and in touch to engineers, scientists, and students, and we are working hard to do exactly that.

Stay tuned as we discuss more features the NI team is exploring for updates to NI software.



Today’s Featured Author

David Fuller has nearly 20 years of experience in software engineering and is currently NI’s vice president of application and embedded software. You can’t follow him on Twitter because he’s a software engineer.

Today’s post is part of a monthly series exploring areas of focus and innovation for NI software.



The introduction of LabVIEW to the market was defined by a revolutionary graphical user interface that included a combination of built-in libraries for engineering applications and a drag-and-drop UX schema. I’ve heard stories from the “original” generation at NI regarding the jaw-dropping nature of LabVIEW in the late ’80s and early ’90s, due primarily to the simplicity of UI development. Of course, 30 years ago the Web didn’t exist, cellular networks were a novel idea for military applications, and touch-screen tablets were a distant dream.

For decades, the GUI set LabVIEW apart from traditional programming languages for two reasons. First, the UI components were built into the environment instead of being paid, add-on libraries. Second, the UI components were custom designed for engineering applications including gauges, tanks, and intensity charts. As a generality, the most critical aspect of UI development in an engineering application is the simplicity of data visualization. The LabVIEW UI was designed for the concept of visualizing time-based waveform measurements, the kind of measurements most commonly associated with real-world analog data such as temperature, vibration, and voltage.


Almost 30 years later, the LabVIEW UI is still well suited for visualizing measurement data. Over the years, other engineering tools have continued to evolve, closing the gap in UI functionality and ease of use. But it was the evolution of the consumer mobile market that seriously changed user expectations. Users now not only want, but expect, advanced concepts in their UIs such as gestures, resolution scaling, and dynamic movement.

Most pragmatic users of LabVIEW are entirely content with the functional display capabilities of LabVIEW today. However, they voice their opinions on the Idea Exchange, and these requests typically fall into three categories:


1. General Look and Feel
The LabVIEW UI was designed to mimic the front panel of a physical instrument, graphing measurement data and providing a layout of knobs and sliders to control the measurement parameters. The design decisions made to this effect are often antiquated by the standards of today’s modern UI principles. I often hear LabVIEW users say “this UI looks like it was built in the ’80s.”

2. Customizability
Generally speaking, the Control Editor in LabVIEW lets developers apply a wide range of customizations to UI elements, breaking each control down to its elemental components, where pictures, designs, and color schemes can be applied.



It’s not the functional capability that most developers want to see improved, but the simplicity of applying sophisticated customizations and extensions. LabVIEW lacks a well-structured API for these customizations, relying instead on an interactive editor to accomplish the task. That lack of automation from a tool centered on the concept of automation is limiting.


3. Portability
Lastly, the portability of LabVIEW UIs is minimal. The portability requirement is generally either to scale to a different monitor size or resolution, or to move to a mobile device. By design, the LabVIEW UI is not vector based, which fundamentally limits its scalability and portability to different sizes and platforms.


These limitations leave LabVIEW behind where we want to be in terms of UI design, functionality, and portability. Redesigning the foundation of any UI is a difficult, multi-year effort. Fortunately, we’re significantly far into this investment; unfortunately, we’re not quite far enough. Over the next few years, you’ll begin to see our investment reach the point of deployment…deployment to you.


Follow along as we near that deployment; we’ll be sharing more information with you along the way.




Today’s Featured Author

Jeff Phillips considers LabVIEW as essential to his day as food, water, and oxygen. As senior group manager for LabVIEW product marketing at NI, Jeff focuses on how LabVIEW can meet the changing needs of users. You can follow him on Twitter at @TheLabVIEWLion.

Today’s post is part of a monthly series exploring areas of focus and innovation for NI software.




Whether we love them or hate them, UIs–the veneer on our code, configuration, hardware, and data analytics–define our experience and influence our productivity. GUI history is remarkable. From the unsuccessful yet influential Xerox Star in 1981 to the 1984 Macintosh and Windows 3.0, we saw multipanel windows and desktop metaphors with paper, folders, clocks, and trashcans; the browser wars brought dashboards; and Windows 8 introduced live tiles. This history provides a map for what GUIs may look like in the future.

Engineering and scientific UIs were historically distinct from consumer or architectural design. For NI, GUIs are rooted in the success of virtual instrumentation, which mimicked legacy box instruments. The virtual instrumentation approach trumped traditional test and measurement software teams who lost sight of basic UI design in a frenzy to build more features.

LabWindows/CVI is an example of an engineering-specific GUI, an interface to mimic traditional box equipment.


Today, we are in a transition where UI design is more than a competitive advantage, it’s a requirement. Heavily influenced by consumer experiences, today’s users seek solutions which offer as much intuitiveness as function.

We should all demand GUIs which are:

  1. (actually) Graphical – End unintuitive TUIs (I thought I made that term up, but didn’t).
  2. Skinnable – Different from customizable; your UI should come with themes, skins, and documented extensibility points.
  3. Modern – This means a clean, minimalist design (more is NOT better in the UI world) that lets you focus on the data and information rather than the control.
  4. Designed – Layout, color, font, and saturation all matter. Don’t assume these elements aren’t necessary.

To meet these demands, vendors are investing in interaction, user experience, and user interface designers (including NI). I predict we will see more UI trends such as:


  • Flat – 2013 was the year most design experts began teaching and preaching a minimalist design approach featuring clean, open space, crisp edges, and bright colors to emphasize usability.
  • Mobile design – Consideration for graphics power and heat leads to simpler interfaces and two-dimensionality (Metro and the 2012 Gmail redesign).
  • 3D – Led by the gaming and content industries, Direct3D and OpenGL technologies give us beautiful experiences on powerful platforms with 3D rendering, shading, and transparency effects (AIGLX for Red Hat Fedora, Quartz Extreme for Mac OS X, and Vista’s Aero).
  • Virtual reality – Growing in feasibility, heads-up displays are no longer reserved for pilots; VR is showing up everywhere from the 2013 Prius to the Airbus Smart Factory.


Regardless of future designs, the most important thing to plan for is this: design trends will and must evolve. Profit margins and adoption of your products will be defined by the user experience, which is first experienced through your user interface.


Need more convincing? Forrester Research finds that “a focus on customers’ experience increases their willingness to pay by 14.4 percent, reduces their reluctance to switch brands by 15.8 percent, and boosts their likelihood to recommend your product by 16.6 percent.”


Do you agree? Tell us what you think by commenting below or connecting with the UI Interest Group to learn tips and tricks from top LabVIEW developers.




Today’s Featured Author

Shelley Gretlein is a self-proclaimed software geek and robot aficionado. As NI’s director of software marketing, you can find Shelley championing LabVIEW from keynote stages to user forums to elevator conversations. You can follow her on Twitter at @LadyLabVIEW.

Bipedal humanoid robots have been around for over 30 years, but developing and implementing intelligent motion algorithms that keep their moves from looking Frankenstein-esque has remained a challenge. Using NI hardware, LabVIEW, and third-party add-ons, a team of engineers at the Temasek Polytechnic School of Engineering has built a teenager-sized humanoid robot with a smooth gait.




The project focused on developing a user-friendly graphical interface for implementing motion control algorithms. Engineers used LabVIEW to create control software that students could easily use to develop and debug the program, and the team can flexibly adapt and redeploy the program on future robotics projects. A PXI-8101 served as the main system controller, and students programmed wireless LAN communication using the LabVIEW Internet Toolkit. The LabVIEW MathScript RT Module executed The MathWorks, Inc. MATLAB® code to generate the gait trajectory.


LabVIEW reduced development time to one semester, made it possible to perform motion simulation with SolidWorks, and executed code created with MATLAB. The bipedal humanoid robot made its debut at the SRG 2014 Singapore Robotics Games.


>> Read the full case study.

By default, intensity graphs and intensity charts have a blue color map for the Z scale, but did you know there are other color maps to choose from? To reconfigure the blue color map, right-click any of the numbers in the Z scale and choose “Marker Color.”


You can also programmatically manipulate the scale with the color table property of the intensity graph and chart. By following the link below, you can download a VI that will allow you to select between several pre-defined color maps for your intensity graphs and charts.
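Since G is graphical, the VI itself can’t be shown inline as text, but the idea behind the color table is simple in any language: it is just an array of packed RGB integers, one per color step. Here is a minimal Python sketch of building one by linear interpolation (`make_color_table` is a hypothetical helper name, not an NI API; the packing order, with red in the high byte, matches LabVIEW’s 0xRRGGBB color integers):

```python
def make_color_table(start, end, steps=256):
    """Linearly interpolate between two (r, g, b) tuples and pack each
    step into a single 0xRRGGBB integer, the form a color table uses."""
    table = []
    for i in range(steps):
        t = i / (steps - 1)
        r, g, b = (round(s + (e - s) * t) for s, e in zip(start, end))
        table.append((r << 16) | (g << 8) | b)
    return table

# A simple black-to-red "heat" map:
heat = make_color_table((0, 0, 0), (255, 0, 0))
```

The downloadable VI linked below does the equivalent in G, letting you pick among several predefined maps.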


Thanks to Darren Nattinger for this LabVIEW tip!



>> Download the VI here.

NIWeek 2014 is only two months away, and the LabVIEW team has been busy preparing the biggest LabVIEW Zone we’ve ever had. This year, you’ll get the chance to demonstrate feats of strength, lightning-fast reflexes, and quick feet as you compete against your friends and colleagues with three new interactive sports science demos. You can also prove your intellectual chops by taking on LabVIEW in a chess challenge – we’ve been working hard on our algorithms all year. Finally, you can compete to win the most prestigious trophy that NIWeek has to offer at the LabVIEW Coding Challenge. Beat the six-time undefeated champion, Darren Nattinger, and have your name immortalized in LabVIEW lore.




We’re really excited to show off the amazing things LabVIEW can do on the NIWeek show floor this year. Come by and check out cutting-edge technologies and exciting new products. We’ll also have engineers on site throughout the conference to answer your burning LabVIEW questions.


  >> Register for NIWeek today.

In LabVIEW R&D, there’s a lot that goes into creating LabVIEW features. The process includes specification documents, design reviews, code reviews, automated test plans, manual test plans, documentation, and more. When we’re done, we have a feature that is documented, tested, marketed, and officially supported.


But what about all those features that, for whatever reason, don’t get that level of attention? Maybe a feature is written just for an internal team at NI. Or maybe a developer didn’t have enough time to dot all the ‘i’s and cross all the ‘t’s on a really useful API. What happens to all of those features?


They get included in the vi.lib folder, of course! Most of the VIs you can drop from Quick Drop or the palettes live in your [LabVIEW 20xx]\vi.lib folder. But many other VIs that are not “official” LabVIEW features are available in this folder as well.


To learn more about some of these unofficial libraries already included with your installation of LabVIEW, join the Hidden Gems in vi.lib community group. This group includes a presentation given by Darren Nattinger, Principal Engineer in LabVIEW R&D, discussing many of the Hidden Gems VIs that he uses on a regular basis. Some of his favorite libraries include:


1. VariantDataType VIs (vi.lib\Utility\VariantDataType)
These VIs allow you to parse variant data to learn more about the specific data type contained within the variant. For example, here is a VI that uses the VariantDataType VIs to determine the strings used to define an enum:

And here is a VI that uses the VariantDataType VIs to determine whether or not a variant is an error cluster:
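The screenshots above are G diagrams, but the concept translates to any language: these VIs perform runtime type introspection on data whose type isn’t known until the VI runs. As a rough Python analogy of the two examples (the helper names `enum_strings` and `is_error_cluster` are hypothetical, not part of any NI API, and Python objects stand in for variants):

```python
from enum import Enum

class Mode(Enum):
    IDLE = 0
    RUNNING = 1

def enum_strings(value):
    """Return the strings that define an Enum value's type, analogous
    to pulling the string list of an enum out of a variant."""
    return [member.name for member in type(value)]

def is_error_cluster(value):
    """Check whether a dict has the shape of a LabVIEW error cluster:
    a boolean 'status', an integer 'code', and a string 'source'."""
    return (isinstance(value, dict)
            and isinstance(value.get("status"), bool)
            and isinstance(value.get("code"), int)
            and isinstance(value.get("source"), str))
```

The real VIs walk the variant’s type descriptor directly, so they work on any LabVIEW data type without prior knowledge of it.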


2. AdvancedString VIs (vi.lib\AdvancedString)
These VIs perform advanced string manipulation. One of the most useful VIs in this folder is Match 1D String, which searches the elements of a string array for one that matches a user-specified pattern. It’s basically like the Search 1D Array function, but much more useful when dealing with string arrays:
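As a rough textual analogy of that behavior (in Python, with shell-style wildcards standing in for LabVIEW’s own match-pattern syntax, which differs; `match_1d_string` is a hypothetical name):

```python
import fnmatch

def match_1d_string(strings, pattern):
    """Return the index of the first element matching the wildcard
    pattern, or -1 if none matches -- unlike a plain equality search,
    which only finds exact matches."""
    for i, s in enumerate(strings):
        if fnmatch.fnmatchcase(s, pattern):
            return i
    return -1

channels = ["Dev1/ai0", "Dev1/ai1", "Dev2/ai0"]
first_dev2 = match_1d_string(channels, "Dev2/*")
```

An exact search would force you to know each string in advance; the pattern match is what makes the VI useful for lists of channel or device names.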


3. Libraryn VIs (vi.lib\Utility\libraryn.llb)
These VIs perform myriad File I/O operations, including operations on LLBs and files inside of LLBs. Several of the VIs in this library are officially supported and included in Quick Drop and the palettes. But there are many other “unofficial” VIs in this library that are useful as well. For example, Create Directory will create a folder on disk, and any parent folders that don’t exist:

Another useful VI, Is Name, will test a string to see if it can be used as a file name on the specified file system:


4. _analyzerutils.llb VIs (vi.lib\addons\analyzer\_analyzerutils.llb)
These VIs are a random assortment of useful VI Server and VI Scripting VIs. They are used extensively by the VI Analyzer Toolkit, but they are also generally useful for other scripting applications. Check out the VI Analyzer Enthusiasts group for a detailed description of many of the VIs in this library.

5. lvconfig.llb VIs (resource\dialog\lvconfig.llb)
OK, this last one doesn’t live in vi.lib, but it’s still really useful! Located in your [LabVIEW 20xx]\resource\dialog folder, the VIs in lvconfig.llb allow you to read and write tokens in your LabVIEW INI file, regardless of what platform you’re using or where the file is located. One potential use case for these VIs is programmatically updating your LabVIEW preferences (i.e., the settings in Tools > Options), perhaps as part of an automated install of LabVIEW.


These are just a few of the many hidden gems readily available in your LabVIEW folder. So check out the Hidden Gems in vi.lib community group to learn more about all these great libraries and utilities that you already have access to in your LabVIEW installation. You don’t want to end up writing a really useful VI only to find out it was already in vi.lib!

Many areas of Africa lack affordable, reliable motorized transport. In these areas, pedestrians often carry commercial and domestic goods, such as water, firewood, and crops, on their heads. This practice, known as head-loading, is an exhausting task predominantly performed by women and children.






Of the estimated 750 million women and children in Sub-Saharan Africa, the majority will carry heavy loads (more than 40 kg/88 lb for adult females) on their heads. Thus, head-loading represents a huge potential public health issue in Africa. Despite this, the long-term impacts on maternal health, quality of life, labor productivity, and life expectancy have largely gone unrecognized and unstudied. Until now!


NI Alliance Partner Key Engineering Solutions Ltd partnered with engineers and product design students from the University of Leeds to develop a wearable measurement device for a head-loading study. The resulting device, the intelligent load orientation assessment device (iLOAD), uses accelerometers, gyroscopes, and GPS positioning to provide measurement feedback. The team used LabVIEW to develop an intuitive user interface, establish wireless connectivity to the iLOAD using Bluetooth, and stream measurement data from the device.






Results from the study will be published in international journals. Using this data, researchers can help make this essential means of goods transportation a safer practice for the millions of African women and children who do it every day.


>> Read the full case study.