There are a million scenarios where data can get lost. Did someone forget to move the data from the test machine to the final data store? Was the spreadsheet you were analyzing closed without saving any changes? Did a test fail and cause a corrupt file to be created? One effect of losing data could be that a test has to be re-run.


We all know that feeling. Imagine yourself working on a PowerPoint presentation and you are in your groove. All of a sudden PowerPoint crashes and you realize the last time you saved was 30 minutes ago! And of course auto-recovery doesn’t fully capture all of the changes you made in that half hour. So you say a few choice words, put your head down, and re-create all of the work you just lost.


When test data is lost, you go through the same process. And sure, the second time around it goes by faster, but sometimes this just isn’t an option and capturing the data is critical.


Currently, LabVIEW has the project to organize and store your VIs, subVIs, controls, documentation, libraries, etc., but our question to you is, "where is the data?" Why isn’t the data you are collecting in your application automatically saved to the project?


NI is investing in this exact scenario to automatically save data to the project and to collect data without programming (as discussed in a previous blog). Now, when you share a project, the data files will be sent along with the application. Just think of the possibilities! When there is a bug and the data is returning something different than usual, you can package up the whole project to send to the troubleshooting team. There could be an overall increase in efficiency because data will never be lost again. Just look at what benefits the High Power Laboratory at ABB Switzerland has achieved by focusing on analyzing the data they already have available: "We can save up to $50k per test by simply avoiding the costs of rerunning tests for which existing data might be used."





The bottom line is, we want you to be as efficient as possible when developing your application, and ensuring that your data is well managed is one of NI’s priorities for the future.

When you look at traditional programming languages, you see all the same things: text characters and punctuation symbols. To understand the meaning of this code, you read and interpret a form of text that was designed from the perspective of a machine’s sequential operation. Various development environments apply transforms to this text to help you – color-coding keywords, automatically indenting sections of code to show scope, collapsing sections of a large file for easier navigation – but in the end, you are still left facing a wall of text that you must interpret.


The graphical programming language in LabVIEW describes functionality the way that users think: visually. Data flows along patterned, color-coded wires, parallel processes are shown side by side, and code sections are abstracted into nodes with visual depictions of their functionality. Over 30 years, engineers and scientists have used graphical programming as their tool of choice for automation and system design, because it visually reflects their way of thinking, rather than the computer’s.


It follows that visual design in a graphical programming language affects not only aesthetics, but also utility. As we developed the LabVIEW platform over these last 30 years, we have added thousands more functions across a variety of areas, from data acquisition, to embedded control and monitoring, to 5G research. With such a diverse application space, it is important that the platform stay visually consistent and disciplined. With that in mind, we have embarked on a significant visual design initiative intended to keep all developers productive.


The first major result that you will see from this initiative is more consistent and meaningful iconography for VIs. In the past, you may have found different glyphs that meant “search” – binoculars or different styles of magnifying glass – in the future we will have a single glyph used throughout the platform. The same applies for “write,” “configure,” “reset,” and so on.




Figure 1: Consistent iconography across the software platform makes visual metaphors more effective.


We've also turned our attention to VI icon design. We took inspiration from everyday traffic signs that are simple to understand at a glance, and applied this to color scheme and glyph use. The results are icons that have only one bold color, one accent color, and a few key glyphs per icon. This reduces visual complexity, while still elevating important functionality.




Figure 2: The NI-DAQmx palette uses a bold dark color and secondary accent color.

Graphical programming has always derived its value from effective visual design. With continued investment, we will further differentiate this benefit compared to traditional programming languages.


Let us know what you think.

There’s no way around it: expectations around UIs have changed dramatically with the rise of touch-friendly smartphone apps and the proliferation of sophisticated web applications (we’ve talked about it before). For 30 years, NI has empowered engineers to build UIs in LabVIEW that look exactly like they want, but it’s time to do more.


These new experiences are not just the result of an evolving design aesthetic for what makes a UI effective. Even more important than changing shapes and color schemes is the underlying technology that enables these new capabilities. This underlying technology is what we are investing in to enable you to meet design expectations.




Figure: The evolution of UI technology – WinForms (Desktop), WPF (Desktop), Silverlight (Web), and HTML5 Prototype (Web).




Let’s take a look at the evolution of UI technology over the last 30 years. From Windows Graphics Device Interface (GDI) to Windows Forms (WinForms) to Windows Presentation Foundation (WPF), Microsoft has continually provided new APIs and libraries for composing UIs in Windows applications.

Over the same time period, web browsers and mobile OSs such as Android and iOS have emerged as new application platforms. With each of these platforms has come an entire evolution of UI technologies – Java, Flash, Silverlight, and HTML5, to name a few.

Right now we are investing in two UI technologies – WPF and HTML5. WPF is hardware accelerated and based on DirectX, incorporating advanced graphical features like gradients, opacity, animation, and an advanced composition model. Simply put, with WPF we can build you stunning, theme-able controls that you can customize with artwork imported directly from Adobe design tools. We added WPF support to Measurement Studio in 2013 and are currently working to bring this technology to the rest of our software platform.

Meanwhile, HTML5 has emerged as the de facto web standard for beautiful, interactive web applications. In addition to richer graphics, HTML5 can facilitate dynamic animations and responsive layouts in combination with CSS3 and JavaScript. Very importantly, HTML5 does not require any client-side plugins like Silverlight and is supported in all modern web browsers.

We’re excited about bringing these technologies into tools like LabVIEW because of the new things they will enable application developers to do. Each brings new capabilities for native controls, such as WPF tables with mixed data types or Google Maps integrated into an HTML5 page. Controls also have the potential to be customized in new ways using vector-based graphics that can scale flawlessly to different resolutions. In addition to these advances, technologies like WPF and HTML5 bring rich ecosystems of existing controls and frameworks that can be reused by application developers.

Changing UI requirements are much more than just changes in design fashion from skeuomorphic to flat. UI technology has been evolving, and NI is investing to keep up.

What UI technologies are you excited about?

Today’s post is part of a series exploring areas of focus and innovation for NI software.



Today’s Featured Author

Shelley Gretlein is a self-proclaimed software geek and robot aficionado. As NI’s director of software marketing, you can find Shelley championing LabVIEW from keynote stages to user forums to elevator conversations. You can follow her on Twitter at @LadyLabVIEW.





You’re finally given the project you’ve been working towards: it’s “THE” project – big, hairy, complex, and visible to top management. It’s the project the company needs to get back on track. Before you dive in – remember the company needs this, meaning if it doesn’t go well it can threaten the very existence of the company and, therefore, your livelihood.

Now that there’s no pressure – what should you consider? With large software projects typically running 66% over budget and 33% over schedule, you need a plan to defy the statistics. Complex project management must be coordinated with clear accountability, clear communication, and shorter iterations. Luckily for you, there are scores of organizations that specialize in consulting for large application management. But NI can help, too: what we know is software for designing, prototyping, and deploying your engineering solution. And we also know that the right tool makes all the difference.

Imagine for a moment you didn’t go to engineering school, but instead became a drywaller. You’ve recently been hired by a local small business and you show up for your first day of work excited for a busy day. You and your team of three other workers show up to the first house – a straightforward job of room repair, taping, and floating. But here’s the catch – you have a step stool, a piece of sandpaper, and a putty knife. You begin on a small square and quickly become frustrated with bumpy spackle and tearing sandpaper. Your colleagues are on stilts, quickly moving across the ceiling with their drywall saws, automated sanders, and expert taping. They’ve completed three rooms and you’re still working on one patch.

You get the point – the right tools make all the difference. You need an engineering software tool for your engineering job. LabVIEW has been proven for over 30 years to be the most productive engineering software package on the market. But that’s not enough – the touch points to the process and the system are also critical. Tracking the relationship from requirements to test, measurement, and control software is crucial for validating implementation, analyzing the full impact of changing requirements, and understanding the impact of test failures. Performing this coverage and impact analysis helps engineers meet their requirements and streamline development efforts. NI Requirements Gateway is ideal for applications that simulate or test complex components against documented requirements in industries such as automotive, defense, aerospace, and consumer electronics.  Requirements Gateway works seamlessly with your NI software – from LabVIEW to TestStand to LabWindows™/CVI.

Whether you are running THE project, developing a simple UI, or creating the next test system for the team – ensure you use the right tools to allow you to focus on solving the complexity of the engineering challenge, not trying to unravel the complexity of application software tooling.

Today’s post is part of a series exploring areas of focus and innovation for NI software.



Today’s Featured Author

Jeff Phillips considers LabVIEW as essential to his day as food, water, and oxygen. As senior group manager for LabVIEW product marketing at NI, Jeff focuses on how LabVIEW can meet the changing needs of users. You can follow him on Twitter at @TheLabVIEWLion.





It’s like a bad dream. The kind of dream that recurs, never changes, never gets better, just happens over and over again. And there’s nothing that you can do about it. I had a conversation with someone who had downloaded the LabVIEW evaluation but never engaged with it further. I asked what prevented him from continuing.


His answer?


“Well, I’m really just trying to automate this instrument. I gave it a look, but LabVIEW just isn’t for me.”


The words pierced my heart like a perfectly placed knife inserted by the expert hands of Jason Bourne. Not for him? Not for him? He’s doing the EXACT thing that LabVIEW was conceived 30 years ago to do – automate benchtop instruments.


Those words have haunted me since that day. The sad fact is that he was right. LabVIEW has evolved so much as an enabling technology for any engineer to accomplish almost literally anything that it’s no longer highly optimized for any specific task. Within the walls of NI, we call this the “Blank VI Syndrome”. Even a blank PowerPoint slide says “Click to Add Title”. That’s why the investment we’ve been making in in-product learning is so important.


How do you teach someone to use a tool that can do anything?


Well, the answer to that is beautifully simple. You don’t teach them to use the tool. You teach them to accomplish their task using the tool. It might seem subtle, but that subtlety is important. You aren’t taught how to use a pencil. You’re taught how to write with the pencil.


Within the entirety of NI software, and not just LabVIEW, we’re building capabilities that solve a few issues.







There’s a ton of valuable IP, functions, and controls built into LabVIEW. You can find them if you know where to look. But, by definition, new users don’t know where to look. Making these capabilities easily and naturally discoverable is a critical aspect of shortening the learning curve.




Integrated Guidance


Today, when you’re learning to use LabVIEW, your best resource is the vast expanse of the internet. The internet, where funny cat videos, memes, and bad lip reading can take you away from the task at hand in a moment’s notice. The software product itself needs to be smart enough to help you solve the task – to be both the tool and the teacher.




Better Starting Points



Because LabVIEW has become world-renowned for its unrivaled ability to integrate hardware – any hardware – acquiring data from that hardware is a common starting place. Of course, not everyone has this same starting point. LabVIEW has become popular in design applications as well, where the starting point is typically hardware-agnostic. Regardless, we can boil the starting points of the vast majority of applications down to a manageable number.


Then, we should just design approachable starting points and flows around those. Right?



Today’s post is part of a series exploring areas of focus and innovation for NI software.



Today’s Featured Author

Shelley Gretlein is a self-proclaimed software geek and robot aficionado. As NI’s director of software marketing, you can find Shelley championing LabVIEW from keynote stages to user forums to elevator conversations. You can follow her on Twitter at @LadyLabVIEW.




The days of the droning instructor—whether in undergrad or executive education—are (thankfully) long gone. No one has the time or patience to learn new techniques and technologies the old way. 


The pace of technology is ridiculous, the Internet of Things (IoT) is exploding system complexity, and we’re all finding it hard to keep up. The facts are undeniable: you and your teams have more to learn in your field and have to learn beyond your field, too, if you really want to be competitive. Learning formats must be compatible with your lifestyle.


Thankfully there’s good news. Technology, investments, and management trends are in our favor. I’m seeing a rise in learning technologies and techniques from massive open online courses (MOOCs) at universities to in-product learning from customer education departments. But I’m also seeing innovation outside traditional spaces. Options like Connexions, TechShop, and Khan Academy are popping up everywhere.


The 70:20:10 Rule: Learning Is More Than the Classroom


We need to expand our definition of ‘training’ beyond the classroom to all forms of learning. The 70:20:10 rule reinforces this concept. Traditionally, "customer education" content lives in the 10% (formal learning). NI also provides the 20% (social learning), heavily rooted in peer-to-peer formats (user groups, summits, developer days). We are evolving our portfolio to include more content that lives in the 70% (experiential learning). Learning spans tutorials to online modules, YouTube to code snippets, mentoring to code reviews, and seminars to white papers. Learning is popping up online, in cubes, and in-product.


One key learning enhancement in LabVIEW, shaped by the LabVIEW Community, first came in LabVIEW 2012 with the introduction of templates and sample projects. These recommended starting points are designed to ensure the quality and scalability of a system by demonstrating recommended architectures and illustrating best practices for documenting and organizing code. More than a conceptual illustration of how to use a low-level API or technology within LabVIEW, these open-source projects demonstrate how the code works and best practices for adding or modifying functionality, so you learn by doing.


But we aren’t stopping there: we have learning built into LabVIEW Communications System Design Suite (the revolution in rapid prototyping for communications) to minimize design flow interruptions and encourage seamless learning. "Just-in-time" learning and in-product access to learning material allow you to learn by doing—commonly referred to as "performance support."




Overcoming the Access Hurdle


Regardless of the learning format you prefer, it’s clear to me that access is the key hurdle to overcome. If you know what you don’t know and can access the right training to learn it, study upon study demonstrates that you will be significantly more productive.




We are doing our part here as well. For the past several years, NI has included online training with most of our products as part of staying on active software service. This online format is optimized to fit your schedule while complementing other formats including live instructor-led virtual training, classroom training, and on-site custom courses. The online courses respect your time and your budget as Thomas Sumrak from PAR Technologies reported, “I estimate that the training courses have saved me more than 150 hours in self-paced learning time to get to an equivalent skill level.”


It’s All About Proficiency

As your intuition would tell you, learning—and more importantly, proficiency—really matters. Everyone knows someone who takes every corporate course available and doesn’t learn a thing. You have to learn, not just listen, for the investment to matter. When you do—it does. After becoming proficient (via certification), customers reported the following:


  • Over 54% said the quality of their work improved
  • Nearly 45% said their peers’ perceptions of them improved
  • Nearly 30% received new project opportunities

Take your learning into your own hands and take advantage of the many new resources available and suited to your learning preferences, time constraints, and budget needs. Don’t just check a box, but take the time to do, and cement your understanding through experience. It’s at your fingertips.


Identify the skills you need and find learning resources to help you successfully develop your application. Visit  or download the course catalog to review the training courses, certification exams, and proficiency events available in your area of interest.

Today’s post is part of a series exploring areas of focus and innovation for NI software.



Today’s Featured Author

Jeff Phillips considers LabVIEW as essential to his day as food, water, and oxygen. As senior group manager for LabVIEW product marketing at NI, Jeff focuses on how LabVIEW can meet the changing needs of users. You can follow him on Twitter at @TheLabVIEWLion.




The light switch—that little component of our lighting system that is afforded no delay, no ramp time, and no warm-up routine. When you flip up that switch, you expect to see light immediately. This expectation is the result of years of mental conditioning; however, many fail to understand how it actually happens.

What’s the light switch for LabVIEW developers? It’s the compiler. It’s that tiny run arrow button that switches your code from edit mode to run mode. Again, many years of mental conditioning force you to expect the immediate translation of your code into executable action. But do you understand what actually happens?

The run button isn’t the action that sends the compiler down the magical path of optimizing, unrolling, and memory managing. The LabVIEW compiler is constant; it’s always on and compiling your code. From a usability standpoint, this is one area that makes LabVIEW so engaging. Our brains live off the dopamine hits of being right, solving problems, and answering questions. With each incomplete step of your code development, the compiler visually warns you that your code won’t run. Obviously, the next step is to fix that. Voilà! The run button is whole again.


And LabVIEW says, “you’re welcome” for that adrenaline rush you just enjoyed—perhaps even subconsciously.

Unlike other languages where the compile is an explicit step that you take when you’re done and ready to run, LabVIEW does this action constantly. This always-on compiler streamlines the development process for programmers.

I can’t summarize the complaints I get from LabVIEW users about the compiler because, frankly, I just don’t get them. Yes, I hear a lot about the need for by-reference classes and generics, but those are language features, not compiler complaints. A subtle distinction, but I’ll consider it accurate just so I’m right.

As we continue to iterate, improve, and evolve the LabVIEW compiler, our focus is on two places: writing code faster and writing faster code (this gem of language genius came from one of our senior LabVIEW architects, Darren Nattinger, also known as the fastest LabVIEW programmer on the planet).


Write Code Faster


LabVIEW was designed for engineers and scientists. The semantics of graphical code map directly to the way most engineers lay out solutions in their minds (see any of the maps I drew on napkins before I was blessed with a smartphone).

We’ve had a significant focus over the last few years on reducing mouse clicks and keyboard strokes. This is one measure of an engineer’s productivity—getting to final code with minimal physical effort (right-click shortcuts, default wiring options, and so on).

Sneak Peek: As you’ll see in the upcoming release of LabVIEW (at NIWeek perhaps?), these types of improvements are continuing.

In addition to these critical improvements in the compiler, we’re investing in areas that carry this benefit into other product areas. I previously discussed one such example in the realm of hardware discovery and configuration. But, we’re also looking to extend into other areas such as managing deployed systems, simplifying data communication, and introducing other language components.




Write Faster Code


The optimizations introduced in LabVIEW 2010 with the new DataFlow Intermediate Representation (DFIR) and the integration of off-the-shelf compiler technology in the Low-Level Virtual Machine (LLVM) have drastically increased the run-time performance of code without requiring rewrites of the code itself.


This compiler overhaul has laid the groundwork for continued innovation for both code optimization on the desktop and code optimizations within deployed targets, both real-time processors and FPGAs. The newly introduced multirate diagram in the LabVIEW Communications System Design Suite is a perfect example of this innovation—a novel algorithm representation that enables one high-level representation of a mathematical algorithm, even with different execution rates and sample counts from node to node.




This is representative of the ongoing focus on the LabVIEW compiler and a constant commitment to highly optimized code.


Sneak Peek: Again, you’ll see some very impressive performance improvements for large applications in the next release.


Today’s post is part of a series exploring areas of focus and innovation for NI software.



Today’s Featured Author

Shelley Gretlein is a self-proclaimed software geek and robot aficionado. As NI’s director of software marketing, you can find Shelley championing LabVIEW from keynote stages to user forums to elevator conversations. You can follow her on Twitter at @LadyLabVIEW.




Grace Murray Hopper (1906–1992), computer scientist and US Navy rear admiral, is on record as the individual who invented the first compiler and popularized the concept of machine-independent programming languages, and her work led to the development of COBOL. Her influence earned her the nickname Grandma COBOL.[1]


Hopper’s contributions led to the productivity through abstraction that engineers and scientists benefit from regularly. While her work was ahead of its time, compilers have advanced significantly since the Harvard Mark I computer Hopper used in 1944. Compiler design, even for a trivial programming language, can easily become complex, which makes compiler theory one of the most specialized fields among software professionals today. To most engineers, this complexity makes the compiler seem either frightening or magical.


However, all engineers can benefit from this art form of software mastery. From Python to LabVIEW, compilers are primarily used to translate source code from a high-level programming language (G, C#, and so on) to a lower-level language (assembly or machine code). Even more beneficial are cross-compilers, which can create code that executes on a computer whose CPU or OS differs from the development machine’s.


Modern LabVIEW provides a multiparadigm language that embraces a wide variety of concepts including data flow, object orientation, event-driven programming, and, more recently, multirate data flow, which extends LabVIEW development by giving you the ability to implement multirate, streaming digital signal processing algorithms more intuitively. LabVIEW also supports cross-compilation, a powerful capability that protects the investment in your code by letting you develop on one platform and deploy to Windows, macOS, NI Linux Real-Time, CPUs, and FPGAs. All of this is to say, the LabVIEW compiler is a pretty awesome, sophisticated element of our platform.




Figure 1. LabVIEW has a cross-compiler and contains a multiparadigm language that embraces a wide variety of concepts including data flow, object orientation, event-driven programming, and, more recently, a multirate diagram, which defines a synchronous multirate dataflow system (shown here).

For LabVIEW and most languages, the compiler is an area that is always reviewed, renewed, and improved. Major compiler investments (following the first huge transition from an interpreter to a compiler in LabVIEW 2.0) began in LabVIEW 2009 when we added 64-bit compilation and DataFlow Intermediate Representation (DFIR). Complementary to that investment was the adoption of a Low-Level Virtual Machine (LLVM) into the compiler chain in LabVIEW 2010. These significant improvements provided more advanced forms of loop-invariant code motion, constant folding, dead code elimination, and unreachable code elimination, as well as new compiler optimizations such as instruction scheduling, loop unswitching, instruction combining, conditional propagation, and a more sophisticated register allocator.
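To give a flavor of what passes like constant folding and dead code elimination do, here is a hypothetical sketch (plain Python over a toy expression tree, not DFIR itself) that collapses constant subexpressions at compile time so they never cost anything at run time:

```python
# Hypothetical sketch of constant folding, one of the classic
# optimizations an intermediate representation like DFIR enables.
from dataclasses import dataclass


@dataclass
class Const:
    value: float


@dataclass
class Add:
    left: object
    right: object


@dataclass
class Mul:
    left: object
    right: object


def fold(node):
    """Recursively replace operator nodes whose inputs are all constants."""
    if isinstance(node, Const):
        return node
    left, right = fold(node.left), fold(node.right)
    if isinstance(left, Const) and isinstance(right, Const):
        if isinstance(node, Add):
            return Const(left.value + right.value)
        return Const(left.value * right.value)
    return type(node)(left, right)


# (2 + 3) * 4 folds to a single constant before any code is generated
tree = Mul(Add(Const(2), Const(3)), Const(4))
print(fold(tree))  # Const(value=20)
```

A real compiler applies dozens of such passes over its intermediate representation; the point is that the diagram you draw is not the code that runs.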


So aren’t other compilers amazing, too? Of course, anything that provides a valuable abstraction from machine code to encourage innovation and increase productivity is amazing. However, all compilers are not created equal. 


Python, for example, has a compiler that compiles to bytecode executed by a virtual machine, similar to Java. This provides portability but keeps it very far from “the metal,” creating noteworthy challenges if your application relies on timing or I/O.
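You can see Python’s two-stage model directly with the standard library’s `dis` module, which disassembles the bytecode the compiler produces:

```python
import dis


def scale(x):
    return x * 2 + 1


# Show the stack-machine bytecode CPython's compiler produced.
# A virtual machine interprets these instructions at run time,
# one level removed from the native machine code a compiler
# targeting "the metal" would emit.
dis.dis(scale)
```

Every call to `scale` is interpreted instruction by instruction, which is exactly the distance from the hardware described above.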


Why does this all matter? Why do you care about Grandma COBOL’s work? As a busy, practical engineer who programs to get an application built and trusts the compiler will just work, what’s in it for you? When done well, compilers can contribute significant value to your day-to-day development by increasing your executable code’s performance. Modern compilers can also contribute to your productivity if they are designed to be extensible so that partners can build on tools, languages, and integrated development environment features for more rapid innovation. This complex and critical element of every language provides a constant “green field” of innovation for computer scientists—an opportunity for continuous improvement in each release.


What’s your favorite compiler? Where should we invest next to improve LabVIEW compiler technology for you?



The registered trademark Linux® is used pursuant to a sublicense from LMI, the exclusive licensee of Linus Torvalds, owner of the mark on a worldwide basis.


[1] Hopper, well known for her lively style, is also credited with popularizing the term debugging for solving small engineering glitches, after her associates discovered a moth stuck in a relay (Dahlgren, Virginia, 1947).

Today’s post is part of a series exploring areas of focus and innovation for NI software.




Today’s Featured Author

Omid Sojoodi is currently the leader of application and embedded software for National Instruments.





With the rise of the Industrial Internet of Things, one thing is clear: engineers need to extract meaningful information from the massive amounts of machine data collected.


Data from machines, the fastest growing type of data, is expected to exceed 4.4 zettabytes (that’s 21 zeros) by 2020. This type of data is growing faster than social media data and other traditional sources. This may sound surprising, but when you think about those other data sources, which I call “human limited,” consider that there are only so many tweets or pictures a person can upload throughout the day. And there are only so many movies or TV shows a person can binge watch on Netflix to get to the next set of recommendations. But machines can collect hundreds or even thousands of signals 24/7 in an automated fashion. In the very near future, the data generated by our more than 50 billion connected devices will easily surpass the amount of data humans generate.


The data that machines generate is unique, and big data analysis tools that work for social media data or traditional big data sources just won’t cut it for engineering data. That is why NI is investing in tools to help you overcome common challenges and make data-driven decisions based on your engineering data (no matter the size) confidently.


Challenge 1: 78 percent of data is undocumented.

According to research firm International Data Corporation (IDC), “The Internet of Things will also influence the massive amounts of ’useful data’—data that could be analyzed—in the digital universe. In 2013, only 22 percent of the information in the digital universe was considered useful data, but less than 5 percent of the useful data was actually analyzed.”


Data that is considered useful includes metadata or data that is tagged with additional information. No one wants to open a data source and wonder what the test was, what the channels of information are called, what units the data was collected in, and so on. NI is helping to resolve this issue with our Technical Data Management (TDM) data model. With it, you can add an unlimited number of attributes for a channel, a group of channels, or the entire file. We are constantly updating the infrastructure of this binary (but open) data file, and have recently reached streaming rates of 13.6 GB/s. To make documenting data easier, NI is investing in technologies that will recommend metadata to save with your raw data while offering you the flexibility to add attributes at any point before, during, or after acquisition.
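As a rough illustration (a hypothetical Python sketch, not the actual TDMS API), the TDM model’s file/group/channel hierarchy lets descriptive attributes travel with the raw samples at every level:

```python
# Hypothetical sketch of the TDM file/group/channel hierarchy:
# free-form attributes can hang off every level, so raw samples
# never travel without their context. Names and values below are
# illustrative only.
file_properties = {"operator": "S. Gretlein", "test_stand": "UUT-07"}

data = {
    "Engine Run-Up": {                       # a channel group
        "properties": {"fixture": "dyno-3"},
        "channels": {
            "rpm": {
                "properties": {"unit": "1/min", "sensor": "optical"},
                "values": [800, 1500, 2400, 3100],
            },
            "oil_temp": {
                "properties": {"unit": "degC", "sensor": "PT100"},
                "values": [71.2, 74.8, 80.1, 86.5],
            },
        },
    },
}

# Any tool that understands the model can answer "what unit is this?"
unit = data["Engine Run-Up"]["channels"]["oil_temp"]["properties"]["unit"]
print(unit)  # degC
```

Because the attributes live beside the data, nobody has to open the file and wonder what the test was or what units were used.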


Challenge 2: The average NI customer uses three to five file types for projects.

With so many custom solutions on the market, your current application likely involves a variety of vendors to accomplish your task. Sometimes these vendors require you to use closed software that exports in a custom format. Considered a common pain point, aggregating data from these multiple formats often requires multiple tools to read and analyze the data. NI addresses this challenge with DataPlugins, which map any file format to the universal TDM data model. Then you can use a single tool, such as LabVIEW or DIAdem, to create analysis routines. To date, NI has developed over 1,000 DataPlugins. If one isn’t readily available, NI can write a DataPlugin for you.
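Conceptually, a DataPlugin is just a mapping from a vendor’s format onto the TDM hierarchy. A hypothetical sketch for a trivial two-column CSV format might look like this (the function name and dict layout are illustrative, not the real DataPlugin API):

```python
# Hypothetical sketch of what a DataPlugin does conceptually:
# parse a vendor format (here, a simple CSV whose header row
# names the channels) into a TDM-style group/channel structure.
import csv
import io


def csv_dataplugin(text, group_name="Imported"):
    """Map a two-column CSV onto a TDM-like dict hierarchy."""
    reader = csv.reader(io.StringIO(text))
    names = next(reader)                       # header row = channel names
    columns = {name: [] for name in names}
    for row in reader:
        for name, cell in zip(names, row):
            columns[name].append(float(cell))
    return {group_name: {"channels": {
        name: {"properties": {"source": "csv"}, "values": vals}
        for name, vals in columns.items()}}}


raw = "time,voltage\n0.0,1.2\n0.1,1.5\n"
tdm = csv_dataplugin(raw)
print(tdm["Imported"]["channels"]["voltage"]["values"])  # [1.2, 1.5]
```

Once every format lands in the same structure, a single analysis tool can read all of them.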


Challenge 3: It takes too long to find the data you need to analyze.

The Aberdeen Master Data Management research study interviewed 122 companies and asked how long it takes to find the data they need to analyze. The answer: five hours per week! That’s just looking for the data—not analyzing it. From an engineering perspective, this to me is not that shocking. How many of us have faced what I consider to be the "blank VI syndrome" for data? How do you even begin to start analyzing your data?





A little-known technology that NI continues to invest in is DataFinder. DataFinder indexes any metadata included in the file, file name, or folder hierarchy of any file format. Again, this relies on a well-documented file, but by now I’m sure you have decided to use TDM for your next application.


Once the metadata has been indexed, you can perform queries—either text-based, like you would in your favorite search engine, or conditional queries like in a database—to find data in seconds. With this advanced querying, you can return results at a channel level to track trends in individual channels from multiple files over time.
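The idea behind such conditional queries can be sketched in a few lines (hypothetical Python, not the DataFinder API): index the metadata once, then filter on it without reopening any files:

```python
# Hypothetical sketch of DataFinder-style querying: an index of
# file-level metadata answers conditional queries in milliseconds
# instead of opening every file. File names and fields are
# illustrative only.
index = [
    {"file": "run_001.tdms", "device": "cRIO-9045", "max_temp": 81.5},
    {"file": "run_002.tdms", "device": "cRIO-9045", "max_temp": 96.2},
    {"file": "run_003.tdms", "device": "PXIe-4499", "max_temp": 78.9},
]


def query(index, **conditions):
    """Return entries whose metadata satisfies every (key, predicate) pair."""
    return [entry for entry in index
            if all(pred(entry.get(key)) for key, pred in conditions.items())]


hot_runs = query(index,
                 device=lambda d: d == "cRIO-9045",
                 max_temp=lambda t: t > 90)
print([entry["file"] for entry in hot_runs])  # ['run_002.tdms']
```

Text queries work the same way; they just match substrings across all indexed attributes instead of evaluating predicates.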


In addition, NI is continuing to innovate to make analyzing your data easier than ever. Imagine a future when, as soon as a file is saved, the DataFinder recognizes the data, indexes the metadata, and cleanses the raw data (normalizes channel names so that rpm = speed = revs or performs statistical calculations automatically). Then an analysis routine, written in your language of choice, acts on each data file and automatically archives the data or sends a report to your email or mobile device. This technology ensures that your data-driven decisions are being made with 100 percent of the data and not just 5 percent, as IDC estimates suggest today.
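The cleansing step described above, which collapses synonymous channel names such as rpm, speed, and revs, can be sketched like so (the alias table is illustrative, not a built-in feature):

```python
# Hypothetical sketch of channel-name cleansing: collapse
# synonymous names to one canonical name so downstream analysis
# routines see consistent channels. The alias table is illustrative.
ALIASES = {"rpm": "speed", "revs": "speed", "speed": "speed"}


def normalize_channels(channels):
    """Merge channels whose names map to the same canonical name."""
    normalized = {}
    for name, values in channels.items():
        canonical = ALIASES.get(name.lower(), name.lower())
        normalized.setdefault(canonical, []).extend(values)
    return normalized


print(normalize_channels({"RPM": [900, 1200], "flow": [3.1]}))
# {'speed': [900, 1200], 'flow': [3.1]}
```

Run automatically at index time, a step like this means every analysis routine downstream sees one consistent name per physical quantity.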


Stay tuned, everyone.