[Point of VIEW] Data Data Data!

Friday, June 12, 2015

Today’s post is part of a series exploring areas of focus and innovation for NI software.



Today’s Featured Author

Jeff Phillips considers LabVIEW as essential to his day as food, water, and oxygen. As senior group manager for LabVIEW product marketing at NI, Jeff focuses on how LabVIEW can meet the changing needs of users. You can follow him on Twitter at @TheLabVIEWLion.

 

 

 

If you were to ask me what the most critical element of LabVIEW is, I would have to say "data." Data is so fundamental to LabVIEW that the term used to describe the software's execution semantics is "dataflow." The data itself defines the timing, flow, and output of any LabVIEW code. In fact, LabVIEW features many elements that elevate data in ways a general-purpose programming language does not.
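For readers coming from text-based languages, a small sketch can make the dataflow idea concrete. The Python below is purely illustrative (G is graphical, and this is not NI's scheduler); the point is that a node fires only after all of its input data has arrived, so the arrival of data, not statement order, determines execution.

class Node:
    def __init__(self, name, func, arity):
        self.name = name
        self.func = func          # operation the node performs
        self.arity = arity        # how many inputs it waits for
        self.received = {}        # input index -> value that has arrived
        self.outputs = []         # (downstream node, downstream input index)

    def wire(self, downstream, index):
        self.outputs.append((downstream, index))

    def receive(self, index, value, ready):
        self.received[index] = value
        if len(self.received) == self.arity:
            ready.append(self)    # all inputs present: the node may fire

def run(sources):
    ready, results = list(sources), {}
    while ready:
        node = ready.pop(0)
        value = node.func(*[node.received[i] for i in range(node.arity)])
        results[node.name] = value
        for downstream, index in node.outputs:
            downstream.receive(index, value, ready)
    return results

# Two constants flow into an add node, whose result flows into a scale node.
a = Node("a", lambda: 3.0, 0)
b = Node("b", lambda: 4.0, 0)
add = Node("sum", lambda x, y: x + y, 2)
scale = Node("scaled", lambda s: s * 10.0, 1)
a.wire(add, 0); b.wire(add, 1); add.wire(scale, 0)
print(run([a, b]))   # {'a': 3.0, 'b': 4.0, 'sum': 7.0, 'scaled': 70.0}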

 

Native data types in LabVIEW, such as the waveform data type, treat the nature of the data as important, not just the values that were acquired. The waveform data type packages the timing information together with the samples to provide context. LabVIEW includes more than 950 built-in analysis and signal processing functions because they are fundamental to reaching data insights, not just to acquiring raw data. LabVIEW also offers a lesser-known ability to define DataPlugins, which are essentially drivers for file formats that help you easily format your measurement data into known file types for sharing, streaming, or even report generation.
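To make the "data plus context" idea concrete, here is a rough Python analogue of the waveform data type. This is a hedged sketch, not NI's implementation; the fields t0, dt, and y mirror the waveform components LabVIEW exposes, and the attributes dictionary stands in for waveform attributes such as channel name or units.

from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List

@dataclass
class Waveform:
    t0: datetime                 # timestamp of the first sample
    dt: float                    # seconds between consecutive samples
    y: List[float]               # the acquired sample values
    attributes: Dict[str, str] = field(default_factory=dict)

    def relative_times(self) -> List[float]:
        # Timing context travels with the data, so time stamps can always be derived.
        return [i * self.dt for i in range(len(self.y))]

# A 1 kHz acquisition keeps its timing and channel metadata alongside the samples.
wf = Waveform(t0=datetime.now(), dt=0.001,
              y=[0.0, 0.7, 1.0, 0.7, 0.0],
              attributes={"channel": "ai0", "units": "V"})
print(list(zip(wf.relative_times(), wf.y)))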

 

Anyone who knows my “LabVIEW story” knows that I was academically trained on The MathWorks, Inc. MATLAB® software—so much so that I actually did my engineering homework problems by writing .m files that rotated different input data sets through the same set of equations. When I was first introduced to LabVIEW, I struggled (of course, I did get it eventually, and now I am an avid LabVIEW evangelist who doesn’t use MATLAB® anymore). One of the areas that I struggled with was the lack of interaction with the data itself. I’ve heard this echoed by many users.

 

I talked about a similar challenge in last month's post: getting to measurement data without having to write code. The same concept extends to analyzing and interacting with measurement data. To analyze data within LabVIEW today, you need to lay down blocks of code, wire them together, and run the program. Of course, this is the use case LabVIEW was optimized for, but in many cases users want to develop the analysis iteratively or dive into the data interactively. LabVIEW almost forces users to leave the environment for that type of interaction. At NI, we have complementary products such as DIAdem that are designed around this use case.
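As a rough analogue of that iterative loop, the following sketch uses plain Python and NumPy (an illustration only, not a DIAdem or LabVIEW feature): inspect a result, adjust one step, and look again, without rebuilding and rerunning an entire program.

import numpy as np

fs = 1000.0                                   # assumed sample rate in Hz
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)

# First look: where is the energy in the raw measurement?
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
print(f"dominant frequency ~ {freqs[np.argmax(spectrum)]:.1f} Hz")

# Second look: smooth the signal slightly and check the spectrum again.
smoothed = np.convolve(signal, np.ones(5) / 5, mode="same")
print(f"peak magnitude before/after smoothing: "
      f"{spectrum.max():.1f} vs {np.abs(np.fft.rfft(smoothed)).max():.1f}")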

 


We are aggressively investing in the ability to pull this interactive model into all of our software products to simplify your development process.

You can see some elements of this in the LabVIEW Communications System Design Suite, which features an interactive data viewing window and built-in analysis routines that can be run against your data sets.

The real beauty of this approach is that it can actually build the G code behind the interactive model, which further simplifies the development of the automated code itself.
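Conceptually, the pattern is "record the interactive steps, then turn them into reusable code." The sketch below is only an assumption about that general pattern, expressed in plain Python; it is not how the LabVIEW Communications System Design Suite actually generates G.

recorded_steps = []

def interactive(name, func, *args):
    # Apply an analysis step interactively and remember it for later replay.
    recorded_steps.append((name, func, args))
    return func(*args)

data = [1.0, 4.0, 9.0, 16.0]

# Explore interactively...
print(interactive("mean", lambda xs: sum(xs) / len(xs), data))
print(interactive("scale by 2", lambda xs: [2 * x for x in xs], data))

# ...then the recorded session becomes the starting point for the automated version.
for name, func, args in recorded_steps:
    print(f"replaying '{name}':", func(*args))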



 

The amazing thing about a data-driven product is that everyone is "doing the same thing": trying to analyze the data and draw insights from it. The challenging thing about a data-driven product is that nearly every engineer accomplishes those tasks by following different steps, applying different algorithms, or even aiming for a different end point for that insight. The more we know about what you're trying to do, the better we can design that workflow into the product.

 

So, I ask you now: What are you trying to do with your data? Where is LabVIEW falling short?

 

 

MATLAB® is a registered trademark of The MathWorks, Inc.