Archive for December 21st, 2011



NI has just released a free LabVIEW app for iPads and Android tablets -- the NI Data Dashboard for LabVIEW.

The Data Dashboard for LabVIEW lets you create a custom, portable tablet view of your LabVIEW applications by displaying the values of network-published shared variables and web services on charts, gauges, text indicators, and LEDs. With the Dashboard, you can now use your iPad or Android tablet to...


·    Connect to String, Boolean, or Numeric data types

·    Create user-panel layouts of one, two, four, or six indicators

·    Swipe between multiple pages

·    Double tap to enlarge any indicator

    ...and more.


> Learn more at ... or go straight to the app download for your iPad here or your Android device here.
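Dashboard indicators bind to LabVIEW web services, which serve their data as plain XML or JSON over HTTP, so a tablet-style client is easy to sketch. The payload shape and variable names below are hypothetical, not the actual Data Dashboard protocol; a real LabVIEW web service defines its own URL mappings and output format.

```python
import json

# Hypothetical JSON payload, shaped like what a LabVIEW web service
# might publish for three network-published shared variables of the
# three supported data types (Numeric, Boolean, String).
sample_response = """
{
  "TankLevel":   {"type": "Numeric", "value": 73.4},
  "PumpRunning": {"type": "Boolean", "value": true},
  "Status":      {"type": "String",  "value": "OK"}
}
"""

def read_indicators(payload: str) -> dict:
    """Map each variable name to its current value, mirroring how a
    dashboard indicator (gauge, LED, text box) would display it."""
    data = json.loads(payload)
    return {name: entry["value"] for name, entry in data.items()}

indicators = read_indicators(sample_response)
print(indicators["TankLevel"])    # drives a numeric gauge
print(indicators["PumpRunning"])  # drives a Boolean LED
```

In a real client the payload would be fetched from the web service's URL rather than embedded as a string; the parsing step stays the same.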

NI engineer and community member RoboticsME built a robot with the LabVIEW Robotics Starter Kit that can better maneuver around obstacles with the help of Microsoft’s Xbox Kinect.


The LabVIEW Robotics Starter Kit is an out-of-the-box mobile robot platform complete with sensors, motors, and NI Single-Board RIO hardware for embedded control. RoboticsME mounted a Kinect, with its depth camera, on the robot's upper body to get a 3D image of the surroundings. The Kinect can see objects in the robot’s surroundings, such as chair legs, but it cannot see objects in close proximity. That’s where the robot’s built-in sonar sensors come in: they sense objects close to the robot, such as those in the Kinect’s blind spot.


RoboticsME used a FitPC running Windows Embedded 7 and LabVIEW Robotics to collect and process the Kinect data, while the LabVIEW FPGA Module collects and processes the sonar data. The LabVIEW Real-Time Module fuses the data from the Kinect and the sonar sensors and performs obstacle avoidance.
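As a rough illustration of the fusion step — not RoboticsME's actual code, which is graphical LabVIEW — the obstacle distance in each direction can be taken as the closer of what the Kinect and the sonar report, so a near-field object in the Kinect's blind spot still registers. All sector counts and distances below are illustrative assumptions.

```python
# Hedged sketch: fuse long-range depth-camera distances with
# short-range sonar distances, one value per direction sector
# (left, center, right). None means "sensor saw nothing there".

def fuse_ranges(kinect_m, sonar_m):
    """Take the closer of the two readings in each sector, so either
    sensor alone is enough to flag an obstacle."""
    fused = []
    for k, s in zip(kinect_m, sonar_m):
        readings = [r for r in (k, s) if r is not None]
        fused.append(min(readings) if readings else float("inf"))
    return fused

def steer(fused_m, safe_m=0.5):
    """Drive forward unless something is inside the safety distance,
    then turn toward the sector with the most clearance."""
    if min(fused_m) >= safe_m:
        return "forward"
    best = max(range(len(fused_m)), key=lambda i: fused_m[i])
    return ["left", "forward", "right"][best]

# The Kinect misses a close object on the left (its blind spot);
# sonar catches it at 0.3 m, so the fused map still shows it.
fused = fuse_ranges([2.1, None, 1.8], [0.3, 1.5, None])
print(fused)         # [0.3, 1.5, 1.8]
print(steer(fused))  # "right" -- turn away from the near obstacle
```

The design point is simply that `min` is a conservative fusion rule: the robot reacts to whichever sensor sees the nearer obstacle.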


>> Check out a video of the robot in action and download the code here.

Using only a Microsoft Kinect, a laptop, and LabVIEW, BYU-Idaho undergraduate engineering students Kevin Smith and Bryce Perry built an application that acts as a virtual whiteboard. The Kinect depth sensor tracks where a person is in its field of view and overlays that position on the video image.



The user can easily change the drawing color, clear the whiteboard, and adjust the depth threshold that determines where to draw, based on the size of the room. As shown in the video, the system easily handles multiple users drawing at the same time, and any object can be used as a stencil.
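The depth-threshold idea can be sketched in a few lines: any pixel whose depth reading is nearer than a room-dependent cutoff counts as a "pen" touching the virtual whiteboard. The frame layout and numbers below are assumptions for illustration, not the students' LabVIEW code.

```python
# Hedged sketch of the whiteboard's depth test: a pixel "draws" when
# something (a hand, or any object used as a stencil) comes closer to
# the Kinect than the threshold. Values here are illustrative.

def draw_mask(depth_mm, threshold_mm):
    """Return a per-pixel mask: True where the depth frame shows an
    object nearer than the threshold, i.e. where ink should appear."""
    return [[d < threshold_mm for d in row] for row in depth_mm]

# Toy 3x4 depth frame in millimeters: the back wall sits at ~2000 mm,
# and a hand reaches in at ~900 mm in the upper-left corner.
frame = [
    [ 900,  950, 2000, 2000],
    [ 920, 2000, 2000, 2000],
    [2000, 2000, 2000, 2000],
]
mask = draw_mask(frame, threshold_mm=1000)
print(mask[0])  # [True, True, False, False]
```

This also shows why multiple simultaneous users come for free: two people just produce two disjoint regions of the mask, with no extra logic needed.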


The best part? It only took Smith and Perry about two hours to get a working prototype of the system, and about six more hours to fine-tune the whiteboard to the point seen in the video.


>> Read about another application that uses LabVIEW and Kinect.