Archive for the ‘sweetapp’ Category

We know that remote-controlled robots are being developed and played with every day, but when it comes to what these robots can do, sometimes we’re limited. Students and researchers at Temasek Polytechnic are pushing past those limits: they have created a robot capable of surveillance, motion, and awareness of its surroundings.

They used NI PXI hardware to control and operate the webcam, sensors, and motors essential for surveillance, and LabVIEW software to display the webcam image to the operator. With this setup, the operator can remotely control the robot from a PC and track where it is going using the onboard sensors.

[Image: robot.jpg (“Can” photo by Mo Riza)]

>> Read the full case study. 



Using the NI LabVIEW Real-Time Module, the NI Vision Development Module, and the NI 3110 embedded controller to control and aim high-energy lasers, a small group of engineers and scientists at LaserMotive demonstrated the ability to beam hundreds of watts of energy at a distance of up to 1 km, and over a kilowatt at shorter distances. 

>> Read more about LaserMotive and how the company found a way to transmit power wirelessly and supply remote locations with energy.  

Using only a Microsoft Kinect, a laptop, and LabVIEW, BYU-Idaho undergraduate engineering students Kevin Smith and Bryce Perry built an application that acts as a virtual whiteboard. The Kinect depth sensor tracks where a person is in its field of view and overlays that position on the video image.

The user can easily change the color of the drawing, clear the whiteboard, and edit the threshold for picking up where to draw based on the size of the room. As shown in the video, the system can easily handle multiple users drawing at the same time, and any object can be used as a stencil.
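The core of such a whiteboard is a simple depth threshold: any pixel the Kinect reports as closer than a cutoff distance counts as “pen down” and gets painted onto the canvas. The students built theirs in LabVIEW; the NumPy sketch below only illustrates the idea, and the function name and threshold value are assumptions, not their code.

```python
import numpy as np

def update_whiteboard(depth_mm, canvas, color, threshold_mm=1200):
    """Mark canvas pixels wherever the depth map shows something closer
    than the threshold (a hand, or any object used as a "stencil").
    depth_mm is an HxW array of depths in millimeters (0 = no reading);
    canvas is an HxWx3 RGB image that accumulates the drawing."""
    pen_down = (depth_mm > 0) & (depth_mm < threshold_mm)
    canvas[pen_down] = color  # paint only the close-enough pixels
    return canvas

# Simulated 4x4 depth frame with one close object at pixel (1, 2)
depth = np.full((4, 4), 2000)
depth[1, 2] = 800
canvas = np.zeros((4, 4, 3), dtype=np.uint8)
update_whiteboard(depth, canvas, color=(255, 0, 0))
```

Raising the threshold makes the system pick up objects farther from the sensor, which is why a cutoff like this needs tuning to the size of the room.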

The best part? It only took Smith and Perry about two hours to get a working prototype of the system, and about six more hours to fine-tune the whiteboard to the point seen in the video.

>> Read about another application that uses LabVIEW and Kinect.

Some of the most common human activities are actually rather complex. Take walking, for example. It is a “repetitive process that requires the coordination of the lower limbs to move forward and maintain body balance with one foot in contact with the ground at all times.”

Darwin Gouwanda and Arosha Senanayake are two engineers from Monash University in Malaysia who developed an application that analyzes stride, stance phase, and swing phase – the phases that occur between one heel-strike and the next heel-strike of the same foot (a span otherwise known as a gait cycle). Acute injury to one of the limbs can disrupt this process and cause abnormal gait. Significant differences between normal and abnormal gait show up in the durations of the stride, stance phase, and swing phase. To quantify these parameters and study a person’s gait, they developed a wireless gyroscope-based gait monitoring system that helps them diagnose patients and track their rehabilitation progress.

[Image: gait monitoring system overview]

The gait monitoring system measures the angular rates of the lower limbs, then identifies and quantifies gait cycles. They used LabVIEW to develop a user-friendly GUI and to collect real-time data streamed simultaneously from two wireless gyroscopes to a workstation. Using the LabVIEW Advanced Signal Processing Toolkit shortened development time and reduced tedious programming work because it offers comprehensive signal processing tools and algorithms.
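Once heel-strike and toe-off events have been detected in the angular-rate signal, computing the stride, stance, and swing durations is straightforward bookkeeping. A minimal Python sketch, assuming the events have already been extracted as timestamps for one foot (all names here are illustrative, not the authors’ code):

```python
def gait_parameters(heel_strikes, toe_offs):
    """Compute stride, stance, and swing durations (in seconds) for one
    foot from sorted event timestamps. A gait cycle runs from one
    heel-strike to the next heel-strike of the same foot; stance is
    heel-strike to toe-off, and swing is toe-off to the next heel-strike."""
    cycles = []
    for hs, next_hs in zip(heel_strikes, heel_strikes[1:]):
        # find the toe-off event that falls inside this gait cycle
        to = next(t for t in toe_offs if hs < t < next_hs)
        cycles.append({
            "stride": next_hs - hs,
            "stance": to - hs,
            "swing": next_hs - to,
        })
    return cycles

# One cycle: heel-strikes at 0.0 s and 1.1 s, toe-off at 0.66 s
# (stance is about 60% of the stride in typical normal gait)
params = gait_parameters([0.0, 1.1], [0.66])
```

Comparing these durations between the injured and uninjured limb is one way differences between normal and abnormal gait become measurable.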

Just another example of how engineers use NI tools to improve everyday life.

>> Get more technical details for this application.