
National Instruments' Top Trends in Test and Measurement for 2009

14th January 2009
As the global economic climate places additional constraints on budgets, test engineers are challenged to identify ways to test devices more efficiently than ever before. National Instruments has identified three trends – software-defined instrumentation, parallel processing technologies and new methods for wireless and semiconductor test – that will significantly improve the efficiency of test and measurement systems in 2009.
These trends help engineers develop faster, more flexible automated test systems while reducing the overall cost of test, and companies across industries worldwide are already seeing significant benefits from applying them.

“The challenging world economy is forcing more companies to look at alternatives to their existing test engineering strategies,” said Eric Starkloff, Vice President of Product Marketing for Test and Measurement at National Instruments. “More engineers than ever before are turning to software-defined instrumentation and the latest commercial technologies to achieve significant performance and flexibility gains while reducing their overall cost of test.”

The adoption of software-defined instrumentation is the most significant trend in test and measurement for 2009. Engineers are using software-defined instrumentation to achieve new levels of measurement performance and lower test costs, applying the latest technological advances such as multicore processors and field-programmable gate arrays (FPGAs) to meet the demands of new application areas such as wireless and protocol-aware test. The quick return on investment these technologies deliver is a major contributor to the mainstream adoption of software-defined instrumentation.

Growth of Software-Defined Instrumentation
Software-defined instruments, also known as virtual instruments, consist of modular hardware and user-defined software that give engineers the ability to combine standard and user-defined measurements with custom data processing using common hardware components. This flexibility has become critical as electronic devices such as next-generation navigation systems and smart phones integrate diverse capabilities and rapidly adopt new communication standards. Using software-defined instruments, engineers can rapidly reconfigure their test equipment by modifying software algorithms to meet changing test requirements.
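
As an illustration of the concept, the sketch below (written in Python rather than LabVIEW, with a hypothetical acquire() routine standing in for a modular digitiser driver) shows how the same acquired record can yield entirely different measurements simply by swapping the analysis function in software.

    import numpy as np

    FS = 1e6  # sample rate shared by acquisition and analysis

    def acquire(samples=10_000):
        # Hypothetical stand-in for a modular digitiser driver call;
        # here it simply synthesises a 10 kHz tone plus noise.
        t = np.arange(samples) / FS
        return np.sin(2 * np.pi * 10e3 * t) + 0.01 * np.random.randn(samples)

    def rms(record):
        return np.sqrt(np.mean(record ** 2))

    def peak_frequency(record):
        spectrum = np.abs(np.fft.rfft(record))
        return np.fft.rfftfreq(len(record), 1 / FS)[np.argmax(spectrum)]

    # Same hardware, different instrument: the measurement is defined
    # entirely by the software routine applied to the acquired record.
    record = acquire()
    for analysis in (rms, peak_frequency):
        print(analysis.__name__, analysis(record))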

Because of this flexibility and cost-effectiveness, thousands of companies are adopting software-defined instrumentation based on the National Instruments LabVIEW graphical programming platform and the open, multivendor PXI hardware standard. According to the PXI Systems Alliance, more than 100,000 PXI systems will be deployed by the end of 2009, and the number of deployed PXI systems is expected to double in the next decade.

“The open, modular architecture of software-defined instruments such as those in PXI has proven beneficial to a wide range of industries, and, as a result, PXI revenue in measurement and automation is expected to grow at 17.6 percent CAGR through 2014,” said Jessy Cavazos, Test and Measurement Industry Manager at Frost & Sullivan. “The performance delivered by the PXI platform has successfully addressed areas such as RF applications in radar testing, mobile phone testing and other wireless applications that were previously impossible to address with other instrumentation.”

Increased Adoption of Parallel Processing Technologies
Multicore technology has become a standard feature in automated test systems and a necessity for testing today’s electronic devices, which process unprecedented amounts of data. Software-defined instrumentation takes advantage of the latest multicore processors and high-speed bus technologies to generate, capture, analyse and process the gigabytes of data required to properly design and test electronic devices. Multicore architectures can present a challenge when used with traditional text-based programming environments that are not inherently parallel and require low-level programming techniques. However, test engineers can quickly realise the benefits of multicore technology through inherently parallel programming environments such as LabVIEW, which automatically distributes multithreaded applications across multiple computing cores for maximum performance and throughput.
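
LabVIEW distributes parallel code across cores automatically; the Python sketch below makes the same idea explicit, assuming only NumPy and the standard library, by fanning blocks of a simulated capture out to a pool of worker processes for spectral analysis.

    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def analyse_block(block):
        # Per-block spectral analysis; each call can run on its own core.
        spectrum = np.abs(np.fft.rfft(block))
        return int(np.argmax(spectrum)), float(np.max(spectrum))

    if __name__ == "__main__":
        # Simulated capture split into eight records, standing in for
        # data streamed from a high-speed digitiser.
        capture = np.random.randn(8, 1_000_000)
        with ProcessPoolExecutor() as pool:
            results = list(pool.map(analyse_block, capture))
        print(results)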

Another area of growth for software-defined instrumentation is the increase in system-level tools for FPGAs. Many modular instruments now come equipped with FPGAs, including several released in the past year that offer the high-performance Xilinx Virtex-5 FPGA. These FPGA-based instruments give test engineers the ability to implement more complex digital signal processing at faster rates than ever before. Because software such as LabVIEW lets test engineers program FPGAs without knowledge of VHDL, the performance benefits of FPGAs are no longer limited to the subset of hardware engineers with extensive digital design expertise.
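
Such targets are configured through high-level tools rather than hand-written VHDL; purely as a behavioural sketch, the Python fragment below models the kind of per-sample logic (a sliding-window average feeding a threshold trigger) that an FPGA-based instrument would execute inline on the data stream.

    import numpy as np

    def window_average_trigger(samples, window=8, threshold=0.5):
        # Behavioural model of streaming FPGA logic: maintain a small
        # circular buffer and flag every sample whose windowed average
        # crosses the threshold.
        buffer = np.zeros(window)
        triggers = []
        for i, sample in enumerate(samples):
            buffer[i % window] = sample
            if buffer.mean() > threshold:
                triggers.append(i)
        return triggers

    print(window_average_trigger(np.random.randn(1_000)))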

Expansion of Wireless and Protocol-Aware Test
Beyond these technological advances, software-defined instrumentation has proved ideal for rapid-growth areas such as wireless and protocol-aware test. For example, consumer electronics devices including cell phones and automotive in-dash entertainment systems often integrate multiple communication protocols and standards such as GPS, WiMAX and WLAN. Test engineers using traditional instruments must wait for a dominant standard to emerge and then for vendors to develop a dedicated, stand-alone box instrument to test that standard. With software-defined instruments, engineers can test multiple standards using common modular hardware components and can implement emerging and custom wireless protocols and algorithms in their test systems regardless of how mature a new wireless standard is.
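
A sketch of that software-only reconfiguration is shown below; the per-standard routines are hypothetical placeholders for real modulation toolkits, but the point is that one IQ record captured with common hardware can feed whichever measurement the standard under test requires.

    import numpy as np

    # Hypothetical per-standard measurements; a real system would call
    # dedicated demodulation toolkits here instead.
    def measure_gps(iq):
        mean_power = np.mean(np.abs(iq) ** 2)
        return {"standard": "GPS", "mean_power_dB": 10 * np.log10(mean_power)}

    def measure_wlan(iq):
        papr = np.max(np.abs(iq) ** 2) / np.mean(np.abs(iq) ** 2)
        return {"standard": "WLAN", "papr_dB": 10 * np.log10(papr)}

    MEASUREMENTS = {"GPS": measure_gps, "WLAN": measure_wlan}

    # Stand-in for a digitised IQ capture from shared front-end hardware.
    iq_record = (np.random.randn(4_096) + 1j * np.random.randn(4_096)) / np.sqrt(2)
    for standard in ("GPS", "WLAN"):
        print(MEASUREMENTS[standard](iq_record))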

Additionally, the emergence of complex systems on a chip (SoCs) and systems in a package (SiPs) in the semiconductor industry has led to increased demand for “protocol-aware ATE,” or the ability to test devices by emulating the real-world signals connected to them. These growing requirements for semiconductor test, together with the need to reduce total test costs, have led industry organisations such as the Semiconductor Test Consortium and the Collaborative Alliance for Semiconductor Test to investigate standards for open test architectures that support the integration of modular, software-defined instrumentation such as PXI into traditional semiconductor ATE. By using software-defined, FPGA-based instrumentation in these semiconductor test systems, engineers can achieve real-time responses with the standard pin electronics found in traditional ATE, lowering the total cost of test through better use-case coverage and improving the user’s ability to debug failures.
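
The behavioural sketch below illustrates the protocol-aware idea with a simulated device standing in for the DUT and its pin electronics: the tester issues protocol transactions and checks each response within the same loop, rather than replaying a fixed stimulus vector set.

    # Simulated register-read protocol; real protocol-aware ATE would drive
    # the device's pins through FPGA-based instrumentation, not this stub.
    def dut_respond(address):
        registers = {0x01: 0xAB, 0x02: 0xCD}
        return registers.get(address, 0x00)

    def protocol_aware_test(expected):
        failures = []
        for address, want in expected.items():
            got = dut_respond(address)  # stimulus and response in one transaction
            if got != want:
                failures.append((hex(address), hex(want), hex(got)))
        return failures

    print(protocol_aware_test({0x01: 0xAB, 0x02: 0xCD, 0x03: 0xEF}))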
