Test & Measurement

Seeking A Test Panacea

15th May 2013
Testing electronic circuits has been a key topic in the industry since the first transistor was developed, and today it is as relevant as ever. But how close to the ideal solution can developers get? By Thomas Wenzel, Co-Founder and Managing Director of GOEPEL electronic GmbH in Jena, Germany, and Heiko Ehrenberg, President of North American operations, GOEPEL electronics LLC in Austin, TX, USA.
Test strategies are graded by how close they come to the ideal test solution: one that adds no cost to the product under test, neither during design nor during production. Most of us agree that product testing is absolutely necessary as part of design validation, as a quality indicator for manufacturing process control, and for the detection of defective products prior to shipping. However, an ideal test solution would have to meet certain requirements: test development and execution should be fully automated and take essentially no time, the test equipment should be very inexpensive, and fault coverage should be 100%.

Industry trends give cause for concern, though, considering that the cost of test today can be a significant part of the overall development and manufacturing cost. The causes for this trend are primarily the complexity of high-speed designs and the lack of available test access on many of today’s printed circuit board assemblies (PCBAs).

The combined forces of these characteristics result in systematic changes in the balance of product design and test. We start to see a correlation between problems seen in chip test and those seen in board test.

PCBs now look more like integrated circuits (ICs) due to the loss of access to internal circuit nodes, while the rapid development of three-dimensional (3D) ICs with multi-die integration results in structures that resemble boards and systems. The 3D board with very little physical access seems to be looming on the horizon. At the same time, the combination of new packaging and integration technologies results in hitherto unaccustomed complexity. While several years ago multiple boards were necessary to create a complete system design, today such systems may be realised in integrated circuits as System-on-Chip (SoC) or System-in-Package (SiP) designs. As a result, board size can be minimised and new possibilities open up for creating super-complex systems.

No matter how a design is arranged, however, from the perspective of test engineering the fundamental question is how such highly complex systems can be tested appropriately and efficiently, and how one can take advantage of synergies between chip test and board test approaches.


Partitioning circuit structures into testable elements is a prerequisite for a successful test strategy. This is one of the reasons why in-circuit test (ICT) has become so successful for board-level test. ICT approaches circuit test structurally and tests components individually; however, the required bed-of-nails-based invasive test access is becoming a big dilemma with modern boards. The predictability of test access problems led to the creation of IEEE Std 1149.1 in 1990. The brilliance of 1149.1 is the open expandability of its register architecture, combined with the universal test bus interface (Test Access Port, or TAP) and its protocol definition. These properties allowed IEEE Std 1149.1 to become the base technology for new non-intrusive methodologies and standards for testing, debugging, programming, and emulation. As a result, the portfolio of test access strategies at the board level has definitively changed.

Today we can differentiate three principal classes of access strategies (Figure 1):

Native Connector Access (access through design integrated I/O interfaces);

Intrusive Board Access (access through physical test nails and probes);

Embedded System Access (access through design integrated test bus).

While these classes are not mutually exclusive in their practical utilisation, the applicability of an actual combination of these access strategies depends on the individual capabilities of the chosen automated test equipment (ATE) platform.

So, how do these access strategies relate to each other and what does Embedded System Access mean practically?

Figure 1: Classification of electrical test access strategies at board level

A New Era?

A look at the qualitative development of trends for the various access strategies reveals interesting facts, including a long adoption period of IEEE Std 1149.1 as the first representative for Embedded System Access. The accelerated adoption of Embedded System Access in the market is primarily owed to the fact that it is now a class by itself comprising a variety of non-invasive access technologies, including:

Boundary-Scan Test (IEEE Std 1149.1/.4/.6/.7)

Processor-Emulation Test

Chip-Embedded Instrumentation (IJTAG, IEEE P1687)

In-System Programming

Core-Assisted Programming

FPGA-Assisted Test

FPGA-Assisted Programming


The electrical access embedded in the target system allows Embedded System Access (ESA) to work without invasive test nails and probes. In principle, every ESA technology utilises a task specific pin-electronic which is controlled by the test bus and, as a result, can directly execute test functions and programming routines in the target system. This target system can be an individual chip, a board, or a complete system assembly; Embedded System Access can be utilised throughout the entire product life cycle.

A detailed analysis of key ESA technologies at the board level reveals considerable differences in operation and goals and it becomes clear how important it is for ATE platforms to support all these ESA technologies alike.

Boundary Scan utilises so-called Boundary-Scan Cells, combined into a Boundary-Scan Register, as primary access points for a target system’s circuit nodes. The Boundary-Scan Register is accessed and controlled through the Test Access Port (TAP). All vectors are scanned serially.
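To illustrate the serial scan principle, here is a minimal Python sketch of a boundary-scan register shift. The four-cell register and the bit patterns are purely illustrative, not taken from any real device:

```python
def shift_bsr(register, vector):
    """Shift a test vector into a boundary-scan register (BSR) bit by
    bit, while the previously captured cell contents appear serially
    on TDO. `register` holds the captured pin states, with the last
    element sitting nearest to TDO."""
    captured_out = []
    for tdi_bit in vector:                    # one TCK cycle per bit
        captured_out.append(register[-1])     # bit appearing on TDO
        register = [tdi_bit] + register[:-1]  # shift towards TDO
    return register, captured_out

# Example: a 4-cell BSR that captured pin states 1,0,1,1;
# shift in the drive pattern 0,1,1,0.
new_reg, tdo = shift_bsr([1, 0, 1, 1], [0, 1, 1, 0])
# new_reg now holds the drive pattern; tdo holds the captured
# pin states, TDO-nearest bit first.
```

After the shift, an Update-DR operation in a real device would apply the new register contents to the pins; the sketch stops at the shift itself.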

Processor Emulation Test (PET) utilises the debug interface to transform the processor core temporarily into a native test controller (Figure 2). The processor and its system bus interface become the pin-electronics used as access points for the connected circuitry in the target systems. Remote-controlled through the JTAG interface or some other debug interface, the processor core utilises write and read access to the system bus with respective test vectors in order to manipulate and test the connected internal and external resources and components.

Figure 2: Principle of Processor Emulation Test (PET)
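As a sketch of what a PET tool does with those write and read accesses, the following Python fragment simulates a simple write/read-back pattern test over a memory region on the system bus. The DebugAdapter class and its write_word/read_word methods are hypothetical stand-ins for a real debug-interface driver, and the test patterns are illustrative:

```python
class DebugAdapter:
    """Hypothetical debug-adapter API: write_word()/read_word() stand in
    for memory accesses issued by the processor core under JTAG/debug
    control. Here a dictionary simulates the target memory."""
    def __init__(self):
        self._mem = {}

    def write_word(self, addr, value):
        self._mem[addr] = value & 0xFFFFFFFF

    def read_word(self, addr):
        return self._mem.get(addr, 0)

def pattern_test(dbg, base, words):
    """Write/read-back test over a memory region on the system bus,
    roughly as a PET tool might perform it to detect stuck or shorted
    address/data lines. Returns a list of (address, expected, actual)
    tuples for any mismatches."""
    failures = []
    for pattern in (0x55555555, 0xAAAAAAAA):
        for i in range(words):
            dbg.write_word(base + 4 * i, pattern ^ i)
        for i in range(words):
            expected = (pattern ^ i) & 0xFFFFFFFF
            actual = dbg.read_word(base + 4 * i)
            if actual != expected:
                failures.append((base + 4 * i, expected, actual))
    return failures
```

On fault-free simulated memory the failure list comes back empty; with real hardware, any mismatch points at the affected address and data bits.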

Chip-embedded Instruments are test and measurement intellectual property (IP) blocks integrated into ICs, often accessible through the JTAG port. The functionality of chip-embedded instruments is completely open and ranges from simple sensors, through complex signal processing and data collection, all the way to complete analysis instruments and programming engines. The IP is either integrated permanently in the chip (hard macro), or it can be temporarily instantiated and configured (soft macro) in Field Programmable Gate Arrays (FPGAs). As a result, the pin-electronics are unrestricted in principle and can provide a wide variety of functionality, within the frame and scope of the respective host device technology, of course.

FPGA-embedded instruments in particular have enjoyed strong interest recently. By enabling strategies such as FPGA-Assisted Test (FAT) and FPGA-Assisted Programming (FAP), they provide enormous flexibility for adaptation to individual test and measurement requirements.

Chip-embedded instruments have been utilised for years in chip test, for example in the form of built-in self-test (BIST) IP. However, access to these instruments has not been standardised in the past, something that will be changed with the new IEEE P1687 (also known as IJTAG).

In-System-Programming (ISP) is a collective term for the programming of Flash devices via Boundary Scan and for the programming of PLD/FPGA devices through their Test Access Port (TAP) and built-in programming registers, while the devices are mounted on the printed circuit board.

For in-system programming of PLD/FPGA devices, special standards exist, such as IEEE Std 1532, JESD-71, and an industrial standard called Serial Vector Format (SVF).
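The SVF format itself is plain ASCII, which makes its structure easy to show. The Python sketch below parses just two of its statements (SIR and SDR); real SVF supports many more commands (STATE, RUNTEST, TRST, and so on), so this is only an illustration:

```python
import re

def parse_svf(text):
    """Very small parser for a subset of SVF: SIR (shift instruction
    register) and SDR (shift data register) statements with a TDI
    value. Statements end with ';' and '!' starts a line comment."""
    statements = []
    # Strip comments, then split the remaining text into statements.
    clean = "\n".join(line.split("!")[0] for line in text.splitlines())
    for stmt in clean.split(";"):
        stmt = " ".join(stmt.split())   # normalise whitespace
        if not stmt:
            continue
        m = re.match(r"(SIR|SDR)\s+(\d+)\s+TDI\s*\(([0-9A-Fa-f]+)\)", stmt)
        if m:
            cmd, length, tdi = m.groups()
            statements.append((cmd, int(length), int(tdi, 16)))
    return statements

example = """
! Load an instruction, then shift a data pattern
SIR 8 TDI (FE);
SDR 16 TDI (ABCD);
"""
```

A player tool would translate each parsed statement into the corresponding TAP state sequence and scan operation.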

The premise of the Core Assisted Programming (CAP) strategy is similar to Processor Emulation Test. The processor is controlled through its native debug interface in a way that allows Flash or FPGA (design permitting) connected to the system bus to be erased, programmed, and verified. In the case of Flash it does not matter whether it is integrated in the processor/microcontroller (on-chip Flash) or connected as external, discrete Flash device(s). Furthermore, it is possible to load only the Flash handler/programming engine into the processor via JTAG and to download the Flash data image through a high-speed communication interface on the processor, further accelerating CAP.

One of the most interesting technologies for Flash ISP, referred to as FPGA Assisted Programming (FAP), is based on FPGA-embedded instruments. The embedded instrument in this case is a programming engine (programmer) soft macro, typically provided by a tool vendor and temporarily downloaded into the FPGA. Depending on the architecture of the programmer IP and the performance of the external control system, drastic improvements in programming speed compared to boundary-scan based ISP are possible.

The last access technology in this discussion is referred to as System Level JTAG. While remote control through an external controller is possible, this technique typically employs a central test control unit integrated directly into the system design.

Test vectors are usually stored locally on the system and a separate IC is commonly used as the test bus controller (although, there is also the possibility to integrate the test bus controller function in an IC that also performs other functions in the system design). As the name implies, this method can be employed not only for individual boards but also for systems comprising multiple boards and modules.

System-Integrated Tester

The transition from traditional invasive test access and techniques to Embedded System Access is not a marginal change in the handling of test and programming vectors but rather a fundamental technological metamorphosis. Characteristics of these changes include:

Integration of test electronics in the system under test;

Inseparable coupling of functional and test circuitry in the system design;

Forming of partitioned test centres with various features;

Significantly wider range of test and programming strategies;

Possible utilisation throughout the entire product life cycle;

Flexibility of re-configurable pin-electronics with FPGAs;

Availability of completely new instrumentation platforms.

Figure 3: Transformation caused by Embedded System Access

In practice, Embedded System Access represents a transformation from a purely functional design into a functional design with integrated test capabilities, a combination of unit under test and tester, so to speak (Figure 3).

Depending on the actual implementation of Embedded System Access, a wide variety of applications is possible (Figure 4). Currently, FPGA-based test in particular is a technology driver for progressively more complex test and measurement functions. This includes applications such as:

Voltage measurements;

Frequency measurements;

Temperature measurements;

Bit Error Rate Tests (BERT) for high-speed signals;

Event counters, and;

Logic Scopes.

Figure 4: Control of ESA applications through external hardware and software
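Of these applications, the bit error rate computation is easy to sketch. In a real FPGA-based BERT the pattern generation and comparison run in fabric at line rate; the Python fragment below shows only the arithmetic, with illustrative bit sequences:

```python
def bit_error_rate(sent, received):
    """Compare two equal-length bit sequences and return
    (error_count, BER). In an FPGA-based BERT this comparison runs
    in hardware; here it is only the underlying arithmetic."""
    if len(sent) != len(received):
        raise ValueError("sequences must have equal length")
    errors = sum(s != r for s, r in zip(sent, received))
    return errors, errors / len(sent)

# Illustrative example: a 1000-bit alternating pattern with
# three injected bit errors.
sent = [0, 1] * 500
received = sent.copy()
for i in (10, 500, 999):
    received[i] ^= 1          # flip three bits

errors, ber = bit_error_rate(sent, received)
```

Real BERT runs accumulate many orders of magnitude more bits, since a meaningful BER figure for a high-speed link (e.g. 1e-12) requires correspondingly long observation intervals.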

Tester Instrumentation

So far the discussion has been about the JTAG interface as the test bus. However, there are also a number of proprietary bus interfaces used in the industry, in particular for debug interfaces on processors, such as Serial Wire Debug (SWD), Spy-Bi-Wire (SBW), or Background Debug Mode (BDM). For ATE vendors this means that their test bus controllers need to provide the flexibility to support any such interface; even a mix of different test bus interfaces in multi-processor applications should be supported. Furthermore, the various ESA technologies must be supported by powerful software tools and must be made available to the user through intuitive graphical user interfaces. In this context we need to consider not only the independent use of individual ESA methods, but also the potentially interactive application of various ESA technologies in order to gain extra benefits.
