Electrical Instrument Test Automation

Testing automation and storage management 

Client: Freescale Semiconductor

HSSI Selection: Selecting a device to test
HSSI Job Creation: Writing the test kernel
HSSI Job Runner: Running a queue of jobs
HSSI Query: Querying the database for results


Freescale’s High Speed Signal Integrity (HSSI) group is responsible for testing the electrical characteristics of the communications subsystems of the chips Freescale makes. The standards bodies for the serial communications protocols that Freescale implements set criteria for things like the amount of electrical noise the chip must tolerate while keeping the bit-error rate below a specified threshold.

The HSSI group needs to test their chips not only against these criteria but also against their own performance targets, and sometimes simply to debug a design problem. Most of the test equipment is computer-controllable via a semi-standard protocol, so it is feasible to automate much of this testing.

Freescale needed a toolset for scripting their tests, queueing up jobs, and storing the results in a central database. The data need flexible reporting and visualization so that users can diagnose problems. They must also be tied to the script that generated them, as well as to the instruments used, so that the procedure can be revisited and revised years later.


At the bottom of the stack, we have a Python library for communicating with instruments that speak the SCPI protocol, which includes most of the instruments in the HSSI domain. The most useful features of each instrument are exposed in the Python API, and it is easy for us or the HSSI group to implement the remaining features as needed.
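A minimal sketch of what such an instrument layer can look like, assuming SCPI over a raw TCP socket (many instruments listen on port 5025 and accept newline-terminated ASCII commands). The class name `ScpiInstrument`, the injectable `transport`, and the fake reply string are all illustrative, not the actual HSSI toolset's API:

```python
import socket


class ScpiInstrument:
    """Illustrative SCPI instrument wrapper over a raw TCP socket.

    SCPI commands are newline-terminated ASCII strings; queries end
    in '?' and the instrument replies with one line of text.
    """

    def __init__(self, host, port=5025, timeout=5.0, transport=None):
        # `transport` lets tests inject a fake socket-like object
        # instead of opening a real network connection.
        if transport is None:
            transport = socket.create_connection((host, port), timeout=timeout)
        self._sock = transport
        self._buffer = b""

    def write(self, command):
        """Send a bare SCPI command, e.g. '*RST'."""
        self._sock.sendall(command.encode("ascii") + b"\n")

    def query(self, command):
        """Send a query (e.g. '*IDN?') and return the one-line reply."""
        self.write(command)
        while b"\n" not in self._buffer:
            chunk = self._sock.recv(4096)
            if not chunk:
                break
            self._buffer += chunk
        line, _, self._buffer = self._buffer.partition(b"\n")
        return line.decode("ascii").strip()

    def identify(self):
        """Standard identification query defined by IEEE 488.2."""
        return self.query("*IDN?")
```

Keeping the transport injectable is what makes a layer like this testable without hardware, and it is also the natural seam for supporting the handful of non-SCPI instruments behind the same API.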

We also wrap a few instruments that do not speak the SCPI protocol behind the same consistent API.

Above this library, we have a GUI application for writing the automation scripts, queueing the test jobs, and viewing the resulting data. Each job is composed of a number of runs of the same script with different input parameters, which permits an exploration of the causes of test failures. The test script may also pass back partial results while it is running, so the user can monitor the progress of a long test run on a live plot. The user can pause, stop, and reorder jobs to accommodate changing schedules. The resulting data can be viewed in the form of tables and plots according to flexible user-defined queries, and pushed to the central database for public viewing and archival purposes.
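The job model described above can be sketched roughly as follows; `Job`, `JobQueue`, and the `on_partial` callback are hypothetical names chosen for illustration, not the toolset's actual interfaces:

```python
from collections import deque


class Job:
    """One queued job: the same test kernel run once per parameter set."""

    def __init__(self, name, kernel, parameter_sets):
        self.name = name
        self.kernel = kernel                      # callable(params) -> result
        self.parameter_sets = list(parameter_sets)
        self.results = []

    def run(self, on_partial=None):
        for params in self.parameter_sets:
            result = self.kernel(params)
            self.results.append(result)
            if on_partial:                        # lets a GUI plot progress live
                on_partial(self.name, params, result)


class JobQueue:
    """FIFO of jobs that the user can reorder before they run."""

    def __init__(self):
        self._jobs = deque()

    def enqueue(self, job):
        self._jobs.append(job)

    def move_to_front(self, name):
        """Reorder: promote a job, e.g. to meet a schedule change."""
        for job in list(self._jobs):
            if job.name == name:
                self._jobs.remove(job)
                self._jobs.appendleft(job)
                return

    def run_all(self, on_partial=None):
        completed = []
        while self._jobs:
            job = self._jobs.popleft()
            job.run(on_partial)
            completed.append(job)
        return completed
```

For example, queueing two jobs and promoting the second before running would execute it first, with each run's partial result reported through the callback as it completes.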

Want to learn more about what we can help you achieve? Contact Us