Behind the Code: Janusz Rajski, DFT (Design-for-testability)
27 Feb 2025
Janusz Rajski, VP of Engineering at Tessent, Siemens, shares insights on tackling semiconductor test challenges, from test compression to in-system solutions. He discusses the evolution of DFT, the impact of 3D packaging, and the future of chip reliability in an increasingly complex landscape.
Q: What were the biggest challenges you faced in developing advanced ATPG techniques, and how did you overcome them?
A: The biggest challenges for me and the Tessent team at Siemens were always the test challenges of the semiconductor industry. Every 5 to 10 years the industry faces new grand test challenges.
About 25 years ago it was the cost of test, caused by a rapidly growing volume of scan test data and test time. Our response was the test compression technology implemented in the TestKompress product [1]. It initially reduced the volume of test data by ten times, and it has kept up with the growing sizes of designs ever since, exceeding 100 times today.
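As a back-of-the-envelope illustration of why compression matters (a toy model with hypothetical numbers, not Tessent's actual scheme), both the stored test data and the scan shift time scale down with the compression ratio, because the tester stores compressed stimuli and the internal scan chains become shorter:

```python
# Toy model of scan test cost vs. compression ratio. All numbers are
# hypothetical, chosen only to show the arithmetic; this is not how
# TestKompress itself is implemented.

def compressed_test_cost(scan_cells, patterns, compression, shift_mhz=100):
    """Return (data_bits, test_time_seconds) for a compressed scan test."""
    raw_bits = scan_cells * patterns           # uncompressed stimulus volume
    data_bits = raw_bits / compression         # tester stores compressed data
    # Shift time also shrinks: the decompressor feeds many short internal
    # chains, so each pattern needs fewer shift cycles.
    shift_cycles = (scan_cells / compression) * patterns
    return data_bits, shift_cycles / (shift_mhz * 1e6)

raw = compressed_test_cost(10_000_000, 5_000, compression=1)
edt = compressed_test_cost(10_000_000, 5_000, compression=100)
print(f"data reduced {raw[0] / edt[0]:.0f}x, time reduced {raw[1] / edt[1]:.0f}x")
```

Under this simplified model, a 100x compression ratio reduces both the stored data volume and the shift time by 100x, which matches the order of magnitude quoted above.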
The next challenge of paramount importance was test quality. It became extremely relevant when the traditional gate-level fault models were no longer able to provide adequate screening quality. To address this challenge, we developed Cell-Aware Test technology [2], which looks at all possible defects inside standard cells, using cell layout information to generate the list of potential defects. Subsequently we added a timing-aware capability to handle small-delay defects. We are currently upgrading its ability to handle the silent data corruption errors that have become a major issue in data centers.
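The core cell-aware idea, enumerating defects inside a cell and finding the input patterns that expose them, can be sketched with a toy model. The NAND cell and the defect behavior below are illustrative assumptions, not the actual layout-driven defect extraction:

```python
# Hedged sketch of the cell-aware principle (not the Tessent flow): a
# defect inside a cell is modeled by the faulty output function it
# produces, and a pattern detects the defect when the good and
# defective cells produce different outputs.

from itertools import product

def nand2(a, b):            # fault-free 2-input NAND
    return int(not (a and b))

def nand2_defect(a, b):     # hypothetical internal defect: output stuck
    return 1                # at logic 1 regardless of the inputs

def detecting_patterns(good, bad, num_inputs=2):
    """Exhaustively find cell input patterns that expose the defect."""
    return [p for p in product((0, 1), repeat=num_inputs)
            if good(*p) != bad(*p)]

print(detecting_patterns(nand2, nand2_defect))  # only (1, 1) exposes it
```

A gate-level stuck-at model on the cell's pins would miss many such internal defects; exhaustively analyzing each cell's behavior against its defect list is what lets cell-aware patterns target them directly.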
When single-die designs started exceeding the complexity of tens of billions of transistors, broken into many cores and assembled into multi-die packages, the traditional flat Design for Test (DFT) flow, with its long test pattern generation times, complex planning, and slow test application, was no longer good enough. There was a pressing need for a hierarchical DFT flow, where DFT insertion and test pattern generation for every core could be done independently of all other cores. Our solution was a hierarchical DFT flow in which test patterns for all cores are mapped to the top design level and delivered in a packetized data format by the Streaming Scan Network [3] at a much higher rate than traditional scan. The functionality provided by SSN and In-System Test [4], a product recently introduced by Tessent, Siemens, is now used not only for manufacturing test but also in-system, to apply high-quality deterministic (ATPG) patterns.
Q: What inspired your work on logic BIST, and how do you see its role evolving in emerging technologies like AI accelerators and heterogeneous computing systems?
A: Initially the development of LBIST was driven by in-system, in-field applications. Its minimal volume of test data makes it attractive for applications with limited in-system memory, and this is where LBIST is used today. LBIST is based on pseudorandom patterns and test points inserted during the design stage; this limits its ability to achieve high coverage and handle advanced fault models. The test content, once determined at the design phase, is difficult to change.
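The pseudorandom stimulus behind LBIST is typically produced by a linear-feedback shift register. A textbook 4-bit Fibonacci LFSR (an illustrative example, not any particular product's pattern generator) can be sketched as:

```python
# Minimal LFSR sketch illustrating LBIST-style pseudorandom stimulus.
# The 4-bit register with taps at bits 3 and 2 is a standard
# maximal-length example: it cycles through all 15 nonzero states.

def lfsr_patterns(seed=0b1001, taps=(3, 2), width=4, count=8):
    """Yield successive LFSR states, usable as scan-in test patterns."""
    state = seed
    mask = (1 << width) - 1
    for _ in range(count):
        yield state
        feedback = 0
        for t in taps:
            feedback ^= (state >> t) & 1      # XOR the tap bits together
        state = ((state << 1) | feedback) & mask

print([f"{p:04b}" for p in lfsr_patterns()])
```

Because the sequence is fixed by the seed and feedback polynomial, the on-chip hardware is tiny, but the patterns cannot be steered toward random-pattern-resistant faults, which is exactly the coverage limitation described above and the reason test points are inserted.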
At some point, test data volume and the cost of test drove interest in using LBIST for in-field test, but given its limited test quality we developed test compression and in-system deterministic test instead. Still, there is a place for LBIST in applications with limited system memory.
Q: What advice would you give to engineers or researchers just starting in the DFT field?
A: DFT is a fascinating and fast-moving area, quickly expanding its scope and applications. Every year there are new developments, new issues, and changes in best practices. If I were a DFT engineer, I would try to stay on top of all new developments. I would try to constantly learn about recent advancements by attending technical conferences like the International Test Conference (ITC), signing up for tutorials, and attending DFT seminars given by Tessent, Siemens. I would try to follow and learn from the best practices of the industry leaders.
Albert Einstein famously said, "Strive not to be a success, but rather to be of value." I strongly believe that any research project worth pursuing should aim at creating value for society, either by pushing the state of the art in science or in industrial practice. The number of publications, or even citations, is not by itself a measure of value. Projects of value make a lasting impact and eventually lead to real success. A research project in applied sciences, like DFT, that improves our understanding of nature and results in significant benefits through broad adoption is of value and worth pursuing. While we should accept the risk that not every project will be successful in achieving its initial goal, a project is not a failure if it provides learnings. Researchers, whether starting out or established, should look at long-term trends to anticipate the challenges that will appear in the future. That is usually a good way to select promising research projects.
Q: Looking back on your career, what project or achievement are you most proud of, and why?
A: I am very proud of all the projects that solved major test challenges of the semiconductor industry with highly innovative solutions incorporated in products developed by the Tessent organization of Siemens. The most impactful products, like TestKompress, Cell-Aware Test, Streaming Scan Network, and In-System Test, addressed major test challenges of the semiconductor industry, were industry firsts, and set trends for the industry to follow.
But what I am even more proud of is the culture of innovation that we have created at Tessent. It is based on long-term trend analysis and planning (5-10 years), focused research leading to innovative solutions, and collaboration with leading semiconductor companies. Products developed with these principles have extended longevity and offer big value to the users. Breakthrough products change the competitive landscape for their users as well as the developers.
Q: How has DFT evolved over the past decade, and what are the latest trends in the field?
A: The significant changes in DFT that shaped industrial practice today started with test compression, pioneered by TestKompress in 2001. The reduced volume of test data enabled the adoption of the already-known transition fault model and of new fault models, like Cell-Aware Test and other variants of defect-oriented test. Test patterns based on the advanced fault models, combined with timing awareness, resulted in a major improvement in the quality of test. As designs grew in complexity and became multi-core, and recently also multi-die, the hierarchical DFT flow was introduced to allow DFT insertion and test pattern generation to be performed concurrently and independently for each core as well as for the system in which the cores are intended to operate. The hierarchical DFT flow, built on TestKompress and combined with the Streaming Scan Network, efficiently handles designs with multiple identical cores, and this is why it has been quickly adopted in the last couple of years. Very recently, a need emerged in data centers to perform periodic in-field test. This accelerated the adoption of In-System Test, which enables deterministic, ATPG-quality test in the field.
In the past decade, there has also been significant progress in the development and adoption of new solutions in analog/mixed-signal test, memory BIST, 3D test, IJTAG, and diagnosis and yield learning. Although we don't have time to cover all those advancements here, several publicly available IEEE publications provide the technical details.
Q: What advice would you give to companies looking to improve their DFT processes?
A: The process of adoption of new disruptive technology is well documented by Geoffrey A. Moore in Crossing the Chasm. We always have early adopters, early majority, late majority, and laggards. While not every company can afford to be an early adopter before a new technology crosses the chasm, it is advantageous to be in the group of early majority rather than late majority or laggards. Many companies extensively documented the benefits coming from the early adoption of new Tessent DFT technologies in their publications at various IEEE-sponsored technical conferences. The improvement in cost of test, quality, productivity, or time to market can be a source of a significant competitive advantage.
Q: What do you see as the biggest challenges and opportunities for DFT in the next 5–10 years? And how do you envision the role of a DFT Engineer changing in the future?
A: The proliferation of 3D packaging will drive the quality of test to even higher standards. It would be counterproductive to integrate a bad die into a very expensive 3D package and then throw the package away. The new tests will have to ensure that the die works correctly under different environmental conditions, in all the process, voltage, and temperature (PVT) corners it was designed to work in. Early-life failures will need to be accelerated by stress test during manufacturing and prevented from reaching the final product. Back-side power delivery and 3D packaging will make silicon bring-up, debug, and diagnosis an even bigger challenge. Silicon Lifecycle Management will require high-quality test not only during manufacturing but also periodically in-system. Periodic in-system test, combined with readouts from PVT sensors and slack monitors, will be needed to monitor the aging of silicon and enable preventive maintenance. With wider adoption of in-system deterministic test, hardware security and protection of the DFT infrastructure from malicious attacks will need to be addressed as well. None of these challenges is insurmountable. These are all opportunities for the test community to come up with the appropriate future solutions [5].
Q: If you could change one thing about how the industry approaches DFT, what would it be and why?
A: Let me answer this question by commenting on what works well and why these are good approaches to follow. Usually, the leaders of the semiconductor industry who approach DFT in an efficient and productive way have very knowledgeable DFT teams led by DFT experts and visionaries who pay attention to long-term trends. They know when to adopt new tools or change flows. They do it before they must. They look for ways to improve their processes, knowing that the productivity gains will compensate for the upfront effort. Any preparations needed to accommodate the requirements of a new technology node, advanced packaging, design complexity, or a silicon lifecycle management paradigm take time. Proactive thinking in these situations is always better than a more expensive and less reliable reactive attitude. The semiconductor DFT leaders often engage in technology partnerships with test automation leaders, like the Tessent division of Siemens, to solve new emerging problems. These partnerships involve sharing and aligning technology roadmaps, defining the requirements for the solutions, and validation in silicon. The semiconductor company benefits by having early access to the technology, future enhancements prioritized in the roadmap, and a fully automated flow.
References:
[1] J. Rajski et al., “Embedded Deterministic Test”, IEEE TCAD 2004.
[2] F. Hapke et al., “Cell-Aware Test”, IEEE TCAD 2014.
[3] J.F. Cote et al., “Streaming Scan Network (SSN): An Efficient Packetized Data Network for Testing of Complex SoCs”, ITC 2020.
[4] D. Trock et al., “Deterministic In-Fleet Scan Test for a Cloud Computing Platform”, ITC 2024.
[5] J. Rajski et al., “The Future of Design for Test and Silicon Lifecycle Management”, IEEE Design & Test, 2024.