Search results for Anna Safont-Andreu (1–5 of 5)
Proceedings Papers
ISTFA 2023: Conference Proceedings from the 49th International Symposium for Testing and Failure Analysis, November 12–16, 2023, pp. 16–22
Abstract
During the activity in a Failure Analysis (FA) laboratory, all findings and conclusions are recorded in a series of documents known as FA reports. Their primary purpose is to inform the requestor about the analysis results, but they also provide information that helps solve similar cases. These documents therefore play a key role in preserving the knowledge acquired by engineers, since they remain available for consultation in future work. The information systems used in FA consist of databases, file shares, wikis, and other human-readable forms. However, the heterogeneity of these systems and the large number of independent documents make manual consultation inefficient. In this context, this paper proposes an application of Natural Language Processing (NLP) known as Named Entity Recognition (NER): the AI-based detection of key concepts in textual data in the form of annotations. These annotations can then be used to boost search systems or other AI models.
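A minimal sketch of NER-style annotation of FA text is shown below, using spaCy's rule-based EntityRuler. The labels (METHOD, TOOL, FAILURE) and the example terms are hypothetical placeholders; the paper describes a learned, AI-based NER model trained on annotated FA reports rather than hand-written rules.

```python
# Minimal rule-based entity annotation sketch for FA reports using spaCy's
# EntityRuler. Labels and terms are hypothetical placeholders, not the paper's.
import spacy

nlp = spacy.blank("en")                       # empty English pipeline
ruler = nlp.add_pipe("entity_ruler")          # rule-based entity annotator

# Hypothetical FA vocabulary; a trained NER model would learn such spans from data.
ruler.add_patterns([
    {"label": "METHOD",  "pattern": "focused ion beam"},
    {"label": "TOOL",    "pattern": "scanning electron microscope"},
    {"label": "FAILURE", "pattern": "gate oxide breakdown"},
])

doc = nlp("The focused ion beam cross-section confirmed gate oxide breakdown.")
for ent in doc.ents:
    print(ent.text, ent.label_)               # annotated spans usable by search or ML
```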
Journal Articles
EDFA Technical Articles (2023) 25 (2): 16–28.
Published: 01 May 2023
Abstract
This article provides a systematic overview of knowledge-based and machine-learning AI methods and their potential for use in automated testing, defect identification, fault prediction, root cause analysis, and equipment scheduling. It also discusses the role of decision-making rules, image annotations, and ontologies in automated workflows, data sharing, and interoperability.
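As a small illustration of the kind of decision-making rule mentioned above, the sketch below maps an observed electrical signature to a suggested next analysis step in an automated FA workflow. The signatures and follow-up techniques are illustrative placeholders only, not content from the article.

```python
# Hypothetical decision-making rules: electrical signature -> next FA step.
NEXT_STEP_RULES = {
    "high_leakage_current": "photon emission microscopy",
    "open_circuit": "time-domain reflectometry",
    "short_circuit": "lock-in thermography",
}

def next_analysis_step(signature: str) -> str:
    """Return the suggested follow-up technique for an electrical signature."""
    return NEXT_STEP_RULES.get(signature, "manual review by an FA engineer")

print(next_analysis_step("open_circuit"))     # -> time-domain reflectometry
```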
Proceedings Papers
ISTFA 2022: Conference Proceedings from the 48th International Symposium for Testing and Failure Analysis, October 30–November 3, 2022, pp. 28–35
Abstract
Failure Analysis (FA) is a complex activity that requires careful and complete documentation of all findings and conclusions in order to preserve the knowledge acquired by engineers in this process. Modern FA systems store this data in text or image formats and organize it in databases, file shares, wikis, or other human-readable forms. Given the large volume of generated FA data, navigating it or searching for particular information is difficult, since machines cannot process the stored knowledge automatically and require extensive interaction with experts. In this paper, we investigate applications of modern Natural Language Processing (NLP) approaches to the classification of FA texts with respect to the electrical and/or physical failures they describe. In particular, we study the efficiency of pretrained Language Models (LMs) in the semiconductor domain for text classification with deep neural networks. Evaluation results show that the vocabulary of these LMs is not well suited to FA applications, and that the best classification accuracy, approximately 60% for physical failures and 70% for electrical failures, can only be reached with fine-tuning techniques.
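The sketch below shows, under stated assumptions, what fine-tuning a pretrained language model for binary FA text classification can look like with Hugging Face Transformers. The model name "bert-base-uncased", the toy snippets, and the labels are placeholders; the paper evaluates domain-specific LMs and a labeled FA dataset not reproduced here.

```python
# Minimal fine-tuning sketch: pretrained LM -> binary FA text classifier.
# Model and data are placeholders, not the paper's setup.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

texts = ["Delamination observed at the die attach interface.",
         "Device passed all electrical retests after bake."]
labels = torch.tensor([1, 0])                 # 1 = describes a physical failure

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):                            # a few fine-tuning steps on the toy batch
    out = model(**batch, labels=labels)       # cross-entropy loss over the 2 classes
    out.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

model.eval()
pred = model(**batch).logits.argmax(dim=-1)   # predicted class per report snippet
print(pred.tolist())
```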
Proceedings Papers
ISTFA 2021: Conference Proceedings from the 47th International Symposium for Testing and Failure Analysis, October 31–November 4, 2021, pp. 1–5
Abstract
In their daily work, engineers in semiconductor Failure Analysis (FA) laboratories generate numerous documents recording the tasks, findings, and conclusions related to every device they handle. This data stores valuable knowledge for the laboratory that other experts can consult, but because it takes the form of a collection of documents tied to particular devices and their processing history, finding answers to specific questions is difficult, if not practically impossible. This paper therefore proposes a Natural Language Processing (NLP) solution to make the gathering of FA knowledge from numerous documents more efficient. It explains how the authors generated a dataset of FA reports, along with corresponding electrical signatures and physical failures, in order to train different machine-learning algorithms and compare their performance. Three of the most common classification algorithms were used in the study: K-Nearest Neighbors (kNN), Support Vector Machines (SVM), and Deep Neural Networks (DNN). All of the classification models were able to capture patterns associated with different types of failures and predict their causes. The outcomes were best with the SVM classifier, and all classifiers performed slightly better on physical faults. The reasons are discussed in the paper, which also provides suggestions for future work.
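A minimal sketch of this kind of classifier comparison is shown below, using TF-IDF features with scikit-learn. The toy reports, labels, and the choice of TF-IDF features are assumptions for illustration; the paper trains kNN, SVM, and DNN models on a real labeled dataset of FA reports.

```python
# Minimal classifier-comparison sketch (kNN vs. SVM) on toy FA report snippets.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

reports = [
    "Short between metal lines found after laser scanning.",
    "No visual defect; leakage current above specification.",
    "Cracked passivation layer observed near the bond pad.",
    "Open via detected by electrical probing of the daisy chain.",
]
labels = ["physical", "electrical", "physical", "electrical"]   # toy labels

models = {
    "kNN": make_pipeline(TfidfVectorizer(), KNeighborsClassifier(n_neighbors=1)),
    "SVM": make_pipeline(TfidfVectorizer(), SVC(kernel="linear")),
}

for name, model in models.items():
    model.fit(reports, labels)
    pred = model.predict(["Delaminated die attach seen in cross-section."])
    print(name, "->", pred[0])                # predicted failure category
```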
Proceedings Papers
ISTFA 2021: Conference Proceedings from the 47th International Symposium for Testing and Failure Analysis, October 31–November 4, 2021, pp. 23–28
Abstract
Fault analysis is a complex task that requires engineers to perform various analyses to detect and localize physical defects in semiconductor devices. The process is knowledge intensive and must be precisely documented. To ensure unambiguous documentation, engineers must agree on a clearly defined terminology specifying, among other things, methods, tools, physical faults, and their electrical signatures, and this terminology must be stored in a way that is usable by both engineers and software. One possible solution to this challenge is to formalize domain knowledge as an ontology, a knowledge base designed to store terminological definitions. This paper discusses the development of an ontology for electronic device failure analysis that uses a logic-based representation. The latter ensures that terms are interpreted the same way by engineers and software systems, facilitating the automation of tasks such as text classification, information retrieval, and workflow verification.
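The sketch below illustrates, with the owlready2 library, what a small logic-based FA ontology fragment can look like. The IRI, class names, and the "indicates" property are illustrative assumptions, not the ontology developed in the paper.

```python
# Minimal FA ontology sketch with owlready2; names and IRI are placeholders.
from owlready2 import Thing, ObjectProperty, get_ontology

onto = get_ontology("http://example.org/fa-ontology.owl")   # hypothetical IRI

with onto:
    class AnalysisMethod(Thing): pass          # e.g. optical or FIB techniques
    class PhysicalFault(Thing): pass           # defects observed in the device
    class ElectricalSignature(Thing): pass     # measured electrical anomalies
    class indicates(ObjectProperty):           # signature -> possible fault
        domain = [ElectricalSignature]
        range = [PhysicalFault]

# Individuals: engineers and software query the same shared definitions.
leakage = onto.ElectricalSignature("high_leakage_current")
breakdown = onto.PhysicalFault("gate_oxide_breakdown")
leakage.indicates = [breakdown]

print([f.name for f in leakage.indicates])     # -> ['gate_oxide_breakdown']
```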