TY - GEN
T1 - In-Situ Measurement of Extrusion Width for Fused Filament Fabrication Process Using Vision and Machine Learning Models
AU - Shabani, Arya
AU - Martinez-Hernandez, Uriel
PY - 2023/10/5
Y1 - 2023/10/5
N2 - Measuring the geometry of the printed road is key to detecting anomalies in 3D printing processes. Although commercial 3D printers can measure extrusion height using various distance sensors, measuring the width in real time remains a challenge. This paper presents a visual in-situ monitoring system to measure the width of the printed filament road in 2D patterns. The proposed system is composed of a printable shroud with an embedded camera setup and a visual detection approach based on a two-stage instance segmentation method. Each of the segmentation and localization stages can use multiple computational approaches, including a Gaussian mixture model, a color filter, and deep neural network models. The visual monitoring system is mounted on a standard 3D printer and validated by measuring printed filament roads of sub-millimeter widths. The results on accuracy and robustness reveal that combinations of deep models for both the segmentation and localization stages perform better. In particular, a fully connected CNN segmentation model combined with a YOLO object detector can measure sub-millimeter extrusion width with 90 μm accuracy at 125 ms speed. This visual monitoring system has the potential to improve the control of printing processes through real-time measurement of printed filament geometry.
AB - Measuring the geometry of the printed road is key to detecting anomalies in 3D printing processes. Although commercial 3D printers can measure extrusion height using various distance sensors, measuring the width in real time remains a challenge. This paper presents a visual in-situ monitoring system to measure the width of the printed filament road in 2D patterns. The proposed system is composed of a printable shroud with an embedded camera setup and a visual detection approach based on a two-stage instance segmentation method. Each of the segmentation and localization stages can use multiple computational approaches, including a Gaussian mixture model, a color filter, and deep neural network models. The visual monitoring system is mounted on a standard 3D printer and validated by measuring printed filament roads of sub-millimeter widths. The results on accuracy and robustness reveal that combinations of deep models for both the segmentation and localization stages perform better. In particular, a fully connected CNN segmentation model combined with a YOLO object detector can measure sub-millimeter extrusion width with 90 μm accuracy at 125 ms speed. This visual monitoring system has the potential to improve the control of printing processes through real-time measurement of printed filament geometry.
KW - Additive manufacturing
KW - Computer vision
KW - Instance segmentation
KW - Machine learning
UR - http://www.scopus.com/inward/record.url?scp=85182526664&partnerID=8YFLogxK
U2 - 10.1109/IROS55552.2023.10341406
DO - 10.1109/IROS55552.2023.10341406
M3 - Chapter in a published conference proceeding
AN - SCOPUS:85182526664
T3 - IEEE International Conference on Intelligent Robots and Systems
SP - 8298
EP - 8303
BT - 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2023
PB - IEEE
CY - U. S. A.
T2 - 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2023
Y2 - 1 October 2023 through 5 October 2023
ER -