Multimodal Data Fusion and Real-Time Monitoring in Advanced Manufacturing

The field of advanced manufacturing is witnessing a significant shift towards the integration of multimodal data fusion and real-time monitoring techniques. This trend is driven by the need for more efficient, cost-effective, and accurate quality control in industrial processes. Recent developments highlight the use of thermal sensor data for real-time pore detection in metal 3D printing, offering a viable alternative to traditional, resource-intensive methods such as CT scanning. In parallel, unsupervised learning approaches are being used to fuse multimodal sensor data, enabling enhanced process monitoring and anomaly detection without relying on labeled datasets.
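To make the real-time pore-detection idea concrete, the sketch below flags candidate pore locations wherever a thermal energy density (TED) signal deviates strongly from its layer-wise statistics. This is a minimal illustration of thresholding an in-process thermal signal, not the criterion used in the cited LPBF paper; the function name and the z-score rule are assumptions for illustration.

```python
import numpy as np

def detect_pores_from_ted(ted_signals, z_thresh=3.0):
    """Flag candidate pore locations where the thermal energy density (TED)
    deviates strongly from the mean (hypothetical z-score criterion).

    Returns the indices of windows whose |z-score| exceeds z_thresh.
    """
    ted = np.asarray(ted_signals, dtype=float)
    mu, sigma = ted.mean(), ted.std()
    z = (ted - mu) / (sigma + 1e-12)  # avoid division by zero on flat signals
    return np.flatnonzero(np.abs(z) > z_thresh)
```

Because the check is a single pass over the signal, it can run alongside the build, which is what makes this family of methods attractive compared with post-hoc CT scanning.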

Another notable advancement is the application of joint embedding frameworks that leverage multimodal data to learn transferable semantic representations, reducing hardware requirements and computational overhead. These frameworks are particularly effective in metal additive manufacturing, where they improve downstream tasks such as melt pool geometry prediction.
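The core mechanism behind such joint embedding (and the contrastive fusion mentioned above) is aligning paired embeddings from two modalities in a shared latent space. The following is a minimal NumPy sketch of a symmetric InfoNCE-style alignment loss; it illustrates the general contrastive objective, not the specific loss of the JEMA framework, and all names here are assumptions.

```python
import numpy as np

def _nce_direction(logits):
    """Cross-entropy of each row against its diagonal (matched-pair) entry."""
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

def info_nce_loss(z_a, z_b, temperature=0.1):
    """Symmetric InfoNCE loss aligning paired embeddings from two modalities.

    Row i of z_a and row i of z_b are embeddings of the same process window
    (e.g. a thermal image and an acoustic segment); other rows act as negatives.
    """
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    sim = z_a @ z_b.T / temperature  # pairwise cosine similarities
    return 0.5 * (_nce_direction(sim) + _nce_direction(sim.T))
```

Minimizing this loss pulls matched cross-modal pairs together and pushes mismatched pairs apart, which is what lets one modality's encoder substitute for another at inference time and thereby reduce hardware requirements.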

Deep learning models are also making significant strides in predictive quality assessment, particularly in processes like Ultrasonic Additive Manufacturing, where they achieve high accuracy in classifying and monitoring process conditions.
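The monitoring pipeline these models sit in follows a common shape: window the in-process sensor stream, extract features, and classify the process condition. The sketch below shows that pipeline with hand-crafted features and a nearest-centroid classifier standing in for the deep network; it is a simplified stand-in, not the UAM paper's model, and every name in it is hypothetical.

```python
import numpy as np

def extract_features(window):
    """Summary statistics from one in-process sensor window
    (a stand-in for learned deep-network features)."""
    w = np.asarray(window, dtype=float)
    return np.array([w.mean(), w.std(), np.abs(np.diff(w)).mean()])

class NearestCentroidMonitor:
    """Classifies process conditions by distance to per-class feature centroids."""

    def fit(self, windows, labels):
        feats = np.array([extract_features(w) for w in windows])
        labels = np.array(labels)
        self.classes_ = sorted(set(labels.tolist()))
        self.centroids_ = np.array(
            [feats[labels == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, windows):
        feats = np.array([extract_features(w) for w in windows])
        dists = np.linalg.norm(
            feats[:, None, :] - self.centroids_[None, :, :], axis=2)
        return [self.classes_[i] for i in dists.argmin(axis=1)]
```

In practice the feature extractor and classifier are replaced by a trained network, but the windowing-features-classification structure of the monitoring loop stays the same.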

Noteworthy Papers

  • High-Precision Real-Time Pores Detection in LPBF using Thermal Energy Density (TED) Signals: Introduces a real-time pore detection method using thermal sensor data, offering a more efficient, cost-effective alternative for quality control during the LPBF process.
  • Unsupervised Multimodal Fusion of In-process Sensor Data for Advanced Manufacturing Process Monitoring: Presents a novel approach to multimodal sensor data fusion in manufacturing processes, leveraging contrastive learning techniques to correlate different data modalities without the need for labeled data.
  • JEMA: A Joint Embedding Framework for Scalable Co-Learning with Multimodal Alignment: Introduces a co-learning framework that leverages multimodal data to learn transferable semantic representations, reducing hardware requirements and computational overhead in metal additive manufacturing processes.
  • Advanced Predictive Quality Assessment for Ultrasonic Additive Manufacturing with Deep Learning Model: Develops a deep learning-based method for monitoring in-process quality in UAM, achieving high accuracy in classifying and monitoring process conditions.

Sources

High-Precision Real-Time Pores Detection in LPBF using Thermal Energy Density (TED) Signals

A Survey on RGB, 3D, and Multimodal Approaches for Unsupervised Industrial Anomaly Detection

Unsupervised Multimodal Fusion of In-process Sensor Data for Advanced Manufacturing Process Monitoring

JEMA: A Joint Embedding Framework for Scalable Co-Learning with Multimodal Alignment

Advanced Predictive Quality Assessment for Ultrasonic Additive Manufacturing with Deep Learning Model
