
Article information

  • Title: Knowledge Distillation for Energy Consumption Prediction in Additive Manufacturing
  • Authors: Yixin Li; Fu Hu; Michael Ryan
  • Journal: IFAC-PapersOnLine
  • Print ISSN: 2405-8963
  • Year: 2022
  • Volume: 55
  • Issue: 2
  • Pages: 390-395
  • DOI: 10.1016/j.ifacol.2022.04.225
  • Language: English
  • Publisher: Elsevier
  • Abstract: Owing to advances in data sensing and collection technologies, more production data from additive manufacturing (AM) systems is available, and advanced data analytics techniques are increasingly employed to improve energy management. Current supervised learning-based analytical methods, however, typically require extracting and learning valuable information from a large amount of data during training, which makes it difficult to trade off latency against computing resources when deploying the analytical models. This paper therefore develops a method that uses the knowledge distillation (KD) technique to predict AM energy consumption from product geometry information, reducing the computational burden while retaining model performance. In a teacher-student architecture, layer-by-layer product images and energy consumption datasets are used to train a teacher model, from which knowledge is extracted and used to build a student model that predicts energy consumption. A case study using real-world data from a selective laser sintering (SLS) system demonstrates the feasibility and effectiveness of the proposed approach. Distilled and independently trained student models were compared in terms of root mean square error (RMSE) and training time. The distilled student model performed better (14.3947 kWh/kg) and required a shorter training time (34 s) than the complex teacher model.
  • Keywords: Additive manufacturing; Knowledge distillation; Energy consumption; Machine learning
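The teacher-student distillation described in the abstract can be sketched in a few lines: a larger teacher model is fitted first, and the student is then trained against a blend of the ground-truth labels and the teacher's predictions. The sketch below is a minimal illustration with synthetic regression data, not the authors' implementation; the feature matrix, the ridge/linear model forms, and the blending weight `alpha` are all assumptions (the paper itself trains on layer-by-layer SLS images).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for geometry features -> energy consumption
# (hypothetical data; the paper uses layer-by-layer product images).
X = rng.normal(size=(200, 8))
true_w = rng.normal(size=8)
y = X @ true_w + 0.1 * rng.normal(size=200)

# "Teacher": ridge regression on an expanded (quadratic) feature set,
# standing in for the larger, more expensive model.
def expand(features):
    return np.hstack([features, features ** 2])

Xt = expand(X)
w_teacher = np.linalg.solve(Xt.T @ Xt + 1e-3 * np.eye(Xt.shape[1]), Xt.T @ y)
teacher_pred = Xt @ w_teacher

# "Student": a plain linear model distilled on soft targets, i.e. a
# blend of ground truth and teacher predictions controlled by alpha.
alpha = 0.5  # assumed blending weight, not taken from the paper
soft_y = alpha * y + (1 - alpha) * teacher_pred
w_student = np.linalg.lstsq(X, soft_y, rcond=None)[0]

# Evaluate the cheap student model against the true labels.
rmse = float(np.sqrt(np.mean((X @ w_student - y) ** 2)))
print(f"distilled student RMSE: {rmse:.4f}")
```

The student uses only the raw features, so inference is cheaper than the teacher's expanded model, which mirrors the latency/compute trade-off the paper targets.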