Electronics, Vol. 13, Pages 1420: Comparative Analysis of Machine Learning Techniques for Non-Intrusive Load Monitoring

Electronics doi: 10.3390/electronics13081420

Authors: Noman Shabbir, Kristina Vassiljeva, Hossein Nourollahi Hokmabad, Oleksandr Husev, Eduard Petlenkov, Juri Belikov

Non-intrusive load monitoring (NILM) has emerged as a pivotal technology in energy management applications by enabling precise monitoring of individual appliance energy consumption without requiring intrusive sensors or per-appliance smart meters. In this technique, load disaggregation for individual devices is achieved by recognizing their current signatures using machine learning (ML) methods. This research paper conducts a comprehensive comparative analysis of various ML techniques applied to NILM, aiming to identify the most effective methodologies for accurate load disaggregation. The study employs a diverse dataset comprising high-resolution electricity consumption data collected from an Estonian household. ML algorithms, including deep neural networks based on long short-term memory (LSTM) networks, extreme gradient boosting (XgBoost), logistic regression (LR), and dynamic time warping with K-nearest neighbors (DTW-KNN), are implemented and evaluated for their performance in load disaggregation. Key evaluation metrics, such as accuracy, precision, recall, and F1 score, are used to assess how effectively each technique captures the nuanced energy consumption patterns of diverse appliances. Results indicate that the XgBoost-based model demonstrates superior performance in accurately identifying and disaggregating individual loads from aggregated energy consumption data. Insights derived from this research contribute to the optimization of NILM techniques for real-world applications, facilitating enhanced energy efficiency and informed decision-making in smart grid environments.
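To make the DTW-KNN approach mentioned above concrete, the following is a minimal sketch of classifying an appliance from its current signature: a classic dynamic-time-warping distance over 1-D waveforms, combined with a K-nearest-neighbor vote. The appliance templates and signal values here are toy placeholders, not data from the paper's Estonian household dataset, and the paper's actual feature extraction and model configuration are not reproduced.

```python
def dtw_distance(a, b):
    """Dynamic-time-warping distance between two 1-D sequences,
    computed with the standard O(len(a)*len(b)) dynamic program."""
    n, m = len(a), len(b)
    inf = float("inf")
    # D[i][j] = minimal cumulative cost aligning a[:i] with b[:j]
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

def dtw_knn_classify(query, labelled_signatures, k=1):
    """K-nearest-neighbor vote over DTW distances to labelled templates."""
    ranked = sorted((dtw_distance(query, sig), label)
                    for sig, label in labelled_signatures)
    votes = {}
    for _, label in ranked[:k]:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

# Hypothetical current signatures (amperes per sample) for two appliances.
templates = [
    ([0, 5, 5, 5, 0], "kettle"),
    ([0, 1, 2, 1, 0], "lamp"),
]

# A new measurement resembling the kettle's draw pattern.
print(dtw_knn_classify([0, 4, 5, 4, 0], templates))  # -> kettle
```

In a full NILM pipeline, the query window would be extracted from the aggregate meter signal, and predictions would then be scored with accuracy, precision, recall, and F1, as the paper does.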
