Abstract: The proliferation of Internet of Things (IoT) devices has produced massive volumes of time-series data, creating both opportunities and challenges for analysis in highly dynamic, resource-constrained environments. This study investigates deep learning (DL) methods for unsupervised anomaly detection and clustering in IoT time-series data. The performance of network analysis systems is severely hampered by noise, high dimensionality, and irregular sampling, all of which must be addressed. Noise can obscure subtle anomalies and inflate false positive rates, while irregular sampling breaks temporal coherence, making it difficult to identify patterns consistently over time. High dimensionality degrades clustering accuracy and model interpretability because it raises computational complexity and can dilute important signals. Together, these challenges complicate the real-time deployment of DL models, particularly on resource-constrained edge devices. To address them, this work proposes a systematic strategy that combines algorithm development, theoretical modeling, and practical validation. The main objective is to advance time-series analysis for IoT, enabling more accurate and efficient anomaly detection, predictive maintenance, and monitoring. By bridging the gap between theoretical innovation and real-world application, the results of this study have the potential to substantially improve the intelligence and dependability of IoT systems across a variety of domains.

Index Terms: Long Short-Term Memory (LSTM), network analysis, anomaly detection, clustering, deep learning, time-series data, Internet of Things (IoT), high-dimensional data.


DOI: 10.17148/IJARCCE.2025.14465
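To make the abstract's framing concrete, the sketch below shows one common realization of LSTM-based unsupervised anomaly detection on time-series windows: an LSTM autoencoder trained to reconstruct presumed-normal traffic, with high reconstruction error flagged as anomalous. This is a minimal illustration, not the paper's actual architecture; the window length, hidden size, PyTorch framework, and thresholding rule are all assumptions for the example.

```python
# Illustrative sketch (assumed details, not the paper's exact method):
# an LSTM autoencoder that scores time-series windows by reconstruction error.
import torch
import torch.nn as nn

class LSTMAutoencoder(nn.Module):
    def __init__(self, n_features: int, hidden: int = 32):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_features)

    def forward(self, x):                      # x: (batch, time, features)
        _, (h, _) = self.encoder(x)            # h: (layers, batch, hidden)
        # Repeat the final hidden state across the time axis as decoder input.
        z = h[-1].unsqueeze(1).repeat(1, x.size(1), 1)
        dec, _ = self.decoder(z)
        return self.out(dec)                   # reconstruction of x

def anomaly_scores(model, windows):
    """Per-window mean squared reconstruction error; higher = more anomalous."""
    model.eval()
    with torch.no_grad():
        recon = model(windows)
        return ((windows - recon) ** 2).mean(dim=(1, 2))

# Usage: after training on presumed-normal data, threshold the scores.
model = LSTMAutoencoder(n_features=4)
batch = torch.randn(8, 50, 4)                  # 8 windows of 50 timesteps
scores = anomaly_scores(model, batch)
flags = scores > scores.mean() + 3 * scores.std()   # illustrative threshold
```

In practice the threshold would be calibrated on held-out normal data (for example, a high percentile of training reconstruction errors) rather than the ad hoc three-sigma rule used here for brevity.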
