This study investigates an array of machine learning algorithms to identify the most proficient one for accurate IGBT lifetime prediction. Traditionally, research has relied on single sensor signals to reduce complexity and cost at the expense of accuracy, but the availability of low-cost, powerful computational resources in modern power electronics systems now enables data-driven multi-sensor monitoring, enhancing the protection of critical infrastructure, including hydrogen infrastructure. Both single-variate and multivariate machine learning models are examined in a comprehensive comparative performance analysis. The paper further details the development of an experimental setup that uses an NVIDIA Jetson Nano GPU for real-time prediction. Initial results from this small-scale test setup serve as the foundation for future data acquisition from a newly built larger testbed (a 100 kW, 1200 V, 400 A three-phase inverter) in the complete paper.
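To illustrate the single-variate versus multivariate comparison described above, the following is a minimal sketch, not the paper's actual method or data. It fits ordinary least squares models to synthetic IGBT degradation data, where the sensor names (collector-emitter voltage drop, case temperature, leakage-current proxy) and the remaining-useful-life target are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical sensor signals (assumed for this sketch; real feature
# choices would come from the actual multi-sensor test setup).
v_ce = rng.normal(2.0, 0.1, n)     # collector-emitter voltage drop [V]
t_case = rng.normal(80.0, 5.0, n)  # case temperature [deg C]
i_leak = rng.normal(1.0, 0.2, n)   # leakage-current proxy [mA]

# Synthetic remaining-useful-life target driven by all three signals.
rul = 1000 - 120 * v_ce - 4.0 * t_case - 50 * i_leak + rng.normal(0, 5, n)

def fit_mse(X, y):
    """Fit ordinary least squares with intercept; return training MSE."""
    A = np.column_stack([X, np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return float(np.mean(resid ** 2))

# Single-variate model sees only one sensor; multivariate sees all three.
mse_single = fit_mse(v_ce[:, None], rul)
mse_multi = fit_mse(np.column_stack([v_ce, t_case, i_leak]), rul)
```

On this synthetic data the multivariate model attains a much lower error, since the single-sensor model cannot account for variation driven by the other two signals; this is the trade-off the comparative analysis quantifies with real sensor data and a broader set of algorithms.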