Publication: State-of-charge estimation for lithium-ion batteries with optimized self-supervised transformer deep learning model
Date
2022-06
Authors
Dickson Neoh Tze How, Dr.
Abstract
State-of-charge (SOC) is a quantity that reflects the amount of available energy left in lithium-ion (Li-ion) cells. Accurate SOC estimation allows the charging-discharging schedule and usage time to be optimized, and it facilitates the computation of other quantities needed to prolong battery lifespan and protect users from hazards. However, the SOC is not an observable quantity and cannot be practically measured outside the laboratory. Additionally, the SOC behaves in a highly nonlinear way and is a function of factors that are difficult to quantify, such as cell condition, ambient temperature, manufacturing inconsistency, and cell chemistry. These issues can be addressed with machine learning methods such as deep learning (DL), owing to their capacity to model highly nonlinear phenomena, their strong generalization capability, and their fast run-time. Nevertheless, the performance of DL models varies heavily with data type, model architecture, hyperparameter selection, training framework, and so forth.

To overcome these limitations, this study proposes a novel DL architecture for SOC estimation using the Transformer model within a self-supervised learning (SSL) framework. The SSL framework trains the Transformer in two stages: in the first stage, the model is pre-trained on unlabeled data with unsupervised learning; in the second stage, the model is fine-tuned (re-trained) on labeled data with supervised learning.

To evaluate its effectiveness, the Transformer model is benchmarked against commonly used DL architectures such as Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), Convolutional Neural Network (CNN), and Deep Neural Network (DNN) models. The evaluation is carried out on multiple electric vehicle (EV) drive cycles with different Li-ion cell chemistries at varying ambient temperatures. Experimental results show that the Transformer model achieves a root mean squared error (RMSE) ≤ 1.7% and a mean absolute error (MAE) ≤ 1% on all datasets while maintaining a relatively low computational cost, with a model size of approximately 2 MB. The SSL framework also reduces the need for labeled data during training and significantly decreases training time. In addition, it makes transfer learning (TL) possible, whereby the weights of a model trained on one cell chemistry can be transferred to a model running on a different cell chemistry. In this study, the weights of a model trained on an NMC cell are transferred to an NCA cell and vice versa. With TL, the model scores an RMSE ≈ 2.0% or lower after only five training epochs (approximately 30 minutes on an RTX 3090 GPU), outperforming models trained from scratch using supervised learning.

To select the optimal hyperparameters for the Transformer model, Tree-structured Parzen Estimator (TPE) optimization is combined with the Hyperband pruning algorithm to search for the configuration that yields the lowest RMSE and MAE. The outcome is an optimized Transformer model that scores the lowest error, RMSE ≈ 1.12%. The optimized model also outperforms all other state-of-the-art DL models in error metrics, adaptability, and robustness under diverse operating conditions. This study concludes that the proposed optimized Transformer model has great potential to be incorporated into Li-ion energy storage systems to estimate the SOC with very low estimation error and broad applicability across cell types.
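To make the two-stage SSL scheme concrete, the following is a minimal PyTorch sketch, not the author's released code: the masked-reconstruction pretext task, the module names, and the tensor shapes are illustrative assumptions.

```python
# Minimal sketch of the two-stage SSL scheme described in the abstract.
# The masked-reconstruction pretext task and all names/shapes are
# illustrative assumptions, not the thesis code.
import torch
import torch.nn as nn

class SOCTransformer(nn.Module):
    """Transformer encoder over (batch, seq_len, n_features) windows."""
    def __init__(self, n_features=3, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.proj = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.recon_head = nn.Linear(d_model, n_features)  # stage 1: reconstruction
        self.soc_head = nn.Linear(d_model, 1)             # stage 2: SOC regression

    def forward(self, x, pretrain=False):
        h = self.encoder(self.proj(x))
        return self.recon_head(h) if pretrain else self.soc_head(h[:, -1])

def pretrain_step(model, x, mask_ratio=0.15):
    """Stage 1: mask random timesteps of unlabeled data and reconstruct them."""
    mask = torch.rand(x.shape[:2], device=x.device) < mask_ratio
    x_masked = x.masked_fill(mask.unsqueeze(-1), 0.0)
    recon = model(x_masked, pretrain=True)
    return nn.functional.mse_loss(recon[mask], x[mask])

def finetune_step(model, x, soc):
    """Stage 2: supervised regression against labeled SOC targets."""
    return nn.functional.mse_loss(model(x).squeeze(-1), soc)
```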
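For reference, the RMSE and MAE figures quoted above follow the standard definitions, with $SOC_k$ the measured state-of-charge, $\widehat{SOC}_k$ the model estimate, and $N$ the number of samples:

$$\mathrm{RMSE}=\sqrt{\frac{1}{N}\sum_{k=1}^{N}\bigl(\widehat{SOC}_k - SOC_k\bigr)^2},\qquad \mathrm{MAE}=\frac{1}{N}\sum_{k=1}^{N}\bigl|\widehat{SOC}_k - SOC_k\bigr|$$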
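The chemistry-to-chemistry transfer described above amounts to reusing a trained model's weights as the starting point for fine-tuning. A hedged sketch follows, reusing the hypothetical SOCTransformer module from the previous sketch; the file name is a placeholder.

```python
import torch

# Assume nmc_model is a SOCTransformer already trained on NMC drive-cycle data.
torch.save(nmc_model.state_dict(), "nmc_weights.pt")   # placeholder file name

nca_model = SOCTransformer()                  # same architecture, new chemistry
nca_model.load_state_dict(torch.load("nmc_weights.pt"))  # transfer the weights
# Fine-tune nca_model on labeled NCA data for a few epochs
# (the study reports ~5 epochs to reach RMSE ≈ 2.0% or lower).
```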
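TPE sampling and Hyperband pruning are both available in the Optuna library; the abstract does not name the tooling used, so the Optuna-based sketch below is an assumption, and the search space and the run_one_epoch helper are hypothetical.

```python
import optuna

def objective(trial):
    # Illustrative search space; the thesis does not publish its exact ranges.
    d_model = trial.suggest_categorical("d_model", [32, 64, 128])
    n_layers = trial.suggest_int("n_layers", 1, 4)
    lr = trial.suggest_float("lr", 1e-5, 1e-2, log=True)

    rmse = float("inf")
    for epoch in range(20):
        # run_one_epoch is a hypothetical helper that trains one epoch with
        # these hyperparameters and returns the validation RMSE.
        rmse = run_one_epoch(d_model, n_layers, lr, epoch)
        trial.report(rmse, epoch)        # expose intermediate scores to Hyperband
        if trial.should_prune():         # stop unpromising trials early
            raise optuna.TrialPruned()
    return rmse

study = optuna.create_study(
    direction="minimize",                         # minimize validation RMSE
    sampler=optuna.samplers.TPESampler(),         # Tree-structured Parzen Estimator
    pruner=optuna.pruners.HyperbandPruner(),      # Hyperband pruning
)
study.optimize(objective, n_trials=100)
```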
All models in this study were implemented with the open-source PyTorch and tsai deep learning packages on an Ubuntu 20.04 LTS machine with an RTX 3090 graphics processing unit.
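As a hedged sketch of what such a tsai-based pipeline could look like (the study's exact model class and data pipeline are not given here, so the TST architecture and the arrays X, y, and splits are placeholders):

```python
from tsai.all import TST, TSDatasets, TSDataLoaders, TSRegression, Learner, rmse

# Placeholders: X has shape (n_samples, n_features, seq_len) holding windows of
# e.g. voltage/current/temperature; y holds (n_samples,) SOC targets;
# splits indexes the train/validation partition.
dsets = TSDatasets(X, y, tfms=[None, TSRegression()], splits=splits)
dls = TSDataLoaders.from_dsets(dsets.train, dsets.valid, bs=128)

model = TST(c_in=X.shape[1], c_out=1, seq_len=X.shape[2])  # time-series Transformer
learn = Learner(dls, model, metrics=rmse)
learn.fit_one_cycle(25, lr_max=1e-3)
```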
Keywords
State-of-charge estimation for lithium-ion batteries