Abstract
This work focuses on the use of linear regression-based machine learning to predict the end of discharge of a prismatic Li-ion cell. The cell temperature was recorded during cycling, and the relation between the discharge capacity and the peak cell temperature was used to develop the linear regression-based machine learning algorithm. The peak temperature was selected as the indicator of the battery end of discharge. A battery management system (BMS) built around a pyboard microcontroller was constructed to monitor the temperature of the cell under test and to control a MOSFET that acted as a switch to disconnect the cell from the circuit. The method used an initial 10 charge and discharge cycles at a rate of 1C as the training data and a subsequent charge and discharge cycle as the testing data. During the test cycle, the discharge was continued beyond the cutoff voltage to initiate an overdischarge while the temperature of the cell was continuously monitored. When the cell temperature exceeded the predetermined threshold, the pyboard triggered the MOSFET to disconnect the cell and stop the overdischarge. The experiment was performed on three different cells, and the overdischarge of each was halted within 0.1 V of the cutoff voltage. These results show that a linear regression-based analysis can be implemented to detect an overdischarge condition of a cell based on the anticipated peak temperature during discharge.
1 Introduction
Numerous technological advancements in recent years have led to the development of lithium-ion batteries (LIBs) into efficient and dependable sources of energy for many different applications. These applications range from small systems that do not require large amounts of power, such as mobile phones and portable computers, to large systems that have traditionally used combustible fuels as their source of energy, such as electric vehicles (EVs) and airplanes [1–3]. Because of their increasing popularity, maintaining the safe operation of LIBs has become a major focus for governmental agencies, professional societies, and industrial companies. Multiple testing standards have been developed to ensure the safe operation of LIBs is prioritized during transport, when operating under different environmental conditions, and when experiencing different types of abuse [4–7]. One of the main concerns for these standards is the onset of thermal runaway, which can lead to cell degradation from increased operating temperatures, combustion, and even explosion in the most extreme cases [8,9]. These events become even more likely as cells age from repeated cycling, causing deposition of lithium on the anode and lowering the initiation temperature of thermal runaway [10]. There are many different methods of initiating thermal runaway, and systems that operate in abusive environments are especially at risk due to their increased physical hazards.
One of the more common ways that thermal runaway can occur is through overdischarging of an individual cell or battery pack, which can occur regardless of the operational environment. Overdischarging of cells can lead to capacity loss, structural degradation of the electrode materials, and a failure to continue operating after extreme overdischarges [11,12]. Overdischarging also increases the likelihood of an internal short circuit, and the depth of overdischarge of a cell is positively related to the probability and severity of internal short circuits [13]. The configuration of cells in a battery pack can also increase the likelihood of overdischarge, as cells that are connected in series can be more prone to overdischarge [14]. Overdischarge of LIBs leads to the release of H2 from the cathode material of some cells, which is flammable and can worsen thermal runaway-caused combustion and explosion [15]. While there are numerous mechanisms of thermal runaway, overdischarge is a common cause in many different types of environments and is the primary focus of this study.
Thermal runaway becomes more likely and more catastrophic as the cell ages due to the different mechanisms of degradation mentioned earlier [16]. To study these effects further, recent research has focused on the use of data-driven modeling to predict the capacity degradation of LIBs [17]. The results from these models show that using data from a small number of cycles occurring at the beginning of a cell's life can be more accurate than using a larger number of cycles for predicting capacity degradation. This is because the data are extracted from high-rate discharge curves as opposed to actual capacity fade curves and also do not factor in any prior knowledge of degradation mechanisms due to cell chemistry [18]. While these methods have been accurate in predicting the degradation of cells, their results have seen little implementation in technologies that directly increase the safe operation of LIBs.
One type of technology that has been successful in improving the cycling performance of batteries is the battery management system (BMS). Some of the most basic systems monitor for conditions of overcharge, overdischarge, and overcurrent and will remove a cell from a circuit if any of these conditions are detected. More sophisticated BMS technologies currently use many different methods of monitoring and operation, with one of the most common being the balancing of cell loading in battery packs to promote uniform cycling profiles and cell aging [19,20]. Some battery management systems also use methods such as electrochemical impedance spectroscopy and incremental capacity analysis to estimate the capacity loss and degradation of cells [21,22].
Some current battery management systems also use temperature monitoring equipment to detect and characterize thermal issues. Most of these systems use the ambient temperature of the operational environment to estimate the temperature-dependent decay of cells used in EVs and other systems [23,24]. While this approach is helpful in improving the long-term performance of cells and packs, it does not provide direct protection from short-term issues that can be difficult to detect.
The method developed here combines data-driven modeling of cell behavior with existing technology to directly improve the safety of systems that use LIBs as their source of power. The BMS developed for these experiments used a minimal amount of past operational data to predict the operating temperature of a cell from its capacity and took automatic action to limit or prevent damage to a cell that had begun to experience an overdischarge condition. The main focus of the work was to test the effectiveness of using a limited amount of information (capacity and temperature) to assess the service condition of a cell (voltage). To establish the validity of this concept, the experiments maintained strict cycling conditions (constant ambient temperature, C-rate, point in cell life when the test occurs, etc.) in order to focus the research specifically on the relationship between cell capacity and the peak temperature used to detect an overdischarge condition. This method has the potential to provide greater protection for systems that use batteries as their source of energy, further enhancing the safety that has become critical to the advancement of many different types of technology.
2 Method
2.1 Equipment and Setup.
The experiment utilized commercially available Li-ion prismatic cells with LiCoO2 (LCO) cathodes and graphite anodes manufactured by Powerizer. Table 1 provides the specifications of the cells that were used.
Table 1. Specifications of the cells used in the experiments

Parameter | Value
Average lab-rated capacity (from test using 1C charge and discharge rate) | 140 mAh
Charge voltage | 4.2 V
Discharge cutoff voltage | 3.0 V
Nominal voltage | 3.7 V
Maximum charging current | 0.17 A
Cycle life | >500 cycles
The cells were cycled using an 8 Channel Battery Analyzer manufactured by AA Portable Power Corp., which recorded the voltage, current, and capacity of the cells during cycling. The external temperature of each cell was monitored using a Pt-100 resistance temperature detector (RTD) from Omega Engineering. During the training cycles performed for initial data collection, the RTD temperatures were recorded using an 8-Channel Temperature/Voltage Input USB Data Acquisition Module, also from Omega Engineering.
The cells used in the experiment were housed in a custom-designed holder, additively manufactured from polylactic acid, that secured the RTD to the middle of the bottom surface of the cell. This location was chosen for RTD placement because it has the most constrained heat dissipation, resulting in higher temperatures in that area when discharge rates are low [25]. Figure 1 shows the construction of the holder and the placement of a cell inside with an RTD secured to the bottom surface.
During the testing cycles, the BMS developed for this work was used to measure the cell temperatures. The same battery analyzer was used to cycle the cells, and a separate Pt-100 RTD connected to the BMS was used to monitor the temperatures. The programming and control for the BMS used a pyboard, a microcontroller board that runs MicroPython, a scaled-down implementation of the Python programming language. The pyboard was connected to a laptop during testing in order to monitor the temperature output of the RTD. A constant-voltage DC power supply was used during testing to supply the excitation current to the RTD for temperature measurement.
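As an illustration of this temperature-measurement step, the sketch below shows one way a Pt-100 reading could be converted to a temperature on the pyboard. It is a minimal sketch under assumed values, not the firmware used in this work: the ADC pin name, the 1 mA excitation current, and the read_temperature() helper are all assumptions.

```python
import pyb

adc = pyb.ADC('X1')   # ADC pin wired across the Pt-100 element (assumed wiring)
I_EXC = 1.0e-3        # assumed excitation current from the supply, in A
R0 = 100.0            # Pt-100 resistance at 0 degC, in ohm
ALPHA = 3.85e-3       # standard Pt-100 temperature coefficient, in 1/degC

def read_temperature():
    """Return the cell surface temperature in degC (linear RTD approximation)."""
    counts = adc.read()                 # 12-bit reading, 0-4095
    v_rtd = counts / 4095 * 3.3         # voltage across the RTD at a 3.3 V reference
    r_rtd = v_rtd / I_EXC               # RTD resistance from Ohm's law
    return (r_rtd / R0 - 1.0) / ALPHA   # invert R(T) = R0 * (1 + ALPHA * T)
```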
2.2 Procedure.
Over the lifetime of a cell, aging occurs due to morphological changes in the electrodes, electrolyte, current collectors, and separator. The capacity fade associated with these processes is initially linear, but it becomes more significant and causes an exponential decrease in capacity near the end of cell life, commonly defined as 70% of rated capacity [29]. At the beginning of cell life, the initial formation of the solid-electrolyte interface (SEI) also contributes to cell aging. After the initial SEI layer forms, the aging contribution from this mechanism diminishes to only that needed to maintain a relatively constant thickness [30]. Based on our observations, when the capacity change is limited during the initial cycles of a cell's life due to the lower rate of these aging contributions, the fitting result of Eq. (2) is close to a linear function. Thus, the relationship between battery capacity and maximum temperature increase can be simplified to a linear relation. Under this condition, a simple and effective machine learning approach, linear regression, can be used to predict the maximum battery temperature during regular LIB cycling early in the life of a cell.
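In symbols (the notation here is assumed for this excerpt and is not the paper's Eq. (2)), the simplification amounts to modeling the peak temperature rise of a discharge cycle as a linear function of its discharge capacity,

$$ \Delta T_{\max} \approx a\,Q_{\mathrm{dis}} + b $$

where Q_dis is the discharge capacity of the cycle and a and b are coefficients fitted by least squares over the training cycles.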
Linear regression is a powerful tool for modeling the relationship between two variables when there is a linear dependency between them. It has been used in the modeling of LIB aging rate [31] and open circuit voltage (OCV) [32]. In this work, we extend the use of the linear regression model to the prediction of battery temperatures. Since the temperature increase during battery cycling is related to the discharge capacity, it is possible to use temperature as an indicator of the state of charge (SOC) of a battery. During LIB cycling, it was observed that the battery temperature suddenly increases when approaching the end of discharge [27], which makes it possible to use a temperature measurement to detect and prevent overdischarge events. Overdischarge is a common cause of battery failure [33] and typically happens when a voltage monitoring system fails to reflect the actual voltage of a LIB. Overdischarge tests are also among the standard safety tests included in many LIB testing standards [5–7]. Due to the resistance of the battery terminals, the connection wires between a BMS and a cell, and similar factors, it is difficult to accurately measure the voltage of a cell. It can therefore be desirable to use temperature as an indicator of the voltage and SOC of a cell, as heat generation in LIBs is related to the discharge capacity, which is not influenced by the resistance of the external circuit. Since temperature measuring units are a common component of many battery management systems, this new approach can be compatible with most existing commercial BMS units.
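A minimal sketch of this regression step is given below, assuming NumPy and placeholder training data; the capacity and temperature values shown are illustrative, not the measured data.

```python
import numpy as np

# Placeholder training data from 10 training cycles (illustrative values only):
# discharge capacity in mAh and the corresponding peak temperature in degC.
capacity = np.array([141.2, 140.8, 140.5, 140.1, 139.9,
                     139.7, 139.6, 139.4, 139.3, 139.2])
peak_temp = np.array([33.9, 33.6, 33.4, 33.2, 33.1,
                      33.0, 32.9, 32.8, 32.8, 32.7])

# Least-squares fit of peak temperature against discharge capacity.
slope, intercept = np.polyfit(capacity, peak_temp, 1)

# Predict the peak temperature of the next discharge from its charge capacity,
# which closely approximates the discharge capacity because the Coulombic
# efficiency of commercial LIBs is typically above 99.5%.
charge_capacity = 139.0  # mAh, illustrative
predicted_peak = slope * charge_capacity + intercept
print(f"predicted peak temperature: {predicted_peak:.1f} degC")
```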
Based on the relationship between the battery capacity and peak temperature rise, a temperature-monitoring-based BMS unit was designed for the detection and prevention of overdischarge. The main body of the BMS is a printed circuit board (PCB) on which the resistance temperature detectors and their constant current supply are mounted. Multiple MOSFETs are also mounted on the PCB and are used as switching devices for the circuit. The temperature sensors are mounted on the bottom surface of the LIBs while they are under test, as described in our previous work [27]. The temperature was recorded by the pyboard, which was also mounted on the PCB of the BMS. Once any abnormal temperature increase was identified by the pyboard based on the results from the linear regression model, the pyboard triggered the MOSFETs to disconnect the LIB under test from the battery analyzer, which stopped the discharging process and prevented further overdischarge.
The temperature-based overdischarge identification and prevention procedure can be outlined as follows. First, the LIBs used for testing were cycled in constant current mode at a rate of 1C for 10 complete cycles (training cycles) between 3.0 V and 4.2 V. The cell temperature was continuously recorded using an RTD at a rate of 0.2 Hz. After the initial cycling, the peak temperature for each cycle was identified, and the relationship between the peak temperature and battery capacity was obtained using linear regression. Another 1C constant current cycle (testing cycle) was then performed. For the testing cycle, the cells were first charged to 4.2 V and then discharged without enforcing the preassigned cutoff voltage of 3.0 V. In the testing cycle, the charging capacity was obtained and used to predict the peak temperature of the following discharge process with the linear regression model, as the Coulombic efficiency of commercial LIBs is typically higher than 99.5% [34–36]. The linear regression model generated an expected peak temperature rise ΔT based on this information. A 10% margin was then added to ΔT when calculating the cutoff temperature to accommodate random measurement error and inconsistency between cycles. When the pyboard detected a temperature above this cutoff based on the RTD reading, a MOSFET was triggered to disconnect the battery from the BMS circuit and stop the discharge process. Figure 2 shows the process used by the BMS to predict the peak temperature of the cells for the calculation of the cutoff temperature, as well as the decision process for removing an overdischarging cell from the circuit.
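The cutoff logic can be sketched as follows; this is a hedged illustration rather than the firmware used in this work, and the gate pin name, polling interval, and the read_temperature callable (for example, the helper sketched in Sec. 2.1) are assumptions.

```python
import pyb

MARGIN = 1.10  # 10% margin applied to the expected peak temperature rise

# Assumed gate pin driving the MOSFET switch; high keeps the cell connected.
mosfet_gate = pyb.Pin('X2', pyb.Pin.OUT_PP)
mosfet_gate.high()

def run_discharge_guard(predicted_peak, ambient, read_temperature):
    """Disconnect the cell once the measured temperature reaches the cutoff.

    predicted_peak: peak temperature predicted by the linear regression (degC)
    ambient: temperature at the start of the discharge (degC)
    read_temperature: callable returning the current RTD temperature (degC)
    """
    # Apply the 10% margin to the predicted temperature rise, then convert
    # back to an absolute cutoff temperature.
    cutoff = ambient + MARGIN * (predicted_peak - ambient)

    while True:
        if read_temperature() >= cutoff:
            mosfet_gate.low()   # open the switch, disconnecting the cell
            return
        pyb.delay(5000)         # 0.2 Hz sampling, matching the training cycles
```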
3 Results
During the initial training cycles, the temperatures of the cells were continuously recorded in order to observe their behavior during charging and discharging and to determine the point at which the peak temperatures occurred during cycling. The results from the training cycles showed a normal temperature profile during cycling for each of the cells, with the peak temperatures occurring at the end of the discharge cycles when the cells reached the discharge cutoff voltage of 3.0 V [26]. This observation agreed with the theoretical analysis of heat generation and temperature change of Li-ion cells. Cell 3 did not exhibit a decrease in peak temperature over the first cycles as the other cells did, but it showed the same general relationship between capacity and peak temperature. This is likely caused by manufacturing variations that result in differing internal resistances due to side reactions such as gas generation and electrode particle cracking, which are common occurrences and affect the heat generation of cells [37,38]. The temperatures recorded during the 10 training cycles for the three Li-ion prismatic cells are presented in Fig. 3.
From Fig. 3, it can be seen that the peak temperatures during the discharge cycles generally decreased as the cells aged. A decrease in cell capacity is also seen during the first few cycles, which is mainly due to the formation of the SEI layer on the negative electrode, causing a reduction in the amount of active material available in a cell [39–41]. The peak temperatures during the discharge cycles, in relation to their corresponding discharge capacities, are presented in Fig. 4. The capacity-based regression models resulted in the following parameters: intercepts of −49.82, −35.29, and 27.37, and slopes of 0.5920, 0.4795, and 0.0259 for cell 1, cell 2, and cell 3, respectively. These slopes correspond to the relationship between the capacity and peak temperature observed in the training cycles for the three cells. In general, as the cell capacities decreased slightly over the initial cycles, the peak temperatures also decreased, resulting in a positive relationship between the two factors. These cell temperatures and discharge capacities were used as the training data for the linear regression model.
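For reference, evaluating the reported fits at the rated capacity of 140 mAh from Table 1 gives the predicted peak temperatures directly; the input capacity here is illustrative rather than the charge capacity measured in the testing cycles.

```python
# Reported (intercept, slope) pairs for cells 1-3 from the capacity-based fits.
fits = {"cell 1": (-49.82, 0.5920),
        "cell 2": (-35.29, 0.4795),
        "cell 3": (27.37, 0.0259)}

charge_capacity = 140.0  # mAh, rated capacity from Table 1 (illustrative input)

for cell, (intercept, slope) in fits.items():
    predicted_peak = slope * charge_capacity + intercept
    print(f"{cell}: predicted peak temperature = {predicted_peak:.1f} degC")
```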
During the testing cycles, the temperatures of the three cells were monitored and recorded continuously; these data are presented in Fig. 5. The cycling conditions for each of the cells were held constant, and a controlled overdischarge was performed to test specifically for a temperature exceeding the expected value as a result of overdischarge. During the overdischarge test at the end of the test cycling, all three cells were disconnected from the circuit by the MOSFET when the threshold temperature was detected by the RTD and pyboard. The voltages of the three cells at the time of cutoff were 2.932 V, 2.991 V, and 2.921 V, respectively. All were disconnected within 0.1 V of the target cutoff voltage of 3.0 V.
The results of the overdischarge tests show the effectiveness of linear regression in predicting the temperature of a cell during overdischarge based on the temperatures and capacities of the previous cycles. They also demonstrate the potential of a temperature-based BMS that implements a machine learning approach for the prevention of overdischarge.
4 Conclusion
In this work, a temperature measurement-based approach for LIB overdischarge detection and prevention was developed. A BMS prototype with linear regression-based machine learning capability was designed and tested using the information obtained from the cells. A simple constant current cycling test was designed to verify the capability of temperature measurement and linear regression for overdischarge detection, where the BMS successfully prevented undesired overdischarge of LIBs based on the data available from the previous training cycles and real-time measurement in the testing cycle. This approach also has the potential to be compatible with most existing commercial battery management systems that implement temperature monitoring units.
Acknowledgment
The authors acknowledge the School of Aeronautics and Astronautics, Purdue University for their financial support in this project. The authors acknowledge the Office of Naval Research (ONR) for supporting the work under the grant N00014-18-1-2397 (Program Manager: Dr. Michele Anderson).
Conflict of Interest
There are no conflicts of interest.