With operational ranges of 20-30 miles, electric bicycles (eBikes) are a fun, inexpensive alternative to fossil-fueled vehicles for many urban and suburban applications. However, since load, battery condition, and terrain all dramatically affect an eBike’s range, it can be difficult to estimate whether you will make it home without having to pedal your grocery-laden vehicle up that last hill. While virtually every production car since the pre-1962 Volkswagen Beetle has included a fuel gauge as standard equipment, many of today’s eBikes still provide only rough clues about how many miles of charge they have left. This is changing with the arrival of so-called “battery fuel gauge” devices, whose higher accuracy is helping make eBikes more practical as everyday commuter and delivery vehicles in many parts of the world.
It’s in the chemistry
Most of the challenges involved with measuring a battery’s state of charge (SOC) lie in the complex processes that enable a two-way exchange between chemical and electrical energy. Among the phenomena common to all these processes is that a battery cell’s open-circuit voltage (OCV) drops as it discharges, a behavior that can be used to infer its state of charge. Some battery chemistries, such as the lead-acid (PbA) gel cells used in most early eBikes, have a relatively linear voltage profile and a large voltage difference between their charged and discharged states (Figure 1). In this case, a simple mechanical voltmeter can give the rider a useable, if imprecise, sense of the percentage of charge remaining.
Figure 1: OCV vs. SOC for a 12V lead acid battery – (Courtesy of Paul Hill, “Batteries” V2.07 5 Jul, 2004)
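Because the lead-acid OCV curve is roughly linear, a voltmeter-style gauge amounts to a simple linear mapping from voltage to percentage. The sketch below illustrates the idea in Python; the 11.8 V (empty) and 12.7 V (full) endpoints are typical textbook figures for a 12 V PbA battery, assumed here for illustration rather than taken from the article:

```python
def pba_soc_percent(ocv_volts, v_empty=11.8, v_full=12.7):
    """Crude SOC estimate for a 12 V lead-acid battery.

    Assumes the OCV-vs-SOC curve is linear between the (typical,
    assumed) empty and full open-circuit voltages, and clamps the
    result to the 0-100 percent range.
    """
    frac = (ocv_volts - v_empty) / (v_full - v_empty)
    return max(0.0, min(100.0, frac * 100.0))
```

A rested battery reading 12.25 V would report roughly 50 percent, which is about as precise as the mechanical voltmeter it emulates.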
A problem arises, however, when using lithium-based, high-performance battery chemistries, whose flatter, less linear depth-of-discharge (DoD) curves make mechanical voltmeters useless for indicating much more than “full” or “empty” (Figure 2). The simplest solution for these applications is interpreting the voltage readings against a hardware or software lookup table that contains a set of chemistry-specific battery profiles correlating a cell’s DoD with its output voltage. The relatively flat discharge curves of most modern battery chemistries require accurate, high-resolution (12 bits or more) voltage measurements to provide useful SOC data within the 20-to-80 percent charge region. Since the OCV curves tend to shift as a function of temperature, even a simple lookup-based fuel gauge also needs to factor the battery’s thermal status into its capacity calculations.
Figure 2: Typical voltage vs. depth-of-discharge profiles for Lithium-Ion and Lithium-Iron-Phosphate batteries. (Courtesy of Texas Instruments).
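A lookup-based gauge of the kind described above can be sketched as a table interpolation with a temperature correction applied to the measured voltage. The OCV/SOC points and the temperature coefficient below are illustrative placeholders, not a vendor-characterized cell profile:

```python
# Illustrative OCV (V) -> SOC (%) points for a single Li-ion cell
# referenced to 25 degrees C. Placeholder values, not a real profile.
OCV_TABLE_25C = [
    (3.00, 0), (3.45, 10), (3.68, 20), (3.74, 40),
    (3.77, 50), (3.85, 70), (3.95, 80), (4.10, 95), (4.20, 100),
]

def soc_from_ocv(ocv, temp_c=25.0, temp_coeff_v_per_c=-0.0005):
    """Estimate SOC by linear interpolation in a chemistry-specific table.

    The temperature term shifts the measured OCV back to the table's
    25 degree C reference; the coefficient is an assumed, illustrative
    value, since real coefficients are chemistry-specific.
    """
    ocv_ref = ocv - temp_coeff_v_per_c * (temp_c - 25.0)
    pts = OCV_TABLE_25C
    if ocv_ref <= pts[0][0]:
        return 0.0
    if ocv_ref >= pts[-1][0]:
        return 100.0
    # Walk the table and interpolate within the bracketing segment.
    for (v0, s0), (v1, s1) in zip(pts, pts[1:]):
        if v0 <= ocv_ref <= v1:
            return s0 + (s1 - s0) * (ocv_ref - v0) / (v1 - v0)
```

Note how the tight voltage spacing between the 20 percent and 80 percent entries mirrors the flat region of Figure 2, which is why high-resolution voltage measurement matters there.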
OCV isn’t enough
The most obvious downside to using only a lookup-based OCV methodology is that it cannot tell you how much energy is actually available from the battery. This is because a battery’s charge capacity decreases as a function of age and the number of discharge cycles it experiences. Without some sort of compensation, there will be a big difference between the distance delivered by a new battery and a year-old unit when they both show a 75 percent charge reading. For this reason, a true “fuel gauge” battery monitor also keeps track of the current flowing into and out of the battery, typically by measuring the voltage across a low-value sense resistor. When processed to account for charging losses, discharge losses (due to the battery’s internal impedance), temperature, and other effects, this current data can be used in conjunction with the SOC lookup table to provide a reasonable estimate of the actual charge remaining in the battery.
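The current-tracking portion of such a gauge can be sketched as a simple coulomb counter that integrates the sense-resistor current over time. The capacity figure and charge-efficiency factor below are assumed values for illustration; a production gauge would also compensate for temperature and internal-impedance losses as described above:

```python
class CoulombCounter:
    """Toy coulomb counter: integrates current samples into remaining charge.

    capacity_mah and charge_efficiency are illustrative assumptions;
    a real fuel gauge derives these from the battery's characterized
    profile and adjusts them for temperature and aging.
    """
    def __init__(self, capacity_mah, soc_percent, charge_efficiency=0.95):
        self.capacity_mah = capacity_mah
        self.remaining_mah = capacity_mah * soc_percent / 100.0
        self.charge_efficiency = charge_efficiency

    def sample(self, current_ma, dt_hours):
        """current_ma > 0 means charging, < 0 means discharging."""
        delta = current_ma * dt_hours
        if delta > 0:
            delta *= self.charge_efficiency  # not all input charge is stored
        self.remaining_mah = max(0.0, min(self.capacity_mah,
                                          self.remaining_mah + delta))

    def soc_percent(self):
        return 100.0 * self.remaining_mah / self.capacity_mah
```

In practice each sample would come from an ADC reading across the low-value sense resistor, taken at a fixed interval.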
Devices such as Maxim’s DS2788 stand-alone fuel gauge use these basic techniques to estimate available capacity for rechargeable lithium-ion (Li+) and Li+ polymer batteries. The cell-specific characteristics and application parameters are stored in the DS2788’s on-chip EEPROM and used to calculate a conservative estimate of the useable charge remaining, given the present temperature, discharge rate, and stored charge. The gauge’s capacity estimates can be displayed on a 5-segment LED display, or made available to a host processor via a series of registers that report the remaining charge in terms of mAh and percentage of maximum capacity (Figure 3).
Figure 3: Maxim’s DS2788 stand-alone fuel gauge uses OCV, temperature, and input/output current monitoring to estimate a battery’s available capacity. (Courtesy of Maxim Integrated Products)
Texas Instruments uses the same principles in its bq2060A multi-chemistry gas gauge to calculate remaining battery capacity, temperature, voltage, current, and remaining run-time predictions for NiCd, NiMH, Li-Ion, and lead-acid batteries.
Advanced algorithms for enhanced accuracy
While some devices can use these relatively simple techniques to deliver battery capacity estimates with accuracies approaching 90 percent, this may not be sufficient for drivers who want to squeeze the last possible mile out of their vehicles’ batteries. In addition, it can be highly desirable to have a battery fuel gauge that can give accurate predictions of run time at a particular discharge rate, which then can be used to produce reliable estimates of how far the vehicle’s remaining charge will carry it.
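Converting a capacity reading into a distance estimate is straightforward arithmetic once an average discharge rate is assumed. The function below is a back-of-the-envelope sketch; all of its inputs (pack voltage, average electrical load, speed) are hypothetical rider-supplied figures, and a real gauge would refine them with rate- and temperature-dependent capacity corrections:

```python
def estimated_range_miles(remaining_mah, pack_voltage, avg_power_w, speed_mph):
    """Back-of-the-envelope eBike range estimate from remaining charge.

    Converts remaining charge to watt-hours, divides by the average
    electrical load to get run time, then multiplies by speed.
    All parameter values are assumptions supplied by the caller.
    """
    remaining_wh = remaining_mah / 1000.0 * pack_voltage
    run_time_hours = remaining_wh / avg_power_w
    return run_time_hours * speed_mph
```

For example, a 36 V pack reporting 10 Ah remaining, drawn down at an average 180 W while cruising at 15 mph, works out to about 30 miles.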
Producing more accurate measurements of a battery’s capacity can involve more precise tracking of the energy flowing into and out of its cells (usually known as “coulomb counting”) and taking a closer look at some of the more subtle phenomena that occur within its innards. Devices such as Maxim’s MAX17044 use a sophisticated version of coulomb counting in their ModelGauge battery-modeling scheme to track the battery’s relative state-of-charge (SOC) continuously over a widely varying charge/discharge profile. Unlike traditional fuel gauges, the ModelGauge algorithm eliminates the need for battery relearn cycles and an external current-sense resistor. Temperature compensation is possible in the application with minimal interaction between a µC and the device.
Texas Instruments has developed its own enhanced battery-modeling technique called Impedance Track, which supplements coulomb counting with measurement techniques that determine the actual physical condition of the battery. This self-learning mechanism produces a more accurate model of the battery by monitoring the increase in impedance a lithium battery experiences as it ages. Impedance Track does this by taking a series of voltage and current readings, both when the battery is at rest and under load, and using them to calculate changes in the battery’s impedance and its total chemical capacity (Qmax). It also accounts for the fact that battery impedance varies significantly between cells and under different usage conditions, such as temperature and state-of-charge.
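The core rest-versus-load comparison can be illustrated with the basic DC relation R = (V_rest − V_loaded) / I_load. This is only the simplest ingredient of the approach sketched here for illustration; Impedance Track itself combines many such readings with a chemistry model and Qmax tracking:

```python
def cell_impedance_ohms(v_rest, v_loaded, load_current_a):
    """Estimate a cell's DC internal resistance from one rest/load pair.

    Applies R = (V_open_circuit - V_under_load) / I_load. A rising
    result over many charge/discharge cycles indicates battery aging;
    this single-point sketch ignores temperature and SOC dependence.
    """
    if load_current_a <= 0:
        raise ValueError("load current must be positive (discharging)")
    return (v_rest - v_loaded) / load_current_a
```

A cell that rests at 4.0 V but sags to 3.9 V under a 2 A load, for instance, shows about 50 mΩ of internal resistance.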
The Impedance Track methodology is used by most of Texas Instruments’ recently released battery fuel gauges, including its bq20Z70 battery gas gauge (Figure 4). It monitors capacity change, battery impedance, open-circuit voltage, and other critical parameters of the battery pack, and reports the information to the system host controller over a serial-communication bus. When used in conjunction with its companion bq29330 battery monitor/protector, the bq20Z70 forms a complete battery management solution.
Figure 4: Texas Instruments’ bq20Z70 battery gas gauge measures a battery’s internal impedance to help determine how its overall capacity has been affected by temperature, repeated charge/discharge cycles, and other aging effects. It can be used in conjunction with its companion bq29330 battery monitor/protector to form a complete battery management solution. (Courtesy of Texas Instruments)
Estimating how much battery energy is left is a key factor in determining an eBike’s range. This article has explored stand-alone fuel gauges from Maxim and TI, as well as TI’s enhanced battery-modeling technique, Impedance Track. Further information can be found by using the Hotenda links provided.