Elliott Sound Products - Lithium Cell Charging
1 - Battery Management System (BMS)
2 - Charging Profile
3 - Constant Voltage And Constant Current Power Supplies (Chargers)
4 - IC Single Cell Charging Circuit
5 - Multi-Cell Charging
6 - Battery Protection
7 - State Of Charge (SOC) Monitoring
8 - Battery Powered Projects
Charging lithium batteries or cells is (theoretically) simple, but can be fraught with difficulties, as shown by the multiple serious failures in commercial products. These range from laptop computers and mobile ('cell') phones to the so-called 'hoverboards' (aka balance boards), and even aircraft. Balance boards caused a number of house fires and destroyed or damaged many properties worldwide. If the cells aren't charged properly, there is a high risk of venting (release of high-pressure gases), which is often followed by fire.
Lithium is the lightest of all metallic elements, and will float on water. It is very soft, but oxidises quickly in air. Exposure to water vapour and oxygen is often enough to cause combustion, and especially so if there is heat involved (for example, from overcharging a lithium cell). Exposure to moist/ humid air causes hydrogen gas to be generated (from the water vapour), which is of course highly flammable. Lithium melts at 180°C. Most airlines insist that lithium cells and batteries be charged to no more than 30% for transport, due to the very real risk of catastrophic fire. Despite the limitations, lithium batteries are now used in nearly all new equipment because of the very high energy density and light weight.
Batteries have charge and discharge rates that are referred to as 'C' - the battery or cell capacity, in Ah or mAh (amp or milliamp hours). A battery with a capacity of 1.8Ah (1,800mAh) therefore has a 'C' rating of 1.8 amps. This means that (at least in theory) the battery can supply 180mA for 10 hours (0.1C), 1.8A for 1 hour, or 18A for 6 minutes (0.1 hour or 10C). Depending on the design, lithium batteries can supply up to 30C or more, so our hypothetical 1,800mAh battery could theoretically supply 54A for 2 minutes. Capacity may also be stated in Wh (watt hours), although this figure is usually not helpful other than in advertising brochures.
In the US and some countries elsewhere, the Wh rating is required by shipping companies so they can determine the packaging standard needed. A single 1.8Ah cell has a stored energy of 6.7Wh [ 4 ]. Alternatively, the lithium content may need to be stated. The reference also shows how this can be calculated, although any calculation made will only be an estimate unless the battery maker specifically states the lithium content. The reason for this is the risk of fire - carriers dislike having shipments catch fire, and the lithium content may dictate how the goods will be shipped. When batteries are shipped separately (not built into equipment) they must be charged to no more than 30% capacity.
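The arithmetic behind the 'C' rate and the Wh figure is straightforward. The short sketch below reproduces the numbers used above; the function names and the 3.7V nominal cell voltage used for the Wh estimate are assumptions for this example, not anything defined in the article.

```python
# Illustrative helpers for the 'C' rate and Wh arithmetic above.
# Function names and the 3.7 V nominal voltage are assumptions.

def discharge_time_hours(capacity_ah: float, current_a: float) -> float:
    """Theoretical run time: capacity (Ah) divided by drain (A)."""
    return capacity_ah / current_a

def energy_wh(capacity_ah: float, nominal_v: float = 3.7) -> float:
    """Stored energy estimate: capacity (Ah) times nominal cell voltage."""
    return capacity_ah * nominal_v

cap = 1.8  # Ah - the 1,800 mAh cell used in the text
print(discharge_time_hours(cap, 0.18))   # 0.1C -> 10 hours
print(discharge_time_hours(cap, 18.0))   # 10C  -> 0.1 hours (6 minutes)
print(round(energy_wh(cap), 1))          # ~6.7 Wh for a single cell
```

As noted, the Wh value is only an estimate unless the maker states the real figures, but it's the number shipping companies usually want.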
Unlike some older battery technologies, lithium batteries cannot (and should not) be left on float charge, although it may be possible if the voltage is maintained below the maximum charge voltage. For most of the common cells in use, the maximum cell voltage is 4.2V, called the 'saturation charge' voltage. The charge voltage should be maintained at this level only for long enough for the charge current to have fallen to 10% of the initial or 1C value. However, this may be subject to interpretation because the initial charge current can have a wide range, depending on the battery and the charger.
Unfortunately, while there are countless articles about lithium battery charging, there are nearly as many different suggestions, recommendations and opinions as there are articles. One of the main things that is essential when charging a lithium battery is to ensure that the voltage across each cell never exceeds the maximum allowable, and this means that each and every cell in the battery has to be monitored. There are many ICs available that have been specifically designed for lithium battery balance charging, with some systems being quite complex, but extremely comprehensive in terms of ensuring optimum performance.
While traditional lithium-ion (Li-Ion) and lithium-polymer (Li-Po) cells have a nominal voltage of 3.70V, Li-iron-phosphate (LiFePO4, aka LFP - lithium ferrophosphate) is an exception, with a nominal cell voltage of 3.20V and charging to 3.65V. Many commercial LiFePO4 batteries have in-built balancing and protection circuits, and only need to be connected to the proper charger. A relatively new addition is Li-titanate (LTO), with a nominal cell voltage of 2.40V and charging to 2.85V.
Chargers for these alternative lithium chemistry cells are not compatible with regular 3.70-volt Li-Ion. Provision must be made to identify the systems and provide the correct charging voltage. A 3.70-volt lithium battery in a charger designed for LiFePO4 would not receive sufficient charge; a LiFePO4 in a regular charger would cause overcharge. Unlike many other chemistries, Li-Ion cells cannot absorb an overcharge, and the specific battery chemistry must be known and charging conditions adjusted to suit.
Li-Ion cells operate safely within the designated operating voltages, but the battery (or a cell within the battery) becomes unstable if inadvertently charged to a higher than specified voltage. Prolonged charging above 4.30V on a Li-Ion cell designed for 4.20V will plate metallic lithium on the anode. The cathode material becomes an oxidizing agent, loses stability and produces carbon dioxide (CO2). The cell pressure rises and if the charge is allowed to continue, the current interrupt device responsible for cell safety disconnects at 1,000-1,380kPa (145-200psi). Should the pressure rise further, the safety membrane on some Li-Ion cells bursts open at about 3,450kPa (500psi) and the cell may eventually vent - with flames !
Not all cells are designed to withstand high internal pressures, and will show visible bulging well before the pressure has reached anything near the values shown. This is a sure sign that the cell (or battery) is damaged, and it should not be used again. Unfortunately, many of the articles you find on-line discussing balance boards (in particular) talk about the cell quality (or lack thereof) and/or the charger quality (ditto), but neglect to mention the battery management system (BMS) discussed next.
This is one of the most critical elements of a lithium battery charger, but is rarely mentioned in most articles that discuss battery fires. In general, it's assumed (or not known to the writer) that the battery pack includes - or should include - a protection circuit to ensure that each cell is monitored and protected against overcharge. It's likely that cheap (or counterfeit) battery packs don't include a protection circuit at all, and any battery without this essential circuitry is generally to be avoided unless you have a proper external balance charger with a multi-pin connector. The problem is that sellers will rarely disclose (or even know) if the battery has protection or not.
Unhelpfully, many sellers of batteries and chargers fail to make the distinction between battery monitoring and battery protection. These are two separate functions, and in general they are separate pieces of circuitry. Unfortunately, the term 'BMS' can mean either monitoring or protection, depending largely on the definition used by the seller, and/or their understanding of what is actually being sold.
I will use the term 'balancing' to apply to the management of the charging process, and for batteries (as opposed to single cells), it's the balancing process that ensures that each cell is closely monitored during charging to maintain the correct maximum cell voltage. Protection circuits are usually connected to the battery permanently, and are often integrated within the battery pack. These are covered further below. In some cases, protection and balancing may be provided as a complete solution, in which case it truly deserves the term 'BMS' or 'battery management system'.
For proper control of the charge process with more than a single cell, a battery balance system is absolutely essential. The balance circuits are responsible for ensuring that the voltage across any one cell never exceeds the maximum allowed, and is often integrated with the battery charger. Some have further provisions, such as monitoring the cell temperature as well. In large installations, the individual cell controllers communicate with a central 'master' controller that provides signalling to the device being powered, indicating state of charge (inasmuch as this parameter can be determined - it's less than an exact science), along with any other data that may be considered essential.
For comparatively simple batteries with from 2 to 5 series cells, giving nominal voltages from 7.4V to 18.5V respectively, cell balance isn't particularly difficult. It does become a challenge when perhaps 110 cells are connected in series, for an output of around 400V (as may be found in an electric car for example). Cells can also be connected in parallel, most commonly as a series-parallel network. Common terminology (especially for 'hobby' batteries for model airplanes and the like) will refer to a battery as being 5S (5 series cells), or 4S2P (4 series cells, with each comprised of 2 cells in parallel).
Operating cells in parallel is not a problem, and it's possible (though usually not recommended) that they can have different capacities. Of course, they must use the exact same chemistry. When run in series, the cells must be as close to identical as possible. As the cells age they will do so at different rates - some cells will always deteriorate faster than others. This is where the balance system becomes essential, because the cell(s) with the lowest capacity will charge (and discharge) faster than the others in the pack. The majority of balance chargers use a regulator across each cell, and that ensures that each individual cell's charge voltage never exceeds the maximum allowed.
In its simplest form, this could be done with a string of precision zener diodes, and that is actually fairly close to the systems commonly used. The voltage has to be very accurate, and ideally will be within 50mV of the desired maximum charge voltage. Although the saturation charge voltage is generally 4.2V per cell, battery life can be extended by limiting the charge voltage to perhaps 4.1 volts. Naturally, this results in slightly less energy storage.
The two major components of a BMS will be looked at separately below. These may be augmented by performance monitoring (state of charge, remaining capacity, etc.), but this article concentrates on the important bits - those that maximise both safety and battery life. So-called 'fuel gauges' are a complete topic unto themselves, and they are only covered in passing here.
The graph shows the essential elements of the charge process. Initially, the charger operates in constant current (current limit) mode, with the maximum current ideally being no more than 1C (1.8A for a 1.8Ah cell or battery). Often it will be less, and sometimes a great deal less. Charging at 0.1C (180mA) would result in a charge time of 30 hours if the full saturation charge is applied. However, when a comparatively slow charge is used (typically less than 0.2C), it is possible to terminate charging as soon as the cell(s) reach 4.2V and the saturation charge isn't necessary. For example, based on the 'new' charging algorithm, the cell shown in Figure 1 may require somewhere between 12 and 15 hours to charge at 0.1C, and the charge cycle is ended as soon as the voltage reaches 4.2 volts. This is somewhat kinder to the Li-Ion cell, and voltage stress is minimised.
Figure 1 - Lithium Ion Charging Profile (1 Cell)
As is clearly shown in the graph, a fast charge means that the capacity lags the charge voltage, and 1C is fairly fast - especially for batteries designed for low consumption devices. After about 35 minutes, the voltage has (almost) reached the 4.2V maximum and charge current starts to fall, but the cell is only charged to around 65%. A slower charge rate means that the charge level is more closely aligned with the voltage. Like all batteries, you never get out quite as much as you put in, and you generally need to put in about 10-20% more ampere hours (or milliamp hours) than you will get back during discharge.
Some chargers provide a pre-conditioning charge if the cell voltage is less than 2.5 volts. This is generally a constant current of 1/10 of the nominal full constant current charge. For example, if the charge current is set for 180mA, the cell will be charged at 18mA until the cell voltage has risen to about 3V (this varies depending on the design of the charger). Most systems will never need pre-conditioning though, because the electronics will (or should!) shut down before the cell reaches a potentially damaging level of discharge.
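The charge-phase decisions just described (pre-condition below about 3V at C/10, constant current up to 4.2V, then constant voltage until the current tapers to roughly 10% of the initial value) can be sketched as simple logic. This is only an illustration of the decision points; the function name and the exact thresholds chosen here are assumptions based on the values quoted in the text.

```python
# Hedged sketch of the CC/CV charge-phase decisions described above.
# Thresholds (3.0 V pre-condition, 4.2 V maximum, 10% termination)
# follow the text; the function name is invented for this example.

def charge_phase(cell_v: float, current_a: float, full_current_a: float) -> str:
    """Return which phase a simple CC/CV charger should be in."""
    if cell_v < 3.0:
        return "pre-condition"        # charge at C/10 until ~3 V
    if cell_v < 4.2:
        return "constant-current"     # full programmed current
    if current_a > 0.1 * full_current_a:
        return "constant-voltage"     # hold 4.2 V while current tapers
    return "terminate"                # current fell to 10% of initial

print(charge_phase(2.4, 0.018, 0.18))  # pre-condition
print(charge_phase(3.7, 0.18, 0.18))   # constant-current
print(charge_phase(4.2, 0.05, 0.18))   # constant-voltage
print(charge_phase(4.2, 0.01, 0.18))   # terminate
```

A real charger IC implements this state machine in hardware, usually with added timers and temperature qualification.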
In use, Li-Ion batteries should be kept cool. Normal room temperature (between 20° and 25°C) is ideal. Leaving charged lithium batteries in cars out in the sun is ill-advised, as is any other location where the temperature is likely to be higher than 30°C. This is doubly important when the battery is being charged. When discharged, some means of cutout is required to ensure that the cell voltage (of any cell in the battery) does not fall below 2.5 volts.
It's usually better not to fully charge lithium batteries, nor allow a deep discharge. Battery life can be extended by charging to around 80-90% rather than 100%, as this all but eliminates 'voltage stress' experienced when the cell voltage reaches the full 4.2 volts. If the battery is to be stored, a charge of 30-40% is recommended, rather than a full charge. There are many recommendations, and most are ignored by most people. This is not the users' fault though - manufacturers of phones, tablets and cameras could offer an option for a reduced charge - there's plenty of processing power available to do it. This is especially important for items that don't have a user replaceable battery, because it often means that otherwise perfectly good equipment is discarded just because the battery is tired. Given the proliferation of malware for just about every operating system, it's important to ensure that battery charge settings can never be set in such a way that may cause damage.
During the initial part of the charge cycle, the charger supply should be constant current. Current regulation doesn't have to be perfect, but it does need to be within reasonable limits. We don't much care if a 1A supply actually delivers 1.1A or 0.9A, or if it varies a little depending on the voltage across the regulator. We obviously should be very concerned if it's found that the maximum current is 10A, but that simply won't happen even with a fairly crude regulator.
For a purely analogue design, the LM317 is well suited for the task of current regulation, and it's also ideal for the essential voltage regulation. This reduces the overall BOM (bill of materials), since multiple different parts aren't needed. Of course, these are both linear devices, so efficiency is poor, and they require a supply voltage that's greater than the total battery voltage by at least 5 volts, and preferably somewhat more.
As an alternative to using two LM317 ICs you can add a couple of transistors and resistors to create a current limiter. However, it doesn't work quite as well, it needs more PCB real estate than the version shown here, and the cost saving is minimal. The circuit below does not include the facility for a 'pre-conditioning' or 'wake-up' charge before the full current is applied. This isn't essential if the battery is never allowed to discharge below 3V, and may not even be needed for a 2.5V minimum. Anything less than a discharged cell voltage of 2.5V will require a C/10 pre-conditioning charge. If you only ever charge at the C/10 rate, a lower charge rate is not needed.
Figure 2 - Constant Current / Constant Voltage Charge Circuit
The arrangement shown will limit the current to the value determined by R1. With 12 ohms, the current is 100mA (close enough - actually 104mA), set by the resistance and the LM317's internal 1.25V reference voltage. For 1A use 1.2 ohms (5W is recommended), and the value can be determined for any current needed up to the maximum 1.5A that the LM317 can provide. At higher current, the regulator will need a heatsink, especially for the initial charge phase when considerable voltage will be across U1. The diodes prevent the battery from applying reverse polarity to the regulator (U2) if the battery is connected before the DC supply is turned on. D1 should be rated for at least double the maximum current, and will ideally be a Schottky device to minimise dissipation and voltage loss.
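The LM317 current-limit arithmetic above is just Ohm's law against the 1.25V internal reference: R1 = 1.25 / I. A short sketch (variable and function names invented for the example) confirms the values quoted:

```python
# The LM317 current-limiter arithmetic from the text: R1 = Vref / I.
# Names are invented for this sketch.

VREF = 1.25  # LM317 internal reference voltage (volts)

def r1_for_current(i_amps: float) -> float:
    """Resistor that sets the LM317 current limit to i_amps."""
    return VREF / i_amps

def current_for_r1(r_ohms: float) -> float:
    return VREF / r_ohms

def resistor_dissipation(i_amps: float) -> float:
    """Power in R1 at the limit: Vref * I."""
    return VREF * i_amps

print(round(current_for_r1(12.0) * 1000))  # 104 mA with 12 ohms
print(r1_for_current(1.0))                 # 1.25 ohms (1.2 used in the text)
print(resistor_dissipation(1.0))           # 1.25 W continuous at 1 A, hence a 5 W part
```

The same formula gives the resistor for any current up to the LM317's 1.5A limit; remember that the regulator itself also dissipates (Vin - Vbatt) × I and needs a heatsink at higher currents.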
This is simply the basic charger, which can be designed to fulfil the requirements described above. This is far from the full system though, as the management system and balancing circuits are missing at this stage. Each system will be different, but the basic circuit is flexible enough to accommodate most 2-4 cell battery packs. Charging can be stopped by connecting the 'Adj' pin of U1 to ground with a transistor as shown. When charging is complete, a voltage (5V is fine) is applied to the end of R3, and the current limiter is shut down. Be aware that the battery will be discharged by the combination of the balance circuits and the current passed through R4, R5 and VR1 (the latter is about 5.7mA).
A single cell (or parallel cell batteries) charger is conceptually quite straightforward. However, when the full requirements are considered it becomes obvious that a simple current limited precision regulator as shown above may not be enough. Many IC makers have complete lithium cell chargers on a chip, with most needing nothing more than a programming resistor, a couple of bypass capacitors and an optional LED indicator. One (of many) that incorporates everything needed is the Microchip MCP73831, shown below. Most of the major IC manufacturers make specialised ICs, and the range is vast. TI (Texas Instruments) makes a range of devices designed for full BMS applications ranging from a single cell to 400V batteries used for electric vehicles. Another simple IC is the LM3622 which is available in a number of versions, depending on the end point voltage. A version is also available for a two-cell battery, but it lacks balancing circuitry which makes it rather pointless (IMO).
Figure 3 - Single Cell Charger Using MCP73831 IC
Four termination voltages are available - 4.20V, 4.35V, 4.40V and 4.50V, so it's important to get the correct version for the cell type you will be charging. The constant current mode is controlled by R2, which is used to 'program' the IC. Leaving pin 5 ('PROG') open circuit inhibits charging. The IC automatically stops charging when the voltage reaches the maximum set by the IC, and will supply a 'top up' charge when the cell voltage falls to around 3.95 volts. The optional LED can be used to indicate charge or end-of-charge, or both using a tri-colour LED or separate LEDs. The status output is open-circuit if the IC is shut down (due to over temperature for example) or no battery is present. Once charging is initiated, the status output goes low, and it goes high when the charge cycle is complete. Note that this IC is only available in SMD packages; there is no through-hole version. The same applies to most devices from other manufacturers.
The charger shown is a linear regulator, so dissipates power when charging the cell. If the discharged cell voltage is 3V, the IC will only dissipate 300mW with a 100mA charge current. If increased to the maximum the IC can provide (500mA), the IC will dissipate 1.5W, and that means it will get very hot (it's a small SMD device after all). Should the cell voltage be less than 3V (deeply discharged due to accident or long term storage), the dissipation will be such that the IC will almost certainly shut down, as it has internal over-temperature sensing. It will cycle on and off until the voltage across the cell has risen far enough to reduce the dissipation to allow continuous operation. Switchmode chargers are far more efficient, but are larger, more complex, and more expensive to build.
Some controllers include temperature sensing, or have provision for a thermistor to monitor the cell temperature. ICs such as the LTC4050 will only charge when the temperature is between 0°C and 50°C when used with the NTC (negative temperature coefficient) thermistor specified. Others can be designed to be mounted so that the IC itself monitors the temperature. These are intended to be installed with the IC in direct thermal contact with the cell. The series pass transistor must be external to the IC to ensure that its dissipation doesn't affect the die temperature of the IC.
The current programming resistor is set for 10k in the above drawing, and that sets the charge current to about 100mA. The datasheet for the IC has a graph that shows charge current versus programming resistor, and there doesn't appear to be a formula that can be applied. A 2k resistor gives the maximum rated charging current of 500mA. As discussed earlier, a slow charge is probably the best option for maximum cell life, unless the cell is designed for fast charging. Unfortunately, the IC has a preset maximum voltage, and it can't be reduced to limit the voltage to a slightly lower value which will prolong the life of the cell. R1 allows about 2.5mA for the LED, so a high brightness type may be needed. R1 can be reduced to 470 ohms if desired.
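For what it's worth, the two operating points quoted above (10k for about 100mA, 2k for the maximum 500mA) happen to fit a simple inverse relationship. The sketch below uses that fit purely as a rough interpolation for illustration - it is my assumption, not a published formula, so always verify any real design against the datasheet graph.

```python
# Rough inverse-law fit to the two MCP73831 programming points quoted
# in the text (10 k -> ~100 mA, 2 k -> 500 mA).  An assumption for
# illustration only - check the datasheet graph for real designs.

def mcp73831_charge_ma(r_prog_ohms: float) -> float:
    """Approximate charge current (mA) for a given PROG resistor."""
    return 1_000_000 / r_prog_ohms

print(round(mcp73831_charge_ma(10_000)))  # ~100 mA (the value in Figure 3)
print(round(mcp73831_charge_ma(2_000)))   # ~500 mA (the IC's maximum)
```

As with any interpolation from two points, behaviour between and beyond them isn't guaranteed.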
For low current charging, there's probably no reason not to use an accurate 4.2V supply and a series resistor. The charge process will be fairly slow, but if limited to around 0.1C or 100mA (whichever is the smaller), a charge cycle will take around 15 hours. The resistor should be selected to provide the desired current with 1.2V across it (12 ohms for 100mA). There is little or no chance that the low current will cause any damage to the cell, and although it's a pretty crude way to charge, there's no reason that it shouldn't work perfectly well. I have tried it, and there don't seem to be any 'contra indications'.
While charging a single cell (or parallel cell battery) is fairly simple with the right IC(s), it becomes more difficult when there are two or more cells in series to create a higher voltage battery. Because the voltage across each cell must be monitored and limited, you end up with a fairly complex circuit. Again, there are plenty of options from most of the major IC manufacturers, and in many cases a dedicated microcontroller ends up being needed to manage the individual cell monitoring circuits.
There are undoubtedly products that don't provide any form of charge balancing, and these are the ones that are most likely to cause problems in use - including fire. Using lithium batteries without a proper balance charger is asking for trouble, and should not be done even in the cheapest of products. You might imagine that in a 2 cell series pack, only one cell needs to be monitored, and the other one will look after itself. This isn't the case though. If the cell that isn't monitored happens to have the lower capacity, it will charge faster than the other cell. It may reach a dangerous voltage before the monitored cell has reached its maximum.
The principle of multi-cell monitoring is simple enough in concept. It's only when you realise that fairly sophisticated and accurate circuitry has to be applied to every cell that it becomes daunting. Because cells are all at different voltages, the main controller needs level shifting circuits to each cell monitor. This may use opto-isolators or more 'conventional' level shifting circuits, but the latter are not usually suitable for high voltage battery packs.
Figure 4 - Simplified Multi-Cell Balancing Circuits
Note: The circuits shown are conceptual, and are intended to show the basic principles. They are not designed for construction, and the ICs shown in 'A' are not any particular device, as the 'real' ICs used are often controlled by a dedicated microcontroller. There's no point sending me an email asking for the device types, because they don't exist as a separate IC. The idea is only to show the basics - this isn't a project article, it's provided primarily to highlight the issues you will be faced with when dealing with LiPo series cells.
There are two classes of cell balancing circuit - active and passive (both of those shown are passive). Passive systems are comparatively simple and can work very well, but they have poor power efficiency. This is unlikely to be a problem for small packs (2-5 series cells) charged at relatively low rates (1C or less). However, it's critical for large packs as used in electric bikes or cars, because they cost a significant amount of money to charge, so inefficiency in the BMS translates to higher cost per charge and considerable wasted energy.
I'm not about to even try to show a complete circuit for multi-cell balancing, because most rely on very specialised ICs, and the end result is similar regardless of who makes the chips. The system shown in 'A' uses a control signal to the charger to reduce its current once the first cell in the pack reaches its maximum voltage. The resistor as shown can pass a maximum current of 75mA at 4.2V, and the charger must not provide more than this or the discharge circuit can't prevent an overcharge. Each resistor will only dissipate 315mW, but this adds up quickly for a very large battery pack, and that's where active balancing becomes important.
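The bleed-resistor figures quoted are easily verified, and the same arithmetic shows why passive balancing doesn't scale. The 56 ohm value below is inferred from the 75mA / 4.2V figures in the text; the function names are invented for the example.

```python
# Passive balance (bleed) resistor arithmetic.  The 56 ohm value is
# inferred from the 75 mA / 4.2 V figures quoted in the text.

def bleed_current_ma(v_cell: float, r_ohms: float) -> float:
    """Current shunted around a fully-charged cell (mA)."""
    return v_cell / r_ohms * 1000.0

def bleed_dissipation_mw(v_cell: float, r_ohms: float) -> float:
    """Power wasted in the bleed resistor (mW): V^2 / R."""
    return v_cell ** 2 / r_ohms * 1000.0

R = 56.0  # ohms (assumed from the quoted figures)
print(round(bleed_current_ma(4.2, R)))      # 75 mA
print(round(bleed_dissipation_mw(4.2, R)))  # 315 mW per cell
# For a 110-cell EV-style pack, the total wasted power (in watts):
print(bleed_dissipation_mw(4.2, R) * 110 / 1000)  # roughly 35 W
```

A third of a watt per cell is trivial for a 2-5 cell pack, but tens of watts across a large pack is real money and real heat - hence active balancing.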
The implementation is very different for the devices from the various manufacturers, and depends on the approach taken. Some are controlled by microprocessors, and provide status info to the micro to adjust the charge rate, while others are stand-alone and are often largely analogue. The arrangement shown above ('B') is simplistic, but is also quite usable as shown. The three 20k pots are adjusted to give exactly 4.2V across each regulator. When balancing is in effect (at the end-of-charge), the available current from the charger must be less than 50mA, or the shunt regulators will be unable to limit the voltage. There is an important limitation to this type of balancer - if one cell goes 'bad' (low voltage or shorted), the remaining cells will be seriously overcharged!
However (and this is important), as with many other solutions, it cannot remain connected when the battery is not charging. There is a constant drain of about 100µA on each cell, and assuming 1.8Ah cells as before, they will be completely discharged in about 2 years. While this may not seem to be an issue, if the equipment is not used for some time it's entirely possible for the cells to be discharged below the point of no return.
Quite a few balance chargers that I've tested are in the same position. They must not be left connected to the battery, so some additional circuitry is needed to ensure that the balance circuits are disconnected when there's no incoming power from the charger. One product I developed for a client needed an internal balance charger, so a relay circuit was added to disconnect the balance circuits unless the charger was powered. See Section 8 for more details on this approach.
With any 'active zener diode' system as shown above, it's vitally important that the charger's output voltage is tightly regulated, and has thermal tracking that matches the transistors' (Q1 to Q3) emitter-base voltage. It would be easy for the charger to continue providing its maximum output current, but having it all dissipated in the cell bypass circuits. It also makes it impossible to sense the actual battery current, so it probably won't turn off when it should.
Battery and/or cell protection is important to ensure that no cell is charged beyond its safe limits, and to monitor the battery upon discharge to switch off the battery if there is a fault (excess current or temperature for example), and to turn off the battery if its voltage falls below the allowable minimum. Ideally, each cell in the battery will be monitored, so that each is protected against deep discharge. For Li-Ion cells, they should not be discharged below 2.5V, and it's even better if the minimum cell voltage is limited to 3 volts. The loss of capacity resulting from the higher cutoff voltage is small, because lithium cell voltage drops very quickly when it reaches the discharge limit.
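The discharge-side rule described above is simple: the load must be disconnected if any cell (not just the pack as a whole) falls below the chosen cutoff. A minimal sketch, with the function name invented and the thresholds taken from the text:

```python
# Per-cell undervoltage check implied by the text: disconnect the load
# if ANY cell falls below the cutoff (2.5 V absolute minimum, 3.0 V is
# kinder to the cells).  Function name is invented for this example.

def discharge_allowed(cell_voltages, cutoff_v: float = 3.0) -> bool:
    """True if every cell in the pack is above the cutoff voltage."""
    return min(cell_voltages) >= cutoff_v

print(discharge_allowed([3.7, 3.6, 3.7]))        # True - all cells healthy
print(discharge_allowed([3.7, 2.9, 3.7]))        # False - weakest cell trips it
print(discharge_allowed([3.7, 2.9, 3.7], 2.5))   # True at the 2.5 V limit
```

Note that checking only the total pack voltage would miss the second case entirely - the weak cell hides behind its healthy neighbours.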
Because these circuits are usually integrated within the battery pack and permanently connected, it's important that they draw the minimum possible current. Anything that draws more than a few microamps will drain the battery - especially if it's a relatively low capacity. A 500mAh cell (or battery) will be completely discharged in 500 hours (20 days) if the circuit draws 1mA, but this extends to nearly 3 years if the current drain can be reduced to 20µA.
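The standby-drain arithmetic is simply capacity divided by quiescent current. A quick check of the figures above (function name invented for the example):

```python
# Standby-drain arithmetic: time to flatten a cell is capacity / current.
# Function name is invented for this sketch.

def drain_time_days(capacity_mah: float, drain_ua: float) -> float:
    """Days until a cell is flat at a given quiescent drain."""
    hours = capacity_mah * 1000.0 / drain_ua  # mAh -> uAh, then / uA
    return hours / 24.0

print(round(drain_time_days(500, 1000)))         # ~21 days at 1 mA (just under 3 weeks)
print(round(drain_time_days(500, 20) / 365, 1))  # ~2.9 years at 20 uA
```

This ignores the cell's own self-discharge, so real-world times will be somewhat shorter - another reason protection circuits must be miserly with current.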
Protection circuits often incorporate over-current detection, and some may disconnect permanently (e.g. by way of an internal fuse) if the battery is heavily abused. Many use 'self-resetting' thermal fuses (e.g. Polyswitch devices), or the overload is detected electronically, and the battery is turned off only for as long as the fault condition exists. There are many approaches, but it's important to know that some external events (such as a static discharge) may render the circuit(s) inoperable. Lithium batteries must be treated with care - always.
Figure 5 - SII S-8253D Application Circuit
The drawing above shows a 3-cell lithium battery protection circuit. It doesn't balance the cells, but it does detect if any cell in the pack is above the 'overcharge' threshold, and stops charging. It will also stop discharge if the voltage on any cell falls below the minimum. Switching is controlled by the external MOSFETs, and the charger must be set to the correct voltage (12.6V for the 3-cell circuit shown, assuming Li-Ion cells).
These ICs (and others from the various manufacturers) are quite common in Asian BMS boards. The datasheets are not usually very friendly though, and in some cases there is a vast amount of information supplied, but little by way of application circuits. This appears common for many of these ICs from other makers as well - it is assumed that the user has a good familiarity with battery balance circuits, which will not always be the case. The S-8253 shown has a typical current drain of 14µA in operation, and this can be reduced to almost zero if the CTL (control) input is used to disable the IC when the battery is not being used or charged. The MOSFETs will turn off the input/ output if a cell is charged or discharged beyond the limits determined by the IC.
Battery 'fuel gauges' are often no more than a gimmick, but new techniques have made the science somewhat less arbitrary than it used to be. The simplest (and least useful) is to monitor the battery voltage, because lithium batteries have a fairly flat discharge curve. This means that very small voltage changes have to be detected, and the voltage is a very unreliable indicator of the state of charge. Voltage monitoring may be acceptable for light loads over a limited temperature range. It monitors self discharge, but overall accuracy is poor.
So-called 'Coulomb counting' measures and records the charge going into the battery and the energy drawn from the battery, and calculates the probable state of charge at any given time. It's not good at providing accurate data for a battery that's deteriorated due to age, and can't account for self discharge other than by modelling. Coulomb counting systems must be initialised by a 'learning' cycle, consisting of a full charge and discharge. Variations due to temperature cannot be reliably determined.
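The principle of coulomb counting can be shown in a few lines: integrate current in and out, discounting charge by an efficiency factor (recall that you need to put in roughly 10-20% more than you get back). The class below is a deliberately minimal sketch - the 0.9 efficiency figure and all names are assumptions for illustration, and a real gauge also models self-discharge, temperature and ageing.

```python
# Minimal coulomb-counting sketch.  The class, its names and the 0.9
# charge-efficiency figure are assumptions for illustration only.

class CoulombCounter:
    def __init__(self, capacity_mah: float, efficiency: float = 0.9):
        self.capacity = capacity_mah
        self.eff = efficiency      # fraction of charge actually stored
        self.stored = 0.0          # mAh believed to be in the cell

    def update(self, current_ma: float, hours: float):
        """Positive current = charging, negative = discharging."""
        if current_ma >= 0:
            self.stored += current_ma * hours * self.eff
        else:
            self.stored += current_ma * hours
        self.stored = max(0.0, min(self.stored, self.capacity))

    def soc_percent(self) -> float:
        return 100.0 * self.stored / self.capacity

cc = CoulombCounter(1800)          # the 1,800 mAh cell from earlier
cc.update(180, 5)                  # charge at 0.1C for 5 hours
print(round(cc.soc_percent()))     # 45% - 900 mAh in, 810 mAh counted
cc.update(-360, 1)                 # discharge at 0.2C for 1 hour
print(round(cc.soc_percent()))     # 25%
```

The weaknesses mentioned in the text are visible even here: the counter's idea of 'capacity' must be learned from a full cycle, and errors accumulate with every update.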
Impedance analysis is another method, and is potentially the most accurate (at least according to Texas Instruments who make ICs that perform the analysis). By monitoring the cell's (or battery's) impedance, the state of charge can be determined regardless of age, self discharge or current temperature. TI calls their impedance analysis technique 'Impedance Track™' (IT for short), and makes some rather bold claims for its accuracy. I can't comment one way or another because I don't have a battery using it, nor do I have the facilities to run tests, but it appears promising from the info I've seen so far.
This article is about proper charge and discharge monitoring, not state-of-charge monitoring. The latter is nice for the end user, but isn't an essential part of the charge or discharge process. I have no plans to provide further info on 'fuel gauges' in general, regardless of the technology.
The 18650 cell (18mm diameter × 65mm long) has become very popular for many portable products, and these are now readily available at fairly reasonable prices. They are not all equal of course, and many on-line sellers make rather outlandish claims for capacity. Genuine 18650 cells have a typical capacity ranging from 1,500mAh (milliamp hours) up to 3,500mAh, but fakes will often grossly exaggerate the ratings. I've seen them advertised as being up to 6,000mAh, which is simply impossible. The highest I've seen is 9,900mAh, and that's even more impossible, but no-one seems to care that buyers are being misled.
The 18650 cell is the mainstay of many laptop battery packs, with a 6-cell battery being fairly common. These may be connected in a series/ parallel combination to provide twice the capacity (in mAh) at 11.1 volts. The battery enclosure contains the balancing and protection circuits, and the cells are not replaceable. This is (IMO) a shame, because it will always be cheaper to replace the cells rather than the entire sealed battery pack. However, the cells in these packs are generally of the 'tabbed' type, having metal tabs welded to the cells so they don't rely on physical contact to make the electrical connection. This means that it's not possible to make them 'user replaceable'.
One of the advantages of using separate cells is that many of the issues raised in this article can be avoided, at least to a degree. Being separate cells, they will normally be used in a plastic 'battery pack', typically wired in series. A set of four can provide ±7.4V nominal (each cell is 3.7V), and that's sufficient to operate many opamp circuits, including mic preamps, test equipment and most others as well. Recharging is easy - remove the cells from the battery pack and charge them in parallel with a designated Li-Ion charger. Provided the charger uses the correct terminal voltage (no more than 4.2V, preferably a bit less) and limits the peak charging current to suit the cells used, charging is safe, and no balancing is necessary.
As with all things, there are caveats. The circuitry being powered needs some additional circuitry to switch off the battery pack when the minimum voltage is reached. This is typically 2.5V/ cell, so the cutout needs to detect this fairly accurately and disconnect the battery when the voltage reaches the minimum. However, 'protected' cells have a small PCB inside the cell case that disconnects power if the cell is shorted, (usually) prevents over-charging, and (usually) provides an under-voltage cutout.
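The cutout behaviour can be sketched as below. The 2.5V threshold comes from the text; the reconnect threshold is my assumption, added because a practical circuit needs hysteresis - the unloaded cell voltage recovers after disconnection, and without hysteresis the cutout would chatter:

```python
# Sketch of a low-voltage cutout: disconnect at the minimum cell voltage, with
# hysteresis so the cutout doesn't chatter when the unloaded voltage recovers.
# The reconnect threshold is an assumption for illustration.

CUTOFF_V = 2.5     # per-cell disconnect threshold (from the text)
RECONNECT_V = 3.0  # assumed re-enable threshold (hysteresis)

class Cutout:
    def __init__(self):
        self.connected = True

    def update(self, cell_v):
        if self.connected and cell_v <= CUTOFF_V:
            self.connected = False
        elif not self.connected and cell_v >= RECONNECT_V:
            self.connected = True
        return self.connected

c = Cutout()
print([c.update(v) for v in (3.7, 2.5, 2.7, 3.1)])  # [True, False, False, True]
```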
There's a catch though! While they still use the same size designation (18650), many protected cells are slightly longer. Some can be up to 70mm long, and they won't fit into battery compartments that are designed for 'true' 18650 cells. Others are the correct length, but have lower capacity, because the cell itself is slightly smaller so the protection circuit will fit. These cells also differ in the positive end termination - some use a 'button' (much the same as is seen on most alkaline cells), while others have a flat top. They are often not interchangeable.
Just to confuse the issue, there are also AA sized lithium cells (14500 - 14mm diameter × 50mm long). Because they are 3.7V cells, they are not 'AA' cells, even though they are the same size. You can also buy 'dummy' AA cells, which are nothing more than a AA sized shell (with wrapping like a 'real' cell) that provides a short circuit. These are used in conjunction with Li-Ion cells in devices intended to use two or four cells. One or two Li-Ion and one or two dummy cells are used, and most devices are quite happy with the result. My 'workhorse' digital camera is fitted with a pair of AA size Li-Ion cells and a pair of dummies, and it usually only needs recharging every few weeks (or even up to a couple of months if it's not used much). There is absolutely no comparison between the Li-Ion cells and the NiMH cells I used previously.
There are several ways that more 'traditional' Li-Ion batteries can be used safely. A project I worked on a while ago used a 3S Li-Ion pack (three series cells) with a nominal voltage of 11.1V. It was installed in the case along with the electronics, so removal for charging wasn't practical. A small balance charger was installed along with the battery, with the balancing terminals connected via relays. This was necessary because the balance circuits would otherwise discharge the battery. The cost of the balance charger was such that it wouldn't be sensible to try to build one for anything like the same money. Even getting hold of the parts needed can be a challenge!
By adding the relays and balance charger to the system, it was only necessary to connect an external supply (12V) to a standard DC socket on the back, and that would activate the relays and charge the battery. The relays dropped out as soon as the external voltage source was disconnected. This turned a potentially irksome task (connecting the charger and balance connector) into something that the 'average' user could handle easily. Those using the device would normally be (decidedly) non-technical, and expecting them to mess around with fiddly connectors was not an option. A photo of the arrangement I used is shown below. The battery normally used was rated for 1,500mAh and could keep the data logging system running continuously for 24 hours. The charger could be plugged in or removed while the system was running.
Figure 6 - 3S Li-Ion Battery Charging System
The balance charger is designed specifically for 2S and 3S batteries, and cost less than $10.00 from an on-line supplier of various hobby batteries, chargers, etc. A diode is used to prevent the battery from keeping the relays activated when the charger supply is disconnected. Without the relay disconnection scheme used, the balance circuits would discharge the battery in a couple of days. The circuitry powered by the system shown had built-in voltage detection, and that was designed to turn everything off when the total supply voltage fell to around 8 volts. A fuse (½A) was included in line with the DC output as a final protection system, lest anything fail catastrophically on the powered circuitry.
In the photo you can see the balance charger board mounted above the relay and connector PCB. The LEDs were extended so they peeped out through the back plate, and the DC input connector is at the far left. The high-current leads from the battery aren't used in this application, because the current drain is so far below the maximum discharge rate. The two relays are visible on the right, and only three balance terminals are disconnected when external DC power is not present. The balance charger looks very sparse, but it has several SMD ICs and other parts on the underside of the board.
Figure 7 - 3S Li-Ion Battery Charging System Schematic
The circuit diagram shows how the system is connected. This is easy to do for anyone thinking of using a similar arrangement, and a small piece of Veroboard is easily wired with the relays and diodes. A diode is shown in parallel with the relay coils, and this is necessary to ensure that the back-EMF doesn't damage the charger circuit when the 12V input is disconnected. D1 must be able to carry the full charger input current, which for this example is less than 1A. All the complexity is in the balance charger - everything else is as simple as it can be. D1 prevents the battery voltage from being coupled back from the charger, so the relays will only be energised when external power is present. The fuse should be selected to suit the load. This circuit is only suitable for low current loads, because it doesn't use the battery's high current leads.
This is only one of many possible applications, and as described above, sometimes it's easier to use an 'off-the-shelf' charger than it is to build one from scratch. With other applications you may not have a choice, because 'better' chargers can become quite expensive and may not be suitable for reuse in the manner shown. For one-off or small production runs, using what you can get is usually more cost effective, but this changes if a large number of units is to be manufactured.
Figure 8 - Single Cell Li-Ion Charging System
Sometimes you only need a single cell, and it may be uneconomical to get a dedicated charger. This is especially true if the Li-Ion cell is low-cost, but needs to be charged safely, possibly from a solar cell array or a 5V charger. Solar cell arrays are found in all manner of budget lighting, such as 'solar' path lighting and other similar products. I have an LED 'lantern' that's regularly used when I need to delve behind my computer system or anywhere else that doesn't get much light. When the original battery died (3 x Ni-MH cells) I went for this instead. The series diode scheme is intended for use where you aren't too fussy about getting the cell to the full 4.2V, but it will reach 3.99V with 'typical' 1N4004 diodes. The main circuit just uses the diodes, with a transistor to disconnect them when the cell isn't being charged. Without D1 and Q1, the cell will be discharged to (about) 3V or so quite quickly, as the diodes will continue to conduct down to ~500mV. This is a true 'junk box' design, as it only uses parts that most people will have in stock.
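The arithmetic behind the series-diode scheme is worth spelling out: the cell settles at the supply voltage minus the diode drops, and a silicon diode's forward drop falls as the charge current tapers off. The drop figures below are typical 1N4004-style values I've assumed, and two series diodes are assumed:

```python
# Rough arithmetic for the series-diode charger: the cell ends up at the supply
# voltage minus the diode drops, and the drop falls at low current. Forward
# drop figures are typical 1N4004-style values, assumed for illustration.

V_SUPPLY = 5.0
N_DIODES = 2  # two series diodes assumed

# Approximate forward drop of one 1N4004 at a few charge currents (amps: volts):
VF_AT_CURRENT = {0.5: 0.75, 0.1: 0.65, 0.01: 0.50}

for i_chg, vf in sorted(VF_AT_CURRENT.items(), reverse=True):
    v_cell = V_SUPPLY - N_DIODES * vf
    print(f"{i_chg * 1000:.0f}mA charge: cell sits near {v_cell:.2f}V")
```

As the cell voltage rises, the charge current falls, the diode drops shrink, and the endpoint creeps up towards 4V - consistent with the ~3.99V figure quoted above. The falling current also provides a crude approximation of the constant-voltage 'tail' of a proper charge profile.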
A better scheme if you have to buy parts is to use a TL431 variable voltage reference. The trimpot (VR1) lets you set the voltage precisely, ideally to about 4.1V maximum. The transistor and D1 are still essential to disconnect the regulator when charging stops, or the cell will discharge through VR1, eventually becoming completely discharged. This will ruin the cell unless it has internal protection against over-discharge (some do, others don't). This circuit will win no prizes for accuracy, but it's cheap, and works quite well in practice.
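The TL431 regulates so that its reference pin sits at about 2.495V, and the output voltage is set by a divider: Vout = Vref × (1 + Rtop / Rbottom). The resistor values below are illustrative, not taken from the schematic - in the circuit described, VR1 provides the adjustable ratio:

```python
# TL431 output voltage from the feedback divider. VREF is the typical TL431
# reference voltage; resistor values are illustrative, not from the schematic.

VREF = 2.495  # volts, typical TL431 internal reference

def tl431_vout(r_top, r_bottom):
    """Regulated voltage for a divider of r_top (output to ref pin) over r_bottom."""
    return VREF * (1.0 + r_top / r_bottom)

# For ~4.1V the ratio must be (4.1 / 2.495) - 1, i.e. about 0.643:
print(round(tl431_vout(6430, 10000), 3))  # ~4.099
```

In practice the trimpot is simply adjusted while measuring the output with a multimeter, which also takes care of the TL431's reference tolerance.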
Lithium cells and batteries are the current 'state of the art' in storage technology. Improvements over the years have made them much safer than the early versions, and it's fair to say that IC development is one of the major advances, since there is an IC (or family of ICs) designed to monitor and control the charge process and limit the voltages applied to each cell in the battery. This process has reduced the risk of damage (and/ or fire) caused by overcharging, and has improved the life of lithium battery packs.
In reality, no battery formulation can be considered 100% safe. Ni-MH and Ni-Cd (nickel-metal hydride & nickel cadmium) cells won't burn, but they can cause massive current flow if shorted which is quite capable of igniting insulation on wires, setting PCBs on fire, etc. Cadmium is toxic, so disposal is regulated. Lead-acid batteries can (and do) explode, showering everything around them with sulphuric acid. They are also capable of huge output current, and vent a highly explosive mixture of hydrogen and oxygen if overcharged. When you need high energy density, there is no alternative to lithium, and if treated properly the risk is actually very low. Well made cells and batteries will have all the proper safeguards against catastrophic failure.
This doesn't mean that lithium batteries are always going to be safe, as has been proved by the many failures and recalls worldwide. However, one has to consider the vast number of lithium cells and batteries in use. Every modern mobile phone, laptop and tablet uses them, and they are common in many hobby model products and most new cameras - and that's just a small sample. Model aircraft use lithium batteries because they have such good energy density and low weight, and many of the latest 'fad' models (e.g. drones/ quad-copters) would be unusable without lithium based batteries. Try getting one off the ground with a lead-acid battery on board!
It's generally recommended that people avoid cheap Asian 'no-name' lithium cells and batteries. While some might be perfectly alright, you have no real redress if one burns your house to the ground. There's little hope that complaining to an online auction website will result in a financial settlement, although that can apply equally to name brand products bought from 'bricks & mortar' shops. Since most (often unread and regularly ignored) instructions state that lithium batteries should never be charged unattended, it's a difficult argument. However, when the number of lithium based batteries in use is considered, failures are actually very rare. It's unfortunate that when a failure does occur, the results can be disastrous. It probably doesn't help that the media has made a great fuss every time a lithium battery pack is shown to have a potential fault - it's apparently news-worthy.
One thing is certain - these batteries must be charged properly, with all the necessary precautions against over-voltage (full cell balancing) in place at all times. Ensure that batteries are never charged if the temperature is at or below 0°C, nor if it exceeds 35-40°C. Lithium becomes unstable at 150°C, so careful cell temperature monitoring is needed if you must charge at high temperatures, and should ideally be part of the charger. Avoid using lithium cells and batteries in ways where the case may be damaged, or where they may be exposed to high temperatures (such as full sun), as this raises the internal temperature and dramatically affects reliability, safety and battery life.
As should be apparent, a single lithium cell is fairly easy to charge. You can use a dedicated IC, but even a much simpler combination of a 4.2V regulator and a series resistor will work just fine for a basic (slow) charger. Single cell (or multiple parallel cell) chargers can be obtained quite cheaply, and those I've used work well and pose very little risk. Even so, I would never leave the house while a lithium battery or cell was on charge. I have never personally had any problems with Li-Ion batteries or cells, and I use quite a few of them for various purposes. These are apart from the most common ones - phones, tablets and laptop PCs. Li-Ion chemistry has proven to be a far more reliable option compared to Ni-MH (nickel metal-hydride), where I recently had to recycle (as in take to a recycler, not 'cycle' the cells themselves) more than half of those I had!
When you need lots of power in a small, low weight package, with the ability to recharge up to 500-1000 times, there's no better material than lithium. If they are treated with respect and not abused, you can generally expect a long and happy relationship with your cells and batteries. They're not perfect, but they most certainly beat most other chemistries by a wide margin. There's a lot to be said for LiFePO4 cells (commonly known simply as LFP, LiFePO or LiFe), because they use a more stable chemical composition and are less likely to do anything 'nasty'. However, as long as they are not abused, Li-Ion cells and batteries are capable of a safe, long and happy life.
For a battery cutout circuit that will disconnect the battery completely when the voltage falls to a preset limit, see Project 184. This was designed specifically to prevent a damaging over-discharge if battery powered equipment is accidentally left turned on after use.
|Copyright Notice. This article, including but not limited to all text and diagrams, is the intellectual property of Rod Elliott, and is Copyright © 2016. Reproduction or re-publication by any means whatsoever, whether electronic, mechanical or electro- mechanical, is strictly prohibited under International Copyright laws. The author (Rod Elliott) grants the reader the right to use this information for personal use only, and further allows that one (1) copy may be made for reference. Commercial use is prohibited without express written authorisation from Rod Elliott.|