Elliott Sound Products - Lithium Cell Charging
Copyright © 2016 - Rod Elliott (ESP)
Page Created November 2016, Published February 2017
1 - Battery Management System (BMS)
2 - Charging Profile
3 - Constant Voltage And Constant Current Power Supplies (Chargers)
4 - IC Single Cell Charging Circuit
5 - Multi-Cell Charging
6 - Battery Protection
7 - State Of Charge (SOC) Monitoring
Charging lithium batteries or cells is (theoretically) simple, but can be fraught with difficulties, as has been shown by multiple serious failures in commercial products. These include laptop computers, mobile ('cell') phones, the so-called 'hoverboards' (aka balance boards), and even aircraft. Balance boards caused a number of house fires and destroyed or damaged many properties worldwide. If the cells aren't charged properly, there is a high risk of venting (release of high pressure gases), which is often followed by fire.
Lithium is the lightest of all metallic elements, and will float on water. It is very soft, but oxidises quickly in air. Exposure to water vapour and oxygen is often enough to cause combustion, and especially so if there is heat involved (for example, from overcharging a lithium cell). Exposure to moist/ humid air causes hydrogen gas to be generated (from the water vapour), which is of course highly flammable. Lithium melts at 180°C. Most airlines insist that lithium cells and batteries be charged to no more than 30% for transport, due to the very real risk of catastrophic fire. Despite the limitations, lithium batteries are now used in nearly all new equipment because of the very high energy density and light weight.
Batteries have charge and discharge rates that are referred to as 'C' - the battery or cell capacity, in Ah or mAh (amp or milliamp hours). A battery with a capacity of 1.8Ah (1,800mAh) therefore has a 'C' rating of 1.8 amps. This means that (at least in theory) the battery can supply 180mA for 10 hours (0.1C), 1.8A for 1 hour, or 18A for 6 minutes (0.1 hour or 10C). Depending on the design, lithium batteries can supply up to 30C or more, so our hypothetical 1,800mAh battery could theoretically supply 54A for 2 minutes. Capacity may also be stated in Wh (watt hours), although this figure is usually not helpful other than in advertising brochures.
In the US and some countries elsewhere, the Wh rating is required by shipping companies so they can determine the packaging standard needed. A single 1.8Ah cell has a stored energy of 6.7Wh [ 4 ]. Alternatively, the lithium content may need to be stated. The reference also shows how this can be calculated, although any calculation made will only be an estimate unless the battery maker specifically states the lithium content. The reason for this is the risk of fire - carriers dislike having shipments catch fire, and the lithium content may dictate how the goods will be shipped. When batteries are shipped separately (not built into equipment) they must be charged to no more than 30% capacity.
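The 'C' rate and watt-hour arithmetic described above is simple enough to show as a short sketch. The figures below use the same 1.8Ah cell and 3.7V nominal voltage as the text; the function names are mine, for illustration only.

```python
# Worked example of the 'C' rate and Wh arithmetic from the text.

def current_at_c_rate(capacity_ah, c_rate):
    """Charge or discharge current (amps) for a given multiple of C."""
    return capacity_ah * c_rate

def run_time_hours(capacity_ah, current_a):
    """Theoretical run time, ignoring losses and rate effects."""
    return capacity_ah / current_a

def stored_energy_wh(capacity_ah, nominal_v=3.7):
    """Approximate stored energy, as used for shipping declarations."""
    return capacity_ah * nominal_v

capacity = 1.8  # Ah, the example cell used throughout the article
print(current_at_c_rate(capacity, 0.1))      # 0.18 (180mA, the 0.1C rate)
print(run_time_hours(capacity, 18.0))        # 0.1 hour (6 minutes) at 10C
print(round(stored_energy_wh(capacity), 1))  # 6.7 (Wh, as quoted above)
```

Note that the run time is purely theoretical - real cells deliver less capacity at high discharge rates.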
Unlike some older battery technologies, lithium batteries cannot (and should not) be left on float charge, although it may be possible if the voltage is maintained below the maximum charge voltage. For most of the common cells in use, the maximum cell voltage is 4.2V, called the 'saturation charge' voltage. The charge voltage should be maintained at this level only for long enough for the charge current to have fallen to 10% of the initial or 1C value. However, this may be subject to interpretation because the initial charge current can have a wide range, depending on the battery and the charger.
Unfortunately, while there are countless articles about lithium battery charging, there are nearly as many different suggestions, recommendations and opinions as there are articles. One of the main things that is essential when charging a lithium battery is to ensure that the voltage across each cell never exceeds the maximum allowable, and this means that each and every cell in the battery has to be monitored. There are many ICs available that have been specifically designed for lithium battery balance charging, with some systems being quite complex, but extremely comprehensive in terms of ensuring optimum performance.
While traditional lithium-ion (Li-ion) and lithium-polymer (Li-Po) cells have a nominal voltage of 3.70V, Li-iron-phosphate (LiFePO4, aka LFP - lithium ferrophosphate) is an exception, with a nominal cell voltage of 3.20V and charging to 3.65V. A relatively new addition is Li-titanate (LTO), with a nominal cell voltage of 2.40V and charging to 2.85V.
Chargers for these alternative lithium chemistry cells are not compatible with regular 3.70-volt Li-ion. Provision must be made to identify the systems and provide the correct charging voltage. A 3.70-volt lithium battery in a charger designed for LiFePO4 would not receive sufficient charge; a LiFePO4 in a regular charger would cause overcharge. Unlike many other chemistries, Li-ion cells cannot absorb an overcharge, and the specific battery chemistry must be known and charging conditions adjusted to suit.
Li-ion cells operate safely within the designated operating voltages, but the battery (or a cell within the battery) becomes unstable if inadvertently charged to a higher than specified voltage. Prolonged charging above 4.30V on a Li-ion cell designed for 4.20V will plate metallic lithium on the anode. The cathode material becomes an oxidizing agent, loses stability and produces carbon dioxide (CO2). The cell pressure rises and if the charge is allowed to continue, the current interrupt device responsible for cell safety disconnects at 1,000–1,380kPa (145–200psi). Should the pressure rise further, the safety membrane on some Li-ion cells bursts open at about 3,450kPa (500psi) and the cell may eventually vent - with flames !
Not all cells are designed to withstand high internal pressures, and will show visible bulging well before the pressure has reached anything near the values shown. This is a sure sign that the cell (or battery) is damaged, and it should not be used again. Unfortunately, many of the articles you find on-line discussing balance boards (in particular) talk about the cell quality (or lack thereof) and/or the charger quality (ditto), but neglect to mention the battery management system (BMS) discussed next.
This is one of the most critical elements of a lithium battery charger, but is rarely mentioned in most articles that discuss battery fires. In general, it's assumed (or not known to the writer) that the battery pack includes - or should include - a protection circuit to ensure that each cell is monitored and protected against overcharge. It's likely that cheap (or counterfeit) battery packs don't include a protection circuit at all, and any battery without this essential circuitry is to be avoided at all costs. The problem is that sellers will rarely disclose (or even know) if the battery has protection or not.
Unhelpfully, many sellers of batteries and chargers fail to make the distinction between battery monitoring and battery protection. These are two separate functions, and in general they are separate pieces of circuitry. Unfortunately, the term 'BMS' can mean either monitoring or protection, depending largely on the definition used by the seller, and/or their understanding of what is actually being sold.
I will use the term 'balancing' to apply to the management of the charging process, and for batteries (as opposed to single cells), it's the balancing process that ensures that each cell is closely monitored during charging to maintain the correct maximum cell voltage. Protection circuits are usually connected to the battery permanently, and are often integrated within the battery pack. These are covered further below. In some cases, protection and balancing may be provided as a complete solution, in which case it truly deserves the term 'BMS' or 'battery management system'.
For proper control of the charge process with more than a single cell, a battery balance system is absolutely essential. The balance circuits are responsible for ensuring that the voltage across any one cell never exceeds the maximum allowed, and is often integrated with the battery charger. Some have further provisions, such as monitoring the cell temperature as well. In large installations, the individual cell controllers communicate with a central 'master' controller that provides signalling to the device being powered, indicating state of charge (inasmuch as this parameter can be determined - it's less than an exact science), along with any other data that may be considered essential.
For comparatively simple batteries with from 2 to 5 series cells, giving nominal voltages from 7.4V to 18.5V respectively, cell balance isn't particularly difficult. It does become a challenge when perhaps 110 cells are connected in series, for an output of around 400V (as may be found in an electric car for example). Cells can also be connected in parallel, most commonly as a series-parallel network. Common terminology (especially for 'hobby' batteries for model airplanes and the like) will refer to a battery as being 5S (5 series cells), or 4S2P (4 series cells, each comprising 2 cells in parallel).
Operating cells in parallel is not a problem, and it's possible (though usually not recommended) for them to have different capacities. Of course, they must use the exact same chemistry. When run in series, the cells must be as close to identical as possible. As the cells age they will do so at different rates - some cells will always deteriorate faster than others. This is where the balance system becomes essential, because the cell(s) with the lowest capacity will charge (and discharge) faster than the others in the pack. The majority of balance chargers use a regulator across each cell, and that ensures that each individual cell's charge voltage never exceeds the maximum allowed.
In its simplest form, this could be done with a string of precision zener diodes, and that is actually fairly close to the systems commonly used. The voltage has to be very accurate, and ideally will be within 50mV of the desired maximum charge voltage. Although the saturation charge voltage is generally 4.2V per cell, battery life can be extended by limiting the charge voltage to perhaps 4.1 volts. Naturally, this results in slightly less energy storage.
The two major components of a BMS will be looked at separately below. These may be augmented by performance monitoring (state of charge, remaining capacity, etc.), but this article concentrates on the important bits - those that maximise both safety and battery life. So-called 'fuel gauges' are a complete topic unto themselves, and they are only covered in passing here.
The graph shows the essential elements of the charge process. Initially, the charger operates in constant current (current limit) mode, with the maximum current ideally being no more than 1C (1.8A for a 1.8Ah cell or battery). Often it will be less, and sometimes a great deal less. Charging at 0.1C (180mA) would result in a charge time of 30 hours if the full saturation charge is applied. However, when a comparatively slow charge is used (typically less than 0.2C), it is possible to terminate charging as soon as the cell(s) reach 4.2V and the saturation charge isn't necessary. For example, based on the 'new' charging algorithm, the cell shown in Figure 1 may require somewhere between 12 and 15 hours to charge at 0.1C, and the charge cycle is ended as soon as the voltage reaches 4.2 volts. This is somewhat kinder to the Li-ion cell, and voltage stress is minimised.
Figure 1 - Lithium Ion Charging Profile (1 Cell)
As is clearly shown in the graph, a fast charge means that the capacity lags the charge voltage, and 1C is fairly fast - especially for batteries designed for low consumption devices. After about 35 minutes, the voltage has (almost) reached the 4.2V maximum and charge current starts to fall, but the cell is only charged to around 65%. A slower charge rate means that the charge level is more closely aligned with the voltage. Like all batteries, you never get out quite as much as you put in, and you generally need to put in about 10-20% more ampere hours (or milliamp hours) than you will get back during discharge.
Some chargers provide a pre-conditioning charge if the cell voltage is less than 2.5 volts. This is generally a constant current of 1/10 of the nominal full constant current charge. For example, if the charge current is set for 180mA, the cell will be charged at 18mA until the cell voltage has risen to about 3V (this varies depending on the design of the charger). Most systems will never need pre-conditioning though, because the electronics will (or should!) shut down before the cell reaches a potentially damaging level of discharge.
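The pre-conditioning, constant-current and constant-voltage phases described above can be sketched as a simple decision function. The thresholds (3.0V pre-condition limit, 4.2V maximum, C/10 pre-condition current, termination at 10% of the constant current) are the figures quoted in the text; this is an illustrative sketch, not a complete charger, and real designs vary on all of these values.

```python
# Sketch of the charge-phase decisions described in the text.

def charge_phase(cell_v, charge_current_a, cc_current_a):
    """Return (phase name, charge current) for the present cell state."""
    if cell_v < 3.0:
        # Deeply discharged: 'wake-up' charge at one tenth of the CC rate
        return 'pre-condition', cc_current_a / 10
    if cell_v < 4.2:
        # Constant current (current-limit) phase
        return 'constant-current', cc_current_a
    # Saturation (constant voltage) phase: hold 4.2V until the current
    # has fallen to 10% of the CC value, then terminate the charge
    if charge_current_a > 0.1 * cc_current_a:
        return 'constant-voltage', charge_current_a
    return 'terminate', 0.0

print(charge_phase(2.4, 0.0, 1.8)[0])   # pre-condition (at ~C/10, 180mA)
print(charge_phase(3.7, 1.8, 1.8)[0])   # constant-current
print(charge_phase(4.2, 0.15, 1.8)[0])  # terminate (current below 10% of 1C)
```

As noted above, a slow charge (under about 0.2C) can simply terminate when the cell first reaches 4.2V, skipping the saturation phase entirely.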
In use, Li-ion batteries should be kept cool. Normal room temperature (between 20° and 25°C) is ideal. Leaving charged lithium batteries in cars out in the sun is ill-advised, as is any other location where the temperature is likely to be higher than 30°C. This is doubly important when the battery is being charged. When discharged, some means of cutout is required to ensure that the cell voltage (of any cell in the battery) does not fall below 2.5 volts.
It's usually better not to fully charge lithium batteries, nor allow a deep discharge. Battery life can be extended by charging to around 80-90% rather than 100%, as this all but eliminates 'voltage stress' experienced when the cell voltage reaches the full 4.2 volts. If the battery is to be stored, a charge of 30-40% is recommended, rather than a full charge. There are many recommendations, and most are ignored by most people. This is not the users' fault though - manufacturers of phones, tablets and cameras could offer an option for a reduced charge - there's plenty of processing power available to do it. This is especially important for items that don't have a user replaceable battery, because it often means that otherwise perfectly good equipment is discarded just because the battery is tired. Given the proliferation of malware for just about every operating system, it's important to ensure that battery charge settings can never be set in such a way that may cause damage.
During the initial part of the charge cycle, the charger supply should be constant current. Current regulation doesn't have to be perfect, but it does need to be within reasonable limits. We don't much care if a 1A supply actually delivers 1.1A or 0.9A, or if it varies a little depending on the voltage across the regulator. We obviously should be very concerned if it's found that the maximum current is 10A, but that simply won't happen even with a fairly crude regulator.
For a purely analogue design, the LM317 is well suited for the task of current regulation, and it's also ideal for the essential voltage regulation. This reduces the overall BOM (bill of materials), since multiple different parts aren't needed. Of course, these are both linear devices, so efficiency is poor, and they require a supply voltage that's greater than the total battery voltage by at least 5 volts, and preferably somewhat more.
As an alternative to using two LM317 ICs you can add a couple of transistors and resistors to create a current limiter. However, it doesn't work quite as well, the PCB real estate will be greater than the version shown here, and the cost saving is minimal. The circuit below does not include the facility for a 'pre-conditioning' or 'wake-up' charge before the full current is applied. This isn't essential if the battery is never allowed to discharge below 3V, and may not even be needed for a 2.5V minimum. Anything less than a discharged cell voltage of 2.5V will require a C/10 pre-conditioning charge. If you only ever charge at the C/10 rate, a lower charge rate is not needed.
Figure 2 - Constant Current / Constant Voltage Charge Circuit
The arrangement shown will limit the current to the value determined by R1. With 12 ohms, the current is 100mA (close enough - actually 104mA), set by the resistance and the LM317's internal 1.25V reference voltage. For 1A use 1.2 ohms (5W is recommended), and the value can be determined for any current needed up to the maximum 1.5A that the LM317 can provide. At higher current, the regulator will need a heatsink, especially for the initial charge phase when considerable voltage will be across U1. The diodes prevent the battery from applying reverse polarity to the regulator (U2) if the battery is connected before the DC supply is turned on. D1 should be rated for at least double the maximum current, and will ideally be a Schottky device to minimise dissipation and voltage loss.
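The LM317 current-limit arithmetic above follows from the regulator holding its internal 1.25V reference across R1, so I = 1.25 / R1. The short sketch below reproduces the figures in the text (12 ohms for ~104mA, 1.2 ohms for ~1A); the dissipation figure explains the 5W resistor recommendation.

```python
# LM317 as a current regulator: the IC maintains 1.25V across R1.

VREF = 1.25  # LM317 internal reference voltage, volts

def limit_current(r1_ohms):
    """Current limit set by R1: I = Vref / R1."""
    return VREF / r1_ohms

def resistor_for_current(i_amps):
    """R1 needed for a desired current limit."""
    return VREF / i_amps

def resistor_dissipation(r1_ohms):
    """Power in R1 at the limit: P = Vref^2 / R1 (= Vref * I)."""
    return VREF ** 2 / r1_ohms

print(round(limit_current(12.0) * 1000))    # 104 (mA, as stated in the text)
print(resistor_for_current(1.0))            # 1.25 (ohms; 1.2 used in the text)
print(round(resistor_dissipation(1.2), 2))  # 1.3 (W, hence a 5W part)
```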
This is simply the basic charger, which can be designed to fulfil the requirements described above. This is far from the full system though, as the management system and balancing circuits are missing at this stage. Each system will be different, but the basic circuit is flexible enough to accommodate most 2-4 cell battery packs. Charging can be stopped by connecting the 'Adj' pin of U1 to ground with a transistor as shown. When charging is complete, a voltage (5V is fine) is applied to the end of R3, and the current limiter is shut down. Be aware that the battery will be discharged by the combination of the balance circuits and the current passed through R4, R5 and VR1 (the latter is about 5.7mA).
A single cell charger is conceptually quite straightforward. However, when the full requirements are considered it becomes obvious that a simple current limited precision regulator as shown above isn't enough. Many IC makers have complete lithium cell chargers on a chip, with most needing nothing more than a programming resistor, a couple of bypass capacitors and an optional LED indicator. One (of many) that incorporates everything needed is the Microchip MCP73831, shown below. Most of the major IC manufacturers make specialised ICs, and the range is vast. TI (Texas Instruments) makes a range of devices designed for full BMS applications ranging from a single cell to 400V batteries used for electric vehicles. Another simple IC is the LM3622 which is available in a number of versions, depending on the end point voltage. A version is also available for a two-cell battery, but it lacks balancing circuitry.
Figure 3 - Single Cell Charger Using MCP73831 IC
Four termination voltages are available - 4.20V, 4.35V, 4.40V and 4.50V, so it's important to get the correct version for the cell you will be charging. The constant current mode is controlled by R2, which is used to 'program' the IC. Leaving pin 5 ('PROG') open circuit inhibits charging. The IC automatically stops charging when the voltage reaches the maximum set by the IC, and will supply a 'top up' charge when the cell voltage falls to around 3.95 volts. The optional LED can be used to indicate charge or end-of-charge, or both using a tri-colour LED or separate LEDs. The status output is open-circuit if the IC is shut down (due to over temperature for example) or no battery is present. Once charging is initiated, the status output goes low, and it goes high when the charge cycle is complete. Note that this IC is only available in SMD packaging, and through hole versions are not available. The same applies to most devices from other manufacturers.
The charger shown uses a linear regulator, so dissipates power when charging the cell. If the discharged cell voltage is 3V, the IC will only dissipate 300mW with a 100mA charge current. If increased to the maximum the IC can provide (500mA), the IC will dissipate 1.5W, and that means it will get very hot (it's a small SMD device after all). Should the cell voltage be less than 3V (deeply discharged due to accident or long term storage), the dissipation will be such that the IC will almost certainly shut down, as it has internal over-temperature sensing. It will cycle on and off until the voltage across the IC has fallen far enough to reduce the dissipation sufficiently to allow continuous operation. Switchmode chargers are far more efficient, but are larger and more expensive to build.
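The dissipation in any linear charger is simply the voltage across the pass element multiplied by the charge current. The sketch below assumes a 5V input (typical for USB-powered MCP73831 circuits) - the exact figures in the text will differ depending on the actual supply and cell voltage, but the trend is the point: higher current and a more deeply discharged cell both increase dissipation sharply.

```python
# Linear charger dissipation: P = (Vin - Vcell) * Icharge.
# A 5V input supply is assumed for these examples.

def linear_dissipation_w(vin, vcell, i_charge):
    """Power dissipated in the pass element of a linear charger."""
    return (vin - vcell) * i_charge

print(linear_dissipation_w(5.0, 3.0, 0.1))  # 0.2W at 100mA, 3V cell
print(linear_dissipation_w(5.0, 3.0, 0.5))  # 1.0W at 500mA - getting hot
print(linear_dissipation_w(5.0, 2.0, 0.5))  # 1.5W with a deeply discharged cell
```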
Some controllers include temperature sensing, or have provision for a thermistor to monitor the cell temperature. ICs such as the LTC4050 will only charge when the temperature is between 0°C and 50°C when used with the NTC (negative temperature coefficient) thermistor specified. Others can be designed to be mounted so that the IC itself monitors the temperature. These are intended to be installed with the IC in direct thermal contact with the cell. The series pass transistor must be external to the IC to ensure that its dissipation doesn't affect the die temperature of the IC.
The current programming resistor is set for 10k in the above drawing, and that sets the charge current to about 100mA. The datasheet for the IC has a graph that shows charge current versus programming resistor, and the relationship is approximately inverse - roughly 1,000 divided by the resistance in ohms gives the current in amps. A 2k resistor gives the maximum rated charging current of 500mA. As discussed earlier, a slow charge is probably the best option for maximum cell life, unless the cell is designed for fast charging. Unfortunately, the IC has a preset maximum voltage, and it can't be reduced to limit the voltage to a slightly lower value which would prolong the life of the cell. R1 allows about 2.5mA for the LED, so a high brightness type may be needed. R1 can be reduced to 470 ohms if desired.
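From my reading of the MCP73831 datasheet, the regulated charge current follows the programming resistor closely (approximately I = 1,000 / R_PROG), which reproduces both values used in the text. Treat this as an approximation and verify against the datasheet graph for the exact part.

```python
# Approximate MCP73831 charge current from the programming resistor.

def charge_current_a(r_prog_ohms):
    """Regulated charge current in amps: I ~= 1000 / R_PROG (ohms)."""
    return 1000.0 / r_prog_ohms

print(charge_current_a(10_000))  # 0.1 (100mA, as in Figure 3)
print(charge_current_a(2_000))   # 0.5 (500mA, the maximum rating)
```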
While charging a single cell is fairly simple (with the right IC), it becomes more difficult when there are two or more cells in series to create a battery. Because the voltage across each cell must be monitored and limited, you end up with a fairly complex circuit. Again, there are plenty of options from most of the major IC manufacturers, and in many cases a dedicated microcontroller ends up being needed to manage the individual cell monitoring circuits.
There are undoubtedly products that don't provide any form of charge balancing, and these are the ones that are most likely to cause problems in use - including fire. Using lithium batteries without a proper balance charger is asking for trouble, and should not be done even in the cheapest of products. You might imagine that in a 2 cell pack, only one cell needs to be monitored, and the other one will look after itself. This isn't the case though. If one cell isn't monitored and it happens to have the lowest capacity, it will charge faster than the other cell, and may reach a dangerous voltage before the monitored cell has reached its maximum.
The principle of multi-cell monitoring is simple enough in concept. It's only when you realise that fairly sophisticated circuitry has to be applied to every cell that it becomes daunting. Because cells are all at different voltages, the main controller needs level shifting circuits to each cell monitor. This may use opto-isolators or more 'conventional' level shifting circuits, but the latter are not usually suitable for high voltage battery packs.
Figure 4 - Simplified Multi-Cell Balancing Circuits
There are two classes of cell balancing circuit - active and passive (both of those shown are passive). Passive systems are comparatively simple and can work very well, but they have poor power efficiency. This is unlikely to be a problem for small packs (2-5 series cells) charged at relatively low rates (1C or less). However, it's critical for large packs as used in electric bikes or cars, because they cost a significant amount of money to charge, so inefficiency in the BMS translates to higher cost per charge and considerable wasted energy.
I'm not about to even try to show a complete circuit for multi-cell balancing, because most rely on very specialised ICs, and the end result is similar regardless of who makes the chips. The system shown in 'A' uses a control signal to the charger to reduce its current once the first cell in the pack reaches its maximum voltage. The resistor as shown can pass a maximum current of 75mA at 4.2V, and the charger must not provide more than this or the discharge circuit can't prevent an over charge. Each resistor will only dissipate 315mW, but this adds up quickly for a very large battery pack, and that's where active balancing becomes important.
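The passive balancer arithmetic is straightforward Ohm's law. A 56 ohm bypass resistor is assumed below, because that value reproduces the 75mA and 315mW figures quoted above for a 4.2V cell; the actual value in circuit 'A' depends on the design.

```python
# Passive ('resistor bleed') balancer arithmetic, assuming 56 ohm resistors.

def bypass_current_a(cell_v, r_ohms=56.0):
    """Current the bypass resistor diverts around a full cell."""
    return cell_v / r_ohms

def bypass_dissipation_w(cell_v, r_ohms=56.0):
    """Power wasted in each bypass resistor: P = V^2 / R."""
    return cell_v ** 2 / r_ohms

print(round(bypass_current_a(4.2) * 1000))      # 75 (mA per cell)
print(round(bypass_dissipation_w(4.2) * 1000))  # 315 (mW per cell)
```

For a 2-5 cell pack this waste is trivial; multiply it by 110 cells in an electric vehicle pack and the case for active balancing becomes obvious.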
The implementation is very different for the devices from the various manufacturers, and depends on the approach taken. Some are controlled by microprocessors, and provide status info to the micro to adjust the charge rate, while others are stand-alone and are often largely analogue. The arrangement shown above ('B') is simplistic, but is also quite usable as shown.
However (and this is important), as with many other solutions, it cannot remain connected when the battery is not charging. There is a constant drain of about 100µA on each cell, and assuming 1.8Ah cells as before, they will be completely discharged in about 2 years. While this may not seem to be an issue, if the equipment is not used for some time it's entirely possible for the cells to be discharged below the point of no return.
Quite a few balance chargers that I've tested are in the same position. They must not be left connected to the battery, so some additional circuitry is needed to ensure that the balance circuits are disconnected when there's no incoming power from the charger. One product I developed for a client needed an internal balance charger, so a relay circuit was added to disconnect the balance circuits unless the charger was powered.
With any 'active zener diode' system as shown above, it's vitally important that the charger's output voltage is tightly regulated, and has thermal tracking that matches the transistors' (Q1 to Q3) emitter-base voltage. It would be easy for the charger to continue providing its maximum output current, but having it all dissipated in the cell bypass circuits. It also makes it impossible to sense the actual battery current, so it probably won't turn off when it should.
Battery and/or cell protection is important to ensure that no cell is charged beyond its safe limits, and to monitor the battery on discharge, switching it off if there is a fault (excess current or temperature for example) or if its voltage falls below the allowable minimum. Ideally, each cell in the battery will be monitored, so that each is protected against deep discharge. Li-ion cells should not be discharged below 2.5V, and it's even better if the minimum cell voltage is limited to 3 volts. The loss of capacity resulting from the higher cutoff voltage is small, because the lithium cell voltage drops very quickly once it reaches the discharge limit.
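The core protection decision can be sketched as a simple per-cell window check. The 4.2V charge limit and 2.5V discharge cutoff are the figures from the text; real protection ICs add hysteresis, detection delays and over-current sensing that are not shown here.

```python
# Sketch of per-cell protection decisions for a series pack.

def protection_state(cell_voltages, v_max=4.2, v_min=2.5):
    """Return (charge_allowed, discharge_allowed) for the whole pack.

    Any single cell outside its window inhibits the relevant direction -
    this is why every cell must be monitored, not just the pack voltage.
    """
    charge_ok = all(v < v_max for v in cell_voltages)
    discharge_ok = all(v > v_min for v in cell_voltages)
    return charge_ok, discharge_ok

print(protection_state([3.9, 3.8, 4.2]))  # (False, True) - stop charging
print(protection_state([3.1, 2.4, 3.3]))  # (True, False) - stop discharging
```

In the Figure 5 circuit, these two outputs correspond to the gates of the two external MOSFETs.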
Because these circuits are usually integrated within the battery pack and permanently connected, it's important that they draw the minimum possible current. Anything that draws more than a few microamps will drain the battery - especially if it's a relatively low capacity. A 500mAh cell (or battery) will be completely discharged in 500 hours (about 20 days) if the circuit draws 1mA, but this extends to nearly 3 years if the current drain can be reduced to 20µA.
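The standby drain figures above (and the 100µA / 1.8Ah case from the previous section) all come from the same division:

```python
# Time for a quiescent drain to flatten a cell completely.

def discharge_time_days(capacity_mah, drain_ma):
    """Days until a cell is fully discharged by a constant drain."""
    return capacity_mah / drain_ma / 24.0

print(round(discharge_time_days(500, 1.0)))             # 21 (days at 1mA)
print(round(discharge_time_days(500, 0.02) / 365, 1))   # 2.9 (years at 20uA)
print(round(discharge_time_days(1800, 0.1) / 365, 1))   # 2.1 (years at 100uA)
```

In practice the cell will reach a damaging over-discharge voltage well before it is 'completely' flat, so the safe storage time is considerably shorter than these figures suggest.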
Protection circuits often incorporate over-current detection, and some may disconnect permanently (e.g. by way of an internal fuse) if the battery is heavily abused. Many use 'self-resetting' thermal fuses (e.g. Polyswitch devices), or the overload is detected electronically, and the battery is turned off only for as long as the fault condition exists. There are many approaches, but it's important to know that some external events (such as a static discharge) may render the circuit(s) inoperable. Lithium batteries must be treated with care - always.
Figure 5 - SII S-8253D Application Circuit
The drawing above shows a 3-cell lithium battery protection circuit. It doesn't balance the cells, but it does detect if any cell in the pack is above the 'overcharge' threshold, and stops charging. It will also stop discharge if the voltage on any cell falls below the minimum. Switching is controlled by the external MOSFETs, and the charger must be set to the correct voltage (12.6V for the 3-cell circuit shown, assuming Li-ion cells).
These ICs (and others from the various manufacturers) are quite common in Asian BMS boards. The datasheets are not usually very friendly though, and in some cases there is a vast amount of information supplied, but little by way of application circuits. This appears common for many of these ICs from other makers as well - it is assumed that the user has a good familiarity with battery balance circuits, which will not always be the case. The S-8253 shown has a typical current drain of 14µA in operation, and this can be reduced to almost zero if the CTL (control) input is used to disable the IC when the battery is not being used or charged. The MOSFETs will turn off the input/ output if a cell is charged or discharged beyond the limits determined by the IC.
Battery 'fuel gauges' are often no more than a gimmick, but new techniques have made the science somewhat less arbitrary than it used to be. The simplest (and least useful) method is to monitor the battery voltage, but because lithium batteries have a fairly flat discharge curve, very small voltage changes have to be detected, and the voltage is a very unreliable indicator of the state of charge. Voltage monitoring may be acceptable for light loads over a limited temperature range, and it does at least reflect self discharge, but overall accuracy is poor.
So-called 'Coulomb counting' measures and records the charge going into the battery and the energy drawn from the battery, and calculates the probable state of charge at any given time. It's not good at providing accurate data for a battery that's deteriorated due to age, and can't account for self discharge other than by modelling. Coulomb counting systems must be initialised by a 'learning' cycle, consisting of a full charge and discharge. Variations due to temperature cannot be reliably determined.
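The principle of Coulomb counting is just integration of current over time. The minimal sketch below illustrates the idea only - real gauges also model charge efficiency, self discharge, ageing and temperature, none of which is attempted here, and they must be initialised by the 'learning' cycle described above (the example simply assumes a full, learned battery).

```python
# Minimal Coulomb counter: integrate charge in and out, report SOC.

class CoulombCounter:
    def __init__(self, capacity_mah):
        self.capacity_mah = capacity_mah
        self.charge_mah = capacity_mah  # assume a full, 'learned' battery

    def update(self, current_ma, hours):
        """Accumulate charge; positive = charging, negative = discharging."""
        self.charge_mah += current_ma * hours
        # Clamp to the physical limits of the cell
        self.charge_mah = max(0.0, min(self.capacity_mah, self.charge_mah))

    def soc_percent(self):
        """Estimated state of charge as a percentage of capacity."""
        return 100.0 * self.charge_mah / self.capacity_mah

gauge = CoulombCounter(1800)          # the 1.8Ah example cell
gauge.update(-900, 1.0)               # discharge at 0.5C for one hour
print(gauge.soc_percent())            # 50.0
gauge.update(180, 2.5)                # recharge at 0.1C for 2.5 hours
print(gauge.soc_percent())            # 75.0
```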
Impedance analysis is another method, and is potentially the most accurate (at least according to Texas Instruments who make ICs that perform the analysis). By monitoring the cell's (or battery's) impedance, the state of charge can be determined regardless of age, self discharge or current temperature. TI calls their impedance analysis technique 'Impedance Track™' (IT for short), and makes some rather bold claims for its accuracy. I can't comment one way or another because I don't have a battery using it, nor do I have the facilities to run tests, but it appears promising from the info I've seen so far.
This article is about proper charge and discharge monitoring, not state-of-charge monitoring. The latter is nice for the end user, but isn't an essential part of the charge or discharge process. I have no plans to provide further info on 'fuel gauges' in general, regardless of the technology.
Lithium cells are the current 'state of the art' in battery technology. Improvements over the years have made them much safer than the early versions, and it's fair to say that IC development is one of the major advances, since there is an IC (or family of ICs) designed to monitor and control the charge process and limit the voltages applied to each cell in the battery. This has reduced the risk of damage (and/ or fire) caused by overcharging, and has improved the life of lithium battery packs.
In reality, no battery formulation can be considered 100% safe. Ni-Mh and Ni-Cd (nickel-metal hydride & nickel cadmium) cells won't burn, but they can cause massive current flow if shorted which is quite capable of igniting insulation on wires, setting PCBs on fire, etc. Cadmium is toxic, so disposal is regulated. Lead-acid batteries can (and do) explode, showering everything around them with sulphuric acid. They are also capable of huge output current, and vent a highly explosive mixture of hydrogen and oxygen if overcharged. When you need high energy density, there is no alternative to lithium, and if treated properly the risk is actually very low. Well made cells and batteries will have all the proper safeguards against catastrophic failure.
This doesn't mean that lithium batteries are always going to be safe, as has been proved by the many failures and recalls worldwide. However, one has to consider the vast number of lithium cells and batteries in use. Every modern mobile phone, laptop and tablet uses them, and they are common in many hobby model products and most new cameras - and that's just a small sample. Model aircraft use lithium batteries because they have such good energy density and low weight, and many of the latest 'fad' models (e.g. drones/ quad-copters) would be unusable without lithium based batteries. Try getting one off the ground with a lead-acid battery on board!
It's generally recommended that people avoid cheap Asian 'no-name' lithium cells and batteries. While some might be perfectly alright, you have no real redress if one burns your house to the ground. There's little hope that complaining to an online auction website will result in a financial settlement, although that can apply equally to name brand products bought from 'bricks & mortar' shops. Since most (often unread and regularly ignored) instructions state that lithium batteries should never be charged unattended, it's a difficult argument. However, when the number of lithium based batteries in use is considered, failures are actually very rare. It's unfortunate that when a failure does occur, the results can be disastrous. It probably doesn't help that the media has made a great fuss every time a lithium battery pack is shown to have a potential fault - it's apparently news-worthy.
One thing is certain - these batteries must be charged properly, with all the necessary precautions against over-voltage (full cell balancing) in place at all times. Ensure that batteries are never charged if the temperature is at or below 0°C, nor if it exceeds 35-40°C. Lithium becomes unstable at 150°C, so careful cell temperature monitoring is needed if you must charge at high temperatures, and should ideally be part of the charger. Avoid using lithium cells and batteries in ways where the case may be damaged, or where they may be exposed to high temperatures (such as full sun), as this raises the internal temperature and dramatically affects reliability, safety and battery life.
When you need lots of power in a small, low weight package, with the ability to recharge up to 500 times, there's no better material than lithium. If they are treated with respect and not abused, you can generally expect a long and happy relationship with your batteries. They're not perfect, but they most certainly beat most other chemistries by a good margin.
Copyright Notice. This article, including but not limited to all text and diagrams, is the intellectual property of Rod Elliott, and is Copyright © 2016. Reproduction or re-publication by any means whatsoever, whether electronic, mechanical or electro-mechanical, is strictly prohibited under International Copyright laws. The author (Rod Elliott) grants the reader the right to use this information for personal use only, and further allows that one (1) copy may be made for reference. Commercial use is prohibited without express written authorisation from Rod Elliott.