Calculating the perfect match for battery and application

The choice of battery for an application is critical. It affects the bill of materials and product lifetime cost, says Björn Rosqvist.


Consider a battery-powered remote sensor. The cost of its battery is usually small compared with the overall product cost, but then think about the cost of replacing that battery. For wireless sensors deployed in remote areas, used for anything from asset tracking to animal tracking, the cost of locating and servicing that sensor by replacing its battery may be many times its cost. This makes choosing the optimum battery for an application a critical decision in terms of battery life, product lifetime cost and perhaps the price that a sensor commands in the market.

Even where battery specifications appear to be similar between manufacturers, substantial differences in performance can be experienced due to differences in the quality of materials used in their construction.


Longer-life products are more attractive to customers, but how can one compare batteries and evaluate battery performance for an application? How can one ensure that a battery has the optimum chemistry and capacity to power a product and that it will prove to be reliable in an application? Getting this right minimises the number of times, if any, that a change of battery will be needed.


Selection factors

Battery charge capacity for smaller batteries such as AA types or coin cells is specified in milliamp hours (mAh). An alkaline or NiMH AA battery will usually have a rated capacity of 2,000mAh to 3,000mAh. For a CR2032 – the most popular size of coin cell – that figure is normally in the range of 200-250mAh.
Predicting battery performance in an application is complicated by the varied characteristics of different battery chemistries. For example, the internal resistance of an AA alkaline battery is typically somewhere between 0.1Ω and 0.9Ω, but for a CR2032 it can be anywhere between 15Ω and 20Ω. In both types these figures rise as the battery ages and they vary significantly, and to differing degrees, with changes in ambient temperature (Figure 1).
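The internal resistance figures above translate directly into how far the terminal voltage sags under load. A minimal sketch, using the resistance ranges quoted in the text and an assumed 10mA transmit burst:

```python
# Sketch: effect of internal resistance on terminal voltage under load.
# Resistance figures are the typical ranges quoted above; the 10 mA
# transmit burst is an illustrative assumption.

def terminal_voltage(v_open_circuit, r_internal_ohms, load_current_a):
    """Terminal voltage = open-circuit voltage minus the drop across
    the battery's internal resistance (V = V_oc - I * R_int)."""
    return v_open_circuit - load_current_a * r_internal_ohms

burst = 0.010  # 10 mA transmit burst (assumed)

# AA alkaline: ~1.5 V open circuit, worst-case 0.9 ohm internal resistance
print(terminal_voltage(1.5, 0.9, burst))   # sags by only 9 mV

# CR2032 coin cell: ~3.0 V open circuit, 20 ohm internal resistance
print(terminal_voltage(3.0, 20.0, burst))  # sags by 200 mV
```

The same 10mA burst that barely registers on an AA cell pulls a coin cell 200mV closer to a microcontroller's brown-out threshold, and the effect worsens as the cell ages.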

Figure 2: Measuring both energy and power consumption simultaneously over time enables engineers to match batteries to applications [Graphic: Courtesy of Qoitech]

Environment considerations

Consider a battery connected to a dc-dc converter. When current is drawn, the battery voltage drops a little due to the battery's internal resistance, and rises again as the demand for current falls. As the battery voltage declines over time with its shrinking capacity, the converter must draw more current from the battery to deliver the same output power, accelerating the rate of discharge: an avalanche effect.
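This avalanche effect can be sketched with an idealised constant-power converter model; the 30mW load and 90% efficiency below are illustrative assumptions:

```python
# Sketch of the "avalanche" effect: a dc-dc converter regulating a fixed
# output power draws more input current as the battery voltage sags.
# The 30 mW load and 90 % efficiency are illustrative assumptions.

def input_current(battery_v, load_power_w=0.030, efficiency=0.90):
    """Input current of an ideal constant-power converter: I = P / (eta * V)."""
    return load_power_w / (efficiency * battery_v)

for v in (3.0, 2.5, 2.0):
    print(f"{v:.1f} V -> {input_current(v) * 1000:.1f} mA")
# As the battery discharges from 3.0 V to 2.0 V, the input current rises
# from ~11 mA to ~17 mA, which discharges the battery faster still.
```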

In embedded designs wireless microcontrollers connected to sensors place varying demands on batteries, perhaps drawing 10mA when transmitting data, but only a fraction of a µA in standby mode. This change happens in microseconds. The duty cycle will depend on the application. Utility meters may be required to deliver data far less frequently than asset trackers, for example, so their duty cycle will be much lower.
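A first-order life estimate follows from the duty cycle. The sketch below uses the current figures from the text (10mA transmitting, a fraction of a µA in standby); the burst length and transmission interval are illustrative assumptions:

```python
# Sketch: average current and a naive battery-life estimate from a duty cycle.
# Current figures come from the text (10 mA transmit, sub-microamp standby);
# the burst duration and transmit interval are illustrative assumptions.

I_TX_A = 0.010       # 10 mA while transmitting
I_SLEEP_A = 0.5e-6   # 0.5 uA standby (assumed "fraction of a uA")
T_TX_S = 0.005       # 5 ms transmit burst (assumed)
PERIOD_S = 60.0      # one transmission per minute (assumed)

duty = T_TX_S / PERIOD_S
i_avg_a = duty * I_TX_A + (1 - duty) * I_SLEEP_A

capacity_mah = 225.0  # mid-range CR2032 figure from the text
life_hours = capacity_mah / (i_avg_a * 1000)
print(f"average current: {i_avg_a * 1e6:.2f} uA, "
      f"naive life: {life_hours / 24 / 365:.1f} years")
```

A utility meter transmitting once an hour instead of once a minute would have a far lower duty cycle, with standby current dominating the average. As discussed below, charge-only estimates like this tend to be optimistic.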

Matching the type and capacity of the battery to the energy demands of the application maximises its operating life.

Misleading measurements

Measuring a battery’s charge capacity requires only the current to be measured, but for an accurate assessment of battery performance in an application it is also necessary to measure its energy capacity in watt hours (Wh), joules (J), or calories. This defines the amount of work the battery can do, and requires simultaneous measurement of voltage and current, even when peak consumption happens in a series of bursts, perhaps lasting just a few microseconds.
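Energy is the time integral of instantaneous power, which is why voltage and current must be sampled together. A minimal sketch with illustrative sample data:

```python
# Sketch: energy from simultaneous voltage/current samples.
# Energy is the time integral of instantaneous power: E = integral of v*i dt.
# The two-sample data below is illustrative.

def energy_joules(volts, amps, dt_s):
    """Trapezoidal integration of p(t) = v(t) * i(t) over evenly spaced samples."""
    powers = [v * i for v, i in zip(volts, amps)]
    return sum((powers[k] + powers[k + 1]) / 2 * dt_s
               for k in range(len(powers) - 1))

# Two samples 1 s apart at a steady 3.0 V / 10 mA: 30 mW for 1 s = 0.03 J
print(energy_joules([3.0, 3.0], [0.010, 0.010], 1.0))
# For reference, 1 Wh = 3,600 J.
```

In practice the sample rate must be high enough to resolve microsecond-scale bursts, or their contribution is averaged away.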

Measurements can be taken with conventional multimeters, but it is not possible to determine the pattern of energy consumption over time, or the precise impact on energy consumption when modifying hardware or software in the system. This makes it difficult to optimise system design for the lowest energy consumption or to determine the most appropriate battery for the task. Battery life is invariably over-estimated when relying on average charge capacity as the sole measurement criterion.

Dedicated measurement tools have been developed to gain a more holistic picture of what’s happening. These connect to the application hardware – the device under test – to measure voltage and current simultaneously, using a high sample rate to capture and record the short-duration bursts in energy demand where these occur. As the application is run, the tool measures battery voltage, battery internal resistance and environmental conditions. It then plots the discharge characteristics of different batteries and predicts their real-world performance, including operating life (Figure 2).

Figure 1: The discharge characteristics of battery types are differently affected by changes in environmental conditions [Graphic: Courtesy of Qoitech]

Power analysers that perform these tasks are not new, but have often been beyond the budgets of design departments. However, as with so many other types of test equipment, there have been price reductions in recent years as the cost of processor performance has fallen. As a result, power analysers with the ability to accurately profile battery performance in specific applications are now within economic reach of most design departments.

About The Author

Björn Rosqvist is chief product officer at Qoitech
