12 Metrics to Track When Testing Li-Po Battery Samples from New Suppliers

Receiving that first box of prototype battery samples from a new supplier is a critical moment in any product development cycle. For the procurement team, it represents the potential for cost savings or a more resilient supply chain. But for the R&D engineering team, those samples represent risk. We have seen countless projects delayed by months because a client rushed the validation phase, assuming that if the battery turned on their device, it was “good to go.”

At Hanery, we don’t just build batteries; we validate them against the most demanding industrial and medical specifications. We know that a datasheet is simply a marketing document until its claims are verified by rigorous, empirical testing. A battery that looks perfect on paper can hide fatal flaws: cells with mismatched internal resistance, a BMS that trips prematurely under dynamic loads, or a thermal profile that will cook your product from the inside out.

This guide is written for the engineers tasked with qualifying those new samples. We are sharing our internal testing playbook—the exact metrics we track when qualifying a new cell chemistry or validating a custom pack design. This is not a superficial “charge and discharge” test. This is a comprehensive, 12-point analytical framework designed to stress-test the supplier’s engineering, uncover hidden weaknesses, and provide the hard data you need to confidently approve a new power source for mass production.

1. What is the Actual vs. Rated Capacity (at Your Specific C-Rate)?

This is the most fundamental test, but it is often performed incorrectly. A supplier will rate their battery’s capacity (e.g., 5000mAh) based on a very slow, gentle discharge rate (typically 0.2C). However, your device likely draws power much faster than that.

Testing Under Real-World Load Conditions

You must test the capacity at the maximum continuous discharge rate your product will actually use. If your device draws 1C (5 Amps), test the battery at 1C.

The Capacity / C-Rate Disconnect

Chart: Capacity Ratings at Low C-Rate Do Not Reflect Real-World Performance

  • Supplier rating (0.2C): 5000 mAh
  • Measured @ 1C: 4800 mAh
  • Measured @ 3C: 4200 mAh

Capacity drops at higher rates because of internal resistance (IR drop), electrochemical polarization, and ion diffusion limitations. A "5000mAh" battery may deliver only ~4200mAh under real operating load conditions. High-performance applications require low internal resistance and a cell design optimized for high discharge rates.

If the measured capacity at your operating C-rate falls significantly below the rated capacity, the supplier has likely used lower-grade “energy” cells instead of the “power” cells your application requires. This will lead to disappointingly short runtimes in the field.
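Delivered capacity at your operating C-rate is just the time integral of the logged discharge current. A minimal Python sketch of that calculation (the log format here is an assumption; adapt it to whatever your electronic load exports):

```python
def measured_capacity_mah(samples):
    """Integrate a discharge log into delivered capacity (mAh).

    samples: list of (time_s, current_a) tuples logged by the
    electronic load, in chronological order.
    """
    mah = 0.0
    for (t0, i0), (t1, i1) in zip(samples, samples[1:]):
        # Trapezoidal integration: average current * elapsed hours * 1000
        mah += (i0 + i1) / 2 * (t1 - t0) / 3600 * 1000
    return mah

# Example: a constant 5 A (1C) discharge lasting 3456 s delivers
# 5 A * 0.96 h = 4800 mAh, i.e. 96% of the 5000 mAh rating.
log = [(t, 5.0) for t in range(0, 3457)]
print(round(measured_capacity_mah(log)))  # -> 4800
```

Run the same integration at each C-rate you test and compare the results directly against the supplier's rated figure.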

2. What is the AC Internal Resistance (ACR) and DC Internal Resistance (DCIR)?

Internal resistance is the defining metric of a battery’s health and quality. It dictates how efficiently the battery can deliver power and how much waste heat it will generate. You must measure both AC and DC resistance, as they tell you different things.

ACR: The Manufacturing Quality Baseline

ACR (measured at 1kHz) is primarily an indicator of the cell’s manufacturing quality and the integrity of the internal chemical structure. We measure this on every cell that enters our factory. You should measure the ACR of the sample packs as soon as you receive them. High ACR out of the box indicates poor manufacturing or degraded cells. More importantly, measure the ACR across multiple samples. If Sample A is 15mΩ and Sample B is 30mΩ, the supplier has terrible batch consistency, which is a massive red flag for mass production.

DCIR: The Real-World Power Delivery Metric

DCIR is a better indicator of how the battery will actually perform under a real load. It is calculated by applying a sudden current pulse and measuring the immediate voltage drop (R=ΔV/ΔI). High DCIR means the battery will suffer from severe voltage sag when your device demands power.
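The ΔV/ΔI arithmetic is simple enough to script alongside your data logging. A minimal sketch, where the loaded reading is assumed to be taken shortly after the pulse begins, before relaxation effects dominate:

```python
def dcir_mohm(v_rest, v_loaded, i_rest, i_pulse):
    """DC internal resistance from a current pulse, in milliohms.

    v_rest/i_rest: voltage (V) and current (A) just before the pulse.
    v_loaded/i_pulse: voltage and current measured during the pulse.
    """
    delta_v = v_rest - v_loaded
    delta_i = i_pulse - i_rest
    return delta_v / delta_i * 1000

# Example: cell rests at 3.85 V with no load, sags to 3.70 V under a
# 5 A pulse -> 0.15 V / 5 A = 30 mOhm
print(round(dcir_mohm(3.85, 3.70, 0.0, 5.0), 1))  # -> 30.0
```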

3. How Severe is the Voltage Sag Under Peak Load?

Building on DCIR, you must visually map the battery’s voltage response to your device’s specific power profile.

Simulating Your Device's Power Profile

Don’t just use a static electronic load. Use a programmable load to simulate the exact current profile of your device. If your device has a motor that pulls 10A for 2 seconds on startup, and then settles to 2A, program that profile.

Observe the voltage curve during that 10A peak. If the voltage sags close to your device’s minimum operating voltage (or the BMS’s low-voltage cut-off), you have zero engineering margin. A battery that sags this deeply when new will almost certainly cause unexpected shutdowns as it ages and its internal resistance naturally increases.
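A simple way to quantify that margin is to define the device's current profile as data and check the logged pack voltage against the cut-off. The profile values and helper below are illustrative, not a specific device's numbers:

```python
# Hypothetical device profile: (duration_s, current_a) steps to
# program into the electronic load.
MOTOR_STARTUP_PROFILE = [
    (2.0, 10.0),   # motor inrush: 10 A for 2 s
    (58.0, 2.0),   # steady running: 2 A
]

def min_margin_v(voltage_log, cutoff_v):
    """Smallest headroom between logged pack voltage and the
    low-voltage cut-off; a value <= 0 means a brown-out."""
    return min(voltage_log) - cutoff_v

# Example: a 2S pack sagged to 6.1 V during the 10 A peak, against a
# 6.0 V cut-off -> only ~0.1 V of margin, far too little for an
# aging battery whose resistance will rise.
print(round(min_margin_v([7.4, 6.1, 7.0], 6.0), 2))  # -> 0.1
```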

4. What is the Thermal Profile During Maximum Discharge?

Heat is the enemy. A battery that runs hot is wasting energy and killing its own cycle life.

Mapping Heat Generation

While running the maximum continuous discharge test, you must monitor the pack’s temperature. Do not just rely on the BMS’s internal thermistor (which only measures one point).

  • Attach external thermocouples to the surface of the pack, particularly near the cell tabs and the BMS MOSFETs.
  • Use a thermal imaging camera to identify localized hot spots.

If the pack’s surface temperature rises excessively (e.g., above 45°C or 50°C) during your normal operating profile in a room-temperature lab, it will overheat and fail when enclosed inside your product on a hot summer day.

5. Are the Cells Perfectly Balanced (in a Multi-Cell Pack)?

If your sample is a multi-cell pack (e.g., 2S, 3S, 4S), the balance of those cells is critical. If one cell is out of balance, the entire pack’s capacity is constrained by that single weak link.

Verifying the Supplier's Cell Matching Process

Before testing, measure the voltage of each individual cell (via the balance connector, if available). They should be nearly identical (within 10-20 millivolts).

After a full discharge test, measure them again. If the voltages have drifted significantly apart, it proves the supplier did not properly grade and match the cells before assembly. This pack will have a very short cycle life.
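The pass/fail check here reduces to the max-min voltage spread across cells. A small sketch with illustrative numbers:

```python
def cell_spread_mv(cell_voltages):
    """Max-min voltage spread across the cells, in millivolts."""
    return (max(cell_voltages) - min(cell_voltages)) * 1000

# Fresh 3S pack: spread should be within ~10-20 mV
fresh = [4.198, 4.201, 4.199]
# After a full discharge: a large spread means poor cell matching
drained = [3.31, 3.02, 3.28]

print(round(cell_spread_mv(fresh)))    # -> 3 (mV): well matched
print(round(cell_spread_mv(drained)))  # -> 290 (mV): reject
```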

6. Do the BMS Protection Thresholds Match the Specification?

You cannot trust the datasheet; you must force the BMS to act and verify its trip points. The BMS is your safety net, and you must prove it catches you.

Testing the Limits of the Protection Circuit

Using a programmable power supply and electronic load, deliberately push the battery past its limits:

  • Over-Charge: Slowly increase the charge voltage past 4.2V/cell. Verify exactly when the BMS cuts off the current.
  • Over-Discharge: Slowly drain the battery past 3.0V/cell. Verify the exact cut-off voltage.
  • Over-Current: Apply a load that exceeds the rated maximum. Measure the exact current and the delay time (in milliseconds) before the BMS trips.

If any of these values deviate significantly from the supplier’s specification, the BMS design is flawed, putting your product and your users at risk.
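The over-current ramp is easy to script against whatever load driver you use. The sketch below assumes a hypothetical set_current/read_current interface (not a specific instrument library) and includes a simulated pack so it runs stand-alone:

```python
def find_ocp_trip(set_current, read_current, start_a, step_a, max_a):
    """Ramp the load in steps until the BMS opens; return the trip
    current in amps, or None if protection never acted.

    set_current/read_current are callables wrapping your electronic
    load's driver -- a hypothetical interface, not a real library.
    """
    i = start_a
    while i <= max_a:
        set_current(i)
        if read_current() < i * 0.5:  # current collapsed: BMS opened
            return i
        i += step_a
    return None  # protection never tripped, which is itself a failure

# Dry run against a simulated pack whose BMS trips above 12 A:
_setpoint = [0.0]
def fake_set(amps): _setpoint[0] = amps
def fake_read(): return 0.0 if _setpoint[0] > 12.0 else _setpoint[0]

print(find_ocp_trip(fake_set, fake_read, 10.0, 0.5, 20.0))  # -> 12.5
```

In a real test you would also log the delay between each step and the trip, since the specification usually states both a current threshold and a trip time.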

7. Does the Fuel Gauge (SoC) Report Accurately Across the Discharge Curve?

If you have specified a “smart” battery with a fuel gauge (e.g., SMBus or I2C communication), you must verify its accuracy. A fuel gauge that drops suddenly from 40% to 10% is infuriating for a user.

Validating Coulomb Counting Accuracy

Log the State of Charge (SoC) percentage reported by the BMS while simultaneously logging the actual Ah removed by your electronic load. Plot these two values against each other. The relationship should be highly linear. If the BMS reports 50% SoC, you should have removed exactly 50% of the measured total capacity. Inaccuracies here indicate poor firmware calibration by the supplier.
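That comparison reduces to computing the true SoC from the coulomb count and tracking the worst deviation. A sketch with an illustrative log:

```python
def max_soc_error(readings, total_capacity_mah):
    """Worst-case fuel-gauge error (in percentage points) across a
    discharge.

    readings: list of (reported_soc_percent, mah_removed) pairs logged
    from the BMS and the electronic load simultaneously.
    """
    worst = 0.0
    for reported, mah_removed in readings:
        true_soc = 100 * (1 - mah_removed / total_capacity_mah)
        worst = max(worst, abs(reported - true_soc))
    return worst

# Example of a gauge that collapses late in the discharge: at
# 3750 mAh removed from a 5000 mAh pack, true SoC is 25% but the
# gauge reports 10%, a 15-point error.
log = [(100, 0), (75, 1250), (50, 2500), (40, 3250), (10, 3750)]
print(max_soc_error(log, 5000))  # -> 15.0
```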

8. What is the Self-Discharge Rate and BMS Parasitic Drain?

This is crucial for products that may sit on a shelf or in a warehouse for months before being used. A high self-discharge rate means your customer might open the box to find a dead device.

Measuring the "Sleep" Current

Fully charge the sample, measure its exact voltage, and let it sit for at least 14 days (ideally 30 days) in a temperature-controlled environment. Measure the voltage again. Calculate the capacity loss.

Furthermore, use a micro-ammeter to measure the “parasitic drain” of the BMS itself when the battery is in a sleep or standby state. A poorly designed smart BMS can draw enough current to kill the cells over a few months of storage.
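You can turn the sleep-current measurement into a rough shelf-life estimate. The sketch below assumes roughly 3% per month of cell self-discharge, which is an illustrative figure, not a measured one; substitute your 14- or 30-day result:

```python
def storage_months_to_empty(capacity_mah, sleep_current_ua,
                            monthly_self_discharge_pct=3.0):
    """Rough months of shelf life before the pack is flat, combining
    cell self-discharge (illustrative default of ~3%/month) with the
    BMS's parasitic sleep drain. A coarse screening estimate only."""
    bms_drain_mah_per_month = sleep_current_ua / 1000 * 24 * 30
    cell_loss_mah_per_month = capacity_mah * monthly_self_discharge_pct / 100
    return capacity_mah / (bms_drain_mah_per_month + cell_loss_mah_per_month)

# A 5000 mAh pack whose BMS sleeps at 50 uA loses ~36 mAh/month to
# the BMS plus ~150 mAh/month to self-discharge: ~27 months to empty.
print(round(storage_months_to_empty(5000, 50)))  # -> 27
# The same pack with a sloppy 500 uA BMS drain empties in ~10 months.
print(round(storage_months_to_empty(5000, 500)))  # -> 10
```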

9. Can the Pack Survive a Vibration and Shock Test?

Industrial and portable products are subjected to rough handling. A battery pack that is electrically perfect but mechanically fragile will fail in the field due to broken welds or pinched wires.

Simulating Real-World Mechanical Stress

If you have access to the equipment, subject the samples to a vibration profile that mimics your product’s environment (e.g., standard transportation vibration profiles). Afterwards, re-test the ACR and DCIR. If the resistance has jumped significantly, or if the pack can no longer deliver its peak current, the internal mechanical connections (the spot welds or solder joints) have likely fractured.

10. How Does the Pack Perform at Temperature Extremes?

As we emphasize constantly, room temperature testing is insufficient. You must validate the battery at the edges of your product’s specified operating envelope.

Testing in the Environmental Chamber

Place the sample in an environmental chamber.

  • Cold Test (e.g., 0°C or -10°C): Run your discharge profile. Expect a capacity drop, but watch the voltage sag closely. If it sags below your device’s cut-off, the battery cannot support your product in the cold.
  • Hot Test (e.g., 45°C or 50°C): Run the discharge profile. Monitor the surface temperature carefully to ensure it doesn’t enter a dangerous thermal runaway condition. Also, verify that the BMS’s high-temperature cut-off functions correctly.

11. What is the Short-Term Cycle Life Degradation Trend?

While you likely don’t have months to run a full 500-cycle test before approving a sample, you must establish a baseline degradation trend.

The 50-Cycle Benchmark

Run the samples through 50 continuous charge/discharge cycles at your product’s nominal operating rate. Measure the capacity and DCIR at cycle 1, cycle 25, and cycle 50.

  • If the capacity has already dropped by 3-5% within 50 cycles, the battery is on a steep degradation curve and will likely not meet long-term expectations.
  • If the DCIR has increased significantly, the internal chemistry is breaking down rapidly.
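As a rough screening number, you can linearly extrapolate the measured 50-cycle fade out to the common 80%-of-initial end-of-life threshold. Fade is rarely linear in practice, so treat this as a go/no-go screen, not a lifetime prediction:

```python
def projected_cycles_to_80pct(cap_cycle1_mah, cap_cycle50_mah):
    """Naive linear extrapolation of early capacity fade to the 80%
    end-of-life point. A screening heuristic only: real fade curves
    are rarely linear."""
    fade_per_cycle = (cap_cycle1_mah - cap_cycle50_mah) / 49  # cycles 1..50
    if fade_per_cycle <= 0:
        return float("inf")  # no measurable fade yet
    allowed_fade = 0.2 * cap_cycle1_mah  # 20% loss to reach end of life
    return allowed_fade / fade_per_cycle

# Healthy sample: 5000 -> 4950 mAh (1% fade) projects ~980 cycles.
print(round(projected_cycles_to_80pct(5000, 4950)))  # -> 980
# Suspect sample: 5000 -> 4800 mAh (4% fade) projects ~245 cycles.
print(round(projected_cycles_to_80pct(5000, 4800)))  # -> 245
```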

Early Capacity Fade Analysis

Chart: capacity retention (%) over cycles 0–50, comparing an acceptable trend (~98% retention) with a warning trend (~94%).

Early steep fade indicates SEI layer instability, lithium inventory loss, or material/process defects. Early-cycle behavior is the strongest predictor of long-term cycle life, so validate samples over 30–50 cycles rather than on initial capacity alone. Cells dropping below ~95% retention within 50 cycles rarely achieve their rated lifespan.

12. What Do We Find in a Destructive Teardown Analysis?

This is the final, and often most revealing, step. Once electrical testing is complete, you must sacrifice at least one sample to inspect its internal construction. This is where you see the true quality of the manufacturer.

Looking Under the Shrink Wrap

Carefully disassemble the pack and inspect:

  • Weld Quality: Are the spot welds on the nickel strips consistent, deep, and strong? (Try to pull them apart with pliers).
  • Insulation: Is there adequate Kapton tape or barley paper isolating the BMS and the cell terminals? Are the wires routed safely away from sharp edges?
  • Cell Markings: Do the markings on the bare pouch cells match the supplier’s claims regarding the manufacturer and batch codes?
  • BMS Components: Are the ICs and MOSFETs from reputable brands, or are they cheap, unmarked clones? Is the soldering on the PCB clean and professional?

A messy, poorly insulated internal assembly is a massive red flag that the supplier lacks basic manufacturing discipline.

Frequently Asked Questions

How many samples should I request for this level of testing?

To perform all these tests (including the destructive teardown and environmental tests), you should request a minimum of 5 to 10 samples from the supplier. Testing a single sample is statistically meaningless.

What equipment do I need to perform these tests?

At a minimum, you need a programmable DC electronic load, a programmable DC power supply, an AC internal resistance meter (1kHz), a digital multimeter, and surface thermocouples. An environmental chamber and a dedicated battery cycler (like a Maccor or Chroma system) are highly recommended for professional validation.

Should I test the battery inside my product or on a test bench?

Both. Bench testing is crucial for isolating the battery’s performance and gathering precise data (voltage sag, exact capacity). In-device testing is then required to validate thermal performance within your enclosure and ensure system-level compatibility.

What is a “brown-out” and how does the battery cause it?

A brown-out occurs when the battery’s voltage sags below the minimum operating voltage of your device’s electronics (even for a millisecond), causing the device to reset or shut down unexpectedly. This is usually caused by high DCIR or an inadequate peak C-rate rating.

If the samples fail one of these metrics, should I immediately reject the supplier?

Not necessarily. It depends on the failure. If the BMS trip point is slightly off, that’s a tuning issue they can fix. If the internal teardown reveals sloppy welding or mismatched cells, that indicates a systemic manufacturing problem, and you should walk away.

Can I ask the supplier to provide this data instead of testing it myself?

A reputable supplier should provide you with a comprehensive test report (including discharge curves and cycle life data). However, as the buyer, you must perform your own independent validation (or hire a third-party lab) to verify their claims. Trust, but verify.

What is the difference between “cell balancing” and “cell matching”?

Cell matching is done by the manufacturer before assembly, selecting cells with identical capacity and resistance. Cell balancing is an active function performed by the BMS during charging to keep those cells equal as they age. Both are required for a high-quality pack.

How do I test the BMS over-current protection safely?

Use a programmable electronic load. Start at the battery’s maximum continuous rating and slowly increase the current draw in steps until the BMS cuts the power. Record the exact current level where it tripped.

Why does the battery capacity drop in cold temperatures?

Cold temperatures slow down the chemical reactions inside the battery and increase the internal resistance. The energy is still there, but the battery cannot deliver it efficiently, so it hits the low-voltage cut-off much sooner than it would at room temperature.

How does Hanery support OEM engineers during this validation phase?

We view validation as a collaborative process. When we provide samples, we provide a complete technical data package, including our own internal test reports for all the metrics listed above. Our application engineers are available to review your test data, troubleshoot any integration issues, and rapidly iterate the design if necessary.

Conclusion: Data-Driven Procurement is Risk Mitigation

Approving a new Li-Po battery supplier is one of the most consequential technical decisions your team will make. A battery is not a passive component; it is an active, volatile chemical system. Assuming quality based on a glossy datasheet or a smooth sales pitch is a dereliction of engineering duty.

By systematically tracking these 12 metrics, you transition your procurement process from guesswork to a rigorous, data-driven engineering exercise. You will uncover the hidden compromises that low-cost suppliers make, and you will identify the true manufacturing partners who build quality into every cell, weld, and line of code. This level of diligent testing is an investment of time and resources upfront, but it is the only way to guarantee the long-term performance, safety, and profitability of your final product.

If you are tired of testing samples that fail to meet their promises, we invite you to evaluate a Hanery battery. Contact our engineering team today to discuss your specifications and request samples built to an industrial standard.

Request Engineering Samples & Technical Documentation.


Change Log:

27/04/2026 Article published.
