OEM Lifecycle Intelligence

Commissioning quality is the first proof path

If an OEM can verify quality quickly, it can reduce avoidable service cost, improve installer feedback loops, and create the evidence base for a broader lifecycle deployment.

Commissioning quality view comparing design expectations with live behaviour

2.80

Median seasonal performance factor (SPF) in the BEIS Electrification of Heat Demonstration Project

A strong signal that field performance is far below what well-installed systems can achieve.

4.40+

SPF achieved by publicly monitored high-performing installations

The technology can perform materially better when the installation and setup are right.

48 hours

The first useful verification window

Enough time to spot whether the installed system is behaving like the design intent or still sitting on poor defaults.

£200–£400

Typical warranty callout cost

The cost of finding quality issues late instead of proving them quickly with evidence.

What the gap is really proving

The difference between median field performance and strong monitored performance is not just a hardware story. It is a commissioning and verification story. That is why this is the cleanest first proof page for OEM buyers.

Factory defaults left in place

Flow temperatures, weather compensation, and commissioning settings are often left closer to factory assumptions than to the conditions of the home they were installed into.

Design intent is lost at handoff

What was assumed in design rarely survives intact into installation and aftercare, so teams are forced to reconstruct the context later.

Paper checks do not verify real behaviour

Self-reported checklists record that a step happened. They do not prove the system actually behaved as expected once live.

What governed commissioning verification looks like

This is not just remote monitoring and it is not just a checklist. It is an evidence-backed decision on whether the installed system behaved as designed.

Expected performance baseline

Start from the design assumptions, system settings, and operating envelope that should describe a correct installation.

Observed behaviour in the first hours

Use early telemetry and live performance signals to understand how the installed system is actually behaving.

Evidence-backed quality decision

Compare predicted and observed behaviour, surface exceptions, and keep a record of what was found and what should happen next.
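The three steps above can be sketched as a simple comparison between a design baseline and early telemetry. This is a minimal illustrative sketch, not the product's implementation: the field names, tolerance, and thresholds are all assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class Baseline:
    """Design assumptions for a correct installation (illustrative fields)."""
    design_flow_temp_c: float   # target flow temperature at design conditions
    min_expected_cop: float     # minimum acceptable coefficient of performance
    weather_comp_enabled: bool  # weather compensation should be configured

@dataclass
class Observed:
    """Aggregated telemetry from the first 48 hours (illustrative fields)."""
    mean_flow_temp_c: float
    mean_cop: float
    weather_comp_enabled: bool

def verify(baseline: Baseline, observed: Observed,
           flow_tolerance_c: float = 3.0) -> list[str]:
    """Return a list of exceptions; an empty list means the system
    behaved as designed within the chosen tolerances."""
    exceptions = []
    if observed.mean_flow_temp_c > baseline.design_flow_temp_c + flow_tolerance_c:
        exceptions.append(
            f"Flow temperature {observed.mean_flow_temp_c:.1f}C exceeds design "
            f"{baseline.design_flow_temp_c:.1f}C plus {flow_tolerance_c:.1f}C tolerance"
        )
    if observed.mean_cop < baseline.min_expected_cop:
        exceptions.append(
            f"Early COP {observed.mean_cop:.2f} is below the expected "
            f"minimum {baseline.min_expected_cop:.2f}"
        )
    if baseline.weather_comp_enabled and not observed.weather_comp_enabled:
        exceptions.append("Weather compensation disabled despite design assumption")
    return exceptions
```

The returned exception list is the "evidence-backed quality decision" in miniature: an empty list supports sign-off, while each entry records what diverged and by how much.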

What a first deployment should prove

The first deployment is not meant to prove the entire company. It should prove that the OEM can see quality issues sooner, make better decisions with evidence, and decide whether wider lifecycle continuity is worth expanding.

  • Which installations look correct and which need review first
  • Whether certain installer cohorts, regions, or system types are drifting more than others
  • Which issues are likely commissioning and configuration problems rather than product defects
  • What evidence is currently missing from the OEM quality process
  • Whether the module should expand into Fleet Assurance, Design Studio feedback loops, or partner management
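The cohort-drift question in the list above can be illustrated with a short sketch. The records, cohort names, and the fleet-wide SPF expectation here are all hypothetical; the point is only the shape of the analysis, not real figures.

```python
from collections import defaultdict
from statistics import mean

EXPECTED_SPF = 3.0  # illustrative fleet-wide expectation, not a real benchmark

# Hypothetical records: (installer_cohort, observed_spf)
installs = [
    ("cohort_a", 3.4), ("cohort_a", 3.1), ("cohort_a", 3.3),
    ("cohort_b", 2.5), ("cohort_b", 2.7), ("cohort_b", 2.4),
]

def cohort_drift(records):
    """Mean SPF shortfall per cohort; positive values mean the cohort
    is under-performing the fleet-wide expectation."""
    by_cohort = defaultdict(list)
    for cohort, spf in records:
        by_cohort[cohort].append(spf)
    return {c: round(EXPECTED_SPF - mean(v), 2) for c, v in by_cohort.items()}
```

Ranking cohorts by shortfall like this is what turns raw telemetry into a review queue: the cohort with the largest positive drift is the one to investigate first.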

Source notes

This proof page summarises the argument in [The Commissioning Quality Gap](/resources/the-commissioning-quality-gap) and the surrounding strategy work. The key public references behind it are:

Field performance and commissioning

BEIS Electrification of Heat Demonstration Project, Energy Saving Trust field trials, and public monitoring data such as HeatPumpMonitor.org.

Market and quality pressure

Heat Pump Association market reporting, MCS quality framework, and the commercial pressure to scale without letting installation quality drift.

FAQ

Use the proof in a real pilot conversation

If commissioning quality is already visible as a service, warranty, or installer problem, the next step is a scoped Commissioning Verification deployment.

Discuss Commissioning Verification