In the world of electrical engineering and energy management, most people focus on the obvious metric: kilowatt-hours (kWh). After all, kWh measures total consumption, and it’s the line item that fluctuates with every light switch flipped and motor started.
However, lurking behind the scenes is a more subtle, often misunderstood, and financially devastating parameter: maximum demand (MD). While kWh measures the quantity of energy used, maximum demand measures the rate of that usage. And as any facility manager who has opened a commercial electricity bill knows, MD can account for 30–50% of the total charges, sometimes more than the energy itself.
\[
\text{Demand Cost} = \text{MD (kW or kVA)} \times \text{Tariff Rate (\$/kW)}
\]
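As a minimal sketch of how this formula is applied on a bill (the function name and the example tariff are illustrative, not from any specific utility):

```python
def demand_cost(max_demand_kw: float, tariff_per_kw: float) -> float:
    """Monthly demand charge: billed maximum demand (kW or kVA)
    multiplied by the tariff rate ($/kW)."""
    return max_demand_kw * tariff_per_kw

# Illustrative: a 500 kW peak billed at $15/kW
print(demand_cost(500, 15))  # 7500.0 per month
```

Note that the charge depends only on the single highest demand reading in the billing period, not on how long the facility ran at that level.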
Current MD = 800 kW. Potential reduced MD = 600 kW. Savings = 200 kW × $12/kW = $2,400/month = $28,800/year.
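The savings arithmetic above can be sketched in code (the helper name is hypothetical; the figures are those from the example):

```python
def annual_demand_savings(current_md_kw: float,
                          reduced_md_kw: float,
                          tariff_per_kw: float,
                          months: int = 12) -> float:
    """Annual savings from lowering the billed maximum demand."""
    monthly_savings = (current_md_kw - reduced_md_kw) * tariff_per_kw
    return monthly_savings * months

# 800 kW peak trimmed to 600 kW at a $12/kW tariff
print(annual_demand_savings(800, 600, 12))  # 28800 per year
```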
Assume tariff = $12/kW. Baseline MD (if staggered) = 200 kW. Actual MD = 400 kW. Extra cost = 200 kW × $12 = $2,400 per month. Over a year, that’s $28,800 for just one Monday morning habit.

Key Factors That Drive Maximum Demand

Understanding MD requires identifying peak-creating behaviors:
In many regions (e.g., the USA, India, Southeast Asia), demand tariffs are charged per kW per month, with rates varying widely by utility. A facility with an MD of 500 kW paying $15/kW incurs $7,500/month, just for demand.