Demand is the rate at which energy is used, typically measured in kilowatts (kW). Consider, for example, a 100-watt light bulb: over ten hours it consumes 1 kWh of energy, and throughout that period the utility must meet a demand of 0.1 kW.
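The bulb arithmetic can be sketched in a few lines of Python; the function name is illustrative, not from the text.

```python
def energy_kwh(demand_kw: float, hours: float) -> float:
    """Energy consumed equals demand (the rate of use) multiplied by time."""
    return demand_kw * hours

# A 100-watt bulb draws 0.1 kW of demand.
bulb_demand_kw = 100 / 1000

# Over ten hours, the bulb consumes 1 kWh.
print(energy_kwh(bulb_demand_kw, 10))
```

Demand is a rate (kW) while energy is a quantity (kWh); the utility bills for both.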
Utilities generally measure demand by averaging usage over each quarter hour; the highest average (peak demand) sets the demand portion of the monthly bill. Two customers may use the same total energy in a month, yet the one with the higher peak demand will have a higher bill because of utility demand charges. This ensures that each home or business pays a fair share of the cost of the grid as a whole. Demand charges usually vary by time of day and sometimes by month. They were established because producing power at peak times is drastically more expensive, and they give customers a powerful incentive to monitor and reduce peak energy use.
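A minimal sketch of the billing calculation described above: take the average power in each 15-minute interval and keep the maximum. The interval data, function name, and sample values are assumptions for illustration, not a real utility's metering format.

```python
def peak_demand_kw(interval_kwh: list[float], interval_hours: float = 0.25) -> float:
    """Average demand in each interval is energy / duration; peak demand is the maximum."""
    return max(kwh / interval_hours for kwh in interval_kwh)

# Two customers using the same total energy (2.0 kWh) over four intervals:
steady = [0.5, 0.5, 0.5, 0.5]   # constant 2.0 kW in every interval
spiky  = [0.1, 1.5, 0.2, 0.2]   # one interval spikes to 6.0 kW

print(peak_demand_kw(steady))   # 2.0 (kW)
print(peak_demand_kw(spiky))    # 6.0 (kW)
```

Both customers consume identical energy, but the spiky customer's demand charge is based on a peak three times higher.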
A home or business may have solar panels installed to reduce the total amount of power pulled from the grid. A solar installation typically produces energy at the time it is most needed: sunny summer days when everyone is running air-conditioning (peak load). This can dramatically reduce utility demand charges and must be factored into solar energy cost calculations.
However, the site may still require relatively high demand during some part of the day, leading to high utility demand charges. For example, a large grocery store may turn up its air-conditioning system at noon on a very cloudy day, when solar output is low. Some "solar utilities" sell power storage systems that use batteries to regulate the electricity a site draws from the grid during peak hours, thereby lowering demand charges at the end of the month. Weighing utility versus solar costs can be very complex.
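One way a battery system can lower demand charges is "peak shaving": whenever site demand exceeds a target cap, the battery discharges to cover the excess, as far as its stored energy allows. The following is a hypothetical sketch under that simple control rule; the cap, battery size, and load values are illustrative assumptions, not a real product's algorithm.

```python
def shave_peaks(demand_kw: list[float], cap_kw: float,
                battery_kwh: float, interval_hours: float = 0.25) -> list[float]:
    """Return per-interval grid demand (kW) after battery discharge."""
    grid = []
    for d in demand_kw:
        excess_kw = max(0.0, d - cap_kw)
        # Energy the battery can supply this interval, limited by remaining charge.
        discharge_kwh = min(excess_kw * interval_hours, battery_kwh)
        battery_kwh -= discharge_kwh
        grid.append(d - discharge_kwh / interval_hours)
    return grid

# Site load in kW over four 15-minute intervals, with a midday spike.
load = [40, 90, 120, 80]
print(shave_peaks(load, cap_kw=80, battery_kwh=15))
```

With a 15 kWh battery and an 80 kW cap, the midday spike to 120 kW is covered from the battery, so the billed peak falls from 120 kW to 80 kW.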
Most industrial and commercial buildings pay utility demand charges. It is rare for residential customers to pay these charges. (Demand-and-consumption meters are usually much too expensive to install on every home.)