Definition: An overhead cost variance is the difference between the amount of overhead applied during the production process and the actual amount of overhead costs incurred during the period. The overhead cost variance can be calculated by subtracting the standard overhead applied from the actual overhead incurred during the period.
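The calculation described above can be sketched in a few lines of Python; the dollar figures below are hypothetical and used only for illustration:

```python
def overhead_cost_variance(actual_overhead, applied_overhead):
    """Actual overhead incurred minus standard overhead applied.
    A positive result means actual costs exceeded the applied amount."""
    return actual_overhead - applied_overhead

# Hypothetical figures: $52,000 actual overhead vs. $50,000 applied
print(overhead_cost_variance(52_000, 50_000))  # 2000
```

A positive variance is typically labeled unfavorable (costs ran over the applied amount) and a negative one favorable.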
What Does Overhead Cost Variance Mean?
Since most companies use perpetual inventory systems, they must update their inventory costs in real time. This means management can’t wait until the end of the period to add up all of the overhead costs incurred and allocate them to each job. Instead, management needs to estimate the future overhead costs and allocate them throughout the production process.
Since most estimates can’t be completely accurate, the actual costs incurred for the period rarely equal the costs estimated throughout the production process. The difference between these two numbers is the overhead cost variance.
Management analyzes the cost variances to determine how well the production process is performing and why the estimates differ from the actual amounts. They usually split cost variances into three main categories: spending, efficiency, and volume.
The spending variance happens when the company pays a different price for materials than it planned, such as getting a deal on a purchase. In other words, there was an estimated amount set aside for raw material purchases and management negotiated a different price.
Efficiency variances occur when the labor force finishes a job in a different number of hours than was originally estimated. For instance, a job that should have taken 10 hours to complete but actually took 12 hours would result in a 2-hour variance.
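Using the hours from the example above, the efficiency variance can be sketched in Python. The $20 standard labor rate is a hypothetical assumption, added only to show how the hour variance is usually converted into a dollar amount:

```python
standard_hours = 10   # hours the job should have taken
actual_hours = 12     # hours the job actually took
standard_rate = 20    # hypothetical standard labor rate per hour

hours_variance = actual_hours - standard_hours        # 2 extra hours
efficiency_variance = hours_variance * standard_rate  # dollar impact
print(hours_variance, efficiency_variance)  # 2 40
```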
Volume variances happen when a different amount of product is manufactured than the estimated amount. Take a change order, for example. The original order only called for 100 units, but the customer decided to increase the order to 150 units. The 50 extra units create a volume variance from the estimated 100 units.
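The change-order example works out as a simple unit difference; this is a minimal sketch of the unit calculation only, since costing the variance in dollars would require a standard overhead rate not given here:

```python
estimated_units = 100  # the original order
actual_units = 150     # the revised order
volume_variance_units = actual_units - estimated_units
print(volume_variance_units)  # 50 extra units
```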