A nickel. That’s the cost of the electricity needed to mint a single bitcoin.
Forum member casascius measured the actual power consumption and reported the results in a very decent writeup. Casascius runs a mining rig with two ATI Radeon HD 5970s.
He measured the rig's power draw at the wall with a Kill-A-Watt meter. At an electric rate of $0.095 per kWh, that works out to $0.049 of electricity, on average, for each bitcoin the rig creates.
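For readers who want to reproduce the arithmetic, here is a minimal sketch using the standard expected-work formula (a block takes difficulty × 2^32 hashes on average, paying the block reward). The wattage, hash rate, and difficulty below are illustrative assumptions chosen to land near the reported figure, not casascius's measured numbers:

```python
# Sketch of the electricity-cost arithmetic. All inputs below are
# illustrative assumptions, not casascius's measured numbers.

BLOCK_REWARD = 50.0  # BTC paid per block at the current subsidy

def cost_per_bitcoin(watts, hashes_per_sec, difficulty, usd_per_kwh):
    """Expected electricity cost (USD) to mine one bitcoin."""
    # Expected hashes per block is difficulty * 2^32; dividing by
    # the block reward gives expected hashes per bitcoin earned.
    hashes_per_btc = difficulty * 2**32 / BLOCK_REWARD
    seconds_per_btc = hashes_per_btc / hashes_per_sec
    kwh_per_btc = (watts / 1000.0) * (seconds_per_btc / 3600.0)
    return kwh_per_btc * usd_per_kwh

# Hypothetical two-card rig: ~800 W at the wall, ~1.4 GH/s.
print(cost_per_bitcoin(watts=800, hashes_per_sec=1.4e9,
                       difficulty=36000, usd_per_kwh=0.095))
# ~0.047 -- roughly the nickel-per-coin figure reported
```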
Casascius does overclock the cards to 770 MHz to increase the hash rate, but that also increases power consumption. In this instance the two effects roughly cancel: the electricity consumed per bitcoin is about the same as without overclocking.
One cost the post doesn't address is cooling. Small-scale miners generally aren't paying to remove the heat their rigs generate, whereas commercial mining operations likely incur cooling costs on top of the electricity for mining itself.
As the difficulty level rises, the electricity consumed per bitcoin generated will rise with it over time. For the near future, however, the typical miner's biggest constraints are the cost of capital and access to a dwindling supply of the most efficient GPU hardware.
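Because the expected work per block scales linearly with difficulty, the electricity cost per coin scales with it too, holding the hardware and electric rate fixed. Reusing the hypothetical sketch above:

```python
# Cost per coin grows linearly with difficulty, all else equal.
# Reuses cost_per_bitcoin() and the assumed rig figures above.
for d in (36000, 72000, 144000):
    print(d, round(cost_per_bitcoin(800, 1.4e9, d, 0.095), 3))
# 36000  0.047
# 72000  0.093
# 144000 0.187
```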