
According to http://www.platts.com/latest-news/electric-power/washington/... , 90% seems to be the fleet average, not a best-of-fleet number.


That is only for the last decade, and only for the US; other countries and decades have different numbers. It also only holds when nuclear is less than about 50% of production. France tends to sit around 77%.


Why would the percentage of production being nuclear have any effect on nuclear capacity factor?


If you produce it but no one buys it, what was the point? In fact, in practice you will pay the grid operator a lot to take it.

Capacity factor is actual output as a percentage of your maximum possible generating output. It is lower if you can't produce, or if no one is willing to buy so you don't run at full power.

If nuclear at 90% capacity factor covers only the minimum demand load, then it's all good. Once nuclear capacity rises above that minimum, the slack in demand means production capacity is wasted. Currently nuclear is the last to stop producing because its variable costs are so low, but if nuclear is a larger part of your grid, its capacity factor will tend to fall toward average demand divided by installed capacity.
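A toy calculation makes the effect concrete. The hourly demand numbers below are made up for illustration (not real grid data), and outages are ignored; the point is only that once nuclear capacity exceeds the overnight minimum, the achievable capacity factor drops toward average demand / capacity:

    # Hypothetical hourly demand over one day, in GW (made-up numbers).
    demand = [60, 55, 52, 50, 50, 55, 70, 85, 95, 100, 100, 98,
              95, 93, 92, 95, 100, 105, 100, 90, 80, 72, 65, 62]

    def capacity_factor(nuclear_capacity_gw):
        # Nuclear is dispatched first, so it produces min(capacity, demand)
        # each hour and is curtailed only when demand falls below capacity.
        produced = sum(min(nuclear_capacity_gw, d) for d in demand)
        return produced / (nuclear_capacity_gw * len(demand))

    for cap in (40, 60, 80, 100, 120):
        print(f"{cap:>3} GW nuclear -> capacity factor {capacity_factor(cap):.0%}")

With 40 GW (below the 50 GW overnight minimum) the factor is 100% before outages; at 100+ GW it falls toward average demand divided by capacity, as described above.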


Ah good point. That's why I think solar + nuclear is a good idea, the nuclear plants running all the time and solar adding more during the day. It won't perfectly match demand, but it'd still be better than either tech on its own.
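A rough sketch of that intuition, again with made-up numbers (hypothetical demand and solar profiles, surplus in any hour simply wasted, no storage):

    # Hypothetical hourly demand and solar output over one day, in GW.
    demand = [60, 55, 52, 50, 50, 55, 70, 85, 95, 100, 100, 98,
              95, 93, 92, 95, 100, 105, 100, 90, 80, 72, 65, 62]
    solar  = [0, 0, 0, 0, 0, 2, 8, 18, 30, 40, 46, 50,
              50, 46, 40, 30, 18, 8, 2, 0, 0, 0, 0, 0]
    nuclear = 50  # GW of baseload, roughly the overnight minimum demand

    def coverage(supply):
        # Fraction of total demand met; surplus in a given hour is wasted.
        met = sum(min(s, d) for s, d in zip(supply, demand))
        return met / sum(demand)

    print("nuclear only :", f"{coverage([nuclear] * 24):.0%}")
    print("solar only   :", f"{coverage(solar):.0%}")
    print("nuclear+solar:", f"{coverage([nuclear + s for s in solar]):.0%}")

The combination covers much more of the daytime peak than either source alone, though the evening peak is still unmet, so it doesn't perfectly match demand.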




