You can do simulations, run them either forward or in reverse. Pick a random town, burn it down, run the flame front backwards, and find the paths and lines that could have caused it. Then look at the probable failure rates of those lines.
Or run it forwards: pick an event and allow it to run its course.
When designing simulation runs there are a bunch of dimensions to consider:
* Fidelity, or simulation resolution, both spatial and temporal
* Number of runs, and the search strategy over inputs: linear, spatial, random, or gradient-based
* If you are testing specific parameters, like whether this bridge will hold this static load, multiple simulation runs are done with the inputs varied slightly (input sensitivity analysis). [1]
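That last bullet can be sketched in a few lines. This is a toy Monte Carlo sensitivity run, not a real structural model: the bridge capacity function, the nominal values, and the noise scales are all invented for illustration.

```python
import random

def bridge_holds(load_kn, beam_strength_kn, safety_factor):
    """Toy model: the bridge holds if derated capacity exceeds the load."""
    return beam_strength_kn / safety_factor >= load_kn

def sensitivity_runs(n_runs=10_000, seed=42):
    """Repeat the simulation with inputs perturbed around nominal values
    and report the fraction of runs that fail."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_runs):
        load = rng.gauss(100.0, 5.0)       # nominal 100 kN, small noise
        strength = rng.gauss(220.0, 15.0)  # nominal 220 kN, small noise
        if not bridge_holds(load, strength, safety_factor=2.0):
            failures += 1
    return failures / n_runs

print(f"estimated failure fraction: {sensitivity_runs():.3f}")
```

Widening the input noise and watching how the failure fraction moves tells you which inputs the outcome is most sensitive to.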
The simulations don't have to be entirely on the computer; they could also couple in physical and human components. One could page k of n employees who might be on duty at a random time. Response times in real life could be inferred from responses to the actual pages. No people are harmed, and no actual failures occur.
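The human-in-the-loop part of that could start as small as this. A hypothetical sketch: the employee names are made up, and actually sending a page and measuring acknowledgment time is left out since it depends entirely on the paging system in use.

```python
import random

def pick_drill_subjects(employees, k, seed=None):
    """Choose k of n on-call employees uniformly at random for a paging
    drill; sampling without replacement so nobody is paged twice."""
    rng = random.Random(seed)
    return rng.sample(employees, k)

# Hypothetical on-call roster.
employees = ["alice", "bob", "carol", "dan", "erin"]
subjects = pick_drill_subjects(employees, k=2, seed=7)
print(subjects)
```

The recorded acknowledgment times then feed back into the simulation as the response-latency distribution, instead of a guessed constant.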
If society approached civilization seriously, we would apply the full power of science AND engineering. Instead we fail horribly at the application of engineering and suffer the consequences.
Given the other pro-PG&E comments in this thread, with all the talk of how many factors were in play: it all boils down to money and acceptable risk, and how to package that up in a way that gets PG&E a slap on the wrist. Well done!
It very well could be that we, as a society, decide we are OK with these risks, but that we need mitigation. It could be that we A) turn off at-risk lines during dangerous conditions, as identified in simulation or via direct observation, or B) have fire-bomber planes in the sky during high-likelihood periods.
But money is cheaper. You gotta spend it to make it.
OK. I like the increased utilization of simulations idea.
Your option A is basically how it is done already via Public Safety Power Shutoffs: PG&E has an extensive forecasting department, and they de-energize certain areas based on their risk analyses. It doesn't work all the time, e.g. the Camp Fire [2].
B isn't really an option when many of these incidents occur overnight. Otherwise, Cal Fire already flies its spotters during the day during Red Flag events. And there is a network of remote cameras (alertwildfire.org) for surveillance.
This is all beside the point though: controlled burns in CA have decreased by more than 50% since the 90s/2000s, and 90s-era environmental policies killed logging operations in a lot of CA. Fuel built up, and after that, thermodynamics takes over.
If we can get back to mitigating the fuel load with controlled burns [1], and require PG&E to put more budget towards brush clearance and line maintenance, instead of mandating they enter into losing renewable contracts [3], then we can certainly avoid a lot of these issues.
These are all tactics; PG&E had no strategy other than deciding by committee, so that nobody could be held responsible for the resulting deaths and losses.
There are so many folks trying to slip their agendas into the remedies in this discussion. All of those are literal smoke screens for the greed driven structure of PG&E.
PG&E didn't start doing meaningful shutdowns on lines until after the Camp Fire.
I suspect that the best results may come from simpler modeling, such as modeling the system with a physical model, like modeling aerodynamics in a wind tunnel or boat hulls in a wave tank.
How much would it take to build a useful scale model in a warehouse somewhere? I'm certainly not knowledgeable enough to understand how the smaller scale would relate to real-life fires, but it seems plausible that, for a relatively small outlay, we'd have an important tool for building a much better understanding of wildland fires. Given how much CA alone spends fighting forest fires, a $10 or $20 million project to build facilities and fund researchers seems like an obvious and much-needed investment.
[1] https://en.wikipedia.org/wiki/Sensitivity_analysis