Please think it through.
I'll try to create a very simple hypothetical case to illustrate the problem and how to think about it:
Assume the average vehicle is driven 50 miles per day.
This includes commercial vehicles, long- and mid-haul semi trucks, work trucks, delivery vehicles, etc. In other words, the average daily per-person numbers do not apply here (those would be around 25 to 45 miles per day, depending on location). I feel an average of 50 miles per day, regardless of vehicle class, is a reasonable number to use as a thinking tool to get a ROM (Rough Order of Magnitude) sense of the problem. The models I developed years ago were far more accurate than this; however, that kind of detail in a simulation is hard to convey in a post like this.
Assume, then, that this represents an average for all 30 million vehicles in CA.
The question:
How much POWER would this require?
Let's assume we use a Level 2 charger that replenishes 60 miles of range in an hour at 7 kW. Again, we are super-simplifying things here. For example, a semi truck or delivery van will be far less efficient and will require charging at a much higher power level and for a longer duration. I am just trying to simplify this for the purpose of illustrating the problem.
Assuming all 30 million vehicles charged simultaneously, we would need 210,000,000 kW (210 GW).
Now let's spread the charging uniformly across 24 hours. Each vehicle needs roughly an hour on a charger per day, so 1/24 of the fleet is plugged in at any moment. That means we need 210,000,000 / 24 = 8,750,000 kW.
That's 8.75 GW.
A typical nuclear power plant produces about 1 GW. In other words, in this evenly distributed scenario we would need the continuous output of 9 nuclear power plants, 24 hours a day, to charge all the vehicles in CA.
We need 9 NEW nuclear power plants in CA. I would round that to ten.
This is power over and above current generation and transport capabilities.
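The whole ROM estimate fits in a few lines; this sketch uses only the assumptions stated above (30 million vehicles, 7 kW Level 2 chargers, roughly one hour of charging per vehicle per day):

```python
# Rough order of magnitude (ROM) estimate of EV charging power for CA's fleet.
# Assumptions from the text: 30M vehicles, 7 kW Level 2 chargers,
# ~1 hour of charging per vehicle per day.

vehicles = 30_000_000
charger_kw = 7.0

# Worst case: every vehicle charging at once.
simultaneous_kw = vehicles * charger_kw   # 210,000,000 kW = 210 GW

# Best case: charging spread uniformly over 24 hours,
# so 1/24 of the fleet is on a charger at any moment.
uniform_kw = simultaneous_kw / 24         # 8,750,000 kW = 8.75 GW

print(f"Simultaneous: {simultaneous_kw / 1e6:,.0f} GW")
print(f"Uniform over 24 h: {uniform_kw / 1e6:.2f} GW")
print(f"~1 GW nuclear plants needed: {uniform_kw / 1e6:.1f}")
```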
How long does it take to build just one nuclear power plant? Certainly longer than a high-speed train. I think the range is between 25 years and impossible.
How about 10 of them? Never. Unless we stop talking about EV fantasy and start discussing reality, which is this: if we want EVs to take over, we need to get serious about massively expanding power generation and delivery, and we need to do it immediately.
No, it cannot happen by 2030. That's preposterous.
And, no, solar isn't going to do it. That's wishful thinking. A solar installation that can match a 1 GW nuclear power plant and deliver 1 GW 24/7 has to be built with a peak capacity of at least 10 GW. This is massive and more than most people can imagine in terms of land use, materials, batteries, etc.
And, BTW, the above super-simplified hypothetical isn't even close to how bad things will be in reality. For example, if you assume that, say, 25% of vehicles will need fast or high-power charging, the power demand will skyrocket. Remember that I said the problem is power, not energy. Power is what you need when you have to charge a bunch of cars simultaneously, because you have to do it within the time constraints of the task. You don't have 48 hours to charge a semi truck that just completed a thousand-mile journey. At best you might have eight hours. And that requires power. A typical truck stop might have fifty to one hundred long-haul trucks in need of charging. What they demand is power, in order to deliver the requisite energy in a given amount of time.

The other thing this does not take into account is concentration. A city like Los Angeles will require a staggering amount of additional electricity to deal with EVs, and it will have massive peaks that will dictate the size and shape of the required feeds.
Again, we can go head-in-the-sand or understand that we have a very serious problem, one that requires at least a doubling of our power generation and distribution capacity. If we don't wake up to that right away it will be an absolute mess.
I could get into your comment about delaying and staggering charges. I have included that sort of thing in my models. It does not change peak power demand. Here's the simplest explanation: imagine you slow-charge 30 million cars for 12 hours and stagger 1/12th of them every hour, as you proposed. Well, 12 hours into this charge methodology you have 30 million cars charging simultaneously. And, because cars are used every day, you pretty much end up with 30 million cars charging 24/7. I am over-simplifying. The point is that the stagger idea seems to be an intuitive solution (I thought so before I modeled it), yet it does not eliminate the fact that you have to deliver so many kWh (now talking energy) to so many cars within a narrow window of time. In real use, very few will adopt EVs if they have to spend 24 hours charging.
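Here's a toy cohort version of that 12-hour stagger (a sketch, not my full model; it only counts how many cars are on a charger at once):

```python
# Toy model: 1/12 of a 30M-vehicle fleet starts charging each hour,
# and each vehicle stays on the charger for 12 hours.
fleet = 30_000_000
cohorts = 12
cohort_size = fleet // cohorts   # 2.5M vehicles start per hour
charge_hours = 12

def charging_at(hour):
    """Cars on a charger during a given hour (stagger begins at hour 0)."""
    starts = range(cohorts)      # one cohort starts in each of the first 12 hours
    return cohort_size * sum(1 for s in starts if s <= hour < s + charge_hours)

for h in (0, 6, 11, 12, 18):
    print(f"hour {h:2d}: {charging_at(h):,} cars charging")
```

By hour 11 every cohort overlaps and the entire fleet is charging at once; in this toy model that happens because the charge duration equals the stagger window.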
Be careful with mixing physics/mathematical arguments and economic ones. If you want to talk physics, assume your (fairly generous) number of 8.75 GW. That's 9 nuclear power plants, as you mentioned. Or for solar: mean solar flux in CA is about 5 kWh/m^2 over a day, and solar panels are about 20% efficient, so that's 1 kWh/m^2/day = 24 m^2/kW of panels = 24 km^2/GW * 8.75 GW = 210 km^2 = an approximately 21 x 10 km solar array in the Mojave desert. That's well within the range of plausible land use. For wind, a typical offshore wind turbine generates about 8 MW of power, so we'd need about 1,000 of them; turbine blades are about 750 feet across, so figure 1/4-mile spacing, and we'd cover 250 miles ~= less than half of California's coastline.
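Those solar and wind figures can be checked in a few lines (same round numbers as above; the 8 MW rating and 1/4-mile spacing are the rough assumptions already stated):

```python
# Back-of-envelope check of the solar and wind sizing above.
target_gw = 8.75

# Solar: 5 kWh/m^2/day insolation * 20% efficiency = 1 kWh/m^2/day,
# so 24 m^2 of panel yields an average of 1 kW.
m2_per_avg_kw = 24.0
solar_km2 = target_gw * 1e6 * m2_per_avg_kw / 1e6   # kW -> m^2 -> km^2
print(f"Solar area: {solar_km2:.0f} km^2")          # 210 km^2

# Wind: ~8 MW per offshore turbine, 1/4-mile spacing along the coast.
turbines = target_gw * 1000 / 8
print(f"Turbines needed: {turbines:.0f}")           # ~1094
print(f"Coastline used: {turbines * 0.25:.0f} miles")
```

The coastline figure comes out near 270 miles, close to the ~250-mile round number above.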
The reason these haven't been built yet is because of economics: it's not cost effective to invest this much when the demand isn't there yet. But then we're not going to get 31M car owners suddenly switching over to EVs. We'll get maybe 2-3M each year, switching over as they retire their old vehicles, and then we build one nuclear power plant, or 2 km^2 of solar, or 100 wind turbines, each year until the transition is complete.
[1] https://en.wikipedia.org/wiki/List_of_cancelled_nuclear_reac...
You are making my point. We can't build them.
> Or for solar, mean solar flux in CA is about 5 kWh/m^2 over a day, solar panels are about 20% efficient, that's 1 kWH/m^2/day = 24 m^2 / kW of panels = 24 km^2 / GW * 8.75 GW = 210 km^2 = an approximately 21 x 10 km solar array in the Mojave desert.
I am so incredibly tired of this argument. The only people who reach for this are those who know nothing about the reality of solar. They think in terms of the fantasy they've been sold and, therefore, know nothing about what happens in real life.
> and then we build one nuclear power plant, or 2 km^2 of solar, or 100 wind turbines, each year until the transition is complete.
Please. I beg you. If you have Excel or something equivalent and at least a high-school understanding of physics and mathematics, slow down, do some research, and try to understand. Right now, you do not. You are confusing a Google search with reality.
I'll provide a quick fantasy-vs-reality education as a starting point. The rest is up to you. You can either continue to believe in fantasies or start to understand.
Here's a graph showing the power output of my 13 kW array about a month ago:
https://i.imgur.com/aNnbmDp.png
Notice the parabolic shape with a peak at about 8 kW.
Wait, what? Not 13 kW?
Right. Output changes through the year. I have yet to see it reach the full rated panel output. The most I've seen is around 10 kW. Do you know why? Because the efficiency (and everything else) you quote is a rating obtained under ideal laboratory conditions, starting with an operating temperature of 25 degrees C. That's great for marketing and laughable for real-life conditions.
It doesn't end there. Check this out:
https://i.imgur.com/pB1WgQ0.png
This was the very next day!
What happened? How did the array go from 8 kW all the way down to 2 kW, then back up to about 7, down again, up again, etc.? How did that happen?
Clouds!!! That's how that happened. F-ing solar idealists make me sick. I was one of them, BTW, until I built this system and learned that my fantasy did not match reality at all.
Clouds!!!
Do you think that's it? Check this one out. One day later:
https://i.imgur.com/FiaENVI.png
Clouds. Again! Are you starting to understand? Does this start to paint a picture of why all these hand-wavy solar flux arguments are complete and utter nonsense?
Do you know when peak solar production occurs? Which month of the year? Most people will say June/July.
Nope, it's April/May. Here's a full year:
https://i.imgur.com/EQc8EDD.png
Because solar panels have a negative temperature coefficient. That's why! Which means their output is reduced as the panel temperature increases. In June/July it's just too hot. April/May happen to be the right balance between solar input, temperature and other factors.
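The effect is easy to approximate. A typical crystalline-silicon datasheet lists a power temperature coefficient around -0.4%/°C; the coefficient and cell temperatures below are illustrative values, not measurements from my array:

```python
# Illustrative PV temperature derating relative to Standard Test Conditions (25 C).
def pv_output_kw(rated_kw, cell_temp_c, gamma=-0.004):
    """Array output after temperature derating; gamma is the power temp coefficient per C."""
    return rated_kw * (1 + gamma * (cell_temp_c - 25.0))

rated = 13.0  # kW, the array discussed above
for cell_temp in (25, 45, 65):  # cells run far hotter than the ambient air
    print(f"cell at {cell_temp} C: {pv_output_kw(rated, cell_temp):.2f} kW")
```

Temperature alone doesn't explain the full gap from 13 kW to ~10 kW (wiring, inverter losses, soiling and off-angle light stack on top), but it shows why the nameplate number is a ceiling you rarely reach.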
Remember the graphs showing power generation loss due to clouds? What does that look like over a full month? Well, here's what my output looked like this last April:
https://i.imgur.com/8lYKImD.png
See that? On any given day your power output can be reduced by anywhere between 25% and 50%. And that's in a good month. Look at what happened in January:
https://i.imgur.com/bGuCH2F.png
80% reduction in power output! 80%!
For goodness sake, abandon this fantasy and take the time to learn about reality. What's even more frustrating is that people like you will actually engage in intense arguments armed with nothing more than fantasies. Please.
I have no problem with someone not knowing something. We all have tons to learn. I certainly did not have the level of understanding I have today until I built my own solar array and started to try to understand why my output did not match my expectations. What a lesson that was.
What rubs me the wrong way is when people pretend to know something. I have never acted in that manner in my life. If I don't understand something to a good degree I keep my mouth shut and try to learn from those who actually do. That does not mean I don't make mistakes, but I try really hard not to say anything I don't know or have researched to a reasonable depth.
Let's talk about the consequences of the above graphs and your "and then a miracle occurs" calculations (because that's what they are when compared to reality).
The parabolic power output curve means you have to build a solar array 1.5 times larger in order to deliver the same energy over a roughly 12 hour period as that of a constant-power system (nuclear) producing your peak power.
Why?
Because energy is the integral of the power curve over time. The integral of an inverted parabola is 2/3 the area of the enclosing rectangle (the constant power curve). Therefore, my solar array produces 2/3 the energy of a source that can deliver constant power in the 8 kW to 10 kW range. In order to deliver that energy I have to grow my system by the reciprocal of that, which is 1.5. And, of course, I would have to add batteries if what I am after is power. In other words, I have to fill the areas outside the parabola with power I have stored in batteries.
Wait. There's more. This only covers, say, 12 hours of the day. Now I need to overbuild the system yet again in order to provide power at night. That means, at a minimum, a 2x multiplier. I am now up to 3x (1.5 * 2). In other words, my humble 13 kW system would have to triple in size to 39 kW.
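The 2/3 factor is just the integral of a parabola, and it's easy to check numerically (normalizing peak power and the daylight window to 1):

```python
# Energy under an idealized parabolic solar-output curve vs. a constant-power source.
# Normalize: peak power = 1, daylight window = [0, 1].
# Inverted parabola peaking at t = 0.5: p(t) = 4 t (1 - t), with max(p) = 1.
N = 100_000
dt = 1.0 / N
energy = sum(4 * (i * dt) * (1 - i * dt) for i in range(N)) * dt

print(f"parabola / rectangle energy ratio: {energy:.4f}")  # ~0.6667
print(f"daytime overbuild factor: {1 / energy:.2f}")       # ~1.50
print(f"adding night coverage (x2): {2 / energy:.2f}")     # ~3.00
```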
Are we done?
No.
Why?
Remember the damn clouds? Here's a graph from March of this year:
https://i.imgur.com/yvTdNX0.png
Horrible stuff. You have to account for this. Believing that one is going to have perfectly shaped parabolic output 365 days per year is part of the fantasy I have been referring to. That's not reality.
How do we account for that? If there are no nuclear power plants and no coal (whatever) power plants and we depend 100% on solar (please don't say "wind"), well, there are days when you could fully lose 80% of your output. Heck, you could lose 80% of your output for days or weeks due to weather or fires.
How do you size a system to mitigate such events if an entire city, and all of the EV transportation in that city, depends on locally generated solar energy to exist?
This is where statistics comes in. If we had to fully mitigate that day when output was 1/5 of normal, we would have to build a solar plant 5 times larger still. That's not sensible. Reality probably means sizing to cover a 50% to 100% loss of output, this being a guess that is highly dependent on geography, weather and the statistical probability of fires and other events. Build it in the desert? How much output do you lose to sand storms and sand on the array? Build it where there's lots of rain? You might need to overbuild by 10x to get constant power at the required level.
If I assume we must cover a 50% loss of output, that's another 2x, and we go from 3x to 6x.
So now, in order to be able to deliver 1 GW of power 24/7, you need to build a 6 GW photovoltaic solar array with a massive amount of storage.
The real number, when other factors are taken into account, is likely to be closer to 10x overbuild or more. What factors? Failures, maintenance, fill ratio, etc.
Connecting this to my prior post: if you need to add a true 10 GW of power generation capacity that is available 24/7 to support EVs, you probably have to build at least 100 GW worth of photovoltaic solar generation, plus so many batteries I hesitate to count them.
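Chaining the multipliers from this thread together (each factor is the estimate argued above, not a measured value):

```python
# Compounding the solar overbuild factors argued above.
parabola = 1.5   # daytime energy is 2/3 of constant power, so overbuild by 1.5x
night = 2.0      # cover nighttime from storage
weather = 2.0    # cover days when clouds cut output in half
base = parabola * night * weather        # 6x
total = 10.0     # failures, maintenance, fill ratio, etc. push the 6x toward ~10x

print(f"base overbuild: {base:.0f}x, assumed total: {total:.0f}x")
for delivered_gw in (1, 10):
    print(f"{delivered_gw} GW delivered 24/7 -> ~{delivered_gw * total:.0f} GW of PV")
```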
Instead of this fantasy, we need to get our heads out of our collective behinds and build nuclear power plants. That's the only way. Solar alone can't do it; it can (and should) be a supplemental add-on.
There's more to the story, of course.