• iii@mander.xyz · 11 hours ago

    Source (1)

    Later this month the LA Board of Water and Power Commissioners is expected to approve a 25-year contract that will serve 7 percent of the city’s electricity demand at 1.997¢/kWh for solar energy and 1.3¢ for power from batteries.

    The project is 1 GW of solar and 500 MW of storage; the source doesn’t specify the storage capacity (MWh). It also makes two contradictory statements about the plant’s ability to provide a stable supply (some back-of-the-envelope numbers follow the quotes): (a)

    “The solar is inherently variable, and the battery is able to take a portion of that solar from that facility, the portion that’s variable, which is usually the top end of it, take all of that, strip that off and then store it into the battery, so the facility can provide a constant output to the grid”

    And (b)

    The Eland Project will not rid Los Angeles of natural gas, however. The city will still depend on gas and hydro to supply its overnight power.
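
    Some back-of-the-envelope numbers on why the missing MWh figure matters. Only the 1 GW / 500 MW figures come from the source; the capacity factor and battery duration below are my assumptions:

    ```python
    # Rough check on the Eland figures (illustrative assumptions only).
    solar_mw = 1_000        # 1 GW of solar, from the source
    battery_mw = 500        # 500 MW storage power rating, from the source

    capacity_factor = 0.25  # ASSUMED utility-scale solar capacity factor
    battery_hours = 4       # ASSUMED duration; the source gives no MWh figure

    daily_solar_mwh = solar_mw * capacity_factor * 24   # ~6,000 MWh per day
    battery_mwh = battery_mw * battery_hours             # 2,000 MWh if 4-hour

    print(f"Daily solar output (assumed CF):   {daily_solar_mwh:,.0f} MWh")
    print(f"Battery energy (assumed duration): {battery_mwh:,.0f} MWh")
    print(f"Full-power discharge after sunset: {battery_mwh / battery_mw:.0f} h")
    ```

    Under those assumptions the battery shifts only a few evening hours of output, which is consistent with quote (b): overnight supply still comes from gas and hydro.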

    Source (2) is an analysis of “Levelized cost of energy”, a term it defines as

    Comparative LCOE analysis for various generation technologies on a $/MWh basis, including sensitivities for U.S. federal tax subsidies, fuel prices, carbon pricing and cost of capital

    It looks at the cost of power generation. Nowhere does it state the cost of reaching 90% uptime with renewables + battery, or 99% uptime with renewables + battery. The document doesn’t mention uptime at all, only generation, independent of demand.
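
    For context, a minimal sketch of what an LCOE figure captures (all cost and output numbers below are assumptions for illustration): lifetime discounted cost divided by lifetime discounted generation, with no term for meeting demand at a target reliability.

    ```python
    # Minimal LCOE sketch: discounted lifetime cost / discounted lifetime energy.
    # All figures are illustrative assumptions, not taken from the report.
    capex = 1_000_000_000      # $ up-front build cost (assumed)
    annual_opex = 15_000_000   # $/year operating cost (assumed)
    annual_mwh = 2_200_000     # MWh generated per year (assumed)
    discount_rate = 0.07       # assumed cost of capital
    years = 25

    disc_cost = capex + sum(annual_opex / (1 + discount_rate) ** t
                            for t in range(1, years + 1))
    disc_energy = sum(annual_mwh / (1 + discount_rate) ** t
                      for t in range(1, years + 1))

    lcoe = disc_cost / disc_energy
    print(f"LCOE ≈ ${lcoe:.0f}/MWh")  # prices energy produced; says nothing about uptime
    ```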

    To the best of my understanding, these sources don’t support the claim that renewables + battery storage are cost-effective technologies for a balanced electric grid.

    • Blue_Morpho@lemmy.world · 8 hours ago

      It looks at the cost of power generation

      Yes.

      But then you added the requirement of 90% uptime, which isn’t how a grid works. For example, a coal generator only has about 85% uptime, yet your power isn’t out four hours a day, every day.

      Nuclear reactors are out of service every 18-24 months for refueling, yet you don’t lose power for days, because a plant typically has two reactors and the grid is designed around those outages.

      So the only issue is cost per megawatt-hour. You need two reactors for nuclear to be reliable; that’s part of the cost. You need extra BESS for solar to be reliable; that’s part of the cost.
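
      The redundancy point in probability terms, assuming (for simplicity) that outages of the two units are independent:

      ```python
      # If a single unit is available 85% of the time, two independent units
      # rarely fail together: the grid buys reliability with redundant capacity.
      p_unit_down = 0.15              # single-unit outage probability (assumed)
      p_both_down = p_unit_down ** 2  # assumes independent outages

      print(f"One unit down:   {p_unit_down:.0%} of the time")
      print(f"Both units down: {p_both_down:.1%} of the time")
      # Same logic for solar + BESS: the extra panels and extra MWh are the
      # redundancy, and that extra capacity belongs in the cost comparison.
      ```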

      • iii@mander.xyz · edited · 1 minute ago

        But then you added the requirement of 90% uptime, which isn’t how a grid works.

        I’m referring to the uptime of the grid. Not an individual power source.

        Assume we’ve successfully banned fossil fuels and nuclear, as is the goal of the green parties.

        How much renewable production, and BESS, does one need to achieve 90% grid uptime? Or 99% grid uptime?

        If you want a balanced grid, you can’t just build for the average day (in production and consumption); you have to build for the worst case in both production and consumption.

        The worst-case production for renewables is close to zero for days at a time (example). That means you have to size storage accordingly in order to compare fairly with nuclear, and build enough extra production that there is a surplus to store.
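
        A toy dispatch sketch of that point (every number in the profile is invented for illustration): the multi-day lull, not the average day, sets the storage requirement.

        ```python
        # Toy model: constant 1 GW demand and a solar profile with a 3-day lull.
        # How much storage (GWh) is needed to ride through the worst stretch?
        demand_gw = 1.0
        # Hourly solar output in GW over 10 days: 4 good days, a 3-day lull
        # near zero, then 3 good days (all values invented for illustration).
        solar = [1.3] * 24 * 4 + [0.05] * 24 * 3 + [1.3] * 24 * 3

        def unserved_hours(storage_gwh):
            soc = storage_gwh                 # state of charge, start full
            hours_out = 0
            for s in solar:
                if s >= demand_gw:
                    # surplus charges the battery, capped at its capacity
                    soc = min(storage_gwh, soc + (s - demand_gw))
                else:
                    need = demand_gw - s
                    draw = min(need, soc)     # discharge what the battery has
                    soc -= draw
                    if draw < need:           # battery empty: an hour of blackout
                        hours_out += 1
            return hours_out

        for gwh in (12, 24, 48, 72):
            uptime = 1 - unserved_hours(gwh) / len(solar)
            print(f"{gwh:>3} GWh of storage -> grid uptime {uptime:.1%}")
        ```

        In this toy grid, riding through the lull entirely takes roughly 70 GWh, far more than the few GWh a typical 4-hour battery would hold.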

        If we’re fine with a blackout 10% of the time, I can see solar + BESS beating nuclear on price. If the goal is a reliable grid, I don’t.

        As an example, take Belgium. As a result of this same idea (“solar/wind is cheap!”) we ended up with both (1) higher greenhouse gas emissions and (2) costlier energy, because we now rely heavily on gas generation (previously mostly Russian, now mostly US LNG) to balance the grid. Last winter we even had to run kerosene turbine generators to avoid a blackout.