JODI, Canada and the IEA’s Position On Peak Oil

The JODI data came out a few days ago. Below is JODI World Total C+C, with EIA data used for countries not reporting to JODI. I also use EIA data for Venezuela and Iran because JODI relies on figures reported by those two countries themselves, which are political and inflated by about one million barrels per day for Iran and half a million barrels per day for Venezuela.

The data is in kb/d with the last data point September 2013.

JODI World Total

Notice that JODI has a new world high in July, just like the EIA had, but production is down 976,000 barrels per day from July to September.

JODI has Non-OPEC at about 350,000 barrels below the peak in December 2012.

Jodi Non-OPEC

I don’t put much stock in the JODI data but I do find it interesting to look at occasionally. And since it is usually almost two months ahead of the EIA data, it gives me some idea of where production will be two months ahead of the EIA report.

Canada’s National Energy Board came out with Canada’s production of “Crude Oil and Equivalent” about a week ago. The data is in cubic meters per day so it must be converted to barrels per day. You may remember I posted a chart of their data about a month ago. Well they have lowered their expectations somewhat.
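For anyone checking the NEB numbers themselves, the conversion uses the standard factor of roughly 6.2898 barrels per cubic metre. A minimal sketch; the sample volume below is hypothetical, not an actual NEB figure:

```python
# Converting NEB "Crude Oil and Equivalent" figures from cubic
# metres per day to barrels per day. 6.2898 bbl per m3 is the
# standard volume conversion factor.
BBL_PER_M3 = 6.2898

def m3d_to_bpd(m3_per_day):
    """Convert cubic metres per day to barrels per day."""
    return m3_per_day * BBL_PER_M3

# A hypothetical 550,000 m3/d works out to about 3.46 million b/d.
print(round(m3d_to_bpd(550_000)))
```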

Canada NEB

The data after July is where they expect production to be. Apparently they are a little behind on their data gathering. And as you can see they have lowered their expectations somewhat since their report in October.

But now I must tell you about this: peak oil in the IEA’s World Energy Outlook 2013. It appears to be the IEA’s official position on peak oil and the effect LTO (light tight oil, or shale oil) will have on the peak. The text is from pages 447 and 448 of the report.

Spoiler alert: They are also peak oilers, but the peak, they believe, will be after 2035.

Has LTO resolved the debate about peak oil?

It has become fashionable to state that the shale gas and LTO revolutions in the United States have made the peak oil theory obsolete. Our point of view is that the basic arguments have not changed significantly. To understand why, it is useful to revisit the main peak oil argument, which is based on the observation that, for a given basin or country, the amount of oil found and the amount produced tend to follow a rising, peaking and then declining curve over time – known as a “Hubbert” curve. This is either because big fields tend to be found and produced first, followed by smaller fields as the basin matures, or because the cheapest fields are produced first and, as depletion sets in, costs increase (because of smaller, more complex fields) and the basin is outcompeted by other regions. This phenomenon has been observed in many countries (Laherrere, 2003). Where technology opens up a new set of resources that were not previously accessible (as with deepwater or LTO), there can be multiple Hubbert peaks, as each type of resource moves up and then down the curve.

The crux of the peak oil argument has been the assumption that these dynamics, which are well established empirically at the basin or country level, will also take place at the world level (an assumption that has not been vindicated by empirical facts so far). For the purposes of the peak oil argument, the advent of LTO (or other technology breakthroughs) may shift the overall peak in time, but it does not change the conclusion: once the peak is reached, decline inevitably follows rather quickly (and, given the amount of LTO resources compared to the total resources, it could be argued that the peak would be shifted by only a few years in any case).

It is this last assumption – that it is possible to transpose observed country or basin-level dynamics to the world level – that is open to serious doubt. In all the countries that have seen oil production peak, oil demand has continued to increase. This demand has been satisfied, where necessary, by imports from regions that were still pre-peak and therefore lower cost. At the world level, since there is no possibility to import, demand has to be equal to supply. If supply is limited, price will rise, reducing demand (and increasing supply). This price mechanism is expected to lead to a long plateau, or slow decline, rather than the rapid decline observed on a country-by-country basis.

With the acceptance that demand is as important as geology and price in determining worldwide supply, it becomes clear that other factors can play a crucial role. One that has been emphasised in successive Outlooks is the role of government policies. Whether driven by the desire to tackle climate change, or simply to encourage efficient uses of resources, government policies have a large effect on future oil demand. This is illustrated by the policy-driven differences between the scenarios; where we see oil production peaks (as in the 450 Scenario) it is not because oil is becoming more difficult and more expensive to produce, but because demand decreases as a result of policy choices.

Taking into account the large amount of unconventional resources that becomes available as oil prices increase, in addition to the significant remaining conventional resources and the sizable potential for EOR in conventional fields, no peak occurs before the end of the projection period. (In peak language, the URR value that enters into the Hubbert equation is large enough to delay the peak until after 2035). This was already the case before LTO. It has not changed much with the arrival of LTO.

So that’s it, no peak until after 2035. I’m betting that date will be revised… and soon.



95 Responses to JODI, Canada and the IEA’s Position On Peak Oil

  1. Watcher says:

    “The point is that Watcher said 500 bpd”

    Nah, I’m nobody and am going to stay that way. I don’t say anything.

    The data says things. Right here: Just go through it. Most drilling is in the prime counties and they are showing numbers way above 400.

    BTW, factoid: I dug into where this guy got his data. The Bakken Weekly looks like an insert on weekends or something in the Bismarck Trib. I went through some recent issues. Here’s one with some info:

    Page 20 begins a profile of drilling permits, and after that, well completion data. They are STILL drilling predominantly in those prime counties. That looks to me like very constrained geography.

    • dcoyne78 says:

      Hi Watcher,

      I thought you had done some calculation to come up with 500 b/d for average first-month production; I probably misremembered (is that a word?). Note that initial production is the first 24 hours of output and is generally higher than the average for the first 30 days, so the numbers in your link are a bit of an apples-to-oranges comparison.

      Oil companies use high IP numbers for impressive press releases. A 500 b/d number for the first month’s average daily output is possible, but I think a number like 400 to 450 makes more sense, and the best match to NDIC output and well data for the Bakken/Three Forks is something even lower (about 330 b/d for first-month average output).


    • That is not what the data says. Four counties have average initial production above 600 b/d. Blanchard does not say what he means by “initial production,” but that could very well mean the first 24 hours. I know that is what The Bakken News means when they say “initial production.” And no, The Bakken News is not an insert from the Bismarck Trib. I got the paper free for three weeks, but they wanted too much for a full subscription and I passed. It is a separate paper in its own right.

      Two other counties have initial production above 200 b/d, one above 100 b/d, and all the other counties in the state have initial production of less than 100 b/d.

      Nowhere in that report does it give average initial production for all the Bakken and for sure nowhere does it give average first month’s production for all the Bakken. But my money is on David Hughes’ report:

      Drill, Baby, Drill. Figure 64 illustrates the highest one-month production recorded for wells in the Bakken play. The variability of the wells illustrates the differing geological properties within various parts of the play. The mean IP is 400 bbls/d, with very high quality wells at more than 1,000 bbls/d, amounting to less than five percent of the total. The average production of all operating Bakken wells is now 124 bbls/d, because of the effect of steep well declines and the fact that overall field production is from a mix of old and new wells.

  2. Watcher says:

    If you are at even as low as 500 bpd in the first microsecond of well production, and your decline is 60% in the first year, that’s only going to be 60/12 = 5% in the first month. Average thus 2.5% (average of 0% decline in the first microsecond and 5% on day 30 is 2.5%). That’s a lousy 12.5 bpd. It doesn’t take 500 down to 400. It takes it to 487.5. And the article just noted talks about north of 600 for the primo wells, which is where the vast majority of drilling is.
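The linear approximation in this comment can be written out as a sketch (the reply below notes that actual decline is steepest in the first month, so this understates early decline):

```python
# Reproducing the linear approximation above: a 60%/year decline
# spread evenly over 12 months, averaged across the first month.
ip_bpd = 500.0                           # flow at the start of production
annual_decline = 0.60
month_one_decline = annual_decline / 12  # 5% by day 30
avg_decline = month_one_decline / 2      # average of 0% and 5%
drop_bpd = ip_bpd * avg_decline          # "a lousy 12.5 bpd"
first_month_avg = ip_bpd - drop_bpd
print(round(drop_bpd, 1), round(first_month_avg, 1))  # 12.5 487.5
```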

    It doesn’t make sense to me to use a study of generic wells that produced oil, prior to the days of 30 fracking stages, as the basis for extrapolation — when instead you have actual data from the actual counties over the actual rock.

    • Okay, the decline is not linear. The decline is greatest in the first month. And Blanchard gave no data on the average decline for the average Bakken well because that data is not in The Bakken News. Also, The Bakken News lists hundreds of wells with a N/A for initial production. I strongly suspect that this is because they have slim to no initial production at all.

      You wrote: It doesn’t make sense to me to use a study of generic wells that produced oil, prior to the days of 30 fracking stages, as the basis for extrapolation…

      Where did you get that information? The Hughes study was not about generic wells, and all the wells were recent. Continuing with your comment: “when instead you have actual data from the actual counties over the actual rock.”

      The Bakken News gives a lot of data, but they do not come even close to giving you all the data. The wells are listed by company name, and there is lots of bragging about the big ones they bring in but almost nothing about the dry holes they drill. A simple N/A is all you get, and there are lots of them.

  3. Watcher says:

    “But it is a separate paper in its own right.”
    That’s good data. I did notice they had a masthead with its own publisher and editor — but I guess maybe they buy space on the Bismarck Trib’s website because the URL is a page of the

  4. Watcher says:

    dcoyne, the calculation was from Helms’ (there’s a picture of him in that link; looks upper 50s) explicit statement that 200-something well completions was almost 3X the minimum required to grow production.

    The decline rate for the field that relevant month was 50K bpd. We believe that or we don’t. 200 / almost 3 = about 80 or 90 wells he says are required for breakeven. For 90 wells to offset 50K of decline means the wells flowed 555 bpd. Simple arithmetic.
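The Helms-based arithmetic above, written out (the 90-well figure is the commenter’s rounding of “200 / almost 3”):

```python
# Back-of-the-envelope: field legacy decline divided by the number
# of new wells needed just to hold output flat gives the implied
# flow per new well.
field_decline_bpd = 50_000    # that month's legacy decline, bpd
breakeven_wells = 90          # ~200 completions / "almost 3"
per_well_flow = field_decline_bpd / breakeven_wells
print(round(per_well_flow))   # ~556 bpd, matching the "555 bpd" above
```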

    • dcoyne78 says:

      Hi Watcher,

      We have conflicting data from several sources. The production data by Blanchard is interesting, we have Helms’ comments, and we have the data from the NDIC. My model is quite simple and is based primarily on a hyperbolic decline model (I started with Mason’s model and modified it based on data from Rune Likvern) and the number of producing wells and output provided by the NDIC. The number of producing wells is modified slightly by going all the way back to 1953 and looking at the changes in producing wells over time. On average it looks like wells were abandoned at 25 to 30 years; I use 25 years up to 1996 and 30 years from 1997 to the present, and as those old wells drop out I assume an additional new well is added to replace each one, so the total wells added each month is slightly higher to account for this (though the effect is minor).
      Using a single well profile for December 2004 to 2013 does not work well, which makes sense because in the 2005 to 2008 period horizontal drilling combined with fracking was just beginning in the Bakken, so a lower well profile is used over that period (30-year EUR = 156 kb). For the period from 2008 to 2013 a single well profile is used, with a 30-year EUR of about 338 kb when the initial month’s output is 450 b/d, but many different well profiles can be constructed that match the output and well data. Mr. Likvern’s well profile has a 30-year EUR of about 450 kb and a first-month production of about 350 b/d and matches the data closely (note that Mr. Likvern updates his model as he collects new data, and this is based on his older model from late 2012).

      I also constrain my models to match the mean USGS TRR for the North Dakota Bakken/Three Forks (about 8.5 Gb total). To meet this constraint, a model with a higher-EUR well profile requires a more rapid decrease in new well EUR (as we run out of room in the sweet spots) to keep the TRR at 8.5 Gb. For example, the higher-EUR model (431 kb) with lower first-month output (397 b/d) requires a maximum annual rate of decrease in new well EUR of 26% (reached in 2017 in this model) to meet the 8.5 Gb TRR constraint. A lower-EUR model (338 kb) with higher first-month output (451 b/d) allows a more gradual decrease in new well EUR (a maximum annual rate of 16.5%, also reached in 2017). A model could easily be devised with a first month of 600 b/d, but if the first month is actually this high (and both Helms’ estimate of 90 wells for holding output steady and the EIA estimate of legacy well decline are correct), it does not match the data very well. Note also that you need to account for the Montana portion of the Bakken legacy decline and deduct that from your 50 kbpd to determine the legacy decline for the ND Bakken.
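For what it’s worth, the kind of hyperbolic (Arps) well profile described above can be sketched as follows. The di and b values here are illustrative placeholders, not the model’s fitted parameters; they were chosen so a 450 b/d initial rate lands near the ~338 kb 30-year EUR mentioned:

```python
# Minimal sketch of an Arps hyperbolic well profile and its EUR.
# qi, di, b below are illustrative, not fitted values.
def hyperbolic_rate(qi, di, b, t):
    """Arps hyperbolic decline: output rate (b/d) at month t."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

def eur_kb(qi, di, b, months=360):
    """Approximate 30-year EUR (kb) by summing monthly output."""
    days_per_month = 30.4
    total = sum(hyperbolic_rate(qi, di, b, m) * days_per_month
                for m in range(months))
    return total / 1000.0  # barrels -> kb

# Example: 450 b/d initial rate, illustrative decline parameters;
# comes out to roughly 340 kb over 30 years.
print(round(eur_kb(qi=450, di=0.17, b=1.0)))
```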

      For July to August, NDIC shows a 36.4 kb/d increase in the Bakken and an increase of 204 producing wells. The EIA DPR shows a July-to-August increase in all of the Bakken of 22.6 kb/d and a legacy decline of 57 kbpd. For simplicity we will assume no wells were added in the Montana Bakken (I do not have data to verify this, but I think the number would be low), so the difference in the two changes in output is due to legacy decline in Montana (36.4 - 22.6 = 13.8). Legacy decline in the Montana Bakken is thus estimated at 13.8 kb/d, so legacy decline in the ND Bakken is 57 - 13.8, or 43.2 kb/d. From new wells we get 43.2 plus 36.4, or 79.6 kb/d from 204 added wells, or 390 b/d per well, which is pretty close to Hughes’ estimate. If legacy decline is 43.2 kb/d, then 110 wells would be needed to offset legacy decline as of the August data, which is not that far from Helms’ estimate; I think his estimate may be outdated. If Helms’ number of, say, 90 wells to offset decline is correct and the 43.2 kb/d is also correct (note that if wells had been added in Montana, the 43.2 kb/d decline would be too high rather than too low an estimate), then new well first-month output would need to be 480 b/d. Given the conflicting data, the most we can say is that first-month output is between 390 b/d and 480 b/d. Probably I should split the difference and make my model about 435 b/d for the first month, though my gut tells me the Hughes estimate is probably best, so I may go with 420 b/d.
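The Montana/North Dakota legacy-decline arithmetic above, step by step (all figures in kb/d except well counts):

```python
# Splitting EIA DPR whole-Bakken legacy decline between Montana
# and North Dakota, then backing out new-well productivity.
ndic_nd_increase = 36.4        # NDIC, ND Bakken, Jul -> Aug (kb/d)
dpr_all_increase = 22.6        # EIA DPR, whole Bakken (kb/d)
dpr_legacy_decline = 57.0      # EIA DPR legacy decline, whole play

mt_legacy_decline = ndic_nd_increase - dpr_all_increase   # 13.8 kb/d
nd_legacy_decline = dpr_legacy_decline - mt_legacy_decline  # 43.2 kb/d
new_well_output = nd_legacy_decline + ndic_nd_increase      # 79.6 kb/d

wells_added = 204
per_well_bpd = new_well_output * 1000 / wells_added
print(round(per_well_bpd))     # ~390 b/d per average new well
```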


  5. Coolreit says:

    “Ron Patterson says:
    DECEMBER 2, 2013 AT 1:53 PM
    The energy wing of IHS is known as CERA. Daniel Yergin is Vice Chairman of IHS.”


  6. TechGuy says:

    “I am hoping with that name you are an engineering/science type and not just someone who likes technology. Have you seen the study from U Delaware”

    Yes DC, I do have an engineering and science background, and I have done extensive research in renewables. Ron’s reply to your e-mail does a very good job of explaining it. The issue is much more complex than most people understand. Researchers at universities (academics) do not have sufficient real-world experience managing large-scale electrical grids. In addition to reading academic articles about renewables, I also read the power generation periodicals, which do a better job of explaining the real problems of renewables since they are written by insiders who design and maintain the grid.

    This article is not much different from the dozens I have reviewed, as they focus on chemical systems for energy storage. To date there is no viable chemical storage solution because they all have very limited charge/discharge cycles. As Ron discussed, the only viable energy storage solution is the pumped-water reservoir. Unfortunately, all of the locations where these systems could be set up are already fully developed (populated), or there are regional water shortages.

    The article’s discussion of long-distance distributed power systems (i.e. > 1,000 km) still does not account for the fact that only 50% of the earth’s surface is in sunlight at any time, and probably only 30% of the surface has the sun in a favorable position for solar generation (the low-horizon problem). It also does not address large storm/weather systems that can easily cast a region of over 1,000 km into overcast conditions that may persist for multiple days. Snow cover is another problem, persisting for days or weeks unless manually cleared. Getting all this set up and synchronized for AC loads (60 Hz in the US) is extremely complex. At this time it’s simply not feasible.

    In the case of wind, wind power has a Goldilocks problem: if the wind speed is too low or too high, the turbine does not generate power. The wind has to be blowing just right for it to work. Wind can be extremely chaotic, as it may blow for 2 hours and then stop for 30 minutes before resuming again. Storm systems can disable wind turbines for days, by either producing rapid wind direction changes or too much wind, forcing operators to shut them down to avoid damage. Rarely does the wind blow at the correct speed for a wind turbine to generate its labeled output. Most turbine operators see wind turbines generate 30% or less of their labeled output, so if you need 30 MW of power you need a minimum of 100 MW of wind turbines, and probably another 10% to accommodate maintenance (i.e. turbines taken offline for repair or scheduled maintenance).
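The sizing arithmetic in this comment (a ~30% capacity factor plus a 10% maintenance margin) works out as:

```python
# Nameplate wind capacity needed to meet a given average demand,
# using the capacity factor and margin stated in the comment above.
demand_mw = 30.0
capacity_factor = 0.30         # "30% or less of their labeled output"
maintenance_margin = 0.10      # "probably another 10%"

nameplate = demand_mw / capacity_factor            # 100 MW
with_margin = nameplate * (1 + maintenance_margin) # 110 MW
print(round(nameplate), round(with_margin))        # 100 110
```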

    Also consider that 99.9% is not ideal reliability. The national grid is much more reliable than 99.9%. Usually there is a major failure only once in about 15 years. While regional failures are more frequent, it is very rare that 30% or more of the national grid fails. I don’t recall a single event in my lifetime when the grid was down for more than 30% to 40% of the nation. If the East Coast is down, the West is usually operating fine, and when the West is down, the East is operational. It takes considerable effort and a lot of money to get the grid back up, as grid operators must turn on sections of the grid one at a time and balance power generation with the load.

    Few of these articles (probably none) discuss the true investment cost of renewable power. If you have a demand of 10 GW, renewables will need 4 to 12 times that in generation capacity (depending on the region and the technology used) to address low- or no-output periods. In the case of a distributed system, each spot must produce enough power to meet 100% of the demand when other regions are producing no power. There also needs to be excess capacity to address nighttime demand, periods of overcast that persist for days, and maintenance. Distributed systems are also much more expensive to install than base-load plants, as the infrastructure needed to connect the power sources to the grid must be spread out over a much larger region. A typical coal-fired plant or nuke plant will produce 1 GW in a tiny footprint; an operator can connect the entire plant’s output to the grid with a single cable run. With distributed systems, power lines need to be run to connect each individual power source (wind turbine, solar panel array). These connections cost real money.

    To suggest that renewables can be cost-competitive with fossil systems is just grossly wrong.

    • dcoyne78 says:

      Hi Techguy,

      I believe you underestimate the researchers at universities. Where do you think power engineers learn their craft?

      I agree that the study may be optimistic. However, they studied an area where there is already very good connectivity. As for distributed systems and their costs, those systems are already connected, and the distributed nature reduces the problem of intermittency.

      If you include the externalities of coal and nuclear and all costs of those systems, such as decommissioning nuclear power plants or the health care costs associated with coal power, the costs look much different. Power can be stored with batteries, and natural gas turbines can also be used as a backup.

      I am with OldFarmerMac on this one; we will just have to disagree. There are plenty of places, such as Texas, where wind is taking off. As solar costs come down further, it will take off as well. If power companies pay a premium to get power from electric cars, people will be willing to do it. I don’t expect this will happen until there is greater penetration of electric and plug-in hybrids in the market, probably around 20% or so, which may be a decade or two away; in the meantime batteries or fuel cells can be used for storage. Very little storage is needed at 30% penetration; as natural gas prices rise, storage will become more competitive on our way to 90%, perhaps by 2040 or 2050.

      Only climate change deniers think coal is the solution. I am fine with nuclear, though I think as much wind and solar as feasible is the best solution.


  7. Wes538 says:

    North Dakota well permit and completion data from “The Bakken Breakout” and any other newspapers with similar data all appear to be directly copied and pasted, more or less, from the Daily Activity Report published each business day by the NDIC. To the best of my knowledge, initial production values are for the first 24 hours after being tested, and wells that have come off of “tight hole” (confidential) status but do not have any initial production numbers are generally wells that have yet to be tested, typically because they have not yet been frac’ed and completed. They are rarely dry holes. If there are any true dry holes to report on any given date, they will be listed under a separate category in the report.

    Montana issues activity reports on a weekly basis every Thursday. Usually they get posted online (click “Weekly Activity Letter”) sometime on Friday.

  8. aws. says:

    For background it might be worth checking out some of the webcasts from the First Energy conference David Hughes gave the “Shale Myths” keynote presentation at.

    The conference was an opportunity for small to mid-size oil and gas producers to pitch their prospects. I watched the Yoho Resources webcast, which is a pitch for their Duvernay and Montney shale prospects. There is discussion and data about initial production, costs, and the percentage of boe that is liquid (just condensate).

  9. Tribe Of Pangaea- First Member says:

    We’ve got a very interesting time ahead of us…
    Climate and related scientists seem to be ramping up the vocals and radicalism (‘do or die decade’/’leave it in the ground’), then there’s what you’ve suggested a thousand times; then there’s financial/accounting fraud/smoke-and-mirrors as the apparent new modus operandi; then there’re outfits like (protests, civil disobedience, FF divesting campaigns) and some more radical related ones, like Deep Green Resistance (for their underground factions; likely blowing up energy infrastructure) etcetera. Storms are already here, and brewing on the horizon.
    Have you ever heard of the video, The Roach Motel at The End of The Universe? It starts off with the ‘North Dakota oil debacle’.

  10. Techsan says:

    Okay: I am really falling for this stuff.

    PhD scientist and electrical engineer, with 12 kW of PV solar on my roof, 12 kWh of storage, two electric cars (Leaf and Volt) and electric bicycles; net-zero energy and no direct fossil fuel use except on summer vacation (which, of course, could be eliminated if necessary).

    Yes, storage costs something, but not a huge amount. My 12 kWh of batteries works out to $25 per month.

    Here’s an interesting fact: for less than the cost of 3 years’ gasoline for an average gasoline car, you can buy solar panels that will power an electric car for the same number of miles per year — for the rest of your life. I find that electric cars are 7 times as energy-efficient as gasoline cars (4+ miles per kWh).
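As a rough cross-check of the “7 times as energy-efficient” claim: taking the stated 4 miles per kWh, the EPA’s 33.7 kWh-per-gallon gasoline energy equivalence, and an assumed 19 mpg gasoline car (my assumption, not Techsan’s), the ratio does come out near 7:

```python
# Sanity-checking the efficiency ratio between an EV and a
# gasoline car on an energy (kWh) basis. The 19 mpg figure is an
# assumed average; 33.7 kWh/gallon is the EPA energy equivalence.
KWH_PER_GALLON = 33.7
ev_mi_per_kwh = 4.0                          # stated: "4+ miles per kWh"
gas_mpg = 19.0                               # assumed gasoline car
gas_mi_per_kwh = gas_mpg / KWH_PER_GALLON    # ~0.56 mi/kWh
ratio = ev_mi_per_kwh / gas_mi_per_kwh
print(round(ratio, 1))                       # ~7x, consistent with the claim
```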

    So, don’t believe the nay-sayers. I’ve not only done the math, I’ve also done the experiment. It is fully implemented, thoroughly instrumented, and it works great.

    • dcoyne78 says:

      Hi Techsan,

      Did you look at that study from the U of Delaware?

      As an electrical engineer (you did not say power engineer so this may not be your area), what do you think of the “copper plate assumption” in that study and overall does it strike you as reasonable for a large grid area such as the area studied (northeast US)? Thanks.


  11. Watcher says:

    “To the best of my knowledge, initial production values are for the first 24 hours after being tested, and wells that have come off of “tight hole” (confidential) status but do not have any initial production numbers are generally wells that have yet to be tested, typically because they have not yet been frac’ed and completed. They are rarely dry holes.”

    This is good data and makes sense. Only so many dry $8 million wells can be tolerated.

    Maybe the most powerful thing I see in the data is the concentrated geography and what it means for that EIA estimate or someone’s estimate of just how many total wells can be drilled.

    What I’m thinking about is how an expanded fracking stage count would have a negative effect on the total. If each well spreads out over more surface area because of technology improvement, the total number of wells probably has to be lower — simply because some of the rock under that surface area was already drained. In other words, the estimate was probably made with assumptions of shorter fracking distance. With that expanded, the estimate is high.

    Drilling outside those 4 counties has happened, and those wells are listed in that article with weak numbers. It’s not like the boundaries of where it all works have not been explored.

    There is talk flying around about descending deeper for Three Forks oil, but what do the economics say? If you drill and frack a well at one depth, do you bring in a rig and start anew to go deeper? Or do you drill that same well deeper when its flow gets low enough . . . for something? Or, hmmm, maybe you can drill deeper and collect whatever is there, adding to it whatever residual flow comes from the shallower structure. That would be for that even more limited geography where there is Three Forks structure under the other anticline.

    On third thought, that is dumb. You’d force yourself to wait for the first well’s flow to decline to some safe level. You might not want to wait for that. You might need more IP to counter decline *now*. Hmm so this means you likely drill an entirely new well. No way to dodge that cost.

  12. dcoyne78 says:


    You often comment about the 30 fracking stages, but I am pretty sure this has been standard practice in the Bakken since 2010 at least (I think it started in 2008, and by 2010 most companies had devised an optimum strategy). The data I am using is from Rune Likvern, on wells which started producing from 2010 to 2012, all wells with at least 12 months of data. Those are the wells I base my model on; I modify it from time to time as new data becomes available and attempt to match the model with data from the NDIC.
    I agree that Helms’ remarks point to 500 b/d for first-month production, and this matches the IHS data. In a comment above I modified my model to show what changes if we force a higher first-month production in the optimization (450 b/d vs 330 b/d). The short answer: the peak is about the same but moves forward by a year; the difference is not that significant.


  13. Watcher says:

    Didn’t know fracking was at 30 stages in 2010. I’ll go back and dig into some recent pressure pumping department managers’ briefings which, as my (somewhat vague) recall has it, said it was the new normal, with “new” defined as weeks to a month or two — but I have to go find it. Not sure.

    The point is not all that metaphorical, but it need not be specific. Technocopia is usually handwaving imaginings of some future miracle. Fracking expansion is not handwaving; it’s an easily envisioned effect. Cost per stage has to be front-loaded (equipment transport, blah blah), so there’s no economic diminishing-return issue. Already discussed. Regardless, more stages should lead to fewer geographically possible wells.

    BTW, pretty sure I saw recent babble from somewhere about 50 stages, but I think it was in some comment thread, unlinked.

  14. Tribe Of Pangaea- First Member says:

    “…for less than the cost of 3 years’ gasoline for an average gasoline car, you can buy solar panels that will power an electric car for the same number of miles per year — for the rest of your life.” ~ Techsan

    I wonder how long we’ll have a functioning highway system.
    But, yes, the system, whatever it is, may appear surprisingly resilient, until it isn’t… here and there. In fractal collapse (I prefer this term over ‘catabolic collapse’) you supposedly have extreme catastrophes interspersed with warm-and-fuzzy declines and everything in between. And even the occasional improvements/inclines.
    All those engineers and PhDs, and we still seem to be headed for the abyss…

  15. Jeffrey J. Brown says:

    “I wonder how long we’ll have a functioning highway system.”

    It would be ironic if some unfortunate soul were to be driving his new Tesla across a bridge that fails, because it should have been replaced 10 years ago.

  16. Techsan says:

    No, I’m not a power engineer; electronics and computers.

    The “copper plate assumption” is obviously wrong but greatly simplifies the analysis: you can just total up power supplies and requirements, and ignore simulating where power comes from and goes to, and how much is lost in transmission.

    On the other hand, it is not too bad an assumption. Transmission will cost maybe a 10% loss. They acknowledge that they are ignoring some minuses (such as transmission losses) and some pluses (such as demand management), and wave their hands and claim the pluses are bigger. I agree with that; demand management, especially, can do a lot.

    • dcoyne78 says:

      Thanks. It is impossible for any analysis to be completely realistic, or it becomes too complex; certain assumptions have to be made. It seems to me that for the area they considered, the copper plate assumption is not too bad, considering how interconnected that area is. There is quite a bit of redundancy built into the system, and upgrades are always needed in any case; the equipment does not last forever. It would be good if the power engineers considered this in plans for grid updates, because cheap natural gas and the days of ignoring global warming will not last forever. Also, nuclear power is not cheap, especially when decommissioning costs are fully accounted for, along with liability protection provided by the government at no cost.


  17. TechGuy says:

    “Very little storage is needed at 30%”

    DC, here is my challenge to you. Instead of accepting an article’s information as fact, why not research the matter on your own? Spend 40 to 80 hours to understand electrochemistry, battery and fuel cell technology. Learn about battery charge/discharge cycling and issues with anode erosion. Question everything. The more you question and study on your own, the more real knowledge you obtain. Don’t settle for someone else’s work. Then come back and discuss your findings. See if your improved understanding of electrochemical storage systems still supports your original argument. Apply the scientific method!

    If you are unwilling to do the research on your own then you have no ground to promote a hypothesis.

    “If power companies pay a premium to get power from electric cars, people will be willing to do it. ”

    Well, that would be great for the power companies but a terrible deal for the consumers, as it will be the consumers who pay the replacement cost of their batteries when they can no longer charge effectively.

    ” I am fine with nuclear”

    Add nuclear power to the challenge too. I would have thought the disaster at Fukushima would have given everyone a strong reason to distrust nuclear as a solution. Even the recently retired NRC chairman has gone anti-nuke:

    “I believe you underestimate the researchers at universities. Where do you think power engineers learn their craft?”

    The majority of researchers remain academics for their entire career. Often their research is funded by businesses, which turn around and use this research to get the government to buy their products (the tail wagging the dog). I know this because I am an engineer who has worked on projects which involved work performed by researchers.

    These researchers always produce a biased report that reflects positively on the businesses that funded their research. Research is done by PhDs; the real-world implementation is done by engineers. The PhDs get to work in a very controlled lab environment that is shielded from the real world. Who has the tougher job? The engineers, of course!

    FWIW: The article you reference is extremely light on details. There is no model of an actual working system backed by real-world data and simulations. It’s written as an executive summary, most likely for an audience of government officials, to promote and fund a large project.

    • dcoyne78 says:

      Batteries are used for backup all the time. There is no practical reason this could not be scaled up for grid backup except cost. Let’s assume you believe the science of global warming and think that using coal is a bad idea and that eventually people in the US start to realize that the science is correct (just as most people now realize that smoking cigarettes is not good for your health). So the US imposes a large tax on all fossil fuels at their production source (or when they are imported in the case of imported fuels that are not already taxed at similar level in their country of export). In this case the high cost of batteries or fuel cells competes with the high cost of fossil fuels as a backup to wind and solar.

      I assume you either read only the abstract or skimmed very briefly through the paper (I have seen very few 15-page executive summaries; these tend to be 2 to 3 pages at most). The 30%, 90% and 99.9% levels refer to the number of hours covered by wind, solar or storage (battery or fuel cell) backup; they are not estimates of system reliability. Any hours not covered by renewables are covered by fossil fuel backup. They used actual weather and load demand over a four-year period in their simulation. Clearly there is no actual system to test; we are talking about the entire PJM interconnect. It is a feasibility study and an attempt to determine what the lowest-cost renewable system would look like.

      The results, briefly, for those who do not want to read a 15 page technical paper:
      At 2008 costs the cost-minimized system covers 30% of all load hours with renewable power and 60% of average power needs at a price of 11 cents per kWh. At anticipated 2030 costs the cost-minimized system covers 90% of all load hours with renewables and over 95% of average power output, at a price of 6 cents/kWh for GIV (grid-integrated vehicle) storage and 8 cents/kWh for fuel cell storage. The 30% system provides the renewable power from inland wind and the rest from fossil fuel; there is no storage in this system. The 90% system (at 2030 prices) does include storage from GIV or fuel cells, but most power is from inland wind (about 65%) and offshore wind (about 15%).
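      For readers who prefer the numbers laid out explicitly, here is a compact restatement of the quoted results as a small data structure (figures are as quoted above, not recomputed from the paper):

```python
# Compact restatement of the scenario results quoted above
# (numbers as given in the comment; not recomputed from the paper).
scenarios = {
    "2008 costs, 30% of hours": {"cost_per_kwh": 0.11, "avg_power_covered": 0.60},
    "2030 costs, 90% of hours (GIV)": {"cost_per_kwh": 0.06, "avg_power_covered": 0.95},
    "2030 costs, 90% of hours (fuel cell)": {"cost_per_kwh": 0.08, "avg_power_covered": 0.95},
}

for name, s in scenarios.items():
    print(f"{name}: {s['cost_per_kwh'] * 100:.0f} c/kWh, "
          f"~{s['avg_power_covered']:.0%} of average power covered")
```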

      I find the study very informative, but I will research it further.

      Note that if all coal-exporting countries put carbon taxes on their coal at a similar $/ton of carbon dioxide emitted when burned, this takes care of the “if we don’t burn it, someone else will” argument. We need to make it expensive to burn. Clearly one country cannot make this happen, which is why an international treaty like the Montreal Protocol is needed. If that doesn’t happen, a carbon tax can be added to imports from countries with heavy coal use that don’t have a CO2 emission tax themselves. All carbon tax revenue could be used to reduce income tax so that the net tax collected remains the same.


  18. Tribe Of Pangaea- First Member says:

    The system is built with the rusting rebar of irony.

  19. TechGuy says:

    “Batteries are used for backup all the time. There is no practical reason this could not be scaled up for grid backup except cost. ”

    And they are replaced all the time. A typical data center using batteries replaces them every 3 to 5 years depending on how many times they are used, and they are usually used infrequently. The more a battery is used and the deeper the cycling the sooner it wears out and must be replaced.

    The issue I see with batteries is that frequent shallow draw-downs will result in a very short lifespan. Consider that with intermittent wind, on multiple occasions the battery cannot be recharged to 100%; this partial-charge cycling damages the battery. You have to research this to understand it, since it would take several pages for me to explain.
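    The trade-off between depth of discharge and cycle life can be illustrated with a toy model. A power-law relation of the kind often quoted for lead-acid chemistry is assumed here; the constants are purely illustrative, not manufacturer data, and the model ignores calendar aging and the partial-charge damage discussed above:

```python
# Toy model: battery cycle life vs. depth of discharge (DoD).
# The power-law form and constants are illustrative assumptions,
# not manufacturer data.

def cycle_life(dod, k=1200.0, alpha=1.3):
    """Estimated charge/discharge cycles before end of life
    at a given depth of discharge (0 < dod <= 1)."""
    return k * dod ** -alpha

def years_of_service(dod, cycles_per_day):
    """Calendar years until cycle life is exhausted (ignores
    calendar aging, temperature, and partial-charge damage)."""
    return cycle_life(dod) / (cycles_per_day * 365.0)

# Shallow daily cycling lasts far longer than deep daily cycling:
print(years_of_service(0.2, 1))  # 20% DoD once a day
print(years_of_service(0.8, 1))  # 80% DoD once a day
```

The point of the sketch is only that service life is very sensitive to how hard the battery is cycled, which is why grid-storage cost estimates depend heavily on the assumed duty cycle.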

    “I assume you either only read the abstract or skimmed very briefly through the paper (I have seen very few 15 page executive summaries, these tend to be 2 to 3 pages at most). ”

    15 pages is tiny for a technical paper. A technical paper on such a complex topic would probably be 500 to 600 pages; 15 pages would be an executive summary of a technical paper. It appears there is a longer document they’ve written, but the link is only showing 15 pages of it.

    Here is the key sentence that is misleading about their entire research:

    “In order to realistically simulate generation, we use insolation and wind data from DOE and NOAA for each hour being modeled. ”

    They should have gotten real data from the operators of the wind farms. They are basically making up the wind generation data using collected NOAA wind data, and probably using the DOE data for demand estimates. I would have been more satisfied if they had set up data loggers dispersed at different proposed or existing wind farm locations and used that data to interpolate generation estimates. The NOAA data is like estimating the volume of a lake by eyeballing it from a satellite image. In my opinion this paper should not be used as a valid analysis.

    “Let’s assume you believe the science of global warming and think that using coal is a bad idea”

    There are worse issues than global warming, such as a total economic collapse or a global nuclear war caused by mass civil unrest. If the grid goes down in the US for more than 8 to 10 days, most US nuke plants will begin exposing spent fuel rods as the spent-fuel pool water boils away. You’re worried about the world in 50 to 100 years; I am worried about a global crisis in the next ten years. Global warming caused by humans will cease when oil and natural gas production can no longer sustain the global economy. If Ron’s estimates are correct, we could see serious problems beginning in less than a decade. Once oil shortages begin, global trade becomes too expensive, and coal mining drops off too.

    Let’s assume your assumptions that renewables are cost-competitive are correct. Let’s also make this easier by assuming we can replace 25% of electricity generation with renewables. Wouldn’t it make sense to build all of the infrastructure first, before shutting down the existing infrastructure? Today the government is forcing the existing infrastructure (coal plants) to be shut down without even an implementation plan to replace it. Put it this way: imagine a major highway is shut down tomorrow because it generates too much pollution, and the plan is to hope that someone builds a mass-transportation network to replace it. What do you think the outcome will be?

    Unless the EPA rules are changed, the US will lose 330 GW (42%) of its generating capacity by 2022, as it is not possible to meet the mandate with any coal-fired plant. Many plants will close before that date. From the articles I have read, none of the major plant operators has any plans to replace them. They think natural gas prices are going to soar again, and none of them wants to build new nukes, since that costs tens of billions of dollars and takes about ten years. They have nine years to replace 330 GW.
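    The replacement arithmetic implied by those figures is straightforward (a sketch using the comment’s own numbers, not independently verified):

```python
# Build rate implied by the comment's figures: 330 GW retiring
# by 2022, with roughly nine years available to replace it.
retiring_gw = 330.0
years_available = 9
build_rate = retiring_gw / years_available  # GW of new capacity per year
print(f"required build rate: {build_rate:.1f} GW/year")  # prints "required build rate: 36.7 GW/year"
```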

    “Note that if all coal exporting countries put carbon taxes on their coal,”

    That’s not going to happen, and it would not matter, as China and India have enough coal to continue expansion for decades. The US is exporting coal that is cleaner to burn than what Asia has; they buy our coal to cut pollution (well, at least China does, since I don’t think we ship much coal to India at this point). The US has better grades of coal available. Good luck telling the Chinese or the Indians to cut back. I am pretty sure they will reply with extreme profanity!

    “At 2008 costs the cost minimized system covers 30 % of all load hours with renewable power and 60 % of average power needs at a price of 11 cents per kwh”

    I am pretty sure this cost estimate is way off:
    “Cape Wind currently has PPAs with NStar and National Grid for 18.7 and 20.5 cents per kilowatt-hour (not including 3.5 percent annual escalators).”

    Real costs are being masked by subsidies. I am sure any distributed system spread out over 1,000 km is going to be much more expensive, as I discussed in my earlier comments. Cost per kWh at fossil fuel plants is only $0.04 in New England; consumers in NE pay three to four times that (for distribution, maintenance, and taxes). Assuming the real cost of wind is $0.19/kWh, the cost to the consumer would probably be near $0.40 to $1.00/kWh, since power companies would need to spend money on distribution, storage systems, or redundant distributed farms that are perhaps 1,000 km away or more. As I commented earlier, if you need 30 GW of wind power with 99.9% reliability, you need to build 100 GW of real capacity so that if the output from your primary source drops to zero, you can draw 100% of the load from alternate sources. So you need to factor in the extra costs. At $0.40 to $1.00/kWh, most businesses reliant on electricity would be forced to close, as it would be uneconomical to operate. Already many industrial businesses have off-shored to where the costs of electricity are much lower. Further cost increases will only send more production and jobs overseas, including your job.
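    The overbuild arithmetic in that argument can be sketched in a few lines. All of the inputs below are the commenter’s own assumptions (the $0.19/kWh wind cost and the 100 GW built to firm 30 GW), and the model is deliberately crude: it simply assumes every delivered kWh must carry the capital cost of the idle backup capacity:

```python
# Back-of-envelope cost of overbuilding wind capacity for reliability.
# All figures are the commenter's assumptions, not measured data.
firm_demand_gw = 30.0            # load that must be served reliably
built_capacity_gw = 100.0        # capacity built to firm that load
overbuild_factor = built_capacity_gw / firm_demand_gw

wholesale_wind = 0.19            # assumed unsubsidized $/kWh for wind

# Crude assumption: delivered energy carries the cost of all the
# capacity built, so effective cost scales with the overbuild factor.
effective_cost = wholesale_wind * overbuild_factor
print(f"effective generation cost: ${effective_cost:.2f}/kWh")  # prints "$0.63/kWh"
```

A real system analysis would be far more subtle (the “idle” capacity still generates salable energy much of the time), which is why the study and the comment reach such different cost figures.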

    Oh, one last thing: if your goal with renewables is to curb global warming, and fuel cells or other devices using hydrogen are used, that is a problem, since leaked hydrogen acts as an indirect greenhouse gas and also attacks ozone. There are also many issues with storing hydrogen (it is very difficult to seal because of its small molecular size, and it causes hydrogen embrittlement). Hydrogen storage is very expensive.

    BTW: Do you own an off grid system yet?

  20. dcoyne78 says:

    Hi Techguy,

    No I don’t have any solar at all, but when I get some it will be grid connected.

    I understand that batteries are not perfect, but the 30 % coverage of load hours requires no storage.

    Problems with batteries or fuel cells are not relevant for that case. This scenario includes only inland wind, and so your op-ed talking about the high current costs of offshore wind is not really on topic.

    Current costs of 4 cents/kWh in New England ignore the externalities of current electric power output; if those costs are included, the price goes up. In the case of the PJM, the load-following contract price is 8 c/kWh and externalities are about 9 c/kWh, so the 11 c/kWh should be compared with 17 c/kWh, because with wind power the external costs are quite low. The 4 c/kWh which you claim is the cost of electric supply in New England is no doubt the baseload price and does not include the higher prices needed during peak summer hours. The 99.9% case requires only 9 hours of central battery storage over a four-year period. The study does not reveal the number of hours of storage needed at the 90% coverage level at 2030 prices, but judging by the power capacity of batteries in the 90% case in Table 8 on p. 66 of the paper (which is about half of the 99.9% scenario), I think fewer hours would be needed in this case. Note that the GIV and fuel cell cases require more hours of storage; by 2030 fuel cells and GIV give the lowest costs, and I don’t think the obstacles to either technology are insurmountable.
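    The externality-adjusted comparison above is simple arithmetic; a sketch using the figures quoted in the discussion (the PJM load-following contract price plus the externality estimate, against the study’s 2008-cost 30% system):

```python
# Comparing wind vs. fossil cost once externalities are added.
# Figures are those quoted in the discussion, not new data.
fossil_contract = 0.08     # $/kWh, PJM load-following contract price
fossil_externality = 0.09  # $/kWh, estimated external costs
wind_30pct_system = 0.11   # $/kWh, 2008-cost 30% renewable system

fossil_full_cost = fossil_contract + fossil_externality
print(f"fossil incl. externalities: ${fossil_full_cost:.2f}/kWh")  # prints "$0.17/kWh"
print(wind_30pct_system < fossil_full_cost)  # prints "True"
```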

    Note that earlier you suggested that the GIV solution cannot work because consumers will have to replace their batteries sooner. That is a decision that is up to the consumer: if I think the price the power company is paying for the use of my car battery is too low, I don’t have to participate.


    Your suggestion that data should be collected at prospective wind sites over a four-year period is a good one. The study used the data available, and further research along those lines may well be worthwhile.

  21. Dave Ranning says:

    You seem to have created the most visited site from survivors of TOD.
    Hats off, and very valuable.

  22. Tim E. says:

    Too true. Some interesting reading from Energy Skeptic:

    “Why and how does “high strength” modern concrete crack and erode (which lets water in, eventually rusting out the rebar inside, ruining the structure)?”

    “The Romans built concrete structures that lasted over 2 thousand years. Ours will last a century — at most.”

    “How long will concrete last if it isn’t maintained?”

    Thank you Ron for the hardcore information and truth.
