Posted on Thursday, December 3rd, 2009
A version of this article was published in New Scientist on 3 December 2009.
The oil crisis is not dead, only sleeping, according to an emerging consensus. The price may have collapsed from last year’s all-time high of $147 per barrel to around $75 today, as the recession grinds away at demand for crude, but nobody expects that to last when the economy recovers. Analysts at Goldman Sachs predict oil will cost $95 per barrel by the end of next year, while Deutsche Bank reckons $175 by 2016. The International Energy Agency (IEA), the OECD’s energy watchdog, forecasts a potential “supply crunch” around the middle of the decade.
Yet there is no shortage of oil – at least not underground. Many commentators attribute the $147 spike to the approach of peak oil – the moment when global oil production goes into decline because of geological limits – which should happen, so the theory goes, when we have consumed about half the oil that will ever be produced. And it’s true that by 2008 the world had consumed just under 1.2 trillion barrels of oil, against estimated original reserves of about 2.4 trillion barrels. But that’s just the conventional oil, which is only a fraction of the total.
Conventional reserves are dwarfed by a whole range of non-conventional oil resources, such as the Canadian tar sands, oil shale, and synthetic liquid fuels made from gas or coal, which according to the IEA expand the total oil resource to 9 trillion barrels (see graph). And so far the non-conventionals are almost entirely untouched. So how could there possibly be an oil supply crunch, let alone peak oil, any time soon?
Source: IEA
The trouble is, it’s called non-conventional for a reason. Conventional oil refers to liquid hydrocarbons trapped in deep, highly pressurized reservoirs, which means that when the wells are drilled, the oil usually gushes to the surface of its own accord. Non-conventional oils are not so obliging, and traditionally need large inputs of energy, water and money to get out of the ground and to turn into anything useful like diesel or jet fuel. As a result, non-conventional oil production currently amounts to less than 1.5 million barrels per day, out of a total of around 85 mb/d.
But now a clutch of emerging technologies promise to solve some of the problems associated with non-conventional oil production, and allow it to be produced much more cheaply with far less energy and water. According to Julie Chan, VP Finance at E-T Energy, a Canadian company developing a way to produce bitumen from the tar sands by sending an electric current through the reservoirs, “Canada could eclipse Saudi Arabia”. So are non-conventionals about to ride to the rescue, and confound the peak oil doomsters for decades to come?
The only significant non-conventional oil production today comes from the Canadian oil sands, and so far most of the solid bitumen has been extracted from huge mines run by operators such as Shell and Suncor. But mining is expensive, and new projects need an oil price of $80 to be economic. The process also requires huge volumes of water to process the bitumen, yet the industry is already reaching the legal limits of what can be drawn from the Athabasca River in winter. Worse, mining is only possible for deposits less than 75 metres deep – and that’s just 20% of the total resource.
The rest has to be produced using newer ‘in situ’ techniques like Steam Assisted Gravity Drainage (SAGD), where steam is injected underground to melt the bitumen, which is then pumped out. This is cheaper and uses much less water than mining, but far more energy – usually from natural gas – to raise steam. A report from the Alberta Chamber of Resources in 2005 found that if oil sands production rose to 5 mb/d by 2030, it would need 60% of western Canada’s entire gas supply, which it said would be “unthinkable”.
But now a range of new technologies are being developed that promise to relieve some of the constraints. Nexen, a Canadian oil company, has developed a new twist on SAGD by dispensing with natural gas as fuel, and using some of the produced bitumen to generate energy instead. At its Long Lake site, the company gasifies asphaltenes – the heaviest fraction of the bitumen – to make a synthetic gas, which is used to raise steam for SAGD, and produce hydrogen to upgrade the bitumen on site into high quality synthetic crude oil. This makes the process cheaper and energy self-sufficient, even generating surplus power to export to the grid. The drawback is higher emissions than mining or conventional SAGD, themselves more polluting than average conventional crude. The company aims to expand production from the current 14,000 barrels per day to 60,000 b/d by 2013.
Toe-to-Heel Air Injection (THAI) takes a similar approach, but does its burning underground. THAI involves a horizontal production well, paired with a vertical injector well drilled close to its ‘toe’ (see graphic). To start with, steam is pumped down both wells to heat the bitumen until it is hot enough to combust spontaneously when exposed to air. Then the steam is turned off, and air is pumped down the injector well to feed a ‘fire front’ that moves slowly through the reservoir from the toe of the production well towards the heel, generating temperatures of up to 500°C. The intense heat separates the bitumen into heavier and lighter fractions, with the heaviest, the asphaltenes, being consumed in the fire, and the lighter ones melted to flow into the production well, where they are pumped to the surface. That’s a neat trick, because it means part of the refinery’s job is done underground. The process uses between 10 and 30 per cent of the natural gas consumed by SAGD. It is even self-sufficient for its water needs, because groundwater is pumped up the production well along with the bitumen and recycled.
Source: Petrobank Energy and Resources
Another new approach is not to burn the bitumen underground, but to zap it with electricity, using a technique called Electro-Thermal Dynamic Stripping Process (ET-DSP). A grid of vertical wells is drilled into the oil sands, each containing three electrodes (see graphic). Current is conducted between the wells through groundwater, and the resistance heats the bitumen to flow into a production well in the middle. Changing the voltage gradient between the electrodes allows the operators to direct the electric field to heat the richest parts of the reservoir. Any water that comes up is re-injected to maintain conductivity, and because the process runs on grid electricity, there’s no need for natural gas.
Source: ET Energy
On the basis of Alberta’s largely coal-fired power supply, the electricity used in ET-DSP would produce about the same emissions as SAGD production and more than mining. But E-T Energy, the company developing the technology, insists that upstream emissions could be slashed if it were powered using hydro, wind or even gas-fired power. In a separate development, Bruce Power, an Alberta-based nuclear power generation company, has drawn up controversial plans for new reactors sited near Canadian tar sands deposits to provide CO2-free electricity to power oil sands production.
So with huge reserves and new technologies, can the oil sands put off the oil crunch? Surprisingly, promoters of the newest technologies are skeptical. Bruce McGee, boss of E-T Energy, stresses the massive investments that will be required even to reach industry estimates of 5 mb/d by 2030, and doubts output can be raised significantly further. Chris Bloomer, his counterpart at Petrobank Energy and Resources, the company developing THAI, agrees: “The oil sands are not going to solve the world’s oil supply problems”.
That view seems to be supported by the findings of a recent report on the oil sands’ growth prospects from analysts IHS CERA, after consulting widely in the industry. In its most optimistic scenario, “Barreling Ahead”, in which the industry is supported by strong demand, firm oil prices, and government policy, oil sands output reaches 6.3 million barrels per day in 2035. But to get there, the report assumes production capacity would rise by 200,000 barrels per day each year, twice the growth rate between 2005 and 2008, when the industry overheated, suffering widespread skilled labour shortages, double-digit inflation and endemic project delays. That, says Jackie Forrest, one of the report’s authors, “is really pushing it”.
THAI and ET-DSP may help relieve resource constraints and bring costs down in future, but IHS CERA estimates it will take between 5 and 15 years to commercialize the new technologies. “It could be a decade before it is used in enough reservoirs to contribute meaningfully to production”, says Forrest. And that’s well beyond many forecasts of the next oil crunch.
In the meantime, any growth will depend on older, more expensive methods that are vulnerable to volatility in the oil price. Since the price slumped from its peak of $147 last year, oil sands projects totaling 1.7 mb/d have been cancelled or delayed indefinitely, according to the IEA. If oil price volatility persists, as many analysts predict – with shortage leading to a price spike, leading in turn to recession and low oil prices again – the drag on oil sands development could become chronic.
Len Flint, of Lenef Consulting, which specializes in the tar sands, thinks output will still rise to 2 mb/d by 2012 because of projects that are too far advanced to cancel, but acknowledges the risks to growth further out: “any volatile prices in which there may be a collapse inevitably curtails the development of the oil sands”.
Others take comfort from the oil price recovery from around $35 at the beginning of this year to around $75 today. “There’s been a probably structural uplift in where the pricing should be, and that’s encouraging”, says John Broadhurst, VP Development & Technical Services for Shell Canada. The company, which currently has 215,000 b/d in mining and SAGD production, is in the process of expanding output by 100,000 b/d, but a subsequent expansion is on hold.
Of all the supermajors Shell has most riding on the tar sands, and the company believes non-conventional oil will make up for declining conventional supplies, at least for a time. But even they don’t pretend it will be easy. “It’s going to be a challenge because these are more challenging hydrocarbons to deliver” says Broadhurst.
The challenges for other forms of non-conventional oil production look even greater. Shale oil is another potentially massive resource, with more than 2.5 trillion barrels in place, and has been used to produce oil since before the conventional industry took off in the late 19th century. But production has dwindled since 1980, and today its only significant use is as power station fuel in Estonia.
The basic problem is that oil shale is a misnomer. The sedimentary rock contains no oil, only an organic substance called kerogen, and to produce oil you need to heat the rock to 500°C until the kerogen decomposes into synthetic crude and a solid residue. Traditionally that has meant digging the shale up and baking it in a huge oven, which is energy-intensive and expensive. It also leaves a greater volume of waste than the original shale, as testified by the ‘bings’ or hills of shale slag that dot the West Lothian region of Scotland, after a century of shale oil production that ended in the 1960s.
Five Sisters shale bings, West Lothian. Source: Tormentor4555
So what’s needed is an in situ production method, like those developed in the oil sands. Three quarters of the global shale resource lies in Colorado, Utah and Wyoming, and the Obama administration has recently re-started the process of leasing Federal land for shale oil R&D. A number of ingenious technologies are being developed to heat the shale underground – including microwaves, high temperature gas injection, and radio waves combined with supercritical CO2 – and then extract the resulting oil using conventional oil wells. But all are in their infancy.
Shell has probably done the most work on its extraordinary shale In Situ Conversion Process (ICP), developed at Cathedral Bluffs, Colorado. Electric heaters are lowered into 2000-foot vertical wells and left to heat the shale to 300–400°C for several years, converting its kerogen into oil, which is then pumped out. At the same time the perimeter of the production area has to be frozen to the same depth using wells refrigerated with ammonia to prevent groundwater contamination.
But even after 25 years’ R&D, Shell will not be ready to decide whether to commercialise the technology until the “middle of the next decade and possibly later”. A spokeswoman said it was “premature” to provide information about energy inputs and emissions. The IEA estimates shale oil would cost between $50 and $100 to produce, more when any future carbon penalty is taken into account, and the Agency expects no significant shale oil production this side of 2030.
There’s yet another old-school production method that may experience something of a renaissance in the coming decades. Just as shale oil is nothing new, neither is making liquid fuels from coal. Two German researchers developed the eponymous Fischer–Tropsch process in the 1920s, heating coal to produce a gas of carbon monoxide and hydrogen, which is then catalysed to produce diesel and kerosene. The technology was exploited by oil-strapped, coal-rich Germany during WWII, and by South Africa in the 1980s and early 1990s to beat sanctions imposed during apartheid. South Africa has the world’s only major coal-to-liquids (CTL) plant operating today and China has recently built a demonstration plant in Inner Mongolia.
So, could coal be the answer? While some analysts fear coal production may peak as early as 2025, few doubt there is enough of the stuff to support a major expansion of CTL, and the fuels produced are of a high quality. But the drawbacks are formidable: it takes about two tonnes of coal and up to 15 barrels of water to produce a single barrel of synthetic fuels. That makes it expensive. The IEA says that to supply just 10 per cent of US transport fuel would mean investing $70 billion, and raising American coal production by 25 per cent – an additional 250 million tonnes per year.
Worse, because of the feedstock and energy demands of the production process, CTL fuels have roughly double the carbon emissions of conventional crude on a well-to-wheels – or ‘mine-to-wheels’ – basis. Carbon capture and storage could be applied to the production plant, but the process is only likely to be 90% efficient, and half the total emissions would still escape through the vehicle tailpipe. So even with CCS, CTL is always likely to emit more carbon than conventional crude.
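The arithmetic behind that conclusion can be sketched as a quick back-of-envelope check, normalising conventional crude’s well-to-wheels emissions to 1.0 and taking the “roughly double”, “half through the tailpipe” and “90% capture” figures from the paragraph above:

```python
# Back-of-envelope check on CTL with carbon capture, using the article's
# figures: CTL emits ~2x conventional crude well-to-wheels, half of that
# total escapes through the vehicle tailpipe, and CCS at the plant is
# only likely to capture about 90% of plant emissions.
conventional = 1.0                 # conventional crude, normalised
ctl_total = 2.0 * conventional     # CTL before any capture
tailpipe = 0.5 * ctl_total         # half the total leaves via the tailpipe
plant = ctl_total - tailpipe       # the rest is emitted at the plant
ccs_capture = 0.90                 # fraction of plant emissions CCS captures

ctl_with_ccs = tailpipe + plant * (1 - ccs_capture)
print(round(ctl_with_ccs, 2))      # 1.1 -- still above conventional's 1.0
```

Even with optimistic capture at the plant, the tailpipe alone matches a conventional barrel’s entire footprint, which is why CCS cannot close the gap.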
The Fischer-Tropsch process can also be used to make liquid fuels from natural gas. As with coal, there is no immediate shortage of feedstock. In fact, prices have recently slumped as rising gas production in the US and falling global demand combine to produce a worldwide glut which should last for at least the next few years. But, as with coal, there are major drawbacks.
The gas-to-liquids process (GTL) emits much less carbon than CTL, because the feedstock is cleaner, but still more than conventional crude. That’s because almost half the energy contained in the 280 cubic metres of gas it takes to produce a barrel of GTL fuel is consumed in the conversion process. Three small plants account for global production of 50,000 barrels of synthetic fuels per day. That should quadruple in the next few years with the opening of larger plants in Qatar and Nigeria.
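A rough energy balance bears out the “almost half” claim. The heating values below are assumptions of mine (typical figures of about 38 MJ per cubic metre for natural gas and about 6.1 GJ per barrel of liquid fuel), not numbers from the article:

```python
# Rough GTL energy balance. Heating values are assumed typical figures,
# not taken from the article.
GAS_MJ_PER_M3 = 38.0        # assumed heating value of natural gas
FUEL_GJ_PER_BARREL = 6.1    # assumed energy content of a barrel of fuel

gas_in_gj = 280 * GAS_MJ_PER_M3 / 1000.0   # energy in 280 m3 of feedstock
efficiency = FUEL_GJ_PER_BARREL / gas_in_gj
print(f"{efficiency:.0%} of the gas energy ends up in the barrel")  # ~57%
```

On those assumptions just under 60 per cent of the feedstock energy survives conversion, consistent with almost half being consumed in the process.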
Despite the enormous resource, and some interesting new technologies, the prospects for non-conventional fuels look poor. Not only do they face major practical and financial hurdles, now there is also the prospect of stricter and more widespread carbon regulation.
In the US, Federal bodies are already effectively banned from buying non-conventional fuels, and President Obama has pledged to introduce a nationwide Low Carbon Fuel Standard (LCFS), requiring American fuel suppliers to cut carbon emissions per gallon by 10 per cent between 2010 and 2020. “If the US goes ahead with the LCFS, it will slow down the development of the tar sands”, says Professor David Keith, of the Department of Chemical and Petroleum Engineering at the University of Calgary. Non-conventionals would also be hit by the likely spread of carbon pricing post Copenhagen. The IEA is pushing for an OECD emissions price of $50 per tonne – adding $5 per barrel to fuel derived from tar sands, $12.50 to GTL fuel and $30 to CTL – rising to $110 by 2030. Whether this happens or not, the prospect that it might is likely to deter investment in non-conventionals.
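Dividing each surcharge by the carbon price recovers the incremental emissions per barrel implied by the IEA’s figures – a simple rearrangement of the numbers above, not data from the article:

```python
# Implied incremental CO2 per barrel behind the IEA's surcharges:
# surcharge ($/bbl) / carbon price ($/tCO2) = extra tonnes CO2 per barrel
# relative to conventional crude.
carbon_price = 50.0  # $ per tonne CO2
surcharge = {"tar sands": 5.0, "GTL": 12.5, "CTL": 30.0}  # $ per barrel

for fuel, s in surcharge.items():
    print(f"{fuel}: {s / carbon_price:.2f} tCO2/bbl above conventional")
# tar sands: 0.10, GTL: 0.25, CTL: 0.60
```

The implied penalty ladder – 0.1 tonnes extra per tar sands barrel, 0.6 per CTL barrel – shows why carbon pricing hits CTL hardest.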
As the obstacles to expanding non-conventional oil production pile up, the challenge of replacing conventional oil looks ever more daunting. A major study published in October by the UK Energy Research Centre found that output from conventional oil fields is declining by at least 4% annually, meaning we have to add 3 mb/d of new production capacity every year just to stand still – equivalent to a new Saudi Arabia coming on stream every three years. It also found that decline rates are likely to accelerate, and saw a ‘significant risk’ that conventional production will peak before 2020. So what chance do non-conventionals have of filling the deficit? Not much, according to Steven Sorrel, the report’s lead author. “If everything goes well the oil sands might produce 6mb/d in 20 years’ time, but by then we’ll need to add at least ten times that much capacity without allowing for any growth in demand. So it’s very hard to see non-conventionals riding to the rescue”.
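The decline arithmetic is easy to reproduce. The production figures below are assumptions of mine (roughly 75 mb/d of conventional output in 2009, Saudi Arabia producing about 9.5 mb/d), not numbers from the UKERC report:

```python
# Back-of-envelope check on the decline-rate arithmetic. Production
# figures are assumed round numbers, not from the report itself.
conventional_mbd = 75.0      # assumed conventional output, mb/d
decline_rate = 0.04          # "at least 4% annually"
saudi_mbd = 9.5              # assumed Saudi Arabian output, mb/d

lost_per_year = conventional_mbd * decline_rate   # capacity lost each year
years_per_saudi = saudi_mbd / lost_per_year       # years per "new Saudi Arabia"
print(lost_per_year, round(years_per_saudi, 1))
```

On those round numbers, about 3 mb/d of capacity disappears each year, so a producer the size of Saudi Arabia must indeed be replaced roughly every three years.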
Still, there may be a silver lining: if non-conventionals cannot defer peak oil for long, they will do that much less damage to the climate.