Thursday, December 20, 2007

Proposal to Dam Red Sea.

It is thought that 50 GW of power might be extracted from a dam built across the Red Sea, a scheme which, it is suggested, might alleviate energy-related tensions in the Middle East. Opponents of the idea think that the scheme could cause environmental damage on a huge scale and even displace millions of people from their homes. The environmental costs must presumably be weighed against those of the CO2 emissions which, it is argued, the project would save, and against the easing of demand on other resources which will run short within a few decades. The dam would need to be 18 miles across, to match the sweep of the Red Sea at its narrowest point, where it opens into the Gulf of Aden.

Hydropower is clean and entirely renewable, albeit there are arguments that the creation of dams, in which areas of land are deliberately flooded, causes the release of methane, another gas thought responsible for global warming. Presumably there would also be some flooding of coastal regions in this project, which would be a stupendous feat of civil engineering. Regarding a timescale, one estimate of 25 years has been given, by comparison with the time taken to build other large dams.

The largest existing hydropower installation is the Itaipu Dam on the Paraguayan-Brazilian border, which produces 12.6 GW of electricity. The Three Gorges Dam in China is due for completion in 2009, and more than one million people were displaced during its construction: hence the fears among environmentalists that a similar outcome may befall coastal populations around the Red Sea. The Three Gorges Dam currently generates 13.4 GW and, when fully operational, is expected to increase its output to 22.5 GW. By way of contrast, Niagara Falls generates just 2.5 GW, taking 90% of the flow of water that passes over the falls themselves.

As ever there are many issues to be considered, but the world may need all the hydropower it can get, rather than using nonrenewable fuels such as gas, coal and uranium to produce electricity, supplies of which may be under quite some pressure in 25 years' time, if not before then. However, it is clear that we can't have it both ways, i.e. preserve our energy-rich lifestyle and avoid environmental impacts, whether they be on land, sea or the atmosphere, in their interconnected entirety.

Related Reading.
"Proposal: 50 Gigawatts if they dam the Red Sea," By Rick C. Hodgin.

Monday, December 17, 2007

U.K. Nuclear Power Stations to Soldier-on?

Plans have been announced by British Energy to keep the Hinkley B (in Somerset, England) and Hunterston B (in North Ayrshire, Scotland) nuclear power plants running for an additional five years. Both began producing power in 1976, and it is now intended to run them until at least 2016, with tests to be made in 2013 to determine whether they can be operated safely beyond that initial target. All but one of the U.K.'s current reactors are due for decommissioning by 2024, which means that the nuclear industry has a lot of engineering work on its hands, especially if plans to increase the nation's overall nuclear capacity are followed.

It is thought that other nuclear power stations owned by the company might also have their operating lives extended. For Hinkley B and Hunterston B, the move will cost around £90 million on top of the funds for the company's current programme of investment. Both plants suffered from certain infrastructural difficulties in the past year, resulting in an operational load of 60%, which it is believed can be increased to 70%, making the extra five years of useful life economically worthwhile.

Two more nuclear plants, at Hartlepool and Heysham 1, have suffered reduced output after wire-corrosion in their boiler-closure units was identified within only the past few months. It is believed that the strategy will help to maintain the U.K.'s electricity output while alternatives, whatever they might prove to be, are implemented. For example, Hinkley B and Hunterston B provide enough power for over one million homes, and British Energy holds around one sixth of the national generating capacity.

The government of Scotland is apparently not opposed to the scheme of running its power stations to the end of their natural lives, while Westminster has yet to make a formal announcement, but it seems likely to endorse such a scheme south of the border. I wonder which will give out first: the world's nuclear power plants themselves or, as some aver, their supplies of nuclear fuel?

Related Reading.
"Longer life for British Energy's nuclear power stations." Yorkshire Post, December 12th, 2007.

Friday, December 14, 2007

Shell to make Biofuel from Algae.

I have written on the subject of producing biodiesel from algae before, and it now looks as though Royal Dutch Shell plc and HR Biopetroleum are to build a plant in Hawaii to grow algae and turn it into fuel. One very attractive feature of the strategy in general is the amount of diesel that might be "grown" per hectare compared with that derived from plants. According to the best estimates, somewhere over 100 tonnes of fuel might be synthesised per hectare from algae, while 10 tonnes would be good going for the best crops, e.g. palm, soya or jatropha, and probably just a tonne or so from rape.
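To put the quoted yields into perspective, here is a minimal sketch of the land area each source would need to supply a hypothetical one million tonnes of fuel a year (the yield figures are the article's estimates; the million-tonne target is illustrative only):

```python
# Rough land-area comparison for the biodiesel yields quoted above
# (tonnes of fuel per hectare per year, per the article's estimates).
yields_t_per_ha = {
    "algae": 100,
    "best crops (palm, soya, jatropha)": 10,
    "rape": 1,
}

target_tonnes = 1_000_000  # hypothetical annual fuel target

for crop, y in yields_t_per_ha.items():
    hectares = target_tonnes / y
    print(f"{crop}: {hectares:,.0f} ha needed")
```

The hundredfold difference between algae and rape translates directly into a hundredfold difference in land requirement, which is the crux of the argument for algal fuel.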

Furthermore, since the algae must absorb CO2 to build their carbon mass, they offer a potential advantage of carbon sequestration: although CO2 is released when the diesel is burned, the growth of the next crop of algae will take up more of this important greenhouse gas. However, it is my understanding that the highest yields of algae involve forced conditions, with CO2 pumped into the reactors at elevated concentration, rather than simply leaving them open to the atmosphere to absorb ambient concentrations of the gas. It would perhaps be beneficial to locate algal production plants next to fossil-fuel powered electricity generating stations, from which to catch CO2 and feed it to the algae.

However, the Hawaii facility intends to use open-air ponds to grow the algae, which will be unmodified marine microalgae, indigenous to Hawaii, using patented methods. Presumably the latter get around possible risks of contamination by other algal species, which would lower the final yield of what is sometimes called "algoil." The technology has one more substantial benefit: unlike growing crops for fuel, which must eventually compete for a limited area of arable land with growing crops for food, the algal ponds can be placed on any land, including coastal areas which are of no use for conventional agriculture.

The facility in question will be run by a Shell/HR Biopetroleum joint venture company, called Cellana, and will be located on the Kona coast of Hawaii Island, near other facilities which also grow algae, mostly for pharmaceuticals and food. The Cellana facility will use high-pressure CO2 from cylinders to explore the potential of applying the gas from industrial sources, e.g. power plants, as noted above. There is also a reduced demand on freshwater, supplies of which are likely to fall short worldwide (some have predicted there will be wars over water), since the algae can be grown in ponds filled with seawater.

I think this is a very positive step, and it will be interesting to see how the project develops and, if it is successful, just how easy it is to scale up the technology to match anywhere near the 30 billion barrels of petroleum the world uses annually. In the latter respect, if technology based on algae is to provide part of our salvation in the Oil Dearth Era, it needs to be installed large and soon.

Related Reading.

Wednesday, December 12, 2007

UK Wind Farms.

It is intended to provide up to half of the UK's electricity using wind farms based in the North Sea, the Irish Sea and around the Scottish coast. Turbines of up to 850 feet in height are envisaged, 100 feet higher than Canary Wharf, each capable of powering 8,000 homes, with a combined generating capacity of 33 GW. John Hutton, the energy secretary, is set to open up the entire coast of these islands to wind-turbine installations, except for those regions deemed essential for shipping.

The plan is to have the scheme up and running by 2020, but there will still need to be fossil-fuel powered electricity generation to cope with demand on occasions when the wind does not blow, without which the nation would be vulnerable to power shortages. It is expected that the turbines will be visible from all locations in Britain, which is the subject of controversy. Presently, only around 0.5 GW of our electricity is provided by wind power, out of a total generating capacity of 75 GW.
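As a rough sense of scale, the figures quoted can be compared directly (a minimal sketch: note that the 33 GW is installed capacity, not average output, so the actual share of electricity delivered would depend on the wind load factor):

```python
# Current wind capacity and the planned offshore build-out,
# as fractions of today's total UK generating capacity.
wind_gw = 0.5    # present wind-power capacity
total_gw = 75    # total UK generating capacity
target_gw = 33   # planned offshore wind capacity

print(f"wind share today: {wind_gw / total_gw:.1%}")          # well under 1%
print(f"planned offshore share: {target_gw / total_gw:.0%}")  # of current capacity
```

The jump from under 1% to over 40% of installed capacity shows just how ambitious the 2020 target is.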

According to Mr Hutton, "The UK is now the number one location for investment in offshore wind in the world and next year we will overtake Denmark as the country with the most offshore wind capacity. This could be a major contribution towards meeting the EU's target of 20% of energy from renewable sources by 2020."

This should be compared with the fact that the UK is running out of renewable energy, as a surge in demand by businesses has outstripped the electricity provided by wind farms, hydropower and the burning of waste gas. Interest in cutting carbon emissions has greatly stretched new supplies of zero-carbon electricity, which is a pain for companies that have agreed to become carbon-neutral.

Clearly, we need to get the wind-farms up and running as soon as possible.

Related Reading.
(1) "Business runs out of green energy supply," By Juliette Jowit, The Observer.
(2) "Giant offshore wind farms to supply half of UK power," By Jonathan Leake, The Sunday Times.

Thursday, December 06, 2007

"Seeding the Ocean": a Discredited Strategy.

It has been proposed to "seed" the oceans with iron filings or other nutrients in an effort to stimulate the growth of phytoplankton (marine algae), and thus to remove CO2 from the atmosphere through photosynthesis. However, new research suggests this may not be an effective strategy, since less carbon is transported to the sea-floor during the summer months when algal growth is at a maximum than during the rest of the year.

The process is known as a "biological pump" because it incorporates CO2, in near-surface waters, into algae which then sink into deeper waters, thus sequestering the carbon. As a corollary to this line of thought, the more algae there are in the surface regions, the more CO2 is taken up and the greater the mass of carbon-rich detritus "pumped" to the bottom of the seas. The new findings, based on a novel mathematical analysis of the data, contravene this assumption, however.

The primary author of the paper, which is published in the Journal of Geophysical Research, Dr Michael Lutz, said: "The discovery is very surprising. If, during natural plankton blooms, less carbon actually sinks to deep water than during the rest of the year, then it suggests that the Biological Pump leaks. More material is recycled in shallow water and less sinks to depth, which makes sense if you consider how this ecosystem has evolved in a way to minimize loss. Ocean fertilization schemes, which resemble an artificial summer, may not remove as much CO2 from the atmosphere as has been suggested because they ignore natural processes revealed by this research."

Publication of the paper follows close on the heels of a September Ocean Iron Fertilization symposium at the Woods Hole Oceanographic Institution (WHOI), at which matters related to environmental safety, economics and, indeed, just how effective the procedure might be in capturing CO2, thus reducing its concentration in the atmosphere, were discussed. Dr Hauke Kite-Powell, of the Marine Policy Center at WHOI, estimated a potential future value of $100 billion for the technology on the international carbon-trading market, and yet none of the major studies to date has demonstrated that it results in any significant degree of carbon sequestration. It is argued that these have been of too short a duration, and that vindication of the approach will need larger-scale and more permanent experimental arrangements.

According to Professor Rosemary Rayfuse, an authority on international law and the law of the sea, based at the University of New South Wales, since such fertilization strategies are not approved under any carbon-credit schemes, the sale of "offsets" on the unregulated voluntary markets is "nothing short of fraudulent." She said: "There are too many scientific uncertainties relating both to the efficacy of ocean fertilization and its possible environmental side effects that need to be resolved before even larger experiments can be considered, let alone the process commercialized. Ocean fertilization is "dumping" which is essentially prohibited under the law of the sea. There is no point trying to ameliorate the effects of climate change by destroying the oceans - the very cradle of life on earth. Simply doing more and bigger of that which has already been demonstrated to be ineffective and potentially more harmful than good is counter-intuitive at best."

Dr Lutz commented: "The limited duration of previous ocean fertilization experiments may not be why carbon sequestration wasn't found during those artificial blooms. This apparent puzzle could actually reflect how marine ecosystems naturally handle blooms and agrees with our findings. A bloom is like ringing the marine ecosystem dinner-bell. The microbial and food web dinner guests appear and consume most of the fresh algal food. Our study highlights the need to understand natural ecosystem processes, especially in a world where climate change is occurring so rapidly."

This is a fair point, and a timely reminder of the potential dangers of all strategies of "geoengineering". The earth is literally and mathematically a complex system, and it is folly to try to compartmentalise its elements, "treating" aspects of them in isolation, when we don't really understand the nature of their interconnected whole. In "fixing" one "problem" we may trigger off a whole host of sympathetic troubles by interfering in the natural balance of the planet's systems, precipitating changes not predicted by any artificial algorithm of how the earth is supposed to behave, but to which we may yet be awakened by rude reality.

Related Reading.
"New research discredits $100B global warming "fix"," By Virginia Key.

Tuesday, December 04, 2007

Cheap "Organic" Solar Power?

To use current silicon technology at the thickness (200 microns) at which it is employed in the present generation of solar cells is not feasible on the grand scale. The main problem is how quickly high-grade silicon wafers can be fabricated, and I calculated previously that around 100 times the present production capacity would need to be installed over 20 years to make a real dent in the anticipated shortfall of future world electricity production. Thin-film cells, which use perhaps 100 times less material, and in more accessible amorphous forms rather than crystalline wafers, would represent a considerable advantage in terms of raw-material requirements, but this technology needs further refinement.

In contrast to such devices based on "mineral", inorganic, semiconductor materials, there is the possibility of organic cells, which instead are made from carbon molecules. A conventional photovoltaic (PV) cell consists of a silicon wafer of thickness 200 microns (one fifth of a millimetre), to be compared with a human hair, which is around 70 microns thick. This is treated with other materials to form a double-layer structure which is known in electronics as a p-n junction. Photons of light are absorbed by the silicon, causing a flow of electrons and hence a small electric current. In an organic cell, the double layer is made from two ultra-thin (100 nanometre, or 0.1 micron) films of organic conducting polymers, embossed onto glass.

A prototype organic cell has been developed by Neil Cavendish at Cambridge University; one about the size of your hand can produce enough electricity to run a pocket calculator. Most standard solar cells operate with a light-to-electricity conversion efficiency of around 10 - 15%, but organic cells have so far proved much less efficient, at perhaps only 3 - 4%. However, they are much cheaper to produce. Indeed, Paul O'Brien at Manchester University thinks that solar cells need be no more expensive to make than high-performance self-cleaning glass. He said: "We're very interested in solar cells where we take an organic layer that's printable or sprayable containing an inorganic mineral like lead sulphide which will actually do the photon capture."

Indeed, lead sulphide can be fabricated into minute "nanorods", perhaps 100 nanometres in length and 20 nanometres in cross-section, which can be dispersed in the semiconducting polymer, releasing electrons within the material, which can then conduct an electric current. All researchers in the field stress the need to move away from carbon-based fossil fuels in order to mitigate climate change, or, in my view, more urgently to use less of the cheap oil and gas that we are running short of. In principle, cheap solar cells could be incorporated into the walls and roofs of buildings in the form of building-integrated photovoltaics (BIPV), as a means to reduce the burden of costs yet further. O'Brien reckons that the new solar cell technology might cost as little as one hundredth as much as silicon cells, and that will surely provide an incentive for further development.

Nonetheless, the clock is ticking away toward the Oil Dearth Era, and any such technologies need to be installed quickly, if we are to avoid a massive energy crunch, especially if electricity is implemented in various strategies to keep transportation running. There is also the issue of other materials such as platinum, which will be needed on a massive scale to underpin such innovations - another potential bottleneck in the shifting schemes of potential "solutions" to the impending and unavoidable energy crisis.

Related Reading.
"How solar power could become organic - and cheap," By Michael Pollitt, The Independent, 29-11-07.

Saturday, December 01, 2007

"Oil" from Wood Chips and Nuclear Power.

The two are not directly connected, and yet both are set to be part of the energy mix deemed necessary to run the post-cheap-oil world. Shell has collaborated with Choren Industries to build a pilot plant in Germany, near Freiberg, which uses wood chips to make synthetic fuel. In an adaptation of the Fischer-Tropsch (FT) technology, in which coal is "fired" into synthesis gas (a mixture of H2 and CO) which is then converted into hydrocarbons over a cobalt catalyst, wood chips are similarly gasified and turned into synthetic fuel. The FT process was invented at the Kaiser Wilhelm Institut für Kohlenforschung in 1923, and contributed to keeping Hitler's armies fuelled during WWII, which would otherwise have ended years before it did, since the Allies had blockaded Germany from conventional supplies of crude oil. It was thought that the Germans would be starved of fuel within months of the start of the war, but their scientific ingenuity proved otherwise.

The generalised methods of converting coal to liquid fuel are termed coal-to-liquids (CTL), and the analogous conversion of wood and other biomass is known as biomass-to-liquids (BTL). The latter is strictly an experimental technology, and there is much more to be done before it might be implemented on the grand scale. It is one of the second generation of biofuels, which are really the only way that anywhere near the amount of petroleum-based fuel might be matched from renewable "bio" sources without compromising food production, as happens when growing crops for fuel encroaches onto land for food crops. As an example, even if all of the UK's arable land were turned over to growing sugar for ethanol fuel, only about half its oil-based equivalent could be matched: hence, even if we starved, we could only run half our current transportation fleet overall. The essential basis of "second generation" biofuels is the conversion of lignocellulose into fuel, vastly increasing the "yield" of fuel per hectare by using a material that is normally discarded in crop production.

The pilot plant in Freiberg will make 15,000 tonnes of fuel each year, but construction of a far larger plant at Schleswig-Holstein, with a capacity of 200,000 tonnes annually, to produce "Sunfuel", as Choren has nicknamed the BTL product, is due to begin next year. BTL is a component of Shell's XTL programme, which includes GTL, a form of diesel made from natural gas, hence the "G". The latter technology is being implemented in the form of the world's greatest civil engineering project, employing a workforce of 30,000, namely "Pearl", based in Qatar, with the intention of converting some of that country's huge reserves of natural gas into 140,000 barrels daily of synthetic fuel. This amounts to over 50 million barrels per year of a diesel that is completely free from sulphur. I attended a meeting run by the Royal Society of Chemistry in Oxford recently, at which it was concluded that second-generation methods, either cracking lignocellulose into sugars to make ethanol, or via BTL/FT into diesel, would not be operating commercially before 2020, i.e. well into the Oil Dearth Era.

Nuclear power is set to be an essential component of the energy mix. When I started writing these articles, I thought that we could dispense with nuclear and run everything on renewables; I am no longer of that opinion, and we will need to replace the old generation of reactors in addition to building new ones. How feasible it will be to expand the nuclear industry remains to be seen, both in terms of engineering and the availability of nuclear fuel. Depending on how uranium, or for that matter thorium, is "burned" in nuclear reactors, and how assiduously exploration for further sources of these fuels is pursued, we may have hundreds or thousands of years' worth left to exploit, and hence the technology could be viewed as "renewable". Gordon Brown has, however, outlined four sites for new-build nuclear: Sizewell in Suffolk; Dungeness in Kent; Hinkley in Somerset; and Bradwell in Essex. It is interesting that all of these are in the south of England and none in Scotland, where, of course, Mr Brown hails from!

Related Reading.
(1) "Shell turns to wood chips and straw in search for the road fuel of the future," By Carl Mortished, International Business Editor, Times Online, 2nd March, 2007.
(2) "Brown outlines four sites for nuclear power stations," By Colin Brown, The Independent, 29th November, 2007.

Wednesday, November 28, 2007

Not Oil-Power but Horse-Power.

"The Independent" newspaper reports that at least 70 towns in France have adopted horse-drawn carriages as a substitute for vehicles powered by petroleum-derived fuel. The move is part of an effort to reduce CO2 emissions. Now, horses do emit CO2, but only as derived from renewable carbon fuels: oats and hay. The carriage is called the "hippoville", and is fitted with disc-brakes, signal-lamps and removable seats. As far as cost is concerned, a starting price (so to speak) is around £8,000 (11,000 Euros), which is about the price of 160 barrels of crude oil. I note this morning, incidentally, that the price of North Sea Brent crude has fallen by $3 to $93 per barrel, as a result of the promise by OPEC to increase its production by 500,000 barrels a day. One wonders, with record amounts of water being pumped out of the giant Ghawar field (the world's biggest producer of crude oil), how feasible this is, amid speculation that its production has already peaked.

As has been pointed out, the hippoville is not a pollution-free vehicle, since a 1,000-pound horse produces about 50 pounds of dung every day, along with six to ten gallons of urine, which, if soaked up by bedding (straw), would provide another 50 pounds daily. Extrapolating over a year, such a horse will produce about ten tonnes of dung and an equivalent amount of urine/straw. Now, when I was a child, it was a common sight to see people going out behind a horse, picking up its dung to put on their gardens. Rhubarb was particularly favoured for this treatment, and became the subject of a number of British lavatorial jokes; such is our sense of humour. So, the horse's exhaust-products could be put to good use in agriculture.
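The yearly extrapolation above can be checked with a little arithmetic (the daily figures are those quoted; the pounds-to-tonnes conversion is the standard one):

```python
# Sanity check of the yearly figure: a 1,000 lb horse producing about
# 50 lb of dung a day, plus roughly the same again in soiled bedding.
LB_PER_TONNE = 2204.6  # pounds per metric tonne

dung_per_day_lb = 50
bedding_per_day_lb = 50

dung_per_year_t = dung_per_day_lb * 365 / LB_PER_TONNE
bedding_per_year_t = bedding_per_day_lb * 365 / LB_PER_TONNE

print(f"dung: {dung_per_year_t:.1f} t/yr; bedding: {bedding_per_year_t:.1f} t/yr")
```

Each stream works out at a little over eight tonnes a year, i.e. "about ten tonnes" in round figures, as stated.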

It is worth recalling that, before the motor car became popular, there was speculation that the projected future number of horses would leave city-folk waist-deep in dung, and it was noted what a considerable effort New York City made in disposing of some 12,000 horse carcasses per year. I presume they were rendered down to make glue and for other purposes. I walked past an expensive restaurant in Thun, in Switzerland, some time ago, and noted with surprise that "Pferdefleisch" was on the menu - "horse meat" - and so this might prove another advantage of the horse, at least in some countries, though I doubt it over here, in this nation of animal-lovers.

Quite seriously, I fully expect to witness a come-back for the horse, amid the society of local farms and small communities to which I envisage we will return, whence we came before the age of cheap oil. As Thomas Hardy described in his novels, e.g. The Mayor of Casterbridge (his alias for Dorchester, in the south-west of England), such an agrarian lifestyle was extremely hard, especially if you were poor. He describes the journeyman farm labourers who walked 20 miles a day in search of work, and worked for a few pennies a day until that work was done, then moved on to whatever they could find next. There was no welfare state then, and if a labourer was ill, or injured in this terribly dangerous profession, he simply got no money.

I do not envisage this extreme, but an emphasis on home-production - local farms and breeding horses and other animals is more realistic than the "hydrogen economy" for instance, or other technical fixes that will not be introduced in time, or if at all, to save us from the imminent energy-crunch, particularly in terms of transportation. If this nation and others must become as near self-sufficient as possible to survive, the horse will become an essential ingredient of the "energy mix" we often hear about.

Related Reading.
(1) "The horse: Is this the secret weapon to beat global warming?" By Geoffrey Lean, Environment Editor for The Independent.

Monday, November 26, 2007

New Brazilian Oil-Field - "Tupi".

An oil field named Tupi, located off Brazil in the Santos Basin, has increased that country's accounted reserves of hydrocarbons by 50%, which, in a period of escalating oil prices, looks fantastic. Tupi is thought to contain between 5 and 8 billion barrels of what is termed "intermediate gravity oil" [see definitions at the end of this article], since it is accorded an API (American Petroleum Institute) "gravity" of 28 "degrees". If the API gravity of an oil is less than 10, it is heavier than water and sinks; if it is above 10, it is lighter than water and floats on top of it. The API gravity may be related to the specific gravity (relative density) of the oil by the formula:

API gravity = (141.5/specific gravity) - 131.5.

By rearranging this, the specific gravity (SG) may be deduced as:

SG = 141.5/(API gravity + 131.5).

Hence, the Tupi oil has an SG of: 141.5/(28 + 131.5) = 0.887, corresponding to a density of about 887 kg/m^3.
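The formula and its rearrangement can be expressed as a pair of small functions, checked against the Tupi figure above:

```python
def api_to_sg(api_gravity):
    """Specific gravity (relative to water) from API gravity."""
    return 141.5 / (api_gravity + 131.5)

def sg_to_api(sg):
    """API gravity from specific gravity; the inverse of api_to_sg."""
    return 141.5 / sg - 131.5

sg_tupi = api_to_sg(28)
print(f"Tupi crude: SG = {sg_tupi:.3f}")     # 0.887, as above
assert abs(sg_to_api(sg_tupi) - 28) < 1e-9   # round-trip check
```

Note that anything with SG below 1 (API gravity above 10) floats on water, consistent with the rule of thumb stated earlier.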

However, the case of Tupi is complex, and recovering the oil is going to be a considerable task. The oil lies under a layer of salt, which lies under a layer of rock, which lies under great depths of sea. Probably the rock (pre-salt layer) lies under 2 - 3 kilometres of water, and is itself maybe 2 kilometres thick. The salt layer is gauged at around another 2 km in thickness, and having got through all that, there lies the reservoir of oil, and probably gas too, since oil cooked at such depths (and the corresponding temperatures) has in all likelihood produced gas. The company working on the Tupi project, Petrobras, is currently exploring for oil and gas at waterline-to-reservoir depths of 5 km, and is the world leader in offshore hydrocarbon exploration.

The salt-layer itself poses some particular challenges. Drilling through salt has been done before, but at Tupi the salt-layer is of an unprecedented thickness for drilling and the depths involved are greater than have been tackled before. To get some idea of the pressure, we can note that the pressure of water increases by about one atmosphere for every 10 metres depth. For solid crustal material, it is nearer three atmospheres for each 10 metres of descent (and an average of 4.5 atm./10 metres at much greater depths). Hence, if the sea layer is 3 km, that imposes 300 atm., below which is 2 km of rock, i.e. 3 x 200 = 600 atm, and then the salt itself, which yields another 600 atm. say, a grand total of 1500 atm. pressure (around 1.5 x 10^8 Pascals or 0.15 Gigapascals, GPa).

The salt is also heated by geothermal energy (from the interior of the Earth), and under these combined conditions of pressure and heat it behaves less like a solid and more like a jelly, with properties of flow, so that a hole may be drilled through it but then closes up. At the Coordination of Postgraduate Engineering Programmes (COPPE), hosted at the Federal University of Rio de Janeiro, there are three high-pressure chambers that permit the simulation of conditions at depths of 6,000 metres, where drilling equipment is tested.

Another problem is that when the oil is pumped up from the reservoir it is hot (100 degrees C) and fluid, but at the sea floor temperatures are only around 2 degrees C, where the oil becomes "thick", and this can block its flow to the surface. It is possible to get around this by heating the pipes, but this all adds to the costs of recovering the oil. The weight of the very long, 7 km, pipes can also impose mechanical stress on the steel they are made of; one suggestion is to use titanium instead, but this is a far more expensive material, adding further to the final bill.

On a final note, even at the predicted production from Tupi of 400 kb/d by 2015, this amounts to just 0.4/80 x 100 = 0.5% of the 80 million barrels a day of oil that the world currently uses, and who knows what it will cost per barrel to produce. Either way, as it becomes necessary to drill in increasingly difficult places to get it, oil will become a very expensive commodity, and perhaps other forms of "oil", e.g. as produced from Coal to Liquids (CTL) plants, now considered expensive, or Biomass to Liquids (BTL), thought to be operational technology by 2020, may become economically viable. Either way, the age of cheap oil is well and truly over.
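The final percentage works out as follows (figures as quoted above):

```python
# Tupi's projected output as a share of current world oil consumption.
tupi_kb_per_day = 400   # predicted production by 2015, thousand barrels/day
world_mb_per_day = 80   # current world use, million barrels/day

share = (tupi_kb_per_day / 1000) / world_mb_per_day * 100
print(f"Tupi would supply about {share:.1f}% of world demand")
```

Half of one per cent of world demand, for all the engineering effort described, underlines the point that the age of cheap oil is over.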

Related Reading.
(1) "Tupi, the new kid in town", By Luis de Sousa:

[In general, oils with an API gravity of 40 to 45 command the highest market price, and those with values outside this range sell for less. Above an API gravity of 45, the molecular chains become shorter and are less valuable to a refinery. Crude oil is classified as light, medium or heavy on the following basis:

Light crude oil has an API gravity above 31.1°.

Medium oil has an API gravity between 22.3° and 31.1°.

Heavy oil has an API gravity less than 22.3°.

In contrast, the US Geological Survey uses slightly different definitions, but put simply, bitumen sinks in fresh water, while oil floats.

Oil which will not flow at normal temperatures is defined as bitumen, for which the API gravity is normally less than 10°. Bitumen derived from the oil-sands deposits of the Alberta area of Canada has an API gravity of around 8°. It is "upgraded" to an API gravity of 31° to 33° by dilution, and the upgraded oil is known as synthetic crude.]

Thursday, November 22, 2007

England's Green and Pleasant Land.

So were written the words of "Jerusalem" by William Blake, the final stanza of which goes:

"I will not cease from mental fight,
Nor shall my sword sleep in my hand
Till we have built Jerusalem
In England's green and pleasant land."

It is sung by strong men at rugby football matches, with tears in their eyes, but it could be taken as an environmental anthem. However, while Britain is still a green land, that could all change as future events take their course. Fields of biofuel crops could take the place of pastures of grazing animals, while enormous acreages of identically cultivated crops cover the space between identical urban landscapes, villages as we know them no longer in existence. A report from Natural England, entitled "Tracking Change in the Character of the Urban Landscape," has concluded that 40% of the nation's landscapes are deteriorating from their traditional vistas.

There is public antipathy toward farmers, as they are often perceived: taking hefty EU subsidies while creating mountains of potatoes and butter that nourish no one; polluting rivers and streams; cutting away the hedgerows to make enormous fields to grow more crops for more profit, while killing wild animals and birds. The list could go on. It is true, however, that a mere 60% of our food is produced within the nation's borders. It is clear too that food prices will soar, but this is no fault of the farmers - inevitably, rising oil prices will make the production of food and its transportation more expensive, and so the produce itself, whether grown at home or imported from elsewhere. Food production has in fact fallen by around 1% per year for the past 15 years.

Farmers have been "hit" by BSE, swine-fever, foot-and-mouth disease, to name a few tragedies, while bird-flu constantly threatens further calamity. The suicide rate among farmers is the highest of any occupation, and it is also a truth that few other industries are forced to sell their production at less than cost-price. Public transport in the countryside is generally lamentable and so even the rural "poor" need to be multi-car families, to access basic services such as schools, the doctor, the bank and buying food. Those in this position will suffer greatly as prices, especially of fuel, rise.

To meet the desired provision of nearly 6% of Europe's fuel demand in the form of biofuels by 2010, some 3 million tonnes of wheat from a UK total of 15 million tonnes would be required, thus removing it from the food market and eliminating any such surplus for export. The economic success of China means that its population will most likely move from a diet based on rice to one based on grain, and that will put pressure on world markets, possibly leading to a doubling in grain consumption within 40 years. To produce more food means either planting crops that give greater yields per acre or cultivating more land.

The EU has reduced the amount of "set-aside" land from 8% in 2006-2007 to zero in 2007-2008, but since this land is much less productive it will not yield another 8% of crops - it tends to be stony ground, headlands, fens and forest land. There will be major changes made to the countryside, including moving animals onto uncroppable hill-land, in an effort to cram in crop production wherever crops can be grown. GM (genetically modified) crops are also thought likely to be necessary, as they give higher yields, although the debate over GM has not yet been resolved.

Supporters of nuclear power claim that its expansion will reduce the amount of land needed to produce biofuels, by which I presume they mean that nuclear energy can be used to produce hydrogen, which could instead serve as a source of transportation energy. I have my doubts about this on any significant scale, but nuclear power can certainly help us to keep the lights on. All in all, I read the signs here as a call to maximise national self-sufficiency, certainly in food, while the problem of providing transportation fuel remains, as oil becomes increasingly expensive and in short supply.

Related Reading.
"Goodbye beautiful Britain," Sunday Times Online, August 26th, 2007.

Monday, November 19, 2007

An ex-Peak Oil Believer Speaks!

I have referred to the essence of the latest upwelling on the subject of Peak Oil in previous postings, which is underpinned by the mostly Russian idea that petroleum (oil) is produced by chemical processes within the earth, and is not a product of the decomposition of dead animals and plants cooked within the near-surface strata of the planet over millennia. Two books have been written on the subject, which expand upon the notion that hydrocarbons are produced either by bacterial action on iron oxides or by the hydrolysis of metal carbides at a depth of some kilometers. I discussed the elements of both in a recent posting, "Vast Oil and Life in the Deep Earth," which I also posted as one of my regular monthly columns, in respect respectively of "The Deep Hot Biosphere" by Thomas Gold and "Jagged Environment" by Chris James. This is known as the "abiogenic theory."

Now, F. William Engdahl has recanted his former stance that oil is about to run out, and believes that the biogenic theory of oil production, favoured in the West, is untenable. I think there is a good point being made here: for the latter hypothesis to be true, to form the massive Ghawar field in Saudi Arabia, dead dinosaurs etc. would have needed to be trapped to a total volume of 19 miles cubed, at depths of maybe 4,000 - 6,000 feet below the Earth's surface, and elsewhere in rock formations at offshore locations such as the North Sea and the Gulf of Mexico. There are various theories about the events that might have occurred at the Earth's surface (we don't really know what has happened, or still does, at its greater depths), but it is possible that the pervasion of life and its separation following the break-up of Pangea, tectonic motion or pole shifting (that is, the literal slipping of the Earth's crust over the semi-fluid asthenosphere; a terrifying scenario, to put it mildly!) might have left its remains thus... but nobody really knows.

However, there may well be many different sources of petroleum. Hydrocarbons may represent an "energy minimum" into which more complex molecules can be "cooked", and there may indeed be unimaginable trillions of tonnes of "oil" lying under our feet. Some may come from animal/plant detritus, and other, presumably deeper volumes from geochemical processes. But this changes nothing about the crisis (in transportation fuel especially) that faces the world. If such reserves do exist, how accessible are they, and in what amount can they be reasonably extracted?

My understanding is that "deep-drilling" is necessary to access these sources, irrespective of their origin, and so there are limits to how fast we can pull petroleum from the earth to match the 30 billion barrels that we currently demand from her each year. It is a simple question of supply and demand, and our demand for oil rises inexorably. Russia is apparently drilling deep wells around the Caspian coast, with alleged success, and yet I can find no confirmation of this. Either Russia will become the world's greatest producer of petroleum, and hence the major world superpower, transcending the United States, or it will suffer the fate of all industrialised societies, which will necessarily relocalise into smaller communities in an effort to survive. In any case, the preponderance of cheap oil is over, and so therefore is the modern world and the lifestyles we customarily associate with it. There will always be oil, in all likelihood, but it's going to cost.

Related Reading.
(1) "The Deep Hot Biosphere", by Thomas Gold, ISBN: 0-387-95253-5, Copernicus Books, 2001. (Available from and
(2) "Jagged Environment", by Chris James, ISBN: 0-954-00940-1, JEpublications, 2001. (Available from but not Or from
(3) "Confessions of an "ex"- Peak Oil Believer," By F. William Engdahl:

Thursday, November 15, 2007

Chinese Takeaway.

China has been accused of "eating the world", as the jaws of the dragon consume more and more resources to feed a relentless appetite for growth, which is anticipated at a sanguine 10% for 2008. Indeed, the slack from slowing western economies is being taken up by the expansion of Chinese industrialisation and commerce. The IMF has reckoned that around half of the world's economic growth this year will be provided by the BRICs (Brazil, Russia, India and China); India is now adding more growth to the world economy than the United States, Japan and the EU combined, while China rises above all nations. Indeed, without China, the world economy would probably be in recession by now.

China's demand for oil, copper, zinc, nickel and all other basic resources is forcing their prices ever upward, and the International Energy Agency (IEA) has predicted that the thirst for oil of China and India will quadruple by 2030. Whether or not this really happens, given the likely lack of available crude oil well before then, it seems there will be a "crunch" in supply by 2015. Significantly, 2015 is the upper limit predicted by the Norwegian company Statoil for the emergence of the "peak" in oil production, which they forecast could hit as soon as 2010. More likely the peak is already with us, as some analysts think, and the output of oil is being maintained artificially by enhanced recovery methods; hence, beyond the peak, oil supplies will drain rapidly, and it is not obvious to me how any increase in oil production is then possible, let alone a quadrupling in consumption by the Chinese or Indian industrial leviathans.

Meanwhile, China has asked for a 30% increase in its imports of crude oil from Saudi Arabia, and plans also to increase its imports of oil from Iran, having built two new oil refineries to increase the nation's refining capacity. It is expected that imports of Saudi crude will increase from 460,000 barrels per day (bpd) to 600,000 bpd next year. The two new refineries can handle 240,000 bpd (Fujian, on the south-east coast) and 200,000 bpd (in Shandong province), and both are scheduled for completion next year. Apparently, China is unperturbed by the US sabre-rattling over the Iranian uranium enrichment programme, and wants to increase oil imports from that country above the 17% rise recorded during the first nine months of 2007.

When such information about the burgeoning Chinese economy is quoted, it is usually done in a spirit of culpability toward that nation. However, it is the West that drives the growth, by buying manufactured goods from China far more cheaply than we could make them ourselves. Western culture is the counter-trade of this imported booty, in terms of quite understandable aspirations toward a "western lifestyle", which in reality even the West can no longer afford to maintain, or not for much longer, against the backdrop of rising oil prices. Hence, outlets of McDonalds, Starbucks and Kentucky Fried Chicken (KFC) have appeared in the nouveau riche Chinese east. The traditional rice diet is being superseded by a meat-rich one, and imports of pork, beef and milk, which used to be in short supply in China, are soaring.

In an effort to assuage memories of an austere socialist past, with mass starvation at times, China is now a net importer of food. As I wrote in "Can we Feed the World?", even if all of us (in the East and the West) adopted a pre-Green Revolution diet - largely vegetarian, since meat production is far more intensive in terms of the amount of land required per person - only about 3 billion people might be maintained as a total global population, or less than half the current number of 6.5 billion, in the absence of synthetic pesticides and artificially manufactured fertilizers.

To call this scale of events economic "growth" is illusory, since it reflects the plundering of the earth's resources under the false premise that we can continue to consume more and more, essentially forever and without limit, in terms of oil, food and energy. The reality is an artificially enhanced population with relentlessly voracious tastes, and ultimately a greater die-off in its numbers if world resources, including its population too, are not managed toward a sustainable balance between what might be supplied and what might reasonably be demanded from the capacity of the earth. The scales have swung heavily toward demand, and our chances of pulling out of the nose-dive we are in diminish steadily in this protracted state of addicted denial. We need to trammel-in "growth" according to a definite worldwide plan. The governments of the world must decide how to sustain their populations, and the available natural wealth must be apportioned to do so.

Related Reading.
(1) "How China is Eating the World," By Sean O'Grady, The Independent, November 9th, 2007.
(2) "China seeks 30% increase in Saudi oil imports," Reuters, Friday November 9 2007.

Tuesday, November 13, 2007

Biohydrogen Production by Electrical Stimulation.

I had given up on the idea that producing hydrogen by fermentation to run all the world's transport is at all feasible, and I remain to be convinced that it is, as I stated in an earlier posting (Feb. 26th, 2006), "Biohydrogen from Sugar - a Preposterous Idea," on the basis that "We would need an area of land more than twice the size of the U.K. to grow enough crops to replace our current demand for liquid petroleum fuels by bio-hydrogen, and hence the concept is utterly preposterous." The other problem is that filling the huge volume of reactors needed for the fermentation would require 150 cubic kilometers of fresh water, which is more than the total volume available for every man, woman and child in the UK.

However, I was sent an early press-release of a paper which reports greatly enhanced production of hydrogen, in both yield and rate, achieved by immobilising hydrogen-producing bacteria on the surface of an electrode and stimulating the proton- and electron-generating activity of these "exoelectrogenic" bacteria with a small applied voltage. The cell is described in [1], and the performance is impressive in comparison with simply having hydrogen-producing bacteria swirling around in a stirred fermentation vessel. Naturally, there are additional resource demands incurred by this more sophisticated technology, which employs a cathode made of carbon cloth onto which a platinum catalyst is supported. The anode chamber was filled with graphite granules, with a graphite rod inserted into the granules.

Bacteria from a soil or waste-water source were inoculated and enriched on a specific substrate using a phosphate buffer and nutrient medium. High yields of hydrogen were obtained from glucose and also from its commonly encountered fermentation products, e.g. acetic acid, butyric acid, lactic acid, propionic acid and valeric acid, meaning that by a change in applied voltage it might be possible to produce hydrogen from these too, rendering the overall fermentation process more efficient in respect of hydrogen production.

By looking at some rough numbers, it is possible to gauge the likelihood of the technology being adopted on the large scale, in order to match the amount of oil we currently get through in terms of fuel.

The reactor volume is given as 14 mls (anode chamber) plus 28 mls (cathode chamber), making a total of 42 mls, and 1.1 m^3 of H2 is obtained per m^3 of reactor volume per day. The cathode has an area of 1 cm^2 and is made of carbon cloth on which 0.5 mg of Pt has been deposited.

To match 60 million tonnes of oil, we need about 6 x 10^9 kg of H2 (6 million tonnes). 1 kg of H2 is 500 moles, and occupies a volume of:

500 mol x 24.5 litres/mol / 1000 litres/m^3 = 12.25 m^3.

Hence, 6 x 10^9 kg of H2 has a volume of 12.25 m^3/kg x 6 x 10^9 kg = 7.35 x 10^10 m^3.

The reactor produces 1.1 m^3 of H2 per m^3 per day x 365 days/year = 401.5 m^3 H2 per m^3 of reactor per year. Therefore we need:

7.35 x 10^10 m^3/year / 401.5 m^3 H2/(m^3 reactor)/year = 1.83 x 10^8 m^3 of reactor volume.

[This is a huge improvement over the 1.5 x 10^11 m^3 for a "free" fermentation process, and implies a factor of around 800 less water required. In some of the fermentations water is also a reactant, but even so, we need much less than 1% of the comparable quantity of water to run it].
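The arithmetic above can be cross-checked in a few lines of Python (all figures are those quoted in the text, including the per-volume hydrogen yield; nothing new is assumed):

```python
# Reactor volume needed to supply the UK's transport with hydrogen,
# using the per-volume yield quoted for the electrically assisted cell.
H2_KG = 6e9                    # 6 million tonnes of H2 (to match 60 Mt of oil)
MOL_PER_KG = 500               # 1 kg of H2 = 500 mol
L_PER_MOL = 24.5               # molar gas volume at around 25 degrees C

m3_per_kg = MOL_PER_KG * L_PER_MOL / 1000     # 12.25 m^3 of H2 per kg
total_h2_m3 = m3_per_kg * H2_KG               # 7.35e10 m^3 of H2 per year

yield_per_m3_year = 1.1 * 365                 # 401.5 m^3 H2 per m^3 reactor per year
reactor_m3 = total_h2_m3 / yield_per_m3_year

print(f"reactor volume: {reactor_m3:.2e} m^3")   # ~1.83e+08 m^3
```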

How much platinum is required? The loading is 0.5 mg (on 1 cm^2 of cathode) per 42 mls of reactor cell volume in total.

1.83 x 10^8 m^3 / 42 x 10^-6 m^3 x 0.5 mg = 2.18 x 10^9 g = 2180 tonnes of Pt. This is equal to about 14 years of the world output of new platinum, and that is just to meet the UK's needs, let alone the rest of the world's! Thus we have hit the first resource bottleneck.

We would also need 50g Pt/fuel cell x 33 million cars on UK roads = 1650 tonnes of new Pt for fuel cells in which to "burn" the hydrogen, making 3830 tonnes of Pt required in total, or 25 years worth of the world output of the metal.
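The platinum tally runs as follows in Python (again, only the figures already given above: 0.5 mg per 42 ml cell, 50 g per fuel cell, 33 million UK cars, and roughly 150 tonnes of new Pt mined per year):

```python
# Platinum requirement: 0.5 mg on a 1 cm^2 cathode per 42 ml cell,
# scaled to 1.83e8 m^3 of reactor, plus 50 g per vehicle fuel cell.
REACTOR_M3 = 1.83e8
CELL_M3 = 42e-6                 # 42 ml per cell
PT_PER_CELL_G = 0.5e-3          # 0.5 mg of Pt per cell

pt_reactors_t = REACTOR_M3 / CELL_M3 * PT_PER_CELL_G / 1e6   # grams -> tonnes
pt_fuel_cells_t = 50.0 * 33e6 / 1e6                          # 33 million UK cars
pt_total_t = pt_reactors_t + pt_fuel_cells_t

years_of_output = pt_total_t / 150.0    # at ~150 t of new Pt per year
print(round(pt_reactors_t), round(pt_total_t))   # ~2180 t and ~3830 t
```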

How much land would be needed to grow the sugar crop? Let's assume that the technology can be adapted to extract 100% of the hydrogen contained in a sugar, C6H12O6 (including via the acids etc. that it produces in a first fermentation), which is pretty optimistic:

C6H12O6 (MW = 180; 12 g of hydrogen per mole) ---> 6H2 + carbon oxides.

So, we need 180/12 x 6 x 10^9 kg H2 = 9 x 10^10 kg = 9 x 10^7 tonnes of glucose.

If we assume a yield of 19 tonnes of "sugar" per hectare, and an efficiency of 80% to extract the hydrogen, we need:

100/80 x 9 x 10^7/19 = 5.921 x 10^6 ha of arable land = 59,210 km^2, which is 91% of the 65,000 km^2 there is altogether. So, we couldn't grow any other crops for food, and while it represents a considerable improvement over unassisted fermentation of sugar into hydrogen, it is still impractical on the grand scale of our transportation requirement.
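The land estimate checks out as follows (the 6 mol H2 per mol glucose stoichiometry, the 19 t/ha yield and the 80% extraction efficiency are all as assumed above):

```python
# Arable land needed to grow the glucose, at 6 mol H2 per mol of sugar.
H2_KG = 6e9
glucose_t = (180.0 / 12.0) * H2_KG / 1000    # 9e7 tonnes of glucose
land_ha = glucose_t / 19.0 / 0.80            # 19 t/ha yield, 80% extraction
land_km2 = land_ha / 100                     # 100 ha per km^2

print(round(land_km2))                # ~59,210 km^2
print(round(land_km2 / 65000, 2))    # fraction of UK arable land, ~0.91
```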

How much generating capacity would be needed to run the system, by applying a voltage to the anodes?
The average power demand is 300 mW per m^2 of electrode surface.

1 cm^2 corresponds to 42 mls of reactor volume, and the total reactor volume is 1.83 x 10^8 m^3.

Hence the total electrode area is: 1.83 x 10^8/42 x 10^-6 x 1 cm^2 = 4.36 x 10^12 cm^2, and since 1 m^2 = 10^4 cm^2, this amounts to 4.36 x 10^8 m^2.

Thus, the power needed is: 4.36 x 10^8 m^2 x 300 x 10^-3 W/m^2 = 1.31 x 10^8 W = 131 MW, which is not too bad, about 13% of the output of a typical power plant.
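The power figure follows directly (1 cm^2 of electrode per 42 ml cell, at 300 mW/m^2, as stated above):

```python
# Electrical power to drive the cells: 300 mW per m^2 of electrode,
# with 1 cm^2 of electrode per 42 ml of reactor volume.
REACTOR_M3 = 1.83e8
cells = REACTOR_M3 / 42e-6          # number of 42 ml cells
area_m2 = cells * 1e-4              # 1 cm^2 = 1e-4 m^2 of electrode each
power_w = area_m2 * 0.300           # 300 mW/m^2

print(f"{power_w/1e6:.0f} MW")      # ~131 MW
```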

How much graphite is needed?
The anode chamber has a volume of 14 mls. If we assume spherical graphite granules of diameter 4.54 mm, the volume of each is:

4/3 x pi x (4.54/2 x 10^-3 m)^3 = 4.9 x 10^-8 m^3. To find the overall volume they occupy, it is helpful to imagine each one occupying a cube of side 4.54 x 10^-3 m (4.54 mm), for which the volume is:

(4.54 x 10^-3 m)^3 = 9.36 x 10^-8 m^3. The total anode volume is (14/42) x 1.83 x 10^8 m^3 = 6.1 x 10^7 m^3, of which, (4.90 x 10^-8/9.36 x 10^-8) x 6.1 x 10^7 m^3 = 3.19 x 10^7 m^3 is graphite. There is a graphite electrode inserted too, which occupies some of the internal space of the cell, but assuming the volume just determined and a density of graphite of 2.25 tonnes/m^3, this amounts to:

3.19 x 10^7 m^3 x 2.25 tonnes/m^3 = 7.2 x 10^7 tonnes or 72 million tonnes of graphite.
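Note that the sphere-in-a-cube picture above amounts to a packing fraction of pi/6, which the following sketch uses directly (same granule diameter, anode fraction and graphite density as in the text):

```python
import math

# Graphite in the anode chambers: spheres of 4.54 mm diameter, each
# notionally occupying a cube of the same side (packing fraction pi/6).
REACTOR_M3 = 1.83e8
anode_m3 = (14.0 / 42.0) * REACTOR_M3        # anode is 14 ml of each 42 ml cell
packing = math.pi / 6.0                      # sphere volume / cube volume ~ 0.524
graphite_m3 = packing * anode_m3
graphite_t = graphite_m3 * 2.25              # density of graphite, 2.25 t/m^3

print(f"{graphite_t:.1e} t")                 # ~7.2e+07 tonnes
```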

As a means to replace oil for transportation, the technology could not be scaled-up sufficiently for the task, certainly not to fuel the entire world's transport. The above figures refer only to the UK, and should be multiplied by around 20 to meet the needs of the ca 600 million road vehicles there are reckoned to be altogether. This would mean that 3830 tonnes x 600 million/33 million vehicles = 69,636 tonnes of Pt would be required, and yet the metal is recovered at a rate of about 150 tonnes per year, implying it would take 464 years to install the lot, using electrohydrogenesis with fuel cells. This quantity is actually close to the reckoned world reserve of Pt, and so all of it would need to be turned over for this purpose, with none left for jewellery, scientific apparatus or the catalytic converters that keep internal combustion engine vehicles running "clean" while they were phased out by the new "hydrogen" technology.
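The world scale-up is a one-line extrapolation of the UK figure:

```python
# Scaling the UK platinum figure to the world's ~600 million road vehicles.
uk_pt_t = 3830.0
world_pt_t = uk_pt_t * 600e6 / 33e6     # ~69,636 t of Pt
years = world_pt_t / 150.0              # at ~150 t of new Pt mined per year

print(round(world_pt_t), round(years))  # ~69,636 t, ~464 years
```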

It is an interesting paper, and the authors may be correct in their assertion that the technology might still prove useful for local fertilizer production, say, even if a full-scale transportation system based on hydrogen is never implemented (which it never will be). However, the scale even of this will likely be very small, for the simple facts of limited resources and the otherwise massive engineering requirements.

Related Reading.
[1] S. Cheng and B.E. Logan, "Sustainable and efficient biohydrogen production via electrohydrogenesis," PNAS, 2007, Early Edition.

Friday, November 09, 2007

Can we Feed the World?

The world population of 6.5 billion is projected to rise to perhaps 8 - 9 billion by 2050. This period corresponds to the Oil Dearth Era, the inevitable consequence of the peak in oil production ("peak oil"), which is due any time soon, if it has not already occurred. Since much of modern agriculture relies on oil, the question arises of whether we will be able to feed such a swelling population, and if so by what means, and what manner of readjustments might prove necessary to meet the task.

The term "organic farming" is a recent innovation, as opposed to its practices per se, which were those of the world's agriculture prior to the post WWII period, when chemical fertilizers were introduced to the soil. Until then, all farming was "organic" and was done without the employment of artificial "nitrogen" from ammonia or involving the routine use of synthetic pesticides. Modern "intensive farming" methods, such as we rely on in the industrialised nations, have been costed to consume 10 calories of energy in the form of fossil fuel (to provide fertilisers, pesticides and transportation fuel) for each calorie of energy that is recovered from the food itself.

Now, a strategy of localisation will inevitably reduce the contribution from transportation fuel, which is significant, but if pesticides and fertilisers are cut out too, crop yields fall appreciably, meaning that fewer people can be fed per acre or hectare of arable land. It is a truth that "organic" farming is far more intensive in terms of land, if not in terms of energy. A major driver for the development of "chemical" farming methods was the surplus of chemical materials produced with the intention of military use during the war, which it was decided could be put to benefit by turning them into agrochemicals.

I have mentioned Thomas (Robert) Malthus previously, who predicted more than 200 years ago that because population grew at a geometric rate (i.e. 2, 4, 8, 16...) while food production increased arithmetically (i.e. 1, 2, 3, 4...), the rate of reproduction would outstrip that of its sustenance, leading to mass starvation and an effective die-off scenario. This did not happen, in consequence of the "green revolution", which ironically is the opposite of the modern "green movement", since it refers to the many developments in agricultural technique that have been implemented since the 1960's, including the use of chemical additives to soil and to the produce grown on it. In consequence, world food production rose by some 250%, from greater absorption of nitrogen than occurs naturally, the growth of selected high-yielding crops like wheat and corn, and a greater mass of grain in the plants overall. For example, in 1950 an acre of land produced around 400 kg of wheat, but yields have since risen to around 2000 kg/acre in South Asia and 4000 kg/acre in Europe and the US.

The downside of this is that more irrigation must be provided, and hence an intensive infrastructure of dams and water-channels is necessary, especially to supply sufficient water during the winter period in order to grow an additional annual crop. Additionally, because more of the plant is consumed by humans, there is less residue left from it for animal feed. A mean energy intake for a human adult is reckoned at 2500 Calories (kilocalories) per day, and a balanced diet is believed to correspond to about 60% carbohydrates, 12% protein and 28% fat. It is significant that during the green revolution the world has eaten more meat, meaning that the per capita land requirement is greater than would be the case to feed vegetarians. It has been estimated that 20 people can live entirely without animal products on the same area of land required by a typical meat eater. This may be a considerable overestimate, but certainly the carrying capacity of the earth is reduced if many of its inhabitants eat much more meat than they once did.

According to one calculation [1], the amount of land required to feed a single human is about one acre, following a mainly agrarian lifestyle, i.e. on the basis of pre-green revolution farming, without chemical enhancers. Since the total land area of the earth is about 150 million square kilometers, of which 10% is suitable for growing grains, another 10% for grazing animals on and a further 20% in the form of forests where animals can be raised, it may be deduced as a simple total that the sustainable world human population is:

150 x 10^6 km^2 x 100 ha/km^2 x 40% x 2.47 acres/ha x 1 person/acre = 14.8 billion people.

However, the primary energy (food) input surely comes from the growing and grazing on a total of 20% of the planetary surface (we can't eat trees, although animals such as pigs can grub around the forest floor), suggesting a maximum sustainable population of nearer 7.4 billion, which falls well short of the 8 - 9 billion presumed by 2050 - a figure which also presumes that contemporary farming methods will continue in perpetuity. Certainly there are other species on the planet, which do not exist purely in the interests of supporting the human race, and the earth must support them too. So, would 30% of that land resource being available for humans be a reasonable estimate? That leaves us with about 2.2 billion as the carrying capacity.
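The three estimates above come from one simple formula, which can be checked in Python (one acre per person, with the usable land fraction as the only variable):

```python
# Carrying-capacity estimate: one acre per person, pre-green-revolution farming.
EARTH_KM2 = 150e6       # total land area of the earth
HA_PER_KM2 = 100
ACRES_PER_HA = 2.47

def population(usable_fraction):
    # People supported at 1 acre each on the given fraction of the land area.
    return EARTH_KM2 * HA_PER_KM2 * usable_fraction * ACRES_PER_HA

print(population(0.40) / 1e9)         # grains + grazing + forest: ~14.8 billion
print(population(0.20) / 1e9)         # grains + grazing only: ~7.4 billion
print(0.30 * population(0.20) / 1e9)  # leaving 70% for other species: ~2.2 billion
```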

I am depressed. Either we will need to maintain the basic "forced methods" for crops by some means other than oil (and gas), to keep the present level of agriculture going (how?? coal??), or there will be a die-off in the world population, presumably through famine and wars over declining resources. Probably we will need to provide more of our diet directly from crops, rather than processing it through animals first, but even then, that only saves us perhaps a quarter-acre (from the per capita one acre), meaning the planet might support a maximum 3 billion, or less than half the present number. However, can we thus provide sufficient daily calories to fuel a population living far less sedentary lives, by grains etc. alone? There are just too many of us.

Related Reading.
[1] "The World's Expected carrying capacity in a Post Industrial Agrarian Society."
[2] "Human Appropriation of the World's Food Supply."

Wednesday, November 07, 2007

Oil or Liquids?

Two camps stare at one another across the dividing gulf of oil supply. In one are the "peak-oilers" while the other contains the "cornucopians" (otherwise known as peak-oil deniers). The latter group argue that the Hubbert Peak analysis is invalid because supplies of oil, when they enter their inevitable phase of depletion, will be substituted by other sources, often referred to as unconventional oil or "liquids". One category of such "oil" includes "condensates" and "Natural Gas Plant Liquids (NGPL)". Condensates are very pure mixtures of straight-chain hydrocarbons in the range C2 - C12 (containing molecules with between 2 and 12 carbon atoms), cyclohexane (and other naphthenes) and various aromatic compounds (e.g. benzene, toluene and xylenes). NGPL are mostly ethane, propane, butane, isobutane and some C5 and higher homologue hydrocarbons.

Many gas-wells are rich in NGPL. "Condensate wells" are gas-wells that are rich in hydrocarbons of the kind referred to vide supra. When they are first struck, oil-wells expel oil under the natural pressure of gas that they also contain, but the pressure drops as they are exploited and ultimately artificial pressure (e.g. from compressed CO2) must be applied, or pressurised water, among the range of enhanced recovery methods that are employed. There are also "dry-wells" which produce principally methane, but the relative composition of gas and liquid in a well varies enormously according to the local geology and origin of the hydrocarbon resource, overall.

To the tally of unconventional oil is then added oil (tar) sands, such as exist in massive quantity in Alberta, Canada; bitumens ("extra-heavy oil"), for example in the Orinoco belt in Venezuela; and oil-shale, as found for example as a large resource in Colorado. To make up the grand total of 3.7 trillion tonnes, as it has been proposed there is, oil from gas-to-liquids (GTL) and coal-to-liquids (CTL) processes is then costed-in. GTL is a useful means to produce high quality (clean) diesel oil from natural gas, by conversion to syngas and processing via Fischer-Tropsch (FT) methods into hydrocarbons. In the two CTL methods, coal can be converted (indirectly) to syngas and thence to hydrocarbons using FT, or it can be hydrogenated (directly) to diesel fuel, based on the Bergius process, in which coal powder dispersed in a heavy hydrocarbon oil is reacted with hydrogen under pressure.

Deep offshore oil, such as that under the Gulf of Mexico, which can only be got by drilling through thousands of feet of water before the underlying rock is drilled, again through thousands of feet, is also accounted for under the heading of unconventional oil, as is true of the potential oil under the Antarctic and Arctic polar regions.

This last May, the US Department of Energy began to speak of "liquids" rather than "oil" when making projections of exactly how much will be available in the future, which looks like an ushering-in of the Oil Dearth Era. They predict a more than fourfold increase in the production of unconventional oil in the US, from 2.4 million barrels a day to a daily 10.5 million barrels in 2030. This may be taken by cornucopians as a rallying-cry, in confirmation that peak oil is not important, in the sense that falling supplies of conventional crude oil will be more than matched by unconventional sources.

However, it is not a mere matter of how much "oil" there is in the ground (in some form or another) but how easy it is to get at, and frankly none of it can be obtained as readily as crude oil can. Bioethanol (corn ethanol) is a separate and much vexed issue, but most vexations rotate around an axis of costing-in other sources of energy used by the necessary agriculture and processing and that there must come the time eventually when growing crops for vehicle fuel conflicts with growing them for food to fuel humans and animals.

Making oil from tar sands is highly intensive in terms of other resources such as natural gas and indeed water. It has been proposed to build two nuclear reactors in Alberta to provide the energy for steam with which to drive the sticky bitumen out of the "sandy" mineral and to crack it into a suitable fuel. Then there are numerous issues surrounding pollution of the environment, and so it is not a happy solution on either count.

According to geological surveys, there are some 2.1 trillion barrels of oil (around twice what is believed to be left worldwide, in the form of recoverable conventional crude oil, if we believe the Saudi estimates of their reserves) present in shale rock in the US, but once again, extracting it is highly energy intensive, and bad for the environment too, since it will be necessary to strip-mine a huge area of wilderness to obtain the rock, which then needs to be heated to around 500 degrees C to get the oil out; then the resulting mountain-sized detritus of waste material, rubble and so on will need to be dumped somewhere.

Conventional oil is almost at $100 a barrel, which makes many of these approaches to unconventional oil appear attractive on economic grounds. It has been pointed out, however, that the threshold of viability of these alternative technologies always seems to sit about $10 above the current crude price: a few years ago it was quoted at $25 a barrel and now at $75, both in fact well below the latest $96 barrel. Hence on economic grounds, it would appear that anything goes! However, the EROEI (energy returned on energy invested) will ultimately decide whether a given source is "economic" or not, and the answer is clearly "not" when it takes more energy and other resources to extract oil from a given source than can be recovered from that oil when it is burned.
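The point about EROEI can be put numerically. As a rough sketch (the EROEI values below are illustrative, not measured figures for any particular resource), the fraction of gross energy output left over as net energy collapses as EROEI falls towards 1, whatever the price of oil happens to be:

```python
# Illustrative EROEI (energy returned on energy invested) arithmetic.
# The values iterated over are hypothetical, chosen only to show how
# net energy vanishes as EROEI approaches 1.
def net_energy_fraction(eroei):
    """Fraction of gross energy output left after repaying the energy invested."""
    return 1.0 - 1.0 / eroei

for eroei in (20.0, 5.0, 2.0, 1.1, 1.0):
    print(f"EROEI {eroei:4.1f} -> net energy fraction {net_energy_fraction(eroei):.2f}")
```

At an EROEI of exactly 1 the source yields no net energy at all, and below 1 it is an energy sink regardless of its market price.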

We should not be fooled by estimates of how much "oil" there is in the form of "liquids", the supply of which must inevitably fall. Our best option is to look toward means for reducing the amount of oil that we use, almost certainly by curbing the need for transport via a relocalisation of society, and other more efficient living strategies, rather than waging war on other nations or on the environment in an ultimately vain effort to preserve the status quo of excess.

Related Reading.
(1) "It's no longer 'oil', it's 'liquids'," by Jerome A. Paris, "The Oil Drum" blog, posted October 30th, 2007.

Monday, November 05, 2007

The Methanol Economy?

The term "Hydrogen Economy" is familiar by now, but it carries numerous attendant difficulties which may not be overcome, or not in time to circumvent the energy-crash caused by cheap oil running short, signalled by the massive and inexorable hike in oil prices now well underway. Notwithstanding the economic minefield the "Oil Dearth Era" will lay, there are intrinsic technical problems in producing and handling hydrogen per se, if it is to be used on a scale of substitution equivalent to that for oil.

I have written on this subject in previous postings at some length, but the following points are salient. Hydrogen is not a basic fuel as are oil, gas and coal; it must be produced artificially by liberating it from the other elements, such as carbon and oxygen, with which it is normally combined in nature in the form of methane (natural gas) and water. These are all energy intensive processes, and almost all of them require fossil fuels or nuclear power to drive them. Most of the world's current 50 million tonnes or so of hydrogen, produced annually to make fertilizers and to crack hydrocarbons, comes from "synthesis gas", a mixture of CO and H2 formed by reacting fossil fuels with steam in a process called "reforming", so both the chemical feedstock and the heat depend upon them; hardly a "green" process, since CO2 is emitted both from combustion and by chemically stripping the carbon component.

The ideal would be to make clean hydrogen by the electrolysis of water using renewable electricity (wind, wave, solar, hydro), but we need to go a very long way before that can be done on a large scale, although some think that enough new nuclear power might be installed to make the necessary electricity. I am skeptical that this can be implemented quickly enough, if at all, in the vast dimension that is demanded.

Even if we can make enough hydrogen, there is the issue of how to store, handle and distribute it. In comparison with liquid hydrocarbon fuels, gaseous hydrogen at normal pressure is highly voluminous, and hence it is necessary to handle it either as an extremely volatile liquid (with a boiling point of -253 degrees C, only 20 degrees above absolute zero), or under high pressures. Either arrangement would require special technology to keep it safe over time and to prevent leaks, since hydrogen forms highly explosive mixtures with air over a wide range of concentrations, and there would in any case need to be built a completely new infrastructure for generation, handling and distribution, once again within 10 years or so, and we haven't started yet.

For onboard storage of hydrogen as a vehicle fuel, a considerable proportion of the energy actually contained in the hydrogen would be needed to liquefy it (30 - 40%) or to pressurise it (20%) into a "fuel tank". A hydrogen/tank weight ratio of 6.5% has been proposed, below which the hydrogen strategy is inviable, and there are numerous suggestions of porous solids, e.g. zeolites, into which hydrogen might be packed to occupy a smaller volume, in some cases allowing an energy density close to that of liquid hydrogen but at significantly higher temperatures than -253 degrees C. Nonetheless, cryogenic cooling is still required. As an alternative, it has been postulated that the hydrogen might be stored chemically in the form of methanol. Indeed, one litre of liquid hydrogen contains 70.8 g of hydrogen at -253 degrees C, while one litre of liquid methanol contains 98.8 g of hydrogen, and that at room temperature.
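The quoted hydrogen content of methanol is easy to verify from standard handbook values (molar masses and a room-temperature methanol density of about 0.79 g/mL):

```python
# Rough check of the hydrogen content of liquid methanol quoted in the
# text (98.8 g of H per litre). Molar masses and the room-temperature
# density of methanol are standard handbook values.
M_C, M_H, M_O = 12.011, 1.008, 15.999
M_CH3OH = M_C + 4 * M_H + M_O          # ~32.04 g/mol
density_g_per_L = 792.0                # methanol at ~20 degrees C

h_mass_fraction = 4 * M_H / M_CH3OH    # four H atoms per molecule
g_H_per_litre = density_g_per_L * h_mass_fraction
print(f"{g_H_per_litre:.1f} g of hydrogen per litre of methanol")
```

The result comes out at just under 100 g per litre, in good agreement with the 98.8 g figure (which depends on the precise density and temperature assumed).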

The "methanol economy" could achieve holy-grail status as a CO2-emission remediation strategy, with captured CO2 providing the carbon component of CH3OH, thus both preventing it from being released into the atmosphere and providing a vital source of fuel. Carbon-capture from atmospheric air on any scale of real significance remains the stuff of the future, but capturing CO2 from power stations is feasible, and it could then be reacted with H2:

CO2 + 3H2 ---> CH3OH + H2O.

We are still left with the problem of making hydrogen on a vast scale and the infrastructure to do so does not exist at all. It is possible that rather than using preformed H2, it might be produced in situ, in the form of electrons and protons, by electrolysing CO2 in aqueous (water) media, so overall the effect is equivalent:

CO2 + 6H+ + 6e- ---> CH3OH + H2O.

However, the latter is difficult, since the reduction of (electron addition to) CO2 at the cathode (negative electrode) occurs in competition with electron addition to protons (H+), producing hydrogen atoms and hence H2 at the expense of CH3OH formation. Nor is CH3OH the only organic product of CO2 reduction (whether by electrons or by H2); formic acid (HCO2H) and formaldehyde (H2CO) are also formed, although George Olah and his team at the Loker Hydrocarbon Research Institute at USC (University of Southern California) have patented a means to convert the latter two to methanol, in an overall reaction where HCOOH provides "hydrogen" to reduce H2CO:

HCOOH + H2CO ---> CH3OH + CO2.
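As a quick sanity check, the two overall (non-electrochemical) reactions above can be confirmed to be atom-balanced with a few lines of code:

```python
# Atom-balance check on the overall reactions in the text, using simple
# formula dictionaries (no external libraries needed).
CO2   = {"C": 1, "O": 2}
H2    = {"H": 2}
CH3OH = {"C": 1, "H": 4, "O": 1}
H2O   = {"H": 2, "O": 1}
HCOOH = {"C": 1, "H": 2, "O": 2}
H2CO  = {"C": 1, "H": 2, "O": 1}

def atom_totals(side):
    """Sum atom counts over a list of (formula, stoichiometric factor) pairs."""
    totals = {}
    for formula, n in side:
        for atom, count in formula.items():
            totals[atom] = totals.get(atom, 0) + n * count
    return totals

def balanced(lhs, rhs):
    return atom_totals(lhs) == atom_totals(rhs)

# CO2 + 3H2 -> CH3OH + H2O
print(balanced([(CO2, 1), (H2, 3)], [(CH3OH, 1), (H2O, 1)]))      # True
# HCOOH + H2CO -> CH3OH + CO2
print(balanced([(HCOOH, 1), (H2CO, 1)], [(CH3OH, 1), (CO2, 1)]))  # True
```

Both print True: carbon, hydrogen and oxygen are conserved on each side, as written.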

It is thought that the methanol would ultimately be "burned" directly in direct-methanol fuel cells, but these currently depend on scarce supplies of precious metals such as platinum, as indeed do hydrogen fuel cells, which appears to be a drawback of the technology. However, methanol can be converted to mixtures of hydrocarbons by reacting it over zeolite catalysts, either for making fuel (the methanol-to-gasoline (MTG) process, invented by Mobil in the 1970s) or as a feedstock, e.g. for making plastics (the methanol-to-olefins (MTO) process). In principle, many organic chemicals, including pharmaceuticals, might be made from methanol.

Most methanol is currently produced from natural gas (as is hydrogen), and so feeding the methanol economy by this means would impose further demands on a reserve that is, after all, finite, as is oil; hence using CO2 as the carbon source appears perfect. Much of the current state of play in the field is heavily guarded by patents, and so I have not been able to tie down the best efficiency so far achieved for CO2 reduction, nor do I know whether it is more efficient to do this with pre-prepared H2 or by electrochemical methods. However, my impression is that the latter are quite some way off, and the process should be seen as a means for storing H2 made independently.

According to one report, the overall energy efficiency of reducing CO2 with H2 and handling the resulting CH3OH is about 20%, and that is before the "fuel" has actually been used in some way. Therefore, while there would be considerable advantages in handling liquid methanol at room temperature rather than H2 (either as a cryogenic liquid or a highly compressed gas), in terms of energy efficiency I doubt that methanol can better hydrogen, for which a value nearer 40 - 50% might be accounted in terms of its manufacture by water electrolysis and the subsequent handling. Nor can it sidestep the fundamental requirement: a gargantuan new electricity-generating capacity would still be needed to underpin it.
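The ~20% overall figure is plausible simply because the efficiency of a multi-stage fuel chain is the product of its stage efficiencies. The stage values below are assumptions for the sake of the arithmetic, not measured data for any actual plant:

```python
# Illustrative only: the overall efficiency of a multi-step fuel chain is
# the product of the stage efficiencies. The stage values are assumed,
# purely to show how a ~20% overall figure can arise.
def chain_efficiency(stages):
    overall = 1.0
    for eff in stages:
        overall *= eff
    return overall

# e.g. electrolysis ~0.70, CO2 hydrogenation ~0.60, handling ~0.50
methanol_chain = chain_efficiency([0.70, 0.60, 0.50])
print(f"overall efficiency ~ {methanol_chain:.2f}")  # ~0.21
```

Three individually respectable stages already multiply down to about a fifth of the input energy, before the methanol is even burned.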

On safety grounds, convenience of handling, storage and distribution (for which the existing oil infrastructure could be adapted), and that methanol might be converted to the numerous products that we presently get from oil (which is becoming more expensive all the time), as well as providing a clean fuel, the strategy holds much appeal. What it is not though, is a limitless supply of synthetic "oil", since CO2-derived methanol depends on electricity from fossil fuels and uranium, and may prove no more than a means for temporarily extending the illusion that the carbon-driven Western lifestyle is sustainable, which it is not.

Related Reading.
(1) "Beyond Oil and Gas: The Methanol Economy," G.A.Olah, A.Geoppert and G.K.Surya Prakash. Wiley-VCH, 2006.
(2) "Novel CO2 Electrochemical Reduction to Methanol for H2 Storage," T.Kobayashi and H.Takahashi, Energy and Fuels, 2004, 18, 285 - 286.
(3) "Beyond Oil and Gas: The Methanol Economy," G.A.Olah, Angew. Chem. Int. Ed., 2005, 44, 2636 - 2639.
(4) "Renewable hydrogen utilisation for the production of methanol," P.Galindo Cifra and O.Badr.

Friday, November 02, 2007

Nanoparticles for Catalytic Converters.

The Mazda Motor Corporation has revealed a new class of catalytic converters which use between 70% and 90% less precious metal, such as platinum, than current devices require. Since around 40% of all platinum produced in the world goes into making catalytic converters (about the same amount as goes into jewelry), this suggests a significant reduction in the demand placed on a resource for which demand presently exceeds supply. In the new models, the metal is employed in the form of nanoparticles (i.e. with a size of perhaps 10 - 100 nanometres; for reference, 1000 nanometres is one micron, and the width of a human hair is about 70 microns, so they are tiny).

The function of the metal is to provide a surface on which chemical reactions are catalyzed, eliminating, for example, toxic exhaust emissions of NO2 (which contributes to ozone formation at ground level and to photochemical smog). In the specific case of NO2, which arises from the combination of atmospheric O2 and N2 drawn into internal combustion engines at the relatively high temperatures within them, the catalyst simply reverses the process, breaking it down to O2 and N2 again:

2NO2 --> N2 + 2O2.

How effective a catalyst is depends closely on the actual area of its surface: simply, the greater that is, the more active the catalyst is expected to be. By using the metals in the form of nanoparticles, a smaller mass of metal is required to provide the same surface area as in current converters, since the surface area of a particle scales with the square of its diameter while its mass scales with the cube, so the surface area per unit mass rises as the particles shrink. Prior efforts to implement this technology were unsuccessful because, at the temperature of the exhaust, metal particles can migrate over the surface of the supporting ceramic bead and coalesce (agglomerate) into larger particles, with naturally smaller surface areas and hence lower catalytic efficiencies. Mazda has apparently invented an unspecified means of immobilising the metal particles by embedding them at fixed positions in the ceramic surface, which obviates the problem.
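The scaling argument can be made concrete: for spherical particles, area goes as d^2 but mass as d^3, so area per unit mass varies as 1/d. A minimal sketch (the platinum density is a standard handbook value; the particle diameters are illustrative, not Mazda's actual specification):

```python
# Surface area per unit mass for spherical particles:
# area = pi*d^2, mass = rho*pi*d^3/6, so area/mass = 6/(rho*d).
RHO_PT = 21450.0  # density of platinum, kg/m^3 (handbook value)

def area_per_kg(diameter_m):
    """Surface area (m^2) per kg of metal for spheres of the given diameter."""
    return 6.0 / (RHO_PT * diameter_m)

for d in (10e-6, 100e-9, 10e-9):  # 10 microns down to 10 nm
    print(f"d = {d*1e9:8.0f} nm -> {area_per_kg(d):10.1f} m^2 per kg")
```

Shrinking the particles from 10 microns to 10 nanometres multiplies the surface area per kilogram a thousandfold, which is why far less metal can do the same catalytic work, provided the particles can be stopped from agglomerating.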

Now, the question remains of how useful this will be in the future. Oil prices have just hit $96 a barrel, and they will continue to rise. Inevitably, then, the cost of fuel, or simply its reduced availability (since the rising cost will mirror the dearth of petroleum-derived fuel, post peak oil), will begin to force cars off the road, thus cutting pollution in any case. When there is less fossil fuel to burn, carbon emissions will necessarily fall too. Can this technology be implemented quickly enough to make any real difference in comparison with the emissions-reductions that will in any case follow from the falling number of cars expected during the next 20 years, say, as we slip into the age when cheap oil has certainly gone? Or do we still believe that the number of cars will rise indefinitely into the future; and if so, fuelled by what means?

Related Reading.
"Catalytic Converters go nano," Ned Stafford, Chemistry World, November 11, 2007, p16.

Wednesday, October 31, 2007

UK Winter Electricity Shortages?

We have heard before that there may be an energy crisis in coming winters; last year the warning referred to gas supplies, and this year it is electricity. We came through last winter without incident and I hope the same will prove true again. The National Grid has given an alert to the effect that there may well be a shortfall in its generating capacity, which mirrors a hike in gas prices, and we British pay about 40% more for that commodity than our European neighbours do. Part of the problem is that a huge gas terminal at Milford Haven (South Wales) will not be completed as soon as originally thought, in consequence of industrial action by contract staff and other problems.

The Minister for Energy, Malcolm Wicks, last week conferred with providers of electricity over fears that the UK is once again headed for ramping prices (this did happen last winter, although the lights stayed on) and power cuts in some regions, as indeed happened two years ago. The National Grid has, however, reassured ministers that no actual blackouts are expected. Nonetheless, on the Grid website was a "transmission system warning" calling for another 300 MW of power to cope with the high-demand period between four in the afternoon and half past seven in the evening when, of course, people are cooking their dinners, watching tv, and putting the kettle on during the interval in their favourite soap-opera.

Indeed, the situation for electricity supply to the Grid is a little precarious because of troubles at this country's now aging nuclear reactors. I think 40% of the nation's electricity is made using gas, so the incomplete Milford Haven depot might impact on the availability of gas for this purpose. Indeed, no firm date has been set for when it will open, but it is clear that there will be no imports of liquefied natural gas there from Qatar to meet the winter's predicted demand.

There is also some doubt as to exactly how much gas will be brought in from Norway's Ormen Lange gas field in the North Sea, via the Langeled pipeline, to the Easington depot in North Yorkshire, which opened last year and has provided some gas, though it is uncertain when it will be operating at full capacity. When it is, 20% of the UK's gas requirement will be met via it, and the Milford Haven depot is expected to carry another 20% in the form of liquefied gas. Our joyous plenty from the North Sea has been had, and Britain is increasingly dependent on imports of gas from elsewhere. Hopefully there will be enough to keep the Christmas tree lights on in the coming festive season.

Related Reading.
"Rising fear of energy crisis this winter," by Terry Macalister, The Guardian.

Monday, October 29, 2007

Peak Oil - Peak Minerals.

The concept of peak oil is well known, according to the Hubbert theory which I discussed in "Hubbert Peak Oil" a couple of days ago, wherein the amount of oil extractable from the ground is finite and accordingly its production is expected to peak at the point where about half the resource has been used up. All resources are finite and will ultimately be extracted only to the limit where it is feasible to do so, whereupon either financial or energy costs dictate that to proceed further only yields diminishing returns. The Hubbert theory was originally applied to oil, but there are potential and similar fits to gas and coal reserves too, and a recent analysis has applied the approach to a study of 57 different minerals, as reported by Ugo Bardi and Marco Pagani in a guest posting on the blog "The Oil Drum", which covers many aspects of Peak Oil and related matters.

These authors have fitted both logistic and Gaussian functions to mineral production data from the United States Geological Survey (USGS), and it is interesting that for mercury, lead, cadmium and selenium, there is good accord between the "ultimate recoverable resources" (URR) determined from the curve-fitting and those given by the USGS tables plus the amount of each already extracted. For tellurium, phosphorus, thallium, zircon(ium) and rhenium, the agreement is quite close but tends to smaller values than are indicated by the figures for cumulative production plus the USGS reserves. For gallium, the figure obtained from the fitting analysis is significantly lower than the USGS estimate (by about a factor of seven).
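The kind of fitting described can be sketched in a few lines. To be clear, the function, synthetic data and starting guesses below are invented purely for illustration; they are not the authors' actual data or procedure:

```python
# Minimal sketch of fitting a logistic (cumulative) production curve to
# yearly data and reading off the URR and peak year. The "data" here are
# synthetic, generated from a known logistic plus noise, purely to
# demonstrate the method.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, urr, a, b):
    """Cumulative production: Q(t) = URR / (1 + a*exp(-b*t))."""
    return urr / (1.0 + a * np.exp(-b * t))

t = np.arange(0, 100, dtype=float)            # years since extraction began
q_true = logistic(t, 2000.0, 200.0, 0.1)      # hypothetical cumulative output
q_obs = q_true + np.random.default_rng(0).normal(0.0, 5.0, t.size)

popt, _ = curve_fit(logistic, t, q_obs, p0=[1500.0, 100.0, 0.05])
urr, a, b = popt
print(f"fitted URR ~ {urr:.0f}, peak year ~ {np.log(a) / b:.1f}")
```

The fitted URR can then be compared against tabulated reserves plus cumulative production, which is essentially the consistency check the authors report.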

Evidence of peaking is found for a number of the minerals, e.g. mercury around 1962; lead in 1986; Zircon in 1990; selenium in 1994; gallium in 2000. The results for gallium are significant, both in that the peak occurred seven years ago and in the size of its total reserve, which when compared with the amount used worldwide by the electronics industry implies that we may run short of gallium any time soon. Tellurium and selenium are two other minerals that underpin the semiconductor industry and it appears that their fall in production may also impact negatively on future technologies that are entirely reliant upon them, since there are no obvious substitute materials with precisely equivalent properties.

For vanadium, although a production peak is indicated in 2005, the data in the "mineral commodities handbook" show a later and sudden surge in production, which is not fully explained but may relate to uncertainties in reporting from countries such as China. So there may be a real and ongoing upsurge in production from, e.g., the Chinese economy, which is said to be "out of sync" with the rest of the world, such is its massive expansion, or it might be a red herring.

Interestingly, copper, zinc, tin, nickel and platinum show an almost exponential increase in production; however, as I have noted previously, the stocks of some metals may be insufficient to supply the technological demands of the modern developed world into the far (or even near) future. There is also the issue of how quickly a rare and difficult-to-extract metal such as platinum can be produced in comparison with the overall demand for it. Copper production up to 2006 can be fitted with an exponential function, while a logistic function provides about the same quality of fit yet indicates a peak in about 2040. The latter fit gives a URR of 2 Gigatons, reasonably close to the USGS estimated copper reserves of 0.5 - 1.0 Gigatons. Notably, the world price of copper has skyrocketed during the past few years, which is again attributed to demand in China, as was the cost and shortage of wood earlier in the year.

Bardi and Pagani note that all of the above analyses rest upon the notion that the determined "peaks" represent actual global production maxima. Indeed, more reserves of all minerals may yet be found if we look assiduously enough for them; but herein lies the issue of underpinning costs, both financial and energetic. It is the latter that may determine the real peaking and decline of minerals, extending beyond the simple costs, say, of mining and refining a metal from its crude ore. There is also the cost-contribution from the energy needed to garner energy-materials such as oil, gas, coal and uranium, and thence to turn them into power and machinery; and since fossil fuels are being relentlessly depleted, it takes an ever-increasing amount of energy to produce them, resulting in a cumulative and rising energy demand overall.

Saliently, the authors point out that the whole "extractive system" is interconnected through its required underpinning supplies of fossil fuels, and it is perhaps this that explains why the production of so many minerals seems to be peaking in the period between the latter part of the 20th century and the start of the 21st, in step with the era in which troubles in the production of fossil fuels have been experienced across the globe. Hence, it may be the lack of the latter which determines the real amount of all other minerals that can be brought onto the world markets.

Related Reading.
"Peak Minerals", By Ugo Bardi and Marco Pagani.

Sunday, October 28, 2007

World Platinum Price Soars.

The cost of platinum has risen to over $1,450 an ounce following worries over the supply of this rare metal, as two mines in South Africa were closed because of recent fatal accidents. It is thought that there will be a market deficit of 100,000 ounces of platinum this year. The biggest producer of platinum in the world is Anglo Platinum, which closed its Paardekraal shaft in Rustenburg after a worker died in an accident, and the South African Northern Platinum Ltd. also closed its mine when a worker was killed by a rockfall.

The world market for platinum closed 2006 with a small surplus of around 10,000 ounces, after being in deficit for seven years: supply just outstripped the world demand of 6.775 million ounces, or about 192 tonnes. About 42% of that is used to make jewelry, almost exactly the same amount goes into making catalytic converters, and the rest is used to make scientific apparatus. I have commented previously that the introduction of PEM (proton exchange membrane) fuel cells to run cars fuelled by hydrogen is likely to be hampered by the limited amount of platinum that could be produced for the purpose. Even if all the platinum which currently goes into cleaning the exhaust emissions from cars with oil-fuelled internal combustion engines could be skimmed-off for the PEM sector, it would be just enough for:

(0.4 x 192 tonnes/year x 1000 kg/tonne x 1000 g/kg)/50g platinum/car = 1,536,000 cars/year.
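The arithmetic above can be checked directly, keeping the text's assumptions of roughly 50 g of platinum per fuel-cell car and 40% of a 192 tonne annual supply freed up from catalytic converters:

```python
# Back-of-envelope platinum sum from the text: how many PEM fuel-cell
# cars per year could be supplied if the catalytic-converter share of
# platinum were diverted. The 50 g/car loading is the text's assumption.
annual_pt_tonnes = 192.0
converter_share = 0.40       # fraction of supply now used in converters
g_per_car = 50.0             # assumed platinum loading per fuel-cell car

available_g = converter_share * annual_pt_tonnes * 1e6   # tonnes -> grams
cars_per_year = available_g / g_per_car
print(f"{cars_per_year:,.0f} cars per year")   # 1,536,000
```

About 1.5 million cars a year, against a global fleet of some 700 million vehicles, which is the point being made.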

Compared with the total number of road vehicles, which I believe is about 700 million, this is quite a small figure. I suppose it is possible that more platinum may be found, and maybe the world could do without its jewelry in the interests of "saving the planet", but a hydrogen economy based around precious metals looks to me of limited likelihood.

Gold prices have also soared, to around $750 per ounce, the highest since 1980, when gold hit $850. Tensions in the Middle East are partly blamed, especially the decision by Turkey to send its troops into northern Iraq to hunt down Kurdish rebels, although the country's allies in the West and in Baghdad have urged it to refrain from invading Iraq. There is an issue of how much of many metals and minerals can be supplied in the future, along with oil, gas, uranium and eventually coal, and the prices of all of them will reflect how much can be brought onto the world markets, and indeed how much there is available at any price.

Related Reading.
"Supply concerns propel platinum to record highs", By Atul Prakash.

Wednesday, October 24, 2007

Hubbert Peak Oil.

In 1956 a paper was published which will be of greater significance to the future of humankind than those reporting the structure of DNA or the Theory of Relativity. Its title was "Nuclear Energy and the Fossil Fuels", and it was written and presented by M. King Hubbert at an oil-industry conference in Houston, Texas, while he was in the employ of the Shell Development Company. At first Hubbert was not taken seriously in his conclusion that the peak in oil production would follow the peak in oil discovery by about forty years, so that the best year for US oil output would be around 1965 - 1970, roughly 40 years after the most successful year of oil finds, in 1930. He was right, and since then US domestic oil production has fallen to the extent that the nation now imports two thirds of the colossal 20 million or so barrels of oil it uses each day, itself one quarter of the world's requirement.

In the days before computers, Hubbert would have drawn the graph by hand (probably with the aid of a flexi-curve, or simply freehand, as I used to find best before PCs with mathematical analysis packages such as the Origin programme, which is installed on this machine, were routinely available). The Hubbert peak is based on a logistic function, which is a restricted exponential: the first derivative of the cumulative (logistic) curve gives the production peak, and its derivative in turn (i.e. the second derivative of the logistic function) gives an inflexion whose crossing of the baseline corresponds to the peak maximum. The logistic function belongs to the familiar family of S-shaped curves that describe, for example, the growth of bacteria, and is related to saturation kinetics such as those of Michaelis and Menten.

The Hubbert curve (peak) may be defined via the logistic function:

Q(t) = Q(max)/(1 + ae^(-bt)),

where Q(max) is the total recoverable amount of crude oil in the ground to start off with, Q(t) is the cumulative production (i.e. how much oil has been pulled out of the ground by time t), and a and b are positive constants. Production, dQ/dt, is greatest when ae^(-bt) = 1, so the year of maximum production (peak oil) is given by:

t(max) = (1/b)ln(a),

and for the world altogether, with a peak discovery year of 1965, this appears as 2005. There is much speculation and analysis that oil production has already peaked, and my suggestion is that enhanced recovery methods alone have maintained the present output of oil, much of it from the giant fields of the Middle East. It is obvious that the resource is concentrated in only a few particular regions of the Earth, vide supra, and also in Russia, South America and Indonesia. Countries such as Iraq and Iran may become swing-producers, i.e. producing more oil than they use, and I have read opinions to the effect that Iraq, whether or not the war there was started in the interests of obtaining oil for the West, might become a worthy swing-producer, thus averting economic starvation at least for a few years. Iraq has about 140 billion barrels of oil, and Iran about the same, so at a level world consumption of 30 billion barrels a year we might get almost 10 years' worth of supply from there. It is significant that Western companies such as BP and ExxonMobil have been granted 30 year contracts to exploit the Iraqi oil.
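The peak condition can be verified numerically. Writing the logistic in the standard form Q(t) = Q(max)/(1 + ae^(-bt)) with positive a and b, the production curve dQ/dt peaks at t = ln(a)/b (the parameter values below are arbitrary demonstration values, not a fit to real production data):

```python
# Numerical check that production (dQ/dt) from a logistic cumulative
# curve peaks at t = ln(a)/b. Parameters are arbitrary demo values.
import math

Qmax, a, b = 2000.0, 150.0, 0.08

def Q(t):
    """Cumulative production at time t (logistic)."""
    return Qmax / (1.0 + a * math.exp(-b * t))

def production(t, dt=1e-4):
    """Central-difference estimate of dQ/dt."""
    return (Q(t + dt) - Q(t - dt)) / (2.0 * dt)

analytic_peak = math.log(a) / b
numeric_peak = max(range(0, 200), key=production)  # argmax over whole years
print(f"analytic peak: {analytic_peak:.1f}, numeric peak: year {numeric_peak}")
```

The year of maximum production found by brute-force search agrees with the analytic ln(a)/b to within a year, which is all the precision the original graph-paper method could claim anyway.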

Not everybody agrees with the Hubbert analysis, and some argue that we will be able to access around four times as much oil as is present under the Earth in the form of crude oil, by which they mean the Canadian tar-sands, oil shale, oil made from coal or gas, biomass and so on. However, this does Hubbert a considerable disservice, because he was talking explicitly about cheap oil, and it is this that will inexorably run out, most likely during the next 5 - 10 years. Hence there is no consolation to be found in any putative figure of 3.7 trillion barrels, because bringing that into reality will be extremely expensive, both financially (to take an economist's standpoint) and, more precisely, in terms of the energy and other resources, such as water, that are mandatory to do so.

We are not about to run out of oil. We will be able to produce hydrocarbons (oil) for decades to come, but not at the cheap prices we are used to. I am working on the rough assumption that everything (and I mean everything: food, clothes, and all else) will cost about twice what it does now within that 5 - 10 year period. That would correspond to a $200 barrel. This will be uncomfortable, especially for those who already bear considerable debts, particularly in the UK, which is the most indebted nation in Europe. We also drink more than anyone else, apparently, and have a greater incidence of sexually transmitted diseases, which makes me think that the era of the "stiff upper lip" has rather passed for the English. Many of these problems may well be "cured" by a huge hiking-up of general costs, in terms of booze, travel and the overused "plastic friend" - the credit card, which often proves less than amicable.

Another feature of Britain is that we have "lost" most of our manufacturing industry, and so we buy cheap imports from, e.g., China and thereby fuel the economic enterprise of that nation. Without imports to the West of washing machines, TVs and so on, the Chinese economy will grind onto the hard shoulder, and our own economy, based as it is around the "service sector", will crash too: fewer service businesses will survive if people have less cash in their pockets to buy their services, with an according loss of jobs in that industry.

The mathematics of Hubbert's theory is very interesting, but as I have pointed out before, there were only so many squares on that sheet of graph paper, in reflection of the fact that there is only so much cheap oil that can be drawn up from the Earth [i.e. Q(max) in the above equation]. Hence, no matter what values we choose for the constants a and b, or whether we use a Gaussian or Lorentzian distribution or some other mathematical device, the future of humanity will unfold, in ways that will be evident only to later history, upon a world devoid of cheap oil, and to kid ourselves otherwise is an act of addicted denial. We need to plan a society based on localised communities, less dependent on apparently limitless cheap transport and cheap products made from oil.

Related Reading.
(3) "The Hubbert Curve: Its strengths and weaknesses," by J.H.Laherrere.
(4) "Hubbert's Peak - the mathematics behind it," by Luis de Sousa.

Monday, October 22, 2007

Oil Wars!

We can only plan the future of civilization in terms of energy resources other than cheap oil. The title of this article is not a euphemism for the war in Iraq, nor any potential strife elsewhere in the Middle East in the cause of Western countries obtaining oil, but a reference to the concept that world oil production has already peaked, and hence we cannot expect civilization to depend on it as a source of energy into the future. The German-based Energy Watch Group released a new report today concluding that global oil production peaked in 2006. Furthermore, the group believes that global reserves of oil are only about two-thirds of the 1,255 billion barrels on which the oil industry finds consensus. This suggests to me that they do not believe the remarkable increase in the estimates made of the reserves under Saudi Arabia, which houses the world's major oil fields.

There are many different figures as to precisely when "peak oil" will strike, but even if it is not already with us, it will come soon. My personal opinion is that production has been artificially maintained, meaning that rather than the smooth decline in the availability of oil most simply indicated by the Hubbert peak, which roughly mirrors the rise in production over history, supply will plummet beyond our worst nightmares once present output can no longer be maintained even by enhanced recovery methods, if we dream about it at all.

I will write about the mathematics behind the Hubbert theory in subsequent postings here, but in essence Hubbert only had so many squares on the sheet of graph paper to count underneath his "curve", emphasising the simple fact that there is only so much "cheap" and relatively accessible oil in the ground. I do not dispute (and have explained its sources) that we will be able to conjure up oil for decades to come, whether by pulling it out of the ground, by cracking bitumen from Canadian or Venezuelan tar-sands, or by synthesising it from coal, gas or even algae, but the age of cheap oil is over. It would therefore be a criminal disservice to humanity to pretend otherwise. The fact of the matter is signified by a huge ramping-up of the price of oil, to almost $90 a barrel, compared with less than a quarter of that only 5 years ago, and by the instability of the world financial markets, which will now be up-and-down in perpetuity.

Yes, it is easy to blame the "sub-prime" markets and greedy and irresponsible lenders of cash to those who could never be expected to pay it back, but the real underpinning framework of financial instability is the availability and cost of that basic necessity upon which the modern industrialised world has been built - oil!

The days of cheap oil are over. The vampiric $100 barrel is already in sight; and then we can expect $150, $200, or who knows how much? Since everything in our modern global village depends on oil, we can expect the price of everything to increase markedly. It is not only the cost of fuel, and of everything that is transported over colossal distances to supermarket shelves, that the consumer must bear, but an increase in the basic costs of manufacture of everything from food to plastics, since oil is the underpinning raw feedstock from which so much is made. It really is the proverbial double-whammy.

We can therefore only make realistic plans for the future in the absence of thoughts of cheap oil. I am sceptical about what can really be provided in terms of renewable energy, at least in time to head off the dearth of oil that will hit us within a decade, and even nuclear power, to which the UK government has made a firm commitment, will be hard pressed to substitute for fast-depleting supplies of oil and gas. Jeremy Leggett, CEO of a major solar-energy company and author of "The Carbon War" and "Half Gone" (the latter a reference to the fact that, according to Hubbert Peak theory, when the peak in oil production is reached half the oil that was there in the first place has been used up), is of the opinion that both the UK government and the energy industry are in "institutionalised denial" and that action should have been taken sooner.

I have commented as much, and it is also my opinion that appropriate action should have been taken in the early 1970s, when OPEC artificially hiked up the price of oil, leading to a political "oil crisis". Now the crisis is not a matter of politics but of geology, and there is simply not enough of the stuff in the ground to be extracted at the low costs we have been used to. Furthermore, "half gone" is an optimistic delineation of the resource, the production of which is more likely to follow a skewed Hubbert curve, with a very rapid decline in supply beyond the putative peak, and a see-sawing ramp in its cost and thence in that of all goods.

Economic hardships and wars follow from this simple fact as surely as a QED, as humanity in its various artificial nation states struggles to survive. But in accepting the reality of peak oil and all it implies, let us think ahead in the absence of "cheap" oil. Our lives will be less softened by cheap energy, and we need to be aware of this now, and not fool ourselves into the false security of alternatives such as wind or wave power, or the hydrogen or methanol economies. It is too late to introduce them in time anyway, and only the proverbial paradigm shift, from thinking in terms of plentiful oil to thinking in terms of oil dearth, will preserve us from war, and indeed preserve us as a human civilization.

Related Reading.
"Steep decline in oil production brings risk of war and unrest, says study," by Ashley Seager, The Guardian, Monday October 22, 2007.