Wednesday, February 17, 2016

Peak Supermajors Introduction & 4Q2015 Result

This blog post was originally published at willmartin.com - go to willmartin.com to stay up to date on future blog posts.
http://willmartin.com/peak-supermajors-introduction-4q2015-results/

Introduction
Today I would like to introduce my “Peak Supermajors” project. The goal of this project is to answer the question “when will we reach peak oil” by studying the production and financial health of the world’s largest oil companies. Because oil is a finite resource, its daily global production will eventually reach a peak. By measuring when individual oil companies reach peak oil, I hope to bring us closer to answering the question “when will we reach peak oil?”
I am beginning my project by analyzing the largest publicly-traded companies: the “supermajors”. These five companies – BP, Chevron, ExxonMobil, Royal Dutch Shell and Total – produce nearly 20% of the world’s oil and gas. They are mostly descendants of the original “Seven Sisters,” which themselves were largely descendants of John D. Rockefeller’s Standard Oil Company. These companies are leaders in the industry, both financially and technologically. By understanding the history of these companies and their strategy for the future, we can better understand the historical arc of the broader oil industry. As I fill out the database I plan to expand it to include data from all of the largest global oil companies.

Seeing Oil on a Longer Time Horizon
One of my goals with this project is to expand the peak oil conversation to a longer time horizon. I am a member of the Long Now Foundation, an organization whose goal is to “provide a counterpoint to today’s accelerating culture and help make long-term thinking more common.” When I read news stories about the oil industry, the time horizon discussed always seems to be a financial quarter or (at most) a year. News stories talk about production changes “year on year” but never “decade on decade” or “since the company reached peak oil in 1972.” By collecting a database of multi-decade production data I hope to expand the quarterly production discussion to a longer time horizon.

Focus on Oil and Gas Production Individually 
When quarterly earnings are reported, financial news sites usually mention the change in the company’s “headline” production figure (if they mention production at all). This “headline” figure combines oil production with gas production by converting gas to “barrels of oil equivalent” based on its embedded energy content. This is worse than combining apples with oranges. Chris Martenson describes it “as if someone asked you how many calories you had stored in your pantry, and you lumped together not just your food, but also the batteries in your flashlights and other home electronics. They might have caloric energy equivalents, but you sure can’t eat them.”
Oil and gas are not used in the same way. Oil is used to produce liquid transportation fuels (gasoline, diesel, jet fuel and marine “bunker fuel”) and to make lubricants. Natural gas is burned in power plants to make electricity and is converted by chemical plants into fertilizer and plastics. Due to the differences in outputs, oil and gas operate in very different competitive environments. Oil competes with biofuels (ethanol, biodiesel) and the electrification of transportation (electric cars, high speed rail). Gas competes with renewable electricity sources (wind, solar, hydropower, etc), organic fertilizers and bioplastics.
So instead of lumping oil and gas production together, I will be discussing them separately each quarter. Hopefully this will help steer the quarterly conversation away from “headline” numbers and towards an analysis of oil and gas production individually.

Seeing Peak Oil From Multiple Angles
This database fits into a multi-faceted way of measuring peak oil. There are many ways to slice-and-dice global oil production including measuring production by country, by field, and (now) by company. Each of these measurements allows us to better measure whether we are approaching peak oil, are at peak oil or have passed peak oil. For example, using a country-level production database like the BP Statistical Review of World Energy allows us to determine which countries have passed peak oil. Steve Andrews publishes an analysis annually using the BP data to summarize all of the “Pre- and Post-Peak Nations.” Once enough countries have reached peak oil, we will pass the global peak in oil production. Using field-level data from IHS (Wood Mackenzie offers a similar field-level database) the Energy Watch Group in Germany has produced a series of reports showing current and future peaks by oil producing region. Once enough fields have reached peak oil, we will pass the global peak in oil production. The third way of looking at peak oil is at a company level. Every single barrel of oil is produced by a company, whether it be a publicly-traded company like ExxonMobil or a privately-held national organization like Saudi Aramco. Once enough oil companies have reached peak oil, we will pass the global peak in oil production. Looking at company-level data is just another window into peak oil.

Data Sources and Future Work
My ultimate goal is to have a “full history” of production data for all of the supermajors. Because these companies can trace their origins to the very beginning of the oil industry, this means collecting over 100 years of data. To achieve this monumental task I began by collecting the most recent data from the company annual reports and SEC filings currently available online. I quickly learned that this data only goes back about 15 years and started searching for additional data sources. I found additional annual reports on company websites, online databases, archive.org, eBay, Amazon, Google Books, Google Scholar and the National Iranian Oil Company’s online library. I have also spent dozens of hours at Stanford’s Media & Microtext Center scanning microfiche copies of old annual reports. To fill in some of the gaps and double-check my work I relied on Richard Heede’s extensive database at carbonmajors.com as well as Oil and Gas Journal company surveys.
I now have a private collection of over 1,000 annual and quarterly reports for some of the world’s largest oil companies. I will continue to expand the database and improve its accuracy over time. The database now stands at tens of thousands of data points, with hundreds of thousands of metadata points backing up each datum. Along with collecting 100+ years of production data I’ve also been collecting 100+ years of financial and operational data for each company. I am planning to use this data to perform long-term analysis. For example, how has capital efficiency changed over time? Have the supermajors reached “peak production per dollar of inflation-adjusted CAPEX spend?” How has employee productivity changed over the last 100 years? Have the supermajors reached “peak free cash flow per barrel of production?” Have they reached “peak production per employee?” These are just a few questions that I plan on analyzing. As I complete the analysis I am planning to submit my findings to the Oil Age journal (arguably the best source for peak oil research) for academic publication. In the interim I’m planning to publish these short updates on a quarterly basis to show the current amalgamated state of the supermajors’ oil and gas production rates.
The data accuracy is not perfect right now. I’m about 80% confident with the accuracy of the data – which I consider “good enough to blog” but not necessarily good enough to submit to an academic journal. I still have some data validation to do and I plan to complete a full statistical audit with a large enough sample size to get me to 95% confidence in the data accuracy. So for the time being take all of this with a grain of salt.
Before I begin, I would like to acknowledge the amazing work of the people who inspired me to begin this project. I was inspired by similar company-focused efforts by Richard Heede, Matt Mushalik and the researchers at the Energy Watch Group. I would also like to thank Mason Inman for helping me think through the idea.
If you are interested in keeping up-to-date on this project, please SUBSCRIBE using the link to the left.
On to the show…

Supermajors
The supermajors’ liquids production rate for Q4-2015 was 9,329,500 barrels per day. Year-over-year, production rose by 611,022 barrels per day. Overall oil production peaked at 30,554,482 barrels per day in Q1-1973. Since reaching peak oil, Supermajor liquids production rate has fallen by 69.5%. This represents a post-peak compounded annual decline rate of 2.7%. If production continued to linearly decline at this rate, it would reach zero production in 2034. The more recent peak occurred at 11,135,767 barrels per day in Q3-1999. Since reaching the second peak, Supermajor liquids production rate has fallen by 16.2%. This represents a compounded annual decline rate of 1.1%. If production continued to linearly decline at this rate, it would reach zero production in 2099.
Peak Supermajor Oil
The supermajors’ natural gas production rate for Q4-2015 was 40,598,130,435 cubic feet per day. Year-over-year, production declined by 1,338,695,652 cubic feet per day. Overall gas production peaked at 47,339,390,527 cubic feet per day in Q1-2010. Since reaching this primary peak, Supermajor natural gas production rate has fallen by 14.2%. This represents a post-peak compounded annual decline rate of 2.6%. If production continued to linearly decline at this rate, it would reach zero production in 2050.
Peak Supermajor Gas
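For anyone who wants to check these numbers, here is a minimal sketch (my own, in Python – not part of the database) of how the total decline, the compounded annual decline rate and the extrapolated zero-production year can be derived from a peak rate and a current rate. Treating each quarter as its midpoint in decimal years is my assumption.

```python
def quarter_midpoint(year, quarter):
    """Decimal year at the middle of a quarter, e.g. Q1-1973 -> 1973.125."""
    return year + (quarter - 0.5) / 4.0

def decline_stats(peak_rate, peak_q, current_rate, current_q):
    """Total decline, compounded annual decline rate, and the year at which
    the straight line from the peak through the current rate reaches zero."""
    years = quarter_midpoint(*current_q) - quarter_midpoint(*peak_q)
    ratio = current_rate / peak_rate
    total_decline = 1 - ratio                    # e.g. 0.695 -> "fallen by 69.5%"
    annual_decline = 1 - ratio ** (1 / years)    # compounded annual decline rate
    linear_loss_per_year = (peak_rate - current_rate) / years
    zero_year = quarter_midpoint(*current_q) + current_rate / linear_loss_per_year
    return total_decline, annual_decline, zero_year

# Supermajor liquids: 30,554,482 bbl/d at the Q1-1973 peak vs 9,329,500 bbl/d in Q4-2015
print(decline_stats(30_554_482, (1973, 1), 9_329_500, (2015, 4)))
# -> roughly (0.69, 0.027, 2034), matching the figures quoted above
```

The same arithmetic reproduces the gas figures and the company-level figures below to within rounding.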
ExxonMobil
ExxonMobil’s liquids production rate for Q4-2015 was 2,481,000 barrels per day. Year-over-year, production rose by 299,000 barrels per day. Overall oil production peaked at 7,010,929 barrels per day in Q1-1972. Since reaching peak oil, Exxon Mobil Corporation’s liquids production rate has fallen by 64.6%. This represents a post-peak compounded annual decline rate of 2.3%. If production continued to linearly decline at this rate, it would reach zero production in 2039. The more recent peak occurred at 2,803,460 barrels per day in Q1-2007. Since reaching the second peak, ExxonMobil’s liquids production rate has fallen by 11.5%. This represents a compounded annual decline rate of 1.4%. If production continued to linearly decline at this rate, it would reach zero production in 2083.
ExxonMobil’s natural gas production rate for Q4-2015 was 10,603,000,000 cubic feet per day. Year-over-year, production declined by 631,000,000 cubic feet per day. Overall gas production peaked at 14,652,000,000 cubic feet per day in Q4-2010. Since reaching this primary peak, Exxon Mobil Corporation’s natural gas production rate has fallen by 27.6%. This represents a post-peak compounded annual decline rate of 6.3%. If production continued to linearly decline at this rate, it would reach zero production in 2029.
Chevron
Chevron’s liquids production rate for Q4-2015 was 1,775,000 barrels per day. Year-over-year, production rose by 43,000 barrels per day. Overall oil production peaked at 10,718,904 barrels per day in Q1-1973. Since reaching peak oil, Chevron Corporation’s liquids production rate has fallen by 83.4%. This represents a post-peak compounded annual decline rate of 4.1%. If production continued to linearly decline at this rate, it would reach zero production in 2024. The more recent peak occurred at 2,273,819 barrels per day in Q4-1998. Since reaching the second peak, Chevron’s liquids production rate has fallen by 21.9%. This represents a compounded annual decline rate of 1.4%. If production continued to linearly decline at this rate, it would reach zero production in 2076.
Chevron’s natural gas production rate for Q4-2015 was 5,385,000,000 cubic feet per day. Year-over-year, production rose by 285,000,000 cubic feet per day. Overall gas production peaked at 13,472,131,148 cubic feet per day in Q1-1972. Since reaching this primary peak, Chevron Corporation’s natural gas production rate has fallen by 60.0%. This represents a post-peak compounded annual decline rate of 2.1%. If production continued to linearly decline at this rate, it would reach zero production in 2045.
 
BP
BP’s liquids production rate for Q4-2015 was 2,137,000 barrels per day. Year-over-year, production rose by 169,000 barrels per day. Overall oil production peaked at 6,362,701 barrels per day in Q1-1973. Since reaching peak oil, BP plc’s liquids production rate has fallen by 66.4%. This represents a post-peak compounded annual decline rate of 2.5%. If production continued to linearly decline at this rate, it would reach zero production in 2037. The more recent peak occurred at 3,084,248 barrels per day in Q3-1988. Since reaching the second peak, BP’s liquids production rate has fallen by 30.7%. This represents a compounded annual decline rate of 1.3%. If production continued to linearly decline at this rate, it would reach zero production in 2077.
BP’s natural gas production rate for Q4-2015 was 7,076,000,000 cubic feet per day. Year-over-year, production declined by 148,000,000 cubic feet per day. Overall gas production peaked at 10,128,700,000 cubic feet per day in Q1-2000. Since reaching this primary peak, BP plc’s natural gas production rate has fallen by 30.1%. This represents a post-peak compounded annual decline rate of 2.2%. If production continued to linearly decline at this rate, it would reach zero production in 2052.
 
Total
Total’s liquids production rate for Q4-2015 was 1,077,000 barrels per day. Year-over-year, production was unchanged. Overall oil production peaked at 1,560,000 barrels per day in Q1-2006; because that peak is so recent, it is also the “more recent peak” reported for the other companies. Since reaching peak oil, Total SA’s liquids production rate has fallen by 31.0%. This represents a post-peak compounded annual decline rate of 3.7%. If production continued to linearly decline at this rate, it would reach zero production in 2037.
Total’s natural gas production rate for Q4-2015 was 6,219,000,000 cubic feet per day. Year-over-year, production was unchanged. Overall gas production peaked at 6,312,000,000 cubic feet per day in Q1-2015. Since reaching this primary peak, Total SA’s natural gas production rate has fallen by 1.5%. This represents a post-peak compounded annual decline rate of 2.0%. If production continued to linearly decline at this rate, it would reach zero production in 2066.
 
Shell
Shell’s liquids production rate for Q4-2015 was 1,859,500 barrels per day. Year-over-year, production rose by 100,022 barrels per day. Overall oil production peaked at 5,887,671 barrels per day in Q1-1973. Since reaching peak oil, Royal Dutch Shell plc’s liquids production rate has fallen by 68.4%. This represents a post-peak compounded annual decline rate of 2.7%. If production continued to linearly decline at this rate, it would reach zero production in 2035. The more recent peak occurred at 2,584,870 barrels per day in Q3-2002. Since reaching the second peak, Shell’s liquids production rate has fallen by 28.1%. This represents a compounded annual decline rate of 2.5%. If production continued to linearly decline at this rate, it would reach zero production in 2049.
Shell’s natural gas production rate for Q4-2015 was 11,315,130,435 cubic feet per day. Year-over-year, production declined by 844,695,652 cubic feet per day. Overall gas production peaked at 13,940,791,209 cubic feet per day in Q1-2013. Since reaching this primary peak, Royal Dutch Shell plc’s natural gas production rate has fallen by 18.8%. This represents a post-peak compounded annual decline rate of 7.3%. If production continued to linearly decline at this rate, it would reach zero production in 2027.

Do Aerator Shower Heads Use More Energy?

This blog post was originally published at willmartin.com - go to willmartin.com to stay up to date on future blog posts.
http://willmartin.com/do-aerator-shower-heads-use-more-energy/

Our house had an old “high flow” shower head. As California is in the midst of an epic drought (despite all the recent rain), I was planning on installing a low-flow “aerator” shower head. These shower heads mix air into the outgoing water to lower the flow rate while theoretically not sacrificing comfort. Compared to a traditional shower head, the perceived water pressure is higher but the flow rate is lower, so they end up saving water.
But right as I was about to install one I heard a theory that these aerator shower heads actually end up using more energy because the mist causes the shower to feel colder, which means the user turns the hot water dial up, thereby using more hot water, which requires the hot water heater to burn more natural gas (or use more electricity, which burns more coal and natural gas) to reheat more water.
At first I thought this was a calculus problem (think high school calculus – water goes into a tank at one rate and leaves at another rate; how long until the tank is empty?), but then I realized it’s actually a simple algebra problem. Since the user turns up the amount of hot water in the mix, we can just assume a temperature differential and calculate the change in hot water usage.
I calculated the existing flow rate of the shower using a stopwatch and a bucket at 3.5 gallons per minute (it took 45 seconds to fill a 10 liter bucket). Other older high flow shower heads can be up to 5.5 GPM. According to the EPA, the average shower length is 8 minutes. This means my existing shower head uses 28 gallons per shower.
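As a quick sanity check, the bucket test converts to gallons per minute like this (the liters-to-gallons factor is the only number I’m adding):

```python
LITERS_PER_GALLON = 3.785

bucket_liters = 10          # size of the bucket
fill_seconds = 45           # time it took to fill
flow_gpm = (bucket_liters / LITERS_PER_GALLON) / (fill_seconds / 60)   # ~3.5 GPM

shower_minutes = 8          # EPA average shower length
print(round(flow_gpm, 1), round(flow_gpm * shower_minutes))            # ~3.5 GPM, ~28 gallons
```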
According to Bosch, the average shower temperature is 106 degrees Fahrenheit and the average groundwater temperature is 58°F. The Department of Energy recommends setting your hot water heater at 120°F. Our hot water heater was set to a scalding 140 degrees Fahrenheit, so I turned it down to 120. This means for each gallon of hot water you pull out of the hot water heater it needs to heat another gallon of water by 62°F (Delta T).
It takes 1 British Thermal Unit to heat 1 pound of water 1°F. At 58°F, 1 gallon of water weighs 8.34 pounds. At 120°F, 1 gallon of water weighs 8.25 pounds. Using the average of the two, and assuming a 90% thermal efficiency in converting natural gas to hot water, a hot water heater uses about 570 BTUs for each gallon of hot water consumed.
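Spelled out as a calculation (the 90% heater efficiency is the assumption stated above):

```python
delta_t = 120 - 58                    # heater setpoint minus groundwater temperature, in °F
lb_per_gallon = (8.34 + 8.25) / 2     # average of the cold- and hot-water weights
efficiency = 0.90                     # assumed gas water heater thermal efficiency

btu_per_hot_gallon = delta_t * lb_per_gallon / efficiency   # 1 BTU heats 1 lb of water by 1°F
print(round(btu_per_hot_gallon))      # ~571 BTU per gallon drawn from the heater
```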
By the time the 120°F hot water makes it from the hot water heater through the copper pipes to your shower it has lost about 5°F of heat. If you prefer your shower at 106°F then you will need to set your shower dial at a mix of 84% hot water (.84*115+(1-.84)*58=106). This checks out with the shower dial position of the old shower head being about 85% of the way towards full hot water. So the “high flow” shower head used about 13,406 BTUs per shower. (570 BTUs per gallon of hot water * 3.5 gallons per minute * 84% hot water mix * 8 minute shower = 13,406.4 BTUs).
After installing the new low-flow shower head, I noticed that I needed to move the shower dial slightly closer to the “full hot” position to achieve the same comfort level. I’d estimate it is about 95% of the way to hot. This means the aerator makes the water feel about 6°F colder than it actually is: at a 95% hot mix the water is about 112°F (.95*115+(1-.95)*58), but it feels like 106°F. The new low-flow shower head uses 2 gallons per minute. So at this new hot-cold mix, it uses about 8,664 BTUs per shower. (570 BTUs per gallon of hot water * 2.0 gallons per minute * 95% hot water mix * 8 minute shower = 8,664 BTUs).
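Putting the two shower heads side by side, using the dial positions estimated above:

```python
BTU_PER_HOT_GALLON = 570
SHOWER_MINUTES = 8

def shower_btu(flow_gpm, hot_fraction):
    """Energy drawn from the water heater over one shower."""
    return BTU_PER_HOT_GALLON * flow_gpm * hot_fraction * SHOWER_MINUTES

old = shower_btu(3.5, 0.84)   # high-flow head, dial ~84% hot -> ~13,406 BTU
new = shower_btu(2.0, 0.95)   # aerator head, dial ~95% hot  -> ~8,664 BTU
print(round(old), round(new), round(old - new))   # ~4,742 BTU saved per shower
```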
So to answer the question of this blog: YES, aerator shower heads do save energy (and lots of water).
But how long does it take to pay for itself? According to the EIA, natural gas contains about 1,028 BTUs per cubic foot. At a cost of $10 per thousand cubic feet of natural gas (about the residential average in California for the past few years), each shower saves about 4.6 cents of energy ((13,406 BTUs for the old high-flow head − 8,664 BTUs for the new low-flow head = 4,742 BTUs saved per shower) / 1,028 BTUs per cubic foot of natural gas * $0.01 per cubic foot of natural gas). If we reach “peak gas” in the near future, this cost could increase dramatically. Since the shower head cost $14, it will pay for itself after about 300 showers. Some water districts (including our own EBMUD) give away these shower heads for free, making the return on investment infinite!
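Using the per-shower figures above, the savings and payback work out as follows (gas priced at $10 per thousand cubic feet, as assumed above):

```python
btu_saved_per_shower = 13_406 - 8_664   # old high-flow head minus new aerator head
btu_per_cubic_foot = 1_028              # EIA heat content of natural gas
price_per_cubic_foot = 0.01             # $10 per thousand cubic feet

savings_per_shower = btu_saved_per_shower / btu_per_cubic_foot * price_per_cubic_foot
print(round(savings_per_shower, 3))     # ~$0.046 per shower
print(round(14 / savings_per_shower))   # ~300 showers to pay back a $14 shower head
```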
Bonus:
While I was installing the new shower head I also installed a “ladybug” water saving temperature-controlled shutoff valve. This ingenious device shuts off the flow of water once it reaches a certain temperature. To restart the flow you simply pull a cord. So when you want to take a shower, you simply run it as you normally would; once the ladybug detects the shower is hot, it slows the flow to a trickle; then you just hop in and pull the cord to restart the flow.
The Ladybug Temperature-Controlled Shutoff Valve
The alternative way to avoid wasting water while you wait for the shower to heat up is to install a recirculating pump. This pump sits under your bathroom sink and is connected to the hot and cold water lines. When you’re ready to take a shower you push a button and the pump sucks water from the hot water line and forces it down the cold water line until the hot water line reaches the desired temperature. Besides being expensive (they cost about $200 without installation), a recirculating pump also causes your cold water line to have some warm water in it, so when you go to the sink to get cold water after a shower, it will be warm for a bit (which bothers some people).
Waiting for a shower to heat up wastes water and energy because most people don’t want to sit around with their hand in the shower stream waiting for it to warm up. This means they might let it run for a minute or two longer than they need to. At 2 gallons per minute, eliminating an extra 2 minutes of run time saves about 2.1 cents per shower ((570 BTUs per gallon of hot water * 2 gallons per minute * .95 hot water mix * 2 minutes) / 1,028 BTUs per cubic foot of natural gas * $0.01 per cubic foot of natural gas). At a cost of $29, the Ladybug will pay for itself after about 1,377 showers. While this may seem like a while, for a family of four, it is a payback period of less than a year.
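And the same arithmetic for the Ladybug’s warm-up savings:

```python
warm_up_minutes = 2          # run time the shutoff valve avoids
flow_gpm = 2.0               # new low-flow head
hot_fraction = 0.95          # dial position with the aerator head

btu_saved = 570 * flow_gpm * hot_fraction * warm_up_minutes      # ~2,166 BTU per shower
savings_per_shower = btu_saved / 1_028 * 0.01                    # ~2.1 cents
print(round(savings_per_shower, 3), round(29 / savings_per_shower))  # ~1,376 showers to pay back the $29 valve
```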

Friday, October 30, 2015

Peak Oil + Autonomous Cars = Traffic Nightmare

Originally posted at willmartin.com.


How you’re going to feel once peak oil and automated cars make traffic a nightmare

Autonomous Cars
For a number of years now, high-end luxury cars have had autonomous cruise control systems that use lasers or radar to maintain a set distance to the car ahead. Earlier this month Tesla rolled out an autonomous driving mode in its electric cars that takes this a step further. Teslas will now drive themselves on a freeway, accelerate and decelerate on their own and maintain a set distance to the car in front. However, unlike other cars with autonomous cruise control, they will now also change lanes – if you click the turn signal the car will automatically check your blind spot and execute a lane change. This is just one more step toward a fully autonomous car that consumers can buy.
Some of the world’s largest tech companies are working on autonomous vehicles. Google has driven its autonomous prototype cars over a million miles, completely accident-free except when other human drivers crashed into them. Apple has also been hard at work on an autonomous, possibly electric, car of its own. In all, over 25 companies are currently developing autonomous cars.
Most industry experts believe that fully autonomous cars will be mainstream in just 5 years. Others predict that in 20 years most cars won’t have a steering wheel or pedals and in 25 years most people won’t need a driver’s license.
Speed Records
Recently Alex Roy teamed up with Carl Reese and Deena Mastracci to complete a cross-country speed record in a Tesla with the new autonomous control upgrade. Roy is best known for breaking the Cannonball Run record by driving across the United States in 32 hours and 7 minutes in a BMW M5. (This record was most recently bested by Ed Bolian in 28 hours 7 minutes with a Mercedes CL55 AMG.) Reese and Mastracci are known for previously driving the Cannonball Run route in an electric car in just 58 hours and 55 minutes. Using Tesla’s new autonomous mode, the trio completed the route in 57 hours, 48 minutes – over an hour faster than their previous electric car record, but still about 30 hours slower than the petroleum-powered record. For reasons I will describe below, Bolian’s record may stand for eternity as the fastest transcontinental automobile crossing. In the future, traffic may simply become so bad that no one will be able to achieve such a feat again.
The Promise of Autonomous Cars Ending Traffic
Cornucopian futurists have suggested that increased adoption of autonomous cars could bring an end to our traffic congestion woes. One MIT researcher thinks they could reduce traffic by 80%. The Brookings Institution says that autonomous cars will “reduce much of the congestion and delays that make road travel so onerous.” They could even eliminate traffic in Los Angeles – arguably the world’s most car dependent city. It will be “An End to Traffic Jams Forever!”
The idea is that unlike human drivers, autonomous cars have perfect reaction times. They can follow the car in front of them with very little braking distance, matching speeds perfectly. If a group of autonomous cars gets together on the freeway they could form a “train” – all traveling in unison just inches from each other’s bumpers. It has been theorized that having just a few autonomous cars on the road could greatly reduce traffic congestion for everyone else.
The appeal of this is obvious. The average suburbanite is desperate for any news that allows them to think they can continue their “suburban, car-dependent, happy motoring living arrangement.” Driverless cars seem to offer the ability to continue living in a quiet suburban cul-de-sac miles from the nearest workplace or shop. You’d simply sit back, play around on your phone and let your robot car whisk you away to your destination dozens of miles away. “Super-commuters” (those who commute more than 50 miles to work) wouldn’t need to change a thing – they could just catch up on some shut-eye while their robot car drives them to and from work.
Peak Oil and Climate Change Legislation
Peak Oil is the point at which global oil production reaches a maximum rate and begins a permanent decline. Oil is a finite resource, so peak oil will happen – it’s just a mathematical fact. The controversy around peak oil isn’t about whether it will happen, but when, why and how it will happen: Is it happening now? Will it happen because oil gets too expensive to produce, restricting supply? Will it happen because oil gets too expensive to consume, restricting demand? How quickly will production decline after the peak? Will substitute forms of energy and transportation technologies offset the decline? No one can definitively answer any of these questions, but we do know that at some point in the future we will be faced with declining levels of global oil production. One possible outcome of peak oil is that we won’t have sufficient economic substitutes for oil and the price of oil rises significantly. Perhaps electric car production is limited by the high cost of extracting lithium for the batteries (especially since mining requires so much oil). Perhaps NIMBYism prevents us from increasing the walkability of our neighborhoods through the construction of public transportation routes and higher-density mixed-use buildings. In any case, in this scenario people would be stuck relying on their car, but oil prices would incentivize them to use as little fuel as possible.
Another source of higher energy prices is a potential global climate change agreement. Already 114 nations have signed the Copenhagen Accord, which states that the parties agree to limit global warming to 2 degrees Celsius above pre-industrial levels. The European Union has enacted climate change legislation. If all of the existing fossil fuel reserves that are on the books of the world’s oil, gas and coal companies were burned, it would generate more than 2.8 trillion tons of CO2 – well in excess of the 1 trillion ton “budget” that almost every country has agreed to. In order to keep the fuels representing that excess 1.8 trillion tons of CO2 in the ground, a global climate change agreement would need to raise the cost of emitting carbon to a point where more than half of the remaining reserves are never burned. This could be accomplished through a global carbon tax or a global cap and trade program, but the result would be the same – far higher prices for gasoline at the pump. If a global climate change agreement is reached, the average motorist will see rising fuel prices and will be incentivized to use as little fuel as possible.
The Eco Button
Many cars on the road today already have an “eco” button on the dash. The button doesn’t do very much today – it typically changes the throttle response, adjusts the climate control and changes the fuel mapping a bit. In the future of automated cars, however, the “eco” button could do far more – it could pick the most efficient route to the destination (with the fewest hills and stops), it could drive at an optimal speed, and it could accelerate and decelerate at the optimal rates. Today the “eco” button gives drivers about 5-10% better fuel economy. In the future, self-driving cars could easily double your fuel economy at the simple push of an “eco” button. People today are used to hitting the “eco” button when they want to save a bit of fuel. In the future, if fuel prices are far higher and self-driving cars allow a far more impressive fuel economy improvement in “eco mode,” it seems obvious that more and more people will be pushing the “eco” button.
Hypermiling
As gasoline prices have gotten higher over the past two decades, a group of fuel-saving techniques known as “hypermiling” has become more popular. This can involve physically modifying a car by eliminating weight and improving aerodynamics. More commonly hypermiling is accomplished through driving techniques like optimal speed management and acceleration modulation to keep the internal combustion engine at optimal stoichiometric efficiency. Colloquially this is known as “driving like a grandma.”
Today hypermilers are able to achieve some amazing feats of fuel efficiency. Hypermilers are routinely able to get double the “sticker” fuel economy of average cars. For example, hypermilers can get 127 MPG out of a Toyota Prius, which is rated by the EPA for 60 MPG. They can get 62 MPG out of a Toyota Corolla, a car rated for 35 MPG. Even with the king of fuel inefficiency, the Hummer, hypermilers are able to get 22 MPG in a truck that normally gets 10 MPG.
Pulse and Glide Driving and the Accordion Effect
Internal combustion engines are most efficient when they are under full load at low to medium RPMs. This means when you are driving up a hill you will consume less fuel to maintain the same speed if you use full throttle in a higher gear (at a lower RPM) than if you downshifted and used less throttle at a higher RPM. This becomes evident when one looks at the brake specific fuel consumption (BSFC) efficiency contour chart for a typical internal combustion engine:
A brake specific fuel consumption efficiency contour chart
Most internal combustion engine cars these days use electronic fuel injection; this allows the engine to consume zero fuel when you completely let off the throttle and coast. When driving on level ground this means the most efficient way to drive is to “pump” the throttle by accelerating at full throttle from 1500 to 2500 RPM and then coasting back down to 1500 RPM with no throttle. Hypermilers call this the “pulse and glide” technique. Unfortunately, as anyone who’s been in a taxi recently can attest, this “throttle pumping” accelerator modulation is also the best way to make passengers carsick.
Besides making passengers carsick, the pulse and glide fuel saving technique also contributes to traffic through the “accordion effect.” When traffic is dense and a road is close to reaching its maximum capacity, a single speed difference can ripple through the crowd, causing stop-and-go traffic to pile up. This speed disruption can be as simple as someone looking down to check their phone; when they look back up they may realize they are following too closely and brake; the cars behind them see brake lights and they brake as well out of an abundance of caution; a mile back this ripple of brake lights brings traffic to a complete halt.
In the future, autonomous cars may be programmed to pulse and glide their accelerators to maximize fuel economy. As the autonomous cars coast down in speed, human drivers behind them will hit the brakes, causing stop-and-go traffic.
Speed Limits and Speed Minimums
If you asked the average person driving down the freeway to tell you the speed limit, chances are they would have a good idea. But if you asked the same people to tell you the minimum speed, I bet most would have a hard time coming up with it. In California most suburban freeways have a 70 mph speed limit and a 45 mph speed minimum. In reality this means that in the absence of traffic people in the left lane are driving 80 mph while people in the right lane are driving 65 mph. Anyone driving the speed minimum of 45 mph would be traveling at a 35 mph difference to other people on the road. Imagine standing on the side of a road and watching a car pass you at 35 mph – that’s a significant speed difference. Differences in speed cause traffic to pile up – as human drivers come up on an autonomous vehicle traveling significantly slower than they are, they will hit their brakes, which will cause everyone further back to hit their brakes, which will lead to the “accordion effect” of stop-and-go traffic.
If automated cars are put into “maximum economy mode” it is likely that the computer will poll its database for the minimum speed it can drive on any particular freeway and accelerate up to just that speed. Any police officer that pulled over an owner of an automated car driving the speed minimum would have a hard time in court fighting a perfect computerized output of GPS-coded data proving the car was following the letter of the law.
What this means in practice is that either politicians will have to raise the speed minimums of freeways or that we will have to live with traffic congestion caused by automated cars driving the minimum highway speed. My money says that very few politicians will want to go to bat for increased speed minimums.
Electric Cars and “Range Maximization”
In a post-peak oil future with global climate change legislation, carbon-based fuels may be more expensive, but electrons may not be. If an electric car owner charges their car with electrons from solar panels on their home’s roof, they may not care as much about the cost of the electrons. But unless there is a major breakthrough in the energy density of electric car batteries, owners of electric cars will still have a major incentive to drive in a way that maximizes the range of their vehicles. Importantly, the techniques used to maximize fuel economy in an internal combustion car are very similar to the techniques used to maximize the range of an electric car.
Just as internal combustion cars have an optimal speed for maximizing fuel economy, so too do electric cars have an optimal speed for maximizing range. Worryingly, the optimal speed for achieving maximum range in electric vehicles is far slower than the optimal speed for achieving maximum fuel economy in gasoline cars. According to Tesla, the range-maximizing speed for their Model S sedan is just 25 miles per hour! The current world electric car range record was set in a Tesla P85D. The drivers achieved 452.8 miles of range on a single charge by driving at an average speed of 24.2 mph. If drivers push the “eco” button in an automated electric car like a Tesla, it is possible that the software would choose a route that allows it to maintain an average speed of 25 MPH. This would require the car to avoid highways and use roads with lower speed limits. Most city streets, however, have speed limits of 35 MPH. Rural roads often have speed limits of 45 MPH. On many of these roads people are used to driving 5 to 10 MPH over the posted speed limit. A hypermiling automated electric car could easily be driving at 20 or 30 MPH below the average traffic speed. On a two-lane road, where it is difficult to pass, traffic would quickly back up behind such a slow car.
Tesla Range vs Speed Chart
The other main difference between electric cars and internal combustion cars is the acceleration efficiency. While internal combustion cars are most fuel efficient when accelerating at full throttle, electric cars are most energy efficient when accelerating very slowly. When trying to optimize the energy efficiency (and maximize the range) of an electric car, the best technique is to accelerate slowly (like there is an egg beneath the accelerator pedal) and to decelerate slowly by leaving plenty of stopping distance and letting the motor’s regenerative braking bring you to a halt. In fact, a driver who is truly optimizing the efficiency of their electric car would almost never need to use the brake pedal. Needless to say, this form of driving – with extremely slow acceleration and leaving many car lengths of following distance to allow for slow deceleration by the regenerative brakes – can easily cause traffic to pile up.
Due to the low energy density of current electric car batteries, the easiest way to maximize the range of the car is to add as many batteries as possible to the car. Unfortunately adding more batteries adds more mass. As race car teams know very well, added mass is multiplicative – it snowballs. When you add an extra thousand pounds of batteries, you need to add an extra hundred pounds to the chassis to support the batteries; a heavier chassis requires beefier suspension arms, wheels and tires; a larger overall mass requires bigger brakes to stop, which further increases the mass of the wheels, tires and suspension; and on and on. Once the car has been designed with all of the safety and comfort requirements plus the structure to hold such a large amount of batteries, it can tip the scales at astronomical values. The curb weight for the new Tesla Model X SUV, for example, is 5,441 lbs – that is 741 lbs heavier than the Hummer H3! What’s worse, every pound added to the car increases the amount of raw materials needed to build it and increases the complexity of assembling the car; thus the current top-of-the-line Tesla costs 78% of the median home price in the United States.
Self-driving cars offer an alternative way for electric cars to have long range without breaking the bank. Instead of loading up a car with more and more batteries, a self-driving car could have fewer batteries but be able to achieve an impressive range when put in “range maximization mode.” The average American drives 37 miles per day. Currently the cheapest electric car on the market is the Mitsubishi i-MiEV, which has a 62-mile range and costs just $15,495 after rebates. Amazingly that’s just $500 more expensive than the cheapest internal combustion engine car for sale today (the Chevy Sonic). 62 miles of range is almost 70% more than the average person drives in a day. In the near future, automated car technology could become so inexpensive that even a car like the i-MiEV could become totally driverless.
Volkswagen recently got in trouble for cheating on emissions testing by designing the software of their vehicles to adjust the fuel mapping to lower emissions when the vehicle sensed it was being tested on a dyno. In much the same way, it is plausible that in the future electric car companies could design their software to maximize the range of their vehicles when they sense they are being tested. An inexpensive electric car like the i-MiEV may be able to achieve over 100 miles of range by accelerating and decelerating slowly and capping its top speed. The car may simply engage its “eco” mode when it senses it is being tested. But of course “your mileage may vary.” In the real world, ranges would be far less – but as long as the range under normal driving conditions remained above the daily driving needs of the average American, most people wouldn’t complain. For longer trips, drivers could put it in self-driving “eco mode” and just sit back and read a book while the car putters along at 25 MPH with dozens of cars piled up in traffic behind them.
Traffic Today, Traffic Tomorrow
Too many Americans drive too much every day. Many “super commuters” travel over 50 miles each way to their jobs every day. Heading out of their suburban and exurban homes they must contend with drowsy drivers, drunk drivers, distracted drivers texting away, and, increasingly, horrendous traffic jams. Urban sprawl has pushed people from the suburbs into the exurbs. In many places around the world individual cities have sprawled so far that they have begun to merge into megalopolises.
Autonomous cars seem to offer the perfect solution to our driving problems. Robots have perfect reaction times – no more “accordion effect” of stop-and-go freeway jams caused by drivers slamming on their brakes. Robots never get distracted – no more accidents from texting while driving; no more idiots driving too slowly and swerving out of their lane because they’re not paying attention.
Unfortunately, the promise of a traffic-free future is probably a mirage. Peak oil and global climate change legislation will raise the price of transportation fuels. Barring a major breakthrough, affordable electric cars will only be able to achieve long ranges through economical driving. As more and more people hit the “eco” button on their autonomous cars, roads will become increasingly jammed up by robotic cars driving like grandpa on his way home from the blue plate special. As the cost of living in walkable neighborhoods continues to rise, more people may consider moving to car-dependent suburbs. Autonomous cars may make suburban commutes look attractive, but reality will be different. As autonomous car software allows more people to hypermile their cars at the push of a button, suburban commutes could become unbearable. Rather than heading towards a traffic-free future we may be headed towards a traffic jam nightmare.