Neil Edmunds and the next generation of reservoir engineering
By Peter McKenzie-Brown
This article appears in the June issue of Oilsands Review
Neil Edmunds is a serial innovator. Now the vice-president of enhanced oil recovery for Laricina Energy, in the past he’s worked in a variety of technical and executive positions with other companies. For example, at EnCana he provided reservoir and operations direction for Foster Creek’s Vapex and SAGD pilots. At CS Resources he was responsible for the Senlac thermal project in Saskatchewan. Later, as the CS vice president responsible for recovery technologies, he focused on enhanced recovery research.
In the 1980s he was lead engineer on AOSTRA’s underground test facility (UTF), which provided the definitive demonstration of the viability of SAGD. The UTF proved the process beyond question in 1992, when it briefly achieved positive cash flow at a production rate of about 2,000 barrels per day from three horizontal pairs. Edmunds stresses that the use of horizontal well pairs was not his idea, but was suggested by his predecessors at AOSTRA. However, as he recalls, SAGD pioneer Roger Butler “wanted to try a vertical fracture, but none of us wanted that, so we designed the horizontal well idea and tested it at the UTF.” The rest is history.
Just as SAGD was constructed upon the physics of Roger Butler, the original idea for what Edmunds calls his “favourite claim to fame” is partly an adaptation of work begun by nuclear engineer Terry Stone of the Alberta Research Council. Stone’s PhD thesis included a mathematical model to calculate fluid flows at reactor accidents.
At AOSTRA, Edmunds began applying this idea to heavy oil production, and sold the idea to CS Resources when he joined that company in 1995. The simulator, eventually acquired by Cenovus through a merger, “gives a detailed model of what happens in the wellbore in terms of heat transfer and fluid flow,” according to Edmunds. “This gives Cenovus quite an advantage in terms of engineering the wellbores themselves. Think about it: you’ve got a 7-inch pipe and it’s maybe 800 metres long; to make it efficient you have to figure out how to circulate the fluids to heat the reservoir uniformly.” He deadpans, “that involves some real plumbing challenges.”
Replacing engineers with computers
So what is Edmunds up to today? “I like to say we’ve replaced reservoir engineers with computers, but what we’ve really done is up the level at which engineers can operate. Instead of being drones who try to optimize stuff every day, we can now do rapid searches through classes of variables to find the best approach to any given reservoir.”
Laricina is the first company to attempt to develop commercial production from Alberta’s bitumen carbonates. Its carbonates project is now steaming up, and the company will follow this with a program in the Grand Rapids formation, which is essentially the same as the McMurray. Both projects will use thermal solvent programs.
To illustrate the nature of the reservoir engineering problems he faces, Edmunds describes the chore as like finding the highest peak in a mountain range – in the fog. The surface to be optimized can’t be seen, only sampled at specific points. The problem exists in many dimensions, and it’s non-linear – especially when you consider the economics involved. Most frustrating of all, the same action can generate different, even opposite, effects when applied in different situations. Given those realities, he set out to develop an algorithm that could help the company select the most economically efficient way to produce from these difficult and largely unexplored strata.
The project – he says it began as a hobby before he helped create Laricina – now involves an algorithm of about 20 lines, and it could conceivably transform in situ oilsands production. “We use a lot of machinery to run the input files, but the basic algorithm is simple.” He adds, “This is pretty new in the oil business.”
“What we have done is to program a genetic algorithm. We encode the possible processes so the algorithm generates digital chromosomes out of 0s and 1s. Once you’ve run each file you need a fitness score. Ours is dollars per barrel. We create a class of possible processes with a fair number of variables. Our computer may take a couple of weeks, but it can run a huge number of possibilities. The computer takes the winners from each trial, recombines their strings of 0s and 1s – the same thing biology does with DNA. We use some from the mother and some from the father,” Edmunds explains, “and we presumably end up with a better organism. You never know if you have the best possible answer, but in the 5,000 trials the computer ran for us it does seem to have ended up with a very good answer.”
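The loop Edmunds describes can be sketched in a few dozen lines of code. In the illustrative Python below, candidate processes are encoded as bitstring “chromosomes,” each is scored by a fitness function, and the winners are recombined, “some from the mother and some from the father.” The fitness function here is a simple numerical stand-in invented for this sketch; in Edmunds’ system each evaluation is a full reservoir simulation scored in dollars per barrel, and the encoding of process variables is his team’s own.

```python
# A generic genetic-algorithm sketch, not Laricina's actual code.
# The fitness function and its "target" optimum are hypothetical stand-ins.
import random

random.seed(0)                 # reproducible runs for this illustration

CHROMOSOME_BITS = 16           # packed process choices (solvent mix, timing, etc.)
POPULATION_SIZE = 30
GENERATIONS = 60
MUTATION_RATE = 0.01

def fitness(chromosome):
    """Stand-in score: decode the bits as an integer and reward values near
    a made-up optimum. A real run would launch a simulation input file and
    return an economic metric such as dollars per barrel."""
    x = int("".join(map(str, chromosome)), 2)
    target = 40000                          # hypothetical best process
    return -abs(x - target)                 # higher (less negative) is better

def crossover(mother, father):
    """Single-point recombination: some bits from each parent."""
    point = random.randint(1, CHROMOSOME_BITS - 1)
    return mother[:point] + father[point:]

def mutate(chromosome):
    """Occasionally flip a bit, as biology occasionally mutates DNA."""
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in chromosome]

def evolve():
    population = [[random.randint(0, 1) for _ in range(CHROMOSOME_BITS)]
                  for _ in range(POPULATION_SIZE)]
    for _ in range(GENERATIONS):
        # Keep the fitter half as parents -- "the winners from each trial".
        population.sort(key=fitness, reverse=True)
        parents = population[:POPULATION_SIZE // 2]
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POPULATION_SIZE - len(parents))]
        population = parents + children
    return max(population, key=fitness)

best = evolve()
```

As Edmunds notes, the method never guarantees the best possible answer; it converges toward a very good one after enough trials, and the expensive step in practice is not the algorithm but the thousands of simulator runs behind each fitness score.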
Your reporter’s skill in mathematics is limited, so to pursue Edmunds’ ideas I referred to a paper he and co-authors Behdad Moini and Jeff Peterson prepared for the 2009 Canadian Petroleum Conference. Titled “Advanced Solvent-Additive Processes via Genetic Optimization,” the paper is a partly whimsical, somewhat over-written but unquestionably accessible description of the project. It seems to deliberately raise more questions than it answers.
As the authors explain, the industry has long known that adding light hydrocarbon solvents to steam can improve well performance, but the optimum choice of additives involves evaluating vast numbers of alternatives. The genetic approach may allow computers to quickly come up with solutions tailor-made for each production system.
The authors describe the application of advanced mathematics to complex factors in reservoir engineering and find that the results tally with findings from trial and error. The convergence verifies the usefulness of applying mathematics in this way to real-world problems. The computer run takes a couple of weeks, while trial and error can take many years, so the authors argue that employing mathematics in this way can save enormous amounts of time and money. While they admit that the model is greatly simplified, their general conclusion is that an industry that devoted major effort to similar projects could find itself spending months in the lab instead of decades in the field. The argument is compelling.
After running the algorithm, Edmunds’ team used engineering models to crack the code of the computer output and then applied an economics package to the whole. “In this, we are trying to do an economics calculation. The key thing for me was that working on these solvent processes involves too many variables. When you think you’ve solved a problem, it can be hard to look at the raw output and decide whether what you have come up with is good or not. So we have an automatic economics package which looks at each of the simulations. This permits computers to identify solvents and timetables that will maximize profit.”
With a sense of pride that only the mathematically gifted can appreciate, Edmunds observes that “our algorithm reproduced some of the best mathematical ideas that people have written up in the last 15 years or so.” Not bad for what he calls “a dumb piece of code.”