Posted on May 2nd, 2016
by Chip Davis
Human Tendencies and The Corporate Organism
No one likes to plan for their own demise, and as humans, we are generally hedonists. When things are working well, it is a relief from daily anxiety to direct our attention toward the prosperity we have achieved. Companies are collections of humans.
For almost every successful new company, there is another that was in a position to make that success its own. This opportunity lost can be a function of several failings: “We didn’t see this or we misunderstood that.” When companies and board rooms contemplate what’s next, it is many times in the context of their current customer set. Customers are good sounding boards for market feedback; however, what actually happens next (or after that) is not always found through customer dialogues. Customers are only going to discuss “what is next” within certain boundaries (i.e. what they are willing to share, what they think is germane to the listener, what they had for breakfast). It is these boundaries that many times are the raw material for a narrow view of the likely reality of “what’s next.”
I once heard former Secretary of State Condoleezza Rice posit that the eventual demise of so many geopolitical strategies is a failure to establish competent “listening posts.” She was speaking to an audience of technology investors using her career observations as a proxy applicable to the technology company organism. To this technology listener, she meant that most companies don’t establish engines/teams/processes dedicated to considering what technologies might be coming out of left field to undercut strategy and market position. Her eventual point was that many times this results in “opportunity lost” and evolves into something seriously detrimental to shareholder value.
In the world of software and the Energy Industry, it is very common for the target market to deliberate “build our own.” Software vendors must regularly compete against the prospect’s view that what they envision provides a competitive advantage and needs to be proprietary. I would argue that this mindset sets a course that deprives the prospect of an interesting form of “listening post.” How?
A good software company is a de facto market intelligence-gathering organism. Through the proliferation of product in the market, the software vendor creates neural pathways between companies (prospects) that compete against each other. A software company’s mission is to gather various forms of “what’s next” from across the installed base and put that information into code/practice. By examining a software vendor’s product road map and long-term vision, a prospect/customer can develop a sense of what his competition is up to. It is an avenue for detecting and handicapping certain types of market messaging (and your competitor’s priorities around competitive advantage) into which the listener otherwise would likely never gain any form of foresight. In his own “buy vs. build” deliberations, the prospect is unknowingly trading away the implicit “listening post” (i.e. buy) for the belief of short-term competitive advantage (i.e. build).
So What Am I Saying?
Obviously this dynamic is not absolute; however, it is a very common reality. Many great enterprise software companies have developed successful playbooks based solely on this notion. Does the “listening post” construct always materialize to the favor of the prospect? NO. But I think I would rather avail myself of the possibility of YES than the probability of opportunity lost.
Posted on April 25th, 2016
by Chip Davis
There is a long history of oil industry participants getting duped by software companies. Unsuspecting prospects are lured by low annual license fees combined with a one-time services load, creating the appearance of an attractive total cost of ownership. The prospect is led to believe that the highly attractive delta on the license fee (compared to competing vendors) more than offsets a higher one-time services load and, on this basis, he makes his selection. Next stop, “deployment.”
“Domain expertise” is the high-level measure used to convey software’s fit for a particular use case. The services load for any deployment is generally inversely correlated to the degree of domain expertise inherent in the code: high domain knowledge in the code = lower services load, and vice versa. When the prospect gets a quote reflecting a high services load, it is a glaring signal that the code “doesn’t know my business.” So in assuming the high services project, the customer is really saying, “I will take the risk of teaching you my business, of which you currently know very little.” This is where projects fail.
If the software vendor does not have five other similar prospects, then your code is a one-off. The code is dependent entirely on you to evolve beyond the initial specifications. The vendor is not coming to you continually with “domain specific” enhancements discovered from a larger user community. As a result, you continue teaching and continue paying for what otherwise would come at no additional charge in the once-contemplated high annual license fee. This raises the total cost of ownership.
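The arithmetic behind this trap is easy to sketch. Below is a minimal, hypothetical total-cost-of-ownership comparison; every dollar figure is an illustrative assumption, not a real vendor quote:

```python
# Hypothetical five-year total-cost-of-ownership comparison.
# All figures are illustrative assumptions, not actual vendor pricing.

def tco(annual_license, one_time_services, annual_custom_dev, years=5):
    """Total cost of ownership over `years`."""
    return one_time_services + years * (annual_license + annual_custom_dev)

# Vendor A: low license fee, heavy services load, and the customer keeps
# paying every year to "teach" the vendor its business (one-off code).
vendor_a = tco(annual_license=50_000, one_time_services=400_000,
               annual_custom_dev=150_000)

# Vendor B: higher license fee, low services load, and domain-specific
# enhancements arrive at no extra charge from a larger user community.
vendor_b = tco(annual_license=150_000, one_time_services=100_000,
               annual_custom_dev=0)

print(f"Vendor A 5-year TCO: ${vendor_a:,}")  # $1,400,000
print(f"Vendor B 5-year TCO: ${vendor_b:,}")  # $850,000
```

Even with a license fee three times higher, the vendor whose code already “knows the business” can come out far cheaper over five years, because the customer is not paying annually to teach the vendor its domain.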
We have many times seen portfolio companies competitively lose prospects, only to have those prospects come back a year later. On the one hand, we like the business; on the other, it is painful to know the story behind their return.
Posted on April 6th, 2016
by Chip Davis
In this video, the CFO of an oilfield service company discusses how software helped operations impacted by a reduced labor force.
Posted on March 7th, 2016
by Chip Davis
Without limited partners, private equity funds can do nothing. This truth many times causes a fund manager to set his priorities around a belief that limited partners are the primary customer of a private equity fund. If we contemplate the business model of a private equity fund through the lens used by the fund manager to consider a potential portfolio company, a different answer can reveal itself.
What Are We Selling?
Houston Ventures is structured to invest in technology companies that solve operational problems in the Energy Industry. There is a wide range of simple problems that result from the remoteness of the oil & gas development supply chain, such as (a) what do I do next?, (b) how should I do it?, (c) can I do it?, (d) did I do it?, (e) did I do it right?, and (f) is everyone happy? Finding companies that meet our criteria reflects a large number of variables to consider. Examples in our portfolio of companies having met our criteria include LiquidFrameworks, Geoforce, NuPhysicia, and Oseberg.
The purpose of market specialization for us is two-fold: (i) having an ability to gauge the market’s prioritization of its operational problems in both up and down markets, and (ii) understanding operational frameworks (at the portfolio company level) that have proved successful for young companies and that fit the DNA of the Energy Industry. Many companies with great “horizontal” technology have hung a shingle in Houston, Texas (et al) only to vacate with no success and not understanding why.
The companies we have in our portfolio are there because they believed that market specialization from a capital source mattered heavily. They were shopping and we were selling. The existence of alternatives required Houston Ventures to engage in a form of value selling (just like the companies themselves). This in turn required us to understand the problem set of the prospect company, prioritize those problems, and convince the listener that solutions we might provide were meaningful and better than those against whom we compete. This is especially complex given that many times the listener must first be educated to the existence of problems he has yet to detect. It is not a short sales cycle. Here is a real world example of one shopper’s evaluation screens.
So Who is the Customer?
Without diminishing the import of LP’s, a lot more intellectual property is required to address the “buying decision” of a small technology company targeting Energy than for a potential LP considering an investment in the Houston Ventures funds. We have to look at ourselves in the following vein: “If a prospect company’s principal value pitch to us is ‘product price,’ we want to end the sales cycle as quickly as diplomatically possible.” Why should the evaluation of a funding source be any different?
It is in this sense that one might consider a portfolio company to be the “true customer” of a private equity fund. We are the product, they are the customer. Focus on how the customer is underserved, act on it expertly, and shareholders (LP’s) prosper.
Posted on February 15th, 2016
by Chip Davis
Russ Capper meets with Chip Davis to discuss operating problems and technology solutions for the Energy Industry. The EnergyMakers Show is a weekly video production featuring interviews with energy industry thought leaders, innovators and public policy makers discussing challenges and solutions to the world’s rapidly increasing need for energy. The Show is broadcast over the web, and featured segments are broadcast on the radio in Houston on NEWS92fm.
Posted on January 22nd, 2016
by stephanie cummings
January 22, 2016 — In December, Houston Ventures invested in a small Oklahoma-based company named Oseberg. Oseberg focuses on the Oil & Gas sector, providing users a means to acquire competitive insights around upstream activity. These insights result from the types of data stored in the Company’s system and its methods for observing relationships among the data.
Looking for Oil & Gas – Searching for and producing oil and gas requires consideration of a large number of variables. Operators gather data from internal and external sources in an effort to determine (a) what they should do next, and (b) how they should do it. Oseberg helps them “discover” what they should do next.
- External Sources of Data – There are several well-established companies that provide industry information about who is drilling, where they are drilling, and aspects of how they go about it. Think Drilling Info and IHS: most of this data is a “reprint” of regulatory filings submitted to states by oil companies, and pertains to what they drilled and what they produced. Each state generally offers this data online in a highly structured format, which makes it relatively easy to gather and package into a single system. Doing just this has made a lot of people a lot of money.
- Retrieval vs. Search of Data – Data aggregator offerings are more about convenience than market intelligence. Time spent in these systems reveals that the user generally has to know what he/she is looking for in order to find it. If you know what you are looking for, these systems are great. But what if you don’t know what you are looking for? What if you are trying to find out what you are looking for? This is where SEARCH comes in.
What analysts look for in oil and gas are patterns among and between data. What makes their investigation so difficult is that no spot on the planet is identical to any other spot. This means that the causal factors for success vary greatly between locations. When you perform a search on say, Google, what you insert in the search bar is basically a request to find everything that matches a particular pattern (i.e., “show me everything that looks like this“). What is required in the oil and gas search bar is something like THIS^23. What is also required is a system that can find things responsive to THIS^23. This is Oseberg. Google is responsive to your search request because the data knows what it is under a variety of definitions.
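The retrieval-versus-search distinction above can be illustrated with a toy sketch. The well records, field names, and operators below are entirely hypothetical:

```python
# Toy illustration of "retrieval vs. search" over well records.
# All records, field names, and operators here are hypothetical.

wells = [
    {"api": "35-003-21001", "county": "Alfalfa", "operator": "OpCo A",
     "lateral_ft": 9800, "formation": "Mississippian"},
    {"api": "35-003-21002", "county": "Alfalfa", "operator": "OpCo B",
     "lateral_ft": 4500, "formation": "Mississippian"},
    {"api": "35-011-30500", "county": "Blaine", "operator": "OpCo A",
     "lateral_ft": 10200, "formation": "Woodford"},
]

# RETRIEVAL: you already know what you are looking for (a key lookup).
by_api = {w["api"]: w for w in wells}
print(by_api["35-003-21002"]["operator"])  # OpCo B

# SEARCH: you describe a pattern and ask what matches it --
# "long laterals by OpCo A" -- without knowing which wells exist.
def pattern(w):
    return w["operator"] == "OpCo A" and w["lateral_ft"] > 9000

matches = [w["api"] for w in wells if pattern(w)]
print(matches)  # ['35-003-21001', '35-011-30500']
```

Retrieval answers a question you already know how to ask; search lets you describe a pattern and discover which records satisfy it, which is the step closer to market intelligence.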
Why We Did It – The reason we invested in Oseberg is that data in the Oil & Gas sector responds to only a narrow set of requests while it needs to respond to a wide range. Oseberg’s mission is to make this possible, and its customers think it is doing exactly that. Customers can now discover what they should be looking for without having to know before they start.
Posted on October 29th, 2015
by stephanie cummings
Reuters | Oct. 21, 2015 — Stagnating rig productivity shows U.S. shale oil producers are running out of tricks to pump more with less in the face of crashing prices and points to a slide in output that should help rebalance global markets.
Over the 16 months of the crude price rout, production from new wells drilled by each rig has risen about 30 percent as companies refined their techniques, idled slower rigs and shifted crews and high-speed rigs to “sweet spots” with the most oil.
Such “high-grading” helped shale oil firms push U.S. output to the loftiest levels in decades even as oil tumbled by half to less than $50 a barrel and firms slashed rig fleets by 60 percent.
But recent government and private data show output per rig is now flatlining as the industry reaches the limits of what existing tools, technology and strategies can accomplish.
“We believe that the majority of the uplift from high-grading is beginning to wane,” said Ted Harper, fund manager and senior research analyst at Frost Investment Advisors in Houston. “As a result, we expect North American production volumes to post accelerating declines through year-end.”
Drillinginfo, a consultancy with proprietary data, told Reuters well productivity has fallen or stabilized in the top three U.S. shale fields – the Permian Basin and Eagle Ford of Texas and the Bakken of North Dakota – since July or August.
The U.S. Energy Information Administration, whose benchmark drilling productivity index is based in part on Drillinginfo data, forecasts next month’s new oil production per rig in U.S. shale fields to stay at October levels, which it estimates at 465 barrels per day (bpd).
The big challenge of shale oil work is that well output drops off quickly – often more than 70 percent in the first year alone. So producers need to keep squeezing more oil out of new wells drilled by the currently deployed rig fleet just to offset steep declines in what existing wells produce.
OLD WELL DRAG
If that is no longer possible and firms remain reluctant to add rigs because of low crude prices and an uncertain outlook, overall production is set to sink.
Chip Davis, managing partner at energy venture capital firm Houston Ventures, says the downward pull of declining output from older wells is getting stronger.
In the Eagle Ford, production from so-called legacy wells fell by 145,485 bpd last month, a drop that was 23 times larger than the 6,293 bpd lost in September of 2010, before the fracking boom brought thousands more wells online.
“The boulder that is decline is much bigger in size and rolling much faster than before,” Davis said. “We’ve got very few rigs to buttress the rate of decline.”
That growing drag suggests the fall in U.S. output could be sharper than a 10 percent drop the EIA sees between a peak of 9.6 million bpd in April and next August, when it expects production to bottom at 8.66 million bpd before starting to recover.
Producers’ strategies for coping with the worst cash crunch in years could also be hurting the productivity of new wells.
To save money, many have started drilling shorter and cheaper vertical wells. They have also cut back in some cases on the size of multi-million dollar hydraulic fracturing jobs for long horizontal wells. Both factors can hurt the average amount of oil being added by new wells.
Analysts say it is hard to predict how much U.S. output will fall and whether it will undershoot official forecasts because lower production could lift prices and that in turn might prompt producers to redeploy idle rigs to pump more.
But for now, most companies are budgeting less next year for new drilling work and the U.S. rig count has tumbled to 595, according to Baker Hughes.
Analysts at Bernstein Research have said that productivity gains so far in this downturn have come from improved efficiency rather than fundamental leaps in technology.
Yet such advances, which are hard to predict, would be necessary to boost productivity again because analysts say shale firms seem to have fully exploited techniques such as drilling multiple wells from one location, drilling longer horizontally, and more intensive fracturing along a well bore.
Initial production rates for new wells in major oil basins also appear to be slowing, Bernstein analysts said, citing their analysis of peak rates dating back to 2009.
“Shale efficiencies will be unable to overcome rig count collapse, leading to a roll in production which is bullish for oil price,” they said.
(Reporting By Anna Driver and Terry Wade; Editing by Tomasz Janowski)
Posted on October 7th, 2015
by stephanie cummings
As of August 2015, the increase in daily oil production resulting from recent rig activity was 4.8x greater than it was in January 2011. In other words, it takes far fewer rigs today to create initial increases in production.
On the flip side, the decrease in daily production from wells not recently drilled was, as of August 2015, 18.5x greater than it was in January 2011.
It is interesting to ponder the ratio of these comparisons in terms of a sports metaphor: “I am in 4.8x better shape than I used to be, but I have to work 18.5x harder to stay there.”
The graphic above indicates that the number of rigs required to sustain the prior month’s production is now about 3x what it was at the beginning of 2011. While it will take further decreases in daily production to create a meaningful improvement in drilling activity, I would note that the velocity of the decrease (even after the required decreases in production), relative to the force needed to counter it, appears greater than ever before by a remarkable margin.
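To make the mechanics concrete, here is a back-of-the-envelope sketch of the “sustaining rig count” arithmetic. Only the 4.8x and 18.5x multipliers come from this post; the January 2011 baseline figures are invented purely for illustration:

```python
# Back-of-the-envelope "sustaining rig count" arithmetic, in the spirit
# of the EIA Drilling Productivity Report. The 2011 baseline numbers
# below are illustrative assumptions; only the 4.8x and 18.5x
# multipliers come from the post itself.

def sustaining_rigs(legacy_decline_bpd, new_prod_per_rig_bpd):
    """Rigs needed this month just to offset legacy-well decline."""
    return legacy_decline_bpd / new_prod_per_rig_bpd

# Hypothetical January 2011 baseline.
decline_2011 = 20_000      # bpd lost per month from older wells
per_rig_2011 = 100         # bpd of new production added per rig

# August 2015, applying the post's multipliers.
decline_2015 = 18.5 * decline_2011   # legacy decline grew 18.5x
per_rig_2015 = 4.8 * per_rig_2011    # rig productivity grew 4.8x

r2011 = sustaining_rigs(decline_2011, per_rig_2011)   # 200 rigs
r2015 = sustaining_rigs(decline_2015, per_rig_2015)   # ~771 rigs

print(round(r2015 / r2011, 2))  # 3.85: the treadmill runs ~4x faster
```

On these assumptions the sustaining rig count grows by roughly 3.9x (simply the ratio 18.5/4.8), in the same ballpark as the roughly 3x shown in the graphic; whatever the exact baseline, the treadmill is running several times faster than it was in 2011.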
While everyone is looking for “markets to balance” around supply and demand for oil, I am suggesting that the fundamental calculus used to plot “what happens next” has changed.