Productivity and Big Bang Theory

Productivity has been falling in Australia for some time. In the mining, utilities and manufacturing sectors we have seen a remarkable fall in productivity over the last decade. Some of this has been caused by rising labour costs, but in mining and utilities in particular, capital expenditure on infrastructure has been a major contributor. So how will new technology and the era of “big data” transform the way these sectors derive a return on capital investment?


According to the ABS, the fall in mining productivity may have been driven in part by the rapid development of otherwise unprofitable mines to production in an environment of once-in-a-lifetime high commodity prices. From a labour perspective, this has also driven up wages in the mining sector, which has knock-on effects for utilities.

Meanwhile, for the last decade utilities have been dealing with a nexus of chronic under-investment in some networks, our insatiable appetite for air conditioning in hot summers, and a period of growth in new housing with poor energy efficiency design in outlying urban areas that are subject to greater temperature extremes. The capital expenditure required to keep pace with the resulting forecast growth in peak demand has been a major drag on productivity.

In this post I am going to consider how analytics can deliver increased productivity in the utilities sector (although there should be parallels for the mining sector), specifically through optimisation of capital expenditure. I’ll discuss labour productivity in a future post.

Deloitte has recently released its report into digital disruption, Short Fuse, Big Bang. The report identifies the utility sector as one that will be transformed by technological change, albeit more slowly than other sectors. Having said that, electricity utilities and retailers will be the first to experience disruptions to their business models, before water and gas. This is because electricity businesses are at the forefront of privatisation among utilities and of the politicisation of electricity pricing. Internationally, energy security concerns (which have in turn seen the rise of renewables, energy conservation and electric vehicle development, for example) have also driven technological change faster for electricity utilities.

At face value, the concept of the smart grid just looks like a continuation of big-ticket capital investment and therefore a further decline in productivity. Is there, however, a way to embrace the smart grid that actually increases productivity?

Using good design principles and data analytics, I believe the answer is yes. Here are three quick examples.

Demand Management

The obvious one is time of use pricing of electricity, which I have written about on this blog several times already. The problem from a savings point of view is that the payoff from reduced peak demand, in the form of avoided capital expenditure, is quite lagged, and without effective feedback between demand management and peak demand forecasting it may simply result in overinvestment in network expansion and renewal. In fact, I believe we have already seen this occur, as evidenced by AEMO’s revision of its peak demand growth forecasts. When peak demand was growing most rapidly through the mid-1990s, demand management programs were proliferating, as were revisions to housing energy efficiency standards. It should have been no surprise that this would have an effect on energy usage, but quite clearly it has come as a surprise to some.
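To make that feedback loop concrete, here is a minimal sketch of a peak demand forecast that nets off the expected effect of demand management rather than treating forecasting and demand management separately. The growth rate, uptake figures and per-customer response below are illustrative assumptions, not real network data.

```python
# Illustrative only: a toy peak demand forecast that feeds the expected effect of
# demand management back into the projection, so the two are not treated separately.
# All figures (growth rate, uptake, per-customer response) are assumptions.

def peak_demand_forecast(base_peak_mw, annual_growth, years,
                         tou_uptake_by_year, response_per_customer_kw, customers):
    """Trend forecast of network peak demand, net of expected demand response."""
    forecasts = []
    for year in range(1, years + 1):
        trend_peak = base_peak_mw * (1 + annual_growth) ** year
        # Expected peak reduction from customers on time of use pricing this year
        dr_reduction_mw = (tou_uptake_by_year[year - 1] * customers
                           * response_per_customer_kw) / 1000.0
        forecasts.append(trend_peak - dr_reduction_mw)
    return forecasts

# Example: 1,000 MW base peak, 2.5% underlying growth, TOU uptake rising to 50%
uptake = [0.1, 0.2, 0.3, 0.4, 0.5]
print(peak_demand_forecast(1000, 0.025, 5, uptake, 0.3, 500_000))
```

Even a toy model like this makes the point: if the forecast ignores the demand-response term, every megawatt it removes shows up later as "unexpected" over-forecasting and, potentially, network that didn't need to be built.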

Interval meters (commonly referred to as “smart” meters) are required to deliver time of use pricing, and some parts of the NEM are further down the track than others in rolling these out, so this solution still requires further capital investment. In my recent experience it appears to be the most effective and fairest means of reducing peak demand. Meter costs can be contained, however, as the price of smart meters continues to fall. A big cost in the Victorian rollout has not just been the meters themselves but the communications and IT infrastructure to support the metering system. An opt-in rollout will lead to slower realisation of the benefits of time of use pricing in curbing peak demand, but will allow a deferral of the infrastructure capital costs.

An incremental rollout also allows assessment of options such as communications-enabled smart meters versus manually read interval meters (MRIMs). MRIMs capture half-hour usage data but do not upload it via a communications network; a meter reader still has to visit the meter and physically download the data. These meters are cheaper, but labour costs for meter reading need to be factored in. Communications-enabled meters have other advantages: data can be relayed in real time to the distributor, allowing spin-off savings in network management, and consumers can monitor their own energy usage in real time, which increases the effectiveness of time of use pricing through immediate feedback.
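As a rough illustration of what half-hour interval data enables, here is a sketch that bills one day of interval reads under a made-up three-rate time of use tariff and compares it with a flat tariff. The rates and time bands are assumptions for illustration only, not any actual network or retail tariff.

```python
# Illustrative only: billing a day of half-hourly interval data under an assumed
# three-rate time of use tariff versus a flat tariff. Rates and time bands are
# made-up examples, not any actual tariff.

PEAK, SHOULDER, OFF_PEAK = 0.45, 0.25, 0.12   # $/kWh, assumed
FLAT = 0.28                                   # $/kWh, assumed

def tou_rate(interval):
    """Return the assumed rate for a half-hour interval (0-47 within the day)."""
    hour = interval / 2
    if 15 <= hour < 21:                       # 3pm-9pm peak
        return PEAK
    if 7 <= hour < 15 or 21 <= hour < 22:     # shoulder
        return SHOULDER
    return OFF_PEAK                           # overnight off-peak

def daily_bills(half_hourly_kwh):
    """Compare TOU and flat-rate charges for one day of interval data."""
    tou = sum(kwh * tou_rate(i) for i, kwh in enumerate(half_hourly_kwh))
    flat = sum(half_hourly_kwh) * FLAT
    return round(tou, 2), round(flat, 2)

# A day with an evening air-conditioning peak: 48 half-hour reads
usage = [0.2] * 30 + [1.5] * 12 + [0.3] * 6
print(daily_bills(usage))
```

A household with a heavy evening air-conditioning load pays noticeably more under the time of use tariff than the flat rate, which is exactly the price signal intended to shift or curb peak usage.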

Power Management

From real-time voltage management to reduce line losses, to neural network algorithms that improve whole-of-network load balancing, there are many exciting solutions that will reduce operating costs over time. Unfortunately, these require continued capital investment in networks that do not yet have real-time data-reporting capabilities, and there is little appetite for that at the moment. Where a smart grid has already been rolled out, these options need to be developed. Graeme McClure at SP Ausnet is doing some interesting work in this field.
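To give a sense of the physics behind the voltage management example: for a fixed power delivered through a feeder, the current falls as voltage rises, and resistive losses fall with the square of the current. The sketch below is a toy first-order calculation with assumed feeder figures, not real network data.

```python
# Illustrative only: a first-order look at why voltage management affects line losses.
# For a fixed delivered power P through a feeder of resistance R, current is I = P / V,
# so resistive losses are I^2 * R = P^2 * R / V^2: operating nearer the top of the
# allowed voltage band lowers the current and hence the losses.
# The feeder figures below are assumptions, not real network data.

def feeder_loss_kw(delivered_kw, voltage_v, resistance_ohm):
    """Approximate I^2 R loss for a simple single-phase feeder (toy model)."""
    current_a = delivered_kw * 1000.0 / voltage_v
    return current_a ** 2 * resistance_ohm / 1000.0

load_kw, resistance = 50.0, 0.05
for v in (216, 230, 245):   # low, nominal and high ends of a 230 V band
    print(f"{v} V -> {feeder_loss_kw(load_kw, v, resistance):.2f} kW lost")
```

The differences per feeder are small, but summed across thousands of feeders, and managed in real time against actual load rather than assumed load, they add up to a meaningful operating saving.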

Asset Lifetime

This idea revolves around a better understanding of the true value of each asset on the network. Even the most advanced asset management systems in Australian distributors tend to treat all assets of a particular type as being of equal value, rather than systematically quantifying their value based on where they sit within the network. Assets generally have some form of calculated lifetime, and they get replaced before it expires. But what if some assets could be allowed to run to failure with little or no impact on the network? It’s not that talented asset managers don’t already understand this. Many do. But good data analytics can ensure it happens consistently across the entire network. This is an idea I have blogged about before, and it doesn’t require any extra investment in network infrastructure to realise benefits. It is about a conceptually smart use of data rather than smart devices.
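A minimal sketch of what that consistency might look like: score each asset by probability of failure times consequence (here, customers affected, discounted where redundancy exists) and only replace proactively above a threshold. The fields, weights and threshold are hypothetical; a real model would draw probability of failure from condition data and consequence from the asset’s actual position in the network.

```python
# Illustrative only: a toy criticality score for deciding which assets can be allowed
# to run to failure. Fields, weights and threshold are assumptions.

from dataclasses import dataclass

@dataclass
class Asset:
    asset_id: str
    prob_failure_next_year: float   # from condition / age data
    customers_affected: int         # consequence of failure at this network position
    redundancy: bool                # is there an alternative supply path?

def criticality(asset: Asset) -> float:
    """Risk = probability of failure x consequence, discounted where redundancy exists."""
    consequence = asset.customers_affected * (0.1 if asset.redundancy else 1.0)
    return asset.prob_failure_next_year * consequence

def replacement_plan(assets, threshold=50.0):
    """Split the fleet into proactive replacement vs run-to-failure candidates."""
    replace = [a for a in assets if criticality(a) >= threshold]
    run_to_failure = [a for a in assets if criticality(a) < threshold]
    return replace, run_to_failure

fleet = [
    Asset("TX-1042", 0.02, 8000, False),   # high consequence: replace proactively
    Asset("TX-2210", 0.10, 120, True),     # low consequence, redundant: run to failure
]
replace, rtf = replacement_plan(fleet)
print([a.asset_id for a in replace], [a.asset_id for a in rtf])
```

The point is not the particular score, but that a consistent, data-driven rule applied across the whole fleet captures the judgment that good asset managers already apply case by case.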

The era of big data may also be the era of big productivity gains, and utilities still have time to get their houses in order in terms of developing analytics capability. But delaying this transition could easily see some utilities facing the same business model challenges currently confronting parts of the media and retail industries. The transition from service providers to data manufacturers is one that will, in time, transform the industry. Don’t leave it too late to get on board.
