Will Smartgrid Put Electricity Utilities Out of Business?

Or at least radically change their business model. This is an interesting argument that I have come across from two or three different sources in the last little while. The argument goes like this: if the cost of micro generation continues to fall (think solar panels and battery storage in particular), will houses start to leave the grid and become energy self-sufficient? And if utilities do survive, they do so either to supply large commercial and industrial customers or as a deprecated peer-to-peer network.

The game changer in this respect may be electric vehicles (EVs). Imagine that your house is running low on power, so you pop down to the shops to charge up the car, and when you get home you pump half the charge into your house to get you through until the sun shines again or the wind blows. Or maybe you get a home-delivery battery swap for your house, delivered by EV of course.
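
To put a rough number on that scenario (every figure below is my own assumption, not the specification of any particular car or household), the arithmetic looks something like this:

    # Back-of-the-envelope check: can half an EV charge run a house for a day?
    # All figures are assumptions for illustration only.
    ev_battery_kwh = 24.0        # assumed usable EV battery capacity (kWh)
    household_daily_kwh = 16.0   # assumed average daily household consumption (kWh)

    transferred_kwh = ev_battery_kwh / 2
    days_of_supply = transferred_kwh / household_daily_kwh
    print(f"Half a charge ({transferred_kwh:.0f} kWh) covers about "
          f"{days_of_supply:.2f} days of household use")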

Admittedly this is still a long way off, by which I mean more than five years away, but depending on a number of factors it could be well under way shortly after that. The factors that I believe will influence this are as follows.

Battery Technology

There have been a number of claims in recent times about improved battery life and lower manufacturing costs. It is true that costs have been falling. One report I heard recently indicated that EV batteries have fallen 20% in price in the last four months, but I expect this is driven by scaling up production in anticipation of greater demand rather than by improvements in technology. I have recently come across two apparent breakthroughs which hold some promise: one is a way to improve the performance of lithium-ion batteries, which may prove to be a bridging technology. The other is more promising – using nanotechnology to create a whole new battery technology. But there is still quite a degree of scepticism, and it is justified.

Cost of Solar PV Panels

This is one of the biggest hurdles people face in entering micro generation. I believe prices are again being driven down by a combination of scale of production and improvements in technology. There is no better example of this than Google closing down its solar concentration project, which could no longer compete with the falling cost of PV panels. Subsidies aside, it may yet again prove the folly of being an early technology adopter for those who have paid very large amounts of money for their solar arrays.

Energy and Network Price Trade Off

If energy and network costs continue to rise, then the relative household financial benefit of going off grid will continue to rise. This is a well-established assumption behind the growth of electric vehicles, but what if it is also the beginning of the end for the grid? It is likely that emissions trading schemes and taxes will contribute a little to rising prices. Micro generation costs will perhaps be driven down by the energy self-sufficiency policies of the US and China.
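
As a rough sketch of that trade-off, the toy calculation below compares a rising annual grid bill against the falling, annualised cost of an off-grid system. Every figure is an assumption for illustration only:

    # Illustrative break-even sketch: every number here is an assumption.
    annual_grid_bill = 2000.0    # assumed current yearly grid electricity bill ($)
    bill_growth = 0.08           # assumed annual growth in energy + network charges
    offgrid_cost = 60000.0       # assumed up-front cost today of PV + storage ($)
    cost_decline = 0.08          # assumed annual decline in system costs
    system_life_years = 15       # assumed system lifetime

    for year in range(21):
        bill = annual_grid_bill * (1 + bill_growth) ** year
        annualised_system = offgrid_cost * (1 - cost_decline) ** year / system_life_years
        if annualised_system <= bill:
            print(f"Under these assumptions, going off grid breaks even around year {year}")
            break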

Global Recovery

The big one, which will enable all of the above, will be the pace and timing of the global recovery, especially in the emerging economies, combined with continued population growth. This will provide the environment for increased R&D, faster time to market and scaling of production, all of which will improve the technological feasibility and lower the cost of large numbers of consumers going off the grid. New housing will be important in this, as self-sufficiency will need to be weighed up against building out the existing network rather than retrofitting. Economic growth will again put pressure on traditional energy sources and increase global carbon output, and the foot will then again be applied to the global accelerator. And if global recovery is the main driver of the demise of the grid, then it is a question of when and not if.

What’s going on with load factor?

For the last couple of years the commonly accepted view has been that a worsening load factor is a major threat to the economical operation of electricity utilities in Australia. The problem is that peak demand has been growing but consumption has not been growing at the same rate. For utilities, the main capital expenditure cost is providing enough network capacity for maximum demand, but revenue comes from the electricity actually sold.
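
For readers new to the term, load factor is simply average demand divided by peak demand over a period. A minimal sketch with made-up numbers shows the squeeze: the network must be built for the peak, but revenue follows the average.

    # Load factor = average demand / peak demand over a period.
    # Made-up hourly demand profile (MW) for illustration only.
    demand_mw = [300, 280, 260, 250, 270, 320, 400, 520, 610, 580, 540, 500,
                 480, 470, 490, 560, 680, 750, 720, 650, 560, 480, 400, 340]

    average_demand = sum(demand_mw) / len(demand_mw)
    peak_demand = max(demand_mw)
    load_factor = average_demand / peak_demand

    # Network capacity (and hence capex) must cover the 750 MW peak,
    # but revenue is earned on the energy under the curve (the average).
    print(f"Average: {average_demand:.0f} MW, peak: {peak_demand} MW, "
          f"load factor: {load_factor:.2f}")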

Part of the hypothesis for why this is happening is that consumers are buying more energy-efficient devices and are bombarded with energy conservation messages, which means that ordinarily they use less electricity, but on peak hot days they forgo conservation for comfort and contribute to peak demand. The result is that electricity utilities still have high capital expenditure but the revenue base shrinks due to non-peak energy saving by consumers. But is this really true, or is it a false assumption based on a subtle variation of the Khazzoom-Brookes postulate?

In terms of an energy paradox, there are a couple of observations I can make from the analytics we have been doing recently. As one industry leader told me recently, network forecasting was very easy until about 2006. The period from about 2003 to 2008 not only included some of the hottest years on record but was also notable for record population growth.

Then everything changed.

Since the late-2000s many substation forecasts have started to break down. This has been a problem common to a number of utilities, but it has been of particular interest to networks that have been experiencing growth in connections. It is compounded by growing political pressure to reduce capital expenditure.

Firstly, we had a global downturn which depressed economic activity and caused widespread job losses, encouraging consumers to save money. In Australia, we also had what is starting to look like a long-cycle change in the climate, with a series of wetter, cooler summers, which means that peak air conditioner use has fallen in many places. And in mid-2008 the rate of population growth started to slow. Even more curiously, the pattern of consumption by housing age started to change. From about 1980 until the mid-2000s, the newer the house the higher the consumption, but around 2005-2008 all that changed. After this date, newer houses started to consume relatively less electricity and possibly also to contribute less to peak demand. Daniel Collins at Ausgrid has done some interesting analysis in this regard, which he presented at the 2011 ANZ Smart Utilities Conference.

And then there are solar photovoltaic panels, which have been subsidised by government and in effect worsen load factor and therefore push up electricity prices. Solar panels reduce net consumption by feeding electricity back into the grid, and this usually happens during the middle of the day when the sun is brightest, but peak demand occurs in the early evening on hot days, when people get home to a hot house and turn on the air conditioning. This leads to lower revenue but the same or higher network costs.
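
Using the same made-up numbers as the load factor sketch above, the effect can be made concrete: rooftop PV trims the energy sold during the day but leaves the evening peak alone, so load factor falls.

    # Illustrative only: PV reduces energy sold during the day but does
    # little to the hot-evening peak, so load factor worsens.
    peak_demand_mw = 750.0    # assumed evening peak, unchanged by PV
    energy_before_gwh = 11.4  # assumed energy delivered over the day (GWh)
    midday_pv_gwh = 1.5       # assumed energy displaced by rooftop PV (GWh)

    def load_factor(energy_gwh, peak_mw, hours=24):
        average_mw = energy_gwh * 1000 / hours
        return average_mw / peak_mw

    print(f"Load factor before PV: {load_factor(energy_before_gwh, peak_demand_mw):.2f}")
    print(f"Load factor after PV:  {load_factor(energy_before_gwh - midday_pv_gwh, peak_demand_mw):.2f}")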

Electricity is perhaps the oldest post-Industrial Revolution technology still in widespread use, and this means there is a long body of thought and experience in understanding its consumption and the mechanisms for its delivery. This strength may, however, also prove to be a weakness. Many ideas have been around for a long time and are no longer routinely challenged. The energy paradox is one of them. I am not saying that the underlying premise is wrong, but there is certainly room to reinterpret this idea in relation to the current situation in Australia. To challenge orthodox thinking you need to be very sure of your facts. This is where analytics and a wide spread of both data and data mining methodologies can help. As I have said, analytics is at its strongest when it is not hypothesis driven but is working without an explicit hypothesis, or trying to decide between many competing ones.

The answer is that I don’t really know what is going on with load factor. I have yet to be convinced that anyone (including myself) has worked out how to properly account for the climatic variance of peak demand, or fully understands the relationship between housing age and consumption, or what the true relationship is with population growth, or how many speeds our multi-speed economy has. But something is definitely happening, and investigative analytics has excellent potential to greatly develop our understanding of how all of these effects interact.

Melbourne Water and Weekend Flooding

Courtesy of the weekend flooding in Melbourne, I have used the Melbourne Water instantaneous flow rates published on their website to check out the peak flooding of the Plenty River at Greensborough.

This is a shot taken standing next to the automatic reporting station. The small white post sticking out of the water in the middle of the photo is the roadside flood level indicator, which I think tops out at 3.5 metres, roughly consistent with the automatically reported river level.

And this is a photo from a short way downstream showing a sign stating the obvious.

Seven Things IT Should Know About Analytics

  1. Analytics is not BI

Analytics is serviceably defined by Wikipedia, but this does not really do justice to the potential of a properly established analytics environment. To paraphrase Donald Rumsfeld, BI deals with “known knowns” whereas analytics (at its most exciting) deals with “known unknowns” – that is, as data scientists, we know what we don’t know.

Let me illustrate.

It is important to measure new meter connections, where they are occurring and at what rate they are occurring. This is a well-defined measure and can easily be translated into a regular report with well-defined metrics (e.g. how many, time from request to connection, breakdown by geography, meter type, etc.). This is BI.
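
For instance, a minimal sketch of such a report over a hypothetical connections table (the column names are illustrative, not a real schema) might look like this:

    # A sketch of the BI side: well-defined metrics over a hypothetical table.
    import pandas as pd

    connections = pd.DataFrame({
        "request_date":    pd.to_datetime(["2011-06-01", "2011-06-03", "2011-06-10"]),
        "connection_date": pd.to_datetime(["2011-06-08", "2011-06-20", "2011-06-18"]),
        "region":          ["North", "North", "South"],
        "meter_type":      ["interval", "basic", "interval"],
    })

    connections["days_to_connect"] = (connections["connection_date"]
                                      - connections["request_date"]).dt.days

    report = (connections
              .groupby(["region", "meter_type"])
              .agg(new_connections=("meter_type", "size"),
                   avg_days_to_connect=("days_to_connect", "mean")))
    print(report)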

If, however, we don’t know why connections are growing; or they appear flat but in fact we suspect some classes are growing while others are shrinking, so that the net effect is flat growth; or consumption is changing (as it is in many parts of the country) and this may somehow be linked to new connections: that is analytics.
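
A toy example of the kind of decomposition involved: net connection growth that looks flat can hide offsetting movements between (hypothetical) customer classes.

    # A sketch of the analytics side: flat net growth, diverging classes.
    import pandas as pd

    yearly = pd.DataFrame({
        "year": [2010, 2011, 2010, 2011, 2010, 2011],
        "customer_class": ["residential", "residential",
                           "small_business", "small_business",
                           "industrial", "industrial"],
        "connections": [1000, 1060, 400, 370, 100, 70],
    })

    by_class = (yearly.pivot(index="customer_class", columns="year", values="connections")
                      .assign(change=lambda d: d[2011] - d[2010]))
    print(by_class)
    print("Net change:", by_class["change"].sum())  # zero overall, but classes diverge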

  2. Analytics is a business function; not an IT function

Well, not always, but usually. When I started in this field about ten years ago nobody really knew what to do with our team. We were originally part of a project team implementing a large-scale CRM system. We were deeply technical when it came to understanding customer data, but we weren’t part of IT. We were a “reporting” team, but we were also data cube, database and web developers. There was no process for us to have access to a development environment outside of the IT department, so we built our own (we bought our own server from Harvey Norman, and when we had to move offices we wheeled it across the road on an office chair). We built our own statistical models to allocate sales because the system had not been built to recognise sales when they appeared in the product system. And eventually we started using the data to build predictive models and customer segmentation.

To begin with I think IT saw us as a threat to the safe running of “the system”, but over time we were accepted as a special case. To this day, for an organisation to be truly competitive in analytics, it must recognise that the analytics teams embedded within the business are deeply technical and need to be treated as a special class of super user.

  3. Analytics is Agile

This is not a new idea. The best analytics outcomes are produced by small cross-functional groups that cover data manipulation, data mining, machine learning and subject matter expertise. The groups are usually small because analytics development is investigative and generally not hypothesis driven (or, if there are hypotheses, there may be many competing ones which need to be tested), and the outputs of very complex analysis can often be disarmingly simple algorithms. It is not unusual for months of development to produce less than 20 lines of code.

  4. Analytics needs lots of data

Analytics thrives on lots of available data, but not all of it is used all of the time. When we are asked what data we require, the trite answer is “give us everything”. The reason is that results can be biased if they are built on only part of the data record. That’s not to say that all of the data is used all of the time, but a competent analyst will always know what has been excluded and how the data may have been sampled or summarised. For example, we spend a lot of time deciding whether to treat missing data as null, as zero or to replace it in some way, and the answer is different for every project. In the age of smart grid data, where datasets are very large (I have recently seen a ten terabyte table), an analytics environment should regularly build data samples for analytical use (properly randomised, of course, in consultation with analytics users).
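
As a sketch of what a regularly built, properly randomised sample might look like in practice (names and figures are illustrative):

    # A minimal sketch of drawing a reproducible random sample from a large
    # table, while keeping track of what is missing in the sample.
    import random

    def sample_rows(rows, fraction, seed=42):
        """Bernoulli sample: every row has the same chance of inclusion."""
        rng = random.Random(seed)   # fixed seed so the sample is reproducible
        return [row for row in rows if rng.random() < fraction]

    # Hypothetical interval reads; every 50th read has a missing kWh value.
    interval_reads = [{"meter_id": i, "kwh": None if i % 50 == 0 else 0.42}
                      for i in range(100_000)]

    sample = sample_rows(interval_reads, fraction=0.01)
    missing = sum(1 for row in sample if row["kwh"] is None)
    print(f"Sampled {len(sample)} of {len(interval_reads)} rows; "
          f"{missing} with missing kWh")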

  5. Analytics takes care of the “T” in ETL

Analytics teams like to take control of the “Transform” part of the ETL process, especially if the transform step involves summarisation or some other change to the data. Because data mining processes can pick up very subtle signals in the data, small changes to the data can lead to bad models being produced, and sometimes what is considered bad data is an effect of interest to the data miner. For example, “bad” SCADA readings need to be removed from the dataset in order to develop accurate forecasts, but the same bad data may be of interest when building asset failure models.
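
One way to serve both uses is to flag suspect readings rather than delete them; a minimal sketch with hypothetical thresholds and asset names:

    # Flag rather than delete suspect SCADA readings, so forecasting can
    # exclude them while asset-failure modelling can keep them as a signal.
    readings = [
        {"asset": "TX01", "amps": 410.0},
        {"asset": "TX01", "amps": -5.0},     # negative current: suspect
        {"asset": "TX01", "amps": 9999.0},   # stuck/overflow value: suspect
        {"asset": "TX01", "amps": 395.0},
    ]

    def is_suspect(reading, max_plausible_amps=2000.0):
        return reading["amps"] < 0 or reading["amps"] > max_plausible_amps

    for r in readings:
        r["suspect"] = is_suspect(r)

    for_forecasting = [r for r in readings if not r["suspect"]]  # clean series
    for_asset_models = readings                                  # keep the "bad" data
    print(len(for_forecasting), "readings for forecasting;",
          len(for_asset_models), "for asset models")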

  6. Nothing grows in a sterile garden; don’t over-cleanse your analytics datasets

All raw data is dirty: missing records, poor data entry, mysterious analogue readings that get digitised. But as in the example above, dirty data can be a signal worth investigating. Also, because models can be sensitive to outliers, the data miner likes to have control over the definition of an outlier, and it is often a relative measure. If outliers have already been removed upstream, further outlier removal processes can end up discarding valid records. Of course this needs to be balanced against the possibility that dirty data may lead to false conclusions, especially with less experienced analytics users. So the right balance needs to be found, but this does not always mean that cleaner is better.
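
A sketch of what a relative outlier definition under the analyst's control might look like, using the interquartile range (thresholds and figures are illustrative):

    # A relative outlier rule (IQR) the analyst controls, rather than relying
    # on upstream cleansing. Requires Python 3.8+ for statistics.quantiles.
    import statistics

    def iqr_outliers(values, k=1.5):
        q1, _, q3 = statistics.quantiles(values, n=4)  # quartiles
        iqr = q3 - q1
        low, high = q1 - k * iqr, q3 + k * iqr
        return [v for v in values if v < low or v > high]

    daily_kwh = [14.2, 15.1, 13.8, 16.0, 14.9, 15.5, 14.4, 55.0]  # one unusually high day
    print("Flagged as outliers:", iqr_outliers(daily_kwh))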

  7. Be aware of who is doing analytics in your organisation

Because analytics is a technical function embedded in business units, and because it is agile by nature, analytics can be a very rapid way to develop value metrics for large system changes. It is good to know who is doing what, as this may provide tangible evidence of business value. In most organisations that are doing some analytics, it tends to happen in pockets throughout the organisation, so it may not be immediately obvious who is doing what and how that might support IT business cases.

The Smart Grid is Already Here

I have just returned from last week’s ANZ Smart Utilities Conference in Sydney, where I heard a lot of talk about the future potential of smart grids. It is true that there are some fundamental changes around the corner, particularly driven by new technologies in an energy-constrained economy. But arguably the smart grid began sixty years ago, when SCADA started collecting data, and it has been a mostly sleeping giant ever since. In a world where energy has been plentiful and cheap, there has been no real desire to capture, store and analyse the data in any large-scale and systematic way.

In some respects, what has happened to the electricity grid over the last few decades has been a lesson in the tragedy of the commons because cheap electricity has meant that the grid that delivers it has been treated as an over-exploited resource.

Now that we are seeing the impacts of underinvestment in the grid, we are suddenly interested in making it “smart”, when in fact it has been smart all along. In spite of data quality issues and sporadic archiving, there is still a lot of value held in that data that has not been unlocked. This is especially so now that we have technology stacks that can handle very large datasets and apply advanced statistical and machine learning processes to them.

So the smart grid is already here. While it is important to keep an eye on future technology and trends, it is equally important that utilities get their houses in order in terms of how they manage and interrogate their existing data assets.

Welcome to Inside Smartgrid Analytics

Hello and welcome. This blog is intended as an insider’s view from an analytics professional practising in the emerging field of utility analytics. It is an exciting time, with a tsunami of data heading our way. Not only will this data transform the way that electricity networks are managed, but it will also open up a whole new world for utilities, who will transform from network managers into data manufacturers. This data will open new commercial opportunities for utilities and allow them to play a much more dynamic, evidence-driven role in the formulation of energy policy.