I have just returned from the ANZ Smart Utilities Conference in Sydney last week, where I heard a lot of talk about the future potential of smart grids. It is true that there are some fundamental changes around the corner, particularly driven by new technologies in an energy-constrained economy. But arguably the smart grid began sixty years ago, when SCADA started collecting data, and it has been a mostly sleeping giant ever since. In a world where energy has been plentiful and cheap, there has been no real desire to capture, store and analyse that data in any large-scale and systematic way.
In some respects, what has happened to the electricity grid over the last few decades has been a lesson in the tragedy of the commons: because electricity has been cheap, the grid that delivers it has been treated as an over-exploited resource.
Now that we are seeing the impacts of underinvestment in the grid, we are suddenly interested in making it “smart”, when in fact it has been smart all along. In spite of data quality issues and sporadic archiving, there is still a lot of value held in that data that has not been unlocked. This is especially so now that we have technology stacks that can handle very large datasets and apply advanced statistical and machine learning techniques to them.
So the smart grid is already here. While it is important to keep an eye on future technologies and trends, it is equally important that utilities get their houses in order in terms of how they manage and interrogate their existing data assets.