Knowledge Center

The Rise of the Machines

31st January 2020


The following is an excerpt from Network Magazine, November 2019. Redistributed with permission.

As energy supply and demand rapidly evolve, the future of electricity generation and distribution becomes ever harder to predict. Fortunately, technology is improving at the same pace, giving network operators the opportunity to stay ahead of the growing challenge. In this partnership study, brought to you by Network in association with Kelvatek, we investigate the role of machine learning in the development of the smart grid, what the challenges are to its widespread use – and how these can be overcome.

“In the future, we anticipate an energy sector rich with algorithmic balancing, automated asset optimisation, software platforms managing the interactions between multiple actors and truly cross-vector provision of energy.” These were the words of the Energy Data Taskforce, which was established to give ministers clarity on the opportunities offered by the vast amount of digital information now collected by the sector. “Greater data openness will provide far superior price and market visibility, increase liquidity and drive investment into the right technologies, locations and solutions for the system, all delivering better system and price outcomes for consumers,” added the taskforce in its June 2019 report.

But Laura Sandys, chair of the taskforce, tells Network there is quite a mountain for the industry to climb to make the most of its data. “As a group of companies and regulators we are quite far behind the curve when it comes to understanding the value of data and knowing how to manage it,” she says. “The food sector leads the way, along with transportation and manufacturing. The automotive sector is very seriously advanced. But it has never been asked of the energy sector. We have relied on a dumb system rather than a smart system.” This set-up won’t be able to cope for much longer, however. “As we move into a different world there will be a lot more moving parts, so data on how they all work will become more and more crucial,” says Sandys. “Data will be the new value in the electricity system.”

The taskforce, managed by the Energy Systems Catapult, identified gaps in the quality and visibility of data available in the sector as well as in the skills available to make the most of what information did exist. “We came up with two principles,” says Sandys. “Filling the gaps and presuming open data.” These underlying fundamentals led to five recommendations being set out in the taskforce’s report earlier this year. Summarised, these were: digitalisation of the energy sector; maximised value of data; increased visibility of data; increased visibility of infrastructure and assets; and co-ordination of asset registration.

Sandys says companies in the sector reacted more positively than expected to the report. “There is now acceptance that data management should be part of business as usual,” she says. “It’s about how rather than why.” And artificial intelligence is part of this ‘how’, she explains. “Once we start to get real-time data, distributing assets and delivering value, this will not be coordinated by a command-and-control office in Wokingham. It will need to be algorithmically managed, predicted and assessed. Machine learning is really important.”

Improving efficiency

Randolph Brazier, head of innovation at the Energy Networks Association (ENA), agrees that computerised analysis of information can play a big role in improving the efficiency of the energy system. “Artificial intelligence can help you understand why network equipment is failing,” he says. “Computers can cross-check a lot of data from a lot of sources and determine trends that humans would not be able to see.”

Network operators need help dealing with challenges both in demand for electricity and generation of it, Brazier explains. “We are starting to see new types of demand such as electric vehicles and potentially heat pumps. These are significant demands that the networks were not designed for when they were put together, in some cases more than a century ago. The challenge is ensuring that we don’t have to build a much bigger network to enable people to use these devices; the cost would be unacceptable.”

On the generation side, renewables are being connected in a range of previously unexpected places. “This doesn’t come into the network in centralised locations; it comes in at every level, even from homes. Again, the network was not designed for these levels of voltage or for the multidirectional flow. We need a smart grid to allow us to manage these generation and demand challenges.”

Brazier says understanding data is critical to show where power is being generated and used, and also to highlight the condition of the assets distributing that power. He points to the Low Carbon Technology Detection Project run by Western Power Distribution, in which artificial intelligence was part of a system able to identify thousands of previously unlocated electric vehicles and solar panels (a sketch of this style of detection follows below). The ENA has set up its own data working group to try to maximise networks’ use of machine learning. “If we can use the technology to save customers money, decarbonise and keep the lights on then we want to do that.”

Brazier backs Ofgem’s approach to encouraging technology development in the energy sector. “We believe the current regulatory framework, which incentivises smart solutions and innovation, is broadly pretty good,” he says. “There are two key elements that have helped – innovation funding through the Network Innovation Allowance and the Network Innovation Competition, and the totex mechanism, which has one pot for both operational and capital solutions, incentivising the networks to find smarter operations rather than building more.”
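
Returning to the detection example above: a common way to spot low carbon technologies from network data is to classify load profiles by the signatures that electric vehicle charging and solar export leave behind. The sketch below is a minimal, generic illustration of that technique; the feature choices, time windows, placeholder data and model are assumptions, not the method used in the WPD project.

```python
# Hypothetical sketch: flagging likely EV charging / solar PV from half-hourly
# load profiles. Feature definitions and labels are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def profile_features(load_kw: np.ndarray) -> np.ndarray:
    """Summarise one day of 48 half-hourly readings (kW, negative = export)."""
    evening = load_kw[36:46]          # ~18:00-23:00, a typical EV charging window
    midday = load_kw[20:30]           # ~10:00-15:00, a typical solar export window
    return np.array([
        load_kw.max(),                # peak demand
        evening.mean(),               # sustained evening load suggests EV charging
        midday.min(),                 # negative midday load suggests PV export
        np.diff(load_kw).max(),       # sharp step changes (a charger switching on)
    ])

# X: one feature row per property/day, y: 0 = none, 1 = EV, 2 = PV (placeholder labels)
rng = np.random.default_rng(0)
X = np.vstack([profile_features(rng.normal(1.0, 0.3, 48)) for _ in range(200)])
y = rng.integers(0, 3, size=200)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict(X[:5]))             # predicted LCT class for the first few profiles
```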

Recognising data

Jonathan Rodgers, future networks lead at Kelvatek, highlights the importance of skills and innovation when he points out that data has no worth in itself until it is understood and acted on wisely. “It is the action that creates the value,” he says. One area where action can make a big difference is in the seemingly mundane but actually hyper-critical maintenance of cables and other network infrastructure. He gives one example of how an increase in the turbulence of both supply and demand for power affects the physical network.

“Electricity North West, through a project called CLASS, has been using tap changers, assets on the network, to reduce voltage, which reduces demand.” While this innovation works well to balance out surges, it requires detailed knowledge of power supply and demand, and also puts additional duties on tap changers, which may need maintaining sooner.
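
Lowering voltage reduces demand because, for many loads, the power drawn falls with supply voltage. One standard way to express this is an exponential load model; the short sketch below uses an assumed exponent purely to illustrate the scale of the effect, not values measured on any real feeder.

```python
# Hypothetical sketch: estimating demand reduction from a voltage reduction using
# an exponential load model P = P0 * (V / V0) ** n. The exponent n is an assumed,
# illustrative value; real feeders are characterised by measurement.
def demand_after_voltage_reduction(p0_kw: float, reduction_pct: float, n: float = 1.3) -> float:
    """Estimate demand after reducing voltage by reduction_pct from nominal."""
    v_ratio = 1.0 - reduction_pct / 100.0
    return p0_kw * v_ratio ** n

p0 = 500.0                      # feeder demand at nominal voltage, kW (invented)
for cut in (1.0, 2.0, 3.0):     # candidate tap-changer voltage reductions, percent
    p = demand_after_voltage_reduction(p0, cut)
    print(f"{cut:.0f}% voltage reduction -> {p:.1f} kW ({100 * (1 - p / p0):.1f}% less demand)")
```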

“We need to tailor maintenance schedules to the reality of how assets are being used, rather than on a time basis or on condition inspections,” says Rodgers. Kelvatek is working with Electricity North West on vibration monitoring of tap changers, using machine learning algorithms to see if a problem is developing. “We were producing up to 20GB of data per tap changer every month in trials,” he says. “Machine learning can filter out what isn’t meaningful and look only for what is useful.”
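
To illustrate the kind of filtering Rodgers describes, the sketch below reduces each vibration recording to a handful of summary features and flags operations that look unusual, so engineers only review the meaningful few. It is a minimal sketch under assumed signal shapes, features and model choice, not Kelvatek’s actual pipeline.

```python
# Hypothetical sketch: summarising tap-changer vibration recordings and flagging
# anomalous operations. Sampling rate, features and the IsolationForest model
# are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

def vibration_features(signal: np.ndarray, fs: int = 10_000) -> np.ndarray:
    """Summarise one vibration recording of a tap-change operation."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return np.array([
        np.sqrt(np.mean(signal ** 2)),                   # RMS energy
        signal.max() - signal.min(),                     # peak-to-peak amplitude
        freqs[spectrum.argmax()],                        # dominant frequency
        spectrum[freqs > 2_000].sum() / spectrum.sum(),  # high-frequency energy share
    ])

# Train on recordings of healthy operations, then score new ones.
rng = np.random.default_rng(1)
healthy = np.vstack([vibration_features(rng.normal(0, 1, 10_000)) for _ in range(100)])
model = IsolationForest(contamination=0.01, random_state=0).fit(healthy)

new_op = vibration_features(rng.normal(0, 3, 10_000))    # a noisier, suspect operation
print("anomalous" if model.predict([new_op])[0] == -1 else "normal")
```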

Samir Alilat, innovation and DSO strategist at Kelvatek, says the company had a lightbulb moment – almost literally – when it released the Bidoyng, or smart fuse. “As well as having a function on the network, as an auto-recloser, we realised it could grab data as faults happened on the network,” he says. “We knew that if we could get hold of the data behind the faults we could put it together with other information we held to work out the location of the fault for the network operator.”

This was useful in helping operators fix problems more quickly, but Kelvatek has moved on to tackle the issue of predicting where things will go wrong. “We are interested in using a network model to be able to advise network operators to deploy sensors so they can intervene or repair a fault before any incident happens. Machine learning projects can help bring disparate data sources together to find relational patterns, creating powerful information that can be used to deliver benefits in a number of areas.”
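
One generic way to bring such disparate sources together is to join asset records with historical fault data and train a model that ranks circuits by failure risk, so sensors go where they are most likely to be needed. The sketch below shows that pattern; the column names, toy data and model choice are illustrative assumptions rather than Kelvatek’s method.

```python
# Hypothetical sketch: ranking LV feeders by predicted fault risk so sensors can
# be deployed where they are most useful. Column names and data are invented.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Disparate sources joined on a common feeder identifier.
assets = pd.DataFrame({
    "feeder_id": [1, 2, 3, 4],
    "cable_age_years": [62, 15, 48, 80],
    "cable_type": ["paper", "xlpe", "paper", "paper"],
    "peak_load_kw": [210, 95, 180, 260],
})
faults = pd.DataFrame({"feeder_id": [1, 1, 3, 4, 4, 4]})   # historical fault records

history = faults.groupby("feeder_id").size().rename("past_faults")
data = assets.join(history, on="feeder_id").fillna({"past_faults": 0})
data["had_fault"] = (data["past_faults"] > 0).astype(int)  # illustrative label

features = pd.get_dummies(data[["cable_age_years", "peak_load_kw", "cable_type"]])
model = GradientBoostingClassifier(random_state=0).fit(features, data["had_fault"])

# Rank feeders by predicted fault probability to prioritise sensor deployment.
data["risk"] = model.predict_proba(features)[:, 1]
print(data.sort_values("risk", ascending=False)[["feeder_id", "risk"]])
```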

Alilat cites a number of focuses for Kelvatek in how it uses machine learning to create value for operators and their customers. “You can look at cable and asset health; LCT detection; load forecasting; voltage optimisation; simulating future scenarios for the market and the network. There are a lot of business cases for using collated data. Artificial intelligence can establish patterns you might not see as a human. You are automating the learning process.”

However, there are some important elements to get right before the technology can do its job. “The quality and volume of data is important,” says Alilat. “If a human reads incomplete or poor-quality books then their conclusions will be coloured accordingly, and it’s the same for machines. You need a lot of data at the correct quality and resolution. You also need a multi-disciplinary team to analyse data and come up with the right models.”

Sandys calls for caution in how artificial intelligence is introduced to manage the networks. “Setting up shadow systems is important, where you run something using a machine learning algorithm at the same time as doing it in real time. Then you can work out whether the machine learning understands the system,” she says. In the long run, though, she anticipates significant changes being driven by the new technology. “Ultimately National Grid should become a systems software company. The network could be run by machines. You need human intervention and oversight but fundamentally you should have some very effective tools by which things can be automated.”
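
The shadow-system idea Sandys describes can be as simple as logging what the algorithm would have done alongside what the operators actually did, then measuring how often the two agree before any automation is trusted. The sketch below illustrates that pattern; the class, decision labels and toy voltage rule are assumptions for illustration only.

```python
# Hypothetical sketch of running a model "in shadow" alongside an existing
# control process: log both decisions, act only on the existing one, and
# compare them after the fact. Names and the decision logic are illustrative.
from dataclasses import dataclass, field

@dataclass
class ShadowRun:
    agreements: int = 0
    records: list = field(default_factory=list)

    def step(self, observation: dict, live_decision: str, model) -> str:
        """One control step: the live decision is applied, the model's is only logged."""
        shadow_decision = model(observation)
        self.records.append((observation, live_decision, shadow_decision))
        self.agreements += int(shadow_decision == live_decision)
        return live_decision            # the network is still run by the existing process

    def agreement_rate(self) -> float:
        return self.agreements / len(self.records) if self.records else 0.0

# Example: a toy voltage-control rule standing in for a trained ML model.
def toy_model(obs: dict) -> str:
    return "tap_down" if obs["voltage_pu"] > 1.05 else "hold"

run = ShadowRun()
for obs, live in [({"voltage_pu": 1.06}, "tap_down"),
                  ({"voltage_pu": 1.01}, "hold"),
                  ({"voltage_pu": 1.07}, "hold")]:
    run.step(obs, live, toy_model)

print(f"shadow model agreed with operators on {run.agreement_rate():.0%} of steps")
```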

Interested in finding out more?

To discuss anything highlighted in our blog, please complete this enquiry form and a member of our team will be in touch.