Guest Post

Adam is an energy engineer working towards decarbonising the supply of heat and power. He is excited about applying advanced analytical techniques such as machine learning and linear programming to help design and operate our energy systems at a higher level of performance.

I’d strongly recommend checking out his fantastic ADG Efficiency blog.


Introduction

Technological innovation, environmental politics and international relations all influence the development of our global energy system.

Yet there is one less visible trend that may come to dominate all the others. Machine learning is blowing past previous barriers for a wide range of problems. Many results that were expected to take decades have already been achieved.

I’m really excited about the potential of machine learning in the energy industry. I see machine learning as fundamentally new. Up until now all the intelligence humanity has created originated in our brains. Today we have access to a new source of intelligence – computers that can see patterns in data that humans can’t see.

This is a two-part series. Part One introduces what machine learning is, why it's so exciting and some of the challenges it faces. Part Two will highlight some applications of machine learning specific to the energy industry.

What is machine learning?

Machine learning gives computers the ability to learn without being explicitly programmed. Computers use this ability to learn patterns in large, high-dimensional datasets. Seeing these patterns allows computers to achieve results at superhuman levels – literally better than what a human expert can achieve.
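
To make this concrete, here is a minimal sketch of learning from data, using scikit-learn purely as an assumed library choice (the post names no tools): the model is never told the rule linking inputs to outputs, it has to infer the pattern from examples.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 5))            # 1,000 examples with 5 features each
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # a hidden rule the model must discover

model = RandomForestClassifier(n_estimators=100)
model.fit(X[:800], y[:800])               # learn the pattern from 800 examples
print("accuracy on unseen data:", model.score(X[800:], y[800:]))
```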

Machine learning is now state of the art for a wide range of problems. The fields of computer vision, natural language processing and robotics have all been moved forward by machine learning.

To demonstrate what is different about machine learning, we can compare two landmark achievements in computing & artificial intelligence.

In 1997 IBM's Deep Blue defeated World Chess Champion Garry Kasparov. Deep Blue 'derived its playing strength mainly from brute force computing power'. But all of Deep Blue's intelligence originated in the brains of its team of programmers and chess Grandmasters.

In 2016 Alphabet's AlphaGo defeated Go legend Lee Sedol 4-1. AlphaGo also made use of a massive amount of computing power. But the key difference is that AlphaGo's playing strategy was not hand-crafted by its programmers. AlphaGo used reinforcement learning to improve from its own experience of playing the game.

Both of these achievements are important landmarks in computing and artificial intelligence. Yet they are also fundamentally different, because machine learning allowed AlphaGo to learn on its own.
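
To give a flavour of what 'learning from its own experience' means, below is a toy sketch of tabular Q-learning on a five-state corridor. This is not AlphaGo's algorithm, just an assumed minimal example: nothing about the 'right' strategy is programmed in, the agent discovers it by trial and error.

```python
import numpy as np

n_states, n_actions = 5, 2                 # states 0..4; actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))        # value table, learned from experience
alpha, gamma, epsilon = 0.1, 0.9, 0.1

for episode in range(500):
    state = 0
    while state != n_states - 1:           # episode ends at the rightmost state
        if np.random.rand() < epsilon:     # occasionally explore a random action
            action = np.random.randint(n_actions)
        else:                               # otherwise act on what has been learned
            action = int(Q[state].argmax())
        next_state = max(0, state - 1) if action == 0 else state + 1
        reward = 1.0 if next_state == n_states - 1 else 0.0
        # Q-learning update: adjust the value table towards the observed outcome
        Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
        state = next_state

print(Q.argmax(axis=1))                    # learned action (1 = move right) for each state
```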

There are a number of exciting applications of machine learning in the energy industry:

  • forecasting of generation, demand & price
  • energy disaggregation
  • reinforcement learning to control energy systems

Part Two of this series will flesh out some of these applications.

Why now?

Three broad trends have led to machine learning being the powerful force it is today.

Data

It’s hard to overestimate the importance of data to modern machine learning. Larger data sets tend to make machine learning models more powerful. A weaker algorithm with more data can outperform a stronger algorithm with less data.
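
As a rough illustration of this data effect, the sketch below (scikit-learn and a synthetic dataset, both assumed choices) trains the same model on progressively larger slices of noisy data and scores it on held-out examples; accuracy climbs as the training set grows.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Noisy synthetic classification problem
X, y = make_classification(n_samples=20000, n_features=20, flip_y=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=5000, random_state=0)

for n in (100, 1000, 10000):
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train[:n], y_train[:n])    # same algorithm, more data each time
    print(f"{n:>6} training examples -> test accuracy {model.score(X_test, y_test):.3f}")
```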

The internet has brought about a massive increase in the growth rate of data. This data is enabling machine learning models to achieve superhuman performance.

Amount of global data created

For many large technology companies such as Alphabet or Facebook, data has become a major source of business value. A lot of this value comes from the insights that machines can learn from such large data sets.

Hardware

There are two distinct trends in hardware that have been fundamental to moving modern machine learning forward. The first is the use of graphics processing units (GPUs) and the second is the increased availability of computing power.

In the early 2000s computer scientists began repurposing graphics cards, originally designed for gamers, for machine learning. They discovered massive reductions in training times, from months to weeks or even days.
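
As a sketch of what this looks like in practice (PyTorch is an assumed framework choice, the post names none), the same training loop runs unchanged on a GPU when one is available; only the device flag changes.

```python
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(100, 1).to(device)            # toy model, moved onto the GPU if present
optimiser = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

X = torch.randn(10_000, 100, device=device)     # data created on the same device
y = torch.randn(10_000, 1, device=device)

for step in range(100):                          # identical code for CPU or GPU
    optimiser.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimiser.step()
```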

 

Increase in CPU performance (1978-2010)

This speed-up is important. Most of our understanding of machine learning is empirical (based on experiment), and that knowledge is built up much faster when the iteration time for training models is reduced.

The second trend is the availability of computing power. Platforms such as Amazon Web Services or Google Cloud allow on-demand access to a large amount of GPU-enabled computing power. Access to computing power on demand allows more companies to build machine learning products. It enables companies to shift a capital expense (building data centres) into an operating expense, with all the balance sheet benefits that brings.

Algorithms & Tools

I debated whether or not to include algorithms and tools as a third trend. It’s really the first two trends (data & hardware) that have unlocked the latent power of algorithms, many of which are decades old. Yet I still think it’s worth touching on algorithms and tools.

Neural networks form the basis of most state of the art AI. In particular, it's neural networks with multiple layers of non-linear processing units (known as deep learning) that form the backbone of the most impressive applications of machine learning today. These artificial neural networks take inspiration from the biological neural networks inside our brains.
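
As a sketch of what 'multiple layers of non-linear processing units' means in code (again assuming PyTorch), a small fully connected network is simply a stack of linear transforms separated by non-linear activations:

```python
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 64), nn.ReLU(),    # layer 1: linear transform + non-linearity
    nn.Linear(64, 64), nn.ReLU(),    # layer 2: deeper layers learn richer features
    nn.Linear(64, 1),                # output layer
)
```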

Convolutional neural networks take inspiration from the structure of our own visual cortex and have revolutionised computer vision. Recurrent neural networks (specifically the LSTM architecture) have revolutionised natural language processing by allowing the network to hold state and 'remember'.

Recurrent Neural Networks
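
Below is a minimal sketch of that 'remembering' idea (assuming PyTorch): an LSTM layer reads a sequence one step at a time and carries a hidden state forward as it goes.

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=8, hidden_size=32, batch_first=True)

sequence = torch.randn(1, 20, 8)            # one sequence of 20 time steps, 8 features each
outputs, (hidden, cell) = lstm(sequence)    # hidden and cell state summarise what was 'seen'
print(outputs.shape)                        # torch.Size([1, 20, 32])
```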

Another key trend in machine learning is the availability of open source tools. Companies such as Alphabet or Facebook open source their machine learning tools and make them freely available. In the same way that on-demand computing power is allowing small companies to build machine learning products, the availability of open source tools is democratising access to cutting edge algorithms.

It’s important to note that while these technology companies encourage and openly share their tools, they don’t share their data. This is because data is the crucial element in producing value from machine learning. These technology companies know that world class tools and large amounts of computing power are not enough to deliver value from machine learning – you need data to make the magic happen.

Challenges

Any powerful technology has downsides and drawbacks.

By this point in the article the importance of data to modern machine learning is clear. In fact, the supervised machine learning algorithms used today depend so heavily on large datasets that this dependence is itself a weakness. Many techniques don't work on small datasets. Human beings are able to learn from small amounts of training data: burning yourself once on the oven is enough to learn not to touch it again. Many machine learning algorithms are not able to learn in this way.

Another problem in AI is interpretability. A model such as a neural network doesn't immediately lend itself to explanation. The high dimensionality of the input and parameter space makes it hard to trace cause to effect. This is a real difficulty when considering the use of a machine learning model in a real-world system, and it's a challenge the financial industry is struggling with at the moment.

Related to this is the lack of a solid theoretical understanding. Many academics and computer scientists are uncomfortable with machine learning because we can empirically test whether it is working, but we don't really know why it works.

Worker displacement from the automation of jobs is a key challenge for humanity in the 21st century. Machine learning is not required for automation, but it will magnify the impact of automation. Political innovations (such as the universal basic income) are needed to fight the inequality that could emerge from the power of machine learning.

I believe it is possible for us to deploy automation and machine learning while increasing the quality of life for all of society. The move towards a machine intelligent world will be a positive one if we share the value created.

Data is an investment that will pay off

In the specific context of the energy industry I see digitisation as a major challenge. By digitisation I mean a system where everything from sensor-level data to prices is accessible to employees worldwide. It's not enough to have a local plant control system and data historian on site. The 21st century energy company should have all of its data available in the cloud in real time. This will allow machine learning models deployed to the cloud to help improve the performance of our energy system. It's easier to deploy a virtual machine in the cloud than to install and maintain a dedicated system on site.
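
As a loose sketch of what that could look like, the snippet below pushes one sensor reading to a cloud API so that models and staff anywhere could use it. The endpoint, identifiers and fields are all hypothetical; they are not part of any real service.

```python
import time
import requests

# Hypothetical endpoint and identifiers, shown only to illustrate the idea
ENDPOINT = "https://example-energy-cloud.com/api/readings"

reading = {
    "site": "chp-plant-01",
    "sensor": "gas-flow-m3h",
    "value": 412.7,
    "timestamp": time.time(),
}

response = requests.post(ENDPOINT, json=reading, timeout=5)
response.raise_for_status()                 # fail loudly if the reading wasn't accepted
```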

Data is one of the most strategic assets a company can own. It’s valuable not just because of what insights it can generate today, but because of the potential massive value machine learning techniques could extract from it in the future. Data is an investment that will pay off.

Stay tuned for Part Two of this series where I will go into detail on some of the applications of machine learning in the energy industry.

If you've found this blog helpful and would like other topics covered, please feel free to drop me an email with suggestions. You're welcome to subscribe using the 'Subscribe to Blog via Email' section; this will get you the latest posts straight to your inbox before they're available anywhere else.
