
Machine learning and AI are two buzzwords you hear so often that they’re starting to lose their meaning. If we just focus on machine learning, there’s so much hype around its future potential that it’s easy to forget what’s already happening today. For instance, NVIDIA is using machine learning to boost renewable energy generation and cut the costs of wind farms. Another example is a startup working with NVIDIA to develop smart meters for both home and utility-grid scale. But even with these examples, is machine learning living up to the hype? Let’s take a look at how machine learning is building a more renewable electrical grid and how it’s starting to impact us. Maybe we can come to a decision on this.

For the past few years I’ve been part of a program called Connected Solutions with my Ecobee Smart Thermostat. The thermostat uses machine learning and automations to cut electricity demand during the summer, when everyone cranks up their air conditioning after getting home from work. I’ve got a few other things in my home that also tap into machine learning, but I’ll get to those in a minute. This is just one small piece of a much bigger system that’s changing how we use electricity in our homes and on our local grids.

As we transition from fossil fuels to a renewable-based system to help combat climate change, things like EVs and electric heat pumps for heating and cooling are going to increase our demand for electricity … a lot. And as I’m sure you’re well aware, intermittent renewables like solar and wind power bring a whole new set of challenges: we have to capture as much power as we can efficiently, yet still deliver it on demand. That’s where not just hardware, but software, machine learning, and AI come into play to help pave the way toward a low-emissions future.

AI is a broad field focused on enabling machines and computers to replicate human capabilities such as object recognition, decision-making, and problem-solving. AI has several branches, including machine learning. 1 2

Machine learning allows systems to learn and improve from experience, predicting outcomes without being explicitly programmed. It focuses on the development of computer programs that can access and leverage data for learning. But before we jump into the details of how it works, why it’s important, and the companies making it a reality, let’s take a look at the history of machine learning. 3 4

Machine learning is based on a model of brain cell interaction that Donald Hebb presented in 1949 in his book “The Organization of Behavior”, where he introduced theories on the interaction between neurons. In 1952, Arthur Samuel of IBM developed a checkers program that could learn and improve with play; he later coined the term “machine learning” in 1959. The program placed a value on each checker piece and remembered the outcome of each move. In essence, the program got better at checkers as it played. What was particularly important about this is that the program arrived at solutions its programmer did not create; it detected patterns in the game that humans had not.

Until the 1970s, machine learning evolved as part of AI research. In the years that followed, AI research shifted its focus away from algorithms and toward logic- and knowledge-based approaches, which is why machine learning branched off to evolve on its own, following a data-driven approach in the decades since. By the 2000s, businesses started to turn their attention to machine learning. For example, Google began applying complex mathematical calculations to big data using machine learning.

In my previous life before YouTube, I worked as a designer in software development and tapped into machine learning to improve some of the user experiences I helped design. With graphics processor manufacturers like NVIDIA driving faster and more powerful hardware, machine learning took off across many sectors between 2010 and 2020. 5 6

But how does it actually work?

Machine learning, like the human brain, needs inputs to learn from. However, it makes use of training data and knowledge graphs to comprehend entities, domains, and the relationships between them.

It’s like when we were in school learning math. Something I’ve never been good at, but it’s still a good analogy. First, the algorithm uses examples, direct experience, or instructions as inputs to generate an estimate of a pattern in the data, like a math teacher giving the students a lot of practice problems along with the answers. Then, an error function is used to evaluate the model’s accuracy using those examples for comparison. In the case of our students, they take an exam and their answers are compared to the actual answers so that the teachers can measure the effectiveness of both the student and the set of practice problems they were given. Finally, the model is optimized by altering weights to reduce the gap between the known example and the model prediction, like our math teacher using alternatives to the practice problems to teach the students, and the students using other methods to study for the final exam. After that, the program will repeat the optimization process on its own until the target accuracy is achieved. Looking back on my math experience, I never reduced that gap.
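To make that loop concrete, here’s a minimal sketch in Python, assuming a one-weight linear model and synthetic data. The three steps from the analogy (generate an estimate, measure the error, adjust the weights) are marked in the comments; everything here is illustrative, not any particular library’s training code.

```python
import numpy as np

# Synthetic "practice problems": inputs x with known answers y ~ 3x plus noise.
rng = np.random.default_rng(seed=0)
x = rng.uniform(0, 10, size=100)
y = 3.0 * x + rng.normal(0, 0.5, size=100)

w = 0.0              # the model's single weight, starting from a bad guess
learning_rate = 0.01

for step in range(200):
    prediction = w * x                       # 1. estimate a pattern in the data
    error = np.mean((prediction - y) ** 2)   # 2. error function scores the estimate
    gradient = np.mean(2 * (prediction - y) * x)
    w -= learning_rate * gradient            # 3. adjust the weight to shrink the gap

print(f"learned weight: {w:.2f} (true value 3.0), final error: {error:.4f}")
```

The loop repeats the optimization step on its own, exactly as described above, until the gap between prediction and answer stops shrinking.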

Machine learning utilizes three main techniques: supervised learning, unsupervised learning and semi-supervised learning. 7 8

Supervised Learning: uses training sets, which are labeled datasets. This is like reviewing an example problem that has already been solved. Manually moving spam from your inbox to the spam folder is how you supply the labels a supervised spam filter learns from.

Unsupervised Learning: uses unlabeled datasets. The algorithms are capable of discovering unknown patterns or data groupings without the need for human intervention. Unsupervised learning is suitable for cross-selling strategies, customer segmentation, and image and pattern recognition.

Semi-Supervised Learning: uses a smaller labeled dataset to guide categorization and feature extraction from a larger, unlabeled dataset. It’s useful when there isn’t enough labeled data to train a supervised learning algorithm. A minimal code sketch contrasting the first two approaches follows this list.
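Here’s a toy contrast of the first two techniques using scikit-learn. The “energy usage” features, labels, and numbers are invented for the example; the point is that the supervised model gets the answers up front, while the unsupervised one has to discover the groupings itself.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(seed=1)
# Toy data: two features per sample (say, average usage and usage variability).
normal = rng.normal(loc=[1.0, 1.0], scale=0.3, size=(50, 2))
spiky = rng.normal(loc=[3.0, 3.0], scale=0.3, size=(50, 2))
X = np.vstack([normal, spiky])

# Supervised: we hand over the labels (0 = normal, 1 = spiky), like solved examples.
y = np.array([0] * 50 + [1] * 50)
clf = LogisticRegression().fit(X, y)
print(clf.predict([[2.9, 3.1]]))  # -> [1]

# Unsupervised: no labels; k-means discovers the two groupings on its own.
clusters = KMeans(n_clusters=2, n_init=10).fit_predict(X)
print(clusters[:5], clusters[-5:])  # the two blocks land in different clusters
```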

Machine learning is becoming more important as the amount and diversity of available data increases, not to mention the growing affordability of computational power and high-speed Internet. It’s proven useful because it can solve problems at a speed and scale the human mind can’t match. Machines can be trained to discover patterns and correlations in incoming data and automate routine activities by putting huge amounts of computing power behind a task. 9 10

All these benefits have helped machine learning spread through numerous industries, including healthcare, manufacturing, banking and finance, transportation, and sustainability. For example, in the renewable energy field, machine learning can help overcome the problem of intermittency by forecasting sunlight and airflow for solar and wind power far better than humans can. It uses past weather data to produce accurate forecasts, so renewable energy suppliers can plan when to produce or augment generation. As a result, renewables can become more reliable, affordable, and efficient. Machine learning can also be used to improve renewable energy storage and to identify the optimal layout and geographic location for solar and wind power plants. From the maintenance perspective, machine learning can collect data from sensors installed across the electrical grid to detect anomalies, predict failures, and automate monitoring. It can monitor turbine health and help schedule maintenance or adjust parameters to minimize wear on turbines.
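As a rough sketch of the forecasting idea, here’s a random forest trained on synthetic “historical weather” to predict wind turbine output. The toy power curve and numbers are stand-ins, not any forecaster’s real model, but the pattern is the same: learn output from past weather features, then predict from new forecasts.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=2)
n = 1000
# Synthetic historical weather features: wind speed (m/s) and air density (kg/m^3).
wind_speed = rng.uniform(3, 25, size=n)
density = rng.normal(1.2, 0.05, size=n)
# Stand-in power curve: output rises with the cube of wind speed, capped at rated power.
power = np.minimum(0.0005 * density * wind_speed**3, 2.0)  # MW, toy numbers
power += rng.normal(0, 0.05, size=n)  # measurement noise

X = np.column_stack([wind_speed, density])
X_train, X_test, y_train, y_test = train_test_split(X, power, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print(f"forecast skill on held-out data (R^2): {model.score(X_test, y_test):.3f}")
```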

Considering all these pros, governments and companies around the globe have been working on machine learning to tackle climate change. 11 12 In the United States, for example, the Solar Energy Technologies Office teamed up with IBM to develop Watt-Sun (a pun I both love … and hate), a technology that uses machine learning to sort through data from a large database of weather reports. The idea was to reduce the variability of solar energy’s output, which also reduces the need for excess energy storage. By using Watt-Sun, the team was able to increase the accuracy of weather forecasting, as it relates to energy output, by 30%. 13 14

Siemens Gamesa Renewable Energy, which has thousands of wind turbines installed around the world, has teamed up with NVIDIA to develop a digital twin platform for high-fidelity simulations of wind farms using physics-informed machine learning. A digital twin is a virtual model that accurately mirrors a physical object. This technology gives researchers massive computational power to model complex systems with much higher speed and accuracy than previous AI modeling.

With this platform, a complex fluid dynamics simulation can run up to 10,000x faster than the conventional methods used to simulate engineering problems, which can take more than a month even on a 100-CPU cluster. That’s genuinely game-changing.

The platform takes advantage of NVIDIA Modulus, which is used to train a neural network and build an AI surrogate model for digital twins. Before launching a new product, say a wind turbine, manufacturers can run simulations using a high-fidelity model of the turbine and real weather conditions to check how it would behave in reality. That way, they can avoid spending a lot of money deploying a product that might not perform as expected. The project also works with NVIDIA Omniverse, a platform for 3D design collaboration and simulation. Together, these platforms open up opportunities for virtual system testing, layout changes, software improvements, and updates without affecting the physical, real-world counterpart.
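Modulus’s real API is far more involved (and physics-informed), but the surrogate idea itself can be sketched in a few lines of PyTorch: train a small network to mimic an expensive simulation, then query the network instead. The “simulation” below is a throwaway analytic function standing in for a CFD run.

```python
import torch
import torch.nn as nn

# Toy stand-in for an expensive simulation (in reality, hours of CFD solving).
def expensive_simulation(wind_speed, yaw):
    return torch.sin(wind_speed) * torch.cos(yaw) + 0.1 * wind_speed

# Sample the "simulator" to build training data.
X = torch.rand(2000, 2) * 4.0   # columns: wind speed, yaw (arbitrary units)
y = expensive_simulation(X[:, 0], X[:, 1]).unsqueeze(1)

# Train a small neural net surrogate; once trained, it answers in microseconds.
surrogate = nn.Sequential(
    nn.Linear(2, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

for epoch in range(2000):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(surrogate(X), y)
    loss.backward()
    optimizer.step()

print(f"surrogate fit error (MSE): {loss.item():.5f}")
```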

One example of how this system can improve power efficiency: with the digital twin platform, researchers can evaluate the impact of a wind farm’s layout. Turbines installed close together change the wind flow and create a wake effect, reducing their efficiency. When wind turbine blades spin, they leave a cone-shaped region of slower wind behind them, reducing the amount of energy that turbines downstream can harvest. On top of that, turbulence in the wider area behind the swept blades is significant, further reducing energy production from other turbines. One solution is to yaw turbine rotors slightly away from the oncoming wind, deflecting the wake away from downstream turbines so some of that energy is recovered.
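For a back-of-the-envelope feel for why this matters, here’s the classic Jensen (Park) wake model, a textbook approximation rather than the high-fidelity physics Siemens Gamesa is simulating. The rotor size, thrust coefficient, and decay constant below are just typical illustrative values.

```python
import numpy as np

def jensen_wake_deficit(x, rotor_radius=60.0, ct=0.8, k=0.075):
    """Fractional wind-speed deficit x meters directly downstream of a turbine
    (Jensen/Park model; k is a typical onshore wake decay constant)."""
    return (1 - np.sqrt(1 - ct)) / (1 + k * x / rotor_radius) ** 2

v_free = 10.0  # free-stream wind speed, m/s
for spacing in [3, 5, 7, 10]:          # turbine spacing in rotor diameters
    x = spacing * 2 * 60.0
    v_wake = v_free * (1 - jensen_wake_deficit(x))
    # Power scales with the cube of wind speed, so modest deficits cost real energy.
    power_loss = 1 - (v_wake / v_free) ** 3
    print(f"{spacing}D spacing: wake wind {v_wake:.1f} m/s, power loss {power_loss:.0%}")
```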

By using NVIDIA’s platform, Siemens Gamesa expects to be able to increase power production by 20% through layout optimizations. 15 16 17 To better wrap my head around the potential here, I reached out to Alex, a fellow YouTuber and friend of mine, and an actual rocket scientist and data analyst, for his take…

The Danish company Vestas is also bringing machine learning into its wind turbine operations to reduce the wake effect. It’s used Microsoft Azure high-performance computing and Azure Machine Learning, along with help from Microsoft partner minds.ai, whose DeepSim controller design platform is built on machine learning.

Vestas is using this to make controllers respond appropriately to inputs from the wind farm environment, such as altering turbine yaw in response to wind direction, speed, and wake effects to boost wind farm efficiency and yield. Combining these resources with DeepSim results in faster convergence and lower computational costs. Even a 1% increase in wind turbine efficiency could translate into millions of dollars in revenue and a more secure energy future. IEEE Spectrum estimates that 10% of potential wind power is lost to wake effects, so you can imagine how much power and revenue companies have been losing. 18 19 20
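DeepSim’s internals aren’t public, so here’s only a conceptual sketch of tuning a yaw controller against a simulator. It uses naive hill climbing rather than the reinforcement learning minds.ai actually employs, and farm_power is a made-up stand-in for a real wind farm simulation, but it captures the optimize-against-a-simulator loop.

```python
import numpy as np

def farm_power(yaw_offset_deg):
    """Toy stand-in for a wind farm simulator: total relative power as a function
    of the upstream turbine's yaw offset. Steering the wake helps the downstream
    turbine, while yawing costs the upstream turbine some of its own output."""
    steering_gain = 0.3 * np.sin(np.radians(yaw_offset_deg) * 3)   # downstream recovery
    own_loss = 1 - np.cos(np.radians(yaw_offset_deg)) ** 3         # upstream cosine loss
    return 1.0 + steering_gain - own_loss

# Naive hill climbing: nudge the yaw setpoint, keep any change that raises output.
rng = np.random.default_rng(seed=3)
yaw, best = 0.0, farm_power(0.0)
for step in range(500):
    candidate = yaw + rng.normal(0, 2.0)
    if farm_power(candidate) > best:
        yaw, best = candidate, farm_power(candidate)

print(f"learned yaw offset: {yaw:.1f} deg, relative farm power: {best:.3f}")
```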

With renewables popping up everywhere, there’s been more instability in energy supply, with up to 60% variance. Machine learning could adjust electricity pricing based on the vast amounts of data generated by the growing number of smart meters and sensors in use, as well as help forecast supply and demand, balance the grid in real time, and reduce downtime. 21

A startup, Anuranet, has been taking advantage of the NVIDIA Jetson edge AI platform to build smart electric meters that can take measurements tens of thousands of times per second, which allows for very fine-grained monitoring. Homeowners will be able to control how energy flows from solar panels into the home or into connected EVs, which also helps reduce energy bills.

Anuranet’s line of smart meters, Bullfrog, can connect the grid to smart appliances, home energy hubs, solar panels, electric vehicle chargers, batteries, and more. For the homeowner, the smart meter monitors disruptive weather patterns that may affect the grid, optimizes energy usage, and detects energy waste or failures in appliances and breakers. At utility scale, it will let microgrids optimize numerous energy sources and storage facilities, rapidly detect grid faults, and implement dynamic pricing and demand response using real-time data and AI. All of this can help improve reliability, security, and sustainability.
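We don’t know what’s actually running in the Bullfrog firmware, but the simplest version of “detect waste or failures” from high-rate meter samples looks something like this: learn a baseline, then flag sustained deviations. Every number here is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(seed=4)
# Simulated high-rate current samples (amps); a failing appliance over-draws at the end.
readings = rng.normal(10.0, 0.5, size=5000)
readings[4800:] += 6.0  # fault: sustained excess draw

# Learn a baseline from the first stretch of normal operation.
baseline = readings[:500]
mean, std = baseline.mean(), baseline.std()

# Flag samples more than 4 standard deviations above the learned baseline.
z_scores = (readings - mean) / std
alerts = np.where(z_scores > 4)[0]
print(f"first anomalous sample index: {alerts[0]}")  # ~4800
```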

It’s very similar to systems I’ve used in my home from Sense and Span. I used to have a Sense energy monitor installed in my electric panel that could detect improper voltage from the grid, so you could proactively reach out to your utility to address any issues. It could also warn you if a compressor or motor, like the one in your home air conditioner or furnace, was drawing too much current, so you could get it serviced before a failure. I now have a Span smart electric panel, which has alerted me to unusual changes in my electricity use on certain circuits, saving me money. We’re seeing more and more of these types of home products hitting the market. Speaking of which, Anuranet’s smart meter is expected to hit the market this fall. 22 23

In the research field, we found an incredible number of ongoing machine learning and AI projects using Linknovate. Everything from “A Deep Learning Model for Forecasting Photovoltaic Energy with Uncertainties” to “Deep Reinforcement Learning for Optimal Energy Management of Multi-energy Smart Grids”. This is a very active and growing industry, but is it living up to the hype?

Well, there are still several challenges to overcome. Machine learning is based not on knowledge but on data, so a lack of training data, or unclean and noisy data, can lead to inaccurate predictions. For early adopters of the technology, there may not be enough data to make proper decisions, drastically reducing the system’s effectiveness. And because the old system worked, there might be some buyer’s remorse with machine-learning-based systems until the training sets improve.

In addition, the machine learning process is complex in itself. Analyzing the data, removing bias, training on datasets, applying complex mathematical calculations to the models … it all adds up. Running machine learning models is also slow, demanding a lot of time and computational power. 3 24

But as we can see, the benefits it can provide towards tackling climate change and making our electrical grid more reliable are impressive. With big players in the renewable sector leaning into machine learning and with NVIDIA pushing the development of faster and more powerful computing technologies, we may see more widespread use of this technology around the globe in the coming years, helping to make a more sustainable future.
