Power Management using Neural Networks

This project simulates power management on a CPU, using a behavioral model of biological neural networks to predict the power load.

Abstract
This project simulates power management on a CPU, using a behavioral model of biological neural networks to predict the power load. The model takes as inputs a variety of parameters, which are values sampled at different points on the chip; the output is the power load. The neural network model chosen is the Liquid State Machine (LSM), and the model was simulated in MATLAB.

 

The problem
Power consumption has become one of the biggest challenges in high performance microprocessor design. The rapid increase in the complexity and speed of each CPU generation is outstripping the benefits of voltage reduction and feature size scaling.

Designers are thus continuously challenged to come up with innovative ways to reduce power, while trying to meet all the other constraints imposed on the design.

As technology progresses, designs become more complex; technology scaling is therefore needed in order to fit more transistors on a chip.

Performance is thus limited by technology scaling: the smaller the feature size, the higher the power density, which will eventually exceed the limit that can practically be cooled.
Hence the need to develop power management techniques.

 

The solution
This project focuses on predicting the power load using a given neural network. The network receives a binary input vector and produces a binary output vector, which is the network's response. The binary input vector is a binary representation of various events sampled on the chip; it is translated into spike trains and fed into the LSM. In the absence of real measurements, these events are generated using a pre-rendered model. Since different events in a circuit can consume different amounts of power, the events carry different weights; these weights, although pre-rendered, are unknown to the system.

The objective of the project is to build a learning neural network that issues, with a high success rate, the correct power load for a given set of event values. This is done by training the network with training input sets (test sets): sets of event values whose power load is predetermined. After training the system with a sufficient number of such sets, it can reach a high hit (success) rate.
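The following minimal sketch illustrates how such training sets could be generated; the variable names, dimensions and the 8-bit load encoding are illustrative assumptions, not the project's actual code. Each event carries a hidden weight, and the power load of an event vector is taken as the weighted sum of its active events.

    nEvents  = 160;                            % number of sampled on-chip events (assumed)
    nSamples = 1000;                           % number of training strings (assumed)

    hiddenW  = rand(nEvents, 1);               % pre-rendered event weights, unknown to the network
    events   = rand(nSamples, nEvents) > 0.5;  % binary event vectors "sampled" on the chip

    powerLoad = double(events) * hiddenW;      % predetermined power load of each event vector
    loadBits  = dec2bin(round(powerLoad), 8) - '0';  % binary representation of the load (8 bits assumed)

    % Each row of events would then be translated into spike trains and fed
    % into the LSM, with the (events, loadBits) pairs serving as training sets.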

 

Tools

  • MATLAB
  • LSM open loop:

This is a neural network simulation utility for MATLAB that allows creating models of Neural MicroCircuits (NMC) of various sizes and levels of detail.

For more details, see the LSM homepage: http://www.lsm.tugraz.at/

 

Conclusions
To optimize the network, different tests were conducted (three sets of tests in total). The three main conclusions are presented below:

  • Increasing the number of events improves the system's prediction.
  • The higher the resolution, the lower the precision of the system in predicting the CPU load. (Resolution is the distance between neighbouring samples in the pool of load strings, so at higher resolution fewer strings are fed to the NMC; see the sketch after this list.)
  • Spreading the load vectors throughout the full spectrum of possible loads, and applying these new training inputs to the circuit, improves the test results significantly.
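As an illustration (the specific ranges are taken from the example results below), the pool of load levels used for training can be written as a MATLAB range, where the step of the range is the resolution, i.e. the distance between neighbouring load samples:

    fineLevels   = 100:5:140;    % 9 levels over a narrow band, resolution (step) of 5
    coarseLevels = 10:20:250;    % 13 levels spread over the full load spectrum, resolution of 20

    numel(fineLevels)            % -> 9
    numel(coarseLevels)          % -> 13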

Using enough training input strings (more than 100 different strings per power load), and under certain constraints (a minimum number of events, network size, and resolution), a high success rate can be reached.

For example:

No. of event samples in test sets   Events   Levels      Correct hits (%)
100                                 160      100:5:140   84
1000                                160      100:5:140   91
100                                 160      10:20:250   94

 

 

A picture from the GUI created for the project may clarify the meaning of the term “correct hits” (in this case it is 76%).


The GUI is already built into the MATLAB code.
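As a rough sketch (with assumed variable names and example values), the percentage of correct hits can be computed by comparing the load level issued by the trained network with the predetermined level of each test string:

    predicted = [100 105 110 140 120];         % load levels issued by the network (example values)
    actual    = [100 105 115 140 120];         % predetermined load levels of the test strings

    hitRate = 100 * mean(predicted == actual)  % -> 80, i.e. 80% correct hits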

Acknowledgment
We would like to thank our project supervisors, Karina Odinaev and Yigal Raichelgauz, for their help and guidance throughout this work, as well as for their patience. Our thanks also go to the Ollendorff Minerva Center for supporting this project.