
Embedded Systems – Designing for Low Power – Episode 04

Mickaël Hiver
Revenue Operations Manager

Haste makes waste

One of the many ways for you as a software engineer to reduce power consumption is to minimize the CPU frequency.
If you have some hardware performance to spare, this can have a drastic effect, since in CMOS logic power consumption is roughly proportional to CPU frequency. The reason is that a digital CMOS circuit has very little static power consumption; it only consumes significant power when it switches states.
In this article we will focus on the dynamic power consumption resulting from switching and leave the static power consumption aside. It contributes very little to your overall power consumption, and as a software engineer you have very limited influence on it anyway.
Before we go into what you can do to reduce power consumption, we will take a quick look at what actually draws power in a CMOS circuit. On each clock cycle, the load capacitance (C) is charged and then discharged. The total charge flowing from Vdd to ground is then C·Vdd, and the energy drawn is C·Vdd². As this happens every clock cycle, the power consumption can be written as:

P = α · f · C · Vdd²
where f is the CPU frequency and α is the activity factor, i.e. the fraction of gates switching each clock cycle. Adjusting the CPU frequency, and thereby the number of switches per second, is the easiest way for you as a software engineer to reduce power consumption, since the other factors are usually determined by the circuit design.
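To make this concrete, here is a small back-of-the-envelope calculation in C. The activity factor, switched capacitance and supply voltage are purely illustrative numbers, not data from any particular device.

```c
#include <stdio.h>

/* Back-of-the-envelope dynamic power estimate: P = alpha * f * C * Vdd^2.
 * All numbers below are illustrative placeholders, not real device data. */
int main(void)
{
    const double alpha  = 0.2;    /* activity factor: fraction of gates switching per cycle */
    const double c_load = 1e-9;   /* effective switched capacitance, farads */
    const double vdd    = 1.8;    /* supply voltage, volts */

    for (double f_mhz = 8.0; f_mhz <= 64.0; f_mhz *= 2.0) {
        double p = alpha * (f_mhz * 1e6) * c_load * vdd * vdd;
        printf("f = %5.1f MHz -> dynamic power ~ %.2f mW\n", f_mhz, p * 1e3);
    }
    return 0;
}
```

Doubling the frequency doubles the estimated dynamic power, which is exactly the linear dependence the formula predicts.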
But keeping the CPU frequency low is not just a matter of setting it as low as possible while maintaining acceptable performance; there is more to it than that. First of all, you need to make a trade-off between performance and power consumption. Reducing the clock speed reduces the performance, but it also reduces the power consumption by the same factor.
There is usually a minimum level of performance required for the system to function properly, but because software can be written and optimized in many different ways, there is also an opportunity to trade software performance against hardware performance. This means optimizing the application for speed so that fewer instructions are needed to get the same job done. You then free up hardware performance that can be used to lower the CPU frequency, as in the sketch below.
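As a toy illustration of what "fewer instructions for the same job" can look like, the table-driven bit count below replaces an 8-iteration loop with a single table lookup per byte. It is only a sketch of the idea, not a recommendation for any specific workload.

```c
#include <stdint.h>

/* Naive bit count: loops over all 8 bits of every byte. */
static unsigned popcount8_naive(uint8_t v)
{
    unsigned count = 0;
    while (v) {
        count += v & 1u;
        v >>= 1;
    }
    return count;
}

/* Table-driven bit count: one load per byte instead of a loop,
 * so the same job executes far fewer instructions. */
static uint8_t popcount_table[256];

static void popcount_table_init(void)
{
    for (unsigned i = 0; i < 256; i++) {
        popcount_table[i] = (uint8_t)popcount8_naive((uint8_t)i);
    }
}

static unsigned popcount8_fast(uint8_t v)
{
    return popcount_table[v];
}
```

The cycles saved per byte are cycles you can give back by lowering the CPU frequency.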
You’ll have to determine where the application spends most of its time and where performance is most critical. Once you have established which functions are critical for your application, you can get started. For example, make sure that any time-critical functions execute from RAM rather than flash memory, as this will speed up execution. You can also instruct the compiler to optimize critical sections of code for speed, even if you use size optimizations elsewhere to keep overall code size down. Then, of course, you need to look at the code you have written to see what improvements can be made.
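The snippet below sketches what this can look like with a GCC-based toolchain. The .ramfunc section name, the startup code that copies it into RAM, and the per-function optimization attribute are toolchain- and vendor-specific assumptions, so check your own linker script and compiler documentation.

```c
#include <stdint.h>
#include <stddef.h>

/* Sketch for a GCC-based embedded toolchain. The ".ramfunc" section and the
 * startup code that copies it to RAM are toolchain-specific assumptions. */

/* Place this time-critical routine in RAM so it avoids flash wait states. */
__attribute__((section(".ramfunc")))
void process_samples(const int16_t *in, int16_t *out, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        out[i] = in[i] / 2;   /* placeholder processing */
    }
}

/* Ask the compiler to optimize just this function for speed, even if the
 * rest of the project is built with -Os to keep code size down. */
__attribute__((optimize("O3")))
void critical_filter(int32_t *buf, size_t n)
{
    for (size_t i = 1; i < n; i++) {
        buf[i] = (buf[i] + buf[i - 1]) / 2;
    }
}
```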
While trading performance for power works well for applications in active mode, it is rare that embedded applications need to be constantly active. More commonly, the application can be put into sleep mode between short bursts of execution. This changes a lot, because now you have to choose between setting a lower clock frequency and keeping the high frequency so that the tasks finish earlier and more time can be spent in low-power mode.
For argument’s sake, we assume that the device consumes zero power in sleep mode and that there is no overhead entering and leaving it. With double the clock frequency you then draw twice as much power while active, but the work takes only half the time, so the total energy consumed is the same for both strategies. Seen from the CMOS perspective, the circuit switches states the same number of times to execute the instructions regardless of clock frequency, and since energy is drawn on each switch, the energy consumed is the same.
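A minimal sketch of this "finish the work, then sleep" pattern on an ARM Cortex-M class device might look like the following; work_pending() and handle_pending_work() are hypothetical application hooks, and the CMSIS header providing __WFI() varies with your part.

```c
#include <stdbool.h>
#include "cmsis_compiler.h"   /* provides __WFI() on ARM Cortex-M; adjust for your device */

/* Hypothetical application hooks, shown only to illustrate the pattern. */
bool work_pending(void);
void handle_pending_work(void);

int main(void)
{
    for (;;) {
        /* Run at full speed while there is work to do ... */
        while (work_pending()) {
            handle_pending_work();
        }
        /* ... then stop the CPU clock until the next interrupt wakes us up. */
        __WFI();
    }
}
```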
But this is not the entire truth. For example, analog peripheral units do not respond to a decrease in clock frequency in the same way as digital logic. An application that uses analog peripherals when active may be better off running at a high CPU frequency while executing and then entering sleep mode. On the other hand, an analog peripheral such as a comparator or converter may need to run for a specific period of time rather than for a specific number of clock cycles, or it may require a specific CPU frequency to operate properly, which complicates things further.
Some applications need to stay in active mode between bursts even though they have little to do. In this case, clock throttling can be a good strategy: keep the CPU frequency high during execution of critical functions, then throttle it back to a level where program execution is slower but still acceptable when there is less work to do.
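A clock-throttling routine could be sketched as follows. clock_set_frequency_hz() is a hypothetical placeholder for your vendor's clock-tree API (PLL and prescaler configuration differ between parts), but the pattern stays the same.

```c
#include <stdint.h>

/* Hypothetical HAL call; substitute your vendor's clock-tree API. */
void clock_set_frequency_hz(uint32_t hz);

/* Hypothetical application functions used to illustrate the pattern. */
void run_time_critical_burst(void);
void run_background_housekeeping(void);

#define FAST_CLOCK_HZ  48000000u   /* full speed for the critical burst */
#define SLOW_CLOCK_HZ   4000000u   /* slow but still acceptable in between */

void service_event(void)
{
    clock_set_frequency_hz(FAST_CLOCK_HZ);   /* speed up for the critical work */
    run_time_critical_burst();

    clock_set_frequency_hz(SLOW_CLOCK_HZ);   /* throttle back when there is less to do */
    run_background_housekeeping();
}
```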

