Reducing power consumption is a major challenge facing the IoT. Steve Rogerson looks at the techniques being employed. Some extremely large figures have been bandied around for the number of devices that will eventually make up the Internet-of-Things as research companies try to outdo each other in their predictions.
Whether the reality will match the forecasts - at least in the time frames suggested - has yet to be seen, but what is certain is that many of these devices will be in remote areas or hard to reach places and will be powered either by energy harvesting or batteries.
As nobody wants to be changing batteries at regular intervals, and energy harvesting can be patchy to say the least, there is a lot of pressure on designers to reduce the power consumption of these devices significantly. At present, there seem to be two major approaches: designing the chip architecture so that it uses less power, and examining the way these devices operate, and the networks to which they are connected, to see whether savings can be made there.
On the network side, the wireless protocol is attracting a lot of attention. Cellular and Wi-Fi, while widely available, are not noted for low power consumption, as witnessed by modern smartphones that often need recharging once or twice a day. Bluetooth Low Energy has a lot of advantages, but range is not one of them, and that can cause problems for some IoT applications. And then there are the various low power WAN technologies, such as LoRa and Sigfox, that hold promise for certain M2M applications.
“The most controllable parameter that is driving energy consumption down is the protocol of the network,” said Hardy Schmidbauer, Head of Marketing at Semtech, one of the backers of the LoRa standard. “Low power wide area networks are driving orders of magnitude of improvements in battery lifetime.”
He said a problem with cellular was that devices had to check in with the network at regular intervals, and that consumed a lot of battery life, though he acknowledged that the latest work on LTE-M - the LTE variant optimised for M2M - was tackling some of these issues.
“They are extending the synchronisation time from 1.5s to 20s, but that is still worse than can be achieved by low power WANs,” he said. “These are asynchronous. In something like LoRa the end node can be asleep for days or weeks. That gives significantly longer battery life.”
Randall Restle, an applications engineer at Digi-Key, said the simplest way to save power was to turn the device off when it did not have to communicate.
“There are companies making devices they guarantee will wake up when they need to,” he said. “You can get by with once a minute or longer. The current to power a clock can be very small. Where the power reduction is needed is in the communications.”

Schmidbauer said the applications suited to cellular were those where there was an existing power supply and high data rate communications were needed, a good example being automotive, where the car’s battery provided the power. Cellular, he believes, will account for just 10-15% of targeted IoT applications, with Wi-Fi and Bluetooth handling 40% and the rest being served by low power WANs.
Reducing the bandwidth is a good technique for cutting power, but it also reduces the data rate - not a problem for many IoT applications.
“It does mean you can use a less power hungry radio,” said Emmanuel Gresset, Director of Business Development at Ceva. “However, some of the early standards are only one way because it was thought the devices would only need to send. That is wrong. You always need some kind of downlink. Now, all the standards are two way and most of the sensors have very low data rates.”
A lot of semiconductor manufacturers are starting to embed wireless technology within the chips or modules, whether it be short range such as Bluetooth Low Energy or a longer range sub-gigahertz transceiver, and then optimising the design for the lowest power.
“You don’t want them on all the time, just when they have to do a task,” said Andrew Bickley, Technology Marketing Director at Arrow Electronics. “So you need a deep-sleep mode to turn off everything apart from a sniffer for when they need to wake up. You want the wake-up to be quick, do the task and then go back into deep sleep. There are a lot of people working on this.”
Gresset added: “If you do a few packets once in a while, they can go into power saving mode and wake up when needed. That is a very powerful method to reduce power.”
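The saving from this sleep-and-wake pattern can be estimated with a simple duty-cycle model. All of the currents, timings and battery capacity below are illustrative assumptions, not figures from any vendor:

```python
# Back-of-envelope battery-life estimate for a duty-cycled IoT node.
# All figures are illustrative assumptions, not vendor data.

def battery_life_days(capacity_mah, sleep_ua, active_ma, active_ms, period_s):
    """Battery life for a node that sleeps, then wakes every `period_s`
    seconds to be active (e.g. transmit) for `active_ms` milliseconds."""
    duty = (active_ms / 1000.0) / period_s               # fraction of time awake
    avg_ma = active_ma * duty + (sleep_ua / 1000.0) * (1 - duty)
    return capacity_mah / avg_ma / 24.0                  # hours -> days

# Waking once a minute for a 50 ms, 15 mA transmission, with a 2 uA
# deep-sleep current (real-time clock running) and a 1000 mAh cell:
print(round(battery_life_days(1000, 2, 15, 50, 60)))    # ~2874 days, nearly 8 years
```

The same function shows why the sleep current matters so little and the radio so much: the node above spends less than 0.1% of its time awake, yet the transmit burst still dominates the average current.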
Another way to reduce power is through the radio design. By making it half duplex, many of the same components can be used for both transmit and receive, which reduces power. It also means more can be integrated into the same chip.
Gresset believes a fully integrated chip to support LTE will appear in 2016. This is already available for networks such as Weightless, Sigfox and LoRa.
“The problem is there are many of these standards and they are competing against each other,” said Gresset. “The business model is not clear. Some will be successful for specific use cases, especially in say a city, but they are never going to be successful nationwide. If you want to track trailers, trucks, cars, pets and so on, you need full coverage, and that means you are going to have to use cellular. LTE will be the only option combined with GPS.”
On the silicon side, Gresset pointed to leakage current, which worsens at advanced process nodes. “These new expensive processors are very bad at leakage,” he said. “They leak a lot. For IoT, you need to go back to higher widths because they have much better leakage figures.”
However, while leakage is a problem, even more can be saved by tackling the dynamic power consumption, as internal nodes are charged up and down. One company – Ambiq Micro – is tackling this by going to an old technology – sub-threshold CMOS techniques, first used in the 1970s by Swiss watchmakers who saw the advantage of using some transistors in the sub-threshold region. This was later adopted for pacemakers and RFID tags, but not much else until now.
In an IoT application, this can save power by moving some power hungry functions into a processor built with sub-threshold techniques. Keith Odland, senior director of marketing at Ambiq Micro, illustrated this with a medical application. “We were working with a customer who had a wearable device measuring health data such as heart rate,” he said. “It would write the data to a flash card. The power used to acquire the data was 1µW but it took 1mW to write to the card, so a thousand times more energy to store the data than to acquire the data.”
However, data such as this lends itself well to compression. Using the processor to compress the data ten-fold consumed 2µW, but cut the time needed to write to the card by the same factor, resulting in a roughly ten-fold reduction in power consumption.
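The arithmetic behind that saving is worth spelling out. The power figures are the ones quoted in the article; the budget model itself is a simplification that ignores overheads such as flash set-up time:

```python
# Energy budget for the wearable example: acquiring a sample costs ~1 uW,
# writing to flash ~1 mW. Compressing 10x on the MCU costs an extra ~2 uW
# but cuts flash-write time, and hence average write power, by 10x.
# Power figures are from the article; the model is a simplification.

ACQUIRE_UW  = 1.0      # power to acquire the data
WRITE_UW    = 1000.0   # power while writing to the flash card
COMPRESS_UW = 2.0      # power spent compressing on-chip
RATIO       = 10.0     # compression ratio -> 10x less write time

uncompressed = ACQUIRE_UW + WRITE_UW                      # 1001 uW
compressed   = ACQUIRE_UW + COMPRESS_UW + WRITE_UW / RATIO  # 103 uW

print(uncompressed / compressed)   # ~9.7x reduction
```

The point generalises: whenever storage or transmission dominates the budget, spending a little extra compute to shrink the data is almost always a net win.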
Odland said Ambiq was the first company to build large-scale chips using sub-threshold technology. “We targeted the wearable market first,” he said. “They wanted low power, the market is new and fragmented without large incumbents. The people in there are open to change. But it is not the only space. We think our approach can work in any type of microcontroller application.”
One of the other difficulties in achieving low power is security. There is a lot of paranoia - some justified - about the IoT being a route for hackers to attack everything from a home network to critical medical applications. However, adding security to devices also adds power consumption.
“You have to optimise for low power and ensure you have the security and encryption,” said Bickley. “This is a design challenge going forward.”
Without sensors there would be no IoT. Sensors are what bring in the data, but Bickley feels sensor technology is lagging behind and that is causing power consumption problems. At the heart of his worry is that most sensors are analogue and the data they collect need to be digital.
“You need extra circuitry to switch the analogue to digital,” he said. “So we are looking for a rapid conversion to digital low power sensors. That will be a big challenge. STMicroelectronics with its MEMS sensors is driving the market forward. We are asking other sensor makers to build analogue-to-digital conversion into their sensors so you get digital out. It is beginning to happen.”
An interesting piece of research is happening at the Eindhoven University of Technology, where researchers have developed a tiny wireless temperature sensor powered by the radio waves of its own wireless network. This means the sensor needs no wiring and no battery that would have to be replaced. The sensor contains an antenna that captures energy from the router and stores it; once there is enough, the sensor switches on, measures the temperature and sends a signal back to the router.
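A sketch shows how such a harvest-then-burst sensor paces itself. Every number below - the harvested power, the storage capacitance and the usable voltage window - is a hypothetical assumption chosen only to illustrate the shape of the calculation, not a figure from the Eindhoven work:

```python
# How long must an RF-harvesting sensor charge before it can take a
# reading? All values are hypothetical, for illustration only.

HARVEST_UW  = 1.0      # assumed microwatts captured from the router's RF field
CAP_F       = 100e-9   # assumed 100 nF storage capacitor
V_ON, V_OFF = 2.0, 1.0 # assumed usable voltage window on the capacitor

# Energy stored while charging from V_OFF up to V_ON:
# E = 1/2 * C * (V_on^2 - V_off^2)
energy_j = 0.5 * CAP_F * (V_ON**2 - V_OFF**2)

# Time to harvest that energy at HARVEST_UW microwatts
seconds = energy_j / (HARVEST_UW * 1e-6)
print(seconds)   # 0.15 s between measurements under these assumptions
```

The measurement rate thus falls directly out of the harvested power and the energy cost of one measure-and-transmit cycle - halve the available RF power and the sensor simply reports half as often.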
There is a long way to go before the IoT on the scale envisaged becomes a reality, and one of the big hurdles is power consumption. Innovative research such as that at the Eindhoven University of Technology is pointing the way, and no doubt this and other forms of energy harvesting will play a big part. Also encouraging is the design of systems that let end devices spend long periods in deep-sleep mode. Work such as this, along with developments in network architecture and protocols, suggests this hurdle will be overcome, and in a relatively short time.