How to handle the data explosion

15th January 2014
Nat Bowers

Steve Rogerson, reporting from China, looks at how industry is gearing up to cope with the predicted explosion in data, by taking intelligence to the edge.

As momentum gathers for Internet of Things applications, the need for more data processing and storage is set to grow exponentially. This was one of the key messages from November’s Advantech World Partner Conference in Suzhou, China.

Taiwanese company Advantech is pinning its future on IoT, and the associated M2M, cloud computing and smart city fields, to the extent that it believes these will trigger a doubling in turnover for the company from $1bn today to $2bn within five years. However, key to this growth will be a move to more intelligence in edge computing so that data can not only be received but acted on without human intervention.

“We are moving from machine-to-operator to machine-to-machine,” Magic Pao, Advantech Product Manager, told the conference. “We have to move from reaction to prediction. We have to do more with the data, not just check what is happening now but predict what will happen.”

Magic Pao, Product Manager, Advantech: “We have to move from reaction to prediction.”

This message was picked up by Ben Lee, the company’s Associate Director in the USA, who said that processing all that data would require an immense amount of digital signal processing. For example, he said that five million years of video would be crossing global networks each month by 2017. Video traffic will account for more than two-thirds of consumer internet traffic, and internet video-to-TV traffic will account for 14%. Even today, the amount of video uploaded every second would take five years to watch. And video-on-demand traffic by 2017 will be equivalent to six million DVDs per month.

“Traditional pay TV will eventually be replaced,” said Lee. “Internet and cloud technologies are letting agile IPTV producers enter the market. Video is going to dominate the internet.”

This, he said, meant that telecoms operators were going to need more computing power for media processing within their infrastructures. In security, video surveillance will need more analytics support, not just acquisition.

Ben Lee, Advantech Associate Director: “We need to see innovation in the video market.”

Other areas that will need more digital signal processing as a result of increased video use include test and automation in factories, for applications such as optical inspection. Mission critical applications such as sonar, radar and drone programmes will need more DSP in real time. And video conferencing is moving towards higher resolution and larger screens.

Medical imaging

Advantech Senior Director, Linda Tsai, picked up the story: “We have a growing and ageing population,” she said. “The world’s citizens travel 30 billion miles and by 2050 this will be more than 150 billion miles. The population will grow from 7.2 billion today to 9.6 billion by 2050.” She said this would increase the use of intelligent transportation and road surveillance.

“Because there are more people in the cities, we have to ensure their security,” she said. “We have to combine data for vehicles and passengers, and this has to be done 24/7 because the roads never shut down.”

Linda Tsai, Senior Director, Advantech: “Because there are more people in the cities, we have to ensure their security.”

Much of this will put heavy strains on traditional centralised cloud computing as the demand for storing video in different formats grows. There are also bandwidth issues.

“We need to find creative ways to reduce this,” said Lee. “We need to see innovation in the video market. This means we have to provide more value, not just in hardware but in software. And it has to be scalable and compatible with different form factors.”

He said this meant integrating various technologies into hybrid systems with encoders and transcoders that can handle different formats. Newer codecs such as HEVC will give higher compression and thus smaller file sizes. He also said there would be more need for adaptive bit-rate streaming, video segmentation and content-delivery-network technologies to handle this.
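
To make the adaptive bit-rate idea concrete, the sketch below shows the core of a rendition-selection step as a generic Python example; the bitrate ladder, safety margin and function name are hypothetical and are not drawn from any Advantech or operator implementation. The player simply picks the highest-bitrate rendition of a segmented stream that fits within the bandwidth measured while fetching the previous segment.

```python
# Minimal sketch of adaptive bit-rate rendition selection.
# The bitrate ladder and the measured throughput are illustrative values only.

# Available renditions of the same segmented stream, in bits per second.
BITRATE_LADDER = [500_000, 1_500_000, 3_000_000, 6_000_000]

def choose_rendition(measured_throughput_bps, safety_margin=0.8):
    """Pick the highest bitrate that fits comfortably within the measured bandwidth."""
    budget = measured_throughput_bps * safety_margin
    usable = [b for b in BITRATE_LADDER if b <= budget]
    # Fall back to the lowest rendition if even that exceeds the budget.
    return max(usable) if usable else BITRATE_LADDER[0]

if __name__ == "__main__":
    # Throughput observed while fetching the previous segment (4 Mbit/s here).
    print(choose_rendition(4_000_000))  # -> 3000000, the 3 Mbit/s rendition
```

Real streaming clients apply the same idea with additional smoothing and buffer-occupancy rules, but the per-segment decision shown above is the essence of adaptive bit-rate delivery.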

However, Advantech is taking two different approaches to handling DSP in its products, and there is a clear disagreement among the proponents of each. On one hand, the company has committed itself to using DSPs from, for example, Texas Instruments in some of its products; on the other, some product groups believe that general-purpose processors are more than adequate.

Peter Marek, Senior Director, Advantech: “People want to use the best of both worlds.”

“Processes that were once handled by DSPs can now be handled by general-purpose Intel processors,” said Peter Marek, a Senior Director at Advantech. “This lets you repurpose equipment and that produces advantages for some providers.”

However, he acknowledged there was ‘still a space for special purpose architectures’, adding: “People want to use the best of both worlds.”

The answer to this, according to Mark Nadeski, DSP Business Development Manager at Texas Instruments, is to use SoCs that combine DSP with general processing.

“The need for increased processing brings more challenges,” he said. “And you need to be able to re-use what you develop. The answer is the SoC. And it needs to be scalable.”

Mark Nadeski, DSP Business Development Manager, Texas Instruments: “The need for increased processing brings more challenges.”

He said that using a building block approach to design could speed time to market because the blocks could be re-used in different designs. And he pointed out that product differentiation could come from software.

“This is important when you have to do a lot of your own differentiation,” he said. “Software programmability can be very advantageous in some instances. But you need to help customers do this. You can have development boards and libraries to make it easier.”

Rick Dwyer, a Vice President with Intel, pointed to a problem: most of the data being collected is unstructured. He said this meant that databases needed to evolve and that far more processing needed to happen at the edge rather than in the cloud. However, he said that 85% of the devices that would be connected within the next five years were already installed and the infrastructure was in place.

Rick Dwyer, Vice President, Intel: “You don’t have to wait.”

“You don’t have to wait,” he said. “We have the opportunity today to provide gateways to collect data and apply analytics to equipment that is already installed.”

Dwyer’s colleague, Jim Robinson, General Manager at Intel, added: “The real value of the Internet of Things is extracting data from devices in a way that allows us to make actionable decisions, to enable business transitions.” He said that while now most of the analytics was happening in the cloud, this needed to change.

“It needs to be distributed,” he said, “because we do not have enough network bandwidth and storage capabilities to move all the data to the cloud. And we can’t wait for batch processing in the cloud for mission critical applications. More intelligence has to move to the edge devices. You need distributed analytics to enable the Internet of Things.”

Jim Robinson, General Manager, Intel: “You need distributed analytics to enable the Internet of Things.”

Aaron Su, an Advantech Product Manager, believes that when it comes to edge computing, ARM technology is the only way to go: “Smart cities will require a lot of computing,” he said. “If you want it to be everywhere you need a lot of different devices that are small and low power. This is why ARM technology can be the best choice for this. Intel’s Atom is good but you don’t need all the features. ARM is 100% customisable for the application. You just have the functions you need, so it is the most cost effective.”

A good example of edge computing, he said, was measuring water levels in reservoirs. If the level was too high, the device could start the pumps without human intervention.

“This is providing local computing power rather than using the cloud,” he said. “This is why ARM is very important. If you put an Atom in the box you would need a fan to cool it down.”

Aaron Su, Product Manager, Advantech: “If you put an Atom in the box you would need a fan to cool it down.”
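
A minimal sketch of the kind of local control loop Su describes might look like the Python below, assuming a reservoir gateway with a level sensor and a pump relay; the sensor and pump functions are simulated placeholders, not an Advantech API. The decision to start or stop the pumps is made on the device itself, with hysteresis between high and low thresholds, so no cloud round trip is needed.

```python
import random
import time

# Hypothetical hardware hooks: on a real gateway these would wrap the board
# vendor's GPIO/ADC or fieldbus drivers. Here they are simulated stand-ins.
def read_water_level_cm():
    return random.uniform(380, 470)

def set_pump(running):
    print("pump", "on" if running else "off")

HIGH_WATER_CM = 450   # start pumping above this level
LOW_WATER_CM = 400    # stop once the level falls back below this (hysteresis)

def control_loop(poll_seconds=10):
    """Local control: act on the sensor reading immediately, no cloud round trip."""
    pumping = False
    while True:
        level = read_water_level_cm()
        if not pumping and level >= HIGH_WATER_CM:
            set_pump(True)
            pumping = True
        elif pumping and level <= LOW_WATER_CM:
            set_pump(False)
            pumping = False
        time.sleep(poll_seconds)

if __name__ == "__main__":
    control_loop(poll_seconds=1)
```

The point of the example is the placement of the logic: only summary events, not every sensor reading, would need to travel to the cloud, which is why a small, low-power edge device is sufficient.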

However, increasing processing power in edge computing will put even more strain on network providers that are still struggling to cope with the explosion in the number of smartphones. Dealing with a similar if not bigger growth in M2M traffic will create serious headaches.

“Network providers were not ready for smartphones,” said Marek. “It took them half a year to a year to build up their networks. This continues to be a challenge. They can’t keep up, so they are changing their business models where the big users get premium service and the rest get what is left.”

He pointed out that if you make a voice call and the network is down, you may be upset but you can try later; with M2M this might not be an acceptable option: “With M2M, network downtime can put you out of business,” he said. “Some of this can’t be compromised. Our customers are the network equipment providers and we have to keep up with this traffic explosion. Our customers are scaling up their equipment.”

He said a lot of these customers had consolidated what they were doing just to survive but now they needed to develop new products and that time-to-market pressure was getting worse.

“We can no longer wait to get the contract before we start the development work,” he said. “We need to come up with leading-edge technology to help them build intelligence into their networks. And we face the same time-to-market pressure that our customers face.”

And he said that Advanced TCA (ATCA), which had become the base standard for network equipment providers, was now straining under the pressure.

“ATCA is very powerful, but not powerful enough for some of the applications,” he said. “With a terabit firewall, you need to break some of the limits of ATCA. But you can’t afford to do a full custom platform. So you need a semi-custom platform that re-uses some of the components of ATCA. This will not be completely new but takes ATCA up to a new level.”
