Raspberry Pi provides the key to the industrial data centre

5th September 2022
Kiera Sowery

Industry 4.0 is a movement with important ramifications for the architecture of industrial-computing systems, writes Cliff Ortmeyer, Global Head of Technical Marketing at Farnell.

A key direction in Industry 4.0 is the merging of information technology (IT) with the operational technology (OT) that coordinates machine tools and process control systems on the shopfloor. Increased communication between these two halves of a manufacturing business makes it possible for suppliers to react to market changes much faster. A second driver is the use of IT-OT integration to control processes in the manufacturing plant in a far more data-driven manner.

Instead of building fixed production lines where components flow through a series of predetermined steps as the product is assembled, the route is determined in real time based on the requirements of the product and the availability of production machinery at that time. Two variants may be processed by different machines or manufacturing cells based on which components need to be integrated and the finish applied to each of them. In principle, each finished product that emerges from the factory is customised to the buyer’s specific requirements. Mass customisation is perceived to be a driver of increased profitability and market share for the companies that embrace it.

Flexibility is key to Industry 4.0  

Flexibility is a critical factor which applies as much to the computing infrastructure that oversees the shopfloor as to the machine tools and systems that transport work in progress to the different manufacturing cells. In the past, the focus was on islands of self-contained automation based around machines and robots running fixed programs to complete specific tasks. Industry 4.0 systems need a much greater degree of coordination not just across the factory floor but with the back-office systems that schedule and dispatch orders. At the same time, the faster networks needed to support this IT-OT integration can be used to enhance the functionality of the machines and robots already in the plant.

In this environment, it becomes possible to run tasks that need higher processing power, such as AI-assisted visual inspection of assembly quality or coatings, without demanding major upgrades to the industrial computers and control systems attached to the machine tools. Instead, where latency is important, these tasks can be passed to flexible computers on the shopfloor. Alternatively, if local processing is insufficient, tasks can be passed to systems in the cloud, with local processing power used to compress the images and videos to minimise transfer time and cost.

The systems may determine the best approach in real time, balancing the use of local compute resources against response times. If a relatively simple AI model running on a local system finds a problem with a coating process, the data may be passed on to the cloud for deeper analysis to check that it is not a fault with the machinery or the raw materials.
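
As a rough illustration of this local-versus-cloud routing, the Python sketch below escalates an inspection task to the cloud when the local queue is too deep or when the local model's confidence is low. The thresholds, the lightweight local model and the cloud hand-off are all hypothetical placeholders, not details taken from any system described above.

```python
# Minimal sketch of latency-aware task routing between a shopfloor node and
# the cloud. All thresholds and functions here are illustrative placeholders.
import random
import time

LOCAL_QUEUE_LIMIT = 8      # assumed backlog limit before preferring the cloud
CONFIDENCE_FLOOR = 0.90    # below this, escalate the result for deeper analysis

def local_inspect(image_id: str) -> float:
    """Stand-in for a small on-device model; returns a confidence score."""
    time.sleep(0.01)                      # simulate a fast local inference
    return random.uniform(0.7, 1.0)

def cloud_inspect(image_id: str) -> None:
    """Stand-in for compressing the image and handing it to a cloud service."""
    print(f"{image_id}: sent to cloud for full analysis")

def route(image_id: str, local_queue_depth: int) -> None:
    # Prefer the low-latency local path while the node has spare capacity.
    if local_queue_depth >= LOCAL_QUEUE_LIMIT:
        cloud_inspect(image_id)
        return
    confidence = local_inspect(image_id)
    if confidence < CONFIDENCE_FLOOR:
        # A borderline local verdict is escalated, as described above.
        cloud_inspect(image_id)
    else:
        print(f"{image_id}: passed local inspection ({confidence:.2f})")

if __name__ == "__main__":
    for i in range(5):
        route(f"frame-{i:03d}", local_queue_depth=random.randint(0, 12))
```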

Networked processing to improve product quality

Using networked processing, it becomes easier to augment manufacturing cells with additional sensors and the algorithms that can be used to improve overall quality. For example, changes in temperature and humidity can affect chemical processes that result in changes in paint quality or the adhesion between surfaces. Instead of upgrading machinery to include these sensors, adopters of Industry 4.0 practices can attach the sensor interfaces to computer modules that process the data into forms that can be used by the existing machine tools. This is an approach that British bicycle manufacturer Brompton has adopted at its factory. The company used Raspberry Pi modules to augment product tracking around the plant without having to change the core production machinery. The company now has more than 100 low-cost Pi modules in the plant wherever additional sensing and data-capture facilities are needed.
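
A minimal sketch of this sensor-augmentation pattern might look like the following Python loop, which samples ambient conditions on a Pi module and republishes them as timestamped JSON that existing plant systems can consume. The sensor read is simulated here; a real module would call the driver for whatever sensor is attached, and the station name is a made-up example.

```python
# Minimal sketch: a Pi module samples temperature and humidity and emits
# timestamped JSON records. The sensor read is simulated; a real deployment
# would use the driver for its attached sensor (e.g. an I2C device).
import json
import random
import time
from datetime import datetime, timezone

def read_environment() -> dict:
    """Hypothetical stand-in for an I2C temperature/humidity sensor."""
    return {
        "temperature_c": round(random.uniform(19.0, 24.0), 2),
        "humidity_pct": round(random.uniform(40.0, 60.0), 1),
    }

def sample_loop(station_id: str, period_s: float = 5.0, samples: int = 3) -> None:
    for _ in range(samples):
        record = {
            "station": station_id,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            **read_environment(),
        }
        # In production this line would publish to a message broker or a
        # supervisory system; printing keeps the sketch self-contained.
        print(json.dumps(record))
        time.sleep(period_s)

if __name__ == "__main__":
    sample_loop("paint-cell-3")
```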

In Singapore, electronics manufacturing services specialist Jabil has adopted the Raspberry Pi as its core hardware platform. Jabil’s factory digitalisation programme enables applications to be easily developed, tested and deployed securely onto the shopfloor systems no matter where they are. The common thread in these use cases for the Raspberry Pi lies in the high degree of software compatibility between different versions of the hardware and the core capabilities of the hardware itself. That makes it far easier to move applications to where they are required or where compute resource is available.

Software and hardware integration

The Raspberry Pi has key advantages for the next stage in Industry 4.0. Networks of low-cost but high-capability computers must integrate to form a factory data centre where applications can move to where they are needed and even migrate from one system to another as demands change. In this environment, software tools and components originally developed for cloud data centres are adapted for use in systems running at the edge, whether these edge systems are attached directly to production machinery, located in cabinets on the factory floor or housed in micro data centres close by.

Based on the Cortex-A series of microprocessor cores developed by Arm, every Raspberry Pi is a multicore computer with a full memory-management unit, enabling it to run operating systems such as Linux that rely on virtual-memory support. The boards also offer a rich complement of high-speed I/O ports and can be powered through a convenient USB port. The Raspberry Pi 4, for example, is based around a Broadcom quad-core SoC with 1MB of shared level-two cache, combined with dual HDMI video outputs to support operator interfaces, and Gigabit Ethernet.

The support for Linux inherent to the Raspberry Pi is vital to the ability of industrial customers to access the development and deployment benefits of cloud computing technologies. Key elements of today’s cloud systems are virtualisation and containerisation. Virtualisation provides a way to separate applications from each other more effectively than if they are hosted by the same operating system.

Under virtualisation, applications running in one operating system cannot access the memory of those running in another. All accesses to I/O or to memory outside an application’s own virtual address space are intercepted by a hypervisor. That is a major advantage in industrial systems where some software will be critical to a machine’s function but may need to run alongside third-party software to provide other functions. If updates are needed to one, they can be performed without affecting the other components. Furthermore, the support for TrustZone inherent to the Arm Cortex-A architecture provides mechanisms for checking the authenticity of software images before they are even installed as well as controlling their access to I/O at runtime.

Power of containerisation and orchestration  

Virtualisation imposes an overhead on the processor because of the need to intercept memory and I/O accesses. In response, the cloud-computing industry developed the more resource-efficient technology of containerisation. There are a number of security features built directly into Linux that can be used to prevent applications from corrupting memory outside their own address space and which do not incur the overhead of hypervisor actions. A second important feature of container technologies, such as Kata or Docker, is that it is possible to package all the libraries and system-level functions that an application needs into one image that can be transferred to any compatible machine on the network. Those libraries can be completely different from those needed by a container running on the same machine. As the container isolates the application from hardware and system-software differences, the result is a mechanism for ensuring that applications can run wherever there are spare compute resources.
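
As an illustration, a container image for a hypothetical Python inspection application might be defined with a Dockerfile along these lines; the file names and dependencies are placeholders rather than anything taken from the deployments described above.

```dockerfile
# Minimal sketch of a container image for a hypothetical Python inspection
# application. Everything the app needs (interpreter, libraries, code)
# travels inside the image, so it runs identically on any compatible node.
FROM python:3.11-slim

WORKDIR /app

# Install the application's own libraries; these can differ freely from
# those used by other containers running on the same machine.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY inspect.py .

CMD ["python", "inspect.py"]
```

Because the base image is published for arm64 as well as x86-64, the same definition can be built for Raspberry Pi nodes and for desktop development machines alike.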

In the cloud, Kubernetes, an open-source orchestration tool created by Google, is used to load, run, move and remove containers automatically based on rules set by systems administrators. Similar tools are now moving to edge systems through projects such as Arm’s Project Cassini. The combination of containerisation and orchestration lets industrial users maximise the use of their computer systems and make them futureproof. New capacity can be added easily through the addition or upgrading of processor modules. The Raspberry Pi platform means they can do this in a highly cost-effective manner and take advantage of the built-in I/O and networking capabilities to ensure all the systems in this distributed industrial data centre can communicate with each other.
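
To make the orchestration side concrete, the sketch below shows a minimal Kubernetes Deployment that keeps two replicas of that hypothetical inspection container running and, via a node selector, schedules them only onto 64-bit Arm nodes such as Raspberry Pis. The image name and resource limits are placeholders.

```yaml
# Minimal sketch of a Kubernetes Deployment for the hypothetical inspection
# container; the orchestrator restarts and reschedules replicas as needed.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: visual-inspection
spec:
  replicas: 2
  selector:
    matchLabels:
      app: visual-inspection
  template:
    metadata:
      labels:
        app: visual-inspection
    spec:
      nodeSelector:
        kubernetes.io/arch: arm64   # schedule only onto the Arm (Pi) nodes
      containers:
        - name: inspector
          image: registry.example.com/inspection:1.0   # placeholder image
          resources:
            limits:
              cpu: "1"
              memory: 512Mi
```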

The result is that Raspberry Pi is far more than a low-cost platform for high-speed processing. Thanks to the migration of cloud technologies into the manufacturing space, it can be the basis for the industrial data centre that is needed to unify IT and OT under the Industry 4.0 umbrella.
