Sensors must evolve to make Industry 4.0 workable
Rahman Jamal, Technology and Marketing Director for Europe at National Instruments and spokesman for the Sensors for Digital Production working group, looks at the building blocks that will define the impact of Industry 4.0 on manufacturing.
The successful evolution of Industry 4.0 from a promising concept into a practical reality will be heavily contingent upon the quality of data that can be acquired and shared between the component parts of a digital production system. This will require a new generation of sensor technology that is not yet available to engineers.
Industry 4.0 is entirely dependent on sensors. They are the sense organs of machines, supplying the data used in production. Knowledge about the process must be made available in production: to the people and, above all, to the machines. This can only be achieved with sensors that supply current data about process states and machine status, which is why sensors are the key to implementing Industry 4.0 in production technology.
One of the challenges to be overcome is that, in many cases, sensors cannot be integrated directly into a production environment because they are essentially laboratory-based solutions, are very costly or lack online capability. In addition, sensor output signals can be ambiguous. For example, the signal from a force sensor measuring process forces can be misinterpreted if information about the positions of the workpiece and the tool is lacking. Another factor is that, even though individual sensors can measure process quantities and supply signals for process monitoring, additional signal processing is always necessary to enable model-based interpretation of product quality or of the states of tools, machines and aids.
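The ambiguity of a lone force signal can be illustrated with a toy classifier. This is a minimal sketch, assuming a hypothetical function, threshold and labels (none are from the article): the same force level is interpreted quite differently once the tool position context is known.

```python
# Sketch: a raw force reading alone is ambiguous; pairing it with tool/workpiece
# context lets a simple model distinguish normal cutting from a fault.
# Function name, 5 N threshold and labels are illustrative assumptions.

def classify_force_sample(force_n, tool_in_cut):
    """Interpret a force reading using workpiece/tool context."""
    if force_n < 5.0:
        return "idle"          # negligible load either way
    if tool_in_cut:
        return "cutting"       # high force while engaged: normal process
    return "collision?"        # high force outside the cut: raise an alarm

samples = [(2.0, False), (120.0, True), (80.0, False)]
labels = [classify_force_sample(f, engaged) for f, engaged in samples]
print(labels)  # ['idle', 'cutting', 'collision?']
```

Note that the identical 80 N reading would be "cutting" with the tool engaged; only the positional context separates the two cases.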
Closing the gap
Sensor research and development is following two main thrusts in a bid to close these gaps. The first is the development of integrated sensors intended to supply a higher level of information by directly evaluating the sensed data. With suitable models, information over and above simple signals can be delivered to the next entity: the machine controller and thereby the operator, or the process planning level.
The second is the development of multi-sensor systems that allow several measurement quantities to be acquired within one system. The combination of these two thrusts drives sensor fusion: intelligent signal processing of the multiple measurement quantities of a multi-sensor system.
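As a minimal illustration of sensor fusion, several sensors observing the same quantity can be combined by weighting each reading by the inverse of its noise variance. The numbers and the inverse-variance scheme are assumptions for this sketch, not the working group's method:

```python
# Inverse-variance fusion sketch for a multi-sensor system: each sensor
# reports the same quantity with its own noise variance; the fused estimate
# weights each reading by 1/variance, and the fused variance shrinks below
# that of the best single sensor. All values are made up.

def fuse(readings):
    """readings: list of (value, variance) pairs -> (fused value, fused variance)."""
    weights = [1.0 / var for _, var in readings]
    fused_value = sum(w * v for w, (v, _) in zip(weights, readings)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance

# Three sensors measuring the same temperature around 100 degrees C
value, var = fuse([(99.8, 0.04), (100.3, 0.09), (100.0, 0.01)])
print(value, var)
```

The fused variance (about 0.0073 here) is smaller than the best individual sensor's 0.01, which is the practical payoff of combining redundant channels.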
These developments illustrate the ways in which new sensor solutions for production can be researched and developed by interdisciplinary collaboration with the natural sciences. A common thread among many examples is that further developments are necessary with regard to industrial use in the context of digital production. Other factors to consider include cost reduction, miniaturisation with integrated sensors and enhancement of internal signal processing.
Current discussions about system development always touch upon the capabilities of networking and multiple use of process data, which are covered by the term ‘Cyber-Physical Production Systems’ (CPPS). Similarly, a Cyber-Physical Sensor System (CPSS) is a network of multiple sensors based on autonomous fusion. The foundation for networking is the adaptive system data that the CPPS supplies to the sensor system. Taking optimisation goals into account, CPSSs deliver to the CPPS the process data necessary for building workable process models.
The biggest challenge with networking is managing the generated data and information. This challenge was described as follows in NI Data Acquisition Technology Outlook 2013 on the subject of data acquisition: “Differentiation is no longer about who can collect the most data; it’s about who can quickly make sense of the data they collect.”
Test and production data usually contains very valuable information for assessing a process or product. Loss of this data through inadequate logging inevitably leads to high costs: without it, cost-effective decisions cannot be made, particularly when designing new processes.
For fast location and retrieval as well as reliable documentation, it is therefore necessary to store descriptive data together with the measurement data. Several standardised data formats provide an ideal data structure for this, allowing data management systems to be built quickly and economically with commercial systems without sacrificing traditional database functionality. As access to measurement data, information and key indicators must in future also occur outside enterprise boundaries, work is currently under way to develop the technical data cloud.
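The principle of keeping descriptive data alongside the raw measurement can be sketched as follows. The field names and file name are illustrative assumptions; standardised formats such as NI's TDMS organise measurement files in a comparable hierarchy with properties attached at each level:

```python
# Sketch: pairing descriptive data (who/what/when, units, sample rate) with
# the raw measurement so the record stays searchable and self-documenting.
# All field values and the file name are hypothetical.
import json
import time

record = {
    "descriptive": {                       # metadata enabling later retrieval
        "machine": "mill-07",
        "operator": "j.doe",
        "tool_id": "T42",
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "units": "N",
        "sample_rate_hz": 10_000,
    },
    "measurement": [101.2, 103.5, 99.8],   # raw force samples (shortened)
}

with open("run_0001.json", "w") as f:      # hypothetical file name
    json.dump(record, f, indent=2)

# Later: locate a run by its descriptive fields before touching raw data
with open("run_0001.json") as f:
    loaded = json.load(f)
print(loaded["descriptive"]["machine"])
```

Because the descriptive block travels with the samples, a data management system can index and query runs by machine, tool or time window without parsing the raw channels first.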
Cloud computing is a form of distributed computing with dynamic, demand-oriented IT infrastructures that provide processing power, data storage, network bandwidth or services. The advantage for enterprises is that short-term resource needs do not lead to a costly and permanent expansion of the IT infrastructure.
It is important that each party only receives the data it needs for its work. Otherwise there is a risk of ‘dead’ data sets that are not usable but simply stored on server systems. Quantified documentation of significant process parameters must be created within operations. This also requires the development of strategies for managing and archiving measurement data. The increasing measurement resolution of sensors combined with higher time resolution poses a challenge for data processing. The limiting factors here are the transmission speed and latency of current bus systems in machine tools. Provision of information at various execution levels also leads to changes in requirements for real-time processing.
For hard real-time requirements, future sensor systems will have to process data into information themselves. The resulting information can then be provided to actuator components or a higher-level target system for further processing. For applications that are not time critical, calculated characteristics can be sent to database systems using standard network protocols. A host-less CPPS sensor could also write a direct data stream to a local storage device. Consequently, systems with higher reconfigurability and scalability, based on real-time processors and FPGA technology, must be used to ensure the greatest possible flexibility. With regard to CPPS, manufacturers of sensors and actuators can only provide a hardware platform, because customer requirements vary greatly. For this reason, the goal should be to develop open, easily integrated embedded sensor platforms. This would enable users to add new functions to instruments and actuators by developing their own algorithms.
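For the non-time-critical path described above, the following sketch shows how a sensor node might reduce a raw buffer to a few characteristic values before shipping only those onward. The feature set, sensor name and JSON payload layout are assumptions for illustration:

```python
# Sketch: a sensor node condenses a raw sample buffer into a handful of
# characteristics (mean, RMS, peak) and serialises them for transmission
# over a standard protocol. Payload layout and names are hypothetical.
import json
import math

def characteristics(samples):
    """Reduce a buffer of samples to a few summary values."""
    n = len(samples)
    mean = sum(samples) / n
    rms = math.sqrt(sum(x * x for x in samples) / n)
    peak = max(samples, key=abs)           # largest-magnitude sample
    return {"mean": mean, "rms": rms, "peak": peak}

buffer = [0.0, 1.0, -2.0, 1.0]
payload = json.dumps({"sensor": "force-1", **characteristics(buffer)})
print(payload)
```

Sending a payload of three numbers instead of the full buffer is what relaxes the bandwidth and latency constraints of the bus systems mentioned earlier, at the cost of discarding the raw waveform.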
Customer demand for intelligent products that are flexibly extensible can already be seen now in many areas of the entertainment industry. These intelligent devices can be adapted to individual customer needs with the use of apps. Many of these platforms also allow users to program their own apps. The resulting applications can be sold or made available to the user community for free. Future devices will be open, reconfigurable and user-adaptable platforms that lead to the formation of communities. These user communities will generate new application areas and functions for the manufacturer by developing apps.
As part of the Aachen Machine Tool Colloquium, the Laboratory for Machine Tools and Production Engineering (WZL), Kistler and National Instruments developed a demonstrator, based on NI CompactRIO and NI LabVIEW, to acquire, analyse and present process information in real time. The process information resulting from an appropriate sensor fusion could then be accessed through a ‘technical cloud’ via mobile devices used by production floor operators.