Wired or wireless: which is best for sensor deployment?
Taking sensor measurements from real-world processes has always been fundamental to data acquisition systems. Some would even argue that defining the points to be measured was the starting point from which systems were defined, and ultimately dictated what the deployed systems could and could not do.
By Paul O’Shaughnessy, Channel Sales Manager, Advantech IIoT
This is no less true as IoT technologies are adopted in the transition to Industry 4.0. In fact, the adoption of these new technologies increases the demand for more and more measurement data, as users include an ever-increasing number of factors into the optimisation of their processes and equipment.
This hunger to add new data brings its own challenges and some significant differences. The inherent flexibility of IoT systems means that the required data measurements evolve over time, requiring strategies to integrate pre-existing measurement points with new ones, which can often be physically removed from any pre-existing wiring or communication links.
In such cases, the cost of adding the new measurement points, or more normally the cost of recovering the data from these points, can often be the deciding factor in whether a desired system enhancement will make economic sense. It is therefore understandable that engineers are interested in the deployment of wireless measurement systems, as these can often be installed much more quickly and cost effectively than those involving the laying and routing of additional wired infrastructure.
Deploying wireless sensing technology involves compromises, however, and it is the effect of these compromises that ultimately determines if a wired or wireless architecture is selected and, if the latter, which radio technology is most suitable.
The case for wired infrastructure
The traditional way sensors have been connected to data acquisition systems is via wired interfaces, typically 4-20mA current loops, but also, in the case of more specialised measurements, via higher speed voltage-based systems. These connections can carry power to the sensors; they are reliable and accurate, offer faster measurement transmission and update times than wireless systems and, if correctly installed, are both highly secure and insensitive to noise and other interference. These characteristics come at a cost, however.
The cost of installation can be very high, often involving alterations to buildings or digging trenches in which to lay the cables. In some cases, cabling to a sensor that is geographically remote may not be physically practical at all. It is also usually not possible to use wired interfaces to mobile equipment, unless the range of movement is limited and can be accurately forecast.
The evolution of remote I/O
Telemetry has been used for many years to alleviate some of these limitations. By moving the physical sensor interface closer to the sensor, digitising its data at that point, and then sending the digitised data onwards via a serial or network data connection, we can significantly reduce the cost of installation whilst also reducing noise susceptibility, especially where multiple sensor interfaces exist at a single point. It also becomes possible to use radio to transmit the digitised data, extending the reach of data acquisition systems to geographically remote sites.
The definition of ‘remote I/O’ is very broad. For the purposes of this discussion, we will apply it to any device which interfaces to sensors and digitises the resulting measurements before onwards transmission. This means we encompass traditional Remote Telemetry Unit (RTU) and remote I/O devices using protocols such as Modbus, but also USB devices aimed at lab measurements and fieldbus-based data acquisition nodes, as well as more modern incarnations, such as IoT edge and gateway devices. The use of any such device starts to introduce compromises that must be considered.
- Digitising data implicitly places a constraint on the precision that can be recovered. Does the remote I/O device convert the data with enough resolution (i.e. number of bits) to be able to extract the nuances in the measurement required by any upstream analytics?
- Is the quality of the measurement (accuracy, repeatability, noise immunity etc.) sufficiently good?
- Transmitting data either serially or via a network takes a finite amount of time. This time is affected by the amount of data to be transmitted, the amount of protocol overhead needed to encapsulate the data, the speed of the communications media being used, and the number of other devices sharing the same connection. Can the data be recovered quickly and frequently enough to provide the necessary insights, and to avoid the risk of significant measurement spikes being missed if they occur between samples?
- In transmitting data across communication networks, there is always a chance that a data packet may be delayed, received multiple times, or lost. How significant will any of these events be to the process being analysed or controlled?
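As a rough illustration of the resolution and timing questions above, the sketch below works through two back-of-envelope calculations: the smallest current step an n-bit converter can resolve on a 4-20mA loop, and the worst-case interval between samples of any one point when several devices share a serial link. All device counts, message sizes and baud rates here are illustrative assumptions, not figures for any particular product.

```python
# Resolution: the smallest current change an n-bit ADC can distinguish
# across a 4-20 mA loop (a 16 mA span). Illustrative figures only.

def adc_lsb_ma(bits: int, span_ma: float = 16.0) -> float:
    """Smallest resolvable step (one LSB) in milliamps."""
    return span_ma / (2 ** bits)

for bits in (12, 16):
    print(f"{bits}-bit converter: 1 LSB = {adc_lsb_ma(bits):.5f} mA")

# Update rate: with N devices polled in turn over one shared serial
# link, the worst-case time between samples of any single point is
# roughly N polling cycles. Assumes ~10 bits on the wire per byte
# (start + 8 data + stop) plus a fixed per-poll turnaround overhead.
def worst_case_interval_s(n_devices: int, bytes_per_poll: int,
                          baud: int, overhead_s: float = 0.005) -> float:
    per_poll = bytes_per_poll * 10 / baud + overhead_s
    return n_devices * per_poll

# e.g. 30 polled devices, 16-byte exchanges at 9600 baud (assumed)
print(f"Worst case: {worst_case_interval_s(30, 16, 9600):.2f} s between samples")
```

The point of the second calculation is that a spike shorter than that worst-case interval can pass entirely unseen, which is exactly the risk the bullet above describes.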
Additional considerations for wireless systems
If the digitised information is to be transmitted onwards using a wireless solution, there are other things that must be considered:
- How will power be provided to the remote device? Simple devices may be able to operate from internal batteries for an acceptable period, but many wireless devices may still need cables to be laid to provide power and so not completely avoid the costs of wiring.
- How immune is the radio technology to interference? How secure are the transmissions from eavesdropping, or from malicious actors blocking signals, or impersonating devices to provide false information?
- Do the radios operate in controlled, licensed spectrum, or do they use licence-free frequencies? If the latter, how is transmission to be protected against accidental disruption by other systems operating on the same frequencies?
- What are the ongoing costs of the communications provision? The costs of wired systems are normally limited to the capital cost (Capex) of purchasing and installing the equipment, but many radio-based systems involve an ongoing operational expense (Opex), for example for cellular service provision.
- Can reliability be improved by adopting systems with radio diversity within the chosen technology (i.e. automatic routing of signals via multiple path options, for example as found in ‘mesh’ systems)?
- What is the working range of the wireless technology in the environment in which it will be installed? Theoretical ‘line of sight’ maximum distances can be significantly longer than real-world ‘practical’ distances, especially if transmitters are located in buildings, valleys or underground. Similarly, environments where there is a lot of movement of large metal assets (e.g. forklift trucks) and/or water-filled objects (e.g. human beings) can cause difficulties due to the constantly changing radio absorption, reflection and interference profile.
- What public or shared infrastructure (e.g. cellular networks) will the data have to transit across to reach its destination? How is access control performed, and data privacy ensured?
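The range question above can be given a rough first-pass answer with a free-space link budget: the standard Friis path-loss formula gives a best-case figure, and the remaining margin indicates how much loss from walls, terrain and moving obstacles the link can tolerate before it fails. The radio parameters below (transmit power, antenna gain, receiver sensitivity) are assumed LoRa-class values for illustration, not taken from any datasheet.

```python
import math

# Free-space path loss in dB (distance in km, frequency in MHz).
# This is a best-case figure; real deployments lose considerably
# more to buildings, terrain and changing obstacles.
def fspl_db(distance_km: float, freq_mhz: float) -> float:
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Link margin: a comfortably positive result suggests the link should
# close in free space; a small or negative one means it will not.
def link_margin_db(tx_dbm: float, antenna_gains_db: float,
                   rx_sensitivity_dbm: float,
                   distance_km: float, freq_mhz: float) -> float:
    return (tx_dbm + antenna_gains_db
            - fspl_db(distance_km, freq_mhz)
            - rx_sensitivity_dbm)

# Assumed LoRa-class numbers: 14 dBm Tx, 4 dB total antenna gain,
# -137 dBm receiver sensitivity, operating at 868 MHz.
print(f"Margin at 10 km: "
      f"{link_margin_db(14, 4, -137, 10.0, 868):.1f} dB")
```

A large free-space margin is what lets sub-GHz technologies quote multi-kilometre ranges; the same calculation at 2.4 GHz, or with tens of dB of building penetration loss subtracted, shows why indoor ‘practical’ ranges are so much shorter.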
So, which is best?
As you may already have realised, recovering data from sensors into IoT systems involves a balance of measurement performance, transmission speed, security, power, range and cost. It may not come as a big surprise, therefore, to learn that there is no ‘best’ solution. Wired systems typically offer the best levels of resilience and speed of acquisition, but these are increasingly being matched by newer fieldbus-based systems.
For lower speed applications, wireless systems can offer a fast and cost-effective way to add further measurement points within an existing installation, provided they can operate effectively in the target environment. They can be especially useful in overlay applications, where data is recovered from select points within existing installations, and the existing systems are not easily interfaced to third-party systems.
Things get more complicated when considering geographically remote data acquisition. The best technology to apply will depend not only on the factors outlined above, but also upon how many data points exist at the location. With a small number of sensors, it may be sensible to use native radio-based sensors such as those designed for networks such as NB-IoT, LoRa or SigFox. If more sensors are present, then it is more likely that some form of input aggregating edge device, accepting inputs from all sensors but passing the resulting digitised values back through a single radio channel, will be a more effective solution.
This is especially true in cases where the edge device has intelligence, enabling it to provide front-end filtering and transformation of the raw sensor data prior to transmission: instead of transmitting low-value repetitive data, it transmits significant events which implicitly have higher value as actionable information… but that’s another article.
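One common form of the front-end filtering just described is report-by-exception with a deadband: the edge device transmits a reading only when it moves meaningfully away from the last value sent, plus an occasional heartbeat so upstream systems know the sensor is alive. The sketch below is a minimal illustration of that idea; the class name, deadband width and heartbeat interval are all illustrative assumptions.

```python
# Report-by-exception: send a reading only when it leaves a deadband
# around the last transmitted value, or when a heartbeat interval
# (counted in samples here) elapses without any transmission.

class DeadbandReporter:
    def __init__(self, deadband: float, heartbeat_samples: int = 60):
        self.deadband = deadband
        self.heartbeat = heartbeat_samples
        self.last_sent = None      # last value actually transmitted
        self.since_sent = 0        # samples seen since last transmit

    def should_send(self, value: float) -> bool:
        self.since_sent += 1
        if (self.last_sent is None
                or abs(value - self.last_sent) > self.deadband
                or self.since_sent >= self.heartbeat):
            self.last_sent = value
            self.since_sent = 0
            return True
        return False

# Six temperature readings; only the first value and significant
# moves (> 0.5 of a degree from the last sent value) go on air.
reporter = DeadbandReporter(deadband=0.5)
readings = [20.0, 20.1, 20.2, 21.0, 21.1, 19.9]
sent = [v for v in readings if reporter.should_send(v)]
print(sent)  # prints [20.0, 21.0, 19.9]
```

For a battery-powered radio node, suppressing the repetitive readings in this way directly extends battery life, since the radio transmission is usually the dominant energy cost.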
In many cases, the question should not be a case of ‘either-or’. Often, the best results are achieved by mixing technologies, optimising for each measurement point as you go. Advantech offers a range of wired and wireless sensor interfaces as both standalone devices and integrated within industrial computers and gateways. Covering the full range of available technologies including hardwired, fieldbus, serial, networked and radio via LTE, NB-IoT, mesh, and LoRaWAN, we can help determine the ideal combination of devices to deploy in any application.