
Cable goes mobile – or is it the other way round?

9th December 2016
Anna Flockett

It seems almost laughable now to think that voice was the driving force behind the development of the original telecommunications services. How surprised those early pioneers of telephony and telegraphy, Alexander Graham Bell, Thomas Edison and Guglielmo Marconi, would be to find teenagers wandering around, mobile phone in hand, downloading and watching videos as they go.


Data is the key to almost all developments in communications these days and the mobile phone is the enabler: the device through which almost everyone, business users and consumers alike, accesses the Internet. Voice is almost – but not quite – a thing of the past!

From the outset, fixed Internet access to homes and businesses was provided by cable, since the basic telecommunications network was connected by twisted pair cable runs. And why not? There was no need for mobility in these applications. In any case, mobile telephony only really began to emerge in the early 1980s and did not achieve widespread popularity until the mid-1990s.

The disadvantage of cable is, of course, that the signal it carries degrades over distance. Even so, the twisted pair cable used in telephony applications proved up to the task of hosting low data rate applications. The situation was improved – i.e. higher data rates could be supported – by using coaxial cable that had initially been developed for cable TV. The best performance of course comes from fibre optic cable, where signal degradation is negligible even over runs of tens of kilometres, but fibre is disproportionately expensive to install and maintain.

To get around this problem and reap the benefit of fibre’s high bandwidth while minimising the cost of deployment, a variety of half-way house schemes have been adopted. In the latest, fibre is run to the distribution point or street cabinet and conventional copper cable completes the run to the customer premises. Bandwidth is optimised using a signalling technique called G.fast, developed from the digital subscriber line (DSL) technology introduced in the late 1990s as networks began carrying digital signals.

More recently, however, it has become apparent that, even in fixed communication environments, LTE, operating in the mobile spectrum, is increasingly the backhaul technology of choice – the means of connecting the core network to peripheral sub-networks. While this certainly has the advantage of providing universal, common access to the cloud and the other services users demand, it might seem strange given the limitations on spectrum for mobile technology, the wide availability of cable and the much higher bandwidth offered by fibre optic cable.

The disparity between the bandwidths available from the two technologies (LTE and cable) is diminishing rapidly as the LTE specification evolves. This is happening in step with the convergence of the communication requirements of domestic and business users: increasingly there is little or no difference between them, as the boundary between home and office blurs for business users and the demands of domestic consumers grow.

Part of the reason for this narrowing gap is that the data rates possible with cable are being realised by employing techniques similar to those used to increase LTE data rates. Rates in the Gbit/s range will be feasible from both technologies within the next couple of years, with LTE rates being driven up by the standards-setting agenda of 3GPP, the Third Generation Partnership Project – the collaboration between telecoms partners responsible for specifying LTE.

Maybe it was obvious that technologists would try applying the same or similar techniques and approaches in order to squeeze higher performance out of different technologies. It’s akin to what goes on in the medical world: using tried and tested drugs on different diseases for which there is currently no cure or remedy can often produce surprising and beneficial results.

In the telecoms and datacomms world this sharing of techniques also contributes to a slightly surprising – but ultimately satisfying – result: the convergence of the fixed and mobile communications environments. Fixed-Mobile Convergence (FMC) has been touted before but never actually realised. The pressing need to squeeze as much bandwidth as possible out of scarce radio spectrum is changing that.

There are other drivers contributing to this blended approach: the cost of, and regulation associated with, maintaining buried copper cable runs and digging new ones; environmental considerations; and the fact that cellular base stations are often already in situ.

Cisco, in its Visual Networking Index (VNI) Global Mobile Data Traffic Forecast Update issued in February this year, commented: “Mobile offload exceeded cellular traffic for the first time in 2015. 51% of total mobile data traffic was offloaded onto the fixed network through WiFi or femtocell in 2015. In total, 3.9EB of mobile data traffic were offloaded onto the fixed network each month”. It sees the trend continuing, reaching 55% (38.1EB/month) by 2020.
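Taken at face value, those figures also reveal how quickly total mobile-generated traffic is growing. A quick back-of-envelope sketch (assuming the quoted offload percentages apply to the sum of cellular and offloaded traffic – an assumption, not something stated in the forecast) gives the implied totals:

```python
# Back-of-envelope check of the Cisco VNI figures quoted above.
# Assumption: the offload share is a fraction of total mobile-generated
# traffic, i.e. cellular traffic plus traffic offloaded to the fixed network.

def implied_total_traffic(offloaded_eb_per_month: float, offload_share: float) -> float:
    """Return the total mobile-generated traffic (EB/month) implied by the
    offloaded volume and the share of traffic that was offloaded."""
    return offloaded_eb_per_month / offload_share

total_2015 = implied_total_traffic(3.9, 0.51)   # ~7.6 EB/month in 2015
total_2020 = implied_total_traffic(38.1, 0.55)  # ~69.3 EB/month forecast for 2020

print(f"2015 implied total: {total_2015:.1f} EB/month")
print(f"2020 implied total: {total_2020:.1f} EB/month")
print(f"Implied growth: ~{total_2020 / total_2015:.0f}x over five years")
```

On that reading, total mobile-generated traffic grows roughly ninefold between 2015 and 2020 – a useful reminder of why offloading onto the fixed network matters so much to operators.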

Much of this has to do with mobile data usage in the home, where users have fixed broadband or WiFi access points or are served by operator-owned femtocells and picocells, and where much of the data-hungry video consumption takes place. Cisco’s report predicts that three-quarters of the world’s mobile data traffic will be video by 2020, up from 55% in 2015.

Some operators are jumping onto the mix-and-match LTE and Internet bandwagon with one-stop-shop packaged offerings. Smart home technology, mobile/fixed telephony, Internet – they’re all there from operators such as AT&T. Maybe FMC really is here to stay this time, driven by the consistently upward demand for data.

By Charles Sturman, Senior Principal Product Strategy, u-blox.
