It seems almost laughable now to think that voice was the driving force behind the development of the original telecommunications services. How surprised those early pioneers of telephony and telegraphy, Alexander Graham Bell, Thomas Edison and Guglielmo Marconi, would be to find teenagers wandering around, mobile phone in hand, downloading and watching videos as they go.
Data is the key to almost all developments in communications these days and the mobile phone is the enabler: the device through which almost everyone, business users and consumers alike, accesses the Internet. Voice is almost – but not quite – a thing of the past!
From the outset, fixed Internet access to homes and businesses was provided by cable, since the basic telecommunications network was connected together by twisted pair cable runs. And why not? There was no need for mobility in these applications. In any case, mobile telephony only really started to emerge in the early 1980s and did not find true popularity until the mid-1990s.
The disadvantage of cable is, of course, that the signal it carries degrades over distance. Even so, the twisted pair cable used in telephony proved up to the task of hosting low data rate applications. The situation was improved – enabling higher data rates to be supported – by using coaxial cable, initially developed for cable TV. The best performance of course comes from fibre optic cable, where signal degradation is negligible even over runs of tens of kilometres, but fibre is disproportionately expensive to install and maintain.
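The trade-off between the three media can be made concrete with a rough link-budget calculation. The attenuation figures below are illustrative assumptions chosen for the sketch (real copper and coax loss also rises steeply with frequency); only the order-of-magnitude contrast with fibre matters here:

```python
# Rough received-signal sketch for different cable types.
# Attenuation values are illustrative assumptions, not datasheet figures.
ATTENUATION_DB_PER_KM = {
    "twisted_pair": 15.0,  # assumed, at DSL-band frequencies
    "coaxial": 5.0,        # assumed
    "fibre": 0.2,          # typical order for single-mode fibre
}

def received_power_dbm(tx_power_dbm: float, medium: str, distance_km: float) -> float:
    """Transmit power minus cumulative cable loss over the run."""
    return tx_power_dbm - ATTENUATION_DB_PER_KM[medium] * distance_km

# Over a 2 km run from a 10 dBm source, twisted pair loses 30 dB,
# coax 10 dB, and fibre a negligible 0.4 dB.
for medium in ATTENUATION_DB_PER_KM:
    print(medium, round(received_power_dbm(10.0, medium, 2.0), 1))
```

Even at tens of kilometres the fibre loss stays small, which is why long runs are fibre's territory despite its installation cost.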
To get around this problem and reap the benefit of fibre's high bandwidth while minimizing the cost of deployment, a variety of half‑way house schemes have been adopted. In the latest, fibre runs to the distribution point or street cabinet and conventional cable completes the run to the customer premises. Bandwidth is optimized through a signalling technique called G.fast, developed from the digital subscriber line (DSL) technology introduced in the late 1990s as digital signals started to be carried over the copper network.
More recently, however, LTE, operating in the mobile spectrum, has begun to be used even in fixed communication environments as the backhaul technology of choice – the way of connecting the core network to the peripheral sub‑networks. While this certainly has the advantage of providing universality and commonality of access to the cloud and the other services users demand, it might seem strange, given the limitations on spectrum for mobile technology, the wide availability of cable and the much higher bandwidth of fibre optic cable.
The disparity between the bandwidths available from the two technologies (LTE and cable) is diminishing rapidly as the LTE specification evolves. This is perhaps no coincidence, given the converging communication requirements of domestic and business users: increasingly there is little or no difference between the two, as the boundary between home and office blurs for business users and the demands of domestic consumers grow.
Part of the reason the gap is narrowing is that the data rates possible with cable are being achieved by employing techniques similar to those used to increase the data rates of LTE. Rates in the gigabits per second will be feasible from both technologies in the next couple of years, with LTE rates driven up by the standards‑setting agenda of 3GPP, the Third Generation Partnership Project, the collaborative effort between telecoms partners tasked with designing LTE.
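One concrete example of a technique shared across the two worlds is higher-order QAM modulation: LTE-Advanced added 256-QAM, while DOCSIS 3.1 cable supports up to 4096-QAM. A minimal sketch of the headline effect, ignoring coding overhead and channel conditions:

```python
import math

def bits_per_symbol(qam_order: int) -> int:
    """Raw bits carried per symbol by M-QAM, ignoring coding overhead."""
    return int(math.log2(qam_order))

# Each step up in QAM order packs more bits into the same symbol rate:
# 64-QAM carries 6 bits/symbol, 256-QAM carries 8, 4096-QAM carries 12.
for m in (64, 256, 1024, 4096):
    print(f"{m}-QAM: {bits_per_symbol(m)} bits/symbol")
```

Going from 64-QAM to 4096-QAM thus doubles raw spectral efficiency, which is one reason both camps can push towards gigabit rates without new spectrum or new cable.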
Maybe it was obvious that technologists would try applying the same or similar techniques and approaches in order to squeeze higher performance out of different technologies. It’s akin to what goes on in the medical world: using tried and tested drugs on different diseases for which there is currently no cure or remedy can often produce surprising and beneficial results.
In the tele- and datacomms world this sharing approach also contributes to a slightly surprising – but ultimately satisfying – result: the convergence of the fixed and mobile communications environments. Fixed‑mobile convergence (FMC) has been touted before but never actually realized. The desperate need to squeeze as much bandwidth as possible out of scarce radio spectrum is changing that.
There are other drivers contributing to this blended approach: the cost of and regulation concerned with maintaining copper cable runs under the ground and digging new cable runs, environmental considerations – and the fact that cellular base stations are often already in situ.
Cisco, in its Visual Networking Index (VNI) Global Mobile Data Traffic Forecast Update issued in February 2016, comments: “Mobile offload exceeded cellular traffic for the first time in 2015. 51% of total mobile data traffic was offloaded onto the fixed network through Wi‑Fi or femtocell in 2015. In total, 3.9 exabytes of mobile data traffic were offloaded onto the fixed network each month”. It sees the trend continuing to reach 55% (38.1 exabytes/month) by 2020.
Much of this has to do with mobile data usage in the home, where users have fixed broadband or Wi‑Fi access points or are served by operator‑owned femtocells and picocells. A large share of data‑hungry video consumption takes place in the home: Cisco’s report predicts that three quarters of the world’s mobile data traffic will be video by 2020, up from 55% in 2015.
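The Cisco figures quoted above can be cross-checked with some back-of-envelope arithmetic; the sketch below uses only the numbers in the report and derives the implied total mobile-generated traffic and its growth rate:

```python
# Sanity check on the Cisco VNI offload figures quoted above.
# If 3.9 EB/month was 51% of mobile-generated traffic in 2015, the total
# (cellular + offloaded) was:
total_2015_eb = 3.9 / 0.51   # ≈ 7.6 EB/month

# The 2020 forecast: 38.1 EB/month offloaded at a 55% share.
total_2020_eb = 38.1 / 0.55  # ≈ 69.3 EB/month

# Implied compound annual growth rate over the five-year forecast window:
cagr = (total_2020_eb / total_2015_eb) ** (1 / 5) - 1

print(f"2015 total: {total_2015_eb:.1f} EB/month")
print(f"2020 total: {total_2020_eb:.1f} EB/month")
print(f"Implied CAGR: {cagr:.0%}")  # roughly 55% per year
```

A compound growth rate of roughly 55% a year underlines the “consistently upward demand for data” driving the convergence argument.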
In time, the boundary between fixed‑line and wide‑area‑wireless service will blur significantly, with in‑home broadband and remote Wi‑Fi access points served by LTE backhaul, and cellular connected devices offloading through various access points. Some operators are jumping onto the mix‑and‑match LTE/Internet bandwagon with a one‑stop‑shop packaged offering. Smart home technology, mobile/fixed telephony, Internet – they’re all there from operators such as AT&T, Vodafone and DT. Maybe FMC is really here to stay this time, driven by the consistently upward demand for data.