
Quality on 4G Networks Is Like an Attractive Person Nobody Wants to Date for Too Long (Available in Spanish)

It's Like Having a Big Motor under the Hood but Wooden Wheels Supporting the Car

 

When the first releases of LTE and WiMAX were presented to the general public, it was clear that both technologies held an unprecedented opportunity to provide service quality in a way that previous access technologies could only dream of, and everybody seemed to follow that line of thinking, making QoS (Quality of Service) a hot topic. I have addressed the advantages of this feature in previous blogs, and I have even argued that MNOs can use it as an effective tool to create differentiation and brand loyalty. But many factors slow down and hinder the full development of quality in 4G networks, ranging from the regulation of Internet neutrality to the absence of applicable cross-quality-domain specifications that would make quality a transparent feature between networks.

 

Impressed by the Looks, Disappointed by the Personality

Image taken from: http://anromaway.wordpress.com/2012/03/01/10-things-to-remember-when-delivering-customer-service/

 

Discussions about quality are not new. Transport technologies have struggled to come up with a usable concept of quality and have produced several frameworks to implement it, but, as in the case of IP transport, these have not been very popular. MNOs have committed to implementing quality since the first traffic-differentiation features of GSM, WCDMA and other 2G and 3G technologies, but it has never been a true differentiator for them; it has been more of an optimization option left entirely to the discretion of the network provider.

4G came as an answer to a change in Internet usage habits, in which users started to rely less on their desktops and more and more on their portable devices. That change hit MNOs hard, because users began consuming bandwidth-heavy services outside the control of the network provider, and when you manage a resource as complicated as the electromagnetic spectrum, quality seemed like the way to regain some control over how network resources were being used. Some even considered it the answer to recovering some of the profit flowing into the pockets of content providers. But when you look at the situation in detail you find a very difficult business case for quality, more so when you consider that the app frenzy and the empowerment of devices have started to push the service outside the network provider's boundaries in several ways.

Part of the philosophy of 4G is to reach across several networks so the MNO can retain control of the user, but that created yet another problem: quality in a cross-domain environment. Finally, to complicate things further, once you have some quality implementation in your network you begin to realize that it is not enough to secure quality in one portion of the network; you have to secure it across the whole telecommunications infrastructure. So the end-to-end (E2E) quality challenge comes into the picture, and with it the conclusive thought that it is really not network quality but user experience quality you have to be good at in order to remain relevant in the business.

 

So You Bought It, Took It Home, Plugged It In, Felt Scammed, and Now You're Taking It Back to the Dealership

 

Well, not really. The 4G standardization bodies have done their homework: strong frameworks have been proposed to develop the concept of quality. LTE, for example, defined a cohesive architecture spanning the LTE access and the EPC to support the EPS bearer, which is the finest granularity of quality in that technology. Both LTE and WiMAX have also made noticeable efforts to develop consistent procedures for mapping quality toward legacy technologies, so previous releases can work seamlessly with the latest ones. As interest in quality grows, the research becomes more involved, and looking toward NGN and future generations of wireless cellular technologies the panorama seems pretty interesting. Finally, current 4G deployments already include a potent quality scheme to control traffic flows through the network.
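To make the EPS bearer idea more concrete, here is a minimal sketch of how application flows could be mapped to LTE QoS Class Identifiers (QCIs). The QCI characteristics are the commonly cited values from 3GPP TS 23.203 (quoted from memory, so verify them against the release you target), and the classification logic itself is purely illustrative.

```python
# Minimal sketch: mapping application flows to LTE QoS Class Identifiers (QCI).
# The QCI characteristics below follow the commonly cited 3GPP TS 23.203 table
# (quoted from memory -- verify against the release you target); the flow
# classification logic itself is purely illustrative.

from dataclasses import dataclass

@dataclass(frozen=True)
class QciProfile:
    qci: int
    resource_type: str      # "GBR" or "Non-GBR"
    priority: int           # lower number = higher scheduling priority
    delay_budget_ms: int    # packet delay budget
    loss_rate: float        # packet error/loss rate target

QCI_TABLE = {
    "voice":         QciProfile(1, "GBR",     2, 100, 1e-2),
    "video_call":    QciProfile(2, "GBR",     4, 150, 1e-3),
    "ims_signaling": QciProfile(5, "Non-GBR", 1, 100, 1e-6),
    "default":       QciProfile(9, "Non-GBR", 9, 300, 1e-6),
}

def bearer_for(app_type: str) -> QciProfile:
    """Pick the EPS-bearer QoS profile for an application type (illustrative)."""
    return QCI_TABLE.get(app_type, QCI_TABLE["default"])

if __name__ == "__main__":
    for app in ("voice", "web_browsing"):
        print(app, "->", bearer_for(app))
```

Anything not listed falls back to the default bearer (QCI 9), which is exactly the best-effort treatment most over-the-top traffic receives today.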

 

Complexity Is Not Something You Can Sell Easily

 

So what is so different about 4G networks that makes quality such a hard pill to swallow? For starters, the access segment is planned to support several layers of cellular coverage formed by picocells, femtocells and relaying nodes, a setup known as a heterogeneous network. To handle interference and resource control in such a configuration, the user's device (UE) must play an active role, feeding key measurements back to the eNB; in other words, the control signaling and overhead will be significant. If we consider some of the proposed frameworks for E2E quality, the UE is also expected to provide specific information periodically, which means even more signaling: more resources for control and fewer for user traffic. That is without mentioning that current devices have built-in connection managers that are totally outside the control of the MNO. Closely related to this last issue, there is a real threat to a service whose quality has been assured for as long as cellular networks have existed: voice in a 4G network runs over IP and can be deployed as an application inside a device whose OS prioritizes processes according to the software running on its processor. Going further, in the case of LTE the EPC is equipped to handle traffic coming from non-3GPP accesses, but other technologies, such as current WiFi, are not ready to map quality parameters as expected.
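To give a feel for the signaling cost mentioned above, here is a back-of-the-envelope sketch of the uplink capacity consumed by periodic UE measurement reports. Every number in it (users per cell, report size, reporting period, uplink capacity) is an assumption chosen for illustration, not a value from any 3GPP specification.

```python
# Back-of-the-envelope sketch of the signaling cost of periodic UE measurement
# reports in a dense heterogeneous cell. Every number here is an assumption
# chosen for illustration, not a value from any 3GPP specification.

def reporting_overhead(users_per_cell: int,
                       report_bytes: int,
                       report_period_s: float,
                       uplink_capacity_bps: float) -> float:
    """Return the fraction of uplink capacity consumed by measurement reports."""
    report_bps_per_user = (report_bytes * 8) / report_period_s
    total_report_bps = report_bps_per_user * users_per_cell
    return total_report_bps / uplink_capacity_bps

if __name__ == "__main__":
    # Assumed scenario: 200 active users, 100-byte reports every 200 ms,
    # 25 Mbit/s of usable uplink in the cell.
    frac = reporting_overhead(users_per_cell=200,
                              report_bytes=100,
                              report_period_s=0.2,
                              uplink_capacity_bps=25e6)
    print(f"~{frac:.1%} of uplink spent on measurement reporting")
```

Even with these modest assumptions a few percent of the uplink goes to reporting, and the figure grows quickly as the reporting period shrinks or the user count rises.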

 

The EPC has the functional entities to enforce and control the resources assigned to an application in terms of the EPS bearer, but there is a transport network that carries the communication between those functional entities, toward the access, and of course the user traffic: an all-IP network that has struggled for a long time to implement quality, with little luck. The current frameworks, DiffServ and IntServ, have proved to deliver very modest results for applications with stringent delay and jitter requirements in the first case, and serious scalability problems in the second. If we look beyond the edge router of the EPC we find the Internet, the biggest IP network of all, which has never needed quality to be successful; that is where the servers hosting the applications reside, and needless to say, network operators have no control whatsoever over the Internet.
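In practice, the usual compromise on that all-IP transport is to map bearer classes onto DiffServ code points. The sketch below shows what such a QCI-to-DSCP mapping could look like; the specific pairings are illustrative assumptions rather than a normative table, since operators define their own mappings.

```python
# Minimal sketch of how EPS-bearer classes might be mapped onto DiffServ code
# points on the IP transport between EPC nodes. The specific QCI->DSCP pairs
# below are illustrative assumptions, not a normative table; operators define
# their own mappings, so treat these values as placeholders.

DSCP = {"EF": 46, "AF41": 34, "AF21": 18, "BE": 0}

QCI_TO_DSCP = {
    1: DSCP["EF"],     # conversational voice -> expedited forwarding
    2: DSCP["AF41"],   # conversational video -> assured forwarding
    5: DSCP["AF41"],   # IMS signaling
    9: DSCP["BE"],     # default bearer -> best effort
}

def mark_packet(qci: int) -> int:
    """Return the DSCP value to stamp into the outer IP header (illustrative)."""
    return QCI_TO_DSCP.get(qci, DSCP["BE"])

print(mark_packet(1), mark_packet(9))   # 46 0
```

Of course, the marking only has meaning as far as the operator's own routers; past the EPC edge router the Internet treats everything as best effort, which is exactly the E2E gap described above.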

 

The Hard Issues to Solve Are the Ones That Matter the Most

 

I have just mentioned complexity as a separate challenge for quality, but next I'd like to list the challenges that I see as the main barriers to implementing it as planned in the 4G technology releases:

 

  1. Internet neutrality regulation is such a delicate subject that no matter how you put it, if you start to apply priority-based traffic engineering, sooner or later you will have to explain why you are not handicapping the customer's right to access information. I have proposed an approach centered on the user, or as I call it now, user-requested quality.
  2. E2E quality, from the user device to the application server, cannot be implemented or realized without a large CAPEX commitment, and so far churn seems to be the only way to justify expenditure on quality, yet relating one to the other is a nearly titanic endeavor. Additionally, radio propagation is "capricious": you cannot sell quality by telling users that sometimes they will receive what they paid for and sometimes they will not.
  3. Device manufacturers have found a way to master the radio control layer of wireless accesses. Current, and not so current, smartphones ship with a software connection manager programmed, and even remotely controlled, by the manufacturer, largely outside the reach of network providers; so whatever strategy an MNO implements in the network, it had better take the device connection manager into account.
  4. Quality became so trendy that every standardization, research and regulation body has its own agenda for addressing it. 4G was conceived to congregate different technologies under one big umbrella, so separate efforts cannot be beneficial to it. Very few cross-domain proposals exist, and although one might think that with IP as the common transport layer things would be easier, the current state of implementation shows otherwise. Worth mentioning is the IETF proposal for an extensible IP signaling protocol suite called NSIS.
  5. Paradoxically, the user comes last when quality is on the table. I mentioned earlier that it is difficult to sell quality to the final user, but it is even more challenging when no one cares to make it easy for them. The regulation agencies of several countries have failed completely in this respect: requiring MNOs to publish performance indicators in engineering units tells the user nothing. A better approach would be to put those indicators in terms of time, expressing unavailability or periods outside the accepted mean download speed, or better yet, to group performance measurements into quality indicators that aggregate these figures and present a more general picture of the network or the service, as in the sketch after this list.
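A minimal sketch of what point 5 suggests: turning raw engineering measurements into figures a subscriber can actually read. The thresholds and the composite weighting are invented for illustration; a regulator or operator would define its own.

```python
# Sketch of point 5: turning raw engineering measurements into indicators a
# subscriber can actually read. The thresholds and the weighting are invented
# for illustration; a regulator or operator would define their own.

def user_facing_indicators(speed_samples_mbps, advertised_mbps, outage_minutes,
                           period_minutes=30 * 24 * 60):
    """Aggregate raw KPIs into two plain-language figures and one 0-100 score."""
    below = sum(1 for s in speed_samples_mbps if s < advertised_mbps)
    pct_time_below_plan = 100.0 * below / len(speed_samples_mbps)
    unavailability_pct = 100.0 * outage_minutes / period_minutes

    # Illustrative composite: start from 100 and penalize both figures.
    score = max(0.0, 100.0 - 0.5 * pct_time_below_plan - 5.0 * unavailability_pct)
    return pct_time_below_plan, unavailability_pct, score

pct_below, unavail, score = user_facing_indicators(
    speed_samples_mbps=[8, 12, 3, 10, 9, 2, 11], advertised_mbps=10,
    outage_minutes=90)
print(f"{pct_below:.0f}% of the time below the advertised speed, "
      f"{unavail:.2f}% unavailability, quality score {score:.0f}/100")
```

"You were below your plan 57% of the time last month" says far more to a subscriber than a table of RSRP or latency percentiles ever will.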

 

It Is Very Rewarding When You Find a Purpose for an Item You Considered Unusable

 

Considering all these factors, I don't see quality as a feasible feature in 4G networks... not as the industry tries to present it, at least. Even if history says you must start by providing quality to the network, MNOs must shift the focus and start providing quality to the user, or what many call Quality of Experience (QoE). Contrary to what many think, QoE can be implemented in parallel with network quality and can be independent of the underlying technology. The quality of experience can benefit from the tools network technology provides: once an event is detected, a quality process must be put in action to improve the service the user experiences, and that action relies on the network technology to accomplish it. But that is just the reactive part of quality. The proactive part considers the continuous optimization of network parameters, the right scheduling of preventive maintenance, the proper training of customer service staff, proper problem escalation procedures and, why not, the right agreements with partners and device manufacturers. All of these together can make the difference everyone is looking for from quality.

From the current technology point of view, the task for regulators is to make quality an easier measure to understand. Another point is the business case: it is clear that a wireless network operator cannot always guarantee the level of quality a user will receive, but the services and applications that could exploit the advanced capabilities of current 4G technology have not been created yet (e.g., context-aware applications). MNOs must start to think of these opportunities as aggregated services that take advantage of current network capabilities before other actors do.
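As an illustration of the reactive part described above, here is a toy sketch that watches a per-user experience metric and triggers a corrective process when it degrades. The metric, thresholds and action hook are all hypothetical placeholders, not part of any standard.

```python
# Sketch of the reactive half of a QoE process: watch a per-user experience
# metric and trigger a corrective action when it degrades. The metric,
# thresholds, and the action hook are all hypothetical placeholders.

QOE_FLOOR = 3.0          # assumed minimum acceptable score on a 1-5 MOS-like scale

def estimate_qoe(throughput_mbps: float, latency_ms: float) -> float:
    """Toy QoE estimator: rewards throughput, penalizes latency (illustrative)."""
    return max(1.0, min(5.0, 1.0 + throughput_mbps / 5.0 - latency_ms / 200.0))

def on_degradation(user_id: str, score: float) -> None:
    """Placeholder for the corrective process (re-prioritize bearer, open ticket...)."""
    print(f"user {user_id}: QoE {score:.1f} below floor -> trigger quality process")

def monitor(samples):
    for user_id, throughput, latency in samples:
        score = estimate_qoe(throughput, latency)
        if score < QOE_FLOOR:
            on_degradation(user_id, score)

monitor([("alice", 12.0, 40.0), ("bob", 2.0, 350.0)])
```

The interesting part is not the toy formula but the hook: whatever sits behind on_degradation (a bearer re-prioritization, a trouble ticket, a callback from customer care) is where the network-side tools and the proactive processes listed above actually earn their keep.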

 

For more, follow me, @jomaguo

 

Read this post in Spanish


