
Quality on 4G Networks Is Like an Attractive Person Nobody Wants to Date for Too Long (Available in Spanish)

It Is Like Having a Big Motor Under the Hood but Wooden Wheels Supporting the Car

 

When the first releases of LTE and WiMAX were presented to the general public, it was clear that both technologies held an unprecedented opportunity to provide service quality in a way that was only dreamed of in previous access technologies, and everybody seemed to follow that line of thinking, making QoS (Quality of Service) a hot topic. I have addressed the advantages of this feature in previous blogs, and I have even argued that MNOs can use it as an effective tool to create differentiation and brand loyalty. But many factors slow down and hinder the full development of quality in 4G networks, ranging from the regulation of Internet neutrality to the absence of applicable cross-quality-domain specifications that would make quality a transparent feature between networks.

 

Impressed by the Looks, Disappointed by the Personality


Image taken from: http://anromaway.wordpress.com/2012/03/01/10-things-to-remember-when-delivering-customer-service/

 

Discussions about quality are not new. Transport technologies have struggled to come up with a usable concept of quality and have presented several frameworks to implement it, but, as in the case of IP transport, these have not been very popular. MNOs have committed to implementing quality since the first traffic-differentiation features of GSM, WCDMA and other 2G and 3G technologies, but it has never been a true differentiator for them; it has been more of an optimization option left entirely at the discretion of the network provider.

4G, however, came as an answer to a change in Internet usage habits, in which users started to use their desktops less and their portable devices more and more. That change hit MNOs hard, because users began consuming bandwidth-heavy services outside the control of the network provider. When you have a complicated resource to manage, as the electromagnetic spectrum is, quality seemed like the way to regain some control over how network resources were being used; some even considered that it could be the answer to recovering part of the profit that was going into the pockets of content providers.

But when you look at the situation in detail you find a very difficult business case for quality, even more so when you consider that the app frenzy and the empowerment of devices have started to make the service spill outside the network provider's boundaries in several ways. Part of the philosophy of 4G is to reach across several networks so the MNO can retain control of the user, but that creates yet another problem: quality in a cross-domain environment. Finally, to complicate things further, once you have some quality implementation in your network you begin to realize that it is not enough to secure quality in one portion of the network; you must secure it across the whole telecommunications infrastructure. That is how the end-to-end (E2E) quality challenge comes into the picture, and with it the conclusive thought that it is not really network quality but user experience quality that you have to be good at in order to remain relevant in this business.

 

So You Bought It, Took It Home, Plugged It In, Felt Scammed, and Now You Take It Back to the Dealership

 

Well, not really. The 4G standardization bodies have done their homework: strong frameworks have been proposed to develop the concept of quality. LTE, for example, defined a cohesive architecture spanning the LTE access and the EPC to support the EPS bearer, which is the finest granularity of quality in that technology. Both LTE and WiMAX have also made noticeable efforts to develop consistent procedures for mapping quality toward legacy technologies, so previous releases can work seamlessly with the latest ones. As interest in quality grows, the research becomes more involved, and looking toward NGN and future generations of wireless cellular technology the panorama seems quite interesting. Finally, current 4G network implementations already have a potent quality scheme to control traffic flows through the network.
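To make the bearer-based model a bit more concrete, here is a minimal sketch of how an EPS bearer's QCI might be translated into a DiffServ code point on the transport network. The QCI descriptions follow 3GPP TS 23.203, but the DSCP choices are my own illustrative assumptions; 3GPP does not standardize this mapping and every operator defines its own.

```python
# Illustrative sketch: mapping EPS bearer QCIs to DiffServ code points (DSCP).
# QCI descriptions follow 3GPP TS 23.203; the DSCP values are assumptions,
# since the QCI-to-DSCP mapping is operator-specific and not standardized.

QCI_TO_DSCP = {
    1: 46,   # Conversational voice (GBR)   -> EF
    2: 36,   # Conversational video (GBR)   -> AF42
    5: 40,   # IMS signalling (non-GBR)     -> CS5
    8: 10,   # Buffered streaming, TCP      -> AF11
    9: 0,    # Default bearer, best effort  -> BE
}

def dscp_for_bearer(qci: int) -> int:
    """Return the DSCP to mark on transport packets carrying this bearer.

    Unknown QCIs fall back to best effort (DSCP 0).
    """
    return QCI_TO_DSCP.get(qci, 0)

if __name__ == "__main__":
    for qci in (1, 5, 9, 99):
        print(f"QCI {qci:>2} -> DSCP {dscp_for_bearer(qci)}")
```

The point of the table-driven approach is simply that the bearer, not the packet, is the unit of quality inside the 4G domain; everything beyond the edge router has to be expressed in IP terms.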

 

Complexity Is Not Something You Can Sell Easily

 

So what is so different about 4G networks that makes quality a hard pill to swallow? For starters, the access segment is planned to support several layers of cellular coverage formed by picocells, femtocells and relaying nodes, a setup known as a heterogeneous network. To handle interference and resource control in such a configuration, the user's device (UE) must play an active role, providing key measurements back to the eNB; in other words, the control signaling and overhead will be significant. If we consider some of the proposed frameworks for E2E quality, the UE is also expected to report specific information periodically, which means more signaling, more resources devoted to control and fewer to user traffic. And that is without mentioning that current devices have built-in connection managers that are totally outside the control of the MNO. Closely related to this last issue, there is a real threat to a service whose quality has been assured for as long as cellular networks have existed: voice in a 4G network runs over IP and can be deployed as an application inside a device that prioritizes processes according to the OS software running on its processor. Going further, in the case of LTE the EPC is equipped to handle traffic coming from non-3GPP accesses, but other technologies, such as current WiFi, are not ready to map quality parameters as expected.
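As a rough back-of-the-envelope illustration of what periodic UE reporting costs, the sketch below estimates how much of a user's uplink such measurement traffic could consume. The report size, reporting period and uplink budget are hypothetical numbers chosen only for illustration, not figures from any specification or from this post.

```python
# Back-of-the-envelope sketch of periodic measurement-report overhead.
# All numbers are hypothetical assumptions chosen for illustration only.

REPORT_SIZE_BYTES = 60         # assumed size of one measurement report
REPORT_PERIOD_S = 0.2          # assumed reporting period (200 ms)
UPLINK_BUDGET_BPS = 1_000_000  # assumed uplink share available to this UE (1 Mbps)

def report_overhead_fraction(report_bytes: float,
                             period_s: float,
                             uplink_bps: float) -> float:
    """Fraction of the uplink budget consumed by periodic reports."""
    report_bps = report_bytes * 8 / period_s
    return report_bps / uplink_bps

if __name__ == "__main__":
    frac = report_overhead_fraction(REPORT_SIZE_BYTES, REPORT_PERIOD_S, UPLINK_BUDGET_BPS)
    print(f"Reporting consumes ~{frac:.1%} of the assumed uplink budget")
```

Even a small per-report cost adds up once every bearer of every UE in a dense heterogeneous cell reports on a sub-second period, which is the trade-off the paragraph above describes.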

 

The EPC has the functional entities to enforce and control the resources assigned to an application in terms of the EPS bearer, but there is also a transport technology that carries the communication between those functional entities, toward the access and, of course, the user traffic: an all-IP network that has struggled for a long time to implement quality, with little luck. The current frameworks, such as DiffServ and IntServ, have delivered very modest results for applications with stringent delay and jitter requirements in the first case and serious scalability problems in the second. If we look beyond the edge router of the EPC we find the Internet, the biggest IP network of all, which has never needed quality to be successful; that is where the servers hosting the applications reside, and, needless to say, network operators have no control whatsoever over the Internet.
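For a sense of what DiffServ marking looks like in practice, here is a minimal sketch of my own (not from the post) that marks a UDP socket's traffic with the Expedited Forwarding code point on Linux. Whether routers along the path honor the marking is exactly the limitation described above; beyond your own domain the DSCP can be ignored or rewritten.

```python
# Minimal sketch: DiffServ marking of outgoing UDP traffic on Linux.
# Illustration only; routers outside your own domain may ignore or
# rewrite the DSCP, which is the core limitation discussed above.
import socket

EF_DSCP = 46                 # Expedited Forwarding
TOS_VALUE = EF_DSCP << 2     # DSCP occupies the upper 6 bits of the TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

# Any datagram sent from this socket now carries DSCP 46 in its IP header.
sock.sendto(b"probe", ("192.0.2.10", 5004))   # 192.0.2.10 is a documentation address
sock.close()
```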

 

The Hard Issues to Solve Are the Ones That Matter the Most

 

I have just mentioned complexity as a separate challenge for quality, but next I'd like to list the challenges that I see as the main barriers to implementing it as planned by the 4G technology releases:

 

  1. Internet neutrality regulation is such a delicate subject that, no matter how you put it, if you start applying prioritization traffic engineering, sooner or later you will be thinking about how to explain that you are not handicapping the customer's right to access information. I have proposed an approach centered on the user, or as I call it now, user-requested quality.
  2. E2E quality, from the user device to the application server, cannot be implemented or realized without a big CAPEX consideration, and so far churn seems to be the only way to justify expenditure on quality, yet relating one to the other is nearly a titanic endeavor. Additionally, radio propagation is "capricious": you cannot sell quality by telling the user that sometimes he will receive what he paid for and sometimes he will not.
  3. Device manufacturers have found a way to master the radio control layer of wireless accesses. Current, and not so current, smartphones carry a software connection manager programmed, and even remotely controlled, by the manufacturer, largely outside the reach of network providers, so whatever strategy an MNO implements in the network had better take the device connection manager into account.
  4. Quality became so trendy that every standardization, research and regulation body has its own agenda for addressing it. 4G was conceived as a congregator of different technologies under one big umbrella, so separate efforts cannot benefit this technology. Very few cross-domain proposals exist and, although one might think that with IP as the common transport layer things would be easier, the current state of implementation shows otherwise. Worth mentioning is the IETF proposal for an extensible IP signaling protocol suite called NSIS.
  5. Paradoxically, the user comes last when quality is on the table. I mentioned earlier that it is difficult to sell quality to the final user, but it is even more challenging when no one cares to make it easy for him. The regulation agencies of several countries have failed completely in this respect: requiring MNOs to publish performance indicators in engineering units tells the user nothing. A better approach would be to put these indicators in terms of time, expressing unavailability or periods outside the accepted mean download speed, or, better still, to group performance measurements into quality indicators that aggregate these figures and present a more general picture of the behavior of the network or the service (see the sketch after this list).
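As an illustration of that last point, here is a minimal sketch of how raw engineering KPIs could be folded into a single user-facing quality indicator. The metrics, thresholds and weights are purely my own assumptions, not values proposed in this post or by any regulator.

```python
# Minimal sketch: aggregating raw network KPIs into a user-facing quality score.
# Metrics, thresholds and weights are illustrative assumptions only.

def score(value: float, good: float, bad: float) -> float:
    """Map a measurement linearly to [0, 1], where 1 means 'good'."""
    if good == bad:
        return 1.0
    s = (value - bad) / (good - bad)
    return max(0.0, min(1.0, s))

def user_quality_indicator(avg_downlink_mbps: float,
                           latency_ms: float,
                           monthly_outage_minutes: float) -> float:
    """Weighted 0-100 indicator a subscriber could actually interpret."""
    parts = [
        (0.5, score(avg_downlink_mbps, good=10.0, bad=1.0)),
        (0.3, score(latency_ms, good=50.0, bad=300.0)),
        (0.2, score(monthly_outage_minutes, good=0.0, bad=120.0)),
    ]
    return 100.0 * sum(weight * s for weight, s in parts)

if __name__ == "__main__":
    print(f"Quality indicator: {user_quality_indicator(6.0, 120.0, 30.0):.0f}/100")
```

A single number like this hides detail, of course, but it answers the question the subscriber actually asks, which engineering units never do.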

 

It Is Very Rewarding When You Find a Purpose for an Item You Considered Unusable

 

Considering all these factors, I don't see quality as a feasible feature in 4G networks... not, at least, as the industry tries to present it. Even if history says you must start by providing quality to the network, MNOs must shift the focus and start providing quality to the user, or what many call Quality of Experience (QoE). Contrary to what many think, QoE can be implemented in parallel with network quality, and it can be independent of the underlying technology.

The quality of experience can benefit from the tools network technology provides: once an event is detected, a quality process must be put in action to improve the service the user experiences, and that action relies on the network technology to accomplish it. But that is just the reactive part of quality. The proactive part involves the continuous optimization of network parameters, the right schedule of preventive maintenance, the proper training of customer service staff, proper problem escalation procedures and, why not, the right agreements with partners and device manufacturers. All of these can make the difference everyone is looking for from quality.

From the current technology point of view, the task for regulators is to make quality an easier measure to understand. Another point is the business case: it is clear that a wireless network operator cannot always guarantee the level of quality a user will receive, but services and applications that can exploit the advanced capabilities of current 4G technology have not been created yet (e.g., context-aware applications). MNOs must start to think of these opportunities as aggregated services that take advantage of current network capabilities before other actors do.
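To make the reactive part of that loop tangible, here is a minimal sketch, entirely my own illustration, of a detect-then-act cycle driven by a per-user experience metric. The metric, threshold and remediation hook are assumptions, not anything defined in this post or by 3GPP.

```python
# Minimal sketch of the reactive QoE loop described above: watch a per-user
# experience metric and trigger a remediation action when it degrades.
# The metric, threshold and remediation hook are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class QoESample:
    user_id: str
    video_stall_ratio: float   # fraction of playback time spent rebuffering

def reactive_quality_loop(samples: Iterable[QoESample],
                          remediate: Callable[[str], None],
                          stall_threshold: float = 0.05) -> None:
    """Trigger remediation for every user whose stall ratio exceeds the threshold."""
    for sample in samples:
        if sample.video_stall_ratio > stall_threshold:
            remediate(sample.user_id)

if __name__ == "__main__":
    observed = [QoESample("alice", 0.01), QoESample("bob", 0.12)]
    reactive_quality_loop(observed, lambda uid: print(f"remediate bearer for {uid}"))
```

The remediation itself, whether a bearer modification, a handover or a trouble ticket, is where the network tools mentioned above come back into play; the detection, however, belongs on the experience side.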

 

For more, follow me, @jomaguo

 

Read this post in Spanish

Read the original blog entry...

More Stories By Deborah Strickland

The articles presented here are blog posts from members of our Service Provider Mobility community. Deborah Strickland is a Web and Social Media Program Manager at Cisco. Follow us on Twitter @CiscoSPMobility.
