Quality on 4G Networks Is Like an Attractive Person Nobody Wants to Date for Too Long (Also Available in Spanish)

It Is Like Having a Big Motor Under the Hood but Wooden Wheels Supporting the Car

 

When the first releases of LTE and WiMAX were presented to the general public, it was clear that both technologies held an unprecedented opportunity to provide service quality in a way that previous access technologies could only dream of, and everybody seemed to follow that line of thinking, making QoS (Quality of Service) a hot topic. I have addressed the advantages of this feature in previous blogs, and I have even argued that MNOs can use it as an effective tool to create differentiation and brand loyalty. But many factors slow down and hinder the full development of quality in 4G networks, ranging from the regulation of Internet neutrality to the absence of applicable cross-domain quality specifications that would make quality a transparent feature between networks.

 

Impressed by the Looks, Disappointed by the Personality

Image taken from: http://anromaway.wordpress.com/2012/03/01/10-things-to-remember-when-delivering-customer-service/

 

Discussions on quality are not new. Transport technologies have struggled to come up with a usable concept of quality and have produced several frameworks to implement it, but, as in the case of IP transport, these have not been very popular. MNOs have committed to implementing quality since the first traffic-differentiation features of GSM, WCDMA and other 2G and 3G technologies, yet it has never been a true differentiator for them; it has been more of an optimization option left entirely to the discretion of the network provider.

4G, however, came as an answer to a change in Internet usage habits: users began using their desktops less and their portable devices more and more. That change hit MNOs hard, because users started consuming bandwidth-heavy services outside the control of the network provider, and when you manage a resource as complicated as the electromagnetic spectrum, quality seemed like the way to regain some control over how network resources were being used. Some even considered it the answer to recovering part of the profit that was flowing into the pockets of content providers.

But when you look at the situation in detail you find a very difficult business case for quality, even more so when you consider that the app frenzy and the empowerment of devices have made the service spill outside the network provider's boundaries in several ways. Part of the philosophy of 4G is to reach across several networks so the MNO can retain control of the user, but that created yet another problem: quality in a cross-domain environment. Finally, to complicate things further, once you have some quality implementation in your network you begin to realize that it is not enough to secure quality in one portion of the network; you need it across the whole telecommunications infrastructure. So the end-to-end (E2E) quality challenge enters the picture, and with it the conclusive thought that it is not really network quality but user experience quality you have to be good at in order to remain relevant in this business.

 

So You Bought It, Took It Home, Plugged It In, Felt Scammed, and Now You Take It Back to the Dealership

 

Well, not really. The 4G standardization bodies have done their homework: strong frameworks have been proposed to develop the concept of quality. LTE, for example, defined a cohesive architecture spanning the LTE access and the EPC in order to support the EPS bearer, which is the finest granularity of quality in that technology. Both LTE and WiMAX have also made noticeable efforts to develop consistent procedures for mapping quality toward legacy technologies, so previous releases can work seamlessly with the latest ones. As interest in quality grows, the research becomes more involved, and looking toward NGN and future generations of wireless cellular network technologies the panorama seems pretty interesting. Finally, current 4G network implementations already have a potent quality scheme to control traffic flows through the network.
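
To make the EPS bearer idea concrete, here is a minimal sketch of how an application's needs could be matched to a QCI (QoS Class Identifier), the label that characterizes each EPS bearer. The table values come from the standardized QCI characteristics in 3GPP TS 23.203; the selector function and its names are my own illustration, not operator code:

    # Sketch: a few standardized LTE QCI classes (3GPP TS 23.203) and a toy
    # bearer selector. Table values are standardized; the selector logic
    # and field names are illustrative assumptions.
    QCI_TABLE = {
        1: dict(gbr=True,  priority=2, delay_ms=100, loss=1e-2, example="conversational voice"),
        2: dict(gbr=True,  priority=4, delay_ms=150, loss=1e-3, example="conversational video"),
        5: dict(gbr=False, priority=1, delay_ms=100, loss=1e-6, example="IMS signalling"),
        9: dict(gbr=False, priority=9, delay_ms=300, loss=1e-6, example="default best effort"),
    }

    def pick_qci(needs_guaranteed_rate, max_delay_ms):
        """Return a QCI whose characteristics satisfy the application's needs."""
        candidates = [q for q, c in QCI_TABLE.items()
                      if c["gbr"] == needs_guaranteed_rate and c["delay_ms"] <= max_delay_ms]
        return min(candidates, key=lambda q: QCI_TABLE[q]["priority"]) if candidates else 9

    print(pick_qci(True, 120))  # -> 1, a VoLTE-style guaranteed-bit-rate voice bearer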

 

Complexity Is Not Something You Can Sell Easily

 

So what is so different about 4G networks that makes quality a hard pill to swallow? For starters, the access segment is planned to support several layers of cellular coverage formed by picocells, femtocells and relay nodes, a setup known as a heterogeneous network. To handle interference and resource control in such a configuration, the user's device (UE) must play an active role, feeding key measurements back to the eNB; in other words, the control signaling and overhead will be significant. If we consider some of the proposed frameworks for E2E quality, the UE is also expected to provide specific information periodically, which means more signaling, more resources for control and fewer for user traffic. And that is without mentioning that current devices have built-in connection managers that are totally outside the control of the MNO. Closely related to this last issue, there is a real threat to a service whose quality has been assured for as long as cellular networks have existed: voice in a 4G network runs over IP and can be deployed as an application inside a device that prioritizes processes according to the OS software running on its processor. Going further, in the case of LTE, the EPC is equipped to handle traffic coming from non-3GPP accesses, but other technologies, such as current WiFi, are not ready to map quality parameters as expected.
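
To see why the extra reporting worries me, here is a back-of-envelope sketch of the uplink consumed by periodic UE measurement reports. Every number below is an assumption chosen for illustration, not a 3GPP figure:

    # Rough signaling-overhead estimate; all inputs are assumptions.
    report_bytes = 60      # assumed size of one measurement report
    period_s = 0.24        # assumed reporting interval (240 ms)
    ues_per_cell = 200     # assumed connected UEs in a loaded cell

    overhead_bps = ues_per_cell * report_bytes * 8 / period_s
    print(f"{overhead_bps / 1e6:.2f} Mbit/s of uplink spent on reports")  # 0.40
    # Small per cell, but it grows with every extra measurement the proposed
    # E2E quality frameworks ask the UE to feed back.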

 

The EPC has the functional entities to enforce and control the resources assigned to an application in terms of the EPS bearer. But there is a transport technology that supports the communication between the functional entities in the EPC and toward the access, and that of course carries the user traffic: an all-IP network that has struggled for a long time to implement quality, with no luck. The existing frameworks have proved to deliver only modest results: DiffServ for applications with stringent delay and jitter requirements, and IntServ because of its scalability problems. If we look beyond the edge router of the EPC we find the Internet, the biggest IP network of all, which has never needed quality to be successful. There you will find the servers where the applications reside, and needless to say, network operators have no control whatsoever over the Internet.
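
The usual workaround is to map each EPS bearer's QCI onto a DiffServ code point so the IP transport can at least coarsely honor the bearer. The sketch below shows the idea; the particular mapping is an illustrative assumption, since operators tune their own:

    # Hypothetical QCI-to-DSCP mapping for the EPC transport network.
    QCI_TO_DSCP = {
        1: "EF",    # voice bearer -> Expedited Forwarding
        2: "AF41",  # conversational video
        5: "CS5",   # IMS signalling
        9: "BE",    # default bearer -> best effort
    }

    def dscp_for_bearer(qci):
        # Anything unknown degrades to best effort, which is exactly the
        # "modest results" problem once packets cross an unmanaged domain.
        return QCI_TO_DSCP.get(qci, "BE")

    print(dscp_for_bearer(1))  # -> EF, honored only inside your own domain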

 

The Issues That Are Hard to Solve Are the Ones That Matter the Most

 

I have just discussed complexity as a challenge for quality in its own right, but next I'd like to list the challenges that I see as the main barriers to implementing quality as planned in the 4G technology releases:

 

  1. Internet neutrality regulation is such a delicate subject that no matter how you put it, if you start applying prioritization traffic engineering, sooner or later you will be thinking about how to explain that you are not handicapping the customer's right to access information. I have proposed an approach centered on the user, or as I call it now, user-requested quality.
  2. E2E quality, from the user device to the application server, cannot be implemented or realized without a big CAPEX consideration, and so far churn seems to be the only way to justify expenditure on quality, yet relating one to the other is nearly a titanic endeavor. Additionally, radio propagation is "capricious": you cannot sell quality by telling users that sometimes they will receive what they paid for and sometimes they will not.
  3. Device manufacturers have found a way to master the radio control layer of wireless accesses. Current (and not so current) smartphones ship with a software connection manager programmed, and even remotely controlled, by the manufacturer, largely outside the reach of network providers; whatever strategy an MNO implements in the network, it had better take the device connection manager into account.
  4. Quality became so trendy that every standardization, research and regulation body has its own agenda for addressing it. 4G technology was conceived as a congregator of different technologies under one big umbrella, so separate efforts cannot benefit it. Very few cross-domain proposals exist; one might think that with IP as everyone's common transport layer things would be easier, but the current state of implementation shows otherwise. Worth mentioning is the IETF proposal for an extensible IP signaling protocol suite called NSIS.
  5. Paradoxically, the user comes last when quality is on the table. I mentioned earlier that it is difficult to sell quality to the final user, but it is even more challenging when no one cares to make it easy for him. The regulatory agencies of several countries have failed totally in this respect: requiring MNOs to publish performance indicators in engineering units tells the user nothing. A better approach would be to express these indicators in terms of time, stating unavailability or the periods spent below the accepted mean download speed; an even better idea is to group performance measurements into quality indicators that aggregate these figures and present the more general behavior of the network or the service, as sketched after this list.
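
As a sketch of what item 5 proposes, the snippet below collapses raw engineering measurements into a single user-facing indicator. The weights and thresholds are invented for illustration; a regulator would define its own:

    # Toy quality indicator: score 0-100 from speed samples and downtime.
    # All weights and thresholds are hypothetical.
    def quality_indicator(samples_mbps, promised_mbps=10.0,
                          downtime_min=0.0, month_min=43200.0):
        met = sum(1 for s in samples_mbps if s >= promised_mbps) / len(samples_mbps)
        availability = 1 - downtime_min / month_min
        return round(100 * (0.7 * met + 0.3 * availability), 1)

    # "You got the promised speed 60% of the time with 99.97% availability"
    # says far more to a user than a table of engineering units.
    print(quality_indicator([12, 9, 11, 8, 14], downtime_min=12))  # -> 72.0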

 

It Is Very Rewarding When You Find a Purpose for an Item You Considered Unusable

 

Considering all these factors, I don't see quality as a feasible feature in 4G networks... not as the industry tries to present it, at least. Even if history says you must start by providing quality to the network, MNOs must shift their focus and start providing quality to the user, or what many call Quality of Experience (QoE). Contrary to what many think, QoE can be implemented in parallel with network quality and can be independent of the underlying technology. The quality experience can benefit from the tools network technology provides: once an event is detected, a quality process must be put in action to improve the service the user experiences, and that action relies on the network technology to accomplish it. But that is just the reactive part of quality. The proactive part considers the continuous optimization of network parameters, the right schedule of preventive maintenance, the proper training of customer service staff, proper problem escalation procedures and, why not, the right agreements with partners and device manufacturers. All of these can make the difference everyone is looking for from quality.

From the current technology point of view, the task for regulators is to make quality an easier measure to understand. Another point is the business case: it is clear that a wireless network operator cannot always ensure the level of quality a user will receive, but the services or applications that could exploit the advanced capabilities of current 4G technology have not been created yet (e.g., context-aware applications). MNOs must start thinking about these opportunities as aggregated services, taking advantage of current network capabilities before other actors do.
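
As a sketch of that reactive part, the snippet below watches a per-user experience score and triggers a corrective network action when it degrades. The score formula, thresholds and names are hypothetical placeholders, not a standard QoE model:

    # Minimal reactive QoE loop; the estimator is a crude stand-in,
    # not the ITU-T E-model or any standardized metric.
    def qoe_estimate(latency_ms, loss_pct):
        return max(1.0, 4.5 - 0.02 * latency_ms - 0.3 * loss_pct)  # 1..4.5 scale

    def react(user_id, latency_ms, loss_pct, escalate):
        score = qoe_estimate(latency_ms, loss_pct)
        if score < 3.0:               # assumed "users start complaining" line
            escalate(user_id, score)  # e.g., adjust the bearer, open a ticket

    react("user-42", latency_ms=180, loss_pct=2.0,
          escalate=lambda u, s: print(f"{u}: QoE {s:.1f}, corrective action"))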

 

For more, follow me, @jomaguo

 

Read this post in Spanish


More Stories By Deborah Strickland

The articles presented here are blog posts from members of our Service Provider Mobility community. Deborah Strickland is a Web and Social Media Program Manager at Cisco. Follow us on Twitter @CiscoSPMobility.
