
Quality on 4G Networks Is Like an Attractive Person Nobody Wants to Date for Too Long (Disponible en Español)

It Is Like Having a Big Motor Under the Hood but Wooden Wheels Supporting the Car

 

When the first releases of LTE and WiMAX were presented to the general public, it was clear that both technologies held an unprecedented opportunity to provide service quality in a way previous access technologies could only dream of, and everybody seemed to follow that line of thinking, making QoS (Quality of Service) a hot topic. I have addressed the advantages of this feature in previous blogs, and I have even argued that MNOs can use it as an effective tool to create differentiation and brand loyalty. But many factors slow down and hinder the full development of quality in 4G networks, ranging from the regulation of Internet neutrality to the absence of applicable cross-domain quality specifications that would make quality a transparent feature between networks.

 

Impressed by the Looks, Disappointed by the Personality

[Image: customer service illustration, taken from http://anromaway.wordpress.com/2012/03/01/10-things-to-remember-when-delivering-customer-service/]

 

The discussion on quality is not new. Transport technologies have struggled to come up with a usable concept of quality and have produced several frameworks to implement it, but, as in the case of IP transport, these have not been very popular. MNOs have been committed to implementing quality since the first traffic differentiation features of GSM, WCDMA and other 2G and 3G technologies, yet it has never been a true differentiator for them; it has been more of an optimization option left entirely to the discretion of the network provider.

4G, however, came as an answer to changing Internet usage habits: users started to use their desktops less and their portable devices more and more. That change hit MNOs hard, because users began consuming bandwidth-heavy services outside the control of the network provider. When you have a complicated resource to manage, as the electromagnetic spectrum is, quality seemed like the way to regain some control over how network resources were being used; some even considered that it could be the answer to recovering some of the profit flowing into the pockets of content providers. But when you look at the situation in detail you find a very difficult business case for quality, even more so when you consider that the app frenzy and the empowerment of devices have started to make the service spill outside the network provider's boundaries in several ways. Part of the philosophy of 4G is to reach across several networks so the MNO can retain control of the user, but that created yet another problem: quality in a cross-domain environment.

Finally, to complicate things further, once you have some quality implementation in your network you begin to realize that it is not enough to secure quality in one portion of the network; you must secure it across the whole telecommunications infrastructure. So the end-to-end (E2E) quality challenge comes into the picture, and with it the conclusive thought that it is not really network quality but user experience quality that you have to be good at in order to remain relevant in the business.

 

So You Bought It, Took It Home, Plugged It In, Felt Scammed, and Now You Take It Back to the Dealership

 

Well, not really. The 4G standardization bodies have done their homework: strong frameworks have been proposed to develop the concept of quality. LTE, for example, defined a cohesive architecture spanning the LTE access and the EPC to support the EPS bearer, which is the finest granularity of quality in that technology. Both LTE and WiMAX have also made noticeable efforts to develop consistent procedures for mapping quality toward legacy technologies, so previous releases can work seamlessly with the latest ones. As interest in quality grows, the research becomes more involved, and looking toward NGN and future generations of wireless cellular network technologies the panorama seems quite interesting. Finally, current 4G deployments already have a potent quality scheme to control traffic flows through the network.
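As a rough illustration of the bearer-level granularity mentioned above, the QoS profile of an EPS bearer can be modeled as a small record. The field names follow 3GPP TS 23.203 concepts (QCI, ARP, GBR/MBR); the class itself is a hypothetical sketch, not any real API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EpsBearerQos:
    """QoS profile of an LTE EPS bearer (3GPP TS 23.203 concepts)."""
    qci: int                        # QoS Class Identifier, standardized values 1-9
    arp_priority: int               # Allocation and Retention Priority (1 = highest)
    gbr_kbps: Optional[int] = None  # Guaranteed Bit Rate; only set for GBR bearers
    mbr_kbps: Optional[int] = None  # Maximum Bit Rate; only set for GBR bearers

    @property
    def is_gbr(self) -> bool:
        # QCIs 1-4 are standardized as GBR resource types
        return 1 <= self.qci <= 4

# A default (non-GBR) bearer for best-effort traffic, QCI 9:
default_bearer = EpsBearerQos(qci=9, arp_priority=9)
# A dedicated voice bearer, QCI 1, with a guaranteed rate:
voice_bearer = EpsBearerQos(qci=1, arp_priority=2, gbr_kbps=64, mbr_kbps=64)
```

The point of the sketch is that quality in LTE is negotiated per bearer, not per packet: every flow the network agrees to carry is tagged with exactly this handful of parameters.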

 

Complexity Is Not Something You Can Sell Easily

 

So what is so different about 4G networks that makes quality such a hard pill to swallow? For starters, the access segment is planned to support several layers of cellular coverage formed by picocells, femtocells and relay nodes, a setup known as a heterogeneous network. To handle interference and resource control in such a configuration, the user's device (UE) must play an active role, feeding key measurements back to the eNB; in other words, the control signaling and overhead will be significant. If we consider some of the proposed frameworks for E2E quality, the UE is also expected to report specific information periodically, which means even more signaling: more resources for control and fewer for user traffic. And that is without mentioning that current devices have built-in connection managers that are totally outside the control of the MNO. Closely related to this last issue, there is a real threat to a service that has enjoyed assured quality for as long as cellular networks have existed: voice in a 4G network runs over IP, and it can be deployed as an application inside a device that prioritizes processes according to the OS software running on its processor. Going further, in the case of LTE the EPC is equipped to handle traffic coming from non-3GPP accesses, but other technologies, such as current WiFi, are not ready to map quality parameters as expected.
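To make the overhead concern concrete, here is a back-of-the-envelope sketch of how periodic UE measurement reports eat into uplink capacity. All the numbers are illustrative assumptions for the sake of the arithmetic, not measured values from any network:

```python
def control_overhead_fraction(report_bytes: int,
                              report_period_s: float,
                              uplink_capacity_bps: float) -> float:
    """Fraction of an uplink share consumed by periodic measurement reports."""
    report_bps = report_bytes * 8 / report_period_s  # bits per second of reporting
    return report_bps / uplink_capacity_bps

# Assumed: a 100-byte report every 200 ms on a 1 Mbps uplink share
frac = control_overhead_fraction(report_bytes=100,
                                 report_period_s=0.2,
                                 uplink_capacity_bps=1_000_000)
print(f"{frac:.2%}")  # 100*8/0.2 = 4000 bps -> 0.40% of the uplink
```

A fraction of a percent per UE sounds harmless, but multiply it by the reporting demanded for interference coordination, handover and E2E quality frameworks across every device in a dense cell, and the "more signaling, fewer user bits" trade-off becomes very real.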

 

The EPC has the functional entities to enforce and control the resources assigned to an application in terms of the EPS bearer, but there is also a transport technology that supports the communication between the functional entities in the EPC and toward the access, and that of course carries the user traffic: an all-IP network that has struggled for a long time, with no luck, to implement quality. The current frameworks, such as DiffServ and IntServ, have proved to deliver very modest results for delay- and jitter-sensitive applications in the first case, and serious scalability problems in the second. If we look beyond the edge router of the EPC we find the Internet, the biggest IP network of all, which has never needed quality to be successful; there you will find the servers where the applications reside, and needless to say, network operators have no control whatsoever over the Internet.
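One common way to bridge the bearer world and the IP transport underneath it is to mark each bearer's packets with a DiffServ code point at the edge. The mapping below is purely illustrative: real deployments define it per operator policy, and there is no standardized QCI-to-DSCP table:

```python
# Illustrative QCI -> DSCP mapping (operator policy in practice, not a standard).
QCI_TO_DSCP = {
    1: 46,  # conversational voice       -> EF
    2: 34,  # conversational video       -> AF41
    5: 40,  # IMS signalling             -> CS5
    6: 18,  # buffered video / TCP apps  -> AF21
    9: 0,   # default best effort        -> BE
}

def dscp_for_qci(qci: int) -> int:
    # Fall back to best effort for any QCI without an explicit rule
    return QCI_TO_DSCP.get(qci, 0)
```

Note how the sketch also exposes the problem described above: the marking is only a request. Once the packet leaves the operator's edge router for the public Internet, nothing obliges downstream routers to honor that DSCP at all.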

 

Those Hard Issues To Solve Are The Ones That Matter The Most

 

I have just discussed complexity as a challenge for quality in its own right; next I would like to list the challenges that I see as the main barriers to implementing quality as planned in the 4G technology releases:

 

  1. Internet neutrality regulation is such a delicate subject that no matter how you put it, if you start applying prioritization traffic engineering, sooner or later you will have to explain why you are not handicapping the customer's right to access information. I have proposed an approach centered on the user, or as I call it now, user-requested quality.
  2. E2E quality, from the user device to the application server, cannot be implemented without a big CAPEX commitment, and so far churn seems to be the only way to justify expenditure on quality, yet relating one to the other is a nearly titanic endeavor. Additionally, radio propagation is capricious: you cannot sell quality by telling users that sometimes they will receive what they paid for and sometimes they will not.
  3. Device manufacturers have found a way to master the radio control layer of wireless accesses. Current, and not so current, smartphones ship with a software connection manager programmed, and even remotely controlled, by the manufacturer, largely outside the reach of network providers. Whatever strategy an MNO implements in the network, it had better take the device connection manager into account.
  4. Quality became so trendy that every standardization, research and regulation body has its own agenda for addressing it. 4G was conceived to congregate different technologies under one big umbrella, so separate efforts cannot benefit the technology. Very few cross-domain proposals exist; one might think that with IP as the common transport layer things would be easier, but the current state of implementation shows otherwise. Worth mentioning is the IETF proposal for an extensible IP signaling protocol suite called NSIS.
  5. Paradoxically, the user comes last when quality is on the table. I mentioned earlier that it is difficult to sell quality to the final user, but it is even more challenging when no one cares to make it easy for them. The regulation agencies of several countries have failed totally in this respect: forcing MNOs to publish performance indicators in engineering units tells the user nothing. A better approach would be to express these indicators over time, as unavailability or as periods outside the accepted mean download speed; an even better idea is to group performance measurements into quality indicators that aggregate those figures and present the more general behavior of the network or the service.
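As a sketch of the last point, instead of publishing raw engineering counters, a regulator could require a time-based indicator such as "percentage of time below the advertised speed". The threshold, the acceptance ratio and the samples below are all made up for illustration:

```python
def below_threshold_time_pct(speed_samples_mbps, advertised_mbps,
                             accepted_ratio=0.5):
    """Percentage of sampled periods in which measured throughput fell
    below an accepted fraction of the advertised download speed."""
    threshold = advertised_mbps * accepted_ratio
    bad = sum(1 for s in speed_samples_mbps if s < threshold)
    return 100.0 * bad / len(speed_samples_mbps)

# Hypothetical hourly samples against a 10 Mbps advertised plan:
samples = [9.1, 8.7, 4.2, 9.8, 3.9, 10.2, 7.5, 2.8]
pct = below_threshold_time_pct(samples, advertised_mbps=10)
print(f"below threshold {pct:.1f}% of the time")  # 3 of 8 samples -> 37.5%
```

"Your connection was below half the advertised speed 37.5% of the time" says far more to a subscriber than a table of RSRP and BLER figures ever will.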

 

It Is Very Rewarding When You Find a Purpose for an Item You Considered Unusable

 

Considering all these factors, I do not see quality as a feasible feature in 4G networks, at least not as the industry tries to present it. Even if history says you must start by providing quality in the network, MNOs must shift the focus and start providing quality to the user, or what many call Quality of Experience (QoE). Contrary to what many think, QoE can be implemented in parallel with network quality, and it can be independent of the underlying technology. The quality of experience can benefit from the tools the network technology provides: once an event is detected, a quality process must be put in motion to improve the service the user experiences, and that action relies on the network technology to be accomplished. But that is just the reactive part of quality. The proactive part involves the continuous optimization of network parameters, the right schedule of preventive maintenance, the proper training of customer service staff, proper problem escalation procedures and, why not, the right agreements with partners and device manufacturers. All of these together can make the difference everyone is looking for from quality.

From the current technology point of view, the task for regulators is to make quality an easier measure to understand. Another point is the business case: it is clear that a wireless network operator cannot always ensure the level of quality a user will receive, but the services and applications that could exploit the advanced capabilities of current 4G technology have not been created yet (e.g., context-aware applications). MNOs must start thinking of these opportunities as aggregated services that take advantage of current network capabilities before other actors do.

 

For more, follow me, @jomaguo

 


More Stories By Deborah Strickland

The articles presented here are blog posts from members of our Service Provider Mobility community. Deborah Strickland is a Web and Social Media Program Manager at Cisco. Follow us on Twitter @CiscoSPMobility.
