
Quality on 4G Networks Is Like an Attractive Person Nobody Wants to Date for Too Long (Available in Spanish)

It's Like Having a Big Motor Under the Hood but Wooden Wheels Supporting the Car

 

When the first releases of LTE and WiMAX were presented to the general public, it was clear that both technologies held an unprecedented opportunity to provide service quality in a way that previous access technologies could only dream of, and everybody seemed to follow that line of thinking, making QoS (Quality of Service) a hot topic. I have addressed the advantages of this feature in previous blogs, and I have even argued that MNOs can use it as an effective tool to create differentiation and brand loyalty. But many factors slow down and hinder the full development of quality in 4G networks, ranging from the regulation of Internet neutrality to the absence of applicable cross-domain quality specifications that would make quality a transparent feature between networks.

 

Impressed by the Looks, Disappointed by the Personality


Image taken from: http://anromaway.wordpress.com/2012/03/01/10-things-to-remember-when-delivering-customer-service/

 

The discussions on quality are not new. Transport technologies have struggled to come up with a usable concept of quality and have presented several frameworks to implement it, but, as in the case of IP transport, these have not been very popular. MNOs have committed to implementing quality since the first traffic-differentiation features of GSM, WCDMA, and other 2G and 3G technologies, yet it has never been a true differentiator for them; it has been more of an optimization option left entirely to the discretion of the network provider.

4G, however, came as an answer to a change in Internet usage habits, in which users began using their desktops less and their portable devices more and more. That change hit MNOs hard, because users started consuming bandwidth-heavy services outside the network provider's control. When you have a resource as complicated to manage as the electromagnetic spectrum, quality seemed like the way to regain some control over how network resources were being used; some even considered it the answer to recovering part of the profit flowing into the pockets of content providers. But when you look at the situation in detail, you find a very difficult business case for quality, even more so when you consider that the app frenzy and the empowerment of devices have started to push the service outside the network provider's boundaries in several ways.

Part of the philosophy of 4G is to reach across several networks so the MNO can retain control of the user, but that created yet another problem: quality in a cross-domain environment. Finally, to complicate things further, once you have a quality implementation in your network you begin to realize that it is not enough to secure quality in one portion of the network; you must secure it across the whole telecommunications infrastructure. Thus the end-to-end (E2E) quality challenge comes into the picture, and with it the conclusive thought that it is not really network quality but user experience quality you have to be good at in order to remain relevant in this business.

 

So You Bought It, Took It Home, Plugged It In, Felt Scammed, and Now You Take It Back to the Dealership

 

Well, not really. The 4G standardization bodies have done their homework: strong frameworks have been proposed to develop the concept of quality. LTE, for example, defined a cohesive architecture spanning the LTE access and the EPC to support the EPS bearer, which is the finest granularity of quality in that technology. Both LTE and WiMAX have also made noticeable efforts to develop consistent procedures for mapping quality parameters onto legacy technologies, so previous releases can work seamlessly with the latest ones. Moreover, as interest in quality grows, the research becomes more involved, and looking toward NGN and future generations of wireless cellular technologies the panorama seems quite interesting. Finally, current implementations of 4G networks already ship a potent quality scheme to control traffic flows through the network.
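To make the EPS bearer's granularity concrete, here is a minimal Python sketch of the kind of per-bearer QoS profile LTE works with. The QCI 1, 5, and 9 characteristics follow the standardized table in 3GPP TS 23.203; the class and function names are my own illustration, not any real API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class QciProfile:
    """A QCI row: the standardized treatment an EPS bearer should receive."""
    qci: int
    resource_type: str    # "GBR" (guaranteed bit rate) or "Non-GBR"
    priority: int         # lower value = higher scheduling priority
    delay_budget_ms: int  # packet delay budget across the access
    loss_target: float    # packet error/loss rate target
    example_service: str

# Three representative rows from the standardized QCI table (3GPP TS 23.203).
QCI_TABLE = {
    1: QciProfile(1, "GBR", 2, 100, 1e-2, "conversational voice (VoLTE)"),
    5: QciProfile(5, "Non-GBR", 1, 100, 1e-6, "IMS signalling"),
    9: QciProfile(9, "Non-GBR", 9, 300, 1e-6, "default bearer, best-effort data"),
}

def describe_bearer(qci: int) -> str:
    """Summarize the per-bearer treatment implied by a QCI value."""
    p = QCI_TABLE[qci]
    return (f"QCI {p.qci} ({p.example_service}): {p.resource_type}, "
            f"priority {p.priority}, <= {p.delay_budget_ms} ms, "
            f"loss <= {p.loss_target:.0e}")

for qci in (1, 5, 9):
    print(describe_bearer(qci))
```

Every EPS bearer carries one such QCI, plus an allocation and retention priority and, for GBR bearers, bit-rate caps; this per-bearer knob set is what the rest of this post is arguing about.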

 

Complexity Is Not Something You Can Sell Easily

 

So what is so different about 4G networks that makes quality such a hard pill to swallow? For starters, the access segment is planned to support several layers of cellular coverage formed by picocells, femtocells, and relay nodes, a setup known as a heterogeneous network. To handle the interference and the control of resources in such a configuration, the user's device (UE) must play an active role, feeding key measurements back to the eNB; in other words, the control signaling and overhead will be significant. If we consider some of the proposed frameworks for accomplishing E2E quality, the UE is also expected to provide specific information periodically. Explained differently: more signaling, more resources for control, and fewer for user traffic. And that is without mentioning that current devices have built-in connection managers that are entirely outside the MNO's control.

Closely related to this last issue, there is a real threat menacing a service whose quality has been assured for as long as cellular networks have existed: voice in a 4G network runs over IP and can be deployed as an application inside a device that prioritizes processes according to the OS software running on its processor. Going further, in the case of LTE the EPC is equipped to handle traffic coming from non-3GPP accesses, but other technologies, such as current WiFi, are not ready to map quality parameters as expected.
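To get a rough feel for the measurement-reporting cost mentioned above, here is a back-of-envelope sketch. Every number in it is invented for illustration; real figures depend on the RRC measurement configuration and the uplink share a UE actually gets:

```python
def report_overhead(report_bytes: int, period_s: float, uplink_bps: float) -> float:
    """Fraction of a UE's uplink share consumed by periodic measurement reports."""
    report_bps = report_bytes * 8 / period_s
    return report_bps / uplink_bps

# Hypothetical: a 100-byte report every 200 ms on a 1 Mb/s uplink share.
frac = report_overhead(report_bytes=100, period_s=0.2, uplink_bps=1_000_000)
print(f"Per-UE reporting overhead: {frac:.2%}")  # -> 0.40%
```

Per UE the figure looks negligible, but multiply it by the UEs in a dense heterogeneous cell, add the RRC/MAC framing around each report, and shorten the reporting period as the E2E quality frameworks require, and the control plane starts eating visibly into user capacity.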

 

The EPC has the functional entities to enforce and control the resources assigned to an application in terms of the EPS bearer. But underneath there is a transport technology that supports the communication among the EPC's functional entities and toward the access, and that of course carries the user traffic: an all-IP network that has struggled for a long time, with no luck, to implement quality. The current frameworks, DiffServ and IntServ, have delivered very modest results for applications with stringent delay and jitter requirements in the first case, and several scalability problems in the second. And if we look beyond the edge router of the EPC we find the Internet, the biggest IP network of all, which has never needed quality to be successful. That is where the servers hosting the applications reside, and needless to say, network operators have no control whatsoever over the Internet.
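Marking traffic for DiffServ is the easy half of the story. Here is a minimal sketch using Python's standard socket module on Linux; DSCP EF (decimal 46) is the class conventionally requested for voice, and the destination is a documentation-range placeholder address. Whether any router beyond the domains you control honors, remarks, or ignores that mark is precisely the problem described above:

```python
import socket

# DSCP occupies the top 6 bits of the IP TOS byte; EF (Expedited Forwarding,
# decimal 46) is the per-hop behavior conventionally used for voice media.
DSCP_EF = 46
TOS_EF = DSCP_EF << 2  # 0xB8

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_EF)  # mark outgoing packets
sock.sendto(b"rtp-like payload", ("203.0.113.10", 5004))   # placeholder address/port
```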

 

Those Hard Issues to Solve Are the Ones That Matter the Most

 

I have just described complexity as a challenge to quality in its own right, but next I'd like to list the challenges that I see as the main barriers to implementing quality as planned by the 4G technology releases:

 

  1. Internet neutrality regulation is such a delicate subject that, no matter how you put it, if you start applying prioritization traffic engineering, sooner or later you will be thinking about how to explain that you are not handicapping the customer's right to access information. I have proposed an approach centered on the user, or as I call it now, user-requested quality.
  2. E2E quality, from the user device to the application server, cannot be implemented or realized without a big CAPEX consideration, and so far churn seems the only way to justify expenditure on quality, yet relating one to the other is a nearly titanic endeavor. Additionally, radio propagation is "capricious": you cannot sell quality by telling users that sometimes they will receive what they paid for and sometimes not.
  3. Device manufacturers have found a way to master the radio control layer of wireless accesses. Current, and not so current, smartphones ship with a software connection manager programmed, and even remotely controlled, by the manufacturer, largely outside the reach of network providers. Whatever strategy an MNO implements in the network had better take the device connection manager into account.
  4. Quality became so trendy that every standardization, research, and regulation body has its own agenda for addressing it. 4G was conceived as a congregator of different technologies under one big umbrella, so separate efforts cannot benefit it. Very few cross-domain proposals exist; one might think that with IP as everyone's common transport layer things would be easier, but the current state of implementation shows otherwise. Worth mentioning is the IETF proposal for an extensible IP signaling protocol suite, called NSIS.
  5. Paradoxically, the user comes last when quality is on the table. I mentioned earlier that it is difficult to sell quality to the final user, but it is even more challenging when no one cares to make it easy for them. The regulation agencies of several countries have failed completely in this respect: requiring MNOs to publish performance indicators in engineering units tells the user nothing. A better approach would be to express these indicators in terms of time, such as unavailability or periods below the accepted mean download speed; an even better idea is to group performance measurements into quality indicators that aggregate these figures and present a more general picture of the behavior of the network or the service, as in the sketch after this list.
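As a sketch of what point 5 proposes, the fragment below turns raw throughput samples into a time-based figure (the share of intervals below an accepted mean) and folds it, together with availability, into a single user-facing score. The thresholds and weights are invented purely for illustration:

```python
def time_below(samples_mbps, accepted_mbps):
    """Share of measurement intervals where throughput fell below the accepted mean."""
    return sum(1 for s in samples_mbps if s < accepted_mbps) / len(samples_mbps)

def quality_indicator(availability, below_share, w_avail=0.6, w_speed=0.4):
    """Fold two engineering figures into one 0-100 user-facing score (weights made up)."""
    return 100 * (w_avail * availability + w_speed * (1 - below_share))

# Hypothetical hourly throughput samples for one day, in Mb/s.
samples = [8.2, 7.9, 1.1, 6.5, 9.0, 0.4, 7.7, 8.8] * 3
below = time_below(samples, accepted_mbps=4.0)
score = quality_indicator(availability=0.995, below_share=below)
print(f"{below:.0%} of intervals below target -> indicator {score:.0f}/100")
```

"25% of your hours were below the speed you paid for, overall score 90/100" says far more to a subscriber than a table of raw engineering figures.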

 

It Is Very Rewarding When You Find a Purpose for an Item You Considered Unusable

 

Considering all these factors, I don't see quality as a feasible feature in 4G networks... not as the industry tries to present it, at least. Even if history says you must start by providing quality to the network, MNOs must shift the focus and start providing quality to the user, or what many call Quality of Experience (QoE). Contrary to what many think, QoE can be implemented in parallel with network quality and can be independent of the underlying technology.

The quality of experience can certainly benefit from the tools the network technology provides: once an event is detected, a quality process must be put into action to improve the service the user experiences, and that action relies on the network technology to accomplish it. But that is just the reactive part of quality. The proactive part considers the continuous optimization of network parameters, the right schedule of preventive maintenance, the proper training of customer service staff, the proper problem-escalation procedures and, why not, the right agreements with partners and device manufacturers. All of these together can make the difference everyone is looking for from quality.

From the current technology point of view, the task for regulators is to make quality an easier measure to understand. Another point is the business case: clearly a wireless network operator cannot always guarantee the level of quality a user will receive, but the services and applications that could exploit the advanced capabilities of current 4G technology have not been created yet (e.g., context-aware applications). MNOs must start thinking of these opportunities as aggregated services that take advantage of the current network capabilities before other actors do.

 

For more, follow me, @jomaguo

 

Read this post in Spanish

Read the original blog entry...

More Stories By Deborah Strickland

The articles presented here are blog posts from members of our Service Provider Mobility community. Deborah Strickland is a Web and Social Media Program Manager at Cisco. Follow us on Twitter @CiscoSPMobility.
