
The Hype & Reality of Small Cells Performance

Heterogeneous networks (HetNets) consist of large (macro) cells with high transmit power (typically 5 W – 40 W) and small cells with low transmit power (typically 100 mW – 2 W). The small cells are distributed beneath the large cells and can run on the same frequency as the large cell (co-channel) or on a different frequency. As an evolution of the cellular architecture, HetNets and small cells have gained much attention as a technique to increase mobile network capacity and are today one of the hot topics in the wireless industry. Many of the initial deployments of small cells are of the co-channel type. Standards such as LTE focused on techniques to improve the performance of co-channel deployments in earlier releases, leaving the handling of multi-frequency deployments to later releases. In all, operators today have multiple small cell deployment scenarios, operational techniques and technology roadmaps to choose from.

 

Figure 1 Simplified Heterogeneous Network Architecture.

 

To illustrate some of the deployment issues related to small cells, I will provide in this article a qualitative review of small cell performance and explore its impact on the operator's small cell deployment strategy. The focus is on co-channel deployments, which, aside from being common at this early stage of HetNet evolution, present a particularly complex radio frequency environment.

 

Throughput Performance: The overall throughput experienced by users on both downlink (base station to the mobile subscriber) and uplink (mobile to base station) paths will generally increase as small cells are deployed. This applies to both users camped on the macro cell and those on the small cells, but for different reasons:

 

  1. Users on the macro cell benefit as more small cells are added because fewer users share the common capacity resources. The more small cells are added, the greater the likelihood that a user on the macro cell experiences higher throughput; meanwhile,
  2. Users on the small cells experience better throughput than those on the macro cell because of the higher probability of a line-of-sight connection to the serving base station.

 

If mobile subscribers are uniformly distributed over the coverage area, the likelihood that a user experiences a given level of throughput is approximately the same across the network as the number of small cells increases. But in reality the distribution of users is not uniform; they tend to concentrate in "traffic hotspots." In that case, a small cell in a traffic hotspot is expected to provide lower per-user throughput than a small cell serving a uniformly distributed area, while a user on the macrocell experiences a more pronounced increase in throughput because a higher proportion of users is offloaded from the macro cell. As even more small cells are added, interference increases, leading to successively diminishing marginal gains in throughput.
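To make the offload effect concrete, here is a minimal, back-of-the-envelope sketch. The user counts and the equal-capacity-sharing assumption are mine, not figures from any simulation: it splits a fixed population of users between the macrocell and a growing number of hotspot small cells and reports the share of macrocell capacity each remaining macro user gets. It deliberately ignores interference, which is exactly the factor that eventually erodes these gains.

```python
# Toy model of macro-cell offload: every user camped on a cell gets an equal
# share of that cell's capacity. All numbers are illustrative assumptions.

def macro_user_share(total_users, n_small_cells, users_per_hotspot=10):
    """Fraction of macrocell capacity left for each remaining macro user."""
    offloaded = min(total_users, n_small_cells * users_per_hotspot)
    macro_users = max(total_users - offloaded, 1)
    return 1.0 / macro_users

for n in (0, 2, 4, 8):
    share = macro_user_share(total_users=100, n_small_cells=n)
    print(f"{n} hotspot small cells -> {share:.1%} of macro capacity per macro user")
```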

 

This last note is an important one: small cells are beneficial only up to a point. The user experience is affected by the density of small cells, with diminishing marginal returns followed by actual degradation of service once the number of small cells exceeds a certain threshold. Where this threshold lies depends on a number of factors, including the technology, the morphology, and the cell density and distribution. Inter-small-cell interference is one factor that limits small cell performance. Another is that adding more small cells creates more 'cell-edge' regions within the coverage area of the macrocells, which can also limit performance, as I will expand upon below.

 

Throughput performance will depend on the location of the small cells and their proximity to macrocells. A small cell close to a macrocell is more likely to be affected by interference than one located at the cell edge, resulting in lower throughput. Correspondingly, performance depends on the size of the macrocell, or rather the macrocell density. Small cells deployed close to the cell edge of a large macrocell will perform better than those deployed in a high-density macrocell area where the average cell radius is relatively small.

 

Throughput performance will also depend on the output power of the small cell. Simulations show that for a certain macrocell radius, higher power small cells provide better throughput performance than lower power ones given the same small cell base station density.
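As a rough illustration of both effects (small cell location relative to the macro site, and small cell transmit power), here is a toy downlink SINR calculation. The path-loss model, exponents, powers, antenna gains and distances are all assumptions I picked to show the trend, not values from the post or from any simulation campaign.

```python
import math

# Illustrative downlink SINR for a user 30 m from a small cell, with the
# co-channel macrocell as the only interferer. All parameters are assumptions.

def rx_dbm(tx_dbm, ant_gain_dbi, dist_m, exponent):
    """Received power with a simple log-distance path-loss model."""
    path_loss_db = 30.0 + 10.0 * exponent * math.log10(max(dist_m, 1.0))
    return tx_dbm + ant_gain_dbi - path_loss_db

def small_cell_sinr_db(small_tx_dbm, macro_dist_m,
                       user_dist_m=30.0, macro_tx_dbm=46.0, noise_dbm=-95.0):
    sig = rx_dbm(small_tx_dbm, 5.0, user_dist_m, 3.0)      # small cell, near-LOS
    intf = rx_dbm(macro_tx_dbm, 15.0, macro_dist_m, 3.8)   # macro, NLOS
    denom_mw = 10 ** (intf / 10.0) + 10 ** (noise_dbm / 10.0)
    return sig - 10.0 * math.log10(denom_mw)

for power in (24.0, 30.0):              # 250 mW vs. 1 W small cell
    for macro_dist in (200.0, 800.0):   # near the macro site vs. near its edge
        sinr = small_cell_sinr_db(power, macro_dist)
        print(f"Ptx={power} dBm, macro at {macro_dist} m -> SINR {sinr:.1f} dB")
```

In this toy setup the small cell sitting close to the macro site sees a much lower SINR than the one near the macrocell edge, and raising the small cell power lifts the SINR accordingly, which is the qualitative trend described above.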

 

Nevertheless, the key takeaway here is this: it pays to find out where the traffic hotspots are; otherwise the gain achieved from small cells will be small. Small cell deployment has to be 'surgical', targeting select areas, to achieve the maximum return on investment.

 

Interference and Coverage Performance: While small cells improve performance in general, there are certain situations where they cause interference or even a coverage hole. One decisive factor is the large power imbalance between the small cell and the macrocell. The effective imbalance is larger than the difference in rated transmit power alone because macrocells implement high-gain sectored antennas (13-16 dBi) while small cells typically implement much lower-gain omni-directional antennas (3-6 dBi). The power imbalance results in asymmetric downlink and uplink coverage areas: because the macrocell transmits at much higher power than the small cell, the downlink coverage area of the small cell is smaller than its uplink coverage area. This shifts the handover boundary closer to the small cell, increasing the possibility of uplink interference to the small cell, with which the interfering mobile may well have a line-of-sight path. This type of interference is potentially very damaging since it affects all the users in the cell and forces the mobiles served by the small cell to transmit at higher power. The power imbalance also increases the risk of downlink interference, although this type of interference is more limited because it affects a single user. The uplink-downlink imbalance is a leading reason why the gain from LTE Release 8 small cells is limited: cell selection is decided by downlink signal strength, and the options for interference mitigation are few.
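To see how the downlink and uplink boundaries separate, here is a small sketch that walks along the line between a macro site and a small cell and finds (a) where the small cell's downlink signal becomes the stronger one and (b) where the path loss to the small cell becomes the lower one. The powers, antenna gains, site separation and single path-loss exponent are illustrative assumptions only.

```python
import math

# Where do the downlink and uplink "boundaries" fall on a straight line between
# a macro site and a small cell 500 m away? All parameters are assumptions.

MACRO_EIRP_DBM = 46.0 + 15.0   # ~40 W plus sectored antenna gain
SMALL_EIRP_DBM = 30.0 + 5.0    # ~1 W plus omni antenna gain
SITE_SEPARATION_M = 500.0
EXPONENT = 3.5

def path_loss_db(dist_m):
    return 30.0 + 10.0 * EXPONENT * math.log10(max(dist_m, 1.0))

def boundaries(step_m=1.0):
    dl_boundary = ul_boundary = None
    d = step_m
    while d < SITE_SEPARATION_M:
        to_macro, to_small = d, SITE_SEPARATION_M - d
        # Downlink: the user attaches to the stronger received signal.
        if dl_boundary is None and (SMALL_EIRP_DBM - path_loss_db(to_small) >
                                    MACRO_EIRP_DBM - path_loss_db(to_macro)):
            dl_boundary = d
        # Uplink: the "best" cell is the one with the lower path loss.
        if ul_boundary is None and path_loss_db(to_small) < path_loss_db(to_macro):
            ul_boundary = d
        d += step_m
    return dl_boundary, ul_boundary

dl, ul = boundaries()
print(f"Downlink boundary: ~{dl:.0f} m from the macro site")
print(f"Uplink (path-loss) boundary: ~{ul:.0f} m from the macro site")
print(f"Between ~{ul:.0f} m and ~{dl:.0f} m a macro-attached mobile is closer "
      "(in path loss) to the small cell: uplink interference territory.")
```

In this toy setup the downlink boundary sits much closer to the small cell than the path-loss boundary, and the band in between is exactly where a macro-attached mobile transmits at high power right on top of the small cell.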

 

Figure 2 Co-channel interference scenarios in small cell deployments.

 

To address the uplink-downlink coverage imbalance, the coverage area of the small cell is extended so the small cell can capture more traffic. This is accomplished by adding a bias to the small cell's received signal during the cell selection process. But extending the small cell coverage also increases the chance of downlink interference to a mobile subscriber operating at the edge of the expanded small cell area.
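As a minimal sketch of this biased selection, here is a toy cell-selection routine. The RSRP values and the 6 dB bias are illustrative assumptions, not recommended settings.

```python
# Cell selection with a small-cell bias ("range expansion"): a minimal sketch.
# RSRP values (dBm) and the 6 dB bias are illustrative assumptions.

def select_cell(measurements, small_cell_bias_db=6.0):
    """Pick the serving cell from {cell_id: (rsrp_dbm, is_small_cell)}."""
    def biased(item):
        _, (rsrp, is_small) = item
        return rsrp + (small_cell_bias_db if is_small else 0.0)
    return max(measurements.items(), key=biased)[0]

measurements = {
    "macro_1": (-88.0, False),
    "small_7": (-92.0, True),   # weaker on the downlink, but close to the user
}
print(select_cell(measurements, small_cell_bias_db=0.0))  # -> macro_1
print(select_cell(measurements, small_cell_bias_db=6.0))  # -> small_7
```

The flip side is visible in the example: the user handed over to small_7 still receives a stronger macro downlink signal, which is precisely the downlink interference risk described above.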

 

Aside from co-channel interference, there is also a risk of adjacent channel interference in multicarrier networks where macrocells implement two or more frequency carriers. Consider, for example, a mobile attached to a macrocell on frequency A while standing very close to a small cell operating on adjacent frequency B. The mobile is susceptible to adjacent channel interference from the small cell, which will likely have a line-of-sight path to the mobile, in contrast to the non-line-of-sight connection with the macrocell. A similar case exists on the uplink: a mobile attached to a macrocell and operating at the edge of a small cell on an adjacent frequency can interfere with that small cell.
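A quick way to gauge how damaging this can be is to compare the wanted signal with the adjacent-channel leakage after accounting for an ACIR (adjacent channel interference ratio) figure. Everything below, including the 30 dB ACIR and the path-loss parameters, is an assumption for illustration only.

```python
import math

# Adjacent-channel scenario: a mobile served by the macro on carrier A stands a
# few metres from a small cell transmitting on adjacent carrier B.

def rx_dbm(tx_dbm, gain_dbi, dist_m, exponent):
    return tx_dbm + gain_dbi - (30.0 + 10.0 * exponent * math.log10(max(dist_m, 1.0)))

ACIR_DB = 30.0  # combined adjacent-channel leakage/selectivity, assumed

serving = rx_dbm(46.0, 15.0, 500.0, 3.8)            # macro on carrier A, NLOS
leakage = rx_dbm(30.0, 5.0, 10.0, 2.5) - ACIR_DB    # small cell on carrier B, LOS
print(f"Serving signal {serving:.1f} dBm, residual adjacent-channel "
      f"interference {leakage:.1f} dBm, margin {serving - leakage:.1f} dB")
```

With these assumed numbers the residual leakage from the nearby small cell comfortably exceeds the wanted macro signal, which is why proximity plus a line-of-sight path makes this scenario troublesome even across carriers.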

 

There are other potential interference scenarios in addition to those described here. But the basic fact is that the actual performance and benefit of small cells will vary, and will vary more widely in the absence of interference mitigation and performance enhancing techniques. This is one reason why some requirements for small cell deployments have been hotly debated without a firm resolution. For example, a basic requirement is that of small cell backhaul capacity: what should it be? Should the backhaul link be designed to handle the peak throughput rate, which is a function of the technology, or the average throughput rate, which is much harder to ascertain and put a value on because it depends on many factors related to the deployment scenario?

 

Based on the above, we know that small cell throughput will depend largely on the load: the more clustered the subscribers, the lower the throughput each small cell user sees. On the other hand, under a light load (few users), the available capacity per user will be high. If you are an operator, you certainly need to think carefully about the required backhaul capacity! And while we are on the backhaul topic, let's not forget that the macrocell backhaul also needs to be dimensioned properly to support the higher traffic load that will certainly come as more small cells are deployed.
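As a purely illustrative sizing sketch (the peak rate, busy-hour average and overhead factor below are my assumptions, not recommendations), the gap between the two provisioning philosophies looks like this:

```python
# Rough backhaul sizing for one small cell: peak-rate vs. busy-hour-average
# provisioning. All rates and the overhead factor are illustrative assumptions.

PEAK_RATE_MBPS = 150.0       # e.g. a 20 MHz, 2x2 MIMO LTE carrier at near-ideal SINR
BUSY_HOUR_AVG_MBPS = 20.0    # load- and interference-dependent, hard to pin down
TRANSPORT_OVERHEAD = 1.15    # transport/IPsec overhead, assumed

def backhaul_mbps(cell_rate_mbps, overhead=TRANSPORT_OVERHEAD):
    return cell_rate_mbps * overhead

print(f"Provision for peak:          {backhaul_mbps(PEAK_RATE_MBPS):.0f} Mbps")
print(f"Provision for busy-hour avg: {backhaul_mbps(BUSY_HOUR_AVG_MBPS):.0f} Mbps")
```

The order-of-magnitude gap between the two numbers is exactly why the backhaul-capacity requirement keeps being debated: peak-rate provisioning is simple but expensive, while average-rate provisioning is cheaper but depends on user clustering, interference and the deployment scenario.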

 

In this post, I went through some aspects of small cell performance. These problems are well recognized, and techniques are being developed and integrated into the standards to address them. This raises other important questions for the operator's strategic network planning process, such as: what interference management and performance enhancement features should be considered? And what is the technology roadmap for these features? I will expand on some of these techniques in a future blog post.

 

Follow Frank Rayal on Twitter @FrankRayal

Read the original blog entry...

More Stories By Deborah Strickland

The articles presented here are blog posts from members of our Service Provider Mobility community. Deborah Strickland is a Web and Social Media Program Manager at Cisco. Follow us on Twitter @CiscoSPMobility.
