The Hype & Reality of Small Cells Performance


Heterogeneous networks (HetNets) consist of large (macro) cells with high transmit power (typically 5 W – 40 W) and small cells with low transmit power (typically 100 mW – 2 W). The small cells are distributed beneath the large cells and can run on the same frequency as the macrocell (co-channel) or on a different frequency. As an evolution of the cellular architecture, HetNets and small cells have gained much attention as a technique to increase mobile network capacity and are today one of the hot topics in the wireless industry. Many of the initial deployments of small cells are of the co-channel type. Standards such as LTE focused early releases on techniques to improve the performance of co-channel deployments, leaving multi-frequency deployments to later releases. In all, operators today have multiple small cell deployment scenarios, operational techniques and technology roadmaps to choose from.

 


Figure 1 Simplified Heterogeneous Network Architecture.

 

To illustrate some of the deployment issues related to small cells, I will provide in this article a qualitative review of small cell performance and explore its impact on the operator's small cell deployment strategy. The focus is on co-channel deployments which, aside from being common at this early stage of HetNet evolution, present a complex radio frequency environment.

 

Throughput Performance: The overall throughput experienced by users on both downlink (base station to the mobile subscriber) and uplink (mobile to base station) paths will generally increase as small cells are deployed. This applies to both users camped on the macro cell and those on the small cells, but for different reasons:

 

  1. Users on the macro cell benefit as more small cells are added because fewer users share the macrocell's common capacity resources. The more small cells are added, the more likely a macro user is to experience higher throughput.
  2. Users on the small cells will experience better throughput than those on the macrocell because of the higher probability of a line-of-sight connection to the serving base station.

 

If the mobile subscribers are uniformly distributed over the coverage area, then the likelihood that a user experiences a given level of throughput stays roughly the same as the number of small cells increases. But in reality, the distribution of users is not uniform: they tend to concentrate in certain "traffic hotspots." In this case, a small cell in a traffic hotspot serves more users and is therefore expected to provide lower per-user throughput than a small cell in an area of uniform user distribution. Meanwhile, a user on the macrocell will experience a more pronounced increase in throughput because a higher proportion of users is offloaded from the macrocell. As even more small cells are added, interference increases, leading to successively diminishing marginal gains in throughput.
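The offloading argument can be made concrete with a deliberately simple toy model (all capacities, user counts and the offload rule below are illustrative assumptions, not figures from any deployment): per-user throughput is taken as cell capacity divided by the number of users sharing the cell, and each small cell offloads a fixed number of users from the macrocell.

```python
# Toy HetNet capacity model: per-user rate = cell capacity / users on the cell.
# All numbers are assumptions for illustration only.

def per_user_throughput(total_users, small_cells, macro_capacity=50.0,
                        small_capacity=25.0, users_per_small_cell=4):
    """Return (macro_user_rate, small_user_rate) in Mbps.

    Each small cell is assumed to offload `users_per_small_cell` users
    from the macrocell.
    """
    offloaded = min(total_users, small_cells * users_per_small_cell)
    macro_users = total_users - offloaded
    macro_rate = macro_capacity / macro_users if macro_users else float("inf")
    small_rate = small_capacity / users_per_small_cell if small_cells else 0.0
    return macro_rate, small_rate

# With 20 users and no small cells, each macro user gets 50/20 = 2.5 Mbps;
# four small cells offload 16 users, lifting the macro rate to 50/4 = 12.5 Mbps.
no_cells = per_user_throughput(20, 0)
four_cells = per_user_throughput(20, 4)
```

The model ignores interference, so it only captures the first half of the story: the offload gain, not the diminishing returns that follow as cell density grows.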

 

This last note is an important one: small cells are beneficial only up to a point. User experience is affected by small cell density, with diminishing marginal returns followed by actual degradation of service once the number of small cells exceeds a certain threshold. Where this threshold lies depends on a number of factors, including the type of technology, the morphology, and the cell density and distribution. Inter-small-cell interference is one factor that limits small cell performance. Another is that adding more small cells creates more 'cell-edge' regions within the macrocell's coverage area, which can also limit performance, as I will expand upon below.

 

The throughput performance will depend on the location of the small cells and their proximity to macrocells. A small cell close to a macrocell is more likely to be affected by interference than one located at the cell edge, resulting in lower throughput. Correspondingly, performance depends on the size of the macrocell, or rather the macrocell density: small cells deployed near the edge of a large macrocell will perform better than those deployed in a high-density macrocell area where the average cell radius is relatively small.

 

Throughput performance will also depend on the output power of the small cell. Simulations show that for a certain macrocell radius, higher power small cells provide better throughput performance than lower power ones given the same small cell base station density.

 

Nevertheless, the key takeaway here is this: it pays to find out where the traffic hotspots are; otherwise, the gain achieved from small cells will be small. Small cell deployment has to be 'surgical', targeting select areas to achieve the maximum return on investment.

 

Interference and Coverage Performance: While small cells generally improve performance, in certain situations they cause interference or even coverage holes. One decisive factor is the large power imbalance between the small cell and the macrocell. The imbalance is larger than the difference in rated transmit power alone suggests, because macrocells use high-gain sectored antennas (13-16 dBi) while small cells typically use much lower gain omni-directional antennas (3-6 dBi). The power imbalance results in asymmetric downlink and uplink coverage areas: because the macrocell transmits at much higher power than the small cell, the small cell's downlink coverage area is smaller than its uplink coverage area. This shifts the handover boundary closer to the small cell, increasing the possibility of uplink interference to the small cell from an interfering mobile that may have a line-of-sight path to it. This type of interference is potentially very damaging since it affects all users in the cell and forces the mobiles served by the small cell to transmit at higher power. The power imbalance also increases the risk of downlink interference, although that type is more limited in impact because it affects a single user. The uplink-downlink imbalance is a leading reason why small cell gains are limited in LTE Release 8, where cell selection is decided by downlink signal strength and the options for interference mitigation are limited.
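The handover-boundary shift can be sketched numerically. The one-dimensional example below (all transmit powers, antenna gains, site spacing and path-loss parameters are illustrative assumptions) places a macrocell and a small cell 1,000 m apart and searches for the point where the small cell's downlink signal first exceeds the macrocell's. With equal path loss to both sites, the uplink boundary sits at the 500 m midpoint, while the assumed 26 dB EIRP imbalance pushes the downlink boundary far toward the small cell.

```python
import math

# 1-D sketch of the downlink handover boundary. Macro at x = 0 m,
# small cell at x = 1000 m. All parameters are illustrative assumptions.
MACRO_EIRP = 46 + 15   # 40 W (46 dBm) plus a 15 dBi sector antenna
SMALL_EIRP = 30 + 5    # 1 W (30 dBm) plus a 5 dBi omni antenna

def path_loss_db(d_m, exponent=3.5, pl_1m=40.0):
    """Log-distance path loss: PL = PL(1 m) + 10 * n * log10(d)."""
    return pl_1m + 10 * exponent * math.log10(max(d_m, 1.0))

def dl_boundary(separation=1000.0, step=1.0):
    """Position where the small cell's downlink signal first beats the macro's."""
    x = step
    while x < separation:
        macro_rx = MACRO_EIRP - path_loss_db(x)
        small_rx = SMALL_EIRP - path_loss_db(separation - x)
        if small_rx > macro_rx:
            return x
        x += step
    return separation

# Equal path loss (the uplink boundary) falls at 500 m; the downlink
# boundary falls well past it, i.e. much closer to the small cell.
boundary = dl_boundary()
```

Under these assumptions the downlink boundary lands roughly 150 m from the small cell instead of at the 500 m midpoint, which is exactly the asymmetry that exposes the small cell's uplink to macro-attached mobiles standing inside that gap.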

 


Figure 2 Co-channel interference scenarios in small cell deployments.

 

To address the uplink-downlink coverage imbalance, the coverage area of the small cell base station is extended to allow the small cell to capture more traffic. This is accomplished by adding a bias to the small cell received signal during the cell selection process. But extending the small cell coverage also increases the chances of downlink interference to a mobile subscriber operating at the edge of the small cell.
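In LTE this bias technique is known as cell range expansion. A minimal sketch of biased cell selection follows (the cell names, RSRP values and the 6 dB bias are hypothetical, chosen only to show the mechanism):

```python
def serving_cell(rsrp_dbm, bias_db=None):
    """Pick the serving cell by measured RSRP plus an optional per-cell bias.

    rsrp_dbm: dict mapping cell name -> downlink RSRP in dBm.
    bias_db:  dict mapping cell name -> selection bias in dB
              (e.g. a cell-range-expansion offset applied to small cells).
    """
    bias_db = bias_db or {}
    return max(rsrp_dbm, key=lambda cell: rsrp_dbm[cell] + bias_db.get(cell, 0.0))

measurements = {"macro": -80.0, "small": -84.0}

# Unbiased selection keeps the user on the stronger macro signal...
assert serving_cell(measurements) == "macro"

# ...while a 6 dB range-expansion bias pulls the user onto the small cell,
# even though its downlink is 4 dB weaker.
assert serving_cell(measurements, {"small": 6.0}) == "small"
```

The second assertion also illustrates the cost mentioned above: the user is now served by a downlink that is genuinely weaker, so it sits in an expanded edge region exposed to macro interference.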

 

Aside from co-channel interference, there is also a risk of adjacent-channel interference in multicarrier networks where macrocells implement two or more frequency carriers. Consider, for example, a mobile attached to a macrocell on frequency A while very close to a small cell operating on adjacent frequency B. The mobile is susceptible to adjacent-channel interference from the small cell, which is likely to have a line-of-sight path to the mobile, in contrast to the non-line-of-sight connection with the macrocell. Another example occurs on the uplink: a mobile attached to a macrocell while operating at the edge of a small cell on an adjacent frequency can cause interference to that small cell.

 

There are other potential interference scenarios in addition to those described here. But the basic fact is that the actual performance and benefit of small cells will vary, and will vary more widely in the absence of interference mitigation and performance-enhancing techniques. This is one reason why some requirements for small cell deployments have been hotly debated without firm resolution. For example, a basic requirement is small cell backhaul capacity: what should it be? Should the backhaul link be designed to handle the peak throughput rate, which is a function of the technology, or the average throughput rate, which is much harder to ascertain and put a value on because it depends on many factors related to the deployment scenario?

 

Based on the above, we know that the throughput of small cells will depend largely on the load. The more clustered the subscribers, the lower the overall per-user small cell throughput; under light load (few users), on the other hand, the available capacity per user will be high. If you are an operator, you would certainly need to think carefully about the required backhaul capacity! And while we are on the backhaul topic, let's not forget that the macrocell's backhaul must also be dimensioned to support the higher traffic load that will certainly come as more small cells are deployed.
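One way to see the peak-versus-average trade-off is a simple sizing sketch. The rule below (one possible rule of thumb, not a standard, and all rates are assumed rather than measured) provisions a shared backhaul for one cell bursting at its peak rate while the remaining cells carry their busy-hour average:

```python
def backhaul_capacity_mbps(n_cells, peak_mbps, busy_hour_avg_mbps):
    """Aggregate backhaul estimate for a cluster of small cells:
    one cell bursting at peak while the rest carry busy-hour average load.
    An illustrative rule of thumb, not an operator dimensioning standard.
    """
    if n_cells == 0:
        return 0.0
    return peak_mbps + (n_cells - 1) * busy_hour_avg_mbps

# Four small cells at an assumed 150 Mbps peak and 20 Mbps busy-hour
# average each: 150 + 3 * 20 = 210 Mbps, far below the 4 * 150 = 600 Mbps
# that pure peak-rate provisioning would demand.
cluster = backhaul_capacity_mbps(4, 150, 20)
```

The gap between the two figures is exactly why the backhaul question is contentious: the cheaper average-based number is only safe if the traffic estimates behind it are right.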

 

In this post, I went through some aspects of small cell performance. These problems are well recognized, and techniques are being developed and integrated into the standards to address them. This raises other important questions for the operator's strategic network planning process, such as: which interference management and performance enhancement features should be considered? And what is the technology roadmap for these features? I will expand on some of these techniques in a future blog post.

 

Follow Frank Rayal on Twitter @FrankRayal

