.NET and SharePoint Performance

Don’t let default settings ruin your end user experience

SharePoint is a popular choice for intranet applications, so it is important that it performs well enough to support employee productivity. Waiting ten seconds just to load the initial dashboard doesn't do that. At a recent customer engagement we identified an interesting source of a potential performance problem that impacts all SharePoint and .NET-based installations that use the ServicePointManager to access web services. It turns out that ServicePointManager ships with a default setting that allows only two concurrent connections per host. If you have a SharePoint dashboard that queries data from more than two data sources, your end users will suffer very long page load times.

In this blog I explain the steps one of our customers took to identify and solve this particular problem. The issue has a broader impact beyond SharePoint, and I suggest all .NET developers look into it.

Step 1: Identify Problematic Pages
The first question is: do we have a problem or not? You can either wait for your end users to start complaining to the help desk or be proactive and look at end-user response times. The following screenshot shows data from each individual visitor who accessed the SharePoint application. Focusing on one visit that was flagged as a frustrating experience reveals the individual pages visited. Starting with an extremely costly first page (43 seconds to load), the visitor also suffered extremely slow response times for the following two actions (clicking Home and reloading Default.aspx):

More than six seconds were spent on server-side processing alone. There was also a very long wait at the client (JS time) for the initial page.

The long wait time at the client (high JS time) could be explained by inefficient JavaScript that caused problems especially on older browsers such as Internet Explorer 7. The server-side problem, however, impacts every user, regardless of the browser used.

Step 2: Identify Problematic Web Parts
Drilling into the details of the Default.aspx page for that particular visitor shows which Web Parts are involved in the page, as well as where these Web Parts spend most of their time:

Web Parts such as the DataFormWebPart or ContentEditorWebPart spend most of their time waiting on resources

Looking at the details shows us that these Web Parts spawn multiple background threads to retrieve data from different back-end web services and then wait (synchronize) on those threads to finish their work. A closer look also reveals that each of these threads spends a significant amount of time in I/O:

The Web Parts spawn multiple background threads. These threads all perform I/O operations that take a very long time (up to 5.8s).
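To make the pattern concrete, here is a minimal sketch of what these Web Parts effectively do: fan out one background thread per data source and block the page render until every thread has finished. The class name and endpoint URLs are hypothetical; the fan-out/join structure is the point.

```csharp
using System;
using System.Net;
using System.Threading;

class DashboardDataFetcher
{
    static void Main()
    {
        // Illustrative back-end endpoints - not the customer's actual services.
        string[] dataSources =
        {
            "http://backend/listdata.svc",
            "http://backend/search.asmx",
            "http://backend/profiles.asmx"
        };

        var threads = new Thread[dataSources.Length];
        for (int i = 0; i < dataSources.Length; i++)
        {
            string url = dataSources[i];
            threads[i] = new Thread(() =>
            {
                // Before any bytes go out, each request must acquire one of the
                // outgoing connections managed by the ServicePoint for this host.
                // With the default limit of 2, the third and later threads queue here.
                var request = (HttpWebRequest)WebRequest.Create(url);
                using (var response = request.GetResponse())
                {
                    // read and process the response stream
                }
            });
            threads[i].Start();
        }

        // The page render blocks here until every thread is done, so the
        // slowest (or most-queued) request dictates the page load time.
        foreach (var t in threads) t.Join();
    }
}
```

With only two connections available, the total time approaches the *sum* of the slow calls rather than the time of the slowest one, which is exactly the serialization we observed in the thread view.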

Step 3: Identify Root Cause of Problem
Why are all of these background threads taking so much time? Expanding the internals of the first call shows that it takes 5.8 seconds until the web service sends a response back on the socket. So that explains why the first asynchronous thread takes 5.8 seconds:

First web service call takes 5.8s until we receive data on the socket that is used internally by the ServicePoint implementation.

The other web service calls executed by the remaining background threads follow much the same execution path, spending most of their time waiting for an available outgoing connection:

We have a total of 10 background threads that try to execute a web service call. Most of them spend their time waiting in the ServicePoint instead of sending the actual request.

Step 4: Fix the Problem
A little research (i.e., using your favorite search engine) turns up the following Stack Overflow post, Max Number of concurrent connections, which ultimately leads to the MSDN documentation for ServicePointManager, where we learn that the default limit for concurrent connections is two.

The default of two concurrent connections forces web service requests executing in parallel to wait until a connection becomes free.
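You can inspect this limit yourself. The sketch below (a plain console app, with an assumed host URL) reads the process-wide default and the per-host ServicePoint that inherits it; `FindServicePoint`, `ConnectionLimit`, and `CurrentConnections` are the actual .NET APIs involved. Note that the default of 2 applies to client applications; hosting environments may configure a different value.

```csharp
using System;
using System.Net;

class InspectConnectionLimit
{
    static void Main()
    {
        // Process-wide default applied to every new ServicePoint.
        Console.WriteLine("Default limit: " +
            ServicePointManager.DefaultConnectionLimit);

        // Each remote host gets its own ServicePoint; its ConnectionLimit
        // caps concurrent outgoing connections to that host.
        ServicePoint sp =
            ServicePointManager.FindServicePoint(new Uri("http://backend/"));
        Console.WriteLine("Host limit:    " + sp.ConnectionLimit);
        Console.WriteLine("In use now:    " + sp.CurrentConnections);
    }
}
```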

So the solution to this problem is to change the default value. In this case it should be set to at least the number of Web Parts allowed on a single dashboard, because most of them execute asynchronous web service calls.
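In practice the change is a one-liner at application startup, or a web.config entry. The value 12 below is illustrative; size it to the number of Web Parts that issue concurrent web service calls, as described above.

```csharp
using System.Net;

class Startup
{
    // Call once at application startup, e.g., from Application_Start
    // in Global.asax of the SharePoint web application.
    public static void RaiseConnectionLimit()
    {
        // Applies to all ServicePoints created after this point.
        ServicePointManager.DefaultConnectionLimit = 12;
    }
}
```

The equivalent configuration-only fix, which requires no code change, uses the standard connectionManagement element:

```xml
<!-- web.config: applies to all outgoing hosts ("*") -->
<system.net>
  <connectionManagement>
    <add address="*" maxconnection="12" />
  </connectionManagement>
</system.net>
```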

Next Steps
This problem is obviously not only relevant for custom SharePoint development but can impact any .NET application that uses ServicePointManager. It was a very interesting case with this customer as they only used out-of-the-box components provided by SharePoint and still ran into this problem. If you are implementing custom SharePoint solutions I also recommend checking out our blog series about the Top SharePoint Performance Problems when implementing your custom Web Parts and Web Controls starting with: How to Avoid the Top 5 SharePoint Performance Problems.

More Stories By Andreas Grabner

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi
