What are you waiting for?

The future of HTTP is here, or almost here. It has been five years since SPDY was first introduced as a better way to deliver web sites, and a lot has happened since then:

  • Chrome, Firefox, Opera and some IE installations support SPDY.
  • SPDY evolved from v2 to v3 to v3.1.
  • Sites like Google, Facebook, Twitter, and WordPress, to name just a few, are available via SPDY.
  • F5 announced availability of a SPDY Gateway.
  • The IETF HTTP working group announced SPDY is the starting point for HTTP/2.
  • And most recently, Apple announced that Safari 8, due out this fall, will support SPDY! This means that all major browsers will support SPDY by the end of the year.

The IETF is also scheduled to have the HTTP/2 draft finalized by the end of the year. This week the IETF working group published the latest draft of the HTTP/2 spec, and the hope is that this version will become the proposed RFC.

The Internet Explorer team posted a blog at the end of May indicating that they have HTTP/2 in development for a future version of IE. There is no commitment on whether this will land in IE 12 or a later version, but they are preparing for the shift. We at F5 have been following the evolution of the spec and developing prototypes based on the various interoperability drafts to make sure we are ready to implement an HTTP/2 gateway as soon as possible. So what are you waiting for? Why are you not using SPDY on your site?

Using SPDY today lets you see how HTTP/2 may impact your applications and infrastructure. HTTP/2 is not a new protocol: it makes no changes to HTTP semantics, and it does not obsolete the existing HTTP/1.1 message syntax. If it's not a new protocol and it doesn't obsolete HTTP/1.1, what is HTTP/2 exactly? Per the draft's abstract:

This specification describes an optimized expression of the syntax of
   the Hypertext Transfer Protocol (HTTP).  HTTP/2 enables a more
   efficient use of network resources and a reduced perception of
   latency by introducing header field compression and allowing multiple
   concurrent messages on the same connection.  It also introduces
   unsolicited push of representations from servers to clients.

   This specification is an alternative to, but does not obsolete, the
   HTTP/1.1 message syntax.  HTTP's existing semantics remain unchanged.

HTTP/2 allows communication to occur with less data transmitted over the network, and with the ability to send multiple requests and responses across a single connection, out of order and interleaved. Oh, and all of it over SSL.

Let's look at these in a little more detail. Sending less data has always been a good thing, but just how much improvement can be achieved by compressing headers? It turns out quite a bit. Headers carry a lot of repetitive information: cookies, encoding types, and cache settings, to name just a few. With all that repetition, compression can really help. Comparing the amount of downloaded data for a web page delivered over HTTP and over SPDY shows just how much can be saved. Below is a sample of 10 objects delivered over HTTP and SPDY; the per-object savings add up to 1,762 bytes. That doesn't sound like much, but we're only talking about 10 objects. The average home page now has close to 100 objects on it, and the total number of hits to your website is likely well above that. If your website gets 1 million hits a day, extrapolating this out, the savings become 168 MB; at 10 million hits, the savings near 1.7 GB. Over the course of a month or a year, these savings add up.

URL                                                   HTTP   SPDY  Byte Savings
https://.../SitePages/Home.aspx                      29179  29149            30
https://.../_layouts/1033/core.js                    84457  84411            46
https://.../_layouts/sp.js                           71834  71751            83
https://.../_layouts/sp.ribbon.js                    57999  57827           172
https://.../_layouts/1033/init.js                    42055  41864           191
https://.../_layouts/images/fgimg.png                20478  20250           228
https://.../_layouts/images/homepageSamplePhoto.jpg  16935  16704           231
https://.../ScriptResource.axd                       27854  27617           237
https://.../_layouts/images/favicon.ico               5794   5525           269
https://.../_layouts/blank.js                          496    221           275
Total                                                                      1762
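The extrapolation above can be sketched in a few lines. This is a back-of-the-envelope projection, not a measurement: it assumes every object served saves the average amount observed in the 10-object sample, and the traffic figures are the hypothetical ones used in the text.

```python
# Projected bandwidth savings from the measured sample:
# 1,762 bytes saved across 10 objects, i.e. ~176 bytes per object.
SAVINGS_BYTES = 1762
OBJECTS_SAMPLED = 10

def daily_savings_mb(hits_per_day: int) -> float:
    """Projected daily savings in MB, assuming the per-object average holds."""
    per_hit = SAVINGS_BYTES / OBJECTS_SAMPLED
    return hits_per_day * per_hit / (1024 * 1024)

print(f"{daily_savings_mb(1_000_000):.0f} MB/day at 1M hits")        # 168 MB/day at 1M hits
print(f"{daily_savings_mb(10_000_000) / 1024:.1f} GB/day at 10M hits")
```

At 10 million hits the projection works out to roughly 1.6 GB per day, in line with the "nears 1.7 GB" figure above.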

SPDY performed header compression via deflate, which was later found to be vulnerable to the CRIME attack. As a result, HTTP/2 uses HPACK, an HTTP-header-specific compression scheme that is not vulnerable to CRIME.
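The redundancy that makes headers so compressible is easy to demonstrate with plain DEFLATE, the scheme SPDY used. This is illustrative only: the header values below are invented, and HTTP/2's HPACK works differently (static/dynamic tables rather than a generic compressor), but the underlying redundancy it exploits is the same.

```python
import zlib

# Ten requests from the same browser to the same site: the cookie,
# user agent, and accept headers repeat on every single request.
header_template = (
    "GET /page{} HTTP/1.1\r\n"
    "Host: www.example.com\r\n"
    "User-Agent: Mozilla/5.0 (Windows NT 6.1; rv:30.0)\r\n"
    "Accept-Encoding: gzip, deflate\r\n"
    "Cookie: session=abc123; prefs=dark; tracking=xyz789\r\n\r\n"
)

raw = "".join(header_template.format(i) for i in range(10)).encode()
compressed = zlib.compress(raw)

# The compressed form is a small fraction of the raw size because
# nearly every byte after the first request is a repeat.
print(f"raw: {len(raw)} bytes, deflated: {len(compressed)} bytes")
```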

The next element to examine is the ability to send multiple requests and responses across a single connection, out of order and interleaved. We all know that latency can have a big impact on page load times and the end-user experience. This is why HTTP/1.1 allowed for keep-alives, eliminating the need to perform a three-way handshake for each and every request. After keep-alives came domain sharding, and browsers eventually changed their default behavior to allow more than 2 concurrent TCP connections per host. The downside of multiple TCP connections is having to conduct the three-way handshake multiple times. Wouldn't things be easier if all requests could just be sent over a single TCP connection? This is what HTTP/2 provides, and not only that: the responses can be returned in a different order than the one in which they were requested.
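The out-of-order behavior can be sketched with a toy simulation. This is not real HTTP/2 framing; it just models several streams sharing one connection, with invented per-response delays, to show that a slow response no longer blocks the ones behind it.

```python
import asyncio

# Toy model of HTTP/2 multiplexing: three "streams" are requested in
# order 1, 2, 3, but each completes as soon as the server finishes it.
# The delays below are made up for illustration.

async def fetch(stream_id: int, delay: float, completed: list) -> None:
    await asyncio.sleep(delay)       # stand-in for server processing time
    completed.append(stream_id)

async def main() -> list:
    completed: list = []
    await asyncio.gather(
        fetch(1, 0.03, completed),   # requested first, slowest to produce
        fetch(2, 0.01, completed),
        fetch(3, 0.02, completed),
    )
    return completed

order = asyncio.run(main())
print("completion order:", order)    # [2, 3, 1] -- stream 1 no longer blocks the rest
```

Under HTTP/1.1 pipelining, stream 1 would have held up streams 2 and 3 (head-of-line blocking); with multiplexed streams, each response is delivered as it becomes ready.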

 

[Figure: HTTP2]

Now onto the SSL component. HTTP/2 requires strong crypto: 128-bit EC or 2048-bit RSA keys. This requirement will be enforced by browsers and cannot be disabled. With the ever-growing number of attacks, having SSL everywhere is a good thing, but there are performance and reporting ramifications to encrypting all data. Organizations that deploy solutions to monitor, classify, and analyze Internet traffic may no longer be able to do so.
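The kind of minimum-strength check browsers will enforce can be sketched as a simple policy function. The thresholds below are the ones named in this article (128-bit EC, 2048-bit RSA); the actual HTTP/2 TLS requirements are more detailed, so treat this as an illustration of the idea rather than a complete implementation.

```python
# Minimum key strengths for HTTP/2, per the figures cited in the article.
MIN_KEY_BITS = {"EC": 128, "RSA": 2048}

def meets_http2_minimum(key_type: str, key_bits: int) -> bool:
    """Return True if the server key meets the article's stated minimums."""
    required = MIN_KEY_BITS.get(key_type.upper())
    return required is not None and key_bits >= required

print(meets_http2_minimum("RSA", 1024))  # False: a 1024-bit RSA key is too weak
print(meets_http2_minimum("EC", 256))    # True: 256-bit EC clears the 128-bit bar
```

A server still running on a 1024-bit RSA certificate would fail this check, which is exactly the sort of deployment that needs attention before enabling HTTP/2.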

All the changes coming in HTTP/2 have the potential to impact how an application is rendered and how infrastructure components react. What are the consequences of having all requests and responses transmitted over SSL? Can the network support 50 concurrent requests for objects? Does the page render properly for the end user if objects are received out of order? On the positive side, you could end up with improved page load times and a reduction in the amount of data transferred. Stop waiting, and start enabling the future of the web today.


More Stories By Dawn Parzych

Dawn Parzych is a Technical Product Marketing Manager at Instart Logic. Dawn has had a passion for web performance for over 15 years with a focus on how to make the web faster. As a technical product marketing manager at Instart Logic, she researches and writes about trends in the web performance space and how they impact the user experience. Prior to joining Instart Logic, Dawn worked at F5 Networks, Gomez & Empirix.
