What are you waiting for?

The future of HTTP is here, or almost here. It has been five years since SPDY was first introduced as a better way to deliver websites. A lot has happened since then:

  • Chrome, Firefox, Opera and some IE installations support SPDY.
  • SPDY evolved from v2 to v3 to v3.1.
  • Sites such as Google, Facebook, Twitter, and WordPress, to name just a few, are available via SPDY.
  • F5 announced availability of a SPDY Gateway.
  • The IETF HTTP working group announced that SPDY would be the starting point for HTTP/2.
  • And most recently, Apple announced that Safari 8, due out this fall, will support SPDY! This means that all major browsers will support SPDY by the end of the year.

The IETF is also scheduled to have the HTTP/2 draft finalized by the end of the year, and this week the working group published the latest draft of the HTTP/2 spec. The hope is that this will be the version that becomes the proposed RFC.

The Internet Explorer team posted a blog at the end of May indicating that they have HTTP/2 in development for a future version of IE. There is no commitment as to whether this will be in IE 12 or another version, but they are preparing for the shift. We at F5 have been following the evolution of the spec and developing prototypes based on the various interoperability drafts to make sure we are ready to implement an HTTP/2 gateway as soon as possible. So what are you waiting for? Why are you not using SPDY on your site?

Using SPDY today allows you to see how HTTP/2 may impact your applications and infrastructure. HTTP/2 is not a new protocol: there are no changes to the HTTP semantics, and it does not obsolete the existing HTTP/1.1 message syntax. If it’s not a new protocol and it doesn’t obsolete HTTP/1.1, what is HTTP/2 exactly? Per the draft’s abstract:

   This specification describes an optimized expression of the syntax of
   the Hypertext Transfer Protocol (HTTP).  HTTP/2 enables a more
   efficient use of network resources and a reduced perception of
   latency by introducing header field compression and allowing multiple
   concurrent messages on the same connection.  It also introduces
   unsolicited push of representations from servers to clients.

   This specification is an alternative to, but does not obsolete, the
   HTTP/1.1 message syntax.  HTTP's existing semantics remain unchanged.

HTTP/2 allows communication to occur with less data transmitted over the network and with the ability to send multiple requests and responses across a single connection, out of order and interleaved. Oh, and it all runs over SSL.

Let’s look at these in a little more detail. Sending less data has always been a good thing, but just how much improvement can be achieved by compressing headers? It turns out quite a bit. Headers carry a lot of repetitive information: cookies, encoding types, and cache settings, to name just a few. With all this repetition, compression can really help. Comparing the amount of downloaded data for a web page delivered over HTTP and over SPDY shows just how much can be saved. Below is a sample of 10 objects delivered over HTTP and SPDY; the per-object byte savings add up to a total of 1762 bytes. That doesn’t sound like much, but we’re only talking about 10 objects. The average home page now has close to 100 objects on it, and I’m sure the total number of hits to your website is well over that number. If your website gets 1 million hits a day, extrapolating this out the savings become about 168 MB per day; if the hits are closer to 10 million, the savings near 1.7 GB. Over the course of a month or a year, these savings start to add up. (A quick sketch of this arithmetic follows the table.)

Object                                               HTTP (bytes)  SPDY (bytes)  Savings (bytes)
https://.../SitePages/Home.aspx                             29179         29149               30
https://.../_layouts/1033/core.js                           84457         84411               46
https://.../_layouts/sp.js                                  71834         71751               83
https://.../_layouts/sp.ribbon.js                           57999         57827              172
https://.../_layouts/1033/init.js                           42055         41864              191
https://.../_layouts/images/fgimg.png                       20478         20250              228
https://.../_layouts/images/homepageSamplePhoto.jpg         16935         16704              231
https://.../ScriptResource.axd                              27854         27617              237
https://.../_layouts/images/favicon.ico                      5794          5525              269
https://.../_layouts/blank.js                                 496           221              275
Total                                                                                       1762
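
To sanity-check that arithmetic, here is a minimal Python sketch. The per-object savings come straight from the table above; the daily hit counts are the same hypothetical volumes used in the paragraph, not measurements:

   # Per-object byte savings, taken from the table above.
   savings = [30, 46, 83, 172, 191, 228, 231, 237, 269, 275]

   total = sum(savings)               # 1762 bytes across these 10 objects
   per_object = total / len(savings)  # roughly 176 bytes saved per object

   # Extrapolate to the hypothetical traffic volumes from the text.
   for hits in (1_000_000, 10_000_000):
       saved = per_object * hits
       print(f"{hits:,} hits/day -> about {saved / 2**20:,.0f} MB saved per day")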

SPDY performed header compression via deflate, which was discovered to be vulnerable to the CRIME attack. As a result, HTTP/2 uses HPACK, a compression scheme designed specifically for HTTP headers that is not vulnerable to CRIME.
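
To see why a shared, stateful compression context saves so much on repetitive headers (and, incidentally, why it leaks information, which is exactly what CRIME exploits), here is a small Python sketch; the header block is made up for illustration:

   import zlib

   # Two requests whose headers differ only in the path; the cookie,
   # user agent, and accept headers repeat verbatim.
   headers1 = (b"GET /page1 HTTP/1.1\r\n"
               b"Host: example.com\r\n"
               b"Cookie: session=abc123; theme=dark\r\n"
               b"User-Agent: Mozilla/5.0\r\n"
               b"Accept-Encoding: gzip, deflate\r\n\r\n")
   headers2 = headers1.replace(b"/page1", b"/page2")

   comp = zlib.compressobj()  # one deflate context per connection, as in SPDY
   first = comp.compress(headers1) + comp.flush(zlib.Z_SYNC_FLUSH)
   second = comp.compress(headers2) + comp.flush(zlib.Z_SYNC_FLUSH)

   # The second header block costs only a handful of bytes, because the
   # shared context already contains all of the repeated text.
   print(len(headers1), "raw bytes ->", len(first), "compressed (first request)")
   print(len(headers2), "raw bytes ->", len(second), "compressed (second request)")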

The next element to examine is the ability to send multiple requests and responses across a single connection, out of order and interleaved. We all know that latency can have a big impact on page load times and the end-user experience. This is why HTTP/1.1 allowed for keep-alives, eliminating the need to perform a three-way handshake for each and every request. After keep-alives came domain sharding, and browsers eventually changed their default behavior to allow more than two concurrent TCP connections. The downside of multiple TCP connections is having to conduct the three-way handshake multiple times; wouldn’t things be easier if all requests could just be sent over a single TCP connection? This is what HTTP/2 provides, and what’s more, the responses can be returned in a different order than the one in which they were requested.
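
As a toy illustration (this is not the real HTTP/2 framing layer, just the idea behind it), the Python sketch below interleaves frames from three streams on one connection and reassembles each response independently, so a fast response is never stuck behind a slow one:

   from collections import defaultdict

   # Toy frames: (stream_id, is_last, payload). In HTTP/2 every frame
   # carries a stream identifier, which is what lets one TCP connection
   # carry many responses, interleaved and completed out of order.
   frames = [
       (1, False, b"<html>"),
       (3, True,  b"tiny.js"),       # stream 3 finishes first...
       (5, False, b"photo-part-1"),
       (1, True,  b"</html>"),       # ...then stream 1...
       (5, True,  b"photo-part-2"),  # ...and the large image last.
   ]

   streams = defaultdict(bytes)
   for stream_id, is_last, payload in frames:
       streams[stream_id] += payload
       if is_last:
           print(f"stream {stream_id} complete: {streams[stream_id]!r}")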



Now on to the SSL component. HTTP/2 requires strong crypto: 128-bit EC or 2048-bit RSA. This requirement will be enforced by browsers and cannot be disabled. With the ever-growing number of attacks, having SSL everywhere is a good thing, but there are performance and reporting ramifications to encrypting all data. Organizations that deploy solutions to monitor, classify, and analyze Internet traffic may no longer be able to do so.
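
If you want to see what your own site negotiates today, here is a quick sketch using only the Python standard library (replace www.example.com with your host; it reports the cipher for the one connection it makes, nothing more):

   import socket
   import ssl

   host = "www.example.com"  # replace with your own site

   ctx = ssl.create_default_context()
   with socket.create_connection((host, 443)) as sock:
       with ctx.wrap_socket(sock, server_hostname=host) as tls:
           name, version, bits = tls.cipher()
           print(f"{version} {name} ({bits}-bit)")
           # HTTP/2 mandates strong crypto, so a cipher under 128 bits
           # here would be a problem for HTTP/2 clients.
           if bits < 128:
               print("cipher too weak for HTTP/2")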

All the changes coming in HTTP/2 have the potential to impact how an application is rendered and how infrastructure components react. What are the consequences of having all requests and responses transmitted over SSL? Can the network support 50 concurrent requests for objects? Does the page render properly for the end user if objects are received out of order? On the positive side, you could end up with improved page load times and a reduction in the amount of data transferred. Stop waiting and start enabling the future of the web today.


More Stories By Dawn Parzych

Dawn Parzych is a Technical Product Marketing Manager at Instart Logic. Dawn has had a passion for web performance for over 15 years with a focus on how to make the web faster. As a technical product marketing manager at Instart Logic, she researches and writes about trends in the web performance space and how they impact the user experience. Prior to joining Instart Logic, Dawn worked at F5 Networks, Gomez & Empirix.
