Should All Web Traffic Be Encrypted?

Photo Credits: Tim Gage

The Internet is not a secure channel for the transmission of private data. In spite of the availability of the HTTPS protocol, which uses TLS to encrypt traffic, the bulk of the data flowing between servers and web clients remains unprotected.

We’re all well aware of the consequences of insecure communication, and, whatever stand you may take on the recent series of revelations concerning security, there’s no doubt there is a serious problem with online privacy. All of us who work in the Web services industry, including software developers and web hosting companies, depend on a healthy online ecosystem that engenders trust in our users. A lack of trust has the potential to seriously degrade the online marketplace from which we all benefit.

A recently published letter from the HTTP working group, which is currently shaping the forthcoming HTTP/2 protocol, suggests that all data traveling across the open Internet be encrypted. The letter outlines three main methods by which this might be accomplished.

First, opportunistic encryption of HTTP URIs without server authentication. Second, opportunistic encryption with server authentication. Third, using HTTP/2 only with HTTPS URIs.

The first of these options is not particularly appealing: although it would be less expensive to implement, the lack of server authentication via certificate authorities means it would be trivial to spoof a site's identity.

The second option is more appealing than the first, because it essentially offers HTTP URIs the same level of protection we currently have with HTTPS. However, it would impose a significant burden on site owners, who would have to obtain certificates and configure authentication on their servers, potentially delaying the uptake of HTTP/2.
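
To make the difference between the first two options concrete, here is a minimal Go sketch (the host example.com is only a placeholder) contrasting an encrypted-but-unauthenticated handshake with an ordinary authenticated one. With certificate verification switched off, the client will happily complete a handshake with whoever answers, which is exactly why the first option is so easy to spoof.

    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
    )

    func main() {
        // Roughly what "opportunistic encryption without server authentication"
        // amounts to: the channel is encrypted, but the client never checks who
        // is on the other end, so an attacker can present any certificate at all.
        unauthenticated := &tls.Config{InsecureSkipVerify: true}

        // The authenticated variant: verify the server's certificate chain
        // against the system's trusted certificate authorities and check that
        // it matches the expected host name.
        authenticated := &tls.Config{ServerName: "example.com"}

        for name, cfg := range map[string]*tls.Config{
            "unauthenticated": unauthenticated,
            "authenticated":   authenticated,
        } {
            conn, err := tls.Dial("tcp", "example.com:443", cfg)
            if err != nil {
                log.Printf("%s handshake failed: %v", name, err)
                continue
            }
            fmt.Printf("%s handshake succeeded; server presented %q\n",
                name, conn.ConnectionState().PeerCertificates[0].Subject.CommonName)
            conn.Close()
        }
    }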

The third option appears to have the most support. It creates a two-tier system in which HTTP/2 is only made available over HTTPS connections: the new features of HTTP/2, such as multiplexing many requests asynchronously over a single connection, become an inducement for site owners to adopt HTTPS, while the current HTTP/1.x protocols continue to function over plain connections.
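
As a rough illustration of that two-tier arrangement (the handler, ports, and certificate file names below are placeholders), the Go sketch serves plain HTTP/1.x on one port and TLS on another; Go's net/http negotiates HTTP/2 automatically on the TLS listener, so the newer protocol is only reachable over an encrypted, authenticated connection.

    package main

    import (
        "fmt"
        "log"
        "net/http"
    )

    func main() {
        http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
            // r.Proto reports "HTTP/2.0" for clients that negotiated HTTP/2
            // during the TLS handshake, and "HTTP/1.1" otherwise.
            fmt.Fprintf(w, "You are speaking %s\n", r.Proto)
        })

        // Plain-text listener: only HTTP/1.x is ever served here.
        go func() {
            log.Fatal(http.ListenAndServe(":8080", nil))
        }()

        // TLS listener: HTTP/2 is negotiated automatically, so the newer
        // protocol is only available through an encrypted, authenticated
        // connection. cert.pem and key.pem are placeholder file names.
        log.Fatal(http.ListenAndServeTLS(":8443", "cert.pem", "key.pem", nil))
    }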

While this may be a move in the right direction, as Ars Technica writer Dan Goodin points out, these plans depend on the viability of the current TLS system, which in turn depends on the trustworthiness of certificate authorities. As we mentioned in a previous article, while the TLS system as a whole is sound, it places its trust in third parties whose incentives may not align well with those of Internet users.
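
That trust is easy to see from the client side: every issuer in the chain a server presents is an authority the connection silently relies on. A short Go sketch (again with example.com standing in for any HTTPS site) makes the chain visible:

    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
    )

    func main() {
        // Open an ordinary, fully verified TLS connection.
        conn, err := tls.Dial("tcp", "example.com:443", nil)
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        // Every certificate above the leaf is an authority the client
        // implicitly trusts; if any issuer in the chain, or the root it leads
        // back to, is compromised or careless, the handshake's guarantees
        // evaporate.
        for i, cert := range conn.ConnectionState().PeerCertificates {
            fmt.Printf("%d: subject=%q issuer=%q\n",
                i, cert.Subject.CommonName, cert.Issuer.CommonName)
        }
    }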

However this conversation shakes out, it seems that the movers and shakers who stand behind the mechanisms that power the Internet are serious about implementing wide-ranging encryption of data on the open Internet.

Tags: Security, SSL, TLS
Posted Dec 12, 2013, 2:27 pm by InterWorx
