Effects of Net Neutrality, Continued

The earlier Net Neutrality Opportunities and Challenges post discussed three likely changes that will result from the recent rollback of Net Neutrality restrictions. As part of that multi-part series, this installment talks through the likely impacts -- both challenges and opportunities -- on how we build solutions for the web.

In all three of the described scenarios, the speed with which you can access data and the total amount of data required to use a site both become exceedingly important. If your connection through your ISP, or through the provider of the service you are attempting to access, is in a "slow lane," then every byte of data counts. The past 20 years have seen a proliferation of rich content for the web, and the volume of data required to support that experience has grown significantly. In 1995, the average web page had a data weight of approximately 15 KB (0.015 MB). Even through the dotcom explosion and into 2010, a mere 8 years ago, the average web page weighed only about 715 KB (0.715 MB). Today, the average page has a data weight of approximately 2.3 MB: more than 150 times what it was in 1995, and roughly triple that of 2010. With these changes to how the internet is accessed, we'll likely see significant changes in how websites and web applications are architected and built to be optimized for a post-net neutrality world.
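
To make those numbers concrete, here is a minimal TypeScript sketch of how a developer might measure a page's over-the-wire weight in the browser using the standard Resource Timing and Navigation Timing APIs; the 1 MB budget is purely an illustrative assumption.

```typescript
// A minimal sketch: total a page's compressed, over-the-wire weight from the
// browser's performance timeline. The 1 MB budget below is an illustrative
// assumption, not a recommendation.
function measurePageWeight(): void {
  const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
  const navigations = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];

  // transferSize is the compressed size of each asset in bytes.
  const totalBytes = [...navigations, ...resources].reduce(
    (sum, entry) => sum + (entry.transferSize || 0),
    0
  );

  const totalMB = totalBytes / (1024 * 1024);
  console.log(`Approximate page weight: ${totalMB.toFixed(2)} MB`);

  if (totalMB > 1) {
    console.warn("Page exceeds its hypothetical 1 MB data budget.");
  }
}

// Run once the page and its assets have finished loading.
window.addEventListener("load", () => measurePageWeight());
```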

Every Aspect of Code Needs to be Efficient.

With a level playing field for data access, developers over the past 20 years have had the freedom to implement sophisticated tools, methods, and interactions on the web. In part, the proliferation of "do it yourself" websites has been built on the foundation of applications that may not produce elegant or efficient code, but that allow non-technical users to easily create sites and publish their content to the world. In other areas, developers have been able to bring solutions to market quickly through the use of code libraries, pre-built modules, open source tools, and other shortcuts that let them focus on solving business problems without reinventing the wheel. Developers have also had the luxury of a multitude of tools built to track and analyze user behavior during browsing and feed that data back for reporting. This has allowed developers to improve user experiences over time by analyzing what did and did not work, creating an ever-improving web experience.

These approaches and tools, however, are not the most efficient way to deliver content to the web. Take, for example, WordPress, a commonly used Content Management System (CMS) that holds the largest market share of any CMS on the internet. It is a platform intended to serve many different purposes, and as a result it carries code "just in case" a given purpose is needed. Because the core application code is designed for many purposes, by necessity some of that code will execute on your site regardless of whether it serves your purpose. Each of these unnecessary functions and executions saps precious bandwidth and consumes data.
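
The general remedy is to load code only when it is actually needed. Here is a minimal browser-side sketch of that idea using a dynamic import; the module path, element IDs, and renderComments function are hypothetical and not specific to WordPress.

```typescript
// A minimal sketch of loading feature code only when it is actually needed,
// rather than shipping everything "just in case." The module path, element IDs,
// and renderComments function are hypothetical.
const commentsButton = document.querySelector<HTMLButtonElement>("#load-comments");

commentsButton?.addEventListener("click", async () => {
  // The comments widget is fetched over the network only when a visitor asks
  // for it, so visitors who never open comments never pay for those bytes.
  const { renderComments } = await import("./comments-widget");
  renderComments(document.querySelector("#comments-container"));
});
```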

WordPress has become popular in large part because of a vast community of open source developers, each building modules and themes to extend the look, feel, and functionality of WordPress. These add-ons follow the same model, and in a world where limited access reigns, they can exacerbate the problem: code written to serve multiple functions shifts from potentially flexible to tangibly detrimental, and that bloat can slow down the single function you actually need. Themes built on the assumption of fast access to large images, or that lean on external content delivery services, are not optimized for a lightweight web starved for bandwidth; they are built on the foundational assumption of high performance and unrestricted access. These options have so far provided low-cost solutions by reusing existing code, and delivered great user experiences by stitching multiple third-party tools into a coherent interface, but their continued functioning presupposes a level of internet access that may no longer hold for all users. Instead, the ISP through which the user accesses the content -- if they can access the content at all -- will begin to dictate which of these methods is available, accessible, or sufficiently performant to deliver a quality user experience.
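
As one example of the kind of optimization such themes tend to skip, the sketch below defers image downloads until they are about to scroll into view; the data-src attribute convention is an assumption of this example, and modern browsers offer a similar effect natively via the loading="lazy" attribute.

```typescript
// A minimal sketch of deferring image downloads until they are about to scroll
// into view, so a theme full of large images does not pull every byte up front.
// The data-src attribute convention used here is an assumption of this example.
const lazyImages = document.querySelectorAll<HTMLImageElement>("img[data-src]");

const imageObserver = new IntersectionObserver(
  (entries, observer) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target as HTMLImageElement;
      img.src = img.dataset.src ?? "";  // Start the download only now.
      img.removeAttribute("data-src");
      observer.unobserve(img);          // Stop watching once the image is loading.
    }
  },
  { rootMargin: "200px" }               // Begin loading shortly before the image appears.
);

lazyImages.forEach((img) => imageObserver.observe(img));
```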

Ongoing improvements based on analyzing user feedback could similarly become tricky in this world, as each collected data point represents another piece of data that must be sent from the user's browser back to the application for tracking. These tools will either become a luxury available only to "fast lane" sites, or will have to be refactored into a form that is lightweight and incredibly performant. When faced with the decision between displaying a high-res image and knowing that the user clicked a link to see it, developers will need creative methods to ensure they are not sacrificing business objectives for the sake of customer experience -- neither of which can be left behind.
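
One plausible shape for that lightweight, refactored analytics is batching: queue events in memory and flush them in a single small request. The sketch below uses the browser's sendBeacon API for the flush; the /analytics endpoint and event shape are hypothetical.

```typescript
// A minimal sketch of lightweight, batched analytics: events are queued in memory
// and flushed in one small request when the page is hidden, rather than firing a
// network call per interaction. The endpoint and event shape are hypothetical.
type TrackedEvent = { name: string; ts: number };

const eventQueue: TrackedEvent[] = [];

function track(name: string): void {
  eventQueue.push({ name, ts: Date.now() });
}

function flush(): void {
  if (eventQueue.length === 0) return;
  // sendBeacon hands the payload to the browser for delivery without blocking
  // navigation, and the whole batch costs one request instead of many.
  navigator.sendBeacon("/analytics", JSON.stringify(eventQueue));
  eventQueue.length = 0;
}

document.addEventListener("visibilitychange", () => {
  if (document.visibilityState === "hidden") flush();
});

// Usage: track only the events that matter to a business objective.
track("image-link-clicked");
```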

There are two really interesting -- and likely -- shakeouts from these changes.

First, those with resources will be more incentivized, at least in the short term, to build custom solutions specifically tailored to their needs. Until the open source and enterprise software communities catch up to the new ISP pricing models, companies may be required to build rather than buy. While more expensive, this will have the benefit of allowing fast movers to adapt quickly to the changing environment and establish a position of accessibility with their customers sooner.

Second, the communities that have developed open source tools won't go away; they'll start over. If efficiency becomes the operative goal, a massive effort will get underway to refactor or recreate tools that previously worked, if only inefficiently. For companies, this presents an important decision: retool their technology, choosing tools or agencies that can prove results in building more efficient solutions, or pursue alternative methods for engaging with existing and potential customers.

Application and Database Calls Need to be Minimized.

The modern web is one of real-time access to data. Websites are rarely built as flat HTML files. Instead, modern sites are more often web applications, starting with a database that sits behind an application, which in turn renders the web pages with which users interact. Each click or scroll on a website potentially requires a call from the user's browser to the application, to the database, and back.

Take online banking as an example. Loading the homepage first triggers a check of my browser for an active session. If one exists, the application may use that information to pass me directly to my account balance page. If not, the application presents me with a login page, along with some content specifically targeted to me with special offers or features. When I log in, more calls are made to the application to process my login attempt, to the database to compare my credentials, and back to the application to handle the success or failure of the attempt. Once I am successfully logged in, I might want to look at my banking activity for the past month and download a PDF statement to print. Again, my actions on the page send requests through the application, to the database, and back through the application to render the appropriate data on the page and generate the requested file.

Each of these micro-interactions has a performance cost: data passed back and forth, time the user spends waiting, pages rendered, images loaded, files downloaded. Some actions might require multiple calls executed in sequence, presenting the user with a seamless experience while, behind the scenes, orchestrating a complex series of data handshakes for security and accuracy.
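
Written as browser code, the statement-download flow above might look like the following sketch, with each await representing a full round trip; the endpoints are hypothetical.

```typescript
// A minimal sketch of the round trips described above, written as sequential
// fetch calls from the browser to a hypothetical banking API. Every await is
// a full network round trip that the user spends waiting on.
async function loadStatementFlow(): Promise<void> {
  // 1. Check for an active session.
  const session = await fetch("/api/session").then((r) => r.json());
  if (!session.active) {
    window.location.assign("/login");  // One redirect, one more page load.
    return;
  }

  // 2. Fetch the last month of account activity to render on the page.
  const activity = await fetch("/api/activity?range=30d").then((r) => r.json());
  console.log(`Rendering ${activity.transactions.length} transactions`);

  // 3. Ask the application to generate and return the PDF statement.
  const statementBlob = await fetch("/api/statements/latest.pdf").then((r) => r.blob());
  console.log(`Statement ready: ${statementBlob.size} bytes to download`);
}
```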

Inefficient code -- code written with the assumption that all internet connections are treated equally -- can survive in a neutral internet because developers can assume users have equal access to content. In a world of internet slow lanes, however, or one in which users closely monitor their data usage because they are paying as they go under a zero-rating plan, inefficient code would be death for a web application. Developers will need to account for the many users accessing their site over a slower or throttled data connection, and those users can ill afford an application that requires many handshakes and database calls to achieve their goals on the site.
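
One way to trim those handshakes is to collapse several calls into a single round trip. The sketch below assumes a hypothetical combined "account summary" endpoint that returns session state and recent activity together, and defers the heavyweight PDF until the user explicitly asks for it.

```typescript
// A minimal sketch of collapsing the chain above into one round trip: a single,
// hypothetical "account summary" endpoint returns session state and recent
// activity together, so a throttled connection pays for one handshake instead
// of several.
interface AccountSummary {
  sessionActive: boolean;
  transactions: Array<{ date: string; amount: number; description: string }>;
  statementUrl: string; // Fetched later, only if the user actually wants the PDF.
}

async function loadAccountSummary(): Promise<void> {
  const response = await fetch("/api/account-summary?range=30d");
  const summary: AccountSummary = await response.json();

  if (!summary.sessionActive) {
    window.location.assign("/login");
    return;
  }

  console.log(`Rendering ${summary.transactions.length} transactions`);
  // The statement download is deferred until requested, saving its bytes
  // entirely for users who never ask for it.
}
```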

User Experience Becomes King.

Code can be written efficiently and still consume large volumes of data if a user has to navigate many pages or repeat actions to accomplish their goals on a site. To stay competitive in this new environment, user experience design will become even more crucial than it is today.

User experience is about more than wireframes and page aesthetics; it also encompasses user personas and goals, navigation paths, the ancillary pages and applications users may encounter along those paths, and a thousand other factors specific to each site.

Take, for example, a user who subscribes to our fictional "Social Media" and "News" bundles offered through their ISP and is browsing Facebook. Their only available internet access is to a defined list of social media and news websites.

While the user is browsing Facebook:

  • What happens when that user clicks a link to a news article on Reuters? Is Reuters included in their ISP's News bundle?
  • What if the user clicks a link to a news article on a site that is NOT included in their ISP's News bundle?
  • What if the user clicks a link to their friend's blog post about the past weekend's cooking exploits?
  • What if the user clicks a link to a YouTube video, but they do not have a paid subscription to their ISP's "Streaming Content" bundle?

These are all user experience scenarios the web developer will need to take into account to create an experience that accomplishes the user's goals of browsing Facebook and consuming content without disrupting their use of the application to the point that they stop using it entirely. Should the designer handle these exceptions within their site or application? Or should they bounce the user out to an ISP-hosted page explaining the problem, offloading the bad news to a third party but offering a potentially jolting user experience? Alternatively, the designer could attempt to convey the problem to the user in-app, making the experience better but risking a negative association between their application and the content restrictions of the ISP. These are decisions designers and developers have not yet faced, but they will become critical in the new internet environment.
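
As a rough illustration of the in-app option, the sketch below probes an external link before following it and falls back to an in-app notice when the destination is unreachable; the probe technique, CSS class, and wording are illustrative assumptions, not a prescription.

```typescript
// A minimal sketch of the "handle it in-app" option: before following an external
// link, probe whether the destination is reachable for this user and, if not,
// show an in-app notice instead of dumping them onto an ISP error page.
async function followExternalLink(url: string): Promise<void> {
  try {
    // A lightweight HEAD request avoids downloading the destination page just to
    // learn whether access fails outright.
    await fetch(url, { method: "HEAD", mode: "no-cors" });
    window.location.assign(url);
  } catch {
    // The destination could not be reached, possibly because it falls outside
    // the user's current bundle.
    showAccessNotice(
      "This link points to content outside your current internet plan. " +
        "You can keep browsing here, or contact your provider about access."
    );
  }
}

function showAccessNotice(message: string): void {
  const banner = document.createElement("div");
  banner.className = "access-notice"; // Hypothetical styling hook.
  banner.textContent = message;
  document.body.prepend(banner);
}
```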

A second level to this question is what types of media a UX designer considers necessary to provide an excellent user experience. When shopping online, customers have become accustomed to high-resolution images, product video demonstrations, pages and pages of customer ratings and reviews, and potentially multiple pages of specs, technical information, or other data to help in the buying experience. Designers and developers will need to carefully balance the collateral that is required for user conversion against information that is merely nice to have.
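
One hedged approach is to treat connection quality as a hint when deciding which collateral to load. The sketch below checks the Network Information API (navigator.connection), which is not available in every browser, and swaps heavyweight media for a lighter alternative when the connection looks constrained; the element IDs are hypothetical.

```typescript
// A minimal sketch of choosing lighter media when the user appears to be on a
// constrained or metered connection. navigator.connection is an optional hint:
// absent in some browsers, so the code falls back to full assets without it.
type ConnectionHint = { saveData?: boolean; effectiveType?: string };

function preferLightweightMedia(): boolean {
  const connection = (navigator as Navigator & { connection?: ConnectionHint }).connection;
  if (!connection) return false; // No hint available; serve the full experience.
  return (
    Boolean(connection.saveData) ||
    connection.effectiveType === "2g" ||
    connection.effectiveType === "slow-2g"
  );
}

// Usage: swap a product page's hero video for a single compressed image when
// the connection looks constrained. Element IDs here are hypothetical.
if (preferLightweightMedia()) {
  document.querySelector("#product-hero-video")?.remove();
  document.querySelector("#product-hero-image")?.removeAttribute("hidden");
}
```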

As discussed above, an unfortunate side effect of removing unnecessary code from web pages could be the loss of the tools UI designers use to refine the user experience. Website analytics showing which efforts lead to better conversion rates -- data from A/B testing tools, analytics packages, and other on-page tools that add data overhead -- may be less available as developers work to slim down their code. As a result, designers will need to rely more heavily on primary research, focus groups, customer outreach campaigns, and other traditional user research methods to inform their design decisions.


The foundation of the modern web is one of interconnected services, involved user interaction, and lots and lots of data. In many ways, these have become dependencies for meeting business goals effectively. Significant changes in how users access the internet will not necessarily mean changes in what users expect from it. As a result, designers and developers will need to be much more cognizant of potential inconsistencies in the level and type of access, so they can ensure they're still able to engage their users effectively.
