What started as a rant from comedian John Oliver calling for support of an open internet has developed into a monumental philosophical debate, riddled with controversy, centered on the Federal Communications Commission’s (FCC) attempt to reclassify broadband based on its role in modern-day life. For the past two decades, the internet has been unregulated and free, but that is about to change, and with this change a spotlight falls on the importance of optimization solutions for bandwidth savings.

What is net neutrality?

In broad terms, net neutrality is the principle that the internet should be equally accessible to all. Advocates proclaim it is meant to stop internet service providers, including mobile wireless carriers, from discriminating against any application or content shared over their networks. It is the idea of keeping the internet free of “fast lanes” that give preferential treatment to companies and websites willing to pay more to deliver their content faster. It therefore prohibits internet service providers from putting everyone else in “slow lanes,” which are said to discriminate against smaller companies that cannot afford to pay a premium.

Broadband is now considered a utility

The decision by the United States Court of Appeals for the District of Columbia Circuit upheld the FCC’s view that broadband internet should be reclassified as a utility, a telecommunications service rather than an information service, giving the agency more power to enforce stricter rules and regulations in a once free market. Essentially, the ruling solidifies that the internet has become as important to the American way of life as the telephone, and it forces providers to adhere to some of the same regulations faced by phone companies.

But what’s the point? In theory, the reclassification will increase competition among service providers, which, as we learned in economics class, should cut costs to consumers, raise the quality of service, and drive innovation. Opposing forces claim the new open policy will instead hurt consumers: prices will rise because companies lose the incentive to build out and update their networks to meet consumer demand, which is growing exponentially due to video. In fact, Cisco reported that by 2019, video will represent 80% of all global consumer internet traffic.

Whichever side you fight for, in the end net neutrality comes down to the commoditization of the internet… bits.

Bandwidth savings is now a necessity, making optimization a requirement

As a service provider, now that you can no longer pay to get your bits (content) delivered faster or more efficiently, everyone is on the same playing field. We are all stuck in the same traffic jam. Couple this fact with increasing consumer demand for a reliable and stable experience, and, in one of the most famous phrases ever uttered, “Houston, we have a problem.” The good news is that companies have developed solutions to improve encoding efficiency, which can lead to additional bitrate reductions of 20-40% without a perceptible change in viewing quality.

Optimization solutions, like Beamr’s JPEGmini and Beamr Video, are already well entrenched in the fabric of media workflows, especially video, and are likely to grow in importance over the years to come. Why? Because the name of the game is getting content to your consumer with the best quality and experience possible, and now the only way to do that is to optimize your file sizes to conserve bandwidth and cut through the clutter.

For example, on an image-intensive website, JPEGmini can cut the file size of your photos by up to 80% without compromising quality, giving your users faster page load times. Google, by the way, recently changed its ranking algorithms to weight search results by page load time and mobile performance; in short, if your site is slow, you are going to see other sites rank more highly on Google search results pages. For video, Beamr Video is able to reduce H.264 and HEVC bitrates by 20-40% without any compromise in quality, enabling a smoother streaming experience. With the ruling as it stands, only companies that can lower file sizes and save bandwidth will break through congested networks to deliver content that meets consumer expectations for entertainment anytime, anywhere, and on any device.
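
To make this concrete, here is a minimal sketch of perceptually guided image recompression, the general technique behind tools in this category. To be clear, this is not JPEGmini’s proprietary algorithm: the sketch uses Pillow and scikit-image, SSIM stands in for a true perceptual quality metric, and the 0.98 floor and five-step quality ladder are assumptions chosen for illustration.

```python
# Illustrative sketch only -- not JPEGmini's proprietary algorithm.
# Walks down the JPEG quality scale and keeps the smallest encode that
# still scores above a (hypothetical) similarity floor vs. the original.
import io

import numpy as np
from PIL import Image
from skimage.metrics import structural_similarity

QUALITY_FLOOR = 0.98  # assumed threshold; SSIM is a stand-in perceptual metric


def optimize_jpeg(path_in: str, path_out: str) -> None:
    original = Image.open(path_in).convert("RGB")
    reference = np.asarray(original)
    best = None

    for quality in range(95, 40, -5):  # coarse search, high to low quality
        buffer = io.BytesIO()
        original.save(buffer, format="JPEG", quality=quality)
        candidate = np.asarray(
            Image.open(io.BytesIO(buffer.getvalue())).convert("RGB"))
        score = structural_similarity(reference, candidate, channel_axis=2)
        if score < QUALITY_FLOOR:
            break  # quality dropped below the floor; stop searching
        best = buffer.getvalue()  # smallest passing encode so far

    if best is None:  # nothing passed; fall back to a high-quality encode
        buffer = io.BytesIO()
        original.save(buffer, format="JPEG", quality=95)
        best = buffer.getvalue()

    with open(path_out, "wb") as f:
        f.write(best)
```

The same encode-measure-verify loop generalizes to video, where the knob is the encoder’s rate factor rather than the JPEG quality setting.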

BONUS: Four secrets to keep in mind when selecting a video bitrate reduction solution

  1. Automation is your friend. Though an encoder in the hands of a well-trained expert can yield amazingly small files with good quality, nearly all streaming service providers will need automation to deploy a bitrate reduction solution at scale, and anything less than scale will not have a meaningful impact. For this reason, select a product that lets you automate both the analysis and the workflow implementation.
  2. Solutions that work at the frame level will give you the best overall video quality. Products that analyze video to determine an optimum fixed bitrate cannot provide the granularity of methods that perform this analysis at the frame or scene level. Beamr Video runs its analysis and sets compression parameters on a per-frame basis, in a closed loop, to ensure the optimal allocation of bits to each frame. No other method of optimization can achieve the same level of quality across a disparate video library as Beamr Video.
  3. Leave neural networks to Google; you want a solution that uses a perceptual quality metric to validate video quality. A sophisticated quality measure that is closely aligned with human vision will deliver a more accurate result across an entire library of content. Quality measures such as PSNR and SSIM produce inconsistent results and cannot be relied on with a high degree of accuracy; they have some application for classifying content, but they are less useful as a mechanism to drive an encoder to the best possible quality.
  4. Guaranteed quality, is there any other choice? Sophisticated encoder configurations that use CRF (Constant Rate Factor) can appear viable at first look; however, without a post-encode quality verification step, the final result will likely contain degradation or other transient quality issues. If the product or technique you have selected does not operate in a closed loop or include some method for verifying quality, the potential to introduce artifacts will be unacceptably high. A minimal sketch of such a closed loop appears after this list.
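
Expanding on point 4, below is a minimal sketch of a closed-loop, quality-verified encode. It is not Beamr Video’s method: it works per-title rather than per-frame, drives the stock ffmpeg/libx264 CRF control, and uses ffmpeg’s built-in SSIM filter as a stand-in for the perceptual metric recommended above. The 0.97 floor and the CRF search range are assumptions chosen for illustration, and audio handling is ignored for brevity.

```python
# Illustrative sketch only -- not Beamr Video's algorithm.
# Closed loop: encode, measure quality, adjust the rate factor, repeat,
# and ship only an encode that verifies above the quality floor.
import re
import subprocess

QUALITY_FLOOR = 0.97  # assumed SSIM floor standing in for a perceptual metric


def encode(source: str, out: str, crf: int) -> None:
    # Video-only encode for brevity (-an drops audio).
    subprocess.run(
        ["ffmpeg", "-y", "-i", source, "-c:v", "libx264",
         "-crf", str(crf), "-an", out],
        check=True, capture_output=True)


def measure_ssim(source: str, encoded: str) -> float:
    # ffmpeg's ssim filter logs a line like "SSIM ... All:0.98xxxx (...)"
    # to stderr; parse the overall score from it.
    result = subprocess.run(
        ["ffmpeg", "-i", encoded, "-i", source, "-lavfi", "ssim",
         "-f", "null", "-"],
        capture_output=True, text=True)
    match = re.search(r"All:([0-9.]+)", result.stderr)
    if match is None:
        raise RuntimeError("could not parse SSIM from ffmpeg output")
    return float(match.group(1))


def optimize(source: str, out: str) -> int:
    # Walk CRF upward (higher CRF = smaller file, lower quality) and keep
    # the last encode that still verifies above the floor.
    best_crf = 18
    for crf in range(18, 33):
        encode(source, out, crf)
        if measure_ssim(source, out) < QUALITY_FLOOR:
            break  # quality fell below the floor; stop searching
        best_crf = crf
    encode(source, out, best_crf)  # re-encode at the best verified CRF
    return best_crf
```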

For more information or to receive a free trial of Beamr Video, please visit http://beamr.com/product  