Google Search Developments In Early 2016: Accelerated Mobile Pages (AMP) and Penguin 4.0

Within the next three months, the search world will encounter two known Google algorithm updates that will benefit users and SEOs on a large scale. AMP (Accelerated Mobile Pages) is an open collaboration project designed to increase the speed of loading mobile pages, and Penguin 4.0 is reported to be a real-time link spam detection and recovery tool.

2015 was a year of steady advancement for Google Search with algorithm updates such as RankBrain (AI technology) and deep app indexing (access to app data). Google continues to be focused on mobile speed and quality results.

Accelerated Mobile Pages (AMP)

This open standard project was officially announced on Oct. 7, 2015, with collaboration from Google, Twitter, and WordPress. The proposed platform has gained momentum with a range of supporters among analytics software companies, ad networks, and CMS providers. Examples include AdSense, DoubleClick, comScore, Adobe Analytics, Parse.ly, Nielsen, ClickTale, Outbrain, AOL, Taboola, and Chartbeat.

Born out of discussions between publishers and tech companies about slow mobile performance, and backed up by mobile UX research, AMP will benefit users with increased loading speeds. Early tests have come back positive. According to Pinterest:

“We found that AMP pages load four times faster and use eight times less data than traditional mobile-optimized pages.”

The improved speed of distribution is expected to result in higher user satisfaction and increased revenue for businesses through faster conversion processes such as subscriptions, sign-ups, and sales.

With the project hosted on GitHub, more than 4,500 developers have contributed, mitigating much of the fear about industry adoption.

Challenges

Not everyone in the industry is excited about AMP. Some developers are taking a wait-and-see approach to adopting the platform. Many view AMP pages as potentially detrimental due to several challenges:

  • Less control over content
  • Efficiency achieved only through enforced uniformity
  • Loss of unique web features
  • Unanswered questions about the governance model, caching, and hosting
  • Keeps users on the Google results page instead of sending them to the publisher's native site

Using strict guidelines that strip out many of the heavy-loading page objects will increase speed, but it remains to be seen whether the loss of unique features will be worth it in the long run. The industry is conflicted after seeing a similar initiative in Facebook Instant Articles, which also provides a framework for faster-loading mobile pages but has been criticized for forcing content to be served under stringent rules.

Why Speed Is Important: Ad Blocking

Ad blockers turned into a serious problem in 2015, and marketers are unsure how to tackle the issue. According to an eMarketer report: “Many internet users have begun to sour on the tradeoff of free content for ad views, turning to ad blockers as publisher ad loads seem heavy and intrusive.”

It appears to be a case of ‘fearing what they don’t know,’ as publishers still lack first-party data on ad blocker usage on their own sites. It has been speculated that many of these industry speed initiatives are partly motivated by a desire to preserve the foundational web advertising model that keeps many sites operating. If the industry can speed up mobile load times, heavy ads may be less of a burden on UX, giving users less reason to install ad blocking software.

What Makes AMP So Fast?

Moz has a great synopsis:

“It's like a diet HTML. So certain tags of HTML you just can't use. Things like forms are out. You also need to use a streamlined version of CSS. You can use most of CSS, but some parts are falling under best practice and they're just not allowed to be used. Then JavaScript is basically not allowed at all. You have to use an off-the-shelf JavaScript library that they provide you with, and that provides things like lazy loading.”
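To make that description concrete, here is a minimal sketch of what a valid AMP document looks like. The file names, canonical URL, and image are placeholders, and the mandatory <style amp-boilerplate> block is omitted for brevity; the full version should be copied from the AMP project documentation.

```html
<!doctype html>
<html ⚡>
<head>
  <meta charset="utf-8">
  <title>Hello AMP</title>
  <!-- point back to the regular (canonical) version of the page -->
  <link rel="canonical" href="https://example.com/article.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <!-- the mandatory <style amp-boilerplate> block goes here (omitted for brevity) -->
  <!-- the only JavaScript allowed: the AMP runtime itself -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <!-- custom CSS must be inline and size-limited -->
  <style amp-custom>
    h1 { color: #222; font-family: sans-serif; }
  </style>
</head>
<body>
  <h1>Hello, AMP</h1>
  <!-- <img> is replaced by <amp-img>, which the AMP runtime lazy-loads -->
  <amp-img src="photo.jpg" width="600" height="400" layout="responsive"></amp-img>
</body>
</html>
```

The restrictions Moz describes are visible here: no forms, no author-written JavaScript, only the AMP runtime script, and components like amp-img standing in for standard tags so the runtime can control loading.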

Example

What AMP pages look like on mobile devices:


Demo AMP

To create an AMP page, start here.

To get technical info and FAQs, go here.

For diverse perspectives, review this WebmasterWorld forum discussion.
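Once a draft page exists, the AMP runtime's built-in validator can be run in the browser by appending #development=1 to the page URL and watching the DevTools console. The URL below is hypothetical:

```text
# Hypothetical AMP page URL; open in Chrome with the DevTools console visible
https://example.com/article.amp.html#development=1
# The console then reports a successful validation or lists the offending tags
```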


Penguin 4.0

Penguin was first released in 2012 and designed to target link schemes, spam, and paid links. The 4.0 version will reportedly operate in real time: as soon as Google indexes a spammy link pointing to a site, the penalty will be applied in real time, and as soon as Google detects that a site has removed or disavowed a bad link, recovery will begin. As it currently stands, recovery from a Penguin penalty can take up to several months.

This is a huge benefit for webmasters, who will be able to respond more quickly when managing link histories, fending off spam attacks, and protecting PageRank.
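For reference, link cleanup is normally handled through Google Search Console's disavow tool, which accepts a plain-text file of URLs and domains; a real-time Penguin would mean that submissions like the sketch below (the domains are made up) are reflected far sooner.

```text
# Disavow file (plain-text, UTF-8) uploaded via Google Search Console
# Lines starting with "#" are comments

# Disavow a single spammy page
http://spammy-directory.example/free-links.html

# Disavow every link from an entire domain
domain:paid-link-network.example
```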

Penguin 4.0 is set to be integrated in early 2016.