Google moves (once again) against the rental of subdomains and directories

More than 11 years ago, I covered this topic here on the blog: large, trusted websites rent out an entire subdomain or directory of their domain to a third party, often without the knowledge or approval of the site's SEO team.

The third party exploits Google's overall trust in the domain and runs its own pages of questionable keywords, which rank better there than they would on an unknown domain.

In recent days, the topic seems to be back on Google's agenda, and some of these collaborations are now faring considerably worse than before, which is a good thing to see. Two prominent subdirectories offering third-party content serve as examples.

Both directories clearly lost visibility in Google's results on the same day. A similar development can be seen in other sectors and in other countries. Here is a table with the numbers for 25 domains, or subdomains of domains, rented out to third parties:

Successful sites on foreign domains

| Country | Visibility 2019-08-27 | Visibility 2019-07-27 | Change |
|---------|----------------------:|----------------------:|-------:|
| de | 40.4767 | 74.3012 | -45.5% |
| se | 37.3634 | 57.9056 | -35.5% |
| de | 58.8720 | 57.3748 | +2.6% |
| se | 20.9205 | 26.1085 | -19.9% |
| uk | 23.5210 | 24.5576 | -4.2% |
| se | 25.3647 | 23.3620 | +8.6% |
| se | 12.1242 | 22.3108 | -45.7% |
| de | 13.1690 | 22.2677 | -40.9% |
| pl | 16.1364 | 19.8753 | -18.8% |
| pl | 11.0012 | 13.6511 | -19.4% |
| br | 8.4407 | 10.9138 | -22.7% |
| pl | 9.0854 | 9.7642 | -7.0% |
| br | 8.0925 | 9.4753 | -14.6% |
| at | 9.3804 | 8.8751 | +5.7% |
| br | 1.9723 | 8.7574 | -77.5% |
| fr | 7.2552 | 8.1564 | -11.0% |
| nl | 4.5499 | 6.8313 | -33.4% |
| de | 6.8213 | 6.3882 | +6.8% |
| nl | 6.0764 | 6.3540 | -4.4% |
| de | 4.4358 | 5.8461 | -24.1% |
| fr | 5.3200 | 5.7086 | -6.8% |
| uk | 5.6212 | 5.3285 | +5.5% |
| uk | 3.6930 | 5.2646 | -29.9% |
| se | 4.7444 | 4.0352 | +17.6% |
| de | 2.0236 | 3.6653 | -44.8% |

In addition to the major losers, there are also many examples of this type of hosting where visibility has remained stable or even grown over the last month.

An analysis of 150 such cases shows that their visibility has decreased by almost 20% across all sectors over the last month.
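For reference, the month-over-month change figures in the table above follow from a simple percentage calculation on the two visibility index values. A minimal sketch (the sample rows are taken directly from the table):

```python
def visibility_change(current: float, previous: float) -> float:
    """Month-over-month change of a visibility index, in percent."""
    return (current / previous - 1) * 100

# Three rows from the table: (visibility 2019-08-27, visibility 2019-07-27)
rows = [(40.4767, 74.3012), (23.5210, 24.5576), (1.9723, 8.7574)]
for current, previous in rows:
    print(f"{visibility_change(current, previous):+.1f}%")  # -45.5%, -4.2%, -77.5%
```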

Danny Sullivan commented on the topic via the official webmaster communication Twitter account:

Remarkably, public attention on the topic is currently being driven by an anonymous Twitter account. The author does not appear under his real name, but has a firm opinion on what is good and bad. The persistence with which he pursues the topic suggests that he is not doing this as a hobby, but is motivated by economic considerations.


And every day, it's Groundhog Day again: a growing number of SEOs are paying renewed attention to the subject. In retrospect, the risk to website operators of renting out individual subdomains or directories was rather low: Google intervenes only in the rankings of the rented content, not in the rankings of the entire domain. A guarantee that it stays that way, however, there is none.

Google on core updates: your competition is better than you

The article entitled "What webmasters should know about Google's core updates" contains many generalities, but also facts and new emphases. I have summarized and categorized the most important statements here:

  • Official confirmation: Google has officially confirmed the core updates. The many minor changes to the algorithm continue to be made without confirmation.
  • General reassessment of content: The target of the core updates is not individual pages or pieces of content. With these updates, Google aims to evaluate content better overall.

    The example Danny chose is interesting in this context: if you created a list of the 100 best films in 2015 and update it now, it will inevitably look different, not because the films from a few years ago have become worse, but because new and better films have been added.

    Applied to the core updates: the bar in Google's results has risen. Anyone who wants to keep qualifying for good rankings has to keep getting better, as does the competition.

  • List of questions for content: Google has updated and adapted the content question list from 2011. It is now divided into sections. The section "Comparative questions" is interesting: it is no longer just about your own content, but about your own content compared to the competition: does the page offer substantial added value over other results?
  • Read and understand E-A-T: Google's concept of E-A-T (Expertise, Authoritativeness and Trustworthiness) from the search quality rater guidelines should by now have been read and understood by everyone. It is the basis on which Google is trying to improve the algorithm.
  • Google is a machine: Danny also confirms that Google cannot truly understand content. Google has to rely on machine-readable signals and use them as a proxy for how humans evaluate content. One of these signals, as always, is links.
  • Recovery at the next update: If a domain loses rankings through a Google core update and subsequently adjusts its content, a ranking improvement can occur again at the next core update. Unsurprisingly, there is no guarantee.


Of course, we cannot expect the Google blog to hand us the Holy Grail for reaching top rankings. Nevertheless, Google has summarized some points and confirmed others. The most interesting thing for me was the confirmation that your competitors' content influences your rankings. The quality level of the SERPs is rising, and those who want to keep playing must keep up.

Google announces update of featured snippets

Google has announced an update to its featured snippets algorithm. The update is intended to make featured snippets more useful by understanding when a query calls for fresh content.

Which queries deserve freshness?

Google gave three examples of types of search queries that require fresh content.

  1. Regularly updated information
  2. Information that changes over time
  3. Current events

The aim is to remove featured snippet content that is out of date and no longer useful.

Here's how Google explains it:

"… a new algorithm update improves our systems' understanding of what information remains useful over time and what becomes out-of-date more quickly.

This is especially helpful for featured snippets, a search feature that highlights pages our systems determine are most likely to contain the information you're looking for.

For queries where fresh information is important, our systems will try to find the most useful and up-to-date featured snippets."

I think the most important part of the explanation above is that their systems determine which content is evergreen (stays useful over time) and which types of content lose their usefulness over time.

Regularly updated information

This kind of information may follow a schedule or be an anticipated upcoming change.

"Here are some examples where fresh featured snippets are particularly useful. You might be looking for regularly updated information, such as the next full moon, the winner of a reality show, or upcoming holidays."

Below is a query with out-of-date results:

[Screenshot: featured snippet with out-of-date results]

Below is the same query with fresh results:

[Screenshot: featured snippet with fresh results]

Information that changes over time

Another category that deserves freshness is information whose nature may change over time. This is interesting because the information needs ahead of a future event differ from the information needs once the event has already occurred.

Here's how Google explains it:

"For example, in the run-up to an event, we learn more specific details. A more recent page about a TV premiere might contain more specific information and other useful content, such as trailers you can click to watch."

[Screenshot: time-based event featured snippet]

Current events

Google then cited the current events category, where information is most useful when it is up to date. It used the example of a listeria recall.

[Screenshot: up-to-date featured snippet for a recall query]

Fresh results are unnecessary for evergreen content

At the start of the announcement, Google said that it is not necessary to regularly refresh results for evergreen content. It used the example of a fact-based query where the fact does not change.

Here's how Google explains why evergreen content doesn't need fresh web results:

"For example, if you ask 'why is the sky red at sunset,' the underlying explanation doesn't change over time, and the clearest description is often found on an older page. Prioritizing fresh content wouldn't necessarily give better results."

According to an SEO myth, Google prefers to rank fresh content. The statement above contradicts and demystifies that myth: Google understands that search queries on evergreen topics don't get better results from newer web pages.

Takeaway

This update is important because it pushes content publishers to determine which of their content is evergreen and which types of content require freshness. Featured snippets will now further reward publishers who provide content that is useful right now.

After 25 years: Google wants to standardize the robots.txt file and releases its own robots.txt parser for download


The web has changed dramatically over the past 20 years and continues to expand with new technologies and possibilities. One very basic part of its infrastructure, however, has hardly changed in all that time and is still not standardized: the robots.txt file. Now Google wants to push standardization and has also released a robots.txt parser.

Almost all websites have a robots.txt file. It is neither relevant nor interesting to users or visitors of the website, but it is meant to be respected by search engines and their robots. There is no obligation to comply, but it is good practice and is implemented by all well-known search engines. This is the file's primary use case: certain files and folders can be excluded for specific search engine robots.
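For illustration, a minimal robots.txt of the kind described here might look like this (the paths and sitemap URL are made-up examples):

```
# Exclude one folder for one specific crawler
User-agent: Googlebot
Disallow: /internal/

# All other crawlers: allow everything except /tmp/
User-agent: *
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```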

[Image: Googlebot and robots.txt]

Who would have thought: the robots.txt file format was introduced back in 1994, but is still not standardized. As a result, there are differences in implementation, and misunderstandings with crawlers can occur. Google now wants to change this with two initiatives: first, the format's standardization and implementation are to be formalized, with some new aspects such as caching, character set, and a recommended file size to be defined.

To make it possible to implement these potential changes globally, the parser Google uses internally is now available as open source, offering this functionality both for download and as a testing tool. There is, however, no simple web service with an API, which is a bit surprising in this case. Google points out that parts of the parser, written in C++, date back to the 1990s, but still work well and are still used internally today.
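Google's parser itself is a C++ library, but the behavior it standardizes can be sketched with Python's standard-library robots.txt parser. This is `urllib.robotparser`, not Google's implementation, and the bot name and URLs are made-up examples:

```python
from urllib.robotparser import RobotFileParser

# Parse robots.txt rules given as text; read() would fetch them from a live site.
rules = RobotFileParser()
rules.parse("""
User-agent: *
Disallow: /private/
""".splitlines())

# Ask whether a given crawler may fetch a given URL under these rules.
print(rules.can_fetch("MyBot", "https://www.example.com/index.html"))  # True
print(rules.can_fetch("MyBot", "https://www.example.com/private/a"))   # False
```

Because implementations differ in exactly these edge cases (wildcards, precedence, encoding), Google's open-sourced parser is useful as a reference for how Googlebot itself interprets a file.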

» Google robots.txt parser
» Announcement of the parser
» Announcement of the standardization

" Site owners, beware: Google doesn’t take note of all the foundations of robots.txt – these are the alternate options

See also
» Study: Google answers every second query itself – fewer and fewer clicks on search results

» Googlebot goes evergreen: its Chrome engine will be updated regularly

Stop missing Google news: subscribe to the GoogleWatchBlog newsletter