Google answers to Webmasters

This is a summary of two meetings between webmasters from various countries and the team at Google Webmaster Central, including Matt Cutts. Webmasters asked the questions they had never had the chance to ask before, and a member of the team answered each one.
The first edition was organized by the Search Engine Genie website, and the second through Moderator, a tool that lets Internet users vote on questions.
This is a selection of the answers, sorted by topic for easier reading.

Domain name and hosting

How important are the age of a site and of its domain for ranking?

In most cases it is not: we rank the most relevant content, not the oldest. In some cases age is a signal, but you should not obsess over it.

Directories

Recently you deleted this reference from the guidelines: "Submit your site to the Dmoz and Yahoo directories." Does that mean that such links will not be counted in the future?

It is always possible that links from directories will not be taken into account in the future.
What we realized is that novice webmasters seeing this recommendation felt encouraged to submit their site to a lot of directories, even poor-quality ones.
We have not changed the weight of those links, but we have removed the reference.

Until recently (the last six months) we could get a good ranking by submitting an article (provided it was almost unique) to an article directory, but this no longer seems to be the case. Have links from these sites been devalued?

In my experience, no article directory is of high quality. Sometimes a site hosts a ton of copied articles and it is difficult to find the original. The user experience on these sites can also be negative.
It is sometimes better to already be a little known before submitting a link, rather than submitting a ton of links to become known.

Why does Google not have a directory?

Google uses the Dmoz.org directory and does not need another one.

Backlinks

Do backlinks from other sites owned by the same person help or hurt the ranking?

Backlinks between sites of the same company fall into two camps.
There are cases where someone has very few sites and links them together, and that makes sense.
In other cases, webmasters have between 1000 and 2000 interlinked domains. I do not recommend that.

Another answer:
I do not see these links penalizing your ranking if they are made for the user.
If your sports site links to your camping site, it is useful to visitors. This can only improve your ranking.
If your sites are linked only in the footer, as cross-advertising or to inflate the number of links, it is unlikely to help either your ranking or your users.

What importance is given to backlinks from social networks and blogs?

Social sites and blogs are treated like any other sites.

Since more than 300 questionable links started pointing to my site, it has gone from the top ranking for its keywords to the second page at best. Does a mass of bad-neighborhood links affect the ranking?

We do our utmost to make sure one site cannot affect another. If you suffer from spam, you can report it with a spam report.

PageRank

We recently changed the name of our company. The old domain had a good PR, but the new one is very low. Is a 301 redirect sufficient to transfer the PR?

It's a good question and a very frequent one. Basically, a 301 redirect is the best solution. The topic was discussed on Google Webmaster Central.
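
As an illustration, a minimal sketch of such a redirect on an Apache server with mod_rewrite enabled; olddomain.com and newdomain.com are placeholder names:

    # .htaccess on the old domain: send every URL, path included,
    # to the new domain with a permanent (301) redirect
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
    RewriteRule ^(.*)$ http://newdomain.com/$1 [R=301,L]

The permanent status code is what signals to engines that the move is definitive, allowing them to transfer the old URL's signals to the new one.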

Do you not feel that the webmaster should be informed of any manual penalty, and that a reconsideration request should be examined more seriously in such a case?

Imagine you have a site where you regularly add original content or tools. If it was hijacked and now contains hidden content or links, or if you are the owner and the site does not meet the guidelines, you can be informed through Webmaster Tools. The message will usually provide information on the problem. Once you have corrected it, your request for reconsideration will be studied carefully.
On the other hand, if you have a pile of identical sites with content downloaded from other sites, do not expect a message from Google.

Matt said that Google takes into account 200 signals to score a page and that PageRank is one among many. If so, why is it featured in the Google Toolbar? Is it possible that it will be removed from the toolbar in the future?

PageRank is just one signal that we use, but it is also the simplest for users to understand, and it is easy to know what it measures. So I think it is useful to users and it makes sense to show it in the Google Toolbar.

How often do you update the PageRank of a site?

PageRank is recalculated all the time (it can differ from one day to the next). The toolbar indicator is updated 3 or 4 times a year.

Working to improve PR

It is better to spend the time improving the content and accessibility, and promoting the site.

Could the PageRank of a site be lowered by one page?

No.

PageRank and robots.txt

Indeed, without nofollow, PR can be passed even if a page is blocked by robots.txt.

Bounce Rate

Are the bounce rate and the time spent on a page taken into account in the ranking of that page? Specifically, if a user clicks on a result and returns immediately to choose another, does it affect the ranking of the first?

If users leave a site that quickly, there is a good chance they would not recommend it to others (or return themselves). So yes, at least indirectly: if a site is designed in a way that makes visitors leave immediately, there is a chance we will not recommend it as much as sites users love (and which are recommended by others).
(The Google patent makes clear that the algorithm reflects this.)

Duplicate content and scrapers

How many domains can point to the same destination before it becomes a problem? More specifically, domain1.com, domain2.com and so on point to domain.com, which has its own pages...

In your example, it seems that domain.com is the only site hosting content. It is common for companies to buy misspelled versions of their domain and 301-redirect them to the correct domain.

You claim that you can distinguish the original from RSS copies, so why are some copies ranked better than the originals?

We do our best to identify them and to find the best URL for the content. There are two ways for you to help us:
If you provide content that can be syndicated on other sites, include an absolute link (not a relative one) back to your site, so you are identified as the author (see the example below).
As for spammer sites, file a spam report through Webmaster Tools.
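
A minimal sketch of the absolute-link advice; example.com and the path are placeholders:

    <!-- At the end of the syndicated article: an absolute URL back to
         the source, not a relative one such as /article.html -->
    <a href="http://www.example.com/article.html">Original article</a>

A relative link would resolve to the copying site's own domain once the content is republished, which is exactly what you want to avoid.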

Effect on the PageRank?

Multiple links to the same page divide the PR.

Web design and links

For my optimization, I'd like to improve my ranking by removing technical barriers (starting with dynamic URL parameters). What are the most important corrections to make, and how can I measure the results with Webmaster Tools?

To test the accessibility of your site, I recommend using a broken-link checker. This gives you an idea of how robots see your site and indicates where they loop or find duplicates because of URL parameters.
(Scriptol note: A link checker is provided on this site).

Suppose my site is in English and French. Must the English and French versions of a page have different URLs? Are there other good practices for the architecture of a multilingual site?

If you can afford it, use domain.com and domain.fr. If that is not possible, consider en.domain.com and fr.domain.com. Otherwise, domain.com/en and domain.com/fr can work.
In Webmaster Tools, you can geotarget a site (and, I believe, its sub-domains), which can help.

Can too many rel=nofollow attributes on external links be considered over-optimization? Are there penalties for this?

I would not worry about that. I would try to make the site as natural as possible.
(Scriptol note: See our article about Black Hole Sites).
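
For reference, a sketch of the attribute in question; the URL is a placeholder:

    <!-- rel="nofollow" asks engines not to pass PageRank through this link -->
    <a href="http://example.com/" rel="nofollow">A link I do not vouch for</a>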

I have a site in five languages and I cannot pay for 5 domains. Are 5 subdomains better, or 5 subdirectories?

Both work. If the content is different I would use subdomains, otherwise directories, but it is up to you.

Can we expect Google to favor sites with valid HTML code?

Since fewer than 5% of pages are valid according to the study made by Opera, it would make no sense for us to penalize the other 95%.
W3C compliance is not required, even in the future, but compliant sites are perhaps more attractive to visitors and will thus be indirectly promoted.
(Scriptol note: See statistical analysis of pages by Opera.)

Does using display: none or left: -9999px for accessibility lower your ranking?

If it does not hide something in an attempt to mislead users and engines, then it is generally fine.
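
A sketch of a legitimate use, the classic off-screen "skip to content" link for screen-reader users; the class name is a placeholder:

    /* Keep the link available to screen readers but move it off-screen;
       nothing here is hidden in order to deceive */
    .skip-link {
        position: absolute;
        left: -9999px;
    }
    /* Bring it back into view when reached with the keyboard */
    .skip-link:focus {
        left: 0;
    }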

Is it true that having fewer internal links gives more weight to the external links?

PageRank is shared among the links on a page, but I recommend not concentrating on that point, as you can neither measure it nor act on it.

If my site A links to site B, and B links back to A, will I be penalized? Do link-exchange strategies work?

Participating in link networks to influence the ranking is prohibited by Google's guidelines.

Do sub-domains have an effect on rankings?

There is usually a connection between a domain and its sub-domains (so a sub-domain does not normally compete with its domain in the SERPs).

HTTP and HTTPS

They are treated in an identical way.

Static vs. dynamic URLs

There is no difference between static URLs (.html, etc.) and dynamic URLs (with parameters such as ?x=1).

Dash or underscore in URL

It does not matter.
Until now, the underscore "_", being one of the characters allowed in identifiers in programming languages, was considered part of a word: Web_Site was seen as a single word. The rule has changed: for Google the underscore is now a separator, and Web_Site is now seen as two separate words, Web and Site.

Do file extensions matter?

Whether your files end in .html, .htm, .php or .asp, Google treats them the same way. Only .exe might be ignored for indexing!

Are deep pages disadvantaged?

The number of slashes "/" does not count for Google. It seems to count for Yahoo and Live, which ignore pages that are too deep in subdirectories.

Alt and title

The alt attribute is used for indexing; the title attribute only serves your visitors.
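
A one-line sketch showing both attributes on an image; the file name and texts are placeholders:

    <!-- alt is read by engines for indexing; title appears to visitors as a tooltip -->
    <img src="logo.png" alt="Acme logo" title="The Acme logo">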

Too many links on the home page

As long as you have fewer than 100 links, and depending on what you think is best for the visitor, the number is not a problem.

A lot of 301 redirects

It is not a problem, even over a long period. The only problem is chaining multiple redirects to reach the same page.

Using a CMS

It is not a problem if the CMS complies with Web standards. Above all, avoid having the same title and the same description on every page.
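
A sketch of what unique per-page markup looks like; the texts are placeholder examples:

    <!-- Each page gets its own title and meta description -->
    <title>Blue Widgets - Acme Store</title>
    <meta name="description" content="Hand-made blue widgets, shipped worldwide.">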

Moving by CSS

A menu placed at the end of the HTML and moved to the top by a script or by CSS is not a problem.

Sitemap

If my site has a very large number of pages, like Amazon, must I include in the XML file every URL that I want indexed? If not, how should I fill in the XML file?

You are free to put all the links in your sitemap... that is what it is made for! However, if you have many URLs that point to the same content, you must include only the preferred version in the sitemap.
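
A minimal sitemap sketch following that advice; example.com and the path are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- List only the preferred URL for each piece of content, not
           duplicates such as versions with session IDs or tracking parameters -->
      <url>
        <loc>http://www.example.com/products/widget.html</loc>
        <lastmod>2011-07-01</lastmod>
      </url>
    </urlset>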

In addition to the XML sitemap, is it useful to have an HTML sitemap on the same site?

An HTML sitemap can help search engines, especially if the XML file is not present. The 404 widget from Webmaster Tools, which you can put on your error page, can use the /sitemap.html file, or an equivalent, to help users find what they seek.
I recommend an HTML sitemap, but for visitors rather than the engines.

Is a sitemap useful?

The sitemap lets us see URLs that have been added or modified. They can be crawled faster than if we stumbled upon them by chance.

Image in the sitemap

It works, but it is not worth as much as putting the image in an HTML page.

Keywords

Many believe that to rank well, it is enough to have quality backlinks. But what is the importance of keywords on the site? Is keyword density important to show the topic of the page? What percentage is suggested?

Links are only one factor used in ranking pages. We look at the content both on and off the page, so what is on the page is essential for ranking.
However, there is no recommended keyword density. Your content must be of good quality and written for users. If you write for engines, your content can become unnatural, and that will eventually penalize you rather than help you.

Should we put the keywords in the URL?

It is pointless if they are already on the page. But it is useful for pages containing only images.

How is meta name="keywords" used?

We do not read this meta tag.

Domain name and geotargeting

Sub-domain and visibility

Sub-domains or subdirectories get no more visibility than the domain itself.

Buying a penalized domain

In this case you have to submit the site for reconsideration and indicate the change of ownership.

Generic TLD or country TLD?

Country extensions (ccTLDs) get better treatment in local searches.

ccTLD

The geotargeting setting does not work for country-code extensions such as .fr, .jp and so on (a ccTLD already targets its country).

ccTLD or GWT geotargeting

Both operate in an identical way.

Country code for sub-domain

A country code such as jp, or a country name such as japan, used as a sub-domain is not recognized for geotargeting. Only Webmaster Tools can be used to geotarget it.

Which meta is used for the language: html lang=, meta name=language, or http-equiv=content-language?

None; only the content of the page is used to recognize its language.
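
For reference, the three markups mentioned in the question, with French as a placeholder value; per the answer, none of them drives language detection:

    <html lang="fr">
    <meta name="language" content="fr">
    <meta http-equiv="content-language" content="fr">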

Webmaster tools

Since Google is against software that tracks rankings in the SERPs, are there any plans at Google for an approved service of this kind that webmasters could use?

My feeling is that people often care too much about a set of "winning keywords" and want the best ranking for them even if it is not the best thing for the site. Focusing on one group and forgetting the rest, which also ranks, is not the best idea. Following the statistics given by your server gives a good idea of what works in practice.
But it is a request that we are aware of, and I think it would be nice if we could offer it for a reasonably limited number of keywords.

Will Webmaster Tools one day give us the opportunity to dissociate ourselves from sites that link to us, as Yahoo Site Explorer does?

As it happens, we work so hard to ensure that one site cannot penalize another that we have not provided that option.
If you have a ton of links pointing to your site, scrolling through them would be really tedious. It would be a challenge, and we did not see the need for it, so we have not yet offered it.

Update, July 2011: the Site Explorer service is closing this month.

Is Google planning to show us the ranking of keywords in Webmaster Tools?

That already exists for a limited number of keywords in the "Top Search Queries" feature. Although we do not usually communicate our plans, extending this feature is certainly an idea that has been suggested.

Matt Cutts: In addition to what Susan said, I personally think it would be a good idea.

Does geotargeting in Webmaster Tools have the same weight as a TLD?

Google uses a set of signals, such as the location of the server and the TLD, to determine which users may be interested in the content of the site.
Geotargeting replaces the TLD. If you think of the site as a shop, for example, it defines the people living nearby as the key customers.
Geotargeting should not be used to target a language.

Other questions

How often does your algorithm change?

We change the algorithm all the time. Last year we changed it 450 times.

RSS feeds

RSS feeds should not be indexed; block them with robots.txt. You must also ensure that the feed, when it is displayed on another site, contains a link back to your site.
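
A minimal robots.txt sketch for that advice; the feed paths are placeholders, so adjust them to wherever your CMS serves its feeds:

    # Keep feeds out of the index by blocking them for all robots
    User-agent: *
    Disallow: /feed/
    Disallow: /rss.xml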

Page removal

To have pages removed from the index, it is better to put a noindex meta tag in the page and to allow the robot to crawl it.
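
A sketch of the tag in question:

    <!-- In the page <head>: asks engines to drop the page from the index.
         The page must remain crawlable (not blocked in robots.txt),
         otherwise the robot never sees this tag. -->
    <meta name="robots" content="noindex">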

Paid links bought for us by a competitor

No problem, because we try to determine the intent behind them.

Partial sandbox

A partially penalized site may be submitted for reconsideration at any time.

<title> tags

They serve only as the titles shown in the results.