Ajax and Adsense, the solution
The combination of Adsense and modern Web technologies such as Ajax is a source of dilemmas for webmasters. Adsense provides contextual ads: textual or graphical ads chosen to be relevant to the content of the page. To do that, the crawler parses the page served by the server, extracts keywords, selects in its database the advertisements whose keywords (defined in the Adwords administration panel, not in the ad itself) match those of the page, and assigns the relevant ads to the page.
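The matching process can be pictured with a toy sketch (this is only an illustration of the principle, not Google's actual algorithm): extract keywords from the visible text of the page, then rank candidate ads by how many of their declared keywords appear on it.

```javascript
// Toy illustration of contextual matching: score each ad by how many
// of its declared keywords occur in the page text, and pick the best.
function extractKeywords(text) {
  const stopWords = new Set(["the", "a", "an", "of", "to", "and", "is", "in"]);
  return new Set(
    (text.toLowerCase().match(/[a-z]+/g) || [])
      .filter((w) => w.length > 2 && !stopWords.has(w))
  );
}

function bestAd(pageText, ads) {
  const pageKeywords = extractKeywords(pageText);
  let best = null;
  let bestScore = -1;
  for (const ad of ads) {
    // Score = number of the ad's keywords found in the page text.
    const score = ad.keywords.filter((k) => pageKeywords.has(k)).length;
    if (score > bestScore) {
      bestScore = score;
      best = ad;
    }
  }
  return best;
}
```

The point to remember is that only the text the crawler actually receives enters this scoring: anything loaded later by JavaScript is invisible to it.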
Web 2.0 makes life harder for robots: the text is not parsed
When the crawlers parse the page, either to index it or to choose the most relevant contextual ads, the dynamic content is not available to them and is ignored. The contextual ads are relevant to the static part of the page only.
Text loaded dynamically and invisible to search engines is not the only drawback of Ajax: pages, or even an entire section of a site, can be totally unknown to crawlers and missing from search results if the site is completely dynamic. In fact, in the quest for relevance, links are as important to the Adsense robot as the content of the page. Consider a paragraph used as a dynamic link:
<p onclick="myfunction()"> ...text... </p>
To make it visible to crawlers, it is sufficient to turn it into HTML link:
<a href="mypage.html" onclick="myfunction(); return false;"> ...text... </a>
It makes no difference to the user: it is still the onclick handler that the browser executes (the added return false prevents it from also following the link), but the link itself can be taken into account by the robot, which reads the contents of href and ignores the onclick value. The filename will then participate in the determination of keywords for the target page.
How to solve the problem?
In an article on another website, I have described a means to have content indexed by search engines even if it is loaded dynamically into a page. The idea is to add a responseHTML attribute to the XMLHttpRequest object, which makes it possible to insert into a page, dynamically, text extracted from other HTML pages that are themselves indexed by the engines.
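The heart of such a helper can be sketched as follows (an illustrative sketch, not the article's actual responseHTML implementation): extract the body content of a fetched HTML document so that it can be injected into the current page.

```javascript
// Illustrative sketch of the idea behind a responseHTML helper:
// keep only what lies between <body ...> and </body> of a fetched page.
function bodyContent(html) {
  const match = html.match(/<body[^>]*>([\s\S]*)<\/body>/i);
  return match ? match[1].trim() : html;
}

// In the browser, the extracted fragment would then be injected, e.g.:
//   const xhr = new XMLHttpRequest();
//   xhr.open("GET", "otherpage.html");
//   xhr.onload = function () {
//     document.getElementById("target").innerHTML = bodyContent(xhr.responseText);
//   };
//   xhr.send();
```

Since otherpage.html is a plain static document, the engines index it normally, even though its text is also displayed dynamically elsewhere.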
I don't know if this is applicable here, since we want the text to be scanned INSIDE the page to make the ads relevant to its content. This can be achieved only if the whole text is present when the page is loaded.
Search engines are now able to parse Ajax content, as explained in the article Ajax crawlable. It is just not easy to do for non-programmers.
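That scheme (Google's AJAX crawling scheme, since deprecated) relied on "#!" URLs: the crawler rewrites a fragment such as #!section=2 into an _escaped_fragment_ query parameter, and the server must answer that request with a static HTML snapshot of the dynamic page. The URL mapping can be sketched as:

```javascript
// Sketch of the URL rewriting defined by the (now deprecated) AJAX
// crawling scheme: a "#!" fragment becomes an _escaped_fragment_ query
// parameter that the server answers with a static HTML snapshot.
function escapedFragmentUrl(url) {
  const pos = url.indexOf("#!");
  if (pos === -1) return url; // no crawlable fragment, nothing to rewrite
  const base = url.slice(0, pos);
  const fragment = encodeURIComponent(url.slice(pos + 2));
  const separator = base.includes("?") ? "&" : "?";
  return base + separator + "_escaped_fragment_=" + fragment;
}
```

The hard part for non-programmers is not this mapping but the server side: producing the HTML snapshot that the rewritten URL must return.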
The issue of internal links
Robots are unaware of internal links, marked by the # symbol in the URL, to a chapter or a section of a page.
A link to a section in another page has this form:
<a href="mypage.html#mysection"> </a>
The crawler will only see the link to mypage.html, not the section. The same goes for a dynamic link (which may be, remember, sometimes taken into account, sometimes ignored by robots):
<p onclick="myfunction('mypage.html#anchor=mysection')"> </p>
In this case, the internal link to the section is passed as a parameter. It must be transformed in the following manner:
<a href="mypage.html#anchor=mysection" onclick="myfunction('mypage.html#anchor=mysection')"> </a>
And the link will be indexed as a whole, with the filename and the parameter. The link will be seen by the Adsense robot, and the keywords contained in the filename and the anchor will be taken into account, which will bring more relevant ads on the target page.
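On the target page, a small script has to interpret that anchor parameter itself, since the browser will not scroll to a fragment named anchor=mysection on its own. A sketch, assuming the #anchor=name convention used above:

```javascript
// Extract the section name from a fragment of the form "#anchor=mysection"
// (the #anchor=... convention is the one used in the example above).
function sectionFromHash(hash) {
  const match = hash.match(/^#anchor=(.+)$/);
  return match ? match[1] : null;
}

// In the browser, run it on page load to load and reveal the section:
//   const section = sectionFromHash(window.location.hash);
//   if (section) {
//     myfunction("mypage.html#anchor=" + section); // load the dynamic content
//     document.getElementById(section).scrollIntoView();
//   }
```

This way the same URL works for both audiences: the robot indexes the filename and the parameter, and the visitor lands on the right section.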
Using Ajax may require some sacrifices: this technology should be used mainly to present data that does not need to be indexed in relation with the Ajax page, but that is indexed in other documents instead.
From the start, you must design your pages, and your site as a whole, not for search engines but by incorporating the concept of crawlable links: it must be possible, from each page of the site, to reach any other page by following links, whether you are a visitor or a crawler.
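That reachability requirement can be checked mechanically: model the site as a graph of pages and links, then verify by breadth-first traversal that every page can reach every other. A minimal sketch over an in-memory link map (the page names are made up for the example):

```javascript
// Breadth-first traversal: collect every page reachable from `start`
// by following links in the site's link map.
function reachable(links, start) {
  const seen = new Set([start]);
  const queue = [start];
  while (queue.length > 0) {
    const page = queue.shift();
    for (const target of links[page] || []) {
      if (!seen.has(target)) {
        seen.add(target);
        queue.push(target);
      }
    }
  }
  return seen;
}

// Every page must reach every other page for the site to be fully crawlable.
function fullyCrawlable(links) {
  const pages = Object.keys(links);
  return pages.every((p) => reachable(links, p).size === pages.length);
}
```

A page for which this check fails is exactly the kind of page that ends up missing from search results, and whose ads lose relevance for lack of indexed context.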