How can I make Google crawl content behind an AJAX load more button


Solution 1

If you use pagination with full page reloads, search engines will crawl it automatically, provided each page has its own URL.
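
For example, plain pagination links along these lines are crawlable; the /questions?page=N URL scheme here is just an illustration, any scheme with one distinct URL per page works:

    <!-- Plain links with one distinct URL per page; crawlers follow these. -->
    <nav>
      <a href="/questions?page=1">1</a>
      <a href="/questions?page=2">2</a>
      <a href="/questions?page=3">3</a>
      <a href="/questions?page=2">Next</a>
    </nav>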

When you use AJAX to load more content into the same page, the above does not apply. An alternative is to dynamically change the URL with AJAX, without appending # to the URL, because search engines do not crawl anything after the #.

You can achieve this with the History.js jQuery library. It uses the HTML5 History API in browsers that support it; in older browsers such as IE, it falls back to appending the # symbol.

With this method each page gets its own URL, and search engines can then index your page content.
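
As a rough sketch of the idea, here is what a "Load More" click handler could look like using the native HTML5 History API that History.js wraps; the /feed/<n> URL pattern (mentioned in the comments below) and the fetchPage() helper are hypothetical:

    // Hypothetical "Load More" handler. fetchPage() stands in for whatever
    // AJAX call loads the next 10 items; the /feed/<n> URLs are illustrative.
    var page = 1;
    document.getElementById('load-more').addEventListener('click', function () {
      page += 1;
      fetchPage(page, function (html) {
        // Append the new items to the feed...
        document.getElementById('feed').insertAdjacentHTML('beforeend', html);
        // ...and give the new state its own crawlable URL (no # involved).
        history.pushState({ page: page }, '', '/feed/' + page);
      });
    });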

Also make sure you add all of those URLs to your sitemap.xml.
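
A sitemap along these lines would list every paginated URL (example.com and the /feed/ paths are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>https://example.com/feed/1</loc></url>
      <url><loc>https://example.com/feed/2</loc></url>
      <url><loc>https://example.com/feed/3</loc></url>
    </urlset>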

Solution 2

The best option is to have a noscript fallback to regular pagination. The search engine will pick up on the links in the noscript section and index those pages. This has the added benefit that anyone browsing your site without JavaScript enabled will get a better user experience.
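
A minimal sketch of that fallback, with an illustrative URL scheme, could look like this:

    <button id="load-more">Load More</button>
    <noscript>
      <!-- Crawlers and users without JavaScript get a plain link instead
           of the AJAX button; the /feed/2 URL is illustrative. -->
      <a href="/feed/2">Next page</a>
    </noscript>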

Solution 3

You should provide a mechanism other than pagination to reach all the questions and answers on your site. Pagination is a poor navigation technique because:

  • It is rarely used by visitors; typically only about 2% of visitors actually use pagination.
  • Using AJAX for it makes it uncrawlable.
  • Even without AJAX, it causes PageRank to dissipate so quickly that by page 4, none of the questions get any juice.

Instead you should:

  • Create a sitemap XML file that lists all your content and submit it in Google Webmaster Tools.
  • Use tags to group your questions by theme. Have tag pages that list the questions in each tag.
  • Link questions directly to other questions using "related" links or something similar (see the markup sketch after this list).
  • Consider expanding the number of items on the first page of pagination. If you have 100 items per page, that means that search engines will be able to find 10 times the amount of content from the first page alone.
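
As a sketch of the "related" links idea, plain anchor links between questions give crawlers a JavaScript-free path through the site; the URLs and titles here are hypothetical:

    <!-- Hypothetical "related questions" block rendered on each question page. -->
    <aside>
      <h2>Related questions</h2>
      <ul>
        <li><a href="/questions/123/how-do-i-paginate">How do I paginate?</a></li>
        <li><a href="/questions/456/ajax-and-seo">AJAX and SEO</a></li>
      </ul>
    </aside>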

Comments

  • Jad Joubran
    Jad Joubran over 1 year

    I have a feed that has 10 questions and answers loaded when you first open the page. And then I have a "Load More" button that gets you 10 new questions and answers every time you click on it.

    How can I make Googlebot crawl those new questions and answers as if they were paginated, i.e.:

    • page 1 ==> 0-9 questions and answers
    • page 2 ==> 10-19 questions and answers
    • and so on.

    Or can you provide a better alternative?

  • Jad Joubran
    Jad Joubran over 10 years
    it makes sense.. that's the answer to my question. I just need a small clarification.. how can I tell Google to crawl those extra pages? for example: feed/2 feed/3 feed/4 etc.. should I add a meta tag or what? and will google crawl them all or should I add them to the sitemap.xml file? Thank you
  • Jad Joubran
    Jad Joubran over 10 years
    I am already sending the questions in the sitemap, but the problem is I need to add more internal links to those questions.. I am doing the "related" links which is a great idea.. I am also working on a directory to do the thing you mentioned.. but I just also need the 3 feeds that I have to link to those questions.. so how can I do it?
  • Stephen Ostermiller
    Stephen Ostermiller over 10 years
    I added one more item to my answer
  • Ivo van der Veeken
    Ivo van der Veeken over 10 years
    Your best bet is to put them in the sitemap. This way Google will definitely find the pages.
  • Jobin
    Jobin over 10 years
    @JadJoubran if you are using any CMS for your site, you can set the meta tags with AJAX as well (you can achieve it even with a normal script because the URL has params). Also include those pages in sitemap.xml; search engines will then treat them as normal pages. In my opinion you should include them in the XML sitemap and add the meta data to the page dynamically when using pagination.
  • Giacomo1968
    Giacomo1968 over 10 years
    Or better yet, add user agent detection so that spider/robot-friendly content is served to crawlers when they access a page.
  • Jad Joubran
    Jad Joubran over 10 years
    @Kris can you please provide an example?
  • thomasrutter
    thomasrutter over 10 years
    The <noscript> option is superior to user agent detection. Make sure it's equivalent content though, not something spammy or dodgy in an attempt to mislead search engines.
  • unor
    unor over 10 years
    Do you have a source for "Usually only 2% of visitors actually use pagination."?
  • Stephen Ostermiller
    Stephen Ostermiller over 10 years
    I have measured usage of pagination on two different sites and it came out that low each time. In both cases it was lists of items that users could choose from. We found that users like search and filters for choosing an item far more than pagination.