Why is Pagination an SEO Issue?
Pagination, the practice of segmenting links to content across multiple pages, affects two critical elements of search engine accessibility.
- Crawl Depth: Best practices demand that search engine spiders reach content-rich pages in as few "clicks" as possible (turns out, users like this, too). This also impacts calculations like Google's PageRank (or Bing's StaticRank), which determine the raw popularity of a URL and are an element of the overall algorithmic ranking system.
- Duplicate Content: Search engines take duplication very seriously and attempt to show only a single URL that contains any given piece of content. When pagination is implemented improperly, it can cause duplicate content problems, both for individual articles and the landing pages that allow browsing access to them.
When is Pagination Necessary?
When a site grows beyond a few dozen pages of content in a specific category or subcategory, listing all of the links on a single page of results can make for unwieldy, hard-to-use pages that seem to scroll indefinitely (and can cause long load times as well).
Numbers of Links & Pages
We know that sometimes pagination is essential - one page of results just doesn't cut it in every situation. But just how many links to content should the average category/results page show? And how many pages of results should display in the pagination?

There are a lot of options here, but there's serious danger in using the wrong structures. Let's take a look at the right (and wrong) ways to determine link numbers.
In some cases, there are simply too many pages of results to list them all. When this happens, the very best thing you can do is to work around the problem by... creating more subcategories! It may seem challenging or even counter-intuitive, but adding either an extra layer of classification or a greater number of subcategories can have a dramatically positive impact on both SEO and usability.
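To see why, here's a rough sketch of the arithmetic (the numbers are hypothetical, not drawn from any particular site): splitting one large flat category into subcategories shrinks the pagination each listing page needs, which pulls every article much closer to the top of the site.

```typescript
// Hypothetical numbers: how subcategorization cuts the size of a pagination structure.

function paginatedPages(articleCount: number, linksPerPage: number): number {
  // Number of results pages needed to list every article in a category.
  return Math.ceil(articleCount / linksPerPage);
}

// One flat category: 10,000 articles at 50 links per page = 200 results pages,
// so the oldest articles can sit many clicks away from the category's top page.
console.log(paginatedPages(10_000, 50)); // 200

// Split into 20 subcategories (~500 articles each): only 10 results pages apiece,
// and every article is reachable within a few clicks of its subcategory landing page.
console.log(paginatedPages(10_000 / 20, 50)); // 10
```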
There are times, however, when even the creation of many deep subcategories isn't enough. If your site is big enough, you may need to have extensive pagination such that not every page of results can be reached in one click. In these cases, there are a few clear dos and don'ts.
Do:
- Try to link to as many pages of the pagination structure as possible without breaking the 100(ish) links per page limit
- Show newer content at the top of the results list when possible, as this means the most link juice will flow to newer articles that need it (and are temporally relevant)
- Use and link to relevant/related categories & subcategories to help keep link juice flowing throughout the site
- Link back to the top results from each of the paginated URLs
- Show only a few surrounding paginated links from paginated URLs - you want the engines to be able to crawl deeper from inside the structure (a sketch of this link-window logic follows the lists below)
Don't:
- Link to only the pages at the front and end of the paginated listings; this will flow all the juice to the start and end of results, ignoring the middle
- Try to randomize the paginated results shown in an effort to distribute link juice; you want a static site architecture the engines can crawl
- Try to use AJAX to get deeper in the results sets - engines follow small snippets of JavaScript (sometimes), but they're not at a point where this is an SEO best practice
- Go over the top trying to get every paginated result linked-to, as this can appear both spammy and unusably ugly
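Here's a minimal sketch of the link-window approach from the "do" list above (my own illustration, not code from the article): always link back to the top results page, show a handful of surrounding pages, and stay well under the ~100-links-per-page guideline.

```typescript
// Hypothetical helper: which page numbers to link from a given paginated URL.

function paginationWindow(current: number, totalPages: number, radius = 3): number[] {
  const pages = new Set<number>([1]); // always link back to the top results page
  for (let p = current - radius; p <= current + radius; p++) {
    if (p >= 1 && p <= totalPages) pages.add(p); // only a few surrounding pages
  }
  return [...pages].sort((a, b) => a - b);
}

console.log(paginationWindow(4, 7));    // [1, 2, 3, 4, 5, 6, 7]
console.log(paginationWindow(42, 200)); // [1, 39, 40, 41, 42, 43, 44, 45]
```

Because each page links a few steps in both directions, engines can keep crawling deeper from inside the structure rather than relying on the top page to reach everything.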
Titles & Meta Descriptions for Paginated Results
In most cases, the title and meta description of paginated results are copied from the top page. This isn't ideal, as it can potentially cause duplicate content issues. Instead, you can employ a number of tactics to help solve the problem.

Example of results page titles & descriptions:
Top Page Title: Theatres & Playhouses in Princeton, New Jersey
Top Page Meta Description: Listings of 368 theatres, playhouses and performance venues in the Princeton, NJ region (including surrounding cities).
Page 4 Title: Page 4 of 7 for Princeton, New Jersey Theatres & Playhouses
Page 4 Meta Description: Listings 201-250 (out of 368) theatres, playhouses and performance venues in the Princeton, NJ region (including surrounding cities).
Alternate Page 4 Title: Results Page 4/7 for Princeton, New Jersey Theatres & Playhouses
Alternate Page 4 Meta Description: -
Yes, you can use no meta description at all, and in fact, if I were setting up a CMS today, this is how I'd do it. A missing meta description reduces complexity and the potential mis-casting of URLs as duplicates. Also notice that I've made the titles on results pages sub-optimal to help dissuade the engines from sending traffic to these URLs rather than the top page (which is made to be the better "landing" experience for users).
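If your CMS builds these strings programmatically, the pattern above is straightforward to template. A minimal sketch (hypothetical helper functions, not from the article):

```typescript
// Hypothetical helpers following the pattern above: a strong title/description on the
// top page, deliberately weaker ones (or none at all) on deeper results pages.

function resultsPageTitle(topic: string, page: number, totalPages: number): string {
  return page === 1 ? topic : `Page ${page} of ${totalPages} for ${topic}`;
}

function resultsPageDescription(page: number, perPage: number, total: number): string | undefined {
  // Returning undefined (no meta description) on deep pages is also a reasonable choice.
  if (page === 1) {
    return `Listings of ${total} venues in the region (including surrounding cities).`;
  }
  const first = (page - 1) * perPage + 1;
  const last = Math.min(page * perPage, total);
  return `Listings ${first}-${last} (out of ${total}) venues in the region.`;
}

console.log(resultsPageTitle("Princeton, New Jersey Theatres & Playhouses", 4, 7));
// "Page 4 of 7 for Princeton, New Jersey Theatres & Playhouses"
```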
Nofollows, Rel=Canonicals and Conditional Redirects
Some SEOs and website owners have, unfortunately, received or interpreted advice incorrectly about employing directives like the nofollow tag, canonical URL tag or even conditional redirects to help control bot activity in relation to pagination. These are almost always a bad idea.

Whatever you do, DO NOT:
- Put a rel=canonical directive on paginated results pointing back to the top page in an attempt to flow link juice to that URL. You'll either misdirect the engines into thinking you have only a single page of results or convince them that your directives aren't worth following (as they find clearly unique content on those pages).
- Add nofollow to the paginated links on the results pages. This tells the engines not to flow link juice/votes/authority down into the results pages that desperately need those votes to help them get indexed and pass value to the deeper pages.
- Create a conditional redirect so that when search engines request paginated results, they 301 redirect or meta refresh back to the top page of results.
Letting Users Display More/Less Results
From a usability perspective, this can make good sense, allowing users with faster connections or a greater desire to browse large numbers of results at once to achieve these goals. However, it can cause big duplicate content problems for search engines and add complexity and useless pages to the engines' indices. If/when you create these systems, employ JavaScript/AJAX (either with or without the URL hash) to make the pages reload without creating a separate URL.

(The Google Analytics interface allows users to choose the number of rows shown, though they don't have to worry much about crawlability or search-friendliness.)
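As a rough sketch of that approach (the endpoint, query parameters, and element id here are all hypothetical, not from any particular CMS), the extra results can be fetched and injected in place so the page's URL never changes:

```typescript
// Hypothetical browser-side sketch: load more results in place via fetch/AJAX so that
// choosing "show more" never creates a separate, crawlable/indexable URL.

async function loadMoreResults(category: string, offset: number, count: number): Promise<void> {
  // Assumed endpoint that returns an HTML fragment of additional listings.
  const url = `/results-fragment?category=${encodeURIComponent(category)}&offset=${offset}&count=${count}`;
  const response = await fetch(url);
  const fragment = await response.text();

  // Append the new listings in place; the page's path and query string stay the same.
  document.querySelector("#results-list")?.insertAdjacentHTML("beforeend", fragment);

  // Optionally track the state in the hash only, which does not create a new URL for engines.
  window.location.hash = `offset=${offset + count}`;
}
```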