SEO Checklist Update
who is he?
what does he sell?
where are his customers?
who are his customers?
what does he want? Leads, conversions, visits, ...
how many competitors in SERP?
allintitle:"xxx yyy" analysis
Total Google results count, inurl, allinanchor, ...
"xxx yyy" search analysis
how many rank better than him?
6/12 months target options
more time on site
lower bounce rate
more social sharing
SERP Competition & Competitors
1. Architecture (navigation levels, internal linking, unnecessary redirection, too many URLs, orphaned pages, broken links, ...)
2. Indexing & Crawling (canonical, noindex, follow, nofollow, redirects, robots.txt, sitemap.xml, server errors)
3. Duplicate content & On page SEO (more url same page, repeated text, pagination, parameter based, dupe/missing titles, description, h1s, etc..)
4. Backlink Analysis
Page and Domain Authority (SEOmoz)
Google Webmaster Tools, Bing Webmaster Tools, SEOmoz, Ahrefs.com, Majestic SEO, ...
age of the domain
EMD - keyword exact match in url
registration + hosting data
trust/authority of the host domain
Multi language websites
Use one gTLD
use many ccTLD
does it exist?
is it necessary?
is it correct?
Never block CSS and JS dependencies
Bot-specific directives take priority over generic directives
Directive order does not affect priority
User-agent: *
Disallow: /privatefolder/
Disallow: /privatefile.html

User-agent: Googlebot
Disallow: /nogoogle.html

Sitemap: http://www.mysite.com/sitemap.xml

# Allow all CSS and JS files
Allow: /*.css$
Allow: /*.js$

# Alternatively, explicitly disallow single pages
User-agent: *
Disallow: /~joe/junk.html
Disallow: /~joe/foo.html
Disallow: /~joe/bar.html

# Block everything, sitemap included
User-agent: *
Disallow: /

# Block a file extension
Disallow: /directory/*.extension

# Block a specific folder at any depth
# (e.g. www.example.com/directory/subdirectory/keyword/subdirectory/)
Disallow: /*/keyword/

# Block all URLs containing a specific word
# (e.g. www.example.com/1keyword.ext, www.example.com/keyword3.ext)
Disallow: /*keyword

# Block specific folders containing a word
# (e.g. www.example.com/1keyword1/, www.example.com/keyword3/)
Disallow: /*keyword*/

# Block a page without blocking the same page plus parameters
Disallow: /directory/file.extension$
Disallow: /directory/file.pdf$

# Block all URLs with parameters
Disallow: /*?

# Block all URLs with a "get" parameter
Disallow: /*?*

# Exclude all robots from part of the server
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/

# Exclude a single robot
User-agent: BadBot
Disallow: /

# Allow only Googlebot
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /

# Exclude all robots from the entire server
User-agent: *
Disallow: /

# Exclude all files except one: since there is no universally supported "Allow"
# field, the easy way is to put all files to be disallowed into a separate
# directory, say "stuff", and leave the one file in the level above it
User-agent: *
Disallow: /~joe/stuff/

# Allow all robots complete access
User-agent: *
Disallow:
is it complete?
XML file, typically named "sitemap.xml"
file must be no larger than 50MB when uncompressed
place sitemap in root folder - The location of a Sitemap file determines the set of URLs that can be included in that Sitemap. A Sitemap file located at http://example.com/catalog/sitemap.xml can include any URLs starting with http://example.com/catalog/ but can not include URLs starting with http://example.com/images/.
no more than 50,000 URLs for a single sitemap
If you have more than one Sitemap, you can list them in a Sitemapindex.xml file and then submit the Sitemap index file to Google. You don't need to submit each Sitemap file individually.
If your site is accessible on both the www and non-www versions of your domain, you don’t need to submit a separate Sitemap for each version. However, we recommend picking either the www or the non-www version, and using recommended canonicalization methods to tell Google which version you are using.
Do not include session IDs in URLs
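Given the 50,000-URL and 50MB limits above, a large site has to split its URLs across several sitemap files and list them in a sitemap index. A minimal sketch of the chunking step (the URL list and file-name pattern are made-up examples):

```python
# Split a URL list into sitemap-sized chunks (50,000 URLs max per file,
# per the sitemaps.org protocol). The resulting files would then be listed
# in a single sitemap index file.
MAX_URLS_PER_SITEMAP = 50_000

def split_into_sitemaps(urls, limit=MAX_URLS_PER_SITEMAP):
    """Return a list of (filename, url_chunk) pairs, one per sitemap file."""
    chunks = [urls[i:i + limit] for i in range(0, len(urls), limit)]
    return [(f"sitemap-{n}.xml", chunk) for n, chunk in enumerate(chunks, start=1)]

files = split_into_sitemaps([f"http://example.com/page-{i}" for i in range(120_000)])
# 120,000 URLs -> three sitemap files (50k + 50k + 20k)
```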
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mysite.com/</loc>
    <lastmod>2012-05-25</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>etc, etc, etc</url>
</urlset>
<Tag> - <Importance> - <Description>
<urlset> - Required - Encloses all information about the set of URLs included in the Sitemap.
<url> - Required - Encloses all information about a specific URL.
<loc> - Required - Specifies the URL. For images and video, specifies the landing page (aka play page, referrer page). Must be a unique URL.
<lastmod> - Optional - The date the URL was last modified, in YYYY-MM-DDThh:mmTZD format (time value is optional).
<changefreq> - Optional - Provides a hint about how frequently the page is likely to change. Valid values are:
- always (use for pages that change every time they are accessed)
- hourly
- daily
- weekly
- monthly
- yearly
- never (use for archived URLs)
<priority> - Optional - Describes the priority of a URL relative to all the other URLs on the site, ranging from 0.0 (not important) to 1.0 (extremely important). Does not affect your site's ranking in Google search results: because this value is relative to other pages on your site, assigning a high priority to every URL (or the same priority to all URLs) will not help your search ranking.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://example.com/sample.html</loc>
    <image:image>
      <image:loc>http://example.com/image.jpg</image:loc>
    </image:image>
    <image:image>
      <image:loc>http://example.com/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
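A sitemap in the format shown above can be generated with the Python standard library alone; a minimal sketch (the URL and date are placeholders, and a real file would also need the leading `<?xml ...?>` declaration):

```python
# Minimal sitemap generator following the <urlset>/<url>/<loc> structure
# of the sitemaps.org protocol, using only the standard library.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """entries: list of dicts with 'loc' (required) and optional 'lastmod'."""
    ET.register_namespace("", NS)  # emit the sitemaps.org namespace as default
    urlset = ET.Element(f"{{{NS}}}urlset")
    for entry in entries:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = entry["loc"]
        if "lastmod" in entry:
            ET.SubElement(url, f"{{{NS}}}lastmod").text = entry["lastmod"]
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([{"loc": "http://www.mysite.com/", "lastmod": "2012-05-25"}])
```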
Visible HTML links
Avoid only JS/Flash navigation
Keep low navigation levels
how many URLs are indexed?
how many URLs are crawled by Googlebot everyday?
how many URLs are in the sitemap.xml?
How many URLs found by crawling? (e.g. Screaming Frog)
How many canonical tags?
How many with parameters?
How many URLs are noindex?
How many URLs with duplicated content?
Google index vs sitemap vs crawler
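The three-way comparison above is easiest as set arithmetic; a sketch where the URL lists are made-up stand-ins for (a) `site:` query results, (b) sitemap.xml contents, and (c) a crawler export:

```python
# Compare indexed vs sitemap vs crawled URL sets to spot gaps.
indexed = {"/", "/about", "/old-page"}                      # from site: queries
in_sitemap = {"/", "/about", "/contact"}                    # from sitemap.xml
crawled = {"/", "/about", "/contact", "/orphan?sid=1"}      # from a crawl export

not_indexed = in_sitemap - indexed      # in sitemap but Google skipped it
unwanted_index = indexed - in_sitemap   # indexed but unlisted: check canonicals/dupes
not_in_sitemap = crawled - in_sitemap   # crawlable but missing from the sitemap
```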
operators --> site: -subfolder
www vs not-www
check indexed pages
Indexed Pages Analysis (index vs sitemap)
Google Operators Queries
Main domain Page indexed --> site:example.com/
site:www.example.com -/eng/ -/blog
Primary Index --> site:example.com/*
- intitle:
- inurl:
- intext:
- inanchor:
- link:
- filetype:
sitemap total URLs = indexed pages
Status Score = # URL in Google index / # URL in sitemap
> 0.8 = Good
< 0.8 = Not good
sitemap total URLs < indexed pages
check canonicals, duplicate content/URLs and unwanted indexed files
sitemap total URLs > indexed pages
why are some pages not indexed?
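The Status Score defined above is a simple ratio; a one-function sketch with invented counts:

```python
# Status Score = URLs in Google's index / URLs in the sitemap,
# with 0.8 as the good/not-good threshold described above.
def status_score(indexed_count, sitemap_count):
    return indexed_count / sitemap_count

score = status_score(840, 1000)  # e.g. 840 indexed out of 1000 sitemap URLs
verdict = "Good" if score > 0.8 else "Not good"
```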
Google Webmaster Tools
Google search [site:www.example.com]
HTTP status code
3xx, 4xx, 5xx
Setup an IP redirection
Setup preferred domain
Redirect www to not-www or vice versa
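One common way to implement the www/non-www redirect is an Apache rewrite rule. A hypothetical .htaccess sketch (mod_rewrite assumed, the domain is a placeholder; swap the condition and target to redirect the other way):

```apache
# 301-redirect the bare domain to the www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```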
301 Moved Permanently: The requested resource has been assigned a new permanent URI and any future references to this resource SHOULD use one of the returned URIs. Clients with link editing capabilities ought to automatically re-link references to the Request-URI to one or more of the new references returned by the server, where possible. This response is cacheable unless indicated otherwise.
404 Not Found: The server has not found anything matching the Request-URI. No indication is given of whether the condition is temporary or permanent. The 410 (Gone) status code SHOULD be used if the server knows, through some internally configurable mechanism, that an old resource is permanently unavailable and has no forwarding address. This status code is commonly used when the server does not wish to reveal exactly why the request has been refused, or when no other response is applicable.
302 Found (HTTP 1.1) / Moved Temporarily (HTTP 1.0) A 302 redirect is a temporary redirect. It passes 0% of link juice (ranking power) and, in most cases, should not be used. The Internet runs on a protocol called HyperText Transfer Protocol (HTTP) which dictates how URLs work. It has two major versions, 1.0 and 1.1. In the first version, 302 referred to the status code "Moved Temporarily." This was changed in version 1.1 to mean "Found."
307 Moved Temporarily (HTTP 1.1 Only) A 307 redirect is the HTTP 1.1 successor of the 302 redirect. While the major crawlers will treat it like a 302 in some cases, it is best to use a 301 for almost all cases. The exception to this is when content is really moved only temporarily (such as during maintenance) AND the server has already been identified by the search engines as 1.1 compatible. Since it's essentially impossible to determine whether or not the search engines have identified a page as compatible, it is generally best to use a 302 redirect for content that has been temporarily moved.
If the page you are removing has a suitable alternative page on your website, then 301 it. Do not always 301 the page to your home page. If there is no suitable page (and by suitable I mean a page that is very similar to the one you are removing), then 404 it. In short: 301 if there is a related and similar page, 404 if there is not.
Redirect pages one-to-one, never many-to-one
keyword rich URLs
never use non-ASCII characters
page-title: keyword at beginning
www.example.com/page-title (WordPress 2013)
point to same domain
alpha.example.com can point to www.example.com
point https -> http
canonical syntax for URLs with session IDs
on page: http://www.example.com/page.html?sid=123
<head><link rel="canonical" href="http://www.example.com/page.html"/></head>
no underscore in url
avoid URL parameters
use absolute url inside links: http://...
use a light navigation (1-4 levels)
Internal link structure
check internal link distribution
use specific anchor text
use homepage deep links for top products/pages
check most linked pages
Links Position Weights
Links Higher Up in HTML Code Cast More Powerful Votes
External Links are More Influential than Internal Links
Links from Unique Domains Matter More than Links from Previously Linking Sites
Links from Sites Closer to a Trusted Seed Set Pass More Value
Links from "Inside" Unique Content Pass More Value than Those from Footers/Sidebar/Navigation
Keywords in HTML Text Pass More Value than those in Alt Attributes of Linked Images
Links from More Important, Popular, Trusted Sites Pass More Value (even from less important pages)
Links Contained Within NoScript Tags Pass Lower (and Possibly No) Value
A Burst of New Links May Enable a Document to Overcome "Stronger" Competition Temporarily (or in Perpetuity)
Pages that Link to WebSpam May Devalue the Other Links they Host
Internal Links Distribution
more internal links to important pages
use keyword in anchor text
use keywords in URL
use structured levels: draw a tree/SILO
better no more than 100 link on page
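The ~100-links-per-page guideline is easy to audit with the standard-library HTML parser; a sketch (the HTML snippet is a toy example):

```python
# Count <a href=...> links on a page to check the ~100-links guideline.
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only anchors with an href attribute are real links
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html):
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

n = count_links('<p><a href="/a">a</a> <a href="/b">b</a> <a name="x">no href</a></p>')
# flag the page if n > 100
```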
Warning: Internal nofollow
<a rel="nofollow" href="http://www.example.com">Example</a>
Google Webmaster Tools
Download a backlink report to see if you're missing out on links pointing to orphaned, 302 or incorrect URLs on your site. If you find people linking incorrectly, add 301 rules on your site to harness that link juice.
Open Site explorer
page A "index"
page A "noindex"
page B "disallow"
page A "English"
page B "French"
page A "404"
page B "404"
page A "disallow"
page A "nofollow"
anchor text keyword rich
absolute link under HTTP
relative links under HTTPS
find non-HTML elements
with Google cache
can you see all elements?
fetch as Googlebot - GWMT
No-JS Navigation check
try disable JS in browser
are you still able to use and navigate the website?
remove unused rules
Check text/html ratio
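A rough text-to-HTML ratio can be computed as visible text length divided by total source length; thresholds vary by tool, so treat the number as a relative signal. A sketch (ignores the fact that script/style contents would also be counted as text by this simple parser):

```python
# Rough text/HTML ratio check using the standard-library parser.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

def text_html_ratio(html):
    extractor = TextExtractor()
    extractor.feed(html)
    text = "".join(extractor.parts).strip()
    return len(text) / len(html) if html else 0.0

ratio = text_html_ratio("<html><body><p>Hello world</p></body></html>")
```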
Check HTML declared language vs real language
First TAG position: <head><title>Title</title></head>
Length: max 56 characters, including spaces
use important keywords at the beginning of the title
Weight: Keyword < Category | Website Title
Tool: AdWords keyword research
don't repeat keywords
unique titles for every page
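The two title rules above (max length, uniqueness across pages) can be checked in a few lines; the URL/title pairs here are invented examples:

```python
# Flag titles that are too long or duplicated across pages.
MAX_TITLE_LEN = 56

def title_problems(titles):
    """titles: dict of URL -> <title> text. Returns a list of issue strings."""
    issues = []
    seen = {}
    for url, title in titles.items():
        if len(title) > MAX_TITLE_LEN:
            issues.append(f"{url}: title longer than {MAX_TITLE_LEN} chars")
        if title in seen:
            issues.append(f"{url}: duplicate title (also on {seen[title]})")
        else:
            seen[title] = url
    return issues

issues = title_problems({
    "/a": "Blue Widgets | Example Shop",
    "/b": "Blue Widgets | Example Shop",  # duplicate of /a
    "/c": "A" * 70,                       # exceeds the 56-char limit
})
```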
Avoid Stop Words
articles (such as “the”, ”an” and “a”)
auxiliary verbs (such as “am”, “is”, and “can”)
conjunctions (such as “and”, “or”, “but” and “while”)
particles (such as “if”, “then”, and “thus”)
prepositions (such as “of”, “that”, “on” and “for”)
pronouns (such as “he”, “we”, “which” and “her”)
Use Title keywords inside description text
length: max 156 characters
use keywords at the beginning
repeat TOP keywords max 2x
unique description for every page
from 5 to 20 words, include title keywords
unique SET for every page
if the page is an AdWords landing page, use AdWords bought keywords
META Language Tag
<meta http-equiv="content-language" content="it">
Tip: better placed in sitemap
Multi Language:rel="alternate" hreflang="x"
in HEAD section
<link rel="alternate" hreflang="en" href="http://www.example.com/page.html" />
<link rel="alternate" hreflang="en-gb" href="http://en-gb.example.com/page.html" />
<link rel="alternate" hreflang="en-us" href="http://en-us.example.com/page.html" />
<link rel="alternate" hreflang="de" href="http://de.example.com/seite.html" />
Meta Refresh (Warning, not safe)
Meta refreshes are a type of redirect executed on the page level rather than the server level. They are usually slower, and not a recommended SEO technique. They are most commonly associated with a five-second countdown with the text "If you are not redirected in five seconds, click here." Meta refreshes do pass some link juice, but are not recommended as an SEO tactic due to poor usability and the loss of link juice passed.
Force page refresh
Place inside <head> to refresh the page after 5 seconds:
<meta http-equiv="refresh" content="5">
Redirect to http://example.com/ after 5 seconds:
<meta http-equiv="refresh" content="5; url=http://example.com/">
Redirect to http://example.com/ immediately (better):
<meta http-equiv="refresh" content="0; url=http://example.com/">
An alternative is by sending an HTTP redirection header, such as HTTP 301 or 302
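To illustrate the server-side alternative, here is a framework-agnostic sketch of the raw HTTP response a server would send instead of a meta refresh (the URL is a placeholder):

```python
# Build a raw HTTP redirect response: 301 for permanent moves,
# 302 for temporary ones, as discussed in the status-code notes above.
def redirect_response(location, permanent=True):
    status = "301 Moved Permanently" if permanent else "302 Found"
    return f"HTTP/1.1 {status}\r\nLocation: {location}\r\n\r\n"

response = redirect_response("http://example.com/")
```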
Robots meta tag
upload the robots.txt in the root directory
In XHTML, the language is declared inside the <html> tag as follows:
<html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en">...</html>
<meta name="robots" content="noindex">
<meta name="robots" content="nofollow" />
<meta name="robots" content="noarchive">
no Google cache version
no open directory project
<meta name="robots" content="NOODP">
<meta name="googlebot" content="nosnippet">
better in sitemap.xml
<link rel="alternate" hreflang="fr" href="http://www.ex.com/fr/index.html" />
<link rel="alternate" hreflang="en" href="http://www.ex.com/en/index.html" />
<a href="http://www.w3schools.com" hreflang="en">W3Schools</a>
<meta name="robots" content="noindex">
<meta name="googlebot" content="unavailable_after: 25-Aug-2007 15:00:00 EST">
X-Robots-Tag: unavailable_after: 7 Jul 2007 16:30:00 GMT
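For non-HTML files (e.g. PDFs) that cannot carry a robots meta tag, the X-Robots-Tag header can be set at the server level. A hypothetical Apache sketch (mod_headers assumed):

```apache
# Send noindex/noarchive for all PDF files via the HTTP header
<Files ~ "\.pdf$">
  Header set X-Robots-Tag "noindex, noarchive"
</Files>
```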
Headings H1 - H6
Use H1 once per page; H2-H6 can be repeated
Use in order: H1>H2>H3>H4...
Headings should contain TOP keyword phrases
Length: 2-6 words
Google Docs IMPORTXML() function
Strong & Italic
Use it on Keyword phrases and related terms
rel prev, rel next
Google Search Console
Original images perform better
if you can't: filter it, resize it, mirror it, ...
image tag alt="define"
1 word every 16*40 pixels
include relevant keywords
unique text for each image
image tag title="define"
always define image dimensions in HTML
spider supported formats: BMP, GIF, JPEG, PNG, WebP or SVG.
Additionally:
- the image filename is related to the image's content;
- the alt attribute of the image describes the image in a human-friendly way;
- the HTML page's textual contents, as well as the text near the image, are related to the image.
Logo alt tag: "brand name" > "home" > "logo"
Compression .JPG 80%
Upload scaled images
GD Star Rating
Google Rich Snippets Testing tool
Rich Snippet submission form
Use Local business markup
Custom 404 page
check status code
must be 404!
1. Post it on your website with no strings attached. It's free and you require no personal information from prospects
2. Blog about it
3. E-mail your in-house database
4. Post it on your social media profiles
5. Publish a press release (pitch it to the media too)
6. Create an ad campaign using banner and text ads
7. Reach out to popular and respected bloggers in your industry and get them to blog about it
8. Mention it in your next monthly newsletter
9. Use it as a basis for a webinar or podcast episode
10. Produce a video about it
In blog use categories & TAGS
Study SERP to find nice free places
AdWords keyword Tool for traffic
study best title
study right keywords
use at least 350 words
too many keyword repetitions
bad human readability
write with steps
alt tag with main keywords
title tag with main keywords
image file name with main keywords
call to action
forms on landing pages
easy words and phrases
Content and Usability
- Mobile friendly website (WordPress Touch/Mobify/...)
- Mobile ads (SMS text/video/Google Mobile ads)
- Mobile & Social integration
- Mobile apps/QR codes
body text & word count > 350
on site analysis
AdWords keywords tools
Branded / not branded
time on site
Google SERP Analysis
easy ranking areas
what do they do?
SEO and PPC competition
time on site
Keep ranking history
EVE Milano Keywords Tool
use rel="alternate" hreflang annotations
change language button
redirect to the same page
do not redirect to the homepage!
set up the meta viewport
desktop and mobile have the same URL
set up the HTTP Vary header
Dedicated mobile site (m.)
set up mobile redirects
check web server performance
if a backlink brings authority, 301-redirect the 404
check Link popularity for the 404 resource
check Backlink anchor text
check Most linked pages
avoid too many site wide backlinks
Local directories to start
Rank inbound link?
KW + brand name
City + Brand Name
City + Service Keyword
Exact Service Keyword
different KW for different landing
See social section
See social section
use nofollow tag
publish quality content for natural linking
Blogs and Forums
find dofollow comments
warning: don't buy links
yes dedicated page
on related blogs
Use partial RSS file
Register RSS to Aggregators websites
Insert deep links inside RSS
Correct Broken links - 404
use Redirect 301
use Redirect 302
- Ask for link removal
- Ask for a nofollow tag
- noindex on destination page
- Disallow with robots.txt
- serve a 410
- serve a 404
- copy the page, move internal links to the copy, and noindex the original
Authorship Link for bloggers
use Keyword and description
use sidebar links
Call to action
analysis and shared answers
Like button indication
Open Graph integration
min CTR > 0.03 (3%)
min CPL > 0.3 (30%)
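Checking a campaign against the 3% CTR and 30% lead-rate minimums above takes two ratios; the campaign numbers here are invented (and the 30% figure is read as a click-to-lead conversion rate):

```python
# Compare campaign metrics to the CTR and lead-rate minimums above.
def ctr(clicks, impressions):
    return clicks / impressions

def lead_rate(leads, clicks):
    return leads / clicks

campaign_ctr = ctr(450, 10_000)    # 0.045 -> above the 0.03 minimum
campaign_lr = lead_rate(150, 450)  # ~0.33 -> above the 0.3 minimum
```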
social shared ads
use url builder
auto Hashtag generation
budget: min 10$/day and 2$ click
not all languages (!)
Share button plugin on website
Pin it button plugin
create topic dashboards
follow the moods, don't use it only to promote
to Company page
to Website URL
Open Graph TAG implementation
"Add to my circles" Button
Pin It Button
Follow Me Button
Internal duplicate content
External duplicate content
Low quality and/or thin content
Bad backlink profile
check kw rank history
server down time
spam and site-wide links
link pruning activities
malware on server
if unique source
long URLs with too many parameters
too many levels
high bounce rate
short time on page
low quality out-bound links
spam inbound links
too many transactional anchor texts
site wide links
dofollow sponsor links