Complete SEO guide for meta tags, URLs, robots, sitemaps, social tags, multi-language sites, local search, mobile, page speed and more!
SEO (Search Engine Optimization) is an awkward topic for most web developers. Unlike web development, SEO is not an exact science, and the impact of specific SEO actions can become visible after days, weeks, months, or sometimes never. SEO is quite different from other sorts of web development, where you make a code change, switch to the browser tab, refresh, and voila: your changes are instantly visible.
Searching for SEO best practices often returns contradictory and outdated information or topics related to keyword research, content optimization, and link building. It can be really easy to get lost in all the constant updates, and dismiss SEO as something only SEO experts should do. This SEO guide jumps right to the meat of it:
- it provides current, regularly updated SEO best practices
- it focuses only on SEO topics relevant to web developers
- it skips SEO topics which are better suited for SEO experts, marketers and business owners (keyword building, content optimization, link building)
SEO practices can be divided into two main categories: on-site SEO, and off-site SEO. On-site SEO is a practice of optimizing both the content and HTML source code of a page. Off-site SEO refers to external signals and links elsewhere on the Internet. In this guide, I’ll focus exclusively on on-site SEO, and mainly on source code and web server optimizations.
The title tag (<title></title>) is a strong SEO signal and one of the first impressions users get of a website. It is used in several very important places:
- search engine results pages (SERPs)
- social networks
- web browser tabs
The exact length of the title displayed in SERPs is based on container width (600px), not the number of characters. Google will typically display 50–70 characters, but try to keep titles under 60 characters.
<title></title> is a required HTML element and should not be blank. Also, each title should be unique — don’t use defaults for multiple pages!
Put the most important and specific keywords first, and add the brand name at the end of the title, e.g.:
- Nike LunarGlide 8 Women’s Running Shoe. Nike.com
- Apple MacBook Pro 13-Inch (2017) Review & Rating | PCMag.com
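Following the pattern above, the first example would be marked up like this:

```html
<title>Nike LunarGlide 8 Women's Running Shoe. Nike.com</title>
```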
The meta description is an HTML meta tag that provides a short summary of a web page. Search engines often display it under the blue clickable links in search engine results pages (SERPs), although they sometimes ignore it.
Although meta description doesn’t influence rankings directly, it can impact click-through rates, which do influence rankings.
Google typically displays around 160 characters, but try to keep description under 155 characters.
Like the title tag, each meta description should be unique. However, unlike the title tag, the meta description is not required and you can omit it. You should write a description when targeting one to three heavily searched terms, or when you expect users to share a page on social networks. You can leave it blank when you are targeting long-tail traffic (three or more keywords).
Never use double quotation marks in the meta description (or any other HTML attribute), because search engines will cut off your description when they encounter a quote character.
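A minimal example (the description text itself is just illustrative):

```html
<meta name="description" content="A practical SEO guide for web developers: meta tags, URLs, robots, sitemaps, social tags and more.">
```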
Use the alt attribute on an img tag to provide alternative text that describes the image. It will be used by screen readers, it will be displayed when the image cannot be loaded, and it will help search engines index the image properly.
Give your images detailed, informative file names and descriptive alt text to help search engines index images properly.
When width and height attributes are specified, a web browser can begin to render a page even before images are downloaded.
The most popular screen readers cut off alt text at around 125 characters.
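Putting the recommendations above together, an image tag might look like this (the file name and dimensions are illustrative):

```html
<img src="nike-lunarglide-8-womens-running-shoe.jpg"
     alt="Side view of the Nike LunarGlide 8 women's running shoe in white"
     width="600" height="400">
```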
a tag (hyperlinks)
Use descriptive keywords in anchor text that give a sense of topic of the target page. Anchor text should be succinct, relevant to the linked-to page, not overly keyword heavy and not generic (e.g. “click here”).
<a href="http://five.agency/">Mobile design & development agency</a>
Use rel="nofollow" for paid links and untrusted content (user-submitted links). Search engines will discount any link value that would be passed by a normal link.
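For example (the URL is illustrative):

```html
<a href="http://example.com/partner-offer" rel="nofollow">Sponsored partner</a>
```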
A well-crafted URL is human-readable, semantically accurate, and gives both humans and search engines an indication of the content on the destination page. Instead of using IDs in URLs, use descriptive keywords which reflect the category hierarchy.
Ideally, a website should have a pyramid structure — all pages should be accessible with 3–4 clicks from the homepage. If applicable, link back from articles to their category and subcategory. This will help to establish site architecture and spread the link juice.
One of the common problems with URLs is duplicate content. Duplicate content is content that appears at more than one URL. It can present multiple issues for search engines and dilute the visibility of each of the duplicates (including the original!). To fix duplicate content issues, two methods are used most commonly: 301 redirects and the rel="canonical" link element.
A 301 redirect is a permanent redirect which passes the link juice to the target page.
The rel="canonical" link element is placed in the HTML head and contains the URL of the original page. It should be added to all duplicate pages, as well as to the original page itself.
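For example, every duplicate (and the original) would carry something like this (the URL is illustrative):

```html
<link rel="canonical" href="https://example.com/shoes/running/nike-lunarglide-8/">
```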
The HTML link elements rel="next" and rel="prev" should be used to indicate the relationship between URLs in a paginated series. This will hint search engines to send users to the most relevant page, typically the first page of the series.
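On page 2 of a paginated series, that would look like this (the URLs are illustrative):

```html
<link rel="prev" href="https://example.com/articles?page=1">
<link rel="next" href="https://example.com/articles?page=3">
```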
Whenever possible, place content on the same subdomain to preserve authority — don’t use subdomains to separate website content into sections.
www vs non-www
From an SEO perspective there is no difference between the two, but it is important to choose one and stick to it. Additionally, 301 redirects should be used to redirect from the non-preferred domain to the preferred one.
non-www has two big technical disadvantages:
- it cannot be used reliably with cloud providers like Heroku, which commonly use DNS CNAME records so that they can dynamically change the IP address of the web server. A bare (non-www) domain cannot have a CNAME record.
- cookies set on the non-www domain are sent to all subdomains. If the site uses a subdomain to serve static content, this will slow down access to static content and possibly cause problems with caching.
www should be preferred for large websites, websites with many subdomains, and websites hosted in the cloud.
robots.txt is a text file which instructs search engine robots how to crawl pages on the website. While Google doesn’t crawl or index content blocked by robots.txt, Google might still find disallowed content elsewhere on the web and index it. To prevent indexing of a page, use the meta robots tag instead.
Indicate the location of any sitemaps associated with the domain at the bottom of the robots.txt file.
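A minimal robots.txt might look like this (the path and URL are illustrative):

```
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```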
Google has sophisticated algorithms to determine the optimal crawl speed for a site, and recommends against limiting the crawl rate unless the crawler is causing server load problems.
meta robots tag
Use the meta robots tag with the noindex parameter to instruct crawlers not to show the page in search results. Use it for pages like internal search results, login and other authentication-related pages.
<meta name="robots" content="noindex">
Use the meta robots tag with the nofollow parameter to instruct crawlers not to follow the links on the page.
<meta name="robots" content="nofollow">
The noindex and nofollow directives can be combined:
<meta name="robots" content="noindex, nofollow">
An alternative to the meta robots tag is the X-Robots-Tag HTTP response header.
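As a response header, the same directives would look like this:

```
X-Robots-Tag: noindex, nofollow
```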
Although crawlers can usually discover most of a well interlinked site, a sitemap can improve the crawling of the site. Each sitemap entry can carry optional attributes:
- priority attribute to signify the importance of the page relative to other URLs on the site
- changefreq attribute to hint how frequently the page is likely to change
- lastmod attribute to signal the date of the last modification of the page
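Putting those attributes together, a sitemap entry might look like this (the URL and values are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/articles/seo-guide/</loc>
    <lastmod>2017-09-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```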
Social meta tags
Social meta tags are not a direct ranking signal, but correctly set social tags help content spread, which often leads to increased links and mentions. Open Graph has become the de facto standard for social meta tags — Facebook, Twitter, LinkedIn, Google+ and a lot of other major platforms recognize it. Twitter has its own Cards meta tags, but if they are not present on the page, Twitter falls back to using Open Graph tags.
If you are using social sharing widgets, add image:width and image:height. These two tags will enable Facebook to correctly load the image in its sharing pop-up the first time a page is shared.
Optional additional properties:
- locale — the locale of the resource
- fb:app_id — used by Facebook Domain Insights (traffic analytics)
- image:type — MIME type of the image: image/jpeg, image/gif, image/png
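A typical set of Open Graph tags might look like this (all values are illustrative):

```html
<meta property="og:title" content="Apple MacBook Pro 13-Inch (2017) Review">
<meta property="og:type" content="article">
<meta property="og:url" content="https://example.com/macbook-pro-13-review/">
<meta property="og:image" content="https://example.com/images/macbook-pro-13.jpg">
<meta property="og:image:width" content="1200">
<meta property="og:image:height" content="630">
```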
If you want Twitter to display a full-width prominent image alongside a tweet, use the twitter:card tag with the summary_large_image value. The minimum image dimension for such a Card is 300x157px, and the maximum is 4096x4096px.
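For example (the image URL is illustrative):

```html
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:image" content="https://example.com/images/macbook-pro-13.jpg">
```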
Structured Data (Schema.org)
Include structured data on the page to provide explicit clues about the meaning of a page to Google and other search engines. Google also uses structured data to enable special search result features and enhancements, like images, breadcrumbs, carousels, social profiles links, etc.
Schema.org has become the de facto standard syntax for structured data. Structured data can be added to a page in three formats: JSON-LD, Microdata and RDFa. Google recommends JSON-LD, which doesn’t interleave structured data markup with the HTML, but keeps it nicely separated.
At minimum, add the site’s name, logo, contacts and social links. See the resources below for more info.
If your website’s content can be described by any of the supported schema.org content types, add those as well. Examples include articles, books, events, music, products, recipes, movies, videos, etc.
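A minimal JSON-LD block for an organization might look like this (all names and URLs are illustrative):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Agency",
  "url": "https://example.com/",
  "logo": "https://example.com/logo.png",
  "sameAs": [
    "https://www.facebook.com/exampleagency",
    "https://twitter.com/exampleagency"
  ]
}
</script>
```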
Structured data doesn’t necessarily improve rankings, but it improves how pages are represented in SERPs, which should improve click-through rate.
HTML5 Semantic elements
HTML5 semantic elements are probably not an SEO signal, but they help crawlers better understand the content on the website.
Use header, main and footer tags to divide body content into three sections.
Use header element inside the article to encapsulate h1, main image, category, etc. Use footer element inside the article to encapsulate author and tags.
Use aside element for social sharing icons and related articles.
Use section or aside element for related articles below the main article.
Use time element to represent article publication date.
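The recommendations above can be sketched as follows (the content is illustrative):

```html
<body>
  <header><!-- site logo and navigation --></header>
  <main>
    <article>
      <header>
        <h1>Article title</h1>
        <time datetime="2017-09-01">September 1, 2017</time>
      </header>
      <p>Article content goes here.</p>
      <footer><!-- author and tags --></footer>
    </article>
    <aside><!-- social sharing icons, related articles --></aside>
  </main>
  <footer><!-- site footer --></footer>
</body>
```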
For a full example of a typical article page, see the source code of Moz, Kissmetrics and HTML5 Doctor.
HTTPS is already a ranking signal, and will only become stronger in the future.
Since January 2017, Google has started displaying “Not secure” warnings for HTTP pages that collect passwords or credit cards. Long term, Google will mark all HTTP sites as non-secure. This warning will lead to increased bounce rates and shorter user engagement, which are negative SEO signals.
Moreover, HTTPS (over HTTP/2) is faster than HTTP, and faster pages rank and convert better.
Since the rise of Let’s Encrypt, you can set up HTTPS for free, and what’s even better, you can fully automate certificate renewals.
HTTP Status Codes
HTTP responses can signal to search engines that a page has moved permanently (301 Moved Permanently) or temporarily (302 Found) to a new URL. Until recently (mid-2016), a 301 redirect resulted in around a 15% loss of PageRank, and a 302 redirect didn’t pass any PageRank to the new URL. Since then, Google has stopped caring which redirection method you use, and they all pass full PageRank. If you care about other search engines, the 301 redirect should still be the preferred method for permanent redirects.
Generally, 404 Not Found should be used when a web server cannot find the requested URL. However, there are a couple of exceptions to that rule, but only if there is an alternative, highly relevant page available for a 301 redirect:
- valuable links from external sources. You should first try to get the external sources to change the link, but that’s not always possible.
- mistyped URLs
- URLs that receive a large number of users
A 404 page should clearly notify users that the page they were trying to reach doesn’t exist. It should also provide a link to the homepage, a clearly visible search box and an easy-to-use navigation system so that users can potentially find what they were originally looking for.
Redirecting 404 pages to the homepage is a bad practice because most users will not realize that the web page they got is not the one they were trying to reach.
Mobile devices typically have much slower connections and higher latency than desktops, so page load speed is crucial. On top of all other page speed optimizations, make sure you are serving images resized for smaller mobile resolutions.
Don’t use popups — they can be difficult and frustrating to close, which might lead to a higher bounce rate. Instead, use simple banners inline with the page’s content. Design for fat fingers — make sure there is enough space between links and buttons so that they are not tapped accidentally.
Search results with rich snippets are even more likely to stand out on mobile than on desktop. Also, there are rich snippets enhancements like Carousels that are available only on mobile.
Google recommends using responsive web design and serving both desktop and mobile on the same URL, instead of using separate mobile URLs.
Most mobile browsers do not render Flash. Use HTML5 instead.
Use the meta viewport tag to instruct the browser how to adjust the dimensions and scaling of the page to the width of the device:
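The commonly recommended form is:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```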
If the website has a local element, optimize mobile content for local search — see Optimize for local search section for more details.
Optimize for local search
Local results appear for people who search for businesses and places near their location. To improve local ranking on Google and enhance presence in Search and Maps, provide complete and detailed business information in Google My Business.
Add accurate and appealing pictures to your listings to show people your goods and services.
Add the name, address and phone number to the website, and also to other websites like Yelp, Foursquare and social media sites. Make sure the information is consistent — watch for typos. Add the city and state to the title tag, H1 or other headings, URL, content, image alt tags and meta description. Embed a Google Map that points to your Google My Business listing.
High quality, positive reviews improve website visibility and increase the likelihood that a potential customer will visit the location. Encourage customers to leave a review and respond to reviews.
You can help Google determine the website language correctly by using a single language for content and navigation on each page, and by avoiding side-by-side translations. Keep the content for each language on separate URLs. Most often, you’ll want to use subdirectories or subdomains to separate multilingual content. A subdirectory structure is easier to set up and maintain, but is less flexible — it uses a single server (or single cloud deployment) and makes it harder to separate sites. If you want to target countries instead of languages, use country-specific top-level domains (.co.uk, .de, .es…).
Cross-link each language version of a page so that users can switch the language with a click or two.
Add the lang attribute to the html tag to declare the language of a web page. Google ignores this attribute, but Bing uses it, it’s recommended by the W3C, and it helps some screen readers handle pronunciation properly.
Avoid automatic redirection based on the user’s perceived language. Instead, use hreflang attribute to specify the relationship between translated pages and Google will use that information to serve the correct language URL in search results. Each language page should identify all different language versions, including itself.
The value of the hreflang attribute identifies the language and optionally the region (e.g. en, en-gb, en-au). For language/country selector pages, or homepages shown for all languages, use the x-default value.
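For a page available in English and German, each version would include links like these (the URLs are illustrative):

```html
<link rel="alternate" hreflang="en" href="https://example.com/en/page/">
<link rel="alternate" hreflang="de" href="https://example.com/de/page/">
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```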
Hash fragments that represent unique page states should begin with an exclamation mark (#!).
You can add hash fragment URLs to your rel="canonical" link and sitemap.xml if that is the preferred URL you want to appear in search results.
Page Load Speed
Page load speed is one of the confirmed signals that Google uses to rank pages.
According to some research, it seems that Google uses Time to First Byte (TTFB) as its measure of page load time. Page load speed is also important from the user experience perspective. Pages that load faster have lower bounce rates and higher average time on page. Therefore, both time to first byte and total page load time should be optimized.
Page speed optimization deserves its own article (I’ll write one soon, so stay tuned! :-)), but in the meantime check out the resources below.
This Is the End
Yay! You’ve made it to the end. Congratulations!
This guide is also available as Web Developer’s SEO Checklist — a handy interactive checklist you can use while you implement all these actions on your latest project.
As I have mentioned in the introduction, SEO is changing extremely fast. If you think that anything is missing from this guide, or that some of the recommendations in this guide are outdated, I would love to hear it. Just drop me a comment below.
Thank you for reading. If you like the article, please click the 👏 below so others can find it too — it would really mean a lot to me. :)