2 parts to SEO
1. Search Engine
Resources from Google:
SEO Starter Guide
Webmaster Tools (WMT)
Also, Matt Cutts
2. Optimization
what is there to optimize?
on page and off page
on page = optimize the html
off page = everything else (links from outside, social sharing, etc)
the most important thing
have to have good content in order for google to rank the site
GREAT content is better, of course
but it's difficult for search engines to measure
how do SE's know if the content is good or great?
various measures / metrics --
word count, multimedia (if it relates to the content), bounce rate (how many users bounce back to Google after hitting your site), time on page, etc.
it's about sharing
links from outside, social networks, those are the best things
"Real Company Shit"
great talk about companies writing things that get shared almost automatically because the content is THAT good.
on-page optimizations (code stuff)
the URL itself
SEO URLs aka "speaking URLs"
page content should be apparent from the URL
make categories browse-able
good for readers
good for bots
query strings = user input (searches, filters), not content identifiers
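For example (hypothetical URLs), a speaking URL makes the category and content apparent, while a query-string URL does not:

```
https://example.com/index.php?page=4823          <- opaque: says nothing about the content
https://example.com/guitars/fender-stratocaster  <- speaking URL: category and content are apparent
```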
URL not available?
site move, old pages, deletions
404s are ok if deleted / retired content
500s should NEVER happen
for everything else, use redirects
301 = permanent redirect. USE IT.
tells SE it's moved and SE should use the new page instead next time.
302 = temporarily redirected. AVOID THIS.
Google Webmaster Tools also reports 404s and 500s: it will tell you which pages on your site return them.
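A minimal sketch of a permanent redirect in an Apache .htaccess file (assumed paths; nginx and other servers have equivalents):

```
# old page moved permanently -> send users and bots to the new URL
Redirect 301 /old-page.html https://www.example.com/new-page.html
```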
meta tags: title and description are most common
<title> becomes title in Google results (or other SE's)
description becomes the description in SE results
if a user searches for something ELSE that's not in your "description" but IS somewhere else on the page, Google will swap the "description" in the results for a snippet that matches the searched term.
"description" serves as a fallback for Google Plus
meta keywords: not really used by Google any more
not ranked, nor used as an indicator of the content
bots are smart enough to parse content and see for themselves which words are important
Googlebot views many pages as "news" that aren't actually "news"
so use a googlebot-news meta tag with "noindex" to tell it that a page doesn't actually have "news"
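The tags above might look like this in a page's <head> (hypothetical values):

```html
<head>
  <title>Fender Stratocaster review</title>
  <!-- shown as the snippet in results (unless Google swaps in a better match) -->
  <meta name="description" content="A hands-on review of the Fender Stratocaster." />
  <!-- largely ignored by Google these days -->
  <meta name="keywords" content="guitar, fender, stratocaster" />
  <!-- tell Google News this page is not news -->
  <meta name="googlebot-news" content="noindex" />
</head>
```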
rel="canonical": designates the "main" URL for the current page
why? duplicate content = bad
if 1 piece of content is available via many URLs, both users and SE's get confused
sometimes it's unavoidable
for basic things like www/non-www sitename.com, just use 301 redirects
rel="canonical" to point to view all page
paginated content with rel="prev" and rel="next"
they designate previous and next pages from where you're currently at.
<link rel="next" href="page3.html" />
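Putting the canonical and pagination links together, say on page 2 of a paginated category (hypothetical filenames):

```html
<!-- page2.html: point SEs at the view-all version and at the neighbouring pages -->
<link rel="canonical" href="view-all.html" />
<link rel="prev" href="page1.html" />
<link rel="next" href="page3.html" />
```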
Open Graph: enables any web page to become a rich object in a social graph
developed and mainly used by Facebook
there is an Open Graph debugger
- paste your URL in, it will tell you how your page will look when integrated with Facebook
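A minimal Open Graph sketch (hypothetical values); the og:* properties go in the page's <head>:

```html
<meta property="og:title" content="Fender Stratocaster review" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://example.com/guitars/fender-stratocaster" />
<meta property="og:image" content="https://example.com/img/strat.jpg" />
```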
rel="author" / rel="publisher": 2 types, Publisher and Author
Publisher - the creator of the site itself
Author - author of the article/content on the site
(1 site/publisher can have many different authors)
Trusted authors and publishers are "preferred" in Google results
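At the time, these were expressed as rel links pointing at Google+ profiles (hypothetical profile URLs):

```html
<!-- site-wide: who publishes the site -->
<link rel="publisher" href="https://plus.google.com/+ExamplePublisher" />
<!-- per article: who wrote this content -->
<link rel="author" href="https://plus.google.com/+JaneDoe" />
```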
schema.org: 1 of several formats to designate page content
used by several SE's -- Bing and Google
"harmless" markup -- can put it in existing code and it doesn't alter the design
- lots of documentation at schema.org
marked up content w/ schema
review - add the "star review rating" in your SE summary results
searchAction - users can (for instance) search your content right from the Google results page
events - can specify, say, calendar dates in your SE results, etc.
Google's Rich Snippet Tool
-- paste in the content
gives a "report" on what it found for the site and the schema being used, etc.
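A sketch of the review case with schema.org microdata (hypothetical product, author, and rating); the extra attributes don't change the rendered design:

```html
<div itemscope itemtype="https://schema.org/Review">
  <span itemprop="name">Fender Stratocaster review</span>
  by <span itemprop="author">Jane Doe</span>
  <div itemprop="reviewRating" itemscope itemtype="https://schema.org/Rating">
    <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span> stars
  </div>
</div>
```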
other things --
tells SE's if the page should be followed or indexed (or neither)
if your site is available in multiple languages, use rel="alternate" hreflang link elements to tell SEs that multiple language versions of the page exist
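Both can be sketched like this (hypothetical URLs):

```html
<!-- keep this page out of the index and don't follow its links -->
<meta name="robots" content="noindex, nofollow" />
<!-- alternate language versions of the same page -->
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/" />
```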
linking to your app
obvious way -- link to your app in the App Store
smartphonebanners.com -- will show a banner at the top of your site saying "a mobile app is available, want to get that instead?"
Android: app indexing and links to app in Google SERPs
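On iOS, Apple's Smart App Banner does something similar natively; a single meta tag is enough (hypothetical App Store id):

```html
<meta name="apple-itunes-app" content="app-id=123456789" />
```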
Loading times play a role in SE
-- keep it low; it's an advantage for the user. Also good for bots b/c they can crawl your site faster: bots have a limited amount of time available for your site (crawl budget).
Use compression and caching
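A minimal nginx sketch of both, assuming static assets and placed inside a server block (paths and durations are illustrative):

```
# compress text responses (text/html is compressed by default)
gzip on;
gzip_types text/css application/javascript application/json;

# let browsers cache static assets for a month
location ~* \.(css|js|png|jpg|gif)$ {
    expires 30d;
}
```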
robots.txt: tends to be overlooked
controls bot crawling for the whole site
can disallow private/irrelevant pages or directories
can specify XML site maps
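A small robots.txt sketch (hypothetical paths):

```
# block crawlers from private / irrelevant areas
User-agent: *
Disallow: /admin/
Disallow: /search/

# point bots at the sitemap
Sitemap: https://example.com/sitemap.xml
```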
XML sitemaps: especially useful for bigger sites with a lot of content
-"guide" to your site
for all types of content
tells SEs where all your content lives
-- documentation about these at sitemaps.org
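A minimal XML sitemap (hypothetical URL and date):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/guitars/fender-stratocaster</loc>
    <lastmod>2014-05-01</lastmod>
  </url>
</urlset>
```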
but in the end, keep in mind...
search engines decide what to show or use
- ranking in the Search Engine Result pages (SERP)
calculated by complex algos
google updates are named after animals
most important are panda and penguin (seriously)
panda - promote site w/ good content
penguin - fight web spam
SEO is part of UX
SEO should be thought about from the beginning of the site
adding SEO on existing pages may be difficult but it's worth it.