View Full Version : SEO Checklist


Trivia-Wolf
31st July 2004, 09:09 AM
I've been doing web pages since the mid-1990s. I work alone and read and research as much as possible about SEO and related things, but I find that after 14 years in business, one develops tunnel vision. :(

What I have never seen is a check-list that can be used to make sure that when you create a page or site, you have put in at least most of the basic elements for the engines and for people.

I thought that if some are interested, we might be able to come up with a check-list that could help all of us. If this is something that is of no interest to anyone, or is not worth doing, I won't feel bad if the thread is removed.

I'll start with what I use:

Page Title: Use a descriptive, informative title

Meta Tags:
<meta name="Description" content=" ">
<meta name="Keywords" content=" ">
<meta name="revisit-after" content="15 days">
<meta name="robots" content="index,follow">
or <meta name="robots" content="noindex, nofollow">
<meta name="distribution" content="global">
<meta name="copyright" content="Copyright © 200?, WEBSITE NAME, CITY, All rights reserved worldwide">
<meta name="rating" content="GENERAL">
<meta name="classification" content="Business and Economy">
<meta name="author" content="INFO GOES HERE">
<meta http-equiv="reply-to" content="[email protected]">

Alternate Text Tags on Images: A descriptive tag - not just xyz.jpg (4K)

index page - text rich with keywords. Sometimes I use a meta tag to redirect to a home page after 0 seconds. The index page is for the engines and not for people.

Frame pages: Most of the time I don't use frames, since if a search engine indexes a single frame and you don't have links built in, the viewer has no way to get back to the rest of your site.

Page Size and Loading Time: Short to medium sized pages that load relatively quickly with a dial-up connection.

Optimize all graphics and use appropriate image size.

Size the graphics before they go on the page; never use the web-building program to size the image. (I use FrontPage, but do not use FP extensions.)

-------------//-------------

I think that's most of what I use. I would really like to see this list improved on. If my thinking is wrong, I'd like to know that too.

PS. To the person/people who put this site together... thanks, you are doing and have done a GREAT job and fill a valuable need. :)

David Wallace
31st July 2004, 10:15 AM
The only meta tag that really matters is the meta description tag. Even that is not used to form descriptions in the SERPs among the four leading crawlers unless you specifically search for something that is in the meta description tag and not in the HTML text itself.

Meta keywords tags are pretty much useless. I still incorporate them but do not go to the effort of trying to make them unique for each page or anything like that.

The <meta name="robots" content=" "> is only useful if you DO NOT want a spider to index a page. Telling them to index or when to come back is futile. If you don't want them indexing a page, then the <meta name="robots" content="noindex, nofollow"> tag would be used.

The rest won't hurt but also will not provide any benefit as far as search engine optimization and positioning goes.

The main thing when optimizing sites is to make each title tag unique and relevant to the page, have well-written copy that represents your keywords in natural language, and optimize the anchor text in links where possible. As for alt attributes, optimize these where images link (alt text on non-linked images doesn't matter much), but don't stuff them with keywords (i.e. keyword, keyword, keyword, keyword, etc.).
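To make that concrete, a page head and a linked image might look something like this (the business, keywords and file names here are all made up, purely for illustration):

<title>Hand-Made Oak Rocking Chairs - Smith Furniture, Cardiff</title>
<meta name="description" content="Hand-made oak rocking chairs built to order in Cardiff, with delivery throughout the UK.">
...
<a href="rocking-chairs.htm"><img src="oak-rocking-chair.jpg" alt="Hand-made oak rocking chair" width="200" height="150"></a>

Every page gets its own title and description, and the words in the link and the alt text describe what is actually there, without repeating keywords over and over.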

Trivia-Wolf
31st July 2004, 11:05 AM
The main thing when optimizing sites is ... optimize the anchor text in links where possible but don't stuff them with keywords

Thank you David,
:confused: Please define "anchor text".

David Wallace
31st July 2004, 12:15 PM
Anchor text is the text within a link.

Example: <a href="link.html">Anchor text here</a>.
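And by optimizing it, I mean making those words descriptive rather than generic. A made-up example (the page name is just for illustration):

<a href="oak-chairs.html">hand-made oak rocking chairs</a> tells the engines far more about the target page than <a href="oak-chairs.html">click here</a>.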

Robert
31st July 2004, 02:32 PM
I'm a big fan of the meta description tag, even if its impact is limited regarding search engine optimization.

When I want to add a site to a directory, the first place I'll look for a good description of the site is the meta description tag. The same holds true if I want to bring a site to the attention of newsletter readers.

It's just so much easier for me to grab a well crafted meta description than compose one myself. Plus, it gives the site owner greater control in how I describe their site.

Robert
31st July 2004, 03:05 PM
PS. To the person/people who put this site together... thanks, you are doing and have done a GREAT job and fill a valuable need. :)

The thanks goes to you and all the members. Members asking questions and helping each other whenever they can is what makes this place great. :)


index page - text rich with keywords. Sometimes I use a meta tag to redirect to a home page after 0 seconds. The index page is for the engines and not for people.

This threw up a huge red flag for me. Every page should be designed for search engines and people. I'll let the pros jump in on this, but what you've described sounds like a "doorway" page or a form of low-tech cloaking (showing search engines one thing, and people another). If you take a look at Google's Information for Webmasters, specifically this page...

http://www.google.com/webmasters/2.html#B3

... you will find the following statement from Google:

"...certain actions such as cloaking, writing text that can be seen by search engines but not by users, or setting up pages/links with the sole purpose of fooling search engines may result in permanent removal from our index."

Another page worth visiting is Google's Webmaster Guidelines page:

http://www.google.com/webmasters/guidelines.html

But, I do like your suggestions for optimizing graphics, page size, etc. because that will help create a better experience for visitors to your site. :thumbsup:

Old Welsh Guy
31st July 2004, 05:21 PM
All of the above, but note that Yahoo! suggests using keyword & description tags.

The redirect is a serious tightrope walk. You really should not cloak a page like that, unless you are happy to risk being banned.

Firstly, you will not get into any of the serious directories if you are using that technique: DMOZ, JoeAnt, Skaffe etc. will all bomb your application out. I know that I certainly would reject it if there was a redirect there. If you are already in a human-edited directory and someone reports the redirect, then you are going to be removed and banned.

One major-league omission was the creation of an optimised navigation system and a site map page. These two things can make a mountain of difference to the performance of your site, especially in the case of Google.
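By a site map page I just mean a plain page of text links to every main page of the site, something along these lines (the page names are invented for the example):

<h1>Site Map</h1>
<ul>
<li><a href="index.htm">Home</a></li>
<li><a href="products.htm">Products</a></li>
<li><a href="about.htm">About us</a></li>
<li><a href="contact.htm">Contact</a></li>
</ul>

The spiders can follow every one of those links, so no page on the site gets stranded.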

Vinnie
2nd August 2004, 04:37 AM
Redirects are fine as long as they are not a con. Recently we rebuilt a client’s website whose original URLs were in some mess and disarray. For example, the client had:

testhotels.co.uk/accommodation$in%%mars$/honeymoom&suite.htm

The page had a 5/10 rank, but to us this was an unacceptable URL string and not something a design house should release.

We changed the URL (this URL and the above is just an example for the sake of this discussion) to:

testhotels.co.uk/accommodation/honey_suite.htm

The entire website, except for a few pages, was made up of similar strings with foreign symbols and letters.
We changed the whole site to workable English URLs that were consistent with the design.

We put the website live, and on each of the old long URLs we put a redirect to its new equivalent page.

The first thing that happened was the site took high positions in 145 categories overnight. It then bottomed out after a week to take its natural place, and gaining it high positions will be an ongoing effort.

We noticed the old URLs stayed at the top for a while, but after four weeks Google dropped them in favour of some of the new ones, so it's working well in some places.
What I was dismayed to find was that the new pages, even though they are fluctuating at the moment between Google's first and second pages, all came in at a 0/10 rank and still remain this way. The redirects did not pass on the old rank. This is a load of BS because Google states that pages that are the result of a redirect will not lose rank.

If the new pages took the old rank then they would all be flying high at the moment. As it stands they are up and down and all over the place and causing me a headache.

Old Welsh Guy
2nd August 2004, 07:47 AM
What I was dismayed to find was that the new pages, even though they are fluctuating at the moment between Google's first and second pages, all came in at a 0/10 rank and still remain this way. The redirects did not pass on the old rank. This is a load of BS because Google states that pages that are the result of a redirect will not lose rank.

If the new pages took the old rank then they would all be flying high at the moment. As it stands they are up and down and all over the place and causing me a headache.

Hi Vinnie. Maybe I am on the wrong track here, but I have assumed the 0-second redirect means a JavaScript refresh sort of thing. In that case the page that is redirecting is still intact, and so keeps its PR.

When Google talks about PR passing on, they are talking about using a 301 redirect, which is how I assume you have done it? The PR normally follows on a few weeks later, right after an update. I did one recently, and the PR jumped ship after the backlink/PR update. It is bloody annoying having to wait, though, lol.

Vinnie
2nd August 2004, 05:15 PM
Hi OWG

Okay, is this really so? Are you saying I should wait and the old PageRank will catch up?
I really hope so; it's been about four weeks now.

Old Welsh Guy
2nd August 2004, 08:35 PM
Truly, Vinnie, Google normally updates any 301'd PageRank at the next proper update. This has been my experience anyhow. Did you do a complete domain 301, or did you do a page-by-page 301?

eg www.olddomain.co.uk/dogs.htm = www.newdomain.co.uk/dogs.htm

If you have done it like that then it should be great for the users and the SEs, plus you will keep the deep-linked PR.
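If the host runs Apache and allows .htaccess files, a page-by-page 301 can be done with something like this sitting on the old domain (the domains and file names below are just placeholders for the example):

# .htaccess on www.olddomain.co.uk - one Redirect line per moved page
Redirect permanent /dogs.htm http://www.newdomain.co.uk/dogs.htm
Redirect permanent /cats.htm http://www.newdomain.co.uk/cats.htm

Each old URL then answers with a proper 301 status and sends both visitors and spiders straight to its new equivalent.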

OWG

P.S. please let me know how it rolls out in the next update ;-)

Vinnie
3rd August 2004, 03:37 AM
Hi OWG

It has also been my experience that Google picks up old rank and transfers it; it did so when we ourselves rebranded.
After you posted yesterday I went back to double-check the 301 redirects, as something was nagging me awfully. Then it came back to me: we couldn't do a 301 redirect on the Fasthosts servers where the new website had been moved. So we went with a "page has moved" refresh page instead:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
"http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<title>This page has moved</title>
<meta name="robots" content="noindex, follow">
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<meta http-equiv="refresh" content="3;URL=langtry_lodge.htm">
</head>

<body>
<p><strong>The page you are looking for has moved</strong></p>
<p>Your browser will automatically take you to the new page in 3 seconds but to go there immediately please click on the link below:</p>
<p><a href="langtry_lodge.htm">langtry lodge.htm</a></p>
</body>
</html>

We did this with each individual page and pointed it at its equivalent page. Running a check through Google last night, I noticed that most of the new pages have now overtaken the old ones and are gaining strength, but because the new ones have 0/10 PR they are struggling against other pages with higher rank. Some page URLs we did not change because they had sensible addressing; those still retain their rank and hold a high position, if not higher, now.

I have had to get people to link to the new pages individually, especially the more important ones, to give them a bit of a boost.

Overall the PR is still not as high as it should be. I have now set up some very good reciprocal links that should boost it in the coming weeks.

Trivia-Wolf
3rd August 2004, 09:26 PM
When I first wrote the Thread, I really didn't think anyone would comment. Now I know why I don't gamble... even living in a gambling town.

Anyway... there was a comment about SEs frowning on redirects. I use the refresh meta tag, so while it's set to a time of 0, the page is still visible to those who look quickly. Also, in case the tag doesn't work, there is a physical link to the home page.
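For anyone who hasn't seen it, the tag I'm talking about looks roughly like this (the file name is just an example):

<meta http-equiv="refresh" content="0;URL=home.htm">

and the body of the page still carries a normal <a href="home.htm">link to the home page</a> for anyone whose browser ignores the refresh.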

Here's a couple more things to throw out to the group.

1. I have two web sites of my own that advertise and sell a book, photos and a CD about the 1939 NY World's Fair. If I type "1939 New York World's Fair" into Google, as of June 22 PMPhoto.to ranks 8 and 9 out of 69,000 listings. If I remove the ' from the word World's, it ranks 14 and 15 out of 56,700 sites. The other site, cyberheadwebs.com, doesn't even show up.

Now on Yahoo, with the ', PMPhoto gets 12 and 14 and Cyberheadwebs gets 26 out of 65,000. Take out the ' and PMPhoto falls to 36 of 127,000 and CHW doesn't show.

The same kind of variation shows up on AllTheWeb, MSN, AOL, Ask Jeeves, HotBot, Lycos, Teoma, Netscape and AltaVista... they are all different.

Now I consider the ranking pretty good, but what I find interesting is that the PageRank in the Google toolbar is only 4 out of 10. Hmmmmm, makes me wonder where I'd place if the bar showed an 8 or 9.


2. Someone in one of the posts mentioned optimizing for each page. Boy do I agree. What I tell my clients is that the home page isn't everything.

Cyberheadwebs is a low-traffic site. I recently put counters on all the pages, then made one page that displays the counter from each page. We all know that counters really don't mean much; however, when you look at a group of counters all at one time, you can easily see where people are going.

The home page only has 251 hits, but the World's Fair page has 2,011 and my antiques page has 1,901. (Now now, quit laughing at the numbers.) This tells me that people are finding the individual page they search for as opposed to finding the site in general.

Food for thought: if all pages use the same meta tags, have no really good title or description, and have no real individuality, wouldn't you expect the site to receive a low ranking? After all, if everything looks almost the same to an SE, then how do they choose which page to list?

I really try to make each page a "site" unto itself. So far I'm having pretty good luck.

Sorry for being so loooong winded.

Regards,
Paul

StupidScript
4th August 2004, 08:52 PM
Paul, there's no shame in being long-winded...unless you're being derogatory. Just ask ME! :)

Optimizing individual pages is where it's at. As noted, spider-positioning is valuable when the individual page is relevant to the search, and after all, you want people to enter your building by any entrance they deem valuable.

I always try to make the keywords and the description META tags worth something, too. While many of us believe we have a handle on how the spiders crawl, none of us can claim that we truly understand their algorithms. Unless a spider* is dim-witted or uber-cynical, the keywords in the META tag are used, not ignored.

True, some of the expensive guys are uber-cynical; however, the increased exposure on the smaller engines is worth the effort.

The description META tag is especially important due to its popular use in the search result display. Consider it an especially tricky place to put a keyword-rich sales message: It's your doormat, and the most likely factor in getting the click. But write it poorly, and you'll receive naught but spittle.
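As a made-up example of the kind of thing I mean (the product and wording are invented, not from any real site):

<meta name="description" content="Hand-made oak rocking chairs built to order, with free delivery across the UK. Browse the workshop gallery and order online.">

That reads like a doormat and gives the searcher a reason to step in; a bare list of keywords in the same spot does not.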

*Of course, "spiders" aren't witted or cynical, rather the way their indexes are used displays cynicism or idiocy...often.

Sue
5th August 2004, 11:14 PM
We've found that SE rankings also improve if you make sure the pages are HTMLv4 compliant.

I recently bought a licence for CSE HTML Validator (see http://www.htmlvalidator.com ). Although something like MS FrontPage does a lot of coding for you, it's not necessarily compliant with the standards :( . The validator reports errors in the code that will affect search engine ranking, and gives suggestions on how to improve other bits.

I'm sure there are free validators available. We picked this one because it allows us to check code in our development environment, before code is moved live.

Robert
6th August 2004, 12:11 PM
When I first wrote the Thread, I really didn't think anyone would comment. Now I know why I don't gamble... even living in a gambling town.

In this case your gamble payed off by producing a great thread. :) Keep your posts coming!

StupidScript
6th August 2004, 12:22 PM
We've found that SE rankings also improve if you make sure the pages are HTMLv4 compliant.

Spiders typically pay attention to the DTD (Document Type Definition) which should be at the top of every HTML page, and if the rest of the code is not compliant with the DTD, the spider "freaks", as it gets lost trying to work through non-compliant code.

A little like handing someone a map, telling them it's a map of New York, and then sending them on their way to explore West Virginia.

Lots more info from the horse's mouth: http://www.w3.org/TR/html4/
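For reference, about the smallest page that should validate against the 4.01 Strict DTD looks like this (the title and content are just placeholders):

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<title>Example page</title>
</head>
<body>
<p>Page content goes here.</p>
</body>
</html>

Everything the page then adds has to stay within what that DTD allows, or the validator (and, presumably, a strict spider) will complain.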

Sue: Is it your experience that HTML 4 makes the difference in rank, or is it that the code must comply with the DTD, regardless of the version (XML1.0, HTML3.2, etc.)? I haven't related compliance with ranking, before, and I'm curious...

StupidScript
6th August 2004, 02:33 PM
The <meta name="robots" content=" "> is only useful if you DO NOT want a spider to index a page. Telling them to index or when to come back is futile. If you don't want them indexing a page, then the <meta name="robots" content="noindex, nofollow"> tag would be used.

Another excellent use for the ROBOTS META tag is to keep them from caching your pages, particularly if you update the page content frequently:

<meta name="robots" content="all,index,follow,noarchive">

I sometimes use it just for Google, whose cache seems to be very persistent:

<meta name="googlebot" content="noarchive">

Placing this tag on your pages will keep the cached version from being an option in the search results of pretty much every engine, the next time they spider you. (The cache still exists, however it won't be used in the results and there will be no "Cached" link on the result.)

http://help.yahoo.com/help/us/ysearch/basics/basics-10.html

Sue
9th August 2004, 07:57 PM
Sue: Is it your experience that HTML 4 makes the difference in rank, or is it that the code must comply with the DTD, regardless of the version (XML1.0, HTML3.2, etc.)? I haven't related compliance with ranking, before, and I'm curious...

Sorry, HTML 4 is the only one we've tried...

Trivia-Wolf
11th August 2004, 01:08 PM
I recently bought a licence for CSE HTML Validator (see http://www.htmlvalidator.com ). Although something like MS FrontPage does a lot of coding for you, it's not necessarily compliant with the standards :( .

Sue is 150% correct about MS FrontPage not necessarily being compliant. I've had to "fix" things it does many, many times. :( (Oh, by the way, I don't use their extensions.)

On your recommendation, I purchased the HTML Validator and I think I'm going to love it. Way to go, Sue :thumbsup: Thanks :thumbsup: (you deserve two thumbs up!)

Now I have a problem and am hoping that someone out there can help.
The validator suggested I use "<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">"

I have a drop-down menu I created using Fireworks. If I include the DOCTYPE line, the drop-down menu boxes don't appear properly. They lose the top and left outline, and the text is forced to left-justify.

I've been to the W3C Recommendation and got a bit bewildered. Is there an additional tag to add so that the JavaScript works properly when using the DOCTYPE statement? I did not find any way of correcting the problem with the DOCTYPE tag left in, and I spent about three hours trying just about everything.

Your help will be GREATLY appreciated. (Thank goodness I only use this type of navigation on 2 of 24 sites or I think I'd go utterly mad!!!)

Warmest regards to all...
Paul

StupidScript
11th August 2004, 05:28 PM
the drop-down menu boxes don't appear properly

I moved my response to "Website Development" => "HTML 4.0 and Javascript" to keep this thread a little bit more clean.

http://www.smallbusinessbrief.com/forum/showthread.php?p=969#post969

:)

StupidScript
12th August 2004, 01:33 PM
Everything old is new again...

I went back and started reading this thread from the beginning, again, to remind myself of its purpose, and I got clonked by a thought...

What pretty much every SE says in their optimizing suggestions is the same stuff that HTML/SGML was designed to do in the first place: communicate accurate information to the humans who will be looking at it.

A quick example can be gleaned from the use of the ALT text attribute in the IMG tag.

Why is there an ALT attribute for an image? For one reason: To provide a textual description for the browser to use when the image does not display. An "alternate" to the image itself.

Who would use such a thing? For one: Vision-impaired people using browsers that "read" the pages to them. For another: People who do not have image capability (like my Mom on her ancient Mac SE) or who choose to surf with images off.
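As a made-up example, compare:

<img src="fair.jpg" alt="fair.jpg (4K)">

with:

<img src="fair.jpg" alt="The Trylon and Perisphere at the 1939 New York World's Fair" width="300" height="200">

The second one tells the text-only visitor (and the spider) exactly what they are missing; the first tells them nothing.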

Try turning off your image display in your browser and checking out your site. Turn off Javascript, too. Try to surf your site using a text-only browser, like Lynx for Linux users. (Alternately, http://www.delorie.com/web/lynxview.html offers an online Lynx substitute.)

Does the site compel you to move through it, or is it boring? Is the information laid out in a sensible order, or is it jumbled? Is there enough information for you to move intuitively through the site, or do you have to figure things out to get where you want to go? Or to get to where the site wants you to go?

Pretend to be a spider, which is pretending to be a person, and see what they will see. Chances are this will expose weaknesses in your layout or copywriting or use of the ALT attribute, and help to refine your awareness of what the spider/image-less are going through.

Make the site for people (of all capabilities), and spiders will "enjoy" it, too.

And THAT's just the ALT attribute... :)

Addendum: It is really easy to get caught up in making a "pretty" site with Flash and applets and drop-downs and pictures, especially for those who use WYSIWYG composition programs. "Pretty" is okay, but clean code, useful information and "natural" progression through the site are far more important. I can think of a dozen sites offhand that are 99% text that get lots of business compared to their competitors who went the multimedia-site route. Why? Good algo positioning and solid psychometrics. The more you understand the mechanics of publishing on the web, the better-able you are to build a site that people will want to read and respond to.

Robert
12th August 2004, 02:09 PM
Excellent post, James!

StupidScript
12th August 2004, 10:46 PM
Ah'm jest sayin'...
:o

bradmarcus
28th November 2010, 12:25 AM
I agree that the meta description tag is very important. Actually, Google uses it to create its description after the page title in the SERP. If none exists, it will pull the first paragraph of text, but I've seen huge jumps in organic search results by taking the time to craft a great description tag.

I'm a big fan of the meta description tag, even if its impact is limited regarding search engine optimization.

When I want to add a site to a directory, the first place I'll look for a good description of the site is the meta description tag. The same holds true if I want to bring a site to the attention of newsletter readers.

It's just so much easier for me to grab a well crafted meta description than compose one myself. Plus, it gives the site owner greater control in how I describe their site.