Sunday, December 2, 2007

Ha! To all you linking black hatters...

You know I have been talking about this for several years now. When I started as a full-time SEO technician several years back, the company I worked for (Visible Technologies) used linking as their main strategy for rankings. While they obtained the rankings quickly, the search engines slowly started catching up with all those websites that used sites such as LinkMarket and LinkWorth to increase their rankings. The funny thing is that now I get to tell them "I told you so".

Does this mean that Google doesn't approve of linking? Absolutely not. Google supports linking in every way; in fact, they encourage website owners to obtain external links to help their rankings. What they don't approve of is purchasing links in order to increase rankings.
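For sites that do sell advertising, Google's guidance is to flag those links so they pass no PageRank. A minimal sketch (the URLs here are made-up placeholders, not anyone's actual pages):

    <!-- A paid or advertising link, flagged so it passes no PageRank -->
    <a href="http://www.example-sponsor.com/" rel="nofollow">Our sponsor</a>

    <!-- A natural, editorially given link needs no special markup -->
    <a href="http://www.example.com/useful-article">An article I genuinely recommend</a>

That way the advertising can stay on the page without putting anyone's rankings at risk.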

The point of rankings within the search engine results pages (SERPs) is to help the user find what they are looking for efficiently. If a website has next to no relevant content and ranks only because of a load of external links, is that helping the user find what they need?

No

If the website has content relevant to what the user is looking for, then chances are the user isn't only going to find what they need; they are going to be so excited about the site that they will either blog about it or tell others about it by adding a link to it.

This was the whole basis of Google's algorithms from the beginning. The problem is that SEOs have been using black hat techniques to increase rankings quickly (link schemes, doorway pages, duplicate content, etc.), forcing Google and the other search engines to adjust their algorithms in order to return the most relevant results.

When optimizing a website, always be sure to provide your users with relevant content and landing pages that reflect what each user would be looking for. For example, someone looking to start dating online would want a website that offers advice for those wanting to date online. Thus, when a user types in "Online Dating Advice", they should find a website and a webpage that reflect online dating advice.

Google talks more about this, and Matt also adds a bit about how Google made some algorithmic changes recently that resulted in a lot of websites losing PageRank and search results - Purchasing Links is BAD -

Read it and memorize it well...

Always remember that if you have to adjust because your site lost rankings, then you aren't optimizing correctly.

Tuesday, November 27, 2007

Google snippets

Matt Cutts visited us here in the Seattle area with a stop at the Google Kirkland office. They decided to make a few videos of Matt while he was there and post them to the Google Webmaster Blog - Anatomy of a Search Result. Matt's video explains snippets (the title and description that show up in your search results). He used Starbucks as an example. The Starbucks site is a good one since it has limited content on the page but uses meta descriptions in its code.



Matt mentioned in the video that we have no control over the sitelinks presented underneath the snippet, which isn't entirely true. While we are unable to pay for extra sitelinks within the snippet, we are able to control them through the webmaster tools. Under "Links" and then "Sitelinks" in Webmaster Tools, if Google has generated sitelinks for your website, you can manage them from there.


Always remember how the snippet works:


  1. If the search term is within the meta description of the web page, then that description will appear in the snippet (with the term highlighted in bold).

  2. If the term isn't in the meta description, or there isn't a meta description at all, then the words surrounding the term within the body of the page will be displayed (with the term highlighted in bold).

  3. If the page does not have content and does not have a meta description, then the description is generally pulled from the Open Directory Project.

  4. If there isn't any content on the page, no description in the meta tags, and the site hasn't been submitted to the Open Directory Project, then the description will be blank.
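To make the first rule concrete, here is a minimal sketch of the markup involved (hypothetical content, not Starbucks' actual code):

    <head>
      <title>Store Locator | Example Coffee Co.</title>
      <!-- If the searcher's term appears in this description, this is
           what shows in the snippet, with the term bolded -->
      <meta name="description" content="Find an Example Coffee Co. store near you. Search by city, state, or ZIP code.">
    </head>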

A great example of the fourth scenario is a client I have been working with who came to me with a website created entirely in graphics. The website was a great design and had a lot of great information on it, but the text was embedded in images and placed on the site that way. The website wasn't even ranking for its own name. The only way to pull it up in the results was to do a site: search, which showed that the pages were getting indexed but the content wasn't getting recognized (because of the images).
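The fix, roughly, looks like this before-and-after sketch (made-up file names, not the client's actual markup):

    <!-- Before: the heading and copy are locked inside an image,
         so the crawler sees no text at all -->
    <img src="images/about-us-page.gif">

    <!-- After: real HTML text the engines can read, with the design
         preserved through CSS; alt text covers the images that remain -->
    <h1>About Our Company</h1>
    <p>The copy that used to live inside the graphic now sits here as plain, indexable text.</p>
    <img src="images/storefront.jpg" alt="Photo of the storefront">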


I am currently in the process of pulling the text out of the images and recoding the HTML to keep the design the same while making the text recognizable. I also added a site map for more efficient indexing, and landing pages for the terms the client wanted to rank for. There is an XML sitemap submitted to the webmaster tools, and a Google Analytics account set up for tracking conversions.
The client should start to see better results soon after we get the new site launched.
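For reference, a minimal XML sitemap of the kind submitted to the webmaster tools looks like this (example.com and the page names are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2007-11-27</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/landing-page.html</loc>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>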

Tuesday, November 20, 2007

Classmates is getting rankings

I was asked for the Classmates.com address the other day, and since I hadn't been there in a while, I went to Google and typed "classmates renton". I was surprised to see the registration form page for the "classmates test high school".
I hate to say this, but the fact that this page is ranking is wrong on so many levels.
From a usability standpoint: most people looking for the Classmates address (such as myself) will see that Google has it on the map, so finding this page in the results is very confusing. How many of these pages are ranking? I have come across the final form page on several occasions while searching for schools or other such affiliations (researching schools for the kids). The problem is that a user coming to this page for any reason other than to sign up for classmates.com is very confused as to what they should do. This page is meant to be the last in a five-page registration process in which a user starts from www.classmates.com and then flows through to the school they attended and the year they graduated.

I had mentioned this while I was the SEO Manager at classmates.com, but they chose to keep the pages as is and decided not to make the fix that would surface the actual landing pages we had designed and launched in December 2006. Those pages were designed to recognize who would be landing on them, give the user a clear understanding of why they were there, and make clear what we would like them to do once there (the target audience, value proposition, and call to action). Those pages went from a 10% conversion rate (and that's being generous) to a 50%, often 60%, conversion rate.
It's a shame that the registration form page is ranking higher than the intended landing page.

The second reason it's a bad thing to see this page listed is that it is the test high school the QA department uses to ensure there are no bugs in the registration process.
If it were me, I would remove it from the index through the webmaster tools, and then add it to the robots.txt to ensure that it wouldn't ever rank again.

Maybe even suggest that we not let it go to prod, and keep it on the QA servers just for testing.
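If I were making that fix, a belt-and-suspenders version would look something like this sketch (the directory name is hypothetical, not Classmates' actual path). First, block the crawlers in robots.txt:

    # robots.txt on www.classmates.com
    User-agent: *
    Disallow: /qa-test-registration/

Then, on the test pages themselves, in case they are ever reached anyway:

    <meta name="robots" content="noindex, nofollow">

With the robots.txt block in place, the URL removal request in the webmaster tools can be processed, and the page shouldn't come back.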

Friday, November 2, 2007

The Shiny Red Button

You have all, at some point, heard me talk about my ideal website. If not, I'll describe that perfect website once again. You see, we marketing folks want to keep things as simple for the user as possible: a clear call to action on the page, so the user knows exactly what to do within seconds of it loading.

So what is the clearest call to action imaginable? Why it's a shiny red button.

So my ideal website would be one with a simple red button and nothing else. Big - shiny - and RED.

We marketing folks joke from time to time about adding elements to this shiny red button, like a call to action in words such as "click here", or possibly using a bit of psychology and adding "don't" before the "click here".

But then what is the value proposition on a page with a shiny red button? Could it be the button itself?

What does the button do?
How do you market the button to drive visitors to it?


All of these questions and more will be answered in time.
In the meantime, take a moment to visit the Shiny Red Button - send it to a friend, and watch the button and all the marketing aspects change over time.


Wednesday, August 15, 2007

Official Google Webmaster Central Blog: New robots.txt feature and REP Meta Tags

Great for Promotions!

Now Google offers a feature to add a meta tag that tells Googlebot when your page is going to expire. While at Classmates.com we would launch promotions on a regular basis in which landing pages and SEO were passed over, because PPC would be too expensive to run for such a short time and we didn't want to risk the page still ranking after the promotion had finished.
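The tag in question is the new unavailable_after directive for Googlebot. A sketch of how a short-lived promotion page might carry it (the date here is made up):

    <!-- Tell Googlebot to drop this page from the results
         once the promotion has ended -->
    <meta name="googlebot" content="unavailable_after: 31-Dec-2007 23:59:59 PST">

This would have been perfect for those Classmates promotions: the landing page can rank for the life of the promotion and then quietly fall out of the index.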

SEO Similar Content case study

In April 2007 I attended SES New York and took part in a duplicate content discussion panel. The panel talked about the usual content issues such as shingles, duplicate content, boilerplate, and so on.
At Classmates I was working on a project with a one-page design that had dynamically fed content, resulting in over 40 million dynamically generated pages. The goal was to individually optimize these pages so that they would act as landing pages when a person searched for a specific word located in our database.
I spoke with the people on the panel and asked them how they would resolve the duplicate content issues on so many dynamically generated pages. They suggested setting up a system in which we could add content to the pages individually, taking on a few each week until they all had unique content. The problem with that was that we didn't have the manpower to do this for over 40 million pages. The pages were user generated (meaning the user filled out the information for each specific page), and exposing the user content was against the privacy policy and against our business model, ruling out the possibility of making the pages unique by letting the user-generated content be visible to the search engines.

So back to square one - when the experts can't help you, what do you do?

So I went back to work after the conference and presented the dilemma to my copywriter. We started talking about duplicate content issues and how we could keep the content from getting filtered out as boilerplate. Then I thought: "What breaks up the content into shingles? Is it the code, the punctuation, or all of the words counted without code or punctuation?"

So I decided to run a test case against the question...

We took one block of copy that didn't make sense and added a three-word phrase not common in the English language, optimizing the copy for that phrase. Then we applied it to three different basic HTML pages (no CSS, no skin, no JavaScript of any kind, etc.); the three variants are sketched just after this list:

  1. Page 1 - the first page was the copy with all punctuation removed (not a single quote, period, or comma), left as one solid paragraph surrounded by paragraph tags.
  2. Page 2 - the second page was the same paragraph, still without its original punctuation, broken up with heading, paragraph, and break tags.
  3. Page 3 - the third page was the paragraph with its original punctuation, as it was written, surrounded in whole by paragraph tags.
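Roughly, the three variants looked like this; the copy is a stand-in, not the actual test text, and "zorblat freem kwandle" stands in for the uncommon three-word phrase:

    <!-- Page 1: no punctuation, one solid paragraph -->
    <p>the quick brown fox jumps over the lazy dog and zorblat freem kwandle waits below</p>

    <!-- Page 2: still no punctuation, broken up with tags -->
    <h1>the quick brown fox</h1>
    <p>jumps over the lazy dog<br>and zorblat freem kwandle waits below</p>

    <!-- Page 3: original punctuation, wrapped in paragraph tags -->
    <p>The quick brown fox jumps over the lazy dog. And zorblat freem kwandle waits below.</p>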
Links to all three pages were added to the home page of a site that had been in existence for five years. They were titled "page 1", "page 2", and "page 3" so that the results weren't swayed by the anchor text of the links.

It took a little over a week for the pages to show up in the Google index, with Yahoo and MSN trailing by 4-6 weeks. Page 3 showed up first, while page 1 showed up in the index shortly thereafter. When searching for the term the pages were optimized for, page 3 received rankings while page 1 did not show up. Page 2 was never indexed by Google or MSN, but did eventually show up on Yahoo.

After three months of page 2 not showing up in the index, I changed page 2 and added punctuation in random places. The punctuation was in different places than on page 3, and I removed the heading, paragraph, and break tags.

Just a few days after making the changes, page 2 started showing up in the search results for the same term, and not as a supplemental result. It was a unique ranking all its own.

The conclusion is that the search engines use punctuation to break up the content into shingles. So when optimizing a dynamic template that will generate many pages, it's good to use bullet points, keep your sentences short, and weave in the dynamic content as much as possible (without being too repetitive).
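Applied to a template, that advice looks something like the sketch below, where the {curly-brace} placeholders stand for the dynamically fed values:

    <h1>{school_name} in {city}, {state}</h1>
    <ul>
      <li>Find classmates from {school_name}.</li>
      <li>Browse the class of {graduation_year}.</li>
      <li>Reconnect with friends in {city}, {state}.</li>
    </ul>

Each bullet is a short, punctuated sentence, so every page breaks into its own shingles, and the unique dynamic values keep the boilerplate from matching page to page.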

Monday, August 13, 2007

Official Google Webmaster Central Blog: Malware reviews via Webmaster Tools


What is malware?

Malware is software that tracks your moves online and feeds that information back to marketing groups so that they can send you targeted ads.

Malware can be downloaded without you even knowing it: playing online games or installing software that bundles it without informing you can put the software that tracks your every move onto your machine.

Most installed malware will show you random pop-up ads, but some people's computers slow down or even crash. The malicious software can also collect people's personal information and abuse it.

Google will now show you results in the search engine results page (SERP) that warn you if a website carries malware. Some site owners don't even realize that something on their website is infected, so Google is letting webmasters know through the Google webmaster tools if they have malicious software on their site so that they can remove it.

I always love to see the user come first when it comes to websites and search engine optimization, and this is another great way Google is helping webmasters optimize their websites while paying attention to the user.

Problems with multiple domains

There are a lot of websites and companies that buy up multiple domains in order to try and corner the market and weed out the competition. While at Classmates.com I found a long list of domains that the company owned, some of which were parked, and others were resolving to the same DNS as the www.classmates.com domain. The problem with multiple domains pointing to the same DNS, from an SEO standpoint, is that the search engines see an exact match, down to the same directories, filenames, and page content. The end result is that one or more of the domains could be penalized or even banned for being the same.

I ran into the same issue with the professional women's organization - the original domain was www.pwoo.org, which was later changed to www.professionalwomenonline.com. The www.pwoo.org domain was used when I was quoted in articles and marketing publications, so I wanted those who came from there to still arrive at the same site even though the domain changed. With the two domains pointing to the same site, I struggled with ranking issues due to links and duplicate content. So I removed the www.pwoo.org domain from the Google index so that it won't obtain rankings anymore. The www.professionalwomenonline.com domain is now getting great rankings, and users who type in www.pwoo.org can still see the same site.

How did I resolve the issue?
Google has a great addition to their webmaster tools in which you can remove files, directories, and even whole domains. By removing one of the domains from the index, the other can be left to obtain rankings without the risk of being blacklisted. The only catch is that the pages have to return a 404 (page does not exist) in order for the domain to be removed.

  1. Back up your files on a computer or separate server and remove them all (just temporarily).
  2. Go to your Google webmaster tools and select the site you wish to remove (if you don't have a site set up in the Google webmaster tools, I would advise creating an XML sitemap and setting up an account today).
  3. From your webmaster tools site dashboard, select the "URL Removals" option for your website.
  4. Click "New Removal Request".
  5. In this case you want to select "Remove your site from appearing in Google search results."
  6. Confirm by clicking "Yes, I want to remove this entire site."
  7. The site is then added for removal.
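As a sketch of that 404 requirement, assuming an Apache server hosting both domains (a hypothetical config, not the actual PWOO setup), the retired domain can be made to answer 404 temporarily while the removal request is processed:

    # Retired domain: answer 404 for every URL so the removal
    # request will be accepted (temporary, per step 1 above)
    <VirtualHost *:80>
      ServerName www.pwoo.org
      Redirect 404 /
    </VirtualHost>

    # The domain you are keeping continues to serve the site as normal
    <VirtualHost *:80>
      ServerName www.professionalwomenonline.com
      DocumentRoot /var/www/site
    </VirtualHost>

Once the removal has gone through, the old domain can be pointed back at the site for the users who still type it in.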

Tuesday, July 31, 2007

Official Google Webmaster Central Blog: Supplemental goes mainstream


You know I have been an advocate of natural search landing pages. With Google's supplemental results technology aligned with natural search optimization and marketing for usability, a user who searches for, say, Annville Institute, Annville, Kentucky (KY) on Google will see:
Annville Institute, Annville, Kentucky (KY)
Ever find yourself wondering what happened to so-and-so from Annville Institute in Annville, KY? Perhaps you’re trying to get in touch with one of your best ...

That result takes the user to a page that is more relevant to what they are looking for than the standard www.classmates.com homepage.

This also helps projects like the buyyouadrink.org profiles, so that users can share the information they want when people Google their name to find out more about them through their own personal profile pages.

Wednesday, June 6, 2007

In-House SEO isn't Just Optimizing...

Having been an in-house SEO for a few years, I have come to the realization that optimizing a company's website from within the company itself isn't all about knowing the technology and algorithms of search engine optimization. I must say that I am glad I understand how a bot indexes a site, how to target the right keywords on specific pages at the correct densities, and how a dynamic website can encourage rankings through initiatives that target specific traffic and generate conversions. But what I am most thankful for is the ability to communicate and work with other individuals within the company.

While my main responsibility may be to increase traffic through natural search marketing, my success stems from the success of others. If I notice a page or section of the site that includes a large amount of content that could be great fodder for the search engines but isn't getting indexed, I target the issue and collaborate with those responsible for correcting it. If the content within those pages could be written to include the website's main key terms, then I work with the copywriters or those responsible for those pages to get the terms into the content at the correct densities, so that the pages rank properly and aren't confusing to the users who read them.

The search marketing industry has not only grown over the past several years, but is changing in such a way that it takes more than just analyzing a site and telling a webmaster to make some simple corrections. Nowadays you have to coordinate with several different departments - from IT to the copywriters, the user interface designers, the database developers, the analytics departments, and so on - in order to get the work you need done so that you can be successful in your job as a search marketer.

My advice to the consulting companies out there is simply this: 

If you can get inside the walls of the company you are trying to help, then by all means take that opportunity and run with it. Ask for introductions to as many of the people involved with the website you are working on as possible, and develop strong working relationships with those individuals. They will not only help you understand the site you are optimizing better, but they can help leverage the work you need completed in order to drive the optimization. And by all means - don't ever stop communicating with those individuals. There is always something more you can do, and clients tend to start shopping elsewhere when they feel the work isn't being done for the money they are spending.


To the SEO consultants that are looking to make the transition in-house my advice is this: 

In the first month you are there, make sure your analytics are in place and you can grab a footprint of where the site is before doing any work. Also use that time to build relationships with the individuals who can help you succeed (i.e., the copywriters, developers, designers, product managers, and so on). Once you have grabbed the data you need to show how natural search is driving traffic and conversions, and you have developed the relationships you need in order to be successful, then start optimizing - and keep those lines of communication open as you continue to optimize the site.

These are exciting times for us search marketers, and for those of us who are crazy passionate about what we do, there are bright futures in this technology. Just remember: to be a successful optimizer you not only need to know design, development, analytics, and so on, but you also need to know how to work well with and rely upon others.