Google Sandbox Doesn’t Exist!

December 3rd, 2004 Tony

I'm starting to buy into the idea that the Google sandbox is non-existent. I talked to quite a few people at the WebmasterWorld conference who had successfully launched new sites that quickly ranked well for competitive keywords.

I think caveman may be right:

I think it involves a series of dup filters, not limited to “content” in the “onpage” sense of the word. We believe it can involve text, and/or anchor text, and/or non-content code. Not too unlike good ol’ kw filters.

We guess that it involves an evolving db of kw's, with the list changing in waves, though the biggest waves had names (e.g. Austin), and now the evolution of the list is more subtle. This conclusion could be way off base, however, with other things happening that just make it look like a kw-based phenomenon. Some 2kw phrases that seem like they should be on the hit list are not, which is why we only guess at the kw list thing. And evidence abounds that it's not just a money-kw thing. I wish that would get put to bed.

We'd agree that there are multiple algos, and that G's system factors in qualitative and evolutionary factors including age of pages, age of links, number of pages, number of links, and rate of growth. There are lots of ways for a new cat to get skinned, and there have been ever since Florida. Just worse since April.

We also think a site has *hurdles* to get over that kick in or not to varying degrees depending upon their assessment of (what we view as) the evolutionary measures. And the hurdles also involve semantics, and measures of quality.

Trip a filter, you die. Sitewide. Fail to get over the hurdles, you die. Sitewide. (Did someone say G was all about pages?) ;-)

One thing I'd never do right now: launch a 10,000-page site, or add 1,000 pages to a 500-page site…unless I had a LOT of other things going for me.

I even seem to recall pages getting trimmed from the really big sites, which may be more related to all this than has been discussed much…especially if the algos are conceptually similar, but vary largely by degree.
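To make caveman's description a bit more concrete, here's a toy sketch of the kind of model he seems to be describing: sitewide filters and hurdles scored from "evolutionary" signals like page age, link age, page counts, and growth rate. Every signal name and threshold below is made up for illustration; nobody outside Google knows the real filters, so treat this as a picture of the idea, not the algorithm.

```python
# Toy illustration of caveman's model: sitewide filters and hurdles
# evaluated against "evolutionary" signals such as page age, link age,
# page count, and growth rate. All names and thresholds are hypothetical.

from dataclasses import dataclass


@dataclass
class SiteSignals:
    avg_page_age_days: float       # average age of indexed pages
    avg_link_age_days: float       # average age of inbound links
    page_count: int                # total pages on the site
    inbound_link_count: int        # total inbound links
    weekly_page_growth: float      # new pages per week, recent average
    duplicate_anchor_ratio: float  # share of inbound anchors that repeat verbatim


def trips_dup_filter(s: SiteSignals) -> bool:
    # Hypothetical duplicate/anchor-text filter: lots of identical
    # anchors on a very young link profile looks manufactured.
    return s.duplicate_anchor_ratio > 0.8 and s.avg_link_age_days < 60


def clears_hurdles(s: SiteSignals) -> bool:
    # Hypothetical "hurdles": growth has to look organic relative to
    # the site's age and size, otherwise the whole site is held back.
    pages_per_day_of_age = s.weekly_page_growth / max(s.avg_page_age_days, 1)
    links_per_page = s.inbound_link_count / max(s.page_count, 1)
    return pages_per_day_of_age < 5 and links_per_page < 50


def sitewide_verdict(s: SiteSignals) -> str:
    if trips_dup_filter(s):
        return "filtered"   # trip a filter, you die -- sitewide
    if not clears_hurdles(s):
        return "sandboxed"  # fail the hurdles, you die -- sitewide
    return "ranking"


# Example: a brand-new site that adds 1,000 pages in week one behind a
# burst of identical-anchor links fails on both counts.
print(sitewide_verdict(SiteSignals(
    avg_page_age_days=7, avg_link_age_days=7,
    page_count=1000, inbound_link_count=5000,
    weekly_page_growth=1000, duplicate_anchor_ratio=0.95,
)))
```

The point of the sketch is the shape of the thing: the penalties are sitewide, and they key off how the site grows, not just what's on any one page.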

This is so typical of the SEO business. A couple of years ago the standard way to launch a new site was to make sure you had all 100 to 20,000 pages up and ready to be indexed, and then drop a link to the new site. You needed every page ready before you queued up Googlebot, because a deep crawl was nearly inevitable at first launch but it might be a long time before you saw another. Now it seems you're better off starting with a few pages and trickling in new pages and inbound links over time. You definitely need software to manage this; it's the only way I've seen success in the past year. A rough sketch of that kind of scheduler follows.
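Here's roughly what that management software would do: instead of publishing everything at launch, queue the pages and release a small batch on a schedule, ramping up slowly as the site ages. The batch sizes and intervals below are arbitrary placeholders, not numbers I'd claim are safe.

```python
# Sketch of a trickle-publishing scheduler: take the full set of pages
# that are ready to go and spread their publication over weeks instead
# of dumping them all on launch day. Batch size, growth rate, and the
# interval are placeholders to be tuned.

from datetime import date, timedelta


def trickle_schedule(pages, start, initial_batch=10, growth=1.25,
                     interval_days=7):
    """Yield (publish_date, batch_of_pages), growing each batch slightly."""
    remaining = list(pages)
    batch_size = initial_batch
    publish_on = start
    while remaining:
        take = int(batch_size)
        batch, remaining = remaining[:take], remaining[take:]
        yield publish_on, batch
        publish_on += timedelta(days=interval_days)
        batch_size *= growth  # ramp up as the site ages


# Example: 500 page slugs dripped out weekly starting today.
pages = [f"/article-{i}.html" for i in range(500)]
for when, batch in trickle_schedule(pages, date.today()):
    print(when, len(batch), "pages")
```

The same idea applies to inbound links: stagger them alongside the content instead of pointing everything at the site on day one.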
