How Long To Get Out of Google Sandbox

voodooman asked 3 years ago
Hello,

My newest site was started this past June and has about 140 inbound links (according to Yahoo), but it is still ‘sandboxed’. The way I know is that I can search for specific combinations of words from my site in quotes, and as long as there are ANY other sites with that same combination of words, my site appears at the end of the list of search results, even if the query only produces 5 or 10 results.

I have used sitemaps (the site is fully indexed), the descriptions are good, and there are no URL-only pages, but there is NO GOOGLE TRAFFIC.

The end of January will make 9 months in the sandbox for this site. Have any of you started new sites in the last year or two, and how long were they ‘sandboxed’?

Did you do anything special to ‘get them out’ so to speak?

26 Answers
webber286 answered 3 years ago
The sandbox has only been in existence since the early part of 2004, so 3 years in the sandbox at this point would be impossible since it has only existed for 20 or 22 months. It is possible you might not have been ranking prior to the sandbox, and from what I understand it is possible to trigger the sandbox at any time, so maybe that is where the 3 years comes from.

In any case, there are certain triggers for starting and ending the sandbox; the problem is that only Google knows what those triggers are. Webzcas, what were you doing that got you out of the sandbox in 9 months? Some things I think might help are getting a DMOZ listing, buying a Yahoo Directory listing, and registering your domain for at least 5 years. Did you do any or all of those?

As for the JavaScript redirects, those go explicitly against Google’s webmaster guidelines, and Google is taking steps to ban sites that use those kinds of tactics. Just read Matt Cutts’ blog for more evidence of this. It may take a while for Google to catch it, and possibly they never will, but it is still a risk if you are in this for the long run.

webber286 answered 3 years ago
The sandbox has triggers that put you in and triggers that get you out. What those triggers are is the million-dollar question that can only be answered by a few souls at the Googleplex. Notice that Matt Cutts’ site had no issues with the sandbox.

Voodooman, if your intention is not to make any reciprocal links, then you may avoid being put into the sandbox. I’m not sure what happens if you manage to get only one-way links. But as you said, if you don’t have enough links, it will be tough to rank for competitive terms in Google. That is definitely the system they have created.

My belief is that the sandbox is definitely an aging filter, but the age required to get out of the sandbox depends on the competitiveness of the keyword(s) being targeted. Add to this that gambling sites are pretty much always mentioned in any “bad neighborhood” discussion on SEO forums, and we might be treated with more prejudice.

For the benefit of others reading this thread, some other things you can try that may help reduce the time in the sandbox are to get listed in DMOZ, register your domain for at least 5 years, buy some one-way links, and maybe try buying a link from the Yahoo directory. Also make sure you aren’t linking out to any Google-banned sites; I just found one last night that I had to get rid of. Of course, the best way to avoid the sandbox seems to be starting with a domain that is more than 2 years old, if only we could all be so lucky as to have one of those in this industry.

voodooman answered 3 years ago
Yeah, that’s the sixty-four-thousand-dollar question. Again, the site I am trying this with is NOT currently in the Sandbox (since I just started developing it this week), so I can’t say whether it will help a site that is already in. My goal is to prevent my latest site in development from ever being in the sandbox by eliminating what I think is the cause of a site being sandboxed.

Yes, thanks for the confirmation that Googlebot does not follow these JS-type links (I was pretty sure I was correct about this, but it’s always good to have my suspicions confirmed!).

Anyway, as far as I can tell the Sandbox is some type of aging filter, because you can find sites with a zillion inbound links that are still sandboxed and sites with few inbound links that are not; the only differentiating factor I have found is how long these sites have been in existence. I don’t think inbound links are the answer, as I feel Google has to realize that this is a stupid (and easily manipulated) way to rank sites. Sites should be ranked based on well-written, fresh, original content and that’s it.

Wouldn’t doing this eliminate all spam, duplicate stolen content, scraper sites, Google bowling, link scheming, etc.? Look, Google, by its ranking methods, has taken the best resource index on the net and turned it into an index of sites that are, for the most part, nothing more than glorified link farms. I don’t even use Google anymore myself, but I feel it is still very valuable for its partner traffic (like from AOL Search, etc.). Google traffic is far less valuable than traffic from Yahoo, MSN, and AOL, as I have seen during my years as a top-10 Match.com affiliate.

webber286 answered 3 years ago
Well, I think I have verified that Google doesn’t follow JavaScript pop-up links. I remembered a site that I worked on a couple of years ago (non-gambling related) that used JavaScript pop-ups to show help text. I tried several searches for pieces of the text on those pop-up pages, and Google could not find the pages in its index. So it seems that it will work the way you are wanting; the next question is whether it will help you out of the sandbox.

webber286 answered 3 years ago
OK, I get what you are doing now: using JavaScript to pop a new window that automatically redirects to the link from there. It’s an interesting idea; let us know if it works out for you.
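
If I’m picturing it right, the pattern would be something along these lines (just a sketch to illustrate; the file name, casino name, and affiliate URL are made up, not your actual code):

<a href="#" onclick="window.open('/go/example-casino.html'); return false;">Example Casino</a>

and /go/example-casino.html would be nothing but:

<html>
<head>
<script type="text/javascript">
<!--
// the popped-up page immediately forwards the visitor on to the
// (made-up) affiliate URL; the listing page never shows a plain link to it
window.location.replace("http://www.example-casino.com/?aff=12345");
//-->
</script>
</head>
<body>Redirecting you to the casino...</body>
</html>

Since the real destination only ever appears inside the popped-up page, a crawler that doesn’t execute JavaScript never sees a followable link to the casino from the listing page.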

voodooman answered 3 years ago
Webber, didn’t mean to leave you hanging. I am not sure about the ASP links, but if you look at my Fast Payout Online Casinos site below, you will see that every outgoing link on every page is within a JavaScript function call (like the kind your site uses to validate users’ emails, etc.). Google ignores everything it finds between <script> tags, and if you look at my source on every page you will see that I went to the added trouble of surrounding my JS functions with comment lines as well. I really want G to think I have a site with no outgoing links (other than to other pages on my own site). It’s a theory I have, but it’s based on some facts I have obtained and other sites I have observed.

Again, I don’t know if this is THE ANSWER, but based on what I’ve seen I think it’s reasonable to try. Since I had not started that Fast Payouts site when I came up with this idea, I figured I would build it this way from the ground up. I don’t know what to think yet, as the site and its few developed pages are still quite new.
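
For anyone who wants to see what I mean, the pattern boils down to something like this (the function name and URL here are placeholders for illustration, not my real code):

<script type="text/javascript">
<!--
// the real destination exists only inside this function,
// never in a plain href that a crawler could follow
function goExampleCasino() {
    window.open("http://www.example-casino.com/?aff=12345");
}
//-->
</script>

<a href="javascript:goExampleCasino()">Play at Example Casino</a>

With the href pointing at the JavaScript function instead of a real URL, and the function body itself wrapped in comment markers inside the script block, there is nothing in the plain markup for Googlebot to treat as an outgoing link.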

webber286 answered 3 years ago
It is an interesting theory, Voodooman; I too have heard rumblings that affiliate-based sites are treated more harshly than other sites. Our site doesn’t use JavaScript redirects, but we do employ a similar sort of redirecting using ASP code. We did that originally to keep track of click-throughs to casino sites. I’m not the technical one, but it seems that Google may not follow this code either, though I’m not really sure how to check that. I can go to Google Sitemaps and it gives me a list of error pages when Google tries to follow certain links. So far those have been because of sites that have gone down that we were linking to (fastpoker.com, for instance). So maybe this means Google is following our links.
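
For what it’s worth, the tracked links themselves just look something like this on the page (the script name and casino are made up for illustration; the real ASP file logs the click and then sends the visitor on with a normal HTTP redirect):

<a href="/redirect.asp?casino=fastpoker">Fast Poker</a>
<!-- redirect.asp (server-side, not shown) records the click-through
     and then redirects the visitor to the casino's site -->

Since the link itself points at our own domain, whether Google ever reaches the casino depends entirely on whether it follows the redirect the ASP script issues.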

In any case, you may want to look at our site and see if that helps out your theory or not.

Dominique answered 3 years ago

Originally Posted by antoine
Just build websites and don’t worry about Google for now.

That is what I recommend too.

It’s what I did and still do.

villa10 answered 3 years ago
antoine wrote:
Just build websites and don’t worry about Google for now.

I have to agree with this.

At least in my case, it was a big mistake to blindly trust some SEO mantras that, even after years, are no guarantee of a good ranking.

The canned answer in SEO forums, “content and linking”, does not hold up to analysis.
There are a lot of sites out there doing very badly with thousands of pages and reciprocal links.

So I think it is reasonable to build sites without thinking only about Google.

voodooman answered 3 years ago
The theory is that Google sandboxes sites depending on the ‘type’ of site it finds. Since it’s not very intelligent, it must use some means to gauge the types of sites it finds, I believe it uses the nature of outgoing links. When my new site is completed, Googlebot will NOT find any outbound links from my site (I am not even going to do link exchanges, but expect to get inbound links from some directory listings, posts etc.). This will create a self-contained environment that 1) Preserves PR and 2) *Hopefully* eliminates the Sandbox effect.