Downtimes and SEO


One of the most debated questions in the world of Search Engine Optimization, and one of the most frequently asked, is: “What happens to the SEO of my website if I have downtime?”

First, let us understand the agenda of Search Engines and the way they work. Search Engines like Yahoo and Google are supposed to return results which people can visit to gather the information they need; above everything, results which are useful for their visitors.


So it is only natural for a Search Engine to remove the pages of a website from its index if downtime occurs, since search engines do not want people searching for something and ending up at a website that doesn’t even work. However, the downtime has to reach a certain severity before the Search Engine actually takes the plunge and removes the web page from its index.

Sometimes this can take months, and at other times it might take only a few days. It all depends on a few factors:

– Past downtime history of the website
If a website is known to go down and come back up regularly, the Search Engines will not remove its pages from the index, but the website and its pages will also not be able to rank high in the SERPs.

– Duration of downtime
Search Engines regularly check for the existence of pages. If after x checks the Search Engine cannot reach a page, it demotes the page in its rankings; if after a further x checks it still cannot find the page, it removes the page from its index. If the page has merely been put down in the rankings, it is rather easy to get back up with some updates. If the page has been removed, however, it can get back in the natural course of things once it is back up and running for a decent amount of time.

– Domain Registration
If a website went down because its domain registration expired, Search Engines will most likely delete all traces of the website from the SERPs within a week or two at most of the website going down. Once this has happened, the website will have to work its way back into the search engine all over again.

Not a lot to worry about as far as SEO is concerned, but this is still something you should watch out for. Contrary to popular belief, downtimes never *ruin* your SEO, but they can still harm it temporarily. So try not to have downtimes; that’s the only advice I can give on this matter.
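For site owners who want to stay ahead of this, the demote-then-deindex pattern described above can be mirrored in a simple self-monitoring script. Here is a minimal sketch in Python; the threshold values are illustrative guesses of mine, not published search-engine numbers:

```python
# Track consecutive failed availability checks and estimate the likely
# search-engine reaction to an outage. Thresholds are illustrative only.

DEMOTE_AFTER = 3    # assumed: failed checks before a page gets demoted
DEINDEX_AFTER = 10  # assumed: failed checks before a page gets removed

def index_status(check_results):
    """check_results: list of booleans, True = page was reachable.

    Returns 'indexed', 'demoted', or 'removed' based on how many
    consecutive failures sit at the end of the check history.
    """
    consecutive_failures = 0
    for reachable in check_results:
        consecutive_failures = 0 if reachable else consecutive_failures + 1
    if consecutive_failures >= DEINDEX_AFTER:
        return "removed"
    if consecutive_failures >= DEMOTE_AFTER:
        return "demoted"
    return "indexed"
```

A single successful check resets the failure count, which matches the idea above that a site coming back up in time recovers in the natural course of things.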


What is Negative SEO?

Competition is everywhere; you can sense it around you whenever you’re involved in something that has even a bit of competition. There is always a vibe or two coming from somewhere telling you that you will face people who want to take over your business and replace it with theirs. It’s inevitable: competition will always exist.

Search Engine Optimization is no exception. There is a lot of competition in SEO as well; whether you’re providing it as a service or doing it for your own website, you will always find people trying to get better than you. Competition is appreciated in most circles and is mostly welcomed. But in SEO, this might not be the case.

In this field, the optimization of websites on the Internet, a negative aspect emerges out of what is supposed to be healthy competition: Negative SEO. It can be defined as the use of deceptive SEO practices to make another person’s website(s) perform badly in Search Engines. How are these practices carried out? Let’s check them out:

– Incoming Links: Search Engines pay heed to what kind of websites link to you, and sometimes people “Googlebowl” their competition with practices that involve linking from shady/illegal websites. The good part is that Search Engines don’t ban websites over such links anymore, but there is also bad news: with Google’s new paid-link detection methods coming up, Googlebowlers may just find a new way to destroy competition.

– Outgoing links: Forums, blog comments, discussion groups: all of these places are supposed to have activity from people, but sometimes negative optimizers might use them to link to websites that are considered unsafe/illegal/banned by Search Engines, as a result of which your website might be banned outright from the SEs. The solution? Nofollow all outgoing links by default (guide coming up soon).

– Script Hijacking: Sometimes people might exploit the scripts used by their competitor’s websites and do things that violate Search Engine policies (such as keyword spamming) and then get the website banned. Highly unlikely if you’re a webmaster who keeps a watch and maintains security.

– Links Explosion: Somewhat of a sister to Googlebowling, this is a method used by people with a large number of websites under their control (typically 200+). When a website suddenly gets a lot of links with the same anchor texts from a great many websites (200+), it gets sandboxed in Google. There is really no way to avoid this, but to much relief it is a rare occurrence.

– Accusations: If you have any ads on your websites other than AdSense, there is a chance of your competitors reporting you to Google for serving them spyware and adware. I know this because it happened to the website of one of my colleagues, who ran Clicksor’s interstitial ads. Now if you look up his website in Google, it no longer appears at the top for its keyphrases, and Google actually shows a warning that the website might be dangerous.
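The nofollow defence suggested under “Outgoing links” above can be automated. Here is a minimal sketch in Python using only the standard library; the function name and the regex approach are my own, and a production site would want a proper HTML parser instead:

```python
import re
from urllib.parse import urlparse

def nofollow_external_links(html, own_domain):
    """Add rel="nofollow" to <a> tags whose href points off-site.

    A deliberately simple regex-based sketch; it leaves internal
    links and tags that already carry a rel attribute untouched.
    """
    def rewrite(match):
        tag, href = match.group(0), match.group(1)
        host = urlparse(href).netloc
        # Internal link, or rel attribute already present: leave as-is.
        if not host or host.endswith(own_domain) or "rel=" in tag:
            return tag
        return tag[:-1] + ' rel="nofollow">'

    return re.sub(r'<a\s+[^>]*href="([^"]*)"[^>]*>', rewrite, html)
```

Run over user-submitted content (forum posts, blog comments) before it is served, every outgoing link stops passing link juice, which defuses this particular attack.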

So that is basically all there is to Negative SEO: attempts at getting a competitor’s website kicked out of the Search Engines’ index, pushed low in the SERPs, or blacklisted all count as Negative SEO. Although Search Engines try hard to keep people from being affected by this black face of SEO, one must also be cautious and keep watch; only then can websites be protected from the effects of Negative SEO.


Black Hat widely in use by SEOs!

Whenever you visit a website that has to do with SEO, or some place where people talk about SEO, or even buy an ebook about SEO, what you see almost all the time is people telling you that White Hat SEO is the way to go and that you must always use it for your websites. People also say that Black Hat SEO is not worth the effort, because its use will ultimately get you banned.

This notion rests upon a simple logic of the ethical versus the unethical. White Hat SEO is purely ethical Search Engine Optimization, while Black Hat SEO is like using cheat codes in a video game; in other words, unethical SEO. The difference between using a cheat code in a video game and using Black Hat SEO on your website is that when you use cheat codes, there is no real harm done; you just enhance your gaming experience sitting in front of your computer.

But when one uses Black Hat SEO on their websites, it results in simple, straightforward, yours truly, bans. Bans from the search engines, who think being unethical is simply not the way to go, as common sense states. However, through my personal research I have found that Black Hat SEO reigns supreme amongst start-up and novice SEOs, even more commercially than personally.

What happened was that I was reviewing a friend’s new website, which he said he had got optimized for best Search Engine performance by a local SEO. I checked the source code and, much to my surprise, found keywords stuffed into the bottom of each web page in background-color font. There were many more trails of a Black Hat on the website, which I removed from my friend’s website after telling him it was unethical.

This was not the first case of Black Hat I had seen; a lot of self-proclaimed novice SEOs today are using Black Hat to give their clients the results they want on Search Engines. Most of these people work this way: they promise their clients a certain spot on some keyphrases/keywords. With Black Hat SEO, they obtain 50% of the keywords, which is mentioned in their clause as the minimum required result to be paid. These SEOs then get paid and get out of the scene.

Some time later, the client’s website gets banned, much to the dismay of the client; but the SEO cannot technically be accused of anything, as he did deliver the rank he promised, even though it was through deceptive practices. This brings me to the next point: why don’t SEOs choose the white hat way, especially the novice ones? Well, there are certain reasons –

– White Hat SEO is hard to learn
– White Hat SEO requires lots of practice
– Building links with White Hat is a tough job
– White Hat is a slow process
– White Hat SEO requires more time and money

So basically we can see that in an effort to earn a lot of money quickly, novice SEOs shove Black Hat down the throats of their clients and their websites. A website might even get permanently banned due to these practices. Most Black Hat SEOs are smart and will not let you know what’s going on; you might even get the impression that they are performing their task honestly. But the truth might be otherwise, so here is what you should do when you’re getting your website optimized for the Search Engines:

– Ask someone experienced with White Hat to look at your website
– Look in your source code; if you find keywords stuffed in anywhere except the Meta Tags, then it’s Black Hat.
– Browse your whole website at regular intervals to make sure no page-forwarding etc. is going on.
– Ask your SEO what they are doing to your website.
– If you find anything suspicious, talk about it with your SEO and with someone experienced with White Hat.
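The source-code check in the list above can be partly automated. Here is a minimal sketch in Python that flags the crude trick I described, text colored the same as the page background; it is only a heuristic of my own and will miss stuffing done through external CSS files:

```python
import re

def find_hidden_text(html, background_color="#ffffff"):
    """Return the text of <font>/<span> blocks whose color matches
    the page background - the crude hidden-keyword-stuffing trick.

    A simple sketch; a fuller audit would also check linked CSS,
    tiny font sizes, and off-screen positioning.
    """
    pattern = re.compile(
        r'<(?:font|span)[^>]*color\s*[:=]\s*["\']?'
        + re.escape(background_color)
        + r'["\']?[^>]*>(.*?)</(?:font|span)>',
        re.IGNORECASE | re.DOTALL,
    )
    return [m.strip() for m in pattern.findall(html)]
```

Any non-empty result is worth raising with your SEO before a Search Engine finds it first.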

A few precautions here and there will save you from falling prey to the effects of Black Hat. I used to believe nobody would use Black Hat commercially in their business, but these shocking experiences say otherwise, so make sure you keep watch if you’re buying SEO services from anywhere.


Pop-ups, worth the annoyance?

Advertising is one of the most popular ways for website owners to make money online. Almost every website owner these days opts for a PPC or CPM program in an effort to make some more money from their website. In this race to make more and more money, many new forms of advertising have been introduced as well.

Simply put, in the online world many new ways of advertising appear every day, in an effort to give the publisher as much financial return from their website as possible and the advertiser as many views of their advertisement as possible. Among these new ways of advertising are interstitial ads, pop-under ads, etc. But we can trace them all back to their most annoying and popular ancestor: the pop-up ad.

What is a pop-up ad? Simply put, an advertisement window that pops up uninvited when an Internet user is trying to open a web page. Every now and then we come across a website that serves its visitors pop-up ads. Although not always the case, this is most often observed on the websites of people who have been banned/rejected from the popular PPC program AdSense, or who simply want to make more from their website.

Webmasters might be successful in getting the most out of serving their websites with pop-ups, and some of them might also believe pop-ups are great. But is it the same case with normal Internet users? What do they think of pop-ups? Map100 checked online communities for answers.

“I hate them,” says webmaster GRIM, “especially the ones with multiple pop-ups on every page.” User Blackmane gives further insight, saying, “unless the website has content I really want, I hit the back button as soon as possible and usually never visit again.” Another web enthusiast shares much the same view and comments, “I wouldn’t revisit a website with popups unless there’s a strong reason.”

These are not just a few users but a representation of what the whole online community thinks of websites that carry pop-up ads. Judging from the responses Map100 saw to pop-ups, it’s clear that people do not want to see pop-up advertisements on websites, and most would not revisit a website with pop-ups unless they had a very strong reason to do so.

For webmasters, therefore, it would be advisable not to have any kind of pop-up ads on their websites; they are really unwelcome guests for Internet users. The same goes for interstitial ads and pop-unders. Nobody wants to view pop-up ads, especially since 95% of the time they are completely unrelated to the theme of the website they are coming from. Through its survey, Map100 has been able to determine that having pop-up ads not only reduces the amount of traffic you have substantially, it also gives your website a bad name. (Some websites with pop-ups have even been blacklisted by Search Engines.)

So play it clean, play it clear, and don’t use pop-up ads on your website; they do more harm than good.


Stop worrying about the green bar. Live PageRank explained.

PageRank is a very important metric these days. However, a lot of people misinterpret it; they not only expect too much of it, but do not really understand how it works.

Most webmasters know PageRank changes with the amount of backlinks you get. A good number of these webmasters think that PageRank is all that matters. This is wrong. PageRank updates on average every three months, which is what makes it a highly inaccurate metric. Of course, Google has the Live PageRank updated in real time, but only they can access it. This is when webmasters go out looking for the Live PageRank in different places: they look in PageRank prediction tools, in the Google Directory, DMOZ, etc. This makes it quite amusing, as it is much easier than that.

The real PageRank is easy to find. My personal belief is that Google gives websites PageRank based on how often their spider comes across a link to a site. It’s simple; nothing else matters. The PR of the page with the link on it is just a side effect. Every time Google comes through one of your links it follows it and gives you some PR juice; the more links it follows to your website, the more your PageRank will increase. This one line explains all the mysteries about PageRank. It not only explains how PageRank is obtained and what the real PageRank is, but more importantly why it is important to have backlinks on “high PageRank” websites, better described as websites which Google visits very often.

Based on this belief, it follows that a site could technically achieve a high PR with a single link on a PR0 page, if Googlebot were somehow malfunctioning and following that same link over and over again for a long time. This is quite easy to understand, as it is exactly what happens. If you get a link from a PR7 page, with it alone you can achieve a PR5 (perhaps), because the PR7 site is being visited by the Google spider so often that this same spider keeps ending up at your site over and over. This is how subpages of PR9 and PR10 sites get a high PR as well.
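For comparison with my belief above, the formula Brin and Page originally published computes PageRank purely from the link graph, not from crawl frequency. Here is a minimal power-iteration sketch in Python, using the standard damping factor of 0.85 from the original paper:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Compute PageRank by power iteration over a link graph.

    links: dict mapping each page to the list of pages it links to.
    Follows the published formula: PR(p) = (1-d)/N + d * sum of
    PR(q)/outdegree(q) over all pages q that link to p.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank
```

On a toy graph where three pages all link to one hub, the hub predictably ends up with the highest score; either way, the practical takeaway is the same as mine, more links from well-linked pages means more PR.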

So basically, Live PageRank is updated all the time; all you have to do is look at the number of backlinks, and these vary daily. I personally use the Yahoo! backlinks tool, since it gives me a more accurate look at the number of backlinks I currently have. If you keep working on increasing those backlinks, your PageRank will skyrocket and the Google spider will be all over your site, daily.

Of course, I have made my point and passed my beliefs onto you. Now you want proof. The best part is that I can present you with proof just by mentioning the site you are on right now, reading this blog post. This site was started a month and a half ago with perhaps 3 backlinks pointing to it. The domain was old and had died out; it was not updated at all, and hence the backlinks it had decreased. It had a measly PR1. I changed the whole purpose of the site and started working on it. I built the site from scratch, made sure it was different from the thousands of directory sites out there, and started my marketing campaign. According to Yahoo! we are up to around 160,000 backlinks, which is pretty good. I would say it’s definitely enough for a PR5 next update, perhaps a PR6 (no one knows, but do you care?). Googlebot visits the site very frequently. I have tracked the “cached” page updates Googlebot does on the website for two weeks now, and the cache is never older than 2-4 days on high PageRank sites. This leads to the next accurate tool to check Live PageRank: the “last cached” tool. It is also a great metric for Live PageRank. Here are some stats that prove this point and back up my backlinks Live PageRank belief.

This is G o o g l e‘s cache of as retrieved on Dec 21, 2006 00:57:26 GMT.
PR9 site, last cached yesterday, obviously.

This is G o o g l e‘s cache of as retrieved on Dec 21, 2006 00:33:33 GMT.
PR8 site, last cached yesterday.

This is G o o g l e‘s cache of as retrieved on Dec 21, 2006 00:43:39 GMT.
PR9 subpage, great example. This subpage of Wikipedia has more PR and is cached more often than the main page. Since more people link to the English part of the site, the Google spider visits it more often.

This is G o o g l e‘s cache of as retrieved on Dec 20, 2006 08:13:14 GMT.
Our lovely site with an ugly PR1; however, the cache suggests we’ll be out of the ugly gray PR bar soon. Who cares about the bar though, as long as Google visits us often 😉

This is G o o g l e‘s cache of as retrieved on Dec 18, 2006 23:09:29 GMT.
PR8 page, this gets a little confusing due to the following entry.

This is G o o g l e‘s cache of as retrieved on Dec 21, 2006 00:53:52 GMT.
PR7 subdomain but definitely more popular than the main domain. This is why a lot of DigitalPoint threads rank so high for different search engine terms. I wouldn’t be surprised if the forums get a higher PR next update due to their increasing popularity and countless new backlinks.

This is G o o g l e‘s cache of as retrieved on Dec 17, 2006 05:42:44 GMT.
PR8 site; even though it’s the homepage, it was cached 5 days ago, much older than all the sites above. That is because it does not have as many backlinks pointing to it, hence not as many spider visits. I can come up with countless Wikipedia inner pages that are cached more often than the main page just because they have more links pointing to them.

This is G o o g l e‘s cache of as retrieved on Dec 11, 2006 19:03:24 GMT.
A PR4 page proves my point: the cache is almost 2 weeks old, which shows the cache is a very accurate tool for reading the Live PageRank of a site. One more example follows.

This is G o o g l e‘s cache of as retrieved on Nov 4, 2006 16:31:50 GMT.
A PR3 page proves my point further: few backlinks, bad Live PageRank. Last cached 6 weeks ago. Horrible.

This is G o o g l e‘s cache of as retrieved on Dec 4, 2006 00:09:21 GMT.
My own personal page was cached not so long ago. I assume its Live PageRank should therefore be around 4? Let’s hope so.

I could find countless more PR4 and PR3 sites to show you further how backlinks and the cache have an immediate implication for Live PageRank. What’s more, this renders the PageRank metric useless, since the really important thing is when the site was last visited anyway. You don’t want a link on a site which says it is PR5 but was last cached a month ago; think about all the time Google will take to get to that page, and then it still has to get to your link as well.
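The “last cached” check used throughout the examples above is easy to script. Here is a minimal sketch in Python that parses the timestamp line from Google’s cache header; the date format is taken from the examples above, and the function name is my own:

```python
import re
from datetime import datetime, timezone

# Matches the timestamp in lines like:
# "This is Google's cache of ... as retrieved on Dec 21, 2006 00:57:26 GMT."
CACHE_LINE = re.compile(r"as retrieved on (\w+ \d+, \d{4} \d{2}:\d{2}:\d{2}) GMT")

def cache_age_days(cache_header, now):
    """Return the age of a cache snapshot in whole days.

    cache_header: the header line Google shows on a cached page.
    now: a timezone-aware datetime to measure against.
    Returns None if no timestamp is found in the line.
    """
    match = CACHE_LINE.search(cache_header)
    if not match:
        return None
    cached = datetime.strptime(match.group(1), "%b %d, %Y %H:%M:%S")
    cached = cached.replace(tzinfo=timezone.utc)
    return (now - cached).days
```

A small age (a day or two) suggests frequent spider visits, the very thing I argue the green bar fails to show.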

I hope this post has given people some insight into what Live PageRank actually is, and why everyone should stop worrying so much about the little green bar and stop using different tools to find out where to get links from. Have fun marketing your sites, and I hope next time someone asks you to exchange links you will look at the Google cache and Yahoo! backlinks instead of the useless green bar.