“Black hat SEO” is a phrase describing the various techniques webmasters use to rank high in search engines while violating search engine guidelines.
Each search engine has its own guidelines to help webmasters get their sites included in its index. To get included, a site should comply with those rules, or it may be excluded from the index.
If someone intentionally breaks those rules, they put their site at risk of a ban or removal from the indexes.
And that's exactly what black hat SEO “specialists” do. They break those rules on purpose. They don't care about users or search engines. They want traffic. And it's not targeted traffic.
Black hat SEO techniques
Black hat SEO techniques include:
• Keyword stuffing - loading a page with lots and lots of keywords without real, unique content; also hiding keywords on the page so they are visible only to search engines,
• Doorway pages - pages created solely to funnel traffic to other pages, with no valuable content of their own,
• Cloaking - showing one version of a site to users and another to search engines.
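Cloaking in particular is something you can spot-check yourself. Here's a minimal sketch (the URL and the 50% size threshold are my own arbitrary choices, not any official test): fetch the same page with a browser User-Agent and with Googlebot's User-Agent, and flag a big difference for manual review.

```python
import hashlib
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch_as(url, user_agent):
    """Fetch a URL pretending to be the given client."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def looks_cloaked(browser_html: bytes, bot_html: bytes) -> bool:
    """Crude heuristic: identical responses are fine; a >50% size
    difference between the two versions is a red flag worth a look."""
    if hashlib.sha256(browser_html).digest() == hashlib.sha256(bot_html).digest():
        return False
    longer = max(len(browser_html), len(bot_html))
    shorter = min(len(browser_html), len(bot_html))
    return shorter < 0.5 * longer
```

Sites legitimately vary output (ads, session IDs), so this is only a hint to investigate by hand, not proof of cloaking.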
White hats vs Black hats
White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing - Wikipedia.
White hat SEO specialists create websites for human beings, make them easily accessible to search engines and use other legitimate methods to promote them on the Internet.
Black hat SEOs create websites only for search engines. They break search engine guidelines and use deception to drive traffic to a site with little value to the user.
White hat SEOs aim to hold high rankings for a long time, generate targeted traffic to the site, satisfy incoming users by giving them what they searched for, and let search engines crawl the site with ease.
Black hat SEOs aim to trick visitors into visiting a website, trick search engines by showing them a different (search-engine-optimized) version of the site, and rank well for a desired keyword for a short period of time - that is, until the search engine discovers the unethical methods and removes the site from its results for those keywords.
Conclusion
Don’t do evil. Make websites for people and don’t deceive or trick them. Don’t do black hat SEO.
Q: Internal Links: Do my internal links pass page rank properly?
Creating a good internal linking strategy to pass pagerank from one page to another is an important part of SEO. Unless you have a good understanding of some basic technical concepts, it sometimes may be hard for you to figure out if your internal links are capable of passing pagerank. For example, you might have a fancy drop down menu of links that don’t appear until you mouse over the menu. How can you tell if the bots are crawling it?
One simple method is to sign your site up for Google Webmaster Tools and check out the Internal Links report. If the menu is passing page rank, you should see the page show up in the report as a page generating internal links to the pages for the items listed on the menu. Thanks Google.
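Another quick sanity check you can do offline (a sketch, not a simulation of how Google actually crawls): parse the raw HTML your server sends and list the anchor hrefs. Links that only exist after JavaScript runs, like some mouse-over menus, won't appear in this list and may be invisible to crawlers.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags in raw (pre-JavaScript) HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Example: a menu whose links are plain HTML is crawlable. If your
# menu items are missing from this list, bots may not see them.
html = '<ul id="menu"><li><a href="/widgets">Widgets</a></li></ul>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # ['/widgets']
```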
Tags: Internal Linking
12 responses so far
We have lots of URLs with noindex meta tags showing up in our report. It’s interesting because it backs up what some Googlers have suggested - noindex,follow allows page rank to pass thru the noindexed page and into the pages to which it links.
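If you want to audit which of your pages carry that noindex,follow directive, here's a minimal sketch. It's regex-based, so it only handles a simple `<meta name="robots" content="...">` tag with the attributes in that order; a real audit should use a proper HTML parser.

```python
import re

# Matches <meta name="robots" content="..."> (simple form only).
NOINDEX_RE = re.compile(
    r'<meta\s+name=["\']robots["\']\s+content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def robots_directives(html: str) -> set:
    """Return the set of robots directives on a page, lowercased."""
    m = NOINDEX_RE.search(html)
    if not m:
        return set()
    return {token.strip().lower() for token in m.group(1).split(",")}

page = '<head><meta name="robots" content="noindex, follow"></head>'
print(sorted(robots_directives(page)))  # ['follow', 'noindex']
```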
That said, it is still an extremely valuable tool no matter what, and I thank you for reminding me this morning to browse our internal links looking for poorly formatted URLs (i.e. ?sort=best-sellers, etc.). I found out a new employee has been interlinking products this way. Problem solved.
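That kind of poorly formatted internal link is easy to scan for. A small sketch (the parameter names in `SUSPECT_PARAMS` are examples I picked, not a standard list): flag internal links carrying sort/filter query strings, which can spread link equity across duplicate URLs for the same content.

```python
from urllib.parse import urlparse, parse_qs

# Example query parameters that often create duplicate-content URLs.
SUSPECT_PARAMS = {"sort", "order", "filter", "sessionid"}

def flag_messy_links(hrefs):
    """Return the hrefs whose query string contains a suspect parameter."""
    flagged = []
    for href in hrefs:
        params = parse_qs(urlparse(href).query)
        if SUSPECT_PARAMS & set(params):
            flagged.append(href)
    return flagged

links = ["/shoes", "/shoes?sort=best-sellers", "/about"]
print(flag_messy_links(links))  # ['/shoes?sort=best-sellers']
```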
Glad to be of help Everett.
You could also just run a site:www.example.com/ search to find out if the page is indexed, if you didn't want to sign up for Webmaster Tools.
Jeff, you are correct, except that your method doesn’t prove if the indexed page is passing page rank.
While the reports in Webmaster Tools can be used to reveal important, actionable information, I’m not so sure that it can be said that the internal link report is an indicator of passed PR. I just found a number of high-level pages where Google reported no internal links, when there are a number of static links, including from a sitemap.
While no doubt GWT is buggy and misses stuff (particularly since the new release a couple of weeks ago), I would check to see if the pages that have the links have been crawled and indexed by Google. If they haven’t been crawled yet they would not show up in the internal links report.
That said, if the page does show up, then it’s a good indication that it’s passing PR.
We review this on every site using not only GWT. Another tool we have used is Xenu; it's a 404 checker, but it also provides information on Level and Links In / Links Out. More importantly, we visually inspect the site and try to create link maps (usually hand drawn) to help figure out how to 'sculpt' the page rank. There are definitely some very scientific ways to do this, but we treat it more like an art and use site: in Google to monitor the result.
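Those hand-drawn link maps can be backed up with a tiny script. A sketch (the page paths are made-up examples): tally Links In / Links Out per page, in the spirit of what Xenu reports, from a list of (from_page, to_page) internal links.

```python
from collections import defaultdict

def link_counts(edges):
    """Count internal links in and out of each page from (src, dst) pairs."""
    links_out = defaultdict(int)
    links_in = defaultdict(int)
    for src, dst in edges:
        links_out[src] += 1
        links_in[dst] += 1
    return dict(links_in), dict(links_out)

edges = [("/", "/shoes"), ("/", "/about"), ("/shoes", "/about")]
links_in, links_out = link_counts(edges)
print(links_in)   # {'/shoes': 1, '/about': 2}
print(links_out)  # {'/': 2, '/shoes': 1}
```

Pages with many links in are the ones your site is concentrating page rank on, so a quick look at these counts shows whether that matches your intent.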
The difference between your techniques and the one I am suggesting above is that the only way (I think) to actually prove that Google is associating an internal link with the page it is linking to, is to see it show up in the internal links report on GWT. This still doesn’t mean it is passing PR, but it seems like the best indication available that it is.
Is there a fool proof way to determine if an inbound link passes page rank? I would like to know.
I don’t think there is a truly foolproof way (except perhaps by doing a very controlled test), but I do think using GWT is as foolproof as you can get.
I can't believe this post got 40+ sphinns… huh? What's your secret man?
I try to write posts that are actually useful to people other than hardcore SEM-types.
Monday, September 28, 2009
Black Hat Techniques: Does my website break Google's rules?
Posted by seodeveloper at 8:17 PM