Various expert statements circulating on SEO blogs across the internet about tricks to increase our blog's PageRank can be a little confusing. And no wonder: following one of those instructions turned out to cause me new problems.
STRANGE INSTRUCTIONS
On one of the blogs I read, the owner states that the trick for a beginner blogger to increase PageRank is to post, post, and post; SEO techniques can be learned later, over time. The reasoning is that Google indexes first the blogs it considers productive, i.e. blogs with a lot of posts.
After thinking it over, this opinion has a point. Logically, if you are the employee who works most productively in a company, for example your colleague goes home at 16:00 while you keep working until 20:00, then management is bound to notice you before your co-workers.
Based on this thinking, I then went all-out writing posts, and without realizing it my blog ended up in a state where pages were restricted by robots.txt, which means the search engine's spider robots had limited the crawling process on my blog.
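To see concretely what "restricted by robots.txt" means, here is a minimal sketch using Python's standard urllib.robotparser; the blog address and post URL are hypothetical placeholders, not my real ones.

from urllib.robotparser import RobotFileParser

# Hypothetical blog address, used only for illustration.
BLOG = "http://myblog.example.com"

parser = RobotFileParser()
parser.set_url(BLOG + "/robots.txt")
parser.read()  # download and parse the blog's robots.txt

# Ask whether Google's crawler is allowed to fetch a given post.
post_url = BLOG + "/2012/01/some-post.html"
if parser.can_fetch("Googlebot", post_url):
    print("Googlebot may crawl:", post_url)
else:
    print("Restricted by robots.txt:", post_url)

If can_fetch returns False for your post URLs, the spider is being turned away, which is exactly the situation the Webmaster Tools warning describes.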
I found out about this situation after running a diagnosis through Webmaster Tools, which showed quite a lot of crawl errors; please read my earlier post on the crawl problem.
I carried out several removal steps through the Remove URLs menu, but they did not adequately address the problem, because every diagnostic run would again show links that were not found or restricted by robots.txt.
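You can double-check those "not found" entries yourself by requesting each post URL and looking at the HTTP status code. A minimal sketch, again with hypothetical URLs:

from urllib.request import urlopen
from urllib.error import HTTPError

# Hypothetical post URLs, used only for illustration.
urls = [
    "http://myblog.example.com/2012/01/first-post.html",
    "http://myblog.example.com/2012/01/deleted-post.html",
]

for url in urls:
    try:
        status = urlopen(url).getcode()
        print(status, url)   # 200 means the page is reachable
    except HTTPError as e:
        print(e.code, url)   # 404 here matches a "not found" crawl error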
It turned out the post pages had already been indexed by Google, so copies were stored in its cache, which is why the removal I performed was only temporary in nature.
Fortunately, I read an inspiring article on rizkyzone about the step of removing the blog's URL in the Webmaster Tools dashboard, with the aim of temporarily disconnecting my blog from Google for a period of about one week.
Once that period is over, the blog will be submitted through Add URL again.
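To help Google rediscover the posts after re-submission, a common companion step is providing a sitemap. Below is a minimal sketch that writes a basic sitemap.xml for a few posts (the URLs are hypothetical placeholders); the resulting file can then be submitted in the Sitemaps section of the Webmaster Tools dashboard.

# Hypothetical post URLs; replace with the blog's real ones.
urls = [
    "http://myblog.example.com/2012/01/first-post.html",
    "http://myblog.example.com/2012/01/second-post.html",
]

# Write a minimal sitemap.xml that lists each post once.
with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in urls:
        f.write("  <url><loc>%s</loc></url>\n" % url)
    f.write("</urlset>\n")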
CONCLUSION
* Do not publish too many posts at once on your blog, because it can cause crawl errors
* Post periodically (every 2 days), with a maximum of 2 posts each time
* Routinely check your blog through Google's own Webmaster Tools