Tag Archives: optimization

Do follow blogs

A new search engine has been launched. It is built exclusively for dofollow blogs and is very easy to use.
We all know that dofollow blogs are a good way to post comments with links that transfer PageRank, and until now the most common tool for finding these blogs has been dofollow blog directories.
This search engine supports all languages, so you can search in whatever language you want.
You can generate custom RSS feeds for the keywords you want to track, and the moment the crawlers find articles containing those keywords you will be notified via RSS (one way to poll such a feed is sketched below).
The engine indexes on the fly, so the menu bar shows live statistics about the database.
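
If you want to check the keyword feed automatically instead of opening an RSS reader, a few lines of PHP are enough. This is only a minimal sketch: the feed URL below is hypothetical, and the real address is the one generated in your account.

<?php
// Minimal sketch: polling a keyword RSS feed with PHP's SimpleXML extension.
// The feed URL is hypothetical - substitute the one generated for your keywords.
$feedUrl = 'http://www.example.com/rss?keywords=seo+tools';

$rss = simplexml_load_file($feedUrl);
if ($rss === false) {
    die('Could not load the feed.');
}

foreach ($rss->channel->item as $item) {
    // Each item is an article in which the tracked keywords were found.
    echo $item->title . ' - ' . $item->link . "\n";
}
?>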

If you create an account, you can track statistics about the links you have placed in comments on indexed articles. This is how you will know whether your comment was approved, whether the link carries a nofollow tag, and more.

The search engine is free for articles with PR0. If you want to see articles with PR5, for example, you have to pay for it; this measure covers the cost of server resources.

The address is WWW.BLOGFOLLOWER.COM, a search engine for dofollow blogs.


How not to start link building for a site - mistakes in SEO

The market is full of tools that promise top 10 positions in Google, 100,000 backlinks in seconds and other miracles.
Here is a list of things NOT to do in a link building strategy.

1) Do not submit your site to more than 50 directories in a month. I am talking about directories where you are sure your site will actually be listed.

Submit to more and your site will get the sandbox effect.

2) Do not make site-wide link exchanges. Site-wide means a link placed on every page of the site. You will see it does not help you more than a link exchange with a single relevant page that ranks well in the SERPs.

3) Do not place your link in forum signatures. You will see no difference in the SERPs. Why? It is like the site-wide link exchange, only even less relevant.

4) Do not make link exchanges with sites that have more than 100 links on a page (internal and external links combined). First of all, Google will treat that page as spammy. Try placing more than 100 internal links on several pages, wait 3-4 days and check the messages in Webmaster Tools: you will get a warning from Googlebot that your pages carry too many links. (A quick way to count the links on a page yourself is sketched after this item.)

The second reason is that the PR juice will be divided into 100 pieces and you will get only 1/100 of the “cake” - and that is in the best case, where all 100 links are relevant to the page.
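
Before agreeing to an exchange you can count the links on the partner's page yourself. This is only a rough sketch (the URL is hypothetical) using PHP's DOM extension, assuming allow_url_fopen is enabled:

<?php
// Rough sketch: count the links on a potential partner's page.
$url  = 'http://www.example.com/links-page.html'; // hypothetical partner page
$html = file_get_contents($url);

$doc = new DOMDocument();
libxml_use_internal_errors(true); // real-world HTML is rarely valid
$doc->loadHTML($html);
libxml_clear_errors();

$links = $doc->getElementsByTagName('a');
echo 'Links on the page: ' . $links->length . "\n";

if ($links->length > 100) {
    echo "More than 100 links - think twice before exchanging.\n";
}
?>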

5) DO NOT USE MEANINGLESS ANCHORS when you submit your site to an article directory or anywhere else. USE ONLY THE KEYWORDS you want to optimize for.

6) In a link exchange, if the same anchor is used on a page more than once, pointing to different sites, there is a problem. What should you do? Append a “-” to the anchor when you cannot use other keywords. Why? Google escapes non-alphanumeric characters in search queries, so why not try it in anchors: if you search for “seo” or “seo -” in Google you get the same results.

7) Do not build lots of backlinks in a short time unless they are very relevant!! You will get the sandbox effect!

8) Link exchanges or one-way links MUST BE MADE only with content relevant to your site. DON'T CHASE PageRank. You need higher positions in the SERPs, not a bigger PageRank.

9) When exchanging, use URLs that have keywords in them. This is crucial.

10) Do not create sites on the same IP just to exchange links between them. Those links have no value.

11) Do not make exchanges only with your home page. Use internal pages of your site in link exchanges as well.


SEO friendly URLs after submitting a form

What does that mean?
It means that with a GET form, when you submit, you are sent to a new page with “?example=variable” appended to the URL.
Think about it: if your users search for something on your site and want to share the link with the results on blogs or forums, they will share an ugly URL, and above all you lose character space that could hold more keywords for SEO.
Here is an example:
This is the default URL after submitting the form:

www.example.com/search?category=some+simple+data+that+makes+your+url+inefficient

And a pretty one:
www.example.com/some-simple-data-that-makes-your-url-efficient-for-search-engines-s

In the second example, at the same total length, the link has 24 more characters available for keywords than the previous one.
How can we do this? Simple.
We will use a form with the POST method. The data is sent via POST to a script that redirects to a new location with the rewritten URL.

<form method="post">
<input name="data" type="text">
<input type="submit">
</form>

The PHP script that receives the POST should start with this:

if (isset($_POST['data']) && $_POST['data'] != '') {
    // turn the submitted phrase into a hyphenated slug and redirect to the pretty URL
    $slug = preg_replace('/[^a-zA-Z0-9]+/', '-', $_POST['data']);
    header('Location: http://www.example.com/' . $slug . '-s');
    die();
}

and the .htaccess file should contain this rule (if you are running WordPress, do not use this method):

RewriteRule ^([a-zA-Z0-9-]+)-s$ redirected-page.php?data=$1 [L]

That is all.
redirected-page.php is the page that receives the rewritten request and displays the results.
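
For completeness, here is a minimal sketch of what redirected-page.php might look like; the original post does not show it, so take it only as one way to turn the slug back into a search phrase:

<?php
// redirected-page.php - minimal sketch, not taken from the original post.
// mod_rewrite passes the slug in $_GET['data'], e.g. "some-simple-data".
if (!isset($_GET['data'])) {
    die('No search data.');
}

// Turn the hyphenated slug back into a normal search phrase.
$query = str_replace('-', ' ', $_GET['data']);
$query = htmlspecialchars($query, ENT_QUOTES, 'UTF-8');

echo '<h1>Search results for: ' . $query . '</h1>';
// ...run the actual search with $query and print the results here...
?>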


Google cache page limit of 101KB can be raised to 800KB

:) Isn't this funny to read? Many of you have wondered why Google's cached pages stop at a 101KB limit and concluded that you should write less, move links above the 101KB cutoff, or otherwise squeeze the valuable content into 101KB at most. There is a solution.

The 101KB limit is not the cache size of a page, it is THE BANDWIDTH LIMIT FOR DOWNLOADING A PAGE. You read that right: you can have cached pages of about 800KB if you compress your content before it is sent to the browser.

In PHP I use gzcompress and can shrink the content roughly five-fold. :) It adds some CPU load, but not much.
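
The post mentions gzcompress; a common way to actually serve compressed pages from PHP is output buffering with ob_gzhandler, which negotiates gzip with the client for you. A minimal sketch (my illustration, not code from the post):

<?php
// Put this at the very top of the script, before any output is sent.
// ob_gzhandler checks the Accept-Encoding header and gzips the buffered
// output only for clients (and crawlers) that support it.
ob_start('ob_gzhandler');
?>
<html>
<head><title>A large page served compressed</title></head>
<body>
<p>Several hundred KB of content can sit here and still transfer
well under the ~101KB download limit.</p>
</body>
</html>
<?php
ob_end_flush(); // flush the compressed buffer to the browser
?>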

The more content on a page, the more organic traffic you get.


What the Google sandbox was and what it is today

The sandbox was a penalty for sites that gained backlinks unusually fast in a short time. It was an automated process with a drastic effect in the SERPs: most pages in the sandbox lose many positions, end up at the bottom of the rankings and stay there for a long time.

Many webmasters who only “think” they know what the sandbox means say that the way to escape it is to gain more links - more valuable links, let's say more relevant links with PR pointing at your optimized page. This is a mistake: if links are what put your page in the sandbox, why would you try to gain even more backlinks?

I have tested a lot on sandboxed sites, and the only way to get out IN A SHORT TIME IS TO ELIMINATE THE BACKLINKS THAT BURIED your pages in the sandbox. I once had a page on another site of mine, a games site. I tried to get backlinks for it: roughly 10 site-wide links from 10 other gaming sites with PR between 3 and 4. Within 2 weeks that page dropped many positions in the SERPs and I lost at least 80% of its traffic (approx. 800 unique visitors/day). After those 2 weeks I concluded that the 10 site-wide links must be the reason for the sandbox effect, so I removed 60% of them from the partner sites. They were all one-way links.

Within 4 days my page climbed back up to 3rd position in the SERPs and my traffic doubled.

THE SOLUTION WAS TO ELIMINATE THE BACKLINKS THAT BURIED ME IN THE SANDBOX.

In fact, it is not the number of backlinks that can drop you in the SERPs but the PR behind them. If you have a new page or site that has been indexed in Google for at least 3 months and it gains 2 PR6 backlinks, you can be sure you will see the effect. If they are natural links - links placed in an article as part of a natural phrase (phrases can be checked for logical sense and human relevance with very simple algorithms) - you will see no sandbox effect.

Nowadays Google positions pages in the SERPs with a new algorithm. Fresh content gets a 100% rise in the SERPs and after a month it drops... a lot. This happens to all new pages.

It is not the Google dance, it is not the sandbox; it is a new pain you have to live with.


Optimization for Google

Important factors in optimizing for Google:

* web domain age
* domain name
* links to the site (IBL - Backlinks) and obviously their quality
* content that is offered to visitors

Web Domain Age

Google gives great importance to old and stable sites; this is obviously one of the major factors used to rank them in the search engine.

Domain Name

If you choose a domain name containing your keywords, you will have a better chance of reaching the first page of results.

IBL - Backlinks

Links to your site should come from the same niche. Links from old websites are important too; they can send traffic and increase PageRank.

Content

Create unique content, in both quality and quantity. Google looks for sites that are updated regularly and have rich content. Do not try to make clones of your site; duplicate content is penalized by Google.


On-Site Optimization

On-Site Optimization refers to the optimization of the website “infrastructure”:

1. Tag <title>
2. Meta tags
3. Alt tags on the images
4. Text formatting (headings, bold, italic)
5. robots.txt file

The title tag is maybe the most important here, not only for SEO but also for the CTR (click-through rate) in the SERP (search engine results page). The title of each page must be different, so it is not considered duplicate content, and it should contain 2-4 keywords. For example, a page title of the form:

Kw1, kw2 – site name, description with kw3 in it (in my opinion the most appropriate method)

Most of the meta tags are purely informative; the only ones that matter are the meta description and meta keywords. Whatever is written in the meta description usually appears as the snippet under the title in the SERP. It is important to have one or two keywords here too.
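
Putting the title pattern and the meta tags together, the head of a page might look like this (the keywords and site name are placeholders of my own, not taken from the post):

<head>
<!-- 2-4 keywords, then the site name and a short description with another keyword -->
<title>Link building, SEO tools - ExampleSite, guides for backlink analysis</title>

<!-- usually shown as the snippet under the title in the results page -->
<meta name="description" content="Practical link building and SEO tool guides, with tips on backlink analysis.">
<meta name="keywords" content="link building, seo tools, backlink analysis">
</head>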

Headings, bolding and italicizing the text matter less, but they do matter. Obviously, it is the keywords that get bolded, not something else. The idea is that when the search engine sees the markup around a bolded word, it decides that the word is more important than the rest of the page and might be a keyword.

Robots.txt is a file in the root of the site from which you can control how search engine crawlers access your site: a Disallow line keeps them out of certain pages. Its companion, the robots meta tag, lets you ask that a page not be indexed (noindex) or that its links not pass on the site's authority, for example PageRank (nofollow). With robots.txt, the robots meta tag and some internal linking, you can control how PageRank flows through the site and focus it on the important pages.
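
As an illustration (the paths are made up), a robots.txt that keeps crawlers out of low-value sections could look like this:

# robots.txt in the site root - hypothetical paths, for illustration only
User-agent: *
Disallow: /admin/
Disallow: /search-results/

and, on an individual page, the robots meta tag variant that keeps the page out of the index and stops its links passing PageRank:

<meta name="robots" content="noindex, nofollow">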
