How do they do it?

Well... there is a lot of BlackSEO activity out there, and each operator has their own techniques.

Some of them simply post link-laden comments on forums, blogs and other social or collaborative sites.

Others also go looking for vulnerable sites and applications. When they find one, they create new pages or modify existing ones, inserting their links and text. They may even modify sitemaps and other search-engine-related files to amplify the impact of their activity.

Sometimes they write dynamic code that produces different output depending on who visits the page and how: a technique known as cloaking.

For example, they can tell a visitor is a search engine by looking at the "User-Agent" header of the HTTP request, since search engine robots identify themselves with well-known user agent strings.
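Here is a minimal sketch of that check in Python, assuming a handler that receives the request headers as a plain dict. The function names are made up for illustration, but the crawler tokens are real ones used by major search engines.

```python
# Hypothetical sketch: serve different content to crawlers and humans
# based on the User-Agent header.

CRAWLER_TOKENS = ("Googlebot", "bingbot", "Slurp", "DuckDuckBot")

def is_search_engine(headers: dict) -> bool:
    """Guess whether the visitor is a crawler from its User-Agent."""
    user_agent = headers.get("User-Agent", "")
    return any(token in user_agent for token in CRAWLER_TOKENS)

def render_page(headers: dict) -> str:
    if is_search_engine(headers):
        # Crawlers get the keyword-stuffed, link-filled version.
        return "<html>...keywords and spam links...</html>"
    # Everyone else sees a harmless-looking page.
    return "<html>...normal content...</html>"
```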

But most browsers and tools can be configured to send arbitrary user agent values, so any user can set one belonging to a search engine robot. To guard against this, some BlackSEO scripts check the source IP address too. That is much harder to forge.
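One way such an IP check might look, again as a sketch: the reverse DNS lookup followed by a forward confirmation is the procedure Google itself documents for verifying Googlebot.

```python
import socket

def ip_belongs_to_googlebot(ip: str) -> bool:
    """Check whether a claimed Googlebot visit really comes from Google.

    A genuine Googlebot IP reverse-resolves to a host under
    googlebot.com or google.com, and that host resolves back
    to the same IP.
    """
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse DNS lookup
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-confirm: the name must resolve back to the same IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False
```

Unlike a header, the source address of an established TCP connection cannot simply be typed in by the visitor, which is why this check is far more reliable than the user agent alone.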

They can also tell whether the visitor arrived by following a link on a search engine results page: the "Referer" HTTP header gives them that information. Visitors who clicked on a search result may be redirected to an online shop, for example.
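A sketch of that last trick, with a hypothetical handler and an illustrative shop URL:

```python
SEARCH_ENGINE_HOSTS = ("google.", "bing.", "duckduckgo.")

def handle_request(headers: dict):
    """Hypothetical handler returning (status, extra headers, body)."""
    referer = headers.get("Referer", "")
    if any(host in referer for host in SEARCH_ENGINE_HOSTS):
        # Visitor came from a results page: send them to the shop.
        return 302, {"Location": "https://example-shop.test/"}, ""
    # Direct visits (including curious site owners) see nothing odd.
    return 200, {}, "<html>...normal content...</html>"
```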

Next: Should I care?