You keep trying and trying, and still nothing comes of it? You are starting to ask yourself: why does google hate me? I understand the urge to give Google human characteristics, but remember that it is just one of many search engine algorithms. Your job is not to befriend it, but to choose the right methods so that it reads your site as valuable.
There are many myths around SEO (search engine optimization), and we will confront some of them today. When you set out to create a blog, website, or online store, always keep in mind that even the best graphics and the best idea will not work without readers or customers.
Most of us depend on earning money from our Internet activities. The exceptions are charities or true enthusiasts, but even they need to break through, to collect donations or have the content they want to share actually read.
Let's get to the specifics. Below is a list of the most popular SEO mistakes that make it feel like "Google hates you".
Once at work, we had the task of preparing a list of things that, in our opinion, affect positioning. It gathered all the sensible recommendations based on our experience and on consultations with other SEO specialists.

Ultimately, the table consisted of 120 items, many of which were (from my point of view) absurd, such as how many people have bookmarked the page.
If you ask anyone who deals with website positioning, you will hear a tirade of things that affect the results. In my opinion, there are about 10 important factors, including the two most important ones. If you have read the entry about how Google works, you know that the factors that largely determine the final result are the content on the page and the sources that link to it.
Virtually every client I have worked with started from the wrong assumption: website first, keyword analysis later. And that is assuming they had any idea such analysis was needed at all.
I often received a list of words that, after checking, turned out to be completely off the mark. Why do you think I put the phrase "why does google hate me" in the first paragraph? The answer is simple: you probably just typed it into Google, and I mentioned it again in the previous sentence.
This is the keyword for this text. It is this phrase that is meant to bring someone to this post and get them to read it. By no means does that mean it should be spammed into every other sentence; moderation is key, but more on that later.
Keyword analysis is the absolute foundation from which I start any work, whether on clients' pages or my own. I cannot possibly know the chemical, horticultural, nutrition, and hundreds of other industries. After the analysis, I present it to the client and together we decide which phrases are actually good and which will not convert.
The basic process of analysis is described here -> How to find niche
It is similar with your blog or online store. If you have not done the analysis, or did it wrong, do not count on readers or customers. However, if you are sure the analysis was done well, move on to the next step.
The second item on the list of the most common SEO mistakes is more complex. Remember that content is the basis of good results. The exception is brands with large advertising budgets; Coca-Cola, for example, does not need a website with a blog.
However, in your case and mine, we are just one of many, and to reach a wider audience we need to interest people in something. Before we get to the originality and length of texts, let's deal with the most common mistake:
Copying content
Do not do this. Never. I could stop there, but I doubt that I would convince you that way.
The first deterrent is copyright. Most texts available on the Internet are protected against copying. None of us wants our work to be unlawfully stolen and used by other people.
The most common consequence is an e-mail demanding that the copied content be deleted. In case of non-compliance, the court and the whole legal procedure come into play, and that will not end well.
Suppose, however, that this does not bother you and you want to risk it. In that case, Google itself will give you a reason to feel hated. There are billions of pages on the Internet; imagine that your query returned 100 results that say exactly the same thing, word for word. Such a search engine would make no sense.
Google promotes original content that does not duplicate existing patterns. This by no means requires reinventing the wheel. Some matters are not up for debate, but they can be described in various ways. When you compare my article with others, you will see that it covers similar mistakes in many places, but I do it in my own way and expand the explanations with my own experience.
The problem arises with online stores and product descriptions, especially specialized ones. It is hard to come up with a unique description for "isopropyl alcohol", especially when the store's assortment consists of several thousand products.
In online stores I always use an approach that focuses mostly on categories and the blog. A product on offer today may be gone tomorrow, so it makes no sense to spend much time on it. In stores with a smaller number of goods, in categories like home or electronics, it is easier to be inventive, and there I definitely recommend writing individual descriptions.
Too short or not valuable content
We already know that Google's algorithms do not like pages that copy content, so we create new text. You start with keyword analysis and manage to choose 10 phrases that have no competition and generate decent traffic. You stuff them into the text, the page gets indexed, and you land on page 8 of Google.
The algorithms dislike spam just as much as copied content. This text already has 900 words, and keywords make up only about 3% of it. Finding the right phrases and spamming them throughout the text is no art. Content is created primarily for people; SEO concerns must stay in the background.
There is no single keyword percentage you should stick to. One person will say 3%, another 1%; as many specialists, as many opinions. For my part, I recommend reading the entire text after you finish writing and checking whether the chosen words stick out. If they do, it is a sign some of them should be thrown out.
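If you prefer a number over gut feeling, a short script can do the counting for you. Below is a minimal Python sketch; the phrase and the file name are placeholders to replace with your own:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Share (in %) of the text's words taken up by occurrences of the phrase."""
    words = re.findall(r"\w+", text.lower())
    phrase_words = phrase.lower().split()
    hits = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    # each occurrence accounts for len(phrase_words) words of the article
    return 100.0 * hits * len(phrase_words) / max(len(words), 1)

# "draft.txt" is a placeholder for your article file
with open("draft.txt", encoding="utf-8") as f:
    article = f.read()

print(f"keyword density: {keyword_density(article, 'why does google hate me'):.2f}%")
```

If the number comes out well above the few-percent range mentioned earlier, that is a good signal to prune.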
The next aspect is the length of the texts; they should not be too short. Suppose you want to write a text about sports shoes. I recommend always entering the phrase in Google and looking at the first and second page of results. Then analyze the length of the texts on those pages and write a longer one, or at least one within the upper range.
If the first page of Google is occupied by pages with around 20,000 characters of content, then, unfortunately, with 5,000 characters you do not stand much of a chance (unless you are the NY Times or something similar).
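Counting competitors' characters by hand is tedious. Here is a rough Python sketch that fetches a page and estimates the length of its visible text by stripping HTML tags; the URL is a placeholder for the top results you found, and the number is an upper-bound estimate because navigation and footers are counted too:

```python
import re
import urllib.request

def text_length(url: str) -> int:
    """Estimate the character count of a page's visible text."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    html = re.sub(r"(?s)<(script|style)[^>]*>.*?</\1>", " ", html)  # drop scripts/styles
    text = re.sub(r"<[^>]+>", " ", html)                            # strip remaining tags
    return len(re.sub(r"\s+", " ", text).strip())

# placeholder URL - paste the top results for your phrase here
for url in ["https://example.com/sports-shoes-guide"]:
    print(url, text_length(url))
```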
I treat keyword analysis and text writing as one whole. In my opinion, these aspects are responsible for 50% of success. The second factor that will make Google stop hating you is linking from external sites, to which I assign 40% of the weight.
What link building is and how it works, I will try to cover in a separate article. For now, you need to know that creating very good content for a low-competition phrase does not guarantee the top 10.
To improve those chances, other websites should link to ours. This is how Google judges whether a given text deserves attention and should be shown to other users. After all, if many other pages point to this particular one, the content there must be valuable.
The problem with such linking is that we cannot control it 100%. When creating a text, we can only hope that someone will find it interesting and decide to share a link to it.
Of course, it is worth being active on social media and sharing links to your content, but know that a link from FB, Twitter, or Pinterest does not matter in itself; what matters is that someone sees it, gets interested, and puts a link on their own website.
The method described above is called organic linking. It is ideal, and for most people unattainable. There are millions of topics, but not every one is "sexy" enough to make people want to link to it. New websites usually stand no chance of earning such links. That is when inorganic linking comes into play, with which you can both help yourself and harm yourself.
We can add links to our site ourselves. The main sources of such links are internet forums and directories. Google generally dislikes artificial link acquisition, but it recently released an official statement saying that a complete lack of links is unnatural and may signal a lack of engagement. So, according to its representatives, there is nothing wrong with this method as long as you do not abuse it.
Does it work? Yes, especially for low-competition phrases and at the very beginning. There are many examples on the internet showing the effects of linking to posts, and I have confirmed the effectiveness of this method myself. Still, I am very careful with it; endlessly building links to a handful of texts makes no sense to me, and it is much better to focus on creating new content.
Not every link carries the same weight. A link from a highly trusted site such as the Washington Post is treated differently than one from a spammy repost page. Tools such as Semrush are used to check the credibility of a given site. If you do not want to spend money on software, you can guess to some extent whether a source is valuable: a mass of outbound links or poor content means it is probably not worth adding a link to that directory or page.
Too many links appearing within a short period can result in a ban on the site or part of it. As with everything, moderation is needed. It is possible that your results are poor because you have a ban; you can check this in Google Search Console, in the Manual actions tab.
I advise against buying links from anyone. Believe me, 1000 links for $5 is not a bargain. Most links obtained this way will do nothing, and they can even cause harm.
Another factor that may cause poor results is a low CTR (click-through rate). This is an official factor; I attribute only a fraction of importance to it, but it is still worth keeping it high.
Let's say your site sits in 14th position in Google and you are doing everything to push it to the first page. Users who see it on the second page skip it, clicking more often on other results with enticing titles and descriptions.
Why would Google promote your site if users choose other content? Check whether your title and description contain keywords and a call to action, e.g. check, find out, call, etc.
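As an illustration, a title and description for a post like this one could look like the snippet below. The wording is only a sketch, not a template; note the keyword near the front and the call to action at the end:

```html
<title>Why Does Google Hate Me? The Most Common SEO Mistakes - Check Your Site</title>
<meta name="description"
      content="Wondering why Google hates you? Find out which SEO mistakes are
               holding your site back and check how to fix them today.">
```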
By poor optimization I mean page loading speed, graphic glitches, and errors on the page itself.
Page speed
In literally every SEO guide you will find the claim that it is important to the search engine that the page loads quickly; after all, Google itself provides a tool for measuring speed, so it must mean something.
The problem is that I have never noticed a change in position after speed optimization. This is my individual opinion: page speed has no direct impact on the results. Many specialists will probably disagree, and a good discussion is always welcome.
However, website speed does matter to the user, especially on mobile devices. Every year the internet gets faster and cheaper, which shortens users' patience; even tenths of a second matter. That is why I recommend checking the speed of your website.
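Google's measuring tool is PageSpeed Insights, and it can also be queried from the command line through its public API; the site address below is a placeholder:

```
curl "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://yoursite.com&strategy=mobile"
```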
Errors on the site
None of us likes errors that make it hard to read articles, navigate the site, or make purchases. The most common ones are broken internal links. If a category link or a page I want to reach does not work, it immediately discourages me from using the site.
The situation is similar with poor responsiveness or graphic glitches that block or hinder an action.
Immediately after launching the page, add it to Google Search Console. It is a tool for webmasters that watches over the proper functioning of the site. In it you can review the errors Google's crawler encounters, check whether a subpage is indexed, and if it is not, submit it for indexing.
In GSC we additionally submit the robots.txt file and the sitemap.xml; the latter makes page indexing easier. The sitemap shows Google's robots, practically point by point, which pages exist within the domain. It is an absolute foundation that every page requires.
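A minimal sitemap.xml is a short XML file; the addresses and the date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
  <url>
    <loc>https://yoursite.com/blog/why-does-google-hate-me/</loc>
  </url>
</urlset>
```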
Since I mentioned robots.txt: always check whether it contains the "Disallow: /" directive. A line like that blocks crawlers from accessing all the content on the page; just remove the "/" for the page to start being indexed. Sometimes you may want a subpage not to be indexed; in that case you enter the directive: Disallow: /subpage/
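For clarity, the two variants mentioned above look like this in robots.txt (two alternative files, not one):

```
# blocks crawlers from the entire site - remove the "/" to allow indexing
User-agent: *
Disallow: /
```

```
# blocks crawlers from a single subpage only
User-agent: *
Disallow: /subpage/
```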
Remember that the crawler only visits the page from time to time, so your changes will not be visible immediately.
The next mistake appears most often in online stores. The number of crawled subpages runs into the thousands, and many of them contain nothing, because they are duplicates or the products have been discontinued.
Such pages have no value, so it is worth excluding them from indexing; those already in the index should be redirected to the appropriate subpages with a 301 redirect.
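As a sketch, assuming an Apache server (the paths are examples), a single line in .htaccess is enough; on other stacks the mechanism differs, but the 301 status code is the same:

```
# .htaccess - permanently redirect a discontinued product to its category
Redirect 301 /products/old-kettle /category/kettles
```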
404 errors, in turn, are pages that should appear when a user enters a wrong address on your site, e.g. yoursite.com/animasl. Here the user swapped two letters and should get a page with a 404 error. This is normal and there is nothing wrong with it.
The problem occurs when pages that should work do not, and the Google robot entering your site receives a 404 page. The more such errors, the greater the negative impact on the perceived quality of the site.
The tenth aspect you may be neglecting, and which causes poor results, is internal linking. It is especially important in online stores with an extensive structure.
Internal links make it easier for your users to navigate the page. They are worth using for the sake of UX alone.
SEO matters here as well. Internal links pass authority between subpages and help the indexing robot discover your pages.
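In practice this comes down to ordinary links with descriptive anchor text; the address and wording below are illustrative:

```html
<!-- a descriptive anchor tells both users and robots what the target page is about -->
<p>Before buying, read our
  <a href="/blog/how-to-choose-running-shoes/">guide on how to choose running shoes</a>.
</p>
```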
At the beginning I mentioned the need to create original and interesting content. Remember, though, that even Wikipedia has pictures. Your site cannot be a wall of text; that does not help positioning.
Divide the text into sections separated by headers. Every now and then add photos and videos; multimedia have a positive impact on positions, and on top of that they can get you into Google Images or display your YouTube video in the results.
Of course, graphics should be optimized for size and given alternative "alt" texts and titles.
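An example of a properly described image; the file name and texts are illustrative:

```html
<img src="/images/red-sports-shoes.jpg"
     alt="Red sports shoes for running on asphalt"
     title="Red sports shoes"
     width="800" height="600">
```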
Google representatives have repeatedly emphasized that fresh content helps achieve good results. The idea is to regularly add new content and update the existing content.
From my experience, updating in most cases means extending existing texts with phrases found in Google Search Console. I have tested this on several hundred articles; the effect is not always staggering, but it often ends in an actual increase in position.
On the internet you will easily find pages that have not been updated for several years whose individual posts still rank high in the search engine. Usually, however, these are low-competition phrases, and the domain is simply old.
For the end I have left the question of analyzing your own and your users' activity. If you do not learn from mistakes, do not expect results. If you are not patient, you will end up either spending money on advertising or giving up.
Unfortunately, finding your audience on the internet is not so easy, but when you do, the benefits are huge.
I always recommend installing Google Analytics (some believe it is a ranking factor in itself). It is worth taking a course on how to read the data from this tool.
Look at the queries users type, shown in Google Search Console. It is a gold mine of keywords that can make you visible on the web.
Other factors that may be relevant
a) SSL certificate
b) Pages with a privacy policy, about us, and contact. The contact details on your site and those in, e.g., the directories where you link should be identical.
c) Text structure – in addition to headings, it is worth using lists, tables, bold, italics, etc.
d) Schema – structured data that describes what is on the page. It can also increase CTR thanks to richer visibility in Google; see the sketch below.
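A minimal sketch of structured data in JSON-LD; the headline, author name, and date are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Why does google hate me? The most common SEO mistakes",
  "author": { "@type": "Person", "name": "Author Name" },
  "datePublished": "2020-01-01"
}
</script>
```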
The search engine works in a strictly defined manner, so following the relevant guidelines must produce good results. If that were not the case, neither I nor other SEO people would be able to achieve reproducible results. Why does google hate me?
If after reading the article you are still asking yourself this question, go back to your page and check how many of these points it meets. If you believe you are doing everything right, I encourage you to leave a comment describing your problem, and I will try to help you individually.