Although SEO is no longer unfamiliar in China and has even grown into an industry, the field still lacks a truly scientific and systematic method of analysis. The reason probably lies in the peculiarity of search engine optimization itself: search engines guard their algorithms closely and publish only guidelines that never explain the reasoning behind them. As a result, many SEOers are playing a game whose specific rules they never learn, and that is the source of the confusion in this industry.
I have repeatedly stressed the importance of the "Google Website Quality Guidelines", because they are the only genuine rules that a search engine tells website owners. If you have not even mastered these rules, I do not know where you could find more authoritative guidance. But in practice, even if you have studied this guide carefully and already know the search engine's rules better than most people, knowing them is not enough. A scientific and systematic method of analysis can take you further.
I believe that after so many years of development, SEO should have moved past analysis based on gut feeling. The typical formulation of that kind of analysis is "I think the search engine would...". For example: "I think search engines aren't that stupid; they can certainly handle this", or "I think search engines would treat this as one of the ranking factors". If you do SEO by perceptual analysis, then the curve of your SEO traffic will be just as "perceptual". Of course, we should not make baseless guesses or rely on hearsay either, such as speculating about what search engines do without any theoretical basis, or blindly following statements made by search engine staff or authority figures.
Since search engines do not tell us their specific algorithms, how can we establish such a scientific and systematic method of analysis? The answer: start from a theory you know is definitely correct, and evolve it slowly in practice.
The analysis in the earlier article "How Web Page Loading Speed Affects SEO Effects" started from exactly such a well-established theory and arrived at another concrete factor that affects SEO traffic. In that process, the theory I was certain of is: a crawler must first crawl a page before that page has any chance of being indexed. From the data analysis that followed in that article, the conclusion was that web page loading speed greatly affects SEO traffic.
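As an illustration of the kind of data work behind that conclusion, here is a minimal sketch, not the article's actual analysis, that parses a web server access log, keeps only Googlebot requests, and reports daily crawl counts next to average response time. The log file name and the assumption that response time appears in milliseconds as the last field of each line are mine, not the author's.

```python
# Minimal sketch: daily Googlebot crawl volume vs. average response time.
# Assumes a combined-style access log whose last field is the response time
# in milliseconds; adjust the field indexes to your own log format.
from collections import defaultdict

def crawl_stats(log_path):
    per_day = defaultdict(lambda: {"crawls": 0, "total_ms": 0})
    with open(log_path) as f:
        for line in f:
            if "Googlebot" not in line:
                continue
            fields = line.split()
            day = fields[3].lstrip("[").split(":")[0]   # e.g. 10/Oct/2011
            response_ms = int(fields[-1])               # assumed last field
            per_day[day]["crawls"] += 1
            per_day[day]["total_ms"] += response_ms
    for day, stats in sorted(per_day.items()):
        avg = stats["total_ms"] / stats["crawls"]
        print(f"{day}  crawls={stats['crawls']:6d}  avg_response_ms={avg:7.1f}")

crawl_stats("access.log")
```

If the days with high average response time consistently show fewer crawled pages, that is the kind of evidence the article's conclusion rests on.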
The next step is to ask: what can affect web page loading speed? The network environment, the server hardware, and the CMS itself all affect it, and improving any one of them improves loading speed. It follows immediately that the network environment affects SEO traffic, the server hardware affects SEO traffic, and the speed of the CMS itself affects SEO traffic.
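To see which of the three is the bottleneck for a given page, you can break a single request's latency into its components. Below is a minimal stdlib-only sketch (the URL is a placeholder, not from the article): roughly, a slow lookup points at the DNS and network environment, a slow connection at network distance, and a slow first byte at the server hardware or the CMS.

```python
# Minimal sketch, stdlib only: split one request's latency into DNS lookup,
# TCP/TLS connection, and time to first byte.
import socket
import ssl
import time
from urllib.parse import urlparse

def timing_breakdown(url):
    parts = urlparse(url)
    host = parts.hostname
    port = parts.port or (443 if parts.scheme == "https" else 80)

    t0 = time.time()
    addr = socket.getaddrinfo(host, port)[0][4][0]          # DNS lookup
    t_dns = time.time()

    sock = socket.create_connection((addr, port), timeout=10)
    if parts.scheme == "https":
        ctx = ssl.create_default_context()
        sock = ctx.wrap_socket(sock, server_hostname=host)  # includes TLS handshake
    t_connect = time.time()

    request = f"GET {parts.path or '/'} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode())
    sock.recv(1)                                            # wait for the first byte
    t_first_byte = time.time()
    sock.close()

    return {
        "dns_s": t_dns - t0,                 # slow here: DNS / network environment
        "connect_s": t_connect - t_dns,      # slow here: network distance to the server
        "ttfb_s": t_first_byte - t_connect,  # slow here: server hardware or the CMS
    }

print(timing_breakdown("https://www.example.com/"))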
Then ask: what can be done to optimize the CMS itself? Enabling Gzip compression, merging CSS and JS files, reducing DNS queries, enabling caching, and so on all speed up the CMS itself. These items look familiar because they are exactly the suggestions in "Website Performance" inside "Google Webmaster Tools". But from the analysis above we can see that everything mentioned in "Website Performance" is optimization of the CMS itself; it says nothing about the network environment or the server hardware, even though you can now be sure those two factors also affect SEO traffic. If one day "Google Blackboard" or Google's official blog (which you need to bypass the firewall to read) publishes an article on how to choose a good hosting provider, don't be surprised: you already know why. Google has always used this approach to tell you how to optimize this or that factor, but from its standpoint it will not explain in detail why you should do it.
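A quick way to audit those CMS-level items on your own pages is to inspect the HTTP response. The sketch below is only an illustration, assuming the `requests` library and a placeholder URL; it reports the compression and cache headers, and uses rough text-based counts as proxies for the number of CSS/JS files and referenced hosts.

```python
# Minimal sketch: check whether a page already serves the CMS-level
# optimizations listed above (compression, caching, merged assets, few hosts).
import requests
from urllib.parse import urlparse

def check_page(url):
    resp = requests.get(url, headers={"Accept-Encoding": "gzip, deflate"}, timeout=10)
    html = resp.text
    hosts = {urlparse(u).netloc for u in html.split('"') if u.startswith("http")}
    return {
        "content_encoding": resp.headers.get("Content-Encoding", "none"),  # Gzip enabled?
        "cache_control": resp.headers.get("Cache-Control", "missing"),     # caching headers?
        "expires": resp.headers.get("Expires", "missing"),
        "css_references": html.count(".css"),   # rough proxy for "merge CSS files"
        "js_references": html.count(".js"),     # rough proxy for "merge JS files"
        "distinct_hosts": len(hosts),           # each extra host means another DNS lookup
    }

print(check_page("https://www.example.com/"))
```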
Through data analysis we can also learn which factors have a greater impact and which have a smaller one.
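One simple way to do that is to line up daily observations of each candidate factor against daily SEO traffic and compare their correlations. The sketch below is only an illustration; all numbers are made-up placeholders.

```python
# Minimal sketch: rank candidate factors by the strength of their
# correlation with daily SEO traffic (placeholder data).
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

seo_traffic = [1200, 1350, 1100, 1500, 1650, 1400, 1700]
factors = {
    "avg_response_ms": [820, 760, 900, 650, 600, 700, 580],
    "pages_crawled":   [40000, 45000, 38000, 52000, 56000, 47000, 58000],
    "pages_indexed":   [30000, 31000, 29500, 33000, 34000, 32000, 35000],
}

# Print the factors from strongest to weakest correlation with traffic.
for name, values in sorted(factors.items(),
                           key=lambda kv: -abs(pearson(kv[1], seo_traffic))):
    print(f"{name:18s} r = {pearson(values, seo_traffic):+.2f}")
```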
Many common-sense starting points can be evolved step by step in this way. The process is scientific: you can explain the reasoning clearly, both to yourself and to others. And as it evolves, you will find that you can control SEO traffic better and better. Each step of evolution means you understand search engines a little better, your SEO knowledge structure has improved a little more, and your ability to control SEO traffic has grown stronger. At the same time, you will find you have fewer and fewer conflicts with web designers and engineers, because good SEO never pits the interests of SEO against those of designers and engineers.
(Figure: knowledge structure, SEO controllability, department relationships)
Once you have been through many such analyses, they are bound to overturn much of your original SEO knowledge structure, because many of the SEO methods that circulated in the past were built on perceptual analysis: no explanation of why something should be done, no data to support it, not even a theoretical basis, and as a result they focused on the wrong things. As I said in "Word Segmentation and the Index Library", things you thought were details are actually the key points, and things you thought were key points can actually be ignored.
So, in daily SEO work, what abilities do you need in order to carry out this kind of analysis?
I don't know whether you still remember the four abilities I mentioned in "How to Learn SEO". In this kind of analysis:
1. Understanding the technologies and principles of search engines lets you understand them at a fundamental level, identify theories that must be correct, and find many clues worth analyzing.
2. Understanding website-building technologies tells you which factors on a site affect which aspects of the search engine, and what methods are available to fix a problem.
3. Data analysis ability lets you understand how the known factors affect SEO traffic and lets you mine new ones. A scientific, systematic SEO analysis process depends on data from beginning to end.
4. Understanding the search engine you are targeting matters because, no matter how hard you try, some problems cannot be explained by either data or theory. Every search engine, like a person, has its own temperament, and your familiarity with that particular engine can give you the answer. Knowing the engine well also surfaces more factors worth analyzing.
Finally, this scientific and systematic method of analysis, grounded in common sense, gives you better control over SEO traffic than knowing some of a search engine's algorithms would.
Many people may dispute this. For example, a while ago a friend told me that the founder of a certain foreign-trade B2C site had come from Google, so surely they could do SEO well. I said that was impossible, and only people who have built a search engine themselves will understand why. Take Alibaba's B2B site, which can also be regarded as a search engine: I know its ranking rules, but if you gave me a merchant's storefront and asked me to get traffic on Alibaba, I certainly could not do it well until I had a scientific and systematic method. A search engine's algorithm is not simple arithmetic; it is not the case that combining this factor with that factor automatically produces good traffic. The engineers who design a search engine know the weights of this or that factor and the approximate results they may produce, but they cannot control the specific outcome; otherwise people at Baidu would not have to check thousands of queries every day to verify the accuracy of the search results. Part of Google's success came from Yahoo adopting its search technology, which let Google accumulate a huge amount of data on which to practice and improve its algorithms.
Moreover, inside a search engine only a handful of people know the weight of every factor. Most of the engineers who build the engine are responsible for one specific task, optimizing and solving one specific problem: crawler engineers work on improving crawl efficiency, deduplication engineers work on reducing duplicate content in the index, and so on. If even the engineers who built the engine are in this position, how much could someone from a branch office in another country know? Otherwise, with so many engineers having left Baidu and Google, the algorithms would have leaked long ago.
If you build a small-scale search engine out of open-source components, you will understand this problem even better: even when you configure the ranking algorithm yourself, you cannot predict the search results it will produce. Running a search engine is one thing; getting traffic from search engines is quite another. Otherwise Google would not have had to discover, after the fact, that web page loading speed affects SEO traffic.
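To make that concrete without installing anything, here is a toy inverted index with TF-IDF scoring. The documents and the query are made up; the point is that turning a single ranking knob (here, how strongly document length is penalized) reorders the results, and at real scale such effects cannot be anticipated by inspection.

```python
# Toy search engine: an inverted index with TF-IDF scoring, to show how a
# single ranking parameter can reorder results in non-obvious ways.
import math
from collections import Counter, defaultdict

# Three made-up "pages"; real engines index billions.
docs = {
    "page1": "crawl speed",
    "page2": "server speed affects crawl speed and seo traffic",
    "page3": "data analysis for seo and web design quality",
}

# Build the inverted index: term -> {doc_id: term frequency}.
index = defaultdict(dict)
for doc_id, text in docs.items():
    for term, tf in Counter(text.split()).items():
        index[term][doc_id] = tf

def search(query, length_norm=0.0):
    """Score documents with TF-IDF; length_norm controls how strongly
    long documents are penalized."""
    scores = Counter()
    for term in query.split():
        postings = index.get(term, {})
        if not postings:
            continue
        idf = math.log(1 + len(docs) / len(postings))
        for doc_id, tf in postings.items():
            doc_len = len(docs[doc_id].split())
            scores[doc_id] += tf * idf / (doc_len ** length_norm)
    return scores.most_common()

# The same query under two settings of one knob: the top result flips.
print(search("seo speed", length_norm=0.0))   # page2 ranks first
print(search("seo speed", length_norm=1.0))   # page1 ranks first
```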
Article source: http://www.semyj.com/archives/1032 Author: Guoping