Everyone looks at a lot of SEO data every day, but most of it is closed after a quick look, let alone recorded. In fact, this data is very interesting and important, and developing the habit of recording it is very helpful for long-term analysis later on, sometimes even essential. Today I will talk about four aspects, extending these basic data points along the way; I hope it is useful to everyone.
First, the website's daily indexed page count (by channel and column)
I think many webmasters check the website's indexing status first thing every day to get a general picture of the site. Doing so serves two purposes. First, it records the website's current indexing status; second, recording and summarizing the numbers every day and turning them into a chart allows overall analysis. Once we know the website's overall indexing status, we have a general sense of the search engine's indexing rules, and based on the number of articles we publish we can predict the website's approximate traffic quite well.
Suggestions and tools: For a small website, just record the whole site; for a slightly larger site, record each channel and column separately, so the analysis will be more accurate. As for tools, beginners should just use the site: syntax. Tools are not recommended at first because they add to the workload; once you are familiar with the process, you can gradually use tools to do sampled queries. A minimal recording sketch follows below.
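As a minimal sketch of this recording habit (the file name and the sample figures below are made up; the counts are assumed to be read off manually from site: queries or from a webmaster tool), a small Python script can append each day's indexed-page counts per channel to a CSV for later charting:

```python
import csv
import datetime
import os

LOG_FILE = "index_counts.csv"  # hypothetical file name

def record_counts(counts):
    """Append today's indexed-page counts (one column per channel) to the CSV.

    `counts` maps channel name -> indexed pages, read off manually from
    site: queries or from a webmaster tool.
    """
    today = datetime.date.today().isoformat()
    new_file = not os.path.exists(LOG_FILE)
    with open(LOG_FILE, "a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["date"] + sorted(counts))  # header on first run
        writer.writerow([today] + [counts[k] for k in sorted(counts)])

# Example figures only: whole site plus two channels observed today.
record_counts({"whole_site": 12400, "news": 5300, "products": 2100})
```

Once a few weeks of rows accumulate, the CSV can be opened in any spreadsheet to chart the overall trend.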
Second, the rising and falling trend in the number of the website's external links
The website's external links are what many webmasters care about most. External links have the greatest impact on our websites and are also the focus of a webmaster's daily work. We usually not only check the number of external links, but also use it to review recent work. For example, we post forum signatures, blog comments, and link wheels; judging whether these have had any effect requires checking the external-link data regularly. The increase or decrease in the number of external links is of great guiding significance for future work.
Suggestions and tools: Don't pay attention only to Yahoo external links; Baidu external links (related domains) also contribute a lot to rankings. Look at both when querying, and keep both in mind when building external links. Don't assume that merely leaving a bare URL is useless; in fact it still has an effect. Of course, there are many tools for querying external links, and common webmaster tools will meet the need. To watch the trend rather than a single day's figure, day-over-day changes can be computed from your records, as in the sketch below.
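As a sketch (assuming the external-link counts are recorded daily in a CSV with the same shape as the indexing one above; the file and column names here are made up), the day-over-day change in each count can be computed like this:

```python
import csv

def daily_changes(path="backlink_counts.csv", column="baidu_related_domains"):
    """Yield (date, count, change vs. previous day) for one recorded column."""
    prev = None
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            count = int(row[column])
            change = None if prev is None else count - prev
            yield row["date"], count, change
            prev = count

# Print the trend for Baidu related domains; rerun with column="yahoo_links"
# (or whatever column names you recorded) for the other engine.
for date, count, change in daily_changes():
    print(f"{date}: {count}" + ("" if change is None else f" ({change:+d})"))
```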
Third, observe website traffic and the URLs visitors land on
Website traffic and the URLs visitors land on are two very critical data points. Many webmasters analyze the effect of their work from these two angles, looking not only at the whole website but also at the traffic of each channel and column. Look at the URLs of the pages receiving traffic and analyze what words were searched: does a keyword have any derived long-tail variants? Where does the traffic enter the website? From this, the next step of work can be planned and adjusted.
Suggestions and tools: Be careful when analyzing. For example, when identifying the keywords that bring traffic, start from multiple angles: the source, the time, the word length, the daily search volume, the page's ranking, whether the page's article is original, the page's snapshot time, and so on. This way you can work out the basic patterns. As for tools, Google Analytics (GA) is recommended, because it retains website data for a longer period and makes analysis easier. A small sketch for pulling search keywords out of referrer URLs follows below.
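As a sketch of one step of this analysis (the parameter names below are the historically common ones, `wd`/`word` for Baidu and `q` for Google; confirm them against your own referrer data, since engines differ and change over time), a referrer URL can be parsed to recover the search keyword:

```python
from urllib.parse import urlparse, parse_qs

# Query-string keys search engines commonly use for the search terms.
KEYWORD_PARAMS = ("wd", "word", "q", "query")

def search_keyword(referrer):
    """Return the search keyword embedded in a search-engine referrer URL,
    or None if no known keyword parameter is present."""
    params = parse_qs(urlparse(referrer).query)
    for key in KEYWORD_PARAMS:
        if key in params and params[key]:
            return params[key][0]
    return None

print(search_keyword("https://www.baidu.com/s?wd=seo+data+recording"))   # seo data recording
print(search_keyword("https://www.google.com/search?q=site+log+analysis"))
```

Feeding every referrer in your traffic data through this and tallying the results gives the keyword list to examine from the angles above.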
Fourth, look at spider visit patterns in the logs
Website logs are the valuable traces search engines leave behind when they visit the website, and we can find a lot of valuable information in them. By summarizing this information we can see the root cause of a problem at a glance. For example, you can see the total number of search-engine visits to the website per day, the average number of pages crawled per visit, daily crawl efficiency, the crawl paths, and so on. If the average number of pages a spider fetches each time it comes to the website is very low, it indicates a problem with the website's structure.
Suggestions and tools: The logs show you the root cause of problems. For example, on many websites the top ten crawled pages and directories turn out to be the theme directory where the CSS lives, or the shopping-cart page, the contact-us page, and so on. Once you see this in the logs, those paths can be blocked in robots.txt. As for tools, you can use the Backfire or Lightyear log analyzers; there are also some English tools, but they are less convenient to operate, so they are not listed here. A rough log-parsing sketch is given below.
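As a rough sketch of extracting these patterns yourself (assuming the common Apache/Nginx combined log format and matching the user-agent substrings Baiduspider and Googlebot; the log file name is made up, and the regex is a simplification you may need to adjust), the script below counts requests and the most crawled paths per spider:

```python
import re
from collections import Counter

# One line of the combined log format; the user agent is the last quoted
# field. This pattern is a simplification and may need adjusting.
LINE = re.compile(
    r'"(?:GET|POST) (?P<path>\S+) [^"]*" \d{3} \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

SPIDERS = ("Baiduspider", "Googlebot")  # substrings to look for in the UA

def spider_stats(log_path):
    hits = Counter()                          # total requests per spider
    paths = {s: Counter() for s in SPIDERS}   # most crawled paths per spider
    with open(log_path, errors="replace") as f:
        for line in f:
            m = LINE.search(line)
            if not m:
                continue
            for spider in SPIDERS:
                if spider in m.group("agent"):
                    hits[spider] += 1
                    paths[spider][m.group("path")] += 1
    return hits, paths

hits, paths = spider_stats("access.log")  # hypothetical log file name
for spider in SPIDERS:
    print(spider, hits[spider], "requests; top paths:", paths[spider].most_common(5))
```

If the top paths turn out to be CSS directories or cart pages, they can then be disallowed in robots.txt (e.g. Disallow: /css/).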
Okay, that's it for today's four aspects. I hope it genuinely helps everyone; please point out any shortcomings so that we can improve together. This article is provided by the weight-loss drug ranking list ( http://www.jianfeiyaocn.com ); please keep this link when reprinting!