There is a saying in SEO: details determine success or failure. In SEO work we constantly stress attention to detail, because a single overlooked detail can be enough to push an optimization effort into a bottleneck. Many webmasters, when they first take over an old site, run into stubborn problems such as stagnant rankings, slight ranking declines, or inclusion counts that fluctuate up and down, and it is often hard to find the cause. So when we take over a website that is already part-way through optimization, or one that has existing problems, we need to diagnose and analyze every detail and dig out the long-standing issues; only then does our optimization become proactive. Today I would like to discuss with you, in detail, how to diagnose an old site before optimizing it.
When we take over a website, the first thing to do is a systematic analysis. Systematic analysis does not mean looking at the website from a macro perspective; it means examining the site detail by detail, so that we know exactly what state the site is in at the moment we take it over.
1. Working with Robots.txt
The main function of Robots.txt is to control how search engines handle the pages and paths within the site. It can help us block pages with no real content, dead links and the like. Let's look at Robots.txt from three aspects.
1. Check whether Robots.txt exists. Robots.txt is a must for website optimization; not having one is a sign that the site's SEO has been handled poorly. Checking is simple: just open domain name/Robots.txt in the browser. Robots.txt normally sits in the root directory, so if it exists you will see it; if it does not, it should be created as soon as possible.
2. Check whether the Robots.txt is written correctly. Most webmasters understand the syntax of Robots.txt; what needs to be emphasized here is the ordering of Allow and Disallow. Do not put Allow first; put Disallow first, and only allow after you have disallowed (see the sketch after this list).
3. Check the pages and paths blocked by Robots.txt. As webmasters we must learn to recognize the common folders of a website. Open every blocked page and path one by one and judge whether it really is a low-value page that should be blocked, or whether blocking it is unnecessary. At the same time, run a site: query on the domain and go through the included pages to see whether any of them should in fact be blocked. This is very important.
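As a reference, here is a minimal Robots.txt sketch that follows the Disallow-before-Allow order described above; the directory names and the sitemap address are purely illustrative and must be replaced with the actual structure of your own site:

User-agent: *
# block back-end and low-value directories first
Disallow: /admin/
Disallow: /data/
Disallow: /search/
# then open up the exception that should stay crawlable
Allow: /search/hot/
Sitemap: http://www.example.com/sitemap.xml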
2. Diagnostic analysis of website paths
1. Parameter detection for dynamic paths. Many websites use dynamic paths, and each dynamic path carries parameters. If a dynamic path carries three or more parameters, the page becomes noticeably harder to include. How do you check how many parameters a dynamic path has? It is very simple: count the equal signs in the path; there are as many parameters as there are equal signs (see the path examples after this list).
2. Good and bad pseudo-static paths. Pseudo-static paths are also widely used, but many of them are poorly implemented, which likewise hurts inclusion. During the inspection, if a pseudo-static path still contains a question mark, other parameter symbols, or Chinese characters, the pseudo-static rewrite has failed. Some pseudo-static setups, such as WordPress's, sometimes leave /index.php/ after the domain name; this kind of pseudo-static path is also very poor, and we should watch out for it.
3. Check the rationality of the paths. One issue worth emphasizing here is the choice between Chinese and English paths. Many webmasters like Chinese paths because they look intuitive, but Chinese paths are much harder to get included than paths made up of plain letters, so webmasters should be cautious about using them.
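To make the three checks above concrete, here are a few illustrative paths (the domains and file names are hypothetical):

http://www.example.com/list.php?cat=5&page=2&sort=1 (a dynamic path with three equal signs, i.e. three parameters)
http://www.example.com/news/123.html?from=index (a failed pseudo-static path: a parameter symbol is still present)
http://www.example.com/index.php/2013/05/sample-post/ (a WordPress-style pseudo-static path that should be avoided)
http://www.example.com/新闻中心/ (a Chinese path, much harder to get included)
http://www.example.com/news/ (a plain-letter path, the preferred form)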
3. Inspection of commonly used optimization elements
Commonly used optimization elements include: canonical tags, nofollow tags, H tags, alt tags, etc.
1. First, the difference between nofollow and Robots.txt: nofollow is mainly used to stop weight from being passed to external URLs, while Robots.txt is mainly used to block internal URLs. One handles the outside and the other handles the inside, a clear division of labor (see the markup sketch after this list).
2. The canonical tag is something we use often, especially on forums. It is generally suited to list pages and paginated pages with high similarity: when several paginated or list pages are highly similar, you need to tell the search engine which page should take part in the keyword ranking. The specific writing method is as follows: <link href="http://www.***.net/zhishi/" rel="canonical" />
3. Everyone is familiar with the alt tag. This attribute should be added to every image on the website to tell search engines what the image shows. This is also very important and should not be ignored.
4. The H tags are mainly suited to pages with many content headings. If a content page has intermediate headings and subheadings below the main title, and there are many of them, then H tags are needed to give the page a clear structure. Generally the home page does not need them.
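For reference, here is a combined markup sketch of the nofollow, alt and H elements mentioned above; the URLs, file names and heading texts are only placeholders:

<a href="http://www.other-site.com/" rel="nofollow">external link whose weight we do not want to pass on</a>
<img src="/images/crane-01.jpg" alt="a short description of what the image shows" />
<h1>main title of the content page</h1>
<h2>intermediate heading</h2>
<h3>subheading under the intermediate heading</h3>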
4. Website page similarity analysis and website optimization flaw analysis
1. The similarity of a website mainly comes from its boilerplate text. Look at the proportion of boilerplate text relative to the main content of the site as a whole: if boilerplate takes up too high a proportion, it directly raises page similarity, and at that point the boilerplate needs to be streamlined.
2. Website optimization flaws mainly mean things like slow page loading, over-long titles and over-long paths. For diagnosing site speed I would like to recommend a tool: the Page Speed add-on for Firefox, which does a good job of identifying what is slowing the site down. We must carefully analyze the causes of slow loading and work out solutions. Problems such as over-long titles and over-long paths cannot be ignored either; these all cost points in optimization.
5. Inspection of website external links
Mainly ask whether links have been bought: when the buying started, where the links were bought, and whether links are still being bought now. Then check the site's Baidu related domains, check whether the site has been promoted by mass posting, and use Google's link: query to look at its external links, paying attention to how links from high-authority pages are handled (the common queries are sketched below).
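These checks are usually done with ordinary search-engine queries; replace www.example.com with the actual domain, and note that the exact operators supported vary by search engine and have changed over time:

site:www.example.com (pages of the site included in the index)
domain:www.example.com (Baidu related-domain query, a rough view of the site's external links)
link:www.example.com (Google external-link query)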
6. Analysis of the reasons for a drop in website authority
If the website we take over has suffered a drop in authority, we need to ask about and analyze the following points.
1. Whether the website title has been changed; whether the program or the paths have been changed, and if the paths were changed, whether corresponding handling, such as Robots.txt handling, was done; and whether the template has been changed.
2. Check the site's IIS logs to determine whether the drop in authority was caused by server problems or by something else.
3. Analyze the stability of its external links. Three external-link factors commonly cause a drop in authority: first, large-scale loss of external links, usually because accounts were deleted or purchased links were unstable; second, large amounts of spam external links pointing to the site; third, friendly (exchanged) links whose partner sites have themselves been downgraded.
After diagnosing and analyzing the whole website in this way, we can roughly identify the problems it has, formulate solutions for each of them, and move on to the next stage of optimization work.
This article was originally written by http://www.51diaoche.net and first published on A5. Reprints are welcome.
Editor in charge: Chen Long | Author: Longfeng Hoisting Machinery