The first E-commerce Circle SEO competition has entered its sprint stage, and the contestants are each showing off their special skills; innovative SEO techniques and talented practitioners keep emerging from it. Huzi was fortunate to take part in this competition and learned a great deal of SEO experience and knowledge from it. Below, Huzi shares the keyword-ranking techniques for improving Baidu rankings that emerged in this e-commerce SEO competition, mainly analyzing, from the perspective of website logs, how to bait spiders in order to improve Baidu rankings.
1. Understand the two results the server returns after a spider crawl
1. The server returns 200 0 64: through tracking experiments, I can roughly summarize this into three possibilities. A: A session failed or the request was unreachable; this is not a sign that Baidu has blocked the site. B: Baidu has penalized the site ("plucked its hair") or placed it in the sandbox. C: Your server may simply be running a 64-bit operating system.
2. The server returns 200 0 0: this means Baiduspider made a request to the server and, after receiving the response, began crawling new data. So if you publish a new article at this time, it will usually be indexed immediately; in my experiments the release rate can reach roughly 80%. An article published before this time period, however, will not be released immediately and usually will not appear until Baidu's next update.
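The "200 0 64" and "200 0 0" values above are the sc-status, sc-substatus and sc-win32-status fields of an IIS W3C log line. As a minimal sketch of how to separate the two cases, the script below counts each triple for Baiduspider requests. The field layout (user agent followed by the three status fields at the end of the line) and the sample lines are assumptions; match the indexing to the `#Fields:` header of your own log.

```python
# Sketch: classify Baiduspider requests in an IIS W3C log by the
# sc-status / sc-substatus / sc-win32-status triple ("200 0 64" vs "200 0 0").
# The field positions are an assumed layout; adjust to your log's #Fields: line.

def classify_spider_hits(lines):
    """Count '200 0 0' and '200 0 64' responses to Baiduspider requests."""
    counts = {"200 0 0": 0, "200 0 64": 0}
    for line in lines:
        if line.startswith("#") or "Baiduspider" not in line:
            continue  # skip W3C header lines and non-spider traffic
        fields = line.split()
        # Assumption: the last three fields are sc-status sc-substatus sc-win32-status
        triple = " ".join(fields[-3:])
        if triple in counts:
            counts[triple] += 1
    return counts

sample = [
    "#Fields: date time cs-uri-stem cs(User-Agent) sc-status sc-substatus sc-win32-status",
    "2011-05-01 03:12:09 /article/1.htm Baiduspider 200 0 0",
    "2011-05-01 03:12:11 /article/2.htm Baiduspider 200 0 64",
    "2011-05-01 03:12:15 /article/3.htm Mozilla/5.0 200 0 0",
]
print(classify_spider_hits(sample))  # {'200 0 0': 1, '200 0 64': 1}
```

A rising share of 200 0 64 responses would then be the signal the author describes for checking whether the site has been penalized.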
2. Use spiders' crawling patterns to analyze and bait them with software
1. Server logs are generally quite large, so analyzing them by hand would be an enormous job; instead, we can use website-log analysis software. Huzi currently uses the Lightyear SEO log analysis tool, which can be downloaded from my website at http://www.hxbfw.cn/UPLOAD/Lightyear SEO Log Analysis.rar. It is fairly simple to operate and even beginners can generally manage it; if anything is unclear, add me on QQ. The tool is shown below:
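For readers without the tool, the core of what such software automates can be sketched in a few lines: tally how often the spider fetches each URL, which reveals its crawling pattern. The field positions and sample lines here are assumptions about a typical W3C log layout.

```python
# Sketch: count Baiduspider fetches per URL from raw log lines.
# Assumption: cs-uri-stem is the third whitespace-separated field.
from collections import Counter

def spider_crawl_counts(lines, uri_field=2):
    """Return a Counter mapping URL -> number of Baiduspider requests."""
    counts = Counter()
    for line in lines:
        if line.startswith("#") or "Baiduspider" not in line:
            continue
        fields = line.split()
        counts[fields[uri_field]] += 1
    return counts

sample = [
    "2011-05-01 03:12:09 /index.htm Baiduspider 200 0 0",
    "2011-05-01 03:40:02 /index.htm Baiduspider 200 0 0",
    "2011-05-01 04:05:17 /article/335.htm Baiduspider 200 0 0",
]
print(spider_crawl_counts(sample).most_common())
# [('/index.htm', 2), ('/article/335.htm', 1)]
```

The most frequently crawled pages are the natural places to put the "spider bait" links the article describes.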
2. Improve Baidu rankings through an improved SVM algorithm
A. Thanks to kkk103 for providing the article; Huzi studied his method and found it genuinely distinctive. We know that when users retrieve information through a search engine, a large number of search results are returned. If a user finds valuable information, they click the link, usually stay for a while, read carefully, and occasionally download something. If a page is not what the user needs, it is closed immediately. We can therefore infer a user's needs from whether they click and how long they stay.
B. The website log also records user access times, URLs and other data. With tools, we can process this data through user identification and session identification to find out which search terms or pages a user really needed, establishing the correspondence between a search term and the URLs that satisfy that information need. From his observations, Huzi concluded that the page a user clicks after searching a keyword is more relevant than the pages ranked above it that went unclicked.
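The user- and session-identification step described above can be sketched as follows: group entries by visitor (IP plus user agent), split a visitor's hits into sessions with an idle timeout, and estimate dwell time on a page as the gap to the next request in the same session. The 30-minute timeout and the record format are assumptions for illustration, not details from the article.

```python
# Sketch: session identification and per-page dwell-time estimation.
# Assumptions: a visitor is identified by (ip, user_agent); sessions break
# after a 30-minute idle gap; records are (ip, user_agent, timestamp, url).
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)

def dwell_times(records):
    """Return (url, seconds_on_page) pairs estimated from consecutive hits."""
    by_visitor = {}
    for ip, ua, ts, url in sorted(records, key=lambda r: r[2]):
        by_visitor.setdefault((ip, ua), []).append((ts, url))
    dwell = []
    for hits in by_visitor.values():
        for (t1, url), (t2, _) in zip(hits, hits[1:]):
            if t2 - t1 <= SESSION_TIMEOUT:  # gap small enough: same session
                dwell.append((url, (t2 - t1).total_seconds()))
    return dwell

records = [
    ("1.2.3.4", "Mozilla", datetime(2011, 5, 1, 10, 0, 0), "/article/1.htm"),
    ("1.2.3.4", "Mozilla", datetime(2011, 5, 1, 10, 4, 0), "/article/2.htm"),
    ("1.2.3.4", "Mozilla", datetime(2011, 5, 1, 11, 30, 0), "/article/3.htm"),
]
print(dwell_times(records))  # [('/article/1.htm', 240.0)]
```

Note the last page of a session gets no dwell estimate, since there is no following request to measure against; this is a standard limitation of log-based session analysis.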
C. Search-engine server logs record when and how often users visit a website. Log analysis only needs to run statistics over this information to find the pages users visit frequently and browse for a long time; these are in fact the pages users are interested in. An SVM algorithm can then be used to increase the weight of these pages so that they rank highly in the search results, which improves the search engine's accuracy. This is also the key to kkk103's ranking on Baidu's first page.
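The article credits an "improved SVM" for this re-weighting but gives no details, so the following is a deliberately simple stand-in rather than kkk103's actual method: score each page by visit count multiplied by average dwell time, then rank by score. The scoring formula is my own illustration of the frequency-plus-dwell idea described above.

```python
# Sketch: rank pages by a simple engagement score (visits * average dwell
# seconds). This is an illustrative substitute for the unspecified
# "improved SVM" weighting mentioned in the article.
from collections import defaultdict

def page_scores(dwell):
    """dwell: (url, seconds) pairs -> {url: visits * average_dwell_seconds}."""
    times = defaultdict(list)
    for url, seconds in dwell:
        times[url].append(seconds)
    return {url: len(ts) * (sum(ts) / len(ts)) for url, ts in times.items()}

dwell = [("/a.htm", 120.0), ("/a.htm", 60.0), ("/b.htm", 30.0)]
ranked = sorted(page_scores(dwell).items(), key=lambda kv: kv[1], reverse=True)
print(ranked)  # [('/a.htm', 180.0), ('/b.htm', 30.0)]
```

A real SVM approach would instead train a classifier on labeled "interesting/uninteresting" pages using such engagement features; the point of the sketch is only to show how log statistics feed into a page weight.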
Summary: this ranking method uses analysis of the website's logs to learn the search engine's preferences and crawling patterns; we can then use certain techniques to actively bait spiders into crawling the pages we have prepared, and so improve the site's ranking. By extension, log analysis tools can be used to build a spider bait that actively guides Baiduspider to crawl and earn a good ranking. I am not just talking about sitemaps here.
This article originally appeared on Huzi's E-commerce Circle at http://www.hxbfw.cn/Article/335.htm. Please indicate the source when reprinting, thank you!
Thanks to Huzi for the contribution.