Friendly links are one of the criteria search engines use to judge the quality of a website. As the old saying goes, what is near vermilion turns red and what is near ink turns black; search engines apply the same logic, judging a site by the company it keeps, so the quality of its friendly links directly affects the site's optimization results. Exchanging links is one approach, but a "one-way" inbound link improves a site's rankings even more easily. Under that temptation, some people inevitably go to great lengths to turn an exchange into a one-way link, and the endless tricks grow ever harder to guard against, from the crude "swap a link, then quietly remove it" to the more advanced exploitation of optimization loopholes. Abusing "robots" is one of the usual methods, and the least noticeable one; new variations surface from time to time to deceive webmasters. How can webmasters guard against robots-based fraud? The following should give you an idea:
1: “Robots” protocol specification
Speaking of "robots", the first thing that webmasters think of is the "ROBOTS.TXT" file on the website. What is the relationship between the two? In fact, ROBOTS.TXT is an application of the "robots" protocol. Tell search engines what is accessible and indexable, and what is not accessible and indexable. The use of ROBOTS protocol can be said to have brought great convenience to website optimization. However, because of this, some people opportunistically "complicate" this rule and disrupt the normal order in website optimization, such as deceiving friendly links.
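To make the protocol concrete, here is a minimal robots.txt sketch; the directory names are hypothetical placeholders, not taken from any real site:

# applies to every crawler that honors the protocol
User-agent: *
# keep these (hypothetical) directories out of the crawl
Disallow: /admin/
Disallow: /tmp/
# everything else may be crawled and indexed
Allow: /

A spider that honors the protocol will skip /admin/ and /tmp/ and crawl the rest of the site.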
2: “Robots” blocks friendly links
Robots.txt can block website directories and individual links, letting search engines index content exactly as the site owner intends. On the one hand, this keeps sensitive information on the site away from spiders and avoids potentially serious leaks; on the other hand, it lets the webmaster fine-tune what gets crawled, which helps concentrate weight. But the tool cuts both ways: what happens if a webmaster with ulterior motives uses the robots protocol to block the friendly-link module or the links themselves? When a search engine crawls such a site, will your link be indexed and will weight be passed? The answer is no. Exchanging links with such a website is no different from handing it a one-way link. Webmasters should therefore check whether the robots.txt of a linked site is reasonable.
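If you want to verify a link partner programmatically, Python's standard library ships a robots.txt parser. The sketch below is a minimal example under the assumption of a hypothetical partner domain and links page; replace the URLs with the partner's real ones:

# Minimal sketch: check whether a partner's robots.txt blocks the page
# that is supposed to carry your link. URLs are hypothetical placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.example-partner.com/robots.txt")
rp.read()  # fetch and parse the partner's robots.txt

link_page = "http://www.example-partner.com/links.html"
if rp.can_fetch("*", link_page):  # "*" = any crawler
    print("OK: the links page is crawlable.")
else:
    print("Warning: robots.txt blocks the links page; the link passes no weight.")

If can_fetch() returns False for the page that carries your link, the "exchange" is effectively a free one-way link for the partner.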
Three: "Robots meta" tag
Do you know the "robots meta" tag? Many websites use it to guide how search engines index their pages. But a webmaster with ulterior motives can abuse it as well. For example: "<meta name="robots" content="index, follow">". Have you seen such a line of code? Does it look familiar? It means: all search engines may index this page and follow the links on it. But once it becomes "<meta name="robots" content="noindex, nofollow">", the meaning is completely reversed: the page must not be included in the index, and spiders must not follow the links on the page. Two small "no"s change the meaning entirely. After exchanging links, the webmaster should carefully check the partner page's source code to confirm the tag is written correctly; don't be careless. Note: even if no nofollow attribute is added to the individual links on the page, a nofollow in the robots meta tag at the top of the page has the same effect on every link.
A friendly-link exchange is not just an exchange of external links; it is also an exchange of trust. How can there be any talk of sustainable development if the two sides are deceiving each other? Webmasters should be sincere when exchanging links, so that long-term cooperation can become a win-win. Has your website been demoted? The A5 Optimization Team / He Guijiang ( http://i.zhihuiSEO.com ) reminds webmasters to check whether there is anything wrong with the "robots" of their friendly links.
Editor in charge: Chen Long. Author: Binary Network.