Everyone knows that site revisions, problems with other sites on the same IP, or attacks on your site can cause a website to lose authority. There is another factor that also affects a website's weight: duplicate content. A site filled with duplicate content will inevitably see its weight and rankings suffer, so taking precautions here is an important part of SEO.
When does duplicate content occur?
1. Collected content. This is easy to understand: to save time writing, many webmasters scrape content from other websites to fill their own. Doing so inevitably increases the duplicate content shared between sites, and everyone knows the consequences, yet many webmasters still chance their luck. My suggestion is to at least make the material appropriately pseudo-original and add your own understanding to each article; that is better than straight copy-and-paste, and spiders will like it more.
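To make "duplicate" concrete: search engines judge pages by textual similarity, not only exact matches, so a light rewrite can still register as a near-copy. A minimal sketch of one classic similarity measure, Jaccard overlap of word shingles (the measure and sample texts are illustrative assumptions, not any search engine's actual algorithm):

```python
def shingles(text: str, k: int = 3) -> set:
    """Break text into overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str, k: int = 3) -> float:
    """Jaccard similarity of two texts' shingle sets (0 = disjoint, 1 = identical)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "Duplicate content hurts a website's weight and ranking over time."
copied   = "Duplicate content hurts a website's weight and its ranking over time."
print(jaccard(original, copied))  # close to 1.0: a light rewrite is still a near-duplicate
```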
2. Duplicate content from API sites. This mainly concerns Taobao affiliate sites: after an API-driven site updates, it produces large amounts of duplicate content that all tends toward sameness. There is less to worry about now, though, because Taobao has banned these sites from using API promotion.
3. Sorting features that present the same content repeatedly. For example, large shopping-guide websites often have home-page sections such as "ranked by sales" and "ranked by popularity", and these rankings are usually driven by dynamic URL parameters. Just think: a popular product is very likely also a high-selling product, so the same item ends up reachable through several different URL combinations. Once there are too many such products, it becomes quite harmful to optimization, because many URL addresses overlap.
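One common remedy is to strip the presentation-only parameters so every sort order maps to a single canonical address. A minimal sketch using only the standard library (the parameter names sort, order, and rank_by are assumptions for illustration):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Query parameters that only change presentation, not content (assumed names).
PRESENTATION_PARAMS = {"sort", "order", "rank_by"}

def canonical_url(url: str) -> str:
    """Drop presentation-only parameters so every sort order maps to one URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in PRESENTATION_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

# Both sort views collapse to the same canonical address, which can then
# be emitted in a <link rel="canonical"> tag on each variant page.
print(canonical_url("http://example.com/list?cat=5&sort=sales"))
print(canonical_url("http://example.com/list?cat=5&sort=popularity"))
# -> http://example.com/list?cat=5 in both cases
```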
4. Homogeneous content on image-heavy sites. Many novel- and song-download websites carry homogeneous content, and the situation is also common in B2C e-commerce. You might object that an e-commerce site has very little text and a great many pictures, so this should rarely happen. But it is precisely the pictures that make the site repetitive: everyone knows spiders cannot read images, so we annotate them with alt tags, yet that alone cannot guarantee fewer duplicate pages being indexed. An alt tag carries very little text after all, and most of the image URLs are still identical, so duplicate pages on the site remain hard to avoid.
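Alt text will not solve the duplication by itself, but it is still worth auditing. A minimal sketch that flags images missing alt text, using the standard-library HTML parser (the sample markup is invented for illustration):

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collect <img> tags whose alt attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "<no src>"))

page = """
<img src="/img/product-1.jpg" alt="blue cotton shirt, front view">
<img src="/img/product-2.jpg">
"""
checker = AltChecker()
checker.feed(page)
print("images missing alt text:", checker.missing)
```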
5. Website program settings. While building a site, the webmaster sometimes creates dynamic pages in the back end to preview the site's effect. Because these preview pages are then overlooked, spiders crawl and index them anyway, which creates duplicate content on the site.
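The usual remedy is to disallow the preview paths in robots.txt before spiders find them. A minimal sketch that verifies such a rule with the standard library (the /preview/ path is an assumed example):

```python
from urllib.robotparser import RobotFileParser

# Rules a site might serve at /robots.txt to keep preview pages out of the
# index (the /preview/ path is an illustrative assumption).
rules = [
    "User-agent: *",
    "Disallow: /preview/",
]

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "http://example.com/preview/draft-home.html"))  # False
print(parser.can_fetch("*", "http://example.com/index.html"))               # True
```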
6. RSS-generated content. Everyone is familiar with RSS subscriptions: large news websites and personal blogs use RSS feeds to syndicate content to personal sites. The content of those personal sites is inevitably reprinted by still others, so the original source's information ends up overlapping with many other websites, and duplicate indexing by spiders becomes all but certain.
7. Duplicate pages caused by wrong error codes. Everyone knows the only correct return code for an error page is the 404 status code; any other status code is wrong. If you use status code 200, for example, then when you delete a page the spider on the other end is told the request succeeded, and the dead page keeps being indexed as duplicate site content. By the same token, if you route visitors to an error page, it must be served with a 404 status code, or it will certainly create duplicate content for the site.
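A minimal sketch of serving a genuine 404 for removed pages, using Python's built-in HTTP server (the paths and page set are invented for illustration):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

LIVE_PAGES = {"/": b"<h1>Home</h1>", "/about": b"<h1>About</h1>"}  # assumed pages

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in LIVE_PAGES:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(LIVE_PAGES[self.path])
        else:
            # Deleted or unknown pages must answer 404, not 200, so spiders
            # drop them instead of re-indexing duplicate dead content.
            self.send_error(404, "Page not found")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```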
In a spider's eyes, duplicate content is a form of cheating, so a site that repeats itself too much will see its weight drop. How, then, should we guard against this in everyday site construction and maintenance?
1. Collect content less often. This is easy to understand: nothing in this world comes for free, and a website whose development depends on scraping has no hope of surviving.
2. Give each page a unique title. This is a very important point, because title tags are a key part of website optimization: a unique title lets the spider tell the page apart from every other, and it is also a real improvement to the user experience. For column pages in particular, a unique title gives the weight of the inner pages a definite optimization advantage.
3. Revise the meta tags. A good description has a major influence on the spider's judgment about indexing a page. However, many sites now use CMS programs that stamp out meta tags in bulk, so I recommend modifying at least some of them by hand. That is far better than all pages sharing one description, which is really a move of last resort.
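Points 2 and 3 both come down to unique head metadata per page. A minimal sketch that builds a distinct title and description from each page's own data and checks that no titles collide (the page records and site name are invented for illustration):

```python
pages = [  # assumed page records
    {"slug": "blue-shirt", "name": "Blue Cotton Shirt", "summary": "Breathable cotton shirt in blue."},
    {"slug": "red-scarf",  "name": "Red Wool Scarf",    "summary": "Warm wool scarf in deep red."},
]

def head_tags(page: dict, site: str = "Example Shop") -> str:
    """Build a unique <title> and meta description for one page."""
    title = f"{page['name']} - {site}"
    desc = page["summary"]
    return f'<title>{title}</title>\n<meta name="description" content="{desc}">'

# Uniqueness check: every page should get its own title.
titles = [p["name"] for p in pages]
assert len(titles) == len(set(titles)), "duplicate titles found"

for p in pages:
    print(head_tags(p), end="\n\n")
```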
4. Revise collected articles into pseudo-originals. I understand that avoiding collection entirely is hard: anyone who has run a website knows a certain number of articles must be written every day, and even a professional writer could not keep that up. But every collected article must be revised. Best of all, retell it in your own words and add your own views and understanding; that is the approach spiders find most flattering.
5. Modify the CSS style sheets. To make building a site less difficult, many novice webmasters download other people's home pages or content pages, tweak them slightly, and put them on their own sites. But don't forget that those downloaded sample pages carry many near-identical CSS style sheets; left unmodified, they inevitably make the site more repetitive. Personally, of course, I don't endorse imitating other people's pages and merely editing the style sheets: a website should have some character of its own.
6. Reduce invalid or duplicate URLs. From the very beginning of construction, keep the site's URLs as unified as possible, and do not link through dynamic pages, because from a spider's point of view, dynamic pages are simply unwelcome.
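One common approach is to rewrite dynamic query URLs into unified static-looking paths (with the server 301-redirecting the old form). A minimal sketch of the mapping itself (the /item?id=... pattern and /item/&lt;id&gt; target are assumptions for illustration):

```python
from urllib.parse import urlsplit, parse_qs

def static_path(url: str) -> str:
    """Map a dynamic product URL like /item?id=42 to a unified /item/42 path."""
    parts = urlsplit(url)
    if parts.path == "/item":
        item_id = parse_qs(parts.query).get("id", [""])[0]
        if item_id:
            return f"/item/{item_id}"
    return parts.path  # already static: leave it alone

print(static_path("http://example.com/item?id=42"))  # /item/42
print(static_path("http://example.com/item/42"))     # /item/42
```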
7. Reduce links to invalid content. We often rework an earlier site structure or its content pages, and some deleted content gets left behind. Clean these leftovers up promptly and use the webmaster tools to remove the dead links, so that spiders do not keep re-crawling them and you avoid the status-200 situation described above.
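A minimal sketch that re-fetches retired URLs and flags any that still answer 200 instead of 404 (the URL list is an invented example):

```python
import urllib.request
import urllib.error

# URLs deleted in a redesign that should now return 404 (assumed list).
retired = [
    "http://example.com/old-category/page-1.html",
    "http://example.com/old-category/page-2.html",
]

for url in retired:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            # Reaching here means a 2xx answer: a "soft 404" a spider will re-index.
            print(f"STILL LIVE ({resp.status}): {url}")
    except urllib.error.HTTPError as err:
        if err.code == 404:
            print(f"ok, 404 as expected: {url}")
        else:
            print(f"unexpected {err.code}: {url}")
    except urllib.error.URLError as err:
        print(f"could not reach {url}: {err.reason}")
```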
Website construction is like the human body: an oversight anywhere can cause "disease". So when a site loses authority or ranking, don't assume it must be the friendly links, a site revision, or the like; it may well be duplicate content. We should not take this lightly: the harm may not show in the short term, but over time the stability of the site's development can no longer be guaranteed. This article was published by the editor of Hemorrhoids Remedies (http://www.cqtaihai.com/); please credit the source when reprinting, thank you.
(Editor: Chen Long)