For anyone doing SEO, URL canonicalization is a problem that must be solved. Put simply, canonicalizing a URL means telling the search engine which URL you want indexed when several different URLs point to the same page, instead of leaving that judgment to the search engine, so canonicalizing URLs is a search-engine-friendly practice. Below I will share several methods of URL canonicalization.
The first method, the 301 redirect, is probably the most common and most widely used. 301 redirects are typically used when moving between old and new domain names or old and new URLs, and to redirect the non-www domain to the www domain. Both virtual hosts and dedicated servers can do the redirect. On a virtual host you must first confirm that the host supports 301 redirects, and then choose the appropriate 301 rules for your operating system and server type; I will not go into detail here. On a dedicated server you can follow a tutorial when setting up the 301, but I encourage you to try it yourself, since what you learn on your own sticks better than what others teach you. Just remember not to turn the 301 into a 302, so test after setting it up to confirm that the response really is a 301.
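As an illustration, here is a minimal .htaccess sketch for redirecting the non-www domain to the www domain, assuming an Apache host with mod_rewrite enabled; example.com is a placeholder domain, not a site from this article:

    # Permanently (301) redirect example.com to www.example.com
    # Assumes Apache with mod_rewrite enabled; example.com is a placeholder
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

After applying it, a quick check such as curl -I http://example.com should report 301 Moved Permanently rather than 302.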
The second method, the "canonical" tag, may be unfamiliar to many people. This tag recommends to search engines which URL to treat as the target among several different URLs; however, it is only a suggestion, not a command. For specific usage, consider the following: it is normal for a website to have multiple pages listing the same set of products. For example, one page might display the products in alphabetical order, while another page displays the same products sorted by price or rating.
If Google knows that the content of these pages is the same, it may index only one version in the search results; its algorithms select the page believed to best answer the user's query. However, site owners can now specify a canonical page for search engines by adding a <link> element with the rel="canonical" attribute to the <head> section of each non-canonical version of the page. Adding this link and attribute lets website owners identify which pages have the same content, in effect telling Google: "Of all the pages with the same content, this one is the most useful; please rank it higher in search results." (This paragraph comes from the Google help article.)
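As a sketch of how this looks in practice (the URLs are placeholders for illustration, not taken from the Google article), the non-canonical sort-by-price page would carry a canonical link pointing at the preferred URL:

    <!-- Placed in the <head> of the non-canonical, sort-by-price page -->
    <link rel="canonical" href="http://www.example.com/products" />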
The third method, robots.txt: if a website has duplicate content, or the situation described above occurs, you can also use robots.txt to block the URLs that you feel do not need to participate in rankings.
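For example, a minimal robots.txt sketch that blocks crawler access to duplicate, parameter-sorted URLs might look like this (the path is a placeholder for illustration):

    User-agent: *
    Disallow: /products?sort=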
The fourth method, the noindex tag, follows the same principle as above: URLs that do not need to participate in ranking are kept out of the search engine's index. Usage: add the statement <meta name="robots" content="noindex"> to the <head> section of the relevant web page.
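A minimal sketch of where the tag sits on a page that should stay out of the index:

    <head>
      <!-- Tells compliant crawlers not to include this page in their index -->
      <meta name="robots" content="noindex">
    </head>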
The above are several ways to canonicalize URLs; I favor the first and second methods, and they are offered for reference only. This article was written by the webmaster of Shanghai Debt Collection Company, http://www.hu-tz.com . Please keep this link when reprinting.
Thanks to Hangzhou Debt Collection Company for their contribution