In encyclopedic terms, URL stands for Uniform Resource Locator; put simply, it is a web page address. Some webmasters may say there is little worth noticing in a URL, but don't underestimate it: a well-chosen URL can play a vital role in website optimization. So what issues should we pay attention to when optimizing? Let me explain.
1. Keep the URL short and static
Official specifications consider a URL legal as long as it does not exceed roughly 1,000 characters, but an actual URL obviously should not run that long. In practice, I have found that many website pages carry a large number of parameters, producing very long URLs. Users gain no useful information from them, and they also slow page access, hurting the user experience. One survey reportedly found that the click-through rate of a short URL is 2.5 times that of a longer one; most people now prefer simplicity, and a lengthy URL is certainly not conducive to a site's spread.
Second, URLs should be static. Although in practice we are sometimes forced to use dynamic URLs, I still recommend choosing static URLs whenever you have the chance. Dynamic URLs do not have a great impact on search engines, but static URLs effectively shorten the address and also help the site avoid falling into a search engine crawling black hole.
That said, there is one firm rule when designing URLs: keep the number of parameters as small as possible, generally no more than three. Any more will slow website access and page loading.
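One common way to get short, static-looking URLs without rewriting the whole site is to have the server map a clean address onto the underlying dynamic script. A minimal sketch for Apache with mod_rewrite enabled (the `show.php` script and `article` path here are hypothetical examples, not from this article):

```apache
# .htaccess: serve the short, static-looking URL
#   http://www.xxx.com/article/123.html
# from the underlying dynamic script /show.php?id=123
RewriteEngine On
RewriteRule ^article/([0-9]+)\.html$ /show.php?id=$1 [L]
```

Users and spiders only ever see the short `/article/123.html` form, while the server quietly passes the single `id` parameter along internally.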
2. Keep the URL structure clear and concise
Webmasters should not assume that user experience is only a matter of site content; URL construction needs to account for it too. When designing our site's URLs, we must work from the site's overall hierarchy and functional layout so that spiders and users see a clear URL for each page and can predict its main content before they ever load it, which reduces the site's bounce rate.
In addition, when building a URL, distill your keywords: analyze and summarize the key points, work them into the URL, and attach the genuinely important words to the path. Don't end up with a URL that gives no hint of what the page is about. In my own case, for example, I embedded the keywords "No. 2 Chief" and "Steward" into the URL.
Finally, a word on directory depth. What I mean here is the physical directory structure, not the logical structure. When designing URLs, keep the site's directory hierarchy as shallow as possible; it is generally recommended not to go beyond three levels. This matters especially for new sites: their weight is low, search engine spiders crawl them shallowly, and deeper pages will probably never be crawled at all, so keep the number of directory levels to a minimum.
3. URL writing standards
1. Some webmasters add characters such as ";" or "," to their site's URLs. I personally don't know why; perhaps it is for uniqueness. What I want to tell you is that although URLs with such special symbols open fine in a browser, when we promote the site these circulated links often will not open for users, which greatly reduces the effectiveness of our promotion.
2. While we are at it, a reminder: a complete URL should end with "/", as in http://www.xxx.com/ . Only a URL written this way concentrates the site's own weight, so when we build external links, or internal links within articles, we must write this standard form with the trailing "/". It is a small detail, but an effective one.
3. It is best to keep the case of characters in a URL uniform, preferably all lowercase. URLs that differ only in capitalization are sometimes treated as two different pages, causing confusion or splitting authority.
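The last two points, the trailing slash and uniform lowercase, are easy to enforce in tooling before a link ever goes out. A minimal sketch in Python (the helper name and exact behavior are my own illustration, not from this article):

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_url(url: str) -> str:
    """Normalize a URL before publishing it as a link:
    lowercase the host and path, and make sure a bare domain
    ends with the trailing "/" so every link uses one form."""
    parts = urlsplit(url)
    netloc = parts.netloc.lower()        # hostnames are case-insensitive
    path = parts.path.lower() or "/"     # bare domain -> add trailing "/"
    return urlunsplit((parts.scheme.lower(), netloc, path,
                       parts.query, parts.fragment))

print(normalize_url("HTTP://WWW.XXX.COM"))          # http://www.xxx.com/
print(normalize_url("http://www.xxx.com/News/SEO")) # http://www.xxx.com/news/seo
```

Lowercasing the path assumes your server actually serves lowercase paths; on a case-sensitive host, rename the files themselves rather than only the links.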
4. Other aspects
1. URL 301 redirect settings
If the website undergoes a sudden revision, we must set up 301 redirects. A 301 signals a permanent change of address: a sudden revision leaves spiders momentarily confused, and a 301 redirect tells them the site has changed, draws them to the new URLs, and gets the updated content indexed. This prevents a URL change of our own making from costing the site weight.
PS: Transferring weight from the old URLs to the new ones usually takes quite a while, so view your traffic and ranking data calmly in the meantime, you know.
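A 301 redirect is usually configured at the server. A minimal sketch for Apache with mod_alias enabled (the old and new paths are hypothetical placeholders, not from this article):

```apache
# .htaccess: permanently redirect a renamed page
Redirect 301 /old-page.html http://www.xxx.com/new-page.html

# Permanently redirect a whole renamed directory, keeping the file name
RedirectMatch 301 ^/old-dir/(.*)$ http://www.xxx.com/new-dir/$1
```

After the revision goes live, fetching an old URL should return HTTP status 301 with a Location header pointing at the new address, which is exactly the signal spiders need.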
2. One page, one URL
Sometimes when building a website, we find multiple URL addresses pointing to the same page. This situation is actually very common, especially when building a brand effect: I may use several domains, .com, .cn, .net and so on, to create additional access paths. For example, although www.anbncn.com and www.abc.com (made-up examples) point to the same page, in the eyes of a search engine they are two links, which inevitably disperses the site's weight.
A moment's thought shows why this hurts: with several URLs for one page, search engines will not recognize them all, so only one of the addresses gets recommended and the page's weight takes a big hit. We can, however, avoid this with some care during optimization. I asked several webmasters about this question before and summarized three points from their answers:
(1) Do not put content that easily causes duplication, such as session IDs or tracking codes, into the URL.
(2) Use robots.txt to prevent search engine spiders from crawling abnormal URLs that are not displayed to users.
(3) For abnormal pages, use a 301 redirect to point them at the normal page.
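Points (1) and (2) can work together: keep session IDs out of the URL where possible, and block the parameterized duplicates that remain. A minimal robots.txt sketch (the parameter name is hypothetical; note that the `*` wildcard in Disallow rules is an extension honored by major engines such as Google and Baidu, not part of the original robots.txt standard):

```
# robots.txt at the site root
User-agent: *
# Block duplicate URLs created by session or tracking parameters
Disallow: /*?sessionid=
Disallow: /*&sessionid=
```

The normal, parameter-free URL stays crawlable, so the page's weight gathers on that single address.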
The above are some points I personally think deserve attention when constructing URLs. I share them here in the hope that they help everyone. This article was contributed by http://www.llxhw.com ; reprints are welcome, thank you.
Editor in charge: momo. Author: hi135.