Website optimization generally falls into two categories: on-site optimization and off-site auxiliary optimization. Off-site optimization is what we usually call search-engine-assisted optimization, and its effect is, from some perspectives, fairly objective. On-site optimization is harder in that some elements of a site resist frequent updating, yet of the two it is the simpler to carry out and the more powerful in effect. During my years of running websites, however, I have found that many webmasters ignore its importance, and that is a major reason why so many of them cannot get ranked.
Before sharing my experience, everyone should understand the importance of internal links. Do not rely solely on off-site optimization to improve your rankings; that will not work. Friends who like to analyze competing sites may notice that many well-ranked sites actually do not have many external links, yet their rankings are very stable, and wonder why. At that point you should look more closely at how the other site handles its on-site optimization. In my view, as long as a website does its on-site optimization well and adds some off-site auxiliary optimization, its ranking can rise easily. Of course, on-site optimization is also a long-term task. Without further ado, today I will mainly share some experience and techniques I have accumulated while building my own site, focusing on optimizing a website's internal links.
(1) A clear, tidy site structure and a concise directory hierarchy
This is undoubtedly the most basic element of a website, but it is also a necessary foundation, because once we have set up the site structure we usually do not change it, and if we do change it, we use a 301 redirect to avoid losing the site's authority. Everyone also knows that spiders like original content, yet you may find that even though you publish original or pseudo-original content every day, spiders simply do not crawl it. At that point, check whether your site's section URLs are too complex or change too often, causing spiders to become "too lazy" to crawl. In one sentence: a clear, concise site structure helps both the site's indexing and its weight.
Here I would like to emphasize one point: the more directory levels a site has, the more dispersed the weight of its content pages becomes. When designing the directory structure, we should therefore keep it as shallow as possible; this makes crawling easier for spiders, gives pages more natural weight, and raises the weight of the site as a whole. Many friends will still ask how to set this up. It is actually very simple: we can generally use the custom sections or article pages in the site's back end directly, and if a page is dynamic, remember to make it static (or static-looking), which is friendlier to spider crawling.
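As an illustration of the static-looking, shallow URLs described above, here is a minimal Apache mod_rewrite sketch; the path pattern and the `id` parameter are hypothetical placeholders, so adapt them to your own CMS:

```apache
# .htaccess sketch — serve a static-looking, one-level-deep URL
# from the real dynamic script behind the scenes.
# /article/123.html is handled by /article.php?id=123.
RewriteEngine On
RewriteRule ^article/([0-9]+)\.html$ article.php?id=$1 [L,QSA]
```

The visitor and the spider both see the shallow `.html` address, while the dynamic script keeps doing the work.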
(2) Cleverly use the nofollow tag to reasonably distribute the flow of website weight
Many webmasters, especially novices, may not be very familiar with the nofollow tag. In this author's opinion it is very important; if you do not understand it, find some introductions to the nofollow attribute and learn how it works. The author will not go into the details here and will mainly talk about its function.
For an SEO optimizer, once `rel="nofollow"` is added to a link on our site, search engines will treat that link as passing no weight to its target page. Google's search engine optimization guidelines also suggest that a page should generally not contain more than 100 internal links; otherwise the site's weight becomes too dispersed, and our pages end up unable to rank for their keywords. So if we use the nofollow tag reasonably, we can redirect the weight that would otherwise flow to pages we care little about toward our long-tail keyword pages instead.
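A minimal HTML sketch of the idea (the page names are hypothetical examples, not from the original article):

```html
<!-- Links we do not want to pass weight through get rel="nofollow". -->
<a href="/about.html" rel="nofollow">About Us</a>
<a href="/contact.html" rel="nofollow">Contact Us</a>

<!-- Long-tail keyword pages keep normal links, so they receive the weight. -->
<a href="/guides/choose-a-game-name.html">How to choose a game name</a>
```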
Having said all this, many webmasters may still not know where to apply `rel="nofollow"`. Here the author offers some suggestions; the specifics should still be decided by analyzing your own site:
1. "About Us | Contact Us | Copyright Statement | Site History | Friendly Links" — we can nofollow links to pages like these, because they contribute little to our site and are mainly reached by users clicking through; users generally do not find them via search engines, so distributing weight to them is a waste.
2. Pages with a lot of repetitive content. Some sites' content is hard to write and ends up highly repetitive, which spiders strongly dislike and which greatly disperses the site's weight. We can therefore also nofollow links to highly repetitive pages, so that other pages gain more and better weight.
The above two points are the ones I usually apply; the specifics still need to be determined according to your own site.
(3) Cleverly use page redirects to transfer internal page weight
This point was actually discovered, tried, and implemented based on the same principle as redirecting the non-www domain to the www domain. Generally, spiders give better weight to a page that can be reached in multiple ways, so setting up page redirects is also a very useful way to build up long-tail keywords. As for the repetitive pages mentioned earlier, besides nofollowing them we can also use redirects, because everyone knows that search engines dislike highly repetitive pages and rarely index them. So when you publish relatively repetitive content, you can redirect all of it to the "unique" version, and the weight of the repetitive pages is transferred cleanly to the "unique" page.
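A minimal Apache sketch of the non-www-to-www 301 redirect the author mentions, plus a duplicate-page redirect; the domain and page names are placeholders:

```apache
# .htaccess sketch — 301-redirect example.com to www.example.com
# so all weight consolidates on one canonical host.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Likewise, 301-redirect a duplicate page to its "unique" version
# so the duplicate's weight transfers rather than being wasted.
Redirect 301 /old-duplicate.html /unique-page.html
```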
(4) Create a sitemap and an RSS feed map to make crawling easier for spiders
I believe every webmaster knows about sitemaps: they are very helpful for SEO and indexing, letting spiders crawl our pages more thoroughly. Frequently updated sites in particular tend to be indexed very well. Making the map is also very simple; we can generally generate it directly from the site's back end. So build your sitemap and RSS feed map properly and update them regularly, and spiders will favor your site all the more.
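For reference, a minimal sitemap sketch in the standard sitemaps.org XML format; the URLs and dates are placeholders, and a back-end generator would emit one `<url>` entry per page:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Home page: crawled often, highest priority. -->
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- A content page with its last-modified date. -->
  <url>
    <loc>http://www.example.com/article/123.html</loc>
    <lastmod>2012-06-01</lastmod>
  </url>
</urlset>
```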
Summary: For internal website optimization, more often than not we still need to analyze and plan according to the conditions of our own site, because after all, each industry optimizes its websites differently. Today the author has mainly offered some optimization suggestions that apply to websites in general, and I hope they are helpful to all webmasters. This article was originally shared by Game Name Network http://www.name2012.com ; friends who reprint it, please remember to keep the link and copyright, and you have my thanks. Well, that is all for today's sharing; I will keep communicating with you on this platform, so see you next time.
Editor-in-Chief: Susu. Author: Game Name Network (personal space).