When building a new website, we should run a self-check before formally submitting it to search engines; frequent modifications after launch can delay indexing. Whether or not a site was built to SEO standards, you can use the following methods to test it yourself.
One. Whether the website's keywords, description, and <title></title> tags are written to SEO standards. These tags are the lifeblood of a site's SEO and directly affect its later rankings.
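One way to self-check these tags is with a small script. The sketch below parses a page with Python's standard-library HTML parser and flags a missing title, description, or keywords tag; the 60- and 160-character limits are common rules of thumb I am assuming, not official standards.

```python
# Minimal check of <title>, meta description, and meta keywords.
# Length thresholds are illustrative guidelines, not fixed standards.
from html.parser import HTMLParser

class MetaChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.keywords = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            name = (attrs.get("name") or "").lower()
            if name == "description":
                self.description = attrs.get("content", "")
            elif name == "keywords":
                self.keywords = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def check_meta(html):
    """Return a list of tags that are missing or over the assumed limits."""
    p = MetaChecker()
    p.feed(html)
    problems = []
    if not p.title or len(p.title) > 60:              # rough title guideline
        problems.append("title")
    if not p.description or len(p.description) > 160:  # rough description guideline
        problems.append("description")
    if not p.keywords:
        problems.append("keywords")
    return problems
```

A page that passes returns an empty list; anything reported should be fixed before submitting the site.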
Two. Whether the page carries a lot of junk code. If a page contains too much junk code, it can easily give search engines the impression that the site itself is a junk site.
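One rough way to quantify "junk code" is the ratio of visible text to total page bytes: a page that is almost all markup and script, with little readable text, tends to look low-quality. This sketch is an assumption of mine, not a method from the article, and any threshold you apply to the ratio is likewise a judgment call.

```python
# Rough text-to-code ratio: visible text length divided by total HTML length.
import re

def text_ratio(html):
    # drop script/style bodies first, then strip all remaining tags
    stripped = re.sub(r"(?s)<(script|style).*?</\1>", "", html)
    text = re.sub(r"(?s)<[^>]+>", "", stripped)
    text = re.sub(r"\s+", " ", text).strip()
    return len(text) / max(len(html), 1)
```

A very low ratio suggests the page is dominated by markup and script rather than content.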
Three. The size of the website's homepage. It is generally held that the homepage of a site doing SEO should not exceed 30KB. Streamline the homepage code as much as possible and move whatever content you can onto the column pages. This keeps the site fast to open and friendly to search engines.
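The 30KB guideline is easy to check yourself. The sketch below fetches a homepage and measures its size; the URL and the threshold default are placeholders reflecting the figure above.

```python
# Sketch: measure homepage size and compare against the ~30KB guideline.
import urllib.request

def page_size_kb(url):
    """Download a page and return its size in kilobytes."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return len(resp.read()) / 1024

def homepage_too_big(size_kb, limit_kb=30):
    return size_kb > limit_kb
```

Usage would be `homepage_too_big(page_size_kb("https://example.com/"))` with your own domain in place of the example.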
Four. Whether the website navigation is designed reasonably. Navigation is divided into primary and secondary navigation. The primary navigation mainly carries the names of the site's columns. The secondary navigation includes the sitemap, site introduction, service items, site registration number, advertising cooperation, traffic statistics, and other related information; exactly what goes into the navigation is decided per site. It is generally considered most reasonable to put the column names in the primary navigation at the top of the site and the remaining items in the secondary navigation at the bottom.
Five. Whether there are dead links on the website. Dead links are link addresses that cannot be accessed normally. Their presence hurts the site's indexing and is unfriendly to search-engine spiders. Many tools are available online to detect them.
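Besides the online tools, a basic dead-link check can be scripted: collect the hrefs on a page and flag any URL that returns a 4xx/5xx status or is unreachable. This is a simplified sketch (absolute links only, no crawling beyond one page), not a replacement for a full checker.

```python
# Sketch of a dead-link check for the absolute links on one page.
import re
import urllib.error
import urllib.request

def is_dead(status):
    # 2xx/3xx are fine; 4xx/5xx (and unreachable) count as dead
    return status >= 400

def find_links(html):
    return re.findall(r'href="(https?://[^"]+)"', html)

def dead_links(html):
    dead = []
    for url in find_links(html):
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                status = resp.status
        except urllib.error.HTTPError as e:
            status = e.code
        except urllib.error.URLError:
            status = 599  # unreachable host, treat as dead
        if is_dead(status):
            dead.append(url)
    return dead
```

Running `dead_links` over your homepage HTML returns the URLs that should be fixed or removed before submission.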
Six. Whether the website has a 404 page. The 404 page is the message shown when a link on the site cannot be accessed, and it can be designed however you like. To test it, append a nonexistent path to the site's domain name and visit it: the 404 page must not redirect straight to the homepage, otherwise the homepage itself may fail to be indexed.
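The test described above can be expressed as a small rule: a request for a nonsense path should end with status 404, not a 200 at the homepage. The helper below encodes that rule; you would feed it the status and final URL observed after requesting something like `https://yourdomain.com/404` (the function name is my own).

```python
# Judge a site's 404 behavior: a nonsense path should return 404,
# not redirect back to the homepage with a 200.
def handles_404_correctly(status, final_url, home_url):
    # redirecting the error to the homepage is the harmful case
    if status == 200 and final_url.rstrip("/") == home_url.rstrip("/"):
        return False
    return status == 404
```

If this returns False for your site, adjust the error handling so missing pages genuinely answer with a 404 status.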
Seven. Whether the website has a robots.txt file. The main function of robots.txt is to block certain pages of the site from crawlers. Many webmasters think their sites are built for search engines and never set up robots.txt. The author recommends that even if everything is open, you should still create a robots.txt in the root directory of the site; to allow everything, simply leave the rules empty.
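You can confirm that a minimal "allow everything" robots.txt really permits all crawlers using Python's standard-library parser; an empty `Disallow:` rule blocks nothing.

```python
# Verify that an allow-all robots.txt permits crawling any URL.
import urllib.robotparser

ALLOW_ALL = [
    "User-agent: *",  # applies to every crawler
    "Disallow:",      # empty rule: block nothing
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(ALLOW_ALL)
allowed = rp.can_fetch("*", "https://example.com/any/page.html")
```

Placing those two lines in `/robots.txt` at the site root satisfies the recommendation above without blocking any spider.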
Eight. Check whether the website server is fast and stable. The thing a new website can least afford is instability: it is a tragedy to work hard attracting spiders only for the site to be unreachable. If you are on shared hosting, also check whether the other sites on the same IP are normal, to avoid being dragged down by them; purchasing an independent host is recommended.
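A quick stability spot-check is to time a few requests to the homepage and flag outliers. The sketch below does this with the standard library; the 2-second threshold is an illustrative assumption, and a real check should run repeatedly over hours or days.

```python
# Sketch: time several homepage requests and flag slow responses.
import time
import urllib.request

def measure_latency(url, tries=3):
    """Return the wall-clock time of each request in seconds."""
    times = []
    for _ in range(tries):
        start = time.monotonic()
        urllib.request.urlopen(url, timeout=10).read()
        times.append(time.monotonic() - start)
    return times

def looks_unstable(times, limit=2.0):
    # any single slow response is enough to warrant a closer look
    return max(times) > limit
```

Usage would be `looks_unstable(measure_latency("https://example.com/"))` against your own domain.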
Nine. Never ignore the domain name. If you are buying a second-hand domain, be sure to check before purchasing whether it has been punished by search engines, or whether sites similar to it have published large amounts of content harmful to the site's development. The author once had a new site go unindexed for half a year because of its domain; after changing the domain, Baidu indexed more than 400 articles within 48 hours.
Ten. Keyword density control is an old topic that webmasters all know, and it has never been settled conclusively. The current consensus is that 2%-10% is best. How many keywords to add depends on the length of the article you write; after adding them, you can paste the article's address into an online tool to query its keyword density.
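Instead of an online tool, the density can also be computed directly: occurrences of the keyword divided by total words, checked against the 2%-10% range mentioned above. This naive whole-word count is a simplification (it ignores multi-word keywords and punctuation).

```python
# Keyword density: keyword occurrences / total words, target roughly 2%-10%.
def keyword_density(text, keyword):
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

def density_ok(density, low=0.02, high=0.10):
    return low <= density <= high
```

For example, an article where the keyword makes up a quarter of all words would fail the check as over-optimized.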
These are just the author's personal experiences, and veterans may find them elementary. I am sharing them with everyone through the webmaster network; if webmasters have anything to add or any new insights, please leave a message for discussion.
This article was originally published by the webmaster of www.gm06.com and was first published on Webmaster Home
Thanks to xinlaide11 for his contribution