Because I (Qingxin) am a programmer, many friends around me keep coming to ask how their website's code is doing, which parts could be optimized further, whether there are problems with the site layout, and so on. It was getting a bit tiresome, so I took advantage of Sunday to sort it all out: what should be done well when optimizing a website's code? The list is fairly comprehensive; I hope everyone reads it carefully, finds the shortcomings of their own site, and makes some changes.
1. DIV+CSS layout, static URLs (within 255 bytes), streamlined code with no junk code
DIV+CSS is not only good for optimization; it also makes the code easier to extend and easier to divide up for maintenance. When you want to change the color scheme or switch template styles, you only need to change the stylesheet and the site's content stays untouched. That is its biggest strength for managing the code.
Although Baidu and Google both say that dynamic paths are not a problem and that they can recognize fairly long URLs, short static URL paths are still better for optimization. Most importantly, they make it easier for users to share your content and easier for you to optimize.
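As a quick illustration of this point, here is a minimal sketch, assuming made-up class names, stylesheet path, and URLs; the markup carries only structure, the look lives in one CSS file, and the static URL stays short:

```html
<!-- article page, reachable at a short static URL such as /news/2011/seo-tips.html
     rather than a dynamic path like /index.php?cat=3&id=127 -->
<link rel="stylesheet" href="/css/style.css">

<div class="post">
  <h2 class="post-title">Article title</h2>
  <p class="post-body">Article text goes here.</p>
</div>

<!-- /css/style.css: re-skin the whole site here without touching any content page
.post       { width: 640px; margin: 0 auto; }
.post-title { color: #333; font-size: 18px; }
.post-body  { color: #666; line-height: 1.6; }
-->
```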
2. Use Flash, Img, Iframe, and js tags carefully and set them up properly
These elements can be used. If you have to choose between user experience and optimization, user experience comes first; but whenever you do use these elements, remember to give each of them a good text description.
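Here is a minimal sketch of what "giving them a description" can look like; the file paths and wording are placeholders:

```html
<!-- every non-text element carries a text description a spider can read -->
<img src="/images/logo.gif" alt="Example Co. logo" width="120" height="60">

<iframe src="/ads/banner.html" title="Promotional banner"></iframe>

<object data="/flash/intro.swf" type="application/x-shockwave-flash">
  <p>Company introduction (text fallback for search engines and visitors without Flash).</p>
</object>

<script src="/js/slider.js"></script>
<noscript><p>Image slider: latest product photos.</p></noscript>
```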
3. Use Meta, title, h, b, ol, ul, p, and a tags properly and set their attributes
This is a cliché, but remember that these tags must be used sensibly. Here is a case page for you: Google bidding: http://www.ruyitu.com/Google.html
This page has a PR of 2, uses a diverse mix of tags, and has few external links. Look at the rankings: first on Google, second on Baidu, sixth on Sosou, third on Bing. Study the code yourself; it is a good example of tag diversification.
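This is not the code of the page above, just a minimal sketch of what "tag diversification" means in practice; all text, keywords, and URLs are placeholders:

```html
<head>
  <title>Google AdWords bidding service - Example Co.</title>
  <meta name="keywords" content="Google bidding, AdWords hosting">
  <meta name="description" content="One-sentence summary shown in the search snippet.">
</head>
<body>
  <h1>Google AdWords bidding</h1>
  <p>Opening paragraph with the main keyword in <b>bold</b> once, not everywhere.</p>
  <h2>Why choose us</h2>
  <ul>
    <li>Benefit one</li>
    <li>Benefit two</li>
  </ul>
  <h2>How it works</h2>
  <ol>
    <li>Step one</li>
    <li>Step two</li>
  </ol>
  <p>Related reading: <a href="/seo/" title="SEO services">SEO services</a></p>
</body>
```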
4. robots.txt, sitemap, 404, tags, and friend-link ("friend links added automatically, pending review") page settings
Of the pages above, the most controversial is the tags page. Let me explain why: because a tags page aggregates your related articles by keyword, don't you think it will rank better on keyword density and relevance? That is exactly why spiders love to crawl tags pages and why they rank well.
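For the robots.txt and sitemap part, here is a minimal sketch of a robots.txt placed at the site root; the blocked paths and the domain are placeholders, adjust them to your own program:

```
# robots.txt at the site root
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# tell spiders where the sitemap lives
Sitemap: http://www.example.com/sitemap.xml
```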
5. Previous page, next page: pagination makes spider crawling easier
When your website has a lot of content, the old content slowly gets buried and forgotten by search engines. To keep all of your content crawlable, pagination is the most thorough way to let search engine spiders reach every page on your site.
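A minimal sketch of a pagination block on a list page; the URL pattern and page numbers are placeholders:

```html
<!-- in the <head> of page 3 of the list: optional prev/next hints for crawlers -->
<link rel="prev" href="/news/page/2.html">
<link rel="next" href="/news/page/4.html">

<!-- visible pager at the bottom of the page body -->
<div class="pager">
  <a href="/news/page/2.html">Previous page</a>
  <a href="/news/page/1.html">1</a>
  <a href="/news/page/2.html">2</a>
  <strong>3</strong>
  <a href="/news/page/4.html">4</a>
  <a href="/news/page/4.html">Next page</a>
</div>
```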
6. Every article content page must have a unique URL and a unique title
Every content page on the site must have a unique URL. No matter how you file it, by time, category, or other attributes, make sure the content page is reachable at one and only one link; otherwise your content becomes duplicate content, and the weight of the site is wasted on the duplicate internal links.
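One common way to enforce this when a CMS exposes the same article under several paths is a canonical link. It is not mentioned in the original text and support varies by search engine, so treat this as a sketch with placeholder titles and URLs:

```html
<!-- head of the article page: one title, one preferred URL,
     even if /category/12/127.html and /2011/05/127.html both reach the same article -->
<head>
  <title>How to choose a CMS - Example Co. Blog</title>
  <link rel="canonical" href="http://www.example.com/article/127.html">
</head>
```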
7. Lay out the page so that related articles, latest articles, random articles, and recommended articles are displayed sensibly
This is a good way to get your content pages fully digested by spiders, to give users relevant content, to improve the user experience, and to extend the time users spend on the site. It is also a very low-effort way to push your content to users.
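A minimal sketch of such a module placed next to or below the article body; the titles and URLs are placeholders:

```html
<div class="related">
  <h3>Related articles</h3>
  <ul>
    <li><a href="/article/95.html">Related article A</a></li>
    <li><a href="/article/88.html">Related article B</a></li>
  </ul>
  <h3>Latest articles</h3>
  <ul>
    <li><a href="/article/130.html">Newest article</a></li>
  </ul>
  <h3>Recommended articles</h3>
  <ul>
    <li><a href="/article/60.html">Editor's pick</a></li>
  </ul>
</div>
```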
8. Article internal linking: the basic principle is to link to tags pages, column pages, content pages, and the home page
An internal linking system is the best way to help your related pages rank well and to keep your inner pages from being forgotten by search engines.
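A minimal sketch of what those in-article links can look like; all URLs and anchor texts are placeholders:

```html
<!-- inside the article body: anchor-text links to the four kinds of pages named in point 8 -->
<p>
  For more on this topic, see the
  <a href="/tags/website-optimization/">website optimization</a> tag page,
  browse the <a href="/seo/">SEO column</a>,
  read a related post on <a href="/article/102.html">internal links</a>,
  or go back to the <a href="/">home page</a>.
</p>
```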
Having read this far, you will find that more than 90% of websites still don't do these things well. Start now; don't let your rankings stall because of hesitation and excuses.