SEO is closely tied to technology. Many people believe that SEO requires no technical skill, but this is a misconception: without a technical grounding you cannot do SEO well, and when something goes wrong you cannot troubleshoot it methodically. In many people's eyes, SEO is nothing more than publishing some articles and building some external links every day. If that is your understanding of SEO, your level is still at a very preliminary stage. A search engine is, after all, a very powerful piece of technology, so if we want to earn traffic from it, we must have certain skills, or at least understand how it works. So what issues should we pay attention to so that a website's pages meet the technical requirements of SEO?
1. Use a text browser to check your web pages and see whether their important links can be picked up by a text browser.
The most commonly used text browser is lynx, which you can install in a Linux environment. What lynx sees when it browses your page is very close to what a search engine spider sees, so it is often used to test a page's crawlability. If important links on your page are generated with JS, Ajax, or similar techniques, the lynx browser cannot see them, and search engines will be equally unable to crawl them. Therefore, before a website goes live, check the main pages with a text browser to find out whether any search-engine-unfriendly technology has been used.
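If lynx is not at hand, the same check can be approximated with a short Python sketch: fetch the raw HTML and list every <a href> target found in it. Links injected later by JS or Ajax will not appear, which is exactly the blind spot described above. The URL here is a placeholder.

```python
# Minimal lynx-style link check: parse only the raw HTML, as a text
# browser or a basic crawler would, without executing any JavaScript.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Placeholder URL; substitute the page you want to audit.
html = urlopen("https://www.example.com/").read().decode("utf-8", errors="replace")
parser = LinkExtractor()
parser.feed(html)
for link in parser.links:
    print(link)
```

If an important navigation link is missing from this output, a spider probably cannot follow it either.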
2. Let search engine spiders crawl your site without session IDs or parameters that track their path through the site.
Session IDs are very effective for tracking user behavior, but they are a bad idea for search engines. If such techniques are used, your pages may be indexed incompletely, because the crawler cannot reliably recognize that URLs which look different actually serve exactly the same content, and this duplication hurts inclusion.
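To illustrate, the following Python sketch (using a hypothetical sid tracking parameter) shows how several session-ID URLs collapse into a single canonical URL once the parameter is stripped; to a crawler, each variant otherwise looks like a separate page carrying duplicate content.

```python
# Demonstration: distinct session-ID URLs that all serve the same page.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

SESSION_PARAMS = {"sid", "sessionid", "phpsessid"}  # assumed parameter names

def canonical(url):
    # Drop query parameters that only track the visitor's session.
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(query)))

urls = [
    "http://www.example.com/page?sid=abc123",
    "http://www.example.com/page?sid=xyz789",
]
print({canonical(u) for u in urls})  # one canonical URL for both variants
```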
3. Make sure your web server supports the If-Modified-Since HTTP header
If your server supports the If-Modified-Since HTTP header, then when a search engine spider crawls your page it first checks the header's value. From that value the crawler can determine whether your page has changed since the last crawl. If nothing has changed, it does not need to download the same page content again, which saves spider resources as well as your server's bandwidth, so the spider can crawl more of your other pages.
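The conditional-GET handshake can be sketched with Python's standard library; the URL and the date are placeholders.

```python
# Send the time of the last crawl in If-Modified-Since; a 304 response
# means the page is unchanged and the body need not be downloaded again.
from urllib.request import Request, urlopen
from urllib.error import HTTPError

req = Request(
    "https://www.example.com/",  # placeholder URL
    headers={"If-Modified-Since": "Sat, 01 Jan 2022 00:00:00 GMT"},
)
try:
    with urlopen(req) as resp:
        print(resp.status, "- page changed, body downloaded")
except HTTPError as e:
    if e.code == 304:   # urllib surfaces 304 as an HTTPError
        print("304 - not modified, nothing to download")
    else:
        raise
```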
4. Set up a sensible robots.txt file
It is best to set up a robots.txt file for every website. If your site has no content that you want to keep out of search engines, you can create an empty file named robots.txt and upload it to the root directory of your site. Through robots.txt we can let search engines crawl certain directories while keeping them out of others; for example, template files can be blocked from crawling, and back-end login pages can likewise be excluded from indexing. When writing robots.txt, take care not to block important files by mistake; you can use Google Webmaster Tools to test the rules, or test them locally as in the sketch below.
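The rules can also be tested locally with Python's standard library; the directory names below are only illustrative.

```python
# Parse a candidate robots.txt and check which URLs it blocks,
# before uploading it to the site root.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /templets/
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

for url in ("http://www.example.com/admin/login.php",
            "http://www.example.com/article/1.html"):
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "blocked")
```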
5. Run compatibility tests on the website to make sure it displays correctly in the major browsers
Before launching a website, it is best to run extensive browser tests to ensure the content displays correctly everywhere. For IE, for example, there are testing tools that render a page in multiple IE versions at once, and Google Chrome has similar plug-ins for checking compatibility. Go live only after the tests pass and the pages display normally, and keep retesting display compatibility as part of routine maintenance.
6. Regularly use front-end testing tools to test the performance of your web pages
For example, we can use Google's PageSpeed tool and Yahoo!'s YSlow to test page performance. Both tools also point out the areas of your website that need improvement, and we can follow their recommendations to optimize the front-end pages: enable gzip compression, enable keep-alive, use CSS sprites, merge JS and CSS files, and so on.
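As a rough spot-check of two of these optimizations, the following Python sketch inspects the response headers for gzip compression and connection reuse; the host name is a placeholder.

```python
# Ask for a gzip-compressed response and report the relevant headers.
# http.client is used because, unlike urllib.request, it does not force
# "Connection: close" on the request.
import http.client

conn = http.client.HTTPSConnection("www.example.com")  # placeholder host
conn.request("GET", "/", headers={"Accept-Encoding": "gzip"})
resp = conn.getresponse()
# "gzip" here means compression is enabled on the server.
print("Content-Encoding:", resp.headers.get("Content-Encoding", "none"))
# HTTP/1.1 connections are persistent unless the server says "close".
print("Connection:", resp.headers.get("Connection", "keep-alive (HTTP/1.1 default)"))
conn.close()
```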