Today let's talk about what programmers should pay attention to when building a website:
3. Important web pages should be reachable from a relatively shallow level of the website, and every page should be reachable through at least one text link.
Note: I think what Baidu means is that the simpler the website structure, the easier it is for Baidu to crawl (never nest countless DIVs or tables inside a single DIV or table).
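A minimal sketch of what this can look like, assuming a homepage navigation block (the page names and paths here are made up for illustration): every important page sits one plain text link below the home page, and a text site map links out to everything else.

    <!-- Hypothetical homepage navigation: important pages are one text link away from the home page -->
    <ul>
      <li><a href="/products.html">Products</a></li>
      <li><a href="/news.html">News</a></li>
      <li><a href="/about.html">About Us</a></li>
      <!-- the text site map then links to every remaining page, so nothing is unreachable -->
      <li><a href="/sitemap.html">Site Map</a></li>
    </ul>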
4. Try to use text rather than Flash, Javascript, etc. to present important content or links. Baidu is temporarily unable to recognize content inside Flash and Javascript, so that content may not be found in Baidu searches, and web pages that are only linked to from Flash or Javascript may not be included by Baidu.
Note: This is telling us that if your site uses a lot of Flash or Javascript, it is best to also provide plain text links to the addresses your Flash and Javascript point to.
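A minimal sketch of that idea, assuming a hypothetical Flash banner and a Javascript-written link (the file names and URLs are invented): the plain text link and the <noscript> block repeat the same targets so a spider can still find them.

    <!-- Hypothetical Flash banner; the plain text link underneath repeats the same target -->
    <object data="banner.swf" type="application/x-shockwave-flash" width="468" height="60"></object>
    <p><a href="/promotion.html">View this promotion (text link)</a></p>

    <!-- Link written by Javascript; the <noscript> block offers the same link as plain text -->
    <script>document.write('<a href="/download.html">Download</a>');</script>
    <noscript><a href="/download.html">Download (text link)</a></noscript>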
5. Use frame and iframe structures as little as possible; content displayed through an iframe may be discarded by Baidu.
Note: "As little as possible" tells me they should not be used in large quantities. Write some HTML with a handful of iframes added, then open the page while watching the process manager, and you will understand why spiders don't like sites with a lot of iframes.
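A rough sketch of the difference, with made-up file names: the first version pulls the article in through an iframe, which Baidu may simply drop; the second places the same content directly in the page, where a spider can read it.

    <!-- Version 1: the article lives in article.html and is loaded through an iframe -->
    <iframe src="article.html" width="600" height="400"></iframe>

    <!-- Version 2: the same content sits directly in the page -->
    <div class="article">
      <h2>Article title</h2>
      <p>Article text goes here...</p>
    </div>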
6. If the website uses dynamic web pages, reducing the number of parameters and controlling the length of parameters will be beneficial to inclusion.
Note: Reducing the number of parameters and controlling their length is beneficial to inclusion! Many friends who build websites say that static pages should be generated for the whole site (they say spiders like static pages and give up on dynamic ones), and I really believed it. When I saw this sentence in Baidu Help today, I finally understood why spiders seem to prefer static pages.
Please pay attention to this sentence: "reduce the number of parameters and control the length of parameters"! Anyone who writes programs knows that the more parameters there are and the longer they get, the greater the chance that a dynamic page throws an error. If that is hard to picture, here is an analogy: the spider is a person, the web is a district made up of countless alleys, and a dynamic page error is a dead end. If someone has come here several times and hit a dead end every time, do you think they will come back? So why do spiders like static pages? Because a static page can always be accessed as long as you don't delete it from your server (it is never a dead end). If you delete a large number of static pages from your server, you will soon find that those get dropped too. In fact, Baidu does not prefer static or dynamic sites; Baidu prefers sites that can be accessed normally and never lead it into a dead end.
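A rough sketch of what "fewer, shorter parameters" can mean in practice (the script name and parameters are invented): the first URL carries five parameters, including a session id that changes on every visit; the second identifies the same article with a single short parameter.

    <!-- Harder to crawl: many parameters, long values, and a session id that changes every visit -->
    <a href="/view.asp?category=12&subcategory=334&articleid=5678&sessionid=a1b2c3d4e5f6&ref=index">Article</a>

    <!-- Friendlier to crawl: one short parameter identifying the article -->
    <a href="/view.asp?id=5678">Article</a>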
Reprinting is welcome, but please retain the author's link: www.lt77.com