I am very happy that I, Excellence SEO Blue Sky, can bring another article to my friends on A5. Today's article mainly covers the details of SEO optimization for corporate websites. Without further ado, let's get to the point.
1. Keyword distribution on corporate websites
When I looked at competitor websites, I found that many SEOers pile keyword links at the top or bottom of the page. I think this is unnecessary. Keywords should be spread naturally throughout the page rather than stacked together, and their density should not be too high: keyword stuffing is exactly what has gotten sites banned by Google in recent years. Spreading keywords out will actually greatly improve how the search engine views the page, and it will not be mistaken for cheating.
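To make "density" concrete, here is a minimal Python sketch (my own illustration, not the author's tool; the sample sentence is invented) that measures how large a share of a page's words a single keyword takes up:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of the total word count (0.0 - 1.0)."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

# 3 occurrences of "seo" out of 14 words -> about 0.214
page = ("SEO tips for corporate sites: spread SEO keywords "
        "naturally instead of stacking SEO links")
print(round(keyword_density(page, "seo"), 3))
```

A page where one keyword makes up a large fraction of the words is the "stacking" the text warns against; spreading the keyword across more content lowers this ratio.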
2. How to write articles for corporate websites
Most friends will copy other people's articles, modify them, and publish them on their own websites. This method is actually quite feasible; it mainly depends on your editing ability. If you are good at it, revise the article from beginning to end. Never modify just one paragraph or retype a few words, and it is foolish to simply shuffle the order of the paragraphs. In addition, when my Kaifeng SEO friend Yunfei writes corporate articles, he picks e-books and uses software to export their text. Most of that content has never appeared online, so it counts as original content, which is a handy little trick.
3. How to write the robots file correctly
I once asked a colleague about robots.txt:
Disallow:/kaifengseo/
Disallow:/kaifengseo
Is there any difference between the two?
There certainly is a difference. The first line, with the trailing /, only prevents spiders from crawling the contents of the kaifengseo directory itself. The second line, without the trailing /, prevents spiders from crawling any URL whose path begins with /kaifengseo, which includes the kaifengseo directory and everything in it as well as files like /kaifengseo.html, so its scope is broader than the first line's. This is a detail you need to pay attention to when optimizing your corporate website.
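You can check this difference yourself with Python's standard-library robots.txt parser, which uses the same prefix-matching rule (the example.com URLs below are placeholders for illustration):

```python
from urllib.robotparser import RobotFileParser

# Rule WITH the trailing slash: blocks only URLs inside the directory.
with_slash = RobotFileParser()
with_slash.parse(["User-agent: *", "Disallow: /kaifengseo/"])

# Rule WITHOUT the trailing slash: blocks every path that merely
# starts with "/kaifengseo" -- a broader match.
no_slash = RobotFileParser()
no_slash.parse(["User-agent: *", "Disallow: /kaifengseo"])

print(with_slash.can_fetch("*", "http://example.com/kaifengseo/page.html"))  # False
print(with_slash.can_fetch("*", "http://example.com/kaifengseo.html"))       # True
print(no_slash.can_fetch("*", "http://example.com/kaifengseo.html"))         # False
```

Note that only the no-slash rule also blocks /kaifengseo.html, which is exactly the extra range the text describes.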
4. Meta robots and rel="nofollow"
The robots.txt file is the first file a search engine spider fetches. Spiders will crawl or skip pages according to the rules in your site's robots.txt, but whether a page is ultimately included in the index still depends on the search engine itself.
A meta robots tag sits in the <head> of an HTML page, for example: <meta name="robots" content="nofollow">
With this tag on a page, spiders will not follow any of the links on that page. By contrast, rel="nofollow" is added to an individual <a> link and only tells spiders not to follow that one link.
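To make the difference in scope concrete, here is a small sketch (my own illustration, using only Python's standard library; the sample HTML is invented) that scans a page for a page-level meta robots nofollow and for per-link rel="nofollow" attributes:

```python
from html.parser import HTMLParser

class NofollowScanner(HTMLParser):
    """Collects the page-level meta robots flag and individually nofollowed links."""
    def __init__(self):
        super().__init__()
        self.page_nofollow = False   # set by <meta name="robots" content="...nofollow...">
        self.nofollow_links = []     # hrefs of <a ... rel="nofollow"> links

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "nofollow" in a.get("content", "").lower():
                self.page_nofollow = True
        elif tag == "a" and "nofollow" in a.get("rel", "").lower():
            self.nofollow_links.append(a.get("href"))

page_html = """<html><head><meta name="robots" content="noindex,nofollow"></head>
<body><a href="/a.html">ok</a><a href="/b.html" rel="nofollow">ad</a></body></html>"""

scanner = NofollowScanner()
scanner.feed(page_html)
print(scanner.page_nofollow)    # True: the meta tag covers every link on the page
print(scanner.nofollow_links)   # ['/b.html']: only this link is individually marked
```

The meta tag applies to the whole page at once, while rel="nofollow" has to be written on each link you want excluded.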
5. Website article link details
Most article links on the websites you look after follow the pattern domain + directory + (date) + article page, that is:
Domain name/directory/article page
Domain name/directory/time/article page
This structure is indeed clear, but if the overall framework of the website is adjusted, many of the old links become invalid, producing a large number of dead links, which hurts the user experience.
For example, the article 2.html lives in directory 1 at www.admin5.com/1/2.html; if it is moved to directory 3, the link becomes www.admin5.com/3/2.html and the old link goes dead. At this point some dedecms users will ask why they have never encountered such a problem. Blue Sky will tell you: dedecms uses a copy method, so the files in directory 1 are not lost and the old directory still displays normally, while the homepage points you to the newly updated directory. Admin5 handles this pretty well: all articles live in the /Article/ directory, and each category page simply calls the articles from that directory, so even if a category directory changes, the final link of the article does not change.
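The idea above, keeping each article's final URL independent of its category, can be sketched in Python (all names here, such as build_article_url and the slugs, are hypothetical illustrations, not dedecms or Admin5 code):

```python
# Articles live permanently under one directory; categories only *reference* them.
ARTICLE_ROOT = "/Article/"

articles = {2: "seo-details"}        # article id -> slug; this mapping never moves
categories = {"1": [2], "3": []}     # category dir -> article ids; free to change

def build_article_url(article_id: int) -> str:
    """The final article URL depends only on the article, not on any category."""
    return f"{ARTICLE_ROOT}{article_id}-{articles[article_id]}.html"

# Moving article 2 from category "1" to category "3" only edits the mapping...
categories["1"].remove(2)
categories["3"].append(2)

# ...while the article's canonical link never changes, so no dead links appear.
print(build_article_url(2))  # /Article/2-seo-details.html
```

Because category pages only reference article ids, reorganizing categories never rewrites the URLs that other sites have already linked to.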
Everyone is welcome to communicate with me. I am Excellence SEO Blue Sky, QQ 97522319, Guangzhou website construction http://www.corfu.cn. When reprinting, please keep the article information and the A5 link.
Editor in charge: Yang Yang