Search engines, whether Baidu, Google, or others, have changed their algorithms. What exactly has changed? In my view, today's search engines place far more weight on user experience. So how does a search engine test whether a website offers a good user experience? Put simply, it adjusts its algorithms and uses technology to simulate real users as closely as possible and evaluate the site automatically. This is the best explanation for why a site that was once well optimized can suddenly be "K'd" (removed from the index) or demoted for no apparent reason.
1. Too much scraped or pseudo-original content. A website must have its own original content, and most search engines today can tell whether an article was scraped or written in-house. I once copied an article from another site, swapped its paragraphs around, deleted a few passages and added a few of my own, and posted it on my website. Baidu did index it, but I later noticed the page was actually hurting my site: when I searched Baidu for a passage from my pseudo-original version (one whose order and wording I had scrambled), the result was not my page but the original article it was based on. Clearly, Baidu's algorithm can already see through most simple pseudo-originals.
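As a rough illustration (my own sketch under simple assumptions, not Baidu's actual algorithm), even a basic word-shingle comparison shows heavy overlap between an article and a version whose sentences have merely been reordered:

```python
# Rough sketch, not Baidu's actual method: word-shingle overlap shows why merely
# reordering sentences or paragraphs does not hide copied content.
def shingles(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa and sb else 0.0

original = ("aerated concrete equipment improves plant efficiency. "
            "our autoclave line cuts curing time by half.")
reshuffled = ("our autoclave line cuts curing time by half. "
              "aerated concrete equipment improves plant efficiency.")
print(similarity(original, reshuffled))  # ~0.71: still high after the swap
```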
2. Content is not updated promptly. If your website is not updated in time, or is updated but with too little fresh content, search engines will treat it as a dead or stale site, and you should not expect it to rank well.
3. Traffic is too low and the bounce rate is too high. If your website receives very little traffic in a day, it means users have not accepted it, and it will not rise in the rankings. A very high bounce rate, in turn, tells the search engine that your site is a spam site. At first I thought it impossible for Baidu to even collect such data, but I recently tried Baidu Tongji (Baidu's analytics service) and found that bounce rate is one of the metrics it reports, so it is reasonable to assume that search engines use bounce rate to judge whether users accept your site.
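For reference, bounce rate is normally computed as the share of visits that view only a single page; a minimal sketch, assuming a simple log of pages viewed per visit (the numbers are made up):

```python
# Rough sketch of the usual bounce-rate formula: visits that viewed only one page,
# divided by all visits. The sample numbers are made up.
visits = [1, 3, 1, 1, 5, 2, 1]  # hypothetical: pages viewed per visit

bounces = sum(1 for pages in visits if pages == 1)
bounce_rate = bounces / len(visits)
print(f"Bounce rate: {bounce_rate:.0%}")  # 4 bounces out of 7 visits -> 57%
```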
4. The page structure is not rigorous. A good website is consistent about which elements are always shown and which are not. If, for example, a link list that appears on every page is sometimes on the left and sometimes on the right, there is no consistency. When a search engine crawls a page it does not only grab the text; it also builds a list of the page's links, and if the position of that link list keeps changing between crawls, the engine may conclude that your site disrupts users' browsing and demote it.
5. Pages share identical meta tags. If you are not sure what a page's meta tags are, look at the HTML suggestions report in Google Webmaster Tools: it explains what the tags mean and how to use them, and it also analyzes your site for duplicated tags. Hundreds of pages must not share the same title, keywords, and description; if your website has this problem, fix it as soon as possible.
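Even without any official tool, a short script can surface duplicated titles and descriptions; a rough sketch, where the sample pages and the regular expressions are my own illustration:

```python
# Rough sketch: flag pages that share the same <title> or meta description.
# The sample pages and regexes are my own illustration, not any official tool.
import re
from collections import defaultdict

pages = {  # hypothetical: URL -> raw HTML of the page
    "/product-a.html": "<title>Aerated Concrete Equipment</title>"
                       "<meta name=\"description\" content=\"AAC equipment\">",
    "/product-b.html": "<title>Aerated Concrete Equipment</title>"
                       "<meta name=\"description\" content=\"AAC equipment\">",
}

groups = defaultdict(list)
for url, html in pages.items():
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    desc = re.search(r"<meta[^>]+name=[\"']description[\"'][^>]+content=[\"'](.*?)[\"']",
                     html, re.I)
    key = (title.group(1).strip() if title else "",
           desc.group(1).strip() if desc else "")
    groups[key].append(url)

for (title, desc), urls in groups.items():
    if len(urls) > 1:
        print(f"{len(urls)} pages share title '{title}' and description '{desc}': {urls}")
```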
6. Images have no ALT attribute, or the ALT text repeats keywords too often. When a user visits a page whose image descriptions are stuffed with repeated or irrelevant keywords, the user experience will certainly suffer.
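A quick way to spot images with no alt text is a simple scan of the HTML; a minimal sketch, with a made-up snippet as input:

```python
# Rough sketch: list <img> tags whose alt text is missing or empty (raw HTML assumed).
import re

html = '<img src="block-machine.jpg"> <img src="autoclave.jpg" alt="AAC autoclave">'  # hypothetical
for tag in re.findall(r"<img\b[^>]*>", html, re.I):
    if not re.search(r"\balt\s*=\s*[\"'][^\"']+[\"']", tag, re.I):
        print("Missing or empty alt attribute:", tag)
```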
7. Page keywords are over-optimized. If a keyword appears too densely or too often on a page, it is easy for a search engine to conclude that the page has little real content and is mostly padding, and the user experience will certainly be poor.
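Keyword density is usually expressed as occurrences of the keyword divided by the total number of words on the page; a minimal sketch for a single-word keyword (the sample text is made up):

```python
# Rough sketch: density of a single-word keyword as occurrences over total words.
# The sample text is made up for illustration.
def keyword_density(text, keyword):
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

body = "aerated concrete equipment for aerated concrete block plants"
density = keyword_density(body, "concrete")
print(f"Keyword density: {density:.1%}")  # 2 of 8 words -> 25.0%
```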
8. The website abuses techniques such as redirects. If your website uses unnecessary redirects, search engines will lower their trust in it: from the user's point of view, a page found in the search results should not bounce them somewhere else at random.
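You can see how many hops a URL actually goes through with the requests library; a minimal sketch, assuming the package is installed and using a placeholder URL:

```python
# Rough sketch: reveal a page's redirect chain with the requests library
# (third-party package, assumed installed; the URL is a placeholder).
import requests

resp = requests.get("https://www.example.com/", timeout=10)
for hop in resp.history:  # one entry per intermediate redirect
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final status:", resp.status_code, "at", resp.url)
```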
9. The website loads too slowly. Search engines now take page speed into account when crawling. If your site is too slow, users are left waiting, so avoid unnecessary Flash and JS scripts; if you do use them, make sure the pages still load quickly.
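A crude way to measure how long a page takes to download is to time the request yourself; a minimal sketch, again assuming the requests package and a placeholder URL:

```python
# Rough sketch: crude page-load timing from a client's point of view
# (requests is a third-party package, assumed installed; the URL is a placeholder).
import time
import requests

start = time.monotonic()
resp = requests.get("https://www.example.com/", timeout=30)
elapsed = time.monotonic() - start
print(f"Downloaded {len(resp.content)} bytes in {elapsed:.2f} s")
```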
10. The website's links come from a single source, or it has dead links. From a user's point of view, dead links naturally leave a bad impression; they also suggest the site is unstable, and it will naturally be demoted. A link profile that is too one-sided is abnormal as well, for example when most of a site's backlinks come from forums and blogs.
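Dead links are easy to audit with a short script; a rough sketch that checks only absolute links on a single page (requests assumed installed, URL a placeholder):

```python
# Rough sketch: flag dead links on one page (requests assumed installed;
# only absolute http(s) links are checked, which keeps the example short).
import re
import requests

page_url = "https://www.example.com/"  # placeholder starting page
html = requests.get(page_url, timeout=10).text
for link in sorted(set(re.findall(r"href=[\"'](https?://[^\"']+)[\"']", html, re.I))):
    try:
        status = requests.head(link, timeout=10, allow_redirects=True).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print("Possible dead link:", link, status)
```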
The above ten points are my personal summary of how search engines judge a website's quality by user experience. Criticisms and corrections are welcome.
This article was originally written by Aerated Concrete Equipment (www.ihuazhu.com); please credit the source when reprinting. The opinions are my own and have nothing to do with the website.
Thanks to ihuazhu9527 for his contribution