What should I do if I can't find the cause of a website demotion?

Website demotion refers to a search engine lowering its rating of a website; it is a punishment the engine imposes on the site. It is usually caused by the site's own cheating, and the site generally recovers on its own some time after the mistakes are corrected.

First, keyword rankings drop sharply.

Second, the number of indexed pages decreases.

Third, the search engine updates the site's snapshot more slowly (this does not apply to new sites).

Fourth, the website's homepage is no longer the first result for a site: query.

Fifth, the site is K'd, i.e. removed from the index entirely.

Demotion shows the five symptoms above; of these, sharply falling keyword rankings and being K'd are the most common.

Symptoms of demotion

Website demotion is a common situation in website marketing. Usually, over-optimization or an unreasonable layout leads to a site being demoted by Baidu. As a marketer, how do you check whether a site has been demoted?

1. Only the homepage of the site is indexed:

This symptom is an aggravated version of a shrinking index: when you query the site's indexed pages, only the homepage is left. Apart from new sites, whose homepage is often the only page indexed even after a period of optimization, this situation basically means the site has been K'd;

2. The site's indexed pages have disappeared completely:

Imagine a site that had pages indexed by a search engine, all of which vanish overnight. This might be a problem on the search engine's side, but it is more likely caused by something unfavorable about the site itself. Going from indexed to not indexed shows that the search engine has "initially" abandoned the site;

3. The number of indexed pages has dropped sharply:

In general, slight fluctuation in the number of indexed pages is normal, especially for medium and large sites. But when the indexed-page count plummets in an instant, that is clearly a sign of demotion;

4. Rankings for one or more keywords fluctuate sharply:

With white-hat SEO methods used consistently to optimize the site, it is rare for several keywords to drop sharply at the same time. When most of a site's keywords fall across the board simultaneously, something is clearly abnormal;

5. A site: or domain: query does not show the homepage first:

site: and domain: are advanced search-engine operators (domain: is not supported by every engine). Normally, a site: or domain: query shows the site's homepage at the top of the results. When the homepage is not first, the search engine has demoted the site. This symptom is nicknamed "the homepage sinks to the bottom" and is a warning that the site may be K'd.

Judging whether a site has been demoted should be straightforward. If most sites are behaving normally and only yours is abnormal, you should suspect a demotion. At that point, it is best to calmly analyze the site's internal and external links, find the problem, and fix it patiently.

Causes of demotion

Hosting space problems

Unstable hosting (sometimes slow, sometimes completely unreachable) is behind many demotions. The author therefore suggests never buying cheap hosting just because it is cheap; choose a reliable, fast domestic host instead, and check whether your site happened to be unreachable during the last crawl. Below, the author lists the status codes that search-engine crawlers commonly receive.

1xx (provisional response): status codes indicating a temporary response; the requester should continue the operation.

100 (Continue): the requester should continue sending the request. The server returns this code to indicate it has received the first part of the request and is waiting for the rest.

101 (Switching Protocols): the requester has asked the server to switch protocols, and the server has confirmed it is ready to switch.

2xx (success): status codes indicating the request was processed successfully.

200 (OK): the server processed the request successfully. Usually this means the server delivered the requested page.

201 (Created): the request succeeded, and the server created a new resource.

202 (Accepted): the server has accepted the request but has not yet processed it.

203 (Non-Authoritative Information): the server processed the request successfully, but the returned information may come from another source.

204 (No Content): the server processed the request successfully but returned no content.

205 (Reset Content): the server processed the request successfully but returned no content; unlike 204, this response requires the requester to reset the document view.

206 (Partial Content): the server successfully processed a partial GET request.

3xx (redirection): further action is required to complete the request. These status codes are typically used for redirection.

300 (Multiple Choices): the server can respond to the request in several ways. It may select an action based on the user agent, or offer the requester a list of actions to choose from.

301 (Moved Permanently): the requested page has been permanently moved to a new location. When the server returns this response (to a GET or HEAD request), it automatically forwards the requester to the new location.

302 (Moved Temporarily): the server is currently responding to the request with a page from a different location, but the requester should keep using the original location for future requests.

303 (See Other): the server returns this code when the requester should issue a separate GET request to a different location to retrieve the response.

304 (Not Modified): the requested page has not been modified since the last request. When the server returns this response, it does not return the page content.

305 (Use Proxy): the requester can only access the requested page through a proxy. When the server returns this response, it also indicates that the requester should use a proxy.

307 (Temporary Redirect): the server is currently responding to the request with a page from a different location, but the requester should keep using the original location for future requests.

4xx (request error): these status codes indicate that the request likely contains an error that prevents the server from processing it.

400 (Bad Request): the server does not understand the syntax of the request.

401 (Unauthorized): the request requires authentication. The server may return this response for pages that require login.

403 (Forbidden): the server refuses the request.

404 (Not Found): the server cannot find the requested page.

405 (Method Not Allowed): the method specified in the request is not allowed.

406 (Not Acceptable): the requested page cannot respond with the content characteristics requested.

407 (Proxy Authentication Required): similar to 401 (Unauthorized), but the requester must authenticate with the proxy.

408 (Request Timeout): the server timed out while waiting for the request.

409 (Conflict): the server encountered a conflict while completing the request. It must include information about the conflict in the response.

410 (Gone): the server returns this response when the requested resource has been permanently removed.

411 (Length Required): the server does not accept requests without a valid Content-Length header field.

412 (Precondition Failed): the server does not meet one of the preconditions the requester set in the request.

413 (Request Entity Too Large): the server cannot process the request because the request entity is too large for it to handle.

414 (Request-URI Too Long): the requested URI (usually a URL) is too long for the server to process.

415 (Unsupported Media Type): the requested page does not support the format of the request.

416 (Requested Range Not Satisfiable): the server returns this status code when the page cannot supply the requested range.

417 (Expectation Failed): the server cannot meet the requirements of the Expect request-header field.

5xx (server error): these status codes indicate that the server encountered an internal error while trying to process the request. The problem may lie with the server itself rather than the request.

500 (Internal Server Error): the server encountered an error and could not complete the request.

501 (Not Implemented): the server lacks the functionality to complete the request. For example, it may return this code when it does not recognize the request method.

502 (Bad Gateway): the server, acting as a gateway or proxy, received an invalid response from the upstream server.

503 (Service Unavailable): the server is currently unavailable (overloaded or down for maintenance). This is usually a temporary state.

504 (Gateway Timeout): the server, acting as a gateway or proxy, did not receive a timely response from the upstream server.

505 (HTTP Version Not Supported): the server does not support the HTTP protocol version used in the request.
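
To see which of these codes your server actually returns to a crawler, you can request a page yourself and inspect the status. The following is a minimal Python sketch, assuming the third-party requests library is installed; the URL is a placeholder and the User-Agent merely imitates a crawler.

```python
import requests

def check_status(url: str) -> None:
    # Imitate a crawler's User-Agent (illustrative string, not required).
    headers = {"User-Agent": "Mozilla/5.0 (compatible; ExampleSpider/1.0)"}
    try:
        resp = requests.get(url, headers=headers, timeout=10,
                            allow_redirects=False)
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        return
    family = {1: "informational", 2: "success", 3: "redirection",
              4: "client error", 5: "server error"}
    kind = family.get(resp.status_code // 100, "unknown")
    print(f"{url}: {resp.status_code} ({kind})")

check_status("http://www.example.com/")  # placeholder URL
```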

Causes of page demotion

1. Frequently changing page titles.

2. Not updating articles and not building external links.

3. The knock-on effect of exchanged (friendly) links.

4. The knock-on effect of one-way or two-way links (the author once added one-way links from five websites, and four of them were demoted during that period).

5. An unstable server, possibly one that has been hijacked or hacked.

6. Large numbers of near-identical pages.

7. Irregular internal linking.

8. Using mass-posting software. There is no doubt that using mass-posting software leads to demotion. Baidu is very sensitive to mass-posted links and detects practically all of them, with a claimed recognition rate of over 90%. Mass posting is therefore a capital offense, and punishment is inescapable.

Link loss

A decline in the number of external links leads to falling keyword rankings and falling site weight. One reason Baidu does not expose external-link queries is to keep the data from being manipulated. Losing up to 15% of external links is safe; beyond 15% it becomes dangerous, and Baidu calculates the punishment according to the proportion of links lost.
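
As a rough illustration of that 15% threshold, the sketch below compares two exported backlink lists and reports the share that disappeared. The file names and one-URL-per-line format are assumptions; export the real data from whatever webmaster tool you use.

```python
def load_links(path: str) -> set[str]:
    # One backlink URL per line (assumed export format).
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

def loss_ratio(old: set[str], new: set[str]) -> float:
    # Share of yesterday's links that no longer appear today.
    return len(old - new) / len(old) if old else 0.0

old_links = load_links("backlinks_yesterday.txt")  # hypothetical file names
new_links = load_links("backlinks_today.txt")
ratio = loss_ratio(old_links, new_links)
print(f"lost {ratio:.1%} of external links")
if ratio > 0.15:
    print("above the 15% threshold - investigate")
```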

Lost external links mainly take the following forms: ① forum signatures, where the account is banned by the forum and the unmaintained, deleted posts are dropped by the search engine; ② blogs shut down by their platform; ③ lost friendly links, for example when your site is temporarily unreachable and partners remove their links to you.

Purchased links

Whether links are bought openly or as hidden "black links", they carry the risk of lowering the site's authority. Sites that sell links tend to keep declining in quality, and as their number of outbound links grows, the weight passed to each buyer shrinks. Moreover, sites that buy links instead of optimizing properly are easily punished by search engines, and the penalty implicates the sites they link to.

Spam links

To some extent this also falls under link loss. Search engines at first count spam links as valid, so for a while they carry value. Once Baidu identifies them as spam, it strips their weight, and may even deduct more weight than they originally contributed. That is their biggest impact.

Demotion by association

If you link to a website and that site is demoted, your own site will be demoted by association. It is therefore essential to check your outbound links once a day. The criteria for spotting a demoted partner are: whether its keyword rankings have dropped, whether its brand terms and unique terms have dropped, and whether its snapshot has regressed sharply at the same time. Once you find one, remove the link immediately.
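
The reachability half of that daily check is easy to script; the ranking and snapshot checks still have to be done by hand. A minimal sketch, assuming the requests library; the partner URLs and domain below are placeholders.

```python
import requests

MY_DOMAIN = "example.com"  # placeholder: your own domain
PARTNERS = [               # placeholder: your exchanged-link partners
    "http://partner-one.example/",
    "http://partner-two.example/",
]

for url in PARTNERS:
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        print(f"{url}: unreachable ({exc})")
        continue
    if resp.status_code != 200:
        print(f"{url}: HTTP {resp.status_code} - check this partner")
    elif MY_DOMAIN not in resp.text:
        print(f"{url}: no link back to {MY_DOMAIN} found")
    else:
        print(f"{url}: ok")
```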

Black-hat SEO

Black-hat methods yield only short-term gains, and after a while the site never recovers. Tricks such as 301-redirect cheating are typical; once detected, the rankings are removed outright. There is also on-site over-optimization in the black-hat style: keyword stuffing, too many internal anchor-text links, and too many links inside articles.

Excessive advertising

Plastering a site with ads badly hurts its quality, especially ads unrelated to the content, which seriously damage the search engine's trust in the site. Pop-up ads are especially harmful and severely degrade page quality.

Manipulating keyword density

Some webmasters deliberately raise keyword frequency to push up density, or to chase the so-called 2%-8% keyword-density standard. This seriously hurts readability and is easily judged as cheating by search engines. In fact, raising keyword density naturally is simple: breadcrumb navigation alone will increase it.
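
For reference, keyword density is just occurrences of the keyword divided by the total word count. A minimal word-based sketch (Chinese text would need a segmenter first); the sample text is invented.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    # Count whole-word occurrences of the keyword relative to all words.
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

page_text = "seo tips and seo tools for better seo"  # stand-in content
print(f"density: {keyword_density(page_text, 'seo'):.1%}")
# 3 of 8 words -> 37.5%, far above the 8% ceiling quoted above
```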

Frequently changing titles

Changing the title is what new sites get punished for most heavily. It leads to two phenomena: first, delayed snapshot release, i.e. pages indexed today show an older snapshot date; second, if the title keeps being modified, even the homepage can be K'd, which is a truly heavy punishment.

Growing high-weight links too fast

For a new site, the homepage often gets demoted or even K'd because too many friendly links, especially high-weight ones, were added in a short time. This exceeds the normal pace of development for a new site, is treated as cheating, and leads to punishment. Growing the external-link count at a moderate pace is therefore very important; it is recommended to add no more than 8 friendly links in the first month.

Too high a page-repetition rate

If many pages with duplicate titles are published rather than deleted, and they all get indexed, the site will be punished. Content repetition is the other problem: on many product and enterprise sites, the inner product pages are basically identical apart from the pictures and a few product parameters, and Baidu punishes such repeated content heavily. The similarity is simply too high; when many pages are the same, they get punished.
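
One rough way to quantify "similarity too high" is Jaccard similarity over word shingles, as in the sketch below. The 0.8 threshold is an assumption for illustration, not a published Baidu figure.

```python
def shingles(text: str, n: int = 3) -> set[tuple[str, ...]]:
    # Break the text into overlapping word n-grams.
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str) -> float:
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two invented product pages that differ only in the model number.
page_a = "red widget model 100 durable steel widget for factories"
page_b = "red widget model 200 durable steel widget for factories"
sim = jaccard(page_a, page_b)
print(f"similarity: {sim:.0%}")  # about 40% even for this small change
if sim > 0.8:  # assumed threshold
    print("pages are near-duplicates - differentiate the content")
```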

Internal-link cheating

For example, inserting many keywords into body text that all point to the homepage is best avoided. Many sites also add a handful of keyword anchor texts at the bottom of the page pointing to the homepage. These practices are taboo to search engines.

Server instability

There are countless examples of demotion caused by server instability. The way to judge hosting stability is to inspect the IIS logs. For example, if many 503s appear, check which IPs triggered them: if those 503s were not served to search-engine crawlers, there is no problem, because what decides how stable the site looks is the crawler's experience. Do not judge the site from your own network; look at the IIS logs and check whether crawler visits produced status codes such as 503 or 404.
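
That log check can be scripted. The sketch below scans an access log for crawler requests that received 5xx or 404 responses; it assumes an nginx/Apache combined log format, so for IIS W3C logs the field positions differ, but the idea is the same. The file name and crawler list are placeholders.

```python
import re

# Capture the request path, status code, and User-Agent from a
# combined-format log line.
LINE = re.compile(r'"\w+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
                  r'"[^"]*" "(?P<ua>[^"]*)"')
CRAWLERS = ("Baiduspider", "Googlebot", "bingbot")  # adjust as needed

def crawler_errors(log_path: str):
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = LINE.search(line)
            if not m:
                continue
            status, ua = int(m.group("status")), m.group("ua")
            if (status == 404 or status >= 500) and any(c in ua for c in CRAWLERS):
                yield m.group("path"), status, ua

for path, status, ua in crawler_errors("access.log"):  # hypothetical file
    print(f"{status} {path} ({ua[:40]})")
```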

Too many dead links

Deleting existing pages creates dead links. The most accurate way to find them is to read the logs, judging from the spider visits and the robots entries. Too many dead links reduce the search engine's trust in the site, raise the users' bounce rate, and lead to demotion.
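
A minimal sketch for finding dead links from the outside: fetch one page, extract its links, and report any that return 404. A real site needs a full crawl with politeness delays; this shows only the mechanics, and the start URL is a placeholder.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
import requests

class LinkCollector(HTMLParser):
    # Collect the href of every <a> tag on the page.
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

start = "http://www.example.com/"  # placeholder site
collector = LinkCollector()
collector.feed(requests.get(start, timeout=10).text)

for href in collector.links:
    url = urljoin(start, href)
    try:
        r = requests.head(url, timeout=10, allow_redirects=True)
        if r.status_code == 404:
            print(f"dead link: {url}")
    except requests.RequestException:
        print(f"unreachable: {url}")
```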

Site redesigns

A redesign changes the content, shifting keyword density on a large scale. If the site switches programs outright, the URL paths change as well: delete the old links and you create dead links; keep them and you create duplicate pages, and the site gets demoted either way. Only by setting up a 301 redirect for every old path can the impact be reduced, and even then it is hard to rank within 3 months of a redesign and program change.
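
Once the old-to-new mapping exists, it is worth verifying that every old path really answers with a 301 to the right target. A minimal sketch, assuming the requests library; the mapping entries are hypothetical examples.

```python
import requests

REDIRECT_MAP = {  # old URL -> expected new URL (hypothetical examples)
    "http://www.example.com/old-news.html": "http://www.example.com/news/",
    "http://www.example.com/old-about.html": "http://www.example.com/about/",
}

for old, new in REDIRECT_MAP.items():
    try:
        r = requests.get(old, timeout=10, allow_redirects=False)
    except requests.RequestException as exc:
        print(f"{old}: request failed ({exc})")
        continue
    location = r.headers.get("Location", "")
    if r.status_code == 301 and location == new:
        print(f"ok: {old} -> {new}")
    else:
        print(f"problem: {old} returned {r.status_code}, Location={location!r}")
```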

Opening too many subdomains

Beyond about five, search engines will suspect cheating. When small and medium-sized sites open too many subdomains, keyword relevance declines and the site's indexed pages may be removed.

Generating duplicate pages and pages without content

For example, print pages, session IDs, profile pages, comment pages and the like, each given an independent dynamic address, will lead to the site being demoted.

Website malware (poisoning)

A site that serves malware does great harm to users. Once a search engine recognizes it, it demotes the site outright: such a site damages every user's interests, and search engines certainly will not let that pass.

Over-optimized page elements

For example, ALT cheating on images: stuffing too many keywords into image attributes gets the page classed as cheating by the search engine. So add descriptions in moderation. The same goes for bolding keywords everywhere and abusing H tags; these factors all lead to demotion.

Declining page quality

For example, pages that open too slowly because of too many JS calls: customer-service widgets, webmaster statistics, weather forecasts and so on, all piled onto the page. These seriously hurt the user experience and are all factors that lead to demotion.

Solutions

One: check robots.txt. Robots.txt is the first file a search engine reads when it crawls a website. When you find the site has been punished, the first thing to do is check the robots.txt file. Inspect it carefully by hand, and also verify it with webmaster tools to make sure no erroneous rule is blocking search engines from indexing certain pages or directories.
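
The standard library can do a quick version of this check. A minimal sketch using urllib.robotparser; the site URL and test paths are placeholders.

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")  # placeholder site
rp.read()

# Test whether key paths are fetchable for a given crawler name.
for path in ["/", "/news/", "/products/item-1.html"]:
    url = "http://www.example.com" + path
    allowed = rp.can_fetch("Baiduspider", url)
    print(f"{'allowed' if allowed else 'BLOCKED'}: {url}")
```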

Two: check the websites sharing your IP. Search engines do not strictly punish sites merely for sharing a server, but if you are unlucky enough to share one with many junk or even illegal sites, the odds of being punished are high. Use webmaster tools to list the sites on the same IP, spot-check their indexing and keyword rankings, and if most of them have problems, switch hosts as soon as possible.
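
For domains you already know about, you can at least confirm which ones resolve to the same address. A minimal sketch; a full reverse-IP lookup needs an external service, and the domain list here is a placeholder.

```python
import socket
from collections import defaultdict

domains = ["www.example.com", "www.example.org", "www.example.net"]  # placeholders
by_ip = defaultdict(list)

for d in domains:
    try:
        by_ip[socket.gethostbyname(d)].append(d)
    except socket.gaierror:
        print(f"could not resolve {d}")

for ip, names in by_ip.items():
    print(ip, "->", ", ".join(names))  # domains sharing an IP share a line
```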

Three: check whether the site uses redirect code. Apart from 301 redirects, other kinds such as JS redirects may be judged as cheating. Even redirects added carelessly or out of necessity will be punished by search engines without mercy. So if the site carries many such redirects, remove them as soon as possible.
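
A quick way to spot non-301 redirects is to request each URL without following redirects and flag anything else, as below. The substring tests for meta-refresh and JS redirects are crude heuristics, and the URLs are placeholders.

```python
import requests

for url in ["http://www.example.com/a", "http://www.example.com/b"]:  # placeholders
    try:
        r = requests.get(url, timeout=10, allow_redirects=False)
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue
    body = r.text.lower()
    if 300 <= r.status_code < 400 and r.status_code != 301:
        print(f"{url}: {r.status_code} redirect - consider a 301 instead")
    elif 'http-equiv="refresh"' in body or "window.location" in body:
        print(f"{url}: possible meta/JS redirect in the page body")
    else:
        print(f"{url}: ok ({r.status_code})")
```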

Four: checking the page's meta tags is also important. The meta tags may have been tampered with by competitors who hacked the site, or botched by your own technicians: a noindex/nofollow directive added to the meta tags, a blocking rule added to robots.txt, or an unintended 301 redirect.
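
The meta part of that audit can be automated by parsing the page's head. A minimal sketch using the standard-library HTML parser; the URL is a placeholder.

```python
from html.parser import HTMLParser
import requests

class MetaRobots(HTMLParser):
    # Collect the content of every <meta name="robots"> tag.
    def __init__(self):
        super().__init__()
        self.directives = []
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", ""))

resp = requests.get("http://www.example.com/", timeout=10)  # placeholder
parser = MetaRobots()
parser.feed(resp.text)

for content in parser.directives:
    if "noindex" in content.lower() or "nofollow" in content.lower():
        print(f"warning: robots meta tag says {content!r}")
```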

Five: check whether the site is suspected of over-optimization. When you are punished, look for keywords stuffed into the page, too many internal links, and footers crammed with text that is meaningless to users. If the anchor text set up for optimization is itself over-optimized, improve it resolutely and without hesitation to dial the optimization back. Moderation is the rule in everything, and knowing how far to optimize is a skill every SEO practitioner must master.

Six: stay calm and do not blindly rework the site. If you are sure your site is not cheating, keep calm when keyword rankings and indexing start to slide; do not rush to modify the site. Observe for a few days or weeks first. A ranking drop is not necessarily caused by problems with your own site; it may well be caused by a change in the search engine's algorithm. Search engines roll out new algorithms constantly, and if a new algorithm turns out to hurt the user experience it may well be rolled back, in which case hasty changes would have been effort wasted.

The above six methods [1] can basically resolve a demotion; the key is to apply them conscientiously.