The site quality index is an indicator of how useful your site is to users, as judged by Yandex. When calculating it, Yandex takes into account the size of the site's audience, behavioral factors, and data from Yandex services. The index value is updated regularly.
If the site has mirrors, the index of each mirror equals the index of the main mirror.
The ICS (site quality index) of a subdomain is usually equal to the index of the main domain.
Badges based on user behavior data may appear next to the site address in Yandex search results. Such badges can indicate user satisfaction and trust in the site.
Popular site — awarded to sites with high traffic and a regular audience.
Choice of users — awarded to sites with a high degree of engagement and user loyalty, according to Yandex.
Indexing is the process by which a search engine robot adds data from a website to its database. Indexed data is used for search and is shown by the search engine in the results returned for a user's query.
The more pages of your site are indexed, the better your chance of attracting visitors via search engines.
Create XML and HTML sitemaps reachable one click from the home page, and add articles on a range of subjects to speed up page indexing.
Your website is all right.
To avoid search engine filters, fill your website with high-quality, useful content.
AGS is a filter used by the Yandex search engine to detect sites with useless content, usually created for selling links; Yandex blacklists such sites using the AGS algorithm.
Instead of excluding such sites from search, Yandex now resets their TIC to zero. This change also applies to all sites previously flagged by AGS. Links from such sites are not counted in ranking, and the sites themselves rank lower.
Since 2009, Roskomnadzor has been controlling the dissemination of information on the Internet. For this purpose, in 2012 the agency created a registry of banned sites, which is updated daily. The first to be blocked are sites with prohibited content: calls for violence, racial or religious hatred, or pornography. Roskomnadzor may also block a site for less serious violations, for example, of Federal Law 152 «On Personal Data».
To remove the block, delete the materials that caused it, then write a letter to the address: email@example.com.
Infection usually occurs through a vulnerability that lets a hacker take control of the site. The hacker can change site content (e.g., add spam) or create new pages. The usual aim is phishing: fraudulently obtaining personal data and credit card details. Hackers can also embed malicious code, such as scripts or iframes, that pulls content from another site to attack the computers of visitors browsing the page.
The site is safe.
Google crawls sites to find infected resources, phishing pages, and other issues that degrade the quality of search results and the user experience. Using this information, the search engine warns users about unsafe sites. If a site is deemed dangerous, Google may lower it in the search results or remove it from the index.
|Index||Indexing date|
Statistics systems on the site track traffic, bounces, browsing depth, and many other metrics. They help you measure the effectiveness of promotion and advertising campaigns.
Yandex Metrica and Google Analytics are popular free analytics services. They provide all the necessary reports on visits to your site and help pages get indexed faster by search engines.
We found 1 IP address linked to the site.
The title tag holds the page title and is a key element of the site's SEO structure. The title written in the title tag appears in search engine results.
The title text must be informative, unique, and between 10 and 70 characters long.
The page description lives in the description meta tag; each page has its own. The description matters because search engines may use it to build snippets, so it influences how the page appears and performs in search results.
Write a description of 70 to 160 characters (including spaces) for each page. Use keywords that best reflect the text's main point, make the text unique, and put the most important keywords at the beginning of the description.
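As a rough sanity check for the limits above, a short script can flag titles and descriptions that fall outside the recommended ranges (the function name and sample strings are ours, for illustration only):

```python
def check_meta(title: str, description: str) -> list[str]:
    """Flag title/description lengths outside the recommended ranges:
    title 10-70 chars, description 70-160 chars (spaces included)."""
    problems = []
    if not 10 <= len(title) <= 70:
        problems.append(f"title is {len(title)} chars (want 10-70)")
    if not 70 <= len(description) <= 160:
        problems.append(f"description is {len(description)} chars (want 70-160)")
    return problems

print(check_meta("Hi", "Too short."))  # both fields flagged
print(check_meta("SEO audit checklist for small sites",
                 "A practical walkthrough of on-page checks: titles, "
                 "descriptions, headings, and load-time budgets."))  # []
```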
Heading tags (h1-h6) indicate the importance of the text that follows each heading. With them you can structure your text with subheadings, which makes it easier to read and helps when promoting your site.
The most important tag is h1, the main heading, which should be placed at the top of the page. Do not add more than one h1 tag: the crawler may misinterpret the markup and drop important information.
Use h2-h6 subheadings as often and wherever you need. Proper use of heading tags helps stimulate traffic growth. There is no need to put entire passages inside heading tags: search engines read only the first few words of a heading, and the rest is cut off.
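A quick way to verify the single-h1 rule is to count heading tags with Python's standard html.parser (the sample HTML below is illustrative):

```python
from html.parser import HTMLParser

class HeadingCounter(HTMLParser):
    """Count occurrences of each h1-h6 tag in an HTML document."""
    def __init__(self):
        super().__init__()
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.counts[tag] = self.counts.get(tag, 0) + 1

page = "<html><body><h1>Main</h1><h2>A</h2><h2>B</h2></body></html>"
counter = HeadingCounter()
counter.feed(page)
print(counter.counts)                    # {'h1': 1, 'h2': 2}
print(counter.counts.get("h1", 0) == 1)  # exactly one h1: True
```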
The text on a page shouldn't be too short; otherwise there won't be enough room for keywords. But it shouldn't be too long either: the article becomes "diluted" in the eyes of search engines, and keywords get lost in a long text.
The optimal text size is about 1000-2000 words for two or three promoted keywords or phrases. Of course, it is not always possible to stay within these limits. Such a size suits not only search engines but also visitors: people dislike reading very long texts.
"Nausea" is a text quality metric that reflects how often the same words repeat in a document. "Academic nausea" is the ratio of repeated words to the total volume of the text.
Texts with a high nausea level (above 8%) are of low quality: they read as spammed and have poor readability, which drives visitors away. When search engines detect them, the site's trust can be reduced, or the site may even be banned. A very low nausea level won't help promotion either.
When writing, keep the nausea level below 8-9%, but do not aim for zero. The ideal level is 4-6%; almost all classical literary texts fall in this range.
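The exact nausea formula differs between SEO services, so as a hedged sketch, the snippet below measures the share of the most frequent word — a simple proxy for keyword density, not any service's official algorithm:

```python
import re
from collections import Counter

def keyword_density(text: str) -> float:
    """Share (in %) of the most frequent word in the text.
    A rough stand-in for the 'nausea' metric; exact formulas
    vary between SEO services."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    top_count = Counter(words).most_common(1)[0][1]
    return 100.0 * top_count / len(words)

sample = "seo text about seo and seo tools for seo audits"
print(round(keyword_density(sample), 1))  # 4 of 10 words -> 40.0
```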
The page size shouldn't exceed 300 KB; reduce it within reasonable limits by removing uninformative content. The optimal document size is up to 100 KB.
66 resources loaded.
|URL||Response code||Resource type||MIME type||Resource size (compressed)|
We count all the elements of the page: images, videos, scripts, and more. According to Google's recommendations, for the page to load quickly their total weight should not exceed 1600 KB. Optimize your resource sizes: use text compression, minify HTML, JS, and CSS, use WebP instead of JPEG, and enable data caching.
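The 1600 KB budget is easy to track by summing resource sizes; the file names and sizes below are made-up values for illustration:

```python
KB = 1024
BUDGET = 1600 * KB  # recommended total page weight

# Hypothetical resource sizes in bytes (illustrative values only).
resources = {
    "index.html": 42 * KB,
    "app.js": 310 * KB,
    "styles.css": 88 * KB,
    "hero.webp": 450 * KB,
}

total = sum(resources.values())
print(f"total: {total // KB} KB, budget: {BUDGET // KB} KB")
print("within budget" if total <= BUDGET else "over budget")
```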
For search bots to index a page successfully, the server's HTTP response code must be 200.
Loading speed directly affects user behavior factors: reducing load time lowers bounce rates. Cutting load time by one second can raise conversion by about two percent (the relationship is non-linear), while a load time of 7 seconds or more increases the bounce rate by about 30%.
The Yandex robot visits slow sites less often, which hurts promotion: such sites are indexed more rarely. Server response time also affects how pages rank for queries.
An external link means you refer to an outside resource. Try not to link to poor resources that carry false information and may harm users. Having many outgoing links on your site is also undesirable; link only to authoritative resources.
Do not place outgoing links on the home page. Selling links hurts promotion.
The W3C validator is a service that tests web pages against several standards at once; in particular, you can check whether your page conforms to the HTML or XHTML format.
Validation helps you avoid small bugs such as missing brackets or quotes and incorrectly nested tags. Modern browsers follow W3C standards, so valid code renders more accurately; it is also easier to interpret and process, and it is the best guarantee of compatibility with current and future browser versions.
The site does not yet have user ratings in the Web of Trust (WOT) service.
The trust level in the Web of Trust (WOT) service is a site rating given by users of the service who have installed its browser extension. The principle is simple: some users rate a site, and others use those ratings to decide whether to visit it. WOT is completely free.
The extension is available for the most popular browsers. After installing it, you will see an indicator next to site addresses in search results: a green circle means the site is safe to browse, yellow means you should browse the site with caution, and a red circle means the site contains malicious content and is dangerous to visit.
Schema.org is a single universal standard recognized by the most popular search engines: Google, Yandex, Yahoo, and Bing.
Microdata is semantic markup of a site's pages aimed at structuring data; it works by adding special attributes to the document's HTML code.
Pros of microdata:
The markup lives directly in the page's HTML via special attributes and does not require a separate export file.
Open Graph was developed by Facebook so that links to sites are displayed attractively and informatively within the social network. Open Graph is supported by many social networks (Facebook, Twitter, Google+, VK, Odnoklassniki) and by messengers such as Telegram and Skype.
Why use Open Graph?
To get an attractive site snippet, insert Open Graph meta tags into the page's <head> tag.
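As a minimal sketch, the core Open Graph tags for a page's <head> can be generated like this (the helper function and sample values are hypothetical, not a standard API):

```python
import html

def og_tags(title: str, description: str, image: str, url: str) -> str:
    """Build the basic Open Graph meta tags for a page's <head>."""
    props = {
        "og:title": title,
        "og:description": description,
        "og:image": image,
        "og:url": url,
        "og:type": "website",
    }
    return "\n".join(
        f'<meta property="{prop}" content="{html.escape(value)}" />'
        for prop, value in props.items()
    )

print(og_tags("Example", "A demo page",
              "https://example.com/cover.png", "https://example.com/"))
```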
Search engines take into account the country where the server is located. Ideally, the server is located in the same country as your target audience.
Register domains in other popular domain zones for the convenience and protection of the brand from cybersquatters.
If www.maxparlay.website and maxparlay.website operate separately, without redirects, search engines may glue these two copies together, which negatively affects search optimization.
Due to incorrect encoding, site content may not display correctly. Besides annoying visitors, such a site may not be indexed or may fall under a search engine filter. We recommend using the UTF-8 encoding so that the text on the site's pages displays correctly. Some CMSs, WordPress for example, write files in this encoding, and AJAX also supports only UTF-8.
Do not forget to specify the encoding in meta tags: <meta charset="UTF-8" />
Google Font API
The robots.txt file is a list of restrictions for search robots or bots that visit the site and crawl information on it. Before crawling and indexing your site, all robots access the robots.txt file and look for the rules.
The robots.txt file is located in the root directory of the site. It must be accessible at the URL: maxparlay.website/robots.txt
There are several reasons to use the robots.txt file on the site:
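Python's standard urllib.robotparser checks these rules the same way a crawler does; the robots.txt content below is a hypothetical example, not the site's actual file:

```python
from urllib import robotparser

# Hypothetical robots.txt content for illustration.
rules = """\
User-agent: *
Disallow: /admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Paths with no matching Disallow rule are allowed by default.
print(rp.can_fetch("*", "https://maxparlay.website/"))              # True
print(rp.can_fetch("*", "https://maxparlay.website/admin/page"))    # False
```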
A sitemap is a file with information about site pages to be indexed. With this file you can:
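A minimal sitemap.xml can be generated with the standard library; the URL list here is illustrative:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal sitemap document (loc entries only) from page URLs."""
    ET.register_namespace("", NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for u in urls:
        page = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(page, f"{{{NS}}}loc").text = u
    return ET.tostring(urlset, encoding="unicode")

doc = build_sitemap(["https://maxparlay.website/",
                     "https://maxparlay.website/about"])
print(doc)
```

In a real deployment, prepend the XML declaration and save the result as sitemap.xml in the site root.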
Use a favicon to make your site stand out. A favicon is a small image in a special format, displayed next to your site address in search results and in the browser address bar.
Place the favicon in the root folder of your site so browsers can display it. You can also assign a specific favicon to each page.
When a nonexistent page is requested, the server should return error 404, «page not found». This response code tells browsers and search engines that the page does not exist.
If the server is misconfigured, it returns code 200, meaning the page exists. Search engines will then index all such error pages on your site.
Configure your site so that requests for nonexistent pages return response code 404 (page not found) or response code 410 (page deleted).
By default, the server displays a standard page for a 404 error. For users' convenience, it is recommended to create a custom 404 page and to add a link back to the site.
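The 200-vs-404 behavior can be demonstrated with a tiny server from Python's standard library; this is a self-contained sketch, not a production server configuration:

```python
import http.server
import threading
import urllib.error
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    PAGES = {"/": b"<html><body>home</body></html>"}  # pages that "exist"

    def do_GET(self):
        body = self.PAGES.get(self.path)
        if body is None:
            self.send_error(404)  # missing page -> 404, never 200
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def status(path):
    """Return the HTTP status code for a path on the demo server."""
    try:
        with urllib.request.urlopen(f"http://127.0.0.1:{port}{path}") as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

home, missing = status("/"), status("/no-such-page")
print(home, missing)  # 200 404
server.shutdown()
```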
Thanks to caching, returning visitors spend less time loading pages. Caching headers should apply to all cacheable static resources.
Enable browser caching on your server. Static resources should be stored in the cache for at least a week; external resources (ads, widgets, etc.) for at least one day.
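The corresponding Cache-Control values are simple to compute; this helper is a sketch, not any particular server's API:

```python
def cache_control(days: int, public: bool = True) -> str:
    """Build a Cache-Control header value with a max-age given in days."""
    scope = "public" if public else "private"
    return f"{scope}, max-age={days * 24 * 60 * 60}"

print(cache_control(7))  # static assets: cache for a week
print(cache_control(1))  # external widgets/ads: cache for a day
```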
The server response time determines how long it takes to receive the HTML code needed to display the page.
Reduce the server response time to no more than 200 ms. A long response time can be due to dozens of factors: application logic, a slow database, routing, the software platform, libraries, or a lack of processing power or memory. All of these should be taken into account during optimization.
Many web servers can compress files in the GZIP format before sending them, using a built-in routine or third-party modules. This speeds up loading of the resources needed to display the site.
Compressing resources with gzip or deflate reduces the amount of data transferred over the network.
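The effect is easy to see with Python's built-in gzip module on a repetitive HTML payload:

```python
import gzip

# Repetitive markup compresses very well, like most HTML/CSS/JS.
payload = b"<html><body>" + b"<p>repetitive content</p>" * 200 + b"</body></html>"
compressed = gzip.compress(payload)

print(f"original:   {len(payload)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"saved:      {100 - 100 * len(compressed) // len(payload)}%")
```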
Try to reduce image sizes to a minimum: this speeds up resource loading. Choosing the right format and compression reduces image file sizes, saving users both time and bandwidth.
Apply both basic and advanced optimization to all images. Basic optimization includes cropping unnecessary margins, reducing color depth to the maximum acceptable level, removing comments, and saving the image in a suitable format. It can be done with any image-editing program.
Postpone loading of unused CSS styles to reduce their size by 631.65 KB (97%). Found 3 resources:
|URL||Size||Unused size||Unused %|
|440 KB||427.35 KB||97.12|
|152.02 KB||149.21 KB||98.16|
|55.37 KB||55.09 KB||99.49|
Truncated to 5 lines.
By default, the browser must load, parse, and process all the stylesheets it encounters before it can display content on the user's screen. Every external CSS file must be downloaded; these extra network requests delay the display of content.
Unused CSS also slows down the display of content. To style all the elements on the page, the browser must look at the entire tree of HTML tags and check which CSS rules apply to each node. The more unused CSS, the more time the browser may need to style the elements on the page.
The optimal approach is to inline critical CSS rules in the HTML <head>. Once the HTML is loaded, the browser has everything it needs to display the page, with no further network requests.
Solving this problem is complex, so it is not considered critical.