The site quality index is an indicator of how useful your website is to users from Yandex's point of view.
When calculating the quality index, Yandex takes into account the size of the website's audience, behavioral factors, and data from Yandex services. The value of the index is updated regularly.
If the site has mirrors, the quality index of each mirror will equal that of the main mirror.
The quality index of a subdomain, as a rule, equals that of the primary domain.
Next to a website's address in Yandex search results, badges based on user behavior data may appear. Such badges can indicate user satisfaction and trust in the site.
A website receives the "Popular site" badge if it has high traffic and a dedicated audience.
The "Users' choice" badge is given to websites with a high degree of user engagement and loyalty according to Yandex data.
Websites are added to the Yandex Directory after the corresponding application is submitted and approved by a moderator.
Many webmasters believe that Yandex ranks websites listed in the Yandex Directory higher than others for certain queries, all other things being equal, because information from the Yandex Directory is used to determine some ranking factors.
Your website is all right.
To avoid falling under filters, fill your website with high-quality and useful information.
AGS is a filter used by the Yandex search engine. It detects sites with useless content that are usually created for selling links; Yandex adds such sites to a blacklist with the help of the AGS algorithm.
Instead of being excluded from search, such sites now receive a TIC of zero. This change also applies to all sites detected by AGS earlier. Links from such sites are not taken into account in ranking, and the sites themselves are ranked lower.
Since 2009, Roskomnadzor has monitored the dissemination of information on the Internet. For this purpose, a register of banned sites was created in 2012 and is updated daily. The first to be blocked are sites with prohibited content: incitement to violence, hatred on racial or religious grounds, or pornography. Roskomnadzor may also block a site for less serious violations, for example under Federal Law No. 152 "On Personal Data".
To remove the block, you need to remove the content for which the site was blocked, and then write a letter to the address: firstname.lastname@example.org.
Infection usually occurs through a vulnerability that lets a hacker gain control over the site. The hacker can change site content (e.g., add spam) or create new pages. The usual aim is phishing, that is, fraudulently obtaining personal data and credit card information. Hackers can also embed malicious code, for example scripts or iframes, that pulls content from another site to attack the computers of visitors browsing the page.
The site is secure.
Google scans sites to locate infected resources, phishing pages, and other problems that degrade the quality of search results and the user experience. With this information, the search engine warns users about unsafe sites. If a website is deemed dangerous, Google can demote it in the results or remove it.
Place a quality-index button on your website. The button shows the current value of the index and lets you quickly start analyzing the website.
|Links to sites|
Position in the world: 224,442,262
Position in the country (India): 34,718,697
Alexa builds its site ranking on the basis of data sent to a central server by users who have installed its browser plug-in. For Runet the resulting sample is very small, so the data is inaccurate. The rating is calculated only for second-level domains: if you have a blog on a blogging platform, you will see information about the entire platform, not about your blog.
Web statistics systems take into account traffic, bounces, browsing depth, and many other parameters. They help to monitor the effectiveness of promotion and advertising campaigns.
Yandex Metrica and Google Analytics are popular free analytics services. They provide all the necessary reports about visits to your site and help pages get indexed by search engines sooner.
The title tag is the page header and a key element of the site's SEO structure. The title written in the title tag appears in search engine results.
The header text must be informative, unique, and between 10 and 70 characters long.
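For example, a title within the recommended length might look like this (the page and wording are hypothetical):
<head>
  <!-- Page title shown in search results; kept within roughly 10-70 characters -->
  <title>Fresh Flower Delivery in Boston - Same-Day Bouquets</title>
</head>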
The page description is located in the description meta tag. Each page has its own description. The description is important because a search engine may use it to build the snippet, so it influences how the page performs in search results.
Write a description of 70 to 160 characters (including spaces) for each page. Use keywords that best reflect the text's main point. Make the text unique. Put the most important keywords at the beginning of the description.
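A sketch of a description meta tag within the recommended length (the wording is a hypothetical illustration):
<head>
  <!-- Unique description of roughly 70-160 characters, main keywords near the start -->
  <meta name="description" content="Same-day flower delivery in Boston: fresh bouquets, free delivery on orders over $50, order online in two minutes.">
</head>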
Page headers (tags h1-h6) show the importance of the text that follows each heading. With them you can structure your text with subheadings, which makes it look more organized when promoting your site.
The most important tag is h1, the main header, which should be placed at the top of the page. Do not add more than one h1 tag: the crawler may interpret the tags incorrectly and drop important information.
Use subheadings h2-h6 as much as you want and wherever you want. Proper use of header tags helps stimulate traffic growth. There is no need to put the entire text into header tags, because the search engine sees only the first few words of a header; the rest may be cut off.
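A sketch of a heading structure that follows these rules (the topics are hypothetical):
<h1>Flower Delivery in Boston</h1>      <!-- single h1 at the top of the page -->
<h2>Bouquets for Any Occasion</h2>      <!-- h2-h6 subheadings structure the text -->
<h3>Birthday Bouquets</h3>
<h3>Wedding Bouquets</h3>
<h2>Delivery Terms</h2>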
The text on a page shouldn't be too short; otherwise there won't be enough keywords. But it shouldn't be too long either: in that case the article becomes "diluted" in the eyes of search engines and the keywords get lost in a long text.
The optimal text size is about 1,000-2,000 words for two or three promoted keywords or phrases. Of course, it is not always possible to stay within these limits. Such a text size suits not only search engines but also visitors: people don't like reading very long texts or thousands of words on a page.
Text "nausea" (keyword repetition) is one of the indicators of text quality: it reflects how often the same words are repeated in a text document. Academic nausea is equal to the proportion of repeated words relative to the entire volume of the text.
Texts with a high nausea level (above 8%) are of low quality. They are considered spammy and have poor readability, which will definitely drive visitors away. When search engines detect them, the website's trust can be lowered or the website can even be banned. A very low nausea level won't help site promotion either.
When writing the text, do not let the nausea level exceed 8-9%, but do not aim for zero either. The ideal level is 4-6%; almost all classical literature has a similar level.
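A quick worked example with made-up numbers: if a 1,000-word article repeats its key word and its forms 50 times in total, the academic nausea is 50 / 1,000 = 5%, which falls inside the recommended 4-6% range.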
The page size shouldn't exceed 300 KB; reduce it within reasonable limits by removing non-essential content. The optimal document size is up to 100 KB.
52 resources were downloaded.
|URL||Response code||Resource type||MIME type||Resource size (compressed)|
All page elements are taken into account: images, videos, scripts, and so on. For the page to load quickly, Google recommends that their total weight not exceed 1,600 KB. Optimize resources: compress text resources, minify HTML, JS, and CSS, use WebP instead of JPEG, and enable data caching.
Loading speed directly affects user behavior factors. Reducing load time lowers bounce rates, and cutting load time by one second increases conversion by about two percent (the dependence is non-linear). Increasing loading time to 7 seconds raises the bounce rate by about 30%; anything that takes 7 seconds or more to load drives bounce rates up.
The Yandex robot visits slow sites less often, which hurts site promotion: such a website is indexed more rarely. Server response time also affects ranking for queries.
An external link means that you refer to an outside resource. Try not to link to bad resources that contain false information and may harm the user. Having many outgoing links on your website is also not good. Refer only to authoritative resources.
Do not put outgoing links on the homepage. Selling links hinders promotion.
The W3C validator is a service that lets you test web pages against several standards at the same time; specifically, you can check whether your page conforms to the HTML or XHTML format.
The test helps you avoid small bugs such as missing brackets and quotes or incorrectly nested tags. Modern browsers follow W3C standards, which affects how accurately a page is displayed. Valid code is easier to interpret and process, and it helps ensure compatibility with existing and future browser versions.
The website does not yet have user ratings in the Web of Trust (WOT) service.
The trust level in the Web of Trust (WOT) service shows the site rating given by users who have installed the WOT extension in their browser. The principle is simple: some users rate the site, and other users decide whether or not to visit it. WOT is completely free.
The extension is available for the most popular browsers. After installing it, you will see an indicator next to the site address in search results. A green circle means the site is safe to browse; yellow means you should be careful while browsing it; a red circle means the site has malicious content and is dangerous to visit.
Schema.org is a single universal standard recognized by the most popular search engines, such as Google, Yandex, Yahoo, and Bing.
Microdata is semantic markup of a site's pages aimed at structuring data; it is based on adding special attributes to the HTML code of the document.
Pros of microdata:
The markup is placed directly in the page's HTML using special attributes and does not require creating a separate export file.
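A minimal sketch of schema.org microdata embedded directly in the page's HTML (the product data is hypothetical):
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Spring Bouquet</span>
  <div itemprop="offers" itemscope itemtype="https://schema.org/Offer">
    <span itemprop="price">25.00</span>
    <meta itemprop="priceCurrency" content="USD">
  </div>
</div>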
Open Graph was developed by Facebook specialists so that links to websites within social networks would be displayed attractively and informatively. Open Graph is now supported by many social networks (Facebook, Twitter, Google+, VKontakte, Odnoklassniki) and by instant messengers, for example Telegram and Skype.
Why use Open Graph?
To get an attractive snippet of the website, insert Open Graph meta tags into the <head> section of the page code.
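A sketch of the basic Open Graph meta tags inside <head> (the titles, texts, and image URL are placeholders):
<head>
  <meta property="og:title" content="Fresh Flower Delivery in Boston">
  <meta property="og:description" content="Same-day bouquets with free delivery.">
  <meta property="og:image" content="https://verytraffic.com/images/share.jpg">
  <meta property="og:url" content="https://verytraffic.com/">
  <meta property="og:type" content="website">
</head>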
Search engines take into account the country where the server is located. The ideal situation is when the server is located in the same country as your target audience.
Young, new domains are hard to promote in highly competitive topics. The history of the domain and website also matters: old domains with a bad history are rather difficult to promote, while search engines like old domains with a good history (without filters, spam, black-hat SEO, and so on).
Don't forget to renew the domain registration; it is better to turn on automatic renewal with your registrar. After the registration expires there is a chance of losing access to the domain.
Information exchanged between the server and visitors should be confidential. This is important for promoting commercial sites: it enhances the loyalty of potential customers and increases the trust level. It also affects site conversion and the growth of positions in search results for almost all queries.
If www.verytraffic.com and verytraffic.com operate separately without redirects, search engines may glue these two copies together themselves, which can affect search optimization negatively.
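One common way to glue the copies yourself is a permanent redirect. A sketch for nginx, assuming the non-www host is chosen as the main mirror:
server {
    listen 80;
    server_name www.verytraffic.com;
    # Send every request on the www copy to the main host with a 301 redirect
    return 301 $scheme://verytraffic.com$request_uri;
}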
Due to an incorrect encoding, the content of the website may be displayed incorrectly. Besides the fact that visitors will not like it, the site may not be indexed or may fall under search engine filters. We recommend using the UTF-8 encoding so that the text on the site's pages displays correctly. In some CMSs, for example WordPress, files are written in this encoding, and AJAX also supports only UTF-8.
Do not forget to specify the encoding in the meta tags: <meta charset="UTF-8" />
Google Font API
Google Tag Manager
The robots.txt file is a list of restrictions for search engines and other bots that visit the site and scan its information. Before scanning and indexing your site, all robots consult the robots.txt file and look for rules.
The robots.txt file is located in the root directory of the website. It must be available at the URL: verytraffic.com/robots.txt
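A minimal robots.txt sketch (the directives and paths are illustrative, not taken from the actual site):
User-agent: *
Disallow: /admin/
Disallow: /search/
Sitemap: https://verytraffic.com/sitemap.xml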
There are several reasons to use a robots.txt file:
A Sitemap is a file with information about the site pages subject to indexing. With this file you can:
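Whatever tasks it is used for, a minimal sitemap.xml sketch with one URL entry looks like this (the address and date are placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://verytraffic.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>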
Use a favicon to make your site stand out. A favicon is a small image in a special format. It is displayed in search results next to your site address and also in the browser's address bar.
Put the favicon in the root folder of your site so that browsers can display it. You can also assign a specific favicon to individual pages.
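Besides placing favicon.ico in the root folder, you can point to a specific icon from a page's <head>; a sketch:
<link rel="icon" href="/favicon.ico" type="image/x-icon">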
When a non-existent page is requested, the server should return error 404, which means "page not found". This response code tells browsers and search robots that the page does not exist.
If the server is configured incorrectly, it will return code 200, which means the page exists. Because of this, search engines may index all of your site's error pages.
Configure your site so that when non-existent pages are requested, response code 404 (page not found) or response code 410 (page deleted) is returned.
The server currently displays a standard page with a 404 error when a non-existent page is requested. It is recommended to create a unique 404 page for users' convenience and to add a link back to the site.
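A sketch of returning code 404 with a custom error page in nginx (the file name is an assumption):
# Return status 404 and show the custom error page for missing URLs
error_page 404 /404.html;
location = /404.html {
    internal;
}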
Found 20 resources without caching headers or with too short a cache lifetime.
|URL||Cache lifetime||Size|
|2 hours||17.61 KB|
|4 hours||117.9 KB|
|4 hours||108.63 KB|
|4 hours||65.91 KB|
|4 hours||54.42 KB|
The table is truncated to 5 rows.
Thanks to the cache, users revisiting your website spend less time loading pages. Caching headers should be applied to all cacheable static resources.
Configure your server for browser caching. Static resources should be stored in the cache for at least a week; external resources (ads, widgets, and so on) for at least 1 day.
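A sketch of such caching headers for nginx (the extensions and lifetime are illustrative):
# Let browsers cache static resources for a week
location ~* \.(css|js|png|jpg|jpeg|gif|webp|svg|ico)$ {
    expires 7d;
    add_header Cache-Control "public";
}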
The server response time determines how long it takes to receive the HTML code needed to display the page.
Reduce the server response time so that it is no more than 200 ms. A long response time may be due to dozens of factors: application logic, a slow database, routing, the software platform, libraries, or a lack of processing power or memory. All these circumstances should be taken into account during optimization.
Many web servers can compress files in GZIP format before sending them, using their own routines or third-party modules. This allows faster loading of the resources needed to display the website.
Compressing resources with gzip or deflate reduces the amount of data transferred over the network.
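A sketch of enabling gzip in nginx (the list of MIME types is illustrative):
# Compress text-based resources before sending them
gzip on;
gzip_types text/plain text/css application/javascript application/json image/svg+xml;
gzip_min_length 1024;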
Try to reduce image sizes to a minimum: this will speed up resource loading. The correct format and compression of images reduces their size, which saves users time and money.
Basic and advanced optimization should be applied to all images. Basic optimization includes cropping unnecessary margins, reducing color depth to the lowest acceptable value, removing comments, and saving the image in a suitable format. Basic optimization can be done in any image-editing program.
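A sketch of serving WebP with a JPEG fallback for older browsers (the file names are placeholders):
<picture>
  <source srcset="/images/bouquet.webp" type="image/webp">
  <img src="/images/bouquet.jpg" alt="Spring bouquet" width="600" height="400">
</picture>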
By default, the browser must download, parse, and process all the styles it encounters before it can display content on the user's screen. Each external CSS file has to be loaded, which adds network requests and increases the time before content is displayed.
Unused CSS also slows down content display. To style all the elements on a page, the browser builds the whole tree of HTML tags and applies CSS rules to each node. The more unused CSS there is, the more time the browser may need to style the elements on the page.
The best approach is to place the critical CSS rules directly in the <head> of the HTML. Once the HTML is loaded, the browser then has everything it needs to display the page without making additional network requests.
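A sketch of inlining critical rules while keeping the rest of the styles in an external file (selectors and values are placeholders):
<head>
  <style>
    /* Critical CSS: only the rules needed to render the first screen */
    body { margin: 0; font-family: sans-serif; }
    .header { background: #fff; height: 60px; }
  </style>
  <link rel="stylesheet" href="/css/main.css">
</head>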
The solution to this problem is complicated, so it is not considered critical.
Users of PCs and mobile devices are used to scrolling websites vertically, not horizontally. If, just to view the content, you need to scroll the website horizontally or zoom out, it causes inconvenience.
When developing a mobile site with a viewport meta tag, position the content so that it fits into the specified viewport. For example, if an image is wider than the viewport, horizontal scrolling may be needed. To avoid this, change the content so that it fits entirely.
A website design adapted for mobile phones solves three problems: it provides users with the most comfortable browsing from any device, builds a positive image of the company, and affects the site's search rankings.
Your pages do not specify a viewport using the viewport meta tag. This means that mobile devices will try to display them as on a PC, scaling them down proportionally to the screen size. Specify the viewport tag so your website displays properly on all devices.
The viewport defines how a web page is displayed on a mobile device. If it is not specified, the page width is set to the standard PC value and the page is shrunk to fit the screen. Through the viewport you can control the page width and scaling on different devices.
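A typical viewport declaration placed in the page's <head>:
<meta name="viewport" content="width=device-width, initial-scale=1">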
One of the most common problems when reading sites on mobile devices is a font size that is too small. Having to constantly zoom the website to read small text is very annoying. Even if the site has a mobile version or an adaptive design, poor readability due to a small font is not uncommon.
Use legible font sizes to make your site more convenient.
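A sketch of a legible base font size with a relative size for secondary text (the values are common defaults, not requirements):
body  { font-size: 16px; line-height: 1.5; }
small { font-size: 0.875rem; }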
Plug-ins help the browser process special content, such as Flash, Silverlight, or Java. Most mobile devices do not support plug-ins, and plug-ins are a common cause of errors and security violations in browsers that do support them. For this reason, many browsers restrict plug-ins.