Have you decided to promote your site in search engines? Great! But don’t rush to spend your advertising budget: first, check your resource for errors and problems. Today, together with REG.RU SEO specialist Guzel Egorova, the editorial team will explain what an SEO audit of a website is: why and how to do it, and which services and programs you can use for it.
What is SEO?
SEO is a set of measures to increase a site’s visibility in search engines and, consequently, attract search traffic (you already know this from our previous posts :)). An audit is an assessment of a site’s quality against certain criteria: we analyze the site and check its compliance with the search-engine requirements that determine quality. All this will help you understand what needs to be done to improve page rankings and attract more users from search engines.
Why do I need a SEO audit?
SEO analysis lets you assess how well search engines “like” your site and which of the errors found need to be corrected. It helps you not only evaluate the project’s prospects but also prioritize the fixes.
Where your site appears in the results for a particular query depends on how the search engine evaluates the page. If a page’s quality is low, it may be removed from search and excluded from the results altogether. An audit reveals the pages that need improvement, and from this you can compile a list of necessary changes to the site.
If, while crawling the site, the search engine finds a page of insufficient quality, that page stops being taken into account when results are ranked for users’ queries.
An audit brings the greatest benefit where conditionally “bad” pages hold back the ranking of good ones. For example, an online store can increase targeted traffic by fixing product pages that search engines treat as duplicates because of incorrect markup or poor-quality content. The audit catches such pages, and small fixes to them bring in traffic.
So the main purpose of an audit is to identify the most accessible ways to increase traffic, both to the site as a whole and to individual pages. After the audit, analyze the findings, draw up a work plan, and put first the tasks that promise the greatest effect at the lowest implementation cost.
A low position in search can be the result not of a single error but of a combination of factors, so do not drop seemingly insignificant items from the list of improvements. Once the most important work is done, individual small adjustments can make the site more relevant in the search engine’s overall comparison with competitors.
Therefore, before starting optimization work, conduct an audit and fix the errors it reveals: this gives you a clean baseline and makes it clear what happens to the site as a result of specific promotion work.
How to conduct a free website analysis?
Before conducting an SEO audit, you should study the site thoroughly, understand its specifics, and immerse yourself in the subject matter. There is no universal checklist or template that suits every project, but there are common approaches to site analysis.
These tools will be enough for an initial analysis:

- A crawler (a program that walks through the pages of a site) to evaluate your own site. Examples: Screaming Frog SEO Spider, Netpeak Spider, SiteAnalyzer.
- RDS bar (a browser plugin for analyzing a site’s SEO performance). Useful for evaluating both your site and your competitors’.
- The webmaster panels from Yandex and Google.

The RDS bar shows three interesting parameters: the number of indexed pages in Yandex, the number of indexed pages in Google, and Linkpad data on the number of referring domains (donors).
On a healthy site the number of indexed pages in Yandex and Google should be roughly the same. Yandex shows in its index the pages it found good enough to rank for at least some key phrases, while Google shows the total number of pages its robot has found.
There are two concepts here: the main index and the expanded index. The main index in Google consists of the pages that participate in search for at least some key phrases.
The expanded index is essentially all the pages the robot has reached. They appear only in an advanced search of the site itself, i.e. they do not show up for any query phrases.
Yandex also has an expanded index, called “Downloaded” (you can see it in the webmaster panel). The idea is that on a good site the expanded index should be roughly equal to the main one, i.e. there should be no weak, low-quality pages.
If the number of indexed pages in Yandex and Google differs by more than 1.5 times, check which pages each search engine has indexed (for example, with Rush Analytics), or assess by eye which layer of pages is getting into the index (profile pages, pagination, password-protected pages, etc.) and block access to it via the robots.txt file. Along the way you may also spot important pages that, for some reason, are not in the index.
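As a sketch of such a fix, low-value layers of pages can be closed off in robots.txt. The paths below are hypothetical examples for illustration, not universal rules — block only what your own audit shows to be junk:

```
User-agent: *
# Hypothetical low-quality layers found during the audit:
Disallow: /user/          # profile pages
Disallow: /search/        # internal search results
Disallow: /*?page=        # pagination parameter

Sitemap: https://example.com/sitemap.xml
```

After changing robots.txt, verify the rules in the Yandex and Google webmaster panels before relying on them.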
What else to analyze?
Check the link profile

One of the indicators of site quality for search engines is the number of links to your site from other resources.
Linkpad lets you evaluate your site’s link profile superficially. How does it work? There are robots that roam the Internet and collect link data into databases: Ahrefs, Serpstat, Majestic, Megaindex, Linkpad. None of these services has complete information, since no robot can crawl the entire Internet (search robots included). There is also a time lag: a newly placed link may not be indexed right away. So when reading a link-growth chart, keep in mind that the growth you see reflects links actually placed 1–2 months earlier.
The level of spam in the link profile is an important criterion. Spammy anchors can trigger filters, Minusinsk in Yandex and Penguin in Google, which limit a site’s ranking for one to several months and can lead to a significant loss of search traffic.
A spam anchor is a key phrase from the semantic core used as the link text. If there are many such anchors, they should either be diluted or removed altogether.
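As a minimal sketch of spotting over-optimized anchors, you can count the share of each anchor text in a backlink export and flag anything that dominates. The anchor list and the 30% threshold below are made-up illustrations, not a rule from any search engine:

```python
from collections import Counter

# Hypothetical sample of anchor texts taken from a backlink report
anchors = [
    "buy iphone cheap", "example.com", "here", "buy iphone cheap",
    "buy iphone cheap", "Example Store", "https://example.com", "here",
]

# Bare URLs and generic words usually look "natural"; a repeated
# commercial key phrase is the kind of anchor worth diluting.
counts = Counter(a.lower() for a in anchors)
total = len(anchors)

for anchor, n in counts.most_common():
    share = n / total
    flag = "  <- check: possible spam anchor" if share > 0.3 else ""
    print(f"{anchor!r}: {n} ({share:.0%}){flag}")
```

Running this on a real export (e.g. a CSV from a link-analysis service) would show at a glance which anchors to dilute first.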
Analyze snippets (meta tags)
The snippet’s title is usually taken from the Title tag; the text part comes from the page text or from the Description meta tag. Snippets can be accompanied by links, an address, a phone number, office hours, a picture, a price, or a question-and-answer block. This data is usually pulled from the site’s microdata markup.
Title and Description tags should be thought through and written for all pages of the site, including non-promoted ones. High-quality formatting of these other pages is one of the factors by which search engines evaluate a resource, and it shows that you care about every page of your site. For where to find and how to write meta tags, read our post on basic website optimization.
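A simple way to audit this is to check each page’s HTML for a non-empty Title and a Description meta tag. The sketch below uses Python’s standard `html.parser` on an inlined sample page (the HTML is a made-up example; in practice you would feed in the pages your crawler collected):

```python
from html.parser import HTMLParser

class MetaCheck(HTMLParser):
    """Collects the <title> text and the description meta content."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Sample page; replace with HTML fetched from each URL of the site.
html = """<html><head>
<title>Blue widgets - Example Store</title>
<meta name="description" content="Buy blue widgets with free delivery.">
</head><body></body></html>"""

checker = MetaCheck()
checker.feed(html)
print("Title:", checker.title.strip() or "MISSING")
print("Description:", checker.description or "MISSING")
```

Pages where either value prints as MISSING (or duplicates another page’s value) go on the fix list.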
Check the H1 heading.
H1 headings are words or phrases that summarize the meaning of the text. We don’t have to read an entire article to understand what it will be about: we glance at the headings and subheadings and decide whether the material interests us and is worth reading.
The H tags build a hierarchy of headings in the text. Search engines read the content of the H1 tag as the title of the text. If there is more than one H1 on a page, the search engine picks one at its own discretion, and identical headings on different pages can cause relevance to shift between them.
It is important that a page has a single H1 heading that tells the search robot what the text on the page is about. That is why a high-frequency key phrase is usually used in the H1.
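The H1 check is easy to automate: count the `<h1>` tags on each page and flag anything other than exactly one. A minimal sketch with the standard `html.parser`, run here on a made-up HTML fragment rather than a live site:

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Collects the text of every <h1> on a page."""

    def __init__(self):
        super().__init__()
        self.h1_texts = []
        self._in_h1 = False

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self._in_h1 = True
            self.h1_texts.append("")

    def handle_endtag(self, tag):
        if tag == "h1":
            self._in_h1 = False

    def handle_data(self, data):
        if self._in_h1:
            self.h1_texts[-1] += data

# Hypothetical page fragment with a duplicate H1 - the case to catch.
html = "<h1>Buy blue widgets</h1><p>Some text...</p><h1>Blue widgets</h1>"
counter = H1Counter()
counter.feed(html)
print(len(counter.h1_texts), counter.h1_texts)
```

Any page where the count is not exactly 1 (or where the H1 repeats another page’s H1) is a candidate for the fix list.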