Semantic core and landing pages. Compiling a semantic core

What is the semantic core of a site? The semantic core of a site (hereinafter referred to as the SC) is the set of keywords and phrases for which the resource is promoted in search engines and which indicate that the site belongs to a certain topic.

For successful promotion in search engines, keywords must be correctly grouped, distributed across the pages of the site, and included in a certain form in the meta tags (title, description, keywords) as well as in the H1-H6 headings. At the same time, keyword stuffing must be avoided so that the site does not "fly off" under the Baden-Baden filter.

In this article we will try to examine the issue not only from a technical point of view, but also through the eyes of business owners and marketers.

How can the SC be collected?

  • Manually - feasible for small sites (up to 1,000 keywords).
  • Automatically - programs do not always correctly determine the context of a query, so problems may arise when distributing keywords across pages.
  • Semi-automatically - phrases and their frequencies are collected automatically, while the phrases are distributed and refined manually.

In our article we will consider a semi-automatic approach to creating a semantic core, as it is the most effective.

In addition, there are two typical situations when compiling an SC:

  • for a site with a ready-made structure;
  • for a new site.

The second option is preferable, since it makes it possible to create an ideal site structure for search engines from the start.

What does the process of compiling an SC consist of?

Work on the formation of the semantic core is divided into the following stages:

  1. Identification of directions in which the site will be promoted.
  2. Collecting keywords, analyzing similar queries and search suggestions.
  3. Frequency parsing, filtering out “empty” requests.
  4. Clustering (grouping) of requests.
  5. Distribution of requests across site pages (creation of an ideal site structure).
  6. Recommendations for use.

The better the core you create (and quality here means both the breadth and the depth of the semantics), the more powerful and stable the flow of search traffic you can direct to the site, and the more customers you will attract.

How to create a semantic core of a website

So, let's look at each point in more detail with various examples.

At the first step, it is important to determine which products and services present on the site will be promoted in the search results of Yandex and Google.

Example No. 1. Let's say a site offers two kinds of services: computer repair at home and training in Word/Excel at home. It was decided that the training was no longer in demand, so there was no point in promoting it, and therefore no point in collecting semantics for it. Another important point: you need to collect not only queries containing "computer repair at home", but also "laptop repair", "PC repair" and others.

Example No. 2. A company is engaged in low-rise construction but builds only wooden houses. Accordingly, there is no need to collect queries and semantics for areas such as "construction of aerated concrete houses" or "construction of brick houses".

Collection of semantics

We will look at two main sources of keywords: Yandex and Google. We’ll tell you how to collect semantics for free and briefly review paid services that can speed up and automate this process.

In Yandex, key phrases are collected from the Yandex.Wordstat service; in Google, through query statistics in Google AdWords. If available, you can use data from Yandex.Webmaster and Yandex.Metrica, Google Webmaster Tools and Google Analytics as additional sources of semantics.

Collecting keywords from Yandex.Wordstat

Collecting queries from Wordstat can be considered free: to view the data of this service you only need a Yandex account. So let's go to wordstat.yandex.ru and enter a keyword. We will use the example of collecting semantics for a car rental company's website.

What do we see in this screenshot?

  1. Left column. Here is the basic query and its various variations with a "tail". Next to each query is a number showing how many times, in total, users entered this query.
  2. Right column. Queries similar to the main one and their overall frequency indicators. Here we see that a person who wants to rent a car can, in addition to the query "car rental", use various synonymous formulations such as "rent a car" or "car hire". This is very important data to pay attention to so as not to miss a single query.
  3. Regionality and history. By choosing one of the available options, you can check the distribution of queries by region, the number of queries in a particular region or city, and the trend of change over time or by season.
  4. Devices from which the query was made. By switching tabs, you can find out which devices users most often search from.

Check the different variants of key phrases and record the received data in Excel tables or Google Sheets. For convenience, install the Yandex Wordstat Helper plugin: after installing it, plus signs appear next to the search phrases; clicking them copies the words, so you do not need to select phrases and paste the frequency indicators manually.
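If you prefer to keep this intermediate data outside of a spreadsheet, the copied phrase/frequency pairs can just as well be dumped into a CSV file with a few lines of Python. A minimal sketch; the phrases, numbers and file name are made up for illustration:

```python
import csv

# Phrase/frequency pairs copied from Yandex.Wordstat
# (illustrative numbers, not real statistics).
collected = [
    ("car rental", 250000),
    ("car rental moscow", 18000),
    ("rent a car with a driver", 7400),
]

# Append the pairs to a CSV file that can later be opened in Excel
# or imported into Google Sheets.
with open("semantic_core_raw.csv", "a", newline="", encoding="utf-8") as f:
    csv.writer(f).writerows(collected)
```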

Collecting keywords from Google AdWords

Unfortunately, Google does not have an open source of search query statistics with frequency indicators, so here you have to use a workaround, and for this you need a working Google AdWords account.

We register an account in Google AdWords and top up the balance with the minimum possible amount, 300 rubles (an account with no budget activity only shows approximate data). After that, go to "Tools" - "Keyword Planner".

A new page will open; in the "Search for new keywords by phrase, site or category" tab, enter the keyword.

Scroll down, click “Get options” and see something like this.

  1. Top query and the average number of searches per month. If the account is not funded, you will see approximate data, that is, the average number of searches. When there are funds on the account, exact data will be shown, as well as the dynamics of changes in the frequency of the entered keyword.
  2. Keywords by relevance. This is the same as similar queries in Yandex Wordstat.
  3. Downloading data. This tool is convenient because the data obtained in it can be downloaded.

We looked at working with two main sources of statistics on search queries. Now let's move on to automating this process, because collecting semantics manually takes too much time.

Programs and services for collecting keywords

Key Collector

The program is installed on your computer. Work accounts from which statistics will be collected are connected to it; then a new project and a folder for keywords are created.

Select "Batch collection of words from the left column of Yandex.Wordstat" and enter the queries for which data will be collected.

An example is shown in the screenshot; in fact, for a more complete semantic core you would additionally need to collect all query variants with car brands and classes, for example "bmw for rent", "rent a toyota with an option to buy", "rent an SUV" and so on.

Slovoeb

A free analogue of the previous program. This can be considered both a plus (you don't need to pay) and a minus (the program's functionality is significantly reduced).

To collect keywords, the steps are the same.

Rush-analytics.ru

An online service. Its main advantage is that you don't need to download or install anything: register and use it. The service is paid, but upon registration you get 200 coins in your account, which is enough to collect a small semantic core (up to 5,000 queries) and parse frequencies.

The downside is that semantics are collected only from Wordstat.

Checking the frequency of keywords and queries

To check real demand, the query is entered in Wordstat with refining operators: first in quotes ("phrase"), then with an exclamation mark before each word (!phrase) to fix the word forms. With each refinement we notice a decrease in the number of impressions. Let's go further and try another word form of the same query.

We see that in the singular this query is searched by far fewer users, which means the initial query is a higher priority for us.

Such manipulations must be carried out with every word and phrase. Queries whose final frequency (measured with the quote and exclamation mark operators) equals zero are eliminated, because "0" means that no one enters such queries: they exist only as parts of other queries. The point of compiling a semantic core is to select the queries people actually use to search. All remaining queries are then placed in an Excel table, grouped by meaning and distributed across the pages of the site.
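This filtering step is easy to script once the exact frequencies have been collected. A minimal sketch; the file name and column names (phrase, base_freq, and exact_freq for the frequency measured with quotes and exclamation marks) are assumptions for illustration:

```python
import csv

# Keep only queries whose exact ("!word") frequency is above zero.
with open("frequencies.csv", encoding="utf-8") as src:
    rows = [row for row in csv.DictReader(src) if int(row["exact_freq"]) > 0]

with open("semantic_core_checked.csv", "w", newline="", encoding="utf-8") as dst:
    writer = csv.DictWriter(dst, fieldnames=["phrase", "base_freq", "exact_freq"])
    writer.writeheader()
    writer.writerows(rows)  # "empty" queries with zero exact frequency are dropped
```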

Checking every phrase manually takes far too long, so there are many services on the Internet, both paid and free, that do it automatically. Here are a few:

  • megaindex.com;
  • rush-analytics.ru;
  • tools.pixelplus.ru;
  • key-collector.ru.

Removing non-targeted queries

After sifting the keywords by frequency, you should also remove the unnecessary ones. Which search queries can be removed from the list? (A small filtering sketch follows the list below.)

  • requests with the names of competitors' companies (can be left in contextual advertising);
  • requests for goods or services that you do not sell;
  • requests that indicate a district or region in which you do not work.
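A simple stop-word list is usually enough to automate this cleanup. A minimal sketch; the stop words and queries below are invented examples:

```python
# Queries containing any stop word (competitor names, regions you do not
# serve, services you do not sell) are treated as non-targeted.
STOP_WORDS = {"free", "download", "st. petersburg", "yekaterinburg"}

def is_targeted(phrase: str) -> bool:
    """Return False if the phrase contains at least one stop word."""
    lowered = phrase.lower()
    return not any(stop in lowered for stop in STOP_WORDS)

queries = [
    "warehouse rental moscow",
    "warehouse rental st. petersburg",
    "warehouse lease agreement download",
]
print([q for q in queries if is_targeted(q)])
# ['warehouse rental moscow']
```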

Clustering (grouping) of requests for site pages

The essence of this stage is to combine queries that are similar in meaning into clusters and then determine which pages they will be promoted on. How can you tell which queries should be promoted on one page and which on another?

1. By query type.

All queries in search engines are divided into several types, depending on the purpose of the search:

  • commercial (buy, sell, order) - promoted to landing pages, pages of product categories, product cards, pages with services, price lists;
  • informational (where, how, why, what for) - articles, forum topics, a question-and-answer section;
  • navigation (telephone, address, brand name) - page with contacts.

If you are in doubt about what type a query is, enter it in the search bar and analyze the results. For commercial queries the results will contain more pages offering services; for informational queries, more articles.

There are also geo-dependent and geo-independent queries. Most commercial queries are geo-dependent, since people are more likely to trust companies located in their own city.

2. By query logic.

  • "buy iphone x" and "iphone x price" - should be promoted on one page, since in both cases the user is looking for the same product and more detailed information about it;
  • "buy iphone" and "buy iphone x" - should be promoted on different pages, since the first is a general query (suitable for the product category page where iPhones are listed), while in the second the user is looking for a specific product, so it should be promoted on a product card;
  • "how to choose a good smartphone" - it is more logical to promote this query with a blog article with an appropriate title.

3. By search results. Look at the search results for the queries themselves. If you check which pages on different sites rank for the queries "construction of houses made of timber" and "construction of houses made of bricks", in 99% of cases these will be different pages.
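Comparing search results by hand works for a handful of queries; for longer lists the same comparison can be scripted. Below is a rough sketch that groups queries whose top results share at least a given number of URLs. It assumes you have already collected the result URLs for each query (with a SERP parser or a service you have access to); fetching them is a separate task.

```python
def cluster_by_serp(serps: dict[str, set[str]], min_shared: int = 4) -> list[set[str]]:
    """Greedy grouping: a query joins a cluster if its top result set
    shares at least `min_shared` URLs with the URLs already in that cluster."""
    clusters: list[tuple[set[str], set[str]]] = []  # (queries, union of their URLs)
    for query, urls in serps.items():
        for queries, cluster_urls in clusters:
            if len(urls & cluster_urls) >= min_shared:
                queries.add(query)
                cluster_urls |= urls
                break
        else:
            clusters.append(({query}, set(urls)))
    return [queries for queries, _ in clusters]

# Toy example: the two queries share no result URLs, so they end up in
# separate clusters and should be promoted on different pages.
serps = {
    "construction of houses made of timber": {"a.com/timber", "b.com/wood", "c.com/log-houses"},
    "construction of houses made of bricks": {"d.com/brick", "e.com/brick-houses", "f.com/masonry"},
}
print(cluster_by_serp(serps, min_shared=2))
```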

4. Automatic grouping using software and manual refinement.

The first two methods work well for compiling the semantic core of small sites, where at most 2-3 thousand keywords are collected. For a large core (from 10,000 queries and up), machine help is needed. Here are several programs and services that perform clustering:

  • KeyAssistant - assistant.contentmonster.ru;
  • semparser.ru;
  • just-magic.org;
  • rush-analytics.ru;
  • tools.pixelplus.ru;
  • key-collector.ru.

After automatic clustering is completed, you need to check the program's results manually and correct any errors.

Example: the program may put the following queries into one cluster: "vacation in Sochi 2018 hotel" and "vacation in Sochi 2018 hotel breeze". In the first case the user is looking at various hotel options for accommodation, and in the second, for a specific hotel.

To eliminate such inaccuracies, everything has to be reviewed manually and edited where errors are found.

What to do next after compiling the semantic core?

Based on the collected semantic core, we then:

  1. create the ideal structure (hierarchy) of the site from the point of view of search engines, or, in agreement with the customer, change the structure of the existing site;
  2. write technical assignments for copywriters, taking into account the cluster of queries that will be promoted on each page, or update the old articles and texts already on the site.

It looks something like this.

For each resulting query cluster, we create a page on the site and determine its place in the site structure. The most popular queries are promoted on pages at the top of the resource hierarchy; less popular ones are placed on pages below them.
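The cluster-to-page mapping can be kept as plain data and sorted so that the most popular clusters end up higher in the hierarchy. A small sketch with invented clusters and frequencies:

```python
# Hypothetical clusters: cluster name -> list of (phrase, frequency) pairs.
clusters = {
    "car rental": [("car rental", 250000), ("rent a car", 40000)],
    "suv rental": [("rent an suv", 3000), ("suv hire moscow", 900)],
    "rental with driver": [("car rental with driver", 5500)],
}

def total_frequency(phrases):
    """Total monthly demand of a cluster."""
    return sum(freq for _, freq in phrases)

# Propose a draft URL per cluster, most popular clusters first.
for name, phrases in sorted(clusters.items(), key=lambda kv: total_frequency(kv[1]), reverse=True):
    print(f"/{name.replace(' ', '-')}/  promotes: {[p for p, _ in phrases]}")
```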

For each of these pages we have already collected the queries that will be promoted on it. Next, we write technical specifications for copywriters to create the text for these pages.

Technical specifications for a copywriter

As with the site structure, we will describe this stage only in general outline. A technical specification for a text usually includes:

  • number of characters without spaces;
  • page title;
  • subheadings (if any);
  • a list of words (based on our core) that should be in the text;
  • uniqueness requirement (always require 100% uniqueness);
  • desired text style;
  • other requirements and wishes for the text.

Remember: do not try to promote hundreds of queries on one page. Limit yourself to 5-10 queries plus their "tail", otherwise you will be penalized for over-optimization and knocked out of the race for top positions for a long time.

Conclusion

Compiling the semantic core of a site is painstaking work that deserves especially close attention, because all further promotion of the site is built on it. Follow the simple instructions given in this article and take action.

  1. Choose the direction of promotion.
  2. Collect all possible queries from Yandex and Google (use special programs and services).
  3. Check the frequency of queries and get rid of dummies (those with a frequency of 0).
  4. Remove non-targeted queries: services and goods that you do not sell, queries mentioning competitors.
  5. Form query clusters and distribute them across pages.
  6. Create an ideal site structure and draw up technical specifications for the content of the site.

In our article, we explained what a semantic core is and gave general recommendations on how to compose it.

It's time to look at this process in detail, creating a semantic core for your site step by step. Stock up on pencils and paper, and most importantly, time. Let's get started.

We create a semantic core for the site

As an example, let's take the site http://promo.economsklad.ru/.

The company's field of activity: warehouse services in Moscow.

The site was developed by the specialists of our service, and its semantic core was compiled step by step, in 6 stages:

Step 1. Compile a primary list of keywords.

After conducting a survey of several potential clients, studying three sites close to our topic and using our own brains, we compiled a simple list of keywords that, in our opinion, reflect the content of our site: warehouse complex, warehouse rental, storage services, logistics, storage space rental, warm and cold warehouses.

Task 1: Review competitors' websites, consult with colleagues, brainstorm and write down all the words that, in your opinion, describe YOUR site.

Step 2. Expanding the list.

Let's use the service http://wordstat.yandex.ru/. In the search line, enter each of the words from the primary list one by one:


Copy the refined queries from the left column into an Excel spreadsheet, look through the associative queries from the right column, select those that are relevant to our site, and enter them into the table as well.

After analyzing the phrase “Warehouse rental,” we received a list of 474 refined and 2 associative queries.

Having carried out a similar analysis of the remaining words from the primary list, we received a total of 4,698 refined and associative queries that were entered by real users in the past month.

Task 2: Collect a complete list of queries on your site by running each of the words in your primary list through Yandex.Wordstat query statistics.

Step 3. Cleaning

First, we remove all phrases with an impression frequency below 50: "how much does it cost to rent a warehouse" - 45 impressions, "warehouse rental 200 m" - 35 impressions, and so on.

Secondly, we remove phrases that are not related to our site, for example, "warehouse rental in St. Petersburg" or "warehouse rental in Yekaterinburg", since our warehouse is located in Moscow.

The same goes for the phrase "warehouse lease agreement download": such a sample document may be present on our website, but there is no point in actively promoting this query, since a person looking for a sample contract is unlikely to become a client. Most likely, they have already found a warehouse or own one themselves.

Once you remove all unnecessary queries, the list will be significantly reduced. In our case with “warehouse rental,” out of 474 refined queries, only 46 relevant to the site remained.

And when we cleaned the full list of refined queries (4,698 phrases), we received the Semantic Core of the site, consisting of 174 key queries.

Task 3: Clean up the previously created list of refined queries, excluding from it low-frequency keywords with less than 50 impressions and phrases that are not related to your site.

Step 4. Revision

Since you can use 3-5 different keywords on each page, we won’t need all 174 queries.

Considering that the site itself is small (4 pages at most), we select from the full list the 20 queries that, in our opinion, most accurately describe the company's services.

Here they are: warehouse rental in Moscow, warehouse space rental, warehouse and logistics, customs services, safekeeping warehouse, warehouse logistics, logistics services, office and warehouse rental, safekeeping of goods and so on….

These keyword phrases include low-frequency, mid-frequency, and high-frequency queries.

Please note that this list is significantly different from the primary one taken from your head. And it is definitely more accurate and efficient.

Task 4: Reduce the list of remaining words to 50, leaving only those that, in your experience and opinion, are most optimal for your site. Don't forget that the final list should contain queries of varying frequencies.

Conclusion

Your semantic core is ready, now is the time to put it into practice:

  • review the texts on your site, maybe they should be rewritten.
  • write several articles on your topic using selected key phrases, post the articles on the site, and after search engines index them, register in article directories. Read “One unusual approach to article promotion.”
  • pay attention to search advertising. Now that you have a semantic core, the effect of advertising will be much higher.

If you know the pain of search engines' "dislike" for the pages of your online store, read this article. I will talk about the path to increasing a site's visibility, or more precisely, about its first stage: collecting keywords and compiling a semantic core. I will cover the algorithm for creating it and the tools that are used.


Why create a semantic core?

To increase the visibility of site pages, so that Yandex and Google search robots begin to find your site's pages for user queries. Of course, collecting keywords (compiling the semantics) is the first step towards this goal. Next, a rough "skeleton" is sketched out to distribute the keywords across different landing pages. And then articles and meta tags are written and implemented.

By the way, on the Internet you can find many definitions of the semantic core.

1. “The semantic core is an ordered set of search words, their morphological forms and phrases that most accurately characterize the type of activity, product or service offered by the site.” Wikipedia.

To collect competitor semantics in Serpstat, enter one of the key queries, select a region, click “Search” and go to the “Key phrase analysis” category. Then select “SEO Analysis” and click “Phrase Selection”. Export results:

2.3. We use Key Collector/Slovoeb to create a semantic core

If you need to create a semantic core for a large online store, you can't do without Key Collector. But if you are a beginner, it is more convenient to use the free tool Slovoeb (don't let the name scare you). Download the program and, in the Yandex.Direct settings, specify the login and password for your Yandex.Mail account:

Create a new project. In the "Data" tab, select the "Add phrases" function. Select your region and enter the queries you collected earlier:

Advice: create a separate project for each new domain, and a separate group for each category/landing page. For example:

Now collect the semantics from Yandex.Wordstat. Open the "Data collection" tab - "Batch collection of words from the left column of Yandex.Wordstat". In the window that opens, check the box "Do not add phrases if they are already in any other groups". Enter a few of the most popular (high-frequency) phrases among users and click "Start collecting":

By the way, for large projects in Key Collector you can collect statistics from competitor analysis services SEMrush, SpyWords, Serpstat (ex. Prodvigator) and other additional sources.

Many web publications talk about the importance of the semantic core.

Similar texts are available on our website “What to Do.” At the same time, only the general theoretical part of the issue is often mentioned, while the practice remains unclear.

All experienced webmasters insist that it is necessary to create a basis for promotion, but only a few clearly explain how to use it in practice. To remove the veil of secrecy from this issue, we decided to highlight the practical side of using the semantic core.

Why do you need a semantic core?

This is, first of all, the basis and plan for further filling and promoting the site. The semantic basis, divided up according to the structure of the web resource, serves as a set of signposts toward the systematic and purposeful development of the site.

If you have this foundation, you don't have to think about the topic of each next article, you just need to follow the bullet points. With the core, website promotion moves much faster. And promotion gains clarity and transparency.

How to use the semantic core in practice

To begin with, it is worth understanding how the semantic basis is compiled in general. Essentially, it is a list of key phrases for your future project, supplemented by the frequency of each query.

Collecting such information is not difficult using the Yandex Wordstat service:

http://wordstat.yandex.ru/

or any other special service or program. The procedure will be as follows...

How to create a semantic core in practice

1. Collect in a single file (Excel, Notepad, Word) all queries on your key topic taken from the statistics. This should also include phrases "out of your head": logically plausible phrases, morphological variants (the way you yourself would search for your topic) and even variants with typos!

2. The list of semantic queries is sorted by frequency. From queries with maximum frequency to queries with minimum popularity.

3. All junk queries that do not match the theme or focus of your site are removed from the semantic basis. For example, if you write about washing machines for free but do not sell them, you do not need to use words like:

  • "buy"
  • "wholesale"
  • "delivery"
  • "order"
  • "cheap"
  • “video” (if there are no videos on the site)…

The point: do not mislead users! Otherwise, your site will get a huge number of bounces, which will affect its rankings. And this is important!

4. When the main list is cleared of unnecessary phrases and queries and includes a sufficient number of items, you can use the semantic core in practice.

IMPORTANT: a semantic list can never be considered completely ready and complete. In any topic, you will have to update and supplement the core with new phrases and queries, periodically monitoring innovations and changes.

IMPORTANT: the number of articles on the future site will depend on the number of items in the list. Consequently, this will affect the volume of required content, the working time of the author of the articles, and the duration of filling the resource.

Mapping the semantic core onto the site structure

In order for the entire resulting list to make sense, you need to distribute the queries (depending on frequency) across the site structure. It is difficult to give specific numbers here, since the scale and the frequency spread can differ significantly between projects.

If, for example, you take a query with a frequency in the millions as the basis, even a phrase with 10,000 searches will seem mediocre.

On the other hand, when your main query has a frequency of 10,000, a mid-frequency query will be around 5,000 searches per month. That is, a certain relativity is taken into account:

"HF - MF - LF", or "maximum - middle - minimum".

But in any case (even if only visually) you need to divide the entire core into 3 categories (a small sketch of such a split follows the list):

  1. high-frequency queries (HF - short phrases with maximum frequency);
  2. low-frequency queries (LF - rarely searched phrases and word combinations with low frequency);
  3. mid-frequency queries (MF) - all average queries that are in the middle of your list.
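This split can be made relative to the most frequent query in the core rather than with absolute numbers. A rough sketch; the 70% and 10% thresholds are arbitrary and should be tuned for each project:

```python
def split_by_frequency(core: dict[str, int]) -> dict[str, str]:
    """Label each query HF, MF or LF relative to the most frequent query."""
    top = max(core.values())
    labels = {}
    for phrase, freq in core.items():
        if freq >= 0.7 * top:
            labels[phrase] = "HF"
        elif freq >= 0.1 * top:
            labels[phrase] = "MF"
        else:
            labels[phrase] = "LF"
    return labels

core = {"site promotion": 10000, "custom website promotion": 4800, "cheap link promotion to order": 320}
print(split_by_frequency(core))
# {'site promotion': 'HF', 'custom website promotion': 'MF', 'cheap link promotion to order': 'LF'}
```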

The next step is to assign 1 or more (maximum 3) queries to the main page. These phrases should have the highest possible frequency. High-frequency queries go on the main page!

Next, from the general logic of the semantic core, it is worth highlighting several main key phrases from which sections (categories) of the site will be created. Here you could also use high-frequency queries with a lower frequency than the main one, or better - mid-frequency queries.

The remaining low-frequency phrases are sorted into the created sections and categories and turned into topics for future publications on the site. But it's easier to understand with an example.

EXAMPLE

A clear example of using the semantic core in practice:

1. Home page (HF) - high-frequency query: "site promotion".

2. Section pages (SP) – “custom website promotion”, “independent promotion”, “site promotion with articles”, “site promotion with links”. Or simply (if adapted for the menu):

Section No. 1 - “to order”
Section No. 2 – “on your own”
Section No. 3 – “article promotion”
Section No. 4 – “link promotion”

All this is very similar to the data structure on your computer: logical drive (main page) - folders (sections) - files (articles).

3. Pages of articles and publications (AP) - "quick site promotion for free", "cheap promotion to order", "how to promote a site with articles", "promotion of a project on the Internet to order", "inexpensive site promotion with links", and so on.

This list will contain the largest number of diverse phrases and word combinations, on the basis of which you will create further publications on the site.

How to use a ready-made semantic core in practice

Using the query list amounts to internal content optimization. The idea is to optimize (adjust) each page of the web resource to the corresponding core item. That is, you take a key phrase and write the most relevant article and page for it. A special relevance-checking service, available at the link, will help you assess relevance:

In order to have at least some guidelines in your SEO work, it is better to first check the relevance of sites from the TOP search results for specific queries.

For example, if you are writing text on the low-frequency phrase “inexpensive website promotion with links,” then first simply enter it in the search and evaluate the TOP 5 sites in the search results using the relevance assessment service.

If the service showed that sites from the TOP 5 for the query “inexpensive website promotion with links” have a relevance of 18% to 30%, then you need to focus on the same percentages. Even better is to create a unique text with keywords and relevance of about 35-50%. By slightly beating your competitors at this stage, you will lay a good foundation for further advancement.
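The relevance services mentioned here use their own formulas, which are not public. As a very rough stand-in, you can estimate how much of a text is made up of the key phrase's words; a naive sketch, not a reproduction of any service's algorithm:

```python
import re

def naive_relevance(text: str, key_phrase: str) -> float:
    """Share of words in the text (in percent) that belong to the key phrase.
    A crude estimate only; real relevance formulas are far more complex."""
    words = re.findall(r"\w+", text.lower())
    key_words = set(re.findall(r"\w+", key_phrase.lower()))
    if not words:
        return 0.0
    hits = sum(1 for word in words if word in key_words)
    return 100 * hits / len(words)

text = "Inexpensive website promotion with links is one of the services we provide."
print(round(naive_relevance(text, "inexpensive website promotion with links"), 1))
```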

IMPORTANT: using the semantic core in practice implies that one phrase corresponds to one unique page of the resource. The maximum here is 2 queries per article.

The more fully the semantic core is revealed, the more informative your project will be. But if you're not ready for long work and thousands of new articles, no need to tackle broad thematic niches. Even a narrow specialized area, developed 100%, will bring more traffic than an unfinished large website.

For example, you could take as the basis of the site not the high-frequency key “site promotion” (where there is enormous competition), but a phrase with a lower frequency and narrower specialization - “article site promotion” or “link promotion”, but reveal this topic to the maximum in all articles on the virtual platform! The effect will be higher.

Useful information for the future

Further use of your semantic core in practice will consist only of the following:

  • adjusting and updating the list;
  • writing optimized texts with high relevance and uniqueness;
  • publishing articles on the website (1 query - 1 article);
  • increasing the usefulness of the material (editing the finished texts);
  • improving the quality of the articles and of the site as a whole, monitoring competitors;
  • marking in the core list the queries that have already been used;
  • complementing the optimization with other internal and external factors (links, usability, design, usefulness, videos, online help tools).

Note: the above is a very simplified version of the process. In fact, based on the core you can create sublevels, deeply nested structures, and branches into forums, blogs, and chats. But the principle always stays the same.

BONUS: a useful tool for collecting the core in the Mozilla Firefox browser -
