Correct composition of the semantic core. How to create a semantic core from scratch? An alternative method for selecting key phrases using competitor analysis services

In 2008 I created my first Internet project.

It was an online electronics store that needed promotion.

Initially, the work of promotion was handed over to the programmers who created it.

What to promote?

They compiled a list of keys in 5 minutes: mobile phones, video cameras, cameras, iPhones, Samsungs - all categories and products on the site.

These were generic names that bore no resemblance to a properly built semantic core.

A long period passed without results.

The incomprehensible reports forced me to look for contractors specializing in website promotion.

I found a local company and entrusted them with the project, but it was no use either.

Then the understanding came that promotion should be carried out by real professionals.

After reading a lot of reviews, I found one of the best freelancers who assured me of success.

Six months later there were still no results.

It was the lack of results in organics for two years that led me to SEO.

This subsequently became my main calling.

Now I understand what was wrong with my initial promotion.

The same mistakes are repeated even by experienced SEO specialists who have spent years promoting websites.

The mistakes all came down to incorrect work with keywords.

Essentially, there was no understanding of what we were promoting.









Free keyword research tools

Free keyword research tools are important for finding interesting ideas.

You don't have to pay for them; sometimes registration is required.

I will tell you in detail the secrets of how to get ready-made semantics from these tools.

Collecting key lists is quite easy; there are many free and paid tools.

Let's start with 4 popular free resources that I use all the time.

1.1. Google Keyword Planner is the most universal tool; it lets you filter by region and language.

It is interesting because it selects a large set of related keywords and shows traffic and the level of competition in contextual advertising.

Requires registration with Google Adwords to work.

You also need to create at least one campaign, which does not require payment.

All these steps are visually straightforward, so let's move directly to the Keyword Planner.

To start working with the tool, click on the wrench (upper right corner) and select “Keyword Planner”.

The screenshot shows the new version of the design.

After this, you are taken to a page where you can enter many keyword options, search on a relevant page, or select the desired category.

In the new design interface we see such a window.

We will consider both options for selecting keywords.

OPTION 1

You see 2 modules.

  1. Find Keywords
  2. Get search volume data and forecasts

When you open the “Find Keywords” module, you see a form for entering key phrase variants, which must be entered separated by commas.

As we can see, the number of received options has already expanded significantly.

In the old interface there were no more than 700 of them, in the new one we got 1365 options.

The number of options received is still inferior to paid services, which select a wider list of low-frequency queries.

In the same window you can adjust the following functions.

  1. Region of the search queries
  2. Search network: Google only, or Google plus search partners
  3. Download the results in CSV (Excel) format
  4. By default, data is shown for the past year; the period can be adjusted for seasonal queries
  5. The number of options found
  6. Adjusting the displayed data or adding filters (by default only one filter is applied: hide adult content)

Monthly data is presented as a neat chart, which is very convenient for spotting seasonal queries.

Another important data point is which devices the queries come from: desktop or mobile.

Scroll down to see the list of keywords with their frequency and the minimum and maximum bid per click.
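Once the list is exported as CSV, a short script makes the first cut by search volume. This is only a sketch: the sample data and column names ("Keyword", "Avg. monthly searches") are assumptions, since real Keyword Planner exports vary by interface language and version.

```python
import csv
import io

# Hypothetical sample of a Keyword Planner CSV export.
sample = """Keyword,Avg. monthly searches,Competition
buy mobile phone,12100,High
cheap video camera,880,Medium
iphone screen repair,2400,Low
"""

def load_keywords(text, min_volume=1000):
    """Keep only keywords at or above a monthly-search threshold."""
    rows = csv.DictReader(io.StringIO(text))
    return [
        (row["Keyword"], int(row["Avg. monthly searches"]))
        for row in rows
        if int(row["Avg. monthly searches"]) >= min_volume
    ]

selected = load_keywords(sample)
```

With real exports you would read the file from disk instead of a string; the filtering logic stays the same.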

In the “Get search volume data and forecasts” module, we will enter the queries we considered earlier.

We receive conversion data for the selected keys: cost, number of conversions, conversion value, clicks.

This is valuable information for budgeting in Google Adwords, and it gives a rough benchmark for SEO.

I want to immediately upset those who plan to use only this tool.

The correctness of the data is highly questionable.

A well-known expert in the SEO world, Rand Fishkin, criticized the accuracy of traffic and the correctness of clustering.

Therefore, it is better to additionally use other known resources.

1.2. Wordstat.yandex.ru is the analogue from Yandex, which also shows traffic and related queries.

To work, you need to log in using Yandex mail or social networks.

1.3. Collect voice search queries. The question words why, who, what, how and where are frequently used in this segment.

See below for a list of popular words for voice search in the English-speaking segment.

At the same time, I want to warn you - do not over-optimize!

John Mueller, one of the Google analysts, warned about this.

There is no need to rework content specifically for voice search if doing so reduces its quality.

Think about behavioral factors; they are the most important parameters.

1.4. Predict the keywords. To do this, use a free keyword-collection utility.

I understand that the terminology is complex, so let’s look at it with an example.

Just create one query in the first column in this form: (synonym1|synonym2|synonym3) (synonym4|synonym5|synonym6).

For example: (repair|fix|restoration) (engines|internal combustion engines).
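The pattern above is just a compact way of writing every combination of one synonym per group. A few lines of Python show what the utility expands it into (the synonym lists here are illustrative, not the utility's actual data):

```python
from itertools import product

def expand(groups):
    """Generate every query combining one synonym from each group,
    i.e. the expansion of (a|b)(c|d) into 'a c', 'a d', 'b c', 'b d'."""
    return [" ".join(combo) for combo in product(*groups)]

queries = expand([
    ["repair", "fix"],
    ["engines", "internal combustion engines"],
])
```

Two groups of 2 synonyms give 4 queries; three groups of 3 would already give 27, which is why writing them out by hand does not scale.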

In the other columns, enter the regions, for example Moscow and the Moscow region (in the GEO column).

In the “Region” column, write down the region number according to Wordstat.

Then click the “GET KEYWORDS (1)” button; the “FIND MORE KEYWORDS (2)” button will appear, and the system will show Wordstat results excluding the words you have already used.

You can also click on the lines below (3) to check the results for selected groups.

Write unnecessary words in the MINUS words column.

The necessary ones are placed in other columns (for convenience, they are labeled as Properties, Types, Items, Brands, Transactional Queries, Regions, etc.).

For example, it is clear here that “do-it-yourself, video, mechanics” go into the minus, and “diesel, capital, turbines, blocks, injectors” will be useful to us for subpages and subsections (4).

After each update of the list, click “GET KEYWORDS (5)” and “FIND MORE KEYWORDS (6)” again, and repeat the cycle until only garbage is left in the results.

The system will substitute already used queries into the minus operator.
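The sorting step described above, dropping phrases that contain minus words and keeping the rest, is easy to reproduce in a script when you work with exported lists. A minimal sketch with made-up example phrases:

```python
def filter_minus(keywords, minus_words):
    """Drop any phrase that contains a negative (minus) word."""
    minus = set(minus_words)
    return [
        phrase for phrase in keywords
        if not (set(phrase.lower().split()) & minus)
    ]

raw = [
    "engine repair video",
    "diesel engine repair",
    "do-it-yourself engine repair",
]
clean = filter_minus(raw, ["video", "do-it-yourself"])
```

Here “video” and “do-it-yourself” knock out two phrases, leaving only the commercially useful one.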

The convenience of the utility is that it removes duplicates from the Yandex query results, which greatly simplifies the work.

Ready-made lists can be transferred to Excel by clicking on each line or simply dropping them directly into KeyCollector (after adding a list of negative keywords to the appropriate section).

This cuts the time for collecting semantics from several hours down to several minutes.

1.5. Ubersuggest - this tool was bought by the famous SEO guru Neil Patel for $120 thousand.

After that, he invested another $120 thousand US in improving it, and he is not stopping there.

He also promised that Ubersuggest will always be free.

The tool pulls its data from Google Keyword Planner and Google Suggest.

There is no need to register when using it, which is also a big plus.

This tool does not have a Russian-language version, but it is possible to obtain data using Russian-language keys.

To search for a list of keys, enter a high-frequency query, select a language and a search engine.

An additional option is to add a list of negative keywords in the field on the right.

The obtained data can be downloaded in Excel csv format.

This functionality is implemented at the bottom of the resulting list.

Paid tools for finding keys

Paid tools matter because they provide a fuller list of keywords.

They also provide additional important parameters for analyzing search keys.

I will tell you about 3 paid tools that I personally use.

Many low-frequency queries can also be selected using SEO resources: serpstat.com, ahrefs.com, semrush.com, moz.com, keywordtool.io and others.

You don’t need to pay for everything; choose the ones that suit you best.

These tools are paid, with different monthly plans.

If you need to get access one-time, contact me for freelancing.

For a small fee (from $5) I will provide you with information on your keys.

The free versions of these tools are limited.

To search for low-frequency keys, you need to enter a high-frequency query; the selected systems independently expand the possible options.

For the query “plastic windows” using Serpstat, we received 5200 options in Yandex.Moscow, in Google Russia - 3500.

For the same query, Ahrefs generated 7,721 variants of different keys.

By the way, Tim Soulo, the Ukrainian marketing specialist at Ahrefs, promised a six-month subscription to anyone who shows him a service that generates more keywords.

The same query in keywordtool.io collected only 744 keyword options, and this tool specializes only in keywords.

I use it mainly to search for keywords for YouTube, Amazon, and eBay.

After collecting a list of keys, it is important to distribute them across the pages of the site, that is, to cluster them.

I’ve mentioned that hard-to-pronounce word “clustering” several times already.

Let's look at it in more detail.

Let's start with a remix of a famous tongue twister to make pronunciation easier :-)

Keyword Clustering

Grouping keys across website pages is one of the most time-consuming tasks.

Some do it manually, some pay the appropriate services.

This is time-consuming and expensive.

I'll show you a free and quick way to group the semantic core.

One of the most common mistakes is incorrect grouping of keywords across the pages of the website being promoted or clustering of the semantic core.

It's like building a house and not having a construction plan.

Breaking down the list of keywords across site pages is the root of any promotion.

A search key is a question asked by an Internet user to which he wants a relevant answer.

Requests must match the content on the page.

Otherwise, users will start leaving your site.

The search engine will not show in the results a resource that has bad behavioral factors.

All of the tools listed above reduce grouping time when key phrases contain 3-4 words, but they lose many combinations along the way.

And what if there are really a lot of keys?

Manual clustering of several thousand keys sometimes takes up to several days.

It is necessary to compare the results for different homogeneous keys.

If the pages in the TOP match, then the keys can be combined into one group.

It is best to consider this issue with an example.

As you can see, the TOP has the same URLs, so there is no need to create individual pages for these queries because users are looking for the same content.

Even if several pages match in the results, this means that the keys can be combined into one group.
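The comparison just described, group two keywords together when their TOP results share enough URLs, can be sketched in a few lines. This is my own greedy simplification of the approach, not the code of any particular service; the SERP data and the threshold of 3 shared URLs are assumptions you would tune.

```python
def cluster_by_serp(serps, min_shared=3):
    """Greedily group keywords whose TOP URL sets overlap.

    serps: dict mapping keyword -> list of its TOP result URLs.
    Two keywords join one cluster if they share at least
    `min_shared` URLs.
    """
    clusters = []  # each element: (set of cluster URLs, list of keywords)
    for keyword, urls in serps.items():
        url_set = set(urls)
        for cluster_urls, keywords in clusters:
            if len(url_set & cluster_urls) >= min_shared:
                keywords.append(keyword)
                cluster_urls |= url_set  # widen the cluster's URL pool
                break
        else:
            clusters.append((url_set, [keyword]))
    return [keywords for _, keywords in clusters]

# Hypothetical TOP results for three queries.
serps = {
    "plastic windows": ["a.com", "b.com", "c.com", "d.com"],
    "buy plastic windows": ["a.com", "b.com", "c.com", "e.com"],
    "window repair": ["x.com", "y.com", "z.com"],
}
groups = cluster_by_serp(serps)
```

With three shared URLs, the first two queries merge into one group (one page can target both), while “window repair” stays separate and needs its own page.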

The main difficulties in clustering are checking several tens or even hundreds of thousands of keys.

In this situation, mistakes are inevitable.

People are not robots, they get tired.

Under deadline pressure, they end up doing the work incompletely.

This applies even to experienced SEOs.

For many years, seeing the same mistakes, I wanted to find a solution to this issue.

Several paid tools have appeared on the Internet that automate the work of key clustering.

But here the question also arises of quality, price and lead time.

For example, prices for clustering a list of up to 4000 keys are included in tariff plan B at serpstat.com.

Anything beyond the plan costs $20 per 3,000 keywords.

I respect the work of our colleagues who have created irreplaceable SEO tools, but to be honest, even for one average project this is very little.

Just one page of the site can lead from several hundred to several thousand keys.

The pricing policy can be understood; algorithms need to retrieve search results and compare results on similar pages.

These are the resources expended plus the commercial component.

At the same time, the search results are constantly changing, and the pages in the TOP change accordingly.

What was relevant may become irrelevant in a couple of months.

The second disadvantage is the time, although this is offset by the fact that you can start the process and come back once it is finished.

As a rule, this takes up to several hours, depending on the loading speed of the service.

We don't like to wait, let alone pay :-)

Therefore, we studied the problems of grouping keys as much as possible and created our revolutionary keyword clusterer, which solves the main problems:

  • our tool offers free clustering of an unlimited list of keys (if the service is overloaded, we will introduce a limit of up to 10K keys per day);
  • performs clustering in seconds;
  • allows you to set individual options depending on the SERP requirements;
  • removes junk and irrelevant queries;
  • combines synonyms into one group;
  • minimizes manual labor.

Using our clusterer, we created ready-made turnkey semantics for an English-language project of 80 thousand keys in just 20 minutes!

The theme is “dating”, but we haven’t lost sight of anything.

A month ago I would have said that this was madness, today it is reality.

The website has instructions on how to use the tool, as well as a “How it works” button.

Let's briefly talk about the main elements.

Important note: fields are not required.

It all depends on the selected keys.

For the primary test, I fill in only one field, “Count as one word.”

I further cluster the finished version.

  • Most often, you simply copy the keys and paste them into the clusterer form, for example from wordstat.yandex.ru or from two Excel columns. The system recognizes keys and numbers as separate components, and the data is distributed correctly in the final version.
  • The second option is to upload a file in txt, csv, xls or xlsx format. You can simply export semantics from Serpstat, Ahrefs, Google Keyword Planner, Key Collector or other tools. There is no need to process them specially for the clusterer; the system distributes everything according to the required parameters itself. If the clusterer does not understand which columns are which, a dialog box will appear asking you to clarify the selected columns.
  • Next, select the frequency level: HF (high-frequency), MF (mid-frequency), LF (low-frequency) or micro-frequency. Everything here is individual; try different options and check them against real results.
  • Decide whether to check the “Take geo-dependence into account” box. For example, suppose you are promoting websites in the city of Kharkov. If many pages in the TOP are not optimized for the city, geo-dependence fades into the background. But if your main query is “refrigerator repair in Kharkov,” then you need to take geo-dependence into account.
  • “Advanced clusters for semantics” groups non-clustered queries into the most relevant groups. When you disable this function, keys without a group go to “Not grouped”.
  • Next, fill out the form “Count as one word”. This is necessary in order to combine several words into a single whole. As a result, the system will not split phrases into separate clusters. For example: washing machine. The system will not divide words such as “washing” and “machine” into 2 clusters. Other examples: clothes for newborns, iPhone 8, online electronics store. If you have several such phrases, enter them separated by commas.
  • Negative keywords immediately eliminate irrelevant keywords from the list, for example the word “free”. To keep phrases that only use a different grammatical form of the word (such as “free shipping” in the Russian original), use the exclamation-mark operator “!”: it prohibits the system from matching inflected forms. For example: !free.
  • The list of ignored words contains words that do not affect the search results. The system automatically ignores prepositions in the Russian and English segments, so you do not need to enter them. For example, in the phrase “Apple iPhone X”, the word “Apple” does not affect the results in any way, because users are simply looking for the iPhone. To avoid creating an extra cluster, add it to this form.
  • The last form is synonyms. For example, for commercial queries the words “buy”, “price” and “cost” mean the same thing. The system recognizes them as synonyms automatically, so you do not need to enter them. Enter other synonyms, such as “iPhone” and “iphone”, or pairs like “choose” and “select”, which have the same meaning in the Russian-speaking segment. If there are many synonyms, click the plus and add more options.
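To make the settings above less abstract, here is my own rough sketch of the kind of pre-processing they imply: gluing multi-word terms, mapping synonyms to one canonical form, and applying minus words, where "!word" matches only the exact form while a plain word also catches derived forms (a crude prefix check stands in for real morphology). This is an illustration of the idea, not the tool's actual code.

```python
def normalize(phrase, one_word=(), synonyms=None, minus=()):
    """Pre-process one key phrase; return its token list,
    or None if a minus word eliminates it."""
    synonyms = synonyms or {}
    text = phrase.lower()
    # Glue multi-word terms like "washing machine" into one token.
    for term in one_word:
        text = text.replace(term, term.replace(" ", "_"))
    # Replace each word with its canonical synonym, if any.
    tokens = [synonyms.get(t, t) for t in text.split()]
    for m in minus:
        if m.startswith("!"):
            if m[1:] in tokens:                      # exact form only
                return None
        elif any(t.startswith(m) for t in tokens):   # any derived form
            return None
    return tokens
```

For example, with the minus word "free" the phrase "freely available download" is eliminated, but with "!free" it survives, since only the exact form "free" would match.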

To get the final version, click “SEARCH” and get a clustered list.

Relevant keys are highlighted with ticks.

We compared the results with paid clusterers; the accuracy of the data obtained in our tool is higher.

It is more convenient and faster even than Excel, which slows down when you add a huge keyword list and a large number of formulas.

I would post the results of our comparisons, but I think this would be incorrect in relation to our colleagues.

Plus, it is biased on our part to give examples that may be considered successful.

Therefore, I leave everything to the readers’ judgment.

I will be glad to hear your opinion in the comments.

Of course, our clusterizer is not a magic pill that solves all problems.

Even Google tools do not show accurate data in clustering.

Our clusterer is a huge time saver.

Ready-made lists are easier to check and organize across site pages.

Promotion for low-frequency queries

Promotion for low-frequency queries is the start for any young project.

With a limited budget, don't try to knock large, experienced projects out of the TOP 10.

I will show you effective ways to find low-frequency keys.

The majority of young website owners initially select high-frequency and mid-frequency queries.

These are keys like “buy iphone”, “apartment rental”, etc.

For these queries, the TOP is occupied by high-trust sites that clearly don't intend to leave it.

SEO budgets for such resources are many times higher, plus additional trust helps promote them with less effort.

You will never push the well-known sites with millions of visits out of the TOP.

A young resource needs to concentrate on low-frequency queries, especially since, according to MOZ analysis, 80% of all online sales come from low-frequency queries.

Low-frequency queries contain 4 or more words and have a frequency of up to 1,000 searches per month.

Create content for them and get traffic in the near future.
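The definition above (4+ words, up to 1,000 searches a month) translates directly into a filter over any keyword list. A small sketch with invented example data:

```python
def low_frequency(keywords, max_volume=1000, min_words=4):
    """Select low-frequency queries: 4+ words, up to 1000 searches/month."""
    return [
        (phrase, volume) for phrase, volume in keywords
        if len(phrase.split()) >= min_words and volume <= max_volume
    ]

data = [
    ("buy iphone", 90500),
    ("buy iphone 8 plus cheap", 720),
    ("apartment rental", 40500),
    ("cheap apartment rental new york monthly", 480),
]
lf_keys = low_frequency(data)
```

The two short, high-volume heads are dropped; the long-tail variants are what a young site should write content for.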

You can search for low-frequency queries using various tools.

Let's look at the main ones.

4.1. Use search suggestions: Google, Yandex, Bing, Facebook, Twitter, Pinterest, Wikipedia, Amazon, any other sites that have this function.

This, of course, is a lot of manual work and a complete headache, but it is this approach that allows you to find real keys for promotion.
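Part of that manual work can be scripted. Many tools read Google's unofficial suggest endpoint; this is an assumption on my part (the endpoint is undocumented and may change or be rate-limited), so treat the sketch below as illustrative.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Unofficial suggest endpoint used by many keyword tools (assumption:
# undocumented, may change without notice).
BASE = "https://suggestqueries.google.com/complete/search"

def suggest_url(query, lang="en"):
    """Build the request URL for one seed query."""
    return BASE + "?" + urlencode({"client": "firefox", "hl": lang, "q": query})

def parse_suggestions(payload):
    """The endpoint returns JSON shaped like ["query", ["s1", "s2", ...]]."""
    return json.loads(payload)[1]

def fetch_suggestions(query):
    """Fetch live suggestions (requires network access)."""
    with urlopen(suggest_url(query)) as resp:
        return parse_suggestions(resp.read().decode("utf-8"))
```

Running `fetch_suggestions("fly ticket")` would return the live autocomplete list for that seed; looping it over your seed keywords automates the collection.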

4.2. Use forums that cover your topic, especially ones like Reddit.

Find threads that have collected a lot of comments on your topic.

Copy the name of the thread and create content for these keys.

Let's look at an example of how to compete for well-known queries with such monsters as Amazon, Expedia, Yelp in the American segment.

For example, you are promoting the query “fly ticket” (air tickets).

These keys are used to rank sites such as Expedia, Kayak, which have more than 4 million traffic for branded queries alone!

Check the search results, the first 4 sites are contextual advertising.

And then in organic there are only monsters with traffic of at least several million.

Believe me, it’s impossible to compete with them for these keys.

You need to look for queries that these resources do not promote.

Many Western SEO companies working with small commercial sites don't use keyword selection tools at all.

Enter your basic query into Reddit search.

Check popular threads that have received a lot of points and comments.

Copy the title or its main part.

As an example, I entered the keyword “fly ticket” into the Reddit search and looked through popular threads.

Don't be fooled by the predicted traffic based only on the keys you see in the topic of the thread.

If your goal is to get to the TOP and receive traffic, then you need to analyze this parameter.

Some experts check the cost per click and the level of competition in contextual advertising, but this data may differ significantly from SEO indicators.

This is more interesting for informational purposes, but not for determining the budget for SEO.

To analyze the level of competition in SEO, it is best to use ahrefs.com, majestic.com, moz.com, semrush.com.

Recently, semrush merged its donor databases with majestic, so the quality of donor verification there is also high.

Don’t try to push highly competitive requests with a small budget.

It's better to focus on keywords with a low level of competition.

LSI (related queries)

Related queries (LSI) increase content visibility and therefore traffic.

More traffic means more sales.

I'll show you all the effective methods of finding LSI keywords.

LSI (Latent Semantic Indexing) queries are the related queries shown at the bottom of the search results page.

The search engine uses them for readers who have not found useful information in the TOP 10 so that they can formulate a request differently.

Using such keys, you can expand content or create new ones.

This already depends on your clustering.

Note that when promoting a site in another region, related queries are shown based on your current IP, so the suggestions are tailored to your own region rather than the target one.

If you don't want to play with changing your IP, use the Google Chrome extension GeoClever.

After installing it, directly in the search you can select any city in the world, even little-known ones.

You can quickly get a list of search suggestions using wordstat.yandex.ru.

To do this, after entering the main key, look through the right block.

Let's check the query “SEO optimization”.

As you can see, there are more options received than in Yandex and Google.

Well, if you want to collect all the homogeneous requests for Youtube, Bing, Yahoo, Ebay, Amazon, Google that the software can collect, then use Scrapebox (see point 5).

The disadvantage of this program is that it is paid, it costs $67.

To operate, it requires a database of proxy IPs, which can be purchased online.

The advantage is that such a large number of search suggestions is hard to get anywhere else.

The software is also multifunctional and helps automate many other manual processes.

Using Scrapebox, I collected 7,786 results for the query “SEO optimization”.

Of course, many of these keys are junk.

Use the clusterer from step 3 to filter out unnecessary keys.

The program also allows you to check the actual traffic of selected requests.

Pareto principle

Choosing priorities is important for getting results.

To do this, use the Pareto principle.

I will show you the most effective methods for selecting priority keys to promotion.

Italian economist Vilfredo Pareto formulated the principle, around 1896, that 20% of effort produces 80% of results.

He discovered that 20% of the Italian population owns 80% of the land area, 20% of the pea bushes produce 80% of the crop.

This principle still works today.
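Applied to keywords, the principle means a small share of queries carries most of the traffic. A sketch that ranks keywords by volume and keeps only the ones that together cover 80% of the total (the data is invented for illustration):

```python
def pareto_top(keywords, share=0.8):
    """Return the keywords that together account for `share` of total traffic."""
    ranked = sorted(keywords, key=lambda kv: kv[1], reverse=True)
    threshold = share * sum(volume for _, volume in ranked)
    picked, accumulated = [], 0
    for phrase, volume in ranked:
        if accumulated >= threshold:
            break
        picked.append(phrase)
        accumulated += volume
    return picked

data = [("a", 800), ("b", 100), ("c", 50), ("d", 30), ("e", 20)]
priority = pareto_top(data)
```

In this toy example one keyword out of five (20% of the list) delivers 80% of the traffic, so it is the one to prioritize.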

Dear women, it follows from seasonal trends like these that men prepare their holiday congratulations in advance.

But SEO specialists should be ready to promote these keys even earlier.

Don't try to push a highly competitive request in a short time frame.

It's like trying to get in shape a month before beach season.

Those who didn't start in time are already late.

Prepare, as I always do, for next year.

Meta information optimization

The purpose of meta information is to tell the user what your page is about.

Meta tags also help search engines match keywords with site content.

I'll show you how to properly optimize meta information on your website.

Have you selected a list of keys and divided them into pages?

Now move on to creating meta information - Title & Description.

Many people split this process and hand the writing of tags over to copywriters.

Do not do this under any circumstances.

Even great copywriters write meta tags incorrectly.

As a result, there will be no traffic, because your keywords do not match the content.

And, as we know, the more clicks, the more conversions.

Search engines will not show sites in the TOP that they do not go to.

They regard them as not relevant.

Let's first look at what meta tags are and where they can be found.

Title is a tag in the page code that looks like this: <title>This is the name of your page</title>

It is embedded in the page code and is not displayed in internal content.

You can see it in several places: in your browser tab, when a page is shared on social networks such as Facebook, and, most importantly, in the search results.

The Description meta tag, a brief description of the page, appears in the code as follows: <meta name="description" content="A brief description of your page">

The size of the displayed Meta Description tag in Google is about 160 characters.

This meta tag does not need to be specified, unlike Title.

In such a situation, the search engine selects content from your page that will be most relevant to the search keywords.

If you are not sure about the automatic selection of a search engine, enter the Description.
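Since Google displays roughly 160 characters of the Description (as noted above) and, as a rule of thumb, about 60 characters of the Title (an assumption: Google actually truncates titles by pixel width, not characters), a tiny checker helps catch tags that will be cut off:

```python
def check_meta(title, description, title_limit=60, desc_limit=160):
    """Flag a Title/Description pair that exceeds typical display limits."""
    issues = []
    if len(title) > title_limit:
        issues.append(f"Title too long: {len(title)} > {title_limit}")
    if len(description) > desc_limit:
        issues.append(f"Description too long: {len(description)} > {desc_limit}")
    return issues
```

Run it over every page's tags before publishing; an empty list means both fit the limits.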

How to increase Title click-through rate?

The principle is simple: in the search form, insert the URL or write a potential Title.

In the results obtained, the system gives you a score from 0 to 100 and gives recommendations for optimization.

Let's take a closer look at Title optimization methods.

9.1. Add emoticons

Odd numbers attract more attention than the standard 10, 20, 50, etc.

Why do you think the title of this article is “Semantic Core of a Website: 9 Examples of Composition (Course 2018)”?

The number “9” is more realistic than 10, 20, 50, 100...

Odd numbers don't create a feeling of understatement or, conversely, of padded information: in this case we listed our nine best methods for compiling a semantic core rather than stretching the list to a round ten.

9.3. Use brackets

Using brackets increases click-through rates by 38%, according to Hubspot analysis.

This is a huge plus because you can insert synonyms into brackets or highlight important data.

Use different brackets: round, square.

9.4. Arouse curiosity

The best way to get a click is to create curiosity.

Think about what might make your audience feel this way.

Remember these emerging news that some star has died.

When you go to the site, it turns out that this was fake information.

The main goal is to click!

The best example is Dale Carnegie and the titles of his books: “How to Win Friends and Influence People,” “How to Stop Worrying and Start Living.”

These titles have been provoking people to read his work for several generations.

9.5. Include encouraging words

Encouraging words are usually placed in the content, but why not add them to the Title as well?

To do this, use different options: discounts, cheap, get it for free, download.

To search for motivating words, analyze the results of contextual advertising.

In Google Adwords and Yandex.Direct, it is very important to attract a lot of clicks.

If your ad is not clicked on, each click costs you more, which is why PPC specialists pay special attention to this.

Let's look at how to find motivating words using examples.

Let’s enter the search keys “buy iPhone 8 Kyiv”.

Collect a database of motivating words from here, select those that correspond to the content on the site.

Another technique is used when setting up remarketing.

Marketers lure those who left the cart without purchasing with additional discounts.

This significantly increases the percentage of sales.

Try the same technique when filling out your Title.

Offer discounts and promotions. People love it!

9.6. Use your domain in Title

When I talked about using a domain in Title in the Everest technique, some Internet users wrote in the comments that this was complete nonsense.

Honestly, I thought so too.

I didn’t understand why many sites use their brand in a short title.

Instead, you can add additional keys there.

My opinion changed dramatically after I read a lot of research on this matter.

The bottom line is your openness to Internet users.

It is the addition of a brand that significantly increases click-through rates.

It is best to use your brand at the end of the Title.

If it is included at the beginning, it moves attention away from the main keys.

9.7. Capitalize every word

It is this point that causes the most doubt among many SEOs.

They doubt that this complies with the rules of the Russian language.

In English, it is considered correct to capitalize every word in a title.

Let's see how everything happens in contextual advertising.

As we can see from the screenshot, this technique is used not only in the Title, but also in the description.

Why is this being done?

Capital letters attract more attention, which increases the percentage of clicks.

In Russian-language organics, this technique is rarely used, so I leave everything to the readers’ judgment.

Personally, I have not found any rules of the Russian language that indicate that this is not correct.

I suggest discussing it in the comments.

CONCLUSION

Initially, I wanted to write an article about website SEO optimization and started with keywords.

But in the process of creating the material, it became clear that there is a lot of information on this issue.

This is how we came up with an article about searching and compiling keywords.

To find keys, you don't need to limit yourself to one tool.

This is somewhat similar to a brainstorming session (see point 1), where all the ideas are initially collected over the course of several days by the entire team.

At the final stage, real ideas for which there is time and resources are selected from the good ones.

It’s the same with keys: initially you need to collect a huge list of queries.

To do this, it is important to use paid and free tools.

The next step is to eliminate a large number of irrelevant keywords and keep only those that match your goals.

To do this, use a keyword clusterer, which will collect all the keywords into groups.

They must match your priorities.

Don't try to promote everything.

A bird in the hand is worth two in the bush.

Use the Pareto principle - 20% of goods bring 80% of the profit.

Focus on low-frequency keywords, which account for 80% of all online sales.

Don't try to fight large, experienced sites that invest millions in promotion on a tight budget.

Better find your niche.

Use forums and search tips for this.

Use LSI (uniform queries) to expand the list of keys and existing content.

Check the seasonality of the selected keywords using Google Trends.

Prepare in a timely manner for promotion.

Don't leave it until the last moment.

Optimize meta information for the selected keywords, especially Title.

This is the second most important internal ranking factor; its attractiveness determines whether a visitor will go to your site or not.

If you are reading these lines, it means you have mastered the article, for which I am incredibly grateful to you.

I suggest continuing the discussion in the comments.

The semantic core is a scary name that SEOs came up with to denote a rather simple thing. We just need to select the key queries for which we will promote our site.

And in this article I will show you how to correctly compose a semantic core so that your site quickly reaches the TOP, and does not stagnate for months. There are also “secrets” here.

And before we move on to compiling the semantic core, let's figure out what it is and what we should ultimately end up with.

What is the semantic core in simple words

Oddly enough, the semantic core is just an ordinary Excel file that lists the key queries for which you (or your copywriter) will write articles for the site.

For example, this is what my semantic core looks like:

I have marked in green those key queries for which I have already written articles. Yellow - those for which I plan to write articles in the near future. And colorless cells mean that these requests will come a little later.

For each key query, I have determined the frequency and competitiveness, and come up with a "catchy" title. You should end up with roughly the same file. Right now my semantic core consists of 150 keywords. This means I have "material" for at least 5 months in advance (even if I write one article a day).

Below we will talk about what you should prepare for if you suddenly decide to order the collection of the semantic core from specialists. Here I will say briefly - they will give you the same list, but for thousands of "keys". However, in a semantic core it is not quantity that matters, but quality. And that is what we will focus on.

Why do we need a semantic core at all?

But really, why do we need this torment? You can, after all, just write quality articles and attract an audience, right? Yes, you can write, but you won’t be able to attract people.

The main mistake of 90% of bloggers is simply writing high-quality articles. I'm not kidding, they have really interesting and useful materials. But search engines don’t know about it. They are not psychics, but just robots. Accordingly, they do not rank your article in the TOP.

There is another subtle point, and it concerns the title. For example, you have a very high-quality article on the topic "How to properly run a business on Facebook." There you describe everything about Facebook in great detail and professionally, including how to promote communities there. Your article is the highest-quality, most useful and interesting one on the Internet on this topic. Nothing else comes close. But it still won't help you.

Why high-quality articles fall out of the TOP

Imagine that your site was visited not by a robot, but by a live inspector (assessor) from Yandex. He realized that you have the coolest article, and manually put you in first place in the search results for the query "Promoting a community on Facebook."

Do you know what will happen next? You will fly out of there very soon anyway. Because no one will click on your article, even in first place. People enter the query "Promoting a community on Facebook," and your headline is "How to properly run a business on Facebook." Original, fresh, funny, but... not what they asked for. People want to see exactly what they were looking for, not your creativity.

Accordingly, your article will vacate its place in the TOP of search results. And the living assessor, an ardent admirer of your work, can beg his superiors all he wants to leave you at least in the TOP 10. It won't help. All the first places will be taken by empty articles, like sunflower-seed husks, that yesterday's schoolchildren copied from each other.

But these articles will have the correct "relevant" title - "Promoting a community on Facebook from scratch" (step by step, in 5 steps, from A to Z, free, etc.). Offensive? Of course. So let's fight the injustice and create a competent semantic core, so that your articles take their well-deserved first places.

Another reason to start compiling your semantic core right now

There is one more thing that for some reason people don't think much about. You need to write articles often - at least every week, and preferably 2-3 times a week - to gain more traffic, faster.

Everyone knows this, but almost no one does it. And all because they have “creative stagnation,” “they just can’t force themselves,” “they’re just lazy.” But in fact, the whole problem lies in the absence of a specific semantic core.

I entered one of my basic keys into the search field - “smm”, and Yandex immediately gave me a dozen hints about what else might be interesting to people who are interested in “smm”. All I have to do is copy these keys into a notebook. Then I will check each of them in the same way, and collect hints on them as well.

After the first stage of collecting the semantic core, you should end up with a text document containing 10-30 broad basic keys, which we will work with further.

Step #2 — Parsing basic keys in SlovoEB

Of course, if you write an article for the request “webinar” or “smm”, then a miracle will not happen. You will never be able to reach the TOP for such a broad request. We need to break the basic key into many small queries on this topic. And we will do this using a special program.

I use Key Collector, but it's paid. You can use a free analogue - the SlovoEB program. You can download it from the official website.

The most difficult thing about working with this program is setting it up correctly. I have already shown how to properly set up and use SlovoEB, but in that article I focused on selecting keys for Yandex Direct.

And here let’s look step by step at the features of using this program for creating a semantic core for SEO.

First, we create a new project and name it after the broad key you want to parse.

I usually give the project the same name as my base key to avoid confusion later. And yes, I will warn you against one more mistake. Don't try to parse all base keys at once. Then it will be very difficult for you to filter out “empty” key queries from golden grains. Let's parse one key at a time.

After creating the project, we carry out the basic operation. That is, we actually parse the key through Yandex Wordstat. To do this, click the "Wordstat" button in the program interface, enter your base key, and click "Start collection".

For example, let's parse the base key for my blog “contextual advertising”.

After this, the process will start, and after some time the program will give us the result - up to 2000 key queries that contain “contextual advertising”.

Also, next to each request there will be a “dirty” frequency - how many times this key (+ its word forms and tails) was searched per month through Yandex. But I do not advise drawing any conclusions from these figures.

Step #3 - Collecting the exact frequency for the keys

Dirty frequency will not show us anything. If you focus on it, then don’t be surprised when your key for 1000 requests does not bring a single click per month.

We need to identify the pure frequency. To do this, first select all the found keys with checkmarks, then click the "Yandex Direct" button and start the process again. Now SlovoEB will look up the exact monthly frequency for each key.

Now we have an objective picture - how many times what query was entered by Internet users over the past month. I now propose to group all key queries by frequency to make it easier to work with them.

To do this, click the "filter" icon in the "Frequency" column and specify the condition "less than or equal to 10".

Now the program will show you only those requests whose frequency is less than or equal to the value “10”. You can delete these queries or copy them to another group of key queries for future use. Less than 10 is very little. Writing articles for these requests is a waste of time.
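The filtering just described is easy to reproduce outside the program. A small Python sketch with invented phrase/frequency pairs, keeping a reserve group rather than deleting outright:

```python
# Hypothetical (phrase, exact monthly frequency) pairs, like the
# ones SlovoEB collects via Yandex Direct.
keys = [
    ("contextual advertising setup", 480),
    ("contextual advertising course free", 8),
    ("contextual advertising moscow price", 35),
    ("contextual advertising 2007", 3),
]

THRESHOLD = 10  # "less than or equal to 10" is too little traffic

# Keep promising phrases; move the rest to a reserve group
# for possible later use instead of deleting them.
kept = [(p, f) for p, f in keys if f > THRESHOLD]
reserve = [(p, f) for p, f in keys if f <= THRESHOLD]
```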

Now we need to select those key queries that will bring us more or less good traffic. And to do this, we need to find out one more parameter - the level of competitiveness of the request.

Step #4 — Checking the competitiveness of requests

All "keys" in this world are divided into 3 types: high-frequency (HF), mid-frequency (MF), and low-frequency (LF). They can also be highly competitive (HC), moderately competitive (MC), and low-competitive (LC).

As a rule, HF queries are also HC. That is, if a query is often searched on the Internet, there are a lot of sites that want to rank for it. But this is not always the case; there are happy exceptions.

The art of compiling a semantic core lies precisely in finding queries that have a high frequency and a low level of competition. It is very difficult to manually determine the level of competition.

You can look at indicators such as the number of home pages in the TOP 10, the length and quality of the texts, and the trust and citation index of the sites in the TOP search results for the query. All of this will give you some idea of how tough the competition for rankings is for this particular query.

But I recommend you use the Mutagen service. It takes into account all the parameters I mentioned above, plus a dozen more that neither you nor I have probably even heard of. After the analysis, the service gives an exact value - the level of competition for this query.

Here I checked the query “setting up contextual advertising in google adwords”. Mutagen showed us that this key has a competitiveness of "more than 25" - this is the maximum value it shows. And this query has only 11 views per month. So it definitely doesn’t suit us.

We can copy all the keys that we found in SlovoEB and do a mass check in Mutagen. After that, all we have to do is look through the list and take those queries that have a high frequency and a low level of competition.

Mutagen is a paid service. But you can do 10 checks per day for free. In addition, the cost of checking is very low. In all the time I have been working with it, I have not yet spent even 300 rubles.

By the way, about the level of competition. If you have a young site, then it is better to choose queries with a competition level of 3-5. And if you have been promoting for more than a year, then you can take 10-15.
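Putting the frequency floor and competition ceiling together, the selection logic might be sketched like this in Python. The frequencies and Mutagen-style competition scores here are invented for illustration; only the thresholds come from the text.

```python
def worth_writing(freq, competition, site_age_years=0):
    """Decide whether a key is worth an article: enough monthly
    searches, and competition below a ceiling that depends on how
    established the site is (3-5 for a young site, up to 15 for
    one promoted over a year)."""
    max_competition = 5 if site_age_years < 1 else 15
    return freq >= 100 and competition <= max_competition

candidates = [
    ("setting up contextual advertising in google adwords", 11, 25),
    ("contextual advertising for beginners", 320, 4),
    ("what is contextual advertising", 2400, 18),
]
good = [p for p, f, c in candidates if worth_writing(f, c)]
# only the mid-frequency, low-competition key survives
```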

By the way, regarding the frequency of requests. We now need to take the final step, which will allow you to attract a lot of traffic even for low-frequency queries.

Step #5 — Collecting “tails” for the selected keys

As has been proven and tested many times, your site will receive the bulk of traffic not from the main keywords, but from the so-called “tails”. This is when a person enters strange key queries into the search bar, with a frequency of 1-2 per month, but there are a lot of such queries.

To see the “tail”, just go to Yandex and enter the key query of your choice into the search bar. Here's roughly what you'll see.

Now you just need to write down these additional words in a separate document and use them in your article. Moreover, there is no need to always place them next to the main key. Otherwise, search engines will see “over-optimization” and your articles will fall in search results.

Just use them in different places in your article, and then you will receive additional traffic from them as well. I would also recommend that you try to use as many word forms and synonyms as possible for your main key query.

For example, we have a request - “Setting up contextual advertising”. Here's how to reformulate it:

  • Setup = set up, make, create, launch, run, enable, place...
  • Contextual advertising = context, Direct, teaser ads, YAN, AdWords, KMS...

You never know exactly how people will search for information. Add all these additional words to your semantic core and use them when writing texts.
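The reformulation above is essentially a Cartesian product of word forms. A short Python sketch, with hypothetical synonym lists for the base key "setting up contextual advertising":

```python
from itertools import product

# Hypothetical word forms for the two halves of the base key.
actions = ["set up", "create", "run", "enable"]
subjects = ["contextual advertising", "yandex direct", "google adwords"]

# Every action/subject pair is a candidate phrasing to weave
# into different places in the article.
variants = [f"{a} {s}" for a, s in product(actions, subjects)]
# 4 actions x 3 subjects = 12 variant phrasings
```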

So, we collect a list of 100 - 150 key queries. If you are creating a semantic core for the first time, it may take you several weeks.

Or is it not worth ruining your eyes over this? Maybe you can delegate compiling the semantic core to specialists who will do it better and faster? Yes, there are such specialists, but you don't always need to use their services.

Is it worth ordering a semantic core from specialists?

By and large, specialists in compiling a semantic core will only give you steps 1 - 3 from our scheme. Sometimes, for a large additional fee, they will do steps 4-5 - (collecting tails and checking the competitiveness of requests).

After that, they will give you several thousand key queries that you will need to work with further.

And the question here is whether you are going to write the articles yourself, or hire copywriters for this. If you want to focus on quality rather than quantity, then you need to write it yourself. But then it won't be enough for you to just get a list of keys. You will need to choose topics that you understand well enough to write a quality article.

And here the question arises - why then do we actually need semantic core specialists? Agree, parsing the base key and collecting exact frequencies (steps #1-3) is not at all difficult. It will literally take you half an hour.

The most difficult thing is to pick out HF queries that have low competition. And then, as it turns out, you need those HF-LC keys on which you can write a good article. This is exactly what will take 99% of your time working on the semantic core. And no specialist will do this for you. Well, is it worth spending money on such services?

When are the services of semantic core specialists useful?

It’s another matter if you initially plan to attract copywriters. Then you don't have to understand the subject of the request. Your copywriters won’t understand it either. They will simply take several articles on this topic and compile “their” text from them.

Such articles will be empty, miserable, almost useless. But there will be many of them. On your own, you can write a maximum of 2-3 quality articles per week. And an army of copywriters will provide you with 2-3 shitty texts a day. At the same time, they will be optimized for requests, which means they will attract some traffic.

In this case, yes, calmly hire semantic core specialists. Let them also draw up technical specifications for the copywriters at the same time. But you understand, this will cost money too.

Summary

Let's go over the main ideas in the article again to reinforce the information.

  • The semantic core is simply a list of key queries for which you will write articles on the site for promotion.
  • It is necessary to optimize texts for precise key queries, otherwise even your highest-quality articles will never reach the TOP.
  • The semantic core is like a content plan for social networks. It helps you avoid falling into a "creative crisis" and always know exactly what you will write about tomorrow, the day after tomorrow, and in a month.
  • To compile a semantic core, it is convenient to use the free SlovoEB program; you just need to set it up correctly.
  • Here are the five steps of compiling a semantic core: 1 - Selecting basic keys; 2 - Parsing basic keys; 3 - Collecting exact frequencies for queries; 4 - Checking the competitiveness of keys; 5 - Collecting "tails".
  • If you want to write articles yourself, then it is better to create the semantic core yourself, for yourself. Semantic core specialists will not be able to help you here.
  • If you want to work on quantity and use copywriters to write articles, then it is quite possible to delegate and compile the semantic core. If only there was enough money for everything.

I hope this instruction was useful to you. Save it to your favorites so as not to lose it, and share it with your friends. Don't forget to download my book. There I show the fastest way from zero to the first million on the Internet (distilled from 10 years of personal experience =)

See you soon!

Yours Dmitry Novoselov

In our article, we explained what a semantic core is and gave general recommendations on how to compose it.

It's time to look at this process in detail, creating a semantic core for your site step by step. Stock up on pencils and paper, and most importantly, time. And join...

We create a semantic core for the site

As an example, let's take the site http://promo.economsklad.ru/.

The company's field of activity: warehouse services in Moscow.

The site was developed by specialists of our website service, and the semantic core of the site was developed in stages in 6 steps:

Step 1. Compile a primary list of keywords.

After conducting a survey of several potential clients, studying three sites close to our topic and using our own brains, we compiled a simple list of keywords that, in our opinion, reflect the content of our site: warehouse complex, warehouse rental, storage services, logistics, storage space rental, warm and cold warehouses.

Task 1: Review competitors' websites, consult with colleagues, brainstorm and write down all the words that, in your opinion, describe YOUR site.

Step 2. Expanding the list.

Let's use the service http://wordstat.yandex.ru/. In the search line, enter each of the words from the primary list one by one:


We copy the refined queries from the left column into an Excel spreadsheet, look through the associative queries in the right column, select those that are relevant to our site, and enter them into the table as well.

After analyzing the phrase “Warehouse rental,” we received a list of 474 refined and 2 associative queries.

Having carried out a similar analysis of the remaining words from the primary list, we received a total of 4,698 refined and associative queries that were entered by real users in the past month.

Task 2: Collect a complete list of queries on your site by running each of the words in your primary list through Yandex.Wordstat query statistics.

Step 3. Cleanup

First, we remove all phrases with fewer than 50 impressions per month: "how much does it cost to rent a warehouse" - 45 impressions, "warehouse rental 200 m" - 35 impressions, etc.

Secondly, we remove phrases that are not related to our site, for example, "warehouse rental in St. Petersburg" or "warehouse rental in Yekaterinburg", since our warehouse is located in Moscow.

We also remove the phrase "warehouse lease agreement download": a sample agreement may well be present on our site, but there is no point in actively promoting this query, since a person who is looking for a sample contract is unlikely to become a client. Most likely, he has already found a warehouse or owns one himself.

Once you remove all unnecessary queries, the list will be significantly reduced. In our case with “warehouse rental,” out of 474 refined queries, only 46 relevant to the site remained.

And when we cleaned the full list of refined queries (4,698 phrases), we received the Semantic Core of the site, consisting of 174 key queries.

Task 3: Clean up the previously created list of refined queries, excluding from it low-frequency keywords with less than 50 impressions and phrases that are not related to your site.
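Task 3 can be sketched as a simple filter. The phrases, frequencies, and stop words below are illustrative, chosen to match the warehouse example above:

```python
MIN_IMPRESSIONS = 50
# Markers of irrelevant phrases for a Moscow-only warehouse:
# other cities, plus intents that will not convert.
STOP_WORDS = {"petersburg", "yekaterinburg", "download"}

raw = [
    ("warehouse rental moscow", 1200),
    ("warehouse rental 200 m", 35),
    ("warehouse rental in st petersburg", 800),
    ("warehouse lease agreement download", 150),
    ("safekeeping warehouse", 310),
]

# Keep only frequent-enough phrases with no stop word in them.
core = [
    (phrase, freq)
    for phrase, freq in raw
    if freq >= MIN_IMPRESSIONS and not (set(phrase.split()) & STOP_WORDS)
]
```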

Step 4. Revision

Since you can use 3-5 different keywords on each page, we won’t need all 174 queries.

Considering that the site itself is small (maximum 4 pages), we select 20 from the full list, which, in our opinion, most accurately describe the company’s services.

Here they are: warehouse rental in Moscow, warehouse space rental, warehouse and logistics, customs services, safekeeping warehouse, warehouse logistics, logistics services, office and warehouse rental, safekeeping of goods and so on….

These keyword phrases include low-frequency, mid-frequency, and high-frequency queries.

Please note that this list is significantly different from the primary one taken from your head. And it is definitely more accurate and efficient.

Task 4: Reduce the list of remaining words to 50, leaving only those that, in your experience and opinion, are most optimal for your site. Don't forget that the final list should contain queries of varying frequencies.

Conclusion

Your semantic core is ready, now is the time to put it into practice:

  • review the texts on your site, maybe they should be rewritten.
  • write several articles on your topic using selected key phrases, post the articles on the site, and after search engines index them, register in article directories. Read “One unusual approach to article promotion.”
  • pay attention to search advertising. Now that you have a semantic core, the effect of advertising will be much higher.

Organic search is the most effective source of targeted traffic. To use it, you need to make the site interesting to users and visible to the Yandex and Google search engines. There is no need to reinvent the wheel here: it is enough to determine what the audience of your project is interested in and how they search for information. This problem is solved by building a semantic core.

The semantic core is a set of words and phrases that reflect the theme and structure of the site. Semantics is a branch of linguistics that studies the meaning of language units. In this sense, the semantic core is the site's core of meaning. Remember this remark: it will keep you from slipping into keyword stuffing, cramming content with key phrases.

By creating a semantic core, you answer the global question: what information can be found on the site. Since customer focus is considered one of the main principles of business and marketing, the creation of a semantic core can be looked at from a different perspective. You need to determine what search queries users use to find information that will be published on the site.

Building a semantic core also solves another problem: distributing search phrases across the pages of the resource. By working with the core, you determine which page most accurately answers a specific search query or group of queries.

There are two approaches to solving this problem.

  • The first assumes creating a website structure based on the results of analyzing user search queries. In this case, the semantic core determines the framework and architecture of the resource.
  • The second approach involves preliminary planning of the resource structure before analyzing search queries. In this case, the semantic core is distributed over the finished frame.

Both approaches work one way or another. But it is more logical to first plan the site structure and then determine the queries by which users can find a given page. In that case you remain proactive: you choose what to tell potential clients. If you instead tailor the structure to the keys, you remain reactive, responding to the environment rather than actively changing it.

Here it is necessary to clearly emphasize the difference between the “SEO” and marketing approaches to building the core. Here is the logic of a typical old-school SEO: to create a website, you need to find keywords and select phrases that will easily get you to the top of the search results. After this, you need to create the site structure and distribute the keys among the pages. Page content needs to be optimized for key phrases.

Here is the logic of a businessman or marketer: you need to decide what information to broadcast to the audience using the website. To do this, you need to know your industry and business well. First you need to plan the approximate structure of the site and a preliminary list of pages. After this, when building a semantic core, you need to find out how the audience searches for information. With the help of content, you need to answer the questions that the audience asks.

What negative consequences does the "SEO" approach lead to in practice? When a site is built keywords-first, its informational value drops. Businesses must set trends and choose what to tell customers; they should not limit themselves to reacting to search-phrase statistics and creating pages just for the sake of optimizing for some key.

The planned result of constructing a semantic core is a list of key queries distributed across the pages of the site. It contains page URLs, search queries and an indication of their frequency.

How to build a website structure

The site structure is a hierarchical layout of pages. With its help, you solve several problems: plan information policy and logic for presenting information, ensure the usability of the resource, and ensure that the site meets the requirements of search engines.

To build a structure, use a tool convenient for you: table editors, Word or other software. You can also draw the structure on a piece of paper.

When planning your hierarchy, answer two questions:

  1. What information do you want to communicate to users?
  2. Where should this or that information block be published?

Imagine that you are planning the website structure of a small confectionery shop. The resource includes information pages, a publications section, and a showcase or product catalog. Visually the structure might look like this:

For further work with a semantic core, arrange the site structure in the form of a table. In it, indicate the names of the pages and indicate their subordination. Also include columns in the table for page URLs, keywords, and their frequency. The table might look like this:

You'll fill in the URL, Keys, and Frequency columns later. Now move on to searching for keywords.
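The structure table can also be kept as plain data while you work. A minimal Python sketch with invented page names; the URL, keys, and frequency slots start empty, exactly as in the table:

```python
# Each page records its parent plus empty slots for the URL,
# keys, and frequencies to be filled in at later steps.
pages = [
    {"name": "Home",    "parent": None,      "url": "", "keys": [], "freq": {}},
    {"name": "Catalog", "parent": "Home",    "url": "", "keys": [], "freq": {}},
    {"name": "Cakes",   "parent": "Catalog", "url": "", "keys": [], "freq": {}},
    {"name": "Blog",    "parent": "Home",    "url": "", "keys": [], "freq": {}},
]

def children(pages, parent):
    """Pages directly subordinate to the given parent."""
    return [p["name"] for p in pages if p["parent"] == parent]
```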

What you need to know about keywords

To build a semantic core, you must understand what keywords are and which keywords your audience uses. With this knowledge, you will be able to use keyword research tools correctly.

What keywords does the audience use?

Keys are words or phrases that potential clients use to find necessary information. For example, to make a cake, a user enters the query “Napoleon recipe with photo” into the search bar.

Keywords are classified according to several criteria. By popularity, there are high-, mid-, and low-frequency queries. According to various sources, search phrases are grouped as follows:

  • Low-frequency: queries with up to 100 impressions per month. Some specialists extend the group to queries with up to 1,000 impressions.
  • Mid-frequency: queries with up to 1,000 impressions. Sometimes experts raise the threshold to 5,000 impressions.
  • High-frequency: phrases with 1,000 impressions or more. Some authors count only keys with 5,000 or even 10,000 queries as high-frequency.
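The grouping above is easy to express as a helper with adjustable thresholds. A Python sketch using the lower bounds from the list; for popular niches you would raise the thresholds, as the next paragraph explains:

```python
def frequency_band(impressions, low=100, high=1000):
    """Classify a query by monthly impressions. The defaults are
    the lower bounds cited above; adjust them per niche."""
    if impressions < low:
        return "low-frequency"
    if impressions < high:
        return "mid-frequency"
    return "high-frequency"
```

For example, with `high=5000` (a laptop-store niche), a query with 6,000 impressions is still high-frequency, while one with 1,000 becomes mid-frequency.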

The difference in frequency estimates is due to the different popularity of topics. If you are creating a core for an online store that sells laptops, the phrase "buy samsung laptop" with about 6 thousand impressions per month will be mid-frequency. If you are creating the core for a sports club website, the query "aikido section" with about 1,000 searches will be high-frequency.

What do you need to know about frequency when compiling a semantic core? According to various sources, from two-thirds to four-fifths of all user requests are low-frequency. Therefore, you need to build the broadest possible semantic core. In practice, it should be constantly expanded to include low-frequency phrases.

Does this mean that high- and mid-frequency queries can be ignored? No, you can't do without them. But consider low-frequency keywords as the main resource for attracting target visitors.

According to user needs, keys are combined into the following groups:

  • Information. The audience uses them to find some information. Examples of information requests: “how to properly store baked goods”, “how to separate the yolk from the white”.
  • Transactional. Users enter them when they plan to take an action. This group includes the keys “buy a bread machine”, “download a recipe book”, “order pizza for delivery”.
  • Other queries. These are key phrases from which it is hard to determine the user's intent. For example, when a person uses the key "cake", he may be planning to buy a cake or to bake one himself. In addition, the user may be interested in information about cakes: definition, characteristics, classification, etc.

Some experts classify navigation queries as a separate group. With their help, the audience searches for information on specific sites. Here are some examples: "Svyaznoy laptops", "City Express track delivery", "register on LinkedIn". Navigation queries that are not specific to your business can be ignored when compiling the semantic core.

How do you use this classification when building a semantic core? First, consider the needs of your audience when distributing keywords across pages and creating a content plan. Everything is obvious here: publications in the information sections must answer informational queries. Most key phrases without a clearly expressed intent should also go there. Transactional queries should be answered by pages in the "Store" or "Showcase" sections.

Secondly, remember that many transactional queries are commercial. To attract organic traffic for the query "buy samsung smartphone", you will have to compete with Euroset, Eldorado and other business heavyweights. You can avoid this unequal competition using the recommendation given above: expand the core as much as possible and lean on lower-frequency queries. For example, the frequency of the query "buy a Samsung Galaxy s6 smartphone" is an order of magnitude lower than the frequency of the key "buy a Samsung Galaxy smartphone".
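A crude way to pre-sort keys into these intent groups is to scan for marker words. This Python sketch is only a rough heuristic; real classification also inspects the search results themselves, and the marker lists here are invented:

```python
# Illustrative marker words only; extend per niche and language.
TRANSACTIONAL = {"buy", "order", "download", "price"}
INFORMATIONAL = {"how", "what", "why", "recipe"}

def intent(query):
    """Guess the intent group from marker words in the query."""
    words = set(query.lower().split())
    if words & TRANSACTIONAL:
        return "transactional"
    if words & INFORMATIONAL:
        return "informational"
    return "other"
```

A bare key like "cake" falls into "other", matching the ambiguity discussed above.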

What you need to know about the anatomy of search queries

Search phrases consist of several parts: a body, a specifier, and a tail. This is easiest to see with an example.

What about the query “cake”? It cannot be used to determine the user's intent. It is high-frequency, which determines high competition in the search results. Using this request for promotion will bring a large share of untargeted traffic, which negatively affects behavioral metrics. The high-frequency and non-specific nature of the request “cake” is determined by its anatomy: it consists only of the body.

Pay attention to the request “buy a cake”. It consists of a body "cake" and a specifier "buy". The latter determines the user's intent. It is the specifiers that indicate whether the key is transactional or informational. Look at the examples:

  • Buy a cake.
  • Cake recipes.
  • How to serve the cake.

Specifiers can also express exactly opposite intents. A simple example: the specifiers "buy" and "sell" with the same body "car" describe users planning to buy or to sell a car.

Now look at the query "buy cake with delivery". It consists of a body, a specifier, and a tail. The tail does not change the user's intent or information need, but details it. Look at the examples:

  • Buy cake online.
  • Buy a cake in Tula with delivery.
  • Buy homemade cake in Orel.

In each case, the person’s intention to purchase the cake is clear. And the tail of the key phrase details this need.

Knowledge of the anatomy of search phrases allows you to derive a conditional formula for selecting keys for the semantic core. You must define core terms related to your business, product, and user needs. For example, customers of a confectionery company are interested in cakes, pastries, cookies, pastries, cupcakes and other confectionery products.

After that, you need to find the tails and specifiers that the project's audience uses with basic terms. With tail phrases, you simultaneously increase your reach and reduce core competition.
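The body/specifier/tail anatomy can be modeled as a tiny parser. This is a toy illustration of the idea only; the body term and the set of candidate specifiers have to be supplied by hand:

```python
def split_query(query, body, specifiers):
    """Split a phrase into (body, specifier, tail) given a known
    body term and a set of possible specifiers."""
    words = query.lower().split()
    # First word that is a known specifier, empty if none.
    spec = next((w for w in words if w in specifiers), "")
    body_words = set(body.split())
    # Everything that is neither body nor specifier is the tail.
    tail = [w for w in words if w != spec and w not in body_words]
    return body, spec, " ".join(tail)

parts = split_query("buy cake with delivery", "cake", {"buy", "recipe"})
# parts == ("cake", "buy", "with delivery")
```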

"Long tail" is the term for a strategy of promoting a resource with low-frequency key queries: use the maximum number of keys with a low level of competition. Promotion through low frequencies ensures high efficiency of marketing campaigns. This is due to the following factors:

  • Promotion using low-frequency keywords requires less effort compared to promotion using high-frequency competitive queries.
  • Working with long-tail queries is practically guaranteed to bring results, although marketers cannot always predict exactly which keywords will generate traffic. With high-frequency queries, honest marketers cannot guarantee results at all.
  • Low-frequency drivers provide higher specificity of results to user needs.

For large sites, the semantic core can contain tens of thousands of queries, and it is almost impossible to select and correctly group them manually.

Services for compiling a semantic core

There are quite a lot of tools for selecting keywords. You can build the core using paid or free services and programs. Choose a specific tool depending on the tasks you face.

Key Collector

You cannot do without this tool if you are engaged in Internet marketing professionally, develop several sites, or form the core of a large site. Here is a list of the main tasks that the program solves:

  • Selection of keywords. Key Collector collects requests through Yandex's Wordstat.
  • Parsing search suggestions.
  • Cutting off inappropriate search phrases using stop words.
  • Filtering requests by frequency.
  • Finding implicit duplicate queries.
  • Determination of seasonal requests.
  • Collection of statistics from third-party services and platforms: Liveinternet.ru, Metrica, Google Analytics, Google AdWords, Direct, Vkontakte and others.
  • Search for pages relevant to the query.
  • Clustering of search queries.

Key Collector is a multifunctional tool that automates the operations needed to build a semantic core. The program is paid. Everything Key Collector does can be done with free alternatives, but you will have to combine several services and programs.

SlovoEB

This is a free tool from the creators of Key Collector. The program collects keywords through Wordstat, determines the frequency of queries, and parses search suggestions.

To use the program, enter the login and password for your Direct account in the settings. Do not use your main account, as Yandex may block it for automatic requests.

Create a new project. On the Data tab, select the Add Phrases option. Indicate the search phrases that the project's audience is likely to use to find information about products.

In the “Collect keywords and statistics” menu section, select the desired option and run the program. For example, determine the frequency of key phrases.

The tool allows you to select keywords, as well as automatically perform some tasks related to analyzing and grouping queries.

Keyword selection service Yandex Wordstat

To see which phrases a page is shown for in Yandex results, open the “Search Queries” → “Latest queries” tab in the Yandex.Webmaster panel.

We see the phrases for which the site's snippet was shown or clicked in Yandex's TOP 50 over the last 7 days.

To view data only for the page that interests us, we need to use filters.

The possibilities for searching for additional phrases in Yandex.Webmaster are not limited to this.

Go to the “Search Queries” → “Recommended queries” tab.

There may not be many phrases here, but you can find additional phrases for which the promoted page does not fall into the TOP 50.

Query history

The big disadvantage of visibility analysis in Yandex.Webmaster is, of course, that data is available only for the last 7 days. To partially get around this limitation, you can supplement the list using the “Search Queries” → “Query history” tab.

Here you will need to select “Popular queries”.

You will receive a list of the most popular phrases for the last 3 months.

To get phrases from Google Search Console, go to the “Search Traffic” → “Search Analytics” tab. Next, select “Impressions”, “CTR”, and “Clicks”. This shows more information that is useful when analyzing phrases.

By default, the tab displays data for 28 days, but you can expand the range to 90 days. You can also select the desired country.

As a result, we get a list of requests similar to that shown in the screenshot.

New version of Search Console

Google has already made some tools available in the new version of the panel. To view queries for a page, go to the “Status” → “Performance” tab.

In the new version the filters are located differently, but the filtering logic is the same, so there is no point in dwelling on it. One significant difference is the ability to analyze data over a period longer than 90 days, a notable advantage over Yandex.Webmaster with its 7 days.

Competitive website analysis services

Competitors' websites are a great source of keyword ideas. If you are interested in a specific page, you can manually determine the search phrases for which it is optimized. To find the main keywords, it is usually enough to read the material or check the contents of the keywords meta tag in the page code. You can also use semantic text analysis services, for example, Istio or Advego.

If you need to analyze an entire site, use comprehensive competitive analysis services.

You can use other tools to collect keywords, for example Google Trends, WordTracker, WordStream, Ubersuggest, or Topvisor. But don't rush to master all the services and programs at once. If you are creating a semantic core for your own small website, use a free tool such as the Yandex keyword selection service or the Google Keyword Planner.

How to choose keywords for the semantic core

The process of selecting key phrases consists of several stages:

  1. First, you will identify the basic keywords with which the audience searches for your product or business.
  2. The second stage is devoted to expanding the semantic core.
  3. In the third step, you will remove inappropriate search phrases.

Defining base keys

List common search phrases related to your business and products in a spreadsheet or write them down on paper. Gather your colleagues and brainstorm; record all proposed ideas without discussion.

Your list will look something like this:

Most of the keys you've written down are high in frequency and low in specificity. To get mid- and low-frequency search phrases with high specificity, you need to expand the core as much as possible.

Expanding the semantic core

You will solve this problem using keyword research tools such as Wordstat. If your business is tied to a region, select the appropriate region in the settings.

Using the key phrase selection service, you need to analyze all the keys recorded at the previous stage.

Copy the phrases from the left column of Wordstat and paste them into the table. Pay attention to the right column of Wordstat. In it, Yandex suggests phrases that people used along with the main request. Depending on the content, you can immediately select the appropriate keywords from the right column or copy the entire list. In the second case, unsuitable requests will be eliminated at the next stage.

The result of this stage is a list of search phrases for each basic key obtained through brainstorming. The lists may contain hundreds or even thousands of queries.

Removing inappropriate search phrases

This is the most labor-intensive stage of working with the core: you need to manually remove inappropriate search phrases from it.

Do not use frequency, competition or other purely “SEO” metrics as a criterion for evaluating keys. Do you know why old-school SEOs consider certain search phrases to be trash? For example, take the key “diet cake”. The Wordstat service predicts 3 impressions per month for it in the Cherepovets region.

To promote pages for specific keywords, old-school SEOs bought or rented links. By the way, some experts still use this approach. It is clear that search phrases with low frequency in most cases do not recoup the funds spent on buying links.

Now look at the phrase “diet cakes” through the eyes of an ordinary marketer. Some representatives of the confectionery company's target audience are really interested in such products. Therefore, the key can and should be included in the semantic core. If the confectionery prepares the corresponding products, the phrase will be useful in the product descriptions section. If for some reason the company does not work with diet cakes, the key can be used as a content idea for the information section.

What phrases can be safely excluded from the list? Here are examples:

  • Keys mentioning competing brands.
  • Keys mentioning goods or services that you do not sell and do not plan to sell.
  • Keys that include the words “inexpensive”, “cheap”, “at a discount”. If you are not dumping, cut off cheap lovers so as not to spoil behavioral metrics.
  • Duplicate keys. Queries that differ only in word order or word form, such as “custom cakes for a birthday” and “cakes to order for a birthday”, count as one key; it is enough to keep a single variant.
  • Keys that mention inappropriate regions or addresses. For example, if you serve residents of the Northern district of Cherepovets, the key “custom cakes industrial district” does not suit you.
  • Phrases entered with errors or typos. Search engines understand that the user is looking for croissants even if he types a misspelled variant such as “croisants” into the search bar.

After removing inappropriate phrases, you received a list of queries for the base key “custom cakes”. The same lists need to be compiled for other basic keys obtained during the brainstorming stage. After that, move on to grouping key phrases.
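The screening step above can be sketched in code. This is a simplified illustration, not the article's method: it drops queries containing stop words (competitor brands, “cheap”, wrong districts) and collapses implicit duplicates that differ only in word order. The stop-word list is an assumption for the example.

```python
# Illustrative stop words -- in practice this list comes from your business rules.
STOP_WORDS = {"cheap", "inexpensive", "discount", "industrial"}

def screen(queries):
    """Drop stop-word queries and implicit word-order duplicates."""
    kept, seen = [], set()
    for q in queries:
        words = q.lower().split()
        if any(w in STOP_WORDS for w in words):
            continue                      # stop word -> drop the query
        fingerprint = frozenset(words)    # word order is ignored
        if fingerprint in seen:
            continue                      # implicit duplicate -> drop
        seen.add(fingerprint)
        kept.append(q)
    return kept

queries = [
    "custom cakes for a birthday",
    "for a birthday custom cakes",   # duplicate by word order
    "custom cakes cheap",            # contains a stop word
]
print(screen(queries))  # ['custom cakes for a birthday']
```

A real screening pass would also normalize word forms (stemming or lemmatization), which this sketch leaves out for brevity.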

How to group keywords and build a relevance map

Search phrases with which users find (or will find) your site are combined into semantic clusters; this process is called search query clustering. A cluster is a group of closely related queries. For example, the semantic cluster “Cake” includes all key phrases associated with this word: cake recipes, order a cake, photos of cakes, wedding cake, and so on.

A semantic cluster is a group of queries united by meaning. It is a multi-level structure. Within the first-order cluster “Cake” there are second-order clusters “Cake recipes”, “Ordering cakes”, and “Photos of cakes”. Within the second-order cluster “Cake recipes”, a third order can theoretically be distinguished: “Recipes for cakes with mastic”, “Recipes for sponge cakes”, “Recipes for shortbread cakes”. The number of levels depends on the breadth of the topic. In practice, in most topics it is enough to identify business-specific second-order clusters within first-order clusters.

Theoretically, a semantic cluster can have many levels.
In practice, you will have to work with clusters of the first and second levels

You identified most of the first level clusters during brainstorming when you wrote down basic key phrases. To do this, it is enough to understand your own business, as well as look at the site diagram that you drew up before starting work on the semantic core.

It is very important to correctly perform clustering at the second level. Here, search phrases are differentiated by specifiers that indicate user intent. A simple example is the “cake recipes” and “custom cakes” clusters. The first cluster's phrases are used by people who need information; the second cluster's keys are used by clients who want to buy a cake.

You identified the search phrases for the “custom cakes” cluster using Wordstat and manual screening. They must be distributed between the pages of the “Cakes” section.

For example, the cluster contains the search queries “custom football cake” and “football cake to order”, which differ only in wording.

If the company's assortment includes a corresponding product, create a page for it in the “Mastic Cakes” section and add it to the site structure: indicate the name, URL, and search phrases with their frequency.

Use Wordstat or a similar keyword research tool to see what other search terms potential customers use to find football-themed cakes. Add the suitable queries to the page's keyword list.

In the list of cluster search phrases, mark the distributed keys in a way convenient for you. Distribute the remaining search phrases.

If necessary, change the site structure: create new sections and categories. For example, the page “custom cakes for Paw Patrol” should be included in the “Children’s Cakes” section. At the same time, it can be included in the “Mastic Cakes” section.

Please note two things. First, the cluster may not have suitable phrases for the page you are planning to create. This can happen for various reasons. For example, these include imperfection of tools for collecting search phrases or their incorrect use, as well as low popularity of the product.

The absence of a suitable key in the cluster is not a reason to refuse to create a page and sell a product. For example, imagine that a confectionery company sells children's cakes featuring characters from the cartoon Peppa Pig. If the list does not include the relevant keywords, clarify the needs of your audience using Wordstat or another service. In most cases, suitable requests will be found.

Secondly, even after removing unnecessary keys, there may still be search phrases in the cluster that are not suitable for the created and planned pages. They can be ignored or used in another cluster. For example, if for some reason a confectionery shop fundamentally does not sell Napoleon cake, the corresponding key phrases can be used in the “Recipes” section.

Clustering search queries

Search queries can be grouped manually, with programs such as Excel or Google Sheets, or automatically with special applications and services.

Clustering allows you to understand how queries should be distributed across website pages for the fastest and most effective promotion.

Automatic clustering, or grouping, of the semantic core's search queries is based on analyzing the sites in the TOP 10 of Google and Yandex search results.

How automatic query grouping works: for each query, the TOP 10 results are examined. If at least 4-6 of the same sites appear in the results for two queries, those queries can be grouped and placed on one page.
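The grouping rule just described can be sketched in a few lines of Python. This is a simplified illustration, not a production clusterer: the SERP data is hard-coded, whereas in practice it would come from a SERP parser, and the greedy single-pass grouping is an assumption about the algorithm.

```python
def cluster(serps, threshold=4):
    """Greedily group queries whose TOP-10 URL sets share >= threshold entries."""
    groups = []
    for query, urls in serps.items():
        for group in groups:
            sample = next(iter(group))            # compare against one member
            if len(urls & serps[sample]) >= threshold:
                group.add(query)
                break
        else:
            groups.append({query})                # no match: start a new group
    return groups

# Hard-coded toy SERPs: letters stand in for result URLs.
serps = {
    "order a cake": {"a", "b", "c", "d", "e"},
    "custom cakes": {"a", "b", "c", "d", "f"},   # shares 4 URLs with the above
    "cake recipes": {"x", "y", "z", "w", "v"},   # a different intent entirely
}
print(cluster(serps))  # two groups: the transactional pair, and "cake recipes" alone
```

Real services refine this with pairwise comparison against every group member ("hard" clustering) rather than a single sample, but the overlap threshold idea is the same.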

Automatic grouping is the fastest and most effective way of combining keywords into a site structure that is almost ready for use.

If the site structure and the distribution of queries across its pages are formed incorrectly from the point of view of search engine statistics, it will, alas, be impossible to promote the pages to the TOP.

Applications and services for automatic grouping of search queries

Among the services that automate the grouping of keywords, it is worth highlighting:

  • Key Collector.
  • Rush Analytics.
  • TopVisor.

After all the keys have been distributed, you will receive a list of existing and planned site pages indicating the URL, search phrases and frequency. What to do with them next?

What to do with the semantic core

A table with the semantic core should become a road map and the main source of ideas when forming the site structure and the content plan.

Look: you have a list of pre-titled pages and search phrases that reflect the needs of the audience. When drawing up a content plan, you just need to refine the title of the page or publication and include the main search query in it. This is not always the most popular key: besides popularity, the query in the title should best reflect the need of the page's audience.

Use the remaining search phrases as an answer to the question “what to write about.” Remember, you don't have to fit every search phrase into your content or product description at all costs. Content should cover the topic and answer user questions. Please note again: you need to focus on information needs, and not on search phrases and how they fit into the text.

Semantic core for online stores

The specificity of preparing and clustering semantics for online stores lies in four groups of pages that are especially important for subsequent promotion:

  • Home page.
  • Pages of sections and subsections of the catalog.
  • Product card pages.
  • Blog article pages.

We have already talked above about the different types of search queries: informational, transactional, commercial, and navigational. For the section and product pages of an online store, transactional queries are of primary interest, i.e. queries by which search engine users want to find sites where they can make a purchase.

You need to start forming a core with a list of products that you already sell or plan to sell.

For online stores:

  • the “body” of the queries will be the product names;
  • the “specifiers” will be phrases such as “buy”, “price”, “sale”, “order”, “photo”, “description”, …