SEO promotion on MODX. MODX SEO optimization: all the technical points

Search engine optimization in MODX has its own peculiarities and nuances, since it is not just an ordinary CMS but a CMF - a framework for website development. In short, you get the foundation of the house, and you build the rest yourself. Most importantly, there is nothing superfluous: only exactly what a given project needs.

The peculiarity and advantage of MODX over other systems is that you control 100% of the output of any information, and what you get at the output is pure HTML without anything extra. Below we will look only at search engine optimization in MODX and how to implement the points already familiar from the earlier article.

1. Formation of a unique title for each page of the site (title)
MODX allows you to create unique titles for each page of the site; to do so, insert the construction [[*longtitle:default=`[[*pagetitle]]`]] into the site template. If the extended title (longtitle) is empty, the page title itself (pagetitle) will be displayed by default.
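A minimal sketch of how this construction might sit in the head of a Revolution template (the surrounding markup is just an illustration):

<!-- the title falls back to pagetitle when the extended title is empty -->
<head>
    <title>[[*longtitle:default=`[[*pagetitle]]`]]</title>
</head>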

Fig. 1. The extended title field in MODX Revo

2. Forming the meta description
To output a meta description, you need to add the corresponding construction to the site template.
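The original construction is not reproduced in the text; assuming the standard Revolution description field is used, it would typically look like this:

<!-- outputs the resource's Description field as the meta description -->
<meta name="description" content="[[*description]]" />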

Fig. 2. The description field of a site page

3. Human-readable URLs (friendly URLs) in MODX
MODX Revo supports friendly URLs and has many URL settings - about 20 of them. After enabling friendly URLs, rename the ht.access file to .htaccess. The necessary settings are in the "Friendly URLs" section of the system settings:

- container suffix;
- maximum alias length;
- transliteration of aliases;
- characters stripped from aliases;
- word separator in aliases;
- use friendly URLs;
- check for duplicate URLs;
- use nested URLs.
4. Duplicate pages and how to eliminate them
If you have configured the URLs correctly and edited the .htaccess file as required, then congratulations - you will not have duplicates. But to be on the safe side, use the Canonical add-on, which tells search engines the canonical version of each page. Using it is simple: download and install the Canonical add-on, then put its construction into the head section of the site template, and the canonical line of code will appear in the page output.
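The exact call depends on the add-on version, so here is also a minimal manual alternative that uses only core MODX Revolution syntax; the link tag below simply prints the full URL of the current resource:

<!-- canonical URL of the current resource, built by the core link tag -->
<link rel="canonical" href="[[~[[*id]]? &scheme=`full`]]" />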

By the way, the .htaccess file already contains all the settings needed to set the main domain (mirror) of your site; uncomment them and you can use them.
Important: a standard 301 redirect setting does not work in MODX, so if you need a redirect, use a construction of the following kind:

RewriteRule ^o-studii.html
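The rule above is cut off; a complete version redirecting an old .html address to its friendly URL might look like the sketch below (the /o-studii/ target is only an assumed example):

# 301 redirect from the old address to the new friendly URL (example paths)
RewriteRule ^o-studii\.html$ /o-studii/ [R=301,L]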

5. Nofollow and noindex support
By default, MODX has no built-in way to use these tags. But if you need to close a page from indexing, you can go one of two ways.

1) Install the SEO Tab add-on.
2) Use additional fields (TVs). Create an SEO category, then an additional field - let's call it noindex (when creating the TV, set the input type to text and specify which templates it applies to) - and then simply output this parameter in the site template with the appropriate construction (see the sketch below).
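A minimal sketch of such an output, assuming a text TV named noindex that you fill in only on pages that must be hidden:

<!-- prints a robots meta tag only when the noindex TV is not empty -->
[[*noindex:notempty=`<meta name="robots" content="noindex,nofollow" />`]]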

6. Breadcrumbs
To create breadcrumbs we use the Breadcrumbs add-on; its advantage is that it implements microformats, so the site navigation will also be visible in search results.
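A typical template call might look like the line below; the snippet name is an assumption - check which snippet the add-on you installed actually registers (BreadCrumb and pdoCrumbs are common alternatives):

<!-- breadcrumb trail for the current resource; snippet name depends on the installed add-on -->
[[!Breadcrumbs]]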

Fig. 3. Breadcrumbs in MODX Revo

7. Sitemap in MODX Revo
A sitemap in MODX is done in two steps.
1) Download the GoogleSiteMap add-on.

2) Create a page in the site root, set its content type to XML and turn off the rich text editor for this page. Then place the following call in the page content:

[[!GoogleSiteMap?]]

Check: type yoursite.com/sitemap.xml in the address bar.

8. Page 404
MODX Revo lets you create your own unique 404 page. It is done very simply: create any page named 404, then go to System Settings, the "Site" section, find the "404 Error Page" item and set it to the id of the created page. That's all - the 404 page is ready, and now you can create a unique template for it.

Fig. 4. Setting up the 404 page in MODX Revo

9. robots.txt for MODX
Creating a robots.txt file in MODX is similar to creating a sitemap (see point 7): create a page, set the content type to text, and turn off the rich text editor.

Check: site/robots.txt
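As for its contents, a minimal sketch for a MODX Revolution site might look like this (the disallowed paths are the standard Revolution directories; adjust the list and the sitemap URL to your project):

# keep the manager and service directories out of the index
User-agent: *
Disallow: /manager/
Disallow: /core/
Disallow: /connectors/
Disallow: /assets/components/

Sitemap: https://yoursite.com/sitemap.xml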

10. Speeding up a site on MODX Revo
Site speed significantly affects visitor loyalty and, consequently, conversion. The MODX developers have provided ways to speed up the site through the system itself, without additional extensions.

The "Caching" section allows you to configure the site very flexibly and reduce the load on the server.

What you should pay attention to:
- enable database caching;
- cache javascript files;
- enable script caching.

The "Control Panel" section, using the settings, allows you to increase the bandwidth of the server through compression.

For example, these functions are:
- use compressed CSS;
- enable zlib compression for JS/CSS files;
- use compressed javascript libraries.

In terms of search engine optimization possibilities, MODX deserves a solid five. Its capabilities and flexibility are impressive, although learning to work with it takes a little effort and patience. Believe me, it is worth it.


Optimizing ModX is something SEO specialists often face. This engine powers an impressive number of commercial platforms, such as online stores. What is the reason for its popularity in this area? How does optimization of a ModX site work in general? What are its features and pitfalls?

Features of website promotion with ModX

The first thing to remember when doing ModX SEO optimization is that for a site to rank successfully you must follow the search engines' recommendations. For example, you can register the site in Yandex.Webmaster: it tells you directly what steps to take and what errors were found during the last crawl. Working through these points helps push the site toward the top of the search results.

To see your resource at the top of the search results, first make sure analytics systems are integrated into the "body" of the site, so to speak. This will help optimize ModX relatively quickly. You also need to work with the engine itself.

  • To help the search robot scan the site quickly and get all the necessary information, create a sitemap file. It contains data on how the materials on the site are organized, how long the videos are (if any), and how often new content appears. The faster the search robot crawls the resource, the better the site's positions will be. A detailed sitemap can be created with a plugin such as GoogleSiteMap;
  • Install a ModX SEO plugin that collects statistics. It will help you manage content more effectively and work with the site structure. Any change to the design or to an advertising campaign will be immediately noticeable: the number of visitors will change, and the average viewing depth will rise or fall;
  • Every page of the site should use a friendly, human-readable URL. The shorter and clearer the page URL, the higher its traffic will be: search robots pay more attention to sites where the URL reflects what is written on the page.

You should also pay attention to which plugins the site and its admin panel will use. It is best to install software only from trusted developers, especially when it comes to paid software.

Details of site promotion on ModX

When taking on the promotion of a ModX site, an SEO specialist should make the resource as easy to promote as possible. First comes optimization of the URLs used: if an online store uses categories, then for easier product navigation you should enable nested URLs.

The next mandatory item when optimizing a ModX site is the snippet formed by the search robot. To make its task easier and improve page indexing, you need to create unique titles, descriptions and keywords for products and categories. Getting all three right has a positive impact on the site's position in the search engine and on its CTR.

How to work with the robots file

A resource created on the ModX engine has a file called robots.txt. Its contents can bewilder users who do not know what it is for. The purpose of robots.txt is to exclude certain pages from indexing: it lets the site owner tell the search robot that the materials of one or another part of the site should not be crawled or taken into account when evaluating the site as a whole.

Content for the site on ModX

Optimizing a ModX website is impossible without the thing countless pages, sites and resources are created for - content. It is by content that search robots evaluate a page, and without unique, live content even an online store will not attract interest.

First, the textual material: news, interesting facts, tips - all of this should be as unique as possible. This is the only way to earn, so to speak, the trust of the search robots.

Second, images and video. Their uniqueness also affects the site's positions. Of course, this factor does not carry as much weight with the search robots as unique text, but the effect is tangible.

The main thing to be guided by when promoting a site on ModX - as, in principle, on any other engine - is no stolen materials. Any text or photo with critically low uniqueness will seriously damage the resource's reputation in the "eyes" of the search robots. If such materials appear frequently, the site will never get out of the search engines' "sandbox" and will be indexed extremely poorly.

Also remember that every piece of material posted on the site must contain keywords and key phrases. Make sure there is no over-spamming and that all the "keys" look natural. And, of course, before placing them in the text you need to build the site's semantic core, listing all the relevant keywords. This is usually done by a semantics specialist: for a fee, he will do everything himself - see which queries suit the site best, select the ones you need and discard the extra ones. If hiring an SEO specialist is not an option, you can make this selection yourself, or use tools that simplify keyword selection; the final choice, though, still rests with the site owner.

Today I would like to give full information on internal optimization of a site running on the MODx engine. On many engines the work of internal optimization is so complicated that you really have to spend a lot of time and money to achieve the desired result, and this is true even of commercial CMSs. With MODx everything is much simpler, and even a programmer with little experience can figure it out.

Stages of internal website optimization

Code Validity

Before starting internal optimization of a site on MODx (or any other engine), register it in the webmaster panels of Yandex and Google, and install the Yandex.Metrica and Google Analytics counters. This will help you check the site structure, make sure the site is being indexed correctly, and verify the robots.txt, .htaccess and sitemap.xml files. In addition, you will be able to track user behavior on the site, which is very important.

The first thing to pay attention to is the validity of the code. While Yandex pays little attention to errors in the code, with Google such errors can significantly affect how the site appears in the results.

You can check the site for code errors for free at http://validator.w3.org. If errors are found, open the chunk that contains the offending piece of code and fix it. There is nothing complicated about it.

Of course, there will be errors you cannot fix - for example, the Yandex.Metrica code on my site is completely invalid - but most of them are still worth correcting.

Setting up indexing

For better site indexing, you need to create an xml sitemap. I have already described how this is done.

You also need to check robots.txt to make sure the necessary sections are not closed from indexing. I have already posted a correct robots.txt earlier, so we will not go deep here; just note that if you have other sections that should be hidden from indexing, do it in robots.txt.

Getting rid of duplicates

First you need to decide on the main mirror of the site (with or without www). Once you have decided, specify it in the .htaccess file.

Above all, check that non-existent pages return a 404 error; there is no need to throw the user onto a stub page that returns a 200 response. THIS IS WRONG! You can design a beautiful 404 page and use its navigation to direct the user where they need to go.

To get rid of the duplicates that arise from pagination, you can use the ready-made solution I suggested earlier - rel=canonical - or any other method. Some people write the rules in robots.txt; I think my method is more optimal in terms of effort, but it is up to you.
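For reference, a canonical tag in MODX Evolution syntax might be sketched like this (assuming friendly URLs and that the site_url setting ends with a slash; this is an illustration, not necessarily the exact solution from the earlier article):

<!-- canonical link built from the site URL setting and the current document's friendly URL -->
<link rel="canonical" href="[(site_url)][~[*id*]~]" />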

Additionally, you need to work through the duplicates that appear when using extra snippets. For example, the Jot snippet used for commenting on articles also duplicates pages, so if you use it on your site, add the following rule to robots.txt:

Disallow: /*/*/*?*=*

You will need to adapt this rule to the structure of your own site.

We optimize content

Content Structure

The page should have a clear structure, and it must be followed. To do this, check the order of the headings: they should go h1-h2-h3-h4-h5-h6. Stick to this rule when adding new articles. I will talk more about content optimization a little later, because in this article I only want to draw your attention to the technical side of internal MODx site optimization.

Meta data

To understand why they are needed, consider the search results snippet:

The clickability of a snippet depends on its appearance.

Be sure to fill in the page metadata if you want a snippet with the information you need to appear in the search results. In MODx, the special resource tags and TV parameters help with this: their data is substituted into the chunk with the metadata.

If your metadata chunk does not have a structure like this, I recommend adding it right away:

<title>[*longtitle*]</title>
<meta name="description" content="[*description*]" />
<meta name="keywords" content="[*keywords*]" />

[*pagetitle*] - will be used as the h1 title;
[*longtitle*] - substituted into the page title;
[*description*] - substituted into the page description;
[*keywords*] - a TV parameter; you can leave it out, but many people use it to display related articles or news.

Human-readable URLs are very easy to set up in MODx. Go to the "Configuration" section, the "Friendly URLs" tab, and set everything as in the image:

First rename the ht.access file in the site's root folder to .htaccess and add the necessary rules, which I wrote about earlier.

Next, go to the plugins section, find the TransAlias plugin, open it for editing, go to its "Configuration" tab and specify the following:

This is needed so that the engine generates the friendly URLs on its own. Save the settings and everything is ready.

These are obvious things, yet for some reason in most cases they are not done, even though they matter for promotion. Now let's continue with the obvious.

Tightening the screws
As a rule, every site's footer carries inscriptions like "all rights reserved", "copyright blah blah blah", "copying is prohibited" and other boilerplate that gives us no benefit at all. What do we do? We write:
Copying information from the page "[*pagetitle*]" is prohibited 2012

pagetitle vs menutitle
But our pagetitle may simply be called "news", which will not help promotion in any way, while renaming it to "tourism news" would break the menu built with Wayfinder. What to do?
For these purposes there is the "menu title" field. We write "news" in it, and in the page title we write "tourism news". As they say, the wolves are fed and the sheep are safe.

Website with WWW or without?
It is necessary to glue together the addresses with www and without www:
a) to avoid duplicates;
b) to consolidate page weight.
Which direction to redirect is up to you - I prefer sites without www. All of this is glued together with a 301 redirect in .htaccess; we will not dwell on it for long, you can read up on it separately.
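A minimal .htaccess sketch for redirecting the www version to the bare domain (example.com is a placeholder for your own domain):

# 301-redirect www.example.com to example.com to glue the mirrors together
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]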

XML sitemap
For the XML sitemap we output all the pages of the site. You can do this with a dedicated sitemap snippet or with Ditto; I use a ready-made snippet.
For any of these solutions you need to set a blank template, the content type text/xml, and uncheck "Use HTML editor". I am in the habit of giving this page the alias sitemap.xml.

HTML sitemap
In an HTML sitemap you should output all pages only if you want to add the site to Sape (to bring every page within 3 levels of nesting). But we are making a site for people, so you should not output every page in the map - I came to this conclusion after long experimenting. For a sitemap of this kind Wayfinder is usually used; see the call sketched below. As a rule this is enough, but sites and document trees differ, so sometimes you have to write your own snippet for the HTML sitemap of a specific project.
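A minimal Wayfinder call for such a map might look like this (the startId and level values are assumptions - adjust them to your document tree):

<!-- nested list of documents, starting from the site root, 3 levels deep -->
[!Wayfinder? &startId=`0` &level=`3`!]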

RSS feed
This important element is often overlooked. It helps to slightly speed up the indexing of new pages. We use Ditto to create the RSS feed. Be sure to set the content type application/rss+xml, a blank template, and uncheck "Use HTML editor".
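A sketch of such a Ditto call; the container id and the rssItemTpl chunk are hypothetical - the chunk would be one you create yourself to wrap each document in an <item> element:

<!-- the 10 newest documents from an example container (id 2), rendered with the rssItemTpl chunk -->
[!Ditto? &parents=`2` &display=`10` &tpl=`rssItemTpl` &dateFormat=`%a, %d %b %Y %H:%M:%S +0000`!]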

404 error (page not found)
You can create your own original page, or you can point it at the HTML sitemap. The main thing is not to forget to go to Tools -> Configuration and, on the "Site" tab, specify the ID of your page in the "404 error page" field. In addition, I recommend installing the Error 404 Logger module so you can promptly fix paths to non-existent pages.

Turning on friendly URLs
To easily track in Metrica which section is the most popular, I advise using nested friendly URLs. To do this, go to Tools -> Configuration and set:
use friendly urls: Yes
Use nested URLs: Yes
The rest of the options are up to you. The only thing I want to draw your attention to is the prefix and suffix: be sure to leave these fields empty to avoid duplicates. Alternatively, you can install SEO Strict URLs.

Polishing robots.txt
Do not be lazy about writing detailed rules in robots.txt, including a separate block for Yandex (it likes that); the Yandex analysis service is useful when compiling them. Some general recommendations:
1) Remove the first pages of Ditto pagination, namely those with the start=0 parameter. Since Ditto can be called with an id parameter, start may also carry a prefix, so for this case we write Disallow: /*?*start=0$
2) Disallow everything related to service pages that should not appear in the search results: the feedback page, the partners page (it usually contains a pile of links), and so on.
3) If phpthumb or something similar is installed and puts images into the /assets/cache/ folder, then we write Allow: /assets/cache/phpthumbof/
4) Write the path to our XML sitemap: Sitemap: http://example.com/sitemap.xml
5) Specify the main host of the site (the mirror you redirect to, with or without www): Host: example.com
6) Having written all these rules in the User-agent: * block, copy them and paste them below into the User-agent: Yandex block.
7) Close access for unnecessary robots.
8) Check through the Yandex service whether the rules in the file are written correctly.
A working example is sketched below.
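Assembled from the points above, such a file might look like this (example.com and the feedback/partners paths are placeholders for your own addresses):

# rules for all robots
User-agent: *
Disallow: /*?*start=0$
Disallow: /feedback.html
Disallow: /partners.html
Allow: /assets/cache/phpthumbof/

# the same rules repeated for Yandex, plus the main mirror
User-agent: Yandex
Disallow: /*?*start=0$
Disallow: /feedback.html
Disallow: /partners.html
Allow: /assets/cache/phpthumbof/
Host: example.com

Sitemap: http://example.com/sitemap.xml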

P.S. There is also a module for automatic interlinking of pages by keywords, but I do not like it and prefer to link pages to each other manually.

We offer professional support for sites on the MODX CMS and search engine promotion of sites and online stores on MODX. If your online store or website was developed on MODX Evolution or MODX Revolution and you need search engine promotion, you have come to the right place. We have been developing, refining and promoting websites and online stores on MODX for many years and have accumulated extensive experience in the optimization and SEO promotion of MODX sites and online stores.

MODX is a very flexible system, convenient for SEO promotion

We chose MODX as a reliable, flexible and convenient CMS that gives ample opportunities for optimization and promotion. We know all the capabilities and advantages of this system, and we have experience and a large number of our own developments.

A single contractor - SEO optimizer and web programmer

We carry out all work on optimizing the code and content ourselves. You do not have to look for a separate contractor to optimize or SEO-refine the site code. We take on full support of the site, helping to improve and develop your site or online store.

Prices at the level of a freelancer

The cost of a month of work on the project will be about the same as a freelancer's. Unlike a freelancer, though, you get a reliable partner, work under a contract, accounting documents and a wide choice of payment options, and a whole team of specialists will work on your project.
