- Changed URLs (products, categories, product images)
- Error responses or redirects (404, 301, 302, 500, or any status other than 200)
- Correct URLs that carry the wrong directives for bots (noindex in the meta robots tag, Disallow in robots.txt, or an X-Robots-Tag: noindex HTTP header)
Important (a progressive loss of ranking positions):
SEO Migration Control Algorithm
1. URLs and Redirects
1-2 days before the migration you need to:
- Save your sitemap
- Check redirects
- Check HTTP responses
URLs must remain available at the same addresses: an exact match to the M1 URLs.
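If you prefer to script these checks, here is a minimal sketch (Python with the `requests` library; the sitemap URL and the output file names are assumptions for illustration). It saves the URL list from the sitemap before the migration and records the HTTP status and redirect target for each URL:

```python
import csv
import xml.etree.ElementTree as ET

import requests

SITEMAP_URL = "https://www.yourwebsite.com/media/sitemaps/products/sitemap.xml"  # assumed path
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# 1. Save the list of URLs from the sitemap (do this 1-2 days before the migration).
sitemap_xml = requests.get(SITEMAP_URL, timeout=30).text
urls = [loc.text.strip() for loc in ET.fromstring(sitemap_xml).findall(".//sm:loc", NS)]
with open("saved_sitemap_urls.txt", "w") as f:
    f.write("\n".join(urls))

# 2. Check HTTP responses and redirects for every saved URL.
with open("url_responses.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["url", "status", "redirect_location"])
    for url in urls:
        # Do not follow redirects, so 301/302 responses show up as such.
        resp = requests.get(url, allow_redirects=False, timeout=30)
        writer.writerow([url, resp.status_code, resp.headers.get("Location", "")])
```

The saved `saved_sitemap_urls.txt` file is reused for the post-migration checks further below.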
2. Review Directives for Bots
- Save the robots.txt file from the live website 1-2 days before the migration. During the migration, the robots.txt file can be transferred from the pre-live version (which should be closed to indexing). Review robots.txt immediately after the migration and compare it to the saved file; it must be identical.
- Create a profile of the meta name="robots" tags and X-Robots-Tag headers
Pages for review:
- Filter pages
- “Excluded” pages in Google Search Console
- Previously modified pages (check the task records)
Profile Example for yourwebsite.com:
– Paginated pages: <meta name="robots" content="NOINDEX,FOLLOW" />
(example page: yourwebsite.com/page=2)
X-Robots-Tag: not in use
– Pages with parameters: ?size (color, base_color, brand, price, hand, league, ball_pack, lie_angle, gender, dir, etc.)
<meta name="robots" content="NOINDEX,NOFOLLOW" />
X-Robots-Tag: not in use
– Filter pages: /shopby/
<meta name="robots" content="NOINDEX,NOFOLLOW" />
X-Robots-Tag: not in use
– URLs like:
https://yourwebsite.com/mmajaxview/ajax/*
X-Robots-Tag: noindex (set in the HTTP header)
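A profile like this can also be collected automatically. The sketch below (Python with `requests`; the sample URLs are placeholders standing in for your own page types) fetches each page and records its meta robots value and X-Robots-Tag header, so the same script can be re-run on the live version after the migration and the two profiles compared:

```python
import csv
import re

import requests

# Placeholder sample: one URL per page type from the profile above.
SAMPLE_URLS = [
    "https://yourwebsite.com/page=2",                   # paginated page
    "https://yourwebsite.com/some-category?size=10",    # page with parameters (assumed)
    "https://yourwebsite.com/shopby/brand.html",        # filter page (assumed path)
]

# Assumes the name attribute appears before content, which is the common order.
META_ROBOTS_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']', re.I)

with open("robots_directives_profile.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["url", "meta_robots", "x_robots_tag"])
    for url in SAMPLE_URLS:
        resp = requests.get(url, timeout=30)
        match = META_ROBOTS_RE.search(resp.text)
        writer.writerow([
            url,
            match.group(1) if match else "not set",
            resp.headers.get("X-Robots-Tag", "not in use"),
        ])
```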
3. Sitemap
Review which sitemaps are in use and generate the same ones on the new version.
Example:
- Sitemap: https://www.yourwebsite.com/media/sitemaps/cms/sitemap.xml
- Sitemap: https://www.yourwebsite.com/media/sitemaps/products/sitemap.xml
- Sitemap: https://www.yourwebsite.com/media/sitemaps/category/sitemap.xml
- Sitemap: https://www.yourwebsite.com/blog/sitemap.xml
- Sitemap: https://www.yourwebsite.com/media/Google_Image_Sitemap.xml
Set up the sitemaps in the admin area of the pre-live version.
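As a quick sanity check, a short sketch like the one below (Python with `requests`; the sitemap list mirrors the example above) can confirm that each sitemap responds with 200 and report roughly how many URLs it contains, so the counts can be compared between the old and the new version:

```python
import requests

SITEMAPS = [
    "https://www.yourwebsite.com/media/sitemaps/cms/sitemap.xml",
    "https://www.yourwebsite.com/media/sitemaps/products/sitemap.xml",
    "https://www.yourwebsite.com/media/sitemaps/category/sitemap.xml",
    "https://www.yourwebsite.com/blog/sitemap.xml",
    "https://www.yourwebsite.com/media/Google_Image_Sitemap.xml",
]

for sitemap in SITEMAPS:
    resp = requests.get(sitemap, timeout=30)
    # Rough URL count: every sitemap entry contains exactly one <loc> element.
    url_count = resp.text.count("<loc>") if resp.ok else 0
    print(f"{sitemap} -> HTTP {resp.status_code}, {url_count} URLs")
```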
4. Create a Microdata Profile
Review the website's page types and their existing microdata
(home page, category, product)
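One way to build this profile is to dump the structured data from one page of each type. The sketch below (Python with `requests`) assumes the structured data is emitted as JSON-LD, which is common for Magento themes, and the sample URLs are placeholders; it extracts every <script type="application/ld+json"> block so the @type values can be compared before and after the migration:

```python
import json
import re

import requests

# Placeholder URLs: one page per type (home, category, product).
PAGES = {
    "home": "https://www.yourwebsite.com/",
    "category": "https://www.yourwebsite.com/some-category.html",
    "product": "https://www.yourwebsite.com/some-product.html",
}

JSON_LD_RE = re.compile(
    r'<script[^>]+type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    re.S | re.I)

for page_type, url in PAGES.items():
    html = requests.get(url, timeout=30).text
    for block in JSON_LD_RE.findall(html):
        try:
            data = json.loads(block)
        except ValueError:
            continue  # skip malformed blocks
        # A block can hold a single object or a list of objects.
        items = data if isinstance(data, list) else [data]
        types = [item.get("@type", "unknown") for item in items if isinstance(item, dict)]
        print(f"{page_type}: {url} -> {types}")
```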
5. Review Key HTML Tags
Key tags: title, meta description, H1 (plus H2-H3), image alt and title attributes
Text content:
Review random pages of different types: 5-10 product pages, 5-10 category pages, plus the home page.
Transfer the Title, H1, and Description templates to the new version before the migration.
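These tags can be captured with a short script as well. The sketch below (Python with `requests` and `BeautifulSoup` from the `bs4` package; the URL list is a placeholder) records the title, meta description, and H1 of each sampled page, so the same snapshot can be taken again on the live version and the two compared:

```python
import csv

import requests
from bs4 import BeautifulSoup

# Placeholder sample: home page plus a few product and category pages.
SAMPLE_URLS = [
    "https://www.yourwebsite.com/",
    "https://www.yourwebsite.com/some-category.html",
    "https://www.yourwebsite.com/some-product.html",
]

with open("key_tags_snapshot.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["url", "title", "meta_description", "h1"])
    for url in SAMPLE_URLS:
        soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
        title = soup.title.get_text(strip=True) if soup.title else ""
        desc_tag = soup.find("meta", attrs={"name": "description"})
        description = desc_tag.get("content", "") if desc_tag else ""
        h1_tag = soup.find("h1")
        h1 = h1_tag.get_text(strip=True) if h1_tag else ""
        writer.writerow([url, title, description, h1])
```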
6. Page Speed
Many different parameters influence page load speed; however, several of them can be transferred directly:
HTTP headers and cache information (expiry dates, cache directives)
Save the HTTP headers for different page types:
- Product Page
- Category Page
- Home Page
- CMS page (Shipping)
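A minimal sketch for saving these headers (Python with `requests`; the URLs are placeholders for the four page types above) might look like this. Run it before the migration and again afterwards to compare the cache-related values:

```python
import json

import requests

# Placeholder URLs: one per page type listed above.
PAGE_TYPES = {
    "product": "https://www.yourwebsite.com/some-product.html",
    "category": "https://www.yourwebsite.com/some-category.html",
    "home": "https://www.yourwebsite.com/",
    "cms_shipping": "https://www.yourwebsite.com/shipping",
}

# Headers that matter most for caching behaviour.
CACHE_HEADERS = ["Cache-Control", "Expires", "Pragma", "Vary", "ETag", "Last-Modified"]

snapshot = {}
for page_type, url in PAGE_TYPES.items():
    headers = requests.get(url, timeout=30).headers
    snapshot[page_type] = {name: headers.get(name, "") for name in CACHE_HEADERS}

with open("http_headers_snapshot.json", "w") as f:
    json.dump(snapshot, f, indent=2)
```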
Post-Migration
1. Review URL responses from the saved sitemap file to prevent critical issues 1 and 2 (see the sketch after this list).
2. Check that the URLs keep the same paths:
- Home page + CMS Pages (all or 10)
- Product Pages (randomly select 30 from different categories and check their responses manually)
- Category Pages (randomly select 20)
3. Compare robots.txt on the live version with the saved copy
4. Compare the meta name="robots" tags and X-Robots-Tag headers on the live version with the saved profile
5. Generate the sitemap files on the first day after the migration. If the sitemap addresses are not the same, add the new ones to robots.txt and Search Console. Check Search Console for sitemap errors every day for the first 2 weeks.
6. Compare the key HTML tag templates with the live version
7. Monitor Search Console for new junk pages or errors every day for the first 2 weeks
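A combined check for points 1 and 3 could look like the sketch below (Python with `requests`). It assumes `saved_sitemap_urls.txt` was produced by the earlier sketch and that `saved_robots.txt` is the copy of robots.txt saved manually before the migration; both file names are illustrative:

```python
import requests

# Point 1: re-check every URL saved from the old sitemap and report anything that is not a 200.
with open("saved_sitemap_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    status = requests.get(url, allow_redirects=False, timeout=30).status_code
    if status != 200:
        print(f"PROBLEM: {url} -> HTTP {status}")

# Point 3: compare the live robots.txt with the copy saved before the migration.
live_robots = requests.get("https://www.yourwebsite.com/robots.txt", timeout=30).text
with open("saved_robots.txt") as f:
    saved_robots = f.read()

print("robots.txt identical" if live_robots.strip() == saved_robots.strip()
      else "robots.txt DIFFERS from the saved copy")
```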
Practice has shown that this algorithm reduces the risk of critical SEO errors during a Magento | Adobe Commerce migration. Feel free to adopt it, and good luck!