Launching a website is never the final step in moving your business online. In fact, it is only the first step into the brave new world of digital marketing. Today we will cover the basics of on-site SEO for your e-commerce website.
Indexation. A vast number of e-shops share the same issue: getting their content indexed. Of course, sooner or later Google will pick up pages from your site (you can always check this with the "site:[your site]" command in the search bar). But will those be useful pages leading to all of your products? For example, filters on your website can generate thousands of pages with almost identical content; because their URLs differ (with parameters like "?value=x"), search engines treat them as unique pages, which creates clear content duplication. Moreover, by receiving the same content again and again from different URLs, search crawlers may simply never reach other parts of your website.
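One common way to handle such parameter-generated duplicates is a rel="canonical" tag, which tells search engines which URL is the "real" one. A minimal sketch (the domain and category path are invented placeholders):

```html
<!-- Placed in the <head> of every filtered/sorted variant of a category page
     (e.g. /monitors?dir=asc&limit=24), pointing back to the clean URL. -->
<link rel="canonical" href="https://www.example.com/monitors" />
```

This way the filtered variants can still be crawled, but their ranking signals are consolidated on the canonical page.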
To avoid this, you should correctly set up both your robots.txt and sitemap.xml files. Sitemap.xml is always an individual case, so it is hard to give universal recommendations (except that you need to create several sitemaps if you have over 50,000 pages, since a single sitemap file is limited to 50,000 URLs). As for robots.txt, here is a standard example for a Magento website:
```
User-agent: *

## Exclude development files and folders
Disallow: /CVS
Disallow: /*.svn$
Disallow: /*.idea$
Disallow: /*.sql$
Disallow: /*.tgz$

## Exclude checkout and user account pages
Disallow: /checkout/
Disallow: /onestepcheckout/
Disallow: /customer/
Disallow: /customer/account/
Disallow: /customer/account/login/
Disallow: /trade-account/
Disallow: /wishlist/

## Exclude links with session IDs
Disallow: /*?SID=*

## Exclude search pages and some modules
Disallow: /catalogsearch/
Disallow: /catalog/product_compare/
Disallow: /feefo/
Disallow: /sendfriend/
Disallow: /sales/guest/form/

## Exclude sub-category pages that are sorted or filtered
Disallow: */*?dir*
Disallow: */*?mode*
Disallow: */*?cat*
Disallow: */*?min=*
Disallow: */*?max=*
Disallow: */*?limit=*

## Exclude technical folders
Disallow: /app/
Disallow: /downloader/
Disallow: /errors/
Disallow: /includes/
Disallow: /lib/
Disallow: /pkginfo/
Disallow: /shell/
Disallow: /var/

## Exclude common Magento files
Disallow: /api.php
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /get.php
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /README.txt
Disallow: /RELEASE_NOTES.txt

User-agent: Googlebot-Image
Allow: /
```
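For the sitemap.xml side, the skeleton below shows what a single entry looks like; the domain, URL, and date are invented placeholders, and a real Magento sitemap would be generated automatically rather than written by hand:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml: one <url> entry per indexable page, max 50,000 per file;
     larger stores split entries across several files referenced by a
     sitemap index. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/monitors/acme-m240</loc>
    <lastmod>2017-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```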
Uniqueness. We have already raised the problem of content uniqueness above. Indeed, it is hard to achieve better positions if every page carries exactly the same meta data, or the same content as all of your competitors. But how can you provide unique content for thousands of products in your store? Here are some tips:
- Provider’s content. In other words, don’t try to invent new content for a product that already has a perfect description and images. Search engines do not drop sites for sharing the same manufacturer source (e.g. it is hard to write a better description for a Gucci perfume than the one already made by Gucci’s copywriters, and Google knows it).
- Machine-generated content. It is common to use modules that generate a title and description for each page. At the same time, some types of products have nearly identical specifications except for a few parameters (e.g. monitors may differ only in screen diagonal or matrix type), so it makes sense to generate content for them with scripts that build text from those parameters. The same applies to filling in alt attributes for images.
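A minimal sketch of such a generation script is below; the product data, templates, and field names are invented for illustration, and a real catalogue would pull them from the store database:

```python
# Sketch: generating unique titles and descriptions from the few
# parameters that actually distinguish similar products.

def generate_meta(product):
    """Build a page title and meta description from distinguishing specs."""
    title = '{brand} {model} {diagonal}" {matrix} monitor'.format(**product)
    description = (
        "Buy the {brand} {model}: a {diagonal}-inch monitor with a "
        "{matrix} panel. Free delivery and a 2-year warranty."
    ).format(**product)
    return title, description

# Two products identical except for diagonal and matrix type.
products = [
    {"brand": "Acme", "model": "M240", "diagonal": 24, "matrix": "IPS"},
    {"brand": "Acme", "model": "M270", "diagonal": 27, "matrix": "TN"},
]

for p in products:
    title, description = generate_meta(p)
    print(title)
```

The same template approach works for image alt attributes.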
- Unique content for priority products. While provider’s and machine-generated content can cover part of the catalogue, for your key products you can contact copywriting agencies to create keyword-rich additional texts (e.g. useful tips, extra options, an author’s opinion on the product).
- Audience-generated content. Encourage your audience to leave comments and reviews, but don’t forget to moderate them, because dishonest users may paste spammy links for their own needs.
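Moderation can be partly automated with a simple pre-filter that flags reviews containing anything link-like for manual review. A deliberately naive sketch (the pattern and sample reviews are invented; real filters are far more thorough):

```python
import re

# Anything that looks like a URL sends the review to the moderation queue.
URL_PATTERN = re.compile(r"https?://|www\.", re.IGNORECASE)

def needs_moderation(comment):
    """Return True if a comment contains something that looks like a link."""
    return bool(URL_PATTERN.search(comment))

reviews = [
    "Great monitor, sharp picture!",
    "Cheap pills at www.spam-site.example, visit now!",
]
flagged = [r for r in reviews if needs_moderation(r)]
print(flagged)
```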
Semantic core. Both content creation and work on meta data should start from a semantic core. If you are not a monopolist in your niche, we can assume that competitors already rank high for the big head keywords. Before 2012 this simply meant you needed more links to reach their positions, but the situation has changed significantly: targeting an exact keyword may now take months (if not years!), with a high risk of a penalty for an aggressive link-building campaign. A much smarter solution is to build a wide semantic core (which for most e-commerce sites starts at 2,000+ keywords) and target only those with a low competition level.
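The selection step can be sketched as a simple filter over the core. The keywords, volumes, and competition scores below are invented; in practice they come from a keyword research tool:

```python
# Sketch: picking low-competition keywords from a semantic core.
semantic_core = [
    {"keyword": "buy monitor", "volume": 40000, "competition": 0.92},
    {"keyword": "24 inch ips monitor for photo editing",
     "volume": 320, "competition": 0.18},
    {"keyword": "monitor with usb-c and speakers",
     "volume": 210, "competition": 0.12},
]

def low_competition(core, threshold=0.3):
    """Keep keywords below the competition threshold, highest volume first."""
    picked = [kw for kw in core if kw["competition"] < threshold]
    return sorted(picked, key=lambda kw: kw["volume"], reverse=True)

for kw in low_competition(semantic_core):
    print(kw["keyword"])
```

The high-volume head keyword is dropped; the long-tail queries are the realistic targets.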
Sublinking. Another practice that has changed lately is sublinking. The basic recommendations here are to use a wide anchor list (don’t forget to gather semantics first) and to place such links organically (crawlers easily catch artificial keyword-oriented anchors, so write them exactly as you would write a link without thinking about rankings).
Technical recommendations. Finally, we’d like to remind you of a few things for your developers:
- Use 301 redirects between different versions of your site, otherwise you’ll lose link juice;
- Use structured data – it increases your CTR in search results;
- Avoid too many scripts and other elements that overload your pages;
- Do not run multiple e-commerce websites in the same niche on one hosting account – you risk an affiliate penalty.

If you have any questions, do not hesitate to leave a comment or contact us.
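To illustrate the structured-data recommendation above, here is a minimal schema.org Product snippet in JSON-LD; the product name, image URL, and price are invented placeholders:

```html
<!-- Minimal Product markup: gives search engines the name, image, and
     offer details needed for rich results in listings. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Acme M240 24-inch IPS monitor",
  "image": "https://www.example.com/media/acme-m240.jpg",
  "offers": {
    "@type": "Offer",
    "priceCurrency": "USD",
    "price": "199.00",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```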
You can take a look at the list of SEO recommendations we gathered for our client in our case study, where we also include comprehensive information on how we built an online cooking course marketplace for a chain of restaurants.
Material was prepared by Pabowl Digital Marketing Agency