SEO Best Practices for Contentstack-powered Websites

Search Engine Optimization (SEO) improves your website’s ranking on the Search Engine Results Page (SERP). An effective SEO strategy can attract customer traffic and drive growth in revenue and sales. Let us look at some SEO best practices that you can implement with Contentstack to improve your site’s visibility on search engines.

Add meta descriptions and title tags

Meta description and title tags provide a brief introduction to your website. This introductory content appears on a SERP when a customer’s search terms match keywords in your website’s metadata.

Title tags and meta descriptions are pieces of HTML code placed within the <head> tag of a web page.

  • Title tag: The title tag assigns an appropriate, keyword-rich name to your web page. The <title> element of a web page appears on search engine results pages, in browser tabs, and on external webpages.
  • Meta description: The meta description tag contains a summary of your website’s content. An ideal meta description should have no more than 155 characters.

Let us look at how you can assign appropriate meta descriptions and title tags to your web pages with Contentstack.

Defining SEO fields in the content type

While defining the structure of your Contentstack website, perform the following steps:

  1. Add a Group field within the content type of your Contentstack website and name it SEO.
  2. Next, add a Title (Single Line Textbox) field within the “SEO” field. The “Title” field holds an appropriate title for your metadata, built around relevant phrases or keywords.
  3. Finally, add a Description (Multi Line Textbox) field within the “SEO” field. The “Description” field contains a summary of the content placed on your website. A sketch of the resulting schema appears below.
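
For reference, here is a minimal sketch of what the resulting Group field could look like in the content type’s schema JSON; the field UIDs shown are illustrative, not required values:

{
  "display_name": "SEO",
  "uid": "seo",
  "data_type": "group",
  "schema": [
    { "display_name": "Title", "uid": "title", "data_type": "text" },
    {
      "display_name": "Description",
      "uid": "description",
      "data_type": "text",
      "field_metadata": { "multiline": true }
    }
  ]
}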

Additional Resource: Want to know more about content types? Check out the Create Content Types section that has a set of guides with all the details that you need to know.

This will make sure you have relevant metadata in place when you decide to publish content to your website.

Making SEO-related changes to the website code

Perform the following steps to add SEO-related tags to your website code:

  1. Within the HTML code for your webpage, add the following SEO code inside the <head> tag:

    <title>an appropriate title for your website content</title>
    <meta name="title" content="an appropriate title for your website content">

    This code assigns a title to your web page. The <title> element is what search engines display on the results page; the meta title mirrors it in your page’s metadata.

  2. Next, add the following SEO code in the <head> tag of your HTML code:

    <meta name="description" content="an appropriate description for your website content.">

    This code assigns description details to your <meta> tag.

This markup makes sure that search engines fetch and display appropriate metadata for your website.
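
To tie this markup to the SEO fields defined earlier, here is a minimal sketch, using the Contentstack JavaScript delivery SDK, of how the <head> tags could be built from an entry. The content type UID page and its url field are assumptions for illustration:

import * as contentstack from "contentstack";

// Initialize the delivery SDK (the credentials are placeholders).
const Stack = contentstack.Stack({
  api_key: "your_api_key",
  delivery_token: "your_delivery_token",
  environment: "production",
});

// Build the <head> markup for a page from its entry's SEO group field.
// The "page" content type and "url" field are assumptions.
async function buildSeoTags(pageUrl: string): Promise<string> {
  const [entries] = await Stack.ContentType("page")
    .Query()
    .where("url", pageUrl)
    .toJSON()
    .find();
  const seo = entries[0]?.seo ?? {};
  return [
    `<title>${seo.title ?? ""}</title>`,
    `<meta name="title" content="${seo.title ?? ""}">`,
    `<meta name="description" content="${seo.description ?? ""}">`,
  ].join("\n");
}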

Use images with alt text

Displaying images is an effective way to explain a specific feature of your company’s software. Images give users a direct view of the steps involved in using that particular feature.

The “alt” (short for “alternate”) text appears in place of an image on a website if the image fails to load on a user's screen. This alternate text can also describe an image while it loads on the screen. Alt text improves SEO ranking because search engine crawlers cannot interpret images directly and rely on the alternate text to understand them.

Contentstack also uses alt text to provide information about the images uploaded to our website. Follow the steps below to add “alt” text to images within Contentstack:

  1. Click the ASSETS tab on the header to view the list of available assets.
  2. Click the asset that you wish to provide with “alt” text. This opens a modal that displays the details of the selected asset.
  3. On this page, you can edit the asset details. Enter your “alt” text inside the Description textbox to give the image descriptive text. This text offers alternate information about the image before it loads on the screen or if it fails to load (see the sketch below).
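
When rendering an image on your site, the asset’s Description can populate the alt attribute. A minimal sketch, assuming the asset object exposes the url and description values returned by the delivery API:

// Build an <img> tag whose alt text falls back to an empty string
// when no Description has been entered for the asset.
function renderImage(asset: { url: string; description?: string }): string {
  return `<img src="${asset.url}" alt="${asset.description ?? ""}">`;
}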

Redirect rules for changed URLs

If you happen to change the URL of a published entry, it is highly advisable to redirect the previous URL to the new one. This is an established SEO best practice. Users trying to visit the older link will be automatically taken to the newer one instead of seeing a “Page not found” error.

Additional Resource: We have a detailed, step-by-step guide on Redirecting URLs that you can refer to for more details. 

Let’s see what you need to do to manage URL redirects for your Contentstack-powered websites.

Create a content type to handle URL redirects

You need to create a separate content type to manage the redirection of the changed URLs.

  1. Create a new “Content Block” type content type, named “Redirect,” within your Contentstack stack, and set it as “Multiple.”
  2. Add three fields to it: “From URL,” “To URL,” and “Redirect Type” (temporary/permanent).

Create entries to manage the redirection of URLs

If you change the URL of any published entry, create an entry in your “Redirect” content type.

  1. Provide the “From URL,” which is the URL that has been modified or no longer exists, and the “To URL,” which is the new URL.
  2. Set the “Redirect Type” to distinguish temporary from permanent redirection rules (i.e., the 301, 302, 303, etc. redirection codes).
  3. After entering relevant content in all the available fields, save and publish the entry to the specified publishing environment.

Note: After you create entries for your redirect rule and publish them to an environment, your server should read those rules and apply the redirection as configured.

If any user visits the old URL, they will be redirected to the new URL automatically.
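
As an illustration of the server-side handling mentioned in the note above, here is a minimal Express middleware sketch. The content type UID redirect and the field UIDs from_url, to_url, and redirect_type are assumptions based on the modeling steps above:

import express from "express";
import * as contentstack from "contentstack";

const app = express();
const Stack = contentstack.Stack({
  api_key: "your_api_key",
  delivery_token: "your_delivery_token",
  environment: "production",
});

// Check every incoming path against the published redirect rules
// before serving content.
app.use(async (req, res, next) => {
  const [entries] = await Stack.ContentType("redirect")
    .Query()
    .where("from_url", req.path)
    .toJSON()
    .find();
  const rule = entries[0];
  if (rule) {
    // 301 for permanent moves, 302 for temporary ones.
    const status = rule.redirect_type === "permanent" ? 301 : 302;
    return res.redirect(status, rule.to_url);
  }
  next();
});

app.listen(3000);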

Update broken links

Broken links can affect your website in the following ways:

  • Poor user experience: Users may navigate away from your site when they discover broken links that lead to a 404 error page.
  • Poor onsite-SEO: Broken links restrict the search engine’s ability to crawl the links on your website easily. This directly impacts your site’s SEO ranking.

Additional Resource: Contentstack uses Xenu for finding broken links on the website. This tool is available online, and you can download it for free. It connects to your website and checks for broken links by crawling through web pages just like any search engine would.

Canonical tags for webpages with the same content

A canonical tag tells the search engine that a particular URL is the master copy of a specific page. This tag indicates which version of the URL should be displayed in the search results.

Using the canonical tag helps avoid duplicate content being spread across multiple URLs. It allows the search engine to easily crawl the web and pick out the URL you have specified in the canonical tag when browsing your website. Easy navigation enables better SEO rankings.

Let us consider a sample website that can be reached through all of the following links:

  • https://www.samplesite.com
  • http://www.samplesite.com
  • http://samplesite.com
  • http://samplesite.com/index.php?r...
  • http://samplesite.com/index.php

A search engine will consider every single one of these URLs a unique webpage, even though all five are copies of the same homepage. Here, setting a canonical tag that points to a specific URL (e.g., https://www.samplesite.com) tells the search engine to include just that URL in the search results.

Using the Canonical tag with Contentstack

When setting up the code for your website on Contentstack, add the following line of code inside the <head> tag in HTML:

<link rel="canonical" href="your preferred website address">

Here, specify the version of your website address that should be displayed on the search results page.

Add a sitemap

A sitemap provides a list of all the important URLs that exist on your website. It forms a roadmap that a search engine can follow to crawl to any particular URL on your website.

The use of a sitemap benefits organizations that have websites:

  • With hundreds of web pages or a complex website architecture
  • With constantly changing URLs
  • With constantly changing content

By providing a list of all the essential web pages on a sitemap, you indicate to a search engine that these are appropriate landing pages on your website. The sitemap then provides an easily accessible index for the search engine to traverse through your website intelligently. This offers better on-site SEO.

Using a sitemap with Contentstack

You can use an online XML sitemap generator service to create sitemaps for your Contentstack-powered website. These sitemaps are normally stored within a file named sitemap-index.xml.

You can also create a plugin that autogenerates the sitemap-index.xml file when a particular entry on your Contentstack website is published.
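
A minimal sketch of that plugin approach, assuming a publish webhook invokes the function and that entries of a hypothetical page content type carry a url field:

import * as fs from "fs";
import * as contentstack from "contentstack";

const Stack = contentstack.Stack({
  api_key: "your_api_key",
  delivery_token: "your_delivery_token",
  environment: "production",
});

// Rebuild sitemap-index.xml from all published "page" entries.
// Call this from the webhook handler that fires on publish.
async function regenerateSitemap(): Promise<void> {
  const [entries] = await Stack.ContentType("page").Query().toJSON().find();
  const urls = entries
    .map((entry: any) => `  <url><loc>https://www.samplesite.com${entry.url}</loc></url>`)
    .join("\n");
  fs.writeFileSync(
    "sitemap-index.xml",
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
      `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>`
  );
}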

Create high-quality content

High-quality content attracts search engines and visitors to your website. Visitors prefer crisp, engaging content that sticks to your website’s main topic. When users find relevant content, they may end up visiting related sections of your site too.

Content that promotes good onsite-SEO should contain the following attributes:

  • Well-researched content: Search engines favor websites with comprehensive and well-researched content directed towards a specific audience, and usually reward them with higher rankings.
  • Unique content: When more than one website address contains the same pieces of content, search engines find it difficult to choose any one of those duplicated versions as the relevant result for a search query. This division of traffic across multiple locations with the same data reduces the visibility of your content and eventually results in a loss of overall user traffic. To attract user traffic, maintain unique content throughout your website.
  • Competitor and keyword research: A detailed analysis of the keywords that competitors in your industry use to draw visitors helps you identify important content and gain the attention of different search engines.
  • Linking strategy: Provide links to internal content on your website wherever possible to demonstrate expertise in the related industry.
  • External links: Provide acclaimed external links that back up the claims made by your website’s content.
  • Varied content: Content that includes a healthy mix of optimized images, compelling videos, numbered lists, and bulleted lists promotes itself.

Leverage search engine friendly URLs

When search engines can easily crawl to a particular website URL, onsite-SEO rankings improve. Search engine-friendly URLs also give users a clear path to their desired destination.

Let us see how URLs can be optimized for SEO.

Use appropriate keywords

It is a good practice to include, in the URL, keywords that tell crawlers what the webpage is about. Having the most relevant term alongside the root domain of the website URL helps search engines direct the right traffic to your site.

Consider a website with the URL https://example.com/headlesscms. Here, the keyword headlesscms specifies what the webpage is about.

Use appropriate URL hierarchy

An SEO-friendly URL should flow logically from the root domain to categories and sub-categories. This hierarchy will allow search engines and users to crawl through your website with ease.

Consider the following URLs:

  • https://example.com/cms/headless/contentstack
  • https://example.com/contentstack

The first URL is SEO-friendly, as it flows logically from the general category (cms) through the sub-category (headless) to the specific page, while the second URL confuses users with its lack of categorization.

Additional best practices

Here are a few more practices that help maintain SEO-friendly URLs on your website:

  • Concise URLs: Short and crisp URLs help convey clearly what the website is about. They also prevent Google from truncating the displayed URL, which happens once its length crosses about 512 pixels on the results page.
  • Word delimiters: Use hyphens to separate terms in your URL address (e.g., /headless-cms-guide). Avoid underscores, as search engines do not treat them as word separators.

Optimize headlines for SEO

It is a good practice to optimize the header tags on your website for a better user experience and onsite-SEO. SEO-friendly headlines make your content easier to browse and improve your website’s SEO ranking.

Let us look at some best practices to be followed while using header tags for your website.

Appropriate use of the H1 tag

An H1 tag provides the most significant headline available on your webpage. This tag usually represents your web page's title or other significant content that should stand out from the remaining data on the web page.

It is considered a best practice to use only one H1 tag per web page, as more than one title would confuse visitors who scan through your website.

For example, the Contentstack website follows the same ideology on its webpage. The webpage maintains “Industry Leader in Headless CMS” as the sole H1 tag, providing better readability for its visitors.

The use of a single header tag enhances user experience and improves SEO rankings for the website.

Structure website content

The header tags shape the content on your website and provide an appropriate context for the data present on your webpage. To provide the appropriate flow of information on your website, you can use different header tags to structure your content.

Let us take a look at the different types of header tags:

  • H1 tags: The H1 tags highlight the main topic that your webpage is all about.
  • H2 tags: The H2 tags usually provide context for the sub-headings or sub-sections placed under the H1 tag.
  • H3 to H6 tags: The H3 to H6 tags are available for assigning additional sub-headings or sub-sections under the H1 and H2 tags.

For example, the H1 tag would create the title of your blog post, while the tags ranging from H2 to H6 would form the subsequent sub-headings inside your blog post.
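
For instance, a page’s heading hierarchy might look like this in HTML (the headings themselves are illustrative):

<h1>Industry Leader in Headless CMS</h1>
<h2>Why go headless?</h2>
<h3>Omnichannel delivery</h3>
<h2>Customer stories</h2>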

Create an SEO-friendly Robots.txt file

Robots.txt is a plain text file placed in the root directory of a website. It contains directives that tell web robots (also known as crawlers or spiders) which pages or areas of the site they are allowed to crawl and index and which ones they should not. These instructions are important for search engine optimization (SEO), security, and privacy reasons.

Let’s consider the following basic syntax that lies within the “Robots.txt” file:

User-agent: *
Disallow: /

Here, the asterisk addresses all the search engines or web crawlers that crawl the website, while the slash tells them not to crawl any page of the website, including the homepage.

Here is another way to instruct a specific web crawler to stop crawling particular web pages:

User-agent: Googlebot
Disallow: /example

This syntax tells the Google crawler (Googlebot) not to crawl any webpage whose URL path begins with /example. For instance, www.website.com/example.

Importance of using the Robots.txt file

For example, let's consider a CMS used to manage an e-commerce website. The website has several sections that are intended for internal use only, such as the administration panel and the customer database. To prevent search engines from indexing these sections, a good practice is to automatically create a “robots.txt” file based on the disallow_robots setting of each CMS entry.

The CMS could include a setting in the backend that allows website administrators to create a boolean field, say disallow_robots, and mark certain pages or sections with disallow_robots: true. When a page or section is marked in this way, an external function (e.g., a serverless function) would automatically generate a robots.txt file that includes the appropriate User-agent and Disallow directives.

The generated “robots.txt” file would then be uploaded to the website's root directory, allowing search engines to easily identify the restricted pages and avoid indexing them.

By automatically generating the robots.txt file based on the disallow_robots setting of each entry, you ensure that the website is properly optimized for search engine indexing while also protecting sensitive information that is not intended for public consumption.
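
A minimal sketch of that generation step, assuming a hypothetical page content type whose entries carry url and disallow_robots fields:

import * as fs from "fs";
import * as contentstack from "contentstack";

const Stack = contentstack.Stack({
  api_key: "your_api_key",
  delivery_token: "your_delivery_token",
  environment: "production",
});

// Emit a Disallow rule for every entry flagged disallow_robots: true,
// then write the file that will be uploaded to the site's root directory.
async function generateRobotsTxt(): Promise<void> {
  const [entries] = await Stack.ContentType("page")
    .Query()
    .where("disallow_robots", true)
    .toJSON()
    .find();
  const rules = entries.map((entry: any) => `Disallow: ${entry.url}`).join("\n");
  fs.writeFileSync("robots.txt", `User-agent: *\n${rules}\n`);
}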

An alternate way is to create a content type called “Robots.txt” with a separate multi-line textbox field for each environment. When you create an entry, you can keep the whole robots.txt content for each environment in its respective field. The developer just needs to ensure that changes made here are propagated to the served robots.txt file.
