Blog

How to Get Your Site to Rank Well on Search Engines

Search engine optimization has become a process that is as much art as science. We have a basic idea of what helps us rank better, but the search engines guard their ranking factors closely. Google, for example, will tell us that high-quality content is necessary to rank well. When it comes to exact keyword percentages, though, Google is tight-lipped.

Search engines do this to make it harder for spammy websites to manipulate the system. Their reluctance to share more information is understandable, but it doesn't help legitimate businesses trying to earn the best possible SERP position. As a result, SEO is something of a trial and error process – you learn as you go along.

Now, there is a lot of information online, but not all of it is reputable. That’s why we recommend this awesome infographic for getting more data on how to rank in 2019!

The infographic has been well-researched and will give you a basic understanding of SEO and how it has evolved. You’ll get a grounding in the topic, along with actionable tips on what works for modern SEO and what doesn’t.

In this post, we look at the ranking factors that we know about and how to hack those to get the best results for your site. Are you ready to learn more?

Page Speed

Google has been open about the fact that fast-loading sites get preference in terms of SERP, especially on mobile devices. However, it doesn't tell us how fast our sites should load. According to HubSpot, 47% of visitors expect a site to load in two seconds or less, and a one-second delay could reduce page views by as much as 11%.


How do you improve load times? By keeping your site’s file size as small as possible. That fancily animated banner might look cool, but it could do more harm than good if it takes too long to load. Images and videos on the site can also bulk it up considerably.

Check the file size of each image. You can reduce this considerably by saving the photo at a lower resolution or by making its dimensions smaller. Experiment with different sizes and resolutions so that you get the best balance between size and the quality of the image.
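As a rough sketch (the file names, sizes, and alt text here are hypothetical), a responsive, lazy-loaded image tag lets the browser pick the smallest file that fits the screen and defer loading until the image is actually needed:

<!-- Hypothetical example: smaller file for small screens, loading deferred until visible -->
<img src="/images/product-800.jpg"
     srcset="/images/product-400.jpg 400w, /images/product-800.jpg 800w"
     sizes="(max-width: 600px) 400px, 800px"
     width="800" height="600"
     loading="lazy"
     alt="Product photo">

Setting explicit width and height also prevents the layout from jumping around while the image loads, which helps the perceived speed of the page.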

High-Quality Content

Longer-form content indicates to search engines that a post is more informative. It also keeps readers on your site for longer, which, incidentally, is another of the ranking factors. Aim to impress your readers by making your content valuable to them and crafting it carefully.

High-quality content helps to establish your company as an expert in the field, so this is an investment worth making on many different levels.

Backlink Profile

A few years ago, if you wanted to boost your ranking, you would try to get as many sites as possible to link to yours. Things have changed a lot since then. Search engines now also judge the quality of the sites linking to your site.


The problem with this is that if a spammy, low-quality site links to your site, the association can count against you. That is why it is essential to be more careful when building backlinks. Whenever possible, build backlinks from authority sites.

It's also good practice to regularly check which sites have linked to yours. If you find a spammy site linking through to your site, ask that site's webmaster to remove the link.

Bounce Rate

We touched on this when discussing your site's content and page speed. The content is not the only factor that determines the bounce rate, though. Strictly speaking, the bounce rate is the percentage of visitors who leave your site after viewing only a single page.

If they click through and then immediately click away, the search engine will assume that the site was less credible. The algorithm assumes that the click-away was either a result of poor site quality or that the short description in the results was inaccurate.


Either way, the result is the same — your site will rank lower. Reduce the chances of this happening by using great content and always making sure that any snippets displayed in results are an accurate reflection of the page.

You should also spend time simplifying the layout of your pages. Make sure that they look clean and uncluttered and that text is broken up into subheadings.

Click-Through Rate

Another thing that search engines use to determine how relevant results are is the click-through rate. If people are not clicking through to your site from the results, it is seen as an indication that the site was not relevant to the search.

That's why you need to put real effort into any snippets that might be displayed. Make sure that they are interesting, well-written, and accurate.
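As a small, hypothetical illustration, the title tag and meta description are usually what feed that snippet, so they are the place to spend the effort:

<!-- Hypothetical example: a clear, accurate title and description that feed the search snippet -->
<title>Handmade Leather Wallets | Free Shipping Over $50</title>
<meta name="description" content="Browse our range of handmade leather wallets, available in five colours, with free shipping on orders over $50 and 30-day returns.">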

Use Images

The first impression is the last impression. When it comes to how impressive your site is at first glance, images play a vital role. Choose great images that will capture the attention of visitors, but make sure that the file size is as small as possible.

Use Long-Tail Keywords

With voice search starting to take off, the value of using single keywords has begun to lessen. People are more likely to use full phrases when conducting a voice search. Also, since search engines have become more adept at understanding the context of queries, the importance of individual keywords has diminished.

Work out some phrases that people might use when looking for your product and start incorporating those into your site.

Final Notes

Improving your search engine ranking is more difficult than it used to be. Search engines can more easily work out which sites have high-value content and which do not. Your best long-term strategy, therefore, should be to create a site and content that is high-quality.

Duplicate Content for eCommerce SEO – Solving the Issues Surrounding Duplicate Content on E-commerce Sites

Duplicate and low-quality content has become a common issue for e-commerce websites, and if it isn't handled appropriately, it only gets worse.

You may end up replicating huge sections of your website without even knowing it, degrading your search engine visibility.

Search engines have become more sophisticated, and they favour websites with high-quality, unique content when deciding what to index.

In this guide, we've dug into the duplicate content issues usually found on e-commerce websites.

But first, you need to understand what duplicate content is and why it matters for your e-commerce website.

What is Duplicate Content?


Duplicate content refers to a considerable block of text that is similar or identical to text elsewhere on the same site or on an external site.

Let's look at Google's definition:

“Duplicate content generally refers to substantive blocks of content within or across domains that either completely matches other content or is appreciably similar. Mostly, this is not deceptive in origin […]. If your site contains multiple pages with largely identical content, there are a number of ways you can indicate your preferred URL to Google. In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved.”

Source: https://support.google.com/webmasters/answer/66359?hl=en

Why You Should Care

Duplicate content creates three major issues for search engines:

  1. Search engines struggle to decide which version(s) of a page to include in their indices and which to exclude.
  2. They don't know whether to assign link metrics to a single page or split them across the multiple versions.
  3. They can't determine which version(s) of the page to rank for search queries.

It's not only search engines that struggle: website owners also suffer from duplicate content issues. The consequences of using duplicate or low-quality content can be disastrous for your e-commerce website.

If you ignore the quality of the content on your website, you lose the benefits of SEO. Duplicate content can lead directly to:

  • Lower or no search rankings
  • Less web traffic
  • A poor user experience

Consequently, e-commerce websites should be careful not to publish thin or duplicate content. Now let's look at these duplicate content issues and how to avoid them.

1. Non-canonical URLs Issue

Letting users sort your products is one of the easiest and best ways to improve the user experience on your e-commerce website.

Your users want to sort products by price, date, review, color, and size, and since you want to deliver a great user experience, you've given them those options. But now your website is filled with URL parameter variations.

The problem is that search engines will crawl and index all of these pages, yet they will not give them any real ranking in search results.

Without the rel="canonical" tag, you're effectively letting the search engine pick whichever version of the page it likes to rank in the search results.

Sometimes this works out well, but mostly it doesn't, and you certainly can't leave your website like this.

Solution:

You definitely don't want these arbitrary sort orders to appear in the search results in place of your intended product listings.

You want your potential customers to land on the page you created especially for that product. And to notify search engines of this, you need to add a canonical tag pointing to the 'main' version of the page.

Here is an example:

Say you want to buy a poster online and you are browsing Labno4.com. One of their best products, https://www.labno4.com/poster/lab-no-4-customer-is-king-corporate-startup-business-quotes-poster, carries the following canonical tag:

<link rel="canonical" href="https://www.labno4.com/lab-no-4-customer-is-king-corporate-startup-business-quotes-poster" />

Insert this tag in the <head> section of your page and you're done!

2. No-index (robots.txt) Issue

Noindex is a meta tag that tells search engines not to include a page in their index. And if you are not using robots.txt to control crawling, you are effectively inviting search engines to crawl your entire website without any restrictions.

You are allowing them to crawl all your low-quality pages, or to get trapped in the endless number of URLs your site can generate.

For example, a calendar section that generates a fresh URL for every single day.

Solution:

Every website has a crawl budget: a limited number of pages that will be included in a crawl.

You need to ensure that your most important pages are being indexed and that crawl budget isn't wasted on temporary or low-value files.

Robots.txt rules are useful for blocking content that you don't want search engines to crawl.

If you're facing issues with indexing, the robots.txt file is a good place to start confirming and identifying them. The file should always sit at the root of the domain, for example at https://www.yourdomain.com/robots.txt.
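As a minimal sketch (the paths here are hypothetical), a robots.txt file that keeps crawlers out of an auto-generated calendar section and internal search results could look like this:

# Hypothetical example: block low-value, auto-generated sections for all crawlers
User-agent: *
Disallow: /calendar/
Disallow: /search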

Google Search Console is one of the best SEO tools for spotting which URLs are blocked by your robots.txt file.


People often confuse noindex with canonicalization. Noindex tells search engines not to index a page at all, whereas canonicalization tells them that two or more URLs are the same and that one of them is the 'main' canonical page.
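To make the distinction concrete, here are the two tags side by side, using a hypothetical URL; each belongs in the <head> of the relevant page:

<!-- Keep this page out of the index entirely, but still follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Or: tell search engines this URL is a duplicate of the 'main' page -->
<link rel="canonical" href="https://www.example.com/main-product-page" />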

3. Product description Issue

Every e-commerce website needs content to explain the details and features of each product. Consequently, a lot of e-commerce websites use the raw description provided by the manufacturer or seller.

This creates a huge content duplication problem, and search engines are unable to differentiate your website in the SERPs.

A manufacturer shares the same content with every brand it supplies, which leaves a lot of websites with exactly matching content on their product pages.

Even big organizations are guilty of this. For example, the product description for the novel 'Meluha' is identical on both Flipkart and Indiamart.

Meluha on Flipkart

Meluha Novel on Indiamart

If you are competing with big brands using identical content, both search engines and users are likely to prefer the website with greater brand authority.

Moreover, identical product descriptions become a major hurdle when a brand wants to expand the number of platforms it sells on.

For example, imagine an organization that already sells products through its own website and now wants to sell on Amazon as well, using the exact same product descriptions on both sites.

In this case, the brand will likely lose its top SERP ranking for its own product, because the e-commerce giant Amazon has far greater domain authority.

Solution

There is only one solution to this problem: unique content. There is no substitute for high-quality content.

You can hire a dedicated team to ensure the uniqueness and quality (proper grammar, no misspellings, etc.) of your content. Alternatively, you can hire freelancers or outsource the writing work.

4. Reviews – User Generated Content Issue

Many CMSs offer built-in review functionality. Sometimes review pages are created separately for specific products, and sometimes reviews are displayed on the product pages themselves.

Because of this, duplicate content issues arise between the product pages and their equivalent review pages.

Solution

Such review pages should be canonicalized to the official product page, or you should add "noindex, follow" via a meta robots tag.

Canonicalization is the better option when external websites link to a review page, since it passes those link signals on to the product page.

5. WWW vs. Non-WWW URLs Issue

A search engine treats "http://www.domain.com" and "http://domain.com" as two different web addresses, and there is no SEO benefit in choosing one over the other.

When you put www at the start of your website URL, it acts as a hostname, which gives you DNS flexibility, the ability to restrict cookies when using several subdomains, and more. A non-www domain, also termed a naked domain, does not provide these technical advantages.

It is generally safer to use www in your website URL: your site may be small today, but tomorrow you may want it to grow into a large one.

Solution:

A 301 redirect from the undesired version to the desired version is the suggested solution to this kind of technical URL issue.
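As a sketch for an Apache server (it assumes mod_rewrite is enabled and uses domain.com as a placeholder), the non-www to www redirect could sit in your .htaccess file:

# Hypothetical example: 301 redirect the naked domain to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.domain.com/$1 [L,R=301]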

You can also use Google's webmaster tools to declare your preferred www or non-www version of the domain.

6. URL parameters Issue

Google tries to crawl every link on your website. However, it sometimes bumps into different URLs that lead to the same content, generally because of URL parameters.

URL parameters are additional pieces of information attached to the end of a URL. For example, a "sort" parameter indicates how to order products, and a "setCurrencyId" parameter indicates which currency to display prices in.
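For instance (with hypothetical store URLs), all three of the addresses below can return essentially the same product listing, and, as with the sorting example earlier, a canonical tag on each of them can point search engines back to the clean version:

https://www.example.com/shoes
https://www.example.com/shoes?sort=price
https://www.example.com/shoes?sort=price&setCurrencyId=2

<link rel="canonical" href="https://www.example.com/shoes" />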


We understand that duplicate content is a big no-no for every search engine; yet each URL parameter variation creates what looks like a separate page carrying essentially the same content.

And simply modifying the arrangement of items on a page does not make it unique, original content. In fact, Google treats this as copied content.

Whenever Google notices duplicate URLs, it uses what it has learned about your website to decide which URL best represents a particular product, and it will favour that URL while devaluing or hiding the others.

Usually, Google picks between these URLs using hints such as link popularity and the content provided on the page.

Solution

Parameter handling is the best way to address this issue.

The URL Parameters section in Google Search Console is one of the best tools for indicating which URLs should be crawled and which should be avoided.

This helps you handle duplicate content and gives you control over how Google indexes and presents your website in the SERPs.

Google has also updated the URL parameter feature recently, adding a few more settings and controls for users.

Previously, parameter handling was an option under Settings, but you can now access it under Crawl > URL Parameters.

7. Category pages Issue

Category pages are the most common source of duplicate content on an e-commerce site. The problem generally arises when multiple categories target the same type of product (for example, "Men's Shoes" and "Shoes for Men").

It happens for several reasons: placing the category under two different parent categories, trying to reach different markets, or sometimes simply targeting extra keywords for SEO.

Therefore, if the pages are targeting the same topic, Google will not have a clear picture of which page takes priority.

Solution

There are many ways to act on this, depending on whether you are working in a preventative or a reactive capacity.

Many merchandisers don’t realize that there is a problem with having multiple landing pages for a product type.

Helping them understand what to avoid is the preventative step. The obvious reactive answer is to clean up the existing pages so that their relative importance is indicated more clearly.

To identify the most visibly problematic categories, begin with a crawl and search for exactly matching or extremely similar H1s.

After that, if feasible, perform a manual check through a list of all live categories, since some duplication can't be found through crawl data alone.

While looking for duplicate categories on your website, keep in mind that Google also relies on on-page content to understand the purpose of a page.

Consequently, the product selection within each category should be well planned, as most of the content on category pages is taken from product titles.

A number of e-commerce CMSs let you identify products assigned to multiple categories, which can be very useful for locating this specific type of duplicate content.

And once you have finished consolidating the categories, the best action is to apply 301 redirects from the retired pages to the ones you keep.

Conclusion

So there you have it: the most common duplicate content issues we encounter on e-commerce sites on a daily basis. We hope you enjoyed the guide.

If you have any questions, or if you want to add a few more tips, share them with us in a comment below.

B2B marketing tips to jump-start your 2017 results

One of the first things you need to do as a B2B enterprise to get the results you want in 2017 is to make sure you dominate the search engine result pages (SERPs) for keywords relevant to your business. When you perform well for high-priority search queries and keywords, you can build an identity for your brand. A useful tool in this case can be a B2B email append service. You will also gain the trust of your target audience, which only adds to your overall credibility.

Using Testing To Maximize Conversions

Every visitor who comes to your site is an invaluable opportunity to learn, connect, and then convert that prospect into a client. Marketers often make mistakes like jumping straight to offers that demand a high degree of commitment, or using no call to action at all, and so miss the mark entirely. Usability testing and conversion rate optimization can be complex affairs, but they needn't be. Start by using analytics and append services to understand how people are reacting to your content.

Capitalizing On Attribution Insights

As a B2B marketer, you will in all likelihood focus on driving the qualified leads you feel are good enough for your sales team. However, those are bottom-of-the-funnel metrics. You also need to focus on the content and channels that bring those prospective clients in at the top of the funnel. If your sales cycle is complex, it might take around six to eight touches to generate a lead that is ready for the sales team. You can use the Multi-Channel Funnels reports in Google Analytics for this purpose, and the insights will inform your digital marketing campaigns.


Integrating with the Sales Funnel

If you really want to understand the quality of the leads generated by your marketing team's various efforts, you need to use your customer relationship management (CRM) system to track lead outcomes. This gives you the visibility needed for a proper view of the whole funnel. Google is also integrating AdWords with Salesforce, helping B2B marketers connect leads with their online activity.

Using Call Tracking To Understand Total ROI

In B2B marketing, a lot of leads still come from phone calls. Leads that come in by phone are often more qualified and more likely to commit than those acquired through the web. If you're trying to increase qualified leads, it helps to understand whether those leads are arriving by phone, and call tracking technologies can give you that insight.