URL Parameters and Their Effect on SEO
URL parameters are an essential part of how URLs are constructed. Query strings can be a great asset in the hands of experienced SEO practitioners, but they can also pose major hurdles for your website’s rankings.
This guide details the most common SEO difficulties to watch out for when working with URL parameters.
Introduction to URL Parameters
URL parameters (also known as “query strings” or “URL query parameters”) are elements added to URLs that help you sort and organize content on your website, as well as track information.
In a nutshell, URL parameters allow you to convey information about a click through the URL.
To identify a URL parameter, look at the part of the URL that follows the question mark (?). Each parameter consists of a key and a value separated by an equal sign (=), and multiple parameters are separated by an ampersand (&).
The following is an example of a URL string with parameters:
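https://www.example.com/shoes?color=blue&sort=price
In this hypothetical URL, everything after the question mark is the query string: color=blue is the first key-value pair, and &sort=price appends a second one.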
How Do I Use URL Parameters?
Using URL parameters to organize content is popular on websites such as online stores, where they make it easier for shoppers to browse products. These query strings let users filter a page’s content and view only a certain number of items per page.
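For instance, a hypothetical store URL combining a filter parameter with pagination parameters might look like this:
https://www.example.com/shoes?color=blue&page=2&per_page=24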
Tracking query strings are also popular. Digital marketers frequently use them to track where traffic comes from, so they can see whether their latest social media push, ad campaign, or newsletter investment was a success.
How URL Parameters Work
According to Google Developers, there are two types of URL parameters:
1. Active (content-modifying) parameters: parameters that change the content displayed on the page.
For example, to link a shopper directly to a specific product called ‘xyz’:
http://domain.com/?productid=xyz
2. Passive (tracking) parameters: parameters that pass along information about the click, such as which network it came from and which campaign or ad group it was part of, but do not change the content on the page.
This data is recorded in a tracking template and gives you useful information for evaluating your current marketing investments.
For example, to track traffic from your newsletter:
https://www.domain.com/?utm_source=newsletter&utm_medium=email
For example, to collect campaign data with custom URLs:
https://www.domain.com/?utm_source=twitter&utm_medium=tweet&utm_campaign=summer-sale
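If you build many campaign URLs, assembling the query string programmatically helps avoid typos. Here is a minimal Python sketch using the standard library; the base URL and parameter values are illustrative:

from urllib.parse import urlencode

# Illustrative landing page and campaign values; substitute your own.
base_url = "https://www.domain.com/"
params = {
    "utm_source": "twitter",
    "utm_medium": "tweet",
    "utm_campaign": "summer-sale",
}

# urlencode joins key=value pairs with "&" and percent-encodes unsafe characters.
tracking_url = f"{base_url}?{urlencode(params)}"
print(tracking_url)
# https://www.domain.com/?utm_source=twitter&utm_medium=tweet&utm_campaign=summer-sale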
Although it may seem simple, there is a right way and a wrong way to use URL parameters, as we’ll explore in the sections that follow.
When Do URL Parameters Become a Search Engine Optimization (SEO) Issue?
Most SEO-friendly URL structuring recommendations advise avoiding URL parameters as much as possible. This is because, however useful URL parameters are, they tend to slow down web crawlers by consuming a significant portion of the crawl budget.
Poorly structured passive URL parameters (such as session IDs, UTM codes, and affiliate IDs) can generate an endless number of URLs with non-unique content.
The following are the most common SEO problems caused by URL parameters:
1. Duplicate content: Because search engines treat each URL as a separate page, the many versions of the same page generated by URL parameters may be deemed duplicate content. A page reordered by a URL parameter typically looks extremely similar to the original page, and some parameters may return exactly the same content.
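For instance, a hypothetical https://www.example.com/shoes?sort=price-asc typically returns the same products as https://www.example.com/shoes, just in a different order, yet search engines see two separate URLs.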
2. Crawl budget waste: Keeping the URL structure simple is one of the fundamentals of URL optimization. Complex URLs with numerous parameters create many distinct URLs that all point to the same (or nearly identical) content. According to Google Developers, crawlers may decide it isn’t worth “spending” resources indexing all of a site’s content, flag it as low-quality, and move on to the next site.
3. Keyword cannibalization: Filtered copies of the original URL target the same keyword group. As a result, multiple pages compete for the same rankings, which may lead crawlers to conclude that the filtered pages add little genuine value for users.
4. Ranking signal dilution: With multiple URLs pointing to the same content, links and social shares may reference any parameterized version of the page. This splits ranking signals and further confuses crawlers, which cannot tell which of the competing pages should rank for the search query.
5. Poor URL readability: When it comes to URL structure, we want URLs to be simple and easy to grasp; a long string of numbers and codes hardly qualifies. For users, a parameterized URL is nearly unintelligible. Whether it appears in the SERPs, in an email, or on social media, a parameterized URL looks spammy and untrustworthy, making users less likely to click on or share the page.
How to Handle URL Parameters for Better SEO
Crawling and indexing all parameterized URLs is the root of most of the SEO difficulties described above. However, webmasters are not helpless in the face of the never-ending production of new URLs via parameters.
Proper tagging is at the heart of good URL parameter handling.
Please note: SEO problems arise when URLs with parameters display duplicate, non-unique content, such as those produced by passive URL parameters. Those parameterized URLs should not be indexed; only the canonical version of the page should be.
1. Examine Your Crawl Budget: The number of pages bots will crawl on your site before moving on to the next one is your crawl budget. Every website has its own crawl budget, and you should make sure that yours isn’t being squandered.
Unfortunately, a large number of crawlable, low-value URLs, such as the parameterized URLs generated by faceted navigation, wastes crawl budget.
2. Reliable Internal Linking: If your website contains a lot of parameter-based URLs, it’s critical to tell crawlers which pages they shouldn’t index and to link to the static, non-parameterized page regularly.
In this instance, be careful to link only to the static version of the page, never to the versions with parameters. That way, you avoid sending search engines conflicting signals about which version of the page to index.
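For example, site navigation and internal links should consistently point to a hypothetical https://www.example.com/shoes rather than to https://www.example.com/shoes?sort=price.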
3. Canonicalize one version of the URL: Once you’ve decided which static page should be indexed, remember to canonicalize it. Add canonical tags to the parameterized URLs that point to the chosen URL.
If you use parameters to help users browse your online shop’s shoe landing page, all URL variations should include a canonical tag designating the main landing page as the canonical page.
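As a sketch, assuming a hypothetical shoe landing page at https://www.example.com/shoes, each filtered variation (such as ?color=blue) would carry this tag in its <head>:
<link rel="canonical" href="https://www.example.com/shoes" />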
4. Use Disallow to block crawlers: Sorting and filtering URL parameters can generate an endless number of URLs with non-unique content. Using the Disallow directive, you can prevent crawlers from visiting certain areas of your website.
Controlling what crawlers such as Googlebot may access on your website with robots.txt allows you to keep them from crawling parameterized duplicate content. Bots check the robots.txt file before crawling a website, so it’s a good place to start when optimizing parameterized URLs.
The following robots.txt rule blocks any URL that contains a question mark:
User-agent: *
Disallow: /*?*
This Disallow directive prevents search engines from crawling any URL with parameters. Before choosing this option, make sure no other part of your URL structure relies on parameters, or those pages will be blocked as well.
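A safer, more targeted approach is to disallow only the parameters known to create duplicates. For example, assuming hypothetical sort and sessionid parameters:
User-agent: *
Disallow: /*?*sort=
Disallow: /*?*sessionid=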
To find all URLs that contain a question mark (?), you may need to crawl the site manually.
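As a quick sketch, if you export your crawl results to a plain-text list of URLs (here a hypothetical urls.txt, one URL per line), a few lines of Python can surface the parameterized ones:

# Print every crawled URL that carries a query string.
with open("urls.txt") as crawl_export:
    for line in crawl_export:
        url = line.strip()
        if "?" in url:
            print(url)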
5. Convert URL Parameters to Static URLs: This is part of a larger debate about dynamic vs. static URLs. Rewriting dynamic pages as static pages improves the website’s URL structure.
However, if the parameterized URLs are already indexed, you should take the time not just to rewrite the URLs but also to 301-redirect the old URLs to their new static destinations.
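On an Apache server, such a redirect might look like the following .htaccess sketch; the productid parameter and the /products/xyz path are assumptions for illustration:

RewriteEngine On
# Match the old dynamic URL http://domain.com/?productid=xyz ...
RewriteCond %{QUERY_STRING} ^productid=xyz$
# ... and permanently redirect to the static page, dropping the query string.
RewriteRule ^$ /products/xyz? [R=301,L]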
Google Developers also recommend that you:
- Remove unnecessary parameters while keeping the URL dynamic.
- Build static content that is equivalent to the dynamic content.
- Limit dynamic-to-static rewrites to those that let you remove unnecessary parameters.
Using URL Parameters as Part of Your SEO Strategy
Parameterized URLs make it easy to modify or track content, so use them where they add value. You’ll need to tell web crawlers when to index URLs with parameters and when not to, and to highlight the most valuable version of the page.
Take your time deciding which parameterized URLs should be indexed and which should not. Over time, web crawlers will gain a better understanding of how to navigate and value your site’s content.