Understanding SEO-Friendly URL Syntax Practices. Poor URL structure is a frequent SEO issue, one that can impair rankings, keep pages out of search engine indexes, and drain ranking authority from your other pages or even the entire site. Some content management systems bake poor URL structures directly into their websites. Lax rules can be a culprit, for example, failing to encode spaces or special characters.
Meanwhile, some CMS platforms build URLs from illegal characters that should never appear in addresses. Others generate multiple URLs for the same page, creating duplicate content. While it is true that search engines go to great lengths to read and index even the worst URLs, attention to URL management and optimization provides both SEO and usability advantages.
Good URL Structure. A few years ago, Dr. Peter J. Meyers put together a cheat sheet on the anatomy of a URL. It is a good one to keep handy. A well-structured address is easy to read and understand: if I saw such an address pasted into a blog or forum, I would likely click on it. It is optimized with breadcrumb-style keywords. Search engines look for keywords in URLs; it is a known ranking factor. This layout, moving from general to specific, is well suited to enterprise SEO.
The URL includes its own anchor text. If such an address were pasted into a blog or any other web page as a link, that link would carry well-optimized anchor text. Old-style dynamic addresses are legal and acceptable, though they have drawbacks.
They are generally longer and harder to read because they contain both parameter names and values. Pairing parameter names with values adds extra words, which can dilute the SEO value of the keywords in the URL. This type of address often carries information better transmitted outside the URL: a user ID, session ID, sort code, print code, and many other parameters can create duplicate content, security problems, or other issues.
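To see why such addresses get noisy, it helps to split one into its parameter pairs. A minimal sketch with Python's standard `urllib` follows; the URL and parameter names (`category`, `sessionid`, `sort`) are invented for illustration:

```python
# Split a dynamic URL into its query parameters so each one can be
# judged: does it change the page content, or is it session noise?
from urllib.parse import urlparse, parse_qs

url = "https://example.com/products.php?category=widgets&sessionid=8f3a&sort=price"
params = parse_qs(urlparse(url).query)

print(params)
# Parameters like 'sessionid' and 'sort' do not change the core content,
# making them candidates for removal, cookies, or canonical tags.
```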
Diagnosing URL Issues – To find URL-based issues:
Look for errors and warnings, then determine whether URLs are the culprit. Audit all URLs for proper syntax. To check for errors, start with the Google and Bing webmaster tool reports. Look for duplicate content, then examine the page addresses themselves along with their locations. Numerous third-party SEO tools can locate these issues as well. Canonical problems, parameters that do not change page content, loose adherence to coding standards, and a variety of other causes can create duplicate content.
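One way to surface duplicate-content candidates during an audit is to normalize each URL and look for collisions. The sketch below is an assumption-laden starting point: the list of ignorable parameters (`utm_source`, `sessionid`, etc.) is hypothetical, and you should verify against your own site before stripping anything.

```python
# Normalize URLs (lowercase host, sorted query, no tracking params,
# no trailing slash) so duplicates collapse to one canonical string.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

IGNORABLE = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def normalize(url):
    p = urlparse(url)
    query = urlencode(sorted(
        (k, v) for k, v in parse_qsl(p.query) if k not in IGNORABLE
    ))
    path = p.path.rstrip("/") or "/"
    return urlunparse((p.scheme.lower(), p.netloc.lower(), path, "", query, ""))

a = normalize("https://Example.com/News/?utm_source=feed")
b = normalize("https://example.com/News")
print(a == b)  # True: both collapse to the same canonical address
```

Feeding an exported URL list through `normalize()` and counting collisions gives a quick shortlist of pages that likely need canonical tags or redirects.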
I dealt with a newspaper that used unique numerical identifiers, separate from parameters, to serve articles as web pages. It did not matter what else the URL contained, so long as the identifier appeared somewhere in the address. Unfortunately, the wording of the link hooks in the templates was inconsistent, producing a multitude of duplicate-content pages. We had to pore over each template, rewrite each link hook as an SEO-friendly URL, then catalog all of the legacy URLs and 301-redirect them to the new optimized addresses.
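The catalog-and-redirect step can be sketched as a lookup table. This is not the newspaper's actual system; the identifiers, paths, and `article.php` pattern below are invented to show the shape of the mapping:

```python
# Map legacy identifier-based URLs to new keyword-rich paths and
# answer with a 301 (permanent) redirect so ranking authority follows.
from urllib.parse import urlparse, parse_qs

# Built while auditing the old templates: article ID -> optimized path.
REDIRECT_MAP = {
    "48210": "/news/local/city-council-budget-vote",
    "48211": "/sports/high-school/regional-final-recap",
}

def redirect_target(legacy_url):
    """Return (status, location) for a legacy identifier-based URL."""
    qs = parse_qs(urlparse(legacy_url).query)
    article_id = qs.get("id", [None])[0]
    new_path = REDIRECT_MAP.get(article_id)
    return (301, new_path) if new_path else (404, None)

print(redirect_target("https://example.com/article.php?id=48210"))
# (301, '/news/local/city-council-budget-vote')
```

In production the same table would typically live in the web server's rewrite rules rather than application code, but the principle is identical: one legacy address, one permanent destination.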
When auditing URL syntax, I prefer to export every page address into a spreadsheet or database. If you are thinking of using Google site: queries, don't bother; most of the issues you are looking for do not appear in search results. Each character in a URL has a specific use. When suspect characters appear, determine whether they are used properly, should be encoded, or whether the URL needs reconfiguration.
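Once the addresses are exported, the character review can be partly automated. The sketch below flags characters outside RFC 3986's unreserved and reserved sets for manual inspection; the sample URL is invented, and treating every flagged character as a problem is an assumption you should confirm case by case.

```python
# Flag characters in a URL that are neither unreserved (always safe)
# nor reserved (legal when used for their structural purpose).
import string

UNRESERVED = set(string.ascii_letters + string.digits + "-._~")
RESERVED = set(":/?#[]@!$&'()*+,;=")

def suspicious_chars(url):
    # '%' is excluded because it legitimately introduces percent-encoding.
    return sorted(set(url) - UNRESERVED - RESERVED - {"%"})

print(suspicious_chars("https://example.com/files/report 2024|final.pdf"))
# [' ', '|']  -- the space and pipe need encoding or removal
```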
Unsafe Characters – Encode unsafe characters unless they are used for a specific purpose. The % symbol does not require encoding when it is used to encode another character. The # symbol does not require encoding when it is used to create an anchor link.
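Percent-encoding itself is a one-liner with the standard library. A small sketch, with an invented path; `safe="/"` keeps the slash literal because it separates path segments here:

```python
# Percent-encode unsafe characters (space, angle brackets) in a path
# while leaving the structural '/' separators untouched.
from urllib.parse import quote

raw_path = "/docs/annual report<2024>.pdf"
print(quote(raw_path, safe="/"))
# /docs/annual%20report%3C2024%3E.pdf
```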
Miscellaneous Characters – As it happens, these characters do not require encoding, and many CMS platforms will encode them automatically. If you want links containing these characters to remain consistent when shared from site to site, it is a safe bet to encode them.
Hunt For The Pound Symbol, # – Search engines ignore the # and everything after it in a URL. If you use the #, make sure the page appears as you want it crawled and indexed once the # and everything that follows is removed. If the # changes content you want indexed, you will need to find a different URL structure. For example, example.com/page#reviews is indexed simply as example.com/page.
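This stripping behavior can be checked with the standard library's `urldefrag`, which splits an address at the # exactly as described above (the URL is an invented example):

```python
# urldefrag returns the address a crawler would index, with the
# fragment (the part after #) split off separately.
from urllib.parse import urldefrag

url, fragment = urldefrag("https://example.com/guide#chapter-3")
print(url)       # https://example.com/guide
print(fragment)  # chapter-3
```

If the content at `#chapter-3` differs from the base page and needs to be indexed, it should live at its own fragment-free URL instead.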