
What should the URL structure be?

2/24/2020 7:24:38 PM
The structure of a site's URLs should be extremely simple. Try to organize your content so that URLs have a logical structure and are understandable to a person (where possible, use words, not identifiers consisting of many numbers). For example, when searching for aviation information, URLs like http://ru.wikipedia.org/wiki/aviacia or https://chrome.google.com/webstore/detail/veepn-unlimited-free-fast/majdfhpaihoncoakbjgbdhglocklcgno?hl=en help evaluate the relevance of a link. It is much harder to get a user's attention with a URL like http://www.example.com/index.php?id_sezione=360&sid=3a5ebc944f41daa6f849f730f1.

Use punctuation in the URL. The address http://www.example.com/green-dress.html is more convenient than http://www.example.com/greendress.html. We recommend using hyphens (-) rather than underscores (_).

Overly complex URLs, especially those including several parameters, can complicate the work of search robots, because an excessive number of URLs is created that all point to the same or similar content on the site. As a result, Googlebot can consume much more bandwidth than necessary. In addition, it may not be able to crawl all of the site's content completely.

The main causes of this problem
Excess URLs can arise from a variety of factors. Some of them are listed below.

Incremental filtering of a set of items. Many sites offer different views of the same set of items or search results, which the user can filter by certain criteria (for example, "show hotels on the coast"). If filters can be combined (for example, "hotels on the coast with a fitness center"), the number of URLs (data views) on these sites increases significantly. It is not necessary to create many hotel lists that differ only slightly, since Googlebot only needs to see a very small number of lists from which it can reach each hotel's page.

"Dynamically generated documents. Such documents may vary slightly due to the addition of counters, Time stamps or announcements.
"Problem parameters in the URL. For example, Session identifiers can cause a lot of repetitions and lead to a sharp increase in the number of URLs.
"Sort options. Some large online stores provide different ways to organize the same elements, Resulting in a large number of URLs.
"Problems with the calendar. A dynamically created calendar can generate links to subsequent and previous dates without limiting the start and end of a period.
"Broken relative links. Broken relative links often lead to infinite spaces. Often this problem is caused by repeating path elements.

How to solve this problem
To avoid possible problems with URL structure, we recommend following the recommendations below.

"Block access to problematic URLs for Googlebot using the robots. Txt file. Typically, You should block dynamic URLs, Such as search result pages or URLs, That create infinite spaces (such as calendars). Using regular expressions in a robots. Txt file, You can easily block a large number of URLs.
"Try not to use session identifiers in the URL. Instead, It is recommended to use cookies. You can learn more about this in our webmaster guidelines.
"If possible, Shorten the URL by removing unnecessary parameters from them.
"If your site has an endless calendar, Add the nofollow attribute to links to future date pages dynamically generated by the calendar.
"Make sure all relative links on the site are working correctly.

Information was taken from https://support.google.com/
