The template contains the elements that are repeated on most pages of the site: header, menu, footer, sidebar, and any other fixed elements of your pages.
Since the content of your site's template is repeated on (almost) every page, it is important to understand which elements it contains.
Example: typically, the general terms and conditions (GTC) and legal notices appear in the footer of a site, while account access and social-network links sit in the header. Are these really the most important pages for your site's SEO, deserving such strong internal linking?
There are two objectives here, detailed below.
NB: the more pages your site has, the greater the impact of these optimizations.
Accessibility and inaccessibility of links in the template
On your home page: the footer and the header must be fully accessible, in particular through standard <a href> HTML links:
<ul>
  <li>Section 1
    <ul>
      <li><a href="http://www.example.com/section-1/page-1.html">Page 1.1</a></li>
      <li><a href="http://www.example.com/section-1/page-2.html">Page 1.2</a></li>
    </ul>
  </li>
  <li>Section 2
    <ul>
      <li><a href="http://www.example.com/section-2/page-1.html">Page 2.1</a></li>
      <li><a href="http://www.example.com/section-2/page-2.html">Page 2.2</a></li>
    </ul>
  </li>
</ul>

On all other pages: the full footer and header must not be accessible. Only the menu links pointing to the section currently being browsed should remain accessible to search engines; the other links must be made invisible to them. The goal is to push bots to visit the entire silo of pages they have entered, and also to strengthen the semantic relevance of those pages by multiplying the links between semantically close pages.
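As an illustration (our sketch; the source only describes the principle), here is what the same menu might look like on a page inside Section 1: the Section 1 links stay as real <a> elements, while the Section 2 entries are rendered as inert elements that a bot cannot follow. The `obf` class name and the idea of storing an encoded URL in the element are assumptions for this example.

```html
<ul>
  <li>Section 1
    <ul>
      <li><a href="http://www.example.com/section-1/page-1.html">Page 1.1</a></li>
      <li><a href="http://www.example.com/section-1/page-2.html">Page 1.2</a></li>
    </ul>
  </li>
  <li>Section 2
    <ul>
      <!-- Obfuscated: no href in the source; JavaScript rebuilds the link client-side -->
      <li><span class="obf" id="BASE64-ENCODED-TARGET-URL">Page 2.1</span></li>
      <li><span class="obf" id="BASE64-ENCODED-TARGET-URL">Page 2.2</span></li>
    </ul>
  </li>
</ul>
```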
Example: my e-commerce site sells video games for the Xbox and Nintendo consoles. Once a bot is engaged in the Nintendo universe, the objectives are:
- to push bots to visit the deeper pages of this Nintendo section,
- to prevent pages with "Xbox" semantics from making internal links to pages with a "Nintendo" theme, which would transmit inappropriate semantics to them.
Recommended solution: use different headers and footers depending on the page, combined with link encryption (obfuscation) to hide the links that must remain inaccessible to search engines.
This solution consists in not showing URLs in plain text in the page's source code, so that they cannot be identified and indexed by search engines. The target URL is first encoded server-side, then reconstituted client-side by JavaScript code. Example of a bot's vision:
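A hedged sketch of what such a bot's view could look like, assuming the target URL is Base64-encoded into the span's id (the encoding scheme and the `obf` class are our assumptions; the source does not specify them):

```html
<!-- What a bot sees: no <a href> to crawl, just an inert <span> whose id
     is the Base64-encoded target URL (here, "http://www.example.com/") -->
<span class="obf" id="aHR0cDovL3d3dy5leGFtcGxlLmNvbS8=">Page 2.1</span>
```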
If JavaScript is enabled (the user's view), a script decodes the id of the <span> tag and rebuilds the link by replacing the <span> with an <a>.
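The decoding step described above could be sketched as follows (a minimal illustration, assuming Base64 encoding, an `obf` class on the obfuscated spans, and function names of our choosing — none of these are specified by the source):

```javascript
// Decode a Base64-encoded URL stored in a <span> id (hypothetical scheme).
function decodeHref(encodedId) {
  return atob(encodedId); // atob: built-in Base64 decoder (browsers, Node >= 16)
}

// Replace every obfuscated <span> with a real <a> link.
function deobfuscateLinks(root) {
  for (const span of root.querySelectorAll('span.obf')) {
    const a = document.createElement('a');
    a.href = decodeHref(span.id);
    a.textContent = span.textContent;
    span.replaceWith(a); // the inert <span> becomes a normal link
  }
}

// Browser only: run once the DOM is ready.
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', () => deobfuscateLinks(document));
}
```

This script would live in the external JavaScript file that, as explained below, is blocked in robots.txt so that search engines cannot fetch it and reverse the encoding.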
The JavaScript file that performs the decoding will obviously have to be blocked in robots.txt so that Google does not try to decode your links.