SEO web hosting


SEO guide

  1. SEO

«SEO (Search Engine Optimization), or search engine optimization, is intended to allow web content that can be indexed by search engines to appear in the highest possible positions when search terms (keywords) relevant to the content are searched for. It is not only a question of the position itself, but also of satisfying the user's interest through a match between the search term, the display of the search results and their relevance, in order ultimately to create an incentive to click.» (Holl, 2016, online)

Search engine optimization is divided into two sub-areas: on-page and off-page. To achieve maximum success, both areas must be worked on.

On-Page Optimization

On-page optimization is about a website's content, structure and technical aspects. (Sistrix.de, 2020b, online)

Off-Page Optimization

«Off-page optimization deals with all external factors of a website. As a rule, this refers to backlinks, also called external links or just links.» (Sistrix.de, 2020a, online)

   Search engine optimization takes into account on-page and off-page factors.


  2. Explanation of the term Google

Bing, Yahoo and Baidu are search engines used all over the world, but the most-used search engine on the internet is clearly Google. With a global market share of around 86.87%, it is the clear market leader; in Switzerland, Google's market share is as high as 93.9%.

  Google is the clear market leader among search engines in Switzerland. (Luna-Park.de, 2019, online)

Because Google is by far the most important search engine in Switzerland, search engine optimization must be clearly geared towards it.

The web crawler system

The spiders, also known as web crawlers, automatically search and analyze the World Wide Web and download its content. The Google bots look for new pages and index them. A new web page is visited again in the following days to check whether it has been updated. The deepbots index websites in detail; even images, PDF files and PowerPoint documents are recorded. However, it can take days or even weeks for these to appear in the search results.

  Computer programs called bots examine websites and index them in detail.

 The search results

The search results are divided into two areas. The paid ads are displayed at the top and are marked by a small rectangle in which «Advertisement» is written. These ads can be created on Google Ads. Below the paid ads are the organic search results. Search engine optimization (SEO) focuses exclusively on these; only in the organic results can a page improve its ranking through SEO.

  SEO affects only the organic results on the search engine.

  3. SEO measures

The individual SEO measures are described in detail below. An evaluation using a barometer shows how much time has to be spent on each measure and how important it is.

There are two barometers. The «time required» barometer shows how much time is needed to implement the SEO measure. The «priority» barometer reveals how important the respective SEO measure is and how much influence it has on the placement on Google.

Each barometer is divided into five parts. If all five parts of the time barometer are colored green, implementing the measure is very time-consuming; if all five parts of the priority barometer are green, the measure has a great impact on a good ranking on Google. If only a few fields are colored green, the SEO measure can be implemented very quickly or, in the case of priority, does not have a major impact on a better placement. The evaluation was made on the basis of previous research on this topic, discussions with Klaus Bauer (Managing Director of Bauer Medien AG) and our own experience.


URL

URL stands for «Uniform Resource Locator». URLs are better known as links. By clicking on a URL, the user is directed to the respective website. In most cases, a website's domain forms the beginning of the URL.


Keyword

A keyword is a search term that a user searches for. It is a word that is relevant to the target audience and is frequently searched for. Long-tail keywords are a combination of several words. (cf. Schneider, 2018, p. 17) "The relevant keywords are the terms that your potential customers enter into the search mask of the search engine and with which your website should be found." (Schneider, 2018, p. 148)

  Keyword Density

In a text optimized for search engines, it must be clear which keyword the text should rank for. This keyword should appear several times in the text, but not too often. The keyword density indicates, as a percentage, how often the keyword appears in the text. It should be adapted to the length of the text: if the text contains only 300 to 500 words, the keyword density can be 5%. For longer texts of 500 to 1000 words, it should be 3%. From a text length of 1000 words, a keyword density of 1 to 2% is recommended.
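The keyword density of a text can be computed with a short script. This is a generic sketch of ours, not a tool from the guide; it assumes a single-word keyword:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that are exactly `keyword` (single word)."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100 * hits / len(words)
```

For a 400-word text, a result near 5% would match the recommendation above.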

meta title

The meta title is the top line of an entry on Google's list of results. The line is written in blue color.

The meta title can also be seen directly on the website. It is displayed at the top of the window, i.e. in the tab.

The meta title is one of the most important factors in search engine optimization. It is important that each URL has its own unique meta title containing the keyword for which the page should rank. The keyword should come first. In addition, the meta title should not be longer than 55 characters (without spaces); if it is longer, it will be truncated with «...».

In order for the meta title to encourage users to click on the link, it should be as natural and informative as possible - in other words: it should offer clear added value.
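In HTML, the meta title is set with the <title> element in the page's head. A minimal sketch (the shop name and keyword are invented examples):

```html
<head>
  <!-- keyword first, kept within the recommended length -->
  <title>SEO web hosting - Example Shop</title>
</head>
```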

  meta description

An important factor of on-page optimization is the meta description. It is a short description of the content of the website. The meta description becomes visible in the search results when a keyword is searched for and a matching result is displayed.

Nothing in the meta description is random. In the backend, an appropriate description can be entered for each page and each product. It is important that it contains the keyword for which the page is to rank.

The length of the meta description should not exceed 156 characters, including spaces. In order to encourage users to click on the link, the meta description should be informative and bring clear added value. It should also contain a so-called «call to action».
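The two length recommendations can be checked automatically, for example with a small script like this sketch (the function name and rules are our own, based on the limits stated above):

```python
def check_meta(title: str, description: str) -> list[str]:
    """Check a meta title/description against the limits recommended in this guide."""
    problems = []
    # meta title: at most 55 characters, not counting spaces
    if len(title.replace(" ", "")) > 55:
        problems.append("meta title too long")
    # meta description: at most 156 characters, spaces included
    if len(description) > 156:
        problems.append("meta description too long")
    return problems
```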


Content

Content refers to all the content elements of a website. This includes the text, but also images, videos and PDF files.

Content is one of the most important ranking factors. It must be informative and offer the user clear added value. The content of a website must be unique and one of a kind.

The following points are particularly important when it comes to content:

  • The texts must be written in an understandable way: short sentences, clear structure. Highlight important words with <strong> </strong>.
  • The same content must not appear twice on a website; that would have a negative impact on the ranking position.
  • For a clear structure, use short and concise headings that are informative and encourage reading.
  • Up-to-date media content should be incorporated in order to offer users added value.
  • Web page text should be at least 300 words; more is better. Basically: as many words as necessary, kept as short as possible.

  Content also plays an important role in SEO.

Call-to-action button

A call-to-action button is a button that prompts the user to take an action, for example: «Order now», «Learn more here».

Updates

The content should always be kept up to date. To do this, it has to be revised and updated again and again.

Local SEO

If you do a search query on Google and add a location, you will get results that can be directly assigned to that location. If a query is made without specifying a location, the results will show entries that are local to the user's region. This is made possible by storing the address during search engine optimization and registering with Google My Business.

Google My Business is a business directory in which information such as the address and opening hours can be stored. Photos and a link to the website can also be added.

page speed

The page speed shows how quickly a website loads. Page load speed is becoming an increasingly important factor; since this summer, it has been weighted even more heavily in Google's algorithm. According to a study by Akamai, «100 milliseconds of additional load time can reduce the conversion rate by up to 7%». A website has to load quickly, not only on the desktop, so that users do not leave the page; especially in the mobile version, the page load time has to be very fast. There are a great many factors that can be tweaked to achieve faster loading times. Some examples are listed below:

  • Images and graphics: optimize size (compress), adjust the format, remove metadata
  • CSS sprites: load recurring data into a single file
  • Responsive images: the size of the image must adapt to the end device
  • Srcset + sizes: a list of all image variants with the applicable conditions, from which the right image for each device size is chosen
  • Redirects (301 or 302): no redirects for mobile devices
  • CSS: separate the style information for the visible area from the style information for the area that is not immediately visible
  • Minification: save the file in the smallest possible form
  • Server load time: update the server software, optimize hosting and hardware

                  The loading times can be accelerated by various measures.
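As an illustration of the srcset + sizes measure from the list above, a responsive image in HTML might look like this (file names and breakpoints are invented examples):

```html
<!-- the browser picks the smallest image file that satisfies the sizes condition -->
<img src="hero-800.jpg"
     srcset="hero-400.jpg 400w, hero-800.jpg 800w, hero-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 50vw"
     alt="Product photo">
```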


Robots.txt

"A Robots.txt file is a file in the root of your website that specifies the parts of your website that you don't want search engine crawlers to access." (Google.com, 2020, online) The file therefore specifies which crawlers are not allowed to access which content.

There are several reasons to use a robots.txt file. One reason is that the crawling budget is not wasted; another is that unimportant pages are not crawled. Image files can be prevented from appearing in Google search results, but other users cannot be prevented from linking to these images.

This SEO measure is not suitable for hiding a page from Google searches. Because if another page links to yours, it can still be indexed.

Even if a robots.txt file is created, the web crawlers can still access the content because the Robots.txt file is only a guideline.
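A minimal robots.txt might look like this (the blocked directories are invented examples; the Sitemap line points the crawlers to the sitemap):

```txt
User-agent: *
Disallow: /backend/
Disallow: /images/

Sitemap: https://www.example.com/sitemap.xml
```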


  Meta Robots

Similar to the robots.txt file, meta robots can give the crawlers instructions, but only for individual documents and not for entire directories. Meta robots are placed in the HTML document and are defined separately for each document.

The following instructions can be given with the meta robots:

  • content=«index, follow»: index the document, follow the links
  • content=«noindex, follow»: do not index the document, follow the links
  • content=«index, nofollow»: index the document, do not follow the links
  • content=«noindex, nofollow»: do not index the document, do not follow the links

Canonical Links

Duplicate content should not appear on any website. However, if two very similar pages are used, or if a page is to be accessible under multiple URLs, canonical links should be used. A canonical link tells Google which of these pages is the right one and should be indexed. If this is not done, Google decides for itself, which can result in the wrong page being indexed and displayed in the search results. A further benefit of canonical links is that they save the Google bots crawling time.

There are different ways to specify the canonical page:

  • Specify preferred domain at Google Search Console
  • Specify canonical pages in sitemap
  • Apply 301 redirect
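In the HTML head, a meta robots instruction and a canonical link look like this (the URL is an invented example):

```html
<head>
  <!-- index this document and follow its links -->
  <meta name="robots" content="index, follow">
  <!-- mark this URL as the canonical version of the page -->
  <link rel="canonical" href="https://www.example.com/products/">
</head>
```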

H tags

H tags are headings that act as subtitles. These subheadings allow users to find their way around the website more quickly and easily. H tags are also an important factor for the search engine and have long been among the most important ranking factors. The titles should be as user-friendly as possible and arouse the interest of users.

 "The 'H' simply stands for 'heading'. The number indicates the hierarchy of the different headings, 1 stands for the top level, 6 for the lowest.” (Seologen.de, 2020, online)

  • H1: main message and topic overview; no keyword stuffing, avoid unnatural wording
  • H2-H3: subheadings; maintain a logical order
  • H4-H6: further subheadings; only suitable for longer texts
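A correctly nested heading hierarchy might look like this in HTML (the headings are invented examples):

```html
<h1>SEO web hosting</h1>           <!-- one H1: the main message -->
<h2>On-page optimization</h2>      <!-- section subheading -->
<h3>Meta title</h3>                <!-- further subdivision, in logical order -->
```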

Redirects

If a move from one URL to a new domain is to take place and duplicate content is to be avoided, a redirect should be made. With a redirect, one URL is automatically forwarded to another. There are the following types of redirect:

  • 301 redirect: permanent redirect, suitable for a website relaunch
  • 302 redirect: temporary redirect, suitable when the website is temporarily unavailable due to an update
  • 307 redirect: short-term redirect, suitable during server maintenance

Benefits of redirects:

  • Forwarding dead links
  • Avoiding duplicate content
  • Avoiding a poor user experience
  • No loss of link juice during a relaunch
  • Forwarding from http to https
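On an Apache server, for example, such redirects can be configured in the .htaccess file (paths and domain are invented examples; other servers use a different syntax):

```apache
# 301: permanently redirect an old page to its new address
Redirect 301 /old-page.html https://www.example.com/new-page.html

# forward all http traffic to https
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```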


internal linking

  With internal linking, links are placed on individual words or sentences. Clicking on the word takes the user to the linked page. "The aim is to have the shortest possible route for the visitor and the crawler to reach the desired destination and the decision to click next must be made as easy as possible for the user." (Spriestersbach, 2015, p. 75)

Internal linking is an important quality factor that allows search engine crawlers to read the website quickly and easily. They start at the home page and follow each link. It is therefore important that the linking is clearly structured and runs from top to bottom. This is the only way for the crawlers to index as many pages of the website as possible.


Internal links are also important for users, because they help keep the bounce rate low. With hard links, exactly the same word appears in the link text as on the target page; with soft links, related words are used.

Pyramid-like internal linking is important. Since the start page should always be the strongest page, it must be linked to the subcategories from there. From these, further links are made to the next subcategories and from these, further links are made to individual products or articles.

The links must not only be placed at the bottom of the page; they should be distributed throughout the text. A prominently placed link in the main part of the page can pass on 60% of the link power, whereas a link placed far down passes on just under 40%. There is no universal guideline for how many internal links are required per page. The rule is: as few as possible, as many as necessary.
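The pyramid structure described above can be checked programmatically. This sketch (the page names and link graph are invented examples) computes the click depth of every page from the start page with a breadth-first search:

```python
from collections import deque

# Hypothetical internal-link graph: each page maps to the pages it links to.
links = {
    "home": ["category-a", "category-b"],
    "category-a": ["product-1", "product-2"],
    "category-b": ["product-3"],
    "product-1": [], "product-2": [], "product-3": [],
}

def click_depths(start: str) -> dict[str, int]:
    """Breadth-first search: minimum number of clicks from `start` to each page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths
```

In a clean pyramid, no product page should be more than a few clicks away from the home page.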


Backlinks

Backlinks are links that point to your own website from other websites. Google analyzes the external links in terms of quality and trustworthiness. Lots of backlinks have a positive impact on rankings, but only if they come from the right sites. If the links come from untrustworthy websites, this can have a negative impact.

Mobile First Index

The classic Internet search on the desktop continues to decline. Already 60% of all search queries on Google are made via a mobile device. It is therefore crucial to offer a mobile version of your own website. It is important that all the content of the desktop version is also available on the mobile version. For this purpose, a responsive design must be used, through which the website adapts to the size of the end device.

The mobile-first index is getting more and more attention from Google. Those who do not have a mobile version of their website will get a lower ranking. The desktop version is currently being analyzed primarily. But that will change. It is believed that the Mobile First Index will be launched later this year.

HTTP status codes

«The HTTP status code (or just status code) is a web server's response to a client's HTTP request. The web server uses the three-digit status code to tell the client whether the request was successful or whether an error occurred.» (Brunner, 2020)

  • 200 OK: no need for action.
  • 301 Moved permanently: only necessary if the website moves.
  • 302 Found: no need for action.
  • 303 See other: the content is available, but at a different URL.
  • 307 Temporary redirect: the URL is in a different location for a short time.
  • 404 Not found: the page is not available.
  • 410 Gone: the page is no longer available or has been removed.
  • 500 Internal server error: an unexpected server error has occurred.
  • 503 Service unavailable: the server is overloaded.

  Status codes and their meaning.
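Python's standard library knows these codes, so a simple triage helper can be built on it. This is a generic sketch of ours; the wording of the advice follows the table above:

```python
from http import HTTPStatus

def triage(code: int) -> str:
    """Return the status line plus a rough SEO assessment, following the table."""
    phrase = HTTPStatus(code).phrase  # raises ValueError for unknown codes
    if code < 300:
        advice = "no need for action"
    elif code < 400:
        advice = "check that the redirect is intentional"
    elif code < 500:
        advice = "page missing: fix the link or set up a redirect"
    else:
        advice = "server-side problem"
    return f"{code} {phrase}: {advice}"
```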

Hreflang attribute

The hreflang attribute can be used to inform the Google bots that there are several versions of a website made for different languages or regions. A good example: the website is in German, but there are other versions in which all of its content is translated into another language. The hreflang attribute indicates which language and region each version is intended for:

  • de-CH: German content for users from Switzerland
  • de: German content for users from Germany
  • en-GB: English content for users from Great Britain
  • fr-BE: French content for users from Belgium
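In the HTML head, hreflang annotations for the versions above might look like this (the URLs are invented examples):

```html
<link rel="alternate" hreflang="de-CH" href="https://www.example.com/ch/">
<link rel="alternate" hreflang="de" href="https://www.example.com/de/">
<link rel="alternate" hreflang="en-GB" href="https://www.example.com/uk/">
<link rel="alternate" hreflang="fr-BE" href="https://www.example.com/be/">
```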

alt text

The alt text contains a description of an image that is as precise as possible. The relevant keywords for the page the image appears on should be used in the alt text. Since the Google bots cannot read the images, it is important to add the alt text to them. This allows them to read and index the images.

Alongside the alt text, the title attribute is also important. A short description of the image is entered here. It becomes visible to the user as soon as he moves the mouse pointer over the image. It is important that the text accurately describes the image; standard text should not be inserted everywhere.
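In HTML, both attributes are set directly on the image element (the file name and description are invented examples):

```html
<img src="red-running-shoe.jpg"
     alt="Red running shoe, side view"
     title="Red running shoe, side view">
```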


JavaScript

"JavaScript is a programming language that is mainly used in browsers" (Perband, 2004, online). The most popular browsers, such as Internet Explorer and Mozilla, run JavaScript by default. Online shops in particular and many large websites use JavaScript to expand individual areas or to add small gimmicks. A good example of this is a menu that automatically expands when the mouse cursor hovers over it. JavaScript is also used for advertising banners or tickers. An advantage of JavaScript is that it can easily be written in a text editor; no additional editors are required. A disadvantage is that the Google bot can only load one JavaScript file at a time. Only when this has been loaded and indexed can the Google bot continue. This takes a lot of time and can result in not all further content being indexed.


Sitemap

Creating a sitemap is recommended to show the search engines which pages they should index. The sitemap lists all of the pages that the Google bot should consider and index. A sitemap is particularly important for websites that have many subpages, as otherwise there is a risk that not all pages will be indexed. In order for the sitemap to be indexed by the search engine, a robots.txt file must reference it.
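A minimal XML sitemap with a single URL might look like this (the URL and date are invented examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-06-01</lastmod>
  </url>
</urlset>
```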

To create a sitemap file, one of the many online tools can be used, or the file can simply be created by hand in a text editor. There are the following types of sitemaps:

  • HTML sitemap: made for the user and serves as orientation; looks like a table of contents and is linked on the website.
  • XML sitemap: written in a special format in which additional metadata is provided about each URL.
  • Image sitemap: gives details about the images and increases the likelihood that they will appear in Google image search.
  • Video sitemap: lists additional information about the videos on the website and helps them to be discovered by the Google bots.
  • News sitemap: lists the content that should appear in Google News.

  4. Keyword analysis

In order to find out which search terms should lead to your own website, a keyword analysis must be carried out. The keywords should describe the content and purpose of the website as precisely as possible. Brainstorming, customer surveys and competitor analysis are good techniques for finding suitable keywords. To get more ideas, you can enter a word in the Google search field; further suggestions are then displayed automatically.

Here are some important keyword analysis terms:

search volume

The search volume indicates how often the keyword was searched for on Google in the past month.


competition

This shows how many Google Ads customers place a paid ad with the respective keyword. The stiffer the competition, the higher the prices for that keyword.

CPC

The «cost per click» figure indicates how much has to be paid when someone clicks on a paid ad triggered by the keyword.

  5. Glossary of terms


B2B

Business-to-business refers to business relationships between at least two companies.

B2C

Business-to-consumer stands for communication and business relationships between entrepreneurs and private individuals.

Off page

Off-page optimization deals with all external factors of a website. As a rule, this refers to backlinks, also called external links or just links.

Traffic

Traffic or user volume is the number of visits to a website within a given period of time.

XOVI Suite

XOVI Suite is an online analysis tool for SEO. Detailed analysis reports can be created with it. A current analysis of the last seven days is sent weekly.

Google search interface

The input field in which the search terms are entered on Google.



   Search Engine Optimization (SEO)

  Search Engine Optimization refers to measures that serve to ensure that websites appear higher in search engine rankings in the unpaid search results. Search engine optimization is a part of search engine marketing (SEM).

      Ads campaigns

   Ads campaigns are paid advertisements that can be created and run on Google Ads. These appear at the top of the search results and are identified by a green icon that says «Advertisement».

   on page

  On-page optimization deals with all content adjustments of your own website. It therefore describes optimization measures that you can carry out yourself on your own site and that cannot be influenced from outside or by third parties.


     Organic search results

   Organic search results are usually the ten search results that are displayed by a search engine and for which no payment was made. These are placed below the paid ads.


   Keyword

  A keyword is a text unit, usually a common term, that either occurs in the text itself or with which a text can be indexed. This can be a word or a combination of several words, numbers or characters.


     Spider/web crawler

   A webcrawler (also spider, searchbot or robot) is a computer program that automatically searches the World Wide Web and analyzes websites. Web crawlers are mainly used by search engines to index websites.

   Google bot

  The Google bot is a web crawler from the US company Google Inc. The computer program independently downloads World Wide Web content and feeds it to Google's search engine.


Deepbot

A deepbot is the search engine robot from Google or another search engine that searches all pages of a website, i.e. goes into depth.

Google Ads

Google Ads is an advertising program from Google. Paid ads can be created and placed with it.

Domain

A domain (also domain name) is an Internet address. It is used to give a server a memorable name under which a website can then be accessed.

Backend

The functions of a website are programmed and defined in the backend. The website is also designed in the backend.

Algorithm

The algorithm is the search engine's method for weighting and evaluating websites.

HTML document

An HTML document is a text-based markup document for structuring digital content such as text with hyperlinks, images and other media.

Google Search Console

Search Console is a free service provided by Google to monitor and manage a website's presence in Google search results.

Sitemap

A sitemap is the complete, hierarchically structured representation of all individual documents on a website.

Keyword stuffing

Keyword stuffing is the en-masse use of keywords. It goes against Google's guidelines, making it one of the prohibited SEO methods.

Dead link

A dead link is a reference on the World Wide Web that points to a non-existent resource.

Link juice

Link juice refers to the passing on of all the positive properties of a website (e.g. PageRank, TrustRank, content etc.) through a link.

Bounce rate

The bounce rate shows the percentage of visitors to a page who left it immediately without visiting any other pages.

Link power

The link power refers to the passing on of the positive properties of a link. For example, a link can inherit PageRank or TrustRank.

Mobile First Index

With the mobile-first index, Google uses information found on the mobile version of a website to compile search results.

           conversion rate

   The conversion rate describes the ratio of visits/clicks to conversions achieved. Conversions are conversions from prospects to customers or buyers. For example, they can consist of purchases or downloads.

    duplicate content

   Duplicate content refers to the display of the same content on the website or on different websites. This applies to websites of the same as well as different domains.

                Responsive design

   Responsive design is a creative and technical paradigm for creating websites so that they can react to the properties of the end device used, especially smartphones and tablets.

   Alt text

Alt text verbally describes the content of an image.

   Browser

Browsers are special computer programs for displaying websites on the World Wide Web or documents and data in general.

     Robots.txt file

   A Robots.txt file is a text file that specifies which directories can and cannot be read. When a website is called up, the crawlers first look for the Robots.txt file and interpret it.


   Metadata

 Metadata is structured data that contains information about characteristics of other data. The data described by metadata are often larger data collections such as documents, books, databases or files.

   Broken links

   If files/websites are moved or deleted, old links that still point to this