Where does the term SEO (Search Engine Optimization) come from?
The first search engines emerged in the early 1990s, and many more were created before Google appeared in 1996, including Yahoo. Then the web page boom began, and people realized that websites could actually make money. They naturally concluded that they needed to attract traffic, and what was the best way to attract traffic? The search engines, of course. At that moment website owners began to wonder how they could reach the top positions... and SEO was born!

SEO focuses on the organic search results, i.e., the results that are not paid for:

Organic results
But let's get to what matters and the reason why (I think) you are reading this chapter:

1. What is SEO?
According to Wikipedia, SEO is:
Search engine positioning, or search engine optimization, is the process of improving the visibility of a website in the organic results of the different search engines. It is often referred to by its English name, SEO (Search Engine Optimization).

SEO is one of the "disciplines" that has changed the most in recent years. We only have to look at the large number of Penguin and Panda updates, and how they have turned what was understood as SEO until recently through 180 degrees. What SEO now pursues is what Matt Cutts himself calls "Search Experience Optimization", or in other words: everything for the user.

Although there are thousands of factors on which a search engine bases its decision to rank one page over another, it could be said that there are two basic ones: authority and relevance.

Authority is basically the popularity of a website: the more popular a page or site is, the more valuable the information it contains is assumed to be. Search engines take this factor into account because it is based on users' own experience; the more a piece of content is shared, the more users have found it useful.
Relevance is the relationship a page has to a given search. This is not simply a matter of a page containing the searched term many times (in the beginning it worked that way); search engines rely on hundreds of on-site factors to determine it.
In turn, SEO can be divided into two major groups: on-site SEO and off-site SEO.

On-site: On-site SEO is concerned with relevance. It makes sure the website is optimized so that the search engine understands the main thing: its content. On-site SEO includes keyword optimization, load time, user experience, code optimization and URL formatting.
Off-site: Off-site SEO is the part of SEO work that focuses on factors external to the web page we are working on. The most important factors in off-site SEO are the number and quality of links, presence on social networks, mentions in local media, brand authority and performance in the search results, that is, the CTR our results achieve in a search engine.

You are probably thinking that all this is very interesting, but that you are here to learn why you need SEO on your website and what benefits you will get by integrating it into your online strategy.
SEO can also be divided according to whether or not we follow the search engine's "recommendations": Black Hat SEO or White Hat SEO.

Black Hat SEO:   Black hat refers to attempts to improve the search engine ranking of a web page using unethical techniques or techniques that contradict the search engine's guidelines. Some examples of Black Hat SEO are cloaking, spinning, spam in forums and blog comments, and keyword stuffing. Black hat can provide short-term benefits, but it is generally a risky strategy with no long-term continuity and no added value.

White Hat SEO:   Consists of all the ethically correct actions that meet the search engines' guidelines for ranking a web page in the search results. Because search engines give greater weight to the pages that best answer a user's search, White Hat encompasses techniques that seek to make a page more relevant to search engines by providing value to its users.

2. Why is SEO important?
The most important reason why SEO is necessary is that it makes your website more useful for both users and search engines. Although search engines become more sophisticated every day, they still cannot see a web page the way a human does. SEO is needed to help them understand what each page is about and whether or not it is useful to users.

Now let's look at an example to make things clearer:

Suppose we have an e-commerce site dedicated to selling children's books. For the term "coloring pictures" there are about 673,000 searches per month. Assuming that the first result after a Google search gets 22% of the clicks (CTR = 22%), we would get about 148,000 visits per month.

Now, how much are those 148,000 visits worth? If the average cost per click for that term is €0.20, we are talking about roughly €30,000 a month. And that is for a single country. If we have a business oriented to several countries, consider that around 1.4 trillion searches are performed worldwide every year; of those searches, 70% of the clicks go to organic results, and 75% of users never reach the second page. Taking all this into account, there are a great many clicks per month at stake for the first result.
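The arithmetic behind this example can be sketched in a few lines of Python. The search volume, CTR and CPC figures are the illustrative values from the example above, not real market data:

```python
# Estimate the monthly value of ranking first for a keyword,
# using the illustrative figures from the example.

monthly_searches = 673_000   # searches per month for "coloring pictures"
ctr_first_result = 0.22      # assumed CTR of the #1 organic result
avg_cpc_eur = 0.20           # assumed average cost per click, in euros

monthly_visits = monthly_searches * ctr_first_result
monthly_value_eur = monthly_visits * avg_cpc_eur

print(f"Estimated visits per month: {monthly_visits:,.0f}")
print(f"Equivalent monthly ad spend: EUR {monthly_value_eur:,.2f}")
```

That is, about 148,000 visits, equivalent to nearly €30,000 of paid traffic every month.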

SEO is the best way for users to find you through the searches for which your website is relevant. Those users are looking for exactly what you offer them, and the best way to reach them is through a search engine.

3. How do search engines work?
The operation of a search engine can be summarized in two steps: crawling and indexing.


A search engine crawls the web using what are called bots. These bots travel through the pages via links (hence the importance of a good link structure), just as any user browsing the web would: they go from one link to the next and collect data about those pages for their servers. The crawl process begins with a list of web addresses from previous crawls and from sitemaps provided by other websites. Once the bots access these websites, they look for links to other pages to visit. Bots are especially attracted to new sites and to changes in existing ones.
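A toy version of this crawl loop can be sketched in Python using only the standard library. A real bot also handles politeness delays, robots.txt, and deduplication at a vastly larger scale; here the page-fetching function is injected so the traversal logic stands on its own:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, as a bot would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_urls, fetch, max_pages=100):
    """Breadth-first crawl: start from seed URLs (e.g. from sitemaps or
    previous crawls), fetch each page, and follow the links it contains.
    `fetch` is a function url -> HTML string."""
    queue = deque(seed_urls)
    visited = set()
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)
        parser = LinkExtractor()
        parser.feed(fetch(url))
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if absolute not in visited:
                queue.append(absolute)
    return visited
```

In production, `fetch` would issue an HTTP request and respect the site's robots.txt before downloading anything.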

It is the bots themselves that decide which pages to visit, how often, and for how long they will crawl a website, so it is important to have an optimal load time and updated content.

It is very common to need to restrict the crawling of certain pages or certain content on a website to prevent them from appearing in the search results. For this, you can tell search engine bots not to crawl certain pages through the "robots.txt" file.
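For example, a minimal robots.txt, placed at the root of the site, that blocks all bots from two sections while allowing everything else might look like this (the paths are of course hypothetical):

```
User-agent: *
Disallow: /admin/
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt only restricts crawling; to reliably keep an already-known page out of the index, a noindex directive is normally used instead.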


Once a bot has crawled a web page and collected the necessary information, the page is included in an index, where it is sorted according to its content, authority and relevance, so that when we query the search engine it is much easier to show the results most closely related to our query.

At first, search engines were based on the number of times a word was repeated on a page. When a search was made, they looked up those terms in their index to find which pages contained them in their texts, ranking higher the page that repeated them the most. Today, search engines are more sophisticated and base their indexes on hundreds of different aspects: the date of publication, whether the page contains images, videos or animations, microformats, etc. They now give more priority to content quality.
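The early, purely repetition-based approach described above can be sketched as a tiny inverted index in Python. This is only an illustration of the original idea; as the text says, real indexes weigh hundreds of additional signals:

```python
from collections import defaultdict

def build_index(pages):
    """Map each word to {url: occurrence count} — an inverted index."""
    index = defaultdict(dict)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word][url] = index[word].get(url, 0) + 1
    return index

def search(index, term):
    """Rank pages by how often they repeat the term (the early approach)."""
    hits = index.get(term.lower(), {})
    return sorted(hits, key=hits.get, reverse=True)

# Hypothetical mini-corpus:
pages = {
    "site-a": "coloring pictures for kids coloring books",
    "site-b": "pictures of cars",
}
index = build_index(pages)
print(search(index, "coloring"))   # only site-a mentions it
print(search(index, "pictures"))   # both pages match
```

The weakness is obvious: repeating a keyword many times (keyword stuffing) was enough to rank, which is exactly why search engines moved beyond this model.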

Once the pages have been crawled and indexed, it is time for the algorithm to act: algorithms are computer processes that decide which pages appear earlier or later in the search results. After a search is made, in a matter of milliseconds, the algorithms are able to search the indexes and determine which pages are the most relevant, taking into account the hundreds of ranking factors.
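As a toy illustration of that final step, a ranking function might combine the two basic factors discussed earlier, relevance and authority, into a single score. The 0.7/0.3 weights and the candidate scores are invented for the example; real algorithms weigh hundreds of factors, not two:

```python
def rank(results):
    """results: list of (url, relevance, authority), each signal in [0, 1].
    Returns URLs ordered by a weighted score (hypothetical weights)."""
    def score(item):
        _, relevance, authority = item
        return 0.7 * relevance + 0.3 * authority

    return [url for url, _, _ in sorted(results, key=score, reverse=True)]

candidates = [
    ("site-a", 0.9, 0.3),   # very relevant, modest authority
    ("site-b", 0.6, 0.9),   # less relevant, very popular
    ("site-c", 0.3, 0.3),   # weak on both
]
print(rank(candidates))  # ['site-a', 'site-b', 'site-c']
```

Changing the weights changes the ordering, which is essentially what updates like Panda and Penguin did at massive scale.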