Guest Post by Joseph Dowdy
While creating my Udemy course “Titanic Lies About SEO”, I was challenged to create an easy way for my students to understand how search engines work.
After all, SEO can seem like a complicated concept, and there are a ton of misconceptions about the industry as a whole. Case in point: my entire course is about deconstructing the titanic lie about SEO that permeates the world of SEO professionals and customers!
While it’s true that even as an SEO pro there are some things that may remain *mysterious* to me, Google has given us clues as to how search engines function. To make sense of these clues, I have put together what I call the “5 M’s of SEO”.
The 5 M’s of Search Engine Optimization
Essentially, each M is a fundamental component of how search engines work. And, if you know the basics, then you can be well on your way to creating and implementing your own SEO strategy – without getting overwhelmed with the data and myths!
Ready to learn more? Let’s go!
Map
Google and other search engines map the World Wide Web address by address, using software to scan, spider, and crawl page by page and then word by word. This allows the search engine to later provide links to the locations of pages based on the words on those pages.
You may already know the terms “spider” and “crawl”, but these are really just software bots written to start with a list of known web page addresses (combined with Sitemap data) and record results from each page.
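To make the idea concrete, here is a toy sketch of that crawl loop. This is not Google's actual crawler; the page addresses, their contents, and their links live in an invented in-memory dictionary purely for illustration.

```python
from collections import deque

# A hypothetical miniature "web": address -> (page text, outgoing links).
PAGES = {
    "example.com/": ("welcome to our seo guide", ["example.com/tips"]),
    "example.com/tips": ("five seo tips for beginners", ["example.com/"]),
}

def crawl(seed_urls):
    """Visit each known address once, recording the words found there."""
    index = {}                    # address -> list of words on that page
    queue = deque(seed_urls)      # start from a seed list (like a Sitemap)
    seen = set(queue)
    while queue:
        url = queue.popleft()
        text, links = PAGES.get(url, ("", []))
        index[url] = text.split() # record the page word by word
        for link in links:        # follow links page by page
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

index = crawl(["example.com/"])
```

Starting from just the seed address, the crawler discovers and records both pages, which is exactly why a page that no known page links to (and that isn't in a Sitemap) may never get mapped.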
It's useful to know that this is the beginning of the process, because if your site is not being scanned and you want to be in the search results, then this is the first thing you need to fix.
You can help Google scan your site by registering it with Google Search Console (formerly Google Webmaster Tools) and providing a Sitemap. You also need to make sure your site isn't blocked from being scanned, as WordPress and other platforms have options to disallow scanning.
There are also options to prevent scanning of specific pages and to tell Google how often to scan each page. The bottom line is that you can help search engines map what you want mapped, as often as you want them to map you.
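The standard mechanism behind those "disallow scanning" options is a robots.txt file published by your site. As a sketch of how a crawler reads those rules, Python's built-in `urllib.robotparser` can be fed a robots.txt directly; the rules below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules a site might publish.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A well-behaved bot may fetch the homepage but not the disallowed folder.
print(parser.can_fetch("Googlebot", "https://example.com/"))           # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

Note that robots.txt only stops well-behaved crawlers; it is a request, not an enforcement mechanism.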
By the way, there are also “bad” software scanning bots out there which can affect your site performance and you can take action to block them if they are problematic.
Measure
While not all search engines measure the same way, they all measure what they find on the map.
Google is specifically looking for quality. They measure security by checking if there is an SSL certificate and if it’s valid. They measure how quickly, if at all, each page loads. Google has publicly stated that they measure not just how quickly the page loads, but also if it is mobile-friendly.
Each of these measures is a “signal” of sorts, and these signals tell Google what quality to assign to each page at each address, along with the words on the page. Google assesses which keywords might be a good match for the page compared to other pages that might match the same keywords. In general, Google is measuring how appropriate a page is for each possible keyword.
There are well over a hundred “signals” Google tracks for each page, and this is where Google assesses each page against other pages to create what becomes its search engine ranking for each keyword.
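The idea of combining signals into a quality assessment can be sketched as a scoring function. To be clear, the signal names, thresholds, and weights below are invented for illustration; Google's real measurement uses far more signals and is not public.

```python
def quality_score(signals):
    """Toy quality score from a few of the signals mentioned above."""
    score = 0.0
    if signals.get("valid_ssl"):        # security: HTTPS with a valid cert
        score += 1.0
    if signals.get("mobile_friendly"):  # usability on phones
        score += 1.0
    load = signals.get("load_seconds", float("inf"))
    if load < 1.0:                      # speed: faster pages score higher
        score += 1.0
    elif load < 3.0:
        score += 0.5
    return score

fast_secure = {"valid_ssl": True, "mobile_friendly": True, "load_seconds": 0.8}
slow_http = {"valid_ssl": False, "mobile_friendly": False, "load_seconds": 6.0}
```

Comparing `quality_score(fast_secure)` against `quality_score(slow_http)` mirrors the comparison Google makes between competing pages for the same keyword.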
Match
As you type something into a search engine, it is ready to give you a result based on the match between what you typed and what has already been mapped and measured. This matching of your words to each page's keywords, using fuzzy logic and other algorithmic techniques, is really why people use search engines. It is the only step that the user of a search engine actually sees.
As you type your query into the search engine, you may see a predictive completion of the words you enter. This predictive completion can give you insight into the popularity of that search.
Few people know how tightly the predictive search correlates to what will appear on a search engine results page, but this can give you a clue as to what that search engine already knows about what you are typing.
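Predictive completion itself is easy to picture: take what has been typed so far and look up past queries that begin with it. Here is a minimal sketch over a hypothetical query log (real engines also weight completions by popularity and freshness):

```python
import bisect

# A hypothetical log of past queries.
past_queries = ["seo basics", "seo tools", "search console", "site speed"]

def complete(prefix, queries):
    """Return past queries that start with the typed prefix."""
    qs = sorted(queries)
    i = bisect.bisect_left(qs, prefix)  # jump to the first candidate
    out = []
    while i < len(qs) and qs[i].startswith(prefix):
        out.append(qs[i])
        i += 1
    return out

print(complete("seo", past_queries))  # ['seo basics', 'seo tools']
```

Typing just "seo" surfaces the completions the engine already knows about, which is the clue the paragraph above describes.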
Monitor
This is the part that I always found to be very spooky.
People don’t realize this but Google is watching you very closely to see what happens after you see the matches that they offer you. They monitor if you have clicked on the 1st match or the 2nd match or if you go to the 5th page of matches to see which one you think is the best match for your query.
Then they look at what you do next: Do you search again? Do you not search again? Do you type a new search? Do you type an unrelated search? Do you type several new searches but with different variations each time?
Google monitors your behavior to see if what they offered you and that you clicked was what you were actually trying to find. And if not, then Google is monitoring that behavior as well.
This measure of how long you spend with each match you click is known as “dwell time” and the longer that is, the better that search result is according to Google. If the amount of time is minimal (no one knows for sure if it’s less than 10 seconds or less than 30 seconds or less than a minute), then that match is not “good” in Google’s eyes.
Based on what Google sees that you have done with the search engine matches they provided, Google will either raise that match higher if they see an increase in “good” matches (in other words, the person searching did not click the ‘BACK’ button and they didn’t search again for the same thing) or will lower that search result if they see an increase in “bad” matches (someone searches again, clicks ‘BACK’ quickly, etc.).
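That feedback loop can be sketched as a simple adjustment rule. The thresholds and step sizes below are invented for illustration (as the text notes, no one outside Google knows the real cutoffs):

```python
def adjust_score(score, dwell_seconds, searched_again):
    """Toy rank adjustment based on searcher behavior after a click.

    A very short dwell time or an immediate re-search reads as a "bad"
    match; a long dwell time reads as a "good" one.
    """
    if dwell_seconds < 10 or searched_again:  # hypothetical "bad" signal
        return score - 1
    if dwell_seconds > 60:                    # hypothetical "good" signal
        return score + 1
    return score                              # inconclusive: no change
```

Run over millions of searches, small nudges like this are how a result that satisfies searchers drifts up the page while one that sends them straight back drifts down.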
Modify
Google uses all of this information to adjust what it found in the first M, Map. Search engine results pages will be modified if someone clicks on a match and gets a 404 error or the page doesn't load, though Google may only be able to infer this because the person who gets that result quickly clicks the ‘BACK’ button.
It is also quite possible that Google uses its Chrome browser to monitor these kinds of things; if a virus or malware is encountered, Google will quickly modify its matches to exclude that site going forward, and keep it that way until the site is mapped and measured again.
If you like the way I’ve explained all this then you will probably enjoy my Udemy course, “Titanic Lies About SEO”. Check it out to get started on your own SEO strategy today.