Kindle Notes & Highlights
by
Eli Schwartz
Read between
January 14 - January 23, 2023
Discovery is the algorithm that crawls the web to identify new pages and sites that Google has not previously indexed.
The discovery algorithm simply looks for URLs and matches them against known URLs. When it finds a new URL not already on the list, it queues it for future crawling.
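The discovery step described in this highlight — match each found URL against the list of known URLs and queue anything new for a future crawl — can be sketched as a seen-set plus a crawl queue. This is an illustrative sketch, not Google's actual implementation; all names are invented:

```python
from collections import deque

def discover(found_urls, known_urls, crawl_queue):
    """Queue any URL not already on the known list for future crawling."""
    for url in found_urls:
        if url not in known_urls:      # new URL: not previously seen
            known_urls.add(url)        # remember it so it is queued only once
            crawl_queue.append(url)    # schedule it for a future crawl

known = {"https://example.com/"}
queue = deque()
discover(["https://example.com/", "https://example.com/about"], known, queue)
# only the previously unseen URL ends up in the queue
```

The seen-set is what makes discovery cheap: matching against known URLs costs a set lookup, so only genuinely new pages consume crawl capacity.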
The crawling algorithm is designed to crawl and understand the entire web.
The indexing algorithm determines how to cache a webpage and what database tags should be used to categorize it. This algorithm draws on library science theories as it files away pages from the internet into specific databases.
Ranking uses the information from the first three algorithms to apply a ranking methodology to every page. Once Google has accepted a URL into its index, it utilizes traditional library science to categorize the page for future ranking.
Crawling does not guarantee something will get indexed, and indexation does not necessarily mean something will get traffic and rank on search results.
According to Google, there are five primary factors that driv...
The intent of the query matches more than just the words on the page. Google’s goal is to understand the meaning behind the words.
Relevance draws on Google’s taxonomy classification. In essence, Google tries to produce results that match the query’s domain, so a human medical query is matched with content about human medicine rather than content that uses similar anatomical wording about animals.
Google uses AI to decide whether a user will be satisfied with what they find on a page. This can include everything from assessing the content’s readability to placement on the page, size of ads, and links to the page, with more authoritative links indicating higher quality.
Each ranking score is important, but usability is critical: poor usability can demote your page or remove it from results altogether.
Google uses many location and time signals to best answer user queries.
fifth primary algorithm, which is tasked with understanding a user’s query at a deeper level.
This algorithm doesn’t directly impact the rankings of websites for queries; rather, it rewrites the actual queries into what Google believes the user is searching for.
A successful SEO effort will include strategies that address all of Google’s algorithms and ranking-score factors. We must ensure that all content is discoverable, crawlable, and indexable, and that it provides an excellent user experience.
Google wants to ensure a pleasurable user experience for the searcher. Nothing more, nothing less.
Instead of chasing the algorithm, every website that relies on organic search should train its focus on the user experience. The user is the ultimate customer of search.
Once there are a few pages of content, you can go about getting some links. There are some easy links to be had in social media profiles.
The goal is not to generate traffic for traffic’s sake but to generate engaged users who will eventually become paying users.
Many keyword ideas are measured by their visible rankings on search engines, which don’t necessarily translate into clicks, and certainly not into revenue. The ideas you implement for SEO should be so relevant to the user base that users want to click through from search, and a good number of them will convert.
The best way to get to product-market fit is to learn from users and really understand what they want. Even better would be to take this user empathy and build for personas that will be the most profitable for the business.
Content should be treated similarly. Content should never be deployed and then not measured. Unlike other marketing methods, content is inherently trackable. It should earn its keep.
Deployed effectively, content can have an ROI in the thousands of percent over many years, but content with no purpose will never have any return.
knowing how to architect a website into folders and files, the types of content to create, and the personas of the potential users (we will touch on this later); learning from performance to optimize for growth; and, most of all, building a product that resonates with real users.
every audit will include at least a look at these high-level areas.
Penalty analysis—Are there any unexplained drop-offs in metrics that align with either Google manual actions or known algorithmic updates?
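The penalty-analysis question above — do drop-offs in metrics align with known algorithmic updates? — amounts to comparing dates. A minimal sketch, assuming a hand-maintained list of update dates and a tolerance window (all dates here are invented):

```python
from datetime import date, timedelta

# Hypothetical dates of known algorithm updates, maintained by the auditor.
known_updates = [date(2023, 3, 15), date(2023, 8, 22)]

def align_dropoffs(drop_dates, updates, window_days=7):
    """Map each traffic drop-off to the known updates within the window.

    A drop with an empty list is unexplained and may warrant a
    manual-action check instead.
    """
    window = timedelta(days=window_days)
    return {
        drop: [u for u in updates if abs(drop - u) <= window]
        for drop in drop_dates
    }

matches = align_dropoffs([date(2023, 3, 17), date(2023, 6, 1)], known_updates)
```

Here the March 17 drop aligns with the March 15 update, while the June 1 drop matches nothing and stays "unexplained" in the audit.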
URL structure—Do URLs have a nice, clean structure to make it clear to both users and search engines what is contained on each page? (Ideally, there shouldn’t be any parameters in the U...
Duplicate content and canonical usage—Duplicate-content issues cause Google to have to make a decision about which URL to index. This may not be the desired URL, so ...
Internal links—Are internal links in good working order for proper crawling and indexation?
Backlinks—Which sites link to our site, and are they helping or hurting us? For a big site, understanding the mix of backlinks can be an audit unto itself.
Indexation—Is the site properly indexed in search? What is holding it back? In my opinion, this is the most important part of any audit.
Script usage—Which scripts are being used, and what are the implications? Despite Google’s proclamations to the contrary, using JavaScript is ...
Keyword usage—What keywords are being used, and what gaps exist? Keywords are the bulwark of any SEO campaign, and mapping them can often lead to opportunities.
On-page SEO—What on-page elements (title tag, meta description, H1, H2, etc.) are being used?
Content quality—What content is being used, and of what quality? SEO is driven by content, but poor content can actually be harmful.
Robots.txt—How effective are the directions to search engines on which pages of the site can be crawled? Overdoing the restrictions will leave important pages without traffic, while underdoing them will let useless pages be crawled.
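Auditing those robots.txt directives can be done programmatically: Python's standard `urllib.robotparser` answers "is this path crawlable?" for a given set of rules. The rules and URLs below are invented for illustration:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block a low-value section, allow everything else.
robots_txt = """\
User-agent: *
Disallow: /internal-search/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# An important page should remain crawlable; the low-value
# section should be blocked from consuming crawl budget.
allowed = parser.can_fetch("*", "https://example.com/products/widget")
blocked = not parser.can_fetch("*", "https://example.com/internal-search/q")
```

Running every important URL on the site through a check like this is a quick way to catch an overzealous `Disallow` before it costs traffic.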
Sitemaps—How effective are the current XML and HTML sitemaps? They are both helpful and necessary for page discovery.
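For reference, the XML sitemap being audited here is a small, well-specified document: a `urlset` of `url`/`loc` entries under the sitemaps.org namespace. A minimal sketch built with Python's standard library (URLs are placeholders):

```python
import xml.etree.ElementTree as ET

# The standard namespace from the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

for page in ["https://example.com/", "https://example.com/about"]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page  # one <loc> per discoverable page

sitemap_xml = ET.tostring(urlset, encoding="unicode")
```

An audit then checks the inverse direction: every indexable page should have a `<loc>` entry, and no blocked or duplicate URL should.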
Site speed—How fast do pages and the site load? Page and site speeds are factored into the Google algorithm for very slow sites, but even if there’s no algorithmic issue, very slow loads will lead to a poor user experience and, therefore, poor conversions.
Schema markup—What markup is currently in place, and what additional markup is available to help us find new opportunities for growth? In a world of voice assistants, schema markup is increasingly important, as it helps search engines understand context.
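Schema markup is typically embedded as JSON-LD using the schema.org vocabulary. A minimal illustrative example — the headline and author values are placeholders, not from the book:

```python
import json

# Hypothetical Article markup using the schema.org vocabulary.
article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How Search Discovery Works",
    "author": {"@type": "Person", "name": "Jane Doe"},
}

# This string would be embedded in the page inside a
# <script type="application/ld+json"> tag.
snippet = json.dumps(article_jsonld, indent=2)
```

The audit question then becomes which pages carry markup like this, and which eligible page types (products, FAQs, events, etc.) are missing it.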
Mobile versus desktop—How will mobile search experiences interact with the site?
There are many websites that would fail an SEO best-practices test but still do very well on search. Similarly, there are websites that check every SEO box but hardly generate any SEO traffic.
For example, before a company reaches the point of diminishing returns on its paid-marketing spend, it should be returning at least $2 for every $1 it spends.
It is amazing to me that anyone would still use rankings as a success metric for an SEO campaign. Rankings are a vanity metric and do not directly, or even indirectly, contribute to the success of a business.
The primary success metric for SEO is and should always have been the same for every marketing channel: the amount of revenue, leads, visitors, etc., the business needs to be successful.
Organic search traffic will be mostly top-of-funnel in these cases. In the case where revenue can’t be measured, the fallback measurement option should be clicks from search engines, but an effort should still be made to determine that the clicks are of value.
the business should still be looking at specific metrics from organic sources of traffic using Google Analytics or similar tracking software. These metrics include engagement rate, bounce rate, pages per visit, and time on site.
In lieu of revenue reporting for SEO, use a metric like lead forms completed or demo requests, or even measure clicks on a call-to-action button.
Organic can and should focus on traffic that is less competitive and a lot higher in the funnel.
Organic can also help in the mid-funnel for users who might not yet be ready to click the buy button. They may be willing to take an intermediary step, like joining a webinar or viewing a demo from a sales rep.
Paid should pick up the baton where organic is less targeted. Paid retargeting could follow organic users around the internet and remind them to come back and buy. Additionally, paid could dominate brand placements at very low cost in a way that organic never could.