SEO Hierarchy of Needs: A Guide for Beginner Law Firms
By Mandy Wakely
Think back to your college psychology classes. Remember Maslow’s hierarchy of needs, the theory that states our human needs must be met in a certain order? For example, something like being respected by one’s peers does not matter if physiological needs like breathing and eating are not being met.
Just as Maslow so neatly organized our human needs in an ascending order, with the most basic and critical at the bottom of a pyramid, SEO has its own hierarchy of needs.
Understanding this order of importance will help demystify SEO and get your firm’s web content seen.
SEO, short for “search engine optimization,” is the process of maximizing quality website traffic through organic, unpaid search engine results. Search engines want to provide users with information that is relevant, trustworthy and authoritative, and they use algorithms that are not fully public to rank content on the web accordingly.
For example, a Seattle resident who has been involved in a car accident may search online for local attorneys who can help with their case. A Google search for “Seattle accident attorneys” returns more than 3 million total results, but only around twenty items, including paid advertising, map listings and organic results, show on the first page. Of those, only ten are organic, unpaid listings.
Every attorney who would like to have a chance to earn that searcher’s business is trying to get onto that first page of results. SEO is the way for them to do that.
People click on organic content in search engine results pages over paid advertisements nearly 98 percent of the time. Organic results appear more credible: people doing research online are looking for genuine information, and they are savvy enough to recognize it.
Users are also not likely to move beyond that first page of their search results. The first few websites that appear on a results page get 95 percent of clicks. To put it simply, that old real estate adage “location, location, location” is indeed true on the internet too.
Not all search engines are created equal
The word “Google” has become a proprietary eponym, a brand name that stands in for an entire category, just like Velcro, Band-Aid or Xerox. There are around 30 other search engines besides the web giant, Bing and Yahoo for example, but Google has by far the biggest market share.
As far as SEO goes, Google is king. Over 90 percent of web searches occur on Google (including Google Images, Maps, Scholar, etc.) and Google-owned sites like YouTube. Because of this, most SEO energy should be spent on Google for the best return on effort.
The hierarchy of SEO needs
Tweaking small things about your website will make it look better to the search engine algorithms that determine what people see when they conduct a search. The higher you place within the results pages, the easier it is for potential clients to find you.
Although plenty of companies offer technical SEO services for a price, SEO itself is not something that can be bought. SEO is a very worthwhile endeavor; however, it can be overwhelming.
Therefore, breaking SEO up into graduated steps is smart: it gives a starting point. Whether a firm must prioritize its marketing dollars or an interested and willing marketing do-it-yourselfer is just jumping into SEO, here is what to focus on first.
Crawlability
The amount of information on the internet is astounding and continuously growing. Search engines constantly scour the web for new content using bots called crawlers or spiders. Crawlability refers to the process by which those bots find content; it forms the foundation of the SEO hierarchy of needs pyramid.
Crawlers find URLs and move through a website via links, picking up information along the way to take back to the search engine’s servers. URLs are one of the things bots use to understand what content is about, so they should be short and descriptive to be easily crawlable. Bots and users alike prefer straightforward, clean web addresses; all those ugly numbers and symbols in a URL are bad for SEO.
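For illustration, here is the difference between an unfriendly address and a clean, descriptive one (both URLs are hypothetical examples):

```txt
Hard for bots and people to read:
https://www.examplefirm.com/index.php?id=8472&cat=3&sess=9fa2

Short and descriptive:
https://www.examplefirm.com/practice-areas/car-accidents
```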
Certain content can be hidden from crawlers using a robots.txt file. This file asks bots not to crawl certain areas of a site, such as old, duplicate or broken content (e.g., an expired promotional page), and keeps useless content out of search engine results pages to optimize the crawl budget.
The crawl budget is the average number of URLs a bot will crawl before it leaves a site. Although this is a concern mostly for big websites containing thousands of URLs, blocking the content that does not need to be seen ensures that the crawlers will look at what does.
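As a sketch, a minimal robots.txt file placed at the root of a site might look like this; the directory names and sitemap URL are hypothetical examples:

```txt
# robots.txt — ask all bots (User-agent: *) to skip outdated sections
User-agent: *
Disallow: /old-promotions/
Disallow: /drafts/

# Point crawlers at the sitemap listing the pages that should be crawled
Sitemap: https://www.examplefirm.com/sitemap.xml
```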
When a crawler encounters an issue, it does not stay to figure it out. If a page or section is obscured by something like a password or login requirement, that content will not appear in search engine results pages. It is just as important to make sure crawlers can move through a website as it is to make sure they can reach it; otherwise, they may never get past the homepage, and users will not find deeper content.
Although they are always becoming smarter, search engine bots still need a little help to boost crawlability. As the most basic aspect of SEO, this is not a place where mistakes can be afforded. After all, the best, most expensive website on the internet does no good if it cannot be found.
Indexability
Indexability is just a small step beyond crawlability: it is how search engines organize the content brought back to their servers by bots in the crawling process.
The index of a search engine is simply a database of all the information that has been deemed worthy to display to users in a search engine results page. The size of Google’s index is hard to imagine. It contains hundreds of billions of webpages and over 100,000,000 gigabytes of information.
Even when a website has been successfully crawled, certain things may prevent it from being indexed. These include:
- More than one version of a site, such as a mobile version and a desktop version with different content (mobile versions get preference)
- Duplicate content
- Canonical declarations
When a more authoritative source for certain information exists, the lesser one will not be indexed. If something is not indexed, it cannot be found through a search engine.
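A canonical declaration is a single line of HTML in a page’s head that tells search engines which version of a page is the preferred one to index. A sketch, using a hypothetical URL:

```html
<!-- Tells search engines this is the page's preferred, indexable address -->
<link rel="canonical" href="https://www.examplefirm.com/practice-areas/car-accidents" />
```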
Accessibility
Accessibility refers to how easy a website is to display and use. Once content has been crawled and indexed, it needs to be friendly to both bots and users. Sites that perform well and follow the rules of structure are more accessible than those that do not, and this lends itself to higher rankings in search engine results pages.
Accessibility is a broad category which includes both technical and human aspects. A few practical features that increase accessibility include:
- Responsive design, which automatically adjusts for optimal viewing on different devices.
- Server performance
- Load time
- Page size
- Alternative text for pictures that is clear and informative
- Descriptive and helpful links (not just “click here”)
- Free access to content. Password or login protected pages will not usually be ranked.
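A few of the items above can be shown in plain HTML; the file names and paths below are hypothetical examples:

```html
<!-- Responsive design: lets the page scale to the visitor's device -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<!-- Clear, informative alternative text for an image -->
<img src="courthouse.jpg" alt="Front steps of the county courthouse in Seattle">

<!-- A descriptive, helpful link instead of "click here" -->
<a href="/practice-areas/car-accidents">Read about our car accident services</a>
```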
Keeping users with physical, visual, cognitive, auditory or other impairments in mind when designing content is always best practice. Besides a firm’s general responsibility to do so, ignoring accessibility could violate laws that protect disabled persons, such as the Americans with Disabilities Act, and increase the risk of legal action. At the very least, it could result in unhappy users and negative feedback. Making sure content is compatible with screen readers and other assistive technology only makes a website available to more people.
When accessibility suffers so does SEO.
Rankability
Rankability is one of the two more advanced, technical levels at the top of the SEO hierarchy of needs, and the first concerned with actively optimizing for search rather than simply removing obstacles. It is the practice of using SEO tactics to improve rank in search engine results pages, which goes back to those first few entries getting the most traffic.
One way to increase rank is to use linking. Including links to other content, whether internal or external, will boost crawlability and rank by transferring some importance and traffic from more popular pages to less popular ones. It also keeps content fresh.
Linking can help build content silos too. Content silos are groups of related pages that can rank higher together than they could alone. For example, information on a firm’s site concerning car accidents might include blog posts, articles or long-form content, organized in a way that is easy for users to find and browse, which in turn helps SEO.
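Internal links within a silo are ordinary HTML links between related pages; the paths below are hypothetical examples of how a car accident silo might connect:

```html
<!-- Internal links tie related pages in a content silo together -->
<p>
  After a collision, read our guide to
  <a href="/car-accidents/what-to-do-after-a-crash">what to do after a crash</a>
  and learn
  <a href="/car-accidents/choosing-an-attorney">how to choose an attorney</a>.
</p>
```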
Clickability
The final level of the SEO hierarchy is clickability, a highly technical element that involves increasing the likelihood that users will click on and interact with a website. Focusing on clickability can help get a website’s content featured prominently beyond the typical search engine results page, in image or video search results for example.
Using title tags and keywords well makes content more clickable. Although it may seem obvious, making sure that clickable content looks clickable, and that unclickable content does not, helps too. Incorporating rich, properly tagged elements into a website’s design, such as media, ratings, interior site links, tables or lists, helps entice users to stay longer and interact further with a site.
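The title tag is the clickable headline a search engine shows for a page, and the meta description often becomes the snippet beneath it. A sketch, with a hypothetical firm name:

```html
<head>
  <!-- The clickable headline shown in search results -->
  <title>Seattle Car Accident Attorneys | Example Firm</title>
  <!-- Often displayed as the snippet under the headline -->
  <meta name="description"
        content="Example Firm helps Seattle car accident victims pursue the compensation they deserve.">
</head>
```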
Clickability is where technical SEO and human behavior online merge, and hopefully for the best.
Practical SEO improvements are not scary
SEO sometimes sounds complicated because it can be. It is especially intimidating to the beginner or the not-so-tech-savvy, but breaking it down into steps is the most helpful way to approach it. Wasted effort (or marketing dollars) is frustrating, and if there is an issue with one of the more basic principles, like crawlability or indexability, efforts to improve higher levels like clickability are ineffective.
SEO may not be simple, but it offers a great opportunity to increase web traffic and public exposure and to gain new clients and revenue. Working up the hierarchy will maximize SEO success.