Search engines are going through the kind of chaos that constitutional law went through in the early 1800s: Everything is still new, so the field shifts rapidly, extremely, and unpredictably, and a single development can change the face of the game for decades to come.
It’s an exciting time to be in the field.
Because of the high barrier to entry, there are only a handful of players in the world of search. However, they’re all vying for the attention of one group of people – internet users. To better cater to internet users, search engines across the field are watching how they interact with websites – keeping tabs on how long they linger on a site, where they click after visiting the site, and what they search for afterwards – in an effort to determine which sites they find satisfactory. Search engines then note the trends that develop, and show sites that users seem to like more prominently in their results pages.
This was not always the case, and actually represents a massive shift to a core theoretical concept of search engines.
Here are the basics of what happened.
Before: Search Engines Determine What’s Good
Search engines have always faced the monumental task of understanding a search query, and then scouring the internet to find sites that are both relevant and important.
But what’s “relevant” and “important”? And how can a machine determine that automatically?
By creating mathematical algorithms and programs that take a variety of factors into account, search engines were able to determine, quite successfully, which sites were relevant and important for specific search queries. A couple of the factors that these algorithms took into account were:
- Links to a website, from other sites
- Number of times words in the search query appeared on the website
- Popularity of the website
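To make the idea concrete, here is a minimal sketch of how those three factors could be rolled into a single score. The weights, the logarithmic scaling, and the function name are illustrative assumptions for this example – no search engine publishes its real formula.

```python
import math

# Hypothetical early-style ranking score. Weights and factor names are
# assumptions for illustration, not any engine's actual algorithm.
def rank_score(inbound_links, query_term_hits, popularity,
               w_links=0.5, w_terms=0.3, w_pop=0.2):
    """Combine the three classic factors into one relevance score."""
    # Logarithms give diminishing returns: the 1,000th inbound link
    # matters less than the 10th.
    link_signal = math.log1p(inbound_links)
    term_signal = math.log1p(query_term_hits)
    pop_signal = math.log1p(popularity)
    return w_links * link_signal + w_terms * term_signal + w_pop * pop_signal

# A page with more inbound links and query-term mentions outranks
# an otherwise similar, thinner page.
print(rank_score(200, 15, 5000) > rank_score(10, 3, 5000))  # True
```

A scheme like this is exactly what made the exploit possible: every input is under the site owner’s influence, so anyone who learns the weights can game the score.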
However, marketers quickly learned how to exploit these factors, and focused on developing web pages that scored well on them without necessarily being relevant or important. The search engines combated these efforts by making the algorithms increasingly complex, with somewhat mixed results.
After: Search Engines Let Users Tell Them What’s Good
The paradigm shift came when search engines decided to start using analytics to measure a site’s worth. This decision was never announced publicly – it likely never will be – but all the signs show that search engines are calibrating their algorithms to weigh user interaction with a site more and more heavily.
For instance, if three people, in succession, Google “Detroit criminal defense attorney” and then all proceed to do the following:
- Click on the listing for Avvo.com
- Immediately hit the back button on their browser
- Click on Hypo & Name LLP’s site
- Do not return to the results page, and do not enter the same search again
Then Google will determine that Hypo & Name LLP’s site was a good one for the search “Detroit criminal defense attorney”, and boost its placement in the search rankings.
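The click pattern above – bouncing off one result and settling on another – can be sketched in a few lines of code. The session format, event names, and the decision rule here are assumptions made for this illustration, not a description of Google’s actual system.

```python
# Illustrative sketch: treat the last-clicked result as "satisfactory"
# only if the user never came back from it. The session format and the
# rule itself are assumptions for this example.
def satisfied_result(session):
    """Return the URL the user settled on, or None if they kept bouncing."""
    clicks = [event for event in session if event["type"] == "click"]
    if not clicks:
        return None
    last_click = clicks[-1]
    # Any event after the final click means the user came back unsatisfied.
    if session.index(last_click) < len(session) - 1:
        return None
    return last_click["url"]

session = [
    {"type": "search", "query": "Detroit criminal defense attorney"},
    {"type": "click", "url": "avvo.com"},           # immediate back-button bounce
    {"type": "click", "url": "hypo-name-llp.com"},  # user stays here
]
print(satisfied_result(session))  # hypo-name-llp.com
```

Aggregated across many users, a signal like this lets the engine reward the site people actually stayed on – without ever reading the site itself.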
Impact on Online Marketing for Your Law Firm
The trend towards tracking user experience and promoting websites that maximize it is something that law firms need to take into account when they plan their online marketing campaigns. No longer is it good enough to trick search engines into thinking your website is a good one – now you have to trick your viewers, too.
The best way to make this happen is by making your website a genuinely good website. For law firms, hosting a law blog is the best way to do this – it enhances your reputation as an expert, and provides space for quality legal content that readers are looking for.
When it comes to writing a legal blog, Myers Freelance is at the top of the game. Our writers all have legal degrees and experience but are, first and foremost, good writers. Contact us today to get a legal blog started on your site, and follow us on Facebook and Twitter to keep abreast of news and discounts.