Search Engine Optimization Resources

Search Engine Optimization - The Complete Guide

3 Link Building Strategies for Beginners 2020

Relevant links are the foundation of any decent ranking in the search engines. Our Link Building Strategies for Beginners will get you off on the right foot.

However, it can be a daunting task at first. These three strategies cover the basics of helping newer web sites gain one-way links.

If you’re not quite sure how to begin building relevant one-way links for your web site, here are my three starting strategies. If you put a decent amount of effort into them, and your web site reaches a reasonable level of quality, you will see some very good results.

Writing and Submitting Articles to Blogging Platforms

This is the link building tip with the best work-to-return ratio. There are hundreds of free article directories out there, and third-party blogging platforms like WordPress, Blogger, Tumblr, and Medium all count for backlinks.

Set up a quick blog on Tumblr and WordPress and start writing. Link back to relevant content, not a product page. Think of something that will keep the reader engaged and wanting to learn more, such as “Top 3 Tips for Website Design in Kearney, NE.”

Most importantly, put your target keyword in the anchor text of the link. Don’t just say “click here.” If you do want to see an example, take a look at our blogging sites: GMB Site, Blogger, WordPress, and Tumblr.

Sometimes you can’t add keywords to the anchor text; I didn’t in the links above because I want you to know what you’re clicking on. That transparency is very similar to what Google expects.

The key here, though, is that you want good, useful articles. You don’t want to go around submitting articles that are just ads for your product, or articles containing a ton of links to your site and affiliate links. Write about something people want to know. You can even write for pure entertainment.

Most article directories allow you to include a resource box and/or author bio. This is where you will add your company name, address, and phone number. This is considered a NAP reference and will help with local SEO.
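
If your own web site marks up the same NAP details, search engines can match those references more easily. Here is a minimal, hypothetical sketch of that markup expressed in Python using the schema.org LocalBusiness vocabulary; every name, address, and phone number below is a placeholder, not a real business.

    # Hypothetical sketch: a NAP (name, address, phone) reference as
    # schema.org LocalBusiness markup. All details are placeholders.
    import json

    nap = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": "Example Web Design Co.",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "123 Main St",
            "addressLocality": "Kearney",
            "addressRegion": "NE",
            "postalCode": "68845",
        },
        "telephone": "+1-308-555-0100",
    }

    # Embed the output in a <script type="application/ld+json"> tag on your site.
    print(json.dumps(nap, indent=2))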

Also, add a couple of links to your site or sites. The terms of the directory usually specify that anybody using the article for their site must use the unaltered article, including bio and resource box, leaving all the links live.

The ultimate test of your article’s usefulness is how far it spreads to other sites or ezines. Of course, this depends a lot on how many article directories you submitted to. I can usually find my articles on other sites within three days of submission. To check how far your articles have spread, type your article’s title into the Google search bar in quotation marks. You should see most of the sites using your article.

To maximize the potential of this strategy you will have to write and submit articles consistently. People will begin looking for your name in the directories, and your articles and one-way links will spread. Submit to as many article directories as possible, and maybe consider investing in some article submission software if you want to get really serious.

Submitting to the Many Different Web Directories

There is a nearly limitless number of directories that you can submit your URL to for free. Large ones like DMOZ and Yahoo will earn your site some major recognition from search engines. Smaller ones are good for a free backlink and maybe a trickle of traffic. Put it all together and it adds up eventually. This strategy is usually the very start for most people.

It’s important to realize that most of the big directories are incredibly picky, however. A listing in DMOZ, for instance, will turn Google’s head and also get you in the Google directory and several other large directories and hundreds of web sites. DMOZ has seen everything though. Every listing is carefully reviewed by a human editor. If your site isn’t quite finished or just isn’t really all that special then you’re asking for rejection by submitting the URL to DMOZ. If you’re going to try to submit your site to the big directories then make sure you have all your ducks in a row.

You may be able to tell that I consider DMOZ the most important directory listing to add to your collection. DMOZ aside though, it’s a good idea to make sure your web site is in good shape before submitting to anything. This would even include search engines.

Discussion Boards and Other Online Communities

If you’re involved in any chatroom communities, forums, video game clans, or anything of that nature, make sure to use the tools provided effectively. Obviously, don’t spam people with your URL or affiliate links, but if there is a place to fill out your profile, do it. If you can link to your web site in your signature, do it. If you email back and forth a lot, or forward all of those silly email jokes to several hundred people, add a signature to your emails with a link to your site.

This is just basic common sense. Don’t spam people or act as a walking advertisement. If you’re a respected member of an online community, people will check out your profile and then your web site. If your signature has a link to your site, every post becomes a one-way link to your web site. If the community ranks decently in the search engines, this can definitely be worth something. When people become aware of your web site there is always the chance of forming partnerships with like-minded individuals that benefit both your site and theirs.

Conclusion – Link Building Strategies for Beginners


Give these strategies a trial run. With time you will find out what works best for you and come up with even better ways of tracking down the all-important links. I hope you found this article helpful. Good luck.

7 Most Ethical Ways To Improve Your SEO

Today we are showing you 7 ethical ways to improve your SEO. Organic search traffic is proof that people are talking about and searching for your brand or related keywords, and those visitors hold the potential to convert into sales. Moreover, organic search traffic is free, meaning marketers don’t have to pay for pay-per-click adverts or run ad campaigns, and it tends to deliver better ROI since it is generated organically by users’ own search phrases and search volume. Research has found that the lion’s share of links that users click on are organic, and that customers are more interested in natural organic search results because those results are more dedicated to their needs. In fact, organic traffic helps eCommerce retailers save tons of money in publicizing and marketing their products. However, this doesn’t happen without effort, since they must adhere to an SEO code of ethics to boost organic search traffic. Here are a few ethical ways online marketers can improve their ranking on search engines.

Ramp up Security

A secure online store not only convinces search engines to rank it higher in the search results but also inspires confidence in the customer. For example, Google has announced HTTPS encryption as a search ranking signal, giving websites secured by HTTPS preference over unsecured ones. Furthermore, online shoppers are likely to repeat their visits to eCommerce stores that guard their private information uncompromisingly. Improving security is an ethical practice that not only increases the potential for more organic traffic but also protects customer privacy.
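
A minimal sketch of what the redirect side of an HTTPS migration can look like, assuming a Python Flask app (in production this is more often handled at the web server or load balancer):

    # Minimal sketch: force HTTPS with a permanent (301) redirect.
    # Assumes a Flask app; real deployments usually do this at the
    # web server or load balancer instead.
    from flask import Flask, redirect, request

    app = Flask(__name__)

    @app.before_request
    def force_https():
        # Rewrite any plain-HTTP request to its HTTPS equivalent.
        if not request.is_secure:
            secure_url = request.url.replace("http://", "https://", 1)
            return redirect(secure_url, code=301)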

Use of the Right Targeted Keywords

One thing about SEO is that online stores cannot use just one keyword and expect to earn a higher ranking on search engines. E-retailers have to use several targeted keywords with high search volume to garner more organic search traffic. You also have to plan and research keywords to find the right target audience. Fortunately, Google provides online marketers with a free keyword research and planning tool, Keyword Planner (part of Google Ads), to assist in comprehensive keyword research.

Content Curation

One of the ethical ways online marketers improve their organic search traffic is to create eye-catching and original product content. Search engines such as Yahoo and Google often flag content plagiarized or copied from product pages already in use elsewhere. Product content can be a combination of imagery and text, and curating it can go a long way toward attracting a larger audience. The better the product content, the higher the chances of generating more organic traffic. Unleashing your own creativity is an inherently ethical practice.

Enable Customer Ratings and Reviews

One of the secrets behind YouTube’s high rankings on the major search engines is that it lets users generate video content on its behalf. User-generated video content is the force behind YouTube’s continued dominance in search. Similarly, online marketers can use customer ratings and reviews to usher traffic to their sites. Ratings and reviews are original content created by website users, and they often contain targeted keywords. This steady flow of original content makes search engines such as Google and Yahoo take note and improve a site’s ranking. In short, positive customer reviews and ratings help eCommerce stores boost their rank on search engines ethically.

Optimization of Each Product Page

Once you have chosen your targeted keywords, it is time to optimize each product page. Your chances of ranking higher by cramming all the keywords into one product page are slim. Instead, online retailers need to optimize each product page separately: the meta title, the header tags, the website copy, image alt tags, and calls to action all contribute to gaining more organic traffic. Admittedly, e-retailers with diverse product categories may find it laborious to optimize every product page individually.
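
To make the “one target keyword per page” idea concrete, here is a small, hypothetical Python helper that builds a per-product title tag and meta description; the product name, keyword, and template are invented for illustration.

    # Hypothetical sketch: per-product <title> and meta description built
    # around a single target keyword. Names and templates are placeholders.
    from html import escape

    def render_head(product_name: str, keyword: str, description: str) -> str:
        title = f"{product_name} | {keyword.title()}"
        return (
            f"<title>{escape(title)}</title>\n"
            f'<meta name="description" content="{escape(description, quote=True)}">'
        )

    print(render_head(
        "Trail Runner X",
        "trail running shoes",
        "Lightweight trail running shoes built for rocky terrain.",
    ))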

Use of an Optimized Permalink Structure

Most online stores lose points when it comes to setting up optimized breadcrumbs and permalink structures. An optimized permalink structure tells the story of the entire page. For example, a permalink for an online retailer that sells shoes might read example.com/fashion/shoes/. Permalink structures create a clear path that helps search engines and their bots grasp what lies ahead on the web page. Breadcrumbs and clean links also allow users to navigate the site with ease.
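
Breadcrumbs can also be made explicit to search engines as structured data. Below is an illustrative Python sketch using the schema.org BreadcrumbList vocabulary, with URLs mirroring the hypothetical example above.

    # Illustrative sketch: BreadcrumbList markup for the hypothetical
    # example.com/fashion/shoes/ path mentioned above.
    import json

    breadcrumbs = {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": 1, "name": "Fashion",
             "item": "https://example.com/fashion/"},
            {"@type": "ListItem", "position": 2, "name": "Shoes",
             "item": "https://example.com/fashion/shoes/"},
        ],
    }

    # Embed the output in a <script type="application/ld+json"> tag on the page.
    print(json.dumps(breadcrumbs, indent=2))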

Mobile Responsiveness

A Cisco study found that online stores generate 80% of their traffic from mobile device users. Furthermore, 60% of online searches are done on mobile devices. Mobile search phrases are also relatively longer than desktop searches, and they are more user-directed, more accurate, and convert at a higher rate. Thus, being mobile responsive allows online retailers to maximize their potential audience and reach the target customer group for their keywords. Responsive design is one of the main ways e-retailers can make their web pages mobile-friendly.

In short, generating organic traffic for an online retailer is a never-ending process. Online marketers should make it a practice to refine headers, descriptions, and keywords in their content, and to review page rankings, in order to sustain a higher ranking. All of this requires e-retailers to play by the book: ethical improvement of organic traffic is far more successful than unethical techniques.


An Introduction to Black Hat SEO


What is Black Hat SEO?

Black hat SEO is a practice against search engine guidelines, used to get a site ranking higher in search results. These unethical tactics don’t solve for the searcher and often end in a penalty from search engines. Black hat techniques include keyword stuffing, cloaking, and using private link networks.

Appearing in search results is vital for business growth, but there’s a right and wrong way of doing search engine optimization. The dark art of black hat SEO is the wrong way. Black hat SEO seeks to game search engine algorithms, rather than solve for the user. Instead of earning the right to rank highly on search engine results pages, black hat SEO uses shady tactics to get you there. Sustained use of black hat SEO techniques is likely to damage your presence in search engines rather than improve it.

If you are new to the search space, the purpose of search engines like Google is to provide the best results when someone completes a search. They want people to have a great search experience and ensure the results they provide do not include spam. They do this automatically through algorithms or manual actions that aim to recognize and penalize those engaging in black hat SEO.

Search engine algorithms have gotten more sophisticated over time, which is why you should avoid black hat SEO at all costs. White hat SEO is a much better method of doing search engine optimization. It’s a more ethical approach that abides by the terms and guidelines set out by search engines. White hat SEO consists of creating quality content and a better overall user experience for people visiting your site.

By the way, TMV – Social doesn’t use any black hat techniques for our SEO services.

Black Hat SEO vs. White Hat SEO

Black hat SEO goes against the guidelines set by search engines and manipulates them to gain higher rankings. It can lead to being wiped completely from search results or gaining a lower position. White hat SEO is a more ethical way of doing SEO by creating quality content and good user experience.

This article will explain what black hat SEO techniques involve so you can make sure to avoid them when devising your organic search strategy.

Black Hat Techniques in SEO

Keyword Stuffing

Keyword stuffing refers to the practice of filling your content with irrelevant keywords in an attempt to manipulate where the page ranks on search results pages. Adding multiple variations of keywords where they add no value creates a bad experience for users. It may also cause your page to rank for irrelevant queries.

Google explains keyword stuffing as:

  • Lists of phone numbers without substantial added value
  • Blocks of text listing cities and states a web page is trying to rank for
  • Repeating the same words or phrases so often that it sounds unnatural

Here’s an example of keyword stuffing for a website selling outbound marketing software:

“We are in the business of selling outbound marketing software. Outbound marketing software is what we sell. If you are thinking of getting outbound marketing software get in touch with one of our outbound marketing software consultants.”

I think you’ll agree, that sounds like a broken record. It’s pretty easy to spot and Google will be able to tell that the content sounds unnatural.

You may have heard the joke: “An SEO copywriter walks into a bar, grill, pub, public house, Irish, bartender, drinks, beer, wine, liquor…” The joke is about keyword stuffing, and it is another perfect example of the practice. The words are all related to each other, but they add no value because they don’t even string together a sentence.

You can do keyword research to find out what people are searching for, but overusing those keywords in your content is not a good idea. Rather than filling your content with keyword repetitions, concentrate on creating useful content that focuses on topics over keywords.
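
If you want a rough self-audit, a crude density check can flag obvious over-use. The toy Python sketch below counts how much of a page’s text a single phrase occupies; the function, sample text, and any notion of a “safe” density are illustrative assumptions, not a rule published by Google.

    # Toy sketch: a crude keyword-density check to flag obvious over-use.
    # What counts as "too much" is a judgment call, not a published rule.
    import string

    def keyword_density(text: str, phrase: str) -> float:
        words = [w.strip(string.punctuation) for w in text.lower().split()]
        target = phrase.lower().split()
        n = len(target)
        hits = sum(
            1 for i in range(len(words) - n + 1) if words[i:i + n] == target
        )
        return hits * n / max(len(words), 1)

    sample = ("We are in the business of selling outbound marketing software. "
              "Outbound marketing software is what we sell.")
    print(f"{keyword_density(sample, 'outbound marketing software'):.0%} "
          "of the words belong to the target phrase")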

Cloaking

Cloaking involves showing one piece of content to users and a different piece of content to search engines. Websites practicing black hat SEO do this to make pages rank for a variety of terms irrelevant to their actual content. Spam websites often cloak to keep search engine bots from discovering the spam content they serve to users.

Tailoring your content to different groups of users is acceptable. For example, you might shrink the size of your website when someone visits from a mobile device. You might also change the language of a page based on the country someone is visiting from. A publisher like Forbes or Inc might change the ads that appear on a page in order to fund their content. These examples are completely acceptable, as long as you are not changing the content only for search engine crawlers.

While there is no hard and fast rule to determine what’s acceptable and what’s not, my best advice is to ask yourself, does what you intend to do solve for the user? If it does, then it’s acceptable. You should treat search engine bots that crawl your site the same as any other user.

If you are curious to find out how Google sees your website, you can use the Fetch as Google tool and compare the result to what users see.
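
For a quick, informal check of your own pages, you can also fetch the same URL with a browser-like user agent and a Googlebot user agent and compare the responses. A rough Python sketch follows; the URL is a placeholder, and dynamic pages can legitimately differ between fetches, so treat a mismatch as a prompt to investigate rather than proof of cloaking.

    # Rough sketch: fetch one page with two user agents and compare.
    # The URL is a placeholder; dynamic content can differ legitimately.
    import requests

    URL = "https://example.com/"
    BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
    GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                    "+http://www.google.com/bot.html)")

    browser_html = requests.get(URL, headers={"User-Agent": BROWSER_UA}).text
    bot_html = requests.get(URL, headers={"User-Agent": GOOGLEBOT_UA}).text

    print("Identical responses:", browser_html == bot_html)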

Sneaky Redirects

A redirect involves sending someone to a different URL than the one they initially clicked. Black hat SEO uses redirects outside of the purpose they are intended for. Along the same lines as cloaking, this might include redirecting a search engine crawler to one page and all other users to another page.

Another example is redirecting a highly authoritative page with lots of backlinks into another irrelevant page, just to boost its position in search results. A 301 redirect passes the majority of authority from one page to another. This means someone practicing black hat SEO could use redirects solely for the purpose of manipulating search results.

Redirects should only be used for the purpose they were designed for, such as when you change your website domain or consolidate two pieces of content. It’s also acceptable to use JavaScript to redirect users on some occasions. Take, for example, LinkedIn redirecting you to someone’s full profile when you are logged in, rather than showing you the public version of a user’s profile that you would see when logged out. Sneaky redirects, on the other hand, should be avoided. They violate the guidelines of search engines such as Google and Yandex.
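
For contrast with the sneaky variety, here is a minimal sketch of a legitimate 301 setup in Python, assuming a Flask app and hypothetical paths: two retired articles that were merged get permanently redirected to the consolidated page.

    # Minimal sketch of legitimate 301 redirects: two retired articles are
    # consolidated into one canonical page. All paths are hypothetical.
    from flask import Flask, redirect

    app = Flask(__name__)

    LEGACY_PATHS = {
        "/old-link-building-guide": "/link-building-guide",
        "/link-building-tips-2018": "/link-building-guide",
    }

    @app.route("/<path:slug>")
    def legacy_redirect(slug):
        target = LEGACY_PATHS.get("/" + slug)
        if target:
            # A 301 passes most of the old page's authority to the new one.
            return redirect(target, code=301)
        return ("Not Found", 404)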

Poor Quality Content

Poor quality content that’s of no value to the searcher is also a common practice in black hat SEO. This includes content scraped from another website either by a bot or a person. At one point search engines like Google weren’t good at recognizing content that had been copied from other websites. The Google Panda update in 2011 resolved this issue. Many sites with duplicate content took an instant hit in search rankings. Since then, Google has gotten much better at recognizing duplicate and low-quality content.

Adding invisible keywords to your content is also a prohibited practice. Some websites that engage in black hat SEO do this by making the text the same color as the page background. This means the page may appear in search results for those invisible keywords, even though there’s no visible content about them on the page. When a user clicks on the result thinking it’s going to be about the topic they searched for, they don’t find any of the content they were looking for as the keywords are invisible. If you’re solving for the user, there should be no need to hide content on your website.

The “bait and switch” is another black hat means of misleading search engines. This involves creating content around a topic you want to rank for. Once the page is ranking in results for this topic, the content is swapped out for something else. This creates a negative experience for searchers, as the content they click through to see no longer exists. These practices trick users and search engines, and they are not a good way to do SEO.

Writing original, quality content is an important part of white hat SEO. Not only is it required to avoid a penalty from search engines, it will also set your website apart. Creating high-quality content builds trust with your target audience and turns visitors into customers.

Paid Links

Search engines like Google strictly ban the buying and selling of links. They state on their website that “any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines.” This includes sending a website free products in exchange for links. If you’re not sure what counts as an acceptable exchange, Matt Cutts, the former head of Google’s webspam team, recommends looking at the FTC guidelines.

You should avoid paying any other site to link to your content. Google asks users to tell them about instances of people buying or selling links. They state they will penalize both buyer and seller of links once the practice is detected.

If you’re reading this having purchased links without realizing it is a black hat SEO tactic, you should have them removed as soon as possible. You can also use the disavow links tool if you can’t get webmasters to remove the links. This tells Google to disregard the paid links when calculating your PageRank.

Abusing Structured Data/Rich Snippets

Structured data, also known as rich snippets or schema markup, allows you to change how your content is displayed on search engine results pages. It makes your content stand out from competitors and also gives you more real estate on results pages. You can add structured data to a page displaying a podcast, recipe, or book, among other products and services. Review schema markup is probably one of the most popular types of structured data.

Black hat SEO involves providing inaccurate information in structured data to fool search engines and users. For example, someone practicing black hat SEO might award themselves five stars from a fake review site and add structured data so they stand out on search results pages. This is a very risky practice, as search engines like Google encourage users to report websites misusing structured data.


This should not put you off marking up truthful, accurate information on your web pages. In fact, I highly recommend adding structured data the white hat way. We added review markup to HubSpot product pages and saw a 10% increase in clicks to those pages.

If you provide truthful information that is helpful to users, you have nothing to worry about. Google has documented the rules around adding structured data to your website and also has a helpful tool for testing your structured data.
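
For a concrete picture of the white hat version, here is a hypothetical Python sketch of truthful review markup using the schema.org Product and AggregateRating types; the product name and numbers are placeholders and, on a real site, must come from genuine customer reviews.

    # Hypothetical sketch: truthful review markup (schema.org Product with
    # AggregateRating). On a live site these values must reflect real reviews.
    import json

    product = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Example Running Shoe",
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": "4.4",
            "reviewCount": "89",
        },
    }

    print(json.dumps(product, indent=2))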

Blog Comment Spam

As the name suggests, this black hat technique involves dropping a link to your website into blog comments. The practice happens less often nowadays, as search engines like Google have updated their algorithms to discount links in blog comments. Most authoritative blogs now make links in comments nofollow by default, meaning search engines like Google do not follow the link, nor does the link pass any authority.

Despite the decline in the number of people engaging in the practice, you’ll still find plenty of people on Fiverr advertising blog commenting services. Blog commenting with links to your website is a spammy way of getting links, and we highly recommend avoiding the practice.

If you own a publication, forum, or community that allows comments, you need to ensure that your comments section can’t be spammed by bots or people. Search engines like Google will demote or completely remove pages containing spam from the search results. Using an anti-spam tool like Google’s free reCAPTCHA is one way to mitigate the risk of spammy user-generated content.
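
Alongside spam filtering, you can make sure any comment links that do get through pass no authority. Here is a minimal Python sketch of rendering user-submitted links with nofollow; the helper function is hypothetical, and “ugc” is the rel value Google introduced for user-generated content.

    # Minimal sketch: render user-submitted links so they pass no authority.
    # The helper is hypothetical; "ugc" marks user-generated content.
    from html import escape

    def render_comment_link(url: str, text: str) -> str:
        return (f'<a href="{escape(url, quote=True)}" rel="nofollow ugc">'
                f"{escape(text)}</a>")

    print(render_comment_link("https://example.com", "my site"))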

Link Farms

A link farm is a website or a collection of websites developed solely for the purpose of link building. Each website links out to the site or sites they want to rank higher on search engines. Search engines rank websites by looking at the number of links that point to the website, among other factors. Black hat SEO exploits this by using link farms to inflate the number of backlinks a particular site has.

Link farms often have low-quality content and lots of links. The links normally contain the keyword they want the site to rank for in the anchor text. Search engines like Google can easily detect link farms and using them should be avoided. Instead, you should use white hat SEO tactics like creating amazing content, graphs, data, interviews or any other content that allows you to acquire backlinks naturally over time.

Private Blog Networks

A private blog network (PBN) is a bunch of authoritative websites used solely for link building. They are similar to link farms in that they both aim to exaggerate the number of links pointing to a website. Each PBN site links to the site they want to boost in the search results but do not link to each other.


Black hat SEOs wanting to build a private network will normally buy expired domains that have already built up authority. They’ll write content similar to what existed on the domain before it expired and add links to their own site. They hope that search engines won’t notice they’re controlling a network of websites and will rank their main website much higher in the search results.

Search engines have gotten clever at spotting PBNs, and your site could be hit with a severe penalty if you use them to improve your search presence. Rather than put effort into spinning up fake websites, focus on creating quality content under your own domain. Keeping your content under one roof makes your site more authoritative, as everyone links to the one domain.

Examples of Black Hat SEO

Groupon’s Bait and Switch

Groupon was accused of a bait and switch by San Francisco Comprehensive Tours. The tour company ran a one-off promotion with Groupon, but the voucher website continued to advertise the promotion on Google long after it had ended. When searchers clicked on Groupon’s page there was no discount to be found, as the content had been swapped out. This particular bait and switch happened in a PPC advertisement, but they often happen in organic results too.

J.C. Penney’s Black Hat Links

J.C. Penney ranked at the top of search results for a vast number of keywords from “skinny jeans” to “home decor”. The retailer’s exceptional performance in search results was perfectly timed around the holiday season. This outstanding performance in search results was thanks to black hat SEO link building techniques that slipped under Google’s radar.

Doug Pierce discovered just over 2,000 backlinks containing anchor text with the very keywords J.C. Penney wanted to rank for on search engines. Many of the links were found on websites of no relevance to J.C. Penney, with topics ranging from casinos to cars. In an interview with the New York Times, J.C. Penney claimed no responsibility for the links that were found.

Google confirmed the actions of J.C. Penney went against their webmaster guidelines and revealed that they also had violated webmaster guidelines on three previous occasions. J.C. Penney received a Google penalty that saw them drop down close to seventy positions on Google for terms such as “living room furniture.”

Sprint’s User-Generated Spam

In 2013, a user called Redleg x3 posted on Google’s Webmaster Central forum explaining that Sprint had received a notification from Google warning of user-generated spam on its website. Google’s Matt Cutts commented on the thread saying he could see the majority of spam had been removed from the website. He explained the company should “…try to catch the spam a little faster or see if there are some ways to make it a bit harder for the spammers to post a large amount of messages on the community pages.”

Forbes Selling Links

Someone appearing to be from Forbes posted on the Google Webmaster Central forum seeking help with a link violation notice. The notice asked Forbes to remove unnatural links from their site’s content.

Google’s Matt Cutts commented in the thread that he had confirmed multiple times that the site carried paid links that pass PageRank. Cutts recommended that Forbes remove those links to have the penalty reversed. TechCrunch reported that Forbes began to remove the paid links back in 2011 after receiving the penalty.

Google Chrome’s Paid Link

Even Google messes up its own SEO from time to time. On one occasion it included a followed link in a sponsored post about Google Chrome. This falls under black hat SEO because the link was part of sponsored content that was paid for by the company. The Google webspam team applied a penalty to www.google.com/chrome, reducing its PageRank for a period of sixty days. The black mark caused Google Chrome to drop in the search results for the term “browser.”

Why You Should Avoid Black Hat SEO

While black hat SEO is not illegal, it does violate webmaster guidelines set out by search engines. In other words, it’s still against the rules. This means if you engage in black hat SEO, you must be willing to get hit with a nasty penalty as punishment. Getting a penalty from search engines will cause your website to drop down in the search results or worse, it could be removed completely. This means your website will gain less traffic and ultimately, fewer customers.

Search engines have gotten better and better at spotting black hat SEO techniques; nowadays, getting caught is pretty much unavoidable. Black hat SEO solves for neither the searcher nor the search engine. While you may see short-term gains, over time search engines will pick up on your black hat ways and damage your presence in search.

The Blurred Lines of Grey Hat SEO

You won’t find grey hat SEO in the middle of a Robin Thicke song, but you will find it somewhere in the middle of black and white hat SEO. If there’s an SEO tactic you find hard to categorize as black or white hat SEO, then it’s probably a grey hat technique.

What is Grey Hat SEO?

Grey hat SEO consists of slightly shady tactics. While they are not among search engines’ explicitly prohibited practices, they are somewhat unethical and could be banned in the future.

Grey hat SEO treads close to the line of black hat SEO. Grey hat tactics are normally not listed in webmaster guidelines as prohibited practices, but they are a little dubious. Many grey hat practices have become black hat practices over time, once search engines found out about them.

How To Avoid Black Hat SEO

There’s no doubt black hat SEO is a risky business that’s not worth engaging in. Here are best practices to avoid black hat SEO:

  • Treat the searcher and search engines the same way. Avoid “cloaking” or tricking search engine crawlers by redirecting them to another page. You should always focus your efforts on solving for the searcher and create a great user experience from search engine to site.
  • Write only good quality original content that avoids keyword stuffing. Never scrape, duplicate or reword content that belongs to others. Google’s content guidelines and our content creation kit may be helpful.
  • Abide by the rules when adding structured data to your website. Ensure any schema markup you add is accurate and not misleading to users.
  • Never buy or sell links and remember, it’s not just money that’s considered a black hat exchange. Providing free products in exchange for links is also prohibited. If you are unsure if an exchange might be unethical lean on the FTC endorsement guidelines and consult this detailed blog post about paid links from Google.
  • Avoid setting up a private blog network for the purpose of getting links. Differentiate your website and content so people link to you naturally rather than fake it till you make it. That never ends well.
  • Stay up to date on webmaster guidelines so you can avoid black hat tactics prohibited by search engines. Here are the webmaster guidelines for Google, Yahoo and Yandex.

Don’t make your next search “how do I get rid of a Google penalty?” If you need to question whether something is black hat or not, it probably is. A white hat SEO strategy is a much better approach to search engine optimization. In the long run it will pay dividends, and you can sleep at night knowing you’ll never see a dip in your rankings due to a nasty penalty. So, for the love of search engines, never do black hat SEO. After all, they are the ones that keep us SEOs in business.

Your Google Rank Doesn’t Matter Anymore

For a long time, keyword rankings were a staple part of any SEO campaign. In a lot of cases they were a primary metric used to judge performance.

Go back five or six years and we had so much more information on the keywords that users were searching for to reach our web content. All of this information was available transparently within Google Analytics, and you could get relatively accurate search volume estimates from within Google’s Keyword Tool.

The first major update that changed this was Google’s move to encrypted search and the dreaded appearance of “not provided” within Google Analytics.

This created a ripple effect across many SEO software providers, making a lot of their tools less effective and the impact of organic search much tougher to measure on a granular level.

Next up, and more recently, was Google’s decision to show search volume estimates within their Keyword Planner tool only in broad ranges. Instead of learning that a keyword was searched 1,400 times each month, we’re told that it’s searched between 1K and 10K times per month. This isn’t overly helpful.

These changes have forced marketers to adapt their search strategy to focus less on individual keywords and shift to a topic-centric content strategy, especially for content sitting at the top of the funnel.

Keyword Rankings are Inaccurate

One of the major criticisms of keyword ranking data is the fact that it is largely inaccurate. Many industry leaders and even software providers of rank tracking data have admitted that this is the case.

The reasons behind this can be broken down into three broad buckets:

  1. Personalization.
  2. Device.
  3. Location.

Personalization

Around the time of the launch of Google+, the SEO industry was talking a lot about personalization within search. Even after the death of Google+, personalization has remained a big consideration.

Bonus points if you remember Authorship snippets (circa 2012).

Ultimately, Google will deliver results that are personalized to a user based on their search history. This means that if I were to search for a query like “electric cars” and I’d previously been browsing the Tesla website, it’s a possibility that Google would tailor the rankings of the search results to show Tesla near the top.

This wouldn’t be the case for someone that hasn’t previously visited Tesla’s website, which makes it very tough to determine which website actually ranks #1 (because it can be different from one person to the next).

Device and Location

Whilst personalization plays a part in the ambiguity of keyword rankings, it’s nothing compared to the role of implicit query factors like device and location.

One of Google’s major advancements in search over the past five years has been its ability to take into account aspects of a search query that aren’t explicitly stated. To make sense of what I’ve just said, let’s take a query like “Boston restaurants.”

Go back to 2010 and a search for “Boston restaurants” would yield a list of relatively generic websites that either talk about Boston restaurants or maybe are a restaurant.

Fast-forward to 2018 and a simple search for “Boston restaurants” will arm Google with a whole lot more information than before. They’re able to see which device you’ve searched from, where you’re located whilst you’re searching, even if you’re currently on the move.

Let’s say that you searched on an iPhone and you’re walking around in the center of Boston at 11:30 am. Here’s what this query would actually look like to Google:

“Which restaurants are currently open for lunch within walking distance of my current location in the center of Boston, MA?”

They’ve gathered all of this information without the individual even having to type it. As a result, they’re able to completely tailor the search results to this individual searcher’s current situation.

So… answering the question of who ranks #1 for “Boston restaurants” becomes an even more challenging task.

Keyword Rankings are Directional at Best

Strong keyword rankings don’t always equate to high volumes of organic traffic, let alone improvements in revenue. As I mentioned at the beginning, we’ve lost a lot of visibility on search volume metrics, which makes it very difficult to accurately estimate the amount of traffic you can gain from an individual keyword. Factor in the changing appearance of the search engine results page (e.g. the widespread increase in featured snippets) and it becomes an even more daunting task.

If keyword rankings are your North Star, you may be traveling in the completely wrong direction.

When all you’re obsessing over is where each page is tracking against a ranking goal, you’ll likely be missing a ton of other value that your content is bringing in. For example, what if you’ve built out some content with the primary goal of driving backlinks or social traffic, but it isn’t necessarily designed to rank for much itself (e.g. a research report)? Using keyword rankings as the determining factor of success would evaluate that content in a completely inaccurate way.

Measuring Performance at the Topic Cluster Level

To combat a lot of the issues I raised above, we shifted the way that we measured content at HubSpot. For the past couple of years we’ve taken a step back from analyzing the performance of content on a page-by-page level and looked at the performance of content at the topic cluster level.


Organic search traffic and conversions are our primary search goals, so when we group our content into clusters to try and gain visibility for any searches related to a given topic, we look at the collective performance of these groups of webpages vs just the performance of individual pages.

This model of analysis helps us account for the varying goals of each individual piece of content. Also, running this analysis at scale tells us which topics tend to drive more traffic growth compared to others, and which topics tend to convert traffic at a higher rate.
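
As a rough illustration of this kind of rollup, the toy Python sketch below groups page-level metrics into clusters and compares sessions and conversion rates; the URLs, cluster labels, and numbers are all invented.

    # Toy sketch: roll page-level metrics up to the topic cluster level.
    # URLs, cluster labels, and numbers are invented placeholders.
    from collections import defaultdict

    pages = [
        {"url": "/what-is-seo", "cluster": "seo-basics",
         "sessions": 1200, "conversions": 18},
        {"url": "/seo-checklist", "cluster": "seo-basics",
         "sessions": 800, "conversions": 9},
        {"url": "/link-building", "cluster": "link-building",
         "sessions": 500, "conversions": 4},
    ]

    totals = defaultdict(lambda: {"sessions": 0, "conversions": 0})
    for page in pages:
        totals[page["cluster"]]["sessions"] += page["sessions"]
        totals[page["cluster"]]["conversions"] += page["conversions"]

    for cluster, t in totals.items():
        rate = t["conversions"] / t["sessions"]
        print(f"{cluster}: {t['sessions']} sessions, {rate:.1%} conversion rate")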

This information tends to provide much clearer insights for the team as to what they should focus on next without obsessing over individual keyword rankings.

Is There Still a Place for Keyword Rankings?

Despite everything I’ve said above, I’m not actually saying that keyword rankings are dead (I can already see the tweets ready to be fired at me!). Keyword data can be useful for digging into any SEO problems that happen to your site, and also to look into the intent behind certain types of searches.

That said, the new version of Google Search Console that has just recently been rolled out should give you pretty much everything you need here.

More than anything, as a marketer you need to be aware that the data that you’re looking at related to keywords is not 100% accurate. As a result, this should never be your primary performance metric.

Should You Let a Bot Write Your Content?

Artificial Intelligence

Everyone thinks they can write. Even, it would seem, robots.

But do these automated writers really have the ability to produce content to rival the work produced by those of us who write for a living? Should we see them as a threat?

Likewise, whether you are a brand manager within retail, travel, finance or any other industry, will this affect you and what, if anything, do you need to be thinking about now? Could you actually consider using robots instead of humans to get your content written?

Well, it’s time to see what they’ve got to offer as, in what is believed to be a world-first for journalism, robot-generated stories have been produced by The Press Association.

The automated press service set up by PA, known as RADAR, is currently trialling computer-generated, data-driven content, funded by a grant from Google’s Digital News Initiative.

The plan: to create 30,000 localised stories a month from data using Natural Language Generation software.

At the end of November 2017 a pilot, involving 35 regional titles from 14 publishing groups including Archant, Independent News and Media and Johnston Press, resulted in multiple versions of four stories being distributed. These have since appeared in weekly and daily titles both online and in print.

What This Means for The Press Association

In its own words: “Press Association was conceived as a London-based news gathering service for the provincial papers … we are trusted because we are fast, fair and accurate. Today, much of the content people read, see or hear continues to originate from PA.”

Ultimately, it gathers data and facts that are then sent out to journalists, who source quotes and localize the information, turning the bare bones into a story suitable for their individual publication. So, in the case of computer-generated stories, this could offer an effective way for the Press Association to send out a lot of information very quickly.

It is important to note here that reporters are still at the core — merely using the information they receive as the starting point for stories.

So, the real question is, could a robot ever be trained to write the full piece? And, will they be replacing humans at news desks up and down the country?

The short answer is no.

More often than not, we want much more from an article than just the who, what, where, when and how — it must also take us to the emotion that lies beyond the fact. You don’t, for example, just want to know that a school excelled in its A-level results this year, you want to hear from the child who beat the odds to get top marks and is now set to attend Cambridge University.

This human element is key and it is why we can’t underestimate the importance of having a human write the content. There is a certain skill to writing – particularly copy that needs to be entertaining, engaging or persuasive – that goes well beyond typing words on to a page. To put it bluntly, solely data driven content is dull. It lacks the emotion and context that us writers could – and should – inject into a story.

We Still Need Human Writers

Recently, a new chapter was written for the Harry Potter series, titled Harry Potter and the Portrait of What Looked Like a Large Pile of Ash. But this was not fan fiction, nor was it written by J.K. Rowling herself; it was actually typed up by a predictive keyboard.

After feeding seven books through the computer, lines such as, “He saw Harry and immediately began to eat Hermione’s family”, and “‘Not so handsome now’, thought Harry as he dipped Hermione in hot sauce,” were produced.

The fact that your content needs to make sense goes without saying … but not when it comes to robots it seems.

This isn’t the only thing to consider, however.

Injecting Humor

If you get it right, you will reap the rewards from humorous content. People like to be entertained — and will share and interact with your content if they feel they have been entertained. But can a robot be funny?

During his time as The New Yorker’s cartoon editor, Bob Mankoff developed an interest in the creative potential of artificial intelligence. After launching the cartoon caption contest and receiving up to 10,000 entries a week, he attempted, along with Microsoft and Google’s DeepMind, to develop an algorithm that could distinguish the funny captions from those that weren’t. However, he eventually declared it a ‘dead end’.

Yet he did go on to form Botnik Studios and create the tool used to write the Harry Potter chapter. The tool takes the essence of a publication or topic, such as David Attenborough’s Blue Planet, and creates something ‘completely absurd’. It’s fun to play around with, but it won’t create content worthy of placing on your website any time soon.

Being Aware of the Details

No matter how clever they seem, and how much they can already do that you might never have expected, you can’t train a robot to have news sense. Take the example of this news story created by a robot for PA: ‘Most babies are born to married parents in Bournemouth, figures reveal’.

In this data-heavy article, the robot wouldn’t know if the maternity ward was due to be closed, for example. It’s often extra information such as this which adds an important angle to the story, making it more newsworthy.

Data can organize information, but journalists turn it into a story.

Recognizing Context and Emotion

The words shouldn’t only tell us the facts, they should bring a story to life and tap into our emotions.

Take the different approaches to this sports story, for example.

The robot version begins: Marcus Paige scored with nine seconds remaining in the game to give North Carolina a 72-71 lead over Louisville.

While the human version, written for ESPN, opens: Marcus Paige ignored the pain in his twice-injured right foot, put his head down and drove toward the rim.

This storytelling element is something the computer can’t imitate.

Capturing Thoughts and Feelings

The above sports story also included the quote from Paige: “I said jokingly to my teammates that I was back.” An understanding of natural language is and will continue to be a very big challenge for artificial intelligence.

We want to know people’s thoughts and feelings about the facts and stats. A robot can’t yet conduct a natural interview and filter the answers down to a key quote. It’s often in these quotes that the emotion of an article comes across.

Don’t Rely on Robot Writers Just Yet

Emotion, context, news sense and humor should all feed into a compelling content calendar for brands too. You need to appreciate what your readers like, what matters to them, how best to talk to them and how to entertain them.

There is a lot that happens at Zazzle before we write any words on a page. This includes:

  • Discovering the target audience of a brand and then creating personas
  • Keyword research to make sure we are targeting the relevant terms
  • Ensuring the content we plan to create is relevant and varied
  • Understanding (and sometimes creating) the tone of voice

Once all this is done we will start to write. But, unlike a robot, we will be able to keep all of the above in mind while making sure that we write content that other humans want to read. Writing involves analyzing and interpreting this information to deliver a message effectively.

As Mankoff said: “Machines, in the end, are idiots, or maybe idiot savants, that need humans to create content that’s going to be interesting to human beings.”


See How TMV Does Search Engine Optimization

See what sets us apart.
Learn More