This tutorial will show you how to tune your website using three popular SEO tools, improving your site's performance and visibility. I'll start with an overview of major SEO items and potential red flags on your website, then follow with examples of how to use SEMrush, Google Analytics, and Screaming Frog to understand your website's performance and start fixing it.

Overview

The single most important factor in having great SEO is having great content. That is, if you need to redo your website because it sucks, bite the bullet. If not, start organizing your content to be crawlable by Google's bots. The third most influential SEO item is performance: if a page loads too slowly, bounce rates skyrocket. This article mainly covers quick organization fixes and the tools to diagnose your issues.

Organization Red Flags

Low-hanging fruit in your own website's organization includes:

Menu Issues

Controlling the menu controls what choices people can make. At the same time, people rarely stop to question what they're looking at on a menu and why. Make sure the navbar you offer serves a clear purpose rather than acting as a distraction; people will straight up bounce off your site if it's confusing. For example, the menu bar of one of my clients (shown below) doesn't provide any clear avenue to make a purchase, even though that was the entire point of the website.

One item on the homepage labeled “Designs” dropped down to three choices: “Exclusives”, “Historic / Nottingham”, and “Designs” (again). “Designs” is not specific enough: it says nothing about actual curtains or purchasing; instead it sounds like an artist advertising different styles. Users cannot easily travel the route PRODUCT → SHOPPING CART without spending a lot of mental energy.

Structural Issues

At the back end, things were laid out very funkily. URLs were full of dashes and keywords with no clear hierarchy. His typical URL structure, https://www.cottagelace.com/The-Eastlake-Panel-and-Sidelight-Lace-Curtains, indicated that every product he sold sat in one home folder. Can you imagine the mess of dumping all of your company's files into a single drawer? Not only is this confusing to those who happen to read URLs; your content is also much less clear to the bots crawling your website. Instead one might expect something like https://oldeworldelace.com/product-category/lace-curtains/cotton-lace-curtains-scotland/, which at least shows what you're looking at and where it is.
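To make the difference concrete, here's a minimal Python sketch (using the two URLs above) that splits each path into its segments. The flat URL gives a crawler a single level of context; the nested one gives three:

```python
from urllib.parse import urlparse

# The two URL styles discussed above: flat vs. hierarchical.
flat = "https://www.cottagelace.com/The-Eastlake-Panel-and-Sidelight-Lace-Curtains"
nested = "https://oldeworldelace.com/product-category/lace-curtains/cotton-lace-curtains-scotland/"

for url in (flat, nested):
    # Split the path into its folder segments, dropping empty pieces.
    segments = [s for s in urlparse(url).path.split("/") if s]
    print(f"{len(segments)} level(s): {' > '.join(segments)}")

# 1 level(s): The-Eastlake-Panel-and-Sidelight-Lace-Curtains
# 3 level(s): product-category > lace-curtains > cotton-lace-curtains-scotland
```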

Find and Fix Problems

SEMrush

SEMrush lets you look at your website's current rank for a number of keywords, as well as determine which keywords drive traffic. It also lets you look at your competition's keywords and how many clicks they get. You can collate, organize, and display the data in different ways to create reports. Here is the basic output from a SEMrush search on cottagelace.com:

This mess answers the following question: “Of the people who ended up on this website, what words did they search to get there?”

The “victorian lace curtains” keyword was the biggest driver of traffic here: 13.06% of visitors searched “victorian lace curtains” to arrive. That keyword sits at “pos” (position) 7, which means the site shows up 7th in Google's search results. Google only displays up to rank 10 on its first page.

When people search for “lace curtains,” however, Cooper Lace can only be found on the fourth page of results (rank 34), which is horrible. Other info, such as the “Volume” column, helps you target your advertising toward more frequently searched keywords; here we see an average of 9,900 people search for “lace curtains” monthly in the US.
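The position-to-page math is simple. With ten organic results per page, a quick sketch like this maps any position to its results page:

```python
from math import ceil

# Google shows ten organic results per page, so a keyword's position
# maps directly to the results page it appears on.
for pos in (7, 34):
    print(f"position {pos} -> page {ceil(pos / 10)}")

# position 7 -> page 1
# position 34 -> page 4
```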

Google Analytics

Google Analytics helps you look at traffic numbers, bounce rates, and conversion rates:

We see the percentage of mobile, desktop, and tablet traffic hitting the website monthly. You can get a report of the number of sessions (naturally more than the number of individuals visiting the site, since one person can visit several times). You also see that the bounce rate is way too high; this is the share of users who hit their first page and left (perhaps too frustrated to stay). “Conversion Rate” tells us that only 2.04% of sessions resulted in purchases, which is relatively low. Information like this helps you decide whether, say, you should start focusing on making your website more mobile-friendly.
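For clarity, here's the arithmetic behind those two metrics as a minimal sketch. The session counts are hypothetical; only the 2.04% conversion rate comes from the report above:

```python
# Hypothetical counts; plug in the figures from your own Analytics report.
sessions = 10_000     # total sessions in the period
bounced = 6_500       # sessions that left from their landing page
purchases = 204       # sessions that ended in a purchase

bounce_rate = bounced / sessions * 100
conversion_rate = purchases / sessions * 100

print(f"Bounce rate:     {bounce_rate:.2f}%")      # 65.00%
print(f"Conversion rate: {conversion_rate:.2f}%")  # 2.04%
```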

You can also group your data into traffic types: organic, direct, referral, and social. This tells you where your streams of traffic are coming from, as in the sketch below.
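If you export that data, grouping it yourself is a one-liner. Here's a minimal sketch with made-up session records; the real ones would come from your Analytics export:

```python
from collections import Counter

# Made-up session records standing in for an Analytics export.
sessions = [
    {"channel": "organic"}, {"channel": "organic"}, {"channel": "organic"},
    {"channel": "direct"}, {"channel": "referral"}, {"channel": "organic"},
]

by_channel = Counter(s["channel"] for s in sessions)
total = sum(by_channel.values())
for channel, count in by_channel.most_common():
    print(f"{channel:>9}: {count}/{total} ({count / total:.0%})")
```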

Most of Cooper Lace's traffic comes from organic searches; no one was linking to it from social media or any other secondary source. A heightened social media presence would therefore help drive traffic to the site.

A last interesting piece from Google Analytics indicates which parts of the website were common landing pages:

In this case “/Historic-Nottingham-Lace-Curtains” was the most important landing page after the homepage. In rebuilding the site, you might redirect traffic from those more successful links to their closest analogs in your new version.
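Here's a minimal sketch of that redirect step, assuming an Apache server. Only the old “/Historic-Nottingham-Lace-Curtains” path comes from the report above; the new path is a hypothetical placeholder:

```python
# Map the old site's strongest landing pages to their closest analogs
# on the rebuilt site. The new path here is a hypothetical placeholder.
redirects = {
    "/Historic-Nottingham-Lace-Curtains":
        "/product-category/lace-curtains/nottingham-lace-curtains/",
}

# Emit Apache mod_alias rules for .htaccess; adapt for nginx if needed.
for old, new in redirects.items():
    print(f"Redirect 301 {old} {new}")
```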

Screaming Frog

Screaming Frog crawls your site and checks for duplicate content, bad meta tags, and other things that can lower your SEO ranking. The meta descriptions in this example (the short blurbs you see under each result as you browse Google search results) were overly long and being cut off. This leads to a lower click-through rate, since partial descriptions are nonsensical and don't catch people's eyes. There was also duplicate content, which automatically lowers SEO. Interestingly, the website had been updated from http to https a couple of years ago, but both addresses still seemed to be functioning. The current webmaster did a workaround by canonicalizing the https version of each page, but having both up can still be detrimental to SEO.
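Screaming Frog runs this check for you, but as a rough sketch of what it's looking for, here's a hand-rolled version using the requests and BeautifulSoup libraries (both my choice, not part of Screaming Frog; the 155-character cutoff approximates where Google truncates):

```python
import requests
from bs4 import BeautifulSoup

# Rough cutoff; Google truncates descriptions at around 155-160 characters.
MAX_DESCRIPTION = 155

def check_meta_description(url):
    """Fetch a page and flag a missing or overly long meta description."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    if tag is None or not tag.get("content"):
        return f"{url}: no meta description"
    length = len(tag["content"])
    if length > MAX_DESCRIPTION:
        return f"{url}: {length} chars (will be cut off in results)"
    return f"{url}: OK ({length} chars)"

# Hypothetical usage:
# print(check_meta_description("https://www.cottagelace.com/"))
```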

WTH’s wrong with our h2’s

This analysis tells us that 100% of the site's h2s were repetitive and non-descriptive. When h2s don't relate to the content on the page, crawlers can't accurately map the site and rank pages. And lo, on every page the h2 indeed read “Free U.S. Ground Shipping!”, a phrase that doesn't actually allude to the page's content.
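To see how such a check might work outside Screaming Frog, here's a minimal sketch that tallies h2 text across a list of pages; an h2 that shows up on every page, like the shipping banner above, floats straight to the top:

```python
from collections import Counter

import requests
from bs4 import BeautifulSoup

def h2_counts(urls):
    """Collect the text of every h2 across a set of pages and count repeats."""
    counts = Counter()
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        counts.update(h2.get_text(strip=True) for h2 in soup.find_all("h2"))
    return counts

# Hypothetical usage: any h2 appearing on (nearly) every page is a red flag.
# for text, n in h2_counts(page_urls).most_common(5):
#     print(n, repr(text))
```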

It was also interesting to note from these analytics that the robots.txt file, found in the top-level directory of most sites, was preventing the blog and a few other pages from being crawled:

Robots.txt blocking

Perhaps the previous webmaster didn't want people visiting these places and forgot to unblock them. A rebuild might focus on much clearer URL layouts, h2 tags that describe each section, and the removal of blockages of this sort, so that “dead weight” (unvisited pages) stops dragging SEO down.
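You can verify exactly what a robots.txt blocks without a full crawl; Python's standard-library robotparser answers per-path questions. The /blog/ path below is a hypothetical stand-in for the blocked pages:

```python
from urllib.robotparser import RobotFileParser

# Point the parser at the live robots.txt; /blog/ is a hypothetical
# stand-in for one of the blocked pages mentioned above.
parser = RobotFileParser("https://www.cottagelace.com/robots.txt")
parser.read()

for path in ("/", "/blog/"):
    url = "https://www.cottagelace.com" + path
    verdict = "crawlable" if parser.can_fetch("Googlebot", url) else "blocked"
    print(f"{path} -> {verdict}")
```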

I've covered just a few great diagnostic tools that can help you get an understanding of your website's searchability. As every website is different in terms of history, structure, and god knows what else, your own findings will doubtless be vastly different from mine. All you can do is research what you see and come up with an overall picture of your current SEO situation that can inform what steps to take next.


