Website Audit And Crawl

Understanding SEO Website Audits and Crawl

The connection between Search Engine Optimization (SEO) audits and crawling is simple: to perform a comprehensive website audit from an SEO perspective, you first need to understand how search engines find, comprehend, and rank your website. In an SEO audit, we look for issues or errors that may get in the way of crawlers. In short, a website audit is an assessment of a site's ability to appear and rank in search engine results pages (SERPs). To carry out this assessment, we need to consider the different website elements that determine how a site is crawled. First things first, we need to understand what crawling is and how it works!

Crawling is the process of continuously searching for and discovering pages that exist on the web. It's how search engines identify websites, either by following a link from an already crawled page to a new one or with the help of a website owner who manually submits a sitemap. The next step is analyzing, rendering, and trying to understand the content of the web page in order to add it to the search engine's index, a process known as indexation. Once a page is indexed and ready to go, the search engine then tries to determine its relevance to search queries, based on many factors, and serve it to searchers appropriately.

What is the goal behind all of this? The goal is to improve your website's crawlability, helping search engine spiders understand your content cohesively and adequately, which in turn enhances your SEO performance and search rankings. How? Either by manually auditing your website or with the help of one of the many tools that simulate a search engine crawler. Our focus here will be on manual site audits with the help of Google Search Console.

Why are we only really looking at Google?

As you can see, we're focusing solely on Google and its well-known web crawler, Googlebot. You may wonder whether other search engines should be taken into consideration when embarking on an audit. The truth is, more than 90% of online searches take place on Google and its subsidiary features, so it only makes sense to focus your optimization efforts there. So let's start exploring some of the essential SEO elements that are picked up by crawlers, and how to optimize them.

Mobile-First Indexing and User Experience

Most people search on Google using their mobile devices; so how does that affect your ranking performance? As part of its continuous effort to reflect user behaviour trends, Google treats a mobile-friendly experience as a ranking factor. This brings us to mobile-first indexing, meaning that Google predominantly uses the mobile version of your site for indexing and ranking. Your site's mobile experience, as a result, becomes critical to your online presence. Briefly, here are the main elements you need to consider when auditing your site's mobile experience:

Mobile Navigation: With limited space and small screen sizes, mobile navigation needs to be direct and straightforward. Consider eliminating elements that are not high-priority, and limit your mobile navigation to four to six items at the top level. If your website requires multi-level navigation, which is prevalent in e-commerce websites, keep it as simple as possible. Avoid adding more than one sublevel of dropdown functionality, or a navigation design that requires horizontal scrolling.

 

Intuitive experience: Make sure your mobile navigation is intuitive; use common sense. Craft well-written menu language that sets the user's expectation of its contents if it's a dropdown, or of where it leads if it's a link. For multilingual websites, keep language preference options clear and accessible. Also, use conventional symbols to keep things simple and short, like a magnifying glass in place of the word "search", or a hamburger-style menu icon (three stacked lines).

 

Full Functionality: Mobile users should be able to access the same functionality and content that they access on the desktop version. Similarly, make sure that images and videos on your site are embedded and available on mobile.

 

Resources and Crawlability: Make sure your robots.txt file is not blocking access to your site's resources (CSS, JavaScript, or images). If Googlebot is not allowed access to a page's resources, it may not be able to detect that the page is designed to display well on mobile browsers. If your page is not recognized as mobile-friendly, Google may refrain from serving it to users searching on mobile.
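As an illustration, a robots.txt file along these lines keeps rendering resources open to crawlers; the directory paths are hypothetical placeholders, not a prescription for your site:

```text
User-agent: Googlebot
# Explicitly allow the resources Googlebot needs to render the page
Allow: /assets/css/
Allow: /assets/js/
Allow: /images/

# Avoid broad rules like the one below, which would block those same
# resources and prevent Googlebot from seeing the mobile layout:
# Disallow: /assets/
```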

 

To check whether your site is mobile-ready, run it through the Mobile-Friendly Test. If the results are unsatisfactory, the tool will help you identify the specific issues and even suggest additional resources for learning how to fix them. You can also test code, not only URLs, which is vital for developers who want to catch issues before going live. Also, check the Search Console Mobile Usability report to fix mobile usability issues that could be affecting your website.

SERP Visibility

There are several reasons why a site might not be crawled or indexed. It could be a recently launched website that crawl bots simply haven't reached yet. It could also be the result of a badly designed and structured layout that hinders both users and search engine bots, and a site that is poorly linked from other sites may also go uncrawled. Additionally, it could be due to an error that occurred when Google tried to crawl your site, or because your policy is blocking crawl bots. How do you know whether your website is indexed? Easy: enter your domain's URL with "site:" before it in the search bar, like so: "site:yoursiteurl.com", and the search results will show all of your indexed website pages.

Google gives you several options to control the visibility of your site's pages to crawlers. You can help it by submitting a sitemap directly, or you can instruct it to disregard certain web pages for any reason you may have. A sitemap is a file on your site that notifies search engines of any new or updated pages. Another file essential to crawlers is "robots.txt", also known as the robots exclusion protocol: the file that enables you to instruct crawlers on how to crawl and index your site. The file should be placed in the root directory of a website, and Google Search Console provides a user-friendly robots.txt generator that can help you create it. You can use this method for pages that are unnecessary in SERPs, duplicate content, soft error pages, etc.
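Tying these pieces together, a minimal robots.txt served from the site root might look like the sketch below; the disallowed paths and sitemap URL are hypothetical examples:

```text
# Served from https://yoursiteurl.com/robots.txt (placeholder domain)
User-agent: *
# Keep low-value pages out of crawlers' way
Disallow: /cart/
Disallow: /internal-search/

# Point crawlers to the sitemap file
Sitemap: https://yoursiteurl.com/sitemap.xml
```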


Improve page title tags, meta descriptions, and heading tags

The next step in the audit process is on-page elements: title tags, meta descriptions, and heading tags. A title tag communicates to users and search engines the main subject of a particular page, and should be placed in the head section of the page's HTML. Each page of your site should have a unique title that expresses its main topic and accurately describes its content. Always aim for a title that is both communicative and smooth to read. The homepage title should briefly describe your business, concisely articulate your value proposition, and highlight important information. You can include a call to action if needed, and of course, insert some well-researched relevant keywords.

Description meta tags give you more space to summarize what the page is about; they can be a sentence or two describing the content of the page they belong to. Like title tags, description meta tags should appear in the head section of your HTML document and should best represent the URL's content. Avoid using identical or similar descriptions for all of your website's pages; instead, focus on creating unique ones for each page, at least for high-priority pages like your homepage and other popular pages on your site.
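Putting the title tag and meta description together, a homepage head section might look like the sketch below; the business name and copy are hypothetical placeholders:

```html
<head>
  <!-- Unique, descriptive title stating the business and value proposition -->
  <title>Acme Plumbing | 24/7 Emergency Plumbers in Dubai</title>
  <!-- One or two sentences summarizing the page, unique to this URL -->
  <meta name="description"
        content="Acme Plumbing offers licensed 24/7 emergency plumbing across Dubai. Get a free quote online or call for same-day service.">
</head>
```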

As for heading tags, they are noticeably larger than other text and emphasize important parts of the page. They should help users understand the content below them and give it a logical structure that makes it easier to navigate. To use heading tags properly, think of them as an outline consisting of main points and sub-points. And it goes without saying: keep away from stuffing unneeded keywords into your title tags, descriptions, and headings. Just keep your title tags informative and straightforward, your meta descriptions neat and descriptive, and your headings well structured!
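As a sketch, the outline principle looks like this in HTML; the page titles are hypothetical, and the indentation is purely visual (headings are not actually nested in the markup):

```html
<!-- One h1 for the main topic, h2s for main points, h3s for sub-points -->
<h1>A Beginner's Guide to Website Audits</h1>
  <h2>Why Crawlability Matters</h2>
    <h3>How Googlebot Discovers Pages</h3>
  <h2>On-Page Elements to Review</h2>
    <h3>Title Tags and Meta Descriptions</h3>
```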

Content Optimization and User Signals

Website content is a major contributing factor when it comes to both user experience and ranking. Increasing organic traffic and establishing your website's reputation with both users and Google is strongly linked to the quality of your content. User signals like dwell time, bounce rate, and CTR (click-through rate) are influenced by content quality. When sent to search engine bots, these user signals can critically affect your position in SERPs. So, if you don't already have a content strategy in place, consider these points when creating one:

  • Cater your content to users’ needs. The goal is to know what your readers search for and provide it to them. This could be done with the help of your Keyword Research data or simply through using Google’s Keyword Planner to look for relevant topics and keyword ideas and create authentic, informative, and relevant content around them. In fact, you can focus your optimization efforts on the top search queries your site appears for using Search Console’s Performance Report tool. It shows you top search queries and the ones that drove the most traffic to your site.
  • Provide an adequate amount of content. When creating content, you need to consider both quality and quantity. There’s no ideal length to which you need to conform; which takes us to the first point again, what are your users’ needs? Generally, the longer the content is, the more room you have to back it up with substance, evidence, and facts. That being said, make sure you avoid being redundant for the sake of publishing lengthy articles or blogs.
  • Ensure proper keyword density and usage. This can't be said enough: refrain from keyword stuffing. Use keywords naturally and logically; for example, use semantically related keywords instead of repeating the same terms over and over. And avoid inserting unnecessary keywords aimed at Googlebot that make no sense to users.
