The 5-Point Technical SEO Checklist for Bloggers

1. Crawl your website with Screaming Frog and clean up your untidy URL structure

● Start by crawling your website with Screaming Frog (the free version is limited to 500 URLs).
● The tool fetches key on-site elements, surfacing duplicate page titles, duplicate meta descriptions and your URL structure.
● To get started, enter your root domain in the blank box and press the Start button.
● Processing time depends on the size of your website; the progress bar at the top shows how far along the crawl is.
● Once the crawl is done, start with the 'Response Codes' tab. It shows the HTTP status codes of your URLs.
● You should also cross-check your website's crawl errors by logging into your Google Search Console account. Here's a guide on fixing various kinds of crawl errors.
● Next, check the tool's URL tab and sort the column by length.
● A good URL is short (4 to 5 words) and describes the content behind it.
● Spot unnaturally long URLs and URLs with non-ASCII characters, both of which can cause indexation problems.
● Screaming Frog has filters to surface such URLs at the click of a button.
● Ensuring a compelling, punchy title, appropriate image sizes and a persuasive meta description is part of on-page SEO.
● The last and most critical feature of the tool is creating an XML sitemap – a roadmap that shows search engines how your content is organized.
● You'll find the feature in the top navigation bar under Sitemaps > Create XML Sitemap.
● A large website can have multiple sitemaps, because a single XML sitemap can't contain more than 50,000 URLs or be larger than 50MB.
● Once created, submit the sitemap in Google Search Console under "Crawl >> Sitemaps >> Add/Test Sitemap."
● It's also advisable to create an HTML sitemap that users can read (it helps search engines, too).
● If you're on WordPress, you can use Yoast's SEO plugin to create an HTML sitemap.
● In 2007, the major search engines, including Google and Yahoo, agreed on a system of sitemap auto-discovery, so you don't have to submit your sitemap to every search engine individually.
● Instead, declare your sitemap's location in your website's robots.txt file (see the sketch after this list). This article explains how to create a robots.txt file that includes your sitemap location.
● Pro Tip: Silo your website content to help search engines understand what your website is about. That means breaking your content into easily understandable categories, with every page accessible within 3-4 clicks.
● Pages buried deeper than 4 clicks may have difficulty getting crawled, so use the right filters and faceted navigation.
● Done well, this lets visitors narrow down millions of results to the essential few in a couple of clicks.
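Here's a minimal robots.txt sketch showing the sitemap declaration alongside a permissive crawl rule. The domain and file name are placeholders, so substitute your site's actual sitemap URL:

```
# Allow all crawlers to access the entire site
User-agent: *
Disallow:

# Sitemap auto-discovery: crawlers that read robots.txt will find this
# (placeholder URL - point it at your real sitemap)
Sitemap: https://www.example.com/sitemap.xml
```

The Sitemap line must be an absolute URL, and you can list several Sitemap lines if your site splits its URLs across multiple sitemap files.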
2. Properly implement redirects and fix your broken links to optimize your crawl budget

● Let's start by understanding how to use the correct type of redirect. Many webmasters confuse the two:
❏ 301 – a permanent redirect that passes 90-99% of the original page's authority.
❏ 302 – a temporary redirect that tells search engines the original page will be restored soon.
● If you use a 302 redirect where a 301 belongs, you'll confuse search engines. And if the 302 stays in place for a while, you'll lose some traffic.
● As a rule of thumb: if you no longer want a page on your website, 301 redirect it to a relevant, updated one (see the server-config sketch after this list).
● You should also pay special attention to pages that return a 404 error code (found through Google Webmaster Tools and Screaming Frog in point 1).
● These might be pages that:
❏ have been moved to new locations,
❏ were incorrectly linked to by other webmasters,
❏ had their URLs deleted from your website.
● The bottom line is that these broken URLs waste Google's time on non-existent content. How?
● Google allocates a crawl budget to every website based on its PageRank (an internal metric used by Google's search team, not the public-facing score).
● According to Botify's log analyzer, the crawl ratio is 81% for smaller websites but only 11% for large ones.
● So if you waste your limited budget on links that don't even exist, the bandwidth left for scanning your important URLs shrinks.
● Either create a user-friendly 404 page that shows your brand's personality,
● or 301 redirect the 404 URL to a relevant page on your website to preserve link juice.
● To ensure Google crawls your website's most important URLs, you need meaningful, user-friendly internal linking.
● Use keyword anchor text naturally and channel your users (as well as bots) to other relevant pages within an article.
● You'll build keyword relevancy and vastly improve your website's crawlability.
● To see which pages are the most important (based on internal links), check the internal links report inside Google Search Console.
● Read more tips for crawl budget optimization here.
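To make the 301/302 distinction concrete, here's a minimal sketch in Apache's .htaccess syntax. The paths and domain are hypothetical placeholders, and other servers (nginx, IIS) use their own directives:

```
# Permanent move: passes most of the old page's authority.
# Use this for pages you've retired or merged into newer content.
Redirect 301 /old-post/ https://www.example.com/updated-post/

# Temporary move: tells search engines the original URL will return,
# so they keep the old URL in their index.
Redirect 302 /summer-sale/ https://www.example.com/holding-page/
```

If you'd rather not touch server config, WordPress users can manage the same rules through a redirect-manager plugin such as Redirection.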
3. Remove junk from your code and ensure your site loads fast

● Start by diagnosing the issues plaguing your website with Google's PageSpeed Insights tool.
● Just plug your URL into the tool and hit the "Analyze" button.
● A score above 80 is fine, but you should still consider fixing the issues the tool reports.
● If you want the exact time it takes to load your website, go to Pingdom and run their speed test.
● You'll get your website's load time along with the page size, performance grade and the number of requests your website sends.
● If your website takes 7+ seconds to load, work through the tool's recommendations.
● Start by installing CloudFlare – a free Content Delivery Network. It can save 60% of your bandwidth and substantially reduce the number of requests your website sends.
● Next, WordPress users can install a cache plugin like W3 Total Cache.
● Note that improving your website's load speed on mobile and on desktop involves different steps.
● Since a great mobile experience is equally important, check out some tips for speeding up your mobile website in this mobile landing page article.

4. Resolve duplicate and thin content issues

● Start by scanning your website with Siteliner. It will crawl up to 250 pages and show you the duplicate content, sorted by percentage match.
● You can also use Google Search Console to check for duplicate issues.
● Head over to "Search Appearance >> HTML Improvements" and click on the numbers to see the specific duplicate content cases.
● Depending on your situation, you can take any of the following 3 steps to resolve duplicate content issues:
1. Implement "noindex" tags – If you're facing duplicate content issues due to pagination, set the meta robots tag value to "noindex, follow". This tells search engines to crawl the links on the specified page but leave the page itself out of their index.
2. Implement a rel="canonical" tag in the <head> of the duplicate page – This is a way of showing Google the preferred version of your page. It resolves the unintentional duplicate content created by session IDs, affiliate codes and URL parameters. Here are 3 URLs that take a visitor to the same page but might get indexed as separate pages and lead to a penalty:
❏ neilpatel.com/2015/09/27/13-secrets-thatll-boost-your-facebook-organic-reach/
❏ neilpatel.com/2015/09/27/13-secrets-thatll-boost-your-facebook-organic-reach/?sessionid=5349312
❏ neilpatel.com/2015/09/27/13-secrets-thatll-boost-your-facebook-organic-reach/?ref=email
3. Delete or reduce the quantity of duplicate content – This is the simplest solution, if it's possible for you to implement.

5. Get the edge with rich snippets in SERPs

● Get started with Google's Structured Data Markup Helper. It takes you step by step through the process of tagging data on a URL (a sketch of the markup it produces follows below).
● For more depth, go through the detailed schema markup tutorial at KISSmetrics.
● If you're on WordPress, the Schema Creator plugin can also help you fill in some of the values.
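For a sense of what these tools generate, here's a minimal JSON-LD sketch marking up a blog post as a schema.org Article. Every value is a placeholder (the headline borrows the example post above), and the properties you tag will depend on your content type:

```html
<!-- Place anywhere in the page's HTML; search engines read it on crawl -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "13 Secrets That'll Boost Your Facebook Organic Reach",
  "author": {
    "@type": "Person",
    "name": "Jane Blogger"
  },
  "datePublished": "2015-09-27",
  "image": "https://www.example.com/images/facebook-reach.jpg"
}
</script>
```

Once it's in place, run the page through Google's structured data testing tools to confirm the markup parses before expecting rich snippets to show up in the SERPs.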