Brand Values and the Bottom Line


Contents

1. Why You Should Read This Guide
2. Common Obstacles to Avoid
3. Website Structure
4. Keyword Research
5. Meta Information
6. Body Content
7. Internal Site Linking
8. URL Equity
9. The Elements of URL Equity
10. Assessing URL Equity
11. The Consequences of Redesigning Without a URL Strategy
12. Migrating URL Equity
13. Summary
14. Appendix 1
15. Appendix 2
16. About Investis Digital

1. Why You Should Read This Guide

Best Practices: SEO for Website Redesign & Migration outlines organic search optimization best practices for a website redesign, as well as the factors to consider in order to maintain URL equity during a domain or platform migration. This guide illustrates the common pitfalls that you can avoid in the redesign phase of a website, making it possible for a site to gain better visibility within search engine results. Additionally, it explains the importance of setting up all aspects of a website correctly, including directory structure, file names, page content, and internal linking. Finally, we illustrate case study examples of successful site redesigns.

Reading this guide will set you up for SEO success when you undergo a website redesign. It will help you avoid costly errors and gain more traffic, leading to valuable conversions. We discuss some required tasks when a migration occurs, such as managing URL equity properly. But because migrations and redesigns often happen at the same time (rarely does one happen without the other), we also discuss some fundamental actions that you need to take with a site redesign. Those include doing effective keyword research and writing compelling body copy.

To do a deep dive into SEO for website redesign and migration, contact Investis Digital. We're here to help.
2. Common Obstacles to Avoid

Before you plan any website redesign, it's essential that you first understand the obstacles that could sabotage your efforts, especially in making your content findable by search engines. Today's web development technology and website platforms take advantage of visually appealing designs that may be attractive to visitors with high-speed connections and fast computers. However, these designs are not compatible with older computers, slower connection speeds, or simplistic search engine spider technology. In addition, robust content management systems and analytical enhancements can severely hinder visibility for core keywords, leaving money on the table. Beware of these "gotchas":

01. JavaScript Navigation: Search engine spiders have a hard time executing JavaScript. Website content that is accessible only by these forms of navigation is not likely to be indexed. Additionally, such links do not contribute to the site's overall link popularity. Most JavaScript navigation can be replaced with CSS or DHTML.

02. Dynamic URLs: Dynamic URLs are a common side effect of most content management systems. These URLs are problematic because they provide little to no information about the page itself. URLs play an important role in search engine rankings and should therefore be rewritten to include relevant keywords (see the Website Structure section of this document).

03. Session IDs: Using session IDs in your URL strings will confuse search engine spiders. Each time the spider visits your page, it receives a new ID (and thus a new URL); this will be seen as duplicate content, and those pages will be dropped from the index. In addition, session IDs severely reduce link popularity. To filter them out, identify search spiders (such as Googlebot) via user-agent detection and remove the IDs.

04. Deep Folder Structures: Much like dynamic URLs, unnecessary folders provide little to no value for the search engines. Directory structures should be kept one to two folders deep (if possible), with your most important pages on the top level. If additional folders cannot be removed, name them accordingly.

05. Non-Permanent Redirects: If a redirect of a page is required and is going to be permanent, use a permanent 301 server-side redirect. This not only points search engines in the right direction, it transfers the existing link equity of the older page to the newer one. Typically, all other types of redirects should be avoided, including 302 redirects, JavaScript redirects, and META refreshes.

06. User-Action Requirements: Search engine spiders are not capable of completing user actions to access information. This includes information gated behind age verification, zip codes, or the use of an internal search box. Instead, give search engine spiders direct access to this information via text-based hyperlinks. If necessary, user-agent detection can allow spiders to bypass the user action.

07. Broken Links: Broken links send a message to the search engines that your website is being updated too infrequently or is badly maintained. It is important that you check your website for broken links on a regular basis using tools such as Screaming Frog.

08. Lack of Body Content: Some form of relevant content should be on all pages. Ideally, each page should have at least one small paragraph of unique content supporting your target keyword phrases. If content blocks are not an option, you must ensure that your tier-one and tier-two level pages utilize text-based headers (<h1>).
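The regular broken-link audit recommended above can also be scripted between full crawls. The sketch below uses only Python's standard library; the sample HTML and any URLs you feed it are placeholders, and it is a minimal illustration rather than a substitute for a crawler such as Screaming Frog.

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str) -> list:
    """Return every anchor href found in a page's HTML."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def link_status(url: str) -> int:
    """HTTP status for a link; 0 if the host is unreachable."""
    try:
        with urlopen(Request(url, method="HEAD")) as response:
            return response.status
    except HTTPError as err:
        return err.code  # e.g. 404 for a broken link
    except URLError:
        return 0

# Offline demonstration on a fragment of page HTML; against a live site
# you would fetch each page and call link_status() on every extracted URL,
# flagging anything that returns 404 or 0.
sample = '<a href="/products/">Products</a> <a href="/contact/">Contact</a>'
print(extract_links(sample))
```

Run on a schedule, a script like this surfaces broken internal links between redesigns, which is exactly the maintenance signal search engines look for.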
09. Redirect Chains: A redirect chain occurs when there is more than one redirect between the initial URL and the destination URL. This creates crawl limitations while also reducing any chance of equity transfer.

10. Content in Images: Search engine spiders cannot index valuable content embedded within an image file. In most cases, you can create a similar effect using plain text, style sheets, and absolute positioning. Solutions such as sIFR can also be considered.

11. Robots.txt File: Search engine spiders use this file to understand what pages on your website they can access. Pages with valuable content should not be inadvertently blocked in this file. To allow all search engine spiders access to all areas of your website, place the following two lines into a text file, put it on the root level of your domain, and name it robots.txt:

User-agent: *
Disallow:

12. Incorrect Robots Meta Tag: The use of the noindex or nofollow robots META tag will prevent pages of your website from being indexed. Below is an example of the tag:

<meta name="robots" content="noindex,nofollow">

3. Website Structure

It's critical to consider several key components when laying out the structure of your website. The decisions you make about the naming conventions of your folders and files, and the way in which you point to specific pages of your website, can have a huge impact on overall traffic and sales.

Best practices for website structure:

• Folders (or directories) should contain relevant keywords based on the theme of each section (see the Keyword Research section). Be as descriptive as possible, as these naming conventions should mimic your site's navigational structure.
It is important to build various optimization elements into the initial structure of the website:

• Each web page file name should include relevant keywords, based on the unique theme of that page. Ideally, the URL structure should mimic your site's navigational structure.

• Break up keyword phrases with dashes (-) in file and folder names. Search engines treat a dash as a space between two keywords.

• Create a clean and easy-to-understand global navigational structure. Ensure that your navigational elements are as descriptive as possible, and are text-based. Breadcrumbs also make your pages visible to spiders; enable them on all levels of the website if possible.

• Use canonical tags, and decide whether you want visitors to see www.domain.com or domain.com. The version you do not want to utilize should be permanently (301) redirected to the other (usually the "www." version is best).

• Place all JavaScript and CSS in external files.

Example: Having an "Outerwear" category on your website, the navigational structure should look something like: https://redkap.com/Products/Category/36

4. Keyword Research

Search engines require the presence of keywords in links, copy, and other page elements, including URLs, in order for pages to become visible in the search results for a given query. Over time, the increased frequency of these phrases throughout the site will effectively increase webpage visibility across the major search engines.

Prior to development (folder names, navigational elements, etc.), determine the specific keywords to target within the various sections and pages of your website. With a better understanding of your users' search behavior, you'll be able to incorporate relevant high-volume keywords into your site's structure from the beginning. Here are some recommendations to help you with effective keyword selection.

Guidelines to selecting effective keywords:

• Start with a general topic and work on derivatives of related terms and phrases from there.

• Experiment with popular keyword research tools like Google AdWords, Moz, or SEMrush.

• Consider the terms you would use if searching for this information.

• Be aware that consumers may search for information using different terms than those used within the organization.

• Target one to three keywords or phrases per webpage.

• If you have access to server log files or site analytics, look there first to determine how people
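The naming conventions above (descriptive, keyword-based folder and file names, with dashes as word separators) are easy to enforce in a build or CMS script. Here is a minimal sketch in Python; the keyword phrases in the demonstration are hypothetical.

```python
import re

def slugify(phrase: str) -> str:
    """Turn a keyword phrase into a dash-separated folder or file name:
    lowercase, with every run of non-alphanumeric characters collapsed
    into a single dash, since search engines read dashes as word breaks."""
    slug = phrase.strip().lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # punctuation and spaces become dashes
    return slug.strip("-")

# Hypothetical keyword phrases mapped to URL path segments:
print(slugify("Flame-Resistant Outerwear"))   # flame-resistant-outerwear
print(slugify("Men's Work Jackets (2019)"))   # men-s-work-jackets-2019
```

Generating path segments this way keeps the URL structure mirroring the keyword themes you chose during research, instead of leaving naming to ad-hoc editorial choices.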