Best Practices: SEO for Website Redesign & Migration

Contents

1. Why You Should Read This Guide
2. Common Obstacles to Avoid
3. Website Structure
4. Keyword Research
5. Meta Information
6. Body Content
7. Internal Site Linking
8. URL Equity
9. The Elements of URL Equity
10. Assessing URL Equity
11. The Consequences of Redesigning Without a URL Strategy
12. Migrating URL Equity
13. Summary
14. Appendix 1
15. Appendix 2
16. About Investis Digital

1. Why You Should Read This Guide

Best Practices: SEO for Website Redesign & Migration outlines organic search optimization best practices for a website redesign, as well as factors to consider in order to maintain URL equity during a domain or platform migration. This guide illustrates the common pitfalls that you can avoid in the redesign phase of a website, making it possible for a site to gain better visibility within search engine results.

Additionally, Best Practices: SEO for Website Redesign & Migration explains the importance of setting up all aspects of a website correctly, including directory structure, file names, page content, and internal linking. Finally, we illustrate case study examples of successful site redesigns.

Reading this guide will set you up for SEO success when you undergo a website redesign. The guide will help you avoid costly errors and gain more traffic, leading to valuable conversions. We discuss some required tasks when a migration occurs, such as managing URL equity properly. But because migrations and redesigns often happen at the same time (rarely does one happen without the other), we also discuss some fundamental actions that you need to take with a site redesign. Those include doing effective keyword research and writing compelling body copy.

To do a deep dive into SEO for website redesign and migration, contact Investis Digital. We're here to help.

2. Common Obstacles to Avoid

Before you plan any website redesign, it's essential that you first understand the obstacles that could sabotage your efforts, especially in making your content findable by search engines. Today's web development technology and website platforms take advantage of visually appealing designs that may be attractive to visitors with high-speed connections and fast computers. However, these designs are not always compatible with older computers, slower connection speeds, or simplistic search engine spider technology. In addition, robust content management systems and analytical enhancements can severely hinder visibility for core keywords, leaving money on the table. Beware of these "gotchas":

01. JavaScript Navigation: Search engine spiders have a hard time executing JavaScript. Website content that is accessible only through these forms of navigation is not likely to be indexed. Additionally, such links do not contribute to the site's overall link popularity. Most JavaScript navigation can be replaced with CSS or DHTML (see the navigation sketch after this list).
02. Dynamic URLs: Dynamic URLs are a common side effect of most content management systems. These URLs are problematic because they provide little to no information about the page itself. URLs play an important role in search engine rankings and should therefore be rewritten to include relevant keywords (see the Website Structure section of this document).

03. Session IDs: Using session IDs in your URL strings will confuse search engine spiders. Each time the spider visits your page, it receives a new ID (and therefore a new URL); this will be seen as duplicate content, and those pages will be dropped from the index. In addition, session IDs severely reduce link popularity. Filter them out by identifying search spiders (Googlebot) via user-agent detection and removing the IDs.

04. Deep Folder Structures: Much like dynamic URLs, unnecessary folders provide little to no value for the search engines. Directory structures should be kept one to two folders deep (if possible), with your most important pages on the top level. If additional folders cannot be removed, name them accordingly.

05. Non-Permanent Redirects: If a redirect of a page is required and is going to be permanent, use a permanent 301 server-side redirect (a minimal server-configuration sketch follows this list). This not only points search engines in the right direction, it also transfers the existing link equity of the older page to the newer one. Typically, all other types of redirects should be avoided, including 302 redirects, JavaScript redirects, and META refreshes.

06. User-Action Requirements: Search engine spiders are not capable of completing user actions to access information. This includes information gated behind age verification, ZIP code entry, or the use of an internal search box. Instead, give search engine spiders direct access to this information via text-based hyperlinks. If necessary, user-agent detection can allow spiders to bypass the user action.

07. Broken Links: Broken links send a message to the search engines that your website is being updated too infrequently or is badly maintained. It is important to check your website for broken links on a regular basis using tools such as Screaming Frog.

08. Redirect Chains: A redirect chain occurs when there is more than one redirect between the initial URL and the destination URL. This creates crawl limitations while also reducing any chance of equity transfer.

09. Content in Images: Search engine spiders cannot index valuable content embedded within an image file. In most cases, you can create a similar effect using plain text, style sheets, and absolute positioning. Solutions such as sIFR can also be considered.

10. Lack of Body Content: Some form of relevant content should be on all pages. Ideally, each page should have at least one small paragraph of unique content supporting your target keyword phrases. If content blocks are not an option, you must ensure that your tier-one and tier-two pages use text-based headers (<h1>).

11. Robots.txt File: Search engine spiders use this file to understand which pages on your website they can access. Pages with valuable content should not be inadvertently blocked in this file. To allow all search engine spiders access to all areas of your website, simply place the following two lines into a text file, put it on the root level of your domain, and name it robots.txt:

User-agent: *
Disallow:

12. Incorrect Robots Meta Tag: The noindex and nofollow META values prevent a page from being indexed and its links from being followed, respectively. Below is an example of the tag:

<meta name="robots" content="noindex,nofollow">
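The JavaScript navigation issue in item 01 is usually solved by putting the links themselves in plain HTML and handling the menu behavior in CSS. The sketch below is illustrative only; the URLs and class names are hypothetical placeholders, not taken from this guide.

  <nav>
    <ul class="main-nav">
      <li>
        <a href="/products/">Products</a>
        <ul class="sub-nav">
          <li><a href="/products/outerwear/">Outerwear</a></li>
          <li><a href="/products/workwear/">Workwear</a></li>
        </ul>
      </li>
      <li><a href="/about/">About Us</a></li>
    </ul>
  </nav>

  <style>
    /* Sub-menus stay hidden until the parent item is hovered; no script runs,
       so every link remains visible to spiders in the page source. */
    .main-nav .sub-nav { display: none; }
    .main-nav li:hover .sub-nav { display: block; }
  </style>

Because each destination is an ordinary <a href> link, spiders can follow it and it contributes to the site's internal link popularity.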
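Items 05 and 08 both come down to issuing a single permanent, server-side redirect from the old URL straight to its final destination. Below is a minimal sketch for Apache (.htaccess) and the NGINX equivalent; the paths and domain are hypothetical and should be replaced with your own.

  # Apache .htaccess: one permanent (301) hop from the retired URL to its replacement
  Redirect 301 /old-products/widgets.html https://www.example.com/products/widgets/

  # NGINX: the same rule expressed in the server block
  location = /old-products/widgets.html {
      return 301 https://www.example.com/products/widgets/;
  }

Always point a retired URL at the live destination page rather than at another redirect; chaining hops (old to interim to new) creates the crawl limitations and equity loss described in item 08.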
3. Website Structure

It's critical to consider several key components when laying out the structure of your website. The decisions you make about the naming conventions of your folders and files, and the way in which you point to specific pages of your website, can have a huge impact on overall traffic and sales. It is important to build various optimization elements into the initial structure of the website.

Best practices for website structure:

• Folders (or directories) should contain relevant keywords based on the theme of each section (see the Keyword Research section below). Be as descriptive as possible, as these naming conventions should mimic your site's navigational structure. Example: for an "Outerwear" company, the navigational structure of the website might look something like: https://redkap.com/Products/Category/36

• Each web page file name should include relevant keywords, based on the unique theme of that page. Ideally, the URL structure should mimic your site's navigational structure.

• Break up keyword phrases with dashes (-) in file and folder names. Search engines treat these as a space between two keywords.

• Create a clean and easy-to-understand global navigational structure. Ensure that your navigational elements are as descriptive as possible and are text-based. Breadcrumbs also make your pages visible to spiders; enable them on all levels of the website if possible.

• Use canonical tags, and decide whether you want visitors to see www.domain.com or domain.com. The version you do not want to use should be permanently (301) redirected to the other (usually the "www." version is best); a minimal sketch appears after the Keyword Research section below.

• Place all JavaScript and CSS in external files.

4. Keyword Research

Search engines require the presence of keywords in links, copy, and other page elements, including URLs, as mentioned above, in order for pages to become visible in the search results for a given query. Over time, the increased frequency of these phrases throughout the site will effectively increase webpage visibility across the major search engines.

Prior to development (folder names, navigational elements, etc.), determine specific keywords to target within the various sections and pages of your website. With a better understanding of your users' search behavior, you'll be able to incorporate relevant high-volume keywords into your site's structure from the beginning. Here are some recommendations to help you understand effective keyword selection.

Guidelines for selecting effective keywords:

• Start with a general topic and work on derivatives of related terms and phrases from there.

• Experiment with popular keyword research tools like Google AdWords, Moz, or SEMrush.

• Consider the terms you would use if searching for this information.

• Be aware that consumers may search for information using different terms than those used within the organization.

• Target one to three keywords or phrases per webpage.

• If you have access to server log files or site analytics, look there first to determine how people are finding your site.
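Returning to the canonical-domain bullet in the Website Structure section above, the sketch below shows one way the two pieces can work together, assuming the "www." host is the preferred version. The domain, page path, and rewrite rules are hypothetical and will vary by server; adapt them to your own configuration.

  <!-- In the <head> of every page, declare the preferred URL -->
  <link rel="canonical" href="https://www.domain.com/current-page/">

  # Apache .htaccess (mod_rewrite): permanently send the bare domain to the "www." version
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
  RewriteRule ^(.*)$ https://www.domain.com/$1 [R=301,L]

With both in place, search engines can consolidate equity on a single version of each URL, regardless of which host a visitor or link uses.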