Deciphering Your SEO Plan

With so many industry-specific terms in online marketing, it's not surprising that business owners of every size get lost in the confusion. If you find yourself looking over an SEO proposal and can't make sense of it, then this article is for you. The articles, checklists and how-tos written to guide you through do-it-yourself search engine optimization won't be of any help either if you aren't sure what they mean. Let's go over some technical SEO phrases and what they mean in layman's terms, as well as what duration you should expect for the work (i.e., whether it is one-time or ongoing).

Optimization of robots.txt

This is an important step in your initial optimization process. Robots.txt is the web protocol for marking which pages you want the search engines to read or not read. Why wouldn't you want a search engine to read some of your content? A primary reason is duplicate content. If you have pages that you feel are important for usability (UX, or user experience), but that are mostly comprised of content duplicated from other parts of your site, especially in a copy-and-paste manner, then you may want to keep the search engines from reading the duplicate pages. This is specifically important for ranking, since the Panda updates hit websites hard for duplicated content. Other pages that are often blocked include private pages, such as your login pages, to minimize the possibility of brute-force hacking.
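To give you a point of reference, here is a minimal robots.txt sketch; the paths are hypothetical placeholders, but the directive syntax is the standard one:

    # Apply these rules to all crawlers
    User-agent: *
    Disallow: /print-versions/   # hypothetical duplicate, print-friendly copies
    Disallow: /wp-login.php      # login page; no reason for engines to read it

    # Point crawlers at the sitemap (covered in the next section)
    Sitemap: https://www.example.com/sitemap.xml

One nuance worth knowing: robots.txt blocks crawling. If you want a page crawled but kept out of search results, that is instead handled with a noindex meta tag on the page itself.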

Creation & Registration of sitemap.xml & Setup of Website Sitemap

Back in the days of Web 1.0, webmasters, business owners and optimizers had to manually submit websites to search engines for crawling and indexing. As technology has progressed, so has web management and coding. Sitemap.xml is the file created and used to speed up the process for Google and Yahoo (and most likely more search engines as time goes on) to crawl your site, which is especially helpful for new websites. When Google's Caffeine update was released, it changed how search engines crawl your website. Basically, sitemap.xml tells the search engines what pages are there to look at and examine. It may not seem terribly important, but this is how you ensure that content that is 3, 4, 5+ clicks deep gets analyzed for rankings. Most modern CMSs such as WordPress, Magento, etc. have plugins that will automatically generate these, making it a one-time setup. If not, they must be created manually, which requires ongoing work.
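So you can recognize what your developer or plugin is actually producing, here is a stripped-down sitemap.xml sketch; the URLs and dates are placeholders, but the structure follows the sitemaps.org protocol:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want crawled -->
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2015-06-01</lastmod>
      </url>
      <!-- Deep pages benefit the most from being listed -->
      <url>
        <loc>https://www.example.com/services/deep-page</loc>
        <lastmod>2015-05-15</lastmod>
      </url>
    </urlset>

Once this file exists, it gets registered with the search engines (for example, through Google Webmaster Tools) and referenced from robots.txt as shown earlier.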

Information Architecture Audit

Ever been on a website where you couldn't seem to find your way around? That is likely due to subpar information architecture, which is part of UX/UI (user experience/user interface). The way your content is organized is crucial to your online success. Not only does it affect whether your customers find what they are looking for, which affects your conversion rate, but UX/UI is also factored into your rankings indirectly (I will explain this more in-depth in another article). The primary area that an information architecture audit covers is your navigation. Audits typically do not include making physical changes; rather, they produce a list of suggestions. Usually these are done when you first set up your website, when you change marketing companies, during a website redesign, or during conversion optimization consultations.

There are many ways to conduct these audits. Typically, an initial audit may rely only on "best practices" knowledge, while subsequent audits may draw on data from user-tracking software such as heat mapping, Google Analytics, and multi-touch analytics.

Canonicalization Analysis

Sometimes the same page can be accessed from more than one URL. This is most common with ecommerce sites, or with frequently-asked-questions and glossary pages that use anchored links for quick navigation. It can also occur when tracking codes and user IDs are included in the web address/URL. The most common example is www.example.com versus example.com. While this may still sound complicated, most CMSs have a plugin to automate it; otherwise it has to be done manually.
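Under the hood, canonicalization usually means adding a rel="canonical" link tag to the duplicate versions of a page, pointing at the one URL you want ranked. A minimal sketch, using a hypothetical tracking-code duplicate:

    <!-- On https://www.example.com/shoes?utm_source=newsletter (the duplicate) -->
    <head>
      <!-- Tell search engines which version of the page should be indexed -->
      <link rel="canonical" href="https://www.example.com/shoes" />
    </head>

A plugin simply generates this tag for every page automatically, which is why canonicalization is usually a one-time setup rather than ongoing work.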

Rel=”publisher” Implementation

When Google released an update giving additional weight to authorship, rel="publisher" became more important to many SEOs. Many CMSs have a plugin to handle authorship, so it only takes one extra click to add or change an author. While brands use rel="publisher", individuals use rel="author". Brand authorship (rel="publisher") only needs to be added once, on the homepage; individual authorship (rel="author") should be added on blog articles. Again, this is a place where SEOs enjoy the benefits of plugins on many mainstream CMSs to cut down on coding time. While it is good for this to be used, the work generally only involves a one-time setup plus checking a box with each article that is posted.
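If your plugin isn't handling it for you, the tags themselves are just link elements pointing at the relevant Google+ pages; a sketch with placeholder profile URLs:

    <!-- On the homepage: tie the site to the brand's Google+ page -->
    <link rel="publisher" href="https://plus.google.com/+ExampleBrand" />

    <!-- On a blog article: credit the individual author's Google+ profile -->
    <link rel="author" href="https://plus.google.com/+JaneDoeExample" />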

Have more questions? Check back for future posts covering more online marketing terminology, or contact us by phone or email. We are happy to help you compare SEO plans, or to audit your website and provide plan suggestions.