Google's testing new SERP layout

So there’s been talk about a new interface Google is testing. It focuses on removing the underline from search result links and making the titles a little bigger. Above is an example of the new Google SERP interface (the left is me signed in, the right is incognito mode).

So I finally experienced it and I thought it would be helpful to list some of the changes I see:


The ad changes. This isn’t a new one, but it’s still a difference. Before, Google advertisements had a light beige background color. A fun fact – as Google’s ad program evolved, that background color slowly became more translucent and harder to see (also depending on your OS and monitor). After some legal action, however, Google was told to make its ads distinct. Their solution was more of a loophole: taking away the essentially branded background color and replacing it with a small “Ad” label in a prominent color. Ha.

No more underlines in Google SERPs. Well then. Cleaner display. I kinda like it. Doesn’t seem to change too much on the SEO front though. We’ll see how long it takes Bing to change their fonts.

Bigger title fonts highlight both the advertisements and the natural search results. This is more interesting. It attracts the eye a little more and will probably require a call to action in each search result. We’ll see how this plays out.

Fewer characters showing due to the bigger fonts on title tags. This will be interesting because most SEOs try to make the most out of their pages with their optimized tags. The more interesting part, though, is

the new Google mobile SERP display with bigger fonts. Mobile devices already get the short end of the stick on title tags in Google. With bigger fonts, this could decrease the length of title tags even more than on desktop, leaving little room for a call to action or a long-tail keyword.

How does this affect Ecommerce SEO?

Ecommerce websites are very difficult to scale. So when Google changes its interface to lessen character counts or change font sizes, it could potentially throw a bunch of an SEO’s hard work out the door.

The problem is that you lose a couple of characters depending on the device you are using. This isn’t too big of a deal, but in rare, long-tail cases it can affect how your title tag looks.

A solution might be to remove calls to action from your title tags, or to target slightly shorter-tail key phrases. Me though? Keep things the way they are and trust that consumers can understand what our results mean and why they should click through 🙂
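If you want a rough sense of where a title will be cut off, a quick sketch helps. Note this is a simplification assuming a flat ~55-character budget; Google’s actual cutoff is pixel-width based, which is exactly why these font changes matter:

```python
TITLE_LIMIT = 55  # assumed character budget; the real SERP cutoff is pixel-based

def display_title(title, limit=TITLE_LIMIT):
    """Approximate how a title tag might render in the SERP, adding '...' when truncated."""
    if len(title) <= limit:
        return title
    # cut at the limit, trim any dangling space, then add the ellipsis
    return title[:limit - 3].rstrip() + "..."
```

Run your most important title tags through something like this before and after a font change to spot which ones lose their call to action.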


This isn’t going to affect SEO much, so it doesn’t matter too much there. But it does make the page look much cleaner and more modern. The biggest effect is the loss of a few characters on both mobile and desktop devices.

I would suggest keeping your current meta tag strategy going for now. Unless Google decides to slowly up their font to size 18 Arial. Then I would probably start using Bing. Oh please don’t let me use Bing.

Time for everyone to go secure – https?

In October of 2011 the search king Google went secure.
In 2012 – 2013 Safari, Mozilla, and Bing did the same.
In January 2014 Yahoo finally joined the club, making 99% of the world’s digital information completely secure.

Well, that’s awesome! But not for SEOs. Secure searches mean keyword data and referral data go out the door, and that leaves us in the dark with a flashlight trying to find our way. With Yahoo going secure, they’ve taken away our batteries and now we only have an empty tool to defend ourselves. Nevertheless, we will prevail!

I work for an enterprise-level ecommerce website, and the news of losing data is a big loss. It’s hard because our SEO numbers have been trending up without us paying much attention to Yahoo.

So why does this matter you ask?

Well, ~6% of our traffic comes from Yahoo, and with the new secure search, our Yahoo referral data will become direct load (according to Omniture). In short, direct load is someone coming directly to our website without coming through any marketing channel. When someone types the exact domain address into the browser’s address bar, that equals direct load.
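The mechanics behind that are simple: per standard browser behavior, the Referer header is dropped when navigating from an HTTPS page to a plain-HTTP destination, so analytics sees no referrer and buckets the visit as direct. A minimal sketch of that rule:

```python
def referrer_passed(source_scheme, dest_scheme):
    """Browsers drop the Referer header when going from HTTPS to plain HTTP."""
    return not (source_scheme == "https" and dest_scheme == "http")

def traffic_channel(referrer):
    """How an analytics tool (roughly) buckets a visit: no referrer means direct load."""
    return "referral" if referrer else "direct load"
```

So a secure Yahoo search landing on an http:// site arrives with no referrer at all and gets counted as direct load.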

This means we lose 6% of our SEO traffic to direct load by the time Yahoo goes 100% secure (aimed at March 2014). The problem is that in high-level reporting, direct load is such a large number that attributing 6% of our traffic to it doesn’t look like much of a dent. When you get more granular, you notice a little dip in our allocated traffic, which might not look too good.

It’s ok don’t panic. I’ve got a solution.

And I found it in Danny Sullivan’s post here. I knew about most of the information before reading the article, but there was one quote/tweet thread I found very useful.

By the way, when it comes to searches that lead to secure servers, Yahoo appears to be following standard protocol and passing along full-referrers. However, as most sites are not secure sites, most publishers won’t receive this information.

Bam. There it is. By making your own website secure (https), Yahoo will pass referral data because it follows standard secure server protocols. The Twitter thread that explains most of it is located here.
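If you want to go that route, the server-side change is a permanent redirect of all plain-HTTP traffic to HTTPS. A minimal nginx sketch, assuming example.com as a stand-in domain and a certificate already installed (paths are hypothetical):

```nginx
# redirect all plain-HTTP requests to the secure version of the site
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}

server {
    listen 443 ssl;
    server_name example.com www.example.com;
    ssl_certificate     /etc/ssl/example.com.crt;  # hypothetical cert paths
    ssl_certificate_key /etc/ssl/example.com.key;
    # ...rest of the site config...
}
```

The 301 also tells search engines the https version is the canonical one.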

So there you have it. Search referral data becoming direct load can be offset by making your own website secure. There might even be some good PR around it (that’s a shout-out to my old manager) 🙂

If you’d like to find out more feel free to contact me, or tweet me. I’m here all day folks

Where did my Google+ picture on the search results go?

So I’ve done a terrible job of keeping up with this website. In fact, I didn’t even notice my G+ picture was gone from my SERP! Picture for reference:

Andrew Liwanag no Google+ Picture

Knowledge graph was still there.

Google+ URLs in search results still had me pasted on there.

What happened to my Authorship?!

The rel=“author” markup is clearly in my view source, and the G+ page is linked to my website. Maybe since I took a new friendly URL it unlinked from my account. We’ll see if that test works out.

I’ll update a little later.


found g+

So I figured out it happened right around the time I changed my G+ URL to “/+andrewliwanag30”. I guess the rel= didn’t transfer over, and once I changed it to the root URL it fixed itself. Bam!

2013 Search Ranking Factors by Moz

Listing this as a resource for myself, from Moz’s 2013 ranking factors study.

Added on stuff… (these are negatively correlated factors):

  • Page Contains Google+ Authorship Markup: -0.03
  • # of Google AdSense Slots in the Page: -0.03
  • Domain has Numbers
  • # of Hyphens in Domain Name: -0.03
  • Total Area of AdSense Slots on Page: -0.04
  • # of Characters in the Title: -0.04
  • URL Contains Hyphens: -0.04
  • Total Length of the Full Domain
  • URL Length in Characters: -0.10
  • Response Time of Page in Seconds: -0.10

more added on stuff…

Domain Level Anchor Text

These features describe anchor text metrics, both partial and exact match, about the root domain hosting the page. For example, for a given page A, these features cover anchor text links pointing anywhere on the root domain, not just to page A.

Over the past two years, we’ve seen Google crack down on over-optimized anchor text. Despite this, anchor text correlations for both partial and exact match remained quite large across our data set.

Domain Level Brand Metrics

These features describe elements of the root domain that indicate qualities of branding and brand metrics.

For this study we tracked domain name mentions in Fresh Web Explorer. The correlations for mentions are relatively high, falling between 0.17 and 0.20 for mentions of the full domain name.

Domain Level Keyword Agnostic

These features relate to the entire root domain, but don’t directly describe link or keyword-based elements. Instead, they relate to things like the length of the domain name in characters.

Although none of these factors were highly significant, we did find a negative correlation of -0.09 with the length of the domain name.

Domain Level Keyword Usage

These features cover how keywords are used in the root or subdomain name and how much impact this might have on search engine rankings.

The ranking ability of exact- and partial-match domains (EMD/PMD) has been heavily debated by SEOs recently, and it appears Google is still adjusting their ranking ability. In our data, we found EMD correlations to be relatively high at 0.16 and as high as 0.20 if the EMD is also a dot-com.

Domain Link Authority Features

These features describe link metrics about the root domain hosting the page (e.g., for a given page A, these features cover links pointing anywhere on the root domain, not just to page A).

As in 2011, metrics that capture a diversity of link sources (C-blocks, IPs, domains) have high correlations. At the domain/subdomain level, subdomain correlations are larger than domain correlations.

Page Level Anchor Text

These features describe anchor text metrics—both partial- and exact-match—to the individual page (e.g., number of partial-match anchor text links, exact-match links).

Despite Google cracking down on over-optimized anchor text, we found high correlations with both partial and exact match anchor text to the URL, with a 0.29 correlation with the number of root domains linking to the page with partial match anchor text.

Page Level Keyword Agnostic

These elements describe non-keyword usage and non-link metrics features of individual pages such as length of the page, and load speed.

This year’s data showed an interesting negative correlation (-0.10) to page response time.

Page Level Keyword Usage

These features describe use of the keyword term/phrase in particular parts of the HTML code on the page such as the title element, H1s, alt attributes, and more.

The data measures the relationship between the keyword and the document, both with the TF-IDF score and the language model score. We found that the title tag, the body of the HTML, the meta description, and the H1 tags all had relatively high correlations.
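For reference, TF-IDF is just term frequency weighted by how rare the term is across a corpus. A minimal sketch over tokenized documents (not Moz’s actual scoring code):

```python
import math

def tf_idf(term, doc, corpus):
    """Score how characteristic `term` is of `doc`, relative to the whole corpus.

    `doc` and each corpus entry are lists of lowercase tokens.
    """
    tf = doc.count(term) / len(doc)            # term frequency within this document
    df = sum(1 for d in corpus if term in d)   # number of documents containing the term
    if df == 0:
        return 0.0
    return tf * math.log(len(corpus) / df)     # weight by rarity across the corpus
```

A keyword that appears often on one page but rarely elsewhere scores high, which is the intuition behind measuring keyword-document relevance this way.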

Page Level Social Metrics

These features relate to third-party metrics from social media sources such as Facebook, Twitter, and Google+ for the ranking page.

Social signals were some of our highest correlated factors, with Google+ edging out Facebook and Twitter.

Page Link Authority Features

These features describe link metrics to the individual ranking page such as number of links and MozRank.

Page Authority is a machine learning model inside our Mozscape index that predicts ranking ability from links and, at 0.39, it is the highest correlated factor in our study.

What Is Search Engine Optimization?

Search engine optimization (SEO) is the process of applying best practices and strategies to increase your website’s health, relevance, and organic search ranking (organic and natural mean the same thing). The top search engines in the world (Google, Bing) handle around 2 trillion searches a year (2,000,000,000,000!), which works out to roughly 285 searches per person worldwide (7,000,000,000) per year. Now that’s a market to dive into. What’s organic search, though?


Organic Search Results
The top red box is the search query; the bottom red box contains the natural results.

People search for thousands of different topics on search engines. From trying to find out how old George Clooney is, to where you want to eat the next day – search engines will provide you with on-the-go information. For further simplicity and the sake of not having to explain the thousands of different queries a search engine sees, we’ll focus on shopping & information.

When was the last time you bought something online without doing research? Why not? All it takes is typing in your question and Google will find it for you. Simple. Think of something you’ve been wanting to buy for a long time. Like that brand new computer – but you don’t know anything about computers? Well, type it into Google and watch the magic happen.


  1. The search query you were interested in
  2. The advertisements that companies bid on when you search that query
  3. Google Product Listing Ads
  4. Organic search result #1 for your search query
  5. Google Maps for your search query
  6. Trending News for your search query
  7. More organic search results
  8. Local companies that have the query you searched for
  9. An onslaught of product pages, companies, reviews, and articles
  10. The right rail paid search ads

Keep in mind there are over 70 types of search results that vary depending on the search query. Here is a small sample of 8 of the most prominent I’ve seen. Search engine optimization comes into play by making your website both user-friendly and search-engine friendly. This is done through on-page optimizations, off-page optimizations, and technical implementations.

From local listings to ecommerce websites, reviews and information to when Star Wars VII is coming out, search engines have changed the way we find information, and I fully expect the trend to continue. Looking back at the past 10 years of Google’s search queries, the trend has grown well into the trillions. And SEO will grow as long as the market grows.

If you’d like to learn more about Ecommerce SEO or give me your version of what Search engine optimization is – feel free to contact me here. Always stay connected.


SEO vs Design: The Ongoing Battle


SEO and design are now part of the same team. They both want click-through rates, site traffic, calls to action, and increased sales. They both rejoice in their success when numbers are up, and blame each other when numbers are down.

Sounds a lot like two rival siblings doesn’t it?

How convenient, then, that I have an extremely talented brother who is passionate about web design and development, while I am equally aggressive in my own field of SEO. How fitting a metaphor is that?


When a designer gets to work based purely on his own perspective of the consumer, everything looks pretty and charismatic. It captures the eye of the people that find it. The keyword (pun intended) here is “people that find it.” I’m not saying you’ve done a bad job; quite frankly, I am not nearly as creative or talented as some of the designs I’ve seen him create.

Creepy Web Crawlers

They crawl their way into your system. They sneak around without you knowing. You never feel them coming. They are evil. Click here to be frightened beyond belief.


Alright, that was blown just a bit out of proportion, and has nothing to do with this post. I just hate spiders. Recently, I met with a co-worker to discuss how we were going to attack a specific category, SEO-wise. I had nearly finished the entire conversation when she finally said, “You keep saying spiders are on my webpage. I haven’t seen any, but there were some cobwebs behind my computer.”

From her perspective I can understand the confusion; I didn’t explain what they do, so I was at fault here. Making robot spiders a reality just never seemed to cross my mind (and I now realize how terrifying that sounds). After shuddering at that visual, I explained what spider crawlers are as follows: crawlers are bots that sweep through a website in the blink of an eye, feeding information to search engines to give your website relevance.

What Does It Do?

A web crawler scans through websites, providing search engines with a large amount of information. It tracks links, content, site architecture, and relevance to your website’s target. This directly influences how you are ranked on search engines, making crawlers a very important part of organic search. That being said, websites are built for great usability, not to appeal to crawlers. Optimizers make websites more crawler-friendly and help eliminate bad practices that crawlers see as violations.

How Do They Work?
Crawlers are basically information gatherers. They are constantly surfing the web in search of new information and are extremely up to date. They gather a website’s information through internal and external links, <meta> tags, <h1> tags, <title> tags, and, most importantly, content. They can also see how popular websites are to gain a better understanding of relevant and user-friendly content.
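To make that concrete, here’s a toy sketch of the kind of extraction a crawler does, pulling the title, h1s, and links out of a page with Python’s standard-library HTML parser (real spiders are vastly more sophisticated):

```python
from html.parser import HTMLParser

class CrawlerParser(HTMLParser):
    """Collects the tags a search-engine spider cares about: title, h1s, and links."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1s = []
        self.links = []
        self._stack = []  # open tags, so we know which element text belongs to

    def handle_starttag(self, tag, attrs):
        self._stack.append(tag)
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        if self._stack:
            if self._stack[-1] == "title":
                self.title += data
            elif self._stack[-1] == "h1":
                self.h1s.append(data.strip())
```

Feed it a page’s HTML and it gives you the same skeleton of signals the text above describes.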

Do I Need Them?

Yes, spiders suck. I believe both species have come to an understanding: we don’t like them, they don’t like us. Unlike those terrifying 8-legged creatures, website crawlers are a good thing. If you want to be indexed and increase your organic traffic, being crawled is a positive.

I Hate Spiders Too, Can I Step on Them?

This was actually asked of me, and this was my exact response: “Yes, but if you have the technology to virtually place your foot onto a program and crush it, I would like more information for a potential investment.” In actuality, a robots.txt file gives crawlers directions for the website. If you don’t want them sneaking around, there are specific instructions you can place in your robots.txt; for more information and an instructions list, click here.
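For reference, a robots.txt is just a plain text file at the root of your site. A small hypothetical example (made-up paths and domain) that keeps crawlers out of a couple of sections and points them at your sitemap:

```
User-agent: *
Disallow: /admin/
Disallow: /checkout/

Sitemap: https://www.example.com/sitemap.xml
```

Polite crawlers honor these rules, but keep in mind robots.txt is a request, not a lock.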

Now that you have a better understanding of crawlers, you know how search engines gather information about your website. They check tags and content, as well as links. So be sure to constantly monitor your Google Webmaster Tools and analytics to see if you are being penalized or have duplicate content that Google frowns upon.



If you have any more questions, feel free to comment. Or if you have a great spider-destroying story, email me directly; I would love to hear it.

Proper Website Hierarchy Structure

I finished my first week at Company X. (I just realized I never asked permission to blog about them, and I don’t want a billion-dollar company suing me; that doesn’t seem efficient. Call me naive, or too cautious, but I’d rather play it safe.) I’ll give you some background information on the situation so you aren’t lost throughout this post. Awesome Company is an enormous ecommerce site. We have no brick-and-mortar retail stores; it is purely a catalog, print, and online business. It’s a billion-dollar company and organizes its hierarchy by the standard Supercategory > Category > Department > Class > Skuset > Sku. Here is a visual example of the company’s hierarchy.


Example of Hierarchy – Credit to Invisible Children

WAIT! Don’t leave yet! Yes, this seems entirely off topic, but it shows a great visual of ecommerce hierarchy. It also gives you awareness of what Invisible Children is running: an inspiring video that Jason Russell created, which went viral and raised awareness through social media. But that’s another story. Let’s break it down.

SuperCategory Example

Supercategory is the most general description of a section. For example, “Office Supplies” is a supercategory that gives you a generalized overview of what that section will hold. For a visual:
Category Example
Category is slightly more descriptive. This gives the user a way to distinguish by more categorical choices, i.e. filing supplies, filing storage, office accessories, dry erase boards.

Department/Class separates the categories further. Yes, it’s confusing; think of these two as taking a pile of oranges and separating them according to size and/or tint of orange. Very confusing. For all intents and purposes, and to make life easier, let’s imagine these are part of the category.
SkuSet Example
Skuset is a narrowly grouped set of items. This is where branded items become more specific.
Sku Example
Sku is the main selling page. The user gets all the information on the product, including reviews, pricing, other options, etc.

Expand this into thousands and thousands of webpages built through an ecommerce hierarchy and you get something like this:
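As a toy sketch, the same hierarchy can be modeled as nested data; the office-supply names and sku codes below are hypothetical, not Company X’s real catalog:

```python
# Supercategory > Category > Department/Class > Skuset > Sku, as nested dicts
catalog = {
    "Office Supplies": {                      # supercategory
        "Filing Supplies": {                  # category
            "Folders": {                      # department/class
                "Hanging File Folders": [     # skuset
                    "HF-100-LTR-GRN",         # skus: the actual sellable items
                    "HF-100-LGL-GRN",
                ],
            },
        },
    },
}

def count_skus(node):
    """Walk the tree and count every sellable sku at the leaves."""
    if isinstance(node, list):
        return len(node)
    return sum(count_skus(child) for child in node.values())
```

Scale those dicts out to thousands of branches and you have the shape of an enterprise ecommerce site.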

Now that you have a small insight into an ecommerce site, it would be a great idea to go through what I’ve learned in my short time here and the process it’s going to take to get there. To do this, we can put it in the form of a simple case study (without the end results, for now).

Problem (so far)

Awesome Company was very far behind in its SEO practices. In a market dominated by big names like Office Max, Office Depot, Staples, and Amazon, the company maintains its competitive edge through strong customer service and experience in the industry. This would be our value proposition, I suppose. Just this year, SEO began to really register in the minds of the executives, and they gave SEO a budget alongside paid advertising, display advertising, and affiliates. This makes up the digital marketing team. So this should be a breeze, eh?

Wrong. So wrong.

The amount of things that could use optimization varied from basic SEO practices, such as creating a large master keyword list and tracking it, to advanced practices like information architecture and site architecture. Believe it or not, the architecture of this site nearly gave me a stroke because it was so much to take in. Ok, that was a little hyperbolic, but the site needed more help than I can fully list.

Solution (for now)

Currently I have been put on keyword research duty. Yay. The current list they have is over 9,000 words long and isn’t organized at all. So my amazingly boring job of organizing the keywords into supercategories and categories will take up most of my time. Naturally, to make this much less boring, I’m going to blog in the middle of it.


So I understand I didn’t provide too much insight into my time here, but I did give a visual of an ecommerce hierarchy. A company this large has over 40,000 pages, so understanding how it is organized is a vital starting point. Now onto more keyword research/organization. Yay!

8 Steps to Reviving Penguin Slapped Websites

April 24th. Many of you know this day as the day machines took over the world and a hero by the name of John Con… that doesn’t sound right.

The Penguin update may not have caused the world to be overrun by highly advanced robots that are always advancing their technology through complex equations and formulas and *gasp* constantly creating Arnold look-a-likes! However, the panic that spread throughout the digital marketing world was felt by many businesses, large and small. There was nowhere to hide.

This is how I battled the Penguin… & Won

A little history on myself before we dive into the process I went through over the past month and a half. I am admittedly green in this subject and have taught myself everything I know about SEO since January 2012. Shortly after, I grabbed a job and turned into a sponge, gathering the information and strategies that internet marketers have been using for the past 10 years. Obviously the digital marketing world has changed quite a bit.

Enter me, a new breed of Search Engine Specialist who has read numerous articles and applied what he has learned to multiple websites, thanks to the breakthrough I received in January. I found success using basic SEO practices (thank you, SEOmoz) as well as experimenting with more complex and difficult strategies (boo, SEOmoz – kidding). So, with a small amount of experience and a lot of knowledge waiting to be exerted, I took on the Penguin with my bare hands… and a laptop. Bear with me, people; I’m still in learning mode. This is how I regenerated the leads lost to the Penguin update, in 8 steps.

Google’s Penguin Strikes First

May 1st – Nearly a week after the Penguin update, my boss asked me why there were no leads. First thought: blame it on the sales people. Kidding. In fact, he told me that leads had been decreasing for the past week. Interesting. Immediately I started researching, and sure enough I found the multiple “freak-out” articles online. (Keep in mind these websites were made and optimized before I even started working here.) I’ve also provided a great way of visualizing how I felt. Here we go.

Problem 1

I started sifting through the 50+ old corporate websites’ analytics and noticed an immediate downfall on nearly every website.

Problem 2

The first 5 websites I looked at hadn’t been touched in 4-8 years but were productive in gathering leads. The websites still had 4:3 formatting, and I do not want to talk about the anchor text at the bottom of each website.

Problem 3

Google Webmasters was verified on only 25% of them.

Problem 4

The ones that did have webmasters verified had 70 errors each.

Problem 5

The websites were stuffed with anchor text, with spam links pointing across all of the websites.

This was approximately 30 minutes into auditing 4 of the websites. There were more errors, but for the sake of not having an anxiety attack or going back into the deep depression cave that was my life for the past month and a half, I must move on.

Planning the SEO Retaliation

After meeting with our senior marketing manager, I came up with a plan to hopefully resurrect the lead channels lost after the Penguin update. The bulk of my strategy was to focus on going back to the basics of great SEO. Bear with me as I take on the mighty Penguin.
As a much less complicated way of describing our plan, here is a summary of what I needed to accomplish:

The Problem: Total Leads had significantly dropped creating a hole in our projected revenue stream.

The Solution(s): Technical audit; New sitemaps; Consolidate Website onto one webmaster/analytics account; Fix Webmasters Errors; Fix Analytics; Remove Spam; Implement Basic SEO; Re-audit.

The Results: Below!

After creating an initial pre-report from Webmasters and a custom Excel sheet to keep track of my journey to the center of the SEO, I finally stood armed and prepared to take on the latest Google update. We will go through this process step by step.

Step One: Check if the websites are functional

Yes, a simple step. Yet these sites had not been checked on in years, so I wasn’t surprised to find that many of them were down. Everything is run internally, so after I provided a list of the websites that were down, they could be rebooted on our hosting servers.

Step Two: Make sure contact forms are working

Another trivial step, but necessary. It turns out 3 of the forms were actually broken. The comeback just got about 3% easier.

Step Three: Create a consolidated account

Consolidate all corporate websites into one account, meaning all of the websites’ Analytics and Webmasters accounts should be put on one account to stay organized for the rest of the process. This also makes future problems much more manageable, as opposed to needing to sign into different accounts just to maintain the same set of websites. Pardon the blanks; I didn’t have approval to show the URLs, so maintaining their privacy is the most logical solution. This is also just the top of the page; the list goes on, and on, and on… and on.

Step Four: Create up-to-date sitemaps

Surprisingly, half of the websites did not have a sitemap to begin with. The ones that did were far out of date, and after creating a new one for each website I was not surprised to find nearly double the number of pages indexed. I generated these sitemaps with a free online sitemap generator.

For steps on how to add a new sitemap to your Webmasters account, click here.
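For reference, a minimal sitemap is just an XML file listing your URLs, one entry per page. A sketch with a made-up domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2012-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/contact/</loc>
  </url>
</urlset>
```

Generator tools just crawl your site and emit one &lt;url&gt; entry per page they find, which is why a fresh sitemap surfaced so many un-indexed pages.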

Step Five: Remove spammy links

Every website was somewhat spammy, especially when checking the backlinks along with the anchor texts. The websites were following old optimization tactics, and the Penguin was not having it. Since there were so many websites and no include files, this meant going to every single page of every old website. Oh my. Considering this was a rush job, to keep it cost-effective I made sure I familiarized myself with the most important webpages and the keywords they ranked for. Nearly all of the targeted keywords had already been slapped by Penguin and were nowhere to be seen on Google.



Step Six: Fix error codes

Every website had errors in it. Using Screaming Frog and Webmasters, I found broken links and anchor text spamming throughout each website individually. Tedious. The old websites did not have PHP include files, so this was not a fun step. All of the links are now nofollowed or fixed. Screaming Frog is a very useful tool for technical audits, and I recommend it for your webmaster needs.

This is just a sample of the errors I cut into one picture. This was 1/3 of only 1 of the websites I was working with.

This was after. The website was an SMS marketing website I was in charge of. I got approval to use this as an example.
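The core of that audit step is mechanical: fetch every internal link and flag anything that comes back 4xx/5xx. A stripped-down sketch of the classification step (tools like Screaming Frog handle the crawling and fetching for you):

```python
def broken_links(crawl_results):
    """Given {url: http_status} from a crawl, return only the URLs needing fixes.

    4xx means the link itself is broken; 5xx means the server is having trouble.
    """
    return {url: status for url, status in crawl_results.items() if status >= 400}
```

Feed it a crawl export and you get the fix-it list for one website, repeated fifty-some times in my case.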

Step Seven: Go back to basic SEO

Ah, basic SEO. Fixing duplicate content and re-optimizing the most important webpages toward newly researched keywords and long-tails. This was the most time-consuming part of the process, but it felt great to look at my new Webmasters report and see “no errors” for every website. This also was not a fun step. Sorry for not having a picture for this either; the websites have already been cleaned up with some fresh content and new targeted keywords.

After all of this work, I waited a week to check the results. I pulled a new Webmasters & Analytics report as well (some of the sites hadn’t been crawled yet, so some waiting is still involved). However, noticeable changes had already begun to appear. Leads were slowly increasing, and with every new crawl from Google our leads increased further; however, it’s very doubtful they will ever return to what they once were. There were too many websites to completely rectify every one; it would not be cost-effective.

Keywords have begun to come back. In highly competitive markets such as text message marketing, textsmsmarketing has come back to the second page of Google SERPs.

Others have begun to rise again, and more examples will be shown later. Our leads haven’t reached pre-Penguin levels and probably won’t ever reach that point again, but a simple campaign of auditing and basic SEO has proven successful for older websites that needed revamping.

*Revision – the spammy footer was not added by me; I had gotten rid of those. However, it would be interesting to see if Google begins to dock our page rankings.*

Bear with me, I am still rather new to this. However, the lesson learned is that going back to the basics of SEO and simplifying your material to look more natural, as opposed to specifically attacking certain keywords, gave me a slight advantage in the battle against the Penguin. Other factors could have been involved, but the stressful hours spent on websites that had been pronounced dead seem to have paid off.

Thanks for reading one of my first professional posts. If you have any suggestions or comments I would be grateful to hear them. Or if you have any stories you would like to share, I would love to hear them!


How To SEO: 4 Common SEO Mistakes

Beginner SEO specialists make plenty of trial-and-error moves at the beginning of their careers. Here are some mistakes to think about as you grow in experience, myself included.

1. Obsessive Compulsive Onsite SEO Disorder.

Most of the time, SEO specialists tend to be very strict and detailed with their onsite keyword relevance. This is the first thing most campaigns begin with: when a campaign starts, the specialist will gut the keywording of whatever site they are optimizing. This is not necessarily a bad thing. However, most people think this will completely solve their SERP woes. That is not the case.

While onsite keyword relevance is always a large part of an SEO campaign, it is only half the battle. Offsite linkbuilding is just as important as keywording. Google rates both the quality and quantity of backlinks to rank your site higher on its results page. If you are running a campaign, make sure time is spent on keywording as well as linkbuilding.
Key Takeaways:

  • Onsite SEO won’t solve rankings
  • Offsite linkbuilding is half the battle

2. Focusing Too Much On The Homepage

The homepage is not your savior in website form. Don’t religiously optimize your homepage. While it’s true that it’s most likely the highest-ranking page of your website, your other pages will suffer in rankings. Build more landing pages for other parts of your website. Focusing on more optimized pages gives you a wider spectrum for crawling. Stop glorifying your homepage; give your other pages some attention.
Key Takeaways:

  • Build landing pages throughout your site
  • Focus on different keyphrases

3. Ignoring Analytics

Analytics. A statistician’s wet dream. Most people who specialize in internet marketing refer to analytics as a form of measurement for their campaigns. However, most people ignore the signs analytics shows them. Ignoring analytics is like doing homework without receiving a grade: it renders your campaign pointless. It takes the 6 months of work you did (with either positive or negative results) optimizing your site and puts it in the shitter. You won’t have a clue where you stand, or what your next goals are. Please. Don’t. Ignore. Analytics.
Key Takeaways

  • Set up your goals, funnels, and ecommerce
  • Make a schedule for checking your stats

4. Patience

You are in this for the long haul. Make sure your boss knows this. SEO takes time; changes don’t happen overnight. Stay confident in your changes and let Google do its thing. If, over a span of 3-6 months, there are still no changes, it’s back to the drawing board. But chances are you will see changes as long as you keep the website live and healthy.
Key takeaways

  • Be confident in your changes
  • Give it time