How to Delay Loading of Non Critical CSS to Increase Mobile Site Speed

March 29th, 2015

It's a good idea to delay loading of non-critical CSS to improve mobile page speed. You probably use lots of CSS files to style your site, and this…
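
As a rough illustration of the technique the excerpt describes, here is a minimal sketch that defers a stylesheet until after first render; the path styles/non-critical.css is a hypothetical placeholder, and the critical rules are assumed to be inlined in the page head:

// Minimal sketch (TypeScript): inject non-critical CSS after the page loads,
// so it never blocks the initial render. The file path is hypothetical.
window.addEventListener("load", () => {
  const link = document.createElement("link");
  link.rel = "stylesheet";
  link.href = "styles/non-critical.css"; // placeholder stylesheet
  document.head.appendChild(link);
});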

Read full original article at How to Delay Loading of Non Critical CSS to Increase Mobile Site Speed

©2015 QuickOnlineTips. All Rights Reserved.

Quick Online Tips

What Does an SEO Do In Their Day-to-Day Work – Whiteboard Friday

March 27th, 2015

Posted by randfish

There’s a common misconception that SEO is a “one and done” task — that you clean up and optimize a site, and once that’s done, you can focus your efforts elsewhere. There’s so much more to the day-to-day work of an SEO, though, and in today’s Whiteboard Friday, Rand walks us through those ongoing parts of the job.

For reference, here’s a still of this week’s whiteboard!

What Does an SEO Do in Their Day-to-Day Work whiteboard

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I want to tackle a question I get sometimes about the day-to-day activities of an SEO: what should you do after you've completed that first site audit and sort of fixed the problems? What does the day-to-day work look like?

A lot of SEOs, especially those coming from consulting backgrounds or who've joined companies as in-house SEOs who've had kind of this big, project-based SEO work to do, find themselves struggling after that's done. Typically, that process is pretty straightforward. You kind of do an audit. You look at all the things on the site. You figure out what's wrong, what's missing, where are opportunities that we could execute on. Maybe you do some competitive analysis, some market analysis. You identify those fixes. You work with teams to make those changes, validate the results have been completed, and then you're sort of in this, "Well, do I go back and audit again and try to iterate and improve again?"

That doesn’t feel quite right, but it also doesn’t necessarily feel quite right to go to the very, very old-school SEO model of like, “All right, we’ve got these keywords we’re trying to rank for. Let’s optimize our content, get some links, check our rankings for them, and then try to rinse and repeat and keep improving.” This model’s pretty broken I’d say and just not reflective of the reality of opportunities that are in SEO or the reality of the tactics that work today.

So the way that I like to think about this is the SEO audit, an SEO focused audit — which is trying to say, “What traffic could we get? What’s missing? What’s broken and wrong?” — only works at the low level and the very tactical trenches of a marketing process or a business process. What you really need to do is you want to be more incrementally based, but you need to be informed by and you need to be evolving your tactics and your work based on what is the business need right now.

So this process is about saying, "What are the top-level company and marketing goals overall? For everyone in the company, what are we trying to accomplish this year, this quarter, over the next three-year plan? What are we trying to achieve?" Then figure out areas where SEO can best contribute to that work, and from there you're creating tactical lists of projects that you believe will positively move the right needles, the ones that you've identified. Then you're going to evaluate and prioritize which ones you want to implement first, second, and third, and test and implement those.

So, hey we’ve figured out that we think that a new blog section for this particular piece of content, or we think that getting some user generated content, building up some community around this section would be terrific, or we think outreach to these kinds of publications or building up our social stats in these worlds will expose us to the right people who can earn us the amplification we’ll need to rank better, etc., etc. Okay, this is a fine process, and you’re going to want to do this, I would say, at least annually and maybe even think about it quarterly.

All this work is essentially centered on a customer profile universe, a universe of people. I’ve got my person X, Y, and Z here, but your customer universe may involve many different personas. It may involve just one type of person you’re targeting that you’re always trying to reach over and over again, but it probably involves also the people who influence that direct subsection of your market.

From there, you can take the, “Hey, you know what, person Z is really interested in and consumes and searches for these types of content topics and these kinds of keywords, so we’re going to start by taking keyword set A or content set A and figure out our keyword list and our content list. We’re going to create, launch, and promote work that supports that.” It could be content pieces, could be video, could be some combination of those things in social media, all forms of content. It could be tools, whatever you want, an application.

We’re going to launch that, promote it, and then work on some amplification, and then we’re going to measure and learn, which is a critical part of that process. I want to not only see what are my results, but what can I learn from what we just did and hopefully I’ll get better and better at iterating on this process. This process will work iteratively, kind of similar to our broken process over here or to our site audit process there. It will work iteratively, and then every now and then you should pop back up and go, “Hey, you know what, I feel like we’ve exhausted the easiest 80% of value that we’re going to get from 20% of the work on keyword set A. Let’s move on and go visit keyword set B now, and then let’s go visit content set C.”

Occasionally, you’re even going to want to move one step up and say, “Hey, you know what, maybe our personas or our market is changing a little bit. We want to try targeting some new customers. We’re going to look at these folks over here or this guy over here and see if we can reach them and their influencers with new kinds of content and topics and keywords, and that sort of thing.”

If your site is rocking and rolling, if you’ve completed your audit, things are just smooth sailing, then this kind of a process is going to work much better, so long as it’s tied to real business objectives. Then when you achieve results here, you can point back to, “Hey, remember I told you these are the areas SEO can contribute to our overall goals, and now I can connect these up directly. The metrics that I get from all this SEO stuff can tie directly to those areas, can tie directly to the business goals.” Everyone from the CEO on down is going to love what you’re doing for the company.

All right everyone, I hope you’ll join me again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

How to Open Mailto: Links in Gmail by Default

March 26th, 2015

Have you ever clicked mailto: links (email links) on websites and wanted them to open in Gmail? When I decided to install email links at the bottom of these posts, I would…
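
For context, the browser mechanism behind this is navigator.registerProtocolHandler, which is how Gmail can offer to take over mailto: links. A minimal sketch follows; note that modern browsers require the handler URL to be same-origin, so a call like this effectively has to run on a Gmail page itself:

// Register a webmail page as the handler for mailto: links.
// The browser replaces "%s" with the full mailto: URL that was clicked.
navigator.registerProtocolHandler(
  "mailto",
  "https://mail.google.com/mail/?extsrc=mailto&url=%s"
);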

Read full original article at How to Open Mailto: Links in Gmail by Default

©2015 QuickOnlineTips. All Rights Reserved.

Quick Online Tips

Moving 5 Domains to 1: An SEO Case Study

March 24th, 2015

Posted by Dr-Pete

People often ask me if they should change domain names, and I always shudder just a little. Changing domains is a huge, risky undertaking, and too many people rush into it seeing only the imaginary upside. The success of the change also depends wildly on the details, and it’s not the kind of question anyone should be asking casually on social media.

Recently, I decided that it was time to find a new permanent home for my personal and professional blogs, which had gradually spread out over 5 domains. I also felt my main domain was no longer relevant to my current situation, and it was time for a change. So, ultimately I ended up with a scenario that looked like this:

The top three sites were active, with UserEffect.com being my former consulting site and blog (and relatively well-trafficked). The bottom two sites were both inactive and were both essentially gag sites. My one-pager, AreYouARealDoctor.com, did previously rank well for “are you a real doctor”, so I wanted to try to recapture that.

I started migrating the 5 sites in mid-January, and I’ve been tracking the results. I thought it would be useful to see how this kind of change plays out, in all of the gory details. As it turns out, nothing is ever quite “textbook” when it comes to technical SEO.

Why Change Domains at All?

The rationale for picking a new domain could fill a month’s worth of posts, but I want to make one critical point – changing domains should be about your business goals first, and SEO second. I did not change domains to try to rank better for “Dr. Pete” – that’s a crap shoot at best. I changed domains because my old consulting brand (“User Effect”) no longer represented the kind of work I do and I’m much more known by my personal brand.

That business case was strong enough that I was willing to accept some losses. We went through a similar transition here from SEOmoz.org to Moz.com. That was a difficult transition that cost us some SEO ground, especially short-term, but our core rationale was grounded in the business and where it's headed. Don't let an SEO pipe dream lead you into a risky decision.

Why did I pick a .co domain? I did it for the usual reason – the .com was taken. For a project of this type, where revenue wasn’t on the line, I didn’t have any particular concerns about .co. The evidence on how top-level domains (TLDs) impact ranking is tough to tease apart (so many other factors correlate with .com’s), and Google’s attitude tends to change over time, especially if new TLDs are abused. Anecdotally, though, I’ve seen plenty of .co’s rank, and I wasn’t concerned.

Step 1 – The Boring Stuff

It is absolutely shocking how many people build a new site, slap up some 301s, pull the switch, and hope for the best. It’s less shocking how many of those people end up in Q&A a week later, desperate and bleeding money.


Planning is hard work, and it’s boring – get over it.

You need to be intimately familiar with every page on your existing site(s), and, ideally, you should make a list. Not only do you have to plan for what will happen to each of these pages, but you’ll need that list to make sure everything works smoothly later.

In my case, I decided it might be time to do some housekeeping – the User Effect blog had hundreds of posts, many outdated and quite a few just not very good. So, I started with the easy data – recent traffic. I’m sure you’ve seen this Google Analytics report (Behavior > Site Content > All Pages):

Since I wanted to focus on recent activity, and none of the sites had much new content, I restricted myself to a 3-month window (Q4 of 2014). Of course, I looked much deeper than the top 10, but the principle was simple – I wanted to make sure the data matched my intuition and that I wasn’t cutting off anything important. This helped me prioritize the list.

Of course, from an SEO standpoint, I also didn't want to lose content that had limited traffic but solid inbound links. So, I checked my "Top Pages" report in Open Site Explorer:

Since the bulk of my main site was a blog, the top trafficked and top linked-to pages fortunately correlated pretty well. Again, this is only a way to prioritize. If you’re dealing with sites with thousands of pages, you need to work methodically through the site architecture.

I’m going to say something that makes some SEOs itchy – it’s ok not to move some pages to the new site. It’s even ok to let some pages 404. In Q4, UserEffect.com had traffic to 237 URLs. The top 10 pages accounted for 91.9% of that traffic. I strongly believe that moving domains is a good time to refocus a site and concentrate your visitors and link equity on your best content. More is not better in 2015.

Letting go of some pages also means that you’re not 301-redirecting a massive number of old URLs to a new home-page. This can look like a low-quality attempt to consolidate link-equity, and at large scale it can raise red flags with Google. Content worth keeping should exist on the new site, and your 301s should have well-matched targets.

In one case, I had a blog post that had a decent trickle of traffic due to ranking for “50,000 push-ups,” but the post itself was weak and the bounce rate was very high:

The post was basically just a placeholder announcing that I'd be attempting this challenge, but I never recapped anything after finishing it. So, in this case, I rewrote the post.

Of course, this process was repeated across the 3 active sites. The 2 inactive sites only constituted a handful of total pages. In the case of AreYouARealDoctor.com, I decided to turn the previous one-pager into a new page on the new site. That way, I had a very well-matched target for the 301-redirect, instead of simply mapping the old site to my new home-page.

I’m trying to prove a point – this is the amount of work I did for a handful of sites that were mostly inactive and producing no current business value. I don’t need consulting gigs and these sites produce no direct revenue, and yet I still considered this process worth the effort.

Step 2 – The Big Day

Eventually, you’re going to have to make the move, and in most cases, I prefer ripping off the bandage. Of course, doing something all at once doesn’t mean you shouldn’t be careful.

The biggest problem I see with domain switches (even if they’re 1-to-1) is that people rely on data that can take weeks to evaluate, like rankings and traffic, or directly checking Google’s index. By then, a lot of damage is already done. Here are some ways to find out quickly if you’ve got problems…

(1) Manually Check Pages

Remember that list you were supposed to make? It’s time to check it, or at least spot-check it. Someone needs to physically go to a browser and make sure that each major section of the site and each important individual page is resolving properly. It doesn’t matter how confident your IT department/guy/gal is – things go wrong.

(2) Manually Check Headers

Just because a page resolves, it doesn't mean that your 301-redirects are working properly, or that you're not firing some kind of 17-step redirect chain. Check your headers. There are tons of free tools, but lately I'm fond of URI Valet. Guess what – I screwed up my primary 301-redirects. One of my registrar transfers wasn't working, so I had to have a setting changed by customer service, and I inadvertently ended up with 302s (Pro tip: Don't change registrars and domains in one step):

Don’t think that because you’re an “expert”, your plan is foolproof. Mistakes happen, and because I caught this one I was able to correct it fairly quickly.

(3) Submit Your New Site

You don’t need to submit your site to Google in 2015, but now that Google Webmaster Tools allows it, why not do it? The primary argument I hear is “well, it’s not necessary.” True, but direct submission has one advantage – it’s fast.

To be precise, Google Webmaster Tools separates the process into “Fetch” and “Submit to index” (you’ll find this under “Crawl” > “Fetch as Google”). Fetching will quickly tell you if Google can resolve a URL and retrieve the page contents, which alone is pretty useful. Once a page is fetched, you can submit it, and you should see something like this:

This isn’t really about getting indexed – it’s about getting nearly instantaneous feedback. If Google has any major problems with crawling your site, you’ll know quickly, at least at the macro level.

(4) Submit New XML Sitemaps

Finally, submit a new set of XML sitemaps in Google Webmaster Tools, and preferably tiered sitemaps. While it's a few years old now, Rob Ousbey has a great post on the subject of XML sitemap structure. The basic idea is that, if you divide your sitemap into logical sections, it's going to be much easier to diagnose what kinds of pages Google is indexing and where you're running into trouble.

A couple of pro tips on sitemaps – first, keep your old sitemaps active temporarily. This is counterintuitive to some people, but unless Google can crawl your old URLs, they won’t see and process the 301-redirects and other signals. Let the old accounts stay open for a couple of months, and don’t cut off access to the domains you’re moving.

Second (I learned this one the hard way), make sure that your Google Webmaster Tools site verification still works. If you use file uploads or meta tags and don’t move those files/tags to the new site, GWT verification will fail and you won’t have access to your old accounts. I’d recommend using a more domain-independent solution, like verifying with Google Analytics. If you lose verification, don’t panic – your data won’t be instantly lost.

Step 3 – The Waiting Game

Once you’ve made the switch, the waiting begins, and this is where many people start to panic. Even executed perfectly, it can take Google weeks or even months to process all of your 301-redirects and reevaluate a new domain’s capacity to rank. You have to expect short term fluctuations in ranking and traffic.

During this period, you’ll want to watch a few things – your traffic, your rankings, your indexed pages (via GWT and the site: operator), and your errors (such as unexpected 404s). Traffic will recover the fastest, since direct traffic is immediately carried through redirects, but ranking and indexation will lag, and errors may take time to appear.

(1) Monitor Traffic

I’m hoping you know how to check your traffic, but actually trying to determine what your new levels should be and comparing any two days can be easier said than done. If you launch on a Friday, and then Saturday your traffic goes down on the new site, that’s hardly cause for panic – your traffic probably
always goes down on Saturday.

In this case, I redirected the individual sites over about a week, but I'm going to focus on UserEffect.com, as that was the major traffic generator. That site was redirected in full on January 21st, and the Google Analytics data for January for the old site looked like this:

So far, so good – traffic bottomed out almost immediately. Of course, losing traffic is easy – the real question is what’s going on with the new domain. Here’s the graph for January for DrPete.co:

This one’s a bit trickier – the first spike, on January 16th, is when I redirected the first domain. The second spike, on January 22nd, is when I redirected UserEffect.com. Both spikes are meaningless – I announced these re-launches on social media and got a short-term traffic burst. What we really want to know is where traffic is leveling out.

Of course, there isn’t a lot of history here, but a typical day for UserEffect.com in January was about 1,000 pageviews. The traffic to DrPete.co after it leveled out was about half that (500 pageviews). It’s not a complete crisis, but we’re definitely looking at a short-term loss.

Obviously, I’m simplifying the process here – for a large, ecommerce site you’d want to track a wide range of metrics, including conversion metrics. Hopefully, though, this illustrates the core approach. So, what am I missing out on? In this day of [not provided], tracking down a loss can be tricky. Let’s look for clues in our other three areas…

(2) Monitor Indexation

You can get a broad sense of your indexed pages from Google Webmaster Tools, but this data often lags real-time and isn't very granular. Despite its shortcomings, I still prefer the site: operator. Generally, I monitor a domain daily – any one measurement has a lot of noise, but what you're looking for is the trend over time. Here's the indexed page count for DrPete.co:

The first set of pages was indexed fairly quickly, and then the second set started being indexed soon after UserEffect.com was redirected. All in all, we’re seeing a fairly steady upward trend, and that’s what we’re hoping to see. The number is also in the ballpark of sanity (compared to the actual page count) and roughly matched GWT data once it started being reported.

So, what happened to UserEffect.com’s index after the switch?

The timeframe here is shorter, since UserEffect.com was redirected last, but we see a gradual decline in indexation, as expected. Note that the index size plateaus around 60 pages – about 1/4 of the original size. This isn’t abnormal – low-traffic and unlinked pages (or those with deep links) are going to take a while to clear out. This is a long-term process. Don’t panic over the absolute numbers – what you want here is a downward trend on the old domain accompanied by a roughly equal upward trend on the new domain.

The fact that UserEffect.com didn’t bottom out is definitely worth monitoring, but this timespan is too short for the plateau to be a major concern. The next step would be to dig into these specific pages and look for a pattern.

(3) Monitor Rankings

The old domain is dropping out of the index, and the new domain is taking its place, but we still don’t know why the new site is taking a traffic hit. It’s time to dig into our core keyword rankings.

Historically, UserEffect.com had ranked well for keywords related to “split test calculator” (near #1) and “usability checklist” (in the top 3). While [not provided] makes keyword-level traffic analysis tricky, we also know that the split-test calculator is one of the top trafficked pages on the site, so let’s dig into that one. Here’s the ranking data from Moz Analytics for “split test calculator”:

The new site took over the #1 position from the old site at first, but then quickly dropped down to the #3/#4 ranking. That may not sound like a lot, but given this general keyword category was one of the site’s top traffic drivers, the CTR drop from #1 to #3/#4 could definitely be causing problems.

When you have a specific keyword you can diagnose, it’s worth taking a look at the live SERP, just to get some context. The day after relaunch, I captured this result for “dr. pete”:

Here, the new domain is ranking, but it’s showing the old title tag. This may not be cause for alarm – weird things often happen in the very short term – but in this case we know that I accidentally set up a 302-redirect. There’s some reason to believe that Google didn’t pass full link equity during that period when 301s weren’t implemented.

Let’s look at a domain where the 301s behaved properly. Before the site was inactive, AreYouARealDoctor.com ranked #1 for “are you a real doctor”. Since there was an inactive period, and I dropped the exact-match domain, it wouldn’t be surprising to see a corresponding ranking drop.

In reality, the new site was ranking #1 for “are you a real doctor” within 2 weeks of 301-redirecting the old domain. The graph is just a horizontal line at #1, so I’m not going to bother you with it, but here’s a current screenshot (incognito):

Early on, I also spot-checked this result, and it wasn’t showing the strange title tag crossover that UserEffect.com pages exhibited. So, it’s very likely that the 302-redirects caused some problems.

Of course, these are just a couple of keywords, but I hope it provides a starting point for you to understand how to methodically approach this problem. There’s no use crying over spilled milk, and I’m not going to fire myself, so let’s move on to checking any other errors that I might have missed.

(4) Check Errors (404s, etc.)

A good first stop for unexpected errors is the “Crawl Errors” report in Google Webmaster Tools (Crawl > Crawl Errors). This is going to take some digging, especially if you’ve deliberately 404’ed some content. Over the couple of weeks after re-launch, I spotted the following problems:

The old site had a “/blog” directory, but the new site put the blog right on the home-page and had no corresponding directory. Doh. Hey, do as I say, not as I do, ok? Obviously, this was a big blunder, as the old blog home-page was well-trafficked.

The other two errors here are smaller but easy to correct. MinimalTalent.com had a “/free” directory that housed downloads (mostly PDFs). I missed it, since my other sites used a different format. Luckily, this was easy to remap.

The last error is a weird looking URL, and there are other similar URLs in the 404 list. This is where site knowledge is critical. I custom-designed a URL shortener for UserEffect.com and, in some cases, people linked to those URLs. Since those URLs didn’t exist in the site architecture, I missed them. This is where digging deep into historical traffic reports and your top-linked pages is critical. In this case, the fix isn’t easy, and I have to decide whether the loss is worth the time.

What About the New EMD?

My goal here wasn’t to rank better for “Dr. Pete,” and finally unseat Dr. Pete’s Marinades, Dr. Pete the Sodastream flavor (yes, it’s hilarious – you can stop sending me your grocery store photos), and 172 dentists. Ok, it mostly wasn’t my goal. Of course, you might be wondering how switching to an EMD worked out.

In the short term, I’m afraid the answer is “not very well.” I didn’t track ranking for “Dr. Pete” and related phrases very often before the switch, but it appears that ranking actually fell in the short-term. Current estimates have me sitting around page 4, even though my combined link profile suggests a much stronger position. Here’s a look at the ranking history for “dr pete” since relaunch (from Moz Analytics):

There was an initial drop, after which the site evened out a bit. This less-than-impressive plateau could be due to the bad 302s during transition. It could be Google evaluating a new EMD and multiple redirects to that EMD. It could be that the prevalence of natural anchor text with “Dr. Pete” pointing to my site suddenly looked unnatural when my domain name switched to DrPete.co. It could just be that this is going to take time to shake out.

If there’s a lesson here (and, admittedly, it’s too soon to tell), it’s that you shouldn’t rush to buy an EMD in 2015 in the wild hope of instantly ranking for that target phrase. There are so many factors involved in ranking for even a moderately competitive term, and your domain is just one small part of the mix.

So, What Did We Learn?

I hope you learned that I should’ve taken my own advice and planned a bit more carefully. I admit that this was a side project and it didn’t get the attention it deserved. The problem is that, even when real money is at stake, people rush these things and hope for the best. There’s a real cheerleading mentality when it comes to change – people want to take action and only see the upside.

Ultimately, in a corporate or agency environment, you can’t be the one sour note among the cheering. You’ll be ignored, and possibly even fired. That’s not fair, but it’s reality. What you need to do is make sure the work gets done right and people go into the process with eyes wide open. There’s no room for shortcuts when you’re moving to a new domain.

That said, a domain change isn’t a death sentence, either. Done right, and with sensible goals in mind – balancing not just SEO but broader marketing and business objectives – a domain migration can be successful, even across multiple sites.

To sum up: Plan, plan, plan, monitor, monitor, monitor, and try not to panic.


Moz Blog

How to Stop Spam Bots from Ruining Your Analytics Referral Data

March 23rd, 2015

Posted by jaredgardner

A few months back, my agency started seeing a referral traffic spike in our Google Analytics account. At first, I got excited. Someone is linking to us and people are clicking. Hooray!

Wrong! How very, very wrong. As I dug deeper, I saw that most of this referral traffic was sent from spammers, and mostly from one spammer named Vitaly Popov (or, as I like to call him, “the most recent pain in my ass”). 

The domains he owns have been giving our company’s site and most of our clients’ sites a few hundred sessions per month, enough to throw off the analytics data in many cases.

His sites aren’t the only ones I’ll cover in this how-to, but his spam network has been the biggest nuisance lately. If you’re getting spam referrers in your analytics, you should be able to follow the same steps to stop these data-skewing nimcompoops from spoiling your data, too.

Why do I need to worry about blocking and filtering these sites?

There are two main reasons I'm motivated to block these on all sites that I work with. First: corrupt analytics data. A few hundred hits a month on a site like Moz.com isn't going to move the needle when compared to the sheer volume of sessions they have daily. However, on a small site for a local plumber, 30 sessions per day is likely going to be 70% spam referral traffic, suffocating the remaining legitimate traffic and making marketing analysis a frustrating endeavor.

Second: server load and security. I didn’t ask them to crawl or visit my site. Their visits are using my server resources for something that I don’t want or need. An overloaded server means slower load times, which translate to higher bounce rates and lower rankings. On top of that, who knows what else they’re doing on my site while they’re there. They could easily be looking for WordPress, plugin and server vulnerabilities.

Popular referral spam domains

Using WHOIS.net, I found that Mr. Popov's spam network includes these domains:

  • darodar.com (and various subdomains)
  • econom.co
  • ilovevitaly.co (and other TLD variations)

Other spammers plaguing the web include:

  • semalt.com (and various subdomains)
  • buttons-for-website.com
  • see-your-website-here.com

Many other sites have come and gone. These are just the sites that have been active lately.

Why are they hitting my site?

Why are people going through so much effort to crawl the web without blocking themselves from analytics? Spam! So much spam, it still blows me away. I looked into a few of the sites listed above. Three of the most prolific ones are doing it for very different reasons. 

See-your-website-here.com

This site takes the cake for being the most frustrating. It uses referrer spam as a form of lead generation. What is their product, you ask? Web spam. You can pay see-your-website-here.com to perform the same kind of web spam for your own company. The owner of this domain was kind enough to make his WHOIS information public. His name is Ben Sykes and he's from London.

Semalt.com

Semalt.com and I have had a tumultuous relationship at best. Semalt is an SEO product that's designed to give on- and off-page analysis such as keyword usage and link metrics. Their products seem to be somewhat legit. However, their business practices are not. Semalt uses a bot to crawl the web and index webpage data, but they don't disable analytics tracking like most respectable bots do. They have a form to remove your site from being crawled at http://semalt.com/project_crawler.php, which is ever so nice of them. Of course, I tried this months ago and they still crawled our site. I ended up talking with a representative from Semalt.com via Twitter after I wrote this article: How to Stop Semalt.com from Plaguing Your Google Analytics Data. I've documented our interactions and the outcome of that project in the article.

Darodar.com, econom.co, and ilovevitaly.com

This network appears to exist for the purpose of directing affiliate traffic to shopping sites such as AliExpress.com and eBay.com. I am guessing that the site won’t pay out to the affiliate unless the traffic results in a purchase, which seems unlikely. The sub-domain shopping.ilovevitaly.com used to redirect to aliexpress.com directly, but now it goes to a landing page that links to a variety of online retailers.

How to stop spam bots

Block via .htaccess

The best way to stop these referrers from accessing your site at all is to block them in the .htaccess file in the root directory of your domain. You can copy and paste the following code into your .htaccess file, assuming you're on an Apache server. I like this method better than just blocking the domain in analytics because it prevents spam bots from hitting your server altogether. If you want to get creative, you can redirect the traffic back to their site.

# Block Russian Referrer Spam
RewriteEngine on
# Each condition matches a spam domain (and any subdomain) in the referrer;
# [NC] = case-insensitive, [OR] = block if any one condition matches
RewriteCond %{HTTP_REFERER} ^http://.*ilovevitaly\.com/ [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*ilovevitaly\.ru/ [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*ilovevitaly\.org/ [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*ilovevitaly\.info/ [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*iloveitaly\.ru/ [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*econom\.co/ [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*savetubevideo\.com/ [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*kambasoft\.com/ [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*buttons\-for\-website\.com/ [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*semalt\.com/ [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*darodar\.com/ [NC]
# Respond 403 Forbidden to any matching request
RewriteRule ^(.*)$ - [F,L]

Warning: .htaccess is a very powerful file that dictates how your server behaves. If you upload an .htaccess file with one character out of place, you will likely take down the whole site. Before you make any changes to the file, I would suggest making a backup. If you don’t feel comfortable making these edits, see the WordPress plug-in option below.

Analytics filters

By itself, .htaccess won’t solve all of your problems. It will only protect you from future sessions, and it won’t affect the sessions that have already happened. I like to set up filters by country in analytics to remove the historical data, as well as to help filter out any other bots we might find from select countries in the future. Of course this wouldn’t be a good idea if you expect to get legitimate traffic from countries like Russia, Brazil, or Indonesia, but many U.S.-based companies can safely block these countries without losing potential customers. Follow the steps below to set up the filters.

First, click on the "Admin" tab at the top of the page. In the view column, you will want to create a new view so that you still have an unadulterated report of all traffic in Google Analytics. I named mine "Filter Bots." After you have your new view selected, click into the "Filters" section, then select the "+New Filter" button.

Setting up filters is pretty simple if you know which settings to use. I like to filter out all traffic from Russia, Brazil, and Indonesia. These are just the countries that have been giving us issues lately. You can add more filters as you need them.

The filter name is just an arbitrary label. I usually just type "block [insert country here]." Next, choose the filter type "Custom." Choose "Country" from the "Filter Field" drop-down. The "Filter Pattern" field is where you actually define which countries you are filtering, so make sure you spell them correctly. You can double-check your filters by using the "Verify This Filter" button. A graph will pop up and show you how many sessions will be removed from the last seven days.

I would recommend selecting the “Bot Filtering” check box that is found in “View Settings” within the “Admin” tab. I haven’t seen a change in my data using this feature yet, but it doesn’t hurt to set it up since it’s really easy and maybe Google will decide to block some of these spammers.

Using WordPress? Don’t want to edit your .htaccess file?

I’ve used the plugin
Wp-Ban before, and it makes it easy to block unwanted visitors. Wp-ban gives you the ability to ban users by IP, IP range, host name, user agent and referrer URL from visiting your WordPress blog all from within the WordPress admin panel. This a great option for people who don’t want to edit their .htaccess file or don’t feel comfortable doing so.

Additional resources

There are a few other great posts you can refer to if you’re looking for more info on dealing with referrer spam:

  1. http://www.optimizesmart.com/geek-guide-removing-referrer-spam-google-analytics/
  2. https://megalytic.com/blog/how-to-filter-out-fake-referrals-and-other-google-analytics-spam
  3. http://blog.raventools.com/stop-referrer-spam/
  4. http://www.analyticsedge.com/2014/12/removing-referral-spam-google-analytics/

Conclusion

I hope this helps you block all the pesky spammers out there. There are definitely different ways you can solve this problem, and these are just the ones that have helped me protect analytics data. I’d love to hear how you have dealt with spam bots. Share your stories with me on Twitter or in the comments below.


Moz Blog

Headline Writing and Title Tag SEO in a Clickbait World – Whiteboard Friday

March 22nd, 2015

Posted by randfish

When writing headlines and title tags, we’re often conflicted in what we’re trying to say and (more to the point) how we’re trying to say it. Do we want it to help the page rank in SERPs? Do we want people to be intrigued enough to click through? Or are we trying to best satisfy the searcher’s intent? We’d like all three, but a headline that achieves them all is incredibly difficult to write.

In today’s Whiteboard Friday, Rand illustrates just how small the intersection of those goals is, and offers a process you can use to find the best way forward.

For reference, here’s a still of this week’s whiteboard!

title tag whiteboard

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about writing titles and headlines, both for SEO and in this new click-bait, Facebook social world. This is kind of a challenge, because I think many folks are seeing and observing that a lot of the ranking signals that can help a page perform well are often preceded by or well correlated with social activity, which would kind of bias us towards saying, "Hey, how can I do these click-baity, link-baity sorts of social viral pieces," versus we're also challenged with, "Gosh, those things traditionally don't perform as well in search results, from a click-through rate perspective and certainly from a search conversion perspective. So how do we balance out these two and make them work together for us based on our marketing goals?" So I want to try and help with that.

Let’s look at a search query for Viking battles, in Google. These are the top two results. One is from Wikipedia. It’s a category page — Battles Involving the Vikings. That’s pretty darn straightforward. But then our second result — actually this might be a third result, I think there’s a indented second Wikipedia result — is the seven most bad ass last stands in the history of battles. It turns out that there happen to be a number of Viking related battles in there, and you can see that in the meta description that Google pulls. This one’s from Crack.com.

These are pretty representative of the two different kinds of results or of content pieces that I’m talking about. One is very, very viral, very social focused, clearly designed to sort of do well in the Facebook world. One is much more classic search focused, clearly designed to help answer the user query — here’s a list of Viking battles and their prominence and importance in history, and structure, and all those kinds of things.

Okay. Here’s another query — Viking jewelry. Going to stick with my Viking theme, because why not? We can see a website from Viking jewelry. This one’s on JellDragon.com. It’s an eCommerce site. They’re selling sterling silver and bronze Viking jewelry. They’ve actually done very classic SEO focus. Not only do they have Viking jewelry mentioned twice, in the second instance of Viking jewelry, I think they’ve intentionally — I hope it was intentionally — misspelled the word “jewelry” to hopefully catch misspellings. That’s some old-school SEO. I would actually not recommend this for any purpose.

But I thought it was interesting to highlight, because in this search result it takes until page three before I could really find a viral, social, targeted, more link-baity, click-baity type of article, this one from io9 — 1,000 Year-old Viking Jewelry Found On Danish Farm. You know what the interesting part is? In this case, both of these are on powerful domains. They both have quite a few links to them from many external sources. They're pretty well SEO'd pages.

In this case, the first two pages of results are all kind of small jewelry website stores and a few results from like Etsy and Amazon, more powerful authoritative domains. But it really takes a long time before you get these, what I’d consider, very powerful, very strong attempts at ranking for Viking jewelry from more of your click-bait, social, headline, viral sites. io9 certainly, I would kind of expect them to perform higher, except that this doesn’t serve the searcher intent.

I think Google knows that when people look for Viking jewelry, they’re not looking for the history of Viking jewelry or where recent archeological finds of Viking jewelry happened. They’re looking specifically for eCommerce sites. They’re trying to transact and buy, or at least view and see what Viking jewelry looks like. So they’re looking for photo heavy, visual heavy, potentially places where they might buy stuff. Maybe it’s some people looking for artifacts as well, to view the images of those, but less of the click-bait focus kind of stuff.

This one, I think it's very likely, does indeed perform well for its search query, and lots of people do click on it as a positive result for what they're looking for from Viking battles, because they'd like to see, "Okay, what were the coolest, most amazing Viking battles that happened in history?"

You can kind of see what’s happened here with two things. One is with Hummingbird and Google’s focus on topic modeling, and the other with searcher intent and how Google has gotten so incredibly good at pattern matching to serve user intent. This is really important from an SEO perspective to understand as well, and I like how these two examples highlight it. One is saying, “Hey, just because you have the most links, the strongest domain, the best keyword targeting, doesn’t necessarily mean you’ll rank if you’re not serving searcher intent.”

Now, when we think about doing this for ourselves, that click-bait versus search-optimized experience for our content, what is it about? It's really about choosing. It's about choosing searcher intent, our website and marketing goals, or click-bait types of goals. I've visualized the intersection here with a Venn diagram. So these in pink here are the click-bait pieces that are going to resonate in social media — Facebook, Twitter, etc. Blue is the intent of searchers, and purple is your marketing goals, what you want to achieve when visitors get to your site, the reason you're trying to attract this traffic in the first place.

This intersection, as you will notice, is super, uber tiny. It is miniscule. It is molecule sized, and it’s a very, very hard intersection to hit. In fact, for the vast majority of content pieces, I’m going to say that it’s going to be close to, not always, but close to impossible to get that perfect mix of click-bait, intent of searchers, and your marketing goals. The times when it works best is really when you’re trying to educate your audience or provide them with informational value, and that’s also something that’s going to resonate in the social web and something searchers are going to be looking for. It works pretty well in B2B types of things, particularly in spaces where there’s lots of influencers and amplifiers who also care about educating their followers. It doesn’t work so well when you’re trying to target Viking battles or Viking jewelry. What can I say, the historians of the Viking world simply aren’t that huge on Twitter yet. I hope they will be one day.

This is kind of the process that I would use to think about the structure of these and how to choose between them. First off, I think you need to ask, “Should I create a single piece of content to target all of these, or should I instead be thinking about individual pieces that hit one or two at a time?”

So it could be the case that maybe you've got an intersection of intent for searchers and your marketing goals. This happens quite a bit, and oftentimes for these folks, for the Jell Dragon Viking jewelry site, the intent of searchers and what they're trying to accomplish on their site are perfectly in harmony, but definitely not with click-bait pieces that are going to resonate on the web. More challenging for io9 with this kind of a thing, because searchers just aren't looking for that around Viking jewelry. They might instead be thinking about, "Hey, we're trying to target the specific news item. We want anyone who looks for Viking jewelry on a Danish farm, or Viking jewelry found, or those kinds of things to be finding our site."

Then, I would ask, "How can I best serve my own marketing goals, the marketing goals of my website through the pages that are targeted at search or social?" Sometimes that's going to be very direct, like it is over here with JellDragon.com trying to convert folks looking for Viking jewelry into buyers.

Sometimes it’s going to be indirect,. A Moz Whiteboard Friday, for example, is a very indirect example. We’re trying to serve the intent of searchers and in the long term eventually, maybe sometime in the future some folks who watch this video might be interested in Moz’ tools or going to MozCon or signing up for an email list, or whatever it is. But our marketing goals are secondary and they’re further in the future. You could also think about that happening at the very end of a funnel, coming in if someone searches for say Moz versus Searchmetrics and maybe Searchmetrics has a great page comparing what’s better about their service versus Moz’ service and those types of things, and getting right in at the end of the funnel. So that should be a consideration as well. Same thing with social.

Then lastly, where are you going to focus the keyword targeting and content efforts? What kind of content are you going to build? How are you going to keyword target it best to achieve this, and how much will you interlink between those pages?

I’ll give you a quick example over here, but this can be expanded upon. So for my conversion page, I may try and target the same keywords or a slightly more commercial variation on the search terms I’m targeting with my more informational style content versus entertainment social style content. Then, conversion page might be separate, depending on how I’m structuring things and what the intent of searchers is. My click-bait piece may be not very keyword focused at all. I might write that headline and say, “I don’t care about the keywords at all. I don’t need to rank here. I’m trying to go viral on social media. I’m trying to achieve my click-bait goals. My goal is to drive traffic, get some links, get some topical authority around this subject matter, and later hopefully rank with this page or maybe even this page in search engines.” That’s a viable goal as well.

When you do that, what you want to do then is have a link structure that optimizes around this. So your click-bait piece, a lot of times with click-bait pieces they’re going to perform worse if you go over and try and link directly to your conversion page, because it looks like you’re trying to sell people something. That’s not what plays on Facebook, on Twitter, on social media in general. What plays is, “Hey, this is just entertainment, and I can just visit this piece and it’s fun and funny and interesting.”

What plays well in search, however, is something that lets someone accomplish their tasks. So it's fine to have information and then a call to action, and that call to action can point to the conversion page. The click-bait piece's content can do a great job of helping to send link equity, ranking signals, and maybe some visitor traffic who's interested in truly learning more over to the informational page that you want ranking for search. This is kind of a beautiful way to think about the interaction between the three of these when you have these different levels of focus, when you have these different searcher versus click-bait intents, and how to bring them all together.

All right everyone, hope to see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Moz Blog

How to Add Custom CSS Styles to WordPress Post Editor

March 21st, 2015

You can add editor styles to change the look of the WordPress post editor with custom CSS, creating a pleasant writing experience that suits your eyes and makes you enjoy…

Read full original article at How to Add Custom CSS Styles to WordPress Post Editor

©2015 QuickOnlineTips. All Rights Reserved.

Quick Online Tips

Google Mobile Search Result Highlights

March 20th, 2015

Google recently added highlights at the bottom of various sections of their mobile search results. The highlights appear on ads, organic results, and various other vertical search insertion types. The colors vary arbitrarily by section and are patterned off the colors in the Google logo. Historically such borders have conveyed a meaning, like separating advertisements from organic search results, but now the colors have no meaning other than acting as a visual separator.

We recently surveyed users to see if they understood what the borders represented & if they felt the borders had any meaning. We did 4 surveys total. The first 2 allow a user to select a choice from a drop-down menu. The last two were open-ended, where a user typed text into the box. For each of the 2 survey types, we did a survey of a SERP which had an ad in it & a survey of a SERP without an ad in it.

Below are the associated survey images & user results.


Google recently added colored bars at the bottom of some mobile search results. What do they mean?

Answer                                         SERP without ad       SERP with ad
none of the other options are correct          27.7% (+2.7 / -2.5)   29.9% (+2.8 / -2.7)
the listing is an advertisement                25.8% (+2.8 / -2.6)   30.1% (+2.8 / -2.7)
each color has a different meaning             24.0% (+2.7 / -2.5)   19.6% (+2.5 / -2.3)
colors separate sections but have no meaning   15.5% (+2.4 / -2.1)   12.5% (+2.1 / -1.9)
the listing is a free search result             6.9% (+1.8 / -1.5)    7.9% (+2.0 / -1.6)

Given there are 5 answers, if the distributions were random there would have been a 20% distribution on each option. The only options which skewed well below that were the perceptions that the colored highlights either had no meaning or represented free/organic search results.

Link to survey results: without ads vs with ads.

And here are images of what users saw for the above surveys:


For the second set of surveys, we used an open-ended format

The open-ended questions allow a user to type in whatever they want. This means the results do not end up biased by the predefined answer options in a quiz, but it also means the results will include plenty of noise like…

  • people entering a, c, d, k, 1, 2, 3, ggg, hello, jj, blah, and who cares as answer choices
  • some of the responses referencing the listing topics
  • some of the responses referencing parts of a search result listing like the headlines or hyperlinks
  • some of the responses highlighting the colors of the bars
  • etc.

Like the above surveys, on each of these I ordered 1,500 responses. As of writing this, each had over 1,400 responses completed & here are the word clouds for the SERPs without an ad vs the SERPs with an ad.

SERP without an ad

SERP with an ad

On each of the above word clouds, we used the default automated grouping. Here is an example of what the word cloud would look like if the results were grouped manually.

Summary

For a couple of years Google has removed various forms of eye candy from many organic results (cutting back on video snippets, limiting rich rating snippets, removing authorship, etc.). The justification for such removals was to make the results feel "less cluttered." At the same time, Google has added a variety of the same types of "noisy" listing enhancements to their various ad programs.

What is the difference between reviews ad extensions, consumer ratings ad extensions, and seller ratings ad extensions? What is the difference between callout extensions and dynamic structured snippets?

Long ago AdWords advertisements had a border near them to separate them from the organic results. Those borders disappeared many years ago & only recently reappeared on mobile devices when they also appeared near organic listings. That in turn has left searchers confused as to what the border highlighting means.

According to the above Google survey results, the majority of users don’t know what the colors signify, don’t care what they signify, or think they indicate advertisements.

Categories: google

SEO Book

How to Display Sharper Logo on High Resolution Screens

March 19th, 2015

Does your logo look blurry and pixelated on an iPad retina display? Why does the logo of your favorite top blog look sharper? Here is the secret to displaying a sharper…
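
The usual trick, sketched below with hypothetical file names, is to serve an image with twice the pixel dimensions of its displayed CSS size on high-DPI screens:

// If the screen packs 2+ device pixels per CSS pixel, swap in a
// double-resolution logo; it still renders at the same CSS size.
const logo = document.querySelector<HTMLImageElement>("#logo");
if (logo && window.devicePixelRatio >= 2) {
  logo.src = "/images/logo@2x.png"; // hypothetical 2x asset
}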

Read full original article at How to Display Sharper Logo on High Resolution Screens

©2015 QuickOnlineTips. All Rights Reserved.

Quick Online Tips

In-App Social & Contact Data – New in Open Site Explorer

March 18th, 2015

Posted by randfish

Today I’m excited to announce the launch of a new feature inside 
Open Site Explorer—In-App Social & Contact Data. 

With this launch, you'll be able to see the social or email accounts we've discovered associated with a given website, and have one-click access to those pages.


Initially, the feature offers:

  1. Availability today on the inbound links tab and in Link Intersect on the “pages -> subdomains” view. In the future, if y’all find it useful, we hope to expand its presence to other areas of the tool as well.
  2. Email accounts will only be shown if they match the domain name (e.g. rand@moz.com would be shown next to moz.com, randfishkin@yahoo.com would not) and if they appear in standard format on the page (we don't try to grab emails rendered in JavaScript or ones that use alternate formats to obfuscate); a sketch of this matching rule follows the list.
  3. We show Facebook, Twitter, Google+, and email addresses we've found on multiple pages of the site (we take a small random set and analyze whether these social/contact data pieces are uniform). If we find multiple accounts, you'll see all of them listed.
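
As a rough illustration of the domain-matching rule in item 2 (my own sketch, not Moz's actual implementation):

// An email is only shown for a site when its domain matches the site's domain.
const emailMatchesSite = (email: string, siteDomain: string): boolean =>
  email.toLowerCase().endsWith("@" + siteDomain.toLowerCase());

emailMatchesSite("rand@moz.com", "moz.com");          // true  -> shown
emailMatchesSite("randfishkin@yahoo.com", "moz.com"); // false -> not shown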

Use cases

There are three major use cases for this feature (at least for me; you might have more!):

1) Link/Outreach prospecting

It can be a pain to visit sites, find social accounts/emails, and copy them into a spreadsheet or send messages (and recall which ones you have/haven’t done yet). By including social/contact data in the same interface where you’re doing link analysis, we hope to save you time and clicks.

2) Link/site trust and audience reach analysis

We’re actually using this data on the back end at Moz for our upcoming Spam Score feature (coming very soon), but you can use it manually to help with a quick mental filter for trustworthy/authoritative/non-spammy sites, and to get a sense for the size and reach of a site’s social audience.

3) At-a-glance analysis of social networks among a group

If you’re in a given space (e.g. travel blogs), it’s a process to determine which social networks are/aren’t being used by industry participants and influencers. Social/contact data in OSE can help with that by showing which social networks various sites are using and linking to from their pages:

We need your feedback

This first implementation is relatively light in the app—we haven’t yet placed this data anywhere/everywhere it might be useful. Before we do, we want to hear what you think: Is this useful and valuable to your work? Does it help save you time? Would you want to see the feature expanded and if so, in what sections would it provide the greatest value to you? Please let us know in the comments, and by getting back in touch with us after you’ve had a chance to try it out for yourself.

Thanks for giving social/contact data a spin, and look for more upgrades to Open Site Explorer in the very near future!


Moz Blog

Prime Reasons Why SEO Plays a Part in Enhanced Web Traffic

It has recently been found that the strategies adopted by SEO Birmingham companies have become critically significant to the success of small businesses. If your online business is not making use of SEO Birmingham, this write-up discusses the prime reasons why you should opt for the local strategies used by SEO Birmingham companies.

As per recent findings, more than 39% of netizens experience problems while searching for local businesses on the World Wide Web. People know that a local business exists, but they have trouble locating information about it on the web. The major reason behind this inconvenience is that such businesses fail to understand the relevance of SEO Birmingham. So, if your business provides products and services over the web, nationally or internationally, search engine optimization can be of great help in making the brand visible in search engine results.

Nowadays, more and more people rely on the Internet for finding local businesses

There was a time when local businesses did not worry about the scope of SEO; word of mouth was more than enough to spread their existence to local consumers. But today, statistics show that more than 84% of people use the Internet to locate local businesses. No more telephone directories; people rely on search engines.

Thus, it becomes essential to change from the traditional marketing strategy to the modern-day strategy of online marketing.

SEO Birmingham costs a little less

If you are considering an AdWords strategy to capture the online marketing domain, you must be aware of the fact that the popularity of the keyword chosen is directly proportional to the fee paid. Selecting local keywords means less keyword competition, which means you need not pay extra costs.

Helps in reaping the benefits of advanced Google features

SEO companies in Birmingham, such as SEO Results4u at Avon House, 435 Stratford Road, Shirley, Solihull, West Midlands, B90 4AA 0121 746 3121 also contribute to the SEO landscape of their local area, be it Solihull, Birmingham or even the wider West Midlands area.

People are actually unaware of the fact that Google Plus has changed the traditional way of Internet usage. If the keywords chosen are relevant to your domain and the local market, you can unlock enhanced services offered by Google:

  • A map representing the physical location of the business
  • Appealing pictures related to the business
  • Reviews posted by users

Truth be told, without using the local platform of SEO, Google Plus fails to recognize your business, which clearly means a lack of authentic information about it on the web.

Local SEO encourages better and enhanced credibility

People trust Yahoo, Bing, and Google with their eyes closed and believe these magical search engines have remedies for each and every query. It is a well-accepted notion among the commoners that the brands appearing in the top search lists are the most wanted and authentic service providers. So, if you want people to believe in your brand's credibility, search engine optimization adds that credibility to your brand power. Local SEO adds credibility as well as a definite increase in web traffic.

Your peers are using it

Business is all about competition. Your peers are using SEO and reaping the benefits, so why aren't you?