Building a Brand Online: The Golden Age of Digital

April 21st, 2014

Posted by willcritchlow

This post is based on a talk I gave at our SearchLove conference in Boston last week. It ties quite closely with the post my colleague Ron Garrett wrote last week: Search Marketers Need to Evolve. You can probably tell we’ve been doing a lot of thinking about this.

When I gave this talk at SearchLove, I hoped that it would put in context why we bring such a range of speakers and topics together at our conferences, and inspire the attendees to go back to their companies and make real changes. I hope this post will do the same for you.

As digital marketers, our focus on analytics has served us well in driving direct, measurable sales. The dominant form of brand marketing, however, has remained offline, with TV taking the lion’s share of the budget and attention. We believe that as TV faces disruptive technology and business models, digital marketers have an opportunity to grow their influence and impact. In total, this is an opportunity worth tens of billions of dollars a year.

I’d very much like for us—our industry—you and me—to be the ones who benefit.

Despite all the growth we have seen in digital marketing spend, I think that we are only just entering what I’m calling the golden age of digital.

Building brands online first

We’re entering the age when the biggest brands in the world will be built online first. I hope to convince you of two things: first, that this change is happening right now. And second, that we are the people to win in this world.

Starting at the beginning

There are some confident statements above, but the last few years have had their share of introspection and crises of confidence. We’ve put a lot of time and energy over the years into understanding the direction marketing is moving and capitalising on the shifts. Duncan and I originally started by thinking that networked computing was going to be a big deal and then started our company initially on the back of a simple CMS that we built to help small business owners take advantage of the self-publishing revolution.

As we shifted gears to focus more on the dominance of the search channel, we started trying to understand where Google in particular might be taking things.

We’ve written plenty about that over the years, but we were talking about effects similar to Panda, Penguin, and Hummingbird years before they actually came to pass. Panda and Penguin started making our vision come true. We were more effective search marketers than we’d ever been because we’d largely bypassed building the infrastructure for search as it was and tried to build it for how search would be.

And yet something was wrong.

Powerful content was becoming ever more effective. And yet the greatest examples of content that we were seeing at search conferences weren’t built by SEO agencies.

Brands were getting a bigger and bigger advantage in search. And yet the best brand builders weren’t SEO agencies.

For a long time, we’ve talked about how “SEO” isn’t a verb. You don’t “SEO a website.” Ranking well is an outcome, not an activity. It’s like fame. “Famous” isn’t a verb. You don’t “famous someone.” You get famous for doing other things (playing sport, performing music, appearing on TV). SEO is the same.

But what if we weren’t the right people to do those things for people? What if we weren’t the world’s best PR firm, branding agency, or creative producers?

Don’t worry. I got over my insecurity. I believe the capabilities that we have been building are going to grow in power and influence. Here’s how:

The Innovator’s Dilemma

It was Mark Suster who kick-started my confidence with his talk in San Diego [use this link and sign up for an account to get access to the video for free]. He’s an entrepreneur-turned-investor. He’s smart and opinionated.

He talked about Maker Studios at our conference. You might have heard a few weeks ago that Maker Studios sold to Disney for half a billion dollars.

Maker is a producer and distributor of online video. The turning point for me was in realising that the forces they were betting on were also rampaging towards our quirky, exciting, geeky little corner of the marketing world.

There’s a book called The Innovator’s Dilemma by a Harvard Business School professor named Clayton Christensen. It’s a little dry, but if you’re interested in business theory and technology, it’s an absolute no-brainer: You should read it.

It describes two kinds of innovations that hit established markets. So-called “sustaining innovations” make existing processes faster, cheaper, or better. They can be very dramatic, but Professor Christensen’s research shows that they almost always end up benefiting the incumbent players in the market.

In contrast to “sustaining innovations” stand “disruptive innovations,” which attack problems in an entirely different way. They typically don’t work as well as the existing solutions, perhaps solving only part of the problem, but have a structurally different cost. So they’re “cheaper but worse.”

Cheaper but worse

Doesn’t sound too compelling, does it?

That’s what the incumbents think. They may spot a potential opportunity, and may even pay lip service to the idea that they should be pursuing it, but ultimately, their economic incentives are skewed towards maintenance of the status quo.

Therein lies the dilemma.

There’s often a subset of the market for whom the new service is “good enough.” It may not be gold-plated, but it solves their immediate needs and they can afford it. As they invest, it gets better and better, capturing more and more of the market opportunity until it’s meeting the core needs of even the top end of the market while still being structurally cheaper. Money cascades to the new entrant and leaves the incumbents high and dry.

Let’s go back to the “cheaper but worse” innovation for a second. To me, that sounds an awful lot like the idea of building a brand online. Let’s look at the details:

  • The established way of building a brand for a generation has been via mass-market TV advertising and other classic above-the-line spend. Spends of $100m+ are not uncommon.
  • Building a brand online is cheaper, yes, but right now, not as effective.
  • The incumbent brand-builders pay lip service to digital, but when you look at their corporate structure, their fee structures, and their economic incentives, you realise that they’d far rather see TV get bigger than have to do all this messy web marketing.

So I think there is a disruption coming to brand marketing, and I don’t think it’s going to benefit the big guys.

Online first

I’m calling this whole phenomenon “online first”: the biggest brands of tomorrow will be built online. This will be partly because the tools we have available to build brands online are going to get better and better, and partly because money is going to flow to digital from TV. I recently wrote about this in more detail in our Future of TV report:

I am definitely not saying that TV itself is in trouble. We live in an amazing time for TV content. You just have to look at shows like Breaking Bad, The Walking Dead, and True Detective to see that we have exceptional content and more ways of accessing that content than ever before—and that’s before we even get to Netflix and House of Cards. In part as a result of this resurgence, the total time spent watching video has increased every year recently.

Our devices are also getting better and better. The cost of big screens is coming down; we now have full HD on our mobile devices.

But the way we get our content is changing. According to Gigaom Research, 80% of US households have some form of internet-connected device paired with their TV. Whether it’s an Apple TV, Roku, Xbox, Chromecast or something else, we can increasingly watch anything we like on the big screen. And conversely, we can watch more and more of our “classic TV” content on smartphones, tablets, laptops and any other screen we can lay our hands on.

This particular part of the trend has been analysed to death. I’m not interested in that for the purposes of this analysis. I’m interested in the fragmenting viewership: In general, we’re no longer all watching the same thing at the same time. That has profound impacts on the way TV advertising is bought and sold.

The innovator’s dilemma predicts that the cost per unit of the high end of the old market will continue to rise even as the bottom starts to fall away. It’s becoming ever more valuable to reach consumers on those rare occasions when we do all sit down at the same time to watch the same content.

The complexity of time-shifted, internet-delivered content rapidly surpasses human optimisation ability. The upfront media market, in which Oprah stands on stage to extol her show and the network and seek tens or hundreds of millions of dollars of spend, is a process that can’t survive the move to digital-scale complexity (if you’re interested in this, I wrote an introduction to TV advertising a few weeks ago).

Advertising against TV-like content will have to be bought more like AdWords. It will have to become real-time (depending on who’s actually watching at a given moment), and it will have to be market-priced in one form or another (because you can’t negotiate all these things individually on the fly).

I’m in danger of getting dragged into deep economic arguments, but the effect of all this disruption is going to be a whole load of unbundling and a reallocation of budgets.

Of course, in part, this will open up opportunities in video marketing—both in brand-funded TV-like content and in video advertising against internet-delivered video (check out the talk by Chris from Wistia [PDF]). I don’t think it’s a given that the incumbent TV advertisers will dominate that space. It’s structurally pretty different. We are certainly betting in this area—between Phil and Margarita, we’re already doing video strategy and execution for ourselves and our clients.

It’s not all about video, though.

How our industry competes

There are three broad areas that we all need to get great at to take advantage of this opportunity. Video fits into the first of these, which is technical creativity—that place where technology and storytelling meet:

1. Technical creativity

I’ve been endlessly frustrated over the years by the creative storytellers who misunderstand (or don’t even care about) technology. The stupid apps that no one uses. The branded social networks that nobody joins. The above-the-line campaigns telling you to search for phrases they don’t rank for.

Old-school SEOs can spot crawl issues or indexing problems in their sleep. We’ve had to get good at things like analytics, UX, and conversion. Indeed, one of the most popular talks last week (and Slideshare of the day) was from Aaron Weyenberg at TED, and was all about UX. The things that stood out to me the most were the different ways they listened to their audience and gathered feedback at different stages of the process. This incorporated everything from standard hallway tests through qualitative and quantitative surveying to a really nicely-executed beta. You can see the full deck here:

The Story of a Redesign

And we mustn’t lose sight of the value of that technical knowledge. Screw up a migration and you’re just as hosed as you’ve ever been.

For me personally, the creative is the more challenging part—but luckily it’s not all about me. We’ve been investing in creative for a while, and I loved the presentation our head of creative, Mark Johnstone, gave last week, entitled How to Produce Better Content Ideas. It really clarified my thinking in a few areas—particularly about the effort and research that should go in early in the process in order to give the “lightbulb moment” a chance. By coupling that with examples of deconstructing other people’s creative (and showing us / giving us further reading on how to practise ourselves), he made a compelling argument that we can all do this so much better—and that not only designers can be “creative.” I’m also looking forward to trying out the immersion techniques he talks about for getting from unstructured to structured. You can check out the full deck here:

How to Produce Better Content Ideas

[If you'd like to see more of the decks from Boston, you can currently get them here and in the next few weeks the videos will be available within DistilledU]

2. Broad promotional ability

The second capability we need after technical creativity is a broad promotional ability. This is your classic owned, earned and paid media.

As search marketers, we’ve typically focused primarily on the earned side of this—via outreach and digital PR—and my colleague Rob Toledo gave a great talk about some of the cleverer forms of earned media in his presentation The Hunter/Gatherer. He talked in detail about ways of reaching that tricky kind of influencer—the one who wants to discover their own interesting share-worthy material. It was a funny presentation that contained some exceptional tactics. You can see the full deck here:

The Hunter/Gatherer

I think paid media is going to have an ever-increasing part to play in online brand building, though. Pay-per-click is typically measured on direct response metrics—sending traffic to landing pages and converting them—but social and video advertising are on the rise. We increasingly spend money on promoting content instead of promoting landing pages. I expect that trend to continue.

The eagle-eyed among you might have noticed that this isn’t inbound. I make no apology for that.

3. Influence and measurement throughout the Customer Lifecycle

Finally, alongside our technical creativity and promotional ability, we need to double down on our ability to influence and measure customer behaviour throughout the customer lifecycle.

We’ve all heard (or even been) the search experts who stand on stage and talk about the measurability of digital. Sometimes they go further and make off-hand comments about how you “can’t measure TV.”

Does anyone really believe that? Does anyone think Procter & Gamble or Unilever really waste half the money they spend?

One of the most mind-blowing talks I ever attended was at ad:tech a few years ago—it was a speaker from Ogilvy talking about the econometric models they use to measure their work for P&G. It was all about how they were tying together the influence of point-of-sale, coupon codes, TV, and other above-the-line advertising to understand what’s making them the money. They are good at it, but it’s expensive. Our industry’s stuff is cheap in comparison. It’s not yet good enough, but if we work hard and invest, it can be.

What I didn’t say

Remember: I didn’t say TV is dead. I didn’t say search is dead. I said that our crazy blend of technical creativity, promotional chops and measurement skills is going to be the skillset that builds tomorrow’s biggest brands. And, crucially for the topic near and dear to much of the Moz audience’s hearts, it’s also going to be how you rank in Google.

Advertising is a half-trillion dollar a year industry struggling to understand its place in a digital world. I don’t want the same old guys to win on our turf. The internet is our domain. Let’s go get great at this.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

SearchCap: New Google Penalty, Action Schema & Subscribe To Google Trends

April 20th, 2014

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. From Search Engine Land: Google Slaps Another Guest Blog Network: PostJoint Google’s Matt Cutts somewhat confirmed on Twitter that Google has taken action on another guest blogging…



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Download Google Camera App to Defocus Background

April 19th, 2014

Lens Blur

Want to blur and defocus the background in photos? Download the Google Camera app, a new standalone Android app that enhances the power of your Android smartphone. This smart background-blur photo technique is a craze, and it seems everyone wants to use it. Google Camera just made it possible for everyone.

6 Changes We Always Thought Google Would Make to SEO that They Haven’t Yet – Whiteboard Friday

April 18th, 2014

Posted by randfish

From Google’s interpretation of rel=”canonical” to the specificity of anchor text within a link, there are several areas where we thought Google would make a move and are still waiting for it to happen. In today’s Whiteboard Friday, Rand details six of those areas. Let us know where you think things are going in the comments!

For reference, here’s a still of this week’s whiteboard!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. Today, I’m going to tackle a subject around some of these changes that a lot of us in the marketing and SEO fields thought Google would be making, but weirdly they haven’t.

This comes up because I talk to a lot of people in the industry. You know, I’ve been on the road the last few weeks at a number of conferences – Boston for SearchLove and SMX Munich, both of which were great events – and I’m going to be heading to a bunch more soon. People have this idea that Google must be doing these things, must have made these advancements over the years. It turns out, in actuality, they haven’t made them. Some of them, there are probably really good reasons behind it, and some of them it might just be because they’re really hard to do.

But let’s talk through a few of these, and in the comments we can get into some discussion about whether, when, or if they might be doing some of these.

So number one, a lot of people in the SEO field, and even outside the field, think that it must be the case that if links really matter for SEO, then on-topic links matter more than off-topic links. So, for example, if I’m linking to two websites here about gardening resources, A and B, both about gardening resources, and one of those comes from a botany site and the other one comes from a site about mobile gaming, well, all other things being true, it must be that the one about botany is going to provide a stronger link. That’s just got to be the case.

And yet, we cannot seem to prove this. There doesn’t seem to be data to support it. Anyone who’s analyzed this problem in depth, which a number of SEOs have over the years — a lot of very advanced people have gone through the process of classifying links and all this kind of stuff — seems to come to the same conclusion: Google seems to think about links from a more subject/context-agnostic perspective.

I think this might be one of those times where they have the technology to do it. They just don’t want to. My guess is what they’ve found is if they bias to these sorts of things, they get a very insular view on what’s kind of popular and important on the Web, and if they have this more broad view, they can actually get better results. It turns out that maybe it is the case that the gardening resources site that botanists love is not the one with mass appeal, is not the one that everyone is going to find useful and valuable, and isn’t representing the entirety of what the Web thinks about who should be ranking for gardening resources. So they’ve kind of biased against this.

That is my guess. But from every observable input we’ve been able to run, every test I’ve ever seen from anybody else, it seems to be the case that if there’s any bias, it’s extremely slight, almost unnoticeable. Fascinating.

Number two, I’m actually in this camp. I still think that someday it’s coming, that anchor text influence will eventually decline. Yet, while other signals have certainly risen in importance, it seems that specific anchor text inside a link is still far more powerful than generic anchor text.

Getting specific, targeting something like “gardening supplies” when I link to A, as opposed to on the same page saying something like, “Oh, this is also a good resource for gardening supplies,” but all I linked with was the text “a good resource” over to B, that A is going to get a lot more ranking power. Again, all other things being equal, A will rank much higher than B, because this anchor text is still pretty influential. It has a fairly substantive effect.

I think this is one of those cases where a lot of SEOs said, “Hey, anchor text is where a lot of manipulation and abuse is happening. It’s where a lot of Web spam happens. Clearly Google’s going to take some action against this.”

My guess, again, is that they’ve seen that the results just aren’t as good without it. This speaks to the power of being able to generate good anchor text. A lot of that, especially when you’re doing content marketing kinds of things for SEO, depends on nomenclature, naming, and branding practices. It’s really about what you call things and what you can get the community and your world to call things. Hummingbird has made advancements in how Google does a lot of this text recognition, but for these tough phrases, anchor text is still strong.

Number three, 302s. So 302s have been one of these sort of long-standing kind of messes of the Web, where a 302 was originally intended as a temporary redirect, but many, many websites and types of servers default to 302s for all kinds of pages that are moving.

So A 301-redirects to B, versus C 302-redirects to D. Is it really the case that the people who run C plan to change where the redirect points in the future, and is it really the case that they do so more than A does with B?

Well, a lot of the time, probably not. But it still is the case, and you can see plenty of examples of this happening out in the search results and out on the Web, that Google interprets this 301 as being a permanent redirect. All the link juice from A is going to pass right over to B.

With C and D, it appears that with big brands, when the redirect’s been in place for a long time and they have some trust in it, maybe they see some other signals, some other links pointing over here, then yes, some of this does pass over, but it is not nearly what’s happening with a 301. A 301 is a directive, while a 302 is sort of a nudge or a hint. It just seems to be important to still get those 301s, those right kinds of redirects, right.

By the way, there are also a lot of other kinds of 30X status codes that can be issued on the Web and that servers might fire. So be careful. You see a 305, a 307, 309, something weird, you probably want a 301 if you’re trying to do a permanent redirect. So be cautious of that.

Number four: speaking of nudges and hints versus directives, rel=”canonical” has been an interesting one. So when rel=”canonical” first launched, what Google said about it was that rel=”canonical” is a hint to us, but we won’t necessarily take it as gospel.

Yet, every test we saw, even from those early launch days, was, man, they are taking it as gospel. You throw a rel=”canonical” on a trusted site accidentally on every page and point it back to the homepage, Google suddenly doesn’t index anything but the homepage. It’s crazy.

You know what? The tests that we’ve seen run and mistakes — oftentimes, sadly, it’s mistakes that are our examples here — that have been made around rel=”canonical” have shown us that Google still has this pretty harsh interpretation that a rel=”canonical” means that the page at A is now at B, and they’re not looking tremendously at whether the content here is super similar. Sometimes they are, especially for manipulative kinds of things. But you’ve got to be careful, when you’re implementing rel=”canonical”, that you’re doing it properly, because you can de-index a lot of pages accidentally.

So this is an area of caution. It seems like Google still has not progressed on this front, and they’re taking that as a pretty basic directive.
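For reference, the tag being discussed is a single line in a page’s <head>; the URLs below are just placeholders:

```html
<!-- On the duplicate page, pointing search engines at the preferred URL -->
<head>
  <link rel="canonical" href="http://example.com/preferred-page/" />
</head>
```

Put it on every variant of a page accidentally (pointing at the homepage, say) and, as described above, those variants can drop out of the index entirely.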

Number five. I think, for a long time, a lot of us have thought, hey, the social web is rising. Social is where a lot of the great content is being shared, where people are pointing to important things, and where endorsements are happening, more so, potentially, than in the link graph. The social web and the social graph have become sort of the common man’s link graph.

And yet, with the exception of the two years when Google had a very direct partnership with Twitter (when tweets, indexation, and all that kind of stuff were heavily influential for Google search results), we haven’t seen that again from Google since that partnership broke up. They’ve actually sort of backtracked on social, and they’ve kind of said, “Hey, you know, tweets, Facebook shares, likes, that kind of stuff, it doesn’t directly impact rankings for everyone.”

Google+ being sort of an exception, especially in the personalized results. But even the tests we’ve done with Google+ for non-personalized results have appeared to do nothing, as yet.

So these shares that are happening all over social, I think what’s really happening here is that Google is taking a look and saying, “Hey, yes, lots of social sharing is going on.” But the good social sharing, the stuff that sticks around, the stuff that people really feel is important is still, later on at some point, earning a citation, earning a link, a mention, something that they can truly interpret and use in their ranking algorithm.

So they’re relying on the fact that social can be a tip-off or a tipping point for a piece of content or a website or a brand or a product, whatever it is, to achieve some popularity, but that will eventually be reflected in the link graph. They can wait until that happens rather than using social signals, which, to be fair, carry some potential for manipulation that I think they’re worried about exposing themselves to. There’s also, of course, the fact that they no longer have API-level access and partnerships with Facebook and Twitter, and so that could be causing some of that too.

Number six, last one. Google talked about cleaning up web spam for a long time, but from ’06, ’07 to about 2011, 2012, it was pretty sketchy. It was tough.

When they did start cleaning up web spam, I think a lot of us thought, “Well, eventually they’re going to get to PPC too.” I don’t mean pay-per-click. I mean porn, pills, and casino.

But it turns out, as Matt Brown from Moz wisely and recently pointed out in his SearchLove presentation in Boston, that, yes, if you look at the search results around these categories, whatever it is — buy Cialis online, Texas hold-’em no-limit poker, removed for content, because Whiteboard Friday is family-friendly, folks — whatever the search is that you’re performing in these spheres, this is actually kind of the early-warning SERPs of the SEO world.

You can see a lot of the changes that Google’s making around spam and authority and signal interpretation. One of the most interesting ones that you’ve probably observed, if you study this space, is that a lot of those hacked .edu pages, or the barnacle SEO that was happening on sub-domains of more trusted sites that had gotten a bunch of links, that kind of stuff, is ending a little bit. We’re seeing a little bit more of the rise, again, of the exact-match domains and some of the affiliate sites, and of getting links from more creative places, because it does seem like Google’s gotten quite a bit better at which links they consider and at how they judge the authoritativeness of pages that might be hanging on or clinging onto a domain but aren’t well linked to internally on some of those more trusted sites.

So, that said, I’m looking forward to some fascinating comments. I’m sure we’re going to have some great discussions around these. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Better Ways to Embed Tables and Spreadsheets in Web Pages

April 17th, 2014

It is easy to embed tabular data in web pages. You can either use the standard <table> HTML tag, or you can input the tabular data in a spreadsheet — like Excel Online or Google Spreadsheets — and embed the sheet in your web pages.

HTML tables are easy, while spreadsheet-based tables allow better formatting and complex layouts — like nested tables within a table — without fiddling with the code. Here are the different ways you can embed tables in your website, along with their pros and cons.

How to create an HTML table

If you have access to a WYSIWYG editor like Dreamweaver, you can easily create an HTML table using the built-in wizards, but I prefer using Markdown for creating tables as it requires no tags. Go to gist.github.com (you don’t even need an account here) and enter the table in the following format:

Column A | Column B
-------- | -------
Cell A1 | Cell B1
Cell A2 | Cell B2

Each column is separated by a pipe (|), while hyphens (-) indicate the table headings. Name the gist table.md (the .md extension indicates Markdown) and click the “Create Secret Gist” button to render the markdown as a table.

Once the gist is saved, it will show you the visual table, which you can copy-paste into any rich-text editor like the Gmail compose window. Alternatively, you can right-click the table on GitHub and choose Inspect Element to view the actual HTML tags for that table.


Tableizer is another simple tool for converting spreadsheet data into HTML table code. Create a table inside Excel or the Numbers app on your desktop, copy the cells, and paste them inside Tableizer. It will generate the HTML code that can be used on your blog or website.

Embed Google Sheets in your Website

A popular option for embedding tabular data in a web page is through Google Docs (Spreadsheets). The advantage with this approach is that you can modify the data in the spreadsheet and the embedded table will update itself to reflect the edits. There’s no need to edit the web page containing the table.

Go to spreadsheets.google.com, enter some data in the sheet, and then choose the Publish to the Web option from the File menu. Choose Start Publishing, and Google Drive will offer you the IFRAME embed code for that particular sheet.
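The code Google hands back is a plain IFRAME along these lines (a sketch only: the key and gid values are placeholders, and the exact parameters Google generates for your sheet may differ):

```html
<!-- "Publish to the Web" embed code; key and gid are placeholders -->
<iframe src="https://docs.google.com/spreadsheet/pub?key=YOUR_SHEET_KEY&single=true&gid=0&output=html&widget=true"
        width="500" height="300" frameborder="0"></iframe>
```

Paste that anywhere in your page’s HTML and the published sheet appears inline.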

The embedded sheet – see live version – will preserve the original formatting of the cells but it will still be a static HTML document – there’s no option for sorting or filtering data in the HTML table.

Embed Excel Sheets in Web Pages

This is my favorite method for embedding spreadsheet data in a web page, and I’ll soon explain why.

Go to office.live.com and create a new blank workbook. Enter the tabular data inside the Excel sheet and then choose File -> Share -> Embed -> Generate HTML.

Excel, unlike Google Docs, allows you to embed a select range of cells and not the entire spreadsheet. You can also include a download link in the embedded cells making it easier for your website visitor to download and open the table in their local spreadsheet app. The embedded spreadsheet also offers better copy-paste than Google Docs.
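The snippet Excel generates is likewise an IFRAME, roughly of this shape (a sketch only: the cid, resid, and authkey values are placeholders specific to your workbook):

```html
<!-- Excel Web App embed code; cid, resid, and authkey are placeholders -->
<iframe width="500" height="330" frameborder="0" scrolling="no"
        src="https://onedrive.live.com/embed?cid=YOUR_CID&resid=YOUR_RESID&authkey=YOUR_AUTHKEY&em=2"></iframe>
```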

Here’s a live version of an HTML table embedded using the Excel web app.

Related: Capture Web Tables into Excel

Make Static HTML Tables Interactive

If you wish to go with static HTML tables instead of interactive spreadsheet-based tables, you can consider adding the Excel button that will make your HTML tables interactive.

You have the regular HTML code for your <table>, and all you have to do is add another HTML tag to your web page that will turn the embedded static table into an interactive sheet (see this live version).

<a href="#" name="MicrosoftExcelButton"></a>

<table>
  <thead><tr>
    <th>Column A</th>
    <th>Column B</th>
    </tr></thead>
  <tbody>
    <tr>
      <td>Cell A1</td>
      <td>Cell B1</td>
    </tr>
    <tr>
      <td>Cell A2</td>
      <td>Cell B2</td>
    </tr>
  </tbody>
</table>

<script type="text/javascript" src="http://r.office.microsoft.com/r/rlidExcelButton?v=1&kip=1"></script>

This code will add a little Excel button next to your HTML table, and when someone clicks that button, it creates a beautiful and interactive view of the table with support for sorting and filtering. You can even visualize the HTML table as graphs without leaving the page.

HTML Tables or Spreadsheets?

The advantage of static HTML tables is that they are SEO-friendly (search engines can read your HTML table), while spreadsheet-based tables are not. The latter, however, allow better formatting options and are relatively easy to update.

If you wish to have the best of both worlds, go with an HTML table and use the Excel interactive view that will let viewers interact with the table on demand.

Related Guide: How to Embed Anything in a Website


This story, Better Ways to Embed Tables and Spreadsheets in Web Pages, was originally published at Digital Inspiration on 16/04/2014 under Embed, Microsoft Excel, Software
Digital Inspiration Technology Blog

Google’s Effective ‘White Hat’ Marketing Case Study

April 16th, 2014

There’s the safe way & the high risk approach. The shortcut takers & those who win through hard work & superior offering.

One is white hat and the other is black hat.

With the increasing instability of the search ecosystem over the past couple of years, some see these labels constantly sliding, sometimes on an ex-post-facto basis, arbitrarily turning thousands of white hats into black hats overnight.

Are you a white hat SEO? or a black hat SEO?

Do you even know?

Before you answer, please have a quick read of this Washington Post article highlighting how Google manipulated & undermined the US political system.

…

Seriously, go read it now.

It’s fantastic journalism & an important read for anyone who considers themselves an SEO.

…

Take the offline analog to Google’s search “quality” guidelines, and in spirit Google repeatedly violated every single one of them.

Advertorials

“Creating links that weren’t editorially placed or vouched for by the site’s owner on a page, otherwise known as unnatural links, can be considered a violation of our guidelines. … Advertorials or native advertising where payment is received for articles that include links that pass PageRank.”

Advertorials are spam, except when they are not: “the staff and professors at GMU’s law center were in regular contact with Google executives, who supplied them with the company’s arguments against antitrust action and helped them get favorable op-ed pieces published.”

Deception

Don’t deceive your users.

Ads should be clearly labeled, except when they are not: “GMU officials later told Dellarocas they were planning to have him participate from the audience,” which is just like an infomercial that must be labeled as an advertisement!

Preventing Money from Manipulating Editorial

Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google’s AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.

Money influencing outcomes is wrong, except when it’s not: “Google’s lobbying corps — now numbering more than 100 — is split equally, like its campaign donations, among Democrats and Republicans. … Google became the second-largest corporate spender on lobbying in the United States in 2012.”

Content Quality

The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.

Payment should be disclosed, except when it shouldn’t: “The school and Google staffers worked to organize a second academic conference focused on search. This time, however, Google’s involvement was not publicly disclosed.”

Cloaking

Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.

Cloaking is evil, except when it’s not: Even as Google executives peppered the GMU staff with suggestions of speakers and guests to invite to the event, the company asked the school not to broadcast its involvement. “We will certainly limit who we announce publicly from Google.”

…and on and on and on…

It’s not safe to assume that just because a specific deceptive technique isn’t included on this page, Google approves of it.

And while they may not approve of something, that doesn’t mean they avoid the strategy when mapping out their own approach.

There’s a lesson & it isn’t a particularly subtle one.

Free markets aren’t free. Who could have known?

Categories: 
google

SEO Book

Buy Cheaper Microsoft Office 365 Personal at $7/mo

April 16th, 2014


Microsoft has launched a much cheaper version called Microsoft Office 365 Personal Edition that costs just $6.99/month on subscription or $69.99 for a year. After Microsoft launched the much-hyped free Office for iPad, users realized it only allowed read access to files, and they needed to buy an Office 365 subscription to edit or create Office… Read the rest of this entry »

Getting hreflang Right: Examples and Insights for International SEO

April 15th, 2014

Posted by DaveSottimano

Most of us will remember the days in SEO when geotargeting was nearly impossible, and we all crawled to the shining example of Apple.com as our means of showcasing what the correct search display behaviour should be. Well, most of us weren’t Apple, and it was extremely difficult to determine how to structure your site to make it work for international search. Hreflang has been a blessing to the SEO industry, even though it’s had a bit of a troubled past.

There’s been much confusion as to how hreflang annotations should work, what the correct display behaviour is, and whether the implementation requires additional configuration such as the canonical tag or WMT targeting.

This isn’t a beginner- or even intermediate-level post, so if you don’t have a solid feel for hreflang already, I’d recommend reading through Google’s documentation before diving in.

In today’s post we’re going to cover the following:

  1. How to check international SERPs the right way
  2. What should hreflang do and not do
  3. Examples of hreflang behaviour
  4. Important tools for the serious international SEO
  5. Tips from my many screw-ups, and successes 


Section 1: How to check international SERPs the right way

I’ve said this once, and I’ll say it again: Know your Google search parameters better than your mother. Half the time we think something isn’t working, we don’t actually know how to check. Shy of having an IP in every country from which you want to check Google results, here is the next best thing:

For example, if I want to mimic a Spanish user in the US:

http://www.google.com/search?hl=es&gl=us&pws=0&q=seo

Or if I want to impersonate an Australian user:

http://www.google.com.au/search?hl=en&gl=au&pws=0&q=seo
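The two URLs above can be generated mechanically. Here’s a minimal Python sketch using the same parameters shown in the examples (`hl` for interface language, `gl` for country, `pws=0` to switch off personalization); the helper name is mine, not a standard API:

```python
from urllib.parse import urlencode

def google_serp_url(query, domain="www.google.com", hl="en", gl="us"):
    """Build a Google search URL that mimics a user with interface
    language `hl`, located in country `gl`, with personalized
    results disabled (pws=0)."""
    params = {"hl": hl, "gl": gl, "pws": 0, "q": query}
    return "http://%s/search?%s" % (domain, urlencode(params))

# Spanish-language user located in the US:
print(google_serp_url("seo", hl="es", gl="us"))
# Australian user on the Australian ccTLD:
print(google_serp_url("seo", domain="www.google.com.au", hl="en", gl="au"))
```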

If you want a full list of language/country codes that Google uses, please visit the Google CCTLDs language and reference sheet. If you want the Google Docs version go here, or if you want a tool to do this for you, check out Isearchfrom.

Section 2: What should hreflang do and not do


Hreflang will not:

  1. Replace geo-ranking factors: Just because you rank #1 in the US for “blue widgets” does not mean that your UK “blue widgets page” will rank #1 in the UK.
  2. Fix duplicate content issues: If you have duplicate copies of your pages targeting the same keywords, it does not mean that the right country version will rank because of hreflang. The same rules apply to general SEO; when there are exact or nearly exact duplicates, Google will choose which page to rank. Typically, we see the version with more authority ranking (authority can be determined loosely by #links, TBPR, DA, PA, etc.).

You might be wondering about duplicate content and Panda, which is a valid concern. I personally haven’t seen or heard of any site with international duplicate content being affected by Panda updates. The sites I have analyzed always had some sort of international SEO configuration, however, whether it was WMT targeting or hreflang annotations.


Hreflang will:

  1. Help the right country/language version of your cross-annotated pages appear in the correct versions of *google.*

Section 3: Examples of hreflang behaviour

Case 1: CNN.com

Configuration:

<head> hreflang, 302 redirect on homepage, and subdomain configuration

Sample of hreflang annotations:

<link href="http://www.cnn.com" hreflang="en-us" rel="alternate" title="CNN" type="text/html"/>
<link href="http://mexico.cnn.com" hreflang="es" rel="alternate" title="CNN Mexico" type="text/html"/>

What should happen according to the targeting?

Cnn.com is seen in EN-US, and any Spanish queries should display mexico.cnn.com

What actually happens?

Take a look at the US results for yourself.

Take a look at the Mexican results for yourself.

Let’s try to explain this behaviour:

  • Cnn.com actually 302s to edition.cnn.com; this is regular SEO behaviour that causes the origin page URL to display in search results while the content comes from the redirect target. 
  • Mexico.cnn.com is not the right answer for “es” (Spanish language) IMO, because it’s the Mexican version and should be annotated as “es-mx” ;) 
  • Since cnnespanol.cnn.com exists and seems to have worldwide news, I would use this as the “es” version.
  • Cross hreflang annotations are missing, so the whole thing isn’t going to work anyway…
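For illustration only, here is a hypothetical corrected set of annotations for this case (not CNN’s actual markup): every listed page would need to carry this same complete block — that is what cross-annotation means — with cnnespanol.cnn.com as the global Spanish version and mexico.cnn.com narrowed to Mexico:

```html
<!-- Hypothetical corrected cross-annotations; not CNN's actual markup -->
<link rel="alternate" hreflang="en-us" href="http://www.cnn.com" />
<link rel="alternate" hreflang="es" href="http://cnnespanol.cnn.com" />
<link rel="alternate" hreflang="es-mx" href="http://mexico.cnn.com" />
```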

Case 2: play.google.com

Configuration:

<head> hreflang, language/country variations and duplicate content

Sample of hreflang annotations:

*FYI - I’ve shortened this for simplicity

x-default - https://play.google.com/store/apps/details?id=com….

en_GB - https://play.google.com/store/apps/details?id=com….

en - https://play.google.com/store/apps/details?id=com….

What should happen according to the targeting?

X-default for non-annotated versions; the GB page should display in Google.co.uk

What actually happens?

Take a look at the results for yourself

Take a look at the UK results for yourself

Let’s try to explain this behaviour:

  • One thing you may not notice is that the EN, X default, and GB version are almost entirely duplicate (around 99%). Which one should the algorithm choose? This is a good example of hreflang not handling dupe content.
  • The GB version doesn’t display in UK search results, and the rankings are not the same (US ranking is higher than UK on average). The hreflang annotation uses an underscore rather than the standard hyphen (EN_GB versus EN-GB).
  • They use a self-referencing canonical, which, contrary to some beliefs, has absolutely no effect on the targeting
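The underscore mistake above is easy to catch mechanically. A minimal Python sketch that accepts language or language-region codes (and the special x-default value) but rejects underscores; note this deliberately ignores rarer script subtags like zh-Hans, so treat it as a first-pass check, not a full validator:

```python
import re

# language code, optionally "-" plus a region code, or the special
# value "x-default"; underscores (en_GB) deliberately fail to match
HREFLANG_RE = re.compile(r"^(x-default|[a-z]{2}(-[a-z]{2})?)$", re.IGNORECASE)

def is_valid_hreflang(value):
    return bool(HREFLANG_RE.match(value))

print(is_valid_hreflang("en-GB"))      # True: hyphenated, as Google expects
print(is_valid_hreflang("en_GB"))      # False: underscore instead of hyphen
print(is_valid_hreflang("x-default"))  # True: special catch-all value
```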

Case 3: Musicradar.com

Configuration:

<head> hreflang, subdomain & cctld, country targeting and x-default

Sample of hreflang annotations:

<link rel="alternate" hreflang="en-gb" href="http://www.musicradar.com/" />
<link rel="alternate" hreflang="x-default" href="http://www.musicradar.com/" />
<link rel="alternate" hreflang="en-us" href="http://www.musicradar.com/us/" />
<link rel="alternate" hreflang="fr-fr" href="http://www.musicradar.com/fr/" />

What should happen according to the targeting?

Musicradar.com should appear in GB and for all queries other than EN-US and FR-FR, where the respective subfolders should appear.

What actually happens?


See the Canadian results for yourself

See the American results for yourself

See the French results for yourself

Let’s try to explain this behaviour:

  • Perfect example of perfect implementation – you guys & gals working with Musicradar are pretty great. You get the honorary #likeaboss vote from me :)
  • One thing to notice is that they double list the EN-GB page also as the X-default
  • The English sitelink in the French results is pretty weird, but I think this is the perfect situation to escalate to Google as their implementation is correct as far as I can tell.

Case 4: Ridgid.com

Configuration:

XML sitemaps hreflang, subfolders, rel canonical and dupe content

Sample of hreflang annotations:

<loc>https://www.ridgid.com/</loc>
<xhtml:link hreflang="en-US" href="https://www.ridgid.com/" rel="alternate"/>
<xhtml:link hreflang="en-CA" href="https://www.ridgid.com/ca/en" rel="alternate"/>
<xhtml:link hreflang="en-PH" href="https://www.ridgid.com/ph/en" rel="alternate"/>

What should happen according to the targeting?

Ridgid.com should appear in the US, ridgid.com/ca/en should appear for Canadian English queries (google.ca), and ridgid.com/ph/en should appear in Google Philippines for English queries.

What actually happens?

Check out the Canadian results for yourself

Check out the Philippines results for yourself

Let’s try to explain this behaviour:

  • All 3 homepages are almost exactly identical, hence duplicate content
  • The Canadian version contains <link rel="canonical" href="https://www.ridgid.com/" />, which means it’s being canonicalized to the main US version
  • The Philippines version does not contain a canonical tag
  • Google is choosing which is the right duplicate version to show, unless there is a canonical instruction

Section 4: Tools for the serious International SEO

Essentials:

  • Reliable rank tracker that can localize: Advanced Web Ranking, Moz, etc…
  • Crawler that can validate hreflang annotations in XML sitemaps or within <head>: The only tool on the market that can do this, and does it very well, is Deepcrawl.

Other nice-to-haves:

  1. Your own method of “gathering” international search results at scale. You should probably go with proxies.
  2. Your own method of parsing XML sitemaps and cross checking (even if you use something like Deepcrawl, you’ll need to double check).
  3. Obvious, but worth a reminder: Google Webmaster Tools, Analytics, and access to server logs so you can understand Google’s crawl behaviour.
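For nice-to-have #2 above, here’s a minimal sketch of pulling hreflang annotations out of an XML sitemap with Python’s standard library; cross-checking then amounts to verifying that every annotated URL also appears as a <loc> carrying the same set of alternates. The function name and return shape are mine:

```python
import xml.etree.ElementTree as ET

NS = {
    "sm": "http://www.sitemaps.org/schemas/sitemap/0.9",
    "xhtml": "http://www.w3.org/1999/xhtml",
}

def sitemap_hreflang(xml_text):
    """Return {page URL: {hreflang value: alternate URL}} for a sitemap."""
    result = {}
    root = ET.fromstring(xml_text)
    for url in root.findall("sm:url", NS):
        loc = url.findtext("sm:loc", namespaces=NS)
        # collect every alternate-language link attached to this <url>
        alternates = {
            link.get("hreflang"): link.get("href")
            for link in url.findall("xhtml:link", NS)
            if link.get("rel") == "alternate"
        }
        result[loc] = alternates
    return result
```

Feeding it Ridgid-style sitemap entries like the ones in Case 4 yields the US, CA, and PH alternates keyed by page URL, ready to be diffed against the list of `<loc>` entries.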

Section 5: Tips from many screw-ups and successes

  1. Use either the <head> implementation or XML sitemaps, not both. It can technically work, but trust me, you’ll probably screw something up – just stick to one or the other.
  2. If you don’t cross annotate, it won’t work. Plain and simple, use Aleyda’s tool to help you.
  3. Google says you should self-reference hreflang, but I also see it working without (check out en.softonic.com). If you want to play safe, self reference; we don’t know what Google will change in the future.
  4. Try to eliminate the need for duplicate content, but if you must, it’s okay to use canonical + hreflang as long as you know what you’re doing. Check out this cool isolated test which is still relevant. Remember, mo’ dupes, mo’ problems.
  5. Hreflang needs time to work properly. At a bare minimum, Google needs to crawl both cross annotations for the switch to happen. Help yourself by pinging sitemaps, but be aware of at least a 2-day lag.
  6. You can double-annotate a URL when using X-default, in case you were afraid to. Don’t worry, it’s cool.
  7. Make sure you’re actually having a problem before you go ranting on webmaster forums. Double check what you’re seeing and ask other people to check as well. Check your Google parameters and personalized results!
  8. You can 302 your homepage when you’re using a country-redirect strategy. Yes, I know it sounds crazy; yes, a little bird told me, and I thoroughly tested this and didn’t see a loss. There are two sites I know of using this, so check them out: The Guardian & Red Bull.

Closing, burning question: You might be asking yourself, how the heck did he find so many examples? Or maybe not, but I’m going to tell you anyway.

My secret sauce is Nerdydata.com, and if you didn’t know about this beautiful site, I hope that Nerdydata.com gives me a free t-shirt or something for telling you.

I find most SEOs who know about the tool use it for trivial stuff like meta tags (this is my own opinion), but what it really should be used for is reverse engineering things like hreflang and schema.org to find working examples. For example, a footprint you might use is hreflang=”en-us”, and you’ll find a tonne of examples.

Here’s a few to get you started:

marketo.com asos.com 99designs.com sistrix.com
mozilla.org agoda.com emirates.com trivago.com
salesforce.com techradar.com symantec.com rentalcars.com
softonic.com aufeminin.com alfemminile.com moo.com
istockphoto.com ea.com freelotto.com softonic.it
americanexpress.com zara.com xero.com trustpilot.com
viadeo.com marriott.com gofeminin.de here.com
hotels.com enfemenino.com ringcentral.com mailjet.com

That’s it folks, hopefully you’ve learned a thing or two. Good luck in your international adventures, and feel free to say hi on Twitter. :)

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

The 10 HTML Codes You Need to Know for Writing on the Web

April 14th, 2014

The computer keyboard creates “dumb” punctuation marks that may not be acceptable in printed work but are common on web pages. It outputs straight quotation marks (") where your writing may require curly quotes, and it can only produce hyphens where an em or en dash would have made your text look more elegant and professional.

HTML Codes for Punctuation

Your keyboard lacks the keys for inserting the correct punctuation marks, but there are simple HTML character codes, or entities, that you can use to mimic the typography of printed books on the web. Here’s a quick guide to using HTML codes for inserting typographically correct punctuation marks in web writing.

Hyphens (-)

Hyphens, or half-dashes, are most commonly used to join together two or more words. You can type a hyphen directly without using any HTML codes.

  • The next-generation iPhone is expected to become available in mid-September.
  • The word email contains no hyphen but there’s one inside e-books and e-commerce.

Dashes (– —)

Dashes come in two sizes: the En dash (–) and the Em dash (—). The En dash is longer than a hyphen but shorter than an Em dash.

The En dash (&ndash;) is used for suggesting a range of numbers such as time periods, sports scores or page ranges.

  • Our bank is open 8a.m.–8p.m., Monday–Friday.
  • The players are disappointed after losing the match 3–1 to Spain.
  • Abraham Lincoln (1809–1865) was the 16th President of the United States.

The Em dash (&mdash;) is used to indicate breaks or pauses in a sentence, to quote sources or to separate a series of words within a phrase.

  • We plan to visit London this summer — if the visa is approved.
  • My three friends — John, Peter and Richards — are moving to New York.
  • “Well done is better than well said.” — Benjamin Franklin.

Ellipsis (…)

An ellipsis (&hellip;) is a series of three dots (periods) in a row, used to indicate the omission of one or more words from quoted material. If the ellipsis follows a complete sentence, end that sentence with a period, insert a space, then the ellipsis (…) followed by a space.

  • “All the world’s a stage, and all the men and women merely players.…” — Shakespeare
  • “I have a dream … they will not be judged by the color of their skin but by the content of their character.” — Martin Luther King, Jr

Quotes & Apostrophes (‘ ’ “ ”)

Your computer keyboard creates straight quotes (or dumb quotes), though what you really need are smart quotes (or curly quotes), which can be easily written in HTML. Dumb quotes are best reserved for writing programming code.

Use double quotation marks (&ldquo; and &rdquo;) to identify the exact words of a person, to indicate irony or for writing titles of creative works.

  • The last episode of “Friends” was the most-watched program of the year.
  • Thoreau said that “that government is the best which governs the least”.
  • They recalled the “toy safety” buttons as they contained lead paint.

Use single quotes (&lsquo; and &rsquo;) to indicate quotes within quotes. The right curly quotation mark (&rsquo;) can also be used as a smart apostrophe instead of the straight apostrophe.

  • John said, “I told her, ‘the traffic will only get worse.’”
  • Her answer was, “I’ll call you in the next hour or so.”

Prime (′ ″)

While we often use the apostrophe (‘) or quotation marks to indicate units of measurement (like feet or seconds or degrees), the correct symbols are primes, which look like slightly slanted quotes, written as &prime; (single) and &Prime; (double).

  • The height of this Ferrari car is 47′8″
  • I am currently at 27° 11′ 45.315″N, 78° 1′ 27.1668″ W.
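As a quick sanity check, Python’s standard html module expands all of these named entities into the real typographic characters:

```python
import html

# expand each named entity into the typographic character it stands for
for entity in ("&ndash;", "&mdash;", "&hellip;", "&lsquo;", "&rsquo;",
               "&ldquo;", "&rdquo;", "&prime;", "&Prime;"):
    print(entity, html.unescape(entity))
```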

Web Typography: Further Reading & Resources

Punctuation Mark      Symbol   HTML Entity   Decimal Code
Apostrophe            ’        &rsquo;       &#8217;
En dash               –        &ndash;       &#8211;
Em dash               —        &mdash;       &#8212;
Ellipsis              …        &hellip;      &#8230;
Single Quote (open)   ‘        &lsquo;       &#8216;
Single Quote (close)  ’        &rsquo;       &#8217;
Double Quote (open)   “        &ldquo;       &#8220;
Double Quote (close)  ”        &rdquo;       &#8221;
  • Keyboard Typing Shortcuts for iOS
  • The Associated Press Stylebook
  • The Economist Style Guide
  • The NYT Manual of Style & Usage
  • Matthew Butterick’s Practical Typography
  • Smart Quotes for Smart People
  • Benedikt Lehnert’s Typographic Adventures
  • Typography for Lawyers

This story, The 10 HTML Codes You Need to Know for Writing on the Web, was originally published at Digital Inspiration on 07/04/2014 under English, Internet

Report: Google Running Another Test To Map In-Store Sales To AdWords Ads

April 13th, 2014

When Google introduced Estimated Total Conversions in AdWords last September, the company made clear it was working on ways to measure the impact of online advertising on in-store sales as it outlined the initiative to give advertisers a complete view of the effect their search advertising has on…



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Prime Reasons Why SEO Plays a Part in Enhanced Web Traffic

It has recently been found that the strategies adopted by SEO Birmingham companies have become critically significant to the success of small businesses. If your online business is not making use of SEO Birmingham, this write-up discusses the prime reasons why you should opt for the local strategies used by SEO Birmingham companies.

As per recent findings, more than 39% of netizens experience problems while searching for local businesses over the World Wide Web. People know about the existence of a local business but suffer inconvenience while locating web information about it. The major reason behind this inconvenience is that such businesses fail to understand the relevance of SEO Birmingham. Therefore, if your business provides products and services over the web, nationally or internationally, search engine optimization can be of great help in making the brand visible in search engine results.

Nowadays, more and more people rely on the Internet for finding local businesses

There was a time when local businesses did not worry about the scope of SEO; word of mouth was more than enough to spread their existence to local consumers. But today, statistics show that more than 84% of people use the Internet to locate local businesses. No more telephone directories; people rely on search engines.

Thus, it becomes essential to change the traditional marketing strategy to modern day strategy of online marketing.

SEO Birmingham costs a little less

If you are considering an AdWords strategy to capture the online marketing domain, you must be aware that the popularity of a chosen keyword is directly proportional to the fee paid. Selecting local keywords means less keyword competition, which means you need not pay extra costs.

Helps in reaping the benefits of advanced Google features

SEO companies in Birmingham, such as SEO Results4u at Avon House, 435 Stratford Road, Shirley, Solihull, West Midlands, B90 4AA 0121 746 3121 also contribute to the SEO landscape of their local area, be it Solihull, Birmingham or even the wider West Midlands area.

People are actually unaware of the fact that Google+ has changed the traditional way of Internet usage. If the keywords chosen are relevant to the local market, you very wisely unlock enhanced services offered by Google:

  • A map representing the physical location of a business
  • Appealing pictures with respect to the business
  • Make use of the reviews posted by users

Truth be told, without using the local platform of SEO, Google+ fails to recognize your business, which clearly means a lack of authentic information over the web.

Local SEO encourages better and enhanced credibility

People trust Yahoo, Bing, and Google with their eyes closed and believe these magical search engines have remedies for each and every query. It is a well-accepted notion that the brands appearing in the top search results are the most wanted and authentic service providers. So, if you want people to believe in your brand's credibility, search engine optimization adds that credibility to your brand power. Local SEO adds credibility as well as a definite increase in web traffic.

Your Peers Are Using It

Business is all about competition. Your peers are using it and reaping the benefits; why aren’t you?