How to Receive Files in your Google Drive from Anyone

April 24th, 2014

A school teacher wants to have a public drop box (not Dropbox) where students can upload homework assignments. A recruiter wants to have an online form where job applicants can upload their resumes. A news organization may need a public drop box where people can upload files anonymously.

Google Forms would have been a perfect solution here but unfortunately you cannot upload files to Google Drive through Forms.

The other option is to have a shared folder inside Google Drive, but there are limitations. First, you need a Gmail account and must be invited by the folder owner before you can upload files to a shared folder. Second, all collaborators can view, and even remove, files that have been uploaded to a shared folder on Google Drive.

Drive Form with File Uploads

Receive Files in Google Drive with Forms

What you can do is create a regular web form (written in HTML and CSS) and then use Google Apps Script to post the content of this form into Google Drive.

Before diving into the implementation, take a look at this sample form. When you upload a file, it will show up in a specific folder on my Google Drive. You don’t even need to have a Google Account to upload files, and the Google Script-based form even works on mobile devices.

  • Click here to make a copy of the Google Script into your Google Drive.
  • This is a vanilla form with a text field, a file input field and a form submit button. You can open the form.html file to apply your own CSS styles or add more input and textarea fields.
  • From the Run menu, choose doGet and authorize the script. The script needs these permissions since the form will be uploading files to your Google Drive.
  • Next choose Deploy as Web App from the Publish menu. Click Save New Version, choose Anyone, even Anonymous from the drop-down and click the Deploy button.

The Google Script will now offer you a form URL. Anyone can now use this form to upload files to your Google Drive.

<form id="myForm">

    <label>Your Name</label>
    <input type="text" name="myName">

    <label>Pick a file</label>
    <input type="file" name="myFile">

    <input type="submit" value="Upload File" 
               onclick="google.script.run
                        .uploadFiles(this.parentNode);
                        return false;">
</form>
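
On the server side, the form works because the Apps Script project exposes two functions: doGet, which serves form.html, and uploadFiles, which the submit button calls through google.script.run. The snippet below is only a rough sketch of what that Code.gs file might contain, not the exact code in the shared script; the folder name "Uploaded Files" is an assumption you can change.

    // Code.gs -- a minimal sketch; the script you copy may differ in detail.

    // Serves form.html when someone opens the web app URL.
    function doGet(e) {
      return HtmlService.createHtmlOutputFromFile('form');
    }

    // Called from the form via google.script.run.uploadFiles(this.parentNode).
    function uploadFiles(formObject) {
      try {
        // Assumed folder name -- files land here; create it if it doesn't exist.
        var folderName = 'Uploaded Files';
        var folders = DriveApp.getFoldersByName(folderName);
        var folder = folders.hasNext() ? folders.next() : DriveApp.createFolder(folderName);

        // The file input arrives as a blob; save it and record who sent it.
        var file = folder.createFile(formObject.myFile);
        file.setDescription('Uploaded by ' + formObject.myName);

        return 'File uploaded: ' + file.getUrl();
      } catch (error) {
        return 'Error: ' + error.toString();
      }
    }

If you want to confirm the upload to the visitor, the button's onclick chain can be extended with withSuccessHandler to display whatever string uploadFiles returns.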

This is a bare-bones version of the form but it can be further enhanced through Apps Script. For instance, you may choose to save the form details in a Google Spreadsheet along with the Drive URLs of the uploaded files. The script can even email you the uploaded files as attachments similar to Mail Merge.

[*] If this looks a bit too technical, you can ask guests to send you files as email attachments and then use the Gmail to Google Drive program to automatically save these file attachments in your Drive.

[**] If you aren’t on Gmail or Google Apps, you can still receive files in Dropbox from anyone anonymously.



This story, How to Receive Files in your Google Drive from Anyone, was originally published at Digital Inspiration on 23/04/2014 under Google Drive, Internet
Digital Inspiration Technology Blog

SearchCap: Old Google Street Views, Bing For Schools & DailyMotion’s Upset

April 23rd, 2014

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. From Search Engine Land: “Bing For Schools” Becomes “Bing In The Classroom” & Makes Program Available To All Schools Originally launched in August of last…



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

What’s Wrong With A/B Testing

April 23rd, 2014

A/B testing is an internet marketing standard. To optimize response rates, you compare one page against another and run with the page that performs best.

But anyone who has tried A/B testing will know that whilst it sounds simple in concept, it can be problematic in execution. For example, it can be difficult to determine if what you’re seeing is a tangible difference in customer behaviour or simply a result of chance. Is A/B testing an appropriate choice in all cases? Or is it best suited to specific applications? Does A/B testing obscure what customers really want?

In this article, we’ll look at some of the gotchas for those new to A/B testing.

1. Insufficient Sample Size

You set up a test. You’ve got one page featuring call-to-action A and one page featuring call-to-action B. You enable your PPC campaign and leave it running for a day.

When you stop the test, you find call-to-action A converted at twice the rate of call-to-action B. So call-to-action A is the winner, and we should run with it and eliminate option B.

But this would be a mistake.

The sample size may be insufficient. If we only tested one hundred clicks, we might see a big difference between the two pages that doesn’t hold up when we get to 1,000 clicks. In fact, the result may even be reversed!

So, how do we determine a sample size large enough to give statistically significant results? This excellent article explains the maths. However, there are various online sample size calculators that will do the calculations for you, including Evan’s. Most A/B tracking tools include sample size calculators, but it’s a good idea to understand what they’re calculating, and how, to ensure the accuracy of your tests.

In short, make sure you’ve tested enough of the audience to determine a trend.
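
For the curious, those calculators are based on the standard two-proportion formula. Here is a rough sketch, assuming a 95% confidence level and 80% power; the 5% and 6% conversion rates below are made-up numbers purely for illustration.

    // Rough per-variation sample size for detecting a difference between two conversion rates.
    // Assumes a two-sided 95% confidence level (z = 1.96) and 80% power (z = 0.84).
    function sampleSizePerVariation(baselineRate, minDetectableRate) {
      var zAlpha = 1.96;
      var zBeta = 0.84;
      var variance = baselineRate * (1 - baselineRate) +
                     minDetectableRate * (1 - minDetectableRate);
      var effect = baselineRate - minDetectableRate;
      return Math.ceil(Math.pow(zAlpha + zBeta, 2) * variance / (effect * effect));
    }

    // Detecting a lift from a 5% to a 6% conversion rate needs
    // roughly 8,100+ visitors per variation -- far more than 100 clicks.
    console.log(sampleSizePerVariation(0.05, 0.06));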

2. Collateral Damage

We might want to test a call-to-action metric, say the number of people who click on the "find out more" link on a landing page. We find that a lot more people click on this link when we use the term "find out more" than when we use the term "buy now".

Great, right?

But what if the conversion rate for those who actually make a purchase falls as a result? We achieved higher click-thrus on one landing page at the expense of actual sales.

This is why it’s important to be clear about the end goal when designing and executing tests. It’s also important to look at the process as a whole, especially when we’re chopping the process up into bits for testing purposes. Does a change in one place affect something else further down the line?

In this example, you might A/B test the landing page whilst keeping an eye on your total customer numbers, deeming the change effective only if customer numbers also rise. If your aim was only to increase click-throughs, say to boost quality scores, then the change was effective.

3. What, Not Why

In the example above, we know the "what". We changed the wording of a call-to-action link and achieved higher click-throughs, although we’re still in the dark as to why. We’re also in the dark as to why the change of wording resulted in fewer sales.

Was it because we attracted more people who were information seekers? Were buyers confused about the nature of the site? Did visitors think they couldn’t buy from us? Were they price shoppers who wanted to compare price information up front?

We don’t really know.

But that’s good, so long as we keep asking questions. These types of questions lead to more ideas for A/B tests. By turning testing into an ongoing process, supported by asking more and hopefully better questions, we’re more likely to discover a whole range of “why’s”.

4. Small Might Be A Problem

If you’re a small company competing directly with big companies, you may already be on the back foot when it comes to A/B testing.

A/B testing’s very modularity can cause problems. And what about cases where the number of tests that can be run at once is low? A/B testing makes sense on big websites where you can run hundreds of tests per day across hundreds of thousands of hits, but in channels like direct mail only a few offers can be tested at one time. The variance these tests reveal is often so low that any meaningful statistical analysis is impossible.

Put simply, you might not have the traffic to generate statistically significant results. There’s no easy way around this problem, but the answer may lie in getting tricky with the maths.

Experimental design massively and deliberately increases the amount of variance in direct marketing campaigns. It lets marketers project the impact of many variables by testing just a few of them. Mathematical formulas use a subset of combinations of variables to represent the complexity of all the original variables. That allows the marketing organization to more quickly adjust messages and offers and, based on the responses, to improve marketing effectiveness and the company’s overall economics.

Another thing to consider: if you’re certain the bigger company is running A/B tests, and achieving good results, then "steal" their landing page*. Take their ideas for landing pages and test them against your existing pages. *Of course, you can’t really steal their landing page, but you can be "influenced by" their approach.

What your competitors do is often a good starting point for your own tests. Try taking their approach and refine it.

5. Might There Be A Better Way?

Are there alternatives to A/B testing?

Some swear by the multi-armed bandit methodology:

The multi-armed bandit problem takes its terminology from a casino. You are faced with a wall of slot machines, each with its own lever. You suspect that some slot machines pay out more frequently than others. How can you learn which machine is the best, and get the most coins in the fewest trials?
Like many techniques in machine learning, the simplest strategy is hard to beat. More complicated techniques are worth considering, but they may eke out only a few hundredths of a percentage point of performance.

Then again…

What a multi-armed bandit algorithm does is aggressively (and greedily) optimize for the currently best-performing variation, so the worse-performing versions end up receiving very little traffic (mostly in the explorative 10% phase). With so little traffic, when you try to calculate statistical significance there is still a lot of uncertainty about whether a variation is "really" performing worse or whether its poor showing is due to random chance. So a multi-armed bandit algorithm takes a lot more traffic to declare statistical significance than the simple randomization of A/B testing. (But, of course, in a multi-armed bandit campaign, the average conversion rate is higher.)
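
To make that "explorative 10% phase" concrete, here is a minimal epsilon-greedy sketch, one common bandit strategy rather than the algorithm any particular testing tool uses: with epsilon at 0.1, 10% of visitors see a random variation and the rest see whichever variation is currently converting best.

    // Minimal epsilon-greedy bandit: explore 10% of the time, exploit the rest.
    function createBandit(variationCount, epsilon) {
      var shows = [];
      var conversions = [];
      for (var i = 0; i < variationCount; i++) {
        shows.push(0);
        conversions.push(0);
      }

      function conversionRate(i) {
        return shows[i] ? conversions[i] / shows[i] : 0;
      }

      function bestVariation() {
        var best = 0;
        for (var i = 1; i < variationCount; i++) {
          if (conversionRate(i) > conversionRate(best)) best = i;
        }
        return best;
      }

      return {
        // Pick the variation to show the next visitor.
        choose: function () {
          var index = Math.random() < epsilon
            ? Math.floor(Math.random() * variationCount) // explore
            : bestVariation();                           // exploit
          shows[index]++;
          return index;
        },
        // Record that a shown variation converted.
        recordConversion: function (index) {
          conversions[index]++;
        }
      };
    }

    // Usage: var bandit = createBandit(2, 0.1);
    // var shown = bandit.choose(); ... bandit.recordConversion(shown);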

Multivariate testing may be suitable if you’re testing a combination of variables, as opposed to just one, e.g.:

  • Product Image: Big vs. Medium vs. Small
  • Price Text Style: Bold vs. Normal
  • Price Text Color: Blue vs. Black vs. Red

That’s 3 x 2 x 3 = 18 different versions to test.

The problem with multivariate tests is they can get complicated pretty quickly and require a lot of traffic to produce statistically significant results. One advantage of multivariate testing over A/B testing is that it can tell you which part of the page is most influential. Was it a graphic? A headline? A video? If you’re testing a page using an A/B test, you won’t know. Multivariate testing will tell you which page sections influence the conversion rate and which don’t.
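
To see how quickly those combinations add up, here is a tiny sketch that enumerates every version implied by the example above:

    // Enumerate every combination of the example variables: 3 x 2 x 3 = 18 versions.
    var variables = {
      productImage: ['Big', 'Medium', 'Small'],
      priceTextStyle: ['Bold', 'Normal'],
      priceTextColor: ['Blue', 'Black', 'Red']
    };

    var versions = [{}];
    Object.keys(variables).forEach(function (name) {
      var expanded = [];
      versions.forEach(function (partial) {
        variables[name].forEach(function (value) {
          var combo = {};
          for (var key in partial) combo[key] = partial[key];
          combo[name] = value;
          expanded.push(combo);
        });
      });
      versions = expanded;
    });

    console.log(versions.length); // 18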

6. Methodology Is Only One Part Of The Puzzle

So is A/B testing worthwhile? Are the alternatives better?

The methodology we choose will only be as good as the test design. If tests are poorly designed, then the maths, the tests, the data and the software tools won’t be much use.

To construct good tests, you should first take a high level view:

Start the test by first asking yourself a question, something along the lines of, “Why is the engagement rate of my site lower than that of my competitors?” Collect information about your product from customers before setting up any big test. If you plan to test your tagline, run a quick survey among your customers asking how they would define your product.

Secondly, consider the limits of testing. Testing can be a bit of a heartless exercise. It’s cold. We can’t really test how memorable and how liked one design is over the other, and typically have to go by instinct on some questions. Sometimes, certain designs just work for our audience, and other designs don’t. How do we test if we’re winning not just business, but also hearts and minds?

Does it mean we really understand our customers if they click this version over that one? We might see how they react to an offer, but that doesn’t mean we understand their desires and needs. If we’re getting click-backs most of the time, then it’s pretty clear we don’t understand the visitors. Changing a graphic here, and wording there, isn’t going to help if the underlying offer is not what potential customers want. No amount of testing ad copy will sell a pink train.

The understanding of customers is gained in part by tests, and in part by direct experience with customers and the market we’re in. Understanding comes from empathy. From asking questions. From listening to, and understanding, the answers. From knowing what’s good, and bad, about your competitors. From providing options. From open communication channels. From reassuring people. You’re probably armed with this information already, and that information is highly useful when it comes to constructing effective tests.

Do you really need A/B testing? Used well, it can markedly improve and hone offers. It isn’t a magic bullet. Understanding your audience is the most important thing. Google, a company that uses testing extensively, seem to be most vulnerable when it comes to areas that require a more intuitive understanding of people. Google Glass is a prime example of failing to understand social context. Apple, on the other hand, were driven more by an intuitive approach. Jobs: “We built [the Mac] for ourselves. We were the group of people who were going to judge whether it was great or not. We weren’t going to go out and do market research”

A/B testing can work wonders, just so long as it isn’t used as a substitute for understanding people.

Categories: 
Conversion

SEO Book

Learn Local Search Marketing

April 22nd, 2014

Last October Vedran Tomic wrote a guide for local SEO which has since become one of the more popular pages on our site, so we decided to follow up with a Q&A on some of the latest changes in local search.


Q: Google appears to have settled their monopolistic abuse charges in Europe. As part of that settlement they have to list 3 competing offers in their result set from other vertical databases. If Google charges for the particular type of listing then these competitors compete in an ad auction, whereas if the vertical is free those clicks to competitors are free. How long do we have until Google’s local product has a paid inclusion element to it?

A: The local advertising market is huge. It’s a market that Google still hasn’t mastered, and one still dominated by IYP (internet yellow pages) platforms.

Since search in general is stagnant, Google will be looking to increase their share of the market.

That was obvious to anyone who covered Google’s attempt to acquire Groupon, since social couponing is mostly a local marketing phenomenon.

Their new dashboard is not only more stable with a slicker interface, but also capable of facilitating any paid inclusion module.

I would guess that Google will not wait a long time to launch a paid inclusion product or something similar, since they want to keep their shareholders happy.

Q: In the past there have been fiascos with things like local page cross-integration with Google+. How “solved” are these problems, and how hard is it to isolate these sorts of issues from other potential issues?

A: Traditionally, Google had the most trouble with their “local” products. Over the years, they were losing listings, reviews, merging listings, duplicating them etc. Someone called their attempts “a train wreck at the junction.” They were also notoriously bad with providing guidance that would help local businesses navigate the complexity of the environment Google created.

Google has also faced some branding challenges – confusing even the most seasoned local search professionals with their branding.

Having said that, things have been changing for the better. Google has introduced phone support which is, I must say, very useful. In addition, the changes they made in a way they deal with local data made things more stable.

However, I’d still say that Google’s local products are their biggest challenge.

Q: Yelp just had strong quarterly results and Yahoo! has recently added a knowledge-graph-like pane to their search results. How important is local search on platforms away from Google? How aligned are the various local platforms on ranking criteria?

A: Just like organic search is mostly about two functions – importance and relevance, local search is about location prominence, proximity and relevance (where location prominence is an equivalent to importance in general SEO).

All local search platforms have ranking factors that are based on these principles.

The only thing that differs is what they consider ranking signals and the weight they place on each. For example, to rank high in Yahoo! Local, one needs to be very close to the centroid of the town, have something in the business title that matches the search query, and have a few reviews.

Google is more sophisticated, but the principles are the same.

The less sophisticated local search platforms use fewer signals in their algorithms, and are usually geared more towards proximity as a ranking signal.

It’s also important to note that local search functions as a very interconnected ecosystem, and that changes made in order to boost visibility in one platform, might hurt you in another.

Q: There was a Google patent where they mentioned using driving directions to help as a relevancy signal. And Bing recently invested in and licensed data from Foursquare. Are these the sorts of signals you see taking weight from things like proximity over time?

A: I see these signals increasing in importance over time, as they would be useful ranking signals. However, to Google, local search is also about location sensitivity, and these signals will probably not be used outside of that context.

If you read the patent named “Methods And Systems For Improving A Search Ranking Using Location Awareness” (Amit Singhal is one of the inventors), you will see that Google is, in fact, aware that people have different sensitivities for different types of services and queries. You don’t necessarily care where your plumber comes from, but you do care where the pizza places are when you search for pizza in your location.

I don’t see driving directions as a signal ever de-throning proximity, because proximity is closer to the nature of the offline/online interaction.

Q: There are many different local directories which are highly relevant to local, while there are also vertical specific directories which might be tied to travel reviews or listing doctors. Some of these services (say like OpenTable) also manage bookings and so on. How important is it that local businesses “spread around” their marketing efforts? When does it make sense to focus deeply on a specific platform or channel vs to promote on many of them?

A: This is a great question, Aaron! About 5 years ago, I believed that the only true game in town for any local business was Google. This was because, at that time, I wasn’t invested in proper measurement of outcomes and metrics such as cost of customer acquisition, lead acquisition, etc.

Local businesses, famous for their lack of budgets, should always “give” vertical platforms a try, even IYP type sites. This is why:

  • one needs to decrease dependence on Google because it’s an increasingly fickle channel of traffic acquisition (Penguin and Panda didn’t spare local websites),
  • sometimes, those vertical websites can produce great returns. I was positively surprised by the number of inquiries/leads one of our law firm clients got from a well known vertical platform.
  • using different marketing channels and measuring the right things can improve your marketing skills.

Keep in mind, basics need to be covered first: data aggregators, Google Places, creating a professional/usable/persuasive website, as well as developing a measurement model.

Q: What is the difference between incentivizing a reasonable number of reviews & being so aggressive that something is likely to be flagged as spam? How do you draw the line with trying to encourage customer reviews?

A: Reviews and review management have always been tricky, as well as important. We know two objective things about reviews:

  • consumers care about reviews when making a purchase and
  • reviews are important for your local search visibility.

Every local search/review platform worth its salt will have a policy in place discouraging incentivized or "bought" reviews. They will enforce this policy using algorithms or humans. We all know that.

Small and medium-sized businesses make the mistake of trying to get as many reviews as humanly possible and directing them all to one or two local search platforms. Here, they make two mistakes:

1. believing that one needs a huge number of reviews on Google, and
2. directing all their review efforts at Google.

This behavior gets them flagged, algorithmically or manually. Neither Google nor Yelp wants you to solicit reviews.

However, if you change your approach from aggressively asking for reviews to a survey-based approach, you should be fine.

What do I mean by that?

A survey-based approach means you solicit your customers’ opinions on different services/products to improve your operations – and then ask them to share their opinion on the web while giving them plenty of choices.

This approach will get you much further than mindlessly begging people for reviews and sending them to Google.

The problem with drawing a clear line between the right and wrong way to handle reviews, as far as Google goes, lies in their constant changing of the guidelines regarding reviews.

Things to remember are: try to get reviews on plenty of sites, while surveying your customers and never get too aggressive. Slow and steady wins the race.

Q: On many local searches people are now getting carouseled away from generic searches toward branded searches before clicking through, and then there is keyword(not provided) on top of that. What are some of the more cost efficient ways a small business can track & improve their ranking performance when so much of the performance data is hidden/disconnected?

A: Are you referring to ranking in Maps or organic part of the results? I’m asking because Google doesn’t blend anymore.

Q: I meant organic search

A: OK. My advice has always been to not obsess over rankings, but over customer acquisition numbers, leads, lifetime customer value etc.

However, rankings are objectively a very important piece of the puzzle. Here are my suggestions when it comes to more cost efficient ways to track and improve ranking performance:

  • When it comes to tracking, I’d use Advanced Web Ranking (AWR) or Authority Labs, both of which are not very expensive.
  • Improving ranking performance is another story. Local websites should be optimized based on the same principles that would work for any site (copy should be written for conversion, pages should be focused on narrow topics, titles should be written for clickthrough rates etc).
  • On the link building side of things, I’d suggest taking care of data aggregators first as a very impactful, yet cost effective strategy. Then, I would go after vertical platforms that link directly to a website, that have profiles chockfull of structured data. I would also make sure to join relevant industry and business associations, and generally go after links that only a real local business can get – or that come as a result of broader marketing initiatives. For example, one can organize events in the offline world that can result in links and citations, effectively increasing their search visibility without spending too much.

Q: If you are a local locksmith, how do you rise above the spam which people have publicly complained about for at least 5 years straight now?

A: If I were a local locksmith, I would seriously consider moving my operations close to the centroid of my town/city. I would also make sure my business data across the web is highly consistent.

In addition, I would make sure to facilitate getting reviews on many platforms. If that weren’t enough (as it often isn’t in many markets), I would be public about Google’s inability to handle locksmith spam in my town, using their forums and any other medium.

Q: In many cities do you feel the potential ROI would be high enough to justify paying for downtown real estate then? Or would you suggest having a mailing related address or such?

A: The ROI of getting a legitimate downtown address would greatly depend on customer lifetime value. For example, if I were a personal injury attorney in a major city, I would definitely consider opening a small office near a center of my city/town.

Another thing to consider would be the search radius/location sensitivity. If the location sensitivity for a set of keywords is high, I would be more inclined to invest in a downtown office.

I wouldn’t advocate PO boxes or virtual offices, since Google is getting more aggressive about weeding those out.

Q: Google recently started supporting microformats for things like hours of operation, phone numbers, and menus. How important is it for local businesses to use these sorts of features?

A: It is not a crucial ranking factor, and is unlikely to be any time in the near future. However, Google tends to reward businesses that embrace their new features – at least in local search. I would definitely recommend embracing microformats in local search.
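
As a rough illustration only (the business details below are invented, and Google’s preferred markup has shifted over time, so check the current structured data documentation), opening hours, phone number and menu can be marked up with schema.org vocabulary along these lines:

    <!-- Illustrative schema.org microdata for a local business listing -->
    <div itemscope itemtype="http://schema.org/Restaurant">
      <span itemprop="name">Joe's Pizza</span>
      <span itemprop="telephone">(555) 555-0123</span>
      <a itemprop="menu" href="http://www.example.com/menu">Menu</a>
      <!-- Machine-readable hours live in the datetime attribute -->
      <time itemprop="openingHours" datetime="Mo-Fr 11:00-22:00">
        Monday to Friday, 11am to 10pm
      </time>
    </div>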

Q: As a blogger I’ve noticed an increase in comment spam with NAP (name, address, phone) information in it. Do you see Google eventually penalizing people for that? Is this likely to turn into yet another commonplace form of negative SEO?

A: This is a difficult question. Knowing how Google operates, it’s possible they start penalizing that practice. However, I don’t see that type of spam being particularly effective.

Most blogs cannot do a lot to enhance location prominence. But if that turned into a negative SEO avenue, I would say that Google wouldn’t handle it well (based on their track record).

Q: Last year you wrote a popular guide to local search. What major changes have happened to the ecosystem since then? Would you change any of the advice you gave back then? Or has local search started to become more stable recently?

A: There weren’t huge changes in the local ecosystem. Google has made a lot of progress in transferring accounts to the new dashboard, improving the Bulk upload function. They also changed their UX slightly.

Moz entered the local search space with their Moz Local product.

Q: When doing a local SEO campaign, how much of the workload tends to be upfront stuff versus ongoing maintenance work? For many campaigns is a one-off effort enough to last for a significant period of time? How do you determine the best approach for a client in terms of figuring out the mix of upfront versus maintenance and how long it will take results to show and so on?

A: This largely depends on the objective of the campaign, the market and the budget. There are verticals where local Internet marketing is extremely competitive, and tends to be a constant battle.

Some markets, on the other hand, are easy and can largely be a one-off thing. For example, if you’re a plumber or an electrician in a small town with a service area limited to that town, you really don’t need much maintenance, if any.

However, if you are a roofing company that wants to be a market leader in greater Houston, TX your approach has to be much different.

The upfront work tends to be more intense if the business has NAP inconsistencies, never did any Internet marketing and doesn’t excel at offline marketing.

If you’re a brand offline and know to tie your offline and online marketing efforts, you will have a much easier time getting the most out of the web.

In most smaller markets, the results can be seen in a span of just a few months. More competitive markets, in my experience, require more time and a larger investment.

Q: When does it make sense for a local business to DIY versus hiring help? What tools do you recommend they use if they do it themselves?

A: If a local business owner is in a position where doing local Internet marketing is their highest-value activity, it would make sense to do it themselves.

However, more often than not, this is not the case even for the smallest of businesses. Being successful in local Internet marketing in a small market is not that difficult. But it does come with a learning curve and a cost in time.

Having said that, if the market is not that competitive, taking care of data aggregators, a few major local search platforms and acquisition of a handful of industry links would do the trick.

For data aggregators, one might go directly to them or use a tool such as UBM or Moz Local.

To dig for citations, Whitespark’s citation tool is pretty good and not that expensive.

Q: The WSJ recently published a fairly unflattering article about some of the larger local search firms which primarily manage AdWords for tens of thousands of clients & rely on aggressive outbound marketing to offset high levels of churn. Should a small business consider paid search & local as being separate from one another or part of the same thing? If someone hires help on these fronts, where’s the best place to find responsive help?

A: “Big box” local search companies were always better about client acquisition than performance. It always seemed as if performance wasn’t an integral part of their business model.

However, small businesses cannot take that approach when it comes to performance. Generally speaking, the more the web is connected to the business, the better off a small business is. This means that a local Internet marketing strategy should start with business objectives.

Everyone should ask themselves 2 questions:
1. What’s my lifetime customer value?
2. How much can I afford to spend on acquiring a customer?

Every online marketing endeavor should be judged through this lens. This means greater integration.
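
As a back-of-the-envelope illustration of those two questions, with every number below purely hypothetical:

    // Hypothetical numbers for a local service business.
    var averageSale = 200;       // revenue per job
    var grossMargin = 0.5;       // 50% gross margin
    var jobsPerYear = 2;         // repeat purchases per customer per year
    var yearsRetained = 3;       // how long a typical customer sticks around

    // Lifetime customer value, measured as gross profit over the relationship.
    var lifetimeValue = averageSale * grossMargin * jobsPerYear * yearsRetained;
    console.log(lifetimeValue); // 600

    // If you are willing to spend up to a third of lifetime value to win a customer:
    var maxAcquisitionCost = lifetimeValue / 3;
    console.log(maxAcquisitionCost); // 200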

Q: What are some of the best resources people can use to get the fundamentals of local search & to keep up with the changing search landscape?

A: Luckily for everyone, the local search blogosphere is rich in useful information. I would definitely recommend Mike Blumenthal’s blog, Andrew Shotland’s Local SEO Guide, Linda Buquet’s forum, Nyagoslav Zhekov, Mary Bowling and, of course, the Local U blog.


Vedran Tomic is a member of SEOBook and founder of Local Ants LLC, a local internet marketing agency. Please feel free to use the comments below to ask any local search questions you have, as Vedran will be checking in periodically to answer them over the next couple days.

Categories: 
interviews

SEO Book

Building a Brand Online: The Golden Age of Digital

April 21st, 2014

Posted by willcritchlow

This post is based on a talk I gave at our SearchLove conference in Boston last week. It ties quite closely with the post my colleague Ron Garrett wrote last week: Search Marketers Need to Evolve. You can probably tell we’ve been doing a lot of thinking about this.

When I gave this talk at SearchLove, I hoped that it would put in context why we bring such a range of speakers and topics together at our conferences and to inspire the attendees to go back to their companies and make real changes. I hope this post will do the same for you.

As digital marketers, our focus on analytics has served us well in driving direct, measurable sales. The dominant form of brand marketing, however, has remained offline, with TV taking the lion’s share of the budget and attention. We believe that as TV faces disruptive technology and business models, digital marketers have an opportunity to grow their influence and impact. In total, this is an opportunity worth tens of billions of dollars a year.

I’d very much like for us—our industry—you and me—to be the ones who benefit.

Despite all the growth we have seen in digital marketing spend, I think that we are only just entering what I’m calling the golden age of digital.

Building brands online first

We’re entering the age when the biggest brands in the world will be built online first. I hope to convince you of two things: first, that this change is happening right now. And second, that we are the people to win in this world.

Starting at the beginning

There are some confident statements above, but the last few years have had their share of introspection and crises of confidence. We’ve put a lot of time and energy over the years into understanding the direction marketing is moving and capitalising on the shifts. Duncan and I started out thinking that networked computing was going to be a big deal, and we built our company initially on the back of a simple CMS we created to help small business owners take advantage of the self-publishing revolution.

As we shifted gears to focus more on the dominance of the search channel, we started trying to understand where Google in particular might be taking things.

We’ve written plenty about that over the years, but we were talking about effects similar to Panda, Penguin, and Hummingbird years before they actually came to pass. Panda and Penguin started making our vision come true. We were more effective search marketers than we’d ever been because we’d largely bypassed building the infrastructure for search as it was and tried to build it for how search would be.

And yet something was wrong.

Powerful content was becoming ever more effective. And yet the greatest examples of content that we were seeing at search conferences weren’t built by SEO agencies.

Brands were getting a bigger and bigger advantage in search. And yet the best brand builders weren’t SEO agencies.

For a long time, we’ve talked about how “SEO” isn’t a verb. You don’t “SEO a website.” Ranking well is an outcome, not an activity. It’s like fame. “Famous” isn’t a verb. You don’t “famous someone.” You get famous for doing other things (playing sport, performing music, appearing on TV). SEO is the same.

But what if we weren’t the right people to do those things for people? What if we weren’t the world’s best PR firm, branding agency, or creative producers?

Don’t worry. I got over my insecurity. I believe the capabilities that we have been building are going to grow in power and influence. Here’s how:

The Innovator’s Dilemma

It was Mark Suster who kick-started my confidence with his talk in San Diego. He’s an entrepreneur-turned-investor. He’s smart and opinionated.

He talked about Maker Studios at our conference. You might have heard a few weeks ago that Maker Studios sold to Disney for half a billion dollars.

Maker is a producer and distributor of online video. The turning point for me was in realising that the forces they were betting on were also rampaging towards our quirky, exciting, geeky little corner of the marketing world.

There’s a book called The Innovator’s Dilemma by a Harvard Business School professor named Clayton Christensen. It’s a little dry, but if you’re interested in business theory and technology, it’s an absolute no-brainer: You should read it.

It describes two kinds of innovations that hit established markets. So-called “sustaining innovations” make existing processes faster, cheaper, or better. They can be very dramatic, but Professor Christensen’s research shows that they almost always end up benefiting the incumbent players in the market.

In contrast to “sustaining innovations” stand “disruptive innovations,” which are those that attack problems an entirely different way. They typically don’t work as well as the existing solutions, perhaps solving only part of the problem, but have a structurally different cost. So they’re “cheaper but worse.”

Cheaper but worse

Doesn’t sound too compelling, does it?

That’s what the incumbents think. They may spot a potential opportunity, and may even pay lip service to the idea that they should be pursuing it, but ultimately, their economic incentives are skewed towards maintenance of the status quo.

Therein lies the dilemma.

There’s often a subset of the market, for whom the new service is “good enough.” It may not be gold-plated, but it solves their immediate needs and they can afford it. As they invest, it gets better and better, capturing more and more of the market opportunity until it’s meeting the core needs of even the top end of the market while still being structurally cheaper. Money cascades to the new entrant and leaves the incumbents high and dry.

Let’s go back to the “cheaper but worse” innovation for a second. To me, that sounds an awful lot like the idea of building a brand online. Let’s look at the details:

  • The established way of building a brand for a generation has been via mass-market TV advertising and other classic above-the-line spend. Spends of $100m+ are not uncommon.
  • Building a brand online is cheaper, yes, but right now, not as effective.
  • The incumbent brand-builders pay lip service to digital, but when you look at their corporate structure, their fee structures, and their economic incentives, you realise that they’d far rather see TV get bigger than have to do all this messy web marketing.

So I think there is a disruption coming to brand marketing, and I don’t think it’s going to benefit the big guys.

Online first

I’m calling this whole phenomenon “online first”: the biggest brands of tomorrow will be built online. This will be partly because the tools we have available to build brands online are going to get better and better, and partly because money is going to flow to digital from TV. I recently wrote about this in more detail in our Future of TV report:

I am definitely not saying that TV itself is in trouble. We live in an amazing time for TV content. You just have to look at shows like Breaking Bad, Walking Dead, and True Detective to see that we have exceptional content and more ways of accessing that content than ever before—and that’s before we even get to Netflix and House of Cards. In part as a result of this resurgence, the total time spent watching video has increased every year recently.

Our devices are also getting better and better. The cost of big screens is coming down; we now have full HD on our mobile devices.

But the way we get our content is changing. 80% of US households have some form of internet-connected device paired with their TV according to gigaom research. Whether it’s an Apple TV, Roku, Xbox, Chromecast or something else, we can increasingly watch anything we like on the big screen. And conversely, we can watch more and more of our “classic TV” content on smartphones, tablets, laptops and any other screen we can lay our hands on.

This particular part of the trend has been analysed to death. I’m not interested in that for the purposes of this analysis. I’m interested in the fragmenting viewership: In general, we’re no longer all watching the same thing at the same time. That has profound impacts on the way TV advertising is bought and sold.

The innovator’s dilemma predicts that the cost per unit of the high end of the old market will continue to rise even as the bottom starts to fall away. It’s becoming ever more valuable to reach consumers on those rare occasions when we do all sit down at the same time to watch the same content.

The complexity of time-shifted, internet-delivered content rapidly surpasses human optimisation ability. The upfront media market, in which Oprah stands on stage, extols her show and the network, and seeks tens or hundreds of millions of dollars of spend, is a process that can’t survive the move to digital-scale complexity (if you’re interested in this, I wrote an introduction to TV advertising a few weeks ago).

Advertising against TV-like content will have to be bought more like AdWords. It will have to become real-time (depending on who’s actually watching at a given moment), and it will have to be market-priced in one form or another (because you can’t negotiate all these things individually on the fly).

I’m in danger of getting dragged into deep economic arguments, but the effect of all this disruption is going to be a whole load of unbundling and a reallocation of budgets.

Of course, in part, this will open up opportunities in video marketing—both in brand-funded TV-like content and in video advertising against internet-delivered video (check out the talk by Chris from Wistia [PDF]). I don’t think it’s a given that the incumbent TV advertisers will dominate that space. It’s structurally pretty different. We are certainly betting in this area—between Phil and Margarita, we’re already doing video strategy and execution for ourselves and our clients.

It’s not all about video, though.

How our industry competes

There are three broad areas that we all need to get great at to take advantage of this opportunity. Video fits into the first of these, which is technical creativity—that place where technology and storytelling meet:

1. Technical creativity

I’ve been endlessly frustrated over the years by the creative storytellers who misunderstand (or don’t even care about) technology. The stupid apps that no one uses. The branded social networks that nobody joins. The above-the-line campaigns telling you to search for phrases they don’t rank for.

Old-school SEOs can spot crawl issues or indexing problems in their sleep. We’ve had to get good at things like analytics, UX, and conversion. Indeed, one of the most popular talks last week (and Slideshare of the day) was from Aaron Weyenberg at TED, and was all about UX. The things that stood out to me the most were all about the different ways they listened to their audience and gathered feedback at different stages of the process. This incorporated everything from standard hallway tests through qualitative and quantitative surveying to a really nicely executed beta. You can see the full deck here:

The Story of a Redesign

And we mustn’t lose sight of the value of that technical knowledge. Screw up a migration and you’re just as hosed as you’ve ever been.

For me personally, the creative is the more challenging part—but luckily it’s not all about me. We’ve been investing in creative for a while, and I loved the presentation our head of creative, Mark Johnstone, gave last week entitled How to Produce Better Content Ideas. It really clarified my thinking in a few areas—particularly about the effort and research that should go in early in the process in order to give the “lightbulb moment” a chance. By coupling that with examples of deconstructing other people’s creative (and showing us / giving us further reading on how to practice ourselves) he made a compelling argument that we can all do this so much better—and that not only designers can be “creative.” I’m also looking forward to trying out the immersion techniques he talks about for getting from unstructured to structured. You can check out the full deck here:

How to Produce Better Content Ideas

[If you'd like to see more of the decks from Boston, you can currently get them here and in the next few weeks the videos will be available within DistilledU]

2. Broad promotional ability

The second capability we need after technical creativity is a broad promotional ability. This is your classic owned, earned and paid media.

As search marketers, we’ve typically focused primarily on the earned side of this—via outreach and digital PR—and my colleague Rob Toledo gave a great presentation about some of the cleverer forms of earned media in his presentation The Hunter/Gatherer. He talked in detail about ways of reaching that tricky kind of influencer—the one who wants to discover their own interesting share-worthy material. It was a funny presentation that contained some exceptional tactics. You can see the full deck here:

The Hunter/Gatherer

I think paid media is going to have an ever-increasing part to play in online brand building though. Pay Per Click is typically measured on direct response metrics—sending traffic to landing pages and converting them—but social and video advertising is on the rise. We increasingly spend money on promoting content instead of promoting landing pages. I expect that trend to continue.

The eagle-eyed among you might have noticed that this isn’t inbound. I make no apology for that.

3. Influence and measurement throughout the Customer Lifecycle

Finally, alongside our technical creativity and promotional ability, we need to double down on our ability to influence and measure customer behaviour throughout the customer lifecycle.

We’ve all heard (or even been) the search experts who stand on stage and talk about the measurability of digital. Sometimes they go further and make off-hand comments about how you “can’t measure TV.”

Does anyone really believe that? Anyone think Procter & Gamble or Unilever really waste half the money they spend?

One of the most mind-blowing talks I ever attended was at ad:tech a few years ago—it was a speaker from Ogilvy talking about the econometric models they use to measure their work for P&G. It was all about how they were tying together the influence of point-of-sale, coupon codes, TV, and other above-the-line advertising to understand what’s making them money. They are good at it, but it’s expensive. Our industry’s stuff is cheap in comparison. It’s not yet good enough, but if we work hard and invest, it can be.

What I didn’t say

Remember: I didn’t say TV is dead. I didn’t say search is dead. I said that our crazy blend of technical creativity, promotional chops and measurement skills is going to be the skillset that builds tomorrow’s biggest brands. And, crucially for the topic near and dear to much of the Moz audience’s hearts, it’s also going to be how you rank in Google.

Advertising is a half-trillion dollar a year industry struggling to understand its place in a digital world. I don’t want the same old guys to win on our turf. The internet is our domain. Let’s go get great at this.



Moz Blog

SearchCap: New Google Penalty, Action Schema & Subscribe To Google Trends

April 20th, 2014

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. From Search Engine Land: Google Slaps Another Guest Blog Network: PostJoint Google’s Matt Cutts somewhat confirmed on Twitter that Google has taken action on another guest blogging…



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Download Google Camera App to Defocus Background

April 19th, 2014

Lens Blur

Want to blur and defocus the background in photos? Download Google Camera, a new standalone Android app that enhances the power of your Android smartphone. This smart background-blur photo technique is a craze and it seems everyone wants to use it; Google Camera just made it possible for everyone. Read the rest of this entry »

6 Changes We Always Thought Google Would Make to SEO that They Haven’t Yet – Whiteboard Friday

April 18th, 2014

Posted by randfish

From Google’s interpretation of rel=”canonical” to the specificity of anchor text within a link, there are several areas where we thought Google would make a move and are still waiting for it to happen. In today’s Whiteboard Friday, Rand details six of those areas. Let us know where you think things are going in the comments!

For reference, here’s a still of this week’s whiteboard!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. Today, I’m going to tackle a subject around some of these changes that a lot of us in the marketing and SEO fields thought Google would be making, but weirdly they haven’t.

This comes up because I talk to a lot of people in the industry. You know, I’ve been on the road the last few weeks at a number of conferences – Boston for SearchLove and SMX Munich, both of which were great events – and I’m going to be heading to a bunch more soon. People have this idea that Google must be doing these things, must have made these advancements over the years. It turns out, in actuality, they haven’t made them. Some of them, there are probably really good reasons behind it, and some of them it might just be because they’re really hard to do.

But let’s talk through a few of these, and in the comments we can get into some discussion about whether, when, or if they might be doing some of these.

So number one, a lot of people in the SEO field, and even outside the field, think that it must be the case that if links really matter for SEO, then on-topic links matter more than off-topic links. So, for example, say two links point at websites A and B, both about gardening resources, and one of those links comes from a botany site while the other comes from a site about mobile gaming. Well, all other things being equal, it must be that the link from the botany site is going to be stronger. That’s just got to be the case.

And yet, we cannot seem to prove this. There doesn’t seem to be data to support it. Anyone who’s analyzed this problem in depth, which a number of SEOs have over the years — a lot of very advanced people have gone through the process of classifying links and all this kind of stuff — seems to come to the same conclusion, which is that Google seems to think about links from a more subject- and context-agnostic perspective.

I think this might be one of those times where they have the technology to do it. They just don’t want to. My guess is what they’ve found is if they bias to these sorts of things, they get a very insular view on what’s kind of popular and important on the Web, and if they have this more broad view, they can actually get better results. It turns out that maybe it is the case that the gardening resources site that botanists love is not the one with mass appeal, is not the one that everyone is going to find useful and valuable, and isn’t representing the entirety of what the Web thinks about who should be ranking for gardening resources. So they’ve kind of biased against this.

That is my guess. But from every observable input we’ve been able to run, every test I’ve ever seen from anybody else, it seems to be the case that if there’s any bias, it’s extremely slight, almost unnoticeable. Fascinating.

Number two, I’m actually in this camp. I still think that someday it’s coming, that anchor text influence will eventually decline. Yet it seems that, yes, while other signals have certainly risen in importance, specific anchor text inside a link is still far more powerful than generic anchor text.

Getting specific, targeting something like “gardening supplies” when I link to A, as opposed to on the same page saying something like, “Oh, this is also a good resource for gardening supplies,” but all I linked with was the text “a good resource” over to B, that A is going to get a lot more ranking power. Again, all other things being equal, A will rank much higher than B, because this anchor text is still pretty influential. It has a fairly substantive effect.
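
In markup terms, the difference is simply the visible link text, something like this (placeholder URLs):

    <!-- Specific, keyword-rich anchor text pointing at page A -->
    <a href="http://example.com/page-a">gardening supplies</a>

    <!-- Generic anchor text pointing at page B -->
    This is also <a href="http://example.com/page-b">a good resource</a> for gardening supplies.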

I think this is one of those cases where a lot of SEOs said, “Hey, anchor text is where a lot of manipulation and abuse is happening. It’s where a lot of Web spam happens. Clearly Google’s going to take some action against this.”

My guess, again, is that they’ve seen that the results just aren’t as good without it. This speaks to the power of being able to generate good anchor text. A lot of that, especially when you’re doing content marketing kinds of things for SEO, depends on nomenclature, naming, and branding practices. It’s really about what you call things and what you can get the community and your world to call things. Hummingbird has made advancements in how Google does a lot of this text recognition, but for these tough phrases, anchor text is still strong.

Number three, 302s. So 302s have been one of these sort of long-standing kind of messes of the Web, where a 302 was originally intended as a temporary redirect, but many, many websites and types of servers default to 302s for all kinds of pages that are moving.

So A 301-redirects to B, versus C 302-redirecting to D. Is it really the case that the people who run C plan to change where the redirect points in the future, and is it really the case that they do so more often than A does with B?

Well, a lot of the time, probably not. But it still is the case, and you can see plenty of examples of this happening out in the search results and out on the Web, that Google interprets this 301 as being a permanent redirect. All the link juice from A is going to pass right over to B.

With C and D, it appears that with big brands, when the redirect’s been in place for a long time and Google has some trust in it, and maybe they see some other signals, some other links pointing over here, then yes, some of this does pass over, but it is not nearly what happens with a 301. A 301 is a directive, while a 302 is sort of a nudge or a hint. It just seems to be important to still get those 301s, those right kinds of redirects, right.

By the way, there are also a lot of other kinds of 30X status codes that can be issued on the Web and that servers might fire. So be careful. You see a 305, a 307, 309, something weird, you probably want a 301 if you’re trying to do a permanent redirect. So be cautious of that.
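
For reference, the status code is set by the server in its response; a bare-bones Node.js sketch (not from the original post) of the difference:

    // Bare-bones Node.js server showing a permanent vs. a temporary redirect.
    var http = require('http');

    http.createServer(function (req, res) {
      if (req.url === '/old-page') {
        // 301: permanent -- "this page has moved for good," so link equity should follow.
        res.writeHead(301, { Location: '/new-page' });
        res.end();
      } else if (req.url === '/temporary-offer') {
        // 302: temporary -- a hint that the original URL may come back.
        res.writeHead(302, { Location: '/current-offer' });
        res.end();
      } else {
        res.writeHead(200, { 'Content-Type': 'text/plain' });
        res.end('Hello');
      }
    }).listen(8080);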

Number four: Speaking of nudges and hints versus directives, rel=”canonical” has been an interesting one. When rel=”canonical” first launched, what Google said about it was that rel=”canonical” is a hint to us, but we won’t necessarily take it as gospel.

Yet every test we saw, even from those early launch days, showed that, man, they are taking it as gospel. Accidentally throw a rel=”canonical” on every page of a trusted site and point it back to the homepage, and Google suddenly doesn’t index anything but the homepage. It’s crazy.

You know what? The tests that we’ve seen run and mistakes — oftentimes, sadly, it’s mistakes that are our examples here — that have been made around rel=”canonical” have shown us that Google still has this pretty harsh interpretation that a rel=”canonical” means that the page at A is now at B, and they’re not looking tremendously at whether the content here is super similar. Sometimes they are, especially for manipulative kinds of things. But you’ve got to be careful, when you’re implementing rel=”canonical”, that you’re doing it properly, because you can de-index a lot of pages accidentally.

So this is an area of caution. It seems like Google still has not progressed on this front, and they’re taking that as a pretty basic directive.
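
For reference, the tag itself is a single line in the head of the duplicate page, pointing at the version you want indexed (placeholder URLs):

    <!-- On http://example.com/product?sessionid=123 -->
    <link rel="canonical" href="http://example.com/product">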

Number five, I think, for a long time, a lot of us have thought, hey, the social web is rising. Social is where a lot of the great content is being shared, a lot of where people are pointing to important things, and where endorsements are happening, more so, potentially, than the link graph. It’s sort of the common man’s link graph has become the social web and the social graph.

And yet, with the exception of the two years when Google had a very direct partnership with Twitter (when tweets, indexation, and all that kind of stuff were heavily influential for Google search results), we haven’t seen that again from Google since that partnership broke up. They’ve actually sort of backtracked on social, and they’ve kind of said, “Hey, you know, tweets, Facebook shares, likes, that kind of stuff, it doesn’t directly impact rankings for everyone.”

Google+ being sort of an exception, especially in the personalized results. But even the tests we’ve done with Google+ for non-personalized results have appeared to do nothing, as yet.

So these shares that are happening all over social, I think what’s really happening here is that Google is taking a look and saying, “Hey, yes, lots of social sharing is going on.” But the good social sharing, the stuff that sticks around, the stuff that people really feel is important is still, later on at some point, earning a citation, earning a link, a mention, something that they can truly interpret and use in their ranking algorithm.

So they're relying on the fact that social can be a tip-off or a tipping point for a piece of content or a website or a brand or a product, whatever it is, to achieve some popularity, but that popularity will eventually be reflected in the link graph, and they can wait until that happens rather than using social signals directly. To be fair, there's some potential manipulation they're worried about exposing themselves to. There's also, of course, the fact that they don't have direct access, or at least they don't have API-level access and partnerships with Facebook and Twitter anymore, and that could be causing some of this too.

Number six, last one. I think a lot of us felt like Google talked about cleaning up web spam for a long time, but from '06, '07 to about 2011, 2012, it was pretty sketchy. It was tough.

When they did start cleaning up web spam, I think a lot of us thought, ”Well, eventually they’re going to get to PPC too.” I don’t mean pay-per-click. I mean porn, pills, and casino.

But it turns out, as Matt Brown from Moz wisely and recently pointed out in his SearchLove presentation in Boston, that, yes, if you look at the search results around these categories, whatever it is — Buy Cialis online, Texas hold-'em no limit poker, removed for content, because Whiteboard Friday is family-friendly, folks — whatever the search is that you're performing in these spheres, this is actually kind of the early warning SERPs of the SEO world.

You can see a lot of the changes that Google's making around spam and authority and signal interpretation. One of the most interesting ones that you've probably observed, if you study this space, is that a lot of those hacked .edu pages, or the barnacle SEO that was happening on sub-domains of more trusted sites that had gotten a bunch of links, that kind of stuff is ending a little bit. We're seeing a little bit more of the rise, again, of the exact match domains and some of the affiliate sites, and of getting links from more creative places. It does seem like Google's gotten quite a bit better at which links they consider and at how they judge the authoritativeness of pages that might be hanging on or clinging onto a domain but aren't well linked to internally on some of those more trusted sites.

So, that said, I’m looking forward to some fascinating comments. I’m sure we’re going to have some great discussions around these. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Better Ways to Embed Tables and Spreadsheets in Web Pages

April 17th, 2014

It is easy to embed tabular data in web pages. You can either use the standard <table> HTML tag or you can input the tabular data in a spreadsheet — like Excel Online or Google Spreadsheets — and embed the sheet in your web pages.

HTML tables are easy to write, while spreadsheet-based tables allow better formatting and complex layouts – like nested tables within a table – without fiddling with the code. Here are the different ways to embed tables in your website, along with their pros and cons.

How to create an HTML table

If you have access to a WYSIWYG editor like Dreamweaver, you can easily create an HTML table using the built-in wizards, but I prefer using Markdown for creating tables as it requires no tags. Go to gist.github.com (you don't even need an account here) and enter the table in the following format:

Column A | Column B
-------- | -------
Cell A1 | Cell B1
Cell A2 | Cell B2

Columns are separated by a pipe (|) while the row of hyphens (-) separates the table headings from the data rows. Name the gist table.md (the .md extension indicates Markdown) and click the “Create Secret Gist” button to render the markdown as a table.

Once you save the gist, GitHub will show you the rendered table, which you can copy-paste into any rich-text editor like the Gmail compose window. Alternatively, you can right-click the table on GitHub and choose Inspect Element to view the actual HTML tags for that table.

excel to html

Tableizer is another simple tool for converting spreadsheet data into HTML table code. Create a table inside Excel or the Numbers app on your desktop, copy the cells and paste them into Tableizer. It will generate the HTML code that can be used on your blog or website.

Embed Google Sheets in your Website

A popular option for embedding tabular data in a web page is through Google Docs (Spreadsheets). The advantage of this approach is that you can modify the data in the spreadsheet and the embedded table will update itself to reflect the edits. There's no need to edit the web page containing the table.

Go to spreadsheets.google.com, enter some data in the sheet and then choose the Publish to the web option from the File menu. Choose Start Publishing and Google Drive will offer you the IFRAME embed code for that particular sheet.

The embedded sheet – see live version – will preserve the original formatting of the cells but it will still be a static HTML document – there’s no option for sorting or filtering data in the HTML table.
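
For reference, the snippet Google generates is a simple IFRAME along these lines (the sheet ID below is a placeholder; copy the exact code from the Publish to the web dialog since it contains your sheet's unique key):

<iframe src="https://docs.google.com/spreadsheets/d/YOUR_SHEET_ID/pubhtml?widget=true&amp;headers=false" width="600" height="400" frameborder="0"></iframe>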

Embed Excel Sheets in Web Pages

This is my favorite method for embedding spreadsheet data in a web page and I'll soon explain why.

Go to office.live.com and create a new blank workbook. Enter the tabular data inside the Excel sheet and then choose File -> Share -> Embed -> Generate HTML.

Excel, unlike Google Docs, allows you to embed a select range of cells and not the entire spreadsheet. You can also include a download link in the embedded cells, making it easier for your website visitors to download and open the table in their local spreadsheet app. The embedded spreadsheet also offers better copy-paste than Google Docs.

Here’s a live version of an HTML table embedded using the Excel web app.
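
The generated Excel snippet is also an IFRAME, roughly of the form sketched below. The cid, resid and authkey values are placeholders and the exact parameters may vary, so always copy the code from the Embed dialog because it encodes your selected cell range and display options:

<iframe width="500" height="350" frameborder="0" scrolling="no"
        src="https://onedrive.live.com/embed?cid=YOUR_CID&amp;resid=YOUR_RESID&amp;authkey=YOUR_AUTHKEY&amp;em=2">
</iframe>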

Related: Capture Web Tables into Excel

Make Static HTML Tables Interactive

If you wish to go with static HTML tables, instead of interactive spreadsheet based tables, you can consider adding the Excel button that will make your HTML tables interactive.

You have the regular HTML code for your <table> and all you have to do is add another HTML tag to your web page that will turn the embedded static table into an interactive sheet – see this live version.

<a href="#" name="MicrosoftExcelButton"></a>

<table>
  <thead><tr>
    <th>Column A</th>
    <th>Column B</th>
    </tr></thead>
  <tbody>
    <tr>
      <td>Cell A1</td>
      <td>Cell B1</td>
    </tr>
    <tr>
      <td>Cell A2</td>
      <td>Cell B2</td>
    </tr>
  </tbody>
</table>

<script type="text/javascript" src="http://r.office.microsoft.com/r/rlidExcelButton?v=1&kip=1"></script>

This code will add a little Excel button next to your HTML table and when someone clicks that button, it creates a beautiful and interactive view of the table with support for sorting and filtering. You can even visualize the HTML table as graphs without leaving the page.

HTML Tables or Spreadsheets?

The advantage of static HTML tables is that they are SEO friendly (search engines can read your HTML table) while spreadsheet-based tables are not. The latter, however, allow better formatting options and are relatively easy to update.

If you wish to have the best of both worlds, go with an HTML table and use the Excel interactive view that will let viewers interact with the table on demand.

Related Guide: How to Embed Anything in a Website


This story, Better Ways to Embed Tables and Spreadsheets in Web Pages, was originally published at Digital Inspiration on 16/04/2014 under Embed, Microsoft Excel, Software
Digital Inspiration Technology Blog

Google’s Effective ‘White Hat’ Marketing Case Study

April 16th, 2014

There's the safe way & the high-risk approach. The shortcut takers & those who win through hard work & a superior offering.

One is white hat and the other is black hat.

With the increasing search ecosystem instability over the past couple of years, some see these labels constantly sliding, sometimes on an ex post facto basis, turning thousands of white hats into black hats arbitrarily overnight.

Are you a white hat SEO? Or a black hat SEO?

Do you even know?

Before you answer, please have a quick read of this Washington Post article highlighting how Google manipulated & undermined the US political system.

.
.
.
.
.
.
.

Seriously, go read it now.

It’s fantastic journalism & an important read for anyone who considers themselves an SEO.

.
.
.
.
.
.
.
.

######

Take the offline analogs of Google's search “quality” guidelines & in spirit Google repeatedly violated every single one of them.

Advertorials

Creating links that weren't editorially placed or vouched for by the site's owner on a page, otherwise known as unnatural links, can be considered a violation of our guidelines… Advertorials or native advertising where payment is received for articles that include links that pass PageRank.

Advertorials are spam, except when they are not: “the staff and professors at GMU's law center were in regular contact with Google executives, who supplied them with the company's arguments against antitrust action and helped them get favorable op-ed pieces published.”

Deception

Don’t deceive your users.

Ads should be clearly labeled, except when they are not: “GMU officials later told Dellarocas they were planning to have him participate from the audience,” which is just like an infomercial that must be labeled as an advertisement!

Preventing Money from Manipulating Editorial

Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google’s AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.

Money influencing outcomes is wrong, except when it’s not: “Google’s lobbying corps — now numbering more than 100 — is split equally, like its campaign donations, among Democrats and Republicans. … Google became the second-largest corporate spender on lobbying in the United States in 2012.”

Content Quality

The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.

Payment should be disclosed, except when it shouldn’t: “The school and Google staffers worked to organize a second academic conference focused on search. This time, however, Google’s involvement was not publicly disclosed.”

Cloaking

Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.

Cloaking is evil, except when it's not: Even as Google executives peppered the GMU staff with suggestions of speakers and guests to invite to the event, the company asked the school not to broadcast its involvement. “We will certainly limit who we announce publicly from Google.”

…and on and on and on…

It’s not safe to assume that just because a specific deceptive technique isn’t included on this page, Google approves of it.

And while they may not approve of something, that doesn’t mean they avoid the strategy when mapping out their own approach.

There’s a lesson & it isn’t a particularly subtle one.

Free markets aren’t free. Who could have known?

Categories: 
google

SEO Book

Prime Reasons Why SEO Plays a Role in Enhanced Web Traffic

It has recently been found that the strategies adopted by SEO Birmingham companies have become critically significant to the success of small businesses. If your online business is not yet making use of SEO in Birmingham, this write-up discusses the prime reasons why you should opt for the local strategies these companies use.

As per recent findings, more than 39% of internet users experience problems while searching for local businesses on the World Wide Web. People know that a local business exists, but struggle to locate information about it on the web. The major reason for this inconvenience is that such businesses fail to understand the relevance of local SEO. So if your business provides products and services over the web nationally or internationally, search engine optimization can be of great help in making the brand visible in the search engine results.

Nowadays, more and more people rely on the Internet for finding local businesses

There was a time when local businesses did not have to worry about SEO; word of mouth was more than enough to spread their existence to local consumers. But today, as per the statistics, more than 84% of people use the Internet to locate local businesses. No more telephone directories; people rely on search engines.

Thus, it becomes essential to move from the traditional marketing strategy to the modern strategy of online marketing.

Local SEO in Birmingham costs a little less

If you are considering an AdWords strategy to capture the online marketing domain, you must be aware that the popularity of a chosen keyword is directly proportional to the fee paid for it. Selecting local keywords means less keyword competition, which means you need not pay extra.

Helps you reap the benefits of advanced Google features

SEO companies in Birmingham, such as SEO Results4u (Avon House, 435 Stratford Road, Shirley, Solihull, West Midlands, B90 4AA; 0121 746 3121), also contribute to the SEO landscape of their local area, be it Solihull, Birmingham or even the wider West Midlands area.

People are often unaware that Google+ has changed the traditional way the Internet is used. If the keywords you choose are relevant to your domain and the local market, you unlock enhanced services offered by Google:

  • A map showing the physical location of your business
  • Appealing pictures of the business
  • Reviews posted by users

Truth be told, without local SEO, Google+ fails to recognize your business, which means a lack of authentic information about it on the web.

Local SEO builds better credibility

People trust Yahoo, Bing and Google with their eyes closed and believe these search engines have remedies for each and every query. It is a well-accepted notion that the brands appearing at the top of the search results are the most sought-after and authentic service providers. So if you want people to believe in your brand, search engine optimization adds credibility to your brand power. Local SEO adds credibility as well as a definite increase in web traffic.

Your peers are using it

Business is all about competition. Your peers are using it and reaping the benefits, so why aren't you?