Monday, 5 December 2016

5 reasons advertisers should NOT ignore Bing Ads

Gone are the days when Bing took years to catch up to the new and innovative applications Google had launched within AdWords. In recent months, Bing Ads has been able to quickly implement comparable tools and features — indeed, they even have some unique features that AdWords does not offer.
Unfortunately, Bing is still often seen as the secondary choice when launching a paid search campaign. Marketers underestimate the true value of this ad platform and its unique audience. I have had great success with Bing campaigns; in some instances, the results far surpass outcomes obtained via the highly competitive Google platform.
The last couple of years have been exciting for Bing Ads, with the implementation of many new tools and features. Here are five of my favorite enhancements.

1. The Bing Ads app

I recently launched a new Bing advertising account and was impressed to see the availability of a Bing Ads app, which launched for iOS in April 2015. (The Android version was released later that year.)
Having an app available on mobile devices makes managing a paid search campaign on the go a breeze. This is a great feature for account managers who are often away from the office but still need to access their accounts to ensure proper management of campaigns.

2. Expanded ad text

On the heels of Google’s expanded text ads, Bing announced on October 25, 2016, that after a short pilot (conducted in August and only open to select advertisers), all advertisers worldwide are now able to implement expanded text ads within their Bing campaigns.
According to Bing, the expanded ads will work seamlessly across all devices, allowing advertisers the freedom to write more effective ad copy with compelling calls to action, thus driving more conversions.
Bing has also made it extremely easy for advertisers to incorporate expanded text ads into their advertising program with several implementation options available.
A standard ad (left) vs. an expanded text ad (right) in Bing
Many marketers, myself included, are taking the lessons learned through Google's expanded text ad rollout and applying those best practices to their Bing Ads campaigns.
Bing recommends that advertisers continue to run standard ads in parallel with new expanded ads, so as not to jeopardize overall performance of campaigns. In that way, marketers can test expanded ad performance against standard ads to gain valuable insights for future campaign optimization.
Google recently revised the date that standard ads will no longer be accepted (to January 31, 2017). In a similar move, Bing has indicated that it will continue to allow advertisers to create and edit standard ads through the first quarter of 2017.
(For a complete walkthrough, visit Bing Ads Help and learn more about what makes an effective expanded text ad.)

3. Shared budgets

In October 2016, Bing gave marketers the convenience and flexibility of a “shared budget.” This means that advertisers have the option of having multiple campaigns running under a single budget. Keep in mind that with a shared budget, Bing automatically adjusts how your budget is spent across all campaigns to help improve ROI.
Shared budgets eliminate the time spent setting up and calculating individual campaign budgets. This feature might be of benefit to you if:
  • you would like to manage a single budget that can be used by all campaigns.
  • you would like to reduce the time spent manually calculating individual budget allocations among a large number of campaigns.
  • you want Bing Ads to reallocate unutilized budget to bolster campaigns that are performing well.

4. Expanded Device Targeting

In November, Bing opened up Expanded Device Targeting globally, giving advertisers the ability to adjust bids for various device types, and expanding the range for bid adjustments as follows:
  • Desktop: 0% to +900%
  • Tablets: -100% to +900%
  • Smartphones: -100% to +900%
Bid adjustments by device type can be combined with Bing's other targeting criteria, including:
  • geographic location
  • day of week or time of day
  • age and gender
Targeting can be implemented at the campaign level or the ad group level. Keep in mind that ad group targeting takes precedence over campaign targeting.
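To make the percentages above concrete, here is a small illustration (my own sketch, not Bing's code) of how a percentage bid adjustment changes an effective bid:

```python
def adjusted_bid(base_bid, adjustment_pct):
    """Apply a device bid adjustment given as a percentage (e.g. +900 or -100)."""
    return round(base_bid * (1 + adjustment_pct / 100), 2)

# A $1.00 base bid at the extremes of Bing's expanded ranges:
print(adjusted_bid(1.00, 900))   # +900% -> 10.0
print(adjusted_bid(1.00, -100))  # -100% -> 0.0, effectively opting the device out
print(adjusted_bid(1.00, 25))    # a more typical +25% -> 1.25
```

In other words, a -100% adjustment removes a device type from bidding entirely, which is why the new tablet and smartphone ranges matter.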

5. Bing Partner Program

Also in November, Bing announced that it was expanding the Bing Partner Program to agencies, SMB partners and technology partners globally.
According to Bing, the Partner Program provides recognition and a deeper level of engagement for valued advertisers. Partners will receive special opportunities, brand association with Bing/Microsoft, access to valuable marketing content, technical and sales training and other benefits.

Utilize Bing Ads for a competitive advantage

Digital marketers should not dismiss Bing as a viable ad channel. In my experience, Bing Ads are an effective driver of brand visibility, leads and sales. I have several accounts where Bing significantly outperforms Google in terms of cost-effective, high-quality results. (Success will, of course, depend on your industry, audience and goals.)
I encourage digital marketers to look beyond the popular — and very competitive — Google ad platform. Test Bing Ads.

Google’s 2016 Santa Tracker signals the official countdown to Christmas

The countdown to Christmas Day has officially begun on Google with the launch of its annual Santa Tracker yesterday.
In addition to the countdown clock, Google also opened its Santa Village, offering a site for parents and educators with activities for K-12 students, a Santa Tracker app for Android and a Santa Tracker Chrome extension (in case you want to keep track of Santa’s “precise” location at all times).
For 12 years now, Google’s Santa Tracker has counted down the seconds until Santa Claus takes flight from the North Pole on December 24.
Santa’s dashboard — featuring the latest and greatest in Google Maps technology and sleigh engineering — will allow you to follow his progress around the world, and also learn a little about some of his stops along the way.
From the Google Santa Village website
This year’s Google Santa Village is designed to look like an Advent calendar, with a new activity for each day of December.
For the launch yesterday, Google had a “Present Bounce” game, and today’s activity included a link to an animated “Santa’s Back” YouTube video.

Google’s Daily Countdown to Christmas Calendar

Each day of December will offer a new surprise from Google, counting down the days until December 24, when Santa’s official journey begins.
Google isn’t the only one that regularly tracks Santa’s whereabouts. While Bing hasn’t launched its Santa countdown clock yet, Microsoft has partnered with NORAD for a number of years now to conduct its own Santa tracking.

Canonical chaos: doubling down on duplicate content

Search engines are getting smarter. There is little doubt about that. However, in a CMS-driven web where content can often exist on several URLs, it is not always clear which URL is the authoritative one for a given piece of content. Having content on several URLs can also split link and ranking signals across the variations.
It is hard enough standing out in the often hypercompetitive search landscape, so you would imagine that most businesses would have these foundational SEO issues under control. Unfortunately, our experience tells us otherwise. In fact, in the wake of many sites moving to HTTPS for the promised ranking boost, we are seeing even more URL-based duplicate content issues than before.
Fortunately, we have the canonical tag. With rel=canonical, we can easily specify the authoritative URL for any piece of content. Google and the other engines will then consolidate link and ranking signals for all variations of this content onto a single URL. That is, of course, only if rel=canonical is correctly implemented.
In this article, I take a look at how incorrect implementation of canonical URLs can exacerbate URL-based duplicate content. I also share an example of a UK-based e-commerce store that recently saw their home page de-indexed (just the home page) due to what seemingly ended up being an issue with the canonical URLs.

Dodgy duplicates

It is not unusual for a piece of content to exist on multiple URLs. This could be on one site or many. It could be due to subdomains. It could be due to your CMS creating multiple entry points for a single piece of content. It could also be due to running your site over HTTPS in line with Google’s latest recommendations.
There are a bunch of potential situations that can lead to a piece of content being available on multiple URLs, but the most common tend to be:
  • Dynamic URLs — e.g., http://example.com/?post=1&var=2&var=3
  • Mobile sites — e.g., m.example.com and www.example.com
  • International sites without correct geo-targeting
  • www and subdomain issues — e.g., www.example.com or example.com
  • CMS generating multiple URLs
  • Content syndication on other blogs
  • Running your site on both HTTP and HTTPS
We also tend to see a mixture of these issues, and it is not unusual to find sites that run HTTP and HTTPS and have content available on the www and non-www version of the site. This can quickly create a situation where the same piece of content (or the home page) can be available on several different URLs.
As an example, just the very common running of the site with and without www, and over HTTP and HTTPS, can create four potential URLs for every piece of content on the site:
  • http://example.com/page
  • http://www.example.com/page
  • https://example.com/page
  • https://www.example.com/page

Canonical chaos

Now, in an ideal world, your canonical URL would sort this out, and each of the four URLs would have the same canonical URL specified. It could be any of the above, but if you have HTTPS, you may as well run with HTTPS, so let's say your canonical URL is https://www.example.com/page. You'd put this piece of code into the HTML head of all the other versions:
<link rel="canonical" href="https://www.example.com/page" />
I have seen debate about whether the actual canonical page should canonicalize to itself — in practice we do, and I have seen this sentiment echoed by other SEOs over the years (and have never run into any issues doing so).
Unfortunately, what we are seeing quite a bit recently is that the canonical tag is present, yet each page has a canonical that matches the URL shown in the browser window.
  • http://example.com/page canonical = http://example.com/page
  • http://www.example.com/page canonical = http://www.example.com/page
  • https://example.com/page canonical = https://example.com/page
  • https://www.example.com/page canonical = https://www.example.com/page
Clearly, this is not ideal. The canonical tag is designed to resolve these very issues, but in this instance, it further exacerbates the situation. Each URL here is saying, “Me, me, index me!!!” The search engine then has to do what it can with this mess.
Issues like this impact trust and confidence. Trust and confidence impact rankings. Poor rankings impact your business. That may all sound like something an SEO Yoda would say, but the reality is that a goofed canonical tag will only hurt your results.
We recently worked with a UK business that saw their home page mysteriously de-indexed, which hit them hard for the big keywords they target. They typically sit among amazon.co.uk and other huge brands in the top three, so there is no room for these issues. After checking all the usual suspects, we identified issues with the canonical tag implementation — this was fixed, the site was crawled, and the home page popped back in again. I was somewhat staggered, but it drives home the importance of solid technical SEO.
Fortunately, this happened and we resolved it just before the big Christmas rush — but had this issue cropped up now, the financial impact could have been far worse.

HTTP and HTTPS

The move to HTTPS is generally a good thing. Security matters. And the web is faster than it once was. However, we have seen all manner of problems here, usually due to the site being indexed on both HTTP and HTTPS URL variations.
Unfortunately, we also tend to see the canonical tags use both HTTP and HTTPS, which again further exacerbates the underlying issue that the canonical tag should resolve.

Why does this happen?

I believe there are a couple of reasons we see these issues:
  1. The site is running on HTTP and HTTPS, and the CMS has no way to force the protocol and/or subdomain for canonical URLs.
  2. Developers take a checklist approach to SEO, implementing the canonical tag without really understanding what it is for and populating it with the address bar URL.

Correcting your canonicals

In most cases, duplicate content issues can be resolved pretty easily. Fixing the canonical is one way, but this can be tricky with some web CMS software, so we can utilize permanent HTTP redirects (301). This is typically the fastest and most logical approach in that the page variation is never crawled and Google does not have to analyze multiple pages — they just follow the redirection.
  1. 301 redirections. If you can redirect, do redirect. This is the fastest and preferred approach, as stated by John Mueller from Google. Redirect to your preferred subdomain. Redirect to your preferred protocol. Often, you can implement a simple sitewide, catch-all redirection rule that deals with 90 percent of this in one fell swoop.
  2. Correct canonicals. Where a canonical is required, you need to implement a page-level canonical from one variation to the other. As above, determine your primary subdomain and protocol, and ensure all duplicates have a canonical pointing to the primary page.
That is pretty much it — always redirect if you can, as it deals with duplicate content issues in the quickest and most efficient way (from a workload and ranking perspective).
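As an example of such a catch-all rule — assuming an Apache server with mod_rewrite enabled and https://www.example.com as your preferred version (substitute your own host) — something along these lines in .htaccess handles both the protocol and the subdomain in a single 301:

```apache
# Redirect every HTTP and/or non-www request to https://www.example.com,
# preserving the requested path and query string.
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteCond %{HTTP_HOST} ^(?:www\.)?(.+)$ [NC]
RewriteRule ^ https://www.%1%{REQUEST_URI} [R=301,L]
```

Exact directives vary by server and host configuration, so crawl the site to verify the redirects before and after deploying.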
Then, where this is not possible or desirable, implement page-level canonical tags. This may need some developer support.
Certainly, for WordPress there is a simple fix using the wpseo_canonical filter from the WordPress SEO plugin. This allows you to force HTTP or HTTPS or the subdomain with some fairly basic PHP. Your developer can often do the same to help you with other CMS and bespoke builds. This is not terribly complicated — it just requires a clear understanding of why the canonical exists.
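As a sketch of that WordPress fix — assuming the Yoast WordPress SEO plugin is active, and with example.com standing in for your own host — the filter can be hooked from a theme's functions.php or a small plugin:

```php
<?php
// Force the canonical URL emitted by the WordPress SEO plugin onto
// HTTPS and the www subdomain. The example.com host is illustrative;
// adjust both patterns for your own site.
add_filter( 'wpseo_canonical', function ( $canonical ) {
    // Force HTTPS.
    $canonical = preg_replace( '#^http://#', 'https://', $canonical );
    // Force the www subdomain if it is missing.
    $canonical = preg_replace( '#^https://(?!www\.)#', 'https://www.', $canonical );
    return $canonical;
} );
```

The same idea — intercept the canonical before output and normalize the protocol and subdomain — can be applied by your developer to other CMSes and bespoke builds.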

One URL to rule them all

It’s not unusual for a piece of content to appear on multiple URLs. There is no duplicate content penalty as such. However, for a search engine to be 100 percent confident in the correct URL to return and to ensure all equity is consolidated in one primary version of a page, we need accurate redirections and canonical URLs.
Simply adding an SEO plugin or having your developer hack in a canonical URL is not enough — it must be implemented in a way that ensures that each piece of content has one authoritative URL.
One URL to rule them all. One URL to find them. One URL to bring them all and in the search results bind them.

SearchCap: Google Santa Tracker, duplicate content & Bing Ads

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

  • Canonical chaos: doubling down on duplicate content
    Dec 2, 2016 by Marcus Miller
    Duplicate content issues? Problems with your canonical tags? Columnist Marcus Miller explains why these issues occur and how to fix them.
  • Google’s 2016 Santa Tracker signals the official countdown to Christmas
    Dec 2, 2016 by Amy Gesenhues
    This is the 12th year Google has counted down the seconds until Santa’s December 24 departure from the North Pole.
  • 5 reasons advertisers should NOT ignore Bing Ads
    Dec 2, 2016 by Michelle Cruz
    Columnist Michelle Cruz outlines some recent developments in Bing Ads and makes the case for this cost-effective PPC platform.
  • Search in Pics: Google swing set, bike trailer & goggles
    Dec 2, 2016 by Barry Schwartz
    In this week’s Search In Pictures, here are the latest images culled from the web, showing what people eat at the search engine companies, how they play, who they meet, where they speak, what toys they have and more. Google indoor swing set: Source: Instagram Google Kosher Certification: Source: Twitter Google bike trailer: Source: Instagram […]
  • Gartner CMO Spend Survey: Marketing budgets continue to rise
    Dec 1, 2016 by Digital Marketing Depot
    According to Gartner’s 2016–2017 Chief Marketing Officer (CMO) Spend Survey, marketing budgets increased to 12 percent of company revenue in 2016, from 11 percent in 2015. Fifty-seven percent of marketing leaders surveyed expect their budgets will increase further in 2017. However, 14 percent of marketers say they are bracing for budget cuts, up from 3 […]


Reminder: Live webcast: Social Media Marketing 3.0

Social media is now an integral part of marketing — spending on social media is expected to double to 25% of marketing budgets by 2020. Yet, nearly half of digital marketers say they can’t prove the bottom-line impact of social media on their businesses.
Join our panel of experts on Thursday, December 8, 2016, as they show you how to reduce complexity, increase scalability, and improve your social media marketing results. You’ll learn how to:
  • streamline content creation and planning across departments;
  • identify and measure key metrics that will demonstrate social ROI; and
  • improve your brand authenticity by leveraging user-generated content.
Register today for “Social Media Marketing 3.0: Best Practices for Ramping Up Social ROI,” produced by Digital Marketing Depot and sponsored by Lithium.

Report: Microsoft responds to Amazon Echo, Google Home with “HomeHub”

In response to the unexpected popularity of Amazon’s Alexa devices, Google felt compelled to develop Google Home. There have also been rumors about a stand-alone Apple Siri device. I’ve been waiting for a Microsoft response; apparently there already is one.
It’s called HomeHub, which has reportedly been in the works for some time. This weekend, there were a number of stories about how HomeHub would “crush” Amazon Echo and Google Home when it’s released. However, HomeHub is not hardware; it’s a new software layer on or within Windows 10.
According to Windows Central:
Home Hub isn’t a dedicated device that’s designed to take on the likes of the Amazon Echo and Google Home, as in the end, Home Hub is just the software. But that software can do everything the Amazon Echo and Google Home devices can, but with one added benefit: a screen. Home Hub is designed to run on Windows 10 PCs, mainly All-In-Ones and 2-in-1’s with touch screens, but can work on any Windows 10 machine. Pen and ink support are also part of the plan.
A key feature of this smart-home software overlay will be an always-on Cortana, which will be accessible from the lock screen. Indeed, the central strategic difference between Microsoft’s approach and Google’s, and presumably Apple’s, in taking on Echo/Alexa is that there will be a PC or tablet screen to complement the virtual assistant experience.
This could solve a number of challenges that currently exist for both users and marketers with the screenless Echo and Home (notwithstanding the smartphone app and search companion features). But it also poses challenges for Microsoft. The “PC in the kitchen” scenario has not really materialized as a mass-market phenomenon. I could imagine a lower-cost version of Surface specifically intended for virtual assistant and smart-home management functions.
Price will be a major factor. With PC sales in decline — though Surface has been a success for Microsoft — it will be extremely challenging for Microsoft to convince people to spend hundreds of dollars for another PC, no matter how seemingly utilitarian. This is especially true against the competitive backdrop of Echo’s $179 and Google Home’s $129 price tags.
Until something more concrete makes its appearance, however, this is all speculation. Yet it makes sense that Microsoft, which has long aspired to be the brain of the smart home, would respond to the rise of Echo and Google Home, which directly threaten that role. Amazon announced last week that over Black Friday weekend it sold millions of Alexa-powered units.

As a local business, you have to own your own back yard

As you know from my past articles, I typically write about real-world encounters with clients or other SEOs — and this one’s no different. The interesting point about today’s post is that the issue I’m discussing might partially be my own fault!
Last summer, I wrote a post here about local content silos, a strategy for creating silos of localized content relating to nearby cities. The strategy is valid and can give you great results, but it takes time and a lot of hard work. I’ve talked about the strategy in videos and mentioned it numerous times when speaking at conferences.
In this month’s installment of Greg’s Soapbox, I’m going to (sort of) poke holes in my own strategy. Stick with me for a minute — I promise it’ll make sense.

Is “doing the silo thing” the right way to go?

More and more often, when dealerships sign up for our SEO service, they’re asking us to “do the silo thing” so that they can compete in searches in the next town over, or the big metro that’s 20 minutes away. I get the same question at SEO conferences almost every time: “We’re in the suburbs, so can you tell us how to show up for searches in our main metro?”
Yes. The silo strategy can produce great results… but you have to own your own back yard first!
Here’s the big problem: businesses are so worried about showing up in searches in other cities that they ignore their own back yard. You’ve got to really own your own town before you go after any others. Owning your own location is like the ante in the local SEO poker game: if you don’t pay your ante, you can’t sit at the adults’ table and play.
In almost every case, businesses aren’t showing up well (if at all) in their own city. Remember, Google is going to return search results based on relevance and importance. You’ve got to handle the SEO basics first: write great content and get awesome links. Before you step up and swing for the fences with the silo strategy, you’ve got to make sure you’ve got your own location locked down.
Local search also relies on location and proximity. At a very basic level, if you’ve knocked out your basics, you should be ranking well in your own town. If you’ve got great content, you’ll have sitewide keyword relevance. If you’ve got great links, you’ll have an authoritative site. If you’ve got your location signals optimized on your site, you’ll have sitewide local relevance. If your citations are consistent and robust, you’ll have the off-site location signals nailed as well.

Don’t neglect your own town in favor of the market next door

Most of the time when businesses are laser-focused on the market next door, or the major metro, they ignore their own town. If you’re not showing up well in your own town, how do you hope to compete in the city next door, where you don’t have as much local relevance and no proximity?
It’s much harder to show up in searches in a city where you’re not physically located. That’s strike one. Typically, when businesses are targeting nearby cities, it’s because they’re going after a bigger market/population, which means a lot more competition. Strike two. If they’re not very visible in their own town (where they should easily show up), that means they’re not well optimized in the first place. There’s strike three.
If you forgo any local optimization efforts to concentrate on the nearby target, and your local signals aren’t maximized, you could be missing out on a significant amount of traffic from local searches — where you should be absolutely dominating anything else local.

After you’ve got your own back yard in order

Once you’ve got your own back yard perfectly landscaped and set up, then you can start trying to hop the fence and steal attention from your neighbors. But you need to have a realistic expectation for potential results. With the right strategy and meticulous effort, you can eventually show up in searches in nearby cities; however, this will take time. Don’t miss out on the easy local traffic in the meantime — get your own back yard in order!

Black Friday Report: Retail search ad spend shifted from text to shopping ads on desktop

Shopping ad growth continues to cannibalize text ad spend on desktop. Retail spend shifted significantly from paid search text ads to paid search shopping ads on desktop this Black Friday weekend compared to a year ago. That’s according to a report from paid search insights firm AdGooroo, a Kantar Media company, based on an analysis of 2,500 top retail product keywords.
From Black Friday through Cyber Monday, advertisers spent $8.9 million on Google desktop text ads across the keyword set analyzed in the US this year compared to $15.4 million in 2015. Product listing ad spend on desktop, meanwhile, increased from $2.9 million in 2015 to $9.6 million during the period in 2016.
That’s a total spend of $18.5 million in 2016 compared to a slightly lower $18.3 million a year ago. Earlier this year, Google removed text ads from the right rail of desktop results. Product listing ads, however, display either above the organic results in the mainline or in the right rail.
Amazon led the pack in terms of click share from text ads on Google desktop, garnering 6.3 percent of clicks over the weekend. (Amazon does not participate in Google Shopping.) Walmart accounted for 8.1 percent of desktop product listing ad clicks for the period.
Desktop paid search spend over Black Friday weekend (source: AdGooroo)

Have Google and Facebook become unwitting tools of extremism?

Google and Facebook see themselves fundamentally as instruments of progress. Their company narratives argue that the information and communication they deliver or facilitate are helping, not harming, people and society as a whole.
But what if the opposite is true; what if Google and Facebook have become effective tools of misinformation and hate-group propaganda? That’s essentially the argument of a Guardian article that appeared over the weekend:
Tech-savvy rightwingers have been able to ‘game’ the algorithms of internet giants and create a new reality where Hitler is a good guy, Jews are evil and… Donald Trump becomes president.
The Guardian article asserts that extremists have become as skilled as anyone at ranking and that they’re promulgating false information and hateful propaganda, which is starting to have real-world consequences, in terms of election outcomes and social attitudes towards minorities and immigrants. The article focuses on antisemitism and Islamophobia in particular.
Both Google and Facebook have been embroiled in the fake news controversy since the US election and on the defensive regarding the degree to which their lack of editorial oversight allowed the exposure and sharing of misinformation that may have influenced the outcome.
After initially denying any negative influence, Facebook has done an about-face and is now actively seeking ways to prevent fake news from being disseminated on the site, including curating content and potentially favoring established news sites. In addition, both Google and Facebook have implemented changes that seek to cut off ad revenues to fake news sites.
Some of the controversies discussed in the Guardian article are nothing new. Google has drawn scrutiny and criticism over the years around its ranking of news and content and its autocomplete suggestions. And partisans on both the left and the right have made conspiratorial claims about how Google is intentionally promoting one or another perspective through its rankings or search suggestions. Most recently, conservatives complained that Google wasn’t showing search suggestions for the phrase “Crooked Hillary.”
Google isn’t manipulating its algorithm or search suggestions to advance a political agenda. But others seeking to influence the public and social attitudes are themselves trying to game it. Around the world the internet is arguably becoming more a tool of authoritarian control and propaganda (e.g., Russia, China) than of the free flow of information, facts or “progress.”
According to one of Trump’s surrogates over the weekend, we now live in a society where facts no longer matter. “There’s no such thing, unfortunately, anymore, of facts,” Scottie Nell Hughes argued. This notion is the opposite of what Google stands for.
Google and Facebook have both contributed to and celebrated the “democratization” of news and content — the idea that someone with a blog and a strong perspective or voice can potentially get the same visibility and distribution as the New York Times. While that has created considerable opportunity for enterprising individuals and entrepreneurs, this “flattening” of the news has also probably diminished the credibility of established news brands and elevated dubious sources to prominent visibility.
The quasi-utopian vision that both Google and Facebook share is one of technology contributing to ever expanding social progress. However, the Guardian’s more dystopian view argues the opposite is starting to happen: extremists and authoritarians are successfully gaming search and social media, confusing the public, spreading hate and starting to undermine democratic institutions.

Thursday, 1 December 2016

How Google is tackling fake news, and why it should not do it alone

What can Google do to combat fake news? Columnist Ian Bowden illustrates some ways the search giant can tackle — and already is tackling — this problem.

Fact-checking and preventing fake news from appearing in search results will remain a big priority for search engines in 2017.
Following the US election and Brexit, increased focus is being placed on how social networks and search engines can avoid showing “fake news” to users. However, this is a battle that search engines cannot — and more fundamentally, should not — fight alone.
With search engines providing a key way people consume information, it is obviously problematic if they can both decide what the truth is and label content as the truth. This power might not be abused now, but there is no guarantee of the safe governance of such organizations in the future.
Here are five key ways Google can deal (or already is dealing) with fake news right now. They are:
  1. Manually reviewing websites
  2. Algorithmically demoting fake news
  3. Removing incentives to create fake news
  4. Signaling when content has been fact-checked
  5. Funding fact-checking organizations

1. Manually reviewing websites

Google does have the power to determine who does and does not appear in their various listings. To appear in Google News, publishers must meet Google’s guidelines, then apply for inclusion and submit to a manual review. This is not the case with the content that appears in traditional organic listings.
Understanding how each part of the search results is populated, and the requirements for inclusion, can be confusing. It’s a common misconception that the content within the “In the news” box is Google News content. It’s not. It may include content from Google News, but after a change in 2014, this box can pull in content from traditional search listings as well.
In the News
“In the news” appears at the top of the page for certain queries and includes stories that have been approved for inclusion in Google News (shown above) as well as other, non-vetted stories from across the web.
That’s why Google was criticized last week for showing a fake news story that reported a popular-vote win for Trump. The fake story appeared in the “In the news” box despite not being Google News content, so it had not been manually reviewed.
There needs to be better transparency about what content constitutes Google News results and what doesn’t. Labeling something as “news” may give it increased credibility for users, when in reality it hasn’t undergone any manual review.
Google will likely avoid changing the carousel to a pure Google News product, as this may stoke concerns among news outlets that Google is monetizing traffic they believe is being “stolen” from them. Unless Google removes the ads that appear alongside organic listings whenever a news universal result shows, it has to keep this carousel an aggregation of content from across the web.
At the time of writing it hasn’t been confirmed, but there is speculation that Google plans to reduce the ambiguity of the “In the news” listings by replacing the box with “Top stories” (as seen in its mobile search results). Like content from the “In the news” box, these listings have been a mashup of Google News and normal search listings, with the common trait being that the pages are AMP-enabled.
Top Stories Screenshot
“Top stories” consists of AMP web pages.
In my opinion, “Top stories” still implies an element of curation, so perhaps something like “Popular stories from across the web” may work better.
Manual review isn’t viable for the entire web, but it’s a start that items from Google News publishers are manually reviewed before inclusion. The opportunity here is to be more transparent about where content has been reviewed and where it hasn’t.

2. Algorithmically demoting fake news

Traditionally, search engines have indirectly dealt with fake news through showing users the most authoritative search results. The assumption is that domains with higher authority and trust will be less likely to report fake news.
It’s another debate whether “authority” publications actually report on the truth, of course. But the majority of their content is truthful, and this helps to ensure fake news is less likely to appear in search results.
The problem is, the very ranking signals search engines use to determine authority can elevate fake news sites when their content goes viral and becomes popular. That is why, in the above example, the fake news performed well in search results.
Google’s ability to algorithmically determine “facts” has been called into doubt. Last week, Danny Sullivan on Marketing Land gave several case studies where Google gets it wrong (sometimes comically) and outlines some of the challenges of algorithmically determining the truth based on the internet.
Google has denied that TrustRank exists, but perhaps we’ll see the introduction of a “TruthRank.” There could be a series of “truth beacons,” in the same way the TrustRank patent outlines seed sites, and a score could be appended to a page based on the number of corroborating citations from truth-checking services.
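To make that speculation concrete, here is a toy sketch of what citation-based truth scoring could look like. This is entirely hypothetical: Google has described no such system, and the function name, data shape and neutral-score choice are all my own assumptions for illustration.

```python
# Hypothetical "TruthRank"-style scoring: rate a page by how its claims
# fare against a set of independent fact-checking services ("truth beacons").
def truth_score(claim_verdicts):
    """claim_verdicts maps each claim on a page to the list of verdicts
    (True = confirmed, False = refuted) returned by the fact-checking
    services that reviewed it. Unreviewed claims contribute nothing."""
    verdicts = [v for vs in claim_verdicts.values() for v in vs]
    if not verdicts:
        return 0.5  # no evidence either way: neutral score
    return sum(verdicts) / len(verdicts)

# Example page: one claim refuted by two beacons, one confirmed by one.
page = {
    "popular vote claim": [False, False],
    "turnout figure": [True],
}
print(truth_score(page))  # 1 confirmed verdict out of 3 -> 1/3
```

A real system would of course need to weight beacons by independence and reliability, rather than counting all verdicts equally.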

3. Removing the incentive to create fake news

There are two main goals for creating fake news websites: money and influence. Google not only needs to prevent this material from appearing in search results but also needs to play a role in restricting the financial incentive to do it in the first place.
Google AdSense is one of the largest ad networks, and it was one of the largest sources of income for authors creating fake news. One author of fake news around the US election was reportedly making $10,000 per month.
Earlier this month, both Facebook and Google banned fake news sites from utilizing their ad networks. This is a massive step forward and one that should make a big difference. There are other ad networks, but they have smaller inventories and should face pressure to follow suit.
A Google spokesperson said:
“Moving forward, we will restrict ad serving on pages that misrepresent, misstate, or conceal information about the publisher, the publisher’s content or the primary purpose of the web property.”
Google can do little directly to reduce the incentive to create fake news for political influence. But if producing fake news becomes less effective, and culturally unacceptable, it is less likely to be used by political organizations and individuals.

4. Signaling when content has been fact-checked

In October, Google introduced a “Fact Check” label for stories in Google News, their objective being “to shine a light on [the fact-checking community’s] efforts to divine fact from fiction, wisdom from spin.” The label now appears alongside previously existing labels such as “opinion,” “local source” and “highly cited.”
Fact-checking sites that meet Google’s criteria can apply to have their services included, and publishers can mark up their fact-check articles using ClaimReview schema markup.
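As an illustration, a publisher’s fact-check article can embed ClaimReview markup as a JSON-LD block in the page. The sketch below builds such a block in Python; the URL, claim text, organization name and rating values are all hypothetical, and schema.org documents the full property list.

```python
import json

# Hypothetical ClaimReview markup (schema.org/ClaimReview) that a publisher
# could embed in a fact-check article. All values here are illustrative.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "datePublished": "2016-11-28",
    "url": "https://example.com/fact-checks/popular-vote",
    "claimReviewed": "Candidate X won the popular vote.",
    "author": {
        "@type": "Organization",
        "name": "Example Fact Checkers",
    },
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,        # where on the scale this claim lands
        "worstRating": 1,        # 1 = false
        "bestRating": 5,         # 5 = true
        "alternateName": "False",
    },
}

# Serialize for embedding in <script type="application/ld+json"> ... </script>
json_ld = json.dumps(claim_review, indent=2)
print(json_ld)
```

The search engine reads this markup to identify the claim, who checked it and the verdict, which is what powers the “Fact Check” label.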
The kind of populist politics that has surfaced in both America and the UK is cynical about the establishment, and this cynicism could easily extend to any fact-checking organization.
Trump has claimed the media is biased, specifically calling out sources such as The New York Times and The Washington Post. Any attack from influential people on the fact-checking organizations could quickly undermine their credibility in the eyes of populists. It needs to be communicated clearly that the facts are not defined by Google and that they are from neutral, objective sources.
Fact check label
Google has introduced a new “Fact Check” label.
These labels currently apply only to Google News, but it will be interesting to see if and how Google can expand them to the main index.

5. Funding fact-checking organizations

Of course, Google should not be defining what the truth is. Having the power to both define veracity and present it back to society concentrates power that could be abused in the future.
Google, therefore, has a large dependency on other organizations to do this task — and a large interest in seeing it happen. The smart thing Google has done is to fund such organizations, and this month it has given €150,000 to three UK organizations working on fact-checking projects (plus others elsewhere in the world).
One of the UK organizations is Full Fact. Full Fact is working on the first fully automated fact-checking tool, which will lend scalability to the efforts of journalists and media companies.
Full Fact caps donations from any one organization at 15 percent of its funding, to avoid commercial influence and preserve objectivity. That cap is the counterargument to any cynics suggesting Google’s donation isn’t large enough for the size of the task.
Google needs accurate sources of information to present back to users, and funding fact-checking organizations will accelerate progress.

To wrap up

Setting aside all of the challenges Google faces, there are deep-rooted issues in defining what constitutes the truth, which parameters of truth are acceptable and how representing it back to society should be governed.
For Google to appear to be objective in their representation of truth, they need to avoid getting involved in defining it. They have a massive interest in this, though, and that’s the reason they have invested money into various fact-checking services.
Over the past decade, it has been possible to point to the main focus of search engines at any given time, e.g., content or links. Going forward, I think fact-checking and the tackling of fake news will become as high a priority as any other.
Google needs to act as a conduit between the user and the truth — and not define it.