LinkedIn Website Demographics: What Marketers Need to Know

Want more information on people who visit your website? Did you know LinkedIn can help? LinkedIn Website Demographics provides professional insights about your website visitors and allows you to remarket to them through LinkedIn ads. In this article, you’ll learn how to set up and use LinkedIn’s Website Demographics for remarketing.

This post LinkedIn Website Demographics: What Marketers Need to Know first appeared on .
– Your Guide to the Social Media Jungle


LinkedIn for Business: The Ultimate LinkedIn Marketing Guide

If you’re new to LinkedIn business options or want to add something new to your current LinkedIn marketing plan, this page is for you. Here, you’ll find articles and resources to help beginner, intermediate, and advanced marketers use LinkedIn profiles, video, ads, analysis, and more for business.


Instagram for Business: The Ultimate Instagram Marketing Guide

If you’re new to Instagram business options or want to add something new to your current Instagram marketing plan, this page is for you. Here, you’ll find articles and resources to help beginner, intermediate, and advanced marketers use Instagram profiles, stories, live video, ads, analysis, contests, and more for business.


92 Percent of Customers Will Call You Out on Your Poor Customer Service

I’ve worked in the social media space for the past several years, and I’m a professional user—and personal fan—of Sprout Social. As such, I was particularly interested to read the recently published Q3 2017 Sprout Social Index.

The report, entitled “Call-Out Culture: People, Brands & the Social Media Power Struggle,” provides noteworthy statistics garnered from surveying more than 1,000 individuals. In addition, it describes online situations most of us have experienced first-hand, either as marketers or as consumers. This research highlights the social media customer service challenges that brands must address to be relevant, remain competitive, and earn loyalty.

The Culture of Accountability

Sprout Social has long discussed how the rise of social media has democratized individual influence. Today’s consumers have more means than ever to hold a brand accountable for its products or services—and they are.

Consumers Believe Social Has Increased Accountability

75% of consumers say social media has empowered them to interact with brands.

Social gives everyone a very public platform to ask questions, share feedback, and relate experiences; a single post, whether positive or negative, can be viewed and distributed thousands of times before ever being addressed by the pertinent company.

This speed of amplification places a heavy burden on organizations. Yet, as Sprout Social emphasizes, “Brands must commit to delivering consistent, quality content and service—online and off—regardless of how big or small an issue may seem.”

In other words: Answer every complaint, in every channel, every time.

The Prevalence of the Call-Out

The Index shows consumers tend to complain first in person (55 percent), but social media (47 percent) and email (42 percent) are rapidly growing alternatives.

Key Channels for Consumer Complaints

.@SproutSocial research shows 47% of people have used social media to complain about a business.

Interestingly, only eight percent of those surveyed said they would not speak up at all about an issue. Sprout Social speculates that this low percentage reflects a new “era of engagement,” where consumers have become empowered to speak up and take a stand against brands. I would hypothesize further that these findings also reflect consumers’ increasing comfort with, as well as accessibility to, digital media channels in general.

This insight is important because organizations can use it to strategically approach customer service. By internalizing the importance of social listening and bolstering their teams accordingly through staffing and training, brands better position themselves to swiftly—and effectively—address issues when they arise.

The Psychology of the Call-Out

In their report, Sprout Social recommends that companies understand and identify potential triggers as well as the motivations behind customer complaints. It makes good sense. After all, the best complaint is the one that never has to be delivered.

So why do consumers call out brands on social media? Here’s what the research found.

Why Consumers Call Out Brands on Social

Top 3 reasons consumers call out brands on social: Dishonesty, Bad Customer Service, Rudeness.

While these are clearly valid reasons for a customer to complain, there’s more to it than that. Psychology Today identifies three types of complaining: the “chronic” complainer who can never be satisfied, the “venter” who primarily wants to solicit sympathy, and the “instrumental” complainer who actually wants to have problems solved.

Sprout Social’s research supports this: the call-outs it documents come mostly from venting and instrumental complainers, or some combination of the two.

What Consumers Hope to Gain by Calling Out Brands

70% of consumers call out brands on social to make other individuals aware of an issue.

I’ll admit I’ve had the occasional experience that incensed me to post a harsh word on social media, and I would bet you a cold beverage of your choice that you have, too. (Though hopefully, we’re thoughtful social community members who praise much more than rage.) But whatever the initial motivation for calling out a brand, it’s how the outreach gets handled that can lose customers for life or convert them into true advocates.

The Power of a Response

In their survey, Sprout Social wanted to know, “What’s better: responding poorly, or ignoring a complaint altogether?”

It may surprise you to learn that an unhelpful response is worse than no response at all.

Sprout Social explains if a brand does not respond to an initial complaint, consumers often give them the benefit of the doubt, either by posting again on social (18 percent), trying to reach the company through another channel such as email or phone (40 percent), or forgetting about the issue altogether (20 percent).

When Brands Ignore Social Complaints

But if a brand does respond to a complaint and does it badly, well, things just don’t go well from there. The percentages of negative customer reactions—sharing the poor experience with others, unfollowing corporate social accounts, and, most significantly, boycotting the brand—all dramatically increase. That’s not good for a business’s reputation and certainly not for its bottom line.

Consumer Reactions to Poor Brand Responses on Social

50% of consumers won’t buy from a brand again if that brand responds poorly to a complaint.

The good news is that brands can minimize detrimental outcomes and win back consumers after a negative social post, provided they make it a priority. As much as we would like it to, great (or even just good) service doesn’t “automagically” happen. Social and customer care teams need effective tools that allow them to monitor for, engage with, and analyze customer interactions. They must also work within a culture of trust that empowers them to resolve issues.

With the right infrastructure in place, brands are much more likely to turn even the occasional upset customer into a brand advocate. Sprout Social’s research showed that if a company responds in a timely and useful manner to a complaint, 45 percent of people will reinforce that positive interaction by posting about it on social, informing their friends about the resolution, and rewarding the brand with future business.

Reaction If a Brand Responds Well to a Social Complaint

45% of people will share their positive experience if a brand responds helpfully to an issue.

Industry Trends

When it comes to call-out culture, Sprout Social says, not all organizations are equally affected. In some industries, consumers are more likely to complain and brands are less likely to respond.

Perhaps due to their significant roles in people’s daily lives, consumer goods, retail, and government generate the most social complaints at 19 percent, 17 percent, and 15 percent respectively. These are also the industries consumers believe need the most help improving their social media service.

Industries That Need the Most Help with Social Customer Service

Top 3 industries complained about on social: Consumer Goods, Retail, Government.

Also in the top five industries where customers would like to see better engagement are Banking/Finance and Healthcare. To be fair, these specific industries do face unique challenges with regulatory restrictions on social content and responses, but critical consumer issues still need to be addressed promptly and accurately.

The Q3 2017 Sprout Social Index concluded that today, brands receive 146 percent more social messages needing response than they did three years ago.

Brands receive 146% more social media messages needing response than they did in 2014.

In the same time frame, the response rate has decreased; on average, brands now respond to only one in 10 social messages.

On average, brands only respond to 1 in 10 social messages (comment, question, or complaint).

With reputation and dollars at stake, it is evident that companies must take social media customer care more seriously. An investment in effective tools and well-trained, people-centric staff will go a long way toward ongoing success if a business is willing to honor their customers and their experiences.

For complete insights, read the entire “Call-Out Culture: People, Brands & the Social Media Power Struggle” report for free (no form-fill necessary!) on the Sprout Social blog.

I’ve been on both sides of the customer service screen and continue to learn from others’ experiences. Share one of your social media customer stories, professionally or personally, with me in the comments.

How to Use Infographics on Pinterest to Get More Website Traffic

Blogging, and online marketing in general, is an endless journey. On top of regularly publishing content that provides value to your audience, you also need to keep up with the ever-changing trends and strategies to stay competitive.

More often than not, it all boils down to expanding your arsenal of tools and automating as much as possible to boost your efficiency. You also need to empathize with your audience and understand their pain points, interests, and content preferences. And if you truly want to maximize your success, you must also pay attention to the content distribution channels that will make your brand heard and seen.

Establishing a popular page on Facebook? Building a huge following on Twitter? Been there, done that.

In this post, we’re going to delve into the nitty-gritty of driving web traffic through Pinterest — armed with a handful of tactics and truckloads of infographics.

3 Facebook Ad Tools for Better Targeting

Want to refine your Facebook ad targeting? Looking for tools to better identify your most valuable Facebook customers? In this article, you’ll find three tools that give you a more complete picture of who your ad audience is, what they’re doing on your site, and how much they’re worth.


Google Shares Details About the Technology Behind Googlebot

Posted by goralewicz

Crawling and indexing have been a hot topic over the last few years. As soon as Google launched Google Panda, people rushed to their server logs and crawling stats and began fixing their index bloat. All those problems didn’t exist in the “SEO = backlinks” era from a few years ago. With this exponential growth of technical SEO, we need to get more and more technical. That being said, we still don’t know how exactly Google crawls our websites. Many SEOs still can’t tell the difference between crawling and indexing.

The biggest problem, though, is that when we want to troubleshoot indexing problems, the only tool in our arsenal is Google Search Console and the Fetch and Render tool. Once your website includes more than HTML and CSS, there’s a lot of guesswork into how your content will be indexed by Google. This approach is risky, expensive, and can fail multiple times. Even when you discover the pieces of your website that weren’t indexed properly, it’s extremely difficult to get to the bottom of the problem and find the fragments of code responsible for the indexing problems.

Fortunately, this is about to change. Recently, Ilya Grigorik from Google shared one of the most valuable insights into how crawlers work:

Interestingly, this tweet didn’t get nearly as much attention as I would expect.

So what does Ilya’s revelation in this tweet mean for SEOs?

Knowing that Chrome 41 is the technology behind the Web Rendering Service is a game-changer. Before this announcement, our only solution was to use Fetch and Render in Google Search Console to see our page as rendered by the Web Rendering Service (WRS). Now, all you need to do is download and install Chrome 41 to see how your website loads in that browser, which means we can troubleshoot technical problems that would otherwise have required experimenting and creating staging environments. That’s it.

You can check the features and capabilities that Chrome 41 supports on browser feature-support reference sites (Googlebot should support similar features). These sites make a developer’s life much easier.

Even though we don’t know exactly which version Ilya had in mind, we can find Chrome’s version used by the WRS by looking at the server logs. It’s Chrome 41.0.2272.118.
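
As a quick sketch of that log check, here is how you might pull the Chrome version out of access-log lines with Python. The log lines, IPs, and user-agent strings below are illustrative assumptions, not real data; log formats vary by server, and confirming a hit truly comes from Google requires a reverse-DNS lookup.

```python
import re

# Illustrative access-log lines; real formats and user agents vary.
log_lines = [
    '66.249.66.1 - - "GET /app.js HTTP/1.1" 200 "Mozilla/5.0 AppleWebKit/537.36 '
    '(KHTML, like Gecko; Google Search Console) Chrome/41.0.2272.118 Safari/537.36"',
    '203.0.113.7 - - "GET / HTTP/1.1" 200 "Mozilla/5.0 (Windows NT 10.0) '
    'AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.100 Safari/537.36"',
]

# Collect every Chrome version reported in hits attributed to Google.
versions = set()
for line in log_lines:
    if "Google" in line:
        match = re.search(r"Chrome/([\d.]+)", line)
        if match:
            versions.add(match.group(1))

print(versions)  # {'41.0.2272.118'}
```

Running something like this over a real log would surface the WRS version string mentioned above.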

It will be updated sometime in the future

Chrome 41 was created two years ago (in 2015), so it’s far removed from the current version of the browser. However, as Ilya Grigorik said, an update is coming:

I was lucky enough to get Ilya Grigorik to read this article before it was published, and he provided a ton of valuable feedback on this topic. He mentioned that they are hoping to have the WRS updated by 2018. Fingers crossed!

Google uses Chrome 41 for rendering. What does that mean?

We now have some interesting information about how Google renders websites. But what does that mean, practically, for site developers and their clients? Does this mean we can now ignore server-side rendering and deploy client-rendered, JavaScript-rich websites?

Not so fast. Here is what Ilya Grigorik had to say in response to this question:

We now know the WRS’s capabilities for rendering JavaScript and how to debug them, which lets us troubleshoot and better diagnose problems. However, remember that not all crawlers support JavaScript crawling. As of today, JavaScript crawling is supported only by Google and Ask (Ask is most likely powered by Google). And even if you don’t care about social media or search engines other than Google, one more thing to remember is that even with Chrome 41, not all JavaScript frameworks can be indexed by Google (read more about JavaScript frameworks crawling and indexing).

Don’t get your hopes up

All that said, there are a few reasons to keep your excitement at bay.

Remember that version 41 of Chrome is over two years old. It may not work very well with modern JavaScript frameworks. To test it yourself, open the test page using Chrome 41, and then open it in any up-to-date browser you are using.

The page in Chrome 41 looks like this:

The content parsed by Polymer is invisible (meaning it wasn’t processed correctly). This is also a perfect example for troubleshooting potential indexing issues. The problem you’re seeing above can be solved if diagnosed properly. Let me quote Ilya:

“If you look at the raised Javascript error under the hood, the test page is throwing an error due to unsupported (in M41) ES6 syntax. You can test this yourself in M41, or use the debug snippet we provided in the blog post to log the error into the DOM to see it.”

I believe this is another powerful tool for web developers who want to make their JavaScript websites indexable. We will definitely expand our experiment and work with Ilya’s feedback.

The Fetch and Render tool is the Chrome v. 41 preview

There’s another interesting thing about Chrome 41. Google Search Console’s Fetch and Render tool is simply the Chrome 41 preview. The right-hand side view (“This is how a visitor to your website would have seen the page”) is generated by the Google Search Console bot, which is… Chrome 41.0.2272.118 (see screenshot below).

There’s evidence that both Googlebot and Google Search Console Bot render pages using Chrome 41. Still, we don’t exactly know what the differences between them are. One noticeable difference is that the Google Search Console bot doesn’t respect the robots.txt file. There may be more, but for the time being, we’re not able to point them out.

Chrome 41 vs Fetch as Google: A word of caution

Chrome 41 is a great tool for debugging Googlebot. However, sometimes (not often) there’s a situation in which Chrome 41 renders a page properly, but the screenshots from Google Fetch and Render suggest that Google can’t handle the page. It could be caused by CSS animations and transitions, Googlebot timeouts, or the usage of features that Googlebot doesn’t support. Let me show you an example.

Chrome 41 preview:

Image blurred for privacy

The above page has quite a lot of content and images, but it looks completely different in Google Search Console.

Google Search Console preview for the same URL:

As you can see, Google Search Console’s preview of this URL is completely different than what you saw on the previous screenshot (Chrome 41). All the content is gone and all we can see is the search bar.

From what we noticed, Google Search Console renders CSS a little differently than Chrome 41. This doesn’t happen often, but as with most tools, we need to double-check whenever possible.

This leads us to a question…

What features are supported by Googlebot and WRS?

According to the Rendering on Google Search guide:

  • Googlebot doesn’t support IndexedDB, WebSQL, and WebGL.
  • HTTP cookies and local storage, as well as session storage, are cleared between page loads.
  • All features requiring user permissions (like Notifications API, clipboard, push, device-info) are disabled.
  • Google can’t index 3D and VR content.
  • Googlebot only supports HTTP/1.1 crawling.

The last point is really interesting. Despite statements from Google over the last two years, Google still only crawls using HTTP/1.1.

No HTTP/2 support (still)

We’ve mostly been covering how Googlebot uses Chrome, but there’s another recent discovery to keep in mind.

There is still no support for HTTP/2 for Googlebot.

Since it’s now clear that Googlebot doesn’t support HTTP/2, if your website supports HTTP/2, you can’t drop HTTP/1.1 optimization: Googlebot can crawl only using HTTP/1.1.
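
One way to double-check this on your own site is to tally the protocol version of each Googlebot request in your access logs. Here is a minimal sketch; the log lines are illustrative assumptions, and this relies on the protocol appearing at the end of the quoted request line, as it does in the common combined log format.

```python
from collections import Counter

# Illustrative access-log lines; real formats vary by server.
log_lines = [
    '66.249.66.1 - - "GET / HTTP/1.1" 200 "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '198.51.100.4 - - "GET / HTTP/2.0" 200 "Mozilla/5.0 (X11; Linux x86_64)"',
    '66.249.66.1 - - "GET /style.css HTTP/1.1" 200 "Googlebot/2.1 (+http://www.google.com/bot.html)"',
]

# Count which protocol versions Googlebot's requests arrive over.
protocols = Counter()
for line in log_lines:
    if "Googlebot" in line:
        request = line.split('"')[1]             # e.g. 'GET / HTTP/1.1'
        protocols[request.rsplit(" ", 1)[-1]] += 1

print(protocols)  # Counter({'HTTP/1.1': 2})
```

On a real log, every Googlebot hit should show HTTP/1.1, even when browsers hit the same server over HTTP/2.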

There were several announcements recently regarding Google’s HTTP/2 support. To read more about it, check out my HTTP/2 experiment here on the Moz Blog.


Googlebot’s future

Rumor has it that Chrome 59’s headless mode was created for Googlebot, or at least that it was discussed during the design process. It’s hard to say if any of this chatter is true, but if it is, it means that to some extent, Googlebot will “see” the website in the same way as regular Internet users.

This would definitely make everything simpler for developers who wouldn’t have to worry about Googlebot’s ability to crawl even the most complex websites.

Chrome 41 vs. Googlebot’s crawling efficiency

Chrome 41 is a powerful tool for debugging JavaScript crawling and indexing. However, it’s crucial not to jump on the hype train here and start launching websites that “pass the Chrome 41 test.”

Even if Googlebot can “see” our website, there are many other factors that will affect your site’s crawling efficiency. As an example, we already have proof showing that Googlebot can crawl and index JavaScript and many JavaScript frameworks. It doesn’t mean that JavaScript is great for SEO. I gathered significant evidence showing that JavaScript pages aren’t crawled even half as effectively as HTML-based pages.

In summary

Ilya Grigorik’s tweet sheds more light on how Google crawls pages and, thanks to that, we don’t have to build experiments for every feature we’re testing — we can use Chrome 41 for debugging instead. This simple step will definitely save a lot of websites from indexing problems, like the case where one site’s JavaScript SEO backfired.

It’s safe to assume that Chrome 41 will now be a part of every SEO’s toolset.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Does Googlebot Support HTTP/2? Challenging Google’s Indexing Claims – An Experiment

Posted by goralewicz

I was recently challenged with a question from a client, Robert, who runs a small PR firm and needed to optimize a client’s website. His question inspired me to run a small experiment in HTTP protocols. So what was Robert’s question? He asked…

Can Googlebot crawl using HTTP/2 protocols?

You may be asking yourself, why should I care about Robert and his HTTP protocols?

As a refresher, HTTP protocols are the basic set of standards allowing the World Wide Web to exchange information. They are the reason a web browser can display data stored on another server. The first version dates back to 1989, which means that, just like everything else, HTTP protocols get outdated. HTTP/2 is the latest version of the protocol, created to replace the aging HTTP/1.x.

So, back to our question: why do you, as an SEO, care to know more about HTTP protocols? The short answer is that none of your SEO efforts matter or can even be done without a basic understanding of HTTP protocol. Robert knew that if his site wasn’t indexing correctly, his client would miss out on valuable web traffic from searches.

The hype around HTTP/2

HTTP/1.1 is a 17-year-old protocol (HTTP 1.0 is 21 years old). Both HTTP 1.0 and 1.1 have limitations, mostly related to performance. When HTTP/1.1 was getting too slow and out of date, Google introduced SPDY in 2009, which was the basis for HTTP/2. Side note: Starting from Chrome 53, Google decided to stop supporting SPDY in favor of HTTP/2.

HTTP/2 was a long-awaited protocol. Its main goal is to improve a website’s performance. It’s currently used by 17% of websites (as of September 2017). Adoption rate is growing rapidly, as only 10% of websites were using HTTP/2 in January 2017. You can see the adoption rate charts here. HTTP/2 is getting more and more popular, and is widely supported by modern browsers (like Chrome or Firefox) and web servers (including Apache, Nginx, and IIS).

Its key advantages are:

  • Multiplexing: The ability to send multiple requests through a single TCP connection.
  • Server push: When a client requests some resource (let’s say, an HTML document), a server can push CSS and JS files to the client’s cache. This reduces network latency and round-trips.
  • One connection per origin: With HTTP/2, only one connection is needed to load the website.
  • Stream prioritization: Requests (streams) are assigned a priority from 1 to 256 to deliver higher-priority resources faster.
  • Binary framing layer: HTTP/2 is easier to parse (for both servers and clients).
  • Header compression: This feature reduces overhead from plain text in HTTP/1.1 and improves performance.
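
To make the multiplexing advantage above concrete, here is a toy back-of-the-envelope model in Python. It is a sketch, not a benchmark: the connection count, resource count, and round-trip time are illustrative assumptions, and real page loads also depend on bandwidth, TLS setup, and server behavior.

```python
import math

RTT_MS = 50        # assumed round-trip time to the server
RESOURCES = 20     # assumed number of resources on the page

# HTTP/1.1: browsers typically open about 6 parallel connections
# per origin, and each connection fetches its resources one at a time.
http1_rounds = math.ceil(RESOURCES / 6)   # 4 sequential rounds
http1_ms = http1_rounds * RTT_MS

# HTTP/2: all requests are multiplexed onto a single connection,
# so (ignoring bandwidth) they complete in roughly one round-trip.
http2_ms = 1 * RTT_MS

print(f"HTTP/1.1: ~{http1_ms} ms, HTTP/2: ~{http2_ms} ms")
```

Even this crude model shows why fewer sequential round-trips translate into faster loads.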

For more information, I highly recommend reading “Introduction to HTTP/2” by Surma and Ilya Grigorik.

All these benefits suggest pushing for HTTP/2 support as soon as possible. However, my experience with technical SEO has taught me to double-check and experiment with solutions that might affect our SEO efforts.

So the question is: Does Googlebot support HTTP/2?

Google’s promises

HTTP/2 represents a promised land, the technical SEO oasis everyone was searching for. By now, many websites have already added HTTP/2 support, and developers don’t want to optimize for HTTP/1.1 anymore. Before I could answer Robert’s question, I needed to know whether or not Googlebot supported HTTP/2-only crawling.

I was not alone in my query. This is a topic which comes up often on Twitter, Google Hangouts, and other such forums. And like Robert, I had clients pressing me for answers. The experiment needed to happen. Below I’ll lay out exactly how we arrived at our answer, but here’s the spoiler: it doesn’t. Google doesn’t crawl using the HTTP/2 protocol. If your website uses HTTP/2, you need to make sure you continue to optimize the HTTP/1.1 version for crawling purposes.

The question

It all started with a Google Hangout in November 2015.

When asked about HTTP/2 support, John Mueller mentioned that HTTP/2-only crawling should be ready by early 2016, and he also mentioned that HTTP/2 would make it easier for Googlebot to crawl pages by bundling requests (images, JS, and CSS could be downloaded with a single bundled request).

“At the moment, Google doesn’t support HTTP/2-only crawling (…) We are working on that, I suspect it will be ready by the end of this year (2015) or early next year (2016) (…) One of the big advantages of HTTP/2 is that you can bundle requests, so if you are looking at a page and it has a bunch of embedded images, CSS, JavaScript files, theoretically you can make one request for all of those files and get everything together. So that would make it a little bit easier to crawl pages while we are rendering them for example.”

Soon after, Twitter user Kai Spriestersbach also asked about HTTP/2 support:

His clients started dropping HTTP/1.1 connection optimization, just like most developers deploying HTTP/2, which was at the time supported by all major browsers.

After a few quiet months, Google Webmasters reignited the conversation, tweeting that Google won’t hold you back if you’re setting up for HTTP/2. At this time, however, we still had no definitive word on HTTP/2-only crawling. Just because it won’t hold you back doesn’t mean it can handle it — which is why I decided to test the hypothesis.

The experiment

For months as I was following this online debate, I still received questions from our clients who no longer wanted to spend money on HTTP/1.1 optimization. Thus, I decided to create a very simple (and bold) experiment.

I decided to disable HTTP/1.1 on my own website and make it HTTP/2 only. I disabled HTTP/1.1 from March 7th until March 13th.

If you’re going to get bad news, at the very least it should come quickly. I didn’t have to wait long to see if my experiment “took.” Very shortly after disabling HTTP/1.1, I couldn’t fetch and render my website in Google Search Console; I was getting an error every time.

My website is fairly small, but I could clearly see that the crawling stats decreased after disabling HTTP/1.1. Google was no longer visiting my site.

While I could have kept going, I stopped the experiment after my website was partially de-indexed due to “Access Denied” errors.

The results

I didn’t need any more information; the proof was right there. Googlebot wasn’t supporting HTTP/2-only crawling. Should you choose to duplicate this at home with your own site, you’ll be happy to know that my site recovered very quickly.

I finally had Robert’s answer, but felt others may benefit from it as well. A few weeks after finishing my experiment, I decided to ask John about HTTP/2 crawling on Twitter and see what he had to say.

(I love that he responds.)

Knowing the results of my experiment, I have to agree with John: disabling HTTP/1 was a bad idea. However, I was seeing other developers discontinuing optimization for HTTP/1, which is why I wanted to test HTTP/2 on its own.

For those looking to run their own experiment, there are two ways of negotiating an HTTP/2 connection:

1. Over HTTP (unsecure) – Make an HTTP/1.1 request that includes an Upgrade header. This seems to be the method to which John Mueller was referring. However, it doesn’t apply to my website (because it’s served via HTTPS). What is more, this is an old-fashioned way of negotiating, not supported by modern browsers. Below is a screenshot from

2. Over HTTPS (secure) – Connection is negotiated via the ALPN protocol (HTTP/1.1 is not involved in this process). This method is preferred and widely supported by modern browsers and servers.
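
For those curious what ALPN negotiation looks like in practice, Python’s standard ssl module can reveal which protocol a server selects. This is a sketch, not a definitive probe: the hostname in the usage comment is just an example, and the call needs network access, so it is shown as a comment rather than executed.

```python
import socket
import ssl

def negotiated_protocol(host: str, port: int = 443) -> str:
    """Open a TLS connection offering h2 and http/1.1 via ALPN and
    return whichever protocol the server selects."""
    ctx = ssl.create_default_context()
    ctx.set_alpn_protocols(["h2", "http/1.1"])
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.selected_alpn_protocol() or "no ALPN"

# Example usage (requires network access):
#   negotiated_protocol("www.google.com")  -> "h2" on an HTTP/2-capable server
```

Note that this checks what a server offers to ALPN-capable clients; it tells you nothing about what Googlebot itself negotiates.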

A recent announcement: The saga continues

Googlebot doesn’t make HTTP/2 requests

Fortunately, Ilya Grigorik, a web performance engineer at Google, let everyone peek behind the curtains at how Googlebot is crawling websites and the technology behind it:

If that wasn’t enough, Googlebot doesn’t support the WebSocket protocol. That means your server can’t send resources to Googlebot before they are requested. Supporting it wouldn’t reduce network latency and round-trips; it would simply slow everything down. Modern browsers offer many ways of loading content, including WebRTC, WebSockets, loading local content from drive, etc. However, Googlebot supports only HTTP/FTP, with or without Transport Layer Security (TLS).

Googlebot supports SPDY

During my research and after John Mueller’s feedback, I decided to consult an HTTP/2 expert. I contacted Peter Nikolow of Mobilio and asked him to see if there was anything we could do to find the final answer regarding Googlebot’s HTTP/2 support. Not only did he provide us with help, Peter even created an experiment for us to use. Its results are pretty straightforward: Googlebot supports the SPDY protocol and Next Protocol Negotiation (NPN), and thus it can’t support HTTP/2.

Below is Peter’s response:

I performed an experiment that shows Googlebot uses the SPDY protocol. Because it supports SPDY + NPN, it cannot support HTTP/2. There are many cons to continued support of SPDY:

    1. This protocol is vulnerable
    2. Google Chrome no longer supports SPDY in favor of HTTP/2
    3. Servers have been neglecting to support SPDY. Let’s examine the NGINX example: from version 1.9.5, they no longer support SPDY.
    4. Apache doesn’t support SPDY out of the box. You need to install mod_spdy, which is provided by Google.

    To examine Googlebot and the protocols it uses, I took advantage of s_server, a tool that can debug TLS connections. I used Google Search Console Fetch and Render to send Googlebot to my website.

    Here’s a screenshot from this tool showing that Googlebot is using Next Protocol Negotiation (and therefore SPDY):

    I’ll briefly explain how you can perform your own test. The first thing you should know is that you can’t use scripting languages (like PHP or Python) for debugging TLS handshakes. The reason for that is simple: these languages see HTTP-level data only. Instead, you should use special tools for debugging TLS handshakes, such as s_server.

    Type in the console:

    sudo openssl s_server -key key.pem -cert cert.pem -accept 443 -WWW -tlsextdebug -state -msg
    sudo openssl s_server -key key.pem -cert cert.pem -accept 443 -www -tlsextdebug -state -msg

    Please note the slight (but significant) difference between the “-WWW” and “-www” options in these commands. You can find more about their purpose in the s_server documentation.
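    The s_server commands above assume you already have a key and certificate on hand. If you don’t, a throwaway self-signed pair can be generated with openssl; this is a quick sketch for local testing only, with filenames chosen to match the commands above:

```shell
# Generate a throwaway self-signed certificate so s_server has something
# to serve. The key.pem / cert.pem filenames match the s_server commands
# above. -nodes leaves the private key unencrypted, which is acceptable
# for a short-lived debugging session (never for production).
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout key.pem -out cert.pem \
  -days 1 -subj "/CN=localhost"

# Sanity-check the certificate that was just written.
openssl x509 -in cert.pem -noout -subject
```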

    Next, invite Googlebot to visit your site by entering the URL in Google Search Console Fetch and Render or in the Google mobile tester.

    As I wrote above, there is no logical reason why Googlebot still supports SPDY. The protocol is vulnerable, no modern browser supports it, and server software (including NGINX) has dropped support for it. It’s just a matter of time until Googlebot is able to crawl using HTTP/2. Implement HTTP/1.1 + HTTP/2 support on your own server now (your users will notice the faster loading) and wait until Google is able to send requests over HTTP/2.
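    As a sketch of that recommendation: in NGINX 1.9.5 or later built with the ngx_http_v2_module, serving HTTP/2 alongside HTTP/1.1 is a one-word change to the listen directive. The server name and certificate paths below are placeholders, not real files:

```nginx
server {
    # "http2" on the listen directive enables HTTP/2 for clients that
    # negotiate it via ALPN during the TLS handshake; clients that don't
    # (Googlebot, for now) still get HTTP/1.1 over the same listener.
    listen 443 ssl http2;
    server_name example.com;

    # Placeholder paths -- point these at your real certificate and key.
    ssl_certificate     /etc/ssl/certs/example.com.pem;
    ssl_certificate_key /etc/ssl/private/example.com.key;
}
```

    Because protocol selection happens during the TLS handshake, one port serves both kinds of clients and no separate HTTP/1.1 configuration is needed.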


    In November 2015, John Mueller said he expected Googlebot to crawl websites by sending HTTP/2 requests starting in early 2016. As of October 2017, that hasn’t happened yet, and we don’t know why.

    What we do know is that Googlebot doesn’t support HTTP/2. It still crawls by sending HTTP/1.1 requests. Both this experiment and the “Rendering on Google Search” page confirm it. (If you’d like to know more about the technology behind Googlebot, check out what they recently shared.)

    For now, it seems we have to accept the status quo. We recommended that Robert (and you, readers, as well) enable HTTP/2 on your websites for better performance, but continue optimizing for HTTP/1.1. Your visitors will notice and thank you.


    What’s Better for B2B Marketers, Ebooks or White Papers?


    Content marketers rely on high-quality content to generate leads. In fact, last year 85 percent of B2B marketers said lead generation was their most important content marketing goal, according to the Content Marketing Institute.

    But it’s not enough to just slap any old brain fart behind your landing page. What kind of content is best to hit your lead gen goals? It’s whatever motivates your audience to volunteer their name, email address, and/or other information. It needs to be something substantial, with information they can’t find anywhere else.

    Sound familiar? If you said white paper or ebook, you’re right! According to ImpactBND, 80 percent of users said they’d provide their email for a white paper or ebook. Consumers are even more likely to share their information in exchange for white papers (76 percent) than ebooks (63 percent). White papers and ebooks, however, are not the same thing. What’s the difference?

    Think of ebooks as the cooler, younger sibling of the white paper. A white paper is typically an in-depth look at a more narrowly defined topic than an ebook. An ebook is more conversational in tone, less scholarly, and may present an overall look at an issue, trend, or industry, rather than a deep dive into a particular problem or solution. (Check out The Ultimate White Paper Template [Free Download] for the definitive guide to writing a white paper.)

    Like a white paper, an ebook addresses a hot topic. But while a white paper generally presents original research or findings, an ebook can include original content, collect or mine product reviews, or curate content that has appeared in other formats. For example, a common form of ebook repurposes blog posts and adds additional related information from industry experts and thought leaders.

    Here’s a quick summary of the differences between ebooks and white papers. It comes from Content Rules, the book Ann Handley, MarketingProfs Chief Content Officer, co-authored with C.C. Chapman.

    | Ebooks | White Papers |
    | --- | --- |
    | Broken into smaller chunks—designed for skimming and scanning | Long and linear—a deep read |
    | Concept-centric—based on ideas and trends of interest | Data-centric—often based on formal research |
    | Visually heavy—main text supplemented with call-outs, bullet lists | Text-heavy |
    | Casual and collegial—a conversation among equals | Formal—impressive expert speaks to you |

    Benefits of Ebooks and White Papers

    Both ebooks and white papers offer many potential benefits. They are typically more time-consuming and expensive to produce than blog posts. But whether your goal is to generate leads, increase brand awareness, or educate customers, gate this valuable content so that you can collect visitors’ information.

    Here are four reasons to add ebooks or white papers to your content strategy.

    1. Generate Leads
      Ebooks and white papers are two of the best ways to generate leads, especially for B2B marketers. B2B customers now have access to many sources of information before making a purchase, and they make use of them. The sales cycle can be long, so you need timely, relevant, and useful assets that motivate readers to download a copy and return to your site for more.
    2. Build Thought Leadership
      An ebook or white paper with exclusive content can define a category or present unique information or reflections on an industry. This builds thought leadership. Many experts use ebooks as calling cards to generate speaking engagements and appearances at industry conferences. Even if it’s self-published, an ebook burnishes your reputation.
    3. Enhance Customer Education
      Ebooks and white papers help educate customers so they can decide if your product or service is right for them. They can also help customers use your product or service, and get the most out of their investment.
    4. Share Expertise with Your Target Audience
      An ebook or white paper can provide detailed insight into an area of expertise that connects your company with a particular audience. Tailor the contents to their specific needs by addressing the pain points of a particular persona. This helps engage customers and turn them into brand ambassadors.

    How to Choose Between an Ebook and White Paper

    Determining whether you should create an ebook or white paper is actually quite simple. Start by answering these three questions:

    1. Which would your audience prefer? Are they looking for information that’s easy to absorb and quick to implement? Or do they want a detailed dive into a niche area of the topic you’re covering? The more specific your audience’s interest, the better a white paper will meet their needs.
    2. Which is better for this type of content? Is your topic a high-level overview? Can it be broken down into actionable steps? Or does it require supporting details and deeper explanation? Information that can be presented in scannable chunks is better suited to an ebook.
    3. What time and resources do you have to create? White papers can require a significant amount of original and third-party research. That’s not to mention lead time for gathering quotes, creating graphs, and organizing visual depictions of data. If you don’t have the time to devote to a white paper, then an ebook will be a better bet.

    Review the differences between these content types, weigh their benefits, and select the format that’s best for you. When you’re ready to start building, check out Curata’s white paper template or ebook template to help you get started.