UltimateSEO.center is the Ultimate Center of SEO. Search this search engine to get results only from Ultimate SEO's trusted sites.
Ultimate SEO World
Today's Google Doodle honors Stamen Grigorov, the Bulgarian physician and microbiologist who discovered the bacterium Lactobacillus bulgaricus, the reason we have yogurt today. Grigorov was born 142 years ago today, on October 27, 1878, in Bulgaria, and died at the age of 67 on October 27, 1945.
The end-of-year holiday season is a peak time for many merchants, with special sales events such as Black Friday and Cyber Monday. As a merchant, you can help Google highlight your sales events by providing landing pages with relevant content and high-quality images.
Best practices
The following are recommended best practices for your landing pages:
- Create the page early. Make sure you create the page well before the sale so Googlebot has time to discover and index it, and make sure you are not blocking Google from crawling the URL (the Google URL Inspection Tool can be used to check this).
- Follow standard SEO best practices. A list of SEO best practices for landing pages can be found in our Search Engine Optimization (SEO) Starter Guide.
- Link to the landing pages from your home page (or similar) to increase their prominence, helping users (and Google) find…
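As a quick supplementary check alongside the URL Inspection Tool, you can verify that a landing page is not blocked by robots.txt using only Python's standard library. This is a hedged sketch: the rules and URLs below are invented for illustration, not taken from any real site.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for an example store
rules = [
    "User-agent: *",
    "Disallow: /checkout/",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(rules)  # in practice: parser.set_url("https://example.com/robots.txt"); parser.read()

# Googlebot falls under "User-agent: *" here, so the sale page is crawlable
print(parser.can_fetch("Googlebot", "https://example.com/black-friday-sale"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/checkout/cart"))      # False
```

This only approximates Googlebot's behavior; the URL Inspection Tool remains the authoritative check.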
Posted by Dr-Pete
In a study of 2.1M searches and 766K videos, YouTube accounted for 94% of all video carousel results on page one of Google, leaving little room for competition.
Even the most casual video aficionado knows YouTube (acquired by Google in 2006). As a Google search user, you may even feel like you encounter more YouTube videos than videos from other sources, but does the data back this up?
A Wall Street Journal article in June 2020 measured a strong advantage of YouTube in Google search results, but that article focused on 98 hand-selected videos to compare YouTube to other platforms.
Using a set of over two million Google.com (US) desktop searches captured in early October 2020, we were able to extract more than 250,000 results with video carousels on page one. Most organic video results in 2020 appear in a carousel, like this one:
This carousel appeared on a search for "How to be an investor" (Step 1: Find a bag of money). Notice the arrow on the far right: currently, searchers can scroll through up to ten videos. While our research tracked all ten positions, most of this report will focus on the three visible positions.
How dominant is YouTube?
Anecdotally, we see YouTube pop up a lot in Google results, but how dominant are they in the visible three video carousel results across our data set? Here's a breakdown:
YouTube's presence across the first three video slots was remarkably consistent, at (1) 94.1%, (2) 94.2%, and (3) 94.2%. Khan Academy and Facebook took the #2 and #3 rankings for each carousel slot, with Facebook gaining share in later slots.
Obviously, this is a massive drop from the first to second largest share, and YouTube's presence only varied from 94.1% to 95.1% across all ten slots. Across all visible videos in the carousel, here are the top ten sites in our data set:
- YouTube (94.2%)
- Khan Academy (1.5%)
- Facebook (1.4%)
- Microsoft (0.4%)
- Vimeo (0.1%)
- Twitter (0.1%)
- Dailymotion (<0.1%)
- CNBC (<0.1%)
- CNN (<0.1%)
- ESPN (<0.1%)
Note that, due to technical limitations with how search spiders work, many Facebook and Twitter videos require a login and are unavailable to Google. That said, the #2 to #10 biggest players in the video carousel, including some massive brands with deep pockets for video content, add up to only 3.7% of visible videos.
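For readers curious how a share breakdown like this is computed, here is a minimal sketch: given rows of (query, slot, domain) extracted from carousels, it tallies each domain's share of visible slots. The sample rows are invented, not drawn from the study's data.

```python
from collections import Counter

# Invented sample of visible carousel results: (query, slot, domain)
results = [
    ("how to invest", 1, "youtube.com"),
    ("how to invest", 2, "youtube.com"),
    ("how to invest", 3, "khanacademy.org"),
    ("bread recipe", 1, "youtube.com"),
    ("bread recipe", 2, "facebook.com"),
    ("bread recipe", 3, "youtube.com"),
]

# Tally how often each domain appears across all visible slots
counts = Counter(domain for _, _, domain in results)
total = sum(counts.values())
shares = {domain: count / total for domain, count in counts.items()}

for domain, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{domain}: {share:.1%}")
```

The real study applies the same idea to 2.1M results, and can also group by slot number to get per-position shares.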
How about how-to?
Pardon my grammar, but "How to…?" questions have become a hot spot for video results, and naturally lend themselves to niche players like HGTV. Here's a video carousel from a search for "how to organize a pantry":
It looks promising on the surface, but does this niche show more diversity of websites at scale? Our data set included just over 45,000 "How to…" searches with video carousels. Here's the breakdown of the top three sites for each slot:
In our data set, YouTube is even more dominant in the how-to niche, taking 97% to 98% of each of the three visible slots. Khan Academy came in second, and Microsoft (specifically, the Microsoft support site) rounded out the third position (but at <1% in all three slots).
Is this just a fluke?
Most of this analysis was based on a snapshot of data in early October. Given that Google frequently makes changes and runs thousands of tests per year, could we have just picked a particularly unusual day? To answer that, we pulled YouTube's prevalence across all videos in the carousel on the first day of each month of 2020:
YouTube's dominance was fairly steady across 2020, ranging from 92.0% to 95.3% in our data set (and actually increasing a bit since January). Clearly, this is neither a temporary nor a particularly recent condition.
Another challenge in studying Google results, even with large data sets, is the possibility of sampling bias. There is no truly "random" sample of search results (more on that in Appendix A), but we're lucky enough to have a second data set with a long history. While this data set is only 10,000 keywords, it was specifically designed to evenly represent the industry categories in Google Ads. On October 9, we were able to capture 2,390 video carousels from this data set. Here's how they measured up:
The top three sites in each of the carousel slots were identical to the 2M-keyword data set, and YouTubeâs dominance was even higher (up from 94% to 96%). We have every confidence that the prevalence of YouTube results measured in this study is not a fluke of a single day or a single data set.
How level is the field?
Does YouTube have an unfair advantage? "Fair" is a difficult concept to quantify, so let's explore Google's perspective.
Google's first argument would probably be that YouTube has the lion's share of video results because they host the lion's share of videos. Unfortunately, it's hard to get reliable numbers across the entire world of video hosting, especially for social platforms. YouTube is undoubtedly a massive player and likely hosts the majority of non-social, public videos in the United States, but 94% seems like a big share even for the lion.
The larger problem is that this dominance becomes self-perpetuating. Over the past few years, more major companies have hosted videos on YouTube and created YouTube channels because it's easier to get results in Google search than hosting on smaller platforms or their own site.
Google's more technical argument is that the video search algorithm has no inherent preference for YouTube. As a search marketer, I've learned to view this argument narrowly. There's probably not a line of code in the algorithm that says something like:
IF site = "YouTube" THEN ranking = 1
Defined narrowly, I believe that Google is telling the truth. However, there's no escaping the fact that Google and YouTube share a common backbone and many of the same internal organs, which provides advantages that may be insurmountable.
For example, Google's video algorithm might reward speed. This makes sense: a slow-loading video is a bad customer experience and makes Google look bad. Naturally, Google's direct ownership of YouTube means that their access to YouTube data is lightning fast. Realistically, how can a competitor, even with billions in investment, produce an experience that's faster than a direct pipeline to Google? Likewise, YouTube's data structure is naturally going to be optimized for Google to easily process and digest, relying on inside knowledge that might not be equally available to all players.
For now, from a marketing perspective, we're left with little choice but to cover our bases and take the advantage YouTube seems to offer. There's no reason we should expect YouTube's numbers to decrease, and every reason to expect YouTube's dominance to grow, at least without a paradigm-shifting disruption to the industry.
Many thanks to Eric H. and Michael G. on our Vancouver team for sharing their knowledge about the data set and how to interpret it, and to Eric and Rob L. for trusting me with Athena access to a treasure trove of data.
Appendix A: Data and methodology
The bulk of the data for this study was collected in early October 2020 from a set of just over two million Google.com, US-based, desktop search results. After minor de-duplication and clean-up, this data set yielded 258K searches with video carousels on page one. These carousels accounted for 2.1 million total video results/URLs and 767K visible results (Google displays up to three per carousel, without scrolling).
The how-to analysis was based on a smaller data set of 45K keywords that explicitly began with the words "how to". Neither data set is a randomly selected sample, and both may be biased toward certain industries or verticals.
The follow-up 10K data set was constructed specifically as a research data set and is evenly distributed across 20 major industry categories in Google Ads. This data set was specifically designed to represent a wide range of competitive terms.
Why don't we use true random sampling? Outside of textbooks, a truly random sample is theoretically possible but rarely achieved. Selecting a random sample of adults in the United States, for example, is incredibly difficult (as soon as you pick up the phone or send out an email, you've introduced bias), but at least we know that, at any particular moment, the population of adults in the United States is a finite set of individual people.
The same isn't true of Google searches. Searches are not a finite set, but a cloud of words being conjured out of the void by searchers every millisecond. According to Google themselves: "There are trillions of searches on Google every year. In fact, 15 percent of searches we see every day are new." The population of searches is not only in the trillions, but changing every minute.
Ultimately, we rely on large data sets, where possible, try to understand the flaws in any given data set, and replicate our work across multiple data sets. This study was replicated against two very different data sets, as well as a third set created by a thematic slice of the first set, and validated against multiple dates in 2020.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
After we noticed changes in Google since September 23rd and called out canonical issues a few days later, Google has now confirmed the indexing bugs that were causing pages to drop out of the index. Google said the issue was two-fold: one bug involved canonicalization, and the other mobile indexing.
Posted by rjonesx.
One of the most difficult decisions to make in any field is to consciously choose to miss a deadline. Over the last several months, a team of some of the brightest engineers, data scientists, project managers, editors, and marketers have worked toward a release date of the new Page Authority (PA) on September 30, 2020. The new model is superior to the current PA in nearly every way, but our last quality control measure revealed an anomaly that we could not ignore.
As a result, we've made the tough decision to delay the launch of Page Authority 2.0. So, let me take a moment to retrace our steps as to how we got here, where that leaves us, and how we intend to proceed.
Seeing an old problem with fresh eyes
Historically, Moz has used the same method over and over again to build a Page Authority model (as well as Domain Authority). This model's advantage was its simplicity, but it left much to be desired.
Previous Page Authority models trained against SERPs, trying to predict whether one URL would rank over another, based on a set of link metrics calculated from the Link Explorer backlink index. A key issue with this type of model was that it couldn't meaningfully address the maximum strength of a particular set of link metrics.
For example, imagine the most powerful URLs on the Internet in terms of links: the homepages of Google, YouTube, and Facebook, or the share URLs of followed social network buttons. There are no SERPs that pit these URLs against one another. Instead, these extremely powerful URLs often rank #1 followed by pages with dramatically lower metrics. Imagine if Michael Jordan, Kobe Bryant, and LeBron James each scrimmaged one-on-one against high school players. Each would win every time. But we would have great difficulty extrapolating from those results whether Michael Jordan, Kobe Bryant, or LeBron James would win in one-on-one contests against each other.
When tasked with revisiting Domain Authority, we ultimately chose a model with which we had a great deal of experience: the original SERPs training method (although with a number of tweaks). With Page Authority, we decided to go with a different training method altogether, predicting which page would have more total organic traffic. This model presented several promising qualities, like being able to compare URLs that don't occur on the same SERP, but also presented other difficulties, like a page having high link equity but simply being in an infrequently-searched topic area. We addressed many of these concerns, such as by enhancing the training set to account for competitiveness using a non-link metric.
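To make the SERP-training idea concrete, here is a rough sketch (not Moz's actual code) of how pairwise training examples might be built from an observed ranking: for each pair of URLs on a SERP, the difference in their link metrics is labeled by which one ranked higher. The URLs and metric names are invented for illustration.

```python
from itertools import combinations

# Invented link metrics per URL on a single SERP, listed in ranking order
serp = [
    {"url": "a.com/1", "root_domains": 120, "total_links": 900},
    {"url": "b.com/2", "root_domains": 45,  "total_links": 300},
    {"url": "c.com/3", "root_domains": 10,  "total_links": 80},
]

def pairwise_examples(serp):
    """Yield (feature_diff, label) pairs; label 1 means the first URL ranked higher."""
    examples = []
    for higher, lower in combinations(serp, 2):  # serp is already in rank order
        diff = {
            "root_domains": higher["root_domains"] - lower["root_domains"],
            "total_links": higher["total_links"] - lower["total_links"],
        }
        examples.append((diff, 1))
        # Mirror the example so the model also sees the reversed ordering (label 0)
        mirrored = {k: -v for k, v in diff.items()}
        examples.append((mirrored, 0))
    return examples

examples = pairwise_examples(serp)
print(len(examples))  # 3 pairs x 2 orientations = 6 examples
```

As the basketball analogy suggests, a model trained only on such pairs never sees the strongest pages compared against each other, which is the limitation described above.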
Measuring the quality of the new Page Authority
The results were â and are â very promising.
First, the new model better predicted the likelihood that one page would have more valuable organic traffic than another. This was expected, because the new model was directed at this particular goal, while the current Page Authority merely attempted to predict whether one page would rank over another.
Second, we found that the new model predicted whether one page would rank over another better than the previous Page Authority. This was especially pleasing, as it laid to rest many of our concerns that the new model would underperform on old quality controls due to the new training model.
How much better is the new model at predicting SERPs than the current PA? At every interval, all the way down to position 4 vs. 5, the new model tied or outperformed the current model. It never lost.
Everything was looking great. We then started analyzing outliers. I like to call this the "does anything look stupid?" test. Machine learning makes mistakes, just as humans can, but humans tend to make mistakes in a very particular manner. When a human makes a mistake, we often understand exactly why the mistake was made. This isn't the case for ML, especially neural nets. We pulled URLs with high Page Authorities under the new model that happened to have zero organic traffic, and included them in the training set so the model could learn from those errors. We quickly saw bizarre 90+ PAs drop down to much more reasonable 60s and 70s… another win.
We were down to one last test.
The problem with branded search
Some of the most popular keywords on the web are navigational. People search Google for Facebook, Youtube, and even Google itself. These keywords are searched an astronomical number of times relative to other keywords. Subsequently, a handful of highly powerful brands can have an enormous impact on a model that looks at total search volume as part of its core training target.
The last test involves comparing the current Page Authority to the new Page Authority, in order to determine if there are any bizarre outliers (where PA shifted dramatically and without obvious reason). First, let's look at a simple comparison of the log of Linking Root Domains against the Page Authority.
Not too shabby. We see a generally positive correlation between Linking Root Domains and Page Authority. But can you spot the oddities? Go ahead and take a minute…
There are two anomalies that stand out in this chart:
- There is a curious gap separating the main distribution of URLs and the outliers above and below.
- The largest variance for a single score is at PA 99. There are an awful lot of PA 99s with a wide range of Linking Root Domains.
Here is a visualization that will help draw out these anomalies:
The gray spaces between the green and red represent this odd gap between the bulk of the distribution and the outliers. The outliers (in red) tend to clump together, especially above the main distribution. And, of course, we can see the poor distribution at the top of PA 99s.
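The "largest variance at PA 99" observation can be checked mechanically: group URLs by PA score and compare the spread of log(Linking Root Domains) within each score. A toy version with invented numbers:

```python
import math
from collections import defaultdict
from statistics import pstdev

# Invented (page_authority, linking_root_domains) pairs
pages = [(60, 400), (60, 520), (60, 450), (99, 30), (99, 5000), (99, 90000)]

# Collect log10(LRD) values per PA score
by_score = defaultdict(list)
for pa, lrd in pages:
    by_score[pa].append(math.log10(lrd))

# Population standard deviation measures the spread within each score
spread = {pa: pstdev(values) for pa, values in by_score.items()}
worst = max(spread, key=spread.get)
print(worst)  # the PA score with the widest LRD spread (here, 99)
```

On real data, an unusually wide spread at a single score is exactly the kind of anomaly that flags a score as untrustworthy.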
Bear in mind that these issues are not sufficient to make the new Page Authority model less accurate than the current model. However, upon further examination, we found that the errors the model did produce were significant enough that they could adversely influence the decisions of our customers. It's better to have a model that is off by a little everywhere (because the adjustments SEOs make are not incredibly fine-tuned) than it is to have a model that is right almost everywhere but bizarrely wrong in a limited number of cases.
Luckily, we're fairly confident about what the problem is. It seems that homepage PAs are disproportionately inflated, and that the likely culprit is the training set. We can't be certain this is the cause until we complete retraining, but it is a strong lead.
The good news and the bad news
We are in good shape insofar as we have multiple candidate models that outperform the existing Page Authority. We're at the point of bug squashing, not model building. However, we are not going to ship a new score until we are confident that it will steer our customers in the right direction. We are keenly conscious of the decisions our customers make based on our metrics, not just whether the metrics meet some statistical criteria.
Given all of this, we have decided to delay the launch of Page Authority 2.0. This will give us the necessary time to address these primary concerns and produce a stellar metric. Frustrating? Yes, but also necessary.
As always, we thank you for your patience, and we look forward to producing the best Page Authority metric we have ever released.
Posted by MiriamEllis
Image credit: DoSchu
"No place like home for the holidays." This will be the refrain for the majority of your customers as we reach 2020's peak shopping season. I can't think of another year in which it's been more important for local businesses to plan and implement a seasonal marketing strategy extra early, to connect up with customers who will be traveling less and seeking ways to celebrate at home.
Recently, it's become trendy in multiple countries to try to capture the old Danish spirit of hygge, which the OED defines as: "A quality of coziness and comfortable conviviality that engenders a feeling of contentment or well-being."
While this sometimes-elusive state of being isn't something you can buy direct from a store, and while some shoppers are still unfamiliar with hygge by name, many will be trying to create it at home this year. Denmark buys more candles than any other nation, and across Scandinavia, fondness for flowers, warming foods, cozy drinks, and time with loved ones characterizes the work of weaving a gentle web of happiness into even the darkest of winters.
Whatever your business can offer to support local shoppers' aspirations for a safe, comfortable, happy holiday season at home is commendable at the end of a very challenging 2020. I hope these eight local search marketing tips will help you make good connections that serve your customers, and your business, well into the new year.
1) Survey customers now and provide what they want
Reasonably-priced survey software is worth every penny in 2020. For as little as $20/month, your local business can understand exactly how much your customers' needs have changed this past year by surveying:
- Which products locals are having trouble locating
- Which products/services they most want for the holidays
- Which method of shopping/delivery would be most convenient for them
- Which hours of operation would be most helpful
- Which safety measures are must-haves for them to transact with a business
- Which payment methods are current top choices
Doubtless, you can think of many questions like these to help you glean the most possible insight into local needs. Poll your customer email/text database and keep your surveys on the short side to avoid abandonment.
Don't have the necessary tools to poll people at the ready? Check out Zapier's roundup of the 10 Best Online Survey Apps in 2020 and craft a concise survey geared to deliver insights into customers' wishes.
2) Put your companyâs whole heart into affinity
If I could gift every local business owner with a mantra to carry them through not just the 2020 holiday shopping season, but into 2021, it would be this:
It's not enough to have customers discover my brand; I need them to like my brand.
Chances are, you can call to mind some brands of which you're highly aware but would never shop with because they don't meet your personal or business standards in some way. You've discovered these brands, but you don't like them. In 2020, you may even have silently or overtly boycotted them.
On the opposite side of this scenario are the local brands you love. I can wax poetic about my local independent grocery store, stocking its shelves with sustainable products from local farmers, flying its Black Lives Matter and LGBTQ+ flags with pride from its storefront, and treating every customer like a cherished neighbor.
For many years, our SEO industry has put great effort into and emphasis on the discovery phase of the consumer journey, but my little country-town grocer has gone leaps and bounds beyond this by demonstrating affinity with the things my household cares about. The owners can consider us lifetime loyal customers for the ways they are going above-and-beyond in terms of empathy, diversity, and care for our community.
I vigorously encourage your business to put customer-brand affinity at the heart of its holiday strategy. Brainstorm how you can make meaningful changes that declare your company's commitment to being part of the work of positive social change.
3) Be as accessible and communicative as possible
Once you've accomplished the above two goals, open the lines of communication about what your brand offers and the people-friendly aspects of how you operate across as many of the following as possible:
- Local business listings
- Social channels
- Phone on-hold marketing
- Storefront and in-store signage
- Local news, radio, and TV media
In my 17 years as a local SEO, I can confidently say that local business listings have never been a greater potential asset than they will be this holiday season. Google My Business listings, in particular, are an interface that can answer almost any customer who-what-where-when-why, if your business is managing these properly, whether manually or via software like Moz Local.
Anywhere a customer might be looking for what you offer, be there with accurate and abundant information about identity, location, hours of operation, policies, culture, and offerings. From setting special hours for each of your locations, to embracing Google Posts to microblog holiday content, to ensuring your website and social profiles are publicizing your USP, make your biggest communications effort ever this year.
At the same time, be sure you're meeting Google's mobile-friendly standards, and that your website is ADA-compliant so that no customer is left out. Provide a fast, intuitive, and inclusive experience to keep customers engaged.
With the pandemic necessitating social distancing, make the Internet your workhorse for connecting up with and provisioning your community as much as you can.
4) Embrace local e-commerce and product listings
Digital Commerce 360 has done a good job charting the 30%+ rise in online sales in the first half of 2020, largely resulting from the pandemic. The same publication summarizes the collective 19% leap in traffic to North America's largest retailers. At the local business level, implementing even basic e-commerce functionality in advance of the holiday season could make a major difference, if you can offer the most-desired methods of delivery. These could include:
- Buy-online, pick up in-store (BOPIS)
- Buy-online, pick up curbside
- Buy online for postal delivery
- Buy online for direct home delivery by in-house or third-party drivers
Here's an extensive comparison of popular e-commerce solutions, including which ones have free trials, and the e-commerce column of the Moz blog is a free library of expert advice on optimizing digital sales.
Put your products everywhere you can. Don't forget that this past April, Google surprised everybody by offering free product listings, and that they also recently acquired the Pointy device, which lets you transform scanned barcodes into online inventory pages.
Additionally, in mid-September, Google took their next big product-related step by adding a "nearby" filter to Google Shopping, taking us closer and closer to the search engine becoming a source for real-time local inventory, as I've been predicting here in my column for several years.
Implement the public safety protocols that review research from GatherUp shows consumers are demanding, get your inventory onto the web, identify the most convenient ways to get purchases from your storefront into the customerâs hands, and your efforts could pave the way for increased Q4 profits.
5) Reinvent window shopping with QR codes
"How can I do what I want to do?" asked Jennifer Bolin, owner of Clover Toys in Seattle.
What she wanted to do was use her storefront window to sell merchandise to patrons who were no longer able to walk into her store. When a staff member mentioned that you could use a QR code generator like this one to load inventory onto pedestriansâ cell phones, she decided to give it a try.
Just a generation or two ago, many Americans cherished the tradition of going to town or heading downtown to enjoy the lavish holiday window displays crafted by local retailers. The mercantile goal of this form of entertainment was to entice passersby indoors for a shopping spree. It's time to bring this back in 2020, with the twist of labeling products with QR codes and pairing them with desirable methods of delivery, whether through a drive-up window, curbside, or delivery.
"We've even gotten late night sales," Bolin told me when I spoke with her after my colleague Rob Ousbey pointed out this charming and smart independent retail shop to me.
If your business locations are in good areas for foot traffic, think of how a 24/7 asset like an actionable, goodie-packed window display could boost your sales.
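If you go the QR-code route, it helps to encode tracking parameters into each product URL before generating the code, so window sales are attributable. A small sketch using only Python's standard library (the shop domain and parameter values are made up; any QR generator can then turn the URL into an image):

```python
from urllib.parse import urlencode, urlsplit, parse_qs

def window_display_url(product_path, campaign="window-display"):
    """Build a trackable product URL to encode in a QR code (hypothetical shop domain)."""
    params = urlencode({
        "utm_source": "storefront",
        "utm_medium": "qr",
        "utm_campaign": campaign,
    })
    return f"https://example-shop.test{product_path}?{params}"

url = window_display_url("/toys/wooden-train")
print(url)

# Analytics can later attribute the visit to the window display:
tracking = parse_qs(urlsplit(url).query)
print(tracking["utm_campaign"])  # ['window-display']
```

This way, a "late night sale" like Bolin's shows up in your analytics as coming from the storefront window rather than ordinary web traffic.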
6) Tie in with DIY, and consider kits
With so many customers housebound, anything your business can do to support activities and deliver supplies for domestic merrymaking is worth considering. Can your business tie in with decorating, baking, cooking, crafting, handmade gift-giving, home entertainment, or related themes? If so, create video tutorials, blog posts, GMB posts, social media tips, or other content to engage a local audience.
One complaint I am encountering frequently is that shoppers are tired of trying to piece together components from the internet for something they want to make or do. Unsurprisingly, many people are longing for the days when they could leisurely browse local businesses in person, taking inspiration from their hands-on interaction with merchandise. I think kits could offer a stopgap solution in some cases. If relevant to your business, consider bundling items that could provide everything a household needs to:
- Prepare a special holiday meal
- Bake treats
- Outfit a yard for winter play
- Trim a tree or decorate a home
- Build a fire
- Create a night of fun for children of various age groups
- Dress appropriately for warmth and safety, based on region
- Create a handmade gift, craft, or garment
- Winter prep a home or vehicle
- Create a complete home spa/health/beauty experience
- Plant a spring garden
Kits could be a welcome all-in-one resource for many shoppers. Determine whether your brand has the components to offer one.
7) Manage reviews meticulously
Free, near-real-time quality control data from your holiday efforts can most easily be found in your review profiles. Use software like Moz Local to keep a running tally of your incoming new reviews, or assign a staff member at each location of your business to monitor your local business profiles daily for any complaints or questions.
If you can quickly solve problems people cite in their reviews, your chances are good of retaining the customer and demonstrating responsiveness to all your profilesâ visitors. You may even find that reviews turn up additional, unmet local needs your formal survey missed. Acting quickly to fulfill these requests could win you additional business in Q4 and beyond.
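If you prefer a DIY supplement to listings software, the daily tally and complaint triage described above can be approximated in a few lines. The review records below are invented for illustration:

```python
from collections import Counter

# Invented incoming reviews: (location, rating, text)
reviews = [
    ("Main St", 5, "Great curbside pickup!"),
    ("Main St", 2, "Order was missing an item."),
    ("Elm Ave", 4, "Friendly staff."),
    ("Elm Ave", 1, "No one answered the phone."),
]

# Running tally of new reviews per location
new_per_location = Counter(location for location, _, _ in reviews)

# Low ratings get flagged for a same-day response
needs_response = [(loc, text) for loc, rating, text in reviews if rating <= 2]

print(new_per_location)
for loc, text in needs_response:
    print(f"FOLLOW UP ({loc}): {text}")
```

The real work, of course, is pulling fresh reviews from each platform's interface or API; the triage logic itself stays this simple.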
8) Highly publicize one extra reason to shop local this year
"72% of respondents...are likely or very likely to continue to shop at independent stores, either locally or online, above larger retailers such as Amazon." - Bazaarvoice
I highly recommend reading the entire survey of 12,000 global respondents by Bazaarvoice, quantifying how substantially shopping behaviors have changed in 2020. It's very good news for local business owners that so many customers want to keep transacting with nearby independents, but the Amazon dilemma remains.
Above, we discussed the fatigue that can result from trying to cobble together a bunch of different resources to check everything off a shopping list. This can drive people to online "everything stores", in the same way that department stores, supermarkets, and malls have historically drawn in shoppers with the promise of convenience.
A question every local brand should do their best to ask and answer in the run-up to the holidays is: What's to prevent my community from simply taking their whole holiday shopping list to Amazon, or Walmart, or Target this year?
My completely personal answer to this question is that I want my town's local business district, with its local flavor and diversity of shops, to still be there after a vaccine is hopefully developed for COVID-19. But that's just me. Inspiring your customers' allegiance to keeping your business going might be best supported by publicizing some of the following:
- The economic, societal, and mental health benefits proven to stem from the presence of small, local businesses in a community.
- Your philanthropic tie-ins, such as donating a percentage of sales to worthy local causes; there are so many ways to contribute this year.
- The historic role your business has played in making your community a good place to live, particularly if your brand is an older, well-established one. I hear nostalgia is a strong influencer in 2020, and old images of your community and company through the years could be engaging content.
- Any recent improvements you've made to ensure fast home delivery, whether by postal mail or via local drivers who can get gifts right to people's doors.
- Uplifting content that simply makes the day a bit brighter for a shopper. We're all looking for a little extra support these days to keep our spirits bright.
Be intentional about maximizing local publicity of your "extra reason" to shop with you. Your local newspaper is doubtless running a stream of commentary about the economic picture in your city, and if your special efforts are newsworthy, a few mentions could do you a lot of good.
Donât underestimate just how reliant people have become on the recommendations of friends, family, and online platforms for sourcing even the basics of life these days. In my own circle, everyone is now regularly telling everyone else where to find items from hand sanitizer to decent potatoes. Networking will be happening around gifts, too, so anything you get noticed for could support extensive word-of-mouth information sharing.
I want to close by thanking you for being in or marketing businesses that will help us all celebrate the many upcoming holidays in our own ways. Your efforts are appreciated, and I'm wishing you a peaceful, profitable, and hyggelig finish to 2020.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
Posted by SamuelMangialavori
Content, content, and more content! That's what SEO is all about nowadays, right? Compared to when I started working in SEO (2014), content is today consistently one of the most popular topics covered at digital marketing conferences, there are far more tools that focus on content analysis and optimization, and overall it seems to dominate most SEO news.
Don't believe me? Here's a nice Google Trends graph that may change your mind:
But why is it that content is now dominating the SEO scene? How vital is content for your SEO strategy, actually? And most importantly: how can you be content with your site's content? Puns aside, this post aims to help you figure out potential causes of your underperforming content and how to improve it.
Why content is key in SEO in 2020
Content is one of the most important factors in SEO. Just by paying close attention to what Google has been communicating to webmasters in the last few years, it's clear that they've put a strong emphasis on "content" as a decisive ranking factor.
For instance, letâs have a look at this post, from August 2019, which talks about Googleâs regular updates and what webmasters should focus on:
"Focus on content: pages that drop after a core update don't have anything wrong to fix. We suggest focusing on ensuring you're offering the best content you can. That's what our algorithms seek to reward."
The article goes on, listing a series of questions that may help webmasters when self-assessing their own content (I strongly recommend reading the entire post).
That said, content alone cannot and should not be enough for a website to rank well, but it is a pretty great starting point!
Underperforming content: theory first
What is underperforming content?
When I say "underperforming content", I'm referring to content, either on transactional/commercial pages or editorial ones, that does not perform up to its potential. This could be content that either used to attract a good level of organic traffic and now doesn't, or content that never generated any organic traffic despite the efforts you might have put in.
Over 90% of content gets no traffic from Google. Ninety bloody percent! This means that nine pages out of 10 are likely not to receive any organic traffic at all. Food for thought.
What are the causes of underperforming content?
There could be many reasons why your content is not doing well, but the brutal truth is often simple: in most cases, your content is simply not good enough and does not deserve to rank in the top organic positions.
Having said that, here are the most common reasons why your content may be underperforming. They're in no particular order, though I'll highlight the ones I consider most important.
Your content does not match the user intent
Based on my experience, this is a very important thing that even experienced marketers still get wrong. It may be the case that your content is good and relevant to your users, but does not match the intent that Google is showcasing in the SERP for the keywords of focus.
As SEOs, our aim should be to match user intent, which means we first need to understand the what and the who before defining the how. Whose intent we are targeting and what is represented in the SERP will define the strategy we use to get there.
Example: webmasters who hope to rank for a navigational or informational keyword with a transactional page, or vice versa.
Your content isn't in the ideal format Google is prioritizing
Google may be favoring a certain format which your content doesn't conform to, hence it isn't receiving the expected visibility.
Example: you hope to rank with a text-heavy blog post for a "how to" keyword where Google is prioritizing video content.
Your content is way too "thin" compared to what is ranking
It doesn't necessarily have to be a matter of content length (there is no proven content-length formula out there, trust me) but more of relevance and comprehensiveness. It may be the case that your content is simply not as compelling as what other sites offer, hence Google prioritizes them over you.
Example: you hope to rank for heavily competitive informational keywords with a 200-word blog post.
Your content isn't as up-to-date
If your content is very topical, and the topic heavily depends on information which may change with time, then Google will reward sites that put effort into keeping the content fresh and up-to-date. Apart from search engines themselves, users really care about fresh content: no one wants to read an "SEO guide to improve underperforming content" that was created in 2015!
Example: certain subjects/verticals tend to be more prone to this issue, but generally anything related to regulations/laws/guidelines which tend to change often.
Your content is heavily seasonal or tied to a past event/experience
Self-explanatory: if your content is about something that occurred in the past, interest in that particular subject will generally decrease over time. There are exceptions, of course (god save the 90s and my fav Netflix show "The Last Dance"), but you get the gist.
Example: topics such as dated events or experiences (Olympics 2016, past editions of Black Friday, and so on) or newsworthy content (2016 US election, Kanye running for president... no wait, that is still happening).
Your tech directives have changed the page's indexation status
Something may have happened to your page that made it fall out of Google's index. The most common issues include: an unexpected no-index tag, a canonical tag, incorrect hreflang tags, page status changes, the page being removed with Google Search Console's removal tool, and so on.
Example: after some SEO recommendations, your devs mistakenly put a no-index tag on your page without you realizing it.
Your page is a victim of duplication or cannibalization
If you happen to cover the same or similar keyword topic with multiple pages, this may trigger duplication and/or cannibalization, which ultimately will result in a loss of organic visibility.
Example: you launch a new service page alongside your current offerings, but the on-page focus (metadata, content, linking structure) isn't different or unique enough, and it ends up cannibalizing your existing visibility.
Google cannot render your content properly
Example: a fictitious case where your site goes through a redesign, heavy JavaScript now runs in your browser and changes a key part of your content, and Google cannot render it easily. That is a problem!
Your page has lost visibility following drastic SERP changes
The SERP has changed extensively in the last few years, bringing many new features that weren't there before. This may disrupt previous rankings (and hence your previous CTR), or push your pages off Google's precious page one.
Also, don't forget that the competition might have gotten stronger over time, which could be another reason you lose significant visibility.
Example: some verticals have been impacted more than others (jobs, flights, and hotels, for instance) where Google's own snippets and tools now occupy the top of the SERP. If you are as obsessed with SERP changes, and in particular PAA, as I am and want more details, have a read here.
Your content doesnât have any backlinks
Without going into too much detail on this point (it could be a separate blog post), for very competitive commercial terms, having no or too few backlinks (and what backlinks represent for your site in Google's eyes) can hold you back, even if your page content is compelling on its own. This is particularly true for new websites operating in a competitive environment.
Example: for a challenging vertical like fashion, for instance, it is extremely difficult to rank for key head terms without a good amount of quality (and naturally gained) backlinks to support your transactional pages.
How to find the issues affecting your content
We've covered the why above; let's now address the how: how to determine what issue affects your page or content. This part is especially dedicated to a not-too-savvy SEO audience (skip ahead to the next section if you're after the how-to recommendations).
I'll go through a list of checks that can help you detect the issues listed above.
Google Search Console
Use the URL inspection tool to analyze the status of the page: it can help you answer questions such as:
- Has my page been crawled? Are we even allowing Google to crawl the page?
- Has my page been indexed? Are we even allowing Google to index the page?
Within the Coverage feature, Google shares information about the crawlability and indexability of the page.
Pay particular attention to the Indexing section, which mentions the user-declared canonical vs. the Google-selected canonical. If the two differ, it's definitely worth investigating the reason, as this means Google isn't respecting the canonical directives placed on the page. Check official resources to learn more about this.
I love Chrome extensions, and I objectively have way too many on my browser...
Some Chrome extensions can give you lots of info on the indexability status of the page with a simple click, checking things like canonical tags and meta robots tags.
My favorite extensions for this matter are:
All the above tools are very useful for any type of troubleshooting as they showcase the rendered-DOM resources in real time (different from what the "view-source" of a page shows).
Once you've run the test, click to see the rendered HTML and try the following checks:
- Is the core part of my content visible?
- Quick way to check: pick a sentence from your content and use the search function (CTRL + F) to see if it's present in the rendered version of the page.
- Are internal links visible to Google?
- Quick way to check: pick an internal link on the page and use the search function (CTRL + F) to see if it's present in the rendered version of the page.
- Can Google access other key elements of the page?
- Check for things such as headers (example below with a Brainlabs article), products, pagination, reviews, comments, etc.
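The rendered-content checks above can be partially automated. Below is a minimal sketch, with the HTML, sentence, and link all invented for illustration, that parses an HTML snapshot and reports whether a key sentence and an internal link are present. For JavaScript-heavy pages you would first need to capture the rendered DOM (e.g. with a headless browser), since the raw "view-source" HTML can differ from it.

```python
# Check whether key content and internal links survive into the HTML
# that Google sees. Sample HTML and URLs are hypothetical.
from html.parser import HTMLParser

class LinkAndTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []   # href values of every <a> tag found
        self.text = []    # visible text fragments

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

    def handle_data(self, data):
        self.text.append(data)

def audit_html(html, key_sentence, internal_link):
    checker = LinkAndTextChecker()
    checker.feed(html)
    return {
        "content_visible": key_sentence in "".join(checker.text),
        "link_visible": internal_link in checker.links,
    }

# Hypothetical rendered HTML snapshot of a page we are auditing
rendered = """<html><body>
<h1>Evergreen content guide</h1>
<p>Evergreen content stays relevant long after publication.</p>
<a href="/blog/seo-basics">SEO basics</a>
</body></html>"""

result = audit_html(rendered,
                    "stays relevant long after publication",
                    "/blog/seo-basics")
print(result)  # {'content_visible': True, 'link_visible': True}
```

Running the same audit against both the raw HTML and the rendered DOM is a quick way to spot content that only exists after JavaScript executes.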
Intent and SERP analysis
By analyzing the SERP for your key terms of focus, you'll be able to identify a series of questions that relate your content to intent, competition, and relevance. All major SEO tools nowadays provide you with tons of great information about what the SERP looks like for whatever keyword you're analyzing.
For the sake of our example, let's use Ahrefs; the sample keyword below is "evergreen content":
Based on this example, these are a few things I can notice:
- This keyword triggers a lot of interesting SERP features (Featured Snippet, Top Stories, People also ask)
- The top organic spots are owned by very established and authoritative sources (the Ahrefs blog, HubSpot, WordStream, etc.), which makes this keyword quite difficult to compete for
Here are quick suggestions on what types of checks I recommend:
- Understand and classify the keyword of analysis, based on the type of results Google is showing in the SERP: any ads showing, or organic snippets? Are the competing pages mainly transactional or informational?
- Check the quality of the sites ranking on page one: indicative metrics that help you gauge the quality of each domain (DA/DR) are helpful, as are the number of keywords those pages are visible for, the estimated traffic per page, and so on.
- Do a quick crawl of these pages to bulk check the comprehensiveness of their content and metadata, or manually check some if you prefer that way.
By doing most of these checks, youâll be able to see if your content is underperforming for any of the reasons previously mentioned:
- Content not compelling enough compared to what is ranking on page one
- Content in the wrong format compared to what Google is prioritizing
- Content is timely or seasonal
- Content is being overshadowed by SERP features
Duplication and cannibalization issues
Check out my 2019 post on this subject, which goes into a lot more detail. The quick version of the post is below.
Use compelling SEO tools to understand the following:
- whether, for tracked keywords of interest, two or more ranking URLs have been flip-flopping. That is a clear sign that search engines are confused and cannot "easily decide" which URL to rank for a certain keyword.
- whether, for tracked keywords of interest, two or more ranking URLs are appearing at the same time (not necessarily on page one of the SERP). That is a clear signal of duplication/cannibalization.
- check your SEO visibility by landing page: if different URLs rank for very similar keyword permutations, chances are there is a cannibalization risk.
- last but not least: do a simple site search for keywords of interest in order to get an initial idea of how many pages (that cover a certain topic) have been indexed by Google. This is an insightful preliminary exercise and also useful to validate your worries.
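As a small helper for the last check, here is a hedged sketch (the domain and keywords are placeholders) that builds Google "site:" search URLs for a list of topics, so you can quickly open each one and eyeball how many of your pages are indexed for that topic:

```python
# Build Google "site:" search URLs for a list of topic keywords.
from urllib.parse import quote_plus

def site_search_urls(domain, keywords):
    urls = []
    for kw in keywords:
        # Quoting the keyword restricts results to that exact phrase
        query = f'site:{domain} "{kw}"'
        urls.append("https://www.google.com/search?q=" + quote_plus(query))
    return urls

for url in site_search_urls("example.com", ["evergreen content", "seo basics"]):
    print(url)
```

The result counts Google shows for these queries are approximate, but they're a useful first signal of how many pages cover the same topic.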
How to fix underperforming content
We've covered the most common cases of underperforming content and how to detect such issues. Now let's talk about ways to fix them.
Below is a list of suggested actions to take when improving your underperforming content, with some very valuable links to other resources (mostly from Moz or Google) that can help you expand on individual concepts.
Make sure your page can be crawled and indexed "properly"
- Ensure that your page does not fall under any path blocked in robots.txt.
- Ensure your page does not carry a no-index meta robots tag or a canonical tag pointing elsewhere (a self-referencing canonical tag is something you may want to consider, but it is not compulsory).
- Check whether other pages have a canonical tag pointing to your URL of focus. Irrelevant or poorly implemented canonical tags tend to get ignored by Google; you can check if that is the case in the URL Inspection tool.
- Ensure your page is linked internally on the site and present in your XML sitemap.
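The first two checks in that list can be sketched in a few lines of Python. This is illustrative only; the robots.txt rules and HTML snippet are invented:

```python
# Check 1: is the URL path blocked by robots.txt?
# Check 2: does the HTML carry a noindex meta robots tag?
import re
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for the site being audited
robots_txt = """User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

def is_crawlable(path, user_agent="Googlebot"):
    return parser.can_fetch(user_agent, path)

def has_noindex(html):
    # Look for <meta name="robots" content="...noindex...">
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None

print(is_crawlable("/blog/my-post"))    # True
print(is_crawlable("/private/draft"))   # False
print(has_noindex('<meta name="robots" content="noindex,follow">'))  # True
```

In practice you would fetch the live robots.txt and page HTML, but the same two functions apply; the URL Inspection tool in Search Console remains the authoritative check.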
Understand search intent
- Search intent is a fascinating topic in and of itself, and there are a lot of great resources on the subject if you want to delve deeper into it.
- Put simply, you should always research what the SERP looks like for the topic of interest: by analyzing the SERP and all its features (organic and non), you can get a much better understanding of what search engines are looking for in order to match intent.
- By auditing the SERP, you should be able to answer the following questions:
- What type of content is Google favoring here: transactional, navigational, informational?
- How competitive are the keywords of focus and how authoritative are those competitors ranking highly for them?
- What content format is Google showcasing in the SERP?
- How comprehensive should my content be to get a chance to rank in page one?
- What keywords are used in the competitors' metadata?
- What organic features should I consider addressing with my content (things like featured snippets, people also ask, top images, etc.)?
- Hopefully all the questions above will also give you a realistic view of your chances of ranking on Google's first page. Don't be afraid to switch your focus to PPC for some very competitive keywords where your real chances of ranking organically are slim.
Map your pages against the right keywords
- This is a necessary step to make sure you have a clear understanding of not only what keywords you want to rank for, but also what keywords you are eligible to rank for.
- Don't overdo it, and be realistic about your ranking possibilities: mapping your page against several keyword variations, all of which show very different SERPs and intents, is not realistic.
- My suggestion is to pick two or three primary keyword variations and focus on getting your content as relevant as possible to those terms.
Write great metadata
- Title tags are still an incredibly important on-page ranking factor, so dedicate the right amount of time to writing unique and keyword-rich titles.
- Meta descriptions are not a ranking factor anymore, but they still play a part in enticing the user to click on a search result. So from a CTR perspective, they still matter.
- SEO keyword research is the obvious choice for writing compelling metadata, but don't forget about PPC ad copy: check which PPC ad copies work best for your site and take learnings from them.
- Don't change metadata too often, though: make sure you do your homework and give new metadata enough time to be properly tested once implemented.
Make the right content amends
- Based on the intent audit and keyword mapping insights, youâre now ready to work on your actual page content.
- By now, you've done your homework, so you just need to focus on writing great content for the user (and not for Google).
- Readability is a very important part of a page. Tricks that I've learned from colleagues over the years are the following:
- Read the content out loud and try to objectively assess how interesting it is for your target audience.
- Make sure to use enough spacing between lines and paragraphs. People's attention spans these days are very short, and chances are people will skim through your content rather than dedicating 100% of their attention to it (I'm sure some of YOU readers are doing it right now!).
- Make sure your tone of voice and language match your target audience (if you can write things in plain English vs. highly technical jargon, do so and don't over-complicate your life).
- Make sure you've thought about all internal linking possibilities across the site. Not only for the same type of page (transactional page to transactional page, for instance) but also across different types (transactional page to video/blog post, if that helps people make a decision, for example).
- Optional step: once everything is ready, request indexing of your page in Google Search Console with the URL inspection tool.
Underperforming content is a very common issue and should not take you by surprise, especially considering that content is among the most important (if not the most important) ranking factors in 2020. With the right tools and process in place, solving this issue is something everyone can learn: SEO is not black magic, and the answer tends to be logical.
First, understand the cause(s) of your underperforming content. Once you're certain you're compliant with Google's technical guidelines, move on to determining what intent you're trying to satisfy. Your research on intent should be comprehensive: it's what will decide the changes you'll need to make to your content. At that point, you'll be ready to make the necessary SEO and content changes to best match your findings.
To help us serve you better, please consider taking the 2020 Moz Blog Reader Survey, which asks about who you are, what challenges you face, and what you'd like to see more of on the Moz Blog.
Posted by BritneyMuller
Machine learning, a branch of artificial intelligence that studies the automatic improvement of computer algorithms, might seem far outside the scope of your SEO work. MozCon speaker (and all-around SEO genius) Britney Muller is here with a special edition of Whiteboard Friday to tell you why that's not true, and to go through a few steps to get you started. You can access the Colab notebook discussed in the video here.
To see more on machine learning from Britney and our other MozCon 2020 presentations, check out this year's video bundle.
Hey, Moz fans. Welcome to this special edition of Whiteboard Friday. Today we are taking a sneak peek at what I spoke about at MozCon 2020, where I made machine learning accessible to SEOs everywhere.
This is so, so exciting because it is readily at your fingertips today, and I'm going to show you exactly how to get started.
So to kick things off, I learned about this weird concept called brood parasites this summer, and it's fascinating. It's basically where one animal tricks another animal (often of a different species) into raising its young.
It's fascinating, and the more I learned about it, the more I realized: oh my gosh, I'm sort of like a brood parasite when it comes to programming and machine learning! I latch on and find these great models that do all the work â all of the raising â and I put in my data and my ideas, and it does things for me.
So we are going to use this concept to our advantage. In fact, I have been able to teach my dad most of these models that, again, are readily available to you today within a tool called Colab. Let me just walk you through what that looks like.
Models to get you started
So to get started, if you want to start warming up right now, just start practicing by holding down "Shift" and then press "Enter".
Just start practicing that right now. It's half the battle. You're about to fire up some really cool models.
All right. What are some examples of that? What does that look like? So some of the models you can play with today are things like DeOldify, which is where you repair and colorize old photos. It's really, really fun.
Another one is a text generator. I created one with GPT-2: super silly, it's this excuse generator. You can manipulate it and make it do different things for you.
There's also a really, really great forecasting model, where you basically put in a chunk of time series data and it predicts what the future might have in store. It's really, really powerful and fun.
You can summarize text, which is really valuable. Think about meta descriptions, all that good stuff.
You can also automate keyword research grouping, which I'll show you here in a second.
You can do really powerful internal link analysis, have a notebook for that.
Perhaps one of the most powerful things is you can extract entities and categories as Google perceives them. It's one of my favorite APIs. It's through Google's NLP API. I pull it into a notebook, and you basically put the URLs you want to extract this information from and you can compare how your URL compares to competitors.
It's really valuable, fun stuff.
Most importantly, you cannot break any of this!
Do not be intimidated by any of the code whatsoever. Lots of seasoned developers don't know what's happening in some of those code blocks. It's okay.
We get to play in this environment. It's hosted in Google Drive, and so there's no fear of this breaking anything on your computer or with your data or anything. So let's dive in. As I said, this is through a free tool called Colab.
You know how Google took Excel and made Google Sheets?
They did the same thing with what's known as Jupyter Notebooks. Jupyter is installed locally on your computer. It's one of the most popular notebook environments. But it requires some setup, and can be somewhat clunky. It gets confused with different versions and yada, yada. Google put that into the cloud (zero setup required) and is now calling it Colab Notebooks. It's unbelievably powerful.
So, again, it's free. It's available to you right now if you want to open it up in a new tab. There is zero setup. Google also gives you access to free GPU and TPU computing, which is great. It has a 12-hour runtime.
One con is that you can hit usage limits. I hit the limits, and now I'm paying $9.99 a month for the Pro version, and I've had no problems.
Again, I'm not affiliated with this whatsoever. I'm just super passionate about it, and the fact that they offer you a free version is so exciting. I've already seen a lot of people get started in this. It's also something to note that it's probably not as secure or robust as Google's Enterprise solution. So if you're doing this for a large company or you're getting really serious about this, you should probably check out some other options. But if you're just kind of dabbling and want to explore and have fun, let's keep this party going.
All right. So again, this is a cloud-hosted notebook environment. So one thing that I want to really focus on here, because I think it's the most valuable for SEOs, is this library known as "pandas".
Pandas is a data analysis library built on top of Python. After running just two lines of code, you can upload a CSV file from your local computer. This silly example is one that I really did run with Google Search Console data.
So you run this in a notebook (Shift + Enter). Again, I'm sharing this entire notebook with you today. So just click into the first cell and keep pressing Shift + Enter to work your way down through the notebook. It's not as intimidating as it looks.
Once you run the above code, upload your CSV. Then once you upload it, you are going to name your data frame.
After uploading your CSV, you will see output stating your CSV's file name. Copy that name exactly as it was uploaded and paste it into the next cell (above) to name your data frame.
In this case, mine was just "gsc-example.csv". Again, once you upload your CSV, you will see the name in the output beneath the "from google.colab import files" code cell.
Now, let's run some simple lines of code to make sure that your data is in there properly.
A common first check people do is "df.head()". This shows you the first five rows of your data frame. You can also do "df.tail()", and it shows you the last five rows of your data frame.
You can even put in a number in the parentheses (ex. df.head(30)) to see the first 30 rows. It's that easy!
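A minimal, self-contained sketch of those commands, using a made-up Search Console export in place of an uploaded CSV (in Colab you would load your real file with `pd.read_csv` instead):

```python
import pandas as pd

# Hypothetical Google Search Console export
df = pd.DataFrame({
    "Query":       ["moz blog", "seo tips", "keyword research", "link building"],
    "Clicks":      [120, 45, 30, 12],
    "Impressions": [1500, 900, 2200, 640],
    "Position":    [2.1, 5.4, 12.8, 9.3],
})

print(df.head(2))   # first two rows of the data frame
print(df.tail(2))   # last two rows of the data frame
print(len(df))      # total number of rows: 4
```

`df.head()` and `df.tail()` default to five rows; passing a number, as in `df.head(30)`, changes how many you see.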
Now comes the really fun stuff, and this is just tip of the iceberg.
So you can run this really, really cool code cell here to create a filterable table.
What's powerful about this, especially with your Google Search Console data, is you can easily extract and explore keywords that have a high click-through rate and a poor rank in search. It's one of my favorite ways to explore keyword opportunities for clients, and it couldn't be easier.
So, play around with the filterable view.
If you're doing keyword research (bucketing keywords, organizing topics, and so on), you can organize your keywords much more easily with pandas.
Here's how to create a new column that states whether or not a keyword is 'Branded'.
So to walk you through this, "df["Branded"]" creates a new column called "Branded".
Then "df.Query.str.contains("moz|rand|ose")" uses regex that labels any query with those keywords as Branded = True.
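Those two lines can be run end-to-end on a tiny made-up data frame:

```python
import pandas as pd

# Hypothetical query list from Search Console
df = pd.DataFrame({"Query": ["moz blog", "seo tips", "rand fishkin", "link building"]})

# New "Branded" column: True for any query matching the brand regex
df["Branded"] = df.Query.str.contains("moz|rand|ose")

print(df[df["Branded"]].Query.tolist())   # ['moz blog', 'rand fishkin']
print(df[~df["Branded"]].Query.tolist())  # ['seo tips', 'link building']
```

The `~` operator inverts the boolean column, so the same flag splits queries into branded and non-branded buckets in one pass.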
So now that makes filtering and exploring that so much faster! You can even do this in ways where you can create an entirely different data frame table (examples of that are also in this notebook).
You can use that and export your keywords into buckets like that, and there's no stall time. Things don't freeze up like Excel. You can account for misspellings and all sorts of good stuff more easily with regular expressions. It's super cool.
Again, this is just tip of the iceberg, my friends. I am most excited to plant this seed within all of you so that you guys can come back and teach me what you've been able to accomplish. I think we have so much more to explore in this space. It is going to be so much fun! If you get a kick out of this and you want to continue exploring different models, different programs within Colab, I highly suggest you download the Colab Chrome extension.
It just makes opening up the notebook so much easier.
You can save copies of notebooks you discover to your Drive and play with them all you want. It's so much fun. I hope this sparked some inspiration in you, and I am so excited to hear what all of you think and create. Really appreciate you watching.
So thank you so much. I will see you all next time. Bye.
Ready for more?
You'll uncover even more SEO goodness in the MozCon 2020 video bundle. At this year's special low price of $129, this is invaluable content you can access again and again throughout the year to inspire and ignite your SEO strategy:
- 21 full-length videos from some of the brightest minds in digital marketing
- Instant downloads and streaming to your computer, tablet, or mobile device
- Downloadable slide decks for presentations
Quick summary: Starting today, we support shippingDetails schema.org markup as an alternative way for retailers to be eligible for shipping details in Google Search results. Since June 2020, retailers have been able to list their products across different Google surfaces for free, including on Google Search. We are committed to supporting ways for the ecosystem to better connect with users that come to Google to look for the best products, brands, and retailers by investing both in more robust tooling in Google Merchant Center as well as with new kinds of schema.org options. Shipping details, including cost and expected delivery times, are often a key consideration for users making purchase decisions. In our own studies, we've heard that users abandon shopping checkouts because of unforeseen or uncertain shipping costs. This is why we will often show shipping cost information in certain result types, including on free listings on Google Search…
Posted by liambbarnes
As most SEO specialists have learned, you must create quality content to grow organically. The same thing can be said for businesses that are building a social media presence or a new newsletter following.
But as people consume more and more content each day, they become less receptive to basic content that doesn't provide a new perspective. To counter this issue, you must make sure that your content is native to each platform you publish on.
However, that doesn't mean that you need to start from scratch. There's a way to take one content idea and turn it into multiple, which can scale across multiple platforms and improve your brand awareness.
It takes time to write a brand-new blog article every day, especially when you're an in-house team with a low number of resources and budget. The biggest challenge here is building a content strategy at scale.
So, how do you create a lot of great content?
You start with video.
If you have a video on a relevant topic, it can be repurposed into various individual pieces of content and distributed over a period of time across the right channels. Let's walk through the process.
Using video to scale content
In fact, this article was transcribed.
For every article you write about, you must do extensive research, write out your first draft, edit, make changes, and more. It can consume an entire workday.
An easier way to do this? Record yourself on Loom or another video software, save it, and send the video file to an audio/video transcription service. There are so many tools, like Rev.com or TranscribeMe, that do this for relatively cheap.
Of course, even if you're relying on speech-to-text, there's still editing time to take into account, and some would argue it takes MORE time to edit a raw transcription. There isn't a "best way" to create content; however, for those who aren't strong writers but are strong speakers, transcription can be a powerful way to move at a quicker pace.
The step-by-step process
Once you write out your content, how do you ensure that people read it?
Like any other content strategy, make sure that the process of planning, creating, and executing is written down (most likely digitally in a spreadsheet or tracking tool) and followed.
Let's break down how to get the most out of your content.
1. Grab attention with your topic
Sometimes, content ideation can be the most challenging part of the process. Depending on the purpose of your content, there are various starting points.
For example, if you're writing a top-of-funnel blog article where the goal is to drive high amounts of organic traffic, start by performing keyword research to craft your topic. Why? You need to understand what your audience searches for and how to ensure you're in the mix of search results.
If you're creating a breakdown of your product or service, you may want to start by interviewing a subject matter expert (SME) to gain real-life details on the product/service and the solutions it provides to your target audience. Why? Note what they say the most important aspects are, or whether there's a new feature or addition for the audience. These points can be tied into a topic that might pique the target reader's interest.
2. Create an outline for the blog
When you're building out your blog structure, record a video similar to how you would write a blog article.
In this case, by creating an outline for the article with the questions that you ask yourself, it'll be easier to format the transcription and the blog after you record.
3. Pick your poison (distribution strategy)
Now that you're ready to begin recording your video, decide where your content will be distributed.
The way you'll distribute your content heavily influences the way you record your video, especially if you're going to be utilizing the video as the content itself (Hello, YouTube!).
For example, if you run a business consultancy, the videos that you record should be more professional than if you run an e-commerce surf lifestyle brand. Or, if you know you're going to be breaking the video up, leave time for natural "breaks" for easy editing later on.
By planning ahead of time, you give yourself a better idea of where the content will go, and how it will get there.
4. Your time to shine
With Zoom, you can record a video of yourself speaking into your camera, and you'll get an audio file after you hang up the call.
With Loom, you can use the Chrome extension, which allows you to record yourself in video form while sharing your screen. If you have additional content, like a PowerPoint presentation or a walk-through, this might be the tool for you.
Regardless of the way that you record, you need an audio file to transcribe and transform into other content formats later on.
5. Transcribe your video
The average transcriber converts one hour of audio in around four hours, but some of the best can do it in as little as two.
To put that into perspective, the average one-hour audio file contains about 7,800 words, which would take the average writer around three and a half hours just to type out.
Additionally, you have to add research time, internal linking, and many other factors on top of that, so on average it takes around an hour to write 1,000 words of a high-quality blog post.
Transcription shortens the length of this process.
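Using the figures above (7,800 words per hour of audio, roughly 1,000 finished words per writing hour, and a four-hour transcription turnaround), a back-of-the-envelope comparison looks like this:

```python
# Back-of-the-envelope numbers taken from the paragraphs above;
# editing time is ignored in both columns, so treat this as a rough sketch.
AUDIO_WORDS_PER_HOUR = 7_800            # words spoken in one hour of audio
WRITING_WORDS_PER_HOUR = 1_000          # high-quality writing pace, incl. research
TRANSCRIPTION_HOURS_PER_AUDIO_HOUR = 4  # average transcriber turnaround

audio_hours = 1
words = audio_hours * AUDIO_WORDS_PER_HOUR

hours_to_write_from_scratch = words / WRITING_WORDS_PER_HOUR
hours_via_transcription = audio_hours * TRANSCRIPTION_HOURS_PER_AUDIO_HOUR

print(f"Writing {words} words from scratch: ~{hours_to_write_from_scratch} h")
print(f"Recording + transcription:          ~{hours_via_transcription} h")
```

Even with generous editing time added back in, the recorded route comes out ahead for strong speakers.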
When looking to transcribe your audio, you can send files out to transcription services such as Rev or TranscribeMe. Once you send them the audio file, you'll typically receive the transcript back in a few hours (depending on demand).
6. Alter transcription into blog format
You'll receive the transcribed content via email, broken out by speaker. This makes it much easier to format post-transcription.
If you properly outlined the blog prior to recording, then this editing process should be simple. Copy and paste each section into the desired area for your blog and add your photos, keywords, and links as desired.
7. Chop your video into digestible parts
Here's where things get interesting.
If you're using your video for social media posts, shorten the video into multiple parts to be distributed across each platform (and make sure they're built to match each platform's guidelines).
Additionally, quotes from the video can be used to create text graphics, text-based social posts, or entire articles themselves.
Think of the watering holes where your target audience consumes information on the internet.
Each platform requires creating a different experience that involves new, native content. But that doesn't mean you have to start at zero.
If you have a 10-minute video, it can be transcribed into a blog post of roughly 1,300 words (at a typical speaking pace of about 130 words per minute), a five-to-seven-minute read.
Boom. You have another resource to share, which can also include proper keywords so it ranks higher on the SERP.
Let's say you end up editing the video down to about five minutes. From here, you can make:
- A five-minute video to post on YouTube and your blog
- Ten 30-second videos to post across several social media platforms
- Twenty 100-word posts on LinkedIn
- Thirty 50 to 60-word posts on Twitter
Not to mention there are other platforms like Reddit and Quora, as well as email marketing, that you can also distribute your content with. (Turn one of the 100-word LinkedIn posts into the opening in your latest newsletter, and attach the full video for those who want to learn more!)
By starting off with an all-encompassing video, you extend your content capabilities from a regular blog article into 50+ pieces of content across multiple social media platforms and search engines.
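The breakdown above can be tallied with a few lines of arithmetic (the per-platform counts are the illustrative figures from the list, not fixed rules):

```python
# Tallying the example breakdown above: one five-minute edited video
# becomes dozens of pieces of platform-native content. Counts are the
# illustrative figures from the article, not platform requirements.
video_minutes = 5

pieces = {
    "youtube_and_blog_video": 1,
    "social_clips_30s": int(video_minutes * 60 // 30),  # ten 30-second clips
    "linkedin_posts_100w": 20,
    "tweets_50_60w": 30,
}

total = sum(pieces.values())
print(f"{total} pieces of content from one recording")  # 61 pieces
```

Add the transcribed blog post, plus Reddit, Quora, and newsletter reuse, and the "50+ pieces" figure follows directly.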
For example, Lewis Howes, like many other brands and marketers, is famous for utilizing this method.
As you can see below, Howes interviewed Mel Robbins for his podcast, which he scaled across YouTube and podcast platforms, but he also took a quote from her in the interview and scaled it across Instagram, Twitter, and LinkedIn.
When you build out your content calendar, simply copy and paste certain sections into an Excel spreadsheet, and organize them based on date and platform. Make sure they make sense on the platform, add an extra line or two if you need to, and work your magic.
This will save you hours of time in your planning process.
Now that you have created your various forms of content, it's time to make sure it appears before the right eyes.
Having a consistent flow of relevant content on your website and social media platforms is a crucial part of empowering your brand, building credibility, and showing that you're worth trusting as a potential partner.
As you repurpose older content as well, you can repeat this process and pull together another 50+ pieces of content from a previously successful article.
Improving organic search visibility
"Discoverability" is a popular term in marketing. Another way to say it is "organic search visibility". Your brand's search visibility is the percentage of clicks that your website gets in comparison to the total number of clicks for that particular keyword or group of keywords.
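As a toy illustration of that definition, with made-up numbers:

```python
# Search visibility: your clicks as a share of all clicks for a keyword
# group, per the definition above. All numbers here are illustrative.
def search_visibility(your_clicks: int, total_clicks: int) -> float:
    """Return visibility as a percentage of the keyword group's clicks."""
    if total_clicks == 0:
        return 0.0
    return 100.0 * your_clicks / total_clicks

# e.g. your site earns 1,200 of the 48,000 monthly clicks for a group
print(round(search_visibility(1_200, 48_000), 1))  # 2.5 (percent)
```
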
Normally, you can improve your visibility by writing a piece of content that best reflects a target keyword and building links to that page, which improves your rankings for that keyword and its long-tail variations.
However, as you begin to grow your business, you may begin heavily relying on branded search traffic.
In fact, one of the biggest drivers of organic traffic is branded traffic. If you don't have an authoritative brand, it's challenging to receive backlinks naturally, and therefore more difficult to rank organically.
One of the biggest drivers of brand awareness is social media. More than 4.5 billion people use the internet, and 3.8 billion use social media.
If you want more people to search for your brand, push relevant social media campaigns that do just that.
But even further than that, we are seeing more and more social media platforms such as Pinterest, YouTube, and Twitter showing up in search results and snippets. For example, below is the SERP for the keyword "how to make cookies", where a series of YouTube videos shows up:
And this SERP for the keyword "Moz" has the most recent Tweets from Moz's Twitter account.
Writing content that ranks will continue to be important, but as Google keeps integrating other forms of social media into the SERPs, make time to post on every social media platform to improve search visibility and make your brand discoverable.
But, duplicate content?
Duplicate content can be defined as the same content used across multiple URLs, and can be detrimental to your website's health. However, from what we have seen through multiple conversations with marketers in the SEO world, there is no indication that websites are getting penalized for duplicate content when reposting said content on social media platforms.
Say goodbye to the time drain of creating one piece of content at a time. The most effective way to create a successful content marketing strategy is to share thought-provoking and data-driven content. Take advantage of this process to maximize your output and visibility.
Here are some final tips to take away to successfully launch a content marketing strategy, using this method:
- Consistently analyze your results and double down on what works.
- Don't be afraid to try new tactics to see what your audience is interested in (check out a real-world content strategy I helped get results for here).
- Analyze the response from your audience. They'll tell you what is good and what is not!
Have other ideas? Let me know in the comments!
Last year we released the robots.txt parser and matcher that we use in our production systems to the open source world. Since then, we've seen people build new tools with it, contribute to the open source library (effectively improving our production systems, thanks!), and release new language versions like Go and Rust, which make it easier for developers to build new tools. With the intern season ending here at Google, we wanted to highlight two new releases related to robots.txt that were made possible by two interns working on the Search Open Sourcing team, Andreea Dutulescu and Ian Dolzhanskii. Robots.txt Specification Test First, we are releasing a testing framework for robots.txt parser developers, created by Andreea. The project provides a testing tool that can validate whether a robots.txt parser follows the Robots Exclusion Protocol, or to what extent. Currently there is no official and thorough way to assess the correctness…
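The protocol behavior such a test framework validates can be illustrated with Python's standard-library parser, which is a separate implementation from Google's open-source C++ one:

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt, checked with Python's stdlib Robots Exclusion
# Protocol implementation -- used here only to illustrate the protocol,
# not Google's parser. Note: the stdlib parser applies the first matching
# rule, so the more specific Allow line is listed before the broader
# Disallow (Google's parser instead uses most-specific-match semantics).
robots_txt = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "https://example.com/private/secret.html"))       # False
print(parser.can_fetch("*", "https://example.com/private/public-page.html"))  # True
print(parser.can_fetch("*", "https://example.com/index.html"))                # True
```

Divergences like that rule-precedence detail are exactly what a conformance test suite for the Robots Exclusion Protocol is designed to surface.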
Posted by DawnMacri
Understanding search intent can be the secret ingredient that takes your content strategy from okay to outstanding. As SEO strategists at a digital marketing agency (Brainlabs), we often find clients on the brink of ranking success. They're sitting on stellar content that simply isn't ranking for their target keywords. Why? Oftentimes, the keywords and the intent simply don't match.
Here we'll discuss the different types of search intent, how to determine the best intent for given keywords, and how to optimize for search intent. First, let's iron out the basics.
What is search intent?
Search intent (also known as user intent) is the primary goal a user has when searching a query in a search engine. Many times, users are searching for a specific type of answer or resource as they search.
Take pizza for example. Searching for a pizza recipe has a different intent than searching for a takeout pizza, which is also different from searching for the history of pizza. Though they all revolve around the same overall topic (pizza), these users all have different intents.
Why is search intent important for SEO?
Google cares about search intent
The short answer is: Satisfying search intent is a primary goal for Google, which in turn makes it a primary goal for SEOs. When a user searches for a specific term and finds irrelevant information, that sends a signal back to Google that the intent is likely mismatched.
For example, if a user searches "How to build a website" and they're shown a slew of product pages for CMS platforms and hosting sites, they'll try another search without clicking on anything. This is a signal to Google that the intent of those results does not reflect the intent of the searcher.
Broaden your reach across funnel stages
When it comes to running a business and building a successful content marketing strategy, I can't stress enough the importance of remembering search intent, and letting it be the driving force behind the pieces of content you create and how you create them.
And just why is this so important? The more specific your content is to various search intents, the more users you can reach, and at different stages of the funnel. From those who have yet to discover your brand to those looking to convert, you can increase your chances of reaching them all by focusing your efforts on matching search intent.
You can improve rankings
Since Google's primary ranking factors are relevance, authority, and user satisfaction, it's easy to connect the dots and see how improving your keyword targeting to mirror search intent can improve your overall rankings.
Relevance: This has to do with your user's behavior. If they find the information they're looking for on your site, they're less likely to return to Google within seconds and explore a different result (pogo-sticking). You'll notice a difference in KPIs such as click-through rate and bounce rate when your content is relevant to search intent.
Authority: While much of a site's authority is connected to backlinks, it's also important to develop a strong internal linking strategy that signals to Google "I have a lot of content covering all angles and intents surrounding this topic" in order to rank well. Additionally, you can increase brand authority and visibility by creating valuable content, around topics your brand is well versed in, that satisfies various intents.
User satisfaction: Does the content you create provide value and is it relevant to your audience? End of story.
Types of search intent
While there are endless search terms, there are just four primary search intents:
- Informational
- Preferential/Commercial Investigation
- Transactional
- Navigational
Now you may be thinking, that's all well and good, but what do they mean for my content? Luckily, I've broken each one down with example terms that suggest intent. Keep in mind, however, that searches are not binary: many will fall under more than one category.
Informational
As you may have guessed, searches with informational intent come from users looking for... information! This could be in the form of a how-to guide, a recipe, or a definition. It's one of the most common search intents, as users can look for answers to an infinite number of questions. That said, not all informational terms are questions. Users searching simply for "Bill Gates" are most likely looking for information about Bill Gates.
- How to boil an egg
- What is a crater
- Ruth Bader Ginsburg
- Directions to JFK Airport
Preferential/Commercial Investigation
Before they're ready to make a purchase, users start their commercial investigation. This is when they use search to investigate products, brands, or services further. They're past the informational stage of their research and have narrowed their focus to a few different options. Users here are often comparing products and brands to find the best solution for them.
Note: These searches often include non-branded localized terms such as "best body shop near me" or "top sushi restaurant NYC."
- Semrush vs Moz
- Best website hosting service
- Squarespace reviews
- Wordpress or wix for blog
Transactional
Transactional searchers are looking to make a purchase. This could be a product, service, or subscription. Either way, they have a good idea of what they're looking for. Since the user is already in buying mode, these terms are usually branded. Users are no longer researching the product; they're looking for a place to purchase it.
- Buy Yeti tumbler
- Seamless coupon
- Shop Louis Vuitton bags
- Van's high tops sale
Navigational
These searchers are looking to navigate to a specific website, and it's often easier to run a quick search in Google than to type out the URL. The user could also be unsure of the exact URL or looking for a specific page, e.g. a login page. As such, these searches tend to be brand or website names and can include additional specifications to help users find an exact page.
- Spotify login
- MOZ beginner SEO
How to determine search intent
Consider keyword modifiers
As we briefly noted above, keyword modifiers can be helpful indicators of search intent. But it's not enough just to know the terms; you may also be wondering, when it comes to keyword research, how do you find them?
Thankfully, there are a range of trusted keyword research tools out there to use. Their filter features will be most useful here, as you can filter terms that include certain modifiers or phrases.
Additionally, you can filter keywords by SERP feature. Taking informational intent for example, you can filter for keywords that rank for knowledge panels, related questions, and featured snippets.
Read the SERPs
Another way to determine search intent is to research the SERPs. Type the keyword you're targeting into the search bar and see what Google comes up with. You'll likely be able to tell, by the types of results, what Google deems the most relevant search intent for each term.
Let's take a closer look at search results for each intent type.
SERP results for informational intent
As mentioned above, informational keywords tend to own SERP results that provide condensed information. These include knowledge panels, featured snippets, and related questions. The top results are most likely organic, consisting of Wikipedia entries, dictionary definitions, or informative blog posts.
SERP results for preferential/commercial research intent
Preferential intent is similar in that results may include a featured snippet, but they'll also include paid results at the top of the SERP. The results will also likely provide information about the brands searched, rather than topical information.
In the example below, the organic results compare product features between competing site hosts, rather than explaining what site hosts are and how they function.
SERP results for transactional intent
Transactional SERPs are some of the most straightforward to spot. They usually lead with paid results and/or shopping results, shopping carousels, and reviews. The organic results are largely product pages from online and brick-and-mortar retailers, and depending on the search, can include maps to their locations.
SERP results for navigational intent
Since users with navigational intent already know which website they're looking for, these results usually feature the most relevant page at the top: e.g. if the user searches "Spotify", Spotify's homepage will be the first result, whereas the login page will take first position for "Spotify login."
Additional features such as site links, knowledge cards, and top stories may also be present, depending on the specific search.
Look at the full picture
Keep in mind that terms often have more than one search intent, so looking only at keywords or the SERP is rarely enough to truly define it. That said, taking this holistic approach will bring you closer to the most prominent intent.
It's also important to note that SERPs are volatile, so while a keyword may rank for one intent this month, that could change next month.
How to optimize for search intent
Match metadata and content type to the intent
You've done your research and know which keywords you're targeting with which pages. Now it's time to optimize. A solid place to start is with your pages' metadata: update your title tag, H1, and H2s to reflect your specific keyword targeting. To increase click-through rate, try to leverage your title tag with some snappy copy (without creating clickbait).
Examine the competition
As with most competitions, it's a good idea to suss out the current winners prior to the event. So, before jumping into creating new pages or reformatting existing content, take a look at the top-ranking pages and ask yourself the following questions:
- How are they formatted?
- What's their tone?
- Which points do they cover?
- What are they missing?
You can now use your answers to create the best, most relevant piece of content on the topic.
Format content for relevant SERP features
Just as you used the SERP features as clues to search intent, they can also be used to inform your pages' formatting and content. If the featured snippet contains a numbered list, for example, it's safe to say that Google appreciates and rewards that format for that term.
In a similar vein, if the SERP returns related questions, be sure to answer those questions clearly and concisely in your content.
When creating SEO content around search intent, be sure to keep the following in mind:
- Understand the search intent before optimizing content
- When discovering new terms, use specific modifiers in your keyword research
- Use the SERPs to determine optimal formatting and content options
- Provide valuable, quality content every time
Creating SEO-optimized content for specific search intents is simple, but not easy. Follow these guidelines and you'll be well on your way to giving users the content they need in a format that they want.
For a deeper dive on fulfilling search intent, be sure to check out this informative Whiteboard Friday from Britney Muller.
Posted by rjonesx.
Since day one of SEO, marketers have tried to determine what factors Google takes into account when ranking results on the SERPs. In this brand-new Whiteboard Friday, Russ Jones discusses the theory behind those ranking factors, and gives us some improved definitions and vocabulary to use when discussing them.
Hi, folks. Welcome back to another Whiteboard Friday. Today, we're going to be talking about ranking factors and the theory behind them, and hopefully get past some of these, let's say, controversies that have come up over the years, when we've really just been talking past one another.
You see, ranking factors have been with us since pretty much day one of search engine optimization. We have been trying as SEOs to identify exactly what influences the algorithm. Well, that's what we're going to go over today, but we're going to try and tease out some better definitions and vocabulary so that we're not talking past one another, and we're not constantly beating each other over the heads about correlation and not causation, or some other kind of nuance that really doesn't matter.
So let's begin at the beginning with direct ranking factors. This is the most narrow kind of understanding of ranking factors. It's not to say that it's wrong; it's just pretty restrictive. A direct ranking factor would be something that Google measures and that directly influences the performance of the search result.
So a classic example would actually be your robots.txt file. If you make a change to your robots.txt file, and let's say you disallow Google, you will have a direct impact on your performance in Google. Namely, your site is going to disappear.
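For instance, a robots.txt like this deliberately extreme example tells Googlebot to stay away from the entire site:

```
User-agent: Googlebot
Disallow: /
```

Deploy that at the root of a domain and, once Google recrawls it, the site's pages drop out of Google's index.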
The same is true for the most part with relevancy. Now, we might not know exactly what it is that Google is using to measure relevancy, but we do know that if you improve the relevancy of your content, you're more likely to rank higher. So these are what we would call direct ranking factors. But there's obviously a lot more to it than that.
Google has added more and more features to their search engine. They have changed the way that their algorithm has worked. They've added more and more machine learning. So I've done my best to try and tease out some new vocabulary that we might be able to use to describe the different types of ranking factors that we often discuss in our various communities or online.
Now, obviously, if there are direct ranking factors, it seems like there should be indirect ranking factors. And these are just once-removed ranking factors or interventions that you could take that don't directly influence the algorithm, but they do influence some of the direct ranking factors which influence the algorithm.
I think a classic example of this is hosting. Let's say you have a site that's starting to become more popular and it's time to move off of that dollar-a-month cPanel hosting that you signed up for when you first started your blog. Well, you might choose to move to, let's say, a dedicated host that has a lot more RAM and CPU and can handle more threads so everything is moving faster.
Time to first byte is faster. Well, Google doesn't have an algorithm that's going out and digging into your server and identifying exactly how many CPU cores there are. But there are a number of direct ranking factors, those that are related perhaps to user experience or perhaps to page speed, that might be influenced by your hosting environment.
Subsequently, we have good reason to believe that improving your hosting environment could have a positive influence on your search rankings. But it wouldn't be a direct influence. It would be indirect.
The same would be true with social media. While we're pretty sure that Google isn't just going out and saying, "Okay, whoever is the most popular on Twitter is going to rank," there is good reason to believe that investing your time and your money and your energy in promoting your content on social media can actually influence your search results.
A perfect example of this would be promoting an article on Facebook, which later gets picked up by some online publication and then links back to your site. So while the social media activity itself did not directly influence your search results, it did influence the links, and those links influenced your search results.
So we can call these indirect ranking factors. For politeness' sake, please, when someone talks about social media as a ranking factor, just don't immediately assume that they mean that it is a direct ranking factor. They very well may mean that it is indirect, and you can ask them to clarify: "Well, what do you mean? Do you think Google measures social media activity, or are you saying that doing a better job on social is likely to influence search results in some way or another?"
So this is part of the process of teasing out the differences between ranking factors. It gives us the ability to communicate about them in a way in which we're not, let's say, confusing what we mean by the words.
Now, the third type is probably the one that's going to be most controversial, and I'm actually okay with that. I would love to talk in either the comments or on Twitter about exactly what I mean by emergent ranking factors. I think it's important that we get this one clear in some way, shape, or form because I think it's going to be more and more and more important as machine learning itself becomes more and more and more important as a part of Google's algorithm.
Many, many years ago, search engine optimizers like myself noticed that web pages on domains with strong link authority seemed to do well in organic search results, even when the page itself wasn't particularly good, didn't have particularly good external links (or any at all), and didn't even have particularly good internal links.
That is to say it was a nearly orphaned page. So SEOs started to wonder whether or not there was some sort of domain-level attribute that Google was using as a ranking factor. We can't know that. Well, we can ask Google, but we can only hope that they'll tell us.
So at Moz, what we decided to do was try and identify a series of domain-level link metrics that actually predict the likelihood that a page will perform well in the search results. We call this an emergent ranking factor, or at least I call it an emergent ranking factor, because it is obviously the case that Google does not have a specific domain-authority-like feature inside their algorithm.
At the same time, they do have a lot of data about links pointing to different pages on that same domain. What I believe is going on is what I would call an emergent ranking factor: the combined influence of several different metrics, none of which was specifically intended to create such a signal, ends up being easier to measure and talk about as a single emergent factor than as all of its constituent elements.
Now, that was kind of a mouthful, so let me give you an example. If you're cooking a sauce, one of the most common steps is the production of a roux. A roux is a mix, normally of equal weights of flour and fat, that you use to thicken the sauce.
Now, I could write an entire recipe book about sauces and never use the word "roux": just never use it, and describe the process of producing a roux a hundred times without actually naming it, because "roux" describes an intermediate state. But it becomes very, very useful as a chef to be able to simply say to another chef (or a sous-chef, or a cook reading your cookbook), "produce a roux out of" whatever particular fat you're using, whether it's butter or oil or something of that sort.
So the analogy here is that there isn't really a thing called a roux that's inside the sauce. What's in the sauce is the fat and the flour. But at the same time, it's really convenient to refer to it as a roux. In fact, we can use the word "roux" to know a lot about a particular dish without ever talking about the actual ingredients of flour and of fat.
For example, we can be pretty confident that if a roux is called for in a particular dish, that dish is likely not bacon because it's not a sauce. So I guess what I'm trying to get at here is that a lot of what we're talking about with ranking factors is using language that is convenient and valuable for certain purposes.
DA, for example, is valuable for helping predict search results, but it doesn't actually have to be part of the algorithm to do that. In fact, I think there's a really interesting example going on right now, one where we're about to see a shift between these categories: Core Web Vitals.
Google has been pushing page speed for quite some time and has provided us several iterations of different types of metrics for determining how fast a page loads. However, what appears to be the case is that Google has decided not to promote individual, particular steps that a website could take in order to speed up, but instead wants you to maximize or minimize a particular emergent value that comes from the amalgamation of all of those steps.
We know that the three Core Web Vitals are: first input delay, largest contentful paint, and cumulative layout shift. So let's talk about the third one. If you've ever been on your cell phone and noticed that the text loads before certain other elements, and you start reading it and try to scroll down, and as soon as you put your finger there an ad pops up because the ad took longer to load and it jostles the page. Well, that's layout shift, and Google has learned that users just don't like it. So, even though they don't know all of the individual factors underneath that are responsible for cumulative layout shift, they know that there's this one measurement that explains all of it, that is great shorthand, and a really effective way of determining whether or not a user is going to enjoy their experience on that page.
This would be an emergent ranking factor. Now, what's interesting is that Google has now decided that this emergent ranking factor is going to become a direct ranking factor in 2021. They're going to move these descriptive factors that are amalgamations of lots of little things and make them directly influence the search results.
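To make the three metrics concrete: Google's published "good" thresholds at the time of writing (per web.dev; they may change) can be sketched as a simple pass/fail check. The function below is illustrative only, not any official API:

```python
# "Good" thresholds for the three Core Web Vitals as published by Google
# around 2020 (see web.dev) -- verify before relying on them.
GOOD_THRESHOLDS = {
    "lcp_seconds": 2.5,  # Largest Contentful Paint
    "fid_ms": 100,       # First Input Delay
    "cls": 0.1,          # Cumulative Layout Shift (unitless score)
}

def passes_core_web_vitals(lcp_seconds: float, fid_ms: float, cls: float) -> bool:
    """True only if all three metrics fall within the 'good' range."""
    return (
        lcp_seconds <= GOOD_THRESHOLDS["lcp_seconds"]
        and fid_ms <= GOOD_THRESHOLDS["fid_ms"]
        and cls <= GOOD_THRESHOLDS["cls"]
    )

print(passes_core_web_vitals(2.1, 80, 0.05))  # True: all three are "good"
print(passes_core_web_vitals(2.1, 80, 0.25))  # False: too much layout shift
```

Note the all-or-nothing shape: the emergent measurement summarizes many underlying causes, which is exactly the point Russ is making.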
So we can see how these different types of ranking factors can move back and forth between categories. Back to the question of domain authority. Now, Google has made it clear they don't use Moz's domain authority (of course they don't), and they do not have a domain-authority-like metric. However, there's nothing to say that at some point they could not build exactly that: some sort of domain-level, link-based metric used to inform how to rank certain pages.
So an emergent ranking factor isn't stuck in that category. It can change. Well, that's enough about emergent ranking factors. Hopefully, we can talk more about that in the comments.
The next type I wanted to run through is what I would call a validating ranking factor. This is another one that's been pretty controversial, which is the Quality Rater Guidelines' list of things that matter, and probably the one that gets talked about the most is E-A-T: Expertise, Authoritativeness, and Trustworthiness.
Well, Google has made it clear that not only do they not measure E-A-T (or at least, as best I understand, they don't have metrics specifically targeted at E-A-T), but also, when they collect data from quality raters on whether the SERPs they're looking at meet these qualifications, they don't train their algorithm against the labeled data that comes back from those raters, which, to me, is surprising.
It seems to me like if you had a lot of labeled data about quality, expertise, and authoritativeness, you might want it trained against that, but maybe Google found out that it wasn't very productive. Nevertheless, we know that Google cares about E-A-T, and we also have anecdotal evidence.
That is to say, webmasters have noticed over time, especially in "your money or your life" types of industries, that expertise and authority do appear to matter in some way, shape, or form. So I like to call these validating ranking factors because Google uses them to validate the quality of the SERPs and the sites that are ranking, but doesn't actually use them in any kind of direct or indirect way to influence the search results.
Now, I've got an interesting one here, which is what I would call user engagement, and the reason why I've put it here is because this remains a fairly controversial ranking factor. We're not quite sure exactly how Google uses it, although we do get some hints every now and then, like Core Web Vitals.
If that data is collected from actual user behavior in Chrome, then we've got an idea of exactly how user engagement could have an indirect impact on the algorithm: real user behavior is what produces the Core Web Vitals measurements, which, coming in 2021, are going to directly influence the search results.
So validating is this fourth category of ranking factors, and the last, the one that I think is the most controversial, is correlates. We get into this argument every time: "correlation does not equal causation." It seems to be the one statement that the person who knows only one thing about statistics knows, and so they say it whenever anything about correlation ever comes up.
Yes, correlation does not imply causation, but that doesn't mean it isn't very, very useful. So let's talk about social metrics. This is one of the classic ones. Several times we've run studies of ranking factors and discovered a strong, direct relationship between rankings and things like Facebook likes or Google +1s.
All right. Now, pretty much everyone immediately understood that the reason why a site would have more plus-ones in Google+ and would have more likes in Facebook would be because they rank. That is to say, it's not Google going out and depending on Facebook's API to determine how they're going to rank the sites in their search engine.
On the contrary, performing well in their search engine drives traffic, and that traffic then tends to like the page. So I understand the frustration there when customers start asking, "Well, these two things correlate. Why aren't you getting me more likes?"
I get that, but it doesn't mean that it isn't useful in other ways. So I'll give you a good example. If you are ranking well for a keyword, yet your social media metrics are poorer than your competitors', well, it means that something is going on in that situation that is making users engage better with your competitors' sites than your own, and that's important to know.
It might not change your rankings, but it might change your conversion rate. It might increase the likelihood that you get found on social media. Even more, it could eventually influence your search results: when you recognize that the reason you're not getting any likes is broken code (the Facebook button isn't working), and you fix it, and more and more people start sharing, engaging with, and linking to your content, well, then we start having that indirect effect on your rankings.
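The kind of relationship those studies report can be illustrated with a quick Pearson correlation. The rank positions and like counts below are made up, but they show how a strong association can show up even when the likes are an effect of ranking well rather than a cause of it:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson's r: covariance of xs and ys divided by the product
    of their standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: rank position 1..5 and Facebook likes per page.
# Better-ranking pages get more traffic, so they collect more likes.
ranks = [1, 2, 3, 4, 5]
likes = [950, 640, 410, 220, 90]

r = pearson(ranks, likes)
print(round(r, 2))  # strongly negative: the better the rank, the more likes
```

A correlation this strong tells you nothing by itself about which way the causation runs; that's exactly the trap described above.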
So, yeah, correlation isn't the same as causation, but there's a lot of value there. There's a new area that I think is going to be really, really important for this: natural language processing metrics. These are various technologies, some older, some newer, on the cutting edge, and they allow us to roughly predict how good content is.
Now, chances are we are not going to guess the exact way that Google is measuring content quality. I mean, unless a leaked document or something shows up, we're probably not going to get that lucky. But that doesn't mean we can't be really productive if we have a number of correlates, and those correlates can then be used to guide us.
So I drew a little map here to kind of serve as an example. Imagine that it's the evening and you're camping, and you decide to go on a quick hike, and you take with you, let's say, a flag or a series of flags, and you mark the trail as you go so that when it gets later, you can flick on your flashlight and just follow the flags, picking them up, to lead you back to camp.
But it gets super dark, and then you realize you left your flashlight back at camp. What are you going to do? Well, we need to find a way to guide ourselves back to camp. Now, obviously, the flags would have been the best situation, but there are lots of things that are not the camp itself and are not the path itself, but would still be really helpful in getting us back to camp. For example, let's say that you had just put out the fire after you left camp. Well, the smell of the smoke is a great way for you to find your way back to the camp, but the smoke isn't the camp. It didn't cause the camp. It didn't build the camp. It's not the path. It didn't create the path. In fact, the trail of smoke itself is probably quite off the path, but once you do find where it crosses your path, you can follow that scent. Well, in that case, it's really valuable even though it only mildly correlates with exactly where you need to get.
Well, the same thing is true when we're talking about something like NLP metrics or social media metrics. While they might not matter in terms of influencing the search results directly, they can guide your way. They can help you make better decisions. The thing you want to stay away from is manipulating these types of metrics for their own sake, because we know that correlates are the furthest away from direct ranking factors, at least when we know that the correlate itself is not a direct ranking factor.
All right. I know that's a lot to stomach, a lot to take in. So hopefully, we have some material for us to discuss below in the comments, and I look forward to talking with you more. Good luck. Bye.