If you start a campaign in Moz, go to Page Optimization, enter a URL and keyword, and scroll to the bottom where it says "Content Suggestions," is that basically a TF-IDF analysis? I want to make sure I understand how that works. Thanks!
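For context, my rough mental model of a TF-IDF analysis is the minimal sketch below (made-up text, using scikit-learn); I don't know whether Moz actually computes it this way, which is why I'm asking.

```python
# Minimal TF-IDF sketch: score the terms on my page against a small
# corpus of top-ranking pages to see which terms my page under-uses.
from sklearn.feature_extraction.text import TfidfVectorizer

competitor_pages = [
    "best running shoes for marathon training with extra cushioning",
    "best trail running shoes with arch support for distance runners",
]
my_page = "buy running shoes online from our running shoes store"

vectorizer = TfidfVectorizer()
vectorizer.fit(competitor_pages + [my_page])

scores = vectorizer.transform([my_page]).toarray()[0]
for term, idx in sorted(vectorizer.vocabulary_.items()):
    print(f"{term}: {scores[idx]:.3f}")  # 0.000 means my page never uses the term
```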
brettmandoes
@brettmandoes
Job Title: SEO and Web Planner/Analyst
Company: Strategic America
Favorite Thing about SEO
Gathering and evaluating data
Latest posts made by brettmandoes
-
Is the Content Suggestions section under Page Optimization a TF-IDF Analysis?
-
How Can I Batch Upload URLs to get PA for many pages?
Howdy folks, I'm using advanced search operators to generate lists of long tail queries related to my niche and I'd like to take the batch of URLs I've gathered and upload them in a batch so I can see what the PA is for each URL. This would help me determine which long tail query is receiving the most love and links and help inform my content strategy moving forward.
But I can't seem to find a way to do this. I went to check out the Moz API, but it's a little confusing. It says there's a free version, but then it looks like it's actually not free, and when I try to use it, it says I've gone over my limit even though I haven't used it yet.
If anyone can help me with this, I'd really appreciate it. If you're familiar with SEMrush, they have a batch analysis tool that works well, but I ideally want to upload these URLs to Moz because it's better for this kind of research. Thanks!
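For reference, this is roughly the kind of batch lookup I'm after - a hypothetical sketch against the Moz Links API v2 url_metrics endpoint. The endpoint, response field names, and credentials here are my assumptions from the docs, not tested code.

```python
# Hypothetical batch Page Authority lookup via the Moz Links API v2.
# ACCESS_ID / SECRET_KEY are placeholders; the endpoint and response
# fields are assumptions based on Moz's public API docs.
import requests

ACCESS_ID = "your-access-id"
SECRET_KEY = "your-secret-key"

urls = [
    "https://example.com/long-tail-page-1",
    "https://example.com/long-tail-page-2",
]

resp = requests.post(
    "https://lsapi.seomoz.com/v2/url_metrics",
    json={"targets": urls},
    auth=(ACCESS_ID, SECRET_KEY),  # HTTP Basic auth
)
resp.raise_for_status()

for result in resp.json()["results"]:
    print(result["page"], result["page_authority"])
```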
-
RE: Optimal URL Structure for a Multi-City Directory
I think you should consider how your users are interacting with your website and how they search for your services/products/locations and follow that. For example, Yelp is focused on local reviews. People will filter first to their city, then the category naturally. You would never filter down to restaurants first, because if you're in Huntington Beach, CA you really don't care what's in Portland, OR. If location is secondary to your product, then it makes sense to start with the category. For example, let's say you sell ATVs and other off-road vehicles and gear, but some showrooms only have ATVs while others also carry dirt bikes. Customers who are looking for a dirt bike care more about reaching a showroom with dirt bikes, so that category structure would be preferable.
Note that I'm assuming in both of the above examples that your navigation follows the structure of your website for usability purposes. In terms of structure, one way is not inherently better than the other from a ranking/algorithm perspective, but if your structure is confusing it can be detrimental to SEO. For example, outreach is a lot harder if you have a garbage navigation that contributes to poor user experience on your website. Any piece of Google's algorithm that measures user satisfaction with your website (RankBrain, pogo-sticking, etc.) will affect you directly or indirectly depending on how user-friendly your website is.
One last thing: in both instances you have the geography in the URL, so if you're hoping for a boost for local phrases from an exact match URL I think you're already tapping that. EMDs are nowhere near as effective as they were in years past, so I wouldn't make that my focus.
-
RE: Differentiating Franchise Location Names to better optimize locations
Hi Jeff, I think I can help you with this, but to clarify, it looks like you have three separate questions:
1. What is best practice for naming different locations to optimize for local SEO?
2. What is the best URL structure to optimize for local SEO?
3. Should geo-specific terms be used in blogs?
Be sure to let me know if I'm missing the mark. I'm also going to go heavy on industry jargon and assume you know what it means, so feel free to ask questions if I go over your head at any point.
1. For local SEO, it's important to start with a good foundation. This means you have citations claimed for each location with consistent NAP information on your GMB profile, your listings, and the landing page on your website for that location. So if your name includes the geo on the website, it should also include the geo on your GMB profile and citations. It's preferable to use the specific city each location is in. For example, if you're in Flower Mound, TX, be sure to use Flower Mound, not Dallas. Some local SEOs get tripped up by targeting the metro area they're in, and that can tank results. If some of your locations are in the same city, dividing them up somehow as North/South, East/West, etc. is fine. Google typically picks one or both in those circumstances to display in search.
2. For URL structure, using subpages the way you have them laid out is fine. For enterprise local SEO my agency uses a proprietary, scalable CMS to build unique, local websites that rank very well, so I'm more familiar with that structure, but one of the tricks we use is to include a geo variable in the URL, which helps rank for some terms like "glass repair dallas tx", because we can get picked up on the exact match. Every little bit helps.
3. For blogs, I would recommend you completely ignore the geo unless your blog post is very unique and specific to the location. You should really only target the location on a page that you're trying to rank for local queries, and you typically don't have that in a blog. For example, a blog post about "what to expect in a hundred-year-old house" will typically not rank for keywords that trigger the local algorithm, so there's no reason to add the geo. It just gets in the way of the content, and inferior content doesn't rank well. Now, a post like "what to plant in your [location] fall garden" may well have some localization to it, because what you plant in the fall in Des Moines is different from what you plant in Atlanta. But I find these cases to be few and far between.
Hope that helps, let me know if you have questions.
-
RE: Hi. One of our competitors is ranking ahead of us on Google. Our site has a much stronger authority and much more quality links than this competitor. Would anyone have any explanations for this? Thanks
Hi barryhq, Google has in the past called the top three components of their algorithm Content, Links, and RankBrain, without naming a particular order. It sounds like you've worked hard on links, so good job! Your problem is therefore potentially related to content (ignore RankBrain, you can't really optimize for it).
So let's talk about content. When you do your keyword research, try to focus on the searcher's intent. What tasks are they trying to solve that Google is surfacing results for but that aren't addressed on your webpage? You can learn this by studying other top results, related searches, People Also Ask, etc. For example, if one of your keywords is "best running shoes," then you know people are comparison shopping, and including content that compares the top running shoes on your page will help you rank for that. And it's entirely possible that you'll discover in this process that your website is not a suitable match for the keywords you've targeted. I've seen this happen with clients who pick a phrase without doing the research required to make an informed decision and end up targeting something they can never rank for.
It's also possible that you have technical SEO issues, like canonicalization or poor internal link structure or cannibalization that's making it harder to rank, but assuming that your technical SEO game is on point I would recommend focusing on content.
-
RE: Footer no follow links
Hi seoman, it's definitely outdated and was never accurate to begin with. The "nofollow" attribute was always designed to be applied to external links, and modern advice is to never apply nofollow to your own internal links. If you're concerned about a page like your homepage passing authority into your footer links instead of more important pages, you should know that Google tags the links on your site so that they're weighted differently, i.e. a link in your body content is worth more than a link in your footer, image links don't pass as much authority, etc.
In short, I don't think you're going to move the needle by altering your footer links to nofollow.
-
RE: 301 Redirect and Canonical link tag pointing in opposite directions!
Canonicals are not absolute directives, so Google will eventually sort out which of the two signals is more important. My guess is that the redirect takes precedence: if Google displayed the canonical URL to a user in search, it would be showing a URL that sends users through a redirect, which is a poor experience, and they take pains not to do that.
When there are confusing signals like this out there, Google will do its best to sort out these issues and John Mueller has repeatedly stated "we do a pretty good job" at figuring it out, but he almost always adds a disclaimer that it's "better" to have a less confusing structure.
In plain English, it's not a catastrophic error, but it's something you need to clean up as part of your optimization efforts.
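If you want to catch conflicts like this at scale, a quick diagnostic script can surface them. A minimal sketch (hypothetical URL; the regex assumes rel appears before href in the canonical tag):

```python
# Fetch a page, follow any redirect, and compare the final URL against
# the rel=canonical tag in the returned HTML.
import re
import requests

def check_canonical_vs_redirect(url: str) -> None:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    final_url = resp.url  # where the redirect chain actually lands
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', resp.text
    )
    canonical = match.group(1) if match else None
    if canonical and canonical != final_url:
        print(f"Conflict: {url} lands at {final_url}, canonical says {canonical}")
    else:
        print(f"OK: {url}")

check_canonical_vs_redirect("https://example.com/page-a")
```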
-
RE: Proper URL Structure. Feedback on Vendors Recommendation
Hi there, I've got a few thoughts to drop about this, but I want to make sure I answer your specific question first, then answer what I think are the lead-up or follow-up questions that are either on your mind or that you'll arrive at eventually anyway.
There are specific instances where you may favor one URL structure over the other. For example, our landing pages are similar to your current structure, and the rest of the website is more similar to your vendor's proposed structure. Folders are a great way to categorize your content and help both Google and users navigate and understand it. However, you do not want to lose the hyphens. Losing them makes the URL difficult for users to read in search when they're deciding on a page to view, and it can be difficult for Google to read too.
Let's say your URL has an acronym in it - maybe you're writing about basketball and NBA is in the URL. So your URL becomes website.com/sports/hownbaistakingcharge or website.com/sports/basketballnbakobe. Is either of those readable? You have two stakeholders, Google and users, and your URL structure should support both. Compare the above to website.com/sports/how-nba-is-taking-charge or /basketball-nba-kobe. That's much better for Google, because it can clearly read the different words and make sense of them, and it's much better for users who are trying to quickly scan the URL in search. I would push back on the vendor: the hyphenation is necessary.
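If your CMS doesn't hyphenate slugs for you, it's trivial to script. A minimal sketch (a hypothetical helper, not any particular CMS's API):

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a readable, hyphenated URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumerics to hyphens
    return slug.strip("-")

print(slugify("How NBA Is Taking Charge"))  # -> how-nba-is-taking-charge
```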
I've listed a few other questions below that I would have for my vendor and team if we were proposing a major restructuring of the site's content.
A new URL structure means a few other things will likely change.
1. Have you thought about creating a redirect map for every page that is going to move? (See the sketch after this list.)
2. How will the new URL structure interact with breadcrumbs on your site?
3. If you move to folders, will you need to create head pages? E.g., website.com/sports/how-nba-is-taking-charge sits under a main "sports" page that maybe doesn't exist yet. You WILL have users who attempt to reach the head page whether it exists or not, and they'll be sent to a 404 instead.
4. Will changing your URL structure alter your main and sub-navigation elements on the site? (In almost every instance, it should.)
And then my final question, knowing how much work it is to take a healthy site and improve it by changing the URL structure alone: what is the expected value? Why are we doing this? Sometimes there's a legitimate reason and sometimes it's pure vanity. The SEO upside to a major restructuring like this isn't normally enormous, but the effort involved can be titanic. So be sure your expectations are realistic going into it and get the details fleshed out as much as possible ahead of time.
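On the redirect map in question 1, the deliverable can be as simple as a spreadsheet of old-to-new pairs your developers turn into 301 rules. A minimal sketch with hypothetical paths:

```python
# Write a redirect map (old path -> new path) to CSV for the dev team.
import csv

redirect_map = {
    "/hownbaistakingcharge": "/sports/how-nba-is-taking-charge",
    "/basketballnbakobe": "/sports/basketball-nba-kobe",
}

with open("redirects.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["old_path", "new_path", "status"])
    for old, new in redirect_map.items():
        writer.writerow([old, new, 301])
```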
Best of luck, let me know if I can answer any more questions.
-
RE: Related Keywords: How many separate pages?
Instead of trying to group pages by keyword, try thinking about searcher intent and task accomplishment. Can you write one comprehensive page that addresses the searcher's needs and includes all the keywords? Or does it make more sense to break into a couple different areas, such as a page that's specific to a plaintiff and a page specific to a defendant?
Try this: create a Venn diagram of the different audiences that may visit the section of the site you're contemplating building out, group the keywords you suspect each audience would use, and see where the overlap is. If there are areas that are completely blank, you don't need a page for that specific audience or task. Doing this will help you determine which pages need to cover which keywords for the right audience. For example, for an optometrist there are probably searches involving "contacts", "glasses", and "lasik". You might be able to address all three on the same page, but it's probably a horrible experience for someone who is just looking for a specific eyeglass style to wade through long text about the benefits of lasik. There's very little overlap there because the audiences and intents may be different, so they get different pages, and that shows up in the Venn diagram.
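If you'd rather script it than draw it, the same exercise works with plain sets. A minimal sketch with made-up optometry keywords:

```python
# Group keywords by audience, then check pairwise overlap: shared keywords
# can live on one page; no overlap suggests separate pages.
audiences = {
    "contacts": {"daily contacts", "contacts fitting", "eye exam cost"},
    "glasses": {"eyeglass styles", "glasses frames", "eye exam cost"},
    "lasik": {"lasik cost", "lasik recovery"},
}

names = list(audiences)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        overlap = audiences[a] & audiences[b]
        print(f"{a} / {b}: {sorted(overlap) or 'no overlap -> separate pages'}")
```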
Hope this helps!
-
How to Diagnose "Crawled - Currently Not Indexed" in Google Search Console
The new Google Search Console gives a ton of information about which pages were excluded and why, but one status I'm struggling with is "crawled - currently not indexed". I have some clients that have fallen into this pit, and I've identified one reason why it's occurring on some of them - they have multiple websites covering the same information (local businesses) - but on others I'm completely flummoxed.
Does anyone have any experience figuring this one out?
Best posts made by brettmandoes
-
RE: Quick Fix to "Duplicate page without canonical tag"?
The simplest solution would be to mark every page in your test environment "noindex". This is normally standard operating procedure anyway, because most people don't want customers stumbling across the wrong URL in search by mistake and seeing a buggy page that isn't supposed to be "live".
Updating your robots.txt file would tell Google not to crawl the page, but if they've already crawled it and added it to their index it just means that they will retain the last crawled version of the page and will not crawl it in the future. You have to direct Google to "noindex" the pages. It will take some time as Google refreshes the crawl of each page, but eventually you'll see those errors drop off as Google removes those pages from their index. If I were consulting a client I would tell them to make the change and check back in two or three months.
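How you apply the noindex depends on your stack. As one illustration only (assuming a Python/Flask test site, which may not match yours), you could blanket the whole environment with an X-Robots-Tag header; a robots meta tag in your templates accomplishes the same thing:

```python
# Blanket-noindex a staging environment by adding an X-Robots-Tag header
# to every response. Flask is an assumption here; any server or CDN can
# set the same header in its config instead.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def home():
    return "staging homepage"

@app.after_request
def add_noindex(response):
    # Google must still be able to crawl the page to see this directive.
    response.headers["X-Robots-Tag"] = "noindex"
    return response

if __name__ == "__main__":
    app.run()
```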
Hope this helps!
-
RE: Where to buy high quality backlinks in 2018?
Easy way to "buy" backlinks, in style, without running afoul of webmaster guidelines.
Step 1: curate a list of all the sites you want a backlink from
Step 2: get the emails of the webmasters there - lots of tools and methods for this, both automated and manual
Step 3: use customer match on Google or Custom Audiences on Facebook to upload your email list of webmasters you picked
Step 4: use targeted ads to get your content in front of this audience
This really only works if you don't have garbage content. But basically you advertise this content only to people who actually have a website and would consider linking to you and voila. You've basically "bought" backlinks.
-
RE: Have there been any google algorithm updates in the past 2 weeks?
Anyone who's been attacked by a scraper can agree that Google isn't foolproof. We have one website that has turned invisible even though it had been ranking strongly for years. The scraper copied everything exactly, with one exception - they changed all the links on the site so nothing would point back to ours. It was a pretty nasty hit. Still waiting on the DMCA takedown.
-
RE: Desktop in http and mobile in https
This can create some real headaches. If you're going to secure a part of the site, you may as well secure the whole thing. Leaving part of the site unsecured and just securing a few pages that are transactional or used to collect customer data like physical addresses is something other sites have done, but should be considered a temporary solution while securing the rest of the site.
While I'm not sure that this implementation would create dark traffic in your Google Analytics reports, you're still leaving yourself open to MITM (man-in-the-middle) attacks and other SEO issues with a partial implementation, such as creating duplicate content. I'm dealing with this issue right now with a couple of clients and I can share one of the headaches we're experiencing.
Mixed sitemap URLs! Some URLs are in https and others are in http, because they've managed to confuse the CMS (don't ask, I'm not sure what they did yet). On top of that, duplicate content is created with every new page, because the CMS now creates a page in http and a page in https. The dynamic XML sitemap then picks one and adds it. It gets worse, but I'll end it there.
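If you suspect the same problem, it's easy to audit: pull the sitemap and flag any insecure entries. A minimal sketch (hypothetical sitemap URL):

```python
# Parse an XML sitemap and flag entries still on http://.
import xml.etree.ElementTree as ET
import requests

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

resp = requests.get("https://example.com/sitemap.xml", timeout=10)
root = ET.fromstring(resp.content)

for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    if url.startswith("http://"):
        print("Insecure sitemap entry:", url)
```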
You can avoid all this by securing everything, and you'll have the optional benefit of upgrading the site to HTTP/2 if you secure the whole thing first.
-
RE: Is hiring bloggers to review my products while back linking to my website bad for SEO?
Yes, Google can penalize you for hiring bloggers. They view this as a link scheme (source: https://support.google.com/webmasters/answer/66356), and it's something that SEOs have been doing for a long time. Google has gotten better and better at catching these types of schemes, so it would not surprise me in the least if you were caught by Penguin, which now operates in real time and simply devalues links instead of putting your entire domain into a sandbox.
To keep your strategy in line with Google's webmaster guidelines, you should request that those links be marked "nofollow". There is still a lot of value to this kind of outreach even with a "nofollow" added, so I wouldn't recommend you necessarily stop doing what you're doing as the bloggers have been honestly informing their audience that an editorial review was requested.
Some of the ways nofollow links can still help you:
1. quality referral traffic
2. backlinks generating backlinks over time
Best of luck, let me know if you have additional questions. Thanks!
-
RE: Adding a secondary keyword or other keyword variation to the title tag affect ranking for primary keyword?
Hello Raymond, I'll answer your second question first because the explanation is a bit lengthier.
There has been a lot of debate over the years about how accurate the information in Search Console is, particularly some of the information in Search Analytics. Personally, I found the information to be misleading at best and inaccurate at worst. As an example, I searched through the impressions one of my client's small local sites had received over a period of thirty days and discovered the client was ranking for some misspelled long-tail keywords that had somehow received hundreds of impressions. I then checked the volume of those keywords in AdWords and saw immediately that there was a discrepancy between what AdWords reported (an extremely low yearly volume) and Search Console (extremely high monthly impressions).
So either A) impressions means what we traditionally know it to mean - someone typed in a keyword, landed in the SERP, and saw the listing - which would mean the information is inaccurate (possibly tracking bot traffic?), or B) impressions means something else entirely in Search Console, which means it's misleading.
This was a roundabout way of saying don't take Search Console data as gospel. There isn't a clear cut guide on how the information is gathered and disseminated, meaning you cannot quality check it for accuracy the way you can with Google Analytics.
To answer your first question: adding a secondary keyword that's relevant is unlikely to negatively impact you. I would find it highly suspect for Google to rank you lower for adding MORE content to your website, and the additional keyword could improve CTR, which has been associated with higher rankings for a while now as a sort of positive signal with a short half-life. Just make sure it makes sense for your users and the topics are related. "Tree removal and tree trimming" is acceptable, whereas "Tree removal and ice cream" would be confusing, and I would be unsurprised to see a negative impact from that.
-
RE: Spam Flags on my minutedrone.com
Let's start with some easy things to fix, shall we?
First, thin content. I crawled your site with Screaming Frog and found you have 482 internal pages. That's a good size! But 80 of those pages have fewer than 300 words, and some of them have as few as 18. My French is a little rusty, but most of the pages with a low word count look like contact pages. So while not strictly necessary for the user experience, it can't hurt to add some text that's easily readable by users and bots in HTML. At the very least, provide a call to action that represents your brand so people are more likely to fill out your form. This should help reduce or remove your "thin content" warning.
The "No Contact Info" warning is easily fixed. Add social links to your footer and an email address where users can contact you on your contact page.
Ignore the "Low number of pages found" warning for the time being. You should check this against Google Search Console to see if Google is finding all the pages you've listed in your sitemap. If Google says they've found everything and indexed it, you're probably pretty safe. When I crawled it with Screaming Frog, I found that all the pages returned either 200 or 301 status codes, so this may just be something wonky with the Moz Crawler. You can also check how Googlebot sees your website with a tool such as httpstatus.io by setting the user agent to Googlebot.
A couple other notes:
1. You did a great job specifying the language for your English pages, but you should add hreflang tags to your French pages (see the sketch at the end of this post).
2. You have both https and http pages being generated. Make sure you're only creating pages in https. This is complex and beyond anything I can easily describe in a response here, so you'll have to work on this with your web developer. Once you have everything in https, go ahead and enable HTTP/2 for your website. It will improve security and speed.
3. Finally, check your link profile with a tool like OSE. If you're being linked to from shady websites with suspicious anchor text, you'll want to disavow those links. For anything from Russia, check to make sure they're not scraping your site. I've been getting a lot of that junk with my clients.
That's what I got from a quick browse. I think you will reduce your spam score sufficiently by following the steps listed above, as well as improve your site's security, speed, and user experience. Good luck!
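On the hreflang point from item 1, each language version of a page should reference itself and its counterpart. A minimal sketch that prints the tag pair for matched EN/FR pages (hypothetical URLs):

```python
# Print the hreflang tag pair that both the EN and FR page should carry
# in their <head>.
pages = [
    ("https://example.com/en/about-us", "https://example.com/fr/a-propos"),
]

for en_url, fr_url in pages:
    for lang, href in (("en", en_url), ("fr", fr_url)):
        print(f'<link rel="alternate" hreflang="{lang}" href="{href}" />')
```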
-
RE: When trying to sculpt an internal link structure, is there any point in placing text links to top level pages that are already in the main menu?
There is value. With every link you place throughout your website, you are indicating to Google that you consider Page A or Page B to be more or less valuable to your users.
We know that Google places different values on backlinks based on the placement of the link. For example, a link in the main body content will pass along PageRank, and so will a link in a footer, but the value is modified as it passes through, on the assumption that a link in the footer is of different value than a link placed in content.
I'm not sure where you heard that Google only counts the first link it finds. That's incorrect, and you can read any of the older articles from Matt Cutts' blog on how PageRank works to see why.
I believe the best case for you is to create links that go to content you value highly for your business and your users, and to place these links in logical places.
Digital marketer, SEO enthusiast, Dad of Three Boys. I love DIY projects, gardening, cooking, and camping. Proud owner of a vintage airstream camper (painted John Deere colors). Day job is with UnityPoint Health, but consultations are always available through https://www.optimizetheory.com/. Enormous fan of Olympic National Park and its many wonders.