
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • As in the title, we have a site with around 40k pages, but around a third of them are showing as "Indexed, not submitted in sitemap" in Google Search Console. We've double-checked the sitemaps we submitted, and the URLs are definitely in them. Any idea why this might be happening? Example URL with the error: https://www.teacherstoyourhome.co.uk/german-tutor/Egham Sitemap it is located in: https://www.teacherstoyourhome.co.uk/sitemap-subject-locations-surrey.xml (A quick verification sketch follows this question.)

    | TTYH
    0
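
    For reference, a quick way to confirm the URL really is in the submitted sitemap, byte for byte (hypothetical commands using the URLs above; trailing-slash or http/https differences count as different URLs):

        # Fetch the sitemap and count exact occurrences of the URL
        curl -s https://www.teacherstoyourhome.co.uk/sitemap-subject-locations-surrey.xml \
          | grep -c "https://www.teacherstoyourhome.co.uk/german-tutor/Egham"

    A count of 0 means the exact URL is absent; the report can also lag behind a recent sitemap submission.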

  • I added a script for star snippets on my website, but it does not work on my posts. You can see it at this URL: https://dlandroid.com/lucky-patcher/ When I search Google for my target keyword "Lucky patcher apk", my competitor shows up with star snippets in the SERP, but my site doesn't show them. (A sketch of the kind of markup involved follows this question.)

    | hongloanj
    1
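
    For context, star snippets generally require valid structured data along these lines; a minimal, hypothetical sketch (all values are placeholders, and Google may decline to show stars even when the markup validates):

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "SoftwareApplication",
          "name": "Lucky Patcher",
          "applicationCategory": "UtilitiesApplication",
          "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": "4.5",
            "ratingCount": "1200"
          }
        }
        </script>

    Checking the page in Google's Rich Results Test will show whether the existing script is even being picked up.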

  • I returned HTTP status code 410 for some low-quality pages on my site, intending to redirect them to the homepage. Is this useful for improving my homepage's authority? (A sketch of the two options is below.)
    My website is: Nitamoshaver.com

    | ghorbanimahan
    0
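
    A note for readers: a 410 and a redirect are mutually exclusive responses, so a page can only do one or the other. A hypothetical Apache .htaccess sketch of the two distinct options (paths are placeholders):

        # Option 1: tell crawlers the page is permanently gone (returns 410, no redirect)
        Redirect gone /old-low-quality-page

        # Option 2: permanently redirect the page to the homepage (returns 301)
        Redirect 301 /old-low-quality-page /

    Note that redirecting many thin pages to the homepage is often treated by Google as a soft 404 rather than an authority boost.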

  • For our product pages, we want to show pricing in the local currency of the visitor. I discussed this with our web developer, and he said we can create country-specific pages, so one for the UK, Australia, etc. I am afraid this solution might hurt our SEO, as Google might see it as duplicated content. What are your thoughts? The website runs on WordPress. (See the hreflang sketch below this question.)

    | Maggie.Casas
    0
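
    If the country-specific pages go ahead, hreflang annotations are the usual way to head off the duplicate-content concern; a minimal hypothetical sketch for one product page (URLs are placeholders):

        <link rel="alternate" hreflang="en-GB" href="https://example.com/uk/product/" />
        <link rel="alternate" hreflang="en-AU" href="https://example.com/au/product/" />
        <link rel="alternate" hreflang="x-default" href="https://example.com/product/" />

    Each page in the set carries the same group of tags, including a reference to itself.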

  • OK, I've been trying to piece together best practice for someone I'm working with, so here goes. The website was redesigned and the URLs changed from URL A to URL B, with 301s put in place. However, the new URL structure is not optimal. It's an e-commerce store, and all products now sit in the root folder: www.website.com/product-name A better, more organized URL structure would be: www.website.com/category/product-name I think we can all agree on that. However, I'm torn on whether it's worth changing everything again, and on how to handle the redirects. The way I see it, another change would create a redirect chain, which is not great and would reduce link equity. But keeping the products in the root with a poor structure doesn't feel great either. What to do? Any thoughts would be much appreciated! (A sketch of how to avoid chaining is below.)

    | Tomasvdw
    0
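
    If the structure does change again, chains can be avoided by pointing the oldest URLs straight at the final destination; a hypothetical Apache sketch (all paths are placeholders):

        # Original pre-redesign URL -> final URL, skipping the intermediate hop
        Redirect 301 /old-product-page /category/product-name
        # Current root-level URL -> final URL
        Redirect 301 /product-name /category/product-name

    One hop from every historical URL preserves more equity than a chain.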

  • I have been trying to filter this traffic out of my Google Analytics data, since it all seems to be spam traffic. I have had multiple instances of using this filter: Custom Filter - Exclude - Browser Size - ^\(not set\)$ (the forum preview strips backslashes, so to be explicit: there is a backslash before each parenthesis). Traffic seems to filter out appropriately at first, but then the filter stops working. On a new site with Browser Size = (not set) traffic, the filter preview doesn't appear to work either. Am I implementing the filter incorrectly? How do I successfully filter this traffic out of GA data? If I use the exact same RegEx in Google Data Studio, the filter works perfectly.

    | fuelmedical
    1

  • I have a client with a HUGE website, thousands of product pages. We don't currently have a sitemap.xml because generating one for the whole site would take so much processing power. I have thought about creating a sitemap for just the key pages on the website, but I don't want to hurt the SEO of the thousands of product pages. If your sitemap.xml contains only some of the pages on your site, will it negatively impact the other pages, which Google has indexed but which are not listed in the sitemap.xml?

    | jerrico1
    0

  • Hello, and first, sorry for my bad English; it isn't my first language.
    I have a website with 13 years of history and activity. Five months ago we received a warning that our domain provider would seize our domain because of sanctions (I live in Iran); they have seized many Iranian domains since then. I therefore decided to quickly move my website to another address, to save as much of it as possible before they took the domain.
    I moved the website successfully to a new domain and did everything needed for a clean move (301-redirected all links, changed the template, and so on). I also used the Change of Address tool in Google Search Console so Google knows my new domain and can update all of my links.
    Unfortunately, 90% of my traffic comes from Google, so we depend heavily on organic traffic.
    Since the domain change my traffic has been declining; I now get only 30% of the Google traffic I had on the old domain five months ago. (I also had some recent SEO troubles that could have deepened the decline.)
    Fortunately, the old domain was never seized, and I recently transferred it to another provider, so it is no longer in danger.
    My question is: should I move my website back to the old domain (cancel the Change of Address and use the tool again to move the new domain back to the old one)? The old domain has more than 13 years of history and many backlinks built over those years. On the new domain I still cannot get good rankings for new posts, and sometimes Google doesn't even index new articles for several days, while the old domain still ranks well (I tested a new article on the old domain to see how it performs; it was not great, but I think it still ranked better than it would on the new domain).
    My top pages and categories were redirected successfully and still rank well on Google at the new domain; my main problem is new posts, which rank poorly or go unindexed for days.
    I don't know what to do now. Are five months not enough for Google to completely transfer all the signals from my old domain to the new one? Will everything eventually transfer? What about the many backlinks pointing to the old domain (90% of which I cannot change or ask to have changed)? Will their value pass to the new domain? On the other hand, I'm afraid to move back to the old domain because I don't know how Google would behave; would all my rankings come back? Also, as far as I know, after six months the Change of Address can no longer be cancelled, so I have roughly one month to decide.
    If anyone could help or guide me, it would be life-saving, because my whole income, and my family, depend on my website... 😞

    | Milad25
    0

  • My client currently has a main website on one URL and an eCommerce site on a subdomain. The eCommerce site is not mobile friendly and has images that are too small and problematic; I believe it negates some of the SEO work we do for them. I had to turn off Google Shopping ads because the quality score was so low. That said, they are rebuilding the shopping cart on a new platform that will be mobile friendly, BUT the images are going to be tiny until they are slowly replaced over several months. Would you keep the shopping cart on a subdomain, or make it part of the main website URL? And can it negatively impact the progress we have made on the main site's SEO?

    | jerrico1
    0

  • We are in a bit of a tricky situation: a key top-level page with lots of external links has been flagged as a duplicate by Google, and we do not have any canonical tag in place. This is fine if Google passes the link juice to the page it has selected as canonical (an identical top-level page). For various reasons, we can't add a canonical tag ourselves at this moment. So my question is: does a Google-selected canonical work the same way, and pass link juice, as a user-selected canonical? Thanks!

    | Lewald1
    0

  • Hi, I'm hoping someone can provide some insight. I Google searched "citizenpath" recently and found that all of our sitelinks have identical text. The text seems to come from the site footer. It isn't using the meta descriptions (which we definitely have) or even a Google-dictated snippet from the page. I understand we don't have "control" over this. It's also worth mentioning that if you search for a specific page, like "contact us citizenpath", you get a more appropriate excerpt. Can you help us understand what is happening? This isn't helpful for Google users or CitizenPath. Did the Google algorithm go awry, or is there a technical error on our site? We use up-to-date versions of WordPress and Yoast SEO. Thanks!

    | 123Russ
    0

  • Hi, the service area pages created on my Shopify website have not been indexed by Google for a long time. I tried indexing the pages manually and also submitted the sitemap, but the pages still don't seem to get indexed.
    Thanks in advance.

    | Bhisshaun
    0

  • Hi there, we upgraded our webshop last weekend, and our Moz crawl on Monday found a lot of errors we are trying to fix. I am having some communication problems with our webmaster, so I need a little help. We have extremely long category-page URLs; does anyone have a guess as to what kind of mistake our webmaster could have made?
    https://site-name.pl/category-name?page=3?resultsPerPage=53?resultsPerPage=53 ... and it keeps repeating the string ?resultsPerPage=53 exactly 451 times, as if there were some kind of loop. (A sketch of the likely bug is below.) Thanks in advance for any kind of hint 🙂
    Kind regards,
    Isabelle

    | isabelledylag
    0
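
    A pattern like this usually means the per-page links append their parameter to the full current URL instead of rebuilding the query string from a clean base, so each crawled page adds one more copy. A hypothetical sketch of the bug and the fix (note a second parameter should be joined with & rather than ?):

        <!-- Buggy: concatenates onto whatever the current URL already is -->
        <a href="https://site-name.pl/category-name?page=3?resultsPerPage=53?resultsPerPage=53">53 per page</a>

        <!-- Fixed: one ?, parameters separated by &, built from the base URL -->
        <a href="https://site-name.pl/category-name?page=3&amp;resultsPerPage=53">53 per page</a>

    A canonical tag pointing at the clean URL limits the damage while the link generation is fixed.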

  • Absolutely no idea what is going on. All of our category/subcategory and other support pages are indexed and cached as normal, but suddenly none of our product pages are cached, and all of the product/offer schema snippets have been dropped from the SERPs as well (price, review count, average rating, etc.). When I inspect a product detail page URL in GSC, I either get errors or it is returned as a soft 404. There have been no recent changes to our website that are obvious culprits. When I request indexing, it works fine for non-product pages, but generates the "Something went wrong. If the issue persists, try again in a few hours" message for any product page submitted. We are not SEO novices. This is an Angular 7 site with a Universal version launched back in October (new site, same domain), and until this strange issue cropped up we'd enjoyed steady improvement in rankings and GSC technical issues. Has anyone seen anything like this? We are seeing rapid deterioration in rankings overnight for all product detail pages due to this issue. A term/page combination that had ranked in the top 10 for over a decade lost 10 places overnight... There's just no obvious culprit. Using Chrome dev tools to view as Googlebot, everything is kosher: no weird redirects, no errors; the page returns 200 and loads. Thank you.

    | jamestown
    0

  • As of June 1, doctor pages on our website that say "No ratings are available yet" are being marked as soft 404s in our Google Search Console. We suspect the issue is that wording, due to this post: https://www.contentkingapp.com/academy/index-coverage/faq/submitted-soft-404/ Just wondering if anyone with more expertise than me on 404s or local SEO can validate that this is likely the issue. Some examples:
    https://www.nebraskamed.com/doctors/neil-s-kalsi
    https://www.nebraskamed.com/doctors/leslie-a-eiland
    https://www.nebraskamed.com/doctors/david-d-ingvoldstad

    | Patrick_at_Nebraska_Medicine
    0

  • Our company implemented Google Shopping on our site for multiple countries, currencies, and languages. Every combination of language and country is accessible via a URL path, and for all site pages, not just the pages with products for sale. I was not part of the project. We support 18 languages and 14 shop countries. When the project was finished, we had 240 language/country combinations listed in the rel alternate hreflang tags for every page, 240 language/country combinations in our XML sitemap for each page, and unique canonicals for every one of these pages. My concern is duplicate content. I can also see odd language/country URL combinations (like a country paired with a language spoken by a very small percentage of people there) being crawled, indexed, and appearing in SERPs. This uses up my crawl budget on pages I don't care about. I don't think it is wise to disallow in robots.txt URLs that we are simultaneously listing in the XML sitemap. Is it true that Google Shopping requires an XML sitemap entry and rel alternate hreflang for every language/country combination? (A sketch of what this looks like per URL is below.)

    | awilliams_kingston
    0
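
    For scale context, hreflang carried in an XML sitemap looks roughly like this for each URL, so 240 variants means 240 xhtml:link lines repeated inside every <url> entry (hypothetical URLs):

        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:xhtml="http://www.w3.org/1999/xhtml">
          <url>
            <loc>https://example.com/en-us/page/</loc>
            <xhtml:link rel="alternate" hreflang="en-US" href="https://example.com/en-us/page/" />
            <xhtml:link rel="alternate" hreflang="fr-FR" href="https://example.com/fr-fr/page/" />
            <!-- ...one line per language/country combination... -->
          </url>
        </urlset>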

  • Here’s a situation I’ve been puzzling over for some time. The situation:
    Please consider an international website targeting 3 regions. The real site has more regions, but I simplified the case for this question. There is no default language, and the content for each regional version is meant for that region only. The website.eu page is dynamic: when there is no region cookie, the page is identical to website.eu/nl/ (because the Netherlands is the most important region); when there is a region cookie (set by a modal), there is a 302 redirect to the corresponding regional homepage. What we want:
    We want regional Google to index the correct regional homepages (e.g. website.eu/nl/ on google.nl) instead of website.eu.
    Why? Because visitors arriving at website.eu sometimes ignore the region modal and therefore browse the wrong version.
    For this, I set up canonicals and hreflangs (screenshots omitted; a sketch of the idea follows below). The problem:
    It’s been 40 days since the hreflangs and canonicals were set up, but Google is still ranking website.eu instead of the regional homepages.
    Any ideas why Google doesn’t respect our canonical? Maybe I’m overlooking something in this setup (the combination of hreflangs and canonicals might be confusing)? Should I remove the hreflangs on the dynamic page, because there is no self-referencing hreflang? Or maybe it’s because website.eu has gathered a lot of backlinks over the years, whereas the regional homepages have far fewer, which might be why Google chooses to ignore the canonical signals? Or maybe it’s a matter of time and I just need to wait longer? Note: I’m aware the language subfolders (e.g. /be_nl) are not in line with Google’s recommendations, but I’ve seen similar setups (like adobe.com and apple.com) where the regional homepage shows up fine. Any help appreciated!

    | dmduco
    0
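
    A sketch of the setup the question describes, as it would typically look on one regional homepage (self-referencing canonical plus the full hreflang set; the URLs follow the pattern in the question but are otherwise assumptions):

        <!-- On https://website.eu/nl/ -->
        <link rel="canonical" href="https://website.eu/nl/" />
        <link rel="alternate" hreflang="nl-NL" href="https://website.eu/nl/" />
        <link rel="alternate" hreflang="nl-BE" href="https://website.eu/be_nl/" />
        <link rel="alternate" hreflang="fr-BE" href="https://website.eu/be_fr/" />
        <link rel="alternate" hreflang="x-default" href="https://website.eu/" />

    Because website.eu serves the /nl/ content itself while holding most of the backlinks, Google keeping it as the canonical anyway is consistent with how it weighs conflicting signals.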

  • Hi! We are trying to rank https://windowmart.ca for various local search terms. Our head office is in Edmonton, where we try to rank https://windowmart.ca/edmonton-windows-doors/ for such terms as "windows Edmonton", "replacement windows Edmonton", and "windows and doors Edmonton", among others. The website was the leader in its niche for around 2 years. Then we had some server-related issues, moved to a new server, and connected the Nitropack CDN, which really improved our Google speed test results. Recently we noticed that our rankings started to drop. Do you know if Nitropack can negatively affect local SEO rankings? Thank you!

    | vaskrupp
    0

  • Hi all, sorry for what's about to be a long-ish question, but tl;dr: has anyone else had experience with a 301 redirect at the server level between the HTTP and HTTPS versions of a site in order to maintain accurate social media share counts? This is new to me and I'm wondering how common it is. I'm having issues with this forced redirect between HTTP/HTTPS as outlined below, and am struggling to find any information that will help me troubleshoot or better understand the situation. If anyone has any recommendations for things to try or sources to read up on, I'd appreciate it. I'm especially concerned about any issues this may be causing at the SEO level, and the known unknowns. A magazine I work for recently relaunched after switching platforms from Atavist to Newspack (which runs on WordPress). Since then, we've been having some issues with 301s, but they relate to new stories that are native to our new platform/CMS and have had zero URL changes. We've always used HTTPS. Basically, the preview for any post we make linking to the new site, including these new (non-migrated) pages, shows on Facebook with "301" in the title and no image. This also overrides the social media metadata we set through Yoast Premium. I ran some of the links through the Facebook debugger, and it appears that Facebook is reading these links to our site (using https) as redirects to http that then redirect back to https. I was told by our tech support person on Newspack's team that this is intentional, so that Facebook will maintain a single share count rather than separate counts for http/https. However, this forced redirect seems to be failing if we can't post our links with any metadata. (The only reliable fix is adding a query parameter to each URL, which, obviously, still gives us inaccurate share counts.) This is the first time I've encountered this intentional redirect approach, and I've asked a few times for more information about how it's set up, just for my own edification, but all I can get is that it's managed at the server level and is designed to prevent separate share counts for HTTP and HTTPS. Has anyone encountered this method before? Can anyone either explain it to me or point me toward a resource on how it's configured, along with the pros and cons? I'm especially concerned about our SEO and how this may impact the way search engines read our site. So far nothing has come up on scans, but I'd like to stay one step ahead. (See the og:url sketch below.) Thanks in advance!

    | ogiovetti
    0
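
    For what it's worth, the standard way to consolidate share counts without protocol redirects is the og:url tag, which tells Facebook which URL is canonical for counting; a minimal sketch with placeholder URLs:

        <meta property="og:url" content="https://example-magazine.com/story-slug/" />
        <meta property="og:title" content="Story headline" />
        <meta property="og:image" content="https://example-magazine.com/images/story-image.jpg" />

    With og:url in place, shares of the http and https versions are attributed to the single canonical URL.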

  • Hi, I recently encountered a very strange problem.
    One of the pages I published on my website ranked in the top 5 for a couple of days, then completely vanished; no matter how directly I search for it, it does not appear in the results. I checked GSC and everything seems normal, but in Google Analytics I find it strange that there is no data on the page since it disappeared, and it also does not show up in the 'active pages' section no matter how many different computers I keep it open on. I have checked out to page 9 of the results and used a couple of keyword tools, and it appears nowhere! The page didn't have any backlinks, but it was unique and high quality. I have confirmed the page still exists and is still readable. Has this happened to anyone before? Any thoughts would be gratefully received.

    | JoelssonMedia
    0

  • Re: Are long URLs bad for SEO? Does the domain name count toward the character limit that is considered bad for SEO? Here is an example: "https://kesslerfoundation.org/press-release/kessler-team-tests-regenerative-approach-preventing-osteoarthritis-after-knee-injury". This is over 35 characters; however, does the count begin after or before the domain name?

    | cesaromar1973
    0

  • We speak Persian, and all of our audience searches Google in Persian. But I read in some sources that URLs should be in English. Please tell me which language to use for URL writing.
    For example, here are the two options: 1) https://ghesta.ir/blog/how-to-become-rich/
    2) https://ghesta.ir/blog/چگونه-پولدار-شویم/

    | ghesta
    0

  • Hey Mozzers! I received a duplicate content notice from my Cycle7 Communications campaign today. I understand the concept of duplicate content, but none of the suggested fixes quite seems to fit. I have four pages with HubSpot forms embedded in them. (Only two of these pages have shown up so far in my campaign.) Each page contains a title (Content Marketing Consultation, Copywriting Consultation, etc.), plus an embedded HubSpot form. The forms are all outwardly identical, but I use a separate form for each service that I offer. I'm not sure how to respond to this crawl issue: using a 301 redirect doesn't seem right, because each page/form combo is independent and serves a separate purpose. Using a rel=canonical link doesn't seem right for the same reason. Using the Google Search Console URL Parameters tool is clearly contraindicated by Google's documentation (I don't have enough pages on my site). Is a meta robots noindex the best way to deal with duplicate content in this case? (Sketch below.) Thanks in advance for your help. AK

    | AndyKubrin
    0
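
    If noindex is the chosen route, it is one line in the head of each form page; a minimal sketch:

        <meta name="robots" content="noindex, follow" />

    The follow directive keeps crawlers passing link equity through the page even while it stays out of the index.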

  • What is your favorite tool for getting a report of URLs that are not cached/indexed in Google and Bing for an entire site? Basically, I want a list of URLs not cached in Google and a separate list for Bing. Thanks, Mark

    | elephantseo
    3

  • Howdy all, we have a few pages being flagged as duplicates by Google Search Console. However, we believe the content on these pages is clearly different (for instance, they return completely different search results, have different headings, and so on). An example of two pages Google finds to be duplicates is below. If anybody can spot what might be causing the duplicate issue here, I would very much appreciate suggestions! Thanks in advance.

    | camerpon09
    0

  • subdomains redirect

    I have a client site that is getting redesigned. It's a multi-location service provider. Currently (for whatever reason) the location pages are subdomains: https://<location-name>.site.com/ In the new design the locations will be on the main domain: https://site.com/locations/<location-name> We are considering 301 redirects from the current subdomains to the new location pages on the main domain. The current subdomains are set up on a multi-site, with A records for each one in our GoDaddy account. I would like feedback on any unforeseen SEO issues that anyone might have input on. (A sketch of the redirect rule is below.)

    | ColeBField1221
    0
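
    A hypothetical Apache sketch of the rule, following the placeholder pattern in the question (note deep paths on a subdomain collapse to the location page here; map them individually if they carry equity):

        RewriteEngine On
        # Any non-www location subdomain -> its page on the main domain
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteCond %{HTTP_HOST} ^([^.]+)\.site\.com$ [NC]
        RewriteRule ^ https://site.com/locations/%1/ [R=301,L]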

  • Is there any benefit, or negative impact, to including schema for both @type WebPage and NewsArticle on the same page? The websites I work on are editorial news sites. Our CMS automatically outputs WebPage schema on every article we publish, and I want my dev to set up auto-generated NewsArticle schema. They are pretty much identical, with a few different attributes. I just want to make sure I make the right choice about adding both or removing one. (See the sketch below.)

    | DJBKBU
    0
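
    The two types can coexist cleanly when the NewsArticle references the WebPage instead of duplicating it; a minimal hypothetical sketch using @graph (URLs and headline are placeholders):

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@graph": [
            {
              "@type": "WebPage",
              "@id": "https://example-news.com/story/#webpage",
              "url": "https://example-news.com/story/"
            },
            {
              "@type": "NewsArticle",
              "headline": "Story headline",
              "isPartOf": { "@id": "https://example-news.com/story/#webpage" }
            }
          ]
        }
        </script>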

  • I have a couple sites that were penalized by Google for hosting content that made Google look bad. After a major newspaper showcased what was going on they suddenly took a major hit as if someone at Google flipped a switch and told their system not to rank the content for anything other than their brand names. The article made Google look bad because the newspaper highlighted a lot of unverified user generated accusations the reporters assumed not to be true in the context of "these accusations are mostly false, but they still show up on the first page when people search Google." I was thinking one way to fight this would simply be to host the content at a different domain, but I am concerned about the new domain being penalized as well. I don't want to completely shut down all of the original sites because some of them have brand recognition. The oldest domain is 12 years old with backlinks from several news outlets which is why the content ranked so well, but after the penalty that is only the case on Bing. I've read various articles about this tactic. Some say that you will almost always pass the penalty to the new domain if you do a 301 redirect, but the penalties at issue in those articles were for things like buying links or other black hat tactics. This is somewhat different in that I wasn't doing anything black hat, they just decided not to let the site rank for political reasons. I was hoping that maybe that type of penalty wouldn't follow it, but right now I am leaning towards simply creating a second site to syndicate articles. It will need to attribute the articles to their sources though, so they will need either no followed links or possibly a redirection script that bots cannot follow. I would really like it if I could simply change the first site to its .net or .org equivalent and 301 everything though.

    | PostAlmostAnything
    0

  • Please, I have the domain name miaroseworld.com. I 301-redirected it to one of my domain names, but I am having issues with that website, so I decided to redirect it to my new site instead. Moz is still showing the redirect to the previous website, though.
    Even when I change the redirect in Search Console, it still shows as redirecting to the previous site.

    | UniversalBlog
    0

  • Hello everyone, I have my own website, crawlmyline. We have been doing SEO for the last 2 years and haven't done any spammy work on the site, yet I can't stop the spam score from rising: my website now has a 28% spam score according to Moz. I have a Moz Pro subscription but still can't understand the process. Can anyone please tell me how to decrease the spam score of a website?

    | kuldeep_chauhan
    0

  • I've read many threads online arguing that website speed is a ranking factor. A friend's website scores 44 (slow) on Google PageSpeed Insights. Despite his website being slow, he outranks me in Google search results. It confuses me that I optimized my website for speed, yet my competitor's slow site outperforms mine. On Six9ja.com, I did the work to reach my target score of 100 (fast) on Google PageSpeed Insights. But in Google Search Console, some of my pages are shown with average scores and some with slow scores; Search Console suggests none of my pages are fast. So where did the fast metrics go? Could it be because I added three AdSense JavaScript snippets to all my blog posts? That would mean the AdSense code is slowing page performance despite having an async tag. I tested my blog post speed, and my page score dropped by 48 points due to the 3 AdSense snippets, down to 62 (average). So my site speed is 100, but my page speed is 62. Does this mean Google considers page speed, rather than site speed, as a ranking factor? Screenshots: https://imgur.com/a/YSxSwOG Regarding: https://etcnaija.com

    | etcna
    0

  • Not a techie here... maybe this is to be expected, but ever since one of my client sites switched to TLS 1.3, I've had a couple of crawl issues and other hiccups. First, I noticed that I can't use HTTPSTATUS.io any more; it renders an error message for URLs on the site in question. I wrote to their support desk and they said they haven't updated to 1.3 yet. Bummer, because I loved httpstatus.io's functionality, especially the bulk reports. Also, my Moz campaign crawls were failing. We are setting up a robots.txt directive to allow rogerbot (and the other Moz bot), and will see if that works. These failures coincide with the date we switched to 1.3, and some testing confirmed it. Is anyone else seeing these types of issues, and can you suggest any workarounds, solutions, or hacks to make my life easier (including an alternative to httpstatus.io; I have and use Screaming Frog, which is not as slick, I'm afraid)? Do you think there was a configuration error with the client's TLS 1.3 upgrade, or maybe a problematic/older implementation of 1.3? Thanks -

    | TimDickey
    0

  • Recently a couple of pages from my website ranked well, in the top 5, for a couple of days, then disappeared suddenly; they are not seen in Google search results at all, no matter how narrowly I search for them. I checked my Search Console and there seem to be no issues with the pages, but Google Analytics shows no data from them since the day they disappeared, and they do not show up in the 'active pages' section no matter how many computers I keep the URLs open on.
    Has anyone else faced this issue? Is there a solution to it?

    | JoelssonMedia
    0

  • Does registering or renewing a domain name for more than a year improve search rankings, on the theory that the site looks more trustworthy, as a company that will be around for longer than just a year?

    | Motava
    0

  • So we are planning to redirect all of these: https://blue-company.com.au/
    https://www.blue-company.com.au/
    http://blue-company.com.au/ to this: https://www.bluecompany.com.au/ If we do this, will it have a negative impact on our SEO? Is there any downside, such as being penalised by Google? (A sketch of the redirect is below.)

    | RodrigoR777
    0
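
    A hypothetical Apache sketch for the old domain that sends every scheme/host variant to the new host in a single hop, preserving paths:

        RewriteEngine On
        # Any variant of the old domain -> the new www host
        RewriteCond %{HTTP_HOST} ^(www\.)?blue-company\.com\.au$ [NC]
        RewriteRule ^(.*)$ https://www.bluecompany.com.au/$1 [R=301,L]

    Done this way, with each old URL mapped to its equivalent rather than to the homepage, a penalty is not the expected outcome.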

  • We have a primary domain www.postermywall.com. We have used subdomains for offering the same site in different languages, like es.postermywall.com, fr.postermywall.com etc. There are certain language subdomains that have low traffic and are expensive to get translated. We have decided to sunset 3 subdomains that match that criteria. What is the best way of going about removing those subdomains? Should we just redirect from those subdomains to www.postermywall.com? Would that have any negative impact on our primary domain in Google's eye etc.? Anything other than a redirect that we should be considering?

    | 250mils
    0

  • I've discovered that one of the sites I am working on includes content that also appears on a number of other sites. I need to understand exactly how much of the content is duplicated so I can replace it with unique copy. To do this I have tried tools such as plagspotter.com and copyscape.com, with mixed results; nothing so far gives me a reliable picture of exactly how much of my existing website content is duplicated on third-party sites. Any advice welcome!

    | HomeJames
    0

  • If I'm updating a URL and 301 redirecting the old URL to the new URL, Google recommends I remove the old URL from our XML sitemap and add the new URL. That makes sense. However, can anyone speak to how Google transfers the ranking value (link value) from the old URL to the new URL? My suspicion is this happens outside the sitemap. If Google already has the old URL indexed, the next time it crawls that URL, Googlebot discovers the 301 redirect and that starts the process of URL value transfer. I guess my question revolves around whether removing the old URL (or the timing of the removal) from the sitemap can impact Googlebot's transfer of the old URL value to the new URL.

    | RyanOD
    0

  • Hello, Mozzers!
    I noticed something peculiar in the robots.txt used by one of my clients: Allow: /wp-admin/admin-ajax.php What would be the purpose of allowing a search engine to crawl this file?
    Is it OK? Should I do something about it? (Context sketch below.)
    Everything else on /wp-admin/ is disallowed.
    Thanks in advance for your help.
    -AK

    | AndyKubrin
    2
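
    For context, this is the standard WordPress pattern; the block typically reads:

        User-agent: *
        Disallow: /wp-admin/
        Allow: /wp-admin/admin-ajax.php

    admin-ajax.php handles front-end AJAX requests for many themes and plugins, so the Allow line keeps Google able to fetch resources it may need to render pages, while the rest of /wp-admin/ stays blocked.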

  • I hope you are in good health. Could anyone please help me resolve some technical issues on my website? I would appreciate any tips on fixing them. Thanks in advance. Here is my website: Apkarc

    | jjbndjkui88
    0

  • We have a news aggregator site that has 2 types of pages. First type:
    Category pages, such as economic, sports, or political news; we intend to do SEO on these category pages to get organic traffic. These pages have pagination and show the latest and most-viewed news in the corresponding category. Second type:
    News headlines from other sites, displayed on the category pages. Clicking a link directs the user to the news page on the source site. These are outgoing links, and we redirect them with JavaScript (not a 301).
    In fact, these are our website's articles that have only titles (linked to the destination) and meta descriptions (read from the news RSS). Question:
    Should we nofollow/noindex the second type of links? Since a website's crawl budget is limited, isn't it better to spend that budget on the pages we have invested in (the first type)? (A sketch of one option is below.)

    | undaranfahujakia
    0
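
    A hypothetical sketch of one headline link under the setup described, combining rel="nofollow" with a robots.txt block on the redirect endpoint so crawl budget stays on the category pages (paths are placeholders):

        <a href="/out?item=12345" rel="nofollow">Headline from the source site</a>

        # robots.txt
        User-agent: *
        Disallow: /out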

  • I am facing an issue with my blog, https://digitalmedialine.com/blog/, as some pages are not ranking in Google yet. Can anyone help me work out how to get those posts ranking to improve my traffic? Thanks in advance.

    | qwaswd
    0

  • Hi - my website title (company name) repeats in the SEO description. My host service is Squarespace. How do I fix this?
    Thanks! Paula

    | Peeker
    0

  • Hello all, we're running an AdWords campaign, and Google has said there is a malicious link on the website we're looking to advertise, so we cannot launch the campaign. I've tried to investigate through Search Console (I am a novice, BTW), and it says that "Domain properties are not supported at this time", which I don't understand. Any advice, please?!

    | PartisanMCR
    0

  • Hello all, I've been seeing some weird behavior between one of my landing pages and the homepage. The page showing up in the SERP keeps flip-flopping, and not only that: when it flips to the landing page, it drops 70 positions, and when it flips back to the homepage it returns to around position 19. I'm confused about what is happening. I think the two pages are fighting for the same keyword, as they both have the keyword in the meta title. Homepage > ( <title>PresenterMedia - PowerPoint Templates, 3D Animations, and Clipart</title> ) Landing page > ( <title>PowerPoint Templates at PresenterMedia.com</title> ) I've seen other answers about the flip-flopping, but not about the 70-position drop that comes with it. Does this huge drop tell me the homepage, rather than the landing page, is the better page to rank for this keyword? Any help would be greatly appreciated.

    | JbonesPM
    0

  • Hi guys, I've recently had an EV security certificate installed on the site and have seen a drop in search visibility ever since. It was installed on Nov 27th. I was expecting some tracking hiccups from the install, and this is a particularly competitive time of year (I know others are bidding more aggressively on our brand terms, which constitute the vast majority of our traffic), but I have been quite concerned by the following: under Acquisition > SEO > Landing Pages, traffic has dropped to 0; and in GWT, the certificate has been identified as self-signed, which we know not to be the case. We've checked with the SSL provider that the certificate has been properly installed, and obviously with our developers. We're at a bit of a loss as to whether there is actually an issue, or whether it's just tracking issues and external factors. Does anyone have any advice on how to confirm the existence of a problem with the install, or on how to rectify the GWT error? Obviously, if Google thinks the certificate is self-signed, we're not going to get the ranking benefits we were expecting. (A chain-check sketch is below.) Thanks in advance for your time. Kind regards

    | Q-TCM
    0
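
    A quick way to inspect the chain the server actually presents ("self-signed" reports are often caused by a missing intermediate certificate rather than a bad certificate; the hostname is a placeholder):

        openssl s_client -connect www.example.com:443 -servername www.example.com -showcerts

    In the output, the "Certificate chain" section should list the leaf certificate followed by the intermediate(s), and "Verify return code: 0 (ok)" indicates a complete, trusted chain.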

  • I have a website; we are a third party connecting players to the big gambling sites. How can I optimize the SEO on our pages?

    | 323SM
    0

  • I am currently working on a website (ed tech) that does business in India as well as the USA. The courses are the same, and the content served is also the same; there is no cookie-level redirection. The only difference is the price range and currency. In our schema we have set the currency to USD. We want to serve different currencies for India and the USA through schema. How can we do this? The example website given below ranks in both India and the USA with the same domain name, but with a price range that we can set up in either INR or USD. (A sketch of per-currency offers is below.)

    | DJ_James
    0
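
    One common approach, sketched hypothetically: mark up each market's offer with its own priceCurrency, either on separate regional URLs or as multiple offers on one page (names and values are placeholders):

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "Product",
          "name": "Course name",
          "offers": [
            { "@type": "Offer", "price": "499.00", "priceCurrency": "USD" },
            { "@type": "Offer", "price": "29999.00", "priceCurrency": "INR" }
          ]
        }
        </script>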
