
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • The social media icons in your footer are passing PageRank. I guess my question is: why?

    | NickPateman81
    0

  • Hello! We are just now starting up a niche site that will be the new home for a large group of products from an old site. Apart from straight-up informative text links, we have set up 301 redirects for the 100 most important products from the old site to the new one. Right now, we are in a transition period where we openly tell our visitors that we have a new site for this group of products. My question is: for how long should we keep the products on the old site? Can we remove them straight away, since our intention with the 301 redirects is to preserve the SERP positions for the most important products? Does it matter to Google if we let the products remain on the old site for a while? Regards
    Oskar

    | jsigwid
    0

  • Hi, I'm really interested in a good explanation of how to control the flow of link juice. Most of my inbound links currently go to my home page, and I was just wondering how to maximise the link juice flow to the pages that I want to rank. Is there any benefit to nofollowing pages in my navigation that I don't need to rank? As above, but with links in my footer, such as the privacy policy and the like (can I avoid wasting link juice on these pages?). Does more link juice flow to pages higher up in my code? These are just some of the questions I'm concerned with. Basically, I'd really like to know the best practices for sending link juice to where it is needed most. Thanks, Matt - No Yelling Driving School

    | strilliams
    0

  • Does Google take the number of ad tracking pixels on a page into consideration in its ranking algorithm?

    | CLee-177996
    0

  • Hi all, I'm trying to correct some of my duplicate content errors. The site is built on Miva Merchant and the storefront page, /SFNT.html, needs to be permanently redirected to www.mydomain.com. This is what my .htaccess file looks like:

    #RedirectPermanent /index.html http://dev.mydomain.com/mm5/merchant.mvc?
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^dev.mydomain.com$ [NC]
    RewriteRule ^(.*) http://dev.emydomain.com/$1 [L,R=301]
    DirectoryIndex index.html index.htm index.php /mm5/merchant.mvc
    redirect 301 /SFNT.html http://dev.mydomain.com/
    RewriteCond %{QUERY_STRING} Screen=SFNT&Store_Code=MYSTORECODE [NC]

    When I use this code and navigate to http://dev.mydomain.com/SFNT.html, the URL gets rewritten as http://dev.mydomain.com/?Screen=SFNT, so I believe this is what's called a "redirect loop". Can anyone provide any insight? I'm not a developer, but I have been tasked with cleaning up the problems on the website and can use any input anyone is willing to offer. Thanks, jr

    | Technical_Contact
    0
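For the Miva storefront question above, one loop-free approach is to redirect both the static /SFNT.html URL and the merchant.mvc?Screen=SFNT form of the storefront to the root, stripping the query string so neither rule can fire again. A minimal sketch, assuming the storefront script lives at /mm5/merchant.mvc as in the pasted file (not tested against this specific store):

```apache
RewriteEngine On

# send the static storefront URL to the site root
RewriteRule ^SFNT\.html$ http://dev.mydomain.com/? [L,R=301]

# send merchant.mvc?Screen=SFNT&... to the root as well;
# the trailing ? in the target strips the query string so the redirect cannot loop
RewriteCond %{QUERY_STRING} (^|&)Screen=SFNT($|&) [NC]
RewriteRule ^mm5/merchant\.mvc$ http://dev.mydomain.com/? [L,R=301]
```

Note also that the dangling RewriteCond at the end of the pasted file does nothing on its own: a RewriteCond only applies to the RewriteRule that immediately follows it.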

  • Hey Mozzers- I have a tricky situation with one of my clients. They're a reputable organization and have been mentioned in several major news articles. They want to create a Press page on their site with links to each article, but they want viewers to remain within the site and not be redirected to the press sites themselves. The other issue is some of the articles have been removed from the original press sites where they were first posted. I want to avoid duplicate content issues, but I don't see how to repost the articles within the client's site. I figure I have 3 options: 1. create PDFs (w/SEO-friendly URLs) with the articles embedded in them that open in a new window. 2. Post an image with screenshot of article on a unique URL w/brief content. 3. Copy and paste the article to a unique URL. If anyone has experience with this issue or any suggestions, I would greatly appreciate it. Jaime Brown

    | JamesBSEO
    0

  • At the moment my blog is paginated like so: /blogs > /blogs/page/2 > /blogs/page/3, etc. What are the benefits of paginating with dynamic URLs, like here on SEOmoz with /blog?page=3?

    | NickPateman81
    0

  • I read somewhere - pretty sure it was in The Art of SEO - that having dates in blog permalink URLs is a bad idea, e.g. /blog/2011/3/my-blog-post/. However, looking at WordPress best practice, it's also not a good idea to have a URL without a number - it's more resource-hungry if you don't, apparently, e.g. /blog/my-blog-post/. Does anyone have any views on this? Thanks Ben

    | atticus7
    0

  • Our website www.clientfirstfunding.com ranked 13th for the keyword "structured settlement". After this weekend we are no longer ranking for this keyword at all. We haven't made any changes to the site, and I haven't gained any backlinks that appear to be spammy. We had held this position for the last several months. I can understand a drop in SERPs, but one this drastic is shocking. Any ideas as to what could have caused this would be greatly appreciated.

    | Tony1986
    0

  • We have a main sales page, and then we have a country-specific sales page for about 250 countries. The country-specific pages are identical to the main sales page, with the small addition of a country flag and the country name in the h1. I have added a rel=canonical tag to all country pages to send the link juice and authority to the main page, because they would otherwise all be competing for rankings. I was wondering if having the 250+ indexed pages of duplicate content will affect the ranking of the main page even though they have the rel=canonical tag. We get some traffic to country pages, though not as much as the main page, but I'm worried that if we remove those pages and redirect them all to the main page, we will lose 250-plus indexed pages through which we can get traffic for odd country-specific terms. E.g. searching for "uk mobile phone" brings up the country-specific page instead of the main sales page, even though the UK sales page is not optimized for UK terms other than having a flag and the country name in the h1. Any advice?

    | -Al-
    0

  • Howdy Everyone, I have a website that will span multiple countries. The content served will be different for each country. As such, I've acquired the top-level domains for different countries. I want to map the top-level domains (e.g. domain.co.uk) to uk.domain.com for development purposes (LinkedIn does this). I'm curious to know whether this is advisable and if mapping a country-specific TLD to a subdomain will maintain local SEO value. Thanks!

    | RADMKT-SEO
    0

  • Unbeknownst to me, our web developers have hosted our UK e-commerce site (serving only the UK and outer islands) on a US-based server. Can this impact our SEO efforts? My further concern is when it comes to sending out emails and opt-in regulations - am I right to be concerned about this as well?
    Thanks

    | PH292
    1

  • What has to be changed to improve rank? We held position 4 for the keyword "hip hop jewelry" for a while. All of a sudden it dropped to position 6 and never went back. We did some on-page optimization and got a couple of links here and there... but so far we are still at position 6. Please suggest what has to be done.

    | DiamondJewelryEmpire
    0

  • After searching for (city name) (business type), a number of my competitors' sites come up with the title of their web page in the results (including geographic descriptors). However, my site is listed by name and does not reflect our page title. How is this possible (did someone manually change the title of our listing?), and how can I change this back so that the title includes a geo descriptor? Do I simply edit the listing under Google Places, or will this have a negative effect on our rankings?

    | helliottlaw
    0

  • What is considered best practice today for blocking pages, for instance xyz.com/admin pages, from getting indexed by the search engines or easily found? Do you recommend still disallowing it in the robots.txt file, or is robots.txt not the best place to note your /admin location because of hackers and such? Is it better to hide the /admin with an obscure name, use the noindex tag on the page, and not list it in the robots.txt file?

    | david-217997
    0
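One pattern that avoids advertising /admin in robots.txt at all is to serve an X-Robots-Tag header from that directory, so the pages are kept out of the index without the path being listed anywhere public. A minimal sketch for an Apache .htaccess placed inside the /admin directory (assumes mod_headers is enabled on the server):

```apache
# .htaccess in /admin/ — everything served from this directory
# carries a noindex header, with no mention of the path in robots.txt
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

Pairing this with HTTP authentication on the directory covers the "easily found" concern as well, since headers only deter indexing, not access.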

  • Hello, I am getting a duplicate content warning from SEOmoz for my home page: http://www.teacherprose.com and http://www.teacherprose.com/index.html. I tried the code below in .htaccess: redirect 301 /index.html http://www.teacherprose.com — this caused the error "too many redirects" in the browser. Any thoughts? Thank You, Eric

    | monthelie1
    0
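The "too many redirects" error in the question above is the classic index-file loop: the server internally maps / to /index.html via DirectoryIndex, and the blanket redirect then fires on that internal mapping too. A common fix is to redirect only when the browser explicitly asked for /index.html, by matching THE_REQUEST. A sketch, not tested against this specific host:

```apache
RewriteEngine On
# fire only when the client literally requested /index.html,
# not when DirectoryIndex maps / to it internally
RewriteCond %{THE_REQUEST} \s/index\.html[\s?] [NC]
RewriteRule ^index\.html$ http://www.teacherprose.com/ [L,R=301]
```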

  • I do that because I am using Joomla. Is that bad? Thanks

    | monotero
    0

  • If your CMS has created two URLs for the same piece of content that look like the following, www.domainname.com/stores and www.domainname.com/stores/, will this be seen as duplicate content by Google? Your tools seem to pick it up as errors. Does one of the URLs need a 301 to the other to clear this up, or is it not a major problem? Thanks.

    | gregster1000
    0
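/stores and /stores/ are technically distinct URLs, so a 301 from one form to the other is the usual cleanup. A minimal Apache sketch using mod_rewrite's exact matching rather than mod_alias's `Redirect`, whose prefix matching would also catch /stores/ and loop:

```apache
RewriteEngine On
# exact match: /stores (no trailing slash) 301s to /stores/
RewriteRule ^stores$ /stores/ [L,R=301]
```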

  • One of the sites I work for is an employment site, they have a job database and the job pages tend to get links.  The problem is that every time one of these jobs is filled, the job page goes away.  What should I do to keep the value from these links?

    | MarloSchneider
    0

  • Our site involves e-commerce transactions that we want users to be able to complete via JavaScript popup/overlay boxes. In order to make the credit card form secure, we need the referring page to be secure, so we are considering making the entire site secure, so that all of our site links would be https. (PayPal works this way.) Do you think this will negatively impact whether Google and other search engines are able to index our pages?

    | seozeelot
    0

  • I have seen many solutions and tried most of them. But when serving multiple companies, the number of passwords one has gets astounding. It is hard to find a way to store them securely but also have the convenience of looking them up. What solutions do you recommend?

    | MarloSchneider
    0

  • We have shopping cart links (<a href>s, not input buttons) that link to a URL along the lines of /cart/add/123&return=/product/123. The SEOmoz site crawls are flagging these as a massive number of 302 redirects, and I also wonder what sort of effect this is having on link juice flowing around the site. I can see several possible solutions: make the links nofollow; make the links input buttons; block /cart/add with robots.txt; make the links 301 instead of 302; make the links javascript (probably worst case). All of these would result in an identical outcome for the UX, but they are very different solutions. What would you suggest?

    | Aspedia
    0

  • I have a client that insists on using the ProPhoto WordPress theme. This theme has an interesting habit of putting empty anchor tags in the site nav in order to nest CSS dropdowns. By empty I mean totally empty. For example: <a>Navigation Link</a> Since the anchor does not specify a destination, do you think it would have any effect on link juice one way or the other? This wouldn't count as an additional link on the page, would it? My inclination and personal practice is not to risk quirky things like this, but I'd like a second opinion before I suggest changes to the client's site. Thanks!

    | Dameian
    2

  • I have a website that requires the site structure to be changed. The website doesn't have many backlinks, and rankings are fairly low. I have 11,000 products on the website and want to know the best way to change the site structure without causing 404 errors all over the place. Do I 301 redirect every page, or drop all 11,000 pages from the index by adding noindex/nofollow to all pages? I have the following structure: www.domain.co.uk/make/model/part/product. I want to change this to www.domain.co.uk/part/make/model/product. What's the best way to preserve the SEO and link juice at such a large scale - 11,000 pages? Thank you, shivun

    | seohive-222720
    0
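Since the restructure described above is purely positional (/make/model/part/product becomes /part/make/model/product), one pattern-based rewrite can usually stand in for 11,000 individual redirects. A hedged sketch, assuming Apache, that every product URL has exactly four segments, and that no segment contains a slash:

```apache
RewriteEngine On
# /make/model/part/product  ->  /part/make/model/product
RewriteRule ^([^/]+)/([^/]+)/([^/]+)/([^/]+)/?$ /$3/$1/$2/$4 [L,R=301]
```

Any URLs that don't fit the four-segment pattern (category pages, static pages) would need their own rules.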

  • I posted this as a reply on my other question, but never got another response... I am basically moving e-commerce platforms, to answer your question. I am keeping the same domain (www.stbands.com). If you want to get even more specific, I am moving from StoresOnline to CoreCommerce. Here's some more info: All meta tags, URLs, and content are going to be virtually the same, except my product pages and normal pages will end with .html; however, my category pages will still be /categoryname (like a directory). My category URLs seem to build in a mod_rewrite fashion, which I hear is better for SEO (like /sweatbands/head-sweatbands/). I have set all my pages to use the rel=canonical tag, so if you are on the category /sweatbands/head-sweatbands/ it canonicals back to /head-sweatbands. The images are changing names as I upload them on a product, because it auto-creates new sizes. This differs from our old host and may affect our vertical image rankings. Should I 301 the image URLs? I am keeping the www. in my domain name, as opposed to just domain.com, to keep everything the same. CoreCommerce automatically seems to make my products uppercase in the URL as they are in the title, but the URL works both lowercase and uppercase, so it's not case sensitive. This is just an overview of some things. I think I have most of my bases covered, but I'm aiming for an April 1st launch/move for the new site. Do you guys have any other pointers/suggestions?

    | Hyrule
    0

  • I would like to rank for words such as:
    windsurfing equipment
    windsurfing news
    windsurfing sails
    windsurfing boards etc. Now I am wondering whether I should use exactly those words in the navigation/titles/descriptions, because it seems not user friendly. The whole website is about windsurfing, thus naming it just "equipment" instead of "windsurfing equipment" would make it clear to a visitor that I am talking about that windsurfing-related topic. Here is an example: http://madwindsurfing.com/cat/competitions-events/
    I can even change the URL to http://madwindsurfing.com/cat/windsurfing-competitions-events/. What would be the best way of choosing the naming/descriptions when I do on-page optimisation that is good for the engines and for the users, and what would you do in my case?

    | madsurfer
    0

  • Hi, I have a question regarding our client's site, http://www.outsolve-hr.com/, on ASP.net. Google has indexed both www.outsolve-hr.com/ and www.outsolve-hr.com/default.aspx, creating a duplicate content issue. We have added a rel="canonical" tag
    to the default.aspx page. Now, because www.outsolve-hr.com/ and www.outsolve-hr.com/default.aspx are the same page on the actual backend, the tag is also on http://www.outsolve-hr.com/ when I view the source of the page loaded in a browser. Is this a problem? Will Google penalize the site for having the rel=canonical on the actual homepage, i.e. the canonical URL? We cannot do a 301 redirect from www.outsolve-hr.com/default.aspx to www.outsolve-hr.com/ because this causes an infinite loop, since on the backend they are the same page. So my question is two-fold: Will Google penalize the site for having the rel=canonical on the actual homepage, the canonical URL? And is rel="canonical" the best solution to fix the duplicate homepage issue on ASP? Lastly, if Google has not indexed duplicate pages, such as https://www.outsolve-hr.com/DEFAULT.aspx, is it a problem that they exist? Thanks in advance for your knowledge and assistance. Amy

    | flarson
    0

  • Dear team, a question that has always annoyed me: Google shows a typo of the keyword in the title on the SERP, which is not good for the client's branding. I have no problem with the actual page's meta title setting (correct keyword) or with the internal link text, but the one thing I have found is that the anchor text other people use for their links contains the typo, so Google still takes that into account - just like the "George Bush miserable failure" Googlebomb several years ago. How can I get Google to correct this? Do I just submit a request to them? Thank you, boson

    | 172396002
    0

  • I'm receiving a duplicate content error in my reports for www.example.com and www.example.com/index.htm. Should I put the rel="canonical" on the index page and point it to www.example.com? And if I have other important pages where rel="canonical" is being suggested, do I place the rel="canonical" on the page itself? For example, if www.example.com/product is an important page, would I place the tag on that page?

    | BrandonC-269887
    0

  • One of my real estate clients has a website that was built by a small web design company. In reviewing the website, I've discovered that many of the images on the website (i.e. banners, social networking icons, etc.) are not hosted on my client's server, but on the web developer's server. Ex. src="http://www.[WebDevelopmentCompany].com/ubertor/[ClientsName]/properties_image.jpg" Will this funnel PageRank/link juice away from my client's website? This struck me as odd, and it's not an issue I've come across before.

    | calin_daniel
    0

  • The company I work for sells software online. We have deals with learning institutes that allow their students to use our software for next to nothing. These learning institutes, which usually have quite strong domains, link to our sign-in area. Nice way to get powerful links, hey… or is it? There are a couple of problems with these links: they all link to a subdomain (signin.domain.com), and the URLs also contain unique identifiers (so that we know which institute they are coming from), meaning they all link to different sign-in URLs (e.g. signin.domain.com/qwerty, signin.domain.com/qwerta, signin.domain.com/qwerts, etc.). So all these links aren't as effective as they could be (or at all?). In a perfect SEO world these links would all point to the start page; however, because our start page has a commercial set-up, this would run the risk of communicating the wrong idea to the institutes and their students. So… are there any extremely brilliant pro mozzers who have a savvy idea how to set this up in a more SEO-friendly way? Thanks in advance!

    | henners
    0

  • Recently I took over a website and made a pretty colossal mistake. The site was properly configured via .htaccess to the www domain. Typically I roll without it, and I made a bad assumption that the .htaccess had not previously been set correctly, because there were hundreds of fundamental mistakes. After a couple of days I noticed the mistake, but some of our new (non-www) URLs have picked up some solid links, etc. So now I feel that I am in a nightmare of creating redirects. Should I switch back to www or not? Does it matter at this point?

    | mikeusry
    0
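Whichever hostname is kept, the standard cleanup is a single host-canonicalization rule, so every request on the non-canonical host 301s to its twin on the chosen host with path and query string preserved. A sketch using a placeholder domain (example.com stands in for the real site), forcing the www form:

```apache
RewriteEngine On
# force the www host; the path is carried over, and the
# query string is appended automatically by mod_rewrite
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
```

Swapping which host is canonical is just a matter of inverting the condition and target.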

  • My clients business is divided in chain stores. All stores are set under the same franchise. There is one domain www.company.com with branches like www.company.com/location1/content and www.company.com/location2/content etc. I've taken care of duplicate content issues with rel="canonical" and duplicate page titles are also not a concern, anymore. Right now the concept is like this: If you visit the site for the first time you get to choose between the locations. Then a cookie is set and once you revisit www.company.com it will redirect you via a php header command to the location stored in your cookie: www.company.com/location1/content. My question is if this might hurt rankings in some kind of way as these aren't permanent redirects with a 301 but rather individual ones, based on your cookie.

    | jfkorn
    0

  • Hi, Does the domain extension, i.e. .com, .org, .net, affect the chances of me ranking in search engines? Is there a preference, or does it not matter? Thanks Yaser

    | yaser
    0

  • Hi Guys, Hope you are all well. Just a quick question which you will find nice and easy 🙂 I am just about to work through duplicate content pages and URL changes. Firstly, with the duplicate content issue, I am finding that the SEO-friendly URL I would normally redirect to in some cases has fewer links, less authority, and fewer linking root domains than some of the non-SEO-friendly URLs. Will it harm me if I still 301 redirect them to the SEO-friendly URL? Also, with the URL changes, it is going to be a huge job to change all the URLs so they are friendly, and the CMS is poor. Is there a better way of doing this? It has been suggested that we create a new web page with a friendly URL and redirect all the pages to that. Will this lose all the weight, as it will be a brand new page? Thank you for your help guys, you're legends!! Cheers Wayne

    | wazza1985
    0

  • One of the guidelines you provide stipulates: "You should avoid having too many (roughly defined as more than 100) hyperlinks on any given page. When search engine spiders crawl the Internet they are limited by technology resources and are only able to crawl a certain number of links per webpage. In addition, search engine algorithms divide the value of some popularity metrics by the amount of links on a given page. This means that each of the pages being linked to from a given page are also affected by the number of links on the linking page. For these reasons, we recommend you include less than 100 links per page to ensure that they are all crawled, though if your pages have a high page authority, search engines will usually follow more links." As far as these 100 links are concerned, is this in reference to ALL links including outbound, internal, etc? Or is this referring to only outbound links to other sites?

    | johncmmc
    0

  • Ok, so my site's URLs work like this: www.site.com/widgets/. If you go to www.site.com/widgets (without the last /) you get a 404. My site did not use to require the last / to load the page, but it has over the last year, and my rankings have dropped on those pages... But Yahoo and Bing still index all my pages without the last /, and it somehow still loads the page if you go to it from Yahoo or Bing, but it looks like this in the address bar once you arrive: http://www.site.com/404.asp?404;http://site.com:80/widgets/. How do I fix this? Shouldn't all the engines see those pages the same way, with the last / included? What is the best structure for SEO?

    | DavidS-282061
    0
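A common way to make both forms resolve consistently is a global rule that 301s slashless directory-style URLs to their trailing-slash form, so /widgets lands on /widgets/ instead of the 404 handler. A sketch assuming Apache (the 404.asp in the question suggests IIS, where the equivalent would be written with the URL Rewrite module), and that existing files such as .asp pages should be left alone:

```apache
RewriteEngine On
# if the request is not an existing file and lacks a trailing slash,
# 301 it to the same path with the slash appended
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
```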

  • Is there a tool out there that will help you locate your competitors' traffic sources? I would like to see how much of their traffic is coming from SEO-related sources vs. other sources. I know Compete will do this, but they are ungodly expensive. Thanks!

    | DavidS-282061
    1

  • Hi, my first question, and hopefully an easy enough one to answer. Currently in the head element of our pages we have canonical references such as: (Yes, untidy URL... we are working on it!) I am just trying to find out whether this snippet of the full URL is adequate for canonicalization, or if the full domain is needed as well. My reason for asking is that the SEOmoz On-Page Optimization grading tool is 'failing' all our pages on the "Appropriate Use of Rel Canonical" value. I have been unable to find a definitive answer on this, although admittedly most examples do use the full URL. (I am not the site developer, so I cannot simply change this myself, but rather have to advise him in a weekly meeting.) So in short, presumably using the full URL is best practice, but is it essential to its effectiveness when being read by the search engines? Or could there be another reason why the "Appropriate Use of Rel Canonical" value is not being green-ticked? Thank you very much, I appreciate any advice you can give.

    | rmkjersey
    0

  • Hi, On Google Places we have clients that have bad data (incorrect name, address, #) on aggregated sites (citysearch, merchantcircle, yelp) which prevents those sites from pulling into the Google Places accounts. We've been manually correcting listings for a while and many times it still hasn't pulled into the Google Places listings for months on end despite the data matching. What are your experiences with correcting aggregated Google Places data, how long has it taken for this data to pull into your places accounts?

    | qlkasdjfw
    0

  • I have read tons of guides about canonical implementation but am still confused about how I should best use it. On my site, with tens of thousands of URLs and thousands of affiliates and shopping networks sending traffic, is it smart to simply add the tag to every page, pointing it at that same URL? In doing this, would that solve the problem of a single page having many different entrances with different tracking codes? Is there a better way to handle this? Also, are there any potential problems with rolling out the tag to all pages if they are simply referencing themselves in the tag? Thanks in advance.

    | Gordian
    0

  • Hi, in a few weeks we'll make a major change to our website. This involves over 1.5 million pages indexed in Google, driving a substantial amount of our traffic. Basically we have 2 types of changes: 1) subdomain switches to domain:
    ex. product.company.com will become www.product.com
    (for this we know how to manage the DNS and Apache rules), and 2) different URL patterns, basically replacing ugly URLs with pretty URLs
    (for this we have advanced 301-mapping rules set up). Here is the question: what is the best way to proceed with these 2 changes in order to preserve rankings and organic traffic? Do both changes simultaneously? Or first do the URL changes, then the domain switch? Can you please share your thoughts?

    | TruvoDirectories
    0
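On the redirect side of a migration like the one above, the main thing to avoid is a two-hop chain (old subdomain URL → new domain with old pattern → pretty URL); it is generally safer to combine both mappings so each legacy URL 301s exactly once, directly to its final pretty URL. A simplified sketch of the host-switch half for the old server, using the names from the question:

```apache
RewriteEngine On
# every product.company.com URL 301s straight to www.product.com;
# in practice the ugly->pretty mapping would be applied to $1 first
# so the old URL reaches its final form in a single hop
RewriteCond %{HTTP_HOST} ^product\.company\.com$ [NC]
RewriteRule ^(.*)$ http://www.product.com/$1 [L,R=301]
```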

  • For a number of reasons I'm confined to having to do a client-side redirect for HTML pages. Am I right in thinking that Google treats a zero-second meta refresh roughly the same as a proper 301 redirect? Anyone have experience with zero-second meta refresh redirects, good or bad?

    | dvansant
    0

  • I have a website that has a page for each town. Rather than listing all the towns with a link to each, I want to show only the most popular towns and have a 'more' button that shows all of them when you click it. I know that the search engines can always see the full list of links, and even though the visitor can't, this doesn't go against Google guidelines because there is no deception involved; the 'more' button is quite clear. However, my colleague is concerned that this is 'making life hard' for the search engines, and so the pages are less likely to be indexed. I disagree. Is he right to worry about this?

    | mascotmike
    0

  • I am currently managing a .com that targets Canada, and we will soon be launching a .com/us/ that will target the US. Once we launch the /us/ folder, we want to display the /us/ content to any US IP. My concern is that Google will then only index the /us/ content, as their IPs are in the US. So, if I set up .com and .com/us/ as two different sites in GWT, and geotarget each to the country it is targeting, will this take care of the problem and ensure that Google indexes the .com for Canada and the /us/ for the US? Is there any alternative method (that does not involve using the .ca domain)? I am concerned that Google would not be able to see the .com content if we are redirecting all US traffic to .com/us/. Any examples of this online anywhere?

    | bheard
    0

  • Client has two website addresses: Website A is a redirect to Website B. It has one indexed page. But this is the URL being used in collateral. It has the majority of backlinks, and citations everywhere list Website A as the URL. Website B is where the actual website lives. Google recognizes and indexes its 80+ pages. This website has very few backlinks going to it. This setup does not seem good for SEO. Moreover, the analytics data is completely messed up, because Website B shows that its biggest referral source is... you guessed it, Website A. I'm thinking going forward, I should: 1. Move all the content from Website B to Website A. 2. Set up Website B to permanently 301 redirect to Website A. Is that the best course of action?

    | flowsimple
    0

  • If you have a membership site, which requires a payment to access specific content/images/videos, do search engines still use that content as a ranking/domain authority factor? Is it worth optimizing these "private" pages for SEO?

    | christinarule
    1

  • My site is getting crawl errors inside Google Webmaster Tools. Google believes a lot of my links point to index.html when they really do not. That is not the problem, though; it's that Google can't give credit for those links to any of my pages. I know I need to create a rule in the .htaccess, but the last time I did it I got an error. I need some assistance on how to go about doing this; I really don't want to lose the weight of my links. Thanks

    | automart
    0

  • Hello - I have a client who is a realtor and changed agencies. I edited their Google Places entry, and the new name of their agency and address are showing - but so is their old listing. The agency they left is now trying to sue them for showing up in a number-one position in Google Places under that agency's name. Is this an indexing issue with Google? Their name shows up under both agency names. The corrected one shows most often, but the old one still pops up on occasion. Thanks,

    | seoessentials
    1
