
Dynamic URLs - Help


15 replies to this topic

#1 AussieRob

    HR 2

  • Members
  • 23 posts
  • Location:Boise, Idaho

Posted 08 July 2014 - 03:17 PM

Hi, 

 

I am running an SEO audit on a client's website. I've run the site through Screaming Frog and noticed that a lot of the pages are returning dynamic URLs. 

 

This particular client is a property management company that has a rental listing page that pulls in properties from a third party feed. I've noticed two things:

 

When a property is imported, it:

 

1. Generates a dynamic URL

 

2. Returns a 404 page once the property is rented and the listing is removed from the feed

 

The site is built on WordPress. 

 

Will these dynamic URLs hurt SEO?

 

What would you recommend in terms of working around the dynamic URL generation and 404 pages?

 

I also have a blog post that is returning some dynamic URLs with the following string attached: /?replytocom=11710. Is this created by blog comments?

 

Help much appreciated. 

 

Robbie. 

 



#2 AussieRob

    HR 2

  • Members
  • 23 posts
  • Location:Boise, Idaho

Posted 08 July 2014 - 03:35 PM

Figured out the /?replytocom string. I just removed the permalink variable in the Yoast settings. 


Here is an example of the duplicates created by the dynamic URL:

 

  /listing/516314/?address=1350%20Lake%20Ave%20-%201350%201%2F2
  /listing/516314/

 

 

Both go to the same rental listing, creating a duplicate content issue. 
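
For what it's worth, the usual way around duplicates like this is to collapse every parameterised variant onto the clean /listing/ URL, either with a 301 redirect or a rel="canonical" tag on the listing template. A minimal sketch of the normalisation step in Python (the domain is just a placeholder, not the client's site):

    from urllib.parse import urlsplit, urlunsplit

    def canonical_listing_url(url):
        # Drop the query string and fragment so every parameterised
        # variant collapses onto the bare /listing/<id>/ path.
        parts = urlsplit(url)
        return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

    dup = "http://example.com/listing/516314/?address=1350%20Lake%20Ave%20-%201350%201%2F2"
    print(canonical_listing_url(dup))  # http://example.com/listing/516314/

The resulting clean URL is what would go into the page's rel="canonical" tag (or a redirect rule), so search engines treat both versions as the same listing.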



#3 chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,112 posts
  • Location:Blackpool UK

Posted 08 July 2014 - 03:47 PM

 

Will these dynamic URLs hurt SEO?

 

No, they won't, but why not simply change it?

 

Dashboard -> Settings -> Permalinks -> Post name.



#4 AussieRob

    HR 2

  • Members
  • 23 posts
  • Location:Boise, Idaho

Posted 08 July 2014 - 04:11 PM

I don't see how this is going to remove the dynamic URLs and the duplicate content being generated. To my understanding, this will simply remove the timestamp from the URL...

 

Please explain. 



#5 chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,112 posts
  • Location:Blackpool UK

Posted 09 July 2014 - 09:34 AM

Personally I would be more concerned about spaces (%20) appearing in the URLs than I would about some URLs possibly being filtered from the results.
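
If the feed import can be changed, the cleaner fix is to slugify the address before it ever becomes part of a URL, rather than letting raw spaces get percent-encoded. A rough sketch, using the address from the example above (slugify is just an illustrative helper, not anything the site already has):

    import re
    from urllib.parse import quote

    def slugify(text):
        # Collapse runs of anything that isn't a letter or digit into single hyphens.
        return re.sub(r"[^a-zA-Z0-9]+", "-", text).strip("-").lower()

    address = "1350 Lake Ave - 1350 1/2"
    print(quote(address, safe=""))  # 1350%20Lake%20Ave%20-%201350%201%2F2  (what the feed produces now)
    print(slugify(address))         # 1350-lake-ave-1350-1-2               (hyphenated, no %20)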



#6 Michael Martinez

    HR 10

  • Active Members
  • 5,157 posts
  • Location:Georgia

Posted 09 July 2014 - 07:46 PM

If the site is generating dynamic URLs through the crawl, then you're probably not getting very useful information. It would be more productive, in my opinion, to download a WordPress XML export file and look at the site structure through that. There are also some plugins that will export all the URLs (like sitemap generators).
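
As a rough illustration of what I mean: the export file WordPress gives you (Tools -> Export) is plain XML, and each <item> in it normally carries a <link>, so a few lines of Python are enough to pull out a URL list (the filename is just a placeholder):

    import xml.etree.ElementTree as ET

    # List every <item> permalink in a WordPress export (WXR) file.
    tree = ET.parse("site-export.xml")
    links = [item.findtext("link") for item in tree.getroot().iter("item")]
    for link in links:
        if link:
            print(link)
    print(len(links), "items in the export")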

You'll create a very big false map with a tool like Screaming Frog or Xenu. Such tools are, in my opinion, way overused by the SEO community.

#7 AussieRob

    HR 2

  • Members
  • 23 posts
  • Location:Boise, Idaho

Posted 09 July 2014 - 07:55 PM

Can you explain more why Screaming Frog is overrated?

 

That's the tool I used in the audit :)



#8 Michael Martinez

    HR 10

  • Active Members
  • 5,157 posts
  • Location:Georgia

Posted 09 July 2014 - 10:58 PM

It's not that Screaming Frog or Xenu or any other SEO spidering tool is over-rated. It is that they are OVER USED.

A spidering tool is going to crawl every link it can get to. But on a dynamic, database-driven site many of those links are never clicked by users or search engines. A common problem I have heard from SEO spider users is that the spiders get caught in endless loops. We call this the "Calendar problem", because Web calendars can generate massive numbers of links that never end.

So you end up with a huge report loaded with real but never-used URLs. You'll identify all sorts of duplicate content problems and other issues that DO NOT NEED TO BE FIXED.

Furthermore, even though these tools are designed to crawl every link they can find, they don't find every link (or every page) on a Website. Many Websites use two or more CMS platforms for different sections of their sites. They may or may not interlink those sections through shared navigation. These different sections could be subdomains or subfolders. They'll earn their own backlinks and appear in search indexes, and everything looks seamless to the user, but there is a virtual wall between them.

SEO crawlers don't tell you that they didn't find links they don't know about.

On top of that, running an SEO crawler against a large Website hammers its server and totally screws up its analytics if they are working off the raw server logs. A good analyst will know how to filter out that activity but he won't thank you for making him do that extra work.
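
As a rough sketch of that filtering step: if the crawl was run with a distinctive user-agent, you can strip its hits out of the raw access log before anyone analyses it (the marker strings below are only examples, so check what actually appears in your own logs):

    # Drop the audit crawl's hits from a raw access log before analysis.
    CRAWLER_MARKERS = ("Screaming Frog SEO Spider", "MyAuditBot")  # example markers only

    with open("access.log") as src, open("access.filtered.log", "w") as dst:
        for line in src:
            if not any(marker in line for marker in CRAWLER_MARKERS):
                dst.write(line)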

I only use SEO crawlers as a last resort, maybe when I need to see how crawlable a site is, or if there is simply no way to get a full list of the URLs that are indexable by search engines.

#9 chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,112 posts
  • Location:Blackpool UK

Posted 10 July 2014 - 12:47 AM

The only two tools that are worth using to see what URLs the site creates, and whether they are 'crawlable', are:

 

Xenu's Link Sleuth and WebBug (or you could buy CyberSpider, from the makers of WebBug, to combine the two).

 

WebBug is the nearest you will get to seeing what GoogleBot 'sees' without being inside Google. Link Sleuth will build you an HTML sitemap of URLs linked by the content of the title element (just like search engines do), so you can see how well (or, more often, how poorly) your page titles work for gaining a click-through.

 

And yes, in your 'keyboardian' slip you are correct: ALL "SEO tools" are overrated, because NONE of them has been created by people who actually KNOW. They are ALL based on the ramblings of "experts" who have carried out flawed testing and declared that they know the secret to search 'rankings'. Then, of course, they are used by every 'wannabe' "expert", so ALL the documents on the sites they have vandalised bear the footprint of being "optimised" rather than being optimal.



#10 Jill

    Recovering SEO

  • Admin
  • 33,012 posts

Posted 10 July 2014 - 07:55 AM

You can get around most of the alleged SC problems by using different settings and filters. 
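
For example, a couple of exclude patterns (regular expressions) aimed at the parameterised URLs from this thread would keep the crawler out of most of that noise; these patterns are just illustrations based on the URLs posted above:

    import re

    # Patterns of the sort you could drop into a crawler's exclude/filter settings.
    EXCLUDE_PATTERNS = [
        r"\?replytocom=",  # WordPress comment-reply duplicates
        r"\?address=",     # parameterised listing duplicates from the feed
    ]

    def should_skip(url):
        return any(re.search(p, url) for p in EXCLUDE_PATTERNS)

    print(should_skip("/listing/516314/?address=1350%20Lake%20Ave"))  # True
    print(should_skip("/listing/516314/"))                            # False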



#11 AussieRob

    HR 2

  • Members
  • 23 posts
  • Location:Boise, Idaho

Posted 10 July 2014 - 08:58 AM

Thanks for all the great feedback everyone. Much appreciated!

 

 

Moderator, I like your point regarding the use of Link Sleuth to check page title appearance in the eyes of Google. 

 

I've been able to pick up on some title tag errors in WP where the theme was automatically pulling both my page H1 tag AND business name into the title tag. This was causing messy title tags that were way over their pixel limitation. 



#12 chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,112 posts
  • Location:Blackpool UK

Posted 10 July 2014 - 09:16 AM

 

This was causing messy title tags that were way over their pixel limitation.

There is no such limit in existence.

 

But stopping WordPress from putting the "Blog Name" in the document title only needs one check box to be unchecked.



#13 Michael Martinez

    HR 10

  • Active Members
  • 5,157 posts
  • Location:Georgia

Posted 10 July 2014 - 09:54 AM

I always include the blog name in the title. I never worry about how long the titles are.

#14 AussieRob

    HR 2

  • Members
  • 23 posts
  • Location:Boise, Idaho

Posted 10 July 2014 - 11:22 AM

@Moderator, I was under the impression that title tags were no longer limited by character count, but instead by pixels (520). 



#15 chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,112 posts
  • Location:Blackpool UK

Posted 10 July 2014 - 02:59 PM

I wonder what dumb arse, lame brained clueless [insert expletive] "expert" came up with THAT load of garbage!!!

 

Document titles have NEVER been limited by character count. Certainly there is a point at which very long titles are truncated for display purposes, but that is purely cosmetic. There is also a rule of diminishing returns whereby words further along in the title lessen in 'value', but it is not a limit that anyone should worry about or pay undue attention to.

 

The ONLY thing that you should care about for the title element content is:

 

Does it make for a really great 'call to action'?





