Getting a PHP-Based Database Into SERPs
Posted 10 November 2008 - 11:28 AM
We are about to create a PHP-based database with many thousands of pages (say 80k), but also with the ability to compare across many data points, so I think there are potentially many more than 80k potential result pages.
What are some best practices for getting all these result pages indexed in the SERPs? We were thinking we could make an XML Sitemap and submit it to Google, but I assume there are other ways as well.
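For reference, the XML Sitemap format is quite simple. A minimal file listing two result pages might look like this (the URLs are made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/page.php?cat=polo</loc>
  </url>
  <url>
    <loc>http://www.example.com/page.php?cat=hawaiian</loc>
  </url>
</urlset>
```

You save that as something like sitemap.xml and submit its URL through Google's webmaster tools.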
Posted 10 November 2008 - 12:17 PM
In other words, if the only way to find the resulting dynamically generated pages of content is via a search box or some other method that search engines can't use very well or very accurately, those URLs will for the most part not be found. If they do get found, they will not be given much weight and typically be a part of Google's supplemental index.
Posted 10 November 2008 - 01:38 PM
Can you recommend a site or a page that would help us figure out how to make the database's URLs available to Google's spiders?
Edited by eteare, 10 November 2008 - 01:44 PM.
Posted 10 November 2008 - 02:37 PM
Posted 10 November 2008 - 02:41 PM
The PHP pages will be accessing the database, and those pages are what get crawled. Even if the page filename is the same (say page.php), every combination of query-string variables (say ?color=blue&size=small) is seen as a completely separate page.
So as long as you have the PHP page set up to pull the information you need from your database, the rest is simply a matter of linking to the full URL so that the search engines can find it.
That's it. That's all. There's no need to turn it into something more difficult.
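As a sketch of what Randy describes, a page like page.php can read its query string and build a filtered query. The table and parameter names here are hypothetical, and the database part is reduced to a small helper so the idea stands on its own:

```php
<?php
// Hypothetical helper: turn whitelisted query-string variables into a
// SQL WHERE clause. Whitelisting the keys keeps arbitrary parameters
// from reaching the database.
function buildWhereClause(array $params, array $allowed): string
{
    $conditions = array();
    foreach ($allowed as $key) {
        if (isset($params[$key]) && $params[$key] !== '') {
            // Named placeholders keep the values out of the SQL string
            // itself (they'd be bound via a prepared statement).
            $conditions[] = "$key = :$key";
        }
    }
    return $conditions ? 'WHERE ' . implode(' AND ', $conditions) : '';
}

// A request for page.php?color=blue&size=small arrives as $_GET like this:
$get = array('color' => 'blue', 'size' => 'small');
echo buildWhereClause($get, array('color', 'size'));
// WHERE color = :color AND size = :size
?>
```

Every distinct combination of those parameters is a distinct URL, which is exactly why the spiders treat each one as its own page.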
Posted 11 November 2008 - 01:03 PM
Thanks for the info.
Randy, What do you mean by "the rest is simply a matter of linking to the full url so that the search engines can find it"?
Posted 11 November 2008 - 01:18 PM
Let's say you have a page that accesses your database named page.php on your site. And let's say further that it can pull up different product categories of shirts. So if you were to plug in an address of www.yourdomain.com/page.php?cat=polo it would show you Polo shirts, if you went to www.yourdomain.com/page.php?cat=hawaiian it would show you Hawaiian shirts, and if you went to www.yourdomain.com/page.php?cat=tshirt it would show you T-shirts.
The search engines will see all of those pages as being distinct url addresses, even though they lead to the same page filename. Search engines look at the whole URL, including the query string.
So if you were to link to the ?cat=polo page, say with anchor text that reads Polo Shirts, the spiders would jet off to that page and know by the anchor text it's probably info about polo shirts. The same would be true if you linked to the Hawaiian and T-shirt pages too.
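In plain HTML, the links Randy describes would be something like this (the domain is just a placeholder):

```html
<a href="http://www.yourdomain.com/page.php?cat=polo">Polo Shirts</a>
<a href="http://www.yourdomain.com/page.php?cat=hawaiian">Hawaiian Shirts</a>
<a href="http://www.yourdomain.com/page.php?cat=tshirt">T-Shirts</a>
```

Each href is a complete crawlable URL, and the anchor text gives the spiders a hint about what's on the other end.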
Posted 11 November 2008 - 01:34 PM
(and again, thanks)
Posted 11 November 2008 - 02:21 PM
Think of it as the difference between searching and browsing. Some people like to search; they'll use your search form and the pages will be generated on the fly depending on the search criteria they put in. But others don't want to use search forms. Maybe they're a new visitor, don't know the full extent of what the site offers, and they're afraid they'll miss something by not searching on the "correct" term. Or maybe they've had bad experiences in the past with poorly programmed search functions. Or maybe they just don't like to search. So instead they want to click a link and see a list of everything available so they can browse. (Sure, the list of items they can browse is probably generated by a "behind the scenes" invisible-to-them database query, just like a search would be... but as long as they don't know what they're doing is really a pre-built, structured search kicked off by a click on that "static" link, they're happy.)
The search engines are like that second group of users. They don't use search forms. For them, you need to provide permanent hard-coded links they can follow to browse your detail pages.
The cool thing is, when you create these "browse" links for the search engines, it also enhances usability of the site for that second group of human visitors.
Posted 11 November 2008 - 02:51 PM
Can you think of any sites that do this well, that I could use as an example?
Posted 12 November 2008 - 07:53 PM
Include links to the pages you want indexed. You can have multiple site map pages if necessary, to organize and prioritize the links.
Bottom line, if the only way to get to a page is by submitting a search form, it's highly unlikely that page will get indexed.
Posted 12 November 2008 - 08:27 PM
So true, since bots don't submit forms. At least that's the theory, or the standard SE line.
Posted 14 November 2008 - 09:42 AM
"Static URLs are in some ways the same thing as SEF URLs. Your URL can still pass in information even if it doesn't have ?state=massachusetts. You write your code in a way that parses out the state you're looking for from /state/massachusetts
So the URL is static in the sense that you're always going to bring back results for Massachusetts, and you didn't clutter it up with index.php?area=comthe_idea&state=massachusetts
Otherwise, you'd literally have to have a static page for every possible view of the database, which isn't possible.
The emails you sent were helpful in the sense that they verify the usual approach but I still think a dynamically generated sitemap may help in places where the UI fails to drill down to everywhere we want/need indexed."
Does that sound right to you all?
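For what it's worth, the kind of URL your developer describes is usually done with a server rewrite rule, so /state/massachusetts is quietly served by the same PHP script behind the scenes. On Apache that's a .htaccess fragment along these lines (the file and parameter names are hypothetical):

```apache
# Map /state/<name> onto the real script without changing
# the address the visitor (or spider) sees.
RewriteEngine On
RewriteRule ^state/([a-z-]+)/?$ index.php?state=$1 [L,QSA]
```

The search engines then see a clean, static-looking URL for each state, even though index.php is still doing all the work.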
Posted 15 November 2008 - 11:03 AM
But yes, it sounds like what your developer wants to implement would work to possibly get some additional content from your database indexed, if that's what you're looking to do.
Posted 17 November 2008 - 10:52 AM
I think it would be impossible to link to all those pages from static pages. This is why my web dev folks are thinking a Google Sitemap would be a good way to get these URLs indexed.
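A dynamically generated Sitemap along those lines is a short script: query the database for the URLs, then print them in the Sitemap XML format. In this sketch a hard-coded array stands in for the database query, and the domain is hypothetical:

```php
<?php
// Hypothetical: in the real script, $urls would come from a database
// query over the records you want indexed.
function buildSitemap(array $urls): string
{
    $xml  = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
    $xml .= "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
    foreach ($urls as $url) {
        // Escape &, <, > etc. so the XML stays well-formed.
        $xml .= "  <url><loc>" . htmlspecialchars($url) . "</loc></url>\n";
    }
    $xml .= "</urlset>\n";
    return $xml;
}

$rows = array(
    'http://www.example.com/state/massachusetts',
    'http://www.example.com/state/vermont',
);
// A real sitemap.php would also send header('Content-Type: application/xml');
echo buildSitemap($rows);
?>
```

Point Google at that script's URL the same way you would a static sitemap.xml file, and it regenerates itself from the database on every fetch.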