My site loads all of its content dynamically.
I have written a few JS functions that swap the content based on the URL's hash fragment. If someone goes to www.mysite.com/#1056, the content for that item is loaded.
function getLocationHash() {
    // check whether there is a hash fragment in the address bar
    if (window.location.hash !== '') {
        processURL();
    }
}
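For context, this is roughly how the function gets wired up (the exact wiring in my code may differ slightly, but it amounts to this):

// run on initial page load and again whenever the hash changes,
// which is what makes back/forward history work
window.addEventListener('load', getLocationHash);
window.addEventListener('hashchange', getLocationHash);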
getLocationHash then calls the processURL function:
function processURL() {
    if (window.location.hash !== '') {
        // strip the leading '#' and parse the rest as a number
        var hashId = parseInt(window.location.hash.substring(1), 10);
        // if it's a catalog item, it has a number above 1000 (e.g. #1056)
        if (hashId > 1000) {
            getDetail(hashId);
        }
    }
}
This works fine for browser history or for jumping straight to a URL on the site. However, other sites cannot follow these links. For instance, if I post www.mysite.com/#1056 as a Facebook status, FB scrapes only the www.mysite.com index page; it never runs the JS. Is this because the JS is looking for the 'window' property?
Same thing with Google crawling: I set up a sitemap with all of the hashed URLs, but Google only crawls the index page.
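From what I can tell, the hash fragment is never even sent in the HTTP request, so anything that fetches the page server-side will only ever see the index page. A minimal Node server (purely hypothetical, just to log what arrives) seems to confirm this:

var http = require('http');

// log the path of every incoming request
http.createServer(function (req, res) {
    // requesting www.mysite.com/#1056 logs just "/"; the fragment never reaches the server
    console.log(req.url);
    res.end('index page');
}).listen(8080);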
So the question is: How do I take what I have here and properly format a URL that other services like Facebook and Google can "see"?
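One direction I have been looking at is the HTML5 History API, so that each item gets a real path instead of a hash. A rough sketch of what I mean (the /item/1056 route is my own assumption, and the server would presumably also need to serve real content at that path):

// instead of setting location.hash, push a real URL onto the history stack
function showItem(id) {
    history.pushState({ id: id }, '', '/item/' + id);
    getDetail(id);
}

// handle back/forward navigation for pushed URLs
window.addEventListener('popstate', function (event) {
    if (event.state && event.state.id) {
        getDetail(event.state.id);
    }
});

Would something like that be the right approach, or is there a way to keep the hash URLs?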
Any tips would be much appreciated.