
I'm curious as to why large content websites (Gawker Media, for one) are moving towards hashbangs, as they completely destroy any ability to share a link to an article, which one would think is a fairly big traffic driver for these sites. Maybe Gawker Media's implementation is just broken, but it's a trend I've seen in several places now, with nothing good coming from it.


they completely destroy any ability to share a link to an article

In what way do they do this? If implemented correctly, the URL fragment will uniquely identify the piece of content just the same as if the information were placed to the left of the hash.
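For what it's worth, the "fragment uniquely identifies the content" point can be sketched in a few lines of JavaScript (the URL and route shape here are hypothetical, not Gawker's actual scheme):

```javascript
// Minimal sketch: everything after "#!" can identify a piece of
// content just as uniquely as a server-side path would.
function contentIdFromUrl(url) {
  const bangIndex = url.indexOf('#!');
  if (bangIndex === -1) return null;   // no hashbang present
  return url.slice(bangIndex + 2);     // everything after "#!"
}
```

So `http://example.com/#!/12345/some-article` and a hypothetical server-side `http://example.com/12345/some-article` carry the same identifying information; the only difference is which side of the hash it sits on.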

Perhaps you don't trust that it will be implemented correctly, but I think we crossed that bridge long ago with the invention of script-generated server-side content. Servlets, Perl scripts, PHP scripts, etc. have been breaking links to content for quite a while now.

Would you also say that cgi-bin scripts "completely destroy any ability to share a link"?


They seem to have fixed it now, but for months on end any link to an article with a hashbang in the URL would instead bring you to the front page of any Gawker Media site. The way they fixed it? Ditch the hashbang.

Incompetence anywhere will break things, but when the only clear case in favour of hashbangs (that I've heard of) is infinite scrolling, it's time to look at better ways of doing it. You would hope that Google would have figured out by now that they should be making URLs prettier, not googlier.


Oops, I think I misread you. I thought you were talking about how hashbangs break the shareability of links, but you were really speaking about how Gawker Media broke it. I'm not terribly familiar with their site.

...when the only clear case in favour of hashbangs (that I've heard of) is infinite scrolling

That's the first time I've heard them associated with infinite scrolling, but perhaps I've missed something.

The hash-bang idea is just about making ajax-y navigation URLs crawlable by spiders.

It's just an extension of the standard hash (without the bang), where the "url fragment" (everything after the hash) is used in AJAX applications to preserve the user's ability to bookmark pages and to use the back and forward buttons in their browser.
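As a rough sketch of that crawlability convention: under the scheme Google proposed, a spider that sees a "#!" URL requests an `_escaped_fragment_` variant instead, and the server answers with a static HTML snapshot of that state. The exact escaping rules in the real scheme differ slightly, so treat this as illustrative:

```javascript
// Translate a "#!" URL into the URL a crawler would actually fetch.
// A browser never sends the fragment to the server, so the crawler
// moves it into a query parameter the server *can* see.
function crawlerUrlFor(hashBangUrl) {
  const i = hashBangUrl.indexOf('#!');
  if (i === -1) return hashBangUrl;          // nothing to translate
  const base = hashBangUrl.slice(0, i);
  const fragment = hashBangUrl.slice(i + 2);
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```

The user-facing fancy AJAX URL and the crawler-facing static URL then refer to the same content, which is the whole point of the bang.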


I was confused by this Gawker issue for a long time, because no Gawker links at all worked for me whichever computer I was on, so I couldn't understand why people were submitting links to Reddit, or how Gawker were still getting any readers at all.

Someone pointed out to me that when I hit the URL, I was being redirected to e.g. uk.kotaku.com, and the issue appeared not to affect American users, which goes some way toward explaining my confusion.


I don't go to Gawker sites because they aren't worth turning on JavaScript for, and their sites are broken without JavaScript.

Horrible, horrible implementation.


If implemented correctly, the hashbang method can increase the number of articles people read per visit. With every article a user reads, they also see more advertising. It's a clear win for Gawker if people read more content.

It's also faster, because each article the browser has to load after the first one is smaller than a whole page load.
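The incremental-load idea sketches out roughly like this; the JSON endpoint and field names are invented for illustration, not Gawker's actual API:

```javascript
// Hypothetical sketch: after the first full page load, subsequent
// articles arrive as small JSON payloads rather than whole pages,
// and only the content area of the page is re-rendered.
function renderArticle(article) {
  return '<article><h1>' + article.title + '</h1>' +
         article.body + '</article>';
}

// In a browser this would be wired up to an AJAX request, e.g.:
//   fetch('/article/' + id + '.json')
//     .then(function (res) { return res.json(); })
//     .then(function (a) { content.innerHTML = renderArticle(a); });
```

The markup shell, scripts, and stylesheets stay in place; only the few kilobytes of article data cross the wire.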

Unfortunately, Gawker failed at the technical implementation in spectacular fashion. Though they do seem to be turning the ship around.


If your site is designed correctly, the non-content crud isn't a massive burden for the user to download; it's just some small portion of a gzipped total, and assets are cached after the first view.


Following your logic, if you cache your AJAX requests properly, it's even smaller than doing a whole page load.
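A minimal sketch of that caching idea, where `loadArticle` is a hypothetical stand-in for whatever function performs the actual AJAX request:

```javascript
// Memoize article requests so a repeat navigation costs nothing:
// the first request for an id hits the network, later ones hit
// the in-memory store.
function cached(loadArticle) {
  const store = {};                         // id -> response
  return function (id) {
    if (!(id in store)) store[id] = loadArticle(id);
    return store[id];
  };
}
```

Revisiting an already-read article then involves no request at all, which a full page load can't match even with good HTTP caching.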



