I ran across this posting on Slashdot last night and thought that it was worth mentioning here:
Indus Khaitan writes "Thanks to twitter, SMS, and mobile web, a lot of people are using the url minimizers like tinyurl.com, urltea.com. However, now I see a lot of people using it on their regular webpages. This could be a big problem if billions of different links are unreachable at a given time. What if a service starts sending a pop-up ad along with the redirect. What if the masked target links to a page with an exploit instead of linking to the new photos of Jessica Alba. Are services like tinyurl, urltea etc. taking the WWW towards a single point of failure? Is it a huge step backward? Or I'm just crying wolf here?"
That aside, this does present a question for the net in general... what if all of these shortened URLs suddenly stop working? What would that do to link-dependent analyses such as Google's PageRank? Anywho - just some things to ponder.
Written by Marshall Kirkpatrick
The link shortening and redirection service TinyURL went down, apparently for hours, last night, rendering countless links broken across the web. Complaints have been particularly loud on Twitter, where long links are automatically turned into TinyURLs and complaining is easy to do, but the service is widely used in emails and web pages as well. The site claims to serve 1.6 billion hits each month.
There are many free public alternatives to TinyURL, some with better ancillary features (see elfurl.com for just one example). The name TinyURL is very literal and memorable though. I use SNURL more often, myself.
It's not good when so much of the web runs through a single service. For some, politics could be a consideration as well as technical concerns. The man behind TinyURL, Kevin Gilbertson, uses his hugely popular website to promote US presidential candidate Ron Paul, which I personally find somewhat distasteful, and encourages people to use TinyURL to obscure affiliate links on their webpages - which strikes me as extremely distasteful. Presumably a Paul supporter would want our redirects to run wild and free too, unbeholden to a centralized service provider capable of holding us under its thumb (I joke, but really.)
URL shorteners are important because they make long links much easier to communicate. The print world could learn a thing or two from these services; InfoWorld magazine, for example, used to publish very short redirects through infoworld.com for all links it discussed. That's great for efficiency and brand recognition and makes me wonder whether all of us ought to have our own private TinyURL service.
Some sort of distributed standard or tool could be good as well. The Online Computer Library Center (OCLC) has run Purl.org (Persistent Uniform Resource Locator) since the 1990s, but its user experience is something only a librarian would put up with. A public institution solving this problem gracefully might be about as realistic as it would have been for the Library of Congress to have acquired Del.icio.us (my fantasy) instead of Yahoo!
The moral of the story, though, is that it isn't supposed to work this way. There ought not be one single point of failure that can so easily break such a big part of the web.
Update 2:
Dave Winer has commented here with:
Creating a maintainable and thriving web
Steve Rubel writes about the danger of routing all our URLs through TinyUrl. I love what URL-shorteners do, it's especially important in Twitter when you're limited to 140 characters to express an idea. If you have to include a link, that could use up a lot of the space you have. The problem is if everyone uses TinyUrl, as Twitter does, what happens when TinyUrl goes down or is sold to someone we don't like, or disappears forever? I admit I don't know the owners of TinyUrl and what their motives are. Their service is reasonably long-lived, reliable and quick. Even so I've written my own URL-shortener and am running it on one of my servers, and I try to use it whenever possible. However, like all my sites, this one will likely disappear within a few days of my passing. I have to maintain my servers to keep them running. A better solution is surely needed. Rubel's epiphany just exposes the tiniest sliver of the huge problem below, creating a sustainable web. We're nowhere as far as that's concerned.
Could a Billion TinyURLs Go 404?
TinyURL, a free and extremely popular five-year-old web service that shortens URLs and is a staple of tools like Twitter, has suffered some brief downtime lately. It's down as of this writing, as you can see from the screen shot below. As a result, some are starting to imagine what might happen if such a single point of failure should go down for an extended period of time or, worse, shut down or be acquired. Twitter is far from the only company using the TinyURL API service.
The thought of an evaporating TinyURL - a wonderful tool that, I remind you, is provided to us all for free - is, especially considering its rising popularity, more than a little bit frightening, yet fascinating. Take a look at the chart below, which comes from Google Trends. It shows TinyURL clearly rising. No wonder it's having a hiccup!
Perhaps Google or someone else will buy TinyURL at some point. Still, that's not a good solution since all roads still lead to one. All of this points to a big weak spot in the web as more people and services rely on the terrific TinyURL service (and its alternatives).
Update 3: David Berlind at CNET says: Yesterday, Slashdot asked ‘What if TinyURL goes down?’ Today, it’s down (and it hurts).
At the end of October one of the admins of the world's largest Bittorrent site sat down for an interview and predicted the protocol's demise. Citing Bittorrent, Inc.'s corporate ties and some technical limitations, brokep announced that The Pirate Bay was working on a new protocol to succeed Bram Cohen's Bittorrent. The idea's been percolating throughout the filesharing scene since then: a small survey of top site admins conducted by TorrentFreak found opinion divided over whether Bittorrent will be replaced.
It's true that the protocol's been asked to do things that its creator didn't envision. Clients now use encryption to get around ISP traffic shaping and sometimes pad files to improve interoperability with other networks. DHT functionality, which removes the need for a central tracker, was implemented in a chaotic, piecemeal fashion. Private trackers have to monkey around with torrents' announce URLs in order to monitor individual users' activity. Torrent files lack metadata. Traversing firewalls remains an issue. And various researchers have created custom clients that prove the protocol can be subverted by selfish users. There are tacked-on, vulnerable and subpar aspects to the way Bittorrent works - plenty of room for improvement, in other words.
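Much of that tracker-level gymnastics revolves around the .torrent file's bencoded "info" dictionary: the SHA-1 hash of its raw bytes (the infohash) is how trackers, DHT nodes and private sites identify a swarm. As a sketch of what a client actually parses, here is a minimal bencode decoder plus infohash computation; the sample torrent bytes in the test are fabricated for illustration:

```python
# Minimal bencode decoder (the encoding used by .torrent files) and
# infohash computation. Bencode has four types: integers i<n>e, byte
# strings <len>:<bytes>, lists l...e and dicts d...e.
import hashlib

def bdecode(data: bytes, i: int = 0):
    """Decode one bencoded value starting at offset i; return (value, next_i)."""
    c = data[i:i + 1]
    if c == b"i":                       # integer: i<digits>e
        end = data.index(b"e", i)
        return int(data[i + 1:end]), end + 1
    if c == b"l":                       # list: l<items>e
        i += 1
        items = []
        while data[i:i + 1] != b"e":
            v, i = bdecode(data, i)
            items.append(v)
        return items, i + 1
    if c == b"d":                       # dict: d<key><value>...e
        i += 1
        d = {}
        while data[i:i + 1] != b"e":
            k, i = bdecode(data, i)
            v, i = bdecode(data, i)
            d[k] = v
        return d, i + 1
    colon = data.index(b":", i)         # byte string: <len>:<bytes>
    length = int(data[i:colon])
    start = colon + 1
    return data[start:start + length], start + length

def infohash(torrent_bytes: bytes) -> str:
    """SHA-1 hex digest of the raw bencoded info dict in a .torrent file."""
    assert torrent_bytes[:1] == b"d", "a .torrent file is a bencoded dict"
    i = 1
    while torrent_bytes[i:i + 1] != b"e":
        key, i = bdecode(torrent_bytes, i)
        start = i
        _, i = bdecode(torrent_bytes, i)    # skip value, tracking its span
        if key == b"info":
            # Hash the value's raw bytes, not a re-encoding of it.
            return hashlib.sha1(torrent_bytes[start:i]).hexdigest()
    raise ValueError("no info dict found")
```

Note that the hash is taken over the raw byte span of the info value, never a re-encoding: the slightest byte difference would yield a different infohash and a different (empty) swarm, which is part of why retrofitting new fields onto the format is so delicate.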
But assuming a technically superior standard is produced, will it be adopted? It's easy to find examples for and against: the Ogg Vorbis audio codec offers better sound quality than MP3, no licensing entanglements, and several awfully cool features (like the ability to reduce a file's size without reencoding it). But Ogg has never really caught on. Some users employ the also-technically-superior WMA and AAC formats, but only to the extent that Microsoft and Apple force them. For most users, MP3 seems to be good enough. On the other hand, online video has adopted new codecs almost as soon as they became available, moving from VCD to SVCD to MPEG to DivX to Xvid and beyond. The situation's so complex that utilities exist for the sole purpose of untangling a given AVI's miasma of codecs.
What makes these cases differ? It all comes down to timing: consumers will switch technical standards so long as doing so carries few costs (i.e. only requires that more free software be downloaded). Ogg Vorbis hadn't attracted enough attention by the time portable MP3 players arrived. Once the supply chain for MP3 decoding chips was established and a generation of compatible players purchased, the game was pretty well decided. By comparison, only a handful of exotic DVD players bother to support the video formats commonly found on P2P networks. Most portable digital video players still count on users recompressing their files to save space and conserve CPU cycles. Once there's an established infrastructure — of either hardware, accumulated code or simple corporate momentum — consumers may stick with suboptimal technical standards. But prior to that point, users will stay close to the cutting edge.
Bittorrent seems to be on the cusp of this transition. Some hardware devices are coming to market with the standard baked in, but not too many. Various organizations like Miro, Joost and Blizzard Software are building parts of their business around the protocol, but not in an irreversible manner. If Bittorrent gains much more attention, its supporting infrastructure of trackers and open source projects will likely trump whatever advantages a new standard can offer. But I think that there remains a window of opportunity for elite users to popularize a new protocol, should they settle on one. Brokep and his peers still have a few months to steal BT's thunder.
(Via Techdirt.)