Imagine Twitter without links

Anecdotal observation: Twitter users rely heavily on links. That’s why this Almighty Link blog has had so many posts about social media.

Imagine Twitter without hyperlinks.

  • “A plane just landed in the Hudson.” I snapped a shot of passengers on the wing. Unfortunately, I can’t share it with the world. I’m sure newspapers will be happy with the pictures the wires will shoot in a few minutes.
  • “I agree with what Pogue said about Google Voice. What do you think?” I hope it doesn’t take you too long to find that August column. That would really kill this conversation.
  • “I know a good auto shop that can help. No site, but I’ll email a link to the map.” I appreciate one-line emails. Don’t you?
  • “Orange County #followfriday: @jonlan @MelAclaro and @RochelleVeturis” To follow: select one username and hit ctrl-C. Then go to your browser’s address bar, type “” and hit ctrl-V. Rinse. Repeat. Remember, this fictional Twitter can’t just add links to @mentions.
  • “You have to check out that one Lego Star Wars video. You can watch it on YouTube.” Okay, now you go to YouTube and search. Wait a minute. That must be a mistake. There can’t really be that many Vader videos starring toy bricks.
  • “@ksablan that is just plain wrong.” I’m confused. My last few tweets were just thank yous. It would be great if there were a link to the tweet to which you are responding.
  • “Do Scoble, Brogan and Vaynerchuk just tweet all day?” No, but how would you know that? Even profiles lack links in this hypothetical Twitter. Don’t get me wrong, those three agents of trust would crush it with their naked conversations, even without Twitter links.
  • “Use TinyURL to shorten that long link.” Sorry, you won’t be able to track it to see if anyone clicked. Gosh, I wish there were other shorteners out there. Oh well.

Update (12/31/09): I’ve created a linked edition of the examples in this post.

    Hudson Storytlr mashup was editing, not “Real-Time Web”

    ReadWriteWeb’s Bernard Lunn recently said it is “the Real-Time Web that will unseat Google” and linked to a presentation I created as an example of that real-time web. I’m flattered that he called it “the future of media,” but have to admit that it was the product of nothing more than old-fashioned editing.

    I documented the landing of US Airways flight 1549 on the Hudson river by pulling content from various social tools and editing it with the Storytlr lifestreaming platform.

    One definition of the word edit is “to collect, prepare, and arrange materials for publication.”

    • Traditional journalists collect comments, photos and video by conducting interviews and shooting still and moving images. I collected materials by searching Twitter, Flickr and YouTube.
    • Traditional journalists prepare notes by transcribing them. They prepare digital pictures by adding meta information and saving them. They prepare video footage by transferring and logging it. I prepared content by importing it via RSS feeds.
    • Traditional journalists arrange words, pictures and video with word processors, layout programs and video editing software. I arranged parts of the story by identifying those that helped create a story, and deleting the rest.

    Not real-time

    Storytlr can aggregate information in real-time, but the Hudson piece was created by pulling in not-so-real-time content, at least 8 hours after the Hudson landing took place. It was only in hindsight that I was able to look at the pieces and construct a rough story.

    Lunn’s prediction might be right. Storytlr “could become an event-streaming mashup platform for media.” But that can only happen if the person using the tool can quickly identify interesting subjects and add their feeds to the live stream. That human will need to do that work in real-time.

    Take a look at this small sample of tweets sent during the inauguration and tell me if you can identify a story worth following …

    inaug09 tweets on January 20


    Linked to from this post:

    Hudson crash, lifestreamed by storytlr

    [Update: My new post addresses the idea that this mashup was an example of “Real-Time Web.”]

    I used storytlr to gather feeds from Twitter, Flickr, YouTube and Vimeo to create this aggregated “story” about yesterday’s crash of US Airways Flight 1549 into the Hudson river. Click on the image below to watch the story. The links are “hot”, so click when you see blue.

    Storytlr calls itself a “platform to build the centralized you.” Although intended to tell one person’s story, it does a fine job of pulling together bits of information from various “citizens” to create one story.

    To make the presentation, I used Twitter Search to find comments with the word “Hudson” from people within 50 miles of New York City. I waded through those comments to find a handful of people whose comments, pictures and video might work to tell a story.

    I imported the content from each of the sources into storytlr by simply providing their usernames.

    The hard part was editing, or what Tim Windsor calls curating, the approximately 700 bits of information into some semblance of a disjointed story.

    Storytlr orders content strictly by chronology. There is no chance to move one piece of information up or down a little to improve the flow of the story.

    The result is a stream of moments captured by individual storytellers, the “lifestream” not of a person, but of an event.

    This particular story says a whole lot about the power of “citizen journalism” while showing how active mainstream media has become in social spaces. (That analysis really needs another post altogether.)

    Unfortunately, storytlr does not credit each piece of content. Remember, it wasn’t meant to be used to tell the story of multiple lives.

    So here is a list of the sources I tapped to create this presentation. Check them out and follow them if their content fits your interests.



