Last week The New York Times launched a site redesign, the first since 2006, which seems implausible, but there it is. There has been a lot of discussion about it by many and varied people with better journalistic chops than mine. I wanted to document a few of the features I found most interesting. Broadly speaking, the new site looks cleaner, more spare, less like a derivative of the print edition and more like a digital publication. That may have been the point…
I’m reading a fascinating book by Frank Rose, The Art of Immersion.
Reading it got me thinking about the now rather abused phrase “high quality content”. We all know what this means; we all recognise it when we see it. But this book got me thinking more about where the need for “high quality content” came from.
Follow me and I shall explain…
But first, some context. The Art of Immersion details the profound changes taking place across all media now, thanks to the opportunity to interact with the content. Frank Rose labels this co-creation and tracks its history and, more importantly, where it might lead. I will cover this in more detail when I have finished the book; however, last night I read one passage which made me view links in a new light.
Essentially, he reminded me of something that I already knew: world wide web. There’s a clue right there. The invention and use of hyperlinks as a means of ordering information is exactly that: a web, linking simultaneously in lots of different directions to lots of different things. A link is the preferred mode of passing from A to B and then to F, Z, 23, AB47, etc. The link allows us to flit from information hub to information hub in a non-hierarchical manner.
Non-hierarchical. We can move via free association, rather than rigid taxonomy. Links are merely the mode of transport.
So, with this in mind, given that links are the means of sorting the wheat from the chaff, your content had better be extremely authoritative to be worthy of a link, lest it end up in the internet abyss. A web publisher must find it so valuable, so insightful, so funny, so cute, so unmissable that they do not want it to vanish; they want to be able to find it again and let their readers enjoy it too.
This may be an obvious point for many; for me it was a new, deeper way of looking at a familiar problem, and don’t we all need that from day to day?
In a recent NPR piece, The Tricky Business Of Predicting Where Media Will Go Next, there are some fascinating details, from the implications of an Apple conference hosted in Japan in 1992, to the details of the platform models of both The Huffington Post and BuzzFeed.
In 1992, Apple hosted a conference to discuss the intersection of technology and news. Bob Kaiser, then-editor of The Washington Post, was there, alongside the great, the good and the occasionally weird of the two industries. The letter he wrote afterwards is a treat to read: it appears that almost everything we all take for granted now in media (sharing, editing our own content, bespoke payment models) was mentioned at the conference. The only notable omission from our current smorgasbord of media options is the opportunity to insert yourself into existing films. That would be fun, although there are video games that run along similar lines.
It is rather poignant, however: in the NPR piece, Kaiser references the letter, but also how the climate of success and power at The Washington Post in 1992 meant that no one was scared enough to trial the big ideas and innovations that were necessary. To be fair, they gave it a fighting try by launching their classified ads online; however, they lost ground to the Craigslists of the world.
Moving to the present, the NPR piece goes on to cover the existing business models of The Huffington Post and BuzzFeed.
The Huffington Post is built on aggregation.
1) Use blogs from all over the web on any subject, elevate those blogs by putting them on the front page of the HuffPo site.
2) Curate the stream of incoming content, there is an enormous push on this, the editors are always being trained to hone and refine their particular editorial ‘voice’.
3) Encourage commentary (and, this is my thought, use it as a metric of success?)
4) As the business grew, invest in original content.
They have been wildly successful in this endeavour, as they started at about the same time search engines were the primary means of finding the news.

BuzzFeed is doing it slightly differently: they operate on the assumption that people now find news and content via social networks, which is certainly true of younger users. Apparently 60% of BuzzFeed’s users are 18-34, whereas the average Fox News viewer is 65. BuzzFeed does not have paywalls or subscriptions; they make their money through ad revenue. They are so optimised for social that their CMS has dynamic elements to it. I did ask the BuzzFeed editor, Ben Smith, to elaborate on this, however no response as yet. Ah, he’s a busy man and it is a Sunday, so we won’t begrudge him some peace. Rather like Reddit, the presence of an article on the front page of BuzzFeed relates directly to how widely it has been shared. So, whilst strictly democratic, it almost removes the editorial role, which I think is rather sad.
As Bob Kaiser said way back in 1992:
“Successful media provide an experience, not just bits of information… Confronted by the information glut of the modern world, I suspect even the computer-comfortable citizens of the 21st Century will still be eager to take advantage of reporters and editors who offer to sort through the glut intelligently and seek to make sense of it for them.”
This is insightful: paidcontent.org has added a Twitter summary at the head of the article you are reading. This allows you to tweet it without the 140-character dance that summarising an article usually involves. It will encourage lazy tweets, however, as some people will not bother reading the article in full, preferring simply to tweet it to their followers instead. I wonder if it would be possible to track a combination of time on site (i.e., time theoretically spent reading the article) alongside the nature of the tweet that followed, to gauge whether the article was actually read?
Scenario 1: A user spends a mere 30 seconds on a long article, followed by an automated tweet that simply repeats the offered Twitter summary.

Scenario 2: A user spends time on the page, possibly with page clicks and other interaction metrics tracked, followed by a personalised tweet on the subject.

Scenario 2 would be the more ‘valuable’ of the two.
You could set this up comparatively easily in Google Analytics, using the virtual pageviews feature (to track the interaction portion of this scenario) combined with that person’s Twitter feed. Assuming they reference your article URL, you should be able to pull this data into Excel and, frankly, have a party with it all.
You would have to factor in average reading times relative to the word count of the article, but that’s about as complicated as it would get. My knowledge of Google Analytics is entry-level at this point, but even a newbie like me could set this up… I wonder if any publishers are thinking about their data in this way? We could even devise a ranking system for the tweets themselves, factoring in the time-on-page and tweet-content metrics I mentioned above, multiplied by that person’s Twitter ‘influence’. Of course, influence on Twitter is itself something of a thorny subject, but perhaps worth investigating all the same.
[Time on Site (probably factoring in other signs of interaction, yet to be defined) × tweet uniqueness (a weighting rather than a binary flag: say 5 for a unique, personalised tweet, 1 for an automated tweet) × user’s Twitter influence]
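For what it’s worth, the formula could be sketched in a few lines of code. This is a minimal, hypothetical illustration: every name, weight and number below is my own assumption for the sake of example, not anything from a real analytics API.

```python
# A back-of-the-envelope sketch of the tweet-scoring formula above.
# All field names and weights are illustrative assumptions.

def tweet_score(time_on_page_seconds, expected_read_seconds,
                is_unique_tweet, twitter_influence):
    """Score a tweet about an article:
    time-on-site factor * tweet-uniqueness weight * influence."""
    # Cap the reading factor at 1.0 so lingering on a short article
    # cannot outscore actually finishing a long one.
    read_factor = min(time_on_page_seconds / expected_read_seconds, 1.0)
    # Weighting: 5 for a personalised tweet, 1 for an automated one.
    uniqueness_weight = 5 if is_unique_tweet else 1
    return read_factor * uniqueness_weight * twitter_influence

# Scenario 1: 30 seconds on a long article, then an automated tweet.
lazy = tweet_score(30, 300, is_unique_tweet=False, twitter_influence=40)

# Scenario 2: nearly a full read, then a personalised tweet.
engaged = tweet_score(280, 300, is_unique_tweet=True, twitter_influence=40)
```

Even with made-up numbers, the engaged reader’s tweet scores far higher than the drive-by tweet, which is the whole point of the exercise.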
Project! I had better start talking to the staff Data Scientist whiz kid.
I saw this interesting video about USAtoday.com; the link is at the foot of this post. The video goes into detail about how Fantasy Interactive (F.I for short) tackled the site redesign. Two points I found interesting:
1) They went to great lengths to flatten the architecture of the site. This is a clumsy screen grab from the video, but it demonstrates how all of the sections (tech, business, home, style, people) are the next layer down from the home page.
I just like and admire the simplicity of this approach to Information Architecture: it is good for users to know where they are in a site; it is helpful and grounds them.
2) They have the redesign for desktops and tablets, but a separate site for mobile users (which is far less pretty), plus separate apps for tablet and mobile users too. This is all commendable effort, but I can’t help thinking it is a flawed strategy. In the video, the speakers acknowledge that they want to give users a different experience on each device. This seems fatally flawed to me: isn’t it ultimately just confusing for the user to have to remember slight site differences depending on the device they happen to be using at the time? Not to mention the technical trials of using only partially responsive design.
I had thought the advantage of signing into browsers was that you could easily pick up where you left off. If each device has a slightly different UI, isn’t that counter to the general direction of travel for the internet at large? It may not disrupt your experience, but it does not ensure that you can pick up where you left off quite so easily.
This is just one site, it will be interesting to monitor if other large publishing sites follow suit.
Whilst this could be construed as a vote of no-confidence in the co.uk domain, I don’t think it is. It’s more, as Tanya Cordrey mentions in her blog post, a move towards a more international set-up.
“… this move reflects our evolution from a national print newspaper based only in the UK – reaching hundreds of thousands of people once a day – to a leading global news and media brand with an ever-growing worldwide audience of tens of millions accessing Guardian journalism every minute of every day.”
The post goes on to discuss recent scoops and triumphs concerning specifically American news. Given that the Guardian is not yet behind a paywall, whereas US ‘industry titans’ like the WSJ and The New York Times are, this does seem like a transparent power play for market share. I cannot blame the Guardian for doing this; however, it does seem like a loss leader, or rather, not the general direction of travel for the industry. How can this be sustainable for them? Their writers and editors need to eat too.
It reminds me of advice I was given when I first graduated some time ago: it’s fine to work for free to get a start, but ultimately my skills and abilities are worth something, and to continue working for free is detrimental. The Guardian may make short-term gains now, but when the playing field is level (that is, when they do start charging for content), I wonder how loyal their readers will be?
On the other side of this, The Guardian is a damn good publication and I’m pleased it might now reach a wider audience, perhaps one day papers will not be limited by country or language?
In terms of the publishing industry, wouldn’t it be cool if someone formed a digital subscription model spanning a variety of papers? Allowing users (say) 10 articles per day/week/month based on their news consumption and publication preferences. The news buffet!
Two articles from The Guardian, a few from the FT (it covers international news very well!), some more from The New York Times, and maybe a few more magazine-y publications too, like The Atlantic and The New Statesman. Wouldn’t that make for a rich, varied tapestry of news and culture? Something to delight the senses and inform the mind?
An absurdly interesting, yet thoroughly sad, article detailing the problems of restraint (or lack thereof) in “citizen journalism”, published in today’s New York Times Magazine. It may well be behind the paywall but it’s well worth a read.
In short, it calls into question the role of “new media” in breaking news via tweets and posts, and that of “old media” in using these unverified tweets and posts in a bid to get the scoop. I won’t undermine the piece by trying to summarise it here; it wouldn’t do it justice.
It does support my theory that the role of editors will take on new importance as more of us incorporate the pace and efficiency of digital news into our news-gathering habits. Right now, the news industry is still transitioning from the old model (often called Fortress Journalism) to the new, where anyone with a smartphone can express an opinion and, with canny timing, get that opinion out there. It makes me hopeful that Google’s Author Rank might contribute towards a solution, by rewarding more established subject experts with better representation in the search results and, hopefully, pushing less-established sources further down the page into obscurity.
Or will ‘we’, all of us, soon evolve to dismiss breaking news stories on sites like Twitter and Reddit altogether?
A short post I wrote for the work blog, concerning the Future of News and possible wider implications for general content production has got me thinking and hitting The Google.
There’s a public television show dedicated to it; I found it via Newseum.org (catchy name!), and whilst the format is that of a curiously old-school magazine TV show, the pundits and themes are spot on. There is an emphasis on TV news, but this is set against the rise of digital media, so it should be revealing.
The preview video isn’t embeddable (come on!) but you can find it here. La Mayer is featured, alongside a few industry veterans so my expectations for non-fluffy, thoughtful discussions are high.
My next few blog posts will be about the ideas discussed in the show and any subsequent debate.