Rachel Lavin joined the Bureau of Investigative Journalism this summer as a Google News Lab Fellow after graduating from Goldsmiths, University of London. She explains what it was like working with the Bureau Local team and what the initiative means for new journalists like her.
One of the first pieces of advice I ever received from an experienced journalist was this:
“Journalism is like being dropped into a sea full of sharks without a lifejacket. And their fins are circling you and you’re running out of time and there’s only one lifeboat and you have to elbow other people out of the way to get to it.”
Elbows were a continuing theme.
“I just can’t see you elbowing other journalists out of the way”, a career guide told my teenage self.
Three years into my journalism career, I have yet to navigate shark-infested waters, but the impression that journalism is a cut-throat, every-newspaperman-for-himself industry persists.
Perhaps it is heightened by the small marketplace I come from in Ireland. There, competition between journalists is seen as part and parcel of the job - to some, it is the core of it - and amid the turbulent disruption of “new media”, young journalists and digital journalists are still treated with hostility by the old guard. Profits are shrinking, the future is uncertain and there simply isn’t much room on the lifeboat.
My career guidance teacher was right. This attitude doesn’t really suit me - I am not an elbowing-out-of-my-way type of journalist. I am enthralled by the idealistic pursuit of journalism as a democratic pillar of society, a social good rather than purely a profit-making venture, and I see technology’s current disruption of the industry as an exciting opportunity rather than a death knell.
But is there a place for that attitude in modern journalism or should we all just get scrum-ready, and also ready to tackle the robots, as well as our fellow reporters?
One organisation says no.
The Bureau Local, the local data unit of the UK non-profit The Bureau of Investigative Journalism, launched in March of this year. I was already salivating at the mention of “non-profit journalism” and was therefore delighted to be accepted to work there this past summer as a Google News Lab Fellow.
Its goal is to promote and maintain the viability of local journalism as an essential public service, and it does this by providing a service many local news outlets do not yet have: data journalism. So at the Bureau Local, a small team of experienced investigative journalists and developer journalists has got to work, applying this exciting, emerging discipline to help revive local newspapers.
What is Data Journalism?
Data journalism, as the Bureau Local’s director Megan Lucero describes it, is using computational methods to get stories you couldn’t get otherwise. It is an exciting emerging discipline - 90% of the world’s data was generated between 2014 and 2016, according to IBM. Vast datasets are buried across the internet - on government websites, in large databases and in online reports. They are, at the moment, a relatively untapped resource for journalists.
Potentially, you can pull hundreds of story angles from a single dataset, which can then be applied to different regions all over the country. This is original research, not filtered to journalists by a PR company or a government report. It is incredibly exciting and is only now becoming a recognised expertise within modern journalism.
However, it is also a complex process, requiring specific coding skills and training in different programming languages in order to scrape, analyse and visualise large datasets.
Developer journalists, who combine these coding skills with journalism know-how, are few and far between, and data investigations are long and laborious. As such, data journalism is not yet within the reach of most local journalists, who rarely have the time to develop the skill set needed to access large national datasets and find a local angle.

At least it wasn’t, until the Bureau Local arrived, with the goal of making data journalism accessible to local outlets across the UK. The Bureau Local uses its expertise to scrape, curate and analyse large national datasets and make them available to local reporters, who can then start to dig into the story.

However, using data journalism skills to make national datasets relevant to local papers is only half the story. The Bureau Local relies on a second pillar - collaborative journalism.
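To make the idea concrete, the core move - filtering a national dataset down to the rows that matter to one local paper - can be sketched in a few lines of Python. The dataset, field names and place names below are invented purely for illustration; a real investigation would involve far messier data and much more cleaning:

```python
import csv
import io
from collections import Counter

# Hypothetical national dataset of farm pollution incidents.
# All field names and values here are invented for illustration.
NATIONAL_CSV = """local_authority,incident_type
Herefordshire,slurry spill
Herefordshire,water pollution
Shropshire,slurry spill
Powys,air pollution
Herefordshire,slurry spill
"""

def local_angle(csv_text, authority):
    """Filter a national dataset down to one local authority and
    summarise the incident types - the kind of per-area breakdown
    a local reporter could turn into a story."""
    rows = csv.DictReader(io.StringIO(csv_text))
    local = [r for r in rows if r["local_authority"] == authority]
    by_type = Counter(r["incident_type"] for r in local)
    return len(local), by_type

total, breakdown = local_angle(NATIONAL_CSV, "Herefordshire")
print(f"Herefordshire: {total} incidents")  # Herefordshire: 3 incidents
for incident_type, count in breakdown.most_common():
    print(f"  {incident_type}: {count}")
```

The same function could be run once per local authority, producing a tailored summary for each outlet in the network - which is, in miniature, the localisation service described above.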
Wait but, collaboration and journalism...together?
The type of collaboration that saw the Panama Papers become a global story in 2016 proved something: that it was possible, even necessary, to collaborate on one huge data story across multiple news organisations and countries in order to create the maximum impact.
The Bureau Local is no different. It aims to share data, tips and findings with a network of reporters nationwide, so that local newspapers across the UK can release the findings relevant to their audiences at the same time, making the stories hugely impactful at both a local and a national level.
So far the model has been incredibly successful. In its first seven months the Bureau Local helped to create over 100 local news stories through a network of 400 regional reporters, bloggers and civic tech workers. Their partners span 90 cities across the UK. And it’s only just beginning.
My Role: Developing Data Visuals
During my fellowship I helped build the Bureau Local’s first data visualisations and a stylesheet for two of its national investigations.
Although a small contribution, it was an exciting one to be part of: this was the Bureau Local’s first foray into data visualisation, and there is the potential to make data visualisations a core part of its mission.
Collaborating with the Bureau’s environmental reporters, the Bureau Local helped disseminate investigations into the rise of intensive farming and into farm pollution incidents to local outlets across the UK. The visualisations I helped build, seen above and below, used D3.js to show both the rise of intensive farms and the prevalence of different types of pollution.
We did this in a small way, by completely open-sourcing our methodology and code for the visualisations on GitHub, so that they could be replicated by others. And there is the potential to do even more - for example, by developing interactive graphs that can be broken down into their local counterparts and embedded by local websites, or an accessible “bank” of Bureau Local graphs that could be rolled out easily to local outlets across digital and print.
This is much like the Financial Times’ bold move of open-sourcing the entire stylesheet for its trademark graphs. The newspaper’s visualisations can be adapted with little knowledge of the code involved and modified across multiple platforms (such as responsive screens, video and print). Both the Bureau Local and the FT’s Visual Vocabulary have been shortlisted for innovation of the year at the British Journalism Awards.
This open-sourcing of complex coded graphs is yet another way in which the Bureau Local is promoting a revolutionary, open and collaborative approach to journalism.
Much like this small example of innovation and openness, my time at the Bureau opened up a whole new sense of what journalism, particularly collaborative data journalism, could possibly be.
As the industry struggles to adapt, maybe this is exactly the kind of life-line modern journalism needs.