
Eventful: Journalism as a Service Through Crowdsourcing

By andresmh, 4 months ago

Tags: research civic media

What if requesting an event report was as easy as ordering an Uber ride?

In a previous blog post we described the process of leveraging collaborative writing tools and TaskRabbit for local news production. We then began to wonder what an automated, streamlined platform for journalism-as-a-service would look like. Would there be a way to connect on-demand labor with those who need a report of an event?

We built a system that supports exactly this process. Eventful makes the service of event reporting accessible to anyone. Behind the scenes, a crowd of people working online or at the event carries out the reporting and produces a news report.


Eventful makes requesting a report a one-step process: identify the event type, location, date, and duration. A few hours later, and for under $150, a multimedia event report is ready.
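The post doesn't show the request form itself, but a request with those four fields might be modeled as follows (field and class names here are illustrative assumptions, not the actual Eventful schema):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ReportRequest:
    """A one-step report request: event type, location, date, duration."""
    event_type: str       # e.g. "public meeting", "community festival"
    location: str
    start: datetime
    duration_hours: float

# Example: a city council asking for coverage of a public meeting
request = ReportRequest(
    event_type="city council meeting",
    location="City Hall",
    start=datetime(2015, 5, 12, 19, 0),
    duration_hours=2.0,
)
```

Keeping the request this small is what makes the process "as easy as ordering an Uber ride": everything else happens behind the scenes.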

Eventful aims to spare community groups that organize local events on a tight budget from scrambling to cover the event themselves (or risking no coverage at all). A city council could use the system to request reports of discussions that take place during important public meetings. A local blogger could solicit news of community events that they can’t attend themselves.


Eventful assigns reporting tasks, called “missions,” to on-demand paid workers or to event attendees who act as field reporters. Field reporters capture and submit photos and interviews through their mobile phones as they progress on their missions.

As reporters submit material in real-time, curators working online give them feedback, approving their work or requesting more information. Curators pass on the content feed to a writer or composer who crafts a news report on the spot.
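The curator's approve-or-request-more loop described above could be sketched like this (a minimal illustration; the function, field names, and approval rule are assumptions, not the project's actual logic):

```python
# Hypothetical sketch of the real-time curation loop: a curator reviews
# a reporter's content feed and either approves it for the writer or
# sends feedback asking for more material.

APPROVED, NEEDS_MORE = "approved", "needs_more"

def curate(feed, min_captioned_photos=3):
    """Approve a feed or return feedback for the field reporter."""
    photos = [item for item in feed
              if item["kind"] == "photo" and item.get("caption")]
    if len(photos) >= min_captioned_photos:
        return APPROVED, "Passing your feed on to the writer."
    return NEEDS_MORE, "Please submit more captioned photos of the event."

feed = [
    {"kind": "photo", "caption": "Opening remarks"},
    {"kind": "interview", "caption": "Q&A with the organizer"},
]
status, feedback = curate(feed)
# Only one captioned photo so far, so the curator requests more.
```

In the deployed system this exchange happened continuously during the event, so the writer could compose the report "on the spot" rather than after the fact.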


We tried out this process of crowdsourcing local event reporting on eleven events. For some events we recruited amateur reporters from TaskRabbit; for others, we recruited volunteer attendees from the community, or a mixture of both. Through these deployments, we observed the following:

  1. Cost. Reporting on events lasting up to two hours, with 2-4 field reporters, a writer and a curator, costs less than $150.
  2. Need for tools. Using existing tools, such as Twitter, Soundcloud, and Youtube, for content proved problematic. These tools are not designed to deliver structured instructions to workers, and workers themselves were often unfamiliar with them and had to struggle through a long learning curve. With this in mind, we built a mobile web app that enables reporters to receive instructions and feedback as well as submit content through the same system.
  3. Selfies for Quality Assurance. Online crowdsourcing can tempt people to claim pay without doing the work. Luckily, it’s hard to fake attending a small event, especially when submitting photos and interviews. Of our 11 experiments, only two events drew a number of poor submissions. One recruit took photos from the Internet and submitted them as his own work. In response, we devised a “Selfie Quality Assurance” method: to prove they were taking photos of the actual event, reporters had to take a selfie while there.
  4. Recruitment constraints. We discovered that paid workers booked through TaskRabbit required at least one day of advance notice. When we hired workers and told them where to go, they often asked exactly what to do at the event, expecting detailed instructions more than 2 hours in advance of the event’s beginning.
  5. Cognitive load on attendees. When leveraging existing attendees to report on events, we received content that was much richer in context about the event or community. On the other hand, reporting duties disrupted the volunteers’ own experience of the event.
  6. Objective Paid Workers. Paid workers did not usually know the context of the event they were attending, which community members reading their reports noticed. Naturally, this led field workers to report facts “too objectively” for some local readers’ tastes. A local blogger re-framed one article written by a crowd-worker with appropriate context for reuse on her blog.
  7. Task Structure. Workers and volunteers consistently appreciated structured instructions. One of our workers was even happy to have picked up her task: The detailed directions, she explained, trained her for a job she otherwise couldn’t have done and expanded the range of tasks she would be able to do on TaskRabbit in the future.
  8. Engagement. We were surprised by the amount of deep personal connection the process of crowdsourcing journalism can foster. Our event managers took their roles very seriously. The majority of our reporters submitted great work, and some of them engaged personally with the events and the writers. In both roles, the workers sometimes went beyond their job description and seemed to enjoy the process.    
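The cost figure in finding 1 can be sanity-checked with a back-of-the-envelope calculation. The $12/hr rate below is an assumption for illustration only, not a figure reported by the project:

```python
# Rough budget check for a two-hour event with 4 field reporters,
# 1 writer, and 1 curator, all paid a flat hourly rate (hypothetical).
def event_cost(field_reporters, hours, hourly_rate, writers=1, curators=1):
    workers = field_reporters + writers + curators
    return workers * hours * hourly_rate

cost = event_cost(field_reporters=4, hours=2, hourly_rate=12)
# 6 workers * 2 hours * $12/hr = $144, under the $150 budget
```

Even at the upper end of the 2-4 reporter range, a modest hourly rate keeps a two-hour report within the stated budget.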

This is work by Elena Agapie on a project we collaborated on while she was a FUSE visitor.
