News app developers share tips and tricks

Senior, American University

Three news app developers spoke at ONA10 Saturday afternoon, each showcasing their unique approaches to displaying news in nontraditional ways.

USA Today’s Jarmul: Teamwork leads to ‘really cool things’

For Katharine Jarmul, the sky is the limit when she works on a new multimedia story.

Jarmul, a designer and developer for USA Today, showcased an elaborate package she recently worked on to commemorate the five-year anniversary of Hurricane Katrina.

USA Today's Five Years Later: Hurricane Katrina employs Django and HTML5 and is optimized for mobile devices.

Taking an unconventional approach is essential in online journalism, she said.

“The first thing we need to do as journalists is we have to start thinking outside of the box,” she said. “And this means … the box of what is a story. This also means the box of what technologies do we use.”

The project was built with Django and HTML5, among other technologies, and took roughly two months to complete, she said. It’s optimized for mobile devices, including the iPad, and incorporates numerous video and interactive map elements.

Jarmul also said developers should not be intimidated by the size or scope of a project.

“Don’t be afraid to take on a large project, and figure it out as you go,” she said. “I think we need to challenge ourselves to kind of figure it out as we go along.”

USA Today, Jarmul said, emphasizes teamwork. She echoed the words of Juan Thomassie, a senior designer at the paper, who spoke at a Saturday morning session on data visualization.

“As long as you have the developer and the designer working together in unison, then you can do really cool things,” she said in a post-session interview.

PolitiFact’s Waite: It’s all in the reporting

PolitiFact, the Pulitzer Prize-winning political fact-checking website of the St. Petersburg Times, is nothing without solid reporting, said Matt Waite, the site’s developer.

The simplicity of PolitiFact's website belies the strong verification of its claims.

“If you’re going to put the full faith and reputation of the St. Petersburg Times up for question by calling a liar a liar, you better have the goods,” he said. “You better be right.”

PolitiFact is accountability journalism, which is hardly new, Waite said. What sets the site apart is that his team presents that journalism in a new format.

“We approached what we were doing not as stories …” he said. “We approached it as a structured data problem of taking a type of story and extracting the structure out of it and rebuilding a content app out of that structure.”

The content app succeeds, Waite said, because of its ability to present the public with the source materials that PolitiFact’s editors use to rate a politician’s claims. That is achieved by heavy use of linking and by using services such as DocumentCloud to serve primary source documents.

Waite said more journalists and news app developers should be concerned with structured data problems rather than content problems.

“We as an industry are not doing nearly enough with this structured data approach, of viewing what we do as a structured data problem more than a content problem,” he said.
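To make the structured-data idea concrete, here is a minimal sketch of what “extracting the structure out of” a fact-check story might look like. The field names and sample claims are invented for illustration; they are not PolitiFact’s actual schema.

```python
from dataclasses import dataclass, field

# Hypothetical model of a fact-check as structured data rather than a story.
# Field names are illustrative assumptions, not PolitiFact's real schema.
@dataclass
class Statement:
    speaker: str
    claim: str
    ruling: str  # e.g. "True", "Half True", "False"
    sources: list = field(default_factory=list)  # links to primary documents

    def scorecard_entry(self):
        """One row of a speaker's scorecard. This is only derivable because
        the ruling is a structured field, not prose buried in an article."""
        return (self.speaker, self.ruling)

# Invented sample data.
claims = [
    Statement("Candidate A", "Taxes went up 40 percent", "False",
              ["https://example.com/budget.pdf"]),
    Statement("Candidate A", "Crime fell last year", "True"),
]

# Questions like "how often is this speaker right?" fall out of the structure.
rulings = [s.ruling for s in claims if s.speaker == "Candidate A"]
print(rulings)
```

Once a story type is modeled this way, the same records can drive scorecards, speaker pages and aggregate views without any re-reporting, which is the payoff Waite describes.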

ScraperWiki’s quest to liberate the Web’s data

The London Metropolitan Police Service provides the public with a high level of transparency: Each unit posts on its website what it’s focusing on at the moment.

But there’s a catch: All that information is embedded in HTML, so getting a snapshot of what every unit is doing at once would mean visiting 620 separate websites and pulling out the data by hand.

Enter ScraperWiki. The entirely Web-based tool allows developers to automate the scraping process, said Richard Pope, one of the tool’s developers.

ScraperWiki allows its users to improve upon the work of others.

Not only does ScraperWiki scrape data that’s embedded in HTML, but it also stores and visualizes the data. The tool supports three popular programming languages: PHP, Python and Ruby.
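A toy example in the ScraperWiki spirit, using only Python’s standard library: pull structured data out of HTML that was never meant to be machine-readable. The markup below is invented for illustration; a real scraper would fetch each unit’s live page instead of parsing an inline string.

```python
from html.parser import HTMLParser

# Invented sample markup standing in for one police unit's page.
SAMPLE_PAGE = """
<html><body>
  <h2 class="priority">Burglary patrols on High Street</h2>
  <h2 class="priority">Anti-social behaviour near the park</h2>
</body></html>
"""

class PriorityScraper(HTMLParser):
    """Collects the text of every <h2 class="priority"> element."""

    def __init__(self):
        super().__init__()
        self.in_priority = False
        self.priorities = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs.
        if tag == "h2" and ("class", "priority") in attrs:
            self.in_priority = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self.in_priority = False

    def handle_data(self, data):
        if self.in_priority and data.strip():
            self.priorities.append(data.strip())

scraper = PriorityScraper()
scraper.feed(SAMPLE_PAGE)
print(scraper.priorities)
```

Run against all 620 unit pages on a schedule, a scraper like this turns hundreds of hand-maintained web pages into one queryable dataset, which is the snapshot the article says is otherwise out of reach.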

In true wiki fashion, completed scrapers are posted on its website, allowing developers to re-purpose and improve them.

ScraperWiki is to data what Wikipedia is to knowledge, Pope said.
