Authoring Data-Driven Videos with DataClips

Bibliographic Details
Published in: IEEE Transactions on Visualization and Computer Graphics. - 1996. - 23(2017), 1, dated 02 Jan., pages 501-510
Main Author: Amini, Fereshteh (Author)
Other Authors: Riche, Nathalie Henry; Lee, Bongshin; Monroy-Hernandez, Andres; Irani, Pourang
Format: Online Article
Language: English
Published: 2017
Collection: IEEE Transactions on Visualization and Computer Graphics
Subjects: Journal Article; Research Support, Non-U.S. Gov't
Description
Abstract: Data videos, or short data-driven motion graphics, are an increasingly popular medium for storytelling. However, creating data videos is difficult as it involves pulling together a unique combination of skills. We introduce DataClips, an authoring tool aimed at lowering the barriers to crafting data videos. DataClips allows non-experts to assemble data-driven "clips" together to form longer sequences. We constructed the library of data clips by analyzing the composition of over 70 data videos produced by reputable sources such as The New York Times and The Guardian. We demonstrate that DataClips can reproduce over 90% of the data videos in our corpus. We also report on a qualitative study comparing the authoring process and outcome achieved by (1) non-experts using DataClips, and (2) experts using Adobe Illustrator and After Effects to create data-driven clips. Results indicated that non-experts are able to learn and use DataClips with a short training period. In the span of one hour, they were able to produce more videos than experts using a professional editing tool, and their clips were rated similarly by an independent audience.
Description: Date Completed 30.07.2018; Date Revised 30.07.2018
Published: Print
Citation Status: PubMed-not-MEDLINE
ISSN: 1941-0506