Tangi: a Tool to Create Tangible Artifacts for Sharing Insights from 360° Video

Conference Paper (2025)
Authors

Wo Meijer (Internet of Things)

Jacky Bourgeois (Internet of Things)

Tilman Dingler (TU Delft - Human-Centred Artificial Intelligence)

G. Kortuem (Internet of Things)

Affiliation
Internet of Things
To reference this document use:
https://doi.org/10.1145/3689050.3704928
Publication Year
2025
Language
English
Pages (from-to)
1-14
ISBN (print)
979-8-4007-1197-8
ISBN (electronic)
979-8-4007-1197-8
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Designers often engage with video to gain rich, temporal insights into the context of users, analyzing it collaboratively to gather ideas, challenge assumptions, and foster empathy. To capture the full visual context of users and their situations, designers are adopting 360° video, which provides richer, more multi-layered insights. Unfortunately, the spherical nature of 360° video means designers cannot create tangible video artifacts, such as storyboards, for collaborative analysis. To overcome this limitation, we created Tangi, a web-based tool that converts 360° images into tangible 360° video artifacts that enable designers to embody and share their insights. Our evaluation with nine experienced designers demonstrates that the artifacts Tangi creates support the tangible interactions found in collaborative workshops and introduce two new capabilities: spatial orientation within 360° environments and linking specific details to the broader 360° context. Since Tangi is an open-source tool, designers can immediately leverage 360° video in collaborative workshops.