
David Porfirio
dporfirio@wisc.edu
People and Robots Lab
Computer Sciences Department
University of Wisconsin–Madison

Download my CV


About Me
I'm a PhD candidate at the University of Wisconsin–Madison. My goal is to investigate and build new technologies that help people program social robots. To that end, I create software and hardware interfaces that capture the intent of interaction designers and developers and convert that intent into programs by means of program verification and synthesis.

I am mentored by Bilge Mutlu, Aws Albarghouthi, Allison Sauppé, and Maya Cakmak.

Some recent updates:
5.11.21
I presented Figaro, our tabletop authoring environment, at CHI 2021.
1.21.21
Figaro, our tabletop HRI authoring environment, has been accepted to CHI 2021.
12.19.20
I presented my research at Talking Robotics.
9.23.20
I attended the Microsoft Research AI Breakthroughs event.
6.20.20
I completed my preliminary examination and am officially a dissertator!





Projects


Figaro: A Tabletop Authoring Environment for Human-Robot Interaction

CHI 2021 pdf, video, github

Figaro is a tabletop authoring environment in which demonstrators use figurines to play out scenes of human-robot interactions. In each scene, Figaro records the positions, movement, and speech of the figurines, in addition to actions enacted on the figurines themselves. Figaro then synthesizes a full human-robot interaction program that can be executed on a robot.


Transforming Robot Programs Based on Social Context

CHI 2020 pdf, video, github

We developed a novel method for automatically modifying a robot program after it has been deployed on a physical robot. The modifications aim to maximize user experience in a specific interaction context while maintaining adherence to a prespecified set of baseline, context-free social norms.


Bodystorming Human-Robot Interactions

UIST 2019 pdf, video, github

We developed a programming environment, Synthé, that enables design teams to act out, or bodystorm, human-robot interactions. Designer demonstrations are converted to execution traces, which are then used as input to an inductive synthesis algorithm that synthesizes a full human-robot interaction program from scratch.
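To give a rough flavor of the trace-merging step (this is a simplified sketch, not Synthé's actual algorithm; the event names are hypothetical):

```python
# Minimal sketch: merge bodystormed execution traces into the skeleton of an
# interaction program. Each trace is a sequence of observed interaction events.

def synthesize_skeleton(traces):
    """For each event, record which events were observed to follow it,
    yielding a transition structure merged across all demonstrations."""
    program = {}
    for trace in traces:
        for current, nxt in zip(trace, trace[1:]):
            program.setdefault(current, set()).add(nxt)
    return program

# Two hypothetical demonstrations of a greeting interaction.
traces = [
    ["human_enters", "robot_greets", "human_replies", "robot_asks_task"],
    ["human_enters", "robot_greets", "robot_asks_task"],
]
program = synthesize_skeleton(traces)
# program["robot_greets"] == {"human_replies", "robot_asks_task"}
```

Merging traces this way captures the branching observed across demonstrations; a real inductive synthesizer would additionally generalize beyond the exact events seen.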


Authoring and Verifying Human-Robot Interactions

🏆 UIST 2018 (Best Paper Award) pdf, video, github

We developed a visual programming environment that allows people to design human-robot interaction programs and receive real-time feedback on whether these programs violate social norms. To provide this feedback, we model in-progress human-robot interaction programs as transition systems and express context-specific social norms in temporal logic. The transition systems and social norms are then passed to an off-the-shelf model checker.