User Interfaces and Scheduling and Planning

An ICAPS 2018 Workshop
Delft, The Netherlands
June 26, 2018


The User Interfaces and Scheduling and Planning (UISP) workshop will be held at the ICAPS 2018 conference. It focuses on bridging the gap between automated planning and scheduling technologies and user interface technologies, each of which can both support and benefit from the other.

Topics and Objectives

Automated planning and scheduling technologies have been useful in applications ranging from robotics to factory organization to travel design; many of these applications have been designed by members of the ICAPS community. The utility of automated planning and scheduling systems, however, is often constrained by the design of their user interfaces. Members of the ICAPS community have noted that automated planning and scheduling technologies are being overlooked in real-world domains where they could be used; the lack of good user interfaces may be one reason for this.

A parallel thread is the potential for automated planning and scheduling to help design user interfaces. Workflows for many different user interface tools can be constructed using planning systems as well as other automated reasoning technologies. Historically, there have been only a small number of investigations of this type; this workshop presents a new set of challenges to the ICAPS community, and aims to revive interest in past research initiatives, toward designing better user interfaces.

The time is also right for the ICAPS community to investigate novel user interface modalities such as natural language processing and augmented reality as ways to facilitate human-planner interaction. While natural language processing systems have been developed over at least the past 20 years, the advent of commodity spoken language systems (e.g., Siri) and natural language processing systems on a chip provides exciting opportunities for integration with automated planning and scheduling. Augmented reality is an emerging technology; when coupled with computer vision systems, it provides new, potentially disruptive methods for supporting plan execution, if not planning itself, and augmented reality systems may in turn benefit from automated planning and scheduling technology as a form of user interface.

The goals of this workshop are thus: 1) to emphasize how automated planning and scheduling and user interfaces can support each other; 2) to explore how user interfaces can help companies and everyday users better understand automated planning and scheduling for their own applications; and 3) to discuss how automated planning and scheduling can be used to improve user interfaces for everyday interaction. Particular topics under each of these goals include, but are not limited to:

User interfaces for automated planning and scheduling

  • Plan and schedule visualization.
  • Mixed initiative planning and scheduling.
  • Emerging technology for human-planner interaction.
  • Modeling tools and language designs to facilitate planning domain construction.
  • Metrics for human readability / comprehensibility of plans and schedules.

Automated planning and scheduling for user interfaces

  • Representing and solving planning domains for user interface creation and design tasks.
  • Plan, activity, and intent recognition of users' interactions with interfaces.
  • Improving user experience via personalized constraints and objective preferences.
  • Developing user (mental) models with description languages and decision processes.

We also invite participation from the intelligent user interface (IUI), artificial intelligence for interactive digital entertainment (AIIDE), and human-computer interaction (HCI) communities.

Solicitation and Submission Guidelines

Authors may submit several types of papers to encourage engagement among the variety of communities involved in the themes of this workshop:

  • Ongoing or preliminary research may be submitted as long (up to 8 pages + 1 page exclusively for references) or short (up to 4 pages + 1 page exclusively for references) papers in AAAI format.
  • Work recently published at related venues may be submitted for presentation without publication, but a 2-page (including references) extended abstract in AAAI format should accompany the published work for review and inclusion in the proceedings.
  • Descriptions of applications that may benefit from user interfaces for/with scheduling and planning may be submitted as extended abstracts (up to 2 pages + 1 page exclusively for references) in AAAI format.
  • NEW!: We also encourage submissions that demonstrate ways to use natural language interfacing between users and any aspect of automated planning and scheduling. The advent of commodity spoken language systems (e.g., Siri) and natural language processing systems on a chip provides exciting opportunities for integration with automated planning systems. Conceptual discussions based on relevant literature, as well as related field studies that show the promise of these technologies, will be considered alongside current work.

Submissions should be e-mailed to freedman@cs.umass.edu. Please state the submission type in the message and attach a copy of the submission for review.

Important Dates

  • Paper submission deadline: April 2, 2018 (extended from March 13; UTC-12 timezone)
  • Notification of acceptance: April 20, 2018
  • Camera-ready paper submissions: May 22, 2018
  • Workshop date: June 26, 2018

Organizing Committee

  • Jeremy D. Frank, NASA Ames Research Center
  • Richard G. Freedman, University of Massachusetts Amherst
  • J Benton, NASA Ames Research Center
  • Ronald P. A. Petrick, Heriot-Watt University

Program Committee

  • Amedeo Cesta, Italian National Research Council, ISTC-CNR
  • Tathagata Chakraborti, Arizona State University
  • Scott Sanner, University of Toronto
  • Neil Yorke-Smith, Delft University of Technology

Confirmed Invited Speakers

  • David Kortenkamp, TRACLabs Inc.
  • Neil Yorke-Smith, Delft University of Technology

Program

Tuesday (June 26, 2018)

09:00  Welcome and Introduction

09:10  Invited Talk: Who is the Scheduler?
       Neil Yorke-Smith

09:50  Paper Talks #1: Interactive Modalities for Automated Planning and Scheduling

       MA-RADAR - A Mixed-Reality Interface for Collaborative Decision Making
       Sailik Sengupta, Tathagata Chakraborti, and Subbarao Kambhampati

       NL2PDDL: A Conversational Interface for Model Generation and Iteration
       Kshitij P. Fadnis and Kartik Talamadupula

       Visualizations for an Explainable Planning Agent
       Tathagata Chakraborti, Kshitij P. Fadnis, Kartik Talamadupula, Mishal Dholakia, Biplav Srivastava, Jeffrey O. Kephart, and Rachel K. E. Bellamy

10:35  Coffee Break

11:00  Invited Talk: Transitioning a Plan Execution System from Research to Deployment
       David Kortenkamp

11:40  Paper Talks #2: Execution Display and Monitoring via User Interfaces

       Generating Human Work Instructions from Assembly Plans
       Csaba Kardos, András Kovács, Balázs E. Pataki, and József Váncza

       Projection-Aware Task Planning and Execution for Human-in-the-Loop Operation of Robots in a Mixed-Reality Workspace
       Tathagata Chakraborti, Sarath Sreedharan, Anagha Kulkarni, and Subbarao Kambhampati

       Technologies for Mixed-Initiative Plan Management for Human Space Flight
       Melissa Baltrusaitis, Karen Feigh, Martijn IJtsma, Amy Pritchett, William Lassiter, and Martin Savelsbergh

12:25  Closing and Final Remarks

12:40  UISP Workshop Group Lunch

Accepted Papers

  • Generating Human Work Instructions from Assembly Plans
    Csaba Kardos, András Kovács, Balázs E. Pataki, and József Váncza
  • Technologies for Mixed-Initiative Plan Management for Human Space Flight
    Melissa Baltrusaitis, Karen Feigh, Martijn IJtsma, Amy Pritchett, William Lassiter, and Martin Savelsbergh
  • NL2PDDL: A Conversational Interface for Model Generation and Iteration
    Kshitij P. Fadnis and Kartik Talamadupula
  • Projection-Aware Task Planning and Execution for Human-in-the-Loop Operation of Robots in a Mixed-Reality Workspace
    Tathagata Chakraborti, Sarath Sreedharan, Anagha Kulkarni, and Subbarao Kambhampati
  • MA-RADAR - A Mixed-Reality Interface for Collaborative Decision Making
    Sailik Sengupta, Tathagata Chakraborti, and Subbarao Kambhampati
  • Visualizations for an Explainable Planning Agent
    Tathagata Chakraborti, Kshitij P. Fadnis, Kartik Talamadupula, Mishal Dholakia, Biplav Srivastava, Jeffrey O. Kephart, and Rachel K. E. Bellamy

© 2018 International Conference on Automated Planning and Scheduling