When Oracles Go Wrong: Using Preferences as a Means to Explore.

Sheidlower, Isaac S.

Short, Elaine Schaertl.

2021

Description
  • Keywords: human-robot interaction, interactive reinforcement learning.

    Topic: Computing methodologies / Artificial intelligence

    ACM Open.
Citation:
  • Isaac S. Sheidlower and Elaine Schaertl Short. "When Oracles Go Wrong: Using Preferences as a Means to Explore." Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, 2021.
ID:
z603rc30k
Rights Note:
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee, provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists requires prior specific permission and/or a fee. Request permissions from Permissions@acm.org.