Averting Robot Eyes

Margot E. Kaminski, Matthew Rueben, William D. Smart, and Cindy M. Grimm.
Maryland Law Review 76(4), University of Maryland, 2017.

Home robots will cause privacy harms. At the same time, they can provide beneficial services — as long as consumers trust them. This Essay evaluates potential technological solutions that could help home robots keep their promises, avert their eyes, and otherwise mitigate privacy harms. Our goals are to inform regulators of robot-related privacy harms and the available technological tools for mitigating them, and to spur technologists to employ existing tools and develop new ones by articulating principles for avoiding privacy harms.

We posit that home robots will raise privacy problems of three basic types: (1) data privacy problems; (2) boundary management problems; and (3) social/relational problems. Technological design can ward off, if not fully prevent, a number of these harms. We propose five principles for home robots and privacy design: data minimization, purpose specifications, use limitations, honest anthropomorphism, and dynamic feedback and participation. We review current research into privacy-sensitive robotics, evaluating what technological solutions are feasible and where the harder problems lie. We close by contemplating legal frameworks that might encourage the implementation of such design, while also recognizing the potential costs of regulation at these early stages of the technology.


@article{kaminski2017averting,
  author = {Kaminski, Margot E. and Rueben, Matthew and Smart, William D. and Grimm, Cindy M.},
  title = {Averting Robot Eyes},
  journal = {Maryland Law Review},
  volume = {76},
  number = {4},
  pages = {983--1024},
  publisher = {University of Maryland},
  address = {Baltimore, MD},
  year = {2017}
}