Beyond Pixel Norm-Balls: Parametric Adversaries using an Analytically Differentiable Renderer
Hsueh-Ti Derek Liu, Michael Tao, Chun-Liang Li, Derek Nowrouzezahrai, and Alec Jacobson
arXiv e-Print archive - 2018
Keywords:
cs.LG, cs.CV, cs.GR, stat.ML
First published: 2018/08/08
Abstract: Many machine learning image classifiers are vulnerable to adversarial
attacks, inputs with perturbations designed to intentionally trigger
misclassification. Current adversarial methods directly alter pixel colors and
evaluate against pixel norm-balls: pixel perturbations smaller than a specified
magnitude, according to a measurement norm. This evaluation, however, has
limited practical utility: perturbations in the pixel space do not correspond
to the underlying real-world phenomena of image formation that produce them,
and the evaluation has no attached security motivation. Pixels in natural
images are measurements of light that has interacted with the geometry of a
physical scene. As such, we propose a novel evaluation measure, parametric
norm-balls, defined by directly perturbing the physical parameters that
underlie image formation: lighting and geometry. One enabling contribution we present
is a physically-based differentiable renderer that allows us to propagate pixel
gradients to the parametric space of lighting and geometry. Our approach
enables physically-based adversarial attacks, and our differentiable renderer
leverages models from the interactive rendering literature to balance the
performance and accuracy trade-offs necessary for a memory-efficient and
scalable adversarial data augmentation workflow.
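
To make the idea concrete, below is a hedged sketch (not the authors' code) of a parametric adversarial attack in PyTorch. A toy Lambertian shader stands in for the paper's physically-based differentiable renderer, and gradient ascent on the classifier loss is performed over a lighting perturbation constrained to a parametric norm-ball. The classifier, normals, albedo, and the radius epsilon are assumed inputs, not taken from the paper.

import torch

def toy_render(normals, albedo, light_dir):
    # Differentiable Lambertian shading: pixel = albedo * max(n . l, 0).
    # Stand-in for the paper's physically-based differentiable renderer.
    l = light_dir / light_dir.norm()                          # unit light direction, shape (3,)
    n_dot_l = (normals * l.view(1, 3, 1, 1)).sum(1, keepdim=True).clamp(min=0)
    return (albedo * n_dot_l).clamp(0.0, 1.0)                 # image of shape (1, 3, H, W)

def parametric_attack(classifier, normals, albedo, light_dir, target,
                      epsilon=0.3, steps=30, lr=0.05):
    # Gradient ascent on the classification loss w.r.t. a lighting offset,
    # projected onto an L-infinity parametric norm-ball of radius epsilon
    # (assumed hyperparameters; the paper's norm-ball may differ).
    delta = torch.zeros_like(light_dir, requires_grad=True)
    for _ in range(steps):
        image = toy_render(normals, albedo, light_dir + delta)
        loss = torch.nn.functional.cross_entropy(classifier(image), target)
        loss.backward()                        # pixel gradients flow back into delta
        with torch.no_grad():
            delta += lr * delta.grad.sign()    # ascend the loss in parameter space
            delta.clamp_(-epsilon, epsilon)    # stay inside the parametric norm-ball
            delta.grad.zero_()
    return (light_dir + delta).detach()        # adversarial lighting parameters

The point this sketch mirrors from the abstract is that the loss gradient is propagated through the renderer from pixels back to the lighting parameters, so the perturbation lives in the parametric space (here, a light direction) rather than in pixel colors.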