[Busec] Fwd: Synthesizing Robust Adversarial Examples Talk TODAY

Mayank Varia varia at bu.edu
Wed Dec 6 10:24:30 EST 2017


Hi everyone,

I'm forwarding an announcement for a talk at 5 PM today at MIT. The talk
shows how to 3D-print adversarial objects that fool machine learning
classifiers. (There's also a link to the paper below.)

Mayank


---------- Forwarded message ---------
From: Machine Intelligence Community <bumic at bu.edu>


Location: MIT 56-154
Time: Today 12/6/2017 @ 5 PM
The authors of the work on 3D adversarial examples, Anish Athalye, Logan
Engstrom, and Andrew Ilyas, will be presenting today. Come thru!
Paper: https://goo.gl/uay7UC
TL;DR: https://goo.gl/Lncb84
Abstract: Neural network-based classifiers match or exceed human-level
accuracy on many common tasks and are used in practical systems. Yet,
neural networks are susceptible to adversarial examples, carefully
perturbed inputs that cause networks to misbehave in arbitrarily chosen
ways. When generated with standard methods, these examples do not
consistently fool a classifier in the physical world due to viewpoint
shifts, camera noise, and other natural transformations. Adversarial
examples generated using standard techniques require complete control over
direct input to the classifier, which is impossible in many real-world
systems. We introduce the first method for constructing real-world 3D
objects that consistently fool a neural network across a wide distribution
of angles and viewpoints. We present a general-purpose algorithm for
generating adversarial examples that are robust across any chosen
distribution of transformations. We demonstrate its application in two
dimensions, producing adversarial images that are robust to noise,
distortion, and affine transformation. Finally, we apply the algorithm to
produce arbitrary physical 3D-printed adversarial objects, demonstrating
that our approach works end-to-end in the real world. Our results show that
adversarial examples are a practical concern for real-world systems.
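
For readers curious what the core optimization might look like, below is a
minimal, illustrative sketch in PyTorch of the expectation-over-transformations
idea from the abstract: optimize a perturbation so that the classifier's loss
toward a chosen target class is low in expectation over a distribution of
transformations. Everything here (the model, the sampled transformations, and
all hyperparameters) is a hypothetical stand-in, not the authors'
implementation.

import torch
import torch.nn.functional as F
import torchvision.transforms.functional as TF

def random_transform(x):
    # Sample one transformation from an assumed distribution:
    # a random rotation followed by additive Gaussian noise.
    angle = float(torch.empty(1).uniform_(-30.0, 30.0))
    return TF.rotate(x, angle) + 0.02 * torch.randn_like(x)

def eot_attack(model, x, target, steps=200, eps=8/255, lr=0.01, samples=10):
    # x: batch of images in [0, 1]; target: desired (wrong) class labels.
    delta = torch.zeros_like(x, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        # Monte Carlo estimate of the expected loss over transformations.
        loss = sum(F.cross_entropy(model(random_transform(x + delta)), target)
                   for _ in range(samples)) / samples
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)                   # bound the perturbation
            delta.copy_((x + delta).clamp(0, 1) - x)  # keep image in range
    return (x + delta).detach()

For the physical 3D-printed objects in the paper, the distribution would
presumably range over renderings of a textured object (pose, lighting, camera
noise) rather than 2D image transformations, with a differentiable renderer in
the loop.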

-- 
Boston University Machine Intelligence Community