Today in CV Dazzle News

Adversarial Patch:

These adversarial patches can be printed, added to any scene, photographed, and presented to image classifiers; even when the patches are small, they cause the classifiers to ignore the other items in the scene and report a chosen target class.

We present a method to create universal, robust, targeted adversarial image patches in the real world. The patches are universal because they can be used to attack any scene, robust because they work under a wide variety of transformations, and targeted because they can cause a classifier to output any target class.
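The core idea behind such patches is to optimize the patch pixels to maximize the classifier's target-class score *in expectation* over random scenes, placements, and transformations, so the result works anywhere. A minimal numpy sketch of that loop, using a toy linear classifier as a stand-in for a real network (the paper attacks models like Inception-v3 via backpropagation; all names and constants here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

IMG, PATCH, CLASSES = 32, 8, 10
TARGET = 3  # the attacker's chosen target class

# Toy stand-in for a differentiable classifier: logits = W @ image.flatten().
W = rng.normal(size=(CLASSES, IMG * IMG)) * 0.1
W_img = W[TARGET].reshape(IMG, IMG)  # gradient of the target logit w.r.t. pixels

def apply_patch(scene, patch, x, y):
    """Paste the patch over a square region of the scene."""
    out = scene.copy()
    out[y:y + PATCH, x:x + PATCH] = patch
    return out

def target_logit(image):
    return float(W[TARGET] @ image.ravel())

# Train the patch: average the gradient over many random placements
# (the "expectation over transformations"), then take a projected
# gradient-ascent step, keeping pixel values in [0, 1] so it stays printable.
patch = np.full((PATCH, PATCH), 0.5)
for step in range(300):
    grad = np.zeros_like(patch)
    for _ in range(32):
        x, y = rng.integers(0, IMG - PATCH, size=2)
        # For a linear model the gradient w.r.t. the patch is just the
        # slice of W_img it covers; a real attack would backpropagate
        # through the network at each sampled placement.
        grad += W_img[y:y + PATCH, x:x + PATCH]
    patch = np.clip(patch + 0.5 * grad / 32, 0.0, 1.0)
```

Because the objective is averaged over placements rather than tied to one image, the trained patch raises the target-class score wherever it is pasted, which is what makes it "universal" in the paper's sense.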



5 Responses:

  1. MattyJ says:

    Someone needs to put that on a "Make America Private Again" hat.

  2. Zach says:

    I, for one, await the first murder by putting an adversarial patch on someone's back so electric cars think they are a small piece of trash instead of a person.

  3. Otto says:

    So, after the stickers become popular and well-used, they start being recognized as stickers. Seems like a self-solving problem.

    • margaret says:

      like buffer overflows. once one person figured out how to filter out one malicious string, the problem went away for everyone accepting user inputs and all malicious strings. that was a good day.

  4. ennui says:

    illegal fashion for the 21st century. See: 20th century illegal fashion, masks.

