Matthew Green (Verified account)

@matthew_d_green

I teach cryptography at Johns Hopkins. Screeching voice of the minority.

Baltimore, MD
Joined January 2010

Tweets


  1. Pinned Tweet

    List of non-US contact points for crypto/security PhD applicants (by )

  2. Retweeted
    5 hours ago

    A number of people have asked why the NSA didn't notice they lost their backdoor. I don't know the answer, but I wrote up some possibilities. I'd love to hear others' thoughts on it.

  3. Retweeted

    I don’t want to let the ANSI or NIST folks off the hook, having read this 2004 email exchange.

  4. Retweeted
    Sep 3

    Earlier this year the French government snuck an anti-porn clause within a 'domestic abuse' law. Anti-porn groups said 'don't worry, our victory was only symbolic.' Well, they just sued all the major French internet service providers demanding they block the main porn sites:

  5. Retweeted
    Sep 3

    Small thread: Now that a confirmed backdoor using the Dual_EC DRBG is in the news, it's worth revisiting two simple techniques that cryptographic protocols and software can do to make themselves more defensive: 1. public/secret separation, and 2. DRBG mixing.

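The two defenses named in that thread can be sketched in code. The following is an illustrative Python sketch, not code from the thread: `hash_drbg` is a toy hash-based generator (not a NIST SP 800-90A implementation), and `SeparatedRandom` and `mixed_bytes` are hypothetical names. Public/secret separation means public values (e.g. nonces sent in the clear, as in SSL/TLS) and secret values (keys) come from independently seeded generators, so observing public outputs reveals nothing about the state behind the secrets; DRBG mixing XORs two independently seeded generators, so a Dual_EC-style backdoor in either one alone cannot predict the combined output.

```python
import hashlib
import os

def hash_drbg(seed: bytes, n: int) -> bytes:
    """Toy counter-mode hash DRBG (illustrative only; not SP 800-90A)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

class SeparatedRandom:
    """Defense 1: public/secret separation.

    Public and secret outputs come from independently seeded
    generators, so an attacker who sees the public values (and may
    hold a trapdoor for that generator) learns nothing about the
    generator state used for secret keys.
    """
    def __init__(self):
        self._public_seed = os.urandom(32)
        self._secret_seed = os.urandom(32)
        self._ctr = 0

    def _next(self, seed: bytes, n: int) -> bytes:
        self._ctr += 1  # domain-separate successive calls
        return hash_drbg(seed + self._ctr.to_bytes(8, "big"), n)

    def public_bytes(self, n: int) -> bytes:
        return self._next(self._public_seed, n)

    def secret_bytes(self, n: int) -> bytes:
        return self._next(self._secret_seed, n)

def mixed_bytes(n: int) -> bytes:
    """Defense 2: DRBG mixing.

    XOR the outputs of two independently seeded generators: if either
    one is honest, the XOR stays unpredictable, neutralizing a
    backdoor in the other.
    """
    a = hash_drbg(os.urandom(32), n)
    b = hash_drbg(os.urandom(32), n)
    return bytes(x ^ y for x, y in zip(a, b))
```

Real protocol stacks would implement both ideas with vetted DRBGs; the point of the sketch is only the structure of the two defenses.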
  6. Retweeted
    Sep 3
    Replying to

Well, when I saw it, it was obviously kleptography applied! It fits nicely with SSL's "random bits in the clear," so the motivation was clear. Then, if one wants to practically use it "exclusively," software security should make sure it only employs the designated trapdoor --> failure!

  7. Sep 3

    Addendum: the White House Press Secretary was asked about this story, and their answer is “please stop asking about this story.” h/t

  8. Sep 3

    More good reporting on Apple’s CSAM scanning delay.

  9. Sep 3
  10. Retweeted
    Sep 3

    With Apple's announcement that it's going to delay implementation of child safety features until it, you know, actually talks to people and gets feedback, and this news that WhatsApp is going to start encrypting back ups, privacy advocates have a moment to feel good about things.

  11. Sep 3

    And I’m also very grateful to Apple for taking some time to pause their rollout and think hard about it. Because the way these things go, Apple’s rollout would have justified a deluge of further expansion. 6/6 fin

  12. Sep 3

    In a limited sense I’m grateful to Apple for making such a big splash this summer with their scanning proposal, and for making it so broad and expansive (and unpopular.) It takes a company like Apple to actually bring these ideas out into the public sphere. 5/

  13. Sep 3

    But at every point of the process when someone objects to expanding the scope of scanning, someone will say “it’s already deployed in this slightly less expansive way” so how can you possibly object? 4/

  14. Sep 3

    At no point in the rollout of these systems does anyone say “is this objectively the right thing to do?” Nor do they consult with users. (In fact many of these scanning systems *require* companies to sign NDAs prior to deployment.) 3/

  15. Sep 3

    The way these debates proceed is that someone deploys a limited scanning system for unencrypted files being mailed around. Then people say “look, it’s best practice” and push the same scanning for other providers. Then they expand to private backups and client-side scanning. 2/

  16. Sep 3

    I’m so, so tired of talking about Apple photo scanning but I just want to say one more thing about their (thankfully now paused!) proposal: One of the leading indicators of whether a scanning proposal is “ok” is “is anyone else doing it.” 1/

  17. Sep 3

    NCMEC deciding to get involved on the advocacy side of this issue sure didn’t help them.

  18. Sep 3

    I’ve seen two people post PhotoDNA implementations in the last two weeks. One is on the front page of HN. These are algorithms that were secret and under NDA for years before Apple’s announcement. (And Apple doesn’t even use PhotoDNA.)

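For context on what the posted implementations are reimplementing: PhotoDNA and NeuralHash are proprietary, so the following is not either algorithm, just a minimal "average hash" in Python illustrating the general idea they share. An image is reduced to a small bit fingerprint that survives resizing and recompression, and two fingerprints are compared by Hamming distance rather than exact equality. The function names and the grayscale-grid input format are assumptions for this sketch.

```python
def average_hash(pixels, size=8):
    """Perceptual "average hash": downscale to a size x size grid by
    block averaging, then emit one bit per cell (1 if the cell's
    average is above the overall mean).

    pixels: 2D list of grayscale values, rows of equal length,
    with dimensions at least size x size.
    """
    h, w = len(pixels), len(pixels[0])
    cells = []
    for i in range(size):
        for j in range(size):
            block = [pixels[y][x]
                     for y in range(i * h // size, (i + 1) * h // size)
                     for x in range(j * w // size, (j + 1) * w // size)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return sum(1 << k for k, c in enumerate(cells) if c > mean)

def hamming(a: int, b: int) -> int:
    """Bits that differ between two hashes; a small distance means
    the images are probably near-duplicates."""
    return bin(a ^ b).count("1")
```

Because only the relation of each cell to the mean matters, uniform brightness changes leave the hash unchanged, while visually different images land far apart in Hamming distance; that robustness-to-minor-edits property is exactly what production perceptual hashes engineer much more carefully.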
  19. Sep 3

    The degree to which Apple has screwed NCMEC and content scanning in general cannot be overstated.

  20. Sep 3

    This is the one I can’t entirely blame on poor executive decisions. The technical folks at Apple had to know how broken it would be the second the design became public.

  21. Sep 3

And (5) if you’re going to make your system design public, make *all* of it public. Withholding NeuralHash and then having it reverse-engineered and broken: that was a catastrophe. Keep it secret and succeed, or don’t!


