Meg Foulkes

@FoulkesMeg

Working to ensure public impact algorithms cause no harm with the Open Knowledge Justice Programme and . BPTC student.

London, UK
Joined August 2012


  1. Pinned Tweet
    14 Apr 2020

    We did it! Open Knowledge Justice Programme officially out there, equipping lawyers with the tools they need to hold algorithms to account 🥳

  2. Retweeted
    10 Aug

    "The immigrant experience is not easily reduceable to an algorithm" - Petra Molner. These words really resonate with us and the work we're doing to enable immigration law practitioners to start holding the deployers of algorithmic decision-making to account. Thank you !

    Photo by Etienne Girardet on Unsplash
  3. 3 Aug

    Really pleased that we're kicking off a regular, drop-in talking space for anyone interested in collaborating on challenging algorithms using law. Totally informal, all very welcome!

  4. Retweeted
    27 Jul

    Lawyers, campaigners and activists working in the UK immigration system should sign up for this 90-minute interactive workshop, brimming with all the info required to make sure that public interest algorithms do good rather than harm.

  5. Retweeted
    8 Jul

    Yay! So excited to unveil our shiny new website today. Check it out for all the info on our brand new projects in 2021!

  6. Retweeted
    11 May

    Thanks to for their support of ’s work on strategic litigation challenging the use of remote proctoring tools during the pandemic - and for featuring us in their 2020 annual report

  7. Retweeted
    26 Feb

    We're taking action to protect the data and privacy rights of students subjected to remote proctoring apps that 'watch' test-takers as they take assessments from home. Read more here:

  8. Retweeted
    27 Feb

    Challenging the (mis)use of proctoring apps: our Justice Programme () team take action to protect the data and privacy rights of UK Bar students cc

  9. Retweeted
    17 Feb

    It's Day #2 of ! Time for breakout rooms on , , , & more ✨ We'll end the day with our panel 📉 which will be livestreamed here w/live closed captioning:

  10. 28 Jan

    Absolute pleasure to speak all things remote proctoring in the company of , and at today. All the CPDP sessions will be available for free, for everyone, on their YouTube channel soon:

  11. Retweeted
    28 Jan

    Today is and this evening @MegFoulkes1 - director of our Open Knowledge Justice Programme () - will take part in a discussion about online test proctoring, AI and surveillance at

  12. Retweeted
    25 Jan

    Register now for and join @MegFoulkes1, director of our Open Knowledge Justice Programme, for this discussion about online proctoring at 6:30pm CET on Thursday 28th January

  13. 19 Nov 2020

    Thanks to who invited me to chat about online privacy and the work we're doing at the OK Justice Programme to hold AI and algorithms to account:

  14. 7 Aug 2020
  15. Retweeted
    22 Jul 2020

    ⚖️ TWEET OF THE WEEK ⚖️ This edition comes from our partners . Justice Programme Director discusses AI and facial recognition in the context of university exams. We live our lives increasingly online in the COVID context, but are some new methods too intrusive?

  16. 22 Jul 2020

    Me in the today, on the potential for discrimination and privacy harms in online exam technologies:

  17. 13 Jul 2020

    leak data including highly sensitive, personal information to identity thieves. We need to take these exams. But the discrimination and intrusion into our private lives these remote proctoring services cause is a wholly unacceptable solution. 11/11 End.

  18. 13 Jul 2020

    Other potential privacy nightmares: the software could have a security weakness that allows a hacker to manipulate the remote-control capabilities, the proctor could use their position to maliciously direct the student to install malware, or the proctoring platform could 10/11

  19. 13 Jul 2020

    When the police take a fingerprint, there are safeguards in place to ensure the intrusion is warranted; what are BPTC students suspected of having done to warrant obtaining our face print? 9/11

  20. 13 Jul 2020

    Also of concern is the interference with our privacy rights. Apparently the software ‘complies with the GDPR’, which is good to know given our highly sensitive biometric data, even more sensitive than a fingerprint, will be required to be given up to this commercial entity. 8/11

  21. 13 Jul 2020

    biased as the developers who make it and they are - guess what - predominantly white men: . 7/11

