this post was submitted on 21 Jun 2024
Videos

14492 readers
73 users here now

For sharing interesting videos from around the Web!

Rules

  1. Videos only
  2. Follow the global Mastodon.World rules and the Lemmy.World TOS while posting and commenting.
  3. Don't be a jerk
  4. No advertising
  5. No political videos, post those to [email protected] instead.
  6. Avoid clickbait titles. (Tip: Use dearrow)
  7. Link directly to the video source and not for example an embedded video in an article or tracked sharing link.
  8. Duplicate posts may be removed

Note: bans may apply to both [email protected] and [email protected]

founded 2 years ago
MODERATORS
 

This was an interview on the ABC (Australia's public broadcaster) with Signal Foundation president Meredith Whittaker. It covers current events around Signal and encrypted messaging, with a short segment on AI at the end. The original title of the video is bad.

Key points in the video:

  • 1:30 - Should platforms be held responsible for [the content]?
  • 3:15 - (paraphrased) Governments want law enforcement to have access to encrypted communications; why not?
  • 4:15 - (paraphrased) What if people are using it for criminal behaviour?
  • 7:00 - (paraphrased) Random AI section
[email protected] · 7 points · 7 months ago (last edited 7 months ago)

This part of the interview felt relevant to the fediverse (note that this was pasted from a transcript, and you might find it easier to watch the video than read the transcript):

Interviewer: Australia's safety commissioner recently took on Elon Musk, for example, requesting the removal of vision of a stabbing in a church here in Sydney. It was unsuccessful. Should tech platforms be held responsible for spreading that sort of content?

Whittaker: Well, I think we need to break that question down and actually question the form that tech platforms have taken, because we live in a world right now where there are about five major social media platforms that are very literally shaping the global information environment for everyone. So we have a context where these for-profit surveillance tech actors have outsized control over our information environment, and present a very, very attractive political target to those who might want to shape, or misshape, that information environment.

So I think we need to go to the root of the problem. The issue is not that every regulator doesn't get a chance to determine appropriate or inappropriate content. The issue is that we have a one-size-fits-all approach to our shared information ecosystem, and that these companies are able to determine what we see or not, via algorithms that are generally calibrated to increase engagement; to promote more hyperbolic or more inflammatory content. We should really be attacking this problem at the root: beginning to grow more local and rigorous journalism outside of these platforms, and ensuring that there are more local alternatives to the one-size-fits-all surveillance platform business model.