Darnell (Seize The Day)'s latest activity

Telegram's CEO's Arrest Should Be A Wake Up Call For Social Media Admins & CEOs

Telegram Icon

Hours ago, France πŸ‡«πŸ‡· arrested the CEO of Telegram, Pavel Durov (@durov@telegram.me) for failure to provide adequate moderation over his social messaging platform (props to @chris@threads.net for alerting me about this story).

French authorities have detained Pavel Durov, the French-Russian billionaire who founded the messaging app Telegram, at an airport outside Paris, according to CNN affiliate BFMTV. [...]

Durov, 39, was wanted under a French arrest warrant due to the lack of moderation on Telegram which led to it being used for money laundering, drug trafficking and sharing pedophilic content, according to BFMTV.

According to BFMTV, the Telegram founder had not regularly travelled to France and Europe since the arrest warrant was issued. (Cable News Network)

The move has rattled corporate leaders of other large platforms, including Elon Musk (@elonmusk@x.com), who has posted several times about the incident, often portraying Europe as a region lacking in freedom of speech.

Musk's snark aside, Pavel Durov's arrest sets a precedent for social media administrators and CEOs (Chief Executive Officers), as it means the leaders and administrators of each respective social media network can be held accountable for failing to properly moderate their networks.

For example, Elon Musk is already facing legal challenges in Australia πŸ‡¦πŸ‡Ί, Brazil πŸ‡§πŸ‡·, & the European Union πŸ‡ͺπŸ‡Ί (which comprises 27 countries, plus an additional 4 in the Schengen Area) over moderation failures, & it would not be surprising if arrest warrants were issued.

Note: The United States πŸ‡ΊπŸ‡Έ would probably intervene if any international arrest warrants were issued, mainly because Elon Musk is already embroiled in numerous court cases within America πŸ‡ΊπŸ‡Έ (πŸ˜‚πŸ€£πŸ˜‚), & the US government would prefer the opportunity of dealing out justice itself upon an American πŸ‡ΊπŸ‡Έ citizen who is violating American πŸ‡ΊπŸ‡Έ laws.

Fediverse Icon

So Why Should Social Media Admins In The Fediverse Be Worriedβ€½

Although most Fediverse admins are probably too small to be noticed by most governments, they could still be subject to raids by authorities over offensive content, which happened to a reasonably active Mastodon instance last year.

In May, Mastodon server Kolektiva.social was compromised when one of the server's admins had their home raided by the FBI for unrelated charges. All of their electronics, including a backup of the instance database, were seized. [...]

According to Kolektiva, the seized database, now in the FBI's possession, includes personal information such as email addresses, hashed passwords, and IP addresses from three days prior to the date the backup was made. It also includes posts, direct messages, and interactions involving a user on the server. Because of the nature of the fediverse, this also implicates user messages and posts from other instances.

To make matters worse, it appears that the admin targeted in the raid was in the middle of maintenance work which left would-be-encrypted material on the server available in unencrypted form at the time of seizure. (Via @eff@mastodon.social on Electronic Frontier Foundation Blog)

Like most Federated instances, Kolektiva probably relied on users to report offensive content to moderators. However, some content ended up slipping through, which resulted in a surprise raid by the Federal Bureau of Investigation (FBI).

Robot Emoji

Automated Solutions For Fediverse Admins

Fediverse admins managing ActivityPub-powered sites with over 1,000 monthly active users (MAU) &/or 100,000 total users should consider integrating automated tools that can detect objectionable content like Child Sexual Abuse Material (CSAM) & financial fraud.

A few automated tools for detecting CSAM that are available to admins are Safer by Thorn (more info here) & a CSAM Scanning Tool by @cloudflare@cloudflare.social, the latter of which is freely available for Cloudflare customers.
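Under the hood, tools like these generally work by hashing each uploaded image and comparing it against a list of hashes of known illegal material supplied by vetted organizations. Here is a minimal sketch of that flow; note that real services such as Safer use perceptual hashes (e.g., PhotoDNA-style) that survive resizing and re-encoding, not the plain SHA-256 exact match shown here, and the hash list below is a hypothetical placeholder:

```python
import hashlib

# Hypothetical hash list of known illegal material, supplied by a vetted
# organization. Real tools use perceptual hashing; SHA-256 is used here
# only to illustrate the match-against-a-known-list flow.
KNOWN_HASHES: set[str] = set()  # populated from a trusted provider in practice

def sha256_of(data: bytes) -> str:
    """Compute the SHA-256 hex digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def should_block(upload: bytes, known_hashes: set[str]) -> bool:
    """Return True when the upload matches the known-content hash list."""
    return sha256_of(upload) in known_hashes
```

An instance could run this check in its upload pipeline, quarantining matches for human review and reporting rather than silently deleting them.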

Automatically detecting fraud is more complex & probably more expensive. However, several companies offer services designed to automatically detect fraud at scale, which can assist admins in moderating their online content. Mateusz Pniewski has an excellent list of 13 companies that admins may want to check out for detecting fraud on federated platforms.

Pixelfed icon

What About Decentralized, Less Expensive Solutionsβ€½

Unless one is a Cloudflare client, implementing automated tools like these is probably prohibitively expensive for most admins. However, there could be a way to extend automated protection to the Fediverse without contracting assistance from a corporate giant.

Even as the digital world has transformed over the years, with social media platforms like Pixelfed becoming hubs for visual sharing and communication, the challenge of spam remains as relevant as ever. Pixelfed's ingenious implementation of the Naive Bayes classifier to combat spam is a testament to the algorithm's versatility. By analyzing the captions accompanying images, Pixelfed's spam filter can swiftly determine whether a post contains genuine content or is simply trying to clutter your feed with unwanted promotions or irrelevant information. (Pixelfed Blog)
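For illustration, the caption-classification approach the Pixelfed blog describes can be sketched as a tiny multinomial Naive Bayes filter. This is my own simplified sketch of the general technique, not Pixelfed's actual implementation:

```python
import math
from collections import Counter

class NaiveBayesSpamFilter:
    """Tiny multinomial Naive Bayes over caption words, with Laplace smoothing."""

    def __init__(self):
        self.word_counts = {"spam": Counter(), "ham": Counter()}
        self.doc_counts = Counter()
        self.vocab = set()

    def _tokens(self, text):
        return text.lower().split()

    def train(self, caption, label):
        """Record word frequencies for a labeled caption ('spam' or 'ham')."""
        self.doc_counts[label] += 1
        for word in self._tokens(caption):
            self.word_counts[label][word] += 1
            self.vocab.add(word)

    def predict(self, caption):
        """Return the label with the highest log-probability for this caption."""
        total_docs = sum(self.doc_counts.values())
        scores = {}
        for label in ("spam", "ham"):
            score = math.log(self.doc_counts[label] / total_docs)  # log prior
            total_words = sum(self.word_counts[label].values())
            for word in self._tokens(caption):
                # Laplace smoothing avoids zero probability for unseen words
                score += math.log((self.word_counts[label][word] + 1) /
                                  (total_words + len(self.vocab)))
            scores[label] = score
        return max(scores, key=scores.get)
```

After training on a handful of labeled captions, the filter scores new captions cheaply, which is why the approach scales well on small, volunteer-run instances.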

@dansup@mastodon.social has created built-in anti-spam features for Pixelfed instances, which theoretically could be tweaked to thwart more offensive content upon each instance without sacrificing privacy.

This would probably require a massive amount of time, energy & collaboration between the Fediverse Founders (as well as funds). However, if successful, it would help safeguard the Fediverse from unsavory actors attempting to ruin the internet for everyone normal.

πŸ‘¨πŸΎβ€πŸ’» by @darnell@darnellclayton.com πŸ”› @darnell@darnell.day πŸ“§ darnell@darnell.day

πŸ•ΊπŸΎ Follow my adventures upon: 🐘 Darnell (One) 🦁 Darnell (Movies, Opuses, Entertainment) πŸ“Έ Darnell (Out Of Office)

πŸ¦ΉπŸΎβ€β™‚οΈ WordPress Workarounds: πŸ’» Darnell (TeleVerse) 🌍 Darnell (Africa) πŸ‘¨πŸΎβ€πŸŽ¨ Darnell (Creative Outlet)

πŸ₯·πŸΎ Other Hideaways: 🧡 Darnell (Threads) πŸ”ž Darnell (Hard News) 🐬 Darnell (Flipboard)


Threads Muting Pixelfed And Other Fediverse Instances Over One Simple Rule

Pixelfed icon

I remember @dansup@mastodon.social mentioning this last month (original post was deleted), but Threads by Instagram continues to mute the entire Pixelfed community as well as other Fediverse instances (also known as servers) over one simple rule.

Server guidelines

A server may be added to our server blocklist if it doesn't comply with our guidelines for communicating with Threads. [...]

We'll also block a server if it doesn't have a:

  • Sufficient privacy policy; or
  • Publicly available policy to restrict access to users under the age of 13; or
  • Publicly accessible feed (Emphasis mine, via Instagram Help Center)

For those unaware, a publicly accessible feed is merely a stream on a local instance where anyone can view all of the public chatter on the server. Mastodon sums it up best with this description:

To allow you to discover potentially interesting content, Mastodon provides a way to browse all public posts. Well, there is no global shared state between all servers, so there is no way to browse all public posts. When you browse the federated timeline, you see all public posts that the server you are on knows about. There are various ways your server may discover posts, but the bulk of them will be from people that other users on your server follow.

There is a way to filter the federated timeline to view only public posts created on your server: The local timeline. Mind that β€œlocal” here refers to the server, not to a geographical location. (Mastodon Documentation)

Pixelfed has a local timeline, but it is not publicly accessible to Threads, which explains why the site is muting the entire Pixelfed ecosystem along with any other Fediverse instances that decline to expose a publicly accessible feed.
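Admins who want to check where they stand can test whether their own local timeline is reachable without authentication, using the Mastodon-compatible REST endpoint `/api/v1/timelines/public?local=true` (Pixelfed implements the same API). The `example.social` hostname below is a placeholder:

```python
import json
import urllib.request
from urllib.parse import urlencode

def local_timeline_url(instance: str, limit: int = 5) -> str:
    """Build the Mastodon-compatible local public timeline endpoint URL."""
    query = urlencode({"local": "true", "limit": limit})
    return f"https://{instance}/api/v1/timelines/public?{query}"

def is_feed_public(instance: str) -> bool:
    """Return True if the instance serves its local timeline without auth."""
    try:
        with urllib.request.urlopen(local_timeline_url(instance), timeout=10) as resp:
            return resp.status == 200 and isinstance(json.load(resp), list)
    except Exception:
        # Auth-required, blocked, or unreachable instances all count as not public
        return False
```

If `is_feed_public("example.social")` returns False, the instance would presumably fall afoul of Threads' publicly accessible feed requirement.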

The Pixelfed creator Dan may resolve this issue by letting instance admins enable a publicly accessible feed so their respective users can communicate with the greater Threads community.

However, in hindsight, it might be wise to leave it off by default, especially with numerous tech companies scouring the web to train the synthetic brains of artificial intelligence programs (Meta included, props to @r_alb@mastodon.social for alerting me about this).

πŸ‘¨πŸΎβ€πŸ’» by @darnell@darnellclayton.com πŸ”› @darnell@darnell.day

πŸ•ΊπŸΎ Follow my adventures upon: 🐘 Darnell (One) 🦁 Darnell (Movies, Opuses, Entertainment) πŸ“Έ Darnell (Out Of Office)

πŸ¦ΉπŸΎβ€β™‚οΈ WordPress Workarounds: πŸ’» Darnell (TeleVerse) 🌍 Darnell (Africa) πŸ‘¨πŸΎβ€πŸŽ¨ Darnell (Creative Outlet)

πŸ₯·πŸΎ Other Hideaways: 🧡 Darnell (Threads) πŸ”ž Darnell (Hard News) 🐬 Darnell (Flipboard)
