This Wired article ( https://www.wired.com/story/join-mastodon-twitter-alternative/ ) about Mastodon is mostly good. It covers the basic features and discusses a shift from Twitter to Mastodon.
It confuses one key issue though, and that’s the “culture” of Mastodon.
What we’re seeing now across the Fediverse are the first adopters. The fringe. The queer. The hackers. The staunch individualists. The communal caretakers.
As Mastodon becomes more mainstream, the “culture” will shift.
If you’re here for the culture, be wary... 1/2
Mastodon at its heart is a software application wrapped around a federated protocol.
Anyone can use it. They can spin up an instance by themselves or join one they like. It can federate with any other software application using ActivityPub. It is decentralized and HIGHLY resistant to censorship.
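For the curious, the federation mentioned above starts with WebFinger discovery: given a handle like @user@example.social, a server builds a well-known lookup URL to find that account's ActivityPub actor. A minimal sketch in Python (the handle in the example is made up for illustration):

```python
def webfinger_url(handle: str) -> str:
    """Build the WebFinger lookup URL for a fediverse handle.

    Handles look like '@user@example.social'; the host part tells
    us which server to query for the account's ActivityPub actor
    (per RFC 7033, the WebFinger spec that Mastodon uses).
    """
    user, domain = handle.lstrip("@").split("@")
    return (f"https://{domain}/.well-known/webfinger"
            f"?resource=acct:{user}@{domain}")

# Example (hypothetical handle):
# webfinger_url("@alice@example.social")
# -> "https://example.social/.well-known/webfinger?resource=acct:alice@example.social"
```

Any ActivityPub implementation, not just Mastodon, can answer that lookup, which is why different software can all federate with each other.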
And this last part is key.
This platform is IDEAL for users who espouse unpopular viewpoints: fascism, hate, calls for violence, illegal content, etc. 2/2...
While the culture right now is that of first adopters and is open and affirming, Nazis, fascists, and $BAD_ACTORs will move in once they realize they can’t be censored or kicked off their own instances.
Those who left Twitter because Jack didn’t ban folks there are going to be sorely disappointed when they realize that (while folks can be banned from one instance) they can’t be banned from their own instance. 3/2
@tinker that’s true, but we have had discussions about that eventuality. Basically, the only recourse will be instances blocking other instances that exhibit bad behavior. I’m not sure what we’ll see first: Nazi-haven instances or spammer-haven instances. Both will have to be addressed the same way. Ultimately, the fediverse is going to have to (like it or not) develop a reputation system for instances (which would be optional to use) to make moderation tolerable.
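The instance-blocking recourse described above can be sketched as a simple admin-maintained domain blocklist consulted before any inbound activity is accepted. This is a toy illustration with made-up domain names; real servers (e.g. Mastodon's domain blocks) are far more nuanced, with silence vs. suspend levels, media rejection, and so on:

```python
from urllib.parse import urlparse

# Hypothetical admin-maintained blocklist of misbehaving instances.
BLOCKED_DOMAINS = {"spam.example", "hate.example"}

def should_federate(actor_uri: str) -> bool:
    """Reject activities from actors hosted on blocked domains.

    Blocks the listed domain itself and any subdomain of it, so a
    banned instance can't dodge the block by moving to a subdomain.
    """
    domain = urlparse(actor_uri).hostname or ""
    return not any(domain == d or domain.endswith("." + d)
                   for d in BLOCKED_DOMAINS)
```

A reputation system like the one proposed would essentially be a shared, subscribable version of this list, so each admin wouldn't have to rediscover every bad instance alone.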
mastodon.social has already faced this issue with a rogue instance, switter dot at, in terms of condoning, enabling, and promoting alleged human trafficking, slavery, rape, misogyny, and sexual abuse, not to mention unethical activities of other kinds.
The admins have utterly failed in their responsibilities in this regard, and have even failed to acknowledge that there is a problem in federating with switter.
How does switter vet their accounts and account holders to ensure safety and public health?
How does switter exclude and prevent the site from being used for human trafficking, child rape, forced prostitution, opiate abuse, slavery, etc.?
If switter does anything to protect the sex workers and exclude people from being marketed there under terms of coercion and abuse, please inform me about it.
As our conversation has diverted from the original topic, I’m removing some of the participants.
These are all great questions. I’d recommend reaching out to the folks at https://assemblyfour.com/ who maintain switter.
A lot of your questions are answered in their faq. You might also look through their rules and code of conduct: https://assemblyfour.com/switter/rules
The original post in two parts was about how "This platform is IDEAL to users that espouse unpopular viewpoints: fascism, hate, calls for violence, illegal content, etc."
That is the context in which I responded and I added all the mastodon dot social admins because they have legal and moral responsibility for the instance I am on and what instances are federated is their decision and their moral responsibility.
@hhardy01 @jerry @Gargron - Mastodon.social is a huge generalized instance. Legalities differ globally (sex work is legal in Australia, where switter is hosted) and moralities are all over the place. Good luck trying to get such a large instance to bend to your very specific will.
I can’t speak for your instance or admins.
I can ask you... again... if you don’t want to see switter.at in your feed, why haven’t *YOU* banned their instance?
You asked, "I can ask you... again... if you don’t want to see switter.at in your feed, why haven’t *YOU* banned their instance?"
I haven't said whether I have or have not blocked switter or specific switter posters. You are trying to make an invalid form of argument, namely "tu quoque."
I answered your question already:
"Closing your eyes doesn't make bad things go away, nor does it free you from moral responsibility to speak up about them. You know that, right?"
@hhardy01 - You did! I missed it, I apologize.
So if I understand you correctly, you’re trying to change “bad things” not by blocking them in your own feed, but by blocking them in your instance’s feed.
I'm agreeing, not disagreeing, and giving an example of what you said about the platform and unpopular viewpoints. Specifically, how questionable that can be in practice in the federated model.
Moderation decisions are often problematic, such as banning 70-year-old fascist ideology and support but not Trump and Trumpism, for instance.
That doesn't necessarily mean those decisions are wrong. I'd just like to see some acknowledgement of moral and ethical responsibility for the effects.
Yeah... how did that happen?
And why are technologies and social structures for common people essentially stuck in 1950s mode?
America is in reruns.
@hhardy01 - Because that was the last time we studied and published overarching tomes of social philosophy. Now everything is blogged (microblogged) and shared to our echo chambers or lost in the greater noise.