[Outdated, please look at pinned post] Casual Conversation

So, what are you guys up to?

[–] 0x01@lemmy.ml 25 points 9 months ago (1 children)

I'm meeting up with a friend and their SO for a little hiking and brunch at a local breakfast joint. Picking up the Vision Pro today for work, so that's exciting!

I think it's easy to be negative if a person doesn't consider the human reading their anonymous posts; people feel empowered to take out their problems from life in general on random, faceless internet strangers.

[–] Blaze@discuss.tchncs.de 10 points 9 months ago (1 children)

> Picking up the Vision Pro today for work, so that's exciting!

Sounds cool! Hope you'll enjoy it!

> I think it's easy to be negative if a person doesn't consider the human reading their anonymous posts; people feel empowered to take out their problems from life in general on random, faceless internet strangers.

Yeah, definitely. Hopefully in communities like this we can avoid that.

[–] 0x01@lemmy.ml 6 points 9 months ago (2 children)

Not sure it's possible to avoid entirely. As the community here grows we get more of all types, and it's hard to have a truly only-positive community when moderation is voluntary and the masses have free rein to submit literally anything.

Do you think a technology solution would help? Sentiment analysis on posts that gently redirects people to other communities instead of outright blocking them?
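
Something like this toy sketch is what I have in mind (purely illustrative; the word list, threshold and wording are all made up):

```python
# Toy sketch of "nudge, don't block": score a post's tone and, if it looks
# hostile, show a gentle suggestion instead of rejecting the submission.
NEGATIVE_WORDS = {"stupid", "idiot", "hate", "garbage", "useless"}  # made-up list


def hostility_score(text: str) -> float:
    # Fraction of words that match the (made-up) negative word list.
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(word.strip(".,!?") in NEGATIVE_WORDS for word in words) / len(words)


def moderation_hint(text: str, threshold: float = 0.1) -> str | None:
    # Return a gentle redirect instead of a hard block when the tone looks harsh.
    if hostility_score(text) > threshold:
        return ("This community tries to stay positive - maybe rephrase, "
                "or post in a venting community instead?")
    return None  # post goes through untouched
```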

Maybe the only real solution is personal resilience and recognizing that we don't need to feel negative feelings just because we received a negative communication.

[–] stevecrox@kbin.run 5 points 9 months ago (2 children)

You can't fix a people problem with process.

For example, I've worked in DevSecOps for 10+ years. Whenever I'm consulting, my first step is to set up a CI pipeline that picks up pull requests, builds them, runs code analysis tools (e.g. pep8, spotbugs, eslint, etc.), and comments the results back on the pull request. The idea is to get an understanding of the project's technical debt, stop things getting worse, and ensure the solution 'just works'.
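
To give a rough idea (a minimal sketch, not from any real project, assuming GitHub, flake8 and placeholder repo/token/PR values), the glue is usually little more than:

```python
"""Minimal sketch of a CI step that lints a pull request and comments the
findings back on it. Repo name, token and PR number are placeholders."""
import os
import subprocess

import requests


def lint_report() -> str:
    # Run a linter over the checked-out PR branch and capture its findings.
    result = subprocess.run(["flake8", "."], capture_output=True, text=True)
    return result.stdout.strip() or "No new lint findings."


def comment_on_pr(repo: str, pr_number: int, body: str) -> None:
    # Post the findings onto the pull request via the GitHub API, so the
    # feedback lands where the team is already looking.
    response = requests.post(
        f"https://api.github.com/repos/{repo}/issues/{pr_number}/comments",
        headers={"Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}"},
        json={"body": "Static analysis report:\n\n" + body},
        timeout=30,
    )
    response.raise_for_status()


if __name__ == "__main__":
    comment_on_pr("example-org/example-repo", int(os.environ["PR_NUMBER"]), lint_report())
```

The specific tools don't matter; the point is that the feedback shows up on the pull request itself rather than in a build log nobody reads.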

Teams with huge amounts of technical debt will find a way to disable it when you're not looking. They will come up with all kinds of reasons, when in reality the technical debt was created by cultural issues in the team.

So I've learnt it's important, if you spot a team doing that, to realise the solution isn't locking things down so they can't disable it, or adding more process. It's forcing out the technical leader, sitting with the team, working out why each person is fighting the tooling instead of seeing it as an asset, and teaching them to be better.

[–] agent_flounder@lemmy.world 4 points 9 months ago* (last edited 9 months ago)

Earlier in my career, the biggest lesson I learned was that infosec is first and foremost a culture problem. Similar to your experience: working with people individually, meeting them where they are, listening, understanding, guiding, and modeling a better mindset all helps, given enough time.

There are cases where some of those people just aren't willing to work with you. It's still possible to change the culture around them by influencing it more than they do. For every belligerent person, you can find one or more advocates.

[–] Blaze@discuss.tchncs.de 3 points 9 months ago

Yes, same experience here. People will take the path of least resistance unless you actively collaborate with them to make it work.

[–] Blaze@discuss.tchncs.de 3 points 9 months ago

> Not sure it's possible to avoid entirely. As the community here grows we get more of all types, and it's hard to have a truly only-positive community when moderation is voluntary and the masses have free rein to submit literally anything.

Very true. Hopefully by the time we get there we'll have more moderation tools to help identify bots and trolls.

> Do you think a technology solution would help? Sentiment analysis on posts that gently redirects people to other communities instead of outright blocking them?

I'm not too sure; I'm always dubious about technology solutions for psychological issues.

> Maybe the only real solution is personal resilience and recognizing that we don't need to feel negative feelings just because we received a negative communication.

Yes, I guess so; that's also why I made this post. I feel better already.