• Online Trolls, Mental Health, & Social Justice

  • 2024/04/05
  • Duration: 40 min
  • Podcast

  • Summary

For the benefit of the majority of Americans who are capable of understanding what I'm about to say, I appreciate the opportunity to share some insights that might help you better frame how you think about current events and other people's behaviors. For those of you who struggle to understand what I'm about to say, just know that the point is to find a way for you to still be included in the public discourse with as much understanding as can be achieved. We want everyone making thoughtful, informed decisions rather than reacting emotionally to things they don't understand, and that requires patience and understanding on everyone's part.

Recent events inspired this post/podcast. They arose around other online content I'd already published and then promoted through Facebook Ads, which was probably just asking for it. Facebook has become a toxic environment in which conspiracy theories abound, passed around among our least informed and/or least emotionally stable members of society and boosted by Facebook's algorithms.

Even though our content was supposed to be targeted to pro-democracy users, enough people on Facebook are apparently hate-searching the same hashtags used by pro-democracy activists and then posting hateful messages full of misinformation. That likely feeds the algorithm information about their habits that increases how often pro-democracy content is put in front of them, without regard for how they are actually interacting with it. The algorithm looks at the frequency and duration of a user's involvement with content, not the qualitative nature of that involvement. Hateful comments are just comments to the algorithm. Clicks are just clicks, regardless of the beliefs or intentions of the users doing the clicking. These algorithms are configured to increase the exposure of frequently clicked- and commented-on content based on its popularity with users, regardless of why it's becoming popular (a toy sketch of this kind of engagement-only scoring follows this summary). This is how social media has been weaponized by bad actors to feed lies and misinformation to unsophisticated users who have no idea their behaviors are being reinforced for all the wrong reasons, which effectively manipulates them into behaving in hateful ways with increasing intensity over time.

My working theory about what reinforces trolling is that it's automatically reinforcing: users get an internal adrenaline rush when their posts and comments gain popularity and get shared, which gives them emotional validation. It's a protest behavior that gets reinforced and maintained by attention from others. Only people who are starved for emotionally validating attention seek it out online and fall into the deep well of trolling behaviors to get it. If that's their only source of validation and of feeling "successful" in their lives, they're going to do it.

The solution is to give them a more appropriate, functionally equivalent replacement behavior that still allows them to express their wants and needs in a way that earns validating attention but, more importantly, is met with more powerful reinforcers than the ones trolling provides. We've got to give them something more rewarding than what they get from spewing hatred, while still giving a voice to their wants and needs and access to appropriate solutions.

These are not our brightest problem-solvers. These are people with arrested emotional development and limited coping skills who resort to name-calling and hostile behavior because that's the best they've got. They feel trapped in a life they can't handle, where their wants and needs go unmet and they don't know how to appropriately advocate for themselves. Emotionally speaking, they are simply very old children.

Thankfully, only a handful of trolls found our online content. All of them were adult males, mostly middle-aged or older and white, based on their Facebook profiles. All of them were triggered by a single word in the title of the program being promoted: our Social Justice group on Meetup, where I conduct live events and share content with members who want to learn how to participate in the advocacy processes of publicly funded programs, whether to enforce their own rights as program beneficiaries or the rights of other eligible beneficiaries who need help advocating for themselves.

In our Meetup group, I take my experience working in special education, regional center, rehabilitation, and other publicly funded programs for people with disabilities and generalize it to the same processes and procedures found in other publicly funded programs that serve citizens with needs other than disability. Many of these programs address social welfare issues like housing, food, and healthcare. Americans pay into these programs so that they are there for them if and when they need ...
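To make the engagement-ranking point concrete, here is a minimal, hypothetical sketch in Python of the kind of engagement-only scoring described above. It is an illustration of the general idea, not Facebook's actual system; the interaction types and weights are invented for the example.

    from dataclasses import dataclass

    @dataclass
    class Interaction:
        kind: str              # "click", "comment", or "share"
        seconds_viewed: float  # how long the user stayed on the post

    def engagement_score(interactions):
        """Score a post purely by volume of interaction; sentiment is invisible."""
        weights = {"click": 1.0, "comment": 3.0, "share": 5.0}  # invented weights
        score = 0.0
        for i in interactions:
            score += weights.get(i.kind, 0.0)  # a hateful comment counts the same as a kind one
            score += i.seconds_viewed / 60.0   # longer dwell time also boosts the score
        return score

    def rank_posts(posts):
        """Posts with the highest engagement get shown to more users, whatever the reason."""
        return sorted(posts, key=lambda name: engagement_score(posts[name]), reverse=True)

    # Example: an ad flooded with hostile comments outranks a quietly clicked-on post.
    posts = {
        "pro-democracy ad": [Interaction("comment", 45.0) for _ in range(20)],
        "quietly liked post": [Interaction("click", 30.0) for _ in range(5)],
    }
    print(rank_posts(posts))  # ['pro-democracy ad', 'quietly liked post']

The point of the sketch is the one made above: nothing in this kind of scoring knows or cares whether the engagement was supportive or hateful.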

