• AI DIY Episode 3 Roboaudit

  • 2021/12/28
  • Runtime: 1 hour 16 minutes
  • Podcast


  • Summary

  • Dr Pamela Ugwidike is an expert on the use of AI in justice systems and is particularly interested in how AI and data bias affect criminal justice. She wants to build an AI that can audit other AIs for bias, to make sure that Artificial Intelligence always operates in the interest of humanity. There is currently a huge amount of interest in AI ethics and bias in the research community, and we are discovering many ways in which our data and our computations can lead to unfairness.


    Perhaps the most famous case of AI bias comes from the field of criminal justice: the COMPAS algorithm, used in US courts to predict whether a defendant would go on to reoffend, was twice as likely to incorrectly label black defendants as reoffenders as it was white defendants. While there is plenty of advice, and there are many programming toolkits, aimed at helping human developers eradicate bias from the AI products they build, IBM's Watson OpenScale is the only AI platform that claims to detect and correct biases in its own operation. Or is that just IBM's marketing department being a bit biased?
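    The disparity described above is a gap in false positive rates between groups. As a rough illustration of the kind of check an automated "roboauditor" might run, here is a minimal sketch that compares false positive rates across two groups and flags the model when the ratio between them exceeds a threshold. The record fields, group names, and the 1.25 threshold are hypothetical choices for illustration, not COMPAS's actual data or OpenScale's actual API.

    ```python
    def false_positive_rate(records, group):
        """FPR for one group: fraction of non-reoffenders wrongly predicted to reoffend."""
        negatives = [r for r in records
                     if r["group"] == group and not r["reoffended"]]
        if not negatives:
            return 0.0
        false_positives = [r for r in negatives if r["predicted_reoffender"]]
        return len(false_positives) / len(negatives)

    def audit_fpr_parity(records, group_a, group_b, max_ratio=1.25):
        """Flag the model if one group's FPR exceeds the other's by more than max_ratio."""
        fpr_a = false_positive_rate(records, group_a)
        fpr_b = false_positive_rate(records, group_b)
        # Guard against division by zero when the smaller FPR is 0.
        ratio = max(fpr_a, fpr_b) / max(min(fpr_a, fpr_b), 1e-9)
        return {"fpr": {group_a: fpr_a, group_b: fpr_b},
                "ratio": ratio,
                "biased": ratio > max_ratio}
    ```

    In the COMPAS case the audit would report a ratio near 2.0 on false positives, well past any reasonable parity threshold.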


    Hosted on Acast. See acast.com/privacy for more information.


