• Throwing out the rulebook: How age, race and gender are key inputs for fair lending

  • 2024/10/28
  • Runtime: 24 min
  • Podcast

Throwing out the rulebook: How age, race and gender are key inputs for fair lending

  • Summary

  • In this first of two episodes with Kareem Saleh, Founder and CEO of FairPlay AI, host Vince Passione takes a deep dive into the world of AI underwriting: How it’s been used in the past, how it has fallen short of its potential to remove bias in lending decisioning, and how the pioneering concept of Fairness-as-a-Service may yet revolutionize the AI lending game.


    Key takeaways:

    0:00 Intro

    2:06 How the murder of George Floyd compelled Kareem Saleh to look for ways he could more effectively change the system he operates in, and increase fairness for everyone.

    3:08 Machine learning is capable of learning, but also of learning the wrong things.

    3:55 Financial services regulators have two clearly defined standards of “fairness”: one focused on disparate treatment, the other on disparate impact.
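
    As a rough illustration of the disparate-impact standard: one common screen is the “four-fifths rule,” which compares approval rates across groups. The sketch below is a minimal illustration with hypothetical group labels and rates; the 0.80 threshold is a widely used rule of thumb, not a figure from the episode.

```python
# Minimal sketch of a disparate-impact screen using the four-fifths rule:
# flag the model for review when one group's approval rate falls below 80%
# of the most-approved group's rate. Groups and rates are hypothetical.

approval_rates = {
    "group_a": 0.62,  # approvals / applications for group A
    "group_b": 0.45,  # approvals / applications for group B
}

benchmark = max(approval_rates.values())
for group, rate in approval_rates.items():
    air = rate / benchmark  # adverse impact ratio vs. the most-approved group
    status = "REVIEW" if air < 0.80 else "ok"
    print(f"{group}: approval={rate:.0%}, AIR={air:.2f} -> {status}")
```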

    5:18 One of the core challenges (and failures to date) in using AI to mitigate bias is that a model can only learn from the data it is trained on, and that data typically over-represents certain populations.

    8:49 Traditional logistic regression models often fall short because they assume credit behaviors are linear; in reality, an individual’s financial behaviors are non-linear.
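
    To make the linearity point concrete: logistic regression models the log-odds of default as a weighted sum of the inputs, log(p / (1 - p)) = w·x + b, so a U-shaped risk pattern is invisible to it unless someone hand-crafts a transformed feature. A minimal sketch on entirely synthetic data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=(5000, 1))
# Synthetic "risk" that is U-shaped in x: high at both extremes, low in the middle.
p_default = 1 / (1 + np.exp(-(x[:, 0] ** 2 - 2)))
y = rng.binomial(1, p_default)

linear = LogisticRegression().fit(x, y)                          # sees only x
quadratic = LogisticRegression().fit(np.hstack([x, x ** 2]), y)  # sees x and x^2

print("linear-only accuracy:", round(linear.score(x, y), 2))     # near chance
print("with x^2 accuracy:   ",
      round(quadratic.score(np.hstack([x, x ** 2]), y), 2))      # much better
```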

    9:31 Unfairness can arise in AI-driven models when seemingly independent variables interact with one another in ways that most humans wouldn’t notice, but machines do.
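
    To see how such interactions can hide a protected attribute: two inputs that are each statistically unrelated to group membership can jointly encode it almost perfectly. The toy sketch below constructs such a pair on purpose; the features and data are synthetic, and real-world proxies are subtler, but the mechanism is the same.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
protected = rng.binomial(1, 0.5, n)        # hidden protected attribute

f1 = rng.binomial(1, 0.5, n)               # a coin flip for everyone
f2 = np.where(protected == 1, 1 - f1, f1)  # also 50/50 within each group

def corr(a, b):
    return round(float(np.corrcoef(a, b)[0, 1]), 3)

print("corr(f1, protected):       ", corr(f1, protected))       # ~0
print("corr(f2, protected):       ", corr(f2, protected))       # ~0
print("corr(f1 XOR f2, protected):", corr(f1 ^ f2, protected))  # 1.0
```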

    10:01 Underwriting regulation and compliance haven’t yet caught up with the technological capabilities of our AI-driven world.

    11:56 Why monitoring an AI model’s fairness trends is critical.
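
    A hedged sketch of what that monitoring could look like: recompute a fairness metric, here the same adverse impact ratio as above, on each new batch of decisions and alert when it drifts below a threshold. The monthly figures and the 0.80 cutoff are illustrative assumptions, not numbers from the episode.

```python
# (approvals_a, applications_a, approvals_b, applications_b) per month; hypothetical.
monthly_batches = [
    (610, 1000, 540, 1000),
    (600, 1000, 500, 1000),
    (620, 1000, 455, 1000),  # drift: group B's approval rate is slipping
]

for month, (app_a, n_a, app_b, n_b) in enumerate(monthly_batches, start=1):
    rate_a, rate_b = app_a / n_a, app_b / n_b
    air = min(rate_a, rate_b) / max(rate_a, rate_b)  # adverse impact ratio
    status = "ALERT" if air < 0.80 else "ok"
    print(f"month {month}: AIR={air:.2f} ({status})")
```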

    14:40 Examples of how seemingly unrelated or arbitrary variables can interact with one another to bias an outcome, and how personal attributes are still being used to varying degrees in decisioning models.

    19:53 Concerns vs. regulatory comfort with using personal attributes like age and gender in decisioning.

    22:44 The privacy implications, particularly in light of Section 1033, and how finding a way to share your personal information in some form, even if only through the census, will help contribute to a fairer credit system.


    Resources Mentioned:

    • https://fairplay.ai/ FairPlay AI
    • https://www.fdic.gov/system/files/2024-06/iv-1-1.pdf Fair lending rules and regulations
    • https://www.justice.gov/crt/equal-credit-opportunity-act-3 Equal Credit Opportunity Act
    • https://www.justice.gov/crt/fair-housing-act-1 Fair Housing Act
    • https://www.lendkey.com/podcast/modern-credit-reporting-can-old-tech-handle-new-data/ Episode with Christian Widhalm of Bloom Credit


    Thanks for listening to the 22 Minutes in Lending podcast. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show and be sure to subscribe so you never miss another insightful conversation.

    #CreditUnions #FairLending #AI #Lending
