By Adrianne Appel | July 28, 2023
Companies that use automated tools to screen candidates for jobs based in New York City must check those systems for bias or potentially run afoul of a first-in-the-nation law.
The law, which took effect July 5, is aimed at rooting out any bias against individuals in job hiring when an automated employment decision tool (AEDT) is used. The law requires that the employer or third-party hiring partner audit the AEDT for bias.
The final rules describe AEDTs as certain systems that rely on algorithms, statistical modeling, data analytics, artificial intelligence (AI), or machine learning to score, classify, or recommend job candidates. AEDTs may search résumés for gaps in employment history or for certain words and, based on the results, not recommend candidates for interviews.
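In broad terms, a bias audit of this kind compares how often a tool selects candidates in each demographic category and reports an impact ratio: each category's selection rate divided by the rate of the most-selected category. The sketch below is an illustration only, not language from the law or the city's rules; the categories and decisions are hypothetical, and a real audit would use historical data from the tool itself.

# Illustrative only: a toy impact-ratio calculation of the kind a bias audit
# of a screening tool might report. Categories and decisions are hypothetical.
from collections import defaultdict

# Each record: (demographic_category, was_selected_by_the_tool)
decisions = [
    ("Category A", True), ("Category A", True), ("Category A", False),
    ("Category B", True), ("Category B", False), ("Category B", False),
    ("Category C", True), ("Category C", True), ("Category C", True),
]

counts = defaultdict(lambda: [0, 0])  # category -> [selected, total]
for category, selected in decisions:
    counts[category][0] += int(selected)
    counts[category][1] += 1

# Selection rate = candidates selected / total candidates in the category
selection_rates = {c: sel / total for c, (sel, total) in counts.items()}

# Impact ratio = category's selection rate / highest selection rate
best = max(selection_rates.values())
impact_ratios = {c: rate / best for c, rate in selection_rates.items()}

for category in sorted(selection_rates):
    print(f"{category}: selection rate {selection_rates[category]:.2f}, "
          f"impact ratio {impact_ratios[category]:.2f}")

An impact ratio well below 1.0 for any category is the sort of disparity an auditor would flag for closer review.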
July 10, 2024 | By Adrianne Appel
Sorenson Communications agreed to pay $34.6 million and implement a comprehensive compliance program to settle allegations levied by the Federal Communications Commission that its subsidiary illegally retained call content of users who relied on captions to make and receive calls.
July 21, 2023 | By Kyle Brasseur
Technology companies including Google, Meta, and OpenAI agreed to a series of voluntary commitments they’ll make regarding their management of risks when developing artificial intelligence systems.
July 13, 2023 | By Kyle Brasseur
The Federal Trade Commission sent ChatGPT developer OpenAI a list of questions seeking clarity on how the company monitors, collects, and retains users' personal information and ensures control over its popular artificial intelligence chatbot.
November 14, 2024 | By Adrianne Appel
The U.S. Department of the Treasury’s Financial Crimes Enforcement Network issued an alert to financial institutions about their obligations to report deepfakes, warning artificial intelligence has given bad actors additional tools in their arsenal.
July 31, 2024 | By Adrianne Appel
A nationwide rental outlet affiliated with Rent-a-Center and its chief executive have been sued by the Consumer Financial Protection Bureau for allegedly deceiving five million consumers about the terms of credit agreements.
July 24, 2024 | By Neil Hodge
A lack of risk visibility is causing companies to reject customers, and potentially lose money, over fears they might be violating anti-money laundering and sanctions rules.