Worries DOJ is ‘dumping’ AI responsibilities on compliance departments in ECCP update


When the U.S. Department of Justice (DOJ) released its revised Evaluation of Corporate Compliance Programs (ECCP) earlier this year, it turned some heads. Tucked into a section on risk assessments was a strongly worded series of questions that appeared to saddle compliance teams with responsibility for ensuring their firms' safe use of artificial intelligence tools.

It was a surprising move, in part because AI is so new. Many in the compliance community believe the DOJ did this to plant a flag about what the government expects from businesses that use AI, because Congress appears reluctant to pass any law that would set guardrails for businesses to follow.

Two chief compliance officers, who asked to remain anonymous because they were not permitted to speak on the subject, said they hoped the DOJ would adjust the ECCP in the future as it relates to AI.

The reason, one of the CCOs said, was that responsibility for ensuring the safe use of AI tools should be shared among all business functions.

“It creates a potential situation where compliance is seen as owning AI, even though it didn’t have input in the decision-making process,” the CCO said. “It seems like the DOJ is dumping this responsibility on compliance.”
