2024-11-15T13:00:00 By Yasmine Abdillahi, CW guest columnist; Chief Compliance Officer and VP of Legal Affairs, Arrow Electronics
The era of artificial intelligence (AI) adoption is testing the old ways of doing compliance and underscoring the need for continuous monitoring. Compliance isn’t a one-and-done activity, yet organizational incentives and goals don’t always prioritize it accordingly.
While your organization may only need to conduct an audit annually, compliance is a continuous process. The landscape is constantly changing, whether it’s new regulations, the adoption of new technologies, or emerging threats.
As organizations rapidly adopt AI, new security risks and compliance concerns proliferate, and leaders across the spectrum are racing to put guardrails in place. A proven approach to staying ahead is continuous controls monitoring (CCM). When leaders have visibility into the compliance posture of the information and technology they own, they are empowered to make better technology decisions.
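In practice, CCM amounts to evaluating a defined set of controls on a recurring schedule rather than once a year. The Python sketch below is a minimal, hypothetical illustration of that idea only; the control names, check functions, and hourly interval are assumptions for illustration and do not describe any particular CCM platform or the author’s own program.

# Minimal sketch of a continuous controls monitoring (CCM) loop.
# All control names and check functions here are hypothetical placeholders.
import time
from dataclasses import dataclass
from typing import Callable

@dataclass
class Control:
    name: str
    check: Callable[[], bool]  # returns True when the control passes

def mfa_enforced() -> bool:
    # Placeholder: in practice this would query an identity provider.
    return True

def ai_model_inventory_current() -> bool:
    # Placeholder: in practice this would compare deployed AI models
    # against an approved register.
    return False

CONTROLS = [
    Control("MFA enforced for all admin accounts", mfa_enforced),
    Control("AI model inventory reviewed in last 30 days", ai_model_inventory_current),
]

def run_cycle() -> list[str]:
    """Evaluate every control once and return the names of any failures."""
    return [c.name for c in CONTROLS if not c.check()]

if __name__ == "__main__":
    while True:
        for name in run_cycle():
            # In practice: route to a dashboard or ticketing system.
            print(f"CONTROL FAILED: {name}")
        time.sleep(3600)  # re-evaluate hourly rather than once a year

The point of the sketch is the loop itself: controls are re-checked continuously, so a change in regulation, technology, or threat posture surfaces as a failed check within hours rather than at the next annual audit.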
2024-12-13T16:47:00Z By Aaron Nicodemus
When the DOJ released its revised Evaluation of Corporate Compliance Programs, it turned some heads. Tucked into a section on risk assessments was a strongly worded series of questions that appeared to place responsibility on compliance teams for ensuring the safe use of AI tools by their firms.
2024-12-12T14:32:00Z By Aaron Nicodemus
The Department of Justice’s Evaluation of Corporate Compliance Programs has made the importance of artificial intelligence governance frameworks clear, but it didn’t say what role compliance should play. Here’s the answer.
2024-11-21T16:25:00Z By Neil Hodge
Data governance has become a key concern for companies, especially as the EU AI Act and General Data Protection Regulation have put a premium on handling data responsibly and ensuring that artificial intelligence does not cause harm.
2024-10-31T14:43:00Z By Aaron Nicodemus
While companies are exploring and building artificial intelligence technology, lawmakers and regulators are trying to identify what ground rules they need to set. Companies and governments alike believe these guardrails are essential to ensuring the technology's safe and responsible use.
2024-10-28T15:29:00Z By Aaron Nicodemus
Companies are adopting artificial intelligence tools at a breakneck pace, but it’s increasingly clear that they must set guardrails early. AI leaders say that approaching the technology with safety and ethics in mind will help companies capture its benefits while avoiding the significant risks it poses.
2024-10-18T12:00:00Z By Aaron Nicodemus
For all the hype surrounding generative artificial intelligence, the technology has been met with a healthy skepticism in the compliance community. Compliance practitioners want to know: Is it safe? Can it be deployed ethically? Are the risks greater than the rewards? And what should an AI acceptable use policy contain?