2024-04-01T13:22:00 By Neil Hodge
The world’s first major piece of legislation regulating artificial intelligence (AI) has moved another step forward.
On March 13, the European Parliament approved the AI Act, which aims to regulate the technology based on its capacity to cause harm. The act follows a risk-based approach: the higher the risk, the stricter the rules.
The legislation—which aims to ensure “trustworthy” AI—provides developers and users with clear requirements and obligations regarding specific uses of the technology.
The rules are meant to increase transparency about how and when AI is used and what data the technology relies on to produce results and make decisions, and to prevent harmful outcomes.
There are four risk categories.
2024-10-17T16:22:00Z By Neil Hodge
Concerns about how robustly European member states will enforce the EU AI Act, which took effect Aug. 1, center on whether regulators will take a “light touch” approach or a sledgehammer to noncompliance. One thing’s for sure: the pace of AI innovation will make enforcement very difficult.
2024-04-18T20:42:00Z By Kyle Brasseur
With senior-level decisions on technology only increasing in frequency as new tools rapidly evolve, a panel at Compliance Week’s 2024 National Conference agreed compliance must consider the opportunities available to influence those conversations.
2024-04-03T18:23:00Z By Adrianne Appel
If there was one takeaway Diana Kelley offered during her keynote address at Compliance Week’s 2024 National Conference, it was that artificial intelligence tools—especially generative AI—need compliance.
2024-12-20T16:47:00Z By Neil Hodge
Any product that uses AI must be safety assessed for its entire lifespan under new rules that recently took effect across the EU. Experts warned companies using AI to tailor products could be classed as “manufacturers” and face the same duty of care as developers.
2024-12-19T16:18:00Z By Neil Hodge
When lawmakers slam the U.K.’s chief financial regulator as “incompetent,” it not only opens the door for others to pile on criticism, but it sparks a debate about how the organization can be improved, or removed.
2024-12-19T16:17:00Z By Aaron Nicodemus
The U.K. Financial Conduct Authority apologized to investors in peer-to-peer investment firm Collateral for not acting swiftly enough to prevent Collateral from defrauding its customers.