Let's talk about cyber-security risks. After all, everyone else is.

The Securities and Exchange Commission talked about the issue for five hours last week, at its much-anticipated cyber-security roundtable. The Center for Audit Quality published guidance on cyber-security risks one day before the SEC's confab, diplomatically but firmly stating that external auditors are not responsible for testing a company's IT controls beyond those relevant to financial reporting. And earlier this year the National Institute of Standards and Technology published a basic framework for managing cyber-security risks.

So we have plenty of discussion—much of it quite intelligent—about cyber-security risks and the urgent need to address them more effectively. Unfortunately, that seems to be as far as the conversation has gone, leaving compliance officers looking for direction still wandering in the wilderness.

The SEC seems a bit lost, too. Commissioner Luis Aguilar, who pushed to get the cyber-security roundtable on the SEC's calendar, said as much. “There is no doubt that the SEC must play a role in this area,” he said. “What is less clear is what that role should be.”

Let's unpack that dilemma. The SEC's core mission is investor protection. For compliance officers, that boils down to one question: what disclosures should companies be required to make, so that investors can be fully informed about the stocks they intend to buy? More specifically, what should companies disclose to the public about cyber-security risks?

There's a legitimate argument that companies shouldn't disclose every risk they face, because hackers will use that information against them. One roundtable participant, Leslie Thornton, general counsel of Washington Gas & Light Co., told of how hackers have tried to burrow into WGL's IT systems and change the gas pressure in its pipelines, apparently to test whether they could create the risk of an explosion. Should WGL disclose whether those hackers succeeded, or how much of a disaster they could cause? I don't think so, especially when I'm driving over one of those gas lines.

On the other hand, suppose you're an investor in Target. Target had actually disclosed the possibility that a major data privacy breach could cause material harm to the business. But that language, included in its 2012 annual report, is practically meaningless:

If our efforts to protect the security of personal information about our guests and team members are unsuccessful, we could be subject to costly government enforcement actions and private litigation and our reputation could suffer.

Another paragraph follows, for a total of 181 words that Target devotes to the risks of a cyber-security breach. All of it is a boilerplate recitation of possible consequences (private litigation, regulatory enforcement, loss of trust), plus an assurance that “we have a program in place to detect and respond to data security incidents.”

Fast forward to Target's annual report for 2013, filed just two weeks ago. Target's board expanded its disclosure to start with this:

The data breach we experienced in 2013 has resulted in government inquiries and private litigation, and if our efforts to protect the security of information about our guests and team members are unsuccessful, future issues may result in additional costly government enforcement actions and private litigation and our sales and reputation could suffer.

This time the company doubled its disclosure to 365 words, adding the warning that “…because the techniques used to obtain unauthorized access, disable or degrade service, or sabotage systems change frequently and may be difficult to detect for long periods of time, we may be unable to anticipate these techniques or implement adequate preventive measures.”

Everything Target disclosed to investors was accurate, and generally in compliance with what the SEC requires for discussing risk factors in the Form 10-K. Still, let's be honest: from disclosures like that, a reasonable investor cannot make good judgments about how well Target is working to protect customer data. Those disclosures don't say anything.

Somewhere between those two extremes—forcing companies to disclose sensitive details about their weaknesses, and allowing them simply to say they might suffer these risks some day—is where the SEC needs to be. The agency made a start with its cyber-security disclosure guidance of 2011, but we all still struggle with basic questions: What are the minimum standards of care companies owe to investors? What cyber-incidents qualify as material harm to the company?

Investors want to know a company is making diligent, good-faith efforts to protect data. Well, we still haven't even clearly defined what “diligence” and “good faith” mean in this context; remember, after all, that Target did comply with the Payment Card Industry standards for data security, and it got hacked anyway. How much more diligent was the company supposed to be? Maybe quite a lot; I suspect plaintiffs' lawyers will argue that point in civil lawsuits against Target for quite some time.

But what standards, what framework, what basic expectations, can boards use ahead of a cyber-threat, to understand how they should proceed and to give instructions to senior executives? That is what the SEC wants to know.

The NIST framework is one place to start. But as anyone watching the SEC's roundtable last week could see, despite the immense brainpower focused on this issue, we have a long way to go. Compliance officers' time in the wilderness continues.