Starting this month, banks that discover certain cyberattacks and other security incidents have just 36 hours to report them to
federal regulators. This rapid-reporting requirement is part of a regulatory trend that is spilling over into other sectors: Congress recently passed a law requiring
critical infrastructure providers to report cyber incidents within 72 hours, and a proposed
U.S. Securities and Exchange Commission rule would require publicly traded companies to report such incidents to regulators and shareholders within
four business days of determining that they are material, that is, that they could affect the business in ways a reasonable investor would want to know about.
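The two clocks run differently: the banking rule counts elapsed hours, while the SEC proposal counts business days. A minimal sketch of that difference (illustrative only; the helper names are hypothetical, and the business-day calculation ignores federal holidays, which the real rules would not):

```python
from datetime import datetime, timedelta

def hours_deadline(discovered: datetime, hours: int) -> datetime:
    """Elapsed-clock-hours deadline, as in the banking rule's 36 hours."""
    return discovered + timedelta(hours=hours)

def business_days_deadline(determined: datetime, days: int) -> datetime:
    """Deadline a given number of business days out, skipping weekends.
    Illustrative only: federal holidays are not accounted for."""
    deadline = determined
    remaining = days
    while remaining > 0:
        deadline += timedelta(days=1)
        if deadline.weekday() < 5:  # Monday=0 .. Friday=4
            remaining -= 1
    return deadline

# An incident deemed material on a Thursday morning:
determined = datetime(2022, 6, 2, 9, 0)  # Thursday, June 2, 2022
print(hours_deadline(determined, 36))         # Friday evening, June 3
print(business_days_deadline(determined, 4))  # Wednesday, June 8
```

Note how a weekend stretches the four-business-day window to six calendar days, while the 36-hour clock runs regardless.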
Although compliance with such measures may be challenging, these regulations matter: they promote transparency and information-sharing that could prevent future attacks or limit their damage. They could also alert the government when
significant entities are under attack and reveal whether individual incidents are part of a broader campaign against the country. Both increased transparency and decreased attack severity are especially critical in the financial sector, which relies heavily on public trust and is the foundation
of most economic activity. But for these regulatory requirements to have maximum effect, banks and financial institutions, as well as regulators, need to go about them in a smart way.
Context is critical
When disclosing a cybersecurity incident,
especially to the public, as could happen under the proposed SEC regulation requiring disclosure to shareholders, financial institutions need to put the breach in context: not just in terms of the technical landscape and possible remedies, but in
terms of business risk and exposure. Statements should identify which parts of the business and which assets could be affected, and what that means in financial terms. This will require businesses to invest in the predictive analytics tools and human talent
needed to quantify risk and potential damage from attacks properly and thoroughly, and to do so within the allotted four-business-day window.
A major goal of such regulations is public accountability, and that comes only with context; without it, consumers and investors still will not be able to understand and evaluate institutions' cyber risk. Lack of context could also fuel
panic, causing additional market volatility. On the other hand, reporting multiple incidents without context could lead to alert fatigue among the public, which is just as unhelpful as panic. Putting breaches and other incidents in context for the
public may well mean going beyond the information that regulations require, but investing in this context will go a long way toward increasing public understanding and, ultimately, public trust and confidence.
Compliance burden should be reasonable
Both regulators and companies need to focus on clear reporting structures, mechanisms and routines. Regulators, whether in banking or for general financial markets, need to ensure that reporting continues after an attack is initially disclosed;
after all, the situation 36 hours after an attack can look very different a day later, and different again a week later. But this needs to be balanced against what is practically achievable.
In addition, reporting requirements should not jeopardize security. The 36-hour reporting rule that took effect for banks this month requires disclosure of incidents to regulators, not directly to the public. But the proposed SEC scheme, which will affect
publicly traded banks and financial institutions, will require reporting to the public. There is a real risk here of
tipping off hackers, who, after realizing that an attack has been discovered, often attempt to destroy evidence that could identify them or their methods. In general, every piece of information can help an attacker, so reporting requirements should allow for discretion.
In some cases, it will make sense to hold off on public reporting until an incident has been further investigated and remediated.
Meanwhile, financial institutions should refine their own procedures for carrying out the reporting process. Compliance with the 36-hour rule no doubt requires extra work for banks, especially
smaller ones. Banks should take care that reporting routines are not mistaken for actual incident response.
Closing the security information gap
It is not just the public that can benefit from these more stringent reporting regulations. Companies, including banks and financial institutions, can learn from others' reported cyber incidents, especially if the SEC proposal comes to pass and makes
much of the incident reporting publicly available. Moreover, banks and public companies will have to create better, more time-efficient cyber response plans.
In my work, I often see organizations experience similar vulnerabilities and attacks, yet this information is rarely shared so that others can benefit. That lack of knowledge-sharing is, in my view, one of the biggest gaps in cybersecurity.
Sharing such information can not only prevent attacks but also help companies that experience attacks remediate them more quickly and effectively, which is very much in the public's best interest. Cybercriminals collaborate and share information all
the time; it is high time organizations did the same.