Ransomware such as WannaCry and Petya/NotPetya has alarmed businesses worldwide and caused significant damage. Yet these attacks are only the visible part of an inadequate security culture that urgently needs an update.

Ransomware, also called crypto-trojans, is not a new phenomenon, but the increasingly visible symptom of collective IT insecurity. The ransomware WannaCry infected at least 220,000 Windows computers worldwide in mid-May. The trojan encrypted files on the infected machines and demanded a ransom for their release. This was possible through a vulnerability exploited by the tool known as EternalBlue, present in Windows versions going back to Windows XP and closed by Microsoft only in March of this year. EternalBlue was in the hands of the NSA for an unknown period until it was “stolen” from the agency and published by a hacker group called the Shadow Brokers earlier this year. And although Microsoft hastened to release a patch, the most damaging crypto-worms known to date exposed the dilemma in which cybersecurity culture currently finds itself: a culture of silence that encourages rather than prevents the hoarding and misuse of security holes.

Payments to the Bitcoin accounts used by WannaCry can be tracked through an automated Twitter feed. It shows that some victims are still paying, despite public awareness efforts and the availability of a patch. Microsoft therefore continues to speak of a “heightened risk” for its customers and classifies the continued use of Windows XP as irresponsible. And while this ransomware has received public attention, it remains unknown how many businesses, government agencies, and citizens have been affected by espionage software that may have exploited exactly the same vulnerability for years, and in part still can.
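How such automated tracking might work can be illustrated with a minimal sketch: it periodically polls the total amount received by a watched Bitcoin address and reports any increase. This is not the code behind the actual feed; the blockchain.info query endpoint and the placeholder address are assumptions made purely for illustration.

```python
# Illustrative sketch only: periodically poll how much a Bitcoin address has
# received and report new payments. Assumes the public blockchain.info query
# endpoint "q/getreceivedbyaddress" and uses a placeholder address, not a
# confirmed WannaCry wallet.
import time

import requests

WATCHED_ADDRESSES = [
    "1PlaceholderWannaCryAddressXXXXXXX",  # hypothetical placeholder
]


def satoshis_received(address: str) -> int:
    """Total satoshis ever received by the address."""
    url = f"https://blockchain.info/q/getreceivedbyaddress/{address}"
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return int(response.text)


def watch(interval_seconds: int = 600) -> None:
    """Print a message whenever one of the watched addresses receives funds."""
    last_seen = {addr: satoshis_received(addr) for addr in WATCHED_ADDRESSES}
    while True:
        time.sleep(interval_seconds)
        for addr in WATCHED_ADDRESSES:
            current = satoshis_received(addr)
            if current > last_seen[addr]:
                paid_btc = (current - last_seen[addr]) / 1e8
                print(f"New ransom payment of {paid_btc:.8f} BTC to {addr}")
                last_seen[addr] = current


if __name__ == "__main__":
    watch()
```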

These continuing attacks on known and patched vulnerabilities, and Microsoft’s response, show that security patches for current operating systems are far from sufficient on their own; what is needed is a security culture in which different protective measures interlock and reinforce one another. It is not enough to publish updates: they also have to be installed and accompanied by other measures, such as encrypted backups (a minimal sketch of such a backup follows the quotation below). NotPetya was even distributed through a compromised update mechanism and spread that way. This shows that no single measure can provide security on its own; each can only be part of a comprehensive security culture, for which state authorities, through their actions, also bear a share of responsibility. In an unusual move, Microsoft blamed the NSA for the consequences of WannaCry, comparing the hoarding of knowledge about such vulnerabilities to the possession of conventional weapons such as missiles:

“Repeatedly, exploits in the hands of governments have leaked into the public domain and caused widespread damage. An equivalent scenario with conventional weapons would be the US military having some of its Tomahawk missiles stolen. This most recent attack represents a completely unintended but disconcerting link between the two most serious forms of cybersecurity threats in the world today – nation-state action and organized criminal action.”
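To make the encrypted backups mentioned before the quotation concrete, here is a minimal sketch of how a directory might be archived and encrypted before being moved offline. Python’s tarfile module and the third-party cryptography package are illustrative choices, not tools the article prescribes, and the key handling is deliberately simplified.

```python
# Minimal sketch of an encrypted backup, assuming the third-party
# "cryptography" package; one illustrative way to implement the complementary
# measures discussed above, not a prescribed tool.
import io
import tarfile
from pathlib import Path

from cryptography.fernet import Fernet


def create_encrypted_backup(source_dir: str, backup_file: str, key: bytes) -> None:
    """Pack source_dir into a gzipped tar archive and encrypt it with Fernet."""
    buffer = io.BytesIO()
    with tarfile.open(fileobj=buffer, mode="w:gz") as archive:
        archive.add(source_dir, arcname=Path(source_dir).name)
    encrypted = Fernet(key).encrypt(buffer.getvalue())
    Path(backup_file).write_bytes(encrypted)


def restore_encrypted_backup(backup_file: str, target_dir: str, key: bytes) -> None:
    """Decrypt the backup and unpack it into target_dir."""
    decrypted = Fernet(key).decrypt(Path(backup_file).read_bytes())
    with tarfile.open(fileobj=io.BytesIO(decrypted), mode="r:gz") as archive:
        archive.extractall(target_dir)


if __name__ == "__main__":
    key = Fernet.generate_key()  # keep this key offline, separate from the backed-up machine
    create_encrypted_backup("/home/user/documents", "documents.backup", key)
```

The point of the design is that both the backup copy and the key are kept away from the machines a crypto-trojan could reach; a backup that sits on the same disk, or whose key is stored next to it, can be encrypted or deleted along with everything else.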

This unintentional cooperation of states and cybercriminals is fostered by a culture of silence, because both states and criminals want to exploit the vulnerabilities for themselves. State actors use discovered vulnerabilities either to spy on other states or to monitor citizens and companies unnoticed. Cybercriminals have profited in many ways from states’ lack of interest in closing these gaps, including by selling knowledge about security vulnerabilities to states, by engaging in economic espionage, or by hijacking computers for their own purposes, for example as part of a botnet. But companies also profit from silence: as victims, they fear a loss of trust among their customers if incidents become public, and as IT security or software service providers, they benefit from the demand that continued insecurity creates.

Microsoft therefore calls for state cyberattacks and the stockpiling of exploits to be regulated under international law through a “Digital Geneva Convention”:

“For two-thirds of a century, since 1949, the world’s nations have recognized through the Fourth Geneva Convention that they need to protect civilians in times of war. But nation-state hacking has evolved into attacks on civilians in times of peace.”

Other organizations and experts, such as the Chaos Computer Club (CCC) and the Forum of Computer Scientists for Peace and Social Responsibility (FIfF), are also calling for legal norms that require the disclosure of software vulnerabilities. At the BSI’s IT Security Congress in May 2017, experts therefore demanded that a reporting obligation for security vulnerabilities be enshrined in law.

What can be done politically against this culture of cyber-insecurity? And how can a resilient security culture be established despite the inadequacy of individual measures? Updates for current software alone are not enough; they can even spread malware if the update mechanism is compromised. And even with secure software, the problem of “social engineering” remains, that is, the exploitation of human behavior as an attack strategy, which a security culture must also address.

To replace the current culture of cyber-insecurity, the political debate should address the following issues:

Cyberattacks almost exclusively affect civilian structures. Their use by states can therefore hardly be legitimized. We need a discussion about which actions by state actors are unacceptable. As Brad Smith (Microsoft) rightly notes, civilians are protected by the Geneva Conventions, and this must also apply to digital conflict.
Stockpiling exploits increases the vulnerability of digital infrastructures. If known vulnerabilities are concealed, security against cyberattacks is necessarily reduced. Stockpiling by state institutions that are supposed to increase security is particularly ambivalent. The renewed debate about state trojans for source telecommunication surveillance (Quellen-TKÜ) must take this into account: apart from rule-of-law concerns, it is quite conceivable that such interventions lower security overall.
The incentives in our current security culture should be re-evaluated. When organizations benefit, or at least suffer less damage, by not making successful attacks public, it is difficult to address potential threats appropriately. This is particularly important in view of the growing number of smart devices, Industry 4.0, and the Internet of Things. An effective security culture necessarily includes incentives for security-enhancing behavior.