Last week President Obama issued an Executive Order on Improving Critical Infrastructure Cybersecurity and a Presidential Policy Directive, Critical Infrastructure Security and Resilience (PPD-21). Unfortunately, these are a watered-down rehash of the administration's earlier proposals. Without legislation, they are directives to the executive branch and mere suggestions to industry.
What I found interesting was the report, Raising the Bar on Cybersecurity, released by the Center for Strategic and International Studies. The report identified measures that cut the risk from attacks by 85%: application whitelisting, timely patching, and minimizing administrative privileges.
I agree: these measures are effective. Had these controls been in place at the Department of Energy, they might have prevented the intrusion and subsequent loss of data. To this list I would first add two controls: eliminating outdated software and hardware, and annual attack and penetration (A&P) testing.
But there are challenges.
Whitelisting is time-consuming and can be difficult to implement. You need locked-down desktops, standards for server builds, and network configuration profiles already in place; otherwise, whitelisting may be circumvented, especially by determined outside attackers and disgruntled internal users.
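To make the idea concrete, here is a minimal sketch of hash-based application whitelisting. It assumes a flat allowlist file (`approved.sha256`) built from known-good binaries on a locked-down reference system; all file names and paths are illustrative, not drawn from the report.

```shell
#!/bin/sh
# Build the allowlist once, from a locked-down reference build, e.g.:
#   sha256sum /usr/bin/* > approved.sha256

# Check a program's hash against the allowlist before letting it run.
check_binary() {
    hash=$(sha256sum "$1" | awk '{print $1}')
    if grep -q "$hash" approved.sha256; then
        echo "ALLOW: $1"
    else
        echo "DENY: $1"
    fi
}
```

A production deployment would enforce this in the operating system itself (for example, AppLocker on Windows or fapolicyd on Linux); a script like this only illustrates the principle, and shows why the allowlist must be rebuilt every time a standard desktop or server build changes.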
Patching is always a good thing, but patches must be tested before they are applied. This is particularly challenging for the industrial control systems associated with critical infrastructure, where an untested patch can halt a production process.
Reducing administrative privileges is a good idea that appears in most InfoSec improvement programs, and it is one of the first things covered in a security auditor's review. A tool like Unix's sudo is a good way to assign specific privileges to specific users; versions of, or equivalents to, sudo are available for other operating systems.
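As a sketch of what this looks like in practice, the sudoers entries below grant narrowly scoped privileges instead of blanket root access. The user and group names are hypothetical, and real entries should always be edited with visudo so syntax errors are caught before they lock anyone out.

```
# Hypothetical /etc/sudoers entries illustrating least privilege.
# "webadmin" may restart one service as root, and nothing else:
webadmin  ALL = (root) /usr/sbin/service apache2 restart

# Members of a "patchers" group may apply updates, but get no root shell:
%patchers ALL = (root) /usr/bin/apt-get update, /usr/bin/apt-get upgrade
```

Because every sudo invocation is logged, this approach also gives auditors the per-user accountability that a shared root password cannot.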
Outdated software and hardware are often a source of vulnerabilities, because patches and updates are no longer provided. Unfortunately, some entities do not have the resources to eliminate old software and hardware, so they must find alternate means of providing adequate security.
Finally, annual A&P testing may be expensive, but for organizations at risk, especially of losing intellectual property, exposing personally identifiable information (PII), or taking a hit to corporate reputation, the expense is well justified. A&P testing performed by qualified third parties is a "no-harm, no-foul" way of ensuring networks and systems are properly protected. It also demonstrates that a modicum of due diligence has been performed, which may help forestall or lessen the impact of a lawsuit following a breach.
So why doesn't every organization jump on the A&P testing bandwagon? Beyond the expense, some IT directors are reluctant to employ A&P testing because of the risk of exposing poor network design or configuration errors to executives or the board of directors. Those directors, however, should be thinking about protecting their shareholders' interests, not their careers.
Implementing these five processes will not only raise the bar for protecting organizational assets, but will also reduce the effort required to maintain an efficient and effective IT environment.