[ISN] The federal computer security report card: Lessons from Uncle Sam

InfoSec News isn at c4i.org
Mon Apr 12 04:00:11 EDT 2004


http://www.computerworld.com/securitytopics/security/story/0,,91899,00.html

Opinion by Marc Gartenberg
APRIL 08, 2004
COMPUTERWORLD

For the fourth year in a row, the federal government released its
"Report Card on Computer Security at Federal Departments and Agencies"  
(download PDF) [1]. The average grade for fiscal 2003 was a D (65).  
The overall average grade in 2002 was an F (55); in 2001, it was also
an F (53). Since 2000 was the first year that any measurements were
taken, that year's score was essentially an incomplete, though it
carried a letter grade of D-.

Looking at the situation through the lens of an eternal optimist (and 
realist), maybe, just maybe, agency heads, the Office of Management 
and Budget and Congress will start looking for ways to get these 
agencies where they should be. An empire in the age of technology can
and should get passing grades in information security.

Rather than looking at the trend and concluding that things aren't
really that bad since, after all, the overall score is improving, let's
examine instead the underlying factors that led to
these scores. Then we can see why our dear Uncle Sam needs some help, 
and we can offer some suggestions.

Through this analysis, it will become clear that the issues are 
related to establishing, maintaining and measuring enterprise security 
management strategy as part of the systems development life cycle so 
that no government agency or company ever has to settle for a D.

Why the bad grades?

To answer that, we need to examine the factors on which the scorecard
is based: the certification and accreditation processes; the subtle
distinctions among categories of IT systems, namely general-support
systems and major applications (a.k.a. mission-critical applications);
and the fact that many legacy systems are still long overdue for
retirement. Ready?

First, here's the process, which sounds simple. The National Institute 
of Standards and Technology (NIST), a component of the U.S. Department 
of Commerce, publishes and updates its policy guidance for information 
security. Federal agency security chiefs are supposed to see that 
these guidelines are followed within their agencies. The problem is 
that the NIST guidance isn't very specific about implementation. It
also isn't an operational procedure manual. Rather, to a great extent, 
it's a higher-level management policy document. This creates a gap 
between knowing what to do and how to do it. Yet the scorecard rates 
an agency on how well it implements the guidelines.

Then there's the reporting scheme, which is handled by each agency's 
own Office of the Inspector General. These offices are designed as 
semi-autonomous bodies operating within and under the jurisdiction of 
each agency head.

Another factor is that this year agencies had to meet the requirements 
of the Federal Information Security Management Act (FISMA). This law 
expands on the information-security evaluation and reporting 
requirements enacted in 2001 under the Government Information Security 
Reform Act (GISRA). Under FISMA, agencies must demonstrate progress in
areas including risk management and contingency and continuity
procedures to ensure that their mission-critical and general-support
systems are protected. This includes annual IT security reviews,
reporting and remediation planning for systems at all stages of the
systems development life cycle.

So while agencies in previous years were showing improvement against
the GISRA standards, the change in regulations midstream left many
agencies struggling to meet the new mandates.

When all these factors are taken together, you can see that getting an
A isn't all that easy.

Is it fission or fusion?

Interestingly, though, FISMA was a long time coming, and federal
security chiefs had ample time and fair warning before the sun set on
GISRA. It's also noteworthy that the grades in previous years
weren't much better -- how could they get any worse? Last year's grade 
average was an F. This year it's a D. So, maybe things are getting 
better. It's hard to say, since each agency, regardless of size, is
given equal weight in determining the overall average, so a couple of
A's, such as those earned by the National Science Foundation and the
Nuclear Regulatory Commission (whew!), helped lift the overall score a
bit.
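
To see how that equal weighting plays out, here's a minimal Python
sketch using purely hypothetical scores (not the actual 2003 figures):
two small agencies earning A's can pull a failing group's average up
into D territory.

    # Minimal sketch of the unweighted scorecard average. The agency
    # scores below are hypothetical; every agency counts equally,
    # regardless of its size or the number of systems it runs.
    hypothetical_scores = {
        "Large Agency A": 40,   # failing
        "Large Agency B": 45,   # failing
        "Small Agency C": 95,   # an A
        "Small Agency D": 90,   # an A
    }

    overall = sum(hypothetical_scores.values()) / len(hypothetical_scores)
    print(f"Overall average: {overall:.1f}")  # 67.5 -- roughly a D
    # The two small A-grade agencies lift the average, even though the
    # two large agencies running far more systems are still failing.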

The ROI of security

Federal and industry chief security officers will agree that it's hard
to build a tangible case for increased security appropriations,
primarily because IT security spending yields little immediate,
measurable return on investment.

That's what makes IT security policy development all the more
challenging: proving that the need exists and that a properly formed
strategy can mitigate risks, protect critical information assets and
ensure confidentiality, integrity and availability.

One way to increase appropriations, though, is to fail a security 
audit and place the blame on inadequate funding, which is essentially 
what's happening. The fiscal 2005 federal budget increases IT spending 
by about 10% over fiscal 2004, to close to $60 billion.

A company, especially a public one, has to maintain solid earnings 
while building equity. Meanwhile, it takes leadership and vision to 
recognize the value of a solid IT enterprise security policy. Much has 
been written on demonstrating ROI for IT security, so I won't get too 
granular here. Suffice it to say that it doesn't take too much effort 
to perform a solid risk assessment and produce a risk-level matrix 
that clearly demonstrates the risk thresholds of any enterprise.
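
As a rough illustration, here's a minimal Python sketch of such a
risk-level matrix: qualitative likelihood and impact ratings are
combined and compared against thresholds. The assets, ratings and
thresholds are hypothetical placeholders, not drawn from any real
assessment.

    # Minimal risk-level matrix sketch: likelihood x impact, compared
    # against thresholds. All assets and ratings are hypothetical.
    LEVELS = {"low": 1, "medium": 2, "high": 3}

    def risk_level(likelihood, impact):
        """Combine qualitative likelihood and impact into a risk level."""
        score = LEVELS[likelihood] * LEVELS[impact]
        if score >= 6:
            return "high"
        if score >= 3:
            return "medium"
        return "low"

    assets = [
        # (asset, likelihood of compromise, impact if compromised)
        ("public web server", "high", "medium"),
        ("e-mail gateway", "medium", "medium"),
        ("internal wiki", "low", "low"),
    ]

    for name, likelihood, impact in assets:
        print(f"{name:20s} {likelihood:>6s} x {impact:>6s} -> "
              f"{risk_level(likelihood, impact)}")

A one-page matrix along these lines is usually enough to show
executives where the enterprise sits relative to its risk thresholds.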

The tough part is gaining support and momentum for developing a solid
set of plans (contingency, continuity of operations, training and
education) and ensuring that those plans get critical executive-level
backing within the organization.

Lessons learned - the "P word"

That would be policy, and that's just what NIST provides. But that's 
not enough. The federal government - and this applies to industry as 
well - needs guidance but also needs procedures to follow. The average 
IT professional needs a set of standards to subscribe to and a set of 
guidelines on how to meet those standards. That's the missing piece, 
which I hope will be recognized and developed sometime soon.

It was another year of dismal federal IT security grades, and the 
complexities and threats in the world aren't diminishing. Government 
agencies and related organizations have their work cut out for them, 
but the pieces are there. By making better use of talent, embedding
security into the systems development life cycle, further refining
continuity planning and continuing to retire legacy systems, agencies
should see their overall grades improve next year.


[1] http://public.ansi.org/ansionline/Documents/Standards%20Activities/Homeland%20Security%20Standards%20Panel/ComputerSecurity.pdf