
ICG Risk Blog - [ Infrastructure Defense Public ]

Revisiting Clarke's six bleak IT trends from October 2003


While Clarke was often dismissed as a Cassandra, and a gloomy one at that, during his tenure as cybersecurity czar, I would agree with his assertion that the cost of the Sobig attack justified taking his warnings more seriously. I absolutely feel that subsequent attacks have justified his assertions.

Clarke outlined six trends when he addressed the Gartner Symposium/ITxpo 2003 in October 2003:

  1. Rising vulnerabilities: Announced vulnerabilities have doubled every year for the last three years. (Wonder if Moore's Law will have an analog in Clarke's Law?)
  2. Rising patches: Patches for those vulnerabilities have doubled every year for the past three years. (Patch management is a sinkhole for both individuals and companies)
  3. Falling "time to exploit": "Time to exploit" has dropped from months to six hours (as of late 2003). (This is the time for an exploit to reach hacker blogs and IRC rooms. "Time to the wild" -- that's us -- follows shortly thereafter)
  4. Rising rate of propagation: Attacks now quickly infect 300,000 to 400,000 machines
  5. Rising cost of cleanup: Worldwide cleanup cost for 2002 was $48 billion, rising to an estimated $119 billion to $145 billion for 2003
  6. Rising identity theft: $99 billion cost in 2002 (and 2002 incidents were 1/3 of the last five years' total)
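The first two trends describe simple annual doubling, which can be sketched as compound growth. A minimal illustration -- the starting count below is a hypothetical placeholder, not one of Clarke's figures:

```python
def project_doubling(start_count, years):
    """Project a quantity that doubles every year, as Clarke described
    for announced vulnerabilities and their patches."""
    return [start_count * 2**year for year in range(years + 1)]

# A hypothetical 1,000 announced vulnerabilities in year 0:
print(project_doubling(1000, 3))  # -> [1000, 2000, 4000, 8000]
```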

Status? We have done nothing as of today to ameliorate any of the six. As I mentioned in an earlier note, the bad guys are operating inside our decision loop:

Ex-cyber security czar Clarke issues gloomy report card
By David Berlind, Tech Update
October 22, 2003

Gordon Housworth



Cybersecurity Public  InfoT Public  Infrastructure Defense Public  


Improving the Suspicious Activity Report (SAR)


BENS tasked its members with a "Follow the Money" project in an effort to improve the government’s ability to identify, follow and disrupt financial activity related to terrorism. The Suspicious Activity Report (SAR) is the primary method for financial services firms to report activity that may be related to terrorism.

"The Suspicious Activity Report (SAR) system was created in 1996 to replace six overlapping methods of financial information reporting with a single, more uniform process. The SAR was designed to reduce paperwork for the banking community and to increase the amount of useful information available to investigators. SARs have been called "haystacks of needles", therefore it is crucial the information in these reports is systematically collected and analyzed carefully. Since the Patriot Act in 2001, the SAR process has expanded from money laundering detection to intelligence gathering for identifying financial transactions that may be related to terrorism. Moreover, the Patriot Act requires that many more financial institutions submit SARs resulting in a corresponding increase in the number of reports."

The US government has expanded its recommendations for recognizing traditional money-laundering activity to include possible terrorist activity. These recommendations include:

  • Financial activity to and from countries identified as state sponsors of terrorism
  • Financial activity inconsistent with the stated purpose of the business
  • Financial activity not commensurate with stated occupation
  • Use of multiple accounts at a single bank for no apparent purpose
  • Importation of high dollar currency/traveler’s checks not commensurate with stated occupation
  • Structuring of deposits at multiple bank branches to avoid BSA requirements
  • Abrupt changes in account activity
  • Use of multiple personal and business accounts to collect and then funnel funds to a small number of foreign beneficiaries.
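One of the red flags above -- structuring deposits at multiple branches to stay under BSA reporting thresholds -- lends itself to a windowed heuristic. The $10,000 currency-reporting threshold is real, but the account data, field layout, and seven-day window in this sketch are hypothetical illustrations, not any bank's actual monitoring rules:

```python
from datetime import date, timedelta

BSA_THRESHOLD = 10_000  # USD; cash transactions at/above this trigger a Currency Transaction Report

def flag_structuring(deposits, window_days=7):
    """Flag accounts whose sub-threshold deposits, spread across multiple
    branches, sum past the BSA threshold within a sliding window.
    Each deposit is a tuple: (account, branch, date, amount)."""
    flagged = set()
    by_account = {}
    for account, branch, day, amount in deposits:
        if amount < BSA_THRESHOLD:  # only sub-threshold deposits suggest structuring
            by_account.setdefault(account, []).append((day, branch, amount))
    for account, items in by_account.items():
        items.sort()
        for day, _, _ in items:
            window = [x for x in items
                      if day <= x[0] <= day + timedelta(days=window_days)]
            if (sum(a for _, _, a in window) >= BSA_THRESHOLD
                    and len({b for _, b, _ in window}) > 1):
                flagged.add(account)
    return flagged

deposits = [
    ("acct1", "branch_a", date(2004, 1, 1), 6_000),  # two sub-threshold deposits...
    ("acct1", "branch_b", date(2004, 1, 2), 5_000),  # ...at different branches, two days apart
    ("acct2", "branch_a", date(2004, 1, 1), 500),
]
print(flag_structuring(deposits))  # -> {'acct1'}
```

A real monitoring system would, of course, layer many such rules and feed the hits to human analysts before any SAR is filed.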

Financial institutions express a common concern that the SAR process lacks feedback to filers, i.e., that their reports go into a "black hole," despite publication of the SAR Activity Review, which provides information about, and examples of, how SARs are being utilized.

Opinions differ as to the source of the problem -- whether better process or better systems are needed -- but it is likely both.

I would tend to use this as a guide to how SAR does and doesn't work in practice. Yes, "more frequent, more concise communication" would be better, and that may work against white-collar crime or organized crime.

It does nothing, however, to address the off-books money transfers of the hawala informal financial system, in which merchants around the world act as intermediaries for money transfers, and which is so favored by terrorist groups.

Improving the Suspicious Activity Report (SAR)
Recommendations for Improving the Suspicious Activity Report (SAR)
Business Executives for National Security (BENS)
11 April 2003

Gordon Housworth



InfoT Public  Infrastructure Defense Public  


NASA's Air Travel Piracy PPT: Learning good things from bad ideas


It is possible to learn good things from bad ideas. Remember DARPA's aborted futures market, the Policy Analysis Market (PAM), that was going to be an absolutely great futures tool for risk analysis in the Middle East? (Too bad that its examples ('How long will Jordan's King Hussein survive?') were a PR disaster and led to its being killed off -- along with Poindexter.) Well, NASA was moving along another ill-fated line towards air piracy detection tools.

NASA's gaffe, beyond the request for massive volumes of private consumer data, was the Air Travel Piracy PPT that it presented to Northwest Airlines in December 2001 for the purpose of obtaining "system-wide Northwest Airlines passenger data from July, August, and September 2001." You can read an html copy of the Air Travel Piracy FOIA Documents. (Also included at the end of the PPT are two associated documents, including the written request from the Chief of NASA's Aviation Systems Division.)

NASA's purpose was to perform a proof-of-function using both data-mining and "brain-monitoring" technology installed at airport terminals in an effort to identify "threats." The proposed brain-monitoring technology would detect EEG and ECG signals from the brain and heart and then have that data analyzed by software, in combination with previously floated plans to cross-reference passengers' travel history, credit history, and other information from hundreds or even thousands of databases as part of the Computer-Assisted Passenger Prescreening (CAPPS) program.

Yes, Robert Pearce, Director of NASA's Strategy and Analysis Division, disavowed the report in a press release, noting that "NASA does not have the capability to read minds, nor are we suggesting that would be done." Yet another NASA spokesman, Herb Schlickenmaier, confirmed that reading the brainwaves and heart rates of airline passengers was a NASA goal. The idea was that such data, combined with body temperature and eye-flicker rate, could make a feasible lie detector. Furthermore, the PPT NASA presented to Northwest in December 2001 did speak of "Non-invasive neuro-electric sensors under development as a collaborative venture between NASA Ames and commercial partner."

A few foils of the PPT are blacked out, but it still gives one a feel for the types of data and process that NASA envisioned -- and now appears to have dropped.

There are some good things in the area of more prosaic detection that we can focus upon.

Gordon Housworth



InfoT Public  Infrastructure Defense Public  


Missile defense, not terrorism, was Rice's topic for aborted 11 September speech


I like to say that the hole is as interesting as the donut, perhaps more so in intel terms. The 11 September speech that Condi Rice never gave was on missile defense as the cornerstone of a new US national security strategy.

"The text also implicitly challenged the Clinton administration's policy, saying it did not do enough about the real threat -- long-range missiles."

Terrorism was mentioned only under one of the rubrics of the day: a WMD threat from rogue nations. The speech made no mention of al Qaeda, Osama bin Laden, or Islamic extremist groups.

That focus was an administration constant. Just three months earlier, in June 2001, the president's five top defense interests in his first speech to NATO heads of state in Brussels were, in order:

  1. Missile defense
  2. NATO's relationship with Russia
  3. Common US-European working relationship
  4. Increased NATO defense spending
  5. NATO's enlargement with former East European countries

Al Qaeda operatives were, at that time, in flight training and final preparations for 11 September.

Rice did give her postponed speech in April 2002, but missile defense was gone. In its place was international terrorism -- the stateless kind.

We should expect more materials to emerge and for them to figure in Rice's questioning under oath before the 9/11 Commission.  Rice is a gifted speaker.  She will need to put those skills to use.

Top Focus Before 9/11 Wasn't on Terrorism
Rice Speech Cited Missile Defense
By Robin Wright
Washington Post Staff Writer
Thursday, April 1, 2004; Page A01

Gordon Housworth



InfoT Public  Infrastructure Defense Public  Terrorism Public  


Profiling the Amerithrax perpetrator(s)


The FBI linguistic and pre- and post-offense behavioral assessments of the person responsible for mailing anthrax letters on September 18 and October 9, 2001, paints a picture of a lone male domestic terrorist, a loner with a grudge, or possibly a bioevangelist ("someone with experience in the bioweapons arena who believed the U.S. government and public were oblivious to the magnitude of the potential threat from bioterrorists").

But profiling is an approximating science. I am reminded that the profile in use by the Unabomber Task Force in 1991 was substantially revised years later based on the Unabomber's writings. While the Unabomber, Theodore Kaczynski, was said at capture to have "startling similarities" to the FBI profile (male Caucasian, highly educated, quiet, antisocial, meticulous), there had been significant variances:

  • Much older (by more than a decade).
  • Quite unkempt in appearance (assumed to be very neat as well as meticulous).
  • Underestimated intelligence (revised sharply upwards by the Manifesto).
  • Modus operandi (the Manifesto pointed to non-reliance on, or rejection of, technology).
  • Residence (Montana as opposed to assumed Northern California).

Perhaps the Amerithrax profile is spot-on this time. It would appear that a foreign terrorist or multiple terrorists are not being considered, at least in the official, unclassified press. There is a blizzard of pro and con commentary on this profile, along with claims of investigative bias and discounting of Muslim-related terrorists. I admit to a great curiosity as to the justification of the FBI's apparent focus on a single individual.

Renaldo A. Campana, then Unit Chief of the FBI's Weapons of Mass Destruction Countermeasures Unit, said at an emerging bio-threat seminar sponsored by GWU and the Potomac Institute on 16 June, 1998:

"The closest I've ever come to biological-chemical issues is when the toilet on the 37th floor gets backed up. So let's keep it in the right kind of perspective. The job of the FBI is really to deal with the crisis when it involves weapons of mass destruction… [What] do you consider to be… the largest and most important threat to the United States today? Please. Who do you think? Foreign-directed terrorist, individual, white extremist, black extremist?... Let me tell you. Let's get back to reality. It isn't the Middle Eastern people. It isn't white supremacists. It is the lone individual, lone unstable individual. That, statistically, from the cases that we have, is the biggest threat right now."

Leave aside how frighteningly wrong that was, even back in 1998, and how clearly unskilled FBI agents were in dealing with bioagents during the 2001 anthrax investigations. Authorities may have valid reasons for adhering to this profile, but I am still struck by the continued attachment to the domestic loner to the exclusion of all other candidates, foreign and domestic.

And then there is the question of where the perp or perps went, why they stopped -- or, if they were merely deflected, when they will strike again. Richard M. Smith spoke to some of the possible reasons why the anthrax attacks stopped after October 9, 2001:

  • Fear of capture.
  • Lapsed access to supply and weaponized production equipment.
  • Achieved goal in the 2001 attacks.
  • Failed to achieve goal and is seeking other means of delivery.
  • Dead after accidental infection.
  • US anti-terror operations affected ability to conduct future attacks.
  • Planning for a larger-scale attack.

I am struck by similarities to investigators’ theories over a six-year Unabomber silence: "He'd committed suicide, was serving time for an unrelated charge or was busy perfecting his technique."

I will be one among many following the FBI's progress between now and June/July. If a single individual -- domestic, foreign-resident, or foreigner -- did produce a genuinely world-class weaponized anthrax, he will have pulled off a project worthy of a state-sponsored anthrax program.

Gordon Housworth



InfoT Public  Infrastructure Defense Public  Strategic Risk Public  Terrorism Public  


Senate anthrax powder: State of the art


My 4 Dec 2003 note has renewed relevance now that Amerithrax investigators are said to be at a "critical" and "sensitive" stage and could "unearth significant leads by early July."

Gary Matsumoto's article in Science magazine, "Anthrax Powder: State of the Art?", drew together many threads -- especially, over recent months, with regard to the nanoglass technology used in computer chip manufacture, specialty paints, and pigments -- to paint a picture of supreme skill and manufacturing prowess in the making of the anthrax used in 2001 against the Senate office building.

[Note that while the article is subscription-based, it has been mirrored at sites such as url1 and url2.  Cryptome has html text version -- smaller than the pdf but no photos.]

Even by hardened WMD standards, the Senate weaponized anthrax was off the charts in lethality. As great a master as D.A. Henderson said, "It just didn’t have to be that good" to be lethal.

The problem was not just how lethal it was, how leading-edge it was, or who could make it (here or offshore), but how its lethality could be obscured, even denied, for so long.

Early in the investigation, the FBI voiced the consensus view of military and civilian biodefense specialists that only a sophisticated lab could have produced the material: it was "weapons-grade," with an exceptionally high spore concentration and uniform particle size, contained silica to reduce clumping, and was electrostatically charged to create an "energetic" aerosol.

Then the FBI about-faced to the opinion that the material could have been made by a knowledgeable person or persons with run-of-the-mill lab equipment on a modest budget. Now the anthrax contained no additives and had large particles, agglomerates (lumps), and substandard milling. The prince had turned into a toad.

The Armed Forces Institute of Pathology (AFIP), however, would not back off and reported that its mass spectrometry analysis found extraordinarily high silica counts in the anthrax.

Nonetheless, the Justice Department locked onto a "person of interest": Steven J. Hatfill, a virologist and physician who conducted Ebola research at Fort Detrick, Maryland (which houses the U.S. Army Medical Research Institute of Infectious Diseases). Leaks to the media did everything but convict Hatfill as FBI and Justice pursued the idea that an individual or small group with limited means could have produced it.

One of the FBI's most senior scientists, Dwight Adams, then made the claim that the silica in the Senate anthrax had occurred naturally in the organism's subsurface spore coat. That claim unfortunately contravenes the body of anthrax knowledge available to many microbiologists.

To support the small/rogue-team hypothesis, the FBI charged a skilled team at Dugway Proving Ground, Utah, with producing a similarly high-grade anthrax without silica on a modest budget. No success: the Dugway effort produced only a coarse product that stuck together in little cakes.

The Senate anthrax is now revealed to be more advanced than any known weaponized product in US or Russian inventory -- it is the unclassified, world-class state of the art in anthrax, as it contains:

  1. Virulent Ames strain of anthrax
  2. Extraordinarily high spore concentration
  3. Uniform particle size
  4. Silica to reduce clumping
  5. Polymerized glass (nanoglass) coating to anchor the silica to the anthrax
  6. Electrostatic charge for an "energetic" aerosol

It is now believed that this level of weaponization demands equipment worthy of a state-sponsored lab.

It is tantalizing that one of the few firms making "electrohydrodynamic" aerosols for inhalation drug therapy is BattellePharma, Battelle’s pharmaceutical division. Battelle also has a "national security division" that produces bioweapons, performs bioaerosol research, and manages certain US facilities. No "person of interest" has been found at Battelle.

There are now massive questions over the provenance of the Senate anthrax. If it was made in the US, then who, where, and why? If it was made offshore, or sanctioned from overseas, then a state of war should exist.

Gordon Housworth



InfoT Public  Infrastructure Defense Public  Weapons & Technology Public  


Clarke's vision of securing the net


It appalls me that we have overlooked Richard Clarke's recommendations in cybersecurity as we have in other areas. I would agree with all of Vamosi's comments in "Richard Clarke: He could have secured the Net," save for his disagreement over the potential for a digital Pearl Harbor.

I think that something with at least a small "p" is possible -- and that opinion strengthens if I consider a concentrated attack on one critical element, given that the 2003 Federal Computer Security Report Card (9 December 2003) scored the 24 critical federal agencies at an overall D grade, up from an F, after four years of scoring, and that those still getting an F are the departments of Homeland Security, Energy, State, Justice, Health and Human Services, Interior, Agriculture, and Housing and Urban Development. (Defense got itself into the D category along with Transportation, GSA, Treasury, the Office of Personnel Management, and NASA.)
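Letter grades like these roll up into an overall grade by some averaging scheme. A toy sketch of one plausible aggregation -- the point scale and the grade mix below are hypothetical, not the report card's actual methodology:

```python
GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}
POINTS_GRADE = {v: k for k, v in GRADE_POINTS.items()}

def overall_grade(agency_grades):
    """Average the letter grades and round to the nearest letter (toy scheme)."""
    avg = sum(GRADE_POINTS[g] for g in agency_grades) / len(agency_grades)
    return POINTS_GRADE[round(avg)]

# A hypothetical mix of agency grades:
print(overall_grade(["D", "D", "F", "C"]))  # -> D
```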

"Had Clarke's proposals been taken seriously, all broadband users would have antivirus and firewall protection, and we might not have endured the MSBlast worm meltdown in August of 2003 nor be dealing with these pesky e-mail viruses right now. Microsoft might also be talking about releasing a version of Windows XP that had been independently proven to be secure (instead of us just taking the company's word that it's secure). In retrospect, we're no better off today, and perhaps we're actually worse off, than before the [National Strategy to Secure Cyberspace] existed."

Clarke further suggested that the government procure "only computer products certified by the National Information Assurance Partnership (NIAP) testing program," but the idea was dropped as excessive regulatory intrusion.

With Clarke and his former reports departed, we now have no one with the breadth and vision needed to craft and lead a cybersecurity mandate. DHS is in disarray. As Peter G. Neumann observed:

"Technology alone does not solve management problems. Management alone does not solve technology issues. Reducing risks is a beginning-to-end, end-to-end system problem where the systems include all of the relevant technology, all of the relevant people, and all of the dependencies on and interactions with the operating environment, however flawed and complicated. But those flaws and complexities must be addressed systemically."

Not an easy thing to achieve on the best of days.

See 2003 Federal Computer Security Report Card

and IT Security Gets First Passing Grade — Barely
Published: December 15, 2003
By KAREN ROBB
Federal Times

Also these -- what might be called Clarke's legacy:

The National Strategy to Secure Cyberspace

National Strategy for Physical Protection of Critical Infrastructures and Key Assets

Richard Clarke: He could have secured the Net
By Robert Vamosi: Senior Associate Editor, Reviews
Friday, March 26, 2004

Gordon Housworth



Cybersecurity Public  InfoT Public  Infrastructure Defense Public  


Networked sensor cloud of trailing, ever-present data


"In 2015: sensors everywhere, computers invisible" describes a Gartner prediction that:

"[By] 2015, passive tags would begin to inhabit every non-trivial object, and every thing could be identifiable and located. Active, intelligent wireless networking and sensing devices will cost less than 50 cents. The sensors would run low power CPUs, have wireless and sensor chips, ad hoc networking algorithms, and gain power from the electromagnetic spectrum. In addition, the majority of computers will be invisible and disposable."

My experience with Gartner predictions, as with most predictions, is that the implementation glide slope is rarely as quick as predicted (often for societal drag in adoption as much as technology maturation) and that the development slope is not uniform across all technologies (some items hit snags, technical and regulatory, while others accelerate).

As long as one keeps this in mind and never forgets George Box's admonition that, "All models are false, but some models are useful" the prediction has merit. For my part I use a technology food chain analysis over time to see what items are advancing and which are bogging down (and where a "fix" is often in an unstudied, unrelated technology not under examination in the lagging segment).

In this case, the intelligent-network tipping points are said to be "the availability of smaller, cheaper sensors, as well as two new breakthrough networking technologies: ultrawideband and WiMax (802.16). Ultrawideband creates a fast wireless connection that consumes about 10^-4 the power of a cell phone, and WiMax promises 70 megabits per second across a 30-mile range."
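Those figures invite a quick back-of-the-envelope check. In this sketch, the 1 W cell-phone transmit power is an assumed nominal value, not a figure from the article:

```python
# Back-of-the-envelope arithmetic on the cited figures.
CELL_TX_POWER_W = 1.0                     # assumed nominal cell-phone transmit power
UWB_POWER_W = CELL_TX_POWER_W * 1e-4      # "10^-4 the power of a cell phone"
WIMAX_MBPS = 70                           # claimed WiMax throughput

print(f"UWB power: {UWB_POWER_W * 1e6:.0f} microwatts")  # -> 100 microwatts

# Time to move 1 GB (8 * 1024 = 8192 megabits) at the claimed WiMax rate:
seconds = 8 * 1024 / WIMAX_MBPS
print(f"1 GB over WiMax: ~{seconds:.0f} s")  # roughly two minutes
```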

While this article speaks to the fact that "[n]etworks have very long memories," creating a trailing cloud of data "that never gets deleted and gets backed up," it does not speak to the more malicious security aspects -- the ways in which terrorism and 'garden variety' espionage can exploit the network.

Following this prediction to its conclusion makes the TSA's current CAPPS II effort seem quaint by comparison. That is not to say that this particular level of data acquisition is acceptable: any nominally free society sanctions only a certain level of approval -- a willingness, perhaps, to be knowingly monitored.

What I do see continuing is that commercial firms will continue to pursue data harvesting and analysis strategies that will be in turn harvested by government. Government can or will, depending upon your point of view, then integrate its own technology food chain.

In 2015: sensors everywhere, computers invisible
By Dan Farber,
Tech Update
March 30, 2004

Gordon Housworth



Cybersecurity Public  InfoT Public  Infrastructure Defense Public  


Intelligently restoring sequestered governmental geospatial information to public access


The lessons drawn from America's Publicly Available Geospatial Information: Does It Pose a Homeland Security Risk? extend beyond the vast number of libraries removed from federal and state agencies and into documentation that is, and will be, captured in Sarbanes-Oxley compliance efforts. Federal agencies have restricted considerable, formerly publicly available geospatial information, especially that available via the net. (In the case of public utilities and power producers, a portion of their Sarbanes-Oxley compliance documentation will be part of the same body of materials removed from view.)

RAND notes that while "publicly available geospatial information on federal Web sites and in federal databases could potentially help terrorists select and locate a target, attackers are likely to need more detailed and current information -- better acquired from direct observation or other sources" such as textbooks, street maps, non-governmental web sites, and trade journals. (Remember last year's FBI warning to police with regards to possession of almanacs.)

"Fewer than 6 percent of the 629 federal geospatial information datasets examined appeared as though they could be useful to meeting a potential attacker’s information needs. Furthermore, the study found no publicly available federal geospatial datasets that might be considered critical to meeting the attacker’s information needs (i.e., those that the attacker could not perform the attack without). Additionally, most publicly accessible federal geospatial information appears unlikely to provide significant (i.e., useful and unique) information for satisfying attackers’ information needs (i.e., less than 1 percent of the 629 federal datasets examined appeared both potentially useful and unique). Moreover, since the September 11 attacks these useful and unique information sources are no longer being made public by federal agencies. In many cases, diverse alternative information sources exist. A review of nonfederal information sources suggests that identical, similar, or more useful data about critical U.S. sites are available from industry, academic institutions, nongovernmental organizations, state and local governments, foreign sources, and even private citizens."

RAND notes that an analytical, rather than wholesale, examination process needs to be instituted to identify sensitive geospatial information and offers a starting point for assessing the Homeland Security sensitivity of publicly available geospatial information by filters for usefulness, uniqueness, societal benefits, and costs.
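RAND's filters amount to a per-dataset screening pipeline. A hypothetical sketch of how an agency might apply them to an inventory -- the dataset records, field names, and scoring values are invented for illustration and are not RAND's actual instrument:

```python
def screen_dataset(ds):
    """Apply RAND-style filters to one geospatial dataset record:
    restrict only data that is useful to an attacker, unique
    (unavailable elsewhere), and whose security cost outweighs
    its societal benefit. All fields are hypothetical."""
    if not ds["useful_to_attacker"]:
        return "publish"
    if not ds["unique"]:                 # identical data exists in open sources
        return "publish"
    if ds["societal_benefit"] > ds["security_cost"]:
        return "publish"
    return "restrict"

inventory = [
    {"name": "dam_schematics", "useful_to_attacker": True,
     "unique": True, "societal_benefit": 2, "security_cost": 8},
    {"name": "topo_basemap", "useful_to_attacker": True,
     "unique": False, "societal_benefit": 9, "security_cost": 1},
]
print({d["name"]: screen_dataset(d) for d in inventory})
# -> {'dam_schematics': 'restrict', 'topo_basemap': 'publish'}
```

The point of such a pipeline is RAND's finding itself: most datasets fall out at the usefulness or uniqueness filter, so wholesale removal is rarely justified.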

For more complete information, see the larger document, Mapping the Risks: Assessing the Homeland Security Implications of Publicly Available Geospatial Information. Chapters Two and Three are of great interest:
  • What Are the Attackers' Key Information Needs?
  • What Publicly Available Geospatial Information Is Significant to Potential Attackers' Needs?
Gordon Housworth
 


Cybersecurity Public  InfoT Public  Infrastructure Defense Public  


Lisa Dean and TSA: Will opposites attract?


In what can only be called an inspired choice, the Transportation Security Administration (TSA) has appointed one of its harshest critics, Lisa Dean, as its first chief privacy officer. I say inspired as Dean's credentials as an adversary of government surveillance at the Electronic Frontier Foundation and the Free Congress Foundation leave her, at the moment, above reproach.

My question is: has TSA silenced, absorbed, or adopted this critic? While that may not have been the intent of those who hired her, bureaucracies have a way of convincing and co-opting those they envelop. What seems implausible from the outside can come to seem valuable from the inside.

I can only assume that CAPPS II, its purpose to combat international terrorism, and potential "scope creep" that draws it more deeply into the national fabric will be high on her priorities. In a 17 February, 2004 letter to the House Committee on Transportation and Infrastructure, Dean raised critical questions with respect to CAPPS II (Computer Assisted Passenger Prescreening System) and airline PNRs (Passenger Name Records):

  • What passenger information is collected, how is it shared, and with whom?
  • How long is the information retained?
  • What are the names and numbers of government contractors (Torch), data brokers, and other third parties, as well as their level of involvement in the PNR process?
  • What rights do passengers have to correct information, as they do their credit reports?
  • What rights do passengers have to view their personal data, as they do their medical records?
  • What recourse do passengers have if they believe they have been wrongly "flagged"?
  • Will CAPPS II be effective for identifying individuals who pose a threat to aviation security?
  • How much will it cost the travel industry as a whole to comply with requirements to provide TSA with data not currently collected by the agency?

Only days earlier, the GAO had noted that TSA had no effective answer for any of these questions. Dean will have to resist not only federal employees who have heretofore seen state protection as a higher calling, but commercial firms that seek to implement such data mining applications. The excesses of the JetBlue affair again highlighted the fact that tool builders and data repository owners have not previously demonstrated restraint in this respect.

The understandable intent of the contracting firms is to leverage their TSA/CAPPS investment, which will drive them to expand their tools' content. Should Dean stipulate limitations, diligent verification will be required that the applications do not exceed their brief and that neither private data nor source code migrates to inappropriate venues.

It will be interesting to follow Dean's trajectory at TSA.

See GAO's Aviation Security: Computer-Assisted Passenger Prescreening System Faces Significant Implementation Challenges

Gordon Housworth



InfoT Public  Infrastructure Defense Public  

