Archive for July, 2015

The big cybersecurity challenge: Time-to-detection

July 29, 2015

Do you sunbathe? You shouldn’t in this day of hypersensitivity about skin cancer. But if you do, the sunlight falling on your liver-spotted, lizard-like skin has been traveling through space for about nine minutes. When you gaze at the night sky and see Alpha Centauri, you probably remember from grade school that light from that nearby star system takes about 4.3 years to reach earth.

If something like a Burning Man festival were held on Alpha Centauri, you wouldn’t know about it until 4.3 years after it was over. Too late to load up your Airstream and get there in time for the fun. Most stars are so far away, they probably collapsed into black holes a billion years ago, yet all we see is merry twinkling millennium after millennium.

Not to over-dramatize, but this is how things are in cybersecurity — specifically intrusion detection. When the Office of Personnel Management was patching its systems, it discovered its great breach, months after the break-in had occurred. It might have been still more months before anyone noticed the anomaly. It reminds me of a corny roadside display in Pennsylvania when I was a kid. A sign on a little barn said, “World’s Biggest Steer Inside.” When you pulled over and peered in the window, you saw a big jagged hole in the back of the barn, a chain lying in the dirt, and another sign, “Too bad, guess he got away!” There must’ve been a gift shop or goat’s milk fudge stand nearby.

This is one of the big problems with modern-day cyber attacks. Too often, IT and security staffs only find out about them long after the damage has been done and the hackers have moved on to other soft targets. If it takes seconds or minutes to exfiltrate data, what good does discovering it do next year?

I recently spoke with John Stewart, one of the top security guys at Cisco. The topic was Cisco’s Midyear Security Report. Here’s my summary: Federal IT and security people, like everyone else, have plenty to worry about. Like the fact that a thousand new security product vendors have started up in the last five years, yet most of them sell non-interoperable software. Or that the white-hat, good-guys side of the cybersecurity equation is literally about a million qualified people short.

Yet among the most seemingly intractable problems lies time-to-detection, or how long on average it takes for organizations to find out they’ve been hacked. This makes it likely that many more successful attacks have occurred than systems administrators are aware of. Stewart says most of the data show that IT staffs routinely take months to detect breaches. A major goal of the products industry and practitioners’ skill sets must therefore be getting time-to-detection down to seconds. At this point, I’ll bet many federal agencies would be happy with days or hours.
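The metric itself is simple to state: the average gap between when an intrusion begins and when it is discovered. A minimal Python sketch, using entirely made-up incident dates for illustration (the months-long gaps echo the patterns Stewart describes):

```python
from datetime import datetime
from statistics import mean

# Hypothetical incident records: (when the intrusion began, when it was found).
incidents = [
    (datetime(2015, 1, 10), datetime(2015, 6, 2)),
    (datetime(2015, 3, 5), datetime(2015, 3, 9)),
    (datetime(2014, 11, 20), datetime(2015, 5, 15)),
]

def mean_time_to_detection_days(records):
    """Average number of days between breach and discovery."""
    return mean((found - breached).days for breached, found in records)

print(round(mean_time_to_detection_days(incidents), 1))  # roughly 107.7 days
```

Getting that number from months down to seconds means the detection has to be automated end to end; no human-in-the-loop review cycle closes a gap that fast.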

Malicious hackers aren’t standing still, the Cisco report points out. They’re switching vectors and modalities at lightning speed. They’re using wealth transfer techniques that stretch law enforcement’s ability to detect. Stewart says systems like Bitcoin and the murky avenues of the dark web don’t include or even require the typical middlemen of the surface financial-transaction world — such as banks, transfer networks and mules. He describes the bad-hacker industry using a term the government likes to use for itself: innovative.

Embedded IP domains and fungible URLs, jacking up data write-rewrite cycles to dizzying speeds, or quietly turning trusted systems into automated spies in the time it takes someone to go for coffee — that kind of thing. You might call it agility. They’re dancing circles around systems owners. The hacking community has become wickedly innovative at evading detection, Stewart says, exploiting the common systems and software everyone uses routinely.

He adds that the motivations of bad hackers have blossomed into a veritable bouquet. They go after systems for espionage, theft of money or intellectual property, terrorism, political activism, service disruption and even outright destruction. That’s a good case for the so-called risk-based approach to cybersecurity planning. If you’re a utility, disruption or destruction is more likely to be the hackers’ goal. If you’re a database of people with clearance, espionage and theft are good bets.

Answers? As cybersecurity people like to say, there is no silver bullet. Stewart says nations will have to cooperate more, tools will have to improve, people will have to get smarter. Cisco hopes to build some sort of architecture framework into which the polyglot of cyber tools can plug, reducing what he calls the friction of integration.

For now, a good strategy for everyone connected to cybersecurity is to bore in on the essential question: How soon can we know what’s going on?

Thoughts on bloated web sites, complex software

July 21, 2015

With my wife at the wheel, we swing off Route 21 in New Jersey onto E 46. The GPS in the dash of our new Subaru is guiding us to Saratoga Springs, NY for the weekend. Kitty-corner from the exit is a big billboard that reads, “WHO IS JESUS? CALL 855-FOR-TRUTH.” Nice and succinct. I admired the certitude, but didn’t try the number.

The car is filled with slightly more mystifying tech. Somewhere I read the average modern car has 200 microprocessors. How many lines of code do they run, I wonder? No matter, the car does what it’s supposed to. Anyone who ever dealt with distributor caps, points and engine timing lights appreciates the way today’s cars work.

The GPS-bluetooth-navigation complex in the dash is another matter. It’s a mishmash of hard-to-follow menus. No matter what we do, every time we turn on the car, the podcasts on my wife’s phone start up. As for navigation, no two systems I’ve ever seen work quite the same way; at least their user interfaces don’t. Voice commands can be ambiguous, and it occasionally directs you off the highway only to direct you right back on again.

This same overload is ruining many web sites, as it has many once-simple applications. No wonder people love apps, in the sense of applications designed or adapted to work easily and quickly on the small touch screens of mobile devices. Standards like Word, Outlook, iTunes and many others have become so choked with features and choices, I’ve practically given up on them. I can figure out what they do, but it’s all too much, too fussy and time-consuming to manage.

The major media sites are so choked with links — most of them for ads, sponsor content, and unrelated junk such as 24 celebrity face-lifts gone horribly wrong — that you can barely navigate them without constant, unwanted and frustrating detours.

The drive to make software more and more functional may be behind what seems to be a disturbing trend towards failures in critical systems. They’ve happened a lot lately. In fact, it happened first rather close to home. Literally a minute before going on the air one recent morning, the system that delivers scripts and audio segments failed. At Federal News Radio, we’d gone paperless for a year, reading scripts online and saving a package of printing paper every day. Talking, trying to sound calm, ad-libbing while gesticulating wildly to my producer — that’s what a software crash causes. Controlled panic. Panic, anyhow. It took the engineers an hour to fix. It turned out a buffer overflow had crashed the Active Directory on which the broadcast content environment depends for user privileges. So down it went with the ship.

It was the same day United Airlines’ passenger boarding system failed, apparently the result of lingering incompatibility from the merger with Continental. And the same day that the New York Stock Exchange famously experienced an hours-long crash, reportedly because of a network connectivity issue. Earlier in the month, a hardware-software interaction interrupted for two weeks the State Department’s globally distributed system for issuing visas.

Successive program managers for the F-35 fighter have all complained they can’t get the software development for this fussy and delicate airplane onto any sort of predictable schedule. Yet the plane is unflyable and unmaintainable without its software.

In short, two problems linger with software-controlled systems. They can be difficult to interact with. And in their complexity they produce effects even expert operators can’t foresee. I believe this is the basis for the spreading appeal of agile development. It forces teams to develop in pieces small enough that people can keep track of what is going on, and in ways that users can assimilate easily.

Complexity, or the desire to avoid it, is why people like apps on mobile devices. I confess to checking Buzzfeed on my phone when I’m bored. The content is inane, but it’s such a fast, simple app, like eating gumdrops. I recently checked out the regular Web site of Buzzfeed, and sure enough, it’s a confusing kaleidoscope. Although, an ice cream cone swaddled in Cocoa Krispies does sound good.

Archuleta departs. Now what? Some ideas

July 10, 2015

OPM director Katherine Archuleta, as I predicted three weeks ago, has resigned. Problem solved, let’s move on.

Fat chance.

In reality, Archuleta’s departure solves nothing fundamental. But she had to go, as I’m sure she understood, probably from the moment she peered over the edge and realized — long before most everyone else — the size of the abyss caused by The Data Breach. Talk about big data. As primarily a politician, Archuleta must have realized that she would eventually take the fall for the administration, which of course is ultimately responsible. That’s the way of Washington; always has been. Katherine Archuleta isn’t a horrible person, nor do we have any reason to think she didn’t have the best interests of federal employees at heart. But as President Obama’s campaign manager who secured a visible, plum job, she would get it: This goes with the territory.

And the more the White House spokesman, a sort of latter-day Ron Ziegler, pushed culpability away from the administration in the aftermath of Thursday afternoon’s revelation, the more it’s clear the White House itself knows it is somehow responsible for potentially messing up 22 million lives, compromising national security, and making the government look totally incompetent.

“There are significant challenges that are faced not just by the federal government, but by private-sector entities as well. This is a priority of the president,” the spokesman said. Yeah, well, the vulnerabilities of OPM’s systems and the Interior Department facility that houses them existed seven years ago and before that. The incoming just happened to land and explode now. Now we can presume they really, really are a priority.

So now what? It will fall to Beth Cobert, the deputy director for management at the White House, to salve the wounds.

And Obama himself ought to voice his personal concern over this. Some things that occur externally do get to presidents personally. Johnson and Nixon waded into crowds of Vietnam protesters. But more than that, some concrete things should happen:

  • The White House should convene a meeting of the CIO Council to make it clear the 30-day cyber sprint ordered by Federal CIO Tony Scott is now a year-long effort.
  • Pressure test every important system in the government. Hire the top corporate cybersecurity experts — a group populated in part by some famous formerly malevolent hackers — and have them bang away until they find all the weaknesses. Then give agency heads one working week to prove their vulnerabilities are plugged. Two-factor authentication, encryption of data at rest — for heaven’s sake do it already.
  • Hire a tiger team to install Einstein 3A in every agency by July 31st, never mind December 31st. Require the internet service providers to do whatever it takes to make their inbound traffic compatible with this system. If Einstein 3A is so good, how come it’s taken so long?
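On the two-factor point, the machinery involved is not exotic. The time-based one-time password behind most authenticator apps fits in a few lines of standard-library Python; here is a sketch of RFC 6238 (the HMAC-SHA1 variant), demonstrated with the RFC’s own published test secret rather than anything agency-specific:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32)
    t = time.time() if for_time is None else for_time
    counter = struct.pack(">Q", int(t // step))          # 64-bit time step
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                              # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: this secret at T=59 seconds -> 94287082
SECRET = base64.b32encode(b"12345678901234567890").decode()
print(totp(SECRET, for_time=59, digits=8))  # prints 94287082
```

The hard part was never the algorithm; it is rolling tokens out to millions of users and retrofitting legacy systems to demand them.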

I know what you’re saying. Yes, it does sound naive. I wasn’t born last night either. This is one of those times, though, that requires an all-out effort. For years we’ve heard warnings of a cyber 9/11. Well, we just had one.

This data loss was no third-rate burglary. Mr. President, America is under attack.

Have you visited the deep, dark Web recently?

July 2, 2015

At the end of a long cul-de-sac at the bottom of a steep hill, our house sat near a storm sewer opening that in my memory is a couple of yards or so wide. If you poked your head down the opening and looked in the right direction, you could see daylight where the culvert emptied into a sort of open catch basin. None of us ever had the nerve to slip into the drain and walk through the dark pipe to come out in the catch basin, maybe 300 yards away. But that is where, at the age of maybe 5 or 6 years old, I realized a vast network of drainage pipes existed under our street, beneath our houses. That culvert fascinated me and my friends endlessly. We’d try to peer at one another from each end, or shout to see if our voices would carry. Or, after a rain, we’d drop a paper boat down the opening and see how long it would take to flow into the catch basin.

That sewer is like the Internet. Underneath the manifest “streets” that are thoroughly used and mapped lies a vast subterranean zone with its own stores of data. Some experts say the surface or easily accessed Internet holds only 4 percent of what’s out there. Much of the out-of-view, deep Internet consists of intellectual property that people — like academics or scientists — want to keep to themselves or share only with people they choose. But other areas lie within the deep Internet where criminal and terrorist elements gather and communicate. That’s called the dark Internet. It’s also where dissidents who might be targeted by their own country communicate with one another. To people using regular browsers and search engines, this vast online zone is like a broadcast occurring at a frequency you need a special antenna to detect.

At the recent GEOINT conference, held for the first time in Washington, I heard a theme from several companies: Agencies will need to exploit the deep web and its subset dark web to keep up with these unsavory elements. The trend in geographical intelligence is mashing up multiple, non-geographic data sources with geographic data. In this and a subsequent post I’ll describe some of the work going on. In this post, I’ll describe work at two companies, one large and one small. They have in common some serious chops in GEOINT.

Mashup is the idea behind a Lockheed-Martin service called Halogen. Clients are intelligence community and Defense agencies, but it’s easy to see how many civilian agencies could benefit from it. Matt Nieland, a former Marine Corps intelligence officer and the program manager for the product, says the Halogen team, from its operations center somewhere in Lockheed, responds to requests from clients for unconventional intel. This requires data from the deep internet. It may be inaccessible to ordinary tools, but it still falls into publicly available data. Nieland draws a crude sketch in my notebook like a potato standing on end. The upper thin slice is the ordinary Internet. A tiny slice on the bottom is the dark element. The bulk of the potato represents the deep.

Halogen uses the surface, searchable Internet in the unclassified realm. Analysts ingest material like news feeds, social media, Twitter. They mix in material that is inaccessible to standard browsers and search engines, but is neither secret nor obtainable only by hacking. It does take skill with the anonymizing Tor browser and knowledge of how to find the lookup tables giving URLs that otherwise look like gibberish. Beyond that, Nieland says Lockheed has contacts with people around the world who can verify what it finds online. Halogen’s secret sauce is the proprietary algorithms and trade craft its analysts use to create intel products.
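None of that plumbing is secret, either. Reaching a hidden service is mostly a matter of talking to Tor’s local SOCKS proxy and knowing what a valid address looks like. A rough Python sketch: the proxy settings assume a Tor daemon listening on its default port, and `is_onion` only checks the v3 address shape, not whether anything actually answers there.

```python
import re

# Tor's default local SOCKS port. The "socks5h" scheme tells the client to
# resolve hostnames through the proxy, so .onion lookups happen inside Tor.
# A SOCKS-aware HTTP client (e.g. requests with the PySocks extra) would
# take this dict as its proxies= argument.
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

# Version-3 hidden-service addresses: 56 base32 characters plus ".onion".
ONION_RE = re.compile(r"^[a-z2-7]{56}\.onion$")

def is_onion(host):
    """Rough shape check for a v3 hidden-service hostname."""
    return bool(ONION_RE.match(host))

print(is_onion("a" * 56 + ".onion"))   # a plausible-looking v3 address
print(is_onion("example.com"))         # an ordinary surface-web name
```

The craft Nieland alludes to is in finding the addresses worth visiting in the first place; the transport is commodity.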

At the opposite end of the size spectrum from Lockheed Martin, OGSystems assembles teams of non-traditional, mostly West Coast companies to help federal agencies solve unusual problems in cybersecurity and intel, or problems for which they can’t find solutions among the standard federal contractors. CEO Omar Balkissoon says the company specializes in getting non-traditional people to think about traditional questions. A typical project is the Jivango community where agencies can source answers to GEOINT questions.

OGSystems calls its R&D section VIPER Labs; it crafts services, techniques and data products for national security. At GEOINT I walked to a big monitor staffed by Jessica Thomas, a data analyst and team leader at VIPER Labs. She’s working on an OSINT (open source intelligence) product for finding and stopping human traffickers and people who exploit minors. It’s a good example of mashing up non-GEO data with GEO. The product uses an ontology, developed by law enforcement and national security types, of words found on shady Web sites and postings to them that may be markers for this type of activity. Thomas pulled two weeks’ worth of posting traffic and used a geo-coding algorithm to map it to the rough locations of the IP addresses. Posters tend to be careless about how easy it is to reverse-lookup IP addresses to get a general area where a post originated. In many cases, posts included phone numbers. It wasn’t long before clusters of locations emerged, indicating a possible network of human trafficking.
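The mashup Thomas describes can be sketched in a few lines. The IP-to-coordinate table below is hard-coded and entirely invented (real work would query a GeoIP database), but the shape of the technique is the same: coarse geolocation, then bucketing into grid cells so clusters stand out.

```python
from collections import Counter

# Hypothetical IP-to-location table standing in for a real geolocation
# database. The addresses are from the TEST-NET documentation ranges;
# coordinates are rough city centroids, which is about all a reverse
# lookup reliably gives anyway.
GEOIP = {
    "203.0.113.7":  (38.90, -77.04),   # Washington, DC area
    "203.0.113.9":  (38.91, -77.02),   # Washington, DC area
    "198.51.100.4": (40.71, -74.01),   # New York area
}

def cluster_posts(post_ips, precision=0):
    """Bucket posts into coarse grid cells by rounding lat/lon.

    Cells collecting several posts hint at a geographic hot spot
    worth an analyst's closer look.
    """
    cells = Counter()
    for ip in post_ips:
        if ip in GEOIP:                  # skip unmappable addresses
            lat, lon = GEOIP[ip]
            cells[(round(lat, precision), round(lon, precision))] += 1
    return cells

posts = ["203.0.113.7", "203.0.113.9", "198.51.100.4", "203.0.113.7"]
print(cluster_posts(posts).most_common(1))  # the DC cell should lead
```

The `precision` knob trades resolution for cluster size: whole-degree cells lump a metro area together, while one or two decimal places narrow toward neighborhoods, at the cost of splitting sparse signals.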

An enthusiastic Tor user, Thomas wants to add dark Internet material to her trafficking data mashup. She also hopes to incorporate photo recognition, and sentiment analysis that can detect emotion within language found on a web site. She says OGSystems has applied for a grant to develop its trafficking detection technology into a tool useful for wildlife trafficking — a major source of funding for terror-criminal groups like al-Shabaab.

Next week, some amazing things text documents can add to GEOINT.