
How to Address Government Data Security

Late last week I attended a federal data security event sponsored by Neustar. What impressed me about this event was the frank admission that sensitive data will be lost – the only issue is how to minimize the vulnerabilities and mitigate the inevitable losses.

It was an intimate event with just two speakers: William Crowell, the former Deputy Director of the NSA, and Rodney Joffe, SVP and Senior Technologist at Neustar. Both highlighted vulnerabilities and suggested concrete steps, and Joffe went further, proposing a fundamentally new way to view online security.

Crowell is now a security consultant, and he mentioned that he was part of the team at NSA that worked, way back when, on decrypting the Venona cables. (As a former PoliSci major, I found that pretty cool.) He told the group that if government supervisors took data security as seriously as they take physical security, there would be fewer breaches.

According to Crowell, intelligence has never won a war, but intelligence allows soldiers to win wars. He identified the combination of social engineering and fast-advancing technological capabilities as an “unholy alliance” behind fully half of all advanced persistent online attacks directed against the U.S. government.

He outlined the following steps to fight back:

  • Stop talking about ID management and start doing it; no identity system with cryptographic credentials is currently deployed across the U.S. government
  • Build gateways and firewalls that can handle hundreds of distinct rules per packet; current tools handle about 25
  • Develop anomaly models for online behavior, just as we have in the physical world (a toy sketch follows this list)
  • Refocus FISMA on real-time security; its requirements have become a static checklist
  • Educate users, a step that is very often overlooked
  • Move to the cloud; government needs to stop arguing about the benefits, which are manifest, and focus on securing it
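
Crowell didn’t show any code, but the anomaly-model idea is easy to illustrate. Below is a toy sketch of my own, not anything presented at the event: flag a login that falls far outside a user’s historical pattern using a simple z-score. A real system would model many more signals (location, device, data volumes), but the principle is the same.

```python
# Toy behavioral anomaly detector: flag logins whose hour deviates
# sharply from a user's historical login hours (simple z-score test).
from statistics import mean, stdev

def is_anomalous(history_hours: list[int], login_hour: int,
                 threshold: float = 3.0) -> bool:
    """Return True if login_hour is far outside the user's usual hours."""
    mu, sigma = mean(history_hours), stdev(history_hours)
    if sigma == 0:                      # user logs in at exactly one hour
        return login_hour != mu
    return abs(login_hour - mu) / sigma > threshold

# A user who normally logs in during the workday suddenly appears at 3am.
usual_hours = [9, 10, 8, 11, 12, 9, 11, 10, 10, 9]
print(is_anomalous(usual_hours, 3))     # True: worth a second look
```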

Joffe opened his address by stating that the current tools for data security aren’t enough. As an example of the level of today’s threats, he pointed to the takedown, that very day, of a two-million-strong botnet by the DoJ and the FBI. He told the audience that he’s learned a lot by “being a target since 2002,” through his founding of UltraDNS, the leading managed DNS provider (purchased by Neustar in 2006).

Joffe preaches mitigation, since 100 percent prevention is an illusion. He encouraged the audience to engage in a premortem when considering data security. This approach assumes failure, then looks for evidence that can lead back to specific areas of weakness. It’s a fundamentally different way to visualize security — here’s a Harvard Business Review article with more detail on the premortem methodology.

I’ve known Rodney for many years, and he’s very good at explaining technology with analogies. In describing why planning for failure in data protection is necessary and not at all defeatist, he used the example of the modern conference room where we all were sitting. The building employed the latest in safety construction, right down to fire retardant materials in the furniture. Yet there was still a sprinkler system overhead.

Why, he asked the audience? Because fires still happen, despite taking all the proper steps to prevent them. It’s the same with data security.

Here’s Rodney’s to-do list for feds looking to protect their data:

  • Continue with all the current best practices and the layered approach to security — firewalls, the latest anti-virus programs, IDS/IPS
  • Deploy failure sensors and plan for losses; this is where existing solutions fall down (a toy sketch of such a sensor follows this list)
  • Work backwards from failures by examining the artifacts breaches leave behind; Joffe said he will have more to say publicly on how this can be done in a few weeks
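
To make the failure-sensor idea concrete, here is a toy sketch of my own; it is not Joffe’s forthcoming approach, and the log format, file name and indicator list are hypothetical placeholders. The point is the mindset: assume a breach will happen and watch for its artifacts, modeled here as outbound contact with known-bad hosts.

```python
# Toy "failure sensor": assume compromise and scan outbound-traffic logs
# for breach artifacts (contact with known-bad hosts). All names below
# are hypothetical placeholders.
KNOWN_BAD = {"203.0.113.7", "badhost.example"}   # stand-in indicator list

def scan_log(path: str) -> list[str]:
    """Return log lines whose destination field matches a known-bad indicator."""
    hits = []
    with open(path) as log:
        for line in log:
            fields = line.split()        # assumed format: timestamp src dest ...
            if len(fields) >= 3 and fields[2] in KNOWN_BAD:
                hits.append(line.rstrip())
    return hits

for hit in scan_log("outbound.log"):     # hypothetical log file
    print("possible breach artifact:", hit)
```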

In today’s online threat climate, focusing only on perimeter defense is like the French relying on the Maginot Line in 1940. I’m looking forward to seeing how the federal market reacts to this new way of conceptualizing data security.

April 19, 2011 at 7:18 am

The Dark Internet

I consult on communication issues for Neustar, an Internet infrastructure company. Neustar works behind the scenes to ensure the smooth operation of many critical systems like DNS, the .us and .biz domain extensions, local number portability and digital rights management.

One of the cool things about working for them is the chance to attend the events they sponsor. Last week Neustar sponsored a security briefing for senior federal IT personnel focused on cybersecurity and Domain Name System Security Extensions (DNSSEC). The speakers were Rodney Joffe, SVP and Senior Technologist at Neustar; Merike Kaeo, founder of Double Shot Security and a prominent security expert; and Edward Lewis, a Director at Neustar and author of numerous RFCs dealing with DNS and DNSSEC.

What they all described was very sobering. Bottom line: fundamental protocols of the Internet were not designed to be secure, and there is only so much anyone can do to protect themselves.

There’s no way I can cover in this post all the material they presented; I’m just not that good a note taker. But I can share how they framed the escalating security threats.

Merike led off the presentations. She grouped threats into four categories: Protocol Errors, Software Bugs, Active Attacks and Configuration Mistakes. Here’s how she charted the evolution of online threats:

In the Past – Deliberate malware was rare, bugs were just bugs, mitigation was trial by fire and the regulatory structure did not exist.

Today – Highly organized criminals are designing specific malware, bugs are now avenues for attack, mitigation is understood but deployment issues remain, and regulations struggle to assess the reach and impact of cybercrime, though global coordination is much better.

She also shared some interesting insights into the cyber attacks on Estonia in the spring of 2007. Merike is Estonian and was in the country at the time. She described how cyber-literate the population is, and how Estonia fended off the attacks far better than media reports indicated.

Rodney titled his presentation “Black Swans and Other Phish,” a reference to the Nassim Taleb theory, not the new Natalie Portman movie. His overall message: the miscreant of the distant hacking past became the spammer of yesterday, and the spammer became the hardcore online criminal of today, hired by organized crime and nation-states alike.

Some other points that stood out for me:

  • DDoS attacks first arose to attack anti-spam efforts
  • Malware specifically designed to steal personal information and credentials appeared around 2005
  • In 2007 nation states got into the dark game

In an effective demonstration, Rodney brought up a fake FBI website by typing in an IP address corresponding to www.fbi.gov. The cache had been poisoned, and that morning a fake website was announcing to the world that it was the real site of the FBI. Many in the room were clearly surprised by how easy it is to poison the cache for such a high-profile government site.
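
You can run a crude version of this check yourself. Here is a minimal sketch, assuming the dnspython package (2.x): ask two resolvers for the same name and compare the answers. Divergence doesn’t prove poisoning, but it is a cheap first-pass test. The 192.0.2.53 address is a placeholder for your own resolver.

```python
# Cross-check a local resolver's answer against an independent resolver.
# Requires dnspython 2.x: pip install dnspython
import dns.resolver

def a_records(nameserver: str, name: str) -> set[str]:
    """Ask one specific nameserver for a name's A records."""
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [nameserver]
    return {rr.address for rr in resolver.resolve(name, "A")}

local = a_records("192.0.2.53", "www.fbi.gov")    # placeholder: your resolver
public = a_records("8.8.8.8", "www.fbi.gov")      # an independent public resolver

if local != public:
    print("Answers diverge; the local cache may be poisoned:", local, "vs", public)
```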

Rodney also talked about the need for better information sharing between government and private networks. (Actually, he said government shares nothing, so anything would be an improvement.) Neustar will soon launch a new service offering agencies full visibility OUTSIDE their networks, with analysis based on actual packet inspection, not just sampling. This gives them a dashboard so they can monitor, understand and then (hopefully) mitigate.

There was no mistaking Ed as the engineer of the group, in his jeans and flannel shirt. He is also one of the world’s foremost experts on DNSSEC, and he feels consensus is finally forming around a critical point: the cost of implementing DNSSEC pales in comparison to the cost of not implementing it.

The biggest challenge of DNSSEC is not the signing but the key management. The more or less final version of DNSSEC has been ready since 2004, and it got a huge visibility boost from Dan Kaminsky’s revelations about DNS vulnerabilities in the summer of 2008. That same year, OMB mandated DNSSEC for the .gov domain.
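
For the curious, here is what DNSSEC looks like from the client side, again as a minimal sketch assuming dnspython; the resolver address and domain below are placeholders. Setting the DO bit requests DNSSEC processing, and the AD (Authenticated Data) flag in the response means the resolver validated the signature chain.

```python
# Check whether a validating resolver vouches for an answer's DNSSEC chain.
# Requires dnspython 2.x: pip install dnspython
import dns.flags
import dns.resolver

resolver = dns.resolver.Resolver(configure=False)
resolver.nameservers = ["8.8.8.8"]          # placeholder: any validating resolver
resolver.use_edns(0, dns.flags.DO, 4096)    # DO bit: request DNSSEC processing

answer = resolver.resolve("example.gov", "A")   # hypothetical signed zone
if answer.response.flags & dns.flags.AD:
    print("Resolver validated the DNSSEC signature chain for this answer")
else:
    print("No validation: the zone is unsigned or the resolver does not validate")
```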

Ed sees that as a good first step, although it doesn’t address the security of others caching .gov IPs. There’s still a lot of work to be done, but Ed is a lot more confident than he used to be. First, because of the cost question mentioned above. Second, because the security problem is real. Finally, because there is no better solution to the problem.

He also cautioned the government audience to focus on the right end goal. The goal is a secure DNS, not a deployment to meet a mandate.

I left the briefing a lot smarter on this topic, and a lot more worried. There seems to be more official recognition of online dangers; one of the presenters noted that Janet Napolitano has announced she wants to hire 1,000 cybersecurity professionals over the next three years.

But it was also mentioned the Chinese government is training 10,000-20,000 cybersecurity students per year in their national defense universities. The land where the Internet was invented is starting from behind in this race. We’d better start sprinting!

January 25, 2011 at 8:53 am

Counting the WikiLeaks Story Lines

With every new day the WikiLeaks saga seems to reveal another story line. As the expression goes, “you can’t make this stuff up.” In the process, it’s highlighting rapidly changing — and often unstable — elements in Internet-based communication.

First let’s take the emerging model of cyber attacks. People no longer just get angry at home or in bars; they attack those they disagree with online. First, supporters of WikiLeaks were subjected to Distributed Denial of Service (DDoS) attacks; then companies and organizations deemed hostile to WikiLeaks were targeted. Network World has a good roundup.

Many have questioned the obligations of cloud computing companies in the wake of Amazon dumping WikiLeaks as a customer, under pressure from the U.S. government. Here’s the New York Times on that angle, and here’s the view from a cloud computing blog focused on the federal market.

John Battelle wonders if Google should “mirror” WikiLeaks, ensuring it could continue disseminating information. Personally, I believe strongly in transparency, and that sunlight is the best disinfectant. Too much secrecy begets incompetence, corruption and tyranny.

But does it trump all other considerations? Would this be a PR masterstroke by Google, or a disaster? A provocative question to be sure, but isn’t Google indexing the files enough? The comments on Battelle’s post are almost universally against the idea.

Finally, what happens when governments decide to block or divert Internet traffic? This isn’t directly related to WikiLeaks, but last week a congressionally chartered commission concluded that in April of this year a Chinese state-owned company deliberately diverted roughly 15% of the world’s Internet traffic. The reasons are unknown, and China denies the accusation. The PBS NewsHour aired a great discussion of this danger.

This story looks to keep on giving, with WikiLeaks founder Julian Assange threatening to unleash a “poison pill” if he is imprisoned or killed. The file supposedly contains a huge amount of leaked information, totally unredacted and secured with 256-bit encryption. He is currently in prison in Britain, fighting extradition to Sweden, where he is wanted for questioning in a sex-crimes investigation from August of this year. Like I said, you can’t make this stuff up.

I can’t wait for the next chapter. But along with the drama, I hope there will be some positive benefits. Maybe WikiLeaks will lead to more awareness around how the Internet can be abused, and how it needs to be made more secure.

December 9, 2010 at 9:05 am

