Finding security threats in the noise

Feb. 25, 2013 - 03:34PM
By ZACHARY FRYER-BIGGS

SAN FRANCISCO — For years, company executives have clamored for higher and higher security “fences” to keep cyber attackers out. That model has failed, mainly because companies need to give employees Internet access to maintain reasonable productivity.

But companies including McAfee are now turning to a biological model to improve security. It’s an approach that is becoming increasingly popular, and is front and center at this year’s RSA Conference.

“Everybody likely is owned, and instead of trying to block everything, which is a two-decades-old paradigm, we try to function like an immune system,” said Phyllis Schneck, chief technology officer for the global public sector at McAfee. Schneck will be a panelist for several featured events at the conference.

The immune system comparison suggests that viruses will inevitably enter a system, so the best approach is to work within the body, hunting for anomalies.

Getting that message through to customers isn’t always easy, she said, but the company is emphasizing the need to approach security as a filtering process rather than a blockade.

“They don’t always understand,” she said. “We pride ourselves on trying really hard to be a partner and not just a vendor. Part of that is educating, not selling.”

For McAfee, which calls its approaches Security Connected and Global Threat Intelligence, the emphasis is on cutting through the totality of data streaming through a network to find possible threats.

Computers aren’t always the best at finding anomalies, a shortcoming that has left quite a few vulnerabilities open in systems. Nor have they always been able to handle the vast amounts of data that routinely travel across networks.
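
What that kind of filtering can look like in miniature: compare each host’s current traffic against its own historical baseline and flag sharp deviations for review. The sketch below is purely illustrative, with invented hosts, numbers and thresholds; it is not McAfee’s Global Threat Intelligence pipeline.

import statistics

# Hypothetical per-host byte counts from a baseline window. Real
# telemetry would come from flow logs or network sensors; these
# numbers are invented for illustration.
BASELINE = {
    "10.0.0.5": [1200, 1350, 1100, 1280, 1225],
    "10.0.0.9": [400, 420, 390, 415, 405],
}

def flag_anomalies(observed, z_threshold=3.0):
    """Flag hosts whose current traffic deviates sharply from their history."""
    flagged = []
    for host, current in observed.items():
        history = BASELINE.get(host)
        if not history:
            continue  # no baseline yet; a real system would start building one
        mean = statistics.mean(history)
        stdev = statistics.stdev(history) or 1.0  # guard against zero spread
        z_score = (current - mean) / stdev
        if abs(z_score) > z_threshold:
            flagged.append((host, current, round(z_score, 1)))
    return flagged

# The sudden spike from 10.0.0.9 stands out against its own history.
print(flag_anomalies({"10.0.0.5": 1240, "10.0.0.9": 9800}))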

But Schneck sees advances in technology as allowing for the processing of far larger quantities of data.

“We’re at a point like no point in history where we actually have the processing power to look at something behaviorally and determine whether it’s an application that should be permitted, even on mobile devices,” she said.
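
Her point about behavioral checks can be pictured as a simple scoring policy: tally risk weights for the behaviors an application is observed performing, and permit it only if the total stays under a cap. The behavior names, weights and threshold below are hypothetical, a sketch of the general shape rather than any vendor’s actual model.

# Hypothetical risk weights for observed application behaviors; a real
# product would derive these from large-scale telemetry, not a hand-made table.
RISK_WEIGHTS = {
    "reads_contacts": 2,
    "sends_sms": 3,
    "connects_unknown_host": 4,
    "requests_root": 5,
}

def permit_application(observed_behaviors, max_score=5):
    """Permit an app only if its summed behavioral risk stays under a cap."""
    score = sum(RISK_WEIGHTS.get(b, 1) for b in observed_behaviors)
    return score <= max_score, score

allowed, score = permit_application(["reads_contacts", "connects_unknown_host"])
print(f"permitted={allowed}, risk score={score}")  # permitted=False, risk score=6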

Once that data has been sifted and potential issues are uncovered, companies still need “big brains” to determine whether anomalies are indeed threats.

Schneck compared the process to that of weather forecasting. Forecasting has improved radically in the last several decades, as guesses about temperature and precipitation are now backed by supercomputers that process tremendous quantities of meteorological data.

But the National Weather Service (NWS) still has experts review and modify the computer predictions. It turns out that computers, while good at processing data, aren’t the best at detecting patterns and are even worse at understanding anomalies.

For the NWS, the forecasts that are modified by experts show 25 percent greater accuracy in predicting precipitation than the computer models alone, and 10 percent greater accuracy in regard to temperature.

(For more on this, pick up a copy of Nate Silver’s new book, “The Signal and the Noise,” which details this improvement and provides the accuracy numbers quoted above.)

Schneck, who worked with weather data while in graduate school, said that focusing on just the right, meaningful part of the data is aided by computers but still requires a human eye.

“There’s a lot of noise on radar with weather maps,” she said. “But if you’re able to focus the map, to eliminate some of that noise, you’ll notice some very interesting things.”
