Security Vendors Question Accuracy of AV Tests

Antivirus software is frequently tested for performance, so picking a top product should be straightforward: Select the number-one vendor whose software kills off all of the evil things circulating on the Internet. You're good to go then, right? Not necessarily.

The increasing complexity of security software is causing vendors to gripe that current evaluations do not adequately test other technologies in the products designed to protect machines.

Relations between vendors and testing organizations are generally cordial but occasionally tense when a product fails a test. Representatives in both camps agree that the testing regimes need to be overhauled to give consumers a more accurate view of how different products compare.

"I don't think anyone believes the tests as they are run now ... are an accurate reflection of how one product relates to the other," said Mark Kennedy, an antivirus engineer with Symantec, based in the US.

Representatives of Symantec, F-Secure and Panda Software agreed at the International Antivirus Testing Workshop in Reykjavik, Iceland, to design a new testing plan that would better reflect the capabilities of competing products. They hope all security vendors will agree on a new test that can be applied industry-wide, Kennedy said.

A preliminary plan should be drawn up by September 2007, Kennedy said.

One of the most common tests involves running a set of malicious software samples through a product's antivirus engine. The antivirus engine contains indicators, called signatures, that enable it to identify harmful software.
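
As a rough illustration of what signature matching means, here is a minimal sketch in Python. The signature names, byte patterns and sample directory are invented for the example; real engines use far more sophisticated pattern matching, unpacking and heuristics than a plain substring search.

```python
# Minimal sketch of signature-based detection (illustrative only).
# The signature database and the "samples" directory are hypothetical.
from pathlib import Path

# Hypothetical signature database: name -> byte pattern known to appear in the malware.
SIGNATURES = {
    "Example.Worm.A": b"\xeb\xfe\x90\x90example-worm-marker",
    "Example.Trojan.B": b"this-program-cannot-be-run-in-evil-mode",
}

def scan_file(path: Path) -> list[str]:
    """Return the names of any signatures whose byte pattern appears in the file."""
    data = path.read_bytes()
    return [name for name, pattern in SIGNATURES.items() if pattern in data]

if __name__ == "__main__":
    for sample in Path("samples").glob("*"):   # hypothetical directory of test samples
        if not sample.is_file():
            continue
        hits = scan_file(sample)
        verdict = ", ".join(hits) if hits else "clean"
        print(f"{sample.name}: {verdict}")
```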

But antivirus products have changed over the past couple of years, and "now many products have other ways of detecting and blocking malware", said Toralv Dirron, security lead system engineer for McAfee.

Signature-based detection is important, but an explosion in the number of unique malicious software programs created by hackers is threatening its effectiveness. As a result, vendors have added overlapping defences to catch malware.

Vendors are employing behavioural detection technology, which may identify a malicious program if it undertakes a suspicious action on a machine. A user may unwittingly download a malicious software program that is not detected through signatures. But if the program starts sending spam, the activity can be identified and halted.

Also, a program can be halted if it tries to exploit a buffer overflow vulnerability, in which a memory-handling error can allow malicious code to run. Host-based intrusion-prevention systems, which can employ firewalls and packet-inspection techniques, can also stop attacks.
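
The behavioural idea can be sketched in a few lines. The hypothetical rule below flags any process holding an unusually large number of outbound SMTP connections, a crude stand-in for the "program starts sending spam" scenario described above; the threshold and the report-only response are assumptions for the example, not how any shipping product behaves.

```python
# Sketch of a behavioural rule: flag processes with many outbound SMTP connections.
# Requires the third-party psutil package (pip install psutil); on some systems
# enumerating other processes' connections needs elevated privileges.
import psutil

SMTP_PORT = 25
THRESHOLD = 20  # hypothetical cut-off; a normal mail client rarely holds this many

def suspicious_spammers() -> dict[int, int]:
    """Map pid -> count of outbound SMTP connections, for pids at or above the threshold."""
    counts: dict[int, int] = {}
    for conn in psutil.net_connections(kind="inet"):
        if conn.raddr and conn.raddr.port == SMTP_PORT and conn.pid:
            counts[conn.pid] = counts.get(conn.pid, 0) + 1
    return {pid: n for pid, n in counts.items() if n >= THRESHOLD}

if __name__ == "__main__":
    for pid, n in suspicious_spammers().items():
        name = psutil.Process(pid).name()
        print(f"Suspicious: {name} (pid {pid}) has {n} outbound SMTP connections")
```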

The ways in which a computer can be infected also make comprehensive testing complex. For example, users may infect their computers by opening malicious e-mail attachments or visiting harmful Web sites designed to exploit known vulnerabilities in a Web browser.

The different modes of attack also involve different defences, all of which would need to be tested to arrive at an accurate ranking, analysts said.

By contrast, signature-based tests can take as little as five minutes. "This is a very basic test," said Andreas Marx of AV-Test.org, who wrote his master's thesis on antivirus testing. "It's easy, and it's cheap."

Other concerns remain over the sample sets of malicious software used in tests: the age of the samples and the diminishing threat older samples pose on the Internet. Security vendors also think tests should check how well security applications remove bad programs, a process that can affect a computer's performance.

For vendors, a failed test can be embarrassing, since the testing companies often issue news releases highlighting the latest results.

Testing companies make money in various ways. AV-Test.org is often commissioned by technology magazines such as PC World (a magazine owned by IDG). Virus Bulletin licenses its logo to companies for use in promotional material and publishes a monthly online magazine.

Earlier this month, Virus Bulletin announced that its latest round of testing produced some "big-name failures", including products from Kaspersky Lab and Grisoft SRO.

The company's VB100 test runs antivirus engines against malware samples collected by the Wildlist Organization International, a group of security researchers who collect and study malware. To pass the VB100, a product must detect every sample.
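
That pass criterion is simple to express in code. The sketch below is not the actual VB100 harness, only a minimal illustration of the "detect every sample or fail" rule; the scanner function and sample directory are placeholders.

```python
# Illustrative pass/fail scoring in the style described above: one miss means failure.
from pathlib import Path
from typing import Callable

def all_or_nothing_verdict(scan: Callable[[Path], bool],
                           samples: list[Path]) -> tuple[bool, list[Path]]:
    """Return (passed, missed_samples); the product passes only if nothing is missed."""
    missed = [s for s in samples if not scan(s)]
    return (len(missed) == 0, missed)

def always_detects(path: Path) -> bool:
    """Stand-in for a real scanner; replace with a call into the product under test."""
    return True

if __name__ == "__main__":
    samples = sorted(p for p in Path("wildlist_samples").glob("*") if p.is_file())  # hypothetical
    passed, missed = all_or_nothing_verdict(always_detects, samples)
    print("PASS" if passed else f"FAIL: missed {len(missed)} of {len(samples)} samples")
```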

Kaspersky had briefly removed a signature for a worm from its product for "optimization" purposes on the day of the test, wrote Roel Schouwenberg, senior research engineer for Kaspersky, in an e-mail. The signature has since been restored, he said.

"Obviously, we would have rather passed than failed," Schouwenberg wrote. "Had the test been conducted a day earlier or a day later, we would have passed."

F-Secure also initially failed its test because of a technicality, but the failing grade was later reversed. After testing, all vendors are told which samples they failed to detect, so most end up adding signatures to their products.

So what should a user do? John Hawes, a technical consultant for Virus Bulletin, cautioned that the signature-based tests are "not enormously representative of the way things are in the real world".

But Hawes also noted that signature-based tests can indicate the reliability and consistency of a vendor's software. Virus Bulletin also reviews AV suites, taking into account aspects such as usability, which may matter as much to consumers as detection. The company is developing more advanced tests that will cover newer security technologies.

AV-Test.org is already performing more comprehensive tests, although it uses between 30 and 50 malware samples, a much smaller set than the Wildlist, which uses more than 600,000 samples, Marx said. Those tests may give a better indication of how a security software suite performs.

At a bare minimum, though, users should install some security software, as computers without it face high risks, Marx said. Several free suites are available that may be fine for light Internet use, he said.

Ironically, Marx doesn't use any antivirus software. That's because AV-Test.org collects malware for its testing, most of which comes through e-mail from other researchers. "I'm getting about 1000 viruses a day," he said. "It [antivirus software] would be counterproductive."
