Geeks for your information
AV-Comparatives - False Alarm Test March 2024 - Printable Version

AV-Comparatives - False Alarm Test March 2024 - harlan4096 - 17 April 24

Quote: AV-Comparatives - False Alarm Test March 2024:

Introduction

In AV testing, it is important to measure not only detection capabilities but also reliability. One aspect of reliability is the ability to recognize clean files as such and not to produce false alarms (false positives). No product is immune to false positives (FPs), but some produce more than others. False-positive tests measure which programs do best in this respect, i.e. which best distinguish clean files from malicious files, regardless of their context.

There is no complete collection of all legitimate files in existence, so no "ultimate" FP test can be done. What can reasonably be done is to create and use an independently collected set of clean files. If, on such a set, one product has e.g. 15 FPs and another only 2, the first product is likely more prone to FPs than the second. This does not mean the product with 2 FPs has only 2 FPs globally; it is the relative number that matters.

In our view, antivirus products should not generate false alarms on any clean files, irrespective of the number of users affected. While some antivirus vendors may downplay the risk of false alarms and exaggerate the risk of malware, we do not base product ratings solely on the supposed prevalence of false alarms. We tolerate a certain number of false alarms (currently 10) within our clean set before penalizing scores. Products that yield a higher number of false alarms are more likely to trigger false alarms on more prevalent files or in other sets of clean files.

The prevalence data we provide for clean files is purely informational. The listed prevalence may vary within the report, depending on factors such as which file/version triggered the false alarm or how many files of the same kind were affected.

There can be disparities in the number of false positives produced by two different programs that use the same detection engine. For instance, Vendor A may license its detection engine to Vendor B, yet Vendor A's product may exhibit more or fewer false positives than Vendor B's product. Such discrepancies can stem from various factors, including differences in internal settings; additional or differing secondary engines, signatures, whitelist databases, cloud services, or quality assurance; and potential delays in making signatures available to third-party products.
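As a rough illustration of the tolerance rule described above, the following Python sketch applies a hypothetical "excess over threshold" penalty. The threshold of 10 is taken from the text; the function name, the penalty model, and the sample FP counts are invented for illustration and are not AV-Comparatives' actual scoring code.

Code:
# Hypothetical sketch of the false-alarm tolerance rule: up to 10 FPs
# in the clean set are tolerated before a product's score is penalized.
FP_TOLERANCE = 10  # threshold cited in the report

def fp_penalty(false_positives: int) -> int:
    """Return how many FPs count against a product (assumed model)."""
    return max(0, false_positives - FP_TOLERANCE)

# Example FP counts mirroring the 15-vs-2 comparison in the text.
for name, fps in {"Product 1": 15, "Product 2": 2}.items():
    excess = fp_penalty(fps)
    status = "penalized" if excess else "within tolerance"
    print(f"{name}: {fps} FPs -> {status} (excess: {excess})")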

Tested Products
 
Test Procedure

To give users more information about the false alarms, we try to rate the prevalence of each false alarm. Files that are digitally signed are considered more important; because of this, a file at the lowest prevalence level (Level 1) with a valid digital signature is upgraded to the next level (e.g. prevalence "Level 2"). Extinct files that, according to several telemetry sources, had zero prevalence were provided to the vendors so they can fix them, but were removed from the set and not counted as false alarms.
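The two rules in this procedure, the signature-based prevalence upgrade and the exclusion of extinct files, can be pictured with a short Python sketch. The data model and field names are assumptions made for illustration; the excerpt states only the Level 1 to Level 2 upgrade for signed files, so the sketch implements exactly that.

Code:
from dataclasses import dataclass

@dataclass
class CleanFile:
    name: str
    prevalence_level: int   # 1 = lowest prevalence (assumed numeric scale)
    digitally_signed: bool  # file carries a valid digital signature
    extinct: bool           # zero prevalence per several telemetry sources

def effective_level(f: CleanFile) -> int:
    """Signed files are considered more important: a Level 1 file with a
    valid signature is upgraded to Level 2, as stated in the text."""
    if f.digitally_signed and f.prevalence_level == 1:
        return 2
    return f.prevalence_level

def count_false_alarms(flagged: list[CleanFile]) -> int:
    """Extinct files are reported to vendors but removed from the clean
    set, so they are not counted as false alarms."""
    return sum(1 for f in flagged if not f.extinct)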
...
Full Report