The security market is constantly changing! A few years ago there was the “UTM” (“Unified Threat Management”) market, which offered customers all-in-one solutions (firewall, anti-virus, IDS, VPN, load-balancing, etc.). Some of them could almost make coffee! Then the “Next Generation” wave started. On top of that, all those products are promoted by vendors in ties as the killer-solution-to-make-your-CSO-life-beautiful! My goal is certainly not to destroy the image of such solutions. It’s part of the game. The security business is growing and each vendor tries to get its share of it. But I would like to make you more aware of a potential weakness.
First of all, an all-in-one solution may have positive impacts in your organization:
- A simplified architecture, easier integration within your existing infrastructure.
- Reduced set of management tools (console, log analyzer, monitoring).
- Cost reduction (less power, a simplified licensing model, less training).
- A single point of contact for support.
- Quick, automatic database updates (new threats, virus signatures, website blacklists).
Sounds great! Now let me tell you a story. Let’s imagine a customer “C” who is testing a next-generation firewall solution “F”. He tested the integrated anti-virus feature and reported that all the test viruses passed through the firewall, while his regular anti-virus caught them all. What happened? The customer had downloaded some old viruses from public repositories and replayed them. The firewall manufacturer was contacted and gave the following explanation:
“To prevent the database from growing too large we prune old signatures that are no longer considered a risk. To keep our database small (and our scanner fast) we only protect against signatures that are still “in the wild” and haven’t been addressed properly by the targeted supplier.”
Reading this comment, my first reaction was a big “WTF!” but, after a deep breath, it looked more like a best practice. Keep in mind that the tested device is primarily a firewall and:
- The number of virus signatures is constantly growing. Each virus is forked into multiple variants, which leads to multiple signatures.
- It’s a fact that signature-based detection is often weak. Even today, most anti-virus programs fail to detect viruses that use basic obfuscation techniques.
- Processing a complete database of signatures would require a lot of system resources and affect the overall throughput.
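The second point above is easy to demonstrate. Here is a minimal sketch (entirely hypothetical names and data, not any real product’s engine) of a naive byte-signature scanner and how a trivial single-byte XOR “packer” defeats it:

```python
# Hypothetical signature database: in reality these would be byte
# patterns extracted from known malware samples.
SIGNATURES = {b"EVIL_PAYLOAD"}

def scan(sample: bytes) -> bool:
    """Naive scanner: flag the sample if any known signature appears in it."""
    return any(sig in sample for sig in SIGNATURES)

def xor_obfuscate(data: bytes, key: int = 0x5A) -> bytes:
    """The simplest possible obfuscation: XOR every byte with one key."""
    return bytes(b ^ key for b in data)

original = b"...EVIL_PAYLOAD..."
obfuscated = xor_obfuscate(original)

print(scan(original))    # the plain sample matches the signature
print(scan(obfuscated))  # the same payload, XOR-ed, is not detected
```

The obfuscated file carries exactly the same malicious logic (a real dropper would un-XOR it at runtime), yet the byte pattern the scanner looks for is gone. This is why pruning a signature database is not the only, nor the biggest, weakness of signature-based detection.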
Do we have to blame the vendors in this case? IMHO, no! Their security solutions are often deployed on the front line and must strike the best balance between emerging-threat detection and performance (low latency). There is no magic recipe: either they reduce the number of security checks or they reduce the throughput. It’s more important to focus on emerging threats and to provide rapid updates. And from a customer perspective? Be careful! If vendors can’t be blamed for dropping old virus signatures, they can be blamed for promoting their solutions as “bullet-proof”.
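The pruning policy the vendor described earlier can be sketched in a few lines. Everything here is invented for illustration (the field names, the one-year threshold, the sample records); the point is only to show the shape of such a policy, not how any real engine works:

```python
# Hypothetical sketch of a signature-pruning policy: keep only
# signatures still seen "in the wild" recently, for flaws the
# targeted supplier has not yet addressed.
from datetime import date, timedelta

signatures = [
    {"name": "WormA",     "last_seen": date(2010, 1, 10), "patched": False},
    {"name": "OldVirusB", "last_seen": date(2004, 5, 1),  "patched": True},
]

def prune(sigs, today, max_age_days=365):
    """Drop signatures that are stale or whose flaw has been fixed."""
    cutoff = today - timedelta(days=max_age_days)
    return [s for s in sigs if s["last_seen"] >= cutoff and not s["patched"]]

active = prune(signatures, today=date(2010, 6, 1))
print([s["name"] for s in active])  # only the recent, unpatched worm survives
```

Notice what falls out of such a policy: “OldVirusB” is gone, so a copy downloaded from a public repository sails through — exactly what customer “C” observed.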
From my point of view, integrated solutions remain primarily “firewalls” (with great features, I admit). But keep in mind that manufacturers must make technical concessions to keep their products fast. Multiple lines of defense must remain a best practice.
In conclusion, it’s a question of risk management. What is the risk of being infected by an “old” virus? Can you trust the vendor to decide that virus “A” or “B” is no longer a threat? What makes a virus “old”? From an attacker’s point of view, what about waiting for a threat’s risk level to be reduced and then re-using it to compromise the target? It’s a question of time…