IT Security should be an ongoing process
While many organisations recognise the need to invest in firewalls and intrusion prevention devices to protect their networks, cost and pressure on resources mean that the procedures needed to keep these devices updated to reflect the changing threat landscape are not always put in place.
Ray Bryant, CEO of Idappcom, explains why failing to keep network security systems updated can be a serious error for any organisation.
On 8 December 2015 Microsoft released a dozen security vulnerability bulletins, eight of which it rated as Critical and two of which covered flaws already under attack. As most IT network managers will know, this is a regular, frequent and unwelcome occurrence, not just from Microsoft but from most of the leading software and hardware vendors throughout the year.
Whilst it is clearly a good thing that these companies take their responsibilities seriously and provide their users with the latest vulnerability information as soon as it becomes available, it can put a serious strain on limited IT resources when it comes to knowing how best to respond and then finding the time to apply the appropriate fixes. However, with the consequences of leaving any vulnerability unaddressed including loss of data, extensive periods of network downtime and the IT team’s time spent dealing with the breach, no organisation can afford to ignore these notifications for long.
In theory it all sounds straightforward enough: just apply the patches as soon as they are released and “Bob’s your uncle”. Back in the real world, life is not so simple. Today’s corporate IT networks are the complex beating heart of any organisation, and even relatively small businesses are likely to have tens if not hundreds of servers and workstations connected by multiple network devices to support day-to-day core operations. Running on top of this infrastructure will be a range of critical business applications, with the added complication of potentially operating on different versions of the same software.
This means that before doing anything the IT team needs to ask itself some important questions. Does the vulnerability apply to my network? If it does, how critical is it that it is fixed immediately, or can it wait for a scheduled update? And, importantly, will it mean network downtime and disruption to revenue-generating operations?
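The triage questions above can be sketched as a simple decision routine. This is an illustrative sketch only: the field names, severity labels and the ordering of the checks are assumptions for the example, not part of any vendor’s advisory format or a recommended policy.

```python
# Illustrative patch-triage sketch. All names and thresholds here are
# assumptions made for the example, not a real advisory schema.
from dataclasses import dataclass

@dataclass
class Advisory:
    affects_our_stack: bool      # does the bulletin apply to software we run?
    severity: str                # e.g. "Critical", "Important", "Moderate"
    exploited_in_wild: bool      # is the vulnerability already under attack?
    patch_needs_downtime: bool   # will applying it disrupt operations?

def triage(adv: Advisory) -> str:
    """Map the questions in the text to a recommended action."""
    if not adv.affects_our_stack:
        return "log and ignore"
    if adv.exploited_in_wild or adv.severity == "Critical":
        # High risk outweighs disruption: schedule an emergency window.
        return "patch immediately"
    if adv.patch_needs_downtime:
        return "defer to next maintenance window"
    return "patch in next routine update"

print(triage(Advisory(True, "Critical", True, True)))   # patch immediately
print(triage(Advisory(True, "Moderate", False, True)))  # defer to next maintenance window
```

In practice each branch hides a judgement call, which is exactly the point made below: the logic is easy to write down but the inputs are rarely known with certainty.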
Making the right decisions in these situations is never easy, and the difficulty is compounded by the fact that few if any organisations can afford to keep spare capacity ready to jump into action when a new vulnerability appears. Ultimately it comes down to human judgement based on a balance of risk, never a good option when the stakes are so high. Who would want to be a network security manager?
To the army of hackers relying on companies delaying the decision to apply the patches needed to update their networks, these vendor bulletins are a signal to “make hay while the sun shines”. The key to combating this threat is to implement an ongoing, proactive approach to security. This means never assuming that the network is 100% secure, and putting policies and procedures in place to ensure that security sensors such as firewalls and IPS devices are routinely assessed and updated with the latest signatures, so that they can recognise and respond to the exploits being used to target any new (and old) vulnerabilities.
New exploits are being detected almost daily as hackers play cat and mouse with the vendors, leveraging vulnerabilities to access valuable data, particularly banking and credit card information, before users have time to react. Vendors themselves can be slow to issue fixes, so even organisations that apply patches as soon as they are released may already, unknowingly, have been operating with a critical vulnerability, in some cases for several months.
To be effective in today’s dynamic and growing threat climate, security needs to be multi-layered, multi-dimensional and ongoing. Most businesses recognise the importance of installing specialist firewall and intrusion prevention devices at the network perimeter to monitor all external traffic for tell-tale signatures. For added protection, many now also place these sensors in front of critical servers and subnets to protect against internal threats, providing the defence-in-depth posture needed to protect the whole infrastructure.
However, despite the considerable investment it can involve, this alone is no guarantee of keeping the hackers out and your data secure. Beyond the sophisticated exploits and evasion techniques hackers often use, there are many factors that can compromise the effectiveness of a firewall or intrusion prevention system after its initial set-up, from simple human error to out-of-date software or an unauthorised change somewhere in the network. Network sensors are like the company car fleet: they need regular maintenance to stay fully tuned for optimum performance.
In the case of network sensors this boils down to ensuring that they are actually capable of recognising and blocking the latest exploits (it is dangerous to assume that they can and do). The only practical way of doing this is to send real-world exploit traffic, under controlled conditions, through the network and monitor how the sensors behave under a simulated attack. If they fail to recognise the exploit and allow the traffic through, it typically means the sensor needs to be updated with the latest security rules.
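The check described above can be framed as an automated audit loop: replay each stored exploit capture and flag the ones the sensor fails to block. The sketch below uses a simulated stand-in for the replay step (it reads from a fixed table); a real harness, such as the traffic-replay tools the article alludes to, would send the captured traffic and report whether it reached the target behind the sensor. The file names and function names are hypothetical.

```python
# Minimal sketch of a sensor audit, assuming exploit traffic can be replayed
# under controlled conditions. replay_pcap() is a simulated stand-in here,
# not a real capture-replay API; the capture names are invented.
SIMULATED_RESULTS = {
    "cve_2015_6125.pcap": False,    # blocked by the sensor
    "old_smb_overflow.pcap": True,  # passed through unblocked
}

def replay_pcap(path: str) -> bool:
    """Return True if the exploit traffic got through (i.e. the sensor failed)."""
    return SIMULATED_RESULTS[path]

def audit_sensor(captures):
    """Replay each capture and return those the sensor failed to block."""
    return [p for p in captures if replay_pcap(p)]

missed = audit_sensor(list(SIMULATED_RESULTS))
if missed:
    print("Sensor rules need updating; unblocked exploits:", missed)
else:
    print("All exploit captures blocked.")
```

The value of structuring the check this way is that it can be rerun routinely, which is the ongoing, proactive posture the article argues for, rather than a one-off audit.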
There are several tools and methodologies that network managers can use to assess the effectiveness of their network defences, and many organisations employ a specialist security consultant to carry out periodic penetration tests as part of a compliance audit process. However, this can be very expensive and time-consuming, and ultimately only confirms the security status of the network at a specific moment in time.
With new exploits being identified with growing frequency, waiting months between audits is not an acceptable risk; what is needed is a cost-effective way of testing the system regularly and often. At Idappcom we recognised this problem over 10 years ago and have developed a range of tools and services that enable network managers to quickly and easily assess the threat responsiveness of their sensors, either themselves or via our Cloud services, and to manage the whole process of applying the correct rules needed to help prevent the vulnerabilities from being exploited.
When incorporated into a proactive security policy, this ongoing, routine approach to monitoring the network’s threat response capabilities not only provides peace of mind that systems are operating at optimum levels at all times, but can actually save the time and money needed to firefight the consequences of a security breach or a botched response to a vendor’s vulnerability warnings.