Raising the Bar On Detecting And Preventing Malicious Code Attacks

Idappcom : 22 June, 2011  (Special Report)
Ray Bryant of Idappcom explains why hybridised malicious code attacks must be prevented, and how, at a time when cyber criminals are becoming increasingly sophisticated

In today's fast-changing world of security threats, the need to raise the security bar - by enhancing an IT platform's ability to detect and prevent malicious code from 'breaking through' the network perimeter - has never been greater.


But how do you tackle the process in a modern IT department? Idappcom's CEO Ray Bryant provides some thoughts:


In November 1988, the world of computing was changed forever by the world's first worm - the Morris worm - which disrupted around 10 per cent of the computers connected to the Internet of the day.


Fast forward more than two decades to the present day and we have a malware landscape that has altered immeasurably. Arguably more has happened in the last couple of years than in those 20 years, with Web 2.0 security threats, social networking attacks and all manner of attack vectors becoming an everyday occurrence.


Against this backdrop there is a clear and present need to defend an organisation's IT platform as never before. And this can only be achieved by raising the security bar.


Although the task may appear daunting at first sight, breaking the enhancement process down into a series of stages makes it a lot more manageable for the IT department.


Virtually all network traffic these days is TCP/IP-based, and, as a result, conventional threat signature analysis can identify a significant proportion of malware, phishing attacks and even the latest evasion techniques and hybridised attack vectors.
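The kind of conventional signature analysis described above can be sketched as simple pattern matching over packet payloads. The following is a minimal, illustrative sketch; the signature names and patterns are invented for demonstration and are not real threat signatures.

```python
# Minimal sketch of payload signature matching, in the spirit of a
# conventional IDS engine. Patterns are illustrative inventions.
import re

SIGNATURES = {
    # A long run of NOP (0x90) bytes, typical of a shellcode NOP sled.
    "shellcode-nop-sled": re.compile(b"\x90{16,}"),
    # An encoded PowerShell invocation, case-insensitive.
    "suspicious-powershell": re.compile(rb"powershell.+-enc", re.I),
}

def match_signatures(payload: bytes) -> list[str]:
    """Return the names of all signatures that fire on a payload."""
    return [name for name, pat in SIGNATURES.items() if pat.search(payload)]
```

A real engine would of course reassemble TCP streams and normalise protocols before matching, which is exactly where the evasion techniques mentioned above try to slip through.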


By hybridised, we mean that a cybercriminal is using more than one attack methodology to achieve their aim. They may, for example, use a highly attractive information feed, offer or video file to persuade users to 'click through' and infect themselves.


All of these attacking emails and programs, however, have some nefarious purpose at their heart: to bring down systems, to steal money, credentials or information, or even to extort money by threatening to bring down systems.


There is no such thing as a perfectly secure structure; what can be done, however, is to ensure that each stage of inspection is working at its optimum level of protection. Traffic entering the network has to be passed through firewalls as well as intrusion prevention and detection devices. Data entering the desktop has to be checked there to prevent malicious code being launched by the user. Patch management is essential to ensure weaknesses in applications cannot be exploited, and vulnerability scans have their place in identifying weaknesses in what can be vast networks of thousands of desktops.
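The layered inspection described above can be pictured as a pipeline of stages, each of which must pass the traffic before it reaches the user. This is a toy sketch under stated assumptions: the stage names and the checks inside them are hypothetical placeholders, not any real product's logic.

```python
# Illustrative sketch of staged traffic inspection: data reaches the
# desktop only if every stage passes it. All checks are placeholders.
from typing import Callable

Stage = Callable[[bytes], bool]   # True = pass, False = reject

def firewall(payload: bytes) -> bool:
    return not payload.startswith(b"BLOCKED")   # placeholder ACL check

def ids_inspect(payload: bytes) -> bool:
    return b"exploit" not in payload            # placeholder signature check

def desktop_av(payload: bytes) -> bool:
    return b"malware" not in payload            # placeholder AV scan

PIPELINE: list[Stage] = [firewall, ids_inspect, desktop_av]

def inspect(payload: bytes) -> bool:
    """Traffic is accepted only if every stage in the pipeline passes it."""
    return all(stage(payload) for stage in PIPELINE)
```

The point of the structure is the one made above: each stage can be tuned and audited independently, and a failure at any one of them stops the traffic.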


Virus signatures on desktop anti-virus applications build on the static digital signature analysis seen in the late 1980s and 1990s. Signature updates are now a daily, even hourly, occurrence. Advances in these applications, including behavioural and heuristic techniques, have been necessary to counter the ever-increasing variation in how malicious code is delivered. Ensuring all desktops and mobile devices are updated with new software and signature releases is a habit that IT has to instil.


Malicious code hidden in apparently legitimate files - encrypted in transmission and launched when the file is opened - is virtually impossible to detect, as Sony and RSA have recently experienced. An email with an attached PDF or spreadsheet that looks legitimate can launch code as it is opened. Applications have to be patched to prevent this happening, which necessitates constant updates to MS Windows, Office, Java, Adobe and many other applications. Making sure every device that connects to the network is patched is a major task. Vulnerability scanners can ease the task, but often just add to it by flagging vulnerabilities on such a scale that the data is impossible to manage.


It is better to stop an attack at its entry point than to look for where it may finish up. Good intrusion detection system (IDS) technology works at the entry to a network, analysing the risk of a given data stream - no matter what it carries - at the moment it arrives from external sources. These devices need regular updates to their code, signatures and configuration.


The ability to balance what is being checked against throughput is a real need that equates to risk versus cost. Just because an intrusion prevention system (IPS) can be costly does not make it superior to other technologies - any device needs expert configuration and constant audit to ensure that malicious traffic really is being stopped.


The reliance on IT to manage our business data and do business with other companies across the globe as well as the rapid development of the Internet have created a new risk element. Indeed, this new type of risk can be considered more likely to happen than historical risk incidents such as fire and flood.


Modern network appliances are required to handle data throughputs of 10 Gbps - or clusters of 10 Gbps data streams. The number of companies reaching these throughputs will grow, and those already at these levels will need even more in the future. We must therefore start raising the effectiveness of the IDS/IPS platform itself.


There has to be a distinction between firewall and IDS/IPS functionality. While some vendors' equipment blurs the functions of both, in general the firewall is there to reject certain types of traffic and to control which traffic can flow in which direction and from which connections.


The rejection of certain data types from certain sources and the filtering out of suspect data that cannot be automatically detected as good or bad will reduce the volume of data being inspected.


Very large volumes of data can be split with the aid of load balancing, with each load having its own intrusion detection system.
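One common way to split such traffic - assumed here as an illustration, not a statement of any particular product's method - is to hash each connection's 5-tuple, so that every packet of a given flow lands on the same IDS instance and stream reassembly still works:

```python
# Sketch of flow-aware load balancing: hashing the 5-tuple keeps all
# packets of one connection on the same inspection device.
import hashlib

def choose_ids(src_ip: str, src_port: int, dst_ip: str, dst_port: int,
               proto: str, n_ids: int) -> int:
    """Map a flow's 5-tuple to one of n_ids inspection devices."""
    key = f"{src_ip}:{src_port}-{dst_ip}:{dst_port}/{proto}".encode()
    # A stable hash modulo the device count gives a consistent mapping.
    return int(hashlib.sha256(key).hexdigest(), 16) % n_ids
```

Because the mapping is deterministic, two packets of the same connection always reach the same device, which is what lets each IDS inspect complete streams rather than fragments.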


This balanced approach is the security posture that IT security platforms need to adopt when dealing with modern high-speed network data streams - reject or slow down undesirable traffic or connections and then inspect the remaining traffic for malicious code.


Whilst a sizeable proportion of traffic can be analysed and different categories of streamed data handled appropriately, there will always be an underlying risk that evaded, hybridised and zero-day threats will pass across the IDS battlefront unchecked.


To counter this, a number of additional threat detection stages must be carried out. An understanding of the vulnerability and the exploit is crucial, as is the difference between security signatures able to recognise vulnerabilities (which may have thousands of variant exploits) and audit and vulnerability testing. The latter requires actual exploits to test that the signature really does do what it is supposed to.


The recognition that an Intrusion Detection/Prevention device cannot possibly examine all traffic against all known exploits is key to understanding the need for constant auditing and testing of security devices.


The audit must identify malicious code that is not mitigated under test; provide information to 'tune' the configuration and the signatures being used; and, where required, devise additional rules to fix the issue. The tuning process will take account of the corresponding acceptable level of performance, namely the throughput. This audit and test of vulnerability has to be individual to your network, your equipment and, finally, to management's risk appetite.


The potential for false positives and false negatives is growing - therefore, the ability to audit and test using real threat traffic in the live environment is essential. Ever-increasing volumes of traffic mean that the IPS/IDS has to be left to accept or reject traffic automatically. The secret is for this to happen effectively and in a timely manner. Suspect data that is not immediately recognisable as good or bad can then be quarantined. Auditing your live environment with real traffic increases your ability to mitigate threats and reduces the number of manual interventions.


These approaches generally involve a high level of resources, and possible delay, being applied to what may be important data. Therefore, our approach here at Idappcom is to raise the bar on the IDS process by providing the tools to regularly audit and analyse the efficiency of the devices under test.


It is essential to have a constantly updated library of traffic files consisting of recordings of real live exploits attacking vulnerable machines, as well as good traffic that should be allowed through. By playing this traffic in and out of a network security device, there can be no doubt about the effectiveness or performance of the firewall/IDS/IPS.
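The audit described above amounts to replaying labelled samples - known-bad exploit recordings and known-good traffic - through a device and scoring the result. A toy harness along those lines might look like the following; `device` is any callable returning True when traffic is blocked, an assumption made here for illustration rather than any real product's interface.

```python
# Toy audit harness: replay labelled traffic samples through a device
# and score its detection rate and false-positive rate.
from typing import Callable, Iterable, Tuple

def audit(device: Callable[[bytes], bool],
          samples: Iterable[Tuple[bytes, bool]]) -> dict:
    """samples: (payload, is_malicious) pairs from the traffic library."""
    missed = false_pos = bad = good = 0
    for payload, is_malicious in samples:
        blocked = device(payload)
        if is_malicious:
            bad += 1
            missed += not blocked        # false negative: exploit let through
        else:
            good += 1
            false_pos += blocked         # false positive: good traffic blocked
    return {
        "detection_rate": (bad - missed) / bad if bad else 1.0,
        "false_positive_rate": false_pos / good if good else 0.0,
    }
```

Running such an audit before and after a configuration change is what makes the tuning measurable: the detection rate should rise without the false-positive rate rising with it.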


A security signature may be written to detect a vulnerability. If a single exploit, or variants of it, can beat the signature, it is clear that either the signature is weak or a configuration change needs to be made. Note, though, that some offerings have hundreds of variants of the same exploit when all you need is a select few that will test the security rule for the vulnerability. Recent tests by renowned labs have shown threats going unspotted, or evasion undetected, simply because the 'out of the box' configuration had certain functions switched off for performance reasons.


A risk analysis will show what the dangers are in balancing detection rates with performance. In most cases, unless there is a bandwidth problem, existing devices can be enhanced and performance maintained or even improved.


The test/fix/test cycle has many functions. Whichever way you look at it, it shows without a doubt whether your devices are performing the way they should. It is through audit, vulnerability detection, deployment of high-quality signatures and rules, and performance tuning that the effectiveness of the device(s) can be increased. This raises the bar on effectiveness without massive investment in new equipment, which often offers the same level of effectiveness, only faster.

   © 2012 ProSecurityZone.com