Securing networks in the 100Gbps era

Napatech : 18 May, 2016  (Special Report)
Dan Joe Barry of Napatech details the different approach needed to keep up with data growth and today’s high-speed networks, culminating in what he calls the “last security tool”

When it comes to IT security, the traditional wisdom has been to defend the perimeter. Of course this makes sense – organisations want to keep malicious actors out of their network. Nonetheless, they continue to get in. And today’s threat landscape is far less defined than it used to be, as the BYOD trend muddies the waters and employee errors create problems from within.

Efforts to keep the network secure are complicated by the rapid increase in network speeds. They now routinely hit 100 Gigabits per second, or roughly 70 million times faster than the typical network connection when firewalls were introduced. This poses a number of challenges, particularly for security. Network growth, along with the data deluge, puts great pressure on organisations to combat cyber threats and analyse cyber attacks in real time so that necessary actions can be taken with minimum delay.
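To make that scale concrete, a quick back-of-the-envelope calculation (ours, not the article’s) shows what recording a saturated 100 Gbps link actually implies for storage:

```python
# Back-of-the-envelope: daily capture volume on a saturated 100 Gbps link.
LINK_GBPS = 100                              # line rate, gigabits per second

bytes_per_second = LINK_GBPS * 1e9 / 8       # 12.5 GB arrive every second
bytes_per_day = bytes_per_second * 86_400    # 86,400 seconds per day

print(f"{bytes_per_second / 1e9:.1f} GB/s")  # -> 12.5 GB/s
print(f"{bytes_per_day / 1e15:.2f} PB/day")  # -> 1.08 PB/day
```

Over a petabyte a day on a single fully loaded link is why naive “record everything” approaches break down without purpose-built capture and storage.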

It comes as no surprise, then, that IT security teams are focusing on “security detection”, using network analysis to detect anomalies as a first indicator of new types of threats, whether zero-day threats or ones that come from within the network. Gartner’s “Shift Cybersecurity Investment to Detection and Response” research projects that by 2020, 60% of enterprise information security budgets will be allocated to rapid detection and response approaches, up from less than 20% in 2015. It is in this context that post-analysis comes into its own, as it is not always easy to catch threats as they happen.
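As a rough illustration of what anomaly-based “security detection” means in practice, the toy detector below flags traffic intervals whose volume strays several standard deviations from a rolling baseline. The window and threshold values are illustrative assumptions, not anything prescribed by the article:

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(byte_counts, window=60, threshold=3.0):
    """Flag intervals whose traffic volume lies more than `threshold`
    standard deviations from the mean of the previous `window` intervals."""
    history = deque(maxlen=window)
    anomalies = []
    for i, count in enumerate(byte_counts):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(count - mu) / sigma > threshold:
                anomalies.append((i, count))   # (interval index, bytes seen)
        history.append(count)
    return anomalies

# Example: a steady ~1 GB/interval baseline with one 10 GB exfiltration spike.
traffic = [1_000_000_000 + (i % 7) * 1_000_000 for i in range(120)]
traffic[100] = 10_000_000_000
print(detect_anomalies(traffic))   # -> [(100, 10000000000)]
```

A real detector would track many dimensions (flows, ports, destinations) rather than one volume counter, but the principle is the same: the baseline is learned from the traffic itself, so previously unseen threats can still surface as deviations.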

It becomes critically important, then, to be able to conduct deep analysis offline or even post-attack to determine what happened. The analysis allows management to make decisions and take actions in response to an attack. More importantly, it is needed to ensure that a cyber event has been truly resolved so that all public disclosure, notification of impacted parties and internal remediation can be completed.

Alert Overload

High-speed networks are experiencing attacks at higher levels than ever before. However, in most cases the attacks are discovered weeks later. Network security products are facing a two-fold growth challenge: Data traffic is increasing exponentially, so there is more to analyse at faster speeds. At the same time, cyber-attacks are also growing in number and complexity.

Recognising the need to warn IT teams of pending threats or in-progress attacks, vendors have created all manner of security alerts. Entire industries have grown up to process the tens of billions of events generated every day in a typical large enterprise. The security team faces the huge task of collecting this data from all the tools and then prioritising the resulting alerts by severity.
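A minimal sketch of that triage step might look like the following: events from multiple tools land in one queue, duplicates are removed, and the remainder is ordered by severity. The field names and the severity scale are illustrative assumptions, not a real SIEM schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Alert:
    source: str       # which tool raised it (IDS, firewall, EDR, ...)
    signature: str    # rule or detection name
    host: str         # affected asset
    severity: int     # 1 (informational) .. 5 (critical) -- assumed scale

def triage(alerts):
    """De-duplicate identical alerts and return them most-severe first."""
    unique = set(alerts)                  # frozen dataclass -> hashable
    return sorted(unique, key=lambda a: a.severity, reverse=True)

feed = [
    Alert("ids", "ET SCAN Nmap", "10.0.0.5", 2),
    Alert("ids", "ET SCAN Nmap", "10.0.0.5", 2),       # duplicate event
    Alert("edr", "credential-dump", "10.0.0.9", 5),
    Alert("fw",  "port-25-egress", "10.0.0.5", 3),
]
for a in triage(feed):
    print(a.severity, a.source, a.signature, a.host)
```

At tens of billions of events per day, even this trivial pipeline makes the core problem visible: de-duplication and ranking only reduce the pile, they do not tell the analyst what actually happened on the wire.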

However, though these tools create huge quantities of alerts, they often give incomplete or contradictory information about a given event. Worse, once an attacker is inside, he will often compromise the credentials of a legitimate user, disguising himself as an employee while he searches for and extracts sensitive data.

The Last Security Tool

Organisations understand what’s at stake if a breach occurs. They need to deploy a diverse strategy that:

* ensures all security prevention methods have the necessary bandwidth and capacity to handle high-speed, high-volume attacks.
* ensures the security detection methods are in place to not only detect anomalies in real time but also to record network activity for deeper analysis and/or later detection of a past breach.

With so many attack vectors today, organisations cannot rely on any one security product. Traditional point defences cannot adequately address the new, faster-moving, multi-layer threats and more sophisticated attackers. What’s required is a layered approach with defence-in-depth, where an organisation relies not only on network security appliances for indications of data breaches but also on network behaviour analysis.

It can be said, then, that continuously recorded network data is the “last security tool”. A network forensics approach should continuously capture all data, 24x7, regardless of whether anything interesting is happening at a particular moment. Then, in conjunction with alerts from the other tools, the security team can investigate whether an event was a false alarm or something that needs to be actioned. Moreover, they can see what happened after the breach and achieve the ultimate goal: determining all the assets the attacker may have accessed and whether the attacker has truly been eliminated from the environment.
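To show the idea of “capture all data 24x7” in code, here is a minimal ring-buffer capture sketch using the scapy library (our choice for illustration; a Python script cannot keep up at 100 Gbps, and production recorders use purpose-built capture hardware). Packets go into hourly pcap files, and the oldest files are deleted once a retention limit is reached:

```python
import glob
import os
import time

from scapy.all import PcapWriter, sniff   # pip install scapy; run as root

CAPTURE_DIR = "/var/capture"   # hypothetical capture directory
ROTATE_SECS = 3600             # start a new pcap file every hour
MAX_FILES = 24 * 7             # retain roughly one week of traffic

writer = None
opened_at = 0.0

def handle(pkt):
    """Write every packet to the current hourly file, rotating as needed."""
    global writer, opened_at
    now = time.time()
    if writer is None or now - opened_at >= ROTATE_SECS:
        if writer is not None:
            writer.close()
        opened_at = now
        writer = PcapWriter(os.path.join(CAPTURE_DIR, f"cap_{int(now)}.pcap"))
        # Ring buffer: delete the oldest files beyond the retention limit.
        files = sorted(glob.glob(os.path.join(CAPTURE_DIR, "cap_*.pcap")))
        for old in files[:max(0, len(files) - MAX_FILES)]:
            os.remove(old)
    writer.write(pkt)

sniff(prn=handle, store=False)   # capture on the default interface, forever
```

The rotation-plus-retention pattern is the essential design choice: capture never stops and never fills the disk, trading retention depth against storage budget.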

Taking the Larger View

There are tools today that can give IT teams a partial network recording based on an event, but that data will be incomplete if the recording tool did not see anything it considered interesting. For effective network forensics, best practice today is to complement such tools with one that records everything continuously at high speed. It must be purpose-built for this task, since the storage and indexing demands of this volume of data are quite different from those the architecture of other security tools is designed for.

Data capture and retrieval-on-demand take real-time data capture one step further. The network forensics tool must provide an immediate, indexed answer to an investigator pursuing an event. It is crucial that security officers can quickly go to the time and place of the event to start analysis; waiting several hours for this initial answer causes serious delays while the attacker may still be inside.

The swiftly moving tides of data that IT teams must cope with, and the cost and tedium involved in storing and analysing that data, create the need for data-on-demand. With this capability, teams can quickly access the packets from a specified time period or server to zero in on the root cause. Organisations today need to be able both to capture data and to retrieve it quickly. This layered approach can create a stronger security perimeter.
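The sketch below illustrates the data-on-demand idea with a toy index mapping a (minute, server) pair to the capture files that contain matching packets, so an investigator can open only the relevant slice instead of scanning petabytes. The schema is an illustrative assumption; real capture appliances maintain such indexes in far more compact, hardware-assisted form:

```python
import sqlite3

# Toy index: which capture file holds traffic for a given minute and host.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE pkt_index (
        minute INTEGER,  -- unix epoch time // 60
        host   TEXT,     -- server IP observed in the packets
        pcap   TEXT      -- capture file containing those packets
    )""")
db.execute("CREATE INDEX idx_time_host ON pkt_index (minute, host)")

def record(ts, host, pcap_file):
    """Register that `pcap_file` holds traffic for `host` around time `ts`."""
    db.execute("INSERT INTO pkt_index VALUES (?, ?, ?)",
               (ts // 60, host, pcap_file))

def files_for(start_ts, end_ts, host):
    """Return the capture files to open for one server in a time window."""
    rows = db.execute(
        "SELECT DISTINCT pcap FROM pkt_index "
        "WHERE minute BETWEEN ? AND ? AND host = ?",
        (start_ts // 60, end_ts // 60, host))
    return [r[0] for r in rows]

# Populate with two hours of hypothetical capture metadata, then query.
record(1_700_000_000, "10.0.0.9", "cap_1.pcap")
record(1_700_003_600, "10.0.0.9", "cap_2.pcap")
print(files_for(1_700_000_000, 1_700_000_300, "10.0.0.9"))  # ['cap_1.pcap']
```

The point of the index is latency: a time-and-host lookup returns in milliseconds, which is what lets an investigator jump straight to the relevant packets rather than waiting hours for a linear scan.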

Daniel Joseph Barry is VP Positioning at Napatech and has over 20 years’ experience in the IT and Telecom industry. Prior to joining Napatech in 2009, Dan Joe was Marketing Director at TPACK, a supplier of transport chips to the Telecom sector. From 2001 to 2005, he was Director of Sales and Business Development at optical component vendor NKT Integration (now Ignis Photonyx) following various positions in product development, business development and product management at Ericsson. Dan Joe joined Ericsson in 1995 from a position in the R&D department of Jutland Telecom (now TDC). He has an MBA and a BSc degree in Electronic Engineering from Trinity College Dublin.
