Matching security to increased network performance

Napatech: 26 March, 2015 (Special Report)
Dan Joe Barry, a Vice President at Napatech, examines the need for greater focus on network performance when deploying network security systems.
In a 2014 report, IDC analyst Frank Gens popularised the phrase “third platform” to describe the next-generation IT software foundation that includes cloud computing, mobile, Big Data and social engagement. All of these generate more data, and generate it faster, than organisations have ever had to manage before.

Many recent analyst predictions regarding IT centre on the unprecedented quantity of data being produced by the third platform. As analyst firms monitor the ever-changing IT landscape, it is clear that, regardless of the platform or the means of delivery, the volume, variety and velocity of data in networks continue to grow at explosive rates.

Network management and security appliances now need to guarantee network performance while keeping pace with ever-advancing network speeds. Software acceleration platforms and tools are the emerging technologies organisations need to maximise engagement and value, enable innovation and remain ahead of the data explosion.

This new reality requires platforms and tools that accelerate access to data. As network engineers work to deliver these massive data streams in real time, performance and application monitoring is turning into a pressure cooker, with multiple usage crises dragging down network performance at any given time.

Appliance Design for a New Era

Software platforms need acceleration and support to keep pace. To address this, hardware acceleration should both abstract hardware complexity away from the software and deliver performance acceleration. De-coupling the network from the application layer achieves this separation while also opening appliances up to new functions that were not part of their original design.

Administrators can use high-performance network adapters to identify well-known applications in hardware by examining layer-one to layer-four header information at line speed. By defining which functions are performed in hardware and which in application software, more network processing can be offloaded to hardware, leaving the application software free to focus on application intelligence and freeing CPU cycles for more analysis at greater speed.
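As a rough illustration of the kind of header-based lookup such an adapter performs in hardware, the sketch below classifies a raw IPv4 Ethernet frame by its layer-two to layer-four fields. The port-to-application table and the function name are illustrative, not Napatech's API.

```python
import struct

# Hypothetical mapping of well-known ports to application labels.
WELL_KNOWN_PORTS = {80: "http", 443: "https", 53: "dns", 22: "ssh"}

def classify(frame: bytes) -> str:
    """Classify a raw Ethernet frame from its L2-L4 headers."""
    ethertype = struct.unpack("!H", frame[12:14])[0]
    if ethertype != 0x0800:                     # not IPv4
        return "other"
    ihl = (frame[14] & 0x0F) * 4                # IPv4 header length in bytes
    proto = frame[23]                           # IP protocol field
    if proto not in (6, 17):                    # not TCP or UDP
        return "other"
    l4 = 14 + ihl                               # start of the L4 header
    src_port, dst_port = struct.unpack("!HH", frame[l4:l4 + 4])
    return WELL_KNOWN_PORTS.get(dst_port, WELL_KNOWN_PORTS.get(src_port, "unknown"))
```

In an accelerated appliance this lookup happens in the adapter itself; software only sees the resulting classification.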

Massive parallel processing of data is now possible using hardware that uses this information to identify flows and distribute them across up to 32 server CPU cores. All of this should be provided with minimal CPU overhead. Appliance designers should consider features that preserve as much processing power and memory as possible, and identify applications that require memory-intensive packet payload processing.
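The flow-distribution idea can be sketched as a symmetric hash of the flow's endpoints, taken modulo the core count, so that both directions of a connection land on the same core. This is an illustrative scheme, not the vendor's actual algorithm.

```python
import socket
import struct
import zlib

NUM_CORES = 32  # matches the 32-core distribution described above

def core_for_flow(src_ip: str, dst_ip: str,
                  src_port: int, dst_port: int, proto: int) -> int:
    """Map a 5-tuple to a core; symmetric so A->B and B->A agree."""
    a = (socket.inet_aton(src_ip), src_port)
    b = (socket.inet_aton(dst_ip), dst_port)
    lo, hi = sorted([a, b])                      # canonical endpoint order
    key = lo[0] + hi[0] + struct.pack("!HHB", lo[1], hi[1], proto)
    return zlib.crc32(key) % NUM_CORES
```

Because the endpoints are sorted before hashing, request and response packets of the same flow are always processed by the same core, which keeps per-flow state local.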

Improving Real-time Analysis

Though many tools exist for downstream analytics in today's voluminous security environment, their ability to perform real-time analysis and alerting is limited by their performance. Products used to extract, transform and load data into downstream systems tend to increase the latency between data collection and data analysis. Moreover, the volume and variety of data being ingested make it all but impossible for analysts and decision makers to locate the data they need across the various analysis platforms.

“Third platform” activities will be accelerated by improving real-time analysis capabilities, thereby pushing intelligence to the point of data ingress. Some best practices include:

Intelligent Alerts in Real Time: Real-time alerting means knowing what data is entering the system, before it reaches decision-making tools, and notifying stakeholders when new data of interest for their area of responsibility arrives.

Real-time Analytics: Organisations need to analyse data at the very moment it is received in order to make use of perishable insights: data whose value declines rapidly over time. Doing so ensures that an organisation can begin acting immediately on what is happening.

Immediate Data Inspection: By inspecting data immediately upon ingress, flow decisions can be made at line rate to direct it to downstream consumers, minimising the unnecessary flow of data through downstream brokers and processing engines.
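The three practices above can be sketched as a single ingress function that inspects each record the moment it arrives, raises alerts, and routes the record to its downstream consumer. Rule names, topics and record fields here are illustrative, not from any particular product.

```python
from collections import defaultdict

# Hypothetical alert rules evaluated at the point of ingress.
ALERT_RULES = {"auth_failure": lambda r: r.get("failures", 0) >= 5}

def ingest(record: dict, queues: defaultdict, alerts: list) -> None:
    """Inspect a record on arrival, alert in real time, then route it."""
    # Immediate inspection and intelligent alerting, before any batch tools.
    for name, rule in ALERT_RULES.items():
        if rule(record):
            alerts.append((name, record.get("source")))
    # Data-flow decision: direct the record to its downstream consumer.
    queues[record.get("topic", "default")].append(record)
```

Because inspection happens before the record reaches any broker or batch engine, alerts fire while the insight is still perishable, and uninteresting data never clogs downstream processing.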

Building New Network Infrastructures

Telecoms, carriers, cloud providers and enterprises face a challenge in paying for these specialised network applications, as they can be incredibly expensive and make scaling to meet demand a costly proposition. Even worse, if the market shifts toward adoption of novel network hardware, these organisations must bear the cost of updating legacy infrastructure in order to stay competitive.

Security appliance designers can now separate network data processing from application data processing and build flexibility and scalability into the design. This introduces a powerful, high-speed platform into the network, capable of capturing data with zero packet loss at speeds of up to 100 Gbps.

The hardware platform provides an analysis stream that can support multiple applications, not just performance monitoring. Multiple applications running on multiple cores can be executed on the same physical server with software that ensures that each application can access the same data stream as it is captured.

This analysis stream transforms the performance monitor into a universal appliance for any application that requires a reliable packet-capture data stream. With this capability, more functions can be incorporated in the same physical server, increasing the value of the appliance.
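The shared-stream idea can be sketched as a capture loop that fans each packet out to every registered application, so that monitoring, security analysis and recording all run on the same server over the same captured data. The class and handler names are hypothetical.

```python
class CaptureStream:
    """One capture stream shared by multiple applications on one server."""

    def __init__(self):
        self.handlers = []          # one handler per application

    def register(self, handler):
        """Add an application's packet handler to the shared stream."""
        self.handlers.append(handler)

    def on_packet(self, packet: bytes):
        # Every registered application sees every packet as it is captured.
        for handler in self.handlers:
            handler(packet)
```

In a real appliance the fan-out is typically zero-copy, with each application reading the same captured buffers rather than receiving duplicates.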

Data loads continue to increase, but new technologies have emerged to help enterprises manage them without losing performance. Organisations that can scale in a timely manner and keep up with increasing connectivity speeds are well on their way to handling the era of the third platform. But they must also accelerate security and network management applications to build a more robust network in today's fast-paced environments and stay ahead of the ever-expanding data growth curve.
