

When hunting in multi-100G network data lakes, you need to be carrying the biggest harpoon

03 Jul 2017

It’s a challenging puzzle – network data volumes are growing faster than virtually any network analytics, security or forensics platform can keep up with. Legacy network and behavioural analytics platforms drown in the resulting data lake.
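To put that growth in perspective, here is a minimal back-of-envelope sketch in Python. The link rates are illustrative, and it assumes lossless full-packet capture of a saturated link – an upper bound, since real links are rarely saturated around the clock – but it shows why a multi-100G network fills a petabyte-scale lake in hours, not months:

    # Back-of-envelope: sustained full-packet capture volume per day.
    # Assumes lossless capture of a fully saturated link, so treat the
    # figures as an upper bound on raw ingest.

    LINK_RATES_GBPS = [10, 40, 100, 400]  # common Ethernet line rates
    SECONDS_PER_DAY = 86_400

    for gbps in LINK_RATES_GBPS:
        bytes_per_day = gbps * 1e9 / 8 * SECONDS_PER_DAY
        print(f"{gbps:>4} Gb/s ~ {bytes_per_day / 1e15:.2f} PB/day")

    # A single 100 Gb/s link alone generates roughly 1.08 PB of raw
    # data per day; several such links quickly dwarf what legacy
    # platforms were designed to ingest and index.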

Add to the mix a wide variety of fluctuating data sources – probes, infrastructure components, cyber defence logs, syslogs and quality of service reports – and you begin to see the landscape facing network security and security operations centre (SOC) professionals and teams. The result is a data lake that is simultaneously getting bigger, deeper and harder to explore.

Even as the lake grows towards and beyond the petabyte scale, preventing data loss and expanding analytical capability remain of paramount importance to the cyber security industry:

SecOps want the lake to grow, with any and all rivers of source data flowing in. Naturally, they also want to dive in holding the biggest harpoon – watching over the security of users and the network, continually updated on every active predator in the lake and keeping a very close eye on proceedings. They want the ability to swim softly, one careful stroke at a time, before administering a well-positioned, efficient and ultimately deadly blow – all the while able to forensically analyse and remember each step of the hunt, making future forays more efficient.

The network admins want to look down over the entire lake, rapidly diagnosing service-affecting issues, before periodically diving into its depths and swimming directly to the rescue of the specific users, servers and services that may be in trouble.

There are others drawn to the cool waters and tantalising tendrils of mist rising from the lake, knowing that valuable – potentially business-critical – information lies somewhere in the murky depths, just out of reach. This group includes the business’s financial stakeholders, quality of service/experience (QoS/QoE) teams, business intelligence, network architects and business strategists.

Previous-generation advanced cyber, analytics and forensics platforms – however feature-rich – are simply unable to handle these huge and expanding data lakes. This manifests in one or more of the following ways:

  1. They are simply unable to ingest the raw data at the rate it arrives
  2. Analytic, forensic and reporting requests take minutes or even hours to produce results
  3. Initial deployment, and subsequent scaling to the required rate and geographic network topology, carry prohibitively large costs
  4. They are unable to fulfil their basic reason for existence

The ideal solution is proven yet future-proof, flexible and scalable, responsive and thorough – performing at the level required regardless of data rate.

Break out your swimming costume and sunglasses, and ask us for a harpoon... contact us.