By definition, big data means there’s a lot of it. If it sits in one static repository, you at least don’t have to worry about moving it around and checking it in transit. If, on the other hand, your big data is generated by hordes of devices attached to the Internet (the Internet of Things, or IoT), data security takes on a different aspect. In this case, data streams into your data centre over the network and must be checked for suspect packets before being let through to your servers and storage. But how do you cope with terabytes of data arriving at once?
Raw power is a necessary but not sufficient ingredient in any solution for securing network traffic at this scale. Underpowered firewalls will hold up traffic flows and may themselves buckle under the onslaught. And if the big data from jet engines, buildings, cars and fridges (to name just a few sources) doesn’t cripple your outer layer of protection, there are always denial-of-service (DoS) attacks waiting in the wings. Firewall vendors offer various additional solutions for tackling this challenge of data volume and its companion conundrum of data latency, which matters particularly for financial transactions, where microseconds count.
One approach is to apply different levels of checking according to the profile of the traffic: latency-sensitive traffic from financial trading, for instance, might be let through with a lighter level of checking to ensure rapid transfer. Another is to separate the resources used for administrative functions from those used for checking data, so that data overloads or DoS attacks can still be brought under control. One thing is sure, however: the challenge of big data lies not only in knowing how to store and analyse it, but in knowing how to check and ‘disinfect’ it too.
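To make the first approach concrete, here is a minimal sketch in Python of profile-based inspection tiers. All names, thresholds and the toy “signature scan” are illustrative assumptions, not taken from any real firewall product: the point is only that latency-sensitive traffic receives a cheap check while everything else gets a fuller one.

```python
# Hypothetical sketch of profile-based inspection, as described above.
# Latency-sensitive traffic gets a light check; bulk IoT telemetry and
# unclassified traffic get a fuller scan. All values are illustrative.
from dataclasses import dataclass
from enum import Enum

class Profile(Enum):
    FINANCIAL = "financial"   # latency-sensitive trading traffic
    IOT = "iot"               # bulk sensor data from devices
    UNKNOWN = "unknown"       # anything unclassified

@dataclass
class Packet:
    source: str
    profile: Profile
    payload: bytes

MAX_PAYLOAD = 65_536
# Toy byte patterns standing in for a real deep-packet-inspection
# signature database.
SUSPECT_SIGNATURES = (b"\x90\x90\x90", b"DROP TABLE")

def inspect(packet: Packet) -> bool:
    """Return True if the packet may pass into the data centre."""
    if len(packet.payload) >= MAX_PAYLOAD:
        return False
    if packet.profile is Profile.FINANCIAL:
        # Light tier: the size check above is all we do, so that
        # microsecond-sensitive traffic is not held up.
        return True
    # Full tier: scan the payload against the signature list.
    return not any(sig in packet.payload for sig in SUSPECT_SIGNATURES)
```

A real firewall would classify packets from headers and flow state rather than a pre-attached label, but the tiering logic would follow the same shape.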