Tuning your network to dial Big Data

Yet again, Data Analytics and Business Intelligence are at the top of the CIO's agenda for 2015. Big Data and the Internet of Things are also on the radar for many organisations.

Investment priority                2014    2015
1. BI/analytics                     41%     50%
2. Infrastructure & Data Centre     31%     37%
3. Cloud                            27%     32%

The success of these initiatives will depend on the ability of your network to perform. So what do you need to do to prepare your enterprise data network to support them?

How Big is Big?

Industry analysts and vendors endlessly predict the impact of Big Data, and the need for high-performance networks to cope. However, that really depends on the type of data and where it needs to go.

Business Intelligence systems that collate and analyse data from multiple sites, then present and distribute it in real time, can require significant bandwidth to deliver the responsiveness users expect – think MapReduce frameworks such as Hadoop, or analytics tools like Splunk. They also require extremely powerful servers.
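To see why this pattern generates cross-site traffic, here is a toy, single-process sketch of the MapReduce pattern that Hadoop distributes across a cluster. The function names and sample data are invented for illustration; in production, map tasks run where the data lives and only the shuffled intermediate results cross the network.

```python
from collections import defaultdict

def map_phase(records):
    """Emit (key, 1) pairs -- e.g. counting event types across site logs."""
    for record in records:
        for word in record.split():
            yield (word, 1)

def shuffle(pairs):
    """Group intermediate values by key.

    In a real cluster, this is the step that moves data between nodes --
    and the one your network has to carry."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Aggregate each key's values into a final result."""
    return {key: sum(values) for key, values in groups.items()}

logs = ["error warning error", "warning info", "error"]
result = reduce_phase(shuffle(map_phase(logs)))
print(result)  # {'error': 3, 'warning': 2, 'info': 1}
```

The point of the sketch: the map work can stay local to each site, but the shuffle step is inherently network-bound, which is why distributed analytics puts pressure on inter-site links.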

However large the data sets you’re analysing, if they live in one place, bandwidth becomes less critical. And some forms of big data – such as SCADA or machine-generated data collected by tools like Splunk – don’t necessarily create massive traffic volumes, even when they are traversing sites.

Implications for your network

If you are planning any Big Data initiatives or business analytics applications, your existing network could let you down. First ask yourself, "What will our network need to do to support the success of these innovations?" Specific questions to answer may include:

Where does the data we need to process/analyse reside?

If the data you need to analyse is distributed across your operations, you need to ensure it can be transferred across your network to the point of processing. This means the network 'pipes' between the locations of your data must be adequately provisioned to carry it to where it needs to be.
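A back-of-envelope calculation makes 'adequately provisioned' concrete. The data volumes and link speeds below are illustrative assumptions, not figures from this article:

```python
def transfer_hours(data_gb: float, link_mbps: float, efficiency: float = 0.8) -> float:
    """Estimated hours to move data_gb over a link of link_mbps,
    assuming only `efficiency` of nominal bandwidth is usable
    (protocol overhead, contention from other traffic)."""
    data_megabits = data_gb * 8 * 1000            # GB -> megabits
    seconds = data_megabits / (link_mbps * efficiency)
    return seconds / 3600

# Moving a hypothetical 500 GB data set to a central analytics site:
print(f"{transfer_hours(500, 100):.1f} h on a 100 Mbps link")   # ~13.9 h
print(f"{transfer_hours(500, 1000):.1f} h on a 1 Gbps link")    # ~1.4 h
```

Running the same arithmetic against your own data volumes and link speeds quickly shows whether a daily transfer window is realistic or whether the pipes need upgrading.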

Will we need to get it there in real time or can we wait an hour (or a day)?

If real-time reporting is essential to the nature of your Big Data or analytics application, these network 'pipes' will need to be lightning fast. It also means the data must flow continuously as it is created or amended, rather than being 'batched' into intermittent sends.
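The cost of batching can be sketched with simple arithmetic. The intervals below are assumed purely for illustration:

```python
def worst_case_staleness_s(batch_interval_s: float, transfer_s: float) -> float:
    """Worst-case age of data when it reaches the dashboard: an update
    created just after a batch departs waits a full interval, then
    takes transfer_s to arrive."""
    return batch_interval_s + transfer_s

# Hourly batches with a 30-second transfer: data can be over an hour old.
print(worst_case_staleness_s(3600, 30))   # 3630.0 seconds
# Continuous streaming: staleness is bounded only by network latency.
print(worst_case_staleness_s(0, 0.5))     # 0.5 seconds
```

In other words, 'real time' is decided as much by how often you send as by how fast the link is.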

This might also involve prioritising the data within your network pipes to minimise latency. Otherwise, your users will not be making decisions on the latest data, or will experience performance issues with the application – such as delays in loading dashboards or graphs.

Getting it right

Planning to exploit Big Data or the IoT – harvesting, analysing and acting on internal or external sources of data in new ways – is a significant challenge, calling for specialised expertise. It also demands re-thinking how data is stored, where it is located and how it is transmitted throughout your organisation, taking all the necessary security and compliance risks into consideration.

Any new data-related business initiative needs to be carefully considered from all angles – and can benefit from the expertise of network specialists, right from the initial planning stage.