Big Data? Big storage costs

Even though Big Data is digital, businesses still need to store it somewhere. Data centres provide the answer to that problem. However, many firms do not consider the expense of warehousing data. Mac Wheeler, head of marketing for SandSIV, a provider of voice-of-customer and customer experience management software, explained the hidden costs of Big Data and how to counteract them.

“I believe that many companies come into Big Data capture without a clear idea of what the eventual costs may be,” Wheeler said. Businesses look at Big Data through a narrow lens. “They may consider the cost of the initial setup for their Big Data initiative, but I feel that many fail to take into consideration the possibly massive amount of storage they may eventually need,” he commented.

In addition to the cost of storing information, Wheeler predicts that companies will simply be unable to handle all of the data they have gathered. “I believe that many companies will reach a stage of information overload once their data silos become a certain size, making it hard to build data sets and perform analytics,” he said. Wheeler referred to the situation as “too much noise.”

Wheeler pointed out that there are solutions to this problem. He suggested using a Big Data-as-a-service (BDaaS) provider. BDaaS refers to managed services that deliver statistical analysis tools or information to help organizations glean insights from their vast stores of information. It often relies upon cloud storage that enables continuous data access.

“BDaaS is just an extension of SaaS,” Wheeler commented. “The SaaS model has proven to work exceptionally well, as its popularity demonstrates.” He noted that BDaaS offers a more affordable option for many businesses. “It pushes the costs of data storage on to the BDaaS provider, and overcomes the headache,” Wheeler remarked.

Aside from the analytics platform, Wheeler recommended changing the method of gathering information. “Big Data needs to be captured intelligently, to fit data set requirements, rather than capture absolutely everything possible,” he stated. “Put simply, they should only warehouse the data they need to produce the business intelligence they require.” Wheeler offered another piece of advice: “However, this should be done in an agile manner, enabling rapid changes to the type of Big Data being captured driven by business needs.”
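In practice, capturing data "to fit data set requirements" rather than warehousing everything can be as simple as filtering events at the point of ingest. The following is a minimal sketch in Python; the event types, field names, and records are all illustrative, not drawn from SandSIV's product:

```python
# Minimal sketch of "intelligent capture": keep only the event types and
# fields the business actually needs, instead of warehousing every raw
# event. All event and field names below are hypothetical examples.

# Capture rules kept as plain data; in an agile setup these would live in
# configuration so they can change quickly as business needs change.
WANTED_EVENT_TYPES = {"purchase", "support_ticket"}
WANTED_FIELDS = {"customer_id", "event_type", "timestamp", "amount"}


def filter_event(raw_event):
    """Return a trimmed event dict, or None if it should not be warehoused."""
    if raw_event.get("event_type") not in WANTED_EVENT_TYPES:
        return None
    return {k: v for k, v in raw_event.items() if k in WANTED_FIELDS}


raw_stream = [
    {"event_type": "purchase", "customer_id": 1, "timestamp": "2014-06-01",
     "amount": 19.99, "user_agent": "Mozilla/5.0", "raw_html": "<div>...</div>"},
    {"event_type": "page_view", "customer_id": 1, "timestamp": "2014-06-01"},
]

# Only the purchase is kept, stripped of the bulky fields it arrived with.
warehoused = [e for e in (filter_event(r) for r in raw_stream) if e is not None]
```

Because the rules are ordinary data rather than code, the type of Big Data being captured can be changed rapidly as business needs change, which is the agility Wheeler calls for.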

Wheeler expressed confidence that technology will solve the problem of the cost of warehousing data. “Storage tech is improving all of the time,” he commented. Wheeler pointed out that storage has been evolving for years. “Go back 30 years and we were still using piles of punched cards and spools of paper tape to store tiny amounts of data,” he remarked. “Fast forward to today, and we can fit that same volume of data millions of times over on a hard disk drive.” The evolution of storage shows no signs of stopping, according to Wheeler. “As storage requirements go up, innovation will close the gap eventually, giving us more compact ways to store ever increasing volumes of data,” he concluded.
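Wheeler's punched-card comparison holds up to a quick back-of-the-envelope check. Assuming a standard 80-column punched card stores roughly 80 bytes (an approximation, since encodings varied), a short calculation shows why "millions of times over" is, if anything, an understatement for a modern terabyte drive:

```python
# Back-of-the-envelope check of the punched-card comparison.
# Assumption: one standard 80-column punched card holds ~80 bytes.
BYTES_PER_CARD = 80
TERABYTE = 10**12  # bytes, using the decimal convention drive vendors use

# Number of punched cards a single 1 TB hard disk drive replaces.
cards_per_terabyte = TERABYTE // BYTES_PER_CARD
print(cards_per_terabyte)  # 12,500,000,000 cards' worth of data
```

That is 12.5 billion cards for one terabyte, so even a modest stack of cards fits on today's drives many millions of times over.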
