Telemetry Enables Better Business Decisions

From a planning and architecture perspective, understanding the impact and characteristics of IT workloads provides opportunities for consolidation and efficiencies of scale. Operationally, better telemetry increases the ability to troubleshoot, refine and performance-tune your VMs and workloads, leading to improved IT and business functions that can reduce costs per transaction and create streamlined business processes. According to a recent McKinsey & Co. report, “Most companies collect vast troves of process data but typically use them only for tracking purposes, not as a basis for improving operations. Some companies, particularly those with months and sometimes years-long production cycles, have too little data to be statistically meaningful when put under an analyst’s lens.” 1

Telemetry and logging are not new to IT; however, the amount and detail of the data, and the insights we gain from it, are much greater than they have ever been. Between cloud systems and the artificial intelligence that can now be applied to telemetry data to give us better insights, the value this kind of data can provide to businesses is only increasing.

Placing Value On Telemetry

The most basic and impactful way telemetry can help is by providing the ability to proactively manage risk and to predictively operate your total facilities as well as individual applications or workloads. Rich data on the performance and load of your IT facilities, covering hardware power status, temperature and environmental warnings, and IT system status (e.g. CPU load, RAM occupancy, network saturation and performance, storage saturation and performance, and any alarms or failures on those elements), can allow you to move to a just-in-time model of IT growth. Rather than having to spend capital and effort on growing your infrastructure based on assumptions, you can base your future needs on real-world data.
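
To make this concrete, here’s a minimal sketch of sampling those kinds of host metrics in Python with the psutil library; the specific metrics and the one-second sampling interval are illustrative choices, not a prescribed setup.

```python
# Minimal sketch: sampling the kinds of host metrics described above
# with the psutil library (pip install psutil). Metric choices and the
# sampling interval are illustrative, not recommendations.
import time
import psutil

def sample_host_metrics():
    """Return a point-in-time snapshot of CPU, RAM, disk and network load."""
    net = psutil.net_io_counters()
    return {
        "timestamp": time.time(),
        "cpu_percent": psutil.cpu_percent(interval=1),  # 1-second average
        "ram_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
        "net_bytes_sent": net.bytes_sent,
        "net_bytes_recv": net.bytes_recv,
    }

if __name__ == "__main__":
    # A just-in-time growth model starts with history like this,
    # shipped to a central store rather than printed.
    print(sample_host_metrics())
```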

Having quality data and analytics on individual workloads also means that you can get quite predictive with your actual applications, not just with the hardware they run on. That means being able to avoid software or application failures, better tune the availability of key services and, generally, reduce or eliminate unwanted downtime and support issues.
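
A simple illustration of that kind of early warning: the sketch below flags an application metric (say, response time in milliseconds) that drifts well above its recent baseline. The window size and the three-sigma threshold are arbitrary assumptions; real systems would use richer models.

```python
# Illustrative drift detector: flag a metric that rises well above its
# rolling baseline. Window size and sigma threshold are arbitrary.
from collections import deque
from statistics import mean, stdev

class DriftDetector:
    def __init__(self, window=100, sigmas=3.0):
        self.history = deque(maxlen=window)
        self.sigmas = sigmas

    def observe(self, value):
        """Return True if value is anomalously high versus the baseline."""
        anomalous = False
        if len(self.history) >= 30:  # wait for a usable baseline
            mu, sd = mean(self.history), stdev(self.history)
            anomalous = sd > 0 and (value - mu) > self.sigmas * sd
        self.history.append(value)
        return anomalous

# Example: feed it response times; a sudden spike returns True.
# detector = DriftDetector()
# alerts = [detector.observe(ms) for ms in response_times]
```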

Getting a Front Row Seat

Building a good view across both hardware resources and IT workloads, be they virtual or physical, requires planning an architecture whose hardware and facilities expose good telemetry. Vendor and technology choices determine the level of detail available: basic network data from an SNMP tool, status and health alerts from hardware, and information from IT operating systems about how applications or CPUs are affecting load and resource consumption. Try to understand the capabilities you need and want up front. Additionally, you’ll need to tie this data into some sort of tool or system that can help you make sense of all the different data points available to you, interpret what they mean and decide what actions to drive out of that intelligence.
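
As one concrete example of pulling such a data point, the sketch below reads a value from a network device over SNMP using the pysnmp library; the host address, community string and OID shown are placeholders, not a recommended configuration.

```python
# Sketch: reading one value from a device via SNMP v2c with pysnmp
# (pip install pysnmp). Host, community string and OID are placeholders.
from pysnmp.hlapi import (
    getCmd, SnmpEngine, CommunityData, UdpTransportTarget,
    ContextData, ObjectType, ObjectIdentity,
)

def snmp_get(host, oid, community="public"):
    error_indication, error_status, _, var_binds = next(getCmd(
        SnmpEngine(),
        CommunityData(community, mpModel=1),         # SNMP v2c
        UdpTransportTarget((host, 161), timeout=2),
        ContextData(),
        ObjectType(ObjectIdentity(oid)),
    ))
    if error_indication or error_status:
        raise RuntimeError(str(error_indication or error_status))
    return {str(name): str(value) for name, value in var_binds}

# Example: sysDescr of a hypothetical switch
# print(snmp_get("192.0.2.10", "1.3.6.1.2.1.1.1.0"))
```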

A Look On The Inside

For a company to be successful in its data efforts, the task is best owned by the people who can realize the most value out of good telemetry and analytics: the operations teams who have to keep your cloud systems running. Putting direct data and insight into the hands of the people actually making the changes is always best, and it’s primarily the IT and operations people who will get the most value out of this kind of insight. However, using this data to build better financial and procurement planning around IT is an immediate next step for most businesses, as the benefit can be significant.

By working in synergy, the business and the technology arms of a company can create a lot of value. For example, a finance team working with an operations team could use data and analytics from telemetry to build a predictive procurement model for new IT gear. In such a situation, finance is teaching ops about IT procurement and how it can help IT be more efficient with its spend, and IT is teaching finance how understanding basic data from the machines can help drive better visibility into costs.
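
As a deliberately simple sketch of that procurement scenario, the example below fits a linear trend to monthly storage utilization to estimate when capacity runs out; the figures are hypothetical, and a real model would also account for seasonality and procurement lead times.

```python
# Toy capacity forecast: fit a linear trend to monthly storage use and
# estimate months until the current capacity is exhausted. All figures
# here are hypothetical.
import numpy as np

months = np.arange(12)                        # last 12 months
used_tb = np.array([40, 42, 45, 46, 49, 52,   # hypothetical utilization (TB)
                    54, 57, 60, 62, 66, 69], dtype=float)
capacity_tb = 100.0

slope, intercept = np.polyfit(months, used_tb, 1)  # TB growth per month
months_to_full = (capacity_tb - used_tb[-1]) / slope

print(f"Growth: {slope:.1f} TB/month; "
      f"~{months_to_full:.0f} months until capacity is reached")
```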

Taming The Beast

There’s no right or wrong way to collect and transmit telemetry data; much will depend on what technologies and tools a business uses and where it wants to push its data. The key is to start by securing your data storage repositories and transit paths to ensure your data is kept private both at rest and in transit.

Telemetry data should always be stored securely. There’s a lot of valuable information in this data that you’ll want to keep private to your business and projects, so make sure that whatever facilities or services you’re using to collect and store it are well secured. Additionally, what makes telemetry data useful is analyzing it. Generally speaking, businesses with sophisticated IT operations want to gather telemetry from multiple elements and facilities and assess and analyze it holistically. So in addition to keeping it secure, a best practice is to collect and aggregate all your data in a central, secure location and make it accessible to whatever analytics tools you may be using.
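
To illustrate the in-transit half of that advice, here’s a sketch that ships a telemetry snapshot to a hypothetical central collector over HTTPS using Python’s requests library; the endpoint URL and token are placeholders, and certificate verification is deliberately left on so the data stays protected in transit.

```python
# Sketch: shipping a telemetry snapshot to a central collector over TLS.
# The collector URL and API token are placeholders for your own setup.
import requests

COLLECTOR_URL = "https://telemetry.example.internal/ingest"  # placeholder
API_TOKEN = "replace-me"                                     # placeholder

def ship(snapshot: dict) -> None:
    response = requests.post(
        COLLECTOR_URL,
        json=snapshot,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=5,    # fail fast rather than stall the sampler
        verify=True,  # keep TLS certificate validation on
    )
    response.raise_for_status()
```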

The Relationship between Telemetry and Cloud

The fact that cloud systems can scale up and down automatically means that getting a real-time sense of what’s happening on your systems can be difficult, especially when the speed of change in a cloud system means you need to analyze and act on your data very quickly, if not immediately and automatically. If you can build good telemetry and analytics on a cloud system, and then actually use that data in a timely manner to drive increased performance and overall system uptime, you will positively impact your business both in reduced costs and in more efficient processing of customer transactions. Good analytics on a cloud system can therefore impact both your top and bottom lines.
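
As a toy example of acting on cloud telemetry quickly, the rule below derives a desired node count from recent CPU samples; the thresholds are illustrative assumptions, and wiring the result to a platform’s scaling API is left as a placeholder rather than any specific provider’s mechanism.

```python
# Illustrative scaling rule: act on recent CPU telemetry rather than
# assumptions. Thresholds and minimum node count are arbitrary.
from statistics import mean

def scaling_decision(cpu_samples, current_nodes,
                     high=75.0, low=25.0, min_nodes=2):
    """Return the desired node count given recent CPU utilization (%)."""
    avg = mean(cpu_samples)
    if avg > high:
        return current_nodes + 1          # scale out under sustained load
    if avg < low and current_nodes > min_nodes:
        return current_nodes - 1          # scale in when idle
    return current_nodes                  # hold steady

# Example: five recent one-minute samples under load
# scaling_decision([82, 79, 88, 91, 85], current_nodes=4)  # -> 5
```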

Q9, Canada’s leading provider of outsourced data centre services, collects telemetry as required to maintain availability and quality of service for its customers, while preserving a demarcation that ensures we’re not accessing their data. We also provide basic telemetry on environments and hardware so that customers can apply whatever telemetry and analytics mechanisms they want. In this situation, Q9 is enabling the customer to define and build whatever scenario they choose and tie the data provided into that scenario. Just recently, Q9 also launched a suite of cloud services based on technologies that provide a great level of detail at both the physical and virtual level, and that can be used to drive really detailed and intelligent business decisions.

Ultimately, data driven decisions lead to better business outcomes. By helping our customers access and understand their data, we can drive both better costs as well as more effective overall operations.

1 “How Big Data Can Improve Manufacturing” by Eric Auschitzky, Markus Hammer and Agesan Rajagopaul, McKinsey & Company, July 13, 2014.

Nabeel Sherif is the creator and lecturer for the University of Toronto’s Cloud Computing Certificate program. He is also the Cloud Product Manager at Q9. For the past decade, his focus has been on developing and creating the next generation of services and products in hosting, cloud computing, datacentre services and application networks. You can follow him @themightynab.

 
