Solve your Data Lake Analytics Challenges like Bell Canada did
A forward-looking data strategy is essential if you want your data lake to deliver business value. As more and more data gets generated, it becomes challenging to meet the needs of business stakeholders who want reports with more dimensions, metrics, and measures. Existing systems were never built to handle such massive amounts of data, so it is worth exploring technologies designed specifically to solve these analytical challenges.
In this article, we will look at the challenges that Bell Canada faced with their data and the solution they adopted to achieve instant BI at a massive scale.
Data Lake Analytics Challenge #1: Inability to ingest high velocity and volume of data
Bell, the largest telecommunications company in Canada, follows a data-driven culture in which thousands of users across the organization use data to make business decisions. Their traditional data warehouse architecture worked well until they hit their first challenge: their existing systems could not keep pace with new data sources that were streaming massive volumes of data at high velocity.
They soon realized that their existing systems could ingest only half of the data being generated. Valuable data was being lost, and meaningful reports could not be built from incomplete data. No amount of optimization of the existing infrastructure helped, and they knew they needed a modern architecture that could handle their massive, high-velocity data.
To overcome this challenge, Bell Canada implemented a Hadoop platform, and within a few weeks of implementation they could ingest all of their data. The consolidated data lake could store all of their current data and scale up to meet future needs.
Data Lake Analytics Challenge #2: Inability to consume massive data effectively
Once the data came into Hadoop, the next data lake analytics challenge that Bell Canada faced was the consumption of this rich data for effective use by business users. They were using advanced BI tools such as MicroStrategy and Tableau that delivered response times in seconds for smaller datasets. But when they tried to analyze larger datasets, the response times declined drastically, making it difficult for them to meet the reporting expectations of their business users.
Besides slow response times, each BI query required access to the raw data and consumed significant resources. This often overloaded the Hadoop cluster and led to inconsistent response times for BI queries. Reporting often became cost-prohibitive, and they knew they had to design a data model for consumption that would allow easy, quick access to massive data.
To solve these problems, they used Kyvos to create a BI acceleration layer on their data platform. It pre-aggregated massive volumes of data into OLAP cubes that supported multi-dimensional analysis, providing deeper insights into their data with instant response times. Reporting became easy and cost-effective, allowing thousands of users to access massive volumes of data in easily consumable formats. Besides this, the BI acceleration layer allowed them to scale their BI platform easily by adding more nodes, just as they did for their Hadoop cluster.
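To illustrate the pre-aggregation idea behind an OLAP cube, here is a minimal, hypothetical sketch (not Kyvos's actual implementation): revenue facts are rolled up in advance over every combination of a few dimensions, including "ALL" roll-ups, so that a later BI query becomes a simple lookup instead of a scan over raw data. The dimension names and sample rows are invented for illustration.

```python
from collections import defaultdict

# Hypothetical raw fact rows: (region, service, month, revenue).
# In a real data lake these would be billions of rows.
raw_rows = [
    ("East", "Mobile", "2020-01", 120.0),
    ("East", "Mobile", "2020-02", 135.0),
    ("East", "Internet", "2020-01", 80.0),
    ("West", "Mobile", "2020-01", 95.0),
    ("West", "Internet", "2020-02", 60.0),
]

def build_cube(rows):
    """Pre-aggregate revenue over every combination of the three
    dimensions, including 'ALL' roll-ups, so that queries never
    have to touch the raw rows again."""
    cube = defaultdict(float)
    for region, service, month, revenue in rows:
        for r in (region, "ALL"):
            for s in (service, "ALL"):
                for m in (month, "ALL"):
                    cube[(r, s, m)] += revenue
    return cube

cube = build_cube(raw_rows)

# A BI question such as "total Mobile revenue across all regions
# and months" becomes a single dictionary lookup.
print(cube[("ALL", "Mobile", "ALL")])    # 350.0
print(cube[("East", "ALL", "2020-01")])  # 200.0
```

This is the core trade-off of an OLAP acceleration layer: the aggregation cost is paid once at build time, so every subsequent multi-dimensional query answers in constant time regardless of how large the raw data is.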
Bell Canada chose Kyvos' OLAP technology to achieve these results with their existing BI and data platform. If you want to understand how our OLAP technology works at massive scale and what it can do for you, read our blog.
To get further details on how Bell Canada used Kyvos to solve their data lake analytics challenges and achieve multi-dimensional analytics on massive data, listen to our webinar recording “BI at Exponential Scale at Bell Canada.”
In this webinar, our speakers, Mark Huang, Director – Data Engineering at Bell Canada, and Ajay Anand, VP Products & Marketing – Kyvos Insights, discuss the solution and its benefits in detail.