6 min read • Jun 25, 2020
Data, data, data… and then more data. And then more data again. Do you know roughly how much data is generated globally every single day? About 2,500,000,000,000,000,000 bytes. Don’t worry, I can’t read that either. And by the time you have tried to read it, the total will be much higher anyway, so don’t even bother! But wait, it’s not just that there is more data; the data itself is also far more varied. And no sooner has one source finally been loaded into the warehouse than another appears. And then another one, and another, and another.
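If, like me, you can’t read numbers that long, a quick back-of-the-envelope calculation (sketched here in Python purely for illustration, using the figure quoted above) puts it in perspective:

```python
# Back-of-the-envelope: how big is 2.5 quintillion bytes per day?
bytes_per_day = 2_500_000_000_000_000_000

print(f"{bytes_per_day / 1e18:.1f} exabytes per day")               # 2.5 exabytes per day
print(f"{bytes_per_day / 86_400 / 1e12:.0f} terabytes per second")  # ~29 terabytes per second
```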
And when all the data is finally in, what then? New data just keeps coming, overnight loads no longer make any boss happy, and what they demand instead are insights based on real-time data. This is no longer a single database, and it might take weeks or even months to get any insights out of it. And then the next day the boss might ask for completely new insights. Wait, is the data even reliable, is it of good quality, and would the numbers we deliver be correct…? You are not the only one here who is worried; many of us are!
So, is there anybody out there who can help? Yes, there is, although very few really can. Qlik, however, might be able to, so keep reading…
In the last decade or two, in parallel with the start of the exponential growth of data, many organisations have invested heavily in Big Data, building data lakes and focusing solely on loading as much data as possible into them. Of course, this was not enough and has mostly been unsuccessful, because few timely business insights could be delivered from such huge, unmanaged and ungoverned systems.
Nick Heudecker, a leading Gartner analyst, estimated in 2017 that the failure rate of data initiatives is close to 85%, even higher than Gartner’s previous prediction of 70%.
“In fact, interest in data lakes continues to grow. Ironically, enterprises seem to think they can just buy a data lake ‘solution’ off-the-shelf. The lack of sophistication and rigour in data lake RFPs is staggering. Save time and just write ‘Please sell me magic beans’ on an index card. Then go work on your resume, which you’ll need in 6 months,” he adds.
Prior to Big Data and data lakes, the focus was entirely on what was needed from the data, which produced DWHs, data marts and data analytics that were too slow, inflexible and limited. In this new era, however, the focus has shifted to gathering all the data. It is obvious that data management is still urgently needed, probably more than ever, and that the missing link for connecting all this new big data to meaningful insights might be Data Integration: as Accenture nicely puts it, a Big Data supply chain that actually finds the business in your data.
Forbes, in their article ‘The State Of Enterprise Data Integration, 2020’, based on research by Dresner Advisory Services, claims that over 80% of enterprise Business Operations leaders say data integration is critical to ongoing operations, that 67% of enterprises rely on data integration to support analytics and BI platforms today, and that a further 24% plan to move in this direction in the next 12 months.
So basically, all that vast data, known and unknown, checked and unchecked, necessary and not necessarily necessary, needs to be gathered, prepared, integrated and then analysed. And preferably in a quick, governed, up-to-date manner, with the data validated, cleaned, profiled, transformed, properly formatted and, where needed, aggregated. Not that easy, right? It usually takes a lot of manual work and a lot of scripting, often spread across different teams, with a huge risk of errors and defects leading to wrong or unreliable results. And last, but certainly not least, it most often takes a lot of time and delivers very slowly.
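To make that list a little more concrete, here is a minimal sketch of a single “prepare” step, written in plain Python with pandas rather than with any Qlik tooling; the customer_id column and the rules are invented purely for illustration:

```python
# A minimal, illustrative "prepare" step: profile, validate and clean one
# incoming source before it is integrated. The customer_id column and the
# rules below are hypothetical, not taken from any Qlik product.
import pandas as pd

def profile_and_prepare(df: pd.DataFrame) -> pd.DataFrame:
    # Profile: capture basic quality metrics so problems are visible early.
    profile = {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "null_ratio_per_column": df.isna().mean().round(3).to_dict(),
    }
    print(profile)

    # Validate and clean: drop exact duplicates and rows missing the key.
    df = df.drop_duplicates()
    df = df.dropna(subset=["customer_id"])

    # Transform: normalise the key so every source lands in the same shape.
    df["customer_id"] = df["customer_id"].astype(str).str.strip()
    return df
```

Now imagine hand-maintaining steps like this for dozens of sources, formats and teams, and it is easy to see where the errors, the defects and the slow delivery come from.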
There are actually many great tools out there that can help with data processing, all the way from extracting data at the source to delivering business insights. However, not many of them can deal with such a diversity of data sources, formats and structures (or no structure at all), and especially not at the required speed. A further issue is that, while quite a few of them could cover most of the pipeline, many are only good at some parts of it and weak at others. For instance, there are tools that do a good ETL job but then become quite an obstacle in data analytics and insights.
On the other hand, the next-generation analytics tools that have been clear leaders for years, such as Qlik, Tableau and Power BI, have struggled with data management. That was until recently, when Qlik acquired and integrated a few excellent tools (Attunity, Podium Data) that now make up the Qlik Data Integration (QDI) product line. Together with Qlik Sense, this makes for a unique end-to-end Data Management & Analytics offering, from data integration all the way to business insights, making DataOps possible in its fullest sense.
From a vast variety of data sources, QDI can generate, deliver and refine data in a quick, governed, reliable, real-time and insight-ready manner. The result can then be easily and thoroughly analysed with Qlik Sense, or indeed with other BI tools.
First, and most importantly, data is gathered in real time from any source thanks to Change Data Capture (CDC): changes are read from the transaction logs, so production systems are left entirely unaffected and users are always provided with the latest information. The gathered data can then be replicated to wherever it is needed: to data warehouses, data lakes, streaming and cloud platforms. However, much more can be done with the data, as QDI can automate the creation of data mart and data lake structures. By automation, I don’t mean endless ETL scripting, but real, agile, model-driven automation. Furthermore, Qlik Catalog simplifies and accelerates the cataloguing, management, preparation and delivery of business-ready data to analytics tools.
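Qlik Replicate itself is configured through a graphical console rather than code, but the log-based CDC idea behind it is easy to illustrate. The sketch below uses PostgreSQL logical replication via the psycopg2 library as a generic stand-in; the slot name and connection details are assumptions, and none of this is Qlik’s own API:

```python
# Conceptual log-based CDC: stream committed changes from the database's
# write-ahead log instead of querying production tables. PostgreSQL logical
# replication via psycopg2 is used here as a generic stand-in; this is NOT
# Qlik Replicate's API, and the slot/connection names are assumptions.
import psycopg2
import psycopg2.extras

conn = psycopg2.connect(
    "dbname=prod user=replicator",
    connection_factory=psycopg2.extras.LogicalReplicationConnection,
)
cur = conn.cursor()

# Assumes a logical replication slot was created beforehand, e.g.:
#   SELECT pg_create_logical_replication_slot('cdc_slot', 'test_decoding');
cur.start_replication(slot_name="cdc_slot", decode=True)

def forward_change(msg):
    # Each message is one change already committed to the log; the production
    # tables are never queried. Forward it to a warehouse, lake or stream.
    print(msg.payload)
    msg.cursor.send_feedback(flush_lsn=msg.data_start)  # acknowledge progress

cur.consume_stream(forward_change)  # blocks, delivering changes as they happen
```

The key design point is the one log-based CDC tools exploit: because changes are read from the log after commit, the source application never notices that replication is happening at all.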
So, all in all, the Qlik Data Integration Platform really automates the creation of analytics-ready data pipelines to empower users to operate at the speed of business, finally making DataOps more than just another buzzword.
For example, Swiss Life, a leading insurer, used QDI to transfer and consolidate multiple customer data sources (including mainframes) into a unified “Vision 360” repository on Elasticsearch in just two days, instead of the initially planned 45 days of ETL scripting.
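To give a feel for the landing side of such a project, here is a hedged sketch of loading consolidated customer documents into Elasticsearch with the official Python client; the vision360 index name, the fields and the sample records are all invented for illustration and have nothing to do with Swiss Life’s actual setup:

```python
# Illustrative only: land consolidated customer documents in Elasticsearch.
# Uses the official elasticsearch Python client; the "vision360" index name,
# the fields and the sample records are invented for this sketch.
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")

customers = [
    {"customer_id": "c-1001", "name": "Jane Doe", "source": "mainframe"},
    {"customer_id": "c-1002", "name": "John Roe", "source": "crm"},
]

# One bulk request instead of a round trip per document; using the business
# key as _id makes repeated loads overwrite documents rather than duplicate them.
helpers.bulk(
    es,
    (
        {"_index": "vision360", "_id": c["customer_id"], "_source": c}
        for c in customers
    ),
)
```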
QDI can also help a lot with SAP to S/4HANA migrations, by allowing SAP users to quickly and simply replicate and move selected data from and between SAP applications and data warehouses.
If we add to this the fact that the well-known and well-established Qlik Data Analytics portfolio has been a Gartner leader for 10 years in a row, along with all the new add-ons and features such as chatbots, natural-language querying, insight bots, alerting and geo-analytics, it becomes clear that Qlik has a unique end-to-end, one-stop-shop offering in today’s highly complex and challenging data world.