How does automation fit into today's landscape of BI and analytics environments? Which tasks – if any – can be automated, and for what purpose? A short introduction on how to speed up time-to-data when new pieces of data need to be brought into play, how to put less strain on scarce IT resources, and how to resolve some of the confusion that arises when multiple BI users access data through different front ends like Qlik, Tableau and Power BI.
Majken Sander, Business Analyst & Solution Architect, TimeXtender, email@example.com
As a business person, when you tell me that automation – robots? – should build my next data warehouse, my immediate response is to wonder how computers and artificial intelligence could possibly know what is important and best for my company. After all, if it were that easy, why have I spent endless hours over the last few decades constantly assessing ways to measure and analyse how to run my business?
As a tech person, if I heard about the invasion of machine learning and automation, I would immediately argue that no piece of software has ever been wiser or better at writing code than a critical human brain! Sure, AI-developed software has been attempted before, but how successful has that been? And agreed, some database vendors do make pretty neat software that automatically creates explain plans, that actually chooses the right index, and that puts the most needed and used data in memory for a faster response. But let's all agree on one thing – sometimes, everything technical can go a bit haywire, right?
“The solution we have right now works!”
I’m sure that currently in your business, data is being collected from various sources. Scripts that have been manually written and maintained ensure that it is the right data. Maybe it has to be filtered and transformed, but performance is… okay. At least, it’s okay for now. Perhaps this system has already run unsupervised for years – as long as the source systems have stayed the same, no additional data has been needed, and your core IT person has not retired.
But the fact is, the world changes and so do ideas for using data.
Time-to-data
‘Time-to-data’ is a rewrite of the familiar phrase ‘time-to-market’. It boils down to the time between getting an idea about a new way of using data and the moment that idea hits the BI front end or analytics tool of your choice.
What’s your current time frame? Do you measure time-to-data in hours, days, weeks or even months, from getting an idea or recognising a need until data shows up at the fingertips of your data visualisation tool?
Maybe the backlog of ideas and requests in your organisation is growing ever longer by the day?
How does gaining access to your data translate into the value of the business insights this data facilitates? Would faster and more flexible access give you answers that could save you both time and money?
Now, this might sound like a phrase from a commercial, one of those questions that it’s almost too easy to answer “Yes” to. But as most of us have learned, speed and flexibility often come at a price. With automation, though, it can be a slightly different case.
What is BI automation?
Automation is the part of the system that takes care of underlying changes and maintenance, while at the same time providing an easy overview for the human eyes that are busy elsewhere, building new parts of the system or developing new analytics apps and charts.
In slightly more technical terms – if this seems too technical, or your coffee is getting cold, simply skip ahead to the next header, ‘The business advantages of automation’ – typical challenges include tasks such as:
- When field names in a database – or the name of the database itself – change, the code that fetches the data and prepares it for the BI front end needs altering to reflect those changes, to avoid suddenly missing data or getting errors
- A new person needing access similar to a colleague’s access rights
- Because knowledge of the fields and their underlying calculations is buried in dusty lines of code written long ago, it can be unclear which numbers and fields are used in reports and the BI front end, and how measures are calculated
- Dealing with one of the most dreaded, tiresome, boring tasks within IT – writing documentation
…Just to mention a few of the challenges that most users of BI and Analytics come across from time to time.
In an automated solution, scripts are automatically kept up to date to reflect the correct names of fields and sources that have changed. When the system keeps track of security, you simply add a new person to the same security group as their colleague – no copying and pasting of access rights, no manual bookkeeping. The system provides an always up-to-date overview of who has access to every piece of data.
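The rename bookkeeping described above can be sketched in a few lines. This is a minimal illustration of the idea, not any vendor's actual implementation; the column, mapping and rename names are all invented:

```python
# Hypothetical sketch: reconcile a deployed field mapping with the live
# source schema after a column rename, so downstream extract scripts keep
# working. All names (cust_no, customer_number, etc.) are invented.

def reconcile_mapping(mapping, live_columns, renames):
    """Return (updated target->source mapping, fields needing human review).

    mapping:      {target_field: source_column} as last deployed
    live_columns: set of column names currently present in the source
    renames:      {old_column: new_column} detected or confirmed renames
    """
    updated, unresolved = {}, []
    for target, source in mapping.items():
        if source in live_columns:
            updated[target] = source            # column unchanged
        elif source in renames:
            updated[target] = renames[source]   # follow the rename
        else:
            unresolved.append(target)           # flag for human review
    return updated, unresolved

mapping = {"CustomerNo": "cust_no", "OrderDate": "order_dt"}
live = {"customer_number", "order_dt"}
new_mapping, unresolved = reconcile_mapping(
    mapping, live, {"cust_no": "customer_number"}
)
print(new_mapping)  # {'CustomerNo': 'customer_number', 'OrderDate': 'order_dt'}
print(unresolved)   # []
```

The point is not the code itself but the division of labour: the machine handles every rename it can resolve from metadata, and only the genuinely ambiguous cases reach a human.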
Measures and calculations are defined only once and reused throughout the entire BI solution, in every front end – Qlik, Power BI or Tableau – accessing the data warehouse. Automated data impact analysis and lineage can bring clarity about the origin of data without IT staff having to answer such requests by combing through hundreds of lines of code manually.
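Impact analysis of this kind boils down to simple metadata lookups. A minimal sketch – with invented measure, report and field names – of answering "what breaks if this source field changes?":

```python
# Hypothetical sketch: lineage metadata recorded once, queried for impact.
# Which source fields each measure reads, and which reports use each measure
# (all names invented for illustration).
measure_fields = {
    "Revenue":   {"sales.amount"},
    "AvgBasket": {"sales.amount", "sales.order_id"},
    "Headcount": {"hr.employee_id"},
}
report_measures = {
    "Monthly P&L":  {"Revenue"},
    "Store Review": {"Revenue", "AvgBasket"},
}

def impact_of(field):
    """Reports affected, via any measure, if this source field changes."""
    measures = {m for m, fields in measure_fields.items() if field in fields}
    return sorted(r for r, ms in report_measures.items() if ms & measures)

print(impact_of("sales.amount"))  # ['Monthly P&L', 'Store Review']
```

Because the measure is defined once and every front end reads the same definition, the same lookup answers the question for Qlik, Power BI and Tableau alike.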
Other benefits include the automatic, one-click creation of documentation that keeps track of which data goes where. Admittedly, you are then still stuck with the boring task of actually reading that documentation, but that’s another matter. At least you can satisfy various audit and compliance needs, like some of those included in GDPR.
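One-click documentation works the same way: the platform simply renders the metadata it already maintains. A hypothetical sketch, with invented table and field names:

```python
# Hypothetical sketch: generate documentation from the same metadata the
# platform already keeps (table sources and field mappings). All names
# are invented for illustration.
metadata = {
    "DimCustomer": {
        "source": "crm.customers",
        "fields": {"CustomerNo": "customer_number", "Name": "full_name"},
    },
}

def render_docs(meta):
    """Render a simple field-by-field mapping document from metadata."""
    lines = ["# Data warehouse documentation", ""]
    for table, info in sorted(meta.items()):
        lines.append(f"## {table} (from {info['source']})")
        for target, source in sorted(info["fields"].items()):
            lines.append(f"- `{target}` <- `{info['source']}.{source}`")
        lines.append("")
    return "\n".join(lines)

doc = render_docs(metadata)
print(doc)
```

Because the documentation is generated, not hand-written, it is never out of date with the system it describes – which is precisely what an audit wants to see.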
The business advantages of automation
Handing the tedious tasks over to an automated platform frees up your often scarce IT specialists, leaving them free to pursue more inspiring work: developing new ideas and integrating new data into an insightful data discovery solution.
I often tell people this:
“There is no such thing as collecting data too heavily, only understanding too slowly.”
By that I mean that IT must never be the limiting factor for a company to become more data-driven, to leverage all the great and valuable information it already has stored as data, or to use data as support for even better, more intelligent decisions. If there is any way that I can get faster access to the data my business needs, I am curious as to what that might be!
Rick van der Lans recently wrote an article about a unified data delivery platform. He writes:
“It must be able to hide for business users how and where data is stored, how it is copied, which technologies are used, whether data is integrated on-demand or on batch, and so on. In addition, it must be transparent enough to business users to determine how source data has been manipulated. A data delivery platform must be able to support a wide range of business users, ranging from users requiring governable and auditable reports, to users demanding a highly agile marketplace, and to data scientists who analyse raw data.”
In today’s modern data environment – where we have to incorporate enterprise data warehouses, modern data warehouses, data lakes and streaming data – we need a fast and agile way to handle both the ever-increasing amount of data and the growing number of sources that must be accessed.
To deliver data to the business, the analysts, the data scientists and the self-service BI front-end users, automation might be exactly the missing link in your BI and analytics solution – together with an architecture ready for future needs and demands: a unified data delivery platform, or a Discovery Hub if you like.
Automation offers to free up human resources, track changes and keep systems up to date, making maintenance easier and the addition of new data sources a lot faster and more flexible. That means more time for you and the people in your organisation to put data to use.
The goal, of course: put your company ahead of your competitors when it comes to business insight, by gaining access to data at a much faster rate.
What steps have you and your company taken towards a modern architecture, ready for the constantly increasing need for data throughout your organisation?
Majken Sander is a data nerd, business analyst and solution architect at TimeXtender. She is well known in industry circles as an influential industry executive, international speaker and accomplished data expert. Majken has worked in IT, management information, analytics, business intelligence and data warehousing for more than 20 years. She is a tech evangelist and often blogs on topics like the business value of data, data warehouse automation, GDPR, BI and analytics. http://blog.timextender.com/author/majken-sander @majsander
Copyright Majken Sander, Business Analyst & Solution Architect, TimeXtender