Learn what it takes to achieve the powerful reality of agile, governed self-service analytics with any BI tool, and with Cognos Analytics specifically. Expensive self-service BI implementations often serve as nothing more than a simple data extract tool that eventually feeds downstream processes into Excel. Whether you’re running Cognos, Power BI, Tableau or a combination, watch this on-demand webinar to get valuable information for achieving well-adopted self-service that delivers exceptional ROI.
Self-service analytics topics we address
- What to expect if self-service analytics lives up to its promise
- Best practices for achieving the goal of self-service
- Self-service requirements: discovery vs. gathering
- The value of the semantic layer/business logic to enabling self-service
- Pros and cons of the three main types of Cognos architecture
- IT-driven enterprise model
- End-user driven model
- Hybrid model
- Cognos self-service components
Presenter
Pedro Ining
Principal BI Analytics Architect
Senturus, Inc.
Pedro joined Senturus in 2010 and brings over 20 years of BI and data warehousing experience to his role. He has been instrumental in implementing data warehousing systems from scratch and has experienced the evolution of the BI industry through several iterations of BI products including Cognos, MicroStrategy and Tableau.
Machine transcript
Welcome to the latest installment of the Senturus Knowledge Series. Today we’ll be discussing how to successfully implement self-service analytics.
0:18
Before we get into the core of the presentation, some quick housekeeping items.
0:23
Please feel free to use the GoToWebinar control panel to help make the session interactive. We have the microphones muted out of consideration for our presenter.
0:32
We encourage you to enter questions through the GoToWebinar control panel. While we’re generally able to respond to your questions while the webinar is in progress, we save them for the end, so stick around for that. If we’re unable to cover a question during the Q and A session, we will provide a written response document that we’ll post on senturus.com, which leads us logically into the next slide.
0:56
And the question we get early and often throughout all these presentations: can I get a copy of today’s presentation? And the answer is an unqualified yes.
1:04
It is presently available on senturus.com. Go to the Resources tab and look at the Knowledge Center.
1:10
Alternatively, you can click the link that has been posted in the GoToWebinar control panel here.
1:15
And, while you’re there be sure to bookmark the Knowledge Center as it has tons of valuable content, including many other webinars such as this one, and other interesting information addressing a wide variety of business analytics topics.
1:30
Our agenda today, we’ll do some quick introductions and then we’ll get into the core of the presentation on self-service analytics.
1:37
We’ll define self-service analytics, go over some misconceptions, discuss some best practices, review some requirements at a high level, discuss Cognos architectures that support self-service analytics.
1:50
After that, please be sure to stick around through the very brief Senturus overview for some additional valuable, and generally entirely free, resources and the aforementioned Q and A. I’m pleased to be joined today by my colleague, Pedro Ining, Principal BI Analytics Architect here at Senturus.
2:11
Pedro has been with us for a while, since 2010. He has over 20 years of BI and data warehousing experience. He’s been instrumental in implementing data warehousing systems from scratch and has experienced the evolution of the BI industry through several iterations of BI products, including Cognos, MicroStrategy and Tableau.
2:31
As usual, we always want to take the pulse of our audience through our polls.
2:38
Our first poll today is: on which BI platforms does your organization do self-service reporting and dashboarding? This is a multi-select, so you can select all that apply.
2:51
Do you use Cognos for self-service reporting and dashboarding? Tableau? Power BI? Something else? Or not at all, you haven’t adopted self-service reporting?
3:00
We’ll give you a little time to answer.
3:05
You must be well caffeinated today, we’re nearly 80%, barely 30 seconds in.
3:11
Go ahead and check those boxes.
3:18
And, I’m going to go ahead and close that out and share the results back with you.
3:23
So, the preponderance of this audience, and this is a little surprising to me honestly, 85% are using Cognos for that. Then a third or so, closing in on 40%, evenly split between Tableau and Power BI.
3:35
Interestingly, 16% other, and only 5% not doing it, so, that’s interesting.
3:41
I guess it’s not surprising that self-service analytics is definitely the hot topic today.
3:52
The second question we have here is: what percent of your Cognos platform is used solely for traditional, canned reporting?
4:03
Please select one of the above: a quarter, half, three quarters, or all of it. Or you don’t have Cognos, or you don’t know.
4:12
Easy question.
4:21
Got about three quarters in here. I’ll give people a few more seconds to answer.
4:29
All right, great.
4:32
So, about half are using it 75% for traditional canned reporting.
4:42
Almost a quarter are using it 100% for canned reporting, and then a smaller percentage are using it half or less for canned reporting.
4:52
So those would be predominantly self-service, which flies a little bit in the face of the last question, or maybe I’m misinterpreting that.
5:00
But anyway, very interesting. So, thank you for sharing that.
5:04
And with that, I’m going to hand the microphone over to Pedro. Pedro, the floor’s yours.
5:13
We did a version of this webinar last year, and a high percentage of Cognos implementations are still doing a lot of canned reporting. I think that’s indicative of the legacy of Cognos, kind of where it came from. This webinar, obviously, is on self-service analytics.
5:36
And there’s been explosive growth over the last few years, 5, 10 years or so, in products coming out that really promote self-service analytics, self-service BI, do it yourself, your own way.
5:50
Products like Tableau and Power BI are making quite a lot of inroads: let’s give the users access to the data, do it your own way, create your own visualizations.
6:04
And even IBM Cognos, for many years now, has been called Cognos Analytics, throwing the analytics word in there too, because they have also morphed their product away from simply canned reporting (IT reporting, very structured, centrally maintained metadata) to more of a self-service model. And for those of you out there who have been using Cognos for quite a while and have not really explored the analytics portions of the product, a lot has changed. They’ve basically been keeping up with the Tableaus and Power BIs to make their product more self-service capable.
6:48
And we’ll go into that a little bit.
6:51
The focus of the first part of this presentation is to pull back and stay agnostic, away from the specifics of particular BI tools, and talk about self-service analytics in general.
7:03
Let’s look at the question: what is self-service analytics? There are a lot of different definitions out there.
7:10
I’m going to offer one. You might have your own definition, but let’s look at the one I have over here.
7:17
So, self-service analytics allows business users to access more data sources on their own, potentially model their own data, then create reports or dashboard visualizations with very little help from IT.
7:34
So, in the old paradigm, you typically have to have requirements. You have IT model the data. You give them samples of reports. They go away for a few months and come up with version one of the reports, you make them change that, and you’re kind of tied to the way the model works. Self-service analytics is actually pushing some of the modeling capabilities back into the user’s hands.
7:59
And IT, then, doesn’t have to be that report factory, that army of developers that has to create 10, 15 reports for each business unit, right?
8:10
The outcome is that this can lead to faster and more agile data analytics, as compared with your traditional BI development SDLC life cycle.
8:23
So, that’s one of the goals, and that’s the definition of self-service analytics.
8:31
And there’s definitely a promise of what these products are trying to offer you.
8:34
And what this whole paradigm is trying to offer you, in terms of the promise of self-service: giving business users direct access to data and reporting tools will remove the reporting burden from IT. That’s one of the promises, right? We’re going to give these tools out to users, and then IT will not have to be that report factory anymore.
8:58
Analytical decisions can happen more quickly, and obviously the whole point of this is to drive business growth. We don’t have a lot of time to create reports. We need to get answers to some very important business questions, which will make our decisions faster and more reliable, so that we can be a more profitable business.
9:17
The new modern BI tools will provide more visual, insightful, automated analysis, right? The Tableaus: very visual, cool graphs. We can get a lot of insight out of that, and it’s faster to create.
9:30
These are some of the promises of self-service, but then we have to ask the question: why do so many self-service BI efforts go wrong? This term, self-service analytics, has been around for quite a while. Initially, it was all about modeling the metadata, putting Cognos on top of it, and giving users Query Studio, Report Studio and so on.
9:56
Initially it was really just about running manual queries against a metadata layer, and now we have these newer, more visual, insightful tools.
10:07
But the implementations are still kind of hampered, and things definitely go wrong.
10:15
So there are some common myths and misconceptions around these tools and implementations that we need to go through.
10:22
Well, one of them is basically the idea that we’ll just install it.
10:27
We will build the platform out, we’ve already got a data warehouse, we’ll put Tableau on the desktops of 200 users, maybe stand Tableau Server up, we give it to the users and off they go. No. There’s a whole host of problems that people encounter from that idea, from that misconception.
10:51
The other one is that we eliminate the need for IT. So, the misconception is, again, we’ll just put these tools out there.
10:57
But in reality, no: data is complex, and complex data still needs to be modeled.
11:05
The BI tool is a doorway to access the data in the organization.
11:11
And sometimes, if you just give them complete raw access without a good semantic layer and governed data, they still won’t get the right answers. That’s number one.
11:22
Number two, they’ll still need to come back to IT. They may just throw their hands up in the air and say, IT, can you just create this for me? IT still has to support those tools.
11:32
Another misconception: a modern self-service BI tool will automatically be a successful project from a management perspective.
11:37
I’m going to spend a lot of money on these new tools, and it’s going to be successful, because of all the things we’ve talked about.
11:42
But if your foundational data layer, your data architecture is not strong, if your governance is not strong, putting those tools out there for users will not automatically lead you to a successful project.
11:58
The last misconception: users will automatically understand how to use the tool. There are a bunch of smart folks out there right now, with the democratization of analytical tools and the whole data science field, very smart people.
12:10
But there are also people who are just trying to get their job done, and assuming that your typical financial analyst can just be plopped into a tool and understand how to use it with minimal training is short-sighted. So change management will always be an issue, and we have to drive towards user training; they’re all critical, OK? These are just some of the misconceptions we have to clear up before we start our project, before we start implementing self-service analytics.
12:42
So, if you want to implement a self-service analytics project, a self-service BI tool, platform and infrastructure, there are best practices we’re going to run through that can definitely help you with that.
12:57
And the whole requirements area, we’ll talk a little bit more about that; I’m calling it more of a requirements discovery.
13:05
These are sessions working with users, not just to implement the technical aspects of the tool, but working with your end users, your business users, your business sponsors, to really understand what it means when they want access to the data and answers to their business questions. This is something you need to focus on as well.
13:26
Then we’ll pivot and talk specifically about some Cognos self-service architectures that are in play, that you may not know about or maybe you’ve heard about, on how we can implement self-service architectures on Cognos.
13:46
So, best practices. Number one, obviously, is data governance. Continued data governance is very important, and data warehouses are not going away.
13:57
There was a lot of time spent in the nineties and 2000s establishing very robust, rigorously governed data warehouses, where we finally nailed how to make a customer dimension, a product dimension usable for the entire organization, OK.
14:16
This curation of the key enterprise data, and governance, is going to be a critical component for continued data quality.
14:24
OK, Senturus actually has a webinar on this called Why Bother with Data Governance, and I suggest you look at that as a complete topic on its own.
14:34
But we can’t ignore that. We can’t ignore that at all.
14:37
We can’t have people just plop in tools, bypass data warehouses that have been thoroughly QA’d, go directly against the ERP, for example, and try to create their own custom dimension.
14:48
We have to intercept that process early on, and make sure that they’re pointing to the right area, because there’s been a lot of work in that area.
14:56
Requirements discovery.
14:57
We’re going to talk about requirements gathering techniques that have to go beyond the typical questions that IT likes to ask end users, you know, what data do you need? We need to have more of a discovery process.
15:10
Drilling into that, the semantic metadata layer is very important.
15:17
But from the perspective of self-service analytics, we need a layer that doesn’t overburden your users. We have to make efforts to build a semantic layer for users with commonly used metrics, things like that, things we’re going to find out from the discovery sessions. Simply serving up a fact table and dimensions and thinking you’ve created a semantic layer that is usable for your end users will not cut it.
15:45
So, we need to work in tandem with the business units to find out what that semantic layer should be, what would help them out.
15:54
Of course, training: capability training. There are different types of users, so you need different types of training for different types of users. This is critical.
16:04
You know, maybe a blanket training approach might satisfy some groups, but some groups will need very specific training. Maybe your early consumers will eventually graduate and become producers of more elaborate BI objects and dashboards as their capabilities increase.
16:26
So, we need to be able to bring the users along with good capability training.
16:32
Performance tuning. I have it up there because you could have the best BI tool in the world.
16:37
And if a user drags a measure onto a dashboard or report and it takes 45 seconds to render, you’re going to lose folks.
16:45
They’re going to say, this is too slow.
16:49
It’s too complicated.
16:51
They’re going to want to reach for something else, to go somewhere else. And I think out there in Cognos land you probably hear a lot of that, you know, this is too slow.
17:02
For whatever reason, the report I run is too slow, so I’m going to schedule that report, just dump it to Excel and continue my analytics in Excel, or extract that data out of the BI platform and put it into Power BI.
17:16
So, a lot of companies struggle with this, because a lot of people still use the Cognos platform as an extract tool. But even in Power BI, I’ve seen dashboards that do not perform well, and you start losing people, right?
17:34
My last bullet is really a recognition that, you know, self-service will handle a certain percentage of your organizational BI and reporting needs.
17:43
But you have to recognize that not all areas are suited for self-service. Examples: creating very specific report layouts for regulatory filings. You have to send something to the government, or customers want a PDF, 8.5 by 11, with all this data on it with certain font sizes.
18:05
Sometimes you’ve got to send it back to IT. You have to say, maybe this is beyond my capabilities, I don’t want to have to learn this.
18:14
I don’t want to learn how to write a PDF report that looks exactly like this; bring it back to IT development. So, there are going to be areas like that. Some reports are never really going to go away.
18:26
So, we have to know that self-service alone will not answer all our needs. Sometimes we have to fall back to traditional reporting.
18:36
So, requirements discovery versus gathering.
18:44
So, usually we get into a BI project and we want to gather user requirements for BI, even self-service, and it’s obviously an inherently difficult task. As data and BI professionals, we typically start requirement sessions by asking some typical questions, like: what kind of reports do you need?
19:07
Do you have samples of the reports that you can show me?
19:11
What data do you need? What are the measures you’ll need, and what are the hierarchies you’ll need?
19:15
Do you need dashboards? These are all important questions, and they usually lead a BI professional to jump into a more comfortable discussion of tables, and fields, and reports.
19:27
And it basically stems from the base question. We are the order takers, basically: what do you want?
19:36
Or what do you want me to do for you?
19:39
Really, the users don’t know, or don’t realize, what they don’t know. They don’t know what they want from an analytics system.
19:45
That in turn generates a lot of churn. You come up with reports, then you have to redo them, and all those kinds of things. So we need to rethink what we’re doing.
19:56
We want to discover requirements.
20:00
We want to determine and understand the client’s goals. We want to think of it from a global perspective.
20:04
We want to understand the current business process or workflow the user is doing, before we even get to the point of what measures you need or what kind of reports you need. We need to discuss existing pain points.
20:22
Those existing pain points might really show a lot of things that are important for a self-service tool.
20:33
What we want to do is consolidate this into a user story; we want to bring all these things together from a requirements perspective.
20:45
Determining and understanding the client’s goals: usually this is a discussion with a client that is independent of the BI tool’s features.
20:56
We wanted to understand what the goals are.
20:58
And oftentimes an end user’s goal is in lockstep with their job responsibilities. For example, a C-level executive may want to see a current snapshot of FTE headcount
21:14
and how it compares with last year. A staffing manager just wants to see headcount at a location and organization, and what’s experienced the most growth, because they need to plan and forecast future hiring. These are the kinds of discussions you’re going to have, maybe at a director level.
21:33
Management would like to analyze their current workforce, see exempt versus nonexempt, and see what parts of the organization are experiencing the most attrition.
21:44
You know, these are some of the goals that different types of users in your organization may want to reach through the analytics tool you’re going to be implementing.
21:57
Understanding the business process or workflow.
22:02
So usually, a user’s workflow starts with some sort of triggering event.
22:08
An event that requires them to seek more information.
22:13
Then it ends with a set of tasks they need to do. We need to analyze these workflows and find out the details.
22:19
What are the details of going from the triggering event, to finding out what the tasks are, what the workflow is, and then the goal, OK?
22:30
Maybe the triggering event is, they have to get a regulatory filing out.
22:35
Maybe they’re looking at a metric on a dashboard, or report that suddenly goes beyond a threshold.
22:40
You know, does a user review the report on a daily or weekly basis, and what are those steps that contribute to the user’s decision making?
22:52
Here, we might find where the friction is: which steps cause the most friction?
22:59
What do they do to reach the actual goal? Is the current process too reliant on canned reports?
23:06
You know, as you go through this, you might find that one of the friction points is that the current self-service tool goes only to a certain point, and I have to extract that data and bring it down to Excel.
23:21
And I have to put things together to ultimately produce the analytics object that is the goal of my next step.
23:29
So walking through that, not really talking too much about the BI tool but understanding their business process or workflow, will uncover quite a lot of things.
23:40
So understand your users’ issues with their current workflows, because that’s the important thing. Then, after you get to a certain point, create that user story based on your findings.
23:56
After you go through all that, this is a typical story that might come out of your findings. Here is a story: John is an HR analyst, for example, and he has been asked to analyze the workforce FTE headcount.
24:11
The company currently reviews the current workforce and wants to see how the FTE headcount compares to the same period last year. So there you go, here’s a comparison metric that he has to produce, OK? Management would like to see the tenure of the workforce. They’re also interested in seeing the breakdown by organization and location.
24:33
And then he gets to a certain point in this narrative where he’s been working. Say we’ve given him a self-service package in Cognos, the headcount package.
24:42
And he has sat down with some of the people from IT to understand the components of the package and how to query the package.
24:49
He knows enough to be able to extract data to Excel, because one of his sticking points, for example, is that he has never been able to calculate the change in FTE headcount year over year in Cognos. And he finds that task easier if he just extracts slices of the data to Excel for final reporting.
25:09
In addition, John has to integrate other datasets into his final reporting.
25:15
That data is not currently available in the headcount package.
25:18
Maybe that Excel file he gets from PeopleSoft has service dates, which relate back to figuring out the tenure of the workforce.
25:28
And that date field is not in the headcount package. He doesn’t want to go back to IT to figure out how to add it, but he can get an extract from PeopleSoft, and he has to merge those things together in Excel to come up with that final report.
25:42
That final report is then primarily made up of Excel sheets in a workbook.
25:47
So, what we find here is: here’s John, an analyst doing FTE headcount reporting with year-over-year metrics, and his current workflow uses Cognos.
25:59
He has the self-service package, but it’s used as an extract tool.
26:04
And, he completes reporting in Excel.
26:06
And, he’s got a variety of process pain points.
26:09
And the things that are missing are key calculations that are not available in the package, multiple extracts downloaded to Excel, and offline data that needs to be integrated into the final report.
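To make that pain point concrete, here is a minimal sketch of the offline merge John does by hand, expressed in Python with pandas. The file and column names are hypothetical, purely for illustration; the point is that this logic currently lives outside the governed platform.

```python
import pandas as pd

# Hypothetical extracts: one row per employee per reporting period from the
# Cognos headcount package, plus an offline PeopleSoft file with service dates.
headcount = pd.read_csv("headcount_extract.csv")          # year, month, employee_id, fte
service = pd.read_excel("peoplesoft_service_dates.xlsx")  # employee_id, service_date

# The manual merge: attach service dates to derive workforce tenure
merged = headcount.merge(service[["employee_id", "service_date"]],
                         on="employee_id", how="left")
merged["service_date"] = pd.to_datetime(merged["service_date"])
merged["tenure_years"] = (pd.Timestamp.today() - merged["service_date"]).dt.days / 365.25

# The missing calculation: FTE headcount change vs. the same period last year
fte = merged.groupby(["year", "month"], as_index=False)["fte"].sum()
fte["fte_last_year"] = fte.groupby("month")["fte"].shift(1)  # prior year, same month
fte["yoy_change"] = fte["fte"] - fte["fte_last_year"]
```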
26:22
So, that’s kind of an agnostic way of looking at things, in terms of trying to establish some requirements and understand some of the pain points: best practices of self-service analytics.
26:36
We’re now going to go into a little bit of the technology, specifically in Cognos, of how we can derive some self-service semantic layer architectures, OK?
26:54
So, Cognos 11. We are now at 11.2; the Cognos 11 series started with 11.0, then 11.1.
27:07
The last release of 11.1 was 11.1 release 7, and 11.2 was just recently released. I’m pretty sure a lot of folks haven’t moved that out to production yet, but every time they release something, there are new things evolving and being added to the toolset.
27:24
Some of the key Cognos 11 self-service enablers I’ve listed here, and the very first bullet is FM packages. Everybody knows FM modeling; it’s the classic modeling tool that most Cognos 10 users know, and in 11 it’s still there.
27:42
And I say classic, but it’s still useful, because we can still use those FM models and packages as a base to help enable self-service.
27:54
Even with the newer technologies, you’ve done a lot of work modeling those FM packages.
27:59
You’ve done a lot of work in modeling your data.
28:02
You know, we don’t really want to just throw those all away; they can be used to help you create self-service environments using the self-service enabled aspects of the 11 series product, such as the second bullet, data modules, which is a very end user centric modeling tool.
28:24
OK, it allows users to not have to use Framework Manager, but we can use Framework packages as a source for data modules and datasets.
28:33
Datasets allow users to extract subsets of data from Cognos and store them on Cognos without having to constantly query the database.
28:44
That technology basically also exists in Tableau, in Power BI with terminologies like data extracts and datasets, right?
28:54
And dashboards: enabling self-service, enabling user-defined dashboards. Users creating dashboards is getting easier with each release.
29:03
And, of course, reporting. Basic reporting is the workhorse of most analytical tool sets.
29:10
And I would argue that the reporting tool, since it’s been around for so long and has gone through a lot of iterations, is actually much better than the Tableau and Power BI versions of it, where those tools really focus on visualizations and self-service. Reporting has been Cognos’ bread and butter for many years. It’s a self-service tool, it’s so good, and it’s been around for so long.
29:38
So, what I want to do now is go through a couple of potential self-service models, thinking about it in terms of Cognos. The first one is the classic IT-driven enterprise model.
29:51
And there could be a lot of great cases to keep using this model in your environment, right? It’s IT-centric modeling, slow on the SDLC cycle, slow to make incremental changes. But if you have a very important set of models, it has to be done this way, it has to be very controlled, has to be very governed.
30:11
This is the classic model where you have your Framework Manager tool modeling against source databases, generally done by IT, publishing packages and publishing reports that end users will consume.
30:24
And sometimes, at the report development layer, maybe it is IT doing those reports with a minimal number of end users, or maybe there are end users doing self-service against those packages and creating their own reports. So it’s still an option for that particular case.
30:42
Now, we have a newer one: still IT driven, but with Cognos data modules, OK?
30:50
This would be basically a data module that’s modeled directly against your data warehouse or your online systems. IT still controls this, right?
31:03
But now IT can make use of a lot of the benefits of data modules, such as the relative time features, the built-in data cleansing, data integration with Excel files, and things like that. They can also use this IT-controlled data module as a source; the end users can link it into their own data modules.
31:25
So this basically takes the FM modeling out of the loop and goes direct.
31:32
And this is actually a nice feature: say I’ve got a brand new application database, we’ve not done an FM model on it, and it’s a new project. You can start right away with a data module going against a data server connection, against those tables directly. And you make that read only and IT controlled, so you still have that kind of governance around it.
31:58
On the far right of the spectrum is the end user driven model.
32:04
This allows your self-service analytics end users to take full control of the modeling aspects.
32:15
You have a data module here which is going against a direct data server connection, pulling in tables from the data warehouse directly. Or it could be an application database, and the end users are doing the modeling themselves, independent of IT, which is in charge of the databases, making sure they’re up, making sure the application is running. And I’ve seen use cases like this in organizations or business units where there’s somebody who has developed this application database over there, and they know those tables very well.
32:48
In fact, they know those database tables better than IT.
32:51
And maybe what they’ve been doing right now is just running manual SQL queries against this database, or pointing Excel directly at this database with no control, and doing their analytics, just bypassing any BI tool.
33:08
What we can do is have Cognos point to these databases, expose the data module capabilities to your end users, and let them do the modeling.
33:18
Then they can take advantage of that: they can create datasets over here, which can then be imported into other data modules, and they can also add their own files, offline data files, to the data module.
33:35
This is full end-user-driven modeling in this particular model here.
33:44
Now, the hybrid model, I like to call this the hybrid model.
33:48
This leverages the central, IT-maintained FM packages. We still have a need for these, OK?
33:56
And instead of just throwing these away, we might create data modules off of subject areas of the FM model over here.
34:06
We’ve got a lot of work invested here; we can link to the FM packages, bring those tables directly into the data modules just by linking to them, and maybe even leverage the security in the FM package.
34:19
They might have some very complicated row-level security that we need to use. And we can also integrate other sources of data here in the data module, through data server connections back to the database, OK?
34:33
So the end user here, if you’re doing end user modeling at the business unit level, would leverage these IT-maintained packages for that purpose in creating their own data module.
34:46
We could also use these FM packages to extract datasets.
34:52
And this is the concept where maybe the FM package has already modeled a very clean, QA’d product dimension, for example, OK. We can create a dataset from that; it’s offline and gets refreshed every day, and then the end user can use the data module, bring that in, and use it in their own analysis for self-service. And then, at the bottom, end users maintain their own uploaded files.
35:23
They bring those into data modules, bring in those product dimensions, and do their own modeling using their own data.
35:32
So, those are some of the models we can use in Cognos. And I’m going to go over to the Cognos environment right now, just to visualize some of the stuff I’ve been talking about.
35:48
I’m going to give you an example of a potential data module metadata layer and expand upon that. OK, so I’m going to go over here to My Content. I have a data module here, and I’m going to go ahead and create a dashboard.
36:12
OK, so this is a self-service example of a metadata layer, and it’s very simple for the purposes of demonstration. Here are my sales.
36:21
OK, I’ve got measures in here, and I can slice them by the invoice date, I can slice them by product.
36:30
I could slice it by sales locations, for example.
36:33
And if I drag in my sales total excluding tax, I drag that in here, and I get total sales, for example.
36:41
And I could also slice it by invoice date over here; maybe what I’m trying to do is look at a particular calendar year.
36:51
Look at 2016, and I’m done, OK? So, I’ve got, basically, the base measures.
36:56
I need to be able to do this kind of analysis, right? But then what happens is users will say, OK, that’s cool, but now how do I do, for example, year-over-year analysis? I want to be able to compare current month sales to current month last year, or year to date versus year to date last year.
37:20
So what we’ve done is expose the base measures over here, a very simple star schema with base measure metrics, but then end users have to go ahead and try to figure out how to calculate that themselves.
37:35
So, one of the things we want to do for a good semantic layer, like I alluded to before, is really work with the end users to find out whether they need to do those kinds of calculations and slices. And with Cognos,
37:49
what we can do is implement the relative time features built into data modules. So if I expand this particular metric, I’ve now exposed those slices to the end users: there is my year to date.
38:02
There are my current month sales.
38:05
Here are my same month sales last year.
38:10
It’s already built in.
38:12
Then, what a user can do: I’m going to go ahead and delete this.
38:19
And I’m going to bring in a new visualization here called the KPI Visualization.
38:26
And now I’m able to compare these two measures together.
38:31
So I’m going to bring in current month sales as a base value.
38:37
And then I’m going to compare that to same month last year as a target.
38:44
And right away, I can see that comparison, and I can go ahead and clean this up.
38:48
And this is like current month sales year over year.
38:58
And I’ve basically got what I want to display.
39:02
Now, the point of that is that it was just a couple of drags, right? These are already built in here. I didn’t have to calculate anything.
39:13
And this is just a small example of how we can make the metadata layer, the semantic layer, easier for people to use, just by implementing simple technologies like that, which is fairly easy to do now in Cognos data modules, OK?
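For readers who want a feel for what those prebuilt relative-time slices compute, here is a rough pandas equivalent. This is not Cognos’ actual implementation, and the fact table and column names are hypothetical; it just sketches the same current month, same month last year, and year-to-date comparisons.

```python
import pandas as pd

sales = pd.read_csv("sales_fact.csv", parse_dates=["invoice_date"])
today = pd.Timestamp.today()

def month_total(df: pd.DataFrame, year: int, month: int) -> float:
    """Sum of sales for one calendar month."""
    mask = (df["invoice_date"].dt.year == year) & (df["invoice_date"].dt.month == month)
    return df.loc[mask, "sales_total_excl_tax"].sum()

current_month = month_total(sales, today.year, today.month)
same_month_last_year = month_total(sales, today.year - 1, today.month)

# Year to date vs. the same span last year
ytd = sales[(sales["invoice_date"].dt.year == today.year) &
            (sales["invoice_date"] <= today)]["sales_total_excl_tax"].sum()
prior_ytd = sales[(sales["invoice_date"].dt.year == today.year - 1) &
                  (sales["invoice_date"] <= today - pd.DateOffset(years=1))]["sales_total_excl_tax"].sum()

print(f"Current month vs same month last year: {current_month:,.0f} / {same_month_last_year:,.0f}")
print(f"YTD vs prior YTD: {ytd:,.0f} / {prior_ytd:,.0f}")
```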
39:31
The other aspect of this was allowing users to bring in spreadsheets and offline data. So, I’m going to go ahead and minimize this a little bit.
39:44
Get out of my presentation here. Typically, we have a spreadsheet; I’m going to open up the spreadsheet, this is on my desktop.
39:53
And this is a spreadsheet that could be used by users, created by a very defined process. And, to be frank, a lot of times these Excel processes have been created over the years; they’re not going to go away.
40:11
People get the sales goals from the organization via email, via different planning systems. They put this together, and they need to be able to compare the sales goals to the sales.
40:21
Right, and I have this spreadsheet here on my desktop.
40:26
Before, in older systems, trying to get this data up into the data warehouse would take a long time to do.
40:34
So, one of the self-service aspects of Cognos is that we can now simply drop that spreadsheet into Cognos. So, I’m going to take this; I’m going to close this dashboard.
40:47
I’m going to take this and drop it on top of my Cognos environment. You can see it’s analyzing the sales goals.
40:55
OK, it brings the sales goals in, and where does it put it? It puts it in My Content, at the root folder over here: Sales Goals. Then all I’m going to do now is go to that data module that I had up.
41:13
And I’m going to bring that into my data module.
41:17
So, I’m going to say add new sources, and I’m going to go to My Content.
41:33
Let me find where it is. Hold on a second.
41:40
My Content, and here are my Sales Goals.
41:47
OK, and add new sources.
41:54
For some reason, it’s not showing up. Hold on a second, guys.
41:58
I’m going to go ahead and try again.
42:00
Try that again: sales goals. Also, by the way, these spreadsheets here, you could just directly query them with a dashboard. Let’s take a look at that and make sure that works. OK.
42:11
There’s my data on a dashboard, and I can run queries against it. Let me go ahead and close that out.
42:20
And go back into My Content, the self-service webinar data module, and say Add.
42:31
For some reason it’s not bringing it in.
42:32
I’m going to have to close this one out again. Live demos, these things happen.
42:38
Let’s do this again. We’ll try this one.
42:43
Add new sources. There we go. Hm, I’ll have to think about that one, OK.
42:48
I’m in the data module and bringing the sales goals in. Yeah, OK.
42:54
There are my sales goals now, right? So I’ve integrated spreadsheet data into my data module. I’m going to go ahead and save this.
43:09
OK, I’m going to go back to my content, and let’s do a dashboard off of that.
43:17
There are my sales goals, and there’s my sales total excluding tax.
43:23
Let’s go ahead and bring in the KPI widget, and remember, this is a spreadsheet now.
43:31
And I’m going to select that one.
43:35
And then, the sales goal as the target value.
43:38
There we have it, right? So, I brought in my spreadsheet.
43:41
Then I can actually join these against the different parts of the other data module and do my analysis.
43:47
So I’ve augmented an end user data module with offline spreadsheet data and enhanced it. The end user is now able to do this from a self-service perspective, versus extracting the data from the data module,
44:04
then bringing the spreadsheet data into another spreadsheet and combining the data together, or trying to get those sales goals back into the data warehouse itself. OK, why it didn’t work the other way, I think it’s a rights issue or a buffering issue, but that’s a live demo.
44:23
But basically, that’s what I’m trying to show you here: that aspect of the hybrid environment, where you can actually bring spreadsheets in.
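Conceptually, what the demo just did is a join between an uploaded, user-maintained goals file and governed warehouse actuals. Here is a minimal sketch of that join in pandas, with hypothetical file and column names, assuming the goals file is kept at a year/month grain:

```python
import pandas as pd

actuals = pd.read_csv("sales_fact.csv")    # governed warehouse data
goals = pd.read_excel("sales_goals.xlsx")  # user-maintained offline file

# Aggregate actuals to the grain of the goals file (year/month here)
monthly = (actuals.groupby(["year", "month"], as_index=False)
                  ["sales_total_excl_tax"].sum())

# The join the data module performs between the two sources
kpi = monthly.merge(goals, on=["year", "month"], how="left")

# The KPI widget's comparison: actual vs. goal
kpi["attainment_pct"] = 100 * kpi["sales_total_excl_tax"] / kpi["sales_goal"]
print(kpi[["year", "month", "sales_total_excl_tax", "sales_goal", "attainment_pct"]])
```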
44:32
So, let me go back to my presentation, though.
44:36
Let’s see where we are.
44:38
We are here.
44:44
Yeah.
44:47
OK, so in this example here, those are the spreadsheets I brought in. I brought them into the data module, self-service analytics, and then also combined them with data in another data module.
45:01
So, in summary, self-service BI implementations really require a tight partnership with your business sponsors, OK? Involve your business sponsors early in the game.
45:16
If you bring them in through all the requirements gathering and understand their goals, their workflows and their processes, you will have a good understanding of what their real needs are.
45:33
And a snazzy BI tool does not mean automatic success. Remember that, OK.
45:38
So, a good tool that is implemented properly will be a good foundation.
45:43
If you have the foundation there, then you will have a very good start on a self-service analytics platform.
45:50
So, we have the technology side and we have our business partnership. You work the two together, not in isolation, not in silos. You put those two together, and then you’ll have a successful, or at least a good start on a successful, foundation for self-service analytics.
46:09
I’m going to turn it back over to Mike.
46:12
Good stuff, Pedro, and thanks for throwing that little glitch in there. Just so everybody knows, that was a real glitch. Yeah. That’s right.
46:22
Great, so thanks for that content. Stick around, we have some good questions in the questions pane.
46:34
If you have other questions, please get those in while we go through this other quick information.
46:40
At Senturus, this is what we do all day, every day. We get out of bed, put our pants on (or really our jammies these days, I guess), and we do analytics, right? We help people realize self-service analytics.
46:54
To that end, we have a lot of different resources, some of which are related to this: a couple of webinars on Framework Manager versus data modules, a blog on Framework Manager versus data modules, and data module architectures and use cases.
47:12
And there’s one on the data modules’ new capabilities, and that’s just the tip of the iceberg, right?
47:16
There are hundreds of webinars at this point, upwards of 200 or 300.
47:22
So, definitely head on over to our Knowledge Center and check those links out.
47:26
Likewise, on the next slide, how can we help you with this stuff? A lot of times, you know the basics of this can be straightforward.
47:34
But really getting to self-service is a multi-faceted, complex process, and this is something that we help people do all day, every day.
47:44
So, if you want to talk about it and discuss how Senturus might be able to help you, please go ahead and check out that link there, where we discuss some of those things and our framework. Or, if you want to talk to one of us, you can reach out to us at info@senturus.com or call 888-601-6010.
48:08
At Senturus, we concentrate our expertise on modern business intelligence with a depth of knowledge across the entire BI stack.
48:17
Our clients know us for providing clarity from the chaos of complex business requirements, disparate data sources, and constantly moving targets.
48:27
We have made a name for ourselves, because of our strength, bridging the gap between IT and business users, delivering solutions that give our clients access to reliable analysis ready data across their organizations, so they can quickly and easily get answers at the point of impact in the form of the decisions they make, and the actions they take.
48:45
Our consultants are leading experts in the field of analytics.
48:48
Folks like Pedro, with years and years of pragmatic, real-world experience, and experience advancing the state-of-the-art, we’re so confident in both our team and our methodology that we back our projects with an industry unique 100% money back guarantee.
49:04
Likewise, we’ve been doing this for a long time, over 20 years, now, 3500 plus clients, and 3000 successful projects.
49:12
We’ve worked across the spectrum from the Fortune 500 to the mid-market. No doubt you’ll recognize nearly all of those logos on the slide there.
49:22
We solve business problems across virtually every industry and functional area, including the office of finance, sales and marketing, manufacturing, operations, HR, and IT.
49:31
Our team is both large enough to meet all of your business analytics needs, but small enough to provide personalized attention.
49:39
If you like what you hear and think you might be cut from the same cloth, you can look into joining the Senturus team. We’re currently hiring talented and experienced professionals; you can see the job titles there.
49:51
You can drill into those a little more at senturus.com, at the link below, and send your resume to jobs@senturus.com.
50:00
Again, another invitation to expand your knowledge. We have hundreds of free resources on our website, from our webinars, on all things, BI to our fabulous, up to the minute, easily consumable blog.
50:11
You can find that again over at Senturus.com/senturus-resources
50:15
Our next upcoming event is Power BI Enterprise Deployment.
50:21
You can, again, go over to senturus.com, to our events page, and register for that. That will be on Thursday, June 24th, so in just a couple of weeks, at the usual time and channel.
50:34
And then, finally, I’d be remiss if we didn’t talk about our complete BI training offerings across the three major platforms.
50:40
We support Microsoft, Tableau and IBM Cognos.
50:45
We offer all the modalities, from tailored, instructor-led group sessions to
50:51
small group mentoring, to instructor-led online courses and self-paced e-learning. We’re ideal for organizations that are running multiples of these platforms, or who are moving from one to another.
51:02
We can provide training in these different modes, and can mix and match those, to meet the needs of your organization.
51:09
And the last slide before we get to the Q and A: we provide hundreds of free resources on our website, as I mentioned a little earlier, and we’ve been committed to sharing our BI expertise for over a decade.
51:21
With that, we’re going to jump over to the Q and A, if you want to go to that slide there, Pedro. I don’t know if you’ve had a chance to look at any of those, but there are a lot of questions pertaining to data modules.
51:31
First of all, how does the security work for data modules in terms of object security, and who will control the security for those data modules?
51:39
Right, OK. So, that’s several questions rolled into one.
51:45
So, basically, within data modules you can secure relational data sources, for example.
51:55
If you want to secure a dimension and have certain groups only see certain regions, there are ways to do that through the Cognos namespace security groups. I would have to say, though, that you don’t have the complete, full-fledged row-level security feature set of Framework Manager.
52:20
Right, in Framework Manager you could have security tables; you could have very complex security requirements enabled there through macro substitutions and parameter maps. That kind of stuff is not quite there yet in data modules. In fact, when we did the webinar on data modules versus Framework Manager, it was not there.
52:41
At a very minimum, you can secure the data module itself so it’s not seen or shown to other people. Inside the data module you can secure levels of a dimension through Cognos security groups, and this is all there for relational sources.
52:56
One thing that is missing within the data module technology right now is securing objects. So, for example, I’ve got a product dimension table and I don’t want a certain group to see it. I cannot secure that product dimension table in the data module so that when group A logs into the data module they see it, and when group B logs into the data module they don’t. That’s not there yet. So we’re hoping some of those things get changed; I’m sure work is already underway to enhance some of the more elaborate security features
53:29
in the data modules. But if you use Framework Manager packages as a source, for example, a package that has embedded row-level security, and you link a table in a data module to the package, it will respect the security of that framework package. That’s one example of where you’d use your framework package as a source for a data module: one of the main reasons is that you want it to respect the security of that framework package.
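To illustrate the row-level security pattern Pedro describes, a security table that maps each user to the slices of data they may see, here is a conceptual sketch in Python. This is not Cognos or Framework Manager code, and the table and column names are hypothetical; in Framework Manager the equivalent filter is wired in with macros and parameter maps.

```python
import pandas as pd

# Hypothetical fact table and security table; the security table maps each
# user to the regions they are entitled to see.
fact = pd.read_csv("sales_fact.csv")                # includes a "region" column
security = pd.read_csv("user_region_security.csv")  # columns: user_id, region

def rows_for_user(user_id: str) -> pd.DataFrame:
    """Filter the fact table down to the regions this user may see."""
    allowed = security.loc[security["user_id"] == user_id, "region"]
    return fact[fact["region"].isin(allowed)]

# Every query this user runs is silently constrained to their regions
print(rows_for_user("jsmith")["sales_total_excl_tax"].sum())
```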
53:57
Thanks, Pedro. There’s kind of a lot to unpack in that one.
54:00
Along those lines, when you do bring an FM package into a data module, is it a reference or a copy?
54:07
It’s a link, it’s a reference to it, right? So when you bring it in, it looks like one package, and then you can expand it and link to the actual tables. So it’s not a copy; we’re not copying data into the data module. It’s a reference to it, and when you expose it in the data module and run a query against that table,
54:29
it’s going to run the query through the FM package
54:33
And then back to the database, it’s going to do it that way.
54:37
Got it.
54:37
Can you speak to the level of skills or expertise required, from the end user point of view, to handle data modules? What does it take to create and, I guess, maintain those?
54:54
Yeah, so it can vary. I’ve seen use cases where they don’t even go to databases, because there are a lot of spreadsheets, for example, that contain a lot of good data. And I saw a use case where they were using a spreadsheet, creating graphs on the spreadsheet, and emailing the workbook around, right?
55:14
And this is a case where the user didn’t need to have a lot of knowledge.
55:17
But they were able to, like I showed you, copy that spreadsheet into Cognos and create a data module off of that spreadsheet.
55:26
It simply exposed the spreadsheet, because it had all the metrics and dimensionality already there. And then he was able to create very nice dashboards off that very simplistic data module, right? The data module only contains one spreadsheet.
55:38
And he was able to do that. And as capabilities and training grow, they can take that data module and add maybe a dimension that’s not there; the spreadsheet has a product ID, for example, but doesn’t have all the other dimensionality.
55:52
Well, then you can link that data module to your data warehouse, bring in the product table, and join it inside the data module to get that kind of roll-up capability there.
56:05
So, definitely training.
56:08
There are some simplistic use cases, and as the end user’s knowledge grows, they’ll keep expanding upon that.
56:16
And data modules can feed off data modules. Data modules can also feed
56:22
off of datasets, which we didn’t talk a lot about, and which can be created by end users. So it can be as narrow or as broad as the end user wants to get, in terms of using all the capabilities.
56:36
Yeah, and data modules were, I think it’s safe to say, definitely born out of a response to the market presence of products like Tableau, Qlik and Power BI that really put that in the hands of end users. So it’s designed to be pretty easy to do the sort of basic stuff that Pedro was talking about, and it’s certainly, I would say, significantly easier
57:05
than, say, Framework Manager, right? So it’s definitely meant to be easier to use: easy for the basic stuff, but with training needed to take advantage of the more sophisticated capabilities of the product, which are definitely evolving rapidly over time.
57:24
Can you bring in single or multiple packages and join them within data modules, and/or with Excel spreadsheets?
57:30
You obviously demonstrated you could bring in an Excel spreadsheet, but what about multiples of other things?
57:37
I mean, you can. And I would lead that into another discussion.
57:42
I think we have a couple of webinars on that that I really liked.
57:46
Yeah, datasets, where we can leverage this: you have to look at the use case. Maybe the data in the FM package is very large, for example, but you’ve done your analytics and you only need a slice of that fact table or that dimension that’s embedded in the FM package, and that slice is relatively small. So, for example, you might have a billion-row invoice line table, or whatever, but you only need the last six months, or the last year at a higher level, right?
58:17
You can create a dataset off that FM package, it gets stored offline, then bring that into your data module, and you’ll get very good performance off that dataset.
58:28
Use the FM package as the source for a dataset that’s been narrowed and vertically shrunken, because you don’t need access to all that data, right?
58:40
You’re not creating a very large report, you’re doing analytics, right?
58:44
I would recommend creating a dataset off that package, maybe off that dimension, and bringing that into your data module; then you’ll be able to leverage the high-performing nature of datasets and the in-memory capabilities of Cognos.
58:56
That’s one example of doing that. But yes, you can bring multiple packages in and link to them; you’ve just got to remember that the lowest common denominator of performance is how fast that package performs.
59:08
If it’s performing poorly with reports because it’s going against very large tables, it’s not going to magically perform faster in the data module.
59:16
And you have to think about your analytical use case. Maybe, like I said, you don’t need all that data. In all the reports you’re doing, are you really filtering things down? If you don’t need access to all the data, you just take a slice, a very good candidate for a dataset to be put into a data module.
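That narrow-and-shrink idea behind a good dataset candidate can be sketched like this, with hypothetical table and column names again: keep only the recent slice and roll it up to a coarser grain before handing it to self-service users. In practice this aggregation would run in the database when the dataset is refreshed, not in pandas.

```python
import pandas as pd

# Hypothetical billion-row invoice line extract
invoices = pd.read_csv("invoice_lines.csv", parse_dates=["invoice_date"])

# Keep only the last six months...
cutoff = pd.Timestamp.today() - pd.DateOffset(months=6)
recent = invoices[invoices["invoice_date"] >= cutoff]

# ...and roll up to product/month grain so the extract stays small and fast
dataset = (recent
           .assign(month=recent["invoice_date"].dt.to_period("M").astype(str))
           .groupby(["product_id", "month"], as_index=False)["amount"]
           .sum())

# The offline, periodically refreshed extract the data module would query
dataset.to_csv("invoice_dataset.csv", index=False)
```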
59:34
I think you answered the next question, which was about what the performance is like, right?
59:39
The performance of the package is really going to manifest itself in the data module, and then if you start adding things to it, you can probably expect potential degradation, depending on how well you model that source.
59:53
Correct.
59:54
All right.
59:56
Someone asked about webinars about datasets.
59:58
I actually don’t think we’ve done any webinars specifically on datasets, but again, you can go over there.
1:00:04
And it’s very searchable; check it out, there’s a lot of good information.
1:00:08
We’ll put that on our radar as something to do down the road.
1:00:13
Good question. I mean, we did talk about it a little bit in some of the data module webinars, the data modules versus Framework Manager one.
1:00:18
The last release, 11.1 release 7, made a great change in the dataset editor, which could be a potential candidate for a webinar.
1:00:27
So, good topic.
1:00:30
Alright.
1:00:31
Well, we are at the top of the hour, and I want to be respectful of everyone’s time here.
1:00:38
First of all, Pedro, if you can advance to the last slide, I want to thank our presenter, Mr. Ining, for a fabulous presentation on enabling self-service, and thank you, our audience, for joining us and taking an hour of your valuable time out of your day.
1:00:53
And if there’s anything we can help you with, business analytics wise, please feel free to reach us at the email you see at the bottom there, info@senturus.com. You can still pick up a phone, and there’s the 888 number.
1:01:03
I know Mr. Felten put a link there in the chat if you want to speak to him directly about any needs you might have.
1:01:11
And thank you for joining us today. We look forward to seeing you at the next Senturus Knowledge Series event. Thanks, and have a great rest of your day.