If you aren’t all that familiar with the data modules capability in Cognos Analytics, you are doing yourself and your data a disservice. Allowing you to connect to and combine the world of your data, data modeling in Cognos Analytics opens the door to self-service analytics and a world of deeper insights. With software release 11.1, Cognos includes some major—dare we say huuuuge—improvements to its data modeling capabilities. New features and capability tweaks boost organizations’ ability to create self-service content.
Watch this on-demand webinar to find out what’s to love about data modules. Our in-house Cognos expert, and one of our most sought after Cognos instructors, September Clementin shares her perspective, discusses intent-driven modeling, calculations and data security, and demos uploading files, creating custom tables, unions, joins and content organization.
Presenter
September Clementin
Cognos Consultant & Trainer
Senturus, Inc.
September has over 15 years of expert level experience with the Cognos suite. She has helped 50+ public and private sector clients with large scale business intelligence implementations including dashboard design, interactive performance metrics development and data modeling. She designs business intelligence course curriculum, delivers 20+ advanced BI reporting courses a year, develops BI best practice webinars and provides mentoring services for other BI professionals.
Machine transcript
0:15
Greetings, everyone, and welcome to the latest installment of the Senturus Knowledge Series. Today, I’m pleased to be presenting to you on the new capabilities of Cognos data modules, which will feature demos of features that help boost self-service analytics in Cognos. First, a couple of housekeeping items before we jump into the main content. The GoToWebinar control panel that you see on your screen can be minimized or restored using the orange arrow. Everyone’s microphones are muted so we can hear the presenters clearly and don’t have any interruptions, but we do allow and encourage you to submit questions via the Questions pane.
1:00
As shown here in this slide.
1:04
One of the first questions we get, and get repeatedly throughout the presentation, is, can I get the presentation slide deck? And the answer to that is an unqualified yes. It will be available on senturus.com, where you can find it on the Resources tab, at the URL posted here. And we also put the link in the GoToWebinar control panel, so you can access it there.
1:30
Our agenda: today we’ll do some brief introductions, and then we’ll get into the content, which features data modules, including an overview.
1:42
We’ll cover the new features, and we’ll do a demo. September, if you could advance the slide, please. After that, we encourage you to stick around.
1:50
We will do an overview of Senturus, as well as additional resources, and then we wrap up with Q&A.
1:58
So, today, our presenters. I am pleased to be joined by September Clementin, who is one of our Senior Cognos Consultants and Trainers, who has over 15 years of expert-level experience on the Cognos suite. She’s helped well over 50 public and private sector clients with large-scale BI implementations, including dashboard design, interactive performance metrics development, and data modeling. She is also pivotal and instrumental in designing our business intelligence course curricula, delivers over 20 different advanced BI reporting courses a year, develops BI best practice webinars like this one, and provides mentoring services for other BI professionals. She proclaims that one of her favorite things to do is to tackle the hard problems.
2:45
So if you’ve got a tough nut to crack in your organization, September is the one you want in your corner.
2:51
My name is Mike Winehouse. I wear a number of hats here, ranging from running our training practice to our Tableau practice and serving in a product management role for our analytics connector, as well as being an emcee for our webinars.
3:06
So, as those of you who’ve been on many of our webinars can attest, we’d like to get a finger on the pulse of our attendees.
3:16
So, we’ve got many of you here today, and we’d like to understand what you’re using in your organization here. And the first question is, what data sources do you use in Cognos? I’m going to launch the poll and ask you all to select all that apply. Are you using packages?
3:35
Are you using data modules and or are you using datasets? So we’ll give you guys a minute to respond there.
3:45
And I feel a little like an auctioneer here.
3:49
We’re at about 70% responding.
3:52
It looks like we have about 170 people online here, so good sizable audience.
3:57
This would be a nice dataset, about 83% of you here.
4:00
I’m going to close this out and show you the results. So, not surprisingly, pretty much everyone is using packages, but a good almost half of you are using modules, and that’s a little surprising to me. And that’s great that well over a third of you, almost the same number, are using datasets, which also tells me you’re probably using a more recent version of Cognos. Great. So, we’ll try not to completely poll you to death here, but we do have a couple other polls, and I’m going to ask one now. We’re interested in finding out, are you or anyone in your organization planning on attending the IBM Data and AI Forum in Miami this October?
4:43
So if you could let us know that would be great. We asked the question, because we will actually be there. We encourage you to come by and visit us. We’re trying to get a finger on the pulse of how many people are actually going to be at this event. So we’ll give this just a couple more seconds, and then we’ll get into the content today.
5:02
All right, great.
5:02
You guys are quick on the draw today, So, I’m going to go close this poll out and share it with you all, so, two thirds, you guys aren’t going, 13% are, and another 20% or so don’t know.
5:19
All right, fabulous, great. Thank you so much for your participation there. And so, on the next slide, we’ll get into the meat of the presentation. I’ll now hand the microphone over to September. September, the floor is yours.
5:34
Thank you, Mike. Thank you all for joining today’s IBM Cognos Analytics data modules webinar. So, while data modules have been available since the very first release of Cognos Analytics, with the most recent 11.1 releases the new features and enhancements really provide organizations with the ability to take that self-service content creation to the next level.
6:02
Today, we are going to first understand what data modules are, if you’re new to those, then review some of the new and enhanced capabilities, and we’ll see a demonstration of some of those exciting new features in action. We’re going to upload several local files, learn how to create unions and joins, and experience some of the new data module organization features, all through the web browser.
6:28
So, first off, what are data modules in Cognos Analytics? Data modules are data source objects that encourage users to perform self-service analytics through the ability to combine data from multiple data sources in a user-friendly interface, empowering a simple, immediate, and shareable data integration solution.
6:52
Data modules can be based on a combination of existing data modules, database connections, and local files, which would include Microsoft Excel spreadsheets and text files that contain comma-, tab-, semicolon-, and pipe-separated values. They can contain datasets, or subsets of data from existing packages, as well as live connections to existing relational Framework Manager packages.
7:21
Join relationships, cardinality, and join filters can also be easily managed by business users using non-technical terms and cues.
7:32
So, exciting new features include a redesign of the data module interface, which includes the ability to filter by data source type and most recent modification timeframe.
7:43
Also, intent-driven modeling. This allows users to provide key terms, with smart type-ahead, metadata-based suggestions, and combines those with the underlying source metadata to propose tables to include in the data module.
7:59
When possible, the proposal is a star or snowflake of tables; if that isn’t appropriate, then a single table or perhaps a collection of tables is proposed.
8:09
Another exciting enhancement is the ability to create custom tables through a variety of table operations, including creating table views, joins, unions, intersections, and exceptions, and we’re going to see a demonstration of some of these here shortly.
8:28
The new metadata organization and customization options are also a great enhancement for those of you who might be familiar with Tableau or other business intelligence tools. One of the nice features of some of those tools is the ability to organize your metadata according to your own business needs. Prior to the newest of the Cognos Analytics releases, this was something that could only be done in Framework Manager.
8:52
But now with data modules, you can organize your data items in folders, which not only enables a more user-friendly data source if, for instance, you have a large number of data items that users would typically need to scroll through to find desired elements, but is also incredibly helpful in creating and maintaining a single data module that can accommodate varied reporting needs and audiences.
9:18
Properties such as usage, aggregation, geographic properties, and default formatting, can also be defined for your data items.
9:30
Another great enhancement is the calculation editor. Custom calculations can be created at the data module and now also at the custom table level. So, simple calculations can be created by just selecting the items to include and choosing the calculation type.
9:46
When you’re using the calculation editor, you can drag and drop or double-click columns
9:52
in your data module tree to add them to the expression editor. To enter a function, you can simply type the first character of the function and choose from a drop-down list of selections.
10:05
Data module row-level security is another exciting enhancement. This feature is available when your data module sources are database tables at the moment, and security can be assigned at the user, group, or role level.
10:22
A few tips that will hopefully prove helpful when you are in your data module planning stage, or even if you’re troubleshooting when you get some sort of unexpected result.
10:33
Data granularity. Granularity differences between data module sources may need to be addressed, and granularity challenges, of course, are not specific to data modules. You often have to find solutions to granularity differences when you’re crossing namespaces within a single Cognos FM package in a report.
10:55
Same thing in most BI tools. Same thing in, for instance, Tableau: when you’re using joins or data blends, you might need to create table calculations, level of detail calculations, or some other technique to address those differences.
11:08
Within the data module, what you might try would be creating table views, altering your measure aggregation settings, or perhaps creating calculations that will address those differences. And, of course, granularity differences can often be addressed by creating unions, joins, or calculations directly in the report.
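For anyone who wants to see the general idea outside of Cognos, here is a rough, purely illustrative pandas sketch of handling a grain mismatch by rolling detail rows up to the coarser grain before combining; the table and column names are hypothetical, not objects from this demo.

import pandas as pd

# Order-detail grain: one row per order detail line (hypothetical data).
sales = pd.DataFrame({
    "Order number": [100, 100, 101],
    "Product line": ["Camping Equipment", "Golf Equipment", "Camping Equipment"],
    "Revenue": [250.0, 300.0, 125.0],
})

# Order grain: one row per order, coarser than the sales detail.
forecast = pd.DataFrame({
    "Order number": [100, 101],
    "Forecast revenue": [500.0, 150.0],
})

# Roll the detail up to the coarser grain first, then join. Joining the two
# grains directly would repeat each forecast value on every detail row and
# inflate any totals taken over the combined table.
sales_by_order = sales.groupby("Order number", as_index=False)["Revenue"].sum()
combined = sales_by_order.merge(forecast, on="Order number", how="left")
print(combined)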
11:29
Another tip: data format must be consistent between fields used for joins, and between all columns when you are creating a union between tables. And, of course, more complex modeling requirements, like complex security requirements or parameter maps, versioning, those sorts of things, may still be best suited for Framework Manager.
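As a purely illustrative aside (pandas again, not Cognos, and the values are made up), this is the kind of mismatch that tip is warning about: a key stored as text on one side and as a number on the other will not join until the types are aligned.

import pandas as pd

orders = pd.DataFrame({"Order detail code": ["1001", "1002"], "Revenue": [250.0, 300.0]})
profit = pd.DataFrame({"Order detail code": [1001, 1002], "Gross profit": [90.0, 110.0]})

# Merging as-is raises an error because the key is text on one side and an
# integer on the other; align the types first, and then the join works.
profit["Order detail code"] = profit["Order detail code"].astype(str)
merged = orders.merge(profit, on="Order detail code", how="inner")
print(merged)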
11:54
Let’s go ahead and move into the live demonstration portion of today’s webinar and see some of these exciting new features in action.
12:08
So, this is our IBM Cognos Analytics homepage. Today, I will be working with the GO Sales dataset, which comes with every install of Cognos Analytics.
12:18
I’ve already created a few Excel files using the dataset, just to demonstrate uploading those local files and bringing them together in a couple of different ways in a data module.
12:28
So our use case today is that we receive sales data from our three regions, our Americas, Asia Pacific, and Europe regions, in separate files. Right, it’s not uncommon that you need to create unions between datasets like these, and we need to create reports that combine data from all regions. So first off, I’m going to upload these files. In the bottom left-hand corner here, I’m clicking New, and Upload files.
12:58
I’ll select my three files for Americas, Asia Pacific, and Europe, and click Open.
13:04
And so as those files are being read, you can also click on your details here to see the progress. You can click view details for each of those, and figure out what phase it’s in, if it’s in discovering, analyzing, or if it has been successfully uploaded.
13:23
And you can actually expand all of these to see. You can see some of them have already been uploaded. Now, they’ve all been uploaded successfully. So I’ll go ahead and click close.
13:31
I can see these in my recent list of items here in the middle of the interface. But by default, your uploaded files will go to your my content folder. And so if I click on my content, I can just verify that those uploaded files are there under my content.
13:52
So, next, we’re going to combine these three files, in a data module to enable Reporting across our regions.
14:00
Again, bottom left-hand corner, I’m clicking on new, and selecting Data Module.
14:09
And by default, here, I’m on my Content folder, but I can also go to my team content, if my sources existed there as well.
14:18
I’m going to stay here in My content, and just Ctrl+click all three of the sources, and click OK.
14:30
And so now, here on the left, I can see my three sources. If I click first just on Americas, right here in the middle of the interface I am looking at my grid view, so I can actually see the columns that exist in that underlying Excel file.
14:46
I confirm that I’m seeing what I think I should be, and I want to combine these into a new custom table.
14:54
So, I’m going to Ctrl+click all three of these tables, click the More tool, and select Create new table.
15:04
So, now, some of these options are grayed out; Cognos will determine, based on what you’ve selected, which are appropriate. The ones that are grayed out are creating a copy, creating a join, an intersection, or an exception. The two that are available for selection are creating a view and creating a union, and those are the two that make sense based on what we’ve selected.
15:26
And we will choose Create a Union of Tables, and click Next.
15:33
And now, I’ll just name this new table all regions.
15:43
And click Finish.
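Conceptually, the new table is just the three regional files stacked on top of one another. For readers following along outside Cognos, a rough pandas sketch of the same union might look like this; the file names are hypothetical stand-ins for the uploads in the demo.

import pandas as pd

# Hypothetical regional files with identical column layouts.
region_files = ["Americas.xlsx", "Asia Pacific.xlsx", "Europe.xlsx"]
frames = [pd.read_excel(path) for path in region_files]

# A union requires the same columns (and compatible types) in every input;
# the rows then simply stack into one combined table.
all_regions = pd.concat(frames, ignore_index=True)
print(all_regions["Sales region"].unique())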
15:48
And so before adding on, I just want to confirm that things look good. So I’m going to hide the three tables here. I don’t want to confuse my end users by having these separate tables along with all regions, so I’m just going to choose to hide them. And then in the upper right-hand corner, I’m clicking the Try it tool, just for testing the data module to make sure that things are working correctly.
16:17
And it looks very much like the report authoring mode. Right, I have my insertable objects pane on the right, my Properties pane on the left, I can switch between different ways of viewing the page, pretty much the same features that I see when I am in report authoring.
16:35
I’m going to expand my all regions and double click on sales region, and just confirm that I do, in fact, have all of those regions. And I do. I see Americas, Asia Pacific, and then my two sales regions that were in that Europe file as well.
16:51
I’ll also double-click Product line to add that to the list, and Revenue.
17:05
Like that again. And so, just as I would in a standard report, I can group my columns.
17:12
I could add some summary values, maybe just to do some checking, to make sure that that union did, in fact, happen correctly and my values are a match. I can even adjust my data format if that’s helpful.
17:26
All right, so if I go to my Data format property, just as when I’m authoring, I can choose a format type. Perhaps I want currency, and I don’t want to see any decimal places. All of these sorts of things are available when you are in the Try it mode.
17:42
Notice, though, in the upper left-hand corner, I do not have a Save icon as I would in report authoring. So this Try it tool is really just for data module testing. So if we wanted that revenue data item, for instance, to already be formatted as currency with zero decimals for end users, that is something that we can build in as a default in the data module. Now, building in those defaults does not alter the underlying data. We’re not rounding or anything like that. It would just be the default way that our revenue would be formatted in a report; an end user could, of course, change that and include decimals, and so forth.
18:22
So I’m going to go ahead and Close out of this tab.
18:27
And let’s add on a bit. I’m going to go ahead and save what we have so far, here in the upper left-hand corner. I’ll do a Save As and save in my Content. I’ll save this as Sales Data module.
18:46
So, next, we’re going to explore some of the new data module organization features. We can create folders, modify data items’ sort order, and set default data formats to enhance the end user reporting experience as well.
19:02
I’m going to select my all regions table, click the More tool, and choose Create new folder.
19:13
I’ll create a folder called Measures.
19:17
I’ll choose Quantity and Revenue, my only two measures, and just drag those right into the Measures folder.
19:25
I’ll create a second folder under all Regions.
19:31
And name this dimensions.
19:36
And within these folders, you can also create sub folders. So, within my dimensions folder, I will create another folder.
19:46
And name it Location.
19:51
And I’ll select Sales region and Country, and drag those right into the Location folder.
20:01
I’ll add another subfolder under Dimensions, for Product.
20:09
And we will choose Product Line, product type, and Product, and drag those right into the product folder.
20:18
I’ll create another subfolder, Order.
20:25
And choose my order number and order detail code data items and drag those into order.
20:33
And I’ll create one last sub folder.
20:38
And this will just be miscellaneous.
20:41
And I’ll choose this Row ID, which is grayed out, and that’s because, by default, Row ID is hidden, and Year. And I will drag those to Miscellaneous.
20:55
Here we go.
20:56
And so also, the order in which your folders are displayed can be altered. So, I will drag Location here, above Order. Same thing with Product.
21:07
I’ll drag that above order, as well.
21:12
Then, the last thing that we want to do is set our default data format for our Revenue, so that end users don’t have to make changes to that every time they drag that in. So, if you click on Revenue, or actually any of your data items, and click your More tool, you’ll see a lot of different options for things that you can do to customize your data module. You can add filters, create calculations, custom data groupings, and you can hide items and rename them.
21:39
I will go down to Data Format and set the default Format type to currency with zero decimals, and click OK.
21:54
And I will save, and then, again, click Try It to see what our new organization looks like from an end user perspective.
22:06
So now, if I expand all regions, I have a really clean-looking model here, where I have two folders for my measures and dimensions. I can expand dimensions. And let’s say I know that I’m creating something that uses region and revenue.
22:20
So I can go right to Location, double-click my Sales region, go right to Measures, and double-click Revenue.
22:29
And notice here, as well, that my revenue column is formatted as currency with zero decimals.
22:38
All right. So, things are looking good.
22:40
I’m going to close this Try it tab here, and let’s add on to this.
22:45
So, the granularity of our sales data is at the sales transaction level, meaning for every order, there could be one or more products ordered, or detail transactions, for that order.
22:59
Our current Data module only has Quantity and Revenue data for our order detail.
23:04
Let’s say we want to be able to include gross profit data as well, but that exists in a separate Excel file, also at the same grain, or order detail code level.
23:16
We’re going to upload that new file and create a join to the new data source to provide that additional detail.
23:24
So, here at the top of my data module pane,
23:27
I’m going to click my Add source tool and choose Add new sources, and I’ll click on my Upload tool and upload that gross profit data source.
23:43
Again, you can always check the status of the upload, as that’s moving along by clicking on your details.
23:51
So now, in our... Sorry to interrupt, we got a request to just slow down the pace, just a tad. Sure, absolutely.
23:59
So now in our data module pane here, on the left-hand side, I can see that I now have both tables as separate tables. Gross Profit,
24:09
And I can see the columns that exist in that Excel file.
24:13
And then underneath that, I have my all regions, which is what we’ve been working with here. So I’ve got two different tables.
24:20
So now, as in the last example, where I created a union between three tables, I could, in fact, do the same thing here.
24:31
I could choose my gross profit and all regions tables, and I might say I want to create a new table. And notice now, I have the option to create a join, because that makes sense based on what we’ve selected.
24:43
I’m going to cancel that. I don’t have to create a brand-new table, though. If I just wanted to create a join between these, and I’m fine with them existing in different tables for my end users, I can just simply create a join.
24:56
So for this demo, I’m going to switch over to the diagram view, right? And you can rearrange this in any order that makes it easy to figure out kind of what’s going on.
25:06
I’ve got my all regions, and that includes Americas, Asia Pacific, and Europe.
25:14
Now I have this new table that I’ve uploaded which is gross profit.
25:19
So now, instead of creating a new custom table, in this case, I’m going
25:27
to just create a join. So I’m selecting all regions,
25:35
then right-clicking and choosing Create relationship.
25:44
Now I have the right and left sides of this join definition.
25:49
Here on the left, table one, that is my all regions table; on the right, table two, my gross profit table.
25:57
So now, I know that what I want to join on, my lowest grain, is order detail code. So I can select that here under gross profit, and you get a sample of the data that’s inside of that, so I can confirm that that is, in fact, what I want to join on.
26:12
And then, under all regions, I’m searching for that same data item. So I’ll expand my dimensions.
26:20
Expand order, and, again, select order detail code.
26:26
Now, they don’t have to have the same name for this join, but we just need to make sure that, again, following our tips, that our data types are consistent for what we want to join on, and that it is data that does, in fact, go together.
26:41
Again, on the left-hand side, again, we’ll see a sample of that data so that we can confirm.
26:48
Then we’ll select Match Selected Columns.
26:55
If you scroll down a bit to the bottom here, bottom right-hand corner, we can see that we have one matched column.
27:02
If we wanted to make a change, we can always remove that match, right, choose something else. You can also have multiple matches.
27:09
So if that’s necessary to get at uniqueness, you can have multiple join clauses. In the bottom left-hand corner,
27:19
you also have options for relationship type, cardinality, and optimization.
27:25
I’m going to change this to a 1 to 1.
27:26
In our sample dataset, for every record of order detail code, we have a matching record for gross profit as well.
27:37
And click OK.
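For reference, the relationship defined here behaves like a one-to-one join on the order detail code key. A small, hypothetical pandas sketch of the same idea (not the actual demo data):

import pandas as pd

all_regions = pd.DataFrame({"Order detail code": [1, 2, 3], "Revenue": [250.0, 300.0, 125.0]})
gross_profit = pd.DataFrame({"Order detail code": [1, 2, 3], "Gross profit": [90.0, 110.0, 40.0]})

# validate="one_to_one" raises an error if either side has duplicate keys,
# mirroring the 1-to-1 cardinality chosen for the relationship.
joined = all_regions.merge(gross_profit, on="Order detail code",
                           how="left", validate="one_to_one")
print(joined)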
27:42
So we’ve established our join. If we wanted to review that join, or just kind of figure out what’s going on with it, you can always just hover over or select that join object, and you’ll get a description of what that join actually is.
27:57
So now, just a bit of cleanup for our data module. I’m going to expand Gross Profit. Row ID is already hidden. I want to also hide my order detail code, because we already have order detail code in all regions, right, so I will just hide this instance.
28:18
Gross profit.
28:19
I’ll go ahead and set the default data format for this as well as we did with revenue, so that when users bring that in, it is also, by default, formatted as currency with zero decimals.
28:39
Then finally, you can also rearrange the order in which the tables appear.
28:43
So I will drag gross profit below my all regions.
28:50
And let’s go ahead and save this.
28:53
And now, for testing this time, instead of going to try it, we’re going to go ahead and just build out a report with this.
29:00
I’m going to create a new report.
29:04
I’ll choose the built-in one column template.
29:11
So now, again, we’re in report authoring mode. It looks very similar to what we saw when we were trying out our data module, except that you will also see some other items, like, say, for instance, in the upper left-hand corner, where we can save this report.
29:25
So, now, on the left-hand side, under insertable objects, I’m clicking here to add a data source.
29:32
And from My content, I will select the Sales Data module and click Open.
29:44
And I will click the add button here in the middle of the interface.
29:52
I’ll choose Visualization, and I’m going to go with one of our chart objects that will allow us to really showcase all of the different measures that we have in our data module coming from different sources.
30:03
So, I will choose charts, and I’ll start this off with just a simple clustered column.
30:11
And, I’m going to go ahead and make this a bit bigger, right off the bat. I will change my size and overflow properties to make this 1200 by 600.
30:24
I’ll expand that all regions table. And, again, I can see my organization here of dimensions and measures.
30:32
Expanding dimensions and location. I will drag in sales region. I want that here on the X axis.
30:40
And I will nest below that product line.
30:46
And you may get some warnings here about the layout. We’re OK with that in page design.
30:53
I will expand my measures and drag in revenue into this series.
31:03
Then I’ll also add a combination here.
31:08
I’ll scroll up in the property pane to the combinations property, and add a secondary axis.
31:19
So to my primary axis here, right, to really mix things up and show that these different data sources are working together,
31:26
I’m actually going to bring gross profit in as a peer of revenue, as side-by-side bars.
31:36
And I’ll actually select the secondary axis.
31:39
And let’s make this, instead, a line chart, and drag over quantity.
31:53
And let’s go ahead and run this and see what we have here; run this in HTML.
32:08
So we can see we’re able to create a visualization with data from essentially four different data sources. The first three are the result of a union. So down here on our x-axis, we have all three of our data sources with region data: Americas, Asia Pacific, and our Europe data.
32:29
And we have our product lines nested within that.
32:32
Then, we have side-by-side bars of revenue, again, revenue, the result of our union, and then gross profit, the result of the join, and even quantity here on a secondary axis.
32:47
So, again, so much exciting new functionality added to data modules. This is just a sample of some favorite features that clients have been requesting, but if you would like to learn more about all the new features, they are covered in detail in our new Data Modules Fundamentals Course here, as well.
33:06
And, with that, I’m going to hand things back over here to Mike.
33:15
Good stuff, September. Thank you very much. We’re going to ask you one more poll here. We encourage everybody to please stick around; we have all the Q&A at the end. Submit your questions in the control panel, and we will get to those at the end, with the caveat that if we don’t have enough time to answer all the questions, or if we have to do research on something, we will fill out the question log and post that along with the recording and the slide deck, so you can watch it or refer to it later on. So, the last poll we have is, what version of Cognos are you currently running? I’m going to launch that poll and give you all a second to answer that. So, your choices are a version of 8, 10.x, 11.0.x, or, if you’ve upgraded, one of the 11.1 versions. Please respond to that.
34:07
OK, we’re up about 75% here.
34:12
We’ll give you a couple of seconds to respond to that.
34:16
OK, and I’ll share. So, happy to see you all are on a current version, 11.1-something; we’re hitting close to 85% at least on version 11, and then the rest are kind of lagging behind on version 10. Thank you. That’s very helpful information.
34:34
All right, so coming back to the slides, I’ve got a couple of offers here for you before we do, Again, a quick intro on Senturus and we’ll get to the Q and A, so please stick around for that. The next slide, September.
34:48
September alluded to this: if you like what you heard here, we invite you to visit senturus.com/training, where we have a new offering, Self-Service Data Modeling with Data Modules, which will generally be run by someone like September or one of our other professional trainers. So, you can register for that over there.
35:10
That’s a one-day class, and if you need more specific assistance with your Cognos environment, we offer several different options here, ranging from a quick start upgrade, which will help you with the installation and configuration of a dev environment and document that for you, to a full-service upgrade implementation, which will do all of that across your appropriate environments, in other words, QA and production. Or, if you need more specific mentoring or tailored expertise, we have our BI Experts on Demand, which will give you access to our entire team of BI experts that can fill in as needed. You can reach us at the email below; I put our contact information at the end of the deck as well, so you can always reach us. A couple of quick slides in terms of who we are at Senturus. September, if you want to jump ahead one more slide. Our clients know us for providing clarity from the chaos of complex business requirements, myriad disparate data sources, and constantly moving targets. We’ve made a name for ourselves because of our strength at bridging the gap between data and decision makers.
36:14
We deliver solutions that give you access to reliable, analysis-ready data across your organization, so you can quickly and easily find answers at the point of impact, in the form of the decisions you make and the actions you take. Our consultants are leading experts in the field of analytics, with years of pragmatic, real-world expertise like September’s.
36:32
And experience advancing the state of the art. We’re so confident in our team and the Senturus methodology that we back our projects with a unique 100% money-back guarantee.
36:45
We’ve been at this for a while. We have over 1,500 clients across 2,500 different projects, and we’ve been at this for nearly two decades at this point. Our projects range from Fortune 500 companies down to the mid-market, and across every line of business, ranging from sales to finance to HR, and across the organization. So, if you have an analytics project, we hope that you’ll consider leveraging our expertise on your next project. We have some great additional resources here that we’d like to bring to your attention. First of all, we’ve got a couple of great upcoming events.
37:21
The first of those being our very successful comparison of Power BI, Tableau, and Cognos, where we do a demonstration and compare data loading, data prep, and building of dashboard visualizations across those three major platforms. So, we’re reprising that and enhancing it for Thursday, September 26th.
37:41
You can register for that, as well as other events, at senturus.com/events. We’re also doing an exciting new one on enterprise security, Tableau versus Power BI, where we’ll do a demo and compare approaches to key security concerns there. Again, you can sign up for that over at senturus.com/events. We have a ton of free resources over at
38:03
senturus.com/resources, where, in addition to the aforementioned upcoming events, you can access our full resource library, which has all of our past webinars, again, recordings and slide decks and question logs, on all kinds of topics, ranging from tips and tricks to best practices in BI, et cetera, et cetera.
38:26
And our blog, which contains nice bite-sized morsels from various presenters within Senturus, talking about what’s top of mind and the latest and greatest in the industry.
38:39
I’d be remiss if I didn’t talk about our extensive training offerings. If you go to senturus.com/training, we provide training in the three top tools: Cognos, Tableau, and Power BI. Our ability to be conversant across multiple BI platforms makes us ideal for organizations that have embraced bimodal BI or are running multiple platforms, or those, perhaps, moving from one to another. We offer training in a number of different modalities.
39:06
We offer corporate training, self-paced learning, mentoring, and instructor led online courses to help meet whatever your training needs are.
39:15
So, thank you for your attention to those. And we will now flip over to the next slide and jump over to the Q&A.
39:23
September, I imagine you’ve had a few minutes to look at these, and hopefully flagging those helped out a little bit. Did you have a question that you wanted to lead with, or do you want me to kind of tee them up here for you?
39:33
I’ve got one that I just saw. There was a question on number of rows for Excel for those uploads. So, yeah.
39:42
So, the size limits for individual users, the default limits, are 100 megabytes for an individual file, and then there’s a 500 megabyte total; those are all modest. You can modify those, so your sys admin can go in and move those up if that’s needed as well.
40:05
So, it’s really just going to be organization specific, which groups or folks can upload what file sizes.
40:14
Got it. That’s helpful. And then, are you aware of what Cognos capabilities should be granted for users to be able to use data modules? I know what licensing level they need, but I think there’s a difference between that and the capability.
40:27
And so, I’m not sure.
40:28
So, yeah, I would say it depends on whether your end users are trained to understand things like joins, and it also depends on what you’re connecting to. If it’s just local files, that’s probably a little bit easier. But you can also connect directly to databases. So if you’ve got end users that are touching databases and then joining those to other things, then probably quite a bit more training might need to go in to make sure that they understand what those relationships are and what the impact would be.
41:01
Right, that I understand.
41:03
You know, one of the things that might compel organizations to leverage data modules, as was brought up, is that to use Framework Manager, they need to be a Cognos Analytics Administrator, effectively, which is a much more expensive license than the Cognos Analytics User or the Cognos Analytics Explorer, which allow you, from a licensing perspective, to use data modules and datasets. We had another question about, you know, formatting columns and whatnot.
41:33
So are you able to do things like format, you know, this person asked about formatting a column of revenue in dollars, and can you format a date column into other permutations, things you can do with that to enable you to use it for grouping purposes in a report, for example?
41:53
So at least for formatting dates, you have a lot of formatting options, and if for some reason you have a really specific formatting need that’s not covered, you can also create your own custom calculations and maybe do some sort of concatenating and create whatever it is that you need for grouping that your end users are used to seeing, if that’s needed as well.
42:15
Good. Are you able to create a package from a data module, or leverage an existing package?
42:24
Yes, you can leverage existing packages in your data modules; you can connect directly to the Framework Manager package.
42:34
And then also join that to some other source as well. You could combine, for instance, a package and Excel, or it could just be Framework Manager packages. And then you also have the option, from a Framework Manager package, to create a dataset, some subset of data from that package, and include that in your data module instead, or in addition.
42:58
OK, yeah, that’s great. And that sort of speaks to the next question there. This person is asking about joining data sources in a table in data modules.
43:07
I think they’re sort of asking if you can join multiple data modules or datasets, and/or join those, maybe, to packages. Can you speak to that a little bit?
43:19
I would say it depends. Because depending on the sources that you’re joining to, you may or may not have control over the data type. And that was kind of one of the gotchas that we talked about earlier, right? You have to make sure that those data types are consistent if it’s the key for a join; if it’s not, if it’s just some other column, then you should be fine.
43:46
But if it’s a key for a join, or if you are doing a union, that’s when you have to make sure that those data types are consistent.
43:55
Yeah, that makes sense. And then, can you speak at all to how Cognos manages this? I think what this person is getting at is, you know, are you pushing it to the Cognos server?
44:05
And I imagine the answer probably is, you know, it depends, right, because it’s going to try to push processing down to the database where possible.
44:14
But when you start, you know, maybe combining things, it has to, excuse me, pull things in and process locally. But maybe you could talk about that.
44:21
Yeah, that’s exactly right. If possible, you know, that’s what Cognos will do, but there may be use cases where it will be local. Absolutely.
44:30
And then also, depending on calculations, just like with a report, depending on the calculations that you use, there can also be local processing that happens as a result of that.
44:43
Can you speak at all about best practices in terms of aggregations? This person asked about, for example, calculation of percentages so that your aggregations work properly, and then further asks if you’re able to move the calculations into different folders. So can you do some of the organization stuff?
45:01
So, again, I will say, it depends.
45:04
If you’re creating a calculation, so something that is a percent or something like that, that works great oftentimes when you’re in some sort of row-level list or that sort of thing. But if you want to create group aggregates or something like that, then Cognos may try to total those percentages if it’s not clear. So, depending on how you’re planning to use that in a report, you might choose to have some percentages, depending on the type of reporting, but also have the actual items that would be included as the denominator and numerator, so that if the end users need to report on it in a slightly different way, or they need to create some sort of summaries that are a bit different, or they
45:51
are working in a crosstab, and I’m kind of getting into the reporting piece of this, but if you’re in a crosstab and you need to modify the solve order so that that division happens at a certain time, then you could get into trouble if you just have the calculation there, instead of having the numerator and denominator available for your end user.
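A tiny made-up example of why that matters: totaling or averaging a stored percentage is not the same as recomputing the ratio from the summed numerator and denominator.

# Hypothetical rows: 50% margin on a small sale, 10% margin on a large one.
rows = [
    {"revenue": 100.0, "profit": 50.0},
    {"revenue": 900.0, "profit": 90.0},
]

average_of_percents = sum(r["profit"] / r["revenue"] for r in rows) / len(rows)
ratio_of_sums = sum(r["profit"] for r in rows) / sum(r["revenue"] for r in rows)

print(average_of_percents)  # 0.30 -> a misleading 30%
print(ratio_of_sums)        # 0.14 -> the correct 14% overall margin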
46:11
Hmm, that sounds a little like the problem that Tableau table calculations kind of tackle. Aggregations that depend on how you roll things up are always a tricky thing to do in your data.
46:23
We had a question... So, I will say, kind of one last thought about that is that,
46:28
if you have different grains of data that you’re using in your data module, you may need to take the aggregation off. So revenue, or forecast, or whatever the case is, you would typically leave it at total, often. But if your forecast is at a different grain than your sales, then, depending on your use cases, you may need to set the aggregation for forecast to none instead of total, so that things are rolled up properly.
46:58
Got it.
46:59
And then, someone asked if there are improvements in the dataset functionality in the latest Cognos version. And I think, based upon what we’ve researched and have in our environment, the functionality in datasets and whatnot is largely unchanged as of 11.1.3.
47:18
Aside from, I’m sure, some bug fixes and stability improvements, I’d say they haven’t really changed the underlying functionality to any significant degree between what you just demoed and what’s out there right now.
47:31
Right, that’s right. And I think that we could say the same thing for what’s coming in 11.1.4, from what we’ve seen of the sort of roadmap information that’s out there.
47:43
Let me see here. Sorry, I’m looking through some of the questions, looking for another one here.
47:52
This one’s around parameters and dashboards; I’m not sure how relevant that is here. Do you know if, in Planning Analytics, you’ll be able to use modules and/or cubes? I know that’s kind of a little outside this, but do you have any ideas on that?
48:08
Not sure. I’m not sure about that, yet.
48:09
We can always definitely research it and get back, though. I wasn’t sure. And then, this is a big question; it depends how far down the hole you want to go on this one. But this gentleman asked about suggestions on best practices for organizing the data sources in Cognos Analytics, and best practices for your data warehouse organization for BI usage. Now, that’s a really big topic.
48:30
I mean, in terms of organizing the data sources, I see a lot of organizations that have a packages folder, and it’s organized by, say, line of business or organization, and that will vary by organization. I don’t know if you have anything you’d add to that.
48:51
I would definitely agree with that. As far as what I’ve
48:56
seen in the field, it’s largely based on how your end users are used to seeing things. So, depending on the organization, they might be fine with seeing several different tables and knowing, hey, I’m in finance, I’m going to go to the finance table, and I’m not going to cross over into the procurement table. Whereas in other places, that might just kind of be a little bit too much, and it might be better to have those separated out. So, really, I would say it’s organization specific.
49:23
Yeah, and I think that’s the million-dollar question, right, where Scott here gets into what’s the best practice for your data warehouse organization for BI usage. And really, what we’ve always espoused is the idea of, you know, sort of the Kimball structure with conformed dimensions, so that you are able to traverse different subject areas leveraging common dimensions, and then you get into things like hierarchy management and slowly changing dimensions. So those are kind of the big questions that we help our customers with all the time.
49:52
So, you know, the high-level answer to the best practice for data warehouse organization would be, you know, tackle it from kind of a conformed dimensions, bus matrix approach, but that takes a lot of analysis, and it’s important to get that right, so you’re getting the right answers for your organization, people’s numbers align, and people are adopting the solution and things like that. So those are the kinds of meaty questions that we tackle all the time.
50:21
There are a couple of questions around being able to connect to Excel data sources and sort of automatically or programmatically refresh those. So, what do you know about attaching to Excel files? And I think the idea is they’re thinking that this Excel file that they’ve dropped in for, say, a dataset or something is dynamic and gets refreshed. And then, how do we handle that in Cognos and datasets or modules?
50:46
Again, the default way is a manual approach to refreshing.
50:51
So, if you have a lot of Excel files, there is probably a better way to do that than just kind of uploading those files. But if you right-click, there it is, a manual refresh. So, there may be some new features coming in 11.1 that address that; I haven’t seen that yet.
51:13
Got it. And there’s a question here about OLAP packages being used as a source. And I think what we heard there is that that’s not a capability that’s there right now, but it is on their roadmap, although they didn’t give us any sort of timing on it.
51:26
Right?
51:27
So what happens with that is, if you actually connect to a dimensional source, it won’t function as it would in a report, giving drill-up, drill-down capability, and you may encounter some error messages if the report is not really simple. So what you’re typically going to want to do, you can still use the data, but you would create a dataset from the dimensional source, which essentially creates relational data, right? And then use that to feed your data module. So again, you get the data, but you don’t have that drill-up, drill-down functionality that you would want if you connected directly to that in a report.
52:05
Got it. Thanks. And then there’s a good question about how you do version control and migration to production, so sort of lifecycle management.
52:14
Do you have any thoughts on that? I know, again, that’s kind of a big topic, but what capabilities are there, or is that kind of less governed?
52:24
Yeah, I’d say I want to see what’s available and new to kind of figure out what’s available with that. But that’s when you’re getting into more of the governance. For now, I would say Framework Manager is more capable of handling that, but I would say down the road that could change, or maybe is changing.
52:44
Got it.
52:45
Yeah, and I think that sort of leads to one of the other questions that I saw here. You guys are asking a lot of great questions, by the way. Someone asked, you know, kind of, what’s our opinion on the future of Framework Manager? You want to take a shot at that one?
53:00
Well, TBD is the true answer. But I would say, from what I’ve seen in working with data modules now, there are so many great capabilities in them. But there is still quite a bit of functionality in Framework Manager that is specific to Framework Manager and that doesn’t exist in data modules.
53:17
So, it would depend largely on what you’re using it for, what you’re doing in Framework Manager as far as creating those packages, just because there is so much functionality that is not yet part of data modules.
53:30
So, as it progresses, perhaps there could be more of a 50/50 split or something like that, but I would say, you know, with over 20 years behind Framework Manager, I don’t see data modules replacing it very soon.
53:46
Yeah, I think that’s, that’s spot on. And, of course, this is just our opinion, right?
53:51
But I think it’s informed by a lot of interactions with customers, and, you know, I think we do see Cognos responding to what’s out in the market in terms of the Tableaus and Power BIs, where they’re able to pull in flat files. And, you know, I was with Cognos for a decade, between 2002 and 2012, and they were even asking about bringing in Excel spreadsheets back then.
54:13
So, I think that’s always been something that users want to do, and they want to expand self-service in organizations.
54:20
And so that, by necessity, sort of entails being able to do a lot of this stuff without the use of fat clients, or having to use something like Framework Manager. But she’s absolutely right that it’s a 20-year-old tool, and it’s very robust, so I don’t see it going anywhere anytime soon. But the use cases that will be covered by data modules and datasets I think will rapidly expand, and you’ll see more of a Venn diagram with a lot more overlap as they start to enhance that functionality.
54:52
Let’s see, anything else you see up there?
54:57
Oh.
55:03
Yep, I think those are kind of the key questions that we have here, and we are nearing the top of the hour, so I think we will wrap up.
55:13
Well, like I said, we’ll go through the question log here, and any questions we didn’t answer, we will seek to answer in the question log and post along with the deck. And the video, the video usually takes a week or two; we have to do a little bit of editing on it before we put it up on the website, but it will definitely be up there. So, with that, I want to move to the last slide and thank September for an excellent presentation on data modules; that was very helpful. We encourage you all to reach out to us with any questions you might have, any analytics needs you might have, or training needs. You can always reach us at our website or at info@senturus.com, or, if you want to pick up the phone, we have a phone number there. We also encourage you to reach out and connect with us on social media via LinkedIn, SlideShare, YouTube, Twitter, and Facebook. So, we thank you for joining us today. Appreciate your time and attention, and look forward to seeing you soon on another Senturus webinar. Thanks, everyone. Have a great rest of your day.