Migrations. Oh, for the days when the process involved simply moving from one tool to another. The variety and complexity of migrations have multiplied. Project types fall under a wide umbrella and can include everything from moving Excel to the likes of Tableau, Power BI and Cognos, to consolidation, to moving from on-prem to the cloud. And that's just for starters. Given the mission-critical role of BI in today's most successful companies, failure is not an option!
This webinar recording takes the migraine out of migration, providing the information to ensure a successful change. Learn how migrations have changed over time and the common phases and steps in most migrations. Get the common pitfalls that can trip you up as well as the tips and tricks to help ensure a smooth transition.
Practice Area Director/Solutions Architect
Michael is part of the team that developed the Senturus migration utilities and Analytics Connector. He also heads up our Tableau practice. Before Senturus, Michael spent over 20 years in different roles at Oracle, IBM and SAP, acquiring a wealth of hands-on, practical BI and big data experience.
I'm John Petersen, CEO and one of the founders of Senturus. I've been working in BI for almost two decades now. And today's session is one I think people will find really interesting. It started because over the years, as we've tackled projects, over 2,000-some-odd projects, we have actually built out some very interesting technologies to help us facilitate those projects, in terms of cost, time and accuracy. We call the arm of the company that produces those Senturus Labs. If you've been on our webinars before, you've seen some of the products to come out of the labs.
One of which, a pretty hot seller, is the Senturus Analytics Connector, a tool that allows you to connect numerous BI systems to metadata layers and make the most of existing systems with some of these newer self-service platforms.
What's interesting about today's session is we actually have created some tools that can help people migrate their systems, and we're blessed to have Mike Weinhauer, whom you may have heard before, walk us through migrations. Now, the interesting part is that, as we put this webinar together, we realized that the tools and automation component was really a fairly modest piece of the entire puzzle when you go to do a significant migration.
And if you really want success, namely high user adoption of the new platform, or co-platform, you've really got to look at a much broader set of aspects of a methodology. And so that's what this webinar morphed into. I will tell you now that the deck and this presentation will be available on our resource site; the deck is already up. We're going to really race through it, but there are aspects that we'd be more than happy to circle back with you on if there are some specific questions around things like tool selection, you name it.
Anyhow, with that, let’s roll right into today’s session.
A couple of housekeeping items. As always, we love the interaction of Q&A. We do it through the question and answer pane, and we try and keep up in real time during the presentation. Some of the meatier questions that might be of interest to everybody we'll queue up for the tail end of the presentation and address live, so I encourage you to interact. I think that's one of the main reasons people love showing up to these things live.
As I mentioned before, the most popular question we get is, hey, how can I get this material? The deck is already available on our site under Resources.
If you haven't had a chance to look at the Senturus resources area, you really should. You can slice it and dice it. There's some really meaty content that we've built out over the 15-some-odd years that we've been running these webinars. So take a look at that.
And if you don't see something specific, feel free to contact us.
Alright, so diving right into today's session, we'll have a brief introduction of the presenters. Then we'll get into an overview of migrations: what do we mean by migrations? This has actually morphed over time.
I'll queue that up in just a second, talk about some of the key challenges of analytic system migrations, and then break down how we have tackled probably hundreds of these projects (some of our largest projects are migrations) into a methodology around the phases and steps you need to go through, in almost all cases, to make a project successful. Then we'll show you a demo of a very interesting piece of the equation for us, and allude to a couple of others. And we'll finish up by mentioning a little bit about some additional Senturus resources and, of course, one of the favorite areas, the Q&A session.
So today's presenter, predominantly, is going to be Mike Weinhauer. Those of you who've been on our webinars before have probably heard Mike. He does a lot of our webinar content and is an exciting speaker on this stuff; make sure you throw him a curveball question or two, he's a bit of a jokester. I have been involved as a senior architect in a lot of our projects, so I have a little bit of past experience doing the good old honest work. Alright. So, getting to today's session, the interesting part about all this is we looked at our project load over the last almost two decades.
We realize that I would say the majority of the large projects that we’ve tackled have been some form of migration.
You know, weren't people just putting in greenfield systems? How does that work? Well, the reality is, nobody flew completely blind in the old days. They were oftentimes leveraging Excel as their predominant BI system, and some of that still occurs today; much of the planning area of companies, for instance in the office of finance, is typically still Excel-centric. So the migration was from an Excel-based system to, say, a classic integrated BI system like Cognos. More currently, this aspect of migration has just blossomed into all sorts of variations. Mike's going to get into some of the specifics, but: migrations from on-premise to cloud, migrations from classic IT-centric systems to end-user self-service systems.
Migrations from an older version of a particular environment to a newer version: that's a very common thing for us to tackle. So there are lots of different angles on it, but it all comes down to some basic stuff. That's what we're going to walk through today, showing you some things that we've learned about how to make migrations successful on the methodology side.
But also some tools that we have to help improve performance, speed the completion of these projects, reduce the cost, and massively increase the accuracy of things like data modeling. So, with that, I'm going to hand it over to Mike for his whirlwind tour of the Senturus methodology on migrations and some really interesting tools coming out of the labs.
Take it away, Mike.
Thank you very much, John. Appreciate that introduction.
So, as John mentioned, the first thing we need to do is, is really define what is a migration, and a joke that I seem to pull out more often than I probably should is that change is inevitable, except from a vending machine, and that jokes really. My kids don’t get it anymore, because vending machines take credit cards and get away with charging at three bucks.
For a 12 ounce cans of soda, It ironically makes it all and all the more apt metaphor for migrations and kind of their current state in that the old definition of a migration, you know, quote, unquote, simply moving one tool to another really no longer applies.
The business and technology environment we find ourselves in has only increased its rate of change and complexity, subsequently driving a greater need for BI migrations. They're bigger, more complex, ubiquitous, and their success is more critical than ever. John's already alluded to much of this stuff, but the variety has changed a lot.

I'm fond of calling this multi-axial, because you're not just moving, say, from Excel to Tableau, or doing a classic system-to-system migration. You're also then pivoting from on-prem to cloud or hybrid, and in-house to perhaps managed services, in addition to cultural changes like centralized to self-service. So that, in and of itself, drives this massive variety.
And, of course, if you’re looking through this list, you’ll probably end up checking several of the items, if not all of them.
The other item is that migrations are way more complex now, for many of the reasons I just mentioned, the multi-axial nature of them. But add the shift from simple lists to visuals, from static to interactive, from printer and desktop to any device; you're pivoting on all these different axes.
And migrations precipitate these inflection points. In order to be really successful, and have a system that works and is effective for your organization over a long lifespan, you have to think about all of these elements.
Their scope is ubiquitous, because BI now touches all functional areas of the business, and it crosses those functional areas. If you look at the 360-degree customer view, for example, that crosses sales, support, marketing, supply chain, and manufacturing. BI even crosses organizational boundaries, reaching out to customers, suppliers, partners, and communities, depending on the organization in which you find yourself.
Also, migrations are mission critical. BI is not a cost center; if harnessed correctly, it's more than mission critical, it's a competitive differentiator. It's no secret that companies leveraging analytics consistently outperform those that don't.
And to quote Bain here, you can see that companies who use analytics the best are twice as likely to have financial performance in the top quartile. They're three times more likely to execute decisions as they intended, and five times more likely to make decisions faster. So they make better decisions faster and perform better.
Again, this is well, this is well documented in numerous other studies.
However, migrations are very, very challenging. And sort of stepping back a little bit.
Gartner in 2011 cited that 80% of BI projects fail. Think about that for a second: 80%. It's a stunning number. As of 2018, they pegged that closer to 60%, but that's still terrible. Imagine going into a project knowing you had, maybe, coin-toss odds of getting it right, or about a two-thirds chance of failing. It's kind of bonkers when you think about it.
In fact, one of our biggest current projects is a follow on to a failed lift and shift project with a client that’s now migrating to Cognos using our project based approach.
When you compound that with the problem that, even if you do the technical stuff right and get the system up and running, adoption rates still hover around a rather abysmal 25%, the reality is that migrations are one of the costliest, riskiest projects an organization can undertake. And firms tend to lack the skill sets to successfully accomplish migrations. The skills are unique to migrations and are not core competencies for many organizations.

Skills are needed in all of the tools, the current tool and the future tool or tools, plus the applications and environments, again remembering that multi-axial pivot we're making, potentially to cloud and managed services and whatnot. Migrations are also very resource intensive, so time and bandwidth are an issue; if you tap internal resources, they have day jobs and can't take on the challenges of the migration without sacrificing other key areas, which can impact the business deleteriously. Cost is always a big factor, and that ranges from licensing to infrastructure and whatnot.
And, of course, risk. There's a big business impact from downtime. In fact, we often see that the old BI system the business relies upon can't be turned off until the new system is up and running and the business users are satisfied that the new system can run the business. So, if not done correctly, you see parallel execution happening for years sometimes, at huge expense. And, of course, there's just the risk of outright failure, either a black swan project or just one that goes way over time and over budget with low adoption rates.
So with that, we always like to get a finger on the pulse of what the attendees are doing.
And I'd like to ask you what you feel the biggest challenges are, either ones you envision or, if you happen to be going through a migration, ones you're facing; choose all that apply. I was limited to five choices, so I bucketed some of them. Is it the complexity of your current environment, the quantity of, say, models and reports? Is it a lack of in-house skill sets? Is it that tool selection is too daunting, with too many tools, too many options? Is it that you need a cloud or managed services strategy? Or did I completely miss the mark here; is it something else that we're not aware of?
So go ahead and fill that out, and we’ll give people about a minute to do this.
We're up to about 40%.
OK, get those votes in, guys.
So, just for reference, there are about 100 people on the phone here today, and I'm going to close this thing out in just a few seconds, if you guys want to get your last-second votes in. Right, and I'm going to share the results of the poll. So, interesting: two thirds, a little more.
It's the complexity and quantity of models and reports that dominates, and then it looks like I may have gotten those items correct. So, thanks for sharing that; that's really interesting, and keep in mind that you could provide multiple answers for that. Thanks for giving us some insight into the makeup of our audience here. I have another poll for you in a couple of minutes. Then the other question we have to ask ourselves is, why are we doing this? And a word of caution: if you build presentations like this, you probably know you end up spending a lot of time searching for icons, and the first thing that popped into my head when thinking about why we're doing it was self-inflicted torture. I just want to warn the audience: don't type self-inflicted torture into a Google image search.
You'll see things that you can't unsee, and that are in no way, shape or form appropriate for a professional presentation. So be warned. So, why are we doing it?
The most common item that drives this is mergers and acquisitions, and this may be forced, right? The acquiring company says, hey, we standardized on tool X or tool Y, and you guys have to do this; you don't have a choice. Oftentimes it's a business need, right? The current system isn't allowing you to leverage analytics. Modernization: end-of-life platform shifts, a move to cloud, et cetera. A culture change to self-service, right, and some of these overlap a little bit, I realize. A regulatory change, a desire to reduce licensing costs, or a desire to consolidate purely around reducing complexity, having a single solution or reducing the number of solutions that are out there. So to that end, I'm going to put up another poll here and ask you, if you're contemplating a migration or you're in the throes of one, go ahead and tell us what issues precipitated, or are precipitating, a possible migration. And again, I was forced into five choices.
So, is it consolidation, either as a result of M&A or license costs or a desire to streamline your operations? Is it a business need, that you want to be able to leverage analytics better in your organization? Is it modernization; are you being, you know, forced off an end-of-life platform or something? Or are you not planning one? Or is it something else, again, that I wasn't able to capture in here? I'll just let this run for another minute or so.
We’ve got about 55% of you in.
Interesting, everything’s running, kind of, neck and neck. I get to see the results changing in real time.
It’s like watching the Kentucky Derby.
All right, so here we go, I’m going to share this out.
So, yeah, it's pretty evenly split, really, between consolidation, business need, and modernization, and then about 20% of you have other reasons. So, I'd be curious to learn what some of those other reasons are.
Feel free to reach out to us if you care to share any of that or you want to discuss it. So, thanks for sharing those insights. Senturus developed a migration framework that's designed to facilitate all the components of BI migrations, to minimize risk while maximizing adoption and ROI.
The concepts that we share here span any BI system migration; they're not unique to any tool. Now, we're going to touch upon each one of these, so bear with me. I promise you the technologies are in here, and I'm going to demo all this stuff. As John said, it is a bit of a whirlwind tour, 15 pounds of sausage in a 5-pound bag, as he actually said. But we're going to touch upon each one of these.

So bear with me through some of this. I've got a lot of hyperlinks to other content in here, so you can dive deeper into any one of these if you want to. Again, this framework is a way to hold the conversation, and keep in mind you may not execute each one of those steps serially; you may end up looping back on some of them.
So in a migration, it’s really important to go back to all the various steps.
What we tend to do, and what we're going to focus on more today, perhaps somewhat hypocritically, is the semantic model design downstream, right through to the rollout. Whereas, if you're smart, you need to look back at the entire system, and I'll try to point that out where it's appropriate in the presentation. In terms of the supporting structure, you have documentation, governance, people, and technology, and it's important to take a holistic approach across all of it. Technology is one component that we tend to be really good at and that we focus on, but really, the best projects start with good documentation, an understanding of people and process, and all that good stuff. And there's a phrase I really like, business focus and intent, this sort of top-down driving of a culture of analytics. So, try to consider the entire ecosystem. I'm a fan of that diagram.
So, the first phase involves assessing your organization's current BI state, and the first step there is organizational readiness. Again, BI is everywhere, and success demands executive sponsorship, top-down leadership, and broad support across the organization. And it requires cross-functional support between business and IT. Whatever your buzzword of choice is, whether it's a culture of analytics, adopting digital DNA, or being insights driven, the question is, you know, do you have a CDO?
And again, data driven or insight driven organizations perform much better than their counterparts. Everyone wants to move away from gut based decision making or simply extrapolating from past data. You don’t want just more or better data. You want to make better decisions at all levels of your organization.
You also need to assess the current state of your BI platform. Is it really on fire? Or can you coexist and do a phased cutover? Or will the current platform even remain in place for certain areas? In other words, would you consider adopting what Gartner calls a bimodal approach, where you have more than one tool: one for established systems, that's mode 1, and one for experimental and innovative areas, mode 2.
This really involves doing a gap analysis of the existing physical BI platform and understanding where the problems are, what can and should be salvaged and migrated, or whether there are quick fixes that make sense to invest in in the interim; looking at software, hardware, network, queries, databases, and all that good stuff.
And then assessing who your stakeholders are, and that's really anyone who's vested in the project's success. Sorry for the sidebar here, but again, when you're searching for icons, I did a search for stakeholder, and one of the first guys that popped up was literally this little guy holding a stake. It just makes me chuckle, and I had to throw him up here. But not that kind of stakeholder. Stakeholders are your executive sponsors, your business and power users, your IT people, and maybe even customers, suppliers, investors, communities, whatever. Again, it's important to understand and identify all of those people, so the system meets the needs of everyone who needs to use it.
Then, business requirements are extremely important. What are we trying to measure and improve? Are we trying to increase revenues, decrease costs, comply with regulations, penetrate new markets, reduce risk? Some or all of the above, right? Would you build a house or a bridge without a blueprint?

And I'm really fond of the Frank Lloyd Wright quote here, where he says you can use an eraser on the drafting table or a sledgehammer on the construction site.
In other words, you save yourself a ton of effort, time and money.
And, indeed, the success of your project is predicated upon good planning. And, of course, failing to plan is planning to fail, which, depending on which meme you pick off the web, is attributed to Winston Churchill, Ben Franklin, or a host of other people; I think it's Ben Franklin. But really, this gets to having a business requirements document, which describes how and why you're doing the project, documented in plain English, and it should attempt to be forward looking. This is a system that's going to be in place and mission critical for your organization for the foreseeable future. And then, lastly here, change management: how do we understand and mitigate the negative impacts of change? This is where effective project management is really key, to manage scope and avoid the cost and time overruns that are project killers. So again, this is a very high-level view of the assess area. If you want to go deeper into this stuff, please watch our excellent KPIs: It's Not About the Tools webinar.
So Mike, let me jump in just for a second and add one thing. (100%, go ahead.) A very common refrain that we hear right out of the chute from folks doing migrations is, we just have budget to do a lift and shift.

And really, we know that the current system only modestly meets the requirements, so let's just do a lift and shift; that seems pretty obvious: we've got 100 reports over here, we'll build 100 over there. That is typically a great way to fail, because these new systems have completely different capabilities that could be leveraged.
If you just went back to the business requirements that Mike was alluding to and said, you know, rather than a static, prompted report, if I gave you an interactive dashboard, would you need additional information? How would you use it, and how would your job be different? The mere fact that you're migrating, typically after a long span in which a lot of technology has changed, really forces you to reconsider and get back to the business.
And say, if you had these new capabilities, and often this involves demoing them, what would you like to do with them? Anyway, sorry to interrupt, but I just think it's a very, very important point.

Thanks, John. It segues nicely into this one. The first item on our roadmap section is to do tool selection, which is really a daunting task, right? And as John Petersen states, it should be based on the business and technical requirements, juxtaposed with your budget and an understanding of total cost of ownership. So, based upon that, which tools make the most sense? Do the tools meet your needs now, but also for the next 3, 5, 10 years? That involves taking a look at what types of reporting are needed: operational batch versus visual or data discovery and dashboarding.
And then, how is the data for those types of reporting captured? Some tools are more optimized for certain data sources. You should absolutely, as John said, try before you buy: POC these things, and do it in-house, hands-on. Don't let the vendor do it, so you truly understand the level of effort required to operate and administer the various tools.
The vendors know their own products best, and they're going to position them in their best light. So it's important to ask the questions they may not be positioning for, and to talk to neutral or trusted partners and customers that may have gone through similar situations or used those tools. You may have the tool choice sort of foisted on you.
If it’s been an M&A for example, then it’s important to understand the strengths and limitations of a given tool or, tools and then understand the roadmap.
This is only compounded by products' increasingly rapid release cycles; features are constantly changing, and it's important to mind the consumption gap.
It's a real thing, right? Buy what you need, not what you don't. Think about things like Microsoft Office: they're always adding tons of things in there, but how much of that stuff do you really use? Very little of it. So buy the things you need, the things you really think you're going to use, but don't get mesmerized by some glitzy widget that throws you off the rails and doesn't add a ton of business value.
And this is something where we can really help out, because we do an objective, data-driven evaluation that looks at over 150 attributes, and I invite you to take a look at that over at this link here, a comparison of Power BI, Tableau and Cognos.
Then the question arises: do you rip and replace the current system, or do you seek to optimize? You might already own the most effective tool; it might not be deployed correctly, or the problem might lie elsewhere. Again, think holistically about all those phases and all those supporting structures: the data might not be captured, or not captured properly, or not analysis ready.
Maybe your solution is to augment with a mode 2 tool and coexist; that lowers risk and allows you to tailor to specific business needs, but it does come at the higher cost of supporting another system.
Cloud strategy: this is a huge one. Again, migrations precipitate these inflection points. There are a bunch of questions here that I'm not going to dive into, but I'm going to point you to another excellent webinar called Shedding Light on Cloud BI Options. Some questions: which cloud, beyond just AWS and Google? What type of cloud within those environments? And from there, what components go where: your data sources, your BI tools, ETL, security, files and images? There are tons of options, and lots of pros and cons that typically revolve around cost versus control, and ease of use versus scale and flexibility. And again, we can help you with that, so take a look at that webinar. Another inflection-point axis is in-house versus managed services.
So are you going to continue to maintain and manage expensive BI infrastructure, not only from a hardware and software perspective but from a human capital perspective, or is it time to look outside the organization for that as well? In fact, for one of our clients, Senturus provides an entire environment, including licensing, support, infrastructure, database, and authentication, for less than the cost of one full-time employee.
Then we move on to the optimize phase, and first up here is to identify missing data. Of course, most of you have probably heard the great line, you can't manage what you can't measure. It's important to get this data in house, and as the folks at Constellation Research say, you can't have a digital organization without good data. We can't emphasize this enough. So ask yourself: how has the business changed since the last system was implemented? How are we quantifying and measuring success?

And how does that ripple into the measures and dimensions that apply to a given business process, and therefore the owners? In other words, how do the business users like to slice and dice or roll up the data, and where is that data? Do you have it? Is it within your four walls, or is it syndicated? And is it analysis ready?
And, Mike, I just want to chime in there and say one of the other very common things we see is when people are migrating, oftentimes from older legacy systems with static or mildly prompted reports, to what they want, which is full interactivity, the dataset is not arrayed in the proper fashion to really facilitate that. So oftentimes what happens is a lot of stuff gets kind of glued together at the blend level, to use Tableau's terminology, and you run into horrible performance issues. The point is that to take advantage of the capabilities of these new toolsets, particularly things like interactivity and very rich visualizations, you really have to push back into the dataset, as Mike alluded to. And there are some really solid techniques that many people who have shown up in our webinars over the years have learned through our courses and things like that.
That’s a great point, John.
And it does relate to this slide, on making model improvements. The time to ask those questions is when you're going through a migration. Getting clean, business-friendly data in an agile fashion is a never-ending challenge, given the explosion of systems and data volumes and the business users' demands that continue to grow.

This is the time to optimize where that logic should live. Much of it, as John just pointed out, ends up in the front-end tools, right? If a user can't find a given item in the semantic model or in the back-end data source, they'll go and create it in the front end, and there are tradeoffs there: the logic in reports may be more agile, but it's less governed and there's less reuse. So there may be advantages to taking stuff that's in the reports and pushing it back upstream, as we like to call it.
The further downstream you push the logic, in other words into the reports and whatnot, the more agile it is, but it's less governed and can lead to performance problems and reporting errors. By pushing that logic upstream, either into the semantic model or metadata layer, whatever you want to call that, or back even further into whatever your data repository is, you tend to reap higher levels of governance, greater levels of reuse, and generally better performance. The other advantage is that when you get back behind the semantic model, further upstream, you have more flexibility in terms of what tool or tools you can use in front of it. If the logic is in the data warehouse, any tool can probably hit it; if it's in the semantic model or in the report, oftentimes that's proprietary. This is also an important time to take a look at hierarchies and consider hierarchy management. How do you drill up and drill down through time, through your customers, your products, your vendors? And how do these change over time, right? Think about how many times you reorganize your sales organization. And different functional areas have different lenses on the business; HR looks at employees differently from, say, manufacturing.
Then the last section here under Optimize is around some of the technology we created, which we call Senturus Report Insights. One of the components of migrations, again, is cataloging and understanding the current system, and then optimizing and prioritizing what you’re going to bring over.
So we developed a tool that lets you catalog and inventory report logic. Done manually, this is an extremely tedious and time-consuming process. Think about the number of reports that exist on systems now: it can be in the hundreds, thousands, or tens of thousands. Doing that by hand is very time consuming, tedious, likely to involve some human error, and requires a small, or large, team of people.
So we built a utility that helps you inventory and catalog the report logic in your BI environment. Combined with audit reports, it helps our clients prioritize and streamline report migrations by collecting the makeup of each report, including its component object list: whether it’s a crosstab, the type of chart, the data items, the queries, and the calculations. It further helps organizations identify standardized reports versus subtle permutations of those standards, which tend to proliferate the more self-service you have. Those then become candidates for consolidation and reduction.
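To make the cataloging idea concrete, here is a minimal sketch in Python of what one pass of a report-inventory tool could look like. The XML shape and tag names below are hypothetical, not the actual Cognos report specification or the Senturus utility; real report specs are far richer.

```python
import xml.etree.ElementTree as ET
from collections import Counter

# Hypothetical report specification -- illustrative only.
SAMPLE_SPEC = """<report>
  <crosstab name="Sales by Region"/>
  <chart type="bar" name="Revenue Trend"/>
  <dataItem expression="[Sales].[Revenue]"/>
  <dataItem expression="[Sales].[Quantity]"/>
</report>"""

def catalog_report(xml_text):
    """Tally the component objects (crosstabs, charts, data items...) in one spec."""
    root = ET.fromstring(xml_text)
    # Count every element below the root; each tag names an object type.
    return Counter(elem.tag for elem in root.iter() if elem is not root)

inventory = catalog_report(SAMPLE_SPEC)
# e.g. two data items, one crosstab, one chart for this sample
```

Run over a folder of hundreds or thousands of specs, tallies like this are what let you spot the standardized reports and their near-duplicate permutations.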
There’s a reason you don’t automate the entire process, and John talked about this: the new tools generally have features you want to take advantage of that are light-years ahead of what your current system has, and in most cases there’s a lot of just dead weight in the current system, so this really helps reduce the footprint. That in turn reaps the benefits of reduced risk, cost, effort, and time, and it lets you focus on maximizing the functionality of those new tools.
And then rapid prototyping is de rigueur; gone are the days of throwing a solution over the fence and waiting six months, right?
Do wireframes and mockups. You don’t even have to have data for these, or you can use spreadsheet data. Sit in the room with somebody; there are mockup tools out there, even free ones, and have people design their desired future state. What if technology or cost weren’t a barrier whatsoever?
Just, what’s Nirvana for these folks? You can even use the tools themselves these days; a lot of the visual tools really lend themselves to getting in a room and iterating very quickly with both IT and business. That gives us the luxury of failing early, right? Understanding sooner when we’re charging down a path that isn’t fruitful, and becoming a Pareto disciple of the 80/20 rule.
In a lot of cases, unless it’s regulatory reporting, a report that’s 80% of the way there will add a lot of business value, and you can iterate on it and get it to 100% down the road. But don’t get stuck in project purgatory because you don’t have everything 100% correct.
All right, so congratulations. You can pat yourself on the back; you’ve gotten this far. Now it’s time to make the magic happen. The first step in actually executing is the data model migration, bringing over your data models. There’s a lot of complex logic there, right?
You have this physical view and that gets pulled out into maybe a logical view or a presentation layer before it finally finds its way into reports.
Now, metadata models in tools like Cognos or Business Objects often have one or more levels of abstraction even beyond this. And there’s tons of tribal knowledge that may or may not be in house any longer. There are tons of models, lots of deployed metadata, and again, a lot of dead weight in there.
And so when you consider a typical migration from one system to another, there’s a need for expertise on both systems: on the old, current system, and on the new tool or tools. It involves reverse engineering the old system, looking at all these fields. Think about an address field and how many times it likely exists in your system; it’s probably in the dozens or hundreds. Then you have to decide which one to pull over, and how to optimally implement it in the new system.
This is extremely tedious and time consuming if done manually, so I’m going to give you a demonstration of our model migration utility to illustrate some of this.
All right. First, just to drive this point home, and for some of you who may not be Cognos proficient: for the demo I’m going to pick the Cognos model, and the reason I do it is that this is Cognos’s metadata client utility.
And this is one of the sample models, the Great Outdoors Data Warehouse that ships with the product, and this is where you design your metadata and create your semantic layer in Cognos. Cognos uses this for support, people use it for demos, and whatnot. This thing is a beast. It’s huge. It’s got folders, and multi-layered namespaces to organize what I would call, in, say, Tableau parlance:
This would be more of a data set or a subject area.
And when you drill into this thing, it’s a mess. It’s been around for about 20 years, and it’s tricky to navigate.
And you might not want to migrate the entire thing, but there’s a ton of value in it. A lot of business logic has been resolved, and there’s a lot of great stuff in these models that we might want to bring over.
So I wanted to show you what this model looks like. It really is a big hairball.
All right, so wouldn’t you agree, though, Mike, that when we actually hit client sites and look at their FM models or their BO universes, they take on characteristics like that?
Exactly like that, and in some cases they’re actually even harder to figure out and navigate. I 100% agree. We’ve certainly seen models of all shapes and sizes, ranging from large monolithic models like this one to, on the other end, a proliferation of smaller models. Oftentimes hundreds or thousands of packages from hundreds of models, which results in a system that’s really overburdened by a lot of overhead.
So now I’m going to go over to the Senturus Analytics Migration Tool, which has a pretty straightforward interface: a source and a destination system. Right now, that’s Cognos to Tableau and/or Analysis Services. So what I can do here is pull in, and this is literally the Framework Manager model file, which is stored as XML
on the Cognos system. And what we do is create what we call a Senturus model out of that. I’ve already run that to pull everything in, and you’ll recognize some of the data items here. Now, the first thing I’ll do is convert this model into a Tableau data source; Tableau can store data sources as a TDS file.
So I pick my Senturus model that I’ve parsed out, remember, from my GO Data Warehouse model, with all of that complex logic and all those subject areas. And I can go into my Sales query, which is a nice, neat dataset that I want to capture.
And I can bring over, say, order method, some of my organization information, my products, and sales. And if I just wanted a particular field, I could say, maybe I just want Date over here.
Or I could bring over an entire table. From there, I pick a template, which just has some basic configuration information for a TDS file, give it a name, and let it run its magic. Now I’m going to pull a little bit of a Martha Stewart here on you and jump over to the finished product: I can open up that Tableau data source in Tableau. What it’s done is parse that model and pull over all the physical data, all the links and everything, for that particular chunk of the model that I’ve chosen to pull over.
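For a sense of what generating a TDS involves, here is a minimal, hypothetical sketch. A .tds file is XML; the element and attribute names below loosely approximate that format but are illustrative only, and this is not the Senturus utility or a complete Tableau specification.

```python
import xml.etree.ElementTree as ET

def build_tds(name, server, dbname, columns):
    """Emit a minimal TDS-like XML document for a SQL Server source.

    columns: list of (caption, remote_column_name, datatype) tuples.
    """
    ds = ET.Element("datasource", attrib={"formatted-name": name, "inline": "true"})
    ET.SubElement(ds, "connection", attrib={
        "class": "sqlserver", "server": server, "dbname": dbname})
    for caption, remote, dtype in columns:
        # Tableau refers to fields by bracketed remote names.
        ET.SubElement(ds, "column", attrib={
            "caption": caption, "name": f"[{remote}]", "datatype": dtype})
    return ET.tostring(ds, encoding="unicode")

tds_xml = build_tds("GO Sales", "dbhost", "GOSALESDW",
                    [("Order Method", "ORDER_METHOD_EN", "string"),
                     ("Revenue", "SALE_TOTAL", "real")])
```

The real work in a migration tool is, of course, in translating the source model’s joins, expressions, and folder structure into elements like these, not in the XML serialization itself.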
So now, if I go over and create a new visualization, you can see I have my time, my order method, my products, all that information pulled over for me, in Tableau, connecting directly to that database. All right, so I can create brand-new content here, and this is not going through Cognos. It’s literally going back out to the database and doing those queries live.
Of course, from here the rest just becomes a regular Tableau exercise. This is hitting the same data source that the model was hitting, it’s all coming over into Tableau, and it’s migrated.
And, of course, once you’re over here on the Tableau side, I can do things in my data source like create a product hierarchy, so I can drill up and down on these data items. That’s all stuff you can do. OK, so that’s one pass through. The other option is, again, taking this Cognos model and pushing it into an Analysis Services model, which takes the form of a BIM file.
So once again, now I’m switching gears. The Tableau data source is a SQL data source, so that one went directly against the database.
What I’m doing on this side is designed to push things a little further upstream. If you picture my diagram of moving from the report to the semantic layer, that’s what this does, whereas the TDS went straight to the back-end data source. I’m going to once again access my data warehouse model, which again has all of its richness and complexity, and go through the same exercise, right?
I pulled my order method, my organization, and all this information over, plus my sales fact and my time. Then I give it a name for the BIM file I want to create.
OK, and when I click Finish, it’s actually going to generate that BIM file. Again, I’m going to jump ahead here in the interest of time, because it takes a little while to churn through some of that.
And I move over to another tool here called Tabular Editor, which is a free utility that lets you inspect and edit BIM files. Our utility has populated all this information: the source data, the relationships, and all my tables are in here.
From there, I can publish that to a SQL Server Analysis Services instance. And you can do things like create a hierarchy: I can click Year, Quarter, Month, and Date, right-click, and choose Create New Hierarchy. I can rename fields. There’s a whole bunch I can do here, keeping in mind that this is going to be published out as a cube source to my Analysis Services environment.
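To illustrate the tabular side, here is a small sketch of adding a drill hierarchy to a .bim file, which is JSON in the Tabular Object Model layout. The structure below is trimmed to the bare minimum (real models carry many more properties), and it is an illustrative sketch, not the Senturus utility or Tabular Editor itself.

```python
import json

# Skeleton of a tabular model; a real .bim file has many more properties.
bim = {"model": {"tables": [{
    "name": "Time",
    "columns": [{"name": "Year"}, {"name": "Quarter"}, {"name": "Month"}],
    "hierarchies": []}]}}

def add_hierarchy(model, table_name, hier_name, level_columns):
    """Append a drill hierarchy (e.g. Year > Quarter > Month) to one table."""
    table = next(t for t in model["model"]["tables"] if t["name"] == table_name)
    table.setdefault("hierarchies", []).append({
        "name": hier_name,
        "levels": [{"name": col, "ordinal": i, "column": col}
                   for i, col in enumerate(level_columns)]})

add_hierarchy(bim, "Time", "Calendar", ["Year", "Quarter", "Month"])
bim_json = json.dumps(bim, indent=2)  # what would be written back to the .bim file
```

Because the model is plain JSON, this kind of scripted edit is exactly the sort of thing a migration utility can automate before you publish to Analysis Services.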
All right, so this carries on. I’ve got a series of databases here, and it’s going to have me log in to all of this, which again I’ll skip in the interest of time. Now, if I go over to Power BI, it’s pulled over Order Method, Organization, Products, and Sales Fact in much the same way, but this is accessing SQL Server Analysis Services. So what I’ll do here is click on my order method type and pull over some of my unit price information.
And you can see that Power BI puts that over in this object here.
And it’s doing the query for me.
And the queries, I apologize, are a little slow, because we have the free version of Analysis Services. But you can see over here, for example, that I have two years of data, 2012 through 2013, and my totals are 66.86 and 120.226 for the averages of my unit cost and unit price.
And guess what? If I go over here to my Tableau dashboard that I had pulled over, I’m also filtered on 2012 and 2013.
My unit costs tick and tie, because I’m pulling from that same original metadata model. And what I forgot to point out, though I won’t go down that full path in the interest of time, is that in either of these environments I can publish this data source or workbook up to Tableau Server and use Tableau Server’s governance capabilities, their Data Server. Or, on the Power BI side, I can publish up to the Power BI service, have a governed data source up there, and access all this content on powerbi.com.
All right, so hopefully I’ve struck a balance between expediency and showing you some of that rich content. The utility is a great way to cherry-pick things. Tools like Tableau and Power BI tend to behave a little better, and to build workbooks around a more manageable dataset than that full model, so it’s kind of optimal for that: you pull over the stuff that’s really valuable, based on some of the steps we’ve already gone through.
All right. So you’ve got your metadata, you’ve used Report Insights to catalog your environment, you understand the usage, and you’ve been able to prioritize. You combine all of that with experts in both tools to make the report migration happen successfully. Again, consolidation and reduction can really shrink the footprint, and therefore the time, complexity, and risk of doing this.
What we typically see is that customers have run audit reports to catalog the inventory, and many of them triage their reports into buckets of simple, medium, and complex. The way you tackle these: for simple reports, oftentimes you end up just recreating them, because it’s easier to make them again, or perhaps make them visual; for a basic list or crosstab, there’s probably a better way to display that information visually. Then you look at permutations, again, look at consolidating those, and leverage all the great capabilities in these tools: hierarchies, drill-through, and drill-up/drill-down capabilities.
And with filters and parameters, you can make one report satisfy the business needs of hundreds or even thousands of users. Then there are the reports that fall into the complex category; that’s really where a redesign and/or optimization can make a lot of sense. Again, start with the most highly used and beneficial reports to the organization, based on that inventory, then leverage the capabilities of the new tools to make them more visual and interactive, and just add a lot more value.
Then a quick word about QA. We’ve seen, in our experience, that it’s really important to have QA outside of development, because when you’re developing you tend to get blinders on and can’t see the issues. It’s important to have dedicated resources, and those resources should include business users who both know the data and understand what they need. Because, as John mentioned at the outset, there are two big components to this: you have to get the system in right, so it performs and can deliver data, but people also have to adopt it.
So, if they trust the data, and it helps them do their jobs better, they’ll adopt the system.
All right, so now you’ve executed. We move into the Secure and Manage phase, which is about deployment, and the first step here is access. The mantra being: it should be very easy for users who need access to get the information they need, and very hard for anyone else. That’s really what security is about.
And, obviously, the attacks and risks of data breaches are higher than ever.
And the consequences of those breaches are very, very serious. It’s important to consider centralizing to streamline and ease the application of security: in other words, use things like Active Directory across different systems, and leverage groups rather than applying security at an individual user level. User-level security can really complicate things and either expose data you don’t want to, or make it hard for people to get at things. The more you can push security upstream, again, think about that diagram, from the report level to the server level or the semantic layer, or better yet the data source level, the more reuse you get, with less chance of human error and greater ease of application.
And then distribution is really about how users access the system. Today that’s via the web and mobile; it’s on big monitors on your shop floor; it’s embedded in business processes and other applications.
A support community is really important, and it needs to have many touch points, with a competency center or Center of Excellence, whatever you choose to call it. Leveraging things like wikis and communities, where you host lunch-and-learns, dashboard makeovers, SQL days and the like, enables you to groom subject matter experts in all of these various tools. The supporting structures may involve cloud technologies, or dealing with tickets through a managed services provider. You want users who are excited and engaged, who know they always have a place to go to solve problems, get support, and overcome challenges, and who aren’t alone, working in silos.
The training plan is one I really want to dive into, because, again, the second big component is that you’ve got the system in, but you have to get people to adopt it.
Struggles often stem from having a user base that was used to receiving only static reports controlled by IT, the report-factory methodology. They don’t know how to create things; they’re starting from scratch as organizations make this move to a self-service culture of analytics. It’s a massive change and shouldn’t be underestimated. You’re teaching your employees to fish: you’re teaching them to ask questions of data they may not have answers to, and they need to learn to ask the next question, because answering this one probably spawns ten others. To do that, you need to provide them with great data. It needs to be relevant, reliable, accurate, actionable, performant, and timely. There’s a great phrase I got from Ray at Constellation about nurturing digital artisans, and I just love that phrase, because these are the people who make data consumable by the entire organization. I have my little Senturus artisan guy here, made with my bad copy-paste and limited graphics skills, but the idea is people who combine the nerdy aspects of data science with an understanding of design, user experience, and storytelling.
It’s really important from a training perspective.
Training needs to be multimodal, then, and include a variety of ways for people to learn, whether that’s classroom, online instructor-led, self-paced, and/or customized, personalized training, which can really help drive adoption. It’s good to do hands-on workshops in small groups, which give you the ability to sit with the business users and, again, play with these self-service tools. To do this, you again need experts in both, or all, of the tools, across all of the different roles you’re touching: modelers, content creators, consumers, et cetera.
The rapid release cycle of these products requires ongoing training too, so people understand the new features that will benefit your organization and can actually leverage them. And however easy the tool seems, or however proficient and steeped in it you are, don’t make that mistake: users always need more training than you think.
The slightest changes, even in the same tool, can really throw people for a loop.
You need to develop internal gurus and evangelists who propagate the usage of the new tool and drive adoption.
A quick note on roles: it’s important to look at the impact on each role, ensure the appropriate level of investment and enablement, and recognize that people often wear multiple hats. You have your data modelers, and the tool changes can be significant there, ranging from the Tableau data source pane to Power BI Desktop and DAX to Framework Manager. Your report writers may be moving from something like Report Studio to Tableau or Power BI. Your administrators might see huge changes too: it might be going from on-prem to cloud, with various configuration and utility tools; some of the administrative tools are command-line utilities. So there can be really big changes across the spectrum of users and roles, and it’s important to look at all of that.
We’d be remiss if we didn’t touch on monitoring performance. Obviously, trust in the data and performance are the two main drivers of adoption: if it’s fast and they trust the data, they’ll probably adopt it. The problem is that with this many-sided pivot to the cloud or managed services, your bottlenecks proliferate and can be anywhere, especially in these more complex environments that span the enterprise. It could be in the cloud. It could be network latency.
It could be bandwidth issues with clustering or virtualization, failover, query performance, or database issues, and having the skills to chase those things down is something you’ll need to nurture, or you’ll need to augment your staff. We have a ton of webinars on this subject on our resources page, which I’ll point you to in a little while.
And then, of course, ongoing maintenance, our last point. These systems are constantly in flux. The business and regulatory environment is changing rapidly, and technology and capabilities are constantly being added to these products. So you need a plan for your dev, test, and QA environments, and for that migration and life-cycle management, which now becomes not an every-two-, three-, or five-year thing, but an every-three-months thing. And of course you have to consider things like backups and disaster recovery.
So that’s our tour of the Senturus framework.
There’s a bunch of benefits here that I’m not going to go through.
I’ll leave the slide up, and you can always download the deck, but in summary: migrations are complex, ubiquitous, and mission critical.
It’s important to approach these holistically and methodically, and many organizations lack the critical elements and the skills to succeed in a migration.
Consider where automation can help.
So again, we showed you a couple of tools we’ve developed to help you catalog your reports and migrate metadata models. And consider getting help with some of these important elements, whether it’s assessment of your current environment, tool selection, cloud or managed services, or implementation. The list kind of goes on and on: training, self-service, and more.
With that in mind, I invite you over to our migration page, which I have a link to later on. Our Senturus experts can help with platform migrations, whether that’s from Cognos to Power BI or Tableau, or vice versa,
Business Objects to Cognos, or OBIEE to Cognos, Power BI, or Tableau. We can help with application or tool migrations.
So if you’re switching ETL tools, that could be moving from, say, Cognos Data Manager, which is end of life, to SSIS or DataStage, or from Cognos cubes to SSAS.
There are a bunch of different ways we can help you optimize analytics in your organization and help with those migrations.
If you’re considering a cloud migration, we can help with that as well, and of course we can offer hosting of your BI environment. So how specifically can we make you successful? There are a ton of different ways. We can help you with these data model and/or report migration tools. We can help you with assessments, road mapping, and/or tool selection. When you get to implementation, we can help with architecture, bring in some of our data artisans to supplement your development staff, and help with upgrades, performance monitoring, all that good stuff. And, of course, there are the software solutions we talked about.
And, oh, there’s the link right here to our platform migrations page, at the bottom of this page, so you can get that from the deck. As a call to action, we’d like to offer a proof of concept, where we will help you migrate your metadata model.
So give us a call and we’ll discuss your situation. If it’s appropriate, we’ll help you migrate a portion of your model, walk through what a report rebuild might look like, and then provide some recommendations on how we might help you accelerate your migration. So give us a shout; we’ll have contact information at the end of the webinar. Stick around for a few minutes; I’m just going to give you a couple of slides on Senturus and who we are, for those of you who don’t know us. Our clients know us for providing clarity from the chaos of complex business requirements, disparate data sources, and constantly moving targets.
We made a name for ourselves because of our strength in bridging the gap between IT and business users. We deliver solutions that give our clients access to reliable, analysis-ready data across the organization, enabling you to quickly and easily get answers at the point of impact, in the form of the decisions you make and the actions you take.
As I mentioned, our consultants are leading experts in the field of analytics, with years of pragmatic, real-world experience and experience advancing the state of the art.
We’re so confident in our team and our methodology that we back our projects with a 100% money-back guarantee, which is unique in the analytics consulting industry. We’ve been doing this for a long time; we’re approaching two decades.
Our client breadth gives you a really good idea of the range of industries we’ve worked in. We’ve delivered thousands of successful projects for hundreds of clients, ranging from the Fortune 500 to the mid-market. We solve business problems across many different industries and functional areas, including the office of finance, sales and marketing, manufacturing, operations, HR, and IT. So if you need someone to help with your next analytics project, we’d love the opportunity to leverage our experience for you.
Some additional resources and upcoming events: we have a great webinar about Tableau dashboards that captivate and communicate on Thursday; head over to senturus.com/events. And another great one on June 6, where we’ll talk about the latest version of the Senturus Analytics Connector, which allows you to use visual analytics tools with your trusted Cognos metadata and reports. So go check those out, and while you’re over there, visit our free resource library that John talked about earlier, where we’ll post this webinar as well as hundreds of other great resources, and our blog, where you can learn what’s top of mind at Senturus.
I’ve included this page just for reference here. You’ve got a bunch of great links to webinars and resources that are specific to or related to this webinar.
And then a quick shout out about our training options.
We have, If you go to senturus.com/training, we have regularly scheduled courses, as well as customized, large group and private instruction.
And we’ve recently added self-paced learning, which is a great, cost-effective approach. You can see it here; it’s $49. Go to cbt.senturus.com: computer-based training, self-paced and easily digestible, with downloadable course materials.
I did chew up pretty much the entire hour here; I think we have just a few minutes left.
Exactly, and we only have a few questions, but they’re really good, complex ones for you. But yeah, thanks, Mike, a whirlwind tour. Actually, one question came in at the very beginning that I didn’t get a chance to address, so I’ll tackle it, and feel free to add on. The question, from Crystal, was about the webinar called Calming the Cognos Farewell Fears: “I’m interested to know, is this a trend, migration away from Cognos to other BI platforms?” And it was interesting, because somebody else asked after that email whether it means Senturus is actually getting off of Cognos.
And I have to apologize for what I’ll call the marketing department being a little overzealous with the titling. The fact of the matter is, the largest project we are currently doing is a migration to Cognos. So the high-level answer to your question is: there is not a mass migration away from Cognos.
However, Cognos has been around for a while and makes up a very large portion of our client base, and we’ve had a number of inquiries from those clients looking to migrate, whether to a newer version, evaluating the new version of Cognos Analytics, or to other tools. So when we put together our webinar content, we look for things we’re getting inquiries on, but it doesn’t necessarily reflect some mass migration away from Cognos.
And as I say, particularly with some of the new capabilities we’re seeing in 11.1, a lot of folks are finding they can actually get to a bimodal approach with a single tool, which is something we find very challenging with tools like Tableau, for instance. So the short answer is no, and we’re sorry if we scared people; there isn’t a mass trend you’re missing. But if a migration is on your roadmap, it’s an area where we can help.
We definitely can help, and you saw all the different ways we could do so.
Fair point. John and I almost threw up the Gartner bimodal slide, but the gist of it is that you’re seeing an uptick in the number of customers adopting a bimodal approach. They’re seeing the benefit of having the mode one tools and the mode two tools coexisting in an organization. And the really interesting part is that not only is that universe growing, but the top performers in it are the ones adopting the bimodal approach. So when it’s used successfully, it’s benefiting their business.
Oh, I had the wrong mute button, sorry about that. I was busy queuing up a question for you. I’m going to distill a question here that gets very technical, but the essence is: does your migration tool help migrate very complex measures? And how does it work once you have some really tricky things set up in one of these existing metadata models and move them to the other?
And I’ll start the answer by saying that one of the things we’ve learned, both by doing these migrations manually for years and particularly now that we’re automating them, is that there is no standard for how these systems are configured by vendors. The only things that are really standard are that you have some data sources and you typically have some sort of graphical and report output. What happens in the middle tends to be very different between Tableau, Microsoft, Cognos, and Business Objects. I would say Cognos and Business Objects share a lot more commonality, as does …, but maybe, Mike, you can speak to that: for instance, just the number of layers in a model, or the types of technologies used across the board, from how security is handled to how calculations and hierarchies are handled. And I hate to give this answer, but it depends. In some cases you can pull them straight over and they will work.
The most common issue we see is, say you have a ratio or a percentage, right?
If you just pull that over and throw it into a report, when it aggregates, it'll get the answer wrong. So you still need to go into, say, Tableau and tell it to do the order of operations correctly, so that it rolls up the totals first and then does the divide. So it really depends on the measure and the type of measure.
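To make that order-of-operations point concrete, here is a minimal Python sketch, using hypothetical data (not from the webinar), of why a ratio measure that is migrated as-is can aggregate incorrectly:

```python
# Hypothetical rows for a "profit margin" measure: profit / sales.
rows = [
    {"region": "East", "profit": 10, "sales": 100},    # margin 10%
    {"region": "West", "profit": 300, "sales": 1000},  # margin 30%
]

# Wrong: compute the ratio per row, then aggregate the ratios.
# This is what a naively migrated measure can end up doing.
naive = sum(r["profit"] / r["sales"] for r in rows) / len(rows)

# Right: roll up the totals first, then do the divide.
correct = sum(r["profit"] for r in rows) / sum(r["sales"] for r in rows)

print(round(naive, 4))    # 0.2
print(round(correct, 4))  # 0.2818
```

The two results differ because the naive version weights every region equally, regardless of its sales volume. In Tableau this is the difference between averaging a row-level calculation and writing the measure as an aggregate calculation such as `SUM([Profit]) / SUM([Sales])`.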
But again, that’s where we would want to look at your environment and understand what you’re, what you’re trying to bring over.
Yeah, and I think this also gets to a question that I had expected to see, and that's probably in the back of people's minds: how do I get that tool? And the reality is, you don't. We will not be delivering it as a software product that you download and run on your own, for exactly the reasons Mike just described. For us, it has become an enabler, but it really has to be paired with people who understand both the source platform and the target platform, as well as all of the issues that manifest themselves because these platforms are typically so different. A related question we often get asked is: OK, we bring you guys in to help with our migrations.
You're automating pieces; what percentage benefit do I get from that? And, like Mike said, it depends. For certain types of applications we've seen as high as 70% in a real-world application. When you're migrating from a more legacy system that's not set up the way you would really want to lift and shift, and you need to do improvement at the same time, that percentage goes down. It also depends on how much is highly manual report development in the new tool, where the logic or the technical capabilities didn't exist in older versions of the legacy platforms; those are the ones that typically have a more manual aspect to them. So it really does depend. But I can definitely tell you one thing.
Having watched our team over the years tackle projects in the highly manual fashion, where you literally take an expert who knows both the source and target tools, and they are literally clicking into one system, getting down to which address field that SQL is referencing, then going back out to the new system and coding that in, you can imagine how fraught with error that is, and how easy it is to introduce manual mistakes. And that's really the impetus for us to put together these automated approaches that do away with the accuracy issues that can really plague you, because they're very hard to actually find and debug.
Let me see. "I didn't see the report migration tool name."
Mike, I think that one's for you.
So, there isn't a report migration tool, per se.
Report Insights is really the tool for cataloging and inventorying the environment, and we call that Report Insights; then there's the metadata migration utility.
We literally just call that the metadata migration utility right now, so maybe we need to work on the naming. The fact of the matter is, let's say you've now automated the movement of your metadata model and you need to build out some reports or dashboards. That breaks down into two components. The tricky part is figuring out what logic is making the numbers show up on the screen: this particular number, that measure, what is it? Where did it come from? How is it calculated? That's the part where our Report Insights tool gives you insight into that logic. The next thing, which today is still fairly manual, or 100% manual, is drawing the pretty pictures on the screen, because of the way each vendor's UI works. They don't have APIs and structures that make it easy to do that out of the box, not in any systematic fashion.
So it's a little bit of both when it comes down to it.
There's one interesting question that came up at the very beginning, from Joe: "I've found in the majority of in-house upgrades that clients are always cutting their costs by going with minimum hardware requirements. That impacts overall performance, and then after the fact they go, hey, what's going on with the system?" Maybe, Mike, you can speak to that.
I think it comes back to, again, the success factors are more than just automating, lift and shift.
Right. I mean, I think that's absolutely true. Cost often gets underestimated, and organizations are always looking for ways to save. If there's a positive here, it's when you do make the shift to the cloud and things like that.
There's a level of elasticity that cloud environments provide. If you can chase down the bottleneck and determine that, hey, it's my BI application server, or I need to make a cluster out of this, or this database is too slow, you can easily grow the capacity in those environments, pretty seamlessly in a lot of cases. Or if you just need that capacity temporarily, like for your end-of-month, closing-the-books kind of stuff, that's great.
But if you're doing it on-prem, on your own hardware, then yeah, it's important not only to get it right for the initial rollout, but you've also got to bake a fair amount of growth into it. And given the ubiquity and mission-critical nature of BI systems, that usually means investing pretty significantly.
Well, I think that does it for the questions here, Mike.
Yeah, and we're also over time, so we'll be mindful of folks' day.
But I want to thank you, and the audience, for a wonderful session today. You really pulled us through a whirlwind tour of what it takes to have a successful migration, and migration applies to lots of different ways that you're manipulating your BI environments. So, thank you very much. Great. Thanks, John, and thanks to you guys for sticking around. I put up our contact info here at the top. You can reach us via our website, email us at firstname.lastname@example.org, or, if you're old school and want to pick up the phone, we've got a toll-free number. I also encourage you all to connect with us on the various social media links down here: LinkedIn, SlideShare, YouTube, Twitter and Facebook. Thanks so much for hanging around with us today. Hopefully you enjoyed the content, and we look forward to seeing you at one of our upcoming events. Have a great rest of your day.