
Friday, March 06, 2015

Showing Value Early and Often Boosts Software Testing Success at Pomeroy

Transcript of a BriefingsDirect discussion on how a managed service provider uses HP tools to improve software quality and to speed modernization across application testing and development.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.

Dana Gardner: Hello, and welcome to the next edition of the HP Discover Podcast Series. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing sponsored discussion on IT innovation and how it’s making an impact on people’s lives.

Once again, we're focusing on how companies are adapting to the new style of IT to improve IT performance and deliver better user experiences, as well as better business results.

This time, we're coming to you from the recent HP Discover 2014 Conference in Las Vegas. We're here to learn from IT and business leaders alike how big data, cloud, and converged infrastructure implementations are supporting their goals.
Our next innovation case study interview explores how Pomeroy, a global IT managed services provider, improves quality in its application testing, development, and customization. By working with a partner, TurnKey Solutions, Pomeroy improves its overall development process and thereby achieves far better IT and business outcomes.

In order to learn more about how they're improving app testing proficiency, please welcome Mary Cathell, Quality Assurance Analyst at Pomeroy in Hebron, Kentucky. Welcome, Mary.

Mary Cathell: Hi. Thanks for having me.

Gardner: We're also here with Daniel Gannon, President and CEO at TurnKey Solutions in Denver. Welcome.

Daniel Gannon: Dana, thanks for having me.

Gardner: Tell us about Pomeroy and then how improved development has boosted software benefits internally, as well as for your end-user customers across your managed-service provider (MSP) offerings.

Cathell: We're a premier provider of IT managed services. We do end user, network, data center, and everything in between. We’re hands on all over the place. We have a global footprint. Quality is absolutely imperative. We have big client companies like Nestle, Goodyear, and Bayer. These are companies that have a certain amount of respect in the business world. They depend upon quality in their products, and we need to deliver quality in our products to them.

Gardner: And you're the sole quality assurance analyst. So you have a big job.

Cathell: I do.

Gardner: What did you find when you got there, and what was the steady state before you started to make some improvements?

Making improvements

Cathell: This was November of 2012. They gave me an opportunity to bring something new that they were unfamiliar with and to teach, which I love to do. They had purchased Oracle E-Business Suite (EBS). Everyone had their own piece of the process, from sales to logistics, and they were all using different applications to run it.

It was a paradigm shift to take one system and bring us together as one company using one product. There was a lot of struggle through that, and they struggled through testing this, because they had no testing background. I was brought in to bring it to steady state.

After we went live, we got to steady state. Now it was like, "Let's not reinvent the wheel. Let's do this right. Let's begin scripting."

Testing is terrible. It's tedious. No one has the time to do it. No one has the patience to do it. So they either don’t do it or they throw buckshot at it. They do ad-hoc testing, or they just let errors come in and out, and they fix them on the back end, which is client facing.

Does Goodyear want to see a real-estate problem on an invoice? No, they don't, and we lose credibility. Goodyear is talking to their clients. They have friends. Their CEO is talking to another company's CEO. Now, you’ve got a word-of-mouth situation off of one mistake. You can't have that.

Gardner: What were some of the hurdles that you needed to overcome to become more automated, to take advantage of technology, and to modernize the quality assurance processes? Then, we'll talk about how TurnKey works in that regard. But let's talk first about what you had to overcome.

Cathell: I had to show the value. Value is everything, because people ask, "Why do we need to do this? This is so much work. What value is that going to bring to me?"

Again, it lets your processes work with the business function like a well-oiled machine, because you're not separate anymore. You're not siloed. You need to work together. It's cross-functional. It taught us our data.

Now there's an understanding that this works. We can function better now in just our regular business process, not even testing, but what we do for our customer. That’s value for our internal customers, which ends up being absolute value to our external customers.

Gardner: The solution you went for included HP Quality Center, but you wanted to take that a step further, and that's where TurnKey comes in.

Due diligence

Cathell: I talked to several other companies. You need to. You need to do the due diligence. TurnKey did a wonderful thing. They provided something that no one else was doing.

We didn't have the bandwidth or the talent internally to script automation. It's very difficult and it's a very long process, but they have an accelerator that lets you drag and drop from out-of-the-box Oracle and make changes, as you need to, for your customizations and personalizations.
They also had cFactory, so that when your system changes -- and it will, because your business grows, your process changes -- it tells you the differences. You just click on a form, and it brings back what's there, shows you the comparison on what's changed, and asks if you would like to keep those changes. You don’t have to update your entire test case suite. It does it for you. It takes out that tedious mess of trying to keep updated.
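
The change-detection idea Mary describes -- snapshot a form, compare it against a baseline, and surface only the differences for review -- can be sketched in a few lines. This is purely an illustration of the concept, not TurnKey's actual cFactory code; the snapshot format and field names here are invented for the example:

```python
# Illustrative sketch of baseline-vs-current change detection for a form.
# A "snapshot" is modeled as a dict mapping field name -> field type; a real
# tool would capture far richer metadata from the application under test.

def diff_form(baseline: dict, current: dict) -> dict:
    """Return the fields added, removed, or changed between two snapshots."""
    added = {k: current[k] for k in current.keys() - baseline.keys()}
    removed = {k: baseline[k] for k in baseline.keys() - current.keys()}
    changed = {k: (baseline[k], current[k])
               for k in baseline.keys() & current.keys()
               if baseline[k] != current[k]}
    return {"added": added, "removed": removed, "changed": changed}

# Hypothetical order-entry form: a 'discount' field appears after an upgrade.
baseline = {"order_no": "text", "customer": "text", "qty": "int"}
current = {"order_no": "text", "customer": "text", "qty": "int",
           "discount": "float"}

delta = diff_form(baseline, current)
print(delta["added"])  # only the new field needs review, not the whole suite
```

Only the delta is presented for an accept/reject decision, which is what spares the tester from re-recording the entire test case suite after every change.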

Gardner: Daniel, is this what a lot of your clients go through, and what is it that you're bringing to the table in addition to HP Quality Center that gets their attention and makes this more powerful?

Gannon: Yeah, her story resonates. It’s very, very common for people to have those same issues. If you look at the new style of IT, it's really about two things, the two Vs, volume and velocity. You have a lot more data -- big data -- and it comes at you much faster. The whole notion of agility in business is a real driver, and these are the things that HP is addressing.

From the perspective of how we deal with test automation, that’s what our products are designed to do. They enable people to do that quickly, easily, and manage it in a way that doesn't require armies of people, a lot of labor, to make that happen.

If you think about a standard environment like Mary’s at Pomeroy, the typical way people would address that is with a lot of people, a lot of hands, and a lot of manual effort. We think that intelligent software can replace that and help you do things more intelligently, much more quickly, and most importantly, at much, much lower cost.

Gardner: Mary, you've been at this for going on a couple of years. When you do intelligent software well, when you pick your partners well, what sort of results do you get? What’s been the change there?

Cathell: There is a paradigm shift, because now, when they, specifically our sales department, see the tool run, they're wowed. They're working with me to champion the tool to other parts of the business. That's ultimately the biggest reward -- to see people get it and then champion it.

Gardner: Has this translated into your constituents, your users, coming back to you for more customization because they trust this so that they're more interested in working with software, rather than resisting it?

Difficult to automate

Cathell: We absolutely did have that change, again specifically with sales, which is the most difficult process to automate, because it can go in so many different ways. So they're on board. They're leading my fight that we need to do this. This is where this company needs to go. This is where technology is going.

Gardner: And when you bring this mentality of better software quality and giving them the means to do it that’s not too arduous, doesn't that then also extend back into your fuller application development processes? How far back are you going through development and change? Is there a DevOps opportunity here for you to bring this into operations and start to sew this together better?

Cathell: That could happen in the future. Our requirements phase is a lot better, because now they're seeing scenarios with expected results -- pass/fails. Now, when we build something new, they go back and look at what they've written for our test scenarios and say, "Oh, our requirements need to be unambiguous. We need to be more detailed."

I do that liaison work where I speak geek for the developers and English for the business. We marry them together, and that now creates new quality products.

Gardner: Daniel, Pomeroy uses this to a significant degree with Oracle EBS, but how about some of your other customers? What other applications, activities, and/or products has this been applied to? Do you have any metrics of success across some instances of what people get for this?

Gannon: We find that customers leverage the HP platform as the best-in-class platform for test automation across the broadest portfolio of applications in the industry, which is really powerful. What TurnKey Solutions brings to the table is specialization in conjunction with that platform. Our partnership reaches back well over a decade, where we have developed these solutions together.

We find that people use mission-critical applications, enterprise resource planning (ERP) applications like Oracle EBS, SAP, PeopleSoft and others that run the business. And our solutions address the unique problems of those applications. At the same time, we provide a common set of tools to provide test automation across the entire portfolio of applications within a company.

Many companies will have 600, 700, or thousands of applications that require the same level of due diligence and technology. That's what this kind of combination of technologies provides.

Gardner: Mary, now that you've done this and have some hindsight -- not just from your current job but from previous jobs -- do you have any words of wisdom for other organizations that know they've got quality issues? They don't always necessarily know how to go about it, but probably think that they would rather have an intelligent, modern approach. What can you tell them as they get started?

Break it down

Cathell: Absolutely. Everybody wants to look at the big picture -- and you should look at the big picture -- but you need to break it down. Do that in agile form. Make those into iterations. Start small and build up. Start from the smallest process that you have and keep building upon it; you're going to see more results than by trying to tackle this huge elephant in the room that's just unattainable.

Gardner:  A lot of times with new initiatives, it’s important to establish a victory early to show some returns. How did you do that and how would you suggest others do that in order to keep the ball rolling?

Cathell: Get people excited. Get them onboard. Make sure that they're involved in the decision making and let them know what your plans are. Communication is absolutely key, and then you have your champions.

Gardner: Daniel, we're here at HP Discover. This is where they open the kimono, in many ways, on their software, testing, and application lifecycle management businesses. As a longtime HP partner, what are you hoping to see? What interests you? Any thoughts about the show in general?

Gannon: What's exciting is that HP addresses IT problems in general. There's no specificity necessarily. What companies really grapple with is how to put together a portfolio of solutions that addresses their entire IT needs, rather than simple, specific, smokestack kinds of solutions. That's what's really exciting. HP brings it all together, really delivers, and then values the customers. That's what I think is really compelling.

Gardner: Okay, how about the emphasis on big data -- recognizing that applications are more now aligned with data and that analysis is becoming perhaps a requirement for more organizations in more instances? How do you see application customization and big data coming together?

Gannon: For me, big data is both problem and opportunity. The problem is that it's big data. How do you cull and create intelligence from this mass of data? That's where the magic lies. Those people who can tease actionable information from these massive data stores have the ability to act upon it.

There are a number of HP solutions that enable you to do just that. That will propel businesses forward to their next level, because you can use that information -- not just data, but information -- to make business decisions that enable customers going forward.

Gardner: Very good. I'm afraid we'll have to leave it there. We've been learning how Pomeroy, a global IT managed services provider in Kentucky, has been improving the quality of its application development and has found more modernization and intelligence through the combination of HP Quality Center and TurnKey Solutions.

So, a big thank you to our guests. We’ve been talking with Mary Cathell, the Quality Assurance Analyst at Pomeroy in Hebron, Kentucky. Thank you so much. 
Cathell: Thank you, Dana. It was an honor.

Gardner: And also a big thank you to Daniel Gannon, President and CEO of TurnKey Solutions. Thanks.

Gannon: Thanks, Dana.

Gardner: And I shouldn't forget to thank our audience as well for joining us for this special new style of IT discussion coming to you directly from the HP Discover 2014 Conference in Las Vegas.

I’m Dana Gardner; Principal Analyst at Interarbor Solutions, your host for this ongoing series of HP sponsored discussions. Thanks again for listening, and come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: HP.

Transcript of a BriefingsDirect discussion on how a managed service provider uses HP tools to improve software quality and to speed modernization across application testing and development. Copyright Interarbor Solutions, LLC, 2005-2015. All rights reserved.

You may also be interested in:

      Tuesday, February 24, 2015

      Columbia Sportswear Sets Torrid Pace for Reaping Global Business Benefits From Software-Defined Data Center

      Transcript of a BriefingsDirect discussion on how a major sportswear company has leveraged virtualization, SDDC and hybrid cloud to reap substantial business benefits.

      Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: VMware.

      Dana Gardner: Hello, and welcome to the special BriefingsDirect podcast series coming to you directly from the recent VMworld 2014 Conference. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host throughout this series of BriefingsDirect IT Strategy Discussions.

      We’re here in San Francisco to explore the latest developments in hybrid cloud computing, end-user computing, and software-defined data center (SDDC).

      Our next innovator case study interview focuses on Columbia Sportswear in Portland, Oregon. We're joined by a group from Columbia Sportswear, and we'll learn more about how they've made the journey to SDDC. We'll see how they’ve made great strides in improving their business results through IT, and where they expect to go next with their software-defined efforts.

      To learn more, please join me in welcoming our guests: Suzan Pickett, Manager of Global Infrastructure Services at Columbia Sportswear; Tim Melvin, Director of Global Technology Infrastructure at Columbia; and Carlos Tronco, Lead Systems Engineer at Columbia Sportswear. Welcome.

      Gardner: People are familiar with your brand, but they might not be familiar with your global breadth. Tell us a little bit about the company, so we appreciate the task ahead of you as IT practitioners.

      Pickett: Columbia Sportswear is in its 75th year. We're a leader in global manufacturing of apparel, outdoor accessories, and equipment. We're distributed worldwide and we have infrastructure in 46 locations around the world that we manage today. We're very happy to say that we're 100 percent virtualized on VMware products.

      Gardner: And those 46 locations, those aren't your retail outlets. That's just the infrastructure that supports your retail. Is that correct?

      Pickett: Exactly, our retail footprint in North America is around 110 retail stores today. We're looking to expand that with our joint venture in China over the next few years with Swire, distributor of Columbia Sportswear products.

      Gardner: You're clearly a fast-growing organization, and retail itself is a fast-changing industry. There's lots going on, lots of data to crunch -- gaining more insight into buyer preferences -- and bringing that back into a feedback loop. It's a very exciting time.

      Tell me about the business requirements that you've had that have led you to reinvest and re-energize IT. What are the business issues that are behind that?

      Global transformation

      Pickett: Columbia Sportswear has been going through a global business transformation. We've been refreshing our enterprise resource planning (ERP). We had a green-field implementation of SAP. We just went live with North America in April of this year, and it was a very successful go-live. We're 100 percent virtualized on VMware products and we're looking to expand that into Asia and Europe as well.

      So, along with our global business transformation comes our consumer experience, on the retail side as well as wholesale. IT is looking to deliver service to the business, so it can become more agile and focused on engineering better products and better design and getting that out to the consumer.

      Gardner: To be clear, your retail efforts are not just brick and mortar. You're also doing it online and perhaps even now extending into the mobile tier. Any business requirements there that have changed your challenges?

      Pickett: Absolutely. We're really pleased to announce, as of summer 2014, that Columbia Sportswear is an AirWatch customer as well. So we get to expand our end-user computing and our VMware Horizon footprint, as well as some of our SDDC strategies.

      We're looking at expanding not only our e-commerce and brick-and-mortar, but being able to deliver more mobile platform-agnostic solutions for Columbia Sportswear, and extend that out to not only Columbia employees, but our consumer experience.

      Gardner: Let’s hear from Tim about your data center requirements. How does what Suzan told us about your business challenges translate into IT challenges?

      Melvin: With our business changing and growing as quickly as it is, and with us doing business and selling directly to consumers in more than 100 countries around the world, our data centers have to be adaptable. Our data and our applications have to be secure and available, no matter where we are in the world, whether you're on network or off-premises.

      The SDDC has been a game-changer for us. It's allowed us to take those technologies, host them where we need them, with whatever cost configuration makes sense, whether it's in the cloud or on-premises, and deliver the solutions that our business needs.

      Gardner: Let's do a quick fact-check in terms of where you are in this journey to SDDC. It includes a lot. There are management aspects, network aspects, software-defined storage, and then of course mobile. Does anybody want to give me the report card on where you are in terms of this journey?

      100 percent virtualized

      Pickett: We're 100 percent virtualized with our compute workloads today. We also have our storage well-defined with virtualized storage. We're working on an early adoption proof of concept (POC) with VMware's NSX for software-defined networking.

      It's really the next step in defining our SDDC: being able to leverage all of our virtual workloads, extend into the vCloud Air hybrid cloud, and burst our workloads to expand our data centers and our toolsets. So we're looking forward to the next step of our journey, which is software-defined networking via NSX.

      Gardner: Taking that network plunge, what about the public-cloud options for your hybrid cloud? Do you use multiple public clouds, and what's behind your choice on which public clouds to use?

      Melvin: When you look at infrastructure and the choice between on-premises solutions, hybrid clouds, and public and private clouds, I don't think it's a choice necessarily of which answer you choose. There isn't one right answer. What's important for infrastructure professionals is to understand the whole portfolio and understand where to apply your high-power, on-premises equipment and where to use your lower-cost public cloud, because there are trade-offs in each case.

      When we look at our workloads, we try to apply the correct tool for the correct job. For instance, we run our completely virtualized SAP environment on internal, on-premises equipment. When we start to talk about development in a sandbox, those cases are probably best served in a public cloud, as long as we can secure and automate just as we can on-site.

      Gardner: As you're progressing through SDDC and exploring these different options and what works best both technically and economically in a hybrid cloud environment, what are you doing in terms of your data lifecycle? Is there a disaster recovery (DR) element to this? Are you doing warehousing in a different way and distributing that, or are you centralizing it? I know that analysis of data is super important for retail organizations. Any thoughts about that data component of this overall architecture?

      Pickett: Data is really becoming a primary concern for Columbia Sportswear, especially as we get into more analytical situations. Today, we have our two primary data centers in North America, which we protect with VMware's vCenter Site Recovery Manager (SRM), a very robust DR solution.

      We're very excited to work with an enterprise-class cloud like vCloud Air that has not only the services that we need to host our systems, but also DR as a service, which we're very interested in pursuing, especially around our remote branch office scenarios. In some of those remote countries, we don't have that protection today, and it will give us a little more business continuity or disaster avoidance, as needed.

      As we look at data in our data centers, our primary data centers with big data, if you will, and/or enterprise data warehouse strategies, we've started looking at how we're replicating the data where that data lives. We've started getting into active data center scenarios -- active, active.

      We're really excited around some of the announcements we've heard recently at VMworld around virtual volumes (VVOLs) and where that's going to take us in the next couple of years, specifically around vMotion over long distance. Hopefully, we'll follow the sun, and maybe five years from now, we'll be able to move our workloads from North America to Asia and have them follow where the people are using them.

      Geographic element

      Gardner: That’s really interesting about that geographic element if you're a global company. I haven't heard that from too many other organizations. That’s an interesting concept about moving data and workloads around the world throughout the day.

      We've seen some recent VMware news around different types of cloud data offerings, Cloud Object Store for example, and moving to a virtual private cloud on demand. Where do you see the next challenges in terms of your organization, and how do you feel that VMware is setting the goal posts for you?

      Tronco: The vCloud Air offerings that we've heard so much about are an exciting innovation.

      Public clouds have been available for a long time. There are a lot of places where they make sense, but vCloud Air, being an enterprise-class offering, gives us the management capability and allows us to use the same tools that we would use on-site.

      It gives us the control that we need in order to provide a consistent experience to our end-users. I think there is a lot of power there, a lot of capability, and I'm really excited to see where that goes.

      Gardner: How about some of the automation issues with the vRealize Suite, such as Air Automation? Where do you see the component of managing all this? It becomes more complex when you go hybrid. It becomes, in one sense, more standardized and automated when you go software-defined, but you also have to have your hands on the dials and be able to move things.

      Tronco: One of the things that we really like about vCloud Air is the fact that we'll be able to use the same tools on-premises and off-premises, and won't have to switch between tools or dashboards. We can manage that infrastructure, whether it's on-premises or in the public cloud, and we'll be able to leverage the efficiencies we have on-premises in vCloud Air as well.

      We also can take advantage of some of those new services, like ObjectStore, that might be coming down the road, or even continuous integration (CI) as a service for some of our development teams as we start to get more into a DevOps world.

      Customer reactions

      Gardner: Let’s tie this back to the business. It's one thing to have a smooth-running, agile IT infrastructure machine. It's great to have an architecture that you feel is ready to take on your tasks, but how do you translate that back to the business? What does it get for you in business terms, and how are you seeing reactions from your business customers?

      Pickett: We're really excited to be partnering with the business today. As IT comes out from underground a little bit and starts working more with the business and understanding their requirements -- especially with tools like VMware vRealize Automation, part of the vCloud Suite -- we're now partnering with our development teams to become more agile and help them deliver faster services to the business.

      We're working on one of our e-commerce order confirmation toolsets with vRealize Automation, part of the vCloud Suite, and their ability to now package and replicate the work that they're doing rather than reinventing the wheel every time we build out an environment or they need to do a test or a development script.

      By partnering with them and enabling them to be more agile, IT wins. We become more services-oriented. Our development teams are winning, because they're delivering faster to the business and the business wins, because now they're able to focus more on the core strategies for Columbia Sportswear.

      Gardner: Do you have any examples that you can point to where there's been a time-to-market benefit, a time-to-value faster upgrade of an application, or even a data service that illustrates what you've been able to deliver as a result of your modernization?

      Pickett: Just going back to the toolset that I just mentioned. That was an upgrade process, and we took that opportunity to sit down with our development team and start socializing some of the ideas around VMware vRealize Automation and vCloud Air and being able to extend some of our services to them.

      At the same time, our e-commerce teams are going through an upgrade process. So rather than taking weeks or months to deliver this technology to them, we were able to sit down, start working through the process, automate some of those services that they're doing, and start delivering. So, we started with development, worked through the process, and now we have quality assurance and staging and we're delivering product. All this is happening within a week.

      So we're really delivering and we're being more agile and more flexible. That’s a very good use case for us internally from an IT standpoint. It's a big win for us, and now we're going to take it the next time we go through an upgrade process.

      We've had this big win and now we're going to be looking at other technologies -- Java, .NET, or other solutions -- so that we can deliver and continue the success story that we're having with the business. This is the start of something pretty amazing, bringing development and infrastructure together and mobilizing what Columbia Sportswear is doing internally.

      Gardner: Of course, we call it SDDC, but it leads to a much more comprehensive integrated IT function, as you say, extending from development, test, build, operations, cloud, and then sourcing things as required for a data warehouse and applications sets. So finally, in IT, after 30 or 40 years, we really have a unified vision, if you will.

      Any thoughts, Tim, on where that unification will lead to even more benefits? Are there ancillary benefits from a virtuous adoption cycle that come to mind from that more holistic whole-greater-than-the-sum-of-the-parts IT approach?

      Flexibility and power

      Melvin: The closer we get to a complete software-defined infrastructure, the more flexibility and power we have to remove the manual components, the things that we all do a little differently and we can't do consistently.

      We have a chance to automate more. We have the chance to provide integrations into other tools, which is actually a big part of why we chose VMware as our platform. They allow such open integration with partners that, as we start to move our workloads more actively into the cloud, we know that we won't get stuck with a particular product or a particular configuration.

      The openness will allow us to adapt and change, and that’s just something you don't get with hardware. If it's software-defined, it means that you can control it and you can morph your infrastructure in order to meet your needs, rather than needing to re-buy every time something changes with the business.

      Gardner: Of course, we think about not just technology, but people and process. How has all of this impacted your internal IT organization? Are you, in effect, moving people around, changing organizational charts, perhaps getting people doing things that they enjoy more than those manual tasks? Carlos, any thought about the internal impact of this on your human resources issues?

      Tronco: Organizationally, we haven’t changed much, but the use of something like vRealize Automation allows us to let development teams do some of those tasks that they used to require us to do.

      Now, we can do it in an automated fashion. We get consistency. We get the security that we need. We get the audit trail. But we don’t have to have somebody around on a Saturday for two minutes of work spread across eight hours. It also lets those application teams be more agile and do things when they're ready to do them.

      Having that time free lets us do a better job with engineering, look down the road better with a little more clarity, maybe try some other things, and have more time to look at different options for the next thing down the road.
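The pattern Tronco describes -- self-service provisioning that still gives IT consistency, security, and an audit trail -- can be sketched in a few lines. This is a hypothetical illustration in Python, not the vRealize Automation API; the blueprint and log structures are assumptions made for the example:

```python
# Hypothetical self-service provisioning sketch: developers request from
# approved blueprints only, every VM gets an identical config, and every
# request is recorded. Not a real vRealize Automation interface.
import datetime

BLUEPRINTS = {"small-linux": {"cpu": 2, "ram_gb": 4, "network": "dev-segment"}}
AUDIT_LOG = []

def provision(requester, blueprint_name):
    """Provision only from an approved blueprint, recording who asked for what."""
    if blueprint_name not in BLUEPRINTS:
        raise ValueError(f"{blueprint_name!r} is not an approved blueprint")
    spec = dict(BLUEPRINTS[blueprint_name])  # same config every time: consistency
    AUDIT_LOG.append({
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "who": requester,
        "what": blueprint_name,
    })
    return spec

vm = provision("dev-team-a", "small-linux")
```

The point of the sketch is the shape of the workflow: the request succeeds without an operator present (no Saturday on-call for two minutes of work), yet the guardrails and the audit trail are built in.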

      Melvin: Another point there is that, in a fully software-defined infrastructure, while it may not directly translate into organizational changes, it allows you to break down silos. Today, we have operations, system storage, and database teams working together on a common platform that they're all familiar with and they all understand.

      We can all leverage the tools and configurations. That's really powerful. When you don't have the network guys sitting off doing things different from what the server guys are doing, you can focus more on comprehensive solutions, and that extends right into the development space, as Carlos mentioned. The next step is to work just as closely with our developers as we do with our peers and infrastructure.

      Gardner: It sounds as if you're now also in a position to be more fleet. We all have higher expectations as consumers. When I go to a website or use an application, I expect that I'll see the product that I want, that I can order it, that it gets paid for, and then track it. There is a higher expectation from consumers now.

      Is that part of your business payback that you tie into IT? Is there some way that we can define the relationship between that user experience for speed and what you're able to do from a software-defined perspective?

      Preventing 'black ops'

      Pickett: As an internal service provider for Columbia Sportswear, we can do it better, faster, and cheaper on-premise and with our toolsets from our partners at VMware. This helps prevent black ops situations, for example, where someone is going out to another cloud provider outside the parameters and guidelines from IT.

      Today, we're partnering with the business. We're delivering that service. We're doing it at the speed of thought. We're not in a position where we're saying "no," "not yet," or "maybe in a couple of weeks," but "Yes, we can do that for you." So it's a very exciting position to be in that if someone comes to us or if we're reaching out, having conversations about tools, features, or functionality, we're getting a lot of momentum around utilizing those toolsets and then being able to expand our services to the business.

      Tronco: Using those tools also allows us to turn around things faster within our development teams, to iterate faster, or to try and experiment on things without a lot of work on our part. They can try some of it, and if it doesn’t work, they can just tear it down.

      Gardner: So you've gone through this journey and you're going to be plunging in deeper with software-defined networking. You have some early-adopter chops here. You guys have been bold and brave.

      What advice might you offer to some other organizations that are looking at their data-center architecture and strategy, thinking about the benefits of hybrid cloud, software-defined, and maybe trying to figure out in which order to go about it?

      Pickett: I'd recommend that, if you haven’t virtualized your workloads, you get them virtualized. We're in that no-limit situation. There are no longer restrictions or boundaries around virtualizing your mission-critical or tier-one workloads. Get it done, so you can start leveraging the portability and flexibility that comes with it.

      Start looking at the next steps, which will be automation, orchestration, provisioning, service catalogs, and extending that into a hybrid-cloud situation, so that you can focus on what your core offerings and core strategies are going to be. And not necessarily offload, but take advantage of some of the capabilities you can get in VMware vCloud Air, for example, so that you can focus on what’s really core to your business.

      Gardner: Tim, any words of advice from your perspective?

      Melvin: When it comes to solutions in IT, the important thing is to find the value and tie it back to the business. So look for those problems that your business has today, whether it's reducing capital expense through heavy virtualization, whether it's improving security within the data center through NSX and micro-segmentation, or whether it's just providing more flexible infrastructure for your temporary environments like SAN and software development through the cloud.

      Find those opportunities and tie it back to a value that the business understands. It’s important to do something with software-defined data centers. It's not a trend and it's not really even a question anymore. It's where we're going. So get moving down that path in whatever way you need to in order to get started. And find those partners, like VMware, that will support you and build those relationships and just get moving.

      20/20 hindsight

      Gardner: Carlos, advice, thoughts about 20/20 hindsight?

      Tronco: As Suzan said, it's focusing on virtualizing the workloads and then being able to leverage some of those other tools like vRealize Automation. Then you're able to free staff up to pursue activities and add more value to the environment and the business, because you're not doing repeatable things manually. You'll get more consistency now that people have time. They're not down because they're doing all these day two, day three operations and things that wear and grate on you.

      Gardner: I suppose there's nothing like being responsive to your business constituents. That, then, enables them to ask for more help, which adds to your value, and we get into that virtuous cycle, rather than a dead end where people don't even bother to ask for help or bring new and innovative ideas to the business.

      Congratulations. That sounds like a very impactful way to go about IT. We've been learning about how Columbia Sportswear in Portland, Oregon has been adjusting to the software-defined data center strategy and we've heard how that's brought them some business benefits in their fast-paced retail organization worldwide.

      So a big thank you to our guests, Suzan Pickett, Manager of Global Infrastructure Services at Columbia Sportswear; Tim Melvin, Director of Global Technology Infrastructure, and Carlos Tronco, Lead Systems Engineer at Columbia Sportswear. Thanks so much.

      And a big thank you to our audience for joining us for this special discussion series, coming to you directly from the recent 2014 VMworld Conference in San Francisco.

      I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host throughout this series of VMware-sponsored BriefingsDirect IT discussions. Thanks again for listening, and come back next time.

      Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: VMware.

      Transcript of a BriefingsDirect discussion on how a major sportswear company has leveraged virtualization, SDDC and hybrid cloud to reap substantial business benefits. Copyright Interarbor Solutions, LLC, 2005-2015. All rights reserved.


      Tuesday, April 09, 2013

      Agnostic Tool Chain Approach Proves Key to Fixing Broken State of Data and Information Management

      Transcript of a BriefingsDirect podcast on how Dell Software is working with companies to manage internal and external data in all its forms.

      Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Dell Software.

      Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you're listening to BriefingsDirect.

      Today, we present a sponsored podcast discussion on better understanding the biggest challenges businesses need to solve when it comes to data and information management.

      We'll examine how a data dichotomy has changed the face of information management. This dichotomy means that organizations, both large and small, not only need to manage all of their internal data that provides intelligence about their businesses, but they also need to manage the reams of increasingly external big data that enables them to discover new customers and drive new revenue.

      Lastly, our discussion will focus on bringing new levels of automation and precision to the task of solving data complexity by embracing an agnostic, end-to-end tool chain approach to overall data and information management.

      Here now to share his insights on where the information management market has been and where it's going, we're joined by Matt Wolken, Executive Director and General Manager for Information Management at Dell Software. Welcome, Matt. [Disclosure: Dell Software is a sponsor of BriefingsDirect podcasts.]

      Matt Wolken: Dana, thanks for having me. I appreciate it.

      Gardner: From your perspective, what are the biggest challenges that businesses need to solve now when it comes to data and information management? What are the big hurdles that they're facing?

      Wolken: It's an interesting question. When we look at customers today, we're noticing how their environments have significantly changed from maybe 10 or 15 years ago.

      About 10 or 15 years ago, the problem was that data was sitting in individual databases around the company -- either in a database on the backside of an application, such as the customer relationship management (CRM) application or the enterprise resource planning (ERP) application, or in data marts around the company. The challenge was how to bring all this together to create a single, cohesive view of the company.

      That was yesterday's problem, and the answer was technology. The technology was a single, large data warehouse. All of the data was moved to it, and you then queried that larger data warehouse where all of the data was for a complete answer about your company.

      What we're seeing now is that there are many complexities that have been added to that situation over time. We have different vendor silos with different technologies in them. We have different data types, as the technology industry overall has learned to capture new and different types of data -- textual data, semi-structured data, and unstructured data -- all in addition to the already existing relational data. Now, you have this proliferation of other data types and therefore other databases.

      The other thing that we notice is that a lot of data isn't on premise any more. It's not even owned by the company. It's at your software-as-a-service (SaaS) provider for CRM, your SaaS provider for ERP, or your travel or human resources (HR) provider. So data again becomes siloed, not only by vendor and data type, but also by location. This is the complexity of today, as we notice it.

      Cohesive view

      All of this data is spread about, and the challenge becomes how do you understand and otherwise consume that data or create a cohesive view of your company? Then there is still the additional social data in the form of Twitter or Facebook information that you wouldn't have had in prior years. And it's that environment, and the complexity that comes with it, that we really would like to help customers solve.

      Gardner: When it comes to this so-called data dichotomy, is it oversimplified to say it's internal and external, or is there perhaps a better way to categorize these larger sets that organizations need to deal with?

      Wolken: There's been a critical change in the way companies go about using data, and you brought it out a little bit in the intro. There are some people who want to use data for an outcome-based result. This is generally what I would call the line-of-business concern, where the challenge with data is how do I derive more revenue out of the data source that I am looking at?

      What's the business benefit for me examining this data? Is there a new segment I can codify and therefore market to? Is there a campaign that's currently running that is not getting a good response rate, and if so, do I want to switch to another campaign or otherwise improve it midstream to drive more real value in terms of revenue to the company?

      That’s the more modern aspect of it. All of the prior activity inside business intelligence (BI) -- let’s flip those words around and say intelligence about the business -- was really internally focused. How do I get sanctioned data off of approved systems to understand the official company point of view in terms of operations?

      That second goal is not a bad goal. That's still a goal that's needed, and IT is still required to create that sanctioned data, that master data, and the approved, official sources of data. But there is this other piece of data, this other outcome that's being warranted by the line of business, which is, how do I go out and use data to derive a better outcome for my business? That's more operationally revenue-oriented, whereas the internal operations are around cost orientation and operations.

      So where you get executive dashboards for internal consumption off of BI or intelligence for the business, the business units themselves are about visualization, exploration, and understanding and driving new insights.

      It's a change in both focus and direction. It sometimes ends up in a conflict between the groups, but it doesn't really have to be that way. At least, we don't think it does. That's something that we try to help people through: how do you get the sanctioned data you need, but also bring in this third-party data and unstructured data to add nuance to what you're seeing about your company?

      Gardner: Just as 10 or 15 years ago the problem to solve was the silos of data within the organization, is there any way in traditional technology offerings that allows this dichotomy to be joined now, or do we need a different way in which to create insights, using both that internal and external type of information?

      Wolken: There are certainly ways to get to anything. But if you're still amending program after program or technology after technology, you end up with something less than the best path, and there might be new and better ways of doing things.

      Agnostic tool chain

      There are lots of ways to take a data warehouse forward in today's environment, manipulate other forms of data so it can enter a data warehouse or relational data warehouse, and/or go the other way and put everything into an unstructured environment, but there's also another way to approach things, and that’s with an agnostic tool chain.

      Tools have existed in the traditional sense for a long time. Generally, a tool is utilized to hide complexity and all of the issues underneath the tool itself. The tool has intelligence to comprehend all of the challenges below it, but it really abstracts that from the user.

      We think that instead of buying three or four database types -- a relational database, something that can handle text, a solution that handles semi-structured data, or even a high-performance analytical engine for that matter -- what if the tool chain abstracts much of that complexity? This means the tools that you use every day can comprehend any database type, any data-structure type, and any vendor changes or nuances between platforms.

      That's the strategy we’re pursuing at Dell. We’re defining a set of tools -- not the underlying technologies or proliferation of technologies, but the tools themselves -- so that day-to-day operations are insulated from the complexity of those underlying sources: vendor, data type, and location.

      That's how we really came at it -- from a tool-chain perspective, as opposed to deploying additional technologies. We’re looking to enable customers to leverage those technologies for a smoother, more efficient, and more effective operation.

      Gardner: Am I right then in understanding that this is at more of a meta level, above the underlying technologies, but that, in a sense, makes the whole greater than the sum of the parts of those technologies?

      Wolken: That’s a fair way of looking at it. Let's take data integration as an example. I can buy siloed data-integration products: one that goes after cloud resources, one that only handles relational data, and another to extract from or load into Hive or Hadoop. But what if I had one product that could do all of that? Rather than buying separate ones for the separate use cases, what if you just had one?

      Metadata, in one sense, is a descriptor language. The question is whether a tool can only see and describe everything below it, or whether it can actually manipulate it as well. In that sense, it's a real tool that can manipulate and effect change in the environment.
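The agnostic-tool-chain idea Wolken describes -- one interface over relational, cloud, and semi-structured sources -- can be illustrated with a minimal adapter sketch. This is not Dell's product architecture; it is a hedged Python illustration in which the source classes and the `extract` function are hypothetical names chosen for the example:

```python
# Minimal sketch of an agnostic extraction layer: one extract() call that does
# not care whether the source is a relational database or semi-structured
# records. Illustrative only -- not a real vendor tool chain.
import json
import sqlite3

class RelationalSource:
    """Adapter over a SQL database: rows come back as plain dicts."""
    def __init__(self, conn, query):
        self.conn, self.query = conn, query
    def rows(self):
        cur = self.conn.execute(self.query)
        cols = [d[0] for d in cur.description]
        return [dict(zip(cols, r)) for r in cur.fetchall()]

class JsonLinesSource:
    """Adapter over semi-structured JSON Lines records."""
    def __init__(self, lines):
        self.lines = lines
    def rows(self):
        return [json.loads(line) for line in self.lines]

def extract(source):
    """One interface, many source types -- the tool hides the differences."""
    return source.rows()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'Acme')")
relational = RelationalSource(conn, "SELECT id, name FROM customers")
semi_structured = JsonLinesSource(['{"id": 2, "name": "Globex"}'])

all_rows = extract(relational) + extract(semi_structured)
```

The design point matches the transcript: the day-to-day operation (`extract`) stays the same even as vendors, data types, and locations vary underneath it.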

      Gardner: I'd like to go into more of the challenges, but before we do that, what are the stakes here? What do you get if you do this right -- if you can, in fact, manage across various technology types and formats, across relational and unstructured data, and across internal and external data sources and providers?

      Are we talking iterative change or a step change? And are there examples of companies that, by doing this well, have demonstrated something quite unique -- a new level of accomplishment?

      Institutional knowledge

      Wolken: There are a couple of ways we think about it, one of which is institutional knowledge. Previously, if you brought in a new tool into your environment to examine a new database type, you would probably hire a person from the outside, because you needed to find that skill set already in the market in order to make you productive on day one.

      Instead of applying somebody who knows the organization, the data, the functions of the business, you would probably hire the new person from the outside. That's generally retooling your organization.

      Or, if you switch vendors, that causes a shift as well. One primary vendor stack is probably a knowledge and domain of one of your employees, and if you switch to another vendor stack or require another vendor stack in your environment, you're probably going to have to retool yet again and find new resources. So that's one aspect of human knowledge and intelligence about the business.

      There is a value to sharing. It's a lot harder to share across vendor environments and data environments if the tools can't bridge them. In that case, you have to have third-party ways to bridge those gaps between the tools. If you have sharing that occurs natively in the tool, then you don't have to cross that bridge, you don't have the delay, and you don't have the complexity to get there.

      So there is a methodology within the way you run the environment and the way employees collaborate that is also accelerated. We also think that training is something that can benefit from this agnostic approach.

      But also, generally, if you're using the same tools, then challenges like master data management (MDM) become more manageable, because the tool chain understands where that master data is coming from, and so on.

      You also codify how and where resources are shared. So if you have a person who has to provision data for an analyst, and they are using one tool to reach to relational data, another to reach into another type of data, or a third-party tool to reach into properties and SaaS environments, then you have an ineffective process.

      You're reaching across domains and you're not as effective as you would be if you could do that all with one tool chain.

      So those are some of the high-level ideas. That's why we think there's value there. If you go back to what would have existed maybe 10 or 15 years ago, you had one set of staff who used one set of tools to go back against all relational data. It was a construct that worked well then. We just think it needs to be updated to account for the variance within the nuances that have come to the fore as the technology has progressed and brought about new types of technology and databases.

      Gardner: As for business benefits, we hear a lot about businesses being increasingly data driven and information driven, rather than a hunch, intuition, or gut instinct. Also, there's an ability to find new customers in much more cost-effective ways, taking advantage of the social networks, for example. So when you do this well, what are typically some of the business paybacks, and do they outweigh the cost more than previous investments in data would have?

      Investment cycles

      Wolken: It all depends on how you go about it. There are lots of stories about people who go on these long investment cycles into some massive information management strategy change without feeling like they got anything out of it, or at least were productive or paid back the fee.

      There's a different strategy that we think can be more effective for organizations, which is to pursue smaller, bite-size chunks of objective action that you know will deliver some concrete benefit to the company. So rather than doing large schemes, start with smaller projects and pursue them one at a time incrementally -- projects that last a week and then you have 52 projects that you know derive a certain value in a given time period.

      Other things we encourage organizations to do deal directly with how you can use data to increase competitiveness. For starters, can you see nuances in the data? Is there a tool that gives you the capability to see something you couldn't see before? So that's more of an analytical or discovery capability.

      There's also a capability to just manage a given data type. If I can see the data, I can take advantage of it. If I can operate that way, I can take advantage of it.

      Another thing to think about is what I would call a feedback mechanism, or the time or duration of observation to action. In this case, I'll talk about social sentiment for a moment. If you can create systems that can listen to how your brand is being talked about, how your product is being talked about in the environment of social commentary, then the feedback that you're getting can occur in real time, as the comments are being posted.

      Now, you might think you'll get that anyway. I would have gotten a letter from a customer two weeks from now in the postal system that provided me that same feedback. That’s true, but sometimes that two weeks can be a real benefit.

      Imagine a marketing campaign that's currently running in the East, with a companion program in the West that's slightly different. Let's say it's a two-week program. It would be nice if, during the first week, you could be listening to social media and find out that the campaign in the West is not performing as well as the one in the East, and then change your investment thesis around the program -- cancel the one that's not performing well and double down on the one that's performing well.

      There's a feedback mechanism increase that also can then benefit from handling data in a modern way or using more modern resources to get that feedback. When I say modern resources, generally that's pointing towards unstructured data types or textual data types. Again, if you can comprehend and understand those within your overall information management status, you now also have a feedback mechanism that should increase your responsiveness and therefore make your business more competitive as well.
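The East/West campaign example above is essentially a feedback-loop calculation: compare response rates mid-flight and decide where to shift spend before the window closes. Here is a minimal sketch of that decision in Python; the function names, the 1.5x threshold, and all the numbers are hypothetical illustrations, not anything from the interview:

```python
# Hedged illustration of the mid-flight campaign feedback loop described
# above. The threshold and figures are made up for the example.

def response_rate(impressions, responses):
    return responses / impressions

def reallocate(campaigns, threshold=1.5):
    """If one campaign outperforms the other by `threshold`x, double down on it."""
    (name_a, a), (name_b, b) = campaigns.items()
    rate_a, rate_b = response_rate(*a), response_rate(*b)
    if rate_a >= threshold * rate_b:
        return {"double_down": name_a, "cancel": name_b}
    if rate_b >= threshold * rate_a:
        return {"double_down": name_b, "cancel": name_a}
    return {"double_down": None, "cancel": None}  # too close to call; keep both

# Week one: (impressions, responses) gathered from social listening
decision = reallocate({"east": (10_000, 300), "west": (10_000, 120)})
```

The value, as Wolken notes, is the timing: the same comparison done on a letter that arrives two weeks later cannot change the investment while the campaign is still running.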

      Gardner: The whole concept of immediacy of feedback, applied across various aspects of business -- planning, production, marketing, go-to-market, research, and so on -- has been the Holy Grail of business for a long time. It's just been very difficult to do. Now, we seem to be getting closer to the ability to do it at scale and at reasonable cost. So these are very interesting times.

      Now, given that these payoffs could be so substantial, what's preventing people from getting to this Holy Grail? What's between them and the realization?

      It's the complexity

      Wolken: I think it's the complexity of the environment. If you only had relational systems inside your company previously, now you have to go out and understand all of the various systems you can buy, qualify those systems, get peer feedback, run some proofs of concept (POCs), and come in and set all these systems up, and that just takes a little bit of time. So the more complexity you invite into your environment, the more challenges you have to deal with.

      After that, you have to operate and run it every day. That's the part where we think the tool chain can help. But as far as understanding the environment, having someone who can help you walk through the choices and solutions and come up with one that is best suited to your needs, that’s where we think we can come in as a vendor and add lots of value.

      When we go in as a vendor, we look at the customer environment as it was, compare that to what it is today, and work to figure out where the best areas of collaboration can be, where tools can add the most value, and then figure out how and where can we add the most benefit to the user.

      What systems are effective? What systems collaborate well? That's something that we have tried to emulate, at least in the tool space. How do you get to an answer? How do you drive there? Those are the questions we’re focused on helping customers answer.

      For example, if you've never had a data warehouse before, and you are in that stage, then creating your first one is kind of daunting, both from a price perspective, as well as complexity perspective or know-how. The same thing can occur on really any aspect -- textual data, unstructured data, or social sentiment.

      Each one of those can appear daunting if you don't have a skill set, or don't have somebody walking you through that process who has done it before. Otherwise, it's trying to put your hands on every bit of data and consume what you can and learning through that process.

      Those are some of the things that are really challenging, especially if you're a smaller firm that has a limited number of staff and there's this new demand from the line of business, because they want to go off in a different direction and have more understanding that they couldn't get out of existing systems.

      How do you go out and attain that knowledge without duplicating the team, finding new vendor tools, and adding complexity to your environment, maybe even adding additional data sources, and therefore more data-storage requirements. Those are some of the major challenges -- complexity, cost, knowledge, and know-how.

      Gardner: It's interesting that you mentioned mid-market organizations. Some of these infrastructure and data investments were perhaps completely out of their reach until a new way to approach the problems through the tool chain, through cloud, through other services and on-demand offerings.

      What is it now about the new approach to these problems that you think allows the fruits of this to be distributed more down market? Why are mid-market organizations now more able to avail themselves of some of these values and benefits than in the past?

      Mid-market skills

      Wolken: As the products are well-known, there is more trained staff that understands the more common technologies. There are more codified ways of doing things that a business can take advantage of, because there's a large skill set, and most of the employees may already have that skill set as you bring them into the company.

      There are also some advantages just in the way technologies have advanced over the years. Storage used to be very expensive, and then it got a little cheaper. Then solid-state drives (SSD) came along and then that got cheaper as well. There are some price point advantages in the coming years, as well.

      Dell overall has maintained the ethos we started with, when Michael Dell began building PCs in his dorm room from standard components to bring the price down. That model of making technology attainable to larger numbers of people has continued throughout Dell’s history, and we’re continuing it now with our information management software business.

      We’re constantly thinking about how we can reduce cost and complexity for our customers. One example would be what we call Quickstart Data Warehouse. It was designed to democratize the data warehouse -- to bring the price and complexity down to a point where many more people can afford their first data warehouse.

      We worked with our partner Microsoft, as well as Dell’s own engineering team, and then we qualified the box, the hardware, and the systems to work at peak performance. Then we scripted an upfront install mechanism that allows the system to be up and running in 45 minutes with little more than providing a couple of IP addresses. You plug the box in, and it comes up in 45 minutes, without your having to know how to stand up, integrate, and qualify hardware and software together for an outcome we call a data warehouse.

      Another thing we did was include Boomi, which is a connector to automatically go out and connect to the data sources that you have. It's the mechanism by which you bring data into it. And lastly, we included services, in case there were any other questions or problems you had to set it up.

      If you have a limited staff, and if you have to go out and qualify new resources and things you don't understand, and then set them up and then actually run them, that’s a major challenge. We're trying to hit all of the steps, and the associated costs -- time and/or personnel costs -- and remove them as much as we can.

      It's one way vendors like Dell are moving to democratize business intelligence a little further, bringing it to a lower price point than customers are accustomed to and making it more available to firms that either didn’t have the luxury of that expertise sitting around the office, or found the price point a little too high.

Gardner: You mentioned this concept of the tool chain several times. I'd like to hear a bit more about why that approach works, and even more detail about what I understand to be its important elements -- being agnostic to the data type, holistic management, a complete view, and then of course integration.

In addition to the package, it sounds from your earlier comments that you want to be able to approach these daunting issues iteratively, so that you can bite off certain chunks. What is it about the tool chain that delivers comprehensive value, but also allows it to be adopted on a fairly manageable path, rather than all at once?

      Wolken: One of the things we find advantageous about entering the market at this point in time is that we're able to look at history, observe how other people have done things over time, and then invest in the market with the realization that maybe something has changed here and maybe a new approach is needed.

      Different point of view

Whereas the industry has typically gone down the path where each new technology, or advancement of a technology, requires a new tool, a new product, or a new technology solution, we’ve been able to stand back and see the need for a different approach. We just have a different point of view, which is that an agnostic tool chain can enable organizations to do more.

      So when we look at database tools, as an example, we would want a tool that works against all database types, as opposed to one that works against only a single vendor or type of data.
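That agnostic-interface principle is roughly what Python's DB-API 2.0 standard provides at the driver level. A minimal sketch of the idea -- this is an illustration of the principle, not any Dell product; the table and function names are hypothetical:

```python
import sqlite3

def row_count(connect, table):
    """Count rows in a table through the generic DB-API 2.0 interface.

    `connect` can be any zero-argument function returning a DB-API
    connection -- sqlite3.connect, psycopg2.connect, etc. -- so the
    same code runs unchanged against different database vendors.
    """
    conn = connect()
    try:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        conn.close()

# Demonstrate against an in-memory SQLite database.
def make_demo():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (x INTEGER)")
    conn.executemany("INSERT INTO t VALUES (?)", [(1,), (2,), (3,)])
    return conn

print(row_count(make_demo, "t"))
```

The point is that `row_count` never names a vendor; swapping the `connect` callable is all it takes to point the same tool at a different database type.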

The other thing we look at is that if you walk into an average company today, there are already a lot of things lying around the business. A lot of investment has already been made.

      We wanted to be able to snap in and work with all of the existing tools. So, each of the tools that we’ve acquired, or have created inside the company, were made to step into an existing environment, recognize that there were other products already in the environment, and recognize that they probably came from a different vendor or work on a different data type.

      That’s core to our strategy. We recognize that people were already facing complexity before we even came into the picture, so we’re focused on figuring out how we snap into what they already have in place, as opposed to a rip-and-replace strategy or a platform strategy that requires all of the components to be replaced or removed in order for the new platform to take its place.

      What that means is tools should be agnostic, and they should be able to snap into an environment and work with other tools. Each one of the products in the tool chain we’ve assembled was designed from that point of view.

      But beyond that, we’ve also assembled a tool chain in which the entirety of the chain delivers value as a whole. We think that every point where you have agnosticism or every point where you have a tool that can abstract that lower amount of complexity, you have savings.

      You have a benefit, whether it’s cost savings, employee productivity, or efficiency, or the ability to keep sanctioned data and a set of tools and systems that comprehend it. The idea being that the entirety of the tool chain provides you with advantages above and beyond what the individual components bring.

Now, we're perfectly happy to help a customer at any point where they have difficulty and any point where our tools can help them, whether it's at the hardware layer, from the traditional Dell way, at the application layer, considering a data warehouse or otherwise, or at the tool layer. But we feel that as more and more of the portfolio -- the tool chain -- is consumed, more and more efficiency is enabled.

Gardner: It sounds as if, rather than looking at the ecosystem that’s already in place in an organization as a detriment, you're trying to make it into an asset, and even looking to bring new products into that mix. So I guess partnering becomes important.

      Already-made investment

Wolken: Everything already in the company is an already-made investment. If the premise from the get-go is to rip and replace, then you're really throwing away the institutional knowledge, the training of the staff, and the investment in the product, not to mention the integration work. That's not something we wanted to start with. We wanted to recognize and leverage what was there and provide value to that existing environment.

One of the core values we looked at from a design point of view is how you fit into an environment and add value to it, not how you cause replacement or destruction of an existing environment in order to provide benefit.

      Gardner: We have been talking about the tool chain in terms of its value for analytics and intelligence about the business and bringing in more types of data and information from external sources.

It also sounds to me as if this sets you up for lifecycle benefits -- not just business benefits, but also IT benefits, for things like better backup and recovery, a better disaster-recovery strategy, perhaps more storage efficiency. Is there an intramural benefit on the IT side to doing this in the fashion you've been describing as well?

Wolken: We looked at the strategy and said: if you manage this as a data lifecycle -- and that's really how we think about it -- then where does data first show up in a company? Most likely inside a database on the back side of an application.

      And where is it last used inside of a company? That would generally be just before retirement or long-term retention of the data. Then the question becomes how do you manipulate and otherwise utilize the data for the maximum benefit in the middle?

      When we looked at that, one of the problems that you uncover is that there's a lot of data being replicated in a lot of places. One of the advantages that we've put together in the tool chain was to use virtualization as a capability, because you know where data came from and you know that it was sanctioned data. There's no reason to replicate that to disk in another location in the company, if you can just reach into that data source and pull that forward for a data analyst to utilize.

      You can virtually represent that data to the user, without creating a new repository for that person. So you're saving on storage and replication costs. So if you’re looking for where is there efficiency in the lifecycle of data and how can you can cut some of those costs, that’s something that jumps right out.

      Doing that, you also solve the problem of how to make sure that the data that was provisioned was sanctioned. By doing all of these things, by creating a virtual view, then providing that view back to the analyst, you're really solving multiple pieces of the puzzle at the same time. It really enables you to look at it from an information-management point of view.
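At small scale, the virtualization idea described here -- presenting source data to an analyst without copying it into a new repository -- is what a plain SQL view does. Data-virtualization products do this across systems and vendors, but the mechanics are analogous; a minimal sqlite3 sketch with hypothetical table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# The "sanctioned" source table, as it might sit behind an application.
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "east", 100.0), (2, "west", 250.0), (3, "east", 75.0)])

# A view gives the analyst a virtual representation of the data:
# no second copy is written, and queries always reflect the source.
conn.execute("""CREATE VIEW east_sales AS
                SELECT id, amount FROM orders WHERE region = 'east'""")

rows = conn.execute("SELECT COUNT(*), SUM(amount) FROM east_sales").fetchone()
print(rows)  # (2, 175.0)
```

Because `east_sales` is only a stored query, there is no replicated data to keep in sync, and the analyst still sees exactly the sanctioned rows.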

      Gardner: That's interesting, because you can not only get better business outcome benefits and analytics benefits, but you can simplify and reduce your total cost of ownership from the IT perspective. That's kind of another Holy Grail out there, to be able to do more with less.

      One of the advantages

Wolken: That's what we think one of the advantages can be. And certainly, when you have the advantage of standing on the shoulders of the people who came before you and looking at how the environment has changed, you can notice some of these changes and bring them forward. That's what we want to do with IT, as partners, with the solution we bring forward.

Gardner: How should enterprises and mid-market firms get started? Are there some proven starting points, methods, or cultural considerations when one wants to move from those traditional siloed platforms toward this integrated, comprehensive tool-chain approach?

      Wolken: There are different ways you can think about it. Generally, most companies aren’t just out there asking how they can get a new tool chain. That's not really the strategy most people are thinking about. What they are asking is how do I get to the next stage of being an intelligent company? How do I improve my maturity in business intelligence? How would I get from Excel spreadsheets without a data warehouse to a data warehouse and centralized intelligence or sanctioned data?

Each one of these challenges comes from a point of view of: how do I improve my environment based on the goals and needs I'm facing? How do I grow up as a company and become more of a data-based company?

Somebody else might be faced with more specific challenges, such as a line of business now asking for Twitter data when we have no systems that comprehend it. That's really the point where you ask: what's going to be my strategy as I grow and otherwise improve my business intelligence environment, which is morphing every year for most customers?

      That's the way that most people would start, with an existing problem and an objective or a goal inside the company. Generically, over time, the approach to answering it has been you buy a new technology from a new vendor who has a new silo, and you create a new data mart or data warehouse. But this is perpetuating the idea that technology will solve the problem. You end up with more technologies, more vendor tools, more staff, and more replicated data. We think this approach has become dated and inefficient.

      But if, as an organization, you can comprehend that maybe there is some complexity that can be removed, while you're making an investment, then you free yourself to start thinking about how you can build a new architecture along the way. It's about incremental improvement as well as tangible improvement for each and every step of the information management process.

      So rather than asking somebody to re-architect and rip and replace their tool chain or the way they manage the information lifecycle, I would say you sort of lean into it in a way.

      If you're really after a performance metric and you feel like there is a performance issue in an environment, at Dell we have a number of resources that actually benchmark and understand the performance and where bottlenecks are in systems.

      So we can look at either application performance management issues, where we understand the application layer, or we have a very deep and qualified set of systems around databases and data warehouse performance to understand where bottlenecks are either in SQL language or elsewhere. There are a number of tools that we have to help identify where a bottleneck or issue might be from just a pure performance perspective as well.
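Tool specifics aside, the low-level mechanic behind that kind of database diagnosis is the query plan. A generic sqlite3 illustration -- not any particular Dell tool, and the table is hypothetical -- of spotting a full-table scan and removing it with an index:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, user TEXT, ts INTEGER)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)",
                 [(i, f"u{i % 10}", i) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports how the engine will execute a statement;
    # the fourth column of each row is the human-readable detail.
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT * FROM events WHERE user = 'u3'"
print(plan(query))   # shows a full scan of events -- a potential bottleneck

conn.execute("CREATE INDEX idx_user ON events(user)")
print(plan(query))   # now an index search instead of a scan
```

Commercial performance tools automate this across many statements and database types, but the before-and-after plan comparison is the same basic evidence they surface.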

      Strategic position

      Gardner: That might be a really good place to start -- just to learn where your performance issues are and then stake out your strategic position based on a payback for improving on your current infrastructure, but then setting the stage for new capabilities altogether.

      Wolken: Sometimes there’s an issue occurring inside the database environment. Sometimes it's at the integration layer, because integration isn’t happening as well as you think. Sometimes it's at the data warehouse layer, because of the way the data model was set up. Whatever the case, we think there is value in understanding the earlier parts of the chain, because if they’re not performing well, the latter parts of the chain can’t perform either.

And so at each step, we've looked at how you ensure the performance of the data. How do you ensure the performance of the integration environment? How do you ensure the performance of the data warehouse as well? We think that if each component of the tool chain is working as well as it should be, that's when you enable the entirety of your solution implementation to truly deliver value.

Gardner: Great. I'm afraid we'll have to leave it there. We're about out of time. You've been listening to a sponsored BriefingsDirect podcast discussion on better understanding the challenges businesses need to solve when it comes to improved data and information management.

And we have seen how organizations not only need to manage all of the internal data that provides intelligence about the business, but also, increasingly, the reams of external data that enable them to take on whole new business activities, like discovering additional customers and driving new and additional revenue.

And we've learned more about how new levels of automation and precision can be applied to the task of solving data complexity through a tool chain that is agnostic and capable.

      I want to thank our guest. We have been here with Matt Wolken, Executive Director and General Manager for Information Management Software at Dell Software. Thanks so much, Matt.

      Wolken: Thank you so much as well.

      Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions. Thanks again to our audience for joining us, and do come back next time.

      Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Dell Software.

      Transcript of a BriefingsDirect podcast on how Dell Software is working with companies to manage internal and external data in all its forms. Copyright Interarbor Solutions, LLC, 2005-2013. All rights reserved.
