Thursday, October 27, 2011

Dumb, Dumber and Dumberer: Or how to make sure your data migration problems really hurt.

A little while ago I blogged / ranted about the importance of keeping the data migration activity close to the main project and not allowing it to drift away into its own little world. My example was a company that services me and which had recently been through a data migration. Its choices about customer communications had apparently been made without regard for data migration issues, with the result that data quality problems which might otherwise have gone unnoticed were made very obvious to the customer. If you like you can read the original piece here: Dumb and Dumber, or how to make sure your data migration problems get noticed.


Fast forward three weeks and I received my next quarterly bill from the company. No folks, that wasn't a typo - the bill for the next quarter arrived only three weeks after last quarter's bill. A quick glance at the bill confirmed that the data migration problems hadn't been resolved. That's OK, it had only been three weeks after all! But, amazingly, I also got the same extra communications and marketing collateral I'd received last time - the very material that had inadvertently drawn attention to the data migration and data quality problems. I can understand how a lack of a holistic approach to the new billing system rollout could have let this happen the first time, but surely the flood of calls their call centre must have fielded after the last bill cycle should have alerted someone that something was amiss. Do these people need a sledgehammer to drive the point home? Or, more likely, is it symptomatic of how badly isolated the data migration activities are - a problem for IT to own and solve? (Watch this space for a blog post on that issue as well!)

So, what's to be learned here? The key [new] takeaway for me is that data migration activities don't end at the conclusion of the go-live weekend. Having the data loaded into the new system is only one milestone on the journey. Someone needs to continually monitor the quality of the data and assess whether it continues to be fit for purpose in its new home. This should include monitoring indicators from around the organisation that might provide clues that there are data issues afoot. And, in today's world where social media abounds, this monitoring really should extend outside the organisational boundaries as well. Chances are the marketing department already has some sort of social media monitoring in place, so connect with these folks and work out from there.
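That monitoring can be partly automated. Below is a minimal sketch of the kind of post-migration data quality checks I have in mind, using an in-memory SQLite database as a stand-in for the new billing system. All table names, columns and rules here are invented for illustration, not drawn from any real system.

```python
import sqlite3

# Build a tiny in-memory stand-in for a migrated billing store.
# Tables, columns and sample data are all illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE bills (account_id TEXT, period TEXT, amount REAL);
    CREATE TABLE services (account_id TEXT, status TEXT);
    INSERT INTO bills VALUES
        ('A1', '2011-Q2', 120.50), ('A1', '2011-Q3', 130.00),
        ('A2', '2011-Q2', 0.0),    ('A2', '2011-Q3', 0.0),
        ('A3', '2011-Q3', 45.00);
    INSERT INTO services VALUES
        ('A1', 'active'), ('A2', 'active'), ('A3', 'terminated');
""")

def post_migration_checks(conn):
    """Return a dict of rule name -> list of offending account ids."""
    issues = {}
    # Rule 1: active accounts whose entire billing history migrated as zeros.
    rows = conn.execute("""
        SELECT b.account_id FROM bills b JOIN services s USING (account_id)
        WHERE s.status = 'active'
        GROUP BY b.account_id HAVING MAX(b.amount) = 0
    """).fetchall()
    issues['zero_history'] = [r[0] for r in rows]
    # Rule 2: current-period charges against terminated services.
    rows = conn.execute("""
        SELECT DISTINCT b.account_id FROM bills b JOIN services s USING (account_id)
        WHERE s.status = 'terminated' AND b.period = '2011-Q3' AND b.amount > 0
    """).fetchall()
    issues['billed_after_termination'] = [r[0] for r in rows]
    return issues

print(post_migration_checks(conn))
```

Scheduled to run after each billing cycle, any non-empty result list becomes an alert for the relevant data owner - a far cheaper early warning than a flood of call centre complaints.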

As a footnote (and perhaps not connected to the data migration) the company had also failed to reconcile my payment of the last bill against my account, and so included a demand to pay this prior amount immediately. They were, however, nice enough to point out that if I was having trouble paying my bill they have options to help. I'm contemplating whether I should send them a reply along the lines of: "Thanks for your offer. All good here, but are you experiencing difficulty migrating your data? Please contact us for a confidential discussion on how we may help you. Information architects have Data Migration Policies and can arrange suitable remediation plans for immature organisations."

But it's not all bad; at least they provided an example of how brands and reputations can be damaged as a result of poor data migration (and related) decisions. That's material for another blog post. You can read it here: Selling the Data Migration Strategy.

Selling the Data Migration Strategy.

There are two types of data specialists in this world: those who have been through a failed, or severely challenged, data migration project and those who will go through one in their future. Stick around in our field long enough and you'll join the former group, and it may not even be your fault. Even with the best intentions in the world, the foresight to see problems well ahead of time and a plan mapping out a path which has business users, data stewards and process owners heavily involved and engaged, you may still fail!

Why is that?

In reality there are several reasons but I'll drill in on just one here: the lack of senior management buy-in and top cover for your data migration strategy. And I'm not talking about a data migration strategy just at the project level, but rather one set at the enterprise level. You do have one of those, don't you?

If you don't, do yourself a favour and start putting one together. Without such a strategy you'll likely find yourself fighting the same battles over and over, having to justify your choices in the face of tight project budgets and deadlines which were set without regard for your tacit data migration principles, your years of hard-won experience or contemporary data migration best practices. You can't [solely] blame the project manager who put the plan together either; after all, what guiding documentation did he or she have to consult when building it? Good PMs will look to be guided by existing strategy and principles documents if they exist, and excellent PMs will seek out the people within a company with migration experience and specialist skills to see what they can leverage. I'm lucky enough to be working with one such PM now and have worked with a select few others in my time - you guys know who you are! But unfortunately the vast majority won't take such steps, preferring instead to depend upon their own past experiences or to task business analysts with coming up with a number - a hard task without specialist skills, experience and guidance. I've worked with far more than a few of these in my time - you guys probably don't realise who you are!

So, you need an overarching data migration strategy document to influence and guide behaviour when it matters - both early in the project planning and design stages and later on as the pressure mounts close to the go-live date. But that's not enough. You also need to sell it to senior management; you'll need their support when the pressure hits. That pressure will come, there's no doubt about it. It may strike at different stages depending upon the type of organisation you're working with. If you're faced with the task of introducing the concepts that data migrations are business issues, not technology issues, that business leaders cannot be passive in the migration and that ownership cannot be divorced from accountability, then chances are you'll strike push-back from business managers who don't want to lose key people to the project, don't have budget for business-as-usual back-fill or feel that the approach is just IT trying to abdicate responsibility. Alternatively you may be faced with aggressive timeframes or budgets which need to be challenged to give the migration at least some chance of success. And, at some point, a go-live deadline will bring pressure for data migration processes to be sidelined in a frantic rush to just get the data in any way you can.

When these things happen you'll need someone with real authority in your corner. So, how do you sell your overarching data migration strategy to the higher levels of the org chart in such a way that you don't just get moral support, but true buy-in? Chances are it will vary from organisation to organisation and even from person to person, but here are three ideas:


  • Link the failure of data migration to delayed or foregone project benefits realisation. There's plenty of research out there that can show this linkage and whilst you won't know any specific numbers until the next major project business case appears you can at least plant the seed. Bonus points if you can look back in your company's history and quantify the amount of unrealised benefits for a prior project due to data migration issues. 
  • If your company has a culture of, or current focus on, safety then draw out how failed migrations could feasibly lead to a major safety incident or death.
  • Show how data migration issues damage reputation and brand. There are plenty of cases where this has occurred. I can count two in my city in the past three months alone.
Ideally back up your case with real-world examples where a lack of a data migration strategy, or a strategy which differs from yours in key elements, contributed to major data migration failures and to one or more of the outcomes above. You'll find these in the press, case studies and academic papers, but perhaps the best source of such information will be Auditor General's reports into failed or problematic government projects, as these are both detailed and publicly available. Also gather some evidence of successful projects which were guided by a strategy well aligned with the one you've developed. These can be harder to find. There are academic papers and industry case studies, or you can leverage your network to gather these stories (and hard evidence to support them). Of course, if your strategy is well researched then chances are you already amassed this material as you built it.

Your data migration strategy should be a living document. Take the opportunity to learn from each data migration project, consider those learnings and adjust the strategy as and when there is benefit in doing so. Also keep your eye on the wider field. You may only do two or three data migrations every five years inside your company, but there will be many more going on in the broader community. Take the opportunity to learn from those where you can. To that end, I'm always keen to hear what others are finding has worked (or not worked) for them. Drop me a line with your pearls of wisdom or to share your data migration battle scars - I'm keen to evolve my strategy too!



Friday, October 21, 2011

Findable, trustable and believable. How Facebook Slipped Up

If I had to sum up the core of what's important for me in the unstructured data area right now I'd use three words: findability, believability, and trustability. Given that my word processing software has just drawn squiggly red lines under each of them, I'm not even sure they are official words. But they should be - without these three things users won't be totally comfortable using whatever piece of software, system or web portal we give them to work with, and will probably disengage from it, at least in some fashion.

Don't believe me? Even the big players miss the mark sometimes. Look at Facebook as an example. I suspect that the recent rumblings and user backlash against the new interface weren't just because the look and feel changed. The experience changed as well. People didn't know where to look to find things, or even whether certain things were still possible. I wasn't bothered by the change to the look and feel, but I was still thrown by the change. Little changes make the difference. I no longer saw links I posted from external sites showing up in my general news feed. Did that mean that the post had failed? Could other people see them, and my own posts were just filtered out of what I saw? Or had I done something wrong - selected the wrong privacy settings, perhaps? My level of trust had dropped, and dropped quickly. So much so that I took to using other means to share links in some areas, resorting to emailing links around or using specialist sites like Garmin Connect and Strava to publish my running and cycling information. Changes to Facebook's iPhone interface also left me wondering where to look to find certain posts. I now had a taxonomy of sorts which I could use to refine and filter what displayed on screen. I could choose to view all stories, or just see photos, status updates, links, pages, posts from close friends, posts from people in my area or posts from people associated with the university I attended, etc. Refiners! Great, less noise to wade through; give me just what I want to see. That's a good thing, right? I thought so too, but again I didn't always see what I thought I should see, leading to a lingering doubt. Again a trust issue - was I seeing all of the data available, or did I have to look somewhere else or do something else to bring it up? Findability had suffered as well (or at least I perceived it had).

Translate this into the corporate world and it might mean that if users lose trust in how their unstructured data is being stored or made available to them, they'll stop using a document management or content management system and resort to emailing documents around the organization and/or putting content on file shares or in other information silos. Even in a well-used and hitherto trusted system, small changes can quickly result in users working around the system, taking us back toward the unstructured data anarchy that many of us are slowly working to bring order to. Slip behind on that challenge and suddenly your information is fragmenting and findability suffers as well.

What lessons are there to be taken from this? Firstly, small changes matter. Something seemingly insignificant can send out ripples which set your information management goals back months. Secondly, user interface changes can and will impact information management, information integrity and quality. Don't make design choices without considering the impact on the wider issues and your information management goals. Choices here may not seem important but can at times be just as critical as deeper information architecture questions or taxonomy design. And finally, communication is key. Make it clear what changes mean for how information can be found and what information is and isn't available, and perhaps problems like these, and the negative perceptions that come with them, can be avoided.

Have you lived through similar issues? My medium term goals for information inside our organisation rely heavily on getting findability, believability and trustability right so I'd love to hear about what you faced and how you tackled it.

Thursday, October 13, 2011

Rethinking Your EIM Program

I've not looked at a Gartner Hype Cycle in a while, but I'm pretty sure if I did I'd find Enterprise Information Management (EIM) programs heading into the Trough of Disillusionment. From what I'm hearing these programs are doing it tough, and it's not surprising. EIM aims and value can be hard to articulate; I know I've struggled at times with ours. If you have an EIM program, what's your elevator pitch? Phrases like improved decision making, better data management and leveraging the data asset abound, but chances are that many decision makers have come to see those as little more than motherhood statements, particularly if there's been an EIM program in place for a while with few real benefits yet realised. Although there may have been initial agreement around the problems to be solved and excitement around the promise of the program, your sponsors may no longer really be signed up for the journey and are probably now struggling to see the way there.


Perhaps it's time to rethink the strategy we take toward reaching our EIM goals. Rather than one big program, maybe the use of strategic themes might work better. Use themes to chunk your nebulous program down into smaller, quicker projects which deliver toward and enable a common theme, and focus the efforts and imaginations of people around a few key areas and projects at a time. It's easier for people to see value along the way, yet (if you keep your eye on the longer term prize and larger goals) it's still possible to get to the bigger picture by drawing synergies from projects along the way. Some of the themes I'm promoting right now include making information findable and believable, and the use of templates to tie information and process together and allow nimble reuse when new business initiatives arise.


Using strategic themes also means it's much easier for people to "get it", and you, as the architect of the broader EIM initiative, will have better control over how they perceive what's important to do next. Equally importantly, you'll have a better chance that they won't "get the wrong end of the stick", become focused on the wrong things and put their energy into work that delivers little to advance your true EIM aims. Another plus is that people are more likely to want to get involved (and hence you'll get better people) if they can see what is going to be delivered in the smaller projects within the theme.


If your team structure allows it, look to embed account managers into the various business areas to get early signals and information about what needs are out there, and weave these into the strategic theme sets where appropriate. Doing so will get people on the train. Let the momentum build and then use it to drive through those pieces of work which are less tangible - whether they be advancing data stewardship, building out the enterprise data model, or something else.


I believe this chunking and strategic themes approach will become more important over the coming 12-24 months, particularly if your company places more stock in initiatives with strong business cases, has an over-representation of business improvement projects among the successfully funded initiatives and pushes those with less obvious or quantifiable benefits to the back of the queue. This approach will also give you the flexibility to be nimble and stay aligned with business strategies and priorities as they change, rather than locking you into a full year's budget cycle many months before the first work even begins.


With this approach there are bound to be trade-offs but, hey, at least you're moving forward and making (some) progress toward your EIM goals. And that's got to be better than the program being killed entirely.

Wednesday, October 12, 2011

Where’s the BI Value Proposition?


I can remember a time when the pitch for BI could (almost) be made along the lines of reducing headcount – “put in this reporting solution and you’ll no longer need the team of people who currently spend all of their time compiling numbers in spreadsheets for others in the business to use”. Even when BI cost a mint this scenario still offered an organization the chance to come out in front financially. These days things aren’t so simple. Despite decreasing costs for BI the sales pitch is probably harder today than it was a decade ago. And I’m not just talking about vendors and consulting companies selling BI products and services into their prospects. I’m also talking about selling the value of BI inside companies – making the case for a BI initiative to internal funding boards is getting tougher and tougher with shrinking budgets and competition from other projects and programs.

To have the best chance of success the value proposition needs to be clear.  Often now BI projects will be seen as providing productivity support. They won’t save headcount but they may free up some percentage of time for resources throughout the company or help those within the organization make faster and / or better decisions. Proposing cost savings based upon these time savings is a dangerous endeavor. Such savings may not be viewed as real given fixed wage and salary costs or, perhaps worse still, the department may simply have next year’s budget cut by that amount to ensure the proposed benefit is locked in for realization.

Coming up with the appropriate value proposition may need you to stand back and honestly examine what’s within the scope of the proposed BI program. Is it just static reporting or are you largely replacing existing reports like for like due to a burning platform? If so then perhaps it’s time to swallow your pride and pitch the program like an infrastructure project. Chances are your client or your internal funding board looks at the criteria for assessing these types of projects differently than others, recognizing that they are unlikely to deliver strategic value, cost savings or drive top line growth, but rather just have to be done to sustain the business as usual state – to keep the lights on, if you will.

If your proposed initiative is broader and can be aligned to more tangible business objectives then you have a number of things you should be mindful of when building the BI value proposition:

·      Value is a subjective thing. Look at the perceived value of the initiative at different levels of the organization. Do they all see the value, or does your push to realize enterprise value actually make things harder for one or more departments? If you can find win-win scenarios or at least avoid win-lose scenarios then it will be easier to get agreement on the value.

·      Can you show how the value can be measured over time (and is it actually going to be possible to measure this value)? Financials are just one aspect of this. Speed of decision making, greater visibility and transparency, and reduced risk might be things to consider. Whatever you decide to include, make sure that you can measure and articulate the current state. Not only will it give you a benchmark to measure against, but you'll also have concrete examples of the issues you are trying to solve, rather than pitching the value of the initiative around academic or theoretical concerns or problems that have manifested in other industries.

·      Is all of the value expected to be realized in a short to medium timeframe? Chances are this isn't the case, especially if the project involves analytics or data mining. There's likely to be a perceived drop in value while the design and implementation stages are underway and the business resources are having to participate in workshops, field questions and test while holding down their day jobs. The perceived value is likely to spike once the initiative goes live and the low-hanging fruit are harvested, but chances are it will again dip as the business gets more complex and costly due to changed and changing processes, new ways of working and information overload. The real value pay-off may well not come until this stage passes, and this state of anxiety and uncertainty can be resolved by working with the business to apply the new learnings and new information to further enhance the analytics and any new or changed processes, again simplifying the way people work. Expect this, plan for it and build it into your value proposition. If this pushes the benefits realization out too far to the right, then look for ways of chunking the program down into smaller tactical projects all building toward the strategic end game. Show the early value and paint the alignment with the strategic aims and the later value that will be realized.

And of course if you’re coming from the IT side of the fence don’t forget to work with the business through all of this. Not only are you likely to have covered off more stakeholders prior to the pitch, but chances are you’ll reap a higher value from the finished product as well.

Oh, and if you’re from the business side, bring the IT guys in on things too!! Involving them early will lead to higher longer term value.


Monday, October 10, 2011

Dumb and Dumber. Or how to make sure your data migration problems get noticed.

Recently I received a bill from a company which went to great lengths to make sure I knew they had just replaced their billing system. Not only did their bill now look different (with fonts and colours that probably helped pay this year's private school fees for the children of some design studio director) but they also took the time to include their quarterly magazine (complete with an article on the new system) and a double sided full page insert explaining the layout of the new bill.

All good so far. Help the customers through the change, and all that! Glancing through the insert I noticed a loudly called-out section which showed the amounts of prior bills and trends - last bill, this time last year, etc. - and explained how this could be used to help customers monitor and control costs. Cool! Being a data guy I went straight to this section on my bill... only to find nothing but zeros! Discussions with other customers in passing over the next few days told me that several lacked any historical data, some had historical data (although often for services they didn't own) and some lucky souls even had current-period data for services that had long ago been terminated.

Now those of us who have been through a data migration or two know that they won't be without challenges, that mistakes will occur and that some data issues may manifest. Calling these out is a good thing. Customers and users affected by a problem will likely be a lot more accommodating if they know about it in advance and are kept in the loop about progress toward rectification. But that's not what happened here. Rather, a feature which could have been a selling point may well have turned into a negative as it alerted customers to potential problems with their bills. Even if the data issues were nothing more than missing historical data, a potential good news story was wasted, and I bet the company's call centre spent a lot of time fielding enquiries from concerned customers.

So how did this one slip through? Were the data problems a surprise to the company, a black swan that crashed into them after the system go-live? In that case the testing process failed dismally. Or were the data problems known but the system put into production anyway, due to timelines which couldn't, or weren't allowed to, slip to the right? If this latter scenario was the case, why was the extra promotional material sent out too? Surely a short statement acknowledging the problems and the anticipated fix times would have been more appropriate. Better still, why not hold off on the release of the new paper bill layout until the issues had been sorted? That way no-one would have spotted a problem and few people would have been materially impacted.

What lessons can we take from this? The obvious ones we've seen before: data migrations take time and they will strike unexpected problems - allow both time and money to deal with them when they occur. Plan data migration testing early, expect it to take numerous iterations and allow time for these. If there is no time during the go-live activity for comprehensive data migration testing, then be sure to run a full dress rehearsal and test its outputs thoroughly. Perhaps not so obvious is a very important lesson: do not allow the data migration to become divorced from the rest of the project. Consider the impacts of data migration events and, when and if needed, take steps in the wider project to ensure that migration issues don't compound and lead to embarrassment, brand damage or unnecessary costs.
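To make the dress rehearsal point concrete, here's a minimal sketch of a reconciliation test that compares record counts and control totals per account between a legacy extract and the target system load. The data, names and structure are invented for illustration; a real migration would reconcile far more attributes and far larger volumes.

```python
from collections import Counter

# Illustrative extracts for a dress rehearsal: each row is (account_id, amount).
# The target deliberately contains two seeded defects for the example.
legacy = [('A1', 120.50), ('A1', 130.00), ('A2', 80.00), ('A3', 45.00)]
target = [('A1', 120.50), ('A1', 130.00), ('A2', 0.00)]  # A2 amount lost, A3 dropped

def reconcile(source_rows, target_rows):
    """Return per-account (count, total) mismatches between two extracts."""
    def summarise(rows):
        counts, totals = Counter(), Counter()
        for account, amount in rows:
            counts[account] += 1
            totals[account] += amount
        return counts, totals

    s_counts, s_totals = summarise(source_rows)
    t_counts, t_totals = summarise(target_rows)
    mismatches = {}
    for account in set(s_counts) | set(t_counts):
        if s_counts[account] != t_counts[account] or s_totals[account] != t_totals[account]:
            mismatches[account] = {
                'count': (s_counts[account], t_counts[account]),
                'total': (s_totals[account], t_totals[account]),
            }
    return mismatches

print(reconcile(legacy, target))
```

Run against each rehearsal load, an empty result is the goal; anything else is a defect to chase down before go-live rather than after the bills hit letterboxes.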

Monday, October 3, 2011

Single Source of Truth, Nirvana and a Good Run Spoilt


This post came out of a presentation I recently gave to a group of colleagues. I’d promised to put something down on paper to draw out the key points for that group so thought I’d take the opportunity to share with the wider community at the same time. 

When presented with the question "is a single source of truth a good thing?" how would you answer? For most people, particularly those of us with business intelligence backgrounds, the intuitive answer is likely to be "yes, of course". We'd answer in quick-fire fashion because it's just one of those things that we know to be true. In a similar fashion we'd quickly voice our agreement with the statement "there has been almost no good music released since 1989". Wouldn't we?

Unexpectedly, I recently had my morning run violently collide with a single source of truth issue. Before I left for the run I loaded up my iPod shuffle with a playlist drawn from the Classic Rock genre within my iTunes library. Half an hour into my run all was going well, the sun was shining and I was happily singing (puffing) along to the sounds of The Police, Queen, Peter Gabriel, Iggy Pop and Nirvana. Hang on, Nirvana? In what universe is Nirvana classified as Classic Rock? So thrown was I by this heinous data quality issue that I had to physically stop running to jump to the next track. Talk about poor data quality having a huge impact on productivity!!

But what could have gone wrong? Surely iTunes is the best single source of truth for information about the music in its own library. But the problem is just that - it's a single source of truth, yet it's expected to serve a huge, diverse audience. How can we expect this one source and one version of the truth to suit us all? While I'd put Nirvana into the Grunge genre, I'm sure many others would agree with the iTunes view and call it Classic Rock, while some of the current crop of teenagers would probably relegate it to the Golden Oldies genre.

For me, one potential answer to this problem is to draw different information from different sources, taking what's the best fit for a given use and perspective from whatever the most appropriate source is, and bringing it together into a composite view - a 360-degree view of an item. A key point to note here is that some of these sources of truth may not be the system of record; there may actually be more appropriate data hidden on a PC under a desk somewhere. The critical data should, and perhaps even must, come from the system of record, but don't rule out other data. In my iTunes example I'd expect items like the track names, the album name, the artist name and the year of release to all come from whatever central source feeds iTunes, as these are all hard and fast and should be indisputable. But as for the rest, let them be drawn from elsewhere. Let people take genres and ratings from sources which match their perspective and have them surfaced together on screen with the core information. If one of those sources is unofficial then there's an opportunity to bring it into the tent so that there are some controls and governance around it, and this should be explored.
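As a sketch of what such a composite view might look like, here's a toy Python example: the hard facts come from the system of record, while the subjective genre is overlaid from a personal source where one exists. All names and data are invented for illustration - this isn't how iTunes actually works.

```python
# System of record holds the indisputable core facts plus a default genre.
system_of_record = {
    'Smells Like Teen Spirit': {'artist': 'Nirvana', 'year': 1991, 'genre': 'Classic Rock'},
    'Roxanne': {'artist': 'The Police', 'year': 1978, 'genre': 'Classic Rock'},
}

# A personal (unofficial) source holding subjective genre classifications.
my_genre_overrides = {'Smells Like Teen Spirit': 'Grunge'}

def composite_view(track):
    """Build a 360-degree view: core facts plus perspective-matched overlays."""
    record = dict(system_of_record[track])      # indisputable core facts
    if track in my_genre_overrides:             # subjective overlay wins where present
        record['genre'] = my_genre_overrides[track]
        record['genre_source'] = 'personal'
    else:
        record['genre_source'] = 'system of record'
    return record

print(composite_view('Smells Like Teen Spirit'))
```

Tracking where each attribute came from (the `genre_source` field here) is what lets an unofficial source be brought "into the tent" later with some governance around it.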

Now, for all I know this iTunes data mash-up may already be possible with some combination of settings in iTunes and on my iPod, but I’m just a dumb user and I couldn’t find it in the 30 seconds that my patience allowed me to look for an answer. In a business scenario our users aren’t going to spend time looking for ways to work with the tools we give them either. We need to make sure what we give them is flexible, easy to use and intuitive or they’ll find other ways to work with the data, potentially opening the door to a raft of data quality and integrity issues in so doing.

And as for those of you asking why I had Nirvana in my iTunes library in the first place: there was a passing moment in Philadelphia in 2003 when I was listening to them, but the attraction has long since passed. We'll leave the discussion of my failure to implement a decent Information Lifecycle Management policy over my music collection for another day!

Oh, and by the way, if anyone does happen to know an alternate source of truth which presents genres as perceived by the average forty-something middle-class man, please drop me a line :)




Pragmatic Data Governance - It's a Matter of Trust.


This post had its genesis in a comment I wrote in response to a great piece about data governance written by Jim Harris (Twitter: @ocdqblog) over at Obsessive Compulsive Data Quality (http://www.ocdqblog.com/home/aristotle-data-governance-and-lead-rulers.html). I hope Jim won't mind if I expand on those comments here. This post has grown organically and I'm not totally sure it captures all of the key issues bouncing around in my head, but getting it out there might help my thoughts develop and gel. So look out for a follow-up post in the not too distant future.

Among other things, Jim talks about the need for data governance to be applied with a degree of flexibility rather than following a set of rigid rules. This is a great perspective which exposes one of the key reasons why data governance initiatives often fail. In my opinion, too often it seems that those involved (or those sponsoring the initiative) expect that hard and fast rules are what's needed.

I wonder how much of this stems from organisational cultures where staff time is closely budgeted and monitored. A common pushback I've struck in the past is managers of potential data stewards (almost) insisting on an exact breakdown of what data governance activities their people will be working on and how long each activity will take. Add to that the desire to help new data stewards by providing them with tools to use as they learn what their new role involves, and a situation where the rules and their strict application come to the fore can quickly emerge! The problem is compounded if the stewards feel that they may be blamed in some way if others are not happy with their choices or the downstream effects of those choices.

In my opinion, when setting up data governance programs we need to make sure that data stewards have top cover - that is, senior management endorsement and support when it matters - to give them time to act and the freedom to make considered and pragmatic choices that might involve bending or extending data governance "rules". We must trust and allow the stewards to interpret rules according to the circumstances at hand, whilst still being mindful of past precedent and any negative impacts their choices might have.

Without this support I'd question the value that a data governance program is going to deliver. Sure, maybe some cosmetic short-term data quality wins might occur, but chances are they're the low-hanging fruit and would probably have happened eventually anyway. In the longer term, all that is likely to happen is that the data stewards will come to be seen as blockers to progress - the holders and enforcers of a set of petty rules, putting up hoops for others to jump through. The hard issues (the ones likely to be standing in the way of unlocking real value) may never get tackled, the program will slowly wither and die as complaints mount against it, and future data governance efforts will have a much harder time gaining any traction.

When a steward feels the need to build a degree of butt-covering into what he or she does, there is a problem. If he or she can claim to have exactly followed the rules as a means of avoiding any blamestorming about a bad situation arising from a data governance decision, chances are the best outcome for the business isn't going to manifest. We need to give data stewards the room to think, reason, weigh up the contributing points of view and then make the appropriate choice in the circumstances. A little trust in human nature, and faith that people will want to do the right thing, is required. If the organizational culture doesn't support this then a successful data governance initiative might still be possible, but it may mean that the data stewards need to sit higher up the food chain. Ideally, though, you can seek out stewards who already have the trust and respect of senior management and who will then have the flexibility to act pragmatically when needed.