This section sets out remarks and suggestions about the IS Rating Tool beta version, gathered within the IS Rating Working Group. These contributions are published with the authorization of their authors (listed by date).
Contribution by Richard Ordowich, Independent Consultant with STS Associates Inc., Princeton, New Jersey (LinkedIn profile), December 3rd, 2009
Description - Governance must be considered in any assessment, and so must the culture of the organization; the cultural aspects are critically important. I suggest that the IS Rating Tool include some factors that assess the governance of the organization.
Quick Answer by Sustainable IT We agree with this remark. Cultural aspects and organizational issues are key concerns when deploying an ACMS foundation and then leveraging its IS rating. Reminder: ACMS (Agility Chain Management System) is based on Business Repositories set up through Master Data (MDM), Business Rules (BRM) and Processes (BPM). To address this remark, we plan to add a new part, ‘Human Resources’, to measure the HR maturity level regarding ACMS. We also know that there is interesting work to be done detailing the connection between the IS Rating Tool and existing IS/IT governance frameworks such as CobiT, CMMI, ITIL V3, etc. Indeed, the IS Rating Tool doesn’t focus on IS Governance; it tackles IS Assets, in other words the stocks of key IS Assets based on Master Data, Business Rules and Processes. Each of these assets is measured through three domains: Mastering Knowledge, Mastering Governance and Mastering IT. Our Governance domain deals with the level of performance of the business features given to Business Users, such as version management, rights management and authoring of Master Data, Business Rules and Processes. It doesn’t measure the ability of the organization to use these business features in the right way. This latter assessment should be gauged with help from CobiT, CMMI, ITIL, etc. Any help providing more information about the complementarity between the IS Rating Tool and IS governance frameworks is welcome.
Contribution by Jean-Philippe Auzelle, IS Architect at Henri Poincaré University of Nancy (LinkedIn profile), December 9th, 2009
Description - The IS Rating Tool should exist in French. The current beta version is in English only. - The tool is currently based on spreadsheets, which could prove insufficient when used at the scale of a large company.
Quick Answer by Sustainable IT A French translation is not yet planned, but this work remains feasible within 2010. The current version of the IS Rating Tool is based on spreadsheets. We hope to implement it with the help of EA tools. All suggestions for porting the IS Rating Tool to EA tools are welcome.
Contribution by Olivier Lallement, MDM Manager at Logica Management Consulting (LinkedIn profile), December 14th, 2009
Description Data Assessment: - Taking into consideration the differences between internal Data Flows and external Data Flows (B2B). The current beta version studies Data Flows without this distinction. - Taking into consideration the relation from Data to Rules. In the current beta version this relation is evaluated from Rules to Data only. - Taking into consideration Meta-Data. The current beta version doesn’t provide any evaluation of Meta-Data. Rules Assessment: - Taking into consideration Rules lifecycle management. This is similar to business objects’ lifecycles but applied at the level of Rules. - Taking into consideration Rules supervision issues: how to control Rules behaviour and raise alerts when needed. - Vocabulary issue: is a CEP a BRMS? Is it useful to keep the term CEP/BRMS, or CEP only? Process Assessment: - Terms should be well defined, as they are less understandable than those used for Data and Rules.
Quick Answer by Sustainable IT We agree with these suggestions. The next version of the IS Rating Tool will integrate them. Regarding the term ‘CEP/BRMS’, we don’t know what the best approach is: either CEP only or CEP/BRMS. It depends on whether CEP solutions embed BRMS engines or not. All suggestions providing more details about this issue are welcome.
Contribution by Philippe Herbert, BU Manager EAI/SOA/Referentials at Soft Computing (LinkedIn profile), January 4th, 2010
Description Data Assessment: - Add an assessment of the level of Data standardization (ISO, ARTS, CEFACT-ONU, MDC-DGME, etc.). Rules Assessment: - Questions BR-KN-01-2 and BR-GV-01-2 should also be evaluated through the autonomy of business users when authoring and testing business Rules. Most of the time, even when a BRMS is used, a pure waterfall development lifecycle is deployed.
Quick Answer by Sustainable IT We agree on these suggestions and remarks. The next release of the IS Rating Tool will benefit from them.
Contribution by Jean Evelette, Chief Architect & Data Manager (LinkedIn profile), January 6th, 2010
Description - There is a possible misunderstanding between the three letters providing the consolidated IS Rate (Data, Rules and Processes) and the four letters used to compute it (A, B, C, D). - Documentation providing definitions and explanations for each question and mark level would be very useful to support the use of the IS Rating Tool.
Quick Answer by Sustainable IT - We will provide additional information in the IS Rating Tool user guide to remove the possible confusion between the consolidated IS Rate and the four letters used to compute it. We have planned to write a complete user guide, with explanations for each question, within 2010.
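To make the relationship between the per-asset scores and the letters concrete, here is a minimal sketch of how a consolidated three-letter IS Rate could be derived from percentage scores. The letter thresholds below are illustrative assumptions, not the tool’s official computation:

```python
# Illustrative sketch only: the cut-offs for A, B, C, D are assumptions,
# not the official IS Rating Tool rules.

def letter_mark(score_pct):
    """Map a 0-100% domain score to one of the four letters A-D."""
    if score_pct >= 75:
        return "A"
    if score_pct >= 50:
        return "B"
    if score_pct >= 25:
        return "C"
    return "D"

def consolidated_rate(data_pct, rules_pct, processes_pct):
    """Build the three-letter consolidated IS Rate (Data / Rules / Processes)."""
    return "/".join(letter_mark(s) for s in (data_pct, rules_pct, processes_pct))

# A company scoring 68% on Data, 42% on Rules and 80% on Processes:
print(consolidated_rate(68.0, 42.0, 80.0))  # B/C/A
```

Under these assumed thresholds, each asset keeps its own letter, which is why the consolidated rate is three letters while each letter is computed from a four-value scale.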
Contribution by Hamid Hammouche, IS Architect - Leader IBM ILOG Solutions at Atos Origin (LinkedIn profile), and Carolin Chai, IS Architect at Atos Origin (LinkedIn profile), January 14th, 2010
Description - The current beta version of the IS Rating Tool doesn’t tackle IS Assets other than Data, Rules and Processes: what about IT infrastructure, applications, services, etc.? - There is a lack of examples to support the use of the tool. - A glossary should be set up to define all terms used in the tool. - Set up a first, easier list of questions with answers limited to yes/no rather than the current scale of percentages. Keep this scale (bad to good) for a second-stage, more detailed assessment. - To enforce a well-balanced assessment between companies, more information is needed to make the right decision within the percentage scale for each question. - Questions related to Budget and Replacement value should be moved to the Governance domain. Data Assessment: - DT-KN-01: the term ‘Data enterprise architecture’ should be detailed. - DT-KN-02: add some questions about the number of available data dictionaries, which users can access these data dictionaries, etc. This suggestion could be applied to every question. - DT-KN-02-02: unclear question. - KN-03-01: the term ‘dynamic modeling’ is not self-explanatory. - KN-02: ‘Basic modeling’ should be renamed ‘Static’ or ‘Semantic’ modeling. - DT-GV-03: we need several questions about the integration between MDM and EAI. This integration is a key point to leverage the MDM solution. Rules Assessment: - BR-KN-01: is the goal to define a first classification of Rules? If so, the first attempt presented in the current beta version of the IS Rating Tool (Core Business, Organizational, and Security) should be detailed. - BR-KN-03: the goal of gauging the integration level between MDM and BRMS is interesting but remains complicated. - Questions related to the Rules assessment are not focused on the method but more on the ability to identify and extract Rules from existing systems. Nothing tells us how to do that.
Quick Answer by Sustainable IT Many thanks for this important contribution. What about other IS Assets? - The IS Rating Tool tackles IS Assets only, not IT Assets such as IT infrastructure. We think that other resources such as Services (SOA approach), Applications, etc. should rely on Data, Rules and Processes managed as real assets, in other words what we are studying with the IS Rating Tool. - However, it could be very useful to complement the IS Rating Tool with other domains of assessment, and we hope that Systems Integrators will deliver such work. Lack of examples - We agree on this problem. We hope that companies using the IS Rating Tool will be willing to share their ratings with the community. A further web site section will be opened to gather these results. All contributions are welcome, and we will publish them anonymously. Quick IS Rating - The idea of an easier set of questions (yes/no responses only) is a very good suggestion. We plan to do something in this field within Q1 2010. Providing more information to support the use of the IS Rating Tool - Yes, it is required, and we will be working on it during 2010. We also hope that Systems Integrators will be ready to package their own offerings around the core IS Rating Tool that will be available at the end of February 2010. This release will be stable and sustainable enough to allow Systems Integrators to complement it with their own added value, including guidelines and best practices to help and support companies in using the IS Rating Tool. Obviously, the Sustainable IT Architecture community and its website are available to publish information by Systems Integrators providing work related to this matter.
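The proposed two-stage approach, a quick yes/no screening before the detailed percentage-based rating, could be sketched as follows. The questions and the majority threshold are invented placeholders, not taken from the actual tool:

```python
# Hypothetical sketch of a first-stage "quick rating": plain yes/no questions,
# with the detailed percentage assessment reserved for a second stage.

QUICK_QUESTIONS = [
    "Is a data dictionary available to business users?",
    "Are business rules managed outside application code?",
    "Are core processes formally modeled?",
]

def quick_rating(answers):
    """answers: list of booleans, one per question in QUICK_QUESTIONS.
    Suggest the detailed rating only if a majority of answers are yes."""
    yes_count = sum(1 for a in answers if a)
    if yes_count > len(answers) / 2:
        return "ready for detailed rating"
    return "address prerequisites first"

print(quick_rating([True, True, False]))
```

The majority rule is only one possible policy; a real screening stage could just as well require all prerequisites to be met.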
Contribution by Martha Lemoine, Enterprise Architect, Toronto, Canada (LinkedIn profile), January 20th, 2010
Description Global remark: I am worried about how you demonstrate the value and benefits (agility, data quality, etc.) in business terms and get the funding to improve the IS Rating. A white paper on that would be a great companion to this tool. My guess is that not many Organizations will have a high IS Asset rate, but then the question is what to do next to influence the paradigm change from “application driven” to Model Driven, and how to position the Organization and Architecture so that this change can become a reality. Organizational Readiness is a key element to evaluate and tackle in order to get the value of any IS asset management tool.
General comments: - Glossary: I agree with existing feedback asking for a Glossary of Terms/acronyms; adding examples is also key from a usability point of view. Once the glossary is available, remove the notes that contain definitions and add examples where there are none. - Guidelines: Add a recommendation that it is good practice to have more than one person completing the IS rating questionnaire, to avoid subjectivity, and to review the IS rating frequently to gauge its evolution against the roadmap driven by the evaluation. Use the IS rating to baseline metrics and track improvement, showing the business value of following all the good practices that are evaluated. Add those metrics to corporate scorecards to track execution of IS asset strategies. - Rating: A percentage is very hard to evaluate unless you have a very mature organization with metrics (total number of rules, number of data elements, number of processes vs. architected data elements, rules and processes…). For some questions a percentage or low/medium/high rating doesn’t fit very well; maybe consider something along the lines of: Strongly agree (Very High), Agree (High), Neutral (Medium), Disagree (Low), Strongly disagree (Bad). - Weight for your rating: Provide an example or guideline explaining when it makes sense to change the weight. - A diagram showing the rating, summarizing key points for each rate range and how the rating can evolve over time, would be helpful in the user guide (something like the CMMI diagram). - Ideally the IS rating should be a tool that lets Organizations go to the Sustainable IT Architecture web site, do the evaluation online, and benchmark against all existing evaluations and/or those of the same business sector (financial, government, insurance…); that may drive interest in improving the rating over time. The IS rating says “for each question make one choice only”, but it would be better to prevent the choice of more than one.
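The suggested Likert-style scale combined with per-question weights could work as in this sketch. The numeric value assigned to each label is an assumption chosen for illustration:

```python
# A sketch of the suggested scale: Likert answers instead of raw percentages,
# with a weight per question. The numbers behind each label are assumptions.
LIKERT = {
    "Strongly agree": 100,   # Very High
    "Agree": 75,             # High
    "Neutral": 50,           # Medium
    "Disagree": 25,          # Low
    "Strongly disagree": 0,  # Bad
}

def weighted_score(answers):
    """answers: list of (likert_label, weight) pairs -> weighted 0-100 score."""
    total_weight = sum(weight for _, weight in answers)
    return sum(LIKERT[label] * weight for label, weight in answers) / total_weight

# One double-weighted question plus two single-weighted ones:
print(weighted_score([("Agree", 2), ("Neutral", 1), ("Strongly agree", 1)]))  # 75.0
```

Keeping the weights explicit makes it easy to document, per question, why a given weight was changed, which is the guideline Martha asks for.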
Data Assessment: - Why isn’t unstructured data evaluated among IS Data assets? I think it is important for BPM and ECM (the Enterprise Content Management piece); many people refer to that as Information Architecture. - DT-KN-01: change ‘Data enterprise architecture’ to ‘Enterprise data architecture’ (same comment for Rules and Processes enterprise architecture). - DT-KN-01-4: shouldn’t ETL be added after ESB? If not, why not? - DT-KN-01-5: not very clear, please provide some examples. At first sight I see this as an overlap of 01-1 to 01-4; maybe I am missing the point... - DT-KN-02: add some questions about the number of available data dictionaries and their accessibility to the whole Organization/projects for reuse. Sometimes models are available but with no metadata. Metadata on valid values should be linked to MDM. - DT-KN-02-1: Business Dictionary/Business Glossary: a Business Glossary exists, with a process defined for its maintenance, and is accessible to any potential user. - DT-KN-02-2: instead of “not reliant on any IT DB”, write “independent of any physical implementation”. - How would you rate an organization that has an Enterprise business model plus logical and physical models using a Data Modeling notation such as IDEF1X or IE instead of UML? I believe that in general more Organizations are mature in data asset management (data modeling) than in process and rules management. Organizations that are not yet doing Model Driven Architecture or SOA frequently have siloed Application development using UML for Object modeling of a specific application, yet an Enterprise Data Architecture practice may exist that manages Business, Logical and Physical/dimensional models in Data Modeling tools. - DT-GV: the scope says organizational issues are out of scope, but isn’t Data Governance a key “process” that relies on people to resolve issues in MDM? If the technology maturity is there but people’s maturity in terms of information management is not, will the IS asset have the same rating?
- DT-GV-01-01: I suggest changing “nature of data” to “data domains”. - DT-GV-01-2,3: Not sure why we need to separate CDI and PIM from MDM. How would you deal with an Organization that has an MDM tool like Siperian or Initiate, which define themselves as multi-domain MDM solutions? Should those questions be replaced by one that says any application that uses master data is connected to MDM, to leverage data governance and trusted master data? - DT-GV-05-1: It is not very clear. Is it referring to the Corporate Scorecard or the IT scorecard? The IT strategy, the Information Management strategy, or what? How about saying something like: a scorecard exists with Key Performance Indicators (KPIs) that measure the quality of master data against targets and track actions to drive improvements in the Data Assets rating. (A similar comment applies to Rules and Processes, PR-GV-03 and BR-GV-03.) - DT-GV-05-2: is this about business rules, process monitoring, or MDM real-time data quality monitoring and alerts?
Rules Assessment: - If you have all your rules in an Excel spreadsheet, where would you rate that vs. not having them at all? - BR-KN-04: What if you have rules well defined (in Excel or Word) but no BRMS or CEP tool? How would you rate this if the scope of the evaluation is an application vs. an entire organization? - At which point is it evaluated that rules are defined based on a common business vocabulary and reference data? It may be part of KN-03.
Process Assessment: PR-KN-01-03 Qualify Process as “Non Core Process”
Quick Answer by Sustainable IT Your global remark: We agree with this remark. We have to publish such a white paper. In 2008 we published the white paper “IS Governance when restructuring IS around ACMS” by Orchestra Networks, Logica Management and ILOG. It provides some interesting information. You can download it here. We are preparing a quick view of the mapping between CobiT (Plan & Organise processes) and the IS Rating Tool. It should bring additional information. Stay tuned through the LinkedIn Group.
Your general comments: - We agree with all comments. Your suggestion about the tool is relevant. We hope that software vendors in the field of EA and/or modeling tools will be willing to contact us to study an implementation. Systems Integrators should also be interested in delivering something in this field to support their offerings around the IS Rating Tool.
Your remarks about Data Assessment: - We haven’t integrated unstructured data yet, and this is a missing topic. Unfortunately we don’t have a good command of this matter, so we need help. If you can or want to suggest an additional list of rating questions about unstructured data, please contact us.
- Regarding UML notation, you are right: there are other notations and we must take them into consideration.
- Regarding your remark about the organizational issue: yes, we agree that this is a key point. The IS Rating Tool deals with Governance Features but not with organizational issues. These Governance Features are just a check-list of target features required to be able to govern IS Assets based on Master Data, Business Rules and Processes, for example version management, authoring, traceability, etc. Therefore the IS Rating Tool doesn’t gauge the ability of the organization and its processes to govern IS Assets well, for example through strategic and tactical plans, SLAs, etc. We believe that CobiT is the right tool to use in this field; therefore the IS Rating Tool is complementary to CobiT.
- Regarding your remarks about the CDI, PIM and MDM classification: this is a complicated topic. Pierre Bonnet, founder of Sustainable IT Architecture, holds a detailed point of view on this matter, as he is also cofounder of Orchestra Networks, a software vendor in the field of generic MDM, in other words what we call Model-driven MDM. From an IT point of view we need to distinguish between an OLTP data repository (CDI), relying on a pure, usual relational physical data model, and a more semantic data repository, relying on a rich relational + object-oriented + hierarchical data model including business rules to check data quality and manage referential integrity constraints. Trouble arises when companies set up PIM/MDM with a pure OLTP data repository, because all data governance features are frozen by rigid data models: when you modify or customize your data models, you need to deliver specific software development to realign the governance features. By contrast, when using a Model-driven approach, data governance features stay aligned with the rich data models and their evolutions and variations. CDI is different, as it should be limited to identifier/key management (broker); even if OLTP is required, it doesn’t cause any trouble because data governance features are not needed there.
- Regarding the scorecard: it is first of all an IS scorecard, not only an IT scorecard but not yet a pure business scorecard. We will provide more details in the next release of the IS Rating Tool to avoid misunderstanding.
Your remarks about Rules Assessment: - About spreadsheets as a Rules repository: we don’t much like the idea of using spreadsheets to govern IS Assets, including Business Rules, because of the lack of version management, permission management, traceability, etc. Therefore the IS Rating Tool should give a bad rating when spreadsheets are used as a Rules repository. We will check this point.
- We agree with your remark about the common business vocabulary. This is a linking value between the Data Assessment part and the Rules Assessment part: we will check this point.
Contribution by Romain Montillet, IS Consultant at Devoteam (LinkedIn profile), January 21st, 2010
Description Method: - The IS Rating is a feeling-based analysis. The same IS Rating of the same firm could differ according to the evaluators. - A “from scratch” analysis should be different from a recurring one. Moreover, in a recurring rating, it would be relevant to be able to identify the evolution of our performance. In the same way, the IS Rating should be customizable according to the importance of the scope and the level of criticality. - The maturity or performance target (“to be”) should be defined in order to compare it with the current rating (“as is”). This would allow us to define a sort of policy to follow, with action scenarios. - Three main themes are not taken into account: organization and governance, people and culture, and compliance. - What about widening the scope to unstructured data (ECM…)?
The advantages of this analysis should be emphasized for consultants as well as for enterprises: 1/ For consultants: - An industrialized tool which capitalizes knowledge and know-how… - A method which provides a concrete, progressive and modular analysis - A better way to build deliverables... 2/ For enterprises: - A good starting point for a global strategy with innovation/change projects - A view across the three assets, for better efficiency in project definition...
Functionalities: - The letter rank should be calculated automatically. - We should be able to apply a global weighting to each measure.
Extensions: - It would be great to have pivot tables (for instance, knowledge according to Data, Rules and Processes). - A database with typical questions, guidelines, types of deliverables (for instance for a monthly report…), and specific examples according to typical profiles or clients… - It would be great to be able to export results to Access, MEGA or other software.
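The pivot-table idea, aggregating question scores by assessment domain across Data, Rules and Processes, can be sketched with plain Python. The question IDs and scores below are invented examples; the DT/BR/PR and KN/GV prefixes follow the question codes used in the contributions above, and the IT domain code is an assumption:

```python
# Sketch of a "pivot" over rating results: average question scores grouped by
# domain (Knowledge, Governance, IT) and asset (Data, Rules, Processes).
# The sample scores are invented for illustration.
from collections import defaultdict

scores = [
    ("DT-KN-01", 60), ("DT-GV-01", 40), ("BR-KN-01", 55),
    ("BR-GV-01", 30), ("PR-KN-01", 70), ("PR-GV-01", 50),
]

ASSET = {"DT": "Data", "BR": "Rules", "PR": "Processes"}
DOMAIN = {"KN": "Knowledge", "GV": "Governance", "IT": "IT"}

def pivot(rows):
    """Return {domain: {asset: average score}} from (question_id, score) rows."""
    buckets = defaultdict(list)
    for qid, score in rows:
        asset_code, domain_code = qid.split("-")[:2]
        buckets[(DOMAIN[domain_code], ASSET[asset_code])].append(score)
    table = defaultdict(dict)
    for (domain, asset), values in buckets.items():
        table[domain][asset] = sum(values) / len(values)
    return dict(table)

table = pivot(scores)
```

The same grouping could of course be done in a spreadsheet pivot table or exported to Access or MEGA, as suggested; this sketch only shows that the question codes already carry the two grouping axes.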
Quick Answer by Sustainable IT Your remarks about the method: - We must admit that the current version of the IS Rating Tool cannot provide a deterministic approach. We are not sure that the next release, planned for the end of March 2010, will do better in this field. We believe that an audit-oriented IS Rating Tool should be defined later this year to fix this issue. - Your suggestions regarding the assessment of the existing IS (“as is”) and the targets (“to be”) are relevant. We will include this advice when writing the user guide of the IS Rating Tool. - As already stated (see the answers above), the IS Rating Tool doesn’t tackle organization and culture issues, only the stocks of IS Assets (Data, Rules and Processes). We believe that CobiT and CMMI have to be used in this field. Conversely, CobiT and CMMI don’t provide a tool such as the IS Rating. Regarding your remark about compliance, we must study this question in depth, as we firmly believe that ACMS and the IS Rating Tool improve the ability of companies to comply with business regulations, which requires better traceability of Data and Rules and better involvement of stakeholders, something quite impossible when implementing IS with a hard-coded approach only. - About unstructured data, please read our answer to Martha Lemoine’s contribution above.
Your suggestions for emphasizing the importance of the IS Rating Tool to Consultants and Companies are very good. We will benefit from them when writing the user guide of the IS Rating Tool.
Your remarks about functionalities and extensions: Yes, we need solutions to implement the IS Rating in software such as EA or modeling tools. Once again, all software vendors and Systems Integrators wishing to study this matter with us are welcome. The added value of such implementations would be very high to support and emphasize consulting offerings.
Contribution by Stéphane Mulard, IS Consultant at Infhotep (LinkedIn profile), January 25th, 2010
Description Localized version: Although strongly in favor of an international approach to the IS Rating initiative, we do believe that we need a French version (or even a localized version) of both the rating tool and any guidelines accompanying it. The relevance of the rating really depends on how well the questions are understood, especially when some of them may use a vocabulary that can be relatively new to some people – in fact we rated some IS ourselves rather than involving our clients specifically because of the language issue.
Systematic guidelines /examples: We think that the guidelines accompanying the questions have to be developed in order to be more precise and to give out more examples. Rather than expanding the size of the cells containing the questions, perhaps all the notes could be transferred to the Guidelines references column (J) or an extra column.
More precise and measurable questions: Some of the questions are formulated too openly in terms of interpretation and reliability of the answer, typically starting with the first question, DT-KN-01-1: “Enterprise Architecture works at the reference and Master Data level”. It is very hard to rate this with specific measurable elements matching the rating levels themselves. We think that for each question, or maybe for sections like DT-KN-01, 02, etc., there should be a list of elements, pieces of evidence, that are expected at a given level. This may be a big task but it would make the rating more accurate. We have created a similar type of rating ourselves, though a bit more specific to what we call “Urbanisme” in French, with a slightly smaller scope than Enterprise Architecture. In our document we actually included a column that expects a type of evidence for each question. Sometimes it is a document, sometimes it is just the existence of a committee, sometimes it’s just a name. Without this kind of clue the rating is very (too much?) dependent on the skills/mood/involvement of the person carrying out the rating. Ex: DT-KN-03-2. This question is formulated more like the 100% mark description, the top maturity level, but leaves the rater imagining the appropriate mark for less-than-ideal situations.
Pre-requisites / Assessment of maturity: We feel that it might be useful to include a set of pre-requisites, either in the shape of questions or as a description of some required elements. This touches on some of the feedback that other contributors have given. Basically, we felt that some of the questions require an appropriate level of maturity to be asked (on each of the three axes: Data, Rules and Processes). Sometimes there is no use in marking lots of questions with a “bad” mark if some prerequisites are not in place. Ex: all the DT-GV questions assume an MDM tool is in place… if this is not the case then the master data cannot be stored and managed in it. The weight of the questions is a way around this, but you cannot really cancel too many questions. How about adding a question on whether implementing such a tool is planned or available on a limited scope? DT-GV-01-4 touches on this through the existence of Excel spreadsheets, but it may be interesting to rate the number of these Excel documents and light applications: are they well identified? Is there a plan to migrate them into a tool?
Splitting questions: Some of the questions contain multiple questions within their description, making it more difficult to rate the right element. Ex: DT-KN-02 Business Levels. All the elements are very relevant points to rate and should maybe be split into three parts: whether the models use a formal notation, the coverage of these models relative to referential data, and how the models are maintained. For instance, how would we rate the following situation: “30% of the identified referential data is modeled in UML and natural language accurately and made available to all, but the business does not really control the models, whose maintenance is ensured by an isolated team or individual”. What is important? What do we want to rate? The existence of the models, the format of the models, or the control that the business has over the models?
Improvement path for a company: I found very relevant the idea in your PowerPoint presentation of an improvement path for a company, going from C/C/C to B/C/B to B/B/B. However, to go further, I think that the rating levels, either for each question or for each group of questions, should also reflect an improvement path. In other words, a company that is at 15% for a category should find the next steps, or the next goal, through a description/guidelines for the 30% mark. I realize this may be trying to map out an ideal progression scheme, but I think the rating tool could be improved at this level.
Quick answer by Sustainable IT Localized version: We understand this requirement. After having established the first release of the IS Rating Tool and its guidelines, we will work on the French translation. This translation work should be done in March and April 2010.
Guidelines/examples: Pushing the current notes to the Guidelines column is an excellent idea. We will do that and try to provide more explanations. However, we hope that Systems Integrators will be willing to add their own value in this field of Guidelines references. Indeed, the IS Rating Tool should provide basic explanations only, to enforce a homogeneous rating, and leave the door open to different best practices and modeling procedures stemming from different communities and companies.
More precise and measurable questions: We fully agree with your remark and suggestion. Obviously, as you mentioned, this is a big challenge, but we must make this effort. This work will be done within 2010, in the versions released after the end of March.
Pre-requisites / Assessment of maturity: Good suggestions. We will integrate them in the release planned at the end of March 2010.
Splitting questions: We agree with you. We must split overly complicated questions into more understandable and targeted ones. This should make them easier to read and to rate.
Your last suggestion about the "improvement path for a company": Yes we agree on this suggestion. On the basis of the first stabilized release of the IS Rating Tool, we will be able to provide some “natural paths” to improve the rating, depending on the current rate. Indeed, we can fully use best practices around the Agility Chain Management System Foundation to achieve this goal. This is the core-value of the Sustainable IT Architecture Community.