Customer Success with Microsoft Dynamics Sure Step
Chapter 6. Quality Management and Optimization
In the last chapter, we learned about the Waterfall and Agile implementation approaches supported by Sure Step. We discussed the different project types based on these approaches, as well as the implementation phases and cross-phase processes that these project types span. We also learned about the activities, templates, and guidance provided by Sure Step to enable solution delivery.
In this chapter, we focus on the quality aspects of Sure Step, which encompasses both proactive actions that can be taken during the solution delivery, as well as Post Go-Live steps to ensure the ongoing maintenance and success of the solution. The following topics will be covered:
· The manifestation of quality management in the different areas of Sure Step
· How quality control is embedded within the Waterfall-based and Agile-based project types
· The Sure Step Optimization Offerings Roadmap—what it means to service providers, especially those starting in the Microsoft Dynamics field, and how the customer can benefit from its usage
Understanding the quality management manifestation in Sure Step
Quality management practices, including quality control and quality assurance, are essential aspects of the solution-delivery process to ensure that the solution being developed is in line with the expectations of the customer. Dr. W. Edwards Deming was one of the pioneers of the quality revolution, first in Japan, then in the United States, and then the rest of the world. Dr. Deming's teachings and philosophies originated in manufacturing, but have been extended to several disciplines over time, and his works have been produced and reproduced via a number of books and articles. Dr. Deming emphasized how quality and efficiency can improve simultaneously, by creating "a consistency of purpose designed to drive the organization towards product and service improvement and continually and permanently improving the system."
In the Sure Step methodology, quality control and assurance are manifested in many areas, including in the activities of each cross phase of the implementation project types. Following are some of the areas where quality control and management are accentuated and specifically called out:
· The Program Management cross phase within the Sure Step project types includes specific activities and templates focused on quality control.
· The Quality and Testing cross phase within the Sure Step project types focuses on due diligence to ensure that the solution is configured and customized as per agreed upon standards and requirements.
· Sure Step provides several offerings under the Optimization umbrella. Optimization Offerings include proactive oversight of an implementation from a technical and governance aspect, as well as actions that can be performed during production to ensure that the solution continues to operate effectively.
· Sure Step also includes reference content featuring quality management within the Project Management Library.
We will review the first three topics in more detail in this chapter. We will reserve the discussion on the Project Management Library and its quality management content for an upcoming chapter.
Controlling quality within project types
Within the Sure Step implementation project types, the execution of quality management is often emphasized and entrusted to senior roles like the project manager, solution architect, and tester. These roles assume leadership of the solution-delivery process, and as such it is seen as a natural extension of their roles to oversee the quality aspects of solution delivery. Accordingly, key activities in the Program Management and Quality and Testing cross phases are specifically called out to monitor the quality of the implementation, with the project manager, solution architect, and so on, as the owners of the execution of the deliverables. In the Program Management cross phase, key quality-focused activities include the documentation of Conditions of Satisfaction (CoS) and the execution of Tollgate Reviews. The Quality and Testing cross phase includes activities early in the delivery cycle to ensure that Quality Standards are established. Additionally, Monitoring and Testing activities are essential elements of this cross phase, with activities called out in each phase.
Quality activities embedded in program management
Conditions of Satisfaction are the measures of project success and the goals for the engagement that allow the teams to clearly determine the success or failure of the project. The guidance in Sure Step calls for the elements of the CoS to be identified at the outset of the engagement and noted within the Project Charter or similar project documentation. The project manager is responsible for working with the customer to ensure that this activity is executed, and that the document is signed off by the customer, thereby denoting their acceptance.
Another key component of quality control during the implementation is Tollgate Reviews. For the Waterfall project types, the execution of Tollgate Reviews is called out at the end of each phase. For the Agile project type, Tollgate Reviews take the form of the Sprint Post Mortem, which is executed at the end of each sprint cycle.
Tollgate Reviews in the Waterfall project types assess the current health of the project by reviewing the key milestones achieved and key deliverables completed during the corresponding phase. Any project issues and risks are also identified and documented, and a course for mitigating them is established. This may include initiating project scope change requests, for which approval is requested from the customer. Any adjustments to the overall timeline of the project are also performed during this activity. Tollgate Reviews are also used to assess how the project is faring in terms of addressing the Conditions of Satisfaction identified by the customer.
Finally, the lessons learned are documented for the benefit of the customer and project teams, which is especially critical at the end of the Operation phase of the project because they may provide important guidelines for related engagements in the future.
In the Agile project type, the project team members use the Sprint Cycle Review to discuss the relative successes and failures of the process at the end of each sprint cycle. The team focuses on the processes and working practices followed during the sprint cycle, including how the team worked together and whether any improvements are needed before the next sprint cycle is initiated.
Key quality and testing cross-phase activities
Establishing the Quality and Testing Standards early in an implementation can reduce any ambiguity in the configuration, development, and testing of the solution. These standards, gathered and documented in a Test Plan, communicate the general procedures to be followed when conducting software testing and validation. The plan may include specific test cases or scenarios and their expected results. It may also encompass projected business processes and workflow changes in the customer organization.
The Test Plan also provides a general overview of the Monitoring and Testing activities that will be performed during the course of the implementation. The Testing activities are especially emphasized in the Sure Step methodology—the larger the scope of the engagement, the greater the rigor and the number of tests recommended. The following diagram shows the recommended tests for large-scale engagements:
At the outset of the engagement, the implementation team ascertains the solution requirements and conducts a Fit-Gap Analysis to determine which requirements fit the standard solution and which are gaps requiring customization. The testing activities address these requirements, beginning with the Solution Fits and Solution Gaps during the solution development phase, and then testing the overall solution in the deployment phase.
The first three tests performed during the solution development are executed within the development team. These tests do not require the customer team to be involved. The customer team will be required in the next series of tests, though the development team may query the customer Subject Matter Experts (SMEs) as necessary during the development process if, for example, a requirement needs clarification.
Feature Testing
The Feature Test is performed by the application consultants in the delivery team focused on the configuration and setup of the system. The objective of this test is to ensure that the system is configured to meet the requirements described in the Functional Requirement Document (FRD) and Functional Design Document (FDD).
Let's take an example of a customer requirement for a specific workflow for approval of orders over a certain quantity or amount. The design for the requirement is defined in the FDD, and the system is configured to follow this approval workflow. A Feature Test is first conducted by the application consultant to verify that the requirement is met. At a later stage, the customer SME will also verify this configuration.
A Unit Test is a standalone test of the custom code written for a system modification. It is performed by the developers and is based on the solution design described in the Technical Design Document (TDD).
For example, say the customer's marketing department needs some custom fields in the customer's master table to allow them to classify and segment their customers. The TDD may be used to describe the specific tables in the system that will need to be modified. After the system is customized accordingly, the Unit Test is conducted by the developers to verify that the fields have been created as required.
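To make the Unit Test concrete, here is a minimal sketch in Python of the kind of check the developers might automate for the custom-field example above. This is illustrative only: an actual Dynamics customization would be tested against the product's own development environment, and the table schema, field names, and helper function here are all assumptions, not real Dynamics artifacts.

```python
# Hypothetical Unit Test for the custom-field example.
# The schema representation and field names (MarketSegment, CustomerTier)
# are illustrative assumptions, not actual Dynamics table definitions.

REQUIRED_CUSTOM_FIELDS = {
    "MarketSegment": str,  # classification field requested by marketing (assumed name)
    "CustomerTier": str,   # segmentation field requested by marketing (assumed name)
}

def get_customer_table_schema():
    """Stand-in for reading the customized customer master table definition."""
    return {
        "AccountNum": str,
        "Name": str,
        "MarketSegment": str,
        "CustomerTier": str,
    }

def test_custom_fields_exist():
    """Verify the fields described in the TDD were created as required."""
    schema = get_customer_table_schema()
    for field_name, expected_type in REQUIRED_CUSTOM_FIELDS.items():
        assert field_name in schema, f"missing custom field: {field_name}"
        assert schema[field_name] is expected_type, f"wrong type for {field_name}"

test_custom_fields_exist()
print("unit test passed")
```

The point of the sketch is the scope of a Unit Test: it validates the modification in isolation against the TDD, without touching the functional workflow, which is the subject of the Function Test that follows.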
The Function Test is the subsequent test to the Unit Test. Like the Unit Test, it is also focused on custom code, but unlike the Unit Test, the Function Test is performed by the application consultants and is based on the FDD. The objective is to verify that the system modification is in line with the functional or business need of the customer.
In our previous example of the custom fields in the customer master table, the Function Test is conducted after the Unit Test by the application consultant to verify that the marketing department's functional need is met by the customization. This may involve, for example, verifying that the field is placed at the right location on the corresponding form, and that the appropriate values are available to the marketing personnel.
The need for such rigor in testing is evident from the examples noted here. If the reader concludes that the number of tests seems to be too much overhead for smaller engagements, they may consider combining tests where feasible. For large-scale engagements, however, which may have several hundred simple and complex requirements, this rigor is necessary, and the reader is strongly advised against taking shortcuts that may pose unnecessary risks. Skipping the individual tests may lead to issues being detected downstream in the process. As any experienced consultant will tell you, it is better to test a smaller subset of the solution so as to isolate any potential issues. The alternative, which is more time consuming and laborious, is to test all the setups together and try to determine whether the problem lies in a single subset of code or is due to its intersection with another element of code. In that sense, you could draw a parallel to a manufacturing process. In multi-part manufacturing, it is critical to detect quality issues during the manufacture of the components; waiting until assembly to discover parts that don't fit will likely shut down the assembly line and lead to expensive delays. Just as in the manufacturing process, testing the individual code components is important during solution development, before the solution is tested as a whole.
The tests described above are conducted within the implementation team. The remaining tests are conducted with direct involvement of the customer's personnel involved in the solution implementation. The success of these tests is predicated on this involvement to ensure that the solution is developed as envisioned.
A Sub-Process Test involves the testing of a subset of the company's overall business process, to ensure that the users of the new solution will be getting a system that performs as originally envisioned. This test is performed by the application consultants with the customer SMEs participating, verifying, and signing-off on the subset of the solution.
A test of the customer's Order Taking process is an example of a Sub-Process Test. The new solution is set up in a test environment and the customer SMEs work with the application consultants to run the system, checking that the Order-Taking workflow is intuitive and as per the agreed-upon design.
While a Sub-Process Test focuses on a subset of the company's workflow, the Process Test is a complete test of the related features and functions that make up a defined business process. This test is also performed by the application consultants with the customer SMEs.
Testing of the Order-to-Cash workflow or Procure-to-Pay workflow are examples of Process Testing from an ERP solution perspective. In the Order-to-Cash Process Test, the customer SMEs verify that the system performs as desired for entering a customer order, fulfilling the order, and accepting payment for the order, including alerting the appropriate Customer Service personnel when the payments are overdue. In the Procure-to-Pay Process Test, the SMEs verify that they can place a Purchase Order with their suppliers as per their design, they can receive and account for the delivery of the supplies, and they are able to pay the suppliers for their goods within the appropriate payment conventions of the organization.
From a CRM solution perspective, the Process Test examples include the Quote-to-Order workflow, where a quote captured in the CRM system can be tracked through to conversion into an order, or the Self Service Portal workflows such as users obtaining documentation or answers to specific queries or statuses of their requests.
Data Acceptance Testing (DAT)
DAT is a very important test for business solutions delivery. The first objective of DAT is to verify that the data migrated from the existing systems to the new system is the correct data subset, and the data has been cleansed as necessary. The second objective of the DAT is validating that all of the data needed for transactions, inquiry, and reporting is available. DAT should be performed by the customer's data owners and key users who are closest to the data elements and can identify any shortcomings. DAT may also involve the customer's IT staff, if the data sources need to be validated during the testing process.
The importance of the DAT cannot be overstated, as the behavior of the new system is dependent on the data that is populated in its database. If the data is incorrect, it doesn't matter how good the new system is—the users will only get wrong information faster or easier.
Let's say that one of the suppliers for the customer is ABC Corporation. Due to the lack of data entry checks and rules in legacy systems, it is not uncommon to find multiple records for the same supplier entered as ABC or ABC Corp or ABC Corporation. Why is this a problem? The purchasing manager does not have a true overview of all the orders placed with ABC Corporation, without which he or she may not have all the ammunition to negotiate additional discounts.
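A check of this kind can be partially automated during the DAT. The following Python sketch shows one simple way to flag the duplicate supplier records described above by normalizing common company-name suffixes before the data owners review the results. The suffix list and normalization rules are illustrative assumptions; real data cleansing tooling for a migration would be far more thorough.

```python
# Illustrative DAT-style duplicate check for supplier master records.
# Normalizes common suffix variations so "ABC", "ABC Corp", and
# "ABC Corporation" collapse to the same comparison key.

from collections import defaultdict

# Assumed, non-exhaustive list of suffixes to ignore when comparing names
SUFFIXES = {"corp", "corporation", "inc", "incorporated", "ltd", "limited", "co"}

def normalize(name: str) -> str:
    """Lowercase the name and strip punctuation and known suffixes."""
    words = [w.strip(".,") for w in name.lower().split()]
    return " ".join(w for w in words if w not in SUFFIXES)

def find_duplicates(suppliers):
    """Group supplier records that normalize to the same key."""
    groups = defaultdict(list)
    for s in suppliers:
        groups[normalize(s)].append(s)
    return {k: v for k, v in groups.items() if len(v) > 1}

dupes = find_duplicates(["ABC", "ABC Corp", "ABC Corporation", "XYZ Ltd"])
print(dupes)  # {'abc': ['ABC', 'ABC Corp', 'ABC Corporation']}
```

Output like this gives the data owners a candidate list to confirm or reject; the final decision on merging records stays with the people closest to the data, as the DAT guidance above prescribes.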
The Integration Test is an end-to-end test of specific business processes, and includes system setup, development, reports, and testing of integrations or interfaces to any external subsystems. Integration testing is performed with the company's SMEs, key users, and the application consultants. The company's IT staff may also be involved in this testing, especially as it relates to the touchpoints of the external systems.
In the Process Test's Order-to-Cash and Procure-to-Pay workflow testing examples, if the process required connecting to external systems or databases for reporting or other reasons, the Integration Test would address and validate these scenarios.
Integration testing is often under-executed because of the difficulty of simulating live integration environments. These tests require substantial prior planning, often involving other system vendors.
The Performance Test is a technical test that focuses on how well the system performs in high transaction volumes anticipated during peak times. This test is performed with the company's IT staff, SMEs, and application and technical consultants.
Performance Testing can make use of canned scripts to populate the system and simulate a heavy load. However, depending on the number of customizations made to the standard system, the development of the scripts may require several person-hours. Depending on the criticality of the system response rate for a corresponding business process, this test may be a very important one to validate that the configured system under load meets the business requirements and agreed-upon performance metrics.
An example of Performance Testing is a scenario to monitor the system response for multiple order entries, which includes validation of the customer's credit and outstanding payments.
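The shape of such a test can be sketched briefly. The following Python harness submits many simulated order entries concurrently and reports a response-time percentile against a threshold. Everything here is an assumption for illustration: the `enter_order` stand-in, the simulated delays, and the 200 ms target would all be replaced by the actual system call and the performance metrics agreed upon with the customer.

```python
# Illustrative Performance Test harness: submit simulated order entries
# concurrently and compare the 95th-percentile response time against an
# assumed, agreed-upon target. Not an actual Dynamics test script.

import random
import time
from concurrent.futures import ThreadPoolExecutor

def enter_order(order_id):
    """Stand-in for an order entry that includes credit validation."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.001, 0.010))  # simulated processing time
    return time.perf_counter() - start

def run_load_test(num_orders=200, concurrency=20):
    """Run order entries in parallel and return the p95 response time."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        timings = sorted(pool.map(enter_order, range(num_orders)))
    return timings[int(len(timings) * 0.95)]

p95 = run_load_test()
print(f"95th percentile response: {p95 * 1000:.1f} ms")
assert p95 < 0.2, "p95 exceeds the assumed 200 ms target"
```

Reporting a percentile rather than an average matters here: peak-time users experience the tail of the distribution, which is exactly what the Performance Test is meant to validate.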
User Acceptance Testing (UAT)
The UAT is the final test performed by the customer SMEs, key users, and the application consultants for system sign-off. The UAT is the most important indicator of the customer's acceptance of the new solution for Go-Live.
The UAT is conducted with the data migrated from the customer's existing systems and uses actual transactions from a specific period (such as one or two days) identified by the customer as being a representative sample of their business. The test focuses on complete end-to-end testing to ensure that the system meets the business requirements and the test criteria established early in the implementation. The UAT is typically performed in a Testing or Staging environment. The UAT leverages scripts that are prepopulated with the test steps and expected results from the testing, and the actual results of the testing are documented for future reference and customer approval. The following screenshot shows one of the many UAT scripts provided in Sure Step:
If the tests were determined to be successful, the customer signs off the system to be moved into production. However, the customer may still request changes to a certain feature, data migration, or integration process. These changes will go through the change management process established at the outset of the engagement.
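The structure of a UAT script as described above—prepopulated steps and expected results, with actual results recorded during testing—can be modeled with a simple data structure. This Python sketch is a hypothetical illustration of that record-keeping; the field names are assumptions modeled on the description, not the actual Sure Step template layout.

```python
# Hypothetical model of a UAT script record: prepopulated test steps and
# expected results, with actual results documented during the test run.

from dataclasses import dataclass, field

@dataclass
class UATStep:
    step: str            # prepopulated test step
    expected: str        # prepopulated expected result
    actual: str = ""     # documented during the test run

    @property
    def passed(self) -> bool:
        return self.actual == self.expected

@dataclass
class UATScript:
    process: str
    steps: list = field(default_factory=list)

    def sign_off_ready(self) -> bool:
        """All steps must match their expected results before sign-off."""
        return all(s.passed for s in self.steps)

script = UATScript("Order-to-Cash", [
    UATStep("Enter customer order", "Order saved with status Open"),
    UATStep("Fulfill order", "Inventory reduced; status Shipped"),
])
for s in script.steps:
    s.actual = s.expected  # tester records the observed results
print(script.sign_off_ready())  # True
```

Keeping expected and actual results side by side in this way is what makes the documented UAT run usable as future reference and as the basis for customer approval.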
In the preceding sections, we saw how Sure Step has guidance and templates embedded within the activities of the project types to assure a quality solution delivery. We will now discuss other avenues for ensuring customer satisfaction with the overall solution and delivery process.
The Sure Step Optimization Offerings
The Encarta dictionary definition of optimize is: enhance effectiveness of something, to make something function at its best or most effective, or use something to its best advantage. Optimization can mean different things depending on your point of view. In terms of a system, optimization could mean the process of improving the ease of use or response rate of the system. For a program, it could mean an attempt to reduce runtime, bandwidth, or memory requirements, while for computer code, optimization could entail improving the performance or efficiency of the compiled code. For a business process, optimization could mean improving the efficiency of that process, including reducing costs or the throughput time. For the optimization offerings in Sure Step, all these definitions apply to a certain extent.
The Sure Step Optimization Offerings are a set of services designed to proactively help reduce the risk in an implementation or upgrade, as well as to assist the customer in ensuring that their system is performing optimally when it is in production. The Optimization Offerings are packaged and bundled differently for each of the Dynamics products, based on the complexity of the product and optimization needs. A conceptual diagram is as follows, but please refer to the individual product offering diagrams to ascertain the specific services provided by the offering for the corresponding product:
The Optimization Offerings include two types of services: Proactive Services and Post Go-Live services. Proactive Services are delivered during solution delivery and in conjunction with the solution delivery teams; so they span the Analysis, Design, Development, and Deployment phases of Sure Step. The Post Go-Live services are executed in the Operation phase of Sure Step, typically after the solution has been in production for a given period.
The Proactive Services can be further classified as Technical Proactive Services and Project Governance Services. Technical Proactive Services are typically performed by architects or equivalent senior resources during solution delivery. Examples of these services, which are further explained in the upcoming sections, include Architecture Review, Functional and Technical Design Reviews, Customization Review, and Performance Review. Project Governance Services on the other hand, are oversight services typically performed by project directors or equivalent senior project management resources, throughout the life cycle of an engagement. The Project Governance and Delivery Review Service is a prime example for this category.
The Post Go-Live services are inherently technical in nature because the implementation has already been completed. These services are typically performed by senior support engineers, architects, or equivalent senior resources, and are executed when the solution is in production. Examples include the Health Check and Performance Tuning services.
Sure Step provides an Optimization Offering for each of the five Dynamics products. The Optimization Offering varies by product and is composed of grouped and/or individual Proactive and Post Go-Live services. The structure provides flexibility for the customer and the solution provider to choose one or more of the Optimization Offering services for a given engagement.
Optimization Offering for Microsoft Dynamics AX
Microsoft Dynamics AX projects often represent the most complex implementations; therefore, the Optimization Offering for AX includes numerous grouped and individual services, which can be bundled to meet the needs of specific customer engagements.
The following screenshot shows an overview of the AX Optimization Offering:
The AX Optimization Offering includes the grouped Architecture Review service that encompasses multiple individual services such as Infrastructure Design Review, Functional Design Review, Technical Design Review, Customization Review, and Performance Review. Project Governance and Delivery Review (PGDR)—which encompasses Lifecycle (Phase-by-Phase) Review and Project Closure Review—and Upgrade Review are also grouped services. The grouped services are designated in gray, with a dotted box encompassing the individual services that compose them.
Other individual services included in this AX optimization offering are Database Storage, Integration Design Review, Performance Benchmark, Health Check, and Performance Tuning.
A customer may decide to select one or more of the grouped or individual services, bundled together or as standalone. If they are bundled, they are typically executed in the sequence detailed in the preceding screenshot, for a more robust and comprehensive experience for the customer.
Optimization Offering for Microsoft Dynamics CRM
The Optimization Offering for CRM engagements is packaged differently than the one for AX. All the CRM Optimization Offering services are individual services, with the exception of the grouped Project Governance and Delivery Review and Upgrade Review services.
The following screenshot is the Optimization Offering diagram for Microsoft Dynamics CRM:
The CRM offering features technical services including Architecture Review, Design Review, Customization Review, Performance Review, Health Check, and Performance Tuning.
Optimization Offering for Microsoft Dynamics GP
The Optimization Offering for GP consists of several individual services also available for AX and CRM, including Infrastructure Design, Reporting Services Workshop, Admin Workshop, and Performance Review for Proactive Services, and Health Check and Performance Tuning for Post Go-Live services.
The following screenshot shows the Optimization Offering for Microsoft Dynamics GP:
Note that GP does not have a standalone Architecture Review, because engagements of this type and complexity do not usually warrant this grouped service. It also does not have a Customization Review, as these implementations do not usually involve extensive custom code.
A PGDR is likewise not warranted, since the PGDR is strongly suggested only for more complex engagements. GP does, however, have an Upgrade Review service.
Optimization Offering for Microsoft Dynamics NAV
Similar to the GP offering, the Optimization Offering for NAV consists of individual services also available for AX and CRM, including Infrastructure Design and Performance Review for Proactive Services, and Performance Tuning and Health Check for Post Go-Live services.
The following screenshot shows the Optimization Offering for Microsoft Dynamics NAV:
Similar to GP, NAV does not have a standalone Architecture Review, because engagements of this type and complexity do not usually warrant this grouped service. It also does not have a Customization Review, as these implementations do not usually involve extensive custom code.
A PGDR is likewise not warranted, since the PGDR is strongly suggested only for more complex engagements. NAV currently does not have an Upgrade Review Optimization Offering.
Optimization Offering for Microsoft Dynamics SL
The Optimization Offering for SL consists of two individual Proactive Services also available for AX and CRM, Infrastructure Design and Performance Review. It also includes one Post Go-Live service, Performance Tuning.
The following screenshot shows the Optimization Offering for Microsoft Dynamics SL:
Similar to GP and NAV, SL does not have a standalone Architecture Review, because engagements of this type and complexity do not usually warrant this grouped service. It also does not have a Customization Review, as these implementations do not usually involve extensive custom code.
A PGDR is likewise not warranted, since the PGDR is strongly suggested only for more complex engagements. SL also does not have an Upgrade Review Optimization Offering.
Understanding key Proactive and Post Go-Live services for AX and CRM
As previously discussed, Sure Step provides Technical Proactive Review services, Project Governance services, and Technical Post Go-Live services. The review services promote quality management by giving the customer or service provider access to Microsoft Dynamics specialists at appropriate checkpoints during their implementation. These experts can review the proposed architecture and design for the business and industry solution, as well as technical areas such as performance, scalability, and integration with other systems and third-party software.
The Technical Proactive Review services include Architecture Review, Design Review, Customization Review, and Performance Review for new Microsoft Dynamics solution implementations, and Upgrade Review for the upgrade of an existing solution. Technical Post Go-Live services include Health Check, for the review of solutions that have already been in operation for a certain period of time. Project Governance services include PGDR. We describe these select services in this section.
Architecture Review for CRM provides an assessment of the overall technical design of the customer's Microsoft Dynamics CRM solution, and covers the design of the solution in areas including performance, scalability, security, and release management. The objective of this service is to ensure that the envisioned infrastructure for the customer's solution is in line with best practices; it is executed towards the end of the Analysis phase.
The key tasks carried out in this example include: technology analysis and review of the FRD, server architecture review, review of the Fit Gap and Solution Blueprint, high-level review of the integration and interface requirements, and high-level review of transaction volumes. As output of the review, the customers and service provider's implementation teams receive an objective, third-party view of the proposed architecture and how it aligns with the customer's requirements.
In contrast, the Architecture Review for AX is a grouped service which includes a number of individual services including Infrastructure Design Review, Functional Design Review, Technical Design Review, Customization Review, and Performance Review.
The Design Review service is used to examine the design of the Microsoft Dynamics solution in two primary areas. For CRM, the service examines the customizations of the Microsoft Dynamics system and the integration scenarios between the Microsoft Dynamics system and other third-party systems, and is executed towards the end of the Design phase of the implementation. For AX, the Design Review is made up of two services, the Functional Design Review and the Technical Design Review.
The stated objectives for the Design Review engagement may include the following:
· Assess the customization needs for the customer's Microsoft Dynamics solution
· Assess the integration needs for the Microsoft Dynamics application with other systems
· Review the Functional Design Documents for Gaps (FDD-Gap) and the corresponding TDD for the proposed customization and integration design
· Provide an assessment of whether the customization design is in line with the previously completed and signed-off Fit Gap Analysis and Solution Blueprint
· Provide recommendations to optimize the Microsoft Dynamics solution architecture and integration design for performance, availability, and reliability
The deliverable from the engagement is a Design Review and Assessment Report, which includes integration and customization design recommendations for optimization based on best practices for developing custom components.
The Customization Review service focuses on analyzing the custom code to improve performance, increase stability, improve security, and reduce operating and upgrade costs. For CRM, this is an individual service, while for AX, this is part of the Architecture Review grouped service. The Customization Review service is executed towards the end of the Development phase, and the stated objectives may include the following:
· Identify any best-practice deviations in custom coding, including both server- and client-side code
· Review conformity with standards, detect code development errors early, and document the deviations
· Ensure compliance with necessary quality guidelines
· Review the interfaces to system components
The final report from the engagement provides recommendations and an action plan to implement the best practices and fix any issues found, to ensure optimal long-term operation of the Microsoft Dynamics system.
The Performance Review service analyzes the performance impact of the solution design and customizations. The service begins with a review of the existing Microsoft Dynamics solution and the current and proposed customer usage metrics, such as user counts, dataset sizes, and transaction volumes. The output is the performance recommendation for the Microsoft Dynamics server(s) and for the Microsoft SQL Server database that will support the Microsoft Dynamics solution.
For CRM, the Performance Review is an individual service, while for AX, this is part of the Architecture Review grouped service. The Performance Review service is executed in the Deployment phase, after solution development is frozen to further additions. The review should be conducted before the solution is moved to production, so that crashes, leaks, performance bottlenecks, and other non-architectural deviations from best practices are caught early. Performance Reviews should also be considered when prior solution testing has indicated potential performance implications in certain areas of the solution. Performance reviews can address the following concerns:
· Cost: The infrastructure works properly, but at too high a cost, causing an insufficient return on investment
· Agility: The infrastructure works properly, but it does not have the flexibility to change quickly enough to meet the business needs
· Performance: The infrastructure fails to meet users' expectations, either because the expectations were set incorrectly or because the infrastructure performs incorrectly
· Security: The infrastructure fails the business by not providing enough protection for data and resources or by enforcing so much security that legitimate users cannot efficiently access data and resources
The final report from the engagement provides system environment recommendations addressing the network topology, latency numbers, and bandwidth.
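The sizing side of a Performance Review starts from exactly the usage metrics mentioned above: user counts and transaction volumes. A minimal sketch of that arithmetic follows; the figures and the 20 percent headroom margin are illustrative assumptions, not Sure Step or Microsoft guidance:

```python
# Illustrative sizing arithmetic only: all figures and the headroom margin
# are assumptions for the example, not vendor recommendations.
def peak_transactions_per_second(concurrent_users: int,
                                 transactions_per_user_per_hour: float,
                                 peak_factor: float = 2.0) -> float:
    """Estimate peak load as a multiple of the average hourly rate."""
    average_tps = concurrent_users * transactions_per_user_per_hour / 3600
    return average_tps * peak_factor

def meets_target(measured_capacity_tps: float, required_tps: float,
                 headroom: float = 0.20) -> bool:
    """Pass only if measured capacity exceeds the requirement plus headroom."""
    return measured_capacity_tps >= required_tps * (1 + headroom)

# 500 concurrent users, each posting 36 transactions per hour
required = peak_transactions_per_second(500, 36)
print(round(required, 1))  # prints 10.0 transactions/second at peak
```

Comparing a measured benchmark against the required rate plus headroom, rather than against the bare requirement, is what turns raw usage metrics into a defensible server and SQL Server sizing recommendation.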
While the prior Optimization Offering services address new solution implementations, the Upgrade Review service is provided as guidance for customers upgrading their existing Microsoft Dynamics solutions to the current product release. The Upgrade Review service provides oversight of the customer's upgrade solution, including design, customization, integrations, physical infrastructure, and architecture throughout the upgrade project life cycle.
The Upgrade Review service offers the following group of advisory activities or services that are carried out throughout the Upgrade project life cycle:
· Upgrade Architecture Review and Upgrade Design Review: These services align with the activities of the Analysis and Design phases in the Upgrade project type, including upgrade preparation, requirements gathering, test planning, and environment setup. The review team evaluates the customizations in the existing Microsoft Dynamics solution, analyzes the upgrade of the code components, and documents issues along with resolutions and recommendations. The output of this exercise may include an upgrade estimation report to the customer.
· Test Upgrade: This service follows the Upgrade Architecture and Design Review activities and aligns with the Development phase in the Upgrade project type. The goal of Test Upgrade is to provide a test or sandbox environment with the customer's existing data promoted to the new Microsoft Dynamics product version. This environment can then be used to verify and benchmark the data before promoting it to production.
· Production Upgrade: This service, executed in the Deployment phase of the Upgrade project type, provides on-site assistance with the promotion of the upgraded solution to the production environment.
The Upgrade Review services align with the flow of the Upgrade project type in Microsoft Dynamics Sure Step. The Upgrade project type is covered in detail in the corresponding chapter on Upgrading with Sure Step.
When the solution is in production, and after the initial stabilization period, it is a good idea to revisit the solution to ensure that it is running efficiently. Periodic checks during the solution operation phase are also a recommended best practice. Both of these objectives can be achieved with the Health Check Post Go-Live service in Sure Step.
The Health Check service analyzes the customer's Microsoft Dynamics solution and measures the effectiveness of the solution in operation. The solution monitoring allows the proactive identification of any potential problems and provides suggested resolutions for the selected components. It is worth noting that these services are typically incorporated into enterprise offerings such as Microsoft Premier offerings for Dynamics AX.
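At its core, a Health Check compares measured solution metrics against agreed thresholds and surfaces anything out of range for proactive resolution. The probes and thresholds below are invented for illustration; the real service uses Microsoft diagnostic tooling:

```python
# Illustrative only: probe names and thresholds are assumptions, not the
# actual checks performed by the Sure Step Health Check service.
HEALTH_THRESHOLDS = {
    "db_free_space_pct": (lambda v: v >= 15, "at least 15% free database space"),
    "avg_query_ms":      (lambda v: v <= 200, "average query time under 200 ms"),
    "failed_batch_jobs": (lambda v: v == 0, "no failed batch jobs"),
}

def health_report(measurements: dict) -> list:
    """Compare measured values against thresholds; return the failing checks."""
    issues = []
    for name, value in measurements.items():
        check, description = HEALTH_THRESHOLDS[name]
        if not check(value):
            issues.append(f"{name}={value}: expected {description}")
    return issues

print(health_report({"db_free_space_pct": 8, "avg_query_ms": 120,
                     "failed_batch_jobs": 2}))
```

Running such checks periodically, rather than waiting for users to report slowness, is what makes the service proactive: each failing check arrives paired with the threshold it violated, which maps directly to a suggested resolution.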
A sample Health Check report is shown in the following screenshot:
Project Governance and Delivery Review
The services highlighted previously in this section are technical offerings that are typically delivered by solution architects and senior application or technology consultants. Sure Step also provides another service called Project Governance and Delivery Review (PGDR) to drive quality through project oversight and overall project governance. This service, delivered by experienced project and engagement managers, provides customers with proactive project governance and delivery execution guidance through the full life cycle of their Microsoft Dynamics engagement. At a high level, these resources perform the following three tasks:
· Analyze and assess the proposed engagement structure and established deliverables
· Monitor the project governance and communications with the customer and within the implementation teams
· Analyze and assess the quality of the deliverables for completeness and relevance
PGDR features two components—Lifecycle Phase-by-Phase Reviews and Project Closure Review. During the Analysis through Deployment solution implementation phases, the PGDR offering acts as guidance and oversight for the activities performed by the delivery team, helping them to stay aligned with the agreed-upon vision for the solution by proactively identifying risks and actively managing the overall scope of the engagement. Once the solution is in operation and the engagement is at closure, the PGDR offering evaluates how the project was delivered against the initial vision and determines the performance against schedule and quality.
At the start of the project, the PGDR offering initiates two key activities. First, the customer, reviewers, and the implementation team work together to establish the customer's Conditions of Satisfaction (CoS) for the engagement; these CoS are key components of the Project Charter. Second, other key areas, including governance, risk, communication, status reporting, and issue management, are discussed and clearly documented in the charter.
During the engagement, the PGDR offering produces phase-by-phase recommendations and project health dashboards to help customers identify risks and address issues before they become problematic. The reviewers are armed with tools such as the Cobb's Paradox Tool, which is an effective risk assessment tool that provides questionnaires to detect and monitor the overall project risk factors. The following screenshot shows a sample of a graphical output of the Cobb's Paradox Tool:
At project closure, PGDR is used to collaboratively discuss accomplishments and challenges, and capture lessons learned. Some of the key tasks performed at project closure are listed as follows:
· Document the lessons learned, including the accomplishments, challenges, what the team could have done differently, and so on. This is sometimes viewed as trivial, but it can be a very important source of reference for future projects.
· Review the CoS to determine whether or not they have been met.
· Collate outstanding issues and open items, and determine how they will be addressed after project closure.
· Provide recommendations for future projects and follow-on work pertaining to the customer's deployed Microsoft Dynamics solution.
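The questionnaire-based risk assessment mentioned earlier (the Cobb's Paradox Tool) can be sketched in general terms. The question set, answer scale, and category labels below are hypothetical; the actual tool ships its own questionnaires and scoring, but the core idea of turning graded answers into a per-category risk dashboard is the same:

```python
# Hypothetical questionnaire: the real Cobb's Paradox Tool uses its own
# question set and scoring. Answer scale assumption: 1 = low ... 5 = high risk.
ANSWERS = {
    "Scope":         {"Requirements signed off": 1, "Change control in place": 2},
    "Communication": {"Status cadence agreed": 1, "Escalation path defined": 3},
    "Resources":     {"Key roles staffed": 2, "Backup resources identified": 3},
}

def category_risk(answers: dict) -> dict:
    """Average the answer scores per category and attach a risk label."""
    report = {}
    for category, scores in answers.items():
        avg = sum(scores.values()) / len(scores)
        label = "low" if avg < 2 else "medium" if avg < 3.5 else "high"
        report[category] = (round(avg, 2), label)
    return report

for cat, (score, label) in category_risk(ANSWERS).items():
    print(f"{cat}: {score} ({label})")
```

Aggregating answers per category, rather than into one overall number, is what makes the output useful as a dashboard: it points the reviewers at the specific governance area that is drifting.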
The PGDR and technical review services can be of great benefit to the customer and the implementation team, augmenting the resources in key areas to provide valuable independent perspectives.
It is also important to keep in mind that to be the most effective, the Optimization Offerings should be delivered by an independent third party—meaning, a provider who is not the primary implementer of the solution.
Optimization Offerings and their benefits
Delivering business solutions, especially ERP solutions, requires an industry-savvy implementation team—in other words, situational fluency. The team must have the ability to translate the system functionality into a solution that meets the specific needs of the customer. When a customer selects a service provider for their solution implementation, they often add extra weight to industry knowledge. So what happens when the choice comes down to a service provider that is very knowledgeable in the specific industry or industry vertical, but is not quite as familiar with the Microsoft Dynamics solution? This is one area where the Technical Review services can greatly benefit both the service provider and the customer. The marriage of seasoned industry veterans with technical Microsoft Dynamics experts can produce a powerful team capable of addressing the customer's deep, industry-specific requirements.
Another area where leveraging the Technical services of the Optimization Offering, especially the Proactive Review services, can benefit is with new ERP/CRM service providers getting started with Microsoft Dynamics. Inexperienced service providers can include experienced resources on their team by using these offerings, and as the team members shadow the experienced resources, they can ramp up on their understanding of the Microsoft Dynamics solutions.
Where the service providers already possess deep technical expertise but are short on project management expertise, especially on large-scale engagements, the Project Governance and Delivery Review service can be of great benefit. The Microsoft Dynamics resource pool includes several individuals with knowledge of the solutions prior to their acquisition by Microsoft. While these technical resources are very knowledgeable about the product itself, they sometimes lack the experience to manage the scope, communications, and risks inherent in complex, multisite solution deployments. Leveraging the PGDR offering, these resources can work with experienced project managers to develop that skill set for future engagements. They can, in turn, lead their organizations to scale up their offerings to address larger clientele.
The Post Go-Live services, including the Health Check and Performance Tuning services, are highly recommended for the customer. Given the substantial investment made in acquiring and deploying the Microsoft Dynamics solution, it behooves the customer to periodically engage expert resources to monitor the health of the application. Depending on usage behavior, the application may bog down over time, and system response may appear slower to the users. The Post Go-Live services can then be used to review the application, and the resulting recommendations can help clean up the system so that it performs more efficiently.
From a service provider's perspective, the Post Go-Live services can be exercised to continue maintaining their relationship with their customer. A common adage in the industry is that it is harder to win a new customer than to keep an existing one. It is also next to impossible to win a customer back once they are lost to the competition. As such, the service provider should leverage the Health Check offerings to further their relationship with their customers. In doing so, they may also unearth new related or unrelated opportunities on which they may assist the customer.
Use case – Technical Review services usage by Global Advertising Organization
In a previous chapter, we talked about a Global Advertising Organization that used Sure Step, specifically the Enterprise project type, for their Microsoft Dynamics AX solution delivery. The customer successfully deployed the initial solution across specific locations with the help of Microsoft Consulting Services (MCS) and Microsoft Services Global Delivery (MSGD) resources.
To roll out the solution across additional sites, the customer decided to use a combination of internal and partner resources. The customer also decided to leverage the Technical Review services of the Optimization Offerings, specifically the Architecture, Design, Customization, and Performance Review services, and had the Microsoft resources execute these services as independent third-party reviewers. This approach provided continuity and consistency, resulting in reduced costs for the solution rollouts at the corresponding locations.
Use case – Project Governance and Delivery Review service usage by partner
A large retailer selected Microsoft Dynamics AX as their solution, as well as a partner familiar with their industry vertical to assist them in the delivery of the solution. The partner was adept in Microsoft Dynamics AX, and they were also comfortable with the technical ability of their consulting resources for solution delivery. They were, however, concerned with the ability of their resources to manage the overall scope of the engagement, especially given the tight timelines necessitated by the customer.
Having worked successfully alongside MCS resources on other engagements, the partner felt that a Microsoft resource could assist in reducing the overall project risk inherent with the aggressive timelines. Accordingly, they set up a PGDR engagement wherein an experienced MCS project manager performed periodic independent assessments of the overall engagement. The result was an on-time and on-spec rollout of the solution, and a win-win for the customer and the partner.
Summary
In this chapter, we covered the quality aspects of Sure Step from the perspectives of proactive actions that can be taken during the solution delivery, as well as post Go-Live steps to ensure the ongoing maintenance and success of the solution. We discussed how quality is embedded in the Sure Step implementation project types, as well as in the technical and project governance review services available through the Sure Step Optimization Offerings.
In the next chapter, we will learn about Upgrading with Sure Step, including assessing the existing solution and determining the right approach to the upgrade, and guidance for executing the upgrade itself.