Member of MVC Consulting’s Senior Advisory Team Attains Scrum Master Certification

OAK BROOK, Ill., May 18, 2016 – As part of its commitment to meeting the ever-changing needs of its clients and the marketplace, MVC Consulting announced today that Senior Program Manager Dayton Kishimoto has earned his Certified Scrum Master (CSM) designation.

Kishimoto was trained by Mike Beedle, co-author of the Agile Manifesto and one of the country’s leading Agile strategists. This is one more notable achievement in Kishimoto’s accomplished career, which has spanned over 30 years and countless projects.

“We are proud of Dayton’s accomplishment,” says Mark Stroh, COO of MVC Consulting. “As new software development methodologies, such as Agile, gain industry acceptance, we continue to invest in our team. Dayton’s certification demonstrates our commitment to meeting our clients’ highly visible program management needs.”

Scrum is a well-known Agile development method that provides a simple project management framework for organizing teams and their approach to system development. Agile’s benefits include improved quality, better opportunity for midcourse corrections, and improved customer satisfaction. Scrum’s relevance to the field of project management is further demonstrated by its continued popularity: “Scrum is far and away the most widely adopted flavor of Agile… For teams that have struggled to make accurate estimates or adapt to changes to the backlog, the attraction of Scrum isn’t just velocity,” said Forrester analysts Tom Grant and Diego Lo Giudice.


About MVC Consulting

For over 30 years, MVC Consulting has provided highly qualified program management and IT consulting services in the Chicagoland area. Our clients include leading pharmaceutical, insurance, distribution, and manufacturing organizations. Our program managers lead highly complex projects, often integrating activities from both internal and external teams, resulting in on-time and on-budget delivery. We are a certified women-owned business. Learn more at

Press Contact

Name: Kelly Burke

Mobile: 847.687.8976


Using a SaaS Data Integration Framework

One of the areas that CIOs constantly struggle with is how to integrate disparate programs that weren’t originally designed with integration in mind. I have personally dealt with these problems both as a former CIO and later as a consulting partner. The rapid move to SaaS has solved some problems, but it has allowed the integration problem to continue and in some ways get worse. Software-as-a-service vendors have struggled to find a good way to integrate their apps with those made by other vendors, in addition to the problem of integrating with legacy applications. David Linthicum has pointed out that “As more enterprises move their applications to SaaS, there is a growing need for SaaS-to-SaaS integration. Unfortunately, as customers are requesting this, many of the SaaS providers are stumped for an answer beyond [hiring] a bunch of developers and hoping for the best.” Too often, this approach creates expensive and cumbersome architectures that lack agility.

Integration with legacy enterprise applications is often made more challenging by the fact that it is an afterthought for many organizations. SaaS creates silos of customers’ information in the cloud in the SaaS vendors’ data centers. “You typically want to integrate with your data center, and you have to figure out how to make that integration occur … [which] needs to happen even before you pick your SaaS,” said Blue Mountain Labs founder and CTO David Linthicum.
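To make the integration problem concrete, here is a minimal sketch of the kind of sync loop that pulls records out of a SaaS vendor’s silo and into an on-premises store. The function names, record fields, and in-memory store are illustrative assumptions, not any vendor’s actual API:

```python
# Hypothetical sketch of a SaaS-to-data-center sync: pull paged records
# from a SaaS vendor's REST API and upsert them into an on-premises store.
# The field names and storage shown here are illustrative assumptions.

def sync(fetch_page, upsert, page_size=100):
    """Pull paged records from a SaaS API and upsert each into local
    storage; returns the number of records synced."""
    page, synced = 0, 0
    while True:
        records = fetch_page(page, page_size)  # e.g. GET /v1/customers?page=N
        if not records:
            break
        for record in records:
            upsert(record["id"], record)       # keyed upsert avoids duplicates
            synced += 1
        page += 1
    return synced

# Usage with an in-memory stand-in for the on-prem store:
store = {}
pages = [[{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}], []]
count = sync(lambda p, n: pages[p], lambda k, v: store.__setitem__(k, v))
```

Even this toy version hints at the real design questions (paging, keying, conflict handling) that make SaaS integration costly when done as custom programming.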

Given the customization work involved in integrating on-premises apps and data with Software as a Service (SaaS) apps, it can cost between $80,000 and $100,000 and take six to twelve months to move data, not to mention the effort of programming the workflows necessary to automate processes between SaaS apps.

There’s an expectation that moving to a SaaS-based solution will solve all of an enterprise’s problems, and organizations are often shocked when they see what integration will cost.

That is what drove our decision to partner with SaaS tool vendor Azuqua, which has built a software and workflow scripting solution that can solve the integration issue at a fraction of the cost and time required for custom programming services.

Do PMOs Actually Work?



As I visit with clients, one of the most common, and divisive, topics of conversation seems to revolve around PMOs. In my experience, PMOs provide tremendous value when they are managed correctly and operate with the proper intent – that is, to facilitate the completion of more projects on time and on budget with the fewest resources, while boosting organizational performance. As of 2014, roughly 80% of US companies had a PMO of some kind. Clearly, many leaders see a benefit, or at least a perceived benefit, in having PMO guidance, but many organizations are still hesitant.

What complicates the PMO discussion is the fact that many organizations simply don’t understand them. I have clients who view them as too rigid, as adding “unneeded” oversight, and as too reliant on “meaningless methodology frameworks”. Beyond that, many organizations fail to realize that the purpose of a PMO can vary; a one-size-fits-all approach rarely works. Companies may also recognize a need and attempt to implement a PMO, but quickly abandon it if they don’t see immediate results.

First, PMOs do add oversight – which is one of their primary values. Recent studies have shown that less than half of all projects deliver on time, on budget, and with their intended benefits. This can be attributed to a variety of factors, but one in particular is that many projects don’t have an appropriate road map for success. A PMO group can outline a project map, implement an agreed-upon methodology, and successfully manage checkpoints to ensure that targets are being hit. Adhering to a “prescribed” method may be met with resistance from within the organization, but PMOs can certainly provide the needed discipline, resource planning, and project focus to enable greater project success rates. The oversight doesn’t have to be rigid or prescribed: good PMOs take their organization’s culture into account and mold their frameworks to fit it, remaining open to input from stakeholders and Project Managers.

Next, the organization needs to determine what type of PMO will best meet its needs. Some PMOs exist solely to staff Project Managers and distribute resources to the business groups. Others serve as an all-encompassing project management cadre, consulting on each and every project, providing training to Project Managers, and establishing an overarching framework to follow for projects. The key to discerning which type of PMO is best depends on your company’s culture, project failures, and the intended goals.

Finally, patience is key. Among firms surveyed, 37% of those with a PMO in place for less than one year reported better project success rates, while companies with a PMO in place for more than four years reported a 65% improvement in project success rates. The greatest gains are realized over time as PMOs integrate with the culture. PMO groups that are given the time to gain buy-in from business groups and stakeholders are able to drive program and project success more effectively. An organization that displays patience through the early years of a PMO can reap substantial benefits down the road. Studies show that CIOs should allow three years for a PMO to deliver benefits.
PMOs are valuable, but they can only demonstrate that value if they are implemented with the right intentions, take into account the cultural aspects of the company, and are given time to develop. It has been recommended that newly formed PMOs start by overseeing well-defined pilot projects with substantial input from business unit Project Managers. This allows for the right amount of give and take, so that both sides contribute to making the PMO fit the company culture, and not the other way around. At the end of the day, a well-organized PMO will help drive project success.

How Should We Assess Risk For Our Projects?

Firstly, we need to consider that doing risk assessment is our attempt as project managers to minimize the possibility of failure. We define failure as one of three things:

  • Exceeding the time allocated to complete the project
  • Spending more than the budget allocated for the project
  • Not meeting our client’s requirements

Secondly, we can generalize that projects fail for two reasons:

  • External events negatively impact the project – such as having insufficient resources, uncontrolled scope changes, and unanticipated events
  • An overly optimistic plan requires the team to hit pre-set targets – sometimes because those targets were dictated by others without sufficient insight into what it takes to get the project done, and at other times because of a failure to properly scope the project, or because the team, in the name of being good team players, blithely expected that all targets would be met in a timely way and technical problems would be minimal.

We would maintain that planning for risks consists not only of expressing time and budget estimates as ranges at the beginning of the project (which allows for the actual risks that may occur) but also of using those ranges to establish contingency percentages for the project (most effectively done for the project as a whole rather than at the task level). However, although contingency planning is necessary, it is not sufficient. The part that is often left out is that a full risk assessment should be done once the project team has been fully assembled and has started to work on the project – around the 20% completion mark.
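As a small illustration of deriving a project-level contingency from ranged estimates, consider the sketch below. The task names and day figures are hypothetical, and the sizing rule (buffer = aggregate spread over the optimistic baseline) is one simple choice among several:

```python
# Hypothetical sketch: sizing a single project-level contingency from
# ranged task estimates, rather than padding every task individually.
# Task names and figures are illustrative, not from any real project.

tasks = {
    "requirements": (10, 16),   # (best-case days, worst-case days)
    "design":       (15, 25),
    "build":        (40, 70),
    "test":         (20, 35),
}

baseline = sum(best for best, worst in tasks.values())   # optimistic plan
worst    = sum(worst for best, worst in tasks.values())  # pessimistic plan

# One buffer for the whole project, sized from the aggregate range:
contingency_pct = (worst - baseline) / baseline * 100
print(f"Baseline: {baseline} days, contingency: {contingency_pct:.0f}%")
```

Pooling the ranges into one project-level buffer reflects the point above: individual tasks rarely all hit their worst case at once, so a shared contingency is usually leaner than task-by-task padding.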

Performing an assessment at this point allows the team to determine how realistic the original risk assessments were and to add to (or subtract from) the list of possible project risks. Any assessment of this type needs to involve the key members of the team (project/program manager, lead business analyst, lead technical architect, business sponsor, senior program and QA leads, etc.) to get a true picture of all the risks impinging on the project. The assessment should be done one-on-one with each individual for larger programs/projects, or using a survey instrument for smaller projects, and needs to include both a quantitative assessment (the level of risk, on a numeric scale, that each risk represents to the project and/or the business, and the probability, on a numeric scale, of the risk actually coming to pass) and a qualitative assessment.
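The quantitative side of such an assessment can be sketched as a simple probability-times-impact exposure score. The risks, the 1–5 scales, and the ratings below are illustrative assumptions, not a prescribed scoring model:

```python
# Hypothetical sketch: quantitative risk scoring at the ~20% checkpoint.
# Risks, scales (1-5), and ratings below are illustrative assumptions.

risks = [
    # (risk description, probability 1-5, impact 1-5)
    ("Key vendor API changes mid-project", 3, 4),
    ("Lead architect leaves the team",     2, 5),
    ("Scope creep from late requirements", 4, 3),
]

# Exposure = probability x impact; rank so the team reviews the
# highest-exposure risks first.
scored = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
for name, prob, impact in scored:
    print(f"{prob * impact:>2}  {name}")
```

Numbers like these make the team’s review concrete, but as the next paragraph argues, they are only half the picture: the qualitative discussion around each score is where the real issues surface.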

Many organizations often ignore or downplay the need for the qualitative assessment. However, as has been proven many times with techniques like Six Sigma, your staff is usually very aware of the issues that are occurring or likely to occur, and can pinpoint them and suggest possible solutions. Our experience in doing this type of assessment for companies has shown that this type of risk assessment is best led by individuals not involved in the project being considered to avoid conflicts of interest in reporting what is actually happening within the project.

Performing this type of assessment, covering both the quantitative and qualitative issues, will ensure a higher likelihood of accurate risk assessment and result in more successful projects over time.

Scrum Fills the Leak in the Waterfall Methodology (Part 2)

Read ‘Part 1’ Here.

Although there are many reasons for Waterfall Failure or Waterfall ‘Leaks’, here are some common causes:

Requirements & Scope Control

Requirements tend to be unclear, contradictory, ambiguous, and imprecise, and often lack agreement and priority. Even so, Waterfall assumes a detailed set of requirements will be 100% accurate at the beginning and locked for the duration of the project. Changes to requirements are treated as bad, and Waterfall tries to minimize them once the plan is created.

Planning (schedules & budgets)

Schedules and budgets can be based on insufficient data, missing items, insufficient detail, and poor estimates. Waterfall assumes the initial estimates are locked and accurate. Frequently, dates and budgets are decided well before the project engineers have had a chance to provide their own input on estimates. These same team members are then forced into long timelines they don’t agree with. This can also lead to overly cautious PMs who end up in ‘analysis paralysis’ during the planning phase, trying to perfect estimates and delaying the start of the project.


Communication & Stakeholder Engagement

Another common reason for failure is poor communication and stakeholder engagement, which also causes a lack of clarity and trust. Waterfall assumes the PM should plan all the work (i.e. planning from the center) and the project team must execute the tasks they are assigned, leaving little sense of ownership from the team’s perspective.

Customer Value (and quality)

Typically, Waterfall does not provide any value back to the customer until the very end, after all other phases. Requirements are locked at the start and carried through every Waterfall phase, and by the time the customer sees the result, what they wanted a year ago may already be obsolete and need changing. Waiting to test until the end of the project risks discovering a major issue right before deployment, which causes project dates to slide and the budget to overrun. A common Waterfall response is to reduce the testing effort to ‘keep the dates’ and turn a poor-quality system over to production.

Scrum Fills the Leak in the Waterfall Methodology (Part 1)

Although the Agile methodology has been around for more than ten years, Scrum is still relatively new to most people in the project management field – although it is gaining ground. Scrum is the leading agile development methodology, used by Fortune 500 companies around the world. Not only does it change the mechanics of traditional project management (i.e. waterfall), it is also a philosophical change for the team that uses it. This article will not go into the details of how Scrum works, but rather into how the benefits of Scrum fix waterfall’s shortcomings.

The methodology that has dominated software development projects for decades is called “waterfall.” Winston Royce coined the term in his 1970 IEEE paper “Managing the Development of Large Software Systems” to describe a serial method for managing software projects through the development stages (Requirements, Analysis, Design, Code, Test). However, Royce himself acknowledged the issues with this process: “I believe in this concept, but the implementation described above is risky and invites failure.” He went on to promote iterative development, but that guidance seems to have been largely ignored (or poorly understood).

After 40+ years, why doesn’t waterfall work? The articles and statistics are endless…

Studies have shown that in over 75% of the failed software projects investigated, use of the Waterfall methodology was one of the key factors in the failure.

A study by McKinsey & Company in conjunction with the University of Oxford of 5,400 IT projects found 17 percent of the large IT projects go so badly that they can threaten the very existence of the company. On average, large IT projects run 45 percent over budget and 7 percent over time, while delivering 56 percent less value than predicted.

A more recent study by Innotas reports that 50 percent of companies had an IT project fail in the last 12 months.

Read ‘Part 2’ Here.