Thursday, May 1, 2014
09:30 AM - 10:15 AM
Large IT projects, such as data warehouse implementations and data migrations, carry numerous risks, including:
- Overrun budgets
- Late deliveries
- Missing functionality
- Poor security
Many IT analysts believe that complexity is a major driver of all of these risks, and that the failure to manage complexity is the single biggest reason IT systems so often fail. Robert Glass popularized the observation that when the functionality of an IT system increases by twenty-five percent, its complexity doubles.
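Glass's observation implies that complexity grows much faster than functionality. A minimal sketch of the arithmetic (illustrative only, not from the talk; the function name and numbers are assumptions for demonstration):

```python
import math

def relative_complexity(functionality_ratio: float) -> float:
    """Complexity multiplier for a given functionality multiplier,
    assuming complexity doubles with every 25% increase in functionality
    (Glass's observation)."""
    # Number of compounding 25% steps needed to reach the given ratio
    steps = math.log(functionality_ratio) / math.log(1.25)
    return 2 ** steps

print(round(relative_complexity(1.25), 2))  # one 25% step -> 2.0 (complexity doubles)
print(round(relative_complexity(2.0), 2))   # doubling functionality -> ~8.6x complexity
```

Under this rule of thumb, merely doubling a system's functionality multiplies its complexity roughly eightfold, which is why scope growth on large projects is so dangerous.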
How can we manage complexity in our data projects? We first have to define complexity, recognize it when it occurs, devise methods to address it (Agile methods among them), and then put those methods into practice.
We’ll look at why failure is an option, and sometimes a good one!
Paul has over twenty-five years of diverse experience in information technology. He is a Certified Data Management Professional (CDMP). Paul currently holds the position of Manager of Data Administration in the centralized IT department for his organization, Johnson County, Kansas. Johnson County has more than 500,000 residents and employs more than 3,500 people in 40 different departments. He has worked in applications development, data management, data architecture, strategic planning, budgeting, project and personnel management, systems and data analysis, relational database design, disaster recovery, policy implementation, methodology evaluation, capacity planning, metadata implementations, data security, and scenario planning. He served in the United States Marine Corps and holds a Master of Public Administration (MPA).