
Structural Estimation Methodology for Microsoft Projects


Take a look at why traditional sizing models might not fit Microsoft integration projects, and at the alternatives that are available.


Traditional Model for Sizing

There are a number of traditional software sizing models that can be used to estimate project costs, such as Planning Poker, a Work Breakdown Structure, or Function Point Analysis. These models, however, do not size integration projects accurately. Estimation becomes especially complicated when an integration project is broken into smaller independent units to be delivered in an agile fashion. The methodology described here was implemented for projects involving Microsoft integration technologies such as BizTalk.

A Unit- and Weight-Based Estimation Model

Integration projects require a number of extra factors to be taken into account, such as interfaces, external systems, interfacing complexities, workflows, business logic, and the degree of customization versus configuration. The sizing effort depends on the complexity of the interfaces, which in turn is driven by data exchange and data processing requirements. The structural estimation methodology bridges the gap between what functionality is to be built and how it is to be built in order to derive the cost estimate. It allows the productivity factor to be obtained accurately, producing the right size and effort for new projects and thereby reducing effort overruns. It also ensures that a size unit stays consistent across Microsoft integration projects with varying complexities and interfaces.

For ease of use and to standardize the way we estimate, we introduce a "unit- and weight-based estimation model" that helps us estimate faster, more accurately, and more consistently. We define units and weights for estimation items that cannot be directly addressed by the base (traditional) model, so that the size of integration projects can be predicted.
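To make the arithmetic concrete, here is a minimal sketch of how such a size calculation works. The category names, weights, counts, and hours-per-unit productivity factor are illustrative assumptions, not values prescribed by the methodology:

    # Minimal sketch of a unit- and weight-based size calculation.
    # Category names, weights, counts, and the hours-per-unit factor
    # are illustrative assumptions.

    def adjusted_units(counts: dict, weights: dict) -> float:
        """Total size = sum of each category's count times its weight."""
        return sum(count * weights[category] for category, count in counts.items())

    weights = {"simple_schema_no_lob": 1.0, "complex_schema_with_lob": 3.5}  # assumed
    counts = {"simple_schema_no_lob": 8, "complex_schema_with_lob": 2}       # from requirements

    size = adjusted_units(counts, weights)  # 8*1.0 + 2*3.5 = 15.0 adjusted units
    effort_hours = size * 12.0              # assumed productivity: 12 hrs per adjusted unit
    print(size, effort_hours)               # 15.0 180.0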

The methodology was applied to projects leveraging Microsoft integration platforms, including BizTalk, WCF, and Logic Apps, but it can be extended to other EAI platforms as well.

The estimation methodology listed below provides a realistic estimation model that captures the complexities associated with run-time interface requirements. The complexity factors take into account the complexities involved in data exchange and data processing, as well as additional systemic complexities.

Units-Based Estimation Methodology:


Given below are the steps involved:

Step 1: List the external systems and the interfaces that the system supports.

Step 2: Capture the interface requirements, which consist of Data Exchange and Data Processing (sketches of how these counts can be recorded follow the lists below).

Data Exchange:

  • Number of request types with simple schemas (nodes less than 20) without LOB adapters
  • Number of request types with medium schemas (nodes between 20 and 50) without LOB adapters
  • Number of request types with complex schemas (nodes greater than 50) without LOB adapters
  • Number of request types with simple schemas (nodes less than 20) with LOB adapters
  • Number of request types with medium schemas (nodes between 20 and 50) with LOB adapters
  • Number of request types with complex schemas (nodes greater than 50) with LOB adapters
  • Number of publishing types with simple schemas (nodes less than 20) without LOB adapters
  • Number of publishing types with medium schemas (nodes between 20 and 50) without LOB adapters
  • Number of publishing types with complex schemas (nodes greater than 50) without LOB adapters
  • Number of publishing types with simple schemas (nodes less than 20) with LOB adapters
  • Number of publishing types with medium schemas (nodes between 20 and 50) with LOB adapters
  • Number of publishing types with complex schemas (nodes greater than 50) with LOB adapters
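The twelve data exchange counts above are simply the cross product of message direction, schema complexity, and LOB-adapter usage, which is worth making explicit when building an estimation worksheet. A small Python sketch (the labels are paraphrased from the list):

    # The 12 data-exchange categories are the cross product of direction,
    # schema complexity, and LOB-adapter usage.
    from itertools import product

    directions = ["request", "publishing"]
    complexities = ["simple (<20 nodes)", "medium (20-50 nodes)", "complex (>50 nodes)"]
    lob_usage = ["without LOB adapters", "with LOB adapters"]

    categories = list(product(directions, complexities, lob_usage))
    print(len(categories))  # 12 distinct estimation categories

    # Capturing requirements then amounts to recording one count per category:
    counts = {category: 0 for category in categories}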

Transformation:

  • Number of instances using XSLT that require direct mapping
  • Number of instances using XSLT that require string manipulations
  • Number of instances using XSLT that require custom coding
  • Number of instances using maps that require direct mapping
  • Number of instances using maps that require string manipulations and out-of-the-box functoids
  • Number of instances using maps that require custom coding

Data Processing might include validation factors (such as the number of instances involving custom coding for business validations and the number involving rules-based business validations), enrichment factors (such as the number of instances where custom coding is required for enrichment or mapping), and transformation factors (such as the number of instances using maps that require custom coding).

Data Processing might also include the number of instances involving Business Rules during orchestration, and the number of instances involving additional business logic, for example where transactions span the invocation of multiple services and compensation needs to be handled.
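One hypothetical way to record these Step 2 counts for a single interface is a simple structure keyed by the category attributes. All field names and keys below are illustrative assumptions:

    # Hypothetical record of Step 2 counts for one interface.
    from dataclasses import dataclass, field

    @dataclass
    class InterfaceRequirements:
        data_exchange: dict = field(default_factory=dict)    # (direction, complexity, lob) -> count
        transformation: dict = field(default_factory=dict)   # (technology, mapping kind) -> count
        data_processing: dict = field(default_factory=dict)  # processing kind -> count

    order_interface = InterfaceRequirements(
        data_exchange={
            ("request", "simple", "no LOB"): 4,    # simple schemas, no LOB adapter
            ("publishing", "complex", "LOB"): 1,   # complex schema via LOB adapter
        },
        transformation={
            ("map", "direct mapping"): 3,          # maps with direct mapping
            ("xslt", "custom coding"): 1,          # XSLT needing custom code
        },
        data_processing={
            "rules-based validation": 2,           # Business Rules validations
            "custom enrichment": 1,                # custom coding for enrichment
        },
    )
    print(order_interface.data_exchange)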

Step 3: Determine the Data Exchange and Data Processing complexity factors.

For example, validation might be weighted at 1 unit, enrichment at 1.25, and transformation at 3.
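With those example factors, the adjusted units for an interface fall out of a weighted sum. The instance counts here are assumed for illustration:

    # Applying the example Step 3 factors (validation 1, enrichment 1.25,
    # transformation 3); the instance counts are assumed.
    complexity_factors = {"validation": 1.0, "enrichment": 1.25, "transformation": 3.0}
    instance_counts = {"validation": 6, "enrichment": 2, "transformation": 4}

    units = sum(instance_counts[k] * complexity_factors[k] for k in complexity_factors)
    print(units)  # 6*1.0 + 2*1.25 + 4*3.0 = 20.5 adjusted units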

Step 4: Determine additional systemic requirements, such as the number of interfaces requiring aggregation (reassembling the processed, disassembled records) and the associated complexities.

Step 5: Assign weights to each identified factor and subfactor.

Weights are assigned to each complexity factor based on its degree of implementation and the nature of the work involved.

The assignment of weights follows multiple iterations of the steps below (a sketch of one iteration follows the list):

  • Determine the degree of implementation of each complexity assigned across the three categories.
  • Rank-order the complexities assigned to the three categories.
  • Assign the unit and the weights as scale factors for each complexity.
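A sketch of one such iteration, assuming a hypothetical set of complexities; the judged degrees and scale factors below are illustrative, not prescribed values:

    # One weight-assignment iteration: judge the degree of implementation,
    # rank-order the complexities, then map ranks onto an assumed scale.
    judged_degree = {"direct mapping": 1, "string manipulation": 2, "custom coding": 5}

    # Rank-order by judged degree of implementation (lowest to highest).
    ranked = sorted(judged_degree, key=judged_degree.get)

    # Assumed scale factors per rank position; revisit on each iteration
    # as actuals from executing projects arrive.
    scale = [1.0, 1.5, 3.0]
    weights = {name: scale[rank] for rank, name in enumerate(ranked)}
    print(weights)  # {'direct mapping': 1.0, 'string manipulation': 1.5, 'custom coding': 3.0}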

The resulting estimation worksheet carries one row per identified item, with the following columns (a hypothetical row computation is sketched below):

  • Activity Type
  • Data Exchange / Processing / Complexity Type
  • Input Value
  • Size in Adjusted Units
  • Effort (Hrs)
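A hypothetical computation of a single worksheet row; the weight and hours-per-unit figures are assumptions for illustration:

    # One worksheet row: Input Value x weight gives Size in Adjusted Units,
    # and size x hours-per-unit gives Effort (Hrs).
    def worksheet_row(activity, complexity_type, input_value, weight, hours_per_unit):
        size = input_value * weight  # Size in Adjusted Units
        return {
            "Activity Type": activity,
            "Data Exchange / Processing / Complexity Type": complexity_type,
            "Input Value": input_value,
            "Size in Adjusted Units": size,
            "Effort (Hrs)": size * hours_per_unit,
        }

    row = worksheet_row("Data Exchange", "complex schema with LOB adapter", 2, 3.5, 12.0)
    print(row["Size in Adjusted Units"], row["Effort (Hrs)"])  # 7.0 84.0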

The sizing model needs to be validated by applying it to projects currently in execution: verify the size units against the effort actually consumed for build and unit testing, and confirm the size-versus-effort relationship.
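A sketch of that validation step: back-calculate the hours per adjusted unit from executing projects and feed the calibrated figure into new estimates. The project names and figures are illustrative, not real data:

    # Back-calculate hours per adjusted unit from executing projects.
    actuals = [
        ("project_a", 120.0, 1500.0),  # (name, size in adjusted units, build + unit-test hrs)
        ("project_b", 80.0, 1040.0),
    ]

    for name, size, effort in actuals:
        print(name, effort / size)     # observed hours per adjusted unit

    calibrated = sum(effort / size for _, size, effort in actuals) / len(actuals)
    print(f"calibrated hours per unit: {calibrated:.2f}")  # use for new estimates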






