Questions of the Week are asked and answered in the Beck Technology Community. Anyone is encouraged to join the community to discover additional tips and tricks.
Great question for the community, Doug. Admittedly, this is a challenge stemming from new model-based workflows that we are not yet used to working with. But we are seeing some things that will ultimately help us establish a best practice. Much of what I am implementing stems from having to QA/QC someone else’s DESTINI Profiler models and estimates, so we have some precedent to help us understand the model-based workflow. What I am finding is that a three-step approach helps us ensure we have covered all the bases.
1) The first step is to ensure that the estimate provides 100% coverage of the scope of the project. This is no different than QA/QC’ing someone’s traditionally derived estimate. It helps if the estimate can be organized by a familiar group structure (e.g., CSI or Uniformat) so that it can serve as a checklist or guide. So, for the first step, I am simply reviewing the estimate to verify that the cost items cover the scope for each major division. My next concern deals with validation of quantities, but now, instead of verifying against 2D quantity takeoff values, we need to validate quantities from the model. This takes us to step 2.
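To make the coverage check concrete, here is a minimal sketch in Python, assuming the estimate has been exported to simple rows with a division field. The field names, divisions, and values are all illustrative, not an actual DESTINI Profiler export format:

```python
# Hypothetical exported estimate rows (field names and values are illustrative).
estimate_rows = [
    {"division": "03 - Concrete", "item": "Slab on Grade", "quantity": 12500},
    {"division": "03 - Concrete", "item": "Foundations", "quantity": 860},
    {"division": "05 - Metals", "item": "Structural Steel", "quantity": 240},
]

# The divisions the project scope is expected to cover (a simple checklist).
expected_divisions = {"03 - Concrete", "05 - Metals", "09 - Finishes"}

# Any expected division with no line items is a potential scope gap.
covered = {row["division"] for row in estimate_rows}
missing = expected_divisions - covered

for division in sorted(missing):
    print(f"No line items found for: {division}")
```

The same idea works with any grouping structure; the point is simply to turn the familiar division breakdown into a checklist that flags divisions with zero line items for a closer look.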
2) The second step involves verifying that the quantities in each line item make sense. This applies regardless of where the quantities come from; we don’t want to forget estimating basics just because our workflow has been modified. For any line item whose quantity source needs to be verified, select the row header for the line item and select the “Review Takeoff” command. This will reveal the source of the line item’s quantity and whether it has been hard-coded, tied to a model component variable such as Area or Perimeter, or some combination of the above. (Note: If you are only concerned about quantities derived from the model, then it might be helpful to use the “Is-Mapped” field as a major grouping field in the Group Panel.) Once a quantity source has been verified, we can work to locate the quantity and understand how it was derived, which takes us to step 3.
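The triage that the “Is-Mapped” grouping gives you can be sketched in a few lines of Python. This assumes a hypothetical export of line items with a boolean mapped flag; the structure and names are my own illustration, not Profiler’s:

```python
# Hypothetical line items with an is_mapped flag (mirroring the "Is-Mapped"
# field described above; the data structure is illustrative).
line_items = [
    {"item": "Slab on Grade", "quantity": 12500, "is_mapped": True},
    {"item": "Misc. Metals Allowance", "quantity": 1, "is_mapped": False},
    {"item": "Exterior Wall Assembly", "quantity": 8400, "is_mapped": True},
]

# Split by quantity source, like grouping on "Is-Mapped" in the Group Panel.
mapped = [row for row in line_items if row["is_mapped"]]
hard_coded = [row for row in line_items if not row["is_mapped"]]

print(f"{len(mapped)} model-derived quantities to trace back to the model")
print(f"{len(hard_coded)} hard-coded quantities to verify by hand")
```

The split mirrors the two review paths: model-derived quantities get traced back to model components in step 3, while hard-coded quantities get checked the traditional way.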
3) The third step includes “interrogating” the model to understand how a specific quantity was derived. For example, if my estimate shows the slab on grade tied to an “Area” variable value, then I might want to investigate that Area value in the model to ensure it picks up all relevant slab areas. For this, I will switch over to the Takeoff View and begin to evaluate model components. I’ve found that creating a proper Filter grouping is key here, and I often set up the filter grouping on “Is-Mapped” followed by one or two other values common across all model components (e.g., Assembly Code and/or Identity Data – Type Name), and then end on “Component Name” so that I can get down to the level of the individual model components. In this example, I would isolate the slab on grade and ensure that the value for the area of the slab component matches the one linked to the line item.
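The final reconciliation in that example amounts to a tolerance check: does the sum of the isolated component areas match the value linked to the line item? A minimal sketch, with purely illustrative numbers and names:

```python
# Hypothetical values pulled from the estimate and from the Takeoff View
# (names and numbers are illustrative, not real Profiler data).
line_item_area = 12500.0             # "Area" value linked to the slab-on-grade line item
model_slab_areas = [9800.0, 2700.0]  # areas of the individual slab components in the model

model_total = sum(model_slab_areas)
tolerance = 0.01  # allow a 1% rounding difference between estimate and model

discrepancy = abs(model_total - line_item_area) / line_item_area
if discrepancy > tolerance:
    print(f"Mismatch: model total {model_total} vs line item {line_item_area}")
else:
    print("Slab-on-grade area reconciles with the model")
```

A small tolerance is worth building in, since model-derived areas and linked estimate values can differ slightly due to rounding; anything beyond that threshold is a cue to go back into the Takeoff View and see which component the line item missed or double-counted.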
So far, I have found that repeating this process is a solid workflow for QA/QC’ing an estimate tied to a 3D model.