13:45 – 14:45
Stage 02: Money and Megaprojects: 5D know-how and processes laid bare (Case Study)
Date: 2014-04-23, Track: Session 2
Associate Director – Turner & Townsend
Senior BIM Manager – Turner & Townsend Project Management
Megaproject = significant cost or complexity; attracts public attention. Lots of failure in this zone. A unique, multi-party endeavour with impact on communities and environments.
Procurement. Influenced by the sequence of the design.
Detail gap. Early stages define more than many appreciate.
Simple £ per sqm
Elemental shell and core
Composite component level
BoQ (bill of quantities) component level.
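The four estimating levels above move from a single blended rate to full bill-of-quantities detail. A minimal sketch of the two coarsest levels, with all areas and rates illustrative rather than figures from the session:

```python
# Sketch of the two coarsest estimating levels.
# All quantities and rates here are illustrative assumptions.

def sqm_estimate(gifa_sqm: float, rate_per_sqm: float) -> float:
    """Level 1: one blended rate applied to gross internal floor area."""
    return gifa_sqm * rate_per_sqm

def elemental_estimate(elements: dict) -> float:
    """Level 2: sum of quantity x rate per building element."""
    return sum(qty * rate for qty, rate in elements.values())

total_simple = sqm_estimate(10_000, 2_500)      # 10,000 sqm at GBP 2,500/sqm
total_elemental = elemental_estimate({
    "substructure": (10_000, 400),
    "frame":        (10_000, 600),
    "envelope":     (6_000, 900),
})
```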
Getting real cost-in-use data is hard.
Consistent guidance or intermittent interference. Don’t want to wait until gateways for information.
Capex win v opex gain
Procurement adversely affects both this and the model.
Sprints v programme
Design priority v constraint.
The plan to procure affects the design ability to deliver and when/what.
Modelling and platform consistency challenges. The methodology of modelling affects the mega project.
Inconsistency in the use, production and sharing of data.
Cost consultants want to get in at the EIR (Employer’s Information Requirements) stage.
More effort goes into data conversion and output consolidation than any of the other processes.
Data is brutal. 99% is wrong.
This process is about understanding data. Producers working with consumers.
Accuracy, compliance, cost, efficiency and net/gross.
Possible use of the first COBie drop as a hack to get compliance validation criteria.
LOD (level of detail) – completeness, resolution, over-modelling and BEP (BIM Execution Plan) status.
Completeness – of design development and data development.
Quality and coordination – clash mitigation and understanding, deficiency, duplication and rogue objects, standards checks.
Happy to get a model with problems in it as long as the problems are identified.
Manipulate data to inform decisions.
Check for consistent or similar build-ups and identify them for efficient costing and procurement.
Model data – category mapping, property sets, data type and validity, completeness and open source data testing.
No.1 consistency of taxonomy
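The model-data checks above (category mapping, property sets, data types, completeness, taxonomy consistency) can be sketched as a validation pass over exported objects. The schema and names below are illustrative assumptions, not a specific tool’s format:

```python
# Hypothetical validation pass over exported model objects.
# REQUIRED_PROPS and ALLOWED_CATEGORIES are illustrative assumptions.

REQUIRED_PROPS = {"category": str, "volume_m3": float}
ALLOWED_CATEGORIES = {"Wall", "Slab", "Column"}  # the agreed taxonomy

def validate(obj: dict) -> list:
    """Return a list of issues; an empty list means the object passes."""
    issues = []
    for prop, typ in REQUIRED_PROPS.items():
        if prop not in obj:
            issues.append(f"missing {prop}")            # completeness
        elif not isinstance(obj[prop], typ):
            issues.append(f"{prop} has wrong type")     # data type and validity
    if obj.get("category") not in ALLOWED_CATEGORIES:
        issues.append("category outside agreed taxonomy")
    return issues

issues = validate({"category": "Wal", "volume_m3": 1.2})  # typo in taxonomy
```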
Why is all this significant:
We can’t alter the model
Measurement rules differ from absolutes
We measure things that are not modelled.
Automation of the QS (quantity surveying) function? Myth mostly busted.
Use of Quantum, tableaux
Not writing to the model but associating.
Split objects apart into the measurement “bits”.
Associated database to the objects to track them.
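The approach above keeps measurement data beside the model rather than writing it back in. A minimal sketch, assuming objects are tracked by GUID (identifiers and attribute names are illustrative):

```python
# Sketch: measurement data held in an associated store keyed by object
# GUID, so the model itself is never modified. Names are illustrative.

measurements = {}

def associate(guid: str, **attrs) -> None:
    """Attach or update measurement attributes for a model object."""
    measurements.setdefault(guid, {"guid": guid}).update(attrs)

associate("3f2a-01", package="substructure", measured_m3=42.0)
associate("3f2a-01", rate_gbp=180.0)   # later enrichment; model untouched
```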
Embodied carbon comes from volume
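Deriving embodied carbon from modelled volume reduces to volume × density × emission factor. A sketch with illustrative factors (not figures from the session):

```python
# Sketch: embodied carbon from modelled volume.
# Density and emission factor below are illustrative assumptions.

def embodied_carbon_kg(volume_m3: float, density_kg_m3: float,
                       factor_kgco2e_per_kg: float) -> float:
    """kgCO2e = volume (m3) x density (kg/m3) x factor (kgCO2e/kg)."""
    return volume_m3 * density_kg_m3 * factor_kgco2e_per_kg

carbon = embodied_carbon_kg(10.0, 2400.0, 0.13)  # e.g. in-situ concrete
```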
Not a single software solution.
Primary aim – to isolate the unmeasured balance. Looking for what is not in the model
Model quantities differ from rules-of-measurement quantities, which apply rules for the work involved.
Unit rate, cost by discipline, cost by package, significant items, forex exposure.
Communicate the optimum combination of procurement options.
Use visualisation techniques to show variance: where things have changed and why. The pace of change can outrun the change control procedure.
Snapshots as fast as it is being modelled.
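Comparing snapshots to surface variance can be sketched as a diff of quantities per package between two captures. All packages and figures below are illustrative:

```python
# Sketch: variance between two model snapshots of quantity per package.
# Package names and quantities are illustrative assumptions.

snap_a = {"frame": 1200.0, "envelope": 800.0}
snap_b = {"frame": 1350.0, "envelope": 800.0, "fit-out": 300.0}

variance = {k: snap_b.get(k, 0.0) - snap_a.get(k, 0.0)
            for k in set(snap_a) | set(snap_b)}
changed = {k: v for k, v in variance.items() if v}  # only what moved
```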