MiningMath


Optimization Runtime

Estimated reading: 3 minutes

Introduction

Optimization runtime is a common concern for professionals dealing with robust models. This page provides context and guidance to improve runtimes, which is quite useful for building a big picture of the project's behavior under different assumptions and hypotheses.

Runtime Barriers

The runtime depends on a combination of multiple aspects and is directly related to the complexity of the deposit. In general, runtime grows with the number of:

  • Blocks.

  • Destinations (more than 3).

  • Constraints in use and conflicting goals with the same hierarchy order.

  • Variables imported.

  • Period ranges.

  • Parameters changing over time.

  • Multi-mine deposits.

Often, users are concerned about limits when handling models with more than 20 million blocks. MiningMath can virtually handle any model size: it has been successfully tested with models of up to 240 million blocks without reblocking, a run that took three weeks on a desktop machine with 32 GB of RAM.

Typically, datasets with 5 million blocks take a few hours on an 8 GB RAM machine. The technology can also execute multiple scenarios in parallel on the same computer, so there is no need for special servers with extra RAM for deposits of average size.

Hardware Improvements

Overall, the main bottleneck for MiningMath is memory consumption. The hardware upgrades that most improve optimization runtime are:

  • RAM capacity.

  • RAM frequency.

MiningMath is a single-thread application, which means:

  • Additional cores and threads do not affect the optimization runtime.

  • Processors with higher clock speeds improve the runtime.

Additionally, the user can open several instances of MiningMath to run multiple scenarios in parallel. A single scenario will not run faster, but the average number of scenarios tested per unit of time improves.
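The throughput gain from parallel instances can be sketched as a simple back-of-the-envelope calculation. This is only an illustration: the RAM-per-scenario and hours-per-scenario figures below are hypothetical placeholders, not MiningMath measurements.

```python
def max_parallel_instances(total_ram_gb: float, ram_per_scenario_gb: float) -> int:
    # Each MiningMath run is single-threaded, so with enough CPU cores
    # the practical cap on parallel instances is memory, not cores.
    return int(total_ram_gb // ram_per_scenario_gb)

def scenarios_per_hour(instances: int, hours_per_scenario: float) -> float:
    # Throughput improves with parallel instances even though the
    # latency of each individual scenario stays the same.
    return instances / hours_per_scenario

# Hypothetical figures: a 32 GB machine, ~8 GB per scenario, 5 h per run.
instances = max_parallel_instances(32, 8)   # 4 instances
print(scenarios_per_hour(instances, 5))     # 0.8 scenarios per hour
```

Running one instance on the same hypothetical machine would test only 0.2 scenarios per hour, so memory headroom translates directly into testing throughput.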

Strategies to improve the runtime

The most recommended strategy is to follow the tutorial steps of data and constraint validation, then use surfaces as a guide to reduce complexity without losing the dilution aspects of your approach.

To get such guidance on a broader view with a reduced runtime, use the Exploratory Analysis to obtain pushbacks and insights on what could be used. The last step is a detailed Schedule, since that is where the model reaches its full complexity. If these approaches still do not deliver an acceptable runtime, try to get intermediate results by splitting the total production into 2 or 3 periods.

Reblocking is another possible approach, though it is not recommended because increasing the block size loses dilution aspects. For example, if your blocks have dimensions of 5x5x5, reblocking them to 10x10x10 reduces the number of blocks to one eighth of the original dataset size. According to user feedback, a multi-mine project with 32M blocks in the final integrated model, whose mining complex considered various processing routes and operational constraints involving mine infrastructure within the final pit boundaries, obtained the following runtimes:

  • Quadruple reblocking in each direction took about 4-5 hours for each run.

  • Triple reblocking took 12 hours.

  • Double reblocking took 36 hours.

As the solution became clearer, reblocking was gradually reduced to have more flexibility.
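The effect of reblocking on model size follows directly from the geometry: multiplying the block edge by a factor f in each direction merges roughly f³ original blocks into one, dividing the block count accordingly. A minimal sketch of that arithmetic, using the 32M-block figure from the feedback above as an illustration:

```python
def reblocked_count(n_blocks: int, factor: int) -> int:
    # Reblocking by `factor` in each of the 3 directions merges
    # factor**3 original blocks into one larger block.
    return n_blocks // factor**3

# Illustrative 32M-block model, as in the user feedback above.
print(reblocked_count(32_000_000, 2))  # 4,000,000 blocks (double reblocking)
print(reblocked_count(32_000_000, 3))  # ~1,185,000 blocks (triple reblocking)
print(reblocked_count(32_000_000, 4))  # 500,000 blocks (quadruple reblocking)
```

This is why the quadruple-reblocked runs above finished in hours while the double-reblocked runs took more than a day: the model shrinks with the cube of the reblocking factor.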
