MiningMath

Unlimited scenarios and decision trees for your strategic evaluations

Tutorials

Beginners guide

Run your first project

Follow our Getting Started training, a sequence of pages that teaches you how to run your first project: from the installation process and the formatting of your model files up to the long-term planning of your project.

Click Here

Must-Read Articles

To get the most out of MiningMath's optimization, we recommend this flow through our Knowledge Base. It will guide you step by step to integrate multiple business areas and to improve your strategic analysis through risk assessments unconstrained by step-wise processes.

Set up and first run

  1. Quick Check: Here you’ll have all the necessary instructions to install, activate and run MiningMath.

  2. How to run a scenario: Once everything is ready, it’s time to run your first scenario with MiningMath so you can familiarize yourself with our technology!

Find new results

  1. Playing with pre-defined scenarios: Each change in a scenario opens a new world of possibilities. Therefore, it’s time to understand a little more about them and see it in practice by playing with pre-defined scenarios.

  2. Decision Trees: Decision Trees provide you with a detailed, broad view of your project, allowing you to plan your mining sequence by analyzing every possibility in light of the constraints applied to each scenario: which options are more viable and profitable for the global project, and how these factors impact the final NPV.

Understand the technology in depth

  1. Current best practices: Here we go through the modern technology usually employed by other mining packages. Understanding these practices is important to comprehend MiningMath’s differentials.

  2. MiningMath uniqueness: Now that you’ve practiced the basics of MiningMath and understand how other mining packages work, it’s time to go deeper into the theory behind the MiningMath technology.

  3. Interface Overview: It’s time to go through our interface overview, with detailed information about every screen and constraint available in MiningMath: Home page, Model tab, Scenario tab, and Viewer, for a better understanding of the possibilities.

Using and validating your data

  1. Formatting the Block Model: Learn how to format your block model data and use it in MiningMath.

  2. Importing the block model: Go through the importation process and properly configure your data.

  3. Economic Values: MiningMath does not require pre-defined destinations ruled by an arbitrary cut-off grade. Instead, the software uses an Economic Value for each possible destination and for each block. After your data is formatted and imported into MiningMath, you may build your Economic Value for each possible destination.

  4. Data Validation: Once your data is set, it’s time to validate it by running MiningMath with a production capacity much larger than the expected reserves. This way, you will get and analyze results faster.

  5. Constraints Validation: Continuing the validation, start to add the first constraints related to your project so that you can understand its maximum potential.

Improve your results

  1. Integrated Workflow: Each project has its own characteristics, and MiningMath allows you to choose the workflow that best fits your demands.

  2. Super Best Case: In the search for the upside potential for the NPV of a given project, this setup explores the whole solution space without any other constraints but processing capacities, in a global multi-period optimization fully focused on maximizing the project’s discounted cash flow.

  3. Optimized Pushbacks: Identify timeframe intervals in your project so that you can work with grouped periods before getting into a detailed insight. This strategy allows you to run scenarios faster without losing flexibility or adding dilution to the optimization, which happens when you reblock.

  4. Optimized Schedules: Consider your real production capacities and explore scenarios to extract the most value in terms of NPV.

  5. Short-term Planning: Now that you have built knowledge about your project in the previous steps, it is time to start integrating long- and short-term planning in MiningMath. You may also optimize the short term along with the long term using different timeframes.

Export your results

  1. Exporting Data: After running your scenarios, you can export all data. Results are automatically exported to CSV files to integrate with your preferred mining package.

In-Depth MiningMath

This tutorial provides detailed guidance to the pages in the knowledge base for new MiningMath users. A shorter tutorial, with a set of must-read articles, can be found here. In this tutorial, a larger number of pages is contextualized and recommended for those with no previous experience using MiningMath who wish to gain more advanced knowledge.

Software requirements

  1. Quick check: Verify if your computer has all the minimum/recommended requirements for running the software.

  2. Put it to run: Here you’ll have all the necessary instructions to install, activate and run MiningMath.

Set up the block model

The next step after installation is to understand the home page interface and import your project data. The following pages go over these in detail.

  1. Home page: MiningMath automatically starts on this page. It depicts your decision trees, recent projects and model information.

  2. Import your block model: import your csv data, name your project, set fields and validation.

  3. Modify the block model: this window helps you modify your block model according to what is required for your project and also allows you to “Export” the block model to CSV format to be used with any other software.

  4. Calculator: calculate and create new fields by manipulating your project inside MiningMath.

Handling unformatted data

If you don’t have a block model ready to be imported you might want to create a new one. The following pages can guide you through this process.

Define the scenario and run

Once you have your block model defined, there are several options to set up your project’s parameters before running a scenario.

  1. Scenario tab: set densities, economic parameters, slope angles, stockpiles, add/remove processes and dumps, production inputs, geometric inputs and so on.

  2. Save as: save the scenario’s configuration once it is complete.

  3. Run: the Run tab is the last step before running your project’s optimization. Change the scenario name, set a time limit, and set up results files.

Results

After running your scenario it is important to analyze and understand the given results.

  1. Output files and 3D viewer: by default, MiningMath generates an Excel report summarizing the main results of the optimization. It also creates outputs of mining sequence, topography, and pit surfaces in csv format so that you can easily import them into other mining packages. The 3D viewer enables a view of your model from different angles. 

  2. Export model: export your model as a csv file. This can be used in new scenarios or imported in other mining packages.

Extensive set-up

MiningMath offers a lot of customization. You might use pre-defined scenarios to learn with standard parameters. Otherwise, the following pages of the knowledge base detail several important parameters that might need to be fine tuned in your project.

Advanced content

Complex projects might need advanced configurations or advanced knowledge in certain topics. The following pages cover some subjects considered advanced in our knowledge base.

Theory

In order to understand the theory behind MiningMath’s algorithm, a set of pages is provided to describe mathematical formulations, pseudo-code, and any rationale to justify the software design.

Workflows

MiningMath acknowledges and supports different workflows. This knowledge base provides a set of articles aimed at showing how MiningMath can be integrated into other workflows or have its results used by different mining packages.

Getting Started

Quick Check

System requirements

The only mandatory requirement for using MiningMath is a 64-bit system. Other minimum requirements are listed below:

  1. Windows 10

  2. 64-bit system (mandatory)

  3. 110 MB of space (installation) + additional space for your projects' files.

  4. Processor: processors above 2.4 GHz are recommended to improve your experience.

  5. Memory: at least 8 GB of RAM is required. 16 GB of RAM or higher is recommended to improve your experience.

  6. Microsoft Excel.

  7. OpenGL 3.2 or above. Discover yours by downloading and running the procedure available here.

  8. Visual C++ Redistributable: Installation of Visual C++ Redistributable is necessary to run this software.

Recommended Hardware

Memory should be a high priority when choosing the machine on which MiningMath will run. Here’s a list of priority upgrades to improve performance with large-scale datasets:

  1. More RAM

  2. Higher RAM frequency

  3. Higher processor clock speed

Common Issues

Insufficient memory

As previously presented, RAM should be one of the most important components to prioritize when selecting a computer to run MiningMath, especially because Windows alone consumes a significant amount of memory.

However, if you encounter an insufficient memory warning or a sudden crash while using MiningMath, there are some recommendations you can consider:

1. Memory Upgrade: If possible, this is the best solution to enhance efficiency. The characteristics to observe are listed in the previous item, “Recommended Hardware.” Based on our experience with more complex projects, 64 GB is usually sufficient for nearly all cases.

2. Free Up Memory: Consider closing other applications that are consuming the computer’s RAM while MiningMath is running.

3. Increase Windows Virtual Memory: This procedure involves allocating disk space to be used as RAM. To perform this procedure, we recommend this tutorial.

4. Reblock: If none of these options work, reblocking can be considered to reduce the size of the model. Check more details here.

Extra: In exceptional cases, when working with boxes, it may be viable to manipulate the block coordinates to bring them closer together, creating a smaller model box.

Put It to Run!

Installing, Activating and Running

Installing and activating MiningMath is quick and straightforward. All you need to do is follow the setup wizard and have an internet connection to activate your license. 

Video 1: MiningMath installation process.

Activating Your License

To activate your license, you will need to: 

  1. Open MiningMath (it will open automatically after the installation, but you can open it manually afterwards).

  2. On the left menu, click License.

  3. Select the field "I have an activation code" and paste the License Code provided by MiningMath.

  4. Click "Activate license".

Opening an old project

If you need to open an old project, just follow these steps:

  1. Open MiningMath (it will open automatically after the installation, but you can open it manually afterwards).

  2. On the left menu, click on Open Project.

  3. Search for the folder in which you saved your old project.

  4. Select the ".ssprj" file.

  5. Click on "Open" and it will show up under "Recent Projects".

  6. Now you can open it!

The images below illustrate this process:

NOTE

MiningMath’s licensing method demands an internet connection. 

Optimizing Scenarios

Play with the predefined scenarios

MiningMath allows you to learn with each scenario by providing standard parameters which simulate some common constraints that a mining company may face. Standard scenarios are listed and described below so you can identify the main changes made within the “Overview” tab.

The ultimate goal of this practice is to prepare you to build Decision Trees, which allow you to organize scenarios in order to understand how variables influence one another and, consequently, how these variables determine the final NPV.

Figure 1: Scenarios on the Home Page

Market Conditions Decision Tree

1) BaseCase

The Base Case consists of the initial scenario, with a uniform production capacity and without sum, average or surface mining limits.

2) BaseCase-RampUp

While the base case considers a uniform production capacity, the BaseCase-RampUp scenario offers the possibility to vary the levels of production within the different timespans. We have an initial production capacity of 10Mton on the first 2 periods; 20 Mton on periods 3 and 4; and 30 Mton from period 5 until the end of the mine’s lifetime, with a total movement constraint of 30, 60, and 80 Mton, considering the increase of production within the time-frames mentioned.

Figure 3: BaseCase-RampUp

3) PriceUp and PriceDown

Scenarios “PriceUp” and “PriceDown” differ from the base scenario in the economic value used for the P1 process, with an increase and a decrease of 10% in the copper selling price, respectively. In the destination tab, “P1 Cu +10” and “P1 Cu -10” were the values used for the process.

4) PriceUp-RampUp and PriceDown-RampUp

These scenarios consider a 10% copper selling price increase and decrease, and a ramp-up of the production capacity at the same time, as mentioned before.

5) PriceUp-RampUp-Protection300 and PriceUp-RampUp-Protection400

These scenarios consider a 10% copper selling price increase and a ramp-up of the production at the same time, as mentioned in the previous scenario. In addition, a restrict-mining surface (a constraint used to prohibit access to an area within a specific timeframe) was included up to the fourth period, since it may represent legal constraints on a project.

Figure 8: PriceUp-RampUp-Protections

Other Decision Trees

Below you can see a description of some scenarios of other Decision Trees.

1) MW150 (Geometries Decision Trees)

The MW150 scenario considers geometries different from the base case in the geometric constraints. In this scenario, 150 meters was used as the mining width (the horizontal distance between the walls of two surfaces that belong to consecutive periods), along with a vertical rate of advance of 180 meters.

Figure 9: MW150

2) AvgCu (Average Decision Tree)

In the AvgCu scenario, blending constraints were added in the average tab to require an average copper grade between 0.5% (minimum) and 0.7% (maximum) at the processing plant. The optimization will have to fulfill the P1 process capacity and, as an additional challenge, meet this new set of parameters related to the average Cu content within the ore.

Figure 10: AvgCu

3) Proc13000h and Proc13000h-33Mt

(Process throughput Decision Tree)

Scenario Proc13000h considers 13,000 hours of processing equipment usage as the maximum limit. This constraint was inserted in the sum tab, which can control variables such as rock type feeding, energy consumption, and any parameter controlled by its sum. Scenario Proc13000h-33Mt considers a 10% increase in production, inserted in the production tab, in addition to the parameters mentioned previously.

Figure 12: Processing hours

4) Yearly-TriannualProduction

(Short-Long Term Integration Decision Tree)

This scenario considers yearly production for period range 1-4 and triannual planning from range 5 to the end. This way it is possible to integrate both short- and long-term planning in a single run, facilitating the analysis and strategic definitions.

Any kind of timeframe can be used according to your needs.

Figure 13: Short-Long Term Integration (Yearly-Triannual Production example)

Translations

MiningMath supports and encourages the translation of its knowledge base to multiple languages. If you would like to translate our knowledge base and have your profile advertised please contact us.

Portuguese

Ask GPT

You can use ChatGPT to help you navigate our knowledge base. First, you will need to have the Plugins option enabled on GPT-4.

After that, choose the AskYourPDF option:

Finally, you should enter the following prompt:

For the requests all along this chat, consider the following content: https://miningmath.com/Knowledge-Base.pdf

Other prompts can help you with different requests. For example, you can ask GPT-4 to act as your own technical support agent that answers in the same language as your question:

Plugin AskYourPdf, consider the following content: https://miningmath.com/Knowledge-Base.pdf

Please answer the following question as a technical support agent, coming from a MiningMath user, in the same language as the question:

"QUESTION TEXT TO BE REPLACED"

Essential Topics

How to Run a Scenario

Video 1: Downloading MiningMath.

On MiningMath’s interface, you will find the Marvin block model and its scenarios (Figure 1). It is possible to preview the scenario and its parameters before opening it (Figure 2).

Choose and open Base Case, click the “Overview” tab (Figure 3) to check the parameters, and then click on “Run” to run the optimization (Figure 4).

After that, a short report with the results will be generated. To view it, check all the boxes on the “Load Options” window and click on “Load” (Figure 5).

Finally, whenever you feel ready to run your own scenarios, start by formatting your data here.

Common Issues: Setting your first scenario

When setting up your first scenario, you may come across some situations such as unavailable tabs and some fields marked in red. These situations are quite simple to resolve, as shown in the following video:


Results of the Optimization

By default, MiningMath generates an Excel report summarizing the main results of the optimization. It also creates outputs of mining sequence, topography, and pit surfaces in .csv format so that you can easily import them into other mining packages.

Viewer

The 3D viewer enables a view of your model from different angles. The block colors are defined according to the property displayed, varying from blue to red (smallest to largest), whether destinations, periods, or any other parameter. It is therefore possible to filter the blocks by the period in which they were mined or processed, for instance. In addition, it also allows you to compare multiple scenarios by loading different cases and using the left bar to switch from one to another.

Output Files

After optimizing your block model and running your scenario(s), MiningMath generates standard output files with detailed reports. The main files have a universal format (.csv), which allows you to easily import them into other mining packages to start your mine design and the further steps of your project.

To open the project folder, right-click on the scenario’s name and choose “Show in the Explorer“. The optimization’s main output files are:

  • Scenarioname.xlsx: Short report with the main results.

  • MinedBlocks.csv: Detailed report which presents all the blocks that have been mined.

  • Surface.csv: Grid of points generated through the pit of each period.

Scenarioname.xlsx

Provides you with a short report with the main results of the optimization: several charts and sheets in which you can analyze the production on each period, the stockpiles by periods, the average grade of processes and dump, NPV per period, the cumulative NPV (Net Present Value), etc.

Figure 18: Graphic results

MinedBlocks.csv

This file offers a detailed report on all the mined blocks and their specificities: information on the mining sequence based on each block extracted, along with mined and processed periods, destinations, economic values, and all information used for the optimization. This file also allows you to identify blocks that were stocked and to follow the algorithm’s decision-making process.

Figure 19: Mined blocks

Surface.csv

The surface files bring a grid of points generated through the pit of each period: each surface is named according to its mining period and contains information about the topographic coordinates at that time. These files can be imported into the viewer separately, so that you can verify and validate your data before starting the optimization process. Note: Surfaces are exported from and imported into MiningMath as coordinates.

Figure 20: Surface's CSV

Video 1: Outputs and files hierarchy.
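
Because these outputs are plain CSV files, they can also be checked quickly with a short script before being imported into another package. The Python sketch below is only an illustration: the column and file names used here (PeriodMined, Destination, Tonnes, Surface001.csv) are assumptions and should be matched to the headers of your own output files.

    import pandas as pd

    # Load the detailed block report produced by the optimization.
    # Column names ("PeriodMined", "Destination", "Tonnes") are assumptions
    # for illustration -- check the header of your own MinedBlocks.csv.
    blocks = pd.read_csv("MinedBlocks.csv")

    # Tonnage sent to each destination per period, as a quick sanity check
    # against the production limits defined in the scenario.
    summary = (
        blocks.groupby(["PeriodMined", "Destination"])["Tonnes"]
        .sum()
        .unstack(fill_value=0)
    )
    print(summary)

    # Surfaces are grids of points in coordinates, so they can be inspected
    # the same way before importing them into another mining package.
    surface = pd.read_csv("Surface001.csv")  # hypothetical file name
    print(surface.describe())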

Play with Predefined Scenarios

MiningMath allows you to learn with each scenario by providing standard parameters which simulate some common constraints a mining company may face. Standard scenarios are listed and described below so you can identify the main changes made within the “Overview” tab.

The ultimate goal of this practice is to prepare you to build Decision Trees, which allow you to organize scenarios in order to understand how variables influence one another and, consequently, how these variables determine the final NPV.

Figure 1: Scenarios on the Home Page

Dataset

The examples in this page come preinstalled with every version of MiningMath. If you have deleted this project by any chance, please download the zip file below, extract the files and choose the “Open Project” option in MiningMath.

BaseCase

The Base Case consists of the initial scenario, with a uniform production capacity, and without sum, average, or surface mining limits.
Figure 2: BaseCase overview

BaseCase-RampUp

While the base case considers a uniform production capacity, the BaseCase-RampUp scenario offers the possibility to vary the levels of production within the various timespans. We have an initial production capacity of 10Mton on the first 2 periods; 20 Mton on periods 3 and 4; and 30 Mton from period 5 until the end of the mine’s lifetime, with a total movement constraint of 30, 60, and 80 Mton, considering the increase of production within the time-frames mentioned.

Figure 3: BaseCase-RampUp

PriceUp and PriceDown

Scenarios “PriceUp” and “PriceDown” differ from the base scenario in the economic value used for the P1 process, with an increase and a decrease of 10% in the copper selling price, respectively. In the destination tab, “P1 Cu +10” and “P1 Cu -10” were the values used for the process.

PriceUp-RampUp and PriceDown-RampUp

These scenarios consider a 10% copper selling price increase and decrease, and a ramp-up of the production capacity at the same time, as mentioned before.

PriceUp-RampUp-Protection300 and PriceUp-RampUp-Protection400

These scenarios consider a 10% copper selling price increase and a ramp-up of the production at the same time, as mentioned in the previous scenario. In addition, a restrict-mining surface (a constraint used to prohibit access to an area within a specific timeframe) was included up to the fourth period, since it may represent legal constraints on a project.

Figure 8: PriceUp-RampUp-Protections

Below you can see a description of some scenarios of other Decision Trees.

MW150

The MW150 scenario considers geometries different from the base case in the geometric constraints. In this scenario, 150 meters was used as the mining width (the horizontal distance between the walls of two surfaces that belong to consecutive periods), along with a vertical rate of advance of 180 meters.

Figure 9: MW150

AvgCu

In the AvgCu scenario, blending constraints were added in the average tab to require an average copper grade between 0.5% (minimum) and 0.7% (maximum) at the processing plant. The optimization will have to fulfill the P1 process capacity and, as an additional challenge, meet this new set of parameters related to the average Cu content within the ore.

Figure 10: AvgCu

AvgCu-Stock5Mt

Here, the same blending constraints of the previous scenario (AvgCu) were added, in addition to a stockpile limit of 5Mton for process 1, on the destination tab. This feature allows you to control the stock limit of your whole process, which increases the optimization flexibility to feed the plant, while respecting the blending constraints that were already implemented.

Figure 11: AvgCU-Stock5Mt

Proc13000h and Proc13000h-33Mt

Scenario Proc13000h considers 13,000 hours of processing equipment usage as the maximum limit. This constraint was inserted in the sum tab, which can control variables such as rock type feeding, energy consumption, and any parameter controlled by its sum. Scenario Proc13000h-33Mt considers a 10% increase in production, inserted in the production tab, in addition to the parameters mentioned previously.

Figure 12: Processing hours

Calculator

This feature allows the user to manipulate their project inside MiningMath, enabling adjustments and new field creation. Figure 1 shows a general view of the calculator. On the left side we have the block parameters and on the right the calculator itself, where the calculation can be done.

Figure 1: Calculator.

To use the calculator, just insert a name for the new field, select the type of field (to know more about field types, access this link), and build your expression. In case of a more complex expression, just mark the field “Logical Test” to enable conditional features. The following operators are available:

  • + : Addition

  • - : Subtraction

  • * : Multiplication

  • / : Division

  • % : Modulus

  • ** : Exponential

  • // : Floor division

Practical approach

To facilitate understanding, let’s work through some examples. Below you can see a generic math expression (left) and its equivalent written in MiningMath’s calculator (right):

\((x^2)\times(\frac{y}{2}-1)\)  →  x**2*((y/2)-1)
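
The calculator’s operators behave like those of most programming languages, so an expression can be prototyped outside MiningMath before being added as a field. The short Python sketch below is only an illustration, evaluating the example expression for a hypothetical block with x = 4 and y = 10.

    # Illustrative check of the calculator expression x**2*((y/2)-1)
    # for a hypothetical block with x = 4 and y = 10.
    x, y = 4.0, 10.0

    value = x**2 * ((y / 2) - 1)   # same operators as the calculator: **, *, /, -
    print(value)                   # 64.0

    # The remaining operators work the same way:
    print(7 % 3)    # modulus        -> 1
    print(2 ** 5)   # exponential    -> 32
    print(7 // 3)   # floor division -> 2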

Adding a field without logical expression

Using an example from Marvin’s Economic Value calculation, we are going to add a Block Tonnes field, as shown in Figure 2:

Figure 2: Adding a new field

Adding a field with a logical expression

One more time using Marvin’s block model, let’s suppose we want a maximum slope angle of 45 degrees.

First, we name our field, in this case "SlopeMax45d", select the field type as "Slope", and check the Logical Test box. Then, double-clicking the Slope field selects it and places it in the Expression. The next step is to select the operator: as we want a maximum of 45 degrees, we choose the operator ">" and insert the value 45 in the text box. If the test is true, that is, if the block’s slope is greater than 45, the value 45 will be assigned to it. If the test is false, i.e., the slope is 45 or lower, the block keeps its original value. Figure 3 shows this calculation:

Figure 3: Logical test expression
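
For reference only, the same block-by-block rule can be written in a few lines of Python; this is a sketch of the logic described above (cap slope values at 45 degrees), not MiningMath’s internal implementation.

    # Sketch of the "SlopeMax45d" logical test: if a block's slope is greater
    # than 45 degrees, assign 45; otherwise keep the original value.
    def slope_max_45(slope: float) -> float:
        return 45.0 if slope > 45.0 else slope

    # Hypothetical slope values for three blocks
    for s in (30.0, 45.0, 52.5):
        print(s, "->", slope_max_45(s))   # 30.0, 45.0 and 52.5 map to 30.0, 45.0 and 45.0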

During the expression construction, green or red underlines will appear, highlighting the parts that are correct and the ones that need adjustments. When it is all set, just click on “Add field” and this new field will be available for use in the project under its correct field type. In case you need to delete a field, just go to the Parameters option, select the field, and delete it.

Removing a field

To remove an existing field, go to the “Parameters” tab, select the desired field and click “Remove”.

Figure 4: Removing a field

NPV Calculation

The following video explains more about the NPV calculation made by MiningMath’s algorithm. Understanding these steps might be useful for users working on projects with variable mining costs, which are not yet smoothly implemented in the UI.

Video 1: NPV calculation.

The discount rate (%/year) is provided by the user in MiningMath’s interface, as depicted in the figure below.

Figure 1: Interface example to define discount rate (%/year)

In a usual scenario, period ranges are defined by annual time frames, as depicted in Figure 2.

Figure 2: Interface example with annual time frame

In this case, the annual discount rate multiplier (annual_multiplier) used to return the discounted cash flow is calculated as follows:

\(\text{annual_multiplier}(t) = \frac{1}{(1 + \text{input discount rate})^t}\)

The table below exemplifies one case for 10 periods.

Period | Process 1 | Dump 1 | NPV (Discounted) M$ | Annual multiplier | Undiscounted NPV M$
1      | P1        | Waste  | 1.2                 | 0.909             | 1.320
2      | P1 +5%    | Waste  | 137.9               | 0.826             | 166.859
3      | P1 +5%    | Waste  | 132.5               | 0.751             | 176.358
4      | P1 +5%    | Waste  | 105.4               | 0.683             | 154.316
5      | P1 -5%    | Waste  | 89                  | 0.621             | 143.335
6      | P1 -5%    | Waste  | 92                  | 0.564             | 162.984
7      | P1 -5%    | Waste  | 91.3                | 0.513             | 177.918
8      | P1 -10%   | Waste  | 52.3                | 0.467             | 112.110
9      | P1 -10%   | Waste  | 54.3                | 0.424             | 128.037
10     | P1 -10%   | Waste  | 12.1                | 0.386             | 31.384

Table 1: Example of annual multiplying factors and undiscounted cash-flows for a 10% discount rate per year. Process 1 exemplifies the use of different economic values per period.

In detail, Table 1 lists:
  1. the NPV (discounted) resulting from 10 yearly periods with a 10% discount rate per year;

  2. the annual discount rate multiplier (annual_multiplier) for each period; and

  3. the undiscounted NPV, obtained by dividing the discounted NPV by the annual_multiplier. A short script reproducing these values is shown below.
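
As a cross-check, the relationship between the discounted NPV, the annual multiplier, and the undiscounted NPV in Table 1 can be reproduced with a few lines of Python (the discounted cash flows are simply the values listed in the table):

    # Reproduce the annual multipliers and undiscounted values of Table 1
    # for a 10% discount rate per year.
    discount_rate = 0.10
    discounted_npv = [1.2, 137.9, 132.5, 105.4, 89.0, 92.0, 91.3, 52.3, 54.3, 12.1]

    for t, npv in enumerate(discounted_npv, start=1):
        multiplier = 1 / (1 + discount_rate) ** t   # annual_multiplier(t)
        undiscounted = npv / multiplier             # undiscounted NPV
        print(f"period {t}: multiplier = {multiplier:.3f}, undiscounted = {undiscounted:.3f}")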

MiningMath allows the creation of scenarios in which period ranges are defined with custom time frames (months, trienniums, decades, etc.), as depicted in Figure 3.

Figure 3: Interface example with custom time frames

In this case, the discount rate is still provided in years on the interface. However, the discount rate per period follows a different set of calculations. To identify the correct multiplier (discount rate for a custom time frame) applied to each custom time frame, it is necessary to apply the formula below:

\( \text{mult}(t) = \frac{1}{(1 + \text{discount_rate}(t)) ^ {\text{tf_sum}(t)}}\)

where:

\(
\text{tf_sum}(t) = \sum_{i=1}^{t}\frac{TF(i)}{TF(t)}
\)

and

\(
\text{discount_rate}(t) = (1 + \text{annual_discount_rate})^{TF(t)} - 1
\)

and

\(
TF(t)=
\begin{cases}
1,& \text{if}\, t\, \text{is in years}\\
\frac{1}{12},& \text{if}\, t\, \text{is in months}\\ 3,& \text{if}\, t\, \text{is in trienniums}\\
etc.&
\end{cases}
\)

For example, to calculate the multiplier of the first period in Figure 3, the equations would be:

\( TF(1) = \frac{1}{12} = 0.0833… \)

\( \text{tf_sum}(1) = \sum_{i=1}^{1}\frac{TF(i)}{TF(1)} = 1 \)

\( \text{discount_rate}(1) = (1 + \text{annual_discount_rate})^{TF(1)} - 1 = (1 + 0.1)^{1/12} - 1 \approx 0.008 \)

\( \text{mult}(1) = \frac{1}{(1 + \text{discount_rate}(1))^{\text{tf_sum}(1)}} = \frac{1}{(1 + 0.008)^{1}} \approx 0.992 \)

As another example, to calculate the multiplier of period 15 in Figure 3, the equations would be:

\( TF(15) = 3 \)

\( \text{tf_sum}(15) = \sum_{i=1}^{15}\frac{TF(i)}{TF(15)} = 2 \)

\( \text{discount_rate}(15) = (1 + \text{annual_discount_rate})^{TF(15)} - 1 = (1 + 0.1)^{3} - 1 = 0.331 \)

\( \text{mult}(15) = \frac{1}{(1 + \text{discount_rate}(15)) ^ {\text{tf_sum}(15)}} = \frac{1}{(1 + 0.331)^{2}} = 0.564 \)
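
The same formulas can be scripted to check the multiplier of any custom time frame. The sketch below assumes the period layout inferred from the examples above (twelve monthly periods, two yearly periods, and a final triennium; the exact layout of Figure 3 is an assumption here) and reproduces the period-15 multiplier of 0.564.

    # Sketch of the custom time-frame discounting formulas.
    # TF(t) is the length of period t in years: 1/12 for months, 1 for years,
    # 3 for trienniums, and so on. The period layout below (12 months, then
    # 2 years, then 1 triennium) is an assumption used only for illustration.
    annual_discount_rate = 0.10
    TF = [1 / 12] * 12 + [1.0] * 2 + [3.0]

    def mult(t: int) -> float:
        """Discount multiplier of period t (1-indexed), following the formulas above."""
        tf_sum = sum(TF[i] for i in range(t)) / TF[t - 1]
        discount_rate = (1 + annual_discount_rate) ** TF[t - 1] - 1
        return 1 / (1 + discount_rate) ** tf_sum

    print(round(mult(1), 3))    # ~0.992 (first monthly period)
    print(round(mult(15), 3))   # ~0.564 (final triennium)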

Evaluate Project Potential

Certain constraints related to your project can be defined so that you can understand its maximum potential. The surface generated in this step can also be used as a restrict-mining surface in the last period to reduce the complexity of your block model and the runtime of MiningMath, since it already accounts for the set of constraints inputted.

Example

  • Set up a scenario with 1,000 Mt in the processing plants, which corresponds to a lot more mass than expected in the whole life of the mine.

  • Add the Minimum Bottom Width (100m). This constraint will allow you to have a suitable work-front for your equipment.

  • Restrict Mining surface, if you have this constraint in your project.

  • Grade constraint up to 0.7%.

  • Timeframe: Years (1), since it would all be processed in 1 period.

Note: Sum constraints can restrict the total amount of material handled (ore + waste) in the mine. Therefore, do not use them in this validation.

Using results

Now that the Constraints Validation step is done, you are able to use this final surface as a guide for future optimizations. This approach reduces the runtime and the complexity of the problem because, when the surface is taken into account, the blocks below this final optimized surface are not considered and the heuristics inside the interface are facilitated. Notice that we did not make any change to the discount rate; thus, this first NPV does not represent reality. If you need an accurate result at this step, make sure to adjust it.

It’s important to remember that when we restrict mining to this surface, the number of periods generated in future runs could be reduced, because the average parameters of each one will have to meet the constraints of the overall package. Therefore, to achieve the same parameters in a lower timeframe, some blocks may be discarded due to the mining sequence and the optimization of destinations inside the whole mass.

With this idea in mind, you should already have enough information to decide and structure the next step of the optimization. Based on the amount mined in the last step and on the processing capacity, define a good timeframe to identify the mining sequence. In this case, we had 231 Mt of total ore mass to split over almost 23 years, since the processing capacity is 10 Mt.

To improve efficiency in the optimization, before working on a yearly basis, we decided to consider the first 5 years. It is reasonable to generate a 10-year surface and run the optimization inside this limit, based on the observations made before. Remember that each assumption here can be made according to your project’s demands and that MiningMath can work with any timeframe to meet your needs.

Decision Trees

Comparing Scenarios

Decision Trees provide you with a detailed, broad view of your project, allowing you to plan your mining sequence by analyzing every possibility in light of the constraints applied to each scenario: which options are more viable and profitable for the global project, and how these factors impact the final NPV. Consider, for instance, the plant production per year as a variable factor. Using Decision Trees (Figure 1), you will be able to analyze how each constraint, e.g. the ore price, affects that year’s production and whether it benefits the global project.

Figure 1: Essence of a Decision Tree, done in presentation software.

By running all the scenarios individually, just like you did in Practice First, you will be able to identify how each change within a set of constraints impacts the NPV results and the mining sequence generated (Figures 2 and 3), which provides a broader view of your project and enables you to decide which route you should take to generate value for your company.

How to Analyze Multiple Scenarios

Increase in the value of copper

Analyzing first the scenario in which there is a change in the economic value of the P1 process (“scn-PriceUp”), values such as NPV would naturally be different. In this case, analyzing the NPV and the total movement (Figure 4), it’s possible to see that a different mining sequence was generated, which increased the mine’s lifetime by one period. This market change has also increased the cumulative NPV (Figure 5), given its direct relation with the copper selling price. The charts below were made from MiningMath’s results in simple spreadsheet software.

Figure 4: Total mass (Process+waste) handled on each scenario.
Figure 5: Cumulative NPV contrasts.

Adding an average grade limit

Now we can analyze the scenario in which a restriction on the average grade at the P1 process was added, using a minimum and a maximum limit for copper (“scn41-AvgCu”). The blocks that would be processed have to meet the established targets, allowing better selectivity of what should or should not be processed. The ones with higher or lower grades than required can be blended with others to generate an average grade that respects the constraints and improves the NPV.

Notice that there was a higher total production (Figure 6) in each period, caused by the increase in the stripping (waste/ore) ratio needed to meet the 30 Mton of ore production at the P1 process and the average grade targets set in the “scn41-AvgCu” scenario. Better stockpile usage is expected, in order to use all the blending capabilities and decision-making intelligence of the algorithm to decide which blocks could be mixed to fulfill the plant capacity. In addition, the cumulative NPV (Figure 7) shows that by inserting average grade constraints we consequently reduce the algorithm’s flexibility and lose some money to keep the operational stability frequently required at a processing plant.

In general, the main goal of MiningMath, considering the set of constraints provided, is to maximize the cumulative NPV in the shortest mine lifetime possible, which reduces the depreciation of the project’s value by interest rates. The charts below were made from MiningMath results with the support of spreadsheet software.

Figure 6: Total mass handled on each scenario.
Figure 7: Cumulative NPV contrasts.

Building Decision Trees

You have been introduced to some of MiningMath’s functionalities. Now let’s take a closer look at how decision trees are built.

Mine project evaluation still largely relies on technology from the 1960s, in which a step-wise process is usually necessary, along with time-consuming activities such as pit design, in order to create one single scenario. Evaluating projects through this approach can take from weeks to months of multidisciplinary work just to produce a couple of scenarios. This process is often guided by arbitrary decisions that may constrain the mathematical solution space, confining solutions to engineering expertise and judgment.

Global optimization scheduling can speed up the process of generating multiple scenarios for a project overview prior to detailed work. MiningMath integrates business areas and allows managers to improve their decision-making process by structuring their strategic analysis through multiple decision trees, with a broader and optimized view of their projects, comprising constraints from different areas of the company.

The following video shows a few possibilities that can only be recognized when seeing the available paths to create value. The video is oriented to technical daily usage but also covers interesting subjects from the managerial perspective. For the latter, skip straight to minute 15:23.

Video 1: Video detailing the building of decision-trees.

Apply to your projects

Now that you have played with the sample data, it is time for a hands-on approach: apply this optimized strategy to your own projects!

MiningMath already allows you to structure your Decision Trees layout at its home page, which facilitates and guides the decision-making and mining planning processes.

Take advantage of the possibility to add (+), rename, or delete Decision Trees (Figure 7) by right-clicking on their names, and/or exchange scenarios (Figure 8) between trees to build different mining planning strategies. The icon is a shortcut, so you can easily open your scenario’s full report.

Compare everything at a single glance and identify how each change impacts your results, building your own analysis with presentations based on MiningMath charts, as shown in Figure 1.

Interface Overview

Home Page

MiningMath automatically opens on the home page, as shown below.

Two main areas are accessible from the home page:

  1. Recent projects: This section allows you to select a project. Right-clicking on the project name provides you with a range of options:
     

    New scenario: The "Scenario Config" window will open, allowing you to choose which decision tree to place your new scenario in. You can also enter a name and a description for it. After that, you'll be directed to the Scenario tab to set it up.


    Show in Explorer: This option opens the directory containing the folder with your project files and data.

    Remove from list: This option removes the selected project from the Recent Projects list

    Delete project: This option permanently deletes the project and all associated scenarios.

  2. MiningMath menu: this provides quick access to essential functions.

    Here's a description of each item:

    New Project: Use this option to start a fresh project. Clicking on it will allow you to create and configure a new mining optimization scenario from scratch.

    Open Project: This allows you to open an existing project that you have previously worked on. You can browse your files and select the project you want to continue.

    License: Clicking on this option takes you to the licensing section, where you can manage your software license, check its status, or enter a new license key. More info about license can be seen here.

    Help: This displays a new window with key software information and links to other essential resources.

    Close: This option closes the current session or the entire application.

Once a project is selected, the Decision Trees and Model areas will be displayed (as depicted below).

Both sections provide key information about project results and block model parameters. They can be used as follows:

  1. Decision Trees: This feature allows you to quickly navigate through recent scenarios without needing to open them from their original folders.

    Additionally, it lets you create new tabs and organize your mining planning strategies by exchanging scenarios as needed. You can access all paths involved in the project, giving you a comprehensive view that enhances your decision-making process (read more). To open any scenario, right-click and select the "Open" option.

     


    Alternatively, you can select the scenario and click “Open” at the bottom right of the screen. The “View” button next to it will take you to the Viewer tab instead. 

    This area displays scenarios for each tree, providing key information such as name, description, NPV (M$), runtime, and a direct link to the sheet containing all the results of the scenarios (available after execution).

    Several options are available to manage Decision Trees:

    1) Add new trees by clicking on “+” 

    2) Rename a tree by double-clicking its name

    3) By right-clicking on the tree name, you can add a new scenario, rename the tree, or delete it.

    For more options, right-clicking on a scenario’s name reveals hidden choices: open, view model, rename, show in Explorer, delete, and transfer it between decision trees.

    Lastly, the scenario description can be easily edited with a double click.

  2. Model Table: This section provides key information about your block model and its parameters, allowing you to easily review it at any time using the "Edit" button. This will take you to the Calculator functionality.

Model Tab

This tab lets you modify your block model to meet your project requirements. You can also export the block model in CSV format for use with other software packages.

Parameters tab

The Model tab begins with the Parameters option, displaying your data from the previous setup during import, along with all other existing fields. You can also remove any parameter if needed.

Function tab

The Function tab features the Validate Block Parameters table, allowing you to select a single field in your model to verify its values. It also includes an internal Calculator for making adjustments and adding new fields to your block model.

Viewer Tab

MiningMath’s 3D Viewer enhances your workflow by providing a comprehensive visualization of your block model, optimization results, and surfaces from various angles. This tool allows you to filter and customize displayed features, offering a quick and efficient overview of your data and optimization processes.

Model properties and scenario results

After running your scenario, the MinedBlocks.csv file will display its results on the 3D viewer, allowing you to see your model from different angles. By selecting “Period Mined,” you can view the mining sequence period by period.

By selecting a surface, you can identify topography changes for each period and adjust its opacity, making visualization easier.

MiningMath also lets you import existing surfaces by placing them in the same folder as your other files, allowing you to validate their geometry if needed. Click on Load Scenario to import multiple scenarios and compare them, helping you extract the best results according to your project constraints.

Scenario Tab

After importing a model, you can manage the scenario setup in the Scenario Tab. The setup process is divided into several guided steps, helping you configure all necessary parameters. Before running the optimization, you’ll receive a summary of the entire setup for your review.

Feel free to explore each step using the links below or by navigating the page tree on the left side of the knowledge base.

Parameters

Constraints

Execution

General Parameters

When you open a scenario in the Scenario Tab, MiningMath automatically takes you to the General subtab. Here, you’ll find all the essential inputs, including densities, economic parameters, slope angles, and stockpile information.

General tab setup employing the values from the Marvin case.

The descriptions of all sections are outlined below.

  • Densities: These values are used alongside the block size to help calculate tonnages (see the tonnage formula after this list).

    You have two options to define densities:

    Field: This displays the column(s) assigned to density during the import process. It allows you to set varying densities for each block.

    Default value: This applies to any block that doesn’t have density information, regardless of whether a density column was imported. It’s also used when you select the field as <none>.

  • Economic Parameters: This field lets you set the discount rate, which is usually applied on an annual basis.

    The discount rate reflects the time value of money, impacting how future cash flows from mining ore and waste are valued. It plays a crucial role in the algorithm's decision-making process by influencing the timing and prioritization of mining activities. For more information on economic values, click here.

    Note: When working with different time frames, the discount rate primarily serves as a rough approximation of NPV and has a limited impact on solution quality, as the algorithm prioritizes the allocation of the best materials first. By adjusting the discount rate—either multiplying or dividing it based on the number of periods—you can still achieve reasonable results.
  • Slope Angles: These are one of the most important parameters when considering constraints hierarchy.

    You have two options for defining slope angles:

    Field: This displays the column(s) assigned to slope during the import process. It allows you to set different slope angles for each block.

    Default value: This applies to any block that doesn’t have slope information, even if a column was assigned. It’s also used when you select the field as <none>.

  • Stockpiling: You can enable this feature by checking the box.

    When this option is enabled, you can define two parameters:

    Fixed mining cost (cost/t): This refers to the average mining cost used in the economic function. This value helps break down the economic value while accounting for stockpiles

    Rehandling cost (cost/t): This represents the cost of reclaiming blocks from the stockpile for processing.
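
For reference, and purely as an illustration, the tonnage of a block is obtained from its volume and density:

\( \text{tonnage} = \Delta x \cdot \Delta y \cdot \Delta z \cdot \text{density} \)

where \( \Delta x \), \( \Delta y \), and \( \Delta z \) are the block dimensions. For example, a hypothetical 30 m × 30 m × 30 m block with a density of 2.7 t/m³ weighs 30 · 30 · 30 · 2.7 = 72,900 t.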

Destinations

Defining destinations is the next essential step after completing the General Parameters. In the Destinations subtab, you can easily add or remove processes and dumps using the buttons at the bottom right.

MiningMath requires at least one destination for processing and one for dumping. Each process must have a recovery field, and you can also define stockpile limits for each destination.

Recovery values based on the Marvin case.

Recoveries and stockpile limits are described next:

  • Recovery: For each processing stream, you need to specify a recovery value (ranging from 0 to 1) for any element or mineral whose column has been imported as a grade.

    This value on the interface is intended solely for generating reports, as it has already been accounted for in the economic calculations.

  • Stockpile limits: If activated in the General subtab, you can set a tonnage limit for the stockpile here. MiningMath treats this specified tonnage as a cumulative upper limit, applying it throughout the entire life of the mine.

    In this example, a limit hasn’t been defined, which means there is unlimited capacity for stockpiling (read more about stockpiles).

Finally, the destination of each block will be reported by assigning it a number from 1 to N (see the numbers beside the Name column), which depends on the order in which destinations were added.

Grouping destinations

When working with more than two destinations, you can group certain destinations into different categories. This is especially useful if you need to limit production capacity across grouped destinations. To define groups, simply switch to the Groups tab.

You can add or remove groups using the buttons on the right. To include or exclude destinations from each group, simply click on the red or green cells.

Production

Once you’ve completed the General and Destination subtabs, the Production subtab will be enabled.

In this section, it’s important to add period ranges by clicking the Add Range button in the bottom right corner. Typically, period ranges are defined for annual periods, as shown below.

Annual periods with no limit on the amount of periods.

However, you can also incorporate both short-term and long-term planning within a single scenario (as shown below) or use any combination of period ranges you prefer.

Monthly periods for the first 12 months, followed by annual periods without any limit on the amount.

After defining the period ranges, you’ll need to add production capacities and economic values for each range.

  • Production capacities: Set limits (in tonnes) for each destination, as well as the total amount of material moved per period (read more about production constraints).

    Note: If destination groups were defined in the previous Destinations subtab, they will also be displayed here so that their production can be limited.

    Illustrative example of production capacities for destination groups.
  • Economic values: For each period range and destination, assign an imported economic value. A single destination can have different economic values for various period ranges, as shown below.

Geometric

Once the Production subtab is complete, the Geometric subtab will be enabled. In this section, you can define parameters designed to find solutions that meet the basic requirements for operational feasibility.

You can define five different parameters for each period range and timeframe:

  • Minimum mining width (MW)

    This parameter specifies the distance (in meters) from one pit to another. Wider values allow for greater mining fronts and better designs for nested pits, pushbacks, schedules, or any other desired outcomes. (read more). 

  • Minimum mining length (ML)

    This parameter defines the minimum distance (in meters) required between at least two points along the walls of surfaces in two consecutive mining periods. This distance is automatically maintained for any values that are smaller than or equal to the Minimum Mining Width (MW) (read more)

  • Minimum bottom width (BW)

    This parameter specifies the minimum horizontal distance (in meters) on the lowest floor of the pit. It ensures that mining operations can be conducted effectively based on the equipment sizing (read more).

  • Vertical rate of advance

    This parameter indicates the vertical distance (in meters) mined in each period. It is calculated by evaluating each mining face independently (read more).

  • Surface mining limits

    You can add two types of surface restrictions:

    Force mining: This option requires a surface file (.csv) as input. It defines an area that must be mined within a specified time frame, ensuring that all material within the boundaries is fully depleted (read more).

    Restrict mining: This option also requires a surface file (.csv) as input. It defines an area outside the imported surface that cannot be accessed within the specific time frame, effectively setting a maximum mining depth for that period (read more).

    Note: Surfaces are the most important constraints within MiningMath's constraint hierarchy, allowing you to impose your understanding and take control of prior results and operational aspects.

Average

In the Average subtab, following the Geometric subtab, you can define the minimum, maximum, and weight for the average values of any element or mineral imported as an Average field.

Average constraints can be defined by period ranges, destinations and destination groups.

Illustrative example of multiple destinations, destination groups and period ranges available in the Average constraint window.

For each element or mineral imported, in each destination and period range, you can set the corresponding weight, minimum values, and maximum values.

Example of the CU element imported for the Marvin dataset and respective fields to define its minimum average, maximum average, and weight (block-by-block) for a specific destination and period range.

Weight is defined on a block-by-block basis and is associated with an imported field. The minimum and maximum values serve as limits for the average values of the corresponding material or element within the specific destination and period range (read more).

Sum

In the Sum subtab, following the Average subtab, you can define the minimum and maximum sum for any parameter imported as a Sum field.

Sum constraints can be defined by period ranges, destinations and destination groups.

Illustrative example of multiple destinations, destination groups and period ranges available in the Sum constraint window.

For each field imported, in each destination and period range, you can set the corresponding minimum values and maximum values.

Example of Proc Hours field imported for the Marvin dataset and respective fields to define its minimum and maximum sum for a specific destination and period range.

The minimum and maximum values serve as limits for the sum of the corresponding field within the specific destination and period range (read more).

Overview

The Overview subtab provides a single page summary of all parameters related to the current scenario.
You can use it to review or adjust any General parameters, Destination parameters, and parameters related to Constraints by period ranges.

Due to the number of constraints and imported fields, you might need to scroll horizontally in order to visualise all the fields related to each period range (depicted below).

Once all configurations have been reviewed you can click on Save. 

If there are any files in the scenario’s folder, you will be prompted to confirm whether the files can be replaced. If not, please rename your scenario so that it is saved in a different folder.

Run

Once a scenario has been configured, you can use the Run option to execute the scenario.

Advanced options

Advanced options are also available before executing a scenario.

  • Time limit (h)

    It is possible to indicate a time limit in hours before running a scenario. The time limit is defined in hours due to the usual complexity of mining projects (read more).

  • Run options

    You can set different export options by clicking on the small button under the Run option (read more about output files).

Results

After the scenario finishes running, you will be able to access its report, visualise the results in the Viewer, and see summary information in the decision tree.

Handling Data

Formatting the Block Model

The following formatting specifications are required:

  • Regularized block model: This means all blocks must be the same size.

  • Air blocks must be removed prior to importation. This is the way MiningMath recognizes the topography.

  • Coordinates of each block in the 3 dimensions.

  • Header names should not contain special characters or exceed 13 characters. Follow this recommendation for folder and file names as well.

  • The data format should be a CSV file (Comma-Separated Values), which is compatible with most mining packages. A sketch of how these requirements can be checked before importing is shown after this list.
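
As referenced above, a quick way to verify most of these requirements before importing is to inspect the CSV with a short script. The pandas sketch below is only an illustrative check; it assumes the coordinate columns are named X, Y, and Z and that the file is called block_model.csv, so adapt the names to your own model.

    import pandas as pd

    # Illustrative pre-import checks for a block model CSV.
    # Assumes coordinate columns named "X", "Y", "Z" -- adapt as needed.
    model = pd.read_csv("block_model.csv")  # hypothetical file name

    # Header names: at most 13 characters and ASCII only
    # (a rough proxy for "no special characters").
    for name in model.columns:
        assert len(name) <= 13 and name.isascii(), f"problematic header: {name}"

    # Coordinates must be present for every block.
    assert not model[["X", "Y", "Z"]].isna().any().any(), "missing coordinates"

    # Regularized model: the spacing between consecutive distinct coordinate
    # values should be a single constant along each axis.
    for axis in ("X", "Y", "Z"):
        steps = pd.Series(sorted(model[axis].unique())).diff().dropna().round(6).unique()
        print(axis, "block spacing(s):", steps)   # expect one value per axis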

Good practices

  • Configure Microsoft Windows number formatting to use dot as the decimal separator.

    MiningMath does not recognize CSV files that use a semicolon (;) as the separator. To change the decimal symbol on Windows, follow these steps:
    1. Click Start, type control and click to open Control Panel.
    
    2. Under Clock and Region, click Change date, time, or number formats.
    
    3. Click Additional Settings…, and then manually change the Decimal symbol to a dot.

    Finally, you will need to change the separator in the CSV itself. In Excel, open the file, convert text to columns, and then save it again.

  • Use the metric system. More information about data in the imperial system can be seen here.
  • Set multiple fields that will consider different economic values, material types, contaminant limits, and any other variable you wish to analyze or control.
  • Some mining packages export data with letters and values between quotation marks (“”); check whether this kind of format is interfering with the importation process.
  • If all your values are in the same column, try using the “Text to Columns” option in Excel. This will allow you to break down all the columns using the comma (,) as the correct separator (see the sketch below).
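If you prefer to fix separators outside Excel, a small script can do the same conversion. The sketch below is a minimal example assuming pandas and hypothetical file names; it reads a semicolon-separated file with comma decimals and rewrites it as a comma-separated file with dot decimals.

```python
import pandas as pd

# Hypothetical file names, used for illustration only.
df = pd.read_csv("block_model_semicolon.csv", sep=";", decimal=",")

# Rewrite as a comma-separated CSV with dot decimal separators.
df.to_csv("block_model.csv", sep=",", index=False)
```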

Understanding field types

Field Types are the fields MiningMath can understand. Each imported column should be assigned to the proper field type (as depicted below) so that the software treats each variable according to its meaning.

Options of field types (in orange) to be assigned to each column in the block model data.

Mandatory field types and their meanings

  1. Coordinates X, Y, and Z refer to your geo-referenced information.

  2. Average refers to any variable that could be controlled by means of minimums and maximums considering its average: grades, haulage distance, and other variables.

    There should be at least one element assigned as Average.

  3. Economic Value refers to the columns containing economic data that represent the available destinations for material allocation. At a minimum, the data must include at least one process destination and one waste destination.

    It is possible to import multiple economic values at once, and they may be used simultaneously (ex.: multiple processing streams) or calculated in the internal calculator.

Optional field types and their meanings

  1. Density refers to the block's density. This field is used to calculate the block's tonnage.

  2. Slope refers to slopes varying block-by-block, which gives the flexibility to define slopes by lithotype and sectors.

  3. Recovery refers to recoveries varying block-by-block.

  4. Sum refers to any variable that could be controlled by means of minimums and maximums considering its sum.

  5. Predefined destinations refers to possible fixed destination values. This can be used, for example, to define pushbacks or apply lithologic restrictions that prevent certain blocks from being processed. However, by fixing destinations you prevent MiningMath from reaching its full potential. More about this here.

  6. Other refers to information that you wish to have in the exported outputs.

  7. Skip refers to any variable that should be ignored. This field type might help improve the runtime, since these variables will not be considered nor exported along with the optimization outputs.

Field names shortcut

Shortcuts can be used for automatic recognition in the importation process as in the following image:

The full list of shortcuts is listed below.

Field name        Shortcuts
Coordinates       X | Y | Z
Average           @ | grade
Density           % | dens | sg
Economic value    $ | dest | val
Recovery          * | recov
Slope             / | slope
Sum               +
Skip              !

Formatting conventions

Each software has unique conventions for data formats, naming, and numbering systems. To prevent conflicts when transferring data between different software, ensure you follow the specific procedures required by each. There’s no universal standard—just the correct approach for each tool.

MiningMath convention

The model’s origin must be placed at the bottom of the model, counting from the minimum X, Y, and Z coordinates.

The following image highlights a block model origin at the corner of the first block and the coordinates on its centroid.

Blocks' matrix with origin and coordinates example.

MiningMath uses (X, Y, Z) coordinates in which Z, representing the elevation, increases upwards.

The lowest IZ value is at the bottom of the model, following MiningMath convention.

Good practices

  • Verify Coordinate Ranges: Check the minimum and maximum values for each coordinate (X, Y, and Z) in your CSV file to ensure they fit within the box of your block model.
  • Set the Correct Origin: The origin of your block model must be set to the minimum values (minX, minY, minZ) of the box. Ensure all block centroid coordinates in your CSV are greater than these origin values.
  • Calculate Origins Accurately: If all blocks in your CSV are below the topography, calculate the origin by subtracting half a block size from the minimum centroid value of each axis (see the sketch after this list).
  • Avoid Common Errors:
    1) Ensure the coordinates align with the block dimensions relative to the set origin.
    2) Avoid repeated coordinates in the CSV file to prevent import failures.
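The following is a minimal sketch of these checks, assuming pandas, hypothetical file and column names (X, Y, Z), and 30 m blocks; adjust it to your own headers and dimensions.

```python
import pandas as pd

df = pd.read_csv("block_model.csv")          # hypothetical file and column names
block = {"X": 30.0, "Y": 30.0, "Z": 30.0}    # block dimensions in metres

# Origin = minimum centroid minus half a block on each axis.
origin = {axis: df[axis].min() - size / 2 for axis, size in block.items()}
print("origin:", origin)

# Centroids must sit on the grid defined by the origin and the block size.
for axis, size in block.items():
    index = (df[axis] - origin[axis] - size / 2) / size
    assert (index - index.round()).abs().max() < 1e-6, f"{axis} centroids are off-grid"

# Repeated coordinates would cause the import to fail.
assert not df.duplicated(subset=["X", "Y", "Z"]).any(), "repeated coordinates found"
```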

Air blocks

MiningMath treats all imported blocks of your model as being below the topography. This means it is necessary to remove all the air blocks prior to importation. Unless your topography is totally flat, which is unlikely, the image below shows an example of how your model should be displayed.

Failing to remove air blocks may lead to unsatisfactory results and long processing times, since the optimization would be considering blocks that do not exist in reality.

Example of how a block model should look, with a rectangular base.

More details on air blocks

The following video shows how to remove air blocks using filters in MS Excel. These tips are also applicable to any mining software of your choice.
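If you prefer a scripted approach, the sketch below shows one hedged way to drop air blocks with pandas. The file name and the criterion (air blocks flagged by zero density) are assumptions for illustration only; use whatever air flag, rock code, or topography field your package exports.

```python
import pandas as pd

df = pd.read_csv("block_model.csv")  # hypothetical file name

# Assumed criterion: in this export, air blocks carry zero density.
solid = df[df["DENSITY"] > 0]

solid.to_csv("block_model_no_air.csv", index=False)
print(f"removed {len(df) - len(solid)} air blocks")
```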

Importing the Block Model

To import the block model, select the option New Project on the left panel of MiningMath. 

Afterwards, the file name input field is shown in red, indicating a mandatory field. Browse for and select the CSV formatted file. Press Next to advance.

Project parameters

A series of project parameters will be requested in order to set up your project. These will be asked for in a series of forms, as detailed next.

Project naming

In the next window the Model Name must be entered.

Optionally, the destination folder (Model Folder) can be changed as well as the Scenario Name, and a Scenario Description can be added.

Possible issues

Import errors at this stage are usually related to the origins. We recommend checking the origins from the previous mining package, otherwise MiningMath’s results won’t match the actual coordinates.

You should be aware that MiningMath employs coordinates (X,Y,Z), where Z, representing elevation, starts from the bottom. However, some other mining software may start from the top.

Also check the minimum and maximum values of each coordinate (X, Y and Z) in your CSV file to confirm that they are all within the box of your block model. And always remember that the origin must be the (minX, minY, minZ) of this box, as shown here. Therefore, all the block centroid coordinates in the CSV must be greater than (>) the origin values.

The calculation of the origins is quite simple. Usually, when your CSV file has all blocks below the topography, just take the minimum value of each axis, indicated on the first import screen, and subtract half a block size. This will give you the correct origin.

This error may also be related to other aspects such as incorrect spacing (coordinates that do not respect the block dimensions, considering the set of coordinates and the origin), and repeated coordinates, among others. Therefore, we also recommend reviewing the steps of formatting the block model and importing the block model to ensure there are no inaccuracies.

Read permission is required to import your CSV file, while write permission is necessary for SSMOD file. Ensure that the user or group has appropriate permissions in that folder. For more information on configuring file and folder permissions, please refer to the relevant section in the Windows documentation or contact your system administrator.

MiningMath does not recognize CSV files that use a semicolon (;) as the list separator. To change it on Windows, go to:

'Control Panel' > 'Clock, Language, and Region' > 'Region' > 'Change date, time, or number formats' > 'Additional Settings'

Next, change the List Separator to the comma sign (,). Press OK and OK. Finally, you will need to change the separator on the CSV. On Excel, you will need to open it, convert text to columns, and then save it again.

Imported fields and validation

Upon clicking Next, the following window will provide a statistical summary of the block model information that will be imported. Check the parameters carefully, in particular the economic values.

Summary preview of the imported data for validation.

Geo-reference system, origin, dimension and rotation

Upon clicking Next, the CSV file will be imported into MiningMath and data related to the block model’s geo-reference system (coordinates only) will be shown. The next steps are to enter the rotation in degrees (Azimuth rotation), the origin (according to your mining package) and the block dimensions. The number of blocks is automatically calculated after the origin and dimensions are provided, as depicted below.

Coordinates input. In this example the origin of this project is x=3,475, y=6,480, and z=285, and the block dimensions are 30 meters in each coordinate.

Rotated models

MiningMath supports the use of block models that have been rotated using an Azimuth rotation.

Example of Azimuth rotation in the coordinate system.

The rotation, in degrees, can be entered as depicted below.

Azimuth rotation depicted when hovering over the RZ field.

After importing, you can see the rotated model in the Viewer tab.

Example of rotated model in the viewer tab.

The detailed steps with mathematical formulations for the rotation procedure can be seen here.
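As a rough illustration of what an azimuth rotation does to block centroids, consider the sketch below. The convention shown (clockwise rotation about the model origin, measured from north/Y) is an assumption for illustration only; confirm the exact formulation in the link above and in your mining package.

```python
import numpy as np

def rotate_azimuth(x, y, origin_x, origin_y, azimuth_deg):
    """Rotate centroids clockwise by an azimuth angle about the model origin.

    Assumed convention for illustration; check your package's definition."""
    theta = np.radians(azimuth_deg)
    dx, dy = x - origin_x, y - origin_y
    xr = origin_x + dx * np.cos(theta) + dy * np.sin(theta)
    yr = origin_y - dx * np.sin(theta) + dy * np.cos(theta)
    return xr, yr

# Example: a centroid 100 m north of the origin, rotated by a 30-degree azimuth.
print(rotate_azimuth(np.array([0.0]), np.array([100.0]), 0.0, 0.0, 30.0))
```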

Field type assignment

When Next is selected, the following form will appear, showing correlations between the imported CSV file header and the available field types in MiningMath.

Assigning each column to the proper field type.

You must associate each imported column with one of the options located just above the table, for instance: block coordinates X, Y, and Z with the Coord. X, Y, and Z field types. For more details on how you can correlate each column, access this link. You can also keep the original data from your previous mining package by using this approach.

If you do not already have an Economic Value function, when importing your block model, you will be directed to the Scenario tab. Then, click on the Function tab to calculate your Economic Value function in the internal calculator as explained here.

Notes

  • MiningMath has mandatory variables (columns) to be assigned to the proper Field Type:

    1) Coordinates (X, Y, Z).

    2) Average

    3) Economic Values (at least two)

  • The data validation screen might be overlooked, but it is very important to validate your data based on its minimum and maximum values. Read more.

  • Each column imported should be assigned to the proper field type in order for MiningMath to treat each variable accordingly. Read more.

  • Typically, MiningMath recognizes some columns automatically when their headers are similar to the field type name. Otherwise, MiningMath will automatically assign them to the Sum field type.

    To enable the Next button, the user needs to assign each one of the mandatory variables to its respective field type.

Grade units

The next step is to input the grade units. As in the example below, the copper grade has been defined as a percentage (%), while the gold grade was defined as PPM, which stands for parts per million and is equivalent to g/t.

Informing block dimensions, origin, and grade units.

Evaluate your model

After filling in all the required fields, the options View Model and Setup Scenario will be enabled. Before setting up your first scenario, you can view the model by clicking on View Model (example below).

This evaluation in the Viewer should help you answer questions such as:

  • Where are the high grades distributed?
  • Do the positive process economic values (above zero) match the regions identified in the previous question?
  • How are the waste economic values distributed? Are their maximum and minimum values reasonable when compared with the process values?

Economic Values

MiningMath does not require pre-defined destinations ruled by an arbitrary cut-off grade. Instead, the software uses an Economic Value for each possible destination and for each block. The average grade that delineates whether blocks are classified as ore or waste will be a dynamic consequence of the optimization process.

Destinations required

MiningMath requires at least two mandatory destinations. Therefore, each block must be associated with:

  • 1 Processing stream and its respective economic value

  • 1 Waste dump and its respective economic value

Notes:
  • Even waste blocks might carry processing costs in the economic values of the plant. Therefore, non-profitable blocks would have higher costs when sent to process instead of waste.

  • If you have any material that should be forbidden in the plant, you can use economic values to reduce the complexity and runtime, as mentioned here.

Simplified flow-chart of blocks’ destinations optimization. 

Calculation

Each field related to Economic Value (Process/Waste) must report the value of each block as a function of its destination (Process or Waste in this example), grades, recovery, mining cost, haul costs, treatment costs, blasting costs, selling price, etc. The user is not required to pre-set the destination, as the software will determine the best option during the optimization.

To calculate the Economic Values you can use MiningMath’s internal calculator, available under the “Function” option inside the “Model” tab. To illustrate the calculation of economic values, an example is shown below. The calculation parameters are listed in Table 1.

Description                                  Cu (%)      Au (PPM)
Recovery                                     0.88        0.6
Selling price (Cu: $/t, Au: $/gram)          2000        12
Selling cost (Cu: $/t, Au: $/gram)           720         0.2
Processing cost ($/t)                        4
Mining cost ($/t)                            0.9
Discount rate (%)                            10
Dimensions of the blocks in X, Y, Z (m)      30, 30, 30

Table 1: Parameters for calculating the economic values.

Figure 1: Internal Calculator.

Block Tonnes

  • Block Tonnes = BlockVolume * BlockDensity

  • Block Tonnes = 30*30*30*[Density]

Figure 2: Block model calculations.

Tonnes Cu

  • Tonnes Cu = Block Tonnes x (Grade Cu/100)

  • Tonnes Cu = [BlockTonnes]*([CU]/100)

Figure 3: Block model calculations.

Mass Au

  • Mass Au = Block Tonnes x Grade Au

  • Mass Au = [BlockTonnes]*[AU]

Figure 4: Block model calculations.

Economic Value Process

  • Economic Value Process =
    [Tonnes Cu x Recovery Cu x (Selling Price Cu – Selling Cost Cu)] +
    [Mass Au x Recovery Au x (Selling Price Au – Selling Cost Au)] –
    [Block Tonnes x (Processing Cost + Mining Cost)]

  • Economic Value Process = ([TonnesCu]* 0.88 * (2000–720)) + ([MassAu] * 0.60 * (12 – 0.2)) – ([BlockTonnes] * (4.00 + 0.90))

Figure 5: Process Economic Value calculation.

Economic Value Waste

  • Economic Value Waste = –Block Tonnes x Mining Cost

  • Economic Value Waste = –[BlockTonnes] * 0.9

Figure 6: Economic Value Waste calculation.

The example block in Figures 4-6 would generate -299,880 $ if sent to the process and -55,080.1 $ if discarded as waste. Therefore, this block should go to waste, since that would result in a smaller loss than processing it. MiningMath defines the best destination considering the whole set of constraints over time, so in most cases this decision is far more complex than the example above.
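For reference, the sketch below reproduces these economic value formulas in a few lines of Python, using the Table 1 parameters. The function and variable names are illustrative only, and the grades in the usage example are hypothetical rather than taken from the Marvin model.

```python
BLOCK_VOLUME = 30 * 30 * 30              # m3, from the block dimensions in Table 1
PROCESSING_COST, MINING_COST = 4.0, 0.9  # $/t

def economic_values(cu_pct, au_ppm, density):
    """Return (process value, waste value) in $ for one block, per the formulas above."""
    block_tonnes = BLOCK_VOLUME * density
    tonnes_cu = block_tonnes * cu_pct / 100
    mass_au = block_tonnes * au_ppm                       # grams, since PPM ~ g/t
    process = (tonnes_cu * 0.88 * (2000 - 720)            # Cu revenue
               + mass_au * 0.60 * (12 - 0.2)              # Au revenue
               - block_tonnes * (PROCESSING_COST + MINING_COST))
    waste = -block_tonnes * MINING_COST
    return process, waste

# A hypothetical low-grade block: less negative as waste than as ore.
print(economic_values(cu_pct=0.05, au_ppm=0.02, density=2.75))
```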

Data Validation

Running an optimization for complex projects with several constraints may demand hours just to check whether the formatting has been done properly. Therefore, we present here an efficient scenario to quickly validate your data.

This page uses the Marvin Deposit as an example. To see its parameters and constraints please check the page here.

Validate it First

In order to validate your data and cut its runtime, we strongly recommend running MiningMath FULL with the following set up:

  1. Process and dumps set with respective recovery values.

  2. A production capacity bigger than the expected reserves. In this example, the expected life of mine is 35 years at a production rate of 10 Mt per year. Hence, a value of 1,000 Mt would be big enough to cover the whole reserve.

  3. No discount rate.

  4. No stockpiling.

  5. Density and slope values.

  6. Timeframe: Years (1), since it would all be processed in 1 period.

The figure below depicts this setup in MiningMath, with the relevant fields highlighted.

Results

Results are depicted below, showing blocks in the sequencing, the surface, the surface with blocks, and the production tonnage.

Ultimate pit

The surface returned by this data validation process represents the most economically viable pit shell, also known as the ultimate pit.

Questions

  • Did the scenario run properly?

  • Are most of the positive economic values from the process inside this surface?

  • Is the mining happening in reasonable areas?

  • Is there a reasonable number of periods of life of mine?

Constraints Validation

Continuing the data validation, start adding the first constraints related to your project so that you can understand its maximum potential. The surface generated in this case could also be used as a Restrict Mining surface in the last period to reduce the complexity of your block model and the runtime of MiningMath, since it already reflects the set of constraints inputted.

Example

  • Set up a scenario with 1,000 Mt in the processing plants, which corresponds to a lot more mass than expected in the whole life of the mine.

  • Add the Minimum Bottom Width (100m). This constraint will allow you to have a suitable work-front for your equipment.

  • Restrict Mining surface, if you have this constraint in your project.

  • Grade constraint of up to 0.7%.

  • Timeframe: Years (1), since it would all be processed in 1 period.

Note: Sum constraints can restrict the total amount of handling material (ore + waste) of the mine. Therefore, do not use them in the validation.

Let's make everything clear

Now that the Constraints Validation step is done, you are able to use this final surface as a guide for future optimizations. This approach reduces the runtime and the complexity of the algorithm because, when it is taken into account, the blocks below this final optimized surface won’t be considered and the heuristics inside the interface are facilitated. Notice that we did not make any change to the discount rate, thus this first NPV does not represent reality. If you need an accurate result at this step, make sure to adjust it.

It’s important to remember that when we restrict mining to this surface, the number of periods generated in future runs could be reduced, because the average parameters of each one will have to meet the constraints of the overall package. Therefore, to achieve the same parameters in a lower timeframe, some blocks may be discarded due to the mining sequence and the optimization of destinations inside the whole mass.

Having this idea in mind, you should already have enough information to decide and structure the next step of the optimization. Based on the amount mined in the last item and on the processing capacity, define a good timeframe to identify the mining sequence. In this case, we had 231 Mt of total ore mass to split across almost 23 years, since the processing capacity is 10 Mt per year.

To improve efficiency in the optimization, before working on a yearly basis, we decided to consider the first 5 years. It is reasonable to generate a 10-year surface and consider the optimization inside this limit, due to the observations made before. Remember that each assumption here can be made according to your project’s demands and that MiningMath can work with any timeframe to meet your needs.

Exporting Data

Exporting the Model

Select the button Export Model on MiningMath’s Model tab, as shown below.

Figure 1: Clicking on Export.

After clicking on Export, a new page will appear, allowing you to select the folder where the exported block model will be saved, as well as its name.

Figure 2: Exporting data.

Just click on “Next” and your model will be exported to the selected folder.

Public Datasets

MiningMath allows you to learn, practice, and demonstrate the concepts of Strategy Optimization with its full capabilities using the Marvin deposit, including any previously run scenario. This version is freely available to mining professionals, researchers, and students who want to develop their skills with this standard block model.

Marvin Deposit

DB Information

Below are listed the default parameters for Marvin, according to the adaptations made in our formatted model.

Parameter               Value
Block size              27,000 m³ (X = 30 m, Y = 30 m, Z = 30 m)
AU - Selling Price      12 $/g
AU - Selling Cost       0.2 $/g
AU - Recovery           0.60
CU - Selling Price      2000 $/ton
CU - Selling Cost       720 $/ton
CU - Recovery           0.88
Mining Cost             0.9 $/ton
Processing Cost         4.0 $/ton
Discount Rate           10% per year
Default Density         2.75 t/m³
Default Slope Angles    45 degrees

Some common constraints applied to the Marvin deposit are listed below.

Constraint                        Value
Processing capacity               10 Mt per year
Total movement                    40 Mt per year
Sum of processing hours           4,000 per year (detailed estimate of the plant throughput)
Vertical rate of advance          150 m per year
Copper grade                      Limited to 0.7%
Minimum Mining Width              50 m
Minimum Bottom Width              100 m
Restrict Mining Surface           A surface in .csv format (for example, due to a processing plant in the area)
Fixed Mining (Stockpiling)        0.9 $/t
Rehandling cost (Stockpiling)     0.2 $/t

Economic Values

  • Process Function = BlockSize * Density * [GradeCU/100 * RecoveryCU * (SellingPriceCU – SellingCostCU) + GradeAU * RecoveryAU * (SellingPriceAU – SellingCostAU) – (ProcessingCost + MiningCost)]
  • Waste Function = – BlockSize * Density * MiningCost

McLaughlin Deposit

DB Information

Below are listed the default parameters for the McLaughlin deposit, according to the adaptations made in our formatted model.

Parameter              Value
Block size             X = 7.62 m (25 ft), Y = 7.62 m (25 ft), Z = 6.096 m (20 ft)
AU - Selling Price     900 $/oz
AU - Recovery          0.90
Mining Cost            1.32 $/ton
Processing Cost        19 $/ton
Discount Rate          15% per year
Default Density        3.0 t/m³
Default Slope Angle    45 degrees

Economic Values

  • Process Function = BlockSize * Density * [GradeAU * RecoveryAU * (SellingPriceAU) – (ProcessingCost + MiningCost)]
  • Waste Function = – BlockSize * Density * MiningCost

Output files

The Execution Options or Run Options allow the user to define:

  • Files to be exported.

  • The visual results to be automatically shown on the viewer after each run.

Figure 1 highlights in (A) where the user can trigger this pop-up window and in (B) the options available, among which the user can:

  • Export/not export to CSV files:

    • The resulting surfaces
    • The resulting model in two ways: all blocks or only mined blocks, with/without coordinates and/or index information.
  • Set which results to be shown on the viewer:

    • Surfaces
    • Model
Figure 1: Execution options.

MiningMath automatically produces:

  • Formatted reports (XLSX files).

  • Tables (CSV) whose data feeds the reports.

  • Updated block model (MinedBlocks or AllBlocks).

  • Surfaces as a grid of points (CSV)

MiningMath organizes files, as listed below:


    • Model Folder
      • MiningMath Model file (.SSMOD).
      • MiningMath Project file (.SSPRJ).
      • Scenario folder
        • Output Block Model
          • MinedBlocks.CSV contains information about the mined blocks.
          • AllBlocks.CSV, when requested, contains information about all blocks.
        • Scenario file (.SSSCN) is an XML file read by the interface. Use it for a quick check on the parameters used.
        • Report file (.XLSX) summarizes some quantifiable results, including charts such as productions, average grades, and NPV.
        • MiningMath also generates independent report files (.CSV), containing the data present in the report file (XLSX), as a backup:
          • Production Process.
          • Production Dump.
          • Production Total.
          • Grade Process.
          • Grade Dump.
          • Metal Process.
          • NPV.
          • Cumulative NPV.
        • Surface files (Surface-##.CSV) formatted as a grid of points.

SSMOD and SSPRJ are important to report any issues you face.

After each optimization, MiningMath exports the block model in one of two formats:

  • MinedBlocks.csv: This file presents only the blocks that have been mined in each scenario. Mined blocks are exported by default, as this is a lighter file.

  • AllBlocks.csv: The All Blocks file presents all the blocks, whether mined or not, in each scenario, so it is basically the original block model along with the resulting information from the optimization.

The resulting model includes all imported columns (except the skipped ones) plus the following information:

  • Mined Block shows whether (1) or not (0) a block has been mined.

  • Period Mined shows in which period a block has been mined (-99 for blocks that have not been mined). To learn more about the mining sequence within a period, access here.

  • Period Processed shows in which period a block has been processed (-99 for blocks that have not been processed).

  • Destination informs the destination of each block — according to the order in which the user has added processing stream(s) and waste dump(s).
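As a quick way to inspect these outputs outside MiningMath, the hedged sketch below loads a MinedBlocks.csv with pandas and summarizes it by period. The column names follow the descriptions above, but the exact headers in your export may differ, so treat them as assumptions.

```python
import pandas as pd

blocks = pd.read_csv("MinedBlocks.csv")   # path to a scenario's output model

# Number of mined blocks per period (column name assumed from the list above).
print(blocks.groupby("Period Mined").size())

# Blocks processed in the first period, split by destination.
first_period = blocks[blocks["Period Processed"] == 1]
print(first_period.groupby("Destination").size())
```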

Figure 2 shows where the user can switch between these options.

  1. Click on the highlighted Execution button (A) to open the Run Options (B).

  2. Select All blocks in model or Only mined blocks, as you need.

  3. Hit OK, then Run.

By default, MiningMath exports the MinedBlocks file as the block model output.

MiningMath will generate a report directly in Microsoft Excel, as shown in the following image, and display the optimized pit (blocks and surface) in the viewer if the user chooses this option (right figure above). The automatic preview shows only the mined blocks, colored according to each mining period defined by the scheduler.

The results presented in the Excel spreadsheet show, in the Charts tab, the graphs relative to the reported results calculated in the Report tab. The processed mass, discarded mass, stock development, Au/Cu percentages in the process, Au/Cu percentages in the dump, metal contained in the process, net present value and cumulative net present value are arranged individually in the Production Process 1, Production Dump 1, Stock Process 1, AU/CU – Grade Process 1, AU/CU – Grade Dump 1, AU/CU – Metal Process 1, NPV and Cumulative NPV tabs, respectively.

Figure 2: Results report.

By default, MiningMath exports only the Mined Blocks file, showing the blocks by period in the viewer, as in the following illustration. The user can change any exporting options in the Run Options menu.

Figure 3: Visual results.

If the user chooses to export the model, MiningMath will automatically save the list of the scheduled blocks (MinedBlocks.csv) or all blocks (AllBlocks.csv) in the block model folder, as shown in the figure below, which can be imported into other mining software packages.

The files MinedBlocks.csv and AllBlocks.csv may contain indices and/or block coordinates, and all the imported data/parameters along with the following information:

Figure 4: Mined blocks.
  • Mined Block shows whether (1) or not (0) a block has been mined.

  • Period Mined shows in which period a block has been mined (-99 for blocks that have not been mined).

  • Period Processed shows in which period a block has been processed (-99 for blocks that have not been processed).

  • Destination informs the destination of each block — according to the order in which the user has added processing stream(s) and waste dump(s).

Video 1: Outputs and files’ hierarchy.

Workflow

Super Best Case

In the search for the upside potential for the NPV of a given project, this setup explores the whole solution space without any other constraints but processing capacities, in a global multi-period optimization fully focused on maximizing the project’s discounted cashflow.

As MiningMath optimizes all periods simultaneously, without the need for revenue factors, it has the potential to find higher NPVs than traditional procedures based on LG/Pseudoflow nested pits, which do not account for processing capacities (gap problems), destination optimization and discount rate. Traditionally, these, and many other, real-life aspects are only accounted for later, through a stepwise process, limiting the potentials of the project.

MiningMath vs Traditional Technologies

MiningMath’s Super Best Case serves as a reference to challenge the best case obtained by other means, including more recent academic/commercial DBS technologies available. See a detailed comparison of these two approaches below.

In modern/traditional technology, large size differences between consecutive periods may render them impractical, leading to the “gap” problem. Such a gap is caused by a scaling revenue factor that might prevent a large area from being mined until some threshold value is tested. MiningMath allows you to control the entire production without oscillations, thanks to its global optimization.

In the modern/traditional methodology, decisions on block destinations can be taken following techniques such as: fixed predefined values based on grades/lithologies; post-processing cut-off optimization based on economics; post-processing based on math programming; or even multiple rounds combining these techniques. With MiningMath, destination optimization happens within a global optimization in a single step, maximizing NPV and accounting simultaneously for capacities, sinking rates, widths, discounting, blending, and many other required constraints.

Modern technology is restricted to pre-defined, less diverse sequences because it is based on a step-wise process built upon revenue factor variation, nested pits, and pushbacks. These steps limit the solution space for the whole process. MiningMath performs a global optimization, without previous steps limiting the solution space at each change. Hence, a completely different scenario can appear, increasing the variety of solutions.

Due to tonnage restrictions, modern technology might need to mine partial benches in certain periods. With MiningMath’s technology, there isn’t such a division. MiningMath navigates through the solution space by using surfaces that will never result in split benches, leading to a more precise optimization.

Modern approaches present a difference between the optimization input parameters for OSA (Overall Slope Angle) and what is measured from output pit shells, due to the use of the “block precedence” methodology. MiningMath works with “surface-constrained production scheduling” instead. It defines surfaces that describe the group of blocks that should be mined, or not, considering productions required, and points that could be placed anywhere along the Z-axis. This flexibility allows the elevation to be above, below, or matching a block’s centroid, which ensures that MiningMath’s algorithm can control the OSA precisely, with no errors that could have a strong impact on transition zones.

Example

Setting up the Super Best Case is simple. There are only two necessary restrictions:

  1. Processing capacity: 10 Mt per year.

  2. Timeframe: Years (1).

Depending on your block model, additional parameters may need to be specified. For example, if you have multiple destinations these could be added for proper destination optimization. The figure below provides a comprehensive overview, highlighting the essential parameters required for running the Super Best Case scenario using the pre-installed Marvin dataset.

Results

Results can be analysed in the Viewer tab and in the exported report file. For the pre-installed Marvin dataset, note how the sequencing has no gap problems, and the production is kept close to the limit without violating any restrictions.

Super Best Case Sequencing
Sequencing Slice
Super Best Case production tonnages

Export files

The block periods and destinations optimized by MiningMath’s Super Best Case (or any other scenario) can be exported in a CSV format. You could use these results to import back into your preferred mining package, for comparison, pushback design or scheduling purposes. Export options are depicted below.

Adding constraints

A refinement of the Super Best Case could be done by adding more constraints, preferably one at a time, to evaluate the impact of each on “reserves”, potential conflicts between them, and so on. You can follow the suggestions in the sections below for this refinement.

Optimized Pushbacks

MiningMath offers the option of producing optimized, single-step pushbacks with controlled ore production and operational designs. This procedure is important to ensure the financial and operational viability of the mining project, as excessively large volumes can render the project unfeasible, while excessively small volumes can result in resource wastage or missed opportunities for ore extraction.

By testing different volumes, it is possible to find an optimal point that maximizes the net present value (NPV) of the project.

MiningMath's single-step methodology to generate a diverse range of pushbacks straight from the block model.

How does it work?

MiningMath utilizes timeframes to generate pushbacks at different levels of detail. Timeframes are time intervals that divide the mine’s lifespan into smaller periods. Different timeframes allow users to perform a fast evaluation of the impact of production volume on the NPV. If necessary, adjustments can be made to optimize production and reduce costs.

In Pushback Optimization, multiple optimized pushback scenarios are created with varying levels of detail, enabling users to have a comprehensive view of the impact of volume variations on the project’s performance.

Single-step approach

Every pushback produced with MiningMath is created in a single step, straight from the block model, taking into consideration geometric constraints such as minimum bottom width and minimum mining width, controlling tonnages, blending and other requirements.

Multiple single-step pushback scenarios can be created with varying levels of detail, giving users a higher variety of options and a comprehensive view of the impact of volume variations on the project’s performance.

Create a Pushback

You can identify timeframe intervals in your project so that you can work with grouped periods before getting into a detailed insight. This strategy allows you to run scenarios faster without losing flexibility or adding dilution to the optimization, which happens when we reblock.

The idea is to make each optimized period represent biennial, triennial, or decennial plans. MiningMath allows you to do this easily by simply adjusting some constraints to fit the selected timeframe. Notice that in this example the processing capacity was not fully achieved, and this kind of approach helps us to understand which constraints are interfering the most in the results.

Example

Property                    Value
Timeframe custom factor     5
Processing capacity         50 Mt in 5 years
Dump capacity               150 Mt in 5 years
Vertical rate of advance    750 m in 5 years
Minimum Mining Width        100 m
Minimum Bottom Width        100 m
Restrict Mining Surface     Optional
Grade copper                0.88%
Stockpiling parameters      On

Note: Waste control and vertical rate of advance are not recommended if you are just looking for pushback shapes.

Work Through Different Timeframes

Given the previous initial scenario, you might want to consider different timeframes for your pushback design. In order to perform a Pushback Optimization, the timeframes (in green), process and dump production limits (in green) and the vertical rate (in red) will be adjusted.

By varying the highlighted parameters above, the following decision tree has been constructed for Pushback Optimization.

Three different timeframes are explored: 3 years, 5 years, and 10 years. Each timeframe is associated with specific process and dump production limits. Such limits not only scale with their respective timeframes but also allow for variations that provide flexibility for testing different production scenarios. Finally, the vertical rate is also adjusted to align with the defined timeframe of each scenario. For instance, the vertical rate is set to 450m for the 3-year timeframe, 750m for the 5-year timeframe, and 1500m for the 10-year timeframe.

Afterward, specific results were carefully selected for comparison, focusing on key parameters such as Net Present Value (NPV), production process, and production dump.

More details

The two constraints inputted in the Production tab relate to the maximum material handling allowed; the third one is about the processing equipment capacity, and the vertical rate of advance relates to the depth that could be achieved, adjusted to this interval. The minimum mining width was added because we are already generating designed surfaces that could later be used as guidance for detailed schedules, thus it should respect this parameter due to the equipment sizing. Parameters such as average grade, minimum bottom width and restrict mining surface do not change across the timeframes.

It’s important to remember that the packages of time here don’t necessarily have to correspond to identical sets of years. You could propose intervals with different constraints until reaching reasonable/achievable shapes for the design of ramps, for example. If you wish to produce more operational results, easier to design and closer to real-life operations, try playing with wider mining/bottom widths. Those changes will not necessarily reduce the NPV of your project.

With this approach, the discount rate serves just as a rough NPV approximation and does not affect the quality of the solution much, given that the best materials, following the required constraints, will be allocated to the first packages anyway.


NPV Upside Potential

NPV Upside Potential is the process of generating and analyzing scenarios to measure the impact of each constraint on the project’s net present value (NPV), from the Super Best Case to a detailed setup. Measuring the impact of each constraint on the NPV is important to assess the financial impact and ensure the project’s viability under different scenarios and conditions. Each constraint can have a significant impact on the project’s NPV, and it is crucial to understand how they affect the project’s financial performance.

By evaluating the impact of each constraint on the project’s NPV, it is possible to identify financial bottlenecks and opportunities for improvement, as well as prioritize problem resolution. This can result in better resource allocation and cost reduction, enhancing the project’s profitability and viability.

In NPV Upside Potential, scenarios are created that sequentially incorporate each constraint of the project, allowing users to have a comprehensive view of the impact of each constraint on the project’s performance. In case more efficiency is needed, the resulting surface obtained on the Constraints Validation or in Best Case refinements could be used as Restrict Mining in the last interval, which might reduce the complexity and the runtime. 

Example

To illustrate this process, let’s consider the base scenario of the Marvin dataset (shown in the figure below). The highlighted green fields represent all the targeted constraints that need to be controlled in this project: process capacity, minimum average grade of CU in process, dump capacity, bottom minimum width, mining minimum width and maximum vertical rate.

The decision tree depicted below has been constructed for a NPV Upside Potential process, based on the above scenario. In this decision tree, the scenarios progressively introduce each constraint into the project.

The target scenario is the last one, with the following restrictions: Process Production = 10 Mt, Dump Production = 30 Mt, Bottom Width = 100 m, Mining Width = 100 m, Vertical Rate = 150 m, and average CU = 0.5. However, the constraints are added iteratively, starting with the process production, followed by the dump production, widths, and so on.

Note how the cumulative NPV usually decreases (as expected) when more restrictions are added (see the note at the end for exceptions). Without this iterative process, there might be a lack of information to understand the NPV of the final, desired scenario.

Best-Worst Range Analysis

Best-Worst Range Analysis is the process of generating and analyzing scenarios to measure the impact of mine width constraints on the project’s net present value (NPV), from no restriction to wide widths. Measuring the impact of mine width constraints is crucial to determine the optimal fleet equipment configuration in mining operations, with the aim of optimizing productivity and maximizing the net present value (NPV) of the project.

By analyzing variations in width constraints, it is possible to identify the effect of space limitations on mining operations and evaluate the influence of different bench widths on fleet performance. Appropriate mining widths can bring a series of benefits: higher amount of material to be simultaneously extracted; higher fleet productivity; more efficient transportation; easier road maintenance and so on. Hence, the search for different widths allows finding the best combination of equipment and mining techniques aimed at maximizing production and profit simultaneously in each scenario.

In a Best-Worst Range Analysis, scenarios are created gradually increasing the mine width up to a feasible maximum, allowing users to have a comprehensive view of the impact of space limitations on the project’s performance.

Considering the nature of global optimization and the non-linearity of the problem, it is expected that there will be variations in performance (NPV, production, amount of mine fronts, etc.) as parameter values are modified. Therefore, it is crucial to generate a large number of scenarios to obtain a comprehensive analysis of the impact of these variations on the project. This way, a more precise understanding of how different parameter values affect the overall performance can be achieved.

Example

Consider the following base scenario and decision tree built for a Best-Worst Range Analysis using the Marvin dataset.

The goal in this case is to understand the impact of different values of mining width (in green), which will be tested with a range of different values, from 0m up to 200m. 

Note that there is no linear relationship between mining width and NPV. In other words, a higher mining width does not necessarily imply a lower NPV. As previously mentioned, that is due to the non-linearity of the problem.

Considering the nature of global optimization employed in MiningMath, other variables might also be affected by different mining widths. For example, the production could be analyzed for identification of possible issues when employing different mining widths.

Selectivity Analysis

Selectivity Analysis is the process of generating and analyzing scenarios to measure the impact of all geometric constraints on the project’s net present value (NPV), from the most selective to the least selective setup. Analyzing the impact of variations in geometric constraints is important to determine the optimal mine configuration and optimize productivity and profit.

By performing such an analysis, it is possible to identify the effect of geometric limitations on mining operations. Moreover, it is possible to evaluate the influence of each parameter and its variation on mine performance. This allows finding the best combination of parameters and mining techniques aimed at maximizing production and profit for each scenario.

In a Selectivity Analysis, scenarios are created including each geometric constraint sequentially and gradually increasing or decreasing their values from the least selective until the desirable requirement. This allows users to have a comprehensive view of the impact of geometric limitations on the project’s performance.

Considering the nature of global optimization and the non-linearity of the problem, it is expected that there will be variations in performance  (NPV, production, amount of mine fronts, etc.) as parameter values are modified. Therefore, it is crucial to generate a large number of scenarios to perform a comprehensive analysis of the impact of these variations on the project.

Example

Consider the following base scenario and decision tree built for a Selectivity Analysis using the Marvin dataset.

The goal is to understand the impact of different values of geometric constraints (mining width, bottom width, and vertical rate of advance). The geometric parameters (in green) will be tested with a range of different values: bottom width with values from 0m up to 200m; mining width with values from 0m up to 200m; and vertical rate of advance with values from 50m up to 300m. In this example, 26 different scenarios were evaluated. 

Note that there is no linear relationship between geometric constraints and NPV. In other words, a higher width or lower vertical rate of advance do not necessarily imply a lower NPV. As previously mentioned, that is due to the non-linearity of the problem. The cumulative NPV of the scenarios is compared in the graph below.

A diverse range of results can be achieved with a Selectivity Analysis. However, there are usually two possibilities when they are compared:

  • Contrasting geometric parameters with small NPV variations: note that when the bottom width changes from 0m to 80m, with the remaining parameters fixed, the NPV drops from 454 M$ to 444 M$. This indicates that large changes in geometric constraints do not necessarily lead to large changes in the NPV. The same holds for the scenarios SA_BW000_MW100_VR150 and SA_BW100_MW100_VR250.

  • Similar geometric parameters with larger NPV variations: when comparing scenarios SA_BW080_MW100_VR150 and SA_BW100_MW160_VR150 there is a drop in NPV from 444 M$ to 370 M$, highlighting that the 20m and 60m changes in bottom width and mining width, respectively, can lead to a larger NPV difference in the project.

In conclusion, it is important to create several scenarios in a Selectivity Analysis. As exemplified above, results can be quite similar or quite different due to the non-linearity of the problem. Considering the nature of global optimization employed in MiningMath, it is also important to evaluate other indicators. The figures below depict the tonnage achieved for the production, demonstrating the possible impacts for different geometric constraints.

Design Enhancement

Design Enhancement is the process of creating scenarios to conduct extensive searches for solutions with similar NPV values but with fewer violations and improved shapes. This process allows finding more efficient and sustainable solutions that meet specific mine constraints and needs. Hence, seeking these kinds of scenarios is important for optimizing mining operations and for reducing risks and costs. 

In Design Enhancement, scenarios are created with more rigorous geometric constraints without compromising the desirable requirements. The goal is to reduce violations and find better shapes for the project. This is possible due to the global nature of the optimization and the non-linearity of the problem, enabling the use of stricter requirements for the geometric constraints that could lead to a better performance of the project.

Example

Consider the initial scenario and respective decision tree built for a Design Enhancement process. The goal is to evaluate stricter variations in the geometric constraints (in green).

Note the variation of results for cumulative NPV and production. The base scenario has a small violation of the dump production in period 1. However, when modifying the minimum mining width to 120m (scenario DE_BW100_MW120_VR150), this violation is no longer present. Hence, this is an example of how small variations in geometric constraints can lead to fewer violations.

NPV Enhancement

NPV Enhancement is the process of creating scenarios to conduct extensive searches for solutions with higher net present value (NPV) values and similar violations, while considering minimum requirements for project constraints. Scenarios are created that gradually modify constraints from desirable requirements to minimum requirements, with the goal of increasing project profitability.

Example

Consider the initial scenario and respective decision tree built for a NPV Enhancement process. The constraints in green (production capacities, geometric constraints, and average CU) are considered for modifications, from desirable requirements to minimum requirements, in order to identify solutions with higher net present value (NPV).

Results show a high variation in NPV while the production remains in its limits. Hence, it shows that it is possible to achieve higher NPVs when employing minimum requirements defined by the user.

Bottleneck Analysis

Bottleneck Analysis involves generating scenarios to conduct extensive searches for solutions with fewer violations while preserving NPV, keeping geometries, optimizing mining operations, and reducing risks. This allows for the discovery of more efficient and sustainable solutions that meet specific constraints and needs of the mine.

In Bottleneck Analysis, after analyzing a desirable scenario it is possible to identify the constraint/s with demanding requirements that directly impact the optimization results and cause significant violation issues. Then, scenarios should be created by relaxing these demanding parameters, enabling users to make decisions to mitigate risks and ensure project viability.

Example

Consider the base scenario overview and the respective report on the dump production. Note how the first period has violated the 30Mt constraint.

A Bottleneck Analysis can help us identify the constraint(s) with demanding requirements that directly impact the optimization results and cause the violation in the dump production. Four different scenarios are built using a decision tree to analyze different values for dump production limits, minimum average of CU and vertical rate of advance.

Note: To decide which parameters need to be changed, you can consider the constraint priority order that MiningMath employs in order to always deliver a solution. However, adjustments usually depend on the unique characteristics of each project and the flexibility available to modify its requirements.

The graphs below depict a comparative analysis of the results for the scenarios in the decision tree.

This analysis shows that the minimum average constraint of CU, production dump and vertical rate of advance are all restricting the base scenario. When relaxing these parameters, there is an increase of approximately 5% in the cumulative NPV, while the dump productions are kept within their limits and the process productions are closer to the target for some scenarios.

Multivariate Sensitivity Analysis

Multivariate Sensitivity Analysis is the process of creating and analyzing scenarios based on a range of possible values for selected constraints. Analyzing the impact of constraints variations is important for determining the optimal mine configuration and for optimizing productivity and profitability.

In Multivariate Sensitivity Analysis, scenarios are created gradually increasing or gradually decreasing the values of the constraints within a desired range, covering all combinations of values. This allows users to have a comprehensive view of the impact of combinations of constraint values on the project’s performance.

Considering the nature of global optimization and the nonlinearity of the problem, it is expected that there will be variations in performance as certain parameter values are modified. Therefore, it is crucial to generate a large number of scenarios to obtain a comprehensive analysis of the impact of these variations on the project.

Example

Consider the base scenario overview and the respective decision tree built for a Multivariate Sensitivity Analysis depicted below.

Base scenario for Multivariate Sensitivity Analysis using the Marvin dataset
Base scenario for Multivariate Sensitivity Analysis

All the scenarios in decision tree were executed and a set of best and worst results were chosen to be depicted in the graphs below.

Cumulative NPV (M$) achieved for two selected scenarios of the decision tree.
Production Process (Mt) achieved for two selected scenarios of the decision tree.
Production Dump (Mt) achieved for two selected scenarios of the decision tree.

The evaluation was conducted by analyzing both cumulative NPV and production stability. The results indicate that:

  1. A diverse range of cumulative NPVs can be achieved when compared to the Base scenario.
  2. Certain productions can be more stable than others.
  3. Certain productions can be violated, such as the dump in this example.

These observations demonstrate the importance of performing a Multivariate Sensitivity Analysis.

Optimized Schedules

MiningMath software allows mining engineers to improve their strategic analysis through risk assessments that are unconstrained by a step-wise approach to optimization. MiningMath’s global mining optimization methodology helps to integrate multiple areas of the business. It handles all parameters simultaneously, delivering multiple scenarios and accounting for both strategic and tactical aspects.

MiningMath's single-step methodology to generate a diverse range of schedules with short-term integration straight from the block model.

Run your first project

You can check a sequence of pages to learn how to run your first project with our Getting Started training. From installation process and formatting your model files up to the long-term planning of your project.

Hundreds of unseen distinctive solutions

MiningMath provides different views and solutions for each parameter changed and each possible objective on the same mine. Search through our extensive set of workflows to improve your projects and generate optimized schedules.

Do not over constrain

When using a single-step methodology, it is important not to target infeasible results. MiningMath provides a diverse range of workflows that can help you understand and optimize your project, as exemplified below.

Short-term Planning

MiningMath allows the integration between long and short term. By running the Best Case, surfaces to guide the optimization were generated, and they could be used as a guide based on the NPV upper bound. The Exploratory Analysis provides insights on what the challenges of our project could be, as well as operational designs that could be used in further steps. At last, we obtained a detailed schedule by using, or not, a surface, which could be the final pit or any intermediary one, as a guide.

Considering this workflow, you may now have enough information on a reasonable long-term view to enhance the adherence/reconciliation of your plans. You could choose a surface and use it as Force and Restrict Mining to refine everything inside it. Remember that Force Mining makes the mining reach at least the surface inserted, which means that all the material inside its limits should be extracted, respecting the slope angles, while Restrict Mining prohibits the area below the inserted surface from being mined until the period in which it has been applied.

Thus, MiningMath will reach this exact surface in the required timeframe and enable you to test different geometries, blending constraints, and any other variable that could be required in short-term planning without interfering with the long-term overview. Additional helpful features in these refinements are the concepts of mining fronts and design optimization, based on surface modifications, which can be done respecting all the parameters and generating results according to your needs.

Figure 1: Results generated using different helpful features.

Example

  • Timeframe: Custom factor (0.5)
  • Processing capacity: 5 Mt per semester
  • Total movement: 20 Mt per semester
  • Vertical rate of advance: 60 m per semester
  • Minimum mining width: 120 m
  • Bottom width: 100 m
  • Force and Restrict Mining surface: Surface005 from Schedule Optimization
  • Stockpiling parameters: On
  • Play with steeper slope angles in the short term? Yes

Table 1: Set of constraints example (1).

Results examples

Further details

The example above used fewer constraints, geometries were changed, and the average grade was left free. It is very helpful to define the early years based on a semester timeframe, which can help you manage stocks and any other variables in the first 3 years, for instance. Note that the period ranges in MiningMath are based on the timeframe selected; therefore, you should adjust your variables according to this value.

When we use Force+Restrict, we are telling the optimizer to break this volume into pieces and to mine it entirely, even if it is waste, so that the long-term view is respected. This way, you keep considering the whole deposit while deciding what to do in the first periods. The approach here is quite different from a set of Revenue Factors for a series of LG/Pseudoflow runs, followed by adjustments to find pushbacks without mathematical optimization criteria. It is worth mentioning that this kind of suggestion should only be applied at the beginning or at the end of the life of mine, since Force+Restrict Mining surfaces used in intermediate periods could interfere directly with the results.

Using timeframes

Another strategy is to optimize the short-term along with the long-term using different timeframes. In this approach, the integration between the short and long term visions is made in the same optimization process, facilitating the analysis and strategic definitions.

It is possible to consider:

  • shorter time horizons (weeks, months, quarters...) for the initial periods of the operation;

  • annual plans as far as needed, for a precise definition of discounted cash flow;

  • less detail for longer time horizons. They need to be considered in the overall view of the mine, up to exhaustion, but they consume optimization processing time that can be more focused on the early years of operation.

Thus, value is maximized at the strategic level while feasibility is ensured at the tactical level, simultaneously. In addition, compliance and reconciliation problems are minimized and communication between teams is improved by working in an integrated system.

In this strategy, each period range represents the time interval chosen in the timeframe, and the discount rate is adjusted in alignment with that choice. Other constraints, such as production and vertical rate of advance (VRA), must be adjusted to match each interval in the period ranges.
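As a rough illustration of these adjustments, the sketch below (Python, not part of MiningMath) converts an annual discount rate and annual limits into per-period values for a shorter timeframe. It assumes a compound-equivalent rate and simple pro-rata scaling; the actual values you enter should follow your own project conventions.

```python
# Sketch: converting annual planning parameters to a shorter timeframe.
# Assumptions: compound-equivalent discount rate and pro-rata scaling of
# capacities and vertical rate of advance (VRA); adapt to your own conventions.

def per_period_discount_rate(annual_rate, periods_per_year):
    """Compound-equivalent discount rate for one period of the chosen timeframe."""
    return (1.0 + annual_rate) ** (1.0 / periods_per_year) - 1.0

def scale_annual_limit(annual_limit, periods_per_year):
    """Pro-rata share of an annual limit (tonnage, VRA, ...) for one period."""
    return annual_limit / periods_per_year

if __name__ == "__main__":
    annual_rate = 0.10          # 10% per year
    annual_processing = 10.0    # Mt per year
    annual_vra = 120.0          # m per year
    for label, n in [("semester", 2), ("quarter", 4), ("month", 12)]:
        print(f"{label}: rate = {per_period_discount_rate(annual_rate, n):.4%}, "
              f"processing = {scale_annual_limit(annual_processing, n):.2f} Mt, "
              f"VRA = {scale_annual_limit(annual_vra, n):.1f} m")
```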

In order to clarify this strategy, Table 2 and Figure 9 present a possible list of constraints for an example using timeframes:

Table 2: Set of constraints for a timeframe example.

Constraints chosen in the interface for a timeframe example.

Multi-mine

MiningMath’s global optimization algorithm effectively addresses the challenges of integrated multi-mine projects by considering all pits simultaneously. Unlike individual pit optimization, this approach delivers a comprehensive solution that optimizes the entire project, providing a more cohesive and strategic overview.

Multiple pits projects

Formatting the block model

For multi-mine projects, the block model must include all mining regions for simultaneous optimization. If your pits are mapped in separate datasets, it’s essential to follow the steps outlined below:

  1. Work with a single block model or single pit first, run the initial tests and understand this region before handling the block model modification.

  2. Try to eliminate meaningless blocks, which would not affect the solution and could increase complexity.

  3. Add a second model or pit to explore the process of working with multi-mine projects. This combined block model file should meet the same requirements as a single model, as outlined on the data formatting page, ensuring unified characteristics.

    Experiment with surface adjustments to refine results, filter out regions you don’t wish to mine, and apply other guidance as needed. Since MiningMath surface files maintain a consistent order, using an Excel file (available here) can be a helpful tool for these modifications.

    Use mining fronts if you’d like to control the material extracted from each region.

  4. Add the remaining regions and apply any features you wish.

Geometric constraints

The current version of MiningMath applies the same values for vertical rate, bottom width, and mining width across the entire block model. However, in a multi-pit scenario, each pit may have unique geometric parameters that impact selectivity. In these cases, we recommend setting the parameters for one pit, fixing its solutions (as the force and restrict mining settings have the highest priority), and then starting the optimization of the other pits. This approach ensures that the optimization considers the mass already planned for extraction from the first pit.

Example workflow

An efficient workflow starts by running an initial scenario without geometric parameters to serve as a validation or best-case scenario for scheduling optimization. Next, configure a scenario using the geometric parameters of the most selective mine—meaning the smallest widths and highest vertical rate (VR)—to create the least constrained scenario in terms of geometry. The surfaces generated from this setup can then be used to fix solutions for Mine 1.

Surface obtained in the first optimization for Mine 1.

For example, you could take Surface 1 and adjust the elevation in other areas to reflect the mass extracted in Period 1 from Mine 1, as well as the potential extraction from the second pit. With these results, you can refine surfaces or mining fronts, conduct a sensitivity analysis of the geometric parameters across multiple projects, and still maintain the benefits of global optimization.

Surfaces setup

Sustainable analysis

Technology has been developed to incorporate social and environmental factors into mining project optimization, assessing these impacts whilst maximizing the project's net present value (NPV). The method can quantify socio-environmental aspects such as dust, noise, avoidance of springs/caves/tribes, carbon emissions, water consumption, and any parameter that can be controlled by its average or sum. These environmental and social aspects can be assessed following internationally recognized standards (ISO 14044).

Figure 1: social and environmental factors.

Minviro, in partnership with MiningMath, has developed an approach to integrate such quantitative assessments into strategic mining optimization. This enables socio-environmental impacts to be constrained in the mining optimization, and the economic cost of reducing them to be calculated as a consequence. This is done by inserting these variables linked with each block of your model, following these instructions. Considering this methodology, published here, significant reductions in global warming impact could be achieved at a small economic cost. For example, using an environmental constraint it was possible to reduce CO2 emissions by 8.1% while still achieving 95.9% of the net present value compared to the baseline, as you can see in the image below.

Figure 2: Reduction in environmental impacts.

Several scenarios for mine development, processing setup, energy/water consumption, CAPEX (content in Spanish), OPEX, etc. can be evaluated. It is also possible to include geometric constraints in order to restrict a mining area due to legal and site-specific issues that affect the local population, using this feature. Spatially and temporally explicit socio-environmental risks can be included in mining optimization, providing an opportunity to assess alternative project options or explore a socio-environmental cost-benefit analysis. For each aspect considered, decision makers are able to propose a range of possible scenarios and assess the economic cost of constraining these to different levels.

Figure 3: Possible scenarios to assess the economic cost of constraints.

The decision-making board, which previously had access to one or a few scenarios, now has a cloud of possibilities optimized and integrated with the technical and economic aspects of the project, reducing risks and adding sustainable value. The mathematical intelligence behind it is based on modern, well-accepted, and academically proven Data Science and Optimization concepts. The methodology has been tested in real mining projects with gains in NPV ranging between 15% and 20% on average, in projects where socio-environmental aspects had not yet been added.

Figure 4: Performance over time.

Uncertainties at the Beginning

One of the many possibilities offered by MiningMath's approach is to have multiple overview scenarios to evaluate different project assumptions before doing more detailed work. It does not demand an arbitrary/automated trial-and-error cutoff definition, nor a fixed input in the form of pushbacks that guide further optimization steps within the boundaries of a simplified problem. A subtle but substantial implication is the possibility of seeing a totally different mine development throughout the mine life cycle for each change in project assumptions. This allows mine managers to have a clearer view of the decision tree and the possibilities in their hands, to improve economic, technical, and socio-environmental performance.

Considering this context, mine managers can evaluate greenfield projects to decide whether or not they should prioritize a geotechnical study. This can be done by running multiple scenarios considering the expected variability of slope angles for a given deposit. For example, benchmarks from similar deposits might indicate that the overall slope angle could vary between 35 and 45 degrees. Before reaching a conclusion through an in-depth geotechnical study, multiple scenarios can be used to estimate the economic impact of each possible assumption for the overall slope angle. The conclusion might then indicate a low economic impact, which could postpone the need for a detailed study.

The same idea applies to any parameter, which ultimately represents a project assumption.

MiningMath conducted an illustrative example with 2000 simulations varying multiple parameters independently. The results produced the chart in Figure 1, showing the probability (Y-axis) and the project's value (X-axis). In this case, a detailed geotechnical study might be postponed, as the project's value varies between 700 and 1100 MU$ as a function of the overall slope angle (OSA).

Figure 1: What would 2000 simulations say about NPV distributions?
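For readers who want to reproduce this kind of study on their own projects, the toy sketch below outlines the sweep. The function npv_of_scenario is a hypothetical placeholder for running a MiningMath scenario with a given overall slope angle and reading its NPV; the linear response and the numbers are illustrative only, not the model behind Figure 1.

```python
# Toy sketch of a sensitivity sweep over the overall slope angle (OSA).
# `npv_of_scenario` is a hypothetical stand-in for a full scenario run.
import random

def npv_of_scenario(osa_deg):
    """Placeholder response of project value (MU$) to the OSA assumption."""
    return 700.0 + (osa_deg - 35.0) / (45.0 - 35.0) * 400.0

def sweep(n_samples=2000, low=35.0, high=45.0):
    return sorted(npv_of_scenario(random.uniform(low, high)) for _ in range(n_samples))

if __name__ == "__main__":
    values = sweep()
    p10, p50, p90 = (values[int(len(values) * q)] for q in (0.10, 0.50, 0.90))
    print(f"P10 = {p10:.0f}  P50 = {p50:.0f}  P90 = {p90:.0f} (MU$)")
```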

Theory

Current Best Practices

MiningMath software allows mining engineers to improve their strategic analysis through risk assessments performed in a single-step approach to optimization. In other words, MiningMath’s global mining optimization methodology helps to integrate multiple areas of the business. It handles all parameters simultaneously, delivering multiple scenarios and accounting for both strategic and tactical aspects.

Hence, it is important to understand other current best practices employing a stepwise rationale and their disadvantages compared to MiningMath’s single-step approach.

Stepwise technologies

Mine planning models built with current best practices have developed shortcuts and approximations to try to deliver acceptable results that consider all the project's complexities and constraints. Without such shortcuts, powerful machines would be required to simultaneously determine the optimum pit limit and mining sequence that deliver the maximum project value.

Figure 1 depicts a stepwise approach used by current best practices.

Figure 1: Current best practices: stepwise approach

Stages of stepwise approaches

These steps may include different strategies, technologies or algorithms. However, they are all usually solved individually in three larger stages:

  1. Nested pits: when finding nested pits, it is possible to employ the Lerchs-Grossmann (LG) algorithm, the Pseudoflow algorithm, destination optimization, direct block scheduling, or even more recent heuristic mechanisms.
  2. Pushback definition: with the nested pits defined, the next step is usually the definition of pushbacks, done manually by expert mine planning engineers using a number of empirical rules. Automatic methods focused on NPV optimization can also be employed for pushback design, but these usually work under resource constraints and do not consider enough geometric requirements.
  3. Schedules: finally, starting from a chosen pushback, the scheduling is performed. A myriad of techniques can be employed for that, such as direct block scheduling, genetic algorithms, (fuzzy) clustering algorithms, dynamic programming, and heuristic methods in general, all with different rates of success, but with a limited variety of solutions due to the single pushback input.

Aim of stepwise approaches

Regardless of the technologies or algorithms, in a stepwise approach the aim is to initially find the final pit limit that maximizes the undiscounted cash flow to then focus on block sequence within this final pit envelope. By constraining the problem and predefining inputs, these shortcuts (approximations) help to save time and computer resources, enabling such software to consider complexities such as ore blending requirements, different processing routes, stockpiling policy, truck fleet considerations, and so on.

Disadvantages of stepwise approaches

With current best practices employing some stepwise approach, thousands of potential schedules can be generated with a multitude of different methods, but they are all based on the same stepwise rationale, with one step guiding the other. Commonly, schedules follow from a set of nested pits and other fixed input parameters such as geotechnics, metallurgical performance, blending constraints, etc. Therefore, the results frequently present similar behaviours and restrict the full exploration of the solution space.

MiningMath Uniqueness

MiningMath allows mining managers to improve their strategic analysis through risk assessments that are unconstrained by stepwise processes. Through math optimization models that integrate multiple areas of the business, MiningMath handles all parameters simultaneously and delivers multiple scenarios, accounting for both strategic and tactical aspects.

MiningMath optimization is not constrained by arbitrary decisions for cut-off grades or pushbacks, since these decisions are usually guided by prior knowledge or automated trial-and-error. Thus, each set of constraints in our technology has the potential to deliver an entirely new project development, including economic, technical, and socio-environmental indicators, along with a mine schedule, while aiming to maximize the project’s NPV.

How can it be used?

MiningMath acknowledges that each project has its own characteristics. Thus, it allows you to choose which workflow best fits your demand and to decide which one should be used. Straight from the block model you can find solutions for your short-term plans, schedules, optimized pushbacks, or super best case, as depicted in Figure 1.

Figure 1: Single-step approach employed in MiningMath. Straight from the block model to short-term plans, schedules, optimized pushbacks, or the super best case.

Super best case

As MiningMath optimizes all periods simultaneously, without the need for revenue factors, it has the potential to find higher best-case NPVs than traditional best-case procedures based on LG/Pseudoflow nested pits, which do not account for processing capacities (gap problems), cut-off policy optimization, or the discount rate. Usually these, and many other, real-life aspects are only accounted for later, through a stepwise process, limiting the potential of the project.

Discounted Cash flow x Undiscounted Cash flow

The use of LG/Pseudoflow methods to perform pit optimization aims to maximize the undiscounted cash flow of the project. MiningMath, on the other hand, maximizes the discounted cash flow. Therefore, regions that MiningMath has decided not to mine are probably regions where you would pay to remove waste in the earlier periods, while the discounted revenue from the ore underneath would not pay for that extraction.
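The sketch below illustrates this point with hypothetical cash flows: a deep region that requires heavy stripping in the first years and only pays back later can look attractive undiscounted, yet destroy value once the discount rate is applied.

```python
# Sketch: a region can pay off undiscounted and still lose money discounted.
# Hypothetical cash flows (M$): stripping in periods 1-3, ore revenue in 4-5.
cash_flows = [-40.0, -40.0, -40.0, 70.0, 70.0]
discount_rate = 0.10

undiscounted = sum(cash_flows)
discounted = sum(cf / (1.0 + discount_rate) ** (t + 1) for t, cf in enumerate(cash_flows))

print(f"Undiscounted: {undiscounted:+.1f} M$")  # +20.0 -> kept by an undiscounted pit limit
print(f"Discounted:   {discounted:+.1f} M$")    # about -8.2 -> MiningMath may leave it unmined
```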

A proper comparison between these methodologies can be made by importing the final pit surface obtained from the other mining package into MiningMath and using it as Force/Restrict Mining. This way, MiningMath will perform the schedule optimization using the exact same surface, which allows you to compare the NPV for each case. Figures 2 and 3 depict two comparisons between undiscounted and discounted cash flows.

Figure 2: Undiscounted versus discounted cash flow optimization.
Figure 3: Undiscounted versus discounted cash flow optimization regarding a minimum mining width.

Pushbacks

MiningMath offers the option of producing Optimized Pushbacks with controlled ore production and operational designs to guide your mine sequencing. Having this broader view in mind, you are already able to begin the scheduling stage. The block periods and destinations optimized by MiningMath could be imported back into your preferred mining package, for comparison, pushback design or scheduling purposes.

Schedules

When using MiningMath, it is possible to define the pit limit and mine schedule simultaneously. That is, to determine which blocks should be mined, when this should happen and to where they should be sent to maximize the NPV, while respecting production and operational constraints, slope angles, discount rate, stockpiles, among others, all performed straight from the block model. This means that the steps of pit optimization, pushback and scheduling are not obtained separately, but in a single and optimized process.

Decision Trees

To help with all that, our software allows you to build Decision Trees which enable a broader view of your project and a deeper understanding of the impacts of each variable. This is all possible because MiningMath works with a global optimization which simultaneously regards all variables, instead of using a step-wise approach. The software provides different views and solutions for the same mine for each parameter changed and each possible objective. 

Guaranteed Solutions

Multiple, complex constraints increase the likelihood of not finding or not existing feasible solutions. Nonetheless, MiningMath always delivers a solution, even if it could not honor the entire set of constraints imposed or had to reduce the NPV to find a feasible solution.

When dealing with highly constrained problems, other technologies might take hours or days to realize there is no feasible solution. This is because they usually employ generic optimization algorithms that are not suited to making decisions in a mining problem. In that case, the only option is to prepare a second execution with more flexible constraints, still with no guarantee of feasibility.

In MiningMath, once infeasibility is detected, the algorithm decides which (less relevant) constraints should be relaxed, returning warnings to the user in the report. This is performed during the optimization process, without compromising runtimes.

In some cases the set of constraints may be too limiting and the software is unable to return a solution, generating the “Unfeasible project” message. If this happens, it is recommended that you relax some restrictions.

The constraints priority order, from the highest to the lowest, is depicted in Figure 1.

  1. Force+Restrict Mining together using the same surface.

  2. Slope angles.

  3. Force Mining or Restrict Mining used separately: the same concept as above, but here the surfaces are corrected according to the slope angles, so there might be some differences.

  4. Minimum bottom width, mining width, and mining length.

  5. Total production capacity (or the sum of the capacities across all destinations).

  6. Vertical rate of advance.

  7. Averages and sums, modeled as strong penalties in the objective function.

  8. Time limit.

  9. NPV improvement.

Figure 1: Constraints hierarchy order.

Theory Validation

MiningMath's results are only possible due to its proprietary Math Programming Solver ©. It consists of a Mixed Integer Linear Programming (MILP) formulation and linearization methods that tackle the challenging non-linear aspects of mining optimization. In addition, it has its own Branch & Cut algorithm, which provides more efficiency than standard MILP optimizers since it is fine-tuned to this specific optimization problem.

Another major advantage of MiningMath comes from its mathematical formulations based on surfaces (Goodwin et al., 2006; Marinho, 2013) instead of the usual block precedences. Block precedence methods might lead to higher errors (Beretta and Marinho, 2014), providing slopes steeper (i.e. riskier, more optimistic) than requested. The use of surfaces eliminates these geotechnical errors and allows for block-by-block geotechnical zones, if needed.

These surface-based formulations allow MiningMath to include geometric constraints, and, consequently, find solutions that are closer to real mining operations. The user can guide geometries by including mining and bottom widths, mining lengths, maximum vertical advance rates, and forcing/restricting mining areas. You can better understand how each constraint interacts with all others here. Such constraints give freedom to the user to work, or not, with predefined cut-offs and pushbacks which might limit the space of potential solutions. An in-depth view of MiningMath’s formulations and algorithm can also be seen here.

This approach (Figure 1) has been applied for years by clients such as Vale, Rio Tinto, Codelco, Kinross, AMSA and MMG, with a growing number of licenses sold, press releases and academic research also proving the consistency of the implementation. With constant development since 2013, MiningMath has reached a mature and robust state. It is the first and only single-step mining optimization engine available in the market.

Figure 1: MiningMath’s approach. From block model to schedule in a single step solved by its proprietary Math Programming Solver ©.

Mining Optimization Algorithm

MiningMath has a flexible mining optimization algorithm that consists of a Mixed Integer Linear Programming (MILP) formulation and linearization methods that tackle the challenging non-linear aspects of the problem. In addition, MiningMath has its own Branch & Cut algorithm, which provides more efficiency than standard MILP optimizers since it’s fine tuned to this specific optimization problem.

One of the major advantages of MiningMath comes from its mathematical formulations based on surfaces (Goodwin et al., 2006; Marinho, 2013) instead of the usual block precedences. Block precedence methods might lead to higher errors (Beretta and Marinho, 2015), providing slopes steeper (i.e. riskier, more optimistic) than requested. The use of surfaces eliminates these geotechnical errors and allows for block-by-block geotechnical zones, if needed.

Another crucial advantage is that MiningMath’s formulation includes geometric constraints, allowing its algorithm to find solutions that are closer to real mining operations. The user can guide geometries by including mining and bottom widths, maximum vertical advance rates, and forcing/restricting mining areas. Such constraints give freedom to the user to work, or not, with predefined cut-offs and pushbacks which might limit the space of potential solutions. Hence, the software provides different views and solutions for the same mine for each parameter changed.

Eventually, the linear solutions need to be mapped onto an approximate integer (block-by-block) solution that represents the scheduling of the mining problem in the real world. The intelligence to convert continuous solutions into integer, non-linear ones is provided by MiningMath's Branch & Cut algorithm.

Summary steps of the MiningMath algorithm.

Algorithm’s flowchart and mathematical formulation

MiningMath employs an innovative mathematical formulation and powerful proprietary Branch & Cut algorithm for mining optimization problems. A description of this mathematical formulation and the three main steps of the algorithm employed are given below.

Step 1: Initial assessment

Figure 1: Initial assessment of entire block model and inclusion of likely profitable blocks within an initial surface.

The first step of the mining optimization algorithm is to remove regions that do not add any value to the project. This is an initial assessment that considers slope constraints, reducing the size of the problem and providing a region of interest for the optimization process. Since MiningMath always employs surfaces in its mathematical formulations, this first set of likely profitable blocks are contained within an initial surface as depicted in Figure 1.

Step 2: Problem linearization and mining optimization

Figure 2: Example solution with geometric constraints.

In the second step of the mining optimization algorithm, the non-linear, integer problem is approximated by an integer, linear one based on surfaces. For that, it is first necessary to define the common notation used across the problem and its variables.

  • [latex]S[/latex]: number of simulated orebody models considered
  • [latex]s[/latex]: simulation index, [latex]s = 1,...,S [/latex]
  • [latex]D[/latex]: number of destinations
  • [latex]d[/latex]: destination index, [latex]d = 1,...,D [/latex]
  • [latex]Z[/latex]: number of levels in the orebody model
  • [latex]z[/latex]: level index, [latex] z = 1,...,Z [/latex]
  • [latex]T[/latex]: number of periods over which the orebody is being scheduled and also defines the number of surfaces considered
  • [latex]t[/latex]: period index, [latex]t = 1,...,T. [/latex]
  • [latex]M[/latex]: number of cells in each surface, where [latex]M = x \times y[/latex] and [latex]x[/latex] and [latex]y[/latex] are the numbers of mining blocks in the x and y dimensions.
  • [latex]c[/latex]: cell index, [latex]c = 1, \ldots ,M[/latex].
  • [latex]G[/latex]: number of unique destination groups defined. Each group might contain 1, all, or any combination of destinations.
  • [latex]g[/latex]: group index, [latex]g = 1, \ldots ,G[/latex].
  • [latex]x_{c,t,d}^{z}[/latex]: simulation-independent binary variable that assumes 1 if block [latex](c, z)[/latex] is being mined in period [latex]t[/latex] and sent to destination [latex]d[/latex], and 0 otherwise.
  • [latex]e_{c,t}[/latex]: simulation-independent continuous variables associated with each cell [latex]c[/latex] for each period [latex]t[/latex], representing cell elevations.
  • [latex]\overline{f_{t,g,s}},\underline{f_{t,g,s}}[/latex]: continuous variables to penalize sum constraints violated for each period, group of destinations, and simulation. One pair of variables is necessary for each quantifiable parameter modeled block by block whose sum is being constrained. An example would be variables used to control fleet hours spent in different periods, groups of destinations, and simulations. More information about possible parameters here. Note also that the software allows the control of the average of simulations, instead of dealing with each simulation individually, and the control by the sum of destinations, instead of each destination individually.
  • [latex]\overline{\alpha_{t,g}},\underline{\alpha_{t,g}}[/latex]: user defined weights for variables [latex]\overline{f_{t,g,s}},\underline{f_{t,g,s}}[/latex] with the same destination group [latex]g[/latex] and period [latex]t[/latex]. These can only be defined in the .ssscn files.
  • [latex]\overline{j_{t,g,s}},\underline{j_{t,g,s}}[/latex]: continuous variables to penalize average constraints violated for each period, destination group, and simulation. One pair of variables is necessary for each quantifiable parameter modeled block by block whose average is being constrained. An example would be variables used to control the average grade of blocks mined in different periods, destination groups, and simulations. More information about possible parameters here. Note also that the software allows the control of the average of simulations, instead of dealing with each simulation individually, and the control by the sum of destinations, instead of each destination individually.
  • [latex]\overline{\beta_{t,g}},\underline{\beta_{t,g}}[/latex]: user defined weights for variables [latex]\overline{j_{t,g,s}},\underline{j_{t,g,s}}[/latex] with the same destination group [latex]g[/latex] and period [latex]t[/latex]. These can only be defined in the .ssscn files.
  • [latex]e_{c,t} \in \mathbb{R},\,\, [/latex]  [latex]t = 1,...,T[/latex],[latex]c=1,...,M[/latex]
  • [latex]x_{c,t,d}^{z} \in \{0,1\},\,\, [/latex]  [latex]c=1,...,M[/latex], [latex]t = 1,...,T[/latex], [latex]z=1,...,Z[/latex], [latex]d=1,...,D[/latex]
  • [latex]\overline{f_{t,g,s}},\underline{f_{t,g,s}} \in \mathbb{R_{\geq 0}}[/latex], [latex]t = 1,...,T[/latex], [latex]g=1,...,G[/latex], [latex]s=1,...,S[/latex]
  • [latex]\overline{j_{t,g,s}},\underline{j_{t,g,s}} \in \mathbb{R_{\geq 0}}[/latex], [latex]t = 1,...,T[/latex], [latex]g=1,...,G[/latex], [latex]s=1,...,S[/latex]

With the set of variables defined, it is now possible to define a mathematical model with an objective function and the necessary constraints.

Objective function

Intuitive idea

  1. Sum of the economic value of blocks mined per period, destination, and simulation.
  2. Average the result by the number of simulations.
  3. Subtract penalties for certain violated restrictions associated with some user defined parameters.

Requirements

  • \(V_{c,t,d,s}^{z}\): cumulative discounted economic value of block \((c, z)\) in simulation \(s\), period \(t\) and destination \(d\). More about this calculation here.

Formulation

\(\max \frac{1}{S}\sum\limits_{s=1}^{S}\sum\limits_{t=1}^{T}\sum\limits_{c=1}^{M}\sum\limits_{z=1}^{Z}\sum\limits_{d=1}^{D} V_{c,t,d,s}^{z}\, x_{c,t,d}^{z} \,-\, p\)
where
\(p = \sum\limits_{t=1}^{T}\sum\limits_{g=1}^{G}\left(\overline{\alpha_{t,g}}\sum\limits_{s=1}^{S}\overline{f_{t,g,s}} + \underline{\alpha_{t,g}}\sum\limits_{s=1}^{S}\underline{f_{t,g,s}} + \overline{\beta_{t,g}}\sum\limits_{s=1}^{S}\overline{j_{t,g,s}} + \underline{\beta_{t,g}}\sum\limits_{s=1}^{S}\underline{j_{t,g,s}}\right)\)

Finally, the objective function is constrained by the restrictions below. The first one guarantees that surfaces from consecutive periods do not cross: each cell elevation can only stay the same or move down from one period to the next.

  • [latex]e_{c,t-1} - e_{c,t} \ge 0, c=1,...,M, t=2,...,T [/latex]
Figure 3: Two surfaces (blue and yellow): a) not crossing each other and respecting the constraint; b) crossing each other and not respecting the constraint.

Intuitive idea

  • Adjacent elevations in a single surface need to respect a maximum difference. This maximum will change based on which direction they are adjacent: x, y, or diagonally.

Requirements

  • [latex]H_x, H_y, H_d[/latex]: maximum difference in elevation for adjacent cells in [latex]x[/latex], [latex]y[/latex] and diagonal directions
  • [latex]X_c, Y_c, D_c[/latex]: the sets of cells adjacent to a given cell [latex]c[/latex] laterally in [latex]x[/latex], laterally in [latex]y[/latex], and diagonally, respectively.

Formulation

  • [latex]e_{c,t} - e_{x,t} \le H_x, c=1,...,M, t=1,...,T, x \in X_c[/latex]
  • [latex]e_{c,t} - e_{y,t} \le H_y, c=1,...,M, t=1,...,T, y \in Y_c[/latex]
  • [latex]e_{c,t} - e_{d,t} \le H_d, c=1,...,M, t=1,...,T, d \in D_c[/latex]
Figure 4: Maximum allowed difference (Hx, Hy, and Hd) in elevation between adjacent cells in contact laterally in the x direction (a), in contact laterally in the y direction (b), and in contact diagonally (c).
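The values of Hx, Hy and Hd follow directly from the cell size and the requested slope angle. The sketch below shows the conversion and a simple compliance check for a surface stored as a grid of elevations, assuming square cells and a single slope angle (real models may use block-by-block geotechnical zones).

```python
# Sketch: maximum elevation differences for the surface-based slope constraint,
# assuming square cells of side `cell_size` and one overall slope angle.
import math

def max_elevation_differences(cell_size, slope_deg):
    h_lateral = cell_size * math.tan(math.radians(slope_deg))                    # Hx = Hy
    h_diagonal = cell_size * math.sqrt(2.0) * math.tan(math.radians(slope_deg))  # Hd
    return h_lateral, h_lateral, h_diagonal

def respects_slopes(elev, cell_size, slope_deg):
    """Check a surface given as a 2D list of cell elevations."""
    hx, hy, hd = max_elevation_differences(cell_size, slope_deg)
    rows, cols = len(elev), len(elev[0])
    for i in range(rows):
        for j in range(cols):
            for di, dj, h in ((0, 1, hx), (1, 0, hy), (1, 1, hd), (1, -1, hd)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols and abs(elev[i][j] - elev[ni][nj]) > h + 1e-9:
                    return False
    return True

if __name__ == "__main__":
    print(max_elevation_differences(cell_size=10.0, slope_deg=45.0))  # (10.0, 10.0, ~14.14)
```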
 

The proprietary constraints themselves are not disclosed. The formulations below are possible examples of constraints of the same type, but not the ones actually employed.

Intuitive idea

  • Surfaces define when blocks will be mined. For example, blocks between the surfaces associated with periods 1 and 2 will be mined in period 2. A block is between two surfaces if its centroid is between them.

Requirements

  • [latex]E_{c}^{z}[/latex]: elevation of centroid for a given block [latex](c, z)[/latex]

Formulation

  • [latex]E_{c}^{z} \times \sum\limits_{d=1}^{D}x_{c,1,d}^{z} \ge e_{c,1}, c=1,...,M,  z=1,...,Z[/latex]
  • [latex]e_{c,t-1} \ge E_{c}^{z} \times \sum\limits_{d=1}^{D}x_{c,t,d}^{z} \ge e_{c,t}, [/latex][latex]c=1,...,M, t=2,...,T, z=1,...,Z[/latex]
Figure 5: Distance between centroids above surfaces (green lines) and below surfaces (red lines) respecting constraints. Blue blocks are mined in period 1, while yellow blocks are mined in period 2.
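The mapping from surfaces back to a block-by-block schedule can be pictured with the small sketch below: for each cell, a block is assigned to the first period whose surface drops to (or below) its centroid elevation, and blocks whose centroids stay below the last surface are never mined. This is only an illustration of the idea above, not MiningMath's internal code.

```python
# Sketch: assigning blocks to periods from per-period surface elevations.
# Surfaces only move down over time, so the first surface at or below the
# centroid gives the mining period; below every surface means "not mined".

def period_mined(centroid_elev, cell_surface_elevs):
    for t, surface_elev in enumerate(cell_surface_elevs, start=1):
        if centroid_elev >= surface_elev:
            return t
    return None

if __name__ == "__main__":
    surfaces_for_cell = [100.0, 70.0, 40.0]   # elevations after periods 1, 2 and 3
    for elev in (110.0, 85.0, 55.0, 10.0):
        print(elev, "->", period_mined(elev, surfaces_for_cell))
    # 110 -> 1, 85 -> 2, 55 -> 3, 10 -> None
```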
Intuitive idea

  • Each mined block can only be sent to one destination.

Formulation

  • [latex]\sum\limits_{d =1}^{D}x_{c,t,d}^{z} = 1, c=1,...,M, t=1,...,T, z = 1,...,Z[/latex]

Intuitive idea

  • For each period and destination group there is an upper and a lower limit on the total tonnage to be extracted. Destination groups might be formed by any unique combination of destinations, with one, many, or all of them. The sum of the tonnage of the mined blocks sent to the same group of destinations in the same period must respect these limits.

Requirements

  • [latex]T_c^z[/latex]: tonnage for a given block [latex](c, z)[/latex].
  • [latex]\overline{T_{t,g}}[/latex]: upper limits on the total tonnage to be extracted during period [latex]t[/latex] and sent to destinations in group [latex]g[/latex].
  • [latex]\underline{T_{t,g}}[/latex]: lower limits on the total tonnage to be extracted during period [latex]t[/latex] and sent to destinations in group [latex]g[/latex].

Formulation

  • [latex]\underline{T_{t,g}} \le \sum\limits_{c=1}^M\sum\limits_{z=1}^{Z}\sum\limits_{d \in g}T_c^z x_{c,t,d}^{z} \le \overline{T_{t,g}}, t = 1,...,T, g = 1,..., G[/latex]

Intuitive idea

  • The user can define a certain parameter (e.g. fleet hours spent) associated with each mined block to have its sum controlled. The sum of the values of this parameter over the mined blocks must respect lower and upper bounds for each period, destination group (optional), and simulation (individually or on average). Destination groups might be formed by any unique combination of destinations, with one, many, or all of them.

Requirements

  • [latex]\underline{F_{t,g,s}},\overline{F_{t,g,s}}[/latex]: lower and upper limits, respectively, in sum of user defined parameter to be respected in period [latex]t[/latex], destination group [latex]g[/latex], and simulation [latex]s[/latex].
  • [latex]F_{c,d,s}^{z}[/latex]: value of user defined parameter related to a given block [latex](c, z)[/latex] in destination [latex]d[/latex] and simulation [latex]s[/latex].

Formulation

  • [latex]\underline{F_{t,g,s}} \le \sum\limits_{c=1}^M\sum\limits_{z=1}^{Z}\sum\limits_{d \in g}F_{c,d,s}^{z}x_{c,t,d}^{z} + \underline{f_{t,g,s}} - \overline{f_{t,g,s}} \le \overline{F_{t,g,s}},[/latex]

    [latex]t = 1,...,T, g = 1,..., G, s = 1,..., S[/latex]
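The slack variables above simply measure by how much a sum constraint is missed, and the objective pays a penalty proportional to that amount. The sketch below reproduces this bookkeeping for a single period, destination group and simulation, with hypothetical fleet-hour values and weights.

```python
# Sketch: slack values of one sum constraint (one period t, group g, simulation s).
# When the total falls outside [lower, upper], the slacks absorb the violation
# and are penalized in the objective with the user-defined alpha weights.

def sum_constraint_slacks(block_values, lower, upper):
    total = sum(block_values)              # sum over mined blocks sent to group g
    f_under = max(0.0, lower - total)      # shortfall below the lower bound
    f_over = max(0.0, total - upper)       # excess above the upper bound
    return total, f_under, f_over

if __name__ == "__main__":
    fleet_hours = [120.0, 95.0, 140.0, 80.0]          # hypothetical per-block values
    total, f_under, f_over = sum_constraint_slacks(fleet_hours, lower=400.0, upper=420.0)
    penalty = 1000.0 * f_over + 1000.0 * f_under      # example alpha weights
    print(total, f_under, f_over, penalty)            # 435.0 0.0 15.0 15000.0
```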

Intuitive idea

  • The user can define a certain parameter (e.g. grade) associated with each mined block to have its average controlled. This average is weighted by the block's tonnage and by an optional, user-defined weight. It must respect lower and upper bounds for each period, destination group (optional), and simulation (individually or on average). Destination groups might be formed by any unique combination of destinations, with one, many, or all of them.

Requirements

  • [latex]\underline{J_{t,g,s}},\overline{J_{t,g,s}}[/latex]: lower and upper limits, respectively, for average value of user defined parameter to be respected in period [latex]t[/latex], simulation [latex]s[/latex], and destination group [latex]g[/latex].
  • [latex]T_{c}^{z}[/latex]: tonnage for a given block [latex](c, z)[/latex].
  • [latex]J_{c,s,d}^{z}[/latex]: value of user defined parameter of block [latex](c, z)[/latex] sent to destination [latex]d[/latex] in simulation [latex]s[/latex]
  • [latex]P_{c,t,d,s}^{z}[/latex]: user defined weight for block [latex](c, z)[/latex] in period [latex]t[/latex], destination [latex]d[/latex], and simulation [latex]s[/latex]

Formulation

  • [latex]\underline{J_{t,g,s}} \le[/latex][latex]\frac{\sum\limits_{c=1}^M\sum\limits_{z=1}^Z\sum\limits_{d\in g}P_{c,t,d,s}^{z}T_{c}^{z}J_{c,s,d}^{z}x_{c,t,d}^{z}}{\sum\limits_{c=1}^M\sum\limits_{z=1}^Z\sum\limits_{d\in g}P_{c,t,d,s}^{z}T_{c}^{z}}[/latex][latex] + \underline{j_{t,g,s}} - \overline{j_{t,g,s}} \le \overline{J_{t,g,s}}[/latex]

    [latex]t = 1,...,T, g = 1,..., G, s = 1,..., S[/latex]

Proprietary constraints not disclosed

Intuitive idea

  • Surfaces should respect geometric parameters defined by the user, such as minimum bottom width, minimum mining width, minimum mining length, and maximum vertical rate of advance, as depicted here.

Formulation

  • [latex]Geometric(e_{c,t}) \le \text{geometric restriction}, c=1,...,M, t=1,...,T[/latex]

Step 3: Integer, non-linear solution and evaluation

The next step in the mining optimization algorithm is to convert the linear solution into an integer, non-linear one. MiningMath's Branch & Cut method is responsible for this conversion. Once it is done, the resulting solution can be evaluated, leading either to the end of the algorithm's execution or to a new optimization process. This new process might be triggered if one of two situations arises:

  1. restrictions are violated due to the transformation from a linear to an integer, non-linear solution, or because the problem is infeasible; or

  2. an evaluation of certain restrictions in the transformed integer, non-linear solution concludes that they might not affect the problem and would be better discarded or modified.

If either of these is true, the solution at this stage is sent back to Step 2 for linearization and refinement. If the refinement is caused by situation 1, the goal is to improve the solution's feasibility, according to the constraint hierarchy order depicted in Figure 6.

Figure 6: constraints hierarchy order.

In contrast, if it is caused by situation 2, the goal is to allow the optimization to focus on the bottlenecks of the problem and improve the current NPV. Once neither of these situations is identified, the current solution is returned. Note that each time the algorithm goes back to Step 2, a new global optimization is performed, so the new resulting solution might be entirely different.

Pseudo-code

The whole process of the mining optimization algorithm, from input to output is summarized in the diagram and pseudo-code below. References are made to previous Steps 1, 2, and 3. This algorithmic flow together with the proposed mathematical formulation exemplifies the innovative methodology applied to solve a single mine scheduling problem.

Visual representation of MiningMath's algorithm workflow
Pseudocode of MiningMath's algorithm

Evaluating constraints

MiningMath has a flexible mining optimization algorithm that consists of a Mixed Integer Linear Programming (MILP) formulation and linearization methods that tackle the challenging non-linear aspects of the problem. It is the only mining package able to handle such a diverse range of constraints in a single-step process. However, this range of available constraints raises the question:

How to add all the required constraints without losing too much value?

There is no exact procedure, as each constraint models a different engineering aspect. Therefore, an experienced engineer must be willing to explore a range of possibilities by building Decision Trees, wisely choosing scenarios that get closer to the real problem (more constraints added) without losing too much value (or even gaining value, given some non-linear aspects).

The following sections suggest possible workflows that can be followed in order to perform an efficient analysis.

Initial analysis

It is important to analyze scenarios to measure the impact of each constraint on the project's net present value (NPV), from the Super Best Case to a detailed setup, for example with an NPV Upside Potential analysis.

When performing such an evaluation, the cumulative NPV usually decreases (as expected) as more constraints are added. However, there are exceptions, as described in the following section.

Non-linear constraints

Geometric constraints are modelled as non-linear restrictions. This non-linearity can lead to counterintuitive results, with more constraints potentially yielding a better NPV. Hence, if you are not happy with the results achieved after adding geometric constraints, you might need to perform a Selectivity Analysis or a Best-Worst Range Analysis of your project.

Other workflows

MiningMath offers a diverse range of Workflows that can be followed to improve your project's results. If you are still struggling with certain parameters or constraints, please have a look at all possible options to identify what would be better suited to your particular case.

Mining Sequence per Period

What is a mining sequence per period?

A mining sequence per period outlines the order in which blocks are to be mined within each period, typically starting from the first block (block 1) and ending with the last block (block N). It is common that other mining packages produce such sequences as part of their output.

How are mining sequences created?

These sequences are usually generated using heuristic approaches. For example, some methods gather all blocks from each period and, starting from the highest bench, select a specific horizontal direction to enumerate them. Some other tools also employ short-term strategies with greedy algorithms to optimize mining operations on a day-to-day basis.
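As a concrete picture of such a heuristic, the toy sketch below (not part of any package) numbers the blocks of one period from the highest bench down, sweeping along one horizontal direction. Note that no optimization criterion is involved in this ordering.

```python
# Toy sketch of a bench-by-bench enumeration heuristic: blocks already assigned
# to a period are numbered from the highest bench downwards, sweeping along a
# chosen horizontal direction (here: increasing y, then increasing x).

def enumerate_blocks(blocks):
    """blocks: list of dicts with keys 'x', 'y', 'z' (z = bench elevation)."""
    ordered = sorted(blocks, key=lambda b: (-b["z"], b["y"], b["x"]))
    return [dict(b, sequence=i + 1) for i, b in enumerate(ordered)]

if __name__ == "__main__":
    period_blocks = [
        {"x": 10, "y": 0, "z": 100}, {"x": 0, "y": 0, "z": 100},
        {"x": 0, "y": 0, "z": 90},   {"x": 10, "y": 10, "z": 100},
    ]
    for b in enumerate_blocks(period_blocks):
        print(b["sequence"], b)
```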

What are the disadvantages of mining sequences?

Although a sequence of blocks can be defined for mining, these sequences often lack optimization criteria during their creation. For instance, approaches that prioritize starting from the highest bench only ensure that slope angles are respected, neglecting other geometric constraints. Similarly, greedy algorithms fail to consider the global view, potentially leading to violations of certain constraints later on.

Therefore, MiningMath does not provide such sequences, as users might assume that constraints will be respected when, in reality, they may not be.

How does MiningMath handle the mining sequence?

MiningMath tackles this challenge by introducing the concept of Timeframes. This feature empowers users to specify the level of detail they desire within each mining period while maintaining a comprehensive overview and ensuring that all constraints are duly considered.

We recommend initiating the entire Life of Mine (LOM) setup with smaller time frames, such as “months,” for the initial interval. However, in some cases, employing Force and Restrict mining surfaces from previous runs can help reduce the complexity of the problem and enhance efficiency.

General Content

Destinations

Destination Policy of MiningMath

MiningMath aims to maximize the NPV of a mining project and, as such, it uses a discount rate in all calculations, considering the value of money through time. The software decides which blocks will be mined, when, and to which destination they must be sent. The mathematical model bases its decisions on the economic values of each possible route. This means that MiningMath aims to identify, from a global view, which destination best increases the NPV while respecting, simultaneously, all the constraints and using the priority order listed here.

Adding Destinations

Figure 1 shows, in the bottom right corner of the screen, the buttons responsible for adding/removing destinations.

The panel Type shows the type of each destination added. The user must add at least:

  • One (1) processing stream;

  • One (1) waste dump

The numbers at the left of the screen are identifiers for each route in the mined blocks output file.

Renaming

Especially when using multiple destinations, the user should consider using more meaningful names for each route. Figure 2 highlights the panel Name, where the user can rename each one.

Process Recovery

Figure 3 shows the recovery panel, which is intended for inputting the recoveries used during the economic value calculation. The difference is that here it is used for reporting purposes, which means it is not considered twice.

Economic Value

Figure 4 highlights the Economic Value panel. Here, the user can assign each economic function to the proper destination.

Stockpile limits

As shown in Figure 5, stockpile limits are available only for processing streams, if activated in the General tab. These limits are valid for the life of mine.

Read more about stockpiles.

Recovery

One of the most important and basic concepts in mineral processing is metallurgical efficiency. Two terms are used to describe this efficiency: recovery and grade.

The recovery or percent recovery refers to the ratio of the valuable material (metal or mineral) contained in the concentrate with reference to the amount of the same material in the processing plant feed.

How and where?

The main place where the user inputs the recovery information is when defining the Economic Values.

However, as this information is implicit in the economic functions, the user needs to input this value on the interface for the purpose of generating reports.

On the Destination tab (light green), the user can define recoveries for each element in the panel (dark green) from Figure 1.

Figure 1: Destinations tab and recoveries.

Varying Recoveries

Why?

Detailed mine planning is likely to require an iterative process to update the block model with new information.

Considering the usage of specific tools for measurements, analysis, and reporting, mine planners might be interested in using the thorough information acquired on recoveries along the way.

How?

When editing the data set, the user can add as many columns as needed, defining recoveries for each block. Figure 2 shows how it would look.

Figure 2: Recovery columns in the block model.
Figure 3: Imported recoveries available to the user.

Figure 3 shows a dropdown menu where the user can choose which recovery to use:

  • RecoveryA

  • RecoveryB

  • Constant recovery

NOTE: the user does not need to use all the imported recovery fields in each run. This means recovery fields can be created for further scenarios and used separately. The same concept is valid for columns with slopes and economic values.

Step-by-step

The following video presents how to use a different recovery for each block.

Video 1: Varying recoveries for each block.

Stockpiles

How MiningMath handles stockpiles

Stockpiles are handled as a post-processing stage within the optimization algorithm. After generating the mining surfaces, the algorithm reviews blocks initially considered for discarding and evaluates which ones have the potential to be stored for later processing instead of being sent to the dump. In other words, this all occurs as part of the optimization process, before the final destinations of the blocks are determined.

MiningMath evaluates whether a block’s net value (Revenue – Fixed Mining Cost – Rehandling Cost) exceeds the cost of discarding it, applying the appropriate discount rates for both the extraction and processing periods. This means that stockpiles effectively optimize blocks initially destined for the dump, potentially recovering additional value. For selecting a block’s destination, MiningMath uses the same logic of maximizing project value. The software identifies which discarded blocks should be reclaimed to address production shortfalls over time, ensuring optimal resource utilization. The diagram below summarizes the stockpile methodology.
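As a minimal sketch of the comparison described above, with hypothetical figures: the exact cost decomposition and the global trade-offs are handled internally by the optimizer, so the code below only illustrates the discounting logic for a single block considered in isolation.

```python
# Sketch: should a block mined in period t_mine be stockpiled and processed in
# period t_proc, or discarded? All figures are hypothetical.

def discounted(value, period, rate):
    return value / (1.0 + rate) ** period

def stockpile_vs_discard(net_revenue, fixed_mining_cost, rehandling_cost,
                         waste_cost, t_mine, t_proc, rate):
    # Mining cost is paid at the extraction period in every alternative.
    mining_part = discounted(-fixed_mining_cost, t_mine, rate)
    # Stockpiled: net revenue and rehandling cost discounted at the processing period.
    value_if_stockpiled = mining_part + discounted(net_revenue - rehandling_cost, t_proc, rate)
    # Discarded: the dumping cost is paid at the extraction period.
    value_if_discarded = mining_part + discounted(-waste_cost, t_mine, rate)
    return value_if_stockpiled, value_if_discarded

if __name__ == "__main__":
    stocked, discarded = stockpile_vs_discard(
        net_revenue=5.0, fixed_mining_cost=2.0, rehandling_cost=1.0,
        waste_cost=0.5, t_mine=2, t_proc=6, rate=0.10)
    print(f"stockpile and process later: {stocked:+.2f}")
    print(f"discard to the dump:         {discarded:+.2f}")
    # The block is worth stockpiling if the first value exceeds the second.
```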

Stockpiles setup

To enable the stockpiles on the interface the first step is on the General tab where two inputs are required:

  1. Fixed Mining Cost: value used to decompose the economic value while considering stockpiles.

    This cost is essential because MiningMath uses this parameter when calculating economic values. It helps break down the block's value, allowing the algorithm to accurately account for costs incurred during mining and those applied during processing.

  2. Rehandling Cost: represents the cost to reclaim blocks from the stockpile to the process.

    This cost is applied to break the economic values into parts and apply the discount rate at the time a block is processed.

After that, on the Destinations tab, you can define stockpile limits for each processing plant added, remembering that this limit applies to the life of mine, not to a single period.

Stocked blocks

There are two common ways that mining software packages deal with stockpiles:

  • Reclaiming blocks according to an average grade for the entire stockpile.

  • Reclaiming blocks selectively in any required sequence such as FIFO (first in, first out), LIFO (last in, first out), etc.

Both options are approximations and have their advantages and disadvantages.

Currently, MiningMath reclaims blocks selectively from stockpiles according to the highest Economic Values. Read more about the Reclamation Policy.

The user can analyze MiningMath’s output in the files MinedBlocks or AllBlocks and report the periods in which a block has been mined and when it has been processed.
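A quick way to do this outside the interface is with a small script. The sketch below assumes the AllBlocks.csv layout described further down this page (columns Period Mined, Period Processed and Destination, with -99 for unassigned values); adjust the path and column names to your own export.

```python
# Sketch: listing stocked blocks (mined in one period, processed in a later one)
# from a MiningMath AllBlocks.csv export. Column names follow the layout
# described later on this page; -99 marks unassigned values.
import pandas as pd

blocks = pd.read_csv("AllBlocks.csv")
mined = blocks[blocks["Period Mined"] != -99]
stocked = mined[(mined["Period Processed"] != -99)
                & (mined["Period Processed"] > mined["Period Mined"])]

print(f"{len(stocked)} blocks were stockpiled before processing")
print(stocked.groupby(["Period Mined", "Period Processed"]).size())
```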

Video 1: How to trace stocked blocks.

Artificial stockpiles

Artificial Stockpiles is an advanced operation to incorporate external sources of material to your model and use them as an input on your scheduling.

This technique is especially useful when you need to incorporate the following scenarios into the optimization process:

  • Pre-existing stockpile from ongoing operations.

  • Underground material to be blended with open-pit material.

  • Ore bought from third-party companies to fulfill production shortfalls.

There are two main ways artificial stockpiles can be incorporated into the optimization: 1) modelling the stockpile with its actual geometry; or 2) creating a simplified artificial stockpile. These are detailed next.

Modelling the stockpile with its actual geometry

Modelling an existing stockpile with its actual geometry is the best alternative to include it into the scheduling for cases where you need an operational control over:

  • The stockpile and its adjacent areas.
  • The stock reclamation. 

For this process, use a modelling software to perform the following steps:

  1. Use the previous topography for the base of the stockpile.

  2. Use the current topography for the top of the stockpile.

  3. Create blocks in-between these surfaces. These blocks must have the same size as the blocks in the original model.

  4. Assign an average quality (grade) and density to each block created.

  5. Calculate the economic values for these stocked blocks.

  6. Import the model back to MiningMath to further scheduling.

Creating a simplified artificial stockpile

To speed up the process, users can create multiple blocks as needed using a spreadsheet application. This method serves as a useful alternative in scenarios where you require:

  • Faster processing and evaluation.
  • Reduced need for operational control.

You can model artificial stockpiles as rows or as columns. A comparison between each choice is listed below.

Rows:

  • To control a sequence, it may require surface constraints.
  • A 1-line row will be affected by the minimum widths used for the scenario.
  • Thin rows may not be mined completely.
  • Multiple rows give more flexibility and reduce conflicts with the operational constraints from the open-pit scheduling.
  • If you opt for multiple stockpiles, create them with a 2-cell distance to avoid overlapping interference.

Columns:

  • Easier sequence: the precedence is defined by the vertical geometry.
  • A 1-line column will be affected by the minimum widths used for the scenario.
  • Long columns may affect how deep the scheduling can go in a single period.
  • Multiple columns give more flexibility and reduce conflicts with the operational constraints from the open-pit scheduling.
  • If you opt for multiple stockpiles, create them with a 2-cell distance to avoid overlapping interference.

Step-by-step (rows)

  1. Create rows of blocks above the topography.

    Illustration of a row of 4 blocks created (in green) above a flat topography.
  2. Assign an average quality (grade) and density to each block created.

  3. Calculate the economic values for these new blocks.

  4. Import the model back into MiningMath for further scheduling.
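For the spreadsheet step, a small script can also generate the row of blocks. The sketch below writes a CSV with one row of stockpile blocks sitting one bench above a flat topography; the column names, block size, grades and economic values are placeholders that must be matched to your own model before importing it back.

```python
# Sketch: creating a simplified artificial stockpile as a row of blocks above a
# flat topography. All column names and values are placeholders; match them to
# your own block model layout before importing into MiningMath.
import csv

BLOCK_SIZE = 10.0          # must equal the original model's block dimensions
TOPOGRAPHY_ELEV = 450.0    # elevation of the flat area chosen for the stockpile

def stockpile_row(n_blocks, x0, y0, grade, density, econ_process, econ_waste):
    return [{
        "X": x0 + i * BLOCK_SIZE,
        "Y": y0,
        "Z": TOPOGRAPHY_ELEV + BLOCK_SIZE / 2.0,   # one bench above the topography
        "Grade": grade,                            # average grade of the stocked material
        "Density": density,                        # can be raised to represent more tonnage
        "EconomicValueProcess": econ_process,
        "EconomicValueWaste": econ_waste,
    } for i in range(n_blocks)]

if __name__ == "__main__":
    rows = stockpile_row(n_blocks=4, x0=1000.0, y0=2000.0,
                         grade=0.8, density=2.2, econ_process=5.0, econ_waste=-0.5)
    with open("artificial_stockpile.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
```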

Consider checking the following requirements and observations.

Notes on modelling:

  • The blocks created must have the same size as the ones from the original model.
  • Consider increasing the densities of the blocks created to represent more material with fewer blocks. The trade-off is reduced selectivity.
  • The more blocks you have, the more selective the algorithm can be.

Note on operational needs:

  • The blocks created will be subject to the operational constraints, such as widths and vertical advance, from your scenarios. This means you need to consider these two parameters when defining them: 1) the stockpile base width; and 2) the stockpile height.

Notes on the placement within the model:

  • The artificial stockpile should be placed in a peripheral area of your model to not affect the open-pit schedule.
  • Avoid borders to prevent any geotechnical issue, which will impede mining the artificial stockpile completely.

The following video shows more information on artificial stockpiles.

Reclaim Policy

Stockpiles are managed as a post-processing phase of the optimization algorithm (learn more). The decision-making process for reclaiming stockpiled blocks is driven by their economic value, with the algorithm aiming to maximize NPV. Consequently, blocks with the highest value are reclaimed first, regardless of when they were mined and added to the stockpile.

Diverse operational needs

Various mining software packages may use different conventions for stock reclaim policies, typically aligned with the specific goals of each application or module. Some of the possibilities may include:

  • FIFO: First In, First Out.

  • FILO: First In, Last Out.

  • Reclaiming an average grade for the entire stockpile.

  • Reclaiming the highest-value blocks first (MiningMath uses this method).

Each one of these other possibilities is an approximation of reality, with its pros and cons.

  • FIFO and FILO are quite logical but represent a level of selectivity that is not practical. The angle of repose and the positioning of each block are likely the most intuitive reasons for the lower selectivity found in practice.

  • An average grade for the entire stockpile is naturally an approximation, considering the amount of material that would have to be blended to make it close to reality.

  • Reclaiming the highest-value blocks first also assumes a selectivity level that is not possible in reality. However, it is well aligned with the mathematical goal of maximizing the project's NPV for strategic evaluation.

Ultimately, none of these approaches are fully operational. The final decision still relies on the expertise of the professional overseeing the strategy optimization, who uses their preferences, experience, and skills to introduce additional levels of control.

Guiding the Reclaim Policy

This section brings a few ideas on how the user might guide the algorithm to follow one's preferred reclaim strategy. For this article, the user should have prior knowledge of the following concepts.

Baseline Scenario

The general idea is to set up and run a baseline scenario to find what is optimal for the long-term value. The solution obtained in this step will guide further executions.

For this manipulation you will need to switch the output format to export the entire block model along with the optimization results, which is saved in the AllBlocks.csv file. This is essential for any iteration requiring re-optimization of a previous solution. By default, MiningMath exports only the MinedBlocks.csv file.

The results will indicate which blocks should be mined, when they should be mined, and whether they were immediately processed, stockpiled and later processed, or discarded.

The goal is to utilize previous outputs to generate new columns of economic values by introducing fictitious destinations. This involves creating multiple processing streams that do not coexist but effectively represent a single plant. These fictitious destinations, combined with predefined economic values, enable the user to impose their preferred level of control.

Additionally, based on prior results, the user must adjust the new economic values by specifying the final destination for each block.

For the FIFO approach, a block stocked during the second period of the first run should be sent to the stockpile of Process 1 so that it is reclaimed first. Hence, this block must have a very negative economic value for all other destinations.

  1. Define a criterion for which stocked blocks (Period Mined different from Period Processed) should be reclaimed earlier or later. This criterion must be based on the previous results from the AllBlocks.csv file (depicted below), specifically the columns Mined Block, Period Mined, Period Processed, and Destination. A short sketch of this step is shown after this list.

    Block model file. The value -99 indicates unassigned values, meaning the block was not mined, processed, or assigned a destination.
  2. Add a pre-defined destinations column based on the criterion adopted.

  3. Set up your scenario considering the pre-defined destinations will not coexist.

    Notice that: 1) Process 1 and its stockpile will be used from Period 1 to Period 5; 2) Process 2 and its stockpile will be used from Period 6 to Period 10; and 3) Process 3 and its stockpile will be used from Period 11 onwards.

  4. Optimize the new scenario to have a better approximation for the final NPV, considering the strategy of your preference.
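As an illustration of steps 1 and 2, the sketch below derives a pre-defined destinations column from the AllBlocks.csv output. It is only a sketch: the column names (Period Mined, Period Processed) follow the descriptions above, the period cut-offs mirror the 1-5 / 6-10 / 11-onwards split, and both may need to be adapted to your actual file and reclaim strategy.

    import pandas as pd

    # Hypothetical sketch; column names and period cut-offs are assumptions.
    blocks = pd.read_csv("AllBlocks.csv")

    # Stocked blocks: mined in one period and processed in a later one.
    stocked = (blocks["Period Processed"] != -99) & \
              (blocks["Period Mined"] != blocks["Period Processed"])

    def fifo_destination(period_mined):
        # FIFO-style rule: the earlier a block was stocked, the earlier it is reclaimed.
        if period_mined <= 5:
            return 1   # Process 1 and its stockpile (Periods 1-5)
        if period_mined <= 10:
            return 2   # Process 2 and its stockpile (Periods 6-10)
        return 3       # Process 3 and its stockpile (Period 11 onwards)

    blocks["Predefined Destination"] = -99  # unassigned, following the -99 convention above
    blocks.loc[stocked, "Predefined Destination"] = (
        blocks.loc[stocked, "Period Mined"].apply(fifo_destination)
    )

    blocks.to_csv("AllBlocks_with_destinations.csv", index=False)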

Variable Mining Costs

MiningMath was conceived with some simplifications in its current version, such as a fixed mining cost.

How does it work?

The software uses a fixed mining cost, which is incorporated into the economic formulation at an early stage. This value is entered through the user interface, enabling the software to recognize and adjust the pre-calculated economic values accordingly. The appropriate discounts are then applied to each block based on its status: whether it is (1) processed, (2) discarded, or (3) stockpiled and later processed. In all cases, mining costs are factored in before processing costs and revenue are considered.

For mined blocks, in the context of stockpiles, the possible material flows and their cost treatments are:

  1. Mine-to-process: this is already embedded in the economic formulation.

  2. Mine-to-waste: this is already embedded in the economic formulation.

  3. Mine-to-stock: this is considered the same as the cost for mine-to-waste. However, this is a limitation, since these two costs might not always be equal due to, for example, different haul distances at the mine site surface.

  4. Stock-to-process: this considers an additional cost, represented in the user interface by the re-handling cost.

Limitations and alternatives

Mining costs might be defined with some variability for a higher level of detail. This section shares some ideas on how to consider variable mining costs driven by different aspects, such as:

  • Haul costs, as a function of the block’s depth and/or the destination site.
  • Blasting costs, as a function of the rock type.
  • Loading costs, as a function of how selective the mining is.
  • Supplies and materials, labor costs, among others.

The following steps outline how to add block-by-block mining costs to the block model file and define their average cost within the software; a short sketch of these calculations is shown after the list.

  1. Prior to the model import, create a column of mining costs by block — including whatever costs are applicable: blasting, haul, loading, etc.

  2. Calculate the Average Mining Cost of all of the blocks.

  3. During the model import, assign the Mining Costs column to the field type Other; MiningMath will export this column along with its output.

  4. Use the Average Mining Cost as the Fixed Mining Cost for the stockpiles.

  5. Run MiningMath.

  6. OPTIONAL for more accurate approximation: analyze the MinedBlocks.csv file (or AllBlocks.csv) and calculate the average mining cost just for the stockpiled blocks.

  7. OPTIONAL: run MiningMath again, now using the new average mining cost (from step 6) as the fixed mining cost for the stockpiles.
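A minimal sketch of the calculations in steps 1, 2 and 6 is shown below. The column names (Tonnage, Blasting Cost, Haul Cost, Loading Cost, Period Mined, Period Processed) are assumptions for illustration, and the tonnage-weighted average is just one reasonable way to compute the average cost.

    import pandas as pd

    # Hypothetical column names; adapt to your own block model file.
    model = pd.read_csv("BlockModel.csv")

    # Step 1: per-block mining cost, summing whatever components apply.
    model["Mining Cost"] = model["Blasting Cost"] + model["Haul Cost"] + model["Loading Cost"]
    model.to_csv("BlockModel_with_costs.csv", index=False)

    # Step 2: tonnage-weighted average mining cost over all blocks,
    # entered as the Fixed Mining Cost in the interface.
    avg_all = (model["Mining Cost"] * model["Tonnage"]).sum() / model["Tonnage"].sum()
    print(f"Average mining cost (all blocks): {avg_all:.2f}")

    # Step 6 (optional): after a first run, average only the stockpiled blocks
    # reported in MinedBlocks.csv (exported with the Mining Cost column as "Other").
    mined = pd.read_csv("MinedBlocks.csv")
    stocked = mined[mined["Period Mined"] != mined["Period Processed"]]
    avg_stock = (stocked["Mining Cost"] * stocked["Tonnage"]).sum() / stocked["Tonnage"].sum()
    print(f"Average mining cost (stockpiled blocks): {avg_stock:.2f}")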

One option (Option 1) is to adjust the NPV manually outside MiningMath, for which the steps are:

  1. Open the MinedBlocks.csv.

  2. Edit the stocked blocks' value to add new costs.

  3. Calculate the NPV manually.

Another option (Option 2) is to re-run the optimization with updated economic values, for which the steps are:

  1. For the first execution, set up MiningMath to export the AllBlocks.csv and run. 

  2. Use the output model (AllBlocks.csv) to update the economic value for stocked blocks. Then, save as a new model.

  3. For the next execution, re-import this model with fixed surfaces for all periods.

Further iterations

This process does not guarantee that the solution will remain unchanged; stockpiled blocks may vary, potentially requiring further iterations for better approximation.

Since stockpiles are treated as a post-processing unit and are not part of the optimization, Option 1 offers a more accurate NPV calculation.

Production Constraints

MiningMath allows the user to set period ranges and their corresponding production limits. This functionality enables options such as pre-stripping, production ramp-up, and prices changing over time, among others. It is worth mentioning that the discount rate in MiningMath is already applied in the first period. For more information on the NPV calculation, click here.
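As a simple illustration of that last remark, the sketch below computes an NPV under the interpretation that the first period's cash flow is already divided by (1 + r)^1 rather than left undiscounted. The rate and cash flows are arbitrary example numbers, not MiningMath defaults.

    # Illustrative only: discounting starts in the first period, i.e. period t
    # uses a factor of 1 / (1 + r)^t with t starting at 1.
    rate = 0.10                      # assumed 10% discount rate
    cash_flows = [50.0, 40.0, 30.0]  # hypothetical undiscounted cash flows per period

    npv = sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows, start=1))
    print(round(npv, 2))  # 45.45 + 33.06 + 22.54 = 101.05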

In the Scenario tab, under the Production option the user can define period ranges.

The user is able to edit only the field To. Subsequent periods will have their From field adjusted automatically. It is also possible to adjust the time frame for each period in each range.

Such ranges allow the user to vary variables over time such as:

  • Production limits
  • Limiting surfaces
  • Average (blending) and sum constraints
  • Economic values
  • Geometric constraints

The Timeframe Panel allows users to set values for their projects, ensuring more accurate sequencing. These values are applied to calculations on a timeframe basis. Users can either select a predefined value or input a custom one to suit their specific needs.

For example, in this case, we selected a timeframe of one year. This means that each generated period in the sequencing corresponds to a single year.

In the Production panel the user can define limits for any destination added.

In this example, we have the following limits:

  • Process 1: 30,000,000 t
  • Total: 60,000,000 t
  • Dump 1: <unlimited>

These limits are being considered from Period 1 up to the end of the life of mine.

Multiple period ranges based on different timeframes can also be added.

In this image, two period ranges were defined as follows:

  • From period 1 to 4, each period represents one year of production.
  • From period 5 until the end, each period represents three years of production.

This approach allows you to fragment the life of the mine and adjust parameters over time as needed. The Add Range and Remove buttons, located at the bottom-right corner, make it easy to add or remove period ranges.

Average

Average constraints are based on the average of any quantifiable parameter modeled block by block. To use this feature in MiningMath, the dataset must contain an auxiliary field/column holding, for each block, the value of the variable you wish to limit. This feature then controls the average value of that variable over the blocks mined in each single period. Since the constraint acts on an average, the algorithm can combine lower values to respect the target while using higher ones to increase the NPV.

This feature is usually applied to blending, combining low-grade and high-grade blocks in order to increase profitability, although it has many other applications. Basically, any variable that can be modeled under these assumptions can be controlled.

Video 1: Blending and other constraints.

Some examples using average are listed below:

  • Grade of a contaminant at the plant.

  • Haulage distance, based on the destination of each block.

The user can define:

  • Minimum and maximum average limits.

  • Different limits for different materials.

  • Different limits for different intervals.

  • Different limits for different destinations.

  1. Create auxiliary fields in the block model, quantifying the information to be controlled.

  2. During the importation, assign the column to be blended to Grade (Figure 1).

  3. On the Average tab, input minimum and maximum limits for each variable (Figure 2a), period range (Figure 2b), other weights to be considered (Figure 2c) and destination (Figure 2d).

Figure 1: During the importation, Cu and Au are assigned to "Average".
Figure 2: Fields where the user can input limits (A), for each period range (B), Weights (C) and each destination (D).
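If you want to check the resulting schedule against the limits above, the sketch below recomputes a tonnage-weighted average per period from the MinedBlocks.csv output. The column names (Period Processed, Tonnage, Cu) are assumptions and may differ from your file.

    import pandas as pd

    # Hypothetical column names; adapt to your own output file.
    mined = pd.read_csv("MinedBlocks.csv")
    processed = mined[mined["Period Processed"] != -99]

    weighted_avg = processed.groupby("Period Processed").apply(
        lambda g: (g["Cu"] * g["Tonnage"]).sum() / g["Tonnage"].sum()
    )
    print(weighted_avg)  # compare each period against the Min/Max limits of the Average tab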

Sum

What is a Sum constraint?

Sum constraints are based on the sum of any quantifiable parameter modeled block by block. To use this feature in MiningMath, the dataset must contain an auxiliary field/column holding, for each block, the value of the variable you wish to limit. This feature then controls the total amount of that variable over the blocks mined in each single period. Basically, any variable that can be modeled under these assumptions can be controlled.

Video 1: Blending and other constraints.

Some examples are listed below:

  • Tonnages and proportions of rock type and metal production.

  • Consumption of inputs such as energy spent during comminution, and fleet hours spent to mobilize material.

  • Contaminants control on the processing plant during each period.

  • Blasting material consumption.

The user can define:

  • Different sum limits for each material.

  • Different sum limits for each interval.

  • Different sum limits for each destination.

  • Combine all the options above in order to achieve globally optimized results.

  1. Create auxiliary fields in the block model, quantifying the information to be controlled (Figure 1).

  2. During importation, assign these auxiliary columns to Sum (Figure 2).

  3. On the Sum tab, input minimum and maximum limits for each variable, period range and destination (Figure 3).

Material Types

Mining fronts, rock type, or lithotype are usually defined in the block model as strings or converted to integer numbers. As the next step, the user defines which ranges of material types should be allowed, avoided or forbidden in the processing plant.

The ideal way to model material types for further control of this variable is to create tonnage columns for each material type. Therefore, you will be able to:

  • Control material types to be allowed, avoided, or forbidden in any destination.

  • Control the proportions of different material types, if applicable.

  • Analyze scenarios with different levels of flexibility.

  • Understand how the project development changes in the face of each hypothesis tested.

  • Assess the impacts of the flexibility level given on economic and technical variables of the system.

The general idea is to create auxiliary columns in the block model to control any variable through their sum. Then, use the same idea as if-then-else statements, for example:

  • If the variable value matches a condition

  • Then the auxiliary column equals X

  • Else the auxiliary column equals Y

To control the amount of material by rock type, create columns for tonnages of each lithotype, as shown in Figure 1, where:

  • Lithotype A has its tonnage inputted in the field Tonnage A.

  • Lithotype A has zero in the fields Tonnage B and Tonnage C, as it does not match the specified condition, i.e. being lithotype B or C.

The same concept is applied to the other material types, as sketched below.
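A minimal sketch of this if-then-else idea is shown below; the Lithotype and Tonnage column names are assumptions used only for illustration.

    import pandas as pd

    # Hypothetical column names; one tonnage column is created per lithotype.
    model = pd.read_csv("BlockModel.csv")

    for litho in ["A", "B", "C"]:
        # If the block matches the lithotype, the new column carries its tonnage; else 0.
        model[f"Tonnage {litho}"] = model["Tonnage"].where(model["Lithotype"] == litho, 0.0)

    model.to_csv("BlockModel_material_types.csv", index=False)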

Figure 2 shows the same concept being applied to measured, indicated, and inferred resources.

Whatever the variable being modeled, the columns created for this purpose should be assigned to the field type “Sum” during the importation, as shown in Figure 3.

On the interface, the user will need to insert the General and Destinations parameters in order to enable the Sum tab (Figure 4). The next step is to define the limits to be imposed on each variable, for which destinations, and for which period intervals.

Figure 3: Importation screen where variables to be controlled through sums are properly assigned to the field type “Sum”.

Figure 4: Illustrates the interface and options available.

Mining Fronts

The ‘mining fronts’ approach, as shown in Figures 1 and 2, is a good way to refine results, control regions, and understand the best results considering different amounts extracted from a specific place. Using this methodology, you can categorize masses by depth, by a specific coordinate interval, or even into sectors based on a 360-degree view of your project.

The general idea is quite similar to what is used to define material types. The first step is to identify the area you want to control. Then create an additional field with the mass in that region and assign it as a sum constraint while importing it. Finally, go to the Sum tab and control the minimum or maximum amount at the process or dump destinations.

Note that by using this feature you are able to control what, how much, and when blocks should be mined, according to what you want to analyze. This concept is also very useful in the context of mine design: although it increases the complexity compared with the use of force/restrict mining surfaces, it can also guide solutions geometrically. It is important to remember that the sum constraint has a high priority in the algorithm and can also influence the other inputs based on the hierarchy order.

Identify and define your regions as you wish. For instance, use this Excel file, insert the Z values from the Data Validation scenario, calculate the elevation difference with respect to the topography, filter the region with a result of 0, and create a scatter chart to identify the final pit. Then take the coordinates/indexes that will be the limits of your mining fronts; in this case, the boundary between the mining fronts was at index 35. Figures 3 and 4 provide some visual information to help you.

Calculate the tonnage of your mining front in an additional field and import it as a sum parameter. In this case, mining front 1 covers the region above or equal to the chosen limit, and mining front 2 covers what is below it, as seen in Figure 4.
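A possible sketch of this step is shown below, using the index boundary of 35 mentioned above; the IX and Tonnage column names are assumptions.

    import pandas as pd

    # Hypothetical column names; the boundary follows the example above (index 35).
    model = pd.read_csv("BlockModel.csv")
    boundary = 35

    # Mining front 1: region above or equal to the chosen limit; front 2: below it.
    model["Front 1 Tonnage"] = model["Tonnage"].where(model["IX"] >= boundary, 0.0)
    model["Front 2 Tonnage"] = model["Tonnage"].where(model["IX"] < boundary, 0.0)

    model.to_csv("BlockModel_mining_fronts.csv", index=False)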

Use the Sum tab to control the mining front for each destination. Play with the minimum or maximum that should be extracted from each mining front and observe how the results change. The scenarios shown in Figures 5 to 9 used the production, geometric, and stockpiling parameters from the Data Validation page.

Download this modified Marvin Deposit file and play on your own.

Stochastic Models

Stochastic simulations require equiprobable models to consider uncertainties related to geological aspects, such as the grade and/or volume of ore.

While single scenarios of distinct models are run separately, a stochastic scenario consists of obeying all the single scenarios at once.

This is achieved through an adapted resource block model that contains equiprobable values for a given set of variables containing a certain level of uncertainty.

As a consequence, MiningMath produces reports with the risk profile of indicators, presenting the minimum, maximum, expected, and percentile (P10 and P90) values; these are threshold values, indicating that 10% of the indicators are below the P10 and 90% of the indicators are below the P90. Figs. 1 and 2 depict the graphs for the NPV and cumulative NPV, respectively.
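As a generic illustration of these indicators (not a MiningMath output), the sketch below computes the same statistics for a hypothetical set of per-simulation NPVs for one period.

    import numpy as np

    # Hypothetical NPVs of one period over 20 equiprobable simulations.
    npvs = np.random.default_rng(1).normal(loc=100.0, scale=10.0, size=20)

    p10, p90 = np.percentile(npvs, [10, 90])
    print(f"min={npvs.min():.1f}  P10={p10:.1f}  expected={npvs.mean():.1f}  "
          f"P90={p90:.1f}  max={npvs.max():.1f}")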

Fig 1: Report on NPV for the stochastic model, showing the minimum, maximum, expected, P10 and P90 NPV per period.

Fig 2: Report on cumulative NPV for the stochastic model, showing the minimum, maximum, expected, P10 and P90 cumulative NPV per period.

The purpose of this page is to briefly explain how to import data and manage stochastic constraints using MiningMath.

Formatting Uncertain Fields

Uncertain fields are those which might vary from simulation to simulation. By definition, stochastic models have uncertain fields. Typically, grade fields contain uncertain information. Therefore, the user needs to format each equiprobable possibility in a specific way: name each uncertain column with the same name followed by {#} (where # is a number from 1 up to n). The list below highlights how the grade headers should look, for example:

  • Copper {1}

  • Copper {2}

  • Copper {3}

  • Copper {4}

  • Copper {5}

Note that the grade information will influence the economic values for the processing stream. Therefore, the user will need to calculate the Economic Values for each possible grade realization, as highlighted in Figure 3. This figure illustrates parts of a simulated copper deposit that can be downloaded here.

Figure 3: Stochastic values for copper grade (green) and respective process (blue) for 20 simulations.
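A hedged sketch of this preparation step is shown below. The price, recovery, costs and the formula itself are illustrative assumptions, not MiningMath's economic formulation; only the {#} naming convention comes from the text above.

    import pandas as pd

    # Hypothetical economic parameters and column names.
    model = pd.read_csv("BlockModel.csv")
    price, selling_cost = 6000.0, 300.0      # $/t of metal (assumed)
    recovery = 0.88                          # plant recovery (assumed)
    mining_cost, processing_cost = 2.0, 8.0  # $/t of rock (assumed)

    n_sims = 20
    for i in range(1, n_sims + 1):
        grade = model[f"Copper {{{i}}}"] / 100.0             # % Cu to fraction
        revenue = grade * recovery * (price - selling_cost)  # $/t of rock
        model[f"Process {{{i}}}"] = model["Tonnage"] * (revenue - processing_cost - mining_cost)

    model.to_csv("BlockModel_stochastic.csv", index=False)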

Stochastic Constraints

Once you import your stochastic block model, the tabs Average and Sum will allow for constraints both on:

  1. Expected values to control the averages over all simulations.

    These constraints will guarantee that, on average, the indicators will be within the defined ranges. For example, take Expected Min = 0.60 and Expected Max = 0.65 for a certain constraint. If there are 3 simulations returning 0.59, 0.62 and 0.65, the average is 0.62, so this is within the defined range.

    Fig 4: Example to control the average of all simulations. This option is only available when databases containing stochastic data are imported.
  2. All simulations, to guarantee that each one of them respects certain criteria.

    These constraints control the variability, or the spread, of the results to be within a certain acceptable range. Let's take an example where such a range has Min = 0.60 and Max = 0.65, and again three simulations returning 0.59, 0.62, and 0.65. In this case the solution will be penalized by the optimizer, as 0.59 < 0.60. Learn more about penalized solutions here.

    Fig 5: Example to control all simulations individually. This option is only available when databases containing stochastic data are imported.

Stochastic optimization combines all these modelled uncertainties into one schedule that maximizes the Expected NPV of the project.

Violated constraints

After executing the optimization with constraints for simulated indicators, it is possible that such constraints will not be respected due to some infeasibility in the problem (more about infeasibilities here).

MiningMath will try to resolve any violated constraints following the hierarchy order depicted in Fig. 6. Stochastic constraints are average and sum constraints; they have a higher priority than any NPV improvements and the time limits imposed by the user.

Figure 6: Constraints hierarchy order.

Time Limit

It is possible to indicate a time limit in hours before running a scenario in the “Run” tab, as depicted in Fig. 1. The time limit is defined in hours due to the usual complexity of mining projects and the fact that MiningMath will always try to deliver a reasonable solution.

Figure 1: Time limit option in the interface.

MiningMath is built on a global and iterative algorithm. It solves the entire mining optimization after formulating a global mathematical model. The result of such an optimization might deliver a solution with room for improvement, due to necessary approximations for solving complex non-linear restrictions, such as the geometric ones, or due to infeasibilities identified in the problem’s restrictions. In turn, if an improvement is possible, another iteration of the global algorithm is prepared and executed.

Therefore, in order to deliver any solution, the whole mining problem needs to be solved at least once, which makes it impossible to set a more fine-grained time limit (i.e. seconds or minutes). In other words, the time limit is evaluated before each iteration of a global optimization that executes multiple times, as depicted in Fig. 2.

Illustration of when time limit is evaluated. From steps 1 to 4.
Fig 2: Illustration of when time limit is evaluated. From steps 1 to 4.

The algorithm is designed in such a way that it is able to adjust subsequent iterations once it has identified that the time limit is becoming restrictive. However, it is important to highlight two aspects of such an adjustment:

  1. It will not interrupt the current iteration of the algorithm. Hence, while it is expected that this adjustment will help the execution to achieve the desired time limit, it is still possible that it will take more than what was defined.

  2. Once an adjustment is made, a different problem will be defined and consequently new solutions will be explored. Thus, while unlikely, there is a chance that solutions will end up better than those unrestricted in relation to time. Therefore, despite not being implemented for this purpose, the time limit might be used to find more diverse solutions. For instance, you might build decision trees with different time limits. Even if better results are not obtained, fast solutions will still give you a quicker assessment of your project.

Predefined Destinations

Predefined destinations refer to a predetermined assignment of individual block destinations (such as waste or process) within a mining operation before any optimization is performed. We have already seen that MiningMath works with Economic Values for each destination, taking each of them into account to decide whether or not a block should be mined and where it should be sent. Thus, fixing or predefining destinations is generally no longer a concern with MiningMath technology.

However, it might still be necessary if you are using MiningMath to define pushbacks while making use of other constraints, comparing the MiningMath technology with other software solutions, or just want to reduce MiningMath run time by accepting a less optimized solution.

Applications

  1. Predefine destinations to define pushbacks.

  2. Lithologic restrictions that prevent certain blocks from being processed. For example, preventing blocks of a given rock type from being sent to a processing plant.

  3. Speeding up the algorithm's run time (while accepting a possible loss of NPV, due to an unoptimized choice of destinations).

  4. Among others.

How to predefine destinations?

Predefine using the block model file

When formatting and importing your block model csv file, you can have a predefined destination column, as depicted below. This column will indicate the fixed destination for each block.

When importing the csv file, make sure to define the field type of your destination column as Predefined Destinations.

Predefine using the calculator

Destinations can also be predefined in the Calculator area. The figure below depicts a new parameter, Destinations, that is set to 1 (a process destination) if the CU grade is greater than or equal to 0.5, or -999 (a non-existent destination) otherwise. Note that the field type is set to Predefined Destinations.
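Outside the interface, the same rule could be reproduced on the csv file with a short script such as the sketch below (the column names are assumptions):

    import pandas as pd

    # Destination 1 (process) if CU >= 0.5; otherwise -999 (non-existent destination).
    model = pd.read_csv("BlockModel.csv")
    model["Destinations"] = (model["CU"] >= 0.5).map({True: 1, False: -999})
    model.to_csv("BlockModel_predefined.csv", index=False)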

Using the predefined destinations

After creating the new parameter (using the calculator or importing the field in the block model file), make sure it is being used in the Scenario tab as depicted below.

Verifying results

In the Viewer tab you can verify the destinations. Just select the Destinations field in the Blocks area to check the destination value of the filtered blocks. In the example depicted below, all destinations are filtered to 1.

Discounted vs. Undiscounted Cash Flow

MiningMath’s objective function maximizes the discounted cash flow over the entire life of the mine in a single mathematical optimization step, taking all necessary constraints into account simultaneously. In contrast, other software packages that use LG/Pseudoflow methods for pit optimization focus on maximizing the undiscounted cash flow for each given revenue factor. Hence, the solutions provided by MiningMath are not easily comparable to undiscounted flow approaches that only consider slope angles.

A visual comparison between undiscounted and discounted cash flows is provided below. This comparison indicates that MiningMath’s decision not to mine certain regions is likely due to the higher cost of waste removal outweighing the potential profit from extracting hidden ore. Despite discounting, the revenue from the hidden ore is insufficient to cover the extraction costs in these areas.

Undiscounted versus discounted cash flow optimization.
Undiscounted versus discounted cash flow optimization regarding a minimum mining width.

Comparing the different methodologies

A proper comparison between both methodologies could be done if you import the final pit surface obtained from the other mining package into MiningMath, and use it as Force/Restrict mining. By utilizing this surface as a guide, MiningMath can precisely optimize scheduling within the specific boundaries delineated by the imported surface. This integration simplifies the comparison of NPV between MiningMath and various other mining packages, giving a more comprehensive evaluation of the methodologies employed by each one.

For comparisons targeting specific objectives, such as maximizing early-period cash flow or identifying profitable areas in the initial years, the following examples will guide you in setting up the appropriate scenarios within MiningMath.

Cash flow in the early periods

If you want to emphasize the cash flow in the early mining periods, simply create a decision tree varying the discount rate. The higher the rate value, the more weight will be given to the early periods, leading the undiscounted cash flow to have higher values at the beginning, while later periods will be heavily penalized by the discount rate.

Discount rate field.
Custom decision tree with scenarios employing different discount rates.

Greedy behavior as in the context of nested pits

If you wish to mimic the same greedy behavior as in the context of nested pits in MiningMath, you should drop all constraints and set a 1-<end> (only) interval with the desired ore productions, asking MiningMath to focus solely on maximizing the cash flow of this single pit, regardless of the long-term consequences, as in the picture below and similar to this process.

Focusing on finding the most profitable area during the early years

If you wish to focus on finding the most profitable area to operate during the early years, and the long-term consequences of such decisions are not an immediate concern, you must set up MiningMath accordingly, dropping all constraints after the initial years. For example, it is possible to have only a 1-5 interval (representing the first 5 years) and a 6-<end> interval with no constraints and unlimited production, as depicted below.

An even greedier approach would be to have just the 1-5 interval, meaning “I only have these 5 years to operate this mine”. In this case, you could also set the discount rate to zero, so that you can analyze only the undiscounted cash flow.

MiningMath allows you to keep as much of the global view as you wish while constraining your project as much as needed, even changing criteria between the short and long term. If you’re unsure, simply run multiple scenarios and choose the one that best meets your objectives.

Azimuth Rotation Procedure

MiningMath supports the use of block models that have been rotated using an Azimuth rotation.

Example of Azimuth rotation (from North to East) with θ angle.

Rotation steps

This procedure performs a rotation of a point around a specific origin in the \(XY\) plane, while preserving the \(Z\) coordinate. Below are the steps involved, along with the corresponding mathematical formulas.

  1. Azimuth normalization

    First, the azimuth value is adjusted to ensure it lies within the range of \(0\) to \(360\) degrees. This is done by taking the remainder of the azimuth divided by \(360\):

    \(\text{azimuth} = \text{fmod}(\text{azimuth}, 360.0)\)

    In mathematics, the function \(\text{fmod}\) (floating-point modulus) computes the remainder of the division of two floating-point numbers. If the azimuth value is negative, \(360\) degrees is added to make it positive:

    \(\text{azimuth} = \begin{cases}
    \text{azimuth} + 360.0 & \text{if } \text{azimuth} < 0.0 \\
    \text{azimuth} & \text{otherwise}
    \end{cases}\)

  2. Inversion of Rotation Direction

    The adjusted azimuth is then inverted to perform the rotation in the opposite direction:

    \(\text{azimuth} = -\text{azimuth}\)

  3. Conversion from Degrees to Radians

    The azimuth in degrees is converted to radians, as trigonometric functions use radians:

    \(\text{radians} = \frac{\text{azimuth} \times \pi}{180.0}\)

  4. Calculation of New Coordinates

    The new \(x\) and \(y\) coordinates are calculated by applying the rotation in the \(XY\) plane around an origin \((x_0, y_0)\). The formulas used are:

    \(
    p_x = \left( (x - x_0) \times \cos(\text{radians}) + (y - y_0) \times \sin(\text{radians}) \right) + x_0\)

    \(
    p_y = \left( -(x - x_0) \times \sin(\text{radians}) + (y - y_0) \times \cos(\text{radians}) \right) + y_0\)

    Where \((x, y)\) are the initial coordinates of the point, and \((x_0, y_0)\) are the coordinates of the origin.

  5. Preservation of Z Coordinate

    The \(z\) coordinate of the original point remains unchanged during the rotation:

    \(z_{\text{final}} = z_{\text{initial}}\)

  6. Final Result

    The result of the procedure is a new point with coordinates \((p_x, p_y, z)\), where \(p_x\) and \(p_y\) are the new coordinates in the \(XY\) plane after the rotation, and \(z\) is the original coordinate in the \(Z\) axis.

    \(\text{New Point} = \{ p_x, p_y, z \}\)
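Putting the steps above together, a minimal Python sketch of the procedure could look like this:

    import math

    def rotate_point(x, y, z, x0, y0, azimuth):
        """Rotate (x, y) around the origin (x0, y0) in the XY plane, keeping z."""
        # 1. Azimuth normalization to the range [0, 360)
        azimuth = math.fmod(azimuth, 360.0)
        if azimuth < 0.0:
            azimuth += 360.0

        # 2. Inversion of the rotation direction
        azimuth = -azimuth

        # 3. Conversion from degrees to radians
        radians = azimuth * math.pi / 180.0

        # 4. New coordinates in the XY plane
        px = (x - x0) * math.cos(radians) + (y - y0) * math.sin(radians) + x0
        py = -(x - x0) * math.sin(radians) + (y - y0) * math.cos(radians) + y0

        # 5. The Z coordinate is preserved
        return px, py, z

    # Example: a 90-degree azimuth rotation of (1, 0, 10) around the origin
    print(rotate_point(1.0, 0.0, 10.0, 0.0, 0.0, 90.0))  # approximately (0.0, 1.0, 10.0)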

Academy

Teaching

Training Presentation

Assignment

John is a mine planning engineer working on the Marvin deposit. He is developing the long-term production schedule and is concerned with what would be the best scenario to optimize Marvin’s operations.

The mine has some constraints, such as slope angles defined block-by-block and a default value of 45 degrees for blocks missing information. The total movement is limited to 60 Mt/yr.

Part 1

  1. Format the data, calculate the economic values, and import this Marvin data. Save it as MMCI_YourName_Marvin.

  2. Choose your own economic parameters and fill in Table 1b to simulate a Process B.

Figure 1: Block model format

Part 2

Use the following steps to evaluate your project:

Now that you know MiningMath is able to easily run multiple scenarios, you should play with the remaining parameters, evaluate the results and bring a proposal for the board. You should:

  1. Download Screencast O-Matic or equivalent screen recording software.

  2. Prepare a presentation (ppt or docx), which could be structured by answering the following questions:

    a) What is the goal of your evaluation?
    b) How did you use MiningMath to achieve your goal? Which features were used? What was the methodology used to get to your results?
    c) Why did you get such results?

  3. Record it as if you were reporting your analysis to the board. Upload it on YouTube as public. The video should be limited to 5 minutes.

  4. Send the link, materials, and the Marvin data used to your teacher.

Questions

  1. What are the main resources of MiningMath viewer and which are the output files?

  2. Could market changes be considered based on different ore selling prices throughout time? If yes, how?

  3. Is it possible to forbid the optimization to access some specific region? If yes, which feature could be used?

  4. Which geometric features could be used to get results accordingly with the operational needs?

  5. What are the benefits of building decision trees?

  6. What are the main differences between Lerchs-Grossmann/Pseudoflow and DBS methodologies?

  7. What is the importance of considering a discounted cash flow on the optimization decision-making process? Does the value of money through time impact the mining sequence?

  8. Could the optimization handle multiple destinations? Over 2 processing plants? One stockpile for each with maximum capacity?

  9. Could production ramp-ups be part of the optimization constraints?

  10. Is it possible to control the average of a given variable, such as grade, haulage distance, etc.? How many properties can be controlled? Is it possible to change them over the mine's life?

  11. How does the cut-off grade policy work on MiningMath?

  12. What are the main validations before running a scenario?

  13. Can MiningMath run more than one scenario simultaneously?

  14. Which are the 3 main suggestions of the integrated workflow?

  15. Is it possible to generate a short term planning using MiningMath?

Press Releases

Strategic optimization

Is mining an ordinary business?

November 22, 2019.

“Mining is a business. This may sound obvious. But when one tries to look at how we operate that becomes much less evident. Are we really looking at what would be important for any business? I’ve had this discussions throughout my career in mining and I must say in many cases technical people, i.e. geologists, mining engineers, processing engineers – those who run the mines – do not look at business metrics. At all!!! Let’s look at a miner. “Let’s move tonnes and increase equipment utilization”, one would say! This person will see the efficiency improvements in higher utilization. Does it help increase shareholder returns? No, not really. It has nothing to do with it.”

Alexey Tsoy, LinkedIn Article

Startup wants to reduce dam risks

March 21, 2019

“The last two tragedies that desolated Brazil, leaving hundreds of deaths and miles of environmental devastation, from Brumadinho and Mariana, raised a great question: it is possible to reduce the risks of tailing dams and to guarantee everyone’s safety in an efficient and economically viable way?

MiningMath, startup from Belo Horizonte, says yes. The company has created a software, SimSched, which, through the union of modern programming, and data science, allows the combination of any variables from a mining project to generate analysis, hypothesis and possible results. The objective is to contribute to improve the decision-making processes in the mining companies, in order to effectively consider the economic, social and environmental aspects of the business.”

Diário do Comércio · Gira Betim

Startups from SEED visit the Mining Hub

March 18, 2019

“Five Startups of the Minas Gerais Government Accelerator (SEED – Startups and Entrepreneurship Ecosystem Development) from several segments, with solutions for business areas, visited last Tuesday (March 12th) the Mining Hub in Belo Horizonte (MG), to strengthen the relationship and exchange experiences on the market.

Among the startups were: MiningMath, which markets a solution that supports decision making at a strategic level specific to the mining industry, from designing a project to optimizing the value chain during the development phases; Recrutamento Inteligente (“Intelligent Recruitment”), which facilitates the management of intellectual capital; The Mindset, which acts in the management of mental health professionals; Cargo Sapiens, which offers management and compliance solution for international logistics; VG Resíduos, which connects producers and consumers of waste.”

Portal da Mineração

Startup develops software that facilitates data management of deposits

December, 2018

“MiningMath has developed mine data management software that promises to reduce costs and anticipate the environmental consequences of mining. The tool crosses and organizes data from different departments of a mining company to generate scenarios with the possible geological, environmental, economic and social impacts of an action in a given location.”

DCI

From left to right: the engineer Matheus Ulhoa and the partners of MiningMath, Fabrício Ceolin and Alexandre Marinho (MiningMath).

Industry 4.0 increasingly present in areas of the primary sector

November 6, 2018

“With the advent of industry 4.0, traditional and grassroots activities such as mining start to gain new ground in Brazil and the world. The use of innovative technologies such as data science, internet of things and optimization of processes breathe new life into the industry, promoting added value gain to the minerals extracted in the country, such as iron ore, whose largest production is concentrated in Minas Gerais.

Proof of this is that mining startup MiningMath recently won first place at the MineTech Mining Solutions Technical Challenges Challenge with an innovative technology to simulate scenarios to support strategic mining decisions using modern techniques of programming.”

Diário do Comercio

Event in the Capital discussed a more sustainable performance, besides the optimization of processes and improvements in the performance of the sector (Vale Agency).

Mining Software developed by UFMG Alumni is awarded in competition in Russia

October 25, 2018

“The software combines variables ranging from geological aspects to economic data and legal, environmental and social constraints.

An innovative technology designed to simulate scenarios to support strategic decisions in the mining field, developed by former students of UFMG, Alexandre Marinho and Fabrício Ceolin, won the first place in the MineTech: Technical Challenges and Mining Solutions Challenge in early October , held in Moscow during the 14th Russia Exploration and Mining Forum.”

UFMG · Mining.com · Hoje em Dia · SIMI · Notícias de Mineração · FUNDEP

Fabrício Ceolin (left) and Alexandre Marinho (right) are the founders of the MiningMath startup (Personal Files).

Alexey Tsoy to present at the MiningMath 2018 Creating Value in Mining Conference

October 16, 2018

CSA Global Principal Consultant-Corporate, Alexey Tsoy will present at the upcoming MiningMath Creating Value in Mining; Strategy Optimisation through Data Science Conference on November 6 2018 at the Museum of Mines and Metals in Belo Horizonte, Brazil.

Alexey will present on Strategic Schedule Optimisation.

Winner of the 2018 MineTech Competition

October 16, 2018

Alexey Tsoy, Principal Consultant – Corporate and Business Development at CSA Global, is the winner of the 2018 MineTech Competition: 2nd Mining Technical Challenges and Solutions Competition’s at the 14th Russian Mining and Exploration Forum held between 2-4 October in Moscow, Russia.

“Alexey presented on Strategic Schedule Optimisation; forming part of this year’s MINEX conference ‘Building Up Innovative Excellence in Mining and Exploration’ […]. The proposed approach is based on a software called SimSched Direct Block Scheduling (DBS) […]. The software applies a simplistic approach where a block model is optimized on economic values assigned to each block in pre-processing. The simplicity allows a great degree of flexibility on assigning the economic values. Moreover it allows definition of variable costs based on time, residence time, or indeed any other parameter that can be calculated and limited in time.”

“The software that could revolutionize mine planning”

April 9, 2018

This publication states that traditional planning includes the discount rate, production limits, blending, and other technical-economic variables during advanced stages of planning, and that SimSched brings the possibility of reducing the number of planning iterations.

“A unique and flexible tool for planning, which currently doesn’t exist in the industry. ” — Fabián Lemus, Senior Long-term Mine Planning Engineer.

The Scheduling The Sequences (STS) project, conducted by Antofagasta Minerals, includes a co-development of SimSched DBS, where major tests will focus on Minera Centinela, a multi-pit project with multiple processing streams that might benefit from an integrated plan. This will provide flexibility to analyze and evaluate development scenarios and the inclusion of new projects.

Access the original content.

Integrated approaches in the industry

How digital innovation can improve mining productivity

With profits down, miners are focused on improving their productivity. Digital innovation could provide a breakthrough.
Read more.

Digital Transformation—The future of Mining

In a challenging market, the digital transformation of mining companies has become a business imperative—leveraging technology to improve processes aligned to value.
Read more.

A tool for these times

Considering the current resurgence in commodity prices, every mining company should concentrate on strategy optimization to ensure that operations move from a cost-focused mindset to one centred on value maximization in order to reap the benefits of the upturn in the mining industry.
Read more.

Waste Dump Sequencing with SimSched

SimSched Direct Block Scheduler (DBS) is an open pit optimization package that selects the maximum-NPV pit shell while generating a mining schedule. Read more.

Innovation and Technology to Improve Open Pit Mine Plan and Design Optimisation

In mining, the change in technology, i.e. processes and software, does not happen fast; it takes around 10 to 15 years, as miners are very… (LinkedIn article)

Academic Partners

Mine Optimizations and SimSched DBS is a friend of yours now. SimSched DBS is the most powerful tool to achieve and compare NPV results of the pit. Read more.
 

Publications

2018 Sensitivity Analysis applied to operational parameters, IFG, Brazil (Portuguese)

2017 NPV Analysis as a function of the discount rate and cost of rehandling implementing SimSched DBS to open pit mining, Universidad Nacional de Colombia, Colombia (Spanish)

Courses

2017 Universidad del Azuay

2017 Universidad de San Luis, Argentina (Spanish)

We are on LinkedIn

For decades, the mining industry has dealt with Mine Planning as a step-by-step process. This traditional technology was established in an intelligent manner in the face of the technological limitations of its time.

We’d like to share the news that the main limitations of open pit mine plan and design based on Lerchs-Grossmann have been surpassed by a novel and innovative technology.

Spanish | Portuguese

The concept of Pit Optimization is becoming obsolete. Here we bring a comparison showing the variety of outputs you can obtain from a single optimization.

Spanish | Portuguese

Common Issues

Reported user issues

Take a look at some notable user issues shared on our forum. Feel free to post your own questions or lend a hand to fellow MiningMath users!

Forum posts

Unexpected Results

"I’m running MiningMath, but it’s mining a less profitable area, or even waste, or it’s not achieving the specific constraint I’ve set... Why is that?" To understand this, it’s important to recognize that not everything that is aimed to be incorporate into a mining project is mathematically feasible when attempting to respect all constraints simultaneously. Handling multiple,..."

Last active: 4 months ago

Problems using Vertical Rate

Hello friends, I am writing to make known a problem I have and see if you can help me. I am doing my degree work with MIMA. When I add physical constraints like ML, MW, MB everything goes fine, but when I add Vertical Rate, the optimization process does not exceed 16%. Is there any solution for thi..."

Last active: 1 year, 1 month ago

I have an issue at the importing my block model

When I import the block model it returns an error "invalid index: (1, 75, 1)."

Last active: 2 years, 3 months ago

Warnings

  Warning 1101

The Force Mining surface (S1) was initially above the topography. Correction applied: S1 has been projected onto the level of the topography.

  Warning 1102

The Force Mining (FM) surface used was initially below the economically viable last period surface. Correction applied: FM depth has been reduced by increasing its Z-level.

  Warning 1103

The Force Mining surface (S1) used for period X+1 was initially above the one (S2) used for period X. Correction applied: the elevation of S1 has been reduced to match S2.

Note: ‘X’ can represent any period value (1, 2, 3, etc.)

  Warning 1104

The Force Mining surface (S1) employed was initially below the Restrict Mining surface (S2) during period X. Correction applied: S1 has been adjusted upward to align with the elevation of S2.

Note: ‘X’ can represent any period value (1, 2, 3, etc.)

  Warning 1105

The restrict mining surface used violated slope constraints of the last period surface. Correction applied: the surface has been adjusted accordingly.

  Warning 1106

The force mining surface used violated slope constraints of the last period surface. Correction applied: the surface has been adjusted accordingly.

  Warning 1107

The force mining surface (S1) imported for the last period surface was initially below the restrict mining surface (S2) imported for the same purpose. Correction applied: the elevation of S1 has been increased to align with that of S2.

  Warning 1108

The restrict mining surface (S1) imported for the last period surface was initially above the force mining surface (S2) imported for the same purpose. Correction applied: the elevation of S1 has been decreased to align with that of S2.

  Warning 1109

The force mining surface (S1) did not initially provide sufficient space to meet minimum operational constraints. Correction applied: adjustments have been made to S1 in critical areas.

  Warning 1110

The restrict mining surface (S1) was initially below the origin Z level. Correction applied: S1 has been projected to align with the origin Z level.

  Warning 1111

The force mining surface (S1) was initially below the origin Z level. Correction applied: S1 has been projected to align with the origin Z level.

  Warning 1112

The edges of the restrict mining surface (S1) were initially below the topography level. Correction applied: the edges of S1 have been projected to align with the topography level.

  Warning 1113

The edges of the force mining surface (S1) used were initially below the topography level. Correction applied: the edges of S1 have been projected to align with the topography level.

  Warning 1114

The restrict mining surface (S1) used was initially above the topography. Correction applied: S1 has been projected to align with the topography level.

  Warning 1115

The restrict mining (RM) surface utilized was initially above the economically viable last period surface. Correction applied: the depth of RM has been increased by reducing its Z-level.

  Warning 1116

The restrict mining surface (S1) used for period X-1 was initially below surface S2 used for period X. Correction applied: the elevation of S1 has been increased to match that of S2.

Note: ‘X’ can represent any period value (1, 2, 3, etc.)

  Warning 1201

Slope angles for the restrict mining surface have been adjusted for period X.

Note: ‘X’ can represent any period value (1, 2, 3, etc.)

  Warning 1202

Slope angles for the force mining surface have been adjusted for period X.

Note: ‘X’ can represent any period value (1, 2, 3, etc.)

  Warning 1401

Failure while importing CSV. Invalid index: .

This is usually an error related to the origins. We recommend checking the origins from the previous mining package, otherwise, MiningMath’s results won’t match the actual coordinates. Read more about formatting your coordinates here.

  Warning 2301

The vertical rate has been adjusted by N meters to adhere to production and/or surface constraints in period X. Note: ‘X’ can represent any period value (1, 2, 3, etc.). ‘N’ can represent any value in meters.

  Warning 2401

The production capacities informed may force a mining schedule with more than 100 periods. Do you want to continue anyway?

This warning message is designed to prevent users from spending time on incorrect scenario setups by flagging potential inconsistencies with the block model data.

It often occurs due to typos, such as inputting values with lower magnitudes than intended; for instance, a production meant to be 10 million tons per year that is mistakenly entered as 10 thousand tons per year. MiningMath conducts basic validation checks before each scenario optimization, comparing parameters like production limits with the available material. If values are significantly lower than expected, it triggers a warning, as it may result in an unrealistically long mine life, typically over 100 years.

  Warning 2403

The pre-defined scenarios with the Marvin deposit can appear with red warnings.

This means that, during the installation process, MiningMath could not find a folder to place them in. Therefore, to run such a scenario you just have to click on it, choose where the files should be saved, and save them.

It is worth mentioning that the Marvin block model is available here; if you have lost it or want to import it again, feel free to do so.

  Warning 2402

The block model must have at least two Economic Value fields, one for process and another one for dump.

MiningMath requires at least one process and one dump destination with their respective economic values. You should analyze your model and be sure to have these fields. 

The economic value calculation is one of the most important procedures of MiningMath. Any error in the formula could lead to incompatible results and even increase complexity and runtime, due to incorrect assumptions derived from these values. The main validations on this step can be done by evaluating the minimum and maximum values. Once the data has been imported, you can also perform a data validation procedure.

  Warning 2404

Error parsing surface.

Usually, when you import an invalid surface, the corresponding field will appear in red. Hover the mouse over the field to see the details.

The first common issue here is about importing a surface which does not meet the block model limits. To check this issue, verify your origins and check your coordinates by following the steps mentioned on this page.

An additional error relates to the header names of the surface file, which must always be “X, Y, Z”. Thus, if they have any other name or type, correct them accordingly.

  Warning 3501

MiningMath tries to communicate with Excel and fails.

The reason why this happens is that there are additional windows (like a login screen or an activation failure) being opened before the worksheet, which is interfering with MiningMath’s generation of reports.

Close the Excel instances completely and then reopen. If an additional window appears before the worksheet, just follow its instructions.

Floating-point numbers

Same scenario, different results?

It is possible to find different results for the same scenario running on different computers. Algorithms based on Mixed Integer Linear Programming (MILP) depend on third-party solvers, and their results may differ in terms of floating-point precision from hardware to hardware.
 
Given that MILP is based on multiple LP executions, precision differences may accumulate over the sequence of operations performed with floating-point numbers. It is expected that results may differ, but they should be equivalent in terms of NPV. If the physical results are very different, this means the mine has the flexibility to operate in both ways without a big impact on NPV.
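A generic illustration of how floating-point rounding accumulates (unrelated to any particular solver) is shown below:

    # Ten additions of 0.1 do not give exactly 1.0 in binary floating point.
    total = sum(0.1 for _ in range(10))
    print(total)         # 0.9999999999999999
    print(total == 1.0)  # False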

For an in-depth explanation on floating-point arithmetic from a computer science perspective please see here.

License Information

MiningMath relies on online activation over your internet connection or, as a contingency, on an identification code from your hardware.

If you are experiencing issues with activating your license, you can find the license information on the licensing screen.

  1. Identify your error number/message.

  2. Get your hardware identification code (Host ID), by the two following options:

    a) Copy the text disclosed at the "Informations for support", if it is available for you

    b) Execute the procedure below, explained in this video.

  3. Send us the error number/message generated and the identification (Host ID) by filling this form.

Note: If your error number is -3001, you can find the solution on this page.
Note 2: The revoke-license procedure, available only for commercial licenses, was introduced after version v2.0.24. Therefore, make sure you have an updated version before revoking the license on your computer and activating it on another one.

Progress Bar

The percentage displayed on the progress bar is only an estimate, as mathematical programming can be unpredictable.

The pre-processing steps, in which the algorithm eliminates unneeded material, might keep the bar stuck at the initial percentages (2%, 4%, etc.) for a while, but after that the optimization can speed up.

If the remaining time reaches zero but the progress bar hasn’t completed and results aren’t available, it’s likely due to the unpredictability of mathematical programming calculations. We recommend waiting until the software completes the optimization—the bar will fill completely, “Completed” will display, and results will become available.

MiningMath can handle virtually any model size. It has successfully run client models beyond 10M blocks without reblocking, which might take a few hours to finish. The runtime is directly related to the number of blocks, destinations, periods, constraints in use, and variables imported. Therefore, the combination of these aspects is directly related to the complexity of the deposit.

Check for any ‘floating blocks‘ that are not connected to the model’s topography, as shown below. These regions can impact optimization, so removing them might help MiningMath to function properly.

Example of floating blocks.

Geometry

Slope Angles

Slope angles are a critical consideration in the mining industry, as they directly impact safety and significantly influence a wide range of operational parameters. The most relevant types include:

  • Bench face angle (BFA): The angle between the horizontal and the bench face, measured within an individual bench.
  • Inter-ramp angle: The angle measured between the toe of the bottom bench and the crest of the top bench, encompassing multiple benches but excluding ramps or haul roads.
  • Overall slope angle (OSA): The angle measured from the bottom-most point of the pit to the top-most point, incorporating all benches, ramps, and haul roads.
Slope angles in mining operations.
Bench face angle, inter-ramp angle, and overall slope angle in open-pit mining

Each of these plays a specific role in balancing safety, operational efficiency, and ore recovery, which makes slope angles one of the most critical parameters when establishing a constraints hierarchy. Their values often vary depending on factors such as the time frame (short- or long-term), rock type, lithology, mine sector, depth, and geotechnical domain. Therefore, it is essential to clearly define these assumptions in order to use this parameter effectively and align it with your project’s specific goals.

Discrepancies between optimization and design phases

The traditional workflow for open-pit optimization, design, and production scheduling often leads to discrepancies between the parameters used during the optimization phase and those applied during the design phase. Ramp design and placement, for instance, significantly influence the OSA, creating a mismatch. This misalignment frequently necessitates an iterative process of re-optimizing the same scenario based on the finalized pit design.

How MiningMath works

MiningMath controls the Overall Slope Angle through a “surface-constrained production scheduling” approach, where surfaces define groups of blocks to be mined (or not), instead of relying on the traditional “block precedence” method. This approach enables MiningMath to incorporate geometric constraints into its unique single-step optimization, delivering solutions that are closer to real mining operations.

Timing of block extraction

Surfaces determine the timing of block extraction. For instance, blocks located between the surfaces associated with periods 1 and 2 will be mined in period 2. A block is considered “between” two surfaces if its centroid lies within the vertical space defined by them. In this example, blue blocks are mined in period 1, while yellow blocks are mined in period 2.

In this example, blue blocks are mined in period 1, while yellow blocks are mined in period 2. Vertical lines indicate the distance between centroids below (red lines) and above (green lines) surfaces.

Grid-based approach for controlling slope angles

Within surfaces, slope constraints are managed by controlling the angles between adjacent points within a defined grid structure, rather than evaluating the angles between arbitrary pairs of points, as depicted below.

Slope angles being controlled in the surface grid.
Grid structure. Left: An example of a single point (green) with an associated slope constraint, showing all adjacent points (blue) that comply with the defined limitations. Right: An example of two arbitrary points (green and red) that are not subject to any constraints between them.

MiningMath employs a continuous linear variable for slope control, achieving superior accuracy compared to the discrete nature of block-based methods. Moreover, this grid-based approach ensures precise approximation of the deposit’s angles by focusing exclusively on grid-connected points, eliminating the need for comparisons between unrelated points. Adjacent elevations within a single surface must comply with a maximum allowable height difference, calculated based on the specified slope angle restriction (read more). This method guarantees that each generated surface represents a feasible solution while adhering to production requirements and delivering enhanced accuracy, particularly in transition zones.
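As an illustration of this rule, the sketch below (an assumption-based example, not MiningMath's internal code) computes the maximum allowable elevation difference between two adjacent grid points from the slope angle and the grid spacing; treating diagonal neighbors through their horizontal distance is also an assumption.

    import math

    def max_height_difference(slope_deg, dx, dy=0.0):
        # Maximum allowed elevation change between two adjacent grid points
        # separated horizontally by (dx, dy), for a given slope angle.
        horizontal = math.hypot(dx, dy)
        return horizontal * math.tan(math.radians(slope_deg))

    print(max_height_difference(45, 10))      # 10.0 m for an orthogonal neighbor on a 10 m grid
    print(max_height_difference(45, 10, 10))  # ~14.14 m for a diagonal neighbor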

Setting up slope angles

MiningMath provides two options for handling slope angles:

  1. Block-specific slope definitions directly within the model.

  2. A default slope value that can be applied universally or used as a fallback for blocks without predefined slope information.

In the interface, these can be found in the General tab.

For scenarios requiring multiple variable slope assumptions, you can prepare additional columns (e.g., Slope 1, Slope 2, … Slope N) in the block model before import. This setup allows you to select the appropriate slope data for each scenario directly through the interface, eliminating the need to repeatedly edit and re-import the block model.

Default values

In the interface, you can choose the field option as the primary rule for variable slope angles or select <none> to apply a constant value across the entire model. If the chosen field contains missing data, the defined default value will automatically be used for those blocks.

Block by block field

When a field is assigned to the slope angle parameter, that column provides the slope value used for each block, as depicted below.

Slope angles assigned block by block during importation.

This approach allows a high level of flexibility to apply any specific criteria. These possibilities also include two- and three-dimensional variations, as well as linear and non-linear functions.

Employing slope angles in the short-term

Short-term planning presents a valuable opportunity to leverage the same platform used by the strategic mine planning team, enhancing project adherence and reconciliation. By selecting a surface and applying it as a constraint, mining can be restricted to specific areas, refining the entire operation.

This approach can utilize a surface already designed with ramps or any surface generated by MiningMath that adheres to the OSA within the required time frame.

Operational bench face angle and overall slope angle difference.

This flexibility allows for steeper bench face angles based on operational blasting parameters. To make the most of this feature, set up the angles block by block and experiment with different values according to your project’s capabilities. This approach lets the algorithm adjust the constraints hierarchy, ultimately leading to improved results.

Geometric Constraints

In a mining project, the mine planner must accurately dimension each unit operation to determine the most suitable set of equipment for the existing conditions. With MiningMath, operational parameters are integrated as constraints within the objective function, rather than being applied post-pit optimization. This methodology ensures solutions that adhere to operational criteria while maximizing NPV, leading to more effective data utilization and uncovering opportunities that might be overlooked with manual steps and arbitrary assumptions.

The Geometric tab is the place to set minimum mining and bottom widths, mining length, and vertical rate of advance, whose values apply to every period. The user can also use surfaces to define operational constraints over period ranges, which can limit, force, or achieve an exact shape, based on the constraints hierarchy.

There are two types of widths restrictions that can be created:

  1. Mining Width: the minimum horizontal distance between one pit and the next.

  2. Bottom Width: the minimum width of the pit bottom.

Currently, MiningMath does not mine partial blocks. As a consequence, the software will round up any widths to cover the next integer block.

Minimum widths in the Geometric tab.

It is also possible to define a Vertical Rate of Advance for each period range. The VR will be rounded up to cover the next integer block.

Vertical rate of advance in the Geometric tab.

Mining length (ML)

A minimum horizontal distance between one pit and the next, to be respected in every period, can also be defined in the Minimum Mining Length field. Currently, this option is only available in the insider version.

Minimum mining length option in the Geometries tab.

The figures below give a simplified illustration of each width/length available and of the vertical rate of advance.

For each period range, the user can consider:

  1. 1 force mining surface.

  2. 1 restrict mining surface.

Each surface file is valid from period A up to the end of period B, as depicted below.

Surface mining limits: force and restrict mining.

The following video shows how the variation of operational constraints impacts your solution and how you can take advantage of these parameters to find results closer to reality.

Operational constraints

Widths and lengths

In MiningMath, widths and lengths are constraints within the objective function, which means they are considered during the optimization instead of only at the pit design stage, when roads and accesses are drawn to make the pit operationally feasible.

The definition of minimum widths is a very useful feature to obtain operational results and to experiment with different geometries according to the project requirements. It is important to understand that these are very complex parameters to respect in 3-dimensional non-linear models, and they also influence the runtime of each scenario. Therefore, it is not always possible to guarantee that all minimum widths are respected, due to the deposit geometry and the constraints hierarchy. Testing different values is a great strategy to identify opportunities that could bring the best mining sequence and NPV.

Types of widths and lengths

Bottom widths

The bottom width (BW) is the minimum horizontal distance on the lowest floor of the pit, as seen in Figure 1. It is required to allow mining operations based on the equipment sizing. This parameter is the same for all periods, relates to adjacent slopes, and applies to any other areas regarded as pit bottoms.

Figure 1: Bottom width area in subsequent periods

Mining widths

The mining width (MW) is the minimum horizontal distance between one pit and the next, to be respected in every period; in other words, it is the horizontal distance between the walls of two surfaces that belong to consecutive periods, as shown in Figure 2.

This feature is more complex than the minimum bottom width, since it can drastically change the pit shapes while identifying the best regions. Note that wider values provide larger mining fronts and better designs for nested pits, pushbacks, schedules, or any other result that you are looking for.

Figure 2: Mining width in subsequent periods

Mining lengths

The Mining Length (ML) represents a minimum distance that must exist between at least two points on the walls of surfaces from two consecutive mining periods. This distance is already respected for any value smaller than or equal to the MW. Thus, this parameter extends such a distance between any two points to a value greater than the MW. Figure 3 depicts an example.

Figure 3: Example of MW and ML.

Identifying geometric parameters

Figure 4 shows a section view of the McLaughlin deposit, where each color represents a given period. The horizontal arrows highlight the bottom and mining widths, while the vertical arrows identify the vertical advance. Mining lengths cannot be depicted in this 2D representation.

Figure 4: Bottom width, Mining width and vertical rate of advance.

How are widths defined?

The widths input in the interface define the diameter (d) of a circle. As MiningMath does not mine partial blocks, the software uses the block size as a reference to decide whether d should be rounded up to the next integer multiple. The approximated circle results in a polygon whose purpose is to select the centroids of adjacent cells. MiningMath then assigns the same elevation value to the selected cells to define mining surfaces.

Figure 5 shows, in sequence, an example of how a minimum width of 25 meters and a possible mining length of 50 meters could be defined over a 10 x 10 meter grid (the block’s dimensions in the x and y directions).

Figure 5: How MiningMath defines the mining width and mining length.
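A minimal sketch of this rounding rule, assuming square blocks and using the 10 m block size of the example above:

    import math

    def rounded_width(width_m, block_size_m):
        # Round a width up to the next integer multiple of the block size,
        # since partial blocks are not mined.
        return math.ceil(width_m / block_size_m) * block_size_m

    print(rounded_width(25, 10))  # 30 -> a 25 m width ends up covering 3 blocks on a 10 m grid
    print(rounded_width(50, 10))  # 50 -> already an integer multiple, kept as is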

Vertical Rate

Definition and calculation

The vertical rate of advance (or sinking rate) is defined as the vertical distance, in meters, mined in each period. It is calculated by evaluating each mining face independently, as shown in this figure.

Two mining faces relative to the same period (blue) and their vertical distances mined in meters.

The total vertical distance mined across all periods should be consistent with the pit depth and the available equipment. In cases of infeasibility, this parameter follows the constraints hierarchy, in which it has a lower priority.

Vertical rate of advance, bottom width and mining width.

Complexity and recommendations

The vertical rate of advance (VR) is a complex parameter within the optimization. The VR works as an upper bound to avoid operationally unfeasible solutions. Therefore, testing different values is a great strategy to identify opportunities that could bring the best mining sequence and NPV.

The Mining Width (MW) is not mandatory when using VR, but it plays an important role when defining its value. The MW and VR, together, define volumes of material for each mining period. A reduced MW might create additional challenges for the algorithm to comply with the VR. Therefore, it is important to play with different values, especially when VR is not being fully respected. 

Vertical Rate: Definition, Hierarchy of Constraints & Complexity

Example evaluation

To evaluate different values for the VR constraint, you can use Decision Trees. The figures below depict a base case for VR evaluation using the Marvin dataset and a respective decision tree built with different VR values.

Notice the fluctuation in NPV illustrated below, influenced by this single parameter. Additionally, note that the smallest values, such as 30 m (equivalent to the block height) and 60 m, trigger warning violations in the generated report file. This indicates that MiningMath’s algorithm had to adapt the VR constraint for a viable solution to be achieved, keeping other indicators, such as tonnage, within limits. The next section shows how to prioritize the VR constraint when adjustments like these are necessary to achieve feasible solutions.

NPV achieved with the Marvin dataset for different values of Vertical Rate. (*) 30m and 60m constraints were not respected in all periods and generated warnings in the report file.

The vertical rate of advance is one of the first constraints to be relaxed within MiningMath’s Hierarchy of Constraints (read more). To ensure VR is at least closer to what you need, relax low-priority constraints manually. This way you will lead the algorithm to a more flexible scenario and a broader solution space, which may help it to find a feasible solution for the new set of constraints.

In addition, if the user aims to enforce a maximum vertical rate for a given period, a flat surface can be created at the achievable depth and input as a Restrict Mining constraint, as shown below. Beyond that, it is important to notice that even the goal of achieving “process full” could result in an infeasible solution while the VR is still respected; this feature is therefore very sensitive to the other parameters in place.
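For example, a flat Restrict Mining surface can be derived from the topography file exported by MiningMath. The sketch below is a hedged example using pandas; the file names and the target elevation of 250 are assumptions to be replaced by your own values.

    import pandas as pd

    topo = pd.read_csv("TopographySurface.csv")   # MiningMath grid with columns X, Y, Z
    target_z = 250                                # hypothetical maximum depth (elevation) allowed

    restrict = topo.copy()
    # Cap the surface at the target elevation, never placing it above the topography.
    restrict["Z"] = restrict["Z"].clip(upper=target_z)
    restrict.to_csv("RestrictMining_flat.csv", index=False)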

Vertical rates controlled by surface constraints

In summary, it is crucial to assess various parameter values to gain a deeper understanding of how they can impact your project, particularly concerning geometric constraints. You can explore additional workflows presented here that can assist you in achieving better results.

Force Mining

Force mining is a surface used to deplete the material of the entire area in a specific time-frame.

Golden arrows disclosing the areas that should be forced.

This feature makes MiningMath mine at least down to the surface inserted, which means that all the material inside its limits will be extracted, whether it is ore or waste. Thus, this feature can also be understood as a minimum depth that should be mined within a specific time frame.

Keep in mind that these surfaces might be adjusted during the optimization to respect the slope angles (as depicted below), which have a higher priority in the algorithm’s hierarchy.

Force Mining surface and the slope angle adjustments that may occur.

Therefore, more material can be mined either to correct the overall slope angle or to increase the NPV.

This functionality is commonly used to refine/keep the mining amount of a previous good surface in early periods, force a specific depth which the deposit should achieve, create custom advances, extract material to make a region available to allocate equipment, etc.

MiningMath aims to help users apply their project knowledge to guide the algorithm towards the best decisions, which is why surfaces are among the highest priorities in the constraints hierarchy: they enable the implementation of custom geometries and operational parameters based on these smart hints. Consequently, forcing surfaces might be the reason for disrespecting production limits, blending constraints, geometries, and so on, which requires care when using these functionalities. To sum up, the material above an imported Force Mining surface will certainly be mined by the specified period, while the material below it will be mined only if the blocks respect all the other constraints and generate profitable results.

The approach here was to deplete a mining front with high-grade ores in the first period of the optimization, considering the region with IY higher than 35, inside the final pit of the Data Validation, and limited to elevation 250.

Final pit surface (Y coordinates higher than 35) and high-grade ores filtered.

To build such a surface, the first step was to place the Z coordinates in this Excel file and then use conditional functions to define these limits.

Mining_Front-FM.csv context.
Force Mining Surface and high-grade ores.
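The same conditional logic can be reproduced with a short script instead of spreadsheet formulas. The sketch below is only an assumption-based illustration: it supposes the topography and the Data Validation final-pit surface were both exported by MiningMath on the same grid and row order, and that the 30 m spacing used to rebuild IY, the IY > 35 limit, and the elevation 250 follow this example.

    import pandas as pd

    topo = pd.read_csv("TopographySurface.csv")              # grid with columns X, Y, Z
    final_pit = pd.read_csv("DataValidation_FinalPit.csv")   # hypothetical final-pit surface, same grid

    # Hypothetical grid spacing/origin used only to rebuild the IY index from Y coordinates.
    iy = ((topo["Y"] - topo["Y"].min()) / 30 + 1).astype(int)
    in_front = iy > 35                                       # mining front selected in this example

    fm = topo.copy()
    # Inside the front: force mining down to the final pit, but never below elevation 250.
    fm.loc[in_front, "Z"] = final_pit.loc[in_front, "Z"].clip(lower=250)
    fm.to_csv("Mining_Front-FM.csv", index=False)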

With a suitable surface available, an Exploratory Analysis scenario was run using the setup disclosed below.

Example scenario for force mining

Note that a scenario without Force Mining was also run to have a good comparison.

Same example scenario without force mining

As a result, the Force Mining surface was fully mined, along with additional material to fulfill the processing capacity, and the slope angles were adjusted accordingly, generating the results disclosed.

Lastly, it is always useful to compare these scenarios with the ones that did not use this approach and generated different sequences. This example illustrates how powerful user assumptions can be to generate better results or explore possibilities.

Restrict Mining

The Restrict Mining parameter is defined to prohibit access to a given area within a specific time frame. It makes MiningMath ignore everything outside an input surface; in other words, only the material above it is available to be extracted, whether it is ore or waste. This feature can also be understood as a maximum depth that can be mined within a specific time frame.

Red X's disclosing the areas that should be restricted.
Restrict Mining field in the Surface Limits tab

How is it used?

Restrict Mining is commonly used to refine or optimize the mined amount of a previous surface in any period, restrict depth to a specific value that the deposit could reach, and extract the best ore in custom advances. It can also lock or prohibit an area due to concession rights, environmental issues, or an already built stockpile, waste dump, or other structure.

To sum up, the material above an imported Restrict Mining surface will be available to be mined until the specified period, and what is inside may be mined if the blocks respect all the constraints and generate profitable results. What is outside will not be mined, whether profitable or not.

Restricting an area for a range of time

For this example we employ the standard scenario scn21-PriceUp-RampUp-Protection300 that comes pre-installed. 

In this example, periods 3 and 4 have a Restrict Mining surface as depicted below.

In the results, it is possible to identify that periods 1 to 4 do not go beyond the limiting surface on the west portion, defined by the higher elevations of the cells (equal to the topography).

From periods 5 to 12, when the restricting surface is no longer applied, blocks can go beyond the previously constrained area.

Extracting the best material

The second example is based on the attempt to extract the best material inside the Mining_Front-FM.csv surface, which mapped high-grade ores in the first period, as mentioned here.

Using the suitable surface file, an Exploratory Analysis scenario was run with the setup below. A scenario without Restrict Mining was also run for comparison.

As a result, the Restrict Mining scenario mined the best material within the available area.

In turn, this generated different sequences as depicted below.

Scenario with Restrict Mining
Scenario with no Restrict Mining

These examples illustrate how powerful user assumptions can be to generate suitable results or explore possibilities. The Restrict Mining parameter enables the implementation of custom geometries and operational parameters based on a smart hint imposed by the user.

Combining Force and Restrict

MiningMath allows the user to combine Force and Restrict mining by using surfaces on each field as you already read in the previous pages. These features allow us to use different arrangements due to concession rights, exchanging of land with adjacent mining companies, allocation of waste material inside exhausted areas, environmental issues, and so forth.

By using them together, the user can reach the exact shape of a pit by inputting the same surface as Force and Restrict Mining in the same time frame (Figure 1), which is the highest-priority constraint in the hierarchy. It is also possible to optimize the material between surfaces by adding different surfaces in these two fields (Figure 2); the result might still be adjusted either to correct the overall slope angle or to increase the NPV, as mentioned before.

Figure 1: Using the same surface as Force and Restrict mining to reach an exact shape.

Based on these concepts, MiningMath allows you to export surfaces from the best scenario to a larger mining package, design the pit there, create the grid of points, import the designed pit back, and, finally, optimize and refine as much as you can by using smart constraints. Therefore, the user has the advantage of controlling the outcome by guiding results according to the project requirements.

Figure 2: Using different surfaces on Force and Restrict mining to optimize a volume within its limits.

Surfaces used simultaneously in both fields, force and restrict mining, are interpreted as follows:

  • Different surfaces: MiningMath will force mining according to the forcing-surface and will restrict mining according to the restricting-surface.

  • Same surface: MiningMath will achieve the same format proposed by the surface in use until the end of the time frame in which they have been applied.

This approach allows you to use different surfaces in the same time frame or split them according to the goals that you want to achieve, since this feature works close to what is shown in Figure 2 above.

Figure 3: Mining sequence between force and restrict mining surfaces.

In this example, the same constraints mentioned on the schedule optimization page were used. In addition, surface 2 of that scenario was added to the force mining field in the second time frame, which means that by the end of the second period the mining should reach at least the shape imposed. The restricting surface came from the constraints validation page and was added to the last interval, which means that the algorithm could not surpass this limit. Figure 3 discloses the optimized mining sequence volume between the surfaces inserted, and Figure 4 shows the scenario setup.

As a result, Figures 5 and 6 illustrate how force mining influenced the first periods of the optimization, which mined more material due to its profitability.

Figure 7 discloses the constraints validation surface, used as restrict mining.

Figure 8 shows the final result regarding the force and restrict mining features, which respected the surface constraints and demonstrates the capability of these features to guide results.

Figure 4: Scenario setup.

The following figure exemplifies how the user can take into account any pit design to make MiningMath iteratively produce more operational results, detailing a previous scenario.

In this case, the user needs to use the same surface in both fields, force and restrict mining, and during the same period of time to reach the exact shape of the designed surface.

Although the workflow in Figure 9 uses a designed pit, it is also possible to use pits from previous scenarios, so that you can freeze good results and optimize further periods. Below are some examples:

  • Getting the same: Achieve the same final pit of a previous scenario.

  • Using the traditional approach: Define pushbacks every 5 years.

Figure 9: Using a designed surface as force and restrict mining.

This powerful workflow allows a lot of flexibility, so the user can guide solutions based on insights and previous knowledge of the deposit. The concept is a unique feature of MiningMath, and this approach can easily be applied by following the steps for creating surfaces and validating them below.

Creating Surfaces

The surface files in MiningMath are a set of points (Figure 1) aligned with the block centroids on the X and Y axes (Figure 2). The easiest way to avoid error messages is to use a topography surface created by MiningMath, during the Data Validation for instance, and then manipulate only the Z coordinates.

Usually, surfaces designed in traditional mining CADs are based on contour/drawn lines and point triangulations (Figures 3 and 4). Therefore, they are continuous figures that cannot be recognized by MiningMath.

Here is the step-by-step process to create surfaces for MiningMath:

  • Import the TopographySurface.csv (Figure 5), which is a grid of points, into the CAD of your mining package. The CSV file mentioned can be obtained by validating scenarios or from any other execution of MiningMath.

  • Manipulate only the Z coordinates and project them in a way that fits your needs. There are 3 main options at this step:

    • Use a polyline (Figure 6) to draw the region, select the points inside the drawn polygon, and find the option in your mining package that allows you to place them at the elevation you want.
    • Use your triangulated designed surface (Figure 7) by placing it in your CAD viewer along with the TopographySurface.csv generated by MiningMath. Select the imported point set in your CAD and find the option in your mining package that allows you to project all points onto the elevation of your designed surface (Figure 8).
    • Use only Excel or any spreadsheet program: open the TopographySurface.csv, filter the regions in X and Y whose elevation you want to change, and manipulate them.

Export the modified point set (Figure 9) as a CSV using a different name. Open the file in Excel or a text editor to make sure that the header is correct (Figure 10) before importing it into MiningMath, since it is quite common for these exported files to come with meaningless information that can interfere with the importation.

Figure 9: Surface file which could be imported in MiningMath.
Figure 10: Header and disclosed data on a standard surface file of MiningMath.
  • Select the points in the area that should be restricted, place them at the highest elevation of the topography, and set the rest of them at the bottom.

  • Use it as a restrict mining surface.

Figures 11 through 13 illustrate the process.

Note: The X and Y coordinates must remain the same.
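For the restricted-area case above, a minimal sketch (the rectangular X/Y bounds and the bottom elevation are hypothetical; adapt them to your model):

    import pandas as pd

    topo = pd.read_csv("TopographySurface.csv")     # grid with columns X, Y, Z

    # Hypothetical rectangular area to be locked out of the optimization.
    inside = topo["X"].between(1000, 1500) & topo["Y"].between(2000, 2400)

    surf = topo.copy()
    surf.loc[inside, "Z"] = topo["Z"].max()         # highest topography elevation: nothing is mined here
    surf.loc[~inside, "Z"] = 100                    # hypothetical model bottom: everything else stays available
    surf.to_csv("RestrictMining_area.csv", index=False)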

  • Choose the points inside the area you want to force mine, place them at the elevation that you wish, and leave the rest of them at the topography.

  • Use it as a force mining surface.

Figures 14 through 16 illustrate the process.

Note: The X and Y coordinates must remain the same.

  • Define your polygons or use designed surfaces, by following the methodology presented in 2. Creating surfaces Step-by-step.

  • Use it as both a force and a restrict mining surface.

Figures 17 through 19 illustrate the process.

Note: The X and Y coordinates must remain the same.

Validating Surfaces

The best way to generate surfaces is by using a topography surface created by MiningMath, which is created after the Data Validation, then manipulating only the Z coordinates. Make sure to meet all the surface requirements disclosed here.

Surface Requirements
  • Headers must be named as X, Y, and Z. These files have to obey an ascending value order in each one of the axes.

  • Same size of the block model, item 1.1 of this page explains it.

  • Its points must be aligned with the blocks' centroids; items 1.1 and 1.2 help you understand it.

  • Defined as a grid of points; the visual validation in item 1.2 shows it.

  • To be in the CSV format.

Figure 1: Surface file format of MiningMath

Using the values of the Marvin deposit file in Figure 2, we find the block model centroid boundaries, which begin at XMin = 15 and YMin = 15 and reach maximum centroid values of XMax = 5,295 and YMax = 8,265; this can be confirmed by checking the topography generated by MiningMath.

Then, it’s time to search for your surface limits; in this example, the file chosen was “Surface-RM-offset-300m”. The easiest way to find them is by filtering the axis values, as shown in Figure 3, which disclosed XMin = 15, YMin = 15, XMax = 1,815, and YMax = 1,785. Therefore, even though this file was also based on the Marvin deposit, it is not possible to use it, since the surface is smaller than the block model in place, which means that it does not have the same size as the block model file.

It is always worthwhile to check the limits of the designed surface (Figures 4 to 8) if we face an error. Remember that everything, even elevations, must be in the boundaries of the block model and check the recommendations for your surface.

Figure 4: Surface (grid of points) smaller than the block model.
Figure 5: Surface (grid of points) smaller than the block model.
Figure 6: Surface (grid of points) bigger than the block model.
Figure 7: Surface (grid of points) with missing values.
Figure 8: Surface (grid of points) unaligned with the centroids.
Figure 9: Correct Surface (grid of points) aligned with the centroids and same block model size.

The maximum centroid limit can be found by using the following equation for each axis:

Maximum centroid value = OX + (NX*DX)-(DX/2)

Where:

  • OX is the origin of the X-axis;

  • DX is the block dimension of the X-axis;

  • NX is the number of blocks in the X-axis.

Note: (DX/2) is the distance that should be added to the origin to find the centroid of the first block, or subtracted from the block model limit to find the last centroid, since origins are based on the corner of the block model.
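As a quick numerical check of this equation, the sketch below uses illustrative values (a 30 m block size, origin at zero and 177 blocks along X are assumptions chosen so that the result reproduces the Marvin limits quoted above):

    def min_centroid(origin, dim):
        # First centroid along one axis: OX + (DX/2)
        return origin + dim / 2

    def max_centroid(origin, n_blocks, dim):
        # Maximum centroid along one axis: OX + (NX*DX) - (DX/2)
        return origin + n_blocks * dim - dim / 2

    print(min_centroid(0, 30))        # 15.0
    print(max_centroid(0, 177, 30))   # 5295.0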

The following video presents how to validate surfaces numerically and visually. This initial verification is what enables you to understand what might be happening and where the error is. The example used regards the message “Error parsing surface: Coordinates aren’t properly spaced”, but it fits any case where the surfaces in use are causing a problem, especially when a red box error shows up.

In this case, as there are a lot of values that do not match the correct ones for X and Y, the quicker and easier way to fix it is restarting from the beginning.

Video 1: Validating surfaces.

Common Issues

  • Error parsing surface: duplicated coordinate

If this message appears, it’s possible that your surface file contains more than one elevation for the same coordinate. Please check the CSV file.

Figure 9: Duplicated coordinates error

Surfaces as a Guide

The easier way to work with surfaces is by manipulating the Z coordinates of the topography generated by MiningMath, while keeping the same values for the X and Y coordinates.

The surfaces generated in MiningMath always have the same format. Each one of them has the same number of lines. Moreover, the data follows the same order from the first row to the last. Hence, you can choose the topography file and use it as a guide. This also facilitates: 1) conversions to original coordinates from other software; 2) filtering of regions and pits; 3) creation of force and restrict mining files; and many other options based on this concept.

It is worth mentioning that this approach could be easily done by using a simple worksheet and loading the CSV files generated in the viewer for further analysis.

If you are considering geometries, mainly the Mining Width with surfaces imported from a different software package, there will be conflicts between the geometric criteria of MiningMath and the geometric criteria from the surfaces imported.

You must give some freedom to the last period so that MiningMath can also optimize the number of periods. This means you should use <end> instead of a locked period range, such as from 16 to 16, for instance.

In order to add more material from deeper areas, you can use a base surface and increase its depth where you wish, getting closer to what the project requires. By focusing on the main areas in early periods, you provide hints to the algorithm so that it can understand your approach through surfaces. The following steps reproduce an efficient workflow to promote this optimization:

  1. Download this Excel file to use as guidance for the next steps.

  2. Insert the data in the yellow cells based on your block model information.

  3. Paste the coordinates of the topography surface

  4. Define a new column, plan Z, to represent the Z coordinates of a plane surface. If you are not sure what the plan Z is, you can import a test surface into the viewer and identify the elevation below which you want to force the bottom area.

  5. Create a new column to represent the Z coordinates of the restrict mining surface. This will be used to identify the maximum amount of ore that could be extracted while respecting the geotechnical aspects. Finally, you can create a condition to define the restrict mining Z as the bottom Z if it is below the plan Z, or as the topography Z otherwise. The figure below depicts this concept and the respective formula in Excel; a script version is also sketched below.

  6. Set up your scenario and run.

Note: As a result, you will get the maximum potential that can be extracted below the chosen elevation.
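The condition from step 5 can also be written as a short script. The sketch below is an assumption-based example: the file name and the column names topo_Z, plan_Z and bottom_Z are placeholders for the columns prepared in the previous steps.

    import numpy as np
    import pandas as pd

    df = pd.read_csv("RestrictWorkflow.csv")   # hypothetical file with X, Y, topo_Z, plan_Z, bottom_Z

    # Step 5: use the bottom elevation where it lies below the plan elevation,
    # and the topography elevation everywhere else.
    df["Z"] = np.where(df["bottom_Z"] < df["plan_Z"], df["bottom_Z"], df["topo_Z"])

    df[["X", "Y", "Z"]].to_csv("RestrictMining_maxpotential.csv", index=False)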

  1. Place the Z coordinates of the base surface whose depth you want to increase, in order to extract more material at the bottom.

  2. Paste the coordinate from the maximum potential scenario

  3. Calculate the difference between them.

  4. If an area of the maximum potential surface is below the base surface, the lowest elevation will be considered. Therefore, the additional ore that can be extracted at the bottom will be included along with the base surface used.

  5. Use the mixed surface as force and restrict mining at the period range that you want to achieve it.

Note: After this step, MiningMath will generate operational surfaces so that you can use it on your projects considering the timeframes required.
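In practice, steps 1 to 4 amount to keeping, point by point, the lowest elevation between the base surface and the maximum-potential surface. A minimal sketch, assuming both files were generated by MiningMath and therefore share the same grid and row order (file names are hypothetical):

    import numpy as np
    import pandas as pd

    base = pd.read_csv("BaseSurface.csv")               # base surface to be deepened (X, Y, Z)
    maxpot = pd.read_csv("MaxPotentialSurface.csv")     # surface from the maximum-potential scenario

    mixed = base.copy()
    # Keep the lowest elevation so that the extra material at the bottom is included.
    mixed["Z"] = np.minimum(base["Z"], maxpot["Z"])
    mixed.to_csv("MixedSurface-FM-RM.csv", index=False)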

 

Surface Constraints

MiningMath uses surface-constrained mine production scheduling, an improvement on the idea proposed by Marinho (2013). Surfaces are one of the most important constraints, allowing users to impose their manipulations and knowledge to guide the optimization process. They can be used to force areas for backfill operations, to allocate an in-pit crusher, to restrict an area considering different offsets, or to show the economic impacts of preserving (or not) a given area or community. They also allow incorporating an operational mine design as a requirement for a given time frame. For example, the mine design for the current year could be a mandatory requirement for the first period, while the rest of the mine sequence would have a new chance to be re-optimized into a different sequence that captures more long-term value. The following pages unlock all the possibilities of such features.

Internally in the algorithm:

  • To define slope angles and eliminate geotechnical errors, present in the blocks precedence method (Beretta & Marinho, 2014, 2015).

  • To handle geometric parameters and comply with minimum widths and maximum vertical rate.

Figure 1: Force mining can be understood as a minimum depth to be mined.

As optimization inputs:

  • To force mining and achieve a minimum depth, geometry, or area within a given time frame.

  • To restrict mining and ensure unavailable areas will not be considered as part of the optimization within a given time frame.

  • To force and restrict mining to achieve a specific design and guide the optimization.

Figure 2: Restrict mining can be understood as a maximum depth achievable.

As optimization outputs:

  • To outline the mine sequence throughout the Life of Mine that maximizes the Net Present Value.

  • Outputs are a consequence of the optimization and reflect each set of project assumptions, constraints, and parameters. Since the optimization is unconstrained by pushbacks, it will produce a different sequence of extraction, unlocking hidden opportunities.

Figure 3: Force and restrict mining used together can represent the minimum depth to be mined and the maximum depth achievable.

Surface formatting is simple, and any surface output from MiningMath can serve as a starting point for further manipulation or validation. It is important to mention that surfaces are exported from and imported into MiningMath in coordinates.

  • To have headers named as X, Y, and Z. These files also obey an ascending value order in each one of the axes.

  • To have the same size of the block model, which means it should not exceed the block model dimensions.

  • To have its points aligned with blocks' centroids in the X-Y plane.

  • To be defined as a grid of points.

  • To be in the CSV format.

To avoid any mistakes, manipulate an output surface from MiningMath instead of creating one from scratch.

  1. Run any scenario to obtain the topography file in MiningMath’s format.

  2. Import the topography.csv, created by MiningMath, on a software able to manipulate it graphically.

  3. Select points inside/outside a polygon. Move them up/down accordingly to the objectives to force or restrict mining. Points should be moved only up and down, along the Z direction.

  4. Once the surface is ready, move it back to the original coordinate system.

  5. Use it on MiningMath.

  • X and Y coordinates should remain the same, with the same spacing between each pair of points.

  • For rectangular areas, a spreadsheet application is suitable for this task.

Surfaces are imported in two tabs of MiningMath: Geometric and Overview. Figure 1 zooms in on the operational constraints from the Geometric tab. The main variables for this feature are shown in Figure 4, which illustrates that surfaces are imported considering:

  • The purpose of forcing/restricting mining.

  • The period range when each surface is applicable.

    • MiningMath automatically defines a single period range from "1" to "<end>", and the user can also add custom intervals.
Figure 4: Geometric constraints.

In the example from Figure 5, the image highlights the fields to apply:

  • A restricting-surface valid for periods 1 and 2 (in green), which means that it will be respected until the end of the second year.

  • A forcing-surface valid for periods 1 to 5 (in blue), which means that the area has to be mined by the end of period 5.

Figure 5: General constraints and fields related to surfaces.

Video 1: Surface Constraints: The ultimate guide

Tips and Tricks

Incorporating Fixed Costs

Fixed costs are expenses that do not vary with production or mining volume, such as equipment maintenance, infrastructure, and fixed labor costs. They are fundamental to the economic viability of the operation, ensuring the financial sustainability of the mine regardless of production variations.

How MiningMath handles fixed costs

In MiningMath, fixed costs can be directly incorporated into the mining model, ensuring they are considered in decision-making processes.

MiningMath also allows for pit optimization, aiming to maximize undiscounted cash flow while considering fixed costs and all desired constraints. This enables an economic feasibility assessment of specific regions throughout the project’s lifespan. To do this, a binary decision field must be used, which requires additional configurations in the block model and in the pit optimization scenario.

Block model configuration steps

1. Add a special block just above the topography

Create a block just above the defined topography, avoiding the model’s edges. Alternatively, select an existing block at the topography level; in this case, its related field values must be considered in the analysis of the results.

To prevent interference with scenario constraints, this block should have:

  1. Very low density;

  2. A slope of 89.9999;

  3. Null values for any additional constraints imposed in the scenario.

The block should also contain the fixed cost value for all economic value fields.

2. Add a binary decision field for the region

Create a new field in the block model for binary decision-making. This field should:

  1. Have, for the special block, a negative value whose magnitude is equal to (or greater than) the number of blocks in the region.

  2. Have a value of 1 for all blocks in the region with fixed operational costs.

  3. Have a value of 0 for all other blocks that do not belong to the region.
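A minimal sketch of steps 1 and 2, assuming the block model file already contains hypothetical in_region and is_special flags identifying the region and the special block:

    import pandas as pd

    model = pd.read_csv("BlockModel.csv")            # hypothetical block model with in_region / is_special flags

    n_region = int(model["in_region"].sum())         # number of blocks in the fixed-cost region

    # Reg1: 1 for region blocks, 0 elsewhere, and minus the number of region blocks
    # (or an even smaller value) for the special block carrying the fixed cost.
    model["Reg1"] = 0
    model.loc[model["in_region"] == 1, "Reg1"] = 1
    model.loc[model["is_special"] == 1, "Reg1"] = -n_region

    model.to_csv("BlockModel_with_Reg1.csv", index=False)

With the Sum constraint Reg1 <= 0 described below, mining any region block adds +1 to the sum, which is only feasible if the special block (contributing -n_region) is mined as well, bringing the fixed cost into the cash flow.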

3. Import the block model

Perform the standard block model import. During the field type definition, set the created decision field as SUM.

New field Reg1 being set as SUM type.

4. Validate the imported field

After importing, visualize the block model using the software’s viewer. Check that the values assigned to the binary decision field are correct and properly distributed as expected.

Scenario configuration steps

1. Create a pit optimization scenario

Create a pit optimization scenario where the pit’s lifespan is limited to a defined period. Additionally, include all project constraints for the mine’s entire lifespan. To streamline the optimization process, it is recommended to avoid geometric constraints, as this will help reduce complexity.

Example of pit optimization scenario with a single period and all constraints for the mine's entire lifespan. No geometric constraints are employed.

2. Create the binary restriction

In the sums tab, set the maximum value to 0 for the special field created.

Sum constraints with Reg1 maximum value set to 0.

3. Execute the scenario

Run the scenario and validate the obtained results.

Optimization report with indicators for the pit.
Visual representation of the blocks mined in the pit after the optimization.

Final considerations

With this configuration, whenever any block in the region is mined, the special block containing the fixed cost must also be mined. This ensures compliance with the restriction and guarantees that the fixed cost is properly included in the mining scenario.

This process should be repeated for each different fixed cost and region that needs to be included in the model to ensure proper cost allocation across the entire mining plan.

Realistic cost modeling for better planning

This approach enables a more effective and realistic modeling of fixed costs within mining planning.  MiningMath’s flexibility allows users to evolve their modeling over time, expanding to more sophisticated scenarios and ensuring better operational results.

This methodology can also serve as a first step toward a more complete transition to a fully optimized environment, reducing dependence on manual solutions and increasing the reliability of internal reports.

Furthermore, in the long term, MiningMath can be integrated into short- and medium-term planning, providing a continuous optimization process that may eventually replace the use of other tools.

Labs

Starting from version 3.0.8, a new set of small apps are provided in the MiningMath Labs section. These apps provide quick solutions to common problems, for example changing input csv files.

How to access it

To use the Labs section for the first time click on the Labs button. You might be prompted to select a folder with writing permissions in which the apps will be stored.

Once you have selected it, wait a few seconds until it opens in a new window, as depicted below.

Note: Labs may take a bit longer to open the first time.

Example app

To run any app, just double-click its name. For example, if you double-click the csv_colum_operations script, a new small window will pop up asking you to select a CSV file to perform alterations on.

Once a file has been selected, you will have the option to add or remove columns, get column statistics, and save columns.

Column options with csv_colum_operations script
Summary statistics of a Process column in a csv file.

From now on you should be able to play with all the options and all the apps and see if there is anything that can aid in your project.

Reblocking

Reblocking is a method used to decrease the number of blocks in a block model by combining smaller blocks into larger ones. For example, if your blocks have dimensions of 5 x 5 x 5, you could increase them to 10 x 10 x 10, which combines up to 2 x 2 x 2 original blocks into one and can reduce the number of blocks to roughly an eighth of the original dataset size.

Note: when reblocking your model it is important to evaluate dilution aspects that can be lost by increasing the block size.

Improving runtime

Reblocking can significantly reduce optimization runtime. Users have observed substantial improvements in runtime by implementing double, triple, or even quadruple reblocking. For example, feedback indicates that for a 32M blocks model, optimization runtime decreased from 36 hours (with double reblocking) to 12 hours with triple reblocking, and further to just 4-5 hours with quadruple reblocking.

MiningMath provides an app in its MM Labs section that is able to reblock your block model. An example is provided below.

Reblocking with MM Labs

Open the Labs section in the main menu as depicted below. Note: You will need at  least version 3.0.8 to start using the MM Labs applications. More about Labs can be seen here.

A reblocking application should be available. Double click on it to open the app.

You will be prompted to select the csv file of your block model. Afterwards, you will need to inform the coordinate columns, model dimensions and desired reblocked dimensions.

Based on the columns of your model, you will be able to indicate which columns should be summed, averaged or weighted averaged. Lastly, you will need to indicate the output csv file. This file needs to be created beforehand. 
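For reference, the kind of aggregation performed during reblocking can be illustrated with a short pandas sketch. This is not the Labs app itself, and the column names, block indices and factor below are assumptions.

    import pandas as pd

    model = pd.read_csv("BlockModel.csv")    # hypothetical model with IX, IY, IZ indices, tonnage and grade
    factor = 2                               # e.g. 5 x 5 x 5 blocks combined into 10 x 10 x 10

    # Parent-block indices for the reblocked model.
    for axis in ("IX", "IY", "IZ"):
        model["R" + axis] = (model[axis] - 1) // factor + 1

    reblocked = (model.groupby(["RIX", "RIY", "RIZ"])
                 .apply(lambda g: pd.Series({
                     "tonnage": g["tonnage"].sum(),                                    # additive fields are summed
                     "grade": (g["grade"] * g["tonnage"]).sum() / g["tonnage"].sum(),  # grades are tonnage-weighted
                 }))
                 .reset_index())

    reblocked.to_csv("BlockModel_reblocked.csv", index=False)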

Cut-Off Grades

Definition

The cut-off grade is the concentration of a mineral or metal required to classify material as either ore or waste. Material with a concentration above the cut-off is classified as ore, while anything below is classified as waste.

Different from MiningMath, other software packages usually make manual assumptions to pre-define a cut-off, and based solely on that, decide whether some material should be processed or not. That is needed, for example, before running LG or Pseudoflow algorithms. However, this simplified approach doesn’t account for other required mine scheduling constraints. Moreover, even the best algorithms commercially available for destination optimization perform this task apart from mine scheduling.

How MiningMath operates

MiningMath is not constrained by cut-off grades, fixed pits, or pushbacks. It is the only software available that optimizes the mining sequence and the destinations in a single step, all at once, considering all the constraints applied, as a truly global optimization process.

MiningMath approach. Not need of pre-defined cut-off grades.

These advantages become even more apparent in more complex cases with multiple destinations or constraints that could be overlooked, hiding potentially valuable opportunities when not considered jointly.

As a result, MiningMath may reveal new possibilities, deliver more realistic results and even lead to an increase in NPV through its unique single-step optimization.

Since the algorithm does not employ cut-offs to manually assign destinations to blocks, it automatically sends the less valuable blocks to the dump, based on the constraints you’ve set. MiningMath prioritizes meeting all constraints to optimize the average grade, leading to an increase in NPV through global optimization. Hence, even blocks with positive economic value may be discarded if their value is below the minimum required for processing at a given time. In scenarios without stockpiling policies, blocks with higher positive value may also be sent to waste, as there’s no alternative destination. As a result, MiningMath may deliver outcomes different from your initial expectations. However, for comparative purposes you can adjust the results as needed using the methods suggested below.

Forcing a cut-off grade on MiningMath will likely make you lose part of the advantages it can offer. However, for many reasons, mining professionals might still be willing to force it, either to compare different methodologies, to understand the practical effects of using it or not, and so on. Different approaches are available to force a cut-off grade, which are described next. Note that these could also be employed to forbid any material type on the plant.

Using predefined destinations

Predefining destinations before any optimization is performed is also a possibility within MiningMath when importing a block model. 

Setting predefined destinations in order to force cut-off grades.

A detailed guide on how to use this parameter can be seen here.

You can create multiple columns of economic values, one for each cut-off you want to test. Then, force MiningMath to use this limit by defining very negative values for the destination you want to avoid, as depicted below for a cut-off of 0.5. The math is:

Economic Value Process = If [Ore_Grade] > [0.5], then [f(Economic Value)], else [-999,999,999.00]

Block model setup to incorporate cut-off grades when defining economic values.

Grades in MiningMath are controlled as minimum and/or maximum averages, which means that these limits do not represent cut-off values, since the algorithm can use lower-grade material to blend with higher-grade material. Thus, to use this approach, just set a very negative value on the grades below the cut-off, so that these blocks would substantially reduce the average if processed. The same idea also works to constrain a contaminant to a maximum limit, by assigning it a very high grade, as can be seen below. Once again, the math is:

Ore_Grade1 = If [Ore_Grade] < [0.5], then [-999,999,999.00], else [Ore_Grade]

Block model setup to incorporate cut-off grades using average.

Another option is to use the Sum tab to control material types. This requires creating a field that accounts only for the mass of waste blocks and setting its maximum limit at the plant to zero, as can be seen below. It is worth mentioning that this approach can increase the complexity of the optimization due to the priority order within the algorithm.

Tonnage_Waste = If [Ore_Grade] < [0.5], then [Volume*density], else [0]

Block model setup to incorporate cut-off grades using sum.
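The three conditional fields above can be prepared in a spreadsheet or with a short script before importing the model. The sketch below assumes hypothetical column names (Ore_Grade, Economic_Value_Process, Volume, Density) and a 0.5 cut-off, mirroring the formulas shown:

    import numpy as np
    import pandas as pd

    model = pd.read_csv("BlockModel.csv")
    cutoff = 0.5
    PENALTY = -999_999_999.00

    # 1) Economic value approach: processing blocks below the cut-off becomes prohibitive.
    model["EV_Process_cutoff"] = np.where(model["Ore_Grade"] > cutoff,
                                          model["Economic_Value_Process"], PENALTY)

    # 2) Average approach: penalize the grade so low-grade blocks ruin the plant average.
    model["Ore_Grade1"] = np.where(model["Ore_Grade"] < cutoff, PENALTY, model["Ore_Grade"])

    # 3) Sum approach: a waste-tonnage field to be capped at zero in the plant.
    model["Tonnage_Waste"] = np.where(model["Ore_Grade"] < cutoff,
                                      model["Volume"] * model["Density"], 0.0)

    model.to_csv("BlockModel_cutoff05.csv", index=False)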

The mined blocks file (example below) is the main output to trace the blocks from each destination, to understand the results, and to find the best way to enhance your reporting based on any detail that you wish to disclose.

Figure 4: Example of the mined blocks output file.

There are many useful tips to identify and understand the results generated; some of them are listed below:

  1. Filter results where the period mined is equal to the period processed (blocks processed directly). Check the process economic values of those blocks and identify the lowest one, which indicates the cut-off value at the plant. This can be compared with the highest process economic value among the blocks that went to the dump.

  2. Calculate the average grade of any material using the same filter (period mined equal to period processed). Check whether the blocks going to the dump would have exceeded any limit at the plant. If so, even with good economic values, they would not comply with the constraints in place.

To sum up, there are a lot of validations that can be done to understand why the algorithm is taking such decisions. It is also worth mentioning that any constraint can influence the results, even geometric ones, which could change the sequence and also change the destination of a block at any period.
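For instance, tip 1 can be checked directly from the mined blocks file. The sketch below is only illustrative: the column names are assumptions and should be adjusted to match your own MinedBlocks.csv output.

    import pandas as pd

    blocks = pd.read_csv("MinedBlocks.csv")

    # Tip 1: blocks processed in the same period they were mined.
    direct = blocks[blocks["PeriodMined"] == blocks["PeriodProcessed"]]
    print("Lowest process economic value sent to the plant:",
          direct["EconomicValueProcess"].min())

    # Compare with the most valuable block that was sent to the dump instead.
    dumped = blocks[blocks["Destination"] == "Dump"]
    print("Highest process economic value discarded:",
          dumped["EconomicValueProcess"].max())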

Integration

MiningMath doesn’t necessarily compete with mine scheduling optimization packages. The only concept that we, and research centers worldwide, recommend mining companies move beyond is Pit Optimization, due to the set of problems you have to face when dealing with such technology.

Figure 1: Lerchs-Grossmann/Pseudoflow

Therefore, even our simplest version has more features to generate nested pits with better control so that you could design better pushbacks and define a mine schedule using your preferred tool. The reason why this software can deliver such results is the Direct Block Scheduling methodology based on Mixed Integer Linear Programming (MILP) model and proprietary heuristics. Check other technical details and related research in our theory section.

Figure 2: Direct to block scheduling.

MiningMath also allows you to generate optimized pushbacks, which could facilitate your design process and guide your mine schedule while using other software packages. Notice that our tool is an optimizer that simply breaks the whole deposit (your block model) into smaller pieces, aiming for maximum Net Present Value, but respecting as many constraints as you wish:

Figure 3: Optimized pushbacks and optimized schedules.

A usual application of our technology is basically on strategy optimization for building decision trees. Once we run dozens/hundreds of scenarios of the yearly schedule optimization and fine-tune their parameters/constraints, our users take some of the resulting surfaces of MiningMath and use them to design some pushbacks so that they could integrate with other packages, such as MSSO, COMET, etc. This procedure could be accelerated/simplified by working with packages of years and finding shapes closer to pushbacks you’re used to.

The outputs of our software will serve basically as optimized pushbacks, searching for maximum NPV and controlling whatever variable you consider necessary. Once we manage to import MiningMath surfaces into the other package, they will serve as guidance and they should assist the other package in finding higher NPVs. Most of these packages also allow us to predefine the blocks’ destination, if we wish to use MiningMath optimized cutoff policy. Finally, the package should have “only” the duty to do the bench scheduling, according to your short-term operational/tactical needs.

Even if you decide, for any internal reason, that you have to use LG/Pseudoflow to define final pit limits, there is no problem at all. MiningMath is the only tool available in the market capable of performing complete strategic analysis by building decision trees unconstrained by predefined pushbacks. Please, check this short example (in Spanish) with dozens of scenarios just for the decision on CAPEX regarding processing capacities. Check also the second half of this video for a broader view on how to use the same concept to take strategic decisions on many other aspects related to mine projects or ongoing operations. I assure your managers will get much more interested in your reports once you start adding this sort of strategic analysis. Notice you could perform this sort of analysis either free of constraints or respecting any pre-existing (designed) ultimate pit or pushbacks.

Figure 4: Multiple scenarios to build.

Going one step beyond, we also have clients improving their adherence and reconciliation between long and short-term mine plans by using MiningMath as a complementary tool. Notice that, by using MiningMath in strategic mine planning, you could add more constraints from real-life operations, even if you decide just to check your current long-term plans. Also, notice you could place some surface limits, such as the designed surface of the next five years plan for example, and give some controlled freedom to short-term planners to rerun their mine plans, including more operational details, as long as they don’t change anything from period 6 on and they don’t affect the NPV negatively. Whenever they find an issue or an opportunity, short and long term teams have a way to collaborate and generate new joint configurations that account for all the strategic and tactical needs of the project simultaneously. All the remaining details, such as the designs, could be adjusted using the current mining packages available.

If you wish to skip such steps and go straight to your final designed plans, we can guide you through this process, which includes a loop of running MiningMath and designing surfaces, until reaching a reasonable and operational sequence. This is a much more innovative procedure, which tends to achieve higher NPVs.

Coordinates to index conversion

The indices of each block represent its position in the model, indicating in which column, row, and level (IX, IY, and IZ) it is.

The indices must be integer values and may start at any value (for the Marvin model, the indices 1,1,1 were adopted for the first block).

The model's origin must be placed at the bottom of the model, counting from the minimum coordinates in X, Y, and Z.

Figure 1 highlights the origin of the Marvin block model and the first block index coordinates (1,1,1).

Figure 1: Block’s Matrix.

If the block model contains geo-referenced information based on coordinates, these coordinates can be converted into indices before the model is imported into MiningMath.

To perform this conversion, check the following demonstration on how to convert coordinates into indices using data from Figure 2 and the equation from Figure 3.

Figure 2: Sample data to convert coordinates into indices.

Figure 3 exemplifies the conversion for the X-axis; the process is the same for the Y and Z axes, using the corresponding values.

Figure 3: Equation to convert coordinates into indexes.

Click here to download a spreadsheet to convert the coordinates into indices and calculate the economic values.

Figure 4: Resultant coordinates converted to indexes.
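
For reference, this conversion can also be scripted. The minimal pandas sketch below mirrors the logic of the equation in Figure 3 under common assumptions (coordinates refer to block centroids, and the block at the minimum coordinates receives indices 1,1,1); the origin, block dimensions, and file/column names are illustrative, so replace them with the values of your own model.

import pandas as pd

# Illustrative origin (centroid of the first block) and block dimensions;
# take the real values from your own model, as in Figure 2.
origin = {"X": 2000.0, "Y": 2000.0, "Z": 100.0}
block_size = {"X": 30.0, "Y": 30.0, "Z": 30.0}

model = pd.read_csv("block_model.csv")  # hypothetical file with centroid columns X, Y, Z

# IX = (X - X_origin) / Dim_X + 1, rounded to the nearest integer (same for Y and Z).
for axis in ("X", "Y", "Z"):
    model["I" + axis] = ((model[axis] - origin[axis]) / block_size[axis]).round().astype(int) + 1

model.to_csv("block_model_with_indices.csv", index=False)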

The video below exemplifies the conversion process in case you have any doubts.

Video 1: How to convert block coordinates into block indices.

Dilution

This is an informal video on how to consider dilution and mining recovery.

Video 1: Dilution and Mining Recovery.

Keep your Original Data

A common question while using MiningMath integrated with other mining packages is: How to keep the data from one to another?

To facilitate this exchange, you can keep any field from the other software and import it as an "Other" parameter. By following this approach, MiningMath will keep the data in the reports and in the MinedBlocks.csv file generated after the optimization. Thus, you will be able to import it back into your mining package using the same parameters as when it was exported.

This approach is quite useful when you need to keep the original X, Y, and Z coordinates, or any other data that helps identify the blocks in the platform from which they came.
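
As an illustration, the hypothetical pandas sketch below merges the MinedBlocks.csv output back with the original export, assuming the original coordinates were kept as "Other" fields named ORIG_X, ORIG_Y, and ORIG_Z (these names and files are assumptions, not fixed conventions).

import pandas as pd

# Hypothetical files: the export from the other mining package and the MiningMath output.
original = pd.read_csv("original_model.csv")
mined = pd.read_csv("MinedBlocks.csv")

# Join the schedule results back to the original data using the preserved identifier columns.
merged = mined.merge(original, on=["ORIG_X", "ORIG_Y", "ORIG_Z"], how="left")
merged.to_csv("schedule_for_reimport.csv", index=False)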

Operational Solutions

A block-by-block schedule is usually very difficult to achieve in practice; for example, operations may not allow multiple types of equipment on the same bench. How can you cope with this?

Although MiningMath works with blocks as inputs, the level of connectivity is user-defined by playing with geometrical parameters in the interface.

The following image shows results for the Marvin deposit when changing the Minimum Widths (filtered view after Period 2).

Figure 1: Marvin deposit and visual comparison across scenarios differing their operational widths.
Figure 2: Marvin deposit and NPV impact from scenarios differing their operational widths.

Note that everything changes when playing with a single parameter, including the Life of Mine and geometries. Such impacts are also possible when playing with economic aspects, slope angles, environmental and boundary constraints, fleet size, processing setups, blending requirements, etc.

Watch the following video on how to play with operational constraints to achieve results closer to the reality of any project.

Video 1: Operational Constraints.

Optimization Hints

Impact of constraints on NPV optimization

The relationship between the number of constraints and Net Present Value (NPV) is generally inversely proportional. Scenarios with fewer constraints enable the optimization algorithm to search more freely, often leading to higher NPV results. It is therefore recommended to first explore the full potential of a mining project, starting with the Data Validation procedure, followed by the other possible Workflows that can inform further steps.

A study on constraint impacts

To better understand the effects of constraints, the MiningMath team conducted 2,000 simulations as depicted below.

Results from a 2,000-simulation study conducted by the MiningMath team.

The study examined the impact of both individual constraints and their combined effects.

  1. Individual Variation

    Initial simulations tested the effect of varying individual constraints such as Copper Selling Price, Mining Widths, and Vertical Rate of Advance. These parameters were adjusted individually to understand their tendencies. The results showed that most individual variations had a high probability of generating NPVs around $900M.

  2. Combined Scenarios

    The next phase involved running "overall" scenarios, where all 11 variables were assigned random values simultaneously. In these cases, the constraints collectively reduced the NPV potential, with a higher likelihood of NPVs around $200M.

Balancing freedom and constraints

The findings of the above study highlight a key principle: increasing constraints tends to decrease the likelihood of achieving higher NPVs. Users are encouraged to explore constraints to strike a balance between freedom for the algorithm and guidance to achieve practical results. Some strategies for that are given throughout the page.

Insights into MiningMath's algorithm

MiningMath employs mathematical programming to maximize NPV while respecting imposed constraints.

Constraints hierarchy order. More info here.

Advanced users can take advantage of this capability by strategically placing constraints that guide the optimizer toward better solutions.

For example, starting with unrestricted total movement/production allows for better waste distribution over time. Subsequently, applying production limits (e.g., 100 Mtpa, 90 Mtpa) helps assess tendencies by creating a curve of scenarios against NPV.
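
As a hypothetical illustration of such a curve, the sketch below plots NPV against the production limit for a handful of scenarios; the numbers are made up and would normally come from the reports of your own runs.

import pandas as pd
import matplotlib.pyplot as plt

# Made-up results from scenarios that differ only in their total production limit.
results = pd.DataFrame({
    "production_limit_mtpa": [110, 100, 90, 80, 70],
    "npv_musd": [905, 900, 880, 840, 780],
})

results.sort_values("production_limit_mtpa").plot(
    x="production_limit_mtpa", y="npv_musd", marker="o", legend=False)
plt.xlabel("Total production limit (Mtpa)")
plt.ylabel("NPV (M$)")
plt.title("NPV versus production limit across scenarios")
plt.show()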

Constraint management

The ideal approach begins with minimal constraints and gradually introduces reasonable ones. By doing so, the algorithm focuses on a narrower solution space, potentially yielding higher NPVs.

Recommended workflow

  1. Start with minimal constraints and avoid adding complex geometries.

  2. Gradually introduce constraints one by one to observe their impact on NPV.

  3. Use a Workflow to assess how each assumption influences long-term project outcomes.

  4. Measure the "cost" of each constraint to guide managerial decisions.

Smart constraint strategies

  • Relax constraints when possible: For instance, temporarily flex slope angles to improve designs.

  • Use hints to guide the algorithm: Relax low-priority constraints or simplify complex scenarios into smaller problems.

  • Iterate gradually: Optimize from the least-constrained to the most-constrained scenarios.

Applying smart constraints to narrow the solution space and achieve optimal results. The ideal solution, represented by the green dot, lies closer to the smart constraint boundary, highlighting the advantage of focused guidance.

Shake-up approach

Nonlinear optimization problems often present challenges due to the presence of local maxima, where the algorithm may become "stuck," preventing it from finding the global maximum.

Example of a local maximum. The algorithm is not able to identify a path to the global maximum and concludes with a local-maximum solution.

One effective strategy to mitigate this issue is to introduce slight modifications to the parameters. These adjustments should be small enough that they do not significantly alter the practical outcome but are sufficient to change the mathematical structure of the problem.

Example of the new solution space after the Shake-up approach. The new solution space differs only slightly from the previous one, but enough for the algorithm to identify a path to the global maximum.

Some examples (not exhaustive) of constraints that could be shaken up are:

  • Processing Capacities: Introduce minor changes to processing limits. For example, reducing or increasing the mill capacity by 1-2% can shift the balance between waste handling and ore processing priorities.
  • Discount Rate: Slightly tweak the discount rate used in NPV calculations. This affects how future revenues are valued and can lead the algorithm to favor different project schedules.
  • Stockpile Parameters: Change the maximum or minimum limits for stockpiling by small amounts. This can prompt the algorithm to explore alternative stockpiling strategies.
  • Time limit: It is possible to indicate a time limit in hours before running a scenario. Although it was not implemented for this purpose, this parameter can also be used to find more diverse solutions.

This “Shake-up” approach encourages the algorithm to explore alternative paths in the solution space, potentially bypassing local maxima and improving the chances of identifying better solutions.
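
As a simple illustration of the idea, the sketch below generates a few slightly perturbed parameter sets; the parameter names and values are hypothetical, and each variant would be entered as a separate scenario.

import random

# Illustrative base parameters for a scenario (not from any real project).
base = {"process_capacity_mt": 30.0, "discount_rate": 0.10, "stockpile_limit_mt": 5.0}

def shake_up(params, relative_change=0.02, seed=None):
    """Return a copy of the parameters with small random perturbations (default +/- 2%)."""
    rng = random.Random(seed)
    return {name: value * (1 + rng.uniform(-relative_change, relative_change))
            for name, value in params.items()}

# Generate a handful of slightly different variants to run as separate scenarios.
variants = [shake_up(base, seed=i) for i in range(5)]
for i, variant in enumerate(variants, start=1):
    print(f"variant {i}: {variant}")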

Relaxed vs. violated constraints

Understanding the nuances of constraints is critical. Constraints may be relaxed to achieve feasible outcomes but should not be violated in ways that render the project unfeasible.

Examples of constraints that could be relaxed:

  • Processing stream shortfalls.
  • Operational constraints.
  • Certain average or sum-based constraints (depending on the deviation).

Examples of violated constraints

  • Production limits exceeded.
  • Slope angle violations.
  • Critical average or sum-based constraints that breach feasibility.

Reports detailing these nuances are available in the Excel output’s Report Tab for review.

Optimization Runtime

The optimization runtime is a common concern for professionals dealing with large, complex models. This page provides context and guidance to improve runtimes, which is quite useful for getting a big picture of the project's behavior under different assumptions and hypotheses.

Runtime Barriers

The runtime depends on a combination of aspects. It is directly related to the complexity of the deposit and to the number of:

  • Blocks.

  • Destinations, when more than 3 are used.

  • Constraints in use and conflicting goals with the same hierarchy order.

  • Variables imported.

  • Period ranges.

  • Parameters changing over time.

  • Multi-mine deposits.

  • RAM memory available (in this case, more RAM reduces the runtime). You can check it using the Windows Task Manager. More details about recommended hardware can be found here.

Users are often concerned about the limits for handling models with more than 20M blocks. MiningMath can handle virtually any model size: tests have been run successfully with models of up to 240M blocks without reblocking, which took three weeks on a 32 GB desktop machine.

Typically, datasets with 5 million blocks take a few hours (on an 8 GB RAM machine). In the future, the technology will be capable of concurrently running multiple scenarios on the same computer. There is no need for special servers with extra RAM for deposits of average size.

Hardware Improvements

Memory

Overall, the main bottleneck for MiningMath is memory consumption. The hardware upgrades that most positively impact the optimization runtime are:

  • RAM capacity.

  • RAM frequency.

Cores and threads

MiningMath is a single-thread application, which means:

  • Additional cores and threads do not affect the optimization run time.

  • Processors with higher clock speeds improve the run time.

Strategies to reduce the runtime

Use surfaces

The most recommended strategy is to go through the tutorial steps of data validation and constraints validation, and then start using surfaces as a guide to reduce complexity without losing the dilution aspects of your approach.

To get such guidance with a broader view and a reduced runtime, you can, for example, create optimized pushbacks first and leave the detailed schedule as a last step, given the complexity of the model. If these approaches still do not offer an acceptable runtime, try to obtain intermediate results by splitting the total production into 2 or 3 periods.

Reblocking

Reblocking is a method used to decrease the number of blocks in a block model by combining some of the smaller blocks to create larger ones. This can be done using MM Labs as described here.

Note: when reblocking your model it is important to evaluate dilution aspects that can be lost by increasing the block size.
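
For intuition only, the sketch below illustrates what reblocking does conceptually, merging 2 x 2 x 2 groups of blocks while preserving total mass and metal content; the column names are assumptions, and in practice the MM Labs app mentioned above should be used.

import pandas as pd

# Hypothetical input with index columns IX, IY, IZ, a MASS column (t) and a CU grade column (%).
model = pd.read_csv("regular_model.csv")
factor = 2  # merge 2 x 2 x 2 groups of blocks into one parent block

# Map each block to its parent block indices.
for axis in ("IX", "IY", "IZ"):
    model["P" + axis] = (model[axis] - 1) // factor + 1

# Sum masses and compute mass-weighted average grades so metal content is preserved.
model["CU_X_MASS"] = model["CU"] * model["MASS"]
reblocked = model.groupby(["PIX", "PIY", "PIZ"], as_index=False).agg(
    MASS=("MASS", "sum"), CU_X_MASS=("CU_X_MASS", "sum"))
reblocked["CU"] = reblocked["CU_X_MASS"] / reblocked["MASS"]
reblocked = reblocked.drop(columns="CU_X_MASS").rename(
    columns={"PIX": "IX", "PIY": "IY", "PIZ": "IZ"})

reblocked.to_csv("reblocked_model.csv", index=False)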

Time limit

It is possible to indicate a time limit in hours before running a scenario. The time limit is defined in hours due to the usual complexity of mining projects and because MiningMath will always try to deliver a reasonable solution.

This is a delicate parameter, and the limit may not always be met. It can also hinder the final solution, since it restricts the algorithm from exploring a broader range of potential solutions. However, even if better results are not obtained, faster runs will still give you a quicker assessment of your project. To better understand how the time limit works, you can visit this page.

Timeframes

Another strategy to reduce runtime is the use of timeframes. MiningMath allows the integration of short- and long-term visions in the same optimization process, facilitating analysis and strategic definitions.

For example, it is possible to consider less detail for longer time horizons. Such horizons need to be considered in the overall view of the mine, up to exhaustion, but they consume optimization time that could instead be focused on the early years of operation. The figure below depicts an example with monthly timeframes in the initial periods of the project, transitioning to yearly periods, and extending to decennial periods in the final stages. You can visit this page for more information on how to use timeframes.

Constraints chosen in the interface for a timeframe example.

Optimizing your Workflow

The following video presents some tips and tricks to optimize your workflow when running multiple scenarios. The options include:

  • Altering multiple scenarios through their SSSCN files, which are XML files that can be edited or parsed with user-created scripts (a sketch is provided below the video), before running the scenarios through the interface.

  • Running scenarios from the command prompt, without resorting to the User Interface.

Video 1: Optimizing your workflow.
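
For the first option, the sketch below shows one possible way to change a parameter across several SSSCN files with a user-created script; the tag name used here is purely hypothetical, so inspect your own .ssscn files to find the actual elements to edit.

import glob
import xml.etree.ElementTree as ET

# Loop over copies of the scenario files and update a parameter in each one.
for path in glob.glob("scenarios/*.ssscn"):
    tree = ET.parse(path)
    root = tree.getroot()
    for element in root.iter("ProcessingCapacity"):   # hypothetical tag name
        element.text = "30000000"                     # new limit, as an example
    tree.write(path, xml_declaration=True, encoding="utf-8")

After editing the files, the scenarios can be run through the interface as usual, or from the command prompt as mentioned above.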

Percent Models

A percent model divides the mineral deposit into blocks while accounting for irregularities in the deposit’s shape. Instead of assuming each block is entirely filled with material, the percent model indicates the proportion of each block that actually contains valuable material.

Although this is not a mandatory step for the optimization process, the lithology of each block can be defined considering either:

  • The tonnage of a block.
  • The value of a block.

For example, consider the information of a block as depicted below:

Block information divided into lithologies OX (Oxide), MX (Mixed), PM (Primary) and Waste.
Such a block could be classified as:
  • MX, if considering the greatest parcel in terms of tonnage.
  • PM, if considering the greatest parcel in terms of value.

MiningMath calculates tonnages as [block volume × density]. The density of a block should be the weighted average, based on the lithologies present and their respective percentages.
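
For example, a minimal sketch of this weighted-average calculation for a single block, using illustrative lithology fractions and densities (not values from any real model):

# Illustrative fractions and densities per lithology; real values come from your percent model.
block_volume = 30 * 30 * 30          # m3, assuming 30 m blocks
parcels = {
    # lithology: (fraction of the block, density in t/m3)
    "OX":    (0.20, 2.4),
    "MX":    (0.30, 2.6),
    "PM":    (0.25, 2.8),
    "Waste": (0.25, 2.5),
}

# Weighted average density, based on the lithologies present and their percentages.
avg_density = sum(fraction * density for fraction, density in parcels.values())
block_tonnage = block_volume * avg_density

print(f"average density: {avg_density:.3f} t/m3")
print(f"block tonnage:   {block_tonnage:,.0f} t")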

Recoveries should also be calculated based on the amount of material recovered from each parcel of the block.

The economic value is calculated considering the amount of material recovered from each parcel, along with their respective revenues and costs.

The economic value of a block can be calculated both with and without dilution:

  • Without dilution (Option 1): Only ore parcels feed the plant

    In this case, the economic value for the process will consist of Revenue - Costs, where:
    Revenue refers to the ore parcel (70%).
    Processing Costs refer to the ore parcel (70%).
    Mining Costs refer to the entire block (100%).

    Since MiningMath processes the entire block, input a higher value for the processing limit to compensate for the waste parcel (30%), which the algorithm counts as if it also fed the plant.

    Additional steps

    1) Create Auxiliary Columns: Track and control the tonnage limits of ore, waste, and any specific lithotype you want to monitor (as shown below).

    2) Set Auxiliary Columns as "Other": During the import process, mark these auxiliary columns as "Other." This allows you to track and control the tonnages of each material.

    3) Adjust Production Charts: Ignore the default production charts and instead use the tonnage charts set as "Other Constraints."

  • With dilution (Option 2): The entire block feeds the plant.

    In this case, the economic value for the process will consist of Revenue - Costs, where:

    Revenue refers to the ore parcel (70%).
    Processing Costs refer to the entire block (100%).
    Mining Costs refer to the entire block (100%).

    Since there is dilution, the processing limit input in the interface should reflect the actual plant limit.

    Additional steps

    1) Create Auxiliary Columns: Use auxiliary columns to provide further control of the tonnages for each parcel, ensuring accurate tracking.

    2) Adjust Processing Limit: Input the real plant limit in the processing limit field to account for dilution.

    By using these adjustments, you can accurately model the economic value of blocks with dilution.

Single block example

The figure below depicts an example of calculations for a block composed of different lithotypes and its respective economic values assuming no dilution and diluted material (download the spreadsheet here).
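
As a complement to the spreadsheet, the sketch below reproduces the two calculations for a single hypothetical block with a 70% ore parcel; the tonnage, revenue, and cost figures are illustrative only.

# Illustrative figures: a block with 70% ore and 30% waste, one processing destination.
block_tonnage = 67_500.0          # t, illustrative
ore_fraction = 0.70
revenue_per_ore_tonne = 18.0      # $/t of the ore parcel (grade x price x recovery already applied)
processing_cost = 9.0             # $/t processed
mining_cost = 2.0                 # $/t mined

ore_tonnes = block_tonnage * ore_fraction

# Option 1 - without dilution: only the ore parcel feeds the plant.
value_no_dilution = (ore_tonnes * revenue_per_ore_tonne
                     - ore_tonnes * processing_cost
                     - block_tonnage * mining_cost)

# Option 2 - with dilution: the entire block feeds the plant.
value_with_dilution = (ore_tonnes * revenue_per_ore_tonne
                       - block_tonnage * processing_cost
                       - block_tonnage * mining_cost)

print(f"economic value without dilution: ${value_no_dilution:,.0f}")
print(f"economic value with dilution:    ${value_with_dilution:,.0f}")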

Types of block models

The most common block model types used to define and manage the spatial distribution of geological and economic data are Regular, Percent, Subblocked, Reblocked, and GSM (Gridded Seam Model). Below is a brief overview and comparison of each type.

Block model types

  • Regular Block Model: The deposit is divided into uniform blocks of fixed size in three dimensions (X, Y, Z).

    This is the type natively supported by MiningMath. Formatting instructions can be seen here. Rotation of regular block models is also supported as described here.

  • Percent Block Model: Extends the regular model by incorporating percentage values to represent how much of a block contains a specific material or belongs to a given domain.

    This is not natively supported. However, instructions on how to format this model with and without dilution are provided here.

  • Reblocked Block Model: It is created by merging smaller blocks (often from a subblocked model) into larger ones to simplify calculations or meet specific optimization constraints.

    MiningMath provides an app in its MM Labs section that is able to reblock your block model. Further instructions can be seen here.

  • Subblocked Block Model: Refines regular blocks by dividing them into smaller sub-blocks where needed, typically along geological boundaries or to represent irregular shapes more precisely. It can give a higher detailed representation of geological features, but can also lead to large file sizes, higher computational demands, and higher complexity to prepare and manage.

    This is not supported by MiningMath.

  • GSM (Gridded Seam Model): This is used for stratified deposits, like coal or other layered materials. It divides the deposit into horizontal layers (seams) and models variations within each layer. However, it is less flexible for deposits with significant vertical or irregular variations.

    This is not supported by MiningMath.

Comparison and use cases

These model types offer different trade-offs between accuracy of deposit representation and the computational effort required for optimization.

The table below highlights the trade-offs between accuracy and complexity, using the regular block model as a reference point. It also summarizes the block model types supported by MiningMath.

Model type | Best for | Complexity (lower is better) | Accuracy (higher is better) | Support in MiningMath
Regular | Deposits with uniform geometry | Standard | Standard | Natively supported
Percent | Deposits with irregular ore/waste distribution | Higher | Higher | Supported through the formatting approach described above
Reblocked | Simplified optimization or large-scale production planning | Lower | Lower | Supported via the MM Labs reblocking app
Subblocked | Detailed geological models with sharp boundaries | Higher | Higher | Not supported
GSM | Layered, stratified deposits with predictable structures | Higher | Higher (for stratified deposits) | Not supported

Fake Destinations

Fake destinations serve as a workaround to account for variations over time in recovery rates, ore prices, costs, or even combinations of these economic factors. Additionally, they can be used to manually segregate materials of varying qualities into different stockpiles, providing greater control over resource management. There are two possibilities when considering fake destinations as detailed below.

Price Fluctuation

When defining price-fluctuation scenarios, different ore prices for the same material must not coexist in the same period. Thus, each "plant" uses a specific economic function, although all of them represent the same plant in reality.

Example

In the following example, columns CU_1 and CU_1.2 represent the economic values considering the default price for copper and an increase of 20% in this default price.

The main point is that these price assumptions cannot coexist. On the Production tab, add and define the Period Ranges in which each function applies. When Process 1 (CU_1) is active, use the proper production limit for it and set a production of zero for the other(s) in the same periods.

Repeat this logic for any other period range you need. The image below gives a clear example of it.
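
As a complement, the sketch below illustrates how the two economic value columns (CU_1 and CU_1.2) could be built with pandas; the grades, prices, costs, and recoveries are illustrative assumptions only.

import pandas as pd

# Hypothetical model with a copper grade column CU (%) and a MASS column (t).
model = pd.read_csv("block_model.csv")
price_per_tonne_cu = 6_000.0   # default copper price, $/t of metal (illustrative)
recovery = 0.88
processing_cost = 9.0          # $/t processed
mining_cost = 2.0              # $/t mined
selling_cost = 300.0           # $/t of metal sold

def economic_value(price):
    metal = model["MASS"] * model["CU"] / 100 * recovery
    return metal * (price - selling_cost) - model["MASS"] * (processing_cost + mining_cost)

model["CU_1"] = economic_value(price_per_tonne_cu)          # default price
model["CU_1.2"] = economic_value(price_per_tonne_cu * 1.2)  # price increased by 20%
model.to_csv("block_model_with_price_scenarios.csv", index=False)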

Imperial System

MiningMath imports databases in the metric system exclusively. If your database is in the imperial system, it must be converted to its metric counterpart. This page provides a script and further instructions to do this conversion successfully.

For this process, Python and the pandas package are required (see installation guidelines below). If you find any issues using the script, or make modifications that could be useful to the community, please share your findings at our Forum.

Script download

If you're already familiar with Python scripts, just copy or download this simple script and run it on your machine. Otherwise, keep scrolling for further instructions.

DOWNLOAD SCRIPT HERE.

				
import pandas

# conversion constants
# foot to meter
ft_to_m = 0.3048
# short ton to metric tonne
st_to_t = 0.907184
# ounce to gram (avoirdupois ounce; if your grades use troy ounces, 31.1035 g may be intended instead)
oz_to_g = 28.34952

# import the imperial model
imperial_model = pandas.read_csv("imperial_model.csv")

# create the metric model
metric_model = pandas.DataFrame(columns=['X', 'Y', 'Z'])

# set coordinates in the metric model (foot to meter)
metric_model['X'] = imperial_model['X'] * ft_to_m
metric_model['Y'] = imperial_model['Y'] * ft_to_m
metric_model['Z'] = imperial_model['Z'] * ft_to_m

# set block dimensions in the metric model (foot to meter)
metric_model['!DIM_X'] = imperial_model['DIM_X'] * ft_to_m
metric_model['!DIM_Y'] = imperial_model['DIM_Y'] * ft_to_m
metric_model['!DIM_Z'] = imperial_model['DIM_Z'] * ft_to_m

# set mass in the metric model (short ton to metric tonne)
metric_model['!MASS'] = imperial_model['MASS'] * st_to_t

# set volume in the metric model (cubic meters)
metric_model['!VOLUME'] = metric_model['!DIM_X'] * metric_model['!DIM_Y'] * metric_model['!DIM_Z']

# set density in the metric model (metric tonnes per cubic meter)
metric_model['%DENSITY'] = metric_model['!MASS'] / metric_model['!VOLUME']

# set grades in the metric model (ounces per short ton to grams per metric tonne, i.e. ppm)
metric_model['@GRADE_AU'] = imperial_model['GRADE_AU'] * oz_to_g / st_to_t
metric_model['@GRADE_CU'] = imperial_model['GRADE_CU'] * oz_to_g / st_to_t

# recoveries are dimensionless and are copied as-is
metric_model['*REC_AU'] = imperial_model['REC_AU']
metric_model['*REC_CU'] = imperial_model['REC_CU']

# economic values (per block) are copied as-is
metric_model['$PROCESS'] = imperial_model['PROCESS']
metric_model['$WASTE'] = imperial_model['WASTE']

# export the metric model to csv
metric_model.to_csv("metric_model.csv", index=False)

				
			

1) Installing Python

  1. Download Python's latest version at https://www.python.org/downloads/.

    Python download webpage
  2. Once the download is complete, open the .exe and follow the instructions for a default installation. Make sure to select "Add Python to Path" before proceeding, as depicted below. 

    Python installation screen
  3. At this point, the installation should be complete. You can check if Python has been correctly installed by running the command python --version in Windows PowerShell. 

    Python version on Windows PowerShell

2) Installing Pandas

Pandas is an open source data analysis and manipulation tool, built on top of Python. Follow the steps below to install it:

  1. Open the Windows PowerShell and run the command "pip install pandas".

    Pandas install command
  2. Once the installation is complete, you're able to use pandas inside your Python programs. You can check if pandas has been correctly installed by running the command "pip show pandas" in Windows PowerShell.

    Pandas version

3) Converting your database

This script can be used to convert foot to meter; short tonne to metric tonne; and ounce to gram.
It works with the columns: X, Y, Z, DIM_X, DIM_Y, DIM_Z, MASS, VOLUME, DENSITY, GRADE_AU, GRADE_CU, REC_AU, REC_CU, PROCESS, WASTE.

Follow the steps below:

  1. Save your database in a file named imperial_model.csv, in the same folder where your script is located.

  2. Run the command python imperial.py in Windows PowerShell from the folder where the script is located. The example below shows the script in the Downloads folder.

    Run script example
  3. Open the output file named metric_model.csv, and that's it! Your data has been converted to the metric system. 

    Output file example