MiningMath


With MiningMath there is no complex and slow learning curve!

Tutorials

Beginners guide

Run your first project

You can follow a sequence of pages to learn how to run your first project with our Getting Started training, from the installation process and formatting of your model files up to the long-term planning of your project.


Must-Read Articles

To get the most out of MiningMath’s optimization, we recommend the following flow through our Knowledge Base. It will guide you step by step to integrate multiple business areas and to improve your strategic analysis through risk assessments unconstrained by step-wise processes.

Set up and first run

  1. Quick Check: Here you’ll have all the necessary instructions to install, activate and run MiningMath.

  2. How to run a scenario: Once everything is ready, it’s time to run your first scenario with MiningMath so you can familiarize yourself with our technology!

Find new results

  1. Playing with pre-defined scenarios: Each change in a scenario opens a new world of possibilities. It’s time to understand a little more about this and see it in practice by playing with pre-defined scenarios.

  2. Decision Trees: Decision Trees provide a detailed, broad view of your project, allowing you to plan your mining sequence by analyzing every possibility in light of the constraints applied to each scenario, which options are more viable and profitable for the global project, and how these factors impact the final NPV.

Understand the technology in depth

  1. Current best practices: Here we go through the modern technology usually employed by other mining packages. It is important to understand these approaches in order to appreciate MiningMath’s differentials.

  2. MiningMath uniqueness: Now that you’ve practiced the basics of MiningMath, and understand how other mining packages work, it’s time to get deep into the theory behind the MiningMath technology.

  3. Interface Overview: It’s time to go through our interface overview, with detailed information about every screen and constraint available in MiningMath: Home page, Model tab, Scenario tab, and Viewer, for a better understanding of the possibilities.

Using and validating your data

  1. Formatting the Block Model: Learn how to format your block model data and use it in MiningMath.

  2. Importing the block model: Go through the import process and learn how to properly configure your data.

  3. Economic Values: MiningMath does not require pre-defined destinations ruled by an arbitrary cut-off grade. Instead, the software uses an Economic Value for each possible destination and for each block. After your data is formatted and imported into MiningMath, you may build your Economic Value for each possible destination.

  4. Data Validation: Once your data is set, it’s time to validate it by running MiningMath with a production capacity larger than the expected reserves, so that you get and analyze results faster.

  5. Constraints Validation: Continuing the validation, start to add the first constraints related to your project so that you can understand its maximum potential.

Improve your results

  1. Integrated Workflow: Each project has its own characteristics, and MiningMath allows you to choose the workflow that best fits your needs.

  2. Super Best Case: In the search for the upside potential for the NPV of a given project, this setup explores the whole solution space without any other constraints but processing capacities, in a global multi-period optimization fully focused on maximizing the project’s discounted cash flow.

  3. Optimized Pushbacks: Identify timeframe intervals in your project so that you can work with grouped periods before getting into detailed insights. This strategy allows you to run scenarios faster without losing flexibility or adding dilution to the optimization, which happens when you reblock.

  4. Optimized Schedules: Consider your real production and explore scenarios to extract the most value in terms of NPV.

  5. Short-term Planning: Now that you built the knowledge about your project based on the previous steps, it is time to start the integration between long and short-term planning in MiningMath. You may also optimize the short-term along with the long-term using different timeframes.

Export your results

  1. Exporting Data: After running your scenarios, you can export all data. Results are automatically exported to CSV files to integrate with your preferred mining package.

In-Depth MiningMath

This tutorial provides detailed guidance to the pages in the knowledge base for new MiningMath users. A shorter tutorial, with a set of must-read articles, can be found here. In this tutorial, a larger number of pages is contextualized and recommended for those with no previous experience using MiningMath who wish to gain more advanced knowledge.

Software requirements

  1. Quick check: Verify if your computer has all the minimum/recommended requirements for running the software.

  2. Put it to run: Here you’ll have all the necessary instructions to install, activate and run MiningMath.

Set up the block model

The next step after installation is to understand the home page interface and import your project data. The following pages go over these in detail.

  1. Home page: MiningMath automatically starts on this page. It depicts your decision trees, recent projects and model information.

  2. Import your block model: import your csv data, name your project, set fields and validation.

  3. Modify the block model: this window helps you modify your block model according to what is required for your project and also allows you to “Export” the block model to CSV format for use with any other software.

  4. Calculator: calculate and create new fields by manipulating your project inside MiningMath.

Handling unformatted data

If you don’t have a block model ready to be imported you might want to create a new one. The following pages can guide you through this process.

Define the scenario and run

Once you have your block model defined, there are several options to set up your project’s parameters before running a scenario.

  1. Scenario tab: set densities, economic parameters, slope angles, stockpiles, add/remove processes and dumps, production inputs, geometric inputs and so on.

  2. Save as: save the scenario’s configuration once everything is set.

  3. Run: the Run tab is the last step before running your project’s optimization. Change the scenario name, set a time limit, and set up results files.

Results

After running your scenario it is important to analyze and understand the given results.

  1. Output files and 3D viewer: by default, MiningMath generates an Excel report summarizing the main results of the optimization. It also creates outputs of mining sequence, topography, and pit surfaces in csv format so that you can easily import them into other mining packages. The 3D viewer enables a view of your model from different angles. 

  2. Export model: export your model as a csv file. This can be used in new scenarios or imported in other mining packages.

Extensive set-up

MiningMath offers a lot of customization. You might use pre-defined scenarios to learn with standard parameters. Otherwise, the following pages of the knowledge base detail several important parameters that might need to be fine-tuned in your project.

Advanced content

Complex projects might need advanced configurations or advanced knowledge in certain topics. The following pages cover some subjects considered advanced in our knowledge base.

Theory

In order to understand the theory behind MiningMath’s algorithm, a set of pages is provided to describe mathematical formulations, pseudo-code, and any rationale to justify the software design.

Workflows

MiningMath acknowledges and supports different workflows. This knowledge base provides a set of articles aimed at showing how MiningMath can be integrated into other workflows or have its results used by different mining packages.

Getting Started

Quick Check

System requirements

The only mandatory requirement for using MiningMath is a 64-bit system. Other minimum requirements are listed below:

  1. Windows 10

  2. 64-bit system (mandatory)

  3. 110 MB of space (installation) + additional space for your projects' files.

  4. Processor: processors above 2.4 GHz are recommended to improve your experience.

  5. Memory: at least 4 GB of RAM is required. 8 GB of RAM or higher is recommended to improve your experience.

  6. Microsoft Excel.

  7. OpenGL 3.2 or above. Discover yours by downloading and running the procedure available here.

  8. Visual C++ Redistributable: Installation of Visual C++ Redistributable is necessary to run this software.

Recommended Hardware

Memory should be the highest priority when choosing the machine on which MiningMath will run. Here’s a list of priority upgrades to improve performance with large-scale datasets:

  1. More RAM

  2. Higher RAM frequency

  3. Higher processor clock speed

Common Issues

Insufficient memory

As previously presented, RAM should be one of the most important components to prioritize when selecting a computer to run MiningMath. However, if you encounter an insufficient memory warning during the import of your block model, there are some recommendations you can consider:

1. Memory Upgrade: If possible, this is the best solution to enhance efficiency. The characteristics to observe are listed in the previous item, “Recommended Hardware.”

2. Free Up Memory: Consider closing other applications that are consuming the computer’s RAM while MiningMath is running.

3. Increase Windows Virtual Memory: This procedure involves allocating disk space to be used as RAM. To perform this procedure, we recommend this tutorial.

4. Reblock: If none of these options work, reblocking can be considered to reduce the size of the model. Check more details here.

Extra: In exceptional cases, when working with boxes, it may be viable to manipulate the block coordinates to bring them closer together, creating a smaller model box.

Put It to Run!

Installing, Activating and Running

Installing and activating MiningMath is quick and straightforward. All you need to do is follow the setup wizard and have an internet connection to activate your license. 

Video 1: MiningMath installation process.

Activating Your License

To activate your license, you will need to: 

  1. Open MiningMath (it will open automatically after the installation, but you can open it manually afterwards).

  2. On the left menu, click License.

  3. Select the field "I have an activation code" and paste the License Code provided by MiningMath.

  4. Click "Activate license".

Opening an old project

If you need to open an old project, just follow these steps:

  1. Open MiningMath (it will open automatically after the installation, but you can open it manually afterwards).

  2. On the left menu, click on Open Project.

  3. Search for the folder in which you saved your old project.

  4. Select the ".ssprj" file.

  5. Click on "Open" and it will show up in the "Recent Projects" list.

  6. Now you can open it!

The images below illustrate this process:

NOTE

MiningMath’s licensing method demands an internet connection. 

Optimizing Scenarios

Play with the predefined scenarios

MiningMath allows you to learn with each scenario by providing standard parameters which simulate some common constraints that a mining company may face. Standard scenarios are listed and described below so you can identify the main changes made within the “Overview” tab.

The ultimate goal of this practice is to prepare you to build Decision Trees, which allow you to organize scenarios in order to understand how variables influence one another and, consequently, how these variables determine the final NPV.

Figure 1: Scenarios on the Home Page

Market Conditions Decision Tree

1) BaseCase

The Base Case consists of the initial scenario, with a uniform production capacity and without sum, average or surface mining limits.

2) BaseCase-RampUp

While the base case considers a uniform production capacity, the BaseCase-RampUp scenario offers the possibility to vary the levels of production within the different timespans. We have an initial production capacity of 10Mton on the first 2 periods; 20 Mton on periods 3 and 4; and 30 Mton from period 5 until the end of the mine’s lifetime, with a total movement constraint of 30, 60, and 80 Mton, considering the increase of production within the time-frames mentioned.

Figure 3: BaseCase-RampUp

3) PriceUp and PriceDown

Scenarios “PriceUp” and “PriceDown” differ from the base case in the economic value used for the P1 process, with a 10% increase and decrease in the copper selling price, respectively. In the Destination tab, “P1 Cu +10” and “P1 Cu -10” were the values used for the process.

4) PriceUp-RampUp and PriceDown-RampUp

These scenarios consider a 10% copper selling price increase and decrease, and a ramp-up of the production capacity at the same time, as mentioned before.

5) PriceUp-RampUp-Protection300 and PriceUp-RampUp-Protection400

These scenarios consider a 10% copper selling price increase and a ramp-up of production at the same time, as mentioned in the previous scenario. In addition, a restrict-mining surface (a constraint used to prohibit access to an area within a specific timeframe) was included up to the fourth period, since it may represent legal constraints on a project.

Figure 8: PriceUp-RampUp-Protections

Other Decision Trees

Below you can see a description of some scenarios of other Decision Trees.

1) MW150 (Geometries Decision Trees)

The MW150 scenario considers geometries different from the base case in the geometric constraints. In this scenario, 150 meters was used as the mining width (the horizontal distance between the walls of two surfaces belonging to consecutive periods), along with a vertical rate of advance of 180 meters.

Figure 9: MW150

2) AvgCu (Average Decision Tree)

In the AvgCu scenario, blending constraints were added in the Average tab to allow a minimum of 0.5% and a maximum of 0.7% average copper grade at the processing plant. The optimization will have to fulfill the P1 process capacity and, as an additional challenge, meet this new set of parameters related to the average Cu content within the ore.

Figure 10: AvgCu

3) Proc13000h and Proc13000h-33Mt

(Process throughput Decision Tree)

Scenario Proc13000h considers 13,000 hours of processing equipment use as the maximum limit. This constraint was inserted in the Sum tab, which controls variables such as rock type feeding, energy consumption, and any parameter controlled by its sum. Scenario Proc13000h-33Mt adds a 10% increase in production, inserted in the Production tab, on top of the parameters mentioned previously.

Figure 12: Processing hours

4) Yearly-TriannualProduction

(Short-Long Term Integration Decision Tree)

This scenario considers yearly production for period range 1-4 and triannual planning for range 5-end. This way, it is possible to integrate both short- and long-term planning in a single run, facilitating the analysis and strategic definitions.

Any kind of timeframe can be used according to your needs.

Figure 13: Short-Long Term Integration (Yearly-Triannual Production example)

Translations

MiningMath supports and encourages the translation of its knowledge base to multiple languages. If you would like to translate our knowledge base and have your profile advertised please contact us.

Portuguese

Ask GPT

You can use ChatGPT to help you with our knowledge base. First, you will need to have the Plugins options enabled on GPT-4.

After that, choose the AskYourPDF option:

Finally, you should enter the following prompt:

For the requests all along this chat, consider the following content: https://miningmath.com/Knowledge-Base.pdf

Other prompts can help you with different requests. For example, you can ask GPT-4 to act as your own technical support agent that answers in the same language as your question:

Plugin AskYourPdf, consider the following content: https://miningmath.com/Knowledge-Base.pdf

Please answer the following question as a technical support agent, coming from a MiningMath user, in the same language as the question:

"QUESTION TEXT TO BE REPLACED"

Essential Topics

How to Run a Scenario

Video 1: Downloading MiningMath.

On MiningMath’s interface, you will find the Marvin block model and its scenarios (Figure 1). It is possible to preview the scenario and its parameters before opening it (Figure 2).

Choose and open Base Case, click the “Overview” tab (Figure 3) to check the parameters, and then click on “Run” to run the optimization (Figure 4).

After that, a short report with the results will be generated. To view it, check all the boxes on the “Load Options” window and click on “Load” (Figure 5).

Finally, whenever you feel ready to run your own scenarios, start by formatting your data here.

Common Issues: Setting your first scenario

When setting up your first scenario, you may come across some situations such as unavailable tabs and some fields marked in red. These situations are quite simple to resolve, as shown in the following video:


Results of the Optimization

By default, MiningMath generates an Excel report summarizing the main results of the optimization. It also creates outputs of mining sequence, topography, and pit surfaces in .csv format so that you can easily import them into other mining packages.

Viewer

The 3D viewer enables a view of your model from different angles. The block colors are defined according to the property displayed, varying from blue to red (smallest to largest) for destinations, periods, or any other parameter. Therefore, it’s possible to filter the blocks by the period in which they were mined or processed, for instance. In addition, it also allows you to compare multiple scenarios by loading different cases and using the left bar to change from one to another.

Output Files

After optimizing your block model and running your scenario(s), MiningMath generates standard output files with detailed reports. The main files have a universal format (.csv), which allows you to easily import them into other mining packages to start your mine design and further steps of your projects.

To open the project folder, right-click the scenario’s name and choose “Show in the Explorer“. The optimization’s main output files are:

  • Scenarioname.xlsx: Short report with the main results.

  • MinedBlocks.csv: Detailed report which presents all the blocks that have been mined.

  • Surface.csv: Grid of points generated through the pit of each period.

Scenarioname.xlsx

Provides you with a short report with the main results of the optimization: several charts and sheets in which you can analyze the production on each period, the stockpiles by periods, the average grade of processes and dump, NPV per period, the cumulative NPV (Net Present Value), etc.

Figure 18: Graphic results

MinedBlocks.csv

This file offers a detailed report on all the mined blocks and their specificities: information on the mining sequence based on each block extracted, along with mined and processed periods, destinations, economic value, and all information used for the optimization. This file also allows you to identify blocks that were stocked and to follow the algorithm’s decision-making process.

Figure 19: Mined blocks
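
A quick way to inspect this file is with a short script. The sketch below is a minimal example, assuming hypothetical column names ("Period Mined", "Economic Value"); check the header of your own export before running it.

```python
# Minimal sketch: summarize MinedBlocks.csv by mined period.
# "Period Mined" and "Economic Value" are assumed column names for
# illustration; adjust them to the header of your own export.
import pandas as pd

blocks = pd.read_csv("MinedBlocks.csv")
summary = blocks.groupby("Period Mined").agg(
    n_blocks=("Period Mined", "size"),
    total_value=("Economic Value", "sum"),
)
print(summary)
```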

Surface.csv

The Surface.csv files bring a grid of points generated through the pit of each period: each surface is named according to its mining period and contains information about the topographic coordinates at that time. These files can be imported into the viewer separately, so that you can verify and validate your data before starting the optimization process. Note: surfaces are exported from and imported into MiningMath as coordinates.

Figure 20: Surface's CSV

Video 1: Outputs and file hierarchy.

Play with Predefined Scenarios

MiningMath allows you to learn with each scenario by providing standard parameters which simulate some common constraints a mining company may face. Standard scenarios are listed and described below so you can identify the main changes made within the “Overview” tab.

The ultimate goal of this practice is to prepare you to build Decision Trees, which allow you to organize scenarios in order to understand how variables influence one another and, consequently, how these variables determine the final NPV.

Figure 1: Scenarios on the Home Page

Dataset

The examples in this page come preinstalled with every version of MiningMath. If you have deleted this project by any chance, please download the zip file below, extract the files and choose the “Open Project” option in MiningMath.

BaseCase

The Base Case consists of the initial scenario, with a uniform production capacity, and without sum, average, or surface mining limits.
Figure 2: BaseCase overview

BaseCase-RampUp

While the base case considers a uniform production capacity, the BaseCase-RampUp scenario offers the possibility to vary the levels of production within the various timespans. We have an initial production capacity of 10Mton on the first 2 periods; 20 Mton on periods 3 and 4; and 30 Mton from period 5 until the end of the mine’s lifetime, with a total movement constraint of 30, 60, and 80 Mton, considering the increase of production within the time-frames mentioned.

Figure 3: BaseCase-RampUp

PriceUp and PriceDown

Scenarios “PriceUp” and “PriceDown” differ from the base case in the economic value used for the P1 process, with a 10% increase and decrease in the copper selling price, respectively. In the Destination tab, “P1 Cu +10” and “P1 Cu -10” were the values used for the process.

PriceUp-RampUp and PriceDown-RampUp

These scenarios consider a 10% copper selling price increase and decrease, and a ramp-up of the production capacity at the same time, as mentioned before.

PriceUp-RampUp-Protection300 and PriceUp-RampUp-Protection400

These scenarios consider a 10% copper selling price increase and a ramp-up of production at the same time, as mentioned in the previous scenario. In addition, a restrict-mining surface (a constraint used to prohibit access to an area within a specific timeframe) was included up to the fourth period, since it may represent legal constraints on a project.

Figure 8: PriceUp-RampUp-Protections

Below you can see a description of some scenarios of other Decision Trees.

MW150

The MW150 scenario considers geometries different from the base case in the geometric constraints. In this scenario, 150 meters was used as the mining width (the horizontal distance between the walls of two surfaces belonging to consecutive periods), along with a vertical rate of advance of 180 meters.

Figure 9: MW150

AvgCu

In the AvgCu scenario, blending constraints were added in the Average tab to allow a minimum of 0.5% and a maximum of 0.7% average copper grade at the processing plant. The optimization will have to fulfill the P1 process capacity and, as an additional challenge, meet this new set of parameters related to the average Cu content within the ore.

Figure 10: AvgCu

AvgCu-Stock5Mt

Here, the same blending constraints of the previous scenario (AvgCu) were added, in addition to a stockpile limit of 5Mton for process 1, on the destination tab. This feature allows you to control the stock limit of your whole process, which increases the optimization flexibility to feed the plant, while respecting the blending constraints that were already implemented.

Figure 11: AvgCu-Stock5Mt

Proc13000h and Proc13000h-33Mt

Scenario Proc13000h considers 13,000 hours of processing equipment use as the maximum limit. This constraint was inserted in the Sum tab, which controls variables such as rock type feeding, energy consumption, and any parameter controlled by its sum. Scenario Proc13000h-33Mt adds a 10% increase in production, inserted in the Production tab, on top of the parameters mentioned previously.

Figure 12: Processing hours

The Calculator

This feature allows the user to manipulate their project inside MiningMath, enabling adjustments and new field creation. Figure 1 shows a general view of the calculator. On the left side we have the block parameters and on the right the calculator itself, where the calculation can be done.

Figure 1: Calculator.

To use the calculator, just insert a name for the new field, select the type of field (to learn more about field types, access this link), and build your expression. In case of a more complex expression, just mark the “Logical Test” field to enable conditional features. The available operators are:

  • + : Addition

  • - : Subtraction

  • * : Multiplication

  • / : Division

  • % : Modulus

  • ** : Exponential

  • // : Floor division

Practical approach

To facilitate understanding, let’s work through some examples. Below is a generic math expression and its equivalent written in MiningMath’s calculator:

\((x^2)\times(\frac{y}{2}-1)\) is written as x**2*((y/2)-1)
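
Since the calculator’s operators mirror common programming notation, the same expression can be checked outside MiningMath. A minimal Python sketch with example values:

```python
# The expression above, evaluated with example block values.
x, y = 3.0, 10.0
value = x**2 * ((y / 2) - 1)   # (x^2) * (y/2 - 1)
print(value)                   # 36.0
```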

Adding a field without logical expression

Using the Marvin Economic Value calculation as an example, we are going to add a Block Tonnes field, as shown in Figure 2:

Figure 2: Adding a new field

Adding a field with a logical expression

One more time using Marvin’s block model, let’s suppose we want a maximum slope angle of 45 degrees.

First, we name our field, in this case “SlopeMax45d”, select the field type as “Slope”, and check the Logical Test box. Then, double-clicking the Slope field selects it and places it in the Expression. The next step is to select the operator: as we want a maximum of 45 degrees, we choose the “>” operator and insert the value 45 in the text box. If the test is true, that is, if the block’s slope is greater than 45, the field receives the value 45. If the test is false, i.e., the slope is lower than 45, the field keeps the original slope value. Figure 3 shows this calculation:

Figure 3: Logical test expression
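
The logical test in Figure 3 amounts to capping the slope at 45 degrees. A minimal sketch of that rule (not the calculator’s internal implementation):

```python
# "SlopeMax45d" rule: values above 45 are replaced by 45, others are kept.
def slope_max_45(slope: float) -> float:
    return 45.0 if slope > 45.0 else slope

print(slope_max_45(52.0))  # 45.0 (capped)
print(slope_max_45(38.0))  # 38.0 (kept)
```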

During expression construction, green or red lines will underline it, highlighting the correct parts and those that need adjustment. When everything is set, just click on “Add field” and the new field will be available for use in the project under its field type. If the user needs to delete a field, just go to the Parameters option, select the field, and delete it.

Removing a field

To remove an existing field, go to the “Parameters” tab, select the desired field and click “Remove”.

Figure 4: Removing a field

NPV Calculation

The following video explains more about the NPV calculation made by MiningMath’s algorithm. Understanding these steps might be useful for users working on projects with variable mining costs, which are not yet smoothly supported in the UI.

Video 1: NPV calculation.

The discount rate (%/year) is provided by the user in MiningMath’s interface, as depicted in the figure below.

Figure 1: Interface example to define discount rate (%/year)

In a usual scenario period ranges are defined by annual time frames, as depicted in Fig-2.

Figure 2: Interface example with annual time frame

In this case, the annual discount rate multiplier (annual_multiplier) used to compute the discounted cash flow is calculated as follows:

\(\text{annual_multiplier}(t) = \frac{1}{(1 + \text{input discount rate})^t}\)

The table below exemplifies one case for 10 periods.

Period | Process 1 | Dump 1 | NPV (Discounted) M$ | Annual multiplier | Undiscounted NPV M$
------ | --------- | ------ | ------------------- | ----------------- | -------------------
1      | P1        | Waste  | 1.2                 | 0.909             | 1.320
2      | P1 +5%    | Waste  | 137.9               | 0.826             | 166.859
3      | P1 +5%    | Waste  | 132.5               | 0.751             | 176.358
4      | P1 +5%    | Waste  | 105.4               | 0.683             | 154.316
5      | P1 -5%    | Waste  | 89                  | 0.621             | 143.335
6      | P1 -5%    | Waste  | 92                  | 0.564             | 162.984
7      | P1 -5%    | Waste  | 91.3                | 0.513             | 177.918
8      | P1 -10%   | Waste  | 52.3                | 0.467             | 112.110
9      | P1 -10%   | Waste  | 54.3                | 0.424             | 128.037
10     | P1 -10%   | Waste  | 12.1                | 0.386             | 31.384

Table 1: Example of annual multiplying factors and undiscounted cash-flows for a 10% discount rate per year. Process 1 exemplifies the use of different economic values per period.

In detail, Table 1 lists:
  1. the discounted NPV resulting from 10 yearly periods with a 10% discount rate per year;

  2. the annual discount multiplier (annual_multiplier) for each period; and

  3. the undiscounted NPV as the result of the discounted NPV divided by the annual_multiplier.
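
As a quick check, the first rows of Table 1 can be reproduced with a few lines of Python (a minimal sketch, not MiningMath’s internal code):

```python
# annual_multiplier(t) = 1 / (1 + rate)^t; the undiscounted value is the
# discounted NPV divided by that multiplier (10% discount rate per year).
rate = 0.10
discounted_npv = {1: 1.2, 2: 137.9, 3: 132.5}  # M$, first rows of Table 1

for t, npv in discounted_npv.items():
    multiplier = 1 / (1 + rate) ** t
    print(t, round(multiplier, 3), round(npv / multiplier, 3))
# 1 0.909 1.32
# 2 0.826 166.859
# 3 0.751 176.358
```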

MiningMath allows the creation of scenarios in which period ranges are defined with custom time frames (months, trienniums, decades, etc.), as depicted in Figure 3.

Figure 3: Interface example with custom time frames

In this case, the discount rate is still provided in years on the interface. However, the discount rate per period follows a different set of calculations. To identify the correct multiplier (discount rate for a custom time frame) applied to each custom time frame, it is necessary to apply the formula below:

\( \text{mult}(t) = \frac{1}{(1 + \text{discount_rate}(t)) ^ {\text{tf_sum}(t)}}\)

where:

\(
\text{tf_sum}(t) = \sum_{i=1}^{t}\frac{TF(i)}{TF(t)}
\)

and

\(
\text{discount_rate}(t) = (1 + \text{annual_discount_rate})^{TF(t)} - 1
\)

and

\(
TF(t)=
\begin{cases}
1,& \text{if}\, t\, \text{is in years}\\
\frac{1}{12},& \text{if}\, t\, \text{is in months}\\ 3,& \text{if}\, t\, \text{is in trienniums}\\
etc.&
\end{cases}
\)

For example, to calculate the multiplier of the first period in figure 3, the equation would be:

\( TF(1) = \frac{1}{12} = 0.0833… \)

\( \text{tf_sum}(1) = \sum_{i=1}^{1}\frac{TF(i)}{TF(1)} = 1 \)

\( \text{discount_rate}(1) = (1 + \text{annual_discount_rate})^{TF(1)} - 1 = (1 + 0.1)^{1/12} - 1 \approx 0.008 \)

\( \text{mult}(1) = \frac{1}{(1 + \text{discount_rate}(1)) ^ {\text{tf_sum}(1)}} = \frac{1}{(1 + 0.008)^{1}} \approx 0.992 \)

Another example, to calculate the multiplier of period 15 in figure 3, the equation would be:

\( TF(15) = 3 \)

\( \text{tf_sum}(15) = \sum_{i=1}^{15}\frac{TF(i)}{TF(15)} = 2 \)

\( \text{discount_rate}(15) = (1 + \text{annual_discount_rate})^{TF(15)} - 1 = (1 + 0.1) ^ {3} - 1 = 0.331 \)

\( \text{mult}(15) = \frac{1}{(1 + \text{discount_rate}(15)) ^ {\text{tf_sum}(15)}} = \frac{1}{(1 + 0.331)^{2}} = 0.564 \)
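
The two examples above can be reproduced with a short script. The sketch below assumes the period configuration of Figure 3 to be 12 monthly periods, 2 yearly periods and then trienniums (an assumption for illustration); adjust the tf list to your own scenario.

```python
# mult(t) for custom time frames, following the formulas above.
# tf[i] is the length of period i+1 in years (1/12 for months, 3 for trienniums).
def mult(t, tf, annual_rate=0.10):
    tf_sum = sum(tf[i] for i in range(t)) / tf[t - 1]
    discount_rate = (1 + annual_rate) ** tf[t - 1] - 1
    return 1 / (1 + discount_rate) ** tf_sum

tf = [1 / 12] * 12 + [1, 1] + [3] * 5   # assumed period lengths
print(round(mult(1, tf), 3))    # ~0.992 (first monthly period)
print(round(mult(15, tf), 3))   # ~0.564 (first triennium)
```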

Evaluate Project Potential

Certain constraints related to your project can be defined so that you can understand its maximum potential. The surface generated in this case could also be used as a restrict-mining surface in the last period to reduce the complexity of your block model and MiningMath’s runtime, since it already reflects the set of constraints inputted.

Example

  • Set up a scenario with 1,000 Mt in the processing plants, which corresponds to a lot more mass than expected in the whole life of the mine.

  • Add the Minimum Bottom Width (100m). This constraint will allow you to have a suitable work-front for your equipment.

  • Restrict Mining surface, if you have this constraint in your project.

  • Grade constraint until 0.7%.

  • Timeframe: Years (1), since it would all be processed in 1 period.

Note: Sum constraints can restrict the total amount of material handled (ore + waste) in the mine. Therefore, do not use them in the validation.

Using results

Now that the Constraints Validation step is done, you can use this final surface as a guide for future optimizations. This approach reduces the runtime and the complexity of the algorithm because, when it is taken into account, the blocks below this final optimized surface are not considered, which facilitates the heuristics inside the interface. Notice that we did not change the discount rate, so this first NPV does not represent reality. If you need an accurate result in this step, make sure to adjust it.

It’s important to remember that when we restrict mining to this surface, the number of periods generated in future runs could be reduced, because the average parameters of each period will have to meet the constraints of the overall package. Therefore, to achieve the same parameters in a shorter timeframe, some blocks may be discarded due to the mining sequence and the optimization of destinations within the whole mass.

With this in mind, you should already have enough information to decide and structure the next step of the optimization. Based on the amount mined in the previous step and on the processing capacity, define a good timeframe to identify the mining sequence. In this case, we had 231 Mt of total ore mass to split over almost 23 years, since the processing capacity is 10 Mt per period.

To improve optimization efficiency, before working on a yearly basis we decided to consider the first 5 years. It is also reasonable to generate a 10-year surface and run the optimization inside this limit, based on the observations made before. Remember that each assumption here can be made according to your project’s demands, and that MiningMath can work with any timeframe to meet your needs.

Decision Trees

Comparing Scenarios

Decision Trees provide you with a detailed, broad view of your project, allowing you to plan your mining sequence by analyzing every possibility in light of the constraints applied to each scenario, which options are more viable and profitable for the global project, and how these factors impact the final NPV. Consider, for instance, the plant production per year as a variable factor. Using Decision Trees (Figure 1), you will be able to analyze how each constraint, e.g. the ore price, affects that year’s production and whether it benefits the global project.

Figure 1: Essence of a Decision Tree, done in presentation software.

By running all the scenarios individually, just as you did in Practice First, you will be able to identify how each change within a set of constraints impacts the NPV results and the mining sequence generated (Figures 2 and 3), which provides a broader view of your project and enables you to decide which route to take to generate value for your company.

How to Analyze Multiple Scenarios

Increase in the value of copper

Analyzing first the scenario in which the economic value of the P1 process changes (“scn-PriceUp”), values such as the NPV would naturally be different. In this case, analyzing the NPV and the total movement (Figure 3), it’s possible to see that a different mining sequence was generated, which increased the mine’s lifetime by one period. This market change also increased the cumulative NPV (Figure 4), given its direct relation with the copper selling price. The charts below were made from MiningMath’s results in simple spreadsheet software.

Figure 4: Total mass (Process+waste) handled on each scenario.
Figure 5: Cumulative NPV contrasts.

Adding an average grade limit

Now we can analyze the scenario in which a restriction on the average grade at the P1 process was added, using minimum and maximum copper limits (“scn41-AvgCu”). The blocks to be processed have to meet the established targets, allowing better selectivity of what should or should not be processed. Blocks with higher or lower grades than required can be blended with others to generate an average grade that respects the constraints and improves the NPV.

Notice that there was higher total production (Figure 5) in each period, caused by the increase in the stripping ratio (waste/ore) needed to meet the 30 Mton of ore production at the P1 process and the average grade targets set in the “scn41-AvgCu” scenario. Better stockpile use is expected, in order to use all the blending capabilities and decision-making intelligence of the algorithm to decide which blocks could be mixed to fulfill the plant capacity. In addition, the cumulative NPV (Figure 6) shows that by inserting average grade constraints we consequently reduce the algorithm’s flexibility and lose some money to keep the operational stability frequently required at a processing plant.

In general, the main goal of MiningMath, considering the set of constraints provided, is to maximize the cumulative NPV in the shortest mine lifetime possible, which would reduce the project depreciation by interest rates. The charts below were made based on MiningMath results with the support of spreadsheet software.

Figure 6: Total mass handled on each scenario.
Figure 7: Cumulative NPV contrasts.

Building Decision Trees

You have been introduced to some of MiningMath’s functionalities. Now let’s take a closer look at how decision trees are built.

Mine project evaluation largely relies on technology from the 1960s, in which a step-wise process is usually necessary along with time-consuming activities, like pit design, in order to create a single scenario. Evaluating projects through this approach can take from weeks to months of multidisciplinary work just to produce a couple of scenarios. This process is often guided by arbitrary decisions that may constrain the mathematical solution space, confining solutions to engineering expertise and judgment.

Global optimization scheduling can speed up the process of generating multiple scenarios for a project overview prior to detailed work. MiningMath integrates the different areas of the business and allows managers to improve their decision-making process by structuring their strategic analysis through multiple decision trees, with a broader and optimized view of their projects, comprising constraints from different areas of the company.

The following video shows a few possibilities that are recognized only when seeing the available paths to create value. The video is oriented to technical daily usage but also covers interesting subjects from the managerial perspective. For the latter, skip straight to minute 15:23.

Video 1: Video detailing the building of decision-trees.

Apply to your projects

Now that you have played with the sample data, it is time for a hands-on approach: apply this optimized strategy to your own projects!

MiningMath already allows you to structure your Decision Trees layout at its home page, which facilitates and guides the decision-making and mining planning processes.

Take advantage of the possibility to add (+), rename, or delete Decision Trees (Figure 7) by right-clicking their names, and/or exchange scenarios (Figure 8) between trees to build different mining planning strategies. The icon is a shortcut, so you can easily open your scenario’s full report.

Compare everything at a glance and identify how each change impacts your results, building your own analysis with presentations based on MiningMath charts, as shown in Figure 1.

FAQ

What is MiningMath?

It is a software application that uses innovative direct block scheduling technology. MiningMath aims to maximize the Net Present Value (NPV) of a project by deciding, based on an imported block model, which blocks will be mined, when, and to which destination each block will be sent.

It is possible to define multiple processing plants, stockpiles and waste dumps, respecting their capacities. It is also possible to set physical limits or force mining in certain regions by importing surfaces.

As the software has a flexible algorithm, it will be possible to include other restriction types in the future, such as blending for example.

Read more.

OK, but why should I use MiningMath?

MiningMath allows running a complete schedule directly from the resource block model, with no need to define a final pit, nested pits, pushbacks, cut-off grade optimization and stockpiles, as would be required in a traditional full scheduling exercise. MiningMath will find a mining schedule that aims to maximize the NPV of the project, combining all the steps mentioned and optimizing all periods simultaneously. Therefore, an experienced professional can test multiple scenarios by modifying parameters and advance other stages of their work while MiningMath performs the entire optimization.

Read more.

Is the generated solution operational, or only a mathematical result?

Each mining plan generated by the optimization respects important geometrical parameters, such as a minimum pit bottom width, a mining width and vertical rates of advance, which can be configured specifically for your project. Furthermore, MiningMath technology generates surfaces without geotechnical errors. The generated plans are close to the operational reality of the mine, which implies smaller variations in the parameters when ramps are designed.

In practice, how can I use MiningMath for schedule optimization?

A block model in CSV format, including indexes or coordinates and the economic values of the blocks, is first imported, followed by entering the primary parameters of the model and production restrictions via the interface. Upon completion of these steps, MiningMath is ready to perform the optimization.

The resulting surfaces will respect the user-defined parameters and a report will be generated with graphics containing the most important indicators.

To demonstrate the usage and power of the software, you can access our demo video here.

Does MiningMath suit my project? What are the limitations of the current version?

The current version of MiningMath suits any open pit mining project that can be modeled with blocks of regular dimensions. If your project has multiple types of rock per block, there are ways to adapt the inputs to handle these cases. MiningMath is 100% based on 64-bit technology and has an efficient algorithm, capable of handling tens of millions of blocks without requiring supercomputers or cloud computing.

To date, MiningMath has focused on developing the best algorithms; future versions will introduce more facilities for the user, including the development of plug-ins for other mining software packages on the market.

I have a sub-blocked model. How could MiningMath be used in this case?

If the model can be exported by dividing all blocks into sub-blocks, then we have a regular database formed only by sub-blocks and MiningMath can run it. We perform regular tests successfully using models with tens of millions of blocks. For future versions, we are planning significant efficiency improvements.

Does MiningMath have the Lerchs-Grossman (LG) algorithm implemented?

No. LG was a brilliant algorithm for its time, but new software no longer needs to implement it. Technological advances have shown that new methods overcome some of the barriers LG faces. These days, software that uses the same mathematical model as LG will rather implement an algorithm based on maximum flow (Max Flow), which can run tens or hundreds of times faster than LG. However, neither LG nor Max Flow has the flexibility to include other important restrictions, such as a minimum pit bottom width or blending.

MiningMath uses technology currently recommended in practice and in research centers, among the most advanced, tested and available approaches to optimization. It was implemented using modern techniques based on mixed integer programming and heuristics. Its mathematical model is more realistic because it considers operational aspects and uses surfaces to return solutions without any geotechnical errors. In practice, what is mined are surfaces, not blocks. This type of technology has the flexibility to include other real restrictions, such as blending.

Read more.

Interface Overview

Home Page

MiningMath automatically starts on the Home Page, as shown in Figure 1.

Overview of Recent Projects

Decision Trees feature enables you to create new tabs and ways to organize the mining planning strategies.

Open and View options for the current selected scenario under the Decision Trees tab.

Edit button shortcut, which will lead you to the Calculator functionality explored further ahead.

Model table discloses the main information regarding your block model and its parameters.

Figure 1: Interface example.

The following are three main information windows:

  • The Recent Projects window allows you to select a project and have an overview of the saved scenarios, which will appear right away in the Decision Trees tab. It allows the user to quickly navigate through recent projects and scenarios without opening them from their original folders. Right-clicking the project name gives the user 4 options:

    1) New scenario: The "Scenario config" window will open, allowing you to decide in which decision tree to place the new scenario, its name, and a description for it. Afterward, you will be directed to the Scenario tab to set it up.

    2) Show in explorer: This option takes you to the directory containing the folder with the project files and data.

    3) Remove from list: Excludes the referred project from the Recent Projects list.

    4) Delete project: Deletes the project and the scenarios from it.

  • The Decision Trees feature enables you to create new tabs and ways to organize the mining planning strategies, exchanging scenarios between them if necessary. It gives access to all paths involved in the project, providing a broad view and enhancing the decision-making process. It shows each decision tree’s scenarios with key information, such as name and description to easily identify their characteristics, NPV in M$, runtime, and a direct link to the sheet containing all the scenario’s results, available after the scenario’s execution. It’s possible to:

    1) Add new trees by clicking on “+”

    2) Rename a tree by double-clicking its name

    3) Right-click a tree to add a new scenario, rename the tree, or delete it.

    In more detail, some hidden options are available when right-clicking the scenario’s name: open, view model, rename, show in explorer, delete, and the possibility to transfer it between decision trees. The scenario description can be easily edited with a double click.

    To open any scenario, click on a Recent Project first, and then choose the scenario you want to work with in the Decision Trees tab by right-clicking it and choosing "Open", or just select the scenario and click on “Open” at the bottom right of this screen. The “View” button leads to MiningMath’s Viewer.

  • The Model table discloses the main information regarding your block model and its parameters, so that you can easily review it at any time using the "Edit" button shortcut, which will lead you to the "Calculator" functionality.

Model Tab

This window helps you modify your block model according to what is required for your project and also allows you to “Export” the block model to CSV format for use with any other software.

This tab starts at the “Parameters” option, showing your previous setup from the importation, similar to Figure 2, and all the existing fields. It also allows you to remove any parameter at any time.

Figure 1: Parameters Model tab

The “Function” option discloses the “Validate your block parameters” table, so that you can choose a single block within your model to verify its values. It also enables MiningMath’s internal Calculator to make adjustments and changes to your dataset with the addition of new fields.

Figure 2: Internal Calculator at Function in Model tab

Scenario Tab

General Parameters

MiningMath automatically switches to the Scenario tab, in the General option, once a scenario is opened, as shown in Figure 1.

The General tab presents all the general inputs regarding densities (Figure 1), economic parameters (Figure 2), slope angles (Figure 3) and stockpiles (Figure 4), which are detailed next.

Densities

Densities, as shown in Figure 1, are used along with block size to calculate tonnages. The user has two options to define them:

  • "Field" shows the column(s) that has/have been assigned to the density during the importation. This option is intended to allow varying densities by block.

  • "Default value" is applicable to any block without density information, whether a density column is imported or not. It is also used when you choose the field as .

Economic Parameters

The Discount Rate is a field in the Economic parameters, as seen in Figure 2. It is usually considered on an annual basis and defines the impact of mining ore/waste over time, which influences the algorithm’s decision-making process.

While working with different time frames, the discount rate serves just as a rough NPV approximation and does not affect the quality of the solution much, given that the best materials are allocated first. Thus, by multiplying or dividing it by the number of periods, you might get reasonable results.

Slope Angles

Slope angles are one of the most important parameters when considering constraints hierarchy. The user, as seen in Figure 3, has two options here:

  • Field shows the column(s) that has/have been assigned to the slope during the importation. This option is intended to allow different slope angles by block.

  • Default value is applied to any block without slope information, even if a column was assigned. It is also used when you choose the field as <none>.

Stockpiling

The stockpiling feature can be used by activating the checkbox. When this option is enabled, shown in Figure 4, the user can define:

  • Fixed mining cost (cost/t) refers to the average mining cost used in the economic function. This value is used to decompose the economic value when considering stockpiles.

  • Rehandling cost (cost/t) represents the cost to reclaim blocks from the stockpile to the process.

To illustrate the other tabs inside the Scenario tab, the parameters used in the Marvin case will be employed as summarized below:

Parameter | Subparameter | Value
--------- | ------------ | -----
Densities | Field | Density
Densities | Default value | 2.75
Slope angles | Field | Slope
Slope angles | Default value | 45 degrees
Stockpiling | Fixed mining cost | 0.9 $/t
Stockpiling | Rehandling cost | 0.2 $/t
Discount rate | - | 10%

Table 1: Parameters used in the Marvin cases.

Destinations: Process, Dump and Stockpile

Figure 5: Destinations tab, Recoveries for each element/mineral and destination, and Stockpile limit in tonnages.

On the Destinations tab (Figure 5), you will define the destinations to which blocks can be sent. Each destination must be mapped to its respective field containing the economic values. MiningMath requires at least one destination for the process and one destination for the dump. For each destination, you have an economic value and recoveries by element.

Process & Dump

To add destinations, in the bottom corner of the window, click on:

  • Add Process

  • Add Dump

Each scenario must contain at least one process and one dump among the destinations. The destination of each block will be reported using the numbers 1 or 2 (see the numbers beside the Name column), which depend on the order in which the destinations were added.

Recovery

For each processing stream, the user must inform a process recovery, varying from 0 to 1, to any element/mineral whose column has been imported as a grade.

This value on the interface is used only for generating reports, since recovery has already been considered in the economic calculation. Use the following values for the processing stream:

  • Cu: 0.88

  • Au: 0.60

Stockpile

You can also define a tonnage limit for the stockpile if activated in the General tab (see Figure 4).

MiningMath treats the inputted tonnage as a cumulative upper limit applied over the entire life of mine.

In this example, a limit has not been defined, which implies an unlimited capacity to stock. Read more about stockpiles.

Economic Value

In the column of Economic value, you must assign each destination to its corresponding economic function. Therefore, use:

  • Destination 1 - Process 1 - Economic Value Process

  • Destination 2 - Dump 1 - Economic Value Waste

Figure 6 zooms in on the destination fields, showing how they should look for this example.

Figure 6: Economic values for each destination.
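
For reference, the sketch below shows one common way such per-destination economic functions are built for a single block. Every price, cost and recovery in it is an illustrative placeholder, not a value taken from the Marvin dataset.

```python
# Hypothetical per-block economic values for a process and a waste dump.
def economic_values(tonnes, cu_grade, recovery=0.88,
                    price=6000.0,          # $/t of metal sold (placeholder)
                    selling_cost=300.0,    # $/t of metal (placeholder)
                    mining_cost=0.9,       # $/t of rock (placeholder)
                    processing_cost=4.0):  # $/t processed (placeholder)
    metal = tonnes * cu_grade * recovery
    value_process = metal * (price - selling_cost) \
        - tonnes * (mining_cost + processing_cost)
    value_waste = -tonnes * mining_cost
    return value_process, value_waste

print(economic_values(74250.0, 0.006))   # grade given as a fraction (0.6% Cu)
```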

Production Inputs

Figure 7: Production tab and production limits for each destination based on a yearly timeframe.

After completing the previous fields, move to the Production tab (Figure 7). You can define limits (in tonnes) for each destination and the total amount of material moved per period and also add different timeframe ranges in the optimization. For this example, use the values as shown in Figure 7.

  • TimeFrame: Years (1)

  • Process 1: 30,000,000 t

  • Dump 1: 50,000,000 t

  • Total: 80,000,000 t
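
After a run, the limits above can be cross-checked against the exported results. A minimal sketch, assuming hypothetical column names in MinedBlocks.csv:

```python
# Check per-period tonnages against the production limits defined above.
# (A small tolerance may be needed to absorb rounding in the export.)
import pandas as pd

LIMITS = {"Process 1": 30_000_000, "Dump 1": 50_000_000}
TOTAL = 80_000_000

blocks = pd.read_csv("MinedBlocks.csv")
per_period = blocks.pivot_table(index="Period Mined", columns="Destination",
                                values="Tonnes", aggfunc="sum").fillna(0)

for dest, limit in LIMITS.items():
    assert (per_period[dest] <= limit).all(), f"{dest} exceeds its limit"
assert (per_period.sum(axis=1) <= TOTAL).all(), "total movement exceeded"
```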

Geometric Inputs

Figure 8: Operational constraints.

On the Geometric tab (Figure 8), you can define parameters intended to find mathematical solutions that already consider basic requirements for operational feasibility. Figure 8 highlights Operational Fields, which could differ for each period range and timeframe, such as minimum widths and vertical rate of advance. Optional Fields are also allowed. These allow the definition of: 1) areas to be forced and/or restricted; and 2) periods to which surfaces are applied. In this example, the values defined for all periods are:

  • Minimum width: Mining 100m, Bottom: 100 m

  • Vertical rate of advance: Maximum 150 m

In the Geometric tab you can also force mining and restrict mining using surfaces based on coordinates and defined as a 3D grid of points in CSV format. Surfaces are the most important constraints within MiningMath’s hierarchy, allowing you to impose your own understanding and take control of prior results and operational aspects.

Using surfaces, you are able to play with geotechnical aspects, force certain regions to allocate waste material, restrict areas to protect the environment, and/or guide operational aspects by importing a designed pit.

Due to the complexity of this subject, surfaces will be treated in a specific section of our documentation.

Average

Figure 9: Blending constraints.

On the Average tab, you are able to define a minimum and/or maximum average grades for any element/mineral imported as grade (Figure 9 area 1).

Blending constraints can also be defined by period ranges (Figure 9 area 2) and/or destination (Figure 9 area 3).

It’s worth mentioning that this minimum limit does not represent cut-off values. Since it is based on average parameters, the algorithm can use lower values to respect this parameter and increase the NPV with higher ones. If you wish to input a cut-off, a good way to do it is by filtering these blocks and assigning the mass as a sum field, as mentioned here.
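
One way to verify that a run respects these limits is to compute the tonnage-weighted average grade per period from the exported blocks. A minimal sketch, assuming hypothetical column names:

```python
# Tonnage-weighted average Cu grade per period for blocks sent to Process 1.
import pandas as pd

blocks = pd.read_csv("MinedBlocks.csv")
plant = blocks[blocks["Destination"] == "Process 1"]

weighted = plant.assign(cu_mass=plant["Cu"] * plant["Tonnes"])
avg_cu = (weighted.groupby("Period Processed")["cu_mass"].sum()
          / weighted.groupby("Period Processed")["Tonnes"].sum())
print(avg_cu)   # should stay within the configured min/max (e.g. 0.5-0.7%)
```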

Sum

Figure 10: Other constraints.

On the Sum tab, you are able to set minimum and/or maximum limits on the sum of any data imported as Other (Figure 10 area 1).

Other constraints can also be defined by period ranges (Figure 10 area 2) and/or destination (Figure 10 area 3).

This feature is available only at the full versions of MiningMath. Read more.

Overview

Click on Overview (Figure 11) for a single page summary of all parameters related to the direct block scheduling, as illustrated in the figure below.

The Save As option (Figure 12) can be used to redefine the name, description, and decision tree of an edited scenario.

You can also decide which files MiningMath will produce as outputs by using the execution options (Figure 13) before clicking on “Run” your scenario.

Viewer

After running your scenario, the Mined Blocks file will disclose its results in the 3D viewer, enabling you to view your model from different angles. By selecting “Period Mined”, you can view the mining sequence period by period. By selecting a surface, it is possible to identify the topography changes of each period and also modify its opacity, facilitating the visualization.

MiningMath also allows you to import surfaces already created if you place them in the same folder as the other ones, so that you can validate their geometry if necessary.

Click on “Load Scenario” to import multiple scenarios and compare them, in order to extract the best results according to your project constraints.

Overview

The Overview tab lets the user give a last check to each parameter inputted. All information available on this screen, as shown in Figure 1, is also present on the previous screens:

Note that the user can even go straight to the Overview tab to save time.

Figure 1: The Overview Tab.

Save As

By clicking the Save As button, a window will pop up, as shown in Figure 1. Then, the user will be able to set:

  • A scenario name

  • A scenario description

These fields help to identify a scenario among several others just by taking a look at the Projects List.

Figure 1: Saving As.

Run

The Run tab is the last step before running your project’s optimization. Figure 1 below shows its interface.

If the scenario has already been run, this screen also allows you to view its results in tables inside the software, facilitating the analysis.

After everything is checked, just Run the scenario.

With the progress bar you can monitor the execution of the scenario and the estimated time it will take to complete, as shown in Figure 2.

Figure 2. Progress bar

Handling Data

Formatting the Block Model

Block Model Basic requirements

MiningMath requires the following formatting specifications:

  1. Regularized block model: This means all blocks must be the same size.

  2. Air blocks must be removed prior to importation. This is the way MiningMath recognizes the topography.

  3. Coordinates of each block in the 3 dimensions.

  4. Header names should not have special characters or exceed 13 characters. This recommendation also applies to folder and file names.

  5. The data format should be a CSV file (Comma Separated Values), a format compatible with most mining packages.
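For reference, a minimal file that follows the requirements above could look like the hypothetical sample below (headers kept short and without special characters, one line per block, numbers purely illustrative):

    X,Y,Z,CU,AU,DENS,EcoProc,EcoWaste
    3490,6495,300,0.52,0.31,2.75,125000.0,-66825.0
    3520,6495,300,0.08,0.05,2.75,-310000.0,-66825.0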

Good practices

  1. Configure Microsoft Windows number formatting to use dot as the decimal separator.

  2. Use the metric system.

  3. Set multiple fields that will consider different economic values, material types, contaminant limits, and any other variable you wish to analyze or control.

Must check

Understanding Field Types

Field Types are the fields MiningMath can understand. Each imported column should be assigned to the proper field type so that the software treats each variable according to its meaning.

Figure 1: Field types

Mandatory Field Types and their meanings

  1. Coordinates X, Y, and Z refer to your geo-referenced information.

  2. Average refers to any variable that could be controlled by means of minimums and maximums considering its average: grades, haulage distance, and other variables.

  3. Economic Value refers to the columns with the economic value, which represent the available destinations. It is possible to import multiple economic values at once, and they may be used simultaneously (ex.: multiple processing streams) or calculated in the internal calculator mentioned on the next page.

Optional Field Types and their meanings

  1. Density refers to the block's density. This field is used to calculate the block's tonnage.

  2. Slope refers to slopes varying block-by-block, which gives the flexibility to define slopes by lithotype and sectors.

  3. Recovery refers to recoveries varying block-by-block.

  4. Sum refers to any variable that could be controlled by means of minimums and maximums considering its sum.

  5. Predefined destinations refers to possible fixed destination values. This can be used, for example, if you want to define pushbacks or apply lithological restrictions that prevent certain blocks from being processed. However, by fixing destinations you prevent MiningMath from reaching its full potential. More about this here.

  6. Other refers to information that you wish to have in the exported outputs.

  7. Skip refers to any variable that should be ignored. This field type might help improve the runtime, since these variables will not be considered nor exported along with the optimization outputs.

Field names shortcut

Shortcuts can be used for automatic recognition in the importation process. These are listed in the table below.

Field name         Shortcuts
Coordinates        X | Y | Z
Average            @ | grade
Density            % | dens | sg
Economic value     $ | dest | val
Recovery           * | recov
Slope              / | slope
Sum                +
Skip               !
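As a rough illustration of how this kind of shortcut matching works (a sketch of the idea only, not MiningMath's actual import logic), a simple lookup could be written as:

    # Field types and their header shortcuts, as listed in the table above
    SHORTCUTS = {
        "Coordinate": ["x", "y", "z"],
        "Average": ["@", "grade"],
        "Density": ["%", "dens", "sg"],
        "Economic value": ["$", "dest", "val"],
        "Recovery": ["*", "recov"],
        "Slope": ["/", "slope"],
        "Sum": ["+"],
        "Skip": ["!"],
    }

    def guess_field_type(header):
        """Return the first field type whose shortcut appears in the header (simplistic matching)."""
        name = header.strip().lower()
        for field_type, keys in SHORTCUTS.items():
            if any(key in name for key in keys):
                return field_type
        return "Sum"  # unrecognized columns default to the Sum field type

    print(guess_field_type("CU @ grade"))  # Average
    print(guess_field_type("dens t/m3"))   # Density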

Mandatory requirements

Considering the specifications mentioned before, the formatted data set should have the following information for each block:

  1. Coordinates.

  2. Grades (at least one element assigned as Average).

  3. Economic values (at least 1 process and 1 waste).

The following video gives an introduction on how to set up your block model.

Video 1: Block Model setup.

Attention to software conversions

The model’s origin must be placed at the bottom of the model, counted from the minimum X, Y, and Z coordinates.

Figure 1 highlights a block model origin at the corner of the first block and the coordinates on its centroid.

Each software package uses its own conventions for data format, naming and numbering systems, etc. These differences should be observed to prevent conflicts when transferring data between multiple software packages.

What you must know:

  1. MiningMath uses coordinates (X, Y, Z) in which Z, representing the elevation, increases upwards (Figure 3a).

  2. Other mining software may use indexes with IZ starting downwards (Figure 3b). MineSight is an example that uses this notation.

Figure 2: Blocks Matrix.

There is no right or wrong convention, but there is a correct procedure for each software.

To invert the indexing, use the following formula:
\(new(Z) = max(Z) + 1 – current(Z)\)

Figure 3a: The lowest IZ value is at the bottom of the model.
Figure 3b: The lowest Z value is at the top of the model, which will not fit MiningMath requirements.
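A sketch of that conversion with Python/pandas is shown below (the index column is assumed to be named IZ; adjust it to your own header):

    import pandas as pd

    blocks = pd.read_csv("blockmodel.csv")

    # Flip a downward-counting index so the lowest value sits at the bottom:
    # new(Z) = max(Z) + 1 - current(Z)
    blocks["IZ"] = blocks["IZ"].max() + 1 - blocks["IZ"]

    blocks.to_csv("blockmodel_flipped.csv", index=False)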

Air Blocks

MiningMath assumes that all imported blocks of your model are below the topography. This means it is necessary to remove all air blocks prior to importation. Unless your topography is totally flat, which is unlikely, the image below shows an example of how your model should be displayed.

Not removing air blocks may lead to unsatisfactory results and long processing times, since the optimization would consider blocks that do not exist in reality.

Figure 4: Example of how block models should look like with a rectangular base.

More Details on Air Blocks

The following video shows how to remove air blocks using filters in MS Excel. These tips are also applicable to any mining software of your choice.

Video 1: Removing air blocks using filters on MS Excel.
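If your exported model flags air blocks in some way, for example with zero density or a dedicated air flag (this varies between mining packages), a filter similar to the hypothetical pandas sketch below achieves the same result as the Excel filters shown in the video:

    import pandas as pd

    blocks = pd.read_csv("blockmodel_with_air.csv")

    # Keep only blocks below the topography; here air blocks are assumed
    # to carry zero density, which may not hold for your export.
    solid = blocks[blocks["DENS"] > 0]

    solid.to_csv("blockmodel.csv", index=False)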

Importing the Block Model

Block Model File

To import the block model, select the option New Project on the left panel of MiningMath (Figure 1). 

Figure 1: Creating a new project to import a new model.

Afterwards, the file name input field is shown in red, indicating a mandatory field (Figure 2). Browse for and select the CSV formatted file. Press Next to advance.

Figure 2: Importing a CSV model.

Project Naming

In the next window, shown in Figure 3, the Model Name must be entered.

Optionally, the destination folder (Model Folder) can be changed as well as the Scenario Name, and a Scenario Description can be added.

Figure 3: Defining a name for the model and the first scenario.

Imported Fields & Validation

Upon clicking Next, the following window will provide a statistical summary of information for the block model that will be imported (Figure 4).

Check the parameters carefully.

Figure 4: Validating your data.

Geo-reference system, Origin, Dimension and Rotation

Upon clicking Next, the CSV file will be imported into MiningMath, showing data related to the block model geo-reference system (coordinates only). The next steps are to enter the rotation (Azimuth rotation) and the origin according to your mining package, and the block dimensions, as illustrated in the figure below. The number of blocks is automatically calculated after the origin and dimensions are provided.

The origin of this project was x=3,475, y=6,480, and z=285, and the block dimensions were 30 meters in each direction.

Figure 4: Coordinates input.

Rotated models

MiningMath supports the use of block models that have been rotated using an Azimuth rotation (Figure 5). The amount of rotation degrees can be passed as depicted in Figure 6. After importing, you can see the rotated model in the Viewer tab (Figure 7).

Figure 5: Example of Azimuth rotation in the coordinate system.
Figure 6: Azimuth rotation depicted when hovering over the RZ field.
Figure 7: Example of rotated model in the viewer tab.

Field Type Assignment

When Next is selected, the following form will appear (Figure 8), showing correlations between the imported CSV file header and the available field types in MiningMath.

You must associate each imported column with one of the options located just above the table, for instance: block coordinates X, Y, and Z with the Coord. X, Y, and Z field types. For more details on how to correlate each column, access this link. You can also keep the original data from your previous mining package by using this approach.

If you do not already have an Economic Value function, when importing your block model, you will be directed to the Scenario tab. Then, click on the Function tab to calculate your Economic Value function in the internal calculator as explained here.

Figure 8: Assigning each column to the proper field type.
Notes
  1. MiningMath has mandatory variables (columns) to be assigned to the proper Field Type:

    1) Coordinates (X, Y, Z).

    2) Average

    3) Economic Values (at least two)

  2. The data validation screen might be overlooked, but it is very important to validate one's data based on minimums and maximums. Read more.

  3. Each column imported should be assigned to the proper field type in order for MiningMath to treat each variable accordingly. Read more.

  4. Typically, MiningMath recognizes some columns automatically when their headers are similar to the Field Type name. Otherwise, MiningMath will automatically assign them to the Field Type Sum.

    To enable the Next button, the user needs to assign each one of the mandatory variables to its respective Field Type.

Grade, Dimension and Origin

After clicking Next, MiningMath will ask for the grade units. As you can see in Figure 10, the copper grade has been defined as a percentage (%), while the gold grade was defined as PPM, which stands for parts per million and is equivalent to g/ton.

Figure 10: Informing block dimensions, origin, and grade units.

View Your Model and Surfaces

After filling in the required fields, the options View Model and Scenarios will be enabled. Before setting up your first scenario, you can view the model by clicking on the Viewer and then Load Scenario. Select all the tooltip options and click Load. This option also allows you to view surfaces already created: just place them in the scenario folder before loading and do the first validation.

Evaluate your model

After importing your model, you can view it in the Viewer tab as depicted in Fig. 11-14. This should help you answer questions such as:

  1. Where are the high grades distributed?

  2. Do the process economic values above zero match the regions identified in the previous question?

  3. How are waste economic values distributed? Are maximum and minimum values reasonable when you compare them with the process?

Economic Values

MiningMath does not require pre-defined destinations ruled by an arbitrary cut-off grade. Instead, the software uses an Economic Value for each possible destination and for each block. The average grade that delineates whether blocks are classified as ore or waste will be a dynamic consequence of the optimization process.

Destinations required

MiningMath requires at least two mandatory destinations.

Therefore, each block must be associated with:
  • 1 Processing stream and its respective economic value

  • 1 Waste dump and its respective economic value

Notes:
  • Even blocks of waste might have processing costs in the economic values of the plant. Therefore, non-profitable blocks would have higher costs when sent to process instead of waste.

  • If you have any material that should be forbidden in the plant, you can use economic values to reduce the complexity and runtime, as mentioned here.

Animation (click to see): Simplified flow-chart of blocks' destinations optimization.

Calculation

Each field related to Economic Value (Process/Waste) must report the value of each block as a function of its destination (Process or Waste in this example), grades, recovery, mining cost, haul costs, treatment costs, blasting costs, selling price, etc. The user is not required to pre-set the destination, as the software will determine the best option during the optimization.

To calculate the Economic Values you can use MiningMath’s internal calculator, available at the “Function” option inside the “Model” tab. To illustrate the calculation of economic values, an example is shown below. The calculation parameters are listed in Table 1.

Description                                Cu (%)       Au (PPM)
Recovery                                   0.88         0.6
Selling price (Cu: $/t, Au: $/gram)        2000         12
Selling cost (Cu: $/t, Au: $/gram)         720          0.2
Processing cost ($/t)                      4
Mining cost ($/t)                          0.9
Discount rate (%)                          10
Dimensions of the blocks in X, Y, Z (m)    30, 30, 30

Table 1: Parameters for calculating the economic values.

Figure 1: Internal Calculator.

Block Tonnes

  • Block Tonnes = BlockVolume * BlockDensity

  • Block Tonnes = 30*30*30*[Density]

Figure 2: Block model calculations.

Tonnes Cu

  • Tonnes Cu = Block Tonnes x (Grade Cu/100)

  • Tonnes Cu = [BlockTonnes]*([CU]/100)

Figure 3: Block model calculations.

Mass Au

  • Mass Au = Block Tonnes x Grade Au

  • Mass Au = [BlockTonnes]*[AU]

Figure 4: Block model calculations.

Economic Value Process

  • Economic Value Process =
    [Tonnes Cu x Recovery Cu x (Selling Price Cu – Selling Cost Cu)] +
    [Mass Au x Recovery Au x (Selling Price Au – Selling Cost Au)] –
    [Block Tonnes x (Processing Cost + Mining Cost)]

  • Economic Value Process = ([TonnesCu]* 0.88 * (2000–720)) + ([MassAu] * 0.60 * (12 – 0.2)) – ([BlockTonnes] * (4.00 + 0.90))

Figure 5: Process Economic Value calculation.

Economic Value Waste

  • Economic Value Waste = –Block Tonnes x Mining Cost

  • Economic Value Waste = –[BlockTonnes] * 0.9

Figure 6: Economic Value Waste calculation.

The example block in Figures 4-6 would generate –299,880$ if sent to the process and –55,080.1$ if discarded as waste. Therefore, this block should go to waste, since that results in a smaller loss than processing it. MiningMath defines the best destination considering the whole set of constraints over time, so in most cases this decision is much more complex than in the example above.
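The same calculation can be reproduced outside the internal calculator, for instance to check the economic value columns before importing them. The Python sketch below mirrors the formulas above using the parameters of Table 1 and a hypothetical block (grades and density are illustrative only):

    # Parameters from Table 1 (Marvin example)
    BLOCK_VOLUME = 30 * 30 * 30      # m3
    REC_CU, REC_AU = 0.88, 0.60
    PRICE_CU, COST_CU = 2000, 720    # $/t of contained Cu
    PRICE_AU, COST_AU = 12, 0.2      # $/g of contained Au
    PROC_COST, MINE_COST = 4.0, 0.9  # $/t of rock

    def economic_values(cu_pct, au_ppm, density):
        """Return (process, waste) economic values for one block."""
        block_tonnes = BLOCK_VOLUME * density
        tonnes_cu = block_tonnes * cu_pct / 100   # tonnes of contained Cu
        mass_au = block_tonnes * au_ppm           # grams of contained Au (ppm = g/t)
        process = (tonnes_cu * REC_CU * (PRICE_CU - COST_CU)
                   + mass_au * REC_AU * (PRICE_AU - COST_AU)
                   - block_tonnes * (PROC_COST + MINE_COST))
        waste = -block_tonnes * MINE_COST
        return process, waste

    # Hypothetical block: 0.4% Cu, 0.2 ppm Au, density 2.75 t/m3
    print(economic_values(0.4, 0.2, 2.75))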

Data Validation

Running an optimization for complex projects with several constraints may demand hours only to validate if the formatting has been done properly. Therefore, we present here an efficient scenario to quickly validate your data.

The next pages use the Marvin Deposit as an example. To see its parameters and constraints please check the page here.

Validate it First

In order to validate your data and cut its runtime, we strongly recommend running MiningMath Full with the following set up:

  1. Process and dumps set with respective recovery values.

  2. A bigger production capacity than the expected reserves. In this example, the expected life of mine is 35 years at a production rate of 10 Mt per year. Hence, a value of 1,000 Mt would be big enough to cover the whole reserve.

  3. No discount rate.

  4. No stockpiling.

  5. Density and slope values.

  6. Timeframe: Years (1), since it would all be processed in 1 period.

The figure below depicts this set up at MiningMath, with the highlighted fields.

Results

Results are depicted below, with blocks in the sequencing, surface, surface with blocks and production tonnage.

Ultimate pit

The surface returned by this data validation process represents the most economically viable pit shell, also known as the ultimate pit.

Questions

  • Did the scenario run properly?

  • Are most of the positive economic values from the process inside this surface?

  • Is the mining happening in reasonable areas?

  • Is there a reasonable number of periods of life of mine?

Constraints Validation

Continuing the data validation, start adding the first constraints related to your project so that you can understand its maximum potential. The surface generated in this case could also be used as a Restrict Mining surface in the last period to reduce the complexity of your block model and the runtime of MiningMath, since it already accounts for the set of constraints entered.

Example

  • Set up a scenario with 1,000 Mt in the processing plants, which corresponds to a lot more mass than expected in the whole life of the mine.

  • Add the Minimum Bottom Width (100m). This constraint will allow you to have a suitable work-front for your equipment.

  • Restrict Mining surface, if you have this constraint in your project.

  • Grade constraint of up to 0.7%.

  • Timeframe: Years (1), since it would all be processed in 1 period.

Note: Sum constraints can restrict the total amount of handling material (ore + waste) of the mine. Therefore, do not use them in the validation.

Let's make everything clear

Now that the Constraints Validation step is done, you can use this final surface as a guide for future optimizations. This approach reduces the runtime and the complexity for the algorithm because, when the surface is taken into account, the blocks below it are not considered, which simplifies the internal heuristics. Notice that we did not make any change in the discount rate; thus, this first NPV does not represent reality. If you need an accurate result at this step, make sure to adjust it.

It’s important to remember that when we restrict the mining to this surface, the number of periods generated in future runs could be reduced, because the average parameters of each one will have to meet the constraints of the overall package. Therefore, to achieve the same parameters in a lower timeframe, some blocks may be discarded due to the mining sequence and the optimization of destinations inside the whole mass.

With this idea in mind, you should already have enough information to decide and structure the next step of the optimization. Based on the amount mined in the last item and on the processing capacity, define a good timeframe to identify the mining sequence. In this case, we had 231 Mt of total ore mass to split over almost 23 years, since the processing capacity is 10 Mt per year.

To improve efficiency in the optimization, before working on a yearly basis, we decided to consider the first 5 years. It is reasonable to generate a 10-year surface and run the optimization inside this limit, due to the observations made before. Remember that each assumption here can be made according to your project’s demands and that MiningMath can work with any timeframe to meet your needs.

Exporting Data

Exporting the Model

Select the button Export Model on MiningMath’s Model tab, as shown below.

Figure 1: Clicking on Export.

After clicking Export, a new page will appear, allowing you to select the folder where the exported block model will be saved and to define its name.

Figure 2: Exporting data.

Just click “Next” and your model will be exported to the selected folder.

Public Datasets

MiningMath allows you to learn, practice, and demonstrate the concepts of Strategy Optimization, by showing any previously run scenario, using the full capabilities of the software with the Marvin deposit only. This version is freely available to mining professionals, researchers, and students who want to develop their abilities with this standard block model.

Marvin Deposit

DB Information

Below are listed the default parameters for Marvin according to the adaptations made in our formatted model.

Parameter Value
Block size
27,000 m³ (X = 30m, Y = 30m, Z = 30m)
AU - Selling Price
12 $/g
AU - Selling Cost
0.2 $/g
AU - Recovery
0.60
CU - Selling Price
2000 $/ton
CU - Selling Cost
720 $/ton
CU - Recovery
0.88
Mining Cost
0.9 $/ton
Processing Cost
4.0 $/ton
Discount Rate
10% per year
Default Density
2.75 t/m³
Default Slope Angles
45 degrees

Some common constraints applied to the Marvin deposit are listed below.

Constraint Value
Processing capacity
10 Mt per year
Total movement
40 Mt per year
Sum of processing hours
4,000 per year (detailed estimate of the plant throughput)
Vertical rate of advance:
150m per year
Copper grade
Limited until 0.7%
Minimum Mining Width
50m
Minimum Bottom Width
100m
Restrict Mining Surface
A surface in .csv format, for example due to a processing plant in the area.

Fixed Mining (Stockpiling)
0.9$/t
Rehandling cost (Stockpiling)
0.2$/t

Economic Values

  • Process Function = BlockSize * Density * [GradeCU/100 * RecoveryCU * (SellingPriceCU – SellingCostCU) + GradeAU * RecoveryAU * (SellingPriceAU – SellingCostAU) – (ProcessingCost + MiningCost)]
  • Waste Function = –BlockSize * Density * MiningCost

McLaughlin Deposit

DB Information

Below are listed the default parameters for the McLaughlin deposit according to the adaptations made in our formatted model.

Parameter Value
Block size
X = 7.62m (25ft), Y = 7.62m (25ft), Z = 6.096m (20ft)
AU - Selling Price
900 $/oz
AU - Recovery
0.90
Mining Cost
1.32 $/ton
Processing Cost
12 $/ton
Discount Rate
15% per year
Default Density
3.0 t/m³
Default Slope Angle
45 degrees

Economic Values

  • Process Function = BlockSize * Density * [GradeAU * RecoveryAU * (SellingPriceAU) – (ProcessingCost + MiningCost)]
  • Waste Function = –BlockSize * Density * MiningCost

Output files

The Execution Options or Run Options allow the user to define:

  • Files to be exported.

  • The visual results to be automatically shown on the viewer after each run.

Figure 1 highlights in (A) where the user can trigger this pop-up window and in (B) the options available, among which the user can:

  • Export/not export to CSV files:

    • The resulting surfaces
    • The resulting model in two ways: all blocks or only mined blocks, with/without coordinates and/or index information.
  • Set which results to be shown on the viewer:

    • Surfaces
    • Model
Figure 1: Execution options.

MiningMath automatically produces:

  • Formatted reports (XLSX files).

  • Tables (CSV) whose data feeds the reports.

  • Updated block model (MinedBlocks or AllBlocks).

  • Surfaces as a grid of points (CSV)

MiningMath organizes files, as listed below:

    • Model Folder
    • MiningMath Model file (.SSMOD).
    • MiningMath Project file (.SSPRJ).
    • Scenario folder
      • Output Block Model
        • MinedBlocks.CSV contains information about the mined blocks.
        • AllBlocks.CSV, when requested, contains information about all blocks.
      • Scenario file (.SSSCN) is a XML file read by the interface. Use it for a quick check on parameters used.
      • Report file (.XLSX) summarizes some quantifiable results, including charts such as productions, average grades, and NPV.
      • MiningMath also generates independent report files (.CSV) present in the report file (XLSX) as a backup:
        • Production Process.
        • Production Dump.
        • Production Total.
        • Grade Process.
        • Grade Dump.
        • Metal Process.
        • NPV.
        • Cumulative NPV.
      • Surface files (Surface-##.CSV) formatted as a grid of points.

SSMOD and SSPRJ are important to report any issues you face.

After each optimization, MiningMath exports the block model in one of two formats:

  • MinedBlocks.csv: The file presents only the blocks that have been mined from each scenario. Mined Blocks are exported by default, as it is a lighter file.

  • AllBlocks.csv: The All Blocks file presents all the blocks, whether mined or not, from each scenario, so it is basically the original Block Model along with resultant information from the optimization.

The resultant model includes all columns imported (except the skipped ones) besides the following information:

  • Mined Block shows whether (1) or not (0) a block has been mined.

  • Period Mined shows in which period a block has been mined (-99 for blocks that have not been mined). To learn more about the mining sequence within a period, access here.

  • Period Processed shows in which period a block has been processed (-99 for blocks that have not been processed).

  • Destination informs the destination of each block, according to the order in which the user has added processing stream(s) and waste dump(s).
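A quick way to inspect these outputs outside MiningMath is to load the exported CSV with pandas. The sketch below assumes the column names match the list above and that a tonnage column (here called BlockTonnes) is present; check the header of your own export:

    import pandas as pd

    blocks = pd.read_csv("MinedBlocks.csv")

    # Drop unmined blocks (only relevant when reading AllBlocks.csv, where Period Mined = -99)
    mined = blocks[blocks["Period Mined"] != -99]

    # Total tonnage scheduled per period
    print(mined.groupby("Period Mined")["BlockTonnes"].sum())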

Figure 2 shows where the user can switch between these options.

  1. Click on the highlighted Execution button (A) to open the Run Options (B).

  2. Select All blocks in model or Only mined blocks, as you need.

  3. Hit OK, then Run.

By default, MiningMath exports the MinedBlocks file as the block model output.

MiningMath will generate a report directly in Microsoft Excel, as shown in the following image, and display the optimized pit (blocks and surface) in the viewer in case the user chooses this option (right figure above). The automatic preview shows only the mined blocks, colored according to each mining period defined by the scheduler.

The results presented in the Excel spreadsheet show, in the Charts tab, the graphs relative to the reported results calculated in the Report tab. The processed mass, discarded mass, stock development, Au/Cu grade in the process, Au/Cu grade in the dump, metal contained in the process, net present value, and cumulative net present value are arranged individually in the Production Process 1, Production Dump 1, Stock Process 1, AU/CU – Grade Process 1, AU/CU – Grade Dump 1, AU/CU – Metal Process 1, NPV and Cumulative NPV tabs, respectively.

Figure 2: Results report.

By default, MiningMath exports only the Mined Blocks file, showing them by period in the viewer, as in the following illustration. The user can change any exporting options in the Run Options menu.

Figure 3: Visual results.

If the user chooses to export the model, MiningMath will automatically save the list of the scheduled blocks (MinedBlocks.csv) or all blocks (AllBlocks.csv) in the block model folder, as shown in the figure below, which can be imported into other mining software packages.

The files MinedBlocks.csv and AllBlocks.csv may contain indices and/or block coordinates, and all the imported data/parameters along with the following information:

Figure 4: Mined blocks.
  • Mined Block shows whether (1) or not (0) a block has been mined.

  • Period Mined shows in which period a block has been mined (-99 for blocks that have not been mined).

  • Period Processed shows in which period a block has been processed (-99 for blocks that have not been processed).

  • Destination informs the destination of each block, according to the order in which the user has added processing stream(s) and waste dump(s).

Video 1: Outputs and files’ hierarchy.

Workflow

Super Best Case

In the search for the upside potential for the NPV of a given project, this setup explores the whole solution space without any other constraints but processing capacities, in a global multi-period optimization fully focused on maximizing the project’s discounted cash flow.

As MiningMath optimizes all periods simultaneously, without the need for revenue factors, it has the potential to find higher NPVs than traditional procedures based on LG/Pseudoflow nested pits, which do not account for processing capacities (gap problems), cutoff policy optimization and discount rate. Traditionally, these, and many other, real-life aspects are only accounted for later, through a stepwise process, limiting the potentials of the project.
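For reference, the discounted cash flow that this global optimization maximizes is the usual NPV sum over the scheduled periods, with \(CF_t\) the cash flow of period \(t\), \(d\) the discount rate, and \(T\) the number of periods:

\(NPV = \sum_{t=1}^{T} \frac{CF_t}{(1+d)^{t}}\)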

MiningMath’s Super Best Case serves as a reference to challenge the best case obtained by other means, including more recent academic/commercial DBS technologies available. See a detailed comparison of these two approaches below.

In modern/traditional technology, large size differences between consecutive periods may render them impractical, leading to the “gap” problem. Such a gap is caused by a scaling revenue factor that might prevent a large area from being mined until some threshold value is tested. MiningMath allows you to control the entire production without oscillations thanks to its global optimization.

In the modern/traditional methodology, decisions on block destinations can be taken following techniques such as: fixed predefined values based on grades/lithologies; post-processing cutoff optimization based on economics; post-processing based on math programming; or even multiple rounds combining these techniques. With MiningMath the destination optimization happens within a global optimization in a single step, maximizing NPV and accounting simultaneously for capacities, sinking rates, widths, discounting, blending, and many other required constraints.

Modern technology is restricted to pre-defined, less diverse sequences because it is based on a step-wise process built upon revenue factor variation, nested pits, and pushbacks. These steps limit the solution space for the whole process. MiningMath performs a global optimization, without previous steps limiting the solution space at each change. Hence, a completely different scenario can appear, increasing the variety of solutions.

Due to tonnage restrictions, modern technology might need to mine partial benches in certain periods. With MiningMath’s technology, there isn’t such a division. MiningMath navigates through the solution space by using surfaces that will never result in split benches, leading to a more precise optimization.

Modern approaches present a difference between the optimization input parameters for OSA (Overall Slope Angle) and what is measured from output pit shells, due to the use of the “block precedence” methodology. MiningMath works with “surface-constrained production scheduling” instead. It defines surfaces that describe the group of blocks that should be mined, or not, considering productions required, and points that could be placed anywhere along the Z-axis. This flexibility allows the elevation to be above, below, or matching a block’s centroid, which ensures that MiningMath’s algorithm can control the OSA precisely, with no errors that could have a strong impact on transition zones.

Example

Setting up the Super Best Case is simple. There are only two necessary restrictions:

  1. Processing capacity: 10 Mt per year.

  2. Timeframe: Years (1).

Depending on your block model, additional parameters may need to be specified. For example, if you have multiple destinations these could be added for proper destination optimization. The figure below provides a comprehensive overview, highlighting the essential parameters required for running the Super Best Case scenario using the pre-installed Marvin dataset.

Results

Results can be analysed in the Viewer tab and in the exported report file. For the pre-installed Marvin dataset, note how the sequencing has no gap problems and the production is kept close to the limit without violating any restrictions.

Super Best Case Sequencing
Sequencing Slice
Super Best Case production tonnages

Export files

The block periods and destinations optimized by MiningMath’s Super Best Case (or any other scenario) can be exported in a CSV format. You could use these results to import back into your preferred mining package, for comparison, pushback design or scheduling purposes. Export options are depicted below.

Adding constraints

A refinement of the Super Best Case could be done by adding more constraints, preferably one at a time, to evaluate each one's impact on “reserves”, potential conflicts between them, and so on. You can follow the suggestions below for this improvement:

  • All blending constraints.

  • All restrict mining aspects due to forbidden areas.

  • Extra processing or dump routes for proper cutoff optimization.

  • Sum variables (with caution), just in case some aspect must be controlled for the whole LOM at once.

  • In case more efficiency is needed, the resulting surface obtained in the Constraints Validation step could be used as restrict mining for the runs here.

Optimized Pushbacks

Pushback Optimization is the process of generating and analyzing optimized pushback scenarios by varying the volumes of ore and waste in each case. This procedure is important to ensure the financial and operational viability of the mining project, as excessively large volumes can render the project unfeasible, while excessively small volumes can result in resource wastage or missed opportunities for ore extraction. By testing different volumes, it is possible to find an optimal point that maximizes the net present value (NPV) of the project.

MiningMath utilizes timeframes to generate pushbacks at different levels of detail. Timeframes are time intervals that divide the mine’s lifespan into smaller periods. Different timeframes allow users to perform a fast evaluation of the impact of production volume on the NPV. If necessary, adjustments can be made to optimize production and reduce costs.

In Pushback Optimization, multiple optimized pushback scenarios are created with varying levels of detail, enabling users to have a comprehensive view of the impact of volume variations on the project’s performance.

Create a Pushback

You can identify timeframe intervals in your project so that you can work with grouped periods before getting into a detailed insight. This strategy allows you to run the scenarios faster without losing flexibility or adding dilution to the optimization, which happens when we reblock.

The idea is to make each optimized period represent biennial, triennial, or decennial plans. MiningMath allows you to do this easily by simply adjusting some constraints to fit the selected timeframe. Try to run with and without dump/total productions to check potential bottlenecks and impacts on waste profiles, which could be useful for fleet management exercises. Also, test with wider mining widths than required, as this is a complex non-linear constraint and you might find better shapes without losing value. Notice that in this example the processing capacity was not fully achieved, and this kind of approach helps us understand which constraints are interfering the most in the results.

Example

Dataset: Preinstalled Marvin deposit. It can also be downloaded here.

Property Value
Timeframe custom factor
5
Processing capacity
50Mt in 5 years
Dump capacity
150Mt in 5 years
Vertical rate of advance
750m in 5 years

Minimum Mining Width
100m
Minimum Bottom Width
100m
Restrict Mining Surface
Optional
Grade copper
0.88%
Stockpiling parameters
On
Note: Waste control and vertical rate of advance are not recommended if you are just looking for pushback shapes.

Work Through Different Timeframes

Given the previous initial scenario, you might want to consider different timeframes for your pushback design. In order to perform a Pushback Optimization, the timeframes (in green), process and dump production limits (in green) and the vertical rate (in red) will be adjusted.

By varying the highlighted parameters above, the following decision tree has been constructed for Pushback Optimization.

Three different timeframes are explored: 3 years, 5 years, and 10 years. Each timeframe is associated with specific process and dump production limits. Such limits not only scale with their respective timeframes but also allow for variations that provide flexibility for testing different production scenarios. Finally, the vertical rate is also adjusted to align with the defined timeframe of each scenario. For instance, the vertical rate is set to 450m for the 3-year timeframe, 750m for the 5-year timeframe, and 1500m for the 10-year timeframe.
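Since these limits start from annual targets scaled by the timeframe length (assuming annual values of 10 Mt of processing, 30 Mt of dump and 150 m of vertical advance, as in this example), a short sketch like the one below can derive the baseline values for any grouping:

    # Annual targets assumed for this example
    ANNUAL = {"process_mt": 10, "dump_mt": 30, "vertical_rate_m": 150}

    for years in (3, 5, 10):
        scaled = {name: value * years for name, value in ANNUAL.items()}
        print(f"{years}-year timeframe -> {scaled}")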

Afterward, specific results were carefully selected for comparison, focusing on key parameters such as Net Present Value (NPV), production process, and production dump.

More details

The two constraints entered in the Production tab are related to the maximum material handling allowed; the third one is about the processing equipment capacity, and the vertical rate of advance is related to the depth that can be achieved, adjusted to this interval. The minimum mining width was added because we are already generating designed surfaces that could be used later as guidance for detailed schedules; thus, it should respect this parameter due to the equipment sizing. Parameters such as average grades, minimum bottom width, and restrict mining surface do not change across the timeframes.

It’s important to remember that the packages of time here don’t necessarily have to correspond to identical sets of years. You could propose intervals with different constraints until reaching reasonable/achievable shapes for the design of ramps, for example. If you wish to produce more operational results, easier to design and closer to real-life operations, try to play with wider mining/bottom widths. Those changes will not necessarily reduce the NPV of your project.

Considering this approach, the discount rate serves just as a rough NPV approximation and does not affect the quality of the solution much, given that the best materials following the required constraints will be allocated to the first packages anyway.

Remember all the constraints

NPV Upside Potential

NPV Upside Potential is the process of generating and analyzing scenarios to measure the impact of each constraint on the project’s net present value (NPV), from the Super Best Case to a detailed setup. Measuring the impact of each constraint on the NPV is important to assess the financial impact and ensure the project’s viability under different scenarios and conditions. Each constraint can have a significant impact on the project’s NPV, and it is crucial to understand how they affect the project’s financial performance.

By evaluating the impact of each constraint on the project’s NPV, it is possible to identify financial bottlenecks and opportunities for improvement, as well as prioritize problem resolution. This can result in better resource allocation and cost reduction, enhancing the project’s profitability and viability.

In NPV Upside Potential, scenarios are created that sequentially incorporate each constraint of the project, allowing users to have a comprehensive view of the impact of each constraint on the project’s performance. In case more efficiency is needed, the resulting surface obtained on the Constraints Validation or in Best Case refinements could be used as Restrict Mining in the last interval, which might reduce the complexity and the runtime. 

Example

Dataset: Preinstalled Marvin deposit. It can also be downloaded here.

To illustrate this process, let’s consider the base scenario of the Marvin dataset (shown in the figure below). The highlighted green fields represent all the targeted constraints that need to be controlled in this project: process capacity, minimum average grade of CU in process, dump capacity, bottom minimum width, mining minimum width and maximum vertical rate.

The decision tree depicted below has been constructed for a NPV Upside Potential process, based on the above scenario. In this decision tree, the scenarios progressively introduce each constraint into the project.

The target scenario is the last one, with the following restrictions: Process Production=10Mt, Dump Production=30Mt, Bottom Width=100m, Mining Width=100m, Vertical Rate=150m, and average CU=0.5. However, the constraints are added iteratively, starting with the process production, followed by the dump production, widths, and so on.

Note how the cumulative NPV usually decreases (as expected) when more restrictions are added (see note at the end for exceptions). Without this iterative process, there might be a lack of information to understand the NPV of the final, desired scenario.

Best-Worst Range Analysis

Best-Worst Range Analysis is the process of generating and analyzing scenarios to measure the impact of mine width constraints on the project’s net present value (NPV), from no restriction to wide widths. Measuring the impact of mine width constraints is crucial to determine the optimal fleet equipment configuration in mining operations, with the aim of optimizing productivity and maximizing the net present value (NPV) of the project.

By analyzing variations in width constraints, it is possible to identify the effect of space limitations on mining operations and evaluate the influence of different bench widths on fleet performance. Appropriate mining widths can bring a series of benefits: higher amount of material to be simultaneously extracted; higher fleet productivity; more efficient transportation; easier road maintenance and so on. Hence, the search for different widths allows finding the best combination of equipment and mining techniques aimed at maximizing production and profit simultaneously in each scenario.

In a Best-Worst Range Analysis, scenarios are created gradually increasing the mine width up to a feasible maximum, allowing users to have a comprehensive view of the impact of space limitations on the project’s performance.

Considering the nature of global optimization and the non-linearity of the problem, it is expected that there will be variations in performance (NPV, production, amount of mine fronts, etc.) as parameter values are modified. Therefore, it is crucial to generate a large number of scenarios to obtain a comprehensive analysis of the impact of these variations on the project. This way, a more precise understanding of how different parameter values affect the overall performance can be achieved.

Example

Dataset: Preinstalled Marvin deposit. It can also be downloaded here.

Consider the following base scenario and decision tree built for a Best-Worst Range Analysis using the Marvin dataset.

The goal in this case is to understand the impact of different values of mining width (in green), which will be tested with a range of different values, from 0m up to 200m. 

Note that there is no linear relationship between mining width and NPV. In other words, a higher mining width does not necessarily imply a lower NPV. As previously mentioned, that is due to the non-linearity of the problem.

Considering the nature of global optimization employed in MiningMath, other variables might also be affected by different mining widths. For example, the production could be analyzed for identification of possible issues when employing different mining widths.

Selectivity Analysis

Selectivity Analysis is the process of generating and analyzing scenarios to measure the impact of all geometric constraints on the project’s net present value (NPV), from the most selective to the least selective setup. Analyzing the impact of variations in geometric constraints is important to determine the optimal mine configuration and optimize productivity and profit.

By performing such an analysis, it is possible to identify the effect of geometric limitations on mining operations. Moreover, it is possible to evaluate the influence of each parameter and its variation on mine performance. This allows finding the best combination of parameters and mining techniques aimed at maximizing production and profit for each scenario.

In a Selectivity Analysis, scenarios are created including each geometric constraint sequentially and gradually increasing or decreasing their values from the least selective until the desirable requirement. This allows users to have a comprehensive view of the impact of geometric limitations on the project’s performance.

Considering the nature of global optimization and the non-linearity of the problem, it is expected that there will be variations in performance  (NPV, production, amount of mine fronts, etc.) as parameter values are modified. Therefore, it is crucial to generate a large number of scenarios to perform a comprehensive analysis of the impact of these variations on the project.

Example

Dataset: Preinstalled Marvin deposit. It can also be downloaded here.

Consider the following base scenario and decision tree built for a Selectivity Analysis using the Marvin dataset.

The goal is to understand the impact of different values of geometric constraints (mining width, bottom width, and vertical rate of advance). The geometric parameters (in green) will be tested with a range of different values: bottom width with values from 0m up to 200m; mining width with values from 0m up to 200m; and vertical rate of advance with values from 50m up to 300m. In this example, 26 different scenarios were evaluated. 

Note that there is no linear relationship between geometric constraints and NPV. In other words, a higher width or lower vertical rate of advance do not necessarily imply a lower NPV. As previously mentioned, that is due to the non-linearity of the problem. The cumulative NPV of the scenarios is compared in the graph below.

A diverse range of results can be achieved with a Selectivity Analysis. However, there are usually two possibilities when they are compared:

  • Contrasting geometric parameters with small NPV variations: note that when the bottom width changes from 0m to 80m, and the remaining parameters are fixed, the NPV drops from 454 M$ to 444 M$. This indicates that large changes in geometric constraints do not necessarily lead to large changes in the NPV. The same holds for the scenarios SA_BW000_MW100_VR150 and SA_BW100_MW100_VR250.

  • Similar geometric parameters with larger NPV variations: when comparing scenarios SA_BW080_MW100_VR150 and SA_BW100_MW160_VR150 there is a drop in NPV from 444M$ to 370M$, highlighting that the 20m and 60m change in bottom width and mining width respectively can lead to a larger NPV difference in the project.

In conclusion, it is important to create several scenarios in a Selectivity Analysis. As exemplified above, results can be quite similar or quite different due to the non-linearity of the problem. Considering the nature of global optimization employed in MiningMath, it is also important to evaluate other indicators. The figures below depict the tonnage achieved for the production, demonstrating the possible impacts for different geometric constraints.

Design Enhancement

Design Enhancement is the process of creating scenarios to conduct extensive searches for solutions with similar NPV values but with fewer violations and improved shapes. This process allows finding more efficient and sustainable solutions that meet specific mine constraints and needs. Hence, seeking these kinds of scenarios is important for optimizing mining operations and for reducing risks and costs. 

In Design Enhancement, scenarios are created with more rigorous geometric constraints without compromising the desirable requirements. The goal is  to reduce violations and find better forms for the project. This is possible due to the global nature of optimization and the non-linearity of the problem, enabling the use of stricter requirements for the geometric constraints that could lead to a better  performance of the project.

Example

Dataset: Preinstalled Marvin deposit. It can also be downloaded here.

Consider the initial scenario and respective decision tree built for a Design Enhancement process. The goal is to evaluate stricter variations in the geometric constraints (in green).

Note the variation of results for Cumulative NPV and production. The base scenario has a small violation on dump production for period 1. However, when modifying the minimum width to 120m (scenario DE_BW100_MW120_VR150) this violation is not present anymore. Hence, this is an example of how small variations in geometric constraints could lead to less violations.

NPV Enhancement

NPV Enhancement is the process of creating scenarios to conduct extensive searches for solutions with higher net present value (NPV) values and similar violations, while considering minimum requirements for project constraints. Scenarios are created that gradually modify constraints from desirable requirements to minimum requirements, with the goal of increasing project profitability.

Example

Dataset: Preinstalled Marvin deposit. It can also be downloaded here.

Consider the initial scenario and respective decision tree built for a NPV Enhancement process. The constraints in green (production capacities, geometric constraints, and average CU) are considered for modifications, from desirable requirements to minimum requirements, in order to identify solutions with higher net present value (NPV).

Results show a high variation in NPV while the production remains in its limits. Hence, it shows that it is possible to achieve higher NPVs when employing minimum requirements defined by the user.

Bottleneck Analysis

Bottleneck Analysis involves generating scenarios to conduct extensive searches for solutions with fewer violations while preserving NPV, keeping geometries, optimizing mining operations, and reducing risks. This allows for the discovery of more efficient and sustainable solutions that meet specific constraints and needs of the mine.

In Bottleneck Analysis, after analyzing a desirable scenario it is possible to identify the constraint/s with demanding requirements that directly impact the optimization results and cause significant violation issues. Then, scenarios should be created by relaxing these demanding parameters, enabling users to make decisions to mitigate risks and ensure project viability.

Example

Dataset: Preinstalled Marvin deposit. It can also be downloaded here.

Consider the base scenario overview and the respective report on the dump production. Note how the first period has violated the 30Mt constraint.

A Bottleneck Analysis can help us identify the constraint(s) with demanding requirements that directly impact the optimization results and cause the violation in the dump production. Four different scenarios are built using a decision tree to analyze different values for dump production limits, minimum average of CU, and vertical rate of advance.

Note: To decide which parameters need to be changed, you can consider the constraint priority order that MiningMath employs in order to always deliver a solution. However, adjustments usually depend on the unique characteristics of each project and the flexibility available to modify its requirements.

The graphs below depict a comparative analysis of the results for the scenarios in the decision tree.

This analysis shows that the minimum average constraint of CU, production dump and vertical rate of advance are all restricting the base scenario. When relaxing these parameters, there is an increase of approximately 5% in the cumulative NPV, while the dump productions are kept within their limits and the process productions are closer to the target for some scenarios.

Multivariate Sensitivity Analysis

Multivariate Sensitivity Analysis is the process of creating and analyzing scenarios based on a range of possible values for selected constraints. Analyzing the impact of constraints variations is important for determining the optimal mine configuration and for optimizing productivity and profitability.

In Multivariate Sensitivity Analysis, scenarios are created gradually increasing or gradually decreasing the values of the constraints within a desired range, covering all combinations of values. This allows users to have a comprehensive view of the impact of combinations of constraint values on the project’s performance.

Considering the nature of global optimization and the nonlinearity of the problem, it is expected that there will be variations in performance as certain parameter values are modified. Therefore, it is crucial to generate a large number of scenarios to obtain a comprehensive analysis of the impact of these variations on the project.

Example

Dataset: Preinstalled Marvin deposit. It can also be downloaded here.

Consider the base scenario overview and the respective decision tree built for a Multivariate Sensitivity Analysis depicted below.

Results are depicted below. Note how a diverse range of cumulative NPVs is reported when compared to the Base scenario. Also note how certain productions are more stable than others, demonstrating the importance of performing a Multivariate Sensitivity Analysis.

Optimized Schedules

In the early years of the project you will find many concerns and most of the value in terms of NPV. Knowing that, we decided to use a 10-year surface to optimize the first 5 years. As the surface used corresponds to a decade, it contains the interval from the 1st to the 5th period and represents a lot more mass. Even so, this simple input restricts the space where the algorithm has to find a solution, which could reduce the run-time and help it deliver better results respecting the set of constraints given.

Example:

    Parameters Value
    Timeframe
    Years (1)
    Processing capacity
    10 Mt per year
    Total movement
    40 Mt per year
    Vertical rate of advance
    150m per year
    Maximum of processing hours
    4,500 processing hours per year
    Minimum Mining
    50m
    Bottom width
    100m
    Restrict Mining Surface
    Stockpiling parameters
    On
    Grade copper
    Until 0.7%

    Short-term Planning

    MiningMath allows the integration between long- and short-term planning. By running the Best Case, surfaces to guide the optimization were generated, and they can be used as a guide based on the NPV upper bound. The Exploratory Analysis provides insights on what the challenges of our project could be, as well as operational designs that could be used in further steps. At last, we obtained a detailed schedule by using, or not, a surface, which could be the final pit or any intermediary one, as a guide.

    Considering this workflow, now you may have enough information on a reasonable long-term view to enhance the adherence/reconciliation of your plans. You could choose a surface and use it as force and restrict mining to refine everything inside it. Remember that Force Mining is responsible for making the mining achieve at least the surface inserted, which means that all the material inside its limits should be extracted, respecting the slope angles, while Restrict Mining aims to prohibit the area below the surface inserted to be mined until the period in which it has been applied.

    Thus, MiningMath will reach this exact surface in the time-frame required and enable you to test different geometries, blending constraints, and any other variable that could be required in the short-term planning without interfering in the long-term overview. Additional helpful features in these refinements are the concepts of mining fronts and the design optimization, based on surfaces modification, that could be done respecting all the parameters and generating results accordingly with your needs.

    Figure 1: Results generated using different helpful features.

    Example

    Parameters Value
    Timeframe
    Custom factor (0.5)
    Processing capacity
    5 Mt per semester
    Total movement
    20 Mt per semester
    Vertical rate of advance
    60m per semester
    Minimum Mining
    120m
    Bottom width
    100m
    Force and Restrict Mining Surface
    Surface005 from Schedule Optimization
    Stockpiling parameters
    On
    Play with steeper slope angles in the short term?
    Yes

    Table 1: Set of constraints example (1).

    Results examples

    Further details

    The example above used fewer constraints, geometries were changed, and the average grade was left free. It is very helpful to define the early years based on a semester timeframe, which can assist you in managing stocks and any other variables in the first 3 years, for instance. Note that the period ranges in MiningMath are based on the timeframe selected; therefore, you should adjust your variables according to this value.

    When we use Force+Restrict, we are telling the optimizer to break this volume into pieces and that it must mine this volume entirely, even if it is waste, so that the long-term view is respected. This way, you keep considering the whole deposit while deciding what to do in the first periods. The approach here is quite different from a set of Revenue Factors for a series of LG/Pseudoflow runs, followed by adjustments to find pushbacks without mathematical optimization criteria. It is worth mentioning that this kind of suggestion should only be applied at the beginning or at the end of the life of mine, since Force+Restrict Mining surfaces used in intermediate periods could interfere directly with the results.

    Using timeframes

    Another strategy is to optimize the short-term along with the long-term using different timeframes. In this approach, the integration between the short and long term visions is made in the same optimization process, facilitating the analysis and strategic definitions.

    It is possible to consider:

    • shorter time horizons (weeks, months, quarters...) for the initial periods of the operation;

    • annual plans as far as needed, for a precise definition of discounted cash flow;

    • less detail for longer time horizons. They still need to be considered in the overall view of the mine, up to exhaustion, but they consume optimization processing time that is better focused on the early years of operation.

    Thus, value is maximized at the strategic level and feasibility is achieved at the tactical level simultaneously. In addition, compliance and reconciliation problems are minimized and communication between teams is improved by working in an integrated system.

    In this strategy, each period range represents the time interval chosen in the timeframe, and the discount rate must be adjusted in alignment with that interval. Other constraints, such as production and vertical rate of advance (VRA), must also be adjusted to match each interval in the period ranges.
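    To illustrate how this adjustment works, the sketch below converts an annual discount rate into an equivalent rate for shorter timeframes. This is generic compound-interest arithmetic, not a MiningMath function, and the 12% figure is purely illustrative.

    # Illustrative sketch: equivalent per-period discount rate for a chosen timeframe.
    # Generic finance arithmetic; names and the 12% figure are not MiningMath defaults.
    def per_period_rate(annual_rate: float, periods_per_year: int) -> float:
        """Compound-equivalent rate for one period of the chosen timeframe."""
        return (1.0 + annual_rate) ** (1.0 / periods_per_year) - 1.0

    annual = 0.12  # 12% per year, illustrative only
    print(f"Monthly:   {per_period_rate(annual, 12):.4%}")  # ~0.9489% per month
    print(f"Quarterly: {per_period_rate(annual, 4):.4%}")   # ~2.8737% per quarter
    print(f"Semester:  {per_period_rate(annual, 2):.4%}")   # ~5.8301% per semester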

    In order to clarify this strategy, Table 2 and Figure 9 present a possible list of constraints for an example using timeframes:

    Table 2: Set of constraints for a timeframe example.

    Figure 9: Constraints chosen in the interface for a timeframe example.

    Multi-mine

    Multi-mine projects often have hidden opportunities to maximize value. Given their natural level of complexity, it is quite common to make a few simplifications when using the traditional LG/Pseudoflow methodology, which might reduce value and produce sub-optimal solutions from the mathematical perspective.

    MiningMath's global optimization algorithm can overcome such challenges when working with integrated multi-mine projects, finding solutions that regard all the pits being mined, or blended, at the same time, instead of optimizing each pit individually within a multi-mine project, which would provide a totally different overview. To handle such projects, the block model must contain all the mining regions to be considered in the simultaneous optimization. Thus, if you have the pits mapped in different datasets, it is important to follow the steps suggested below:

    1. Work with a single block model or single pit first, run the initial tests and understand this region before handling the block model modification.

    2. Try to eliminate meaningless blocks, which would not affect the solution and could increase complexity.

    3. Join a second model/pit* and understand the manipulation process to work with multi-mine projects.

      - Play with surfaces if you wish to refine the results, filter regions not to be mined, or add any other guide. Since the surface files in MiningMath always keep the same cell order, a good way to work with them is the Excel file available here, which is quite useful for such modifications.

      - Play with Mining Fronts, if you want to control the material which is extracted from each region.

    4. Add the other regions and start using everything that you wish.

    *This joint block model file should fulfill the same requirements as a single one, as mentioned on the formatting data page.

    Figure 1: Multiple pits project

    The current version of MiningMath considers the same Vertical Rate of Advance, Bottom Width, and Mining Width for the entire block model. However, in a multiple-pit overview, each pit could have different geometric parameters, which influence the levels of selectivity. For these scenarios, the main recommendation is to set the parameters for one pit, fix its solutions (since Force+Restrict Mining has the highest level of priority), and then start the optimization of the others, which will already consider the mass extracted from the first, fixed pit.

     
    Figure 2: Surface obtained in the first optimization

    An efficient workflow is to run the first scenario, which could be the validation run or the Best Case, without geometric parameters, for scheduling optimization. Then set up a scenario with the geometric parameters of the most selective mine, meaning the smallest widths and the biggest vertical rate (VR), which is the least constrained scenario in geometric terms. Each surface generated by this approach will then be used to fix solutions for Mine 1.

    For instance, you could take the surface generated for period 1 and decrease the elevation in the other areas so that it regards the mass from period one of Mine 1 together with what could be mined in the second pit. With these results, you can refine either surfaces or mining fronts to get the best results and perform a sensitivity analysis of geometric parameters for multi-pit projects while still preserving the global optimization.

    Figure 3: Surfaces set-up
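    Since the surface files keep the same cell order across exports, this kind of manipulation can also be scripted. The sketch below is a hypothetical example with pandas: it keeps the optimized elevations inside an assumed rectangular footprint for Mine 1 and resets every other cell to the original topography, producing a surface that fixes only Mine 1's solution. File names, column names (X, Y, Z), and coordinates are placeholders to be replaced by your own data.

    # Hypothetical sketch: edit a surface file so it only "fixes" Mine 1's area.
    # File names, column names and coordinates are placeholders.
    import pandas as pd

    surface = pd.read_csv("Surface001.csv")    # surface exported by a previous run
    topo = pd.read_csv("Topography.csv")       # original topography, same cell order

    inside_mine1 = (surface["X"].between(2000, 3500) &
                    surface["Y"].between(7000, 8200))

    # Outside Mine 1, restore the topography so nothing is forced/restricted there.
    surface.loc[~inside_mine1, "Z"] = topo.loc[~inside_mine1, "Z"]
    surface.to_csv("Surface001_mine1_only.csv", index=False)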

    Sustainable analysis

    Technology has been developed to incorporate social and environmental factors into mining project optimization, assessing these impacts whilst maximizing the project's net present value (NPV). The method can quantify socio-environmental aspects, such as dust, noise, avoidance of springs/caves/tribes, carbon emissions, water consumption, and any parameter that can be controlled by its average or sum. These environmental and social aspects can be assessed following internationally recognized standards (ISO 14044).

    Figure 1: social and environmental factors.

    Minviro, in partnership with MiningMath, has developed an approach to integrate such quantitative assessments into strategic mining optimization. This enables socio-environmental impacts to be constrained in the mining optimization and the economic cost of reducing them to be calculated as a consequence. This is done by inserting these variables linked to each block of your model, following these instructions. Considering this methodology, published here, significant reductions in the global warming impact could be achieved at a small economic cost. For example, using an environmental constraint it was possible to reduce CO2 emissions by 8.1% whilst achieving 95.9% of the baseline net present value, as you can see in the image below.

    Figure 2: Reduction in environmental impacts.

    Several scenarios for mine development, processing setup, energy/water consumption, CAPEX (content in Spanish), OPEX, etc. can be evaluated. It is also possible to include geometric constraints to restrict a mining area due to legal and site-specific issues that affect the local population, using this feature. Spatially and temporally explicit socio-environmental risks can be included in the mining optimization, providing an opportunity to assess alternative project options or explore a socio-environmental cost-benefit analysis. For each aspect considered, decision makers are able to propose a range of possible scenarios and assess the economic cost of constraining these to different levels.

    Figure 3: Possible scenarios to assess the economic cost of constraints.

    The decision-making board, which previously had access to one or a few scenarios, now has a cloud of possibilities optimized and integrated with the technical and economic aspects of the project, reducing risks and adding sustainable value. The mathematical intelligence behind it is based on modern, well-accepted, and academically proven Data Science and Optimization concepts. The methodology has been tested in real mining projects with gains in NPV ranging between 15% and 20% on average, in projects where socio-environmental aspects have not yet been added.

    Figure 4: Performance over time.

    Uncertainties at the Beginning

    One of the many possibilities offered by MiningMath's approach is to have multiple overview scenarios to evaluate different project assumptions before doing more detailed work. It does not demand an arbitrary or automated trial-and-error cut-off definition, nor a fixed input in the form of pushbacks that would guide further optimization steps within the boundaries of a simplified problem. A subtle but substantial implication is the possibility of seeing a totally different mine development throughout the mine life cycle for each change in project assumptions. This allows mine managers to have a clearer view of the decision tree and the possibilities in their hands, in order to improve economic, technical, and socio-environmental performance.

    Considering this context, mine managers can judge greenfield projects to know whether or not they should prioritize a geotechnical study. This can be done by running multiple scenarios considering the expected variability of slope angles for a given deposit. For example, suppose benchmarks from similar deposits indicate that the overall slope angle of a given deposit might vary between 35 and 45 degrees. Before reaching a conclusion through an in-depth geotechnical study, multiple scenarios can be used to estimate the economic impact of each possible assumption for the overall slope angle. The conclusion might then indicate a low economic impact, which could postpone the need for a detailed study.

    The same idea applies to any parameter, which ultimately represents a project assumption.

    MiningMath conducted an illustrative example with 2,000 simulations varying multiple parameters independently. The results produced the chart in Figure 1, showing the probability (Y-axis) and the project's value (X-axis). In this case, a detailed geotechnical study might be postponed, as the project's value varies between 700 and 1,100 MU$ as a function of the overall slope angle (OSA).

    Figure 1: What would 2000 simulations say about NPV distributions?
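    As a toy illustration of this kind of study, the sketch below samples an overall slope angle and a second independent parameter for many scenarios and evaluates a placeholder value function. In a real study each sample would correspond to a MiningMath run; the value function and all figures here are made up purely to show the mechanics.

    # Toy sketch of a multi-parameter sensitivity study (all figures are made up).
    import random

    def project_value(osa_deg: float, price_factor: float) -> float:
        """Placeholder response in MU$; a real study would run the optimizer."""
        return 700.0 + 30.0 * (osa_deg - 35.0) + 300.0 * (price_factor - 1.0)

    random.seed(42)
    values = [project_value(random.uniform(35, 45), random.gauss(1.0, 0.05))
              for _ in range(2000)]
    print(f"range: {min(values):.0f} to {max(values):.0f} MU$")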

    Theory

    Current Best Practices

    MiningMath software allows mining engineers to improve their strategic analysis through risk assessments performed in a single-step approach to optimization. In other words, MiningMath’s global mining optimization methodology helps to integrate multiple areas of the business. It handles all parameters simultaneously, delivering multiple scenarios and accounting for both strategic and tactical aspects.

    Hence, it is important to understand other current best practices employing a stepwise rationale and their disadvantages compared to MiningMath’s single-step approach.

    Stepwise technologies

    Simultaneously determining the optimum pit limit and the mining sequence that deliver the maximum project value, while considering all of a project's complexities and constraints, would demand very powerful machines. To handle this, the mine planning models built with current best practices have developed shortcuts and approximations that try to deliver acceptable results.

    Figure 1 depicts a stepwise approach used by current best practices.

    Figure 1: Current best practices: stepwise approach

    Stages of stepwise approaches

    These steps may include different strategies, technologies or algorithms. However, they are all usually solved individually in three larger stages:

    1. Nested pits: when finding nested pits it is possible to employ the Lerchs-Grossmann (LG) algorithm, the Pseudoflow algorithm, destination optimization, direct block scheduling, or even more recent heuristic mechanisms.
    2. Pushback definition: with the nested pits defined, the next step is usually to define pushbacks manually, with expert mine planning engineers applying a number of empirical rules. Automatic methods focused on NPV optimization can also be employed for pushback design, but these usually operate under resource constraints only and do not consider enough geometric requirements.
    3. Schedules: finally, starting from a chosen pushback, the scheduling is performed. A myriad of techniques can be employed for that, such as direct block scheduling, genetic algorithms, (fuzzy) clustering algorithms, dynamic programming, and heuristic methods in general, all with different rates of success, but with a limited variety of solutions due to the single pushback input.

    Aim of stepwise approaches

    Regardless of the technologies or algorithms, in a stepwise approach the aim is to initially find the final pit limit that maximizes the undiscounted cash flow to then focus on block sequence within this final pit envelope. By constraining the problem and predefining inputs, these shortcuts (approximations) help to save time and computer resources, enabling such software to consider complexities such as ore blending requirements, different processing routes, stockpiling policy, truck fleet considerations, and so on.

    Disadvantages of stepwise approaches

    With current best practices employing some stepwise approach, thousands of potential schedules can be generated with a multitude of different methods, but they are all based on the same stepwise rationale, with one step guiding the other. Commonly, schedules follow from a set of nested pits and other fixed input parameters such as geotechnics, metallurgical performance, blending constraints, etc. Therefore, the results frequently present similar behaviours and restrict the full exploration of the solution space.

    MiningMath Uniqueness

    MiningMath allows mining managers to improve their strategic analysis through risk assessments that are unconstrained by stepwise processes. Through math optimization models that integrate multiple areas of the business, MiningMath handles all parameters simultaneously and delivers multiple scenarios, accounting for both strategic and tactical aspects.

    MiningMath optimization is not constrained by arbitrary decisions for cut-off grades or pushbacks, since these decisions are usually guided by prior knowledge or automated trial-and-error. Thus, each set of constraints in our technology has the potential to deliver an entirely new project development, including economic, technical, and socio-environmental indicators, along with a mine schedule, while aiming to maximize the project’s NPV.

    How can it be used?

    MiningMath acknowledges that each project has its own characteristics. Thus, it allows you to choose the workflow that best fits your demands and decide which one should be used. Straight from the block model, you can find solutions for your short-term plans, schedules, optimized pushbacks, or Super Best Case, as depicted in Figure 1.

    Figure 1: Single-step approach employed in MiningMath. Straight from block model to short-term, schedules, optimized pushbacks or super best case.

    Super best case

    As MiningMath optimizes all periods simultaneously, without the need for revenue factors, it has the potential to find higher best-case NPVs than traditional best-case procedures based on LG/Pseudoflow nested pits, which do not account for processing capacities (gap problems), cut-off policy optimization, or the discount rate. Usually, these and many other real-life aspects are only accounted for later, through a stepwise process, limiting the potential of the project.

    Discounted Cash flow x Undiscounted Cash flow

    The use of LG/Pseudoflow methods to perform pit optimization aims to maximize the undiscounted cash flow of the project. MiningMath, on the other hand, maximizes the discounted cash flow. Therefore, regions in which MiningMath has decided not to mine are probably regions where you would pay to remove waste in the earlier periods, but the discounted revenue from the ore underneath would not pay for its extraction.
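    The toy sketch below illustrates this difference for a deeper pushback whose waste stripping is paid early and whose ore revenue arrives late: the undiscounted total is positive, so an undiscounted final pit would include it, while the discounted total is negative, so MiningMath may leave it unmined. All cash flows are made up for illustration.

    # Toy sketch: undiscounted vs. discounted value of an extra pushback (made-up figures).
    cashflows = [-60.0, -50.0, -40.0, 70.0, 90.0]   # M$ per year: stripping first, ore later
    rate = 0.10                                      # 10% annual discount rate

    undiscounted = sum(cashflows)
    discounted = sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

    print(f"Undiscounted: {undiscounted:+.1f} M$")   # +10.0 M$   -> attractive without discounting
    print(f"Discounted:   {discounted:+.1f} M$")     # about -22.2 M$ -> destroys NPV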

    A proper comparison between these methodologies can be done by importing the final pit surface obtained from the other mining package into MiningMath and using it as Force/Restrict Mining. This way, MiningMath will perform the schedule optimization using the exact same surface, which allows you to compare the NPV for each case. Figures 2 and 3 depict two comparisons between undiscounted and discounted cash flows.

    Figure 2: Undiscounted versus discounted cash flow optimization.
    Figure 3: Undiscounted versus discounted cash flow optimization regarding a minimum mining width.

    Pushbacks

    MiningMath offers the option of producing Optimized Pushbacks with controlled ore production and operational designs to guide your mine sequencing. Having this broader view in mind, you are already able to begin the scheduling stage. The block periods and destinations optimized by MiningMath could be imported back into your preferred mining package, for comparison, pushback design or scheduling purposes.

    Schedules

    When using MiningMath, it is possible to define the pit limit and mine schedule simultaneously. That is, to determine which blocks should be mined, when this should happen and to where they should be sent to maximize the NPV, while respecting production and operational constraints, slope angles, discount rate, stockpiles, among others, all performed straight from the block model. This means that the steps of pit optimization, pushback and scheduling are not obtained separately, but in a single and optimized process.

    Decision Trees

    To help with all that, our software allows you to build Decision Trees which enable a broader view of your project and a deeper understanding of the impacts of each variable. This is all possible because MiningMath works with a global optimization which simultaneously regards all variables, instead of using a step-wise approach. The software provides different views and solutions for the same mine for each parameter changed and each possible objective. 

    Guaranteed Solutions

    Multiple, complex constraints increase the likelihood of not finding or not existing feasible solutions. Nonetheless, MiningMath always delivers a solution, even if it could not honor the entire set of constraints imposed or had to reduce the NPV to find a feasible solution.

    When dealing with highly constrained problems, other technologies might take hours or days to realize that there is no feasible solution. The reason is that they usually employ generic optimization algorithms, which are not suited to making decisions in a mining problem. In that case, the only option is to prepare a second run with more flexible constraints, still with no guarantee of feasibility.

    In MiningMath, once an infeasibility is detected, the algorithm decides which (less relevant) constraints should be relaxed, returning warnings to the user in the report. This is performed along the optimization process, without compromising runtimes.

    The constraints priority order, from the highest to the lowest, is depicted in Figure 1.

    1. Force+Restrict Mining together using the same surface.

    2. Slope Angles.

    3. Force Mining or Restrict Mining used separately: the same concept as above, but here the surfaces are corrected according to the slopes, so there might be some differences.

    4. Minimum Bottom and Mining width, mining length.

    5. Total Production Capacity (or the sum of the capacities across all destinations).

    6. Vertical rate of advance.

    7. Average and Sum constraints, modeled as strong penalties in the objective function.

    8. Time limit

    9. Improve the NPV

    Figure 1: Constraints hierarchy order.

    Theory Validation

    MiningMath’s results are only possible due to its proprietary Math Programming Solver ©. It consists of a Mixed Integer Linear Programming (MILP) formulation and linearization methods that tackle the challenging non-linear aspects of the mining optimization. In addition, it has its own Branch & Cut algorithm, which provides more efficiency than standard MILP optimizers since it’s fine tuned to this specific optimization problem. 

    Another major advantage of MiningMath comes from the mathematical formulations based on surfaces (Goodwin et al., 2006; Marinho, 2013), instead of usual block precedences. Block precedence methods might lead to higher errors (Beretta and Marinho, 2014), providing slopes steeper (i.e. riskier, more optimistic) than requested. The use of surfaces eliminates these geotechnical errors and  allows for block-by-block geotechnical zones, if needed.

    These surface-based formulations allow MiningMath to include geometric constraints, and, consequently, find solutions that are closer to real mining operations. The user can guide geometries by including mining and bottom widths, mining lengths, maximum vertical advance rates, and forcing/restricting mining areas. You can better understand how each constraint interacts with all others here. Such constraints give freedom to the user to work, or not, with predefined cut-offs and pushbacks which might limit the space of potential solutions. An in-depth view of MiningMath’s formulations and algorithm can also be seen here.

    This approach (Figure 1) has been applied for years by clients such as Vale, Rio Tinto, Codelco, Kinross, AMSA, and MMG, with a growing number of licenses sold, press releases, and academic research also proving the consistency of the implementation. With constant development since 2013, MiningMath has reached a mature and robust state. It is the first and only single-step mining optimization engine available in the market.

    Figure 1: MiningMath’s approach. From block model to schedule in a single step solved by its proprietary Math Programming Solver ©.

    Mining Optimization Algorithm

    MiningMath has a flexible mining optimization algorithm that consists of a Mixed Integer Linear Programming (MILP) formulation and linearization methods that tackle the challenging non-linear aspects of the problem. In addition, MiningMath has its own Branch & Cut algorithm, which provides more efficiency than standard MILP optimizers since it’s fine tuned to this specific optimization problem.

    One of the major advantages of MiningMath comes from the mathematical formulations based on surfaces [1] [2] , instead of usual block precedences. Block precedence methods might lead to higher errors [3] , providing slopes steeper (i.e. riskier, more optimistic) than requested. The use of surfaces eliminates these geotechnical errors and  allows for block-by-block geotechnical zones, if needed.

    Another crucial advantage is that MiningMath’s formulation includes geometric constraints, allowing its algorithm to find solutions that are closer to real mining operations. The user can guide geometries by including mining and bottom widths, maximum vertical advance rates, and forcing/restricting mining areas. Such constraints give freedom to the user to work, or not, with predefined cut-offs and pushbacks which might limit the space of potential solutions. Hence, the software provides different views and solutions for the same mine for each parameter changed.

    Eventually, linear solutions need to be mapped onto an approximate integer (block-by-block) solution that represents the scheduling of the mining problem in the real world. The conversion of continuous solutions into integer, non-linear ones is made by MiningMath's Branch & Cut algorithm.

    Summary of the steps of the MiningMath algorithm.

    Algorithm’s flowchart and mathematical formulation

    MiningMath employs an innovative mathematical formulation and powerful proprietary Branch & Cut algorithm for mining optimization problems. A description of this mathematical formulation and the three main steps of the algorithm employed are given below.

    Step 1: Initial assessment

    Figure 1: Initial assessment of entire block model and inclusion of likely profitable blocks within an initial surface.

    The first step of the mining optimization algorithm is to remove regions that do not add any value to the project. This is an initial assessment that considers slope constraints, reducing the size of the problem and providing a region of interest for the optimization process. Since MiningMath always employs surfaces in its mathematical formulations, this first set of likely profitable blocks are contained within an initial surface as depicted in Figure 1.

    Step 2: Problem linearization and mining optimization

    Figure 2: Example solution with geometric constraints.

    In the second step of the mining optimization algorithm, the non-linear, integer problem is approximated by an integer, linear one based on surfaces. For that, it is first necessary to define the common notation across the problem and its variables.

    • [latex]S[/latex]: number of simulated orebody models considered
    • [latex]s[/latex]: simulation index, [latex]s = 1,...,S [/latex]
    • [latex]D[/latex]: number of destinations
    • [latex]d[/latex]: destination index, [latex]d = 1,...,D [/latex]
    • [latex]Z[/latex]: number of levels in the orebody model
    • [latex]z[/latex]: level index, [latex] z = 1,...,Z [/latex]
    • [latex]T[/latex]: number of periods over which the orebody is being scheduled and also defines the number of surfaces considered
    • [latex]t[/latex]: period index, [latex]t = 1,...,T. [/latex]
    • [latex]M[/latex]: number of cells in each surface; where [latex]M = x \times y[/latex] represents the number of mining blocks in x and y dimensions.
    • [latex]c[/latex]: cell index, [latex]c = 1, \ldots ,M[/latex].
    • [latex]G[/latex]: number of unique destination groups defined. Each group might contain 1, all, or any combination of destinations.
    • [latex]g[/latex]: group index, [latex]g = 1, \ldots ,G[/latex].
    • [latex]x_{c,t,d}^{z}[/latex]: simulation-independent binary variable that assumes 1 if block [latex](c, z)[/latex] is being mined in period [latex]t[/latex] and sent to destination [latex]d[/latex], and 0 otherwise.
    • [latex]e_{c,t}[/latex]: simulation-independent continuous variables associated with each cell [latex]c[/latex] for each period [latex]t[/latex], representing cell elevations.
    • [latex]\overline{f_{t,g,s}},\underline{f_{t,g,s}}[/latex]: continuous variables to penalize sum constraints violated for each period, group of destinations, and simulation. One pair of variables is necessary for each quantifiable parameter modeled block by block whose sum is being constrained. An example would be variables used to control fleet hours spent in different periods, groups of destinations, and simulations. More information about possible parameters here. Note also that the software allows the control of the average of simulations, instead of dealing with each simulation individually, and the control by the sum of destinations, instead of each destination individually.
    • [latex]\overline{\alpha_{t,g}},\underline{\alpha_{t,g}}[/latex]: user defined weights for variables [latex]\overline{f_{t,g,s}},\underline{f_{t,g,s}}[/latex] with the same destination group [latex]g[/latex] and period [latex]t[/latex]. These can only be defined in the .ssscn files.
    • [latex]\overline{j_{t,g,s}},\underline{j_{t,g,s}}[/latex]: continuous variables to penalize average constraints violated for each period, destination, and simulation. One pair of variables is necessary for each quantifiable parameter modeled block by block whose average is being constrained. An example would be variables used to control the average grade of blocks mined in different periods, destination groups, and simulations. More information about possible parameters here. Note also that the software allows the control of the average of simulations, instead of dealing with each simulation individually, and the control by the sum of destinations, instead of each destination individually.
    • [latex]\overline{\beta_{t,g}},\underline{\beta_{t,g}}[/latex]: user defined weights for variables [latex]\overline{j_{t,g,s}},\underline{j_{t,g,s}}[/latex] with the same destination group [latex]g[/latex] and period [latex]t[/latex]. These can only be defined in the .ssscn files.
    • [latex]e_{c,t} \in \mathbb{R},\,\, [/latex]  [latex]t = 1,...,T[/latex],[latex]c=1,...,M[/latex]
    • [latex]x_{c,t,d}^{z} \in \{0,1\},\,\, [/latex]  [latex]c=1,...,M[/latex], [latex]t = 1,...,T[/latex], [latex]z=1,...,Z[/latex], [latex]d=1,...,D[/latex]
    • [latex]\overline{f_{t,g,s}},\underline{f_{t,g,s}} \in \mathbb{R_{\geq 0}}[/latex], [latex]t = 1,...,T[/latex], [latex]s=1,...,S[/latex], [latex]g=1,...,G[/latex]
    • [latex]\overline{j_{t,g,s}},\underline{j_{t,g,s}} \in \mathbb{R_{\geq 0}}[/latex], [latex]t = 1,...,T[/latex], [latex]s=1,...,S[/latex], [latex]g=1,...,G[/latex]

    Having the set of variables defined, it is now possible to define a mathematical model with an objective function and the necessary constraints.

    Objective function

    Intuitive idea

    1. Sum of the economic value of blocks mined per period, destination, and simulation.
    2. Average the result by the number of simulations.
    3. Subtract penalties for certain violated restrictions associated with some user defined parameters.

    Requirements

    • \(V_{c,t,d,s}^{z}\): cumulative discounted economic value of block \((c, z)\) in simulation \(s\), period \(t\) and destination \(d\). More about this calculation here.

    Formulation

    \(max\frac{1}{S}\)\(\sum\limits_{s=1}^{S}\sum\limits_{t=1}^{T}\sum\limits_{c=1}^{M}\sum\limits_{z=1}^{Z}\sum\limits_{d=1}^{D}\)\((V_{c,t,d,s}^{z} x_{c,t,d}^{z}) \,-\, p\)
    where
    \(p = \sum\limits_{t=1}^{T}\sum\limits_{g=1}^{G}(\overline{\alpha_{t,g}}(\sum\limits_{s=1}^{S}\overline{f_{t,g,s}})\)\( + \underline{\alpha_{t,g}}(\sum\limits_{s=1}^{S}\underline{f_{t,g,s}})\)\( + \overline{\beta_{t,g}}(\sum\limits_{s=1}^{S}\overline{j_{t,g,s}})\)\( + \underline{\beta_{t,g}}(\sum\limits_{s=1}^{S}\underline{j_{t,g,s}})) \)

    Finally, the objective function is constrained by the restrictions below.

    Intuitive idea

    • Surfaces cannot cross each other: the surface of a later period must lie at or below the surface of the previous period, since mined material cannot be put back.

    Formulation

    • [latex]e_{c,t-1} - e_{c,t} \ge 0, c=1,...,M, t=2,...,T [/latex]
    Figure 3: Two surfaces (blue and yellow): a) not crossing each other and respecting constraint (2); b) crossing each other and not respecting constraint (2).

    Intuitive idea

    • Adjacent elevations in a single surface need to respect a maximum difference. This maximum will change based on which direction they are adjacent: x, y, or diagonally.

    Requirements

    • [latex]H_x, H_y, H_d[/latex]: maximum difference in elevation for adjacent cells in [latex]x[/latex], [latex]y[/latex] and diagonal directions
    • [latex]X_c, Y_c, D_c[/latex]: the sets of cells adjacent to a given cell [latex]c[/latex] laterally in [latex]x[/latex], laterally in [latex]y[/latex], and diagonally, respectively.

    Formulation

    • [latex]e_{c,t} - e_{x,t} \le H_x, c=1,...,M, t=1,...,T, x \in X_c[/latex]
    • [latex]e_{c,t} - e_{y,t} \le H_y, c=1,...,M, t=1,...,T, y \in Y_c [/latex]
    • [latex]e_{c,t} - e_{d,t} \le H_d, c=1,...,M, t=1,...,T, d \in D_c [/latex]
    Figure 4: Maximum allowed difference (Hx, Hy, and Hd) in elevation between adjacent cells in contact laterally in the x direction (a), in contact laterally in the y direction (b), and in contact diagonally (c).
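    As a reference, assuming a uniform slope angle \(\theta\) and square cells of side \(L\) (symbols introduced only for this note, not used in the formulation above), these maximum differences follow from simple trigonometry: \(H_x = H_y = L\tan(\theta)\) for laterally adjacent cells, and \(H_d = L\sqrt{2}\tan(\theta)\) for diagonally adjacent cells, since the diagonal horizontal distance is \(L\sqrt{2}\).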
     

    The proprietary constraints are not disclosed. The following are possible examples of constraints of the same type, but not the ones actually employed.

    Intuitive idea

    • Surfaces define when blocks are mined. For example, blocks between the surfaces associated with periods 1 and 2 are mined in period 2. A block is between two surfaces if its centroid lies between them.

    Requirements

    • [latex]E_{c}^{z}[/latex]: elevation of centroid for a given block [latex](c, z)[/latex]

    Formulation

    • [latex]E_{c}^{z} \times \sum\limits_{d=1}^{D}x_{c,1,d}^{z} \ge e_{c,1}, c=1,...,M,  z=1,...,Z[/latex]
    • [latex]e_{c,t-1} \ge E_{c}^{z} \times \sum\limits_{d=1}^{D}x_{c,t,d}^{z} \ge e_{c,t}, [/latex][latex]c=1,...,M, t=2,...,T, z=1,...,Z[/latex]
    Figure 5: Distance between centroids above surfaces (green lines) and below surfaces (red lines) respecting constraints (6) and (7). Blue blocks are mined in period 1, while yellow blocks are mined in period 2.
    Intuitive idea

    • Each mined block can only be sent to one destination.

    Formulation

    • [latex]\sum\limits_{d =1}^{D}x_{c,t,d}^{z} = 1, c=1,...,M, t=1,...,T, z = 1,...,Z[/latex]

    Intuitive idea

    • For each period and destination group there is an upper and a lower limit on the total tonnage to be extracted. Destination groups might be formed by any unique combination of destinations, with one, many, or all of them. The sum of the tonnage of mined blocks sent to the same group of destinations in the same period must respect these limits.

    Requirements

    • [latex]T_c^z[/latex]: tonnage for a given block [latex](c, z)[/latex].
    • [latex]\underline{T_{t,g}},\overline{T_{t,g}}[/latex]: lower and upper limits, respectively, on the total tonnage to be extracted during period [latex]t[/latex] and sent to destinations in group [latex]g[/latex].

    Formulation

    • [latex]\underline{T_{t,g}} \le \sum\limits_{c=1}^M\sum\limits_{z=1}^{Z}\sum\limits_{d \in g}T_c^z x_{c,t,d}^{z} \le \overline{T_{t,g}}, t = 1,...,T, g = 1,..., G[/latex]

    Intuitive idea

    • The user can define a certain parameter (e.g. fleet hours spent) associated with each mined block to have its sum controlled. The sum of the values of this parameter over the mined blocks must respect lower and upper bounds for each period, destination group (optional), and simulation (individually or on average). Destination groups might be formed by any unique combination of destinations, with one, many, or all of them.

    Requirements

    • [latex]\underline{F_{t,g,s}},\overline{F_{t,g,s}}[/latex]: lower and upper limits, respectively, in sum of user defined parameter to be respected in period [latex]t[/latex], destination group [latex]g[/latex], and simulation [latex]s[/latex].
    • [latex]F_{c,d,s}^{z}[/latex]: value of user defined parameter related to a given block [latex](c, z)[/latex] in destination [latex]d[/latex] and simulation [latex]s[/latex].

    Formulation

    • [latex]\underline{F_{t,g,s}} \le \sum\limits_{c=1}^M\sum\limits_{z=1}^{Z}\sum\limits_{d \in g}F_{c,d,s}^{z}x_{c,t,d}^{z} + \underline{f_{t,g,s}} - \overline{f_{t,g,s}} \le \overline{F_{t,g,s}},[/latex]

      [latex]t = 1,...,T, g = 1,..., G, s = 1,..., S[/latex]

    Intuitive idea

    • The user can define a certain parameter (e.g. grade) associated with each mined block to be controlled on average. This average is weighted by the block's tonnage and by an optional, user-defined weight. It must respect lower and upper bounds for each period, destination group (optional), and simulation (individually or on average). Destination groups might be formed by any unique combination of destinations, with one, many, or all of them.

    Requirements

    • [latex]\underline{J_{t,g,s}},\overline{J_{t,g,s}}[/latex]: lower and upper limits, respectively, for average value of user defined parameter to be respected in period [latex]t[/latex], simulation [latex]s[/latex], and destination group [latex]g[/latex].
    • [latex]T_{c}^{z}[/latex]: tonnage for a given block [latex](c, z)[/latex].
    • [latex]J_{c,s,d}^{z}[/latex]: value of user defined parameter of block [latex](c, z)[/latex] sent to destination [latex]d[/latex] in simulation [latex]s[/latex]
    • [latex]P_{c,t,d,s}^{z}[/latex]: user defined weight for block [latex](c, z)[/latex] in period [latex]t[/latex], destination [latex]d[/latex], and simulation [latex]s[/latex]

    Formulation

    • [latex]\underline{J_{t,g,s}} \le[/latex][latex]\frac{\sum\limits_{c=1}^M\sum\limits_{z=1}^Z\sum\limits_{d\in g}P_{c,t,d,s}^{z}T_{c}^{z}J_{c,s,d}^{z}x_{c,t,d}^{z}}{\sum\limits_{c=1}^M\sum\limits_{z=1}^Z\sum\limits_{d\in g}P_{c,t,d,s}^{z}T_{c}^{z}}[/latex][latex] + \underline{j_{t,g,s}} - \overline{j_{t,g,s}} \le \overline{J_{t,g,s}}[/latex]

      [latex]t = 1,...,T, g = 1,..., G, s = 1,..., S[/latex]

    Proprietary constraints not disclosed

    Intuitive idea

    • Surfaces should respect geometric parameters defined by the user, such as minimum bottom width, minimum mining width, minimum mining length, and maximum vertical rate of advance, as depicted here.

    Formulation

    • [latex]Geometric(e_{c,t}) \le \text{geometric restriction}, c=1,...,M, t=1,...,T[/latex]

    Step 3: Integer, non-linear solution and evaluation

    The next step in the mining optimization algorithm is to convert the linear solution into an integer, non-linear one. MiningMath's Branch & Cut method is responsible for this conversion. Once it is done, the resulting solution can be evaluated, leading either to the end of the algorithm's execution or to a new optimization process. This new process is triggered if one of two situations arises:

    1. restrictions are violated due to the transformation from the linear to the integer, non-linear solution, or because the problem is infeasible; or

    2. an evaluation of certain restrictions in the transformed integer, non-linear solution concludes that they might not affect the problem and are better discarded or modified.

    If either of these is true, the solution at this stage is sent back to Step 2 for linearization and refinement. If the refinement is caused by situation 1, the goal is to improve the solution's feasibility. This feasibility is improved according to the constraint hierarchy order depicted in Figure 6.

    Figure 6: constraints hierarchy order.

    In contrast, if it is caused by situation 2, the goal is to allow the optimization to focus on the bottlenecks of the problem and improve the current NPV. Once neither of these situations is identified, the current solution is returned. Note that each time the algorithm goes back to Step 2, a new global optimization is performed, so the new resulting solution might be entirely different.

    Pseudo-code

    The whole process of the mining optimization algorithm, from input to output, is summarized in the pseudo-code below. References are made to the previous Steps 1, 2, and 3. This algorithmic flow, together with the proposed mathematical formulation, exemplifies the innovative methodology applied to solve a single-mine scheduling problem.

    				
    					INPUT: Block model,
           Mining parameters,
           Optional time limit T
    OUTPUT: Excel report summarizing the main results of the optimization,
            Outputs of mining optimization, topography, and pit surfaces using   
            .csv format that can also be imported into other mining packages.
    
    EXECUTE initial assessment // Step 1
    CREATE problem linearization P // Step 2
    SET CURRENT_SOLUTION to empty
    SET FEASIBLE_SOLUTION to empty
    REPEAT // Step 3
        SOLVE P // Optimization engine + proprietary Branch & Cut algorithm
        SET LS to the integer, linear solution of P
        TRANSFORM LS to an integer, non-linear solution RS
        
        // Evaluate RS
        IF RS has no violated constraints THEN
            SET FEASIBLE_SOLUTION to RS
        ENDIF
            
        IF RS is better than CURRENT_SOLUTION THEN 
            SET CURRENT_SOLUTION to RS
        ENDIF
        
        // Evaluate if new iteration is necessary
        IF FEASIBLE_SOLUTION is empty THEN
            // Step 2 and Figure 6
            CREATE new problem linearization P
                   with flexible constraints
            CONTINUE // Go back to loop's start
        ELSE IF T has been reached
            BREAK // Leave loop
        ELSE IF RS has violated constraints that were unviolated in LS OR
              has constraints that can be discarded/modified THEN
            CREATE new problem linearization P // Step 2
            CONTINUE // Go back to loop's start
        ELSE
            BREAK // Leave loop
        ENDIF
    WHILE TRUE
    EXPORT reports and outputs from CURRENT_SOLUTION
    
    				
    			

    Evaluating constraints

    MiningMath has a flexible mining optimization algorithm that consists of a Mixed Integer Linear Programming (MILP) formulation and linearization methods that tackle the challenging non-linear aspects of the problem. It is the only mining package able to handle such a diverse range of constraints in a single-step process. However, this range of available constraints raises the question:

    How to add all the required constraints without losing too much value?

    There is no exact procedure, as each constraint models a different engineering aspect. Therefore, an experienced engineer must be willing to explore a range of possibilities by building Decision Trees, wisely choosing scenarios that get closer to the real problem (more constraints added) without losing too much value (or even gaining value, given some non-linear aspects).

    The following sections suggest possible workflows that can be followed in order to perform an efficient analysis.

    Initial analysis

    It is important to analyze scenarios to measure the impact of each constraint on the project's net present value (NPV), from the Super Best Case down to a detailed setup. For example, with an NPV Upside Potential analysis.

    When performing such an evaluation, the cumulative NPV usually decreases (as expected) as more constraints are added. However, there are exceptions, as described in the following section.

    Non-linear constraints

    Geometric constraints are modelled as non-linear restrictions. This non-linearity can lead to counterintuitive results, in which adding constraints may actually improve the NPV. Hence, if you are not happy with the results achieved after adding geometric constraints, you might need to perform a Selectivity Analysis or a Best-Worst Range Analysis of your project.

    Other workflows

    MiningMath offers a diverse range of Workflows that can be followed to improve your project's results. If you are still struggling with certain parameters or constraints, please have a look at all the available options to identify which one best suits your particular case.

    Mining Sequence per Period

    What is a mining sequence per period?

    A mining sequence per period outlines the order in which blocks are to be mined within each period, typically starting from the first block (block 1) and ending with the last block (block N). It is common that other mining packages produce such sequences as part of their output.

    How are mining sequences created?

    These sequences are usually generated using heuristic approaches. For example, some methods gather all blocks from each period and, starting from the highest bench, select a specific horizontal direction to enumerate them. Some other tools also employ short-term strategies with greedy algorithms to optimize mining operations on a day-to-day basis.

    What are the disadvantages of mining sequences?

    Although a sequence of blocks can be defined for mining, these sequences often lack optimization criteria during their creation. For instance, approaches that prioritize starting from the highest bench only ensure that slope angles are respected, neglecting other geometric constraints. Similarly, greedy algorithms fail to consider the global view, potentially leading to violations of certain constraints later on.

    Therefore, MiningMath does not provide such sequences, as users might assume that constraints will be respected when, in reality, they may not be.

    How does MiningMath handle the mining sequence?

    MiningMath tackles this challenge by introducing the concept of Timeframes. This feature empowers users to specify the level of detail they desire within each mining period while maintaining a comprehensive overview and ensuring that all constraints are duly considered.

    We recommend initiating the entire Life of Mine (LOM) setup with smaller time frames, such as “months,” for the initial interval. However, in some cases, employing Force and Restrict mining surfaces from previous runs can help reduce the complexity of the problem and enhance efficiency.

    General Content

    Destinations

    Destination Policy of MiningMath

    MiningMath aims to maximize the NPV of a mining project and, as such, it uses a discount rate in all calculations, considering the value of money through time. The software decides which blocks will be mined, when, and to which destination they must be sent. The mathematical model bases its decisions on the economic values of each possible route. This means that MiningMath aims to identify, from a global view, the best destination to increase the NPV while simultaneously respecting all the constraints, following the priority order listed here.
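    As a minimal illustration of this value-driven destination logic, the sketch below picks, for a single block, the destination with the highest economic value. It deliberately ignores the capacities, blending targets and geometric constraints that MiningMath balances globally, and the column names are hypothetical.

    # Minimal sketch: naive per-block destination choice by economic value.
    # MiningMath makes this decision globally, subject to all constraints;
    # this only illustrates the role of the economic value per destination.
    block = {"Economic Value Process": 4.25, "Economic Value Waste": -0.80}

    best_destination = max(block, key=block.get)
    print(best_destination)   # -> "Economic Value Process"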

    Adding Destinations

    Figure 1 shows, in the bottom right corner of the screen, the buttons responsible for adding and removing destinations.

    The panel Type shows the type of each destination added. The user must add at least:

    • One (1) processing stream;

    • One (1) waste dump

    The numbers on the left of the screen are identifiers for each route in the mined blocks output file.

    Renaming

    Especially when using multiple destinations, the user should consider using more meaningful names for each route. Figure 2 highlights the panel Name, where the user can rename each one.

    Process Recovery

    Figure 3 shows the recovery panel, which is intended for inputting the recoveries used during the economic value calculation. The difference is that here they are used for reporting purposes, which means they are not being considered twice.

    Economic Value

    Figure 4 highlights the Economic Value panel. Here, the user can assign each economic function to the proper destination.

    Stockpile limits

    As shown in Figure 5, stockpile limits are available only for processing streams, if stockpiling is activated in the General tab. These limits are valid for the whole life of mine.

    Read more about stockpiles.

    Recovery

    One of the most important and basic concepts in mineral processing is metallurgical efficiency. Two terms are used to describe this efficiency: recovery and grade.

    The recovery or percent recovery refers to the ratio of the valuable material (metal or mineral) contained in the concentrate with reference to the amount of the same material in the processing plant feed.
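    In formula form, using the standard mineral-processing definition (not a MiningMath-specific symbol):

    \(\text{Recovery }(\%) = \frac{\text{metal contained in the concentrate}}{\text{metal contained in the plant feed}} \times 100\)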

    How and where?

    The main place where the user inputs the recovery information is when defining the Economic Values.

    However, as this information is implicit in the economic functions, the user needs to input this value in the interface for the purpose of generating reports.

    On the Destination tab (light green), the user can define recoveries for each element in the panel (dark green) from Figure 1.

    Figure 1: Destinations tab and recoveries.

    Varying Recoveries

    Why?

    Detailed mine planning is likely to require an iterative process to update the block model with new information.

    Considering the usage of specific tools for measurements, analysis, and reporting, mine planners might be interested in using the thorough information acquired on recoveries along the way.

    How?

    When editing the data set, the user can add as many columns as needed, defining recoveries for each block. Figure 2 shows how it would look.

    Figure 2: Recovery columns in the block model.
    Figure 3: Imported recoveries available to the user.

    Figure 3 shows a dropdown menu where the user can choose which recovery to use:

    • RecoveryA

    • RecoveryB

    • Constant recovery

    NOTE: the user does not need to use all the imported recovery fields in each run. This means recovery fields might be created for further scenarios and used separately. The same concept is valid for columns with slopes and economic values.
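    A hypothetical sketch of adding such columns to the block model file before importing it is shown below. The column names (CU, RecoveryA, RecoveryB) and the recovery expressions are placeholders to adapt to your own data.

    # Hypothetical sketch: add per-block recovery columns to the block model CSV.
    import pandas as pd

    model = pd.read_csv("block_model.csv")

    # Two alternative recovery assumptions, to be selected in separate scenarios.
    model["RecoveryA"] = 0.88                                            # flat recovery
    model["RecoveryB"] = (0.80 + 0.05 * model["CU"]).clip(upper=0.95)    # grade-dependent

    model.to_csv("block_model_with_recoveries.csv", index=False)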

    Step-by-step

    The following video presents how to use a different recovery for each block.

    Video 1: Varying recoveries for each block.

    Stockpiles

    Stockpiles are a post-processing unit of the algorithm. Therefore, after the optimization, which generates the mining surfaces, the algorithm performs its analysis among the discarded blocks. MiningMath checks whether the block's value (Revenue – Fixed Mining Cost – Rehandling Cost) is higher than the cost of discarding it, with the respective discount rates applied at the period of extraction and at the period of processing. The stockpiles are therefore treated as an optimization of the blocks that initially went to the dump.

    Figure 1: Stockpiles on MiningMath

    When choosing the destination, the same logic of maximizing the project value is used, and MiningMath defines which discarded blocks will be reclaimed, complementing production shortfalls over time. Figure 1 shows the process.

    To enable the stockpiles on the interface the first step is on the General tab (Figure 2) where two inputs are required:

    • Fixed Mining Cost: value used to decompose the economic value while considering stockpiles;

    • Rehandling Cost: represents the cost to reclaim blocks from the stockpile to the process.

    After that, on the Destinations tab, you can define stockpile limits (Figure 3) for each processing plant added, remembering that this limit refers to the whole life of mine, not to a single period.

    Defining a Fixed Mining Cost is required because MiningMath does not have an interface to enter these parameters when calculating the economic values. It is only used to decompose the block value so that the algorithm can make the proper calculations, considering that part of the costs is incurred when the block is mined, while the rest may be charged when it is processed.

    The Rehandling Cost is the cost to reclaim a block from a stockpile. Therefore, it is the way to break the Economic Values into parts and apply the discount rate at the time a block is processed.
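    To make this decomposition concrete, the toy sketch below compares stocking a discarded block and processing it later against simply leaving it in the dump, applying the discount rate at the period of extraction and at the period of processing. All names and figures are illustrative and do not represent MiningMath's internal calculation.

    # Toy sketch of the stockpiling trade-off (made-up figures).
    revenue_at_plant  = 4.0   # $/t net value if the block is eventually processed
    fixed_mining_cost = 1.5   # $/t, charged at the period of extraction
    rehandling_cost   = 0.8   # $/t, charged when the block is reclaimed
    rate              = 0.10
    t_mined, t_processed = 2, 5   # periods of extraction and of processing

    def pv(value, period):
        return value / (1 + rate) ** period

    value_if_stocked   = -pv(fixed_mining_cost, t_mined) + pv(revenue_at_plant - rehandling_cost, t_processed)
    value_if_discarded = -pv(fixed_mining_cost, t_mined)

    print(f"Stock and process later: {value_if_stocked:+.2f} $/t")    # about +0.75 $/t
    print(f"Discard to the dump:     {value_if_discarded:+.2f} $/t")  # about -1.24 $/t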

    Disclaimer: stockpiles and total tonnage

    Since stockpiles are a post-processing unit of the algorithm, total tonnage restrictions might be violated when the stockpiles are processed. Total tonnage restrictions are only enforced during the optimization step of the algorithm.

    Stocked blocks

    There are two common ways that mining software packages deal with stockpiles:

    • Reclaiming blocks according to an average grade for the entire stockpile.

    • Reclaiming blocks selectively in any required sequence such as FIFO (first in, first out), LIFO (last in, first out), etc.

    Both options are approximations and have their advantages and disadvantages.

    Currently, MiningMath reclaims blocks selectively from stockpiles according to the highest Economic Values. Read more about the Reclamation Policy.

    The user can analyze MiningMath’s output in the files MinedBlocks or AllBlocks and report the periods in which a block has been mined and when it has been processed.

    Video 1: How to trace stocked blocks.

    Artificial stockpiles

    Artificial stockpiles are an advanced way of incorporating external sources of material into your model and using them as an input to your scheduling.

    This artifice may be required when the user needs to include any of the following into the optimization:

    • Pre-existing stockpile from ongoing operations.

    • Underground material to be blended with open-pit material.

    • Ore bought from third-party companies to fulfill production shortfalls.

    Two main ways to do it:

    • Modelling the stockpile with its actual geometry.

    • Creating a simplified artificial stockpile.

    Modelled Stockpile

    Modelling an existing stockpile with its actual geometry is the best alternative for including it in the scheduling when you need operational control over:

    • The stockpile and its adjacent areas.

    • The stock reclamation.

    For this process, use a modelling software to perform the following steps:

    1. Use the previous topography for the base of the stockpile.

    2. Use the current topography for the top of the stockpile.

    3. Create blocks in between these surfaces. These blocks must have the same size as the blocks of the original model.

    4. Assign an average quality (grade) and density to each block created.

    5. Calculate the economic values for these stocked blocks.

    6. Import the model back to MiningMath to further scheduling.

    Simplified Stockpile

    For a quicker process, the user can create as many blocks as needed using a spreadsheet application.

    This alternative is useful for cases where you need:

    • A quicker process and evaluation.

    • Less operational control.

    You can model artificial stockpiles as rows or as columns.

    Rows

    • Controlling a reclaim sequence may require surface constraints.

    • A 1-line row will be affected by minimum widths used for the scenario.

    • Thin rows may have problems being mined completely.

    • Easier setup. Rows don't need geotechnical adjustments.

    • Multiple rows will give more flexibility and reduce conflicts with the operational constraints from the open-pit scheduling.

    • If you opt for multiple stockpiles, create them with a 2-cell distance to avoid overlapping interference.

    Columns

    • Easier sequence. The precedence is defined by the vertical geometry.

    • A 1-line column will be affected by minimum widths used for the scenario.

    • Long columns may affect how deep the scheduling can go in a single period.

    • Columns require a near-vertical slope angle assigned to them and to the adjacent cells.

    • Multiple columns will give more flexibility and reduce conflicts with the operational constraints from the open-pit scheduling.

    • If you opt for multiple stockpiles, create them with a 2-cell distance to avoid overlapping interference.

    Step-by-step (rows)

    1. Create rows of blocks above the topography (Figure 1).

    2. Assign an average quality (grade) and density to each block created.

    3. Calculate the economic values for these new blocks.

    4. Import the model back to MiningMath to further schedules.

    Figure 1: Illustration of a row of 4 blocks created (in green) above a flat topography.
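    A hypothetical way of generating such a row with a script instead of a spreadsheet is sketched below. Coordinates, block size, grades, densities and column names are placeholders and must match your own block model before re-importing it.

    # Hypothetical sketch: create a 4-block artificial stockpile row above a flat topography.
    import pandas as pd

    block_size = 10.0          # same size as the original model's blocks
    z_top = 510.0              # one bench above a flat topography at z = 500
    x0, y0 = 1000.0, 2000.0    # peripheral corner of the model, away from the pits

    rows = [{"X": x0 + i * block_size, "Y": y0, "Z": z_top,
             "Density": 3.0,                 # increased density packs more material per block
             "CU": 0.55,                     # average stockpile grade
             "Economic Value Process": 2.1,
             "Economic Value Waste": -0.3}
            for i in range(4)]

    pd.DataFrame(rows).to_csv("artificial_stockpile_blocks.csv", index=False)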

    Consider checking the following requirements and observations.

    Notes on modelling:

    • The blocks created must have the same size as the ones from the original model.

    • Consider increasing densities for the blocks created to represent more material with a few blocks. The trade-off is a reduced selectivity.

    • The more blocks you have, the more selective the algorithm can be.

    Notes on operational needs:

    • The blocks created will be subject to the operational constraints, such as widths and vertical advance, from your scenarios. This means you need to consider these parameters to define:

      • The stockpile base width.
      • The stockpile height.

    Notes on the placement within the model

    The artificial stockpile should be placed in a peripheral area of your model so that it does not affect the open-pit schedule.

    Avoid the borders of the model to prevent geotechnical issues, which would prevent the artificial stockpile from being mined completely.

    The following video shows more information on artificial stockpiles.

    Video 1: Artificial Stockpiles.

    Use Other Constraints to have additional control over different sources of material, whether they come from the original model or from a modeled stockpile.

    Figure 2 shows an example of how to set up your model to track material from different sources and control it by inputting minimum and maximum limits.

    Figure 2: Example of how to use other constraints to track and control material from different sources.

    Reclaim Policy

    Stockpiles are handled as a post-processing unit of the optimization algorithm (read more). It is also important to understand what guides MiningMath's decisions for stocked and reclaimed blocks. MiningMath's reclaim policy is based on economic values, a goal aligned with the objective of maximizing the NPV. This means that blocks with higher value are reclaimed first, regardless of the period in which they were mined and stocked.

    How can we further control stockpiles and make them suitable for different sorts of controls?

    Different mining packages may adopt different conventions for Stock Reclaim Policies, overall, based on the main goal for each application or module. Some of the possibilities may include:

    • FIFO: First In, First Out.

    • FILO: First In, Last Out.

    • Reclaiming an average grade for the entire stockpile.

    • Reclaiming the highest-value blocks first (MiningMath uses this method).

    Each one of these possibilities is an approximation of reality, with its pros and cons. FIFO and FILO are quite logical but represent a level of selectivity that is not practical. The angle of repose and the positioning of each block are likely the most intuitive examples of lower selectivity in practice.

    An average grade for the entire stockpile is naturally an approximation, considering the amount of material that should be blended to make it close to the reality.

    Reclaiming the highest-value blocks first is also a selectivity level that is not achievable in reality. On the other hand, it is quite aligned with the mathematical goal of maximizing the project’s NPV for strategic evaluation.

    After all, notice that none of these approaches is fully operational. The decision is still up to the professional in charge of strategy optimization, whose preferences, experience, and skills add further ways of control.
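    The difference between these policies can be made concrete with a small sketch, using a hypothetical list of stocked blocks:

```python
# Minimal sketch of a highest-value-first reclaim order (the policy MiningMath uses),
# compared with FIFO, for a hypothetical list of stocked blocks.
stocked = [
    {"block": "A", "period_mined": 1, "value": 120.0},
    {"block": "B", "period_mined": 2, "value": 310.0},
    {"block": "C", "period_mined": 3, "value": 205.0},
]

fifo_order = sorted(stocked, key=lambda b: b["period_mined"])          # first in, first out
value_order = sorted(stocked, key=lambda b: b["value"], reverse=True)  # highest value first

print("FIFO :", [b["block"] for b in fifo_order])    # ['A', 'B', 'C']
print("Value:", [b["block"] for b in value_order])   # ['B', 'C', 'A']
```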

    Preserving Value and Guiding the Reclaim Policy

    This section aims to bring a few ideas on how the user might guide the algorithm in order to follow one’s preferred reclaim strategy.

    Baseline Knowledge​

    For this article, the user should have prior knowledge in the following concepts.

    Baseline Scenario

    The general idea is to set up and run a baseline scenario to find what is optimal for the long-term value. The solution obtained in this step will guide further executions.

    The first step is to switch the output format to export the entire block model along with the optimization output information, which is represented by the AllBlocks.csv. This is a basic step for any iteration that requires re-optimizing a previous solution. By default, MiningMath exports only the MinedBlocks.csv.

    Results will present a guide on which blocks should be mined, when, and whether they were immediately processed, stocked then processed, or discarded.

    The idea here is to use previous outputs to create new columns of economic values, assuming the use of fake destinations, i.e. creating multiple processing streams that do not coexist and, in fact, represent the same single plant. These fake destinations, along with a pre-definition of the destination through economic values, are what allows the user to impose whatever sort of control is preferred.

    Along with that, and based on the previous results, the user must manipulate new economic values pre-defining the final destination for each block.

    Steps here comprise:

    • Define a criterion for stocked blocks (Period Mined different from Period Processed) that should be reclaimed earlier or later. This criterion must be based on the previous results from the AllBlocks.csv file (Figure 1), namely the columns Mined Block, Period Mined, Period Processed, and Destination. For example, in a FIFO approach, a block stocked during the second period of the first run should be sent to the stockpile of Process 1 to be reclaimed first; hence, this block must have a very negative value for all other destinations.

    • Pre-define destinations, based on the criteria adopted, by manipulating economic values. Use very negative values to prevent a block from going to a given destination, as shown in Figure 2 (a small sketch follows this list).

    • Set up your scenario considering that the pre-defined destinations will not coexist, as illustrated in Figure 3. Notice that: 1) Process 1 and its stockpile will be used from Period 1 to Period 5; 2) Process 2 and its stockpile will be used from Period 6 to Period 10; and 3) Process 3 and its stockpile will be used from Period 11 onward.

    • Optimize the new scenario to have a better approximation for the final NPV, considering the strategy of your preference.
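    As a sketch of the economic-value manipulation above, assuming hypothetical column names (check them against the header of your own AllBlocks.csv export) and the 1–5 / 6–10 / 11+ period grouping illustrated in Figure 3:

```python
import pandas as pd

VERY_NEGATIVE = -1.0e9   # value used to forbid a destination

# Column names below are illustrative; confirm them against your AllBlocks.csv header.
blocks = pd.read_csv("AllBlocks.csv")

stocked = (blocks["Period Mined"].ne(blocks["Period Processed"])
           & blocks["Period Processed"].notna())

def fifo_group(period_mined):
    # FIFO-like rule: blocks stocked early are reclaimed by the first fake process.
    if period_mined <= 5:
        return 1
    elif period_mined <= 10:
        return 2
    return 3

blocks["group"] = blocks["Period Mined"].apply(fifo_group)

for g in (1, 2, 3):
    col = f"Economic Value Process {g}"             # hypothetical new column name
    blocks[col] = blocks["Economic Value Process"]  # start from the original process value
    # Forbid every stocked block that does not belong to this group.
    blocks.loc[stocked & (blocks["group"] != g), col] = VERY_NEGATIVE

blocks.to_csv("AllBlocks_predefined.csv", index=False)
```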

    Variable Mining Costs

    MiningMath was conceived with some simplifications in its current version, such as Fixed Mining Costs.

    The software considers a fixed mining cost, which is applied beforehand in the economic formulation. This value is inputted in the user interface so that the software can recognize it and break down the previously calculated economic values, applying proper discounts to each block, whether it is (1) processed, (2) discarded, or (3) stocked and processed. In the last case, mining costs are applied earlier than the processing costs and revenue.

    However, mining costs might be defined with some variability for purposes of higher detail.

    This article shares some ideas on how to consider Variable Mining Costs, based on different:

    • Haul costs, as a function of the block's depth and/or the destination site.

    • Blasting costs, as a function of the rock type.

    • Loading costs, as a function of mining selectivity.

    • Supplies and materials, labor costs, among others.

    1. Prior to the model import, create a column of mining costs by block — including whatever costs are applicable: blasting, haul, loading, etc.

    2. Calculate the Average Mining Cost of all of the blocks.

    3. During the model import, assign the Mining Costs column to the field type Other; MiningMath will export this column along with its output.

    4. Use the Average Mining Cost as the Fixed Mining Cost for the stockpiles.

    5. Run MiningMath.

    6. (OPTIONAL, for a more accurate approximation) Analyze the MinedBlocks file (or AllBlocks) and calculate the Average Mining Cost just for the stockpiled blocks (check out Tracing Stocked Blocks). A small sketch of these calculations follows step 7.

    7. (OPTIONAL) Run MiningMath again, now using the new Average Mining Cost (from Step 6) as the Fixed Mining Cost for the stockpiles.
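    The sketch below illustrates steps 2 and 6 with pandas. The file and column names (block_model.csv, blasting_cost, haul_cost, loading_cost, tonnage, Period Mined, Period Processed) are illustrative, and the tonnage-weighted average is one possible choice; adapt them to your own files and export.

```python
import pandas as pd

# Column names are illustrative; adapt them to your block model / MinedBlocks export.
model = pd.read_csv("block_model.csv")

# Steps 1-2: per-block mining cost (blasting + haul + loading) and its average.
model["mining_cost"] = model["blasting_cost"] + model["haul_cost"] + model["loading_cost"]
avg_cost = (model["mining_cost"] * model["tonnage"]).sum() / model["tonnage"].sum()
print(f"Average mining cost (all blocks): {avg_cost:.2f} $/t")

# Step 6 (optional): after running MiningMath, recompute the average only for
# the stocked blocks (Period Mined different from Period Processed).
mined = pd.read_csv("MinedBlocks.csv")
stocked = mined[mined["Period Mined"] != mined["Period Processed"]]
avg_cost_stock = (stocked["mining_cost"] * stocked["tonnage"]).sum() / stocked["tonnage"].sum()
print(f"Average mining cost (stocked blocks only): {avg_cost_stock:.2f} $/t")
```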

    For mined blocks in the stockpiles context, the possibilities for the material flow are:

    1. Mine-to-process

    2. Mine-to-waste

    3. Mine-to-stock

    4. Stock-to-process

    The mining costs for flows (1) mine-to-process and (2) mine-to-waste are already embedded in the economic formulation.

    The costs for flow (4) stock-to-process consider an additional cost, represented in the user interface by the re-handling cost.

    MiningMath assumes the costs for (2) mine-to-waste and (3) mine-to-stock are the same, which might not be always the case due to, for example, different distances at the mine site surface (different haul costs).

    One way to account for this difference (Option 1) is to adjust the stocked blocks' values manually and recalculate the NPV, for which the steps are:

    1. Open the MinedBlocks.csv.

    2. Edit the stocked blocks' value to add new costs.

    3. Calculate the NPV manually (a small sketch follows this list).
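    A minimal sketch of step 3 is shown below. The per-period cash flows are illustrative and would come from summing the edited block values of MinedBlocks.csv grouped by period; discounting from the first period follows the convention mentioned in the Production Constraints section.

```python
# Minimal sketch of a manual NPV recalculation (step 3), assuming the discount
# rate is applied from the first period onward. Cash flow values are hypothetical.
discount_rate = 0.10
cash_flows = [12.0e6, 10.5e6, 9.8e6, 7.2e6]   # $ per period

npv = sum(cf / (1 + discount_rate) ** (t + 1) for t, cf in enumerate(cash_flows))
print(f"NPV: {npv:,.0f} $")
```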

    A second option (Option 2) is to re-optimize with updated economic values:

    1. For the first execution, set up MiningMath to export the AllBlocks.csv and run.

    2. Use the output model (AllBlocks) to update the economic value for stocked blocks. Then, save as a new model.

    3. For the second execution, re-import this model with fixed surfaces for all periods, as in Figure 1.

    Observation

    This process does not guarantee that the solution will not change, i.e. the stocked blocks might change, which might require further iterations for a better approximation.

    As stockpiles are treated as a post-processing unit in our algorithm, not being part of the optimization, Option 1 is more suitable for a more accurate NPV calculation.

    DEVELOPMENT NOTES

    The following features are already in our pipeline:

    • Optimized Stockpiles: future versions will consider Stockpiles as destinations when optimizing the NPV.

    • Varying mining costs: future versions will have a mining cost field that varies by block, as already happens with Density, Slope and Recoveries.

    We have a partner working on a calculator and post-processing analysis tools to be integrated with MiningMath in the future. For more details, contact our support.

    Production Constraints

    The Production tab allows the user to set period ranges and their corresponding production limits. This functionality allows the user to consider options such as pre-stripping, production ramp-up, prices changing over time, among others. It is worth mentioning that the discount rate in MiningMath is already applied in the first period. In the training presentation on this page, you can check the NPV formulas used.

    Figure 1 presents a panel where the user can define period ranges.

    The Period Ranges feature allows the user to vary correspondent variables over time (see the last section of this page) such as:

    • Production limits

    • Limiting surfaces

    • Blending constraints

    • Other constraints

    The user is able to edit only the field To. Subsequent periods will have their From field adjusted automatically.

    Figure 2 highlights the timeframe panel, where the user can choose different values for their project. These values are used to deliver more accurate sequencing and are tied to a yearly-basis calculation. It is possible to select a predefined value or to input a customized one.

    In this example, we used a timeframe of 1 year, meaning that each generated period on this sequencing represents one year.

    Figure 3 highlights the Production panel where the user can define limits for any destination added.

    In this example, we have the following limits:

    • Process 1: 30,000,000 t

    • Dump 1: <unlimited>

    • Total: 60,000,000 t

    These limits are being considered from Period 1 up to the end of the life of mine.

    Figure 4 highlights the buttons to add and/or remove period ranges, which are shown as rows in Figure 4.

    Figure 5 presents an example where multiple periods were added.

    A user has defined two period ranges as follows:

    • From 1 to 4

    • From 5 to <end>

    For the first range, the user clicked on the field To (A) and set a value of 4. Automatically, the field From (B) of the second range, changed to 5.

    The user can repeat this process to fragment the life of mine and change parameters over time as much as required. In this example, production limits increased by 100% for Process 1 and by 50% for Total from Period 5 onward.

    The example from Figure 3 shows that the user does not need to set an explicit limit for all destinations available. In this case, Dump 1 has an <unlimited> upper bound. This means MiningMath will have enough flexibility to reduce process throughput – and increase dump tonnages up to the Total limit – whenever it finds that this solution increases the project’s NPV. For users willing to have more control and reduce this flexibility, simply define a limit for each destination.

    Average

    Average constraints are based on the average of any quantifiable parameter modeled block by block. To use this feature in MiningMath, the dataset must contain an auxiliary field/column holding, for each block, the value of the variable you wish to limit. This feature then controls the average value of that variable over the blocks mined in each single period. Since it is based on averages, the algorithm can use lower values to respect the target and higher ones to increase the NPV.

    This feature is usually applied to blending, combining low-grade and high-grade blocks in order to increase profitability. However, it has many other applications: basically, any variable that can be modeled under these assumptions can be controlled.

    Video 1: Blending and other constraints.

    Some examples using average are listed below:

    • Grade of a contaminant on the plant.

    • Haulage distance, based on the destination of each block.

    • Blasting material consumption.

    The user can define:

    • Minimum and maximum average limits.

    • Different limits for different materials.

    • Different limits for different intervals.

    • Different limits for different destinations.

    1. Create auxiliary fields in the block model, quantifying the information to be controlled.

    2. During the importation, assign the column to be blended to Grade (Figure 1).

    3. On the Average tab, input minimum and maximum limits for each variable (Figure 2a), period range (Figure 2b), other weights to be considered (Figure 2c) and destination (Figure 2d).

    Figure 1: During the importation, Cu and Au are assigned to "Average".
    Figure 2: Fields where the user can input limits (A), for each period range (B), Weights (C) and each destination (D).
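    To make the averaging explicit, the sketch below checks an Average constraint for one destination in one period, using illustrative grades, tonnages and limits:

```python
# Illustrative check of an Average constraint for one destination and one period:
# the controlled value is the weighted average of the blocks mined in that period,
# using the weight field selected in the interface (here, tonnage).
blocks = [
    {"cu": 0.45, "tonnage": 120_000},   # low-grade block
    {"cu": 0.80, "tonnage": 100_000},   # high-grade block
    {"cu": 0.60, "tonnage": 150_000},
]
min_avg, max_avg = 0.55, 0.65

weighted_avg = sum(b["cu"] * b["tonnage"] for b in blocks) / sum(b["tonnage"] for b in blocks)
status = "OK" if min_avg <= weighted_avg <= max_avg else "violated"
print(f"Average Cu = {weighted_avg:.3f} -> {status}")
```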

    Sum

    What is a Sum constraint?

    Sum constraints are based on the sum of any quantifiable parameter modeled block by block. To use this feature in MiningMath, the dataset must contain an auxiliary field/column holding, for each block, the value of the variable you wish to limit. This feature then controls the total amount of that variable over the blocks mined in each single period. Basically, any variable that can be modeled under these assumptions can be controlled.

    Video 1: Blending and other constraints.

    Some examples are listed below:

    • Tonnages and proportions of rock type and metal production.

    • Consumption of inputs such as energy spent during comminution, and fleet hours spent to mobilize material.

    • Contaminants control on the processing plant during each period.

    The user can define:

    • Different sum limits for each material.

    • Different sum limits for each interval.

    • Different sum limits for each destination.

    • Combine all the options above in order to achieve globally optimized results.

    1. Create auxiliary fields in the block model, quantifying the information to be controlled (Figure 1).

    2. During importation, assign these auxiliary columns to Sum (Figure 2).

    3. On the Sum tab, input minimum and maximum limits for each variable, period range and destination (Figure 3).

    Material Types

    Mining fronts, rock type, or lithotype are usually defined in the block model as strings or converted to integer numbers. As the next step, the user defines which ranges of material types should be allowed, avoided or forbidden in the processing plant.

    The ideal way to model material types for further control of this variable is to create tonnage columns for each material type. Therefore, you will be allowed to:

    • Control material types to be allowed, avoided, or forbidden in any destination.

    • Control the proportions of different material types, if applicable.

    • Analyze scenarios with different levels of flexibility.

    • Understand how the project development changes in face of each hypothesis tested.

    • Assess the impacts of the flexibility level given on economic and technical variables of the system.

    The general idea is to create auxiliary columns in the block model to control any variable through their sum. Then, use the same idea of if-then-else statements, for example:

    • If variable value matches condition

    • Then auxiliary column equals to X

    • Else auxiliary column equals to Y

    To control the amount of material by rock type, create columns for tonnages of each lithotype, as shown in Figure 1, where:

    • Lithotype A has its tonnage inputted in the field Tonnage A.

    • Lithotype A equals zero for fields Tonnage B and Tonnage C, as it does not match the specified condition, i.e. being a lithotype B or C.

    The same concept applies to the other material types (a small sketch follows).
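    A minimal sketch of this construction, assuming hypothetical column names (lithotype, tonnage):

```python
import pandas as pd

# Illustrative construction of the "Tonnage A/B/C" columns described above:
# each column receives the block tonnage when the lithotype matches, and zero otherwise.
model = pd.DataFrame({
    "lithotype": ["A", "B", "A", "C"],
    "tonnage":   [15_000, 12_500, 14_200, 13_800],
})

for litho in ("A", "B", "C"):
    model[f"Tonnage {litho}"] = model["tonnage"].where(model["lithotype"] == litho, 0)

print(model)
# Each "Tonnage X" column is then assigned to the field type "Sum" during importation.
```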

    Figure 2 shows the same concept being applied to measured, indicated, and inferred resources.

    Whatever variable is being modeled, the columns created for this purpose should be assigned to the field type “Sum” during the importation, as shown in Figure 3.

    On the interface, the user will need to insert General and Destinations parameters in order to enable the Sum tab (Figure 4). The next step is to define the limits to be imposed on each variable, for which destinations, and for which period intervals.

    Figure 3: Importation screen where variables to be controlled through sums are properly assigned to the field type “Sum”.

    Figure 4: Illustrates the interface and options available.

    Mining Fronts

    The ‘mining fronts’ approach, as shown in Figures 1 and 2, is a good way to refine results, control regions, and understand which are the best results considering different amounts extracted from a specific place. Using this methodology, you can categorize masses by depth, by a specific coordinate interval, or even into sectors based on a 360-degree view of your project.

    The general idea is pretty similar to what is used to define material types. The first step is to identify the area you want to control. Then create an additional field holding the mass in that region and assign it as a Sum constraint while importing it. Finally, go to the Sum tab and control the minimum or maximum amount at the process or dump destinations.

    Note that by using this feature you are able to control what, how much, and when blocks should be mined according to what you want to analyze. This concept is also very useful in the context of mine design: although it increases the complexity compared with the use of force/restrict mining surfaces, it can also guide solutions geometrically. It is important to remember that the sum constraint has a high priority in the algorithm and can also influence the other inputs based on the constraints hierarchy.

    Identify and define your regions as you wish. For instance, use this excel file, place the Z values from the validation data scenario, calculate the elevation difference with the topography, filter the region which has a 0 result, and create a scatter chart to identify the final pit. Then, take the coordinates/indexes that will be the limits of your mining fronts; in this case, the boundary between the mining fronts was at index 35. Figures 3 and 4 provide some visual information to help you.

    Calculate the tonnage of each mining front in an additional field and import it as a Sum parameter. In this case, mining front 1 is the region above or equal to the chosen limit and mining front 2 is what is below it, as seen in Figure 4.
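    A minimal sketch of this step, assuming hypothetical column names (IX for the easting index, tonnage) and the index-35 boundary mentioned above:

```python
import pandas as pd

# Illustrative split of the model into two mining fronts at index IX = 35
# (the boundary mentioned above); column names are hypothetical.
model = pd.read_csv("block_model.csv")

front_1 = model["IX"] >= 35            # at or above the chosen limit
model["Tonnage Front 1"] = model["tonnage"].where(front_1, 0)
model["Tonnage Front 2"] = model["tonnage"].where(~front_1, 0)

model.to_csv("block_model_fronts.csv", index=False)
# Both new columns are imported as "Sum" fields and limited in the Sum tab.
```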

    Use the Sum tab to control the mining front of each destination. Play with the minimum or maximum that should be extracted from each mining front and understand how the results change. The scenarios shown in Figures 5 to 9 used the production, geometric and stockpiling parameters from the Data Validation page.

    Download this modified Marvin Deposit file and play on your own.

    Stochastic Models

    Stochastic simulations require equiprobable models to consider uncertainties related to geological aspects, such as grade and/or volume of ore.

    While single scenarios of distinct models are run separately, a stochastic scenario consists of honoring all the single scenarios at once.

    This is achieved through an adapted resource block model that contains equiprobable values for a given set of variables containing a certain level of uncertainty.

    As a consequence, MiningMath produces reports with the risk profile of indicators, presenting the minimum, maximum, expected, and percentiles P10 and P90 (these are threshold values, indicating that 10% of the indicators are below the P10 and 90% of the indicators are below the P90). Figures 1 and 2 depict the graphs for the NPV and cumulative NPV, respectively.


    Fig 1: Report on NPV for stochastic model.


    Fig 2: Report on cumulative NPV for stochastic model.

    The purpose of this page is to briefly explain how to import data and manage stochastic constraints using MiningMath.

    Formatting Uncertain Fields

    Uncertain fields are those which might vary from simulation to simulation. By definition, stochastic models have uncertain fields; typically, grade fields contain uncertain information. Therefore, the user will need to format each equiprobable possibility in a specific way: give each uncertain column the same name, adding {#} (where # is a number from 1 up to n). The list below highlights how grade headers should look, for example:

    • Copper {1}

    • Copper {2}

    • Copper {3}

    • Copper {4}

    • Copper {5}

    Note that the grade information will influence the economic values for the processing stream. Therefore, the user will need to calculate the Economic Values for each possible grade realization, as highlighted in Figure 3. This figure illustrates parts of a simulated copper deposit that can be downloaded here.

    Figure 3: Stochastic values for copper grade (green) and respective process (blue) for 20 simulations.
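    The sketch below illustrates the {#} naming convention and the calculation of one economic value column per simulation. The file name and the recovery, price and cost figures are placeholders, not MiningMath defaults.

```python
import pandas as pd

# Illustrative naming of uncertain fields and of the corresponding economic values,
# one column per simulation, following the "{#}" convention described above.
n_sims = 20
model = pd.read_csv("simulated_model.csv")   # assumed to contain "Copper {1}" ... "Copper {20}"

tonnage = model["tonnage"]
for s in range(1, n_sims + 1):
    grade = model[f"Copper {{{s}}}"]                              # e.g. "Copper {1}"
    revenue = tonnage * grade / 100 * 0.88 * (2500 - 250)         # recovery, price, selling cost
    model[f"Process {{{s}}}"] = revenue - tonnage * (4.0 + 0.9)   # processing + mining costs

model.to_csv("simulated_model_with_values.csv", index=False)
```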

    Stochastic Constraints

    Once you import your stochastic block model, the tabs Average and Sum will allow for constraints both on:

    1. Expected values to control the averages over all simulations.

      These constraints will guarantee that, on average, the indicators will be within the defined ranges. For example, take Expected Min = 0.60 and Expected Max = 0.65 for a certain constraint. If there are 3 simulations returning 0.59, 0.62 and 0.65, the average is 0.62, so this is within the range defined.

      Copper simulation example average
      Fig 4: Example to control the average of all simulations. This option is only available when databases containing stochastic data are imported.
    2. All simulations, to guarantee that each one of them respects certain criteria.

      These constraints control the variability, or the spread, of the results to be within a certain acceptable range. Let's take an example where such a range has Min = 0.60 and Max = 0.65, and again three simulations returning 0.59, 0.62, and 0.65. In this case the solution will be penalized by the optimizer, as 0.59 < 0.60. Learn more about penalized solutions here.

      Fig 5: Example to control all simulations individually. This option is only available when databases containing stochastic data are imported.

    Stochastic optimization is an optimized way to combine all these modelled uncertainties into one schedule that maximizes the Expected NPV of the project.
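    The two constraint types can be contrasted with the same three simulated values used above:

```python
# Illustrative evaluation of the two stochastic constraint types described above,
# for a single indicator and three simulated values.
simulated = [0.59, 0.62, 0.65]

# 1) Expected-value constraint: only the average across simulations must be in range.
exp_min, exp_max = 0.60, 0.65
avg = sum(simulated) / len(simulated)
print("Expected:", avg, "OK" if exp_min <= avg <= exp_max else "violated")

# 2) All-simulations constraint: every simulation must be in range individually.
sim_min, sim_max = 0.60, 0.65
all_ok = all(sim_min <= v <= sim_max for v in simulated)
print("All simulations:", "OK" if all_ok else "violated (0.59 < 0.60)")
```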

    Violated constraints

    After executing the optimization with constraints for simulated indicators, it is possible that such constraints will not be respected due to some infeasibility in the problem (more about infeasibilities here).

    MiningMath will try to solve any violated constraints following the hierarchy order depicted in Fig 6. Stochastic constraints are Average and Sum constraints; they have higher priority than any NPV improvements and time limits imposed by the user.

    Constraint order
    Figure 6: constraints hierarchy order.

    Time Limit

    It is possible to set a time limit in hours before running a scenario in the “Run” tab, as depicted in Fig. 1. The time limit is defined in hours due to the usual complexity of mining projects and the fact that MiningMath will always try to deliver a reasonable solution.

    Figure1: Time limit option in the interface

    MiningMath is built around a global and iterative algorithm. It solves the entire mining optimization after formulating a global mathematical model. The result of such optimization might deliver a solution with room for improvement, due to necessary approximations for solving complex non-linear restrictions, such as the geometric ones, or due to infeasibilities identified in the problem’s restrictions. In turn, if an improvement is possible, another iteration of the global algorithm is prepared and executed.

    Therefore, in order to deliver any solution, the whole mining problem needs to be solved at least once, which makes a more fine-grained time limit (i.e. seconds or minutes) impossible to set. In other words, the time limit is evaluated before each iteration of a global optimization that executes multiple times, as depicted in Fig. 2.

    Illustration of when time limit is evaluated. From steps 1 to 4.
    Fig 2: Illustration of when time limit is evaluated. From steps 1 to 4.

    The algorithm is designed in such a way that it is able to adjust subsequent iterations once it has identified that the time limit has become restrictive. However, it is important to highlight two aspects of such adjustment:

    1. It will not interrupt the current iteration of the algorithm. Hence, while it is expected that this adjustment will help the execution to achieve the desired time limit, it is still possible that it will take more than what was defined.

    2. Once an adjustment is made, a different problem will be defined and consequently new solutions will be explored. Thus, while unlikely, there is a chance that solutions will end up better than those unrestricted in relation to time. Therefore, despite not being implemented for this purpose, the time limit might be used to find more diverse solutions. For instance, you might build decision trees with different time limits. Even if better results are not obtained, fast solutions will still give you a quicker assessment of your project.

    Predefined Destinations

    Predefined destinations refer to a predetermined assignment of individual block destinations (such as waste or process) within a mining operation before any optimization is performed. We have already seen that MiningMath works with Economic Values for each destination, taking each one of them into account to decide whether or not a block should be mined and where it should be sent. Thus, fixing destinations or predefining them is no longer a concern, especially with MiningMath technology.

    However, it might still be necessary if you are using MiningMath to define pushbacks while making use of other constraints, comparing MiningMath technology with other software solutions, or simply want to reduce MiningMath's run time by accepting a less optimized solution.

    Applications

    1. Predefine destinations to define pushbacks.

    2. Lithologic restrictions that prevent certain blocks from being processed. For example, preventing a given rock type from being sent to a processing plant.

    3. Speeding up the algorithm's run time (while accepting a possible loss of NPV, due to an unoptimized choice of destinations).

    4. Among others.

    How to predefine destinations?

    Predefine using the block model file

    When formatting and importing your block model csv file, you can have a predefined destination column, as depicted below. This column will indicate the fixed destination for each block.

    When importing the csv file, make sure to define the field type of your destination column as Predefined Destinations.

    Predefine using the calculator

    The option to predefine destinations can also be set in the Calculator area. The figure below depicts a new parameter Destinations that is set to 1 (a process destination) if the grade of CU is greater than or equal to 0.5, or -999 (nonexistent destination) otherwise. Note that the field type is set to Predefined Destinations.
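    Outside the Calculator, the same rule could be prepared in the block model file itself; the sketch below assumes a CU grade column and is only an illustration of the logic described above.

```python
import pandas as pd

# Equivalent of the calculator rule described above, written outside MiningMath:
# destination 1 (process) when the CU grade is >= 0.5, otherwise -999 (nonexistent).
model = pd.read_csv("block_model.csv")          # assumed to contain a "CU" column
model["Destinations"] = (model["CU"] >= 0.5).map({True: 1, False: -999})
model.to_csv("block_model_predefined.csv", index=False)
# During importation, assign "Destinations" to the field type Predefined Destinations.
```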

    Using the predefined destinations

    After creating the new parameter (using the calculator or importing the field in the block model file), make sure it is being used in the Scenario tab as depicted below.

    Verifying results

    In the Viewer tab you can verify the destinations. Just select the Destinations field in the Blocks area to check the destination value of the filtered blocks. In the example depicted below, all destinations are filtered to 1.

    Discounted vs. Undiscounted Cash Flow

    MiningMath’s objective function maximizes the discounted cash flow for the entire life of the mine, all in a single mathematical optimization step, considering all needed constraints simultaneously. On the other hand, other packages that use the LG/Pseudoflow methods to perform pit optimization aim to maximize the undiscounted cash flow for each revenue factor provided. Hence, the MiningMath solution is not easily comparable to undiscounted cash flow approaches that only consider slope angles.

    Figures 1 and 2 provide a visual comparison between undiscounted and discounted cash flows. This comparison indicates that MiningMath’s decision not to mine certain regions is likely due to the higher cost of waste removal outweighing the potential profit from extracting hidden ore. Despite discounting, the revenue from the hidden ore is insufficient to cover the extraction costs in these areas.

    Example of discounted and undiscounted cashflow
    Figure 1: Undiscounted versus discounted cash flow optimization.
    Figure 2: Undiscounted versus discounted cash flow optimization regarding a minimum mining width.

    Comparing the different methodologies​

    A proper comparison between both methodologies could be done if you import the final pit surface obtained from the other mining package into MiningMath, and use it as Force/Restrict mining. By utilizing this surface as a guide, MiningMath can precisely optimize scheduling within the specific boundaries delineated by the imported surface. This integration simplifies the comparison of NPV between MiningMath and various other mining packages, giving a more comprehensive evaluation of the methodologies employed by each one.

    If you want to emphasize the cash flow in the early mining periods, simply create a decision tree varying the discount rate. The higher the rate value, the more weight will be given to the early periods, leading the undiscounted cash flow to have higher values at the beginning, while later periods will be heavily penalized by the discount rate.
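    The sketch below illustrates, with arbitrary numbers, how the discount rate weights each period and therefore shifts value towards the early periods:

```python
# Illustrative effect of the discount rate on the weight given to each period:
# the same cash flow contributes much less to the NPV when it happens later
# and when the rate is higher.
cash_flow = 10.0e6                               # same undiscounted cash flow every period
for rate in (0.05, 0.10, 0.20):
    weights = [1 / (1 + rate) ** t for t in range(1, 6)]
    npv = sum(cash_flow * w for w in weights)
    rounded = ["%.2f" % w for w in weights]
    print(f"rate={rate:.0%}  period weights={rounded}  NPV={npv:,.0f}")
```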

    If you wish to mimic the same greedy behavior as in the context of nested pits in MiningMath, you should drop all constraints and set a 1-1 (only) interval with the desired ore productions, asking MiningMath to focus solely on maximizing the cash flow of this single pit, regardless of the long-term consequences, as in the picture below and similar to this process.

    Academy

    Teaching

    Training Presentation

    Assignment

    John is a mine planning engineer working on the Marvin deposit. He is conducting the long-term production scheduling plan and is concerned about what would be the best scenario to optimize Marvin’s operations.

    The mine has some constraints, such as slope angles defined block-by-block and a default value of 45 degrees for blocks missing information. The total movement is limited to 60 Mt/yr.

    Part 1

    1. Format the data, calculate economic values, and import this Marvin data. Save it as MMCI_YourName_Marvin.

    2. Choose your own economic parameters and fill in Table 1b to simulate a process B.

    Figure 1: Block model format

    Part 2

    Use the following steps to evaluate your project:

    Now that you know MiningMath is able to easily run multiple scenarios, you should play with the remaining parameters, evaluate the results and bring a proposal for the board. You should:

    1. Download Screencast O-Matic or equivalent screen recording software.

    2. Prepare a presentation (ppt or docx), which could be structured by answering the following questions:

      a) What is the goal of your evaluation?
      b) How did you use MiningMath to achieve your goal? Which features were used? What was the methodology used to get to your results?
      c) Why did you get such results?

    3. Record it as if you were reporting your analysis to the board. Upload it on YouTube as public. The video should be limited to 5 minutes.

    4. Send the link, materials, and the Marvin data used to your teacher.

    Questions

    1. What are the main resources of MiningMath viewer and which are the output files?

    2. Could market changes be considered based on different ore selling prices throughout time? If yes, how?

    3. Is it possible to forbid the optimization to access some specific region? If yes, which feature could be used?

    4. Which geometric features could be used to get results accordingly with the operational needs?

    5. What are the benefits of building decision trees?

    6. What are the main differences between Lerchs-Grossmann/Pseudoflow and DBS methodologies?

    7. What is the importance of considering a discounted cash flow on the optimization decision-making process? Does the value of money through time impact the mining sequence?

    8. Could the optimization handle multiple destinations? Over 2 processing plants? One stockpile for each with maximum capacity?

    9. Could production ramp-ups be part of the optimization constraints?

    10. Is it possible to control the average of a given variable, such as grade, haulage distance, etc.? How many properties can be controlled? Is it possible to change them over the mine's life?

    11. How does the cut-off grade policy work on MiningMath?

    12. What are the main validations before running a scenario?

    13. Can MiningMath run more than one scenario simultaneously?

    14. Which are the 3 main suggestions of the integrated workflow?

    15. Is it possible to generate a short term planning using MiningMath?

    Press Releases

    Strategic optimization

    Is mining an ordinary business?

    November 22, 2019.

    “Mining is a business. This may sound obvious. But when one tries to look at how we operate that becomes much less evident. Are we really looking at what would be important for any business? I’ve had this discussions throughout my career in mining and I must say in many cases technical people, i.e. geologists, mining engineers, processing engineers – those who run the mines – do not look at business metrics. At all!!! Let’s look at a miner. “Let’s move tonnes and increase equipment utilization”, one would say! This person will see the efficiency improvements in higher utilization. Does it help increase shareholder returns? No, not really. It has nothing to do with it.”

    Alexey Tsoy, LinkedIn Article

    Startup wants to reduce dam risks

    March 21, 2019

    “The last two tragedies that desolated Brazil, leaving hundreds of deaths and miles of environmental devastation, from Brumadinho and Mariana, raised a great question: it is possible to reduce the risks of tailing dams and to guarantee everyone’s safety in an efficient and economically viable way?

    MiningMath, startup from Belo Horizonte, says yes. The company has created a software, SimSched, which, through the union of modern programming, and data science, allows the combination of any variables from a mining project to generate analysis, hypothesis and possible results. The objective is to contribute to improve the decision-making processes in the mining companies, in order to effectively consider the economic, social and environmental aspects of the business.”

    Diário do Comércio · Gira Betim

    Startups from SEED visit the Mining Hub

    March 18, 2019

    “Five Startups of the Minas Gerais Government Accelerator (SEED – Startups and Entrepreneurship Ecosystem Development) from several segments, with solutions for business areas, visited last Tuesday (March 12th) the Mining Hub in Belo Horizonte (MG), to strengthen the relationship and exchange experiences on the market.

    Among the startups were: MiningMath, which markets a solution that supports decision making at a strategic level specific to the mining industry, from designing a project to optimizing the value chain during the development phases; Recrutamento Inteligente (“Intelligent Recruitment”), which facilitates the management of intellectual capital; The Mindset, which acts in the management of mental health professionals; Cargo Sapiens, which offers management and compliance solution for international logistics; VG Resíduos, which connects producers and consumers of waste.”

    Portal da Mineração

    Startup develops software that facilitates data management of deposits

    December, 2018

    “MiningMath has developed mine data management software that promises to reduce costs and anticipate the environmental consequences of mining. The tool crosses and organizes data from different departments of a mining company to generate scenarios with the possible geological, environmental, economic and social impacts of an action in a given location.”

    DCI

    From left to right: the engineer Matheus Ulhoa and the partners of MiningMath, Fabrício Ceolin and Alexandre Marinho (MiningMath).

    Industry 4.0 increasingly present in areas of the primary sector

    November 6, 2018

    “With the advent of industry 4.0, traditional and grassroots activities such as mining start to gain new ground in Brazil and the world. The use of innovative technologies such as data science, internet of things and optimization of processes breathe new life into the industry, promoting added value gain to the minerals extracted in the country, such as iron ore, whose largest production is concentrated in Minas Gerais.

    Proof of this is that mining startup MiningMath recently won first place at the MineTech Mining Solutions Technical Challenges Challenge with an innovative technology to simulate scenarios to support strategic mining decisions using modern techniques of programming.”

    Diário do Comercio

    Event in the Capital discussed a more sustainable performance, besides the optimization of processes and improvements in the performance of the sector (Vale Agency).

    Mining Software developed by UFMG Alumni is awarded in competition in Russia

    October 25, 2018

    “The software combines variables ranging from geological aspects to economic data and legal, environmental and social constraints.

    An innovative technology designed to simulate scenarios to support strategic decisions in the mining field, developed by former students of UFMG, Alexandre Marinho and Fabrício Ceolin, won the first place in the MineTech: Technical Challenges and Mining Solutions Challenge in early October , held in Moscow during the 14th Russia Exploration and Mining Forum.”

    UFMG · Mining.com · Hoje em Dia · SIMI · Notícias de Mineração · FUNDEP

    Fabrício Ceolin (left) and Alexandre Marinho (right) are the founders of the MiningMath startup (Personal Files).

    Alexey Tsoy to present at the MiningMath 2018 Creating Value in Mining Conference

    October 16, 2018

    CSA Global Principal Consultant-Corporate, Alexey Tsoy will present at the upcoming MiningMath Creating Value in Mining; Strategy Optimisation through Data Science Conference on November 6 2018 at the Museum of Mines and Metals in Belo Horizonte, Brazil.

    Alexey will present on Strategic Schedule Optimisation.

    Winner of the 2018 MineTech Competition

    October 16, 2018

    Alexey Tsoy, Principal Consultant – Corporate and Business Development at CSA Global, is the winner of the 2018 MineTech Competition: 2nd Mining Technical Challenges and Solutions Competition’s at the 14th Russian Mining and Exploration Forum held between 2-4 October in Moscow, Russia.

    “Alexey presented on Strategic Schedule Optimisation; forming part of this year’s MINEX conference ‘Building Up Innovative Excellence in Mining and Exploration’ […]. The proposed approach is based on a software called SimSched Direct Block Scheduling (DBS) […]. The software applies a simplistic approach where a block model is optimized on economic values assigned to each block in pre-processing. The simplicity allows a great degree of flexibility on assigning the economic values. Moreover it allows definition of variable costs based on time, residence time, or indeed any other parameter that can be calculated and limited in time.”

    “The software that could revolutionize mine planning”

    April 9, 2018

    This publication states that the traditional planning includes the discount rate, production limits, blending and other technique-economical variables during advanced stages of planning and SimSched brings the possibility to reduce the number of iterations for planning.

    “A unique and flexible tool for planning, which currently doesn’t exist in the industry. ” — Fabián Lemus, Senior Long-term Mine Planning Engineer.

    The Scheduling The Sequences (STS) project, conducted by Antofagasta Minerals, includes a co-development of SimSched DBS, where major tests will focus on Minera Centinela, a multi-pit project with multiple processing streams that might benefit from an integrated plan. This will provide flexibility to analyze and evaluate development scenarios and the inclusion of new projects.

    Access the original content.

    Integrated approaches in the industry

    How digital innovation can improve mining productivity

    With profits down, miners are focused on improving their productivity. Digital innovation could provide a breakthrough.
    Read more.

    Digital Transformation—The future of Mining

    In a challenging market, the digital transformation of mining companies has become a business imperative—leveraging technology to improve processes aligned to value.
    Read more.

    A tool for these times

    Considering the current resurgence in commodity prices, every mining company should concentrate on strategy optimization to ensure that operations move from a cost-focused mindset to one centred on value maximization in order to reap the benefits of the upturn in the mining industry.
    Read more.

    Waste Dump Sequencing with SimSched

    SimSched Direct Block Scheduler (DBS) is an open pit optimization package that selects maximum NPV pit shell while generating a mining schedule. Read more.

    Innovation and Technology to Improve Open Pit Mine Plan and Design Optimisation

    In mining, the change in technology, i.e. processes and software, does not happen fast; it takes its time, around 10 to 15 years, as miners are very […]. (LinkedIn article)

    Academic Partners

    Mine Optimizations and SimSched DBS is a friend of yours now. SimSched DBS is the most powerful tool to achieve and compare NPV results of the pit. Read more.
     

    Publications

    2018 Sensitivity Analysis applied to operational parameters, IFG, Brazil (Portuguese)

    2017 NPV Analysis as a function of the discount rate and cost of rehandling implementing SimSched DBS to open pit mining, Universidad Nacional de Colombia, Colombia (Spanish)

    Courses

    2017 Universidad del Azuay

    2017 Universidad de San Luis, Argentina (Spanish)

    We are on LinkedIn

    For decades, the mining industry has dealt with Mine Planning as a step-by-step process. This traditional technology was established in an intelligent manner given the technological limitations of that time.

    We’d like to share the news that the main limitations of open pit mine plan and design based on Lerchs-Grossmann have been surpassed by a novel and innovative technology.

    Spanish | Portuguese

    The concept of Pit Optimization is becoming obsolete. Here we bring a comparison showing the variety of outputs you can obtain from a single optimization.

    Spanish | Portuguese

    Common Issues

    Warnings

      Warning 1101

    The Force Mining surface (S1) was initially above the topography. Correction applied: S1 has been projected onto the level of the topography.

      Warning 1102

    The Force Mining (FM) surface used was initially below the economically viable last period surface. Correction applied: FM depth has been reduced by increasing its Z-level.

      Warning 1103

    The Force Mining surface (S1) used for period X+1 was initially above the one (S2) used for period X. Correction applied: the elevation of S1 has been reduced to match S2.

    Note: ‘X’ can represent any period value (1, 2, 3, etc.)

      Warning 1104

    The Force Mining surface (S1) employed was initially below the Restrict Mining surface (S2) during period X. Correction applied: S1 has been adjusted upward to align with the elevation of S2.

    Note: ‘X’ can represent any period value (1, 2, 3, etc.)

      Warning 1105

    The restrict mining surface used violated slope constraints of the last period surface. Correction applied: the surface has been adjusted accordingly.

      Warning 1106

    The force mining surface used violated slope constraints of the last period surface. Correction applied: the surface has been adjusted accordingly.

      Warning 1107

    The force mining surface (S1) imported for the last period surface was initially below the restrict mining surface (S2) imported for the same purpose. Correction applied: the elevation of S1 has been increased to align with that of S2.

      Warning 1108

    The restrict mining surface (S1) imported for the last period surface was initially above the force mining surface (S2) imported for the same purpose. Correction applied: the elevation of S1 has been decreased to align with that of S2.

      Warning 1109

    The force mining surface (S1) did not initially provide sufficient space to meet minimum operational constraints. Correction applied: adjustments have been made to S1 in critical areas.

      Warning 1110

    The restrict mining surface (S1) was initially below the origin Z level. Correction applied: S1 has been projected to align with the origin Z level.

      Warning 1111

    The force mining surface (S1) was initially below the origin Z level. Correction applied: S1 has been projected to align with the origin Z level.

      Warning 1112

    The edges of the restrict mining surface (S1) were initially below the topography level. Correction applied: the edges of S1 have been projected to align with the topography level.

      Warning 1113

    The edges of the force mining surface (S1) used were initially below the topography level. Correction applied: the edges of S1 have been projected to align with the topography level.

      Warning 1114

    The restrict mining surface (S1) used was initially above the topography. Correction applied: S1 has been projected to align with the topography level.

      Warning 1115

    The restrict mining (RM) surface utilized was initially above the economically viable last period surface. Correction applied: the depth of RM has been increased by reducing its Z-level.

      Warning 1116

    The restrict mining surface (S1) used for period X-1 was initially below surface S2 used for period X. Correction applied: the elevation of S1 has been increased to match that of S2.

    Note: ‘X’ can represent any period value (1, 2, 3, etc.)

      Warning 1201

    Slope angles for the restrict mining surface have been adjusted for period X.

    Note: ‘X’ can represent any period value (1, 2, 3, etc.)

      Warning 1202

    Slope angles for the force mining surface have been adjusted for period X.

    Note: ‘X’ can represent any period value (1, 2, 3, etc.)

      Warning 2301

    The vertical rate has been adjusted by N meters to adhere to production and/or surface constraints in period X. Note: ‘X’ can represent any period value (1, 2, 3, etc.). ‘N’ can represent any value in meters.

      Warning 2401

    The production capacities informed may force a mining schedule with more than 100 periods. Do you want to continue anyway?

    This warning message is designed to prevent users from spending time on incorrect scenario setups by flagging potential inconsistencies with the block model data.

    It often occurs due to typos, such as inputting values with lower magnitudes than intended, for instance when production is meant to be 10 million tons per year but is mistakenly entered as 10 thousand tons per year.

    MiningMath conducts basic validation checks before each scenario optimization, comparing parameters like production limits with the available material. If values are significantly lower than expected, it triggers a warning, as it may result in an unrealistically long mine life, typically over 100 years.

      Warning 2402

    The block model must have at least two Economic Value fields, one for process and another one for dump.


    MiningMath requires at least one process and one dump destination with their respective economic values. You should analyze your model and make sure to have these fields.

    The economic value calculation is one of the most important procedures in MiningMath. Any error in the formula could cause incompatible results and even increase the complexity and runtime due to wrong assumptions derived from these values. The main validations at this step can be done by evaluating the minimum and maximum values. Once the data has been imported, you can also perform a data validation procedure.

      Warning 2403

    The pre-defined scenarios with the Marvin deposit can appear with red warnings.

    This means that, during the installation process, MiningMath could not find a folder to place them. Therefore, to run a scenario, just click on it, choose where its files should be saved, and save them.


    It is important to mention that the Marvin block model is available here, thus if you lost it or want to import it again, feel free to do so.

      Warning 2404

    Error parsing surface.

    Usually, when you import an invalid surface, the corresponding field will be shown in red. By hovering the mouse over the field, you can see the related message.

    The first common issue here is importing a surface which does not match the block model limits. To check this issue, verify your origins and check your coordinates by following the steps mentioned on this page.

    An additional error is related to the names of the headers of the surface file, which must always be “X, Y, Z”. Thus, if they have any other name or type, correct them accordingly (a small sketch of such a check follows).
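    A quick sanity check of a surface file can be scripted before importing it; the coordinate limits below are placeholders for your model's actual extents.

```python
import pandas as pd

# Quick sanity check of a surface file before importing it: the header must be
# exactly X, Y, Z and the coordinates should fall within the block model limits.
surface = pd.read_csv("surface.csv")

assert list(surface.columns) == ["X", "Y", "Z"], f"Unexpected header: {list(surface.columns)}"

x_min, x_max = 400_000, 410_000        # placeholder model extents
y_min, y_max = 7_000_000, 7_010_000
inside = surface["X"].between(x_min, x_max) & surface["Y"].between(y_min, y_max)
print(f"{(~inside).sum()} points outside the block model limits")
```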

      Warning 3501

    MiningMath tries to communicate with Excel and fails.

    The reason why this happens is that there are additional windows (like a login screen or an activation failure) being opened before the worksheet, which interferes with MiningMath's generation of reports.

    To find out, close all Excel instances completely and reopen Excel. If an additional window appears before the worksheet, try to solve what that message asks.

    Floating-point numbers

    Same scenario, different results?

    It is possible to find different results for the same scenario running on different computers. Algorithms based on Mixed Integer Linear Programming (MILP) depend on third-party solvers, and their results may differ in terms of floating-point precision from hardware to hardware.
     
    Given that MILP is based on multiple LP executions, precision differences may accumulate over the sequence of operations performed with floating-point numbers. It is expected that results may differ, but they should be equivalent in terms of NPV. If the physical results are too different, this means the mine has the flexibility to operate in both ways without a big impact on NPV.
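    A tiny illustration of how the order of floating-point operations changes the result:

```python
# The same additions accumulated in a different order give slightly different
# floating-point results, which is how precision differences appear across
# machines and solvers.
a, b, c = 0.1, 0.2, 0.3
print((a + b) + c)                  # 0.6000000000000001
print(a + (b + c))                  # 0.6
print((a + b) + c == a + (b + c))   # False
```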

    For an in-depth explanation on floating-point arithmetic from a computer science perspective please see here.

    License Information

    MiningMath relies on online activation through an internet connection or, as a contingency, on an identification code from your hardware.

    If you are experiencing issues with activating your license, you can find the license information on the licensing screen.

    1. Identify your error number/message.

    2. Get your hardware identification code (Host ID), by the two following options:

      a) Copy the text disclosed at the "Information for support", if it is available to you.

      b) Execute the procedure below, explained in this video.

    3. Send us the error number/message generated and the identification (Host ID) by filling this form.

    Note: If your error number is -3001 get your solution at this page.
    Note 2: The license revoke procedure, available only for commercial licenses, was introduced after version v2.0.24. Therefore, make sure to have an updated version before revoking the license on your computer and activating it on another one.

    Progress Bar

    The percentage displayed on the progress bar is an estimate, as mathematical programming can be unpredictable.

    The pre-processing steps, in which the algorithm eliminates useless material, might keep it stuck at the initial percentages (2%, 4%, etc.) for a while, but after that the optimization can get faster.

    MiningMath can handle virtually any model size. It has successfully run client models beyond 10M blocks without reblocking, which might take a few hours to finish. The runtime grows with the number of blocks, destinations, periods, constraints in use, and variables imported. Therefore, the combination of these aspects is directly related to the complexity of the deposit.

    Check for any ‘floating blocks‘ that are not connected to the model’s topography, as shown below. These regions can impact optimization, so removing them might help MiningMath to function properly.

    Example of floating blocks.

    Geometry

    Slope Angles

    How slope angles are generated

    Slope angles (Figure 1) are one of the main concerns in the mining industry due to safety and all the operational parameters they can influence. Therefore, this is one of the most important parameters in the constraints hierarchy. Usually, it can vary based on the time frame (short/long-term), rock type, lithologies, mine sector, depth, geotechnical domains, etc. Thus, it is important to have these assumptions clear in order to use this parameter wisely, following what you expect for your project.

    The traditional workflow for open-pit optimization, design, and production scheduling may incur a discrepancy between the parameters used for the optimization stage and the design parameters. Ramp design and positioning is a common example of a step that will affect the Overall Slope Angle (OSA). This often brings the need for an iterative process of re-optimizing the same scenario based on the pit designed.

    MiningMath works with “surface-constrained production scheduling”, defining surfaces that describe the group of blocks that should be mined, or not, instead of the “block precedence” method. In this approach, each surface is a feasible solution, considering the productions required, and each point can be placed anywhere along the Z-axis. This flexibility allows the elevation to be above, below, or matching a block’s centroid, which ensures MiningMath’s algorithm controls the OSA precisely, with no error, something that would have an even higher impact on transition zones.

    The following video illustrates how MiningMath’s algorithm handles the definition of slope angles, especially in transition zones.

    Figure 1: Slope angles definition.

    Video 1: Discover how slope angles are generated.

    Setting up Slope angles

    MiningMath allows the user to handle slope angles in two ways: (I) a block-by-block definition inside the model; or (II) a default value, applicable to all blocks or to the blocks from (I) for which there is no slope information. It is worth mentioning that, for multiple variable slope assumptions, you can add several columns before the importation, named Slope 1, Slope 2, (…), Slope N. Then, on the interface, select the proper information for each scenario, which avoids the back-and-forth of editing the block model several times.

    Block by Block field

    This field represents the column(s) that has/have been assigned to the slope during the importation, for varying slope angles on each block. This allows a high level of flexibility to use any specific criteria. These possibilities can also comprise bi-dimensional and tri-dimensional variations, beyond linear and non-linear functions (Figure 2).

    Default values

    On the interface, you have the option of selecting the information field that will be the main rule for variable slope angles, or of selecting <none> to define a constant value that will be used for the entire model. The “Default value” set is also used when the chosen field has missing information in the column (Figure 3).

    Slope angles in the Short-term

Short-term planning is a great opportunity to use the same platform that the strategic mine planning team is using, which allows the company to enhance the adherence/reconciliation of projects by choosing a surface and using it as force and restrict mining to refine everything inside it.

The approach could consider a surface already designed with ramps, or any other surface from MiningMath that respects the Overall Slope Angle (Figure 4) in the required time frame, so that you could use steeper Bench Face Angles based on the operational blasting parameters. To use such a feature, apply the suggestion from item 2.2 and play with different angles according to your project capabilities. This methodology gives the algorithm flexibility within the constraints hierarchy to find better results.

Figure 4: Operational Bench Face Angle and Overall Slope Angle difference.

    Geometric Constraints

    In a mining project, the mine planner must accurately dimension each unit operation to determine the most suitable set of equipment for the existing conditions. With MiningMath, operational parameters are integrated as constraints within the objective function, rather than being applied post-pit optimization. This methodology ensures solutions that adhere to operational criteria while maximizing NPV, leading to more effective data utilization and uncovering opportunities that might be overlooked with manual steps and arbitrary assumptions.

The Geometric tab is the place to set minimum mining and bottom widths, mining length, and vertical rate of advance, whose values are applicable to every period. The user can also use surfaces to define operational constraints in compliance with period ranges, which can limit, force, or achieve an exact shape, based on the constraints hierarchy.

    There are two types of widths restrictions that can be created:

    1. Mining Width: distance from a pit to another.

    2. Bottom Width: bottom minimum area.

    Currently, MiningMath does not mine partial blocks. As a consequence, the software will round up any widths to cover the next integer block.
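As a rough illustration of this rounding (a minimal sketch, not MiningMath's internal code), the width effectively enforced can be thought of as the smallest whole number of blocks that covers the requested value:

```python
import math

def round_up_to_blocks(width_m: float, block_size_m: float) -> float:
    # Smallest whole number of blocks that covers the requested width.
    return math.ceil(width_m / block_size_m) * block_size_m

# Illustrative values: a 25 m width over 10 m blocks behaves like 30 m.
print(round_up_to_blocks(25.0, 10.0))  # 30.0
```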

    Widths in the Geometry tab
    Minimum widths in the Geometries tab.

    It is also possible to define a Vertical Rate of Advance for each period range. The VR will be rounded up to cover the next integer block.

Vertical rate of advance in the Geometry tab
Vertical rate of advance in the Geometric tab

    Mining length (ML)

    A minimum horizontal distance that should be respected from a pit to another in every period can also be defined in the Minimum Mining Length field. Currently, this is only available in the insider version.

    Minimum mining length option in the Geometries tab.

    The figures below show a simplistic meaning of each width/length available and the vertical rate of advance.

    For each period range, the user can consider:

    1. 1 force mining surface.

    2. 1 restrict mining surface.

Each surface file is valid from period A up to the end of period B, as depicted below.

Surface mining limits: force and restrict mining.

The following video shows how the variation of operational constraints impacts your solution and how you can take advantage of these parameters to find results closer to reality.

    Operational constraints

    Widths and lengths

In MiningMath, widths and lengths are constraints within the objective function, which means they are accounted for in the optimization instead of being considered only at the pit design stage, when roads and accesses are drawn to make the pit operationally feasible.

The definition of minimum widths is a very useful feature to obtain operational results and play with different geometries according to the project requirements. It is important to understand that these are very complex parameters to respect in 3-dimensional non-linear models, and they also influence the runtime of each scenario. Therefore, it is not always possible to guarantee that all minimum widths are respected, due to the deposit geometry and the constraints hierarchy. Testing different values is a great strategy to identify opportunities that could bring the best mining sequence and NPV.

Types of widths and lengths

    Bottom widths

Bottom widths (BW) define the minimum horizontal distance on the lowest floor of the pit, as seen in Figure 1. It is required to allow mining operations based on the equipment sizing. This parameter is the same for all periods, is related to adjacent slopes, and is applicable to other areas regarded as pit bottoms.

    Figure 1: Bottom width area in subsequent periods

    Mining widths

Mining widths (MW) define the minimum horizontal distance that should be respected from one pit to another in every period, i.e., the horizontal distance between the walls of two surfaces that belong to consecutive periods, as shown in Figure 2.

This feature is more complex than the minimum bottom width, since it can drastically change the pit shapes to identify the best regions. Note that wider values provide greater mining fronts and better designs for nested pits, pushbacks, schedules, or any other result that you are looking for.

    Figure 2: Mining width in subsequent periods

    Mining lengths

The Mining Length (ML) represents a minimum distance that must exist between at least two points along the walls of surfaces from two consecutive mining periods. This distance is already respected for any value smaller than or equal to the MW. Thus, this parameter extends such distance between any two points to a value greater than the MW. Figure 3 depicts an example.

    Figure 3: Example of MW and ML.

    Identifying geometric parameters

    Figure 4 shows a section view of the McLaughlin deposit, where each color represents a given period. The horizontal arrows highlight the bottom and the mining width, while others identify the vertical advance. Mining lengths cannot be depicted in this 2D representation.

    Figure 4: Bottom width, Mining width and vertical rate of advance.

    How are widths defined?

The widths inputted in the interface define the diameter (d) of a circle. As MiningMath does not mine partial blocks, the software considers the block size as a reference to decide whether d should be rounded up to the next integer multiple. The approximated circle results in a polygon whose objective is to select the centroids of adjacent cells. MiningMath then assigns the same elevation value to the selected cells to define mining surfaces.

Figure 5 shows, in sequence, an example of how a minimum width of 25 meters and a possible mining length of 50 meters could be defined over a 10 x 10 meter grid (block dimensions in the x and y directions).

    Figure 5: How MiningMath defines the mining width and mining length.
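The selection described above can be pictured with a small sketch (illustrative values only, not MiningMath's implementation): round the width up to whole blocks and collect the centroids that fall inside the resulting circle.

```python
import numpy as np

block = 10.0                                   # block size in x and y (as in Figure 5)
width = 25.0                                   # requested minimum width
d = np.ceil(width / block) * block             # rounded diameter: 30 m

# Grid of centroids for a small illustrative area.
xs, ys = np.meshgrid(np.arange(5.0, 100.0, block), np.arange(5.0, 100.0, block))
cx, cy = 45.0, 45.0                            # arbitrary centroid used as the circle center

inside = (xs - cx) ** 2 + (ys - cy) ** 2 <= (d / 2.0) ** 2
print(int(inside.sum()), "centroids would share the same elevation")
```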

    Vertical Rate

The vertical rate of advance (VR), or sinking rate, is also considered within the objective function in MiningMath. This parameter is defined as the vertical distance mined in each period. The total vertical distance mined across all periods should be consistent with the pit depth and the equipment available. In cases of infeasibility, the constraints hierarchy is respected, in which the VR is placed at a lower priority.

    Bottom width, mining width and vertical rate of advance.

    How to calculate it

    The vertical rate of advance is defined as the vertical distance, in meters, mined in each period, and it is calculated by evaluating each mining face independently, as shown in this figure.

    Complexity and recommendations

    It is important to understand that this is also a 3-dimensional non-linear model, which means that it is a complex parameter within the optimization. The VR works as an upper bound to avoid operationally unfeasible solutions. Therefore, testing different values is a great strategy to identify opportunities that could bring the best mining sequence and NPV.

The Mining Width (MW) is not mandatory when using the VR, but it plays an important role when defining its value. Together, the MW and VR define volumes of material for each mining period. A reduced MW might create additional challenges for the algorithm to comply with the VR. Therefore, it is important to test different values, especially when the VR is not being fully respected.

    Vertical Rate: Definition, Hierarchy of Constraints & Complexity

    Example evaluation

    To evaluate different values for the VR constraint, you can use Decision Trees. The figures below depict a base case for VR evaluation using the Marvin dataset and a respective decision tree built with different VR values.

Notice the fluctuation in NPV illustrated below, influenced by this single parameter. Additionally, note that smaller values such as 30 m (equivalent to the block height) and 60 m trigger warning violations in the generated report file. This indicates that MiningMath’s algorithm had to adapt the VR constraint for a viable solution to be achieved, keeping other indicators, such as tonnage, within limits. The next section shows how to prioritize the VR constraint when adjustments like these are necessary to achieve feasible solutions.

    NPV achieved with the Marvin dataset for different values of Vertical Rate. (*) 30m and 60m constraints were not respected in all periods and generated warnings in the report file.

The vertical rate of advance is one of the first constraints to be relaxed within MiningMath’s Hierarchy of Constraints (read more). To ensure the VR is at least closer to what you need, relax low-priority constraints manually. This way you will lead the algorithm to a more flexible scenario and a broader solution space, which may help it find a feasible solution for the new set of constraints.

Besides, if the user aims to enforce a maximum vertical rate for a given period, a flat surface constraint can be created at the achievable depth and input as a Restrict Mining surface. Beyond that, it is important to notice that even the goal of achieving “process full” could result in an infeasible solution while the VR is still respected; therefore, this feature is very sensitive to any other parameter.

    Vertical rates controlled by surface constraints

    In summary, it is crucial to assess various parameter values to gain a deeper understanding of how they can impact your project, particularly concerning geometric constraints. You can explore additional workflows presented here that can assist you in achieving better results.

    Force Mining

Force mining is used to deplete the material of an entire area (Figure 1) in a specific time frame. This feature makes MiningMath mine at least down to the surface inserted, which means that all the material inside its limits will be extracted, whether it is ore or waste. Thus, this feature can also be understood as a minimum depth that should be mined within a specific time frame.

Keep in mind that these surfaces might be adjusted during the optimization to respect the slope angles (Figure 2), which have a higher priority in the algorithm. Therefore, more material can be mined either to correct the overall slope angle or to increase the NPV.

    This functionality is commonly used to refine/keep the mining amount of a previous good surface in early periods, force a specific depth which the deposit should achieve, create custom advances, extract material to make a region available to allocate equipment, etc.

    Figure 1: Golden arrows disclosing the areas that should be forced.

MiningMath aims to assist users in applying their project knowledge to guide the algorithm towards the best decisions, which is why surfaces are among the highest-priority constraints in the hierarchy, enabling the implementation of custom geometries and operational parameters based on these smart hints. Thus, forcing surfaces might be the reason for disrespecting production limits, blending constraints, geometries, and so on, which requires the user to be careful when using these functionalities. To sum up, the material above a Force Mining surface will certainly be mined until the specified period, while what is below it will be mined only if the blocks respect all the other constraints and generate profitable results.

Figure 2: Force Mining surface and the slope angle adjustment that could happen.

The approach here was an attempt to deplete a mining front with high-grade ores in the first period of the optimization, considering the region with IY higher than 35, inside the final pit of the Data Validation (Figure 3) and limited to elevation 250. To build such a surface, the first step was to place the Z coordinates in this Excel file and then use conditional functions to define these limits (Figures 4 and 5), as in the sketch below.
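A minimal sketch of that conditional logic is shown below. It assumes the grid exported by MiningMath was enriched with a hypothetical IY index column identifying the mining front; only the Z values are changed.

```python
import pandas as pd

surf = pd.read_csv("TopographySurface.csv")    # X, Y, Z grid exported by MiningMath

in_front = surf["IY"] > 35                     # hypothetical index column marking the front
surf.loc[in_front, "Z"] = surf.loc[in_front, "Z"].clip(upper=250.0)   # force mining down to elevation 250

surf.to_csv("Mining_Front-FM.csv", index=False)
```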

With a suitable surface, an Exploratory Analysis scenario was run using the setup disclosed in Figure 6. Note that a scenario without Force Mining (Figure 7) was also run for a good comparison.

As a result, the Force Mining surface was totally mined, along with additional material to fulfill the processing capacity and adjustments to the slope angles, generating the result disclosed in Figures 8 to 10. Besides, it is always useful to compare these scenarios with the ones that did not use this approach, which generate different sequences (Figures 11 and 12) and illustrate how powerful user assumptions can be to generate better results or explore possibilities.

    Download the Mining_Front-FM.csv surface file and play on your own.

    Restrict Mining

Restrict mining is used to prohibit access to any area (Figure 1) in a specific time frame. This feature makes MiningMath ignore what lies outside (below) the surface inserted, which means that only the material above it is available to be extracted, whether it is ore or waste. Thus, this feature can also be understood as a maximum depth that could be mined within a specific time frame.

Keep in mind that these surfaces might be adjusted during the optimization to respect the slope angles (Figure 2), which have a higher priority in the algorithm. Therefore, less material can be mined either to correct the overall slope angle or to increase the NPV, since mining useless waste material does not increase the revenue.

This functionality is commonly used to refine/optimize the mining amount of a previous surface in any period, restrict the depth to a specific value that the deposit could reach, or extract the best ore in custom advances. It can also lock/prohibit an area due to concession rights, environmental issues, or even an already built stockpile, waste dump, or structure in general.

    Figure 1: Red X's disclosing the areas that should be restricted.

MiningMath aims to assist users in applying their project knowledge to guide the algorithm towards the best decisions, which is why surfaces are among the highest-priority constraints in the hierarchy, enabling the implementation of custom geometries and operational parameters based on these smart hints. To sum up, the material above a Restrict Mining surface will be available to be mined until the specified period, and what is inside it could be mined if the blocks respect all the constraints and generate profitable results. What is outside will not be mined, whether profitable or not.

    Figure 2: Restrict Mining surface and a possible pit shape respecting this constraint.

The image illustrates an example using the standard scenario scn21-PriceUp-RampUp-Protection300, available in optimizing scenarios. In the free case (left), the pit limit advanced beyond the limiting surface on the west portion, defined by the higher elevations of the cells (equal to the topography). On the other hand, applying the limiting surface (right) to restrict mining until the last period, a new pit limit is obtained, obeying the constraint imposed by the surface, as shown in Figure 3.

    Figure 3: Comparison between a free run and the use of a restrict mining scenario.

The second example is based on an attempt to extract the best material inside the Mining_Front-FM.csv surface, which mapped high-grade ores in the first period (Figures 4 to 6), as mentioned on this page. Using the suitable surface file, an Exploratory Analysis scenario was run using the setup disclosed in Figure 4. Note that a scenario without Force Mining (Figures 7 and 8) was also run for a good comparison.

As a result, the Restrict Mining surface mined the best material within the available area (Figure 9), which generates different sequences (Figures 10 and 11) and illustrates how powerful user assumptions can be to generate suitable results or explore possibilities.

    Download the Mining_Front-FM.csv surface file and play on your own.

    Combining Force and Restrict

MiningMath allows the user to combine Force and Restrict mining by using surfaces in each field, as you read in the previous pages. These features allow different arrangements due to concession rights, exchange of land with adjacent mining companies, allocation of waste material inside exhausted areas, environmental issues, and so forth.

By using them together, the user can reach the exact shape of a pit by inputting the same surface as Force and Restrict mining for the same time frame (Figure 1), which becomes the highest-priority constraint in the hierarchy. It is also possible to optimize the material between surfaces by adding different surfaces in these two fields (Figure 2), which might be adjusted either to correct the overall slope angle or to increase the NPV, as mentioned before.

    Figure 1: Using the same surface as Force and Restrict mining to reach an exact shape.

Based on these concepts, MiningMath allows you to export surfaces from the best scenario to a larger mining package, design the pit there, create the grid of points, import the designed pit, and, finally, optimize and refine as much as you can by using smart constraints. Therefore, the user has the advantage of controlling the results by guiding them according to the project requirements.

    Figure 2: Using different surfaces on Force and Restrict mining to optimize a volume within its limits.

    Surfaces used simultaneously in both fields, force and restrict mining, are interpreted as follows:

    • Different surfaces: MiningMath will force mining according to the forcing-surface and will restrict mining according to the restricting-surface.

    • Same surface: MiningMath will achieve the same format proposed by the surface in use until the end of the time frame in which they have been applied.

    This approach allows you to use different surfaces in the same time frame or split them accordingly to the goals that you want to achieve since this feature works close to what is shown in Figure 2 above.

    Figure 3: Mining sequence between force and restrict mining surfaces.

In this example, the same constraints mentioned on the schedule optimization page were used. Besides, surface 2 of this scenario was added to the force mining field in the second time frame, which means that by the end of the second period the mining should reach at least the shape imposed. The restricting surface used came from the constraints validation page and was added to the last interval, which means that the algorithm could not surpass this limit. Figure 3 discloses the optimized mining sequence volume between the surfaces inserted, and Figure 4 shows the setup of the scenario.

As a result, Figures 5 and 6 illustrate how the force mining influenced the first periods of the optimization, which mined more material due to its profitability.

Figure 7 discloses the constraint validation surface, used as restrict mining.

    Figure 8 shows the final result regarding the force and restrict mining features, which respected the surface constraints and demonstrates the capability of these features to guide results.

    Figure 4: Scenario setup.

    The following figure exemplifies how the user can take into account any pit design to make MiningMath iteratively produce more operational results, detailing a previous scenario.

    In this case, the user needs to use the same surface in both fields, force and restrict mining, and during the same period of time to reach the exact shape of the designed surface.

    Although the workflow on Figure 9 uses a designed pit, it is possible to use pits from previous scenarios as well, so that you can freeze good results and optimize further periods. Below are some examples of it:

    • Getting the same: Achieve the same final pit of a previous scenario.

    • Using the traditional approach: Define a pushback from 5 to 5 years.

    Figure 9: Using a designed surface as force and restrict mining.

This powerful workflow allows a lot of flexibility, so that the user can guide solutions based on insights and previous knowledge of the deposit. The concept is a unique feature of MiningMath, and such an approach can easily be done by following the steps of creating surfaces and validating them in the footer below.

    Creating Surfaces

The surface files in MiningMath are a set of points (Figure 1) which are aligned with the block centroids on the X and Y axes (Figure 2). The easiest way to avoid any error message is to use a topography surface created by MiningMath in the Data Validation, for instance, and then manipulate only the Z coordinates.

Usually, the surfaces designed in traditional mining CADs are based on contour/drawn lines and point triangulations (Figures 3 and 4). Therefore, they are continuous figures which cannot be recognized by MiningMath.

Here is the step-by-step process to create surfaces for MiningMath:

• Import the TopographySurface.csv (Figure 5), which is a grid of points, into the CAD of your mining package. The CSV file mentioned can be obtained by validating scenarios or from any other execution in MiningMath.

    • Manipulate only the Z coordinates and project them in a way that fits your needs. There are 3 main options at this step:

  • Use a polyline (Figure 6) to draw the region, select the points inside the drawn polygon, and find the option in your mining package that allows you to place them at the elevation you want.
  • Use your triangulated designed surface (Figure 7) by placing it in your CAD viewer along with the TopographySurface.csv generated by MiningMath. Select the imported point set in your CAD and find the option in your mining package that allows you to project all points onto the elevation of your designed surface (Figure 8).
  • Use only Excel or any spreadsheet program. Open the TopographySurface.csv, filter the X and Y regions where you want to change the elevation, and manipulate them (see the sketch after the export step below).

Export the modified point set (Figure 9) as a CSV using a different name. Open the file in Excel or Notepad to make sure that the header is correct (Figure 10) before importing into MiningMath, since it is pretty common for these exported files to come along with meaningless information that can interfere with the importation.
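For the spreadsheet option above, a minimal sketch of the same manipulation is shown below; the window limits and the target elevation are illustrative assumptions.

```python
import pandas as pd

surf = pd.read_csv("TopographySurface.csv")

# Rectangular X/Y window where the elevation should change (illustrative limits).
window = surf["X"].between(300, 600) & surf["Y"].between(400, 800)
surf.loc[window, "Z"] = 180.0                  # illustrative elevation

# Keep only the X, Y, Z header expected by MiningMath.
surf[["X", "Y", "Z"]].to_csv("CustomSurface.csv", index=False)
```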

    Figure 9: Surface file which could be imported in MiningMath.
    Figure 10: Header and disclosed data on a standard surface file of MiningMath.
• Select the points in the area that should be restricted, place them at the highest elevation of the topography, and set the rest of them at the bottom.

    • Use as a restricted mining surface.

    Figures 11 through 13 illustrate the process.

    Note: The X and Y coordinates must remain the same.

• Choose the points inside the area you want to force mine, place them at the elevation that you wish, and leave the rest of them at the topography.

    • Use as a force mining surface.

    Figures 14 through 16 illustrate the process.

    Note: The X and Y coordinates must remain the same.

    • Define your polygons or use designed surfaces, by following the methodology presented in 2. Creating surfaces Step-by-step.

• Use it as a force and restrict mining surface.

    Figures 17 through 19 illustrate the process.

    Note: The X and Y coordinates must remain the same.

    Validating Surfaces

The best way to generate surfaces is to start from a topography surface created by MiningMath, which is generated after the Data Validation, and then manipulate only the Z coordinates. Make sure to meet all the surface requirements disclosed here.

    Surface Requirements
    • Headers must be named as X, Y, and Z. These files have to obey an ascending value order in each one of the axes.

• Same size as the block model; item 1.1 of this page explains it.

• Its points must be aligned with the blocks' centroids; items 1.1 and 1.2 help you to understand it.

• Defined as a grid of points; the visual validation in item 1.2 shows it.

    • To be in the CSV format.

    Figure 1: Surface file format of MiningMath

Using the values of the Marvin deposit file in Figure 2, we find that the block model centroid boundaries begin at XMin = 15 and YMin = 15 and have maximum centroid values of XMax = 5,295 and YMax = 8,265, which can be confirmed by checking the topography generated by MiningMath.

Then, it’s time to search for your surface limits; in this example the file chosen was “Surface-RM-offset-300m”. The easiest way to find them is by filtering the axis values, as shown in Figure 3, which disclosed XMin = 15, YMin = 15, XMax = 1,815, and YMax = 1,785. Therefore, even though this file was also based on the Marvin deposit, it is not possible to use it, since the surface is smaller than the block model in place, which means that it does not have the same size as the block model file.

It is always worthwhile to check the limits of the designed surface (Figures 4 to 8) if we face an error. Remember that everything, even the elevations, must be within the boundaries of the block model; also check the recommendations for your surface.

    Figure 4: Surface (grid of points) smaller than the block model.
    Figure 6: Surface (grid of points) bigger than the block model.
    Figure 7: Surface (grid of points) with missing values.
    Figure 5: Surface (grid of points) smaller than the block model.
    Figure 8: Surface (grid of points) unaligned with the centroids.
    Figure 9: Correct Surface (grid of points) aligned with the centroids and same block model size.

The maximum centroid value can be calculated for each axis using the equation:

    Maximum centroid value = OX + (NX*DX)-(DX/2)

    Where:

    • OX is the origin of the X-axis;

    • DX is the block dimension of the X-axis;

    • NX is the number of blocks in the X-axis.

Note: (DX/2) is the distance that should be added to the origin to find the centroid of the first block, or subtracted from the block model limit to find the last centroid, since origins are based on the corner of the block model.
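A minimal numeric check of this formula is shown below; the origin, block size, and block count are assumptions chosen only to reproduce the Marvin limit quoted above.

```python
def max_centroid(origin: float, n_blocks: int, size: float) -> float:
    # Origins refer to the block model corner, so the last centroid sits
    # half a block before the model limit.
    return origin + n_blocks * size - size / 2.0

# Assumed values: OX = 0, DX = 30 m, NX = 177 reproduce XMax = 5,295 m.
print(max_centroid(0.0, 177, 30.0))
```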

The following video presents how to validate surfaces numerically and visually. This initial verification is what enables you to understand what might be happening and where the error is. The example used regards the message “Error parsing surface: Coordinates aren’t properly spaced”, but it fits any case where the surfaces used are causing a problem, especially when a red box error shows up.

In this case, as there are a lot of values that do not match the correct ones for X and Y, the quickest and easiest way to fix it is to restart from the beginning.

    Video 1: Validating surfaces.

    Surfaces as a Guide

The easiest way to work with surfaces is by manipulating the Z coordinates of the topography generated by MiningMath, while keeping the same values for the X and Y coordinates.

The surfaces generated in MiningMath always have the same format. Each one of them has an equivalent number of lines. Moreover, the data follows the same order from the first row to the last one. Hence, you can take the topography file and use it as a guide. This also facilitates: 1) conversions to original coordinates from other software; 2) filtering of regions and pits; 3) creation of force and restrict mining files; and many other options using this concept.

    It is worth mentioning that this approach could be easily done by using a simple worksheet and loading the CSV files generated in the viewer for further analysis.
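Because every surface keeps the same row order as the topography, they can be compared line by line. A minimal sketch with illustrative file names:

```python
import pandas as pd

topo = pd.read_csv("topography.csv")           # guide file from Data Validation
pit = pd.read_csv("surface_period_05.csv")     # any other surface exported by MiningMath

depth_mined = topo["Z"] - pit["Z"]             # positive where the schedule digs below topography
print((depth_mined > 0).sum(), "points lowered; maximum depth:", depth_mined.max(), "m")
```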

    If you are considering geometries, mainly the Mining Width with surfaces imported from a different software package, there will be conflicts between the geometric criteria of MiningMath and the geometric criteria from the surfaces imported.

You must give some freedom for the last period so that MiningMath can also optimize the number of periods. This means you should use <end> instead of a locked period range, such as from 16 to 16, for instance.

In order to add more material from deeper areas, you can use a base surface and increase its depth where you wish, getting closer to what the project requires. By focusing on the main areas in early periods, you provide tips to the algorithm so that it can understand your approach through surfaces. The following steps reproduce an efficient workflow to promote this optimization:

1. Download this Excel file to use as guidance for the following steps.

    2. Insert the data in the yellow cells based on your block model information.

    3. Paste the coordinates of the topography surface

4. Define a new column, plan Z, to represent the Z coordinates of a surface plan. If you are not sure what the plan Z is, you can import a test surface in the viewer and identify the elevation below which you want to force the bottom area.

5. Create a new column to represent the Z coordinates of the restrict mining surface. This will be used to identify the maximum amount of ore that could be extracted while respecting the geotechnical aspects. Finally, you can create a condition to define the restrict mining Z as the bottom Z if it is below the plan Z, or as the topography Z otherwise (see the sketch after the note below). The figure below depicts this concept and the respective formula in Excel.

    6. Set up your scenario and run.

Note: As a result, you will get the maximum potential that can be extracted below the chosen elevation.
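A minimal sketch of the condition from step 5; the column names and the plan elevation are illustrative assumptions.

```python
import numpy as np
import pandas as pd

surf = pd.read_csv("guide_surface.csv")        # assumed columns: X, Y, TopoZ, BottomZ
plan_z = 120.0                                 # illustrative plan elevation

# Restrict mining Z: bottom Z where it falls below the plan, topography otherwise.
surf["Z"] = np.where(surf["BottomZ"] < plan_z, surf["BottomZ"], surf["TopoZ"])
surf[["X", "Y", "Z"]].to_csv("RestrictMining.csv", index=False)
```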

1. Place the Z coordinates of the base surface whose depth you want to increase, to extract more material at the bottom.

2. Paste the coordinates from the maximum potential scenario.

3. Calculate the difference between them.

4. Wherever the maximum potential surface is below the base surface, keep the lowest elevation. Therefore, the mixed surface will include the additional ore that can be extracted at the bottom along with the base surface used (see the sketch after the note below).

5. Use the mixed surface as force and restrict mining in the period range in which you want to achieve it.

Note: After this step, MiningMath will generate operational surfaces so that you can use them in your projects considering the required time frames.
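A minimal sketch of the mixed surface from steps 1 to 4, keeping the lowest elevation point by point; the file names are illustrative.

```python
import numpy as np
import pandas as pd

base = pd.read_csv("base_surface.csv")                 # surface whose depth will be increased
potential = pd.read_csv("max_potential_surface.csv")   # output of the maximum potential run

mixed = base.copy()
mixed["Z"] = np.minimum(base["Z"], potential["Z"])     # lowest elevation wins where the pit goes deeper
mixed.to_csv("MixedSurface.csv", index=False)
```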

     

    Surface Constraints

MiningMath uses surface-constrained mine production scheduling, which is an improvement of the idea proposed by Marinho (2013). Surfaces are one of the most important constraints, allowing the user to impose their manipulations and knowledge to guide the optimization process. They can be used to force areas for back-filling operations, to allocate an in-pit crusher, to restrict an area considering different offsets, and to show the economic impacts of preserving or not a given area or community. They also allow incorporating an operational mine design as a requirement for a given time frame. For example, the mine design for the current year could be a mandatory requirement for the first period, while the rest of the mine sequence would have a new chance to be re-optimized and find a different sequence with more long-term value. The following pages unlock all the possibilities of the use of such features.

    Internally in the algorithm:

    • To define slope angles and eliminate geotechnical errors, present in the blocks precedence method (Beretta & Marinho, 2014, 2015).

    • To handle geometric parameters and comply with minimum widths and maximum vertical rate.

    Force mining illustration. Force mining could be understood as a minimum depth to be mined.
    Figure 1: Force mining could be understood as a minimum depth to be mined.

    As optimization inputs:

    • To force mining and achieve a minimum depth, geometry, or area within a given time frame.

    • To restrict mining and ensure unavailable areas will not be considered as part of the optimization within a given time frame.

• To force and restrict mining to achieve a specific design or guide the optimization.

    Restrict mining illustration. Restrict mining could be understood as a maximum depth achievable.
    Figure 2: Restrict mining could be understood as a maximum depth achievable.

    As optimization outputs:

    • To outline the mine sequence throughout the Life of Mine that maximizes the Net Present Value.

• Outputs are a consequence of the optimization, reflecting each set of project assumptions, constraints, and parameters. Since it is unconstrained by pushbacks, each run can produce a different sequence of extraction, unlocking hidden opportunities.

Force and restrict mining illustration. Force and restrict mining used together can represent the minimum depth to be mined and the maximum depth achievable.
Figure 3: Force and restrict mining used together can represent the minimum depth to be mined and the maximum depth achievable.

Surface formatting is simple, and any surface output from MiningMath can serve as a starting point for further manipulations or even for validations. It is important to mention that they are exported from, and imported into, MiningMath in coordinates.

    • To have headers named as X, Y, and Z. These files also obey an ascending value order in each one of the axes.

    • To have the same size of the block model, which means it should not exceed the block model dimensions.

    • To have its points aligned with blocks' centroids in the X-Y plane.

    • To be defined as a grid of points.

    • To be in the CSV format.

    To avoid any mistakes, manipulate an output surface from MiningMath instead of creating one from scratch.

    1. Run any scenario to obtain the topography file in MiningMath’s format.

    2. Import the topography.csv, created by MiningMath, on a software able to manipulate it graphically.

    3. Select points inside/outside a polygon. Move them up/down accordingly to the objectives to force or restrict mining. Points should be moved only up and down, along the Z direction.

    4. Once the surface is ready, move it back to the original coordinate system.

    5. Use it on MiningMath.

    • X and Y coordinates should remain the same, with the same spacing between each pair of points.

    • For rectangular areas, a spreadsheet application is suitable for this task.

Surfaces are imported in two tabs of MiningMath: Geometric and Overview. Figure 1 zooms in on the operational constraints from the Geometric tab. The main variables to use this feature are mentioned in Figure 4, which illustrates that surfaces are imported considering:

    • The purpose of forcing/restricting mining.

    • The period range when each surface is applicable.

  • MiningMath automatically defines a single period range from “1” to the end, and the user can also add custom intervals.
    Geometric constraints.
    Figure 4: Geometric constraints.

    In the example from Figure 5, the image highlights the fields to apply:

• A restricting-surface valid for periods 1 and 2 (in green), which means that it will be respected until the end of the second year.

• A forcing-surface valid for periods 1 to 5 (in blue), which means that the area has to be mined until the end of period 5.

    General constraints and fields related to surfaces.
    Figure 5: General constraints and fields related to surfaces.

    Video 1: Surface Constraints: The ultimate guide

    Tips and Tricks

    Labs

Starting from version 3.0.8, a new set of small apps is provided in the MiningMath Labs section. These apps provide quick solutions to common problems, for example changing input CSV files.

    How to access it

    To use the Labs section for the first time click on the Labs button. You might be prompted to select a folder with writing permissions in which the apps will be stored.

Once you have selected it, wait a few seconds until it opens in a new window as depicted below.

    Example app

To run any app, just double click its name. For example, if you double click the csv_colum_operations script, a new small window will pop up asking you to select a CSV file to perform alterations.

Once a file has been selected, you will have the option to add or remove columns, get column statistics, and save columns.

    Column options with csv_colum_operations script
    Summary statistics of a Process column in a csv file.

    From now on you should be able to play with all the options and all the apps and see if there is anything that can aid in your project.

    Reblocking

Reblocking is a method used to decrease the number of blocks in a block model by combining smaller blocks to create larger ones. For example, if your blocks have dimensions of 5 x 5 x 5, you could increase them to 10 x 10 x 10; since each new block groups eight of the smaller ones, the block count drops to roughly one eighth of the original dataset size.

Note: when reblocking your model, it is important to evaluate dilution aspects, which can be lost by increasing the block size.

    Improving runtime

    Reblocking can significantly reduce optimization runtime. Users have observed substantial improvements in runtime by implementing double, triple, or even quadruple reblocking. For example, feedback indicates that for a 32M blocks model, optimization runtime decreased from 36 hours (with double reblocking) to 12 hours with triple reblocking, and further to just 4-5 hours with quadruple reblocking.

    MiningMath provides an app in its MM Labs section that is able to reblock your block model. An example is provided below.

    Reblocking with MM Labs

    Open the Labs section in the main menu as depicted below. Note: You will need at  least version 3.0.8 to start using the MM Labs applications. More about Labs can be seen here.

    A reblocking application should be available. Double click on it to open the app.

    You will be prompted to select the csv file of your block model. Afterwards, you will need to inform the coordinate columns, model dimensions and desired reblocked dimensions.

    Based on the columns of your model, you will be able to indicate which columns should be summed, averaged or weighted averaged. Lastly, you will need to indicate the output csv file. This file needs to be created beforehand. 
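For reference, the aggregation the app performs can be sketched in a few lines of Python. This is a minimal sketch with illustrative column names, not the MM Labs implementation: indices are merged two by two in each axis, tonnage is summed, and grades are tonnage-weighted.

```python
import pandas as pd

model = pd.read_csv("model_5m.csv")                    # assumed columns: IX, IY, IZ, Tonnage, CU

for axis in ("IX", "IY", "IZ"):
    model[f"{axis}_new"] = (model[axis] - 1) // 2 + 1  # 1-based parent block index

def aggregate(group: pd.DataFrame) -> pd.Series:
    tonnage = group["Tonnage"].sum()
    return pd.Series({
        "Tonnage": tonnage,                                     # summed
        "CU": (group["CU"] * group["Tonnage"]).sum() / tonnage  # tonnage-weighted average
    })

reblocked = model.groupby(["IX_new", "IY_new", "IZ_new"]).apply(aggregate).reset_index()
reblocked.to_csv("model_10m.csv", index=False)
```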

    Cut-Off Grades

The concept of cut-off grade was conceived to delineate what is ore and what is waste, considering the life of mine. Usually, manual assumptions are made to pre-define what must be considered ore or not when using the LG or Pseudoflow methodologies. These approaches do not consider the value of money through time in the decision-making process, which could generate a whole different mining sequence due to the choices of what and when it should be mined. Another challenge is faced when planning projects that involve multiple destinations, blending constraints, restricted mining areas, and all the complexities present in a real global optimization. MiningMath allows for such global optimization, and you can still combine it with your current strategies!

MiningMath has no pre-defined assumptions to identify the cut-off limit, which instead results from a mathematical optimization considering the discounted cash flow while respecting production capacities, blending constraints, vertical advances, widths, and any other assumption. Another key aspect is that MiningMath is not constrained by fixed pits, pushbacks, or phases. Instead, the mining sequence is an optimized output, a consequence of each set of parameters used, allowing more flexibility to find completely new solutions. The advantages of these differences are even more evident in complex cases with multiple destinations, or with complex constraints that could otherwise be neglected, hiding opportunities.

As the algorithm has no manual destination appointments, it always sends the less valuable blocks to the dump, considering the set of constraints imposed. This means that MiningMath tries to comply with all the constraints inserted, respecting the priority order, to define an optimized cut-off that can meet all of the requirements and increase the NPV as a consequence of a global optimization. Meanwhile, blocks that have positive values when processed can also be discarded to increase the NPV, based on the minimum economic value (economic value cut-off) going to the plant in a specific period. Scenarios without a stockpiling policy could have even higher positive values going to waste, since those blocks have no other destination. Considering these examples, MiningMath can deliver quite different results from what you were expecting from your previous assumptions. However, you can also get as close as you wish to any solution by using the approaches suggested below.

Forcing a cut-off grade in MiningMath will likely make you lose part of the advantages it can offer. However, for many reasons, mining professionals might still be willing to use it, either to compare different approaches or to understand the practical effects of using it or not. The approaches mentioned could also be used to forbid any material type at the plant.

You can create multiple columns of economic values, one for each cut-off you want to test. Then, force MiningMath to use this limit by defining very negative values for the destination you want to avoid, as shown in Figure 1 for a cut-off of 0.5. The math is:

    Economic Value Process = If [Ore_Grade] > [0.5], then [f(Economic Value)], else [-999,999,999.00]

    Figure 1: Block model setup to incorporate cut-off grades when defining economic values.
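The same rule can be applied outside a spreadsheet. A minimal sketch, with column names assumed from the formula above:

```python
import numpy as np
import pandas as pd

model = pd.read_csv("block_model.csv")         # assumed columns: Ore_Grade, Economic_Value_Process

# Blocks below the 0.5 cut-off get a hugely negative process value,
# so the optimizer will never send them to the plant.
model["Economic_Value_Process_cutoff05"] = np.where(
    model["Ore_Grade"] > 0.5,
    model["Economic_Value_Process"],
    -999_999_999.0,
)
model.to_csv("block_model_cutoff05.csv", index=False)
```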

Grades in MiningMath are controlled as a minimum and/or maximum average calculation, which means that these limits do not represent cut-off values, since the algorithm can use lower values to blend with higher ones. Thus, to use this approach, just set a very negative value on the grades below the cut-off, so that these blocks would substantially reduce the average if processed. It can also work to constrain a contaminant to a maximum limit by assigning a high grade to it, as shown in Figure 2. Once again, the math is:

    Ore_Grade1 = If [Ore_Grade] < [0.5], then [-999,999,999.00], else [Ore_Grade]

    Figure 2: Block model setup to incorporate cut-off grades using average.

Another option is to use the Sum tab to control material types. In this case, you need to create a field that calculates the mass of waste blocks only and set its maximum limit at the plant to zero, as seen in Figure 3. It is worth mentioning that this approach could increase the complexity of the optimization due to the priority order within the algorithm.

    Tonnage_Waste = If [Ore_Grade] < [0.5], then [Volume*density], else [0]

    Figure 3: Block model setup to incorporate cut-off grades using sum.
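Both the average-based and the sum-based variants above follow the same pattern; a minimal sketch with illustrative column names:

```python
import numpy as np
import pandas as pd

model = pd.read_csv("block_model.csv")         # assumed columns: Ore_Grade, Volume, Density

# Average approach: grades below the cut-off wreck the plant average if processed.
model["Ore_Grade1"] = np.where(model["Ore_Grade"] < 0.5, -999_999_999.0, model["Ore_Grade"])

# Sum approach: mass of waste-only blocks, to be capped at zero at the plant.
model["Tonnage_Waste"] = np.where(model["Ore_Grade"] < 0.5, model["Volume"] * model["Density"], 0.0)

model.to_csv("block_model_with_cutoff_fields.csv", index=False)
```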

The mined blocks file, as seen in Figure 4, is the main output for tracing the blocks sent to each destination, understanding the results, and finding the best way to enhance your reporting based on any detail that you wish to disclose. There are many useful tips to identify and understand the results generated; some of them are listed below:

• Filter results where the period mined is equal to the period processed. Check the process economic values of the blocks that were processed, identify the lowest one (which represents the cut-off value at the plant), and compare it with the highest processing economic value of the blocks that went to the dump.

• Calculate the average grade of any material using the same period-mined-equals-period-processed filter. Check whether the blocks going to the dump would have exceeded any limit at the plant; if so, even having good economic values, they would not comply with the constraints in place.

To sum up, there are a lot of validations that can be done to understand why the algorithm is making such decisions. It is also worth mentioning that any constraint can influence the results, even geometric ones, which could change the sequence and the destination of a block at any period.
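A minimal sketch of the first check above; the column names are illustrative, so adjust them to the headers of your MinedBlocks.csv:

```python
import pandas as pd

blocks = pd.read_csv("MinedBlocks.csv")

same_period = blocks[blocks["PeriodMined"] == blocks["PeriodProcessed"]]
to_plant = same_period[same_period["Destination"] == "Process"]
to_dump = same_period[same_period["Destination"] == "Waste Dump"]

print("Practiced cut-off (lowest value processed):", to_plant["Economic_Value_Process"].min())
print("Best process value sent to the dump:", to_dump["Economic_Value_Process"].max())
```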

Figure 4: Mined blocks output file.

    Integration

    MiningMath doesn’t necessarily compete against mine scheduling optimization packages. The only concept we and all research centers worldwide recommend mining companies to overcome is the one related to Pit Optimization, due to the set of problems you have to face when dealing with such technology.

    Figure 1: Lerchs-Grossmann/Pseudoflow

Therefore, even our simplest version has more features to generate nested pits with better control, so that you could design better pushbacks and define a mine schedule using your preferred tool. The reason this software can deliver such results is the Direct Block Scheduling methodology, based on a Mixed Integer Linear Programming (MILP) model and proprietary heuristics. Check other technical details and related research in our theory section.

    Figure 2: Direct to block scheduling.

    MiningMath also allows you to generate optimized pushbacks, which could facilitate your design process and guide your mine schedule while using other software packages. Notice that our tool is an optimizer that simply breaks the whole deposit (your block model) into smaller pieces, aiming for maximum Net Present Value, but respecting as many constraints as you wish:

    Figure 3: Optimized pushbacks and optimized schedules.

    A usual application of our technology is basically on strategy optimization for building decision trees. Once we run dozens/hundreds of scenarios of the yearly schedule optimization and fine-tune their parameters/constraints, our users take some of the resulting surfaces of MiningMath and use them to design some pushbacks so that they could integrate with other packages, such as MSSO, COMET, etc. This procedure could be accelerated/simplified by working with packages of years and finding shapes closer to pushbacks you’re used to.

    The outputs of our software will serve basically as optimized pushbacks, searching for maximum NPV and controlling whatever variable you consider necessary. Once we manage to import MiningMath surfaces into the other package, they will serve as guidance and they should assist the other package in finding higher NPVs. Most of these packages also allow us to predefine the blocks’ destination, if we wish to use MiningMath optimized cutoff policy. Finally, the package should have “only” the duty to do the bench scheduling, according to your short-term operational/tactical needs.

    Even if you decide, for any internal reason, that you have to use LG/Pseudoflow to define final pit limits, there is no problem at all. MiningMath is the only tool available in the market capable of performing complete strategic analysis by building decision trees unconstrained by predefined pushbacks. Please, check this short example (in Spanish) with dozens of scenarios just for the decision on CAPEX regarding processing capacities. Check also the second half of this video for a broader view on how to use the same concept to take strategic decisions on many other aspects related to mine projects or ongoing operations. I assure your managers will get much more interested in your reports once you start adding this sort of strategic analysis. Notice you could perform this sort of analysis either free of constraints or respecting any pre-existing (designed) ultimate pit or pushbacks.

    Figure 4: Multiple scenarios to build.

    Going one step beyond, we also have clients improving their adherence and reconciliation between long and short-term mine plans by using MiningMath as a complementary tool. Notice that, by using MiningMath in strategic mine planning, you could add more constraints from real-life operations, even if you decide just to check your current long-term plans. Also, notice you could place some surface limits, such as the designed surface of the next five years plan for example, and give some controlled freedom to short-term planners to rerun their mine plans, including more operational details, as long as they don’t change anything from period 6 on and they don’t affect the NPV negatively. Whenever they find an issue or an opportunity, short and long term teams have a way to collaborate and generate new joint configurations that account for all the strategic and tactical needs of the project simultaneously. All the remaining details, such as the designs, could be adjusted using the current mining packages available.

    If you wish to skip such steps and go straight to your final designed plans, we can guide you through this process, which includes a loop of running MiningMath and designing surfaces, until reaching a reasonable and operational sequence. This is a much more innovative procedure, which tends to achieve higher NPVs.

    Coordinates to index conversion

    The indices of each block represent its position in the model, indicating in which column, line, and level (IX, IY, and IZ) it is.

The indices must be integer values, starting with any value (for the Marvin model, the indices 1,1,1 were adopted for the first block).

    The model’s origin must be placed at the bottom portion, starting to count from the minimum coordinates at X, Y, and Z.

    Figure 1 highlights the origin of the Marvin block model and the first block index coordinates (1,1,1).

    Figure 1: Block’s Matrix.

    However, if the block model contains geo-referenced information based on coordinates, they could be converted into indices before being imported to MiningMath.

    To perform this conversion, check the following demonstration on how to convert coordinates into indices using data from Figure 2 and the equation from Figure 3.

    Figure 2: Sample data to convert coordinates into indices.

    Figure 3 exemplifies using the X-axis but the process is the same for Y and Z axes by just using the corresponding information.

    Figure 3: Equation to convert coordinates into indexes.

    Click here to download a spreadsheet to convert the coordinates into indices and calculate the economic values.
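A minimal sketch of the conversion for one axis, assuming the origin refers to the block model corner and indices start at 1 (the convention adopted for Marvin above); the spreadsheet linked above automates the same calculation.

```python
def coord_to_index(coord: float, origin: float, size: float) -> int:
    # The first centroid sits half a block above the origin; divide the
    # remaining distance by the block dimension and shift to a 1-based index.
    return int(round((coord - origin - size / 2.0) / size)) + 1

# Illustrative values only: a centroid at X = 435 m, origin OX = 0, DX = 30 m -> IX = 15.
print(coord_to_index(435.0, 0.0, 30.0))
```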

    Figure 4: Resultant coordinates converted to indexes.

    The video below exemplifies the conversion process in case you have any doubts.

    Video 1: How to convert block coordinates into block indices.

    Dilution

    This is an informal video on how to consider dilution and mining recovery.

    Video 1: Dilution and Mining Recovery.

    Keep your Original Data

    A common question while using MiningMath integrated with other mining packages is: How to keep the data from one to another?

In order to facilitate this format exchange, you can keep any field from the other software and import it as an Other parameter. By following this approach, MiningMath will keep the data in the reports and in the MinedBlocks.csv generated after the optimization. Thus, you will be able to import it into your mining package by following the same parameters as when it was exported.

    This approach is quite useful when we have to keep the original coordinates X, Y, and Z or any other data that would help to identify the blocks in the platform which it came from.

    Operational Solutions

A block-by-block schedule is likely very difficult to achieve in operations; for example, operations may not allow using multiple types of equipment on the same bench. How to cope with it?

    Although MiningMath works with blocks as inputs, the level of connectivity is user-defined by playing with geometrical parameters in the interface.

    The following image shows results for the Marvin deposit when changing the Minimum Widths (filtered view after Period 2).

    Figure 1: Marvin deposit and visual comparison across scenarios differing their operational widths.
    Figure 2: Marvin deposit and NPV impact from scenarios differing their operational widths.

Note everything changes when playing with a single parameter, including the Life of Mine and geometries. Such impacts are also possible when playing with economic aspects, slope angles, environmental and boundary constraints, fleet size, processing setups, blending requirements, etc.

    Watch the following video on how to play with operational constraints to achieve results closer to the reality of any project.

    Video 1: Operational Constraints.

    Optimization Hints

The general relation between the number of constraints and NPV tends to be inverse, which means that in scenarios with fewer constraints imposed, the optimization tends to find a higher NPV, since the algorithm is free to search for better solutions. Therefore, it is always recommended that the full potential of the mining project be tested first, as recommended in the optimized workflow, which starts at the Data Validation, generating results that can be used in further steps. Regarding these aspects, a very useful exercise was done by the MiningMath team through 2,000 simulations (Figure 1) on a project, seeking to identify the main impacts of the constraints.

The first executions considered the individual variation of each constraint; therefore, Copper Selling Price, Mining Widths, Vertical Rate of Advance, and the rest of the parameters were modified to understand the tendency of each one. As we can see, most of them, when individually varied, have a high probability of generating NPVs around 900M$.

Figure 1: Project evaluation over 2,000 executions.

The second round was performed by running “Overall” scenarios, which consider a random value for each one of the 11 variables at the same time. As you can see, the combination of this number of constraints has a higher probability of generating NPVs around 200M$.

In summary, these results illustrate the intuitive assumption that, by adding more and more constraints, the probability of getting higher NPVs decreases drastically, which is why users have to apply “smart” concepts and constraints to achieve such results.

MiningMath deals with multiple constraints and aims to maximize the NPV simultaneously. Remember that it is computationally impossible to guarantee that all possible solutions are evaluated by any algorithm and, therefore, it is impossible to guarantee that the maximum NPV will be delivered by a single run. With these concepts in mind, a simple modification of a single parameter can generate unexpected effects on the NPV. Thus, it is always worthwhile to identify a tendency, which is not possible with only two runs. Note that it is necessary to create a curve with a set of runs if we wish to find tendencies of an algorithm that includes such heuristics.

    The logic behind such analysis is quite simple: for any optimization algorithm that aims to solve such complex problems, if we put too many constraints, we give no freedom for the algorithm to explore the solution space. Therefore, we tend to find solutions with lower NPV. On the other hand, when we give too much freedom, the algorithm will have the whole space to search for solutions, with no guidance on what would be reasonable in real life, so good solutions might not be found. The best compromise is when the user starts free of constraints and, then, starts giving some reasonable guidance (Figure 2). Therefore, smart constraints might help the algorithm in focusing only on a certain part of the space and actually deliver higher NPVs.

    Figure 2: Applying smart constraints to reduce a broader solution space and find best results on a limited one.

The main suggestion to explore the potential is to run the first iterations and evaluations with as few constraints as possible and without geometries (the most complex constraints in the mathematical model). Then, move gradually to detailed scenarios by adding parameters one by one, which will allow you to understand the impact of each constraint on the NPV and guide decisions through an Integrated Workflow. Thus, by measuring how much each assumption “costs” the project in the long term, managers will be able to choose the best way to use them.

MiningMath is a technology composed of mathematical programming and heuristics, which do not disrespect the imposed constraints just to increase the NPV, following the hierarchy (Figure 3) presented in this content. Experienced users take advantage of this aspect and gradually place constraints that might even help the optimizer find solutions with higher NPV. A scenario with free total movement/production, for instance, results in a better distribution of waste over time, while adding or modifying a total limit makes the algorithm search for the best result within these capacities, sometimes finding surprising NPVs among them. Therefore, to assess tendencies, it is recommended to draw a curve of scenarios versus NPV, such as free, 100 Mtpa, 90 Mtpa, 80 Mtpa, and so forth.
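As a minimal sketch of how such a tendency curve could be assembled, the Python lines below collect hypothetical scenario results and list NPV against the total movement limit; the capacities and NPVs are illustrative assumptions, not results from any real project.

    import pandas

    # Hypothetical summary of scenario results: one row per run, with the
    # total movement limit tested ("free" for the unconstrained case)
    # and the NPV reported by MiningMath for that scenario.
    results = pandas.DataFrame({
        "total_movement_mtpa": ["free", 100, 90, 80, 70],
        "npv_musd": [950, 930, 915, 880, 820],  # illustrative values only
    })

    # Sort the constrained scenarios to expose the tendency of NPV versus capacity
    constrained = results[results["total_movement_mtpa"] != "free"].copy()
    constrained = constrained.sort_values("total_movement_mtpa")

    # A simple printout already shows whether NPV degrades smoothly or drops
    # abruptly below a given capacity, which is the tendency we are looking for.
    for _, row in constrained.iterrows():
        print(f'{row["total_movement_mtpa"]} Mtpa -> {row["npv_musd"]} M$')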

The user can always guide the optimization by relaxing constraints in the hierarchy whenever possible, which works as hints for the algorithm to reach feasible outcomes. Relaxing the slope angles in the short term, for instance, could bring better designs to your project. As long as the user understands the optimization problem and how the constraints act on it, it is possible to insert smart hints for the algorithm to reach a reasonable result.

    Figure 3: Constraints hierarchy order.

    Key takeaways

One of the main advantages of MiningMath’s algorithm is its capability of finding intermediate solutions between extreme outcomes. Therefore, the user can give hints to guide the algorithm, such as:

    • Letting the algorithm free, with fewer constraints (bigger solution space), is great for exploring the economic potential of a project.

    • When dealing with more constraints, reducing the solution space might give some hints to the algorithm and help it to achieve the goals set.

• Starting the optimization from the least constrained scenario and moving up to the most constrained one.

    • Relaxing low-priority constraints manually to guide the algorithm.

    • Breaking complex scenarios into smaller problems and iterating over multiple scenarios.

• Avoiding the use of constraints with the same purpose and the same priority in the hierarchy order.

These are subtle nuances that should be evaluated when discussing this topic: when is a disrespected constraint a consequence of the algorithm pursuing an intermediate outcome? When should such deviations be considered a violation or an improper solution?

    Constraints relaxed to achieve a feasible outcome. Examples:

    • Processing stream shortfalls.

    • Operational constraints.

    • Average constraints (depending on the deviation).

    • Constraints based on Sums (depending on the deviation).

Constraints violated that would represent, in fact, an infeasible project. Examples:

    • Production limits exceeded.

    • Slope constraints.

    • Average constraints (depending on the deviation).

    • Constraints based on Sums (depending on the deviation).

This information is usually reported in the Report tab of the Excel output file.
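As a hedged example of how that report could be screened programmatically, the sketch below reads the output with pandas and flags periods where the processed tonnage exceeds its target; the file name, sheet name, and column names are assumptions and must be adapted to your own output file.

    import pandas

    # Read the report sheet from the Excel output file. The file name, sheet
    # name and column names below are assumptions for illustration only.
    report = pandas.read_excel("scenario_output.xlsx", sheet_name="Report")

    # Flag periods where the reported processing feed exceeds the target, so
    # relaxed shortfalls can be separated from true violations.
    exceeded = report[report["Processing (t)"] > report["Processing target (t)"]]
    print(exceeded[["Period", "Processing (t)", "Processing target (t)"]])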

    Optimization Runtime

    The optimization run time is a common concern for professionals dealing with robust models. This page aims to provide context and guidance to improve run times, which might be quite useful for having a big picture of the project’s behavior under different assumptions and hypotheses.

    Runtime Barriers

    The runtime depends on a combination of multiple aspects. It is directly related to the complexity of the deposit and it is proportional to the number of:

    • Blocks.

• Destinations (more than 3).

    • Constraints in use and conflicting goals with the same hierarchy order.

    • Variables imported.

    • Period ranges.

    • Parameters changing over time.

    • Multi-mine deposits.

Often, users are concerned about the limits for handling models with more than 20M blocks. MiningMath can handle virtually any model size: tests have been successfully run with models of up to 240M blocks without reblocking, which took three weeks on a 32 GB desktop machine.

Typically, datasets with 5 million blocks take a few hours (on an 8 GB RAM machine). In the future, the technology will be capable of running multiple scenarios concurrently on the same computer. There is no need for special servers with extra RAM for deposits of average size.

    Hardware Improvements

    Memory

Overall, the main bottleneck for MiningMath is memory consumption. The hardware upgrades that most positively impact the optimization runtime are:

    • RAM capacity.

    • RAM frequency.

    Cores and threads

    MiningMath is a single-thread application, which means:

    • Additional cores and threads do not affect the optimization run time.

    • Processors with higher clock speeds improve the run time.

    Strategies to reduce the runtime

    Use surfaces

The most recommended strategy is to go through the tutorial steps of data validation and constraint validation, and then start using surfaces as a guide to reduce complexity without losing the dilution aspects of your approach.

To get such guidance over a broader view with a reduced runtime, you can, for example, create optimized pushbacks. The last step is then to obtain a detailed schedule, given the complexity of the model. If these approaches still do not provide an acceptable runtime, try to get intermediate results by splitting the total production into 2 or 3 periods.

    Reblocking

    Reblocking is a method used to decrease the number of blocks in a block model by combining some of the smaller blocks to create larger ones. This can be done using MM Labs as described here.

    Note: when reblocking your model it is important to evaluate dilution aspects that can be lost by increasing the block size.
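For illustration only, the sketch below shows the general idea of reblocking a regular model with pandas, merging 2 × 2 × 2 groups of 10 m blocks into 20 m blocks and mass-weighting the grade. The column names, block sizes, and grade field are assumptions; in practice, MM Labs performs this step for you as described above.

    import pandas

    # Conceptual reblocking sketch: column names and sizes are assumptions.
    model = pandas.read_csv("block_model.csv")

    old_size, factor = 10.0, 2
    new_size = old_size * factor

    # Index of the parent (reblocked) block each original block falls into
    for axis in ("X", "Y", "Z"):
        model[f"parent_{axis}"] = (model[axis] // new_size).astype(int)

    grouped = model.groupby(["parent_X", "parent_Y", "parent_Z"])

    # Mass is summed; grades are mass-weighted so metal content is preserved
    reblocked = grouped.apply(
        lambda g: pandas.Series({
            "MASS": g["MASS"].sum(),
            "GRADE_CU": (g["GRADE_CU"] * g["MASS"]).sum() / g["MASS"].sum(),
        })
    ).reset_index()

    reblocked.to_csv("reblocked_model.csv", index=False)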

    Time limit

    It is possible to indicate a time limit in hours before running a scenario. The time limit is defined in hours due to the usual complexity of mining projects and by the fact that MiningMath will always try to deliver a reasonable solution.

This is a complex parameter, and the limit may not always be met. It could also hinder the final solution, since it restricts the algorithm from exploring a broader range of potential solutions. However, even if better results are not obtained, fast solutions will still give you a quicker assessment of your project. To better understand how the time limit works, you can visit this page.

    Timeframes

Another strategy to reduce runtime is the use of timeframes. MiningMath allows the integration of the short- and long-term visions in the same optimization process, facilitating the analysis and strategic definitions.

    For example, it is possible to consider less detail for longer time horizons. Such horizons need to be considered in the overall view of the mine, up to exhaustion, but they consume optimization processing time that can be more focused on the early years of operation. The figure below depicts an example with monthly time frames in the initial periods of the project, transitioning to yearly periods, and extending to decennial periods in the final stages. You can visit this page for more information on how to use timeframes.

    Constraints chosen in the interface for a timeframe example.
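As a rough, illustrative sketch of why mixed timeframes reduce the optimization effort, the Python lines below count the periods produced by an assumed configuration of monthly, yearly, and decennial timeframes over a 40-year life of mine; all durations are assumptions for illustration only.

    # Rough sketch of how mixed timeframes reduce the number of periods to optimize.
    monthly_periods = 2 * 12              # years 1-2 scheduled month by month
    yearly_periods = 8                    # years 3-10 scheduled year by year
    decennial_periods = (40 - 10) // 10   # years 11-40 grouped into decades

    total_periods = monthly_periods + yearly_periods + decennial_periods
    print(total_periods)                  # 35 periods instead of 480 monthly ones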

    Optimizing your Workflow

The following video presents some tips and tricks to optimize your workflow when running multiple scenarios. The options include:

• Altering multiple scenarios through the SSSCN files, which are XML files that can be parsed using scripts created by the user, and then running the scenarios through the interface (see the sketch after this list).

• Running scenarios from the command prompt, without resorting to the User Interface.
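As a hedged sketch of the first option, the lines below use Python’s standard XML library to clone a base scenario file while varying one parameter. The tag name TotalMovement is a placeholder: inspect your own .ssscn file to find the actual element that stores the parameter you want to change.

    import xml.etree.ElementTree as ET

    # Sketch of batch-editing scenario files; element names are placeholders.
    tree = ET.parse("base_scenario.ssscn")
    root = tree.getroot()

    for limit in (80, 90, 100):
        # Hypothetical tag holding the total movement limit, in Mtpa
        for node in root.iter("TotalMovement"):
            node.text = str(limit)
        tree.write(f"scenario_{limit}mtpa.ssscn")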

    Video 1: Optimizing your workflow.

    Percent Models

Although this is not a mandatory step for the optimization process, the lithology of a block can be defined considering:

    • The tonnage of a block.

    • The value of a block.

    Figure 2 shows the information of a block that could be classified as:

• MX, if considering the greatest parcel in terms of tonnage.

• The lithotype with the greatest parcel in terms of value, if value is used as the criterion instead.

    Figure 1: Illustration of a block with different parcels.

    Figure 2: Block information divided into lithologies OX, MX, PM, Waste.

MiningMath calculates tonnages based on [block size × density]. The average density of a block should be equal to the weighted average based on the lithologies and their respective percentages.
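As a minimal worked example of this weighted average, assuming illustrative parcel percentages and densities (not values from any real model):

    # Weighted-average density of a block made of several lithotype parcels.
    parcels = {          # lithotype: (fraction of block, density in t/m3)
        "OX": (0.20, 2.4),
        "MX": (0.50, 2.7),
        "PM": (0.10, 2.9),
        "Waste": (0.20, 2.2),
    }

    density = sum(pct * rho for pct, rho in parcels.values())
    print(round(density, 3))  # weighted average density of the whole block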

    Recoveries should also be calculated considering the amount of material recovered from each parcel of the block.

    The economic value is calculated considering the amount of material recovered from each parcel, along with their respective revenues and costs.

    There are two ways to calculate the economic value of a block:

• Without dilution (Option 1): only the ore parcels feed the plant (Figure 3).

    • With dilution (Option 2): the entire block feeds the plant (Figure 4).

In the first case (Option 1, without dilution), the Economic Value for the process will consist of Revenue – Costs, where:

    • Revenue refers to the ore parcel (70%).

    • Processing Costs refer to the ore parcel (70%).

    • Mining Costs refer to the entire block (100%).

As MiningMath will process the entire block, input a greater value for the process limit, since the algorithm will also count the remaining waste parcel (30%) as plant feed.

Create auxiliary columns to track and control the tonnage limits of ore, waste, and any specific lithotype you want (Figure 6).

Ignore the default production charts and consider the tonnage columns used as Other constraints instead.

    Figure 3: Strategy for economic values considering only ore feeds the plant.

In the second case (Option 2, with dilution), the Economic Value for the process will consist of Revenue – Costs, where:

• Revenue refers to the ore parcel (70%).

    • Processing Costs refer to the entire block (100%).

    • Mining Costs refer to the entire block (100%).

In this case, as there is dilution, the processing limit inputted in the interface should be the real plant limit.

    Again, auxiliary columns will provide further control of tonnages for each parcel (Figure 6).

    Figure 4: Strategy for economic values considering ore and waste feed the plant.

    Figure 5: Example of calculations for a block composed of different lithotypes and its respective economic values assuming no dilution and diluted material.
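As a minimal numeric sketch of the two strategies above, the Python lines below compute the block economic value with and without dilution for a block whose ore parcel is 70%. All tonnages, grades, recoveries, prices, and costs are illustrative assumptions, not values from this project or from Figure 5.

    # Illustrative economic value calculation for a block with a 70% ore parcel.
    block_tonnage = 10_000      # t, whole block
    ore_fraction = 0.70         # ore parcel of the block
    grade = 0.012               # metal grade of the ore parcel (fraction)
    recovery = 0.90
    price = 7_000               # $ per tonne of metal
    processing_cost = 10.0      # $ per tonne processed
    mining_cost = 2.5           # $ per tonne mined

    ore_t = block_tonnage * ore_fraction
    revenue = ore_t * grade * recovery * price
    mining = block_tonnage * mining_cost            # whole block in both options

    # Option 1 (no dilution): only the ore parcel is processed
    value_no_dilution = revenue - ore_t * processing_cost - mining

    # Option 2 (dilution): the entire block is processed
    value_diluted = revenue - block_tonnage * processing_cost - mining

    print(value_no_dilution, value_diluted)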

    • Create auxiliary columns for tonnages of each lithotype, as demonstrated in Figure 6.

    • During the importation, set them as Other.

This step will enable you to track and control the tonnages of each material.

    Figure 6: Example on how to track specific information of a block.
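A minimal sketch of how such auxiliary columns could be created outside MiningMath is shown below, assuming a CSV percent model with hypothetical column names (PCT_OX, PCT_MX, PCT_PM, PCT_WASTE, MASS); adapt the names to your own data before importing the new columns as Other.

    import pandas

    # Build auxiliary tonnage columns per lithotype from percent columns and
    # the block mass. Column names are assumptions for illustration only.
    model = pandas.read_csv("percent_model.csv")

    for litho in ("OX", "MX", "PM", "WASTE"):
        model[f"TON_{litho}"] = model[f"PCT_{litho}"] * model["MASS"]

    model.to_csv("percent_model_with_tonnages.csv", index=False)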


    Imperial System

For importing databases, MiningMath uses the metric system exclusively. In case your database is in the imperial system, it should be converted to its metric counterpart. This page provides a script and further instructions for you to perform this conversion successfully.

For this process, Python and the pandas package are required (see installation guidelines below). In case you find any issues using the script, or if you make any modifications to it that could be useful for the community as a whole, please share your findings at our Forum.

    Script download

If you’re already familiar with Python scripts, just copy or download this simple Python script and run it on your machine. Otherwise, keep scrolling for further instructions.

    DOWNLOAD SCRIPT HERE.

    				
    					import pandas
    
    # define conversion constants
    # foot to meter constant
    ft_to_m = 0.3048
    # short tonne to metric tonne constant
    st_to_t = 0.907184
    # avoirdupois ounce to gram constant (for troy ounces, commonly used for precious metals, the factor would be 31.1035)
    oz_to_g = 28.34952
    
    # import imperial model
    imperial_model = pandas.read_csv("imperial_model.csv")
    
    # create metric model
    metric_model = pandas.DataFrame(columns=['X', 'Y', 'Z'])
    
    # set coordinates to metric model (foot to meter)
    metric_model['X'] = imperial_model['X'] * ft_to_m
    metric_model['Y'] = imperial_model['Y'] * ft_to_m
    metric_model['Z'] = imperial_model['Z'] * ft_to_m
    
    # set dimension to metric model (foot to meter)
    metric_model['!DIM_X'] = imperial_model['DIM_X'] * ft_to_m
    metric_model['!DIM_Y'] = imperial_model['DIM_Y'] * ft_to_m
    metric_model['!DIM_Z'] = imperial_model['DIM_Z'] * ft_to_m
    
    # set mass to metric model (short tonne to metric tonne)
    metric_model['!MASS'] = imperial_model['MASS'] * st_to_t
    
    # set volume to metric model (cubic meters)
    metric_model['!VOLUME'] = metric_model['!DIM_X'] * metric_model['!DIM_Y'] * metric_model['!DIM_Z']
    
    # set density to metric model (metric tonnes per cubic meter)
    metric_model['%DENSITY'] = metric_model['!MASS'] / metric_model['!VOLUME']
    
    # set grades to metric model (ounces per short tonne to grams per metric tonne or ppm)
    metric_model['@GRADE_AU'] = imperial_model['GRADE_AU'] * oz_to_g / st_to_t
    metric_model['@GRADE_CU'] = imperial_model['GRADE_CU'] * oz_to_g / st_to_t
    
    # set recovery values to metric model
    metric_model['*REC_AU'] = imperial_model['REC_AU']
    metric_model['*REC_CU'] = imperial_model['REC_CU']
    
    # set economic values to metric model
    metric_model['$PROCESS'] = imperial_model['PROCESS']
    metric_model['$WASTE'] = imperial_model['WASTE']
    
    # export metric model to csv
    metric_model.to_csv("metric_model.csv", index = False)
    
    				
    			

    1) Installing Python

1. Download Python's latest version at https://www.python.org/downloads/.

      Python download webpage
2. Once the download is complete, open the .exe file and follow the instructions for a default installation. Make sure to select "Add Python to PATH" before proceeding, as depicted below.

      Python installation screen
    3. At this point, the installation should be concluded. You can check if Python has been correctly installed by running the command python --version at Windows PowerShell. 

  Python version on Windows PowerShell

    2) Installing Pandas

    Pandas is an open source data analysis and manipulation tool, built on top of Python. Follow the steps below to install it:

    1. Open the Windows PowerShell and run the command "pip install pandas".

      Pandas install command
    2. Once the installation is complete, you're able to run Pandas inside your Python programs. You can check if Pandas has been correctly installed by running the command "pip show pandas" at Windows PowerShell.

      Pandas version

    3) Converting your database

    This script can be used to convert foot to meter; short tonne to metric tonne; and ounce to gram.
    It works with the columns: X, Y, Z, DIM_X, DIM_Y, DIM_Z, MASS, VOLUME, DENSITY, GRADE_AU, GRADE_CU, REC_AU, REC_CU, PROCESS, WASTE.

    Follow the steps below:

1. Save your database in a file named imperial_model.csv, in the same folder where the script is located.

    2. Run the command python imperial.py at the Windows PowerShell from the folder where the script is located. The example below shows the script in the Downloads folder.

      Run script example
    3. Open the output file named metric_model.csv, and that's it! Your data has been converted to the metric system. 

      Output file example