Two main sampling and testing methods were developed and validated by Norman O. Lotter for use in the mineral processing industry.  These are High Confidence Flotation Testing and Statistical Benchmark Surveying.  Descriptions of these works follow:

High Confidence Flotation Testing

University of Cape Town, 1995

This thesis addresses the problem of obtaining reliable laboratory-scale flotation test data for the Merensky ore type found in the Bushveld Complex of South Africa. The complex nature of the platinum-group element (PGE) deportment in this ore renders the normally practised procedures inappropriate for this particular testwork. A more robust and thorough procedure is necessary because of the diverse mineralogical forms in which the PGE are found, and the evaluation of the mass and value balances must accordingly take these factors into account. The major features of the evaluation of input and output errors across the laboratory-scale flotation test are analysed. It is found that unless the size-by-size variance of PGE in a conventionally crushed mill feed is taken into account, the required mill feed sample size is underestimated by some 176%. Further, the preparation of a reference distribution of assayed head material is necessary to provide the 95% confidence limits of the grade estimate. The need for repeating flotation tests and compositing the adjudicated products is discussed; it is concluded that quintuplicate tests are suitable to achieve a desirable level of confidence in the built-up head grade. The sample preparation of the flotation products has a critical role in minimising evaluation errors, as does the fire assaying of the samples, for which minimum numbers of replicate determinations have been calculated. An outlier rejection model for adjudication of the replicate built-up head grades is proposed, and a complete flowsheet of the quality control model is developed from first principles. The effect of this model on the PGE total balance is analysed. It is concluded that workable controls are defined, since a metal balance with < 1% error has been achieved.
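The built-up head grade at the centre of this quality control model is a mass-weighted reconstruction of the feed grade from the flotation products. The short sketch below illustrates the idea only; the product masses, assays, and reference head grade are invented for the example and are not values from the thesis:

```python
# Reconstruct a built-up head grade from flotation test products and
# compare it with the directly assayed head grade.
# Masses (g) and PGE assays (g/t) below are invented for illustration.

products = [
    ("concentrate 1", 12.0, 180.0),
    ("concentrate 2", 25.0, 35.0),
    ("tailing", 963.0, 0.9),
]

assayed_head = 3.92  # g/t, hypothetical reference value

total_mass = sum(m for _, m, _ in products)
total_metal = sum(m * g for _, m, g in products)
built_up_head = total_metal / total_mass

# Metal-balance error relative to the assayed head
balance_error = 100.0 * (built_up_head - assayed_head) / assayed_head

print(f"built-up head: {built_up_head:.3f} g/t")
print(f"balance error: {balance_error:+.2f} %")
```

With these invented figures the balance closes within 1%, which is the level of control the thesis reports as achievable.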

Statistical Benchmark Surveying

McGill University, 2005

The sampling and analysis of sulphide mineral processing plants is addressed in this study. A review of the published literature shows that the foundations of this topic were laid in the 1970s, but typically a single sampling test was performed, and its representativity was accepted provided its metallurgical balance closed without excessive adjustments. No mention was made of quality control or of equivalent tests of the representativity of the feed material during sampling tests, and no recognition was given to the effect(s) of ore grade on metallurgical performance. In this study, a quantitative model, called a statistical benchmark survey, is presented. Multiple surveys are completed over a limited time; the corresponding stream samples of the surveys deemed acceptable are combined to obtain high-confidence composite samples. The head grade of each survey is compared to two distributions to test its acceptability, typically at a 95% confidence level. These distributions are called the Internal Reference Distribution and the External Reference Distribution. The first test, against the Internal Reference Distribution, uses the Sichel t-estimator, a lognormal model designed for use on small data sets, on the set of six survey unit head grades. The associated confidence limits of this mean grade are equivalent to two standard errors of the distribution, but are skewed about the sample mean. The second test, against the External Reference Distribution, also uses a lognormal platform, designed by Krige, but uses larger data sets from 1-3 months of shift sample head grades. The associated confidence limits of this second model are also skewed, but are wider than for the Sichel model, and are equivalent to two standard deviations of the sample mean. This outlier rejection model produces ore grade estimates that are in good agreement with the more robust External Reference Distribution means.
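The Sichel t-estimator used in the first test can be sketched in a few lines. The series below is one form quoted in the geostatistics literature; the six head grades and the choice of the (n-1)-divisor log-variance are assumptions made for this illustration, not values from the study:

```python
import math

def sichel_gamma(V, n, terms=40):
    """Sichel's correction factor gamma_n(V), evaluated by summing
    one published series form (an assumption of this sketch)."""
    total = 1.0 + (n - 1) * V / n
    for j in range(2, terms):
        den = n ** j * math.factorial(j)
        for i in range(1, j):
            den *= n + 2 * i - 1          # (n+1)(n+3)...(n+2j-3)
        total += (n - 1) ** (2 * j - 1) * V ** j / den
    return total

def sichel_t(grades):
    """Sichel t-estimate of the mean of a small lognormal sample."""
    logs = [math.log(g) for g in grades]
    n = len(logs)
    ybar = sum(logs) / n
    V = sum((y - ybar) ** 2 for y in logs) / (n - 1)  # assumed divisor
    return math.exp(ybar) * sichel_gamma(V, n)

# Six survey-unit head grades (g/t), invented for the example:
heads = [1.0, 1.2, 0.9, 1.5, 1.1, 1.3]
print(round(sichel_t(heads), 3))
```

The estimate always lies above the geometric mean of the sample, because the gamma factor corrects the geometric mean upward toward the lognormal population mean.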
The Raglan Mine case study is used to illustrate that ore grades in situ are highly lognormal; this lognormality is also present in the time domain in head samples (taken at the cyclone overflow), but is less pronounced (i.e. residual). Two survey models are presented. The benchmark model describes typical operations. The campaign model specifically chooses ore types that are mined and milled in a specific week of operations for predictive or diagnostic purposes. The multiple mineral hosting of nickel across three orders of magnitude extends this problem into that of a compound distribution. The construction and use of an External Reference Distribution to estimate the mean and associated skew confidence limits of this compound distribution is shown for both drill core and ore milled (the latter in a case of residual lognormality). A trial decomposition of the spatial External Reference Distribution is discussed. After processing, the heterogeneous nickel mineral hosting in the ore becomes an artificially controlled final concentrate, containing most of the economic nickel sulphides in a normal distribution, and a final tailing containing most of the uneconomic nickel minerals in a residually bimodal lognormal distribution. The presence of bimodal lognormality in final tailing data may have historical or predictive uses: at Raglan, flowsheet improvements and more seasoned operations contributed to a decrease in the means of both the low-grade and high-grade modes, and an increase in the contribution of the low-grade mode.
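The skewed confidence limits that both reference distributions exhibit follow directly from lognormal mechanics: symmetric two-sigma limits in log space map to asymmetric limits in grade space. The sketch below shows only that mechanism; it is not Krige's specific model, and the simulated shift-sample grades are an assumption for the example:

```python
import math
import random

# Simulated shift-sample head grades (g/t), lognormal in the time domain.
# The distribution parameters are invented for illustration.
random.seed(1)
grades = [random.lognormvariate(1.0, 0.4) for _ in range(90)]

logs = [math.log(g) for g in grades]
n = len(logs)
ybar = sum(logs) / n
s = math.sqrt(sum((y - ybar) ** 2 for y in logs) / (n - 1))

centre = math.exp(ybar)          # geometric mean of the grades
lower = math.exp(ybar - 2 * s)   # symmetric two-sigma band in log space...
upper = math.exp(ybar + 2 * s)   # ...becomes a skewed band in grade space

# The limits are asymmetric about the centre: the upper gap exceeds the lower.
print(centre - lower, upper - centre)
```

Because the exponential is convex, the upper limit always sits further from the centre than the lower limit does, which is why lognormal confidence limits are reported as skewed.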

Flux Formulation for Fire Assaying

Fire assaying is used as a preconcentration method to capture precious metals into a lead button from ore and metallurgical process samples.  The lead button is then cupelled to remove the lead, leaving a prill of precious metals that can either be weighed on a microbalance, or dissolved in acid, made up to volume, and measured by atomic absorption (AA) or inductively coupled plasma (ICP) spectrometry.

The fusion process that produces this lead button relies on a flux that must be tailored to the whole rock analysis of the ore sample.  Unless this is done, the capture of the precious metals in the lead button will be incomplete, and the true precious metals grade of the sample will be underestimated.  This generally happens when a generic flux is used.

There are multiple publications on this subject which quantify the high-temperature reactions in this fusion (Lenehan and de L. Murray-Smith, 1986; McIntosh, 2004).  Flowsheets has used this information to develop a model that uses the whole rock analysis to inform the high-temperature acid-base and redox reactions in the fusion so as to form a low-viscosity fusion slag, allowing more complete capture of the precious metals.  This is done by making use of the low-temperature eutectic that forms when the various fluxing agents are present in the appropriate proportions.
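A toy mass-balance illustrates why the flux must be tailored to the whole rock analysis. The target slag composition, charge masses, and the simplifying assumption that all charge mass reports to the slag are hypothetical placeholders for this sketch, not Flowsheets' model or published coefficients:

```python
# Toy flux calculation: adjust the added silica so the fused charge hits a
# target SiO2 fraction, accounting for the silica the sample itself brings.
# All figures are hypothetical placeholders for illustration only.

def silica_to_add(sample_mass_g, sample_sio2_frac, flux_mass_g, target_sio2_frac):
    """Grams of silica flux needed to reach the target SiO2 fraction,
    under the simplifying (toy) assumption that sample plus flux all
    report to the slag."""
    base_mass = sample_mass_g + flux_mass_g
    base_sio2 = sample_mass_g * sample_sio2_frac
    # solve (base_sio2 + x) / (base_mass + x) = target for x
    x = (target_sio2_frac * base_mass - base_sio2) / (1.0 - target_sio2_frac)
    return max(x, 0.0)

# A siliceous sample needs less added silica than a basic (low-SiO2) one:
print(silica_to_add(30.0, 0.65, 120.0, 0.30))  # high-SiO2 ore
print(silica_to_add(30.0, 0.05, 120.0, 0.30))  # basic ore
```

Even this crude balance shows that a single generic flux cannot suit both ores: the basic ore needs far more added silica to reach the same slag composition, which is the tailoring the whole rock analysis informs.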

Write to, or call, Flowsheets to discuss your fire-assaying needs.


Lenehan, W.C., de L. Murray-Smith, R., 1986. Assay and Analytical Practice in the South African Mining Industry, 1st ed. SAIMM, Johannesburg.

McIntosh, K.S., 2004. The Systems Engineering of Automated Fire Assay Laboratories for the Analysis of Precious Metals. Stellenbosch University, Stellenbosch.