In a recent webinar hosted by Malvern Panalytical, John Gamble, associate scientific director at Bristol Myers Squibb, and Paul Kippax, sector director at Malvern Panalytical, call for a paradigm shift in the way we address particle size during pharmaceutical development.
Accuracy has long been considered essential for particle size measurements, but Gamble and Kippax argue that there are better ways to judge whether a given method is suitable for characterising active pharmaceutical ingredients (APIs). They demonstrate how to establish an ‘appropriate’ measurement method which keeps the end goal – understanding the product – firmly in mind.
The problem with accuracy
To ensure safety and efficacy, it is essential for pharmaceutical developers to understand how their product behaves, and API particle size has long been considered important. Gamble and Kippax emphasise this, noting that particle size relates to many critical quality attributes, including dissolution rate, bioavailability, and processing behaviour.
Consequently, a variety of techniques are commonly used to measure particle size. However, Gamble asserts that their drawbacks should be noted.
For example, laser diffraction gives a single ‘spherical equivalent’ diameter to represent particle size. But as Kippax points out, in the real world, particles are rarely perfectly uniform or spherical. They may be rough or smooth, regular or elongated, dispersed or agglomerated – and these characteristics affect important factors like powder flowability and stability.
As Kippax explains, accuracy is defined as the closeness of agreement between the value found and a reference value. However, for non-spherical particles, laser diffraction and other techniques will give different absolute values. For real particles, therefore, no single reference value for particle size exists, and accuracy is not a viable criterion.
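A simple worked example makes the point concrete. The sketch below (illustrative only – the particle dimensions and the cylinder idealisation are assumptions, not taken from the webinar) computes three common equivalent-sphere descriptors for the same rod-shaped particle and shows that each gives a different ‘size’, so no single reference value exists:

```python
import math

# Illustrative only: an idealised rod-shaped particle modelled as a
# cylinder 50 µm long and 10 µm wide (assumed dimensions).
length, width = 50.0, 10.0  # µm

# Volume-equivalent sphere diameter, the kind of single value a
# volume-based technique such as laser diffraction reports
volume = math.pi * (width / 2) ** 2 * length
d_volume = (6 * volume / math.pi) ** (1 / 3)

# Projected-area-equivalent diameter for the particle lying flat,
# approximating the silhouette as a rectangle (as image analysis might)
d_area = math.sqrt(4 * (length * width) / math.pi)

# Maximum Feret diameter: the longest caliper dimension
d_feret = length

print(f"{d_volume:.1f} µm, {d_area:.1f} µm, {d_feret:.1f} µm")
```

The three descriptors disagree by more than a factor of two for this one particle, which is why ‘closeness to the reference value’ breaks down as soon as particles stop being spheres.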
Moving forwards, how can we replace accuracy in method design? Kippax notes that, over the past 15 years, one of the main currents moving the pharmaceutical industry has been the notion of quality by design: starting from the pharmaceutical need, defining the target product profile (TPP), and using it to establish the critical quality attributes (CQA) and critical material attributes (CMA) which must be assessed to ensure product quality and efficacy.
In a similar way, he argues, our approach to accuracy could start from how we are going to use the data. As Gamble emphasises, particle size is not an objective in itself but a means to an end – the end being its correlation with the behaviour of the material during processing or use. This is the core concern to keep in mind.
If we are no longer demanding absolute accuracy from our measurements, we must find another way of ensuring that data is indicative of the material being characterised. A material-specific method of measurement is required.
During early-stage development, says Gamble, techniques like laser diffraction may not be the most informative method of measurement. Instead, he proposes static image analysis (SIA) using microscopy. The benefits of this method include the fact that particles can be seen, and method conditions can be developed for each sample. Perhaps most importantly, more information than just particle size can be measured: shape, number, length, and width are shown too. As we move into an age of big data, Gamble points out, this extra data is highly useful and enables better modelling. Shape data is of increasing interest in the industry, affecting important properties such as drying time post-crystallisation.
Without accuracy being in play, these measurements can be related to the product’s CQA: those characteristics which determine whether it meets the TPP. Our new concern, the correlation between powder properties and CQA, is no longer defined as accuracy but ‘appropriateness’. Gamble and Kippax define appropriateness as the measurement of particle characteristics which link directly to key product/process performance characteristics. We are demonstrating not the accuracy of the measurement but the fact that the measurement is appropriate to describe the performance or processability of the product.
We can add to this the notion of precision to validate our methods. However, Gamble notes, precision requirements tend to give a target variance which relates more to the operation of the technique being applied than to the material being analysed. Instead, he suggests we aim for discrimination: that is, enough precision to enable the discrimination of acceptable and unacceptable materials with respect to key product/process performance characteristics.
In practice, we begin with SIA of the material, which provides us with a range of information. That data is used to understand the CQA and CMA of the product and to establish a correlation between the property we are measuring and the product’s performance – the appropriateness of the measure. Based on that correlation, we can create a material-specific method using techniques such as laser diffraction. An analysis of the correlation between the SIA measurement and the laser diffraction measurement is used to ensure an appropriate laser diffraction method is selected. The result is a method which not only shows very good agreement between the two measurement methods (orthogonality) but also good discrimination between acceptable and unacceptable materials. This shows us that the measurement is indicative of the material concerned and therefore that we have achieved a high level of appropriateness.
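The orthogonality check in this workflow amounts to asking how strongly the two techniques agree across batches. The sketch below (a minimal illustration – the per-batch D50 values are invented, and Pearson correlation is one plausible choice of agreement metric, not necessarily the one used by the authors) shows the idea:

```python
import math

# Hypothetical per-batch median sizes (µm) from static image analysis
# and from a candidate laser diffraction method; values illustrative.
sia_d50 = [10.2, 12.5, 15.1, 18.0, 21.4, 25.3]
ld_d50  = [11.0, 13.1, 16.2, 19.5, 22.8, 27.0]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(sia_d50, ld_d50)
# An r close to 1 supports treating the laser diffraction method as an
# appropriate, faster stand-in for SIA on this material.
print(f"SIA vs LD correlation: r = {r:.3f}")
```

Note that the two techniques need not return the same absolute values – as discussed above, they generally will not for non-spherical particles – only a consistent relationship strong enough to rank and discriminate batches.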
With more than 1,000 FDA scientists in the US now engaged around material characterisation, there is an open discussion in the industry about how to ensure the data sets used to inform product development decisions are relevant. Kippax and Gamble emphasise the importance of transparency and the ability to demonstrate how the methods for product analysis have been developed.