PUBLICATIONS

2024

2023

2022

Developing mathematical models for the description of reaction kinetics is fundamental for process design, control and optimisation. The problem of model discrimination among a set of candidate models is not trivial, and recently a new and complementary approach based on artificial neural networks (ANNs) for kinetic model recognition was proposed. This paper extends the ANN-based model identification approach by defining an optimal design of experiments procedure, whose performance is assessed through a simulated case study. The proposed design of experiments method reduces the number of experiments to be conducted while increasing the ability of the artificial neural network to recognise the proper kinetic model structure.

2021

2020

The traditional approach to designing a catalyst starts with identifying a suitable candidate catalyst for the desired chemical transformation, then iteratively synthesising and testing catalyst structures and supports to improve its specifications. The major weakness of this approach is that trial and error is likely to be both inefficient and unsuccessful in identifying a catalyst with the required attributes, such as phase, type, operating temperature and conversion. The design of the overall process, including both reaction and separation steps, although defined by the chemical route, is often developed on a similarly ad hoc basis, relying on the identified catalyst, experience, and trial and error. The aim of this research is to bridge the gap between catalyst and process design and to develop systematic methods for the simultaneous determination of catalyst characteristics and process synthesis.

2019

Kinetic models of chemical and biochemical phenomena are frequently built from simplifying assumptions. Whenever a model is falsified by data, its mathematical structure should be modified to embrace the available experimental evidence. A framework based on maximum likelihood inference is illustrated in this work for diagnosing model misspecification and improving the structure of approximated models. In the proposed framework, statistical evidence provides a measure to justify a modification of the model structure, namely a reduction of complexity through the removal of irrelevant parameters and/or an increase of complexity through the replacement of relevant parameters with more complex state-dependent expressions. A tailored Lagrange multipliers test is proposed to support the scientist in the improvement of parametric models when an increase in model complexity is required.

An autonomous flow microreactor platform was developed that was able to conduct reaction experiments and measure the outlet reactant and product concentrations using HPLC without user supervision. The platform performed unmanned kinetic experiments with the aim of precisely estimating the parameters of a kinetic model for the esterification between benzoic acid and ethanol catalysed by sulfuric acid. The capabilities of the autonomous platform were demonstrated on three different experimental scenarios: 1) performing steady-state experiments, where the experimental reaction conditions were pre-defined by the user; 2) performing steady-state experiments, where the conditions were optimised online by Model-Based Design of Experiments (MBDoE) algorithms, with the aim of improving parameter precision; 3) executing transient experiments, where the conditions were pre-selected by the user. For the steady-state experiments, the platform automatically performed online parameter estimation and MBDoE with a pre-selected kinetic model. It was demonstrated that a campaign of steady-state experiments designed using online MBDoE algorithms led to more precise parameter estimates than a campaign of experiments designed by traditional factorial design. Transient experiments were shown to expedite kinetic parameter estimation and use fewer reagents than campaigns of steady-state experiments, as it was no longer necessary to wait for the system to reach steady state. In general, the transient experiments offered less precise parameter estimates than the steady-state campaigns; however, the experiments could be completed in just 2 h instead of the 8 h required for a campaign of steady-state experiments.
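As a purely illustrative sketch (the kinetic model, parameter values and candidate grid below are hypothetical, not the platform's actual code), the core of an online MBDoE step, selecting the next steady-state condition that maximises the determinant of the Fisher information matrix, can be written as:

```python
import numpy as np

# Hypothetical first-order steady-state model (illustrative only):
# conversion x(T, tau) = 1 - exp(-k(T) * tau), with k(T) = exp(theta0 - theta1 / T).
def conversion(T, tau, theta):
    k = np.exp(theta[0] - theta[1] / T)
    return 1.0 - np.exp(-k * tau)

def sensitivities(T, tau, theta, h=1e-6):
    """Finite-difference sensitivities of the model output w.r.t. each parameter."""
    base = conversion(T, tau, theta)
    J = np.zeros(len(theta))
    for i in range(len(theta)):
        tp = np.array(theta, dtype=float)
        tp[i] += h
        J[i] = (conversion(T, tau, tp) - base) / h
    return J

def d_optimal_step(candidates, theta, sigma=0.01):
    """Pick the candidate (T, tau) maximising det(FIM) for the next experiment."""
    prior = 1e-8 * np.eye(len(theta))  # small prior keeps the matrix invertible
    best, best_det = None, -np.inf
    for T, tau in candidates:
        J = sensitivities(T, tau, theta)
        det = np.linalg.det(prior + np.outer(J, J) / sigma**2)
        if det > best_det:
            best, best_det = (T, tau), det
    return best

theta_hat = [10.0, 4000.0]  # current (illustrative) parameter estimates
grid = [(T, tau) for T in (320.0, 340.0, 360.0) for tau in (10.0, 30.0, 60.0)]
next_exp = d_optimal_step(grid, theta_hat)
```

In an online loop, the chosen condition would be run on the platform, the parameters re-estimated from the new sample, and the design step repeated.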

Achieving a predictable and repeatable outcome is a major issue in nanoparticle synthesis. Traditionally, chemists rely on the one-factor-at-a-time method to investigate and optimise synthetic processes; however, this method is inefficient and often misleading. Design of Experiments (DoE), in contrast, can provide a greater amount of information in fewer experiments and lends itself to producing more reproducible results. Nevertheless, DoE techniques are used by relatively few practitioners in nanoparticle research. Here, we provide a case study on the synthesis of oleylamine-capped gold nanoparticles (AuNPs). Through the use of DoE, we achieved a marked reduction in dispersity and developed a model to carefully control the mean diameter of the nanoparticle populations.
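For illustration, a full two-level factorial design of the kind used in such DoE screens can be generated in a few lines (the factor names and levels below are invented, not those of the AuNP study):

```python
from itertools import product

# Hypothetical two-level factors for a nanoparticle synthesis screen;
# names and levels are illustrative, not those of the study.
factors = {
    "temperature_C": (60, 90),
    "ligand_ratio": (2, 10),
    "time_min": (30, 120),
}

# Full 2^3 factorial: every combination of the low/high levels.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for run, conditions in enumerate(design, start=1):
    print(run, conditions)
```

Eight runs cover all main effects and interactions of three factors, whereas one-factor-at-a-time exploration of the same levels reveals no interaction information.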

Von Willebrand disease (VWD) is one of the most severe inherited bleeding disorders in humans, and it is associated with a qualitative and/or quantitative deficiency of von Willebrand factor, a multimeric glycoprotein fundamental to the coagulation process. At present, the diagnosis of VWD is extremely challenging and mostly based on clinical experience. Kinetic models have recently been proposed and applied to help in the diagnosis and characterization of VWD, but the complexity of these models is such that they require long and stressful clinical tests, such as the desmopressin (DDAVP) response test, to achieve a satisfactory estimation of the individual haemostatic parameters. The goal of this paper is to design a minimal set of clinical tests for the identification of a kinetic model, to decrease the time and effort required for the characterization and diagnosis of VWD.

Catalytic oxidation of methanol to formaldehyde is an important industrial process due to the value of formaldehyde either as a final product or as a precursor of numerous chemicals. The study of kinetics in this system is hindered by sources of uncertainty that are inherently associated with the nature and state of the catalyst (e.g., uncertain reactivity level, deactivation phenomena), the measurement system and the structure of the kinetic model equations. In this work, a simplified kinetic model is identified from data collected from continuous flow microreactor systems where catalysts with assorted levels of reactivity are employed. Tailored model-based data mining methods are proposed and applied for the effective estimation of the kinetic parameters and for identifying robust experimental conditions to be exploited for the kinetic characterization of catalysts with different reactivity, whose kinetic behavior is yet to be investigated.

Growing concerns about the use of fossil raw materials advocate for an expanding utilization of biomass-based fuels and chemicals. Among the bio-based furan compounds, 5-hydroxymethylfurfural (HMF) has received considerable attention in the chemical industry since it can be hydrogenated to 2,5-dimethylfuran (DMF), a valuable alternative fuel. The identification of a suitable kinetic model, in which all the non-measurable kinetic parameters can be reliably estimated, is crucial for pursuing process optimization. In this work, the kinetic models currently available in the literature for the HMF hydrogenation process are investigated to underline their strengths and weaknesses using a sensitivity-based identifiability analysis. The application of identifiability analysis techniques allows the definition of a set of fully identifiable kinetic models to be used for statistically reliable predictions. Furthermore, the use of design of experiments techniques leads to the characterization of design space regions that maximize the quality of the statistics related to the different estimates.

Modelling chemical processes frequently requires the construction of complex systems of differential and algebraic equations involving a high number of state variables and parameters. Whenever a model structure is proposed, its adequacy is checked with a goodness-of-fit test. The goodness-of-fit test is capable of detecting the presence of over- or under-fitting. However, when a modelling error is detected, the test does not provide guidance on how to modify the model equations to match the behaviour of the physical system under analysis. In this work, a test statistic is derived from a tailored Lagrange multiplier test with the aim of diagnosing potential sources of process-model mismatch and providing guidance on how to evolve approximated model structures towards a higher level of complexity. The proposed test is applied on a simulated case study of a yeast growth model in a fed-batch bioreactor.
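A toy illustration of the underlying idea (the standard Lagrange multiplier score test on a linear regression, not the paper's tailored statistic): fit the restricted model, regress its residuals on the extended design, and compare n times the auxiliary R-squared with a chi-square critical value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data generated with a quadratic term the restricted model omits.
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x + 0.5 * x**2 + rng.normal(0.0, 0.05, x.size)

# Restricted model: y = b0 + b1*x. Candidate extension adds an x^2 term.
X_r = np.column_stack([np.ones_like(x), x])
beta_r, *_ = np.linalg.lstsq(X_r, y, rcond=None)
resid = y - X_r @ beta_r

# LM (score) test: regress the restricted residuals on the FULL design;
# LM = n * R^2 is asymptotically chi-square with 1 degree of freedom here.
X_f = np.column_stack([np.ones_like(x), x, x**2])
b_aux, *_ = np.linalg.lstsq(X_f, resid, rcond=None)
r2 = 1.0 - np.sum((resid - X_f @ b_aux) ** 2) / np.sum((resid - resid.mean()) ** 2)
lm_stat = x.size * r2

# Compare with the chi-square(1) 95% critical value, 3.84: a larger value is
# statistical evidence that the extra state-dependent term is needed.
extension_needed = lm_stat > 3.84
```

The appeal of the score form is that only the simpler restricted model ever has to be fitted, which matches the paper's goal of deciding whether an increase in model complexity is justified before committing to it.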

Automated model identification platforms were recently employed to identify parametric models online in the course of unmanned experimental campaigns. The algorithms controlling these platforms include two computational elements: i) a tool for parameter estimation; ii) a tool for model-based experimental design. Both tools require the solution of complex optimisation problems, and their effective outcome relies on their respective objective functions being well-conditioned. Ill-conditioned objective functions may arise when the model is characterised by a weak parametrisation, i.e. the model parameters are practically non-identifiable and/or extremely correlated. In this work, a robust reparametrisation technique is proposed and tested both in silico and in an automated model identification platform. The benefit of reparametrisation is demonstrated on a case study for the identification of a kinetic model of the catalytic esterification of benzoic acid with ethanol in a flow microreactor.

Parameter estimation algorithms integrated in automated platforms for kinetic model identification are required to solve two optimization problems: i) a parameter estimation problem given the available samples; ii) a model-based design of experiments problem to select the conditions for collecting future samples. These problems may be ill-posed, leading to numerical failures when optimization routines are applied. In this work, an approach of online reparametrization is introduced to enhance the robustness of model identification algorithms towards ill-posed parameter estimation problems.
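A common reparametrisation of this kind (illustrative Arrhenius example, not the paper's specific transformation) is to estimate phi = ln(theta) instead of theta, which rescales parameters of very different magnitude onto comparable ranges and enforces positivity automatically:

```python
import numpy as np

# Illustrative only: a pre-exponential factor and an activation energy differ
# by orders of magnitude, making direct optimisation ill-conditioned.
def rate(T, theta):
    k0, Ea = theta
    return k0 * np.exp(-Ea / (8.314 * T))  # simple Arrhenius rate

def rate_reparam(T, phi):
    # Optimise over phi = ln(theta); map back with theta = exp(phi).
    return rate(T, np.exp(phi))

theta = np.array([1.0e6, 5.0e4])  # wildly different scales
phi = np.log(theta)               # ~ (13.8, 10.8): comparable scales
T = 350.0
```

The optimiser then searches over phi; every candidate point maps back to a valid positive theta, which removes one common source of numerical failure in both the estimation and the design steps.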

2018

Von Willebrand disease is one of the most severe inherited bleeding disorders in humans, characterized by qualitative and/or quantitative defects of the von Willebrand factor protein. Diagnosis is difficult due to the high heterogeneity of the disease. Pharmacokinetic models have been recently proposed and applied to help in the disease characterization and diagnosis. However, the complexity of the models requires long and invasive dynamic non-routine tests to be carried out on the subjects to achieve a statistically satisfactory estimate of the individual metabolic parameters. In this work, it is demonstrated how the use of basal clinical tests and a shorter dynamic clinical test may allow for the identification of a mechanistic model of the disease. An existing mechanistic model of von Willebrand disease has been modified to account for the basal tests, where new model equations are derived using response surface methodology. Results show a good agreement between the model response and the clinical data.

The identification of an approximated model, once an opportune mathematical structure is selected, requires both a precise estimation of its parameters and the determination of the range of conditions in which the model provides accurate predictions, i.e., the domain of model reliability. A variety of model-based design of experiments (MBDoE) techniques are available in the literature for designing highly informative trials for the precise estimation of model parameters. Available MBDoE methods assume that the model structure is exact in the formulation of experimental design metrics. Hence, in the presence of an approximated model, the employment of conventional MBDoE approaches may lead to the collection and fitting of data at conditions where the model performance is very poor, thus leading to the degradation of the fitting performance and a loss of model predictive power. In this work, an iterative framework for the identification of approximated models is proposed in which the MBDoE step is constrained to the domain of model reliability. The method is tested on a simulated case study on the identification of an approximated kinetic model of catalytic ethanol dehydrogenation.

The identification of a parametric model, once a suitable model structure is proposed, requires the estimation of its non-measurable parameters. The precise quantification of the model parameters relies on the fitting of data with high Fisher information content. Model-based design of experiments (MBDoE) methods have been proposed in the literature for maximising the collection of information whenever there is a limited amount of resources available for conducting the experiments. Conventional MBDoE methods do not take into account the structural uncertainty on the model equations and may lead to a substantial miscalculation of the information in the experimental design stage. In this work, an extended formulation of the Fisher information matrix is proposed as a metric of information accounting for model misspecification. The properties of the extended Fisher information matrix are presented and discussed with the support of two simulated case studies: a biomass growth model and a bacterial population model.

Parametric models derived from simplifying modelling assumptions give an approximated description of the physical system under study. The value of an approximated model depends on awareness of its descriptive limits and on the precise estimation of its parameters. In this manuscript, a framework for identifying the domain of validity of the simplifying model hypotheses is presented. A model-based data mining method for parameter estimation is proposed as the central block to classify the observed experimental conditions as compatible or incompatible with the approximated model. A nonlinear support vector classifier is then trained on the classified (observed) experimental conditions to identify a decision function for quantifying the expected model reliability in unexplored regions of the experimental design space. The proposed approach is employed for determining the domain of reliability of a simplified kinetic model of methanol oxidation on silver catalyst.
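The paper trains a nonlinear support vector classifier; as a lightweight stand-in conveying the same idea, a kernel (Parzen-window) decision function over labelled experimental conditions, positive where the model is expected to be reliable, might look like this (all values invented):

```python
import numpy as np

# Labelled experimental conditions (temperature in K, residence time in s),
# classified as compatible (+1) or incompatible (-1) with the approximated
# model; all values are invented for illustration.
X = np.array([[300.0, 10.0], [310.0, 12.0], [320.0, 15.0],
              [400.0, 10.0], [410.0, 12.0], [420.0, 15.0]])
y = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])

def decision(x, X, y, gamma=1e-3):
    """RBF (Parzen-window) decision function: positive where the model is
    expected to be reliable. The paper uses a nonlinear SVM; this kernel
    vote is a minimal stand-in for the same concept."""
    k = np.exp(-gamma * np.sum((X - x) ** 2, axis=1))
    return float(y @ k)

reliable = decision(np.array([305.0, 11.0]), X, y) > 0.0
unreliable = decision(np.array([415.0, 11.0]), X, y) < 0.0
```

The sign of the decision function partitions the design space into regions of expected model reliability, which is exactly the role the trained classifier plays in the framework.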

The Traveling Traders' Exchange Problem (TTEP) is formalized with the aim of studying the collision-exchange systems found in various research areas. As an example of the TTEP models, a 1-D model is developed and characterized in detail. The computational stochastic simulation of the 1-D TTEP model relies on a stochastic simulation algorithm implemented on the basis of the Monte Carlo method. A model identification framework is proposed where the money distribution in the system obtained from the stochastic model is characterized in terms of (a) the standard deviation of the money redistribution; (b) its probability density function. Results indicate that the expressions of the estimated functions for terms (a) and (b) are tightly related to the system input conditions. The example of curve fitting on the probability density function shows how the variation of money redistribution in the system in time is driven by different values of the parameters describing the interaction mechanism.
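A minimal 1-D exchange simulation in this spirit (illustrative only, not the paper's algorithm): traders take random steps on a ring of sites and, when two land on the same site, exchange a random share of their combined money; the total is conserved while its spread grows over time.

```python
import random

random.seed(1)

N_TRADERS, N_SITES, STEPS, M0 = 50, 100, 2000, 100.0
pos = [random.randrange(N_SITES) for _ in range(N_TRADERS)]
money = [M0] * N_TRADERS  # everyone starts with the same amount

for _ in range(STEPS):
    # Each trader takes a +/-1 random step on a ring of sites.
    pos = [(p + random.choice((-1, 1))) % N_SITES for p in pos]
    # Traders landing on an occupied site exchange a random fraction
    # of the pair's combined money (a conservative collision rule).
    site = {}
    for i, p in enumerate(pos):
        if p in site:
            j = site[p]
            total = money[i] + money[j]
            share = random.random()
            money[i], money[j] = share * total, (1 - share) * total
        else:
            site[p] = i

mean = sum(money) / N_TRADERS
std = (sum((m - mean) ** 2 for m in money) / N_TRADERS) ** 0.5
```

Repeating such runs over many seeds and recording the standard deviation and empirical density of `money` in time yields exactly the quantities (a) and (b) that the identification framework fits.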

Online model-based redesign of experiments (OMBRE) techniques substantially reduce the experimental effort required to achieve high model reliability along with the precise estimation of model parameters. In dynamic systems, OMBRE techniques allow an experiment to be redesigned while it is still running: information gathered from samples collected at multiple time points is used to update the experimental conditions before the completion of the experiment. For processes evolving through a sequence of steady-state experiments, significant time delays may exist when collecting new information from each single run, because measurements become available only after steady-state conditions are reached. In this work an online model-based optimal redesign technique is employed in continuous flow reactors to improve the accuracy of kinetic parameter estimation, with great benefit in terms of time and analytical resources during the model identification task. The proposed approach is applied to a simulated case study and compared with conventional sequential model-based design of experiments (MBDoE) techniques as well as with offline optimal redesign of experiments.

Parameter estimation in modelling reaction kinetics is affected by the prior knowledge of the domain of variability of the model parameters, which can be very limited at the beginning of model building activities. In conventional parameter estimation approaches a reasonably wide domain of variability for the kinetic parameters is initially assumed, but this uncertainty in the domain definition might deeply affect the efficiency of model-based experimental design techniques for model validation. In this work, we propose the use of binary classification techniques to define a feasible region of parameter variability satisfying a set of user-defined model-based constraints. The proposed approach is illustrated in a case study of consecutive reactions in a plug flow reactor.

A reduced von Willebrand factor (VWF) synthesis or survival, or its increased proteolysis, alone or in combination, contributes to the development of von Willebrand disease (VWD). We describe a new, simple mechanistic model for exploring how VWF behaves in well-defined forms of VWD after its 1-desamino-8-D-arginine vasopressin (DDAVP)-induced release from endothelial cells. We aimed to ascertain whether the model can consistently predict VWF kinetic changes. The study involved 9 patients with VWD types Vicenza (a paradigmatic form with a reduced VWF survival), 8 type 2B, 2 type 2A-I, 1 type 2A-II (associated with an increased VWF proteolysis), and 42 normal controls, whose VWF levels were measured after a 24-hour-long DDAVP test. The rate constants considered were: k0, associated with the VWF release phase; k1, illustrating the phase of conversion from high- to low-molecular-weight VWF multimers; and ke, associated with the VWF elimination phase. The amount of VWF released (D) was also measured. ke and D were significantly higher in O than in non-O blood group controls; k1 was also higher, but less markedly so. All the parameters were accelerated in type Vicenza, especially ke (p < 0.0001), which explains the significant reduction in VWF half-life. In types 2B and 2A-II, k1 was one order of magnitude higher than in controls, which explains their loss of large VWF multimers. All parameters except ke were lower in type 2A-I. The proposed mechanistic model clearly describes the altered biochemical pathways in well-characterized VWD, prompting us to suggest that it might help clarify elusive forms of VWD too.

Von Willebrand disease (VWD) is one of the main inherited coagulation disorders. It is caused by a deficiency and/or a dysfunction of the von Willebrand factor (VWF), a fundamental multimeric glycoprotein involved in the hemostasis process. Correct detection of the disease is not an easy task because the disease manifests itself in many variants and a high intra-subject variability is observed. For these reasons, the diagnostic clinical trials typically rely on a 24-h sampling protocol, which makes the overall test long, stressful, and costly. Using a new pharmacokinetic model derived from Galvanin et al.'s 2014 study, this study aims at i) assessing the theoretical possibility of performing a shorter clinical test and ii) proposing a set of model-based diagnostic methods as a support for the clinical team. A preliminary information analysis is performed in order to understand which sampling instants are more informative for model identification. This allowed us to propose a novel 8-h diagnostic protocol, which is still able to ensure model identifiability. Three alternative diagnostic methods are then proposed based on this short-length clinical protocol. One of them directly uses the pharmacokinetic model, whereas the other two use three indices (two pharmacokinetic indices, namely clearance and total VWF released, plus the basal multimer ratio) to formulate the diagnosis problem as a classification one. The classification problem is then solved using K-nearest neighbours and linear discriminant analysis. Results show the theoretical feasibility of a VWD diagnosis based on a shorter protocol.
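As an illustration of the classification route (a hand-rolled k-nearest-neighbours vote on invented index values, not the clinical data or the paper's fitted classifier):

```python
import numpy as np

# Illustrative training set: three indices per subject
# (clearance, total VWF released, basal multimer ratio); values invented.
X = np.array([
    [0.30, 20.0, 1.00],  # healthy
    [0.35, 22.0, 0.95],  # healthy
    [1.20, 18.0, 1.00],  # type Vicenza (fast clearance)
    [1.10, 17.0, 0.90],  # type Vicenza
    [0.40, 19.0, 0.40],  # type 2B (low multimer ratio)
    [0.45, 21.0, 0.45],  # type 2B
])
labels = ["healthy", "healthy", "Vicenza", "Vicenza", "2B", "2B"]

def knn_diagnose(x, X, labels, k=3):
    """Plain k-nearest-neighbours majority vote on standardised indices."""
    mu, sd = X.mean(axis=0), X.std(axis=0)
    d = np.linalg.norm((X - mu) / sd - (x - mu) / sd, axis=1)
    nearest = [labels[i] for i in np.argsort(d)[:k]]
    return max(set(nearest), key=nearest.count)

# New subject: fast clearance, normal multimer ratio.
diagnosis = knn_diagnose(np.array([1.15, 17.5, 0.95]), X, labels)
```

Standardising the indices first matters here, because clearance and total VWF released live on very different scales; linear discriminant analysis would be applied to the same feature table.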

Bimetallic Au-Pd nanoparticles supported on TiO2 show excellent catalytic activity and selectivity to benzaldehyde in the solvent-free transformation of benzyl alcohol to benzaldehyde, where toluene is the main observed by-product, together with smaller amounts of benzoic acid, benzyl benzoate and dibenzyl ether. However, despite the industrial relevance of this reaction and the importance of tuning the selectivity to the desired benzaldehyde, only a few attempts have been made in the literature to model the reaction kinetics for a quantitative description of this reaction system. A kinetic model for the oxidation of benzyl alcohol over Au-Pd is proposed in this paper. The model assumes that hydrogenolysis, disproportionation and dehydrogenation reactions may occur in parallel, and it was found satisfactory after a model discrimination procedure was applied to a number of simplified candidate models developed from microkinetic studies. Despite its relative simplicity, the proposed model is capable of representing the reactant conversion and distribution of products observed in experiments carried out at different temperatures, pressures and catalyst masses in a stirred batch reactor. Major findings include the quantitative evaluation of the impact of the hydrogenolysis and disproportionation pathways on benzaldehyde production. At low temperature the disproportionation reaction is the dominant route to toluene formation, while hydrogenolysis dominates at high temperature.

2017

The development of comprehensive models for describing kinetic phenomena typically requires the extensive employment of resources, both for determining the model structure and for precisely estimating the set of model parameters. For this reason, approximated model structures are normally employed instead of comprehensive ones. The estimation of parameters in an incorrect model is not a trivial task. The expected information for the estimation of the model parameters is evaluated through the Fisher Information Matrix (FIM). Model-based design of experiments (MBDoE) methods for parameter precision do not take into account the structural model uncertainty in the formulation of FIM-based information metrics. In this work, an Extended FIM (E-FIM) is proposed, which carries information about the incorrect model structure. Its properties are discussed through its application on a simulated case with a biomass growth model.

Von Willebrand disease (VWD) is the most common inherited bleeding disease and is caused by deficiency or dysfunction of a multimeric glycoprotein, namely the von Willebrand factor (VWF). The disease is present in numerous subtypes, making the diagnosis through the classic 24-h DDAVP sampling protocol a difficult task. In this simulation study, a new simplified pharmacokinetic model is proposed with the aims of demonstrating that a shorter DDAVP clinical test can be devised and that indices, such as clearance, total amount of VWF released and multimeric ratio at basal state can be exploited with multivariate classification methods in order to help practitioners to reach a correct diagnosis.

The travelling traders' exchange problem (TTEP) is a general mathematical problem arising in a number of applications where the purpose is to characterise the distribution of money over time related to a population of traders which can move in space and interact with each other. Results from stochastic simulations of TTEP models can be analysed over time in terms of i) standard deviation (STD); ii) probability density function (PDF) of the observations in time. A two-layer model identification strategy is proposed in this paper for the development of time-dependent nonlinear regression models from the results of TTEP computational stochastic simulations. The models are capable of representing the money distribution as a function of the TTEP operating parameters, paving the way to a new framework for model identification.

The purpose of this study was to prepare ginkgolide B (GB) lyophilized powder for injection with excellent appearance and stable quality through a formulation screening and by optimizing the freeze-drying process. [...]

The kinetics of gas-liquid methoxycarbonylation of ethylene using 0.0013 mol/L Pd(dtbpx)(dba) homogeneous catalyst at 100 °C and 10 bar were studied in a continuous flow Hastelloy capillary microreactor of 1 mm internal diameter. Characterisation of the hydrodynamics was conducted to confirm plug flow behaviour and evaluate liquid volume fraction, both important for reactor modelling. Reaction experiments were carried out to investigate the effect of ethylene, methanol and carbon monoxide concentrations on the observed reaction rate. Vapour-liquid equilibrium was employed to calculate component concentrations at the inlet and outlet reactor conditions from the experimental data. In conjunction with a reactor model, the results were used to evaluate kinetic models based on the Pd-hydride catalytic cycle. A kinetic model considering methanolysis as the rate limiting step agreed with the experimental data. A model-based design of experiments strategy was applied for selecting the most informative experiments to achieve a precise estimation of the kinetic model parameters.

2016

Simple and reliable phenomenological models always represent an attractive and powerful instrument in several chemical and biological industrial processes. A trustworthy model can potentially predict the response of a system outside the investigated range of experimental conditions and can be fruitfully exploited for the purposes of process design and non-empirical process optimization. Following the seminal work by Box and Lucas, a number of works have appeared in the scientific literature regarding discrimination among candidate models and the precise identification of model parameters through model-based design of experiments (MBDoE) techniques for model discrimination and parameter precision (PP) in nonlinear dynamic systems. However, the application of these techniques always starts from an existing set of candidate models, while direct guidelines for subsequent model improvement are not obvious. A well-established systematic technique for quick model development and enhancement has not yet been proposed. [...]

The Nernst-Planck approach, previously used to model the electrodialytic recovery of uni-, di- or tri-valent electrolytes, was used to accomplish the desalination of concentrated brines with an initial NaCl concentration up to 4.6 kmol m-3. The complexity of the proposed model is such that extensive experimentation is required for a statistically sound estimation of the relevant model parameters, including the solute (tB) and water (tW) transport numbers through the ion-selective membranes; the solute (LB) and water (LW) transport rates by diffusion; and the average electro-membrane resistance (R). A model-based design of experiments (MBDoE) approach is proposed in this paper to minimise the number of trials and resources required for model identification. The use of this approach in an experimental case study allowed a dramatic reduction of the experimentation time from 1080 min (corresponding to a classical experimentation with multiple batch desalination trials) to 30-60 min, corresponding to a single optimal batch desalination experiment. The results obtained show the potential of MBDoE for the quick development and assessment of electrodialysis models, where highly predictive capability can be achieved with minimal experimental time and waste of resources.

Despite the great industrial importance of benzaldehyde as a reaction intermediate, only a few attempts have been made in the literature to develop kinetic models capable of quantitatively characterising the catalytic oxidation of benzyl alcohol to benzaldehyde in both batch and flow systems. The purpose of this paper is to merge the information obtained from a laboratory-scale batch glass stirred reactor (GSR) with the information obtained from a continuous-flow micro-packed bed reactor (MPBR) for an accurate and quantitative description of the product distribution in these reaction systems. A two-stage procedure is applied for this purpose, where experimental design techniques are used to evaluate the most promising regions of the experimental space for the identification of kinetic models.

The hydrodynamics of a three-phase micro-packed bed reactor and its effect on catalysed benzyl alcohol oxidation with pure oxygen were studied in a silicon-glass microstructured reactor. The microreactor was operated at 120 °C and 1 barg and contained a channel with a 300 µm × 600 µm cross-section, packed with 1 wt% Au-Pd/TiO2 catalyst, 65 µm in average diameter. Improvements in the conversion of benzyl alcohol and selectivity to benzaldehyde were observed with increasing gas-to-liquid ratio, which coincided with a change in the flow pattern from a liquid-dominated slug to a gas-continuous flow regime. The observed enhancement is attributed to improved external mass transfer, associated with an increase in the gas-liquid interfacial area and a reduction in the liquid film thickness that occur with gradual changes in the flow pattern. A maximum selectivity of 93% to benzaldehyde was obtained under partial wetting, which introduced the added benefit of direct gas-solid mass transfer, outperforming the selectivity in a conventional glass stirred reactor. However, this was at the expense of a reduction in the conversion. A response surface model was developed and then used to predict optimal operating conditions for maximum benzaldehyde yield, which were in the gas-continuous flow regime. This corresponded to a relatively high gas flow rate in conjunction with a moderate liquid flow rate, ensuring sufficient catalyst wetting with a thin film to reduce transport resistances.

Online model-based design of experiments techniques were proposed to exploit the progressive increase of information resulting from the running experiment, but they currently exhibit some limitations: the redesign time points are chosen a priori, and the first design may be heavily affected by the initial parametric mismatch. To address these issues, an information-driven redesign optimisation (IDRO) strategy is proposed here: a robust approach is adopted and a new design criterion based on the maximisation of a target profile of dynamic information is introduced. The methodology determines automatically when to redesign the experiment, thus guaranteeing that an acceptable increase in the information content has been achieved before proceeding with the intermediate estimation of the parameters and the subsequent redesign of the experiment. The effectiveness of the new experiment design technique is demonstrated through two simulated case studies.

Continuous flow laboratory reactors are typically used for the development of kinetic models for catalytic reactions. Sequential model-based design of experiments (MBDoE) procedures have been proposed in the literature, where experiments are optimally designed for discriminating amongst candidate models or for improving the estimation of kinetic parameters. However, the effectiveness of these procedures is strongly affected by the initial model uncertainty, leading to suboptimal design solutions and a higher number of experiments to be executed. A joint model-based design of experiments (j-MBDoE) technique, based on multi-objective optimization, is proposed in this paper for the simultaneous solution of the dual problem of discriminating among competitive kinetic models and improving the estimation of the model parameters. The effectiveness of the proposed design methodology is tested and discussed through a simulated case study for the identification of kinetic models of methanol oxidation over silver catalyst.

2015

Microreactor platforms represent advanced tools in reaction engineering for the quick development of reliable kinetic models. Experiments can be performed with better reaction temperature control, enhanced heat and mass transfer and improved mixing of reactants. However, the effectiveness of the model identification procedure is strictly related to the execution of properly designed experiments, allowing elucidation of the reaction mechanisms and providing a precise estimation of the kinetic parameters. In this paper a model-based design of experiments (MBDoE) approach is proposed where experiments are designed both for discriminating among competing models and for improving the estimation of kinetic parameters. The procedure is tested on a real case study related to the identification of kinetic models of methanol oxidation on silver catalyst.

Partial oxidation of methanol to formaldehyde on silver catalyst represents an important industrial process due to the versatility of formaldehyde as an intermediate in chemical synthesis. The development of kinetic models is essential for a quantitative description of the reaction-induced changes in concentration of the chemical species involved in the process, as well as for process design and optimisation purposes. Microreactor platforms represent effective tools for the quick development of reliable kinetic models. However, the development and identification of kinetic models is strictly related to the execution of informative experiments, allowing both the elucidation of the complex reaction pathways involved in the oxidation process and a precise estimation of the kinetic parameters for each candidate model. In this work a model-based design of experiments (MBDoE) procedure is proposed where experiments are optimally designed both for discriminating among competing models and for improving the estimation of kinetic parameters. The proposed methodology allows the most influential reaction pathways to be elucidated and provides a sequence of optimally informative experiments showing the key role of temperature in the kinetic model identification procedure.

Partial oxidation of methanol to formaldehyde on silver catalyst represents an important industrial process due to the versatility of formaldehyde as an intermediate in chemical synthesis. The development of kinetic models is essential for a quantitative description of the concentration of the chemical species involved in the process as well as for process design and optimisation purposes. However, the development and identification of reliable kinetic models is strictly related to the execution of informative experiments, allowing for the elucidation of the complex reaction pathways involved in the oxidation process and providing a precise estimation of the kinetic parameters for each model proposed in the study. In this paper a model-based design of experiments (MBDoE) approach is used for planning optimally informative experiments for the development of kinetic models of methanol oxidation on silver catalyst. Experiments are carried out in microreactor platforms where better reaction temperature control, accelerated heat and mass transfer and enhanced mixing of reactants can be achieved.

MBDoE aims at designing a set of experiments yielding the most informative process data to be used for the parameter estimation of first-principles dynamic process models. According to the standard procedure described in the literature [1], the experiment is generally designed offline; then it is carried out in the plant/lab, process measurements are collected, and parameter estimation is carried out only at the end of the experimental run. Since the experiment is designed on the basis of the initially available parameter estimates, the progressive increase of the information resulting from the progress of the test is not exploited. To overcome this problem, new techniques were proposed [2,3], where the information is exploited as soon as it is generated by the execution of an experiment, by redesigning the experiment online through intermediate parameter estimations. This technique enables users to reduce the number of experimental trials needed to reach a statistically sound estimation of model parameters and results in a substantial reduction of time and costs. Nevertheless, this technique exhibits some limitations potentially hindering the effectiveness of the redesign procedure: on the one hand, the time point at which to redesign the experiment is chosen a priori, without verifying whether enough information has indeed been collected to obtain an improvement in the estimation of the parameter values; on the other hand, the first design may be heavily affected by the initial parametric mismatch. To overcome those problems a new strategy is proposed here. Its main advantages are that a robust approach [4] is adopted within the online redesign procedure and, most importantly, that a new design criterion based on the maximisation of a target profile of dynamic information is introduced.
The methodology automatically determines when to redesign the experiment, thus guaranteeing that a sufficient increase in the information content has been achieved before proceeding with the intermediate estimation of the parameters and the redesign of the experiment. Furthermore, the robust approach reduces the negative effects of the initial parametric mismatch. The effectiveness of the new experiment design technique is demonstrated through a simulated case study.
References:
[1] Franceschini G., Macchietto S., 2008. Model-based design of experiments for parameter precision: state of the art. Chem. Eng. Sci., 63, 4846-4872.
[2] Stigter J.D., Vries D., Keesman K.J., 2008. On adaptive optimal input design: a bioreactor case study. AIChE J., 52, 3290-3296.
[3] Galvanin F., Barolo M., Bezzo F., 2009. Online model-based redesign of experiment for parameter estimation in dynamic systems. Ind. Eng. Chem. Res., 48, 4415-4427.
[4] Asprey S.P., Macchietto S., 2002. Designing robust optimal dynamic experiments. J. Process Contr., 12, 545-556.

2014

Von Willebrand disease (VWD) is the most common inherited coagulation disorder in humans. It originates from a deficiency and/or dysfunction of von Willebrand factor (VWF), a large multimeric glycoprotein playing a central role in the hemostasis process. VWD occurs in a large variety of forms, and its symptoms may range from sporadic nosebleeds and mild bleeding from small skin lesions to acute thrombocytopenia or prolonged bleeding episodes. Diagnosing VWD may be complicated because of the heterogeneous nature of the disorder. Two mechanistic models of VWD are proposed in this article, and their performance is assessed using clinical data. The models allow for the automatic detection of the disease, as well as for a quantitative assessment of VWF multimer distribution patterns, thus elucidating the critical pathways involved in disease recognition and characterization.

Despite its high potential as a feedstock for the production of fuels and chemicals, the industrial cultivation of microalgae still presents many issues. Yield in microalgae cultivation systems is limited by the solar energy that can be harvested. The availability of reliable models representing the key phenomena affecting algal growth may help in designing and optimizing effective production systems at an industrial level. In this work the complex influence of different light regimes on the growth of the seawater alga Nannochloropsis salina is represented by first-principles models. Experimental data, such as in vivo fluorescence measurements, are employed to develop the model. The proposed model reliably describes all growth curves and fluorescence data. The model structure is assessed and modified in order to guarantee model identifiability and the robust and reliable estimation of its parametric set.

In this paper the effectiveness of dense phase carbon dioxide (DPCD) treatment in inactivating different bacterial strains inoculated on the surface of solid food matrices is studied. Bacterial survival is investigated on three distinct matrices: Salmonella enterica spiked on fresh-cut coconut (Cocos nucifera), Escherichia coli on fresh-cut carrot (Daucus carota) and Listeria monocytogenes on dry-cured ham surface. Bacterial inactivation experiments are carried out in order to develop and identify mathematical models, whose relative performance is assessed in terms of goodness of fit and a posteriori statistics obtained after parameter estimation. Operational maps illustrating the time required to achieve an assigned inactivation degree are built in order to guide the choice of the best operating conditions to be used in the process. The results demonstrate the potential of relatively simple correlative models to represent the DPCD pasteurisation process at different experimental conditions, paving the way to more complex model formulations that can be used in DPCD process design and optimisation.

Von Willebrand disease (VWD) is the most common inherited coagulation disorder in humans. It originates from a deficiency and/or dysfunction of von Willebrand factor (VWF), a large multimeric glycoprotein playing a central role in the hemostasis process. Diagnosing VWD may be complicated because of the heterogeneous nature of the disorder. A new mechanistic model of VWD, identified from clinical data, is presented in this paper. The model allows for the automatic detection of VWD variants, elucidating the critical pathways involved in disease recognition and characterisation.

2013

The use of pharmacokinetic (PK) and pharmacodynamic (PD) models is a common and widespread practice in the preliminary stages of drug development. However, PK–PD models may be affected by structural identifiability issues intrinsically related to their mathematical formulation. A preliminary structural identifiability analysis is usually carried out to check if the set of model parameters can be uniquely determined from experimental observations under the ideal assumptions of noise-free data and no model uncertainty. However, even for structurally identifiable models, real-life experimental conditions and model uncertainty may strongly affect the practical possibility to estimate the model parameters in a statistically sound way. A systematic procedure coupling the numerical assessment of structural identifiability with advanced model-based design of experiments formulations is presented in this paper. The objective is to propose a general approach to design experiments in an optimal way, detecting a proper set of experimental settings that ensure the practical identifiability of PK–PD models. Two simulated case studies based on in vitro bacterial growth and killing models are presented to demonstrate the applicability and generality of the methodology to tackle model identifiability issues effectively, through the design of feasible and highly informative experiments.

Intravenous glucose tolerance tests (IVGTTs) are typically used to assess insulin resistance and insulin secretion activity in subjects affected by type 2 diabetes through the adoption of minimal models. However, the amount of information that can be obtained from IVGTTs for the purpose of model identification is intrinsically related to the dynamics triggered by the intravenous glucose infusion and to individual specificity. This paper shows how the information content of clinical data from conventional IVGTTs can be handled by model-based design of experiments (MBDoE) techniques when the goal is to estimate the set of parameters of a complex model of type 2 diabetes. MBDoE makes it possible to analyse and improve the information content of IVGTTs by optimising the sample allocation so as to decrease the degree of correlation between critical parameters.

The identification of individual parameters of detailed physiological models of type 1 diabetes can be carried out by clinical tests designed optimally through model-based design of experiments (MBDoE) techniques. So far, MBDoE for diabetes models has been considered for discrete glucose measurement systems only. However, recent advances on sensor technology allowed for the development of continuous glucose monitoring systems (CGMSs), where glucose measurements can be collected with a frequency that is practically equivalent to continuous sampling. To specifically address the features of CGMSs, in this paper the optimal clinical test design problem is formulated and solved through a continuous, rather than discrete, approach. A simulated case study is used to assess the impact of CGMSs both in the optimal clinical test design problem and in the subsequent parameter estimation for the identification of a complex physiological model of glucose homeostasis. The results suggest that, although the optimal design of a clinical test is simpler if continuous glucose measurements are made available through a CGMS, the noise level and formulation may make continuous measurements less suitable for model identification than their discrete counterparts.

2012

Model-based design of experiment (MBDoE) techniques are a useful tool to maximise the information content of experimental trials when the purpose is identifying the set of parameters of a deterministic model in a statistically sound way. In a conventional MBDoE procedure, the information gathered during the evolution of an experiment is exploited only at the end of the experiment itself. Conversely, online model-based redesign of experiment (OMBRE) techniques have been recently proposed to exploit the information as soon as it is generated by the running experiment, allowing for the dynamic update of the experimental conditions to yield the most informative data in order to improve the parameter identification task. However, the effectiveness of MBDoE strategies (including OMBRE) may be severely affected by the presence of systematic modelling errors as well as by disturbances acting on the system. In this paper, a novel experiment design approach (DE-OMBRE) is presented, where a model updating policy including disturbance estimation (DE) is embedded within an OMBRE strategy in order to achieve a statistically satisfactory estimation of the model parameters as well as to estimate the possible discrepancy between the real system and the model being identified. The procedure allows reducing (or even avoiding) constraint violations, preserving the optimality of the redesign even in the presence of systematic errors and/or unknown disturbances acting on the system. Two simulated case studies of different levels of complexity are used to illustrate the benefits of the novel approach.

The use of detailed pharmacokinetic (PK) and pharmacodynamic (PD) models in order to investigate drug resistance and the susceptibility breakthrough by means of in-vivo or in-vitro trials is a widespread practice in the preliminary stages of drug development. However, complex PK-PD models are usually affected by identifiability issues typically related to their specific model structure and to the strong correlation among the model parameters. Model-based design of experiments (MBDoE) techniques can be successfully adopted to design multiple experiments to be executed simultaneously, detecting a proper set of experimental settings improving the identifiability of the model parameters. The preliminary results presented in this paper show that designing experiments in parallel, rather than sequentially, can substantially decrease the time and effort required by the model identification task for a microbial growth model.

2011

Online Model-Based Redesign of Experiment (OMBRE) strategies represent a valuable support to the development of dynamic deterministic models, allowing for the dynamic update of the experimental conditions to yield the most informative data for the parameter identification task. However, the effectiveness of OMBRE strategies may be severely affected by the presence of systematic modelling errors. In this paper, a disturbance estimation approach is exploited within an OMBRE framework (DEOMBRE) in order to achieve a statistically satisfactory estimation of the model parameters, thus avoiding (or reducing) constraint violations even in the presence of systematic modelling errors. A case study illustrates the benefits of the new approach.

Model-based design of experiments (MBDoE) techniques are a useful tool to maximize the information content of experimental trials when the purpose is identifying the set of parameters of a deterministic model in a statistically sound way. Traditionally, the problem of MBDoE has been addressed for discrete measurement systems. In this case, formulation of the optimal design problem is based on maximization of the expected information, usually calculated from discrete forms of the Fisher information matrix. However, current measurement technology allows measurements to be taken at a much higher frequency than in the past, to a point that measurements may be assumed to be obtained in a continuous way. A novel design criterion allowing for the continuous model-based design of experiments (CMBDoE) is formulated in this paper by optimizing a continuous measurement function of the Fisher information matrix, with the purpose of reaching a statistically satisfactory estimation of model parameters in a computationally efficient way. The benefits of the proposed strategy are discussed by means of two simulated case studies, where the effectiveness of the design is assessed by comparison to a standard MBDoE approach.
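
The discrete-versus-continuous distinction can be illustrated with a small sketch; the first-order model, noise level and sampling times below are hypothetical, not taken from the paper's case studies. The Fisher information matrix is assembled from parameter sensitivities either as a sum over discrete samples or as a time integral of the same information density:

```python
import numpy as np

def sensitivities(t, theta):
    # analytic sensitivities dy/dtheta for the hypothetical model
    # y = theta0 * (1 - exp(-theta1 * t))
    t = np.asarray(t, dtype=float)
    s0 = 1.0 - np.exp(-theta[1] * t)
    s1 = theta[0] * t * np.exp(-theta[1] * t)
    return np.column_stack([s0, s1])

def fim_discrete(t_samples, theta, sigma=0.05):
    # discrete FIM: sum over sampling points of s s^T / sigma^2
    J = sensitivities(t_samples, theta)
    return J.T @ J / sigma**2

def fim_continuous(t_end, theta, sigma=0.05, n_quad=2000):
    # continuous analogue: trapezoidal integration of the same
    # information density over the whole experiment horizon
    t = np.linspace(0.0, t_end, n_quad)
    J = sensitivities(t, theta)
    w = np.full(n_quad, t[1] - t[0])
    w[0] = w[-1] = w[0] / 2.0
    return np.einsum('t,ti,tj->ij', w, J, J) / sigma**2

theta = np.array([1.0, 0.5])
H_disc = fim_discrete([1.0, 2.0, 4.0, 8.0], theta)
H_cont = fim_continuous(10.0, theta)
```

A D-optimal design would then compare (log-)determinants of such matrices over candidate experimental settings.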

Model-Based Design of Experiments (MBDoE) techniques represent a valuable tool to increase the information content of clinical tests with the purpose of identifying the set of parameters of physiological models of type 1 diabetes mellitus. However, conventional MBDoE techniques are affected by some limitations. Prior uncertainty in the model parameters and model mismatch may lead the constrained design procedure to predict clinical tests that turn out to be suboptimal or, even worse, unsafe for the subject. Advanced MBDoE techniques, including online model-based redesign of experiments, can be used to preserve the effectiveness of the experiment design sessions, exploiting more efficiently the nearly continuous information flux coming from continuous glucose monitoring systems (CGMSs). In this paper a simulated case study is used to assess the impact of advanced redesign techniques on exploiting CGMS data in the experiment design and subsequent parameter estimation for the identification of a complex physiological model of glucose homeostasis.

How to design a clinical test aimed at identifying, in the safest, most precise and quickest way, the subject-specific parameters of a detailed model of glucose homeostasis in type 1 diabetes is the topic of this article. Recently, standard techniques of model-based design of experiments (MBDoE) for parameter identification have been proposed to design clinical tests for the identification of the model parameters for a single type 1 diabetic individual. However, standard MBDoE is affected by some limitations. In particular, the existence of a structural mismatch between the response of the subject and that of the model to be identified, together with initial uncertainty in the model parameters, may lead to the design of clinical tests that are sub-optimal (scarcely informative) or even unsafe (the actual response of the subject might be hypoglycaemic or strongly hyperglycaemic). The integrated use of two advanced MBDoE techniques (online model-based redesign of experiments and backoff-based MBDoE) is proposed in this article as a way to effectively tackle the above issue. Online model-based experiment redesign is utilised to exploit the information embedded in the experimental data as soon as the data become available, and to adjust the clinical test accordingly whilst the test is running. Backoff-based MBDoE explicitly accounts for model parameter uncertainty, and allows one to plan a test that is both optimally informative and safe by design. The effectiveness and features of the proposed approach are assessed and critically discussed via a simulated case study based on state-of-the-art detailed models of glucose homeostasis. It is shown that the proposed approach based on advanced MBDoE techniques makes it possible to define safe, informative and subject-tailored clinical tests for model identification, with limited experimental effort.

2010

Model-based design of experiments (MBDoE) techniques are a useful tool to maximise the information content of experimental trials when the purpose is identifying the set of parameters of a deterministic model in a statistically sound way. When samples are collected in a discrete way, the formulation of the optimal design problem is based on the maximisation of the expected information, usually calculated from discrete forms of the Fisher information matrix. However, if a continuous measurement system is available, information can be acquired gradually in a continuous way, and a new MBDoE approach is required to take into account the specificity of the measurement system. In this paper a novel design criterion is formulated by optimising a continuous measurement of the Fisher information matrix, with the purpose of reaching a statistically satisfactory estimation of model parameters in the easiest and quickest way. The benefits of the proposed strategy are discussed through a simulated case study, where the effectiveness of the design is assessed by comparison to a standard MBDoE approach.

An examination of systematic techniques for the design of sustainable processes and products, this book covers reducing energy consumption, preventing pollution, developing new pathways for biofuels, and producing environmentally friendly ...

Model-Based Design of Experiments (MBDoE) techniques can be a valuable tool to improve the information content of clinical tests when the purpose is to identify the set of parameters of physiological models in type 1 diabetes mellitus care. Recent advances in sensor technology have allowed for the development of continuous glucose monitoring systems (CGMSs), where measurements can be collected with a frequency much higher than previously possible. In this paper a dynamic approach to model-based design of experiments is adopted to specifically tailor the design procedure to the features of a CGMS. A simulated case study is used to assess the impact of CGMSs in the experiment design and subsequent parameter estimation for the identification of a complex physiological model of glucose homeostasis. Results are compared to an MBDoE applied to a conventional discrete measurement system.

Model-based design of experiments (MBDoE) techniques are a very useful tool for the rapid assessment and development of dynamic deterministic models, providing significant support to the model identification task in a broad range of process engineering applications. These techniques make it possible to maximise the information content of an experimental trial by acting on the settings of an experiment in terms of initial conditions, profiles of the manipulated inputs, and number and time location of the output measurements. Despite their popularity, standard MBDoE techniques are still affected by some limitations. In fact, when a set of constraints is imposed on the system inputs or outputs, factors like uncertainty in prior parameter estimates and structural system/model mismatch may lead the design procedure to plan experiments that turn out, in practice, to be suboptimal (i.e. scarcely informative) and/or unfeasible (i.e. violating the constraints imposed on the system). Additionally, standard MBDoE techniques were originally developed considering a discrete acquisition of the information. Therefore, they do not consider that information on the system could be acquired very frequently if the system responses were recorded in a continuous manner. In this Dissertation three novel MBDoE methodologies are proposed to address the above issues. First, a strategy for the online model-based redesign of experiments is developed, where the manipulated inputs are updated while an experiment is still running. Thanks to intermediate parameter estimations, the information is exploited as soon as it is generated from an experiment, with great benefit in terms of precision and accuracy of the final parameter estimates and of experimental time.
Secondly, a general methodology is proposed to formulate and solve the experiment design problem by explicitly taking into account the presence of parametric uncertainty, so as to ensure by design both feasibility and optimality of an experiment. A prediction of the system responses for the given parameter distribution is used to evaluate and update suitable backoffs from the nominal constraints, which are used in the design session in order to keep the system within a feasible region with specified probability. Finally, a design criterion particularly suitable for systems where continuous measurements are available is proposed in order to optimise the information dynamics of the experiments from the very beginning of the trial. This approach allows tailoring the design procedure to the specificity of the measurement system. A further contribution of this Dissertation is aimed at assessing the general applicability of both standard and advanced MBDoE techniques to the biomedical area, where unconventional experiment design applications are faced. In particular, two identification problems are considered: one related to optimal drug administration in cancer chemotherapy, and one related to glucose homeostasis models for subjects affected by type 1 diabetes mellitus (T1DM). Particular attention is drawn to the optimal design of clinical tests for the parametric identification of detailed physiological models of T1DM. In this latter case, advanced MBDoE techniques are used to ensure a safe and optimally informative clinical test for model identification. The practicability and effectiveness of a complex approach simultaneously taking into account the redesign-based and the backoff-based MBDoE strategies are also shown.
The proposed experiment design procedure provides alternative test protocols that are sufficiently short and easy to carry out, and allow for a precise, accurate and safe estimation of the model parameters defining the metabolic portrait of a diabetic subject.

2009

A model-based experiment design technique is used to design improved test protocols for the identification of the parameters of detailed physiological models of diabetes specific to single subjects. This paper considers the problem of parameter identification, focusing on the impact of decision variables such as sampling frequency and test duration on both the design effectiveness and the ability to meet safety-critical constraints on the subject response. The proposed methodology makes it possible to establish the minimal experimental budget required to achieve a satisfactory parameter estimation from the planned test without upsetting the subject excessively.

The optimal model-based design of experiments aims at designing a set of dynamic experiments yielding the most informative process data to be used for the estimation of the parameters of a first-principles dynamic process model. According to the usual procedure for parameter estimation, the experiment is first designed offline; then, the experiment is carried out in the plant, and process measurements are collected; and finally, parameters are estimated after completion of the experiment. Therefore, the information gathered during the evolution of the experiment is analyzed only at the end of the experiment itself. Since the experiment is designed on the basis of the parameter estimates available before the experiment is started, the progressive increase of the information resulting from the progress of the experiment is not exploited by the designer until the end of that experiment. In this paper, a strategy for the online model-based redesign of experiments is proposed to exploit the information as soon as it is generated from the execution of an experiment, and its performance is compared to that of a standard optimal experiment design approach. Intermediate parameter estimations are carried out while the experiment is running, and by exploiting the information obtained, the experiment is partially redesigned before its termination, with the purpose of updating the experimental settings to generate more valuable information for subsequent analysis. This enables us to reduce the number of experimental trials that are needed to reach a statistically sound estimation of the model parameters and results in a reduction of experimental time, raw materials needs, number of samples to be analyzed, control effort, and labor. Two simulated case studies of increasing level of complexity are used to demonstrate the benefits of the proposed approach with respect to a state-of-the-art sequential model-based experiment design.
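
A minimal sketch of the online-redesign loop is given below, assuming a hypothetical one-response kinetic model, noise level and candidate sampling grid, with a crude damped Gauss-Newton step standing in for full maximum-likelihood estimation: after each new measurement the parameters are re-estimated, and the next sampling time is the one that most increases the determinant of the Fisher information matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
TRUE = np.array([1.0, 0.5])   # "plant" parameters, unknown to the designer
SIGMA = 0.02                  # measurement noise standard deviation

def model(t, th):
    # hypothetical first-order response y = th0 * (1 - exp(-th1 * t))
    return th[0] * (1.0 - np.exp(-th[1] * np.asarray(t, dtype=float)))

def sens(t, th):
    # analytic sensitivities dy/dtheta
    t = np.asarray(t, dtype=float)
    return np.column_stack([1.0 - np.exp(-th[1] * t),
                            th[0] * t * np.exp(-th[1] * t)])

def estimate(ts, ys, th0, iters=30):
    # intermediate parameter estimation via damped Gauss-Newton
    th = th0.copy()
    for _ in range(iters):
        J = sens(ts, th)
        step = np.linalg.lstsq(J, ys - model(ts, th), rcond=None)[0]
        th = th + 0.8 * step
    return th

candidates = np.linspace(0.2, 10.0, 50)
ts = np.array([0.5, 1.0, 2.0])                     # preliminary samples
ys = model(ts, TRUE) + rng.normal(0.0, SIGMA, ts.size)
theta = np.array([0.8, 0.8])                       # poor initial guess

for _ in range(6):                                 # online redesign loop
    theta = estimate(ts, ys, theta)
    H = sens(ts, theta).T @ sens(ts, theta)
    # next sample: largest increase of det(FIM) given the current estimate
    t_new = max(candidates,
                key=lambda tc: np.linalg.det(H + np.outer(sens([tc], theta)[0],
                                                          sens([tc], theta)[0])))
    ts = np.append(ts, t_new)
    ys = np.append(ys, model(t_new, TRUE) + rng.normal(0.0, SIGMA))
```

The estimate improves as each redesigned sample lands where the current parameter uncertainty matters most, which is the mechanism the paper exploits to shorten the experimental campaign.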

Type 1 diabetes mellitus is a disease affecting millions of people worldwide and causing the expenditure of millions of euros every year for health care. One of the most promising therapies derives from the use of an artificial pancreas, based on a control system able to maintain the normoglycaemia in the subject affected by diabetes. A dynamic simulation model of the glucose-insulin system can be useful in several circumstances for diabetes care, including testing of glucose sensors, insulin infusion algorithms, and decision support systems for diabetes. This paper considers the problem of the identification of single individual parameters in detailed dynamic models of glucose homeostasis. Optimal model-based design of experiment techniques are used to design a set of clinical tests that allow the model parameters to be estimated in a statistically sound way, while meeting constraints related to safety of the subject and ease of implementation. The model with the estimated set of parameters represents a specific subject and can thus be used for customized diabetes care solutions. Simulated results demonstrate how such an approach can improve the effectiveness of clinical tests and serve as a tool to devise safer and more efficient clinical protocols, thus providing a contribution to the development of an artificial pancreas.

2008

Model-based experiment design aims at detecting a set of experimental conditions yielding the most informative process data to be used for the estimation of the process model parameters. In this paper, a novel on-line strategy for the optimal model-based re-design of experiments is presented and discussed. The novel technique allows the dynamic update of the control variable profiles while an experiment is still running, and can embody a dynamic investigation of different directions of information through the adoption of modified design criteria. A case study illustrates the benefits of the new approach when compared to a conventional design.

2007

Advanced model-based experiment design techniques are essential for the rapid development, refinement, and statistical assessment of deterministic process models. One objective of experiment design is to devise experiments yielding the most informative data for use in the estimation of the model parameters. Current techniques assume that multiple experiments are designed in a sequential manner. However, multiple pieces of equipment can sometimes be available, and simultaneous (parallel) experiments could be advantageous in terms of time and resource utilization. The concept of model-based design of parallel experiments is presented in this paper. Furthermore, a novel criterion for optimal experiment design is proposed: the criterion aims at maximizing complementary information by considering different eigenvalues of the information matrix. The benefits of adopting such an approach are discussed through an illustrative case.
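
One plausible reading of an eigenvalue-based parallel criterion, sketched below with a hypothetical model and candidate designs (this is not the paper's exact formulation), is to choose the first experiment E-optimally and the second so that it reinforces the weakest eigen-direction of the first, making the pair's information complementary:

```python
import numpy as np

def fim(t_samples, theta, sigma=0.05):
    # FIM for the hypothetical model y = theta0 * (1 - exp(-theta1 * t))
    t = np.asarray(t_samples, dtype=float)
    J = np.column_stack([1.0 - np.exp(-theta[1] * t),
                         theta[0] * t * np.exp(-theta[1] * t)])
    return J.T @ J / sigma**2

theta = np.array([1.0, 0.5])
# candidate designs: 4 equispaced samples in a sliding time window
candidates = [np.linspace(a, a + 2.0, 4) for a in np.arange(0.0, 8.5, 0.5)]

# first parallel experiment: E-optimal (maximise the smallest eigenvalue)
exp1 = max(candidates, key=lambda c: np.linalg.eigvalsh(fim(c, theta))[0])

# second experiment: maximise information along the weakest
# eigen-direction of the first, so the pair is complementary
w, V = np.linalg.eigh(fim(exp1, theta))
weak = V[:, 0]
exp2 = max(candidates, key=lambda c: float(weak @ fim(c, theta) @ weak))

H_pair = fim(exp1, theta) + fim(exp2, theta)
```

Because Fisher information matrices are additive over independent experiments, the combined matrix of the pair can only improve on either experiment alone; the criterion steers that improvement toward the poorly informed directions.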

2006

Advanced model-based experiment design techniques are essential for the rapid development, refinement and statistical assessment of deterministic process models. One objective of experiment design is to devise experiments yielding the most informative data for use in the estimation of the model parameters. Current techniques assume that multiple experiments are designed in a sequential manner. The concept of model-based design of parallel experiments is presented in this paper. A novel approach, viable for sequential, parallel and sequential-parallel design, is proposed and evaluated through an illustrative case study.