
Courses matching "Stochastic Modelling and Optimisation"
Financial Modelling: Tools and Techniques The growth of the range of financial products that are traded on financial markets or are available at other financial institutions is a notable feature of the finance industry. A major factor contributing to this growth has been the development of sophisticated methods to price these products. The significance to the finance industry of developing a method for pricing options (financial derivatives) was recognized by the awarding of the Nobel Prize in Economics to Myron Scholes and Robert Merton in 1997. The mathematics upon which their method is built is stochastic calculus in continuous time. Binomial lattice type models provide another approach for pricing options. These models are formulated in discrete time, and the examination of their structure and application in various financial settings takes place in a mathematical context that is less technically demanding than when time is continuous. This course discusses the binomial framework, shows how discrete-time models currently used in the financial industry are formulated within this framework, and uses the models to compute prices and construct hedges to manage financial risk. Spreadsheets are used to facilitate computations where appropriate. Topics covered are: the no-arbitrage assumption for financial markets; no-arbitrage inequalities; formulation of the one-step binomial model; basic pricing formula; the Cox-Ross-Rubinstein (CRR) model; application to European-style options, exchange rates and interest rates; formulation of the n-step binomial model; backward induction formula; forward induction formula; n-step CRR model; relationship to Black-Scholes; forward and futures contracts; exotic options; path-dependent options; implied volatility trees; implied binomial trees; interest rate models; hedging; real options; implementing the models using Excel spreadsheets.
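The backward-induction pricing that the course builds up to can be sketched in a few lines. The following is a minimal illustration only (the function name and parameter values are mine, not the course's): it prices a European call in the n-step CRR model and, for large n, approaches the Black-Scholes value.

```python
import math

def crr_call_price(S0, K, r, sigma, T, n):
    """European call price in the n-step Cox-Ross-Rubinstein binomial model,
    computed by backward induction under the risk-neutral measure."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))   # up factor
    d = 1.0 / u                           # down factor
    disc = math.exp(-r * dt)              # one-step discount factor
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    # Payoffs at expiry, for j up-moves out of n.
    values = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    # Backward induction: discounted risk-neutral expectation at each step.
    for _ in range(n):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

# Illustrative parameters; with n large this approaches the Black-Scholes
# price for these inputs, which is about 10.45.
price = crr_call_price(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=500)
```

The same backward-induction loop prices American-style options by taking the maximum of the continuation value and the early-exercise payoff at each node.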
More about this course... 

Mathematical epidemiology: Stochastic models and their statistical calibration Mathematical models are increasingly used to inform governmental policymakers on issues that threaten human health or have an adverse impact on the economy. It is this real-world success, combined with the wide variety of interesting mathematical problems which arise, that makes mathematical epidemiology one of the most exciting topics in applied mathematics. During the summer school, you will be introduced to mathematical epidemiology and some fundamental theory required for studying and parametrising stochastic models of infection dynamics, which will provide an ideal basis for addressing key research questions in this area; several such questions will be introduced and explored in this course. Topics:
An introduction to mathematical epidemiology
Discrete-time and continuous-time discrete-state stochastic infection models
Numerical methods for studying stochastic infection models: EXPOKIT, transforms and their inversion
Methods for simulating stochastic infection models: classical (Gillespie) algorithm, more efficient exact
and approximate algorithms
Methods for parameterising stochastic infection models: frequentist approaches, Bayesian
approaches, approximate Bayesian computation
Optimal observation of stochastic infection models
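The classical Gillespie algorithm listed above is short enough to sketch directly. The following is an illustrative stub for an SIS (susceptible-infected-susceptible) model; the model choice and all parameter values are mine, not the course's.

```python
import random

def gillespie_sis(beta, gamma, N, i0, t_end, rng):
    """Classical Gillespie (stochastic simulation) algorithm for an SIS model:
    draw an exponential waiting time to the next event, then choose the event
    type with probability proportional to its rate."""
    t, i = 0.0, i0
    path = [(t, i)]
    while t < t_end and i > 0:
        rate_inf = beta * i * (N - i) / N   # S -> I transitions
        rate_rec = gamma * i                # I -> S transitions
        total = rate_inf + rate_rec
        t += rng.expovariate(total)         # time to next event
        if rng.random() < rate_inf / total:
            i += 1                          # infection
        else:
            i -= 1                          # recovery
        path.append((t, i))
    return path

rng = random.Random(1)
path = gillespie_sis(beta=2.0, gamma=1.0, N=100, i0=5, t_end=10.0, rng=rng)
```

The "more efficient exact and approximate algorithms" mentioned above (e.g. tau-leaping) replace the one-event-at-a-time loop with batched updates.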
More about this course... 

Modelling and Simulation of Stochastic Systems The course provides students with the skills to analyse and design systems using modelling and simulation techniques. Case studies will be undertaken involving hands-on use of simulation packages. The application of simulation in areas such as manufacturing, telecommunications and transport will be investigated. At the end of this course, students will be capable of identifying practical situations where simulation modelling can be helpful, reporting to management on how they would undertake such a project, collecting relevant data, building and validating a model, analysing the output and reporting their findings to management. Students complete a project in groups of two or three, write a concise summary of what they have done and report their findings to the class. The project report at the end of this course should be a substantial document that is a record of a student's practical ability in simulation modelling, which can also become part of a portfolio or CV. Topics covered are: introduction to simulation, hand simulation, introduction to a simulation package, review of basic probability theory, introduction to random number generation, generation of random variates, analysis of simulation output, variance reduction techniques and basic analytic queueing models.
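As a flavour of the final topics (random variate generation and analytic queueing models), here is a sketch that estimates the mean queueing delay in an M/M/1 system by simulation and compares it with the analytic formula Wq = lam / (mu * (mu - lam)). The function name and parameter values are illustrative, not from the course.

```python
import random

def mm1_mean_wait(lam, mu, n_customers, rng):
    """Estimate the mean wait in queue for an M/M/1 system via Lindley's
    recursion W_{n+1} = max(0, W_n + S_n - A_n), with exponential service
    and inter-arrival times generated by the inverse-transform method."""
    w, total = 0.0, 0.0
    for _ in range(n_customers):
        service = rng.expovariate(mu)        # service time
        interarrival = rng.expovariate(lam)  # time to next arrival
        w = max(0.0, w + service - interarrival)
        total += w
    return total / n_customers

rng = random.Random(42)
estimate = mm1_mean_wait(lam=0.5, mu=1.0, n_customers=200_000, rng=rng)
analytic = 0.5 / (1.0 * (1.0 - 0.5))  # Wq = lam / (mu * (mu - lam)) = 1.0
```

Comparing simulation output against a known analytic model in this way is a standard validation step before trusting a simulation of a system with no closed-form answer.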
More about this course... 

Optimisation and Operations Research Operations Research (OR) is the application of mathematical techniques and analysis to problem solving in business and industry, in particular to carrying out more efficiently tasks such as scheduling, or optimising the provision of services. OR is an interdisciplinary topic drawing from mathematical modelling, optimisation theory, game theory, decision analysis, statistics, and simulation to help make decisions in complex situations. This first course in OR concentrates on mathematical modelling and optimisation: for example maximising production capacity, or minimising risk. It focuses on linear optimisation problems involving both continuous and integer variables. The course covers a variety of mathematical techniques for linear optimisation, and the theory behind them. It will also explore the role of heuristics in such problems. Examples will be presented from important application areas, such as the emergency services, telecommunications, transportation, and manufacturing. Students will undertake a team project based on an actual Adelaide problem. Topics covered are: formulating a linear program; the simplex method; duality and complementary slackness; sensitivity analysis; an interior point method; alternative means to solve some linear and integer programs, such as primal-dual approaches; methods from a complete solution (such as greedy methods and simulated annealing); methods from a partial solution (such as Dijkstra's shortest path algorithm and branch-and-bound).
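Of the techniques listed, Dijkstra's shortest path algorithm is the easiest to illustrate as a "method from a partial solution": it grows a set of settled nodes one at a time. A minimal sketch; the toy network is invented for illustration.

```python
import heapq

def dijkstra(graph, source):
    """Dijkstra's shortest-path algorithm: repeatedly settle the closest
    unsettled node.  `graph` maps each node to a list of
    (neighbour, edge_cost) pairs; returns shortest distances from source."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry, already settled
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical network: nodes could be depots, weights travel times.
network = {"A": [("B", 4), ("C", 1)],
           "C": [("B", 2), ("D", 5)],
           "B": [("D", 1)]}
shortest = dijkstra(network, "A")
```

Branch-and-bound extends the same partial-solution idea to integer programs, using bounds rather than exact labels to prune the search.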
More about this course... 

Optimisation III Most problems in life are optimisation problems: what is the best design for a racing kayak, how do you get the best return on your investments, what is the best use of your time in swot vac, what is the shortest route across town for an emergency vehicle, what are the optimal release rates from a dam for environmental flows in a river? Mathematical formulations of such optimisation problems might contain one or many independent variables. There may or may not be constraints on those variables. There is always, though, an objective: minimise or maximise some function of the variable(s), subject to the constraints. This course will examine nonlinear mathematical formulations, and will concentrate on convex optimisation problems. Many modern optimisation methods in areas such as design of communication networks, finance, etc., rely on the classical underpinnings covered in this course. Topics covered are: one-dimensional (line) searches: direct methods, polynomial approximation, methods for differentiable functions; theory of convex and non-convex functions relevant to optimisation; multivariable unconstrained optimisation, in particular, higher-order Newton's Method, steepest descent methods, conjugate direction and conjugate gradient methods; constrained optimisation, including Kuhn-Tucker conditions and the Gradient Projection Method; penalty methods.
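Among the one-dimensional direct methods the course covers, golden-section search is simple enough to sketch: it shrinks a bracket around the minimiser without using derivatives. The test function below is illustrative only.

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Derivative-free one-dimensional line search: repeatedly shrink the
    bracket [a, b] using the golden ratio so one interior function value
    can be reused each iteration (recomputed here for clarity)."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0   # reciprocal of the golden ratio
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c                      # minimiser lies in [a, d]
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d                      # minimiser lies in [c, b]
            d = a + inv_phi * (b - a)
    return (a + b) / 2.0

# Minimise a convex function on a bracket known to contain the minimiser.
x_star = golden_section(lambda x: (x - 2.0) ** 2 + 1.0, 0.0, 5.0)
```

Line searches like this are the inner loop of the multivariable methods listed above: steepest descent and conjugate gradient methods each reduce a step to a one-dimensional search along a chosen direction.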
More about this course... 

Statistical Analysis and Modelling 1 This is a first course in Statistics for mathematically inclined students. It will address the key principles underlying commonly used statistical methods such as confidence intervals, hypothesis tests, inference for means and proportions, and linear regression. It will develop a deeper mathematical understanding of these ideas, many of which will be familiar from studies in secondary school. The application of basic and more advanced statistical methods will be illustrated on a range of problems from areas such as medicine, science, technology, government, commerce and manufacturing. The use of the statistical package SPSS will be developed through a sequence of computer practicals. Topics covered will include: basic probability and random variables, fundamental distributions, inference for means and proportions, comparison of independent and paired samples, simple linear regression, diagnostics and model checking, multiple linear regression, simple factorial models, models with factors and continuous predictors.
More about this course... 

Statistical Modelling and Inference Statistical methods are important to all areas that rely on data, including science, technology, government and commerce. To deal with the complex problems that arise in practice requires a sound understanding of fundamental statistical principles together with a range of suitable modelling techniques. Computing using a high-level statistical package is also an essential element of modern statistical practice. This course provides an introduction to the principles of statistical inference and the development of linear statistical models with the statistical package R. Topics covered are: point estimates, unbiasedness, mean-squared error, confidence intervals, tests of hypotheses, power calculations, derivation of one- and two-sample procedures; simple linear regression, regression diagnostics, prediction; linear models, analysis of variance (ANOVA), multiple regression, factorial experiments, analysis of covariance models, model building; likelihood-based methods for estimation and testing, goodness-of-fit tests; sample surveys, population means, totals and proportions, simple random samples, stratified random samples.
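The course does its computing in R; purely as an illustration of the closed-form least-squares point estimates behind simple linear regression (b = Sxy / Sxx, a = ybar - b * xbar), here is a dependency-free sketch with invented data.

```python
def simple_linear_regression(x, y):
    """Least-squares fit of y = a + b*x using the closed-form estimates:
    slope b = Sxy / Sxx, intercept a = ybar - b * xbar."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = ybar - b * xbar
    return a, b

# Noise-free data on the line y = 1 + 2x should be recovered exactly.
a, b = simple_linear_regression([0, 1, 2, 3], [1, 3, 5, 7])
```

In R the equivalent is a one-liner, `lm(y ~ x)`, which also returns the standard errors and diagnostics covered later in the course.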
More about this course... 

Statistical Modelling III One of the key requirements of an applied statistician is the ability to formulate appropriate statistical models and then apply them to data in order to answer the questions of interest. Most often, such models can be seen as relating a response variable to one or more explanatory variables. For example, in a medical experiment we may seek to evaluate a new treatment by relating patient outcome to treatment received while allowing for background variables such as age, sex and disease severity. In this course, a rigorous discussion of the linear model is given and various extensions are developed. There is a strong practical emphasis and the statistical package R is used extensively. Topics covered are: the linear model, least squares estimation, generalised least squares estimation, properties of estimators, the Gauss-Markov theorem; geometry of least squares, subspace formulation of linear models, orthogonal projections; regression models, factorial experiments, analysis of covariance and model formulae; regression diagnostics, residuals, influence diagnostics, transformations, Box-Cox models, model selection and model building strategies; models with complex error structure, split-plot experiments; logistic regression models.
More about this course... 
Events matching "Stochastic Modelling and Optimisation" 
Mathematics of underground mining. 15:10 Fri 12 May, 2006 :: G08 Mathematics Building University of Adelaide :: Prof. Hyam Rubinstein
Underground mining infrastructure involves an
interesting range of optimisation problems with geometric
constraints. In particular, ramps, drives and tunnels have gradient
within a certain prescribed range and turning circles (curvature) are
also bounded. Finally, obstacles have to be avoided, such as faults,
ore bodies themselves and old workings. A group of mathematicians and
engineers at Uni of Melb and Uni of SA have been working on this
problem for a number of years. I will summarise what we have found and
the challenges of working in the mining industry. 

Mathematical modelling of multidimensional tissue growth 16:10 Tue 24 Oct, 2006 :: Benham Lecture Theatre :: Prof John King
Some simple continuum-mechanics-based models for the
growth of biological tissue will be formulated and their properties
(particularly with regard to stability) described. 

Flooding in the Sundarbans 15:10 Fri 18 May, 2007 :: G08 Mathematics Building University of Adelaide :: Steve Need
The Sundarbans is a region of deltaic isles formed in the mouth of the Ganges River on the border between India and Bangladesh. As the largest mangrove forest in the world it is a World Heritage site; however, it is also home to several remote communities who have long inhabited some regions. Many of the inhabited islands are low-lying and are particularly vulnerable to flooding, a major hazard of living in the region. Determining suitable levels of protection to be provided to these communities relies upon accurate assessment of the flood risk they face. Only recently has the Indian Government commissioned a study into flood risk in the Sundarbans with a view to determining where flood protection needs to be upgraded.
Flooding due to rainfall is limited by the relatively small catchment sizes, so the primary causes of flooding in the Sundarbans are unnaturally high tides, tropical cyclones (which regularly sweep through the Bay of Bengal) or some combination of the two. Due to the link between tidal anomaly and drops in local barometric pressure, the two causes of flooding may be highly correlated. I propose stochastic methods for analysing the flood risk and present the early work of a case study which shows the direction of investigation. The strategy involves linking several components: a stochastic approximation to a hydraulic flood routing model, FARIMA and GARCH models for storm surge, and a stochastic model for cyclone occurrence and tracking. The methods suggested are general and should have applications in other cyclone-affected regions.
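As a toy version of the cyclone-occurrence component mentioned above, one might start from a homogeneous Poisson process; the rate below is invented for illustration, and real cyclone models are seasonal and far richer.

```python
import random

def poisson_arrivals(rate, t_end, rng):
    """Homogeneous Poisson process on [0, t_end]: event times are generated
    from independent exponential inter-arrival gaps."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t > t_end:
            return times
        times.append(t)

# Illustrative only: suppose cyclones strike at 1.5 per season on average.
rng = random.Random(3)
seasons = [len(poisson_arrivals(1.5, 1.0, rng)) for _ in range(10_000)]
```

The season counts are then Poisson(1.5) distributed; a full model would make the rate time-varying and couple each event to a surge model.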

Modelling gene networks: the case of the quorum sensing network in bacteria. 15:10 Fri 1 Jun, 2007 :: G08 Mathematics Building University of Adelaide :: Dr Adrian Koerber
The quorum sensing regulatory gene network is employed by bacteria to provide a measure of their population density and switch their behaviour accordingly. I will present an overview of quorum sensing in bacteria together with some of the modelling approaches I've taken to describe this system. I will also discuss how this system relates to virulence and medical treatment, and the insights gained from the mathematics. 

Global and local stationary modelling in finance: Theory and empirical evidence 14:10 Thu 10 Apr, 2008 :: G04 Napier Building University of Adelaide :: Prof. Dominique Guégan :: Université Paris 1 Panthéon-Sorbonne
Modelling real data sets using second-order stochastic processes requires that the data sets satisfy the second-order stationarity condition. This stationarity condition concerns the unconditional moments of the process. It is in that context that most of the models developed since the 1960s have been studied; we refer to the ARMA processes (Brockwell and Davis, 1988), the ARCH, GARCH and EGARCH models (Engle, 1982; Bollerslev, 1986; Nelson, 1990), the SETAR process (Lim and Tong, 1980; Tong, 1990), the bilinear model (Granger and Andersen, 1978; Guégan, 1994), the EXPAR model (Haggan and Ozaki, 1980), the long memory process (Granger and Joyeux, 1980; Hosking, 1981; Gray, Zang and Woodward, 1989; Beran, 1994; Giraitis and Leipus, 1995; Guégan, 2000) and the switching process (Hamilton, 1988). For all these models, we get an invertible causal solution under specific conditions on the parameters; the forecast points and forecast intervals are then available.
Thus, the stationarity assumption is the basis for a general asymptotic theory for identification, estimation and forecasting. It guarantees that increasing the sample size yields more and more information of the same kind, which is essential for an asymptotic theory to make sense.
Non-stationarity modelling also has a long tradition in econometrics. It is based on the conditional moments of the data generating process. It appears mainly in the heteroscedastic and volatility models, like the GARCH and related models, and stochastic volatility processes (Ghysels, Harvey and Renault, 1997). This non-stationarity also appears in a different way with structural change models like the switching models (Hamilton, 1988), the stop-break model (Diebold and Inoue, 2001; Breidt and Hsu, 2002; Granger and Hyung, 2004) and the SETAR models, for instance. It can also be observed in linear models with time-varying coefficients (Nicholls and Quinn, 1982; Tsay, 1987).
Thus, using stationary unconditional moments suggests global stationarity for the model, but using non-stationary unconditional moments, non-stationary conditional moments, or assuming the existence of states, suggests that this global stationarity fails and that we only observe locally stationary behaviour.
The growing evidence of instability in the stochastic behaviour of stocks, of exchange rates, and of some economic data sets such as growth rates, characterised by the existence of volatility or of jumps in the variance or in the levels of the prices, forces us to question the assumption of global stationarity and its consequences for modelling, particularly for forecasting. We can therefore address several questions with respect to these remarks.
1. What kinds of non-stationarity affect the major financial and economic data sets? How can they be detected?
2. Local and global stationarity: how are they defined?
3. What is the impact of evidence of non-stationarity on the statistics computed from globally non-stationary data sets?
4. How can we analyse data sets in the globally non-stationary framework? Does the asymptotic theory work in a non-stationary framework?
5. What kinds of models create local stationarity instead of global stationarity? How can we use them to develop a modelling and forecasting strategy?
These questions have begun to be discussed in some papers in the economics literature. For some of these questions the answers are known; for others, very few works exist. In this talk I will discuss all these problems and propose two new strategies and models to solve them. Several interesting topics in empirical finance awaiting future research will also be discussed.
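Several of the models surveyed (GARCH in particular) are easy to simulate, and simulation is often the first step in studying how their moments behave. A minimal GARCH(1,1) simulator; the parameter values are illustrative and chosen so that alpha + beta < 1, the condition for a finite unconditional variance.

```python
import math
import random

def simulate_garch11(omega, alpha, beta, n, rng):
    """Simulate n returns from a GARCH(1,1) model:
    sigma2_t = omega + alpha * eps_{t-1}^2 + beta * sigma2_{t-1},
    eps_t = sqrt(sigma2_t) * z_t with z_t standard normal."""
    sigma2 = omega / (1.0 - alpha - beta)   # start at unconditional variance
    eps = []
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        e = math.sqrt(sigma2) * z
        eps.append(e)
        sigma2 = omega + alpha * e * e + beta * sigma2   # variance recursion
    return eps

rng = random.Random(7)
returns = simulate_garch11(omega=1e-5, alpha=0.08, beta=0.9, n=5000, rng=rng)
```

The returns are unconditionally stationary but conditionally heteroscedastic, which is exactly the global-versus-local distinction the talk draws.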


Betti's Reciprocal Theorem for Inclusion and Contact Problems 15:10 Fri 1 Aug, 2008 :: G03 Napier Building University of Adelaide :: Prof. Patrick Selvadurai :: Department of Civil Engineering and Applied Mechanics, McGill University
Enrico Betti (1823-1892) is recognized in the mathematics community for his pioneering contributions to topology. An equally important contribution is his formulation of the reciprocity theorem applicable to elastic bodies that satisfy the classical equations of linear elasticity. Although James Clerk Maxwell (1831-1879) proposed a law of reciprocal displacements and rotations in 1864, the contribution of Betti is acknowledged for its underlying formal mathematical basis and generality. The purpose of this lecture is to illustrate how Betti's reciprocal theorem can be used to full advantage to develop compact analytical results for certain contact and inclusion problems in the classical theory of elasticity. Inclusion problems are encountered in a number of areas in applied mechanics, ranging from composite materials to geomechanics. In composite materials, the inclusion represents an inhomogeneity that is introduced to increase either the strength or the deformability characteristics of the resulting material. In geomechanics, the inclusion represents a constructed material region, such as a ground anchor, that is introduced to provide load transfer from structural systems. Similarly, contact problems have applications ranging from the modelling of the behaviour of indentors used in materials testing to the study of foundations used to distribute loads transmitted from structures. In the study of conventional problems the inclusions and the contact regions are directly loaded, and this makes their analysis quite straightforward. When the interaction is induced by loads that are placed exterior to the indentor or inclusion, the direct analysis of the problem becomes inordinately complicated, both in terms of the formulation of the integral equations and their numerical solution.
It is shown by a set of selected examples that the application of Betti's reciprocal theorem leads to the development of exact closed-form solutions to what would otherwise be approximate solutions achievable only through the numerical solution of a set of coupled integral equations. 

Probabilistic models of human cognition 15:10 Fri 29 Aug, 2008 :: G03 Napier Building University of Adelaide :: Dr Daniel Navarro :: School of Psychology, University of Adelaide
Over the last 15 years a fairly substantial psychological literature has developed in which human reasoning and decisionmaking is viewed as the solution to a variety of statistical problems posed by the environments in which we operate. In this talk, I briefly outline the general approach to cognitive modelling that is adopted in this literature, which relies heavily on Bayesian statistics, and introduce a little of the current research in this field. In particular, I will discuss work by myself and others on the statistical basis of how people make simple inductive leaps and generalisations, and the links between these generalisations and how people acquire word meanings and learn new concepts. If time permits, the extensions of the work in which complex concepts may be characterised with the aid of nonparametric Bayesian tools such as Dirichlet processes will be briefly mentioned. 
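As a toy illustration of the Bayesian machinery underlying such models (this simplification is mine and is far from the models discussed in the talk), here is a conjugate beta-binomial update for the probability that a generalisation holds.

```python
def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate Bayesian update: a Beta(alpha, beta) prior on a success
    probability combined with binomial data gives a Beta posterior with
    the counts simply added to the prior parameters."""
    return alpha + successes, beta + failures

def posterior_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Uniform prior Beta(1, 1); observe 7 positive instances out of 10 trials.
a, b = beta_binomial_update(1.0, 1.0, successes=7, failures=3)
```

The cognitive models in the literature replace this single parameter with structured hypothesis spaces, and the nonparametric extensions mentioned at the end let the hypothesis space itself grow with the data.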

Mathematical modelling of blood flow in curved arteries 15:10 Fri 12 Sep, 2008 :: G03 Napier Building University of Adelaide :: Dr Jennifer Siggers :: Imperial College London
Atherosclerosis, characterised by plaques, is the most common arterial
disease. Plaques tend to develop in regions of low mean wall shear
stress, and regions where the wall shear stress changes direction during
the course of the cardiac cycle. To investigate the effect of the
arterial geometry and driving pressure gradient on the wall shear stress
distribution we consider an idealised model of a curved artery with
uniform curvature. We assume that the flow is fully developed and seek
solutions of the governing equations, finding the effect of the
parameters on the flow and wall shear stress distribution. Most
previous work assumes the curvature ratio is asymptotically small;
however, many arteries have significant curvature (e.g. the aortic arch
has curvature ratio approx 0.25), and in this work we consider in
particular the effect of finite curvature.
We present an extensive analysis of curved-pipe flow driven by steady
and unsteady pressure gradients. Increasing the curvature causes the
shear stress on the inside of the bend to rise, indicating that the risk
of plaque development would be overestimated by considering only the
weak curvature limit. 

Assisted reproduction technology: how maths can contribute 13:10 Wed 22 Oct, 2008 :: Napier 210 :: Dr Yvonne Stokes
Most people will have heard of IVF (in vitro fertilisation), a technology for helping infertile couples have a baby. Although there are many IVF babies, many people will also know that the success rate is still low for the cost and inconvenience involved. The fact that some women cannot make use of IVF because of life-threatening consequences is less well known, but it motivates research into other technologies, including IVM (in vitro maturation).
What has all this to do with maths? Come along and find out how
mathematical modelling is contributing to understanding and
improvement in this important and interesting field.


Oceanographic Research at the South Australian Research and Development Institute: opportunities for collaborative research 15:10 Fri 21 Nov, 2008 :: Napier G04 :: Associate Prof John Middleton :: South Australian Research and Development Institute
Increasing threats to S.A.'s fisheries and marine environment have underlined the increasing need for soundly based research into the ocean circulation and ecosystems (phyto/zooplankton) of the shelf and gulfs. With the support of Marine Innovation SA, the Oceanography Program has, within two years, grown to include 6 FTEs and a budget of over $4.8M. The program currently leads two major research projects, both of which involve numerical and applied mathematical modelling of oceanic flow and ecosystems, as well as statistical techniques for the analysis of data. The first is the implementation of the Southern Australian Integrated Marine Observing System (SAIMOS), which is providing data to understand the dynamics of shelf boundary currents, monitor for climate change and understand the phyto/zooplankton ecosystems that underpin SA's wild fisheries and aquaculture. SAIMOS involves ship-based sampling, the deployment of underwater marine moorings, underwater gliders, HF ocean radar, acoustic tracking of tagged fish and autonomous underwater vehicles.
The second major project involves measuring and modelling the ocean circulation and biological systems within Spencer Gulf and the impact on prawn larval dispersal and on the sustainability of existing and proposed aquaculture sites. The discussion will focus on opportunities for collaborative research with both faculty and students in this exciting growth area of S.A. science.


Dynamics of Moving Average Rules in a Continuous-time Financial Market Model 15:10 Fri 8 May, 2009 :: LG29 :: Associate Prof (Tony) Xuezhong He :: University of Technology Sydney
Within a continuous-time framework, this paper proposes a stochastic
heterogeneous agent model (HAM) of financial markets with time
delays to unify various moving average rules used in discrete-time
HAMs. Intuitive conditions for the stability of the fundamental price of
the deterministic model in terms of agents' behavior parameters and
time delay are obtained. By focusing on the stabilizing role of the
time delay, it is found that an increase in time delay not only can
destabilize the market price, resulting in oscillatory market price
characterized by a Hopf bifurcation, but also can stabilize an
otherwise unstable market price. Numerical simulations show that the
stochastic model is able to characterize long deviations of the
market price from its fundamental price and excess volatility and
generate most of the stylized facts observed in financial markets.


Averaging reduction for stochastic PDEs 15:10 Fri 5 Jun, 2009 :: LG29 :: Dr Wei Wang :: University of Adelaide
In this talk, I introduce recent work on macroscopic reduction for stochastic PDEs by an averaging method. Furthermore, by using special coupling boundary conditions, a macroscopic discrete approximation model can be derived. 

Strong Predictor-Corrector Euler Methods for Stochastic Differential Equations 15:10 Fri 19 Jun, 2009 :: LG29 :: Prof. Eckhard Platen :: University of Technology, Sydney
This paper introduces a new class of numerical
schemes for the pathwise approximation of solutions of stochastic
differential equations (SDEs). The proposed family of strong
predictor-corrector Euler methods is designed to handle scenario
simulation of solutions of SDEs. It has the potential to overcome
some of the numerical instabilities that are often experienced
when using the explicit Euler method. This is of importance, for
instance, in finance where martingale dynamics arise for solutions
of SDEs with multiplicative diffusion coefficients. Numerical
experiments demonstrate the improved asymptotic stability
properties of the proposed symmetric predictor-corrector Euler
methods. 
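The general predictor-corrector idea can be sketched as follows. This is a generic drift-averaging variant shown for illustration, not necessarily the exact family of schemes proposed in the paper; the SDE and parameter values are also illustrative.

```python
import math
import random

def predictor_corrector_euler(a, b, y0, T, n, rng):
    """Pathwise approximation of dY = a(Y) dt + b(Y) dW.  Each step first
    predicts with an explicit Euler step, then corrects by averaging the
    drift over the current and predicted states (trapezoidal drift)."""
    dt = T / n
    y = y0
    for _ in range(n):
        dw = rng.gauss(0.0, math.sqrt(dt))             # Brownian increment
        y_pred = y + a(y) * dt + b(y) * dw             # predictor (Euler)
        y = y + 0.5 * (a(y) + a(y_pred)) * dt + b(y) * dw  # drift corrector
    return y

# Geometric Brownian motion dY = mu*Y dt + sigma*Y dW, the multiplicative
# diffusion case mentioned in the abstract.
rng = random.Random(0)
yT = predictor_corrector_euler(lambda y: 0.05 * y, lambda y: 0.2 * y,
                               y0=1.0, T=1.0, n=1000, rng=rng)
```

The corrector introduces a degree of implicitness without solving nonlinear equations, which is the mechanism behind the improved stability over the explicit Euler method.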

Modelling fluidstructure interactions in microdevices 15:00 Thu 3 Sep, 2009 :: School Board Room :: Dr Richard Clarke :: University of Auckland
The flows generated in many modern microdevices possess very little convective inertia; however, they can be highly unsteady and exert substantial hydrodynamic forces on the device components. Typically these components exhibit some degree of compliance, which traditionally has been treated using simple one-dimensional elastic beam models. However, recent findings have suggested that three-dimensional effects can be important and, accordingly, we consider the elastohydrodynamic response of a rapidly oscillating three-dimensional elastic plate that is immersed in a viscous fluid. In addition, a preliminary model will be presented which incorporates the presence of a nearby elastic wall. 

Modelling and pricing for portfolio credit derivatives 15:10 Fri 16 Oct, 2009 :: MacBeth Lecture Theatre :: Dr Ben Hambly :: University of Oxford
The current financial crisis has been in part precipitated by the
growth of complex credit derivatives and their mispricing. This talk
will discuss some of the background to the 'credit crunch', as well as
the models and methods used currently. We will then develop an alternative
view of large basket credit derivatives, as functions of a stochastic
partial differential equation, which addresses some of the shortcomings. 

Nonlinear time series econometrics and financial econometrics: a personal overview 15:10 Fri 12 Mar, 2010 :: Napier G04 :: Prof Jiti Gao :: University of Adelaide
Through ten examples, the talk focuses on recent developments in nonlinear time series econometrics and financial econometrics.
Such examples cover the following models:
1. Nonlinear time series trend model;
2. Partially linear autoregressive model;
3. Nonlinear capital asset pricing model;
4. Additive capital asset pricing model;
5. Varying-coefficient capital asset pricing model;
6. Semiparametric error-term model;
7. Nonlinear and nonstationary model;
8. Partially linear ARCH model;
9. Continuous-time financial model; and
10. Stochastic volatility model. 

Modelling of the Human Skin Equivalent 15:10 Fri 26 Mar, 2010 :: Napier 102 :: Prof Graeme Pettet :: Queensland University of Technology
A brief overview will be given of the development of a so-called Human Skin Equivalent Construct. This laboratory-grown construct can be used for studying growth, response and the repair of human skin subjected to wounding and/or treatment under strictly regulated conditions. Details will also be provided of a series of mathematical models we have developed that describe the dynamics of the Human Skin Equivalent Construct, which can be used to assist in the development of the experimental protocol, and to provide insight into the fundamental processes at play in the growth and development of the epidermis in both healthy and diseased states. 

Hugs not drugs 15:10 Mon 20 Sep, 2010 :: Ingkarni Wardli B17 :: Dr Scott McCue :: Queensland University of Technology
I will discuss a model for drug diffusion that involves a Stefan problem with a "kinetic undercooling". I like Stefan problems, so I like this model. I like drugs too, but only legal ones of course. Anyway, it turns out that in some parameter regimes, this sophisticated moving boundary problem hardly works better than a simple linear undergraduate model (there's a lesson here for mathematical modelling). On the other hand, for certain polymer capsules, the results are interesting and suggest new means for controlled drug delivery. If time permits, I may discuss certain asymptotic limits that are of interest from a Stefan problem perspective. Finally, I won't bring any drugs with me to the seminar, but I'm willing to provide hugs if necessary. 

Statistical physics and behavioral adaptation to Creation's main stimuli: sex and food 15:10 Fri 29 Oct, 2010 :: E10 B17 Suite 1 :: Prof Laurent Seuront :: Flinders University and South Australian Research and Development Institute
Animals typically search for food and mates, while avoiding predators. This is particularly critical for keystone organisms such as intertidal gastropods and copepods (i.e. millimetre-scale crustaceans), as they typically rely on non-visual senses for detecting, identifying and locating mates in their two- and three-dimensional environments. Here, using stochastic methods derived from the field of nonlinear physics, we provide new insights into the nature (i.e. innate vs. acquired) of the motion behavior of gastropods and copepods, and demonstrate how changes in their behavioral properties can be used to identify the trade-offs between foraging for food or sex. The gastropod Littorina littorea hence moves according to fractional Brownian motion while foraging for food (in accordance with the fractal nature of food distributions), and switches to Brownian motion while foraging for sex. In contrast, the swimming behavior of the copepod Temora longicornis belongs to the class of multifractal random walks (MRW; i.e. a form of anomalous diffusion), characterized by a nonlinear moment scaling function for distance versus time. This clearly differs from the traditional Brownian and fractional Brownian walks expected or previously detected in animal behaviors. The divergence between MRW and Levy flights and walks is also discussed, and it is shown how copepod anomalous diffusion is enhanced by the presence and concentration of conspecific waterborne signals, dramatically increasing male-female encounter rates. 

Arbitrage bounds for weighted variance swap prices 15:05 Fri 3 Dec, 2010 :: Napier LG28 :: Prof Mark Davis :: Imperial College London
This paper builds on earlier work by Davis and Hobson (Mathematical Finance, 2007) giving model-free (except for a 'frictionless markets' assumption) necessary and sufficient conditions for absence of arbitrage given a set of current-time put and call options on some underlying asset. Here we suppose that the prices of a set of put options, all maturing at the same time, are given and satisfy the conditions for consistency with absence of arbitrage. We now add a path-dependent option, specifically a weighted variance swap, to the set of traded assets and ask what are the conditions on its time-0 price under which consistency with absence of arbitrage is maintained. In the present work, we work under the extra modelling assumption that the underlying asset price process has continuous paths. In general, we find that there is always a non-trivial lower bound to the range of arbitrage-free prices, but only in the case of a corridor swap do we obtain a finite upper bound. In the case of, say, the vanilla variance swap, a finite upper bound exists when there are additional traded European options which constrain the left wing of the volatility surface in appropriate ways. 

Mathematical modelling in nanotechnology 15:10 Fri 4 Mar, 2011 :: 7.15 Ingkarni Wardli :: Prof Jim Hill :: University of Adelaide
In this talk we present an overview of the mathematical modelling contributions of the Nanomechanics Groups at the Universities of Adelaide and Wollongong. Fullerenes and carbon nanotubes have unique properties, such as low weight, high strength, flexibility, high thermal conductivity and chemical stability, and they have many potential applications in nanodevices. We first present some new results on the geometric structure of carbon nanotubes and on related nanostructures. One concept that has attracted much attention is the creation of nano-oscillators, to produce frequencies in the gigahertz range, for applications such as ultra-fast optical filters and nano-antennae. The sliding of an inner shell inside an outer shell of a multi-walled carbon nanotube can generate oscillatory frequencies up to several gigahertz, and the shorter the inner tube the higher the frequency. A C60-nanotube oscillator generates high frequencies by oscillating a C60 fullerene inside a single-walled carbon nanotube. Here we discuss the underlying mechanisms of nano-oscillators, and use the Lennard-Jones potential together with the continuum approach to mathematically model the C60-nanotube nano-oscillator. Finally, three illustrative examples of recent modelling in hydrogen storage, nanomedicine and nanocomputing are discussed. 

Modelling of Hydrological Persistence in the Murray-Darling Basin for the Management of Weirs 12:10 Mon 4 Apr, 2011 :: 5.57 Ingkarni Wardli :: Aiden Fisher :: University of Adelaide
The lakes and weirs along the lower Murray River in Australia are aggregated and
considered as a sequence of five reservoirs. A seasonal Markov chain model for
the system will be implemented, and a stochastic dynamic program will be used to
find optimal release strategies, in terms of expected monetary value (EMV), for
the competing demands on the water resource given the stochastic nature of
inflows. Matrix analytic methods will be used to analyse the system further, and
in particular enable the full distribution of first passage times between any
groups of states to be calculated. The full distribution of first passage times
can be used to provide a measure of the risk associated with optimum EMV
strategies, such as conditional value at risk (CVaR). The sensitivity of the
model, and risk, to changing rainfall scenarios will be investigated. The effect
of decreasing the level of discretisation of the reservoirs will be explored.
Also, the use of matrix analytic methods facilitates the use of hidden states to
allow for hydrological persistence in the inflows. Evidence for hydrological
persistence of inflows to the lower Murray system, and the effect of making
allowance for this, will be discussed. 
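The first-passage-time machinery described above can be illustrated with a toy computation: making the target states of a discrete-time Markov chain absorbing and propagating the state distribution recovers the full first passage time distribution. The 3-state transition matrix below is a hypothetical stand-in, not the actual reservoir model from the talk.

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1).
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.1, 0.3, 0.6]])

def first_passage_pmf(P, start, targets, n_max):
    """P(first visit to `targets` occurs at step n), for n = 1..n_max,
    computed by making the target states absorbing."""
    Q = P.copy()
    for s in targets:
        Q[s, :] = 0.0
        Q[s, s] = 1.0          # absorbing target state
    dist = np.zeros(len(P))
    dist[start] = 1.0
    pmf, absorbed = [], 0.0
    for _ in range(n_max):
        dist = dist @ Q
        now = dist[list(targets)].sum()
        pmf.append(now - absorbed)   # probability newly absorbed this step
        absorbed = now
    return np.array(pmf)

pmf = first_passage_pmf(P, start=0, targets=[2], n_max=200)
print(pmf[:3], pmf.sum())  # pmf sums to ~1 when absorption is certain
```

Risk measures such as CVaR of the passage time then follow directly from this distribution.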

How to value risk 12:10 Mon 11 Apr, 2011 :: 5.57 Ingkarni Wardli :: Leo Shen :: University of Adelaide
A key question in mathematical finance is: given a future random payoff X, what is its value today? If X represents a loss, one can ask how risky X is. To mitigate risk it must be modelled and quantified. The finance industry has used Value-at-Risk and conditional Value-at-Risk as measures. However, these measures are not time consistent, and Value-at-Risk can penalize diversification. A modern theory of risk measures is being developed which is related to solutions of backward stochastic differential equations in continuous time and stochastic difference equations in discrete time.
I first review risk measures used in mathematical finance, including static and dynamic risk measures. I recall results relating to backward stochastic difference equations (BSDEs) associated with a single jump process. Then I evaluate some numerical examples of the solutions of the backward stochastic difference equations and related risk measures. These concepts are new. I hope the examples will indicate how they might be used. 
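As a small illustration of the static measures mentioned above (a generic textbook sketch, not the speaker's BSDE-based construction), Value-at-Risk and conditional Value-at-Risk can be estimated empirically from a sample of losses; the normal loss distribution and 95% level below are illustrative choices.

```python
import numpy as np

def var_cvar(losses, alpha=0.95):
    """Empirical Value-at-Risk and conditional Value-at-Risk
    (expected shortfall) of a sample of losses."""
    losses = np.sort(np.asarray(losses, dtype=float))
    var = np.quantile(losses, alpha)       # alpha-quantile of the loss
    cvar = losses[losses >= var].mean()    # mean loss beyond VaR
    return var, cvar

rng = np.random.default_rng(0)
losses = rng.normal(0.0, 1.0, 100_000)     # illustrative N(0,1) losses
var, cvar = var_cvar(losses, alpha=0.95)
print(var, cvar)  # for N(0,1) roughly 1.645 and 2.06
```

Note that CVaR is always at least as large as VaR, which is one reason it is preferred as a coherent measure.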

Statistical modelling in economic forecasting: semiparametrically spatiotemporal approach 12:10 Mon 23 May, 2011 :: 5.57 Ingkarni Wardli :: Dawlah Alsulami :: University of Adelaide
How to model spatiotemporal variation of housing prices is an important and challenging problem, as it is of vital importance for both investors and policy makers to assess any movement in housing prices. In this seminar I will talk about the proposed model to estimate any movement in housing prices and measure the risk more accurately. 

Optimal experimental design for stochastic population models 15:00 Wed 1 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Dan Pagendam :: CSIRO, Brisbane
Markov population processes are popular models for studying a wide range of
phenomena including the spread of disease, the evolution of chemical reactions
and the movements of organisms in population networks (metapopulations). Our
ability to use these models effectively can be limited by our knowledge about
parameters, such as disease transmission and recovery rates in an epidemic.
Recently, there has been interest in devising optimal experimental designs for
stochastic models, so that practitioners can collect data in a manner that
maximises the precision of maximum likelihood estimates of the parameters for
these models. I will discuss some recent work on optimal design for a variety
of population models, beginning with some simple one-parameter models where the
optimal design can be obtained analytically and moving on to more complicated
multi-parameter models in epidemiology that involve latent states and
non-exponentially distributed infectious periods. For these more complex
models, the optimal design must be arrived at using computational methods and we
rely on a Gaussian diffusion approximation to obtain analytical expressions for
Fisher's information matrix, which is at the heart of most optimality criteria
in experimental design. I will outline a simple cross-entropy algorithm that
can be used for obtaining optimal designs for these models. We will also
explore the improvements in experimental efficiency when using the optimal
design over some simpler designs, such as the design where observations are
spaced equidistantly in time. 
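As a minimal sketch of the one-parameter situation described above (an illustrative model, not one from the talk), suppose we observe whether a single individual with an exponential lifetime of rate lam has survived to time t. The observation is Bernoulli with p = exp(-lam*t), its Fisher information about lam has a closed form, and the optimal single observation time can be found by a grid search.

```python
import numpy as np

def fisher_info(t, lam):
    """Fisher information about lam from one Bernoulli observation:
    survival to time t occurs with probability p = exp(-lam*t)."""
    p = np.exp(-lam * t)
    dp = -t * p                       # derivative of p w.r.t. lam
    return dp**2 / (p * (1.0 - p))    # Bernoulli Fisher information

lam = 0.5                             # illustrative rate
ts = np.linspace(0.01, 10.0, 2000)
t_opt = ts[np.argmax(fisher_info(ts, lam))]
print(t_opt)  # maximiser of the information about lam
```

The optimum scales as u*/lam where u* ~ 1.594 solves 2(e^u - 1) = u e^u, so here t_opt is near 3.19.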

Priority queueing systems with random switchover times and generalisations of the Kendall-Takács equation 16:00 Wed 1 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Andrei Bejan :: The University of Cambridge
In this talk I will review existing analytical results for priority queueing
systems with Poisson incoming flows, general service times and a single server
which needs some (random) time to switch between requests of different priority.
Specifically, I will discuss analytical results for the busy period and workload
of such systems with a special structure of switchover times.
The results related to the busy period can be seen as generalisations of the
famous Kendall-Takács functional equation for the M/G/1 queue:
being formulated in terms of LaplaceStieltjes transform, they represent systems
of functional recurrent equations.
I will present a methodology and algorithms of their numerical solution;
the efficiency of these algorithms is achieved by acceleration of the numerical
procedure of solving the classical Kendall-Takács equation.
At the end I will identify open problems with regard to such systems; these open
problems are mainly related to the modelling of switchover times.


Inference and optimal design for percolation and general random graph models (Part I) 09:30 Wed 8 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Andrei Bejan :: The University of Cambridge
The problem of optimal arrangement of nodes of a random weighted graph
is discussed in this workshop. The nodes of graphs under study are fixed, but
their edges are random and established according to the so-called
edge-probability function. This function is assumed to depend on the weights
attributed to the pairs of graph nodes (or distances between them) and a
statistical parameter. It is the purpose of experimentation to make inference on
the statistical parameter and thus to extract as much information about it as
possible. We also distinguish between two different experimentation scenarios:
progressive and instructive designs.
We adopt a utility-based Bayesian framework to tackle the optimal design problem
for random graphs of this kind. Simulation-based optimisation methods, mainly
Monte Carlo and Markov Chain Monte Carlo, are used to obtain the solution. We
study optimal design problem for the inference based on partial observations of
random graphs by employing data augmentation technique. We prove that the
infinitely growing or diminishing node configurations asymptotically represent
the worst node arrangements. We also obtain the exact solution to the optimal
design problem for proximity (geometric) graphs and numerical solution for
graphs with threshold edge-probability functions.
We consider inference and optimal design problems for finite clusters from bond
percolation on the integer lattice $\mathbb{Z}^d$ and derive a range of both
numerical and analytical results for these graphs. We introduce inner-outer
plots by deleting some of the lattice nodes and show that the "mostly populated"
designs are not necessarily optimal in the case of incomplete observations under
both progressive and instructive design scenarios. Some of the obtained results
may generalise to other lattices. 

Inference and optimal design for percolation and general random graph models (Part II) 10:50 Wed 8 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Andrei Bejan :: The University of Cambridge
The abstract is as for Part I above. 

Stochastic models of reaction diffusion 15:10 Fri 17 Jun, 2011 :: 7.15 Ingkarni Wardli :: Prof Jon Chapman :: Oxford University
We consider two different position jump processes: (i) a random
walk on a lattice (ii) the Euler scheme for the Smoluchowski
differential equation. Both of these reduce to the diffusion equation as the time step
and size of the jump tend to zero.
We consider the problem of adding chemical reactions to these
processes, both at a surface and in the bulk. We show how the
"microscopic" parameters should be chosen to achieve the correct
"macroscopic" reaction rate. This choice is found to depend on
which stochastic model for diffusion is used. 
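The first of the two processes is easy to sanity-check numerically: a 1-D lattice random walk taking steps of size dx every dt has variance matching the diffusion limit Var[x(t)] = 2Dt with D = dx^2/(2 dt). The step sizes below are arbitrary illustrative choices, and no reactions are included.

```python
import numpy as np

# 1-D lattice random walk whose variance should approach the
# diffusion limit Var[x(t)] = 2*D*t with D = dx**2 / (2*dt).
dx, dt = 0.1, 0.01
n_steps, n_walkers = 1000, 20_000
D = dx**2 / (2 * dt)

rng = np.random.default_rng(1)
steps = rng.choice([-dx, dx], size=(n_walkers, n_steps))
x = steps.sum(axis=1)          # walker positions at time t
t = n_steps * dt
print(x.var(), 2 * D * t)      # empirical vs diffusion-limit variance
```

Matching the macroscopic reaction rate, as the talk explains, requires a further (model-dependent) choice of the microscopic reaction parameters, which this sketch does not attempt.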

Modelling computer network topologies through optimisation 12:10 Mon 1 Aug, 2011 :: 5.57 Ingkarni Wardli :: Mr Rhys Bowden :: University of Adelaide
The core of the Internet is made up of many different computers (called routers) in many different interconnected networks, owned and operated by many different organisations. A popular and important field of study in the past has been "network topology": for instance, understanding which routers are connected to which other routers, or which networks are connected to which other networks; that is, studying and modelling the connection structure of the Internet. Previous study in this area has been plagued by unreliable or flawed experimental data and debate over appropriate models to use. The Internet Topology Zoo is a new source of network data created from the information that network operators make public. In order to better understand this body of network information we would like the ability to randomly generate network topologies resembling those in the zoo. Leveraging previous wisdom on networks produced as a result of optimisation processes, we propose a simple objective function based on possible economic constraints. By changing the relative costs in the objective function we can change the form of the resulting networks, and we compare these optimised networks to a variety of networks found in the Internet Topology Zoo. 

Alignment of time course gene expression data sets using Hidden Markov Models 12:10 Mon 5 Sep, 2011 :: 5.57 Ingkarni Wardli :: Mr Sean Robinson :: University of Adelaide
Time course microarray experiments allow for insight into biological processes by measuring gene expression over a time period of interest. This project is concerned with time course data from a microarray experiment conducted on a particular variety of grapevine over the development of the grape berries at a number of different vineyards in South Australia. The aim of the project is to construct a methodology for combining the data from the different vineyards in order to obtain more precise estimates of the underlying behaviour of the genes over the development process. A major issue in doing so is that the rate of development of the grape berries is different at different vineyards.
Hidden Markov models (HMMs) are a well established methodology for modelling time series data in a number of domains and have been previously used for gene expression analysis. Modelling the grapevine data presents a unique modelling issue, namely the alignment of the expression profiles needed to combine the data from different vineyards. In this seminar, I will describe our problem, review HMMs, present an extension to HMMs and show some preliminary results modelling the grapevine data. 

Mathematical modelling of lobster populations in South Australia 12:10 Mon 12 Sep, 2011 :: 5.57 Ingkarni Wardli :: Mr John Feenstra :: University of Adelaide
Just how many lobsters are there hanging around the South Australian coastline? How is this number changing over time? What is the demographic breakdown of this number? And what does it matter? Find out the answers to these questions in my upcoming talk. I will provide a brief flavour of the kinds of quantitative methods involved, showcasing relevant applications of regression, population modelling, estimation, as well as simulation. One product of these analyses is a set of biological performance indicators, which are used by government to help decide on fishery controls such as yearly total allowable catch quotas. This assists in maintaining the sustainability of the fishery and hence benefits both the fishers and the lobsters they catch. 

Estimating transmission parameters for the swine flu pandemic 15:10 Fri 23 Sep, 2011 :: 7.15 Ingkarni Wardli :: Dr Kathryn Glass :: Australian National University
Following the onset of a new strain of influenza with pandemic potential, policy makers need specific advice on how fast the disease is spreading, who is at risk, and what interventions are appropriate for slowing transmission. Mathematical models play a key role in comparing interventions and identifying the best response, but models are only as good as the data that inform them. In the early stages of the 2009 swine flu outbreak, many researchers estimated transmission parameters, particularly the reproduction number, from outbreak data. These estimates varied, and were often biased by data collection methods, misclassification of imported cases or as a result of early stochasticity in case numbers. I will discuss a number of the pitfalls in achieving good quality parameter estimates from early outbreak data, and outline how best to avoid them.
One of the early indications from swine flu data was that children were disproportionately responsible for disease spread. I will introduce a new method for estimating agespecific transmission parameters from both outbreak and seroprevalence data. This approach allows us to take account of empirical data on human contact patterns, and highlights the need to allow for asymmetric mixing matrices in modelling disease transmission between age groups. Applied to swine flu data from a number of different countries, it presents a consistent picture of higher transmission from children. 

Statistical analysis of school-based student performance data 12:10 Mon 10 Oct, 2011 :: 5.57 Ingkarni Wardli :: Ms Jessica Tan :: University of Adelaide
Join me in the journey of being a statistician for 15 minutes of your day (if you are not already one) and experience the task of data cleaning without having to get your own hands dirty. Most of you may have sat the Basic Skills Tests when at school, or know someone who currently has to do the NAPLAN (National Assessment Program - Literacy and Numeracy) tests. Tests like these assess student progress and can be used to accurately measure school performance. In trying to answer the research question "what conclusions about student progress and school performance can be drawn from NAPLAN data, or data of a similar nature, using mathematical and statistical modelling and analysis techniques?", I have uncovered some interesting results about the data in my initial data analysis, which I shall explain in this talk. 

Statistical modelling for some problems in bioinformatics 11:10 Fri 14 Oct, 2011 :: B.17 Ingkarni Wardli :: Professor Geoff McLachlan :: The University of Queensland
In this talk we consider some statistical analyses of data arising in
bioinformatics. The problems include the detection of differential
expression in microarray gene-expression data, the clustering of
time-course gene-expression data and, lastly, the analysis of
modern-day cytometric data. Extensions are considered to the procedures
proposed for these three problems in McLachlan et al. (Bioinformatics, 2006),
Ng et al. (Bioinformatics, 2006), and Pyne et al. (PNAS, 2009), respectively.
The latter references are available at http://www.maths.uq.edu.au/~gjm/. 

On the role of mixture distributions in the modelling of heterogeneous data 15:10 Fri 14 Oct, 2011 :: 7.15 Ingkarni Wardli :: Prof Geoff McLachlan :: University of Queensland
We consider the role that finite mixture distributions have played in the modelling of heterogeneous data, in particular for clustering continuous data via mixtures of normal distributions. A very brief history is given, starting with the seminal papers by Day and Wolfe in the sixties, before the appearance of the EM algorithm. It was the publication in 1977 of the latter algorithm by Dempster, Laird, and Rubin that greatly stimulated interest in the use of finite mixture distributions to model heterogeneous data. This is because the fitting of mixture models by maximum likelihood is a classic example of a problem that is simplified considerably by the EM's conceptual unification of maximum likelihood estimation from data that can be viewed as being incomplete. In recent times there has been a proliferation of applications in which the number of experimental units n is comparatively small but the underlying dimension p is extremely large, as, for example, in microarray-based genomics and other high-throughput experimental approaches. Hence there has been increasing attention given, not only in bioinformatics and machine learning but also in mainstream statistics, to the analysis of complex data in this situation where n is small relative to p. The latter part of the talk shall focus on the modelling of such high-dimensional data using mixture distributions. 
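As a hedged illustration of EM fitting of a normal mixture as discussed above (a textbook sketch, not the speaker's software or the high-dimensional case), the following fits a two-component univariate Gaussian mixture to simulated data:

```python
import numpy as np

def em_gmm(x, n_iter=200):
    """EM for a two-component univariate normal mixture.
    Returns component weights, means and standard deviations."""
    w = np.array([0.5, 0.5])
    mu = np.array([x.min(), x.max()])       # crude symmetry-breaking start
    sd = np.array([x.std(), x.std()])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) \
               / (sd * np.sqrt(2 * np.pi))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted maximum likelihood updates
        n_k = r.sum(axis=0)
        w = n_k / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n_k
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)
    return w, mu, sd

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 700)])
w, mu, sd = em_gmm(x)
print(w, mu, sd)  # recovers weights near (0.3, 0.7), means near (-2, 3)
```

Each iteration increases the observed-data likelihood, which is the "incomplete data" simplification the abstract refers to.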

Likelihood-free Bayesian inference: modelling drug resistance in Mycobacterium tuberculosis 15:10 Fri 21 Oct, 2011 :: 7.15 Ingkarni Wardli :: Dr Scott Sisson :: University of New South Wales
A central pillar of Bayesian statistical inference is Monte Carlo integration, which is based on obtaining random samples from the posterior distribution. There are a number of standard ways to obtain these samples, provided that the likelihood function can be numerically evaluated. In the last 10 years, there has been a substantial push to develop methods that permit Bayesian inference in the presence of computationally intractable likelihood functions. These methods, termed "likelihood-free" or approximate Bayesian computation (ABC), are now being applied extensively across many disciplines.
In this talk, I'll present a brief, nontechnical overview of the ideas behind likelihoodfree methods. I'll motivate and illustrate these ideas through an analysis of the epidemiological fitness cost of drug resistance in Mycobacterium tuberculosis. 
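A minimal sketch of the ABC rejection idea (not the tuberculosis analysis from the talk): draw a parameter from the prior, simulate data from the model without ever evaluating a likelihood, and accept the draw when a summary statistic of the simulated data is close to the observed one. The Poisson model, summary statistic and tolerance below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
true_lam, n = 4.0, 50
observed = rng.poisson(true_lam, n)   # stand-in "observed" data
s_obs = observed.mean()               # summary statistic

accepted = []
for _ in range(100_000):
    lam = rng.uniform(0, 10)          # draw from a flat prior
    sim = rng.poisson(lam, n)         # simulate; no likelihood evaluation
    if abs(sim.mean() - s_obs) < 0.1: # accept if summaries are close
        accepted.append(lam)

posterior = np.array(accepted)        # approximate posterior sample
print(posterior.mean(), len(posterior))
```

Shrinking the tolerance sharpens the approximation at the cost of a lower acceptance rate, which is the central trade-off in ABC.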

Financial risk measures  the theory and applications of backward stochastic difference/differential equations with respect to the single jump process 12:10 Mon 26 Mar, 2012 :: 5.57 Ingkarni Wardli :: Mr Bin Shen :: University of Adelaide
This talk is based on my PhD thesis, submitted one month ago. Chapter 1 introduces the background of the research fields; each subsequent chapter is a published or accepted paper.
Chapter 2, to appear in Methodology and Computing in Applied Probability, establishes the theory of Backward Stochastic Difference Equations with respect to the single jump process in discrete time.
Chapter 3, published in Stochastic Analysis and Applications, establishes the theory of Backward Stochastic Differential Equations with respect to the single jump process in continuous time.
Chapters 2 and 3 make up Part I: Theory.
Chapter 4, published in Expert Systems With Applications, gives some examples about how to measure financial risks by the theory established in Chapter 2.
Chapter 5, accepted by the Journal of Applied Probability, considers the question of an optimal transaction between two investors to minimize their risks. It is an application of the theory established in Chapter 3.
Chapters 4 and 5 make up Part II: Applications. 

Fast-track study of viscous flow over topography using 'Smoothed Particle Hydrodynamics' 12:10 Mon 16 Apr, 2012 :: 5.57 Ingkarni Wardli :: Mr Stephen Wade :: University of Adelaide
Motivated by certain tea room discussions, I am going to (attempt to) model the flow of a viscous fluid under gravity over conical topography. The method used is 'Smoothed Particle Hydrodynamics' (SPH), which is an easy-to-use but perhaps limited-accuracy computational method. The model could be extended to include solidification and thermodynamic effects that can also be implemented within the framework of SPH, and this has the obvious practical application to the modelling of the coverage of ice cream with ice magic, I mean, lava flows.
If I fail to achieve this within the next 4 weeks, I will have to go through a talk on SPH that I gave during honours instead. 

Mathematical modelling of the surface adsorption for methane on carbon nanostructures 12:10 Mon 30 Apr, 2012 :: 5.57 Ingkarni Wardli :: Mr Olumide Adisa :: University of Adelaide
In this talk, methane (CH4) adsorption is investigated on both graphite and in the region between two aligned single-walled carbon nanotubes, which we refer to as the groove site. The Lennard-Jones potential function and the continuous approximation are exploited to determine surface binding energies between a single CH4 molecule and graphite, and between a single CH4 molecule and two aligned single-walled carbon nanotubes. The modelling indicates that for a CH4 molecule interacting with graphite, the binding energy of the system is minimized when the CH4 carbon is 3.83 angstroms above the surface of the graphitic carbon, while the binding energy of the CH4-groove site system is minimized when the CH4 carbon is 5.17 angstroms away from the common axis shared by the two aligned single-walled carbon nanotubes. These results confirm the current view that for larger groove sites, CH4 molecules in grooves are likely to move towards the outer surfaces of one of the single-walled carbon nanotubes. The results presented in this talk are computationally efficient and are in good agreement with experiments and molecular dynamics simulations, and show that CH4 adsorption on graphite and groove surfaces is more favourable at lower temperatures and higher pressures. 
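The continuum calculations in the abstract reduce, at their core, to minimising a Lennard-Jones interaction energy over a separation distance. The toy sketch below does this for the bare 12-6 Lennard-Jones potential between two particles; the epsilon and sigma values are illustrative only, not the parameters used in the talk, and no continuum integration over the surfaces is performed.

```python
import numpy as np

def lennard_jones(r, epsilon=1.0, sigma=3.4):
    """12-6 Lennard-Jones pair potential; parameter values are
    illustrative (sigma in angstroms, epsilon in arbitrary units)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6**2 - sr6)

# Grid search for the energy-minimising separation.
r = np.linspace(2.5, 10.0, 100_000)
r_min = r[np.argmin(lennard_jones(r))]
print(r_min)  # analytic minimum is at 2**(1/6) * sigma ~ 3.816
```

In the continuum approach the pair potential is integrated over the surfaces first, but the binding distance is still found by minimising the resulting energy in exactly this way.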

Modelling protective anti-tumour immunity using a hybrid agent-based and delay differential equation approach 15:10 Fri 11 May, 2012 :: B.21 Ingkarni Wardli :: Dr Peter Kim :: University of Sydney
Although cancers seem to consistently evade current medical treatments, the body's immune defences seem quite effective at controlling incipient tumours. Understanding how our immune systems provide such protection against early-stage tumours and how this protection could be lost will provide insight into designing next-generation immune therapies against cancer. To engage this problem, we formulate a mathematical model of the immune response against small, incipient tumours. The model considers the initial stimulation of the immune response in lymph nodes and the resulting immune attack on the tumour, and is formulated as a hybrid agent-based and delay differential equation model. 

The change of probability measure for jump processes 12:10 Mon 28 May, 2012 :: 5.57 Ingkarni Wardli :: Mr Ahmed Hamada :: University of Adelaide
In financial derivatives pricing theory, it is very common to change the probability measure from the historical, "real world" measure to a risk-neutral measure, as a development of the no-arbitrage condition.
Girsanov's theorem is the best-known example of this technique, and is used when price randomness is modelled by Brownian motions. Other genuine candidates for modelling market randomness that have proved efficient in the recent literature are jump processes, so how can a change of measure be performed for such processes?
This talk will address this question by introducing the no-arbitrage condition, discussing Girsanov's theorem for diffusion and jump processes, and presenting a concrete example. 

Model turbulent floods based upon the Smagorinski large eddy closure 12:10 Mon 4 Jun, 2012 :: 5.57 Ingkarni Wardli :: Mr Meng Cao :: University of Adelaide
Rivers, floods and tsunamis are often very turbulent. Conventional models of such environmental fluids are typically based on depth-averaged inviscid irrotational flow equations. We explore changing such a base to the turbulent Smagorinski large eddy closure. The aim is to more appropriately model the fluid dynamics of such complex environmental fluids by using such a turbulent closure. Large changes in fluid depth are allowed. Computer algebra constructs the slow manifold of the flow in terms of the fluid depth h and the mean turbulent lateral velocities u and v. The major challenge is to deal with the nonlinear stress tensor in the Smagorinski closure. The model integrates the effects of inertia, self-advection, bed drag, gravitational forcing and turbulent dissipation with minimal assumptions. Although the resultant model is close to established models, the real outcome is creating a sound basis for the modelling so that others, in their modelling of more complex situations, can systematically include more complex physical processes. 

Adventures with group theory: counting and constructing polynomial invariants for applications in quantum entanglement and molecular phylogenetics 15:10 Fri 8 Jun, 2012 :: B.21 Ingkarni Wardli :: Dr Peter Jarvis :: The University of Tasmania
In many modelling problems in mathematics and physics, a standard
challenge is dealing with several repeated instances of a system under
study. If linear transformations are involved, then the machinery of
tensor products steps in, and it is the job of group theory to control how
the relevant symmetries lift from a single system, to having many copies.
At the level of group characters, the construction which does this is
called PLETHYSM.
In this talk all this will be contextualised via two case studies:
entanglement invariants for multipartite quantum systems, and Markov
invariants for tree reconstruction in molecular phylogenetics. By the end
of the talk, listeners will have understood why Alice, Bob and Charlie
love Cayley's hyperdeterminant, and they will know why the three squangles
(polynomial beasts of degree 5 in 256 variables, with a modest 50,000
terms or so) can tell us a lot about quartet trees!
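For the three-qubit case, Cayley's 2x2x2 hyperdeterminant is an explicit quartic polynomial in the eight state amplitudes, and (up to a factor of 4) its modulus is the three-tangle measuring genuine tripartite entanglement. A sketch using the classical formula; the example states are standard but my choice, not taken from the talk.

```python
import numpy as np

def cayley_hyperdeterminant(a):
    """Cayley's hyperdeterminant of a 2x2x2 array a[i,j,k]."""
    t1 = (a[0,0,0]**2 * a[1,1,1]**2 + a[0,0,1]**2 * a[1,1,0]**2
          + a[0,1,0]**2 * a[1,0,1]**2 + a[0,1,1]**2 * a[1,0,0]**2)
    t2 = (a[0,0,0]*a[0,0,1]*a[1,1,0]*a[1,1,1]
          + a[0,0,0]*a[0,1,0]*a[1,0,1]*a[1,1,1]
          + a[0,0,0]*a[0,1,1]*a[1,0,0]*a[1,1,1]
          + a[0,0,1]*a[0,1,0]*a[1,0,1]*a[1,1,0]
          + a[0,0,1]*a[0,1,1]*a[1,0,0]*a[1,1,0]
          + a[0,1,0]*a[0,1,1]*a[1,0,0]*a[1,0,1])
    t3 = (a[0,0,0]*a[0,1,1]*a[1,0,1]*a[1,1,0]
          + a[0,0,1]*a[0,1,0]*a[1,0,0]*a[1,1,1])
    return t1 - 2.0 * t2 + 4.0 * t3

# GHZ state (|000> + |111>)/sqrt(2): maximal three-way entanglement
ghz = np.zeros((2, 2, 2)); ghz[0, 0, 0] = ghz[1, 1, 1] = 1 / np.sqrt(2)
# W state (|001> + |010> + |100>)/sqrt(3): entangled, yet hyperdeterminant zero
w = np.zeros((2, 2, 2)); w[0, 0, 1] = w[0, 1, 0] = w[1, 0, 0] = 1 / np.sqrt(3)
```

The GHZ state attains the maximal value 1/4, while the W state, although entangled, has a vanishing hyperdeterminant; distinguishing these two classes is one reason Alice, Bob and Charlie care about this invariant.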

Infectious diseases modelling: from biology to public health policy 15:10 Fri 24 Aug, 2012 :: B.20 Ingkarni Wardli :: Dr James McCaw :: The University of Melbourne
The mathematical study of human-to-human transmissible pathogens has
established itself as a complementary methodology to the traditional
epidemiological approach. The classic susceptible-infectious-recovered
model paradigm has been used to great effect to gain insight into the
epidemiology of endemic diseases such as influenza and pertussis, and
the emergence of novel pathogens such as SARS and pandemic influenza.
The modelling paradigm has also been taken within the host and used to
explain the within-host dynamics of viral (or bacterial or parasite)
infections, with implications for our understanding of infection,
emergence of drug resistance and optimal drug interventions.
In this presentation I will provide an overview of the mathematical
paradigm used to investigate both biological and epidemiological
infectious diseases systems, drawing on case studies from influenza,
malaria and pertussis research. I will conclude with a summary of how
infectious diseases modelling has assisted the Australian government in
developing its pandemic preparedness and response strategies.


Electrokinetics of concentrated suspensions of spherical particles 15:10 Fri 28 Sep, 2012 :: B.21 Ingkarni Wardli :: Dr Bronwyn BradshawHajek :: University of South Australia
Electrokinetic techniques are used to gather specific information about concentrated dispersions such as electronic inks, mineral processing slurries, pharmaceutical products and biological fluids (e.g. blood). But, like most experimental techniques, intermediate quantities are measured, and consequently the method relies explicitly on theoretical modelling to extract the quantities of experimental interest. A self-consistent cell-model theory of electrokinetics can be used to determine the electrical conductivity of a dense suspension of spherical colloidal particles, and thereby determine the quantities of interest (such as the particle surface potential). The numerical predictions of this model compare well with published experimental results. High-frequency asymptotic analysis of the cell-model leads to some interesting conclusions.

AD Model Builder and the estimation of lobster abundance 12:10 Mon 22 Oct, 2012 :: B.21 Ingkarni Wardli :: Mr John Feenstra :: University of Adelaide
Determining how many millions of lobsters reside in our waters, and how this changes over time, is a central aim of lobster stock assessment. ADMB is powerful optimisation software for modelling and solving complex nonlinear problems using automatic differentiation, and it plays a major role in South Australia and worldwide in fisheries stock assessment analyses. In this talk I will provide a brief description of an example modelling problem, and of the key features and use of ADMB.

Epidemic models in socially structured populations: when are simple models too simple? 14:00 Thu 25 Oct, 2012 :: 5.56 Ingkarni Wardli :: Dr Lorenzo Pellis :: The University of Warwick
Both age and household structure are recognised as important heterogeneities affecting the epidemic spread of infectious pathogens, and many models now exist that include either or both forms of heterogeneity. However, different models may fit aggregate epidemic data equally well and nevertheless lead to different predictions of public health interest. I will here present an overview of stochastic epidemic models with increasing complexity in their social structure, focusing in particular on households models. For these models, I will present recent results about the definition and computation of the basic reproduction number R0 and its relationship with other threshold parameters. Finally, I will use these results to compare models with no, either or both age and household structure, with the aim of quantifying the conditions under which each form of heterogeneity is relevant and therefore providing some criteria that can be used to guide model design for real-time predictions.
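As a point of comparison for the structured models in the talk: in the simplest homogeneous SIR model, R0 alone determines the expected final size z of a major outbreak through the final-size relation 1 - z = exp(-R0 z). A sketch of solving it by fixed-point iteration; this is my illustration of the unstructured baseline, not material from the talk.

```python
import math

def final_size(R0, tol=1e-12):
    """Largest solution z in [0, 1] of 1 - z = exp(-R0 * z): the attack
    rate of a deterministic homogeneous SIR epidemic."""
    z = 1.0 - 1e-6 if R0 > 1 else 0.0   # start near 1 above threshold
    for _ in range(1000):
        z_new = 1.0 - math.exp(-R0 * z)
        if abs(z_new - z) < tol:
            break
        z = z_new
    return z
```

Below the threshold R0 = 1 only the zero solution exists; household or age structure breaks this one-to-one link between R0 and final size, which is part of why the comparison in the talk is interesting.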

Thin-film flow in helically-wound channels with small torsion 15:10 Fri 26 Oct, 2012 :: B.21 Ingkarni Wardli :: Dr Yvonne Stokes :: University of Adelaide
The study of flow in open helically-wound channels has application to many natural and industrial flows. We will consider laminar flow down helically-wound channels of rectangular cross section and with small torsion, in which the fluid depth is small. Assuming a steady-state flow that is independent of position along the axis of the channel, the flow solution may be determined in the two-dimensional cross section of the channel. A thin-film approximation yields explicit expressions for the fluid velocity in terms of the free-surface shape. The latter satisfies an interesting nonlinear ordinary differential equation that, for a channel of rectangular cross section, has an analytical solution. The predictions of the thin-film model are shown to be in good agreement with much more computationally intensive solutions of the small-helix-torsion Navier-Stokes equations.
This work has particular relevance to spiral particle separators used in the minerals processing industry. Early work on modelling of particle-laden thin-film flow in spiral channels will also be discussed.

A multiscale approach to reaction-diffusion processes in domains with microstructure 15:10 Fri 15 Mar, 2013 :: B.18 Ingkarni Wardli :: Prof Malte Peter :: University of Augsburg
Reaction-diffusion processes occur in many materials with microstructure, such as biological cells, steel or concrete. The main difficulty in modelling and simulating such processes accurately is to account for the fine microstructure of the material. One method of upscaling multiscale problems, which has proven reliable for obtaining feasible macroscopic models, is the method of periodic homogenisation.
The talk will give an introduction to multiscale modelling of chemical mechanisms in domains with microstructure as well as to the method of periodic homogenisation. Moreover, a few aspects of solving the resulting systems of equations numerically will also be discussed. 

How fast? Bounding the mixing time of combinatorial Markov chains 15:10 Fri 22 Mar, 2013 :: B.18 Ingkarni Wardli :: Dr Catherine Greenhill :: University of New South Wales
A Markov chain is a stochastic process which is "memoryless",
in that the next state of the chain depends only on the current state,
and not on how it got there. It is a classical result that an ergodic
Markov chain has a unique stationary distribution.
However, classical theory does not provide any information on the rate of
convergence to stationarity. Around 30 years ago, the mixing time of
a Markov chain was introduced to measure the number of steps required
before the distribution of the chain is within some small distance of
the stationary distribution. One reason why this is important is that
researchers in areas such as physics and biology use Markov chains to
sample from large sets of interest. Rigorous bounds on the mixing time
of their chain allow these researchers to have confidence in their results.
Bounding the mixing time of combinatorial Markov chains can be a challenge, and there are only a few approaches available. I will discuss the main methods and give examples for each (with pretty pictures). 
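A concrete version of the definition: the mixing time is the first step at which the worst-case total variation distance to the stationary distribution drops below a threshold (1/4 is a common convention). A sketch for a lazy random walk on a cycle; the chain and all names are my illustrative choices, not the speaker's examples.

```python
import numpy as np

def mixing_time(P, pi, eps=0.25, max_steps=100000):
    """Smallest t with max_x || P^t(x, .) - pi ||_TV <= eps."""
    n = P.shape[0]
    Pt = np.eye(n)
    for t in range(1, max_steps + 1):
        Pt = Pt @ P
        tv = 0.5 * np.abs(Pt - pi).sum(axis=1).max()   # worst starting state
        if tv <= eps:
            return t
    return None

def lazy_cycle_walk(n):
    """Lazy random walk on an n-cycle: stay w.p. 1/2, step left or right w.p. 1/4."""
    P = np.zeros((n, n))
    for i in range(n):
        P[i, i] = 0.5
        P[i, (i - 1) % n] = 0.25
        P[i, (i + 1) % n] = 0.25
    return P

P = lazy_cycle_walk(12)
pi = np.full(12, 1 / 12)      # uniform stationary distribution
t_mix = mixing_time(P, pi)
```

For the lazy walk on an n-cycle the mixing time grows like n^2; quantitative statements of that kind are exactly what the bounding techniques in the talk deliver, for chains far too large to exponentiate directly as done here.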

The boundary conditions for macroscale modelling of a discrete diffusion system with periodic diffusivity 12:10 Mon 29 Apr, 2013 :: B.19 Ingkarni Wardli :: Chen Chen :: University of Adelaide
Many mathematical and engineering problems have a multiscale nature. There is a vast body of theory supporting multiscale modelling on infinite domains, such as homogenisation theory and centre manifold theory. To date, however, there has been little consideration of the correct boundary conditions to be used at the edge of a macroscale model. In this seminar, I will present how to derive macroscale boundary conditions for a diffusion system.

Filtering Theory in Modelling the Electricity Market 12:10 Mon 6 May, 2013 :: B.19 Ingkarni Wardli :: Ahmed Hamada :: University of Adelaide
In mathematical finance, as in many other fields where applied mathematics is a powerful tool, we assume that a model is good enough when it captures the different sources of randomness affecting the quantities of interest, which in this case are the electricity prices. The power market is very different from other markets in terms of the randomness sources that can be observed in the price features and evolution. We start by suggesting a new model that simulates electricity prices; this model is constructed by adding a periodicity term, a jump term and a positive mean-reverting term. The latter term is driven by a non-observable Markov process. So, in order to price some financial products, we have to use filtering theory to deal with the non-observable process; these techniques are gaining much interest from practitioners and researchers in the field of financial mathematics.
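A stripped-down simulation of a price model of this shape: deterministic seasonality, plus a mean-reverting (Ornstein-Uhlenbeck) component, plus Poisson-driven spikes. The hidden Markov regime that motivates the filtering is omitted here, and all parameter values are illustrative, not the speaker's.

```python
import numpy as np

def simulate_price(T=365, dt=1.0, mu=40.0, kappa=0.3, sigma=4.0,
                   jump_rate=0.05, jump_size=25.0, seed=1):
    """Seasonality + OU mean reversion + Poisson spikes (illustrative)."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    t = np.arange(n) * dt
    season = 10.0 * np.sin(2 * np.pi * t / 365.0)    # yearly periodicity term
    x = np.zeros(n)                                   # mean-reverting deviation
    for i in range(1, n):
        jump = jump_size * rng.poisson(jump_rate * dt)
        x[i] = (x[i - 1] + kappa * (0.0 - x[i - 1]) * dt
                + sigma * np.sqrt(dt) * rng.standard_normal() + jump)
    return mu + season + x

prices = simulate_price()
```

The spikes decay back towards the seasonal mean through the same mean-reverting term, which is the characteristic feature of electricity prices that plain diffusion models miss.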

Progress in the prediction of buoyancy-affected turbulence 15:10 Fri 17 May, 2013 :: B.18 Ingkarni Wardli :: Dr Daniel Chung :: University of Melbourne
Buoyancy-affected turbulence represents a significant challenge to our understanding, yet it dominates many important flows that occur in the ocean and atmosphere. The presentation will highlight some recent progress in the characterisation, modelling and prediction of buoyancy-affected turbulence using direct and large-eddy simulations, along with implications for the characterisation of mixing in the ocean and the low-cloud feedback in the atmosphere. Specifically, direct numerical simulation data of stratified turbulence will be employed to highlight the importance of boundaries in the characterisation of turbulent mixing in the ocean. Then, a subgrid-scale model that captures the anisotropic character of stratified mixing will be developed for large-eddy simulation of buoyancy-affected turbulence. Finally, the subgrid-scale model is utilised to perform a systematic large-eddy simulation investigation of the archetypal low-cloud regimes, from which the link between the lower-tropospheric stability criterion and the cloud fraction is interpreted.

Multiscale modelling couples patches of wave-like simulations 12:10 Mon 27 May, 2013 :: B.19 Ingkarni Wardli :: Meng Cao :: University of Adelaide
A multiscale model is proposed to significantly reduce the expensive numerical simulations of complicated waves over large spatial domains. The multiscale model is built from given microscale simulations of complicated physical processes such as sea ice or turbulent shallow water. Our long-term aim is to enable macroscale simulations obtained by coupling small patches of simulations together over large physical distances. This initial work explores the coupling of patch simulations of wave-like PDEs. With the line of development being towards water waves, we discuss the dynamics of two complementary fields called the 'depth' h and 'velocity' u. A staggered grid is used for the microscale simulation of the depth h and velocity u. We introduce a macroscale staggered grid to couple the microscale patches. Linear or quadratic interpolation provides boundary conditions on the field in each patch. Linear analysis of the whole coupled multiscale system establishes that the resultant macroscale dynamics is appropriate. Numerical simulations support the linear analysis. This multiscale method should empower the feasible computation of large-scale simulations of wave-like dynamics with complicated underlying physics.

The Hamiltonian Cycle Problem and Markov Decision Processes 15:10 Fri 2 Aug, 2013 :: B.18 Ingkarni Wardli :: Prof Jerzy Filar :: Flinders University
We consider the famous Hamiltonian cycle problem (HCP) embedded in a Markov decision process (MDP). More specifically, we consider a moving object on a graph G where, at each vertex, a controller may select an arc emanating from that vertex according to a probabilistic decision rule. A stationary policy is simply a control where these decision rules are time invariant. Such a policy induces a Markov chain on the vertices of the graph. Therefore, HCP is equivalent to a search for a stationary policy that induces a 0-1 probability transition matrix whose nonzero entries trace out a Hamiltonian cycle in the graph. A consequence of this embedding is that we may consider the problem over a number of alternative, convex (rather than discrete) domains. These include: (a) the space of stationary policies, (b) the more restricted but very natural space of doubly stochastic matrices induced by the graph, and (c) the associated spaces of so-called "occupational measures". This approach has led to both theoretical and algorithmic advances on the underlying HCP. In this presentation, we outline a selection of results generated by this line of research.
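The check behind the embedding is elementary for deterministic policies: a policy that selects exactly one arc per vertex induces a 0-1 transition matrix, and it solves HCP precisely when following the chosen arcs from any vertex returns to the start only after visiting all n vertices. A sketch of that check; the example graph and names are my own illustration.

```python
def traces_hamiltonian_cycle(policy, n):
    """policy[v] = the successor vertex chosen at v (a deterministic
    stationary policy, i.e. a 0-1 transition matrix in disguise).
    Returns True iff the induced chain is a single n-cycle."""
    seen, v = set(), 0
    for _ in range(n):
        if v in seen:
            return False       # closed a cycle before covering all vertices
        seen.add(v)
        v = policy[v]
    return v == 0 and len(seen) == n

# On 4 vertices: 0 -> 1 -> 2 -> 3 -> 0 is Hamiltonian; two 2-cycles are not
hamiltonian = {0: 1, 1: 2, 2: 3, 3: 0}
two_cycles = {0: 1, 1: 0, 2: 3, 3: 2}
```

The hard part, of course, is searching the convex domains (a)-(c) for such a policy rather than verifying one, which is what the embedding makes tractable to attack.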

Thin-film flow in helical channels 12:10 Mon 9 Sep, 2013 :: B.19 Ingkarni Wardli :: David Arnold :: University of Adelaide
Spiral particle separators are used in the mineral processing industry to refine ores. A slurry, formed by mixing crushed ore with a fluid, is run down a helical channel and, at the end of the channel, the particles end up sorted in different sections of the channel. Design of such devices is largely experimentally based, and mathematical modelling of flow in helical channels is relatively limited. In this talk, I will outline some of the work that I have been doing on thin-film flow in helical channels.

Modelling the South Australian garfish population slice by slice. 12:10 Mon 14 Oct, 2013 :: B.19 Ingkarni Wardli :: John Feenstra :: University of Adelaide
In this talk I will provide a taste of how South Australian garfish populations are modelled. The role and importance of garfish 'slices' will be explained, along with how these help produce the important reporting quantities of yearly recruitment, legal-size biomass, and exploitation rate within the framework of an age- and length-based population model.

Modelling and optimisation of group dose-response challenge experiments 12:10 Mon 28 Oct, 2013 :: B.19 Ingkarni Wardli :: David Price :: University of Adelaide
An important component of scientific research is the 'experiment'. Effective design of these experiments is important and, accordingly, has received significant attention under the heading 'optimal experimental design'. However, until recently, little work had been done on optimal experimental design for experiments where the underlying process can be modelled by a Markov chain. In this talk, I will discuss some of the work that has been done in the field of optimal experimental design for Markov chains, and some of the work that I have done in applying this theory to dose-response challenge experiments for the bacterium Campylobacter jejuni in chickens.

A gentle introduction to bubble evolution in Hele-Shaw flows 15:10 Fri 22 Nov, 2013 :: 5.58 (Ingkarni Wardli) :: Dr Scott McCue :: QUT
A Hele-Shaw cell is easy to make and serves as a fun toy for an applied mathematician to play with. If we inject air into a Hele-Shaw cell that is otherwise filled with viscous fluid, we can observe a bubble of air growing in size. The process is highly unstable, and the bubble boundary expands in an uneven fashion, leading to striking fingering patterns (look up Hele-Shaw cell or Saffman-Taylor instability on YouTube). From a mathematical perspective, modelling these Hele-Shaw flows is interesting because the governing equations are sufficiently ``simple'' that a considerable amount of analytical progress is possible. Indeed, there is no other context in which (genuinely) two-dimensional moving boundary problems are so tractable. More generally, Hele-Shaw flows are important as they serve as prototypes for more complicated (and important) physical processes such as crystal growth and diffusion-limited aggregation. I will give an introduction to some of the main ideas and summarise some of my present research in this area.


A few flavours of optimal control of Markov chains 11:00 Thu 12 Dec, 2013 :: B18 :: Dr Sam Cohen :: Oxford University
In this talk we will outline a general view of optimal control of a continuous-time Markov chain, and how this naturally leads to the theory of Backward Stochastic Differential Equations. We will see how this class of equations gives a natural setting in which to study these problems, and how we can calculate numerical solutions in many settings. These will include problems with payoffs with memory, with random terminal times, with ergodic and infinite-horizon value functions, and with finitely and infinitely many states. Examples will be drawn from finance, networks and electronic engineering.

Weak Stochastic Maximum Principle (SMP) and Applications 15:10 Thu 12 Dec, 2013 :: B.21 Ingkarni Wardli :: Dr Harry Zheng :: Imperial College, London
In this talk we discuss a weak necessary and sufficient SMP for Markov-modulated optimal control problems. Instead of insisting on the maximum condition of the Hamiltonian, we show that 0 belongs to the sum of Clarke's generalized gradient of the Hamiltonian and Clarke's normal cone of the control constraint set at the optimal control. Under a joint concavity condition on the Hamiltonian, the necessary condition becomes sufficient. We give examples to demonstrate the weak SMP and its applications in quadratic loss minimization.

The effects of pre-existing immunity 15:10 Fri 7 Mar, 2014 :: B.18 Ingkarni Wardli :: Associate Professor Jane Heffernan :: York University, Canada
Immune system memory, also called immunity, is gained as a result of primary infection or vaccination, and can be boosted after vaccination or secondary infections. Immunity is developed so that the immune system is primed to react and fight a pathogen earlier and more effectively in secondary infections. The effects of memory, however, on pathogen propagation in an individual host (in-host) and in a population (epidemiology) are not well understood. Mathematical models of infectious diseases, employing dynamical systems, computer simulation and bifurcation analysis, can provide projections of pathogen propagation, show outcomes of infection and help inform public health interventions. In the Modelling Infection and Immunity (MI^2) lab, we develop and study biologically informed mathematical models of infectious diseases at both levels of infection, and combine these models into comprehensive multiscale models so that the effects of individual immunity in a population can be determined. In this talk we will discuss some of the interesting mathematical phenomena that arise in our models, and show how our results are directly applicable to what is known about the persistence of infectious diseases.

Outlier removal using the Bayesian information criterion for group-based trajectory modelling 12:10 Mon 28 Apr, 2014 :: B.19 Ingkarni Wardli :: Chris Davies :: University of Adelaide
Attributes measured longitudinally can be used to define discrete paths of measurements, or trajectories, for each individual in a given population. Group-based trajectory modelling methods can be used to identify subgroups of trajectories within a population, such that trajectories that are grouped together are more similar to each other than to trajectories in distinct groups. Existing methods generally allocate every individual trajectory into one of the estimated groups. However, this does not allow for the possibility that some individuals may be following trajectories so different from the rest of the population that they should not be included in a group-based trajectory model. This results in these outlying trajectories being treated as though they belong to one of the groups, distorting the estimated trajectory groups and any subsequent analyses that use them.
We have developed an algorithm for removing outlying trajectories based on the maximum change in the Bayesian information criterion (BIC) due to removing a single trajectory. As well as deciding which trajectory to remove, the number of groups in the model can also change. The decision to remove an outlying trajectory is made by comparing the log-likelihood contributions of the observations to those of simulated samples from the estimated group-based trajectory model. In this talk the algorithm will be detailed and an application of its use will be demonstrated.
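For reference, the criterion itself: BIC = k ln n - 2 ln L, penalising the maximised log-likelihood L by the parameter count k and the sample size n; lower is better. A sketch with hypothetical fitted values (the actual trajectory-model likelihood and the simulation-based removal decision are not reproduced here).

```python
import numpy as np

def bic(log_likelihood, n_params, n_obs):
    """Bayesian information criterion: k * ln(n) - 2 * ln(L). Lower is better."""
    return n_params * np.log(n_obs) - 2.0 * log_likelihood

# Hypothetical maximised log-likelihoods for 2-group vs 3-group models
# fitted to the same n = 200 trajectories (values are made up):
n = 200
bic_2groups = bic(-412.7, n_params=7, n_obs=n)
bic_3groups = bic(-401.3, n_params=11, n_obs=n)
```

In this made-up comparison the extra groups buy enough likelihood to beat the penalty; the outlier-removal algorithm applies the same kind of comparison before and after dropping a candidate trajectory, allowing the group count to change at the same time.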

Ergodicity and loss of capacity: a stochastic horseshoe? 15:10 Fri 9 May, 2014 :: B.21 Ingkarni Wardli :: Professor Ami Radunskaya :: Pomona College, the United States of America
Random fluctuations of an environment are common in ecological and
economic settings. The resulting processes can be described by a
stochastic dynamical system, where a family of maps parametrized by an
independent, identically distributed random variable forms the basis for a
Markov chain on a continuous state space. Random dynamical systems are a
beautiful combination of deterministic and random processes, and they have
received considerable interest since von Neumann and Ulam's seminal work in
the 1940s. Key questions in the study of a stochastic dynamical system
are: does the system have a well-defined average, i.e. is it ergodic?
How does this long-term behaviour compare to that of the state
variable in a constant environment with the averaged parameter?
In this talk we answer these questions for a family of maps on the unit
interval that model self-limiting growth. The techniques used can be
extended to study other families of concave maps, and so we conjecture the
existence of a "stochastic horseshoe".
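The flavour of the second question can be seen with random logistic maps x -> r x(1 - x), drawing r i.i.d. at each step. A short stationarity calculation (assuming the stationary mean exists and r is drawn independently of the current state) gives m = (1 - 1/r_bar) - Var(x)/m, so the time average sits strictly below the fixed point of the constant-environment map with the averaged parameter r_bar. A sketch, with parameter values that are mine and not from the talk:

```python
import numpy as np

def random_logistic_average(r_values, n_steps=200000, burn_in=1000, seed=0):
    """Time average of x_{n+1} = r_n x_n (1 - x_n), r_n i.i.d. uniform on r_values."""
    rng = np.random.default_rng(seed)
    rs = rng.choice(r_values, size=n_steps)   # i.i.d. random environment
    x, total = 0.3, 0.0
    for step, r in enumerate(rs):
        x = r * x * (1.0 - x)
        if step >= burn_in:
            total += x
    return total / (n_steps - burn_in)

avg_random = random_logistic_average([2.4, 3.0])   # fluctuating environment
fixed_pt = 1.0 - 1.0 / 2.7                         # fixed point at the mean r = 2.7
```

This concavity effect is generic for the self-limiting growth maps of the talk: averaging the environment overestimates the long-run state.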

Stochastic models of evolution: Trees and beyond 15:10 Fri 16 May, 2014 :: B.18 Ingkarni Wardli :: Dr Barbara Holland :: The University of Tasmania
In the first part of the talk I will give a general introduction to phylogenetics, and discuss some of the mathematical and statistical issues that arise in trying to infer evolutionary trees. In particular, I will discuss how we model the evolution of DNA along a phylogenetic tree using a continuous-time Markov process.
In the second part of the talk I will discuss how to express the two-state continuous-time Markov model on phylogenetic trees in such a way that allows its extension to more general models. In this framework we can model convergence of species as well as divergence (speciation). I will discuss the identifiability (or otherwise) of the models that arise in some simple cases. Use of a statistical framework means that we can use established techniques such as the AIC or likelihood ratio tests to decide if datasets show evidence of convergent evolution.
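For the two-state model the branch transition probabilities are available in closed form: with rate matrix Q = [[-a, a], [b, -b]], the matrix exponential exp(Qt) works out to the expression coded below, so no numerical exponential is needed. A sketch; the rate values are illustrative.

```python
import numpy as np

def two_state_P(t, a, b):
    """Closed-form exp(Qt) for the two-state CTMC with rate a for 0 -> 1
    and rate b for 1 -> 0."""
    s = a + b
    e = np.exp(-s * t)
    return np.array([[b + a * e, a * (1.0 - e)],
                     [b * (1.0 - e), a + b * e]]) / s

P_branch = two_state_P(0.5, a=1.0, b=2.0)   # substitution probabilities on a branch
```

Sanity checks: P(0) is the identity, rows sum to one, P(t) tends to the stationary distribution (b, a)/(a+b) as t grows, and the Chapman-Kolmogorov property P(t+s) = P(t) P(s) holds, as it must for a Markov process composed along the edges of a tree.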

Group meeting 15:10 Fri 6 Jun, 2014 :: 5.58 Ingkarni Wardli :: Meng Cao and Trent Mattner :: University of Adelaide
Meng Cao :: Multiscale modelling couples patches of nonlinear wave-like simulations ::
Abstract:
The multiscale gap-tooth scheme is built from given microscale simulations of complicated physical processes to empower macroscale simulations. By coupling small patches of simulations over unsimulated physical gaps, large savings in computational time are possible. So far the gap-tooth scheme has been developed for dissipative systems, but wave systems are also of great interest. This work develops the gap-tooth scheme for the case of nonlinear microscale simulations of wave-like systems. Classic macroscale interpolation provides a generic coupling between patches that achieves arbitrarily high-order consistency between the multiscale scheme and the underlying microscale dynamics. Eigenanalysis indicates that the resultant gap-tooth scheme empowers feasible computation of large-scale simulations of wave-like dynamics with complicated underlying physics. As a pilot study, we implement numerical simulations of dam-breaking waves by the gap-tooth scheme. Comparison between a gap-tooth simulation, a microscale simulation over the whole domain, and some published experimental data on dam breaking demonstrates that the gap-tooth scheme feasibly computes large-scale wave-like dynamics with computational savings.
Trent Mattner :: Coupled atmosphere-fire simulations of the Canberra 2003 bushfires using WRF-SFIRE :: Abstract:
The Canberra fires of January 18, 2003 are notorious for the extreme fire behaviour and fire-atmosphere-topography interactions that occurred, including lee-slope fire channelling, pyrocumulonimbus development and tornado formation. In this talk, I will discuss coupled fire-weather simulations of the Canberra fires using WRF-SFIRE. In these simulations, a fire-behaviour model is used to dynamically predict the evolution of the fire front according to local atmospheric and topographic conditions, as well as the associated heat and moisture fluxes to the atmosphere. It is found that the predicted fire front and heat flux are not too bad, bearing in mind the complexity of the problem and the severe modelling assumptions made. However, the predicted moisture flux is too low, which has some impact on the atmospheric dynamics.

Modelling the mean-field behaviour of cellular automata 12:10 Mon 4 Aug, 2014 :: B.19 Ingkarni Wardli :: Kale Davies :: University of Adelaide
Cellular automata (CA) are lattice-based models in which agents fill the lattice sites and behave according to some specified rule. CA are particularly useful when modelling cell behaviour, and as such many people consider CA models in which agents undergo motility and proliferation type events. We are particularly interested in predicting the average behaviour of these models. In this talk I will show how a system of differential equations can be derived for the system, and discuss the difficulties that arise in even the seemingly simple case of a CA with motility and proliferation.
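To see where a mean-field description can struggle, here is a toy 1D CA with proliferation only: the mean-field logistic ODE dC/dt = p C(1 - C) predicts exponential early growth, but on the lattice daughters can only be placed on neighbouring sites, so clustering slows growth below the mean-field prediction. All modelling choices below are mine, not the speaker's:

```python
import numpy as np

def ca_proliferation(width=200, p_prolif=0.05, steps=100, seed=0):
    """1D CA: each occupied site attempts to place a daughter at a random
    nearest neighbour with probability p_prolif; placement onto an
    occupied site has no effect (the event is blocked)."""
    rng = np.random.default_rng(seed)
    lattice = rng.random(width) < 0.05          # roughly 5% initial occupancy
    density = [lattice.mean()]
    for _ in range(steps):
        for i in np.flatnonzero(lattice):       # snapshot of occupied sites
            if rng.random() < p_prolif:
                j = (i + rng.choice([-1, 1])) % width
                lattice[j] = True
        density.append(lattice.mean())
    return np.array(density)

dens = ca_proliferation()
```

Comparing `dens` against the logistic solution exposes the discrepancy: spatial correlations between parent and daughter sites violate the independence assumption behind the mean-field equations, which is precisely the difficulty the talk addresses.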

Modelling biological gel mechanics 12:10 Mon 8 Sep, 2014 :: B.19 Ingkarni Wardli :: James Reoch :: University of Adelaide
The behaviour of gels such as collagen is the result of complex interactions between mechanical and chemical forces. In this talk, I will outline the modelling approaches we are looking at in order to incorporate the influence of cell behaviour alongside chemical potentials, and the various circumstances which lead to gel swelling and contraction.

Spectral asymptotics on random Sierpinski gaskets 12:10 Fri 26 Sep, 2014 :: Ingkarni Wardli B20 :: Uta Freiberg :: Universitaet Stuttgart
Self-similar fractals are often used in modelling porous media. Hence, defining a Laplacian and a Brownian motion on such sets describes transport through such materials. However, the assumption of strict self-similarity could be too restrictive, so we present several models of random fractals which could be used instead. After recalling the classical approaches of random homogeneous and recursive random fractals, we show how to interpolate between these two model classes with the help of so-called V-variable fractals. This concept (developed by Barnsley, Hutchinson & Stenflo) allows the definition of new families of random fractals, whereby the parameter V describes the degree of 'variability' of the realizations. We discuss how the degree of variability influences the geometric, analytic and stochastic properties of these sets. These results have been obtained with Ben Hambly (University of Oxford) and John Hutchinson (ANU, Canberra).

A Hybrid Markov Model for Disease Dynamics 12:35 Mon 29 Sep, 2014 :: B.19 Ingkarni Wardli :: Nicolas Rebuli :: University of Adelaide
Modelling the spread of infectious diseases is fundamental to protecting ourselves from potentially devastating epidemics. Among other factors, two key indicators for the severity of an epidemic are the size of the epidemic and the time until the last infectious individual is removed. To estimate the distribution of the size and duration of an epidemic (within a realistic population) an epidemiologist will typically use Monte Carlo simulations of an appropriate Markov process. However, the number of states in the simplest Markov epidemic model, the SIR model, is quadratic in the population size and so Monte Carlo simulations are computationally expensive. In this talk I will discuss two methods for approximating the SIR Markov process and I will demonstrate the approximation error by comparing probability distributions and estimates of the distributions of the final size and duration of an SIR epidemic. 
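The baseline being approximated is direct stochastic simulation of the SIR Markov chain, for example by the Gillespie algorithm, whose cost per realisation grows with the number of events. A minimal sketch returning the final size and duration of one realisation; the parameter values are illustrative.

```python
import numpy as np

def sir_gillespie(N, I0, beta, gamma, seed=0):
    """One Gillespie realisation of the Markovian SIR model.
    Returns (final_size, duration), where final_size counts
    infections beyond the initial seeds."""
    rng = np.random.default_rng(seed)
    S, I, t = N - I0, I0, 0.0
    while I > 0:
        infection_rate = beta * S * I / N
        recovery_rate = gamma * I
        total = infection_rate + recovery_rate
        t += rng.exponential(1.0 / total)        # time to the next event
        if rng.random() < infection_rate / total:
            S, I = S - 1, I + 1                  # infection event
        else:
            I -= 1                               # recovery (removal) event
    return N - I0 - S, t

# Repeated realisations estimate the final-size distribution
sizes = [sir_gillespie(200, 2, beta=2.0, gamma=1.0, seed=k)[0] for k in range(50)]
```

With beta/gamma = 2 the final-size distribution is bimodal: minor outbreaks that fizzle out and major outbreaks infecting most of the population. Resolving that distribution accurately at large N is what makes the approximation methods in the talk worthwhile.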

Modelling segregation distortion in multi-parent crosses 15:00 Mon 17 Nov, 2014 :: 5.57 Ingkarni Wardli :: Rohan Shah (joint work with B. Emma Huang and Colin R. Cavanagh) :: The University of Queensland
Construction of high-density genetic maps has been made feasible by low-cost high-throughput genotyping technology; however, the process is still complicated by biological, statistical and computational issues. A major challenge is the presence of segregation distortion, which can be caused by selection, differences in fitness, or suppression of recombination due to introgressed segments from other species. Alien introgressions are common in major crop species, where they have often been used to introduce beneficial genes from wild relatives.
Segregation distortion causes problems at many stages of the map construction process, including assignment to linkage groups and estimation of recombination fractions. This can result in incorrect ordering and estimation of map distances. While discarding markers will improve the resulting map, it may result in the loss of genomic regions under selection or containing beneficial genes (in the case of introgression).
To correct for segregation distortion we model it explicitly in the estimation of recombination fractions. Previously proposed methods introduce additional parameters to model the distortion, with a corresponding increase in computing requirements. This poses difficulties for large, densely genotyped experimental populations. We propose a method imposing minimal additional computational burden which is suitable for high-density map construction in large multi-parent crosses. We demonstrate its use by modelling the known Sr36 introgression in wheat for an eight-parent complex cross.


Multiscale modelling of multicellular biological systems: mechanics, development and disease 03:10 Fri 6 Mar, 2015 :: Lower Napier LG24 :: Dr James Osborne :: University of Melbourne
When investigating the development and function of multicellular biological systems it is not enough to only consider the behaviour of individual cells in isolation. For example, when studying tissue development, how individual cells interact, both mechanically and biochemically, influences the resulting tissue's form and function. In this talk we present a multiscale modelling framework for simulating the development and function of multicellular biological systems (in particular tissues). Utilising the natural structural unit of the cell, the framework consists of three main scales: the tissue level (macroscale); the cell level (mesoscale); and the subcellular level (microscale), with multiple interactions occurring between all scales. The cell level is central to the framework and cells are modelled as discrete interacting entities using one of a number of possible modelling paradigms, including lattice-based models (cellular automata and cellular Potts) and off-lattice models (cell-centre and vertex-based representations). The subcellular level concerns numerous metabolic and biochemical processes represented by interaction networks, rendered either stochastically or as systems of ODEs. The outputs from such systems influence the behaviour of the cell level, affecting properties such as adhesion and also influencing cell mitosis and apoptosis. At the tissue level we consider factors or constraints that influence the cells, for example the distribution of a nutrient or messenger molecule, which is represented by field equations on a growing domain, with individual cells functioning as sinks and/or sources. The modular approach taken within the framework enables more realistic behaviour to be considered at each scale.
This framework is implemented within the open-source Chaste library (Cancer, Heart and Soft Tissue Environment, http://www.cs.ox.ac.uk/chaste/) and has been used to model biochemical and biomechanical interactions in various biological systems. In this talk we present the key ideas of the framework along with applications within the fields of development and disease. 

Cricket and Maths 12:10 Mon 16 Mar, 2015 :: Napier LG29 :: Peter Ballard :: University of Adelaide
Each game of international cricket has a scorecard. You don't need to know much maths to go through these scorecards and extract simple information, such as batting and bowling averages. However, there is also the opportunity to use some more advanced maths. I will be using a bit of optimisation, probability and statistics to try to answer the questions: Which was the most dominant team ever? What scores are most likely? And are some players unlucky? 

How do we quantify the filamentous growth in a yeast colony? 12:10 Mon 30 Mar, 2015 :: Ingkarni Wardli 715 Conference Room :: Dr. Benjamin Binder :: School of Mathematical Sciences
In this talk we will develop a systematic method to measure the spatial patterning of colony morphology. A hybrid modelling approach of the growth process will also be discussed. 

Group Meeting 15:10 Fri 24 Apr, 2015 :: N218 Engineering North :: Dr Ben Binder :: University of Adelaide
Talk (Dr Ben Binder): How do we quantify the filamentous growth in a yeast colony?
Abstract: In this talk we will develop a systematic method to measure the spatial patterning of yeast colony morphology. The methods are applicable to other physical systems with circular spatial domains, for example, batch mixing fluid devices. A hybrid modelling approach of the yeast growth process will also be discussed.
After the seminar, Ben will start a group discussion by sharing some information and experiences on attracting honours/PhD students to the group. 

Group Meeting 15:10 Fri 29 May, 2015 :: EM 213 :: Dr Judy Bunder :: University of Adelaide
Talk: Patch dynamics for efficient exascale simulations
Abstract
Massive parallelisation has led to a dramatic increase in available computational power. However, data transfer speeds have failed to keep pace and are the major limiting factor in the development of exascale computing. New algorithms must be developed which minimise the transfer of data. Patch dynamics is a computational macroscale modelling scheme which provides a coarse macroscale solution of a problem defined on a fine microscale by dividing the domain into many non-overlapping, coupled patches. Patch dynamics is readily adaptable to massive parallelisation as each processor core can evaluate the dynamics on one, or a few, patches. However, patch coupling conditions interpolate across the unevaluated parts of the domain between patches and require almost continuous data transfer. We propose a modified patch dynamics scheme which minimises data transfer by only re-evaluating the patch coupling conditions at 'mesoscale' time scales which are significantly larger than the time scale of the microscale problem. We analyse and quantify the error arising from patch dynamics with mesoscale temporal coupling. 

Dynamics on Networks: The role of local dynamics and global networks on hypersynchronous neural activity 15:10 Fri 31 Jul, 2015 :: Ingkarni Wardli B21 :: Prof John Terry :: University of Exeter, UK
Graph theory has evolved into a useful tool for studying complex brain networks inferred from a variety of measures of neural activity, including fMRI, DTI, MEG and EEG. In the study of neurological disorders, recent work has discovered differences in the structure of graphs inferred from patient and control cohorts. However, most of these studies pursue a purely observational approach, identifying correlations between properties of graphs and the cohort which they describe, without consideration of the underlying mechanisms. Moving beyond this necessitates the development of mathematical modelling approaches to appropriately interpret network interactions and the alterations in brain dynamics they permit.
In the talk we introduce some of these concepts with application to epilepsy, introducing a dynamic network approach to study resting-state EEG recordings from a cohort of 35 people with epilepsy and 40 adult controls. Using this framework we demonstrate a strongly significant difference between networks inferred from the background activity of people with epilepsy in comparison to normal controls. Our findings demonstrate that a mathematical model-based analysis of routine clinical EEG provides significant additional information beyond standard clinical interpretation, which may ultimately enable a more appropriate mechanistic stratification of people with epilepsy, leading to improved diagnostics and therapeutics. 

In vitro models of colorectal cancer: why and how? 15:10 Fri 7 Aug, 2015 :: B19 Ingkarni Wardli :: Dr Tamsin Lannagan :: Gastrointestinal Cancer Biology Group, University of Adelaide / SAHMRI
1 in 20 Australians will develop colorectal cancer (CRC) and it is the second most common cause of cancer death. Similar to many other cancer types, it is the metastases rather than the primary tumour that are lethal, and prognosis is defined by "how far" the tumour has spread at the time of diagnosis. Modelling in vivo behaviour through rapid and relatively inexpensive in vitro assays would help better target therapies as well as help develop new treatments. One such new in vitro tool is the culture of 3D organoids. Organoids are a biologically stable means of growing, storing and testing treatments against bowel cancer. To this end, we have just set up a human colorectal organoid bank across Australia. This consortium will help us to relate in vitro growth patterns to in vivo behaviour and ultimately in the selection of patients for personalised therapies. Organoid growth, however, is complex. There appear to be variable growth rates and growth patterns. Together with members of the ECMS we recently gained funding to better quantify and model spatial structures in these colorectal organoids. This partnership will aim to directly apply the expertise within the ECMS to patient care. 

Modelling terrorism risk - can we predict future trends? 12:10 Mon 10 Aug, 2015 :: Benham Labs G10 :: Stephen Crotty :: University of Adelaide
As we are all aware, the incidence of terrorism is increasing in the world today. This is confirmed when viewing terrorism events since 1970 as a time series. Can we model this increasing trend and use it to predict terrorism events in the future? Probably not, but we'll give it a go anyway. 

Natural Optimisation (No Artificial Colours, Flavours or Preservatives) 12:10 Mon 21 Sep, 2015 :: Benham Labs G10 :: James Walker :: University of Adelaide
Sometimes nature seems to have the best solutions to complicated optimisation problems. For example, ant colonies have a clever way of optimising the amount of food brought to the colony using pheromones, and the process of natural selection gives rise to species which are optimally suited to their environment. Although the following process is not technically natural, for centuries people have also been using properties of crystal formation to make steel with optimal properties. In this talk I will discuss non-convex optimisation and some optimisation methods inspired by natural processes. 
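The steel and crystal-formation analogy is the classic motivation for simulated annealing; here is a minimal sketch on a one-dimensional Rastrigin-type function with many local minima (the cooling schedule, step size and test function are illustrative choices, not taken from the talk).

```python
import math, random

def simulated_annealing(f, x0, steps=20000, t0=1.0, seed=0):
    """Minimise f by annealing: accept uphill moves with probability exp(-delta/T)."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for k in range(steps):
        temp = t0 * (1 - k / steps) + 1e-9         # linear cooling schedule
        y = x + rng.gauss(0, 0.5)                  # random neighbouring state
        fy = f(y)
        if fy < fx or rng.random() < math.exp((fx - fy) / temp):
            x, fx = y, fy                          # accept (possibly uphill) move
        if fx < fbest:
            best, fbest = x, fx
    return best, fbest

# Rastrigin-type function: many local minima, global minimum 0 at x = 0.
rastrigin = lambda x: x * x - 10 * math.cos(2 * math.pi * x) + 10
xmin, fmin = simulated_annealing(rastrigin, x0=4.5)
```

Accepting occasional uphill moves while the "temperature" is high is what lets the search escape local minima that would trap a greedy descent.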

Modelling Directionality in Stationary Geophysical Time Series 12:10 Mon 12 Oct, 2015 :: Benham Labs G10 :: Mohd Mahayaudin Mansor :: University of Adelaide
Many time series show directionality, inasmuch as plots against time and against time-to-go are qualitatively different, and there is a range of statistical tests to quantify this effect. There are two strategies for allowing for directionality in time series models. Linear models are reversible if and only if the noise terms are Gaussian, so one strategy is to use linear models with non-Gaussian noise. The alternative is to use non-linear models. We investigate how non-Gaussian noise affects directionality in a first-order autoregressive process AR(1) and compare this with a threshold autoregressive model with two thresholds. The findings are used to suggest possible improvements to an AR(9) model, identified by an AIC criterion, for the average yearly sunspot numbers from 1700 to 1900. The improvement is defined in terms of one-step-ahead forecast errors from 1901 to 2014. 
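The reversibility result is easy to illustrate numerically. The sketch below simulates AR(1) series with Gaussian and with centred exponential innovations, and measures directionality by the skewness of first differences, one simple statistic among the range of tests alluded to (the talk's specific tests and parameter values are not given here, so the coefficients are illustrative).

```python
import random

def ar1(n, a, noise, seed):
    """Simulate a first-order autoregressive series x[t] = a*x[t-1] + e[t]."""
    rng = random.Random(seed)
    x, xs = 0.0, []
    for _ in range(n):
        x = a * x + noise(rng)
        xs.append(x)
    return xs

def diff_skewness(xs):
    """Skewness of first differences: zero in expectation for a reversible series."""
    d = [b - a for a, b in zip(xs, xs[1:])]
    m = sum(d) / len(d)
    var = sum((v - m) ** 2 for v in d) / len(d)
    return sum((v - m) ** 3 for v in d) / len(d) / var ** 1.5

gauss = ar1(5000, 0.7, lambda r: r.gauss(0, 1), seed=1)
expon = ar1(5000, 0.7, lambda r: r.expovariate(1.0) - 1.0, seed=1)  # centred, skewed
# The Gaussian-noise series is time-reversible; the exponential-noise series is not.
```

Plotting `expon` against time and against time-to-go makes the asymmetry visible to the eye: sharp rises followed by slow decays.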

Typhoons and Tigers 12:10 Fri 23 Oct, 2015 :: Hughes Lecture Room 322 :: Assoc. Prof. Andrew Metcalfe :: School of Mathematical Sciences
The Sundarbans, situated on the coast of eastern India and south-west Bangladesh, are one of the world's largest mangrove regions (4100 square kilometres). In India, there are over 4 million inhabitants on the deltaic islands in the region. There is a diverse flora and fauna, and it is the only remaining habitat of the Bengal tiger. The Sundarbans is a UNESCO World Heritage Site and International Biodiversity Reserve.
However, the Sundarbans are prone to flooding from the cyclones that regularly develop in the Bay of Bengal. In this talk I shall describe a stochastic model for the flood risk and explain how this can be used to make decisions about flood mitigation strategies and to provide estimates of the increase in flood risk due to rising sea levels and climate change.


Modelling Coverage in RNA Sequencing 09:00 Mon 9 Nov, 2015 :: Ingkarni Wardli 5.57 :: Arndt von Haeseler :: Max F Perutz Laboratories, University of Vienna
RNA sequencing (RNA-seq) is the method of choice for measuring the expression of RNAs in a cell population. In an RNA-seq experiment, sequencing the full length of larger RNA molecules requires fragmentation into smaller pieces to be compatible with the limited read lengths of most deep-sequencing technologies. Unfortunately, the issue of non-uniform coverage across a genomic feature has been a concern in RNA-seq and is attributed to preferences for certain fragments in steps of library preparation and sequencing. However, the disparity between the observed non-uniformity of read coverage in RNA-seq data and the assumption of expected uniformity elicits a query on the read coverage profile one should expect across a transcript if there are no biases in the sequencing protocol. We propose a simple model of unbiased fragmentation where we find that the expected coverage profile is not uniform and, in fact, depends on the ratio of fragment length to transcript length. To compare the non-uniformity proposed by our model with experimental data, we extended this simple model to incorporate empirical attributes matching those of the sequenced transcript in an RNA-seq experiment. In addition, we imposed an experimentally derived distribution on the frequency at which fragment lengths occur.
We used this model to compare our theoretical prediction with experimental data and with the uniform coverage model. If time permits, we will also discuss a potential application of our model. 
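The core claim, that unbiased fragmentation already yields non-uniform coverage, can be reproduced with a short calculation. Assume fragments of a fixed length f whose start positions are uniform along a transcript of length L (a deliberate simplification of the abstract's model, which uses an empirical fragment-length distribution):

```python
def expected_coverage(L, f):
    """Relative expected coverage at each position of a transcript of length L
    when fragments of length f start uniformly at positions 0..L-f."""
    assert 1 <= f <= L
    profile = []
    for x in range(L):
        lo = max(0, x - f + 1)        # earliest fragment start covering x
        hi = min(x, L - f)            # latest fragment start covering x
        profile.append((hi - lo + 1) / (L - f + 1))
    return profile

prof = expected_coverage(L=100, f=30)
# Trapezoidal, not uniform: positions near either end can be covered by fewer
# fragment placements, so coverage ramps up, plateaus, and ramps down.
```

The plateau height is f/(L-f+1) and the ramps each span f-1 positions, so the shape of the profile depends on the ratio of fragment length to transcript length, exactly as the abstract states.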

Weak globularity in homotopy theory and higher category theory 12:10 Thu 12 Nov, 2015 :: Ingkarni Wardli B19 :: Simona Paoli :: University of Leicester
Spaces and homotopy theories are fundamental objects of study of algebraic topology. One way to study these objects is to break them into smaller components with the Postnikov decomposition. To describe such decomposition purely algebraically we need higher categorical structures. We describe one approach to modelling these structures based on a new paradigm to build weak higher categories, which is the notion of weak globularity. We describe some of their connections to both homotopy theory and higher category theory. 

Use of epidemic models in optimal decision making 15:00 Thu 19 Nov, 2015 :: Ingkarni Wardli 5.57 :: Tim Kinyanjui :: School of Mathematics, The University of Manchester
Epidemic models have proved useful in a number of applications in epidemiology. In this work, I will present two areas in which we have used modelling to make informed decisions. Firstly, we have used an age-structured mathematical model to describe the transmission of Respiratory Syncytial Virus in a developed country setting and to explore different vaccination strategies. We found that delayed infant vaccination has significant potential in reducing the number of hospitalisations in the most vulnerable group and that most of the reduction is due to indirect protection. It also suggests that marked public health benefit could be achieved through RSV vaccine delivered to age groups not seen as most at risk of severe disease. The second application is in the optimal design of studies aimed at collection of household-stratified infection data. A design decision involves making a trade-off between the number of households to enrol and the sampling frequency. Two commonly used study designs are considered: cross-sectional and cohort. The search for an optimal design uses Bayesian methods to explore the joint parameter-design space, combined with the Shannon entropy of the posteriors to estimate the amount of information for each design. We found that for the cross-sectional designs the amount of information increases with the sampling intensity, while the cohort design often exhibits a trade-off between the number of households sampled and the intensity of follow-up. Our results broadly support the choices made in existing data collection studies. 

A Semi-Markovian Modeling of Limit Order Markets 13:00 Fri 11 Dec, 2015 :: Ingkarni Wardli 5.57 :: Anatoliy Swishchuk :: University of Calgary
R. Cont and A. de Larrard (SIAM J. Financial Mathematics, 2013) introduced a tractable stochastic model for the dynamics of a limit order book, computing various quantities of interest such as the probability of a price increase or the diffusion limit of the price process. As suggested by empirical observations, we extend their framework to 1) arbitrary distributions for book event inter-arrival times (possibly non-exponential) and 2) the case where both the nature of a new book event and its corresponding inter-arrival time depend on the nature of the previous book event. We do so by resorting to Markov renewal processes to model the dynamics of the bid and ask queues. We keep analytical tractability via explicit expressions for the Laplace transforms of various quantities of interest. Our approach is justified and illustrated by calibrating the model to the five stocks Amazon, Apple, Google, Intel and Microsoft on June 21st 2012. As in Cont and de Larrard, the bid-ask spread remains constant and equal to one tick, only the bid and ask queues are modelled (they are independent from each other and get reinitialized after a price change), and all orders have the same size. (This talk is based on our joint paper with Nelson Vadori (Morgan Stanley).) 
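The queueing picture can be caricatured in a few lines: the price ticks up when the ask queue empties first and down when the bid queue empties first. The sketch below is a toy, not the calibrated model of the paper: inter-arrival times are non-exponential (uniform, in the spirit of the Markov renewal extension), each event hits one side at random, and the event probabilities are illustrative.

```python
import random

def price_move(bid0, ask0, interarrival, rng):
    """Race the bid and ask queues: return (+1 if the ask queue empties first,
    -1 otherwise) together with the time of the price change. Each event hits
    one side at random and is either an addition (limit order) or a depletion
    (market order or cancellation)."""
    bid, ask, t = bid0, ask0, 0.0
    while bid > 0 and ask > 0:
        t += interarrival(rng)
        delta = 1 if rng.random() < 0.45 else -1   # slight net depletion
        if rng.choice(("bid", "ask")) == "bid":
            bid = max(bid + delta, 0)
        else:
            ask = max(ask + delta, 0)
    return (1 if ask == 0 else -1), t

rng = random.Random(7)
uniform_gap = lambda r: r.uniform(0.0, 2.0)        # non-exponential inter-arrivals
moves = [price_move(5, 3, uniform_gap, rng) for _ in range(2000)]
p_up = sum(1 for m, _ in moves if m == 1) / len(moves)
```

With the ask queue starting shorter than the bid queue, upward moves dominate, mirroring the intuition that queue imbalance predicts the direction of the next price change; the paper obtains such probabilities analytically via Laplace transforms rather than by simulation.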

Mathematical modelling of the immune response to influenza 15:00 Thu 12 May, 2016 :: Ingkarni Wardli B20 :: Ada Yan :: University of Melbourne
The immune response plays an important role in the resolution of primary influenza infection and prevention of subsequent infection in an individual. However, the relative roles of each component of the immune response in clearing infection, and the effects of interaction between components, are not well quantified.
We have constructed a model of the immune response to influenza based on data from viral interference experiments, where ferrets were exposed to two influenza strains within a short time period. The changes in viral kinetics of the second virus due to the first virus depend on the strains used as well as the interval between exposures, enabling inference of the timing of innate and adaptive immune response components and the role of cross-reactivity in resolving infection. Our model provides a mechanistic explanation for the observed variation in viruses' abilities to protect against subsequent infection at short inter-exposure intervals, either by delaying the second infection or inducing stochastic extinction of the second virus. It also explains the decrease in recovery time for the second infection when the two strains elicit cross-reactive cellular adaptive immune responses. To account for inter-subject as well as inter-virus variation, the model is formulated using a hierarchical framework. We will fit the model to experimental data using Markov chain Monte Carlo methods; quantification of the model will enable a deeper understanding of the effects of potential new treatments.
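The hierarchical model itself is not reproduced in the abstract; a common starting point for within-host influenza kinetics of the kind described is the target-cell-limited ODE model, sketched here with forward Euler integration and illustrative (not fitted) parameter values.

```python
def tiv(T0=4e8, I0=0.0, V0=10.0, beta=1e-8, delta=3.0, p=30.0, c=3.0,
        dt=1e-3, days=10.0):
    """Target-cell-limited viral kinetics: T = target cells, I = infected
    cells, V = free virus, integrated by forward Euler over `days` days."""
    T, I, V, peak = T0, I0, V0, V0
    for _ in range(int(days / dt)):
        dT = -beta * T * V                 # infection of target cells
        dI = beta * T * V - delta * I      # infected cells die at rate delta
        dV = p * I - c * V                 # virions produced and cleared
        T, I, V = T + dt * dT, I + dt * dI, V + dt * dV
        peak = max(peak, V)
    return T, V, peak

T_end, V_end, V_peak = tiv()
# Viral load rises, peaks as target cells are depleted, then resolves.
```

Extensions of this skeleton add innate and adaptive immune compartments and cross-reactivity terms, and a hierarchical fit shares parameters across subjects and strains in the way the abstract describes.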


Harmonic Analysis in Rough Contexts 15:10 Fri 13 May, 2016 :: Engineering South S112 :: Dr Pierre Portal :: Australian National University
In recent years, perspectives on what constitutes the "natural" framework within which to conduct various forms of mathematical analysis have shifted substantially. The common theme of these shifts can be described as a move towards roughness, i.e. the elimination of smoothness assumptions that had previously been considered fundamental. Examples include partial differential equations on domains with a boundary that is merely Lipschitz continuous, geometric analysis on metric measure spaces that do not have a smooth structure, and stochastic analysis of dynamical systems that have nowhere differentiable trajectories.
In this talk, aimed at a general mathematical audience, I describe some of these shifts towards roughness, placing an emphasis on harmonic analysis, and on my own contributions. This includes the development of heat kernel methods in situations where such a kernel is merely a distribution, and applications to deterministic and stochastic partial differential equations. 

Behavioural Microsimulation Approach to Social Policy and Behavioural Economics 15:10 Fri 20 May, 2016 :: S112 Engineering South :: Dr Drew Mellor :: Ernst & Young
SIMULAIT is a general purpose, behavioural microsimulation system designed to predict behavioural trends in human populations. This type of predictive capability grew out of original research initially conducted in conjunction with the Defence Science and Technology Group (DSTO) in South Australia, and has been fully commercialised and is in current use by a global customer base. To our customers, the principal value of the system lies in its ability to predict likely outcomes to scenarios that challenge conventional approaches based on extrapolation or generalisation. These types of scenarios include: the impact of disruptive technologies, such as the impact of widespread adoption of autonomous vehicles for transportation or batteries for household energy storage; and the impact of effecting policy elements or interventions, such as the impact of imposing water usage restrictions.
SIMULAIT employs a multidisciplinary methodology, drawing from agent-based modelling, behavioural science and psychology, microeconomics, artificial intelligence, simulation, game theory, engineering, mathematics and statistics. In this seminar, we start with a high-level view of the system followed by a look under the hood to see how the various elements come together to answer questions about behavioural trends. The talk will conclude with a case study of a recent application of SIMULAIT to a significant policy problem: how to address the deficiency of STEM-skilled teachers in the Victorian teaching workforce. 

Time series analysis of paleoclimate proxies (a mathematical perspective) 15:10 Fri 27 May, 2016 :: Engineering South S112 :: Dr Thomas Stemler :: University of Western Australia
In this talk I will present the work my colleagues from the School of Earth and Environment (UWA), the "transdisciplinary methods" group of the Potsdam Institute for Climate Impact Research, Germany, and I did to explain the dynamics of the Australian-South East Asian monsoon system during the last couple of thousand years.
From a time series perspective, paleoclimate proxy series are more or less the monsters moving under your bed that wake you up in the middle of the night. The data is clearly non-stationary, non-uniformly sampled in time, and the influence of stochastic forcing or the level of measurement noise is more or less unknown. Given these undesirable properties, almost all traditional time series analysis methods fail.
I will highlight two methods that allow us to draw useful conclusions from the data sets. The first one uses Gaussian kernel methods to reconstruct climate networks from multiple proxies. The coupling relationships in these networks change over time and therefore can be used to infer which areas of the monsoon system dominate the complex dynamics of the whole system. Secondly, I will introduce the transformation cost time series method, which allows us to detect changes in the dynamics of a non-uniformly sampled time series. Unlike the frequently used interpolation approach, our new method does not corrupt the data and therefore avoids biases in any subsequent analysis. While I will again focus on paleoclimate proxies, the method can be used in other applied areas where regular sampling is not possible.


Approaches to modelling cells and remodelling biological tissues 14:10 Wed 10 Aug, 2016 :: Ingkarni Wardli 5.57 :: Professor Helen Byrne :: University of Oxford
Biological tissues are complex structures, whose evolution is characterised by multiple biophysical processes that act across diverse space and time scales. For example, during normal wound healing, fibroblast cells located around the wound margin exert contractile forces to close the wound while those located in the surrounding tissue synthesise new tissue in response to local growth factors and mechanical stress created by wound contraction. In this talk I will illustrate how mathematical modelling can provide insight into such complex processes, taking my inspiration from recent studies of cell migration, vasculogenesis and wound healing. 

Mathematical modelling of social spreading processes 15:10 Fri 19 Aug, 2016 :: Napier G03 :: Prof Hans De Sterck :: Monash University
Social spreading processes are intriguing manifestations of how humans interact and shape each other's lives. There is great interest in improving our understanding of these processes, and the increasing availability of empirical information in the era of big data and online social networks, combined with mathematical and computational modelling techniques, offers compelling new ways to study these processes.
I will first discuss mathematical models for the spread of political revolutions on social networks. The influence of online social networks and social media on the dynamics of the Arab Spring revolutions of 2011 is of particular interest in our work. I will describe a hierarchy of models, starting from agent-based models realized on empirical social networks, and ending up with population-level models that summarize the dynamical behaviour of the spreading process. We seek to understand quantitatively how political revolutions may be facilitated by the modern online social networks of social media.
The second part of the talk will describe a population-level model for the social dynamics that cause cigarette smoking to spread in a population. Our model predicts that more individualistic societies will show faster adoption and cessation of smoking. Evidence from a newly composed century-long composite data set on smoking prevalence in 25 countries supports the model, with potential implications for public health interventions around the world.
Throughout the talk, I will argue that important aspects of social spreading processes can be revealed and understood via quantitative mathematical and computational models matched to empirical data.
This talk describes joint work with John Lang and Danny Abrams. 

Modelling evolution of postmenopausal human longevity: The Grandmother Hypothesis 15:10 Fri 2 Sep, 2016 :: Napier G03 :: Dr Peter Kim :: University of Sydney
Human postmenopausal longevity makes us unique among primates, but how did it evolve? One explanation, the Grandmother Hypothesis, proposes that as grasslands spread in ancient Africa, displacing foods ancestral youngsters could effectively exploit, older females whose fertility was declining left more descendants by subsidizing grandchildren and allowing mothers to have new babies sooner. As more robust elders could help more descendants, selection favoured increased longevity while maintaining the ancestral end of female fertility.
We develop a probabilistic agent-based model that incorporates two sexes and mating, fertility-longevity trade-offs, and the possibility of grandmother help. Using this model, we show how the grandmother effect could have driven the evolution of human longevity. Simulations reveal two stable life histories, one human-like and the other like our nearest cousins, the great apes. The probabilistic formulation shows how stochastic effects can slow down and prevent escape from the ancestral condition, and it allows us to investigate the effect of mutation rates on the trajectory of evolution. 

A principled experimental design approach to big data analysis 15:10 Fri 23 Sep, 2016 :: Napier G03 :: Prof Kerrie Mengersen :: Queensland University of Technology
Big Datasets are endemic, but they are often notoriously difficult to analyse because of their size, complexity, history and quality. The purpose of this paper is to open a discourse on the use of modern experimental design methods to analyse Big Data in order to answer particular questions of interest. By appeal to a range of examples, it is suggested that this perspective on Big Data modelling and analysis has wide generality and advantageous inferential and computational properties. In particular, the principled experimental design approach is shown to provide a flexible framework for analysis that, for certain classes of objectives and utility functions, delivers equivalent answers compared with analyses of the full dataset. It can also provide a formalised method for iterative parameter estimation, model checking, identification of data gaps and evaluation of data quality. Finally it has the potential to add value to other Big Data sampling algorithms, in particular divide-and-conquer strategies, by determining efficient sub-samples. 

SIR epidemics with stages of infection 12:10 Wed 28 Sep, 2016 :: EM218 :: Matthieu Simon :: Universite Libre de Bruxelles
This talk is concerned with a stochastic model for the spread of an epidemic in a closed, homogeneously mixing population. The population is subdivided into three classes of individuals: the susceptibles, the infectives and the removed cases. In short, an infective remains infectious during a random period of time. While infected, it can contact all the susceptibles present, independently of the other infectives. At the end of the infectious period, it becomes a removed case and has no further part in the infection process.
We represent an infectious period as a set of different stages that an infective can go through before being removed. The transitions between stages are ruled by either a Markov process or a semi-Markov process. In each stage, an infective makes contaminations at the epochs of a Poisson process with a specific rate.
Our purpose is to derive closed expressions for a transform of different statistics related to the end of the epidemic, such as the final number of susceptibles and the area under the trajectories of all the infectives. The analysis is performed by using simple matrix analytic methods and martingale arguments. Numerical illustrations will be provided at the end of the talk. 
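For the simplest special case of this setting, a single exponentially distributed stage with contamination rate beta per susceptible and removal rate gamma, the final-size statistics can be computed exactly without transform methods, because the number of infectives cancels from the embedded jump-chain probabilities. A small sketch with illustrative parameter values:

```python
from collections import defaultdict

def final_size_distribution(n, i0, beta, gamma):
    """Exact distribution of the number of susceptibles remaining when the
    epidemic ends, for the Markov SIR model with total infection rate beta*s*i
    and removal rate gamma*i (the factor i cancels from the jump probabilities)."""
    probs = defaultdict(float)
    probs[(n, i0)] = 1.0
    final = defaultdict(float)
    # Visit states so (s, i) is settled before both successors:
    # (s-1, i+1) has smaller s; (s, i-1) has the same s but smaller i.
    for s in range(n, -1, -1):
        for i in range(n + i0, 0, -1):
            p = probs[(s, i)]
            if p == 0.0:
                continue
            p_inf = beta * s / (beta * s + gamma)  # next event is an infection
            if s > 0:
                probs[(s - 1, i + 1)] += p * p_inf
            if i > 1:
                probs[(s, i - 1)] += p * (1 - p_inf)
            else:
                final[s] += p * (1 - p_inf)        # last infective removed
    return dict(final)

dist = final_size_distribution(n=20, i0=1, beta=0.15, gamma=1.0)
```

The talk's phase-type stages and semi-Markov transitions generalise this single-stage case; there, the matrix analytic and martingale machinery replaces this simple cancellation.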

Transmission Dynamics of Visceral Leishmaniasis: designing a test and treat control strategy 12:10 Thu 29 Sep, 2016 :: EM218 :: Graham Medley :: London School of Hygiene & Tropical Medicine
Visceral Leishmaniasis (VL) is targeted for elimination from the Indian Subcontinent. Progress has been much better in some areas than others. Current control is based on earlier diagnosis and treatment and on insecticide spraying to reduce the density of the vector. There is a surprising dearth of specific information on the epidemiology of VL, which makes modelling more difficult. In this seminar, I describe a simple framework that gives some insight into the transmission dynamics. We conclude that the majority of infection comes from cases prior to diagnosis. If this is the case, then early diagnosis will be advantageous, but will require a test with high specificity. This is a paradox for many clinicians and public health workers, who tend to prioritise high sensitivity.
Medley, G.F., Hollingsworth, T.D., Olliaro, P.L. & Adams, E.R. (2015) Health-seeking, diagnostics and transmission in the control of visceral leishmaniasis. Nature 528, S102-S108 (3 December 2015), DOI: 10.1038/nature16042 

Measuring and mapping carbon dioxide from remote sensing satellite data 15:10 Fri 21 Oct, 2016 :: Napier G03 :: Prof Noel Cressie :: University of Wollongong
This talk is about environmental statistics for global remote sensing of atmospheric carbon dioxide, a leading greenhouse gas. An important compartment of the carbon cycle is atmospheric carbon dioxide (CO2), where it (and other gases) contribute to climate change through a greenhouse effect. There are a number of CO2 observational programs where measurements are made around the globe at a small number of ground-based locations at somewhat regular time intervals. In contrast, satellite-based programs are spatially global but give up some of the temporal richness. The most recent satellite launched to measure CO2 was NASA's Orbiting Carbon Observatory-2 (OCO-2), whose principal objective is to retrieve a geographical distribution of CO2 sources and sinks. OCO-2's measurement of column-averaged mole fraction, XCO2, is designed to achieve this, through a data-assimilation procedure that is statistical at its basis. Consequently, uncertainty quantification is key, starting with the spectral radiances from an individual sounding and extending to borrowing of strength through spatial-statistical modelling. 

Segregation of particles in incompressible flows due to streamline topology and particle-boundary interaction 15:10 Fri 2 Dec, 2016 :: Ingkarni Wardli 5.57 :: Professor Hendrik C. Kuhlmann :: Institute of Fluid Mechanics and Heat Transfer, TU Wien, Vienna, Austria
The incompressible flow in a number of classical benchmark problems (e.g. lid-driven cavity, liquid bridge) undergoes an instability from a two-dimensional steady flow to a periodic three-dimensional flow, which is steady or in the form of a travelling wave, as the Reynolds number is increased. In the supercritical regime chaotic as well as regular (quasi-periodic) streamlines can coexist for a range of Reynolds numbers. The spatial structures of the regular regions in three-dimensional Navier-Stokes flows have received relatively little attention, partly because of the high numerical effort required for resolving these structures. Particles whose density does not differ much from that of the liquid approximately follow the chaotic or regular streamlines in the bulk. Near the boundaries, however, their trajectories deviate strongly from the streamlines, in particular if the boundary (wall or free surface) is moving tangentially. As a result of this particle-boundary interaction, particles can rapidly segregate and be attracted to periodic or quasi-periodic orbits, yielding particle accumulation structures (PAS). The mechanism of PAS will be explained and results from experiments and numerical modelling will be presented to demonstrate the generic character of the phenomenon. 

How oligomerisation impacts steady state gradient in a morphogenreceptor system 15:10 Fri 20 Oct, 2017 :: Ingkarni Wardli 5.57 :: Mr Phillip Brown :: University of Adelaide
In developmental biology an important process is cell fate determination, where cells start to differentiate their form and function. This is an element of the broader concept of morphogenesis. It has long been held that cell differentiation can occur by a chemical signal providing positional information to 'undecided' cells. This chemical produces a gradient of concentration that indicates to a cell what path it should develop along. More recently it has been shown that in a particular system of this type, the chemical (protein) does not exist purely as individual molecules, but can exist in multi-protein complexes known as oligomers.
Mathematical modelling has been performed on systems of oligomers to determine whether this concept can produce useful gradients of concentration. However, there is a wide range of possibilities for how oligomer systems can be modelled, and most of them have not been explored.
In this talk I will introduce a new monomer system and analyse it, before extending this model to include oligomers. A number of oligomer models are proposed based on the assumption that proteins are only produced in their oligomer form and can only break apart once they have left the producing cell. It will be shown that when oligomers are present under these conditions, but only monomers are permitted to bind with receptors, then the system can produce robust, biologically useful gradients for a significantly larger range of model parameters (for instance, degradation, production and binding rates) compared to the monomer system. We will also show that when oligomers are permitted to bind with receptors there is negligible difference compared to the monomer system. 
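A generic monomer-type gradient (a sketch only, not the speaker's specific model) can be illustrated as a diffusion-degradation steady state: D c''(x) = k c(x) with an influx of protein at one end. The finite-difference solve below uses the Thomas tridiagonal algorithm; all parameter values are assumptions.

```python
def steady_gradient(D=1.0, k=0.5, influx=1.0, L=10.0, n=400):
    """Finite-difference steady state of D c''(x) = k c(x) on (0, L):
    morphogen enters at x = 0 (flux condition -D c'(0) = influx),
    degrades at rate k, and is absorbed at x = L (c(L) = 0)."""
    h = L / n
    a = [0.0] * (n + 1)    # sub-diagonal
    b = [0.0] * (n + 1)    # diagonal
    sup = [0.0] * (n + 1)  # super-diagonal
    d = [0.0] * (n + 1)    # right-hand side
    # one-sided flux condition at node 0: D*(c1 - c0)/h = -influx
    b[0], sup[0], d[0] = -D / h, D / h, -influx
    for i in range(1, n):  # interior: D*(c[i-1] - 2c[i] + c[i+1])/h^2 = k*c[i]
        a[i] = D / h**2
        b[i] = -2 * D / h**2 - k
        sup[i] = D / h**2
    b[n], d[n] = 1.0, 0.0  # absorbing boundary c(L) = 0
    # Thomas algorithm: forward elimination, then back substitution
    for i in range(1, n + 1):
        w = a[i] / b[i - 1]
        b[i] -= w * sup[i - 1]
        d[i] -= w * d[i - 1]
    c = [0.0] * (n + 1)
    c[n] = d[n] / b[n]
    for i in range(n - 1, -1, -1):
        c[i] = (d[i] - sup[i] * c[i + 1]) / b[i]
    return c

profile = steady_gradient()
# The analytic solution decays like exp(-x / sqrt(D/k)), so the gradient's
# range is set by the decay length sqrt(D/k), here sqrt(2).
```

The decay length sqrt(D/k) is the kind of quantity whose robustness to changes in degradation, production and binding rates is at stake in the monomer-versus-oligomer comparison.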

The Markovian binary tree applied to demography and conservation biology 15:10 Fri 27 Oct, 2017 :: Ingkarni Wardli B17 :: Dr Sophie Hautphenne :: University of Melbourne
Markovian binary trees form a general and tractable class of continuous-time branching processes, which makes them well-suited for real-world applications. Thanks to their appealing probabilistic and computational features, these processes have proven to be an excellent modelling tool for applications in population biology. Typical performance measures of these models include the extinction probability of a population, the distribution of the population size at a given time, the total progeny size until extinction, and the asymptotic population composition. Besides giving an overview of the main performance measures and the techniques involved to compute them, we discuss recently developed statistical methods to estimate the model parameters, depending on the accuracy of the available data. We illustrate our results in human demography and in conservation biology. 
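The extinction probability mentioned above is the minimal solution of a fixed-point equation, typically found by functional iteration. As a hedged scalar illustration (the Markovian binary tree versions replace these scalars by vectors and matrices), consider a linear birth-death process where each individual splits at rate b and dies at rate d:

```python
def extinction_probability(birth_rate, death_rate, tol=1e-12):
    """Minimal solution of q = (d + b*q^2) / (b + d) by functional
    iteration from q = 0, mirroring the iterative schemes used for
    Markovian binary trees (which replace scalars by matrices)."""
    b, d = birth_rate, death_rate
    q = 0.0
    while True:
        q_next = (d + b * q * q) / (b + d)
        if abs(q_next - q) < tol:
            return q_next
        q = q_next

print(extinction_probability(2.0, 1.0))  # supercritical: q = d/b = 0.5
print(extinction_probability(1.0, 2.0))  # subcritical: extinction is certain
```

Starting the iteration from q = 0 guarantees convergence to the minimal non-negative solution, which is the probabilistically correct one (q = 1 is always a solution).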

Stochastic Modelling of Urban Structure 11:10 Mon 20 Nov, 2017 :: Engineering Nth N132 :: Mark Girolami :: Imperial College London, and The Alan Turing Institute
Urban systems are complex in nature and comprise a large number of individuals that act according to utility, a measure of net benefit pertaining to preferences. The actions of individuals give rise to an emergent behaviour, creating the so-called urban structure that we observe. In this talk, I develop a stochastic model of urban structure to formally account for uncertainty arising from the complex behaviour. We further use this stochastic model to infer the components of a utility function from observed urban structure. This is a more powerful modelling framework in comparison to the ubiquitous discrete choice models that are of limited use for complex systems, in which the overall preferences of individuals are difficult to ascertain. We model urban structure as a realization of a Boltzmann distribution that is the invariant distribution of a related stochastic differential equation (SDE) describing the dynamics of the urban system. Our specification of the Boltzmann distribution assigns higher probability to stable configurations, in the sense that consumer surplus (demand) is balanced with running costs (supply), as characterized by a potential function. We specify a Bayesian hierarchical model to infer the components of a utility function from observed structure. Our model is doubly-intractable and poses significant computational challenges that we overcome using recent advances in Markov chain Monte Carlo (MCMC) methods. We demonstrate our methodology with case studies on the London retail system and airports in England. 
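The connection between an SDE and its invariant Boltzmann distribution can be shown in miniature: an Euler discretisation of the overdamped Langevin SDE dX = -V'(X) dt + sqrt(2) dW samples approximately from the density proportional to exp(-V(x)). This is a generic sketch with an assumed step size, not the authors' MCMC scheme.

```python
import math
import random

def langevin_samples(grad_v, x0=0.0, step=0.01, n=20000, seed=0):
    """Unadjusted Langevin sketch: Euler discretisation of
    dX = -V'(X) dt + sqrt(2) dW, whose invariant density is
    proportional to exp(-V(x))."""
    rng = random.Random(seed)
    x, out = x0, []
    for _ in range(n):
        x = x - step * grad_v(x) + math.sqrt(2 * step) * rng.gauss(0.0, 1.0)
        out.append(x)
    return out

# V(x) = x^2 / 2 gives a standard normal as the Boltzmann distribution
xs = langevin_samples(lambda x: x)
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
```

The empirical mean and variance of the chain should be close to those of the standard normal; for exact sampling one would add a Metropolis accept-reject step to correct the discretisation bias.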

Calculating optimal limits for transacting credit card customers 15:10 Fri 2 Mar, 2018 :: Horace Lamb 1022 :: Prof Peter Taylor :: University of Melbourne
Credit card users can roughly be divided into `transactors', who pay off their balance each month, and `revolvers', who maintain an outstanding balance, on which they pay substantial interest.
In this talk, we focus on modelling the behaviour of an individual transactor customer. Our motivation is to calculate an optimal credit limit from the bank's point of view. This requires an expression for the expected outstanding balance at the end of a payment period.
We establish a connection with the classical newsvendor model. Furthermore, we derive the Laplace transform of the outstanding balance, assuming that purchases are made according to a marked point process and that there is a simplified balance control policy which prevents all purchases in the rest of the payment period when the credit limit is exceeded. We then use the newsvendor model and our modified model to calculate bounds on the optimal credit limit for the more realistic balance control policy that accepts all purchases that do not exceed the limit.
We illustrate our analysis using a compound Poisson process example and show that the optimal limit scales with the distribution of the purchasing process, while the probability of exceeding the optimal limit remains constant.
Finally, we apply our model to some real credit card purchase data. 
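The simplified balance control policy described above lends itself to a quick Monte Carlo check. The sketch below is not the talk's analysis (which works with Laplace transforms of a marked point process); it assumes a compound Poisson purchasing process with exponential amounts, and all parameter values are made up.

```python
import random

def end_balance(limit, rate=5.0, mean_purchase=100.0, horizon=1.0, rng=None):
    """One sample of the end-of-period balance under the simplified control
    policy: purchase epochs follow a Poisson process and, once the balance
    exceeds the limit, every later purchase in the period is blocked.
    Exponential purchase amounts and all numbers are assumptions."""
    rng = rng or random
    t, balance = 0.0, 0.0
    while True:
        t += rng.expovariate(rate)          # next purchase epoch
        if t > horizon or balance > limit:
            return balance                  # period over, or purchases blocked
        balance += rng.expovariate(1.0 / mean_purchase)

# Monte Carlo estimate of the expected outstanding balance for a given limit
rng = random.Random(42)
est = sum(end_balance(600.0, rng=rng) for _ in range(5000)) / 5000
```

Sweeping `limit` over a grid of values gives the expected-balance curve that the bank would trade off against the cost of unused credit, in the spirit of the newsvendor bounds described above.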

Models, machine learning, and robotics: understanding biological networks 15:10 Fri 16 Mar, 2018 :: Horace Lamb 1022 :: Prof Steve Oliver :: University of Cambridge
The availability of complete genome sequences has enabled the construction of computer models of metabolic networks that may be used to predict the impact of genetic mutations on growth and survival. Both logical and constraint-based models of the metabolic network of the model eukaryote, the ale yeast Saccharomyces cerevisiae, have been available for some time and are continually being improved by the research community. While such models are very successful at predicting the impact of deleting single genes, the prediction of the impact of higher order genetic interactions is a greater challenge. Initial studies of limited gene sets provided encouraging results. However, the availability of comprehensive experimental data for the interactions between genes involved in metabolism demonstrated that, while the models were able to predict the general properties of the genetic interaction network, their ability to predict interactions between specific pairs of metabolic genes was poor. I will examine the reasons for this poor performance and demonstrate ways of improving the accuracy of the models by exploiting the techniques of machine learning and robotics.
The utility of these metabolic models rests on the firm foundations of genome sequencing data. However, there are two major problems with these kinds of network models: they lack dynamics, and they do not deal with the uncertain and incomplete nature of much biological data. To deal with these problems, we have developed the Flexible Nets (FNs) modelling formalism. FNs were inspired by Petri Nets and can deal with missing or uncertain data, incorporate both dynamics and regulation, and also have the potential for model predictive control of biotechnological processes.


Modelling phagocytosis 15:10 Fri 25 May, 2018 :: Horace Lamb 1022 :: Prof Ngamta (Natalie) Thamwattana :: University of Wollongong
Phagocytosis refers to a process in which one cell type fully encloses and consumes unwanted cells, debris or particulate matter. It plays an important role in immune systems through the destruction of pathogens and the inhibiting of cancerous cells. In this study, we combine models of cell-cell adhesion and of predator-prey modelling to generate a new model for phagocytosis that is capable of relating the interaction between cells in both space and time. Numerical results are presented, demonstrating the behaviours of cells during the process of phagocytosis. 

Topological Data Analysis 15:10 Fri 31 Aug, 2018 :: Napier 208 :: Dr Vanessa Robins :: Australian National University
Topological Data Analysis has grown out of work focussed on deriving qualitative and yet quantifiable information about the shape of data. The underlying assumption is that knowledge of shape (the way the data are distributed) permits high-level reasoning and modelling of the processes that created this data. The 0th-order aspect of shape is the number of pieces: "connected components" to a topologist; "clustering" to a statistician. Higher-order topological aspects of shape are holes, quantified as "non-bounding cycles" in homology theory. These signal the existence of some type of constraint on the data-generating process.
Homology lends itself naturally to computer implementation, but its naive application is not robust to noise. This inspired the development of persistent homology: an algebraic topological tool that measures changes in the topology of a growing sequence of spaces (a filtration). Persistent homology provides invariants called the barcodes or persistence diagrams that are sets of intervals recording the birth and death parameter values of each homology class in the filtration. It captures information about the shape of data over a range of length scales, and enables the identification of "noisy" topological structure.
Statistical analysis of persistent homology has been challenging because the raw information (the persistence diagrams) is provided as sets of intervals rather than functions. Various approaches to converting persistence diagrams to functional forms have been developed recently, and have found application to data ranging from the distribution of galaxies, to porous materials, and cancer detection. 
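The 0th-order (connected component) part of persistent homology described above can be computed with nothing more than union-find over an edge filtration. A minimal sketch, assuming a weighted graph rather than a general simplicial complex:

```python
def zero_dim_persistence(n_vertices, weighted_edges):
    """Birth-death intervals for H0 (connected components) of a graph
    filtration: all vertices are born at parameter 0 and edges enter in
    order of weight; each merge of two components records a death."""
    parent = list(range(n_vertices))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    pairs = []
    for w, u, v in sorted(weighted_edges):
        ru, rv = find(u), find(v)
        if ru != rv:                 # two components merge at value w
            parent[max(ru, rv)] = min(ru, rv)
            pairs.append((0.0, w))   # interval [birth, death) = [0, w)
    return pairs  # surviving components persist forever (omitted here)

# Path 0-1-2 plus a cycle-closing edge; the cycle edge kills no component
print(zero_dim_persistence(3, [(0.3, 0, 1), (0.7, 1, 2), (0.9, 0, 2)]))
# → [(0.0, 0.3), (0.0, 0.7)]
```

The resulting intervals are exactly the barcodes of the abstract: long bars indicate robust structure, short bars the "noisy" topology that persistence lets one discount.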

Mathematical modelling of the emergence and spread of antimalarial drug resistance 15:10 Fri 14 Sep, 2018 :: Napier 208 :: Dr Jennifer Flegg :: University of Melbourne
Malaria parasites have repeatedly evolved resistance to antimalarial drugs, thwarting efforts to eliminate the disease and contributing to an increase in mortality. In this talk, I will introduce several statistical and mathematical models for monitoring the emergence and spread of antimalarial drug resistance. For example, results will be presented from Bayesian geostatistical models that have quantified the space-time trends in drug resistance in Africa and Southeast Asia. I will discuss how the results of these models have been used to update public health policy. 
News matching "Stochastic Modelling and Optimisation" 
ARC success The School of Mathematical Sciences was again very successful in attracting Australian Research Council funding for 2008. Recipients of ARC Discovery Projects are (with staff from the School highlighted):
Prof NG Bean; Prof PG Howlett; Prof CE Pearce; Prof SC Beecham; Dr AV Metcalfe; Dr JW Boland:
WaterLog - A mathematical model to implement recommendations of The Wentworth Group.
2008-2010: $645,000
Prof RJ Elliott:
Dynamic risk measures.
(Australian Professorial Fellowship)
2008-2012: $897,000
Dr MD Finn:
Topological Optimisation of Fluid Mixing.
2008-2010: $249,000
Prof PG Bouwknegt; Prof M Varghese; A/Prof S Wu:
Dualities in String Theory and Conformal Field Theory in the context of the Geometric Langlands Program.
2008-2010: $240,000
The latter grant is held through the ANU. Posted Wed 26 Sep 07. 

Success in Learning and Teaching Grants The School of Mathematical Sciences has been awarded two Faculty L&T awards. Congratulations to Dr David Green for his successful grant "One Simulation Modelling Instruction Module" and to Drs Adrian Koerber, Paul McCann and Jim Denier for their successful grant "Graphics Calculators and beyond". Posted Tue 11 Mar 08. 

Australian Research Council Discovery Project Successes Congratulations to the following members of the School for their
success in the ARC Discovery Grants which were announced recently.
- A/Prof M Roughan, Prof H Shen: $315K, Network Management in a World of Secrets
- Prof AJ Roberts, Dr D Strunin: $315K, Effective and accurate model dynamics, deterministic and stochastic, across multiple space and time scales
- A/Prof J Denier, Prof AP Bassom: $180K, A novel approach to controlling boundary-layer separation
Posted Wed 17 Sep 08. 

Sam Cohen wins prize for best student talk at ANZIAM 2009 Congratulations to Mr Sam Cohen, a PhD student within the School, who was awarded the T. M. Cherry Prize for the best student paper at the 2009 meeting of ANZIAM for his talk on "A general theory of backward stochastic difference equations". Posted Fri 6 Feb 09. 

Position available: Lecturer in Applied Mathematics The School is currently seeking to appoint a Lecturer in Applied Mathematics in the area of optimisation. See the University's jobs website for full details, including the selection criteria. Posted Wed 26 Aug 09. 

Welcome to Dr Joshua Ross We welcome Dr Joshua Ross as a new lecturer in the School of Mathematical Sciences. Joshua has moved over to Adelaide from the University of Cambridge. His research interests are mathematical modelling (especially mathematical biology) and operations research. Posted Mon 15 Mar 10. 

ARC Grant successes The School of Mathematical Sciences has again had outstanding success in the ARC Discovery and Linkage Projects schemes.
Congratulations to the following staff for their success in the Discovery Project scheme:
Prof Nigel Bean, Dr Josh Ross, Prof Phil Pollett, Prof Peter Taylor, New methods for improving active adaptive management in biological systems, $255,000 over 3 years;
Dr Josh Ross, New methods for integrating population structure and stochasticity into models of disease dynamics, $248,000 over three years;
A/Prof Matt Roughan, Dr Walter Willinger, Internet traffic-matrix synthesis, $290,000 over three years;
Prof Patricia Solomon, A/Prof John Moran, Statistical methods for the analysis of critical care data, with application to the Australian and New Zealand Intensive Care Database, $310,000 over 3 years;
Prof Mathai Varghese, Prof Peter Bouwknegt, Supersymmetric quantum field theory, topology and duality, $375,000 over 3 years;
Prof Peter Taylor, Prof Nigel Bean, Dr Sophie Hautphenne, Dr Mark Fackrell, Dr Malgorzata O'Reilly, Prof Guy Latouche, Advanced matrixanalytic methods with applications, $600,000 over 3 years.
Congratulations to the following staff for their success in the Linkage Project scheme:
Prof Simon Beecham, Prof Lee White, A/Prof John Boland, Prof Phil Howlett, Dr Yvonne Stokes, Mr John Wells, Paving the way: an experimental approach to the mathematical modelling and design of permeable pavements, $370,000 over 3 years;
Dr Amie Albrecht, Prof Phil Howlett, Dr Andrew Metcalfe, Dr Peter Pudney, Prof Roderick Smith, Saving energy on trains - demonstration, evaluation, integration, $540,000 over 3 years
Posted Fri 29 Oct 10. 

Bushfire CRC postgraduate scholarship success Congratulations to Mika Peace who has been awarded a PhD scholarship from the Bushfire Cooperative Research Centre. Mika is working with Trent Mattner and Graham Mills (from the Bureau of Meteorology) on coupled fire-weather modelling. Posted Wed 6 Apr 11. 

ARC Future Fellowship success Associate Professor Zudi Lu has been awarded an ARC Future Fellowship. Associate Professor Lu, an Associate Professor in Statistics, will use the support provided by his Future Fellowship to further improve the theory and practice of econometric modelling of nonlinear spatial time series. Congratulations Zudi. Posted Thu 12 May 11. 

ARC Grant Success Congratulations to the following staff who were successful in securing funding from the Australian Research Council Discovery Projects Scheme. Associate Professor Finnur Larusson, awarded $270,000 for his project Flexibility and symmetry in complex geometry; Dr Thomas Leistner, awarded $303,464 for his project Holonomy groups in Lorentzian geometry; Professor Michael Murray and Dr Daniel Stevenson (Glasgow), awarded $270,000 for their project Bundle gerbes: generalisations and applications; Professor Mathai Varghese, awarded $105,000 for his project Advances in index theory; and Prof Anthony Roberts and Professor Ioannis Kevrekidis (Princeton), awarded $330,000 for their project Accurate modelling of large multiscale dynamical systems for engineering and scientific simulation and analysis. Posted Tue 8 Nov 11. 

Best paper prize at Membrane Symposium Congratulations to Wei Xian Lim who was awarded the prize for the best student presentation at the Membrane Society of Australasia 2011 ECR Membrane Symposium for her talk on "Mathematical modelling of gas capture in porous materials". Xian is working on her PhD with Jim Hill and Barry Cox. Posted Mon 28 Nov 11. 

Top-up scholarship available A PhD opportunity is available in mathematical modelling of the interaction of ocean waves and sea ice. For information, see this advertisement. Posted Thu 1 Nov 12. 

AMSI-ANZIAM Lecture Tour - Public Lecture
The Role of Embedded Optimisation in Smart Systems and Products
September 23, 6:00pm, Horace Lamb Lecture Theatre
Professor Stephen Boyd, Stanford University, Samsung Professor of Engineering, Professor of Electrical Engineering
Posted Mon 23 Sep 13. 

A/Prof Joshua Ross, 2017 Moran Medal recipient Congratulations to Associate Professor Joshua Ross who has won the 2017 Moran Medal, awarded by the Australian Academy of Science.
The Moran Medal recognises outstanding research by scientists up to 10 years post-PhD in applied probability, biometrics, mathematical genetics, psychometrics and statistics.
Associate Professor Ross has made influential contributions to public health and conservation biology using mathematical modelling and statistics to help in decision making.
Posted Fri 23 Dec 16. 
Statistical Methods in Medical Research 9 (259–277) 2000  Maximal profit dimensioning and tariffing of loss networks with crossconnects Bean, Nigel; Brown, Deborah; Taylor, Peter, Mathematical and Computer Modelling 31 (21–30) 2000  Quasireversibility and networks of queues with nonstandard batch movements Taylor, Peter, Mathematical and Computer Modelling 31 (335–341) 2000  Quasistationary distributions for leveldependent quasibirthanddeath processes Bean, Nigel; Pollett, P; Taylor, Peter, Stochastic Models 16 (511–541) 2000  The exact solution of the general stochastic rumour Pearce, Charles, Mathematical and Computer Modelling 31 (289–298) 2000  Weak and generalized solutions to abstract stochastic equations Melnikova, I; Filinkov, Alexei, Doklady Mathematics 62 (373–377) 2000  When is a MAP poisson? Bean, Nigel; Green, David, Mathematical and Computer Modelling 31 (31–46) 2000 
Advanced search options
You may be able to improve your search results by using the following syntax:
Query                       Matches the following

Asymptotic Equation         Anything with "Asymptotic" or "Equation".
+Asymptotic +Equation       Anything with "Asymptotic" and "Equation".
+Stokes -"Navier-Stokes"    Anything containing "Stokes" but not "Navier-Stokes".
Dynam*                      Anything containing "Dynamic", "Dynamical", "Dynamicist", etc.
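The boolean query syntax above can be sketched in a few lines of code. This is a minimal illustration of the matching semantics described in the table only, not the site's actual search engine; the `matches` function, the `-` prefix for excluded terms, and the word versus substring handling are all assumptions made for the sketch.

```python
import re


def matches(query: str, text: str) -> bool:
    """Illustrative matcher for the boolean query syntax in the table above.

    Assumed token forms (a sketch, not the site's real implementation):
      term     - optional term; at least one must match when no +terms exist
      +term    - required term
      -term    - excluded term
      term*    - prefix wildcard
    Quoted phrases and hyphenated terms are matched as substrings.
    """
    # Tokenise: quoted phrases (optionally prefixed with + or -) or bare words.
    tokens = re.findall(r'[+-]?"[^"]+"|\S+', query)
    words = set(re.findall(r"[\w']+", text.lower()))

    def term_matches(term: str) -> bool:
        term = term.strip('"').lower()
        if term.endswith("*"):  # wildcard: prefix match against each word
            return any(w.startswith(term[:-1]) for w in words)
        if re.search(r"[^\w']", term):  # phrase/hyphenated term: substring
            return term in text.lower()
        return term in words  # plain term: whole-word match

    required = [t[1:] for t in tokens if t.startswith("+")]
    excluded = [t[1:] for t in tokens if t.startswith("-")]
    optional = [t for t in tokens if t[0] not in "+-"]

    if any(term_matches(t) for t in excluded):
        return False
    if not all(term_matches(t) for t in required):
        return False
    if optional and not required:
        return any(term_matches(t) for t in optional)
    return True
```

For example, `matches('+Stokes -"Navier-Stokes"', 'Stokes flow past a sphere')` succeeds, while the same query against a title containing "Navier-Stokes" is rejected by the exclusion term.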
