
Courses matching "Operations research"
Optimisation and Operations Research
Operations Research (OR) is the application of mathematical techniques and analysis to problem solving in business and industry, in particular to carrying out tasks more efficiently, such as scheduling or optimising the provision of services. OR is an interdisciplinary topic drawing on mathematical modelling, optimisation theory, game theory, decision analysis, statistics, and simulation to help make decisions in complex situations. This first course in OR concentrates on mathematical modelling and optimisation: for example, maximising production capacity or minimising risk. It focuses on linear optimisation problems involving both continuous and integer variables. The course covers a variety of mathematical techniques for linear optimisation, and the theory behind them. It will also explore the role of heuristics in such problems. Examples will be presented from important application areas, such as the emergency services, telecommunications, transportation, and manufacturing. Students will undertake a team project based on an actual Adelaide problem. Topics covered are: formulating a linear program; the Simplex Method; duality and complementary slackness; sensitivity analysis; an interior point method; alternative means to solve some linear and integer programs, such as primal-dual approaches; methods from a complete solution (such as greedy methods and simulated annealing); and methods from a partial solution (such as Dijkstra's shortest path algorithm and branch-and-bound).
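Of the topics listed, Dijkstra's shortest path algorithm is the easiest to illustrate concretely. A minimal heap-based sketch in Python (the toy network below is invented for the example, not taken from the course):

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest path lengths on a weighted digraph.

    graph: dict mapping node -> list of (neighbour, weight) pairs,
    with non-negative weights. Returns a dict of distances from source.
    """
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy network: depot "A" serving sites "B", "C", "D"
network = {
    "A": [("B", 4), ("C", 1)],
    "C": [("B", 2), ("D", 5)],
    "B": [("D", 1)],
}
d = dijkstra(network, "A")  # d["D"] == 4, via A -> C -> B -> D
```

With a binary heap this runs in O((V + E) log V), which is why it appears as a building block in "methods from a partial solution".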
Events matching "Operations research" 
Watching evolution in real time; problems and potential research areas.
15:10 Fri 26 May, 2006 :: G08. Mathematics Building University of Adelaide :: Prof Alan Cooper (Federation Fellow)
Recent studies (1) have indicated problems with our
ability to use the genetic distances between species to estimate the
time since their divergence (so called molecular clocks). An
exponential decay curve has been detected in comparisons of closely
related taxa in mammal and bird groups, and rough approximations
suggest that molecular clock calculations may be problematic for the
recent past (e.g. <1 million years). Unfortunately, this period
encompasses a number of key evolutionary events where estimates of
timing are critical such as modern human evolutionary history, the
domestication of animals and plants, and most issues involved in
conservation biology. A solution (formulated at UA) will be briefly
outlined. A second area of active interest is the recent suggestion
(2) that mitochondrial DNA diversity does not track population size in
several groups, in contrast to standard thinking. This finding has
been interpreted as showing that mtDNA may not be evolving neutrally,
as has long been assumed.
Large ancient DNA datasets provide a means to examine these issues, by
revealing evolutionary processes in real time (3). The data also
provide a rich area for mathematical investigation as temporal
information provides information about several parameters that are
unknown in serial coalescent calculations (4).

References:
(1) Ho SYW et al. Time dependency of molecular rate estimates and systematic overestimation of recent divergence times. Mol. Biol. Evol. 22, 1561-1568 (2005); Penny D. Nature 436, 183-184 (2005).
(2) Bazin E. et al. Population size does not influence mitochondrial genetic diversity in animals. Science 312, 570 (2006); Eyre-Walker A. Size does not matter for mitochondrial DNA. Science 312, 537 (2006).
(3) Shapiro B. et al. Rise and fall of the Beringian steppe bison. Science 306, 1561-1565 (2004); Chan et al. Bayesian estimation of the timing and severity of a population bottleneck from ancient DNA. PLoS Genetics 2, e59 (2006).
(4) Drummond et al. Measurably evolving populations. Trends Ecol. Evol. 18, 481-488 (2003); Drummond et al. Bayesian coalescent inference of past population dynamics from molecular sequences. Mol. Biol. Evol. 22, 1185-1192 (2005).


Global and Local stationary modelling in finance: Theory and empirical evidence 14:10 Thu 10 Apr, 2008 :: G04 Napier Building University of Adelaide :: Prof. Dominique Guégan :: Université Paris 1 Panthéon-Sorbonne
Modelling real data sets using second order stochastic processes requires that the data satisfy the second order stationarity condition. This stationarity condition concerns the unconditional moments of the process. It is in that context that most of the models developed since the sixties have been studied: the ARMA processes (Brockwell and Davis, 1988), the ARCH, GARCH and EGARCH models (Engle, 1982; Bollerslev, 1986; Nelson, 1990), the SETAR process (Lim and Tong, 1980; Tong, 1990), the bilinear model (Granger and Andersen, 1978; Guégan, 1994), the EXPAR model (Haggan and Ozaki, 1980), the long memory process (Granger and Joyeux, 1980; Hosking, 1981; Gray, Zang and Woodward, 1989; Beran, 1994; Giraitis and Leipus, 1995; Guégan, 2000), and the switching process (Hamilton, 1988). For all these models we obtain an invertible causal solution under specific conditions on the parameters, and then forecast points and forecast intervals are available.
Thus, the stationarity assumption is the basis for a general asymptotic theory for identification, estimation and forecasting. It guarantees that increasing the sample size yields more and more information of the same kind, which is essential for an asymptotic theory to make sense.
Nonstationary modelling also has a long tradition in econometrics. It is based on the conditional moments of the data generating process. It appears mainly in heteroscedastic and volatility models, like the GARCH and related models, and stochastic volatility processes (Ghysels, Harvey and Renault, 1997). Nonstationarity also appears in a different way in structural change models like the switching models (Hamilton, 1988), the stop-break model (Diebold and Inoue, 2001; Breidt and Hsu, 2002; Granger and Hyung, 2004) and the SETAR models, for instance. It can also be observed in linear models with time-varying coefficients (Nicholls and Quinn, 1982; Tsay, 1987).
Thus, using stationary unconditional moments suggests global stationarity for the model, but using nonstationary unconditional moments, nonstationary conditional moments, or assuming the existence of states suggests that this global stationarity fails and that we observe only locally stationary behaviour.
The growing evidence of instability in the stochastic behaviour of stocks, of exchange rates, and of some economic data sets such as growth rates, characterised by the existence of volatility or of jumps in the variance or in the levels of prices, forces us to discuss the assumption of global stationarity and its consequences for modelling, particularly for forecasting. Thus we can address several questions with respect to these remarks.
1. What kinds of nonstationarity affect the major financial and economic data sets? How do we detect them?
2. Local and global stationarity: how are they defined?
3. What is the impact of evidence of nonstationarity on the statistics computed from globally nonstationary data sets?
4. How can we analyse data sets in the globally nonstationary framework? Does the asymptotic theory work in a nonstationary framework?
5. What kinds of models create local stationarity instead of global stationarity? How can we use them to develop modelling and forecasting strategies?
These questions began to be discussed in some papers in the economic literature. For some of these questions the answers are known; for others, very few works exist. In this talk I will discuss all these problems and will propose two new strategies and models to solve them. Several interesting topics in empirical finance awaiting future research will also be discussed.
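The contrast between global (unconditional) and local (conditional) moments can be seen in a tiny simulation of a GARCH(1,1) process, which has a constant unconditional variance while its conditional variance wanders over time. A minimal sketch with invented parameter values, not drawn from the talk:

```python
import random

def simulate_garch(n, omega=0.1, alpha=0.1, beta=0.8, seed=42):
    """Simulate a GARCH(1,1): x_t = sigma_t * z_t with z_t ~ N(0,1) and
    sigma_t^2 = omega + alpha * x_{t-1}^2 + beta * sigma_{t-1}^2.

    Stationary in the unconditional sense when alpha + beta < 1, with
    unconditional variance omega / (1 - alpha - beta); the conditional
    variance, by contrast, changes at every step.
    """
    rng = random.Random(seed)
    var = omega / (1 - alpha - beta)  # start at the stationary variance
    xs = []
    for _ in range(n):
        x = rng.gauss(0, 1) * var ** 0.5
        xs.append(x)
        var = omega + alpha * x * x + beta * var  # conditional variance update
    return xs

xs = simulate_garch(10000)
# the sample variance should be close to omega/(1-alpha-beta) = 1.0,
# even though the conditional variance clusters into calm and volatile spells
sample_var = sum(x * x for x in xs) / len(xs)
```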


Probabilistic models of human cognition 15:10 Fri 29 Aug, 2008 :: G03 Napier Building University of Adelaide :: Dr Daniel Navarro :: School of Psychology, University of Adelaide
Over the last 15 years a fairly substantial psychological literature has developed in which human reasoning and decision-making is viewed as the solution to a variety of statistical problems posed by the environments in which we operate. In this talk, I briefly outline the general approach to cognitive modelling that is adopted in this literature, which relies heavily on Bayesian statistics, and introduce a little of the current research in this field. In particular, I will discuss work by myself and others on the statistical basis of how people make simple inductive leaps and generalisations, and the links between these generalisations and how people acquire word meanings and learn new concepts. If time permits, extensions of this work, in which complex concepts may be characterised with the aid of nonparametric Bayesian tools such as Dirichlet processes, will be briefly mentioned. 

Assisted reproduction technology: how maths can contribute 13:10 Wed 22 Oct, 2008 :: Napier 210 :: Dr Yvonne Stokes
Most people will have heard of IVF (in vitro fertilisation), a
technology for helping infertile couples have a baby. Although there are
many IVF babies, many will also know that the success rate is still low
for the cost and inconvenience involved. The fact that some women
cannot make use of IVF because of life-threatening consequences is less
well known but motivates research into other technologies, including
IVM (in vitro maturation).
What has all this to do with maths? Come along and find out how
mathematical modelling is contributing to understanding and
improvement in this important and interesting field.


Oceanographic Research at the South Australian Research and Development Institute: opportunities for collaborative research 15:10 Fri 21 Nov, 2008 :: Napier G04 :: Associate Prof John Middleton :: South Australian Research and Development Institute
Increasing threats to S.A.'s fisheries and marine environment have underlined the need for soundly based research into the ocean circulation and ecosystems (phyto/zooplankton) of the shelf and gulfs. With the support of Marine Innovation SA, the Oceanography Program has, within two years, grown to include 6 FTEs and a budget of over $4.8M. The program currently leads two major research projects, both of which involve numerical and applied mathematical modelling of oceanic flow and ecosystems as well as statistical techniques for the analysis of data. The first is the implementation of the Southern Australian Integrated Marine Observing System (SAIMOS), which is providing data to understand the dynamics of shelf boundary currents, monitor for climate change and understand the phyto/zooplankton ecosystems that underpin SA's wild fisheries and aquaculture. SAIMOS involves ship-based sampling and the deployment of underwater marine moorings, underwater gliders, HF ocean radar, acoustic tracking of tagged fish and autonomous underwater vehicles.
The second major project involves measuring and modelling the ocean circulation and biological systems within Spencer Gulf and the impact on prawn larval dispersal and on the sustainability of existing and proposed aquaculture sites. The discussion will focus on opportunities for collaborative research with both faculty and students in this exciting growth area of S.A. science.


From linear algebra to knot theory 15:10 Fri 21 Aug, 2009 :: Badger Labs G13 Macbeth Lecture Theatre :: Prof Ross Street :: Macquarie University, Sydney
Vector spaces and linear functions form our paradigmatic monoidal category. The concepts underpinning linear algebra admit definitions, operations and constructions with analogues in many other parts of mathematics. We shall see how to generalize much of linear algebra to the context of monoidal categories. Traditional examples of such categories are obtained by replacing vector spaces by linear representations of a given compact group or by sheaves of vector spaces. More recent examples come from low-dimensional topology, in particular from knot theory, where the linear functions are replaced by braids or tangles. These geometric monoidal categories are often free in an appropriate sense, a fact that can be used to obtain algebraic invariants for manifolds. 

Statistical analysis for harmonized development of systemic organs in human fetuses 11:00 Thu 17 Sep, 2009 :: School Board Room :: Prof Kanta Naito :: Shimane University
The growth processes of human babies have been studied
extensively in the scientific literature, but many issues about the
development of the human fetus remain unresolved. The aim of
this research is to investigate the developing process of the systemic organs of
human fetuses based on a data set of measurements of fetal bodies and
organs. Specifically, this talk is concerned with giving a mathematical
understanding of the harmonised development of the organs of human
fetuses. A method to evaluate such harmonies is proposed using the
maximal dilatation that appears in the theory of quasiconformal mappings. 
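For orientation, the maximal dilatation of a quasiconformal mapping f measures its worst-case local distortion of infinitesimal circles into ellipses; in standard notation (stated here as background, not necessarily the speaker's exact formulation):

```latex
K(f) \;=\; \sup_{z} \frac{|f_z(z)| + |f_{\bar z}(z)|}{|f_z(z)| - |f_{\bar z}(z)|} \;\ge\; 1,
```

with K(f) = 1 exactly when f is conformal; larger values of K indicate greater local distortion.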

Understanding hypersurfaces through tropical geometry 12:10 Fri 25 Sep, 2009 :: Napier 102 :: Dr Mohammed Abouzaid :: Massachusetts Institute of Technology
Given a polynomial in two or more variables, one may study the
zero locus from the point of view of different mathematical subjects
(number theory, algebraic geometry, ...). I will explain how tropical
geometry allows one to encode all topological aspects by elementary
combinatorial objects called "tropical varieties."
Mohammed Abouzaid received a B.S. in 2002 from the University of Richmond, and a Ph.D. in 2007 from the University of Chicago under the supervision of Paul Seidel. He is interested in symplectic topology and its interactions with algebraic geometry and differential topology, in particular the homological mirror symmetry conjecture. Since 2007 he has been a postdoctoral fellow at MIT, and a Clay Mathematics Institute Research Fellow. 

Stable commutator length 13:40 Fri 25 Sep, 2009 :: Napier 102 :: Prof Danny Calegari :: California Institute of Technology
Stable commutator length answers the question: "what is the simplest
surface in a given space with prescribed boundary?" where "simplest"
is interpreted in topological terms. This topological definition is
complemented by several equivalent definitions: in group theory, as a
measure of noncommutativity of a group; and in linear programming, as
the solution of a certain linear optimization problem. On the
topological side, scl is concerned with questions such as computing
the genus of a knot, or finding the simplest 4-manifold that bounds a
given 3-manifold. On the linear programming side, scl is measured in
terms of certain functions called quasimorphisms, which arise from
hyperbolic geometry (negative curvature) and symplectic geometry
(causal structures). In these talks we will discuss how scl in free
and surface groups is connected to such diverse phenomena as the
existence of closed surface subgroups in graphs of groups, rigidity
and discreteness of symplectic representations, bounding immersed
curves on a surface by immersed subsurfaces, and the theory of
multidimensional continued fractions and Klein polyhedra.
Danny Calegari is the Richard Merkin Professor of Mathematics at the California Institute of Technology, and is one of the recipients of the 2009 Clay Research Award for his work in geometric topology and geometric group theory. He received a B.A. in 1994 from the University of Melbourne, and a Ph.D. in 2000 from the University of California, Berkeley under the joint supervision of Andrew Casson and William Thurston. From 2000 to 2002 he was Benjamin Peirce Assistant Professor at Harvard University, after which he joined the Caltech faculty; he became Richard Merkin Professor in 2007.


The proof of the Poincare conjecture 15:10 Fri 25 Sep, 2009 :: Napier 102 :: Prof Terence Tao :: UCLA
In a series of three papers from 2002-2003, Grigori Perelman gave a spectacular proof of the Poincare Conjecture (every smooth compact simply connected three-dimensional manifold is topologically isomorphic to a sphere), one of the most famous open problems in mathematics (and one of the seven Clay Millennium Prize Problems worth a million dollars each), by developing several new groundbreaking advances in Hamilton's theory of Ricci flow on manifolds. In this talk I describe in broad detail how the proof proceeds, and briefly discuss some of the key turning points in the argument.
About the speaker:
Terence Tao was born in Adelaide, Australia, in 1975. He has been a professor of mathematics at UCLA since 1999, having completed his PhD under Elias Stein at Princeton in 1996. Tao's areas of research include harmonic analysis, PDE, combinatorics, and number theory. He has received a number of awards, including the Salem Prize in 2000, the Bôcher Prize in 2002, the Fields Medal and SASTRA Ramanujan Prize in 2006, and the MacArthur Fellowship and Ostrowski Prize in 2007. Terence Tao also currently holds the James and Carol Collins chair in mathematics at UCLA, and is a Fellow of the Royal Society and the Australian Academy of Science (Corresponding Member). 

Contemporary frontiers in statistics 15:10 Mon 28 Sep, 2009 :: Badger Labs G31 Macbeth Lecture Theatre :: Prof. Peter Hall :: University of Melbourne
The availability of powerful computing equipment has had a dramatic impact on statistical methods and thinking, changing forever the way data are analysed. New data types, larger quantities of data, and new classes of research problem are all motivating new statistical methods. We shall give examples of each of these issues, and discuss the current and future directions of frontier problems in statistics. 

American option pricing in a Markov chain market model 15:10 Fri 19 Mar, 2010 :: School Board Room :: Prof Robert Elliott :: School of Mathematical Sciences, University of Adelaide
This paper considers a model for asset pricing in a world where
the randomness is modeled by a Markov chain rather than Brownian motion.
In this paper we develop a theory of optimal stopping and related
variational inequalities for American options in this model. A version of
Saigal's Lemma is established and numerical algorithms are developed.
This is joint work with John van der Hoek. 

Exploratory experimentation and computation 15:10 Fri 16 Apr, 2010 :: Napier LG29 :: Prof Jonathan Borwein :: University of Newcastle
The mathematical research community is facing a great challenge to re-evaluate the role of proof in light of the growing power of current computer systems, of modern mathematical computing packages, and of the growing capacity to data-mine on the Internet. Add to that the enormous complexity of many modern capstone results such as the Poincare conjecture, Fermat's last theorem, and the Classification of finite simple groups. As the need and prospects for inductive mathematics blossom, the requirement to ensure the role of proof is properly founded remains undiminished. I shall look at the philosophical context with examples and then offer five benchmarking examples of the opportunities and challenges we face. 

Estimation of sparse Bayesian networks using a score-based approach 15:10 Fri 30 Apr, 2010 :: School Board Room :: Dr Jessica Kasza :: University of Copenhagen
The estimation of Bayesian networks given highdimensional data sets, with more variables than there are observations, has been the focus of much recent research. These structures provide a flexible framework for the representation of the conditional independence relationships of a set of variables, and can be particularly useful in the estimation of genetic regulatory networks given gene expression data.
In this talk, I will discuss some new research on learning sparse networks, that is, networks with many conditional independence restrictions, using a score-based approach. In the case of genetic regulatory networks, such sparsity reflects the view that each gene is regulated by relatively few other genes. The presented approach allows prior information about the overall sparsity of the underlying structure to be included in the analysis, as well as the incorporation of prior knowledge about the connectivity of individual nodes within the network.


Interpolation of complex data using spatiotemporal compressive sensing 13:00 Fri 28 May, 2010 :: Santos Lecture Theatre :: A/Prof Matthew Roughan :: School of Mathematical Sciences, University of Adelaide
Many complex datasets suffer from missing data, and interpolating these missing
elements is a key task in data analysis. Moreover, it is often the case that we
see only a linear combination of the desired measurements, not the measurements
themselves. For instance, in network management, it is easy to count the traffic
on a link, but harder to measure the end-to-end flows. Additionally, typical
interpolation algorithms treat either the spatial or the temporal
components of data separately, but many real datasets have strong
spatiotemporal structure that we would like to exploit in reconstructing the
missing data. In this talk I will describe a novel reconstruction algorithm that
exploits concepts from the growing area of compressive sensing to solve all of
these problems and more. The approach works so well on Internet traffic matrices
that we can obtain a reasonable reconstruction with as much as 98% of the
original data missing. 
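The low-rank structure that makes such interpolation possible can be illustrated with a much simpler scheme than the algorithm described in the talk: alternately truncating to low rank and re-imposing the observed entries (a generic "hard-impute" heuristic, not the authors' method; the synthetic "traffic matrix" below is invented for the example):

```python
import numpy as np

def svd_impute(M, mask, rank, iters=100):
    """Fill in the missing entries of M (mask == True where observed) by
    alternating a rank-r SVD truncation with re-imposing the observed
    entries. A generic low-rank matrix-completion heuristic.
    """
    X = np.where(mask, M, 0.0)  # initialise missing entries at zero
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        low = (U[:, :rank] * s[:rank]) @ Vt[:rank]  # best rank-r approximation
        X = np.where(mask, M, low)                  # keep observed data fixed
    return X

rng = np.random.default_rng(0)
A = rng.normal(size=(30, 2)) @ rng.normal(size=(2, 30))  # rank-2 "traffic matrix"
mask = rng.random(A.shape) < 0.5                         # observe only ~50% of entries
A_hat = svd_impute(A, mask, rank=2)
rel_err = np.linalg.norm(A_hat - A) / np.linalg.norm(A)  # small when completion succeeds
```

Real traffic matrices are only approximately low rank, and the spatio-temporal method in the talk exploits far more structure than this sketch does; the point is only that low rank plus a fraction of the entries can determine the rest.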

Some thoughts on wine production 15:05 Fri 18 Jun, 2010 :: School Board Room :: Prof Zbigniew Michalewicz :: School of Computer Science, University of Adelaide
In the modern information era, managers (e.g. winemakers) recognise the
competitive opportunities represented by decision-support tools which can
provide significant cost savings and revenue increases for their businesses.
Wineries make daily decisions on the processing of grapes, from harvest time
(prediction of maturity of grapes, scheduling of equipment and labour, capacity
planning, scheduling of crushers) through tank farm activities (planning and
scheduling of wine and juice transfers on the tank farm) to packaging processes
(bottling and storage activities). As these operations are quite complex, the whole
area is loaded with interesting OR-related issues. These include the issues of
global vs. local optimization, the relationship between prediction and optimization,
operating in dynamic environments, strategic vs. tactical optimization, and
multi-objective optimization and trade-off analysis. During the talk we address
the above issues; a few real-world applications will be shown and discussed to
emphasize some of the presented material. 

Topological chaos in two and three dimensions 15:10 Fri 18 Jun, 2010 :: Santos Lecture Theatre :: Dr Matt Finn :: School of Mathematical Sciences
Research into twodimensional laminar fluid mixing has enjoyed a
renaissance in the last decade since the realisation that the
Thurston–Nielsen theory of surface homeomorphisms can assist in
designing efficient "topologically chaotic" batch mixers.
In this talk I will survey some tools used in topological fluid
kinematics, including braid groups, traintracks, dynamical systems and
topological index formulae. I will then make some speculations about
topological chaos in three dimensions. 

Meteorological drivers of extreme bushfire events in southern Australia 15:10 Fri 2 Jul, 2010 :: Benham Lecture Theatre :: Prof Graham Mills :: Centre for Australian Weather and Climate Research, Melbourne
Bushfires occur regularly during summer in southern Australia, but only a few of these fires become iconic due to their effects, either in terms of loss of life or economic and social cost. Such events include Black Friday (1939), the Hobart fires (1967), Ash Wednesday (1983), the Canberra bushfires (2003), and most recently Black Saturday in February 2009. In most of these events the weather of the day was statistically extreme in terms of heat, (low) humidity, and wind speed, and in terms of antecedent drought. There are a number of reasons for conducting post-event analyses of the meteorology of these events. One is to identify any meteorological circulation systems or dynamic processes occurring on those days that might not be widely or hitherto recognised, to document these, and to develop new forecast or guidance products. The understanding and prediction of such features can be used in the short term to assist in effective management of fires and the safety of firefighters and in the medium range to assist preparedness for the onset of extreme conditions. The results of such studies can also be applied to simulations of future climates to assess the likely changes in frequency of the most extreme fire weather events, and their documentary records provide a resource that can be used for advanced training purposes. In addition, particularly for events further in the past, revisiting these events using reanalysis data sets and contemporary NWP models can also provide insights unavailable at the time of the events.
Over the past few years the Bushfire CRC's Fire Weather and Fire Danger project in CAWCR has studied the mesoscale meteorology of a number of major fire events, including the days of Ash Wednesday 1983, the Dandenong Ranges fire in January 1997, the Canberra fires and the Alpine breakout fires in January 2003, the Lower Eyre Peninsula fires in January 2005 and the Boorabbin fire in December 2007 - January 2008. Various aspects of these studies are described below, including the structures of dry cold frontal wind changes, the particular character of the cold fronts associated with the most damaging fires in southeastern Australia, and some aspects of how the vertical temperature and humidity structure of the atmosphere may affect the fire weather at the surface.
These studies reveal much about these major events, but also suggest future research directions, and some of these will be discussed.


Mathematica Seminar 15:10 Wed 28 Jul, 2010 :: Engineering Annex 314 :: Kim Schriefer :: Wolfram Research
The Mathematica Seminars 2010 offer an opportunity to experience the applicability, ease of use, and advancements of Mathematica 7 in education and academic research. These seminars will highlight the latest directions in technical computing with Mathematica, and the impact this technology has across a wide range of academic fields, from maths, physics and biology to finance, economics and business.
Those not yet familiar with Mathematica will gain an overview of the system and discover the breadth of applications it can address, while experts will get first-hand experience with recent advances in Mathematica like parallel computing, digital image processing, point-and-click palettes, built-in curated data, as well as courseware examples. 

A spatial-temporal point process model for fine resolution multi-site rainfall data from Roma, Italy 14:10 Thu 19 Aug, 2010 :: Napier G04 :: A/Prof Paul Cowpertwait :: Auckland University of Technology
A point process rainfall model is further developed that has storm origins occurring in space-time according to a Poisson process. Each storm origin has a random radius so that storms occur as circular regions in two-dimensional
space, where the storm radii are taken to be independent exponential random
variables. Storm origins are of random type z, where z follows a continuous
probability distribution. Cell origins occur in a further spatial Poisson
process and have arrival times that follow a Neyman-Scott point process. Cell
origins have random radii so that cells form discs in two-dimensional space.
Statistical properties up to third order are derived and used to fit the model
to 10 min series taken from 23 sites across the Roma region, Italy.
Distributional properties of the observed annual maxima are compared to
equivalent values sampled from series that are simulated using the fitted
model. The results indicate that the model will be of use in urban drainage
projects for the Roma region.
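The storm-origin layer of such a model is straightforward to simulate: storm centres form a homogeneous spatial Poisson process and each storm carries an independent exponentially distributed radius. A minimal sketch (window size and rate values invented, and only this first layer of the model shown):

```python
import math
import random

def simulate_storms(rate, mean_radius, width, height, seed=1):
    """Storm origins as a homogeneous spatial Poisson process on a
    width x height window; each storm is a disc with an independent
    exponentially distributed radius. Returns (x, y, radius) triples.
    """
    rng = random.Random(seed)
    # Poisson count via unit-rate exponential inter-arrival gaps
    n, t, mean = 0, 0.0, rate * width * height
    while True:
        t += rng.expovariate(1.0)
        if t > mean:
            break
        n += 1
    # Given the count, centres are uniform on the window
    return [(rng.uniform(0, width), rng.uniform(0, height),
             rng.expovariate(1.0 / mean_radius)) for _ in range(n)]

def covered(storms, x, y):
    """Is the point (x, y) inside at least one storm disc?"""
    return any(math.hypot(x - sx, y - sy) <= r for sx, sy, r in storms)

storms = simulate_storms(rate=0.05, mean_radius=2.0, width=20, height=20)
```

The full model then plants cell origins inside each storm with Neyman-Scott arrival times; the statistical fitting uses analytic moments rather than simulation, so this sketch is illustrative only.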


Compound and constrained regression analyses for EIV models 15:05 Fri 27 Aug, 2010 :: Napier LG28 :: Prof Wei Zhu :: State University of New York at Stony Brook
In linear regression analysis, randomness often exists in the independent variables, and the resulting models are referred to as errors-in-variables (EIV) models. The existing general EIV modeling framework, the structural model approach, is parametric and dependent on the usually unknown underlying distributions. In this work, we introduce a general nonparametric EIV modeling framework, the compound regression analysis, featuring an intuitive geometric representation and a one-to-one correspondence to the structural model. Properties, examples and further generalizations of this new modeling approach are discussed in this talk. 
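Compound regression itself is the speaker's framework; the geometric flavour of EIV fitting can be conveyed with the classical special case of orthogonal (total least squares) regression, which treats errors in x and y symmetrically by minimising perpendicular rather than vertical distances. A sketch of that classical method only, not of compound regression (the data below are synthetic):

```python
import numpy as np

def tls_line(x, y):
    """Orthogonal-regression (total least squares) line fit: minimises
    the sum of squared perpendicular distances, so errors in x and y
    are treated symmetrically. Returns (slope, intercept).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    # The normal of the best line is the smallest principal axis
    # of the centred data (last right singular vector)
    _, _, Vt = np.linalg.svd(np.column_stack([xc, yc]))
    a, b = Vt[-1]
    slope = -a / b
    return slope, y.mean() - slope * x.mean()

rng = np.random.default_rng(0)
t = rng.uniform(0, 10, 200)
x = t + rng.normal(scale=0.5, size=200)        # noise in the regressor too
y = 2 * t + 1 + rng.normal(scale=0.5, size=200)
slope, intercept = tls_line(x, y)              # close to the true line y = 2t + 1
```

Ordinary least squares is biased towards zero slope in this setting (attenuation), while the orthogonal fit remains consistent when the two error variances are equal, which is the kind of geometric trade-off the compound framework generalises.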

Simultaneous confidence band and hypothesis test in generalised varying-coefficient models 15:05 Fri 10 Sep, 2010 :: Napier LG28 :: Prof Wenyang Zhang :: University of Bath
Generalised varying-coefficient (GVC) models are very important
models, and there is a considerable literature addressing them.
However, most of the existing literature is devoted to the estimation
procedure. In this talk, I will systematically investigate the statistical
inference for GVC, which includes confidence band as well as hypothesis test. I
will show the asymptotic distribution of the maximum discrepancy between the
estimated functional coefficient and the true functional coefficient. I will
compare different approaches for the construction of confidence band and
hypothesis test. Finally, the proposed statistical inference methods are used to
analyse data from China on contraceptive use, which leads to some
interesting findings. 
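For background, a generalised varying-coefficient model lets the regression coefficients be unknown functions of an index variable U, linked to the mean response through a known link function g; in a common formulation (standard notation, not necessarily the speaker's):

```latex
g\{\,E(Y \mid U, X_1, \dots, X_p)\,\} \;=\; \sum_{j=1}^{p} \beta_j(U)\, X_j ,
```

so a simultaneous confidence band is a band around an estimated coefficient function that covers the true function uniformly in U with prescribed probability, which is what the maximum-discrepancy asymptotics deliver.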

Principal Component Analysis Revisited 15:10 Fri 15 Oct, 2010 :: Napier G04 :: Assoc. Prof Inge Koch :: University of Adelaide
Since the beginning of the 20th century, Principal Component Analysis (PCA) has been an important tool in the analysis of multivariate data. The principal components summarise data in fewer than the original number of variables without losing essential information, and thus allow a split of the data into signal and noise components. PCA is a linear method, based on elegant mathematical theory.
The increasing complexity of data together with the emergence of fast computers in the later part of the 20th century has led to a renaissance of PCA. The growing number of variables (in particular, high-dimensional low sample size problems), non-Gaussian data, and functional data (where the data are curves) are posing exciting challenges to statisticians, and have resulted in new research which extends the classical theory.
I begin with the classical PCA methodology and illustrate the challenges presented by the complex data that we are now able to collect. The main part of the talk focuses on extensions of PCA: the duality of PCA and the Principal Coordinates of Multidimensional Scaling, Sparse PCA, and consistency results relating to principal components, as the dimension grows. We will also look at newer developments such as Principal Component Regression and Supervised PCA, nonlinear PCA and Functional PCA.
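The classical methodology the talk starts from fits in a few lines: centre the data, take the SVD, and project onto the leading right singular vectors. A minimal sketch of classical PCA only (none of the extensions discussed; the data are synthetic):

```python
import numpy as np

def pca(X, k):
    """Classical PCA via the SVD of the centred data matrix.

    X: (n samples, p variables). Returns (scores, components,
    explained_variance) for the leading k components.
    """
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :k] * s[:k]           # sample projections onto the components
    components = Vt[:k]                 # principal directions (rows)
    explained = s[:k] ** 2 / (len(X) - 1)  # variance along each component
    return scores, components, explained

rng = np.random.default_rng(0)
# 200 points lying near a 2-dimensional subspace of R^5: signal plus small noise
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 5)) \
    + 0.01 * rng.normal(size=(200, 5))
scores, components, explained = pca(X, 2)
# two components capture almost all the variance: the signal/noise split
```

The high-dimensional low sample size and functional settings mentioned above are exactly where this simple recipe needs the extensions (sparsity, consistency theory) discussed in the talk.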


TBA 15:05 Fri 22 Oct, 2010 :: Napier LG28 :: Dr Andy Lian :: University of Adelaide


Arbitrage bounds for weighted variance swap prices 15:05 Fri 3 Dec, 2010 :: Napier LG28 :: Prof Mark Davis :: Imperial College London
This paper builds on earlier work by Davis and Hobson (Mathematical Finance, 2007) giving model-free (except for a 'frictionless markets' assumption) necessary and sufficient conditions for absence of arbitrage given a set of current-time put and call options on some underlying asset. Here we suppose that the prices of a set of put options, all maturing at the same time, are given and satisfy the conditions for consistency with absence of arbitrage. We now add a path-dependent option, specifically a weighted variance swap, to the set of traded assets and ask what are the conditions on its time-0 price under which consistency with absence of arbitrage is maintained. In the present work, we work under the extra modelling assumption that the underlying asset price process has continuous paths. In general, we find that there is always a non-trivial lower bound to the range of arbitrage-free prices, but only in the case of a corridor swap do we obtain a finite upper bound. In the case of, say, the vanilla variance swap, a finite upper bound exists when there are additional traded European options which constrain the left wing of the volatility surface in appropriate ways. 
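For orientation, the floating leg of a weighted variance swap on a continuous-path underlying can be written in terms of the quadratic variation of the log-price. The notation below is a standard textbook formulation, not necessarily the paper's own:

```latex
% Floating leg of a weighted variance swap with weight function w,
% maturity T, and continuous asset price path S_t:
\[
  \int_0^T w(S_t)\, d\langle \log S \rangle_t ,
\]
% with w \equiv 1 giving the vanilla variance swap and
% w = \mathbf{1}_{[a,b]} giving the corridor variance swap on [a, b].
```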

Queues with skill-based routing under FCFS–ALIS regime 15:10 Fri 11 Feb, 2011 :: B17 Ingkarni Wardli :: Prof Gideon Weiss :: The University of Haifa, Israel
We consider a system where jobs of several types are served by servers of several types, and a bipartite graph between server types and job types describes feasible assignments. This is a common situation in manufacturing, call centers with skill-based routing, matching of parent and child in adoption, or matching in kidney transplants, etc. We consider the case of a first come first served policy: jobs are assigned to the first available feasible server in order of their arrivals. We consider two types of policies for assigning customers to idle servers: random assignment, and assignment to the longest idle server (ALIS). We survey some results for four different situations:
- For a loss system we find conditions for reversibility and insensitivity.
- For a manufacturing-type system, in which there is enough capacity to serve all jobs, we discuss a product form solution and waiting times.
- For an infinite matching model, in which an infinite sequence of customers of IID types and an infinite sequence of servers of IID types are matched according to first come first served, we obtain a product form stationary distribution for this system, which we use to calculate matching rates.
- For a call center model with overload and abandonments we make some plausible observations.
This talk surveys joint work with Ivo Adan, Rene Caldentey, Cor Hurkens, Ed Kaplan and Damon Wischik, as well as work by Jeremy Visschers, Rishy Talreja and Ward Whitt.


Bioinspired computation in combinatorial optimization: algorithms and their computational complexity 15:10 Fri 11 Mar, 2011 :: 7.15 Ingkarni Wardli :: Dr Frank Neumann :: The University of Adelaide
Bioinspired computation methods, such as evolutionary algorithms and ant colony optimization, are being applied successfully to complex engineering and combinatorial optimization problems. The computational complexity analysis of this type of algorithm has significantly increased the theoretical understanding of these successful algorithms. In this talk, I will give an introduction to this field of research and present some important results that we achieved for problems from combinatorial optimization. These results can also be found in my recent textbook "Bioinspired Computation in Combinatorial Optimization: Algorithms and Their Computational Complexity". 

Nanotechnology: The mathematics of gas storage in metal-organic frameworks. 12:10 Mon 28 Mar, 2011 :: 5.57 Ingkarni Wardli :: Wei Xian Lim :: University of Adelaide
Have you thought about what sort of car you would be driving in the future? Would it be a hybrid, solar, hydrogen or electric car? I would like to be driving a hydrogen car because my field of research may aid in their development! In my presentation I will introduce you to the world of metal-organic frameworks, which are an exciting new class of materials that have great potential in applications such as hydrogen gas storage. I will also discuss the mathematical model that I am using to model the performance of metal-organic frameworks based on beryllium. 

Classification for high-dimensional data 15:10 Fri 1 Apr, 2011 :: Conference Room Level 7 Ingkarni Wardli :: Associate Prof Inge Koch :: The University of Adelaide
For two-class classification problems Fisher's discriminant rule performs well in many scenarios provided the dimension, d, is much smaller than the sample size n. As the dimension increases, Fisher's rule may no longer be adequate, and can perform as poorly as random guessing.
In this talk we look at new ways of overcoming this poor performance for high-dimensional data by suitably modifying Fisher's rule. In particular we describe the 'Features Annealed Independence Rule' (FAIR) of Fan and Fan (2008) and a rule based on canonical correlation analysis. I describe some theoretical developments, and also show analyses of data which illustrate the performance of these modified rules. 
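For reference, a minimal sketch of the classical Fisher rule in the n much larger than d regime that the abstract starts from; this is our own illustration with generic names, not one of the modified rules discussed in the talk.

```python
# Fisher's two-class discriminant rule with sample estimates; a minimal
# sketch, valid only when the dimension d is much smaller than n1 + n2.
import numpy as np

def fisher_rule(X1, X2):
    """Return a classifier built from training samples of the two classes."""
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    n1, n2 = len(X1), len(X2)
    # pooled sample covariance matrix
    S = ((n1 - 1) * np.cov(X1, rowvar=False) +
         (n2 - 1) * np.cov(X2, rowvar=False)) / (n1 + n2 - 2)
    w = np.linalg.solve(S, mu1 - mu2)   # discriminant direction
    midpoint = (mu1 + mu2) / 2
    return lambda x: 1 if (x - midpoint) @ w > 0 else 2

rng = np.random.default_rng(1)
X1 = rng.normal(loc=0.0, size=(200, 3))
X2 = rng.normal(loc=2.0, size=(200, 3))
classify = fisher_rule(X1, X2)
print(classify(np.zeros(3)), classify(np.full(3, 2.0)))  # 1 2
```

When d approaches or exceeds n, the pooled covariance estimate degenerates, which is exactly the failure mode the talk addresses.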

Why is a pure mathematician working in biology? 15:10 Fri 15 Apr, 2011 :: Mawson Lab G19 lecture theatre :: Associate Prof Andrew Francis :: University of Western Sydney
A pure mathematician working in biology should be a contradiction in
terms. In this talk I will describe how I became interested in working in
biology, coming from an algebraic background. I will also describe some
areas of evolutionary biology that may benefit from an algebraic approach.
Finally, if time permits I will reflect on the sometimes difficult
distinction between pure and applied mathematics, and perhaps venture some
thoughts on mathematical research in general. 

On parameter estimation in population models 15:10 Fri 6 May, 2011 :: 715 Ingkarni Wardli :: Dr Joshua Ross :: The University of Adelaide
Essential to applying a mathematical model to a real-world application is calibrating the model to data. Methods for calibrating population models often become computationally infeasible when the population size (more generally, the size of the state space) becomes large, or when other complexities, such as time-dependent transition rates or sampling error, are present. Here we will discuss the use of diffusion approximations to perform estimation in several scenarios, with successively reduced assumptions: (i) under the assumption of stationarity (the process has been evolving for a very long time with constant parameter values); (ii) transient dynamics (the assumption of stationarity is invalid, and thus only constant parameter values may be assumed); and (iii) time-inhomogeneous chains (the parameters may vary with time) and accounting for observation error (a sample of the true state is observed). 

When statistics meets bioinformatics 12:10 Wed 11 May, 2011 :: Napier 210 :: Prof Patty Solomon :: School of Mathematical Sciences
Bioinformatics is a new field of research which encompasses mathematics, computer science, biology, medicine and the physical sciences. It has arisen from the need to handle and analyse the vast amounts of data being generated by the new genomics technologies. The interface of these disciplines used to be information-poor, but is now information-mega-rich, and statistics plays a central role in processing this information and making it intelligible. In this talk, I will describe a published bioinformatics study which claimed to have developed a simple test for the early detection of ovarian cancer from a blood sample. The US Food and Drug Administration was on the verge of approving the test kits for market in 2004 when demonstrated flaws in the study design and analysis led to its withdrawal. We are still waiting for an effective early biomarker test for ovarian cancer. 

Statistical challenges in molecular phylogenetics 15:10 Fri 20 May, 2011 :: Mawson Lab G19 lecture theatre :: Dr Barbara Holland :: University of Tasmania
This talk will give an introduction to the ways that mathematics and statistics get used in the inference of evolutionary (phylogenetic) trees. Taking a model-based approach to estimating the relationships between species has proven enormously effective; however, there are some tricky statistical challenges that remain. The increasingly plentiful amount of DNA sequence data is a boon, but it is also throwing a spotlight on some of the shortcomings of current best practice, particularly in how we (1) assess the reliability of our phylogenetic estimates, and (2) choose appropriate models. This talk will aim to give a general introduction to this area of research and will also highlight some results from two of my recent PhD students. 

Optimal experimental design for stochastic population models 15:00 Wed 1 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Dan Pagendam :: CSIRO, Brisbane
Markov population processes are popular models for studying a wide range of
phenomena including the spread of disease, the evolution of chemical reactions
and the movements of organisms in population networks (metapopulations). Our
ability to use these models effectively can be limited by our knowledge about
parameters, such as disease transmission and recovery rates in an epidemic.
Recently, there has been interest in devising optimal experimental designs for
stochastic models, so that practitioners can collect data in a manner that
maximises the precision of maximum likelihood estimates of the parameters for
these models. I will discuss some recent work on optimal design for a variety
of population models, beginning with some simple one-parameter models where the
optimal design can be obtained analytically and moving on to more complicated
multi-parameter models in epidemiology that involve latent states and
non-exponentially distributed infectious periods. For these more complex
models, the optimal design must be arrived at using computational methods and we
rely on a Gaussian diffusion approximation to obtain analytical expressions for
Fisher's information matrix, which is at the heart of most optimality criteria
in experimental design. I will outline a simple cross-entropy algorithm that
can be used for obtaining optimal designs for these models. We will also
explore the improvements in experimental efficiency when using the optimal
design over some simpler designs, such as the design where observations are
spaced equidistantly in time. 
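To illustrate the idea of choosing observation times to maximise Fisher information, here is a deliberately toy, hypothetical one-parameter example of our own: a pure-death process with per-capita rate mu, observed once at time t, so that the number of survivors out of N is approximately Binomial(N, exp(-mu t)). This model, and all names in the sketch, are assumptions for illustration, not one of the models from the talk.

```python
# Toy optimal-design example: pick the single observation time t that
# maximises the Fisher information about mu from one binomial count.
import math

def fisher_information(t, mu=1.0, N=50):
    """Fisher information about mu from one Binomial(N, exp(-mu t)) count."""
    p = math.exp(-mu * t)
    # I(mu) = N * (dp/dmu)^2 / (p * (1 - p)), with dp/dmu = -t * exp(-mu t)
    return N * (t * p) ** 2 / (p * (1 - p))

# brute-force grid search over candidate observation times
grid = [i / 1000 for i in range(1, 5000)]
t_star = max(grid, key=fisher_information)
print(round(t_star, 3))
```

The grid search here stands in for the cross-entropy algorithm mentioned above; for this toy model the maximising time comes out at around 1.59/mu.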

Priority queueing systems with random switchover times and generalisations of the Kendall-Takacs equation 16:00 Wed 1 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Andrei Bejan :: The University of Cambridge
In this talk I will review existing analytical results for priority queueing
systems with Poisson incoming flows, general service times and a single server
which needs some (random) time to switch between requests of different priority.
Specifically, I will discuss analytical results for the busy period and workload
of such systems with a special structure of switchover times.
The results related to the busy period can be seen as generalisations of the
famous Kendall-Tak\'{a}cs functional equation for the $M/G/1$ queue:
being formulated in terms of Laplace-Stieltjes transforms, they represent systems
of functional recurrent equations.
I will present a methodology and algorithms of their numerical solution;
the efficiency of these algorithms is achieved by acceleration of the numerical
procedure for solving the classical Kendall-Tak\'{a}cs equation.
At the end I will identify open problems with regard to such systems; these open
problems are mainly related to the modelling of switchover times.
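For reference, the classical Kendall-Takacs functional equation for the busy period of the $M/G/1$ queue, in our notation (arrival rate $\lambda$, service-time Laplace-Stieltjes transform $\tilde{B}$, busy-period transform $\theta$):

```latex
% Classical Kendall-Takacs functional equation for the M/G/1 busy period:
% theta(s) is the Laplace-Stieltjes transform of the busy-period length,
% \tilde{B}(s) that of the service time, lambda the Poisson arrival rate.
\[
  \theta(s) \;=\; \tilde{B}\bigl(s + \lambda - \lambda\,\theta(s)\bigr).
\]
```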


Natural operations on the Hochschild cochain complex 13:10 Fri 3 Jun, 2011 :: Mawson 208 :: Dr Michael Batanin :: Macquarie University
The Hochschild cochain complex of an associative algebra provides an important bridge between algebra and geometry.
Algebraically, this is the derived center of the algebra. Geometrically, the Hochschild cohomology of the algebra of smooth functions on a manifold is isomorphic to the graded space of polyvector fields on this manifold.
There are many important operations acting on the Hochschild complex. It is, however, a tricky question to ask which operations are natural because the Hochschild complex is not a functor. In my talk I will explain how we can overcome this obstacle and compute all possible natural operations on the Hochschild complex. The result leads immediately to a proof of the Deligne conjecture on Hochschild cochains. 

Inference and optimal design for percolation and general random graph models (Part I) 09:30 Wed 8 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Andrei Bejan :: The University of Cambridge
The problem of optimal arrangement of nodes of a random weighted graph is discussed in this workshop. The nodes of the graphs under study are fixed, but their edges are random and established according to the so-called edge-probability function. This function is assumed to depend on the weights attributed to the pairs of graph nodes (or distances between them) and a statistical parameter. It is the purpose of experimentation to make inference on the statistical parameter and thus to extract as much information about it as possible. We also distinguish between two different experimentation scenarios: progressive and instructive designs.
We adopt a utility-based Bayesian framework to tackle the optimal design problem for random graphs of this kind. Simulation-based optimisation methods, mainly Monte Carlo and Markov chain Monte Carlo, are used to obtain the solution. We study the optimal design problem for inference based on partial observations of random graphs by employing a data augmentation technique. We prove that infinitely growing or diminishing node configurations asymptotically represent the worst node arrangements. We also obtain the exact solution to the optimal design problem for proximity (geometric) graphs and numerical solutions for graphs with threshold edge-probability functions.
We consider inference and optimal design problems for finite clusters from bond percolation on the integer lattice $\mathbb{Z}^d$ and derive a range of both numerical and analytical results for these graphs. We introduce inner-outer plots by deleting some of the lattice nodes and show that the "mostly populated" designs are not necessarily optimal in the case of incomplete observations under both progressive and instructive design scenarios. Some of the obtained results may generalise to other lattices. 

Inference and optimal design for percolation and general random graph models (Part II) 10:50 Wed 8 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Andrei Bejan :: The University of Cambridge

Quantitative proteomics: data analysis and statistical challenges 10:10 Thu 30 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Peter Hoffmann :: Adelaide Proteomics Centre


Introduction to functional data analysis with applications to proteomics data 11:10 Thu 30 Jun, 2011 :: 7.15 Ingkarni Wardli :: A/Prof Inge Koch :: School of Mathematical Sciences


Object oriented data analysis 14:10 Thu 30 Jun, 2011 :: 7.15 Ingkarni Wardli :: Prof Steve Marron :: The University of North Carolina at Chapel Hill
Object Oriented Data Analysis is the statistical analysis of populations of complex objects. In the special case of Functional Data Analysis, these data objects are curves, where standard Euclidean approaches, such as principal components analysis, have been very successful. Recent developments in medical image analysis motivate the statistical analysis of populations of more complex data objects which are elements of mildly non-Euclidean spaces, such as Lie Groups and Symmetric Spaces, or of strongly non-Euclidean spaces, such as spaces of tree-structured data objects. These new contexts for Object Oriented Data Analysis create several potentially large new interfaces between mathematics and statistics. Even in situations where Euclidean analysis makes sense, there are statistical challenges because of the High Dimension Low Sample Size problem, which motivates a new type of asymptotics leading to non-standard mathematical statistics. 

Object oriented data analysis of tree-structured data objects 15:10 Fri 1 Jul, 2011 :: 7.15 Ingkarni Wardli :: Prof Steve Marron :: The University of North Carolina at Chapel Hill
The field of Object Oriented Data Analysis has made a lot of
progress on the statistical analysis of the variation in populations
of complex objects. A particularly challenging example of this type
is populations of treestructured objects. Deep challenges arise,
which involve a marriage of ideas from statistics, geometry, and
numerical analysis, because the space of trees is strongly
non-Euclidean in nature. These challenges, together with three
completely different approaches to addressing them, are illustrated
using a real data example, where each data point is the tree of blood
arteries in one person's brain. 

Blood flow in the coiled umbilical cord 12:10 Mon 22 Aug, 2011 :: 5.57 Ingkarni Wardli :: Mr David Wilke :: University of Adelaide
The umbilical cord is the connecting cord between the developing embryo or fetus and the placenta. In a normal pregnancy it facilitates the supply of oxygen and nutrients from the placenta, in addition to the return of deoxygenated blood from the fetus. One of the most striking features of the umbilical cord is its coiled structure, which allows the vasculature to withstand tensile and compressive forces in utero. The level of coiling also has a significant effect on the blood flow, and cords exhibiting abnormally high or low levels are known to correlate well with adverse outcomes in pregnancy, including fetal demise.
In this talk I will discuss the complexities associated with numerically modelling blood flow within the umbilical cord, and provide an outline of the key features which will be investigated throughout my research. 

Estimating disease prevalence in hidden populations 14:05 Wed 28 Sep, 2011 :: B.18 Ingkarni Wardli :: Dr Amber Tomas :: The University of Oxford
Estimating disease prevalence in "hidden" populations such as injecting
drug users or men who have sex with men is an important public health
issue. However, traditional design-based estimation methods are
inappropriate because they assume that a list of all members of the
population is available from which to select a sample. Respondent Driven
Sampling (RDS) is a method developed over the last 15 years for sampling
from hidden populations. Similarly to snowball sampling, it leverages the
fact that members of hidden populations are often socially connected to
one another. Although RDS is now used around the world, there are several
common population characteristics which are known to cause estimates
calculated from such samples to be significantly biased. In this talk I'll
discuss the motivation for RDS, as well as some of the recent developments
in methods of estimation. 

Understanding the dynamics of event networks 15:00 Wed 28 Sep, 2011 :: B.18 Ingkarni Wardli :: Dr Amber Tomas :: The University of Oxford
Within many populations there are frequent communications between pairs of individuals. Such communications might be emails sent within a company, radio communications in a disaster zone or diplomatic communications between states. Often it is of interest to understand the factors that drive the observed patterns of such communications, or to study how these factors are changing over time. Communications can be thought of as events occurring on the edges of a network which connects individuals in the population. In this talk I'll present a model for such communications which uses ideas from social network theory to account for the complex correlation structure between events. Applications to the Enron email corpus and the dynamics of hospital ward transfer patterns will be discussed. 

Statistical analysis of school-based student performance data 12:10 Mon 10 Oct, 2011 :: 5.57 Ingkarni Wardli :: Ms Jessica Tan :: University of Adelaide
Join me in the journey of being a statistician for 15 minutes of your day (if you are not already one) and experience the task of data cleaning without having to get your own hands dirty. Most of you may have sat the Basic Skills Tests when at school or know someone who currently has to do the NAPLAN (National Assessment Program  Literacy and Numeracy) tests. Tests like these assess student progress and can be used to accurately measure school performance. In trying to answer the research question: "what conclusions about student progress and school performance can be drawn from NAPLAN data or data of a similar nature, using mathematical and statistical modelling and analysis techniques?", I have uncovered some interesting results about the data in my initial data analysis which I shall explain in this talk. 

Statistical modelling for some problems in bioinformatics 11:10 Fri 14 Oct, 2011 :: B.17 Ingkarni Wardli :: Professor Geoff McLachlan :: The University of Queensland
In this talk we consider some statistical analyses of data arising in bioinformatics. The problems include the detection of differential expression in microarray gene-expression data, the clustering of time-course gene-expression data and, lastly, the analysis of modern-day cytometric data. Extensions are considered to the procedures proposed for these three problems in McLachlan et al. (Bioinformatics, 2006), Ng et al. (Bioinformatics, 2006), and Pyne et al. (PNAS, 2009), respectively. The latter references are available at http://www.maths.uq.edu.au/~gjm/. 

Spinal Research at the University of Adelaide 11:10 Wed 14 Dec, 2011 :: B.17 Ingkarni Wardli :: Dr Robert Moore :: Adelaide Centre for Spinal Research


Spinal Research at the University of Adelaide 15:10 Fri 10 Feb, 2012 :: B.20 Ingkarni Wardli :: Dr Robert Moore :: Adelaide Centre for Spinal Research


IGA Workshop: The mathematical implications of gauge-string dualities 09:30 Mon 5 Mar, 2012 :: 7.15 Ingkarni Wardli :: Prof Rajesh Gopakumar :: Harish-Chandra Research Institute
Lecture series by Rajesh Gopakumar (Harish-Chandra Research Institute). The lectures will be supplemented by talks by other invited speakers. 

Financial risk measures  the theory and applications of backward stochastic difference/differential equations with respect to the single jump process 12:10 Mon 26 Mar, 2012 :: 5.57 Ingkarni Wardli :: Mr Bin Shen :: University of Adelaide
This is my PhD thesis, submitted one month ago. Chapter 1 introduces the background of the research fields; each subsequent chapter is a published or an accepted paper.
Chapter 2, to appear in Methodology and Computing in Applied Probability, establishes the theory of Backward Stochastic Difference Equations with respect to the single jump process in discrete time.
Chapter 3, published in Stochastic Analysis and Applications, establishes the theory of Backward Stochastic Differential Equations with respect to the single jump process in continuous time.
Chapters 2 and 3 constitute Part I: Theory.
Chapter 4, published in Expert Systems With Applications, gives some examples of how to measure financial risks using the theory established in Chapter 2.
Chapter 5, accepted by Journal of Applied Probability, considers the question of an optimal transaction between two investors to minimise their risks; it is an application of the theory established in Chapter 3.
Chapters 4 and 5 constitute Part II: Applications. 

P or NP: this is the question 13:10 Tue 22 May, 2012 :: 7.15 Ingkarni Wardli :: Dr Ali Eshragh :: School of Mathematical Sciences
Up to the early 70s, the main concentration of mathematicians was the design of algorithms. However, the advent of computers shifted this focus from merely designing an algorithm to designing the most efficient algorithm. This created a new field of research, namely the complexity of algorithms, and the associated problem "Is P equal to NP?" was born. The latter question has remained open for more than four decades and is one of the most famous open problems of the 21st century. Any person who can solve this problem will be awarded US$1,000,000 by the Clay Institute. In this talk, we are going to introduce this problem through simple examples and explain one of the intriguing approaches that may help to solve it.


Evaluation and comparison of the performance of Australian and New Zealand intensive care units 14:10 Fri 25 May, 2012 :: 7.15 Ingkarni Wardli :: Dr Jessica Kasza :: The University of Adelaide
Recently, the Australian Government has emphasised the need for monitoring and comparing the performance of Australian hospitals. Evaluating the performance of intensive care units (ICUs) is of particular importance, given that the most severe cases are treated in these units. Indeed, ICU performance can be thought of as a proxy for the overall performance of a hospital. We compare the performance of the ICUs contributing to the Australian and New Zealand Intensive Care Society (ANZICS) Adult Patient Database, the largest of its kind in the world, and identify those ICUs with unusual performance.
It is well-known that there are many statistical issues that must be accounted for in the evaluation of healthcare provider performance. Indicators of performance must be appropriately selected and estimated, investigators must adequately adjust for case-mix, statistical variation must be fully accounted for, and adjustment for multiple comparisons must be made. Our basis for dealing with these issues is the estimation of a hierarchical logistic model for the in-hospital death of each patient, with patients clustered within ICUs. Both patient- and ICU-level covariates are adjusted for, with a random intercept and random coefficient for the APACHE III severity score. Given that we expect most ICUs to have similar performance after adjustment for these covariates, we follow Ohlssen et al., JRSS A (2007), and estimate a null model that we expect the majority of ICUs to follow. This methodology allows us to rigorously account for the aforementioned statistical issues, and accurately identify those ICUs contributing to the ANZICS database that have comparatively unusual performance. This is joint work with Prof. Patty Solomon and Assoc. Prof. John Moran. 
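A generic form of the hierarchical logistic model described above, in our notation (the random intercept and random APACHE III coefficient follow the abstract; the exact covariate specification is the authors' own):

```latex
% Hierarchical logistic model: Y_ij indicates in-hospital death of patient i
% in ICU j, x_ij the covariates, a_ij the APACHE III severity score,
% u_j a random intercept and v_j a random slope for ICU j.
\[
  \operatorname{logit}\Pr(Y_{ij}=1 \mid x_{ij}, a_{ij})
  \;=\; x_{ij}^{\top}\beta + u_j + v_j\, a_{ij},
  \qquad (u_j, v_j)^{\top} \sim N(0, \Sigma).
\]
```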

Epidemiological consequences of household-based antiviral prophylaxis for pandemic influenza 14:10 Fri 8 Jun, 2012 :: 7.15 Ingkarni Wardli :: Dr Joshua Ross :: The University of Adelaide
Antiviral treatment offers a fast-acting alternative to vaccination. It is viewed as a first line of defence against pandemic influenza, protecting families and household members once infection has been detected. In clinical trials antiviral treatment has been shown to be efficacious in preventing infection, limiting disease and reducing transmission, yet its impact at containing the 2009 influenza A(H1N1)pdm outbreak was limited. I will describe some of our work, which attempts to understand this seeming discrepancy, through the development of a general model and computationally efficient methodology for studying household-based interventions.
This is joint work with Dr Andrew Black (Adelaide), and Prof. Matt Keeling and Dr Thomas House (Warwick, U.K.). 

Differential topology 101 13:10 Fri 17 Aug, 2012 :: Engineering North 218 :: Dr Nicholas Buchdahl :: University of Adelaide
Much of my recent research has been directed at a problem in the theory of compact complex surfaces: trying to fill in a gap in the Enriques-Kodaira classification.
Attempting to classify some collection of mathematical objects is a very common activity for pure mathematicians, and there are many well-known examples of successful classification schemes; for example, the classification of finite simple groups, and the classification of simply connected topological 4-manifolds.
The aim of this talk will be to illustrate how techniques from differential geometry can be used to classify compact surfaces. The level of the talk will be very elementary, and the material is all very well known, but it is sometimes instructive to look back over simple cases of a general problem with the benefit of experience to gain greater insight into the more general and difficult cases. 

Infectious diseases modelling: from biology to public health policy 15:10 Fri 24 Aug, 2012 :: B.20 Ingkarni Wardli :: Dr James McCaw :: The University of Melbourne
The mathematical study of human-to-human transmissible pathogens has established itself as a complementary methodology to the traditional epidemiological approach. The classic susceptible-infectious-recovered model paradigm has been used to great effect to gain insight into the epidemiology of endemic diseases such as influenza and pertussis, and the emergence of novel pathogens such as SARS and pandemic influenza. The modelling paradigm has also been taken within the host and used to explain the within-host dynamics of viral (or bacterial or parasite) infections, with implications for our understanding of infection, the emergence of drug resistance and optimal drug interventions.
In this presentation I will provide an overview of the mathematical paradigm used to investigate both biological and epidemiological infectious diseases systems, drawing on case studies from influenza, malaria and pertussis research. I will conclude with a summary of how infectious diseases modelling has assisted the Australian government in developing its pandemic preparedness and response strategies.
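The susceptible-infectious-recovered paradigm mentioned above can be sketched in a few lines; the parameter values here are illustrative only, not drawn from any study in the talk.

```python
# Minimal SIR epidemic sketch, integrated with fixed Euler steps.
# beta: transmission rate, gamma: recovery rate; illustrative values only.
def sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, days=200, dt=0.01):
    s, i, r = s0, i0, 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt   # newly infected fraction this step
        new_rec = gamma * i * dt      # newly recovered fraction this step
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return s, i, r

s, i, r = sir()
print(round(s + i + r, 6))  # population fractions are conserved -> 1.0
```

With these illustrative values the basic reproduction number is beta/gamma = 3, so the sketch produces a single large outbreak that has burnt out by the end of the run.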


Probability, what can it tell us about health? 13:10 Tue 9 Oct, 2012 :: 7.15 Ingkarni Wardli :: Prof Nigel Bean :: School of Mathematical Sciences
Clinical trials are the way in which modern medical systems test whether individual treatments are worthwhile. Sophisticated statistics is used to try and make the conclusions from clinical trials as meaningful as possible. What can a very simple probability model then tell us about the worth of multiple treatments? What might the implications of this be for the whole health system?
This talk is based on research currently being conducted with a physician at a major Adelaide hospital. It requires no health knowledge and was not tested on animals. All you need is an enquiring and open mind.


Multiscale models of evolutionary epidemiology: where is HIV going? 14:00 Fri 19 Oct, 2012 :: Napier 205 :: Dr Lorenzo Pellis :: The University of Warwick
An important component of pathogen evolution at the population level is evolution within hosts, which can alter the composition of genotypes available for transmission as infection progresses. I will present a deterministic multiscale model, linking the within-host competition dynamics with the transmission dynamics at a population level. I will take HIV as an example of how this framework can help clarify the conflicting evolutionary pressures an infectious disease might be subject to.

Epidemic models in socially structured populations: when are simple models too simple? 14:00 Thu 25 Oct, 2012 :: 5.56 Ingkarni Wardli :: Dr Lorenzo Pellis :: The University of Warwick
Both age and household structure are recognised as important heterogeneities affecting the epidemic spread of infectious pathogens, and many models exist nowadays that include either or both forms of heterogeneity. However, different models may fit aggregate epidemic data equally well and nevertheless lead to different predictions of public health interest. I will here present an overview of stochastic epidemic models with increasing complexity in their social structure, focusing in particular on household models. For these models, I will present recent results about the definition and computation of the basic reproduction number R0 and its relationship with other threshold parameters. Finally, I will use these results to compare models with neither, either or both of age and household structure, with the aim of quantifying the conditions under which each form of heterogeneity is relevant, thereby providing criteria that can be used to guide model design for real-time predictions.

Fair and Loathing in State Parliament 12:10 Mon 29 Oct, 2012 :: B.21 Ingkarni Wardli :: Mr Casey Briggs :: University of Adelaide
The South Australian electoral system has a history of bias, malapportionment and perceived unfairness. These days, it is typical of most systems across Australia, with one major difference: a specific legislated criterion designed to force the system to be 'fair'. In reality, fairness is a hard concept to define, and an even harder concept to enforce.
In this talk I will briefly take you through the history of South Australian electoral reform, the current state of affairs and my proposed research. There will be very little in the way of rigorous mathematics.
No knowledge of politics is assumed, but an understanding of the process of voting would be useful. 

Numerical Free Probability: Computing Eigenvalue Distributions of Algebraic Manipulations of Random Matrices 15:10 Fri 2 Nov, 2012 :: B.20 Ingkarni Wardli :: Dr Sheehan Olver :: The University of Sydney
Suppose that the global eigenvalue distributions of two large random
matrices A and B are known. It is a remarkable fact that, generically,
the eigenvalue distributions of A + B and (if A and B are positive
definite) A*B are uniquely determined by the eigenvalue distributions
of A and B alone; i.e., no information about eigenvectors is required.
These operations on eigenvalue distributions are described by free
probability theory. We construct a numerical toolbox that can
efficiently and reliably calculate these operations with spectral
accuracy, by exploiting the complex-analytical framework that underlies
free probability theory.
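The claim that the spectrum of A + B depends only on the two input spectra can be glimpsed numerically. The sketch below is a plain Monte Carlo illustration, unrelated to the spectral toolbox described in the talk: it sums two independent GOE-type matrices, each with a unit-variance semicircle spectrum, and checks that the variances simply add under free additive convolution.

```python
import numpy as np

def random_symmetric(n, rng):
    """Random symmetric (GOE-type) matrix, scaled so that its
    eigenvalue distribution approaches the semicircle law with
    unit variance as n grows."""
    a = rng.standard_normal((n, n))
    return (a + a.T) / np.sqrt(2 * n)

rng = np.random.default_rng(0)
n = 500
A, B = random_symmetric(n, rng), random_symmetric(n, rng)

# Free probability: the spectrum of A + B is determined by the two
# input spectra alone. For two unit-variance semicircles the free
# additive convolution is again a semicircle, with variance 1 + 1 = 2.
eig_sum = np.linalg.eigvalsh(A + B)
second_moment = np.mean(eig_sum**2)  # close to 2 for large n
```

Repeating this with, say, A conjugated by a random orthogonal matrix leaves the answer unchanged, which is exactly the eigenvector-independence the abstract refers to.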


A glimpse at the Langlands program 15:10 Fri 12 Apr, 2013 :: B.18 Ingkarni Wardli :: Dr Masoud Kamgarpour :: University of Queensland
In the late 1960s, Robert Langlands made a series of surprising conjectures relating fundamental concepts from number theory, representation theory, and algebraic geometry. Langlands' conjectures soon developed into a high-profile international research program known as the Langlands program. Many fundamental problems, including the Shimura-Taniyama-Weil conjecture (partially settled by Andrew Wiles in his proof of Fermat's Last Theorem), are particular cases of the Langlands program. In this talk, I will discuss some of the motivation and results in this program.

Colour 12:10 Mon 13 May, 2013 :: B.19 Ingkarni Wardli :: Lyron Winderbaum :: University of Adelaide
Colour is a powerful tool in presenting data, but it can be tricky to choose just the right colours to represent your data honestly. Do the colours used in your heatmap overemphasise the differences between particular values over others? Does your choice of colours overemphasise one when they should be represented as equal? All these questions are fundamentally based in how we perceive colour. There has been a lot of research into how we perceive colour over the past century, with some interesting results. I will explain how a 'standard observer' was found empirically and used to develop an absolute reference standard for colour in 1931; how, although the common Red-Green-Blue representation of colour is useful and intuitive, distances between colours in this space do not reflect our perception of the difference between them; and how alternative, perceptually focused colour spaces were introduced in 1976. I will go on to explain how these results can be used to provide simple mechanisms for choosing colours that satisfy particular properties, such as being equally different from each other, being linearly more different in sequence, or maintaining such properties when transferred to greyscale or viewed by a colour-blind person.
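The perceptually focused colour spaces of 1976 can be illustrated with the standard sRGB to CIE L*a*b* conversion. The sketch below uses the usual published constants (sRGB gamma curve, sRGB primaries, D65 white point); it is an illustration of the idea, not code from the talk.

```python
def srgb_to_lab(rgb):
    """Convert an 8-bit sRGB triple to CIE L*a*b* (D65 white point),
    one of the perceptually focused colour spaces introduced in 1976.
    Euclidean distance in L*a*b* (Delta E 1976) tracks perceived
    colour difference far better than distance in raw RGB values."""
    # Undo the sRGB gamma to recover linear light.
    lin = []
    for c in rgb:
        c = c / 255.0
        lin.append(c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4)
    r, g, b = lin
    # Linear RGB -> CIE XYZ (sRGB primaries, D65 white).
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b

    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29

    fx, fy, fz = f(x / 0.9505), f(y / 1.0), f(z / 1.089)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```

For example, `srgb_to_lab((255, 255, 255))` returns approximately (100, 0, 0); choosing a palette whose entries are equally spaced in L*a*b* rather than in RGB is one simple way to get the "equally different" colours mentioned above.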

Invariant Theory: The 19th Century and Beyond 15:10 Fri 21 Jun, 2013 :: B.18 Ingkarni Wardli :: Dr Jarod Alper :: Australian National University
A central theme in 19th century mathematics was invariant theory, which was viewed as a bridge between geometry and algebra. David Hilbert revolutionized the field with two seminal papers in 1890 and 1893, with techniques such as Hilbert's basis theorem, Hilbert's Nullstellensatz and Hilbert's syzygy theorem that spawned the modern field of commutative algebra. After Hilbert's groundbreaking work, the field of invariant theory remained largely inactive until the 1960s, when David Mumford revitalized the field by reinterpreting Hilbert's ideas in the context of algebraic geometry, which ultimately led to the influential construction of the moduli space of smooth curves. Today invariant theory remains a vital research area with connections to various mathematical disciplines: representation theory, algebraic geometry, commutative algebra, combinatorics and nonlinear differential operators.
The goal of this talk is to provide an introduction to invariant theory with an emphasis on Hilbert's and Mumford's contributions. Time permitting, I will explain recent research with Maksym Fedorchuk and David Smyth which exploits the ideas of Hilbert and Mumford, as well as Kempf, to answer a classical question concerning the stability of algebraic curves.

The Hamiltonian Cycle Problem and Markov Decision Processes 15:10 Fri 2 Aug, 2013 :: B.18 Ingkarni Wardli :: Prof Jerzy Filar :: Flinders University
We consider the famous Hamiltonian cycle problem (HCP) embedded in a Markov decision process (MDP). More specifically, we consider a moving object on a graph G where, at each vertex, a controller may select an arc emanating from that vertex according to a probabilistic decision rule. A stationary policy is simply a control where these decision rules are time invariant. Such a policy induces a Markov chain on the vertices of the graph. Therefore, HCP is equivalent to a search for a stationary policy that induces a 0-1 probability transition matrix whose nonzero entries trace out a Hamiltonian cycle in the graph. A consequence of this embedding is that we may consider the problem over a number of alternative, convex, rather than discrete, domains. These include: (a) the space of stationary policies, (b) the more restricted but very natural space of doubly stochastic matrices induced by the graph, and (c) the associated spaces of so-called "occupational measures". This approach to the HCP has led to both theoretical and algorithmic approaches to the underlying HCP. In this presentation, we outline a selection of results generated by this line of research.
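The equivalence above rests on a simple check: a deterministic stationary policy is a successor map (a 0-1 transition matrix with one 1 per row), and it solves the HCP exactly when the induced chain is a single cycle through every vertex. A minimal sketch of that check, assuming vertices are numbered 0..n-1:

```python
def traces_hamiltonian_cycle(succ):
    """Given a deterministic stationary policy as a successor list
    (succ[v] is the arc chosen at vertex v, i.e. the column of the 1
    in row v of the 0-1 transition matrix), check whether the induced
    Markov chain is a single cycle visiting every vertex once."""
    n = len(succ)
    v, seen = 0, set()
    for _ in range(n):
        if v in seen:
            return False  # returned to a vertex before covering all n
        seen.add(v)
        v = succ[v]
    return v == 0 and len(seen) == n

# A 4-cycle 0 -> 1 -> 2 -> 3 -> 0 versus two disjoint 2-cycles.
print(traces_hamiltonian_cycle([1, 2, 3, 0]),
      traces_hamiltonian_cycle([1, 0, 3, 2]))
```

The interest of the convex reformulations in the talk is precisely that they avoid searching this discrete set of successor maps directly.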

Symplectic Lie groups 12:10 Fri 9 Aug, 2013 :: Ingkarni Wardli B19 :: Dr Wolfgang Globke :: University of Adelaide
A "symplectic Lie group" is a Lie group G with a symplectic form such that G acts by symplectic transformations on itself. Such a G cannot be semisimple, so the research focuses on solvable symplectic Lie groups. In the compact case, a classification of these groups is known. In many cases, a solvable symplectic Lie group G is a cotangent bundle of a flat Lie group H. Then H is a Lagrange subgroup of G, meaning its Lie algebra h is isotropic in the Lie algebra g of G. The existence of Lagrange subalgebras or ideals in g is an important question which relates to many problems in the general structure theory of symplectic Lie groups.
In my talk, I will give a brief overview of the known results in this field, ranging from the 1970s to a very recent structure theory. 

Lost in Space: Point Pattern Matching and Astrometry 12:35 Mon 14 Oct, 2013 :: B.19 Ingkarni Wardli :: Annie Conway :: University of Adelaide
Astrometry is the field of research that concerns the positions of objects in space. This can be useful for satellite tracking where we would like to know accurate positions of satellites at given times. Telescopes give us some idea of the position, but unfortunately they are not very precise.
However, if a photograph of a satellite has stars in the background, we can use that information to refine our estimate of the location of the image, since the positions of stars are known to high accuracy and are readily available in star catalogues. But there are billions of stars in the sky so first we would need to determine which ones we're actually looking at.
In this talk I will give a brief introduction to astrometry and walk through a point pattern matching algorithm for identifying stars in a photograph. 
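One standard ingredient of point pattern matching, in the spirit of the talk, is a similarity-invariant signature for triangles of stars; candidate identifications come from matching signatures of image triangles against signatures precomputed from the catalogue. A minimal sketch of such a signature (illustrative, not necessarily the algorithm presented):

```python
import math

def triangle_signature(p, q, r):
    """Similarity-invariant signature of a triangle: its two shorter
    side lengths divided by the longest. Scale, rotation and
    translation all cancel, so a triangle of stars in a photograph
    and the same triangle in a star catalogue share a signature."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    d = sorted([dist(p, q), dist(q, r), dist(r, p)])
    return d[0] / d[2], d[1] / d[2]
```

For example, the right triangle (0,0), (1,0), (0,1) and a translated, rotated, 3x-scaled copy such as (5,5), (5,8), (2,5) produce identical signatures, so they would be matched as the same asterism.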

Modelling and optimisation of group dose-response challenge experiments 12:10 Mon 28 Oct, 2013 :: B.19 Ingkarni Wardli :: David Price :: University of Adelaide
An important component of scientific research is the 'experiment'. Effective design of these experiments is important and, accordingly, has received significant attention under the heading 'optimal experimental design'. However, until recently, little work has been done on optimal experimental design for experiments where the underlying process can be modelled by a Markov chain. In this talk, I will discuss some of the work that has been done in the field of optimal experimental design for Markov chains, and some of the work that I have done in applying this theory to dose-response challenge experiments for the bacterium Campylobacter jejuni in chickens.

Recent developments in special holonomy manifolds 12:10 Fri 1 Nov, 2013 :: Ingkarni Wardli 7.15 :: Prof Robert Bryant :: Duke University
One of the big classification results in differential geometry from the past century has been the classification of the possible holonomies of affine manifolds, with the major first step having been taken by Marcel Berger in his 1954 thesis. However, Berger's classification was only partial, and, in the past 20 years, an extensive research effort has been expended to complete this classification and extend it in a number of ways. In this talk, after recounting the major parts of the history of the subject, I will discuss some of the recent results and surprising new examples discovered as a byproduct of research into Finsler geometry. If time permits, I will also discuss some of the open problems in the subject. 

Developing Multiscale Methodologies for Computational Fluid Mechanics 12:10 Mon 11 Nov, 2013 :: B.19 Ingkarni Wardli :: Hammad Alotaibi :: University of Adelaide
The development of multiscale methods has recently become one of the most fertile research areas in mathematics, physics, engineering and computer science. The need for multiscale modelling usually comes from the fact that the available macroscale models are not accurate enough, and the microscale models are not efficient enough. By combining both viewpoints, one hopes to arrive at a reasonable compromise between accuracy and efficiency.
In this seminar I will give an overview of recent efforts on developing multiscale methods, such as the patch dynamics scheme, which is used to address an important class of time-dependent multiscale problems.

A gentle introduction to bubble evolution in Hele-Shaw flows 15:10 Fri 22 Nov, 2013 :: 5.58 (Ingkarni Wardli) :: Dr Scott McCue :: QUT
A Hele-Shaw cell is easy to make and serves as a fun toy for an applied mathematician to play with. If we inject air into a Hele-Shaw cell that is otherwise filled with viscous fluid, we can observe a bubble of air growing in size. The process is highly unstable, and the bubble boundary expands in an uneven fashion, leading to striking fingering patterns (look up Hele-Shaw cell or Saffman-Taylor instability on YouTube). From a mathematical perspective, modelling these Hele-Shaw flows is interesting because the governing equations are sufficiently "simple" that a considerable amount of analytical progress is possible. Indeed, there is no other context in which (genuinely) two-dimensional moving boundary problems are so tractable. More generally, Hele-Shaw flows are important as they serve as prototypes for more complicated (and important) physical processes such as crystal growth and diffusion-limited aggregation. I will give an introduction to some of the main ideas and summarise some of my present research in this area.


Network-based approaches to classification and biomarker identification in metastatic melanoma 15:10 Fri 2 May, 2014 :: B.21 Ingkarni Wardli :: Associate Professor Jean Yee Hwa Yang :: The University of Sydney
Finding prognostic markers has been a central question in much of current research in medicine and biology. In the last decade, approaches to prognostic prediction within a genomics setting have been primarily based on changes in individual genes/proteins. Very recently, however, network-based approaches to prognostic prediction have begun to emerge which utilise interaction information between genes. This is based on the belief that large-scale molecular interaction networks are dynamic in nature, and that changes in these networks, rather than changes in individual genes/proteins, are often the drivers of complex diseases such as cancer.
In this talk, I use data from stage III melanoma patients provided by Prof. Mann from the Melanoma Institute of Australia to discuss how network information can be utilised in the analysis of gene expression data to aid in biological interpretation. Here, we explore a number of novel and previously published network-based prediction methods, which we then compare to the common single-gene and gene-set methods with the aim of identifying more biologically interpretable biomarkers in the form of networks.

Computing with groups 15:10 Fri 30 May, 2014 :: B.21 Ingkarni Wardli :: Dr Heiko Dietrich :: Monash University
Groups are algebraic structures which show up in many branches of mathematics and other areas of science; Computational Group Theory is on the cutting edge of pure research in group theory and its interplay with computational methods.
In this talk, we consider a practical aspect of Computational Group Theory: how to represent a group in a computer, and how to work with such a description efficiently. We will first recall some well-established methods for permutation groups; we will then discuss some recent progress for matrix groups.

Mathematics: a castle in the sky? 14:10 Mon 25 Aug, 2014 :: Ingkarni Wardli 715 Conference Room :: Dr. David Roberts :: School of Mathematical Sciences
At university you are exposed to more rigorous mathematics than at school, exemplified by definitions such as those of real numbers, individually or as a whole. However, what does mathematics ultimately rest on? Definitions depend on things defined earlier, and this process must stop at some point. Mathematicians expended a lot of energy in the late 19th and early 20th centuries trying to pin down the absolutely fundamental ideas of mathematics, with unexpected results. The results of these efforts are called foundations and are still an area of active research today.
This talk will explain what foundations are, some of the historical setting in which they arose, and several of the various systems on which mathematics can be built, and why most of the mathematics you will do uses only a tiny portion of it!

Testing Statistical Association between Genetic Pathways and Disease Susceptibility 12:10 Mon 1 Sep, 2014 :: B.19 Ingkarni Wardli :: Andy Pfieffer :: University of Adelaide
A major research area is the identification of genetic pathways associated with various diseases. However, a detailed comparison of methods that have been designed to ascertain the association between pathways and diseases has not been performed.
I will give the necessary biological background behind Genome-Wide Association Studies (GWAS), and explain the shortfalls in traditional GWAS methodologies. I will then explore various methods that use information about genetic pathways in GWAS, and explain the challenges in comparing these methods.

Exploration vs. Exploitation with Partially Observable Gaussian Autoregressive Arms 15:00 Mon 29 Sep, 2014 :: Engineering North N132 :: Julia Kuhn :: The University of Queensland & The University of Amsterdam
We consider a restless bandit problem with Gaussian autoregressive arms, where the state of an arm is only observed when it is played and the state-dependent reward is collected. Since arms are only partially observable, a good decision policy needs to account for the fact that information about the state of an arm becomes more and more obsolete while the arm is not being played. Thus, the decision maker faces a trade-off between exploiting those arms that are believed to be currently the most rewarding (i.e. those with the largest conditional mean), and exploring arms with a high conditional variance. Moreover, one would like the decision policy to remain tractable despite the infinite state space and also in systems with many arms. A policy that gives some priority to exploration is the Whittle index policy, for which we establish structural properties. These motivate a parametric index policy that is computationally much simpler than the Whittle index but can still outperform the myopic policy. Furthermore, we examine the many-arm behaviour of the system under the parametric policy, identifying equations describing its asymptotic dynamics. Based on these insights we provide a simple heuristic algorithm to evaluate the performance of index policies; the latter is used to optimize the parametric index.

Micro Magnetofluidics: Wireless Manipulation for Microfluidics 15:10 Fri 24 Oct, 2014 :: N.132 Engineering North :: Professor Nam-Trung Nguyen :: Griffith University
Microfluidics is rich in multiphysics phenomena, which offer fundamentally new capabilities in the manipulation and detection of biological particles. Most current microfluidic applications are based on hydrodynamic, electrokinetic, acoustic and optic actuation. Implementing these concepts requires bulky external pumping/valving systems and energy supplies. The required wires and connectors make their fabrication and handling difficult. Most of the conventional approaches induce heat that may affect sensitive bioparticles such as cells. There is a need for a technology for fluid handling in microfluidic devices that is low-cost, simple, wireless, free of induced heat and independent of pH level or ion concentration. The use of magnetism would provide a wireless solution for this need. Micro magnetofluidics is a newly established research field that links magnetism and microfluidics to gain new capabilities. Magnetism provides a convenient and wireless way to control and manipulate fluid flow at the microscale. Investigation of magnetism-induced phenomena in a microfluidic device has the advantage of well-defined experimental conditions, such as temperature and magnetic field, because of the system size. This talk presents recent interesting phenomena in both continuous-flow and digital micro magnetofluidics.

Modelling segregation distortion in multiparent crosses 15:00 Mon 17 Nov, 2014 :: 5.57 Ingkarni Wardli :: Rohan Shah (joint work with B. Emma Huang and Colin R. Cavanagh) :: The University of Queensland
Construction of high-density genetic maps has been made feasible by low-cost, high-throughput genotyping technology; however, the process is still complicated by biological, statistical and computational issues. A major challenge is the presence of segregation distortion, which can be caused by selection, difference in fitness, or suppression of recombination due to introgressed segments from other species. Alien introgressions are common in major crop species, where they have often been used to introduce beneficial genes from wild relatives.
Segregation distortion causes problems at many stages of the map construction process, including assignment to linkage groups and estimation of recombination fractions. This can result in incorrect ordering and estimation of map distances. While discarding markers will improve the resulting map, it may result in the loss of genomic regions under selection or containing beneficial genes (in the case of introgression).
To correct for segregation distortion we model it explicitly in the estimation of recombination fractions. Previously proposed methods introduce additional parameters to model the distortion, with a corresponding increase in computing requirements. This poses difficulties for large, densely genotyped experimental populations. We propose a method imposing minimal additional computational burden which is suitable for high-density map construction in large multiparent crosses. We demonstrate its use by modelling the known Sr36 introgression in wheat for an eight-parent complex cross.


Topology Tomography with Spatial Dependencies 15:00 Tue 25 Nov, 2014 :: Engineering North N132 :: Darryl Veitch :: The University of Melbourne
There has been quite a lot of tomography inference work on measurement networks with a tree topology. Here observations are made, at the leaves of the tree, of 'probes' sent down from the root and copied at each branch point. Inference can be performed based on loss or delay information carried by probes, and used to recover loss parameters, delay parameters, or the topology of the tree. In all of these a strong assumption of spatial independence between links in the tree has been made in prior work. I will describe recent work on topology inference, based on loss measurement, which breaks that assumption. In particular I will introduce a new model class for loss with nontrivial spatial dependence, the 'Jump Independent Models', which are well motivated, and prove that within this class the topology is identifiable.

Queues and cooperative games 15:00 Fri 18 Sep, 2015 :: Ingkarni Wardli B21 :: Moshe Haviv :: Department of Statistics and the Federmann Center for the Study of Rationality, The Hebrew University
The area of cooperative game theory deals with models in which a number of individuals, called players, can form coalitions so as to improve the utility of their members. In many cases, the formation of the grand coalition is a natural result of some negotiation or bargaining procedure.
The main question then is how the players should split the gains due to their cooperation among themselves. Various solutions have been suggested, among them the Shapley value, the nucleolus and the core.
Servers in a queueing system can also join forces. For example, they can exchange service capacity among themselves or serve customers who originally sought service at their peers. The overall performance improves, and the question is how they should split the gains or, equivalently, how much each one of them needs to pay or be paid in order to cooperate with the others. Our major focus is on the core of the resulting cooperative game and on showing that in many queueing games the core is not empty.
Finally, customers who are served by the same server can also be looked at as players who form a grand coalition, now inflicting damage on each other in the form of additional waiting time. We show how cooperative game theory, specifically the Aumann-Shapley prices, leads to a way in which this damage can be attributed to individual customers or groups of customers.
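The Shapley value mentioned above has a simple operational definition: average each player's marginal contribution over all orderings of the grand coalition. A direct (exponential-time) sketch, fine for the small illustrative games one meets in a talk like this:

```python
from itertools import permutations

def shapley(players, v):
    """Shapley value of the cooperative game with characteristic
    function v: average each player's marginal contribution
    v(S + {p}) - v(S) over all orderings of the grand coalition."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            phi[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: phi[p] / len(orders) for p in players}

# Hypothetical symmetric game: any coalition of two or more servers
# generates one unit of gain. By symmetry each of the three players
# should be credited 1/3, and the computation confirms it.
value = shapley((1, 2, 3), lambda S: 1.0 if len(S) >= 2 else 0.0)
```

For realistic numbers of players one would switch to sampling random orderings; the same averaging idea underlies the Aumann-Shapley prices mentioned at the end of the abstract.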

Ocean dynamics of Gulf St Vincent: a numerical study 12:10 Mon 2 Nov, 2015 :: Benham Labs G10 :: Henry Ellis :: University of Adelaide
Media...The aim of this research is to determine the physical dynamics of ocean circulation within Gulf St. Vincent, South Australia, and the exchange of momentum, nutrients, heat, salt and other water properties between the gulf and shelf via Investigator Strait and Backstairs Passage. The project aims to achieve this through the creation of highresolution numerical models, combined with new and historical observations from a moored instrument package, satellite data, and shipboard surveys.
The quasi-realistic high-resolution models are forced using boundary conditions generated by existing larger-scale ROMS models, which are in turn forced at the boundary by a global model, creating a global-to-regional-to-local model network. Climatological forcing uses European Centre for Medium-Range Weather Forecasts (ECMWF) data sets and is consistent across the regional and local models. A series of conceptual models is used to investigate the relative importance of separate physical processes, in addition to the fully forced quasi-realistic models.
An outline of the research to be undertaken is given:
- Connectivity of Gulf St. Vincent with shelf waters, including seasonal variation due to wind and thermocline patterns;
- The role of wintertime cooling and the formation of eddies in flushing the gulf;
- The formation of a temperature front within the gulf during summer; and
- The connectivity and importance of nutrient-rich, cool water upwelling from the Bonney Coast into the gulf via Backstairs Passage during summer.

Modelling Coverage in RNA Sequencing 09:00 Mon 9 Nov, 2015 :: Ingkarni Wardli 5.57 :: Arndt von Haeseler :: Max F Perutz Laboratories, University of Vienna
RNA sequencing (RNA-seq) is the method of choice for measuring the expression of RNAs in a cell population. In an RNA-seq experiment, sequencing the full length of larger RNA molecules requires fragmentation into smaller pieces to be compatible with the limited read lengths of most deep-sequencing technologies. Unfortunately, non-uniform coverage across a genomic feature has been a concern in RNA-seq and is attributed to preferences for certain fragments in steps of library preparation and sequencing. However, the disparity between the observed non-uniformity of read coverage in RNA-seq data and the assumption of expected uniformity raises the question of what read coverage profile one should expect across a transcript if there are no biases in the sequencing protocol. We propose a simple model of unbiased fragmentation in which we find that the expected coverage profile is not uniform and, in fact, depends on the ratio of fragment length to transcript length. To compare the non-uniformity proposed by our model with experimental data, we extended this simple model to incorporate empirical attributes matching those of the sequenced transcript in an RNA-seq experiment. In addition, we imposed an experimentally derived distribution on the frequency at which fragment lengths occur.
We used this model to compare our theoretical prediction with experimental data and with the uniform coverage model. If time permits, we will also discuss a potential application of our model.
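A toy version of the unbiased-fragmentation idea already shows non-uniform expected coverage: if fragments of a fixed length are placed uniformly at random within a transcript, positions near the ends are covered by fewer possible fragment placements than positions in the middle. A minimal sketch (fixed fragment length, no empirical length distribution, so simpler than the model in the talk):

```python
def coverage_profile(L, f):
    """Expected relative coverage at each 1-based position of a
    transcript of length L, under uniform placement of a fragment of
    fixed length f (all L - f + 1 start positions equally likely)."""
    starts = L - f + 1
    profile = []
    for x in range(1, L + 1):
        # Count start positions s with s <= x <= s + f - 1.
        lo = max(1, x - f + 1)
        hi = min(starts, x)
        profile.append(max(0, hi - lo + 1) / starts)
    return profile
```

For L = 100 and f = 20, the first and last bases are covered by only one of the 81 possible placements, while interior bases are covered by 20 of them: the expected profile ramps up, plateaus, and ramps down, with the plateau height controlled by the fragment-to-transcript length ratio, as the abstract describes.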

Use of epidemic models in optimal decision making 15:00 Thu 19 Nov, 2015 :: Ingkarni Wardli 5.57 :: Tim Kinyanjui :: School of Mathematics, The University of Manchester
Epidemic models have proved useful in a number of applications in epidemiology. In this work, I will present two areas where we have used modelling to make informed decisions. Firstly, we have used an age-structured mathematical model to describe the transmission of Respiratory Syncytial Virus in a developed-country setting and to explore different vaccination strategies. We found that delayed infant vaccination has significant potential to reduce the number of hospitalisations in the most vulnerable group, and that most of the reduction is due to indirect protection. It also suggests that marked public health benefit could be achieved through an RSV vaccine delivered to age groups not seen as most at risk of severe disease. The second application is in the optimal design of studies aimed at the collection of household-stratified infection data. A design decision involves making a trade-off between the number of households to enrol and the sampling frequency. Two commonly used study designs are considered: cross-sectional and cohort. The search for an optimal design uses Bayesian methods to explore the joint parameter-design space, combined with the Shannon entropy of the posteriors to estimate the amount of information for each design. We found that for the cross-sectional designs the amount of information increases with the sampling intensity, while the cohort design often exhibits a trade-off between the number of households sampled and the intensity of follow-up. Our results broadly support the choices made in existing data collection studies.

Group meeting 15:10 Fri 20 Nov, 2015 :: Ingkarni Wardli B17 :: Mr Jack Keeler :: University of East Anglia / University of Adelaide
Title: Stability of freesurface flow over topography
Abstract: The forced KdV equation is used as a model to analyse the wave behaviour on the free surface in response to prescribed topographic forcing. The research involves computing steady solutions using numerical and asymptotic techniques, and then analysing the stability of these steady solutions in time-dependent calculations. Stability is analysed by computing the eigenvalue spectra of the linearised fKdV operator and by exploiting the Hamiltonian structure of the fKdV. Future work includes analysing the solution space for a corrugated topography and investigating the three-dimensional problem using the KP equation.
+ Any items for group discussion 
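The eigenvalue computation mentioned in the abstract can be sketched numerically. As a hedged stand-in for the linearised fKdV operator (whose forcing terms are not given here), the snippet discretises the KdV equation linearised about the trivial state, du/dt = -d³u/dx³, on a periodic grid and confirms that the spectrum is purely imaginary, i.e. neutrally stable; the grid size and domain are arbitrary choices.

```python
import numpy as np

n, length = 64, 2 * np.pi
dx = length / n

# Central-difference third derivative on a periodic grid:
# f'''(x_i) ~ (-f[i-2] + 2 f[i-1] - 2 f[i+1] + f[i+2]) / (2 dx^3)
D3 = np.zeros((n, n))
for i in range(n):
    D3[i, (i - 2) % n] = -0.5
    D3[i, (i - 1) % n] = 1.0
    D3[i, (i + 1) % n] = -1.0
    D3[i, (i + 2) % n] = 0.5
D3 /= dx**3

A = -D3                               # du/dt = A u, linearised KdV about u = 0
eigs = np.linalg.eigvals(A)
max_growth_rate = eigs.real.max()     # a positive value would signal instability
```

The discretised operator is skew-symmetric, so its eigenvalues sit on the imaginary axis; a nontrivial steady fKdV solution would replace A with the actual linearised operator and could have genuinely unstable modes.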


A SemiMarkovian Modeling of Limit Order Markets 13:00 Fri 11 Dec, 2015 :: Ingkarni Wardli 5.57 :: Anatoliy Swishchuk :: University of Calgary
R. Cont and A. de Larrard (SIAM J. Financial Mathematics, 2013) introduced a tractable stochastic model for the dynamics of a limit order book, computing various quantities of interest such as the probability of a price increase or the diffusion limit of the price process. As suggested by empirical observations, we extend their framework to allow 1) arbitrary distributions for the inter-arrival times of book events (possibly non-exponential) and 2) both the nature of a new book event and its corresponding inter-arrival time to depend on the nature of the previous book event. We do so by resorting to Markov renewal processes to model the dynamics of the bid and ask queues. We retain analytical tractability via explicit expressions for the Laplace transforms of various quantities of interest. Our approach is justified and illustrated by calibrating the model to the five stocks Amazon, Apple, Google, Intel and Microsoft on June 21st, 2012. As in Cont and de Larrard, the bid-ask spread remains constant, equal to one tick; only the bid and ask queues are modelled (they are independent of each other and get reinitialized after a price change); and all orders have the same size. (This talk is based on our joint paper with Nelson Vadori (Morgan Stanley).) 
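A Markov renewal process of the kind described can be simulated directly: the type of the next book event and its sojourn-time distribution both depend on the previous event type. Everything below is hypothetical for illustration, including the event types, the transition probabilities, and the choice of Weibull (non-exponential) inter-arrival times.

```python
import random

EVENTS = ["limit", "market", "cancel"]
# P(next event type | previous event type) -- invented values.
TRANS = {
    "limit":  [0.5, 0.3, 0.2],
    "market": [0.4, 0.4, 0.2],
    "cancel": [0.6, 0.2, 0.2],
}
# Non-exponential sojourn times: Weibull shape depends on the previous event.
SHAPE = {"limit": 0.8, "market": 1.5, "cancel": 1.0}

def simulate(n_events, seed=0):
    """Return [(arrival_time, event_type), ...] for a Markov renewal process."""
    rng = random.Random(seed)
    state, t, path = "limit", 0.0, []
    for _ in range(n_events):
        nxt = rng.choices(EVENTS, weights=TRANS[state])[0]
        t += rng.weibullvariate(1.0, SHAPE[state])   # sojourn depends on state
        path.append((t, nxt))
        state = nxt
    return path

path = simulate(1000)
```

Setting every shape parameter to 1 recovers exponential inter-arrivals, i.e. the Markovian special case of Cont and de Larrard.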

How predictable are you? Information and happiness in social media. 12:10 Mon 21 Mar, 2016 :: Ingkarni Wardli Conference Room 715 :: Dr Lewis Mitchell :: School of Mathematical Sciences
The explosion of "Big Data" coming from online social networks and the like has opened up the new field of "computational social science", which applies a quantitative lens to problems traditionally in the domain of psychologists, anthropologists and social scientists. What does it mean to be influential? How do ideas propagate amongst populations? Is happiness contagious? For the first time, mathematicians, statisticians, and computer scientists can provide insight into these and other questions. Using data from social networks such as Facebook and Twitter, I will give an overview of recent research trends in computational social science, describe some of my own work using techniques like sentiment analysis and information theory in this realm, and explain how you can get involved with this highly rewarding research field as well.
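One information-theoretic measure in the spirit of the talk (though not necessarily the speaker's own method) is the Shannon entropy of a user's word-frequency distribution as a crude "predictability" score: lower entropy means more repetitive, hence more predictable, text. The sample strings are invented.

```python
from collections import Counter
import math

def word_entropy(text):
    """Shannon entropy (bits) of the word-frequency distribution of a text."""
    counts = Counter(text.lower().split())
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

repetitive = "good morning good morning good morning"
varied = "breakfast run meeting lunch seminar gym dinner"

# The repetitive user has lower entropy, i.e. is more predictable.
```

Real studies would work with much larger corpora and typically with entropy-rate estimators that account for word order, which this unigram sketch ignores.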


Mathematical modelling of the immune response to influenza 15:00 Thu 12 May, 2016 :: Ingkarni Wardli B20 :: Ada Yan :: University of Melbourne
The immune response plays an important role in the resolution of primary influenza infection and the prevention of subsequent infection in an individual. However, the relative roles of each component of the immune response in clearing infection, and the effects of interaction between components, are not well quantified.
We have constructed a model of the immune response to influenza based on data from viral interference experiments, where ferrets were exposed to two influenza strains within a short time period. The changes in the viral kinetics of the second virus due to the first virus depend on the strains used as well as the interval between exposures, enabling inference of the timing of innate and adaptive immune response components and the role of cross-reactivity in resolving infection. Our model provides a mechanistic explanation for the observed variation in viruses' abilities to protect against subsequent infection at short inter-exposure intervals, either by delaying the second infection or by inducing stochastic extinction of the second virus. It also explains the decrease in recovery time for the second infection when the two strains elicit cross-reactive cellular adaptive immune responses. To account for inter-subject as well as inter-virus variation, the model is formulated using a hierarchical framework. We will fit the model to experimental data using Markov chain Monte Carlo methods; quantification of the model will enable a deeper understanding of the effects of potential new treatments.
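The MCMC fitting step mentioned above can be illustrated with a minimal random-walk Metropolis sampler. This is a hedged sketch only: the "model" is a toy one-parameter normal likelihood with invented data and a flat improper prior, not the actual hierarchical immune-response model.

```python
import math, random

data = [2.1, 1.9, 2.3, 2.0, 1.8]            # invented observations

def log_likelihood(theta):
    # iid N(theta, 1) toy likelihood; flat prior, so this is the log target
    return -0.5 * sum((x - theta) ** 2 for x in data)

def metropolis(n_iter, step=0.5, seed=1):
    """Random-walk Metropolis: propose theta + N(0, step), accept by MH rule."""
    rng = random.Random(seed)
    theta, samples = 0.0, []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0, step)
        if math.log(rng.random()) < log_likelihood(prop) - log_likelihood(theta):
            theta = prop                      # accept the proposal
        samples.append(theta)                 # else keep the current state
    return samples

samples = metropolis(5000)
posterior_mean = sum(samples[1000:]) / len(samples[1000:])   # discard burn-in
```

For this toy target the posterior is N(sample mean, 1/5), so the chain's post-burn-in average should sit close to the data mean.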


Behavioural Microsimulation Approach to Social Policy and Behavioural Economics 15:10 Fri 20 May, 2016 :: S112 Engineering South :: Dr Drew Mellor :: Ernst & Young
SIMULAIT is a general-purpose behavioural microsimulation system designed to predict behavioural trends in human populations. This type of predictive capability grew out of original research initially conducted in conjunction with the Defence Science and Technology Organisation (DSTO) in South Australia, and has been fully commercialised and is in current use by a global customer base. To our customers, the principal value of the system lies in its ability to predict likely outcomes of scenarios that challenge conventional approaches based on extrapolation or generalisation. These types of scenarios include: the impact of disruptive technologies, such as the widespread adoption of autonomous vehicles for transportation or batteries for household energy storage; and the impact of policy elements or interventions, such as the impact of imposing water usage restrictions.
SIMULAIT employs a multidisciplinary methodology, drawing from agent-based modelling, behavioural science and psychology, microeconomics, artificial intelligence, simulation, game theory, engineering, mathematics and statistics. In this seminar, we start with a high-level view of the system, followed by a look under the hood to see how the various elements come together to answer questions about behavioural trends. The talk will conclude with a case study of a recent application of SIMULAIT to a significant policy problem: how to address the deficiency of STEM-skilled teachers in the Victorian teaching workforce. 

Some free boundary value problems in mean curvature flow and fully nonlinear curvature flows 12:10 Fri 27 May, 2016 :: Eng & Maths EM205 :: Valentina Wheeler :: University of Wollongong
In this talk we present an overview of current research in mean curvature flow and fully nonlinear curvature flows with free boundaries, with particular focus on our own results. First we consider the scenario of a mean curvature flow solution with a ninety-degree angle condition on a fixed hypersurface in Euclidean space, which we call the contact hypersurface. We prove that under restrictions on either the initial hypersurface (such as rotational symmetry) or on the contact hypersurface, the flow exists for all times and converges to a self-similar solution. We also discuss the possibility of a curvature singularity appearing on the free boundary contained in the contact hypersurface. We extend some of these results to the setting of a hypersurface evolving in its normal direction with speed given by a fully nonlinear functional of the principal curvatures.


Time series analysis of paleoclimate proxies (a mathematical perspective) 15:10 Fri 27 May, 2016 :: Engineering South S112 :: Dr Thomas Stemler :: University of Western Australia
In this talk I will present the work that my colleagues from the School of Earth and Environment (UWA), the "trans-disciplinary methods" group of the Potsdam Institute for Climate Impact Research, Germany, and I did to explain the dynamics of the Australian-South East Asian monsoon system during the last couple of thousand years.
From a time series perspective, paleoclimate proxy series are more or less the monsters moving under your bed that wake you up in the middle of the night. The data are clearly non-stationary and non-uniformly sampled in time, and the influence of stochastic forcing and the level of measurement noise are more or less unknown. Given these undesirable properties, almost all traditional time series analysis methods fail.
I will highlight two methods that allow us to draw useful conclusions from the data sets. The first uses Gaussian kernel methods to reconstruct climate networks from multiple proxies. The coupling relationships in these networks change over time and can therefore be used to infer which areas of the monsoon system dominate the complex dynamics of the whole system. Secondly, I will introduce the transformation cost time series method, which allows us to detect changes in the dynamics of a non-uniformly sampled time series. Unlike the frequently used interpolation approach, our new method does not corrupt the data and therefore avoids biases in any subsequent analysis. While I will again focus on paleoclimate proxies, the method can be used in other applied areas where regular sampling is not possible.
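A Gaussian-kernel approach to irregularly sampled series can be sketched as follows; this is only a generic Nadaraya-Watson smoother followed by Pearson correlation, not necessarily the talk's exact method. The two "proxies" are invented samples of the same sine signal taken at different irregular times.

```python
import math

def kernel_interp(times, values, grid, h):
    """Nadaraya-Watson smoother: Gaussian-weighted average onto `grid`."""
    out = []
    for g in grid:
        w = [math.exp(-0.5 * ((t - g) / h) ** 2) for t in times]
        out.append(sum(wi * v for wi, v in zip(w, values)) / sum(w))
    return out

def correlation(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

# Two proxies: irregular, non-coincident sample times of the same signal.
t1 = [0.1, 0.9, 2.2, 3.1, 4.4, 5.0, 5.9]
t2 = [0.3, 1.5, 2.0, 3.8, 4.1, 5.5, 6.1]
x = [math.sin(t) for t in t1]
y = [math.sin(t) for t in t2]

grid = [i * 0.5 for i in range(13)]        # common grid over [0, 6]
r = correlation(kernel_interp(t1, x, grid, 0.7),
                kernel_interp(t2, y, grid, 0.7))
```

Because both series are smoothed onto the same grid before correlating, no interpolation of the raw samples is needed; the bandwidth h = 0.7 is an arbitrary choice.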


SIR epidemics with stages of infection 12:10 Wed 28 Sep, 2016 :: EM218 :: Matthieu Simon :: Universite Libre de Bruxelles
This talk is concerned with a stochastic model for the spread of an epidemic in a closed, homogeneously mixing population. The population is subdivided into three classes of individuals: the susceptibles, the infectives and the removed cases. In short, an infective remains infectious during a random period of time. While infected, it can contact all the susceptibles present, independently of the other infectives. At the end of the infectious period, it becomes a removed case and has no further part in the infection process.
We represent an infectious period as a set of different stages that an infective can go through before being removed. The transitions between stages are ruled by either a Markov process or a semi-Markov process. In each stage, an infective makes contaminations at the epochs of a Poisson process with a specific rate.
Our purpose is to derive closed-form expressions for a transform of different statistics related to the end of the epidemic, such as the final number of susceptibles and the area under the trajectories of all the infectives. The analysis is performed using simple matrix-analytic methods and martingale arguments. Numerical illustrations will be provided at the end of the talk. 
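The "final number of susceptibles" statistic can be illustrated by simulation. The talk's model allows general semi-Markov infectious stages; as a minimal runnable sketch we simulate only the classical Markovian special case (a single exponential stage), using the standard next-event decomposition. All parameter values are invented.

```python
import random

def final_size(n, beta, gamma, seed):
    """Final number of susceptibles in a Markovian stochastic SIR epidemic."""
    rng = random.Random(seed)
    s, i = n - 1, 1                      # one initial infective
    while i > 0:
        infection_rate = beta * s * i / n
        removal_rate = gamma * i
        total = infection_rate + removal_rate
        # Next event is an infection with probability infection_rate / total.
        if rng.random() < infection_rate / total:
            s, i = s - 1, i + 1          # a susceptible becomes infective
        else:
            i -= 1                       # an infective is removed
    return s

sizes = [final_size(200, beta=2.0, gamma=1.0, seed=k) for k in range(50)]
```

With these rates R0 = 2, so the replicate final sizes split into minor outbreaks (most susceptibles untouched) and major ones, the bimodality that transform methods like those in the talk capture exactly.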

Transmission Dynamics of Visceral Leishmaniasis: designing a test and treat control strategy 12:10 Thu 29 Sep, 2016 :: EM218 :: Graham Medley :: London School of Hygiene & Tropical Medicine
Visceral Leishmaniasis (VL) is targeted for elimination from the Indian Sub-Continent. Progress has been much better in some areas than others. Current control is based on earlier diagnosis and treatment and on insecticide spraying to reduce the density of the vector. There is a surprising dearth of specific information on the epidemiology of VL, which makes modelling more difficult. In this seminar, I describe a simple framework that gives some insight into the transmission dynamics. We conclude that the majority of infection comes from cases prior to diagnosis. If this is the case, then early diagnosis will be advantageous but will require a test with high specificity. This is a paradox for many clinicians and public health workers, who tend to prioritise high sensitivity.
Medley, G.F., Hollingsworth, T.D., Olliaro, P.L. & Adams, E.R. (2015) Health-seeking, diagnostics and transmission in the control of visceral leishmaniasis. Nature 528, S102–S108 (3 December 2015), DOI: 10.1038/nature16042 

Leavitt path algebras 12:10 Fri 2 Dec, 2016 :: Engineering & Math EM213 :: Roozbeh Hazrat :: Western Sydney University
From a directed graph one can generate an algebra which captures the movements along the graph. One such algebra is the Leavitt path algebra.
Despite being introduced only 10 years ago, Leavitt path algebras have arisen in a variety of contexts as diverse as analysis, symbolic dynamics, noncommutative geometry and representation theory. In fact, Leavitt path algebras are the algebraic counterpart of graph C*-algebras, a theory which has become an area of intensive research globally. There are strikingly parallel similarities between these two theories. Even more surprisingly, one cannot (yet) obtain the results in one theory as a consequence of the other; the statements look the same, but the techniques to prove them are quite different (as the names suggest, one uses Algebra and the other Analysis). All of this suggests that there might be a bridge between Algebra and Analysis yet to be uncovered.
In this talk, we introduce Leavitt path algebras and try to classify them by means of (graded) Grothendieck groups. We will ask nice questions!


Fast approximate inference for arbitrarily large statistical models via message passing 15:10 Fri 17 Mar, 2017 :: Engineering South S111 :: Prof Matt Wand :: University of Technology Sydney
We explain how the notion of message passing can be used to streamline the algebra and computer coding for fast approximate inference in large Bayesian statistical models. In particular, this approach is amenable to handling arbitrarily large models of particular types once a set of primitive operations is established. The approach is founded upon a message passing formulation of mean field variational Bayes that utilizes factor graph representations of statistical models. The notion of factor graph fragments is introduced and is shown to facilitate compartmentalization of the required algebra and coding. 
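The flavour of mean field variational Bayes can be shown on the textbook normal-gamma example (unknown mean and precision with a factorised approximation q(mu)q(tau)), which is the simplest instance of the coordinate-ascent updates that message passing organises at scale. The data and hyperparameters below are generic illustrative choices, not from the talk.

```python
data = [4.8, 5.1, 5.3, 4.9, 5.2, 5.0, 4.7, 5.4]
n = len(data)
xbar = sum(data) / n

mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3    # vague conjugate priors
e_tau = 1.0                                   # initial guess for E[tau]

for _ in range(50):                           # coordinate-ascent (CAVI) loop
    # q(mu) = N(mu_n, 1/lam_n), given the current E[tau]
    mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
    lam_n = (lam0 + n) * e_tau
    # q(tau) = Gamma(a_n, b_n), given the current q(mu)
    a_n = a0 + (n + 1) / 2
    e_sq = sum((x - mu_n) ** 2 for x in data) + n / lam_n
    b_n = b0 + 0.5 * (e_sq + lam0 * ((mu_n - mu0) ** 2 + 1 / lam_n))
    e_tau = a_n / b_n                         # updated E[tau] feeds back in

posterior_mean = mu_n
```

With vague priors the variational posterior mean lands essentially on the sample mean, and 1/E[tau] approaches the sample variance; in the factor-graph view each update is a message between the mu and tau factors.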

What are operator algebras and what are they good for? 15:10 Fri 12 May, 2017 :: Engineering South S111 :: Prof Aidan Sims :: University of Wollongong
Back in the early 1900s when people were first grappling with the new ideas of quantum mechanics and looking for mathematical techniques to study them, they found themselves, unavoidably, dealing with what have now become known as operator algebras. As a research area, operator algebras has come a very long way since then, and has spread out to touch on many other areas of mathematics, as well as maintaining its links with mathematical physics. I'll try to convey roughly what operator algebras are, and describe some of the highlights of their career thus far, particularly the more recent ones. 

Mathematics is Biology's Next Microscope (Only Better!) 15:10 Fri 11 Aug, 2017 :: Ingkarni Wardli B17 :: Dr Robyn Araujo :: Queensland University of Technology
While mathematics has long been considered "an essential tool for physics", the foundations of biology and the life sciences have received significantly less influence from mathematical ideas and theory. In this talk, I will give a brief discussion of my recent research on robustness in molecular signalling networks, as an example of a complex biological question that calls for a mathematical answer. In particular, it has been a long-standing mystery how the extraordinarily complex communication networks inside living cells, comprising thousands of different interacting molecules, are able to function robustly, since complexity is generally associated with fragility. Mathematics has now suggested a resolution to this paradox through the discovery that robust adaptive signalling networks must be constructed from just a small number of well-defined universal modules (or "motifs") connected together. The existence of these newly discovered modules has important implications for evolutionary biology, embryology and development, cancer research, and drug development. 


Stochastic Modelling of Urban Structure 11:10 Mon 20 Nov, 2017 :: Engineering Nth N132 :: Mark Girolami :: Imperial College London, and The Alan Turing Institute
Urban systems are complex in nature and comprise a large number of individuals who act according to utility, a measure of net benefit pertaining to preferences. The actions of individuals give rise to an emergent behaviour, creating the so-called urban structure that we observe. In this talk, I develop a stochastic model of urban structure to formally account for uncertainty arising from this complex behaviour. We further use this stochastic model to infer the components of a utility function from observed urban structure. This is a more powerful modelling framework than the ubiquitous discrete choice models, which are of limited use for complex systems in which the overall preferences of individuals are difficult to ascertain. We model urban structure as a realization of a Boltzmann distribution that is the invariant distribution of a related stochastic differential equation (SDE) describing the dynamics of the urban system. Our specification of the Boltzmann distribution assigns higher probability to stable configurations, in the sense that consumer surplus (demand) is balanced with running costs (supply), as characterized by a potential function. We specify a Bayesian hierarchical model to infer the components of a utility function from observed structure. Our model is doubly intractable and poses significant computational challenges that we overcome using recent advances in Markov chain Monte Carlo (MCMC) methods. We demonstrate our methodology with case studies on the London retail system and airports in England. 
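The core Boltzmann idea, higher probability for configurations whose potential is low (demand balanced with supply), can be shown on a toy discrete example. The configurations and the quadratic imbalance potential below are invented for illustration; the talk's model works over continuous configurations via an SDE and MCMC.

```python
import math

# Hypothetical "urban configurations": (demand, running cost) pairs.
configs = {"A": (10, 9), "B": (10, 4), "C": (10, 10)}

def potential(demand, cost):
    """Invented potential: zero when demand and cost are perfectly balanced."""
    return (demand - cost) ** 2

# Boltzmann weights exp(-V) and normalisation by the partition function.
weights = {k: math.exp(-potential(*v)) for k, v in configs.items()}
z = sum(weights.values())
probs = {k: w / z for k, w in weights.items()}
most_likely = max(probs, key=probs.get)
```

Configuration C is perfectly balanced, so it receives the highest probability; badly imbalanced configurations are exponentially suppressed.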

Models, machine learning, and robotics: understanding biological networks 15:10 Fri 16 Mar, 2018 :: Horace Lamb 1022 :: Prof Steve Oliver :: University of Cambridge
The availability of complete genome sequences has enabled the construction of computer models of metabolic networks that may be used to predict the impact of genetic mutations on growth and survival. Both logical and constraintbased models of the metabolic network of the model eukaryote, the ale yeast Saccharomyces cerevisiae, have been available for some time and are continually being improved by the research community. While such models are very successful at predicting the impact of deleting single genes, the prediction of the impact of higher order genetic interactions is a greater challenge. Initial studies of limited gene sets provided encouraging results. However, the availability of comprehensive experimental data for the interactions between genes involved in metabolism demonstrated that, while the models were able to predict the general properties of the genetic interaction network, their ability to predict interactions between specific pairs of metabolic genes was poor. I will examine the reasons for this poor performance and demonstrate ways of improving the accuracy of the models by exploiting the techniques of machine learning and robotics.
The utility of these metabolic models rests on the firm foundations of genome sequencing data. However, there are two major problems with these kinds of network models: they have no dynamics, and they do not deal with the uncertain and incomplete nature of much biological data. To deal with these problems, we have developed the Flexible Nets (FNs) modelling formalism. FNs were inspired by Petri Nets and can deal with missing or uncertain data, incorporate both dynamics and regulation, and also have the potential for model predictive control of biotechnological processes.

News matching "Operations research" 
Summer Research Scholarship Applications NOW OPEN Applications for AMSI Vacation Scholarships and Adelaide Summer Research Scholarships are now OPEN.
AMSI Vacation Scholarships: Closing date Friday 16th September.
http://vrs.amsi.org.au/
University of Adelaide Summer Research Scholarships: Closing date Friday 7th October.
http://www.ecms.adelaide.edu.au/scholarships/summer/
Please submit all Adelaide applications to the School of Mathematical Sciences.
Posted Wed 30 Nov 1.More information... 

Dr Yvonne Stokes wins Michell Medal Dr Yvonne Stokes (Applied Mathematics) was awarded the 2007 J.H. Michell Medal of ANZIAM. The award is made annually to an outstanding new researcher, one who is in the first ten years of their research career. Read Yvonne's citation here. Posted Mon 5 Mar 07. 

ARC success The School of Mathematical Sciences was again very successful in attracting Australian Research Council funding for 2008. Recipients of ARC Discovery Projects are (with staff from the School highlighted):
Prof NG Bean; Prof PG Howlett; Prof CE Pearce; Prof SC Beecham; Dr AV Metcalfe; Dr JW Boland:
WaterLog  A mathematical model to implement recommendations of The Wentworth Group.
2008-2010: $645,000
Prof RJ Elliott:
Dynamic risk measures.
(Australian Professorial Fellowship)
2008-2012: $897,000
Dr MD Finn:
Topological Optimisation of Fluid Mixing.
2008-2010: $249,000
Prof PG Bouwknegt; Prof M Varghese; A/Prof S Wu:
Dualities in String Theory and Conformal Field Theory in the context of the Geometric Langlands Program.
2008-2010: $240,000
The latter grant is held through the ANU. Posted Wed 26 Sep 07. 

Potts Medal Winner Professor Charles Pearce, the Elder Professor of Mathematics, was awarded the Ren Potts Medal by the Australian Society for Operations Research at its annual meeting in December. This is a national award for outstanding contributions to Operations Research in Australia.
Posted Tue 22 Jan 08. 

Australian Research Council Discovery Project Successes Congratulations to the following members of the School for their success in the ARC Discovery Grants, which were announced recently.
A/Prof M Roughan; Prof H Shen: $315K, Network Management in a World of Secrets
Prof AJ Roberts; Dr D Strunin: $315K, Effective and accurate model dynamics, deterministic and stochastic, across multiple space and time scales
A/Prof J Denier; Prof AP Bassom: $180K, A novel approach to controlling boundary-layer separation
Posted Wed 17 Sep 08. 

Three postdoc positions advertised The School of Mathematical Sciences is seeking to appoint three postdoctoral research associates. These positions have now closed. Posted Wed 29 Jul 09. 

Welcome to Dr Joshua Ross We welcome Dr Joshua Ross as a new lecturer in the School of Mathematical Sciences. Joshua has moved over to Adelaide from the University of Cambridge. His research interests are mathematical modelling (especially mathematical biology) and operations research. Posted Mon 15 Mar 10.More information... 

Go8-Germany Research Cooperation Scheme Congratulations to Thomas Leistner, whose application under the Go8-Germany Research Cooperation Scheme is one of 24 across Australia to be funded in 2011-2012. Thomas will work with Professor Helga Baum of Humboldt University in Berlin on spinor field equations in global Lorentzian geometry. Posted Thu 4 Nov 10. 

Bushfire CRC postgraduate scholarship success Congratulations to Mika Peace, who has been awarded a PhD scholarship from the Bushfire Cooperative Research Centre. Mika is working with Trent Mattner and Graham Mills (from the Bureau of Meteorology) on coupled fire-weather modelling. Posted Wed 6 Apr 11. 

ARC Grant Success Congratulations to the following staff who were successful in securing funding from the Australian Research Council Discovery Projects scheme: Associate Professor Finnur Larusson, awarded $270,000 for his project Flexibility and symmetry in complex geometry; Dr Thomas Leistner, awarded $303,464 for his project Holonomy groups in Lorentzian geometry; Professor Michael Murray and Dr Daniel Stevenson (Glasgow), awarded $270,000 for their project Bundle gerbes: generalisations and applications; Professor Mathai Varghese, awarded $105,000 for his project Advances in index theory; and Professor Anthony Roberts and Professor Ioannis Kevrekidis (Princeton), awarded $330,000 for their project Accurate modelling of large multiscale dynamical systems for engineering and scientific simulation and analysis. Posted Tue 8 Nov 11. 

More ARC Grant Success The School has followed up its success in the ARC Discovery Project scheme with a Future Fellowship awarded to Dr Thomas Leistner and DECRA awards to Dr Pedram Hekmati and Dr Robert Yuncken. This brings the total number of ARC research grants awarded to the School in 2011 to 8, which amounts to 14% of the total ARC grants awarded to the University of Adelaide in 2011. Posted Tue 15 Nov 11. 

The mathematical implications of gauge-string dualities Between Monday 5 and Friday 9 March 2012, the Institute for Geometry and its Applications will host a lecture series by Rajesh Gopakumar from the Harish-Chandra Research Institute. These lectures will be supplemented by talks by other invited speakers. Posted Tue 6 Dec 11. More information... 

Two contract positions are available As a result of the School's success in securing two prestigious Australian Research Council Future Fellowships, we now have two limited term positions available, one in Pure Mathematics and one in Statistics. Posted Wed 14 Dec 11. 

Summer Research Student Thomas Brown wins the AMSI/Cambridge University Press Prize for 2013 Congratulations to Thomas Brown, jointly supervised by Ed Green and Ben Binder, who won the AMSI/Cambridge University Press Prize for the best talk at the 2013 CSIRO Big Day In, held earlier this month.
After completion of their summer project, vacation scholars must submit a project report which summarises the project and addresses the nature of the topic, methods of investigation, results found, and benefits of the experience. The scholars then present a 15minute presentation about their project at the CSIRO Big Day In (BDI). This experience enables students to meet and socialise with their peers, gain experience presenting to their colleagues and supervisors and learn about a range of careers in science by interacting with several CSIRO scientists (including mathematicians) in a discussion panel.
This is a very pleasing result for Thomas, Ed and Ben as well as for the School of Mathematical Sciences. Well done Thomas.
Posted Fri 15 Feb 13. 

Summer Research Scholarship Applications NOW OPEN Please refer HERE for a list of possible Summer Research Topics.
AMSI Vacation Scholarships: Closing date Friday 16th September.
http://vrs.amsi.org.au/
University of Adelaide Summer Research Scholarships: Closing date Friday 7th October.
http://www.ecms.adelaide.edu.au/scholarships/summer/
Posted Fri 26 Aug 16.More information... 

A/Prof Joshua Ross, 2017 Moran Medal recipient Congratulations to Associate Professor Joshua Ross who has won the 2017 Moran Medal, awarded by the Australian Academy of Science.
The Moran Medal recognises outstanding research by scientists up to 10 years post-PhD in applied probability, biometrics, mathematical genetics, psychometrics and statistics.
Associate Professor Ross has made influential contributions to public health and conservation biology using mathematical modelling and statistics to help in decision making.
Posted Fri 23 Dec 16.More information... 

ARC grant recipients The School of Mathematical Sciences wishes to warmly congratulate the school recipients of the latest ARC grant round which was announced on Tuesday, 1 November. These grants include 1 Future Fellowship (Y. Stokes), 1 Discovery Early Career Research Award (Guo Chuan Thiang) and 1 Discovery Project grant (Varghese, Baraglia).
Posted Fri 23 Dec 16. 

Elder Professor Mathai Varghese Awarded Australian Laureate Fellowship Professor Mathai Varghese, Elder Professor of Mathematics in the School of Mathematical Sciences, has been awarded an Australian Laureate Fellowship worth $1.64 million to advance Index Theory and its applications. The project is expected to enhance Australia's position at the forefront of international research in geometric analysis. Posted Thu 15 Jun 17. More information... 

Publications matching "Operations research" 

On risk minimizing portfolios under a Markovian regime-switching Black-Scholes economy. Elliott, Robert; Siu, T, Annals of Operations Research 1 (1–21) 2009
A mixer design for the pigtail braid. Binder, Benjamin; Cox, Stephen, Fluid Dynamics Research 40 (34–44) 2008
A space-time Neyman-Scott rainfall model with defined storm extent. Leonard, Michael; Lambert, Martin; Metcalfe, Andrew; Cowpertwait, P, Water Resources Research 44 (9402) 2008
Assessing the potential usefulness of IGF-related peptides and adiponectin for predicting disease risk. Belobrajdic, Damien; Priebe, I; Forbes, Briony; Flyvbjerg, A; Chen, J; Cosgrove, L; Frystyk, J; Saunders, Ian, Growth Hormone & IGF Research 18 (198–204) 2008
Markovian trees: properties and algorithms. Bean, Nigel; Kontoleon, Nectarios; Taylor, Peter, Annals of Operations Research 160 (31–50) 2008
Performance measures of a multi-layer Markovian fluid model. Bean, Nigel; O'Reilly, Malgorzata, Annals of Operations Research 160 (99–120) 2008
Microarray gene expression profiling of osteoarthritic bone suggests altered bone remodelling, WNT and transforming growth factor-beta/bone morphogenic protein signalling. Hopwood, Blair; Tsykin, Anna; Findlay, David; Fazzalari, Nicola, Arthritis Research & Therapy 9 (WWW 1–WWW 21) 2007
Efficient simulation of a space-time Neyman-Scott rainfall model. Leonard, Michael; Metcalfe, Andrew; Lambert, Martin, Water Resources Research 42 (11503) 2006
Formal adjoints and canonical form for linear operations. Eastwood, Michael; Gover, A, Conformal Geometry and Dynamics 10 (285–287) 2006
Methodology in meta-analysis: a study from critical care meta-analytic practice. Moran, John; Solomon, Patricia; Warn, D, Health Services and Outcomes Research Methodology 5 (207–226) 2006
Lifting surfaces with circular planforms. Tuck, Ernest; Lazauskas, Leo, Journal of Ship Research 49 (274–278) 2005
Optimal recursive estimation of raw data. Torokhti, Anatoli; Howlett, P; Pearce, Charles, Annals of Operations Research 133 (285–302) 2005
The cross-entropy method for network reliability estimation. Hui, Kin-Ping; Bean, Nigel; Kraetzl, Miro; Kroese, D, Annals of Operations Research 134 (101–118) 2005
The effect of World War 1 and the 1918 influenza pandemic on cohort life expectancy of South Australian males born in 1881-1900. Leppard, Phillip; Tallis, George; Pearce, Charles, Journal of Population Research 21 (161–176) 2004
Arbitrage in a Discrete Version of the Wick-Fractional Black-Scholes Model. Bender, C; Elliott, Robert, Mathematics of Operations Research 29 (935–945) 2004
Numerical error in groundwater flow and solute transport simulation. Woods, Juliette; Teubner, Michael; Simmons, Craig; Narayan, K, Water Resources Research 39 (SBH 101–SBH 1011) 2003
Evidence for a Differential Cellular Distribution of Inward Rectifier K Channels in the Rat Isolated Mesenteric Artery. Crane, Glenis Jayne; Walker, S; Dora, K; Garland, C, Journal of Vascular Research 40 (159–168) 2003
Comparison of spinal myotatic reflexes in human adults investigated with cross-correlation and signal averaging methods. Miller, S; Clark, J; Eyre, J; Kelly, S; Lim, E; McClelland, V; McDonough, S; Metcalfe, Andrew, Brain Research 899 (47–65) 2001
On a generalized 2+1 dispersive water wave hierarchy. Gordoa, P; Joshi, Nalini; Pickering, A, Publications of the Research Institute for Mathematical Sciences 37 (327–347) 2001
Some new bounds for singular values and eigenvalues of matrix products. Lu, LZ; Pearce, Charles, Annals of Operations Research 98 (141–148) 2001
The maximum sinkage of a ship. Gourlay, Timothy; Tuck, Ernest, Journal of Ship Research 45 (50–58) 2001
Three-dimensional inviscid waves in buoyant boundary layer flows. Denier, James; Stott, Jillian; Bassom, A, Fluid Dynamics Research 28 (89–109) 2001
Meta-analysis, overviews and publication bias. Solomon, Patricia; Hutton, Jonathon, Statistical Methods in Medical Research 10 (245–250) 2001
Disease surveillance and data collection issues in epidemic modelling. Solomon, Patricia; Isham, V, Statistical Methods in Medical Research 9 (259–277) 2000
Disease surveillance and intervention studies in developing countries. Solomon, Patricia, Statistical Methods in Medical Research 9 (183–184) 2000 
Advanced search options
You may be able to improve your search results by using the following syntax:
Query  Matches the following 

Asymptotic Equation  Anything with "Asymptotic" or "Equation". 
+Asymptotic +Equation  Anything with "Asymptotic" and "Equation". 
+Stokes -"Navier-Stokes"  Anything containing "Stokes" but not "Navier-Stokes". 
Dynam*  Anything containing "Dynamic", "Dynamical", "Dynamicist" etc. 
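The query forms in the table can be implemented directly. The sketch below is one plausible reading of those semantics (the site's actual search engine is unknown): bare terms match if any of them appears, `+term` is required, a leading minus excludes a term or quoted phrase, and a trailing `*` is a prefix wildcard.

```python
import re

def matches(query, text):
    """Evaluate a simple search query against a piece of text."""
    low = text.lower()
    words = re.findall(r"\w+", low)
    required, excluded, optional = [], [], []
    # Tokens are optionally signed words or quoted phrases.
    for tok in re.findall(r'[+-]?"[^"]+"|[+-]?\S+', query.lower()):
        sign = ""
        if tok[0] in "+-":
            sign, tok = tok[0], tok[1:]
        term = tok.strip('"')
        bucket = excluded if sign == "-" else required if sign == "+" else optional
        bucket.append(term)

    def hit(term):
        if " " in term or "-" in term:           # quoted / hyphenated phrase
            return term in low
        if term.endswith("*"):                   # prefix wildcard
            return any(w.startswith(term[:-1]) for w in words)
        return term in words

    if any(hit(t) for t in excluded):
        return False
    if not all(hit(t) for t in required):
        return False
    return not optional or any(hit(t) for t in optional)
```

For example, `matches("dynam*", "dynamical systems")` is true, while a query excluding a phrase rejects any text containing that phrase even when the required terms are present.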
