
Search the School of Mathematical Sciences
Courses matching "At least four doors, numerous goats, a car, a frog" 
Applied Probability III Many processes in the real world involve some random variation superimposed on a deterministic structure. For example, the experiment of flipping a coin is best studied by treating the outcome as random. Mathematical probability has its origins in games of chance with dice and cards in the fifteenth and sixteenth centuries. This course aims to provide a basic toolkit for modelling and analysing discrete-time problems in which there is a significant probabilistic component. We will consider Markov chain examples in the course, including population branching processes (with application to genetics), random walks (with application to games), and more general discrete-time examples using martingales. Topics covered are: basic probability and measure theory, discrete-time Markov chains, hitting probabilities and hitting time theorems, population branching processes, inhomogeneous random walks on the line, solidarity properties and communicating classes, necessary and sufficient conditions for transience and positive recurrence, global balance, partial balance, reversibility, martingales, stopping times and stopping theorems with a link to Brownian motion.
More about this course... 
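The hitting probabilities mentioned above invite a quick experiment. The sketch below is my own illustration (function and variable names are invented, not course material): it estimates, for the classic gambler's-ruin random walk "with application to games", the probability of reaching a target fortune before ruin, which for a fair game is known to be start/target.

```python
import random

def hit_probability(start, target, trials=20000, seed=1):
    """Monte Carlo estimate of P(reach `target` before 0) for a fair
    +/-1 random walk started at `start`."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        x = start
        while 0 < x < target:
            x += 1 if rng.random() < 0.5 else -1
        wins += (x == target)
    return wins / trials

estimate = hit_probability(3, 10)   # exact hitting probability is 3/10
```

The analytic answer follows from solving the hitting-probability equations the course covers; the simulation simply checks it.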

Probability and Statistics Probability theory is the branch of mathematics that deals with modelling uncertainty. It is important because of its direct application in areas such as genetics, finance and telecommunications. It also forms the fundamental basis for many other areas in the mathematical sciences including statistics, modern optimisation methods and risk modelling. This course provides an introduction to probability theory, random variables and Markov processes. Topics covered are: probability axioms, conditional probability; Bayes' theorem; discrete random variables, moments, bounding probabilities, probability generating functions, standard discrete distributions; continuous random variables, uniform, normal, Cauchy, exponential, gamma and chi-square distributions, transformations, the Poisson process; bivariate distributions, marginal and conditional distributions, independence, covariance and correlation, linear combinations of two random variables, bivariate normal distribution; sequences of independent random variables, the weak law of large numbers, the central limit theorem; definition and properties of a Markov chain and probability transition matrices; methods for solving equilibrium equations, absorbing Markov chains.
More about this course... 
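Bayes' theorem, listed in the topics above, is easy to demonstrate numerically. A minimal sketch (the disease-testing numbers are invented for illustration):

```python
def posterior_positive(prior, sensitivity, specificity):
    """P(condition | positive test) via Bayes' theorem:
    P(C|+) = P(+|C)P(C) / [P(+|C)P(C) + P(+|not C)P(not C)]."""
    p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_positive

# A 99%-sensitive, 95%-specific test for a condition with 1% prevalence:
post = posterior_positive(0.01, 0.99, 0.95)   # exactly 1/6
```

Despite the "99% accurate" test, the posterior probability is only one in six, a standard illustration of why the prior matters.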
Events matching "At least four doors, numerous goats, a car, a frog" 
Queues with Advance Reservations 15:10 Fri 21 Sep, 2007 :: G04 Napier Building University of Adelaide :: Prof. Peter Taylor :: Department of Mathematics and Statistics, University of Melbourne
Queues where, on "arrival", customers make a reservation for service at some time in the future are endemic. However, there is surprisingly little about them in the literature. Simulations illustrate some interesting implications of the facility to make such reservations. For example, introducing independent and identically distributed reservation periods into an Erlang loss system can either increase or decrease the blocking probability from that given by Erlang's formula, despite the fact that the process of 'reserved arrivals' is still Poisson. In this talk we shall discuss a number of ways of looking at such queues. In particular, we shall obtain various transient and stationary distributions associated with the "bookings diary" for the infinite server system. However, this does not immediately answer the question of how to calculate the above-mentioned blocking probabilities. We shall conclude with a few suggestions as to how this calculation might be carried out. 
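Erlang's formula, which the abstract takes as the baseline, is straightforward to compute. This sketch (an aside of mine, not part of the talk) uses the standard numerically stable recursion for the Erlang B blocking probability.

```python
def erlang_b(offered_load, servers):
    """Blocking probability of an Erlang loss system (Erlang's B formula),
    computed with the standard stable recursion
    B(E, m) = E*B(E, m-1) / (m + E*B(E, m-1)), B(E, 0) = 1."""
    b = 1.0
    for m in range(1, servers + 1):
        b = offered_load * b / (m + offered_load * b)
    return b

blocking = erlang_b(2.0, 4)   # 2 Erlangs offered to 4 servers: exactly 2/21
```

Adding servers always lowers the blocking probability, which the recursion makes easy to verify.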

Another proof of Gaboriau-Popa 13:10 Fri 3 Jul, 2009 :: School Board Room :: Prof Greg Hjorth :: University of Melbourne
Gaboriau and Popa showed that a non-abelian free group on finitely many generators has continuum many measure-preserving, free, ergodic actions on standard Borel probability spaces. The original proof used the notion of property (T). I will sketch how this can be replaced by an elementary, and apparently new, dynamical property. 

The two envelope problem 12:10 Wed 11 Aug, 2010 :: Napier 210 :: A/Prof Gary Glonek :: University of Adelaide
Media...The two envelope problem is a long-standing paradox in
probability theory. Although its formulation has elements in common
with the celebrated Monty Hall problem, the underlying paradox is
apparently far more subtle. In this talk, the problem will be
explained and various aspects of the paradox will be discussed.
Connections to Bayesian inference and other areas of statistics will
be explored. 

A spatial-temporal point process model for fine resolution multi-site rainfall data from Roma, Italy 14:10 Thu 19 Aug, 2010 :: Napier G04 :: A/Prof Paul Cowpertwait :: Auckland University of Technology
A point process rainfall model is further developed that has storm origins occurring in space-time according to a Poisson process. Each storm origin has a random radius so that storms occur as circular regions in two-dimensional
space, where the storm radii are taken to be independent exponential random
variables. Storm origins are of random type z, where z follows a continuous
probability distribution. Cell origins occur in a further spatial Poisson
process and have arrival times that follow a Neyman-Scott point process. Cell
origins have random radii so that cells form discs in two-dimensional space.
Statistical properties up to third order are derived and used to fit the model
to 10 min series taken from 23 sites across the Roma region, Italy.
Distributional properties of the observed annual maxima are compared to
equivalent values sampled from series that are simulated using the fitted
model. The results indicate that the model will be of use in urban drainage
projects for the Roma region.


At least four doors, numerous goats, a car, a frog, four lily pads and some probability 11:10 Wed 13 Oct, 2010 :: Napier 210 :: Dr Joshua Ross :: University of Adelaide
Media...In the process of determining, amongst other things, the optimal strategy for playing a game show, and explaining the apparent persistence of a population that can be shown to die out with certainty, we will encounter a car, numerous goats, at least four doors, a frog, four lily pads, and some applied probability. 
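The game show in question is evidently of Monty Hall type. A quick simulation with the classic three doors (my own illustration; the talk's version has at least four) confirms the well-known advantage of switching.

```python
import random

def win_rate(switch, trials=20000, seed=7):
    """Estimate the probability of winning the car under the stick or
    switch strategy in the classic three-door Monty Hall game."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # The host opens a door hiding a goat, never the contestant's pick.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

p_switch = win_rate(switch=True)    # close to 2/3
p_stick = win_rate(switch=False)    # close to 1/3
```

Switching wins whenever the first pick was a goat, which happens with probability 2/3.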

Change detection in rainfall time series for Perth, Western Australia 12:10 Mon 16 May, 2011 :: 5.57 Ingkarni Wardli :: Farah Mohd Isa :: University of Adelaide
There have been numerous reports that the rainfall in south Western Australia,
particularly around Perth, has undergone a step-change decrease, which is
typically attributed to climate change. Four statistical tests are used to
assess the empirical evidence for this claim on time series from five
meteorological stations, all of which exceed 50 years in length. The tests used
in this study are: the CUSUM; Bayesian Change Point analysis; the consecutive
t-test; and Hotelling's T²-statistic. Results from the multivariate Hotelling's
T² analysis are compared with those from the three univariate analyses. The
issue of multiple comparisons is discussed. A summary of the empirical evidence
for the claimed step change in the Perth area is given. 
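Of the four tests named above, the CUSUM is the simplest to sketch. This toy version (not the authors' code; the data are synthetic) estimates a single step change as the index maximising the cumulative sum of deviations from the overall mean.

```python
def cusum_change_point(series):
    """Single change-point estimate via the CUSUM statistic: return the
    index maximising |S_i|, where S_i is the cumulative sum of deviations
    of the series from its overall mean."""
    mean = sum(series) / len(series)
    s, best_abs, best_index = 0.0, 0.0, 0
    for index, value in enumerate(series, start=1):
        s += value - mean
        if abs(s) > best_abs:
            best_abs, best_index = abs(s), index
    return best_index   # the change is estimated to occur after this point

rainfall = [10.0] * 30 + [7.0] * 30   # synthetic step-change decrease
change_at = cusum_change_point(rainfall)
```

On this synthetic series the statistic peaks exactly at the step, observation 30.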

Inference and optimal design for percolation and general random graph models (Part I) 09:30 Wed 8 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Andrei Bejan :: The University of Cambridge
The problem of optimal arrangement of the nodes of a random weighted graph
is discussed in this workshop. The nodes of the graphs under study are fixed, but
their edges are random and established according to a so-called
edge-probability function. This function is assumed to depend on the weights
attributed to the pairs of graph nodes (or the distances between them) and a
statistical parameter. It is the purpose of experimentation to make inference on
the statistical parameter and thus to extract as much information about it as
possible. We also distinguish between two different experimentation scenarios:
progressive and instructive designs.
We adopt a utility-based Bayesian framework to tackle the optimal design problem
for random graphs of this kind. Simulation-based optimisation methods, mainly
Monte Carlo and Markov chain Monte Carlo, are used to obtain the solution. We
study the optimal design problem for inference based on partial observations of
random graphs by employing a data augmentation technique. We prove that
infinitely growing or diminishing node configurations asymptotically represent
the worst node arrangements. We also obtain the exact solution to the optimal
design problem for proximity (geometric) graphs and a numerical solution for
graphs with threshold edge-probability functions.
We consider inference and optimal design problems for finite clusters from bond
percolation on the integer lattice $\mathbb{Z}^d$ and derive a range of both
numerical and analytical results for these graphs. We introduce inner-outer
plots by deleting some of the lattice nodes and show that the "mostly populated"
designs are not necessarily optimal in the case of incomplete observations under
both progressive and instructive design scenarios. Some of the obtained results
may generalise to other lattices. 

Inference and optimal design for percolation and general random graph models (Part II) 10:50 Wed 8 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Andrei Bejan :: The University of Cambridge
The problem of optimal arrangement of the nodes of a random weighted graph
is discussed in this workshop. The nodes of the graphs under study are fixed, but
their edges are random and established according to a so-called
edge-probability function. This function is assumed to depend on the weights
attributed to the pairs of graph nodes (or the distances between them) and a
statistical parameter. It is the purpose of experimentation to make inference on
the statistical parameter and thus to extract as much information about it as
possible. We also distinguish between two different experimentation scenarios:
progressive and instructive designs.
We adopt a utility-based Bayesian framework to tackle the optimal design problem
for random graphs of this kind. Simulation-based optimisation methods, mainly
Monte Carlo and Markov chain Monte Carlo, are used to obtain the solution. We
study the optimal design problem for inference based on partial observations of
random graphs by employing a data augmentation technique. We prove that
infinitely growing or diminishing node configurations asymptotically represent
the worst node arrangements. We also obtain the exact solution to the optimal
design problem for proximity (geometric) graphs and a numerical solution for
graphs with threshold edge-probability functions.
We consider inference and optimal design problems for finite clusters from bond
percolation on the integer lattice $\mathbb{Z}^d$ and derive a range of both
numerical and analytical results for these graphs. We introduce inner-outer
plots by deleting some of the lattice nodes and show that the "mostly populated"
designs are not necessarily optimal in the case of incomplete observations under
both progressive and instructive design scenarios. Some of the obtained results
may generalise to other lattices. 

Probability density estimation by diffusion 15:10 Fri 10 Jun, 2011 :: 7.15 Ingkarni Wardli :: Prof Dirk Kroese :: University of Queensland
Media...One of the beautiful aspects of Mathematics is that seemingly
disparate areas can often have deep connections. This talk is about
the fundamental connection between probability density estimation,
diffusion processes, and partial differential equations. Specifically,
we show how to obtain efficient probability density estimators by
solving partial differential equations related to diffusion processes.
This new perspective leads, in combination with Fast Fourier
techniques, to very fast and accurate algorithms for density
estimation. Moreover, the diffusion formulation unifies most of the
existing adaptive smoothing algorithms and provides a natural solution
to the boundary bias of classical kernel density estimators. This talk
covers topics in Statistics, Probability, Applied Mathematics, and
Numerical Mathematics, with a surprise appearance of the theta
function. This is joint work with Zdravko Botev and Joe Grotowski. 
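For contrast with the diffusion-based estimator described above, here is the classical Gaussian kernel density estimator whose bandwidth selection and boundary bias the talk's method improves upon. This is a sketch with Silverman's rule-of-thumb bandwidth, not the authors' algorithm; the sample values are invented.

```python
import math

def gaussian_kde(data, x, bandwidth=None):
    """Classical Gaussian kernel density estimate at a point x, using
    Silverman's rule-of-thumb bandwidth 1.06 * sd * n^(-1/5) by default."""
    n = len(data)
    if bandwidth is None:
        mean = sum(data) / n
        sd = math.sqrt(sum((d - mean) ** 2 for d in data) / n)
        bandwidth = 1.06 * sd * n ** (-1 / 5)
    norm = n * bandwidth * math.sqrt(2.0 * math.pi)
    return sum(math.exp(-0.5 * ((x - d) / bandwidth) ** 2) for d in data) / norm

sample = [0.2, 0.5, 0.9, 1.3, 2.0]
density_at_one = gaussian_kde(sample, 1.0)
```

Like any density estimate, it integrates to one; the diffusion formulation can be viewed as evolving this estimate further in "bandwidth time".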

Configuration spaces in topology and geometry 15:10 Fri 9 Sep, 2011 :: 7.15 Ingkarni Wardli :: Dr Craig Westerland :: University of Melbourne
Media...Configuration spaces of points in R^n give a family of interesting geometric objects. They and their variants have numerous applications in geometry, topology, representation theory, and number theory. In this talk, we will review several of these manifestations (for instance, as moduli spaces, function spaces, and the like), and use them to address certain conjectures in number theory regarding distributions of number fields. 

Mixing, dynamics, and probability 15:10 Fri 2 Mar, 2012 :: B.21 Ingkarni Wardli :: A/Prof Gary Froyland :: University of New South Wales
Media...Many interesting natural phenomena are hard to predict.
When modelled as a dynamical system, this unpredictability is often the result of rapid separation of nearby trajectories.
Viewing the dynamics as acting on a probability measure, the mixing property states that two measurements (or random variables), evaluated at increasingly separated times, become independent in the time-separation limit.
Thus, the later measurement becomes increasingly difficult to predict, given the outcome of the earlier measurement.
If this approach to independence occurs exponentially quickly in time, one can profitably use linear operator tools to analyse the dynamics.
I will give an overview of these techniques and show how they can be applied to answer mathematical questions, describe observed behaviour in fluid mixing, and analyse models of the ocean and atmosphere. 

Forecasting electricity demand distributions using a semiparametric additive model 15:10 Fri 16 Mar, 2012 :: B.21 Ingkarni Wardli :: Prof Rob Hyndman :: Monash University
Media...Electricity demand forecasting plays an important role in short-term load allocation and long-term planning for future generation facilities and transmission augmentation. Planners must adopt a probabilistic view of potential peak demand levels, therefore density forecasts (providing estimates of the full probability distributions of the possible future values of the demand) are more helpful than point forecasts, and are necessary for utilities to evaluate and hedge the financial risk accrued by demand variability and forecasting uncertainty.
Electricity demand in a given season is subject to a range of uncertainties, including underlying population growth, changing technology, economic conditions, prevailing weather conditions (and the timing of those conditions), as well as the general randomness inherent in individual usage. It is also subject to some known calendar effects due to the time of day, day of week, time of year, and public holidays.
I will describe a comprehensive forecasting solution designed to take all the available information into account, and to provide forecast distributions from a few hours ahead to a few decades ahead. We use semiparametric additive models to estimate the relationships between demand and the covariates, including temperatures, calendar effects and some demographic and economic variables. Then we forecast the demand distributions using a mixture of temperature simulation, assumed future economic scenarios, and residual bootstrapping. The temperature simulation is implemented through a new seasonal bootstrapping method with variable blocks.
The model is being used by the state energy market operators and some electricity supply companies to forecast the probability distribution of electricity demand in various regions of Australia. It also underpinned the Victorian Vision 2030 energy strategy. 

Financial risk measures  the theory and applications of backward stochastic difference/differential equations with respect to the single jump process 12:10 Mon 26 Mar, 2012 :: 5.57 Ingkarni Wardli :: Mr Bin Shen :: University of Adelaide
Media...This is my PhD thesis, submitted one month ago. Chapter 1 introduces the background of the research fields; each subsequent chapter is a published or accepted paper.
Chapter 2, to appear in Methodology and Computing in Applied Probability, establishes the theory of backward stochastic difference equations with respect to the single jump process in discrete time.
Chapter 3, published in Stochastic Analysis and Applications, establishes the theory of backward stochastic differential equations with respect to the single jump process in continuous time.
Chapters 2 and 3 constitute Part I: Theory.
Chapter 4, published in Expert Systems With Applications, gives some examples of how to measure financial risks using the theory established in Chapter 2.
Chapter 5, accepted by the Journal of Applied Probability, considers the question of an optimal transaction between two investors to minimise their risks; this applies the theory established in Chapter 3.
Chapters 4 and 5 constitute Part II: Applications. 

Correcting Errors in RSA Private Keys 12:10 Mon 23 Apr, 2012 :: 5.57 Ingkarni Wardli :: Mr Wilko Henecka :: University of Adelaide
Media...Let pk=(N,e) be an RSA public key with corresponding secret key sk=(d,p,q,...). Assume that we obtain partial error-free information about sk, e.g., assume that we obtain half of the most significant bits of p. Then there are well-known algorithms to recover the full secret key. As opposed to these algorithms, which allow for correcting erasures of the key sk, we present for the first time a heuristic probabilistic algorithm that is capable of correcting errors in sk provided that e is small. That is, on input of a full but error-prone secret key sk' we reconstruct the original sk by correcting the faults.
More precisely, consider an error rate of d in [0,1), where we flip each bit in sk with probability d, resulting in an erroneous key sk'. Our Las Vegas type algorithm allows us to recover sk from sk' in expected time polynomial in log N with success probability close to 1, provided that d is strictly less than 0.237. We also obtain a polynomial time Las Vegas factorization algorithm for recovering the factorization (p,q) from an erroneous version with error rate d strictly less than 0.084. 

Change detection in rainfall time series for Perth, Western Australia 12:10 Mon 14 May, 2012 :: 5.57 Ingkarni Wardli :: Ms Farah Mohd Isa :: University of Adelaide
Media...There have been numerous reports that the rainfall in south Western Australia,
particularly around Perth, has undergone a step-change decrease, which is
typically attributed to climate change. Four statistical tests are used to
assess the empirical evidence for this claim on time series from five
meteorological stations, all of which exceed 50 years in length. The tests used
in this study are: the CUSUM; Bayesian Change Point analysis; the consecutive
t-test; and Hotelling's T^2-statistic. Results from the multivariate Hotelling's
T^2 analysis are compared with those from the three univariate analyses. The
issue of multiple comparisons is discussed. A summary of the empirical evidence
for the claimed step change in the Perth area is given. 

The change of probability measure for jump processes 12:10 Mon 28 May, 2012 :: 5.57 Ingkarni Wardli :: Mr Ahmed Hamada :: University of Adelaide
Media...In financial derivatives pricing theory, it is very common to change the probability measure from the historical ("real world") measure to a risk-neutral measure, in accordance with the no-arbitrage condition.
Girsanov's theorem is the best-known example of this technique and is used when price randomness is modelled by Brownian motions. Other genuine candidates for modelling market randomness that have proved efficient in the recent literature are jump processes; so how can a change of measure be performed for such processes?
This talk will address this question by introducing the no-arbitrage condition, discussing Girsanov's theorem for diffusion and jump processes, and presenting a concrete example. 

A brief introduction to Support Vector Machines 12:30 Mon 4 Jun, 2012 :: 5.57 Ingkarni Wardli :: Mr Tyman Stanford :: University of Adelaide
Media...Support Vector Machines (SVMs) are used in a variety of contexts for a range of purposes including regression, feature selection and classification. To convey the basic principles of SVMs, this presentation will focus on the application of SVMs to classification. Classification (or discrimination), in a statistical sense, is supervised model creation for the purpose of assigning future observations to a group or class. An example might be assigning healthy or diseased labels to patients from p characteristics obtained from a blood sample.
While SVMs are widely used, they are most successful when the data have one or more of the following properties:
The data are not consistent with a standard probability distribution.
The number of observations, n, used to create the model is less than the number of predictive features, p (the so-called small-n, big-p problem).
The decision boundary between the classes is likely to be nonlinear in the feature space.
I will present a short overview of how SVMs are constructed, keeping in mind their purpose. As this presentation is part of a double postgrad seminar, I will keep it to a maximum of 15 minutes.


The fundamental theorems of invariant theory, classical and quantum 15:10 Fri 10 Aug, 2012 :: B.21 Ingkarni Wardli :: Prof Gus Lehrer :: The University of Sydney
Media... Let V = C^n, and let (,) be a nondegenerate bilinear form
on V , which is either symmetric or antisymmetric. Write G for the isometry
group of (V , (,)); thus G = O_n (C) or Sp_n (C). The first fundamental
theorem (FFT) provides a set of generators for End_G(V^{\otimes r} ) (r = 1, 2, . . . ),
while the second fundamental theorem (SFT) gives all relations among the
generators. In 1937, Brauer formulated the FFT in terms of his celebrated
'Brauer algebra' B_r (\pm n), but there has hitherto been no similar version of
the SFT. One problem has been the generic nonsemisimplicity of B_r (\pm n),
which caused H. Weyl to call it, in his work on invariants, 'that enigmatic
algebra'. I shall present a solution to this problem, which shows that there is
a single idempotent in B_r (\pm n), which describes all the relations. The proof
is through a new 'Brauer category', in which the fundamental theorems are
easily formulated, and where a calculus of tangles may be used to prove these
results. There are quantum analogues of the fundamental theorems which I
shall also discuss. There are numerous applications in representation theory,
geometry and topology. This is joint work with Ruibin Zhang. 

Continuous random walk models for solute transport in porous media 15:10 Fri 17 Aug, 2012 :: B.21 Ingkarni Wardli :: Prof Pavel Bedrikovetski :: The University of Adelaide
Media...The classical diffusion (thermal conductivity) equation was derived from the master random walk equation and is parabolic. The main assumption was a probabilistic distribution of the jump length, while the jump time is constant. Introducing a distribution of the jump time along with the jump length adds a second time derivative into the averaged equations, but the equation becomes ... elliptic! Where is the extra initial condition to come from? We discuss how to pose the well-posed flow problem, an exact 1-d solution, and numerous engineering applications. This is joint work with A. Shapiro and H. Yuan. 
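The distinction the abstract draws, random jump lengths versus random jump times, can be played with in a simple continuous-time random walk simulation. This is an illustration of mine only (the talk concerns the averaged transport equations, not this toy model): exponential waiting times between jumps and Gaussian jump lengths.

```python
import random

def ctrw_position(t_end, rng):
    """One continuous-time random walk path: exponential (rate 1) waiting
    times between jumps, standard Gaussian jump lengths; returns the
    position at time t_end."""
    t, x = 0.0, 0.0
    while True:
        t += rng.expovariate(1.0)    # random jump time
        if t > t_end:
            return x
        x += rng.gauss(0.0, 1.0)     # random jump length

rng = random.Random(42)
samples = [ctrw_position(50.0, rng) for _ in range(2000)]
# For rate-1 jumps of unit variance, E[X(t)^2] = t (diffusive scaling).
msd = sum(x * x for x in samples) / len(samples)
```

With both jump lengths and jump times random, the mean-squared displacement still grows linearly here; the non-trivial effects discussed in the talk emerge in the averaged (continuum) equations.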

Probability, what can it tell us about health? 13:10 Tue 9 Oct, 2012 :: 7.15 Ingkarni Wardli :: Prof Nigel Bean :: School of Mathematical Sciences
Media...Clinical trials are the way in which modern medical systems test whether individual treatments are worthwhile. Sophisticated statistics is used to try and make the conclusions from clinical trials as meaningful as possible. What can a very simple probability model then tell us about the worth of multiple treatments? What might the implications of this be for the whole health system?
This talk is based on research currently being conducted with a physician at a major Adelaide hospital. It requires no health knowledge and was not tested on animals. All you need is an enquiring and open mind.


Numerical Free Probability: Computing Eigenvalue Distributions of Algebraic Manipulations of Random Matrices 15:10 Fri 2 Nov, 2012 :: B.20 Ingkarni Wardli :: Dr Sheehan Olver :: The University of Sydney
Media...Suppose that the global eigenvalue distributions
of two large random matrices A and B are known. It is a
remarkable fact that, generically, the eigenvalue distribution
of A + B and (if A and B are positive definite) A*B are
uniquely determined from only the eigenvalue distributions
of A and B; i.e., no information about eigenvectors is
required. These operations on eigenvalue distributions
are described by free probability theory. We construct a
numerical toolbox that can efficiently and reliably
calculate these operations with spectral accuracy, by
exploiting the complex analytical framework that underlies
free probability theory.


Asymptotic independence of (simple) two-dimensional Markov processes 15:10 Fri 1 Mar, 2013 :: B.18 Ingkarni Wardli :: Prof Guy Latouche :: Universite Libre de Bruxelles
Media...The one-dimensional birth-and-death model is one of the basic processes in applied probability, but difficulties appear as one moves to higher dimensions. In the positive recurrent case, the situation is singularly simplified if the stationary distribution has product form. We investigate the conditions under which this property holds, and we show how to use this knowledge to find product-form approximations for otherwise unmanageable random walks. This is joint work with Masakiyo Miyazawa and Peter Taylor. 

A stability theorem for elliptic Harnack inequalities 15:10 Fri 5 Apr, 2013 :: B.18 Ingkarni Wardli :: Prof Richard Bass :: University of Connecticut
Media...Harnack inequalities are an important tool in probability theory,
analysis, and partial differential equations. The classical Harnack
inequality is just the one you learned in your graduate complex analysis
class, but there have been many extensions, to different spaces, such as
manifolds, fractals, infinite graphs, and to various sorts of elliptic operators.
A landmark result was that of Moser in 1961, where he proved the Harnack
inequality for solutions to a class of partial differential equations.
I will talk about the stability of Harnack inequalities. The main result
says that if the Harnack inequality holds for an operator on a space,
then the Harnack inequality will also hold for a large class of other operators
on that same space. This provides a generalization of the result of Moser. 

Markov decision processes and interval Markov chains: what is the connection? 12:10 Mon 3 Jun, 2013 :: B.19 Ingkarni Wardli :: Mingmei Teo :: University of Adelaide
Media...Markov decision processes are a way to model processes which involve some sort of decision making, and interval Markov chains are a way to incorporate uncertainty in the transition probability matrix. How are these two concepts related? In this talk, I will give an overview of these concepts and discuss how they relate to each other. 
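As background to the above: once a transition probability matrix is fixed (no uncertainty), its equilibrium distribution can be found by power iteration. A minimal sketch with illustrative numbers, not from the talk:

```python
def stationary_distribution(P, iters=500):
    """Equilibrium distribution pi of a finite Markov chain: start from the
    uniform distribution and repeatedly multiply by the transition matrix P,
    converging to the solution of pi P = pi (for an ergodic chain)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# A two-state chain (illustrative numbers):
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary_distribution(P)   # exact answer is (5/6, 1/6)
```

An interval Markov chain would replace each entry of P by an interval, and the question becomes how the equilibrium behaviour varies over all admissible matrices.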

The Hamiltonian Cycle Problem and Markov Decision Processes 15:10 Fri 2 Aug, 2013 :: B.18 Ingkarni Wardli :: Prof Jerzy Filar :: Flinders University
Media...We consider the famous Hamiltonian cycle problem (HCP) embedded in a Markov decision process (MDP). More specifically, we consider a moving object on a graph G where, at each vertex, a controller may select an arc emanating from that vertex according to a probabilistic decision rule. A stationary policy is simply a control where these decision rules are time invariant. Such a policy induces a Markov chain on the vertices of the graph. Therefore, HCP is equivalent to a search for a stationary policy that induces a 0-1 probability transition matrix whose nonzero entries trace out a Hamiltonian cycle in the graph. A consequence of this embedding is that we may consider the problem over a number of alternative convex, rather than discrete, domains. These include: (a) the space of stationary policies, (b) the more restricted but very natural space of doubly stochastic matrices induced by the graph, and (c) the associated spaces of so-called "occupational measures". This approach to the HCP has led to both theoretical and algorithmic approaches to the underlying HCP problem. In this presentation, we outline a selection of results generated by this line of research. 

Shannon entropy as a diagnostic tool for PDEs in conservation form 15:10 Fri 16 Aug, 2013 :: B.18 Ingkarni Wardli :: Prof Philip Broadbridge :: La Trobe University
Media...After normalization, an evolving real nonnegative function may be viewed as a probability density. From this we may derive the corresponding evolution law for Shannon entropy. Parabolic equations, hyperbolic equations and fourth-order diffusion equations evolve information in quite different ways. Entropy and irreversibility can be introduced in a self-consistent manner and at an elementary level by reference to some simple evolution equations such as the linear heat equation. It is easily seen that the 2nd law of thermodynamics is equivalent to loss of Shannon information when temperature obeys a general nonlinear 2nd-order diffusion equation.
With fourth-order diffusion terms, new problems arise. We know from applications such as thin film flow and surface diffusion that fourth-order diffusion terms may generate ripples and do not satisfy the Second Law. Despite this, we can identify the class of fourth-order quasilinear diffusion equations that increase the Shannon entropy.
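The entropy growth described for the linear heat equation can be checked numerically. In the toy illustration below (mine, not the speaker's), each explicit finite-difference step on a ring is a doubly stochastic map for r <= 0.5, so the Shannon entropy of the discretised density never decreases.

```python
import math

def diffuse_step(p, r=0.25):
    """One explicit finite-difference step of the heat equation on a ring;
    for r <= 0.5 each new value is a convex combination of old values, so
    the step is a doubly stochastic (hence entropy-increasing) map."""
    n = len(p)
    return [p[i] + r * (p[(i - 1) % n] - 2.0 * p[i] + p[(i + 1) % n])
            for i in range(n)]

def shannon_entropy(p):
    return -sum(q * math.log(q) for q in p if q > 0.0)

p = [0.0] * 50
p[25] = 1.0                       # concentrated initial density
entropies = []
for _ in range(100):
    entropies.append(shannon_entropy(p))
    p = diffuse_step(p)
```

A fourth-order scheme would not be doubly stochastic (it can produce negative "densities" and ripples), which is exactly where the talk's refined analysis is needed.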


Is it possible to beat the lottery system? 12:10 Mon 24 Mar, 2014 :: B.19 Ingkarni Wardli :: Michael Lydeamore :: University of Adelaide
Media...Every week millions of people around the country buy tickets for a round of the lottery. Known as the "lotto", the chances of winning the big prize are less than 1 in 8 million, yet every week people will purchase a ticket. What if there was a smart way of betting which would increase your odds? A few weeks ago an article came across my desk with those very words: "Using this scheme you will win more". In this talk, we'll test those claims. Looking first at a basic counting argument, and then moving the hard work over to a computer, we'll find out if this betting scheme (and many others like it) will actually win you more, or if, just like playing in a casino, you'll still go bankrupt with probability 1. 
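The quoted "less than 1 in 8 million" matches a 6-from-45 lotto format; that format is my assumption, as the abstract does not name the game. The basic counting argument is one line:

```python
from math import comb

# Number of equally likely six-number draws from 45 balls.
draws = comb(45, 6)       # 8,145,060 possible draws
p_jackpot = 1 / draws     # indeed less than 1 in 8 million
```

No betting scheme changes the probability of any individual draw; schemes can only redistribute which outcomes you cover.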

A Hybrid Markov Model for Disease Dynamics 12:35 Mon 29 Sep, 2014 :: B.19 Ingkarni Wardli :: Nicolas Rebuli :: University of Adelaide
Media...Modelling the spread of infectious diseases is fundamental to protecting ourselves from potentially devastating epidemics. Among other factors, two key indicators for the severity of an epidemic are the size of the epidemic and the time until the last infectious individual is removed. To estimate the distribution of the size and duration of an epidemic (within a realistic population) an epidemiologist will typically use Monte Carlo simulations of an appropriate Markov process. However, the number of states in the simplest Markov epidemic model, the SIR model, is quadratic in the population size and so Monte Carlo simulations are computationally expensive. In this talk I will discuss two methods for approximating the SIR Markov process and I will demonstrate the approximation error by comparing probability distributions and estimates of the distributions of the final size and duration of an SIR epidemic. 
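A minimal version of the Monte Carlo approach described above: simulating the embedded jump chain of the SIR Markov process is enough to sample the final size, since the final size does not depend on the holding times. The parameter values below are illustrative, not from the talk.

```python
import random

def sir_final_size(n, beta, gamma, rng):
    """Final size of one SIR epidemic in a population of n, starting from a
    single infective, via the embedded jump chain of the SIR Markov process
    (only event probabilities are needed for the final size)."""
    s, i = n - 1, 1
    while i > 0:
        infection = beta * s * i / n
        recovery = gamma * i
        if rng.random() < infection / (infection + recovery):
            s, i = s - 1, i + 1    # infection event
        else:
            i -= 1                 # recovery/removal event
    return n - 1 - s               # cases beyond the initial infective

rng = random.Random(0)
sizes = [sir_final_size(200, beta=2.0, gamma=1.0, rng=rng) for _ in range(500)]
# With R0 = beta/gamma = 2, roughly half the runs die out early (minor
# outbreaks) and the rest are major outbreaks infecting much of the population.
```

Even this simple simulation exhibits the bimodal final-size distribution that the hybrid approximations in the talk aim to capture more cheaply.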

Multiscale modelling of multicellular biological systems: mechanics, development and disease 03:10 Fri 6 Mar, 2015 :: Lower Napier LG24 :: Dr James Osborne :: University of Melbourne
When investigating the development and function of multicellular biological systems, it is not enough to consider only the behaviour of individual cells in isolation. For example, when studying tissue development, how individual cells interact, both mechanically and biochemically, influences the resulting tissue's form and function. In this talk we present a multiscale modelling framework for simulating the development and function of multicellular biological systems (in particular tissues). Utilising the natural structural unit of the cell, the framework consists of three main scales: the tissue level (macroscale); the cell level (mesoscale); and the subcellular level (microscale), with multiple interactions occurring between all scales. The cell level is central to the framework, and cells are modelled as discrete interacting entities using one of a number of possible modelling paradigms, including lattice-based models (cellular automata and cellular Potts) and off-lattice models (cell-centre and vertex-based representations). The subcellular level concerns numerous metabolic and biochemical processes, represented by interaction networks modelled stochastically or as systems of ODEs. The outputs from such systems influence the behaviour of the cell level, affecting properties such as adhesion and also influencing cell mitosis and apoptosis. At the tissue level we consider factors or restraints that influence the cells, for example the distribution of a nutrient or messenger molecule, which is represented by field equations on a growing domain, with individual cells functioning as sinks and/or sources. The modular approach taken within the framework enables more realistic behaviour to be considered at each scale.
This framework is implemented within the Open Source Chaste library (Cancer, Heart and Soft Tissue Environment, http://www.cs.ox.ac.uk/chaste/) and has been used to model biochemical and biomechanical interactions in various biological systems. In this talk we present the key ideas of the framework along with applications within the fields of development and disease.

Cricket and Maths 12:10 Mon 16 Mar, 2015 :: Napier LG29 :: Peter Ballard :: University of Adelaide
Each game of international cricket has a scorecard. You don't need to know much maths to go through these scorecards and extract simple information, such as batting and bowling averages. However, there is also the opportunity to use some more advanced maths. I will be using a bit of optimisation, probability and statistics to try to answer the questions: Which was the most dominant team ever? What scores are most likely? And are some players unlucky?

Groups acting on trees 12:10 Fri 10 Apr, 2015 :: Napier 144 :: Anitha Thillaisundaram :: Heinrich Heine University of Duesseldorf
From a geometric point of view, branch groups are groups acting spherically transitively on a spherically homogeneous rooted tree. The applications of branch groups reach out to analysis, geometry, combinatorics, and probability. The early constructions of branch groups were the Grigorchuk group and the Gupta-Sidki p-groups. Among its many claims to fame, the Grigorchuk group was the first example of a group of intermediate growth (i.e. neither polynomial nor exponential). Here we consider a generalisation of the family of Grigorchuk-Gupta-Sidki groups, and we examine the restricted occurrence of their maximal subgroups.

The Mathematics of Crime 15:10 Fri 23 Oct, 2015 :: Ingkarni Wardli B21 :: Prof Andrea Bertozzi :: UCLA
Law enforcement agencies across the US have discovered that partnering with a team of mathematicians and social scientists from UCLA can help them determine where crime is likely to occur. Dr. Bertozzi will talk about the fascinating story behind her participation on the UCLA team that developed a "predictive policing" computer program that zeros in on areas that have the highest probability of crime. In addition, the use of mathematics in studying gang crimes and other criminal activities will also be discussed. Commercial use of the "predictive policing" program allows communities to put police officers in the right place at the right time, stopping crime before it happens.

A Semi-Markovian Modeling of Limit Order Markets 13:00 Fri 11 Dec, 2015 :: Ingkarni Wardli 5.57 :: Anatoliy Swishchuk :: University of Calgary
R. Cont and A. de Larrard (SIAM J. Financial Mathematics, 2013) introduced a tractable stochastic model for the dynamics of a limit order book, computing various quantities of interest such as the probability of a price increase or the diffusion limit of the price process. As suggested by empirical observations, we extend their framework so that 1) the inter-arrival times of book events have arbitrary distributions (possibly non-exponential) and 2) both the nature of a new book event and its corresponding inter-arrival time depend on the nature of the previous book event. We do so by resorting to Markov renewal processes to model the dynamics of the bid and ask queues. We retain analytical tractability via explicit expressions for the Laplace transforms of various quantities of interest. Our approach is justified and illustrated by calibrating the model to the five stocks Amazon, Apple, Google, Intel and Microsoft on June 21st, 2012. As in Cont and de Larrard, the bid-ask spread remains constant, equal to one tick; only the bid and ask queues are modelled (they are independent of each other and are reinitialised after a price change); and all orders have the same size. (This talk is based on our joint paper with Nelson Vadori (Morgan Stanley).)

On the Willmore energy 15:10 Fri 7 Oct, 2016 :: Napier G03 :: Dr Yann Bernard :: Monash University
The Willmore energy of a surface captures its bending. Originally discovered 200 years ago by Sophie Germain in the context of elasticity theory, it has since then been rediscovered numerous times in several areas of science: general relativity, optics, string theory, conformal geometry, and cell biology. For example, our red blood cells assume a peculiar shape that minimises the Willmore energy.
In this talk, I will present the thrilling history of the Willmore energy, its applications, and its main properties. The presentation will be accessible to all mathematicians as well as to advanced undergraduate students. 

In space there is no-one to hear you scream 12:10 Tue 12 Sep, 2017 :: Ingkarni Wardli 5.57 :: A/Prof Gary Glonek :: School of Mathematical Sciences
Modern data problems often involve data in very high dimensions. For example, gene expression profiles, used to develop cancer screening models, typically have at least 30,000 dimensions. When dealing with such data, it is natural to apply intuition from low-dimensional cases. For example, in a sample of normal observations, a typical data point will be near the centre of the distribution, with only a small number of points at the edges.
In this talk, simple probability theory will be used to show that the geometry of data in high-dimensional space is very different from what we can see in one- and two-dimensional examples. We will show that the typical data point is at the edge of the distribution, a long way from its centre and even further from any other points.

The Markovian binary tree applied to demography and conservation biology 15:10 Fri 27 Oct, 2017 :: Ingkarni Wardli B17 :: Dr Sophie Hautphenne :: University of Melbourne
Markovian binary trees form a general and tractable class of continuous-time branching processes, which makes them well-suited for real-world applications. Thanks to their appealing probabilistic and computational features, these processes have proven to be an excellent modelling tool for applications in population biology. Typical performance measures of these models include the extinction probability of a population, the distribution of the population size at a given time, the total progeny size until extinction, and the asymptotic population composition. Besides giving an overview of the main performance measures and the techniques involved to compute them, we discuss recently developed statistical methods to estimate the model parameters, depending on the accuracy of the available data. We illustrate our results in human demography and in conservation biology.
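For a flavour of how an extinction probability is computed in branching-process models, here is the discrete-time Galton-Watson case, a simpler stand-in for the continuous-time Markovian binary trees of the talk (the offspring distribution below is an invented example): the extinction probability is the smallest fixed point of the offspring probability generating function.

```python
def extinction_probability(offspring_pmf, tol=1e-12):
    """Smallest root of q = f(q), where f is the offspring PGF and
    offspring_pmf[k] = P(an individual has k offspring).

    Iterating q <- f(q) from q = 0 converges monotonically to the
    extinction probability of the Galton-Watson branching process.
    """
    def pgf(s):
        return sum(p * s ** k for k, p in enumerate(offspring_pmf))

    q = 0.0
    while abs(pgf(q) - q) > tol:
        q = pgf(q)
    return q

# Mean offspring 0*0.25 + 1*0.25 + 2*0.5 = 1.25 > 1, so the process is
# supercritical: extinction is possible but not certain.  Analytically,
# q = 0.25 + 0.25q + 0.5q^2 gives q = 1/2.
q = extinction_probability([0.25, 0.25, 0.5])
```

The continuous-time, matrix-analytic machinery of Markovian binary trees generalises this scalar fixed-point equation to a matrix one.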

Stochastic Modelling of Urban Structure 11:10 Mon 20 Nov, 2017 :: Engineering Nth N132 :: Mark Girolami :: Imperial College London, and The Alan Turing Institute
Urban systems are complex in nature and comprise a large number of individuals who act according to utility, a measure of net benefit pertaining to preferences. The actions of individuals give rise to an emergent behaviour, creating the so-called urban structure that we observe. In this talk, I develop a stochastic model of urban structure to formally account for uncertainty arising from this complex behaviour. We further use this stochastic model to infer the components of a utility function from observed urban structure. This is a more powerful modelling framework than the ubiquitous discrete choice models, which are of limited use for complex systems in which the overall preferences of individuals are difficult to ascertain. We model urban structure as a realisation of a Boltzmann distribution that is the invariant distribution of a related stochastic differential equation (SDE) describing the dynamics of the urban system. Our specification of the Boltzmann distribution assigns higher probability to stable configurations, in the sense that consumer surplus (demand) is balanced with running costs (supply), as characterised by a potential function. We specify a Bayesian hierarchical model to infer the components of a utility function from observed structure. Our model is doubly intractable and poses significant computational challenges, which we overcome using recent advances in Markov chain Monte Carlo (MCMC) methods. We demonstrate our methodology with case studies on the London retail system and airports in England.

Calculating optimal limits for transacting credit card customers 15:10 Fri 2 Mar, 2018 :: Horace Lamb 1022 :: Prof Peter Taylor :: University of Melbourne
Credit card users can roughly be divided into `transactors', who pay off their balance each month, and `revolvers', who maintain an outstanding balance, on which they pay substantial interest.
In this talk, we focus on modelling the behaviour of an individual transactor customer. Our motivation is to calculate an optimal credit limit from the bank's point of view. This requires an expression for the expected outstanding balance at the end of a payment period.
We establish a connection with the classical newsvendor model. Furthermore, we derive the Laplace transform of the outstanding balance, assuming that purchases are made according to a marked point process and that there is a simplified balance control policy which prevents all purchases in the rest of the payment period when the credit limit is exceeded. We then use the newsvendor model and our modified model to calculate bounds on the optimal credit limit for the more realistic balance control policy that accepts all purchases that do not exceed the limit.
We illustrate our analysis using a compound Poisson process example and show that the optimal limit scales with the distribution of the purchasing process, while the probability of exceeding the optimal limit remains constant.
Finally, we apply our model to some real credit card purchase data. 
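As a toy illustration of the newsvendor connection (not Prof Taylor's actual model: the purchase process, the parameter values and the critical fractile below are all invented for the sketch):

```python
import random

def period_spend(rate, mean_size, horizon, rng):
    """Total spend of a transactor in one payment period under a
    compound Poisson model: purchases arrive at `rate` per day and
    purchase sizes are exponential with mean `mean_size`."""
    t, total = 0.0, 0.0
    while True:
        t += rng.expovariate(rate)        # time until the next purchase
        if t > horizon:
            return total
        total += rng.expovariate(1.0 / mean_size)

rng = random.Random(7)
spends = sorted(period_spend(0.5, 80.0, 30.0, rng) for _ in range(10_000))

# Newsvendor-style rule: place the limit at a critical fractile of the
# period-spend distribution, where the fractile reflects the bank's
# trade-off between transaction revenue and declined purchases
# (0.95 here is an arbitrary illustration).
fractile = 0.95
limit = spends[int(fractile * len(spends))]
exceed = sum(s > limit for s in spends) / len(spends)
```

Consistent with the scaling result quoted above, the chosen limit moves with the spend distribution while the probability of exceeding it stays pinned near 1 minus the fractile.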

Random walks 15:10 Fri 12 Oct, 2018 :: Napier 208 :: A/Prof Kais Hamza :: Monash University
A random walk is arguably the most basic stochastic process one can define. It is also among the most intuitive objects in the theory of probability and stochastic processes. For these and other reasons, it is one of the most studied processes, or rather families of processes, finding applications in all areas of science, technology and engineering.
In this talk, I will start by recalling some of the classical results for random walks and then discuss some of my own recent explorations in this area of research that has maintained relevance for decades. 
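One of the classical results alluded to above, the recurrence of the simple symmetric walk on the integers, can be checked empirically. A minimal sketch (the step and trial counts are arbitrary):

```python
import random

def returns_to_origin(n_steps, rng):
    """Does a simple symmetric random walk on Z, started at 0,
    revisit the origin within n_steps steps?"""
    position = 0
    for _ in range(n_steps):
        position += rng.choice((-1, 1))
        if position == 0:
            return True
    return False

rng = random.Random(0)
trials = 2000
est = sum(returns_to_origin(1000, rng) for _ in range(trials)) / trials

# The walk is recurrent: the true return probability is 1, and the
# estimate creeps towards it only slowly as n_steps grows, because the
# expected time of the first return is infinite.
```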
News matching "At least four doors, numerous goats, a car, a frog" 
A/Prof Joshua Ross, 2017 Moran Medal recipient Congratulations to Associate Professor Joshua Ross who has won the 2017 Moran Medal, awarded by the Australian Academy of Science.
The Moran Medal recognises outstanding research by scientists up to 10 years post-PhD in applied probability, biometrics, mathematical genetics, psychometrics and statistics.
Associate Professor Ross has made influential contributions to public health and conservation biology using mathematical modelling and statistics to help in decision making.
Posted Fri 23 Dec 16.
Publications matching "At least four doors, numerous goats, a car, a frog"

Hitting probabilities and hitting times for stochastic fluid flows: the bounded model. Bean, Nigel; O'Reilly, Malgorzata; Taylor, P. Probability in the Engineering and Informational Sciences 23 (121–147), 2009.
A total probability approach to flood frequency analysis in tidal river reaches. Need, Steven; Lambert, Martin; Metcalfe, Andrew. World Environmental and Water Resources Congress 2008 Ahupua'a, Honolulu, 12/05/08.
Algorithms for the Laplace-Stieltjes transforms of first return times for stochastic fluid flows. Bean, Nigel; O'Reilly, Malgorzata; Taylor, Peter. Methodology and Computing in Applied Probability 10 (381–408), 2008.
Robust Optimal Portfolio Choice Under Markovian Regime-switching Model. Elliott, Robert; Siu, T. Methodology and Computing in Applied Probability 11 (145–157), 2008.
Dynamic portfolio allocation, the dual theory of choice and probability distortion functions. Hamada, M; Sherris, M; Van Der Hoek, John. Astin Bulletin 31 (187–217), 2006.
Methods of constrained and unconstrained approximation for mappings in probability spaces. Torokhti, Anatoli; Howlett, P; Pearce, Charles. Chapter in Modern Applied Mathematics (Narosa Publishing House), 83–129, 2005.
Oriented site percolation, phase transitions and probability bounds. Pearce, Charles; Fletcher, F. Journal of Inequalities in Pure and Applied Mathematics 6 (WWW 1–WWW 15), 2005.
Identification of probability distributions within hidden state models of rainfall. Whiting, Julian; Lambert, Martin; Metcalfe, Andrew. 28th International Hydrology and Water Resources Symposium, Wollongong, NSW, Australia, 10/11/03.
The tree cut and merge algorithm for estimation of network reliability. Hui, Kin-Ping; Bean, Nigel; Kraetzl, Miro; Kroese, D. Probability in the Engineering and Informational Sciences 17 (23–45), 2003.
What is a unit of capacity worth? Chiera, Belinda; Taylor, Peter. Probability in the Engineering and Informational Sciences 16 (513–522), 2002.
On the existence of a quasi-stationary measure for a Markov chain. Lasserre, J; Pearce, Charles. Annals of Probability 29 (437–446), 2001.
Level-phase independence for GI/M/1-type Markov chains. Latouche, Guy; Taylor, Peter. Journal of Applied Probability 37 (984–998), 2000.
Advanced search options
You may be able to improve your search results by using the following syntax:
Query  Matches the following 

Asymptotic Equation  Anything with "Asymptotic" or "Equation". 
+Asymptotic +Equation  Anything with "Asymptotic" and "Equation". 
+Stokes -"Navier-Stokes"  Anything containing "Stokes" but not "Navier-Stokes". 
Dynam*  Anything containing "Dynamic", "Dynamical", "Dynamicist" etc. 
