
2011-12 Colloquium Talks

 

Date Speaker Talk
Wednesday, June 20, 2012 at 3:30 p.m. in ILB 370 Lavanya Kannan, American Museum of Natural History Inferring Phylogenies using Maximum Parsimony

Abstract: Phylogenetics is the study of the evolutionary history of a group of organisms (e.g., species or populations), which is discovered through molecular sequencing data and morphological data matrices. Phylogenetic trees are acyclic graphs whose leaves are labeled by the individual organisms being studied, and the structure of the graph describes the branching patterns of evolution that lead to the organisms. Several different methods are used to reconstruct phylogenetic trees. In this talk, we will see one such method, namely the maximum parsimony approach, which infers phylogenetic trees by minimizing the total number of evolutionary steps required to explain a given set of data assigned to the leaves. The input datasets are viewed as mappings from the set of organisms to a set of states that the organisms assume. Calculating the parsimony score of a tree is computationally tractable, and Fitch's algorithm for efficiently finding the score will be discussed. Phylogenetic networks are generalizations of phylogenetic trees that are used to model certain evolutionary events that happen through horizontal exchanges rather than inheritance from ancestral organisms. We will see some of the mathematical and computational challenges that abound in phylogenetic network construction.
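As a concrete illustration of the parsimony computation mentioned above, here is a minimal sketch of Fitch's algorithm for a single character on a rooted binary tree; the tree encoding and state labels are hypothetical, chosen only for the example.

```python
# Minimal sketch of Fitch's algorithm (single character, rooted binary tree).
# A tree is a nested tuple; a leaf is just its state label, e.g. "A".
def fitch(tree):
    """Return (state_set, parsimony_score) for the subtree `tree`."""
    if not isinstance(tree, tuple):            # leaf: its state set is its own state
        return {tree}, 0
    left_set, left_cost = fitch(tree[0])
    right_set, right_cost = fitch(tree[1])
    common = left_set & right_set
    if common:                                 # nonempty intersection: no new change
        return common, left_cost + right_cost
    return left_set | right_set, left_cost + right_cost + 1   # union: one extra change

# Example: the tree ((A, C), (A, A)) needs exactly one state change.
states, score = fitch((("A", "C"), ("A", "A")))
print(score)    # -> 1
```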

Tuesday, June 19, 2012 Michael La Croix, University of Waterloo, Ontario, Canada Measuring the Non-Orientability of Maps: Towards a Combinatorial Realization of the Jack Parameter

Abstract: Combinatorial maps are graphs embedded in surfaces. Equivalence classes of maps can be defined consistently in terms of analytic, topological, or combinatorial properties of the embeddings. The compatibility of these properties creates a rich structure that can be analyzed using a variety of techniques. Separate but parallel enumerative theories exist, depending on whether all surfaces, or only orientable surfaces, are considered.

Enumerative questions for both types of surfaces arise naturally (in the description of multiplication tables of appropriate algebras, and in the evaluation of matrix integrals, for example). Traditional attacks on these questions permit parallel, but separate derivations of the generating series corresponding to these two classes of surfaces, via either algebraic or analytic methods. For maps, the distinction between orientable surfaces and all surfaces roughly parallels the distinction between real (or quaternionic) and complex algebraic geometry. Indeed, in terms of Arnold's Trinities, orientable maps could be considered the complexification of all maps.

The resulting series have natural one-parameter generalizations, from which the corresponding series for orientable surfaces or for all surfaces can be recovered by specialization. In terms of symmetric functions, the Jack parameter interpolates between Schur functions, used to enumerate orientable maps, and zonal polynomials, used to enumerate all maps.

Goulden and Jackson (1996) conjectured that the generalized series can be obtained directly, in terms of an unknown combinatorial statistic that measures non-orientability of rooted maps. I will describe a statistic, defined in my thesis in terms of iterated root-edge deletion, that resolves a special case of this conjecture.

Friday, June 15, 2012 Kathleen Wilkie, Center of Cancer Systems Biology, Steward St. Elizabeth's Medical Center, Tufts University School of Medicine Mathematical Insights into Immune Modulation of Tumor Growth

Abstract: Cancer cells can elicit an immune response in the host, which is generally tumor-suppressive, but for weak responses may actually be tumor-promoting. We propose that this complex dynamic may be understood as a process of immune stimulation by the tumor, followed by cytotoxic targeting by the immune cells, which acts to alter tumor size and growth characteristics and subsequent immune stimulation. Just how these influences interact has complex implications for tumor development and cancer dormancy. To show this, we have developed a two-compartment model consisting of a population of cancer cells and a population of immune cells. The model incorporates the combined effects of the various immune cell types, exploiting general principles of self-limited logistic growth and the physical process of tumor-promoting inflammation. A Markov chain Monte Carlo method is used to determine parameter sets that predict tumor growth equally well, but at the same time also predict fundamentally different underlying dynamics. The results underscore the ultimately polar nature of final tumor fate (escape or elimination), while at the same time showing how transient periods of tumor dormancy may precede either of these two outcomes. Another important finding is that near- and long-term responses of a tumor to immune interaction may be opposed; that is to say, a response dynamic that appears to be more promoting of tumor growth than another in the near term may be superior at curtailing tumor growth in the long-term, even to the point of establishing dormancy while the other allows for tumor escape. The striking variability observed even in this simple model demonstrates the significance of intrinsic and unmeasurable factors determining the complex biological processes involved in tumor growth in an immune competent host. Consequences and biological interpretations of this work will be discussed in terms of treatment approaches that exploit immune response to improve tumor suppression, including the potential attainment of an immune-induced dormant state.

This talk will be structured as follows. After introducing the field of biomedical mathematics via an example from my doctoral research, I will describe our modeling framework of cancer-immune interactions focusing first on the effects of inflammation and then on immune-induced tumor dormancy. Finally, I will discuss some ongoing work on cancer immunoediting, where the immune response selectively prunes the tumor of immune-sensitive cells to cause an initially heterogeneous population to become a more homogeneous, and more resistant, population.
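The following is a purely illustrative sketch of a two-compartment tumor-immune system of the general flavor described above (logistic tumor growth, tumor-driven immune recruitment, cytotoxic killing); the equations and parameter values are hypothetical placeholders, not the speaker's model.

```python
# Illustrative two-compartment tumor-immune ODE (hypothetical form and parameters).
import numpy as np
from scipy.integrate import odeint

def rhs(y, t, r, K, kill, stim, decay):
    C, I = y                                   # C: cancer cells, I: immune cells
    dC = r * C * (1.0 - C / K) - kill * C * I  # logistic growth minus cytotoxic kill
    dI = stim * C / (1.0 + C) - decay * I      # tumor-stimulated recruitment, turnover
    return [dC, dI]

t = np.linspace(0.0, 200.0, 1000)
sol = odeint(rhs, [1e3, 1.0], t, args=(0.2, 1e6, 1e-4, 0.5, 0.1))
```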

Thursday, June 14, 2012 Yunjiao Wang, Mathematical Biosciences Institute, Ohio State University & Department of Computational and Applied Mathematics, Rice University Rigid Phase-Shifts in Periodic Solutions of Network Systems

Abstract: In this talk I will first present joint work with Marty Golubitsky and David Romano, and if time permits, I will describe at the end of the talk some of my other research and future directions. My work with Golubitsky and Romano is motivated by observations from animal gaits: they are periodic, and the movements of the legs have certain phase relations, such as being synchronous, half a period out of phase, and so on. Evidence in the literature shows that the rhythms of animal gaits are generated by central pattern generators (certain neuronal networks). The question we address in our work is: what kinds of network systems support periodic solutions with given rigid phase relations? (Here "rigid" means that the relations remain the same under perturbations that respect the network architecture.) Ian Stewart and Martyn Parker proved that rigid phase-shifts are derived from cyclic group symmetry of a suitable quotient network if the network is (1) transitive and, in addition, (2) is fully oscillatory, is rigid synchronous, and satisfies the rigid phase-shifts property (meaning that if two nodes in a hyperbolic periodic solution are rigidly phase related, then the nodes that send signals to them are also rigidly phase related). The validity of the rigid phase-shifts property was a decade-long conjecture in the theory of coupled cell systems. We prove that the three properties in (2) are all satisfied, and that rigid phase-shifts in a non-transitive network are derived from a quotient network of an extension of the original network.

Friday, May 11, 2012 Mohammad Aziz, California Polytechnic State University Robust Comonotonic Lower Convex Order Bound Approximations for the Sum of log Unified Skew Normal Random Variables

Abstract: The classical workhorse in finance and insurance for modeling asset returns is the Gaussian model. However, when modeling complex random phenomena, more flexible distributions than the normal are needed. This is because most financial and economic data are skewed and have "fat tails" due to the presence of outliers, so symmetric distributions such as the normal may not be good choices for modeling these kinds of data. Flexible distributions such as the skew normal allow robust modeling of high-dimensional, multimodal, and asymmetric data. In this talk, we will consider a very flexible financial model to construct robust comonotonic lower convex order bounds in approximating the distribution of sums of dependent log skew normal random variables. The dependence structure of these random variables is based on a recently developed multivariate skew normal distribution, known as the unified skew normal distribution. In order to accommodate the distribution to the model considered, we first study inherent properties of this class of skew normal distributions. These properties, along with the convex order and comonotonicity of random variables, are used to approximate the distribution function of terminal wealth. The risk measure related to the approximated distribution is then calculated. The accuracy of the approximations is investigated numerically. Results obtained from our methods are competitive with those obtained from the more time-consuming Monte Carlo method.

Tuesday, May 8, 2012 Huybrechts Bindele, Auburn University Bounded Influence Nonlinear Signed-Rank Regression

Abstract: In this talk we consider weighted generalized-signed-rank estimators of nonlinear regression coefficients. The generalization allows us to include popular estimators such as the least squares and least absolute deviations estimators but by itself does not give bounded influence estimators. Adding weights results in estimators with bounded influence function. We establish conditions needed for the consistency and asymptotic normality of the proposed estimator and discuss how weight functions can be chosen to achieve bounded influence function of the estimator. Real life examples and Monte Carlo simulation experiments demonstrate the robustness and efficiency of the proposed estimator. An example shows that the weighted signed-rank estimator can be useful to detect outliers in nonlinear regression.

Monday, April 30, 2012 Dulal Bhaumik, University of Illinois at Chicago Meta-Analysis of Binary Rare Events Data

Abstract: The use of meta-analysis for research synthesis has become routine in medical research. Unlike early developments for effect sizes based on continuous and normally distributed outcomes (Hedges and Olkin, 1985), applications of meta-analysis in medical research often focus on the odds ratio (Engles et al., 2000; Deeks, 2002) between treated and control conditions in terms of a binary indicator of efficacy and/or the presence or absence of an adverse drug reaction (ADR). A special statistical problem arises when the focus of research synthesis is on rare binary events, such as a rare ADR. In this talk we will discuss meta-analysis for rare events. We will illustrate our results with an example of Percutaneous Coronary Intervention (PCI) versus medical treatment alone (MED) in the treatment of patients with stable coronary artery disease.

Friday, April 27, 2012
This talk is aimed in particular at undergraduate and graduate students.
Scott Brown, University of South Alabama The Data Deluge

Abstract: Massive amounts of data are collected every day. Scientific data comes in massive amounts from sensor networks, astronomical instruments, biometric devices, etc., and needs to be sorted out and understood.

Personal data from our Google searches, our Facebook or Twitter activities, our credit card purchases, our travel habits, and so on, are being mined to provide information and insight.

These data sets provide great opportunities, and pose dangers as well.

Thursday, April 26, 2012 Paul-Hermann Zieschang, University of Texas at Brownsville Association Schemes and Buildings

Abstract: Let X be a set, and define r* := {(y,z) : (z,y) is contained in r} for each binary relation r on X. A collection S of binary relations on X is called an association scheme if it is a partition of X x X, if the identity relation on X belongs to S, if s* is contained in S for each element s of S, and if, for any two elements p and q of S, the number of elements x in X with (y,x) contained in p and (x,z) contained in q does not depend on y or z, but only on p, q, and the relation of S that contains (y,z).
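In symbols, the last condition is the usual regularity (intersection-number) requirement; writing a_{pq}^{s} for the constant attached to p, q, s in S (notation not used in the abstract, introduced here only as a transcription of the prose condition), it reads:

\[
\bigl|\{\, x \in X : (y,x) \in p \ \text{and}\ (x,z) \in q \,\}\bigr| \;=\; a_{pq}^{\,s}
\qquad \text{for every } (y,z) \in s .
\]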

The notion of a building is due to Jacques Tits, who defined these objects in order to associate to each simple algebraic group over an arbitrary field a geometry, in exactly the same way as the finite projective plane of order 2 (the Fano plane) is associated to the simple group PSL3(2) of order 168.

In my talk I will first show that scheme theory naturally generalizes group theory. After that, I will show that buildings play exactly the same role in scheme theory that Coxeter groups play in group theory.

Friday, April 20, 2012
This talk is aimed in particular at undergraduate and graduate students.
David Benko, University of South Alabama Comparing Sports Events

Abstract: Have you ever wondered which one of the four tennis grand slam tournaments is the best? Wimbledon? Roland Garros? The US Open? The Australian Open? They are played on different surfaces (grass, clay, acrylic hard court, synthetic hard court) and under different weather conditions (sun, wind, rain, snow). How can we pick a winner? As always, mathematics can help us find the answer.

Thursday, April 19, 2012 Jim Gleason, University of Alabama Analyzing Math Tests: What are we measuring?

Abstract: There are many different reasons for having students take a test in a mathematics class. We will look at how these different philosophies should impact how we write and use these tests. We will then look at tools to analyze tests and test questions to determine if they line up with our purposes for the test.

Tuesday, April 17, 2012 Mark Alber, University of Notre Dame & Indiana University School of Medicine Multiscale Modeling of Bacterial Swarming

Abstract: Many bacteria can rapidly traverse surfaces from which they are extracting nutrients for growth. They generate flat, spreading colonies, called swarms because they resemble swarms of insects. Myxococcus xanthus is a common soil bacterium that is studied in part for the high level of social coordination observed when cells are swarming on different surfaces. Individual cells are flexible rods covered by a viscous polysaccharide capsule that creates an adhesive interaction between cells. M. xanthus cells regularly reverse the direction of their motion and organize into single layers of small clusters and large rafts of cells at the edge of a spreading population. We will describe in this talk a newly developed Subcellular Element Model (SCE) of the M. xanthus swarm. Coupled simulations and experimental bacteria tracking demonstrated how the flexibility of and adhesion between cells, as well as cell reversals, impacted the dynamics of cell clusters, resulting in a better understanding of how these bacteria effectively colonize surfaces. We will also show that periodic reversals in the direction of motion in systems of self-propelled rod-shaped bacteria enable them to effectively resolve traffic jams formed during swarming and maximize the swarming rate of the colony. In the second half of the talk, a connection will be described between a microscopic one-dimensional cell-based stochastic model of reversing non-overlapping bacteria and a macroscopic nonlinear diffusion equation describing the dynamics of cellular density.

Friday, April 13, 2012 Qi Tang, Abbott Labs The Sample Average Approximation Method with Statistical Designs for Stochastic Optimization

Abstract: Many computational problems in statistics can be cast as stochastic optimization problems whose objective functions are multi-dimensional integrals. The Sample Average Approximation method, which first constructs a sampling-based approximation to the objective function and then finds the solution to the approximated problem, is widely used for solving such problems. Independent and identically distributed (IID) sampling is a prevailing choice for constructing such an approximation. Recently it was found that the use of Latin hypercube designs can improve sample average approximations and outperform IID sampling. However, in computer experiments, U designs are known to possess better space-filling properties than Latin hypercube designs. Inspired by this fact, we propose to use U designs to further enhance the accuracy of the sample average approximation method. Theoretical results are derived to show that sample average approximations with U designs perform significantly better than those with Latin hypercube designs. Numerical examples are provided to corroborate the developed theoretical results.
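To make the comparison concrete, here is a small sketch (with a toy objective of my own choosing, not one from the talk) of the sample average approximation under IID sampling versus a Latin hypercube design; the variance reduction from the space-filling design shows up in the spread of the estimates.

```python
# Toy comparison: sample average approximation with IID vs Latin hypercube sampling.
import numpy as np

rng = np.random.default_rng(0)

def g(theta, u):
    # hypothetical integrand; the objective is f(theta) = E[g(theta, U)], U ~ Unif(0,1)^2
    return (theta - u[:, 0]) ** 2 + np.sin(np.pi * u[:, 1])

def latin_hypercube(n, d, rng):
    """One random Latin hypercube sample of n points in [0, 1]^d."""
    pts = np.empty((n, d))
    for j in range(d):
        pts[:, j] = (rng.permutation(n) + rng.random(n)) / n   # one point per stratum
    return pts

theta, n, reps = 0.3, 64, 500
iid_est = [g(theta, rng.random((n, 2))).mean() for _ in range(reps)]
lhs_est = [g(theta, latin_hypercube(n, 2, rng)).mean() for _ in range(reps)]
print(np.var(iid_est), np.var(lhs_est))   # LHS estimates typically vary less
```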

Thursday, April 12, 2012 Robert Taylor, Clemson University Fun and Opportunities in Probability and Statistics

Abstract: Probability and statistics problems have intrigued and puzzled people for many years. Some of these problems will be analyzed to determine logical solutions and to illustrate facetious approaches to solutions. Monty Hall's "Let's Make a Deal" puzzler will be presented as one example of illogical and logical solutions. In addition, career opportunities for students in the mathematical sciences, especially probability and statistics, will be discussed.
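Since the Monty Hall puzzler is mentioned above, a quick simulation (a sketch, with the number of trials chosen arbitrarily) shows why switching wins about two thirds of the time while staying wins about one third:

```python
# Monte Carlo check of the Monty Hall "stay" vs "switch" strategies.
import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)                 # door hiding the car
        pick = random.randrange(3)                # contestant's first choice
        # the host opens a door that is neither the pick nor the car
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(play(switch=False))   # ~0.333
print(play(switch=True))    # ~0.667
```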

Thursday, April 12, 2012 Robert Taylor, Clemson University Consistency and Validity of Dependent Bootstrapping

Abstract: The traditional bootstrap resamples with replacement from the original sample observations to form arrays of rowwise independent and identically distributed bootstrap random variables. There are situations, for example, when sampling from finite populations, where resampling without replacement provides a more realistic bootstrap procedure and produces dependent bootstrap random variables. The desired properties of consistency and asymptotic validity are shown to hold for certain nonparametric dependent bootstrap estimators. In addition, it is shown that the smaller variation in dependent bootstrap estimators can be used to increase precision in some of the estimators.
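As a small numerical illustration of the contrast drawn above (not code from the talk, and with hypothetical data and sizes), compare resampling with and without replacement from the same sample; the without-replacement, "dependent" bootstrap means exhibit smaller variation:

```python
# Traditional bootstrap (with replacement) vs dependent bootstrap (without replacement).
import numpy as np

rng = np.random.default_rng(1)
sample = rng.normal(size=50)      # original observations
m = 30                            # bootstrap resample size

with_repl = [rng.choice(sample, size=m, replace=True).mean() for _ in range(2000)]
without_repl = [rng.choice(sample, size=m, replace=False).mean() for _ in range(2000)]

# The dependent (without-replacement) bootstrap means show smaller variation.
print(np.var(with_repl), np.var(without_repl))
```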

Tuesday, April 10, 2012 Xiaosong Li, Georgia Southern University Testing on the Common Mean of Normal Distributions Using Bayesian Method

Abstract: If a meta-analysis involves the common mean of several different normal populations with unknown and possibly unequal variances, there are several ways to make inferences about this common mean. The most common is point estimation, which uses the sample data to calculate a single statistic serving as a best guess for the unknown population mean; the second is to conduct a hypothesis test whose null hypothesis is that all populations have the same mean.

The first type of inference has been widely studied in the literature, but little attention has been paid to the second. One reason may be that the test statistics of such hypothesis tests usually have complicated sampling distributions that require computational resources out of reach of the ordinary researcher. With the fast development of computing power widely accessible on a personal computer, this is no longer an obstacle, and more research has been done in this area.

In their 2008 paper, Dr. Ching-Hui Chang and Dr. Nabendu Pal described several methods that can be used to test hypotheses concerning the common mean of several normal distributions with unknown variances. The methods they proposed are the likelihood ratio test (LRT), two tests based on the Graybill-Deal Estimator (GDE) and a test based on the maximum likelihood estimator (MLE).
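For reference, the Graybill-Deal estimator mentioned above is, in its standard form (notation mine, not from the abstract), the inverse-estimated-variance weighted combination of the sample means:

\[
\hat{\mu}_{\mathrm{GD}} \;=\; \frac{\displaystyle\sum_{i=1}^{k} \frac{n_i}{s_i^{2}}\,\bar{x}_i}{\displaystyle\sum_{i=1}^{k} \frac{n_i}{s_i^{2}}},
\]

where \(\bar{x}_i\), \(s_i^{2}\), and \(n_i\) are the sample mean, sample variance, and sample size of the i-th population.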

In my research, several procedures based on the Bayesian method are proposed, and simulation studies of the power and robustness of the newly proposed tests, the LRT, and the GDE tests are performed and discussed. The new tests proposed in this thesis are either based on the assumption that the posterior distribution of the common mean follows some specific familiar distribution (t or normal), or are based on slice sampling (Neal, 2003), the Highest Posterior Density (HPD) method (Berger, 1985), or a modified version of HPD utilizing the actual posterior distribution.

Thursday, April 5, 2012 Abdenacer Makhlouf, Université de Haute-Alsace, France & University of South Florida n-ary Algebras: From Physics to Mathematics

Abstract: Lie algebras and Poisson algebras have played an extremely important role in mathematics and physics for a long time. Their generalizations, known as n-Lie algebras and "Nambu algebras" also arise naturally in physics in many different contexts. In this talk, I will review some basics on n-ary algebras, present some key constructions and discuss representation theory and cohomology theory of n-Lie algebras.

Thursday, March 22, 2012 Ha Nguyen, Georgia Southern University The Moment Problem and Real Algebraic Geometry

Abstract: In 1991, Schmüdgen proved a result on the moment problem, a classical question in analysis, for compact semialgebraic sets. This remarkable and beautiful theorem brought the moment problem to the attention of real algebraic geometers and real algebraic geometry to the attention of those working on the moment problem.

In this talk, we will show connections between a problem from functional analysis - the moment problem - and a problem from real algebraic geometry involving positive polynomials and sums of squares.

Thursday, February 9, 2012 Lori Alvin, University of West Florida Investigations in Low Dimensional Dynamics

Abstract: We discuss several concepts, techniques, and tools that are useful in the study of low dimensional dynamics and focus on a few well-known results to highlight those background techniques. We particularly emphasize the role that symbolic dynamics plays in understanding dynamical systems generated by unimodal maps. We then discuss adding machine maps and highlight several recent results in the field. We conclude by showing how a better understanding of adding machine maps can lead to more generalized results in the family of unimodal maps.

Thursday, December 1, 2011 Marcelo Aguiar, Texas A&M University Hopf Algebras and Real Hyperplane Arrangements

Abstract: The starting point for our considerations is the notion of graded Hopf algebra, particularly those graded over the nonnegative integers. When the latter are replaced by finite sets, one arrives at the notion of Hopf monoid in Joyal's category of species. The goal of this talk is to go one step further, replacing finite sets by finite real hyperplane arrangements. Geometric considerations allow us to define a generalized notion of "Hopf algebra" in this setting. The key ingredient in this construction is furnished by the projection maps of Tits. The case of finite sets (Hopf monoids in species) is recovered by restricting to braid arrangements.

This is joint work in progress with Swapneel Mahajan.

Thursday, November 17, 2011 Wei Liu, Department of Physical Therapy and Mechanical Engineering, University of South Alabama Does Number Matter? Implications of Rehabilitation Research

Abstract: Rehabilitation is an interdisciplinary field of study with the primary aim of enhancing health, function, and quality of life among persons who have, or who may be at risk of developing, acute injuries or long-term conditions. Rehabilitation research also incorporates the disciplines of athletic training, exercise sciences, occupational therapy, and physical therapy, as well as other fields such as public health and engineering. Rehabilitation research spans the entire life course, from infancy to older adulthood, and addresses a wide variety of patient populations. I will present some case studies that show how mathematics can potentially provide research solutions to understand human locomotion with robot devices and brain activity.

Tuesday, November 15, 2011 Gene Abrams, University of Colorado at Colorado Springs Leavitt Path Algebras - Something for Everyone: Algebra, Analysis, Graph Theory, Number Theory

Abstract: Most of the rings one encounters as basic examples have what's known as the "Invariant Basis Number" (IBN) property: for every pair of positive integers m and n, if the free left R-modules R^m and R^n are isomorphic, then m = n. (For instance, the IBN property of fields is used to show that the dimension of a vector space over a field is well defined.) In seminal work completed in the early 1960s, Bill Leavitt produced a specific, universal collection of algebras which fail to have IBN. While it's fair to say that these algebras were initially viewed as mere pathologies, it's just as fair to say that these now-so-called Leavitt algebras currently play a central, fundamental role in numerous lines of research in both algebra and analysis.

More generally, from any directed graph E and any field K one can build the Leavitt path algebra LK(E). In particular, the Leavitt algebras arise in this more general context as the algebras corresponding to the graphs consisting of a single vertex. I'll give an overview of some of the work on Leavitt path algebras which has occurred in their first seven years of existence, as well as mention some of the future directions and open questions in the subject. There should be something for everyone in this presentation, including and especially algebraists, analysts, and graph theorists. We'll also present a basic number theoretic result which provides the foundation of one of the recent main results in Leavitt path algebras. The talk will be aimed at a general audience.

Thursday, November 10, 2011 Kevin Meeker, Department of Philosophy, University of South Alabama Is a Skeptical Hume Mathematically Challenged?

Abstract: Many commentators contend that Hume made some obvious mathematical errors in the section "Of scepticism with regard to reason" of his A Treatise of Human Nature. In my talk I will argue that a proper understanding of Hume's historical context exonerates Hume of any serious mathematical errors and helps to provide a plausible understanding of the nature of his argument in his Treatise. More specifically, I shall point to a different way of understanding Hume on probability that avoids an anachronistic Bayesian reading of Hume's sceptical argument. In addition, I shall contend that a non-Bayesian approach helps to alleviate the puzzlement about some other seemingly odd mathematical moves in Hume's argument and allows us to reconstruct Hume's reasoning in a plausible way.

Thursday, November 3, 2011 David Benko, University of South Alabama Can Dogs Play Ping-Pong? - An Artistic Application of Potential Theory

Abstract: We charge a metal body with electrons. The distribution of electrons minimizing the energy is called the equilibrium measure. Gauss was the first to begin the analysis of the electrostatic equilibrium problem with external fields. He wrote: "The determination of the distribution of the mass lies, in most cases, beyond the powers of present day analysis."
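For orientation, in the standard formulation of the external field problem just described (notation not from the abstract), the equilibrium measure with external field Q is the probability measure \(\mu\) on the conductor minimizing the weighted logarithmic energy

\[
I_Q(\mu) \;=\; \iint \log\frac{1}{|x-y|}\; d\mu(x)\, d\mu(y) \;+\; 2\int Q(x)\, d\mu(x).
\]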

We will study the equilibrium measure for logarithmic and Riesz energy. Results from the UCUR project of Chelsey David will also be discussed (poster is on the 4th floor). Finally, we show how ping-pong and fractal-like dogs relate to potential theory.

Thursday, October 27, 2011 Mahir Can, Tulane University Regular Subvarieties of the Variety of Complete Quadrics

Abstract: A smooth projective variety X is called "regular" if there exists an algebraic action of the invertible upper triangular 2x2 matrices on X such that the unipotent radical has exactly one fixed point. A remarkable approach to studying the cohomology algebra of a regular variety was developed by Akyildiz and Carrell. Among the important examples of regular varieties are homogeneous spaces. In particular, when applied to flag varieties, the Akyildiz-Carrell method yields remarkable results in representation theory.

The variety of complete quadrics, which was used by Schubert in his famous computation of the number of space quadrics tangent to 9 quadrics in general position, is a particular "wonderful compactification" of the space of non-singular quadric hypersurfaces in n-dimensional complex projective space.

In this talk, we first give an overview of the work of Akyildiz and Carrell, then characterize the regular subvarieties of the variety of complete quadrics.

Tuesday, October 18, 2011 Vitaly Voloshin, Troy University Coloring Theory: History, Results and Open Problems

Abstract: In this talk, I will survey the development of fundamental ideas and results, and will formulate a few open problems in coloring theory, ranging from graph coloring to mixed hypergraph coloring. This talk will be suitable for graduate and upper-level undergraduate students.


For colloquium talks from other years click here