**Tuesday, 6 December 2016, 1:00PM** -- MC 6486
*Miscellaneous* -- Applied Mathematics
*Speaker:* Brendon Phillips, Department of Applied Mathematics
*Title:* "Signals of Critical Transitions in Coupled Belief-Behaviour Systems"

*Abstract:* The overwhelming success of vaccination campaigns over the last few decades has ironically given rise to a devastating new disease, called antivaccination. The previously low prevalence of preventable diseases has led some people to discount health risks, focusing instead on the imagined dangers of vaccination. Falling vaccination rates, driven by social interaction, then leave a greater proportion of the population susceptible to infection. These are perfect conditions for an epidemic, as seen in the recent resurgences of measles, mumps, whooping cough, polio, rubella and other illnesses. Also of interest are the effects of "echo chambers", which occur when people of similar opinion form social bonds. This creates insular groups in which interaction reinforces convictions rather than challenging them.

When disease spread is viewed as a dynamical system, epidemics represent critical transitions. Sometimes, these shifts are preceded (or accompanied) by easily recognisable characteristic behaviours, called early warning signals. Investigating and identifying a class of these signals is crucial to mitigating the global economic and infrastructural damage caused by disease resurgence. Surprisingly, no research to date has sought early warning signals in coupled systems.

Simulations of a disease process show that critical transitions are preceded by sharp increases in the mutual dependence of the physical and social dynamics. Both spatial autocorrelation and the number of connected graph components also peak at the transition point, though spatial autocorrelation gives false warning signals under certain conditions.
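The abstract does not specify how the "mutual dependence" of the physical and social dynamics is measured; one standard dependence statistic is the empirical mutual information between the binary infection-state and opinion-state series of a population. The following is a purely illustrative numpy sketch under that assumption (the function and toy data are not the speaker's model):

```python
import numpy as np

def mutual_information(x, y):
    """Empirical mutual information (in nats) between two binary series."""
    x, y = np.asarray(x), np.asarray(y)
    joint = np.zeros((2, 2))
    for a in (0, 1):
        for b in (0, 1):
            joint[a, b] = np.mean((x == a) & (y == b))
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            if joint[a, b] > 0:
                mi += joint[a, b] * np.log(joint[a, b] / (px[a] * py[b]))
    return mi

# Toy check: independent series give near-zero MI; identical series give
# the entropy of the series, here close to log 2.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10_000)   # e.g. infection states
y = rng.integers(0, 2, 10_000)   # e.g. opinion states
print(mutual_information(x, y))  # close to 0
print(mutual_information(x, x))  # close to log 2 ≈ 0.693
```

A sharp rise in such a statistic ahead of the transition is the kind of signal the abstract describes.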

In this talk, we will further discuss these results and the construction of the model, as well as the measurements chosen and the reliability of the resulting signals.

**Wednesday, 7 December 2016, 11:30AM** -- DC 2310
*Symbolic Computation Group PhD Seminar* -- Computer Science
*Speaker:* Curtis Bright, David R. Cheriton School of Computer Science
*Title:* "Minimal Elements for the Prime Numbers"

*Abstract:* We say a string of symbols s is minimal for a language L if s is a member of L and it is not possible to obtain another member of L by striking out one or more symbols from s. Although the set M(L) of minimal strings is necessarily finite (a consequence of Higman's lemma), determining it explicitly for a given L can be a difficult computational problem. We use some number-theoretic heuristics to compute M(L), where L is the language of base-b representations of the prime numbers for b between 2 and 30. This is joint work with Jeffrey Shallit and Raymond Devillers.
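The membership test implied by this definition is easy to state directly: a prime is minimal for base b exactly when no proper subsequence of its digit string also represents a prime. A brute-force sketch (fine for small inputs; the talk's heuristics are needed because the search space explodes for larger bases):

```python
from itertools import combinations

def is_prime(n):
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def is_minimal_prime(n, base=10):
    """True iff n is prime and no proper subsequence of its base-b digit
    string represents a prime (leading zeros simply shrink the value)."""
    if not is_prime(n):
        return False
    digits = []
    m = n
    while m:
        digits.append(m % base)
        m //= base
    digits.reverse()
    for length in range(1, len(digits)):        # proper, nonempty subsequences
        for idx in combinations(range(len(digits)), length):
            value = 0
            for i in idx:
                value = value * base + digits[i]
            if is_prime(value):
                return False
    return True

# The base-10 minimal primes begin 2, 3, 5, 7, 11, 19, 41, 61, 89, 409, ...
print([p for p in range(2, 1000) if is_minimal_prime(p)])
```

For example, 13 is not minimal (striking out the 1 leaves the prime 3), while 409 is: none of 4, 0, 9, 40, 49, 09 is prime.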

**Wednesday, 7 December 2016, 12:30PM** -- DC 2310
*Symbolic Computation Group PhD Seminar* -- Computer Science
*Speaker:* Curtis Bright, David R. Cheriton School of Computer Science
*Title:* "MathCheck2: A SAT+CAS Verifier for Combinatorial Conjectures"

*Abstract:* In this talk we outline MathCheck2, a combination of a SAT solver and a computer algebra system (CAS) aimed at finitely verifying or counterexampling mathematical conjectures. Using MathCheck2 we verified the Hadamard conjecture from design theory for matrices up to order 144 and many additional orders up to 168. Also, we provide independent verification of the claim that Williamson matrices of order 35 do not exist, and demonstrate for the first time that 35 is the smallest number with this property. In the course of our work, we discovered over 500 Hadamard matrices which were not equivalent to any matrices in the comprehensive Magma Hadamard database. This is joint work with Vijay Ganesh, Albert Heinle, Ilias Kotsireas, Saeed Nejati, and Krzysztof Czarnecki.
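For context, a Hadamard matrix of order n is a ±1 matrix H with H Hᵀ = nI, and the Hadamard conjecture asserts that one exists for every order divisible by 4. A short sketch of the defining check, using Sylvester's classical power-of-two construction as the example (this illustrates the property being verified, not MathCheck2's SAT+CAS search procedure):

```python
import numpy as np

def sylvester(k):
    """Sylvester's construction: a Hadamard matrix of order 2**k."""
    H = np.array([[1]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

def is_hadamard(H):
    """Check the defining property: H is a ±1 matrix with H @ H.T == n*I,
    i.e. every pair of distinct rows is orthogonal."""
    n = H.shape[0]
    return (np.abs(H) == 1).all() and np.array_equal(
        H @ H.T, n * np.eye(n, dtype=H.dtype)
    )

H = sylvester(4)        # order 16
print(is_hadamard(H))   # True
```

Flipping any single entry of a Hadamard matrix breaks the row-orthogonality, which is what makes the check a useful verification target.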

**Wednesday, 7 December 2016, 3:00PM** -- HH 1102
*Centre for Theoretical Neuroscience Lecture* -- Computer Science
*Speaker:* Various, University of Waterloo
*Title:* "Cognitive Science Confessions: My biggest research mistakes"

*Abstract:* A group of scholars will reflect on their research. "Confessors" from computer science, linguistics, neuroscience and psychology will be presenting.

**Wednesday, 7 December 2016, 3:30PM** -- Math & Computer, Room 5501
*Seminar* -- Combinatorics and Optimization
*Speaker:* Dr. Yurii Nesterov, CORE/INMA UCL, Belgium
*Title:* "Universal Newton Method"

*Abstract:* In this talk we present a second-order method for unconstrained minimization of convex functions. It can be applied to functions with Hölder continuous Hessians. Our main scheme is the Cubic Regularization of the Newton Method, equipped with a special line-search procedure. We show that the global rate of convergence of this scheme depends continuously on the smoothness parameter. Thus, our method can be used even for minimizing functions with discontinuous Hessians. At the same time, the line-search procedure is very efficient: the average number of oracle calls per iteration is equal to two. We show that for finding a point with a small gradient norm, the Universal Newton Method must be equipped with a special termination criterion for the line search, which can be seen as a generalization of the Armijo condition.
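For reference, the cubic regularization step underlying this scheme, in the form introduced by Nesterov and Polyak (the universal variant described in the talk adapts the regularization parameter M by line search), is:

```latex
T_M(x) \;=\; \arg\min_{y}\ \Big[\, \langle \nabla f(x),\, y - x \rangle
  \;+\; \tfrac{1}{2}\, \langle \nabla^2 f(x)\,(y - x),\, y - x \rangle
  \;+\; \tfrac{M}{6}\, \lVert y - x \rVert^{3} \,\Big]
```

The cubic term guarantees the model is bounded below even when the Hessian is indefinite or only Hölder continuous, which is what makes the global convergence analysis possible.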

**Thursday, 8 December 2016, 1:00PM** -- DC 1304
*Cryptography, Security, and Privacy (CrySP) Group PhD Seminar* -- Computer Science
*Speaker:* Nik Unger, David R. Cheriton School of Computer Science
*Title:* "Discovering Cryptography with Machine Learning: Google's Artificial Enigma"

*Abstract:* On October 24, Google Brain researchers reported that they had created a machine learning system capable of producing its own encryption schemes. The system operated without any human aid, and the adversarial network trained against it was unable to break the encryption techniques that it found. This story was widely reported in the technology press, where multiple publications wondered if the machine-generated encryption was more useful than human-made encryption, and one tech pundit called the research "literally terrifying" and "the start of the singularity".

This talk, which is accessible to a general CS audience, places Google's results into context. We will cover the nature of the experiment, the machine learning techniques used in the paper (convolutional and adversarial neural networks), the results, and how they relate to modern cryptography. We discuss why, contrary to much of the reporting on this story, this approach is fundamentally flawed as a way of creating secure communication methods, as the original authors acknowledge. Finally, we point out some legitimate takeaways from the results.

**Thursday, 8 December 2016, 2:00PM** -- Math & Computer, Room 6486
*Seminar* -- Combinatorics and Optimization
*Speaker:* Dr. Chris Godsil, Dept. of Combinatorics & Optimization, University of Waterloo
*Title:* "Interpretations of the strong Arnold hypothesis"

*Abstract:* I will discuss two interpretations of the strong Arnold hypothesis, indicating connections to bounds on sizes of cocliques, and to embedding graphs in quadrics. In a later lecture I will use this theory to provide a linear algebraic condition for a strongly regular graph to be a core.

**Thursday, 8 December 2016, 3:30PM** -- MC 5417
*Analysis Seminar* -- Pure Mathematics
*Speaker:* Martijn Caspers, Utrecht University
*Title:* "Absence of Cartan subalgebras for right-angled Hecke von Neumann algebras"

*Remarks:* **Please note the special day.**

*Abstract:* Hecke algebras are ∗-algebras generated by self-adjoint operators T(s), with s in some generating set, that satisfy the Hecke relation (T(s) + q)(T(s) − 1/q) = 0 as well as suitable commutation relations. They generate a von Neumann algebra, called the Hecke von Neumann algebra, studied by Dymara et al. Garncarek proved that in the right-angled case these von Neumann algebras are actually factors if the parameter q lies in a certain interval around 1, and that for q outside this interval they are a direct sum of a factor and the complex numbers (so the center is always very small). In particular, the isomorphism type of these algebras depends on q.

In this talk we first establish approximation properties of right-angled Hecke von Neumann algebras: we prove that they are non-injective, have the completely contractive approximation property and have the Haagerup property. We then turn to the existence of Cartan subalgebras and show that in the hyperbolic case these algebras are strongly solid, and hence cannot have a Cartan subalgebra. In the general case these algebras need not be strongly solid, but we are still able to prove the non-existence of Cartan subalgebras.

**Thursday, 8 December 2016, 3:30PM** -- MC 5501
*Colloquium* -- Computational Mathematics
*Speaker:* Michael Mahoney, ICSI and Department of Statistics, UC Berkeley
*Title:* "Terabyte-sized Computational Mathematics"

*Remarks:* Refreshments at 3:15pm

*Abstract:* When dealing with data at terabyte scale and beyond, computing even basic descriptive statistics can be a challenge, and computing finer statistical properties such as correlations can be very non-trivial. The talk will provide an overview of recent work at the interface of methods (statistical and algorithmic theory, etc.), implementations (on a single machine versus a distributed data center versus a supercomputer), and applications (in science, e.g., genetics, astronomy, and climate science, as opposed to internet and social media) that aims to provide tools for bread-and-butter computational statistics on data up to and beyond terabyte scale. A key issue here is that in statistics one is often primarily interested in correlational properties of the data, and thus one must go beyond database-like query/counting operations on flat tables to deal with the more complex couplings that are implicit when one models data with matrices.

As an example, one of the most straightforward formulations of the machine learning problem of feature selection boils down to the linear algebraic problem of selecting good columns from a data matrix. This formulation has the advantage of yielding features that are interpretable to scientists in the domain from which the data are drawn, an important consideration when machine learning methods are applied to realistic scientific data. While simple, this problem is central to many other seemingly nonlinear learning methods. Moreover, while unsupervised, it also has strong connections with related supervised learning methods such as Linear Discriminant Analysis and Canonical Correlation Analysis.

We will describe recent work implementing Randomized Linear Algebra algorithms for this feature selection problem (as well as related NMF and PCA problems) in parallel and distributed environments on inputs ranging from ones to tens of terabytes in size, as well as the application of these implementations to specific scientific problems in areas such as mass spectrometry imaging and climate modeling.
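A common randomized approach to the column-selection problem described above is leverage-score sampling: score each column by its weight in the top singular subspace, then sample columns with probability proportional to the scores. A minimal numpy sketch of that general technique (an illustration only, not the speaker's distributed implementation):

```python
import numpy as np

def leverage_scores(A, k):
    """Rank-k leverage score of column j: the squared norm of the j-th
    column of the top-k right singular vector matrix. Scores sum to k."""
    _, _, Vt = np.linalg.svd(A, full_matrices=False)
    return np.sum(Vt[:k, :] ** 2, axis=0)

def select_columns(A, k, c, rng):
    """Sample c distinct columns with probability proportional to their
    rank-k leverage scores."""
    scores = leverage_scores(A, k)
    return rng.choice(A.shape[1], size=c, replace=False, p=scores / scores.sum())

# Toy data: 3 strong informative columns followed by 17 weak noise columns.
rng = np.random.default_rng(0)
n, d, k = 200, 20, 3
A = np.hstack([5 * rng.normal(size=(n, k)), 0.1 * rng.normal(size=(n, d - k))])

scores = leverage_scores(A, k)
print(sorted(np.argsort(scores)[-3:].tolist()))   # [0, 1, 2]: the informative columns
print(select_columns(A, k, c=3, rng=rng))
```

Because the selected objects are actual columns of the data matrix, the chosen features remain directly interpretable, which is the advantage the abstract highlights.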

*WebNotice*