Research Interests: Bayesian nonparametrics: dependent nonparametric models; MCMC methods and deterministic approximations for efficient inference in nonparametric models. Continuous-time stochastic processes: MCMC methods for inference in Markov jump processes and continuous-time Bayesian networks. Point processes:
Old courses: STAT527: Intro to Statistical Computing (asynchronous online); STAT656: Bayesian Data Analysis (Fall 2023); CS/STAT242: Intro to Data Science (Spring 2023); STAT420: Intro to Time-Series Data (Spring 2022); CS/STAT242 (Spring 2021); STAT598z: Intro to Statistical Computing (Spring 2021); STAT695 (Fall 2020); CS/STAT242 (Spring 2020)
Publications (out of date; please see Google Scholar): Flexible Mixture Modeling on Constrained Spaces [arXiv:1809.09238]. Wang, Q., Rao, V.A. and Teh, Y.W. (2020) An Exact Auxiliary Variable Gibbs Sampler for a Class of Diffusions.
Vinayak Rao, Curriculum Vitae. Teaching: Spring 2020, Purdue, STAT242: Intro. to Data Science; Spring 2015-2019, Purdue, STAT598Z: Intro. to Computing for Statistics; Fall 2014-2019 ...
Feb 27, 2018 · Vinayak Rao, Department of Statistics. STAT598z. See Lecture 1 for course details. Office hours: Tuesday 12:30-13:30. Piazza webpage. Announcements: Jan 8: Make sure you're registered for the class Blackboard and Piazza webpages (see Lecture 1). Be sure to answer the class survey and mail it to the class email address. Jan 16: Homework 1 is up (due before midnight on Friday, Jan 28). HW1:
Hypothesis testing and causality (2.5 weeks): Introduction to statistical inference, populations/samples. Overview of hypothesis testing, A/B testing, and how to draw conclusions from data. Correlation vs. causation. Similarity and clustering (2 weeks): Definitions and examples of common similarity/distance measures.
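Two of the most common similarity/distance measures from that unit are Euclidean distance and cosine distance. A minimal Python sketch (the function names are illustrative, not course code):

```python
import numpy as np

def euclidean(x, y):
    """Euclidean (L2) distance between two vectors."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(np.sqrt(np.sum((x - y) ** 2)))

def cosine_distance(x, y):
    """1 minus cosine similarity: 0 for parallel vectors, 1 for orthogonal ones."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(1.0 - (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y)))

a, b = [1.0, 0.0], [0.0, 1.0]
d_euc = euclidean(a, b)        # sqrt(2) for these unit vectors
d_cos = cosine_distance(a, b)  # 1.0: the vectors are orthogonal
```

Note that cosine distance depends only on direction, not magnitude, which is why it is often preferred for high-dimensional count data such as document vectors.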
Syllabus. Class email: purduestat598z@gmail.com (send homeworks here, and NOT to my email address).
Week 1. R Markdown: Tutorial 1 , Tutorial 2 (we will use this for the homeworks) Iain Murray's cribsheet, Sam Roweis' notes, A short introduction to R , Plotting with ggplot2. Useful references: Review of probability and statistics (Stanford notes) , R Manual. Week 1 (contd.)
For each λ: solve the regularized least-squares problem on the training data, then evaluate the estimated w on the held-out data (call this PE_{λ,k}). Pick λ̂ = argmin_λ mean_k(PE_{λ,k}), or argmin_λ (mean_k(PE_{λ,k}) + stderr_k(PE_{λ,k})). Having chosen λ̂, solve the regularized least-squares problem on all the data. Ridge regression improves performance by reducing variance.
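The selection procedure above can be sketched in Python with NumPy. This is an illustrative implementation, not course code: `ridge_fit`, `cv_ridge`, and the simulated data are my own; both rules from the slide (argmin of the mean PE, and argmin of mean + stderr) are computed.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Solve the regularized least-squares problem (X'X + lam*I) w = X'y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def cv_ridge(X, y, lambdas, K=5, seed=0):
    """K-fold CV: held-out prediction error PE_{lam,k} for each lambda and fold."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, K)
    pe = np.zeros((len(lambdas), K))
    for k, test in enumerate(folds):
        train = np.setdiff1d(idx, test)
        for i, lam in enumerate(lambdas):
            w = ridge_fit(X[train], y[train], lam)
            pe[i, k] = np.mean((y[test] - X[test] @ w) ** 2)
    mean_pe = pe.mean(axis=1)
    stderr = pe.std(axis=1, ddof=1) / np.sqrt(K)
    lam_min = lambdas[np.argmin(mean_pe)]           # argmin mean(PE)
    lam_se = lambdas[np.argmin(mean_pe + stderr)]   # argmin (mean(PE) + stderr(PE))
    return lam_min, lam_se

# Simulated data: sparse true weights plus noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
y = X @ np.array([1.0, 0.0, 0.0, 2.0, 0.0]) + rng.normal(scale=0.5, size=100)
lam_hat, lam_se = cv_ridge(X, y, [0.01, 0.1, 1.0, 10.0])
w_final = ridge_fit(X, y, lam_hat)  # refit on ALL data with the chosen lambda
```

The final refit on all the data is the step the slide emphasizes: cross-validation is used only to choose λ̂, after which the held-out splits are discarded.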
Topics include matrix computation, dynamic programming, the Baum-Welch algorithm for hidden Markov models, the expectation-maximization algorithm, convex optimization, Monte Carlo and Markov chain Monte Carlo methods. Specific topics and the course outline are subject to change as the semester progresses.
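Of the listed topics, plain Monte Carlo is the simplest to sketch: approximate an expectation E[f(X)] by the average of f over random draws of X. An illustrative example (my own, not course code) estimating E[X²] = 1 for X ~ N(0, 1), along with the standard error of the estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)                         # n samples from N(0, 1)
estimate = np.mean(x ** 2)                     # Monte Carlo estimate of E[X^2] = 1
stderr = np.std(x ** 2, ddof=1) / np.sqrt(n)   # its standard error, O(1/sqrt(n))
```

The O(1/√n) error rate, independent of dimension, is the standard motivation for Monte Carlo over deterministic quadrature in high-dimensional problems.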