# MCMC notes (1)

Recently I have been reading articles and books about MCMC, and realized that much of the material was not covered in my graduate study. I have therefore decided to summarize this content, to help readers and myself gain a deeper understanding of MCMC. I hope to turn this topic into a series, although I cannot guarantee its completion. This first article introduces an important concept: geometric ergodicity.

# Updates on RSpectra: new "center" and "scale" parameters for svds()

Per the suggestion of @robmaz, RSpectra::svds() now has two new parameters, center and scale, to support implicit centering and scaling of matrices in partial SVD. This feature requires RSpectra >= 0.16-0. The two parameters are very useful for principal component analysis (PCA) based on the covariance or correlation matrix, without actually forming it. Below we simulate a random data matrix, and use both R’s built-in prcomp() and the svds() function in RSpectra to compute PCA.
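As a sketch of that comparison (simulated data and dimensions are my own choices), the leading singular values of the implicitly centered matrix should reproduce the standard deviations reported by prcomp():

```r
library(RSpectra)

set.seed(123)
n <- 2000; p <- 100
x <- matrix(rnorm(n * p), n, p)

# Full PCA via base R (centers the columns by default)
pca <- prcomp(x)

# Partial SVD with implicit centering: no centered copy of x is formed
res <- svds(x, k = 5, center = TRUE)

# Singular values of the centered matrix relate to the component
# standard deviations by sdev = d / sqrt(n - 1)
sdev_svds <- res$d / sqrt(n - 1)
print(sdev_svds)
print(pca$sdev[1:5])   # should agree up to solver tolerance
```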

# recosystem: recommender system using parallel matrix factorization

A Quick View of Recommender Systems The main task of a recommender system is to predict unknown entries in the rating matrix based on observed values, as shown in the table below: each cell containing a number is the rating given by some user on a specific item, while the cells marked with question marks are unknown ratings that need to be predicted. In the literature, this problem is also known as collaborative filtering, matrix completion, matrix recovery, etc.
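To make the idea concrete, here is a toy from-scratch sketch of rating prediction by low-rank matrix factorization, written in base R. It only illustrates the model recosystem fits (recosystem itself wraps the LIBMF library and does not work this way internally); the ratings and hyperparameters below are made up.

```r
set.seed(42)
# Tiny rating matrix: NA marks the "question mark" entries to predict
R <- matrix(c(5,  3, NA,  1,
              4, NA, NA,  1,
              1,  1, NA,  5,
              1, NA, NA,  4,
             NA,  1,  5,  4), nrow = 5, byrow = TRUE)

k <- 2                                                 # latent dimension
P <- matrix(rnorm(nrow(R) * k, sd = 0.1), nrow(R), k)  # user factors
Q <- matrix(rnorm(ncol(R) * k, sd = 0.1), ncol(R), k)  # item factors
obs <- which(!is.na(R), arr.ind = TRUE)

# Stochastic gradient descent on the observed entries only
for (epoch in 1:2000) {
  for (idx in sample(nrow(obs))) {
    u <- obs[idx, 1]; i <- obs[idx, 2]
    e <- R[u, i] - sum(P[u, ] * Q[i, ])       # prediction error
    P[u, ] <- P[u, ] + 0.01 * (e * Q[i, ] - 0.02 * P[u, ])
    Q[i, ] <- Q[i, ] + 0.01 * (e * P[u, ] - 0.02 * Q[i, ])
  }
}

pred <- P %*% t(Q)   # every entry, including the unknown ones, is now filled
```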

# Large scale eigenvalue decomposition and SVD with rARPACK

In January 2016, I was honored to receive an “Honorable Mention” in the John Chambers Award 2016. This article was written for R-bloggers, whose founder, Tal Galili, kindly invited me to write an introduction to the rARPACK package. A Short Story of rARPACK Eigenvalue decomposition is a commonly used technique in numerous statistical problems. For example, principal component analysis (PCA) essentially conducts an eigenvalue decomposition on the sample covariance matrix of the data: the eigenvalues are the component variances, and the eigenvectors are the variable loadings.
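That correspondence is easy to verify numerically. The sketch below uses eigs_sym() from RSpectra, the successor package that rARPACK now delegates to, with the same interface; the simulated data is my own, not from the article.

```r
library(RSpectra)

set.seed(1)
x <- matrix(rnorm(500 * 50), 500, 50)
S <- cov(x)                 # sample covariance matrix

# Top 3 eigenpairs only, without a full decomposition
e <- eigs_sym(S, k = 3)

pca <- prcomp(x)
print(e$values)             # component variances
print(pca$sdev[1:3]^2)      # should match
```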

# An overview of linear algebra libraries in Scala/Java

This semester I’m taking a course on big data computing using Scala/Spark, and we are asked to complete a course project on big data analysis. Since statistical modeling relies heavily on linear algebra, I investigated some existing Scala/Java libraries that provide matrix operations and linear algebra algorithms. 1. Set-up Scala/Java libraries are usually distributed as *.jar files. To use them in Scala, we can create a directory to hold these files and set up an environment variable to tell Scala about this path.

# A conversation with Hadley Wickham

Dr. Hadley Wickham is the Chief Scientist of RStudio and Assistant Professor of Statistics at Rice University. He is the developer of the famous R package ggplot2 for data visualization and the author of many other widely used packages such as plyr and reshape2. On Sep 13, 2013 he gave a talk at the Department of Statistics, Purdue University, and afterwards I (Yixuan) had a conversation with him about his experience and interest in data visualization, data tidying, R programming, and other related topics.

# Is Normal normal?

Rumor has it that the Normal distribution is everything. It would take a long, long time to discuss the Normal distribution thoroughly, so today I will focus on one (seemingly) simple question: if $X$ and $Y$ are univariate Normal random variables, will $X+Y$ also be Normal? What’s your reaction to this question? Well, at least for me, when I saw it I said “Oh, it’s stupid.
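The classical answer is: not necessarily, unless $(X, Y)$ is jointly Normal. A quick counterexample sketch of my own in R: let $Y = SX$, where $S = \pm 1$ is a fair coin flip independent of $X \sim N(0, 1)$. Then $Y$ is also standard Normal, yet $X + Y$ equals zero half the time, so it cannot be Normal.

```r
set.seed(1)
n <- 1e5
x <- rnorm(n)                                # X ~ N(0, 1)
s <- sample(c(-1, 1), n, replace = TRUE)     # fair sign flip, independent of x
y <- s * x                                   # y is also N(0, 1) by symmetry
z <- x + y

sd(y)           # close to 1, as expected for a standard Normal
mean(z == 0)    # roughly 0.5: a point mass at zero, impossible for a Normal
```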

#### Yixuan Qiu

Statistics, Data, and Programming

Postdoctoral Researcher

Pittsburgh, PA