Home

Welcome!

I'm a Postdoctoral Associate in the Department of Management Science and Information Systems (MSIS) at Rutgers Business School, where I am advised by Prof. Jonathan Eckstein.

I graduated in May 2017 with a PhD in Electrical and Computer Engineering from the University of Illinois at Urbana-Champaign (UIUC). At UIUC I was advised by Prof. Pierre Moulin in the Coordinated Science Laboratory (CSL).

I received my Master of Science degree in ECE from UIUC in 2014. In 2009 I received my Bachelor of Science degree with the University Medal in Electrical Engineering from the University of New South Wales in Sydney, Australia.

Here is my CV.

Research Interests
  • Convergence (and rate) analysis of algorithms, particularly for optimization
  • Monotone operator theory and variational analysis
  • Continuous optimization (convex, nonconvex, linear/nonlinear programming, etc.)
  • Parallel, asynchronous, and distributed algorithms
  • Applications of the above in machine learning and signal processing, among other fields

See my Publications. Also see my Google Scholar page.

If you want to chat with me about one of these topics, one of my papers, or anything else, feel free to send me an email at patrick.r.johnstone@gmail.com.

More Detailed Summary of Research Interests

I am interested in designing and analyzing algorithms that solve optimization problems. The optimization problems I study arise mainly in image processing, machine learning, and compressed sensing. More specifically, I am interested in nonsmooth problems, convex and nonconvex problems, proximal/operator splitting, inertial/momentum methods, and accelerated methods. My day-to-day research revolves around convergence analyses for such algorithms. Typically these methods are iterative, meaning they produce a sequence of approximations to the solution of the optimization problem. Mathematically, one would like to establish that this sequence converges to a solution, i.e. that its limit is a solution. One would also like to establish how fast this happens, i.e. what the convergence rate of the algorithm is. Also fundamental is the overall computational complexity of an algorithm for finding a solution to within a specified accuracy.
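To fix notation: if an iterative method produces iterates x_k approximating a solution x* of min_x f(x), then the basic questions above can be written (in LaTeX notation, with C > 0 a generic constant) as

    \lim_{k \to \infty} x_k = x^\ast \quad \text{(convergence)}
    f(x_k) - f(x^\ast) \le C/k \quad \text{(an example of a sublinear rate)}
    f(x_k) - f(x^\ast) \le C \rho^k, \ \rho \in (0,1) \quad \text{(a linear rate)}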

Projective Splitting with Forward Steps

My current project is related to asynchronous, parallel projective splitting methods. This is a new and highly flexible class of operator splitting algorithms with many advantages over classical techniques. Previously, these methods made use of proximal steps with respect to the operators in the problem. Recently I worked out how to instead use forward steps with respect to any operator that is Lipschitz continuous, which leads to new variants of projective splitting with many interesting features. In general, the stepsize must be bounded by the inverse of the Lipschitz constant, but when this constant is unknown it is easy to use a backtracking linesearch instead; see the sketch below. Furthermore, for affine operators, Prof. Eckstein and I developed an adaptive stepsize that obviates the need for backtracking. It appears that convergence still holds when the operator is merely uniformly continuous (not necessarily Lipschitz), although we do require that the operator have full domain.
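Below is a minimal Python sketch of the kind of backtracking linesearch one can use for a forward step when the Lipschitz constant is unknown. The function name, the acceptance test, and all parameter values are illustrative assumptions, not the exact procedure from our papers.

    import numpy as np

    def forward_step_backtracking(B, x, rho0=1.0, theta=0.9, shrink=0.5, max_tries=50):
        # One explicit "forward" step y = x - rho * B(x), with rho chosen by
        # backtracking so that a local Lipschitz-style condition holds.
        # B: a map from R^n to R^n, assumed Lipschitz with *unknown* constant L.
        # theta in (0,1): tolerance; the accepted rho roughly satisfies
        # rho * L_local <= theta, with L_local estimated at the trial point.
        Bx = B(x)
        rho = rho0
        for _ in range(max_tries):
            y = x - rho * Bx                       # trial forward step
            lhs = rho * np.linalg.norm(B(y) - Bx)  # rho * L_local * ||y - x||
            if lhs <= theta * np.linalg.norm(y - x):
                return y, rho                      # stepsize accepted
            rho *= shrink                          # shrink and retry
        return y, rho                              # fall back to last trial

The acceptance test rho * ||B(y) - B(x)|| <= theta * ||y - x|| is just the condition rho * L <= theta with L replaced by a local estimate, so for a genuinely Lipschitz operator the loop terminates after finitely many halvings.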

Subgradient Methods under Hölderian Growth

Towards the end of my PhD, I developed subgradient methods for minimizing functions satisfying Hölderian growth conditions. These conditions are extremely useful for deriving convergence rates and are present in many interesting optimization problems. One interesting application is robust phase retrieval, which, despite being nonconvex, is weakly convex (a.k.a. prox-regular) and satisfies the sharp Hölder growth condition. Therefore the subgradient methods I developed can be applied.
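For reference, a function f with minimum value f* and solution set X* satisfies Hölderian growth with exponent p >= 1 and constant c > 0 on a given set if, in LaTeX notation,

    f(x) - f^\ast \ge c \, \mathrm{dist}(x, X^\ast)^{p}

for all x in that set; p = 1 is the "sharp" case relevant to robust phase retrieval, and p = 2 is quadratic growth.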

Local Convergence Results for Sparse Optimization

My early work in my PhD focused on sparse optimization, which arises mainly in compressed-sensing-like problems. I derived local convergence results for an inertial method, proving that it identifies the correct support set in a finite number of iterations and converges linearly thereafter. While this was already known for non-inertial variants, my contribution extended it to inertial variants; a sketch of the kind of method involved appears below.
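As an illustration, here is a minimal Python sketch of an inertial proximal gradient method ("inertial ISTA") applied to an l1-regularized least-squares problem, a standard sparse-optimization model. The parameter values are illustrative assumptions and this is not the exact method or stepsize rule analyzed in my work.

    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1 (soft thresholding).
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def inertial_ista(A, b, lam, alpha=0.3, n_iter=500):
        # Inertial proximal gradient for
        #     min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
        # alpha in [0, 1) is the inertial (momentum) parameter;
        # alpha = 0 recovers plain ISTA.
        L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
        step = 1.0 / L
        x_prev = x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            y = x + alpha * (x - x_prev)   # inertial extrapolation
            grad = A.T @ (A @ y - b)       # gradient of the smooth part at y
            x_prev, x = x, soft_threshold(y - step * grad, step * lam)
        return x

Finite-time support identification means that after some iteration the zero pattern of the iterates stops changing and matches that of the solution; from then on the method effectively solves a smooth problem restricted to the support, which is where the linear rate comes from.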

Early Work

Very early in my PhD I did some work on statistical hypothesis testing, determining the asymptotic behavior of the error probabilities of certain statistical tests. I also worked out the behavior of the Kullback-Leibler divergence under the central limit theorem; specifically, what happens to the divergence between the distribution of a normalized sum of i.i.d. random variables and its Gaussian limit.
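For context, the Kullback-Leibler divergence between densities p and q, which is the quantity whose behavior that work tracked, is defined (in LaTeX notation) as

    D(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx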

Other Interests

I play guitar and drums (although I no longer own a drum kit), and recently I finally started to learn piano properly. I own a standard Strat, a "Classic Custom" Les Paul, and a Maton EM325C acoustic. I am an avid user of Twitch and YouTube. The only sporting team I follow these days is the Australian rugby team, the Wallabies, who will hopefully win another World Cup sometime during my lifetime (yes, my expectations are low).