Affiliation: Berlin Big Data Center, Machine Learning Group, Technische Universität Berlin
Research interests: Statistical Learning Theory, Machine Learning, Computer Vision, Data Mining
Email: nakajima(at)tu-berlin.de --- please replace (at) with "@".

News

- Our conference paper on graph analysis has been accepted:
  - J. Höner, S. Nakajima, A. Bauer, K.-R. Müller, N. Görnitz, ''Minimizing Trust Leaks for Robust Sybil Detection''
- Our journal papers on structural learning have been accepted:
  - L. A. Lima, N. Görnitz, L. E. Varella, M. Vellascob, K.-R. Müller, S. Nakajima, ''Porosity Estimation by Semi-supervised Learning with Sparsely Available Labeled Samples''
  - N. Görnitz, L. A. Lima, L. E. Varella, K.-R. Müller, S. Nakajima, ''Transductive Regression for Data with Latent Dependency Structure''
  - A. Bauer, S. Nakajima, K.-R. Müller, ''Efficient Exact Inference with Loss Augmented Objective in Structured Learning''
- Our journal papers have been accepted:
  - K. Nagata, J. Kitazono, S. Nakajima, S. Eifuku, R. Tamura, M. Okada, ''An Exhaustive Search and Stability of Sparse Estimation for Feature Selection Problem''
  - S. Nakajima, R. Tomioka, M. Sugiyama, S. D. Babacan, ''Condition for Perfect Dimensionality Recovery by Variational Bayesian PCA''
- The following papers have been accepted:
  - S. Nakajima, I. Sato, M. Sugiyama, K. Watanabe, H. Kobayashi, ''Analysis of Variational Bayesian Latent Dirichlet Allocation: Weaker Sparsity than MAP,'' NIPS2014 (Montreal, Canada, December 8-13, 2014)
  - S. Nakajima, M. Sugiyama, ''Global Solution and Performance Analysis of Variational Bayesian Learning: A Short Summary''
  - S. D. Babacan, S. Nakajima, M. N. Do, ''Bayesian Group-Sparse Modeling and Variational Inference''
  - S. Nakajima, M. Sugiyama, ''Analysis of Empirical MAP and Empirical Partially Bayes: Can They be Alternatives to Variational Bayes?''
- Our conference papers have been accepted:
  - S. Nakajima, A. Takeda, S. D. Babacan, M. Sugiyama, I. Takeuchi, ''Global Solver and Its Efficient Approximation for Variational Bayesian Low-Rank Subspace Clustering''
  - I. Takeuchi, T. Hongo, M. Sugiyama, S. Nakajima, ''Parametric Task Learning''
- Our conference paper on light field acquisition has been accepted:
  - P. Ruiz, J. Mateos, M. C. Cardenas, S. Nakajima, R. Molina, A. K. Katsaggelos, ''Light Field Acquisition From Blurred Observations Using a Programmable Coded Aperture Camera''
- Our journal papers have been accepted:
  - S. Nakajima, M. Sugiyama, S. D. Babacan, ''Variational Bayesian Sparse Additive Matrix Factorization''
  - S. Nakajima, M. Sugiyama, S. D. Babacan, R. Tomioka, ''Global Analytic Solution of Fully-observed Variational Bayesian Matrix Factorization''
- Our conference papers on variational Bayesian methods have been accepted:
  - S. Nakajima, R. Tomioka, M. Sugiyama, S. D. Babacan, ''Perfect Dimensionality Recovery by Variational Bayesian PCA''
  - S. D. Babacan, S. Nakajima, M. Do, ''Probabilistic Low-Rank Subspace Clustering''
  - S. Nakajima, M. Sugiyama, S. D. Babacan, ''Sparse Additive Matrix Factorization for Robust PCA and Its Generalization''
- Our journal paper on multiple kernel learning has been accepted:
  - A. Binder, S. Nakajima, M. Kloft, C. Mueller, W. Samek, U. Brefeld, K.-R. Mueller, M. Kawanabe, ''Insights from Classifying Visual Concepts with Multiple Kernel Learning''
* Model-induced Regularization (MIR)

First, please take a look at the one-page introduction of model-induced regularization (MIR) with the simplest example.
MIR is observed when Bayesian estimation is applied to non-identifiable models, where the mapping between parameters and distributions is not one-to-one. Non-identifiable models include most modern probabilistic models, e.g., neural networks, hidden Markov models, Bayesian networks, and matrix factorization. Non-identifiability leads to density non-uniformity of the distribution functions in the parameter space. Typically, this makes the conjugate prior (even if almost flat) favor simpler models. My former supervisor, Prof. Sumio Watanabe, has been studying this effect and developed an algebraic-geometrical method for quantitatively evaluating the generalization performance of (rigorous) Bayesian estimation. In my PhD course, I analyzed the asymptotic behavior (as the number of samples goes to infinity) of the variational Bayesian (VB) approximation for linear neural networks in
- S. Nakajima, S. Watanabe, ''Variational Bayes Solution of Linear Neural Networks and its Generalization Performance,''
Neural Computation, vol.19, no.4, pp.1112-1153, 2007.
I later analyzed the SimpleVB (column-wise independence constraint) estimator for matrix factorization in
- S. Nakajima, M. Sugiyama, ''Theoretical Analysis of Bayesian Matrix Factorization,''
- S. Nakajima, M. Sugiyama, R. Tomioka, ''Global Analytic Solution for Variational Bayesian Matrix Factorization,''
NIPS2010, spotlight (Vancouver, Canada, December 6-11, 2010); videolectures, poster, and slides available.
It was also shown that the VB (matrix-wise independence constraint) solution is essentially the same as the SimpleVB solution in
- S. Nakajima, M. Sugiyama, S. D. Babacan, ''Global Solution of Fully-Observed Variational Bayesian Matrix Factorization is Column-Wise Independent,''
MIR never occurs in maximum a posteriori (MAP) estimation, because density non-uniformity affects estimation only when at least one parameter is integrated out. It was shown that MIR is observed in partially Bayesian (PB) estimation, where a part of the parameters is integrated out.
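The shrinkage behind MIR can be sketched numerically in perhaps the simplest non-identifiable model, a scalar factorization y ≈ u·v. The toy example below is hypothetical (it is not taken from any of the papers cited on this page): it compares the Bayesian posterior mean of the product u·v, where density non-uniformity acts, against the posterior mean under a direct, identifiable parameterization of the same mean.

```python
import math

# Hypothetical toy example of MIR: we observe one noisy scalar
# y = u*v + noise. The product parameterization (u, v) is
# non-identifiable: all pairs with the same product u*v define
# the same distribution over y.
y, sigma, c = 1.0, 1.0, 3.0      # observation, noise std, broad prior std

# Posterior mean of the product u*v under priors u, v ~ N(0, c^2),
# computed by brute-force grid integration over the (u, v) plane.
n, lim = 241, 12.0               # grid covers 4 prior standard deviations
grid = [-lim + 2 * lim * i / (n - 1) for i in range(n)]
num = den = 0.0
for u in grid:
    for v in grid:
        w = math.exp(-(y - u * v) ** 2 / (2 * sigma ** 2)
                     - (u * u + v * v) / (2 * c * c))
        num += u * v * w
        den += w
e_product = num / den

# Identifiable reference: parameterize the mean directly as p with the
# same prior p ~ N(0, c^2); the conjugate posterior mean is closed-form.
e_direct = y * c ** 2 / (c ** 2 + sigma ** 2)

print(e_product, e_direct)
```

Running this, the posterior mean under the product parameterization is pulled noticeably closer to zero than the direct one, even though both priors are equally broad: the density that (u, v) induces on the product u·v concentrates near zero, and integrating over (u, v) turns this non-uniformity into extra shrinkage. This extra shrinkage is the model-induced regularization, and it vanishes if no parameter is integrated out, as in MAP estimation.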
- S. Nakajima, M. Sugiyama, S. D. Babacan, ''On Bayesian PCA: Automatic Dimensionality Selection and Analytic Solution,''
See also my thesis for a summary of related works.

* Image Analysis and Multiple Kernel Learning

- Since my one-year stay in the Intelligent Data Analysis (IDA) group at Berlin Institute of Technology in 2008, I have been working on image analysis, collaborating with the image analysis, multiple kernel learning (MKL), and robotics researchers in IDA. We investigated the performance of non-sparse MKL in object recognition:
- A. Binder, S. Nakajima, M. Kloft, C. Mueller, W. Samek, U. Brefeld, K.-R. Mueller, M. Kawanabe, ''Insights from Classifying Visual Concepts with Multiple Kernel Learning,''
- S. Nakajima, A. Binder, C. Müller, W. Wojcikiewicz, M. Kloft, U. Brefeld, K.-R. Müller, M. Kawanabe,
''Multiple Kernel Learning for Object Classification,'' IBIS2009, (Fukuoka, Japan, October 19-21, 2009). - A. Binder, M. Kawanabe, M. Kloft, S. Nakajima,
''Enhancing Image Annotation with Primitive Color Histograms via Non-sparse Multiple Kernel Learning,'' NIPS Workshop on Understanding Multiple Kernel Learning Methods, (Vancouver, Canada, December 11, 2009). - M. Kloft, S. Nakajima, U. Brefeld, ''Feature Selection for Density Level-Sets,''
ECML-PKDD2009, (Bled, Slovenia, September 7-11, 2009). - M. Kawanabe, S. Nakajima, A. Binder,
''A procedure of adaptive kernel combination with kernel-target alignment for object classification,'' CIVR2009, (Santorini, Greece, July 8-10, 2009). - N. Plath, M. Toussaint, S. Nakajima,
''Multi-class image segmentation using conditional random fields and global classification,'' ICML2009, (Montreal, Canada, June 14-18, 2009).

* Machine Learning for Photolithography
- Steppers and scanners are among Nikon's most important products. I worked for several years on alignment: positioning a silicon wafer relative to a mask with an accuracy of a few nanometers. I developed a series of signal processing algorithms:
- S. Nakajima, Y. Kanaya, N. Magome, ''Improving the Measurement Algorithm for Alignment,''
SPIE Microlithography 2001 (Santa Clara, U.S.A., March, 2001), - S. Nakajima, S. Watanabe,
''Simulation Data Generation from Extended EGA Model and Optimization of Alignment Strategy for Lithography,'' ISITA2004 (Parma, Italy, October 10-13, 2004), - S. Nakajima, Y. Kanaya, M. Li, T. Sugihara, A. Sukegawa, N. Magome, ''Outlier Rejection with Mixture Models in Alignment,''
SPIE Microlithography 2003 (Santa Clara, U.S.A., March, 2003). - M. Sugiyama, S. Nakajima, ''Pool-based active learning in approximate linear regression,''
Machine Learning, vol.75, no.3, pp.249-274, 2009.
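To give a flavor of the alignment problem above, here is a minimal, hypothetical sketch in the spirit of EGA-style global alignment models (a simplification for illustration, not any actual Nikon algorithm): fit a low-parameter linear model (offset, scale, rotation) to a handful of measured alignment-mark positions by least squares, then predict where every exposure field actually sits.

```python
import math
import random

def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(3):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [mr - f * mi for mr, mi in zip(M[r], M[i])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_affine_1d(marks, meas):
    """Least-squares fit of meas ~= a*x + b*y + t over the measured marks."""
    A = [[0.0] * 3 for _ in range(3)]
    rhs = [0.0] * 3
    for (x, y), m in zip(marks, meas):
        phi = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                A[i][j] += phi[i] * phi[j]
            rhs[i] += phi[i] * m
    return solve3(A, rhs)

# Design positions (mm) of the measured alignment marks on the wafer.
marks = [(-80, -80), (80, -80), (-80, 80), (80, 80), (0, 0), (60, -20)]

# Simulate measurements: the wafer placement has a small offset, scale
# error, and rotation, plus per-mark measurement noise (10 nm std).
random.seed(0)
tx, ty, scale, theta = 0.003, -0.002, 1.00002, 1e-5
def true_pos(x, y):
    return (scale * (math.cos(theta) * x - math.sin(theta) * y) + tx,
            scale * (math.sin(theta) * x + math.cos(theta) * y) + ty)
meas = [tuple(c + random.gauss(0, 1e-5) for c in true_pos(x, y))
        for x, y in marks]

# Fit the two coordinates independently (6 parameters in total).
ax, bx, cx = fit_affine_1d(marks, [m[0] for m in meas])
ay, by, cy = fit_affine_1d(marks, [m[1] for m in meas])

# Predict an unmeasured exposure field at design position (40, 40)
# and compare with the ground truth.
px, py = ax * 40 + bx * 40 + cx, ay * 40 + by * 40 + cy
gx, gy = true_pos(40, 40)
err = math.hypot(px - gx, py - gy)
print(f"prediction error: {err * 1e6:.1f} nm")
```

Because the model has only six parameters and every measured mark constrains all of them, the noise averages out and unmeasured fields are predicted more accurately than any single mark is measured. The real engineering problem, addressed in the papers above, is everything this sketch leaves out: nonlinear wafer distortion, asymmetric mark signals, and outlying measurements.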
