Hadi Daneshmand
Inria Paris
Verified email at inria.fr
Title · Cited by · Year
Inferring causal molecular networks: empirical assessment through a community-based effort
SM Hill, LM Heiser, T Cokelaer, M Unger, NK Nesser, DE Carlin, Y Zhang, ...
Nature methods 13 (4), 310-318, 2016
Cited by 170 · 2016
Estimating diffusion network structures: Recovery conditions, sample complexity & soft-thresholding algorithm
H Daneshmand, M Gomez-Rodriguez, L Song, B Schölkopf
International Conference on Machine Learning, 793-801, 2014
Cited by 106 · 2014
Escaping saddles with stochastic gradients
H Daneshmand, J Kohler, A Lucchi, T Hofmann
arXiv preprint arXiv:1803.05999, 2018
Cited by 67 · 2018
Exponential convergence rates for batch normalization: The power of length-direction decoupling in non-convex optimization
J Kohler, H Daneshmand, A Lucchi, T Hofmann, M Zhou, K Neymeyr
The 22nd International Conference on Artificial Intelligence and Statistics …, 2019
Cited by 62* · 2019
Local saddle point optimization: A curvature exploitation approach
L Adolphs, H Daneshmand, A Lucchi, T Hofmann
The 22nd International Conference on Artificial Intelligence and Statistics …, 2019
Cited by 41 · 2019
Starting small-learning with adaptive sample sizes
H Daneshmand, A Lucchi, T Hofmann
International Conference on Machine Learning, 1463-1471, 2016
Cited by 35 · 2016
Adaptive Newton method for empirical risk minimization to statistical accuracy
A Mokhtari, H Daneshmand, A Lucchi, T Hofmann, A Ribeiro
Advances in Neural Information Processing Systems, 4062-4070, 2016
Cited by 33* · 2016
Estimating diffusion networks: Recovery conditions, sample complexity & soft-thresholding algorithm
M Gomez-Rodriguez, L Song, H Daneshmand, B Schölkopf
The Journal of Machine Learning Research 17 (1), 3092-3120, 2016
Cited by 29 · 2016
A time-aware recommender system based on dependency network of items
SM Daneshmand, A Javari, SE Abtahi, M Jalili
The Computer Journal 58 (9), 1955-1966, 2015
Cited by 17 · 2015
Batch normalization provably avoids ranks collapse for randomly initialised deep networks
H Daneshmand, J Kohler, F Bach, T Hofmann, A Lucchi
Advances in Neural Information Processing Systems 33, 2020
Cited by 2* · 2020
Optimization for Neural Networks: Quest for Theoretical Understandings
H Daneshmand
ETH Zurich, 2020
2020
Mixing of Stochastic Accelerated Gradient Descent
P Zhang, H Daneshmand, T Hofmann
arXiv preprint arXiv:1910.14616, 2019
2019
Accelerated Dual Learning by Homotopic Initialization
H Daneshmand, H Hassani, T Hofmann
arXiv preprint arXiv:1706.03958, 2017
2017
Articles 1–13