Tolga Ergen
Research Scientist, LG AI Research
Verified email at stanford.edu - Homepage
Title · Cited by · Year
Unsupervised anomaly detection with LSTM neural networks
T Ergen, SS Kozat
IEEE Transactions on Neural Networks and Learning Systems 31 (8), 3127-3141, 2019
Cited by 347 · 2019
Online training of LSTM networks in distributed systems for variable length data sequences
T Ergen, SS Kozat
IEEE Transactions on Neural Networks and Learning Systems 29 (10), 5159-5165, 2017
Cited by 113 · 2017
Efficient online learning algorithms based on LSTM neural networks
T Ergen, SS Kozat
IEEE Transactions on Neural Networks and Learning Systems 29 (8), 3772-3783, 2017
Cited by 110 · 2017
Neural networks are convex regularizers: Exact polynomial-time convex optimization formulations for two-layer networks
M Pilanci, T Ergen
International Conference on Machine Learning, 7695-7705, 2020
Cited by 106 · 2020
Revealing the Structure of Deep Neural Networks via Convex Duality
T Ergen, M Pilanci
arXiv preprint arXiv:2002.09773, 2020
Cited by 86* · 2020
Convex geometry and duality of over-parameterized neural networks
T Ergen, M Pilanci
Journal of Machine Learning Research 22 (212), 1-63, 2021
Cited by 56 · 2021
Implicit Convex Regularizers of CNN Architectures: Convex Optimization of Two- and Three-Layer Networks in Polynomial Time
T Ergen, M Pilanci
arXiv preprint arXiv:2006.14798, 2020
Cited by 46 · 2020
Vector-output ReLU neural network problems are copositive programs: Convex analysis of two-layer networks and polynomial-time algorithms
A Sahiner, T Ergen, J Pauly, M Pilanci
arXiv preprint arXiv:2012.13329, 2020
Cited by 41 · 2020
Global optimality beyond two layers: Training deep ReLU networks via convex programs
T Ergen, M Pilanci
International Conference on Machine Learning, 2993-3003, 2021
Cited by 36 · 2021
Convex geometry of two-layer ReLU networks: Implicit autoencoding and interpretable models
T Ergen, M Pilanci
International Conference on Artificial Intelligence and Statistics, 4024-4033, 2020
Cited by 32 · 2020
Demystifying batch normalization in ReLU networks: Equivalent convex optimization models and implicit regularization
T Ergen, A Sahiner, B Ozturkler, J Pauly, M Mardani, M Pilanci
arXiv preprint arXiv:2103.01499, 2021
Cited by 31 · 2021
Unraveling attention via convex duality: Analysis and interpretations of vision transformers
A Sahiner, T Ergen, B Ozturkler, J Pauly, M Mardani, M Pilanci
International Conference on Machine Learning, 19050-19088, 2022
Cited by 27 · 2022
Energy-efficient LSTM networks for online learning
T Ergen, AH Mirza, SS Kozat
IEEE Transactions on Neural Networks and Learning Systems 31 (8), 3114-3126, 2019
Cited by 23 · 2019
Hidden convexity of Wasserstein GANs: Interpretable generative models with closed-form solutions
A Sahiner, T Ergen, B Ozturkler, B Bartan, J Pauly, M Mardani, M Pilanci
arXiv preprint arXiv:2107.05680, 2021
Cited by 20 · 2021
Convex optimization for shallow neural networks
T Ergen, M Pilanci
2019 57th Annual Allerton Conference on Communication, Control, and …, 2019
Cited by 18 · 2019
Path regularization: A convexity and sparsity inducing regularization for parallel ReLU networks
T Ergen, M Pilanci
Advances in Neural Information Processing Systems 36, 2024
Cited by 17 · 2024
Convex neural autoregressive models: Towards tractable, expressive, and theoretically-backed models for sequential forecasting and generation
V Gupta, B Bartan, T Ergen, M Pilanci
ICASSP 2021-2021 IEEE International Conference on Acoustics, Speech and …, 2021
Cited by 14* · 2021
A novel distributed anomaly detection algorithm based on support vector machines
T Ergen, SS Kozat
Digital Signal Processing 99, 102657, 2020
Cited by 13 · 2020
Parallel deep neural networks have zero duality gap
Y Wang, T Ergen, M Pilanci
arXiv preprint arXiv:2110.06482, 2021
Cited by 11 · 2021
Globally optimal training of neural networks with threshold activation functions
T Ergen, HI Gulluk, J Lacotte, M Pilanci
arXiv preprint arXiv:2303.03382, 2023
Cited by 10 · 2023
Articles 1–20