Koby Bibas
PhD candidate, TAU, Israel
Verified email at mail.tau.ac.il - Homepage
A New Look at an Old Problem: A Universal Learning Approach to Linear Regression
K Bibas, Y Fogel, M Feder
The 2019 IEEE International Symposium on Information Theory (ISIT), 2019
Cited by 42 · 2019
Single layer predictive normalized maximum likelihood for out-of-distribution detection
K Bibas, M Feder, T Hassner
Advances in Neural Information Processing Systems 34, 1179-1191, 2021
Cited by 22 · 2021
Deep pNML: Predictive normalized maximum likelihood for deep neural networks
K Bibas, Y Fogel, M Feder
arXiv preprint arXiv:1904.12286, 2019
Cited by 19 · 2019
Balancing Specialization, Generalization, and Compression for Detection and Tracking
D Kaufman, K Bibas, E Borenstein, M Chertok, T Hassner
British Machine Vision Conference (BMVC), 2019
Cited by 7 · 2019
Distribution Free Uncertainty for the Minimum Norm Solution of Over-parameterized Linear Regression
K Bibas, M Feder
Workshop on Distribution-Free Uncertainty Quantification ICML 2021, 2021
Cited by 6* · 2021
Learning Rotation Invariant Features for Cryogenic Electron Microscopy Image Reconstruction
K Bibas, G Weiss-Dicker, D Cohen, N Cahan, H Greenspan
2021 IEEE 18th International Symposium on Biomedical Imaging (ISBI), 2021
Cited by 5 · 2021
Semi-supervised Adversarial Learning for Complementary Item Recommendation
K Bibas, O Sar Shalom, D Jannach
Proceedings of the ACM Web Conference 2023, 1804-1812, 2023
Cited by 2 · 2023
Collaborative Image Understanding
K Bibas, O Sar Shalom, D Jannach
Proceedings of the 31st ACM International Conference on Information …, 2022
Cited by 2 · 2022
Utilizing adversarial targeted attacks to boost adversarial robustness
U Pesso, K Bibas, M Feder
arXiv preprint arXiv:2109.01945, 2021
Cited by 2 · 2021
Deep Individual Active Learning: Safeguarding against Out-of-Distribution Challenges in Neural Networks
S Shayovitz, K Bibas, M Feder
Entropy 26 (2), 129, 2024
2024
Beyond Ridge Regression for Distribution-Free Data
K Bibas, M Feder
arXiv preprint arXiv:2206.08757, 2022
2022