
Adel Bibi

PhD Student - Electrical Engineering (Computer Vision)

King Abdullah University of Science and Technology (KAUST)

Biography

I am a PhD student at King Abdullah University of Science & Technology (KAUST), located on the Red Sea at Thuwal in the Kingdom of Saudi Arabia. I am part of the Image and Video Understanding Lab (IVUL), a large and diverse group advised by Professor Bernard Ghanem within the Visual Computing Center (VCC). I have worked on a variety of problems that I personally find interesting and challenging.

Interests

  • Computer Vision
  • Machine Learning
  • Optimization

Education

  • MSc in Electrical Engineering (4.0/4.0), 2016

    King Abdullah University of Science and Technology (KAUST)

  • BSc in Electrical Engineering (3.99/4.0), 2014

    Kuwait University

News

  • [Dec 20th, 2019]: One paper accepted to ICLR20.
  • [Nov 11th, 2019]: One spotlight paper accepted to AAAI20.
  • [Sept 25th, 2019]: Recognized as outstanding reviewer for ICCV19. Link.
  • [August 5th, 2019]: I was invited to give a talk about the most recent computer vision and machine learning research from the IVUL group at PRIS19, Dead Sea, Jordan. I also gave a one-hour workshop on deep learning and PyTorch. Slides1/ Slides2/ Material.
  • [July 6th, 2019]: I was invited to give a talk at the Eastern European Conference on Computer Vision, Odessa, Ukraine. Slides.
  • [June 28th, 2019]: I gave a talk at the Biomedical Computer Vision Group directed by Prof Pablo Arbelaez, Bogota, Colombia. Slides.
  • [June 15th, 2019]: Attended CVPR19.
  • [June 9th, 2019]: Recognized as an outstanding reviewer for CVPR19. This is the second time in a row for CVPR. Check it out. :)
  • [May 26th, 2019]: A new paper is out on derivative-free optimization with momentum, with new rates and results on continuous control tasks. arXiv.
  • [May 25th, 2019]: New paper! New provably tight interval bounds are derived for DNNs. This allows for very simple robust training of large DNNs. arXiv.
  • [May 11th, 2019]: How to train robust networks that outperform 2-21x data augmentation? New paper out on arXiv.
  • [May 6th, 2019]: Attended ICLR19 in New Orleans.
  • [Feb 4th, 2019]: New paper on derivative-free optimization with importance sampling is out! Paper is on arXiv.
  • [Dec 22nd, 2018]: One paper accepted to ICLR19, Louisiana, USA.
  • [Nov 6th, 2018]: One paper accepted to WACV19, Hawaii, USA.
  • [July 3rd, 2018]: One paper accepted to ECCV18, Munich, Germany.
  • [June 19th, 2018]: Attended CVPR18 and gave an oral talk on our most recent work on analyzing piecewise linear deep networks using Gaussian network moments. TensorFlow, PyTorch, and MATLAB code is released.
  • [June 17th, 2018]: Received a fully funded scholarship to attend the AI-DLDA 18 summer school in Udine, Italy. Unfortunately, I won’t be able to attend due to time constraints. Link
  • [June 15th, 2018]: New paper out! “Improving SAGA via a Probabilistic Interpolation with Gradient Descent”.
  • [April 30th, 2018]: I’m interning for 6 months at Intel Labs in Munich this summer with Vladlen Koltun.
  • [April 22nd, 2018]: Recognized as an outstanding reviewer for CVPR18. I’m also on the list of emergency reviewers. Check it out. :)
  • [March 6th, 2018]: One paper accepted as [Oral] in CVPR 2018.
  • [Feb 5, 2018]: Awarded the best KAUST poster prize at the Optimization and Big Data Conference.
  • [December 11, 2017]: TCSC code is on GitHub.
  • [October 22, 2017]: Attended ICCV17, Venice, Italy.
  • [July 22, 2017]: Attended CVPR17 in Hawaii and gave an oral presentation on our work on solving the LASSO with FFTs, July 2017.
  • [July 16, 2017]: FFTLasso’s code is available online.
  • [July 9, 2017]: Attended the ICVSS17, Sicily, Italy.
  • [June 15, 2017]: Selected to attend the International Computer Vision Summer School (ICVSS17), Sicily, Italy.
  • [March 17, 2017]: 1 paper accepted to ICCV17.
  • [March 14, 2017]: Received my Deep Learning Nanodegree from Udacity.
  • [March 3, 2017]: 1 oral paper accepted to CVPR17, Hawaii, USA.
  • [October 19, 2016]: ECCV16’s code has been released on GitHub.
  • [October 8, 2016]: Attended ECCV16, Amsterdam, Netherlands.
  • [July 11, 2016]: 1 spotlight paper accepted to ECCV16, Amsterdam, Netherlands.
  • [June 26, 2016]: Attended CVPR16, Las Vegas, USA. Two papers presented.
  • [May 13, 2016]: ICCVW15 code is now available online.
  • [April 11, 2016]: Successfully defended my Master’s Thesis.
  • [March 2, 2016]: 2 papers (1 spotlight) accepted to CVPR16, Las Vegas, USA.
  • [November 20, 2015]: 1 paper accepted to ICCVW15, Santiago, Chile.
  • [June 8, 2015]: Attended CVPR15, Boston, USA.

Recent Publications


On the Decision Boundaries of Deep Neural Networks: A Tropical Geometry Perspective

This work tackles the problem of characterizing and understanding the decision boundaries of neural networks with piecewise linear non-linearity activations. We use tropical geometry, a new development in the area of algebraic geometry, to characterize the decision boundaries of a simple neural network of the form (Affine, ReLU, Affine). Our main finding is that the decision boundaries are a subset of a tropical hypersurface, which is intimately related to a polytope formed by the convex hull of two zonotopes. The generators of these zonotopes are functions of the neural network parameters. This geometric characterization provides a new perspective on three tasks. Specifically, we propose a new tropical perspective on the lottery ticket hypothesis, where we study the effect of different initializations on the tropical geometric representation of a network’s decision boundaries. Moreover, we use this characterization to propose a new set of tropical regularizers, which directly deal with the decision boundaries of a network. We investigate the use of these regularizers in neural network pruning (by removing network parameters that do not contribute to the tropical geometric representation of the decision boundaries) and in generating adversarial input attacks (by producing input perturbations that explicitly perturb the decision boundaries’ geometry and ultimately change the network’s prediction).
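To make the zonotope language above concrete, here is a toy sketch: it builds a small (Affine, ReLU, Affine) network and forms two zonotopes, as Minkowski sums of line segments, from one plausible weighting of the first-layer rows by the output-difference weights. The choice of generators here is a hypothetical illustration, not the paper's exact construction.

```python
import numpy as np

# Toy (Affine, ReLU, Affine) network with 2-D input and two outputs.
rng = np.random.default_rng(0)
A, c = rng.standard_normal((4, 2)), rng.standard_normal(4)   # first affine layer
B, d = rng.standard_normal((2, 4)), rng.standard_normal(2)   # second affine layer

def net(x):
    return B @ np.maximum(A @ x + c, 0) + d

# Hypothetical generator choice: weight each row of A by the positive /
# negative parts of the output-difference row of B, yielding two zonotopes.
w = B[0] - B[1]                            # decision is the sign of this output difference
gen_pos = np.maximum(w, 0)[:, None] * A    # generators of the first zonotope
gen_neg = np.maximum(-w, 0)[:, None] * A   # generators of the second zonotope

def zonotope_points(gens):
    """All subset sums of the generators: the vertices of the zonotope
    (the Minkowski sum of segments [0, g_j]) lie among these points."""
    n = len(gens)
    masks = (np.arange(2**n)[:, None] >> np.arange(n)) & 1
    return masks @ gens

pts = zonotope_points(gen_pos)
print(pts.shape)  # (16, 2): 2^4 subset sums in the plane
```

Taking the convex hull of `pts` (e.g. with `scipy.spatial.ConvexHull`) would recover the zonotope itself.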

Robust Gabor Networks

This work takes a step towards investigating the benefits of merging classical vision techniques with deep learning models. Formally, we explore the effect of replacing the first layers of neural network architectures with convolutional layers based on Gabor filters with learnable parameters. As a first result, we observe that architectures utilizing Gabor filters as low-level kernels preserve the test set accuracy of deep convolutional networks; this architectural change therefore retains their ability to extract useful low-level features. Furthermore, we observe that architectures enhanced with Gabor layers gain robustness compared to the regular models. Additionally, the existence of a closed mathematical expression for the Gabor kernels allows us to develop an analytical upper bound on the Lipschitz constant of the Gabor layer, which in turn lets us propose a simple regularizer to enhance the robustness of the network. We conduct extensive experiments with several architectures and datasets, and show the beneficial effects that the introduction of Gabor layers has on the robustness of deep convolutional networks.
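As a rough illustration of the closed-form kernels mentioned above, the sketch below samples a bank of oriented Gabor filters from the standard Gabor expression. The filter size and parameter values are arbitrary choices for illustration; the paper's learnable-parameter layer and regularizer are not reproduced here.

```python
import numpy as np

def gabor_kernel(size, sigma, theta, lam, gamma, psi):
    """Sample a 2-D Gabor filter on a size x size grid:
    a Gaussian envelope modulating an oriented cosine carrier."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / lam + psi)
    return envelope * carrier

# A small bank of filters at 8 orientations, e.g. to initialize a first conv layer.
bank = np.stack([
    gabor_kernel(7, sigma=2.0, theta=t, lam=4.0, gamma=0.5, psi=0.0)
    for t in np.linspace(0, np.pi, 8, endpoint=False)
])
print(bank.shape)  # (8, 7, 7)
```

In a framework like PyTorch, such a bank could be copied into the weights of the first convolutional layer, with (sigma, theta, lam, gamma, psi) kept as trainable parameters.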

A Stochastic Derivative Free Optimization Method with Momentum

We consider the problem of unconstrained minimization of a smooth objective function in ℝ^d in a setting where only function evaluations are possible. We propose and analyze a stochastic zeroth-order method with heavy-ball momentum. In particular, we propose SMTP, a momentum version of the stochastic three-point method (STP). We show new complexity results for non-convex, convex, and strongly convex functions. We test our method on a collection of continuous control tasks on several MuJoCo environments of varying difficulty and compare against STP, other state-of-the-art derivative-free optimization algorithms, and policy gradient methods. SMTP significantly outperforms STP and all other methods considered in our numerical experiments. Our second contribution is SMTP with importance sampling, which we call SMTP_IS. We provide convergence analysis of this method for non-convex, convex, and strongly convex objectives.
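As a loose illustration of the three-point idea with momentum, the sketch below tries a random direction in both signs at each step, keeps the best of the three candidates (including staying put), and carries a simple heavy-ball term. This is only an illustrative sketch; the exact SMTP update, step-size rules, and sampling distribution in the paper differ.

```python
import numpy as np

def smtp_sketch(f, x0, steps=500, gamma=0.1, beta=0.5, seed=0):
    """Derivative-free minimization sketch: at each step, probe +/- a random
    unit direction folded into a heavy-ball momentum term, and keep whichever
    of the three candidates (stay, move +, move -) has the lowest f value."""
    rng = np.random.default_rng(seed)
    x, v = x0.astype(float), np.zeros_like(x0, dtype=float)
    for _ in range(steps):
        s = rng.standard_normal(x.shape)
        s /= np.linalg.norm(s)                       # random unit direction
        v_plus, v_minus = beta * v + gamma * s, beta * v - gamma * s
        candidates = [
            (f(x), x, beta * v),                     # stay; momentum decays
            (f(x - v_plus), x - v_plus, v_plus),
            (f(x - v_minus), x - v_minus, v_minus),
        ]
        _, x, v = min(candidates, key=lambda t: t[0])  # best of three
    return x

# Minimize a simple quadratic using only function evaluations.
x_star = smtp_sketch(lambda z: np.sum(z**2), np.ones(5))
```

Because the current point is always one of the three candidates, the objective value is non-increasing along the iterates.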

Talks

EECVC19, Odessa, Ukraine. Optimization Approach to a Block of Layers and Derivative Free Optimization. Slides

CVPR18, Utah, USA. Analytic Expressions for Probabilistic Moments of PL-DNN With Gaussian Input. Slides

CVPR17, Hawaii, USA. FFTLasso: Large-Scale LASSO in the Fourier Domain. Slides

Optimization and Big Data Conference 2018, KAUST, Saudi Arabia. High Order Tensor Formulation for Convolutional Sparse Coding.

ECCV16, Amsterdam, Netherlands. Target Response Adaptation for Correlation Filter Tracking. Slides

Contact