Top View
- Deep Learning
- Deep Augmentation in Convolutional Neural Networks
- The Unreasonable Effectiveness of Deep Learning
- Implicit Rank-Minimizing Autoencoder
- Long Short-Term Memory Neural Network Equilibria Computation and Analysis
- A Tutorial on Energy-Based Learning
- A Path to AI
- Curriculum Vitae
- Training Large DNN Models on Commodity Servers for the Masses
- Loss Functions for Discriminative Training of Energy-Based Models
- Compression of Deep Learning Models for Text: a Survey
- My Reading List for Deep Learning!
- Generative Adversarial Networks (GANs): Concept and Application to Cloudy Sky Images Synthesis
- Variational Autoencoders (presented by Alex Beatson; materials from Yann LeCun, Jaan Altosaar, Shakir Mohamed)
- Yoshua Bengio, Yann LeCun, Geoffrey Hinton
- Incremental Learning Via Rate Reduction
- Signature Verification Using a "Siamese" Time Delay Neural Network
- A Closer Look at Cross-Modal Transfer of Pretrained Transformers
- Gradient-Based Learning Applied to Document Recognition
- Energy-Based Self-Supervised Learning
- Deep Learning Basics Lecture 4: Regularization II
- arXiv:1807.08169v1 [cs.LG]
- A Template for the arXiv Style
- Dynamic Auto-Encoders for Semantic Indexing
- DARE: Data Augmented Relation Extraction with GPT-2
- Cold Case: the Lost MNIST Digits
- No More Pesky Learning Rates
- Energy-Based Approaches to Representation Learning
- Convolutional Networks and Applications in Vision
- Innateness, AlphaZero, and Artificial Intelligence
- Simple Fast Convolutional Feature Learning
- Lecture 4 (Deep Learning III: Recurrent Neural Networks, Applications, Ongoing Research and Open Issues)
- Saturating Auto-Encoders
- GPT-3: Its Nature, Scope, Limits, and Consequences
- Scaling Learning Algorithms Towards AI
- Deep Learning Lecture 1: Introduction
- Your GAN Is Secretly an Energy-Based Model and You Should Use Discriminator Driven Latent Sampling
- Spatially-Sparse Convolutional Neural Networks
- Deep Learning: Past, Present and Future
- Recurrent Neural Networks (Winter 2018 lecture)
- Using FPGA Hardware to Accelerate Inference in Binary Recurrent Unit
- Improving First-Order Optimization Algorithms (Student Abstract)
- Introduction to Generative Adversarial Networks (Nicolas Morizet)
- Deep Learning Made Easier by Linear Transformations in Perceptrons