Code for the paper [ECCV 2018] Semi-Supervised Deep Learning with Memory
Deep-learning-based side-channel privileged-memory reader
Sublinear memory optimization for deep learning: reduces GPU memory cost to train deeper nets
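For reference, the core trick here is activation (gradient) checkpointing; a minimal sketch in PyTorch follows. This repo may target a different framework, and the layer sizes and segment count below are illustrative only:

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint_sequential

# A deep stack of layers; stored activations normally dominate GPU memory.
layers = nn.Sequential(*[nn.Sequential(nn.Linear(1024, 1024), nn.ReLU())
                         for _ in range(64)])

x = torch.randn(32, 1024, requires_grad=True)

# Split the stack into 8 segments: only segment-boundary activations are
# kept; the rest are recomputed during backward, trading compute for memory.
out = checkpoint_sequential(layers, 8, x)
out.sum().backward()
```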
A simple memory manager for CUDA designed to help Deep Learning frameworks manage memory
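As a rough sketch of what such a memory manager does, here is a hypothetical, framework-agnostic caching allocator in Python; real CUDA allocators additionally split and merge device blocks, and the names below are illustrative:

```python
from collections import defaultdict

class CachingAllocator:
    """Toy caching allocator: freed buffers go into a size-bucketed pool
    and are reused by later allocations, avoiding repeated (slow)
    cudaMalloc/cudaFree round-trips. Hypothetical sketch only."""

    def __init__(self, backend_alloc, backend_free):
        self.pool = defaultdict(list)        # rounded size -> cached buffers
        self.backend_alloc = backend_alloc   # e.g. a cudaMalloc wrapper
        self.backend_free = backend_free     # e.g. a cudaFree wrapper

    @staticmethod
    def _round(nbytes, granularity=512):
        # Round up to an allocation granularity so buffers are reusable.
        return -(-nbytes // granularity) * granularity

    def malloc(self, nbytes):
        size = self._round(nbytes)
        if self.pool[size]:
            return self.pool[size].pop()     # reuse a cached block
        return self.backend_alloc(size)      # fall back to the device

    def free(self, ptr, nbytes):
        self.pool[self._round(nbytes)].append(ptr)  # cache, don't release

    def empty_cache(self):
        # Actually return all cached blocks to the device.
        for ptrs in self.pool.values():
            for p in ptrs:
                self.backend_free(p)
        self.pool.clear()
```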
Implements deep learning techniques in Google TensorFlow: a fully connected deep neural network trained with SGD and ReLUs; regularization of a multi-layer neural network with ReLUs, L2 regularization, and dropout to prevent overfitting; convolutional neural networks (CNNs) with learning rate decay and dropout; and recurrent neural networks (RNNs) for text and sequences, using Long Short-Term Memory (LSTM) networks.
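For illustration, a minimal modern TensorFlow (Keras) version of the regularized fully connected model described above; the layer sizes, rates, and class count are placeholders, not the repo's actual settings:

```python
import tensorflow as tf

# Multi-layer network with ReLUs, L2 regularization, and dropout,
# trained with plain SGD (hyperparameters are illustrative).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(
        256, activation="relu", input_shape=(784,),
        kernel_regularizer=tf.keras.regularizers.l2(1e-4)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10),  # logits for 10 classes
])
model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=0.1),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
```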
All kinds of text classification models, and more, built with deep learning
Benchmarking Deep Learning operations on different hardware
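The usual shape of such a benchmark, sketched here in PyTorch (the repo's actual harness, operations, and hardware list are not specified): warm up, synchronize, then average over timed iterations.

```python
import time
import torch

def bench_matmul(device, n=4096, iters=10):
    """Time an n x n matmul on a device. Warm-up and explicit
    synchronization keep GPU timings from being skewed by lazy
    initialization and asynchronous kernel launches."""
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    for _ in range(3):                  # warm-up runs
        a @ b
    if device.type == "cuda":
        torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(iters):
        a @ b
    if device.type == "cuda":
        torch.cuda.synchronize()
    return (time.perf_counter() - start) / iters

print("cpu :", bench_matmul(torch.device("cpu")))
if torch.cuda.is_available():
    print("cuda:", bench_matmul(torch.device("cuda")))
```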
Reimplementation of Memory Networks (MemNN) in Julia using Knet. A Deep Learning project.
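The repo itself is in Julia/Knet; purely for illustration, here is one soft-attention memory "hop" in the end-to-end MemNN style, written in NumPy (dimensions and the residual update are a sketch, not the repo's code):

```python
import numpy as np

def memnn_hop(query, memory_keys, memory_values):
    """One memory-network hop: match the query against memory keys
    via softmax attention, then read a weighted sum of memory values."""
    scores = memory_keys @ query            # (slots,)
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()                    # softmax over memory slots
    read = probs @ memory_values            # (dim,)
    return query + read                     # updated controller state

dim, slots = 64, 10
u = np.random.randn(dim)
out = memnn_hop(u, np.random.randn(slots, dim), np.random.randn(slots, dim))
```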
Implementation of the article **Deep Learning CUDA Memory Usage and Pytorch optimization tricks**
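A few of the standard PyTorch memory tricks such articles cover (which ones this particular article discusses is not stated here; the snippet assumes a CUDA device is available):

```python
import torch

if torch.cuda.is_available():
    x = torch.randn(1024, 1024, device="cuda")

    # Track bytes handed out to tensors vs. bytes held by the caching allocator.
    print(torch.cuda.memory_allocated(), torch.cuda.memory_reserved())

    # Skip autograd bookkeeping when gradients aren't needed (inference).
    with torch.no_grad():
        y = x @ x

    # Drop the reference, then return cached blocks to the driver.
    del y
    torch.cuda.empty_cache()

    # In-place ops avoid allocating a fresh output tensor.
    x.add_(1.0)
```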