Improving the Robustness of Deep Neural Networks by Exploiting Connectionist Sampling – Recent research on deep learning has focused on minimizing the computational cost of inference. We propose an adaptive inference algorithm that learns input-dependent sub-parameters in order to make inference more robust. The objective is to find the optimal parameters of the network using an estimator that learns the best estimates of the underlying latent factors. To this end, for each sub-modular variable we propose an adaptive estimator that predicts the likelihood that a given subset of network parameters is well learned, retaining the best parameter estimates and ignoring the worst. We show that this estimator outperforms previous estimators that learn only the best estimates. We evaluate our algorithm on two collections of synthetic and real data.
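The abstract does not specify the estimator, but the idea of scoring groups of sub-parameters per input and ignoring the worst-scoring ones can be sketched as follows. Everything here (the gating weights, the group structure, the top-k rule) is an illustrative assumption, not the paper's actual algorithm:

```python
import random

random.seed(0)
n_groups, group_dim, in_dim = 8, 4, 16

# Illustrative sub-parameter groups and gating (estimator) weights.
W = [[[random.gauss(0, 1) for _ in range(in_dim)] for _ in range(group_dim)]
     for _ in range(n_groups)]
gate = [[random.gauss(0, 1) for _ in range(in_dim)] for _ in range(n_groups)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def adaptive_forward(x, keep=6):
    # Predict a per-group reliability score from the input, keep the
    # `keep` best-scoring groups, and zero out (ignore) the worst.
    scores = [dot(g, x) for g in gate]
    kept = set(sorted(range(n_groups), key=lambda i: scores[i])[-keep:])
    out = []
    for i in range(n_groups):
        if i in kept:
            out.extend(dot(row, x) for row in W[i])
        else:
            out.extend(0.0 for _ in range(group_dim))
    return out, kept

x = [random.gauss(0, 1) for _ in range(in_dim)]
y, kept = adaptive_forward(x)
print(len(kept), len(y))  # 6 groups retained, 32 output values
```

The key design choice in this sketch is that the mask is recomputed for every input, so different inputs exercise different sub-parameter groups, which is one plausible reading of "adaptive inference" here.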

Finding the Right Structure. We propose a novel approach to an optimization problem in which the set of structures of the problem set is given by a set of randomly generated patterns. In this work, we construct a new pattern-embedding architecture which, by combining the pattern embedding with the neural network architecture, obtains the optimal embedding of the problem set. We demonstrate that this achieves the optimal solution across a number of different network architectures. Furthermore, we propose a new algorithm for computing the embedding function. In our implementation, the solution is a random matrix with minimum $C_0$-regularization. Finally, we also propose an efficient and natural search algorithm for structured graph matching.
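One plausible reading of "a random matrix with minimum $C_0$-regularization" is selecting, among randomly generated candidate embedding matrices, the one that best trades off fit to the patterns against a $C_0$-style penalty (a count of nonzero entries). The sketch below is entirely an assumption; the candidate generator, the scoring rule, and all names are illustrative:

```python
import random

random.seed(1)
dim, emb, n_patterns, n_candidates = 6, 3, 10, 20

# Randomly generated binary patterns standing in for the problem set.
patterns = [[random.choice([0.0, 1.0]) for _ in range(dim)]
            for _ in range(n_patterns)]

def sparse_random_matrix(rows, cols, p_zero=0.6):
    # Candidate embedding: random entries, many forced to zero.
    return [[0.0 if random.random() < p_zero else random.gauss(0, 1)
             for _ in range(cols)] for _ in range(rows)]

def c0(M):
    # C0 "norm": number of nonzero entries.
    return sum(1 for row in M for v in row if v != 0.0)

def score(M, lam=0.1):
    # Energy the embedding preserves on the patterns, minus a
    # sparsity penalty (an illustrative surrogate objective).
    energy = sum(sum(m * x for m, x in zip(row_m, p)) ** 2
                 for p in patterns for row_m in M)
    return energy - lam * c0(M)

candidates = [sparse_random_matrix(emb, dim) for _ in range(n_candidates)]
best = max(candidates, key=score)
print(len(best), len(best[0]))  # a 3 x 6 embedding matrix
```

Random search over sparse candidates is used here only because the abstract says the solution is itself a random matrix; a real implementation would presumably optimize the embedding rather than sample it.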

Probability Sliding Curves and Probabilistic Graphs

Training of Convolutional Neural Networks

Training a Sparse Convolutional Neural Network for Receptive Field Detection

A Simple Detection Algorithm Based on Bregman’s Spectral Forests