Contents tagged with Neural networks

  • Training a neural network with deeplearning4j for price prediction

    Hi,

    I need to document my first implementation of a learning algorithm that uses neural networks for training:

    package org.deeplearning4j.examples.recurrent;

    import org.deeplearning4j.datasets.iterator.impl.ListDataSetIterator;
    import org.deeplearning4j.nn.api.Layer;
    import org.deeplearning4j.nn.api.OptimizationAlgorithm;
    import org.deeplearning4j.nn.conf.BackpropType;
    import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
    import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
    import org.deeplearning4j.nn.conf.Updater;
    import org.deeplearning4j.nn.conf.layers.DenseLayer;
    import org.deeplearning4j.nn.conf.layers.GravesLSTM;
    import org.deeplearning4j.nn.conf.layers. … more
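    The excerpt is cut off inside the import list (left as-is above). As a hedged illustration only, not the post's actual code, here is a minimal sketch of how a DL4J 0.x-era recurrent configuration for one-dimensional price sequences might continue; the layer sizes, learning rate, and truncated-BPTT lengths are my assumptions, and RnnOutputLayer, LossFunctions, MultiLayerNetwork, and ScoreIterationListener would need their own imports:

    // Hedged sketch (assumed hyperparameters), in the spirit of the DL4J 0.x API
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(123)                                  // fixed seed for reproducibility
            .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
            .learningRate(0.01)                         // assumed value
            .updater(Updater.ADAM)
            .list()
            .layer(0, new GravesLSTM.Builder()
                    .nIn(1).nOut(32)                    // one price value per time step
                    .activation("tanh")
                    .build())
            .layer(1, new RnnOutputLayer.Builder(LossFunctions.LossFunction.MSE)
                    .activation("identity")             // linear output for regression
                    .nIn(32).nOut(1)
                    .build())
            .backpropType(BackpropType.TruncatedBPTT)   // bounded memory on long series
            .tBPTTForwardLength(50).tBPTTBackwardLength(50)
            .build();

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();
    net.setListeners(new ScoreIterationListener(20));   // log the score every 20 iterations
    // training would then be net.fit(trainIterator) with a ListDataSetIterator of price windows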

  • Notes about generalization improvement

    Hello everybody,

    today I want to write a few words about different regularization techniques. I will compare L1 regularization, L2 regularization, and adding noise to the weights during the learning process.

    L2 regularization penalizes high weight values.

    L1 regularization penalizes weights that are not equal to zero.

    Adding noise to the weights during learning makes the learned hidden representations take extreme values. (A compact formulation of the two penalties is sketched below.)

    more
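    As a quick reference (my own summary, not part of the truncated post), both penalties can be written as additive terms on the training cost E(w), with λ controlling the regularization strength:

    \[
      E_{\mathrm{L2}}(w) = E(w) + \lambda \sum_i w_i^{2}, \qquad
      E_{\mathrm{L1}}(w) = E(w) + \lambda \sum_i \lvert w_i \rvert
    \]

    The L2 gradient 2λw_i shrinks large weights hardest, while the L1 gradient has constant magnitude λ, which is what drives many weights exactly to zero.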

  • Phoneme recognition in speech recognition

    Hello everybody,

    today I'd like to preserve in my blog a few words of practical knowledge about speech recognition. One of the questions that arises in speech recognition systems is related to phoneme detection.

    According to the Coursera course, the following parameters proved practical: for accurate recognition of which phoneme was said at a particular time, the neural network needs to know the sound frequencies from 100 ms before that moment to 100 ms after it. In other words, if you need a NN that recognizes phonemes, give it as input the sound from roughly 100 ms around the frame of interest (see the sketch after this excerpt). more
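    A hypothetical sketch of that windowing (the class name and the 10 ms hop are my assumptions): with one acoustic vector extracted every 10 ms, a ±100 ms context means concatenating 10 frames on each side of frame t:

    // Hypothetical helper: build a +/-100 ms context window around frame t.
    // Assumes one acoustic vector per 10 ms, so ctx = 10 frames per side.
    public class ContextWindow {
        static double[] around(double[][] frames, int t, int ctx) {
            int dim = frames[0].length;
            double[] out = new double[(2 * ctx + 1) * dim];
            for (int i = -ctx; i <= ctx; i++) {
                // clamp at the edges of the utterance
                int idx = Math.min(Math.max(t + i, 0), frames.length - 1);
                System.arraycopy(frames[idx], 0, out, (i + ctx) * dim, dim);
            }
            return out; // feed this vector to the phoneme classifier
        }
    }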

  • Neural networks for machine learning at coursera

    Hello everybody,

    today I've completed the following course at Coursera:

    "Neural Networks for Machine Learning".

    I should admit that this course was great, but passing all of it was a challenge for me. I should also note that Neural Networks for Machine Learning was a really informative course. It was very interesting for me to learn more about perceptrons than I knew before, and to remind myself about restricted Boltzmann machines. A real discovery for me was the explanation of recurrent neural networks and how to derive the math for them. And much, much more.

    Also, some parts were missing for me. It was hard for me to grasp the material about probabilities and Bayesian … more

  • Ways to reduce overfitting in NN

    Hello everybody,

    today I want to write a short summary of ways to reduce overfitting. Here it goes:

    Weight decay

    Weight sharing

    Early stopping of training

    Model averaging

    Bayesian fitting of NN

    Dropout

    Generative pre-training

    Some explanations of these points (a small dropout sketch follows this excerpt):

    Weight decay stands for keeping the weights small.

    Weight sharing means insisting that some weights be similar to each other.

    Early stopping stands for stopping training before the NN fully memorizes the training set.

    Model averaging, in other words, is the usage of several different models.

    Bayesian fitting of NN is a slightly different form of model averaging, done according to certain rules.

    Dropout is the random omission of hidden units during training.

    more
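    Of these, dropout is the easiest to show concretely. A hedged sketch (mine, not from the post) of inverted dropout on one hidden layer: each unit is zeroed with probability p during training, and the survivors are rescaled so the expected activation is unchanged; at test time the activations are used as-is:

    import java.util.Random;

    // Hedged sketch of inverted dropout applied to a layer's activations.
    public class Dropout {
        static double[] apply(double[] activations, double p, Random rng) {
            double[] out = new double[activations.length];
            for (int i = 0; i < activations.length; i++) {
                // drop this unit with probability p; rescale survivors by 1/(1-p)
                out[i] = (rng.nextDouble() < p) ? 0.0 : activations[i] / (1.0 - p);
            }
            return out;
        }
    }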

  • Mathematical notes about Neural networks

    Hello everybody,

    today I want to write a few words about why mathematicians believe that neural networks can learn something. Recently I've read the book Fundamentals of Artificial Neural Networks by Mohamad Hassoun and want to share some thoughts in a more digestible manner, omitting some theoretical material.

    As you may have heard, when the first neural network (a.k.a. the Perceptron) was invented, society was very impressed by it, until Marvin Minsky and Seymour Papert showed that perceptrons can't implement the XOR function or, generally speaking, any function that is not linearly separable.

    This led to a big disappointment in the area of neural networks.

    But why? Because sometimes one line is not enough in … more
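    For concreteness, here is the standard textbook argument (my summary, not part of the truncated post) that no single line, i.e. no single perceptron, computes XOR:

    \[
      y = \mathbf{1}\left[\, w_1 x_1 + w_2 x_2 + b > 0 \,\right]
    \]
    XOR demands $f(0,0)=0$, $f(1,1)=0$, $f(0,1)=1$, $f(1,0)=1$, which gives
    \[
      b \le 0, \qquad w_1 + w_2 + b \le 0, \qquad w_1 + b > 0, \qquad w_2 + b > 0.
    \]
    Adding the last two inequalities gives $w_1 + w_2 + 2b > 0$, so $w_1 + w_2 + b > -b \ge 0$, contradicting the second inequality. Hence no such weights exist.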

  • Speech recognition with Neural Networks

    Hello everybody,

    today I want to share how to deal with the Speech Recognition with Neural Networks.

    So, the speech recognition task has following stages:

    Pre-processing: convert the sound wave into a vector of acoustic coefficients. Extract a new vector about every 10 milliseconds

    Acoustic model: Use a few adjacent vectors of acoustic coefficients to place bets on which part of which phoneme is being spoken.

    Decoding: Find the sequence of bets that does the best job of fitting the acoustic data and also fitting a model of the kinds of things people say.

    more
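    As a hedged illustration of how the three stages compose (the interface names are mine, purely hypothetical):

    // Hypothetical shape of the pipeline; each stage matches one bullet above.
    interface PreProcessor  { double[][] acousticVectors(double[] waveform); }          // ~1 vector per 10 ms
    interface AcousticModel { double[] phonemeScores(double[][] adjacentVectors); }     // bets on phoneme parts
    interface Decoder       { String bestTranscript(double[][] scoresPerFrame); }       // fits a model of what people say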

  • Neural Networks teaching

    Hello everybody,

    today I watched a video by Dr. James McCaffrey about training neural networks.

    So, a few interesting details which were useful for me.

    1. Usually one hidden layer is more than enough.

    2. Imagine that you need to feed some data into your neural network, let's say age, income, sex, religion. For data normalization it is good not to use big numbers.

    For example, for an income of around 51000, it's good to give the network 5.1 and not 51000.

    And guess what is good to use for passing sex into the NN? If your idea is 0 and 1, you are wrong. It's better to pass -1 and 1. (A small sketch of these encodings follows this excerpt.)

    And for classification, if you have as input for example 3 different categories, the best way to pass … more
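    As a hedged illustration of the two rules that survive the truncation (the class and method names are mine, purely hypothetical):

    // Hypothetical sketch of the normalization rules mentioned above.
    public class Encode {

        // Scale income so the network sees small numbers, e.g. 51000 -> 5.1.
        static double income(double raw) {
            return raw / 10000.0;
        }

        // Encode a binary attribute as -1 / +1 rather than 0 / 1.
        static double sex(boolean male) {
            return male ? 1.0 : -1.0;
        }

        public static void main(String[] args) {
            System.out.println(income(51000)); // 5.1
            System.out.println(sex(false));    // -1.0
        }
    }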