Re: [Gneuralnetwork] Development Path


From: David Mascharka
Subject: Re: [Gneuralnetwork] Development Path
Date: Sun, 19 Jun 2016 11:59:00 -0400
User-agent: K-9 Mail for Android

I definitely think it's valuable to be able to control individual neurons. I updated that little program with an example of how to access an individual neuron in the network. Each row of the layer's weight matrix corresponds to a particular neuron, so you can get at and set its individual weights. If you want to cancel out a connection, set the weight at the corresponding index to 0.
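
As a concrete illustration, here's a minimal sketch of that access pattern (assuming the weights live in a gsl_matrix as in the linked example; the variable names are mine, not necessarily those in the file):

#include <gsl/gsl_matrix.h>

int main(void)
{
    /* A fully-connected layer of 2 neurons with 3 inputs:
     * a 2x3 weight matrix, one row per neuron. */
    gsl_matrix *W = gsl_matrix_alloc(2, 3);
    gsl_matrix_set_all(W, 0.5);

    /* Neuron 0 is row 0 of the weight matrix. */
    gsl_vector_view neuron0 = gsl_matrix_row(W, 0);

    /* Cancel the connection from input 2 into neuron 0
     * by zeroing the corresponding weight. */
    gsl_vector_set(&neuron0.vector, 2, 0.0);
    /* Equivalently: gsl_matrix_set(W, 0, 2, 0.0); */

    gsl_matrix_free(W);
    return 0;
}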

I'll keep building on this idea, and once I've got a basic network with gradient descent working for an example, I'll update the group.

On June 19, 2016 9:25:48 AM EDT, Jean Michel Sellier <address@hidden> wrote:
Hi David,

Thanks for this very interesting email! I would say that your idea could be very useful, and if you want to develop it, we can see how to do it together. That said, we also have to keep the level of control we have right now in Gneural Network, since many researchers are interested in this very fine level of detail.

Best,

JM



2016-06-19 6:38 GMT+02:00 David Mascharka <address@hidden>:
Hi everybody,

I've been playing around with the Gneural Network code and with some
neural net code of my own. I had the idea of using the GNU Scientific
Library for Gneural, since it includes BLAS support and some other
nice functionality. I made a small example, which you can see at
(https://github.com/davidmascharka/neural-net-playing/blob/master/nn-gsl.c),
that creates a fully-connected layer of 2 neurons with 3 inputs. It's
a general layer framework, though, that you can make as large or as
deep as you want.
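
For anyone who doesn't want to click through, here's a condensed
sketch of the same idea (my own variable names, not necessarily the
ones in nn-gsl.c): the layer is just a weight matrix, and the forward
pass is a BLAS matrix-vector product followed by tanh:

/* compile with: gcc layer-sketch.c -lgsl -lgslcblas -lm */
#include <math.h>
#include <stdio.h>
#include <gsl/gsl_blas.h>
#include <gsl/gsl_matrix.h>
#include <gsl/gsl_randist.h>
#include <gsl/gsl_rng.h>

int main(void)
{
    gsl_rng *rng = gsl_rng_alloc(gsl_rng_mt19937);

    /* 2 neurons, 3 inputs: a randomly initialized 2x3 weight matrix. */
    gsl_matrix *W = gsl_matrix_alloc(2, 3);
    for (size_t i = 0; i < W->size1; ++i)
        for (size_t j = 0; j < W->size2; ++j)
            gsl_matrix_set(W, i, j, gsl_ran_gaussian(rng, 0.1));

    /* An example input and the layer's output. */
    gsl_vector *x = gsl_vector_alloc(3);
    gsl_vector_set_all(x, 1.0);
    gsl_vector *y = gsl_vector_calloc(2);

    /* y = W x via BLAS, then the tanh nonlinearity elementwise. */
    gsl_blas_dgemv(CblasNoTrans, 1.0, W, x, 0.0, y);
    for (size_t i = 0; i < y->size; ++i)
        gsl_vector_set(y, i, tanh(gsl_vector_get(y, i)));

    for (size_t i = 0; i < y->size; ++i)
        printf("neuron %zu output: %g\n", i, gsl_vector_get(y, i));

    gsl_vector_free(y);
    gsl_vector_free(x);
    gsl_matrix_free(W);
    gsl_rng_free(rng);
    return 0;
}

Making the layer wider is just a bigger matrix, and making the network
deeper is chaining these products.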

I don't have any derivatives in there for backprop, so for now it's
just a random weight matrix multiplied with an input, followed by a
tanh nonlinearity. I don't mind hand-deriving gradients for
operations like addition, multiplication, and some of the
nonlinearities, but I'm planning to implement automatic
differentiation for more flexibility and easier use.
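
For the nonlinearities, the hand-derived gradients are short anyway.
For tanh, if y = tanh(x) then dy/dx = 1 - tanh(x)^2 = 1 - y^2, so the
backward pass just scales the incoming gradient elementwise (again a
sketch with my own names, not anything from the linked file):

#include <gsl/gsl_vector.h>

/* In-place backward pass for tanh: given the forward output y and the
 * gradient g flowing back from the next layer, overwrite g with
 * g * (1 - y^2), the gradient with respect to the layer's input. */
void tanh_backward(const gsl_vector *y, gsl_vector *g)
{
    for (size_t i = 0; i < y->size; ++i) {
        double yi = gsl_vector_get(y, i);
        gsl_vector_set(g, i, gsl_vector_get(g, i) * (1.0 - yi * yi));
    }
}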

What do you guys think about transitioning to an architecture more like
this? I think it's a lot more flexible than the current approach of
specifying a network neuron-by-neuron, especially if we want to develop
large networks like VGG that have millions of parameters.

Best,
David



