Simple perceptron

Started by mineiro, April 05, 2017, 07:28:35 AM


mineiro

A simple perceptron algorithm example, a small neural network.
I'd rather be this ambulant metamorphosis than to have that old opinion about everything

Magnum

Your diagrams in perceptron.txt are quite impressive considering that you only used characters that can be produced by a keyboard.

Great job !! :-)

Take care,
                   Andy

Ubuntu-mate-18.04-desktop-amd64

http://www.goodnewsnetwork.org

jj2007

"I learned this..." - what does it mean?


Siekmanski

Confusing our synapses with boolean algebra I think.  :biggrin:
Creative coders use backward thinking techniques as a strategy.

mineiro

hello sir jj2007;
It means that the algorithm has learned a boolean truth table.
If you change only the input table in the source code and feed the algorithm with that table, the algorithm can 'mimic', or learn, that table and produce the same results as the boolean logic.

;uncomment one of the tables below for the algorithm to learn
;--------------------------------
;       lea esi,TT_OR
;       lea esi,TT_XOR
        lea esi,TT_AND                  ;boolean truth table to learn, 2 INPUTS, 1 OUTPUT
;--------------------------------

So, once the algorithm has learned the AND logic, you don't need to teach it every time; you can save the synapses. This sounds a bit like a mutant algorithm: the code remains the same and only the input data changes, which means the algorithm can perform different actions for different inputs.

This is like a cell: if we didn't have boolean (logical) instructions on a PC, we could use this code to do that job for us.
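To make the idea concrete, here is a minimal perceptron sketch in Python (not the MASM source from this thread; names like TT_AND, train and predict are illustrative). It learns the AND truth table with the classic perceptron update rule, and the trained weights (the "synapses") can then be saved and reused without retraining:

```python
# AND truth table: 2 inputs, 1 output, as in the thread's TT_AND
TT_AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

def train(table, epochs=20, lr=1.0):
    w1 = w2 = bias = 0.0               # "synapses" start at zero
    for _ in range(epochs):
        for (x1, x2), target in table:
            out = 1 if w1 * x1 + w2 * x2 + bias > 0 else 0  # step activation
            err = target - out
            w1 += lr * err * x1        # classic perceptron update rule
            w2 += lr * err * x2
            bias += lr * err
    return (w1, w2, bias)

def predict(weights, x1, x2):
    w1, w2, bias = weights
    return 1 if w1 * x1 + w2 * x2 + bias > 0 else 0

weights = train(TT_AND)                # teach once...
for (x1, x2), target in TT_AND:        # ...then reuse the saved synapses
    assert predict(weights, x1, x2) == target
```

Swapping TT_AND for an OR table retrains the same code for OR, which is the "only the data changes" point made above.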

hutch--

I wish I had time to play with AI, as it's very interesting stuff. Working with it in text form requires some very good parsing code, and the data storage needs to be truly massive to be useful, but it is fascinating stuff.

mineiro

yes sir hutch;
The nice thing about these neural networks is that the code is done, so we play only with the data.
I can see, in the future, people selling trained data instead of code.
An update to this simple perceptron would be just inserting other logical gates (not tested), like NAND, NOR, ... So the trained data (synapses) can be shared.

In the example posted, the synapse data can be stored in just 2 bytes (1 byte for each synapse), but I understood your point of view: for complex A.I. we need a lot of input data and code organized with a parser to produce something more useful.

---edited---
I tried the NOR and NAND truth tables and they do not work; the algorithm can't learn them. I think I could create a better trigger (activation function) or a parser, but I will not touch this program again, so that it stays a simple example.
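As a side note: NAND and NOR are linearly separable, so the textbook perceptron rule with a bias (threshold) term does converge on them; XOR is the gate a single perceptron can never learn. The thread's program uses its own trigger, which may behave differently. A quick check in Python (illustrative names, not the MASM code):

```python
def learns(table, epochs=50, lr=1.0):
    """Train a single perceptron with bias; report whether it fits the table."""
    w1 = w2 = bias = 0.0
    for _ in range(epochs):
        for (x1, x2), target in table:
            out = 1 if w1 * x1 + w2 * x2 + bias > 0 else 0
            err = target - out
            w1 += lr * err * x1
            w2 += lr * err * x2
            bias += lr * err
    return all((1 if w1 * x1 + w2 * x2 + bias > 0 else 0) == t
               for (x1, x2), t in table)

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
TT_NAND = [(x, 1 - (x[0] & x[1])) for x in inputs]
TT_NOR  = [(x, 1 - (x[0] | x[1])) for x in inputs]
TT_XOR  = [(x, x[0] ^ x[1]) for x in inputs]

print(learns(TT_NAND), learns(TT_NOR), learns(TT_XOR))  # True True False
```

So if NAND/NOR fail in a given implementation, the activation function or the lack of a bias term is the likely place to look; XOR genuinely needs more than one perceptron.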