
ReLU as a literal switch

Started by S6Regen, February 24, 2020 01:07 AM · 0 comments

The ReLU neural network activation function can be read as a literal switch: when its input is positive the switch is on and the signal passes through unchanged, and when its input is negative the switch is off and the output is zero. A ReLU network is then a system of switched linear projections, as the sketch below shows.
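
To make that concrete, here is a minimal NumPy sketch of my own (arbitrary layer sizes, not code from the linked post): once a particular input has fixed the on/off state of every ReLU, the whole network collapses to a single ordinary linear map for that input.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((16, 8))   # first layer weights (sizes are arbitrary)
W2 = rng.standard_normal((4, 16))   # second layer weights

x = rng.standard_normal(8)

# Ordinary forward pass with ReLU.
h = W1 @ x
mask = (h > 0).astype(float)        # the "switch" states chosen by this input
y = W2 @ (mask * h)                 # ReLU(h) == mask * h

# With those switch states frozen, the network IS the linear map W2 @ D @ W1,
# where D is a 0/1 diagonal matrix of switch decisions.
D = np.diag(mask)
y_linear = (W2 @ D @ W1) @ x

assert np.allclose(y, y_linear)     # identical outputs
```

Each distinct switching pattern gives a different 0/1 diagonal matrix D, so a deep ReLU network applies a different linear projection on each region of its input space.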

The variance equation for linear combinations of random variables, Var(Σᵢ aᵢXᵢ) = Σᵢ aᵢ² Var(Xᵢ) for uncorrelated Xᵢ, offers a route to a general associative memory algorithm: it predicts how much crosstalk noise you get back when a recall is a weighted sum over everything stored.
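
Here is one way that prediction shows up in practice, as a minimal NumPy sketch of my own (it uses a classic outer-product linear associative memory as a stand-in; the linked post may build its memory differently). With random, nearly orthogonal keys, the recall error is a linear combination of the other stored values, and the variance equation puts its size at about (n − 1)/d per component.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n = 1024, 32                      # vector dimension, number of stored pairs

# Random keys on the unit sphere; in high dimension they are nearly orthogonal.
keys = rng.standard_normal((n, d))
keys /= np.linalg.norm(keys, axis=1, keepdims=True)
values = rng.standard_normal((n, d))

# Store all pairs as a sum of outer products: M = sum_k v_k u_k^T.
M = values.T @ keys

# Recall: M @ u_0 = v_0 + crosstalk, where the crosstalk is a linear
# combination of the other values weighted by the dot products u_k . u_0.
# Each dot product has mean 0 and variance about 1/d, so the variance
# equation predicts a crosstalk variance of roughly (n - 1) / d per component.
recalled = M @ keys[0]
crosstalk = recalled - values[0]
print("measured crosstalk variance:", crosstalk.var())
print("predicted (n - 1) / d:      ", (n - 1) / d)
```

Raising the dimension d or storing fewer pairs n shrinks the crosstalk variance, which is exactly the knob the variance equation exposes.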

https://ai462qqq.blogspot.com/2019/11/artificial-neural-networks.html

