Softmax annotated

{|
|-
| Type: || Algolit extension
|-
| Technique(s): || mathematics, softmax function, python
|-
| Developed by: || Algolit
|}

'''''Softmax annotated''''' is an annotated script of a softmax function written in Python. The basis of the code is found on the [https://en.wikipedia.org/wiki/Softmax_function#Artificial_neural_networks Softmax function] page of the English Wikipedia. This script was part of a process to get a better understanding of the mathematics behind neural networks.

The softmax function provides a very basic classifier based on logistic regression: it generalizes the logistic function to more than two classes. It is a building block used in neural networks. A neural network can be seen as running multiple logistic regressions in parallel, which then feed into another layer of (multiple) logistic regressions, and so on. A sketch of this layered picture follows below.

For more information, see [https://cs224d.stanford.edu/lectures/CS224d-Lecture4.pdf Lecture 4 of the Stanford Deep Learning course].