How to Use tf.nn.softmax in TensorFlow

tf.nn.softmax computes softmax activations:

    softmax(x)_i = exp(x_i) / sum_j exp(x_j)

The outputs are positive and sum to 1. For a matrix input, the softmax is applied along the last axis by default, so each row is normalized independently.

Here is an example:

import tensorflow as tf

a = tf.constant([1, 2, 3, 4, 5, 6], dtype=tf.float32)
b = tf.nn.softmax(a)

c = tf.constant([[1, 3, 3], [4, 1, 6]], dtype=tf.float32)
d = tf.nn.softmax(c)

# No initializer is needed here: the graph contains only constants,
# not variables. (tf.initialize_all_variables() is deprecated anyway;
# use tf.global_variables_initializer() when you do have variables.)
with tf.Session() as sess:
    print(sess.run(b))
    print(sess.run(d))

The result is:

[0.00426978 0.01160646 0.03154963 0.08576079 0.233122   0.6336913 ]
[[0.06337894 0.4683105  0.4683105 ]
 [0.11849965 0.00589975 0.8756006 ]]
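To see where these numbers come from, here is a minimal plain-NumPy sketch of the same computation (the max-subtraction step is the standard numerically stable form and does not change the result):

```python
import numpy as np

def softmax(x, axis=-1):
    # exp(x) / sum(exp(x)) along the given axis;
    # subtracting the max first avoids overflow in exp().
    shifted = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(shifted)
    return e / np.sum(e, axis=axis, keepdims=True)

a = np.array([1, 2, 3, 4, 5, 6], dtype=np.float32)
print(softmax(a))  # matches the tf.nn.softmax output above

c = np.array([[1, 3, 3], [4, 1, 6]], dtype=np.float32)
print(softmax(c))  # each row sums to 1
```

Note that each row of the matrix result sums to 1, confirming that the softmax is taken per row (the last axis), not over the whole matrix.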
