Chadrick Blog

tensorflow manual cross entropy loss calculation

In case you need a raw binary cross entropy loss computed not from logits but from the final predicted probabilities, with no reduction applied to the output whatsoever, here's a code snippet for it.

import tensorflow as tf

def ce_loss(y_true, y_pred):
    # element-wise binary cross entropy from probabilities, no reduction applied
    ce_loss = -((1 - y_true) * tf.math.log(1 - y_pred) + y_true * tf.math.log(y_pred))
    return ce_loss
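
A minimal usage sketch, assuming TensorFlow 2.x eager execution; the tf.clip_by_value call and the 1e-7 epsilon are my additions here to keep tf.math.log away from exactly 0 or 1, not part of the original snippet:

y_true = tf.constant([[1.0, 0.0], [0.0, 1.0]])
y_pred = tf.constant([[0.9, 0.1], [0.2, 0.7]])

# clip predictions away from 0 and 1 so log() never hits -inf
eps = 1e-7  # assumed epsilon, matching Keras' default backend epsilon
y_pred = tf.clip_by_value(y_pred, eps, 1.0 - eps)

loss = ce_loss(y_true, y_pred)
print(loss)  # element-wise loss tensor with the same shape as the inputs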