This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Commit
[MXNET-751] fix bce_loss flaky (#11955)
* add fix to bce_loss

* add comments

* remove unnecessary comments
lanking520 authored and eric-haibin-lin committed Aug 5, 2018
1 parent 5474b08 commit 7261d8c
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions tests/python/unittest/test_loss.py
@@ -83,8 +83,8 @@ def test_ce_loss():
             initializer=mx.init.Xavier(magnitude=2))
     assert mod.score(data_iter, eval_metric=mx.metric.Loss())[0][1] < 0.05


-@with_seed(1234)
+# tracked at: https://github.com/apache/incubator-mxnet/issues/11691
+@with_seed()
 def test_bce_loss():
     N = 20
     data = mx.random.uniform(-1, 1, shape=(N, 20))
@@ -107,7 +107,7 @@ def test_bce_loss():
     prob_npy = 1.0 / (1.0 + np.exp(-data.asnumpy()))
     label_npy = label.asnumpy()
     npy_bce_loss = - label_npy * np.log(prob_npy) - (1 - label_npy) * np.log(1 - prob_npy)
-    assert_almost_equal(mx_bce_loss, npy_bce_loss)
+    assert_almost_equal(mx_bce_loss, npy_bce_loss, rtol=1e-4, atol=1e-5)

@with_seed()
def test_bce_equal_ce2():
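The explicit rtol/atol makes sense because the test's NumPy reference applies the sigmoid and then takes logs directly, while BCE implementations typically use a numerically stable rearrangement; the two formulations agree mathematically but not bit-for-bit in floating point. A minimal NumPy sketch of this (the stable max/log1p form is a common technique and an assumption here, not code taken from this commit):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.uniform(-1, 1, size=(20, 20))           # logits, as in the test
label = (rng.uniform(size=(20, 20)) > 0.5).astype(np.float64)

# Naive BCE: sigmoid first, then log (what the test's NumPy reference does).
prob = 1.0 / (1.0 + np.exp(-data))
naive = -label * np.log(prob) - (1 - label) * np.log(1 - prob)

# A numerically stable rearrangement of the same loss on logits x and labels z:
#   max(x, 0) - x*z + log(1 + exp(-|x|))
# This avoids computing sigmoid and log separately.
stable = np.maximum(data, 0) - data * label + np.log1p(np.exp(-np.abs(data)))

# Mathematically identical, but float rounding differs slightly between the
# two routes, so a comparison needs explicit tolerances rather than exactness.
np.testing.assert_allclose(naive, stable, rtol=1e-4, atol=1e-5)
```

Under these tolerances the two formulations compare equal; demanding exact equality between differently ordered floating-point computations is what makes such a test flaky.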

1 comment on commit 7261d8c

@larroy (Contributor) commented on 7261d8c, Aug 6, 2018


Could we add a justification for why the numeric results are not identical and a tolerance is needed? It would be useful if the flakiness continues and somebody has to deal with this test.
