
The model scored poorly after annotating the "detection adjustment" code #4

Closed
Conearth opened this issue May 8, 2022 · 4 comments

Conearth commented May 8, 2022

Hi, this is amazing work.
I've come across a small problem.
On the MSL dataset, the model performs well:

======================TEST MODE======================
Threshold : 0.0017330783803481142
pred: (73700,) gt: (73700,)
pred: (73700,) gt: (73700,)
Accuracy : 0.9853, Precision : 0.9161, Recall : 0.9473, F-score : 0.9314

But after I commented out the "detection adjustment" code, the scores dropped sharply:

======================TEST MODE======================
Threshold : 0.0017330783803481142
pred: (73700,) gt: (73700,)
pred: (73700,) gt: (73700,)
Accuracy : 0.8866, Precision : 0.1120, Recall : 0.0109, F-score : 0.0199

And I'm sure only the "detection adjustment" code was commented out.

Can you help me with this problem?
Thanks.

@wuhaixu2016
Member

Thanks for your interest.

  • We have clarified this in the paper (Section 4, Implementation Details).

  • Note that the adjustment operation is widely used in previous papers, so we adopt it for a fair comparison with other methods.
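For readers unfamiliar with the adjustment operation being discussed: the usual "point adjustment" convention in this line of work is that if any point inside a ground-truth anomaly segment is flagged, the whole segment is counted as detected. The snippet below is a minimal sketch of that idea in NumPy (the function name `point_adjust` and the toy arrays are illustrative, not this repository's actual code):

```python
import numpy as np

def point_adjust(pred, gt):
    """Sketch of the point-adjustment ("detection adjustment") strategy:
    if any point inside a contiguous ground-truth anomaly segment is
    predicted as anomalous, credit the entire segment as detected."""
    pred = np.asarray(pred).copy()
    gt = np.asarray(gt)
    # Find [start, end) boundaries of contiguous gt anomaly segments.
    padded = np.concatenate(([0], gt, [0]))
    starts = np.where(np.diff(padded) == 1)[0]
    ends = np.where(np.diff(padded) == -1)[0]  # exclusive end indices
    for s, e in zip(starts, ends):
        if pred[s:e].any():   # at least one hit inside the segment
            pred[s:e] = 1     # count the whole segment as detected
    return pred

# Toy example: the first segment has one hit, the second has none.
gt   = np.array([0, 1, 1, 1, 1, 0, 0, 1, 1, 0])
pred = np.array([0, 0, 1, 0, 0, 0, 0, 0, 0, 0])
adjusted = point_adjust(pred, gt)
print(adjusted)  # → [0 1 1 1 1 0 0 0 0 0]
```

This explains the score gap reported above: without the adjustment, a single detected point inside a long anomaly segment earns credit for only that one point, so recall (and thus F-score) collapses even though the segment was, in the segment-level sense, found.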

@Conearth
Author

Conearth commented May 9, 2022

Thank you very much for your attention. In your experiments, do the reconstruction-based methods compared in the paper use the same adjustment strategy?

@wuhaixu2016
Copy link
Member

Sure, all the compared methods adopt this adjustment strategy for evaluation.

@Conearth
Author

Conearth commented May 9, 2022

That's good. Thanks.
