
Fix #728: PyTorchDataGenerator always returns the same batch #731

Merged — 7 commits into Trusted-AI:dev_1.4.3 on Nov 11, 2020

Conversation

@hornel (Contributor) commented Nov 11, 2020

Signed-off-by: hornel <hornel@ethz.ch>

Description

Fixes art.data_generators.PyTorchDataGenerator and art.data_generators.MXDataGenerator. Previously, they always returned the same batch (assuming the underlying DataLoader had shuffling disabled); they now properly iterate through the underlying DataLoader.

Fixes #728. I also noticed the same issue with MXDataGenerator and provided the same fix.
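For reference, below is a minimal sketch of the iteration pattern the fix relies on, using a simplified stand-in class (SimpleDataGenerator and its get_batch are illustrative names, not the actual ART implementation): hold one persistent iterator over the DataLoader, advance it on every call, and restart it once the epoch is exhausted, rather than effectively re-reading the first batch on each call.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

class SimpleDataGenerator:
    """Illustrative (hypothetical) generator, not the actual ART class."""

    def __init__(self, data_loader: DataLoader):
        self.data_loader = data_loader
        # Keep a single persistent iterator instead of re-creating one per call,
        # which is what caused the same (first) batch to be returned every time.
        self._iterator = iter(data_loader)

    def get_batch(self):
        try:
            batch = next(self._iterator)
        except StopIteration:
            # Epoch exhausted: restart the iterator and continue.
            self._iterator = iter(self.data_loader)
            batch = next(self._iterator)
        # ART data generators return NumPy arrays, so convert the tensors.
        return tuple(item.numpy() for item in batch)

# Usage: successive calls now yield successive batches.
x = torch.arange(12, dtype=torch.float32).reshape(6, 2)
y = torch.arange(6)
loader = DataLoader(TensorDataset(x, y), batch_size=2, shuffle=False)
gen = SimpleDataGenerator(loader)
print(gen.get_batch()[1])  # [0 1]
print(gen.get_batch()[1])  # [2 3] -- previously this would repeat [0 1]
```

The same pattern applies to MXDataGenerator with an MXNet DataLoader.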

Type of change

Please check all relevant options.

  • Improvement (non-breaking)
  • Bug fix (non-breaking)
  • New feature (non-breaking)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Testing

Please describe the tests that you ran to verify your changes. Consider listing any relevant details of your test configuration.

  • All unit tests in tests/test_data_generators/TestPyTorchGenerator
  • All unit tests in tests/test_data_generators/TestMXGenerator

Test Configuration:

  • Ubuntu 18.04
  • Python 3.7.4
  • ART 1.4.2
  • PyTorch 1.5.1 / MXNet 1.6.0

Checklist

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes


beat-buesser and others added 7 commits November 4, 2020 12:08
  • Signed-off-by: Beat Buesser <beat.buesser@ie.ibm.com>
  • …smodels-0.12.1 (Bump statsmodels from 0.12.0 to 0.12.1)
  • … same batch (Signed-off-by: hornel <hornel@ethz.ch>)
@hornel changed the base branch from main to dev_1.4.3 on November 11, 2020 20:12
@beat-buesser self-requested a review on November 11, 2020 20:22
@beat-buesser self-assigned this on Nov 11, 2020
@beat-buesser added the bug ("Something isn't working") label on Nov 11, 2020
@beat-buesser added this to the ART 1.4.3 milestone on Nov 11, 2020
@beat-buesser added the improvement ("Improve implementation") label and removed the bug label on Nov 11, 2020
@beat-buesser merged commit b2692fa into Trusted-AI:dev_1.4.3 on Nov 11, 2020
@beat-buesser (Collaborator) commented:

@hornel Thank you very much!

Labels: improvement (Improve implementation)
Projects: None yet
Development: Successfully merging this pull request may close the issue "PyTorchDataGenerator always returns the same batch".
Participants: 3