
Selfhosted GPU tests restricted to GPU backends only #1579

Merged: 5 commits merged into master from gpu_only_selfhosted on Feb 7, 2025

Conversation

@BrunoLiegiBastonLiegi (Contributor) commented Feb 7, 2025

Same as qiboteam/qibojit#206. A --gpu-only option is added to pytest to run tests on GPU backends only.
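
For context, a minimal sketch of how such a switch is typically wired into a pytest conftest.py; the option name comes from this PR, while the marker name and skip logic below are illustrative assumptions rather than the actual qibo implementation:

import pytest

def pytest_addoption(parser):
    # register the --gpu-only flag introduced by this PR
    parser.addoption(
        "--gpu-only",
        action="store_true",
        default=False,
        help="run tests on GPU backends only",
    )

def pytest_collection_modifyitems(config, items):
    # when --gpu-only is passed, skip every test not tagged for a GPU backend
    if not config.getoption("--gpu-only"):
        return
    skip_cpu = pytest.mark.skip(reason="--gpu-only: GPU backends only")
    for item in items:
        # assumes GPU tests carry a hypothetical 'gpu' marker; the real suite
        # may instead select backends through parametrization
        if "gpu" not in item.keywords:
            item.add_marker(skip_cpu)

With such a hook in place, the self-hosted runner would invoke something like pytest --gpu-only to restrict the run to GPU backends.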

Checklist:

  • Reviewers confirm new code works as expected.
  • Tests are passing.
  • Coverage does not decrease.
  • Documentation is updated.


github-actions bot commented Feb 7, 2025

Run on QPU sim completed! :atom:

You can download the coverage report as an artifact, from the workflow summary page:
https://github.com/qiboteam/qibo/actions/runs/13199560283

@BrunoLiegiBastonLiegi (Contributor, Author) commented:

OK, now the GPU tests take ~6 minutes to complete. Should we also include the PyTorch and TensorFlow GPU tests (I don't think we have GPU support for JAX yet)?

@andrea-pasquale (Contributor) commented:

> OK, now the GPU tests take ~6 minutes to complete. Should we also include the PyTorch and TensorFlow GPU tests (I don't think we have GPU support for JAX yet)?

If you want, feel free to do so; however, if we see that tests are not passing for PyTorch and TensorFlow, we can address them in another PR.


github-actions bot commented Feb 7, 2025

Run on QPU sim completed! :atom:

You can download the coverage report as an artifact, from the workflow summary page:
https://github.com/qiboteam/qibo/actions/runs/13199807444


codecov bot commented Feb 7, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 99.61%. Comparing base (5108c14) to head (d10aac8).
Report is 6 commits behind head on master.

Additional details and impacted files
@@           Coverage Diff           @@
##           master    #1579   +/-   ##
=======================================
  Coverage   99.61%   99.61%           
=======================================
  Files          76       76           
  Lines       11449    11449           
=======================================
  Hits        11405    11405           
  Misses         44       44           
Flag        Coverage Δ
unittests   99.61% <ø> (ø)

Flags with carried forward coverage won't be shown.


@BrunoLiegiBastonLiegi (Contributor, Author) commented:

The thing is that, right now, I am not even sure how well device usage is integrated in the torch and tf backends. For example:

from qibo import set_backend
from qibo.models import QFT

set_backend(backend="qiboml", platform="pytorch")
c = QFT(5)
r = c()
r.probabilities().device
# device(type='cpu')
r.state().device
# device(type='cpu')

and the same happens for TensorFlow. Hence it's not worth enabling them now, as they would probably just run on the CPU anyway. The device attribute has to be revised for both of them first.
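
For comparison, here is what explicit device placement looks like in plain PyTorch; this is only a sketch of the behaviour the backends would need to expose, with an illustrative 5-qubit state vector:

import torch

# pick the GPU when one is available, otherwise fall back to the CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# a 5-qubit state vector allocated directly on the chosen device
state = torch.zeros(2**5, dtype=torch.complex128, device=device)
state[0] = 1.0
print(state.device)  # cuda:0 on a GPU runner, cpu otherwise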

@andrea-pasquale (Contributor) commented:

> The thing is that, right now, I am not even sure how well device usage is integrated in the torch and tf backends. For example:
>
> from qibo import set_backend
> from qibo.models import QFT
>
> set_backend(backend="qiboml", platform="pytorch")
> c = QFT(5)
> r = c()
> r.probabilities().device
> # device(type='cpu')
> r.state().device
> # device(type='cpu')
>
> and the same happens for TensorFlow. Hence it's not worth enabling them now, as they would probably just run on the CPU anyway. The device attribute has to be revised for both of them first.

At this point I agree we can skip it; just open an issue so that we don't forget.

@BrunoLiegiBastonLiegi added this pull request to the merge queue Feb 7, 2025
Merged via the queue into master with commit 312efb9 Feb 7, 2025
30 checks passed
@scarrazza deleted the gpu_only_selfhosted branch February 12, 2025 12:49