[Good First Issue][NNCF]: Replace common tensor_statistics by experimental tensor_statistics #3041
Comments
.take
Thank you for looking into this issue! Please let us know if you have any questions or require any help.
@kshpv So basically, functionally replace everything in the old files (common) with the new files (experimental), and then delete the new files (experimental)?
@AHB102, you also need to make sure that all tests pass and that no regressions are introduced.
@kshpv Great! I've reviewed the example and will create a PR for a single-file change. We can go from there.
@AHB102 I am really sorry, but @olegkkruglov already picked this GFI. I suggest you pick another one.
@kshpv Yup, no problem 😄
Hey @olegkkruglov, do you need any support, or do you not have time to finish the task?
Hey, the work is in progress. Sorry, I didn't have enough time to finish. I think I will be able to do it this weekend, is that OK?
Of course :)
@kshpv By the time I looked, someone else had taken the issue. Maybe next time.
I'm unassigning the task due to the assignee's inactivity.
Do we need to merge PR #3106 before working on this issue?
If you want, you can continue working on this issue. In this case, you should open your PR, creating a branch from PR #3106. You can then rebase your branch onto develop and respond to the reviewers' comments. After you open your PR, I will close PR #3106 due to lack of activity.
.take
Thank you for looking into this issue! Please let us know if you have any questions or require any help.
Thanks for being interested in this issue. It looks like this ticket is already assigned to a contributor. Please communicate with the assigned contributor to confirm the status of the issue.
Hi, is this open?
Hi @kshpv, I've been working on the tensor statistics migration issue assigned to me and wanted to check in to ensure I've understood the task correctly. From my investigation, I see that NNCF currently maintains two sets of tensor statistics: the old ones under common/tensor_statistics and the new ones under experimental/common/tensor_statistics.
I've written a script to search for references to both implementations across the codebase to identify which files need to be updated. My understanding of the migration task is:
Since the new implementation has fewer files, I'm wondering how the functionality has been reorganized and whether the functionality from some of the old files has been consolidated elsewhere. I'd appreciate your guidance on whether my understanding is correct and if there's anything important I've missed or should be particularly careful about during this migration. Thank you!
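A reference search like the one the contributor describes might look roughly like the sketch below. This is a hypothetical helper, not the contributor's actual script; the `nncf` root directory and the dotted module strings are assumptions based on the paths named in this issue.

```python
# Minimal sketch of a codebase-wide reference search (hypothetical helper;
# the "nncf" root and module strings are assumptions from the issue text).
from pathlib import Path

def find_references(root: str, needle: str) -> list[str]:
    """Return sorted paths of .py files under `root` whose text contains `needle`."""
    hits = []
    for path in Path(root).rglob("*.py"):
        if needle in path.read_text(encoding="utf-8", errors="ignore"):
            hits.append(str(path))
    return sorted(hits)

if __name__ == "__main__":
    new = set(find_references("nncf", "experimental.common.tensor_statistics"))
    # "common.tensor_statistics" is a substring of the experimental path,
    # so subtract the experimental hits to isolate old-only references.
    old = set(find_references("nncf", "common.tensor_statistics")) - new
    print(f"{len(old)} files still reference only the old module path")
```

Subtracting the experimental hits matters because a plain substring search for the old path also matches every experimental import.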
@darshil929 Please make note of this comment:
@shumaari I'll take note of it, thanks.
Context
Historically, NNCF has maintained two sets of tensor statistics: the old tensor statistics and the new tensor statistics. The new tensor statistics, located in the experimental/common/tensor_statistics directory, offer improved functionality and better performance. However, the codebase still contains references to the old tensor statistics, leading to redundancy and potential confusion.
What needs to be done?
- Replace all usages of the old tensor statistics with the new ones from experimental/common/tensor_statistics.
- Replace the implementations in common/tensor_statistics/collectors with the corresponding implementations from experimental/common/tensor_statistics.
- Move the new implementations out of experimental/common/tensor_statistics.
- Remove common/tensor_statistics/reduction if it is not needed anymore.
In the end, there should be no experimental/common/tensor_statistics.
Example Pull Requests
#2117
Resources
Contact points
@kshpv
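The mechanical part of the steps above (swapping one dotted module path for another at call sites) could be sketched as follows. This is a hypothetical illustration, not NNCF tooling: the imported class name is a placeholder, real call sites also need API review because class and function names differ between the two implementations, and the rewrite direction depends on whether the experimental code is promoted in place first.

```python
# Hypothetical helper: rewrite standalone dotted-module references in source
# text. This only covers the mechanical path swap; mapping old collector
# classes to their experimental counterparts still needs manual review.
import re

def rewrite_module(source: str, old: str, new: str) -> str:
    """Replace standalone occurrences of module path `old` with `new`."""
    # Guards keep `old` from matching inside a longer dotted path, e.g. so
    # "common.tensor_statistics" never matches inside
    # "experimental.common.tensor_statistics".
    pattern = rf"(?<![\w.]){re.escape(old)}(?![\w])"
    return re.sub(pattern, new, source)

# Illustrative call site; "SomeCollector" is a made-up name.
src = "from nncf.common.tensor_statistics.collectors import SomeCollector"
print(rewrite_module(src, "nncf.common.tensor_statistics",
                     "nncf.experimental.common.tensor_statistics"))
# -> from nncf.experimental.common.tensor_statistics.collectors import SomeCollector
```

The lookbehind guard is what makes the rewrite safe to run repeatedly: already-migrated imports contain the old path only as part of the longer experimental path, so they are left untouched.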