
【PIR OpTest Fix No.38】 fix test_semi_auto_parallel_c_cross_entropy #59893

Merged: 9 commits from xingmingyyj:fix_test_c_softmax into PaddlePaddle:develop on Dec 25, 2023

Conversation

@xingmingyyj (Contributor) commented on Dec 11, 2023

PR types

Others

PR changes

Others

Description

PIR op unit test fix.
Fixes the unit test test_semi_auto_parallel_c_cross_entropy.
Does the test pass with FLAGS_enable_pir_in_executor enabled after the fix: yes.
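
For context, FLAGS_enable_pir_in_executor enables the PIR path in the executor, and that is how the fix is verified. A minimal sketch of how such a check might be run locally; the test path and invocation below are assumptions, not taken from the PR:

    # Hypothetical invocation: run the fixed test with PIR enabled in the executor.
    # Adjust the path to wherever the test lives in your Paddle checkout.
    FLAGS_enable_pir_in_executor=1 python test/auto_parallel/test_semi_auto_parallel_c_cross_entropy.py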


paddle-bot bot commented Dec 11, 2023

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

paddle-bot added the "contributor (External developers)" label on Dec 11, 2023
@xingmingyyj changed the title from 【PIR OpTest Fix No.38】 fix test_dist_cross_entropy_dp to 【PIR OpTest Fix No.38】 fix test_semi_auto_parallel_c_cross_entropy on Dec 20, 2023
@@ -267,6 +267,7 @@ test_seed_op
 test_segment_ops
 test_segment_ops_static_build
 test_selu_op
+test_semi_auto_parallel_c_cross_entropy
Contributor commented on the diff:

Adding this has no effect; this unit test does not inherit from OpTest.

Contributor Author replied:

> Adding this has no effect; this unit test does not inherit from OpTest.

True, I'll remove it later.

@kangguangli (Contributor) left a comment:

The changes to third_party/flashattn need to be reverted.

@xingmingyyj (Contributor Author) replied:

> The changes to third_party/flashattn need to be reverted.

OK.

kangguangli previously approved these changes on Dec 22, 2023
From00 previously approved these changes on Dec 22, 2023
@From00 (Contributor) left a comment:

LGTM

Comment on lines +1402 to +1409:

    - op: c_softmax_with_cross_entropy
      args: (Tensor logits, Tensor label, int64_t ignore_index=-100, int ring_id=0, int rank=0, int nranks=0)
      output: Tensor(softmax), Tensor(loss)
      infer_meta:
        func: CSoftmaxWithCrossEntropyInferMeta
      kernel:
        func: c_softmax_with_cross_entropy
        data_type: logits
Contributor commented:

Doesn't this need a backward config?
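
Judging by the commit list below ("Update ops_backward.yaml"), a backward entry was indeed added in response. As illustration only, here is a minimal sketch of what such an entry typically looks like under Paddle's yaml op-definition conventions; the argument list, infer_meta function name, and data_type are assumptions, not copied from the PR's actual diff:

    # Hypothetical backward entry; names and signature are assumed, not taken from the PR.
    - backward_op: c_softmax_with_cross_entropy_grad
      forward: c_softmax_with_cross_entropy (Tensor logits, Tensor label, int64_t ignore_index=-100, int ring_id=0, int rank=0, int nranks=0) -> Tensor(softmax), Tensor(loss)
      args: (Tensor softmax, Tensor label, Tensor loss_grad, int64_t ignore_index, int ring_id, int rank, int nranks)
      output: Tensor(logits_grad)
      infer_meta:
        func: CSoftmaxWithCrossEntropyGradInferMeta   # assumed grad infer_meta name
      kernel:
        func: c_softmax_with_cross_entropy_grad
        data_type: loss_grad                          # grad kernels usually key off the incoming grad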

@xingmingyyj dismissed stale reviews from From00 and kangguangli via bcb317b on Dec 22, 2023 07:16
@xingmingyyj requested a review from zyfncg on Dec 25, 2023 02:17
@kangguangli merged commit de7b288 into PaddlePaddle:develop on Dec 25, 2023
@xingmingyyj deleted the fix_test_c_softmax branch on Dec 25, 2023 06:57
Wanglongzhi2001 pushed a commit to Wanglongzhi2001/Paddle that referenced this pull request on Jan 7, 2024:
…(PaddlePaddle#59893)

* register c_softmax

* register c_softmax

* Update ops_backward.yaml

* Update utils.cc

* add test_semi_auto_parallel_c_cross_entropy to whitelist

* Revert "add test_semi_auto_parallel_c_cross_entropy to whitelist"

This reverts commit 75b3605.

* add pir test

* Update ops.yaml