
Commit a9614cf

[Week7 seq2seq] Fix wrong description of attention in week7 in bonus_pytorch.ipynb (yandexdataschool#429)
Some context: one of the students submitted homework with wrong logic in Attention: he multiplied the attention weights by their own logits instead of multiplying the encoder sequence by the attention weights. When I pointed out the error, he said that this logic was described in the course notebook. I checked, and it turns out that: 1. the original bonus.ipynb has the correct description and class draft; 2. one of the students then created bonus_pytorch.ipynb with a wrong description, and his PR was merged. Since most of the students now use PyTorch, they see the wrong description and write wrong code. This commit fixes the description.
1 parent 3460b35 commit a9614cf

File tree

1 file changed: +2, -2 lines changed


week07_seq2seq/bonus_pytorch.ipynb (+2, -2)
@@ -96,8 +96,8 @@
 "#### step-by-step guide:\n",
 "\n",
 "- compute scores between $h_{e, j}^i$ and $h_{d}^i$ $\\forall j = 1, ... , \\text{len(enc_seq)}$, where i -- number of decoder step\n",
-"- apply softmax to scores and get weights for each vector\n",
-"- obtain attention vector using scores and weight matrix\n",
+"- apply softmax to scores and get weight for each vector\n",
+"- obtain attention vector using enc_seq and weights\n",
 "\n"
 ]
},
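
For reference, below is a minimal sketch of the corrected attention logic in PyTorch, matching the step-by-step guide in the diff above. It assumes dot-product scoring and batch-first tensors; the function name `attention` and the shapes are illustrative, not the course's actual class interface.

import torch
import torch.nn.functional as F

def attention(enc_seq, dec_state):
    # enc_seq:   [batch, enc_len, hid] -- encoder hidden states h_{e, j}
    # dec_state: [batch, hid]          -- decoder hidden state h_d at step i

    # 1. compute a score between each encoder state and the decoder state
    #    (dot-product scoring is one simple choice; other scorers also work)
    scores = torch.bmm(enc_seq, dec_state.unsqueeze(-1)).squeeze(-1)  # [batch, enc_len]

    # 2. softmax over the encoder axis turns scores into a weight for each vector
    weights = F.softmax(scores, dim=-1)  # [batch, enc_len]

    # 3. the attention vector is the weighted sum of enc_seq: the weights are
    #    applied to the encoder states, NOT multiplied back onto the scores
    attn_vector = torch.bmm(weights.unsqueeze(1), enc_seq).squeeze(1)  # [batch, hid]
    return attn_vector, weights

# Usage: attention(torch.randn(4, 10, 128), torch.randn(4, 128))
# returns attn_vector of shape [4, 128] and weights of shape [4, 10].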
