Bidirectional RNN #1078
Comments
I believe they should be easy to implement. We don't have them by default yet - a PR adding them would be welcome!
---
Hey @lukaszkaiser, mind if I take a stab at this?
---
If nobody is currently working on this, I'll submit a PR. (@narayanacharya6)
---
I haven't started yet, so go for it, @zvikinoza.
---
I've just made a PR for the issue. I wasn't sure where to place it, so I just added it to …
---
In the PR I use …
---
I also have a question about the RNN implementation in Trax: why do we initialize the hidden state of GRU and LSTM layers proportionally to the dimension of their inputs? Shouldn't we pass …
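As a side note on the question above: in the textbook RNN formulation, the hidden state is sized by the number of units, independent of the input dimension. A minimal NumPy sketch (names and shapes are illustrative only, not Trax's actual implementation):

```python
import numpy as np

def rnn_forward(x, n_units, seed=0):
    """Run a vanilla RNN over x of shape (batch, seq_len, d_in).

    Illustrates that the hidden state has shape (batch, n_units),
    which does not depend on the input dimension d_in.
    """
    batch, seq_len, d_in = x.shape
    rng = np.random.default_rng(seed)
    W_x = rng.normal(scale=0.1, size=(d_in, n_units))    # input-to-hidden
    W_h = rng.normal(scale=0.1, size=(n_units, n_units)) # hidden-to-hidden
    b = np.zeros(n_units)
    h = np.zeros((batch, n_units))  # initial state: sized by n_units, not d_in
    outputs = []
    for t in range(seq_len):
        h = np.tanh(x[:, t] @ W_x + h @ W_h + b)
        outputs.append(h)
    return np.stack(outputs, axis=1)  # (batch, seq_len, n_units)
```

Only the input-to-hidden matrix `W_x` touches `d_in`; the state itself is `(batch, n_units)` throughout.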


Is there a way to train a bidirectional RNN (like an LSTM or GRU) in Trax nowadays?
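For anyone landing here: the idea behind a bidirectional wrapper can be sketched outside Trax. The sketch below is plain NumPy (`simple_rnn` and `bidirectional` are hypothetical helpers, not Trax APIs): one RNN scans the sequence forward, a second scans the time-reversed sequence, and the re-aligned outputs are concatenated along the feature axis.

```python
import numpy as np

def simple_rnn(x, W_x, W_h):
    """Vanilla RNN scan over x of shape (batch, seq_len, d_in)."""
    batch, seq_len, _ = x.shape
    n_units = W_h.shape[0]
    h = np.zeros((batch, n_units))
    outs = []
    for t in range(seq_len):
        h = np.tanh(x[:, t] @ W_x + h @ W_h)
        outs.append(h)
    return np.stack(outs, axis=1)  # (batch, seq_len, n_units)

def bidirectional(x, fwd_params, bwd_params):
    """Concatenate a forward scan with a scan over the time-reversed
    input, whose outputs are reversed again to align time steps."""
    fwd = simple_rnn(x, *fwd_params)
    bwd = simple_rnn(x[:, ::-1], *bwd_params)[:, ::-1]
    return np.concatenate([fwd, bwd], axis=-1)  # (batch, seq_len, 2*n_units)
```

Concatenation is the most common merge mode (it doubles the feature dimension); summing or averaging the two directions also works when the output width must match `n_units`.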