Name        : R-attention
Version     : 0.4.0
Vendor      : obs://build_opensuse_org/devel:languages:R
Release     : lp153.1.4
Date        : 2024-06-14 10:58:51
Group       : Development/Libraries/Other
Source RPM  : R-attention-0.4.0-lp153.1.4.src.rpm
Size        : 0.07 MB
Packager    : (none)
Summary     : Self-Attention Algorithm
Description :
Self-Attention algorithm helper functions and demonstration vignettes of increasing depth on how to construct the Self-Attention algorithm. This is based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) <https://web.stanford.edu/~jurafsky/slp3/> "Speech and Language Processing (3rd ed.)", and Alex Graves (2020) <https://www.youtube.com/watch?v=AIiwuClvH6k> "Attention and Memory in Deep Learning".
RPM found in directory: /packages/linux-pbone/ftp5.gwdg.de/pub/opensuse/repositories/devel:/languages:/R:/autoCRAN/openSUSE_Leap_15.3/x86_64
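
For orientation, the following is a minimal R sketch of the scaled dot-product self-attention described in Vaswani et al. (2017), the algorithm this package demonstrates. The function and variable names here (self_attention, Q, K, V, X) are illustrative assumptions and are not the package's exported API; consult the package vignettes for its actual helper functions.

# Minimal scaled dot-product self-attention (Vaswani et al., 2017).
# Illustrative sketch only; not the R-attention package's API.
self_attention <- function(Q, K, V) {
  d_k <- ncol(K)
  # Raw attention scores: similarity of each query with every key,
  # scaled by sqrt(d_k) so the softmax stays in a well-behaved range.
  scores <- (Q %*% t(K)) / sqrt(d_k)
  # Row-wise softmax turns scores into attention weights summing to 1.
  weights <- t(apply(scores, 1, function(row) {
    e <- exp(row - max(row))  # subtract the max for numerical stability
    e / sum(e)
  }))
  # Output: attention-weighted combination of the value vectors.
  weights %*% V
}

# Example: 4 tokens with 3-dimensional embeddings; Q = K = V for self-attention.
set.seed(1)
X <- matrix(rnorm(12), nrow = 4, ncol = 3)
self_attention(Q = X, K = X, V = X)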