Hierarchical Convolutional Attention Networks for Text Classification
Conference · OSTI ID: 1471854 · ORNL
Recent work in machine translation has demonstrated that self-attention mechanisms can be used in place of recurrent neural networks to increase training speed without sacrificing model accuracy. We propose combining this approach with the benefits of convolutional filters and a hierarchical structure to create a document classification model that is both highly accurate and fast to train – we name our method Hierarchical Convolutional Attention Networks. We demonstrate the effectiveness of this architecture by surpassing the accuracy of the current state-of-the-art on several classification tasks while being twice as fast to train.
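The abstract describes the architecture only at a high level: convolutional filters extract local n-gram features, self-attention mixes them across the sequence, attention pooling collapses each sentence to a vector, and the same stack is repeated over sentence vectors to produce a document vector for classification. Below is a minimal NumPy sketch of that data flow. All dimensions, parameter names, and the random inputs are hypothetical illustrations; the paper's actual model uses learned weights, multi-head attention, and real word embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 16           # embedding / hidden size (assumed for illustration)
NUM_CLASSES = 4  # assumed number of document classes

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def conv1d_same(x, w):
    """Width-k convolution over a (seq_len, d_in) sequence,
    zero-padded so the output keeps the input length."""
    k, d_in, d_out = w.shape
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    return np.stack([np.tensordot(xp[t:t + k], w, axes=([0, 1], [0, 1]))
                     for t in range(x.shape[0])])

def self_attention(x, wq, wk, wv):
    """Single-head scaled dot-product self-attention."""
    q, k, v = x @ wq, x @ wk, x @ wv
    return softmax(q @ k.T / np.sqrt(wq.shape[1])) @ v

def attention_pool(x, u):
    """Collapse a sequence to one vector using a learned context vector u."""
    return softmax(x @ u) @ x

def make_params():
    # One encoder's parameters: width-3 convolution, attention projections,
    # and a pooling context vector. Randomly initialized for the sketch.
    return {"conv": rng.normal(scale=0.1, size=(3, D, D)),
            "wq": rng.normal(scale=0.1, size=(D, D)),
            "wk": rng.normal(scale=0.1, size=(D, D)),
            "wv": rng.normal(scale=0.1, size=(D, D)),
            "u": rng.normal(scale=0.1, size=D)}

def encode(seq, p):
    """Convolution (local features) -> self-attention -> pooled vector."""
    h = np.tanh(conv1d_same(seq, p["conv"]))
    return attention_pool(self_attention(h, p["wq"], p["wk"], p["wv"]), p["u"])

word_params, sent_params = make_params(), make_params()
w_out = rng.normal(scale=0.1, size=(D, NUM_CLASSES))

# A toy "document": 3 sentences of random word embeddings.
document = [rng.normal(size=(n, D)) for n in (7, 5, 9)]

# Word level: each sentence -> one sentence vector.
sent_vecs = np.stack([encode(s, word_params) for s in document])
# Sentence level: sentence vectors -> document vector -> class probabilities.
doc_vec = encode(sent_vecs, sent_params)
probs = softmax(doc_vec @ w_out)
print(probs.shape)
```

The hierarchy is what keeps this cheap: attention cost is quadratic in sequence length, so attending within short sentences and then over a handful of sentence vectors is far cheaper than attending over the whole document at once.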
- Research Organization:
- Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
- Sponsoring Organization:
- USDOE Office of Science (SC)
- DOE Contract Number:
- AC05-00OR22725
- OSTI ID:
- 1471854
- Resource Relation:
- Conference: Proceedings of the Third Workshop on Representation Learning for NLP, Melbourne, Australia, July 20, 2018
- Country of Publication:
- United States
- Language:
- English
Similar Records
Classifying Cancer Pathology Reports with Hierarchical Self-Attention Networks
Journal Article
·
October 15, 2019
· Artificial Intelligence in Medicine
Hierarchical attention networks for information extraction from cancer pathology reports
Journal Article
·
November 16, 2017
· Journal of the American Medical Informatics Association
Hierarchical Convolutional Neural Networks for Event Classification on PMU Measurements
Journal Article
·
September 24, 2021
· IEEE Transactions on Instrumentation and Measurement