Visualizing and Measuring the Geometry of BERT
Coenen et al.: https://arxiv.org/abs/1906.02715
#BERT #NaturalLanguageProcessing #UnsupervisedLearning
Transformer architectures show significant promise for natural language processing. Given that a single pretrained model can be fine-tuned to perform well on many different tasks, these networks...
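Among other results, the paper relates parse-tree distance to squared Euclidean distance between BERT's context embeddings. A minimal sketch (not the authors' code; model choice is illustrative) of extracting those embeddings and distances with the transformers library:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("The chef who ran to the store was out of food",
                   return_tensors="pt")
with torch.no_grad():
    # Per-token context embeddings from the final layer: (seq_len, 768).
    hidden = model(**inputs).last_hidden_state[0]

# Pairwise squared Euclidean distances between token embeddings,
# the quantity the paper compares against parse-tree distance.
dists = torch.cdist(hidden, hidden).pow(2)
print(dists.shape)
```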
ACL 2019 Thoughts and Notes
By Vinit Ravishankar, Daniel Hershcovich; edited by Artur Kulmizev, Mostafa Abdou : https://supernlp.github.io/2019/08/16/acl-2019/
#naturallanguageprocessing #machinelearning #deeplearning
On Extractive and Abstractive Neural Document Summarization with Transformer Language Models
Sandeep Subramanian, Raymond Li, Jonathan Pilault, Christopher Pal : https://arxiv.org/abs/1909.03186
#transformer #naturallanguageprocessing #machinelearning
Transformers: State-of-the-art Natural Language Processing
Thomas Wolf, Lysandre Debut, Victor Sanh, Julien Chaumond, Clement Delangue, Anthony Moi, Pierric Cistac, Tim Rault, Rémi Louf, Morgan Funtowicz, Jamie Brew : https://arxiv.org/abs/1910.03771
#Transformers #NaturalLanguageProcessing #PyTorch #TensorFlow
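The paper describes the open-source transformers library. A minimal usage sketch (API as of the paper's release; default models and signatures may shift across versions):

```python
from transformers import pipeline

# Downloads a default pretrained model and runs sentiment analysis.
classifier = pipeline("sentiment-analysis")
print(classifier("Transfer learning makes NLP pipelines short."))
```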
The State of Transfer Learning in NLP
By Sebastian Ruder : http://ruder.io/state-of-transfer-learning-in-nlp/
#TransferLearning #NaturalLanguageProcessing #NLP
This post expands on the NAACL 2019 tutorial on Transfer Learning in NLP. It highlights key insights and takeaways and provides updates based on recent work.
"MultiFiT: Efficient Multi-lingual Language Model Fine-tuning"
Eisenschlos et al.: https://arxiv.org/abs/1909.04761
Post: http://nlp.fast.ai/classification/2019/09/10/multifit.html
#ArtificialIntelligence #MachineLearning #NaturalLanguageProcessing
Evaluating the Factual Consistency of Abstractive Text Summarization
Kryscinski et al.: https://arxiv.org/abs/1910.12840
#ArtificialIntelligence #DeepLearning #NaturalLanguageProcessing
Currently used metrics for assessing summarization algorithms do not account for whether summaries are factually consistent with source documents. We propose a weakly-supervised, model-based...
Q8BERT: Quantized 8Bit BERT
Zafrir et al.: https://arxiv.org/abs/1910.06188
#NaturalLanguageProcessing #NLP #Transformer
Recently, pre-trained Transformer based language models such as BERT and GPT, have shown great improvement in many Natural Language Processing (NLP) tasks. However, these models contain a large...
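Q8BERT compresses BERT by quantizing weights and activations to 8 bits during fine-tuning (quantization-aware training). As a hedged illustration of the idea, here is plain post-training dynamic quantization in PyTorch, a simpler technique than the paper's that still shrinks the Linear layers roughly 4x:

```python
import torch
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")

# Convert Linear-layer weights to int8 at load time; activations are
# quantized dynamically at inference. Not the paper's method, but a
# quick way to see the storage/compute saving on the same architecture.
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)
```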
Language Models as Knowledge Bases?
Petroni et al.: https://arxiv.org/abs/1909.01066
#Transformers #NaturalLanguageProcessing #MachineLearning
Recent progress in pretraining language models on large textual corpora led to a surge of improvements for downstream NLP tasks. Whilst learning linguistic knowledge, these models may also be...
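The paper probes pretrained language models with cloze-style factual queries such as "Dante was born in [MASK]". A minimal sketch of that probing setup using the transformers fill-mask pipeline (model choice is illustrative, not the paper's exact configuration):

```python
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# The model's top completions stand in for "knowledge base" lookups.
for pred in fill("Dante was born in [MASK]."):
    print(pred["token_str"], round(pred["score"], 3))
```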
ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
Clark et al.: https://openreview.net/forum?id=r1xMH1BtvB
#ArtificialIntelligence #DeepLearning #NaturalLanguageProcessing
A text encoder trained to distinguish real input tokens from plausible fakes efficiently learns effective language representations.
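The training signal is replaced token detection: a small generator proposes plausible substitute tokens, and the discriminator classifies every input position as original or replaced. A hedged sketch of the per-token discriminator loss (names and shapes are illustrative, not the authors' code):

```python
import torch
import torch.nn.functional as F

def discriminator_loss(logits, is_replaced):
    """logits: (batch, seq_len) real/fake scores from the discriminator.
    is_replaced: (batch, seq_len) 1.0 where the generator swapped the token."""
    return F.binary_cross_entropy_with_logits(logits, is_replaced)

# Toy tensors standing in for discriminator outputs and replacement labels.
logits = torch.randn(2, 8)
labels = torch.randint(0, 2, (2, 8)).float()
print(discriminator_loss(logits, labels))
```

Because every token contributes to this loss (rather than the ~15% masked in standard MLM pretraining), the objective is markedly more sample-efficient, which is the efficiency claim in the paper's summary above.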
CS224N : Natural Language Processing with Deep Learning
https://www.youtube.com/playlist?list=PLU40WL8Ol94IJzQtileLTqGZuXtGlLMP_
#NaturalLanguageProcessing #DeepLearning #ArtificialIntelligence