On Artificial Intelligence
If you want to know more about science, especially Artificial Intelligence, this is the right place for you.
Admin Contact:
@Oriea
Forwarded from Tensorflow(@CVision) (Alireza Akhavan)
#news
BERT now supports Persian.
Persian (Farsi)
https://github.com/google-research/bert/blob/master/multilingual.md#list-of-languages
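
A minimal sketch of trying this out, assuming the Hugging Face transformers library and the bert-base-multilingual-cased checkpoint (the announcement itself only links the Google repo):

```python
# Sketch: run multilingual BERT on a Persian sentence.
# Assumes the Hugging Face `transformers` package and PyTorch are installed;
# neither is mentioned in the original announcement.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")

text = "برت اکنون از فارسی پشتیبانی می‌کند."  # "BERT now supports Persian."
inputs = tokenizer(text, return_tensors="pt")
print(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist()))  # WordPiece tokens

outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768) contextual embeddings
```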

Related Medium post on BERT's multilinguality:
Hallo multilingual BERT, cómo funcionas?
https://medium.com/omnius/hallo-multilingual-bert-c%C3%B3mo-funcionas-2b3406cc4dc2

A Persian-language post by Mr. Khoshmehr on BERT and its applications:
Introducing BERT, a transformation in NLP
http://blog.class.vision/1397/09/bert-in-nlp/

#bert #nlp
Forwarded from Tensorflow(@CVision) (Alireza Akhavan)
Language, trees, and geometry in neural networks

A visualization technique to understand BERT.
https://twitter.com/burkov/status/1139391818443808769
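
The linked work builds purpose-built visualizations of syntax trees in BERT's embedding space; as a rough stand-in, this sketch projects BERT's per-token embeddings to 2-D with PCA (transformers, PyTorch and scikit-learn are assumptions here, not tools named in the tweet):

```python
# Rough stand-in for the linked visualizations: project BERT token embeddings
# to 2-D with PCA and print the coordinates. Not the technique from the paper.
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.decomposition import PCA

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModel.from_pretrained("bert-base-cased")

sentence = "The keys to the cabinet are on the table."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state[0]  # (num_tokens, 768)

coords = PCA(n_components=2).fit_transform(hidden.numpy())
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for token, (x, y) in zip(tokens, coords):
    print(f"{token:>10s}  {x:+.3f}  {y:+.3f}")
```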

#bert #NLP
Forwarded from Tensorflow(@CVision) (Vahid Reza Khazaie)
New Google Brain Optimizer Reduces BERT Pre-Training Time From Days to Minutes

Reducing the pre-training time of the BERT language model from three days to 76 minutes by introducing a new optimizer!

Google Brain researchers have proposed LAMB (Layer-wise Adaptive Moments optimizer for Batch training), a new optimizer that reduces the pre-training time of Google's NLP model BERT (Bidirectional Encoder Representations from Transformers) from three days to just 76 minutes.

Paper link: https://arxiv.org/abs/1904.00962
Blog post link: https://medium.com/syncedreview/new-google-brain-optimizer-reduces-bert-pre-training-time-from-days-to-minutes-b454e54eda1d
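
The post links only the paper; as an illustration of what using a LAMB implementation looks like in practice, here is a minimal Keras sketch that assumes the TensorFlow Addons package (tfa.optimizers.LAMB), which is not mentioned above:

```python
# Minimal sketch: compile a Keras model with a LAMB optimizer from
# TensorFlow Addons. tensorflow_addons is an assumption of this sketch.
import tensorflow as tf
import tensorflow_addons as tfa

model = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu", input_shape=(768,)),
    tf.keras.layers.Dense(2),
])

# LAMB rescales each layer's update by the ratio of the weight norm to the
# update norm, which is what lets the paper scale BERT pre-training to very
# large batch sizes without losing accuracy.
optimizer = tfa.optimizers.LAMB(learning_rate=1e-3, weight_decay_rate=0.01)

model.compile(optimizer=optimizer,
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=["accuracy"])
```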

#BERT #language_model #optimizer
Forwarded from Tensorflow(@CVision) (Vahid Reza Khazaie)
Fast-Bert

This library will help you build and deploy BERT-based models within minutes:

Fast-Bert is a deep learning library that allows developers and data scientists to train and deploy BERT- and XLNet-based models for natural language processing tasks, beginning with text classification.

FastBert is built on the solid foundations of the excellent Hugging Face BERT PyTorch library, is inspired by fast.ai, and strives to make cutting-edge deep learning accessible to the broad community of machine learning practitioners.

With FastBert, you will be able to:

Train (more precisely fine-tune) BERT, RoBERTa and XLNet text classification models on your custom dataset.

Tune model hyper-parameters such as epochs, learning rate, batch size, optimiser schedule and more.

Save and deploy the trained model for inference (including on AWS SageMaker).

Fast-Bert supports both multi-class and multi-label text classification, and in due course it will support other NLU tasks such as Named Entity Recognition, Question Answering and custom corpus fine-tuning.

Blog post: https://medium.com/huggingface/introducing-fastbert-a-simple-deep-learning-library-for-bert-models-89ff763ad384

Code: https://github.com/kaushaltrivedi/fast-bert
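
A condensed usage sketch based on the project's README; paths and hyper-parameters are placeholders, and the exact argument names may have changed in later releases:

```python
# Condensed from the Fast-Bert README; file paths and hyper-parameters are
# placeholders, and the API may differ in newer versions of the library.
import logging
import torch
from fast_bert.data_cls import BertDataBunch
from fast_bert.learner_cls import BertLearner
from fast_bert.metrics import accuracy

databunch = BertDataBunch("./data/", "./labels/",
                          tokenizer="bert-base-uncased",
                          train_file="train.csv",
                          val_file="val.csv",
                          label_file="labels.csv",
                          text_col="text",
                          label_col="label",
                          batch_size_per_gpu=16,
                          max_seq_length=256,
                          multi_gpu=False,
                          multi_label=False,
                          model_type="bert")

learner = BertLearner.from_pretrained_model(
    databunch,
    pretrained_path="bert-base-uncased",
    metrics=[{"name": "accuracy", "function": accuracy}],
    device=torch.device("cuda" if torch.cuda.is_available() else "cpu"),
    logger=logging.getLogger(),
    output_dir="./output/",
    multi_gpu=False,
    multi_label=False)

learner.fit(epochs=4, lr=6e-5, schedule_type="warmup_cosine")
learner.save_model()  # The saved artifacts can then be served, e.g. on AWS SageMaker.
```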

#language_model #BERT