Python Data Science Jobs & Interviews
Your go-to hub for Python and Data Science—featuring questions, answers, quizzes, and interview tips to sharpen your skills and boost your career in the data-driven world.

Admin: @Hussein_Sheikho
Question 6 (Advanced):
Which of the following attention mechanisms is used in transformers?

A) Hard Attention
B) Additive Attention
C) Self-Attention
D) Bahdanau Attention

#Transformers #NLP #DeepLearning #AttentionMechanism #AI
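A minimal sketch of scaled dot-product self-attention, the mechanism this question asks about, assuming PyTorch is available; the tensor sizes and random weights are illustrative only:

# Each token in x attends to every other token in the same sequence.
import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    # x: (batch, seq_len, d_model)
    q = x @ w_q                                      # queries
    k = x @ w_k                                      # keys
    v = x @ w_v                                      # values
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5    # (batch, seq_len, seq_len)
    weights = F.softmax(scores, dim=-1)              # attention distribution
    return weights @ v                               # weighted sum of values

batch, seq_len, d_model = 2, 5, 16
x = torch.randn(batch, seq_len, d_model)
w_q, w_k, w_v = (torch.randn(d_model, d_model) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)        # torch.Size([2, 5, 16])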
Question 10 (Advanced):
In the Transformer architecture (PyTorch), what is the purpose of masked multi-head attention in the decoder?

A) To prevent the model from peeking at future tokens during training
B) To reduce GPU memory usage
C) To handle variable-length input sequences
D) To normalize gradient updates

#Python #Transformers #DeepLearning #NLP #AI

By: https://yangx.top/DataScienceQ
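A small sketch of the causal (look-ahead) mask behind the decoder's masked multi-head attention, assuming PyTorch; positions above the diagonal are set to -inf so a token at step t gets zero attention weight on steps after t. Sizes are illustrative.

import torch
import torch.nn as nn

seq_len, d_model, n_heads = 6, 32, 4
x = torch.randn(1, seq_len, d_model)                      # (batch, seq, features)

# 0 on and below the diagonal, -inf above it
causal_mask = nn.Transformer.generate_square_subsequent_mask(seq_len)

mha = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
out, attn_weights = mha(x, x, x, attn_mask=causal_mask)

# Weights above the diagonal are exactly zero: no peeking at future tokens.
print(attn_weights[0].triu(diagonal=1).abs().sum())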
Question 32 (Advanced - NLP & RNNs):
What is the key limitation of vanilla RNNs for NLP tasks that led to the development of LSTMs and GRUs?

A) Vanishing gradients in long sequences
B) High GPU memory usage
C) Inability to handle embeddings
D) Single-direction processing only

#Python #NLP #RNN #DeepLearning

By: https://yangx.top/DataScienceQ
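An illustrative sketch of the vanishing-gradient issue, assuming PyTorch; it compares the gradient that reaches the first time step of a long sequence for a vanilla RNN versus an LSTM. Exact numbers depend on initialization, but the tanh RNN's gradient tends to shrink toward zero while the LSTM's cell state preserves it better.

import torch
import torch.nn as nn

torch.manual_seed(0)
seq_len, d_in, d_hidden = 200, 8, 32
x = torch.randn(1, seq_len, d_in, requires_grad=True)

for name, rnn in [("RNN", nn.RNN(d_in, d_hidden, batch_first=True)),
                  ("LSTM", nn.LSTM(d_in, d_hidden, batch_first=True))]:
    out, _ = rnn(x)
    loss = out[:, -1].sum()                        # loss depends only on the last step
    loss.backward()
    grad_t0 = x.grad[0, 0].norm().item()           # gradient at the first input
    print(f"{name}: grad norm at t=0 = {grad_t0:.2e}")
    x.grad = None                                  # reset before the next model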