Python Data Science Jobs & Interviews
Your go-to hub for Python and Data Science: questions, answers, quizzes, and interview tips to sharpen your skills and boost your career in the data-driven world.

Admin: @Hussein_Sheikho
Question 5 (Intermediate):
In a neural network, what does the ReLU activation function return?

A) 1 / (1 + e^-x)
B) max(0, x)
C) x^2
D) e^x / (e^x + 1)

#NeuralNetworks #DeepLearning #ActivationFunctions #ReLU #AI
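For reference, ReLU returns max(0, x), i.e. option B: negative inputs are clamped to zero and positive inputs pass through unchanged. A minimal PyTorch sketch (the input values are arbitrary toy numbers chosen for illustration):

```python
import torch
import torch.nn as nn

# ReLU(x) = max(0, x): negatives become 0, positives are unchanged.
x = torch.tensor([-2.0, -0.5, 0.0, 1.5, 3.0])
relu = nn.ReLU()
print(relu(x))                 # tensor([0.0000, 0.0000, 0.0000, 1.5000, 3.0000])
print(torch.clamp(x, min=0))   # equivalent: clamp below at zero
```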
Question 6 (Advanced):
Which of the following attention mechanisms is used in transformers?

A) Hard Attention
B) Additive Attention
C) Self-Attention
D) Bahdanau Attention

#Transformers #NLP #DeepLearning #AttentionMechanism #AI
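The answer is C: transformers are built around self-attention, where the queries, keys, and values are all projections of the same input sequence. A minimal sketch of single-head scaled dot-product self-attention, assuming toy dimensions (batch 1, 4 tokens, model width 8):

```python
import torch
import torch.nn.functional as F

# Self-attention: Q, K, V all come from the SAME input x.
torch.manual_seed(0)
x = torch.randn(1, 4, 8)                       # (batch, seq_len, d_model)
W_q, W_k, W_v = torch.randn(3, 8, 8)           # toy projection matrices
Q, K, V = x @ W_q, x @ W_k, x @ W_v
scores = Q @ K.transpose(-2, -1) / 8 ** 0.5    # (1, 4, 4) token-pair scores
weights = F.softmax(scores, dim=-1)            # each row sums to 1
out = weights @ V                              # (1, 4, 8) attended output
print(out.shape)
```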
Question 10 (Advanced):
In the Transformer architecture (PyTorch), what is the purpose of masked multi-head attention in the decoder?

A) To prevent the model from peeking at future tokens during training
B) To reduce GPU memory usage
C) To handle variable-length input sequences
D) To normalize gradient updates

#Python #Transformers #DeepLearning #NLP #AI
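The answer is A: the causal mask in the decoder blocks attention to positions after the current one, so during teacher-forced training the model cannot peek at future tokens. A minimal PyTorch sketch, assuming toy sizes (sequence length 5, width 16, 4 heads):

```python
import torch
import torch.nn as nn

# Boolean causal mask: True entries mark positions a token may NOT attend to,
# i.e. everything strictly above the diagonal (the "future").
seq_len, d_model, n_heads = 5, 16, 4
mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)

mha = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
x = torch.randn(2, seq_len, d_model)           # (batch, seq_len, d_model)
out, attn = mha(x, x, x, attn_mask=mask)       # masked self-attention
print(attn[0])                                 # upper triangle is all zeros
```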

Question 11 (Expert):
In Vision Transformers (ViT), how are image patches typically converted into input tokens for the transformer encoder?

A) Raw pixel values are used directly
B) Each patch is flattened and linearly projected
C) Patches are processed through a CNN first
D) Edge detection is applied before projection

#Python #ViT #ComputerVision #DeepLearning #Transformers
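The answer is B: ViT splits the image into fixed-size patches, flattens each one, and linearly projects it to the model width. One common implementation trick is a Conv2d whose kernel size and stride both equal the patch size, which performs the split and the projection in a single step. A minimal sketch, assuming the usual ViT-Base settings (224x224 image, 16x16 patches, width 768):

```python
import torch
import torch.nn as nn

# Patch embedding: kernel = stride = patch size, so each 16x16 patch is
# flattened and linearly projected exactly once, with no overlap.
img = torch.randn(1, 3, 224, 224)              # (batch, channels, H, W)
patch_size, d_model = 16, 768
proj = nn.Conv2d(3, d_model, kernel_size=patch_size, stride=patch_size)
tokens = proj(img).flatten(2).transpose(1, 2)  # (1, 196, 768): 14x14 patches
print(tokens.shape)                            # one token per patch
```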
