Data Science Machine Learning Data Analysis
This channel is for programmers, coders, and software engineers.

1- Data Science
2- Machine Learning
3- Data Visualization
4- Artificial Intelligence
5- Data Analysis
6- Statistics
7- Deep Learning

Cross promotion and ads: @hussein_sheikho
🔗 Machine Learning from Scratch by Danny Friedman

This book is for readers looking to learn new #machinelearning algorithms or understand algorithms at a deeper level. Specifically, it is intended for readers interested in seeing machine learning algorithms derived from start to finish. Seeing these derivations might help a reader previously unfamiliar with common algorithms understand how they work intuitively. Or, seeing these derivations might help a reader experienced in modeling understand how different #algorithms create the models they do and the advantages and disadvantages of each one.

This book will be most helpful for those with practice in basic modeling. It does not review best practices—such as feature engineering or balancing response variables—or discuss in depth when certain models are more appropriate than others. Instead, it focuses on the elements of those models.


https://dafriedman97.github.io/mlbook/content/introduction.html

#DataAnalytics #Python #SQL #RProgramming #DataScience #MachineLearning #DeepLearning #Statistics #DataVisualization #PowerBI #Tableau #LinearRegression #Probability #DataWrangling #Excel #AI #ArtificialIntelligence #BigData #DataAnalysis #NeuralNetworks #GAN #LearnDataScience #LLM #RAG #Mathematics #PythonProgramming  #Keras

https://yangx.top/CodeProgrammer
📚 Become a professional data scientist with these 17 resources!



1️⃣ Python libraries for machine learning

◀️ Introducing the best Python tools and packages for building ML models.



2️⃣ Deep Learning Interactive Book

◀️ Learn deep learning concepts by combining text, math, code, and images.



3️⃣ Anthology of Data Science Learning Resources

◀️ The best courses, books, and tools for learning data science.



4️⃣ Implementing algorithms from scratch

◀️ Coding popular ML algorithms from scratch



5️⃣ Machine Learning Interview Guide

◀️ Fully prepared for job interviews



6️⃣ Real-world machine learning projects

◀️ Learning how to build and deploy models.



7️⃣ Designing machine learning systems

◀️ How to design a scalable and stable ML system.



8️⃣ Machine Learning Mathematics

◀️ Basic mathematical concepts necessary to understand machine learning.



9️⃣ Introduction to Statistical Learning

◀️ Learn algorithms with practical examples.



🔟 Machine learning with a probabilistic approach

◀️ Better understanding modeling and uncertainty with a statistical perspective.



1️⃣1️⃣ UBC Machine Learning

◀️ Deep understanding of machine learning concepts with conceptual teaching from one of the leading professors in the field of ML.



1️⃣2️⃣ Deep Learning with Andrew Ng

◀️ A strong start in the world of neural networks, CNNs and RNNs.



1️⃣3️⃣ Linear Algebra with 3Blue1Brown

◀️ Intuitive and visual teaching of linear algebra concepts.



1️⃣4️⃣ Machine Learning Course

◀️ A combination of theory and practical training to strengthen ML skills.



1️⃣5️⃣ Mathematical Optimization with Python

◀️ You will learn the basic concepts of optimization with Python code.



1️⃣6️⃣ Explainable models in machine learning

◀️ Making complex models understandable.



1️⃣7️⃣ Data Analysis with Python

◀️ Data analysis skills using Pandas and NumPy libraries.


#DataScience #MachineLearning #DeepLearning #Python #AI #MLProjects #DataAnalysis #ExplainableAI #100DaysOfCode #TechEducation #MLInterviewPrep #NeuralNetworks #MathForML #Statistics #Coding #AIForEveryone #PythonForDataScience



⚡️ BEST DATA SCIENCE CHANNELS ON TELEGRAM 🌟
🚀 Master the Transformer Architecture with PyTorch! 🧠

Dive deep into the world of Transformers with this comprehensive PyTorch implementation guide. Whether you're a seasoned ML engineer or just starting out, this resource breaks down the complexities of the Transformer model, inspired by the groundbreaking paper "Attention Is All You Need".

🔗 Check it out here:
https://www.k-a.in/pyt-transformer.html

This guide offers:

🌟 Detailed explanations of each component of the Transformer architecture.

🌟 Step-by-step code implementations in PyTorch.

🌟 Insights into the self-attention mechanism and positional encoding.

By following along, you'll gain a solid understanding of how Transformers work and how to implement them from scratch.
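
To make the self-attention idea concrete, here is a minimal single-head scaled dot-product attention module in PyTorch. It is only a sketch of the mechanism the guide explains, not code taken from the guide, and the names (SelfAttention, d_model) are placeholders:

import math
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    """Minimal single-head scaled dot-product self-attention (illustration only)."""
    def __init__(self, d_model):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)

    def forward(self, x):  # x: [batch, seq_len, d_model]
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))  # [batch, seq_len, seq_len]
        attn = scores.softmax(dim=-1)  # attention weights
        return attn @ v  # weighted sum of the values

x = torch.randn(2, 5, 64)  # 2 sequences of length 5, d_model = 64
print(SelfAttention(64)(x).shape)  # torch.Size([2, 5, 64])

In the full architecture this block is repeated across multiple heads and combined with positional encodings, which the guide covers in detail.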

#MachineLearning #DeepLearning #PyTorch #Transformer #AI #NLP #AttentionIsAllYouNeed #Coding #DataScience #NeuralNetworks


💯 BEST DATA SCIENCE CHANNELS ON TELEGRAM 🌟

🧠💻📊
10 GitHub repos to build a career in AI engineering:

(100% free step-by-step roadmap)

1️⃣ ML for Beginners by Microsoft

A 12-week project-based curriculum that teaches classical ML using Scikit-learn on real-world datasets.

Includes quizzes, lessons, and hands-on projects, with some videos.

GitHub repo → https://lnkd.in/dCxStbYv

2️⃣ AI for Beginners by Microsoft

This repo covers neural networks, NLP, CV, transformers, ethics & more. There are hands-on labs in PyTorch & TensorFlow using Jupyter.

Beginner-friendly, project-based, and full of real-world apps.

GitHub repo → https://lnkd.in/dwS5Jk9E

3️⃣ Neural Networks: Zero to Hero

Now that you’ve grasped the foundations of AI/ML, it’s time to dive deeper.

This repo by Andrej Karpathy builds modern deep learning systems from scratch, including GPTs.

GitHub repo → https://lnkd.in/dXAQWucq

4️⃣ DL Paper Implementations

So far, you have learned the fundamentals of AI, ML, and DL. Now study how the best architectures work.

This repo covers well-documented PyTorch implementations of 60+ research papers on Transformers, GANs, Diffusion models, etc.

GitHub repo → https://lnkd.in/dTrtDrvs

5️⃣ Made With ML

Now it’s time to learn how to go from notebooks to production.

Made With ML teaches you how to design, develop, deploy, and iterate on real-world ML systems using MLOps, CI/CD, and best practices.

GitHub repo → https://lnkd.in/dYyjjBGb

6️⃣ Hands-on LLMs

- You've built neural nets.
- You've explored GPTs and LLMs.

Now apply them. This is a visually rich repo that covers everything about LLMs, like tokenization, fine-tuning, RAG, etc.

GitHub repo → https://lnkd.in/dh2FwYFe

7️⃣ Advanced RAG Techniques

Hands-on LLMs will give you a good grasp of RAG systems. Now learn advanced RAG techniques.

This repo covers 30+ methods to make RAG systems faster, smarter, and more accurate, like HyDE, GraphRAG, etc.

GitHub repo → https://lnkd.in/dBKxtX-D

8️⃣ AI Agents for Beginners by Microsoft

After diving into LLMs and mastering RAG, learn how to build AI agents.

This hands-on course covers building AI agents using frameworks like AutoGen.

GitHub repo → https://lnkd.in/dbFeuznE

9️⃣ Agents Towards Production

The above course will teach what AI agents are. Next, learn how to ship them.

This is a practical playbook for building agents covering memory, orchestration, deployment, security & more.

GitHub repo → https://lnkd.in/dcwmamSb

🔟 AI Engineering Hub

To truly master LLMs, RAG, and AI agents, you need projects.

This covers 70+ real-world examples, tutorials, and agent apps you can build, adapt, and ship.

GitHub repo → https://lnkd.in/geMYm3b6

#AIEngineering #MachineLearning #DeepLearning #LLMs #RAG #MLOps #Python #GitHubProjects #AIForBeginners #ArtificialIntelligence #NeuralNetworks #OpenSourceAI #DataScienceCareers


✉️ Our Telegram channels: https://yangx.top/addlist/0f6vfFbEMdAwODBk

📱 Our WhatsApp channel: https://whatsapp.com/channel/0029VaC7Weq29753hpcggW2A
Auto-Encoder & Backpropagation by hand ✍️ lecture video ~ 📺 https://byhand.ai/cv/10

It took me a few years to develop this method of showing both the forward and backward passes for a non-trivial case: a multi-layer perceptron over a batch of inputs, plus gradient descent over multiple epochs, while being able to hand-calculate each step and code it in Excel at the same time. (A code sketch follows the chapter list below.)

= Chapters =
• Encoder & Decoder (00:00)
• Equation (10:09)
• 4-2-4 AutoEncoder (16:38)
• 6-4-2-4-6 AutoEncoder (18:39)
• L2 Loss (20:49)
• L2 Loss Gradient (27:31)
• Backpropagation (30:12)
• Implement Backpropagation (39:00)
• Gradient Descent (44:30)
• Summary (51:39)
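
For readers who prefer code alongside the hand calculations, here is a compact PyTorch sketch of the 4-2-4 auto-encoder with L2 loss and gradient descent from the chapter list. The layer sizes come from the chapter titles; the activation, learning rate, and epoch count are assumptions, not values from the lecture:

import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(4, 2), nn.ReLU(),  # encoder: 4 -> 2 (ReLU is an assumption)
    nn.Linear(2, 4),             # decoder: 2 -> 4
)
loss_fn = nn.MSELoss()           # L2 (squared-error) loss
opt = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.rand(8, 4)             # a small batch of 4-dimensional inputs
for epoch in range(100):         # gradient descent over multiple epochs
    opt.zero_grad()
    loss = loss_fn(model(x), x)  # reconstruct the input
    loss.backward()              # backpropagation
    opt.step()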

#AIEngineering #MachineLearning #DeepLearning #LLMs #RAG #MLOps #Python #GitHubProjects #AIForBeginners #ArtificialIntelligence #NeuralNetworks #OpenSourceAI #DataScienceCareers


✉️ Our Telegram channels: https://yangx.top/addlist/0f6vfFbEMdAwODBk
What is torch.nn really?

When I started working with PyTorch, my biggest question was: "What is torch.nn?"


This article explains it quite well.

📌 Read
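
Roughly, the article's point is that torch.nn packages the parameters and layers you would otherwise have to create and track by hand. A minimal sketch of that contrast (not taken from the article):

import torch
import torch.nn as nn

x = torch.randn(3, 4)

# Manual version: you create and track the parameters yourself.
w = torch.randn(4, 2, requires_grad=True)
b = torch.zeros(2, requires_grad=True)
out_manual = x @ w + b

# torch.nn version: nn.Linear owns the weight and bias as registered parameters.
layer = nn.Linear(4, 2)
out_module = layer(x)

print(out_manual.shape, out_module.shape)  # torch.Size([3, 2]) torch.Size([3, 2])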

#pytorch #AIEngineering #MachineLearning #DeepLearning #LLMs #RAG #MLOps #Python #GitHubProjects #AIForBeginners #ArtificialIntelligence #NeuralNetworks #OpenSourceAI #DataScienceCareers


✉️ Our Telegram channels: https://yangx.top/addlist/0f6vfFbEMdAwODBk
Topic: CNN (Convolutional Neural Networks) – Part 1: Introduction and Basic Concepts

---

1. What is a CNN?

• A Convolutional Neural Network (CNN) is a type of deep learning model primarily used for analyzing visual data.

• CNNs automatically learn spatial hierarchies of features through convolutional layers.

---

2. Key Components of CNN

Convolutional Layer: Applies filters (kernels) to input images to extract features like edges, textures, and shapes.

Activation Function: Usually ReLU (Rectified Linear Unit) is applied after convolution for non-linearity.

Pooling Layer: Reduces the spatial size of feature maps, typically using Max Pooling.

Fully Connected Layer: After feature extraction, maps features to output classes.

---

3. How Convolution Works

• A kernel (small matrix) slides over the input image, computing element-wise multiplications and summing them up to form a feature map.

• Kernels detect features like edges, lines, and patterns.
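
Below is a tiny worked example of that sliding-window computation using F.conv2d; the 3x3 image and 2x2 kernel values are made up purely for illustration:

import torch
import torch.nn.functional as F

image = torch.tensor([[1., 2., 0.],
                      [0., 1., 3.],
                      [2., 1., 0.]]).reshape(1, 1, 3, 3)  # [batch, channels, H, W]
kernel = torch.tensor([[1., 0.],
                       [0., 1.]]).reshape(1, 1, 2, 2)

feature_map = F.conv2d(image, kernel)  # slide the kernel, multiply element-wise, sum
print(feature_map.squeeze())
# tensor([[2., 5.],
#         [1., 1.]])
# e.g. top-left value: 1*1 + 2*0 + 0*0 + 1*1 = 2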

---

4. Basic CNN Architecture Example

| Layer Type | Description |
| --------------- | ---------------------------------- |
| Input | Image of size (e.g., 28x28x1) |
| Conv Layer | 32 filters of size 3x3 |
| Activation | ReLU |
| Pooling Layer | MaxPooling 2x2 |
| Fully Connected | Flatten + Dense for classification |

---

5. Simple CNN with PyTorch Example

import torch.nn as nn
import torch.nn.functional as F

class SimpleCNN(nn.Module):
    def __init__(self):
        super(SimpleCNN, self).__init__()
        self.conv1 = nn.Conv2d(1, 32, kernel_size=3)  # 1 input channel, 32 filters
        self.pool = nn.MaxPool2d(2, 2)
        self.fc1 = nn.Linear(32 * 13 * 13, 10)  # assuming input 28x28

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))
        x = x.view(-1, 32 * 13 * 13)  # flatten
        x = self.fc1(x)
        return x
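
A quick sanity check of the shapes assumed above, using a dummy MNIST-sized batch:

import torch

model = SimpleCNN()
dummy = torch.randn(4, 1, 28, 28)  # [batch, channels, height, width]
print(model(dummy).shape)  # torch.Size([4, 10]), one score per class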


---

6. Why CNN over Fully Connected Networks?

• CNNs reduce the number of parameters by weight sharing in kernels.

• They preserve spatial relationships unlike fully connected layers.
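
A small comparison makes the weight-sharing point concrete; the layer sizes here are arbitrary:

import torch.nn as nn

fc = nn.Linear(28 * 28, 100)             # fully connected: every pixel gets its own weights
conv = nn.Conv2d(1, 100, kernel_size=3)  # 100 shared 3x3 kernels slide over the image

count = lambda m: sum(p.numel() for p in m.parameters())
print(count(fc))    # 78500 parameters (784*100 weights + 100 biases)
print(count(conv))  # 1000 parameters (100*1*3*3 weights + 100 biases)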

---

Summary

• CNNs are powerful for image and video tasks due to convolution and pooling.

• Understanding convolution, pooling, and architecture basics is key to building models.

---

Exercise

• Implement a CNN with two convolutional layers and train it on MNIST digits.

---

#CNN #DeepLearning #NeuralNetworks #Convolution #MachineLearning

https://yangx.top/DataScience4
Topic: CNN (Convolutional Neural Networks) – Part 3: Flattening, Fully Connected Layers, and Final Output

---

1. Flattening the Feature Maps

• After convolution and pooling layers, the resulting feature maps are multi-dimensional tensors.

• Flattening transforms these 3D tensors into 1D vectors to be passed into fully connected (dense) layers.

Example:

x = x.view(x.size(0), -1)


This reshapes the tensor from shape [batch_size, channels, height, width] to [batch_size, features].
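
A quick shape check of that reshape in isolation (the tensor sizes are arbitrary); nn.Flatten performs the same operation as a module inside a model:

import torch
import torch.nn as nn

x = torch.randn(8, 64, 7, 7)   # [batch_size, channels, height, width]
flat = x.view(x.size(0), -1)   # -> [8, 3136], since 3136 = 64 * 7 * 7
same = nn.Flatten()(x)         # nn.Flatten flattens all dims after the batch dim
print(flat.shape, same.shape)  # torch.Size([8, 3136]) torch.Size([8, 3136])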

---

2. Fully Connected (Dense) Layers

• These layers are used to perform classification based on the extracted features.

• Each neuron is connected to every neuron in the previous layer.

• They are placed after convolutional and pooling layers.

---

3. Output Layer

• The final layer is typically a fully connected layer with output neurons equal to the number of classes.

• Apply a softmax activation for multi-class classification (e.g., 10 classes for digits 0–9).
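
In PyTorch the output layer is usually paired with nn.CrossEntropyLoss, which applies log-softmax internally, so the network returns raw logits during training and an explicit softmax is only needed when you want probabilities. A minimal sketch:

import torch
import torch.nn as nn

logits = torch.randn(4, 10)                    # raw scores from the final fully connected layer
targets = torch.tensor([3, 7, 0, 9])           # ground-truth class indices

loss = nn.CrossEntropyLoss()(logits, targets)  # expects raw logits, not probabilities
probs = logits.softmax(dim=1)                  # explicit softmax only when probabilities are needed
print(loss.item(), probs.sum(dim=1))           # probabilities per sample sum to 1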

---

4. Complete CNN Example (PyTorch)

import torch.nn as nn
import torch.nn.functional as F

class FullCNN(nn.Module):
    def __init__(self):
        super(FullCNN, self).__init__()
        self.conv1 = nn.Conv2d(1, 32, 3, padding=1)
        self.pool = nn.MaxPool2d(2, 2)
        self.conv2 = nn.Conv2d(32, 64, 3, padding=1)
        self.fc1 = nn.Linear(64 * 7 * 7, 128)  # assumes input 28x28
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))  # 28x28 -> 14x14
        x = self.pool(F.relu(self.conv2(x)))  # 14x14 -> 7x7
        x = x.view(-1, 64 * 7 * 7)  # flatten
        x = F.relu(self.fc1(x))
        x = self.fc2(x)  # output layer
        return x


---

5. Why Fully Connected Layers Are Important

• They combine all learned spatial features into a single feature vector for classification.

• They introduce the final decision boundary between classes.

---

Summary

• Flattening bridges the convolutional part of the network to the fully connected part.

• Fully connected layers transform features into class scores.

• The output layer applies classification logic like softmax or sigmoid depending on the task.

---

Exercise

• Modify the CNN above to classify CIFAR-10 images (3 channels, 32x32) and calculate the total number of parameters in each layer.

---

#CNN #NeuralNetworks #Flattening #FullyConnected #DeepLearning

https://yangx.top/DataScienceM