Data Science by ODS.ai 🦜
First Telegram Data Science channel, covering all technical and popular stuff about anything related to Data Science: AI, Big Data, Machine Learning, Statistics, general Math and the applications of the former. To reach the editors, contact @malev.
Inverse Compositional Spatial Transformer Networks

In this paper, we establish a theoretical connection between the classical Lucas & Kanade (LK) algorithm and the emerging topic of Spatial Transformer Networks (STNs). STNs are of interest to the vision and learning communities due to their natural ability to combine alignment and classification within the same theoretical framework. Inspired by the Inverse Compositional (IC) variant of the LK algorithm, we present Inverse Compositional Spatial Transformer Networks (IC-STNs). We demonstrate that IC-STNs can achieve better performance than conventional STNs with less model capacity; in particular, we show superior performance in pure image alignment tasks as well as joint alignment/classification problems on real-world problems.

https://arxiv.org/abs/1612.03897
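As a rough illustration of the building block IC-STNs extend (this is not the paper's implementation, just a minimal numpy sketch): a spatial transformer warps an input by generating a sampling grid from predicted affine parameters and sampling the image bilinearly at those locations.

```python
import numpy as np

def affine_grid(theta, H, W):
    """Build a sampling grid from a 2x3 affine matrix, in normalized [-1, 1] coords."""
    ys, xs = np.meshgrid(np.linspace(-1, 1, H), np.linspace(-1, 1, W), indexing="ij")
    coords = np.stack([xs.ravel(), ys.ravel(), np.ones(H * W)])  # shape (3, H*W)
    return (theta @ coords).reshape(2, H, W)                     # (x, y) maps

def bilinear_sample(img, grid):
    """Sample img at grid locations with bilinear interpolation, zero outside."""
    H, W = img.shape
    # map normalized coords back to pixel coords
    x = (grid[0] + 1) * (W - 1) / 2
    y = (grid[1] + 1) * (H - 1) / 2
    x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
    out = np.zeros_like(img, dtype=float)
    for dy in (0, 1):
        for dx in (0, 1):
            xi, yi = x0 + dx, y0 + dy
            w = (1 - np.abs(x - xi)) * (1 - np.abs(y - yi))  # bilinear weights
            valid = (xi >= 0) & (xi < W) & (yi >= 0) & (yi < H)
            out[valid] += w[valid] * img[np.clip(yi, 0, H - 1), np.clip(xi, 0, W - 1)][valid]
    return out

img = np.arange(16, dtype=float).reshape(4, 4)
identity = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
warped = bilinear_sample(img, affine_grid(identity, 4, 4))
```

In a real STN, `theta` is predicted by a small localization network and the whole pipeline is differentiable; the IC variant instead predicts incremental updates to the warp, in the spirit of the inverse compositional LK algorithm.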

#arxiv #dl #cv
Three Models for Anomaly Detection: Pros and Cons.

A nice introduction to anomaly detection.

https://blogs.technet.microsoft.com/uktechnet/2016/12/13/three-models-for-anomaly-detection-pros-and-cons/
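Independent of the specific models the post compares, the simplest baseline worth knowing is a Gaussian z-score detector: flag any point more than a few standard deviations from the mean. A minimal numpy sketch (the threshold of 3.0 is an illustrative choice, not a recommendation from the post):

```python
import numpy as np

def zscore_anomalies(series, threshold=3.0):
    """Return indices of points farther than `threshold` std devs from the mean."""
    x = np.asarray(series, dtype=float)
    z = (x - x.mean()) / x.std()
    return np.flatnonzero(np.abs(z) > threshold)

# 20 normal readings and one spike: only the spike should be flagged
readings = [10.0] * 20 + [100.0]
anomalies = zscore_anomalies(readings)  # -> array([20])
```

This assumes roughly Gaussian, stationary data; for seasonal or drifting series you would detect anomalies on residuals after removing the trend instead.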
Where to start with Data Science

There is no way to be taught to be a data scientist, but you can learn how to become one yourself. There is no single right way, but there is a path adopted by a number of data scientists, and it goes through online courses (MOOCs). Following the suggested order is not required, but it might be helpful.

Best resources to study Data Science /Machine Learning

1. Andrew Ng’s Machine Learning (https://www.coursera.org/learn/machine-learning).
2. Geoffrey Hinton’s Neural Networks for Machine Learning (https://www.coursera.org/learn/neural-networks).
3. Probabilistic Graphical Models specialisation on Coursera from Stanford (https://www.coursera.org/specializations/probabilistic-graphical-models).
4. Learning from data by Caltech (https://work.caltech.edu/telecourse.html).
5. CS229 from Stanford by Andrew Ng (http://cs229.stanford.edu/materials.html)
6. CS224d: Deep Learning for Natural Language Processing from Stanford (http://cs224d.stanford.edu/syllabus.html).
7. CS231n: Convolutional Neural Networks for Visual Recognition from Stanford (http://cs231n.stanford.edu/syllabus.html).
8. Deep Learning Book by Ian Goodfellow and Yoshua Bengio and Aaron Courville (http://www.deeplearningbook.org/).
9. Machine Learning Yearning by Andrew Ng (http://www.mlyearning.org/).

#books #wheretostart #mooc
There is a new $1MM competition on Kaggle to use ML / AI to diagnose lung cancer from CT scans.

Not only is it a great breakthrough for Kaggle (it is the first competition with such a huge prize fund), it is also a breakthrough for science, since top researchers and engineers from around the world will compete to crowdsource and ease lung cancer diagnostics.

The competition is available at: https://www.kaggle.com/c/data-science-bowl-2017

#kaggle #segmentation #deeplearning #cv
New release in PyTorch: «GPU Tensors, Dynamic Neural Networks and deep Python integration. Hello world!»

http://pytorch.org
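To give a flavour of what "dynamic neural networks" means in practice, here is a minimal autograd example. (It is written against the current `torch.tensor` API rather than the `Variable` API of the original release.)

```python
import torch

# The graph is built on the fly as operations run: y = sum(x^2), so dy/dx = 2x
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()
y.backward()  # gradients flow back through the dynamically built graph
print(x.grad)  # tensor([2., 4., 6.])
```

Because the graph is rebuilt on every forward pass, ordinary Python control flow (loops, conditionals) can change the network's structure from one input to the next.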
Today Kaggle announced the launch of Two Sigma's new recruiting competition. In this competition, participants are invited to explore detailed NYC rental listing data from Two Sigma's competition co-sponsor, RentHop, to ease the often hectic process of finding the perfect home.

#kaggle
If you have any news worth spreading, please message @opendatasciencebot with a link and a quick description.