Progressive Learning and Network Growing in TensorFlow
https://towardsdatascience.com/progressive-learning-and-network-growing-in-tensorflow-e41414f304d2?source=collection_home---4------1---------------------
Towards Data Science
In many real-world applications, new training data becomes available after a network has already been trained. Especially with big neural…
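The article's own code isn't reproduced here, but one common way to grow a trained Keras network is to widen a hidden layer and transplant the already-trained weights into the larger copy. A minimal sketch, where the architecture, layer names, and sizes are illustrative assumptions rather than the article's exact setup:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Pretend `old` has already been trained on the original data.
old = keras.Sequential([
    layers.Dense(16, activation="relu", input_shape=(8,), name="hidden"),
    layers.Dense(1, name="out"),
])

# A wider copy: 32 hidden units instead of 16.
new = keras.Sequential([
    layers.Dense(32, activation="relu", input_shape=(8,), name="hidden"),
    layers.Dense(1, name="out"),
])

# Transplant trained weights into the first 16 units of the wider layer.
w, b = old.get_layer("hidden").get_weights()
W, B = new.get_layer("hidden").get_weights()
W[:, :16], B[:16] = w, b
new.get_layer("hidden").set_weights([W, B])

# Old units keep their trained output weights; zeroing the new units'
# rows preserves the old model's function exactly until fine-tuning starts.
wo, bo = old.get_layer("out").get_weights()
Wo, Bo = new.get_layer("out").get_weights()
Wo[:16, :] = wo
Wo[16:, :] = 0.0
new.get_layer("out").set_weights([Wo, bo])

new.compile(optimizer="adam", loss="mse")
# new.fit(X_new, y_new, ...)  # continue training on the freshly arrived data
```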
🎥 AI Learning Morphology and Movement...at the Same Time!
The paper "Reinforcement Learning for Improving Agent Design" is available here:
https://designrl.github.io/
https://arxiv.org/abs/1810.03779
Our job posting for a PostDoc:
https://www.cg.tuwien.ac.at/jobs/3dspatialization/
Pick up cool perks on our Patreon page:
https://www.patreon.com/TwoMinutePapers
Stemming? Lemmatization? What?
https://towardsdatascience.com/stemming-lemmatization-what-ba782b7c0bd8?source=collection_home---4------3---------------------
Towards Data Science
A high-level look at what stemming and lemmatization do for natural language processing tasks, and how they do it.
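For a quick feel of the difference, here is a small snippet using NLTK (one common toolkit, not necessarily the library the article uses): a stemmer chops suffixes by rule, while a lemmatizer maps words to dictionary forms.

```python
import nltk
nltk.download("wordnet", quiet=True)  # dictionary data the lemmatizer needs
from nltk.stem import PorterStemmer, WordNetLemmatizer

stemmer = PorterStemmer()
lemmatizer = WordNetLemmatizer()

print(stemmer.stem("studies"))                  # 'studi'  -> rule-based suffix stripping
print(lemmatizer.lemmatize("studies"))          # 'study'  -> a real dictionary form
print(stemmer.stem("better"))                   # 'better' -> no suffix rule fires
print(lemmatizer.lemmatize("better", pos="a"))  # 'good'   -> uses morphological knowledge
```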
🎥 TF Machine Learning for Programmers (TensorFlow @ O’Reilly AI Conference, San Francisco '18)
In this talk, Laurence Moroney from Google talked about Machine Learning, AI, Deep Learning and more, and how they fit into the programmer's toolkit. He introduced what it's all about, cutting through the hype, to show the opportunities available in Machine Learning. He also introduced TensorFlow, a framework designed to make Machine Learning easy and accessible, and showed how intelligent apps that use ML can run in a variety of places, including mobile, web, and IoT.
Facebook has created and open-sourced Nevergrad, a Python3 library that makes it easier to perform gradient-free optimizations.
Link: https://code.fb.com/ai-research/nevergrad/
Github: https://github.com/facebookresearch/nevergrad
Engineering at Meta
Nevergrad: An open source tool for derivative-free optimization
We are open-sourcing Nevergrad, a Python3 library that makes it easier to perform gradient-free optimizations used in many machine learning tasks.
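Minimal usage looks roughly like this (based on the project's README; the exact API has shifted across versions): define a black-box objective, pick an optimizer, and ask for a recommendation, no gradients required.

```python
import nevergrad as ng

def square(x):
    # Any black-box objective works; no gradient information is used.
    return sum((x - 0.5) ** 2)

# OnePlusOne is one of many bundled derivative-free algorithms.
optimizer = ng.optimizers.OnePlusOne(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)  # should be close to [0.5, 0.5]
```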
MIT Deep Learning
MIT Deep Learning and Artificial Intelligence Lectures | Lex Fridman
A collection of lectures on deep learning, deep reinforcement learning, autonomous vehicles, and artificial intelligence organized by Lex Fridman.
Deep Learning for Recommender Systems | Alexandros Karatzoglou
YouTube
Deep neural networks are being used in a number of complex machine learning tasks such as computer vision, natural language processing and speech recognition with immense success. Deep Learning has been hailed as the “next big thing” in recommender systems…
Real-Time Noise Suppression Using Deep Learning
https://towardsdatascience.com/real-time-noise-suppression-using-deep-learning-38719819e051?source=collection_home---4------1---------------------
Towards Data Science
How to build deep-learning-based, real-time, low-latency noise suppression systems.
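The article's models aren't reproduced here, but a common deep-learning formulation is to predict a per-frequency gain mask over the noisy spectrogram and apply it before resynthesis. In this sketch the tiny MLP and all shapes are placeholder assumptions for illustration:

```python
import torch
import torch.nn as nn

n_fft, hop = 512, 128
freq_bins = n_fft // 2 + 1
window = torch.hann_window(n_fft)

# Placeholder mask estimator: one noisy magnitude frame in, one gain mask out.
mask_net = nn.Sequential(
    nn.Linear(freq_bins, 256), nn.ReLU(),
    nn.Linear(256, freq_bins), nn.Sigmoid(),  # gains in [0, 1]
)

noisy = torch.randn(1, 16000)  # stand-in for one second of 16 kHz audio
spec = torch.stft(noisy, n_fft, hop, window=window, return_complex=True)
mag = spec.abs().transpose(1, 2)      # (batch, frames, freq_bins)
mask = mask_net(mag).transpose(1, 2)  # predicted per-bin suppression gains
denoised = torch.istft(spec * mask, n_fft, hop, window=window)
```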
https://habr.com/post/433982/
Artificial intelligence thinks like a group of people, which is cause for concern
#machinelearning #neuralnets #deeplearning #машинноеобучение
Our Telegram channel: https://yangx.top/ai_machinelearning_big_data
Habr
Artificial intelligence was created for organizational decision-making and public administration; it needs human ethics, says Jonnie Penn of…
How to Learn Data Science: Staying Motivated.
https://towardsdatascience.com/how-to-learn-data-science-staying-motivated-8665ed649687?source=collection_home---4------0---------------------
Towards Data Science
Advice on how to be more consistent in your educational journey.
Deep Learning Enables at Least 100-Fold Dose Reduction for PET Imaging
YouTube
Presentation of research from Stanford University
How to Reduce the Variance of Deep Learning Models in Keras Using Model Averaging Ensembles
https://machinelearningmastery.com/model-averaging-ensemble-for-deep-learning-neural-networks/
MachineLearningMastery.com
How to Develop an Ensemble of Deep Learning Models in Keras - MachineLearningMastery.com
Deep learning neural network models are highly flexible nonlinear algorithms capable of learning a near infinite number of mapping functions. A frustration with this flexibility is the high variance in a final model. The same neural network model trained on the same dataset may find one of many different possible “good enough” solutions each time …
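In sketch form, the technique amounts to training the same architecture several times and averaging the predicted probabilities. This is a minimal illustration, not the post's full tutorial; the layer sizes and the three-class setup are placeholder assumptions:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def fit_member(X, y):
    # Identical architecture each run; the ensemble's variance reduction
    # comes from random initialization and data shuffling.
    model = keras.Sequential([
        layers.Dense(32, activation="relu", input_shape=(X.shape[1],)),
        layers.Dense(3, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    model.fit(X, y, epochs=100, verbose=0)
    return model

def ensemble_predict(members, X):
    # Average class probabilities across members, then take the argmax.
    probs = np.mean([m.predict(X, verbose=0) for m in members], axis=0)
    return np.argmax(probs, axis=1)

# members = [fit_member(X_train, y_train) for _ in range(10)]
# y_pred = ensemble_predict(members, X_test)
```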
🎥 Lecture #7a: Boosting and Ensembles; Multi-class Classification and Ranking (10/29/2018)
Lecture #7a: Boosting and Ensembles; Multi-class Classification and Ranking
CIS 419 2018C Applied Machine Learning on 10/29/2018 Mon
Organic computers may be our future: https://phys.org/news/2018-12-amoeba-approximate-solutions-np-hard-problem.html
phys.org
Amoeba finds approximate solutions to NP-hard problem in linear time
Researchers have demonstrated that an amoeba—a single-celled organism consisting mostly of gelatinous protoplasm—has unique computing abilities that may one day offer a competitive alternative to ...
Data Scientist — Is it the sexiest job of the 21st Century?
https://towardsdatascience.com/data-scientist-is-it-the-sexiest-job-of-the-21st-century-35a5bf409363?source=collection_home---4------2---------------------
Towards Data Science
Food for thought on the direction we are taking
Lecture #8b: Neural Networks and Deep Learning (11/07/2018)
YouTube
Lecture #8b: Neural Networks and Deep Learning
CIS 419 2018C Applied Machine Learning on 11/07/2018 Wed
Reproducing high-quality singing voice with state-of-the-art AI technology.
An advance in singing voice synthesis. This opens a path toward more interesting collaborations and synthetic-celebrity projects.
P.S. Hatsune Miku will remain popular for her particular qualities, but now there is more room for competitors.
Link: https://www.techno-speech.com/news-20181214a-en
#SOTA #Voice #Synthesis
Techno-Speech, Inc.
Techno-Speech, Inc. and Nagoya Institute of Technology Speech and Language Processing Laboratory recently developed a singing voice synthesis technology that can reproduce human voice quality, unique characteristics, and singing style more precisely than…
The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)
The year 2018 has been an inflection point for machine learning models handling text (or more accurately, Natural Language Processing or NLP for short). Our conceptual understanding of how best to represent words and sentences in a way that best captures underlying meanings and relationships is rapidly evolving. Moreover, the NLP community has been putting forward incredibly powerful components that you can freely download and use in your own models and pipelines (It’s been referred to as NLP’s ImageNet moment, referencing how years ago similar developments accelerated the development of machine learning in Computer Vision tasks).
jalammar.github.io
Discussions:
Hacker News (98 points, 19 comments), Reddit r/MachineLearning (164 points, 20 comments)
Translations: Chinese (Simplified), French 1, French 2, Japanese, Korean, Persian, Russian, Spanish
2021 Update: I created this brief and highly accessible…
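To make the "download and use" point concrete: a pretrained BERT can be pulled and queried for contextual token vectors in a few lines. The snippet below assumes the Hugging Face transformers library; the post itself doesn't prescribe a particular toolkit.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("NLP's ImageNet moment has arrived.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One 768-dimensional contextual vector per token; these can feed a
# downstream classifier in place of static word embeddings.
print(outputs.last_hidden_state.shape)  # torch.Size([1, num_tokens, 768])
```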