Understanding Deep Learning on Controlled Noisy Labels
https://ai.googleblog.com/2020/08/understanding-deep-learning-on.html
Code: https://github.com/google-research/google-research/tree/master/mentormix
Dataset: https://google.github.io/controlled-noisy-web-labels/index.html
@ai_machinelearning_big_data
📗 Foreword from the 'Deep Learning for Coders' Book
Post: https://www.fast.ai/2020/08/20/soumith-forward/
Free book in Jupyter notebooks: https://github.com/fastai/fastbook/blob/master/01_intro.ipynb
Github: https://github.com/fastai/fastbook
@ai_machinelearning_big_data
Facebook research at ECCV 2020
Facebook researchers and engineers specializing in computer vision, AR/VR, artificial intelligence, infrastructure, and more will be presenting their research at ECCV 2020.
https://ai.facebook.com/blog/facebook-research-at-eccv-2020/
@ai_machinelearning_big_data
Introducing Semantic Reactor: Explore NLP in Google Sheets
The Semantic Reactor is a new plugin for Google Sheets that lets you run natural language understanding (NLU) models on your own data, right from a spreadsheet.
https://blog.tensorflow.org/2020/08/introducing-semantic-reactor-explore-nlp-sheets.html
Code sample: https://github.com/google/making_with_ml/blob/master/semantic_ml/use_sample.js
@ai_machinelearning_big_data
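Under the hood the plugin ranks sentences with models like the Universal Sentence Encoder. A rough Python sketch of the same ranking idea, using TF Hub directly rather than the Sheets plugin or the JS sample above (the model URL is real, the sentences are illustrative):

import numpy as np
import tensorflow_hub as hub

# Load the Universal Sentence Encoder (one of the model families the plugin exposes).
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

query = "how do I reset my password"
candidates = [
    "Click 'Forgot password' on the login page.",
    "Our office is open 9-5 on weekdays.",
    "You can change your plan in account settings.",
]

# Embed everything and rank candidates by dot-product similarity
# (USE vectors are approximately unit-normalized, so this behaves like cosine similarity).
vectors = embed([query] + candidates).numpy()
scores = vectors[1:] @ vectors[0]
print(candidates[int(np.argmax(scores))])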
Building a Neural Network to Predict Loan Risk
or, Ty Goes Into Far Too Much Detail About Cleaning Data
https://tymick.me/blog/loan-risk-neural-network
Github: https://github.com/tywmick/loan-risk-neural-network
@ai_machinelearning_big_data
PyTorch framework for cryptographically secure random number generation, torchcsprng, now available
One of the key components of modern cryptography is the pseudorandom number generator. Katz and Lindell stated, “The use of badly designed or inappropriate random number generators can often leave a good cryptosystem vulnerable to attack. Particular care…
https://pytorch.org/blog/torchcsprng-release-blog/
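A minimal usage sketch, assuming the generator constructor shown in the repo README (API recalled from memory, so double-check the README):

import torch
import torchcsprng as csprng

# Cryptographically secure generator backed by the OS entropy source.
secure_gen = csprng.create_random_device_generator("/dev/urandom")

# Draw random tensors from the secure generator instead of PyTorch's default PRNG.
key = torch.randint(0, 256, (16,), dtype=torch.uint8, generator=secure_gen)
noise = torch.randn(4, 4, generator=secure_gen)
print(key, noise)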
Introducing TF-Coder, a tool that writes tricky TensorFlow expressions for you
TF-Coder is a program synthesis tool that helps you write TensorFlow code. Instead of coding a tricky tensor manipulation directly, you can just demonstrate it through an illustrative example, and TF-Coder provides the corresponding code automatically.
https://blog.tensorflow.org/2020/08/introducing-tensorflow-coder-tool.html
Paper: https://arxiv.org/abs/2003.09040
Code: https://github.com/google-research/tensorflow-coder
Colab: https://colab.research.google.com/github/google-research/tensorflow-coder/blob/master/TF-Coder_Colab.ipynb
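In the Colab you describe the task with an input-output example; a hypothetical spec in that style (variable names and values here are illustrative, not the notebook's exact cells):

# You give TF-Coder a concrete example of the transformation you want...
inputs = {"labels": [3, 1, 2]}
output = [[0.0, 0.0, 0.0, 1.0],
          [0.0, 1.0, 0.0, 0.0],
          [0.0, 0.0, 1.0, 0.0]]
constants = [4]

# ...and it searches for a TensorFlow expression that reproduces it,
# in this case something equivalent to: tf.one_hot(labels, 4)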
👄 Wav2Lip: Accurately Lip-syncing Videos In The Wild
Lip-sync videos to any target speech with high accuracy. Try our interactive demo.
Github: https://github.com/Rudrabha/Wav2Lip
Paper: https://arxiv.org/abs/2008.10010
Interactive Demo: https://bhaasha.iiit.ac.in/lipsync/
Colab: https://colab.research.google.com/drive/1tZpDWXz49W6wDcTprANRGLo2D_EbD5J8?usp=sharing
@ai_machinelearning_big_data
Microsoft’s DoWhy is a Cool Framework for Causal Inference
Inspired by Judea Pearl’s do-calculus for causal inference, the open source framework provides a programmatic interface for popular causal inference methods.
https://www.kdnuggets.com/2020/08/microsoft-dowhy-framework-causal-inference.html
Github: https://github.com/microsoft/dowhy
@ai_machinelearning_big_data
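A short sketch of DoWhy's model-identify-estimate-refute workflow (the dataframe and column names are placeholders for your own data):

import pandas as pd
from dowhy import CausalModel

df = pd.read_csv("my_data.csv")  # placeholder dataset

# 1. Model the problem as a causal graph.
model = CausalModel(
    data=df,
    treatment="treatment",
    outcome="outcome",
    common_causes=["age", "income"],
)

# 2. Identify the causal effect implied by the graph.
estimand = model.identify_effect()

# 3. Estimate it with a concrete method.
estimate = model.estimate_effect(estimand, method_name="backdoor.propensity_score_matching")
print(estimate.value)

# 4. Refute the estimate with sensitivity checks.
print(model.refute_estimate(estimand, estimate, method_name="random_common_cause"))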
The Hessian Penalty — Official Implementation
It efficiently optimizes the Hessian of your neural network to be diagonal in an input, leading to disentanglement in that input.
https://www.wpeebles.com/hessian-penalty
Github: https://github.com/wpeebles/hessian_penalty
Paper: https://arxiv.org/abs/2008.10599
Video: https://www.youtube.com/watch?v=uZyIcTkSSXA&feature=youtu.be
@ai_machinelearning_big_data
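A rough PyTorch sketch of the estimator described in the paper (not the repo's own implementation): v^T H v is approximated with second-order finite differences along random Rademacher directions, and the penalty is the variance across those directions. The k and epsilon values and the toy generator are illustrative:

import torch

def hessian_penalty(G, z, k=2, epsilon=0.1):
    # k random Rademacher (+/-1) directions with the same shape as z.
    vs = torch.randint(0, 2, (k, *z.shape), device=z.device).to(z.dtype) * 2 - 1
    center = G(z)
    # Second-order central finite differences approximate v^T H v for each direction.
    diffs = torch.stack([
        (G(z + epsilon * v) - 2 * center + G(z - epsilon * v)) / epsilon ** 2
        for v in vs
    ])
    # Penalizing the variance across directions pushes off-diagonal Hessian entries toward zero.
    return diffs.var(dim=0).max()

# Toy "generator" and latent code, just to show the call.
W = torch.randn(8, 4)
G = lambda z: torch.tanh(z @ W)
z = torch.randn(8, requires_grad=True)
loss = hessian_penalty(G, z)
loss.backward()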
Introducing Opacus: A high-speed library for training PyTorch models with differential privacy
We are releasing Opacus, a new high-speed library for training PyTorch models with differential privacy (DP) that’s more scalable than existing state-of-the-art methods.
https://ai.facebook.com/blog/introducing-opacus-a-high-speed-library-for-training-pytorch-models-with-differential-privacy/
Github: https://github.com/pytorch/opacus
Differential Privacy Series Part 1 | DP-SGD Algorithm Explained: https://medium.com/pytorch/differential-privacy-series-part-1-dp-sgd-algorithm-explained-12512c3959a3
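A minimal training-setup sketch. It assumes the make_private interface of more recent Opacus releases (the version announced here used an attach-based PrivacyEngine, so check the docs for your version); the model and data are placeholders:

import torch
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

# Placeholder model and data.
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
loader = DataLoader(TensorDataset(torch.randn(256, 10), torch.randint(0, 2, (256,))), batch_size=32)

# Wrap model, optimizer and loader so per-sample clipping and DP noise are applied.
privacy_engine = PrivacyEngine()
model, optimizer, loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=loader,
    noise_multiplier=1.1,
    max_grad_norm=1.0,
)

criterion = torch.nn.CrossEntropyLoss()
for x, y in loader:
    optimizer.zero_grad()
    criterion(model(x), y).backward()
    optimizer.step()

print("epsilon spent:", privacy_engine.get_epsilon(delta=1e-5))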
Top2Vec
Top2Vec is an algorithm for topic modeling and semantic search. It automatically detects topics present in text and generates jointly embedded topic, document and word vectors.
Github: https://github.com/ddangelov/Top2Vec
Paper: https://arxiv.org/abs/2008.09470v1
Doc2vec: https://radimrehurek.com/gensim/models/doc2vec.html
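A quick usage sketch following the README; the 20 newsgroups corpus and the keyword are placeholders for your own documents:

from sklearn.datasets import fetch_20newsgroups
from top2vec import Top2Vec

# Any list of document strings works; 20 newsgroups is just convenient sample text.
docs = fetch_20newsgroups(subset="train", remove=("headers", "footers", "quotes")).data

model = Top2Vec(docs, speed="learn", workers=4)
print(model.get_num_topics())

# Topics semantically closest to a keyword.
topic_words, word_scores, topic_scores, topic_nums = model.search_topics(keywords=["space"], num_topics=3)
print(topic_words[0][:10])

# Documents most representative of the best-matching topic.
documents, document_scores, document_ids = model.search_documents_by_topic(topic_num=topic_nums[0], num_docs=2)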
Awesome-domain-adaptation
This repo is a collection of AWESOME things about domain adaptation, including papers, code, etc. Feel free to star and fork.
Github: https://github.com/zhaoxin94/awesome-domain-adaptation
Paper: https://arxiv.org/abs/2009.00155v1
Auto-Sklearn for Automated Machine Learning in Python
https://machinelearningmastery.com/auto-sklearn-for-automated-machine-learning-in-python/
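The core of the tutorial is a drop-in scikit-learn-style estimator that searches models and preprocessing automatically; a minimal sketch with illustrative time limits:

from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
import autosklearn.classification

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Search model + preprocessing pipelines for 2 minutes, 30 seconds per candidate.
automl = autosklearn.classification.AutoSklearnClassifier(
    time_left_for_this_task=120,
    per_run_time_limit=30,
)
automl.fit(X_train, y_train)

print(accuracy_score(y_test, automl.predict(X_test)))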
The Little W-Net that Could
State-of-the-Art Retinal Vessel Segmentation with Minimalistic Models.
Github: https://github.com/agaldran/lwnet
Paper: https://arxiv.org/abs/2009.01907v1
KILT: a Benchmark for Knowledge Intensive Language Tasks
All tasks in KILT are grounded in the same snapshot of Wikipedia, reducing engineering turnaround through the re-use of components, as well as accelerating research into task-agnostic memory architectures.
Github: https://github.com/facebookresearch/KILT
Paper: https://arxiv.org/abs/2009.02252
@ai_machinelearning_big_data
🧙♂️ How to Create a Cartoonizer with TensorFlow Lite
This is an end-to-end tutorial on how to convert a TF 1.x model to TensorFlow Lite (TFLite) and deploy it to an Android app. We use Android Studio’s ML Model Binding to import the model for cartoonizing an image captured with CameraX.
https://blog.tensorflow.org/2020/09/how-to-create-cartoonizer-with-tf-lite.html
Code: https://github.com/margaretmz/cartoonizer-with-tflite
E2E TFLite Tutorials: https://github.com/ml-gde/e2e-tflite-tutorials
@ai_machinelearning_big_data
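The conversion step itself boils down to the standard TFLite converter API; a sketch assuming a generic SavedModel path rather than the tutorial's specific CartoonGAN checkpoints:

import tensorflow as tf

# Convert a SavedModel to TFLite; the tutorial additionally quantizes the
# cartoonizer model to shrink it for mobile deployment.
converter = tf.lite.TFLiteConverter.from_saved_model("path/to/saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # dynamic-range quantization
tflite_model = converter.convert()

with open("cartoonizer.tflite", "wb") as f:
    f.write(tflite_model)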
HyperOpt for Automated Machine Learning With Scikit-Learn
Automated Machine Learning (AutoML) refers to techniques for automatically discovering well-performing models for predictive modeling tasks with very little user involvement. HyperOpt is an open-source library for large scale AutoML, and HyperOpt-Sklearn is a wrapper that applies HyperOpt’s search to scikit-learn models and preprocessing.
https://machinelearningmastery.com/hyperopt-for-automated-machine-learning-with-scikit-learn/
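A minimal HyperOpt-Sklearn sketch in the spirit of the tutorial (dataset and search budget are illustrative; the helper names follow the hpsklearn API of that time):

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from hpsklearn import HyperoptEstimator, any_classifier, any_preprocessing
from hyperopt import tpe

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Search over classifiers and preprocessing steps with the TPE algorithm.
estim = HyperoptEstimator(
    classifier=any_classifier("clf"),
    preprocessing=any_preprocessing("pre"),
    algo=tpe.suggest,
    max_evals=25,
    trial_timeout=30,
)
estim.fit(X_train, y_train)

print(estim.score(X_test, y_test))
print(estim.best_model())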
TorchKGE: Knowledge Graph Embedding in Python and PyTorch
https://torchkge.readthedocs.io/en/latest/
Github: https://github.com/torchkge-team/torchkge
Paper: https://arxiv.org/abs/2009.02963v1
@ai_machinelearning_big_data