🎥 Machine Learning Fairness: Lessons Learned (Google I/O'19)
👁 1 view ⏳ 2183 sec.
ML fairness is a critical consideration in machine learning development. This session will present a few lessons Google has learned through our products and research and how developers can apply these learnings in their own efforts. Techniques and resources will be presented that enable evaluation and improvements to models, including open source datasets and tools such as TensorFlow Model Analysis. This session will enable developers to proactively think about fairness in product development.
An End-to-End AutoML Solution for Tabular Data at KaggleDays
http://ai.googleblog.com/2019/05/an-end-to-end-automl-solution-for.html
Posted by Yifeng Lu, Software Engineer, Google AI Machine learning (ML) for tabular data (e.g. spreadsheet data) is one of the most acti...
Googleblog
TensorFlow Graphics
The last few years have seen a rise in novel differentiable graphics layers which can be inserted in neural network architectures. From spatial transformers to differentiable graphics renderers, these new layers leverage the knowledge acquired over years of computer vision and graphics research to build new and more efficient network architectures. Explicitly modeling geometric priors and constraints into neural networks opens up the door to architectures that can be trained robustly, efficiently, and more importantly, in a self-supervised fashion.
https://github.com/tensorflow/graphics/
TensorFlow Graphics: Differentiable Graphics Layers for TensorFlow - tensorflow/graphics
GitHub
Our Telegram channel - tglink.me/ai_machinelearning_big_data
https://www.youtube.com/watch?v=ADYZmf7GvOw
🎥 Keras/TensorFlow 2.0, NLP with SQuAd, Spark SQL Expressions - Advanced Spark TensorFlow Meetup - SF
👁 1 view ⏳ 6533 sec.
Agenda
* Meetup Updates and Announcements - 4 Years and 230 Events!
* Intro Grammarly (Umayah Abdennabi, 5 mins)
* Meetup Updates and Announcements (Chris, 5 mins)
* Custom Functions in Spark SQL (30 mins)
Speaker: Umayah Abdennabi
Spark comes with a rich Expression library that can be extended to make custom expressions. We will look into custom expressions and why you would want to use them.
* TF 2.0 + Keras (30 mins)
Speaker: Francesco Mosconi
TensorFlow 2.0 was announced at the March TF Dev Summit, a…
🎥 56 - Machine Learning. Feature Selection over Several Iterations. Part 1
👁 1 view ⏳ 425 sec.
Lecturer: Артём Шевляков
https://stepik.org/8057
Building Gmail style smart compose with a char ngram language model
“OpenAI built a language model so good, it’s considered too dangerous to release” — Techcrunch
Towards Data Science
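The "smart compose" idea in the linked post can be sketched with a plain character-trigram model: count which character follows each two-character context, then greedily extend the user's prefix. This is an illustrative stand-alone sketch (toy corpus, greedy decoding), not the article's code:

```python
from collections import Counter, defaultdict

# Toy corpus; a real smart-compose model would train on a large mail archive.
corpus = "thanks for the update. thanks for the help. see you there."

N = 3  # trigrams: 2 characters of context predict 1 next character
counts = defaultdict(Counter)
for i in range(len(corpus) - N + 1):
    ctx, nxt = corpus[i : i + N - 1], corpus[i + N - 1]
    counts[ctx][nxt] += 1

def complete(prefix, max_len=20):
    """Greedily append the most likely next character, up to max_len chars."""
    out = prefix
    for _ in range(max_len):
        ctx = out[-(N - 1):]
        if ctx not in counts:
            break
        out += counts[ctx].most_common(1)[0][0]
    return out

print(complete("thanks f"))
```

Greedy decoding over raw counts is the simplest possible decoder; the article's point is that even such a small model captures enough local structure to finish common phrases.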
🎥 22. GANs and Super-Resolution: Sergey Ovcharenko (Yandex)
👁 23 views ⏳ 4588 sec.
Dear listeners!
In this lecture, Sergey Ovcharenko (head of the Neural Network Technologies group in the Computer Vision Service at Yandex) covers various generative adversarial network (GAN) architectures and their application to improving image and video quality (super-resolution). An example of this work is available here: https://yandex.ru/blog/company/oldfilms
The slides are available at: https://bit.ly/2YkVAX1
How Negative Sampling work on word2vec?
During neural network training, the network adjusts all neuron weights so that it learns to make predictions correctly. In NLP, we may…
Towards Data Science
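As a rough illustration of the article's topic: negative sampling replaces the full-softmax update over the whole vocabulary with updates against the true context word (label 1) plus a few sampled "negative" words (label 0). A pure-Python toy sketch; the vocabulary, dimensions, and learning rate are all made up:

```python
import math
import random

random.seed(0)
DIM, LR = 8, 0.1
vocab = ["king", "queen", "man", "woman", "apple"]
# Separate input ("center") and output ("context") embedding tables
W_in  = {w: [random.uniform(-0.5, 0.5) for _ in range(DIM)] for w in vocab}
W_out = {w: [random.uniform(-0.5, 0.5) for _ in range(DIM)] for w in vocab}

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_pair(center, context, num_neg=2):
    """One skip-gram step: update only the true context word and a few
    sampled negatives, instead of normalizing over the whole vocabulary."""
    v = W_in[center]
    negatives = random.sample([w for w in vocab if w != context], num_neg)
    for word, label in [(context, 1.0)] + [(n, 0.0) for n in negatives]:
        u = W_out[word]
        score = sigmoid(sum(vi * ui for vi, ui in zip(v, u)))
        g = LR * (label - score)  # gradient of the log-sigmoid loss
        for i in range(DIM):
            v[i], u[i] = v[i] + g * u[i], u[i] + g * v[i]

def score(a, b):
    return sigmoid(sum(x * y for x, y in zip(W_in[a], W_out[b])))

before = score("king", "queen")
for _ in range(200):
    train_pair("king", "queen")
after = score("king", "queen")
print(before, "->", after)  # the positive pair's score increases
```

Real implementations (e.g. word2vec, gensim) draw negatives from a frequency-weighted distribution rather than uniformly; uniform sampling here keeps the sketch short.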
🎥 Machine Learning Part 16: Naive Bayes Classifier In Python
👁 1 view ⏳ 802 sec.
In this video, we cover the naive Bayes classifier and walk through an example in Python.
CONNECT
Site: https://coryjmaklin.com/
Medium: https://medium.com/@corymaklin
GitHub: https://github.com/corymaklin
Twitter: https://twitter.com/CoryMaklin
Linkedin: https://www.linkedin.com/in/cory-maklin-a51732b7/
Facebook: https://www.facebook.com/cory.maklin
Patreon: https://www.patreon.com/corymaklin
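A minimal naive Bayes text classifier in the spirit of the video's walkthrough: log-priors from class counts, per-class word likelihoods with add-one (Laplace) smoothing. The training examples and class names below are invented for illustration:

```python
import math
from collections import Counter, defaultdict

# Hypothetical toy training data for a spam/ham classifier
docs = [
    ("win money now", "spam"),
    ("free money offer", "spam"),
    ("meeting at noon", "ham"),
    ("lunch at noon tomorrow", "ham"),
]

class NaiveBayes:
    def fit(self, docs):
        self.class_counts = Counter(label for _, label in docs)
        self.word_counts = defaultdict(Counter)
        self.vocab = set()
        for text, label in docs:
            for w in text.split():
                self.word_counts[label][w] += 1
                self.vocab.add(w)
        return self

    def predict(self, text):
        scores = {}
        total = sum(self.class_counts.values())
        for c, n_c in self.class_counts.items():
            # log prior + sum of log likelihoods with add-one smoothing
            s = math.log(n_c / total)
            denom = sum(self.word_counts[c].values()) + len(self.vocab)
            for w in text.split():
                s += math.log((self.word_counts[c][w] + 1) / denom)
            scores[c] = s
        return max(scores, key=scores.get)

model = NaiveBayes().fit(docs)
print(model.predict("free money"))    # → spam
print(model.predict("noon meeting"))  # → ham
```

Smoothing is what keeps unseen words from zeroing out a class's probability; summing logs avoids underflow on longer documents.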
🎥 What is Data Science? How to Become a Data Scientist? | Data Science for Beginners
👁 1 view ⏳ 1739 sec.
Hi,
I'm Kaish Ansari, and in this video I've covered all the topics around what data science is and how to become a data scientist.
I cover why data science is so popular these days and how someone can become a data scientist.
I also cover the datasets available around us.
This tutorial also contains information about various platforms where you can learn machine learning and mathematics for data science,
like Khan Academy for mathematics
and Coursera or Udacity for…
Discovering the essential tools for Named Entities Recognition
It’s all about the names!
Towards Data Science
https://arxiv.org/abs/1905.00507
🔗 Learning higher-order sequential structure with cloned HMMs
Variable order sequence modeling is an important problem in artificial and natural intelligence. While overcomplete Hidden Markov Models (HMMs), in theory, have the capacity to represent long-term temporal structure, they often fail to learn and converge to local minima. We show that by constraining HMMs with a simple sparsity structure inspired by biology, we can make it learn variable order sequences efficiently. We call this model cloned HMM (CHMM) because the sparsity structure enforces that many hidden states map deterministically to the same emission state. CHMMs with over 1 billion parameters can be efficiently trained on GPUs without being severely affected by the credit diffusion problem of standard HMMs. Unlike n-grams and sequence memoizers, CHMMs can model temporal dependencies at arbitrarily long distances and recognize contexts with "holes" in them. Compared to Recurrent Neural Networks, CHMMs are generative models that can natively deal with uncertainty. Moreover, CHMMs return a higher-order graph that represents the temporal structure of the data which can be useful for community detection, and for building hierarchical models. Our experiments show that CHMMs can beat n-grams, sequence memoizers, and LSTMs on character-level language modeling tasks. CHMMs can be a viable alternative to these methods in some tasks that require variable order sequence modeling and the handling of uncertainty.
arXiv.org
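The abstract's key construction — many hidden "clone" states that map deterministically to the same emission symbol — can be sketched with a toy forward pass. Because emission is deterministic, each step only sums over the clones of the observed symbol rather than the full state space. Sizes and the random transition matrix here are illustrative, not the paper's setup:

```python
import math
import random

random.seed(1)

# Each observed symbol owns a fixed set of hidden "clone" states that
# deterministically emit it (the paper's sparsity structure).
symbols = ["a", "b", "c"]
N_CLONES = 3
clones = {s: [f"{s}{i}" for i in range(N_CLONES)] for s in symbols}
states = [c for s in symbols for c in clones[s]]

# Random transition matrix over all clone states, rows normalized
T = {}
for u in states:
    row = [random.random() for _ in states]
    z = sum(row)
    T[u] = {v: r / z for v, r in zip(states, row)}

def log_likelihood(seq):
    """Forward algorithm. Deterministic emission means that at each step
    only the clones of the observed symbol carry probability mass, so the
    update runs over N_CLONES states instead of all of them."""
    alpha = {c: 1.0 / N_CLONES for c in clones[seq[0]]}
    ll = 0.0
    for sym in seq[1:]:
        new = {c: sum(alpha[u] * T[u][c] for u in alpha) for c in clones[sym]}
        z = sum(new.values())
        ll += math.log(z)
        alpha = {c: p / z for c, p in new.items()}
    return ll

print(log_likelihood(["a", "b", "a", "c"]))
```

Which clone of a symbol gets the mass is learned from context (via the transitions), which is how the model represents variable-order history while keeping the emission matrix trivially sparse.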
💥 Training Neural Nets on Larger Batches: Practical Tips for 1-GPU, Multi-GPU & Distributed setups
Training neural networks with larger batches in PyTorch: gradient accumulation, gradient checkpointing, multi-GPUs and distributed setups…
Medium
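The core trick from the article, gradient accumulation, can be shown without PyTorch: summing appropriately weighted micro-batch gradients before a single parameter update reproduces the large-batch gradient. A one-parameter least-squares sketch with toy data (the post itself does this with `loss.backward()` calls and one `optimizer.step()`):

```python
# Toy (x, y) pairs; values are invented for illustration
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.1)]

def grad(w, batch):
    # d/dw of mean squared error (1/n) * sum (w*x - y)^2
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

w = 0.0
full = grad(w, data)                  # "large batch" gradient in one pass

micro_batches = [data[:2], data[2:]]  # two micro-batches of size 2
acc = 0.0
for mb in micro_batches:
    # accumulate, weighting each micro-batch by its share of the samples
    acc += grad(w, mb) * (len(mb) / len(data))

print(full, acc)  # identical up to float rounding
```

This is why accumulation lets a single GPU simulate a larger batch: only the per-micro-batch activations must fit in memory, while the effective gradient is that of the full batch.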
The basics of Deep Neural Networks
With the rise of libraries such as Tensorflow 2.0 and Fastai, implementing deep learning has become accessible to so many more people and…
Towards Data Science
Deep Learning for Data Integration
Synergistic effects of data integration with Deep Learning
Towards Data Science
GANs vs. Autoencoders: Comparison of Deep Generative Models
Want to turn horses into zebras? Make DIY anime characters or celebrities? Generative adversarial networks (GANs) are your new best friend.
Towards Data Science
Supervised Machine Learning Workflow from EDA to API
An introduction to supervised machine learning, ridge regression and APIs
Towards Data Science
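Ridge regression, one of the techniques the post introduces, is easiest to see in one dimension: the L2 penalty adds λ to the denominator of the ordinary least-squares solution, shrinking the coefficient toward zero. A toy sketch (data invented, no intercept term):

```python
# argmin_w  sum (y - w*x)^2 + lam * w^2   =>   w = Sxy / (Sxx + lam)
data = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9), (4.0, 8.2)]

def ridge_1d(data, lam):
    """Closed-form ridge solution for a single feature without intercept."""
    sxy = sum(x * y for x, y in data)
    sxx = sum(x * x for x, y in data)
    return sxy / (sxx + lam)

for lam in (0.0, 1.0, 10.0):
    print(lam, ridge_1d(data, lam))
# larger lambda => smaller |w| (shrinkage); lam = 0 recovers plain OLS
```

In the multivariate case the same formula becomes w = (XᵀX + λI)⁻¹Xᵀy, which is what libraries such as scikit-learn's `Ridge` solve.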
How Tesla Trains Autopilot
A transcript of part 2 of Tesla Autonomy Investor Day: the Autopilot training loop, the data-collection infrastructure, automatic data labeling, imitating human drivers, estimating distance from video, sensor supervision, and much more.
https://habr.com/ru/post/450796/
Habr
Using Ant Colony and Genetic Evolution to Optimize Ride-Sharing Trip Duration
Urban transportation is going through a rapid and significant evolution. Since the birth of the Internet and smartphones, we have become…
Towards Data Science
🎥 Deep Machine Learning for Biometric Privacy and Security
👁 1 view ⏳ 1695 sec.
Current scientific discourse identifies human identity recognition as one of the crucial tasks performed by government, social services, consumer, financial and health institutions worldwide. Biometric image and signal processing is increasingly used in a variety of applications to mitigate vulnerabilities, to predict risks, and to allow for rich and more intelligent data analytics. But there is an inherent conflict between enforcing stronger security and ensuring privacy rights protection. This keynote lec…