Neural Networks | Нейронные сети
All about machine learning

For all questions: @notxxx1

Our Telegram channel: tglink.me/ai_machinelearning_big_data

https://www.youtube.com/watch?v=ADYZmf7GvOw

🎥 Keras/TensorFlow 2.0, NLP with SQuAD, Spark SQL Expressions - Advanced Spark TensorFlow Meetup - SF
👁 1 view · 6533 sec.
Agenda
* Meetup Updates and Announcements - 4 Years and 230 Events!
* Intro Grammarly (Umayah Abdennabi, 5 mins)
* Meetup Updates and Announcements (Chris, 5 mins)
* Custom Functions in Spark SQL (30 mins)
Speaker: Umayah Abdennabi

Spark comes with a rich Expression library that can be extended to make custom expressions. We will look into custom expressions and why you would want to use them.
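
Custom Catalyst Expressions themselves are written in Scala against Spark's internals; as a rough Python-side analogue only (names like double_it are illustrative, not from the talk), a Spark SQL UDF can be registered and used inside SQL expressions:

```python
# Rough Python-side sketch: a Spark SQL UDF registered under a custom name
# so it can be called inside SQL expressions (illustrative only).
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[1]").appName("udf-sketch").getOrCreate()

# Register a Python function for use in SQL expressions.
spark.udf.register("double_it", lambda x: x * 2, "int")

spark.createDataFrame([(1,), (2,), (3,)], ["n"]).createOrReplaceTempView("t")
spark.sql("SELECT n, double_it(n) AS doubled FROM t").show()
```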

* TF 2.0 + Keras (30 mins)
Speaker: Francesco Mosconi

TensorFlow 2.0 was announced at the March TF Dev Summit, a
🎥 22. GANs and SuperResolution: Sergey Ovcharenko (Yandex)
👁 23 views · 4588 sec.
Dear listeners!

In this lecture, Sergey Ovcharenko (head of the Neural Network Technologies group in Yandex's Computer Vision Service) gives a detailed overview of various architectures of Generative Adversarial Networks, as well as their application to improving image and video quality (SuperResolution). An example of their results is available here: https://yandex.ru/blog/company/oldfilms

The presentation is available here: https://bit.ly/2YkVAX1
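
The lecture's own code is not part of this post; the sketch below is only a minimal illustration of the adversarial setup it describes (the toy data, layer sizes, and hyperparameters are assumptions), with a generator and discriminator trained against each other in PyTorch:

```python
import torch
import torch.nn as nn

# Minimal GAN sketch (illustrative only, not the lecture's code):
# a generator and a discriminator trained adversarially on toy 2-D data.
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

real = torch.randn(64, 2) + 3.0                  # "real" samples from a toy distribution

for step in range(200):
    # Discriminator step: push D(real) towards 1 and D(fake) towards 0.
    fake = G(torch.randn(64, 8)).detach()
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: try to make D output 1 on generated samples.
    fake = G(torch.randn(64, 8))
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```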

---

Deep Learning
🎥 Machine Learning Part 16: Naive Bayes Classifier In Python
👁 1 view · 802 sec.
In this video, we cover the Naive Bayes classifier and walk through an example in Python.
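
The video's exact code is not included in this post; a minimal sketch of the same technique with scikit-learn (the iris dataset and the Gaussian variant are assumptions) could look like this:

```python
# Minimal Naive Bayes example in Python with scikit-learn
# (illustrative only; not necessarily the code used in the video).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = GaussianNB().fit(X_train, y_train)            # fit class-conditional Gaussians
print(accuracy_score(y_test, model.predict(X_test)))  # test accuracy on this split
```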

CONNECT
Site: https://coryjmaklin.com/
Medium: https://medium.com/@corymaklin
GitHub: https://github.com/corymaklin
Twitter: https://twitter.com/CoryMaklin
Linkedin: https://www.linkedin.com/in/cory-maklin-a51732b7/
Facebook: https://www.facebook.com/cory.maklin
Patreon: https://www.patreon.com/corymaklin
🎥 What is Data Science? How to Become a Data Scientist? | Data Science for Beginners
👁 1 view · 1739 sec.
Hi,
I'm Kaish Ansari, and in this video I've covered all the topics around what data science is and how to become a data scientist.
I've covered why data science is so trendy nowadays and how someone can become a data scientist.
I've also covered the datasets available all around us.

This video tutorial also contains information about various platforms where you can learn machine learning and the mathematics for data science,
like Khan Academy for mathematics
and Coursera or Udacity for
https://arxiv.org/abs/1905.00507

🔗 Learning higher-order sequential structure with cloned HMMs
Variable order sequence modeling is an important problem in artificial and natural intelligence. While overcomplete Hidden Markov Models (HMMs), in theory, have the capacity to represent long-term temporal structure, they often fail to learn and converge to local minima. We show that by constraining HMMs with a simple sparsity structure inspired by biology, we can make it learn variable order sequences efficiently. We call this model cloned HMM (CHMM) because the sparsity structure enforces that many hidden states map deterministically to the same emission state. CHMMs with over 1 billion parameters can be efficiently trained on GPUs without being severely affected by the credit diffusion problem of standard HMMs. Unlike n-grams and sequence memoizers, CHMMs can model temporal dependencies at arbitrarily long distances and recognize contexts with "holes" in them. Compared to Recurrent Neural Networks, CHMMs are generative models that can natively deal with uncertainty. Moreover, CHMMs return a higher-order graph that represents the temporal structure of the data which can be useful for community detection, and for building hierarchical models. Our experiments show that CHMMs can beat n-grams, sequence memoizers, and LSTMs on character-level language modeling tasks. CHMMs can be a viable alternative to these methods in some tasks that require variable order sequence modeling and the handling of uncertainty.
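
The paper's implementation is not reproduced here; the sketch below only illustrates the structural idea under assumed sizes: each observed symbol gets a fixed block of "clone" hidden states with a deterministic emission, while the transitions between clones remain the free parameters.

```python
import numpy as np

# Structural sketch of a cloned HMM (CHMM): hidden state (s, c) always emits
# symbol s; only the transition matrix between clones would be learned (e.g. by EM).
rng = np.random.default_rng(0)
n_symbols, n_clones = 4, 3                 # illustrative sizes
n_states = n_symbols * n_clones

# Deterministic, sparse emission structure enforced by construction.
emission = np.zeros((n_states, n_symbols))
for s in range(n_symbols):
    emission[s * n_clones:(s + 1) * n_clones, s] = 1.0

# Dense transitions between clones (random here; training would fit these).
transition = rng.random((n_states, n_states))
transition /= transition.sum(axis=1, keepdims=True)

def forward_loglik(obs):
    """Log-likelihood of a symbol sequence via the scaled forward algorithm."""
    alpha = np.full(n_states, 1.0 / n_states) * emission[:, obs[0]]
    loglik = np.log(alpha.sum()); alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ transition) * emission[:, o]
        loglik += np.log(alpha.sum()); alpha /= alpha.sum()
    return loglik

print(forward_loglik([0, 1, 2, 3, 0, 1]))
```
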
How Tesla trains Autopilot

A transcript of Part 2 of Tesla Autonomy Investor Day: the Autopilot training loop, the data collection infrastructure, automatic data labeling, imitation of human drivers, distance estimation from video, sensor supervision, and much more.
https://habr.com/ru/post/450796/

🎥 Deep Machine Learning for Biometric Privacy and Security
👁 1 view · 1695 sec.
Current scientific discourse identifies human identity recognition as one of the crucial tasks performed by government, social services, consumer, financial and health institutions worldwide. Biometric image and signal processing is increasingly used in a variety of applications to mitigate vulnerabilities, to predict risks, and to allow for rich and more intelligent data analytics. But there is an inherent conflict between enforcing stronger security and ensuring privacy rights protection. This keynote lec
🎥 Lesson 5 Deep Learning 2019 Back propagation; Accelerated SGD; Neural net from scratch
👁 1 view · 8014 sec.
In lesson 5 we put all the pieces of training together to understand exactly what is going on when we talk about *back propagation*. We'll use this knowledge to create and train a simple neural network from scratch.
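
The course notebook is not included in this post; as a from-scratch sketch in the same spirit (toy data and layer sizes are assumptions), a one-hidden-layer network trained with manual back propagation and plain SGD might look like this:

```python
import numpy as np

# From-scratch sketch: one hidden layer, manual back propagation, plain SGD
# on toy data (illustrative only, not the course notebook).
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 10))                         # toy inputs
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)   # toy binary targets

W1 = rng.normal(scale=0.1, size=(10, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.1, size=(32, 1));  b2 = np.zeros(1)
lr = 0.1

for epoch in range(100):
    # forward pass
    h = np.maximum(X @ W1 + b1, 0)               # ReLU hidden layer
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))         # sigmoid output
    loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

    # backward pass: the chain rule, i.e. "back propagation"
    dlogits = (p - y) / len(X)                   # gradient w.r.t. pre-sigmoid output
    dW2 = h.T @ dlogits; db2 = dlogits.sum(axis=0)
    dh = (dlogits @ W2.T) * (h > 0)              # ReLU gradient
    dW1 = X.T @ dh;       db1 = dh.sum(axis=0)

    # SGD step (full batch here for brevity)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", loss)
```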

We'll also see how we can look inside the weights of an embedding layer, to find out what our model has learned about our categorical variables. This will let us get some insights into which movies we should probably avoid at all costs...

Although embeddings are most widely known in the contex
🎥 Lesson 7 Deep Learning 2019 Resnets from scratch; U net; Generative adversarial networks
👁 1 view · 7926 sec.
In the final lesson of Practical Deep Learning for Coders, we'll study one of the most essential techniques in modern architectures: the *skip connection*. This is most famously used in the *ResNet*, which is the architecture we've used throughout this course for image classification and appears in many cutting-edge results. We'll also look at the *U-net* architecture, which uses a different type of skip connection to significantly improve segmentation results (and even for similar tasks where the output s
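
The course notebook is not part of this post; a minimal illustration of the skip-connection idea (channel counts are assumptions, and this is not the exact fastai or ResNet block) could be:

```python
import torch
import torch.nn as nn

# Minimal residual block sketch: the skip connection adds the block's input
# back onto its output (illustrative only).
class ResBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)   # the skip connection

x = torch.randn(8, 64, 32, 32)
print(ResBlock(64)(x).shape)         # torch.Size([8, 64, 32, 32])
```
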
🎥 Lesson 6 Deep Learning 2019 Regularization; Convolutions; Data ethics
👁 1 view · 8263 sec.
Today we discuss some powerful techniques for improving training and avoiding over-fitting (a short sketch of where they sit in a model follows this list):
- *Dropout*: remove activations at random during training in order to regularize the model
- *Data augmentation*: modify model inputs during training in order to effectively increase data size
- *Batch normalization*: adjust the parameterization of a model in order to make the loss surface smoother.
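
A rough sketch of where dropout and batch normalization typically appear in a small model (not the course code; layer sizes are assumptions):

```python
import torch
import torch.nn as nn

# Illustrative placement of batch norm and dropout in a tiny image model.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.BatchNorm2d(16),          # batch normalization: smoother loss surface
    nn.ReLU(),
    nn.Dropout(0.25),            # dropout: randomly zero activations during training
    nn.Flatten(),
    nn.Linear(16 * 32 * 32, 10),
)

# Data augmentation would normally be applied to the inputs instead, e.g. with
# torchvision.transforms (RandomHorizontalFlip, RandomCrop, ...).
x = torch.randn(4, 3, 32, 32)
print(model(x).shape)            # torch.Size([4, 10])
```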

Next up, we'll learn all about *convolutions*, which can be thought of as a variant of matrix multiplication with tied