Neural Networks | Нейронные сети
All about machine learning

For all questions: @notxxx1

🎥 Deep Neural Networks step by step final prediction model #part 5
👁 1 view · 1737 sec.
In this tutorial we use the functions we implemented in the previous parts to build a deep network and apply it to cat vs. dog classification. Hopefully, we will see an improvement in accuracy relative to our previous logistic regression implementation. After this part we will be able to build and apply a deep neural network to supervised learning using only the NumPy library.

The full tutorial code and the cats-vs-dogs image dataset can be found on my GitHub page: https://github.com/pythonlessons/Logistic-r
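For a rough idea of what the final prediction step looks like, here is a minimal NumPy sketch (not the tutorial's actual code; layer sizes, helper names, and the cat/dog label convention are illustrative assumptions): an L-layer forward pass with ReLU hidden layers and a sigmoid output thresholded at 0.5.

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_params(layer_dims, seed=1):
    """He-style initialization for an L-layer network, e.g. [12288, 20, 7, 5, 1] (assumed sizes)."""
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        params[f"W{l}"] = rng.standard_normal((layer_dims[l], layer_dims[l - 1])) * np.sqrt(2.0 / layer_dims[l - 1])
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params

def predict(X, params):
    """Forward pass: ReLU hidden layers, sigmoid output, threshold at 0.5.
    X has shape (n_features, n_examples), i.e. images flattened column-wise."""
    L = len(params) // 2
    A = X
    for l in range(1, L):
        A = relu(params[f"W{l}"] @ A + params[f"b{l}"])
    AL = sigmoid(params[f"W{L}"] @ A + params[f"b{L}"])
    return (AL > 0.5).astype(int)  # 1 = cat, 0 = dog (assumed label convention)
```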
🎥 Deep Learning and Modern NLP - Zachary S Brown
👁 1 view · 5393 sec.
In this tutorial, we’ll cover the fundamental building blocks of neural network architectures and how they are utilized to tackle problems in modern natural language processing. Topics covered will include an overview of language vector representations, text classification, named entity recognition, and sequence to sequence modeling approaches. An emphasis will be placed on the shape of these types of problems from the perspective of deep learning architectures. This will help to develop an intuition for id
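As a toy illustration of those building blocks (not code from the talk), the sketch below wires a learned word-embedding layer into a small Keras text classifier; the vocabulary size, embedding width, and binary output are illustrative assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Minimal text classifier: token ids -> embeddings -> pooled document vector -> dense head.
model = tf.keras.Sequential([
    layers.Input(shape=(None,)),                          # variable-length sequence of token ids
    layers.Embedding(input_dim=20_000, output_dim=128),   # learned word vector representations
    layers.GlobalAveragePooling1D(),                      # average word vectors into one document vector
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),                 # binary text classification (assumed task)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```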
🎥 Leveraging NLP and Deep Learning for Document Recommendations in the Cloud - Guoqiong Song, Intel
👁 1 view · 1254 sec.
Efficient recommender systems are critical for the success of many industries, such as job recommendation, news recommendation, e-commerce, etc. This talk will illustrate how to build an efficient document recommender system by leveraging Natural Language Processing (NLP) and Deep Neural Networks (DNNs). The end-to-end flow of the document recommender system is built on AWS at scale, using Analytics Zoo for Spark and BigDL. The system first processes text-rich documents into embeddings by incorporating Global
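To make the core idea concrete, here is a minimal NumPy sketch (not the Analytics Zoo / BigDL pipeline from the talk): each document is embedded by averaging pre-trained word vectors, and recommendations are the nearest documents by cosine similarity. The word_vectors lookup and the embedding dimension are assumptions.

```python
import numpy as np

def doc_embedding(tokens, word_vectors, dim=100):
    """Average pre-trained word vectors (e.g. loaded from GloVe) into one document vector.
    word_vectors is assumed to map token -> NumPy vector of length dim."""
    vecs = [word_vectors[t] for t in tokens if t in word_vectors]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

def recommend(query_emb, doc_embs, top_k=5):
    """Return indices of the top_k documents closest to the query embedding (cosine similarity)."""
    doc_norms = doc_embs / (np.linalg.norm(doc_embs, axis=1, keepdims=True) + 1e-8)
    q = query_emb / (np.linalg.norm(query_emb) + 1e-8)
    scores = doc_norms @ q            # cosine similarity per document
    return np.argsort(-scores)[:top_k]
```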
🎥 TWiML x Fast ai v3 Deep Learning Part 2 Study Group - Lesson 12 - Spring 2019 1080p
👁 1 view · 5738 sec.
**SUBSCRIBE AND TURN ON NOTIFICATIONS** **twimlai.com**

This video is a recap of our TWiML Online Fast.ai Deep Learning Part 2 Study Group.

In this session, we had a mini presentation on Deep Representation Learning for Trigger Monitoring and a review of Lesson 12 of the Fast.ai v3 Deep Learning Part 2 course.

It’s not too late to join the study group. Just follow these simple steps:

1. Head over to twimlai.com/meetup, and sign up for the programs you're interested in, including either of the Fast.ai study
​Digging Into Self-Supervised Monocular Depth Estimation

https://arxiv.org/abs/1806.01260

https://github.com/nianticlabs/monodepth2

🔗 Digging Into Self-Supervised Monocular Depth Estimation
Per-pixel ground-truth depth data is challenging to acquire at scale. To overcome this limitation, self-supervised learning has emerged as a promising alternative for training models to perform monocular depth estimation. In this paper, we propose a set of improvements, which together result in both quantitatively and qualitatively improved depth maps compared to competing self-supervised methods. Research on self-supervised monocular training usually explores increasingly complex architectures, loss functions, and image formation models, all of which have recently helped to close the gap with fully-supervised methods. We show that a surprisingly simple model, and associated design choices, lead to superior predictions. In particular, we propose (i) a minimum reprojection loss, designed to robustly handle occlusions, (ii) a full-resolution multi-scale sampling method that reduces visual artifacts, and (iii) an auto-masking loss to ignore training pixels that violate camera motion assumptions. We demonstrate t
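For reference, here is a minimal NumPy sketch of the paper's per-pixel minimum reprojection loss and (in simplified form) the auto-masking test; shapes and names are illustrative assumptions, not the authors' code.

```python
import numpy as np

def min_reprojection_loss(errors):
    """errors: array of shape (num_source_frames, H, W) holding the photometric error
    of each source frame reprojected into the target view. Taking the per-pixel
    minimum over source frames (instead of the average) is more robust to occlusions."""
    per_pixel_min = errors.min(axis=0)   # pick the best source frame per pixel
    return per_pixel_min.mean()          # average over the image

def auto_mask(reproj_errors, identity_errors):
    """Simplified auto-masking: keep only pixels where some reprojected source frame
    matches the target better than the unwarped source frames do, i.e. drop pixels
    that appear static relative to the camera."""
    return reproj_errors.min(axis=0) < identity_errors.min(axis=0)
```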
🎥 TWiML x Fast ai v3 Deep Learning Part 2 Study Group - Lesson 13 - Spring 2019 1080p
👁 1 view · 5669 sec.
**SUBSCRIBE AND TURN ON NOTIFICATIONS** **twimlai.com**

This video is a recap of our TWiML Online Fast.ai Deep Learning Part 2 Study Group.

In this session, we review Lesson 13 ("Clone the Fast.ai Repo") of the Fast.ai v3 Deep Learning Part 2 course.

It’s not too late to join the study group. Just follow these simple steps:

1. Head over to twimlai.com/meetup, and sign up for the programs you're interested in, including either of the Fast.ai study groups or our Monthly Meetup groups.

2. Use the email i
🎥 Ivan Yamshchikov - How to Talk to an Algorithm
👁 12 views · 4034 sec.
What is machine learning? Where is it applied? What tasks and problems does machine learning face in processing language, text, and speech? Why is language hard? How do machines talk? What is the history of machine learning?
The speaker is Ivan Yamshchikov, PhD, research fellow at the Max Planck Institute in Leipzig and AI evangelist at ABBYY.

Our podcast channel:
http://nauka-pro.ru/podcasting

We thank the educational project "MatChast" for its help in organizing the filming.

Friends, if you would like to supp
🎥 Building Autonomous Systems with Machine Teaching - THR3006
👁 1 view · 1205 sec.
Building autonomous systems with traditional machine learning techniques is difficult. Machine Teaching is a new approach to building intelligence using deep reinforcement learning. Come to this session to learn how to use machine teaching to apply expert knowledge and create deep reinforcement learning models that control industrial systems such as bulldozers, oil drills, and more.
🎥 2019 - Guy Royse - Deep Learning like a Viking: Building Convolutional Neural Networks with Keras
👁 1 view · 3298 sec.
The Vikings came from the land of ice and snow, from the midnight sun, where the hot springs flow. In addition to longships and bad attitudes, they had a system of writing that we, in modern times, have dubbed the Younger Futhark (or ᚠᚢᚦᚬᚱᚴ if you're a Viking). These sigils are more commonly called runes and have been mimicked in fantasy literature and role-playing games for decades. Of course, having an alphabet, runic or otherwise, solves lots of problems. But, it also introduces others. The Vikings had t
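As a rough illustration of the kind of model the talk builds (not the speaker's code), here is a small Keras CNN for classifying rune images; the input size, layer sizes, and 16 output classes (the Younger Futhark has 16 runes) are illustrative assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Minimal convolutional classifier for grayscale rune images.
model = tf.keras.Sequential([
    layers.Input(shape=(64, 64, 1)),                 # 64x64 grayscale rune image (assumed size)
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(16, activation="softmax"),          # one class per Younger Futhark rune
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```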
​ISSCC2019: Intelligence on Silicon: From Deep Neural Network Accelerators to Brain-Mimicking AI-SoCs

🔗 ISSCC2019: Intelligence on Silicon: From Deep Neural Network Accelerators to Brain-Mimicking AI-SoCs
Hoi-Jun Yoo, KAIST, Daejeon, Korea

Deep learning is influencing not only the technology itself but also our everyday lives. Formerly, most AI functionalities and applications were centralized in datacenters. However, the primary platform for AI has recently shifted to mobile devices. With the increasing demand for mobile AI, conventional hardware solutions struggle because of their low energy efficiency on such power-hungry applications. For the past few years, dedicated DNN accelerators inference