Neural Networks | Нейронные сети
11.6K subscribers
803 photos
184 videos
170 files
9.45K links
Everything about machine learning

For all questions: @notxxx1

🎥 Leveraging NLP and Deep Learning for Document Recommendations in the Cloud - Guoqiong Song, Intel
👁 1 view, 1254 sec.
Efficient recommender systems are critical for the success of many industries, such as job recommendation, news recommendation, e-commerce, etc. This talk will illustrate how to build an efficient document recommender system by leveraging Natural Language Processing (NLP) and Deep Neural Networks (DNNs). The end-to-end flow of the document recommender system is built on AWS at scale, using Analytics Zoo for Spark and BigDL. The system first processes text-rich documents into embeddings by incorporating Global
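The pipeline described above (documents → embeddings → recommendations) can be sketched with plain cosine similarity over embedding vectors. The 4-dimensional vectors and the `recommend` helper below are made-up illustrations, not code from the talk or from Analytics Zoo/BigDL:

```python
import numpy as np

def recommend(query_vec, doc_vecs, k=2):
    """Return indices of the k documents most similar to the query.

    Uses cosine similarity between a query embedding and a matrix of
    document embeddings (one row per document).
    """
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q                       # cosine similarity per document
    return np.argsort(scores)[::-1][:k]  # highest-scoring k documents

# Toy 4-dimensional "embeddings" for three documents.
docs = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.9, 0.1, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
])
query = np.array([1.0, 0.05, 0.0, 0.0])
print(recommend(query, docs))  # the two documents closest to the query
```

In a real system the vectors would come from the learned embeddings the talk describes, but the ranking step is the same.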
🎥 TWiML x Fast ai v3 Deep Learning Part 2 Study Group - Lesson 12 - Spring 2019 1080p
👁 1 view, 5738 sec.
**SUBSCRIBE AND TURN ON NOTIFICATIONS** **twimlai.com**

This video is a recap of our TWiML Online Fast.ai Deep Learning Part 2 Study Group.

In this session, we had a mini presentation on Deep Representation Learning for Trigger Monitoring and a review of Lesson 12 of the Fast.ai v3 Deep Learning Part 2 course.

It’s not too late to join the study group. Just follow these simple steps:

1. Head over to twimlai.com/meetup, and sign up for the programs you're interested in, including either of the Fast.ai study
Digging Into Self-Supervised Monocular Depth Estimation

https://arxiv.org/abs/1806.01260

https://github.com/nianticlabs/monodepth2

🔗 Digging Into Self-Supervised Monocular Depth Estimation
Per-pixel ground-truth depth data is challenging to acquire at scale. To overcome this limitation, self-supervised learning has emerged as a promising alternative for training models to perform monocular depth estimation. In this paper, we propose a set of improvements, which together result in both quantitatively and qualitatively improved depth maps compared to competing self-supervised methods. Research on self-supervised monocular training usually explores increasingly complex architectures, loss functions, and image formation models, all of which have recently helped to close the gap with fully-supervised methods. We show that a surprisingly simple model, and associated design choices, lead to superior predictions. In particular, we propose (i) a minimum reprojection loss, designed to robustly handle occlusions, (ii) a full-resolution multi-scale sampling method that reduces visual artifacts, and (iii) an auto-masking loss to ignore training pixels that violate camera motion assumptions. We demonstrate t
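The minimum reprojection loss, point (i) above, takes a per-pixel minimum of the photometric error over the warped source frames instead of averaging them, so a pixel occluded in one source view does not corrupt the loss. A toy numpy sketch of that idea (using plain L1 error for simplicity; monodepth2 itself combines SSIM and L1):

```python
import numpy as np

def min_reprojection_loss(target, warped_sources):
    """Per-pixel minimum photometric (L1) error over reprojected sources.

    target:         (H, W) target image
    warped_sources: list of (H, W) source images warped into the target view
    Averaging blends occluded and visible views; the per-pixel minimum
    keeps only the best-matching source at each pixel.
    """
    errors = np.stack([np.abs(target - s) for s in warped_sources])  # (N, H, W)
    per_pixel_min = errors.min(axis=0)                               # (H, W)
    return per_pixel_min.mean()

target = np.ones((2, 2))
good = np.ones((2, 2))    # source that matches the target everywhere
bad = np.zeros((2, 2))    # e.g. a source where this region is occluded
print(min_reprojection_loss(target, [good, bad]))  # 0.0: min ignores the bad view
```

With an average instead of a minimum, the occluded source would contribute an error of 0.5 here even though a perfectly matching view exists.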
🎥 TWiML x Fast ai v3 Deep Learning Part 2 Study Group - Lesson 13 - Spring 2019 1080p
👁 1 view, 5669 sec.
**SUBSCRIBE AND TURN ON NOTIFICATIONS** **twimlai.com**

This video is a recap of our TWiML Online Fast.ai Deep Learning Part 2 Study Group.

In this session, we review Lesson 13 ("Clone the Fast.ai Repo") of the Fast.ai v3 Deep Learning Part 2 course.

It’s not too late to join the study group. Just follow these simple steps:

1. Head over to twimlai.com/meetup, and sign up for the programs you're interested in, including either of the Fast.ai study groups or our Monthly Meetup groups.

2. Use the email i
🎥 Ivan Yamshchikov - How to Talk to an Algorithm
👁 12 views, 4034 sec.
What is machine learning? Where is it applied? What tasks and problems does machine learning face in processing language, text, and speech? Why is language hard? How do machines talk? What is the history of machine learning?
Presented by Ivan Yamshchikov, PhD, research fellow at the Max Planck Institute in Leipzig and AI evangelist at ABBYY.

Our podcast channel:
http://nauka-pro.ru/podcasting

We thank the educational project "МатЧасть" for their help in organizing the filming.

Friends, if you want to supp
🎥 Building Autonomous Systems with Machine Teaching - THR3006
👁 1 view, 1205 sec.
Building autonomous systems with traditional machine learning techniques is difficult. Machine teaching is a new approach to building intelligence using deep reinforcement learning. Come to this session to learn how to use machine teaching to apply expert knowledge when creating deep reinforcement learning models that control industrial systems such as bulldozers, oil drills, and more.
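Machine teaching builds on the reinforcement-learning loop of states, actions, and rewards. As a minimal illustration of that loop (tabular Q-learning on a toy one-dimensional chain, not the session's machine-teaching stack or a deep model):

```python
import random

def train_q(n_states=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning on a toy chain: move left/right, reward at the end."""
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]  # Q[state][action]; 0=left, 1=right
    for _ in range(episodes):
        s = 0
        while s < n_states - 1:
            # Epsilon-greedy action selection: explore sometimes, else exploit.
            a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda x: q[s][x])
            s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
            r = 1.0 if s2 == n_states - 1 else 0.0   # reward only at the goal
            # Q-learning update toward the bootstrapped return.
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = train_q()
# After training, moving right should dominate in every non-terminal state.
print([max((0, 1), key=lambda a: q[s][a]) for s in range(4)])
```

Deep RL replaces the table with a neural network, and machine teaching layers expert-specified lessons and goals on top of this loop.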
🎥 2019 - Guy Royse - Deep Learning like a Viking: Building Convolutional Neural Networks with Keras
👁 1 view, 3298 sec.
The Vikings came from the land of ice and snow, from the midnight sun, where the hot springs flow. In addition to longships and bad attitudes, they had a system of writing that we, in modern times, have dubbed the Younger Futhark (or ᚠᚢᚦᚬᚱᚴ if you're a Viking). These sigils are more commonly called runes and have been mimicked in fantasy literature and role-playing games for decades. Of course, having an alphabet, runic or otherwise, solves lots of problems. But, it also introduces others. The Vikings had t
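The building block of the convolutional networks the talk constructs in Keras is the convolution itself. A minimal numpy sketch of a "valid" 2-D convolution (implemented as cross-correlation, the way deep-learning libraries do), not the talk's Keras code:

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D cross-correlation: slide the kernel over the image."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Elementwise product of the patch and kernel, then sum.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A horizontal difference kernel responds at the step edge in this image.
img = np.array([[0., 0., 1., 1.],
                [0., 0., 1., 1.],
                [0., 0., 1., 1.]])
k = np.array([[1., -1.]])
print(conv2d(img, k))
```

A CNN learns many such kernels per layer; Keras's `Conv2D` wraps this operation with learned weights, padding, and strides.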
ISSCC2019: Intelligence on Silicon: From Deep Neural Network Accelerators to Brain-Mimicking AI-SoCs

🔗 ISSCC2019: Intelligence on Silicon: From Deep Neural Network Accelerators to Brain-Mimicking AI-SoCs
Hoi-Jun Yoo, KAIST, Daejeon, Korea

Deep learning is influencing not only the technology itself but also our everyday lives.
Formerly, most AI functionality and applications were centralized in datacenters. However,
the primary platform for AI has recently shifted to mobile devices. With the increasing demand
for mobile AI, conventional hardware solutions struggle because of their low energy
efficiency on such power-hungry applications. For the past few years, dedicated DNN
accelerators inference
ISSCC 2019: Deep Learning Hardware: Past, Present, and Future - Yann LeCun

🔗 ISSCC 2019: Deep Learning Hardware: Past, Present, and Future - Yann LeCun
Yann LeCun, Facebook AI Research & New York University, New York, NY

Deep learning has caused revolutions in computer understanding of images, audio, and text,
enabling new applications such as information search and filtering, autonomous driving,
radiology screening, real-time language translation, and virtual assistants. But almost all these
successes rely on supervised learning, which requires human-annotated data, or
reinforcement learning, which requires too many trials to be practical in most rea
🎥 Swift for TensorFlow (Google I/O'19)
👁 1 view, 1705 sec.
Swift for TensorFlow is a platform for the next generation of machine learning that leverages innovations like first-class differentiable programming to seamlessly integrate deep neural networks with traditional software development. In this session, learn how Swift for TensorFlow can make advanced machine learning research easier and why Jeremy Howard’s fast.ai has chosen it for the latest iteration of their deep learning course.

Watch more #io19 here: Machine Learning at Google I/O 2019 Playlist → https:
🎥 Exploring the Deep Learning Framework PyTorch - Stephanie Kim
👁 1 view, 2159 sec.
STEPHANIE KIM | SOFTWARE ENGINEER AT ALGORITHMIA

Users rapidly adopted PyTorch 1.0 for many reasons. PyTorch is intuitive to learn, and its modularity enhances debugging and visibility. Additionally, unlike other frameworks such as TensorFlow, PyTorch supports dynamic computation graphs that allow network behavior to change on the fly. This talk showcases PyTorch benefits like TorchScript, which allows models to be exported to non-Python environments. We'll also discuss pre-release serialization and performa
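"Dynamic computation graphs" (define-by-run) means the graph is recorded as Python executes, so control flow can depend on runtime values. A toy scalar autodiff sketch of that idea in pure Python (an illustration of the concept, not PyTorch's actual implementation):

```python
class Var:
    """Scalar that records the operations applied to it (define-by-run)."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # (parent Var, local gradient) pairs
        self.grad = 0.0

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def backward(self, upstream=1.0):
        """Walk the recorded graph backwards, accumulating gradients.

        Simple recursive version: fine for this toy, inefficient for
        graphs with heavily shared nodes.
        """
        self.grad += upstream
        for parent, local in self.parents:
            parent.backward(upstream * local)

x = Var(3.0)
# The graph depends on runtime control flow, as in a dynamic framework:
y = x * x if x.value > 0 else x + x
y.backward()
print(y.value, x.grad)  # y = x^2 = 9.0, dy/dx = 2x = 6.0
```

Because the branch is ordinary Python, each forward pass can build a different graph; TorchScript then exists to capture such models for deployment outside Python.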