Neural Networks | Нейронные сети
11.6K subscribers
802 photos
184 videos
170 files
9.45K links
Everything about machine learning

For all inquiries: @notxxx1

Self-Attention Generative Adversarial Networks

🔗 Self-Attention Generative Adversarial Networks
In this paper, we propose the Self-Attention Generative Adversarial Network (SAGAN), which allows attention-driven, long-range dependency modeling for image generation tasks. Traditional convolutional GANs generate high-resolution details as a function of only spatially local points in lower-resolution feature maps. In SAGAN, details can be generated using cues from all feature locations. Moreover, the discriminator can check that highly detailed features in distant portions of the image are consistent with each other. Furthermore, recent work has shown that generator conditioning affects GAN performance. Leveraging this insight, we apply spectral normalization to the GAN generator and find that this improves training dynamics. The proposed SAGAN achieves state-of-the-art results, boosting the best published Inception score from 36.8 to 52.52 and reducing Frechet Inception distance from 27.62 to 18.65 on the challenging ImageNet dataset. Visualization of the attention layers shows that the generator leverages neighborhoods that correspond to object shapes rather than local regions of fixed shape.
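The non-local attention block at the heart of SAGAN can be sketched in a few lines. A minimal numpy version, assuming a flattened feature map and illustrative projection names f, g, h following the paper (the real layer uses 1x1 convolutions and adds a learned residual scale, out = gamma * o + x, which is omitted here):

```python
import numpy as np

def self_attention(x, w_f, w_g, w_h):
    """Non-local self-attention over a flattened feature map.

    x: (N, C) array, where N = H * W spatial locations with C channels.
    w_f, w_g: (C, C') query/key projections; w_h: (C, C) value projection.
    """
    f = x @ w_f                       # queries
    g = x @ w_g                       # keys
    h = x @ w_h                       # values
    scores = f @ g.T                  # (N, N) affinities between all locations
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)       # softmax over keys
    o = attn @ h                      # each location mixes cues from ALL locations
    return o, attn
```

Because `attn` is an (N, N) matrix, every output location can draw on every other location, which is exactly the long-range dependency modeling the abstract describes.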
🎥 How Much Data is Enough to Build a Machine Learning Model
👁 1 view, 1549 sec.
Because machine learning models learn from data, it is important to have enough data that the model can learn to handle every case you will throw at it when it is actually used. It is common practice to make sure that all of the inputs to a model (such as a neural network) are within the ranges of the training data. However, this univariate approach does not address multivariate coverage of the data. For example, your training data may have individuals with heights ranging …
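The univariate range check described here, and one common multivariate alternative, can be sketched as follows. This is a hedged numpy illustration (function names are ours, not from the video); the Mahalanobis distance is one standard way to flag points that pass every per-feature range check yet fall far outside the joint training distribution:

```python
import numpy as np

def in_univariate_range(train, x):
    """True if every feature of x lies within the per-feature min/max of train."""
    return bool(np.all((x >= train.min(axis=0)) & (x <= train.max(axis=0))))

def mahalanobis_distance(train, x):
    """Multivariate check: how far x sits from the training distribution,
    accounting for correlations between features."""
    mu = train.mean(axis=0)
    cov = np.cov(train, rowvar=False)
    d = x - mu
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))
```

With correlated features such as height and weight, a point combining the tallest height with the lightest weight passes the univariate check but gets a very large Mahalanobis distance.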
🎥 CatBoost: gradient boosting from Yandex
👁 161 views, 4853 sec.
An invited lecture in the course "Machine Learning, Part 2" (spring 2018).
Lecturer: Anna Veronika Dorogush (Yandex).
Lecture page on the CS Center website: https://goo.gl/YwePW1
Our Telegram channel: tglink.me/ai_machinelearning_big_data

https://www.youtube.com/watch?v=s4Lcf9du9L8
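CatBoost's core idea, gradient boosting, fits each new weak learner to the current residuals, which for squared loss are the negative gradient of the loss with respect to the predictions. A bare-bones sketch with decision stumps in pure numpy; CatBoost itself adds ordered boosting, categorical feature handling, and oblivious trees, none of which is shown here:

```python
import numpy as np

def fit_stump(x, r):
    """Best single-split decision stump for squared error on residuals r."""
    best = None
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        loss = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or loss < best[0]:
            best = (loss, t, left.mean(), right.mean())
    _, t, left_val, right_val = best
    return lambda z: np.where(z <= t, left_val, right_val)

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    """Boosting for squared loss: each stump fits the current residuals,
    and a small learning rate shrinks each stump's contribution."""
    pred = np.full(len(y), y.mean())
    for _ in range(n_rounds):
        stump = fit_stump(x, y - pred)
        pred = pred + lr * stump(x)
    return pred
```

Each round shrinks the remaining residual, so even these one-split learners combine into an accurate ensemble.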

🎥 TensorFlow Installation | Step By Step Guide to Install TensorFlow on Windows | Edureka
👁 1 view, 546 sec.
*** AI and Deep-Learning with TensorFlow - https://www.edureka.co/ai-deep-learning-with-tensorflow ***
This video provides a step-by-step installation process for TensorFlow. It also gives you a brief overview of TensorFlow and of how different industries use it to solve real-life problems.
1:03 What is TensorFlow?
1:43 Applications of TensorFlow
2:51 Installation

------------------------------------------------
*** Machine Learning Podcast - https://castbox.fm/channel/id1832236 ***
Instagram:
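The installation steps in the video reduce to a few commands. A minimal sketch, assuming Python 3 and pip are already installed (shown for a POSIX shell; on Windows, activate the environment with `tf-env\Scripts\activate` instead):

```shell
# Create an isolated environment and install TensorFlow into it.
python3 -m venv tf-env
. tf-env/bin/activate
pip install --upgrade pip
pip install tensorflow
# Sanity check: print the installed version.
python -c "import tensorflow as tf; print(tf.__version__)"
```

Using a virtual environment keeps TensorFlow and its pinned dependencies from conflicting with other Python projects on the machine.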
🎥 NVIDIA's AI Creates Beautiful Images From Your Sketches
👁 1 view, 250 sec.
If you wish to support the series, please buy anything through this Amazon link - you don't lose anything and we get a small kickback. Thank you so much!
US: https://amzn.to/2FQHPcs
EU: https://amzn.to/2UnB2yF

📝 The paper "Semantic Image Synthesis with Spatially-Adaptive Normalization" and its source code is available here:
https://nvlabs.github.io/SPADE/
https://github.com/NVlabs/SPADE

❤️ Pick up cool perks on our Patreon page: https://www.patreon.com/TwoMinutePapers

🙏 We would like to thank our generous …
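The paper's spatially-adaptive normalization (SPADE) normalizes activations and then modulates them with a per-pixel scale and shift predicted from the segmentation map. A simplified numpy sketch, replacing the paper's small convolutional modulation network with per-class weight tables (names and shapes are illustrative):

```python
import numpy as np

def spade_norm(x, segmap, w_gamma, w_beta, eps=1e-5):
    """Spatially-adaptive normalization, simplified.

    x:      (H, W, C) activations
    segmap: (H, W, S) one-hot segmentation map with S classes
    w_gamma, w_beta: (S, C) per-class modulation weights (stand-ins for
                     the small conv net used in the paper)
    """
    # Normalize each channel (no learned affine parameters here).
    mu = x.mean(axis=(0, 1), keepdims=True)
    sigma = x.std(axis=(0, 1), keepdims=True)
    x_hat = (x - mu) / (sigma + eps)
    # Per-pixel scale and shift predicted from the segmentation map.
    gamma = segmap @ w_gamma    # (H, W, C)
    beta = segmap @ w_beta      # (H, W, C)
    return gamma * x_hat + beta
```

Because gamma and beta vary per pixel with the semantic class, the layout information in the segmentation map survives normalization instead of being washed out.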
🎥 Extract data using API (Python) - Part 2 | Machine & Deep Learning
👁 1 view, 1473 sec.
Extract data using API (Python) - Part 2 | Machine & Deep Learning Bootcamp

Welcome to "The AI University".

Subtitles available in: Hindi, English, French

About this video:
This video explains how to extract data from the CoinMarketCap API endpoint using the API key provided by CoinMarketCap. This is part 2, a continuation of the previous video, which introduced APIs. Once extracted, the data is stored both pretty-printed and as a plain text file.

Follow me on Twitter: https://twitter.com/theaiunivers
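The workflow described above (call an endpoint with an API key sent in a request header, then save the response both pretty-printed and as plain text) can be sketched with the standard library. The endpoint path and header name below reflect CoinMarketCap's public docs as we understand them, but treat them as assumptions and verify against the current documentation:

```python
import json
import urllib.request

# Assumed endpoint and header name for CoinMarketCap's listings API.
API_URL = "https://pro-api.coinmarketcap.com/v1/cryptocurrency/listings/latest"

def fetch_listings(api_key, limit=10):
    """Call the endpoint, sending the API key in a request header."""
    req = urllib.request.Request(
        f"{API_URL}?limit={limit}",
        headers={"X-CMC_PRO_API_KEY": api_key, "Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def save_pretty_and_plain(payload, pretty_path, text_path):
    """Store the parsed response twice: pretty-printed and as compact text."""
    with open(pretty_path, "w") as f:
        json.dump(payload, f, indent=2)   # human-readable ("pretty") copy
    with open(text_path, "w") as f:
        f.write(json.dumps(payload))      # compact plain-text copy
```

Keeping the fetch and the save steps separate makes it easy to re-save or post-process cached responses without hitting the API again.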
🎥 Lesson 4. Optimization basics: derivative and gradient
👁 1 view, 2613 sec.
The "learning" process of modern AI amounts to optimizing a loss function given the data, i.e. the features of the objects and the answers (in the supervised learning setting).

In this lesson the core concepts of optimization methods are covered: the derivative and the gradient. Thanks to them, we can use gradient descent to optimize linear models and a single neuron.

Lecturer: Kirill Golubev (MIPT)

Materials:
https://drive.google.com/open?id=1zxLACGTyzWigd_JkCN76D8GCxyM6LWCf

---

About Deep Learning School at PSAMI MIPT
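The lesson's two concepts, the derivative/gradient and gradient descent, fit in a short sketch. A numpy illustration (function names are ours, not the course's) using a central-difference gradient estimate:

```python
import numpy as np

def numeric_grad(f, x, h=1e-6):
    """Central-difference estimate of the gradient of f at point x."""
    g = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def gradient_descent(f, x0, lr=0.1, steps=200):
    """Repeatedly step against the gradient to minimize f."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x -= lr * numeric_grad(f, x)
    return x
```

In practice the gradient is computed analytically (or by autodiff), but the numeric estimate makes the definition of the derivative concrete.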
🎥 Lesson 4. Linear models and Gradient Descent
👁 1 view, 1942 sec.
Linear models are the base algorithms in machine learning; they are used almost everywhere in production. They are the key to understanding how a single neuron in a neural net works.

Lecturer: Kirill Golubev (MIPT)

Materials:
https://drive.google.com/open?id=1zxLACGTyzWigd_JkCN76D8GCxyM6LWCf

---

About Deep Learning School at PSAMI MIPT

Official website: https://www.dlschool.org
Github-repo: https://github.com/DLSchool/dlschool_english

About PSAMI MIPT

Official website: https://mipt.ru/english/edu/phy
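A linear model trained by gradient descent, which is exactly one neuron with no activation function, can be sketched as follows (a minimal numpy illustration, not the course's code):

```python
import numpy as np

def train_linear(X, y, lr=0.1, epochs=500):
    """Fit y ~ X @ w + b by gradient descent on mean squared error."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        err = X @ w + b - y                  # residuals
        w -= lr * (2.0 / n) * (X.T @ err)    # d(MSE)/dw
        b -= lr * (2.0 / n) * err.sum()      # d(MSE)/db
    return w, b
```

Replacing the identity output with a nonlinearity and stacking many such units is all that separates this from a neural network layer.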
🎥 Theoretical Deep Learning. The Information Bottleneck method. Part 1
👁 1 view, 5831 sec.
In this class we introduce the information bottleneck method. We also discuss what we can learn about the training process of neural nets using this technique.

Find out more: https://github.com/deepmipt/tdl

Our open-source framework to develop and deploy conversational assistants: https://deeppavlov.ai/
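The information bottleneck objective trades compression against prediction: choose a representation T of X that minimizes I(X;T) - beta * I(T;Y). Both terms are mutual informations, which for discrete variables can be computed directly from a joint probability table. A minimal numpy sketch (the function name is ours, not from the class):

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in nats from a discrete joint distribution table p_xy."""
    px = p_xy.sum(axis=1, keepdims=True)   # marginal of X
    py = p_xy.sum(axis=0, keepdims=True)   # marginal of Y
    mask = p_xy > 0                        # 0 * log 0 = 0 by convention
    return float((p_xy[mask] * np.log(p_xy[mask] / (px @ py)[mask])).sum())
```

Independent variables give I = 0, and two perfectly correlated fair bits give I = log 2 nats, the two sanity checks worth running first.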