Neural Networks | Нейронные сети
11.6K subscribers
802 photos
184 videos
170 files
9.45K links
Everything about machine learning

For all questions: @notxxx1

3Q: Assessing MIT’s computing infrastructure needs
In planning for the MIT Schwarzman College of Computing, a working group is exploring needs across all parts of the Institute.
http://news.mit.edu/2019/schwarzman-college-computing-infrastructure-0429
Our Telegram channel: tglink.me/ai_machinelearning_big_data

🔗 3Q: Assessing MIT’s computing infrastructure needs
In planning for the MIT Schwarzman College of Computing, a working group is exploring needs across all parts of the Institute.
https://arxiv.org/abs/1904.11621

🔗 Meta-Sim: Learning to Generate Synthetic Datasets
Training models to high-end performance requires the availability of large labeled datasets, which are expensive to obtain. The goal of our work is to automatically synthesize labeled datasets that are relevant for a downstream task. We propose Meta-Sim, which learns a generative model of synthetic scenes and obtains images, as well as their corresponding ground truth, via a graphics engine. We parametrize our dataset generator with a neural network, which learns to modify attributes of scene graphs obtained from probabilistic scene grammars so as to minimize the distribution gap between its rendered outputs and target data. If the real dataset comes with a small labeled validation set, we additionally aim to optimize a meta-objective, i.e., downstream task performance. Experiments show that the proposed method can greatly improve content generation quality over a human-engineered probabilistic scene grammar, both qualitatively and quantitatively, as measured by performance on a downstream task.
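
As a rough illustration of the idea (a sketch, not the authors' code): a small network predicts residual modifications to scene-graph attributes sampled from a prior, and its parameters are trained to shrink a distribution gap, approximated here with an RBF-kernel MMD, between features of rendered and real data. The names (SceneAttributeNet, render_and_featurize, rbf_mmd) and the differentiable stand-ins for the renderer and feature extractor are hypothetical.

```python
# Hypothetical sketch of the Meta-Sim idea (not the authors' code).
# Assumptions: scene-graph attributes are a flat float vector per scene,
# and the graphics engine + feature network are differentiable stand-ins.
import torch
import torch.nn as nn

class SceneAttributeNet(nn.Module):
    """Learns to modify attributes sampled from a probabilistic scene grammar."""
    def __init__(self, attr_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(attr_dim, hidden), nn.ReLU(), nn.Linear(hidden, attr_dim)
        )

    def forward(self, attrs: torch.Tensor) -> torch.Tensor:
        # Predict a residual modification to the sampled attributes.
        return attrs + self.net(attrs)

def rbf_mmd(x: torch.Tensor, y: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    """RBF-kernel maximum mean discrepancy, a stand-in for the distribution gap."""
    def k(a, b):
        return torch.exp(-torch.cdist(a, b) ** 2 / (2 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

# Differentiable stand-ins for the renderer + feature extractor and real data.
render_and_featurize = nn.Linear(8, 16)   # hypothetical: attributes -> image features
real_features = torch.randn(256, 16)      # features of the target (real) data

model = SceneAttributeNet(attr_dim=8)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    # Sample raw attributes from the "grammar" (here: just a uniform prior).
    raw_attrs = torch.rand(256, 8)
    synth_features = render_and_featurize(model(raw_attrs))
    loss = rbf_mmd(synth_features, real_features)
    opt.zero_grad()
    loss.backward()
    opt.step()
```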
Announcing the 6th Fine-Grained Visual Categorization Workshop

🔗 Announcing the 6th Fine-Grained Visual Categorization Workshop
Posted by Christine Kaeser-Chen, Software Engineer and Serge Belongie, Visiting Faculty, Google AI In recent years, fine-grained visual ...
Oriol Vinyals: DeepMind AlphaStar, StarCraft, and Language | Artificial Intelligence Podcast
https://www.youtube.com/watch?v=Kedt2or9xlo

🎥 Oriol Vinyals: DeepMind AlphaStar, StarCraft, and Language | Artificial Intelligence Podcast
👁 2 views, 6361 sec.
Oriol Vinyals is a senior research scientist at Google DeepMind. Before that he was at Google Brain and Berkeley. His research has been cited over 39,000 times. He is one of the most brilliant and impactful minds in the field of deep learning. He is behind some of the biggest papers and ideas in AI, including sequence to sequence learning, audio generation, image captioning, neural machine translation, and reinforcement learning. He is a co-lead (with David Silver) of the AlphaStar project, creating an agen
🎥 Neural Networks and Python: Image Classification -- Part 2
👁 1 view, 818 sec.
General Description:
In this series of videos, we will be using the TensorFlow Python module to construct a neural network that classifies a given image of an article of clothing.

We will be obtaining image data from the Fashion MNIST dataset. The intent of these videos is to showcase the use of TensorFlow as well as to show a simple example of how to construct and use a neural network.

This video is part of a series on Machine Learning in Python. The link to the playlist may be accessed here.
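
For reference, a minimal sketch of the kind of Fashion MNIST classifier the video describes, using the Keras API bundled with TensorFlow. This is an assumed, generic example, not the video's exact code.

```python
# Minimal Fashion MNIST classifier with TensorFlow/Keras (illustrative sketch).
import tensorflow as tf

# Load the Fashion MNIST dataset bundled with Keras and scale pixels to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small fully connected network: flatten 28x28 images, one hidden layer,
# and a 10-way softmax over the clothing categories.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)
```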
Taming Recurrent Neural Networks for Better Summarization
http://www.abigailsee.com/2017/04/16/taming-rnns-for-better-summarization.html

🔗 Taming Recurrent Neural Networks for Better Summarization | Abigail See
This is a blog post about our latest paper, Get To The Point: Summarization with Pointer-Generator Networks, to appear at ACL 2017. The code is available here.
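
The paper's pointer-generator mechanism mixes a generation distribution over the vocabulary with a copy distribution over source tokens via a learned gate p_gen. A minimal sketch of that mixing step with hypothetical tensors (not the authors' code):

```python
# Sketch of the pointer-generator mixing step from "Get To The Point"
# (all tensors below are random placeholders for illustration).
import torch

batch, vocab_size, src_len = 2, 10, 5
vocab_dist = torch.softmax(torch.randn(batch, vocab_size), dim=-1)  # generator distribution
attn_dist = torch.softmax(torch.randn(batch, src_len), dim=-1)      # attention over source tokens
src_ids = torch.randint(0, vocab_size, (batch, src_len))            # source token ids
p_gen = torch.sigmoid(torch.randn(batch, 1))                        # learned generate/copy gate

# Final distribution: p_gen * P_vocab plus (1 - p_gen) * copy probabilities,
# scattered from source positions back into their vocabulary slots.
final_dist = p_gen * vocab_dist
final_dist = final_dist.scatter_add(1, src_ids, (1 - p_gen) * attn_dist)
```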
🎥 Exploring And Attacking Neural Networks With Activation Atlases
👁 1 view, 245 sec.
📝 The paper "Exploring Neural Networks with Activation Atlases" is available here:
https://distill.pub/2019/activation-atlas/

❤️ Pick up cool perks on our Patreon page: https://www.patreon.com/TwoMinutePapers

🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
313V, Alex Haro, Andrew Melnychuk, Angelos Evripiotis, Anthony Vdovitchenko, Brian Gilman, Bruno Brito, Bryan Learn, Christian Ahlin, Christoph Jadanowski, Claudio Fernandes, Dennis Abts, Eric Haddad, Eric M
🎥 Data Science Tutorial - NO EXP REQUIRED | Python - #grindreel #lambdaschool
👁 1 view, 858 sec.
🔥 Land the job! Get help with a resume and cover letter https://bit.ly/2CNoxTm
📚My Courses: https://grindreel.academy/
💻 Learn Code FREE for 2 months: https://bit.ly/2HXTU1o
Treehouse Discount: https://bit.ly/2CZDFNn | IT Certifications: https://bit.ly/2uSCgnz
Want to work at Google? Cheat Sheet: https://goo.gl/N56orD

Code Bootcamps I've worked with: 🏫
Lambda School: FREE until you get a job: https://lambda-school.sjv.io/josh

Support the channel! ❤️
https://www.patreon.com/joshuafluke
Donations: paypal.me
bentoML: One Model to Rule Them All

🔗 bentoML: One Model to Rule Them All
The machine learning community focuses too much on predictive performance. But machine learning models are always a small part of a complex system. This post discusses our obsession with finding the best model and emphasizes what we should do instead: Take a step back and see the bigger picture in which the machine learning model is embedded.
🎥 AI in 2040
👁 18 views, 781 sec.
What does the field of Artificial Intelligence look like in 2040? It's a really hard question to answer since there are still so many unanswered questions about the nature of reality and computing. In this episode, I'll make my best predictions about AI hardware, AI software, and the societal impact of AI in 2040. We'll cover quantum mechanics, neuromorphic computing, DNA storage, decentralized computing, basic income, and mind-body machines. Enjoy!

Code for this video:
https://github.com/llSourcell/quant