A Comprehensive Survey on Graph Neural Networks https://arxiv.org/abs/1901.00596
arXiv.org
A Comprehensive Survey on Graph Neural Networks
Deep learning has revolutionized many machine learning tasks in recent years, ranging from image classification and video processing to speech recognition and natural language understanding. The...
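The survey's central building block is the graph convolution. As a minimal illustration (this is a generic message-passing layer, not code from the survey; NumPy is an assumed dependency), one symmetric-normalized layer computes H' = ReLU(D^-1/2 (A+I) D^-1/2 H W):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)                     # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # D^-1/2
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)    # aggregate neighbors, transform, ReLU

# toy example: a 3-node path graph, 2-dim node features, 2 output units
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.eye(3, 2)        # arbitrary input features
W = np.ones((2, 2))     # arbitrary weights
out = gcn_layer(A, H, W)
print(out.shape)  # (3, 2)
```

Each node's new representation mixes its own features with those of its neighbors, which is the spatial view of graph convolution the survey uses to organize much of the literature.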
Forwarded from Deep Learning
NeurIPS 2018 Paper Summary and Categorization on Reinforcement Learning 👉🏻
Medium
NeurIPS 2018 Reinforcement Learning Summary
This is your one stop shop for everything RL at NeurIPS 2018
Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context -- SOTA on 5 datasets https://arxiv.org/abs/1901.02860
arXiv.org
Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context
Transformers have a potential of learning longer-term dependency, but are limited by a fixed-length context in the setting of language modeling. We propose a novel neural architecture...
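The "beyond a fixed-length context" idea is segment-level recurrence: hidden states from the previous segment are cached (with gradients stopped) and reused as extra keys and values for the current segment. A single-head NumPy sketch of that mechanism (names and shapes are my own, not the paper's code; the paper additionally uses relative positional encodings and causal masking, both omitted here):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend_with_memory(h_cur, mem, Wq, Wk, Wv):
    """Single-head attention whose keys/values span [cached memory; current segment].

    h_cur: (L, d) current-segment hidden states
    mem:   (M, d) hidden states cached from the previous segment, treated as
           constants (the stop-gradient in the paper; trivially so in NumPy)
    """
    ctx = np.concatenate([mem, h_cur], axis=0)  # extended context, (M+L, d)
    q = h_cur @ Wq                              # queries come only from the current segment
    k, v = ctx @ Wk, ctx @ Wv
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v                  # (L, d) attended output

rng = np.random.default_rng(0)
d, L, M = 8, 4, 4
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
mem = rng.normal(size=(M, d))   # previous segment's cached states
h = rng.normal(size=(L, d))     # current segment
out = attend_with_memory(h, mem, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Because each segment attends into the previous segment's cache, and that cache itself attended into the one before, the effective context grows with depth rather than being cut off at the segment boundary.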
Panoptic Feature Pyramid Networks https://arxiv.org/abs/1901.02446
arXiv.org
Panoptic Feature Pyramid Networks
The recently introduced panoptic segmentation task has renewed our community's interest in unifying the tasks of instance segmentation (for thing classes) and semantic segmentation (for stuff...
Auto-DeepLab: Hierarchical Neural Architecture Search for Semantic Image Segmentation https://arxiv.org/abs/1901.02985
arXiv.org
Auto-DeepLab: Hierarchical Neural Architecture Search for Semantic...
Recently, Neural Architecture Search (NAS) has successfully identified neural network architectures that exceed human designed ones on large-scale image classification. In this paper, we study NAS...
Forwarded from Valery Kirichenko
YouTube
The most unexpected answer to a counting puzzle
Solution: https://youtu.be/6dTyOl1fmDo
Light-based solution: https://youtu.be/brU5yLm9DZM
Help fund future projects: https://www.patreon.com/3blue1brown
An equally valuable form of support is to simply share some of the videos.
Special thanks to these supporters:…

RetinaMask: Learning to predict masks improves state-of-the-art single-shot detection for free https://arxiv.org/abs/1901.03353
arXiv.org
RetinaMask: Learning to predict masks improves state-of-the-art...
Recently, two-stage detectors have surged ahead of single-shot detectors in the accuracy-vs-speed trade-off. Nevertheless, single-shot detectors are immensely popular in embedded vision...
Forwarded from форель разбивает лоб
Now this is really cool: a census of the world's physicists!
Nature Publishing Group has launched a new journal, Nature Reviews Physics, and for the first issue they decided to put out something like this, to get people discussing it on Twitter. And there is indeed plenty to discuss, and it's all packaged perfectly (too perfectly... sometimes you get the impression that academic stars like Barabási keep a couple of infographics specialists on staff full-time)
Over the past decades, the diversity of areas explored by physicists has exploded, encompassing new topics from biophysics and chemical physics to network science. However, it is unclear how these new subfields emerged from the traditional subject areas and how physicists explore them. To map out the evolution of physics subfields, here, we take an intellectual census of physics by studying physicists’ careers. We use a large-scale publication data set, identify the subfields of 135,877 physicists and quantify their heterogeneous birth, growth and migration patterns among research areas. We find that the majority of physicists began their careers in only three subfields, branching out to other areas at later career stages, with different rates and transition times. Furthermore, we analyse the productivity, impact and team sizes across different subfields, finding drastic changes attributable to the recent rise in large-scale collaborations. This detailed, longitudinal census of physics can inform resource allocation policies and provide students, editors and scientists with a broader view of the field’s internal dynamics.
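In its simplest form, the migration analysis described above reduces to counting subfield-to-subfield transitions along individual careers. A toy illustration (the careers and field names below are invented for demonstration; the actual study works from publication records of 135,877 physicists):

```python
from collections import Counter

# invented toy careers: each is the ordered list of subfields a physicist published in
careers = [
    ["condensed matter", "biophysics"],
    ["particle physics", "particle physics", "network science"],
    ["condensed matter", "condensed matter"],
    ["astrophysics", "network science"],
]

# count transitions between consecutive (distinct) career stages
migrations = Counter()
for career in careers:
    for src, dst in zip(career, career[1:]):
        if src != dst:                  # ignore staying in the same subfield
            migrations[(src, dst)] += 1

for (src, dst), n in migrations.most_common():
    print(f"{src} -> {dst}: {n}")
```

Aggregated over the full data set, a table like this gives the migration-flow matrix between subfields; birth and growth patterns come from where careers start and how the per-subfield counts change over time.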
https://www.nature.com/articles/s42254-018-0005-3
Nature
Taking census of physics
Nature Reviews Physics - An analysis of the number of physicists and their career paths reveals the changing landscape of the physics subdisciplines, highlighting the connections between different...
Eliminating all bad Local Minima from Loss Landscapes without even adding an Extra Unit https://arxiv.org/abs/1901.03909
arXiv.org
Eliminating all bad Local Minima from Loss Landscapes without even...
Recent work has noted that all bad local minima can be removed from neural network loss landscapes, by adding a single unit with a particular parameterization. We show that the core technique from...