Neural Networks | Нейронные сети
Everything about machine learning

For all questions - @notxxx1

Colleagues, my video contains only one eye, and I need to track the pupil, so there is just a single object. So far I have built it on SqueezeNet, AlexNet, and YOLO, but those are all older models. Which deep learning model would be best for this task?
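Whichever detector is chosen, for a single dark pupil in a cropped eye image a classical baseline is worth comparing against. A minimal sketch (not from the thread; the threshold, function name, and synthetic image are illustrative assumptions), estimating the pupil center as the centroid of dark pixels:

```python
import numpy as np

def pupil_centroid(gray, threshold=50):
    """Estimate the pupil center as the centroid of dark pixels.

    gray: 2D uint8 array (grayscale eye crop); the pupil is assumed
    to be the darkest blob in the frame.
    """
    mask = gray < threshold          # dark pixels -> candidate pupil
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None                  # no dark region found
    return float(xs.mean()), float(ys.mean())

# Synthetic eye crop: bright background, dark "pupil" centered at (40, 25)
img = np.full((60, 80), 200, dtype=np.uint8)
img[20:31, 35:46] = 10
print(pupil_centroid(img))           # -> (40.0, 25.0)
```

A baseline like this also provides labels-by-heuristic for fine-tuning a small CNN regressor if the classical method is too brittle under varying lighting.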
​We're standardizing OpenAI's deep learning framework on PyTorch to increase our research productivity at scale on GPUs (and have just released a PyTorch version of Spinning Up in Deep RL): https://openai.com/blog/openai-pytorch/

🔗 OpenAI→PyTorch
We are standardizing OpenAI’s deep learning framework on PyTorch. In the past, we implemented projects in many frameworks depending on their relative strengths. We’ve now chosen to standardize to make it easier for our team to create and share optimized implementations of our models. As part of this
Repository of the paper "RatLesNetv2: A Fully Convolutional Network for Rodent Brain Lesion Segmentation".

https://github.com/jmlipman/RatLesNetv2

🔗 jmlipman/RatLesNetv2
RatLesNetv2 is a convolutional neural network for rodent brain lesion segmentation. - jmlipman/RatLesNetv2
​Bringing Stories Alive: Generating Interactive Fiction Worlds

Paper: https://arxiv.org/abs/2001.10161

Code: https://github.com/rajammanabrolu/WorldGeneration/

Our Telegram channel - tglink.me/ai_machinelearning_big_data

🔗 rajammanabrolu/WorldGeneration
Generating Interactive Fiction worlds from story plots - rajammanabrolu/WorldGeneration
A network of science: 150 years of Nature papers
https://www.youtube.com/watch?v=GW4s58u8PZo&feature=youtu.be

🎥 A network of science: 150 years of Nature papers
👁 1 view · 309 sec.
Science is a network, each paper linking those that came before with those that followed. In an exclusive analysis, researchers have delved into Nature's part of that network. We explore their results, taking you on a tour of 150 years of interconnected, interdisciplinary research, as represented by Nature's publication record.

Explore the network yourself: https://www.nature.com/articles/d41586-019-03165-4
Read more: https://www.nature.com/collections/eidahgdici/
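The video treats science as a network in which each paper links to those that came before. As a toy illustration (hypothetical papers, not Nature's actual data), counting in-degree over a citation adjacency map recovers a "most cited" ranking:

```python
from collections import Counter

# Hypothetical mini citation network: paper -> list of papers it cites
cites = {
    "P1": [],
    "P2": ["P1"],
    "P3": ["P1", "P2"],
    "P4": ["P1", "P3"],
}

# In-degree = how many times each paper is cited by later work
in_degree = Counter(ref for refs in cites.values() for ref in refs)
most_cited, n = in_degree.most_common(1)[0]
print(most_cited, n)   # -> P1 3
```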

🎥 Bayesian Deep Learning
👁 1 view · 5876 sec.
Bayesian Deep Learning



Abstract:
While deep learning has been revolutionary for machine learning, most modern deep learning models cannot represent their uncertainty nor take advantage of the well studied tools of probability theory. This has started to change following recent developments of tools and techniques combining Bayesian approaches with deep learning. The intersection of the two fields has received great interest from the community over the past few years, with the introduction of new deep lea
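One widely used technique at this intersection is Monte Carlo dropout, which approximates Bayesian predictive uncertainty by averaging stochastic forward passes. A minimal NumPy sketch (our illustration, not necessarily what the talk covers; the toy weights and dropout rate are made-up assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny fixed "trained" linear model: y = w @ x
w = np.array([0.5, -1.0, 2.0])
x = np.array([1.0, 2.0, 3.0])

def mc_dropout_predict(x, w, p=0.5, T=1000):
    """Monte Carlo dropout: run T stochastic forward passes and
    summarize the predictive distribution by its mean and std."""
    preds = []
    for _ in range(T):
        mask = rng.random(w.shape) > p            # drop each weight w.p. p
        preds.append((w * mask / (1 - p)) @ x)    # inverted-dropout scaling
    preds = np.array(preds)
    return preds.mean(), preds.std()

mean, std = mc_dropout_predict(x, w)
print(f"prediction {mean:.2f} +/- {std:.2f}")
```

The mean hovers around the deterministic output (w @ x = 4.5), while the std is the model's self-reported uncertainty, which a plain deterministic network cannot provide.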
🎥 Object Detection with Deep Learning - Andreu Girbau - UPC TelecomBCN Barcelona 2019
👁 1 view · 1254 sec.
https://telecombcn-dl.github.io/2019-dlcv/

Deep learning technologies are at the core of the current revolution in artificial intelligence for multimedia data analysis. The convergence of large-scale annotated datasets and affordable GPU hardware has allowed the training of neural networks for data analysis tasks which were previously addressed with hand-crafted features. Architectures such as convolutional neural networks, recurrent neural networks and Q-nets for reinforcement learning have shaped a brand
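A metric underlying virtually every object-detection pipeline of this kind is intersection-over-union (IoU) between predicted and ground-truth boxes. A short sketch (our illustration, not lecture material):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.1429
```

IoU thresholds (commonly 0.5) decide whether a detection counts as a true positive during evaluation and during non-maximum suppression.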
🎥 Face Recognition with Deep Learning - Ramon Morros - UPC TelecomBCN Barcelona 2019
👁 1 view · 1921 sec.
Deep learning technologies are at the core of the current revolution in artificial intelligence for multimedia data analysis. The convergence of large-scale annotated datasets and affordable GPU hardware has allowed the training of neural networks for data analysis tasks which were previously addressed with hand-crafted features. Architectures such as convolutional neural networks, recurrent neural networks and Q-nets for reinforcement learning have shaped a brand new scenario in signal processing. This cou
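Face recognition systems of this kind typically map each face to an embedding vector and verify identity by cosine similarity against a tuned threshold. A hedged sketch (the 4-D embeddings and the 0.6 threshold are placeholder assumptions; real systems use embeddings of 128-512 dimensions):

```python
import numpy as np

def cosine_sim(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_person(emb1, emb2, threshold=0.6):
    """Verification: declare a match when embedding similarity
    exceeds a tuned threshold (0.6 is an arbitrary placeholder)."""
    return cosine_sim(emb1, emb2) >= threshold

# Toy "embeddings" standing in for CNN outputs
anchor    = [0.9, 0.1, 0.3, 0.2]
same      = [0.8, 0.2, 0.35, 0.1]
different = [-0.5, 0.9, -0.1, 0.4]
print(same_person(anchor, same), same_person(anchor, different))  # True False
```

The threshold trades off false accepts against false rejects and is normally calibrated on a held-out verification set.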
📚 New book by Nassim Taleb

Statistical Consequences of Fat Tails: Real World Preasymptotics, Epistemology, and Applications

https://arxiv.org/abs/2001.10488

🔗 Statistical Consequences of Fat Tails: Real World Preasymptotics, Epistemology, and Applications
The book investigates the misapplication of conventional statistical techniques to fat-tailed distributions and looks for remedies, where possible. Switching from thin-tailed to fat-tailed distributions requires more than "changing the color of the dress". Traditional asymptotics deal mainly with either n=1 or n=∞, and the real world is in between, under the "laws of the medium numbers", which vary widely across specific distributions. Both the law of large numbers and the generalized central limit mechanisms operate in highly idiosyncratic ways outside the standard Gaussian or Levy-Stable basins of convergence. A few examples:
+ The sample mean is rarely in line with the population mean, with effect on "naive empiricism", but can sometimes be estimated via parametric methods.
+ The "empirical distribution" is rarely empirical.
+ Parameter uncertainty has compounding effects on statistical metrics.
+ Dimension reduction (principal components)


📝 Statistical Consequences of Fat Tails- Real World Preasymptotics, Epistemology, and Applications.pdf - 💾28 601 829
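The first claim, that under fat tails the sample mean is rarely in line with the population mean, is easy to see in a small simulation (our sketch, not from the book; the distributions and sample sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(42)
n, reps = 1000, 200

# Sample means under a thin tail (Gaussian) vs a fat tail:
# Pareto with alpha = 1.5 has a finite mean but infinite variance.
gauss_means = rng.normal(0.0, 1.0, (reps, n)).mean(axis=1)
alpha = 1.5
pareto_means = (rng.pareto(alpha, (reps, n)) + 1).mean(axis=1)  # pop. mean = 3

print("Gaussian spread of sample means:", gauss_means.std())
print("Pareto spread of sample means:  ", pareto_means.std())
```

The Gaussian sample means cluster tightly around 0 (spread ~1/sqrt(n)), while the Pareto sample means scatter widely around the population mean of alpha/(alpha-1) = 3, illustrating how "naive empiricism" fails outside the Gaussian basin.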