Forwarded from Data Science by ODS.ai 🦜
The Code for Facial Identity in the Primate Brain
This paper showed that facial images can be reconstructed with a simple linear model from the responses of only ~200 visual neurons recorded in a monkey. The approach relies on "face cells", which encode how much a face differs from the average face along particular directions ("eigenface dimensions").
https://www.sciencedirect.com/science/article/pii/S009286741730538X
#cv #dl
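The decoding idea can be sketched on synthetic data: faces live in a low-dimensional "face space" (mean face plus eigenface axes), each toy "face cell" responds as a noisy linear projection of the face-space coordinates, and a plain least-squares decoder maps responses back to coordinates and hence to an image. All sizes and names below are illustrative, not the paper's actual data or code.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pixels = 64 * 64   # flattened face image
n_dims = 50          # face-space (eigenface) dimensions
n_cells = 200        # recorded face cells
n_faces = 1000       # training faces

# Synthetic face space: a mean face plus orthonormal eigenface axes
mean_face = rng.normal(size=n_pixels)
eigenfaces, _ = np.linalg.qr(rng.normal(size=(n_pixels, n_dims)))

# Each face = mean + eigenfaces @ coords
coords = rng.normal(size=(n_faces, n_dims))

# Toy face cells: each fires as a noisy linear projection of the coords
W = rng.normal(size=(n_dims, n_cells))
responses = coords @ W + 0.1 * rng.normal(size=(n_faces, n_cells))

# Linear decoder: least squares from responses back to face-space coords
decoder, *_ = np.linalg.lstsq(responses, coords, rcond=None)

# Reconstruct a held-out face from its (noise-free) cell responses
test_coords = rng.normal(size=(1, n_dims))
test_resp = test_coords @ W
decoded_coords = test_resp @ decoder
recon = mean_face + decoded_coords @ eigenfaces.T
true = mean_face + test_coords @ eigenfaces.T

corr = np.corrcoef(decoded_coords.ravel(), test_coords.ravel())[0, 1]
print(f"decoded-coordinate correlation: {corr:.3f}")
```

With enough training faces and low noise, the decoded coordinates correlate almost perfectly with the true ones, which is the core of the paper's claim: a linear readout from a few hundred cells suffices.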
Forwarded from Data Science by ODS.ai 🦜
Weakly supervised mitosis detection in breast histopathology images using concentric loss
Weakly supervised mitosis detection in breast histopathology images: using only one-click (centroid) annotations, the method achieves the best performance on three challenging datasets.
Link: https://www.sciencedirect.com/science/article/abs/pii/S1361841519300118?dgcid=author
#healthcare #medical #CV #cancer #DL
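The general idea behind this style of weak supervision can be sketched as follows: a one-click centroid annotation is expanded into concentric regions, an inner disk treated as positive, a surrounding ring excluded from the loss (since the true cell boundary is unknown), and everything else as background. The radii and function names here are illustrative assumptions, not the paper's actual values or code.

```python
import numpy as np

def concentric_label_map(shape, clicks, r_pos=8, r_ignore=16):
    """Expand one-click centroid annotations into a weak label map.

    Returns labels (1 = mitosis, 0 = background) and a weight mask
    (0 inside the uncertain ring, 1 elsewhere) so that the ring
    between r_pos and r_ignore is excluded from the training loss.
    Radii are hypothetical, chosen for illustration only.
    """
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    labels = np.zeros(shape, dtype=np.float32)
    weights = np.ones(shape, dtype=np.float32)
    for cy, cx in clicks:
        d = np.hypot(yy - cy, xx - cx)
        labels[d <= r_pos] = 1.0
        weights[(d > r_pos) & (d <= r_ignore)] = 0.0
    weights[labels == 1.0] = 1.0  # positives always contribute to the loss
    return labels, weights

labels, weights = concentric_label_map((64, 64), [(32, 32)])
print(labels.sum(), (weights == 0).sum())
```

A per-pixel loss (e.g. weighted cross-entropy) multiplied by `weights` then trains a detector without any pixel-level boundary annotation.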
Forwarded from Data Science by ODS.ai 🦜
(Re)Discovering Protein Structure and Function Through Language Modeling
A Transformer trained solely on unsupervised language modeling develops attention patterns that recover high-level structural (folding) and functional properties of proteins!
Why this is important: traditional protein-structure modelling requires lots of computational power, and this might be a key to more efficient structure modelling. Protein structure determines function, and understanding function speeds up drug research and the study of disease mechanisms.
Blog: https://blog.einstein.ai/provis/
Paper: https://arxiv.org/abs/2006.15222
Code: https://github.com/salesforce/provis
#DL #NLU #proteinmodelling #bio #biolearning #insilico
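The kind of analysis described above can be sketched on toy data: take an attention matrix over residues and measure how often each residue's most-attended positions coincide with structural contacts. The data and the `contact_precision` helper below are synthetic illustrations, not the ProVis code.

```python
import numpy as np

rng = np.random.default_rng(1)
L = 30  # sequence length (number of residues)

# Toy symmetric contact map: pairs of residues close in 3D space
contacts = rng.random((L, L)) < 0.1
contacts = np.triu(contacts, k=2)   # ignore trivial sequence neighbours
contacts = contacts | contacts.T

# Toy attention matrix for one head: each row sums to 1
attn = rng.random((L, L))
attn = attn / attn.sum(axis=1, keepdims=True)

def contact_precision(attn, contacts, top_k=2):
    """Fraction of each residue's top-k attended positions that are true contacts."""
    hits, total = 0, 0
    for i in range(len(attn)):
        top = np.argsort(attn[i])[-top_k:]
        hits += contacts[i, top].sum()
        total += top_k
    return hits / total

precision = contact_precision(attn, contacts)
print(f"attention-to-contact precision: {precision:.2f}")
```

For random attention this precision stays near the contact density; the paper's finding is that for specific heads of a trained protein language model it is substantially higher.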