📄 Kindermann, Ross; Snell, J. Laurie (1980).
🔹 Markov Random Fields and Their Applications 🔹
(PDF): http://www.cmap.polytechnique.fr/~rama/ehess/mrfbook.pdf
🗞 Review:
"Statistical physics of inference: Thresholds and algorithms"
Lenka Zdeborová, Florent Krzakala
(Submitted on 8 Nov 2015 (v1), last revised 28 Jul 2016 (this version, v4))
🔗 https://arxiv.org/pdf/1511.02476
Many questions of fundamental interest in today's science can be formulated as inference problems: some partial, or noisy, observations are performed over a set of variables, and the goal is to recover, or infer, the values of the variables based on the indirect information contained in the measurements. For such problems, the central scientific questions are: Under what conditions is the information contained in the measurements sufficient for a satisfactory inference to be possible? What are the most efficient algorithms for this task? A growing body of work has shown that we can often understand and locate these fundamental barriers by thinking of them as phase transitions in the sense of statistical physics. Moreover, it turns out that we can use the physical insight gained to develop promising new algorithms. The connection between inference and statistical physics is currently witnessing an impressive renaissance, and we review here the current state of the art, with a pedagogical focus on the Ising model which, formulated as an inference problem, we call the planted spin glass. In terms of applications, we review two classes of problems: (i) inference of clusters on graphs and networks, with community detection as a special case, and (ii) estimating a signal from its noisy linear measurements, with compressed sensing as a case of sparse estimation. Our goal is to provide a pedagogical review for researchers in physics and other fields interested in this fascinating topic.
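Below is a minimal, illustrative sketch (not from the paper) of the planted-spin-glass construction the abstract refers to: a configuration s* is hidden in noisy couplings J_ij = ±s*_i s*_j on a sparse random graph, and one tries to recover it by Metropolis sampling of the posterior, which at the Nishimori temperature coincides with the Boltzmann distribution of the corresponding Ising spin glass. All parameter values (N, c, p) are arbitrary illustrative choices.

```python
# Minimal sketch of a planted spin glass (illustrative, not the paper's code).
import random
import math

N, c, p = 200, 8.0, 0.85             # spins, mean degree, coupling "honesty"
beta = 0.5 * math.log(p / (1 - p))   # Nishimori condition: posterior = Boltzmann

random.seed(0)
s_star = [random.choice((-1, 1)) for _ in range(N)]   # planted configuration

# Sparse Erdos-Renyi graph; each edge coupling agrees with s*_i s*_j w.p. p.
edges = []
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < c / N:
            sign = 1 if random.random() < p else -1
            edges.append((i, j, sign * s_star[i] * s_star[j]))

neighbors = [[] for _ in range(N)]
for i, j, J in edges:
    neighbors[i].append((j, J))
    neighbors[j].append((i, J))

# Metropolis sampling of P(s) proportional to exp(beta * sum_ij J_ij s_i s_j).
s = [random.choice((-1, 1)) for _ in range(N)]
for sweep in range(300):
    for i in range(N):
        dE = 2 * s[i] * sum(J * s[j] for j, J in neighbors[i])  # cost of flipping s_i
        if dE <= 0 or random.random() < math.exp(-beta * dE):
            s[i] = -s[i]

overlap = abs(sum(a * b for a, b in zip(s, s_star))) / N
print(f"overlap with planted configuration: {overlap:.2f}")
```

With p close to 1 the recovered overlap approaches 1, while as p approaches 1/2 the couplings carry no information about s* and the overlap drops toward 0, which is the kind of threshold behaviour the review analyses.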
"Statistical physics of inference: Thresholds and algorithms"
Lenka Zdeborová, Florent Krzakala
(Submitted on 8 Nov 2015 (v1), last revised 28 Jul 2016 (this version, v4))
🔗 https://arxiv.org/pdf/1511.02476
Many questions of fundamental interest in todays science can be formulated as inference problems: Some partial, or noisy, observations are performed over a set of variables and the goal is to recover, or infer, the values of the variables based on the indirect information contained in the measurements. For such problems, the central scientific questions are: Under what conditions is the information contained in the measurements sufficient for a satisfactory inference to be possible? What are the most efficient algorithms for this task? A growing body of work has shown that often we can understand and locate these fundamental barriers by thinking of them as phase transitions in the sense of statistical physics. Moreover, it turned out that we can use the gained physical insight to develop new promising algorithms. Connection between inference and statistical physics is currently witnessing an impressive renaissance and we review here the current state-of-the-art, with a pedagogical focus on the Ising model which formulated as an inference problem we call the planted spin glass. In terms of applications we review two classes of problems: (i) inference of clusters on graphs and networks, with community detection as a special case and (ii) estimating a signal from its noisy linear measurements, with compressed sensing as a case of sparse estimation. Our goal is to provide a pedagogical review for researchers in physics and other fields interested in this fascinating topic.
🗞 Review:
"Statistical Physics of Hard Optimization Problems"
Lenka Zdeborová - PhD thesis
(Submitted on 25 Jun 2008)
🔗 https://arxiv.org/pdf/0806.4112
Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as in biology and the social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the NP-complete class are particularly difficult: it is believed that the number of operations required to minimize the cost function is, in the most difficult cases, exponential in the system size. However, even in an NP-complete problem the instances arising in practice might, in fact, be easy to solve. The principal question we address in this thesis is: How do we recognize whether an NP-complete constraint satisfaction problem is typically hard, and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we name "locked" constraint satisfaction, where the statistical description is easily solvable, but which from the algorithmic point of view are even more challenging than canonical satisfiability.
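As a toy illustration (not taken from the thesis) of the random-satisfiability ensemble it studies, the snippet below draws random 3-SAT formulas at clause density alpha = M/N and checks them by brute force; the fraction of satisfiable instances falls off sharply around alpha ≈ 4.27, the threshold located by the cavity method in the large-N limit. Sizes and instance counts are illustrative.

```python
# Minimal sketch of the random 3-SAT satisfiability transition (illustrative).
import random
from itertools import product

def random_3sat(n_vars, n_clauses, rng):
    """Each clause: 3 distinct variables, each negated with probability 1/2."""
    clauses = []
    for _ in range(n_clauses):
        vars_ = rng.sample(range(n_vars), 3)
        clauses.append([(v, rng.random() < 0.5) for v in vars_])  # (variable, is_negated)
    return clauses

def satisfiable(n_vars, clauses):
    """Brute force over all 2^n assignments (fine for small n)."""
    for assignment in product([False, True], repeat=n_vars):
        if all(any(assignment[v] != neg for v, neg in clause) for clause in clauses):
            return True
    return False

rng = random.Random(1)
n, trials = 14, 25
for alpha in (3.0, 4.0, 4.3, 5.0):
    m = int(alpha * n)
    sat = sum(satisfiable(n, random_3sat(n, m, rng)) for _ in range(trials))
    print(f"alpha = {alpha:.1f}: {sat}/{trials} instances satisfiable")
```

At such small sizes the transition is smeared out, but the qualitative drop in the satisfiable fraction with increasing clause density is already visible.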
"Statistical Physics of Hard Optimization Problems"
Lenka Zdeborová - PhD thesis
(Submitted on 25 Jun 2008)
🔗 https://arxiv.org/pdf/0806.4112
Optimization is fundamental in many areas of science, from computer science and information theory to engineering and statistical physics, as well as to biology or social sciences. It typically involves a large number of variables and a cost function depending on these variables. Optimization problems in the NP-complete class are particularly difficult, it is believed that the number of operations required to minimize the cost function is in the most difficult cases exponential in the system size. However, even in an NP-complete problem the practically arising instances might, in fact, be easy to solve. The principal question we address in this thesis is: How to recognize if an NP-complete constraint satisfaction problem is typically hard and what are the main reasons for this? We adopt approaches from the statistical physics of disordered systems, in particular the cavity method developed originally to describe glassy systems. We describe new properties of the space of solutions in two of the most studied constraint satisfaction problems - random satisfiability and random graph coloring. We suggest a relation between the existence of the so-called frozen variables and the algorithmic hardness of a problem. Based on these insights, we introduce a new class of problems which we named "locked" constraint satisfaction, where the statistical description is easily solvable, but from the algorithmic point of view they are even more challenging than the canonical satisfiability.
🌀 What physics can tell us about inference?
Cristopher Moore, Santa Fe Institute
🎞 http://www.savoirs.ens.fr/expose.php?id=2696
This colloquium is organized around data sciences in a broad sense, with the goal of bringing together researchers with diverse backgrounds (including mathematics, computer science, physics, chemistry and neuroscience) but a common interest in dealing with large scale or high dimensional data.
There is a deep analogy between statistical inference and statistical physics; I will give a friendly introduction to both of these fields. I will then discuss phase transitions in two problems of interest to a broad range of data sciences: community detection in social and biological networks, and clustering of sparse high-dimensional data. In both cases, if our data becomes too sparse or too noisy, it suddenly becomes impossible to find the underlying pattern, or even to tell if there is one. Physics helps us both locate these phase transitions and design optimal algorithms that succeed all the way up to this point. Along the way, I will visit ideas from computational complexity, random graphs, random matrices, and spin glass theory.
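A minimal sketch of the community-detection setting mentioned above, assuming a symmetric two-group stochastic block model: nodes in the same group are connected with probability c_in/N, nodes in different groups with c_out/N, and the communities are guessed from the sign of the second eigenvector of the adjacency matrix. This plain spectral method is only a stand-in; on very sparse graphs it fails before the detectability threshold, which is where methods such as belief propagation or non-backtracking spectra are needed. Parameter values are illustrative.

```python
# Minimal sketch of spectral community detection in a two-group SBM (illustrative).
import numpy as np

rng = np.random.default_rng(0)
N, c = 1000, 10.0                        # nodes, average degree

def sbm_overlap(c_in, c_out):
    labels = np.repeat([1, -1], N // 2)  # planted communities
    same = np.equal.outer(labels, labels)
    P = np.where(same, c_in / N, c_out / N)
    A = (rng.random((N, N)) < P).astype(float)
    A = np.triu(A, 1)
    A = A + A.T                          # symmetric adjacency, no self-loops
    # The second-largest eigenvector of A carries the community signal.
    vals, vecs = np.linalg.eigh(A)
    guess = np.sign(vecs[:, -2])
    return abs(np.mean(guess * labels))

for eps in (0.2, 0.5, 0.8):              # ratio c_out / c_in
    c_in = 2 * c / (1 + eps)             # keeps the average degree fixed at c
    c_out = eps * c_in
    print(f"c_in={c_in:.1f}, c_out={c_out:.1f}: overlap = {sbm_overlap(c_in, c_out):.2f}")
```

As c_out approaches c_in the overlap with the planted groups falls toward zero, which is the sudden loss of signal the talk describes.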
🔘 One-week summer school at bachelor level
Mathematical Modelling, Nonlinear Dynamics, Stochastic and Complex Systems
August 27 - September 2, 2017
Rostock, Germany
http://www.math.uni-rostock.de/complexsystems/
🔆 "Complex Systems Studies" is a graduate-level channel aiming to discuss all kinds of stuff related to the field of Complex Systems.
✔️ Our purpose is to be up-to-date, precise and international.
➡️ https://xn--r1a.website/ComplexSys
What's up in Complexity Science?!
Check out here:
@ComplexSys
#complexity #complex_systems #networks #network_science
📨 Contact us: @carimi
Dynamic scaling in natural swarms
🗞 Inferring Structural Characteristics of Networks with Strong and Weak Ties from Fixed-Choice Surveys
Naghmeh Momeni, Michael Rabbat
🔗 https://arxiv.org/pdf/1706.07828
📌 ABSTRACT
Knowing the structure of an offline social network facilitates a variety of analyses, including studying the rate at which infectious diseases may spread and identifying a subset of actors to immunize in order to reduce, as much as possible, the rate of spread. Offline social network topologies are typically estimated by surveying actors and asking them to list their neighbours. While identifying close friends and family (i.e., strong ties) can typically be done reliably, listing all of one's acquaintances (i.e., weak ties) is subject to error due to respondent fatigue. This issue is commonly circumvented through the use of so-called "fixed choice" surveys where respondents are asked to name a fixed, small number of their weak ties (e.g., two or ten). Of course, the resulting crude observed network will omit many ties, and using this crude network to infer properties of the network, such as its degree distribution or clustering coefficient, will lead to biased estimates. This paper develops estimators, based on the method of moments, for a number of network characteristics including those related to the first and second moments of the degree distribution as well as the network size, using fixed-choice survey data. Experiments with simulated data illustrate that the proposed estimators perform well across a variety of network topologies and measurement scenarios, and the resulting estimates are significantly more accurate than those obtained directly using the crude observed network, which are commonly used in the literature. We also describe a variation of the Jackknife procedure that can be used to obtain an estimate of the estimator variance.
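A small simulation sketch (not the paper's method-of-moments estimators) of the bias described above, assuming an Erdős–Rényi network and a survey in which each respondent names at most k of their ties: the crude observed mean degree systematically underestimates the true mean degree. Network size, degree, and cap are illustrative choices.

```python
# Minimal sketch of the fixed-choice survey bias (illustrative only).
import random

random.seed(0)
N, mean_deg, k = 2000, 12.0, 5          # nodes, true mean degree, survey cap

# Erdos-Renyi network as adjacency sets.
adj = [set() for _ in range(N)]
p = mean_deg / (N - 1)
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < p:
            adj[i].add(j)
            adj[j].add(i)

# Fixed-choice survey: each node reports at most k randomly chosen neighbours;
# an observed tie is kept if either endpoint reports it.
observed = [set() for _ in range(N)]
for i in range(N):
    reported = random.sample(sorted(adj[i]), min(k, len(adj[i])))
    for j in reported:
        observed[i].add(j)
        observed[j].add(i)

true_mean = sum(len(a) for a in adj) / N
crude_mean = sum(len(o) for o in observed) / N
print(f"true mean degree:  {true_mean:.2f}")
print(f"crude mean degree: {crude_mean:.2f}  (biased low)")
```

Quantities computed directly from the crude observed network inherit this downward bias, which is what motivates the corrected estimators proposed in the paper.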