The “rich get richer” (Matthew’s Talents) effect on bibliometric indicators.

The rich-get-richer or success-breeds-success effect, also called the Matthew principle from the parable of the Talents in Matthew 25:14-30, has been invoked many times in the sociology of science to justify the highly skewed distributions of bibliometric indicators measuring the scientific production of scholars. The basic underlying idea is that if you have more, it is easier to gain more. This is a consequence of “the process of allocation of rewards to scientists for their contributions” (recognition), “which in turn affects the flow of ideas and findings through the communication networks of science”, generating a reputational effect.

Here, we propose a general explanation of the observed evidence by developing a straightforward model based on the following simple assumptions: (1) the materialist principle of the natural equality of human intelligence, (2) the success-breeds-success effect, which can be traced back to the Gospel parables of the Talents (Matthew) and the Minas (Luke), and (3) the recognition and reputation mechanism.
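
As a rough illustration of how these assumptions can work together, the sketch below simulates a toy version of the mechanism (our own minimal formulation for this post, not the model of the paper): every scholar starts from the same level, and accumulated output then grows by random multiplicative factors, so that gains are proportional to what has already been gained. Proportionate growth of this kind is well known to produce right-skewed, lognormal-like distributions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_scholars, n_years = 10_000, 20

output = np.ones(n_scholars)              # equal starting point for everyone
for _ in range(n_years):
    # gains proportional to what has already been accumulated
    output *= rng.lognormal(mean=0.0, sigma=0.3, size=n_scholars)

print(f"mean = {output.mean():.2f}, median = {np.median(output):.2f}")
# mean well above median: a strongly right-skewed distribution, as observed
# for bibliometric indicators
```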

Do social sciences and humanities behave like life and hard sciences?

The quantitative evaluation of the Social Sciences and Humanities (SSH) and the investigation of the similarities between SSH and the Life and Hard Sciences (LHS) represent the forefront of scientometrics research. We analyze the scientific production of the universe of Italian academic scholars over the 10-year period 2002–2012, from a national database built by the Italian National Agency for the Evaluation of Universities and Research Institutes. Here we demonstrate that all Italian scholars of SSH and LHS are equal as far as their publishing habits are concerned: they share the same general law, which is lognormal. At the same time, however, they are different: when we measure their scientific production with the different indicators required by Italian law, after eliminating the “silent” scholars, we obtain different scaling values, a proxy of their productivity rates. Our findings may be useful to further develop indirect quali-quantitative comparative analyses across heterogeneous disciplines and, more broadly, to investigate the generative mechanisms behind the observed empirical regularities.
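
For readers who want to reproduce this kind of check, here is a hedged sketch of how the lognormal law and its scaling parameters could be estimated with scipy; the `indicator` array below is a synthetic stand-in for the real per-scholar values (with the “silent”, zero-output scholars already removed).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
indicator = rng.lognormal(mean=1.2, sigma=0.9, size=5000)   # synthetic stand-in data

shape, loc, scale = stats.lognorm.fit(indicator, floc=0)    # location fixed at zero
mu, sigma = np.log(scale), shape                            # parameters of the underlying normal
print(f"mu = {mu:.3f}, sigma = {sigma:.3f}")

# Kolmogorov-Smirnov check of the fitted law
print(stats.kstest(indicator, "lognorm", args=(shape, loc, scale)))
```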

Analytical description of the transverse Anderson localization of light

Anderson localization is a sophisticated phenomenon which is still not completely understood. In a recent paper, we propose a comprehensive mean-field theory (a combination of an effective-medium theory for transverse disorder with the self-consistent theory of localization) which covers the relevant aspects of the phenomenon in the transverse case, that is, for light propagating in a system that is invariant along the propagation direction and disordered in the transverse plane.

In particular, we explain the focusing mechanism leading to the establishment of narrow transparent channels along a disordered optical fiber.
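
The mean-field theory itself is analytical, but the setting it addresses is easy to reproduce numerically. The sketch below is a minimal split-step beam-propagation toy model of transverse localization (our own illustration, not the theory of the paper): a paraxial beam propagates along z through a refractive-index profile that is random in x but invariant along z and remains transversely confined. All parameter values are illustrative assumptions.

```python
import numpy as np

wavelength = 1.0e-6                        # illustrative: 1 um light
k0 = 2 * np.pi / wavelength
n0, dn = 1.5, 0.05                         # background index, disorder strength
k = n0 * k0

N, L = 2048, 400e-6                        # transverse grid points and window size
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
kx = 2 * np.pi * np.fft.fftfreq(N, d=dx)

rng = np.random.default_rng(1)
grains = rng.integers(0, 2, N // 8)        # binary disorder with ~1.5 um grains
delta_n = dn * np.repeat(grains, 8)        # z-invariant transverse disorder

A = np.exp(-(x / 5e-6) ** 2)               # narrow Gaussian launch field
dz, n_steps = 1e-6, 2000                   # 2 mm of propagation

diffraction = np.exp(-1j * kx ** 2 * dz / (2 * k))
refraction = np.exp(1j * k0 * delta_n * dz)
for _ in range(n_steps):
    A = np.fft.ifft(diffraction * np.fft.fft(A))   # diffraction (spectral step)
    A = refraction * A                              # transverse disorder (phase step)

I = np.abs(A) ** 2
xc = np.sum(I * x) / np.sum(I)
width = np.sqrt(np.sum(I * (x - xc) ** 2) / np.sum(I))
print(f"rms beam width after 2 mm: {width * 1e6:.1f} um")
# Setting dn = 0 removes the disorder and lets the beam spread by free diffraction.
```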

Disorder inducing single-mode transmission

Disorder is usually associated with opacity: the impossibility of transmitting information because light paths are randomized. On the other hand, disorder is also responsible for a peculiar phenomenon, Anderson localization, which traps light into standing waves occupying a tiny region of space.

By exploiting light localization in an optical fiber, we demonstrated (in a recent journal paper) that disorder may be used to host single modes: the purest form of light transmission, allowing light to travel without interfering with other optical paths. In this case, disorder is responsible for a purer form of transparency rather than for opacity.
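
A minimal numerical sketch of the underlying idea follows (illustrative parameters, not the experiment of the paper): we discretize the scalar mode equation for a one-dimensional, transversely disordered index profile and measure the spatial extent of the guided modes through their inverse participation ratio. The most bound modes turn out to be tightly localized, i.e. candidate single-mode channels.

```python
import numpy as np

wavelength = 1.0e-6
k0 = 2 * np.pi / wavelength
n0, dn = 1.5, 0.05                              # background index, disorder strength

N, L = 1000, 200e-6                             # grid points, transverse window
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

rng = np.random.default_rng(2)
delta_n = dn * np.repeat(rng.integers(0, 2, N // 8), 8)   # binary transverse disorder

# Scalar mode equation, finite differences:
# [-(1/(2*n0*k0)) d^2/dx^2 - k0*delta_n] u = -beta u
main = 1.0 / (n0 * k0 * dx ** 2) - k0 * delta_n
off = -0.5 / (n0 * k0 * dx ** 2) * np.ones(N - 1)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

vals, vecs = np.linalg.eigh(H)                  # ascending eigenvalues
guided = vecs[:, :20]                           # 20 most bound (lowest eigenvalue) modes
ipr = np.sum(guided ** 4, axis=0)               # eigenvectors are already normalized
print("approximate mode extents (um):", np.sort(dx / ipr) * 1e6)
```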

The result may find applications in endoscopy and telecommunications.

Super storage capacity in neural networks

A neural network (NN), an ensemble of interconnected neurons, is able to store memories in so-called “fixed points”: steady activity patterns which are solutions of the neural dynamics and are associated with the individual memory items. In past years, several works have been devoted to determining the maximum storage capacity of NNs, especially for the Hopfield network, the most popular kind of NN. By analyzing the thermodynamic limit of the statistical properties of the Hamiltonian corresponding to the Hopfield neural network, it has been shown in the literature that the retrieval errors diverge when the number of stored memory patterns (P) exceeds a fraction (≈ 14%) of the network size N.

In a recently published paper, we study the storage performance of a generalized Hopfield model, where the diagonal elements of the connection matrix are allowed to be different from zero, and we investigate this model at finite N. We give an analytical expression for the number of retrieval errors and show that, by increasing the number of stored patterns above a certain threshold, the errors start to decrease and reach values below unity for P ≫ N. We demonstrate that the strongest trade-off between efficiency and effectiveness depends on the number of patterns (P) stored in the network by appropriately fixing the connection weights. When P ≫ N and the diagonal elements of the connection matrix are not forced to be zero, the optimal storage capacity is obtained with a number of stored memories much larger than previously reported. This theory paves the way to the design of NNs with high storage capacity that are able to retrieve the desired pattern without distortions.
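
As a rough numerical counterpart of this statement (a sketch under our own simplifying assumptions, not the analytical treatment of the paper), the snippet below stores P random patterns with the plain Hebbian rule, keeps the diagonal of the connection matrix, and counts the retrieval errors after a single synchronous update starting from a stored memory; with P much larger than N the error count typically drops to zero.

```python
import numpy as np

rng = np.random.default_rng(3)
N, P = 100, 2000                                  # network size and number of patterns, P >> N

patterns = rng.choice([-1, 1], size=(P, N))       # random binary memories
W = patterns.T @ patterns / N                     # Hebbian connection matrix, diagonal kept

probe = patterns[0]                               # start exactly on a stored memory
retrieved = np.sign(W @ probe)                    # one synchronous update
retrieved[retrieved == 0] = 1                     # break (unlikely) ties deterministically

errors = int(np.sum(retrieved != probe))
print(f"retrieval errors for N = {N}, P = {P}: {errors}")
```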

Bessel beams through disordered curtains

Bessel beams are nondiffractive light structures extensively exploited in life science and for technological applications.

Here we demonstrate how to build a Bessel beam through a disordered, strongly scattering curtain, enabling a large set of investigation techniques to be extended to opaque media. We exploit a form of spectral filtering which transforms a standard speckle pattern into a disorganized superposition of interfering Bessel beams, named an amorphous speckle pattern. Then, by exploiting adaptive focusing, we are able to build constructive interference at a user-selected target, which results in a Bessel beam itself.
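
To give a feeling for the spectral-filtering step, here is a hedged sketch (our own toy construction, not the experimental procedure of the paper): a fully developed random speckle spectrum is restricted to a thin ring in the transverse k-plane. Since a Bessel beam’s spectrum lies on such a ring, the filtered field is a random superposition of Bessel-like components, i.e. an amorphous speckle pattern. The adaptive-focusing step is not reproduced here, and the ring radius and width are arbitrary assumptions.

```python
import numpy as np

N = 512
rng = np.random.default_rng(4)

# fully developed random speckle: complex Gaussian angular spectrum
spectrum = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))

# thin annular filter in the transverse k-plane (radius and width are arbitrary)
ky, kx = np.meshgrid(np.fft.fftfreq(N), np.fft.fftfreq(N), indexing="ij")
kr = np.sqrt(kx ** 2 + ky ** 2)
ring = (np.abs(kr - 0.2) < 0.005).astype(float)

amorphous = np.fft.ifft2(spectrum * ring)         # amorphous speckle pattern
intensity = np.abs(amorphous) ** 2
print("speckle contrast (max/mean):", intensity.max() / intensity.mean())
```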

Quantum key distribution with localized photons

Information is usually extracted from a photon when a photodetector clicks. On the other hand, a single particle encloses part of the information it is carrying in its wavefunction, which is usually lost in a single click. The shape of the wavefunction may be recovered by reconstructing it from many detections, but there is no way to reconstruct it from a single detector’s click…

…in all but one case: when the wavefunction is localized.

If a photon’s wavefunction is localized, the click on the detector also pinpoints the photon’s location, so that much more information may be extracted.

In a recently published paper we demonstrate how to exploit this additional information in an optical fiber supporting transverse localization. Moreover, by exploiting position and momentum complementarity, we are able to encrypt information with a quantum key distribution protocol.
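
The sketch below illustrates only the generic key-sifting logic of a BB84-style protocol with two complementary bases, abstractly labelled “position” and “momentum”; it is not the experimental protocol of the paper, which encodes on localized photon states in a disordered fiber.

```python
import numpy as np

rng = np.random.default_rng(5)
n_photons = 1000

alice_bits = rng.integers(0, 2, n_photons)
alice_basis = rng.integers(0, 2, n_photons)       # 0 = "position", 1 = "momentum"
bob_basis = rng.integers(0, 2, n_photons)

# When the bases coincide Bob reads Alice's bit; otherwise his outcome is random
bob_bits = np.where(bob_basis == alice_basis,
                    alice_bits,
                    rng.integers(0, 2, n_photons))

sift = alice_basis == bob_basis                   # bases are compared publicly
key_alice, key_bob = alice_bits[sift], bob_bits[sift]
qber = np.mean(key_alice != key_bob)
print(f"sifted key length: {key_alice.size}, QBER: {qber:.3f}")
```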

Enhanced adaptive focusing through semi-transparent media

Adaptive focusing makes it possible to focus light through opaque curtains. The maximum resolution of this process is limited by the effective numerical aperture of the system: the span of angles which contribute to the focusing. In a recent paper, we demonstrate, by exploiting a custom spatial filter, how it is possible to select, among the many light paths contributing to the adaptive focusing, the ones which provide the highest numerical aperture. This optical selection effectively improves the resolution of adaptive focusing in the weak scattering regime.
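
For context, here is a hedged sketch of plain adaptive focusing by continuous sequential phase optimization through one row of a random transmission matrix; the custom spatial filter described in the paper is not modelled, and all sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
n_in = 256                                        # controllable wavefront segments

# one row of a complex Gaussian transmission matrix: input segments -> target point
t = (rng.normal(size=n_in) + 1j * rng.normal(size=n_in)) / np.sqrt(2 * n_in)

def intensity(ph):
    """Intensity at the target point for a given set of segment phases."""
    return np.abs(np.sum(t * np.exp(1j * ph))) ** 2

phases = np.zeros(n_in)
print("initial intensity:", round(intensity(phases), 3))

test = np.linspace(0, 2 * np.pi, 16, endpoint=False)
for i in range(n_in):                             # continuous sequential optimization
    phases[i] = max(test, key=lambda p: intensity(
        np.where(np.arange(n_in) == i, p, phases)))
print("optimized intensity:", round(intensity(phases), 3))
```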

Read more in the Scientific Reports article

or on the web paper.