Results 11 – 20 of 223,263
Statistical mechanics of complex networks
 Rev. Mod. Phys.
"... Complex networks describe a wide range of systems in nature and society, much quoted examples including the cell, a network of chemicals linked by chemical reactions, or the Internet, a network of routers and computers connected by physical links. While traditionally these systems were modeled as ra ..."
Abstract

Cited by 2083 (10 self)
 Add to MetaCart
Complex networks describe a wide range of systems in nature and society, much quoted examples including the cell, a network of chemicals linked by chemical reactions, or the Internet, a network of routers and computers connected by physical links. While traditionally these systems were modeled as random graphs, it is increasingly recognized that the topology and evolution of real ...
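The contrast the abstract draws between classical random-graph models and the topology of real networks shows up directly in degree distributions. A minimal sketch, assuming the networkx library; the Erdős–Rényi and Barabási–Albert generators are standard networkx functions used here for illustration, not code from the review:

    # Compare a classical random graph with a growing, preferential-attachment
    # graph: the first has a narrow (Poisson-like) degree distribution, the
    # second a heavy tail with hubs, the kind of structural difference the
    # review is concerned with. Illustrative sketch only.
    import networkx as nx

    n = 10_000
    er = nx.erdos_renyi_graph(n, p=6 / n)     # classical random graph
    ba = nx.barabasi_albert_graph(n, m=3)     # preferential attachment

    print("random graph max degree:", max(d for _, d in er.degree()))
    print("scale-free   max degree:", max(d for _, d in ba.degree()))
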
A Set Of Principles For Conducting And Evaluating Interpretive Field Studies In Information Systems
, 1999
"... This article discusses the conduct and evaluation of interpretive research in information systems. While the conventions for evaluating information systems case studies conducted according to the natural science model of social science are now widely accepted, this is not the case for interpretive f ..."
Abstract

Cited by 874 (5 self)
 Add to MetaCart
This article discusses the conduct and evaluation of interpretive research in information systems. While the conventions for evaluating information systems case studies conducted according to the natural science model of social science are now widely accepted, this is not the case for interpretive field studies. A set of principles for the conduct and evaluation of interpretive field research in information systems is proposed, along with their philosophical rationale. The usefulness of the principles is illustrated by evaluating three published interpretive field studies drawn from the IS research literature. The intention of the paper is to further reflection and debate on the important subject of grounding interpretive research methodology.
Parameterized Complexity
, 1998
"... the rapidly developing systematic connections between FPT and useful heuristic algorithms  a new and exciting bridge between the theory of computing and computing in practice. The organizers of the seminar strongly believe that knowledge of parameterized complexity techniques and results belongs ..."
Abstract

Cited by 1218 (75 self)
 Add to MetaCart
the rapidly developing systematic connections between FPT and useful heuristic algorithms, a new and exciting bridge between the theory of computing and computing in practice. The organizers of the seminar strongly believe that knowledge of parameterized complexity techniques and results belongs in the toolkit of every algorithm designer. The purpose of the seminar was to bring together leading experts from all over the world, and from the diverse areas of computer science that have been attracted to this new framework. The seminar was intended as the first larger international meeting with a specific focus on parameterized complexity, and it hopefully serves as a driving force in the development of the field. We had 49 participants from Australia, Canada, India, Israel, New Zealand, the USA, and various European countries. During the workshop 25 lectures were given. Moreover, one night session was devoted to open problems and Thursday was largely reserved for problem discussion.
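A textbook illustration of the FPT techniques the report refers to is vertex cover parameterized by the solution size k, solvable by a bounded search tree in O(2^k · |E|) time. The sketch below is a generic illustration of that technique, not code from the seminar:

    # Bounded search tree for k-Vertex Cover: every edge must be covered by one
    # of its two endpoints, so branch on both choices and recurse with budget
    # k - 1. Running time O(2^k * |E|), i.e. fixed-parameter tractable in k.
    def has_vertex_cover(edges, k):
        if not edges:
            return True          # all edges covered
        if k == 0:
            return False         # edges remain, no budget left
        u, v = edges[0]
        rest_u = [(a, b) for a, b in edges if u not in (a, b)]
        rest_v = [(a, b) for a, b in edges if v not in (a, b)]
        return has_vertex_cover(rest_u, k - 1) or has_vertex_cover(rest_v, k - 1)

    # A triangle needs two vertices to cover all three edges.
    print(has_vertex_cover([(1, 2), (2, 3), (1, 3)], 1))   # False
    print(has_vertex_cover([(1, 2), (2, 3), (1, 3)], 2))   # True
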
Epidemic Spreading in Scale-Free Networks
, 2000
"... The Internet, as well as many other networks, has a very complex connectivity recently modeled by the class of scalefree networks. This feature, which appears to be very efficient for a communications network, favors at the same time the spreading of computer viruses. We analyze real data from c ..."
Abstract

Cited by 550 (14 self)
 Add to MetaCart
The Internet, as well as many other networks, has a very complex connectivity recently modeled by the class of scale-free networks. This feature, which appears to be very efficient for a communications network, favors at the same time the spreading of computer viruses. We analyze real data from computer virus infections and find the average lifetime and prevalence of viral strains on the Internet. We define a dynamical model for the spreading of infections on scale-free networks, finding the absence of an epidemic threshold and its associated critical behavior. This new epidemiological framework rationalizes data on computer viruses and could help in the understanding of other spreading phenomena on communication and social networks. PACS numbers: 05.70.Ln, 05.50.+q. Many social, biological, and communication systems can be properly described by complex networks whose nodes represent individuals or organizations, and links mimic the interactions amo...
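The dynamical model referred to is of the susceptible-infected-susceptible (SIS) type. The toy simulation below illustrates that kind of dynamics on a preferential-attachment graph; the graph generator, rates, and sizes are assumptions made for the sketch, not the paper's actual setup or data:

    # Toy SIS dynamics on a scale-free graph: each infected node infects a
    # susceptible neighbor with probability beta per step and recovers with
    # probability mu. Parameters and generator are illustrative assumptions.
    import random
    import networkx as nx

    g = nx.barabasi_albert_graph(5000, m=3)
    beta, mu = 0.05, 0.2
    infected = set(random.sample(list(g.nodes()), 50))

    for step in range(200):
        nxt = set(infected)
        for node in infected:
            for nb in g.neighbors(node):
                if nb not in infected and random.random() < beta:
                    nxt.add(nb)
            if random.random() < mu:
                nxt.discard(node)
        infected = nxt

    print("prevalence after 200 steps:", len(infected) / g.number_of_nodes())
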
The Systems Biology Markup Language (SBML): a medium for representation and exchange of biochemical network models
 Bioinformatics
, 2003
"... ..."
Why We Twitter: Understanding Microblogging Usage and Communities
"... Microblogging is a new form of communication in which users can describe their current status in short posts distributed by instant messages, mobile phones, email or the Web. Twitter, a popular microblogging tool has seen a lot of growth since it launched in October, 2006. In this paper, we present ..."
Abstract

Cited by 547 (2 self)
 Add to MetaCart
Microblogging is a new form of communication in which users describe their current status in short posts distributed by instant messages, mobile phones, email, or the Web. Twitter, a popular microblogging tool, has seen a lot of growth since it launched in October 2006. In this paper, we present our observations of the microblogging phenomenon by studying the topological and geographical properties of Twitter’s social network. We find that people use microblogging to talk about their daily activities and to seek or share information. Finally, we analyze user intentions at a community level and show how users with similar intentions connect with each other.
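As a rough illustration of the kind of topological properties such a study examines, the sketch below computes follower counts and link reciprocity on a tiny made-up directed "follows" graph; the data and the use of networkx are assumptions for the sketch, not the authors' dataset or code:

    # Two simple topological measures on a toy directed follower graph:
    # in-degree (followers per user) and the fraction of reciprocated links.
    import networkx as nx

    follows = [("alice", "bob"), ("bob", "alice"), ("carol", "alice"),
               ("dave", "alice"), ("carol", "bob")]
    g = nx.DiGraph(follows)

    print("followers per user:", dict(g.in_degree()))
    mutual = sum(1 for u, v in g.edges() if g.has_edge(v, u))
    print("reciprocity:", mutual / g.number_of_edges())
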
Algorithms for Quantum Computation: Discrete Logarithms and Factoring
, 1994
"... A computer is generally considered to be a universal computational device; i.e., it is believed able to simulate any physical computational device with a increase in computation time of at most a polynomial factor. It is not clear whether this is still true when quantum mechanics is taken into consi ..."
Abstract

Cited by 1103 (7 self)
 Add to MetaCart
A computer is generally considered to be a universal computational device; i.e., it is believed able to simulate any physical computational device with an increase in computation time of at most a polynomial factor. It is not clear whether this is still true when quantum mechanics is taken into consideration. Several researchers, starting with David Deutsch, have developed models for quantum mechanical computers and have investigated their computational properties. This paper gives Las Vegas algorithms for finding discrete logarithms and factoring integers on a quantum computer that take a number of steps which is polynomial in the input size, e.g., the number of digits of the integer to be factored. These two problems are generally considered hard on a classical computer and have been used as the basis of several proposed cryptosystems. (We thus give the first examples of quantum cryptanalysis.) Since the discovery of quantum mechanics, people have found the behavior of...
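The factoring algorithm rests on reducing factoring to order finding: once the multiplicative order r of a random a modulo N is known, and r is even with a^(r/2) ≢ −1 (mod N), gcd(a^(r/2) − 1, N) is a nontrivial factor. The sketch below shows only that classical reduction, with the order-finding step (the part done on a quantum computer in the paper) replaced by brute force purely for illustration:

    # Classical reduction from factoring to order finding. The order() function
    # is a brute-force stand-in for the quantum order-finding subroutine.
    import math
    import random

    def order(a, n):
        r, x = 1, a % n
        while x != 1:
            x, r = (x * a) % n, r + 1
        return r

    def factor(n):
        # assumes n is odd, composite, and not a prime power
        while True:
            a = random.randrange(2, n)
            g = math.gcd(a, n)
            if g > 1:
                return g                      # stumbled on a factor directly
            r = order(a, n)
            x = pow(a, r // 2, n)
            if r % 2 == 0 and x != n - 1:
                return math.gcd(x - 1, n)     # nontrivial factor of n

    print(factor(15))   # prints 3 or 5
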
Transfer of Cognitive Skill
, 1989
"... A framework for skill acquisition is proposed that includes two major stages in the development of a cognitive skill: a declarative stage in which facts about the skill domain are interpreted and a procedural stage in which the domain knowledge is directly embodied in procedures for performing the s ..."
Abstract

Cited by 869 (21 self)
 Add to MetaCart
A framework for skill acquisition is proposed that includes two major stages in the development of a cognitive skill: a declarative stage in which facts about the skill domain are interpreted and a procedural stage in which the domain knowledge is directly embodied in procedures for performing the skill. This general framework has been instantiated in the ACT system in which facts are encoded in a propositional network and procedures are encoded as productions. Knowledge compilation is the process by which the skill transits from the declarative stage to the procedural stage. It consists of the subprocesses of composition, which collapses sequences of productions into single productions, and proceduralization, which embeds factual knowledge into productions. Once proceduralized, further learning processes operate on the skill to make the productions more selective in their range of applications. These processes include generalization, discrimination, and strengthening of productions. Comparisons are made to similar concepts from past learning theories. How these learning mechanisms apply to produce the power law speedup in processing time with practice is discussed. It requires at least 100 hours of learning and practice to acquire any significant cognitive skill to a reasonable degree of proficiency. For instance, after 100 hours a student learning to program a computer has achieved only a very modest facility in the skill. Learning one's primary language takes tens of thousands of hours. The psychology of human learning has been very thin in ideas about what happens to skills under the impact of this amount of learning—and for obvious reasons. This article presents a theory about the changes in the nature of a skill over such large time scales and about the basic learning processes that are responsible.
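The power-law speedup mentioned at the end can be written as T(N) = a·N^(−b), where T is time per trial and N the number of practice trials; on log-log axes this is a straight line with slope −b. The sketch below fits that form to made-up timing data (the numbers are illustrative, not the article's measurements):

    # Estimate the power-law-of-practice exponent b from T(N) = a * N**(-b)
    # with a least-squares line fit in log-log space. Timing data are
    # fabricated for illustration only.
    import math

    trials = [1, 2, 4, 8, 16, 32, 64]
    times  = [20.0, 14.1, 10.2, 7.1, 5.0, 3.6, 2.5]   # seconds per trial

    xs = [math.log(n) for n in trials]
    ys = [math.log(t) for t in times]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    print("estimated exponent b ~", round(-slope, 2))   # about 0.5 here
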
Image denoising using a scale mixture of Gaussians in the wavelet domain
 IEEE Trans. Image Processing
, 2003
"... We describe a method for removing noise from digital images, based on a statistical model of the coefficients of an overcomplete multiscale oriented basis. Neighborhoods of coefficients at adjacent positions and scales are modeled as the product of two independent random variables: a Gaussian vecto ..."
Abstract

Cited by 514 (17 self)
 Add to MetaCart
We describe a method for removing noise from digital images, based on a statistical model of the coefficients of an overcomplete multiscale oriented basis. Neighborhoods of coefficients at adjacent positions and scales are modeled as the product of two independent random variables: a Gaussian vector and a hidden positive scalar multiplier. The latter modulates the local variance of the coefficients in the neighborhood, and is thus able to account for the empirically observed correlation between the coefficient amplitudes. Under this model, the Bayesian least squares estimate of each coefficient reduces to a weighted average of the local linear estimates over all possible values of the hidden multiplier variable. We demonstrate through simulations with images contaminated by additive white Gaussian noise that the performance of this method substantially surpasses that of previously published methods, both visually and in terms of mean squared error.
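In heavily simplified scalar form, the estimator the abstract describes models a coefficient as x = √z·u with u Gaussian and z a hidden positive multiplier, observes y = x + noise, and averages Wiener-type estimates over the posterior on z. The sketch below shows that one-dimensional case; the flat prior on z, the grid of z values, and the variances are assumptions for illustration, whereas the paper operates on vector neighborhoods of wavelet coefficients:

    # Scalar Bayes least-squares estimate under a Gaussian scale mixture prior:
    # weight the Wiener estimate for each candidate z by the (flat-prior)
    # posterior p(z | y), obtained from the Gaussian likelihood of y given z.
    import math

    def bls_gsm_scalar(y, s2=1.0, noise2=0.25, z_grid=(0.1, 0.3, 1.0, 3.0, 10.0)):
        weights, estimates = [], []
        for z in z_grid:
            var = z * s2 + noise2                               # var of y given z
            weights.append(math.exp(-y * y / (2 * var)) / math.sqrt(var))
            estimates.append(z * s2 / (z * s2 + noise2) * y)    # Wiener estimate
        total = sum(weights)
        return sum(w * e for w, e in zip(weights, estimates)) / total

    print(bls_gsm_scalar(2.0))   # shrinks the noisy observation toward zero
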
The information bottleneck method
 University of Illinois
, 1999
"... We define the relevant information in a signal x ∈ X as being the information that this signal provides about another signal y ∈ Y. Examples include the information that face images provide about the names of the people portrayed, or the information that speech sounds provide about the words spoken. ..."
Abstract

Cited by 545 (38 self)
 Add to MetaCart
We define the relevant information in a signal x ∈ X as being the information that this signal provides about another signal y ∈ Y. Examples include the information that face images provide about the names of the people portrayed, or the information that speech sounds provide about the words spoken. Understanding the signal x requires more than just predicting y; it also requires specifying which features of X play a role in the prediction. We formalize this problem as that of finding a short code for X that preserves the maximum information about Y. That is, we squeeze the information that X provides about Y through a ‘bottleneck’ formed by a limited set of codewords X̃. This constrained optimization problem can be seen as a generalization of rate distortion theory in which the distortion measure d(x, x̃) emerges from the joint statistics of X and Y. This approach yields an exact set of self-consistent equations for the coding rules X → X̃ and X̃ → Y. Solutions to these equations can be found by a convergent re-estimation method that generalizes the Blahut–Arimoto algorithm. Our variational principle provides a surprisingly rich framework for discussing a variety of problems in signal processing and learning, as will be described in detail elsewhere.
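The self-consistent equations mentioned in the abstract can be iterated in Blahut–Arimoto fashion: update p(x̃|x) ∝ p(x̃)·exp(−β·D_KL[p(y|x) || p(y|x̃)]), then recompute p(x̃) and p(y|x̃) from it. The sketch below runs that iteration on a small random joint distribution; the joint p(x, y), the value of β, and the number of codewords are arbitrary assumptions for the sketch:

    # Iterative information bottleneck on discrete variables (Blahut-Arimoto
    # style): alternate between the soft assignment p(t|x) and the induced
    # marginals p(t) and p(y|t). Inputs are random and purely illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    p_xy = rng.random((6, 4)); p_xy /= p_xy.sum()       # joint p(x, y)
    p_x = p_xy.sum(axis=1)
    p_y_x = p_xy / p_x[:, None]                         # p(y | x)

    n_t, beta = 3, 5.0                                  # codewords, tradeoff
    q = rng.random((6, n_t)); q /= q.sum(axis=1, keepdims=True)   # p(t | x)

    for _ in range(200):
        p_t = p_x @ q                                               # p(t)
        p_y_t = (q * p_x[:, None]).T @ p_y_x / p_t[:, None]         # p(y | t)
        kl = np.array([[np.sum(p_y_x[i] * np.log(p_y_x[i] / p_y_t[t]))
                        for t in range(n_t)] for i in range(6)])
        q = p_t[None, :] * np.exp(-beta * kl)
        q /= q.sum(axis=1, keepdims=True)

    print(np.round(q, 2))   # soft assignment of each x to a codeword
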