Results 1 - 10 of 2,535
Optimization of Processing of Enormous Amounts of Geographical Data
"... The paper deals with the optimization of processing of large volume of geographic data. The essence of the method is hierarchical decomposition of the set of processes into elementary processes and the allocation of means to these processes. The means can be of three types: hardware, software or hum ..."
Abstract
The paper deals with the optimization of processing of large volumes of geographic data. The essence of the method is the hierarchical decomposition of the set of processes into elementary processes and the allocation of means to these processes. The means can be of three types: hardware, software, or the human factor, or possibly a combination of these types. Each elementary process can be executed on one of these means in a certain time. In general, the processes and the means can be interdependent or independent. The described problem can be represented by an oriented graph, where nodes correspond to the processes or the means and edges represent either the interdependence of processes and means or the processing time of a given process on a given mean. A map of processes is formed on the basis of the graph; this map captures the temporal ordering of the solutions of sub-processes. The total duration of all processes is then compiled from this map, and it must be less than the time allowed for solving the task at the required quality of results. If it is not, pairs of sub-process and mean are replaced by alternative pairs with lower duration according to the map of processes. A special algorithm was designed for this task. If the sum of the durations of all processes satisfies the time constraint, the optimization ends and at this ...
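As a rough illustration of the replacement step described in the abstract, the sketch below greedily swaps process-mean pairs for faster alternatives until a deadline is met. The process names, the durations, the sequential sum-of-durations model, and the greedy strategy are illustrative assumptions, not details taken from the paper.

def optimize_assignment(durations, deadline):
    """durations: {process: {mean: time}}; returns {process: mean} or None."""
    # Start from an arbitrary assignment (here: the first mean listed for each process).
    assignment = {p: next(iter(means)) for p, means in durations.items()}

    def total(asg):
        # Simplification: processes are treated as strictly sequential.
        return sum(durations[p][m] for p, m in asg.items())

    while total(assignment) > deadline:
        # Find the single process-mean swap that shortens the schedule the most.
        best_gain, best_swap = 0, None
        for p, means in durations.items():
            current = durations[p][assignment[p]]
            for m, t in means.items():
                if current - t > best_gain:
                    best_gain, best_swap = current - t, (p, m)
        if best_swap is None:
            return None  # no faster alternative pair exists; the deadline cannot be met
        p, m = best_swap
        assignment[p] = m
    return assignment

if __name__ == "__main__":
    durations = {
        "tile raster":   {"cpu": 8, "gpu": 3},
        "classify land": {"cpu": 12, "gpu": 5, "analyst": 30},
        "merge layers":  {"cpu": 4},
    }
    print(optimize_assignment(durations, deadline=15))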
Agent Based Approach for Searching, Mining and Managing Enormous Amounts of Spatial Image Data
"... Scientists and intelligence analysts are interested in quickly discovering new results from the vast amount of available geospatial data. The key issues that arise in this pursuit are how to cope with new and changing information and how to manage the steadily increasing amount of available data. Th ..."
Abstract
Parallel discrete event simulation
, 1990
"... Parallel discrete event simulation (PDES), sometimes I called distributed simulation, refers to the execution of a single discrete event simulation program on a parallel computer. PDES has attracted a considerable amount of interest in recent years. From a pragmatic standpoint, this interest arises ..."
Abstract - Cited by 818 (39 self)
... from the fact that large simulations in engineering, computer science, economics, and military applications, to mention a few, consume enormous amounts of time
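For readers unfamiliar with the underlying model, a discrete event simulation is essentially a loop over a time-ordered queue of events; PDES, the subject of the survey, distributes that loop across processors. The minimal sequential sketch below, with an assumed single-server queue workload, is only meant to ground the term and is not taken from the paper.

import heapq, random

def simulate(until=1000.0, arrival_rate=1.0, service_rate=1.2):
    # Future event list ordered by timestamp: the core of any discrete event simulation.
    events = [(random.expovariate(arrival_rate), "arrival")]
    in_system, served = 0, 0
    while events:
        clock, kind = heapq.heappop(events)
        if clock > until:
            break
        if kind == "arrival":
            in_system += 1
            heapq.heappush(events, (clock + random.expovariate(arrival_rate), "arrival"))
            if in_system == 1:  # server was idle, start service immediately
                heapq.heappush(events, (clock + random.expovariate(service_rate), "departure"))
        else:  # departure
            in_system -= 1
            served += 1
            if in_system > 0:   # start serving the next waiting customer
                heapq.heappush(events, (clock + random.expovariate(service_rate), "departure"))
    return served

print(simulate())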
Guide to Elliptic Curve Cryptography
, 2004
"... Elliptic curves have been intensively studied in number theory and algebraic geometry for over 100 years and there is an enormous amount of literature on the subject. To quote the mathematician Serge Lang: It is possible to write endlessly on elliptic curves. (This is not a threat.) Elliptic curves ..."
Abstract - Cited by 610 (18 self)
Fast and accurate short read alignment with Burrows-Wheeler transform
- BIOINFORMATICS, 2009, ADVANCE ACCESS
, 2009
"... Motivation: The enormous amount of short reads generated by the new DNA sequencing technologies call for the development of fast and accurate read alignment programs. A first generation of hashtable based methods has been developed, including MAQ, which is accurate, feature rich and fast enough to a ..."
Abstract - Cited by 2096 (24 self)
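The Burrows-Wheeler transform named in the title supports exact pattern matching through FM-index backward search; the sketch below illustrates that core operation. It is not BWA itself (BWA adds inexact matching, gaps, and a far more compact index), and the naive rotation-sorting construction and prefix-counting rank are simplifications for clarity.

def bwt(text):
    text += "$"  # unique sentinel, lexicographically smallest character
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(r[-1] for r in rotations)

def count_occurrences(pattern, bwt_str):
    # C[c]: number of characters in the text strictly smaller than c.
    counts = {}
    for ch in bwt_str:
        counts[ch] = counts.get(ch, 0) + 1
    C, running = {}, 0
    for ch in sorted(counts):
        C[ch] = running
        running += counts[ch]

    def occ(ch, i):
        # Occurrences of ch in bwt_str[:i]; a real FM-index answers this in O(1).
        return bwt_str[:i].count(ch)

    lo, hi = 0, len(bwt_str)
    for ch in reversed(pattern):  # backward search: extend the match one character at a time
        if ch not in C:
            return 0
        lo = C[ch] + occ(ch, lo)
        hi = C[ch] + occ(ch, hi)
        if lo >= hi:
            return 0
    return hi - lo  # number of exact occurrences of the pattern in the text

reference = "GATTACAGATTACA"
print(count_occurrences("ATTA", bwt(reference)))  # prints 2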
Fab: Content-based, collaborative recommendation
- Communications of the ACM
, 1997
"... Fab is a recommendation system designed to help users sift through the enormous amount of information available in the World Wide Web. Operational since Dec. 1994, this system combines the content-based and collaborative methods of recommendation in a way that exploits the advantages of the two appr ..."
Abstract - Cited by 682 (0 self)
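As a loose illustration of combining the two recommendation methods, the sketch below blends a content-based score (item features against a user profile) with a collaborative score (ratings weighted by user-user similarity). The linear blend, the NumPy representation, and the toy data are assumptions made here for illustration; Fab's actual agent-based design differs.

import numpy as np

def hybrid_scores(item_features, user_profile, ratings, user_id, alpha=0.5):
    """item_features: (n_items, n_terms); user_profile: (n_terms,);
    ratings: (n_users, n_items), 0 meaning unrated."""
    # Content-based part: cosine similarity between each item and the user's profile.
    norms = np.linalg.norm(item_features, axis=1) * np.linalg.norm(user_profile)
    content = item_features @ user_profile / np.where(norms == 0, 1.0, norms)

    # Collaborative part: other users' ratings, weighted by their similarity to this user.
    sims = ratings @ ratings[user_id]
    sims[user_id] = 0.0
    collab = (sims @ ratings) / (np.abs(sims).sum() or 1.0)

    return alpha * content + (1 - alpha) * collab

if __name__ == "__main__":
    feats = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])           # toy item-term matrix
    rts = np.array([[5, 0, 3], [4, 0, 0], [0, 2, 5]], dtype=float)   # toy user-item ratings
    print(hybrid_scores(feats, np.array([1.0, 0.5]), rts, user_id=0))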
Why Do Some Countries Produce So Much More Output Per Worker Than Others?
, 1998
"... Output per worker varies enormously across countries. Why? On an accounting basis, our analysis shows that differences in physical capital and educational attainment can only partially explain the variation in output per worker — we find a large amount of variation in the level of the Solow residual ..."
Abstract - Cited by 2442 (24 self)
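The accounting exercise mentioned in the abstract can be summarized by the standard Cobb-Douglas development-accounting decomposition below, stated here only as an illustration of the general approach (with \alpha the capital share, H_i = h_i L_i human-capital-augmented labor, and A_i the residual productivity term):

\[
Y_i = K_i^{\alpha}\,(A_i H_i)^{1-\alpha}
\quad\Longrightarrow\quad
\frac{Y_i}{L_i} = \left(\frac{K_i}{Y_i}\right)^{\frac{\alpha}{1-\alpha}} h_i\, A_i ,
\]

so that cross-country differences in output per worker split multiplicatively into a capital-intensity term, an educational-attainment term h_i, and the residual A_i.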
Sizing Router Buffers
, 2004
"... All Internet routers contain buffers to hold packets during times of congestion. Today, the size of the buffers is determined by the dynamics of TCP’s congestion control algorithm. In particular, the goal is to make sure that when a link is congested, it is busy 100 % of the time; which is equivalen ..."
Abstract - Cited by 352 (17 self)
approximately 250ms × 10Gb/s = 2.5Gbits of buffers; and the amount of buffering grows linearly with the line-rate. Such large buffers are challenging for router manufacturers, who must use large, slow, off-chip DRAMs. And queueing delays can be long, have high variance, and may destabilize the congestion
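The arithmetic in the excerpt is the bandwidth-delay-product rule of thumb: buffer = RTT x line rate. The short calculation below reproduces the 2.5 Gbit figure and, alongside it, the reduction the paper argues for (roughly RTT x C / sqrt(n) for n long-lived TCP flows); the flow count is an assumed example value.

rtt_s = 0.250          # assumed round-trip time: 250 ms
line_rate_bps = 10e9   # assumed line rate: 10 Gb/s
n_flows = 10_000       # assumed number of long-lived TCP flows sharing the link

rule_of_thumb_bits = rtt_s * line_rate_bps          # RTT x C, the traditional rule
reduced_bits = rule_of_thumb_bits / n_flows ** 0.5  # RTT x C / sqrt(n)

print(f"rule of thumb: {rule_of_thumb_bits / 1e9:.2f} Gbit")  # 2.50 Gbit
print(f"with sqrt(n):  {reduced_bits / 1e6:.0f} Mbit")        # 25 Mbit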
Assessing the Effects of School Resources on Student Performance: An Update
, 1997
"... The relationship between school resources and student achievement has been controversial, in large part because it calls into question a variety of traditional policy approaches. This article reviews the available educational production literature, updating previous summaries. The close to 400 studi ..."
Abstract - Cited by 287 (11 self)
investigations on how school resources affect labor market outcomes. Simple resource policies hold little hope for improving student outcomes. Reflecting its policy significance, an enormous amount of research has focused on the relationship between resources devoted to schools and student performance. Recent
Automatic Construction of Decision Trees from Data: A Multi-Disciplinary Survey
- Data Mining and Knowledge Discovery
, 1997
"... Decision trees have proved to be valuable tools for the description, classification and generalization of data. Work on constructing decision trees from data exists in multiple disciplines such as statistics, pattern recognition, decision theory, signal processing, machine learning and artificial ne ..."
Abstract - Cited by 224 (1 self)
... Enormous amounts of data are being collected daily from major scientific projects, e.g., Human Genome...
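Since the survey concerns constructing decision trees automatically from data, a compact sketch of the standard greedy, top-down construction (CART-style binary splits scored by Gini impurity) may help ground it. The toy data, the absence of pruning, and the purity-only stopping rule are simplifications for illustration.

def gini(labels):
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(rows, labels):
    best = None  # (weighted impurity, feature index, threshold)
    for f in range(len(rows[0])):
        for t in sorted({r[f] for r in rows}):
            left = [y for r, y in zip(rows, labels) if r[f] <= t]
            right = [y for r, y in zip(rows, labels) if r[f] > t]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
            if best is None or score < best[0]:
                best = (score, f, t)
    return best

def build(rows, labels):
    if len(set(labels)) == 1 or (split := best_split(rows, labels)) is None:
        return max(set(labels), key=labels.count)  # leaf: majority class
    _, f, t = split
    left = [(r, y) for r, y in zip(rows, labels) if r[f] <= t]
    right = [(r, y) for r, y in zip(rows, labels) if r[f] > t]
    return (f, t,
            build([r for r, _ in left], [y for _, y in left]),
            build([r for r, _ in right], [y for _, y in right]))

X = [[2.0, 1.0], [1.5, 2.5], [3.0, 1.2], [3.5, 2.8]]
y = ["a", "a", "b", "b"]
print(build(X, y))  # e.g. (0, 2.0, 'a', 'b'): split on feature 0 at <= 2.0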