CiteSeerX

Results 1 - 10 of 1,016,133

The MIT Alewife Machine: Architecture and Performance

by Anant Agarwal, Ricardo Bianchini, David Chaiken, David Kranz, John Kubiatowicz, Beng-hong Lim, Kenneth Mackenzie, Donald Yeung - In Proceedings of the 22nd Annual International Symposium on Computer Architecture, 1995
"... Alewife is a multiprocessor architecture that supports up to 512 processing nodes connected over a scalable and cost-effective mesh network at a constant cost per node. The MIT Alewife machine, a prototype implementation of the architecture, demonstrates that a parallel system can be both scalable a ..."
Abstract - Cited by 193 (22 self)

The MIT Alewife Machine

by Anant Agarwal, Ricardo Bianchini, David Chaiken, Frederic T. Chong, Kirk L. Johnson, David Kranz, John Kubiatowicz, Beng-hong Lim, Kenneth Mackenzie, Donald Yeung - Proc. of the IEEE, Special Issue on Distributed Shared Memory, 1991
"... A variety of models for parallel architectures such as shared memory, message passing, and dataflow, have converged in the recent past to a hybrid architecture form called distributed shared memory (DSM). By using a combination of hardware and software mechanisms, DSM combines the nice features of a ..."
Abstract - Cited by 9 (0 self) - Add to MetaCart
of all the above models and is able to achieve both the scalability of message passing machines and the programmability of shared memory systems. Alewife, an early prototype of such DSM architectures, uses a hybrid of software and hardware mechanisms to support coherent shared memory, efficient user

The MIT Alewife Machine: A Large-Scale Distributed-Memory Multiprocessor

by David V. James, Anthony T. Laundrie, Stein Gjessing, Gurindar S. Sohi
"... The Alewife multiprocessor project focuses on the architecture and design of a large-scale parallel machine. The machine uses a low-dimensional direct interconnection network to provide scalable communication bandwidth, while allowing the exploitation of locality. Despite its distributed-memory a ..."
Abstract

The MIT Alewife Machine

by Anant Agarwal, Ricardo Bianchini, David Chaiken, Frederic T. Chong, Kirk L. Johnson, Kenneth Mackenzie, Donald Yeung, David Kranz, John Kubiatowicz, Beng-hong Lim
"... A variety of models for parallel architectures such as shared memory, message passing, and dataflow, have converged in the recent past to a hybrid architecture form called distributed shared memory (DSM). By using a combination of hardware and software mechanisms, DSM combines the nice features of a ..."
Abstract - Add to MetaCart
of all the above models and is able to achieve both the scalability of message passing machines and the programmability of shared memory systems. Alewife, an early prototype of such DSM architectures, uses a hybrid of software and hardware mechanisms to support coherent shared memory, efficient user

Training Support Vector Machines: an Application to Face Detection

by Edgar Osuna, Robert Freund, Federico Girosi, 1997
"... We investigate the application of Support Vector Machines (SVMs) in computer vision. SVM is a learning technique developed by V. Vapnik and his team (AT&T Bell Labs.) that can be seen as a new method for training polynomial, neural network, or Radial Basis Functions classifiers. The decision sur ..."
Abstract - Cited by 728 (1 self)
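
The snippet above notes that one SVM training procedure can play the role of a polynomial, neural network, or RBF classifier simply by changing the kernel. The following minimal sketch illustrates that switch using scikit-learn's SVC on synthetic data; it is not the authors' solver, and the toy data merely stands in for face/non-face image patches.

    # Sketch only: same training routine, different kernels (scikit-learn assumed).
    from sklearn.svm import SVC
    from sklearn.datasets import make_classification

    # synthetic stand-in for face/non-face feature vectors
    X, y = make_classification(n_samples=200, n_features=20, random_state=0)

    for kernel in ("poly", "rbf"):
        clf = SVC(kernel=kernel, C=1.0).fit(X, y)
        print(kernel, "support vectors:", clf.n_support_.sum())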

Live Migration of Virtual Machines

by Christopher Clark, Keir Fraser, Steven Hand, Jakob Gorm Hansen, Eric Jul, Christian Limpach, Ian Pratt, Andrew Warfield - In Proceedings of the 2nd ACM/USENIX Symposium on Networked Systems Design and Implementation (NSDI), 2005
"... Migrating operating system instances across distinct physical hosts is a useful tool for administrators of data centers and clusters: It allows a clean separation between hardware and software, and facilitates fault management, load balancing, and low-level system maintenance. By carrying out the ma ..."
Abstract - Cited by 613 (14 self) - Add to MetaCart
the majority of migration while OSes continue to run, we achieve impressive performance with minimal service downtimes; we demonstrate the migration of entire OS instances on a commodity cluster, recording service downtimes as low as 60ms. We show that that our performance is sufficient to make live migration
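
The abstract describes copying most of a guest's memory while the OS keeps running, so that only a small remainder is transferred during the brief final pause. Below is a schematic, simulation-only sketch of such an iterative pre-copy loop; the page counts, dirty rate, and thresholds are illustrative and none of this is Xen's actual API.

    # Simulation-only sketch of iterative pre-copy migration: memory is sent in
    # rounds while the guest keeps running, and only the small set of recently
    # dirtied pages is sent during the brief stop-and-copy pause at the end.
    import random

    def precopy_migrate(num_pages=10_000, dirty_rate=200, max_rounds=30, stop_threshold=64):
        to_send = set(range(num_pages))          # round 0: every page must be sent
        rounds = 0
        while len(to_send) > stop_threshold and rounds < max_rounds:
            sent = len(to_send)                  # copied while the OS continues to run
            # pages the (simulated) guest dirtied during this round must be re-sent
            to_send = {random.randrange(num_pages) for _ in range(dirty_rate)}
            rounds += 1
            print(f"round {rounds}: sent {sent} pages, {len(to_send)} re-dirtied")
        # stop-and-copy: pause the guest, send the small remainder (the downtime)
        print(f"pause guest, send final {len(to_send)} pages, resume on target")

    precopy_migrate()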

Ensemble Methods in Machine Learning

by Thomas G. Dietterich - MULTIPLE CLASSIFIER SYSTEMS, LNCS 1857, 2000
"... Ensemble methods are learning algorithms that construct a set of classifiers and then classify new data points by taking a (weighted) vote of their predictions. The original ensemble method is Bayesian averaging, but more recent algorithms include error-correcting output coding, Bagging, and boostin ..."
Abstract - Cited by 607 (3 self) - Add to MetaCart
, and boosting. This paper reviews these methods and explains why ensembles can often perform better than any single classifier. Some previous studies comparing ensemble methods are reviewed, and some new experiments are presented to uncover the reasons that Adaboost does not overfit rapidly.
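
As a concrete illustration of the weighted-vote idea described above, the sketch below trains a handful of trees on bootstrap samples (bagging-style) and combines them by a vote weighted by validation accuracy. scikit-learn is assumed, and the weighting scheme is one simple choice for illustration, not the paper's prescription.

    # Minimal sketch of an ensemble: a set of classifiers combined by weighted vote.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=600, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

    # train members on bootstrap samples; weight each by its validation accuracy
    members, weights = [], []
    rng = np.random.default_rng(0)
    for _ in range(10):
        idx = rng.integers(0, len(X_tr), len(X_tr))
        clf = DecisionTreeClassifier().fit(X_tr[idx], y_tr[idx])
        members.append(clf)
        weights.append(clf.score(X_val, y_val))

    # weighted vote: sum the weights of members predicting each class
    votes = np.zeros((len(X_val), 2))
    for clf, w in zip(members, weights):
        votes[np.arange(len(X_val)), clf.predict(X_val)] += w
    print("ensemble accuracy:", (votes.argmax(axis=1) == y_val).mean())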

Support Vector Machine Active Learning with Applications to Text Classification

by Simon Tong, Daphne Koller - JOURNAL OF MACHINE LEARNING RESEARCH, 2001
"... Support vector machines have met with significant success in numerous real-world learning tasks. However, like most machine learning algorithms, they are generally applied using a randomly selected training set classified in advance. In many settings, we also have the option of using pool-based acti ..."
Abstract - Cited by 729 (5 self) - Add to MetaCart
-based active learning. Instead of using a randomly selected training set, the learner has access to a pool of unlabeled instances and can request the labels for some number of them. We introduce a new algorithm for performing active learning with support vector machines, i.e., an algorithm for choosing which
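
A minimal sketch of pool-based active learning with an SVM in the spirit described above: starting from a few labels, the learner repeatedly requests the label of the pool instance closest to the current decision boundary and retrains. scikit-learn is assumed, and this margin-based query rule is an illustration, not the authors' implementation.

    # Sketch: query the most uncertain (smallest-margin) pool point, then retrain.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=500, random_state=0)
    labeled = list(np.where(y == 0)[0][:5]) + list(np.where(y == 1)[0][:5])
    pool = [i for i in range(len(X)) if i not in labeled]

    for _ in range(40):                              # 40 label requests
        clf = SVC(kernel="linear").fit(X[labeled], y[labeled])
        margins = np.abs(clf.decision_function(X[pool]))
        query = pool.pop(int(np.argmin(margins)))    # point closest to the boundary
        labeled.append(query)                        # request its label

    print("accuracy on the remaining pool:", clf.score(X[pool], y[pool]))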

Sketchpad: A man-machine graphical communication system

by Ivan Edward Sutherland, 2003
"... The Sketchpad system uses drawing as a novel communication medium for a computer. The system contains input, output, and computation programs which enable it to interpret information drawn directly on a computer display. It has been used to draw electrical, mechanical, scientific, mathematical, and ..."
Abstract - Cited by 702 (6 self) - Add to MetaCart
The Sketchpad system uses drawing as a novel communication medium for a computer. The system contains input, output, and computation programs which enable it to interpret information drawn directly on a computer display. It has been used to draw electrical, mechanical, scientific, mathematical, and animated drawings; it is a general purpose system. Sketchpad has shown the most usefulness as an aid to the understanding of processes, such as the notion of linkages, which can be described with pictures. Sketchpad also makes it easy to draw highly repetitive or highly accurate drawings and to change drawings previously drawn with it. The many drawings in this thesis were all made with Sketchpad.

Making Large-Scale Support Vector Machine Learning Practical

by Thorsten Joachims, 1998
"... Training a support vector machine (SVM) leads to a quadratic optimization problem with bound constraints and one linear equality constraint. Despite the fact that this type of problem is well understood, there are many issues to be considered in designing an SVM learner. In particular, for large lea ..."
Abstract - Cited by 620 (1 self)
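
For reference, the quadratic program mentioned above is the SVM dual: minimize (1/2) a'Qa - sum(a) subject to box constraints 0 <= a_i <= C and the single linear equality sum_i a_i y_i = 0. The sketch below solves a tiny instance with a general-purpose solver purely to show that structure; the paper's own contribution is a decomposition method that scales far beyond this approach.

    # Sketch: the SVM dual QP with box bounds and one equality constraint, solved
    # with a general-purpose SLSQP solver for illustration only (toy data assumed).
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)   # toy linearly separable labels
    C = 1.0
    Q = (y[:, None] * y[None, :]) * (X @ X.T)        # Q_ij = y_i y_j <x_i, x_j>

    def neg_dual(a):                                 # negative dual objective to minimize
        return 0.5 * a @ Q @ a - a.sum()

    res = minimize(neg_dual, x0=np.zeros(len(y)), method="SLSQP",
                   bounds=[(0.0, C)] * len(y),                     # 0 <= a_i <= C
                   constraints={"type": "eq", "fun": lambda a: a @ y})  # sum a_i y_i = 0

    w = (res.x * y) @ X                              # recover the primal weight vector
    print("support vectors:", int((res.x > 1e-6).sum()), "weights:", w)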