
## MICCLLR: A Generalized Multiple-Instance Learning Algorithm Using Class Conditional Log Likelihood Ratio

Citations: 3 (1 self)

### Citations

13302 | Statistical Learning Theory
- Vapnik
- 1995
Citation Context ...ce the MI data has been transformed into a standard supervised learning data set, any propositional classifier can be trained from such data. In this work, we used a support vector machine (SVM) classifier [18] with an RBF kernel as the propositional classifier. To classify an unlabeled bag B, we first transform it into a single meta-instance using Eq. 1 and then we use the SVM classifier h to classify the ...
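The bag-to-meta-instance pipeline described in this context can be sketched as below. Note that the paper's actual mapping (Eq. 1) is not reproduced in this excerpt, so the `bag_to_meta` function here is a hypothetical stand-in (a per-attribute mean over the bag's instances); the SVM step uses scikit-learn's `SVC` rather than the authors' implementation.

```python
import numpy as np
from sklearn.svm import SVC

def bag_to_meta(bag):
    """Collapse a bag (k x n array of instances) into one meta-instance.

    Hypothetical stand-in for the paper's Eq. 1 (not shown in this
    excerpt): here we simply average each attribute over the instances.
    """
    return np.asarray(bag, dtype=float).mean(axis=0)

# Toy data: each bag is a variable-size set of 2-D instances.
train_bags = [np.array([[0.1, 0.2], [0.3, 0.1]]),                  # negative bag
              np.array([[0.9, 0.8], [1.0, 0.7], [0.8, 0.9]])]      # positive bag
train_labels = [0, 1]

# Transform the MI data to standard supervised data, then train any
# propositional classifier -- an SVM with an RBF kernel, as in the paper.
X = np.vstack([bag_to_meta(b) for b in train_bags])
h = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, train_labels)

# To classify an unlabeled bag B: map it to a meta-instance, apply h.
B = np.array([[0.85, 0.75], [0.95, 0.9]])
pred = h.predict(bag_to_meta(B).reshape(1, -1))
```

Any propositional learner could replace the `SVC` here; the point is only that the meta-instance mapping reduces MIL to ordinary supervised learning.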

4471 | Data Mining: Practical machine learning tools and techniques, 2nd Edition
- Witten, Frank
- 2005
Citation Context ...xperiments reported in this paper, k is a radial basis function (RBF) kernel. 4 Experiments and Results We implemented MICCLLR and the statistical kernel [8] using the Weka machine learning workbench [22]. For both methods, we trained an SMO classifier with the RBF kernel. We tuned C and γ and used the default values for the remaining parameters to get the optimal performance of the SMO classifier...
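The tuning procedure described here (tune C and γ of an RBF-kernel SVM, leave the remaining parameters at their defaults) can be sketched with scikit-learn's grid search; this is an assumption on our part, not the authors' Weka/SMO setup, and the grid values and synthetic data are purely illustrative.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Illustrative data standing in for the meta-instance training set.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Tune only C and gamma, as in the paper's protocol; all other SVC
# parameters stay at their defaults. Grid values are not from the paper.
grid = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]},
    cv=5,
)
grid.fit(X, y)
best_C, best_gamma = grid.best_params_["C"], grid.best_params_["gamma"]
```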

1753 | Additive logistic regression: A statistical view of boosting - Friedman, Hastie, et al. - 1998

690 | Machine Learning
- Mitchell
- 1997
Citation Context ...easily estimated from the training data using standard probability methods based on relative frequencies of each attribute value and class label occurrences observed in the training labeled instances [13]. Step 3 uses the collected statistics to map each bag into a single meta-instance. Let Bi = {Xi1, ..., Xik} be a bag of k instances. Each instance is represented by an ordered tuple of n attribute ...
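The relative-frequency estimation described in this context can be made concrete as follows. This is a minimal sketch for one discrete attribute, assuming the class-conditional log likelihood ratio log P(v|+)/P(v|−) per attribute value v is the statistic collected; the Laplace smoothing term `alpha` is our addition, not something stated in this excerpt.

```python
import math
from collections import Counter

def ccllr_table(instances, labels, attr_index, alpha=1.0):
    """Class-conditional log likelihood ratio for one discrete attribute.

    Estimates P(v | +) and P(v | -) by relative frequencies over the
    labeled training instances (with Laplace smoothing alpha, our own
    addition) and returns log P(v|+)/P(v|-) for each attribute value v.
    """
    pos = Counter(x[attr_index] for x, y in zip(instances, labels) if y == 1)
    neg = Counter(x[attr_index] for x, y in zip(instances, labels) if y == 0)
    values = set(pos) | set(neg)
    n_pos, n_neg, k = sum(pos.values()), sum(neg.values()), len(values)
    return {
        v: math.log((pos[v] + alpha) / (n_pos + alpha * k))
           - math.log((neg[v] + alpha) / (n_neg + alpha * k))
        for v in values
    }

# Toy labeled instances with a single discrete attribute.
X = [("a",), ("a",), ("b",), ("b",), ("b",)]
y = [1, 1, 1, 0, 0]
table = ccllr_table(X, y, attr_index=0)
# table["a"] > 0: value "a" favors the positive class;
# table["b"] < 0: value "b" favors the negative class.
```

A table like this, computed per attribute, is the kind of collected statistic that Step 3 could then aggregate over a bag's instances to build the meta-instance.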

314 | Support vector machines for multipleinstance learning.
- Andrews, Tsochantaridis, et al.
- 2003
Citation Context ...age segments, the underlying assumption is that the object of interest (e.g. elephant) is contained in at least one image segment of the target image. In our experiments, we used three CBIR data sets [16]. Each data ... [interleaved caption of Table 1: Comparison of the performance (% correct ± std. deviation) of MICCLLR and our implementation of the statistical kernel, kstat, with those of other methods on the Musk data sets. All m...]

264 | A framework for multiple-instance learning.
- Maron, Lozano-Perez
- 1998
Citation Context ...o MIL have been investigated in the literature including a MIL variant of the backpropagation algorithm [14], variants of the k-nearest neighbor (k-NN) algorithm [20], the Diverse Density (DD) method [10] and EM-DD [23] which improves on DD by using Expectation Maximization (EM), DD-SVM [4] which trains an SVM in a feature space constructed from a mapping defined by the local maximizers and minimizers...

263 | Solving the multiple-instance problem with axis-parallel rectangles.
- Dietterich, Lathrop, et al.
- 1997
Citation Context ...pplied. Experimental results on a wide range of MI data sets show that MICCLLR is competitive with some of the best performing MIL algorithms reported in the literature. 1. Introduction Dietterich et al. [5] introduced the multiple-instance learning (MIL) problem, motivated by their work on classifying aromatic molecules according to whether or not they are "musky". In this classification task, each molecul...

229 | Multiple-instance learning for natural scene classification.
- Maron, Ratan
- 1998
Citation Context ...e bag must have at least one positive instance. The resulting classification task finds application in drug discovery, identifying Thioredoxin-fold proteins [19], content-based image retrieval (CBIR) [11, 24, 2], and computer aided diagnosis (CAD) [7]. Several approaches to MIL have been investigated in the literature including a MIL variant of the backpropagation algorithm [14], variants of the k-nearest ne...
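The standard MI assumption referenced in this context (a positive bag must have at least one positive instance, i.e. a bag is positive iff any of its instances is positive) can be stated in two lines:

```python
def bag_label(instance_labels):
    """Standard multiple-instance assumption: a bag is positive
    iff at least one of its instances is positive."""
    return int(any(instance_labels))

assert bag_label([0, 0, 1]) == 1  # one positive instance suffices
assert bag_label([0, 0, 0]) == 0  # an all-negative bag is negative
```

The generalized MIL methods cited later in this page relax exactly this rule, letting all instances in a bag contribute to the bag label.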

196 | Image categorization by learning and reasoning with regions.
- Chen, JZ
- 2004
Citation Context ...ation algorithm [14], variants of the k-nearest neighbor (k-NN) algorithm [20], the Diverse Density (DD) method [10] and EM-DD [23] which improves on DD by using Expectation Maximization (EM), DD-SVM [4] which trains an SVM in a feature space constructed from a mapping defined by the local maximizers and minimizers of the DD function, and MI logistic regression (MI/LR) [15]. Most of these methods rel...

165 | EM-DD: An improved multiple-instance learning technique.
- Zhang, Goldman
- 2002
Citation Context ... investigated in the literature including a MIL variant of the backpropagation algorithm [14], variants of the k-nearest neighbor (k-NN) algorithm [20], the Diverse Density (DD) method [10] and EM-DD [23] which improves on DD by using Expectation Maximization (EM), DD-SVM [4] which trains an SVM in a feature space constructed from a mapping defined by the local maximizers and minimizers of the DD func...

157 | Multi-instance kernels.
- Gartner, Flach, et al.
- 2002
Citation Context ...-of-the-art MI methods. The rest of this paper is organized as follows: Section 2 summarizes the formulations of the MIL problem and overviews two related MIL methods, TLC [21] and the statistical kernel [8], that use the same idea of mapping each bag into a single instance. Section 3 introduces our method. Experimental results on data sets from two MI classification tasks and on artificially generated ...

127 | Solving multiple-instance problem: a lazy learning approach.
- Wang, Zucker
- 2000
Citation Context ...gnosis (CAD) [7]. Several approaches to MIL have been investigated in the literature including a MIL variant of the backpropagation algorithm [14], variants of the k-nearest neighbor (k-NN) algorithm [20], the Diverse Density (DD) method [10] and EM-DD [23] which improves on DD by using Expectation Maximization (EM), DD-SVM [4] which trains an SVM in a feature space constructed from a mapping defined ...

101 | Multiple-instance learning via embedded instance selection,”
- Chen, Bi, et al.
- 2006
Citation Context ...d MI logistic regression (MI/LR) [15]. Most of these methods rely on the assumption that a bag is positive if and only if it has at least one positive instance. Alternatively, a number of MIL methods [21, 17, 3] have a generalized view of the MIL problem where all the instances in a bag are assumed to participate in determining the bag label. Against this background, we introduce MICCLLR, a new generalized M...

91 | Content-based image retrieval using multiple instance learning.
- Zhang, Goldman, et al.
- 2002
Citation Context ...e bag must have at least one positive instance. The resulting classification task finds application in drug discovery, identifying Thioredoxin-fold proteins [19], content-based image retrieval (CBIR) [11, 24, 2], and computer aided diagnosis (CAD) [7]. Several approaches to MIL have been investigated in the literature including a MIL variant of the backpropagation algorithm [14], variants of the k-nearest ne...

68 | On learning from multi-instance examples: empirical evaluation of a theoretical approach.
- Auer
- 1997
Citation Context ... [interleaved rows of Table 1, accuracy (%) on the Musk1 / Musk2 data sets:]

| Method | Musk1 | Musk2 |
|---|---|---|
| TLC [21] | 88.7 ± 1.6 | 83.1 ± 3.23 |
| DD-SVM [4] | 85.8 | 91.3 |
| MILES [3] | 86.3 | 87.7 |
| IAPR [5] | 92.4 | 89.2 |
| DD [10] | 88.9 | 82.5 |
| EM-DD [16] | 84.8 | 84.9 |
| MI-SVM [16] | 77.9 | 84.3 |
| mi-SVM [16] | 87.4 | 83.6 |
| MI-NN [14] | 88 | 82 |
| Multinst [1] | 76.7 | 84 |
| MICA [9] | 88.4 | 90.5 |
| CH-FD [7] | 88.8 | 85.7 |

...set corresponds to one of three different categories, namely Elephant, Fox, and Tiger. For each category, the data set has 100 positive and 100 negative...

66 | Supervised versus multiple instance learning: an empirical comparison.
- Ray, Craven
- 2005
Citation Context ...n Maximization (EM), DD-SVM [4] which trains an SVM in a feature space constructed from a mapping defined by the local maximizers and minimizers of the DD function, and MI logistic regression (MI/LR) [15]. Most of these methods rely on the assumption that a bag is positive if and only if it has at least one positive instance. Alternatively, a number of MIL methods [21, 17, 3] have a generalized view o...

49 | A two-level learning method for generalized multi-instance problem.
- Weidmann, Frank, et al.
- 2003
Citation Context ...d MI logistic regression (MI/LR) [15]. Most of these methods rely on the assumption that a bag is positive if and only if it has at least one positive instance. Alternatively, a number of MIL methods [21, 17, 3] have a generalized view of the MIL problem where all the instances in a bag are assumed to participate in determining the bag label. Against this background, we introduce MICCLLR, a new generalized M...

41 | Multi instance neural networks.
- Ramon, Raedt
- 2000
Citation Context ...image retrieval (CBIR) [11, 24, 2], and computer aided diagnosis (CAD) [7]. Several approaches to MIL have been investigated in the literature including a MIL variant of the backpropagation algorithm [14], variants of the k-nearest neighbor (k-NN) algorithm [20], the Diverse Density (DD) method [10] and EM-DD [23] which improves on DD by using Expectation Maximization (EM), DD-SVM [4] which trains an ...

36 | SVM-based generalized multiple-instance learning via approximate box counting
- Tao, Scott, et al.
- 2004
Citation Context ...d MI logistic regression (MI/LR) [15]. Most of these methods rely on the assumption that a bag is positive if and only if it has at least one positive instance. Alternatively, a number of MIL methods [21, 17, 3] have a generalized view of the MIL problem where all the instances in a bag are assumed to participate in determining the bag label. Against this background, we introduce MICCLLR, a new generalized M...

31 | A sparse support vector machine approach to region-based image categorization.
- Bi, Chen, et al.
- 2005
Citation Context ...e bag must have at least one positive instance. The resulting classification task finds application in drug discovery, identifying Thioredoxin-fold proteins [19], content-based image retrieval (CBIR) [11, 24, 2], and computer aided diagnosis (CAD) [7]. Several approaches to MIL have been investigated in the literature including a MIL variant of the backpropagation algorithm [14], variants of the k-nearest ne...

20 | Multiple instance classification via successive linear programming - Mangasarian, Wild - 2008

15 | Identifying predictive structures in relational data using multiple instance learning
- McGovern, Jensen
- 2003
Citation Context ...es of violating the iid assumption as a result of autocorrelation among instances in probability estimation have been explored and addressed in the context of multi-relational learning in the work of [12]. Because the instances within a bag are likely to be autocorrelated, it would be interesting to explore variants of MIL algorithms (including MICCLLR) that use statistical estimators that correct for...

13 | A study in modeling low-conservation protein superfamilies
- Wang, Scott, et al.
- 2004
Citation Context ... positively labeled instance, and a positive bag must have at least one positive instance. The resulting classification task finds application in drug discovery, identifying Thioredoxin-fold proteins [19], content-based image retrieval (CBIR) [11, 24, 2], and computer aided diagnosis (CAD) [7]. Several approaches to MIL have been investigated in the literature including a MIL variant of the backpropag...

1 | Multiple instance learning for computer aided diagnosis
- Fung, Dundar, et al.
- 2006
Citation Context ...The resulting classification task finds application in drug discovery, identifying Thioredoxin-fold proteins [19], content-based image retrieval (CBIR) [11, 24, 2], and computer aided diagnosis (CAD) [7]. Several approaches to MIL have been investigated in the literature including a MIL variant of the backpropagation algorithm [14], variants of the k-nearest neighbor (k-NN) algorithm [20], the Divers...