Results 1 - 4 of 4
THE NUMBER OF SMALL COVERS OVER CUBES
Cited by 2 (0 self)
Abstract. In the present paper we find a bijection between the set of small covers over an n-cube and the set of acyclic digraphs with n labeled nodes. Using this, we give a formula for the number of small covers over an n-cube (more generally, a product of simplices) up to Davis-Januszkiewicz equivalence classes and Z_2^n-equivariant diffeomorphism classes. Moreover, we prove that the number of acyclic digraphs with n unlabeled nodes is an upper bound for the number of small covers over an n-cube up to diffeomorphism.
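The count of acyclic digraphs on n labeled nodes that this abstract refers to is the sequence 1, 3, 25, 543, ... (OEIS A003024), and it satisfies Robinson's recurrence a(n) = Σ_{k=1}^{n} (-1)^{k+1} C(n,k) 2^{k(n-k)} a(n-k). As an illustrative sketch (not taken from the paper itself), the recurrence can be computed directly:

```python
from math import comb

def count_labeled_dags(n):
    # Number of acyclic digraphs on n labeled nodes (OEIS A003024),
    # via Robinson's recurrence:
    #   a(n) = sum_{k=1}^{n} (-1)^(k+1) * C(n, k) * 2^(k*(n-k)) * a(n-k)
    # The binomial chooses which k nodes have in-degree 0; the power of 2
    # counts the possible edges from those k sources to the remaining n-k nodes.
    a = [1]  # a(0) = 1: the empty digraph
    for m in range(1, n + 1):
        total = 0
        for k in range(1, m + 1):
            total += (-1) ** (k + 1) * comb(m, k) * 2 ** (k * (m - k)) * a[m - k]
        a.append(total)
    return a[n]

# count_labeled_dags(3) -> 25, count_labeled_dags(4) -> 543
```

By the bijection stated in the abstract, these same numbers count small covers over the n-cube up to Davis-Januszkiewicz equivalence.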
Tree Augmented Classification of Binary Data Minimizing Stochastic Complexity
, 2002
"... We establish the algorithms and procedures that augment by trees the classfiers of binary feature vectors in (Gyllenberg et. al. 1993, 1997, Gyllenberg et. al. 1999 and Gyllenberg and Koski 2002). The notion of augmenting a classifier by a tree is due to (Chow and Liu 1968) and in a more extensive f ..."
Cited by 1 (1 self)
We establish the algorithms and procedures that augment by trees the classifiers of binary feature vectors in (Gyllenberg et al. 1993, 1997, Gyllenberg et al. 1999, and Gyllenberg and Koski 2002). The notion of augmenting a classifier by a tree is due to (Chow and Liu 1968) and, in a more extensive form, to (Friedman et al. 1997). These techniques will, in another report, be applied primarily to unsupervised classification of bacterial DNA fingerprints (or electrophoretic patterns), cf. (Gyllenberg and Koski 2001 (a), Rademaker et al. 1999). By classification we mean here both the (unsupervised) procedure of finding the classes in (training) data of items and the actual outcome of the procedure, i.e., a partitioning of the items. By identification we mean the procedures for assigning items to classes that are pre-established in one way or another. The distinction should be clear, although the algorithms of classification given in the sequel will also...
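The Chow-Liu construction referenced above augments a classifier with a tree by building a maximum-weight spanning tree over pairwise empirical mutual information between features. A minimal sketch for binary feature vectors, using a Prim-style greedy loop (the function names and structure here are illustrative, not taken from the report):

```python
from collections import Counter
from math import log

def mutual_information(data, i, j):
    # Empirical mutual information I(X_i; X_j) from a list of feature tuples.
    n = len(data)
    ci = Counter(row[i] for row in data)
    cj = Counter(row[j] for row in data)
    cij = Counter((row[i], row[j]) for row in data)
    mi = 0.0
    for (a, b), nab in cij.items():
        pab = nab / n
        mi += pab * log(pab / ((ci[a] / n) * (cj[b] / n)))
    return mi

def chow_liu_tree(data, d):
    # Maximum-weight spanning tree on d features, with mutual information
    # as the edge weight (Prim's algorithm, rooted arbitrarily at feature 0).
    in_tree = {0}
    edges = []
    while len(in_tree) < d:
        best = max(((u, v, mutual_information(data, u, v))
                    for u in in_tree for v in range(d) if v not in in_tree),
                   key=lambda e: e[2])
        edges.append((best[0], best[1]))
        in_tree.add(best[1])
    return edges
```

Chow and Liu (1968) showed this tree maximizes the likelihood among all tree-structured distributions, which is why it is the natural way to "augment by trees" a Naive-Bayes-style classifier of binary vectors.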
Corresponding Author's Institution:
"... Abstract: It is a wellknown fact that the Bayesian Networks ' (BNs) use as classifiers in different fields of application has recently witnessed a noticeable growth. Yet, the Naïve Bayes ' application, and even the augmented Naïve Bayes', to classifierstructure learning, has been vulnerable to cer ..."
Abstract: It is a well-known fact that the use of Bayesian Networks (BNs) as classifiers in different fields of application has recently witnessed noticeable growth. Yet the application of Naïve Bayes, and even of augmented Naïve Bayes, to classifier-structure learning has been subject to certain limits, which explains practitioners' resort to other, more sophisticated types of algorithms. Consequently, the use of such algorithms raises the problem of a super-exponential increase in the computational complexity of Bayesian classifier structure learning as the number of descriptive variables grows. In this context, the present work's major objective is to set up a further solution that remedies the intricate algorithmic complexity incurred when learning the structure of Bayesian classifiers with sophisticated algorithms. The paper is organized as follows. We start by stating the definition of BNs along with the problems related to learning their structure from data. We then propose a novel approach designed to reduce the algorithmic complexity, without any loss of information, when learning the structure of a Bayesian classifier. Finally, our approach is tested on a car diagnosis, a Lymphography diagnosis, and a cardiac disease diagnosis database, followed by a discussion of our results and an exposition of the interest of the conducted work as a closing step.
A new approach for Bayesian classifier learning structure via K2 Algorithm
"... Abstract. It is a wellknown fact that the Bayesian Networks ’ (BNs) use as classifiers in different fields of application has recently witnessed a noticeable growth. Yet, the Naïve Bayes ’ application, and even the augmented Naïve Bayes’, to classifierstructure learning, has been vulnerable to cer ..."
Abstract. It is a well-known fact that the use of Bayesian Networks (BNs) as classifiers in different fields of application has recently witnessed noticeable growth. Yet the application of Naïve Bayes, and even of augmented Naïve Bayes, to classifier-structure learning has been subject to certain limits, which explains practitioners' resort to other, more sophisticated types of algorithms. Consequently, the use of such algorithms raises the problem of a super-exponential increase in the computational complexity of Bayesian classifier structure learning as the number of descriptive variables grows. In this context, the present work's major objective is to set up a further solution that remedies the intricate algorithmic complexity incurred when learning the structure of Bayesian classifiers with sophisticated algorithms. The paper is organized as follows. We start by proposing a novel approach designed to reduce the algorithmic complexity, without any loss of information, when learning the structure of a Bayesian classifier. We then test our approach on a car diagnosis and a Lymphography diagnosis database. Finally, an exposition of the interest of the conducted work closes the paper.
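The K2 algorithm named in the title mitigates the super-exponential search over DAGs by fixing a node ordering and greedily adding, for each node, the predecessor that most improves a Bayesian score until no addition helps. A minimal sketch assuming discrete variables with r states each and the Cooper-Herskovits score (the function names are illustrative; this is not the authors' implementation):

```python
from collections import defaultdict
from math import lgamma

def k2_log_score(data, child, parents, r=2):
    # Cooper-Herskovits log-score for one node: for each observed parent
    # configuration j, add log((r-1)! / (N_j + r - 1)!) + sum_k log(N_jk!),
    # where N_jk counts rows with parent config j and child state k.
    counts = defaultdict(lambda: defaultdict(int))
    for row in data:
        j = tuple(row[p] for p in parents)
        counts[j][row[child]] += 1
    score = 0.0
    for ck in counts.values():
        nj = sum(ck.values())
        score += lgamma(r) - lgamma(nj + r)
        for nk in ck.values():
            score += lgamma(nk + 1)
    return score

def k2(data, order, max_parents=2):
    # Greedy K2 search: only predecessors in the ordering are candidate
    # parents, which keeps the resulting digraph acyclic by construction.
    parents = {v: [] for v in order}
    for idx, v in enumerate(order):
        current = k2_log_score(data, v, parents[v])
        candidates = set(order[:idx])
        while len(parents[v]) < max_parents and candidates:
            best_c, best_s = None, current
            for c in candidates:
                s = k2_log_score(data, v, parents[v] + [c])
                if s > best_s:
                    best_c, best_s = c, s
            if best_c is None:
                break  # no candidate improves the score; stop for this node
            parents[v].append(best_c)
            candidates.remove(best_c)
            current = best_s
    return parents
```

The quality of the result depends heavily on the chosen ordering, which is precisely the kind of input a complexity-reduction heuristic such as the one proposed in this paper can supply.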