Results

### Reconceiving Machine Learning: Aims and Background


Abstract

"Beware of the man of one method or one instrument, either experimental or theoretical. He tends to become method oriented rather than problem oriented. The method-oriented man is shackled: the problem-oriented man is at least reaching freely toward what is most important."

Context: Machine Learning is a sub-discipline of Information and Communication Technology (ICT) that develops the technologies for machines to recognise and learn patterns in data. It is distinct from, although related to, statistics: it can be differentiated by its focus on creating technology rather than on the human-centred analysis of data. It is the science and engineering behind Data Mining. Machine learning is pervasive: it plays a key role in all stages of the scientific process and across diverse fields including bioinformatics, engineering and finance. It is widely accepted that ICT plays an enabling role across almost all technological disciplines. Analogously, machine learning plays an enabling role across most parts of ICT, from embedded to enterprise systems, and is consequently a crucial enabler of the Digital Economy [16]. Vast quantities of data are now routinely collected and stored because it is affordable to do so; machine learning makes sense of this data flood.

The Problem: The massive reduction in the cost of collecting, storing, transporting and processing ...

### Estimating the null distribution for conditional inference and genome-scale screening

, 2009

### A Game-Theoretic Framework for Blending Bayesian and Frequentist Methods of Statistical Inference


Abstract

Papers compiled in Good (1983) made first attempts at combining attractive aspects of Bayesian and frequentist approaches to statistical inference. While the hybrid inference approach of Yuan (2009) succeeded in leveraging Bayesian point estimators with maximum likelihood estimates, hybrid inference does not yet cover the case of a parameter of interest that has a partially known prior. Since such partial knowledge of a prior occurs in many scientific inference situations, it calls for a theoretical framework for method development that appropriately blends Bayesian and frequentist methods by meeting these criteria:

1. Complete knowledge of the prior. If the prior is known, the corresponding posterior is used for inference. Among statisticians, this principle is almost universally acknowledged. However, it is rarely the case that the prior is essentially known.
2. Negligible knowledge of the prior. If there is no reliable knowledge of a prior, inference is based on methods that do not require such knowledge. This principle motivates not only the development of confidence intervals and p-values but also Bayesian ...

This research was partially supported by the Canada Foundation for Innovation, by the Ministry of Research ...
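The two extreme criteria in the abstract can be sketched concretely. Below is a minimal, illustrative Python example for a normal mean with known variance: with a fully known prior (criterion 1) one reports the conjugate posterior, and with no reliable prior (criterion 2) one falls back on a frequentist confidence interval. The function names, model, and numbers are assumptions for illustration, not the paper's framework.

```python
import math

def posterior_known_prior(x_bar, n, sigma2, mu0, tau2):
    """Criterion 1: prior N(mu0, tau2) fully known -> conjugate normal posterior."""
    prec = n / sigma2 + 1.0 / tau2          # posterior precision
    mean = (n * x_bar / sigma2 + mu0 / tau2) / prec
    return mean, 1.0 / prec                  # posterior mean and variance

def confidence_interval(x_bar, n, sigma2, z=1.96):
    """Criterion 2: no reliable prior -> 95% frequentist confidence interval."""
    half = z * math.sqrt(sigma2 / n)
    return x_bar - half, x_bar + half

# Illustrative data summary: sample mean 1.2 from n = 25 draws, variance 4.
post_mean, post_var = posterior_known_prior(x_bar=1.2, n=25, sigma2=4.0,
                                            mu0=0.0, tau2=1.0)
lo, hi = confidence_interval(x_bar=1.2, n=25, sigma2=4.0)
```

The partially known prior case the paper targets sits between these two functions, which is precisely the gap the proposed framework is meant to fill.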

### Direct Bayes for Interest Parameters


Abstract

For an interest parameter ψ(θ), the Bayesian method eliminates the nuisance parameter λ(θ) by integrating with respect to a conditional prior for λ given ψ. This conditional prior may be difficult to specify in both the subjective and objective Bayesian contexts. We propose the use of results from likelihood theory that give highly accurate third-order determinations of various marginal distributions in the continuous case. The appropriate conditional prior exists as part of the calculations for the marginal distributions and corresponds to an integration used in the marginalization. To third order, however, the integrated likelihood for the interest parameter can be obtained directly; it needs no model information beyond that commonly available for accurate analyses of likelihood. Simple examples are given where the steps are analytically available to illustrate this direct Bayesian calculation.
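The marginalization step being described, L(ψ) = ∫ L(ψ, λ) π(λ | ψ) dλ, can be sketched numerically. The following Python example eliminates a nuisance standard deviation λ from a normal likelihood by grid quadrature against an illustrative conditional prior. The model, the prior shape, and the grids are assumptions for demonstration; the paper's contribution is a third-order analytic route that avoids exactly this kind of brute-force integration.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=2.0, size=30)   # interest psi = mean, nuisance lam = sd

def log_lik(psi, lam, y):
    """Normal log-likelihood with mean psi and standard deviation lam."""
    return np.sum(-0.5 * ((y - psi) / lam) ** 2) - len(y) * np.log(lam)

lam_grid = np.linspace(0.5, 6.0, 400)
dlam = lam_grid[1] - lam_grid[0]
prior = 1.0 / (1.0 + lam_grid ** 2)            # illustrative conditional prior pi(lam | psi)
prior /= prior.sum() * dlam                    # normalize on the grid

psis = np.linspace(-1.0, 3.0, 81)
LL = np.array([[log_lik(p, lam, y) for lam in lam_grid] for p in psis])
vals = np.exp(LL - LL.max()) * prior           # one global shift keeps psi values comparable
profile = vals.sum(axis=1) * dlam              # integrated likelihood L(psi) on the psi grid
psi_hat = psis[np.argmax(profile)]             # mode of the integrated likelihood
```

For this normal model the integrated likelihood is symmetric about the sample mean, so the mode lands at the grid point nearest ȳ; this gives a quick sanity check on the quadrature.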