Results 1 – 2 of 2
Partial Solution and Entropy
MFCS 2009, LNCS 5734, 2009
"... Abstract. If the given problem instance is partially solved, we want to minimize our effort to solve the problem using that information. In this paper we introduce the measure of entropy H(S) for uncertainty in partially solved input data S(X) = (X1,..., Xk), where X is the entire data set, and eac ..."
Abstract

Cited by 1 (1 self)
Abstract. If the given problem instance is partially solved, we want to minimize our effort to solve the problem using that information. In this paper we introduce the measure of entropy H(S) for uncertainty in partially solved input data S(X) = (X1, ..., Xk), where X is the entire data set and each Xi is already solved. We use the entropy measure to analyze three example problems: sorting, shortest paths and minimum spanning trees. For sorting, Xi is an ascending run, and for shortest paths, Xi is an acyclic part of the given graph. For minimum spanning trees, Xi is interpreted as a partially obtained minimum spanning tree for a subgraph. The entropy measure H(S) is defined by regarding pi = |Xi|/|X| as a probability measure, that is, H(S) = -n Σ_{i=1}^{k} pi log pi, where n = Σ_{i=1}^{k} |Xi|. Then we show that we can sort the input data S(X) in O(H(S)) time, and solve the shortest path problem in O(m + H(S)) time, where m is the number of edges of the graph. Finally we show that the minimum spanning tree is computed in O(m + H(S)) time.
Keywords: entropy, complexity, adaptive sort, minimal mergesort, ascending runs, shortest paths, nearly acyclic graphs, minimum spanning trees
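The entropy measure described in the abstract can be sketched directly in Python. This is an illustrative helper, not code from the paper; the log base is not specified in the abstract, so base 2 is assumed here:

```python
import math

def entropy(runs):
    """Entropy of a partially solved input S(X) = (X1, ..., Xk):
    H(S) = -n * sum(p_i * log2(p_i)), where p_i = |Xi| / n
    and n = sum of |Xi| over all solved parts."""
    n = sum(len(r) for r in runs)
    h = 0.0
    for r in runs:
        p = len(r) / n
        if p > 0:
            h -= p * math.log2(p)
    return n * h

# Three ascending runs of sizes 3, 2, 3 (n = 8):
runs = [[1, 4, 7], [2, 3], [5, 6, 8]]
print(entropy(runs))
```

When all n elements sit in a single solved run, H(S) = 0 and no further work is needed; when the input is split into n singleton runs, H(S) = n log n, matching the usual sorting bound.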
Entropy as Computational Complexity
"... Abstract. If the given problem instance is partially solved, we want to minimize our effort to solve the problem using that information. In this paper we introduce the measure of entropy, H(S), for uncertainty in partially solved input data S(X) = (X1,..., Xk), where X is the entire data set, and e ..."
Abstract
Abstract. If the given problem instance is partially solved, we want to minimize our effort to solve the problem using that information. In this paper we introduce the measure of entropy, H(S), for uncertainty in partially solved input data S(X) = (X1, ..., Xk), where X is the entire data set and each Xi is already solved. We propose a generic algorithm that merges the Xi's repeatedly, and finishes when k becomes 1. We use the entropy measure to analyze three example problems: sorting, shortest paths and minimum spanning trees. For sorting, Xi is an ascending run, and for minimum spanning trees, Xi is interpreted as a partially obtained minimum spanning tree for a subgraph. For shortest paths, Xi is an acyclic part of the given graph. When k is small, the graph can be regarded as nearly acyclic. The entropy measure H(S) is defined by regarding pi = |Xi|/|X| as a probability measure, that is, H(S) = -n Σ_{i=1}^{k} pi log pi, where n = Σ_{i=1}^{k} |Xi|. We show that we can sort the input data S(X) in O(H(S)) time, and that we can complete the minimum cost spanning tree in O(m + H(S)) time, where m is the number of edges. Then we solve the shortest path problem in O(m + H(S)) time. Finally we define a dual entropy on the partitioning process, whereby we give time bounds on a generic quicksort and on the shortest path problem for another kind of nearly acyclic graph.
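The generic merge-until-k-equals-1 algorithm described above can be sketched for the sorting case. The merge order is an assumption here (merging the two shortest runs first, Huffman-style, is one natural way to keep total merge work near the entropy bound); the paper's minimal mergesort may differ in detail:

```python
import heapq

def merge_two(a, b):
    """Standard linear merge of two ascending runs."""
    out = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])
    out.extend(b[j:])
    return out

def entropy_sort(runs):
    """Repeatedly merge the two shortest remaining runs until k == 1.
    The heap keeps runs ordered by length (a tiebreak index avoids
    comparing list contents)."""
    heap = [(len(r), idx, r) for idx, r in enumerate(runs)]
    heapq.heapify(heap)
    counter = len(runs)
    while len(heap) > 1:
        n1, _, r1 = heapq.heappop(heap)
        n2, _, r2 = heapq.heappop(heap)
        heapq.heappush(heap, (n1 + n2, counter, merge_two(r1, r2)))
        counter += 1
    return heap[0][2] if heap else []

print(entropy_sort([[1, 4, 7], [2, 3], [5, 6, 8]]))  # → [1, 2, 3, 4, 5, 6, 7, 8]
```

Merging short runs first mirrors Huffman coding: each element is touched roughly as many times as the depth of its run in the merge tree, so the total work tracks H(S) rather than the worst-case n log n.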