## Entropy as Computational Complexity

### BibTeX

```bibtex
@misc{Takaoka_entropyas,
  author = {Tadao Takaoka and Yuji Nakagawa},
  title  = {Entropy as Computational Complexity},
  year   = {}
}
```

### Abstract

If the given problem instance is partially solved, we want to minimize the effort needed to solve the problem using that information. In this paper we introduce the measure of entropy, H(S), for uncertainty in partially solved input data S(X) = (X1, ..., Xk), where X is the entire data set and each Xi is already solved. We propose a generic algorithm that merges the Xi repeatedly and finishes when k becomes 1. We use the entropy measure to analyze three example problems: sorting, shortest paths, and minimum spanning trees. For sorting, Xi is an ascending run; for minimum spanning trees, Xi is interpreted as a partially obtained minimum spanning tree of a subgraph; for shortest paths, Xi is an acyclic part of the given graph. When k is small, the graph can be regarded as nearly acyclic. The entropy measure H(S) is defined by regarding pi = |Xi|/|X| as a probability measure, that is, H(S) = −n ∑_{i=1}^{k} pi log pi, where n = ∑_{i=1}^{k} |Xi|. We show that we can sort the input data S(X) in O(H(S)) time, and that we can complete the minimum cost spanning tree in O(m + H(S)) time, where m is the number of edges. Then we solve the shortest path problem in O(m + H(S)) time. Finally we define a dual entropy on the partitioning process, whereby we give time bounds on a generic quicksort and on the shortest path problem for another kind of nearly acyclic graph.
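As a concrete illustration (not from the paper), the entropy of a run decomposition can be computed directly from the run lengths; this sketch assumes base-2 logarithms and uses a hypothetical helper name `entropy`:

```python
import math

def entropy(run_lengths):
    """H(S) of a partially solved input S(X) = (X1, ..., Xk):
    H(S) = -n * sum(p_i * log2(p_i)) with p_i = |Xi| / |X|,
    where run_lengths holds the sizes |Xi| and n = sum(|Xi|)."""
    n = sum(run_lengths)
    return -n * sum((ni / n) * math.log2(ni / n) for ni in run_lengths)

# 8 elements split into 4 equal runs: H(S) = 8 * log2(4) = 16.
print(entropy([2, 2, 2, 2]))  # 16.0
# One fully sorted run: no remaining uncertainty.
print(entropy([8]))
```

The n factor, which the paper includes deliberately, scales the usual Shannon entropy of the distribution (p1, ..., pk) up to a work measure in element steps.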

### Citations

829 | A note on two problems in connection with graphs
- Dijkstra
- 1959

Citation Context: ... v to be deleted. The complexity O(log nv), not O(log n), is crucial. A delete is defined on a Fibonacci heap in [1]. Example 1. Suppose we delete node(4) in Fig. 1. Then we have the following T = 1T(3) given in Fig. 2. [Fibonacci heap figure omitted] ...

604 | Data structures and network algorithms
- Tarjan
- 1983

Citation Context: ...e standard single source algorithm. Note that we use the same symbol S for the state of data and the solution set, hoping this is not a source of confusion. We give the following well-known algorithm [14] and its correctness for acyclic graphs for the sake of completeness. See [14] for the proof. It runs in O(m) time, that is, we do not need an operation of finding the minimum in the priority queue. ...

577 | Fibonacci heaps and their uses in improved network optimization algorithms
- Fredman, Tarjan
- 1987

Citation Context: ...heap after two decrease-key operations. Let us express trunks by tuples and the node with label x by node(x). After node(4) is deleted, we have a 2T(0) with (12, 19) as the main trunk and a 2T(1) with (5, 16, 22) and (15, 17) connected in heap order. After (12, 14, 19) is formed ... [Fibonacci heap figure omitted]

164 | The Art of Computer Programming
- Knuth
- 1998

Citation Context: ...he length of list X by |X|. S(X) is abbreviated as S. Note that a(i)1 ≤ · · · ≤ a(i)ni for each Xi, and a(i)ni > a(i+1)1 if Xi is not the last list. The sort algorithm called natural merge sort [6] sorts X by merging two adjacent lists in each phase, halving the number of ascending runs after each phase, so that sorting is completed in O(n log k) time. Mannila [7] proved that this method is opt...
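To make the mechanism concrete, here is a minimal sketch (not the paper's code) of natural merge sort as the context describes it: split the input into maximal ascending runs, then merge adjacent runs pairwise in phases, halving the number of runs k each phase for O(n log k) total time:

```python
def merge(a, b):
    """Standard two-way merge of two sorted lists."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    return out + a[i:] + b[j:]

def natural_merge_sort(xs):
    """Natural merge sort: detect ascending runs, then merge
    adjacent runs pairwise until a single run remains."""
    if not xs:
        return []
    # 1. Split into maximal ascending runs.
    runs, run = [], [xs[0]]
    for x in xs[1:]:
        if x >= run[-1]:
            run.append(x)
        else:
            runs.append(run)
            run = [x]
    runs.append(run)
    # 2. Merge adjacent runs pairwise, one phase per pass.
    while len(runs) > 1:
        merged = [merge(runs[i], runs[i + 1])
                  for i in range(0, len(runs) - 1, 2)]
        if len(runs) % 2:
            merged.append(runs[-1])
        runs = merged
    return runs[0]
```

Note this pairwise schedule attains the O(n log k) bound mentioned here; the paper's O(H(S)) bound requires a more careful, entropy-sensitive choice of which runs to merge.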

65 | A survey of adaptive sorting algorithms
- Estivill-Castro, Wood
- 1992
Citation Context: ...aptive sorting is to sort the list of n numbers into increasing order as efficiently as possible by utilizing the structure of the list, which reflects some presortedness. See Estivill-Castro and Wood [4] for a general survey on adaptive sorting. There are many measures of disorder or presortedness. The simplest one is the number of ascending runs in the list. Let the given list X = (a1, a2, · · · , a...

39 | A fast merging algorithm
- Brown, Tarjan
- 1979

Citation Context: ... concepts. Let |X| = n, ni = |Xi| and pi = ni/n. Note that ∑ pi = 1. We define the entropy of a decomposition of X, H(S(X)), abbreviated as H(S), by H(S) = −n ∑_{i=1}^{k} pi log pi = ∑_{i=1}^{k} |Xi| log(|X|/|Xi|) (2). The entropy is regarded as a negative aspect of the data, i.e., the less entropy, the closer to the solution. Normally entropy is defined without the factor of n, the size of the data set. We include...
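The two closed forms in (2) agree by substituting p_i = |X_i|/|X| and n = |X|; spelled out:

```latex
-n\sum_{i=1}^{k} p_i \log p_i
  \;=\; -n\sum_{i=1}^{k} \frac{|X_i|}{n}\,\log\frac{|X_i|}{n}
  \;=\; \sum_{i=1}^{k} |X_i|\,\log\frac{n}{|X_i|}
  \;=\; \sum_{i=1}^{k} |X_i|\,\log\frac{|X|}{|X_i|}
```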

26 | Measures of presortedness and optimal sorting algorithms
- Mannila
- 1985

Citation Context: ...ithm called natural merge sort [6] sorts X by merging two adjacent lists in each phase, halving the number of ascending runs after each phase, so that sorting is completed in O(n log k) time. Mannila [7] proved that this method is optimal under the measure of the number of ascending runs. In this paper we generalize the measure RUNS(S) of the number of ascending runs into that of the entropy of asce...

7 | Are Fibonacci heaps optimal?
- Abuaiadh, Kingston

Citation Context: ...ber of the subtrees, which is O(log nv), where nv is the number of descendants of node v to be deleted. The complexity O(log nv), not O(log n), is crucial. A delete is defined on a Fibonacci heap in [1]. Example 1. Suppose we delete node(4) in Fig. 1. Then we have the following T = 1T(3) given in Fig. 2. [Fibonacci heap figure omitted] ...

6 | Shortest Path Algorithms for Nearly Acyclic Directed Graphs
- Takaoka
- 1996
Citation Context: ...4.1. Line 4.2 is to obtain shortest distances within sc-components, whereas lines 6.1 and 6.2 are to update distances through edges between sc-components. The correctness of this algorithm is given in [12], in which the time complexity is given by O(m + n log r). We give a sharper analysis by the dual entropy. The time for line 1 is O(m). The time spent at line 6.2 is O(m) in total. The time for the GS...

5 | Theory of 2-3 Heaps
- Takaoka
- 2003

2 | A Difficulty Estimation Method for Multidimensional Nonlinear 0-1 Knapsack Problems Using Entropy
- Nakagawa
- 2004

Citation Context: ...graphs, minimum spanning trees. 1 Introduction. The concept of entropy is successfully used in information and communication theory. In algorithm research, the idea is used explicitly or implicitly. In [8], entropy is explicitly used to navigate the computation of the knapsack problem. (Footnote 1: Preliminary versions of this paper appeared at CATS 1997 [11] and MFCS 2009 [13]. Part of this research was done whi... |

2 | Solving shortest paths efficiently on nearly acyclic directed graphs
- Saunders, Takaoka
- 2007
Citation Context: ...r nearly acyclic graphs. If the given directed graph with non-negative edge costs is nearly acyclic, we can solve the single source shortest path (SSSP) problem faster than on a general graph, as noted in [9], [1], etc. The main purpose of this section is to investigate the possibility of measuring the degree of acyclicity of the given graph by entropy. As the degree of acyclicity is determined by the SSS...

2 |
Entropy – Measure of Disorder
- Takaoka
- 1998
Citation Context: ...research, the idea is used explicitly or implicitly. In [8], entropy is explicitly used to navigate the computation of the knapsack problem. (Footnote 1: Preliminary versions of this paper appeared at CATS 1997 [11] and MFCS 2009 [13]. Part of this research was done while the first author was on leave at Kansai University.) On the other hand, entropy is used implicitly to analyze the computing time of various adap...

1 | Partial Solution and Entropy
- Takaoka
- 2009
Citation Context: ...is used explicitly or implicitly. In [8], entropy is explicitly used to navigate the computation of the knapsack problem. (Footnote 1: Preliminary versions of this paper appeared at CATS 1997 [11] and MFCS 2009 [13]. Part of this research was done while the first author was on leave at Kansai University.) On the other hand, entropy is used implicitly to analyze the computing time of various adaptive sorting algori...