Results 1–7 of 7
Thresholding of Badly Illuminated Document Images through Photometric Correction
 ACM DocEng
, 2007
Abstract

Cited by 4 (2 self)
This paper presents a document image thresholding technique that binarizes badly illuminated document images through photometric correction. Based on the observation that illumination normally varies smoothly and that document images often contain a uniformly colored background, the global shading variation is estimated using a two-dimensional Savitzky-Golay filter that fits a least-squares polynomial surface to the luminance of a badly illuminated document image. With the global shading variation known, the shading degradation is corrected through a compensation process that produces an image with roughly uniform illumination. Badly illuminated document images are then binarized by globally thresholding the compensated images. Experiments show that the proposed thresholding technique is fast, robust, and efficient for the binarization of badly illuminated document images.
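The pipeline described in this abstract (fit a smooth polynomial surface to the luminance, divide it out to flatten the illumination, then apply a global threshold) can be sketched as follows. This is a minimal illustration: an ordinary least-squares polynomial surface stands in for the paper's two-dimensional Savitzky-Golay filter, and the surface degree and threshold value are assumptions, not the paper's settings.

```python
import numpy as np

def estimate_shading(lum, deg=2):
    """Fit a least-squares polynomial surface to a luminance image.

    A stand-in for the paper's 2-D Savitzky-Golay surface fit: the
    smooth fit tracks the global shading, not the text strokes.
    """
    h, w = lum.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x = xx.ravel() / w
    y = yy.ravel() / h
    # Design matrix of all monomials x^i * y^j with i + j <= deg.
    cols = [x**i * y**j for i in range(deg + 1) for j in range(deg + 1 - i)]
    A = np.stack(cols, axis=1)
    coef, *_ = np.linalg.lstsq(A, lum.ravel(), rcond=None)
    return (A @ coef).reshape(h, w)

def binarize(lum, deg=2, thresh=0.5):
    """Compensate the shading, then apply a single global threshold."""
    shading = estimate_shading(lum, deg)
    compensated = lum / np.maximum(shading, 1e-6)  # roughly uniform now
    return (compensated > thresh).astype(np.uint8)
```

On a synthetic page with a strong left-to-right illumination ramp, the compensation step lifts the dark side of the background back to the bright side's level, so one fixed threshold separates ink from paper everywhere.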
Application of multiple moving approximation with polynomials in curve smoothing
 Journal of KONES Powertrain and Transport
, 2010
Abstract

Cited by 1 (0 self)
The paper characterizes the method of multiple moving least-squares approximation with polynomials, known as the Savitzky-Golay filter. The method enables smoothing of measurement series, decomposition and separation of disturbances, generation of derivatives, and approximate integration of measurement series. The smoothing properties of the method, as well as its ability to separate and decompose disturbances, are shown on examples of processing a selected indicator diagram. Attention is drawn to the need for additional special filters in the case of abnormal impulse deviations. The examples show that, for the analyzed curve, the results of smoothing with several wavelet filters from the Wavelet Explorer package are worse than those obtained with the moving approximation methods. The derivatives of the curve smoothed with wavelet filters exhibit significant oscillations, which is typical of whole-interval approximation with spline functions. One should highlight the simplicity of the multiple moving approximation algorithms and the resulting high speed of operation, which makes the method particularly well suited to on-line data processing. The authors present their own programs for measurement data processing by the method of multiple moving approximation with polynomials, with approximating polynomials selectable up to the fifth degree. The programs were developed in the Excel and Delphi environments.
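The core idea the abstract describes, moving least-squares approximation with polynomials, can be sketched in a few lines: at each sample, fit a low-degree polynomial to a sliding window of neighbours and evaluate the fit (or its derivative) at the window centre. The window size and polynomial degrees below are illustrative choices, not taken from the paper.

```python
import numpy as np

def moving_poly_smooth(y, half=10, deg=3, deriv=0, dx=1.0):
    """Moving least-squares polynomial fit (Savitzky-Golay style).

    At each interior point, fit a degree-`deg` polynomial to the
    2*half+1 surrounding samples (spaced `dx` apart) and evaluate
    the fit, or its `deriv`-th derivative, at the window centre.
    Edge samples without a full window are left as NaN.
    """
    n = len(y)
    out = np.full(n, np.nan)
    x = np.arange(-half, half + 1) * dx  # window abscissae, centred at 0
    for i in range(half, n - half):
        p = np.poly1d(np.polyfit(x, y[i - half:i + half + 1], deg))
        for _ in range(deriv):
            p = p.deriv()
        out[i] = p(0.0)
    return out
```

For example, applied to a noisy sine wave, `deriv=0` returns a smoothed series and `deriv=1` a smoothed estimate of its derivative, which is exactly the "generation of derivatives" use the abstract mentions.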
Precision Measurement of Neutrino Oscillation Parameters and Investigation of Nuclear Georeactor Hypothesis with KamLAND
, 2010
Abstract
A combined analysis examining the neutrino oscillation parameters and investigating the nuclear georeactor hypothesis with the KamLAND experiment is presented. With a total exposure of 2.75 kton-years, 930 electron antineutrino candidate events above a 3.4 MeV neutrino energy threshold were detected, with an estimated 109 ± 13 events from backgrounds. Assuming CPT invariance and combining with solar neutrino results, the best-fit value of the georeactor fission power is 4.9^(+3.8)_(−4.8) TW. The 90% upper limit on the georeactor power is determined to be 11.2 TW. This result puts a significant constraint on the contribution of a possible georeactor to the total heat from the Earth. The best-fit values of the neutrino oscillation parameters, obtained with the georeactor power as a free parameter, are consistent with KamLAND's previously published results under the null-georeactor assumption.
Date of submission: 01.06.2011; date of doctorate: 10.11.2011
"... doctor rerum naturalium ..."
Simplifying Grasping Complexity through Generalization of
Abstract
Abstract—There has been growing enthusiasm for using the anthropomorphic hands of humanoid robots to manipulate everyday objects and tools designed for humans. However, multi-fingered grasping imposes a formidable control challenge due to the high dimensionality of the joint space and the difficulty of forming a functional grip on objects. We propose a hybrid technique based on grasping synergies extracted from kinaesthetic demonstrations on an object with a primitive geometry (a cuboid in this case), with passive kinematic enveloping as a generalization technique. Experiments were carried out on an iCub humanoid robot using everyday objects such as a telephone receiver, a computer mouse, three whiteboard markers bundled together, a fencing handle, a compact disc keep case, and a drinking glass. We show that the primitives extracted from kinaesthetic demonstrations on a cuboid generalize across a majority of these real-world objects.
Filtered Legendre Expansion Method for Numerical Differentiation at the Boundary Point with Application to Blood Glucose Predictions
Abstract
Abstract Let f: [−1, 1] → R be continuously differentiable. We consider the question of approximating f′(1) from given data of the form (t_j, f(t_j)), j = 1, …, M, where the points t_j lie in the interval [−1, 1]. It is well known that this question is ill-posed, and there is very little literature on the subject known to us. We consider a summability operator using Legendre expansions, together with high-order quadrature formulas based on the points t_j, to achieve the approximation. We also estimate the effect of noise on our approximation. The error estimates, both with and without noise, improve upon those in the existing literature and appear to be unimprovable. The results are applied to the problem of short-term prediction of blood glucose concentration, yielding better results than other comparable methods.
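The basic setting can be illustrated with a plain truncated Legendre expansion: fit the data by least squares in the Legendre basis, differentiate the expansion, and evaluate at the boundary point x = 1. This is only the naive version of the idea; the paper's contribution is a filtered summability operator with quadrature-based coefficients, which this sketch does not reproduce, and the degree below is an arbitrary assumption.

```python
import numpy as np
from numpy.polynomial import legendre as L

def deriv_at_one(t, f, deg=8):
    """Estimate f'(1) from samples (t_j, f(t_j)) on [-1, 1].

    Fits a degree-`deg` Legendre expansion to the data by least
    squares, then differentiates the expansion and evaluates the
    derivative at the boundary point x = 1. A plain truncated
    expansion, not the paper's filtered operator.
    """
    c = L.legfit(t, f, deg)          # least-squares Legendre coefficients
    return L.legval(1.0, L.legder(c))  # derivative of the expansion at x = 1
```

For smooth, noise-free data this already works well; the ill-posedness the abstract refers to shows up once noise is added, where an unfiltered high-degree expansion amplifies the noise near the boundary.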
Trend Filtering Methods for Momentum Strategies
, 2011
Abstract
This paper studies trend filtering methods. These methods are widely used in momentum strategies, which correspond to an investment style based solely on the history of past prices. For example, the CTA strategy used by hedge funds is one of the best-known momentum strategies. In this paper, we review the different econometric estimators for extracting the trend of a time series. We distinguish between linear and nonlinear models as well as univariate and multivariate filtering. For each approach, we provide a comprehensive presentation, an overview of its advantages and disadvantages, and an application to the S&P 500 index. We also consider the calibration problem of these filters. We illustrate the two main solutions: the first based on prediction error, the second using a benchmark estimator. We conclude the paper by listing some issues to consider when implementing a momentum strategy.
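As a toy instance of the linear univariate case the abstract mentions: a moving average is the simplest trend filter, and comparing a fast and a slow average yields a basic momentum signal. This is an illustrative crossover rule with arbitrary window lengths, not a strategy or estimator taken from the paper.

```python
import numpy as np

def moving_average(prices, window):
    """Simple moving average: one common linear trend filter."""
    kernel = np.ones(window) / window
    return np.convolve(prices, kernel, mode="valid")

def momentum_signal(prices, fast=20, slow=50):
    """Toy crossover rule: long (+1) when the fast trend estimate is
    above the slow one, short (-1) otherwise.

    The fast/slow window lengths are illustrative assumptions."""
    f = moving_average(prices, fast)[-1]
    s = moving_average(prices, slow)[-1]
    return 1 if f > s else -1
```

In a rising market the short-window average sits above the long-window one, so the rule goes long; in a falling market the ordering flips. Calibrating the window lengths is exactly the filter-calibration problem the abstract raises.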