Results 1 - 5 of 5
Analog integrated 2-d optical flow sensor with programmable pixels
- In IEEE Int. Symp. on Circuits and Systems (ISCAS’04), 2004
"... We present a framework for real-time visual motion per-ception consisting of a novel analog VLSI optical flow sen-sor with reconfigurable pixels, connected in feedback with a controlling processor. The 2-D sensor array is composed of motion processing pixels that can be individually recruited to for ..."
Abstract - Cited by 5 (1 self)
We present a framework for real-time visual motion perception consisting of a novel analog VLSI optical flow sensor with reconfigurable pixels, connected in feedback with a controlling processor. The 2-D sensor array is composed of motion processing pixels that can be individually recruited to form dynamic ensembles that collectively compute visual motion. This flexible framework lends itself to the emulation of multi-layer recurrent network architectures for high-level processing of visual motion. In particular, attentional modulation can easily be incorporated in the visual motion processing. We show a simple example of visual tracking that demonstrates the potential of the framework.
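The recruitment-based read-out described in this abstract can be sketched in software. The following is a hypothetical, minimal sketch only (not the chip's actual interface or the authors' controller code; all names, sizes and the pooling rule are illustrative): a binary mask recruits a rectangular ensemble of motion pixels, their flow vectors are pooled, and the pooled estimate shifts an attentional tracking window in a simple feedback loop.

```python
# Hypothetical sketch (not the chip's interface): recruit a rectangular ensemble
# of motion pixels via a binary mask, pool their flow vectors, and use the pooled
# estimate to steer an attentional tracking window.
import numpy as np

def recruit(shape, center, size):
    """Binary recruitment mask selecting a size-by-size ensemble around center (row, col)."""
    mask = np.zeros(shape, dtype=bool)
    r0, c0 = center
    h = size // 2
    mask[max(0, r0 - h):r0 + h + 1, max(0, c0 - h):c0 + h + 1] = True
    return mask

def pooled_motion(u, v, mask):
    """Collective motion estimate of the recruited ensemble (here simply the mean flow)."""
    return float(u[mask].mean()), float(v[mask].mean())

# Toy data: a 30 x 30 flow field that is roughly one pixel/frame to the right.
rng = np.random.default_rng(0)
u = 1.0 + 0.1 * rng.standard_normal((30, 30))   # horizontal flow component
v = 0.1 * rng.standard_normal((30, 30))         # vertical flow component

center = (15, 15)
for _ in range(5):                               # feedback loop: window follows the pooled motion
    mask = recruit(u.shape, center, 7)
    du, dv = pooled_motion(u, v, mask)
    center = (int(round(center[0] + dv)), int(round(center[1] + du)))
print("tracking window ends at", center)         # drifts rightward with the stimulus
```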
Analog VLSI Focal-Plane Array With Dynamic Connections for the Estimation of Piecewise-Smooth Optical Flow
"... Abstract—An analog very large-scale integrated (aVLSI) sensor is presented that is capable of estimating optical flow while detecting and preserving motion discontinuities. The sensor’s architecture is composed of two recurrently connected networks. The units in the first network (the optical-flow n ..."
Abstract
An analog very large-scale integrated (aVLSI) sensor is presented that is capable of estimating optical flow while detecting and preserving motion discontinuities. The sensor’s architecture is composed of two recurrently connected networks. The units in the first network (the optical-flow network) collectively estimate two-dimensional optical flow, where the strength of their nearest-neighbor coupling determines the degree of motion integration. While the coupling strengths in our previous implementations were globally set and adjusted by the operator, they are now dynamically and locally controlled by a second on-chip network (the motion-discontinuity network). The coupling strengths are set such that visual motion integration is inhibited across image locations that are likely to represent motion boundaries. Results of a prototype sensor illustrate the potential of the approach and its functionality under real-world conditions.
Index Terms: Cellular neural networks, dynamic connectivity, gradient descent, line process, motion discontinuities, motion segmentation, neuromorphic, optimization, recurrent feedback.
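The coupling scheme described here is, in spirit, a line-process formulation: a discontinuity variable locally gates the nearest-neighbor smoothness of the flow network. As an illustration only (the paper's exact cost function, weights and notation may differ), a discrete energy of this kind can be written as:

```latex
\[
E\big(\{\vec v_i\},\{l_{ij}\}\big)
  \;=\; \sum_i \big(I_{x,i}\,u_i + I_{y,i}\,v_i + I_{t,i}\big)^2
  \;+\; \rho \sum_{\langle i,j\rangle} (1 - l_{ij})\,\lVert \vec v_i - \vec v_j \rVert^2
  \;+\; \alpha \sum_{\langle i,j\rangle} l_{ij}
\]
```

Here \(\vec v_i = (u_i, v_i)\) is the flow of motion unit \(i\), \(l_{ij} \in [0,1]\) is the discontinuity (line-process) unit on the edge between neighbors \(i\) and \(j\), \(\rho\) is the smoothness strength and \(\alpha\) the cost of declaring a boundary. Where \(l_{ij}\) approaches 1, smoothing across that edge is switched off, which corresponds to the inhibited motion integration across likely motion boundaries described in the abstract.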
Integrated 2-D Optical Flow Sensor
- 2004
"... I present a new focal-plane analog VLSI sensor that estimates optical flow in two visual dimensions. The chip significantly improves previous approaches both with re-spect to the applied model of optical flow estimation as well as the actual hardware implementation. Its distributed computational arc ..."
Abstract
I present a new focal-plane analog VLSI sensor that estimates optical flow in two visual dimensions. The chip significantly improves previous approaches both with respect to the applied model of optical flow estimation as well as the actual hardware implementation. Its distributed computational architecture consists of an array of locally connected motion units that collectively solve for the unique optimal optical flow estimate. The novel gradient-based motion model assumes visual motion to be translational, smooth and biased. The model guarantees that the estimation problem is computationally well-posed regardless of the visual input. Model parameters can be globally adjusted, leading to a rich output behavior. Varying the smoothness strength, for example, can provide a continuous spectrum of motion estimates, ranging from normal to global optical flow. Unlike approaches that rely on the explicit matching of brightness edges in space or time, the applied gradient-based model assures spatiotemporal continuity of visual information. The non-linear coupling of the individual motion units improves the resulting optical flow estimate because it reduces spatial smoothing across large velocity differences. Extended measurements of a 30 × 30 array prototype sensor under real-world conditions demonstrate the validity of the model and the robustness and functionality of the implementation.
Index: visual motion perception, 2-D optical flow, constraint optimization, gradient
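A model that is "translational, smooth and biased" maps naturally onto a regularized cost of the Horn–Schunck type with an added bias term. The following is a sketch only; the symbols \(\rho\), \(\sigma\) and the reference motion \((u_0, v_0)\) are illustrative and may not match the paper's notation:

```latex
\[
E(u, v) \;=\; \iint \Big[ \big(I_x u + I_y v + I_t\big)^2
      \;+\; \rho\,\big(\lVert \nabla u \rVert^2 + \lVert \nabla v \rVert^2\big)
      \;+\; \sigma\,\big((u - u_0)^2 + (v - v_0)^2\big) \Big]\, dx\, dy
\]
```

The first term enforces brightness constancy, the \(\rho\) term enforces smoothness, and the \(\sigma\) term biases the estimate toward a reference motion (often zero). The bias keeps the cost strictly convex, hence well-posed, even where the image gradients vanish, and sweeping \(\rho\) from small to large moves the solution from roughly normal flow to a single global flow estimate, consistent with the "continuous spectrum of motion estimates" described above.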
Analog Integrated 2-D Optical Flow Sensor
- Analog Integrated Circuits and Signal Processing, 46, 121–138, 2006
"... Abstract. I present a new focal-plane analog very-large-scale-integrated (aVLSI) sensor that estimates optical flow in two visual dimensions. Its computational architecture consists of a two-layer network of locally connected motion units that collectively estimate the optimal optical flow field. Th ..."
Abstract
I present a new focal-plane analog very-large-scale-integrated (aVLSI) sensor that estimates optical flow in two visual dimensions. Its computational architecture consists of a two-layer network of locally connected motion units that collectively estimate the optimal optical flow field. The applied gradient-based optical flow model assumes visual motion to be translational and smooth, and is formulated as a convex optimization problem. The model also guarantees that the estimation problem is well-posed regardless of the visual input by imposing a bias towards a preferred motion under ambiguous or noisy visual conditions. Model parameters can be globally adjusted, leading to a rich output behavior. Varying the smoothness strength, for example, can provide a continuous spectrum of motion estimates, ranging from normal to global optical flow. The non-linear network conductances improve the resulting optical flow estimate because they reduce spatial smoothing across large velocity differences and minimize the bias for reliable stimuli. Extended characterization and recorded optical flow fields from a 30 × 30 array prototype sensor demonstrate the validity of the optical flow model and the robustness and functionality of the computational architecture and its implementation.
Key Words: recurrent network, neuromorphic, constraint satisfaction, regularization, parallel computation
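To make the collective computation concrete, here is a minimal software analogue of a grid of locally connected motion units, under stated assumptions: this is not the chip's circuit equations or the author's code, the non-linear conductances are replaced by linear nearest-neighbor smoothing, and plain gradient descent stands in for the analog network dynamics. Parameter names (rho, sigma) are illustrative.

```python
# Software analogue only (not the chip): a grid of motion units minimizes, by
# gradient descent, a convex cost with a brightness-constancy term, linear
# nearest-neighbor smoothness (weight rho), and a bias toward zero motion
# (weight sigma).
import numpy as np

def neighbor_mean(f):
    """Mean of the four nearest neighbors, with replicated borders."""
    up    = np.vstack([f[:1], f[:-1]])
    down  = np.vstack([f[1:], f[-1:]])
    left  = np.hstack([f[:, :1], f[:, :-1]])
    right = np.hstack([f[:, 1:], f[:, -1:]])
    return (up + down + left + right) / 4.0

def optical_flow_grid(Ix, Iy, It, rho=1.0, sigma=0.05, steps=500, lr=0.1):
    u = np.zeros_like(Ix)
    v = np.zeros_like(Ix)
    for _ in range(steps):
        e = Ix * u + Iy * v + It                               # brightness-constancy residual
        du = 2 * e * Ix + 4 * rho * (u - neighbor_mean(u)) + 2 * sigma * u
        dv = 2 * e * Iy + 4 * rho * (v - neighbor_mean(v)) + 2 * sigma * v
        u -= lr * du
        v -= lr * dv
    return u, v

# Toy stimulus: a horizontal intensity ramp translating one pixel per frame to the right.
Ix = np.ones((30, 30))        # spatial gradient in x
Iy = np.zeros((30, 30))       # no gradient in y
It = -Ix                      # temporal derivative for unit rightward translation
u, v = optical_flow_grid(Ix, Iy, It)
print(round(float(u.mean()), 2), round(float(v.mean()), 2))   # about 0.95 and 0.0 (bias shrinks u slightly)
```

In this sketch the bias term plays the role described in the abstract: with sigma greater than zero the cost stays strictly convex and the iteration settles even where the local brightness gradients give no motion information, at the price of slightly shrinking the estimate for reliable stimuli.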