Results

### Computational Methods for Global Illumination and Visualisation of Complex 3D Environments

"... Complex three dimensional environments are visualised by rendering images of these environments as seen from different view points. Over the last three decades rendering techniques have been continuously evolving to greater levels of sophistication in terms of the complexity of environments and the ..."

Abstract

Complex three-dimensional environments are visualised by rendering images of these environments as seen from different viewpoints. Over the last three decades rendering techniques have been continuously evolving to greater levels of sophistication, in terms of both the complexity of the environments and the realism with which the images are produced. In all image synthesis techniques the fundamental step is computing the amount and nature of the light from the three-dimensional environment reaching the eye in any given direction. Computer graphics rendering techniques carry out this computation by simulating the behaviour of light in the environment; greater realism means a higher correlation between the simulation and the physical world. In the physical world, lighting, reflection and scattering effects are very complicated and subtle. Every object receives light directly from light sources, or indirectly via reflection or scattering by neighbouring objects. For realistic image synthesis these intra-environment effects must be modelled in great detail.

This thesis presents the results of a detailed investigation of illumination computation and rendering techniques. The four major contributions of this thesis are:

- A taxonomy of illumination computation methods.
- Particle tracing techniques for global illumination computation.
- The potential equation for illumination computation and the mathematical framework of adjoint equations.
- Demonstration of the practicality of this new class of global illumination computation algorithms.

From a theoretical point of view the primary contribution is the development of a mathematical framework of adjoint equations which provides the basis for all known illumination computation techniques. This framework consists of two integral equations, the radiance equation and the potential equation, which are duals of each other. While the radiance equation has been known in one form or another to the computer graphics community, the potential equation for illumination is introduced for the first time in this thesis. The significance of this new mathematical framework stems from the fact that it not only enables us to review and analyse existing methods but also provides the necessary handles for deriving new and ...
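The dual pair of integral equations mentioned in the abstract can be written schematically as follows. This is a sketch of the standard radiance/importance formulation from the rendering literature, not a transcription of the thesis's own notation; the symbol names (`h` for the ray-casting function, `f_r` for the BRDF) are assumptions.

```latex
% Radiance (transport) equation: light leaving point x in direction Theta,
% where h(x, Psi) is the first surface point visible from x along Psi.
L(x,\Theta) = L_e(x,\Theta)
  + \int_{\Omega} f_r(x,\Psi \to \Theta)\,
      L\big(h(x,\Psi), -\Psi\big)\cos\theta_\Psi \, d\omega_\Psi

% Potential (adjoint) equation: the ability of light leaving (x, Theta)
% to contribute, directly or after further bounces, to the measurement.
W(x,\Theta) = W_e(x,\Theta)
  + \int_{\Omega} f_r\big(h(x,\Theta), \Theta \to \Psi\big)\,
      W\big(h(x,\Theta), \Psi\big)\cos\theta_\Psi \, d\omega_\Psi
```

The radiance equation gathers light arriving at a point, while its adjoint propagates "potential" outward from the sensor; solving either one (or both simultaneously) yields the same image measurement, which is what makes the duality useful for deriving new algorithms.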

### 1.2 Solution Strategies: State of the Art; 1.3 Non-Deterministic Particle Tracing: A New Algorithm; 1.4 Particle Tracing in Participating Volumes: Simulation in Complex Environments

"... ..."

(Show Context)
### Parallel and Distributed Processing of Large Image Data Sets

, 1999

"... SAR image processing requires compute-intensive and thus time-consuming algorithms and will continue to do so in future since both the size of the images and the algorithmic complexity will increase. To process large data sets, a single workstation is not sufficient. The need for a powerful collecti ..."

Abstract

SAR image processing requires compute-intensive and thus time-consuming algorithms, and will continue to do so in the future, since both the size of the images and the algorithmic complexity will increase. To process large data sets, a single workstation is not sufficient; a powerful collection of processors is necessary. In this thesis, parallel processing techniques applied to key algorithms in SAR image processing and visualization are evaluated on different computing architectures. The Magellan Data Set served as a testbed for these investigations. Extending the idea of concurrency, distributed computing appeared to be a valuable means of enhancing computing performance. Therefore, different approaches are evaluated with respect to high-throughput computing, which has been adapted to the requirements extracted from a schedule to process the Magellan Data Set. The experiments showed that a combination of parallel and distributed processing is necessary to reach the goal of hi...
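The tile-parallel pattern the abstract alludes to, splitting a large image into independent pieces and fanning them out over a pool of workers, can be sketched as follows. The names (`process_tile`, a mean filter) are illustrative stand-ins, not the algorithms evaluated in the thesis.

```python
# Sketch of tile-parallel image processing, assuming the data set can be
# decomposed into independent tiles (true for most per-pixel SAR kernels).
from multiprocessing import Pool

def process_tile(tile):
    """Stand-in per-tile kernel: a simple mean, where a real SAR
    pipeline would run e.g. speckle filtering or map projection."""
    return sum(tile) / len(tile)

def process_image(tiles, workers=4):
    """Fan independent tiles out over a pool of worker processes;
    results come back in the original tile order."""
    with Pool(workers) as pool:
        return pool.map(process_tile, tiles)

if __name__ == "__main__":
    tiles = [[1, 2, 3], [4, 5, 6]]
    print(process_image(tiles, workers=2))  # [2.0, 5.0]
```

Distributing the same map over machines rather than cores (the high-throughput-computing approach the abstract mentions) changes only the dispatch layer; the per-tile kernel stays identical.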

### The Utah Raster Toolkit

- in Proceedings of the Usenix Workshop on Graphics
, 1986

"... The Utah Raster Toolkit is a set of programs for manipulating and composing raster images. These tools are based on the Unix concepts of pipes and filters, and operate on images in much the same way as the standard Unix tools operate on textual data. The Toolkit uses a special run length encoding (R ..."

Abstract

The Utah Raster Toolkit is a set of programs for manipulating and composing raster images. These tools are based on the Unix concepts of pipes and filters, and operate on images in much the same way as the standard Unix tools operate on textual data. The Toolkit uses a special run length encoding (RLE) format for storing images and interfacing between the various programs. This reduces the disk space requirements for picture storage and provides a standard header containing descriptive information about an image. Some of the tools are able to work directly with the compressed picture data, increasing their efficiency. A library of C routines is provided for reading and writing the RLE image format, making the toolkit easy to extend. This paper describes the individual tools, and gives several examples of their use and how they work together. Additional topics that arise in combining images, such as how to combine color table information from multiple sources, are also discussed. 1. Int...
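The run-length idea behind the Toolkit's storage format can be sketched in a few lines. This is a generic RLE encode/decode pair for a single scanline, assuming nothing about the real Utah RLE file layout, which additionally carries a descriptive header and per-channel opcodes.

```python
# Minimal run-length encoding sketch in the spirit of the RLE format
# described above; header and opcode handling are omitted.
def rle_encode(row):
    """Encode a scanline of pixel values as (count, value) runs."""
    runs = []
    for value in row:
        if runs and runs[-1][1] == value:
            runs[-1][0] += 1          # extend the current run
        else:
            runs.append([1, value])   # start a new run
    return [tuple(r) for r in runs]

def rle_decode(runs):
    """Expand (count, value) runs back into a scanline."""
    row = []
    for count, value in runs:
        row.extend([value] * count)
    return row

row = [0, 0, 0, 255, 255, 7]
assert rle_decode(rle_encode(row)) == row
print(rle_encode(row))  # [(3, 0), (2, 255), (1, 7)]
```

Runs of identical pixels are common in synthetic imagery, which is why tools that operate directly on the compressed runs, as the abstract notes, can skip work proportional to the run lengths.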