Mark Stock's 2nd Raytraced Animation Page

Older Radiance raytraced MPEG movies

All of the movies are in standard MPEG-1 video compression format (encoded with the Berkeley Encoder), and should play normally on most machines with an MPEG-1 player. You have one, right?

Jump back to My Recent Raytraced Movies page.

Older raytraced animations:

In one of 12,000 apartments inside one of the habitation modules of the arcology seen below and on my raytracing page, 256x256, 120 frames, two ambient bounces, 40 hours on an old SGI Indigo, 460KB
Flying through a latticework arcology, also seen on my raytracing page, 128x128, 100 frames, no ambient bounces (i.e. no radiosity calculation), an hour or two on the old Indigo, 434KB
Third rocket shot, at night, with heavy launcher gear, 256x256, 50 frames, no ambient bounces, several jitter options set, 15 hours on the same Indigo, 280KB
Second rocket shot, a more scenic test of the sh scripts controlling firing and trajectory. 256x256, 80 frames, no ambient bounces, 20 hours on the old SGI Indigo, 803KB
Rocket shot, simply a test of whether a light source can act as a good exhaust plume. See for yourself. 304x200, 100 frames, 3 ambient bounces, less than 1 hour rendering time on an HP 715/64! 311KB
Soleri's Linear city, only a small section of it, though. This movie is 240x160, 200 frames, 2 ambient bounces, but only took 6 hours or so on an unloaded HP715/64. 495KB
Spinning CAEN logo, back to action scenes, and a most cool one for the Enterprise! 10 hours on HP735/99, 160x120, 180 frames, 163KB (511KB version also available)
EECS movie, take 3, the first movie using radiosity techniques, 240x160, 200 frames, 3 ambient bounces, took all damn weekend! 415KB
EECS movie, take 2, 200 frames, 160x120, daylight calculation, 16 hours on HP715/64, 167 KB

A little history...

I started using Radiance in March 95, I believe. It took a while for me to get to the level of basic understanding necessary to render images. That kept me going for a few months. Then, I thought, I could try to make an animation out of my raytracings.

So, I took my map of North Campus, computed 36 viewpoints by hand, manually rendered 36 frames, FTPed them to a Mac, and made a movie with Sparkle. I found out this was the long way to make a movie, but I did get a movie. Here it is:

I realized I needed some sort of automation tool, though. I remembered a little C from EECS 284 last term, so in late May I decided to crunch some code and see what I could do. While it isn't the prettiest tool, I wrote a C program that writes a C-shell script, which does all the rendering calls and computes moving viewpoints. Ten minutes of work setting up the script, and I could load a machine for 15 hours. Happy was I.
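To make the idea concrete, here is a hedged sketch of the kind of frame loop that generated script might contain. The camera path, resolution, and file names are my own illustrative assumptions, not the actual tool's output; each rpict command is written into a script rather than run, since rendering one frame could take minutes to hours.

```shell
#!/bin/sh
# Sketch: move the camera linearly from x=0 to x=50 over NFRAMES frames,
# emitting one rpict call per frame into a render script.
NFRAMES=100
frame=0
while [ $frame -lt $NFRAMES ]; do
  # interpolate the x coordinate with awk (sh has no floating point)
  x=`awk "BEGIN { printf \"%.3f\", 50.0 * $frame / ($NFRAMES - 1) }"`
  echo "rpict -vp $x 0 10 -vd 0 1 0 -x 256 -y 256 scene.oct > frame$frame.pic"
  frame=`expr $frame + 1`
done > render_all.csh
```

Running the generated render_all.csh is what actually loads the machine for hours.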

I also found an MPEG encoder, precompiled for SunOS. It's from Berkeley, and here's the FTP directory.

Here are the first three movies made using this method:

Not satisfied with merely the viewpoint moving, I found a way to let objects in the scene move on their own. In the scene description file, a number can be replaced by a numerical expression containing the letters NUM. The program I wrote replaces all instances of NUM with the current frame number, and renders from the resulting scene file.
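The substitution step can be sketched with a plain sed pass. The template file and object names here are mine, not the author's, and this only does the literal NUM-to-frame-number replacement; the tool's evaluation of arithmetic expressions around NUM is omitted.

```shell
#!/bin/sh
# Sketch: a scene template holds NUM wherever a frame-dependent number
# goes; one sed pass turns it into this frame's value.
cat > riser.rad.tmpl <<'EOF'
!genbox gray box 1 1 1 | xform -t 0 0 NUM
EOF
frame=42
sed "s/NUM/$frame/g" riser.rad.tmpl > riser.rad
```

Rendering riser.rad for frame 42 then places the box at height 42; repeating this per frame animates it.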

Here are two movies I have made with moving objects in them:

And, just recently, I figured out the daylight calculations. You need to run the .pic through pfilt -e -1 or something to get a decent-looking daylight rendering. So, I varied the time of day to make this next movie.
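A batch version of that fix-up pass might look like the sketch below. The -1 (single pass) and -e (exposure adjustment) flags are pfilt's documented options, but the value -1 is only the page's own guess ("or something"); the commands are written to a script rather than run, since the .pic files here are just placeholders.

```shell
#!/bin/sh
# Sketch: queue up a pfilt exposure adjustment for every frame.
touch frame001.pic frame002.pic   # placeholder frames for illustration
for pic in frame*.pic; do
  echo "pfilt -1 -e -1 $pic > adj_$pic"
done > adjust_all.csh
```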

Then, to test how well this rendering technique would work for something like movie credits, I made the following movie, with both a moving light source and a moving viewpoint:

In this one, I made a script that generates a random forest. The reflective arcology is courtesy of markusn. This is my first multiple-path movie. It is also my longest. I will soon code the capability for multiple bezier paths and multiple moving objects into my anim8.8 program. (note: yeah right, when I get some free time)

OK, I've finally gotten back into it. I'm looking forward to rendering movies based on ADAMS simulations, which should be cool, but for now, there are a few movies relevant to the RoboTech campaign I'm in. The first one is a rotating view from the underside of a Zentraedi landing pod.

So, then, in March 1996, I decided I could make the EECS movie a little better, so I did. I rendered it at three times the pixel size of the final movie, which gives smoother edges, and I encoded the MPEG more carefully. See the thumbnail links above for these two movies!
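The supersampling trick can be sketched as below, assuming Radiance's rpict and pfilt: render each frame at three times the target size, then let pfilt's /3 reduction average the pixels back down. The resolutions, view files, and frame names are illustrative assumptions, and the commands are echoed into a script instead of executed, since rendering is slow.

```shell
#!/bin/sh
# Sketch: render 3x oversize, then downsample 3:1 with pfilt
# (720x480 -> 240x160) for anti-aliased edges.
for f in 000 001 002; do
  echo "rpict -x 720 -y 480 -vf view$f.vf scene.oct > big$f.pic"
  echo "pfilt -x /3 -y /3 big$f.pic > frame$f.pic"
done > render3x.csh
```

The /n form tells pfilt to divide the image resolution by n, averaging neighboring pixels as it goes.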

Mark J. Stock, Graduate Student, Aerospace Engineering, The University of Michigan