This is horribly formatted, but the important thing at this point is to make the info available...


Contents ===================================================================
  • * ENGINE
  • ** TODO LIST (16 nov 99)
  • * PREP
  • ** TODO
  • * MEMORY
  • ** HANDLES
  • ** DISPLAY
  • ** OPENGL
  • * CAMERA



The design emphasizes 3D platform/puzzle-solving games. I think there are lots of games which can be done with WF which can't be done under Quake (though with the source out there, this might change).

Instead of focusing on graphics related items (since that is what EVERYONE is working on), I would prefer to work on things which improve gameplay. Sure, Quake is beautiful, but how varied is its behavior? I want to make games with more depth to them, be able to make complex puzzles, intricate plot lines, etc.

One of my goals is to be able to do everything in Mario 64. We can do a lot of it, but not all. After that, I would like to play with writing really good enemy AI.

I guess that's one of the reasons you chose an AI language as your scripting language, huh? I am considering using Guile, also a Scheme interpreter, as a scripting language for my own engine. Do you have any experience with Guile? In particular I was wondering about performance. I noticed WF's Scheme interpreter is apparently very small and fast, which I am not sure if Guile would be or not. Extreme Wave uses Guile too, right?

Funny story: we had a test level where the enemy just ran towards the player, so all you had to do to avoid him was stand behind some boxes. One day Phil (the guy who wrote our physics) was adding some features, and all of a sudden the bad guy comes flying down out of the air and kills the player. This was a bit disconcerting, as the AI didn't have jumping programmed into it. After some investigation, it was discovered that the new feature (which was making explosions push other objects away) was the culprit. The enemy would run towards the player, and when he hit the boxes he would start throwing grenades, but since the boxes were in the way, the grenades would just land at the enemy's feet; when one went off, it threw the enemy into the air. Since the enemy was still pressing forward on the joystick, he was able to get over the boxes and land on the player (sure, he took some damage, but he got his man! ;-) )

  • * ENGINE

We have very few 'special' objects. Any object can stand/rest on any other. There are no 'floors', 'walls', etc., the player can stand on anything which will hold him up.

We don't have pre-sorted geometry (like BSP trees), so all of the objects in the entire world can move every frame and we won't go any slower (try having all of the walls and floors move in Quake; if I understand it correctly, the environment is separate from moving objects).

We try to make as few assumptions as possible; for example, there is nothing in the engine which assumes an object of type 'player' is what the camera looks at. The camera decides for itself where to look, and following an object (any object) is one of its options. This means, for example, you can create either a 1st person or 3rd person game. We did some cool stuff with this in Velocity (the game we wrote WF for, which never saw the light of day). We had a missile weapon (again, no code in the engine for missiles, it was all done with scripts): first you would activate the weapon, which would zoom the camera from 3rd person to a 1st person view, and the joystick would pan and tilt the view to aim the weapon. Then you would fire the weapon, and the camera would follow the missile in 3rd person, allowing you to steer the missile with the joystick (while your character just stood around). Once the missile hit something, the camera would pull back to see the resulting explosion, then fly back to your character and resume normal 3rd person behavior. (There is some cleanup which can be done in the derivation of objects; we have more than we need right now. For example, there shouldn't be a difference between 'player' and 'enemy' in the engine, just in how their scripts are coded.)
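To make the 'camera follows anything' idea concrete, here is a minimal Python sketch of the design described above; all class and attribute names are illustrative, nothing here is actual WF code:

```python
# Sketch: the camera holds a reference to an arbitrary target object,
# so 3rd person, 1st person, or missile-cam is just a matter of
# retargeting (or zeroing the distance), not of engine-level types.

class Obj:
    def __init__(self, position):
        self.position = position

class Camera:
    def __init__(self):
        self.target = None      # any object, not necessarily the player
        self.distance = 5.0     # 0.0 would be a 1st-person view

    def position(self):
        x, y, z = self.target.position
        return (x, y, z + self.distance)   # trail the target

player = Obj((0.0, 0.0, 0.0))
missile = Obj((10.0, 2.0, 0.0))

cam = Camera()
cam.target = player          # normal 3rd person behavior
cam.target = missile         # steerable-missile mode: just retarget
print(cam.position())        # → (10.0, 2.0, 5.0)
```

The point of the sketch is that nothing in Camera knows what a 'player' is; a script can point it at any object.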

This does place a greater burden on the designer, since some script work needs to be done to set up even a simple level (but we have already written them, so they just need to drop them in). Also a few objects have to be set up correctly, but we keep a starting project around to get things going. One of the things we talked about was having 'wizards' for creating common configurations, but we didn't get that far.

I think we have better development tools (but I haven't really looked at Quake editors, so I don't know; I KNOW we have better tools than the Doom editors I have seen).


The design is fairly general, I tried as much as possible to build an engine instead of a game.

Question:

'> Yes, in particular the idea with separate library directories, each
'> of which can be separately tested, is sound. Can each library also
'> be used separately? Would it make sense to make separate use of
'> individual libraries?

As much as possible, I have tried to make the libraries a vertical stack of dependencies, with pigsys (damn, I need to rename that) at the bottom and the game at the top. They pretty much follow the order given in /wfsource/source/README. (Oops, pigsys and HAL should be reversed in README; this is because I wrote HAL, someone else wrote pigsys, and we fought over who should be at the bottom, and he won.) I want to take pigsys, streams and cpplib and release them as a base toolkit suitable for any project (where exceptions aren't needed). They provide good assertion macros, validation, debugging streams, binary IO streams, platform independence, and library independence. (If you throw HAL in, you get platform-independent task switching and timers which, ironically, are not used by WF (the memory management is, however).)

Above the core I would like the libraries to be fairly independent, but they still stack, gfx needs iff and math, for example, and anim needs gfx. I intend to make a diagram of the relationship of all of the libraries, just haven't taken the time to find a package to do it in. (I might look at the stuff you used for your book).

Most of the tools use a subset of the libraries, so they demonstrate how easy it is to link against them. Any project I start in the future will most certainly use at least the core libraries. (I am thinking of writing a package to transport MIDI over TCP/IP, since I recently acquired a bunch of cheap 133MHz K5 machines to use as synthesizers.)


We have very few globals in the game, and in the libraries any globals are not meant to be used by the caller, only internally by the library (there aren't very many). This is just a matter of good coding practice; choice of language shouldn't matter, as there is tons of portable, reusable C out there. I wrote an object-oriented debugger in C (and the entire Amiga OS was written in object-oriented C), so it can be done with discipline.


Overall, with no demonstration levels, I think these are pretty weak arguments. If WF is to get any mindshare, it HAS to have a cool working level to play.

'> I thought you did have working levels for the Windows version?
'> And even though I don't have 3DS, you do, right? So why can't
'> you just compile all of the sample levels on the WF FTP server
'> into an asset file which can then be run, even if only under
'> Windows? Am I forgetting something?

Sorry, we have several boring, 'doesn't really demonstrate the engine' levels. What we don't have is anything like the dozen or so levels which were made for Velocity: full-fledged, playable levels with goals, enemies, etc. (We do have a promotional video we made; I should see if I can get that into a format suitable for downloading.) I want something where a person downloads the executable with a level, plays it, has fun, and says 'This is cool, I want to work on this!' or 'I want to make a level with this'.

  • ** TODO LIST (16 nov 99)

Bill had done a stand-alone attributes editor in Windows back when it was a commercial product. I have successfully recompiled it, but the function to open the window fails when I run it (which means I get to learn a bit about Windows programming, I guess).

I decided I would have better luck promoting WF if there were better levels to show, so I am concentrating on getting a working windows release out there so that artists could have a chance of making some. If someone made a cool level showing off the capabilities of WF, I think it would start to take off.

So my todo list is:

- Release Windows GDK Test and release all of the sample levels

- put all of the documentation in one place

- make another mailing list for supporting users (wfusers maybe)

- Debug the stand alone attribute editor on windows and release it (so that you can do the gui port if you want)

- Re-work the texture handling in the engine so that it will work with the 3dfx card.

- Finish the .iff based (not embedded in 3dsmax) level converter

- Port level conversion process (from .iff file on) to linux

- Write geometry output for some 3d modeler.

At that point all that will be needed to develop levels on linux will be the attribute editor.

At my current rate, I would say it is going to take me 6 months to execute the above list (but hopefully there will be some cool levels developed by then as well).

Kind of depressing, considering the above list would be about 1 month when I was working on it full time.

'> 3. How long do you expect it to take before a fairly stable version of
'> World Foundry, including level-building tools, is available on Linux?
'> Half a year? One year? Two years?

Well, that is a tough question. Currently I am the only person really working on it (I have received a few patches from others). I see it as 2 projects: port the engine, and port the production environment. The engine is running, but needs lots of work. The production pathway used 3dsMax as its level layout tool (max and other 3d packages were used for geometry creation), so the first step is to choose a 3d package to use on linux. I am currently leaning towards Extreme Wave. The windows version had a max plug-in which did the level exporting; much of the code in it is not max specific. Last year we decided it would be much cleaner if the exporter simply wrote out an intermediate file and had a command line tool do the level processing. I am still working on this (I have a max exporter which produces an iff file with all the object data in it, and am nearing completion on the command line version of the level converter). Once this is done, all that is left to have a functional pathway is the attributes editor in the modeler, and the exporter. At my current rate, I imagine I will have something working in 6 months or so. Now if I get some help, that could probably be shortened quite a bit.



We used to use GNU make on the Windows side, but it was a nightmare due to problems with '\' and ':' handling. Opus Make is a very nice make package; unfortunately it isn't free, so we need to find something else. There are a few areas where we use Opus Make specific features; these will be the hardest to port. I will look at tmake and see if it will work. There is also a set of GNU makefiles to build the linux version (of the source; I haven't done the asset makefiles yet), so you can look at them as well.

Braindump on the engine makefiles:

wfsource/source/makefile: just cd's into each library's sub-directory and runs 'make'

wfsource/source/<library>/makefile: builds that library. Includes wfsource/Makefile.env and wfsource/source/Makefile.lib

wfsource/Makefile.env: defines all the build variables and compiler options. Includes: wfsource/Makefile.<targetname> (i.e. linux, win, psx), wfsource/Makefile.bld, and wfsource/Makefile.rul

wfsource/Makefile.<targetname>: platform specific definitions, name of compiler, name of shell commands (rename, delete, etc).

wfsource/Makefile.bld: implements easy build names. For example: buildmode=release is all optimizations on, assertions off, debug streaming off, etc; buildmode=debug is no optimizations, assertions on, debug streaming on, etc. This allows easy definition of new buildmodes (some are built on others; for example, buildmode=tool is based on buildmode=debug).

wfsource/Makefile.rul: make rules; defines how to build a .cc file, a .asm file, etc.

wfsource/source/Makefile.lib: rule to build a library file. Includes: wfsource/source/Makefile.test and wfsource/source/Makefile.print

wfsource/source/Makefile.test: rule for building the test program for each library

wfsource/source/Makefile.print: rule for dumping lots of variable settings, used to debug the makefiles.

Sorry, I forgot VC has a GUI. These need to be set in the DOS shell, i.e.:

  set RENDERER=gl
  set RENDERER_PIPELINE=software

etc. I place all of these in a batch file and execute the batch file before I try to make (enclosed to make it easier).

  • * PREP

'> By the way what exactly IS prep, anyway? What can it do that m4 or
'> other macro preprocessors can't do? What are its goals? What is its
'> purpose in this universe?

prep was started before I was much of an internet user (way back in 1994), so I didn't know about m4. However, looking at m4 now I don't like it. prep was written to get around limitations of the c preprocessor.

Some of my design goals: Only ONE escape character (@, chosen because C doesn't use it). I hate having to escape half of the characters because they mean something to the preprocessor.

Better macros: default parameters in macros (heavily used in the oas directory). And, better than C++, the defaults don't have to go at the end; you can specify param 1, default 2, param 3 (you just put two commas in a row).

Ability to have preprocessor commands in macro expansions

Better expression handling.

Regular-expression-based search/replace.

More powerful macro execution: in prep you can define a macro during macro execution. This amounts to having variables: "@define foo @e(foo+1)@t" will create a string which is numerically 1 larger than it was before (@e = evaluate expression, @t = terminate macro definition).

Stream oriented instead of line oriented. With a few exceptions, all prep commands can operate in the middle of a line (or in the middle of a macro invocation).

Written as a text input class. Prep can be used in any program to pre-process text (hand it a file to open, and call GetLine repeatedly to get processed text).

Small. The code base is small enough to understand in just a few hours (I think).
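As a concrete illustration of the macros-as-variables idiom above, here is a toy Python sketch of a counter macro in the spirit of "@define foo @e(foo+1)@t". This is an assumption-laden toy, not prep's real parser:

```python
import re

# Toy sketch of prep's "macros as variables" idiom: a macro body is
# expanded at definition time, so "@define foo @e(foo+1)@t" acts like
# foo = foo + 1. Parsing details here are guesses for illustration.

macros = {"foo": "0"}

def evaluate(expr):
    # @e(expr): evaluate with current macro values substituted in
    for name, value in macros.items():
        expr = expr.replace(name, value)
    return str(eval(expr))

def define(name, body):
    # @define name body@t: @e(...) in the body is evaluated NOW, so a
    # macro can refer to its own old value inside its new definition
    body = re.sub(r"@e\(([^)]*)\)", lambda m: evaluate(m.group(1)), body)
    macros[name] = body

define("foo", "@e(foo+1)")   # foo becomes "1"
define("foo", "@e(foo+1)")   # foo becomes "2"
print(macros["foo"])         # → 2
```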

There is a word document on prep in the prep directory (prep.doc), I should convert it to html or something.

It has grown to have looping and other sick things God did not intend a text preprocessor to have.


Cryptic. Not well documented (sometimes I read the source when writing prep macros).

Slow. Doesn't process text particularly fast (watch how long it takes to make in the oas directory).

So, in conclusion, it does what WF needs it to do, but would need some cleanup before I would claim it was general purpose.


Also, I wanted to give you a brief summary of how aegis is used, in case you wish to investigate the tool in more depth. Though aegis comes with a very good user manual, I think an example of its use illustrates its strength and flexibility well. I use aegis daily and it is the best GPL bug-tracking/source code control system I have seen. Commercially, probably the most complete system I have seen/used is IBM's CMVC, which I used for a previous employer to manage the build environment for a project with millions of lines of code and 120 developers. Aegis's functionality is pretty close to CMVC's, i.e., it is very good.

1. New Change: aenc

This enters the change into the change database.

2. Development Begin: aedb

This makes a new sandbox directory with links to the baseline files.

3. Change files:

a. Copy baseline files to the local directory to change them: aecp. The file can then be edited. This is something like "checking out" a file, though it is not locked. Multiple concurrent changes are resolved because a file cannot be checked in unless it successfully builds against the current baseline and passes all tests.

b. Move files: aemv

c. Delete files: aerm

d. Create new files to be added to the baseline: aenf

All changes only occur locally within your development directory.

4. aeb: Build the program

5. aet: test the program

6. aed: see what files have changed from the baseline

7. After a successful local aeb, aet, and aed, you have finished developing your change in your sandbox. Changing any files after an aeb requires you to redo the aeb (this is automatically enforced). Then comes time to integrate.

8. aede: Develop End. The change must now be reviewed by a project reviewer.

9. aegis -review_pass: After the details of the change are approved, the change may begin to be integrated into the baseline. With -review_fail, of course, the change goes back to the developer.

10. aegis -integrate_begin: This makes a completely fresh, clean directory with links to the most current baseline files. It copies all altered files from the change into the integration directory. It then tries to build and test the change one last time in a clean environment.

11. aegis -integrate_pass: After integration succeeds, the files are then actually written to the baseline. All currently open and future changes will now build against this new baseline. For multiple concurrent updates to the same file, this means that whoever's changes are successfully integrated first sets the baseline against which subsequent changes must build and test.

If integration fails, of course, no changes are written to the baseline and the developer goes back to the aeb, aet, aed cycle.



'> 3) I have no idea in what format the data needs to be saved. The
'> simpler, the better.

I am working on a file format for oad data when exported from a 3d package. It uses iff, but is fairly simple; it mostly boils down to: name "attribute name" data "value" wrapped in chunks. Internally (before export, when you are using your attribute editor), you can use anything you want (but again, the easiest is just to store name,value for each field).
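The name/data-wrapped-in-chunks idea can be sketched with a standard IFF-style chunk writer (4-byte ID, big-endian 32-bit length, padding to even length). The chunk IDs "NAME" and "DATA" are my guesses for illustration, not WF's real ones:

```python
import struct

# Minimal IFF-style chunk writer for name/value attribute data.
# Chunk layout follows the standard IFF convention; the chunk IDs
# and the name-then-data pairing are assumptions for this sketch.

def chunk(chunk_id, payload):
    assert len(chunk_id) == 4
    data = chunk_id.encode("ascii") + struct.pack(">I", len(payload)) + payload
    if len(payload) % 2:          # IFF chunks are padded to even length
        data += b"\x00"
    return data

def attribute(name, value):
    return chunk("NAME", name.encode()) + chunk("DATA", value.encode())

blob = attribute("HitPoints", "100")
print(blob[:4])   # → b'NAME'
```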

'> 4) The underlying fundamental or important characteristics of an
'> attribute editor are not yet completely clear. It appears mainly
'> to be a type-attribute system, i.e. you choose an object-type for
'> the mesh, then depending on the type you get a list of valid
'> attributes for this type which you edit. Are there more fundamental
'> ideas which need to play a key role in the early design?

You are correct, the class type must be chosen first to establish what type of game object it is, which determines what attributes will be displayed. The only other key concept I can think of is that the interface is not hard coded: you don't know what attributes a 'Player' has when you compile the editor; you read that information from the oad file and construct a user interface on the fly for it. (Instead of laying out an attribute page, you just lay out a gui element for int, fixed, object reference, color picker, etc., then build stacks of them based on the oad data.)
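A sketch of the build-the-UI-on-the-fly idea, with one widget builder per field type; the type names, entry layout, and widget strings are all hypothetical:

```python
# Sketch: no hard-coded "Player" page layout. One builder per field
# type, stacked in the order the oad entries arrive.

WIDGETS = {
    "int":    lambda f: f"[slider {f['min']}..{f['max']}] {f['name']}",
    "bool":   lambda f: f"[checkbox] {f['name']}",
    "objref": lambda f: f"[object selector] {f['name']}",
}

def build_form(oad_entries):
    # Stack one gui element per entry, driven entirely by the data.
    return [WIDGETS[f["type"]](f) for f in oad_entries]

# Hypothetical oad entries as read from an oad file:
player_oad = [
    {"name": "HitPoints", "type": "int", "min": 0, "max": 100},
    {"name": "CanJump",   "type": "bool"},
]
for line in build_form(player_oad):
    print(line)
```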

'> 5) Are meshes really the only objects which need attributes? (I hope
'> so, though workarounds are always possible.) How about paths?
'> Anything else?

Yes, in World Foundry you have to be an object to get data attached to you (so we have a special object, the "level" object, which contains any level-wide data we wish to encode; this object doesn't appear in the running level). Paths are something which happens to objects, so any object which has a path ought to export it when its position and rotation are exported.

'> 6) This really isnt a problem but I will put it here - I need to
'> be careful about isolating dependencies on widget-set, system,
'> Blender, and my magic-X-encoding in separate classes, so that
'> the fundamental part of the system (type-attribute) can be
'> reused with other editors or other encoding schemes.

Agreed. I will try to supply something which already has full oad support, so all you should need to do is provide the user interface and blender stuff. If you make it able to run stand-alone (without blender), it can be the starting point for any other 3d package's attribute editor as well (I would get that done before attempting the blender integration).


'> To summarize, the videoscape .obj files, even though they are
'> generated 1 per object, are just temporary and thus not so
'> important. The real files are the .blend file containing the object
'> positions, and the one or more attributes files containing the
'> attributes for the objects. For now I will probably have one
'> attribute file per object and will investigate ways of reducing this
'> to one file containing all attributes for all objects (a simple
'> concatenated text file or something like a zip file using zlib),
'> which could conceivably even be bound into the .blend file if I find
'> the appropriate mechanism for doing so -- though everything in 1
'> file doesn't seem of paramount importance to me; two files seems
'> bearable. Two files is fine.

'>> (this is sort of a wart on our oad files, they list all of the data
'>> we wanted to control in addition to what max gave us, so they don't
'>> contain position, orientation, name, etc, even though all of those
'>> things get exported into the .lvl file (except the name, which is
'>> only used for object references)). When everything gets stable I
'>> want to re-think that a bit.
'>
'> Um, can't say I really follow - what part needs rethinking, what is
'> the problem? (But I agree with the part "rethink it when everything
'> gets stable"...)

Sorry. The .lvl file contains object position, object orientation, object bounding box, AND all of the oad data for that object. I think it would be cleaner if the oad data called for position, orientation, etc, so that the only object data in the .lvl file was the oad data (which would then INCLUDE position, etc). Not a priority.

'>>> Orientation: I just finished extending the magic-X to be a
'>>> magic-coordinate system,
'>>>
'>>> Path information:
'>>
'>> Paths are only used for objects which have a set motion (mostly for
'>> platform (sonic or mario) style games), conveyor belts, or
'>> machinery. In WF they are stored as keyframes which we draw
'>> straight lines between, so we can't really follow curves very well
'>> (but it would be easy for someone to add a new position player to
'>> path to do it). So I would get everything else working first, then
'>> we can figure out paths.
'>
'> Thankfully (I just tried it) Blender can store and export polylines
'> as meshes (it can also convert curves into polylines), BUT these
'> polylines consist of tons of ... two-vertex edges, which is exactly
'> what I previously assumed was not allowable and thus could flag
'> the beginning of my meta-data. But, don't panic, I can signal my
'> meta-data with a series of three (or 20, or 42) OVERLAPPING
'> two-vertex edges, which should definitely never occur in any
'> geometry, even in polylines, so the magic-X encoding will work for
'> paths, as well.
'>
'> Anyway, let's not worry too much about paths now; I at least am 95%
'> sure it will work, so it's just an "implementation detail" which
'> we'll do later after the other stuff with boxes is working.
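The straight-line keyframe playback described above amounts to simple linear interpolation between keyframes; a sketch (the (time, position) keyframe layout is an assumption, not WF's actual storage):

```python
# Sketch of a keyframe path player: positions are interpolated
# linearly between keyframes, i.e. straight lines, no curves.

def position_at(keys, t):
    # keys: list of (time, (x, y, z)) sorted by time
    if t <= keys[0][0]:
        return keys[0][1]
    for (t0, p0), (t1, p1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return tuple(a + (b - a) * f for a, b in zip(p0, p1))
    return keys[-1][1]       # clamp past the last keyframe

path = [(0.0, (0.0, 0.0, 0.0)), (2.0, (4.0, 0.0, 0.0))]
print(position_at(path, 1.0))   # → (2.0, 0.0, 0.0)
```

A curved path player would just be another position_at with a different interpolation rule, which matches the "add a new position player" remark above.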

'>>> I think we're getting close: object identity, arbitrary object
'>>> attributes, export of all objects with OID, position, and
'>>> orientation, and almost certainly path information too (once I
'>>> understand exactly how it works in WF). Anything else we need?
'>>
'>> I believe that is all we use for level export (we will discuss
'>> geometry export later, once we have a working level running with
'>> boxes).
'>
'> So... what exactly do I need to export? I imagine something like
'> the following... You click "export level" in my little standalone
'> program. This sends the keystrokes to blender to mark and export all
'> meshes. Then I invoke a perl script (to be written) which parses
'> all the .obj (exported videoscape) files, extracts the id, position,
'> and orientation from the magic-X, finds the object id in the
'> attributes file and finds the appropriate attributes.
'>
'> So far, so good. (It doesn't have to be in perl, either, but it
'> would probably be fastest to whip it up this way.) At this point
'> I have all objects, their positions, orientations, and attributes
'> (including the name; the object-ID is irrelevant to external
'> users).
'>
'> In what form do you then need to access this information?
'> Can I simply then write out some sort of a simple text file
'>
'>   OBJECTNAME=BLAHBLAH
'>   POSITION=1.0,2.0,3.0
'>   ORIENTATION_X=1.0,0.0,0.0
'>   ORIENTATION_Y=0.0,0.0,0.0
'>   ORIENTATION_Z=0.0,0.0,0.0
'>   OBJECT_SPECIFIC_ATTRIB_1="my name is joe"
'>   OBJECT_SPECIFIC_ATTRIB_2="green"
'>   END_OBJECT
'>
'>   OBJECTNAME=BLAHBLAH_2
'>   POSITION=9.0,9.0,9.0
'>   ORIENTATION_X=0.0,0.0,0.0
'>   ORIENTATION_Y=0.0,1.0,0.0
'>   ORIENTATION_Z=0.0,0.0,0.0
'>   ANOTHER_OBJECT_SPECIFIC_ATTRIB_2="he said mount huh huh";
'>   (courtesy of pole.oas, line 15 :-) )
'>   END_OBJECT
'>
'> which you then parse into whatever form you need? Or should I output
'> IFF at this point already? I've never done any IFF stuff, so don't
'> necessarily want to slow up the whole development process by
'> struggling with it if I don't have to...

Wow, that looks just like videoscape (I can't imagine why ;-). Have I got a bargain for you! What if I told you you could have the power of iff with the ease of use of text? All you have to do is produce a text file which iffcomp can read, and you can produce binary iff output. Now how much would you pay? (Doesn't matter, it's free!)
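The NAME=VALUE / END_OBJECT text form quoted above is simple enough to parse in a few lines; this sketch assumes exactly one attribute per line and nothing more:

```python
# Sketch parser for the simple NAME=VALUE / END_OBJECT text format
# quoted in the email above. Parsing details (whitespace handling,
# no quoting rules) are assumptions for illustration.

def parse_objects(text):
    objects, current = [], {}
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line == "END_OBJECT":
            objects.append(current)
            current = {}
        else:
            key, _, value = line.partition("=")
            current[key] = value
    return objects

sample = """OBJECTNAME=BLAHBLAH
POSITION=1.0,2.0,3.0
END_OBJECT
"""
print(parse_objects(sample)[0]["OBJECTNAME"])   # → BLAHBLAH
```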

So, here is some sample output:

'> I'm not a Microsoft Windows expert, but I play one at work. Though
'> Linux is my OS of choice, the reality of the marketplace for my main
'> employer dictates that we write software for M$win, so I have some
'> experience with Windows programming. Therefore it is conceivable
'> that I could start looking at the property editor now even if it
'> doesn't completely run yet. The main problem I foresee is that the
'> code is probably not yet compilable with any free compiler (Cygnus
'> gcc for windows), right? I don't have access to any non-free C++
'> compilers at the moment, and in fact have no need for anything other
'> than gcc.

Since it is based on the 3dsmax code, it is compiled with VC++ (you HAVE to use VC++ to do 3dsmax work). I don't care too much about having a working windows version; I just want to prove the rest of the code is working, and it shouldn't take me too long to debug it. (The source is in wfsource.tar.gz in source/attrib if you want to look at it. It hasn't been updated to read the new oad iff files; it still reads the .oad binary files.)

'>> I decided I would have better luck promoting WF if there were
'>> better levels to show, so I am concentrating on getting a working
'>> windows release out there so that artists could have a chance of
'>> making some. If someone made a cool level showing off the
'>> capabilities of WF, I think it would start to take off.
'>
'> Sounds perfectly reasonable. However, don't forget that the sooner a
'> free (Linux, or at least non-3DS-dependent) GDK is available, the
'> more people will be able to develop levels.

Yes, I am done working on the windows GDK (other than documentation and any support anyone needs).

'> If it's not too fancy, the attributes editor could conceivably run
'> under WINE, the Linux Windows emulator. Wine can run a surprising
'> number of native Windows binaries these days. In fact it can even
'> run Tomb Raider II with hardware acceleration (the Direct3D emuation
'> layer makes calls to Mesa). Pretty amazing, huh? gcc can also cross
'> compile for Windows, meaning the editor could even be compiled under
'> Linux if the source code is gcc-friendly.

Interesting idea; it most likely will work, but getting it to compile under gcc for windows might be more of a hassle than porting it to another gui toolkit under linux. I guess I will leave that up to you. (On second thought, I am sure it would be easier to change compilers than to change gui toolkits; as soon as I get it working I will test it on wine.)

'> I need to review our previous emails to understand the entire system
'> architecture again regarding levels and attributes, but assuming
'> that the attributes editor can run unchanged under Wine, then how
'> much additional work is needed to get a level editing environment up
'> and running with, say, blender? What type of file does the
'> standalone attributes editor read in, anyway? I am thinking along
'> these lines:
'>
'>        Blender            stand alone
'>           \              property editor
'>      my magic-x /             /
'>      object ID encoding      /
'>              \              /
'>         some common file format
'>
'> In other words, I lay out some cubes in Blender and write
'> only the most elementary of attribute information (type, name) into
'> some common file format using my already-established magic-X
'> encoding and blender remote-control via keystrokes. The standalone
'> property editor (running under Wine on Linux, for instance) would
'> run parallel as a separate program and would read in the file I just
'> wrote, and from there I could fill in the additional properties. In
'> other words (hey, there's that phrase again) I would just use
'> Blender and my remote-control system to help identify and position
'> objects in 3D space. Everything else would be specified through your
'> already-working (or almost-already-working) standalone property
'> editor, which could run under Wine on Linux. Is this chain of
'> reasoning tenable? If so, the main problem I see is the common file
'> format read by the existing standalone property editor. I guess
'> this is IFF? Or is it even uglier, being some sort of native 3DS
'> format? Sorry if you already covered this in a previous email, it's
'> all kind of foggy in my memory right now.

Yes, I think that can be made to work. There are actually two common file formats: the ObjectAttributeDescription (oad), which describes each field (name, etc), which is only changed by programmers (and which the attribute editor reads but doesn't change), and the data itself (oad data), of which there is a copy for each object, and which is what the attribute editor edits. I intend both of these to be stored in .iff files. Currently the oad is stored in a binary file (.oad) (although I have made the production pathway produce an .iff version as well), and the oad data is stored inside of 3dstudio max (although I have written an exporter which writes it out as .iff). So I have the .iff files, just not the code to use them (I am about half finished with a version of the level converter which uses the .iff files).

Within a few days I will get back on that task.

'> As for me, I enhanced the magic-X Blender encoding so that it also
'> works with paths (polylines) by using 4 overlapping edges at the
'> beginning to flag the start of the metadata. It still needs a bit of
'> tweaking but is mostly working okay.

  • ** TODO
--> NOTE NORMAN 18 sep 2000: not correct, there is no "End of transmission" flag or bit counter for the overlapping edges => no way to know when the id stops and the path begins. but if the path cannot overlap the magic bit edges then there's no problem, the code in blend_at/vidinfo must stop once it finds a non-overlapping edge (maybe it already does?)
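Detecting the overlapping-edge flag could work along these lines; the repeat count and edge representation are assumptions, and (per the note above) a real version still needs an end marker or bit counter:

```python
# Sketch: metadata in the polyline is flagged by N identical leading
# two-vertex edges, which should never occur in real geometry. The
# flag count (4) and (v0, v1) edge tuples are illustrative choices.

def split_flag(edges, flag_repeats=4):
    # edges: list of (v0, v1) vertex-index pairs
    if edges[:flag_repeats] == [edges[0]] * flag_repeats:
        return edges[0], edges[flag_repeats:]   # (flag edge, payload)
    return None, edges                          # no flag: plain geometry

flag, payload = split_flag([(0, 1)] * 4 + [(1, 2), (2, 3)])
print(flag, payload)   # → (0, 1) [(1, 2), (2, 3)]
```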


'> As far as I can tell, I need to get up to speed on the OAS/OAD
'> files to do an attribute editor. I am not quite sure as to what I
'> need to do. For writing a new attribute editor, do I need to
'> parse/create OAD files, or are OAD files completely 3DS specific? I
'> am a bit confused here. I had a first pass at reading the readme.txt
'> file in the oad and oas directories but am still kind of
'> disoriented. Maybe you could give a quick summary as to the whole
'> pipeline, starting from OAS and going down through the
'> transformations into different file types, which of those file types
'> are 3DS specific, and what a non-3DS attribute editor actually needs
'> to create and parse? -Norman

This stuff is hard to describe, I need to relay a lot of data, and I am not sure what the correct order is.

This can get confusing real fast. The problem is we don't have very good terminology for discussing this stuff.

First I want to explain what is in an .oad file:

An oad file describes an object type in the game. There is one oad file for player, one for platform, one for camera, etc. The oad describes what attributes each object type has (it does NOT contain how many hit points the player has, it indicates the player has an attribute called "HitPoints", that it is a fixed point number, what its valid range is, and maybe a help string).

Each oad file contains an array of oad entries. An oad entry is a struct (defined in source/oas/oad.h, called _typeDescriptor). (Note this struct is not used in the game!, only by the tools). Each entry maps to exactly one attribute on an object. Each entry contains a variety of information:

  Display information:
    name
    type (int, color, object reference, bool, etc)
    user interface (combo box, slider, drop down menu, text entry, object selector)
    help string describing usage
    etc
  Editing information:
    minimum value, maximum value, default value, validation
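As a rough sketch of what one oad entry carries (the real struct is _typeDescriptor in source/oas/oad.h; every name and type below is invented for illustration, and the real one differs in detail):

```cpp
#include <cassert>
#include <cstdint>
#include <string>

// Hypothetical sketch of an oad entry.  The real struct is
// _typeDescriptor in source/oas/oad.h; these names are made up.
enum class FieldType { Int32, Fixed32, Boolean, Color, ObjectReference, XData };
enum class ShowAs { Number, Slider, ComboBox, CheckBox, TextEntry, ObjectSelector };

struct OadEntry {
    std::string name;         // display name, e.g. "HitPoints"
    FieldType   type;         // storage type of the attribute
    ShowAs      ui;           // which widget the attribute editor builds
    std::string help;         // help string describing usage
    int32_t     minValue;     // editing/validation information
    int32_t     maxValue;
    int32_t     defaultValue;
};

// Validation of the kind the attribute editor might apply on entry.
inline bool inRange(const OadEntry& e, int32_t v) {
    return v >= e.minValue && v <= e.maxValue;
}
```

The attributes editor then just walks the array of entries and builds one widget per entry, so changing the oad changes the interface without a recompile.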

The attributes editor builds its display by reading the oad file. For example: if an oad file has 4 oad entries, the attributes editor will build a user interface with 4 editable fields, each customized to its entry.

Since they say a picture is worth a thousand words, look at the section labeled "Common", which was created by reading the oad file generated from:

  PROPERTY_SHEET_HEADER(Common)
  TYPEENTRYFIXED32(Mass,,FIXED32(0),FIXED32(100),FIXED32(DEFAULT_MASS))
  TYPEENTRYFIXED32(Surface Friction,,FIXED32(0),FIXED32(1.0),FIXED32(DEFAULT_FRICTION))
  TYPEENTRYFIXED32(hp,,FIXED32(0),FIXED32(32767),FIXED32(DEFAULT_HITPOINTS))
  TYPEENTRYINT32(Number Of Local Mailboxes,Local Mailboxes,0,40,0,"",SHOW_AS_SLIDER)
  TYPEENTRYOBJREFERENCE(Poof)
  TYPEENTRYBOOLEAN(Is Needle Gun Target,Needle Gun Target,0)
  TYPEENTRYINT32(Write To Mailbox On Death,Death Mailbox, 0, 3999, 0,,SHOW_AS_NUMBER,"Mailbox to write to when object is destroyed")
  TYPEENTRYXDATA_CONVERT(Script,DEFAULT_SCRIPTNAME,"Cave Logic Studios AI Script||..\\scripts\\default.s",XDATA_NOT_REQUIRED,,,,,XDATA_SCRIPT)
  TYPEENTRYBOOLEAN(Script Controls Input,,0)
  TYPEENTRYXDATA_NOTES
  PROPERTY_SHEET_FOOTER
  LEVELCONFLAGENDCOMMON

As you can see, there is a one-to-one correspondence between user interface elements and oad entries. The whole purpose of the oad file is to describe these user interface elements and how to write out a structure the game can read. Change the oad file, and the interface changes (without recompiling the attributes editor).

The 3dsmax attributes editor used a binary data file (.oad). The .oad file was a kludge (it was created by making a C structure definition, compiling it (without a main), and using the binary output). It sucks, and is not portable.

So, it was recently decided to switch to an iff file which contains all of the attribute information. I have made the oad system generate these files, but they have not been tested very well.

I already have C++ classes which load the .oad files into memory, making it easy to query it, etc. I plan to make the same classes read from the iff file instead. Once this is done, you should be able to write the attributes editor without worrying much about how the data is stored on disk.

The oad files are compiled from the oas files; the oas files are the source, and the only files that should ever be edited by the programmers making a game. Through much trickery (ok, by using prep), I generate several different output files from the .oas file:

  .iff.txt  (new improved oad file, goes into the attributes editor)
  .ht       (header file for including in the engine)
  .oad      (not implemented on linux, use the iff file instead)
  .def      (used by the scripting system)

We have a stand-alone windows version of the attributes editor (not really useful, but the idea was to use it as a starting point for adding attribute editing to other 3d packages). I haven't verified it builds right now, but I will soon. Once I get it working, this is where I plan to test switching from oad to iff files. Once that is all working I think you should use it as the starting point for making a GTK version (does GTK have a C++ interface? I would like to use a well abstracted user interface if possible).

I hope this explanation helps. I focused on the part pertaining to making an attributes editor, and skipped lots of other oas directory info, so ask away.


'> Looks doable (except for the quaternion parts - ack! Only thing I
'> know about quaternions is that their discoverer scratched their
'> derivation on a bridge whilst walking home so he wouldn't forget
'> it). Got any good online references? (Not just formulas, I want to
'> understand what they are!)

Quats are weird, but they have the unique property of being able to tween orientations naturally (if you tween eulers, it can look strange). They are only there because that is how the data came from 3dsmax. We use Eulers in the engine, so we can just change it to be eulers.

'>> We have a library which makes writing such iff files (text or
'>> binary) easy.
'>
'> Is this library already part of the (compilable) Linux wftools? What
'> do the library calls look like (sample code perhaps?)

The library is in wfsource/source/iffwrite, use looks like:

    _iff->enterChunk( ID( "STR" ) );
        _iff->enterChunk( ID( "NAME" ) );
            _iff << "Class Name";
        _iff->exitChunk();
        _iff->enterChunk( ID( "DATA" ) );
            _iff << className.c_str();
        _iff->exitChunk();
    _iff->exitChunk();

'> And, any news on the Windows standalone attribute editor? (Again, no
'> rush - just hungry for news!)

The runtime is broken down into several libraries in a loose hierarchy:

HAL: Hardware Abstraction Layer: abstracts things like joysticks, task switching, game startup and shutdown, messaging, etc. Unfortunately, mostly written with a C interface
Pigsys (needs to be renamed to SAL): Software Abstraction Layer: abstracts typical C operating system calls like file IO, memory management, etc.
Memory: functions to manage a chunk of memory; manages allocations inside of blocks of memory, both dynamic and static allocators
cpplib: general C++ stuff, contains our re-directable streams, and the null-stream implementation
streams: code to deal with streaming assets from disk
iff: iff reader. All assets are stored as binary iff files, this is the reader
math: 1.15.16 bit fixed point package, and also portions of a 1.31.32 bit package
input: upper level input handling (not hardware specific)
gfx: entire 2d/3d renderer and pipeline
anim: handles model animation
audio: beginning of the music/sound effects code/interface
physics: collision and movement handling
particle: a particle system which makes lots of lightweight objects
game: main game code. Needs to be broken into smaller libraries (first step is to reduce cross-linkage between pieces)

The rest are less interesting or obsolete:

profile: psx execution profiler
midi: beginning of midi player
movie: psx movie player
menu: simple text based level selector

savegame: code to serialize game state data, probably out of date
loadfile: used to directly load a file without going through the streams asset system. Should only be used by debugging or test code
timer: thread based wakeup packet sender, not really used in the engine at this time
console: simple printing into a text buffer which is then displayed (used on playstation for startup screen)
shell: beginning of a game shell to go around the level engine. Since abandoned in favor of using the level engine to implement all of the startup screens, etc.

Not used by the engine, only by tools:

eval: math expression parser
attrib: stand alone version of the attributes editor, currently only compiles for windows
audiofmt
gfxfmt
iffwrite: iff writer library
ini: simple windows ini parser
oas/oad
recolib
regexp: regular expression expansion
registry: windows registry interface
template
toolstub

To get started I would recommend looking over the code base and getting familiar with the role of each of the libraries.

What I really need to write is a coding standards document, and document the assertion macros and streaming code we use everywhere.

BTW, each library has a test program in it for exercising that library, so you can play with each library in isolation (whenever a bug is found in a library, first we update the test program to reproduce the bug, then fix it, so that we get automatic regression testing).


'> I tried to run streets and Babylon but I get malloc errors for some

Since this engine had the playstation as a target, we have our own memory allocation. There is a file called ram.iff.txt (I recently changed this, your version might be called ramiff.txt) which controls the size of each memory buffer. All game/level specific memory allocations come from these buffers, which means if it runs at all, it will never fail with a system memory allocation failure (so once a game has been tested enough to ensure it allocates enough to start with, it will never fail).

Those levels need more ram (or in a different configuration) than the default. I probably failed to update those files in the zips I uploaded.

Here is the one I have for streets:

    {
        'RAM'
        'OBJD' 340000l
        'PERM' 123600l
        'ROOM' 191072l
        'FLAG' 0l 0l    // doomstick, bungeecam
    }

Here is the one I have for babylon5:

    {
        'RAM'
        'OBJD' 340000l
        'PERM' 153600l
        'ROOM' 131072l
        'FLAG' 0l 0l    // doomstick, bungeecam
    }

I have not tested babylon5 in quite some time, it might not run without some work (it was going to be a descent style level, it is just started, so it is not much to look at anyway).
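The idea behind those fixed buffers can be sketched as a bump-pointer arena (a minimal illustration, not the engine's actual Memory library; all names are invented): each buffer ('OBJD', 'PERM', 'ROOM') is a pre-sized block, and exhaustion shows up as a controlled failure at load time rather than a random malloc failure mid-game.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Sketch of a fixed-size arena in the spirit of the ram.iff.txt
// buffers.  Not the engine's Memory library; names are invented.
class FixedArena {
public:
    explicit FixedArena(std::size_t bytes) : storage_(bytes), used_(0) {}

    // Returns nullptr when this buffer is exhausted -- the
    // "needs more ram" situation described above.
    void* alloc(std::size_t bytes) {
        if (used_ + bytes > storage_.size()) return nullptr;
        void* p = storage_.data() + used_;
        used_ += bytes;
        return p;
    }

    void reset() { used_ = 0; }  // e.g. on level unload
    std::size_t remaining() const { return storage_.size() - used_; }

private:
    std::vector<unsigned char> storage_;
    std::size_t used_;
};
```

Once a level has been loaded successfully with a given configuration, the same allocations will always succeed, which is the property the text above relies on.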

'> Another question (I would probably ask fewer questions if I had the
'> development environment up and running so I could try stuff out) -
'> you only store 3 rooms of the world in memory at once. Does this
'> mean that NPC's can only be active in these three rooms? In other
'> words, if I am at one side of the world, and an NPC is 50 rooms
'> away, can the NPC move around, think, shoot, or basically act even
'> though his room is not currently within the "3 active room" window?

Currently only objects in the 3 rooms run, so everything else is "frozen" in time. We talked about having objects which always run, but that can get problematic with the physics engine. As fast as computers are today, probably the best solution for this sort of game would be to separate what gets rendered from what gets physics, i.e. only render 9 rooms or so, but update all of them.

'> 1. It appears your visibility scheme is room-based: one room before,
'> the current room, and one room after. (I use a portal scheme in my
'> engine.) Is World Foundry therefore limited to indoor types of
'> environments, or can outdoor scenes also be done reasonably well?
'> (Landscapes, mountains, space combat shooters, flight simulators,
'> racing games, etc)

The current 3 room limit came from having the sony playstation as one of the targets (this allowed us to control 3 resources: texture memory, model memory, and execution time). In the short term I would like to simply increase the # of rooms loaded at once, which would greatly expand the game possibilities (if we increase it to 9 it would allow laying out a 2d grid of rooms for games like racing, etc). In the longer term I would like to investigate other methods of model/execution management (bsp trees, etc.), and I haven't had the time to read up on portal based systems.

In addition, we have a tile based 2d background renderer for distant images (not currently ported to GL).
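The 3 room window can be sketched like this (illustrative only, not the engine's code): only rooms within a radius of the camera's room are live, everything else is frozen.

```cpp
#include <cassert>
#include <cstdlib>

// Sketch of the "3 active rooms" window: rooms within `radius` of the
// camera's room are run (rendered, collided, scripted); the rest are
// frozen in time.  Names are invented.  With rooms laid out linearly,
// radius 1 gives the 3-room window; a radius over a 2d grid of rooms
// would give the "9 rooms" layout mentioned above.
inline bool roomIsActive(int roomIndex, int cameraRoom, int radius = 1) {
    return std::abs(roomIndex - cameraRoom) <= radius;
}
```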

'> So most of the games produced with World Foundry have been indoor,
'> right? That's what it looked like from the screenshots.
'> right? That's what it looked like from the screenshots.

Indoor, outdoors with fog (with movement restricted), or floating in cyberspace (no boundaries, if you fall, you die).

'> It gets more confusing (to me) when I start thinking about
'> arbitrary, unlimited, outdoor geometry. I suppose some sort of a
'> regular spatial partitioning (such as a 9x9 grid like you suggested)
'> is one of the more easily doable and understandable
'> strategies. Every other method seems to require some complicated way
'> of deciding when to stop drawing (when you reach a certain depth in
'> the BSP tree, when you have covered all pixels in the screen, when
'> you have reached a certain distance from the camera, when you have
'> run out of rendering time, etc.).

I think the key to any system is quick ways to cull large pieces of geometry. It doesn't matter if it is bsp's, bounding boxes, whatever; as long as you can quickly decide to ignore most of the level geometry, you should be all right (and with the speed of today's cpus, you don't even have to try that hard).

'> Also how about networking and sound support, how do these work in a
'> cross-platform way, or more fundamentally, how do they work at all
'> in WF?

Our sound support is a bit primitive, we can trigger sounds from inside of scripts, we can trigger sounds from animation key frames (with events). The sound stuff definitely needs more design work, we were just getting to it when we stopped.

We don't currently have any networking stuff, but our game can run deterministically (we have a joystick recorder which can reproduce an entire game by just recording the joystick and time values), so it shouldn't be that hard to make a frame-locked networked version (revisionist history would be MUCH harder). Actually, I think revisionist history can't really work unless gameplay is quite simple.
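The joystick-recorder idea can be sketched like this (a toy illustration, not the engine's recorder; the "simulation step" here is a stand-in for the real deterministic game update): because the update is a pure function of state and recorded inputs, replaying the tape reproduces the whole game.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Sketch of deterministic replay from recorded joystick + time values.
// All types and names are invented for illustration.
struct InputFrame {
    uint32_t joystickBits;  // button/axis state for this frame
    uint32_t deltaTicks;    // elapsed time fed to the update
};

struct Recorder {
    std::vector<InputFrame> tape;
    void record(uint32_t joy, uint32_t dt) { tape.push_back({joy, dt}); }
};

// Stand-in for the real game update: same inputs in, same state out.
inline uint32_t step(uint32_t state, const InputFrame& f) {
    return state * 33u + f.joystickBits + f.deltaTicks;
}

inline uint32_t replay(const std::vector<InputFrame>& tape, uint32_t seed) {
    uint32_t s = seed;
    for (const InputFrame& f : tape) s = step(s, f);
    return s;
}
```

This is also what makes the debugging workflow described later possible: record a session on one platform, replay it on another, and diff the debugging streams.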


'> Just a quick question which occurred to me as I considered some of
'> the technical issues a 3D engine has to face. How does WF know which
'> objects are within a room? There doesn't appear to be a "starts
'> within room" attribute on any objects. So does WF use the bounding
'> box of the room to determine which objects are in the room, or how
'> does this work? Is it even correct to say that objects "belong to" a
'> room? What is the relationship between room and object and how is it
'> specified?

Yes, it uses the bounding box of the room.

You have to make room objects which are the size and shape of the room. The level converter (currently runs in 3dsmax and exports a .lvl file) sorts all of the objects into room lists in the .lvl file (based, as you said, on the bounding box of the room). When the game loads a level, it goes through all of the rooms constructing the objects (we construct all of them, even though we only run 3 rooms worth at a time). The room objects keep track of which objects they contain, and hand them off to other rooms as they move about.

Yes, objects "belong" in a room. It is possible for an object to be in room 1 and stick out into room 2 (which room you are in is decided by your position point). Since the current collision system doesn't check objects in the previous room against objects in the next room, objects should not be longer than a room (or a long object in room 3 sticking into room 2 might pass through a long object in room 1 sticking into room 2). Even fairly short objects can cause problems if you are not careful with the level design (if there is an object in room 3 sticking into room 2 being stood on by an object in room 2, and the camera moves into room 1, the objects in room 3 stop getting rendered and collision checked, so the object in room 2 falls through it). On the psx we were willing to live with this; now I think we definitely want more rooms running at once.
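The "which room you are in is decided by your position point" rule can be sketched in one dimension (illustrative only; the engine tests the position point against 3D room bounding boxes):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Sketch of room membership: an object belongs to whichever room's
// bounding box contains its position point.  1D for brevity; names
// are invented.
struct Range { float lo, hi; };

inline bool contains(const Range& room, float pos) {
    return pos >= room.lo && pos < room.hi;
}

// Returns the index of the room owning `pos`, or -1 if none does.
inline int ownerRoom(const std::vector<Range>& rooms, float pos) {
    for (std::size_t i = 0; i < rooms.size(); ++i)
        if (contains(rooms[i], pos)) return static_cast<int>(i);
    return -1;
}
```

An object straddling a room boundary still has exactly one owner, which is why long objects sticking across several rooms cause the collision gaps described above.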


It is one of the features of doing it this way.

When we load a level, as we create each object, we associate that object with some geometry (usually by loading it off disk). We actually have several different render types for objects:

None: you don't see anything (good for cameras and actboxes)
Model: a 3D model
Box: a 3D box the size and shape of the object's collision box
Scarecrow: a 2D bitmap in 3space which always faces the camera (like Doom)
matte: a large, tiled 2d scrolling background (like a Genesis game)
emitter: a particle system which produces particles which do not get physics run on them (we also have a generator object which produces objects which DO get physics run on them)

So as you can see, each object can have different visual representations. When we load the level, if the render type is set to "box", instead of loading geometry from disk, we just create a box which is the size and shape of the collision box for the object.
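As a sketch only (these declarations are assumptions for illustration, not the engine's actual code), the render types above might look like:

```cpp
#include <cassert>

// The render types listed above as an enum.  This declaration and
// the helpers below are invented for illustration.
enum class RenderType { None, Model, Box, Scarecrow, Matte, Emitter };

// In this sketch only "Model" loads 3D geometry off disk; "Box"
// synthesizes a box from the object's collision box, and "None"
// draws nothing at all.
inline bool loadsGeometryFromDisk(RenderType rt) {
    return rt == RenderType::Model;
}
inline bool synthesizesFromColbox(RenderType rt) {
    return rt == RenderType::Box;
}
```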

'>> We attach the attribute data to each object in the modeler so that
'>> it gets saved out with the modeler's project file. So we need to be
'>> able to attach arbitrary data to an object (coming in the next
'>> release of ewave), and we need to be able to run a plug-in on an
'>> object when the user requests it.
'>
'> The arbitrary data attachment sounds like it is half of the work, is
'> this right? And what do you mean by running a plug-in on an object -
'> do you just mean calling up its property editor, or is there more?

If the modeler already has the ability to attach data to objects, that part is a snap. If it doesn't, then we would have to add that ability to it. I mean running OUR property editor (which we call the attributes editor), which allows editing of World Foundry attributes, like mass, render type, etc. When I say "plug-in" I just mean our code instead of the modeler's code. It could be a plug-in, extension, script, whatever, as long as it can tell which object is selected, read/write our data to that object, and present a user interface to allow editing our attributes.

'> And, how about non-humanoid characters - robots, spaceships, dogs,
'> snakes, dinosaurs, snails, sharks. Are these doable? It looked like
'> the screenshots all seemed to involve humanoid-based characters.

Of course the imagery can be whatever you want. The physics system views most things as axis aligned boxes (the exceptions are sloped surfaces (which are always contained in a box for quick culling), and what we call 'handles', which are points in geometry which can collide with other collision boxes (imagine a player stabbing another player with a sword; the handle would be the tip of the sword)). We don't handle objects which change shape greatly very well, but then, show me a game that does wink I do have some ideas in this area, however.


'>> But overall, the physics wasn't hard, it was the collision handling
'>> that was hard (it is easy to calculate where an object would be if
'>> it doesn't hit anything, it is much more difficult to figure out
'>> where it should be after hitting 3 objects in the middle of the
'>> last frame, and do it quickly).
'>
'> As for the physics, I have heard people say that simple so-called
'> "Euler integration" (I am not 100% sure what they mean by this, I
'> assume they mean simply computing the physical time difference
'> between one frame and another and plugging this delta-t into the
'> velocity and acceleration) has lots of instability problems.
'> Nonetheless, due to its simplicity, and before I try to understand
'> anything more complicated, I was wanting to play with this kind of
'> physics. Did you use this in WF, or do you use some more advanced
'> form of integration? What is important for physics in a game,
'> anyway? Velocity and acceleration, clearly enough -- but what else?
'> WF seems to have momentum transfer too, a phenomenon whose equation
'> I have unfortunately long forgotten - any other important physics
'> equations built into WF's physics engine?

I haven't heard of it, but we do use Euler's to store our orientation. Integration would be slow, so we simplify by saying that all accelerations are linear (the rate of acceleration does not vary in the middle of a frame). Once you do that, then you can just average the old and new speed across the frame (if you start the frame at .1 meters per second, and you end at .2 meters per second, you can just pretend you went at .15 meters per second the entire frame). Which is not quite accurate (certain objects which should have "just missed" each other might hit, etc). But the higher the framerate, the more accurate it becomes. (If we just ran physics at 1000 frames per second, a lot of problems would just go away.)
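The averaging scheme described above (constant acceleration across the frame, position advanced at the average of the old and new speed) can be written down directly; for piecewise-constant acceleration it is exact. A sketch, not WF's actual code:

```cpp
#include <cassert>
#include <cmath>

// Frame integration as described above: acceleration is constant
// across the frame, and position advances at the average of the old
// and new velocity.  For constant acceleration this is exact: going
// from 0.1 m/s to 0.2 m/s really does cover ground at 0.15 m/s.
struct State { float pos, vel; };

inline State integrate(State s, float accel, float dt) {
    float newVel = s.vel + accel * dt;
    s.pos += 0.5f * (s.vel + newVel) * dt;  // average speed over the frame
    s.vel = newVel;
    return s;
}
```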

'> As for collision detection, one general approach I read about is to
'> assume the invariant condition is that no objects are colliding
'> during any frame. Then, after updating an object's position, if it
'> is colliding, then move it back where it was earlier. If you can't
'> move it back to where it was earlier because something else moved
'> there in the meantime, you have to move that object back, too -- and
'> possibly meaning in the worst case that everything has to be moved
'> back to its original location. Do you use such an approach in WF,
'> or do you search space for new allowable non-colliding positions, or
'> something else?

Our physics is kind of interesting, in that it is the design of two people: myself, who has no official training (and started very weak on physics and math), and Phil, who had a much more rigorous math/physics background. I had years of video game experience, he was just starting. So what would happen is I would make up some weird way of accomplishing something (usually involving a white board and diagrams), and then task Phil with implementing it. As he ran into problems, we would work together identifying what was the cause, and try to come up with ways to fix it.

So, here is how it works (Phil, feel free to correct me where I get it wrong, this went through so many revisions I might mis-remember something):

Each object has a starting position each frame, and calculates an ending position where it would be if it didn't hit anything. We then calculate an axis aligned collision box surrounding the object at both the starting and ending positions. This is called the expanded colbox.

We do (NxM)/2 collision checks on the expanded colboxes of everything in the room (and objects in the 2nd room check against objects in the 1st and 3rd). This produces a collision list (one entry per collision; if one object hits 3 others, there will be 3 entries in the list). As we are generating the list, we calculate what time the 2 objects hit each other (this time will obviously fall between the beginning of the frame and the end of the frame). The list is then sorted by this time.

Then the list is processed one entry at a time, dealing with each collision in turn (configure the objects to the time in question, calculate the result of the collision). If the resulting positions of the objects still fit inside of the original expanded colbox, then we re-check the objects against the other objects in the collision list, add any appropriate new collisions, sort them by time, and proceed. If the new positions are OUTSIDE of the expanded colbox, we have to re-collide the offending object with the room again (1xM).

So we try to do everything a frame at a time, but when a collision occurs, we sub-divide the frame time to the time of each collision.

In most cases, this system works fairly well in quickly reducing the # of objects which need to be processed (bsp trees or some other collision partitioning would help a whole bunch more).
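A greatly simplified sketch of that collision pass (one axis, invented names, and the hit time left as a placeholder; the real code also solves for when each pair actually touches):

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <vector>

// Sketch of the per-frame collision pass described above, 1D and
// greatly simplified.  Each object gets an "expanded colbox" spanning
// its start and end position; overlapping expanded boxes produce
// collision-list entries, which are then sorted by hit time.
struct Obj { float start, end, halfSize; };
struct Hit { int a, b; float t; };
struct Expanded { float lo, hi; };

inline Expanded expand(const Obj& o) {
    return { std::min(o.start, o.end) - o.halfSize,
             std::max(o.start, o.end) + o.halfSize };
}

inline bool overlaps(const Expanded& x, const Expanded& y) {
    return x.lo < y.hi && y.lo < x.hi;
}

// Build the pairwise collision list and sort it by time of impact.
inline std::vector<Hit> collisionList(const std::vector<Obj>& objs) {
    std::vector<Hit> hits;
    for (std::size_t i = 0; i < objs.size(); ++i)
        for (std::size_t j = i + 1; j < objs.size(); ++j)
            if (overlaps(expand(objs[i]), expand(objs[j])))
                hits.push_back({(int)i, (int)j, 0.0f}); // real code solves for t
    std::sort(hits.begin(), hits.end(),
              [](const Hit& l, const Hit& r) { return l.t < r.t; });
    return hits;
}
```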

I never liked the idea of moving objects inside of each other and then trying to fix it. First off, that is not how it happens in real life, so I have no proof it works. (I always said "How does the universe do it?", which is why I don't know calculus, the universe doesn't calculate where the cannonball will land, it just iteratively adds the velocity and subtracts gravity, and it eventually gets there). I have seen failed versions of this idea where when attempting to back out the object would go the wrong way and come out the other side of the wall, etc.

On 23-Sep-99 nlin wrote:

'> First off, thanks for the physics explanation, that will give me
'> reading material for a couple of days for sure (man-days, that is,
'> probably meaning weeks before I actually get around to it). I did
'> wonder about this though:
'>
'>> Then the list is processed one entry at a time, dealing with each
'>> collision in turn (configure the objects to the time in question,
'>> calculate the result of the collision). If the resulting position
'>> of the objects still fit inside of the
'>
'> What do you mean, "the result of the collision"? So you find a
'> collision, you set the Grand Universal Clock to the time of the
'> collision, but what do you calculate? Is this where objects bounce
'> off of each other, transfer momentum, etc?

Well, if you want to get a headache, just read wfsource/source/game/ function ResolveCollisionEvent (line 259). Yes, this is where we do momentum transfer, and handle special cases like anchored objects (walls, floors, etc), objects on paths (which won't deviate, so they also behave like they have infinite mass), and steps (if you hit the edge of an object and its top is slightly higher than your bottom, just raise yourself over it; this allows for climbing steps ala doom, but also covers over small alignment errors in the level data). I don't remember the momentum transfer function off the top of my head either, but it is in the code. I realize that I didn't really answer your question, I would have to dig in and wrap my brain around it again to really articulate some of it, and don't have the time right now.

'> Also do you have any opinions as to whether this sort of stuff would
'> be easier/harder/better/worse with bounding spheres instead of
'> boxes, or some combination of sphere-sphere and sphere-plane
'> collision methods (which are what I currently use)? I could imagine
'> that (while still not nearly 'correct' as poly-to-poly collision)
'> the combination of sphere-sphere, sphere-plane, and box-box could
'> cover a lot of common collision cases pretty well.

Well, for most level games, the floor is flat, so you at least need bounding boxes (a space flight simulator might get by with just spheres). So since we had to have bounding boxes, we decided to try to make everything work with them. Near the end we added box-plane collision for sloped surface support.

'>> original expanded colbox, then we re-check the objects against the
'>> other objects in the collision list, add any appropriate new
'>> collisions, sort them by time, and proceed. If the new positions
'>> are OUTSIDE of the expanded colbox, we have to re-collide the
'>> offending object with the room again (1xM).
'>
'> I am still trying to figure out the idea behind the expanded colbox.
'> It sort of represents the volume of the path that the object took
'> between the last frame and this one, right? So if the new position
'> is inside of the expanded colbox this means that the collision
'> stopped it from moving along its path but didn't push it OUTSIDE of
'> its path. If the new position is outside of the expanded colbox this
'> means that the collision pushed the object outside of its path thus
'> requiring a new collision check. Is this a correct interpretation,
'> or am I missing the point of the expanded colbox?

The idea was to cull possible collisions quickly. I reasoned that if two expanded colboxes didn't overlap, then the containing objects did not touch. This idea would suck in a situation where all of the objects were going so fast they traversed most of a room every frame. The expanded colboxes of each object would touch most of the others, and we would run real slow.

Since most objects don't move that fast per frame, in a typical case very few objects have expanded colbox overlap, and we focus our efforts on dealing with those. An overlapped colbox doesn't mean the 2 objects hit, just that they could have. We then do more mathematically intense checking to see if they actually hit. We could throw away the expanded colbox code completely and the game would still work, just run much slower as it spends time considering cases it will eventually reject.

Since the expanded colbox is axis aligned, an object moving diagonally will generate more false hits than an object moving along an axis. If after a collision the object still fits entirely inside its expanded colbox, it doesn't need to consider any additional game objects that weren't already in its collision list (it didn't increase the # of objects it could have hit). If it does leave its expanded colbox, it needs to consider all of the other objects in the room, since it now could have hit objects which did not overlap its expanded colbox.

'> Also what exactly are N and M in your equations? I fiddled around
'> with a couple of assumptions (N the # of moving objects, M the # of
'> static objects) but couldn't exactly decide what these are.

Sorry, N & M are both the # of objects in the room; we check object #0 against the rest, then object #1 against #2 and up, etc, so we end up doing (NxM)/2 checks.
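That loop shape gives N*(N-1)/2 pair checks, which is what the (NxM)/2 shorthand is counting. A sketch of the loop:

```cpp
#include <cassert>

// Pair-check count: object #0 against all later objects, #1 against
// #2 and up, etc.  This is the (NxM)/2 of the text, i.e. N*(N-1)/2.
inline int pairChecks(int n) {
    int checks = 0;
    for (int i = 0; i < n; ++i)
        for (int j = i + 1; j < n; ++j)
            ++checks;
    return checks;
}
```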

'> (Looks like this discussion could be the beginning of part of a
'> programmers guide, eh?)

That would be cool, especially if someone else did the editing. We could host it on the WF web site. I would like to start with a FAQ, and grow it into some programmers documentation. I am sure you have noticed by now I am not a very good manual writer.

'> What would theoretically be wrong with a sphere-plane collision for
'> the character-floor collision?

Nothing, but since our main character looked much more like a box than a sphere, we just stuck with box-box collisions. Also, our floors aren't planes, they are boxes, which means they don't extend to the horizon.

'>> An overlapped colbox doesn't mean the 2 objects hit, just that they
'>> could have. We then do more mathematically intense checking to see
'>> if they actually hit.
'>
'> Aha, that was the key point I missed, that the expanded colbox is
'> used as a first-pass "could they possibly have collided" before
'> bothering to do further checks.
'>
'> When you say "more mathematically intense checking" do you mean
'> actual poly-poly penetration for every polygon of both objects?

No, our physics don't know anything about geometry, it just works with the bounding boxes (2 exceptions: handles, and sloped surfaces: handles are a vertex which can collide with other bounding boxes, and sloped surfaces are done as one additional set of collision checks after all of the others).

"more mathematically intense" is just real did it hit or miss collision checks. Involves calculating at what time exactly did the two objects begin to overlap (in all 3 axis), plus other stuff I can't remember without looking at the code again.

I can tell you what it means: one of the fundamental assumptions is that objects can never be inside of each other; it is the task of the physics system to prevent this from occurring. This assertion firing means that at the beginning of a frame, before the physics had done anything, actor 1 and actor 67 were inside of each other. (BTW, this restriction only applies to objects with a mass greater than 0, so lots of objects can be inside of an activation box, for example.)

So this either means there is a bug in the physics, or a bug in some code it relies on (like math). The fastest way to tell if it is Linux-specific would be to enable the joystick recorder and send me a recording which reproduces the problem; then I could play it on the Windows version and see if the same thing occurs. If not, I will crank up the debugging streams, run one of each (Linux and Windows), and compare their output to see where the difference occurred. (This is how I debugged most of the Linux port.)

This sort of thing is much easier to track down when you can edit your own levels (make a simple test case).


On 25-Sep-99 nlin wrote:

'> The collision assertion I mentioned in my last message is as follows:


'> AssertMsg: Actor #1 (unknown) and Actor #67 (unknown)
'> ASSERTION FAILED: !attr1.PremoveCollisionCheck(attr2)
'>   in file "" on line 82
'> sys_exit: game quit
'> sys_exit(-1) called
'> Calling sys_call_atexit function
'> sys_call_atexit function called, num_atFuns = 0
'> calling exit with return code of -1
'> ValidPtr( (nil) ) failed
'> ValidPtr( (nil) ) failed
'> ASSERTION FAILED: ValidPtr(_oadData)
'>   in file "actor.hpi" on line 117
'> sys_exit: game quit
'> sys_exit(-1) called
'> sys_exit called recursively, retcode = -1, calling exit directly.


Yes, I definitely want to make games where the bad guy uses the environment to attack you (and you can later use the same objects on them).

Objects can be pushed around, but geometry cannot be altered dynamically: you can push a box around, but you cannot deform its geometry. You could, however, make an animation of a box getting deformed.

There are currently some chain-of-execution issues with pushing boxes around: if you push a box against an immovable wall, then push another box against the first box, then push on it, our physics system fails, because it thinks the first box will move out of your way, and it can't. What we need to do is add the ability for each object to track what other objects it is in contact with, so that when it is pushed it can push on any objects it would hit, and add those objects' mass to its own so that it pushes back on the player hard enough. BTW, there is nothing special about the player: any object can push any other, and their mass controls how much of the momentum is transferred to each object. There are also "anchored" objects, which cannot move under any circumstances.
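The mass-controlled momentum transfer described above can be sketched in 1-D as a perfectly inelastic push, with "anchored" objects treated as immovable. All names and the specific formula are illustrative assumptions, not World Foundry's actual physics code.

```cpp
// Hypothetical body: mass > 0 means it participates in collisions.
struct Body {
    float mass;
    float velocity;
    bool  anchored;  // anchored objects cannot move under any circumstances
};

// Perfectly inelastic push: after contact both bodies share one velocity,
// chosen so that total momentum (m1*v1 + m2*v2) is conserved. An anchored
// object (a wall) simply stops the pusher.
void ResolvePush(Body& pusher, Body& pushed) {
    if (pushed.anchored) {
        pusher.velocity = 0.0f;
        return;
    }
    float v = (pusher.mass * pusher.velocity + pushed.mass * pushed.velocity)
            / (pusher.mass + pushed.mass);
    pusher.velocity = v;
    pushed.velocity = v;
}
```

A heavy pusher barely slows down; a light one is nearly stopped by a heavy target, which matches the "mass controls how much momentum is transferred" behavior.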

'>> One of the attributes on an object is what geometry file to use, we
'>> don't convert the geometry from the level file. This allows the
'>> geometry to come from a different 3d package than the level.
'>
'> That's an interesting concept. It is key to making a playstation
'> game work, since re-use of geometry is the only way to fit into
'> memory (2mb main, 1mb video including frame buffers). (That 2mb of
'> main includes the code, stack, EVERYTHING).

Actually, most games do the same thing: you only make geometry for one grenade, then generate lots of them as the game runs. We just extended the idea to all geometry (separate what it looks like from how it behaves).

'> Also, how about stuff like water? Seems that it has become popular
'> to allow the hero/ine to jump between land and water. Can the
'> physics system somehow be changed from one "room" (above water) to
'> another room (below water) to accommodate swimming? How about the
'> subtle lighting changes which occur when underwater? Have you done
'> something like this with the engine before?

The physics engine is set up so that there are handlers for different situations: the air handler runs when you are not standing on anything and accelerates you towards the ground; the ground handler runs when you are standing on something and responds to the movement of the object you are standing on. We do not currently have a water handler, but it would not be hard to write.

It is also possible to get pretty close without modifying the engine. There is an object called an 'activation box', and one of its capabilities is that it can 'push' on any object within it. So water could be done by making an actbox the size of the water, with an up vector pushing at about the same rate as objects fall. This would make objects hover (they would behave like they do in the air).
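The actbox trick works because the upward push cancels the air handler's gravity each frame. A minimal sketch of that per-frame vertical update, assuming a made-up gravity constant and function names (nothing here is WF's real handler code):

```cpp
// Illustrative constants; the real engine's units and values will differ.
const float kGravity = -9.8f;

// One frame of vertical velocity update under the air handler, plus the
// optional upward push from an activation box covering the water volume.
float StepVerticalVelocity(float vy, float dt, bool inWaterActbox) {
    vy += kGravity * dt;       // air handler: accelerate toward the ground
    if (inWaterActbox)
        vy += -kGravity * dt;  // actbox pushes up at ~the fall rate,
                               // so the net change is zero and the
                               // object hovers
    return vy;
}
```

Inside the actbox the object keeps whatever velocity it had (it "swims" in place); outside, it falls normally.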

'> This isn't one of my immediate goals, but is a Tomb-Raider style
'> character possible, where she climbs up ladders, hangs off of
'> ledges, etc? I have no idea what sort of collision detection they
'> are using, but it seems like it has to be pretty accurate.

We did some things like that in Velocity, but it required custom movement handlers. We had swinging on poles, climbing up boxes, etc. For example, climbing a box was done by placing a handle on the character's hand; the movement handler would align the handle with the top of the box. Make an animation of the character climbing (with respect to the handle), and it worked fine.

The screen size can be overridden on the command line: wfrdebug -height=200 (the width will be automatically calculated). Possible height values are: 200, 240, 384, 400, 480, 600, 768, 864, 1024, 1200.

** OPENGL
Currently, we are not using GL correctly, in that we are only using it to draw 2D polygons and using our own 3D pipeline, which causes the lighting to look incorrect, since GL doesn't allow me to make a 2D polygon which is brighter than the texture map (I can only darken, not lighten).


Never mind, I re-did it using xdemos/offset as an example, and it is much better now. It is now double-buffered, runs much faster on mesa software, and works with the 3dfx (with the exception of textures).

Since World Foundry started on the psx, I just make one huge 1024x512 pixel texture and map portions of it onto my polys. This doesn't work on the 3dfx because it has a maximum texture size of 256x256. So next I need to re-do some of the texture handling. (Does anyone know if there is a way to tell OpenGL I have a large image with multiple textures in it?)

We have a tool (textile) which fits all of the textures used in a room into a single texture (256x512 pixels on the psx); this allows for quick loading of textures as the camera moves from room to room (we keep 3 rooms plus some permanent textures loaded at any given time).

I would hate to give this up for a system where the renderer manages the textures, where you just hope they fit into video memory and don't have any control over them. If this is the only way to do it under GL, I now understand why we need video cards with so much memory.

I want to load a single large image, then tell GL there are textures in that image, so that video memory does not get fragmented.
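The usual way to get sub-textures out of one big image under GL is a texture atlas: upload the large image once, then address each sub-texture by scaling its texel rectangle into the 0..1 texture-coordinate range. A minimal sketch of that coordinate math (the struct and function names are made up for illustration):

```cpp
// Normalized texture coordinates as OpenGL expects them (0..1 range).
struct UV { float u, v; };

// Convert a texel position inside a large atlas (e.g. 1024x512) into
// normalized coordinates. A quad using a 64x64 sub-texture would call
// this for each of its four corners.
UV AtlasUV(int texelX, int texelY, int atlasW, int atlasH) {
    UV uv;
    uv.u = (float)texelX / (float)atlasW;
    uv.v = (float)texelY / (float)atlasH;
    return uv;
}
```

One caveat with this approach: bilinear filtering can sample texels from the neighboring sub-texture at the edges, so atlases typically leave a border of padding around each sub-texture. On hardware with a 256x256 limit (like the Voodoo2), the atlas would have to be split into multiple 256x256 pages.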

Any suggestions or info would be greatly appreciated.

The Voodoo2 doesn't support textures larger than 256x256 (we are allocating one big 1024x768 texture), so until we re-do the texture handling, textures won't work.

'> As a matter of curiosity, does WF have a software rasterizer, for
'> instance? Did it need one for the PSX?

No, the psx has a 2d polygon engine, so WF does not currently have a software rasterizer (and with GL available, no need for one). I have written rasterizers in the past (way back, when working at Spectrum Holobyte, so only flat shaded). WF does have a complete 3D pipeline, however; the only thing it relies on external libraries for is the 2D rasterizer (it would not be hard to write one, we just haven't needed it). Hmm, maybe that would be a good idea for non-accelerated machines.

Visuals: Our lighting is currently room based (lights affect the room they are in, and don't affect other rooms), so you could make the water a separate room and light everything differently there. Overall our graphics engine doesn't compare to something like Quake; it implements everything a playstation can do, but nothing more.


'> 2. It appears that World Foundry can do first-person or third-person
'> perspective. Is this correct?

Yes; it doesn't even think in those terms. The camera is a separate object, and follows whatever object you specify (player, bad guy, bullet, etc.), so it is free to wander away from the player, or look at something else. We had some great cut-scene style behavior in Velocity: you would drop a grenade, run away, and when the grenade went off the camera would jump back to the explosion and watch it, then jump back to the player. Another example: we had a guided missile as a weapon; when fired, the camera would follow it (and you could steer the missile), and when the missile hit something, the camera would back up to get a good shot of the explosion, then glide back to the player. This was all done without touching the engine code, just a few scripts.
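The camera-as-ordinary-object idea amounts to holding a retargetable reference and easing toward it each frame. This is a generic sketch of that pattern under made-up names, not the engine's camera code:

```cpp
// Minimal 3-D point.
struct Vec3 { float x, y, z; };

struct Camera {
    const Vec3* target;  // whatever object we are following: player,
                         // bad guy, bullet, missile, explosion...

    // Retargeting is just swapping the pointer; scripts could call this
    // when a grenade goes off or a missile is fired.
    void Follow(const Vec3* newTarget) { target = newTarget; }

    // Per-frame glide: move a fraction t (0..1) of the way to the target,
    // which gives the smooth "glide back to the player" motion.
    Vec3 Step(Vec3 pos, float t) const {
        pos.x += (target->x - pos.x) * t;
        pos.y += (target->y - pos.y) * t;
        pos.z += (target->z - pos.z) * t;
        return pos;
    }
};
```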

'> 4. Are there any license restrictions on the games I produce with
'> World Foundry? Can I sell games either as shareware or commercially
'> without needing to pay you any license fees?

No fees of any sort (unless of course you WANT to send us money ;-) ). The code base is all GPL'ed, so you can do anything the GPL allows. This includes creating a commercial game with it (but any enhancements you make to the engine must be released under the GPL). You would NOT have to release any of your assets (levels, geometry, scripts, etc), which are what make a game a game.

Let me know if you think this is a problem, we are open to other licensing arrangements.
Topic revision: r1 - 07 Oct 2002, MrLin;