Btw Zircon, you say "3d modeling apps use hardware mode." The only way to directly access the hardware under the Windows environment without installing "fake drivers" onto the system is with OpenGL or DirectX. DirectX handles transformation, lighting, and rasterization: the program tells DirectX what to do, and DirectX does it; the program itself does not. (Yes, I've done a little DirectX programming before.)
So games and 3D modeling apps alike are using much of the same code, all of which is run by the processor.
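To make the three stages named above concrete, here is a toy, self-contained sketch of what "transformation, lighting, and rasterization" each mean. This is not real DirectX code; every type and function name below is invented purely for illustration.

```cpp
#include <cassert>

struct Vec3 { float x, y, z; };

// Stage 1: transformation -- here just a world-space translation of a vertex.
Vec3 transformVertex(Vec3 v, Vec3 offset) {
    return { v.x + offset.x, v.y + offset.y, v.z + offset.z };
}

// Stage 2: lighting -- a Lambert (N dot L) term against a directional light.
float lambert(Vec3 n, Vec3 lightDir) {
    float d = n.x * lightDir.x + n.y * lightDir.y + n.z * lightDir.z;
    return d > 0.0f ? d : 0.0f;
}

// Edge function: positive when point (px, py) is on the left of edge a->b.
float edge(float ax, float ay, float bx, float by, float px, float py) {
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
}

// Stage 3: rasterization -- count pixels whose centers fall inside a
// counter-clockwise screen-space triangle, using edge-function tests.
int rasterizeTriangle(float x0, float y0, float x1, float y1,
                      float x2, float y2, int w, int h) {
    int covered = 0;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float px = x + 0.5f, py = y + 0.5f;  // sample at the pixel center
            if (edge(x0, y0, x1, y1, px, py) > 0 &&
                edge(x1, y1, x2, y2, px, py) > 0 &&
                edge(x2, y2, x0, y0, px, py) > 0)
                ++covered;
        }
    return covered;
}
```

The point of the sketch: whether these stages run on the CPU (as argued above) or on dedicated hardware, the application only describes the scene; the pipeline does this per-vertex and per-pixel work.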
*sigh* OK, let me answer that with a few questions.
1: If that is the case, why not use the graphics card? It should give a tremendous boost.
2: How come AMD processors perform badly in 3D modelling applications even when they render a scene without raytracing options (in other words, where hyperthreading plays no part)?
(The normal scanline renderer doesn't have hyperthreading support.)
3: Why would OpenGL and DirectX be the only access points (communicators)? What about Mesa, QuickDraw 3D, Reality Lab, BRender and RenderWare? They may be old, but they are independent and have no connection with either OpenGL or DirectX (Mesa being an OpenGL clone). Is it an impossibility that Discreet developed their own API in order to "talk" with the hardware? And isn't that one of the reasons why 3ds Max can switch between the Mac and PC? Or does the Mac use DirectX (I think not)? That only leaves us with OpenGL, which I doubt is "sophisticated" enough.
4: If DirectX can be explained in three simple steps (transformation, lighting, and rasterization), is there no other difference between DirectX 9 and DirectX 1 other than optimisations?
5: "Why the heck did you do the initial renderer development as a Render Effect?
In 3DStudio MAX, a plugin renderer is required to do much more than just render the scene. A plugin renderer is responsible for enumerating the entire scene, sub-system sequencing, bitmap writing, gbuffer management, etc, etc. It must force each object to generate a valid mesh, generate all the vertex normals including smoothing group processing, generate all the shadowmaps, create a complete and valid g-buffer, trigger the set up of all auto-reflect style maps, etc. While this is certainly doable, and the prototype Max 1.2 ray trace texture actually did much of this, it is an incredible amount of tedious (and potentially buggy) code to deal with when one is trying to proof of concept a renderer. As a developer, it can be very difficult to tell if it is the scene enumeration, or actual rendering that is broken when a problem arises. In many ways, this huge speedbump in development is what has kept us, and likely many others, from experimenting with actual renderer code in Max.
Render Effects, on the other hand, is the perfect place to hang a prototype renderer. Why? A Render Effect is a very simple plugin to implement, and when its single method Apply() is called by Max, it gets handed three things: a complete and valid enumerated scene, a bitmap, and a handle to the render progress dialog. That's all you need. The enumerated scene is in the form of a single C++ object called the RenderGlobalContext, and contains the entire scene, with prepped shadow maps, meshes, etc., ready to go, and the bitmap contains a valid g-buffer, handily prerendered by the Default Scanline Renderer. The bitmap itself is the VFB. In other words, a developer can focus totally on the task of generating an image, which is the real purpose behind writing a renderer. The rendering technology can be incrementally implemented, with much less concern about compatibility.
The Render Effect API is much closer to the way Renderer plugins should have been implemented, with the scene enumerator being a separate plugin altogether. Had this been the case, I suspect there probably would have been many more renderer plugins developed over the years. If anyone out there wants to experiment with writing a renderer for MAX, steer clear of Renderer plugins during your prototyping phase, and head for Render Effects. You will be far less frustrated.
Also, for us (Scott and Steve), working this way provided a convenient place to delineate our coding responsibilities for these early stages of development. One developer can focus on the core rendering/rayserver tech, and the other can focus on the scene enumeration tech, and we don't need to sync code nearly as much.
Brazil does not require DirectX and is not supported under Windows 98."
Is SplutterFish (Brazil's creator) lying?
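For what it's worth, the Render Effect shape that quote describes looks roughly like this. The real Max SDK headers are not reproduced here: RenderGlobalContext, Bitmap, ProgressDialog and the Apply() signature below are simplified stand-ins invented for this sketch, not the actual SDK types.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

struct RenderGlobalContext { int objectCount = 0; };  // stand-in: the fully enumerated scene
struct ProgressDialog {                               // stand-in: render progress handle
    int lastPct = 0;
    void update(int pct) { lastPct = pct; }
};
struct Bitmap {                                       // stand-in: the VFB handed to the plugin
    int width, height;
    std::vector<std::uint8_t> pixels;                 // grayscale, for brevity
    Bitmap(int w, int h) : width(w), height(h), pixels(w * h, 0) {}
};

// The prototype renderer lives entirely inside Apply(): no scene enumeration,
// no shadow-map or g-buffer management -- per the quote, Max has already done
// all of that before the call. The plugin just overwrites the pixels.
struct PrototypeRenderEffect {
    void Apply(RenderGlobalContext&, Bitmap& bm, ProgressDialog& progress) {
        for (int y = 0; y < bm.height; ++y) {
            for (int x = 0; x < bm.width; ++x)
                bm.pixels[y * bm.width + x] =
                    static_cast<std::uint8_t>((x ^ y) & 0xFF);  // toy "rendered" pattern
            progress.update((y + 1) * 100 / bm.height);
        }
    }
};
```

A full Renderer plugin, by contrast, would have to build the scene, shadow maps and g-buffer itself before any pixel could be written, which is exactly the development speedbump the quote complains about.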