ZBrushCentral

Environmental Modeling

I think what Skycastle was stating was that bump and displacement maps look the same insofar as their grayscale values are concerned. Of course they look different at render time, as one actually uses the grayscale values to "displace" the mesh, while the other just does a nifty visual trick with the normals, i.e. bump. Normal mapping is like bump on steroids. It still won't displace, and thus won't take as long to render, but damn it looks hella nice. I'm not certain how the low-res to high-res transition occurs. I'm guessing it's something like a blend of the RGB values, mapping the normals of certain polys on the high-res mesh onto the normals of certain polys on the low-res mesh.
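
To make the "low-res to high-res transition" question concrete, here is a minimal sketch (my own illustration, not ZBrush's or PolyBump's actual code) of how tangent-space normal baking is commonly described: for each texel on the low-poly surface you look along the low-poly normal to find the high-poly surface, then store the high-poly normal expressed in the low-poly tangent frame. The function names and the sample_high_normal callback are hypothetical stand-ins.

```python
# Hedged sketch of the usual idea behind baking a tangent-space normal map.
import numpy as np

def encode_normal(n):
    """Map a unit normal from [-1, 1] to an RGB colour in [0, 1]."""
    return n * 0.5 + 0.5

def bake_texel(low_point, low_normal, tangent, bitangent, sample_high_normal):
    """Bake one texel.

    sample_high_normal(origin, direction) is a stand-in for a ray cast against
    the high-poly mesh; it should return the world-space normal at the hit.
    """
    high_n = sample_high_normal(low_point, low_normal)
    # Express the high-poly normal in the low-poly tangent frame (TBN basis).
    tbn = np.column_stack([tangent, bitangent, low_normal])
    n_tangent = tbn.T @ high_n
    n_tangent /= np.linalg.norm(n_tangent)
    return encode_normal(n_tangent)

# Toy usage: a high-poly normal tilted slightly away from the low-poly one.
if __name__ == "__main__":
    fake_hit = lambda origin, direction: (
        np.array([0.1, 0.0, 0.995]) / np.linalg.norm([0.1, 0.0, 0.995]))
    rgb = bake_texel(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                     np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]),
                     fake_hit)
    print(rgb)  # mostly blue, slightly reddish: the familiar normal-map look
```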

I think everyone here knows the difference between a normal map and a bump map. I'm wondering if a normal map provides better detail than a bump map, and if so, whether it would be of any use to apply one over a bump map for increased detail. Looking at the Maya rendering, there are some details that have not really come out (it looks a little blurry, in other words)… or is that because of the lighting in Maya?

Anyway, I got my code to download ZBrush 2 (finally :) ) so I can experiment for myself.

As far as I’m aware, Normal maps are a hybrid of Bump and Displacement maps.

Bump map (greyscale)… gives the effect of displacement, but it is only a "2D" effect that creates the impression from the camera angle. It cannot change the mesh, so the silhouette of the object won't change.

Displacement map (greyscale)… actually deforms the mesh at render time.

Normal map (multicoloured)… creates an effect similar to bump maps, except it also has information for changes along the Z axis of the mesh. Red = info for the X axis, Green = info for the Y axis, Blue = info for the Z axis.

In effect, perhaps this means that normal maps will eventually take over from displacement maps… once they're better supported by 3D programs.

This is all my impression, so anyone correct me if it’s wrong.
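
As a minimal sketch of the RGB = XYZ idea in the breakdown above (my own illustration, not any particular program's code), a tangent-space normal map stores a unit vector per pixel, with each component remapped from [-1, 1] into the 0–255 colour range:

```python
# Encode/decode between a unit normal (XYZ) and a normal-map colour (RGB).
import numpy as np

def normal_to_rgb(normal):
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)          # make sure it is a unit vector
    return np.round((n * 0.5 + 0.5) * 255).astype(int)

def rgb_to_normal(rgb):
    n = np.asarray(rgb, dtype=float) / 255.0 * 2.0 - 1.0
    return n / np.linalg.norm(n)

# A normal pointing straight out of the surface encodes as (128, 128, 255),
# which is why flat areas of a tangent-space normal map look light blue.
print(normal_to_rgb([0.0, 0.0, 1.0]))
print(rgb_to_normal([128, 128, 255]))
```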

Yeah, but normal maps do not alter the silhouette. In other words, you get all of the "details" within the low-poly model, but on the edges of the low-poly model you will still see the low-poly silhouette. For geometry like Dave posted, though, it wouldn't really matter.

This is the Z2 Normal Map displayed
in the Crytek PolyBump viewer.

Dave

Dave, have you changed something in the normal map setup? I tried with the Crytek tool but couldn't obtain the same result…

Another question: have you tried the Kaldera plugin from Mankua?

No, I did not do anything strange. I baked a tangent-space map out of Z2, then exported it from Photoshop as a .dds (I think). I had to hack my .mtl file for the .obj, and I also had to triangulate my model… I think that was it.

Dave

Hey Frouad B
The Mankua plugin is for Max. I downloaded the demo and installed it, but the demo will only let you render a normal map off one of their supplied models. You can't test it with a ZBrush model. :(

Upham :)

Is there a way to view the tangent maps in ZBrush, or do you have to quit out of ZBrush to view the tangent map applied in real time? It seems like a ZBrush material to do this would be exceptionally useful.

Thanks, Dave. I never tried the NVIDIA plugin (.dds format); I will.

UphRRam :P, yes, I already tried that plugin… and the result in 3ds Max seems better than ZBrush's, and the normal map seems different from a map produced by ZB (the colors, I mean).

If someone has more experience with that, I'm listening.
Thanks.

Awesome-looking work. Would it be possible for you to post a ZScript of it, or maybe just a quick process guide? I'm really interested in how you made the cracks so sharp, and the smashed concrete. I've tried to make something similar, but I can't seem to get the results I want.

My understanding of things…
Greyscale images used as bumps in Maya perturb the normals only in/out along the direction of the original normal, by the amount defined by the greyscale value.
RGB normal maps use RGB to represent XYZ, so the bump is not only in/out.
Displacements either move mesh points (LightWave???), or add enough extra mesh points to represent the details and then move them at render time, or move pixels at render time (sub-pixel displacements: mental ray, RenderMan, etc.).
Also, file nodes in Maya have the filtering set really high by default (1.0); I like to always set a texture map's filtering to 0.01 or 0.001.
At least that's what I know.
g
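
A hedged illustration of the greyscale vs. RGB point above (a generic bump-mapping scheme, not Maya's exact implementation): a greyscale bump map only stores a height per pixel, and a perturbed normal has to be derived from how that height changes between neighbouring pixels, whereas an RGB normal map stores the perturbed vector directly.

```python
# Derive per-pixel normals from a greyscale height (bump) map.
import numpy as np

def bump_to_normals(height, strength=1.0):
    """Turn a 2D greyscale height field into per-pixel tangent-space normals."""
    dhdx = np.gradient(height, axis=1) * strength
    dhdy = np.gradient(height, axis=0) * strength
    # The perturbed normal tilts against the height gradient; Z stays positive.
    n = np.dstack([-dhdx, -dhdy, np.ones_like(height)])
    return n / np.linalg.norm(n, axis=2, keepdims=True)

# A tiny 3x3 bump: a single bright pixel in the middle.
height = np.zeros((3, 3))
height[1, 1] = 1.0
normals = bump_to_normals(height)
print(normals[1, 0])  # left of the bump: the normal leans away from it
print(normals[1, 1])  # on top of the bump: (0, 0, 1), straight up
```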

Quick Q from a newbie:
How do you apply normal maps in Maya? Do you need a special plugin for that?
Thanks

Yeah, you do. You can go to www.drone.org to get Maya plugins. I think version six (Maya) will generate normal maps, but it still doesn't use them in its standard renderer.

Here is a test render of a normal map rendered on the same geo inside Maya. The normal map was rendered with the Turtle beta (a new renderer for Maya, illuminatelabs.com).

The normal map drives a per-pixel effect that bends the normal for each pixel based on the map. It gives some nice detail without the micro-poly displacement render times.

Dave
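
For anyone curious, here is a rough sketch of the per-pixel effect Dave describes, written as plain Python rather than a real shader (the function and variable names are my own, not Turtle's or Maya's): decode the map colour back into a vector, rotate it from tangent space into world space with the surface's tangent/bitangent/normal frame, and do the lighting against that bent normal instead of the smooth interpolated one.

```python
# Simple Lambert shading driven by a tangent-space normal map texel.
import numpy as np

def shade_pixel(map_rgb, tangent, bitangent, normal, light_dir, albedo=1.0):
    n_tangent = np.asarray(map_rgb, dtype=float) / 255.0 * 2.0 - 1.0
    # Rotate the decoded normal from tangent space into world space (TBN).
    tbn = np.column_stack([tangent, bitangent, normal])
    n_world = tbn @ n_tangent
    n_world /= np.linalg.norm(n_world)
    light = np.asarray(light_dir, dtype=float)
    light = light / np.linalg.norm(light)
    return albedo * max(np.dot(n_world, light), 0.0)

# Flat texel (128, 128, 255) vs. a tilted one, lit from an angle: the tilted
# texel catches a different amount of light even though the geometry is flat.
frame = (np.array([1.0, 0, 0]), np.array([0, 1.0, 0]), np.array([0, 0, 1.0]))
light = [0.5, 0.0, 1.0]
print(shade_pixel([128, 128, 255], *frame, light))
print(shade_pixel([200, 128, 220], *frame, light))
```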

Never heard about Turtle before. HOW COULD I HAVE MISSED THIS??!! Gonna give it a try ASAP! Thanks for the tip.

Has anyone tried the normal map generator that comes standard with Maya 6 yet?
(Lighting/Shading > Transfer Surface Information)

Hi Dave,

Just wondering what options you used for this regarding painting maps, and then the displacement problems on ~90-degree edges. You painted a large crater right on the corner edges, below the large area, and it seems to have worked fine. Whenever I use Projection Master, or sculpt in ZBrush, the edges (in my case three edges bevelled to subdivide nicely) intersect, and the displacement calculation comes out bad.

Thanks

Ben

That's a good point: details like that which "go in" to the geometry will intersect with the low poly, and you will have problems with the normal map it generates. Do you just size your cage right up with the push tool?

This is a great thread! Thanks for all the explanations, everyone. This stuff is actually making sense to a noob like me.

–roger

It's been a while since I caught up with this thread, but my workflow was:

Geometric shape in Maya (a cube, for instance)
Import to ZB and use Projection Master to push in along the edges with a rocky alpha
Pick up from PM (but turn on ONLY Deformation, NO Normalized or Fade)
Now you have the object with the PM result in Edit mode
Go to level 1 and calculate the disp map on this object (do NOT switch to the original mesh)
Export the level 1 mesh as above and use this as the new base mesh in Maya.

So I ended up replacing my models with the new ZB level 1 instead. You will notice ZB modifies your original level 1 surface when you make such big sculpt changes higher up; this is to accommodate the big difference.