4 million polygons

I heard that 4 million polygons is enough to get great displacement using Projection Master.

But I can’t get a model to 4 million even when I’m working with a simple polymesh Cube3D (I can’t get past 2 million). :cry:
Do you hide certain parts and divide (and get triangles)?
How would I split up a mesh that is clearly very organic and meant to be one piece, such as a human head? :frowning:

How much RAM does your system have?

If you have 1 GB you will be able to hit up to 4 million polygons, but 2 million is optimal. If you have 2 GB, you should be able to reach twice those figures.

However, a Cube3D is a parametric object, and parametric objects cannot be divided as far as polymesh objects can. What you would do with such a model is convert it to a polymesh using Tool>MakePolymesh3D and then select the new object in the Tool palette.

Also, what is your Preferences>Mem>Compact Mem setting? It should be about 256 MB less than the amount of RAM that your system has. See the ZBrush 2 Performance Tips, found in the General section of the FAQ here at ZBC.
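For anyone who wants the arithmetic spelled out, here is a minimal sketch of those rules of thumb (the function names are made up purely for illustration; they are not ZBrush settings, and the numbers are just the guidelines from this post, not an official formula):

```python
# Rough sketch of the rules of thumb above -- not an official ZBrush formula.
# Guidelines from this thread: Compact Mem sits about 256 MB below system RAM,
# and 1 GB of RAM gives roughly 4 million polys max (2 million optimal),
# with those figures doubling at 2 GB.

def suggested_compact_mem(system_ram_mb):
    """Compact Mem roughly 256 MB less than the system's RAM."""
    return system_ram_mb - 256

def rough_poly_capacity(system_ram_mb):
    """Very rough max/optimal polygon counts (in millions), scaled from the 1 GB case."""
    scale = system_ram_mb / 1024.0
    return {"max_millions": 4 * scale, "optimal_millions": 2 * scale}

if __name__ == "__main__":
    for ram in (1024, 2048):
        print(ram, "MB RAM ->", suggested_compact_mem(ram), "Compact Mem,",
              rough_poly_capacity(ram))
```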

I have messed around with the memory settings before, my cube was a polymesh, and my system has 1 GB of RAM.

But I’m going to make sure I look at those tips before asking for help on this again. Thank you Aurick. :slight_smile:

I would like to know more about splitting up the mesh, which was my other question.

Try doing a search for posts by me using the word “split”. I’ve covered the topic quite a few times already.

So when we talk about polys, are you referring to the ZBrush polygon count that you get by moving the cursor over the currently drawn tool in the Tool menu? Is there another way to get your polygon count? (I know you can inside ZMapper, but that is a bit of a hassle just to get the correct tri count.)

Since ZBrush’s polygon count is in quads and not tris, it can be a little confusing sometimes. Let’s say your ZBrush polygon count is at 2 million; technically that is 4 million tris. So if you went to a program like Max or Maya and tried to render it out, you would have to hit 4 million tris at render time (if you’re using a displacement map to re-create your ZBrush mesh). Some computers and some renderers can have trouble with polygon counts above that, even at render time. Once you factor in all the lights, textures, and shadows, even with 2 GB of RAM it can bring a system to its knees.

I would think ZBrush would want to state the tri count and not the quad count, since the tri count is double, thus giving the appearance of a much larger model.

Nobody has an answer for this? :smiley:

I was talking about the number you get when dragging your mouse over the current tool.

As far as tris and quads go, I have never used tris; quads seem to be the standard now. I wasn’t around when tris were used, and I thought that tris were an earlier form of modeling that now makes up quads.

But according to Aurick, if I recall correctly, 4 million quad polygons are necessary for clean displacement in ZBrush. You can only displace the points that are there; with high poly counts there are more points, and more detail is allowed.

Any links to tutorials or concepts on splitting up models for more detail in a model would be greatly helpful.
Thank you, Aurick and --E--, for responding.

I guess I didn’t explain my point clearly; I apologize. Yes, of course you want to keep your model in quads when modeling, and for nice subdivision and displacement that is a given. What I am saying is that almost every other application, when it renders your object and gives you your “poly count,” gives it to you correctly: in tris, not quads.

Create a simple 6-sided box in Max or Maya and look at the poly count when rendered: it’s 12, not 6. That is because it is counting every tri. In ZBrush a 6-sided box’s poly count is 6, because it’s counting quads, not tris. (This is not the norm for most rendering engines and applications that deal with polys, game engines in particular.)

It’s the same situation if you imported your high-res mesh directly into Max from ZBrush. Let’s say your mesh’s poly count in ZBrush is 4 million. Once imported into Max (if possible :)), the count would automatically be 8 million, because upon import of an OBJ, Max converts the mesh to an editable mesh, which counts your tris automatically.

But the main point is that if you rendered a 6-sided box in Max, the rendered poly count is 12 tris. This basically means that your rendered poly count in Max needs to be double that of ZBrush in order to get the same level of detail.

So if you had a 4 million poly mesh in ZBrush (which counts in quads) and you wanted to replicate that same mesh in Max, it would have to be 8 million tris at render time. This is because Max counts in tris and not quads at render time. If you only set it to render 4 million tris, that is technically only 2 million quads, and thus half the resolution required to match ZBrush.

Sorry for the confusion, mate. :smiley:
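To make that bookkeeping concrete, here is a small sketch of the quad-to-tri conversion described above (the function names are hypothetical, purely for illustration):

```python
# Minimal sketch of the quad/tri bookkeeping described above: ZBrush reports quads,
# while Max/Maya report tris at render time, and every quad splits into two tris.

def quads_to_tris(quad_count):
    return quad_count * 2

def tris_to_quads(tri_count):
    return tri_count // 2

if __name__ == "__main__":
    print(quads_to_tris(6))           # a 6-sided box: 6 quads in ZBrush, 12 tris when rendered
    print(quads_to_tris(4_000_000))   # a 4-million-quad ZBrush mesh needs 8 million tris in Max
    print(tris_to_quads(4_000_000))   # rendering only 4 million tris is just 2 million quads
```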

Even splitting a model across places that are on full display and not tucked away in seams and wrinkles should not prove hugely problematic, if you model up your whole mesh to its maximum poly limit and then give each separated-off part a two-poly-deep buffer strip of polys at the cut edge. You should then be able to continue up the levels with your new tools and combine all the generated normal/displacement maps seamlessly, so long as you avoid detailing near the parts’ edges.

I seem to remember you saying that your purpose is to render the model in ZBrush, in which case you won’t be generating normal maps but just compositing the parts together on the canvas, won’t you? In that case you’ll just need to paint transparency-map alphas for each part to hide the duplicated buffer poly rims.

Best of luck!

R

Thank you, Rory_L, I’ve almost got it. :slight_smile:

Hitting 8 million polys with 1.5 GB of DDR RAM, no problem.

I did this only by shutting down other apps, turning off my memory-optimizing program (RamBooster), and setting Compact Mem to 4096.

Model still handles well with good speed.

Comp stats: P4 2.8 @ 3.4 GHz, 1.5 GB DDR500 @ 240 MHz

Hi beeek,

With 1.5 GB of RAM you should NEVER set Compact Mem to higher than 1536. A setting of 1200 is better so that enough RAM is left over for your OS as well.

Compact Mem tells ZBrush at what point it should stop writing to RAM and start writing to disk. When you set the value higher than the available RAM in your system, you have set a value that ZBrush cannot reach. Because it cannot reach that value, it never writes to disk, even when it really needs to.

It’s great that you’re able to hit 8 million polygons and still get good performance, but you have to admit that you’re taking some extraordinary measures to do this. :wink: That’s why we state that MaxPolysPerMesh should never be set to more than double the default value (which is determined by ZBrush based on your available RAM). Depending on the user and the computer, you MAY be able to go higher than that, but doing so is always a risk. Performance will almost always suffer, and you also risk getting into a situation where the computer can’t save your work or just plain crashes.

Please see the FAQ>General section here at ZBC. The Performance Tips entry will help you get your settings under control so that ZBrush can perform better. (And believe me, setting Compact Mem to 4096 is hurting your performance; not helping it.)
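Here is a minimal sketch of those two guidelines written out as checks (the helper names are hypothetical; the real settings are the Compact Mem and MaxPolysPerMesh sliders discussed in this thread):

```python
# Sketch of the guidelines in this post, expressed as simple checks -- not ZBrush code.

def compact_mem_ok(compact_mem_mb, system_ram_mb):
    """Compact Mem must stay below installed RAM (with headroom for the OS);
    otherwise ZBrush can never reach the threshold and never writes to disk."""
    return compact_mem_mb <= system_ram_mb - 256

def max_polys_ok(max_polys_per_mesh, default_max_polys):
    """MaxPolysPerMesh should be no more than double the default that ZBrush
    picks based on the available RAM."""
    return max_polys_per_mesh <= 2 * default_max_polys

if __name__ == "__main__":
    print(compact_mem_ok(4096, 1536))   # False: the 4096 setting used above with 1.5 GB
    print(compact_mem_ok(1200, 1536))   # True: the recommended setting for 1.5 GB
```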

Aurick:

Thanks for the suggestion; the program is considerably more responsive and compacts more quickly and less frequently with Compact Mem set to about 1 GB (the amount actually available to ZBrush on my system).

beeek :+1:

I have 3 GB and a 2.2 GHz machine, and I can’t work with more than 2.2 million polys. I’ve adjusted the Max Polys Per Mesh and memory usage settings to no avail. Is it possible that if I were to take a gig out of my machine (so that I have 2 GB), I would be able to increase the number of polys I can work with? This doesn’t seem correct, however. Thanks to anyone who can help.

Even though your system has 3 GB of RAM, you need to set Compact Mem no higher than 2048. This is because Windows 2000/XP do not make more than 2 GB available to any one program.

If you continue to have difficulty, you may well find it necessary to remove one GB. I have seen cases where Windows got flaky after passing 2 GB of RAM.
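Putting the two limits together, here is a rough sketch of the effective ceiling (assuming the 2 GB per-process limit of 32-bit Windows mentioned above; the helper name is hypothetical, and the exact numbers in the thread vary slightly, so treat this as a rule of thumb only):

```python
# Sketch of the effective Compact Mem ceiling, assuming the 2 GB per-process limit
# of 32-bit Windows 2000/XP described above. Aurick's exact figures differ a little
# (2048 for a 3 GB system, 1200-1536 for 1.5 GB), so this is only a rough guide.

WINDOWS_PER_PROCESS_LIMIT_MB = 2048
OS_HEADROOM_MB = 256  # leave some RAM for the operating system

def effective_compact_mem_cap(system_ram_mb):
    usable = min(system_ram_mb, WINDOWS_PER_PROCESS_LIMIT_MB)
    return usable - OS_HEADROOM_MB

if __name__ == "__main__":
    print(effective_compact_mem_cap(3072))  # 1792: RAM beyond 2 GB does not raise the cap
    print(effective_compact_mem_cap(1536))  # 1280: close to the 1200 suggested earlier
```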