Everything posted by Almightygir
-
perfect. i'll begin trolling immediately.
-
That didn't link to anything =[
-
Question: Do you guys have a discord/slack channel that i can hang out in?
-
toolbag 3 supports animations via geometry caching, so while it won't natively run the animations, if you export the mesh + whatever test animation as fbx or alembic, you can view them in toolbag. that said, viewing animations is for seeing how a mesh deforms, not how a texture looks. if you want a solid PBR renderer to look at things in before exporting to engine, that's what it's for: it's specifically designed to be middleware for pre-vis and lookdev during the iteration stages. oh, and because it bakes textures (in realtime!!!!!!!!!!!) it's fucking awesome. anyway, all i'm saying is: updating a renderer is no trivial thing, and since a pre-vis tool already exists, it's a low priority (in my personal opinion, as a member of the public).
-
Because Marmoset Toolbag exists
-
haha awesome dude! now for the fun part: Spherical harmonics
-
hey sorry, been at uni all day: Xycaleth is correct.
-
Since you'd just multiply one with the other in the shader, it makes more sense to do it directly in the texture, to avoid an additional texture call or using a texture channel you could reserve for something else.
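As a rough sketch of the trade-off (placeholder names, not anyone's actual shader code):

    Texture2D mapA;
    Texture2D mapB;
    Texture2D combinedMap; // mapA * mapB, multiplied together offline in the texture
    SamplerState samp;

    // two fetches plus a multiply at runtime
    float3 SampleSeparate(float2 uv)
    {
        return mapA.Sample(samp, uv).rgb * mapB.Sample(samp, uv).rgb;
    }

    // one fetch; the multiply already happened in the texture,
    // and mapB's channel is freed up for something else
    float3 SampleCombined(float2 uv)
    {
        return combinedMap.Sample(samp, uv).rgb;
    }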
-
Also these:
https://www.marmoset.co/posts/physically-based-rendering-and-you-can-too/ <--- practical application
https://www.marmoset.co/posts/basic-theory-of-physically-based-rendering/ <--- education on PBR theory (very useful to artists, but can get techy)
https://www.marmoset.co/posts/pbr-texture-conversion/ <--- if you want to (try to) convert older assets to current gen, though more work may be required than what's in the doc.

So the Frostbite guys have a "special" type of AO map, where they make the occlusion angles highly acute, which causes much tighter shadows in the map. In practice, they're the same thing... as in, they both affect the renderer the same way. Why do they do it, though? Well, an AO map will usually have large, sweeping shadows (think of like... armpits, or between the legs and stuff). Often the lack of ambient light in these regions is enough to create odd-looking shading. So by using cavity occlusion instead, they're limiting the lack of ambient light specifically to areas which have really deep geometric variance.

Interestingly, Naughty Dog recently released a paper on Uncharted 4 which can help with this stuff too, by interpolating the baked AO details out at glancing angles. i wrote a blog post on this (have to keep a stupid fucking blog for my masters degree), if anyone wants to read it: https://leedevonaldma.wordpress.com/2017/02/19/keeping-things-unreal/

The math/pseudocode is (expanded into a fuller sketch at the end of this post):
float FresnelAO = lerp(1.0, BakedAO, saturate(dot(vertexNormalWorldSpace, ViewDir)));

As for nomenclature... it was a jarring switch industry-wide, and a lot of people rebelled against it. But it really did make sense to use new terminology (ie: Albedo vs Diffuse, Reflectivity vs Specular) when considering the subtle differences.

Sick! There's definitely something odd going on with the fresnel component though: it gets really, really dark at the edges here, when really it should get closer to pure reflection instead.
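To put that FresnelAO line in context, here's a rough sketch of how it might slot into a pixel shader. the names are mine, not Naughty Dog's, and the surrounding lighting is deliberately simplified:

    // fade the baked AO out at glancing angles (the Uncharted 4 trick)
    float3 ApplyFresnelAO(float3 vertexNormalWorldSpace, float3 viewDir,
                          float bakedAO, float3 directLighting,
                          float3 irradiance, float3 albedo)
    {
        // head-on (dot ~= 1) -> full baked AO; glancing (dot ~= 0) -> AO fades out
        float fresnelAO = lerp(1.0, bakedAO, saturate(dot(vertexNormalWorldSpace, viewDir)));

        // the AO only attenuates the ambient/indirect term, never the direct lights
        return directLighting + irradiance * albedo * fresnelAO;
    }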
-
you should rename the Specular RGB map to something else, Reflective or Microsurface would be better. Specular has an actual defined meaning in current industry usage. If you're using metal masks (which you are), then reflectivity = lerp(0.04, albedo.rgb, metalmask). if you're using a specular map, then reflectivity == specular.
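In shader terms that's just (a minimal sketch; 0.04 is the usual dielectric reflectance, and the names are mine):

    // metalness workflow: dielectrics get a flat ~4% reflectance,
    // metals take their reflectance colour straight from the albedo
    float3 GetReflectivity(float3 albedo, float metalMask)
    {
        return lerp(float3(0.04, 0.04, 0.04), albedo, metalMask);
    }
    // specular workflow: the specular map itself *is* the reflectivity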
-
Also, not sure of how your system is set up, but a common method of storing AO at the moment is to put it into the red channel of a packed texture map:
Red = AO
Green = Roughness
Blue = Metallic (not sure if this one applies, does your shader system use metalness or specular for reflective input?)
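A quick sketch of how a shader would unpack that layout (assuming the channel order above; names are mine):

    Texture2D packedMap; // R = AO, G = roughness, B = metallic
    SamplerState samp;

    void UnpackPackedMap(float2 uv, out float ao, out float roughness, out float metallic)
    {
        float3 packed = packedMap.Sample(samp, uv).rgb;
        ao        = packed.r;
        roughness = packed.g;
        metallic  = packed.b;
    }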
-
NO NO NO NO NO!!! AO is AMBIENT occlusion. it masks the AMBIENT contribution in the shader pipeline. by applying it directly to the albedo, you're basically sticking a finger up at the PBR system (which is specifically designed to give you consistent lighting) and saying "i want permanent shadows in this exact position, fuck where your lights are".
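To make the difference concrete, a stripped-down sketch (my own simplified lighting, not any particular engine's):

    float3 ShadePixel(float3 albedo, float bakedAO, float3 directLight, float3 ambientLight)
    {
        // WRONG: AO baked into the albedo darkens everything,
        // including direct light that should still reach the surface:
        // return (albedo * bakedAO) * (directLight + ambientLight);

        // RIGHT: AO only masks the ambient contribution,
        // so direct lights keep behaving consistently
        return albedo * (directLight + ambientLight * bakedAO);
    }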
-
Ah, okay, so it's just a bent normal map: something i (a character artist) give no hoots about
-
just curious, but wtf is "deluxe mapping"?
-
that's bad. we need both.
-
You're either working on mip chains or importance sampling, both of which are crucial to getting PBR IBL correct. good job

Edit: oops, that was posted yesterday i guess... either way, i'm right >;]

Manually setting the mip level is kinda what you have to do. it's kinda like... you importance sample across the hemisphere using a crazy mathematical equation to figure out where those samples land, and the mip level you sample is usually just the gloss input mapped across the 8 mip levels (the rougher the surface, the blurrier the mip).
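In shader terms that mip selection is roughly this (a sketch assuming an 8-mip prefiltered cubemap; names are mine):

    TextureCube envMap; // prefiltered cubemap, mips 0..7 (0 = mirror sharp, 7 = fully rough)
    SamplerState samp;

    float3 SampleIBLSpecular(float3 reflectionDir, float gloss)
    {
        // map the gloss input across the mip chain:
        // high gloss -> sharp low mips, low gloss -> blurry high mips
        float mip = (1.0 - gloss) * 7.0;
        return envMap.SampleLevel(samp, reflectionDir, mip).rgb;
    }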
-
my only problem with sticking to .xsi as a format is that softimage is now non-existent, and will be harder and harder to get hold of as various torrenters stop seeding it. i thought about this some more last night: you're much better off supporting any form of FBX > GLM/GL2 workflow. Unfortunately the support i can offer right now is limited, as i'm in the middle of a masters degree which is sucking up all of my time. However i finish in september, and i'm more than happy to jump in and help with tools development then, maybe even see if @@Xycaleth's program can be updated to include skeletal mesh outputs? If we can reach that stage, then the entire .xsi workflow will be obsolete.
-
So here's an idea that might be better... @@Xycaleth wrote an fbx > mdl converter, right? So why not modify that to export the new .GTB file containing the tangents and binormals found in the fbx, which can then be read by the importer? That way, you're no longer reliant on xNormal, and can instead use the fbx format for your low poly when baking and texturing in the now industry-standard substance tools, marmoset toolbag, or knald baker. And because the .GTB file will contain the tangents and binormals that the fbx had, you can be sure they are synced to the normal map bake as well. edit: this also breaks reliance on the now deprecated .xsi formats in favour of a more widely used current format.
-
xNormal loads FBX files the same way every other piece of software does: FBX stores vertex data as a struct, with multiple vectors as entries in that struct, one of which is tangents, another binormals. It just accesses each entry in the struct as needed. You absolutely should NOT store tangents or binormals in vertex colors, as those are often capped at half precision.
-
Just to make it absolutely clear to all: the high poly can be any filetype. It doesn't need to be .XSI. Bake came out good, Psyk0!
-
When asking "are we going to force a pipeline on the community?", it's prudent to consider that ever since they've been able to modify the game at all, some 15+ years ago, a pipeline has been forced on them. It's far better to have a forced, yet clearly defined, outlined, and documented pipeline than to just say "make it how you like".
-
Just want to quickly point out that normals, tangents, binormals etc. are all just vectors, and colours, in rendering terms, are also stored as vectors. Therefore you can visualize any channel by outputting that vector into a color channel.
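e.g. a throwaway debug pixel shader (my own example), remapping the vector's -1..1 range into the 0..1 colour range so negative components don't just clip to black:

    float4 DebugTangents(float3 tangent : TANGENT) : SV_Target
    {
        // [-1, 1] vector components -> [0, 1] colour components
        return float4(normalize(tangent) * 0.5 + 0.5, 1.0);
    }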
-
You should check that option if you're using stored mesh tangents and binormals (which is what you should be aiming for). Otherwise it will generate arbitrary tangents and binormals for the mesh and pass those through to the pixel shader. Checking the option tells the pixel shader to use the tangents and binormals stored per mesh vertex. xNormal doesn't export tangents; it exports a tangent space map, which was rendered using tangents either stored in the mesh or generated arbitrarily by xNormal. This is why it's important to have a synced workflow: if the baker uses arbitrary values when baking the tangent space map, the renderer will never look right.
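For what that option changes in practice, here's a sketch (my naming): the pixel shader builds its tangent basis from whatever tangents and binormals it's handed, so the result only matches the bake when those came from the mesh:

    // build the TBN basis from the mesh's stored vectors; if these were
    // generated arbitrarily instead, they won't match the baked map
    float3x3 BuildTBN(float3 normal, float3 tangent, float3 binormal)
    {
        return float3x3(normalize(tangent), normalize(binormal), normalize(normal));
    }

    float3 GetWorldNormal(float3 normalMapSample, float3x3 tbn)
    {
        // normalMapSample is the tangent-space texel, already remapped to [-1, 1]
        return normalize(mul(normalMapSample, tbn));
    }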
-
If storing tangents directly in the .GLM file is undesirable, then as long as they can be reliably stored in a secondary file (.GTB for example, Ghoul Tangent Binormal) and read by the renderer, i don't see a problem. @@Xycaleth, that seems spot on to me.
-
I think i'm confused... you keep referencing the DF2 mod; i thought that was a separate entity to this? Xycaleth seemed to make that assertion, anyway. As for me coming along and saying "it's not perfect": sorry bro, i'll let you live in your world of rainbows and butterflies. Adding a new field to the GLM format won't affect backward compatibility: the renderer will attempt to read tangents from files that don't have them, get nothing back, and calculate its own. Worst case scenario, a tangent calculation will need to be added to the renderer to generate tangents for meshes that don't contain them.