I haven't, as I didn't realize there is a need. Currently RTPreview receives downsampled textures and it doesn't account for specularity. Making a complete path-tracer is the easiest problem to solve here, but the bigger issue is properly supplying all material data to it. Unity defines a "meta pass" system which standardizes albedo and emission, but it doesn't go beyond that, meaning that there are as many ways of encoding glossiness/roughness as there are different shaders. I can make a single set of PT-compatible standardized shaders for each pipeline and it'll work, but I don't know if it's convenient, especially if you want to build the same interactive scene with different shaders.

BTW, I tried to apply denoising in real-time on top of RTPreview, but it... made it not so real-time. DLSS and other denoisers used in games are much faster (but slightly less accurate) than OptiX/OIDN, but AFAIK you can't simply download the DLSS SDK; it has to be specifically discussed with NV.

I should clarify: we are not looking for a real-time solution. In ArchViz (& Automotive) most companies offer high-resolution rendered images for marketing purposes. Currently anyone doing real-time ArchViz either has to duplicate their scene in a rendering app (e.g. 3DS Max/V-Ray) & re-set up all their lighting & materials to produce a render, or offer a rasterized image captured in-engine, which doesn't really hit the quality threshold for offline images, even with high-quality baked lighting. Some companies simply don't offer marketing renders & just supply a real-time PC or VR application. Ideally we could simply capture high-quality path-traced images from our real-time setup, with all our lighting & materials intact & no additional setup.

We have this working for the most part, but the process is pretty tedious. We currently have 2 problems we're facing: we need to have our artists go through all of the lighting in the scene & match up the perceived real-time lighting to our baked lighting, and we'd like to automate the denoising step; currently we are taking the output images & denoising them with an aftermarket denoiser (Topaz AI). What we're after is to simply press a button & have all the Bakery lights convert to real-time equivalents, then have an image be rendered, denoised & saved to disk at whatever resolution we require.

We do have reasonably strong programmers / technical artists, and given enough time we could probably solve these problems ourselves. I'd wager most ArchViz studios are comprised of artists, though, & do not have the resources to approach this problem. As to the specularity issue you raised, how are Unity handling it with the default path-tracer? Have they introduced a new pass specifically for path-tracing? I think a dedicated shader would definitely defeat the purpose here, unfortunately.

I'd also like to add: if you market this well, this is not just for the ArchViz & Automotive industry. Every game company needs high-res, high-quality images for their promo materials. If they're already using Bakery, chances are they're targeting a reasonably high level of visual fidelity that would lend itself to path-traced rendering. If it could be used to render out videos for trailers, even better. They could even slap a "Rendered In Engine" tag in the bottom corner of the screen.
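The denoising step described in the thread — batch-denoising rendered output instead of hand-feeding frames to Topaz AI — is straightforward to script. Below is a minimal sketch, assuming rendered frames land in a folder as PNGs and that some command-line denoiser is available; the `denoiser` executable and its `-i`/`-o` flags are placeholders for whatever tool is actually used, not a real program's interface.

```python
"""Sketch of an automated post-render denoise pass.

Assumptions (mine, not from the thread): frames are written to a folder
as PNG files, and a CLI denoiser exists. ``denoiser`` is a hypothetical
binary name -- substitute the real tool and its real flags.
"""
import subprocess
from pathlib import Path

DENOISER = "denoiser"  # hypothetical CLI name, stands in for the real tool


def denoise_cmd(src: Path, dst: Path) -> list[str]:
    # Build the denoiser invocation for a single rendered frame.
    return [DENOISER, "-i", str(src), "-o", str(dst)]


def denoise_folder(render_dir: str, out_dir: str,
                   dry_run: bool = True) -> list[list[str]]:
    # Queue one denoise command per frame; execute them unless dry_run.
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    cmds = [denoise_cmd(p, out / p.name)
            for p in sorted(Path(render_dir).glob("*.png"))]
    if not dry_run:
        for cmd in cmds:
            subprocess.run(cmd, check=True)  # fail fast if the denoiser errors
    return cmds
```

Hooking something like this up to the engine's capture output would collapse the current manual step into the "press a button" workflow the poster describes; the remaining (harder) piece is the render and light-conversion side inside the engine itself.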