marble render test

Inspired by the great marble renders and shader hacks by Lee Griggs, I decided to recreate some of the tricks with different render engines as a little render comparison.

The basic idea: use a glass shader for the outer shell and an inner sphere with a textured volume to fake depth. This way you save a lot of work compared with actually modeling the inner part of a marble. Spectral renderers don't need this kind of trickery; they can render the inner part as a real glass medium (using textures).
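For anyone who wants to try the trick, here is a minimal sketch of that two-sphere setup as a Blender Python script for Cycles. The node names are Blender's built-in shaders; the noise texture is just a placeholder, not the exact texture I used:

```python
import bpy

# Outer shell: glass sphere
bpy.ops.mesh.primitive_uv_sphere_add(radius=1.0)
outer = bpy.context.object
glass_mat = bpy.data.materials.new("MarbleShell")
glass_mat.use_nodes = True
nodes, links = glass_mat.node_tree.nodes, glass_mat.node_tree.links
nodes.clear()
out = nodes.new("ShaderNodeOutputMaterial")
glass = nodes.new("ShaderNodeBsdfGlass")
glass.inputs["IOR"].default_value = 1.5
links.new(glass.outputs["BSDF"], out.inputs["Surface"])
outer.data.materials.append(glass_mat)

# Inner sphere: textured volume to fake the marble's interior.
# Only the Volume socket is connected, so the mesh renders as a volume.
bpy.ops.mesh.primitive_uv_sphere_add(radius=0.85)
inner = bpy.context.object
vol_mat = bpy.data.materials.new("MarbleCore")
vol_mat.use_nodes = True
nodes, links = vol_mat.node_tree.nodes, vol_mat.node_tree.links
nodes.clear()
out = nodes.new("ShaderNodeOutputMaterial")
vol = nodes.new("ShaderNodeVolumePrincipled")
noise = nodes.new("ShaderNodeTexNoise")        # placeholder texture
links.new(noise.outputs["Fac"], vol.inputs["Density"])
links.new(vol.outputs["Volume"], out.inputs["Volume"])
inner.data.materials.append(vol_mat)
```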

I used Cycles (Blender), RenderMan, Arnold, Octane and Indigo Renderer. I also tried to create a marble in Redshift, but I couldn't make it work with a single texture and two spheres. In Redshift you need to actually model the marble to get a realistic rendering.

The spectral render engines were the fastest by far. That's because with the spectral renderers I used a medium instead of a volume for the interior, which saves a lot of render time.

Here is a quick test with a single glass object in Indigo and Cycles:

These are glass spheres with regular solid-textured spheres inside:

RenderMan spline rendering

I've tested RenderMan with spline rendering. It's much faster compared with baked geometry; the spline primitives are very good in RenderMan.

I stopped the test with Mantra after 10 minutes; it is super slow even with optimized render settings.


The RenderMan images rendered for 3:20 on a 6-core Xeon 2.7 GHz with the PxrPathTracer, at 4K resolution.

geometry:

spline primitives:

Rendering splines directly saves a lot of disk space and meshing time. Interactive rendering was super responsive.

Octane 2019.2 rendering is very fast; I used the geometry to render it. Two minutes, and as expected it's a lot faster than the CPU renderings. The spline rendering in Octane doesn't feature custom widths yet.

Octane gives physically correct shading out of the box. In RenderMan the shading is extremely flexible, including NPR renderings, which is hard to get in Octane. The IPR responsiveness was a little slow during shader tweaks.

light simulation

I made a simple scene to test the physics of light. For proper light calculation, I used the spectral renderers Indigo and Octane.

Indigo has multiple engines: a standard spectral path tracer on CPU or GPU, and bidirectional path tracing with MLT sampling (Metropolis Light Transport). Octane has only the default spectral path tracer on GPU, but it includes an MLT sampling method. I've also added RenderMan 23 to the test with its unified rendering integrator, which supports bidirectional path tracing, manifold caustics and path guiding on the CPU.

Other render engines like Arnold or Cycles with regular path tracing would be impractical for complex light calculation tasks.

The base scene is a sphere with a squashed sphere underneath, inside a volume box (uniform VDB).

The following image is the result of Indigo Renderer with bidirectional path tracing and MLT. It was by far the fastest rendering.

About Render Engines part 1

This is a quick overview of current render engines for Houdini, and in general, with a focus on motion graphics and VFX usage.

There are different render engines out there; each one is unique and uses different methods to solve a problem. I am looking into Arnold, RenderMan, V-Ray, Octane and Redshift. For comparison I added the Indigo Renderer engine.

There are different ways to render a scene, each with benefits and shortcomings. Let's start with the most common one.

image by Glare Technology

Pathtracing (PT)

To be precise, backward path tracing. In backward ray tracing, an eye ray is created at the eye; it passes through the view plane and on into the world. The first object the eye ray hits is the object that will be visible from that point of the view plane. After the ray tracer allows that ray to bounce around, it figures out the coloring and shading of that point on the view plane and displays it on the corresponding pixel of the screen. That's the classical way, which all of the render engines use as standard.
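As a rough illustration of that loop, here is a sketch in plain Python with hypothetical scene and BSDF interfaces (not any particular engine's API):

```python
import numpy as np

def trace_eye_ray(scene, ray, rng, max_bounces=8):
    """Backward path tracing sketch: follow one eye ray into the scene
    and accumulate the light it picks up along its bounces.
    `scene`, `ray` and the BSDF objects are hypothetical interfaces."""
    radiance = np.zeros(3)
    throughput = np.ones(3)
    for bounce in range(max_bounces):
        hit = scene.intersect(ray)                      # closest surface hit
        if hit is None:                                 # ray escaped the scene
            return radiance + throughput * scene.background(ray)
        radiance += throughput * hit.emission           # light emitted at the hit
        # sample a new outgoing direction from the surface's BSDF
        direction, bsdf_value, pdf = hit.bsdf.sample(ray, hit, rng)
        if pdf <= 0.0:
            break
        throughput *= bsdf_value / pdf
        ray = scene.spawn_ray(hit.position, direction)  # continue the walk
        if bounce > 3 and rng.random() > 0.8:           # Russian roulette termination
            break
    return radiance
```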

Metropolis light transport (MLT)

This procedure has the advantage, relative to bidirectional path tracing, that once a path has been found from light to eye, the algorithm can then explore nearby paths; thus difficult-to-find light paths can be explored more thoroughly with the same number of simulated photons. Metropolis light transport is an unbiased method that, in some cases (but not always), converges to a solution of the rendering equation faster than other unbiased algorithms such as path tracing or bidirectional path tracing. Metropolis is often used in bidirectional mode (BDMLT).
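The core of the Metropolis idea is just an accept/reject step on path mutations. A toy sketch, with a hypothetical contribution function `f` and mutation strategy `mutate`:

```python
def metropolis_sample_paths(f, mutate, initial_path, n_steps, rng):
    """Metropolis sketch: once one valid light path is found, explore
    nearby mutated paths and keep them in proportion to their contribution.
    `f` returns a path's scalar image contribution, `mutate` proposes a
    small perturbation of a path; both are hypothetical stand-ins."""
    path = initial_path
    fx = f(path)
    samples = []
    for _ in range(n_steps):
        candidate = mutate(path, rng)
        fy = f(candidate)
        # accept the mutation with probability min(1, f(y) / f(x))
        if fx == 0.0 or rng.random() < min(1.0, fy / fx):
            path, fx = candidate, fy
        samples.append(path)   # the current path is recorded either way
    return samples
```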

Path Guiding

A mix between path tracing and MLT: an unbiased technique for intelligent light-path construction in path-tracing algorithms. Indirect guiding improves indirect lighting by sampling from the better-lit or more important areas of the scene. The goal is to allow path-tracing algorithms to iteratively "learn" how to construct high-energy light paths.

link to latest Siggraph paper
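A toy version of that "learning" idea, using a simple per-cell directional histogram (real implementations use adaptive structures like SD-trees; this only shows the principle):

```python
import numpy as np

N_BINS = 64   # number of direction bins per spatial cell (an arbitrary choice)

def record_path(grid, cell, direction_bin, energy):
    """Learning step: remember how much energy arrived from each direction
    bin at this spatial cell. `grid` is just a dict {cell: histogram}."""
    hist = grid.setdefault(cell, np.ones(N_BINS))
    hist[direction_bin] += energy

def sample_guided_direction(grid, cell, rng):
    """Sampling step: pick a direction bin proportionally to the energy seen
    so far, so later paths favour the directions that actually carried light.
    Returning the pdf keeps the estimator unbiased (divide by it later)."""
    hist = grid.get(cell, np.ones(N_BINS))
    pdf = hist / hist.sum()
    chosen_bin = rng.choice(N_BINS, p=pdf)
    return chosen_bin, pdf[chosen_bin]
```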

Bidirectional Pathtracing (BDPT)

Regular backward path tracing has a hard time in indoor scenes with small light sources, because it takes lots of rays and bounces to find a tiny light in a room, just to see if an object is lit by that light.

With bidirectional path tracing, rays are fired from both the camera and the light sources. They are then joined together to create many complete light paths.
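Roughly, the connection step looks like this (hypothetical vertex/scene interfaces, and it leaves out the multiple-importance-sampling weights a real BDPT needs):

```python
import numpy as np

def connect_subpaths(scene, eye_vertices, light_vertices):
    """Bidirectional sketch: try to join every eye-subpath vertex with every
    light-subpath vertex through a shadow ray, turning two half paths into
    many complete light paths."""
    total = np.zeros(3)
    for ev in eye_vertices:
        for lv in light_vertices:
            if not scene.visible(ev.position, lv.position):
                continue   # connection blocked, contributes nothing
            # throughput carried along each half path, times the BSDFs at the
            # joint and the geometry term between the two vertices
            g = scene.geometry_term(ev, lv)
            total += (ev.throughput * lv.throughput
                      * ev.bsdf_eval(lv) * lv.bsdf_eval(ev) * g)
    return total
```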

Spectral rendering

image by Silverwing

Unlike most renderers, which work with RGB colours, spectral renderers use spectral colour throughout, from the physically based sky model to the reflective and refractive properties of materials. The material models are completely based on the laws of physics.
This makes it possible to render transparent materials like glass and water with the highest degree of realism.
Spectral renderers are also pretty good at simulating different media and atmospheric effects, like under water or Earth's air atmosphere.
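A small example of why wavelength matters for glass: with Cauchy's equation the index of refraction changes per wavelength, which is where dispersion comes from. The coefficients below are approximate values often quoted for BK7 glass, used here only as placeholders:

```python
def refractive_index(wavelength_nm, a=1.5046, b=0.00420):
    """Cauchy's equation n(lambda) = A + B / lambda^2, with lambda in
    micrometres. A and B are approximate BK7-like placeholder values."""
    lam_um = wavelength_nm / 1000.0
    return a + b / (lam_um ** 2)

# A spectral renderer traces each wavelength with its own IOR, which is
# what produces dispersion (rainbow fringes) in glass and water.
for lam in (450, 550, 650):   # blue, green, red samples in nm
    print(lam, "nm ->", round(refractive_index(lam), 4))
```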

Biased Rendering

What a biased render engine actually does is pre-compute a lot of information before sending out rays from the camera. In simpler words, it uses optimization algorithms to greatly speed up the render time, but in doing so it is not strictly modeling the physics of light; it is giving an approximation.
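A classic example of such pre-computation (my example, the post doesn't name a specific technique) is irradiance caching; a toy version of the lookup looks like this:

```python
import numpy as np

def cached_irradiance(cache, point, compute_irradiance, max_dist=0.5):
    """Toy irradiance-cache lookup: reuse a nearby pre-computed value when
    one exists, otherwise compute it once and store it. Real caches blend
    several records and account for surface normals; this is only the idea."""
    for cached_point, value in cache:
        if np.linalg.norm(point - cached_point) < max_dist:
            return value                      # approximation: reuse a neighbour
    value = compute_irradiance(point)         # expensive, exact computation
    cache.append((point.copy(), value))
    return value
```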

Here is an example of what spectral rendering is able to do:

Indigo Renderer planet-scale atmospheric simulation

Unlike other rendering systems which rely on so-called practical models based on approximations, Indigo’s sun and sky system is derived directly from physical principles. Using Rayleigh/Mie scattering and data sourced from NASA, Indigo’s atmospheric simulation is highly accurate. It’s stored with full spectral information, and allows fast rendering and real-time changes of sun position.
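The key physical fact behind that sky model is that Rayleigh scattering falls off with the fourth power of the wavelength, which a spectral renderer can evaluate per wavelength directly. A tiny sketch (the reference strength is arbitrary):

```python
def rayleigh_scattering(wavelength_nm, beta_ref=1.0, ref_nm=550.0):
    """Rayleigh scattering scales with 1 / lambda^4, which is why a spectral
    sky model gets the blue sky and red sunsets 'for free'. beta_ref is an
    arbitrary scattering strength at the reference wavelength."""
    return beta_ref * (ref_nm / wavelength_nm) ** 4

for lam in (450, 550, 650):   # blue scatters roughly 4x more than red
    print(lam, "nm ->", round(rayleigh_scattering(lam), 2))
```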

Some examples of atmosphere simulations by Indigo forum user Yonosoy:

image by Yonosoy.
image by Yonosoy.
image by Yonosoy.
image by Yonosoy.
image by Yonosoy. Even a complete planet atmosphere simulation is possible.

maya sprites render test in renderman

It's a simple render test of Maya sprites in Arnold and RenderMan Studio. It uses a simple noise pattern on a 2D ramp with 2D ramp displacement. One simple noise plugged into a ramp can give really complex shading behaviour. The extreme displacement comes from a simple ramp, which gives the smoke more depth.

maya sprite test from Heribert Raab on Vimeo.
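To show why one noise plugged into a ramp gives so much variation, here is a toy remap of a noise value through a colour ramp, outside Maya (the ramp stops are made up):

```python
import numpy as np

def ramp(t, stops):
    """Evaluate a simple 1D colour ramp: `stops` is a sorted list of
    (position, colour) pairs, `t` is the noise value in [0, 1]."""
    t = float(np.clip(t, 0.0, 1.0))
    for (p0, c0), (p1, c1) in zip(stops, stops[1:]):
        if p0 <= t <= p1:
            w = (t - p0) / (p1 - p0)
            return (1 - w) * np.asarray(c0) + w * np.asarray(c1)
    return np.asarray(stops[-1][1])

# made-up ramp: dark core, bright rim, then fading back to black
stops = [(0.0, (0.05, 0.05, 0.05)),
         (0.6, (0.9, 0.8, 0.7)),
         (1.0, (0.0, 0.0, 0.0))]

noise_value = 0.47          # one sample of the noise pattern
print(ramp(noise_value, stops))
```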