I rendered out an equirectangular image (look it up on Auntie Google, if unfamiliar) using Daz Studio Pro. To get a bracketed HDR, I adjusted the image in Clip Studio Paint EX to create one underexposed copy and one overexposed copy. I therefore end up with three equirectangular images - the original, the underexposed one, and the overexposed one.
I go into Poser Pro 11.x and load 3 geospheres. The geosphere prop is part of the Poser default content, and its geometry works with equirectangular images for 360 backgrounds. The geospheres have to be enlarged to form an immersive skydome for your scene (example: size=17500). Then make one geosphere just one unit smaller (example: size=17499), and the next one two units smaller (example: size=17498).
After that, I go into the Poser Material Room and create a Cycles root surface emission shader tree for each geosphere. The Cycles shader tree of the middle geosphere gets the image I rendered in Daz Studio Pro. The outermost geosphere gets the overexposed image and the innermost geosphere gets the underexposed image. I adjust the emission and transparency node values of the inner geospheres until the scene renders acceptably in Superfly.
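The exposure bracketing I do by hand in Clip Studio Paint EX is, at its core, just scaling pixel values up and down by powers of two. Here is a rough sketch of that idea in pure Python on a small list-of-tuples "image" (the function name and the one-stop default are my own for illustration, not part of any of these tools):

```python
def fake_bracket(pixels, stops=1.0):
    """Make under- and over-exposed copies of an image.

    pixels: 2D list of (r, g, b) tuples with 0-255 values.
    One exposure stop brighter = multiply by 2; darker = divide by 2.
    """
    factor = 2.0 ** stops

    def scale(f):
        # Scale every channel, clipping at the 255 ceiling.
        return [[tuple(min(255, int(round(c * f))) for c in px) for px in row]
                for row in pixels]

    # Return the bracket: underexposed, original, overexposed.
    return scale(1.0 / factor), pixels, scale(factor)

# A mid-grey pixel at one stop under and one stop over:
under, base, over = fake_bracket([[(100, 100, 100)]])
# under[0][0] == (50, 50, 50), over[0][0] == (200, 200, 200)
```

In a real HDR bracket the over-exposed frame recovers shadow detail and the under-exposed frame recovers highlight detail; scaling an already-clipped 8-bit render only approximates that, which is why I call the result a fake.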
Basically, I have faked a bracketed HDRI. Here I show the L'Homme figure in my emulated HDRI environment. All the illumination in the scene is coming from the three concentric geospheres. I used the Poser default content Square Plane with Vince Bagna's Superfly wood shader. I modelled the dynamic clothing set in Shade3D. I created the textures for the clothes using Clip Studio Paint EX. The cap is from EF-Steve assets but I reversed its position on the head.
So why not just use an HDRI on one geosphere? Well, have you seen the size of HDRI files? If you have an old computer like mine, there is no way to load the file and still get the rest of the scene working.
My shaders for the three geospheres are shown below.
I used Cycles shaders incorporating a low-resolution equirectangular image to cause a blurry appearance in the rendered scene.
Basically, I have three parallel shader trees feeding into one final node in Cycles. I adjust the position of each image in the trees so that the image literally renders out of focus. Although I have used this with the Poser Construct prop, it applies to a skydome or geosphere as well, provided your image is correctly mapped.
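Defocusing a backdrop amounts to averaging each pixel with its neighbours. A minimal sketch of that idea as a box blur on a single-channel image (pure Python; the function name is my own, and the horizontal wrap-around is there because an equirectangular image is continuous across its left/right seam):

```python
def box_blur(img, radius=1):
    """Box-blur a 2D list of brightness values.

    Columns wrap around horizontally so the equirectangular seam
    stays continuous; rows are simply clamped at top and bottom.
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                yy = y + dy
                if not 0 <= yy < h:
                    continue  # clamp vertically at the poles
                for dx in range(-radius, radius + 1):
                    total += img[yy][(x + dx) % w]  # wrap horizontally
                    count += 1
            out[y][x] = total / count
    return out
```

A flat image stays flat under this blur, while any sharp edge gets smeared over `2*radius + 1` pixels - which is the soft, out-of-focus look I am after in the backdrop.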
I found some old renders which show that I was indeed successful in importing rigged FBX from Mixamo into Daz Studio Pro.
I wanted to create a low-polygon city background for use in anime-type 3D scenes. I used Shade3D to create the geometry, and set up the scene in Poser. I also created textures and shaders for the scene. At that time, I did not know how to use equirectangular images for backgrounds, so the renders are just simple camera shots from various locations within my scene. I wanted the result to be non-photorealistic 3D. The trees and grass are 3D, but the birds are added post-render. The skies are my own creation as well. The camera zoom and blur settings are done inside Poser. I added some bloom, bokeh and gradient effects using Clip Studio Paint.
So I read up a bit about creating 360 equirectangular images in Blender3D. This was before I learned that I could use the Daz Studio Pro spherical lens camera to render my own equirectangular backgrounds. I was really re-shaping 2D images into distorted equirectangular form. There was - as to be expected - only one workable camera view that would not show distortion in my 3D scene when the equirectangular image was applied to the skydome (or geosphere). I did manage to get some sort of CG render, although it was experimental and very early days for me. I used 3rd-party 3D geometries.
I do have a Sway document which details my steps; I will add a link to it later.
I rendered various Poser scenes using WhiteMagus' High-Rise City 3D asset, then processed the renders using Clip Studio Paint. I tried to get a manga or anime style. For the window pane scenes, I had to actually model the windows using Shade3D and position them in the camera view inside Poser.
I used Shade3D 3D assets mixed in with Clip Studio Assets, to get the cityscapes shown below.
In the following section, I used 3D assets imported into Shade3D, rendered the scene, then applied effects in Clip Studio Paint. To be honest, I didn't like the result too much. I suppose I could continue to use SKP assets, but ever since Trimble took it over from Google, the SKP format has become less interoperable via import add-ons and more restricted to its own community of SKP users.
Clip Studio Paint EX (CSPEX) has certain functions which use Artificial Intelligence to process images and produce a modified result. There isn't much user input control available, though. You just let the AI do its thing. I added one prior step to this process: I let the NVIDIA and Intel denoisers work on my Daz Studio Pro renders, then applied the CSPEX AI functions. Some of my results are shown in the section below:
AI-based De-Noiser plus CSPEX AI effects
AI-based De-noiser only
Not exactly a neat result with the AI...
Lots of patches requiring manual clean-up after the AI has had a go...
The two images on the left show some unsavoury results by the AI. Well, this is all experimental anyway.
I am not really sure 2D software is purely 2D anymore in this day and age, because it can import and use 3D files. In this case, I am using Clip Studio Paint EX (CSPEX) as the 2D graphics software. I import a posed 3D figure, position it inside CSPEX, then continue to paint or apply filters. This was just an experimental exercise for me.
Before exporting, combine the figures. Hide the unseen polygons.
At time of exporting to FBX:
- Export single frame
- Select only the combined figure (you can inspect the choices in the hierarchy pop-up)
- Another pop-up appears after you accept; this is Export Options
In the Export Options pop-up:
- Select Custom units and input 10%
- Select ONLY: Bake all morphs; Bake diffuse colour etc.; Bake transparency etc. (I use RGB_Zero); Invert transmap; Compile texture atlas; Maximum texture size (I use 128 and select power of 2)
- Select Embed Media; FBX file type: Binary; FBX Format Version: FBX 2014
- Select Include Geometry
You don't need Normals, you don't need rigging, and you don't need every frame as a keyframe.
What do I know, I just render out the scene in Daz Studio Pro using Iray, then import it into Clip Studio Paint EX and add some effects... (This is Ash, a Genesis 2 Male character.)
One basic scene with different HDRI lighting setups in Daz Studio Pro; I also took the renders into graphics editing in Clip Studio Paint EX to get a BW toon look. I used the Daz3D Genesis 2 Male figure with the Kenji character morph.
This feature has now been disabled in Facebook, but there was a time when Facebook allowed uploading of files to create a fake 3D depth-perception image. You can still do that with photos uploaded from the Facebook mobile app, but back in the day, you could also make a depth map from your image and upload both to create a fake 3D effect from a 2D image. Here I use the Elizabeth 2 figure by Sean Dodger Cannon.
Rendered in Poser software
Corresponding Depth Map created from rendered image
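The fake 3D trick boils down to shifting each pixel sideways by an amount proportional to its depth-map value, which creates a slight parallax between viewpoints. A toy sketch of that idea (pure Python; the function name and parameters are mine for illustration, not Facebook's actual algorithm):

```python
def parallax_shift(image, depth, eye_offset=2):
    """Shift pixels horizontally in proportion to depth.

    image: 2D list of pixel values.
    depth: 2D list of floats in 0.0-1.0 (1.0 = nearest to camera).
    eye_offset: maximum shift in pixels; the sign picks the viewpoint.
    """
    h, w = len(image), len(image[0])
    # Start from the original so unshifted areas fill any gaps.
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            nx = x + int(round(depth[y][x] * eye_offset))
            if 0 <= nx < w:
                out[y][nx] = image[y][x]
    return out

# One scanline: only the last pixel is "near", so only it moves.
row = parallax_shift([[1, 2, 3, 4]], [[0, 0, 0, 1.0]], eye_offset=-1)
# row == [[1, 2, 4, 4]]
```

Near pixels move more than far ones, so foreground elements appear to slide over the background as the viewpoint shifts - the same effect the depth map gives the uploaded photo.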