2022 May 23
I got the loose T-shirt by chasfh and painted a transparency map using the templates provided, then changed the shader inputs on the materials to get the look. The cut-out strands don't drape as much as in the original item, but this is close enough. Software: #DazStudioPro and #ClipStudioPaintEX. I want to send kudos to the designer who made this outfit in real life. Please note that the character in this test render does not resemble anyone in real life.
#TxT #FanArt #Beomgyu #Styling
2022 May 26
Software: Daz Studio Pro
I used Aave Nainen's Open Jacket from the Ordinarily for Genesis 2 Female(s) product, and created strand hair from it, then styled the hair strands.
2022 June
I installed UE 5 and the Quixel Bridge to get access to MetaHuman. To cut a long story short, my target was to get a MetaHuman into Omniverse, but sadly, while the technology can do it, the license only permits rendering of MetaHuman inside Unreal Engine 5. Additionally, I have zero idea of what to do inside UE5, and it appears to me to be more of a rendering and animation solution.
I also uploaded to my Instagram feed a smartphone camera video of my MetaHuman in default animation.
2022 June
After an initial installation merry-go-round, I figured out how the Daz To Blender Bridge works, and managed to get a Daz3D character from Daz Studio Pro into Blender3D version 3.2. The hands are distorted, and the figure reverts to its default pose. The rendering speed with Cycles GPU compute is blazingly fast - completely out of the ballpark, leaving Daz Studio, as well as Poser 12 Cycles rendering, in the dust.
I submitted a tech support ticket to Daz3D asking them what I could do about the distorted hands.
Also, I used a third-party 360 equirectangular HDRI and fiddled with the lighting, but I still cannot figure out the cameras in Blender3D, nor can I get the figure posed inside Blender. His pupil is overly reflective. I don't know how to access his material settings in Blender.
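Since I am still finding my way around Blender, here is a minimal Blender Python sketch (typed into the Scripting tab) of one way to reach those material settings and to set the shooting camera as the active one. The figure name, the "cornea" name check and the values are my own assumptions; the Daz To Blender Bridge may name things differently and may not use a Principled BSDF at all.

import bpy

# Make a camera the active scene camera (assumed object name "Camera")
bpy.context.scene.camera = bpy.data.objects["Camera"]

# Assumed name of the imported Daz figure - check the Outliner for the real one
figure = bpy.data.objects.get("Genesis8Male")
if figure:
    for slot in figure.material_slots:
        mat = slot.material
        if mat and mat.use_nodes and "cornea" in mat.name.lower():
            bsdf = mat.node_tree.nodes.get("Principled BSDF")
            if bsdf:
                # Tone down the overly reflective eye
                bsdf.inputs["Specular"].default_value = 0.2
                bsdf.inputs["Roughness"].default_value = 0.4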
2022 August 22
#3D #CG #WIP I used Blender3D to generate two ANT landscapes and edited them to be computationally lightweight fake fairy mist planes. I used tiling cloud textures from Pixabay, which do not need crediting and are free even for commercial use. I set up the shader materials for each mist plane. The next step is for me to parent the planes to a camera so that the mist and fog will always be in view in the scene. I am still experimenting. I may release these as freebies if there is interest. It could be a lightweight alternative to VDBs and NeuralDBs heh heh.
Once the planes are parented to the shooting camera, and to get variation in the appearance of the mist (perhaps not for animation, though), I propose rotating the planes along the shooting axis of the camera view. Each mist plane can be rotated independently of the other. FAILURE. DOES NOT WORK. I will try another approach.
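For reference, this is the parenting step I mean, as a minimal Blender Python sketch - the plane and camera names are assumptions for illustration; adjust them to your own scene.

import bpy

cam = bpy.data.objects["Camera"]  # the shooting camera
for name in ("MistPlaneFront", "MistPlaneRear"):
    plane = bpy.data.objects[name]
    plane.parent = cam
    # Keep each plane where it currently sits relative to the camera,
    # instead of snapping it to the camera origin when parenting
    plane.matrix_parent_inverse = cam.matrix_world.inverted()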
2022 August 23
#3D #CG #DazStudioPro
Re: computationally lightweight mist or fog using generated ANT landscape planes. ANT is the acronym for Another Noise Tool. As mentioned in an earlier post under these topic hashtags, I have been trying to set up a workable system using the ANT planes and a virtual camera. Ideally, the camera should be able to pan and rotate within the 3D scene space without losing the planes in its field of view. Also, the camera should not rotate uncontrollably through 360 degrees! I have tried various configurations and settings. I don't think it is worth my trying to provide a complete and idiot-proof solution (pardon the phrase). Basically, the setup will allow minimal flexibility in a still render of a scene with lightweight mist or fog effects. Some manual settings will be required, according to what the user wants to achieve. I have been mulling over whether I should even release the freebie, seeing as some lazybones may complain about it not being completely useful for their purposes.
Mist planes both visible.
Lighting coming from in front of mist camera.
Lighting coming from behind mist camera.
Mist planes rotation 180 degrees along the Y axis - for visual variety. I also added a smoothing modifier to each of the two mist planes.
Both mist planes visible and rotated 180 degrees along the Y axis for variety.
Both mist planes visible and in default orientation.
Default render settings of the software, with both mist planes visible.
One of the two mist planes made invisible.
The other of the two mist planes made invisible.
Both mist planes made invisible.
Scene with my own custom render settings for bloom, toning and gamma. Both mist planes invisible.
Scene with my own custom render settings for bloom, toning and gamma. Both mist planes visible.
I used a third-party script to add glow blobs to the planes. I also tested with both planes invisible, then with the front plane and the rear plane invisible in turn. Having just the rear plane visible doesn't add much to the render, but I may reduce the tiling size of the shader to see if that improves the appearance of the mist.
With both front and rear mist planes.
With glow blobs and only rear mist plane.
With glow blobs and only front mist plane.
With glow blobs and both mist planes invisible.
2022 October 12
I was disappointed that Blender3D version 3.4 has a less than satisfactory Scatter Objects script. I've used the Scatter script in Shade3D version 16, which has many more options and gives much better results. However, I am slowly graduating out of Shade3D these days. I used the UltraScatterPro script in Daz Studio Pro and generated several groups of differently coloured light sticks - the type you see at live concerts by pop groups from Japan and South Korea. My base primitive was a cube which I elongated into a prism. I also learned that in Daz Studio Pro, you have to switch Spectral Rendering OFF to get the Spotlight to throw any light into the scene, even though all the other Iray-compatible settings have been applied to the spotlight. (I usually render with Spectral on.) I also discovered how to render bokeh-style emissive meshes. I show all my settings in the following images.
My serendipitous discovery of how to create a bokeh effect using render settings.
Before I discovered that I had to switch Spectral Rendering OFF - this is found under the Render Settings tab of Daz Studio Pro.
I increased the shadow bias and softness for a better look and feel - these are not my settings for my final render. My luminosity settings had to go quite high - 250000.
My screen capture truncated the most important part - Photometrics is ON in my settings.
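For anyone who prefers a script over an add-on, here is a rough Blender Python sketch of the same light-stick idea - elongated cube prisms scattered with randomly coloured emissive materials. This is not the UltraScatterPro workflow I used in Daz Studio Pro; the counts, colours and sizes are placeholders.

import bpy
import random

random.seed(7)
colours = [(1.0, 0.2, 0.6), (0.2, 0.6, 1.0), (0.3, 1.0, 0.4), (1.0, 0.8, 0.2)]

for i in range(40):
    # Add a small cube and stretch it into a light-stick prism
    bpy.ops.mesh.primitive_cube_add(size=0.05,
                                    location=(random.uniform(-3, 3),
                                              random.uniform(-3, 3),
                                              random.uniform(0.0, 0.3)))
    stick = bpy.context.active_object
    stick.scale.z = 6.0  # elongate along Z

    # Give each stick its own emissive material so it glows in the render
    mat = bpy.data.materials.new(name="LightStick_%d" % i)
    mat.use_nodes = True
    nodes = mat.node_tree.nodes
    nodes.clear()
    emission = nodes.new("ShaderNodeEmission")
    emission.inputs["Color"].default_value = (*random.choice(colours), 1.0)
    emission.inputs["Strength"].default_value = 20.0
    output = nodes.new("ShaderNodeOutputMaterial")
    mat.node_tree.links.new(emission.outputs["Emission"], output.inputs["Surface"])
    stick.data.materials.append(mat)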
This 3D CG render use case is a technical experiment with a composited 360 HDRI based on a source image generated by Blockade Labs' SkyBox AI online.
My methodology is as follows:
1) Generate a 360 equirectangular (non-HDRI) image using SkyBox AI by Blockade Labs. Most of the artistic styles they offer produce non-photoreal output. The generated files, while equirectangular, are JPGs and are not HDR. JPGs do not contain the different EV levels that a true HDRI does. Hence these JPG images, while they can be used as 360 backgrounds, are not able to illuminate your scene.
2) So there is a way to cheat a bit. Launch GIMP and open your 360 JPG generated by SkyBox. Make another 6 copies (4, if you're a bit lazy; 2, if you're really just playing around...). I made 6 copies. Three of the six copies will be adjusted to minus EVs and the other 3 will be adjusted to plus EVs. Put your original image at the bottom of the layer stack.
2a) Use the Colours drop-down menu, scroll to Exposure, and change the exposure value in the pop-up window. In my case, my layers are changed as follows, starting from the top layer:
Topmost layer - exposure level changed to +5
Layer 2 - exposure level changed to +3
Layer 3 - exposure level changed to +1
Layer 4 - exposure level changed to -1
Layer 5 - exposure level changed to -3
Layer 6 - exposure level changed to -5
Bottom layer - do nothing
2b) For each of the + value layers, right-click on each layer, Add Layer Mask, select Greyscale Copy Of Layer and also select Invert Mask.
2c) For each of the - value layers, right-click on each layer, Add Layer Mask, select Greyscale Copy Of Layer but do not select Invert Mask.
2d) Make sure all layers are visible (eye icon appears). Export as HDR.
3) Of course this is technically just a cheat, like a game cheat. So I bring the HDR into Daz Studio and use it in the Render Settings tab, applied to the Dome. I need to adjust the Dome Intensity to get reasonable scene illumination, which is a little better than with just the basic JPG from SkyBox. Not all base JPG source images will produce great results after all the effort, but I wanted to try using non-photoreal, graphical-style 360 backgrounds, and Blockade Labs' SkyBox was a way to get some generated. A rough Python sketch of the same recipe follows.
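For those who prefer to script it, here is a rough Python sketch of the same idea - my own approximation of the GIMP layer-mask recipe above, not a GIMP script. It blends exposure-shifted copies of the SkyBox JPG using luminance-based weights (standing in for the greyscale masks) and writes a Radiance HDR with OpenCV. The file names and the weighting are placeholders.

import cv2
import numpy as np

# Load the SkyBox equirectangular JPG and convert to rough linear float
ldr = cv2.imread("skybox_equirect.jpg").astype(np.float32) / 255.0
linear = np.power(ldr, 2.2)  # approximate sRGB-to-linear

# Per-pixel luminance stands in for the greyscale copy used as a layer mask
lum = cv2.cvtColor(ldr, cv2.COLOR_BGR2GRAY)[..., None]

result = linear.copy()  # "Bottom layer - do nothing"
for ev in (-5, -3, -1, 1, 3, 5):
    shifted = linear * (2.0 ** ev)            # exposure-shifted copy of the layer
    weight = lum if ev < 0 else (1.0 - lum)   # mask; inverted for the plus layers
    weight = weight / 6.0                     # crude normalisation across six layers
    result = result * (1.0 - weight) + shifted * weight

# Radiance .hdr stores float values; the Dome in Daz Studio can then load it
cv2.imwrite("skybox_fake.hdr", result)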
References:
https://skybox.blockadelabs.com/
https://community.theta360.guide/t/making-hdr-360-images-with-gimp/162
Disclaimer: I am not associated with nor related to Blockade Labs. I represent myself in this discussion.