2021 October 12
I have been using the Poser morph brush to paint weights on my rigged figures for a while, but today was the first time I learned how to do it in Daz Studio Pro.
I needed to fix a commercial 3rd-party jacket whose cuffs distorted when following the bends of the gloves (the gloves were rigged as one unit together with the jacket). I learned how to get to the node weight-map brush and the smoothing brush, and pretty much applied the same technique as I would to a rigged figure in Poser software.
I illustrate how I did the fix in the following images:
I noticed that the cuffs of the jacket were following the palm of the gloves - an indication that the weight mapping needed fixing.
After some quick research on the internet, I learned how to access and apply the weight-painting and smoothing brushes. In the image above, the red areas on the jacket cuff are the cause of the excessive distortion.
I used the weight brush to reduce the weight on the cuffs, so it now looks less distorted. I used the x-symmetry setting so that the cuff on the right side of the jacket would be fixed as well.
And here is a screen capture of my final render. The jacket cuff is less distorted.
21 October 2021
Both of these are products by the same creator and are commercially available at Daz3D. If you adjust the Clothing Helper morphs in the Parameter Settings, the BJD's joint conformers will change as well !
It seems that these two sets of morphs cannot be used together. This brings us to the next problem.
The BJD joint conformers for the elbows stick out of clothing because they are not calculated as part of the figure when conforming clothes to it. But since the Clothing Helper morphs cannot be used with the BJD morphs, you cannot fix the clothing parts to cover the ball joint conformers !!
Manually adjusting the clothing via weight painting is not possible either, as I have found out. The weight painting brush is blind to the ball joint conformers.
This is quite a nasty situation.
22 October 2021
I used the G8F BJD and faced the same situation as described above - the ball joints stick out over clothing that covers the elbows and thighs. Once again, normal conforming clothing and dForce clothing are "blind" to the BJD ball joints for the arms and legs. Also, the torso line affects the fall of the clothes - it is better to set the BJD torso line morph to zero under Parameters.
Although a solution would be to just make the ball joints invisible, this makes working with transparent clothing tricky, because the rendered appearance is clearly awkward.
In the following 3D scene project, I decided to colour the ball joints as close as possible to the clothing colours.
I used 3rd-party assets in the construction of this scene.
02 November 2021
I created an epicanthic fold (Asian eyelid fold) morph Pz2 for the Poser 12 L'Homme figure. I used the Poser morph brush to do this. I released it at ShadeCG here. It loads into the Body part of the figure.
I also manually modified the PBR default shaders with tints to make the head, limbs, and torso more like East Asian skin.
I also hand-dialled some other face morphs to get a more East Asian look. I compared the result with contemporary East Asian male faces. There is, of course, a wide range of looks among East Asian men. Also, many famous celebrities get cosmetic surgery, so I don't know what's the real deal these days haha !
02 November 2021
In beta versions of Vroid Studio, it was possible to export hair geometry directly. For me, that was great for creating anime-style hair for use in Poser and Daz Studio. Unfortunately, that is not the workflow of interest for most VR Chat and mobile games avatar creators. In the Release Version 1 of Vroid Studio, this direct export capability is gone. It makes life easier for other developers but means I need to jump through more hoops !
So here is the solution:
Use Blender and the VRM Add-on for Blender (yes, rolls eyes). Important: the older VRM Importer add-on no longer works, so go and get the VRM Add-on from GitHub. After installing the add-on, we can import the VRM file. Once the Vroid avatar appears on-screen, select the hair portion and export it as OBJ. Thereafter, the usual procedure applies for importing into Poser and Daz Studio.
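If you prefer to script these steps, here is a minimal Blender Python sketch of the same procedure. It assumes the VRM Add-on registers the import_scene.vrm operator and that the hair mesh has "Hair" in its object name - the file path and names are placeholders, so check them against your own import.

```python
# Minimal sketch of the manual steps above, run from Blender's scripting tab.
# Assumes the VRM Add-on is installed (it provides the VRM import operator)
# and that the hair mesh has "Hair" in its object name -- adjust as needed.
import bpy

# Import the VRM exported from Vroid Studio (path is a placeholder)
bpy.ops.import_scene.vrm(filepath="/path/to/avatar.vrm")

# Deselect everything, then select only the hair mesh object(s)
bpy.ops.object.select_all(action='DESELECT')
for obj in bpy.data.objects:
    if obj.type == 'MESH' and "Hair" in obj.name:
        obj.select_set(True)
        bpy.context.view_layer.objects.active = obj

# Export just the selected hair geometry as OBJ
# (Blender 3.x operator; newer versions use bpy.ops.wm.obj_export)
bpy.ops.export_scene.obj(filepath="/path/to/hair.obj", use_selection=True)
```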
Here is my screen capture of my Blender workspace showing my avatar Keith and the hair I designed for him in Vroid Studio. I am exporting the hair geometry.
For my earlier discussion and images showing my Vroid Hair Creation, please see this section of this website.
04 November 2021
With the release version of Vroid Studio, it is no longer directly possible to extract the hair and scalp geometries as simply as in the beta versions. This change was made for the sake of mobile game character developers and VR Chat avatar creators. However, people like me who use Vroid Studio to create anime-style hairstyles need to use either Blender3D or Shade3D to extract the scalp mesh.
Once the VRM file is imported into Blender3D (shown in my example), I need to select the Body part under the Armature hierarchy, then Select All, then perform Separate by Materials. Going through the resultant separated parts by a process of elimination (switching visibility on and off), I can find the scalp. I export only that part of the geometry. Done.
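For reference, here is a rough Blender Python equivalent of that Separate by Materials step. It assumes the imported body mesh is named "Body" - check the Outliner for the actual object name in your own VRM import.

```python
# Rough script equivalent of the manual Separate by Materials step above.
# The object name "Body" is an assumption -- use the name shown in the
# Outliner under the Armature of your own VRM import.
import bpy

body = bpy.data.objects["Body"]
bpy.context.view_layer.objects.active = body
body.select_set(True)

# Enter Edit Mode, select all faces, and split the mesh by material
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.separate(type='MATERIAL')
bpy.ops.object.mode_set(mode='OBJECT')

# The scalp is now one of the resulting "Body.xxx" objects; toggle the
# visibility of each piece (as described above) to find it, then export it.
```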
It is easy to export the hair as well, because it comes in as a separate geometry provided you ticked that option when exporting the VRM from Vroid Studio v1. So I can now get both Scalp and Hair.
06 November 2021
1. In the Hairstyle section, left click Edit Hairstyle. (This is an original hairstyle I created.)
2. Left Click on your hair group.
3. Left Click on a Material. (I have more than one Material group in my hair mesh).
4. Left Click Edit Texture as shown.
5. Right Click on each texture item as shown in my example. (Your hair may have a different coloured item from mine, if you have customised it.)
6. From the pop-up menu, Left Click on Export. Done.
07 November 2021
1. Separating the scalp part from the figure geometry using Blender3D and the Separate by Materials function. (With Version 1 of Vroid Studio, the scalp and figure body are integrated. You will need to separate the scalp using an external geometry editor. I could also have used Shade3D for this.)
2. I wanted to check the UV of the OBJ exported from Blender. Looks OK.
3. This is a different scalp for a different Vroid character I created. I exported the VRM from Vroid Studio and separated the scalp using Blender3D. I took the mesh into Shade3D and adjusted the vertices to fit roughly over the Genesis 8 figure geometry (no screen capture of this stage, sorry ! I was too busy adjusting the vertices !). I exported the scalp and took it into Poser 12. I used the morph brush to smooth the scalp and improve the fit.
4. I took the exported, smoothed, better-fitting scalp into Daz Studio Pro and loaded the Genesis 8 Male figure in its native environment. Because of the differences in measurement systems (inches versus centimetres), I needed to adjust the position of the scalp a little. It is basically done at this stage.
5. Some time ago, I had created a hair geometry using the beta version of Vroid Studio. I named the hair Keith Hair. Here it is imported into Daz Studio Pro and fitted to the Genesis 8.1 Male Kota figure.
6. Here is the Filament Mode preview render of my Keith hair in Daz Studio Pro.
7. I added the scalp I had re-fitted, then used one of the nVidia vMaterials version 1.7 shaders and modified the settings to give the hair a more NPR appearance. This is the Iray preview.
In the following group of images, I show how I updated an earlier hair geometry that I had created for Poser software's La Femme figure. I added a movement morph inside Poser 12. I created the clothes using Shade3D. Background 360 HDR is by Yashiro Amamiya-san.
In the next pair of images, I show the results of my re-fitted Vroid scalp for La Femme and the Anime Girl morph of La Femme. I used Poser 12 and the morph brush to get the final fit. The texture input was hand-painted by me using Clip Studio Paint EX and its brush assets.
January 2022
I used Clip Studio Paint EX and the UV templates for the Poser 12 L'Homme figure to paint a set of second skin bodysuit textures. I use these as inputs to my Cycles shaders in Layer 1 of each body part in the Material Room.
Update: adding sample renders showing a couple of the possible combinations of the second skin bodysuit.
The second skin is useful when testing clothing I created. Not all combinations shown.
January 2022
Canvas is an AI-enabled photorealistic scenery generator software by nVidia. After nVidia upgraded their Canvas software, I decided to generate some new images and refresh my earlier generated images. See this page to get an idea of what I am talking about.
So I decided to try using these images to create a 360 background for my 3D scenes. Of course this silly detour did not work out because the composite image cannot be equirectangularly mapped, and some of the images did not have horizons at the vertical midpoint. Anyway, I show you what I attempted.
Basically, I took the square output from Canvas into Clip Studio Paint EX, using a canvas of equirectangular proportions (i.e. width twice the height). I duplicated the layer and flipped it horizontally, then moved it to fill up the rest of the canvas space.
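The same duplicate-and-mirror composite can also be scripted; here is a small Python sketch using the Pillow library, purely to illustrate the layout (the file names are placeholders).

```python
# Sketch of the duplicate-and-mirror composite described above, using Pillow.
# The square Canvas output goes on the left half of a 2:1 canvas, and a
# horizontally flipped copy fills the right half. File names are placeholders.
from PIL import Image

square = Image.open("canvas_output.png")   # square image from nVidia Canvas
w, h = square.size

pano = Image.new("RGB", (2 * w, h))        # 2:1 width:height canvas
pano.paste(square, (0, 0))                 # original on the left
pano.paste(square.transpose(Image.FLIP_LEFT_RIGHT), (w, 0))   # mirrored copy on the right

pano.save("composite_2to1.png")
```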
One good result from this exercise was that I decided to use the output images in Blender3D software, and change their toning using Cycles shaders to produce anime backgrounds. That's just how my brain works, sorry if you can't make any connection !
This is actually related to the content above. I could not make equirectangular 360 output from my experiment, but I decided to take the composite images from Clip Studio Paint EX and process them further in Blender3D. I used my own Cycles3 shaders to tone the images towards an anime background look and feel.
Just some direct image editing in Clip Studio Paint EX, to achieve the look and feel of anime-style background. The original photorealistic images were generated by the Artificial Intelligence in the nVidia Canvas software.
I show here some screen shots of what I did, including my node setups.
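As a rough idea of what toning an image through the Cycles world shader can look like, here is an illustrative Python sketch. The node choices and values are placeholders, not my exact setup - that is in the screenshots.

```python
# Illustrative sketch only: one way to tone an equirectangular background in
# the Cycles world shader. The node choices and values are placeholders.
import bpy

world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links
nodes.clear()

env = nodes.new('ShaderNodeTexEnvironment')        # the 2:1 composite image
env.image = bpy.data.images.load("/path/to/composite_2to1.png")

tone = nodes.new('ShaderNodeHueSaturation')        # adjust the colours towards a flatter look
tone.inputs['Saturation'].default_value = 1.3      # stronger, more anime-like colours
tone.inputs['Value'].default_value = 1.1           # lift the overall brightness

bg = nodes.new('ShaderNodeBackground')
out = nodes.new('ShaderNodeOutputWorld')

links.new(env.outputs['Color'], tone.inputs['Color'])
links.new(tone.outputs['Color'], bg.inputs['Color'])
links.new(bg.outputs['Background'], out.inputs['Surface'])
```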
I actually used an equirectangular setting for the Blender3D camera (along with other settings), so these images are true 360 equirectangular outputs.
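For reference, the camera setting is only a couple of properties; a minimal sketch, assuming the Cycles render engine and Blender 3.x property names:

```python
# Make the active camera render a 360 equirectangular panorama in Cycles.
# (Blender 3.x property path; newer versions expose cam.panorama_type directly.)
import bpy

bpy.context.scene.render.engine = 'CYCLES'
cam = bpy.context.scene.camera.data
cam.type = 'PANO'
cam.cycles.panorama_type = 'EQUIRECTANGULAR'

# A 2:1 output resolution keeps the equirectangular proportions
bpy.context.scene.render.resolution_x = 4096
bpy.context.scene.render.resolution_y = 2048
```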
I also played with the procedural sky nodes in Cycles3 to generate equirectangular skies in Blender3D.
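The procedural sky itself is just a Sky Texture node driving the world background; a minimal sketch:

```python
# Minimal procedural sky for a 360 background: Sky Texture -> Background -> World Output.
import bpy

world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links
nodes.clear()

sky = nodes.new('ShaderNodeTexSky')        # adjust the sun and atmosphere settings to taste
bg = nodes.new('ShaderNodeBackground')
out = nodes.new('ShaderNodeOutputWorld')

links.new(sky.outputs['Color'], bg.inputs['Color'])
links.new(bg.outputs['Background'], out.inputs['Surface'])
```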
Unfortunately, I can't show you thumbnails of my rendered images because they are all in HDR format, and web browsers don't display HDR files.
Using a 360 equirectangular image by WhiteMagus as an example, I show how Cycles3 shader nodes can tune the image from photorealistic to anime style.
The input image is a composite of the nVidia Canvas photorealistic output, which I put together in Clip Studio Paint EX as described in the sections above. The composite input image is 2:1 width:height.
The input image is a purely hand-painted image made using Clip Studio Paint EX and Clip Studio Assets. The input image is 2:1 width:height.
Using procedural sky with Cycles3 nodes to produce a 360 background.
Using procedural sky with Cycles3 nodes to produce a 360 background.
Well, here's what a toned 360 background looks like in Poser software. As described above, I used Blender3D to produce and tone the resultant 360 image.
Simple works best ha ha !
I need to use a higher resolution image. This is an anime-ised version of the photoreal 360 image by WhiteMagus, for testing only.
by Juniper Chew
2022 February 27
Windforce works with cloth or hair that can be dynamically simulated.
Basic use:
After setting up your figure, pose, dynamic clothes, and/or dynamic hair, make sure you are in Frame 0. (You may work with the default 30 frames in your scene, but go to frame 0 for the windforce setup).
Click on Object, then click on Create Windforce.
By default, the windforce widget is loaded at the scene centre (x, y, z coordinates 0, 0, 0).
Move the windforce widget to the location from which you want the wind to blow. Since it loads lying halfway into the ground plane, raise the widget to a height within the height of your figure. Now, go to the parameter dials for the Windforce widget and fine-tune the position.
The strength of the wind is determined by the Amplitude slider. Depending on your dynamic hair or cloth settings, the Amplitude setting of the Windforce widget adds a windblown effect when the simulation is calculated. A heavier or stiffer item will be blown less than a lighter or less stiff material.
Advanced use:
You can change the Translation and Rotation parameter dial settings of the Windforce widget through an animated scene to get more motion effects over time (you will need many more frames in your scene, of course). Just make sure that whichever dynamic item you want windblown stays within the cone of the widget's influence (the straight lines in preview mode). You can also dial the Amplitude up or down to adjust the strength of the wind blowing your dynamic items in the animation.
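If you would rather set those keyframes by script than with the dials, here is a hedged Poser Python sketch. The actor name "WindForce 1" and the dial name "Amplitude" are assumptions - check the exact names in your own scene before running it.

```python
# Poser Python sketch: ramp the Windforce widget's Amplitude over an animation.
# The actor name "WindForce 1" and the dial name "Amplitude" are assumptions --
# check your own scene's hierarchy and parameter palette for the exact names.
import poser

scene = poser.Scene()
wind = scene.Actor("WindForce 1")
amplitude = wind.Parameter("Amplitude")

# Ramp the wind strength up over the first 60 frames; setting the value at
# each frame keys it for the animation.
for frame in range(0, 61):
    scene.SetFrame(frame)
    amplitude.SetValue(frame / 60.0)

scene.DrawAll()
```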
That's about it.