After the story was determined, we each chose one of these roles.



After the roles were determined, Jay made a preview of the scene for us.


This week's task is to use two different solvers together. Building on the previous study, I learned some new techniques, including using rigid bodies and particles at the same time to create an effect.
Particles
Display particles - How Houdini draws particles and disconnected points.
Points - Draw particles as uniform dots, with the size controlled by Point size (in pixels). In this mode, near and far particles are drawn at the same size.
Pixel - Draw each particle as a single pixel. This can be useful for very dense particle simulations.
Line - Draw particles as streaks. This only affects particles (disconnected points are drawn as points).
Disc - Draw particles as solid circles, with the radius controlled by Disc size (in world space units). In this mode, the particles are drawn as actual geometry, so closer particles appear larger than farther ones. This only affects particles (disconnected points are drawn as points).
Show sprites - If the particles have sprite attributes (see the Sprite node), draw the sprite image at each particle's position.
Point size - When particles are displayed as points, the size of particles and disconnected points (in pixels).
Disc size - When particles are displayed as discs, the radius of the solid circle (in world space units).
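The difference between the pixel-based Point size and the world-space Disc size can be sketched with a simple perspective model. This is illustrative only, not Houdini code; `focal_px` is an assumed camera constant:

```python
# Illustrative sketch (not Houdini API): a fixed pixel size vs. a
# world-space disc size under perspective projection.

def point_radius_px(point_size_px: float, depth: float) -> float:
    """Point mode: drawn at a fixed pixel size, regardless of depth."""
    return point_size_px

def disc_radius_px(disc_size_world: float, depth: float, focal_px: float = 1000.0) -> float:
    """Disc mode: drawn as real geometry, so apparent size shrinks with depth."""
    return disc_size_world * focal_px / depth

near, far = 2.0, 20.0
print(point_radius_px(3.0, near), point_radius_px(3.0, far))  # same size at any depth
print(disc_radius_px(0.05, near), disc_radius_px(0.05, far))  # near disc appears 10x larger
```

This is why discs give a sense of depth in the viewport while points do not.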
Create nodes:
Circle
VDB from Polygons
Scatter
Voronoi Fracture
Exploded View (for visualization)
Convert
Null (rbd_packed_source)
Add a POP Force and a Multi Solver so that the fragments fly apart.
Add more detail: scatter more points and merge them.
Choose "Delete geometry but keep the points" - this destroys all polygons, NURBS and other primitives, leaving only the points intact.
Final result.
This week will be about volume and smoke, fire or explosion.
In Volumes, we will learn to create and control combustion-style simulations in the Sparse Pyro solver, and then go further by running a batch of wedge simulations using PDG so that we can work more efficiently. Custom simulation and post-simulation techniques, as well as custom shaders and lighting strategies, will allow us to create production-quality work in ways not possible with out-of-the-box tools.
Atmospheric volume
This shader simulates light scattered by a thin, uniform atmosphere. It produces shafts of light and volumetric shadows cast from geometric objects. It works with point, spot, and area lights, but not with distant or sky lights. This is a scene-wide volume shader.
Smoke:
Time Scale - Specifies a scale factor relating DOP time to this microsolver's simulation time. A value greater than 1 makes the simulation run faster than the DOP time, while values less than 1 make the simulation appear to run in slow motion relative to the DOP time. Expression functions such as doptime are available to convert global time to simulation time and vice versa.
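The Time Scale relationship above can be sketched in a few lines. `sim_time` is a hypothetical helper for illustration, not a Houdini expression function:

```python
# Sketch of the Time Scale relationship: simulation time advances at
# time_scale times the rate of DOP time (hypothetical helper, not Houdini API).

def sim_time(dop_time: float, time_scale: float) -> float:
    return dop_time * time_scale

# time_scale > 1: the simulation runs faster than DOP time
print(sim_time(1.0, 2.0))  # 2.0 seconds of simulation per DOP second
# time_scale < 1: the simulation appears to run in slow motion
print(sim_time(1.0, 0.5))  # 0.5
```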
Scale - The amplitude of the turbulence applied to the specified velocity field.
Swirl size - The initial (base) swirl size. Measured in world units. This value is derived from the noise frequency.
Grain - The amount of influence of the added turbulence octaves, relative to the initial swirl size.
Pulse length - How fast the noise moves. A higher value results in slower movement.
Seed - Defines the initial noise offset.
Attenuation - Defines the gradual falloff of intensity.
Influence threshold - When to apply turbulence, based on the specified density field.
Turbulence - The applied turbulence level relative to the initial swirl size; use a lower value for a smoother transition.
Fire:
First remove the temperature attributes from the attribnoise and volumerasterizeattributes nodes and add the flame density source and target field attributes; then add disturbance, and add gas vortex confinement with a confinement scale of 0.5.
Then use a Gas Wind DOP to apply wind force, adjust the velocity field toward the ambient wind direction, and finally adjust the volume shading to add color.
Explosion
The first input provides the source for the Pyro simulation, and the second input provides the collision geometry. Lowering the density makes the flame effect clearer, and adding density and divergence makes the explosion expand quickly.
This week, we’re focusing on Houdini’s materials, lighting and rendering.
Displacement rendering
Displacement maps are usually used to represent height variation on the surface of objects during rendering.
The effect is usually achieved by moving each point along the surface normal by a distance defined in the map.
It gives a texture the ability to express detail and depth.
It also allows self-occlusion, self-shadowing, and correct silhouette rendering.
On the other hand, compared with similar techniques, displacement consumes the most performance, because it requires a lot of additional geometry.
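The core operation described above can be sketched in a few lines: each point is pushed along its normal by the height sampled from the map. Here the height value is passed in directly as a stand-in for a texture lookup:

```python
# Minimal sketch of displacement mapping: move a point along its unit
# surface normal by the height sampled from a displacement map.

def displace(p, n, height):
    """Return point p moved along unit normal n by the given height."""
    return tuple(pi + ni * height for pi, ni in zip(p, n))

p = (1.0, 0.0, 0.0)         # a point on a unit sphere
n = (1.0, 0.0, 0.0)         # the outward normal at that point
print(displace(p, n, 0.2))  # the point moves 0.2 units outward
```

Because the geometry itself moves, the silhouette changes too, which is what bump or normal mapping cannot do.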
Mantra is a highly advanced renderer included with Houdini. It is a multi-paradigm renderer that implements scan lines, ray tracing, and physically-based rendering. You should use a physically-based rendering engine unless you have a good reason to use another engine. Mantra is deeply integrated with Houdini, such as efficient rendering of packed primitives and volumes.
Lighting
Ambient lighting adds light to the scene as if it came from a sphere surrounding the scene. Usually, the light is colored using an image called an environment map. Environment maps can match the lighting (and reflections) of the scene to the real-world position, and can also be used to add interesting changes to the lighting of the scene.
HDR mapping refers to environment mapping with a high dynamic range in 3D and other image software. Generally speaking, an HDR texture is a seamless texture composed of HDR photos (a seamless texture is an image whose edges tile up, down, left and right with no visible seams or joins). HDR textures generally depict natural scenery or indoor environments.
An HDR map is an image whose illumination data has a high dynamic range: a normal image is 8-bit, while an HDR map is 32-bit. In other words, it has more grayscale levels and richer detail. High dynamic range images are closer to the dynamic range of the human eye, and can even exceed it. In short, an HDR image is a photo with rich detail in both light and dark areas. HDR photography makes up for the limited dynamic range of a camera by taking multiple photos from the same position with different exposures and merging them.
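A small sketch of why 32-bit float storage matters for lighting: 8-bit storage clamps radiance to [0, 1] and quantizes it into 256 levels, while float storage keeps values above 1.0 (the sun, bright windows), which is exactly the information an environment light needs:

```python
# Sketch of 8-bit vs. float (HDR) storage of a radiance value.

def store_8bit(radiance: float) -> int:
    """8-bit: clamp to [0, 1], then quantize to 256 levels."""
    clamped = min(max(radiance, 0.0), 1.0)
    return round(clamped * 255)

def store_float(radiance: float) -> float:
    """Float HDR formats keep the full range unchanged."""
    return radiance

for r in (0.5, 1.0, 6.0):
    print(store_8bit(r), store_float(r))
# A bright highlight of 6.0 clips to 255 in 8-bit but survives as 6.0 in float.
```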
8-bit color:
In television, each individual value represents a specific level in a color channel. When we talk about 8-bit color, we basically mean that the TV can represent values from 00000000 to 11111111, giving 256 variations per channel. Since TVs reproduce red, green, and blue values, 256 variations of each channel mean that the TV can reproduce 256x256x256 = 16,777,216 colors in total. This is commonly called true color and has been used as a standard for TVs and monitors for many years.
10-bit color:
10-bit color can represent each of the red, green, and blue channels with values between 0000000000 and 1111111111, which means it can reproduce 64 times as many colors as 8-bit. This gives 1024x1024x1024 = 1,073,741,824 colors, far more than 8-bit. For this reason, many gradients in an image will look smoother, and 10-bit images are visibly better than 8-bit images.
12-bit color:
The 12-bit color range runs from 000000000000 to 111111111111, giving 4096 levels of each primary color, or 4096x4096x4096 = 68,719,476,736 colors. Although this is technically 64 times as many colors as 10-bit, the TV must produce a bright enough image for the difference between the two to be visible.
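The color counts quoted above all follow from one formula: each channel has 2^bits levels, and three RGB channels multiply together:

```python
# Verify the bit-depth color counts: 2**bits levels per channel,
# cubed for the three RGB channels.

def total_colors(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel
    return levels ** 3

print(total_colors(8))   # 16,777,216
print(total_colors(10))  # 1,073,741,824
print(total_colors(12))  # 68,719,476,736
print(total_colors(10) // total_colors(8))  # each extra 2 bits per channel = 64x more colors
```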
Light object:
Point - A light that emits from a specific point in space defined by the light's transform.
Line - A line light from (-0.5, 0, 0) to (0.5, 0, 0) in light space.
Grid - A rectangular grid from (-0.5, -0.5, 0) to (0.5, 0.5, 0) in light space.
Disc - A disc-shaped light. The disc is a unit circle in the XY plane in light space.
Sphere - A spherical light. The sphere is a unit sphere in light space.
Tube - A tube light. The first Area Size parameter controls the height of the tube, and the second controls the radius.
Geometry - Uses the object specified by the geometry object parameter to define the shape of the area light.
Far - A directional light source infinitely far from the scene. Far lights cast sharp shadows, so depth map shadows can be used.
Sun - A directional light source of finite (non-point) size, infinitely far from the scene.
Sun lights are similar to far lights, except that they produce penumbras - soft shadows, like the actual sun.
Arnold
This week we will try to make a destruction effect, and I will use the wooden house model I made in the first week for the destruction test.
First, we need to understand Voronoi.
Voronoi is the closest thing we have found to natural or destructive shapes. It works roughly like this: we scatter a lot of points, then draw a boundary between every two points, and each boundary stops where it meets a boundary between other points. This is exactly the effect we want for 3D shattering: the irregular fragments give the effect a more realistic look.
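The idea above can be sketched in plain Python: every location belongs to the cell of its nearest scattered seed point, and the fracture boundaries lie exactly where two seeds are equally close:

```python
# Minimal sketch of the Voronoi idea: assign a point to the cell of
# its nearest seed. Boundaries form where two seeds are equidistant.

def nearest_seed(p, seeds):
    """Return the index of the seed closest to point p (its Voronoi cell)."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(range(len(seeds)), key=lambda i: dist2(p, seeds[i]))

seeds = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]  # the scattered points
print(nearest_seed((1.0, 0.5), seeds))  # -> 0
print(nearest_seed((3.5, 0.2), seeds))  # -> 1
print(nearest_seed((2.0, 2.5), seeds))  # -> 2
```

In Houdini, the Voronoi Fracture node does exactly this partitioning on the input geometry, using the scattered points as seeds.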
The first way
Voronoi Fracture + Exploded View nodes
When you uncheck Distance VDB in vdbfrompolygons, and then select Fog VDB, the state of the model will change.
Create an isooffset node. Connect the isooffset node directly to the sphere node. At this point, you can observe the change in the shape of the object.
The second way
Add scatter/grid/copy to points and attrirandomize
Modify the scale of the transformation and the uniform scale of the exploded view.
The third way
Add rbdmaterialfracture
Adjusting the value of Scatter Points in rbdmaterialfracture will increase the number of fragments.
The task for the second week is to explore the particle system and make some preview videos.
[Auto update]: When we change the value of a node in Houdini, it will automatically (and very quickly) update the effect on the screen.
[On mouse up]: When we change the value of a node in Houdini, it will only update when we click on the window with the mouse.
[Manual]: Manual update. The viewport only updates when we trigger the update ourselves after changing the value of a node. (This mode is very convenient; I have been using it for my last homework.)
[Null] node: This node has no effect on the geometry, but it is handy to create a [Null] node in the panel to observe the process. (Maybe because the shape of this node is so unusual: most other nodes are rectangular, while the [Null] node has a shape similar to an X.)
Vop and Vex
First create a box, then select the node and press I to enter it.
Create an attribute wrangle node and connect it to the box node
Select the attribute wrangle node, press P to open the parameter panel, and enter the vex code in the vexpression
Click on vexpression and press Alt + E to open the vex code editor
[Gravity] is something common to all solvers.
Put [gravity] under [popsolver]. At this point, we can see that the particles on the [testgeometry_crag] model start to fall downward.
There are different types of wind in [popnet]
Increase the value of [Wind Velocity] in [popwind], the particles will change the direction of movement.
[Render a flipbook]
Render the Houdini file.
Create a [popforce] node. Changing the [Amplitude] value in [popforce] has an effect similar to [noise wind] and can affect the direction of particle movement.
Houdini is a brand-new piece of software for me, and a new challenge. I am still at the stage of teaching myself Maya.
The main things I studied this time:
Unlike Maya, the Houdini viewport allows us to move or rotate the view directly with the mouse.
Many operations can be performed in the object panel on the right
Enter the name of the object to be created in the [TAB] search panel, select it, and create it directly. Remember to select the object you want to create, then right-click on it in the properties panel.
I created a base sphere, then followed the tutorial video and created a box in the same way. If you want to switch to viewing the box, click the blue mark on the right side of the [Sphere] panel; after hiding the sphere, you can see the box.
[Parameters] in Houdini can be displayed by pressing [P] on the keyboard.
Unlike Maya, the project folder in Houdini does not have a [Scene] folder. Scene data in Houdini is usually very small; in most cases the main file is stored in the [Desktop Folder].
Rock practice.
Create [Sphere] first, then select [Polygon] in [Primitive Type].
Select [Flat shaded] to observe [sphere].
Change the location attribute of the noise [Attribute Names] to [P].
Add [null] node. [null] has a certain shape, which helps us find the current item more easily.
Change the values of [Noise Type] and [Element Size] in [attribnoise]. You can also add more detail under [Fractal] and [Deformation].
The final rock.
Making of the wooden house
Create two [geo], create [box] and [testgeometry_tommy1] respectively.
Adjust the values in the [box] properties so that the model stays on the ground plane when you increase [Size] on the y-axis.
Select points and create transform nodes
Alt+drag = copy. Create a Boolean.
Polyextrude - thickness; Reverse - reverses normals; Roof - two extrusions.
The final wooden house.
Image-based lighting (IBL) settings (2)
First capture the environment map. This image can be taken by the camera in the real world (HDR is recommended for better results). It can also be rendered in real time via the camera in the game.
AOV
Light groups in a scene allow us to decompose the beauty pass into multiple per-light renders. In addition, each per-light render can be decomposed according to shading components (the diffuse, specular, and subsurface scattering components).
Render separately
Select Transform > SphericalTransform to insert a SphericalTransform node after the HDR image. You can use this node to convert HDR images into spherical map images. In the node's controls, set Input Type and Output Type.
Add contact shadows and reflections
Image-based lighting (IBL) settings
Classification of HDR
Maya
Introduction to Lighting in Visual Effects
Part 1