How to Persuade Competing VFX Studios to Cooperate
https://digitalproduction.com/2024/10/30/how-to-persuade-competing-vfx-studiosto-cooperate/
Wed, 30 Oct 2024
How do you get companies that are in fierce competition with each other, and therefore do not trust each other, to work together? You introduce an independent third organization in which all competitors are jointly involved.

In the case of Linux, this is done by the Linux Foundation. Some VFX-relevant open-source projects, such as OpenEXR or OpenVDB, are now under the umbrella of the Academy Software Foundation (ASWF), which in turn is under the umbrella of the Linux Foundation.

Anyone working with VFX will inevitably also directly or indirectly use open-source technologies, such as OpenEXR for exchanging images with high bit depth in a scene-referred colour space, OpenColorIO for colour management, or OpenImageIO for reading and writing numerous image formats. These components are usually invisible to the user as parts of larger software, such as The Foundry Nuke, Autodesk Maya, or Blender. Although these open-source projects are used by many companies, their development was mostly in the hands of individual companies. For example, OpenEXR was under the control of Industrial Light & Magic (ILM), Open Shading Language (OSL) was managed by Sony Pictures Imageworks, and OpenVDB was overseen by DreamWorks.

A Question of Trust

This situation was unsatisfactory for other market participants, as they did not trust each other. This is also one of the reasons why all these projects are open-source and subject to extremely permissive licenses. Disney, for instance, would never use proprietary software from DreamWorks, but it would use OpenVDB as open-source software under the Mozilla Public License. However, trust would be further strengthened if the projects were managed by a neutral third party. This would also address the issue of projects becoming orphaned when a central developer changes jobs.

Enter the ASWF

Jean-Francois Panisset (is.gd/jean_francois_panisset)

The Academy Software Foundation (ASWF) has taken on this role, and under the auspices of the experienced Linux Foundation, it has successfully persuaded competitors such as NVIDIA and AMD, DreamWorks and Weta Digital, as well as The Foundry and Adobe, to work together. Jean-Francois Panisset explains how this was achieved in an interview.

DP: I‘m here at FMX with Jean-Francois Panisset (is.gd/jean_francois_panisset) from the Academy Software Foundation (ASWF). It‘s basically an umbrella foundation for a lot of open source projects that are currently used in visual effects. But why do you need an umbrella foundation for those projects at all?
Jean-Francois Panisset: Many foundational projects, like OpenEXR, OpenColorIO, and OpenVDB, were originally developed by major visual effects studios and later open-sourced. OpenEXR, for example, was open-sourced around 2003, and for a while, everything worked well — other studios could download the software, and vendors could integrate it into their products.
However, as time went on, the original creators moved on to new roles, left their companies, or even left the industry entirely. This led to a lack of active maintenance, and these projects began to stagnate. Bug reports went unaddressed, pull requests weren’t reviewed or merged, and in some cases, the projects couldn’t even be built anymore due to a lack of automated CI builds. It became an industry-wide problem. Recognizing this issue, discussions began on how to maintain these projects long-term and create a sustainable ecosystem. CTOs from major studios and software companies held meetings, conducted surveys, and explored solutions with organizations experienced in setting up foundations.
The Linux Foundation emerged as the best fit, having successfully created similar initiatives, like the Automotive Linux Foundation, which brought together competing car manufacturers to collaborate. This led to the decision to establish the Academy Software Foundation (ASWF) as a sub-foundation of the Linux Foundation, providing a structured environment for professional open-source development, ensuring projects wouldn’t collapse when individual contributors moved on.
Since its formation, the ASWF has expanded from the original three projects to 14, all of which are now active and well-maintained. One key advantage of the foundation is its legal framework, which allows competing companies to collaborate on these projects without concerns about antitrust issues. This neutral, legally protected space enables a higher level of cooperation than three friends developing together – competing companies need a neutral space to collaborate.

DP: How do you support developers looking to contribute?
Jean-Francois Panisset: The ASWF offers a range of resources to support project development. We provide collaboration tools like a paid Slack instance, Zoom, and scheduling platforms to keep teams connected. Our GitHub organization is at the enterprise level, giving projects higher build-time limits, and we also have technical support through the Linux Foundation’s release engineering team for any GitHub issues.
We cover paid infrastructure like Docker Hub to avoid throttling on container pulls and offer more powerful build systems for projects that need extra memory or CPU, like OpenVDB. Projects with GPU-enabled test suites can also access paid GPU runners, which aren‘t available for free on GitHub. While development still follows the open-source model, working publicly on GitHub, these resources help avoid roadblocks and make the process smoother. The way you develop software within ASWF is really still the open source way – you get access to all these extra features that just facilitate the development process and mean you don‘t get blocked because “oh we don‘t have access to this”.

DP: Let‘s say I found a bug in one of your projects and I want to file a report or maybe I even have a fix. Usually it can be quite daunting to approach a project. I think you have some on-boarding process?
Jean-Francois Panisset: It varies by project, but most follow a standard process since they’re all GitHub-based. If you find a bug, the first step is to create a GitHub issue. All projects monitor these, and their technical steering committees review and prioritize new issues. If you also know how to fix it, submitting a pull request (PR) is encouraged. Some projects require signing a Contributor License Agreement (CLA), which assigns ownership of the code to the project, preventing IP disputes. The Linux Foundation provides an easy process for this. Individual developers sign an ICLA, while companies can sign a corporate CLA, allowing their employees to contribute without additional approvals. Though it may seem like legal paperwork, it can simplify contributions, especially from large organizations.
Other projects have a simpler process, requiring a Developer Certificate of Origin (DCO), where you certify that you wrote the code. Before submitting a PR, it‘s always good to engage with the project to confirm how they‘d like it implemented. Even if a PR needs adjustments, the steering committees ensure that issues and contributions are addressed.

At the very beginning, projects end up in the ASWF sandbox. The content-management toolset OpenAssetIO and the Open Review Initiative, a collection of programs for the playback and review process, are currently there.

DP: What if we’ve made a larger change, like customizing a library, and want to start small before submitting the bigger contribution?
Jean-Francois Panisset: The concept of a “good first issue” is key to project maturity. Projects are encouraged to flag these issues to attract new developers. While some projects require domain-specific knowledge—like contributing to OpenColorIO’s colour transform engine, which is suited for colour scientists—there are still many contributions that don‘t require specialized expertise. By tagging beginner-friendly tasks, newcomers can browse the backlog, find something they can tackle, and submit a pull request.
Last year, we introduced “Dev Days,” a 48-hour event, similar to a hackathon, where developers can get real-time help with setting up their environment or validating their approach. It’s a great way to onboard new contributors. Facilities are also participating by encouraging their engineers to work on open-source projects. This helps studios prevent knowledge loss — especially when only one developer knows certain systems like OpenEXR — by getting more developers familiar with key projects.
There‘s a tool called Clotributor, developed by the Cloud Native Computing Foundation (CNCF), that aggregates “good first issues” across multiple projects. You can search for projects you‘re interested in, and it will suggest issues flagged for beginners. It‘s not limited to ASWF; several foundations use it. Some projects have received PRs from developers outside their field who found the issues interesting. It‘s in every project‘s best interest to attract new contributors, even if it‘s just for a one-off fix — you‘re still better off than before.

DP: The list of projects on your umbrella is pretty impressive and every artist in the industry is using them actively. What‘s the rationale for the organizations behind the projects to open source their technology in the first place?
Jean-Francois Panisset: So, obviously, I don't speak for any of those organizations, but there are multiple motivations. The first one is: don't underestimate people wanting to do the right thing and realizing that in our industry being able to collaborate is important. Maybe there's that, but there's also almost a business reason, because today there are very few projects that are done at only one facility. So, if I as a facility have this great proprietary data format that no one else knows, that's not really helpful. How am I going to exchange data with other facilities? So open sourcing my component and "hoping" it becomes the de facto industry standard is great, because I don't have to rewrite all my internal software.
Also, I don‘t have to carry a hundred percent of the burden of maintaining that software, because other facilities have an incentive to help develop and maintain that software. That‘s partly why the list of projects in ASWF are interchange formats – which are handling how you move data around between projects and departments and facilities and vendors. There‘s also the idea that I would love software vendors to support that format – and the way to do that is to make sure that my format becomes a de facto standard and the implementation becomes the de facto implementation.

Also, if vendors can easily adopt the implementation into their software, support becomes more likely – who wants extra work, right? And in turn this helps attract and retain software development talent, which is always a challenge. Giving your engineers the opportunity to not just work internally but also to contribute to a greater community demonstrates their value to that community. And that's good for the company in the long term, because you'll have happier engineers who will be more productive.

Because, as an engineer, if you're exposed not just to the way things are done in your company but also to the way things are done elsewhere, you learn more and you become better at what you do. Open sourcing is a way to make your engineers more productive and to attract talent. I wouldn't want to work somewhere where all the work is just internal – and with open source development you show that you're not just a great studio that makes great visual effects, you also have a great technical culture. There's no better way to advertise that to professionals than by open sourcing some of your internal technologies.

Although they are in competition with each other, DreamWorks and Sony Pictures Imageworks work together under the ASWF umbrella, as do SideFX and Autodesk, Red Hat and Canonical, and Nvidia and AMD.


In the incubator
The projects that do not yet fulfil all the criteria are in the incubator with the aim of adoption. These include the material format MaterialX, the VFX plug-in standard OpenFX, the library of 3D example assets DPEL, the image exchange library OpenImageIO and the exchange format for video editing OpenTimelineIO.
Blender 4.1 goes into detail
https://digitalproduction.com/2024/05/13/blender-4-1-goes-into-detail/
Mon, 13 May 2024
The Blender release cycle consists of three new versions of the software per year. The first release is usually characterised by new features. The reason for this is that the third and final release is a Long Term Support (LTS) version, which is supplied with bug fixes for another two years.

The developers are naturally more hesitant when it comes to adding new features and are happy to postpone them until the next cycle.

The middle release of a cycle, which the current version 4.1 is, usually focuses on clean-up work and improvements. This is the case again this time: many of the new features introduced in Blender 4.0 have now been polished further.

The cover graphic of the last issue, stylised with the Kuwahara node. The filter size gets smaller the closer an element is to the viewer. This makes the tubes inside easy to recognise, while the sculpture becomes increasingly blurred towards the back.
Kuwahara filter controllable

One example of this is the Kuwahara filter in the Compositor, which was introduced in Blender 4.0 and can be used to give images an oil-painting look. It can now optionally be run at higher precision, which should produce better results for HDR images and at particularly high resolutions, at the cost of a slightly longer execution time. The size of the filter area is no longer static but can be driven via a socket. This makes it possible, for example, to have image elements look more painterly or blurred the further away they are from the camera.

An example of the inpaint node. The centre of the nose was masked and refilled using Inpaint. On the left the result in Blender 4.0, on the right in Blender 4.1. The clearly recognisable line in the middle of the inpaint area in the left image is due to the fact that the edge pixels converge in the middle. Thanks to a second pass, the area in Blender 4.1 is smooth and continuous.
Viewport compositor finally complete

The depth pass required for this now also works in Eevee and the Workbench engine and is supported by the Live Compositor, which displays the compositing result in the viewport. In Blender 4.1, all nodes are supported there for the first time. Only the Render Layers node is limited to the Image, Alpha and Depth passes. The Depth pass is also not yet available with Cycles and outputs the depth in normalised coordinates rather than the absolute distance of the pixels to the camera sensor as in rendering. The developers therefore recommend attaching a Normalise node directly to the Depth pass if you want to use it in the viewport. This ensures that the result does not suddenly change during rendering.
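
For script-driven setups, the recommended wiring can also be created in Python. The following is only a rough sketch assuming Blender 4.1's bpy API; the node and socket identifiers (CompositorNodeRLayers, the 'Depth' output, CompositorNodeNormalize, CompositorNodeViewer) are taken from the current compositor API, and a Viewer node is used so the result shows up in the viewport compositor.

    import bpy

    scene = bpy.context.scene
    scene.use_nodes = True                                  # enable the compositing node tree
    bpy.context.view_layer.use_pass_z = True                # make sure the depth pass is rendered
    tree = scene.node_tree

    render_layers = tree.nodes.new('CompositorNodeRLayers')

    # Normalise the depth values, as recommended for the viewport compositor
    normalize = tree.nodes.new('CompositorNodeNormalize')
    tree.links.new(render_layers.outputs['Depth'], normalize.inputs[0])

    # Display the normalised depth via a Viewer node
    viewer = tree.nodes.new('CompositorNodeViewer')
    tree.links.new(normalize.outputs[0], viewer.inputs['Image'])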

The Split Viewer node has been replaced by a new node called Split. Like its predecessor, it divides the image into two halves, either along the X or the Y axis, so two effects can be compared directly in one image. Unlike the Split Viewer node, it has an image output, so it is no longer limited to viewing but can also be used to post-process or save the result.

The Pixelate node has a new Size property. Previously, you had to add a node in front of the pixelate effect to scale the image down and a second node behind it to scale it up again. This is no longer necessary, as the size can now be set directly in the node.

The Inpaint node can be used to remove ropes, markers and other small details from images and videos by extending the edge pixels of an area defined via a mask or the alpha channel inwards. In Blender 4.1, it now uses the Euclidean distance instead of the Manhattan distance, which should ensure a more even fill. In addition, the node now works in two passes, so there should no longer be any artefacts at the point where the fills converge. Previously, a clear line was usually visible there; now the area looks soft and blurred.

Detail improvements have also been made to a number of other nodes. The Defocus node now calculates the bokeh radius more accurately, which means that the results match the output of render engines better.

The Sun Beams node now produces softer beams, and the anti-aliasing of Z Combine and Dilate has been improved. The Double Edge Mask node works between 50 and 250 times faster and now also uses the edge pixels, whereas previously the mask was shifted one pixel inwards. The Crop node now makes an image disappear completely if the upper border is below the lower border, whereas up to Blender 4.0 it would have inverted the crop in this case. The Flip node now works in local coordinates, which means the image no longer moves away when the source is moved. The UV Map node now offers a choice between anisotropic and nearest-neighbour filtering. This simplifies some NPR workflows such as palette-based remapping of colour tones. More interpolation options may be implemented in the future.

The various interpolation modes for strips in the Video Sequence Editor (VSE). The two cubic algorithms are new additions, whereby Mitchell is generally better suited for images than Cubic B-Spline, which is also used in other places in Blender.

With the Keying Screen node, two-dimensional colour gradients are generated by sampling points on a source image. The idea behind this is to feed the colour input of a Keying node with a gradient in order to compensate for uneven illumination of a green screen. The gradients created this way were previously characterised by hard edges and linear transitions. The new version in Blender 4.1 uses Gaussian interpolation, which ensures a buttery smooth result. The compositor is now only executed if its result is actually displayed somewhere, for example with a Viewer node or in the Image Editor. For the entire node tree, you can now select whether it should be calculated with full or automatic numerical precision. The latter uses half the bit depth for previews, which means the calculations run faster and with less memory, although this can lead to increased artefacts.

Eevee Next only in the next release

In the last issue, we reported that we were looking forward to Eevee Next, a modernised version of the Eevee real-time render engine supplied with Blender. This was actually supposed to be integrated into Blender 4.0, but was then postponed to Blender 4.1. And then came the news that it still does not meet the developers’ quality requirements and will only be released in Blender 4.2. In Blender 4.1, the light probes in Eevee were renamed from Reflection Cubemap to Sphere, Reflection Plane to Plane and Irradiance Grid to Volume. The changes are not purely cosmetic, but also affect the Python API.
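
The rename shows up directly in scripts that add light probes. A minimal sketch below; the new enum values 'SPHERE', 'PLANE' and 'VOLUME' are assumptions derived from the renaming described above, so scripts written against the old identifiers will need updating.

    import bpy

    # Blender 4.1: light probes use the new names (assumed enum values)
    bpy.ops.object.lightprobe_add(type='SPHERE')   # formerly Reflection Cubemap
    bpy.ops.object.lightprobe_add(type='PLANE')    # formerly Reflection Plane
    bpy.ops.object.lightprobe_add(type='VOLUME')   # formerly Irradiance Grid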

Denoising with OpenImageDenoise on different hardware. The Junkshop splash screen from Blender 2.81 was used as an example file. A Geforce RTX 3090 GPU denoises the scene approx. 15 times faster than an Intel i9-13900k CPU.
OpenImageDenoise on the GPU

After rendering with path-tracing-based render engines such as Cycles, which is included in Blender, there is usually a post-processing step in which the image noise typical of path tracing is removed. Blender comes with two solutions for this: OpenImageDenoise from Intel and the OptiX denoiser from Nvidia. Previously, only the latter could run on the graphics card, which excluded users of non-Nvidia hardware from the acceleration and meant that with OpenImageDenoise, noise removal often took longer than the actual rendering. In Blender 4.1, OpenImageDenoise now also works on the graphics card. Specifically, Nvidia GPUs from the GTX 16xx series, TITAN V and all RTX models are supported, as well as Intel graphics chips with Xe-HPG architecture or newer and Apple Silicon with macOS 13.0 or newer. AMD GPUs are not yet supported due to stability issues. If you are using a graphics card with an AMD RDNA2 or RDNA3 chip, you can switch to the alpha version of Blender 4.2, where support is already enabled. The developers have used the splash screen from Blender 2.81 as the basis for a benchmark. There, an Apple M2 Ultra GPU with 76 cores is more than three times as fast as an M2 Ultra CPU, and an Intel i9-13900K CPU takes around 15 times as long as an Nvidia RTX 3090.
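
In a headless render script, the denoiser can be selected per scene. A hedged sketch: the denoiser enum value is part of the existing Cycles API, while denoising_use_gpu is an assumption about how the new GPU path is exposed in Blender 4.1.

    import bpy

    scene = bpy.context.scene
    scene.cycles.use_denoising = True
    scene.cycles.denoiser = 'OPENIMAGEDENOISE'   # Intel OIDN instead of Nvidia OptiX

    # Assumed property name for the new GPU-accelerated OIDN path in Blender 4.1
    scene.cycles.denoising_use_gpu = True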

Hardware support further expanded

If you are using an AMD processor with integrated RDNA3 graphics, you can now use the integrated GPU for rendering as well. Rendering performance on the CPU under Linux has been improved by around 5 per cent across all benchmarks, which is particularly relevant for render farms and cloud rendering.

Improvements in video editing

Blender comes with its own video editing editor. The Video Sequence Editor (VSE) has received performance improvements in various areas. The timeline should now update three to four times faster for more complex projects. Colour management, audio resampling, reading and writing of frames and parts of the code for image transformation have also been optimised. The glow effect now works between six and ten times faster, wipe can even be calculated up to 20 times faster. Gamma Cross is now four times faster, Gaussian Blur one and a half times faster and Solid Colour twice as fast.

The vector scopes can now be coloured and retain their aspect ratio. A line shows the average Caucasian skin tone.
New scopes

The Luma Waveform is calculated eight to 15 times faster and has also received a visual update; the display now shows more information. The RGB Parade variant, in which the individual channels are displayed separately, now uses less saturated colours and slightly additive blending to be easier on the eyes. The histogram also displays more information, is less saturated and is drawn faster thanks to GPU acceleration. The Vector Scope now retains its aspect ratio and has been given a line that corresponds to the average Caucasian skin tone. It can also be coloured, making it less abstract.

Left Blender 4.0, right Blender 4.1. From top to bottom the normal histogram, the waveform display of brightness and the waveform display divided by RGB channels, the so-called parade view.
Audio waveforms as standard

In the Video Sequence Editor, the waveforms are now displayed by default for audio strips. As these are usually symmetrical, you can restrict the display to the upper half.

Automatically the best filtering

Cubic interpolation is now also offered when rotating and scaling strips; previously it was only available in the Transform effect strip. Performance has been improved at the same time. Cubic interpolation comes in the B-Spline variant, which is also used elsewhere in Blender, and the Mitchell variant, which is usually better suited for images. The bilinear filter no longer produces a transparent border at the edge of the image when it is scaled up, and a whole series of errors has been eliminated where images were shifted by one pixel, resulting in annoying gaps. The subsampled3x3 filter has been replaced by a generalised box filter, which also performs well when images are scaled down by more than a factor of three. By default, the filter that is expected to produce the best result in the given situation is now applied to a strip: if a strip is not scaled or rotated and its position is only changed in integer steps, Nearest is selected. If an image is enlarged by more than double, Cubic Mitchell is used; if it is reduced to less than half, Blender 4.1 uses the Box filter, and in all other cases the interpolation remains Bilinear.
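
The selection rule can be summarised as a small decision function. This is purely an illustration of the heuristic described above, not code taken from Blender, and the returned labels are not Blender's internal identifiers.

    def auto_strip_filter(scale: float, rotated: bool, integer_offset_only: bool) -> str:
        """Pick a VSE interpolation filter following the Blender 4.1 heuristic."""
        if scale == 1.0 and not rotated and integer_offset_only:
            return "Nearest"          # untouched strip: no resampling needed
        if scale > 2.0:
            return "Cubic Mitchell"   # enlarged by more than double
        if scale < 0.5:
            return "Box"              # reduced to less than half
        return "Bilinear"             # everything in between

    print(auto_strip_filter(scale=3.0, rotated=False, integer_offset_only=False))  # Cubic Mitchell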

Outliner

In the Outliner, you can now double-click on a collection icon to select all its children. An Expand/Collapse All entry has been added to the context menu. This expands or collapses the entire hierarchy. Previously, this option was only available via the Shift A shortcut. Previously, it was not possible to apply modifiers to objects in the outliner; an entry has also been added to the context menu for this.

When you look through the camera, a new gizmo with a padlock icon appears in Blender 4.1. This allows you to switch the Lock Camera to View option on and off, which was previously only possible in the View tab in the sidebar.
Lock Camera to View is now a gizmo

Companies normally collect usage data from the users of their software in order to improve the user interface. This could be heatmaps that show where users click particularly frequently, or simply statistics on which functions are used and which menus are visited how often. Anyone who has data-protection concerns here is on the right track; this is also the reason why the Blender developers do not collect any such data. Instead, the development of the interface, like the rest of Blender, follows the open source approach, which means mock-ups, demo implementations and constant discussion between programmers and users. The process can feel slow, but it is the price of privacy. One example is the Lock Camera to View feature, which makes the camera follow the user's movements in the 3D viewport. This allows a camera to be positioned in the same way as you would otherwise navigate the 3D viewport, which is why the function was particularly popular among beginners, if they knew about it at all, because it was located in the View tab of the sidebar, which is hidden by default. Quite deep in the interface for such a frequently used function. And so the idea of introducing another viewport gizmo came up years ago. It appears when you look through the camera and has a padlock as its icon. This small but useful change has now finally found its way into Blender 4.1.

Blender comes with its own file browser. In Blender 4.1, meta information such as the Blender version used to save a project or the resolution of images and frame rate of videos are now displayed in the tooltip.
UI detail improvements

Tooltips in the file browser now show the Blender version with which a file was saved, as well as metadata such as resolution for images or frame rate for video files. The tooltips are also displayed in the Open Recent menu, where the preview image can be found too. While you are working on a project, Blender automatically saves to the temporary directory every two minutes by default. If Blender crashes, you can restore your project via File -> Recover -> Auto Save and continue working. Previously, it could happen that you saved your project manually and Blender saved it again immediately afterwards via autosave. In Blender 4.1, the autosave timer is now reset every time you save manually.

On the left the colour picker in Blender 4.0, on the right in Blender 4.1. The selected colour and brightness are displayed directly in the cursor, which makes them easier to read.

With the colour picker, the selected colour and brightness are now displayed directly in the respective cursor, making it easier to read. In addition, many other details have been added to the interface, from optimising the rounding of the corners of pop-up and conventional menus, to higher quality shadows for these menus, to the animation markers, whose line is no longer drawn by the marker itself. The text that is used as default when adding a text object is now translated into the language in which the interface is used. So if you have set your interface to Spanish, you will now be greeted by “Texto” when you add a text object.

Import and export via drag and drop

External files in the formats Alembic, Collada, OBJ, OpenUSD, PLY and STL can now be imported into Blender using drag and drop. The attentive reader will notice that these are formats whose exporters and importers are implemented not in Python but natively in C/C++. STL was added to this group in Blender 4.1 and should now work three to ten times as fast as the previous Python implementation, which will still be supplied for a few versions but will be removed from Blender in the long term. In future versions of Blender, drag-and-drop support will also be added for formats whose import and export are implemented in Python. This will be made possible by a new callback, which also gives developers of external add-ons the opportunity to implement drag and drop.
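
In pipeline scripts, the new native importers are available as their own operators. A hedged example for STL, assuming the operator names bpy.ops.wm.stl_import (the new C++ importer) and bpy.ops.import_mesh.stl (the legacy Python add-on) as currently shipped:

    import bpy

    # New, faster STL importer (native implementation, Blender 4.1)
    bpy.ops.wm.stl_import(filepath="/tmp/part.stl")

    # Legacy Python importer, still bundled for a few versions but slated for removal:
    # bpy.ops.import_mesh.stl(filepath="/tmp/part.stl")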

USD & Co

The exporter for Universal Scene Description (USD) now supports armatures and shape keys, while the importer supports point-based instancing of objects, collections and USD primitives. These are loaded as a point cloud object with a Geometry Nodes setup containing an Instance on Points node. The import can also be extended using Python hooks, making it easier to integrate Blender into in-house pipelines. The import and export of Stanford PLY files now supports custom vertex attributes, and when exporting to OBJ format, objects whose shading is set completely to Flat or Smooth are exported between 20 and 40 per cent faster.

News on the glTF front

The glTF exporter can now optionally optimise the generated files for display with OpenGL using gltfpack, reordering the mesh data in such a way that memory consumption and draw calls are minimised. UDIMs are not supported by glTF, so they are now split during export, with each tile receiving its own material. Unused images and textures can now optionally be exported anyway, for example because they will be needed later in an interactive application, and anisotropy is now supported for materials.

Bake bake Geonodes

Geometry nodes now allow intermediate results from node groups to be saved via baking. Previously, baking support was only available for the Simulation Zone. Data is now better deduplicated in the cache, which means that the file size should be significantly smaller in some cases. The caches should no longer be lost after an undo and volumes can now also be baked. The auto-smooth option for meshes has been replaced by a modifier node group asset. At the same time, you now have full control over the custom normals of a mesh in the geometry nodes.

Growth in the Geometry Nodes

The new Active Camera node returns the currently active camera, the Index Switch node allows you to select any input via an index and the Sort Elements node can be used to redefine the vertex order of a mesh. Split to Instances can be used to split a mesh into individual parts based on an ID and the Blackbody node known from the Shader Editor is now also available in Geometry Nodes.

New rotations step by step

There is a new Rotate Rotation node for rotations, which replaces the Rotate Euler node and is easier to use. This is part of the gradual introduction of the new Rotation Socket, which has been introduced in Blender 4.1 for the following nodes: Distribute Points on Faces, Instance on Points, Rotate Instances, Transform Geometry, Object Info and Instance Rotation.

With the Menu Switch node, it is now possible to create drop-down menus for custom-built geometry node assets.
Home-made Geometry Nodes

One of the design goals of Geometry Nodes in Blender is that users should be able to recreate high-level nodes entirely with built-in tools. Until now, however, this was only possible to a limited extent, as some nodes work with drop-downs, a control element that you could not yet build yourself. In Blender 4.1, it is now possible to define your own drop-down menus via the Menu Switch node, which finally closes this gap.

Conclusion

Blender 4.1 offers detailed improvements across the board. A successful intermediate release; for the grand finale in the form of Blender 4.2 LTS, we are still waiting for Eevee Next.

Blender: An upgrade for our particle system – We lay pipes!
https://digitalproduction.com/2024/01/02/blender-an-upgrade-for-our-particle-system-we-lay-hoses/
Tue, 02 Jan 2024
In issue 23:04|05 we learnt how to create a particle system with the new Simulation Nodes in Blender 3.6. In Blender 4.0, an interesting new function has been added that allows us to connect a series of points via curves. That would be a nice feature upgrade for our custom build. In addition, Cycles can now do light linking, which allows us to set the scene perfectly.

There are features in Blender that users have been waiting decades for. Light linking is one of them, even if users of other programmes find that hard to believe. For Blender users, however, this really is a new feature that Cycles has been given. We want to try it out together with the particle system from Simulation Nodes, which we built in issue 23:04|05. But first we will put another new feature of Blender 4.0 to use, namely the ability to connect points with curves. The result adorns the cover of this issue.

No longer quite (so) tight

First download the result of the Simulation Nodes workshop and open the file: is.gd/simstrings. If you start the animation with the space bar, you will notice how close together the particles are. Later, a tube will be laid through each of these particles, which at this density would just produce a single lump. Therefore, first reduce the density in the modifier panel to 100.

String of pearls: By no longer changing the seed from the “Distribute Points on Faces” node in each frame, our particle system takes on the appearance of strings of pearls.

The seed has to go

The particles are now much less dense. So that we can make threads out of them later, they should not appear randomly on the surface of the object but always in the same place. This lets us create a thread-like look even without the conversion to curves. Go to the Geometry Nodes workspace and make sure that the "Fire Particle System" node tree is open in the Geometry Node editor. Look for the "Distribute Points on Faces" node at the bottom left of the node tree and remove the connection in the Seed socket. If you now play the animation, the particles look like strings of pearls that slowly disintegrate. Alternatively, you can expose the seed as a parameter in the modifier panel by dragging it into the empty socket of the Group Input node.

Points to curves

At the other end of the node tree, the generated points flow out of the Simulation Output. At this point, we can convert them to curves. Add a new node Points -> Points to Curves and connect it to the Geometry output of the Simulation Output node and the Geometry input of the Group Output node. Threads now appear in the viewport instead of points. We leave the Set Material node in the tree; we will use it later to display the particles as well.
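
The same step can be scripted. A rough sketch, assuming the node tree is named "Fire Particle System" and that the node and socket identifiers (GeometryNodePointsToCurves, the 'Points' input and 'Curves' output) match the Blender 4.0 bpy API:

    import bpy

    ng = bpy.data.node_groups["Fire Particle System"]
    nodes, links = ng.nodes, ng.links

    sim_out = next(n for n in nodes if n.bl_idname == 'GeometryNodeSimulationOutput')
    group_out = next(n for n in nodes if n.bl_idname == 'NodeGroupOutput')

    # Insert a Points to Curves node between the simulation zone and the group output
    p2c = nodes.new('GeometryNodePointsToCurves')
    links.new(sim_out.outputs['Geometry'], p2c.inputs['Points'])
    links.new(p2c.outputs['Curves'], group_out.inputs['Geometry'])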

Curves: The Points to Curves node can be used to connect the particles to curves. This turns the particles that were emitted in a frame into a curve.

Curves to meshes

The curves now appear in the viewport, but not yet in the render, as they do not yet have a surface. This is handled by the node Curve -> Operations -> Curve to Mesh. Place it between Points to Curves and Group Output. The curve has now become a mesh, but one consisting only of individual edges. For a proper surface, we need another curve as the profile. Click on the Profile Curve socket and drag out a new connection. A search field appears when you release the mouse pointer. Search for a circle here; entering "ci" is enough, and thanks to type-ahead find, Curve Circle -> Curve already appears as the second entry, which you select. A new node now appears, creating a circle that acts as the profile for the curves created from the particles.
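
Scripted, the profile setup looks roughly like this; node and socket names are assumed from the current bpy API, and the node tree name matches the sketch above:

    import bpy

    ng = bpy.data.node_groups["Fire Particle System"]
    nodes, links = ng.nodes, ng.links

    curve_to_mesh = nodes.new('GeometryNodeCurveToMesh')
    profile = nodes.new('GeometryNodeCurvePrimitiveCircle')   # the circle used as tube profile

    # The curves coming from Points to Curves go into 'Curve',
    # the circle is wired into 'Profile Curve', the result comes out of 'Mesh'.
    links.new(profile.outputs['Curve'], curve_to_mesh.inputs['Profile Curve'])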

Resolution: We use the Curve Circle node to create an envelope for our curves. We bring their resolution to the outside in the modifier UI on the right of the screen.

Resolution

There is now a lot going on in the viewport, as suddenly a huge amount of geometry wants to be displayed. You can put a stop to this by reducing the resolution in the Curve Circle node to eight. But perhaps you want to reduce the resolution a little further while experimenting and then increase it again for the final rendering? Switching to the Geometry Nodes workspace each time and searching for the right node in the Node Editor is not the most convenient way to do this. It is therefore a good idea to expose this parameter in the modifier UI. Drag out a new node connection as you have just done and search for Group Input. A Group Input node appears, in which all sockets are hidden except for the newly created Resolution. The Blender interface is full of little surprises that make everyday work easier.
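
Exposing the setting can also be done in Python via the node tree interface introduced in Blender 4.0. A sketch, with the interface call (interface.new_socket) and socket type assumed from that API:

    import bpy

    ng = bpy.data.node_groups["Fire Particle System"]

    # Create a new integer group input that shows up in the modifier panel
    res = ng.interface.new_socket(name="Resolution", in_out='INPUT',
                                  socket_type='NodeSocketInt')
    res.default_value = 8

    # Wire it into the Resolution input of the Curve Circle node
    group_in = next(n for n in ng.nodes if n.bl_idname == 'NodeGroupInput')
    circle = next(n for n in ng.nodes if n.bl_idname == 'GeometryNodeCurvePrimitiveCircle')
    ng.links.new(group_in.outputs["Resolution"], circle.inputs['Resolution'])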

Node Group Assets

The curves currently have a sharp kink at each particle, which has a negative effect on shading and may not be the style everyone wants. We need something to round them off, like the Subdivision Surface modifier does for meshes. Such a tool is now supplied in Blender as a node group asset with the new hair assets. We take advantage of the fact that hair and curves are almost the same thing in Blender; the corresponding assets actually all work on curves, so we can also use them with our setup. Add a node Hair -> Deformation -> Smooth Hair Curves and place it between Points to Curves and Curve to Mesh.

The wild curves

The result looks pretty wild; none of the curves is in its place any more. This is due to the Preserve Length setting. Switch it off and our threads are all back in the right place, albeit slightly rounded. You can use the Iterations value to determine how strong the effect should be. One iteration was enough for our cover; the more you use, the more rounded the curves become.

Particles: We can use Smooth Hair Curves to round off the threads slightly and achieve a much softer shading. We can use Join Geometry to blend in the particles again.

Bringing back the particles

Since we are already simulating the movement of a particle system for the threads, we can use the particles at the same time by displaying them as points again. This is why we did not delete the Set Material node earlier in the article, but merely disconnected it. Add a node Geometry -> Join Geometry between Curve to Mesh and Group Output. Also connect the output of the Set Material node to the Join Geometry node. The particles now appear as point objects in the viewport and should already have the appropriate material in the render preview.

Steel pipes: The material for the threads must be set in the Geometry Nodes; for clarity, we give it its own material slot. With a metallic value of 1.0, the curves are no longer lit by the environment, because the environment is invisible to glossy shaders in our start file.

Steel tubes

To give the threads a material as well, add another material slot to the Particle Nodes Container object and create a new material there with Metallic at 1.0. However, it must first be assigned in the Geometry Nodes so that it also appears on the tubes. Duplicate the Set Material node and place it between Curve to Mesh and Join Geometry. The threads should now appear very dark again. This is due to a special feature of the source file: in it, the world is invisible to glossy shaders, which produces an interesting effect, as the points in the shader have a diffuse component and therefore appear as if they themselves are glowing, although not as evenly as with a genuine emission shader.

Light Linking

The fact that the world is not visible in shiny, reflective surfaces is a simple version of light linking and has been present in Cycles from the very beginning. However, this is shader-based and therefore very coarse. In Blender 4.0, it is now possible for the first time to limit the influence of light sources to objects in a collection. We would now like to use this to illuminate the threads with two area lights from the left and right, lighting only the threads, particles and logo, but not the floor. To do this, create a new collection in the Outliner and drag the DP logo and the Particle Nodes Container into it.

Light Linking is somewhat hidden in the Shading Panel in the Object Properties.

Orange and Teal

Then create another new collection and place two area lights in it; set their shape to Rectangle in the Object Data Properties and Size X to 3.0. Align the two area lights so that they shine on the scene from the left and the right, and give them two contrasting colours, e.g. the famous combination of orange and teal. You should make the cold light source much stronger than the warm one, e.g. 200 watts versus 50 watts.

Still well hidden

To restrict the illumination of the two area lights to the logo, particles and threads, select, for both lights, the collection containing these three objects in the Object Properties, in the Shading panel under Light Linking. The floor is now no longer lit, which draws the viewer's attention to the particle action.
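
The same restriction can be set from a script. A sketch, assuming the Blender 4.0 light-linking API exposes a light_linking.receiver_collection property on objects; the collection and light names below are hypothetical and need to match your scene:

    import bpy

    targets = bpy.data.collections["Particle Setup"]      # hypothetical: logo, particles and threads

    for light_name in ("Area Left", "Area Right"):         # hypothetical names of the two area lights
        light_obj = bpy.data.objects[light_name]
        # Only objects in this collection receive light from this lamp
        light_obj.light_linking.receiver_collection = targets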

Fade-in also for threads

There is, however, one last detail worth mentioning. The particles fade in our example file because we built our particle system in the last workshop so that it stores the age of each point as a value between 0.0 and 1.0. We can also access these values for the threads. In other words, the curves can be faded in and out as well.

Mastered with flying colours

Go to the Shading workspace and select the Particle Nodes Container object. Select the Particles Fading Out material in the material slots and select the three connected nodes Attribute, Invert Colour and Colour Ramp in the Shader Editor. Copy them using Ctrl+C, then select the material that you have given to the threads and press Ctrl+V in the Shader Editor. Now that you have copied the nodes, connect the output of the Colour Ramp node to the Alpha socket of the Principled BSDF. The curves now fade in and out, and you have mastered the technical part of the workshop with flying colours.
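
If you would rather build the fade into the thread material via Python than copy nodes by hand, here is a sketch of the same Attribute, Invert Colour and Colour Ramp chain feeding the Alpha input. The node identifiers come from the bpy shader API, the material name is hypothetical, and the attribute name "age" matches the output attribute defined in the earlier workshop:

    import bpy

    mat = bpy.data.materials["Steel Tubes"]        # hypothetical name of the thread material
    mat.use_nodes = True
    nodes, links = mat.node_tree.nodes, mat.node_tree.links

    principled = nodes["Principled BSDF"]

    attr = nodes.new('ShaderNodeAttribute')
    attr.attribute_name = "age"                    # normalised particle age, 0.0 to 1.0

    invert = nodes.new('ShaderNodeInvert')         # Invert Colour
    ramp = nodes.new('ShaderNodeValToRGB')         # Colour Ramp

    links.new(attr.outputs['Fac'], invert.inputs['Color'])
    links.new(invert.outputs['Color'], ramp.inputs['Fac'])
    links.new(ramp.outputs['Color'], principled.inputs['Alpha'])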

Let off steam

Now it’s time to let off steam. For the cover image, I have changed the direction of movement of the particles upwards. You can also change the direction in the Vector Add Node in the Simulation Zone. And at this point, think about how you could make the structure even more user-friendly. For example, by also exposing the direction of movement. Or adding an auxiliary object that specifies the direction and “wind force”?

RADiCAL Motion Capture: The finished effect is particularly suitable for visualising movements; this example uses the RADiCAL service for motion capture with a smartphone.

Conclusion and outlook

Blender 4.0 brings some new features, including the eagerly awaited light linking and new Geometry Nodes; both were combined in this workshop. But there are ways to go further. For example, the threads are currently generated anew at each frame, like tangles; with a few more nodes, they could be displayed like a string of particles or like growing hair, more or less perpendicular to the current direction of movement. This method is particularly suitable for motion capture recordings, as it allows a motion path to be created for any point on a character. I used this method for the Udon asset for the RADiCAL Blender add-on. RADiCAL is a service for extracting motion data from simple video recordings or livestreams.

For those who want to familiarise themselves with geometry and simulation nodes on site, there will be a series of workshops at this year’s Blender Summer School, which takes place from 26 to 28 July in Mannheim. Impressions from last year and the registration for next year can be found here: blender3dschool.de

Your own particle system with simulation nodes
https://digitalproduction.com/2023/08/10/your-own-particle-system-with-simulation-nodes/
Thu, 10 Aug 2023
The Digital Production logo has been through a lot recently. It had a wet towel thrown at it in Houdini and was turned to earth in Tyflow. And now it is set on fire in Blender - where will it end?

There are features in Blender that users have been waiting decades for. Until now, a particle system rewritten from scratch was one of them. In Blender 3.6, an alternative to the outdated legacy particles is finally available: the Simulation Zone in the Geometry Nodes. As a side effect, not only particles can be simulated, but also cloth, soft bodies and more. However, you still have to build the simulation yourself from scratch, and that is exactly what we are going to do in this article, using a particle system as an example.

Burn, logo, burn!

As a concrete example, an object is to burst into flames. The fire simulation in Blender has its quirks, so we use an old-school particle simulation for a stylised fire effect instead. The Digital Production logo serves as the example object, but you can use any other 3D object.

What actually is a Simulation in Blender?

In Blender, all processes whose state in a frame depends on the state in the previous frame count as a simulation. This classically includes particles, cloth, soft bodies, rigid bodies, fire, water, smoke and the Blender speciality "Dynamic Paint". On the other hand, there are tools such as the modifiers from the "Modify", "Generate" and "Deform" categories as well as Geometry Nodes, where each frame is independent of the others.

Simulation Zone

In Blender 3.6, you can now set up a so-called "Simulation Zone" in these Geometry Nodes, which is the area in which the simulation runs. You can visualise it as follows: at the input of the simulation zone you feed in data. This data is read once and remains inside the zone from then on. Processed data can be output; it is then fed back into the simulation zone via the input in the next frame and can be processed further there. This can be the position of a particle, but also its size or any other property that can be accessed with Geometry Nodes. Changing the size over the lifetime of a particle, for example, was previously not possible at all. Thanks to Geometry Nodes, we now have almost complete freedom when it comes to the structure of particle systems. However, this freedom comes with the requirement that you have to build everything yourself. On the Geometry Nodes side, Blender 3.6 only offers the simulation zone and associated features such as baking, but not yet any high-level tools such as emitters or force fields. These will probably be delivered in the future as node-group assets, similar to the hair assets that found their way into Blender 3.5. Until then, manual work is the order of the day.

Create a new scene and leave the default cube


Leave the default cube alive for a change. It should act as a container for our particle system. It is best to give it a suitable name such as “Particle Nodes Container” using the shortcut F2. Then switch to the Geometry Nodes workspace and click on “New” to create a new node tree. Assign a suitable name here too, such as “Fire Particle System”.

My first particle system: the “Hello World” of particle systems, so to speak. Particles are distributed on the faces of the cube, which fly upwards in the following frames thanks to the offset in the set position node.

Enter the Zone

Use “Shift A – Simulation – Simulation Zone” to create a sub-zone in which the simulation will take place later. This is highlighted in burgundy and has its own input and output. If nodes are interposed, the highlighted area becomes larger. Nodes that are located within it have access to simulation data and are themselves part of the Simulation. Nodes from outside can be connected to the nodes in the zone, but then have no access to the simulation themselves, which will prove to be practical later on.

Distribute points on surfaces

A particle system is based on points, so our first task is to add them. For now, the geometry of the default cube will serve as the emitter. Add a “Point – Distribute Points on Faces” node and place it between the geometry inputs and outputs of the simulation zone. Nothing should happen yet, however, as the simulation zone is not yet connected to anything. Drag the geometry output of the simulation output node to the geometry input of the group output node and the geometry input of the group input node to the input of the simulation input node with the same name. If the playhead in the timeline is set to frame 1, points should now appear in the viewport.

Set in motion

However, the dots are not yet moving, i.e. we have a particle system but not yet a simulation. A node that changes or updates the position of the particles in each frame is still missing. Add a “Geometry – Write – Set Position” node and place it between the Points output of the Distribute Points on Faces node and the Geometry input of the Simulation Output node. Under “Offset”, set the value for Z to 0.1. If you now start the animation from frame 1, the particles move upwards at a constant speed, as 0.1 is added to the Z position in each frame.

My better particle system: With just a few nodes, we were able to create a particle system with animatable emission.

My first particle system

Congratulations, you have just created your first own particle system with the Blender Simulation Nodes. It consists of an emitter that distributes particles on the surfaces of the input object. These particles are shifted upwards by a constant factor in each frame. The structure corresponds to a legacy Blender particle system in which the start and end of the particle emission fall on the same frame.
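
For readers who prefer scripting, the same minimal setup can be reproduced with bpy. This is only a sketch that mirrors the node wiring described above: it assumes the node group "Fire Particle System" was already created via "New" in the Geometry Nodes workspace (so it has its default group input and output), that a freshly added Simulation Output node carries the default Geometry state item, and that pair_with_output is the call that pairs the zone's input and output nodes.

    import bpy

    ng = bpy.data.node_groups["Fire Particle System"]
    nodes, links = ng.nodes, ng.links

    group_in = next(n for n in nodes if n.bl_idname == 'NodeGroupInput')
    group_out = next(n for n in nodes if n.bl_idname == 'NodeGroupOutput')

    # The simulation zone: an input/output pair that carries state from frame to frame
    sim_out = nodes.new('GeometryNodeSimulationOutput')
    sim_in = nodes.new('GeometryNodeSimulationInput')
    sim_in.pair_with_output(sim_out)                            # assumed pairing call

    distribute = nodes.new('GeometryNodeDistributePointsOnFaces')
    set_pos = nodes.new('GeometryNodeSetPosition')
    set_pos.inputs['Offset'].default_value = (0.0, 0.0, 0.1)    # constant upward drift per frame

    links.new(group_in.outputs['Geometry'], sim_in.inputs['Geometry'])
    links.new(sim_in.outputs['Geometry'], distribute.inputs['Mesh'])
    links.new(distribute.outputs['Points'], set_pos.inputs['Geometry'])
    links.new(set_pos.outputs['Geometry'], sim_out.inputs['Geometry'])
    links.new(sim_out.outputs['Geometry'], group_out.inputs['Geometry'])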

Influence from outside

Another typical way of emitting particles is recurring emission over several frames, a kind of inflow object. This is also the default behaviour of the legacy particle system. With Simulation Nodes, we have to regularly add new particles from outside the simulation zone. This requires an additional object: from this point on, the previous cube becomes the container of the particle system and another object takes on the role of the emitter.

In our example, we use the DP logo, but you can use any mesh objects. Add an “Input – Scene – Object Info” node. This has an orange input socket. Connect it to the empty socket of the group input node. An input field for objects now appears in the modifier. You can name it by opening the sidebar in the Node Editor with the N key and entering a suitable name such as “Emitter Object” in the Group tab under “Inputs”. You can even define a tooltip here.

Degraded to a mere container

Switch the object info node to “Relative” so that the points also appear in the correct position later if you move, scale or rotate the object. Then connect the geometry output to the mesh input of the Distribute Points on Faces node and disconnect the geometry input of the Simulation Input node. This cuts the connection to the original geometry of the cube; it is now just a container for the simulation.

Union

Add an object of your choice to the scene and select it in the Geometry Nodes modifier of the container. If you now play the animation from frame one using the space bar, particles will again only be emitted once. In order for the emitter to emit particles permanently, the newly added points must be merged with the existing ones in each step.
Add a "Geometry – Join Geometry" node and place it on the connection between "Distribute Points on Faces" and "Set Position". The Join Geometry node has a slightly elongated input socket, which indicates that any number of connections can be plugged in here. Connect the Geometry output of the Simulation Input node to it.

Randomness at any time

If you now start the simulation from frame one, you will see a stream of particles from the emitter. But they still look like threads because they are generated from exactly the same position on the surface of the object at each frame. However, we need a different distribution in each frame so that it looks like particles are being emitted from the entire surface. Add a node “Input – Scene – Scene Time” and connect the frame output to the seed input of the Distribute Points on Faces node. Also connect the Density input of the node to the empty socket of the Group Input node in order to be able to control the emission density from outside.

Animated particle emission

If you now play the animation, you will not only see a stream of particles flying away from your object, you can even animate how many particles the emitter generates per frame. This was previously not so easy to do with the legacy particle system in Blender. This is where the strength of the Simulation Nodes comes into play, because you no longer have to worry about such limitations.

For life

Another feature of particle systems is the option of giving each particle a lifetime and reading out its current age. In the simulation nodes, we achieve this by setting an “age” attribute for each point at birth, which is then incremented by one in each frame. Add a node “Attribute – Capture Attribute” between Distribute Points on Faces and Join Geometry. A float with the value 0.0 is now assigned to each point when it is created. Connect the attribute output to the empty input socket of the simulation output node. A corresponding output now appears at the simulation input node.

Marry

Just as with the Join Geometry node for the points, we also need a way to “marry” the age of the existing particles with that of the newly added ones. Add a “Utilities – Math – Math” node; it is already set to the correct “Add” operation by default. Connect the attribute output of the Simulation Input node and the attribute output of the Capture Attribute node to its two inputs: on old points the freshly captured value is zero and on new points the carried-over age is zero, so adding the two fields merges them. Now all particles have the attribute and it is looped through the simulation. However, we are not yet counting up. Duplicate the Add node, place it between the existing Add node and the Simulation Output node and set the value in its lower input to 1.0. Now one is added to the age with each frame.
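The bookkeeping the two Add nodes perform can be modelled roughly in plain Python, with dictionaries standing in for point attributes. This is only an illustrative sketch of the logic, not the node network itself.

```python
particles = []                                     # each particle is a dict with an "age" attribute
for frame in range(1, 51):
    new_points = [{"age": 0.0} for _ in range(5)]  # Capture Attribute: newborn points start at 0.0
    particles += new_points                        # Join Geometry merges old and new points
    for p in particles:
        p["age"] += 1.0                            # second Add node: age grows by one per frame

print(particles[0]["age"], particles[-1]["age"])   # oldest particle is 50 frames old, youngest is 1
```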

Age and lifetime: A new lifetime factor has been added to the particle system. If this is exceeded, the corresponding points are removed from the simulation. A corresponding output attribute has been added so that the shading can later be influenced based on the age of the particles.

Age vs. lifetime

So far, the age has had no effect. We can use it to make particles die, i.e. disappear, after a certain time in frames. Duplicate one of the Add nodes and place it in a free area inside the simulation zone. Connect its upper input to the second Add node and change the operation to “Greater Than”. Then use “Input – Group – Group Input” to add another Group Input node and connect the Threshold input of the Greater Than node to its free socket. Name the new parameter “Lifetime”; a default value of 50.0 makes sense here. Add a “Geometry – Operations – Delete Geometry” node and place it between the Geometry output of the Set Position node and the Geometry input of the Simulation Output node. Connect the Selection input to the Value output of the Greater Than node. From now on, all particles that are older than their lifetime will be removed from the simulation. If you play the animation now, the particles disappear again from frame 50 onwards.
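The Greater Than and Delete Geometry pair is simply a per-frame filter. In plain Python the same rule looks like this; the 5-particles-per-frame emission and the age bookkeeping are hypothetical stand-ins as in the earlier sketches.

```python
LIFETIME = 50.0                                     # "Lifetime" group input

particles = []
for frame in range(1, 201):
    particles += [{"age": 0.0} for _ in range(5)]   # emission + Join Geometry
    for p in particles:
        p["age"] += 1.0                             # age grows by one frame
    # Greater Than + Delete Geometry: drop everything older than its lifetime
    particles = [p for p in particles if not p["age"] > LIFETIME]

print(len(particles))                               # the count settles at 5 * 50 = 250 particles
```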

Normalisation

We can also use the age of the points as an output attribute so that we can later colour the particles differently in Cycles depending on their age. As Cycles likes to be fed with values between 0.0 and 1.0, we should normalise the age to this range beforehand. First connect the attribute output of the Simulation Output node to the free socket of the Group Output node. If you now open the Output Attributes panel in the modifier, you will see an empty field. Here you can later give the attribute a name so that you can access it in the shader, e.g. “age”. It will then also appear as a separate column in the Spreadsheet Editor. You can change the label of the field and the tooltip again in the Group tab of the sidebar of the Node Editor, for example to “Age”.
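The article leaves the exact normalisation open; one simple option (an assumption on my part, not prescribed above) is to divide the age by the Lifetime parameter and clamp the result, so the shader always receives a value between 0.0 and 1.0.

```python
def normalized_age(age, lifetime=50.0):
    # assumed mapping: age in frames -> the 0.0 .. 1.0 range expected by the shader
    return min(max(age / lifetime, 0.0), 1.0)

print(normalized_age(0.0), normalized_age(25.0), normalized_age(80.0))   # 0.0 0.5 1.0
```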

Force fields

The second component we would add to a legacy particle system is force fields to control the movement of the particles. In the simulation nodes, this is done within the simulation zone via vectors that drive the Offset input of the Set Position node. In our case, these are two components: a kind of wind that blows the particles in a desired direction, and a field for swirling. We can simplify the wind considerably by assuming constant movement in one direction; if we give it a Z component, we have also integrated buoyancy.

The finished particle system: Once the noise texture has been integrated as a turbulence field and the material has been set, the particle system is complete.

Drift

Connect the Offset input of the Set Position node to the free socket of the Group Input node and name the newly created input “Wind Force”. Values of 0.05, 0.01 and 0.025 cause the particles to drift gently and slightly backwards.

Swirl

We need a second force to swirl the particles. We can extract this from a noise texture. This is because the RGB colours of the texture can also be interpreted as XYZ values of a vector. These must be merged with the previous forces, again using maths. Add a “Utilities – Vector – Vector Math” node and place it between the wind force socket of the Group Input node and the offset input of the Set Position node. Click on the lower, free vector input of the add node and drag the mouse to a free position. There should be a plus symbol next to the mouse cursor. If you now release the mouse, a search field will appear. Enter “Noise” there and select “Noise Texture – Colour” from the search results.

Adjustments

A noise texture appears whose colour output is directly connected to the vector input of the Add node. If you now play the animation, the particles shoot off diagonally. This is because the noise texture only outputs positive values between 0.0 and 1.0 for each channel: the output is meant to be a colour, and colour channel values in Blender are defined in the range 0.0 to 1.0.

Negative

However, the particles should move in all directions, even in the negative direction of an axis. To achieve this, we subtract 0.5 from all channels, so the range becomes -0.5 to 0.5. Duplicate the Vector Math node, place it between the colour output of the Noise Texture node and the existing Vector Math node (which is currently set to “Add”), and set the operation of the new node to “Subtract”. Enter 0.5, 0.5 and 0.5 in the lower vector field.

Buzz

If you now play the animation, you will see quite a hustle and bustle: the turbulence caused by the noise texture is still much too strong. Duplicate a Vector Math node again, place it between Subtract and Add and set the operation to “Scale”. Set the lower input to 0.2 and connect it to the free socket of the Group Input node. Name the new input parameter “Turbulence Strength” and view the animation. Set the Lifetime to 100 and the particles will now be swirled around by the noise texture.
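Put together, the offset fed into Set Position is the wind plus a re-centred, scaled noise vector. The sketch below models that chain in plain Python; noise3 is only a cheap illustrative stand-in for Blender's Noise Texture, not its actual algorithm.

```python
import math

def noise3(p):
    # illustrative stand-in for the Noise Texture's colour output: three channels in 0.0 .. 1.0
    x, y, z = p
    return [0.5 + 0.5 * math.sin(7.1 * x + 3.3 * y),
            0.5 + 0.5 * math.sin(5.7 * y + 1.2 * z),
            0.5 + 0.5 * math.sin(6.3 * z + 2.9 * x)]

WIND = [0.05, 0.01, 0.025]            # "Wind Force" group input
TURBULENCE_STRENGTH = 0.2             # "Turbulence Strength" group input

def offset_for(position):
    n = noise3(position)
    swirl = [(c - 0.5) * TURBULENCE_STRENGTH for c in n]   # Subtract 0.5, then Scale
    return [w + s for w, s in zip(WIND, swirl)]            # Add the wind, feed into Set Position

print(offset_for([0.3, 0.2, 1.0]))
```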

Control

How coarse or fine the turbulence is can be controlled via the scale input of the noise texture node. Set it to 1.0 and also connect it to the Group Input node and name the parameter “Turbulence Scale”. If you now play the animation, the particles will flow as if you had added a turbulence force field to a legacy particle system and set the flow value to 1.0. However, this stream-like flow is not quite the way fire moves. The flames flicker and constantly change direction.

The fourth dimension

To simulate this effect with the Noise Texture, switch the dimensions drop-down of the Noise Texture node to “4D”. A new input “W” now appears. It can be used to continuously change the noise texture; in other programs the parameter is called “Evolution”, which really is a good name. To animate it, you do not need to set any keyframes. Instead, connect it to the Seconds output of the Scene Time node and the particles will wobble and flicker when the animation is played.
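The effect of the W input can be pictured as a fourth coordinate: the same XYZ position samples a different value as W (here driven by the Seconds output) increases. Again, this is only an illustrative stand-in, not Blender's actual noise.

```python
import math

def noise4(p, w):
    # stand-in for a 4D noise texture: the extra W coordinate shifts the field over time
    x, y, z = p
    return [0.5 + 0.5 * math.sin(7.1 * x + 3.3 * y + 1.7 * w),
            0.5 + 0.5 * math.sin(5.7 * y + 1.2 * z + 2.1 * w),
            0.5 + 0.5 * math.sin(6.3 * z + 2.9 * x + 0.9 * w)]

point = [0.3, 0.2, 1.0]
print(noise4(point, 0.0))   # at the start of the animation
print(noise4(point, 2.0))   # two seconds later: the same point, a different swirl
```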

Set material

Before you can move on to shading, you need to give the particles a material. To do this, add a “Material – Set Material” node between the geometry output of the simulation output node and the geometry input of the group output node. Select an existing material from the drop-down menu and edit the name and the shader in the next step.

Rendering in Cycles: The points generated by the simulation nodes cannot yet be displayed with Eevee, so we use Cycles as the render engine. The “age” attribute, whose name we assigned in the modifier, is used for colouring and must be entered in the Attribute node exactly as written there.

Rendering in Cycles

To render the particles, we need the Cycles render engine, as Eevee cannot yet display the points. In the Render tab of the Properties Editor, change the render engine to Cycles and switch to the Shading Workspace. Switch on the Render Preview in the viewport; the particles now appear as small spheres. In the Shader Editor, select the same material in the material drop-down that you have also assigned in the geometry nodes. Now you can also change the name, e.g. to “Particle Material”.

Cycles should recognise them by their name

Add a new “Input – Attribute” node, enter in the “Name” field the exact name you gave the age attribute, and connect the Fac output to the Base Colour input of the Principled BSDF node. The particles are now coloured in a gradient from black to white, depending on their age. The perfect input for a colour ramp.

Ramp

Insert a “Converter – Colour Ramp” node between Fac and Base Color. Set another stop by pressing the plus icon of the node and set the stop on the far left to a light, desaturated orange. Then set the value for “Value” in the colour wheel to 5.0. Now the particles reflect more light than hits them. A nice effect, which is not physically correct at all, but gives a little more detail than when using emission. Set the second stop to a rich red with a value of 2.0 and the last stop to pure black. Also set the interpolation in the dropdown in the top right-hand corner of the node to “Ease”.
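For intuition, a colour ramp is just a lookup: the normalised age picks a position on the gradient and the surrounding stops are blended. The sketch below uses plain linear blending between three stops; Blender's “Ease” interpolation is smoother, and the stop positions and RGB values here are assumptions for illustration only.

```python
def lerp(a, b, t):
    return [x + (y - x) * t for x, y in zip(a, b)]

# assumed stops: (position, RGB); channel values above 1.0 mimic the over-bright colours
STOPS = [(0.0, [5.0, 3.5, 2.0]),     # light, desaturated orange with Value 5.0
         (0.5, [2.0, 0.1, 0.05]),    # rich red with Value 2.0
         (1.0, [0.0, 0.0, 0.0])]     # pure black

def colour_ramp(t):
    for (p0, c0), (p1, c1) in zip(STOPS, STOPS[1:]):
        if t <= p1:
            return lerp(c0, c1, (t - p0) / (p1 - p0))
    return STOPS[-1][1]

print(colour_ramp(0.25))   # an age of 0.25 lands between the orange and the red stop
```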

Fadeout

The spheres are now coloured, but it would be nice if they fade out as if the flames were burning out or dissipating like smoke. Connect the Fac output of the Attribute node to the Alpha input of the Principled BSDF node. Now it looks as if the logo is smoking at the beginning and the smoke is turning into fire. For the fire to finally fade out, we need another colour ramp on this connection between Fac and Alpha. Set three stops again and the interpolation to “Ease”. The centre stop is given pure white and the right-hand stop pure black. The left stop is given a value of 0.1 so that the particles on the emitter are still slightly visible and virtually envelop it.

Artefacts

Black artefacts should now have appeared in the tips of the flames, depending on how many particles you use. The black spots are caused by the fact that Cycles only visits a certain number of surfaces, the so-called bounces. And every time a ray passes through one of the spheres, that’s two transparent bounces. Set the number of transparent bounces in the Light Paths panel of the Render Properties to 256. Now the flame tongues fade out cleanly.

Light and shadow

As the particles in our setup depend on light from outside to glow, it is worth loading an HDRI texture into the world. You can achieve the black background by opening the “Ray Visibility” panel in the world properties and unchecking “Camera” and “Glossy”. The latter is a small preparation for the next step.

Laying the floor

Add a plane to the scene and scale it by a factor of 50. Then add a new material and set the value for “Metallic” in the Principled BSDF node to 1.0. You can adjust the strength of the reflection by making the base colour lighter or darker. For an exact replication of the result, set the value of “Value” in the colour selection of the base colour to 0.5. Next, delete the light source that is still present and, if necessary, make the material of the emitter object darker so that it stands out clearly against the light “smoke”.

Bake a cake

After placing the camera, it is time to render. You can render both a still image and an animation. It is practical to save the simulation data so that it does not have to be recomputed over and over again. This process is called “baking” and, for the simulation nodes, can be found in the Physics tab of the Properties Editor. Open the Simulation Nodes panel there and click on “Bake”. All frames in the timeline are now simulated and saved, and no new simulation is necessary afterwards.

Outlook

This article was intended to provide an overview of how particle systems are structured in the new simulation nodes. From this basis you can go further. You could, for example, change the radius of the particles with age, a feature that is also not easily possible with the legacy particle system. Or you can modify the setup so that the time-dependent calculations take place in seconds instead of frames, making your system independent of the project's frame rate. To make the fire flicker even better, you could also modulate the emission with a time-varying noise texture. You can also make the result more realistic by giving the particles a very slight initial velocity along the normal of the emission object.

Conclusion

With the new simulation nodes in Blender 3.6, particle systems can be created whose capabilities exceed those of the existing legacy particle system. Thanks to the power of the geometry nodes, there is enormous potential ahead of you, but it still has to be realised: because there are no high-level nodes yet, you have to click together things like an emitter or a force field yourself. It is to be expected that numerous node groups and assets will appear in the future, both from the developers themselves and from the community.

Blender 3.5, Wuhu! https://digitalproduction.com/2023/03/29/blender-3-5-wuhu/ Wed, 29 Mar 2023 19:24:32 +0000 https://www.digitalproduction.com/?p=115750
Hey you, are you ready for the latest update of Blender? With Blender 3.5, the Blender Foundation brings an improved hair system and built-in hair assets, support for vector displacement maps, a viewport compositor and a new timing mode for Grease Pencil, improvements to the Pose Library and much more.
The new Blender splash screen by Nicole Morena

In future, Blender will rely on three instead of four updates per year

The Blender Foundation, the brains behind our favourite open-source tool, is planning a change to how new versions are released: in future it will focus on three releases per year instead of four. The change is based on the experiences of developers and users in recent years. The plan is to leave the software longer in the Bcon1 and Bcon2 phases, giving users more time to test the latest version of Blender. The hope is that this will give developers more time to write code, users more time to test, and studio TDs and add-on developers more time to maintain their tools. With these changes, the Foundation wants to streamline Blender's development and make work easier for users and developers alike. But now to Blender 3.5!

Upgrades for Hair and more

If you want to save time, check out Blender 3.5’s new library of hair assets. It’s a set of 26 pre-built Geometry Nodes setups that you can easily insert into your scene. These will help you simplify tasks like creating hair curves on a scalp surface, creating clumps and styling hair.

The new and improved curves-based hair system, built on Geometry Nodes, is one of the most notable new features of Blender 3.5. With this upgrade, you can quickly and easily create any type of hair, fur or grass. Best of all, you can load a library of hair assets directly into the software for easy drag-and-drop use.

New nodes for procedural modelling and image processing

With the new Geometry Nodes in Blender 3.5, you can now use the Edges to Face Groups node to find groups of faces surrounded by selected edges. In addition, there is now the Blur Attribute node, with which you can mix the attribute values of neighbouring elements. The new Image Input and Image Info nodes also expand your creative options for processing images.

Reduce noise in Cycles rendering with the new Light Tree

Blender 3.5 introduces the Light Tree, which helps Cycles to sample scenes with many lights more effectively. The result: less noise and faster rendering times. Unfortunately, this function is not yet available on AMD GPUs.

Facial modelling made easy with vector displacement maps

Support for Vector Displacement Map (VDM) brushes in Blender 3.5 is one of the latest features for digital sculptors and character artists. With just one click, you can now create complex shapes for noses, ears, horns and tails on your model. Unlike traditional displacement maps, VDM brushes can move the surface of the model in all three dimensions. Discover the possibilities!

Support for Vector Displacement Maps

Thanks to support for vector displacement maps, you can now create complex shapes with overhangs in just one brush dab using the Draw Brush in Sculpt mode.

Revamped 3D viewport and improvements for Blender Cycles and Pose Library

The 3D viewport has been polished and has been given a GPU-based compositor backend. The overlays are now displayed on the compositing result so you can see and interact with your mesh and other objects. Blender Cycles has been updated and can now also use a light tree to sample scenes with many lights more effectively. The Pose Library has also been improved with new options and shortcuts.

Upgrade for Grease Pencil

For all Grease Pencil fans, there is now a new Natural Drawing Speed timing mode in the Build modifier that replays strokes at the speed at which they were originally drawn.

And even more!

And that’s not all! Other new features in Blender 3.5 include the new timing mode for the Build modifier mentioned above, a new Ease operator in the Graph Editor and much, much more. If you want to know more, you can find the latest release here, and we’ll take a closer look at all the details in the next issue of DP!
