Oct 3, 2017

Painter Ball


Edited: Oct 23, 2017

This September update added example prefabs and scripts that use Painter to... well, paint stuff: a dirty ball, bullets, guts and such.


In the picture at the top, the left object is using the big Render Texture pair, and the right one an exclusive RenderTexture. You can paint on many objects with world-space brushes, but they need to have Render Textures.


To be exact, the new version of the Sphere Brush actually renders a capsule, not a sphere. So if you move a brush (or this painter ball) very fast, it will not render a series of circles, but a line between where it was last frame and where it is this frame. It uses the formula for the closest distance from a point to a line segment, nothing too difficult.
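That distance formula can be sketched in plain Python (the asset itself does this in a shader; this is just the math, with hypothetical names):

```python
def point_to_segment_distance(p, a, b):
    """Distance from point p to the segment a-b, where a and b are the
    brush's previous and current positions. Points are (x, y, z) tuples."""
    ab = tuple(b[i] - a[i] for i in range(3))
    ap = tuple(p[i] - a[i] for i in range(3))
    ab_len_sq = sum(c * c for c in ab)
    if ab_len_sq == 0.0:
        t = 0.0  # brush didn't move this frame: degenerates to a plain sphere
    else:
        # project p onto the line through a and b, clamp to the segment
        t = max(0.0, min(1.0, sum(ap[i] * ab[i] for i in range(3)) / ab_len_sq))
    closest = tuple(a[i] + t * ab[i] for i in range(3))
    return sum((p[i] - closest[i]) ** 2 for i in range(3)) ** 0.5
```

A pixel gets painted when this distance to the brush's travel segment is below the brush radius, which is exactly why a fast-moving brush leaves a capsule-shaped stroke instead of separate circles.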



Now you may be thinking: "Can I use it to render damage onto a character, like from a bullet?" The answer is yes, you can (video below). This is an early stage; the shader will be available with the next update.


Actually, it doesn't paint dirt and blood onto a texture; it just paints green and red into a mask: green for dirt, red for blood. The shader then uses that mask to combine the character texture with the gore and dirt textures. The mask is also sampled in the vertex function to indent the mesh. Ideally there would be another mask with normals and a maximum indent amount to really control the deformation.
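The per-pixel blend can be sketched in plain Python (the real thing is a fragment shader; the channel-to-layer mapping follows the description above, everything else is illustrative):

```python
def lerp(a, b, t):
    """Linear interpolation between a and b by factor t in 0..1."""
    return a + (b - a) * t

def shade_pixel(skin, dirt, blood, mask):
    """Combine one pixel's colors using the painted mask.
    mask is (r, g, b): the green channel mixes in dirt, the red channel
    mixes in blood. All colors are (r, g, b) tuples in 0..1."""
    out = tuple(lerp(skin[i], dirt[i], mask[1]) for i in range(3))   # green -> dirt
    out = tuple(lerp(out[i], blood[i], mask[0]) for i in range(3))   # red -> blood
    return out
```

The vertex function would sample the same mask and push vertices inward by some amount, which is why a second mask with normals and a maximum indent would give finer control.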


In terms of performance: painting with a sphere brush onto an Exclusive RenderTexture takes the same time as rendering a quad to a screen with the resolution of that Render Texture. It takes noticeably longer for a RenderTexture Pair. A regular (non-sphere) brush is much, much faster, just like rendering a quad. Swapping render targets can eat up some performance, though. But in general, performance is fine.


Painter ball currently modifies only one texture at a time, to avoid additional renders. But if you make your own painting scripts, be mindful that when you paint on a Texture2D using a RenderTexture target, the Texture2D is first rendered into the RenderTexture, and this has to happen again whenever you switch the paint target to another texture.
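The hidden cost of switching targets can be sketched as a tiny bookkeeping class (plain Python, names are mine, not the asset's API):

```python
class PaintTargetManager:
    """Sketch of why switching paint targets costs an extra render:
    painting on a Texture2D through a RenderTexture requires copying
    the Texture2D's pixels into the RenderTexture first, and again
    every time the target changes."""
    def __init__(self):
        self.current = None
        self.blits = 0  # counts Texture2D -> RenderTexture copies

    def set_target(self, texture):
        if texture != self.current:
            self.blits += 1  # the new target is rendered into the RT
            self.current = texture
```

This is why painting one texture at a time (as the painter ball does) stays cheap, while hopping between textures every stroke would pay a copy each time.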


What is a Render Texture Pair?


When using a Render Texture as the target, you can't read from it. A RenderTexture pair, however, contains two copies of the same texture, one of which is updated after every stroke, so almost all possible brush effects can be reproduced with a shader, because the fragment function can sample the destination color (from the copy).
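The idea behind the pair can be sketched as a small double buffer (plain Python with a 1-D "texture"; class and method names are illustrative, not the asset's):

```python
class RenderTexturePair:
    """Two copies of the same texture: strokes render into `target`
    while the fragment stage reads the previous result from `source`.
    After each stroke the result is copied back so both stay in sync."""
    def __init__(self, size):
        self.target = [0.0] * size  # written by the brush "shader"
        self.source = [0.0] * size  # readable as "destination color"

    def paint_stroke(self, index, color, opacity):
        # the "fragment function" can sample what was there before
        previous = self.source[index]
        self.target[index] = previous + (color - previous) * opacity
        # update the readable copy after the stroke
        self.source[index] = self.target[index]
```

With a single exclusive Render Texture there is no `source` to read from, which is exactly why blend-heavy effects need the pair.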


Thankfully, the Sphere Brush type doesn't need the pair, so it can paint on a single (exclusive) Render Texture as well as on the Pair. Only the Blit modes are limited: no Blur, Bloom or Volumetric Decals when using an exclusive Render Texture.


Thanks to Daniel for embracing the Painter tool and providing me with valuable feedback. He is using the RenderTexture Pair here, I think.






New Posts
  • Early concept: let's say there is a nice artsy picture, and its visuals need to be replicated in the scene. Normally it would take some artistic fiddling, but in theory a brush shader could use that fancy picture to paint the diffuse color needed to perfectly recreate the desired colors under the current lighting. So the artist sets up the lights and objects in the scene using the picture as reference. A special shader could project that picture onto the objects from a predetermined projection source (most likely the Main Camera). The same projection source would later be used to compute the color that needs to be painted onto the texture so that it shows the desired color when lit. There would need to be a special deconstructor brush for each shader, though.
  • I will explain the process: in Unity you can render to the screen, like you usually do, or to a Render Texture. Effects like Blur, Bloom and many others first render everything to a Render Texture, and then from it to the screen with some adjustments. If you create a Point Light and move it, the effect is similar to my Sphere brush, but of course the light doesn't stay and doesn't leave a line; it moves with the source. So what needs to be done is to render it onto a texture. You may have heard about shaders that change vertex positions, like when trees sway in the wind, or water: the mesh is not changed, but on the graphics card each vertex is moved. You may also know what UV coordinates are: they describe how a texture is wrapped around a mesh. So if we think of mesh and texture as a sort of origami, we can unwrap it into a flat square sheet by adding a few lines to the vertex function that replace each vertex's position with its UV coordinates. Now, the way a Point Light works is that it divides the brightness by the distance from the world position of a pixel to the world position of the light source. So before unwrapping our origami, we also pass the original world position along as an additional argument, so we can still calculate that distance. Then we use the described shader to render the unwrapped mesh, with its original texture and the "Point Light" on top of it, back into that very same texture. The last part may be difficult to imagine: crumple a sheet of paper into a ball, mess up one side of it, and straighten it out. The part you messed up is our point light. Now, the Sphere brush actually takes two points, not one, and draws a line between them, but those are just details. The code could be used if you want a long neon light stick or a lightsaber to cast light correctly, though.
  • I was asked about this asset. The simple answer is yes, it works as expected, but it takes a bit of know-how to set up. I will describe the process below; I was using NeoFur 2.1 for this test. Here is the short list of steps, but I recommend reading the whole thing to understand it: 1) Assign the Fur Material as an additional material on the Mesh Renderer. 2) Configure the Fur Material before pressing play (select where to use textures and where a single color). 3) In Playtime Painter, disable Auto Select Material, then select the Fur Material and the texture you want to edit. 4) Go for it. Now in detail: without understanding how materials, sub-meshes and both assets work, it's easy to get confused painting fur. NeoFur needs each object to have its own material, but to save you some time you provide one material and it clones it for every object. Playtime Painter can edit a texture, but to save you some time it lets you select a material and the textures to edit. The problem is that the Fur Material is not attached to the object's Mesh Renderer, so Painter can't just find it automatically. You could assign the Fur Material to the Mesh Renderer, and then Painter will let you select and edit its textures. But keep in mind that the material you assigned to the NeoFur script gets cloned, so any changes to the original instance of the Fur Material (the one assigned to NeoFur, and now to the Renderer) during play will not change how the fur looks, because the fur is rendered with a clone of that material. So you need to configure it before pressing play: after selecting where to use a single color and where a texture, the clones of the material will use the same textures, and in Playtime Painter you can select and edit them. The reason fur is not assigned to the object's Mesh Renderer by default is that fur does not simply swap one shader for another: you still have your object, with its original "skin" shader and material, and the fur is rendered on top of it.
To be able to see your "skin" material and edit the fur at the same time, don't replace the object's "skin" material. Add the Fur Material as another material on your Mesh Renderer: Unity lets you use many materials on meshes that are made of sub-meshes. There is nothing complex about sub-meshes; they are just a bunch of meshes stored as one, for usability. If you have five sub-meshes and five materials, each sub-mesh is rendered with its own material. Again, it's almost the same as having five meshes, each with a different material, stored under the same parent GameObject. There may be more to it; I just haven't encountered any nuances to be mindful of yet. Now, again, to save you some time, Playtime Painter automatically finds the sub-mesh you are painting on and selects its material, and instead of the Fur Material it will select the skin material. So disable Auto Select Material under the material selector. Because Painter is not accessing the material that is actually used to render the fur, switching to Render Texture mode will not work: you can use it to edit the texture, but the changes will not show until you return to Texture2D mode. The solution is not perfect because, as the warning message above says, it causes additional draw calls. So I'll contact the developer; maybe he'll offer a nice solution. Alternatively, in addition to the NonMaterialTexture class, I could create a NonMeshRendererMaterial class for cases like this.
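The unwrap-and-bake trick from the second post above (rendering the mesh in UV space so a point light can be painted permanently into the texture) can be sketched on the CPU in plain Python. Here each texel carries its original world position, exactly as the extra vertex-function argument would provide; the function name and additive blend are my assumptions:

```python
def bake_point_light(world_positions, texture, light_pos, intensity):
    """CPU sketch of baking a point light into an unwrapped texture.
    world_positions[i] is the original world position of texel i (what
    the vertex function passes along before replacing position with UV);
    brightness falls off with distance to the light and is written
    permanently into the texture, clamped to 1.0."""
    baked = []
    for texel, wp in zip(texture, world_positions):
        d = sum((wp[i] - light_pos[i]) ** 2 for i in range(3)) ** 0.5
        light = intensity / max(d, 1e-4)       # avoid division by zero
        baked.append(min(1.0, texel + light))  # additive, clamped
    return baked
```

Running this once per stroke is the flattened-paper analogy in code: the texture keeps the light after the "source" moves on, which is what distinguishes painting from ordinary dynamic lighting.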