I will explain the process:
In Unity you can render to the screen, like you usually do, or to a Render Texture. Effects like Blur, Bloom and many others first render everything to a Render Texture, and then from it to the screen with some adjustments.
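As a toy sketch of that two-pass idea (plain Python, not the Unity API): pass one renders the scene into an offscreen buffer, the "Render Texture", and pass two copies it to the "screen" with an adjustment, here a simple 1D box blur.

```python
def render_scene():
    # Stand-in for actual rendering: a row of pixel brightnesses
    # with one bright spot in the middle.
    return [0, 0, 10, 0, 0]

def blit_with_blur(buffer):
    # Copy the buffer to the screen, averaging each pixel with its
    # neighbors -- the "adjustment" applied on the way out.
    screen = []
    for i in range(len(buffer)):
        left = buffer[i - 1] if i > 0 else buffer[i]
        right = buffer[i + 1] if i < len(buffer) - 1 else buffer[i]
        screen.append((left + buffer[i] + right) / 3)
    return screen

render_texture = render_scene()          # pass 1: scene -> texture
screen = blit_with_blur(render_texture)  # pass 2: texture -> screen
```

The bright spot gets smeared across its neighbors, which is exactly what a blur post-effect does, just per pixel on the GPU.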
If you create a Point Light and move it, the effect is similar to my Sphere brush, but of course the light doesn't stay and doesn't leave a line behind; it moves with the source. So what needs to be done is to render it onto a texture.
You may have heard about shaders that change vertex positions: like when you see trees moving in the wind, or water. The mesh itself is not changed, but on the graphics card each vertex is moved.
You may also know what UV coordinates are: they describe how a texture is wrapped around a mesh. So if we think of the mesh and texture as a sort of origami, we can unwrap it onto a flat square sheet by adding a few lines to the vertex function that replace each vertex's position with its UV coordinates.
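A minimal sketch of that vertex trick, written in Python rather than the asset's actual HLSL: the output position simply ignores the model position and is built from the UVs instead. UVs run 0..1 while clip space runs -1..1, so they need remapping.

```python
def unwrap_vertex(position, uv):
    # Ignore the real 3D position; place the vertex on the render
    # target according to its UV coordinates instead.
    # UVs span [0, 1]; normalized device coordinates span [-1, 1].
    u, v = uv
    x = u * 2.0 - 1.0
    y = v * 2.0 - 1.0
    return (x, y)

# A vertex at UV (0.5, 0.5) lands in the middle of the render target,
# no matter where it sits in the 3D world:
unwrap_vertex(position=(3.2, 0.1, -7.0), uv=(0.5, 0.5))
```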
Now the way a Point Light works is that it divides the brightness by the distance from the world position of a pixel to the world position of the light source.
So before unwrapping our origami, we also pass the original world position along as an additional argument, so we can still calculate that distance.
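The fragment side of that can be sketched like this (a toy version; the asset's actual shader may use a different falloff, such as squared distance):

```python
import math

def point_light_brightness(pixel_world_pos, light_world_pos, intensity=1.0):
    # Brightness falls off with the distance from the pixel's original
    # world position to the light source's world position.
    dx = pixel_world_pos[0] - light_world_pos[0]
    dy = pixel_world_pos[1] - light_world_pos[1]
    dz = pixel_world_pos[2] - light_world_pos[2]
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    return intensity / max(distance, 1e-5)  # avoid division by zero
```

This is why the world position has to survive the unwrap: the pixel is drawn at its UV location, but lit according to where it used to be in the world.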
Now we use the described shader to render the unwrapped mesh, with its original texture and the "Point Light" on top of it, back to that very same texture.
The last part may be difficult to imagine. Crumple a sheet of paper into a ball, mark up one side of it, and flatten it out again. The part you marked up will be our point light.
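Putting the pieces together, here is a toy CPU version of that render pass, assuming a tiny one-dimensional texture where each texel already carries its original world position:

```python
import math

def bake_point_light(texture, texel_world_pos, light_pos, intensity=1.0):
    # Render "on top of" the existing texture: add the light's
    # contribution to every texel, using the texel's stored world
    # position to measure distance to the light.
    for i in range(len(texture)):
        x, y, z = texel_world_pos[i]
        lx, ly, lz = light_pos
        d = math.sqrt((x - lx) ** 2 + (y - ly) ** 2 + (z - lz) ** 2)
        texture[i] += intensity / max(d, 1e-5)
    return texture

# Two texels: one right next to the light, one ten units away.
tex = bake_point_light([0.0, 0.0], [(0, 0, 1), (0, 0, 10)], (0, 0, 0))
```

Because the result is written back into the same texture, the "light" stays where it was painted even after the source moves, which is the whole difference from a real Point Light.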
Now, the Sphere brush actually takes two points, not one, and draws a line between them, but that is just a detail. The code could be used if you want a long neon light stick or lightsaber to cast light correctly, though.
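For the two-point version, the only real change is that the distance is measured to a line segment instead of a single point. A sketch of that distance (hypothetical helper, not the asset's code):

```python
import math

def distance_to_segment(p, a, b):
    # Distance from point p to the segment a-b (the "light stick").
    abx, aby, abz = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
    apx, apy, apz = (p[0] - a[0], p[1] - a[1], p[2] - a[2])
    ab_len_sq = abx * abx + aby * aby + abz * abz
    if ab_len_sq == 0.0:
        t = 0.0  # degenerate segment: both ends coincide
    else:
        # Project p onto the segment, clamped to its endpoints.
        t = max(0.0, min(1.0, (apx * abx + apy * aby + apz * abz) / ab_len_sq))
    cx = a[0] + t * abx
    cy = a[1] + t * aby
    cz = a[2] + t * abz
    return math.sqrt((p[0] - cx) ** 2 + (p[1] - cy) ** 2 + (p[2] - cz) ** 2)
```

Feed this distance into the same brightness falloff and every texel is lit by the nearest point of the stick, which is what makes a lightsaber illuminate a wall evenly along its length.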
If you just want to paint 🎨, attach the component to the object, switch to GPU mode, and select the Sphere brush. For other purposes, you'd need a script that uses the Paint function. The asset has a Painter Ball example for this.
Now, the next question... how to use it.