By "parenting" you mean transform decal by related bone's transformation?
Yes, make the projection follow the bone(s).
One question here: knowing the location and normal of the hit result and the ray direction, how can I find an approximate impact point on the graphical mesh that "wraps" the physics object? Is there any clever math I can employ, or is the only option to iterate through the graphical mesh's triangles and find the ray intersection?
When using projective decals, a reasonably tight ray proxy (at the level of a forearm shape, for example) avoids the need to find the impact point on the graphical model at all.
If you truly want impact points on the graphical model based on physical hit points, some type of mapping is required. Marching over triangles and finding the graphical intersection from the physical intersection is a brute-force runtime implementation of such a mapping, but there are lots of others.
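If you do go the brute-force route, a minimal sketch is below. It assumes you can get the skinned, world-space vertex positions and triangle indices on the CPU; RayTriangle is the standard Moller-Trumbore test, and FindGraphicalHit is just a name invented for this example. Restricting the loop to triangles weighted to the hit bone would keep it from being as expensive as it sounds.

```cpp
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3  cross(Vec3 a, Vec3 b) { return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x }; }
static float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Standard Moller-Trumbore ray/triangle test. On a hit, t is the distance
// along the (normalized) ray direction to the intersection.
static bool RayTriangle(Vec3 origin, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2, float& t)
{
    const float kEps = 1e-6f;
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 p  = cross(dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < kEps) return false;          // ray is parallel to the triangle
    float inv = 1.0f / det;
    Vec3 s = sub(origin, v0);
    float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return false;
    Vec3 q = cross(s, e1);
    float v = dot(dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return false;
    t = dot(e2, q) * inv;
    return t > kEps;                                   // intersection in front of the origin
}

// Re-run the physics ray against the skinned graphical mesh and keep the
// nearest hit. In practice you would only loop over triangles weighted to
// the bone the physics ray hit.
static bool FindGraphicalHit(Vec3 origin, Vec3 dir,
                             const std::vector<Vec3>& positions,    // skinned, world space
                             const std::vector<unsigned>& indices,  // triangle list
                             Vec3& hitPoint)
{
    float best = std::numeric_limits<float>::max();
    for (std::size_t i = 0; i + 2 < indices.size(); i += 3)
    {
        float t;
        if (RayTriangle(origin, dir, positions[indices[i]],
                        positions[indices[i + 1]], positions[indices[i + 2]], t) && t < best)
            best = t;
    }
    if (best == std::numeric_limits<float>::max()) return false;
    hitPoint = { origin.x + dir.x * best, origin.y + dir.y * best, origin.z + dir.z * best };
    return true;
}
```

Off the top of my head, here's another example: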
The model has a decal-dedicated UV map. It is subdivided into regions, one per bone. Each region represents the output space of a cylindrical projection. Vertices are placed in the UV map so that they sample from the bone-specific decal regions appropriately (e.g. a forearm vertex samples from the appropriate spot in the forearm decal region). This mapping could be mostly automated in the content pipeline.
Assume a hit is found on a bone proxy. Apply a cylindrical projection to the impact point on the bone proxy. Look up that bone's region in the decal UV map. Draw the decal into the character's decal texture at the resulting spot.
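For concreteness, here's a rough sketch of those runtime steps. All the names (DecalRegion, ApplyDecal, SplatDecal) are hypothetical, and the bone-runs-along-local-Y and row-major matrix conventions are assumptions about your rig and math library, so treat it as pseudocode in C++ form:

```cpp
#include <cmath>
#include <string>
#include <unordered_map>

struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };

// One bone's rectangle in the decal-dedicated UV map: the output space of that
// bone's cylindrical projection. Authored or generated by the content pipeline.
struct DecalRegion { float uMin, vMin, uMax, vMax; };
using DecalRegionTable = std::unordered_map<std::string, DecalRegion>;

// Stand-in for whatever your renderer uses to draw a decal quad into the
// character's decal render target at texel coordinates (u, v).
void SplatDecal(float u, float v);

// Bring the world-space impact point into bone-local space (row-major 4x4
// inverse bone world matrix); the bone is assumed to run along local Y.
static Vec3 WorldToBoneLocal(const Vec3& p, const float m[16])
{
    return { m[0]*p.x + m[1]*p.y + m[2]*p.z  + m[3],
             m[4]*p.x + m[5]*p.y + m[6]*p.z  + m[7],
             m[8]*p.x + m[9]*p.y + m[10]*p.z + m[11] };
}

// Cylindrical projection: angle around the bone axis -> u, height along it -> v.
static Vec2 CylindricalProject(const Vec3& p, float boneLength)
{
    const float kPi = 3.14159265358979f;
    float u = (std::atan2(p.z, p.x) + kPi) / (2.0f * kPi);  // wrap angle into [0, 1]
    float v = p.y / boneLength;                             // 0 at bone origin, 1 at its tip
    return { u, v };
}

// Hit on a bone proxy -> cylindrical projection -> that bone's region of the
// decal UV map -> splat into the character's decal texture.
void ApplyDecal(const Vec3& hitWorld, const std::string& boneName, float boneLength,
                const float boneWorldInverse[16], const DecalRegionTable& regions)
{
    Vec3 local = WorldToBoneLocal(hitWorld, boneWorldInverse);
    Vec2 cyl   = CylindricalProject(local, boneLength);

    const DecalRegion& r = regions.at(boneName);
    SplatDecal(r.uMin + cyl.u * (r.uMax - r.uMin),
               r.vMin + cyl.v * (r.vMax - r.vMin));
}
```

The region table is the piece the content pipeline would generate alongside the dedicated UV map.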
When rendering, sample the decal texture the same way you'd sample any other texture.
The above could be improved a bit depending on the specific use: there's no fundamental requirement that a dedicated texture and UV map exist, but they can make content development easier.