Good afternoon, Habr readers! I would like to tell you about one way to render lighting and shadows in 2D, taking the scene geometry into account. I really like the lighting in Gish and Super Meat Boy, although in Meat Boy it is only noticeable on dynamic levels with collapsing or moving platforms, while in Gish it is everywhere. The lighting in these games feels so "warm" and tube-like that you inevitably want to implement something similar yourself. And this is what came of it.

In short, what we have and what needs to be done:

- there is a 2D world into which dynamic lighting and shadows must be embedded;
- the world is not necessarily tile-based; any geometry is allowed;
- the number of light sources should in principle be unlimited (bounded only by the system's performance);
- a large number of light sources at one point, or a single source with a large intensity coefficient, should not merely light the area to 100%, but over-brighten it;
- all of this, of course, must run in real time.

For all this we need OpenGL, GLSL, framebuffers and a little math. I limited myself to OpenGL 3.3 and GLSL 3.30, since the video card in one of my systems (a GeForce 310) is very outdated by today's standards, and for 2D this is more than enough (earlier versions are also off-putting because of the mismatch between OpenGL and GLSL version numbering).

The algorithm itself is not complicated and runs in three steps:

1. generate a texture the size of the render region, fill it with black and paint the lit areas onto it (the so-called light map), accumulating the intensity of all light sources at each point;
2. render the scene into a separate texture;
3. output a quad covering the entire render context and combine the two resulting textures in the fragment shader.
At this point you can "play" with the fragment shader, adding, for example, refraction effects for water or a fire lens, color correction to taste, and other post-processing.

1. The light map

We will use one of the most popular techniques, ported to 2D (P.S. sorry, it is not deferred shading but shadow mapping; my mistake, thanks for the correction). The essence of the method is to render the scene with the camera moved to the position of the light source, capturing the depth buffer. Then, with simple matrix operations involving the camera and the light source, the shader can decide pixel by pixel whether a pixel of the rendered scene is shadowed, by translating its coordinates into the texture coordinates of the depth buffer. In 3D a z-buffer is used, but here I decided to build a one-dimensional depth buffer on the CPU. I do not claim this approach is rational or optimal; there is no shortage of lighting algorithms, and each has its pluses and minuses. While mulling it over, the method seemed quite viable, and I set out to implement it. Note that at the time of this writing I found a similar approach here... well, okay, so I reinvented the wheel.

1.1. The depth buffer

The essence of the depth buffer, a.k.a. z-buffer, is to store the distance of scene objects from the camera, which lets us cut off object pixels hidden behind closer neighbors. If in a 3D scene the depth buffer is a plane, then in our flat world it becomes a line, i.e. a one-dimensional array. Light sources are points radiating light from their center in all directions. Accordingly, the buffer index and its value correspond to the polar coordinates of the object point nearest to the source. I chose the buffer size empirically and settled on 1024 (it depends, of course, on the window size).
The smaller the buffer, the more noticeable the mismatch between object boundaries and the lit area, especially in the presence of small objects, and sometimes totally unacceptable artifacts can appear:
The algorithm for generating the buffer:

- fill the buffer with the radius of the light source (the distance at which the light's intensity reaches zero);
- for each object within the light source's radius, take those edges that face the light source. If you instead take the edges facing away, objects will be lit automatically, but there will be a problem with objects standing side by side:
- project the resulting list of edges, transforming their Cartesian coordinates into polar coordinates relative to the light source. A point (x; y) is recalculated into (φ; r) as:

φ = arccos(⟨xAxis, normalize(point)⟩), where:
⟨·,·⟩ is the scalar product of vectors;
xAxis is the unit vector of the x axis, (1; 0), since 0 degrees corresponds to the rightmost point of the circle;
point is the vector from the center of the light source to a point on the edge (edge coordinates are taken in the light source's coordinate system);
normalize normalizes a vector;
r = |point| is the distance to the point.

We project the two endpoints of an edge as well as intermediate points. The number of points needed corresponds to the number of buffer cells covered by the edge's projection. The buffer index corresponding to the angle φ is:

index = φ / (2π) * bufferSize

Thus we find the two extreme buffer indices corresponding to the endpoints of the edge. For