Search Results

Search found 1333 results on 54 pages for 'geometry shader'.

Page 4 of 54

  • Shadow volume shader optimization (GLSL)

    - by Soubok
    I'm wondering if there is a way to optimize this vertex shader. It projects a vertex to the far plane (in the light direction) if the vertex is in shadow:

        void main(void)
        {
            vec3 lightDir = (gl_ModelViewMatrix * gl_Vertex - gl_LightSource[0].position).xyz;

            // if the vertex is lit
            if (dot(lightDir, gl_NormalMatrix * gl_Normal) < 0.01) {
                // don't move it
                gl_Position = ftransform();
            } else {
                // move it far, in the light direction
                vec4 fin = gl_ProjectionMatrix * (gl_ModelViewMatrix * gl_Vertex
                                                  + vec4(normalize(lightDir) * 100000.0, 0.0));
                if (fin.z > fin.w)   // if fin is behind the far plane
                    fin.z = fin.w;   // move it to the far plane (needed for the z-fail algo.)
                gl_Position = fin;
            }
        }

    Read the article

  • Counting texels using a fragment shader

    - by Brett
    Hi, I have two textures generated using a fragment shader, and I want to count the number of texels in each texture that are above some colour intensity. My question is: how can this be done? My initial thought was to count these texels in the fragment shader as the texture is generated; however, this would require some sort of global counter. I can't use occlusion queries because the textures are created from other textures. I'm using OpenGL 2.1. Any ideas appreciated. Thanks
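
    One workable (if slow) route on OpenGL 2.1 - a sketch, not a tested solution - is to read the finished texture back and count on the CPU. The RGBA8 format and the max-of-RGB definition of "intensity" below are assumptions:

        #include <GL/gl.h>
        #include <algorithm>
        #include <cstddef>
        #include <vector>

        // Read a texture back with glGetTexImage and count texels whose
        // intensity (here: max of R, G, B) exceeds a threshold.
        std::size_t countBrightTexels(GLuint tex, int width, int height,
                                      unsigned char threshold)
        {
            std::vector<unsigned char> pixels(std::size_t(width) * height * 4);
            glBindTexture(GL_TEXTURE_2D, tex);
            glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());

            std::size_t count = 0;
            for (std::size_t i = 0; i < pixels.size(); i += 4) {
                unsigned char intensity =
                    std::max(pixels[i], std::max(pixels[i + 1], pixels[i + 2]));
                if (intensity > threshold)
                    ++count;
            }
            return count;
        }

    On newer hardware the count could stay on the GPU (atomic counters arrived in GL 4.2), but the readback respects the GL 2.1 constraint.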

    Read the article

  • How to create a "retro" pixel shader for transformed 2D sprites that maintains pixel fidelity?

    - by David Gouveia
    The image below shows two sprites rendered with point sampling on top of a background: the left skull has no rotation/scaling applied to it, so every pixel matches perfectly with the background; the right skull is rotated/scaled, and this results in larger pixels that are no longer axis-aligned.

    How could I develop a pixel shader that would render the transformed sprite on the right with axis-aligned pixels of the same size as the rest of the scene? This might be related to how sprite scaling was implemented in old games such as Monkey Island, because that's the effect I'm trying to achieve, but with rotation added.

    Edit: As per kaoD's suggestions, I tried to address the problem as a post-process. The easiest approach was to render to a separate render target first (downsampled to match the desired pixel size) and then upscale it when rendering a second time. It did address my requirements above. First I tried doing it Linear -> Point: there's no distortion, but the result looks blurred and loses most of the highlight colors; in my opinion it breaks the retro look I needed. The second time I tried Point -> Point: despite the distortion, I think that might be good enough for my needs, although it does look better as a still image than in motion. To demonstrate, here's a video of the effect, although YouTube filtered the pixels out of it: http://youtu.be/hqokk58KFmI

    However, I'll leave the question open for a few more days in case someone comes up with a better sampling solution that maintains the crisp look while decreasing the amount of distortion when moving.
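
    A minimal sketch of the two-pass setup described in the edit above, in plain C-style GL; the FBO, texture and draw helpers are assumed to exist, and glBindFramebuffer needs GL 3.0 or the ARB framebuffer extension:

        #include <GL/gl.h>

        // Hypothetical helpers assumed to live elsewhere in the renderer.
        void drawScene();
        void drawFullscreenQuad();

        void renderRetroFrame(GLuint lowResFbo, GLuint lowResColorTex,
                              int lowResW, int lowResH,
                              int screenW, int screenH)
        {
            // Pass 1: render the whole scene into the small target.
            glBindFramebuffer(GL_FRAMEBUFFER, lowResFbo);
            glViewport(0, 0, lowResW, lowResH);
            drawScene();

            // Pass 2: upscale to the screen with point (nearest) sampling,
            // which keeps the hard pixel edges of the Point -> Point variant.
            glBindFramebuffer(GL_FRAMEBUFFER, 0);
            glViewport(0, 0, screenW, screenH);
            glBindTexture(GL_TEXTURE_2D, lowResColorTex);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
            drawFullscreenQuad();
        }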

    Read the article

  • A problem with connected points and determining geometric figures based on points' locations

    - by StolePopov
    At school we were given a really hard problem, and none of the students has solved it yet. Take a look at the picture below: http://d.imagehost.org/0422/mreza.gif

    That's a kind of network of connected points which doesn't end, and each point has its own number representing it. The numbers run row by row: 1, then 2 3, then 4 5 6, then 7 8 9 10, and so on. (You can't see the numbers 5, 8, or 9 in the picture, but they are there and their positions are obvious; the point in the middle of 4 and 6 is 5, and so on.) 1 is connected to 2 and 3; 2 is connected to 1, 3, 5 and 4; etc.

    The numbers 1-2-3 represent a triangle in the picture, but 1-4-6 do not, because 4 is not directly connected with 6. Looking at 2-3-4-5, that's a parallelogram (you know why), but 4-6-7-9 is NOT a parallelogram, because in this problem there's a rule that says all sides must be equal for all the figures - triangles and parallelograms. There are also hexagons; for example, 4-5-7-9-13-12 is a hexagon - all sides must be equal here too. 1-2-3-4-5 doesn't represent anything, so we ignore it.

    I think I explained the problem well. The actual problem is, given an input of numbers like the above, to determine whether they form a triangle/parallelogram/hexagon (according to the described rules). For example:

        1 2 3          - triangle
        11 13 24 26    - parallelogram
        1 2 3 4 5      - nothing
        11 23 13 25    - nothing
        3 2 5          - triangle

    I was reading up on computational geometry in order to solve this, but I gave up quickly; nothing seems to help here. A friend told me about this site, so I decided to give it a try. If you have any ideas about how to solve this, please reply - you can use pseudocode or C++, whatever. Thank you very much.
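
    A possible line of attack, sketched under stated assumptions (the row-by-row numbering and unit spacing are read off the description, not given formally): map each point number to 2D coordinates in the triangular grid, then test candidate figures through their side lengths.

        #include <cmath>

        struct P { double x, y; };

        // Row r holds r points (1; 2 3; 4 5 6; ...), so point n sits in the
        // smallest row r with r*(r+1)/2 >= n. Unit spacing between neighbours.
        P coordOf(int n) {
            int r = 1;
            while (r * (r + 1) / 2 < n) ++r;
            int col = n - (r - 1) * r / 2;          // 1-based position in row r
            return { col - r / 2.0, -r * std::sqrt(3.0) / 2.0 };
        }

        double dist(P a, P b) { return std::hypot(a.x - b.x, a.y - b.y); }

        bool sameLength(double s1, double s2) { return std::fabs(s1 - s2) < 1e-9; }

    With coordinates in hand, "all sides equal" becomes a direct distance check; the examples suggest direct connectivity matters too (1-4-6 fails), so that rule needs its own test on top.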

    Read the article

  • Trade offs of linking versus skinning geometry

    - by Jeff
    What are the trade-offs inherent in linking geometry to a node versus using skinned geometry? Specifically: What capabilities do you gain/lose with each method? What are the performance impacts of doing one over the other? In which specific situations would you want to do one over the other? In addition, do the answers to these questions tend to be engine-specific? If so, how much?

    Read the article

  • projective geometry: how do I turn a projection of a rectangle in 3D into a 2D view

    - by bonomo
    So the problem is that I have a 3D projection of a rectangle that I want to turn into 2D. That is, I have a photo of a sheet of paper lying on a table, which I want to transform into a flat 2D view of that sheet. So what I need is to get an undistorted 2D image by reverting all the 3D/projection transformations and getting a plain view of the sheet from the top. I happened to find some directions on the subject, but none of them gives immediate instructions on how this can be achieved. It would be really helpful to get a step-by-step description of what needs to be done, or alternatively a link to a resource that breaks it down into the little details. Thank you
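
    If a library is acceptable, the standard route is a planar homography: pick the sheet's four corners in the photo, map them to the corners of an axis-aligned rectangle, and warp. A minimal sketch using OpenCV; the corner coordinates and output size are placeholders:

        #include <opencv2/imgcodecs.hpp>
        #include <opencv2/imgproc.hpp>

        int main()
        {
            cv::Mat photo = cv::imread("sheet.jpg");   // hypothetical input

            // The sheet's corners in the photo, ordered TL, TR, BR, BL --
            // placeholder values; in practice picked by hand or detected.
            cv::Point2f src[4] = { {120, 80}, {540, 95}, {580, 400}, {90, 380} };

            // Target rectangle (roughly A4 proportions, arbitrary resolution).
            float w = 595, h = 842;
            cv::Point2f dst[4] = { {0, 0}, {w, 0}, {w, h}, {0, h} };

            // The 3x3 homography undoes the perspective projection of the plane.
            cv::Mat H = cv::getPerspectiveTransform(src, dst);

            cv::Mat flat;
            cv::warpPerspective(photo, flat, H, cv::Size((int)w, (int)h));
            cv::imwrite("sheet_flat.jpg", flat);
            return 0;
        }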

    Read the article

  • Monogame/SharpDX - Shader parameters missing

    - by Layoric
    I am currently working on a simple game that I am building in Windows 8 using MonoGame (develop3d). I am using some shader code from a tutorial (made by Charles Humphrey) and having an issue populating a 'texture' parameter, as it appears to be missing. (Edit: I have also tried 'Texture2D' and using it with register(t0); still no luck.) I'm not well versed in writing shaders, so this might be caused by a more obvious problem. I have debugged through MonoGame's content processor to see how this shader is being parsed; all the non-texture parameters are there and look to be loading correctly. (Edit: this seems to go back to the D3D compiler.) Shader code below:

        #include "PPVertexShader.fxh"

        float2 lightScreenPosition;
        float4x4 matVP;
        float2 halfPixel;
        float SunSize;
        texture flare;

        sampler2D Scene : register(s0)
        {
            AddressU = Clamp;
            AddressV = Clamp;
        };

        sampler Flare = sampler_state
        {
            Texture = (flare);
            AddressU = CLAMP;
            AddressV = CLAMP;
        };

        float4 LightSourceMaskPS(float2 texCoord : TEXCOORD0) : COLOR0
        {
            texCoord -= halfPixel;

            // Get the scene
            float4 col = 0;

            // Find the sun's position in the world and map it to the screen space.
            float2 coord;
            float size = SunSize / 1;
            float2 center = lightScreenPosition;

            coord = .5 - (texCoord - center) / size * .5;
            col += (pow(tex2D(Flare, coord), 2) * 1) * 2;

            return col * tex2D(Scene, texCoord);
        }

        technique LightSourceMask
        {
            pass p0
            {
                VertexShader = compile vs_4_0 VertexShaderFunction();
                PixelShader = compile ps_4_0 LightSourceMaskPS();
            }
        }

    I've removed default values as they are currently not supported in MonoGame, and I also changed ps and vs to 4_0 instead of 2_0. Could this be causing the issue? As I debug through the 'DXConstantBufferData' constructor (from within the MonoGameContentProcessing project) I find that the 'flare' parameter does not exist; all the others seem to be getting created fine. Any help would be appreciated.

    Update 1: I have discovered that the SharpDX D3D compiler seems to be ignoring this parameter (perhaps by design?). ConstantBufferDescription.VariableCount does not count the texture variable.

    Update 2: The SharpDX function 'GetConstantBuffer(int index)' returns the parameters minus textures, which makes it impossible to set values for these variables within the shader. Does anyone know if this is normal for DX11 / Shader Model 4.0, or am I missing something else?

    Read the article

  • Setting uniform value of a vertex shader for different sprites in a SpriteBatch

    - by midasmax
    I'm using libGDX and currently have a simple shader that does a passthrough, except for randomly shifting the vertex positions. The shift is a vec2 uniform that I set within my code's render() loop; it's declared in my vertex shader as uniform vec2 u_random. I have two different kinds of sprites - let's call them SpriteA and SpriteB - and both are drawn within the same SpriteBatch's begin()/end() calls. Prior to drawing each sprite in my scene, I check the type of the sprite. If the sprite is an instance of SpriteA, I set the uniform u_random to Vector2.Zero, meaning that I don't want any vertex changes for it. If the sprite is an instance of SpriteB, I set u_random to Vector2(MathUtils.random(), MathUtils.random()).

    The expected behavior was that all the SpriteA objects in my scene won't experience any jittering, while all SpriteB objects would jitter about their positions. However, what I'm seeing is that both SpriteA and SpriteB jitter, leading me to believe that the u_random uniform is not actually being set per sprite but is being applied to all sprites. What is the reason for this? And how can I fix it so that the vertex shader correctly receives the uniform value set for each sprite individually?

    passthrough.vsh:

        attribute vec4 a_color;
        attribute vec3 a_position;
        attribute vec2 a_texCoord0;

        uniform mat4 u_projTrans;
        uniform vec2 u_random;

        varying vec4 v_color;
        varying vec2 v_texCoord;

        void main() {
            v_color = a_color;
            v_texCoord = a_texCoord0;
            vec3 temp_position = vec3(a_position.x + u_random.x,
                                      a_position.y + u_random.y,
                                      a_position.z);
            gl_Position = u_projTrans * vec4(temp_position, 1.0);
        }

    Java code:

        this.batch.begin();
        this.batch.setShader(shader);
        for (Sprite sprite : sprites) {
            Vector2 v = Vector2.Zero;
            if (sprite instanceof SpriteB) {
                v.x = MathUtils.random(-1, 1);
                v.y = MathUtils.random(-1, 1);
            }
            shader.setUniformf("u_random", v);
            sprite.draw(this.batch);
        }
        this.batch.end();

    Read the article

  • Passing Boost uBLAS matrices to OpenGL shader

    - by AJM
    I'm writing an OpenGL program where I compute my own matrices and pass them to shaders. I want to use Boost's uBLAS library for the matrices, but I have little idea how to get a uBLAS matrix into OpenGL's shader uniform functions.

        matrix<GLfloat, column_major> projection(4, 4);
        // Fill matrix ...
        GLuint projectionU = glGetUniformLocation(shaderProgram, "projection");
        glUniformMatrix4fv(projectionU, 1, 0, (GLfloat *)... Um ...);

    Trying to cast the matrix to a GLfloat pointer causes an invalid cast error on compile.
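
    For what it's worth, a sketch of the usual way out, assuming a dense 4x4 matrix of GLfloat (and a GL 2.0+ function loader in practice): ublas::matrix stores its elements contiguously and exposes them through data(), and the column_major layout already matches OpenGL's default, so no transpose is needed:

        #include <boost/numeric/ublas/matrix.hpp>
        #include <GL/gl.h>

        namespace ublas = boost::numeric::ublas;

        // Upload a 4x4 column-major uBLAS matrix as a mat4 uniform.
        void uploadMatrix4(GLint location,
                           const ublas::matrix<GLfloat, ublas::column_major>& m)
        {
            // data() exposes the contiguous backing store; &m.data()[0] is the
            // raw GLfloat pointer the cast in the question was reaching for.
            glUniformMatrix4fv(location, 1, GL_FALSE, &m.data()[0]);
        }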

    Read the article

  • How to set Alpha value from pixel shader in SlimDX Direct3d9

    - by Yashwinder
    I am trying to set the alpha value of a color, as color.a = 0.5f, in my pixel shader, but it keeps throwing an exception. I can set color.r, color.g and color.b, but setting color.a throws D3DERR_INVALIDCALL: Invalid call (-2005530516). I have just created a Direct3D9 device and assigned my pixel shader to it. My pixel shader code is below:

        sampler2D ourImage : register(s0);

        float4 main(float2 locationInSource : TEXCOORD) : COLOR
        {
            float4 color = tex2D(ourImage, locationInSource.xy);
            color.a = 0.2;
            return color;
        }

    I am creating my pixel shader as:

        byte[] byteCode = GiveFxFile(transitionEffect.PixelShaderFileName);
        var shaderBytecode = ShaderBytecode.Compile(byteCode, "main", "ps_2_0", ShaderFlags.None);
        var pixelShader = new PixelShader(device, shaderBytecode);
        _device.PixelShader = pixelShader;

    And I have initialized my device as:

        var _presentParams = new PresentParameters
        {
            Windowed = _isWindowedMode,
            BackBufferWidth = (int)SystemParameters.PrimaryScreenWidth,
            BackBufferHeight = (int)SystemParameters.PrimaryScreenHeight,
            // Enable the Z-buffer. Not really needed in this sample,
            // but real applications generally use it.
            EnableAutoDepthStencil = true,
            AutoDepthStencilFormat = Format.D16,
            // How to swap the backbuffer in front and how many per screen refresh
            BackBufferCount = 1,
            SwapEffect = SwapEffect.Copy,
            BackBufferFormat = _direct3D.Adapters[0].CurrentDisplayMode.Format,
            PresentationInterval = PresentInterval.Immediate,
            DeviceWindowHandle = _windowHandle
        };
        _device = new Device(_direct3D, 0, DeviceType.Hardware, _windowHandle,
                             deviceFlags | CreateFlags.Multithreaded, _presentParams);

    Read the article

  • OpenGL lighting with dynamic geometry

    - by Tank
    I'm currently thinking hard about how to implement lighting in my game. The geometry is quite dynamic (a fixed 3D grid with custom geometry in each cell) and needs some light to get more depth and in general look nicer. A scene in my game always contains sunlight and local light sources like lamps (point lights). One can move underground, so sunlight must be able to illuminate as far as it can get. Here's a render of a typical situation: the lamp is positioned behind the wall at the top, and in the hollow cube there's a hole in the back so that light can shine through. (I don't want soft shadows; this is just for illustration.)

    While spending the whole day searching through Google, I stumbled on keywords like deferred rendering, forward rendering, ambient occlusion, screen-space ambient occlusion, etc. Some articles/tutorials even refer to "normal shading", but to be honest I don't really know how to do even simple shading. OpenGL of course has a fixed-function lighting pipeline with 8 possible light sources; however, it just illuminates all vertices without checking for occluding geometry.

    I'd be very thankful if someone could give me some pointers in the right direction. I don't need complete solutions or similar, just good sources with information understandable for someone with nearly no lighting experience (preferably with OpenGL).

    Read the article

  • What is the minimum shader setup I need to run basic calculations on the GPU?

    - by Jinxi
    I read that the Hull Shader, Domain Shader, Geometry Shader and Pixel Shader are all optional. So, is the Vertex Shader optional too? If not: what does a basic Vertex Shader look like? Just a simple pass-through? Is the Vertex Shader necessary to tell what kind of data structure (fans, strips, or meshes) is used? What can I do with just the vertex shader? Do the fixed functions work without programming any of the programmable stages?

    Read the article

  • PostgreSQL: SELECT all fields, filter some

    - by Adam Matan
    Hi, in one of our databases there is a table with dozens of columns, one of which is a geometry column. I want to SELECT rows from the table with the geometry transformed to another SRID. I want to use something like SELECT * in order to avoid:

        SELECT col_a, col_b, col_c, col_d, col_e, col_f, col_g, col_h,
               transform(the_geom, NEW_SRID), ..., col_z

    Any ideas? Adam

    Read the article

  • Determining line orientation using vertex shaders

    - by Brett
    Hi, I want to be able to calculate the direction of a line in eye coordinates and store this value for every pixel on the line, using a vertex and fragment shader. My idea was to calculate the direction gradient using atan2(Gy, Gx) after a modelview transformation for each pair of vertices, then quantize this value as a color intensity to pass to the fragment shader. How can I get access to the positions of pairs of vertices to achieve this, or is there another method I should use? Thanks
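
    One common workaround, sketched as a guess at what's needed here: a vertex shader cannot see neighbouring vertices, so precompute each segment's direction on the CPU and feed it in as an extra per-vertex attribute (both endpoints get the same value, so the varying stays constant along the line):

        #include <cmath>
        #include <cstddef>
        #include <vector>

        struct V2 { float x, y; };

        // One direction entry per vertex, assuming GL_LINES-style vertex pairs.
        std::vector<float> segmentDirections(const std::vector<V2>& pts)
        {
            std::vector<float> dirs;
            for (std::size_t i = 0; i + 1 < pts.size(); i += 2) {
                float angle = std::atan2(pts[i + 1].y - pts[i].y,
                                         pts[i + 1].x - pts[i].x);
                dirs.push_back(angle);   // endpoint A
                dirs.push_back(angle);   // endpoint B
            }
            return dirs;   // upload with glVertexAttribPointer as a float attribute
        }

    If the direction must be in eye coordinates, either transform the points by the modelview matrix before calling this, or pass both endpoints as attributes and do the atan2 in the shader.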

    Read the article

  • Computational geometry: find where the triangle is after rotation, translation, or reflection in a mirror

    - by newba
    I have a small contest problem in which I am given a set of points, in 2D, that form a triangle. This triangle may be subject to an arbitrary rotation, an arbitrary translation (both in the 2D plane), and a reflection in a mirror, but its dimensions are kept unchanged. Then I am given a set of points in the plane, and I have to find 3 points that form my triangle after one or more of those geometric operations. Example:

        5 15 8 5 20 10 6 5 17 5 20 20 5 10 5 15 20 15 10

    I bet I have to apply some known algorithm, but I don't know which. The most common are convex hull, sweep line, triangulation, etc. Can someone give a tip? I don't need the code, only a push, please!
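
    A push in the requested spirit (a sketch, not the contest's official solution): rotation, translation and mirroring are all isometries, so they preserve pairwise distances. Two triangles match exactly when their sorted squared side lengths agree, and squared lengths keep everything in integers:

        #include <algorithm>
        #include <array>

        struct Pt { long long x, y; };

        // Squared distance: exact for integer input, no square roots needed.
        long long dist2(Pt a, Pt b) {
            long long dx = a.x - b.x, dy = a.y - b.y;
            return dx * dx + dy * dy;
        }

        // Equal signatures identify the hidden triangle among candidate points,
        // since the allowed operations cannot change any of the three numbers.
        std::array<long long, 3> signature(Pt a, Pt b, Pt c) {
            std::array<long long, 3> s = { dist2(a, b), dist2(b, c), dist2(c, a) };
            std::sort(s.begin(), s.end());
            return s;
        }

    Trying all point triples against the reference signature is O(n^3), which is usually fine for contest-sized inputs.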

    Read the article

  • python geometry help

    - by Enriquev
    Hello, I have the following problem: I am trying to find two distances, F1 and F2 (shown in a linked figure). This is what I have as of now:

        def FindArrow(self, X1, Y1, X2, Y2, X3, Y3):
            self.X1 = float(X1)
            self.Y1 = float(Y1)
            self.X2 = float(X2)
            self.Y2 = float(Y2)
            self.X3 = float(X3)
            self.Y3 = float(Y3)

            # center coords of the circle
            self.Xc = None
            self.Yc = None
            # radius
            self.R = None
            # F1 and F2
            self.FAB = None
            self.FBC = None

            # check if the coordinates are collinear
            invalide = self.X1 * (self.Y2 - self.Y3) \
                     + self.X2 * (self.Y3 - self.Y1) \
                     + self.X3 * (self.Y1 - self.Y2)
            if invalide == 0:
                return

            # get the coords of the circle's center
            s = (0.5 * ((self.X2 - self.X3) * (self.X1 - self.X3)
                      - (self.Y2 - self.Y3) * (self.Y3 - self.Y1))) / invalide
            self.Xc = 0.5 * (self.X1 + self.X2) + s * (self.Y2 - self.Y1)
            self.Yc = 0.5 * (self.Y1 + self.Y2) + s * (self.X1 - self.X2)

            # get the radius
            self.R = math.sqrt(math.pow(self.Xc - self.X1, 2) + math.pow(self.Yc - self.Y1, 2))

    Up to here everything seems to work. Now, what would be the next steps to get F1 and F2?

    Read the article

  • HLSL: programming a pixel shader with different Texture2D downscaling algorithms

    - by Kaminari
    I'm trying to port some image interpolation algorithms to HLSL. For now I have:

        float2 texSize;
        float scale;
        int method;

        sampler TextureSampler : register(s0);

        float4 PixelShader(float4 color : COLOR0, float2 texCoord : TEXCOORD0) : COLOR0
        {
            float2 newTexSize = texSize * scale;
            float4 tex2;

            if (texCoord[0] * texSize[0] > newTexSize[0] ||
                texCoord[1] * texSize[1] > newTexSize[1])
            {
                tex2 = float4(0, 0, 0, 0);
            }
            else if (method == 0)
            {
                tex2 = tex2D(TextureSampler, float2(texCoord[0] / scale, texCoord[1] / scale));
            }
            else
            {
                float2 step = float2(1 / texSize[0], 1 / texSize[1]);
                float4 px1 = tex2D(TextureSampler, float2(texCoord[0] / scale - step[0], texCoord[1] / scale - step[1]));
                float4 px2 = tex2D(TextureSampler, float2(texCoord[0] / scale,           texCoord[1] / scale - step[1]));
                float4 px3 = tex2D(TextureSampler, float2(texCoord[0] / scale + step[0], texCoord[1] / scale - step[1]));
                float4 px4 = tex2D(TextureSampler, float2(texCoord[0] / scale - step[0], texCoord[1] / scale));
                float4 px5 = tex2D(TextureSampler, float2(texCoord[0] / scale + step[0], texCoord[1] / scale));
                float4 px6 = tex2D(TextureSampler, float2(texCoord[0] / scale - step[0], texCoord[1] / scale + step[1]));
                float4 px7 = tex2D(TextureSampler, float2(texCoord[0] / scale,           texCoord[1] / scale + step[1]));
                float4 px8 = tex2D(TextureSampler, float2(texCoord[0] / scale + step[0], texCoord[1] / scale + step[1]));
                tex2 = (px1 + px2 + px3 + px4 + px5 + px6 + px7 + px8) / 8;
                tex2.a = 1;
            }
            return tex2;
        }

        technique Resample
        {
            pass Pass1
            {
                PixelShader = compile ps_2_0 PixelShader();
            }
        }

    The problem is that programming a pixel shader requires a different approach: we don't control the current position, only the 'inner' body of the loop over pixels. I've been googling for about a whole day and found no open-source library with scaling algorithms written in a loop I could lift. Is there such a library from which I could port some methods? I found http://www.codeproject.com/KB/GDI-plus/imgresizoutperfgdiplus.aspx but I really don't understand that approach to the problem, and porting it would be a pain in the ... Wikipedia gives a mathematical treatment. So my question is: where can I find an easy-to-port open-source graphics library which includes simple scaling algorithms - if such a library even exists? :)
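
    On the "easy-to-port open source" front, one candidate worth a look (a suggestion, not something from the original thread) is the single-header stb_image_resize: even if the end goal is HLSL, its plain-C kernels are readable reference implementations. Minimal use of the v1 API:

        // stb_image_resize v1: https://github.com/nothings/stb
        #define STB_IMAGE_RESIZE_IMPLEMENTATION
        #include "stb_image_resize.h"

        // in/out are tightly packed RGBA8 buffers; a stride of 0 means "packed".
        void downscale(const unsigned char* in, int inW, int inH,
                       unsigned char* out, int outW, int outH)
        {
            stbir_resize_uint8(in, inW, inH, 0, out, outW, outH, 0, 4);
        }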

    Read the article

  • OpenGL ES 2.0: Vertex and Fragment Shader for 2D with Transparency

    - by Bunkai.Satori
    Could I kindly ask for correct examples of OpenGL ES 2.0 vertex and fragment shaders for displaying 2D textured sprites with transparency? I have fairly simple shaders that display textured polygon pairs, but transparency is not applied, despite the following: the texture map contains transparency information, and blending is enabled:

        glEnable(GL_BLEND);
        glEnable(GL_DEPTH_TEST);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    My vertex shader:

        uniform mat4 uOrthoProjection;
        uniform vec3 Translation;

        attribute vec4 Position;
        attribute vec2 TextureCoord;

        varying vec2 TextureCoordOut;

        void main()
        {
            gl_Position = uOrthoProjection * (Position + vec4(Translation, 0));
            TextureCoordOut = TextureCoord;
        }

    My fragment shader:

        varying mediump vec2 TextureCoordOut;
        uniform sampler2D Sampler;

        void main()
        {
            gl_FragColor = texture2D(Sampler, TextureCoordOut);
        }
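
    One frequent culprit worth checking (a guess; the shaders above look fine): with GL_DEPTH_TEST enabled, sprites drawn first write depth and can clip later blended sprites. A common arrangement keeps the depth test but disables depth writes for the transparent pass, drawn back to front:

        #include <GLES2/gl2.h>

        // Assumed helper that draws the transparent sprites far-to-near.
        void drawTransparentSpritesSorted();

        void drawBlendedPass()
        {
            glEnable(GL_BLEND);
            glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

            glEnable(GL_DEPTH_TEST);
            glDepthMask(GL_FALSE);   // test against opaque depth, but don't write it

            drawTransparentSpritesSorted();

            glDepthMask(GL_TRUE);    // restore for the next opaque pass
        }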

    Read the article

  • Shader effect similar to Metro 2033 gasmask

    - by Tim
    I was thinking about effects in games the other day and was reminded of the gasmask effect from Metro 2033: once you put the gasmask on, the view blurred a bit in the corners, and the mask could ice up and even get cracked. I assume that something like that is done using a shader. I have been experimenting a bit with game development - so far mostly playing with existing rendering engines and adding physics support, etc. - and I would like to learn more about this sort of effect. Can someone give me a simple example of a shader that would alter the entire scene like this? Or, if not a shader, an idea of how it would be done? Thanks. (Edit: included a screenshot of the Metro 2033 gasmask effect.)

    Read the article

  • Bitwise operators in DX9 ps_2_0 shader

    - by lapin
    I've got the following code in a shader:

        // v & y are both floats
        nPixel = v;
        nPixel << 8;
        nPixel |= y;

    and this gives me the following error in compilation:

        shader.fx(80,10): error X3535: Bitwise operations not supported on legacy targets.
        shader.fx(92,18): ID3DXEffectCompiler::CompileEffect: There was an error compiling expression
        ID3DXEffectCompiler: Compilation failed

    The error is on the following line:

        nPixel |= y;

    What am I doing wrong here?

    Read the article

  • Cool examples of procedural pixel shader effects?

    - by Robert Fraser
    What are some good examples of procedural/screen-space pixel shader effects? No code necessary; just looking for inspiration. In particular, I'm looking for effects that are not dependent on geometry or the rest of the scene (they would look okay rendered alone on a quad) and are not image processing (they don't require a "base image", though they can incorporate textures). Multi-pass or single-pass is fine. Screenshots or videos would be ideal, but ideas work too. Here are a few examples of what I'm looking for (all from the RenderMonkey samples).

    PS - I'm aware of this question; I'm not asking for a source of actual shader implementations, but for some inspirational ideas - and the ones at the NVIDIA Shader Library mostly require a scene or are image-processing effects.

    EDIT: this is an open-ended question and I wish there were a good way to split the bounty. I'll award the rep to the best answer on the last day.

    Read the article
