Search Results

Search found 17287 results on 692 pages for 'card game'.

Page 382/692

  • Sprite/Tile Sheets Vs Single Textures

    - by Reanimation
    I'm making a race circuit which is constructed using various textures. For background: I'm writing it in C++ and creating quads with OpenGL, to which I assign a loaded .raw texture. Currently I use 23 textures of 500 x 500 px, which are all loaded and freed individually. I have now combined them all into a single sprite/tile sheet of 3000 x 2000 pixels, since the number of textures/tiles I'm using is increasing. Now I'm wondering whether it's more efficient to load them individually or to write extra code to extract a certain tile from the sheet. Is it better to load the sheet, then extract and store all 23 tiles from it, or to load the sheet each time and crop it to the correct tile? There seem to be a number of ways to implement it... Thanks in advance.
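
    A rough sketch of the usual middle ground: load the sheet once as a single texture and render each tile by adjusting the quad's texture coordinates, so nothing is cropped or re-uploaded per tile. The original project is C++/OpenGL; the Java below only illustrates the arithmetic, and the sheet/tile sizes are taken from the question.

        // Minimal sketch: compute the UV rectangle of one tile inside a packed sheet,
        // assuming a 3000x2000 sheet of 500x500 tiles (6 columns, 4 rows).
        public class TileUV {
            static final int SHEET_W = 3000, SHEET_H = 2000;
            static final int TILE = 500;
            static final int COLS = SHEET_W / TILE;

            // Returns {u0, v0, u1, v1} for tile index 0..23, applied to the quad's corners.
            static float[] uvForTile(int index) {
                int col = index % COLS;
                int row = index / COLS;
                float u0 = (col * TILE) / (float) SHEET_W;
                float v0 = (row * TILE) / (float) SHEET_H;
                float u1 = ((col + 1) * TILE) / (float) SHEET_W;
                float v1 = ((row + 1) * TILE) / (float) SHEET_H;
                return new float[] { u0, v0, u1, v1 };
            }

            public static void main(String[] args) {
                float[] uv = uvForTile(7); // second row, second column
                System.out.printf("u0=%.3f v0=%.3f u1=%.3f v1=%.3f%n", uv[0], uv[1], uv[2], uv[3]);
            }
        }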

  • Calculating the correct particle angle in an outwards explosion

    - by Sun
    I'm creating a simple particle explosion but am stuck in finding the correct angle to rotate my particle. The effect I'm going for is similar to this: Where each particle is going outwards from the point of origin and at the correct angle. This is what I currently have: As you can see, each particle is facing the same angle, but I'm having a little difficulty figuring out the correct angle. I have the vector for the point of emission and the new vector for each particle, how can I use this to calculate the angle? Some code for reference: private Particle CreateParticle() { ... Vector2 velocity = new Vector2(2.0f * (float)(random.NextDouble() * 2 - 1), 2.0f * (float)(random.NextDouble() * 2 - 1)); direction = velocity - ParticleLocation; float angle = (float)Math.Atan2(direction.Y, direction.X); ... return new Particle(texture, position, velocity, angle, angularVelocity, color, size, ttl, EmitterLocation); } I am then using the angle created as so in my particles Draw method: spriteBatch.Draw(Texture, Position, null, Color, Angle, origin, Size, SpriteEffects.None, 0f);
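
    One likely culprit: velocity is already a direction, so subtracting ParticleLocation mixes a direction with a position. A minimal sketch of taking the angle straight from the particle's own velocity; the original code is C#/XNA, the Java here is purely illustrative.

        // Minimal sketch: derive the outward-facing angle from the particle's velocity.
        public class ParticleAngle {
            // Angle (radians) a sprite should be rotated to face along its velocity.
            static float angleFromVelocity(float vx, float vy) {
                return (float) Math.atan2(vy, vx);
            }

            public static void main(String[] args) {
                // A particle moving up and to the right should face roughly 45 degrees.
                float angle = angleFromVelocity(2.0f, 2.0f);
                System.out.println(Math.toDegrees(angle)); // ~45.0
            }
        }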

  • How can I simulate a rigid body bounced from a wall in 3D world?

    - by HyperGroups
    How can I simulate a rigid sword bouncing off a wall and hitting the ground, as it would in the physical world? I want to use this for a simple animation. I can detect the shape and size of the sword (which may be needed for the bounce). Rotation can be controlled by quaternions/matrices/Euler angles. The sword should flip end over end, rotate, and fall to the ground. How can I simulate this physical process? Maybe what I need is an equation and some parameters? I need this data and would combine it into my movie file; I use Mathematica to generate the movie file (if I have the data, I can also export it into a 3ds Max script, for example).
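
    A very rough starting point, if a full physics engine is overkill: integrate the position under gravity, give the sword a constant angular velocity for the tumble, and whenever it touches a surface reflect the velocity about that surface's normal, scaled by a restitution factor. All numbers below (wall position, spin, restitution) are made-up parameters, not values from the question, and the sketch treats the sword as a point with an attached rotation.

        // Minimal sketch: ballistic motion with restitution bounces off a wall at x = 2 and the ground at y = 0.
        public class SwordBounce {
            public static void main(String[] args) {
                double[] pos = { 0, 2, 0 };           // metres
                double[] vel = { 3, 1, 0 };           // m/s, thrown towards the wall
                double spin = 8.0;                    // rad/s, drives the tumbling
                double angle = 0.0;                   // accumulated rotation
                double g = -9.81, restitution = 0.4, dt = 1.0 / 60.0;

                for (int step = 0; step < 240; step++) {
                    vel[1] += g * dt;                 // gravity
                    for (int i = 0; i < 3; i++) pos[i] += vel[i] * dt;
                    angle += spin * dt;

                    if (pos[0] > 2) {                 // wall: reflect the velocity about its normal
                        pos[0] = 2;
                        vel[0] = -vel[0] * restitution;
                    }
                    if (pos[1] < 0) {                 // ground: reflect, damp, let the sword settle
                        pos[1] = 0;
                        vel[1] = -vel[1] * restitution;
                        spin *= restitution;          // crude energy loss in the tumble as well
                    }
                    System.out.printf("t=%.2f pos=(%.2f, %.2f, %.2f) angle=%.2f%n",
                            step * dt, pos[0], pos[1], pos[2], angle);
                }
            }
        }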

  • A*: how do I make a natural-looking path?

    - by user11177
    I've been reading this: http://theory.stanford.edu/~amitp/GameProgramming/Heuristics.html But there are some things I don't understand. For example, the article says to use something like this for pathfinding with diagonal movement: function heuristic(node) = dx = abs(node.x - goal.x) dy = abs(node.y - goal.y) return D * max(dx, dy) I don't know how to set D to get a natural-looking path like in the article. I set D to the lowest cost between adjacent squares, as it says, and I don't understand the part about the heuristic being 4*D; that does not seem to change anything. This is my heuristic function and move function: def heuristic(self, node, goal): D = 10 dx = abs(node.x - goal.x) dy = abs(node.y - goal.y) return D * max(dx, dy) def move_cost(self, current, node): cross = abs(current.x - node.x) == 1 and abs(current.y - node.y) == 1 return 19 if cross else 10 Result: The smooth sailing path we want to happen: The rest of my code: http://pastebin.com/TL2cEkeX
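
    For what it's worth, a heuristic usually produces nicer diagonal paths when it matches the actual move costs. With straight moves costing 10 and diagonals 14 (the usual approximation of 10*sqrt(2); the 19 used above would need the same adjustment), the "octile" form looks like the sketch below, optionally scaled up by a tiny factor to break ties between the many equal-length paths. The original code is Python; the Java here is only for illustration.

        // Minimal sketch: octile-distance heuristic matching straight/diagonal move costs,
        // with a small tie-breaking factor so A* prefers paths that head straight for the goal.
        public class Octile {
            static final double D = 10;    // straight step cost
            static final double D2 = 14;   // diagonal step cost (~ D * sqrt(2))

            static double heuristic(int x, int y, int goalX, int goalY) {
                int dx = Math.abs(x - goalX);
                int dy = Math.abs(y - goalY);
                double h = D * (dx + dy) + (D2 - 2 * D) * Math.min(dx, dy);
                return h * 1.001;          // tie-breaker: slightly above 1 keeps paths looking direct
            }

            public static void main(String[] args) {
                System.out.println(heuristic(0, 0, 5, 3)); // 3 diagonal + 2 straight steps -> ~62
            }
        }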

  • Memory allocation strategy for the vertex buffers (DirectX 10/11)

    - by Alex
    I have the following question. I'm writing a CAD system, so I have a 3D scene with many different objects (walls, doors, windows and so on), and the user can add or delete objects. The question is: how should I organise the storage of vertices for all my objects? I could create a vertex buffer for every object, but I think drawing and switching from one buffer to another would carry a performance penalty. Another way: I could create several big buffers, one per object type. But I don't understand how to update such buffers; they are too big to update as a whole (for example, the buffer for all walls). What do I need to do if I want to delete an object from the middle of a buffer? I have a similar question here: http://stackoverflow.com/questions/5515700/how-to-properly-update-vertex-buffers-in-directx-10 Most examples I've found work with very static models. They tend to create a single vertex buffer from their list of points, which is then only manipulated by matrix transformations. I, on the other hand, will be updating the scene very often.

  • Box2D Difference Between WorldCenter and Position

    - by Free Lancer
    So this problem has been bothering me for a couple of days now. First off, what is the difference between, say, Body.getWorldCenter() and Body.getPosition()? I heard that WorldCenter might have to do with the center of gravity or something. Second, when I create a Box2D Body for a sprite, the Body is always at the lower left corner. I check it by printing a Rectangle of 1 pixel around box.getWorldCenter(). From what I understand, the Body should be in the center of the Sprite and its bounding box should wrap around the Sprite, correct? Here's an image of what I mean (the Sprite is red, the Body blue): Here's some code: Body Creator: public static Body createBoxBody( final World pPhysicsWorld, final BodyType pBodyType, final FixtureDef pFixtureDef, Sprite pSprite ) { float pRotation = 0; float pCenterX = pSprite.getX() + pSprite.getWidth() / 2; float pCenterY = pSprite.getY() + pSprite.getHeight() / 2; float pWidth = pSprite.getWidth(); float pHeight = pSprite.getHeight(); final BodyDef boxBodyDef = new BodyDef(); boxBodyDef.type = pBodyType; //boxBodyDef.position.x = pCenterX / Constants.PIXEL_METER_RATIO; //boxBodyDef.position.y = pCenterY / Constants.PIXEL_METER_RATIO; boxBodyDef.position.x = pSprite.getX() / Constants.PIXEL_METER_RATIO; boxBodyDef.position.y = pSprite.getY() / Constants.PIXEL_METER_RATIO; Vector2 v = new Vector2( boxBodyDef.position.x * Constants.PIXEL_METER_RATIO, boxBodyDef.position.y * Constants.PIXEL_METER_RATIO ); Gdx.app.log("@Physics", "createBoxBody():: Box Position: " + v); // Temporary Box shape of the Body final PolygonShape boxPoly = new PolygonShape(); final float halfWidth = pWidth * 0.5f / Constants.PIXEL_METER_RATIO; final float halfHeight = pHeight * 0.5f / Constants.PIXEL_METER_RATIO; boxPoly.setAsBox( halfWidth, halfHeight ); // set the anchor point to be the center of the sprite pFixtureDef.shape = boxPoly; final Body boxBody = pPhysicsWorld.createBody(boxBodyDef); Gdx.app.log("@Physics", "createBoxBody():: Box Center: " + boxBody.getPosition().mul(Constants.PIXEL_METER_RATIO)); boxBody.createFixture(pFixtureDef); boxBody.setTransform( boxBody.getWorldCenter(), MathUtils.degreesToRadians * pRotation ); boxPoly.dispose(); return boxBody; } Making the Sprite: public Car( Texture texture, float pX, float pY, World world ) { super( "Car" ); mSprite = new Sprite( texture ); mSprite.setSize( mSprite.getWidth() / 6, mSprite.getHeight() / 6 ); mSprite.setPosition( pX, pY ); mSprite.setOrigin( mSprite.getWidth()/2, mSprite.getHeight()/2); FixtureDef carFixtureDef = new FixtureDef(); // Set the Fixture's properties, like friction, using the car's shape carFixtureDef.restitution = 1f; carFixtureDef.friction = 1f; carFixtureDef.density = 1f; // needed to rotate body using applyTorque mBody = Physics.createBoxBody( world, BodyDef.BodyType.DynamicBody, carFixtureDef, mSprite ); }
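
    For reference, getPosition() returns the body's origin (wherever the BodyDef placed it) while getWorldCenter() is the center of mass in world coordinates; for a single box fixture centred with setAsBox the two should coincide. Since setAsBox builds the box around the body origin, the body usually wants to sit at the sprite's centre, roughly as in the hedged sketch below (the pixels-per-metre ratio and sprite accessors mirror the question's own names).

        import com.badlogic.gdx.graphics.g2d.Sprite;
        import com.badlogic.gdx.math.Vector2;
        import com.badlogic.gdx.physics.box2d.*;

        public class CenteredBoxBody {
            // Minimal sketch: place the body at the sprite's centre so the setAsBox fixture wraps the sprite.
            public static Body create(World world, Sprite sprite, float pixelsPerMeter) {
                BodyDef def = new BodyDef();
                def.type = BodyDef.BodyType.DynamicBody;
                def.position.set((sprite.getX() + sprite.getWidth() * 0.5f) / pixelsPerMeter,
                                 (sprite.getY() + sprite.getHeight() * 0.5f) / pixelsPerMeter);

                Body body = world.createBody(def);
                PolygonShape box = new PolygonShape();
                box.setAsBox(sprite.getWidth() * 0.5f / pixelsPerMeter,
                             sprite.getHeight() * 0.5f / pixelsPerMeter);
                body.createFixture(box, 1f);
                box.dispose();
                return body;
            }

            // When rendering, offset by half the size so the sprite stays centred on the body.
            public static void syncSprite(Body body, Sprite sprite, float pixelsPerMeter) {
                Vector2 p = body.getPosition();
                sprite.setPosition(p.x * pixelsPerMeter - sprite.getWidth() * 0.5f,
                                   p.y * pixelsPerMeter - sprite.getHeight() * 0.5f);
            }
        }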

  • Calculating the rotational force of a 2D sprite

    - by Jon
    I am wondering if someone has an elegant way of calculating the following scenario. I have an object made up of (n) squares of random shapes, but we will pretend they are all rectangles. There is no gravity, so consider the object in space, viewed from a top-down perspective. I am applying a force to the object at a specific square (as illustrated below). How do I calculate the rotation, based on the force being applied and the location it is applied at? If the force is applied at the center square, the object would go straight. How should it behave the further I move from the center? How do I calculate the rotational velocity?
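
    A sketch of the usual rigid-body bookkeeping for this: the off-centre force contributes a torque equal to the 2D cross product of the offset (from the centre of mass to the application point) and the force; dividing by the moment of inertia gives an angular acceleration, which integrates into angular velocity just as force/mass integrates into linear velocity. The mass and inertia values below are placeholders, not anything from the question.

        // Minimal sketch: linear + angular response of a 2D rigid body to a force applied off-centre.
        public class OffCentreForce {
            public static void main(String[] args) {
                double mass = 4.0, inertia = 2.0;      // placeholder values
                double vx = 0, vy = 0, angularVel = 0; // current state
                double dt = 1.0 / 60.0;

                // Force applied one unit to the right of the centre of mass, pushing "up".
                double rx = 1.0, ry = 0.0;             // offset from centre of mass to application point
                double fx = 0.0, fy = 10.0;            // applied force

                double torque = rx * fy - ry * fx;     // 2D cross product r x F

                vx += (fx / mass) * dt;
                vy += (fy / mass) * dt;
                angularVel += (torque / inertia) * dt; // positive -> counter-clockwise spin

                System.out.printf("v=(%.3f, %.3f)  angularVel=%.3f rad/s%n", vx, vy, angularVel);
            }
        }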

  • Dynamic Jump spot

    - by Pasquale Sada
    I have an initial velocity V(Vx,Vy,Vz) and a spot S(Sx,Sy,Sz) where the character stands still. What I'm trying to achieve is a jump onto a spot E(Ex,Ey,Ez) that you have clicked on (only a lower or higher spot, because I already have a simple steering behavior in place for even terrain). There are no obstacles around. I've implemented a formula that can make him jump precisely onto a spot, but you need to declare an angle: the problem arises when the selected spot is straight above his head. It's pretty lame that the character just hangs there and can't reach a thing that is 1 cm above his head. I'll share the code I'm using: Vector3 dir = target - transform.position; // get target direction float h = dir.y; // get height difference dir.y = 0; // retain only the horizontal direction float dist = dir.magnitude ; // get horizontal distance float a = angle * Mathf.Deg2Rad; // convert angle to radians dir.y = dist * Mathf.Tan(a); // set dir to the elevation angle dist += h / Mathf.Tan(a); // correct for small height differences // calculate the velocity magnitude float vel = Mathf.Sqrt(dist * Physics.gravity.magnitude / Mathf.Sin(2 *a)); return vel * dir.normalized;
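
    One common workaround is to fix the jump's apex height instead of the launch angle: the vertical speed comes from the apex, the flight time follows from it, and the horizontal speed is whatever covers the remaining distance in that time. That degrades gracefully when the target is straight overhead, since the horizontal part simply goes to zero. A hedged sketch (the original is Unity C#; gravity and clearance here are made-up parameters):

        // Minimal sketch: launch velocity for a jump that peaks a bit above the higher of the two points.
        public class JumpVelocity {
            // start/end are (x, y, z) with y up; returns the initial velocity (vx, vy, vz).
            static double[] jumpVelocity(double[] start, double[] end, double gravity, double clearance) {
                double apex = Math.max(start[1], end[1]) + clearance;

                double vy = Math.sqrt(2 * gravity * (apex - start[1]));   // rise to the apex
                double tUp = vy / gravity;
                double tDown = Math.sqrt(2 * (apex - end[1]) / gravity);  // fall from apex to target
                double t = tUp + tDown;

                double dx = end[0] - start[0], dz = end[2] - start[2];
                return new double[] { dx / t, vy, dz / t };
            }

            public static void main(String[] args) {
                // Target almost straight above the character's head: horizontal speed goes to ~0.
                double[] v = jumpVelocity(new double[] {0, 0, 0}, new double[] {0, 1.5, 0}, 9.81, 0.5);
                System.out.printf("v = (%.2f, %.2f, %.2f)%n", v[0], v[1], v[2]);
            }
        }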

  • Working out of a vertex array for destructible objects

    - by bobobobo
    I have diamond-shaped polygonal bullets. There are lots of them on the screen. I did not want to create a vertex array for each, so I packed them into a single vertex array and they're all drawn at once: | bullet1.xyz | bullet1.rgb | bullet2.xyz | bullet2.rgb This is great for performance. There is also struct Bullet { vector<Vector3f*> verts ; // pointers into the vertex buffer } ; This works fine: the bullets can move and do collision detection, all while having their data in one place. Except when a bullet "dies" you have to clear a slot and pack all the bullets towards the beginning of the array. Is this a good approach to handling lots of low-poly objects? How else would you do it?
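
    One common trick for exactly this layout, rather than shifting everything down: overwrite the dead bullet's slot with the last live bullet's vertex data, then shrink the live count by one, and draw only the first liveCount blocks. A small sketch of the bookkeeping; the names and layout are illustrative (the original is C++), and the same idea carries straight over to the packed array above.

        // Minimal sketch: "swap with last" removal in a packed vertex array of equally sized objects.
        public class PackedBullets {
            static final int FLOATS_PER_BULLET = 4 * 6;   // e.g. 4 verts * (xyz + rgb)

            float[] vertexData = new float[1024 * FLOATS_PER_BULLET];
            int liveCount = 0;

            void kill(int index) {
                int last = liveCount - 1;
                if (index != last) {
                    // Copy the last bullet's block over the dead one's block; order is not preserved,
                    // which is fine for independent bullets. Any per-bullet object that caches its
                    // slot (or pointers into the buffer) must be re-pointed here as well.
                    System.arraycopy(vertexData, last * FLOATS_PER_BULLET,
                                     vertexData, index * FLOATS_PER_BULLET,
                                     FLOATS_PER_BULLET);
                }
                liveCount--;                              // draw only the first liveCount blocks afterwards
            }
        }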

  • The right way to add images to Monogame/Windows

    - by ashes999
    I'm starting out with MonoGame. For now, I'm only targeting Windows (desktop -- not Windows 8 specifically). I've used a couple of XNA products in the past (raw XNA, FlatRedBall, SilverSprite), so I may have a misunderstanding about how I should add images to my content. How do I add images to my project? Currently, I have created a new MonoGame project, added a folder called "Content," and added images under there; the only caveat is that I need to set the Copy to Output Directory action to one of the Copy ones. It seems strange, because my "raw" XNA project just last week had a Content project in it (XNA Framework Content Pipeline, according to VS2010), which compiled my images to XNB (I think). It seems like MonoGame doesn't use the same content pipeline, but I'm not sure. Edit: My question is not "how do I get the XNA content pipeline to work with MonoGame?" My question is "why would I want to use the XNA content pipeline in MonoGame?" Because there are (at least) two solutions that I see today: add the images to the MonoGame project and set the Copy to Output Directory options to copy, or add an XNA content pipeline project, add my images to that instead, and reference it from my MonoGame project. Which solution should I use, and why? I currently have a working version with the first option.

  • How can I model a pendulum blade?

    - by Micah Delane Bolen
    Like this one from Saw V: What primitive shape/s would you start out with? How would you transform the primitive shape/s to give it a nice, smooth, sharp blade on one side without distorting the entire object in a weird way? I tried starting out with a cylinder and then subtracting the top half using a duplicate cylinder and a difference modifier, but I ended up distorting the entire object when I tried to pull the "blade" edges together. I think I need to add lattices to smoothly "sharpen" the edge of the blade.

  • OpenGL directional light creating black spots

    - by AnonymousDeveloper
    I probably ought to start by saying that I suspect the problem is that one of my vectors is not in the correct "space", but I don't know for sure. I am having a strange problem with a directional light. When I move the camera away from (0.0, 0.0, 0.0) it creates tiny black spots that grow larger as the distance increases. I apologize ahead of time for the length of the code. Vertex shader: #version 410 core in vec3 vf_normal; in vec3 vf_bitangent; in vec3 vf_tangent; in vec2 vf_textureCoordinates; in vec3 vf_vertex; out vec3 tc_normal; out vec3 tc_bitangent; out vec3 tc_tangent; out vec2 tc_textureCoordinates; out vec3 tc_vertex; uniform mat3 vf_m_normal; uniform mat4 vf_m_model; uniform mat4 vf_m_mvp; uniform mat4 vf_m_projection; uniform mat4 vf_m_view; uniform float vf_te_inner; uniform float vf_te_outer; void main() { tc_normal = vf_normal; tc_bitangent = vf_bitangent; tc_tangent = vf_tangent; tc_textureCoordinates = vf_textureCoordinates; tc_vertex = vf_vertex; gl_Position = vf_m_mvp * vec4(vf_vertex, 1.0); } Tessellation Control shader: #version 410 core layout (vertices = 3) out; in vec3 tc_normal[]; in vec3 tc_bitangent[]; in vec3 tc_tangent[]; in vec2 tc_textureCoordinates[]; in vec3 tc_vertex[]; out vec3 te_normal[]; out vec3 te_bitangent[]; out vec3 te_tangent[]; out vec2 te_textureCoordinates[]; out vec3 te_vertex[]; uniform float vf_te_inner; uniform float vf_te_outer; uniform vec4 vf_l_color; uniform vec3 vf_l_position; uniform mat4 vf_m_depthBias; uniform mat4 vf_m_model; uniform mat4 vf_m_mvp; uniform mat4 vf_m_projection; uniform mat4 vf_m_view; uniform sampler2D vf_t_diffuse; uniform sampler2D vf_t_normal; uniform sampler2DShadow vf_t_shadow; uniform sampler2D vf_t_specular; #define ID gl_InvocationID float getTessLevelInner(float distance0, float distance1) { float avgDistance = (distance0 + distance1) / 2.0; return clamp((vf_te_inner - avgDistance), 1.0, vf_te_inner); } float getTessLevelOuter(float distance0, float distance1) { float avgDistance = (distance0 + distance1) / 2.0; return clamp((vf_te_outer - avgDistance), 1.0, vf_te_outer); } void main() { te_normal[gl_InvocationID] = tc_normal[gl_InvocationID]; te_bitangent[gl_InvocationID] = tc_bitangent[gl_InvocationID]; te_tangent[gl_InvocationID] = tc_tangent[gl_InvocationID]; te_textureCoordinates[gl_InvocationID] = tc_textureCoordinates[gl_InvocationID]; te_vertex[gl_InvocationID] = tc_vertex[gl_InvocationID]; float eyeToVertexDistance0 = distance(vec3(0.0), vec4(vf_m_view * vec4(tc_vertex[0], 1.0)).xyz); float eyeToVertexDistance1 = distance(vec3(0.0), vec4(vf_m_view * vec4(tc_vertex[1], 1.0)).xyz); float eyeToVertexDistance2 = distance(vec3(0.0), vec4(vf_m_view * vec4(tc_vertex[2], 1.0)).xyz); gl_TessLevelOuter[0] = getTessLevelOuter(eyeToVertexDistance1, eyeToVertexDistance2); gl_TessLevelOuter[1] = getTessLevelOuter(eyeToVertexDistance2, eyeToVertexDistance0); gl_TessLevelOuter[2] = getTessLevelOuter(eyeToVertexDistance0, eyeToVertexDistance1); gl_TessLevelInner[0] = getTessLevelInner(eyeToVertexDistance2, eyeToVertexDistance0); } Tessellation Evaluation shader: #version 410 core layout (triangles, equal_spacing, cw) in; in vec3 te_normal[]; in vec3 te_bitangent[]; in vec3 te_tangent[]; in vec2 te_textureCoordinates[]; in vec3 te_vertex[]; out vec3 g_normal; out vec3 g_bitangent; out vec4 g_patchDistance; out vec3 g_tangent; out vec2 g_textureCoordinates; out vec3 g_vertex; uniform float vf_te_inner; uniform float vf_te_outer; uniform vec4 vf_l_color; uniform vec3 vf_l_position; uniform mat4 vf_m_depthBias; 
uniform mat4 vf_m_model; uniform mat4 vf_m_mvp; uniform mat3 vf_m_normal; uniform mat4 vf_m_projection; uniform mat4 vf_m_view; uniform sampler2D vf_t_diffuse; uniform sampler2D vf_t_displace; uniform sampler2D vf_t_normal; uniform sampler2DShadow vf_t_shadow; uniform sampler2D vf_t_specular; vec2 interpolate2D(vec2 v0, vec2 v1, vec2 v2) { return vec2(gl_TessCoord.x) * v0 + vec2(gl_TessCoord.y) * v1 + vec2(gl_TessCoord.z) * v2; } vec3 interpolate3D(vec3 v0, vec3 v1, vec3 v2) { return vec3(gl_TessCoord.x) * v0 + vec3(gl_TessCoord.y) * v1 + vec3(gl_TessCoord.z) * v2; } float amplify(float d, float scale, float offset) { d = scale * d + offset; d = clamp(d, 0, 1); d = 1 - exp2(-2*d*d); return d; } float getDisplacement(vec2 t0, vec2 t1, vec2 t2) { float displacement = 0.0; vec2 textureCoordinates = interpolate2D(t0, t1, t2); vec2 vector = ((t0 + t1 + t2) / 3.0); float sampleDistance = sqrt((vector.x * vector.x) + (vector.y * vector.y)); sampleDistance /= ((vf_te_inner + vf_te_outer) / 2.0); displacement += texture(vf_t_displace, textureCoordinates).x; displacement += texture(vf_t_displace, textureCoordinates + vec2(-sampleDistance, -sampleDistance)).x; displacement += texture(vf_t_displace, textureCoordinates + vec2(-sampleDistance, sampleDistance)).x; displacement += texture(vf_t_displace, textureCoordinates + vec2( sampleDistance, sampleDistance)).x; displacement += texture(vf_t_displace, textureCoordinates + vec2( sampleDistance, -sampleDistance)).x; return (displacement / 5.0); } void main() { g_normal = normalize(interpolate3D(te_normal[0], te_normal[1], te_normal[2])); g_bitangent = normalize(interpolate3D(te_bitangent[0], te_bitangent[1], te_bitangent[2])); g_patchDistance = vec4(gl_TessCoord, (1.0 - gl_TessCoord.y)); g_tangent = normalize(interpolate3D(te_tangent[0], te_tangent[1], te_tangent[2])); g_textureCoordinates = interpolate2D(te_textureCoordinates[0], te_textureCoordinates[1], te_textureCoordinates[2]); g_vertex = interpolate3D(te_vertex[0], te_vertex[1], te_vertex[2]); float displacement = getDisplacement(te_textureCoordinates[0], te_textureCoordinates[1], te_textureCoordinates[2]); float d2 = min(min(min(g_patchDistance.x, g_patchDistance.y), g_patchDistance.z), g_patchDistance.w); d2 = amplify(d2, 50, -0.5); g_vertex += g_normal * displacement * 0.1 * d2; gl_Position = vf_m_mvp * vec4(g_vertex, 1.0); } Geometry shader: #version 410 core layout (triangles) in; layout (triangle_strip, max_vertices = 3) out; in vec3 g_normal[3]; in vec3 g_bitangent[3]; in vec4 g_patchDistance[3]; in vec3 g_tangent[3]; in vec2 g_textureCoordinates[3]; in vec3 g_vertex[3]; out vec3 f_tangent; out vec3 f_bitangent; out vec3 f_eyeDirection; out vec3 f_lightDirection; out vec3 f_normal; out vec4 f_patchDistance; out vec4 f_shadowCoordinates; out vec2 f_textureCoordinates; out vec3 f_vertex; uniform vec4 vf_l_color; uniform vec3 vf_l_position; uniform mat4 vf_m_depthBias; uniform mat4 vf_m_model; uniform mat4 vf_m_mvp; uniform mat3 vf_m_normal; uniform mat4 vf_m_projection; uniform mat4 vf_m_view; uniform sampler2D vf_t_diffuse; uniform sampler2D vf_t_normal; uniform sampler2DShadow vf_t_shadow; uniform sampler2D vf_t_specular; void main() { int index = 0; while (index < 3) { vec3 vertexNormal_cameraspace = vf_m_normal * normalize(g_normal[index]); vec3 vertexTangent_cameraspace = vf_m_normal * normalize(f_tangent); vec3 vertexBitangent_cameraspace = vf_m_normal * normalize(f_bitangent); mat3 TBN = transpose(mat3( vertexTangent_cameraspace, vertexBitangent_cameraspace, vertexNormal_cameraspace )); 
vec3 eyeDirection = -(vf_m_view * vf_m_model * vec4(g_vertex[index], 1.0)).xyz; vec3 lightDirection = normalize(-(vf_m_view * vec4(vf_l_position, 1.0)).xyz); f_eyeDirection = TBN * eyeDirection; f_lightDirection = TBN * lightDirection; f_normal = normalize(g_normal[index]); f_patchDistance = g_patchDistance[index]; f_shadowCoordinates = vf_m_depthBias * vec4(g_vertex[index], 1.0); f_textureCoordinates = g_textureCoordinates[index]; f_vertex = (vf_m_model * vec4(g_vertex[index], 1.0)).xyz; gl_Position = gl_in[index].gl_Position; EmitVertex(); index ++; } EndPrimitive(); } Fragment shader: #version 410 core in vec3 f_bitangent; in vec3 f_eyeDirection; in vec3 f_lightDirection; in vec3 f_normal; in vec4 f_patchDistance; in vec4 f_shadowCoordinates; in vec3 f_tangent; in vec2 f_textureCoordinates; in vec3 f_vertex; out vec4 fragColor; uniform vec4 vf_l_color; uniform vec3 vf_l_position; uniform mat4 vf_m_depthBias; uniform mat4 vf_m_model; uniform mat4 vf_m_mvp; uniform mat4 vf_m_projection; uniform mat4 vf_m_view; uniform sampler2D vf_t_diffuse; uniform sampler2D vf_t_normal; uniform sampler2DShadow vf_t_shadow; uniform sampler2D vf_t_specular; vec2 poissonDisk[16] = vec2[]( vec2(-0.94201624, -0.39906216), vec2( 0.94558609, -0.76890725), vec2(-0.09418410, -0.92938870), vec2( 0.34495938, 0.29387760), vec2(-0.91588581, 0.45771432), vec2(-0.81544232, -0.87912464), vec2(-0.38277543, 0.27676845), vec2( 0.97484398, 0.75648379), vec2( 0.44323325, -0.97511554), vec2( 0.53742981, -0.47373420), vec2(-0.26496911, -0.41893023), vec2( 0.79197514, 0.19090188), vec2(-0.24188840, 0.99706507), vec2(-0.81409955, 0.91437590), vec2( 0.19984126, 0.78641367), vec2( 0.14383161, -0.14100790) ); float random(vec3 seed, int i) { vec4 seed4 = vec4(seed,i); float dot_product = dot(seed4, vec4(12.9898, 78.233, 45.164, 94.673)); return fract(sin(dot_product) * 43758.5453); } float amplify(float d, float scale, float offset) { d = scale * d + offset; d = clamp(d, 0, 1); d = 1 - exp2(-2.0 * d * d); return d; } void main() { vec3 lightColor = vf_l_color.xyz; float lightPower = vf_l_color.w; vec3 materialDiffuseColor = texture(vf_t_diffuse, f_textureCoordinates).xyz; vec3 materialAmbientColor = vec3(0.1, 0.1, 0.1) * materialDiffuseColor; vec3 materialSpecularColor = texture(vf_t_specular, f_textureCoordinates).xyz; vec3 n = normalize(texture(vf_t_normal, f_textureCoordinates).rgb * 2.0 - 1.0); vec3 l = normalize(f_lightDirection); float cosTheta = clamp(dot(n, l), 0.0, 1.0); vec3 E = normalize(f_eyeDirection); vec3 R = reflect(-l, n); float cosAlpha = clamp(dot(E, R), 0.0, 1.0); float visibility = 1.0; float bias = 0.005 * tan(acos(cosTheta)); bias = clamp(bias, 0.0, 0.01); for (int i = 0; i < 4; i ++) { float shading = (0.5 / 4.0); int index = i; visibility -= shading * (1.0 - texture(vf_t_shadow, vec3(f_shadowCoordinates.xy + poissonDisk[index] / 3000.0, (f_shadowCoordinates.z - bias) / f_shadowCoordinates.w))); }\n" fragColor.xyz = materialAmbientColor + visibility * materialDiffuseColor * lightColor * lightPower * cosTheta + visibility * materialSpecularColor * lightColor * lightPower * pow(cosAlpha, 5); fragColor.w = texture(vf_t_diffuse, f_textureCoordinates).w; } The following images should be enough to give you an idea of the problem. Before moving the camera: Moving the camera just a little. Moving it to the center of the scene.

  • Objects disappear when zoomed out in Unity

    - by Starkers
    Ignore the palm trees here. I have some oak-like trees when I'm zoomed in: They disappear when I zoom out: Is this normal? Is this something to do with draw distance? How can I change this so my trees don't disappear? The reason I ask is because my installation had a weird terrain glitch. If this isn't normal I'm going to reinstall right away because I'm always thinking 'is that a feature? Or a glitch'?

  • Correcting Lighting in Stencil Reflections

    - by Reanimation
    I'm just playing around with OpenGL seeing how different methods of making shadows and reflections work. I've been following this tutorial which describes using GLUT_STENCIL's and MASK's to create a reasonable interpretation of a reflection. Following that and a bit of tweaking to get things to work, I've come up with the code below. Unfortunately, the lighting isn't correct when the reflection is created. glPushMatrix(); plane(); //draw plane that reflection appears on glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE); glDepthMask(GL_FALSE); glEnable(GL_STENCIL_TEST); glStencilFunc(GL_ALWAYS, 1, 0xFFFFFFFF); glStencilOp(GL_REPLACE, GL_REPLACE, GL_REPLACE); plane(); //draw plane that acts as clipping area for reflection glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE); glDepthMask(GL_TRUE); glStencilFunc(GL_EQUAL, 1, 0xFFFFFFFF); glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP); glDisable(GL_DEPTH_TEST); glPushMatrix(); glScalef(1.0f, -1.0f, 1.0f); glTranslatef(0,2,0); glRotatef(180,0,1,0); sphere(radius, spherePos); //draw object that you want to have a reflection glPopMatrix(); glEnable(GL_DEPTH_TEST); glDisable(GL_STENCIL_TEST); sphere(radius, spherePos); //draw object that creates reflection glPopMatrix(); It looked really cool to start with, then I noticed that the light in the reflection isn't correct. I'm not sure that it's even a simple fix because effectively the reflection is also a sphere but I thought I'd ask here none-the-less. I've tried various rotations (seen above the first time the sphere is drawn) but it doesn't seem to work. I figure it needs to rotate around the Y and Z axis but that's not correct. Have I implemented the rotation wrong or is there a way to correct the lighting?

  • Very basic OpenGL ES 2 error

    - by user16547
    This is an incredibly simple shader, yet I'm having a lot of trouble understanding what's wrong with it. I'm trying to send a float to my fragment shader. Its purpose is to adjust the alpha of the fragment colour. Here is my fragment shader: precision mediump float; uniform sampler2D u_Texture; uniform float u_Alpha; varying vec2 v_TexCoordinate; void main() { gl_FragColor = texture2D(u_Texture, v_TexCoordinate); gl_FragColor.a *= u_Alpha; } and below is my rendering method. I get a 1282 (invalid operation) on the GLES20.glUniform1f(u_Alpha, alpha); line. alpha is 1 (but I tried other values as well) and transparent is true: public void render() { GLES20.glUseProgram(mProgram); if(transparent) { GLES20.glEnable(GLES20.GL_BLEND); GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA); GLES20.glUniform1f(u_Alpha, alpha); } Matrix.setIdentityM(mModelMatrix, 0); Matrix.rotateM(mModelMatrix, 0, angle, 0, 0, 1); Matrix.translateM(mModelMatrix, 0, x, y, z); Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0); Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0); GLES20.glUniformMatrix4fv(u_MVPMatrix, 1, false, mMVPMatrix, 0); GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, vbo[0]); GLES20.glVertexAttribPointer(a_Position, 3, GLES20.GL_FLOAT, false, 12, 0); GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, vbo[1]); GLES20.glVertexAttribPointer(a_TexCoordinate, 2, GLES20.GL_FLOAT, false, 8, 0); //snowTexture start GLES20.glActiveTexture(GLES20.GL_TEXTURE0); GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]); GLES20.glUniform1i(u_Texture, 0); GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, ibo[0]); GLES20.glDrawElements(GLES20.GL_TRIANGLE_STRIP, indices.capacity(), GLES20.GL_UNSIGNED_BYTE, 0); GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0); GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, 0); if(transparent) { GLES20.glDisable(GLES20.GL_BLEND); } GLES20.glUseProgram(0); }
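
    GL_INVALID_OPERATION from glUniform1f usually means the location being passed does not belong to the program currently in use (for example, it was fetched from a different program object or before the program was linked), or the uniform's type doesn't match. A hedged sketch of fetching and checking the handle once after linking; member names follow the question, the shader-handle parameters are assumed to come from your own compile step.

        import android.opengl.GLES20;
        import android.util.Log;

        public class AlphaProgram {
            private int mProgram;
            private int u_Alpha = -1;

            // Minimal sketch: fetch uniform handles once, after the program has been linked.
            public void init(int vertexShaderHandle, int fragmentShaderHandle) {
                mProgram = GLES20.glCreateProgram();
                GLES20.glAttachShader(mProgram, vertexShaderHandle);
                GLES20.glAttachShader(mProgram, fragmentShaderHandle);
                GLES20.glLinkProgram(mProgram);

                u_Alpha = GLES20.glGetUniformLocation(mProgram, "u_Alpha");
                if (u_Alpha == -1) {
                    // Unused uniforms are stripped at compile time; -1 is silently ignored by glUniform*,
                    // so a GL_INVALID_OPERATION usually means the handle came from a different program.
                    Log.w("Renderer", "u_Alpha not found");
                }
            }

            // Make sure the very same program is bound before setting its uniforms.
            public void setAlpha(float alpha) {
                GLES20.glUseProgram(mProgram);
                GLES20.glUniform1f(u_Alpha, alpha);
            }
        }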

  • AndEngine GLES2 - getting Black screen on emulator 4.1

    - by dizworld.com
    I'm new to AndEngine. I created the following code: public class MainActivity extends BaseGameActivity { static final int CAMERA_WIDTH = 800; static final int CAMERA_HEIGHT = 480; public Font mFont; public Camera mCamera; //A reference to the current scene public Scene mCurrentScene; public static BaseActivity instance; public EngineOptions onCreateEngineOptions() { instance = this; mCamera = new Camera(0, 0, CAMERA_WIDTH, CAMERA_HEIGHT); return new EngineOptions(true, ScreenOrientation.LANDSCAPE_SENSOR, new RatioResolutionPolicy(CAMERA_WIDTH, CAMERA_HEIGHT), mCamera); } @Override public void onCreateResources(OnCreateResourcesCallback arg0) throws Exception { mFont = FontFactory.create(this.getFontManager(),this.getTextureManager(), 256, 256,Typeface.create(Typeface.DEFAULT, Typeface.BOLD), 32); mFont.load(); } @Override public void onCreateScene(OnCreateSceneCallback arg0) throws Exception { mEngine.registerUpdateHandler(new FPSLogger()); mCurrentScene = new Scene(); Log.v("Scene","enter"); mCurrentScene.setBackground(new Background(0.09804f, 0.7274f, 0.8f)); // return mCurrentScene; } @Override public void onPopulateScene(Scene arg0, OnPopulateSceneCallback arg1) throws Exception { // TODO Auto-generated method stub } } The code I found on other sites returns the scene, but in AndEngine GLES2 the onCreateScene() method does not return a scene... so my first run is BLACK. Any suggestions? :)
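
    If memory serves, in the GLES2 branch these three methods don't return anything; instead each one has to tell the engine it is done through the callback it receives, and if the callbacks are never invoked the engine never gets a scene to draw, hence the black screen. A hedged sketch of the same methods with the callbacks wired up (method names as I recall them from AndEngine GLES2, so treat them as an assumption to verify against your version):

        // Minimal sketch: notify AndEngine through the callbacks instead of returning the scene.
        @Override
        public void onCreateResources(OnCreateResourcesCallback pCallback) throws Exception {
            mFont = FontFactory.create(this.getFontManager(), this.getTextureManager(), 256, 256,
                    Typeface.create(Typeface.DEFAULT, Typeface.BOLD), 32);
            mFont.load();
            pCallback.onCreateResourcesFinished();
        }

        @Override
        public void onCreateScene(OnCreateSceneCallback pCallback) throws Exception {
            mEngine.registerUpdateHandler(new FPSLogger());
            mCurrentScene = new Scene();
            mCurrentScene.setBackground(new Background(0.09804f, 0.7274f, 0.8f));
            pCallback.onCreateSceneFinished(mCurrentScene);
        }

        @Override
        public void onPopulateScene(Scene pScene, OnPopulateSceneCallback pCallback) throws Exception {
            pCallback.onPopulateSceneFinished();
        }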

  • How could I implement 3D player collision with rotation in LWJGL?

    - by Tinfoilboy
    I have a problem with my current collision implementation. Currently for player collision, I just use an AABB where I check if another AABB is in the way of the player, as shown in this code. (The code below is a sample of checking for collisions in the Z axis) for (int z = (int) (this.position.getZ()); z > this.position.getZ() - moveSpeed - boundingBoxDepth; z--) { // The maximum Z you can get. int maxZ = (int) (this.position.getZ() - moveSpeed - boundingBoxDepth) + 1; AxisAlignedBoundingBox aabb = WarmupWeekend.getInstance().currentLevel.getAxisAlignedBoundingBoxAt(new Vector3f(this.position.getX(), this.position.getY(), z)); AxisAlignedBoundingBox potentialCameraBB = new AxisAlignedBoundingBox(this, "collider", new Vector3f(this.position.getX(), this.position.getY(), z), boundingBoxWidth, boundingBoxHeight, boundingBoxDepth); if (aabb != null) { if (potentialCameraBB.colliding(aabb) && aabb.COLLIDER_TYPE.equalsIgnoreCase("collider")) { break; } else if (!potentialCameraBB.colliding(aabb) && z == maxZ) { if (this.grounded) { playFootstep(); } this.position.z -= moveSpeed; break; } } else if (z == maxZ) { if (this.grounded) { playFootstep(); } this.position.z -= moveSpeed; break; } } Now, when I tried to implement rotation to this method, everything broke. I'm wondering how I could implement rotation to this block (and as all other checks in each axis are the same) and others. Thanks in advance.

  • Normal vector of a face loaded from an FBX model during collision?

    - by Corey Ogburn
    I'm loading a simple 6 sided cube from a UV-mapped FBX model and I'm using a BoundingBox to test for collisions. Once I determine there's a collision, I want to use the normal vector of the collided surface to correct the movement of whatever collided with the cube. I suppose this is a two-part question: 1) How can I determine which face of the cube was collided with in a collision? 2) How can I get the normal vector of that surface?
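
    A bounding box alone won't say which face was hit; one low-tech approach for an axis-aligned box is to look at which component of the (contact point minus box centre) vector is largest, and for arbitrary mesh triangles the face normal falls out of the cross product of two edges. The original is XNA/C#; the Java sketch below uses plain float arrays as stand-in vector types.

        // Minimal sketch: (1) pick the face of an axis-aligned box from a contact point,
        // (2) compute a triangle's face normal from its vertices via the cross product.
        public class FaceNormal {
            // Returns a unit normal of the box face closest to the contact point.
            static float[] boxFaceNormal(float[] contact, float[] boxCenter, float[] halfExtents) {
                float[] local = new float[3];
                for (int i = 0; i < 3; i++) local[i] = (contact[i] - boxCenter[i]) / halfExtents[i];

                int axis = 0;
                for (int i = 1; i < 3; i++) if (Math.abs(local[i]) > Math.abs(local[axis])) axis = i;

                float[] n = new float[3];
                n[axis] = Math.signum(local[axis]);
                return n;
            }

            // Cross product of two triangle edges; winding order decides which way it points.
            static float[] triangleNormal(float[] a, float[] b, float[] c) {
                float[] e1 = { b[0] - a[0], b[1] - a[1], b[2] - a[2] };
                float[] e2 = { c[0] - a[0], c[1] - a[1], c[2] - a[2] };
                float[] n = { e1[1] * e2[2] - e1[2] * e2[1],
                              e1[2] * e2[0] - e1[0] * e2[2],
                              e1[0] * e2[1] - e1[1] * e2[0] };
                float len = (float) Math.sqrt(n[0] * n[0] + n[1] * n[1] + n[2] * n[2]);
                return new float[] { n[0] / len, n[1] / len, n[2] / len };
            }
        }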

  • Problem with alleg42.dll / program crashes / Allegro & Codeblocks

    - by user24152
    I'm having a serious problem with allegro. The program should display random pixels on the screen and when I build and run it I get the following error message: Below is the full code of my program: #include <stdio.h> #include <stdlib.h> #include <time.h> #include "allegro.h" #define Text_Color_Red makecol(255,0,0) int main() { int ret; int color_depth = 32; int x; int y; int red; int green; int blue; int color; //init allegro allegro_init(); //install keyboard install_keyboard(); //set color depth to 32 bits set_color_depth(color_depth); //init random seed srand(time(NULL)); //init video mode to 640 x 480 ret = set_gfx_mode(GFX_AUTODETECT_WINDOWED,640,480,0,0); if(ret !=0) { allegro_message(allegro_error); return 1; } //Display string textprintf(screen,font,0,0,10,0,Text_Color_Red,"Screen Resolution is: %dx%d -- Press ESC to quit !",SCREEN_W,SCREEN_H); //display pixels until ESC key is pressed //wait for keypress while(!key[KEY_ESC]) { //set a random location x = 10 + rand() % (SCREEN_W-20); y = 10 + rand() % (SCREEN_H-20); //set a random color red = rand() % 255; green = rand() % 255; blue = rand() % 255; color = makecol(red,green,blue); //draw the pixel putpixel(screen, x, y, color); } //quit allegro allegro_exit(); } END_OF_MAIN() Error message: AllegroPixels1.exe has encountered a problem and needs to close. We are sorry for the inconvenience. Error signature: AppName: allegropixels1.exe AppVer: 0.0.0.0 ModName: alleg42.dll ModVer: 4.2.3.0 Offset: 0006c05c I am using Windows XP inside a virtual machine under Parallels 7.0

  • The most efficient ways for drawing lines all day long with OpenGL

    - by nkint
    I'd like to put a computer screen running an OpenGL program in a room. It has to run all day long (but not at night). I'd like to draw lines that slowly fade into the background. The setting is simple: a uniform background color (say, black) and colored lines (say, white) that slowly fade out. By slowly I mean... hours. Say the first line I draw has alpha 255 (fully visible); after one hour it is 240, and after 10 hours it is 105. One line could have 250 points, and there will be something like 300 lines in one day. For now I have made a prototype with a very rudimentary method like: glBegin( GL_LINE_STRIP ); iterator = point_list.begin(); for (++iterator, end = point_list.end(); iterator != end; ++iterator) { const Vec3D &v = *iterator; glVertex2f(v.x(), v.y()); } glEnd(); Is there a more efficient method?

  • LibGDX onTouch() method Array and flip method

    - by johnny-b
    How can I add this on my application. i want to use the onTouch() method from the implementation of the InputProcessor to kill the enemies on screen. how do i do that? do i have to do anything to the enemy class? also i am trying to add a Array of enemies and it keeps throwing exceptions or the bullet now is facing LEFT <--- again after I used the flip method in the bullet class. All the code is below so please anyone feel free to have a look thanks. please help Thank you M // This is the bullet class. public class Bullet extends Sprite { public static final float BULLET_HOMING = 6000; public static final float BULLET_SPEED = 300; private Vector2 velocity; private float lifetime; private Rectangle bul; public Bullet(float x, float y) { velocity = new Vector2(0, 0); setPosition(x, y); AssetLoader.bullet1.flip(true, false); AssetLoader.bullet2.flip(true, false); setSize(AssetLoader.bullet1.getWidth(), AssetLoader.bullet1.getHeight()); bul = new Rectangle(); } public void update(float delta) { float targetX = GameWorld.getBall().getX(); float targetY = GameWorld.getBall().getY(); float dx = targetX - getX(); float dy = targetY - getY(); float distToTarget = (float) Math.sqrt(dx * dx + dy * dy); dx /= distToTarget; dy /= distToTarget; dx *= BULLET_HOMING; dy *= BULLET_HOMING; velocity.x += dx * delta; velocity.y += dy * delta; float vMag = (float) Math.sqrt(velocity.x * velocity.x + velocity.y * velocity.y); velocity.x /= vMag; velocity.y /= vMag; velocity.x *= BULLET_SPEED; velocity.y *= BULLET_SPEED; bul.set(getX(), getY(), getOriginX(), getOriginY()); Vector2 v = velocity.cpy().scl(delta); setPosition(getX() + v.x, getY() + v.y); setOriginCenter(); setRotation(velocity.angle()); } public Rectangle getBounds() { return bul; } public Rectangle getBounds1() { return this.getBoundingRectangle(); } } // This is the class where i load all the images from public class AssetLoader { public static Texture texture; public static TextureRegion bg, ball1, ball2; public static Animation bulletAnimation, ballAnimation; public static Sprite bullet1, bullet2; public static void load() { texture = new Texture(Gdx.files.internal("SpriteN1.png")); texture.setFilter(TextureFilter.Nearest, TextureFilter.Nearest); bg = new TextureRegion(texture, 80, 421, 395, 30); bg.flip(false, true); ball1 = new TextureRegion(texture, 0, 321, 32, 32); ball1.flip(false, true); ball2 = new TextureRegion(texture, 32, 321, 32, 32); ball2.flip(false, true); bullet1 = new Sprite(texture, 380, 350, 45, 20); bullet1.flip(false, true); bullet2 = new Sprite(texture, 425, 350, 45, 20); bullet2.flip(false, true); TextureRegion[] balls = { ball1, ball2 }; ballAnimation = new Animation(0.16f, balls); ballAnimation.setPlayMode(Animation.PlayMode.LOOP); } Sprite[] bullets = { bullet1, bullet2 }; bulletAnimation = new Animation(0.06f, aims); bulletAnimation.setPlayMode(Animation.PlayMode.LOOP); } public static void dispose() { texture.dispose(); } // This is for the rendering or drawing onto the screen/canvas. 
public class GameRenderer { private Bullet bullet; private Ball ball; public GameRenderer(GameWorld world) { myWorld = world; cam = new OrthographicCamera(); cam.setToOrtho(true, 480, 320); batcher = new SpriteBatch(); // Attach batcher to camera batcher.setProjectionMatrix(cam.combined); shapeRenderer = new ShapeRenderer(); shapeRenderer.setProjectionMatrix(cam.combined); // Call helper methods to initialize instance variables initGameObjects(); initAssets(); } private void initGameObjects() { ball = GameWorld.getBall(); bullet = myWorld.getBullet(); scroller = myWorld.getScroller(); } private void initAssets() { ballAnimation = AssetLoader.ballAnimation; bulletAnimation = AssetLoader.bulletAnimation; } public void render(float runTime) { Gdx.gl.glClearColor(0, 0, 0, 1); Gdx.gl.glClear(GL30.GL_COLOR_BUFFER_BIT); batcher.begin(); batcher.disableBlending(); batcher.enableBlending(); batcher.draw(AssetLoader.ballAnimation.getKeyFrame(runTime), ball.getX(), ball.getY(), ball.getWidth(), ball.getHeight()); batcher.draw(AssetLoader.bulletAnimation.getKeyFrame(runTime), bullet.getX(), bullet.getY(), bullet.getOriginX(), bullet.getOriginY(), bullet.getWidth(), bullet.getHeight(), 1.0f, 1.0f, bullet.getRotation()); // End SpriteBatch batcher.end(); } } // this is to load the image etc on the screen i guess public class GameWorld { public static Ball ball; private Bullet bullet; private ScrollHandler scroller; public GameWorld() { ball = new Ball(480, 273, 32, 32); bullet = new Bullet(10, 10); scroller = new ScrollHandler(0); } public void update(float delta) { ball.update(delta); bullet.update(delta); scroller.update(delta); } public static Ball getBall() { return ball; } public ScrollHandler getScroller() { return scroller; } public Bullet getBullet() { return bullet; } } //This is the input handler class public class InputHandler implements InputProcessor { private Ball myBall; private Bullet bullet; private GameRenderer aims; // Ask for a reference to the Soldier when InputHandler is created. public InputHandler(Ball ball) { myBall = ball; } @Override public boolean touchDown(int screenX, int screenY, int pointer, int button) { return false; } @Override public boolean keyDown(int keycode) { return false; } @Override public boolean keyUp(int keycode) { return false; } @Override public boolean keyTyped(char character) { return false; } @Override public boolean touchUp(int screenX, int screenY, int pointer, int button) { return false; } @Override public boolean touchDragged(int screenX, int screenY, int pointer) { return false; } @Override public boolean mouseMoved(int screenX, int screenY) { return false; } @Override public boolean scrolled(int amount) { return false; } } i am rendering all graphics in a GameRender class and a gameworld class if you need more info please let me know I am trying to make the array work but keep finding that when an array is initialized then the bullet fips back to the original and ends up being backwards???? and if I create an array I keep getting Exceptions throw??? Thank you for any help given.
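
    As far as wiring up the kills goes, the usual libgdx pattern is: register the InputProcessor with Gdx.input.setInputProcessor(...), convert the screen coordinates in touchDown (for example via camera.unproject, since the y axis is flipped), and walk the enemy Array with its iterator so removal doesn't throw. A hedged sketch below; Enemy is a hypothetical stand-in for your own class and is assumed to expose a getBounds() returning a Rectangle.

        import com.badlogic.gdx.InputAdapter;
        import com.badlogic.gdx.graphics.OrthographicCamera;
        import com.badlogic.gdx.math.Vector3;
        import com.badlogic.gdx.utils.Array;

        // Minimal sketch: remove any enemy whose bounds contain the (unprojected) touch point.
        public class KillOnTouch extends InputAdapter {
            private final OrthographicCamera camera;
            private final Array<Enemy> enemies;   // Enemy is assumed, not from the code above

            public KillOnTouch(OrthographicCamera camera, Array<Enemy> enemies) {
                this.camera = camera;
                this.enemies = enemies;
            }

            @Override
            public boolean touchDown(int screenX, int screenY, int pointer, int button) {
                Vector3 world = camera.unproject(new Vector3(screenX, screenY, 0));
                // Using the Array's own iterator and iterator.remove() avoids the
                // concurrent-modification style exceptions you can hit when removing mid-loop.
                for (java.util.Iterator<Enemy> it = enemies.iterator(); it.hasNext(); ) {
                    Enemy e = it.next();
                    if (e.getBounds().contains(world.x, world.y)) {
                        it.remove();
                        return true;
                    }
                }
                return false;
            }
        }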

  • XNA Drag Gestures - fractional delta values

    - by Den
    I have an issue with objects moving roughly twice as far as expected when dragging them. I am comparing my application to the standard TouchGestureSample sample from MSDN. For some reason, in my application the gesture samples have fractional positions and deltas. Both are using the same Microsoft.Xna.Framework.Input.Touch.dll, v4.0.30319. I am running both apps in the standard Windows Phone Emulator. I am setting my breakpoint immediately after this line of code in a simple Update method: GestureSample gesture = TouchPanel.ReadGesture(); Typical values in my app: Delta = {X:-13.56522 Y:4.166667} Position = {X:184.6956 Y:417.7083} Typical values in the sample app: Delta = {X:7 Y:16} Position = {X:497 Y:244} Has anyone seen this issue? Does anyone have any suggestions? Thank you.

  • glTranslate, how exactly does it work?

    - by mykk
    I have some trouble understanding how does glTranslate work. At first I thought it would just simply add values to axis to do the transformation. However then I have created two objects that would load bitmaps, one has matrix set to GL_TEXTURE: public class Background { float[] vertices = new float[] { 0f, -1f, 0.0f, 4f, -1f, 0.0f, 0f, 1f, 0.0f, 4f, 1f, 0.0f }; .... private float backgroundScrolled = 0; public void scrollBackground(GL10 gl) { gl.glLoadIdentity(); gl.glMatrixMode(GL10.GL_MODELVIEW); gl.glTranslatef(0f, 0f, 0f); gl.glPushMatrix(); gl.glLoadIdentity(); gl.glMatrixMode(GL10.GL_TEXTURE); gl.glTranslatef(backgroundScrolled, 0.0f, 0.0f); gl.glPushMatrix(); this.draw(gl); gl.glPopMatrix(); backgroundScrolled += 0.01f; gl.glLoadIdentity(); } } and another to GL_MODELVIEW: public class Box { float[] vertices = new float[] { 0.5f, 0f, 0.0f, 1f, 0f, 0.0f, 0.5f, 0.5f, 0.0f, 1f, 0.5f, 0.0f }; .... private float boxScrolled = 0; public void scrollBackground(GL10 gl) { gl.glMatrixMode(GL10.GL_MODELVIEW); gl.glLoadIdentity(); gl.glTranslatef(0f, 0f, 0f); gl.glPushMatrix(); gl.glMatrixMode(GL10.GL_MODELVIEW); gl.glLoadIdentity(); gl.glTranslatef(boxScrolled, 0.0f, 0.0f); gl.glPushMatrix(); this.draw(gl); gl.glPopMatrix(); boxScrolled+= 0.01f; gl.glLoadIdentity(); } } Now they are both drawn in Renderer.OnDraw. However background moves exactly 5 times faster. If I multiply boxScrolled by 5 they will be in sinc and will move together. If I modify backgrounds vertices to be float[] vertices = new float[] { 1f, -1f, 0.0f, 0f, -1f, 0.0f, 1f, 1f, 0.0f, 0f, 1f, 0.0f }; It will also be in sinc with the box. So, what is going under glTranslate?

  • rotating menu with Actors in libgdx

    - by joecks
    I intend to build a circular menu, with menu items equally distributed around the circle. When clicking on a menu item, the circle should rotate so that the selected item is facing the top. I am using libgdx and I am not very familiar with the Actors concept, so I intuitively tried to implement an Actor that draws a texture and then transforms it using Actions, with no success: class CircleActor extends Actor { @Override public void draw(SpriteBatch batch, float parentAlpha) { batch.draw(texture1, 100, 100); } @Override public Actor hit(float x, float y) { return this; } } and the rotate action: CircleActor circleActor = new CircleActor(); circleActor.action(Forever.$(RotateBy.$(0.1f, 0.1f))); // stage.addActor(); stage.addActor(circleActor); The texture is rectangular, but it does not work. 1. What is wrong? 2. Is this a good approach to the task? Thanks!
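
    For the layout itself it is usually easier to make each menu item a child Actor of one Group, positioned on the circle with cos/sin, and then rotate the whole group so the chosen item ends up at the top. A hedged sketch using scene2d's Group and the newer Actions API (the older Forever.$/RotateBy.$ style in the snippet maps onto the same idea); the duration and geometry are made-up values.

        import com.badlogic.gdx.scenes.scene2d.Actor;
        import com.badlogic.gdx.scenes.scene2d.Group;
        import com.badlogic.gdx.scenes.scene2d.actions.Actions;

        // Minimal sketch: items parented to one Group; selecting an item rotates the group
        // so that item faces the top of the circle.
        public class CircleMenu extends Group {
            private final float radius;
            private final float stepDegrees;

            public CircleMenu(float radius, int itemCount) {
                this.radius = radius;
                this.stepDegrees = 360f / itemCount;
            }

            public void addItem(int index, Actor item) {
                float angle = (float) Math.toRadians(index * stepDegrees);
                item.setPosition((float) Math.cos(angle) * radius,
                                 (float) Math.sin(angle) * radius);
                addActor(item);
            }

            // Rotate so that item `index` ends up at 90 degrees (the top of the circle).
            public void select(int index) {
                float target = 90f - index * stepDegrees;
                addAction(Actions.rotateTo(target, 0.4f));
            }
        }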

  • Gui not showing when accessing AudioSource.Volume

    - by Lautaro
    I have a GuiManager class and a SoundManager with 2 AudioSources: SfxPlayer is created in the inspector on the same object as the SoundManager, and MusicPlayer is created programmatically within the SoundManager. If I access the volume of MusicPlayer from anywhere in the GuiManager, then all of the GUI disappears. Nothing is shown, not even the start menu. I don't get any errors or exceptions. I don't have any try/catch statements. Does anyone know what's up?
