Search Results

Search found 44501 results on 1781 pages for 'software development life'.

Page 583/1781 | < Previous Page | 579 580 581 582 583 584 585 586 587 588 589 590  | Next Page >

  • Modern techniques for spriting

    - by DevilWithin
    Hello, I would like to know the workflow for making modern 2D game artwork. How are the assets made nowadays? Bitmap? Vector-based? Hand-drawn and painted? Drawn digitally? Modeled in 3D and exported to bitmaps? I would like some information on programs as well, for fine-looking art. Why does Flash's vector art style look good in most games? How do I make equivalent graphics with external tools? Or equally good and not vector-based, anyway. Any special hints for animating? An answer oriented towards a one-man-army indie developer with little experience but some artistic sense would be appreciated! I'm not a complete dummy with paint programs, but not a master either; I just need efficient ways to achieve results. Thanks. NOTE: Pixel art is not the goal of this question, nothing related to direct pixel manipulation should be brought up here, but you're free to do exactly that :)

    Read the article

  • Power & Sleep Management

    - by Espressofa
    I'm running 12.10 with xmonad, trying to ensure that the right things happen when I close the laptop lid, etc. Internet search results for similar issues mostly point towards gnome-power-manager. I have the package installed, but gnome-power-manager is not in my path anywhere. The behavior I'm looking for is as follows:
    - Sleep on lid close
    - Wake on lid open
    - Turn off the screen after 10 idle minutes
    - Most importantly, better battery life
    I'm supposed to be getting 9 hours and I haven't seen the battery life estimate above 2.5 hours yet. Any tips on where to look or how to configure this would be much appreciated.

    Read the article

  • Greiner-Hormann clipping problem

    - by Belgin
    I have a set of planar polygons in 3D space defined by their vertices in counterclockwise order. Let's define the 'positive face' as the face of the 3D polygon such that, when observed, the vertices appear in counterclockwise order, and the 'negative face' as the face which, when observed, shows the vertices in clockwise order. I'm doing perspective projection of the set of polygons onto a projection polygon defined by the points in this order: (0, h, 0), (0, 0, 0), (w, 0, 0), and (w, h, 0), where w and h are strictly positive integers. The positive face of this projection polygon is oriented towards positive Z, and the camera point is somewhere at (0, 0, d), where d is a strictly negative number. In order to 'clip' the projected polygons against the projection polygon, I'm applying the Greiner-Hormann (PDF) clipping algorithm, which requires that the clipper and the to-be-clipped polygons be in the same order (i.e. both clockwise or both counterclockwise). My question is the following: how can I determine whether the projected face of the 3D polygon is the negative or the positive one? In other words, how do I find out whether I have to work with the vertices in normal or inverted order for the algorithm to work? I noticed that the two polygons end up in the same (counterclockwise) order only if the 3D polygon faces the projection polygon with its negative face; otherwise, a modification needs to be made. Here is a picture (PNG) that illustrates this. Note that the planes described by a polygon from the set and the projection polygon may not always be parallel.
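    A common way to settle the winding question after projection, a minimal sketch rather than anything from the question itself, is to compute the signed area of the projected 2D polygon with the shoelace formula: a positive value means the screen-space vertices run counterclockwise, a negative value means clockwise, so the order can be reversed before handing both polygons to Greiner-Hormann.

    ```cpp
    #include <algorithm>
    #include <vector>

    struct Point2 { double x, y; };

    // Shoelace formula: > 0 for counterclockwise vertex order, < 0 for clockwise.
    double signedArea(const std::vector<Point2>& poly) {
        double area = 0.0;
        for (std::size_t i = 0; i < poly.size(); ++i) {
            const Point2& a = poly[i];
            const Point2& b = poly[(i + 1) % poly.size()];
            area += a.x * b.y - a.y * b.x;
        }
        return 0.5 * area;
    }

    // Ensure the projected polygon is counterclockwise before clipping.
    void makeCounterClockwise(std::vector<Point2>& poly) {
        if (signedArea(poly) < 0.0)
            std::reverse(poly.begin(), poly.end());
    }
    ```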

    Read the article

  • Problems with texture orientation in space

    - by frankie
    I am currently drawing textures in 3D space and have some problems with their orientation. I'd like my textures to always be oriented with their front face towards the user. My desired result looks like this: note that the text size stays unchanged while the world rotates, and the text stays facing the user. Right now I can draw text in 3D space, but it rotates with the world instead of staying front-facing. I got these results with the following shaders:
    Vertex Shader
    uniform vec3 Position;
    void main()
    {
        gl_Position = vec4(Position, 1.0);
    }
    Geometry Shader
    layout(points) in;
    layout(triangle_strip, max_vertices = 4) out;
    out vec2 fsTextureCoordinates;
    uniform mat4 projectionMatrix;
    uniform mat4 modelViewMatrix;
    uniform sampler2D og_texture0;
    uniform float og_highResolutionSnapScale;
    uniform vec2 u_originScale;
    void main()
    {
        vec2 halfSize = vec2(textureSize(og_texture0, 0)) * 0.5 * og_highResolutionSnapScale;
        vec4 center = gl_in[0].gl_Position;
        center.xy += (u_originScale * halfSize);
        vec4 v0 = vec4(center.xy - halfSize, center.z, 1.0);
        vec4 v1 = vec4(center.xy + vec2(halfSize.x, -halfSize.y), center.z, 1.0);
        vec4 v2 = vec4(center.xy + vec2(-halfSize.x, halfSize.y), center.z, 1.0);
        vec4 v3 = vec4(center.xy + halfSize, center.z, 1.0);
        gl_Position = projectionMatrix * modelViewMatrix * v0;
        fsTextureCoordinates = vec2(0.0, 0.0);
        EmitVertex();
        gl_Position = projectionMatrix * modelViewMatrix * v1;
        fsTextureCoordinates = vec2(1.0, 0.0);
        EmitVertex();
        gl_Position = projectionMatrix * modelViewMatrix * v2;
        fsTextureCoordinates = vec2(0.0, 1.0);
        EmitVertex();
        gl_Position = projectionMatrix * modelViewMatrix * v3;
        fsTextureCoordinates = vec2(1.0, 1.0);
        EmitVertex();
    }
    Fragment Shader
    in vec2 fsTextureCoordinates;
    out vec4 fragmentColor;
    uniform sampler2D og_texture0;
    uniform vec3 u_color;
    void main()
    {
        vec4 color = texture(og_texture0, fsTextureCoordinates);
        if (color.a == 0.0)
        {
            discard;
        }
        fragmentColor = vec4(color.rgb * u_color.rgb, color.a);
    }
    Any ideas how to get my desired result?
    EDIT 1: I made an edit in my geometry shader and got part of the label drawn at a corner of the screen, but it is not rotating.
    ..........
    vec4 centerProjected = projectionMatrix * modelViewMatrix * center;
    centerProjected /= centerProjected.w;
    vec4 v0 = vec4(centerProjected.xy - halfSize, 0.0, 1.0);
    vec4 v1 = vec4(centerProjected.xy + vec2(halfSize.x, -halfSize.y), 0.0, 1.0);
    vec4 v2 = vec4(centerProjected.xy + vec2(-halfSize.x, halfSize.y), 0.0, 1.0);
    vec4 v3 = vec4(centerProjected.xy + halfSize, 0.0, 1.0);
    gl_Position = og_viewportOrthographicMatrix * v0;
    ..........
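    If the goal is a label that always faces the viewer at a constant on-screen size, one common approach is to project only the anchor point and then offset the four corners in clip space by screen-aligned amounts, so the model-view rotation never touches the quad itself. A hedged GLSL sketch of the geometry shader along those lines; u_viewportSize (viewport size in pixels) is an assumed uniform standing in for the og_* helpers above:

    ```glsl
    #version 330

    layout(points) in;
    layout(triangle_strip, max_vertices = 4) out;

    out vec2 fsTextureCoordinates;

    uniform mat4 projectionMatrix;
    uniform mat4 modelViewMatrix;
    uniform sampler2D og_texture0;
    uniform vec2 u_viewportSize;   // assumed uniform: viewport size in pixels

    void main()
    {
        // Project only the anchor point; the corner offsets below are screen-aligned.
        vec4 center = projectionMatrix * modelViewMatrix * gl_in[0].gl_Position;

        // Half-size of the label in NDC units: halfPixels * 2 / viewport == texSize / viewport.
        vec2 halfSize = vec2(textureSize(og_texture0, 0)) / u_viewportSize;
        halfSize *= center.w;   // pre-multiply by w so the size survives the perspective divide

        gl_Position = center + vec4(-halfSize.x, -halfSize.y, 0.0, 0.0);
        fsTextureCoordinates = vec2(0.0, 0.0);
        EmitVertex();

        gl_Position = center + vec4( halfSize.x, -halfSize.y, 0.0, 0.0);
        fsTextureCoordinates = vec2(1.0, 0.0);
        EmitVertex();

        gl_Position = center + vec4(-halfSize.x,  halfSize.y, 0.0, 0.0);
        fsTextureCoordinates = vec2(0.0, 1.0);
        EmitVertex();

        gl_Position = center + vec4( halfSize.x,  halfSize.y, 0.0, 0.0);
        fsTextureCoordinates = vec2(1.0, 1.0);
        EmitVertex();

        EndPrimitive();
    }
    ```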

    Read the article

  • Tool for creating sprite sheets? And tips

    - by Spooks
    I am looking for a tool that I can use to create sprite sheets easily. Right now I am using Illustrator, but I can never get the center of the character in the exact position, so it looks like it is moving around (even though it's always in one place) while being looped through the sprite sheet. Are there any better tools that I could be using? Also, what kind of tips would you give for working with a sprite sheet? Should I create each part of the character in individual layers (left arm, right arm, body, etc.) or everything at once? Any other tips would also be helpful! Thank you.

    Read the article

  • List of bounding boxes?

    - by Christian Frantz
    When I create a bounding box for each object in my chunk, would it be better to store them in a list? List<BoundingBox> cubeBoundingBox Or can I just use a single variable? BoundingBox cubeBoundingBox The bounding boxes will be used for all types of things so they will be moving around. In any case, I'd be adding it to a method that gets called 2500+ times for each chunk, so either I have a giant list of them or 2500+ individual boxes. Is there any advantage to using one or the other?

    Read the article

  • Physics like Asteroids

    - by user2933016
    I am trying to make a ship that has physics like the ship in Asteroids. This is what I have for now (all in Java):
    Ship.class
    public class Ship {
        public static final float sMaxHealth = 0.1F;
        public static final float sMaxMoveVelocity = 5.0F;
        public static final float sMaxAngleVelocity = 20.0F;
        public static final float sRadius = 1.0F;
        public static final float sMoveDeceleration = 10.0F;
        public static final float sMoveAcceleration = 2.0F;
        public static final float sAngleDeceleration = 15.0F;
        public static final float sAngleAcceleration = 20.0F;
        private float mHealth;
        private float mXVelocity;
        private float mYVelocity;
        private float mAngleVelocity;
        private float mX;
        private float mY;
        private float mAngle;
    }
    (I left the getters and setters out for now.)
    Controller code
    // Player input
    if(Gdx.input.isKeyPressed(Keys.UP)) {
        mPlayer.setXVelocity(mPlayer.getXVelocity() + (float) Math.cos(mPlayer.getAngle()) * Ship.sMoveAcceleration);
        mPlayer.setYVelocity(mPlayer.getYVelocity() + (float) Math.sin(mPlayer.getAngle()) * Ship.sMoveAcceleration);
    }
    if(Gdx.input.isKeyPressed(Keys.LEFT)) {
        mPlayer.setAngleVelocity(mPlayer.getAngleVelocity() + Ship.sAngleAcceleration * pDeltaTime);
    }
    if(Gdx.input.isKeyPressed(Keys.RIGHT)) {
        mPlayer.setAngleVelocity(mPlayer.getAngleVelocity() - Ship.sAngleAcceleration * pDeltaTime);
    }
    // X velocity
    if(mPlayer.getXVelocity() < 0) {
        if(-mPlayer.getXVelocity() > Ship.sMaxMoveVelocity) {
            mPlayer.setXVelocity(-Ship.sMaxMoveVelocity);
        }
        mPlayer.setXVelocity(mPlayer.getXVelocity() + Ship.sMoveDeceleration * pDeltaTime);
        if(mPlayer.getXVelocity() > 0) {
            mPlayer.setXVelocity(0);
        }
    } else if(mPlayer.getXVelocity() > 0) {
        if(mPlayer.getXVelocity() > Ship.sMaxMoveVelocity) {
            mPlayer.setXVelocity(Ship.sMaxMoveVelocity);
        }
        mPlayer.setXVelocity(mPlayer.getXVelocity() - Ship.sMoveDeceleration * pDeltaTime);
        if(mPlayer.getXVelocity() < 0) {
            mPlayer.setXVelocity(0);
        }
    }
    // Y velocity
    if(mPlayer.getYVelocity() < 0) {
        if(-mPlayer.getYVelocity() > Ship.sMaxMoveVelocity) {
            mPlayer.setYVelocity(-Ship.sMaxMoveVelocity);
        }
        mPlayer.setYVelocity(mPlayer.getYVelocity() + Ship.sMoveDeceleration * pDeltaTime);
        if(mPlayer.getYVelocity() > 0) {
            mPlayer.setYVelocity(0);
        }
    } else if(mPlayer.getYVelocity() > 0) {
        if(mPlayer.getYVelocity() > Ship.sMaxMoveVelocity) {
            mPlayer.setYVelocity(Ship.sMaxMoveVelocity);
        }
        mPlayer.setYVelocity(mPlayer.getYVelocity() - Ship.sMoveDeceleration * pDeltaTime);
        if(mPlayer.getYVelocity() < 0) {
            mPlayer.setYVelocity(0);
        }
    }
    // Angle velocity
    if(mPlayer.getAngleVelocity() < 0) {
        if(-mPlayer.getAngleVelocity() > Ship.sMaxAngleVelocity) {
            mPlayer.setAngleVelocity(-Ship.sMaxAngleVelocity);
        }
        mPlayer.setAngleVelocity(mPlayer.getAngleVelocity() + Ship.sAngleDeceleration * pDeltaTime);
        if(mPlayer.getAngleVelocity() > 0) {
            mPlayer.setAngleVelocity(0);
        }
    } else if(mPlayer.getAngleVelocity() > 0) {
        if(mPlayer.getAngleVelocity() > Ship.sMaxAngleVelocity) {
            mPlayer.setAngleVelocity(Ship.sMaxAngleVelocity);
        }
        mPlayer.setAngleVelocity(mPlayer.getAngleVelocity() - Ship.sAngleDeceleration * pDeltaTime);
        if(mPlayer.getAngleVelocity() < 0) {
            mPlayer.setAngleVelocity(0);
        }
    }
    mPlayer.setX(mPlayer.getX() + mPlayer.getXVelocity() * pDeltaTime);
    mPlayer.setY(mPlayer.getY() + mPlayer.getYVelocity() * pDeltaTime);
    mPlayer.setAngle(mPlayer.getAngle() + mPlayer.getAngleVelocity() * pDeltaTime);
    Why doesn't the ship behave like the one in Asteroids? What am I doing wrong?
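    For comparison, a minimal, self-contained sketch of Asteroids-style motion (not the code above): thrust is scaled by the frame time, the speed cap clamps the length of the velocity vector rather than each axis separately, and the heading is kept in radians so Math.cos/Math.sin give the intended direction. All constants below are assumptions to tune.

    ```java
    // Minimal sketch of Asteroids-style ship motion; constants are placeholders.
    public class AsteroidsShip {
        static final float MAX_SPEED = 5.0f;
        static final float THRUST = 2.0f;       // units per second^2 (assumed)
        static final float TURN_SPEED = 3.0f;   // radians per second (assumed)

        float x, y, angle;   // angle in radians
        float vx, vy;

        void update(boolean thrust, boolean left, boolean right, float dt) {
            if (left)  angle += TURN_SPEED * dt;
            if (right) angle -= TURN_SPEED * dt;

            // Accelerate along the facing direction, scaled by frame time.
            if (thrust) {
                vx += (float) Math.cos(angle) * THRUST * dt;
                vy += (float) Math.sin(angle) * THRUST * dt;
            }

            // Clamp the velocity vector as a whole, not each axis on its own.
            float speed = (float) Math.sqrt(vx * vx + vy * vy);
            if (speed > MAX_SPEED) {
                vx *= MAX_SPEED / speed;
                vy *= MAX_SPEED / speed;
            }

            // Integrate position; no per-axis damping, so the ship drifts like in Asteroids.
            x += vx * dt;
            y += vy * dt;
        }
    }
    ```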

    Read the article

  • Finite state machine in C++

    - by Electro
    So, I've read a lot about using FSMs for game state management: things like what an FSM is, and using a stack or set of states to build one. I've gone through all that. But I'm stuck at writing an actual, well-designed implementation of an FSM for that purpose. Specifically, how does one cleanly resolve the problem of transitioning between states, (how) should a state be able to use data from other states, and so on? Does anyone have any tips on designing and writing an implementation in C++, or better yet, code examples?
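    One common shape for such an implementation, a sketch rather than the definitive design: each state is an object with enter/update/exit hooks, the machine owns exactly one current state, and data that several states need lives in a shared context object passed into every call instead of inside the states themselves. All names below are illustrative.

    ```cpp
    #include <memory>

    struct GameContext;  // shared data (input, renderer handles, score, ...)

    class GameState {
    public:
        virtual ~GameState() = default;
        virtual void onEnter(GameContext&) {}
        virtual void onExit(GameContext&) {}
        // Return the next state to transition to, or nullptr to stay in this one.
        virtual std::unique_ptr<GameState> update(GameContext& ctx, float dt) = 0;
    };

    class StateMachine {
    public:
        StateMachine(GameContext& ctx, std::unique_ptr<GameState> initial)
            : mContext(ctx), mCurrent(std::move(initial)) {
            mCurrent->onEnter(mContext);
        }

        void update(float dt) {
            // The state itself decides when to transition; the machine does the bookkeeping.
            if (auto next = mCurrent->update(mContext, dt)) {
                mCurrent->onExit(mContext);
                mCurrent = std::move(next);
                mCurrent->onEnter(mContext);
            }
        }

    private:
        GameContext& mContext;
        std::unique_ptr<GameState> mCurrent;
    };
    ```

    A stack-based variant (push/pop for pause menus and the like) falls out of the same structure by replacing the single mCurrent with a vector of states and only updating the top.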

    Read the article

  • Avoid if statements in DirectX 10 shaders?

    - by PolGraphic
    I have heard that if statements should be avoided in shaders, because both sides of the branch will be executed and the wrong one then discarded (which harms performance). Is that still a problem in DirectX 10? Somebody told me that there only the correct branch will be executed. As an illustration I have this code:
    float y1 = 5;
    float y2 = 6;
    float b1 = 2;
    float b2 = 3;
    if(x>0.5){
        x = 10 * y1 + b1;
    }else{
        x = 10 * y2 + b2;
    }
    Is there another way to make it faster? If so, how do I do it? Both branches look similar; the only difference is the values of the "constants" (y1, y2, b1, b2 are the same for all pixels in the pixel shader).
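    Whether the branch costs anything depends on the hardware and on how coherent the condition is across neighbouring pixels, so profiling is the only definitive answer; since y1, y2, b1 and b2 are uniform for all pixels, this particular branch is fairly benign. That said, the snippet above can be rewritten with no control flow at all, for example (HLSL sketch):

    ```hlsl
    // step(0.5, x) is 1.0 when x >= 0.5 and 0.0 otherwise (it differs from the
    // original x > 0.5 only at exactly 0.5), so lerp selects the y1/b1 pair or
    // the y2/b2 pair with pure arithmetic instead of a branch.
    float s = step(0.5, x);
    float y = lerp(y2, y1, s);
    float b = lerp(b2, b1, s);
    x = 10.0 * y + b;
    ```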

    Read the article

  • How can I locate empty space next to polygon regions?

    - by Stephen
    Let's say I have the following area in a top-down map: The circle is the player, the black square is an obstacle, and the grey polygons with red borders are walk-able areas that will be used as a navigation mesh for enemies. Obstacles and grey polygons are always convex. The grey regions were defined using an algorithm when the world was generated at runtime. Notice the little white column. I need to figure out where any empty space like this is, if at all, after the algorithm builds the grey regions, so that I can fill the space with another region. Basically what I'm hoping for is an algorithm that can detect empty space next to a polygon.

    Read the article

  • Vector Graphics in DirectX

    - by Doug
    I'm curious as to people's thoughts on the best way to use vector graphics in a DirectX game instead of rasterized textures (think Super Meat Boy). I want to remain resolution independent and don't want to downscale/upscale rasterized graphics. The idea would also be for all assets to be vector graphics (again, think Super Meat Boy). I've looked at Valve's paper "Improved Alpha-Tested Magnification for Vector Textures and Special Effects" and also at using shaders: http://http.developer.nvidia.com/GPUGems3/gpugems3_ch25.html. Wondering if anyone has done something similar or has an alternative approach. Cheers
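    For the Valve-paper route specifically, the runtime side is small: bake each vector shape into a low-resolution signed-distance texture offline, then reconstruct a sharp, resolution-independent edge in the pixel shader. A hedged HLSL sketch of that sampling step (texture, sampler and semantic names are placeholders, not from any particular engine):

    ```hlsl
    Texture2D    DistanceField : register(t0);
    SamplerState LinearSampler : register(s0);

    float4 PS(float2 uv : TEXCOORD0) : SV_Target
    {
        // The texture stores signed distance remapped to [0, 1]; 0.5 is the shape edge.
        float dist = DistanceField.Sample(LinearSampler, uv).r;

        // fwidth gives a screen-space smoothing band, so the edge stays antialiased
        // whether the texture is magnified or minified.
        float smoothing = fwidth(dist);
        float alpha = smoothstep(0.5 - smoothing, 0.5 + smoothing, dist);

        return float4(1.0, 1.0, 1.0, alpha);  // solid color; tint as needed
    }
    ```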

    Read the article

  • Rain effect looks like snowfall effect?

    - by Nikhil Lamba
    I am making a game in which I want a rain effect. I am a little bit far from this right now; this is what I am doing at the moment:
    particleSystem.addParticleInitializer(new ColorInitializer(1, 1, 1));
    particleSystem.addParticleInitializer(new AlphaInitializer(0));
    particleSystem.setBlendFunction(GL10.GL_SRC_ALPHA, GL10.GL_ONE);
    particleSystem.addParticleInitializer(new VelocityInitializer(2, 2, 20, 10));
    particleSystem.addParticleInitializer(new RotationInitializer(0.0f, 30.0f));
    particleSystem.addParticleModifier(new ScaleModifier(1.0f, 2.0f, 0, 150));
    particleSystem.addParticleModifier(new ColorModifier(1, 1, 1, 1f, 1, 1, 1, 3));
    particleSystem.addParticleModifier(new ColorModifier(1, 1, 1f, 1, 1, 1, 1, 6));
    particleSystem.addParticleModifier(new AlphaModifier(0, 1, 0, 3));
    particleSystem.addParticleModifier(new AlphaModifier(1, 0, 1, 125));
    particleSystem.addParticleModifier(new ExpireModifier(50, 50));
    scene.attachChild(particleSystem);
    But it looks like a snowfall effect. What changes can I make to turn it into a rain effect? Please correct me. EDIT: here is a link to a snapshot: http://i.imgur.com/bRIMP.png
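    Rain tends to read as rain when the particles fall fast, nearly straight down, and neither spin nor grow the way the snowflake-style settings above do. A hedged variation reusing the same AndEngine initializers and modifiers as the snippet above, with values picked for a rain look (positive Y is downward in screen coordinates; tune to taste, and a thin, elongated drop texture helps most of all):

    ```java
    // Small horizontal spread, large downward speed: fast, near-vertical streaks.
    particleSystem.addParticleInitializer(new VelocityInitializer(-5, 5, 150, 250));
    // No rotation - drops should not spin the way snowflakes do.
    particleSystem.addParticleInitializer(new RotationInitializer(0.0f, 0.0f));
    // Short lifetime so drops cross the screen and disappear.
    particleSystem.addParticleModifier(new ExpireModifier(1, 2));
    // Fade out near the end of the particle's life.
    particleSystem.addParticleModifier(new AlphaModifier(1, 0, 1, 2));
    ```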

    Read the article

  • Normal map applied as a diffuse texture looks wrong

    - by KaiserJohaan
    Diffuse textures works fine, but I am having problem with normal maps, so I thought I'd tried to apply the normal maps as the diffuse map in my fragment shader so I could see everything is OK. I comment-out my normal map code and just set the diffuse map to the normal map and I get this: http://postimg.org/image/j9gudjl7r/ Looks like a smurf! This is the actual normal map of the main body: http://postimg.org/image/sbkyr6fg9/ Here is my fragment shader, notice I commented out normal map code so I could debug the normal map as a diffuse texture "#version 330 \n \ \n \ layout(std140) uniform; \n \ \n \ const int MAX_LIGHTS = 8; \n \ \n \ struct Light \n \ { \n \ vec4 mLightColor; \n \ vec4 mLightPosition; \n \ vec4 mLightDirection; \n \ \n \ int mLightType; \n \ float mLightIntensity; \n \ float mLightRadius; \n \ float mMaxDistance; \n \ }; \n \ \n \ uniform UnifLighting \n \ { \n \ vec4 mGamma; \n \ vec3 mViewDirection; \n \ int mNumLights; \n \ \n \ Light mLights[MAX_LIGHTS]; \n \ } Lighting; \n \ \n \ uniform UnifMaterial \n \ { \n \ vec4 mDiffuseColor; \n \ vec4 mAmbientColor; \n \ vec4 mSpecularColor; \n \ vec4 mEmissiveColor; \n \ \n \ bool mHasDiffuseTexture; \n \ bool mHasNormalTexture; \n \ bool mLightingEnabled; \n \ float mSpecularShininess; \n \ } Material; \n \ \n \ uniform sampler2D unifDiffuseTexture; \n \ uniform sampler2D unifNormalTexture; \n \ \n \ in vec3 frag_position; \n \ in vec3 frag_normal; \n \ in vec2 frag_texcoord; \n \ in vec3 frag_tangent; \n \ in vec3 frag_bitangent; \n \ \n \ out vec4 finalColor; " " \n \ \n \ void CalcGaussianSpecular(in vec3 dirToLight, in vec3 normal, out float gaussianTerm) \n \ { \n \ vec3 viewDirection = normalize(Lighting.mViewDirection); \n \ vec3 halfAngle = normalize(dirToLight + viewDirection); \n \ \n \ float angleNormalHalf = acos(dot(halfAngle, normalize(normal))); \n \ float exponent = angleNormalHalf / Material.mSpecularShininess; \n \ exponent = -(exponent * exponent); \n \ \n \ gaussianTerm = exp(exponent); \n \ } \n \ \n \ vec4 CalculateLighting(in Light light, in vec4 diffuseTexture, in vec3 normal) \n \ { \n \ if (light.mLightType == 1) // point light \n \ { \n \ vec3 positionDiff = light.mLightPosition.xyz - frag_position; \n \ float dist = max(length(positionDiff) - light.mLightRadius, 0); \n \ \n \ float attenuation = 1 / ((dist/light.mLightRadius + 1) * (dist/light.mLightRadius + 1)); \n \ attenuation = max((attenuation - light.mMaxDistance) / (1 - light.mMaxDistance), 0); \n \ \n \ vec3 dirToLight = normalize(positionDiff); \n \ float angleNormal = clamp(dot(normalize(normal), dirToLight), 0, 1); \n \ \n \ float gaussianTerm = 0.0; \n \ if (angleNormal > 0.0) \n \ CalcGaussianSpecular(dirToLight, normal, gaussianTerm); \n \ \n \ return diffuseTexture * (attenuation * angleNormal * Material.mDiffuseColor * light.mLightIntensity * light.mLightColor) + \n \ (attenuation * gaussianTerm * Material.mSpecularColor * light.mLightIntensity * light.mLightColor); \n \ } \n \ else if (light.mLightType == 2) // directional light \n \ { \n \ vec3 dirToLight = normalize(light.mLightDirection.xyz); \n \ float angleNormal = clamp(dot(normalize(normal), dirToLight), 0, 1); \n \ \n \ float gaussianTerm = 0.0; \n \ if (angleNormal > 0.0) \n \ CalcGaussianSpecular(dirToLight, normal, gaussianTerm); \n \ \n \ return diffuseTexture * (angleNormal * Material.mDiffuseColor * light.mLightIntensity * light.mLightColor) + \n \ (gaussianTerm * Material.mSpecularColor * light.mLightIntensity * light.mLightColor); \n \ } \n \ else if 
(light.mLightType == 4) // ambient light \n \ return diffuseTexture * Material.mAmbientColor * light.mLightIntensity * light.mLightColor; \n \ else \n \ return vec4(0.0); \n \ } \n \ \n \ void main() \n \ { \n \ vec4 diffuseTexture = vec4(1.0); \n \ if (Material.mHasDiffuseTexture) \n \ diffuseTexture = texture(unifDiffuseTexture, frag_texcoord); \n \ \n \ vec3 normal = frag_normal; \n \ if (Material.mHasNormalTexture) \n \ { \n \ diffuseTexture = vec4(normalize(texture(unifNormalTexture, frag_texcoord).xyz * 2.0 - 1.0), 1.0); \n \ // vec3 normalTangentSpace = normalize(texture(unifNormalTexture, frag_texcoord).xyz * 2.0 - 1.0); \n \ //mat3 tangentToWorldSpace = mat3(normalize(frag_tangent), normalize(frag_bitangent), normalize(frag_normal)); \n \ \n \ // normal = tangentToWorldSpace * normalTangentSpace; \n \ } \n \ \n \ if (Material.mLightingEnabled) \n \ { \n \ vec4 accumLighting = vec4(0.0); \n \ \n \ for (int lightIndex = 0; lightIndex < Lighting.mNumLights; lightIndex++) \n \ accumLighting += Material.mEmissiveColor * diffuseTexture + \n \ CalculateLighting(Lighting.mLights[lightIndex], diffuseTexture, normal); \n \ \n \ finalColor = pow(accumLighting, Lighting.mGamma); \n \ } \n \ else { \n \ finalColor = pow(diffuseTexture, Lighting.mGamma); \n \ } \n \ } \n"; Here is my wrapper around a texture OpenGLTexture::OpenGLTexture(const std::vector<uint8_t>& textureData, uint32_t textureWidth, uint32_t textureHeight, TextureFormat textureFormat, TextureType textureType, Logger& logger) : mLogger(logger), mTextureID(gNextTextureID++), mTextureType(textureType) { glGenTextures(1, &mTexture); CHECK_GL_ERROR(mLogger); glBindTexture(GL_TEXTURE_2D, mTexture); CHECK_GL_ERROR(mLogger); GLint glTextureFormat = (textureFormat == TextureFormat::TEXTURE_FORMAT_RGB ? GL_RGB : textureFormat == TextureFormat::TEXTURE_FORMAT_RGBA ? GL_RGBA : GL_RED); glTexImage2D(GL_TEXTURE_2D, 0, glTextureFormat, textureWidth, textureHeight, 0, glTextureFormat, GL_UNSIGNED_BYTE, &textureData[0]); CHECK_GL_ERROR(mLogger); glGenerateMipmap(GL_TEXTURE_2D); CHECK_GL_ERROR(mLogger); glBindTexture(GL_TEXTURE_2D, 0); CHECK_GL_ERROR(mLogger); } OpenGLTexture::~OpenGLTexture() { glDeleteBuffers(1, &mTexture); CHECK_GL_ERROR(mLogger); } And here is the sampler I create which is shared between Diffuse and normal textures // texture sampler setup glGenSamplers(1, &mTextureSampler); CHECK_GL_ERROR(mLogger); glSamplerParameteri(mTextureSampler, GL_TEXTURE_MAG_FILTER, GL_LINEAR); CHECK_GL_ERROR(mLogger); glSamplerParameteri(mTextureSampler, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST); CHECK_GL_ERROR(mLogger); glSamplerParameteri(mTextureSampler, GL_TEXTURE_WRAP_S, GL_REPEAT); CHECK_GL_ERROR(mLogger); glSamplerParameteri(mTextureSampler, GL_TEXTURE_WRAP_T, GL_REPEAT); CHECK_GL_ERROR(mLogger); glSamplerParameterf(mTextureSampler, GL_TEXTURE_MAX_ANISOTROPY_EXT, mCurrentAnisotropy); CHECK_GL_ERROR(mLogger); glUniform1i(glGetUniformLocation(mDefaultProgram.GetHandle(), "unifDiffuseTexture"), OpenGLTexture::TEXTURE_UNIT_DIFFUSE); CHECK_GL_ERROR(mLogger); glUniform1i(glGetUniformLocation(mDefaultProgram.GetHandle(), "unifNormalTexture"), OpenGLTexture::TEXTURE_UNIT_NORMAL); CHECK_GL_ERROR(mLogger); glBindSampler(OpenGLTexture::TEXTURE_UNIT_DIFFUSE, mTextureSampler); CHECK_GL_ERROR(mLogger); glBindSampler(OpenGLTexture::TEXTURE_UNIT_NORMAL, mTextureSampler); CHECK_GL_ERROR(mLogger); SetAnisotropicFiltering(mCurrentAnisotropy); The diffuse textures looks like they should, but the normal looks so wierd. Why is this?
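    For what it's worth, a tangent-space normal map drawn as a diffuse texture is expected to look mostly pale blue, since most texels encode a normal close to (0.5, 0.5, 1.0); the debug view alone does not necessarily mean the texture is broken. When re-enabling the commented-out path above, the usual form of the lookup is roughly the following GLSL sketch (matching the names already present in the shader):

    ```glsl
    // Build the tangent-to-world basis from the interpolated vectors and use it to
    // bring the sampled tangent-space normal into world space.
    vec3 normalTangentSpace = normalize(texture(unifNormalTexture, frag_texcoord).xyz * 2.0 - 1.0);
    mat3 tangentToWorldSpace = mat3(normalize(frag_tangent),
                                    normalize(frag_bitangent),
                                    normalize(frag_normal));
    vec3 normal = normalize(tangentToWorldSpace * normalTangentSpace);
    // The diffuse color should keep coming from unifDiffuseTexture, not the normal map.
    ```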

    Read the article

  • Only draw visible objects to the camera in 2D

    - by Deukalion
    I have Map, each map has an array of Ground, each Ground consists of an array of VertexPositionTexture and a texture name reference so it renders a texture at these points (as a shape through triangulation). Now when I render my map I only want to get a list of all objects that are visible in the camera. (So I won't loop through more than I have to) Structs: public struct Map { public Ground[] Ground { get; set; } } public struct Ground { public int[] Indexes { get; set; } public VertexPositionNormalTexture[] Points { get; set; } public Vector3 TopLeft { get; set; } public Vector3 TopRight { get; set; } public Vector3 BottomLeft { get; set; } public Vector3 BottomRight { get; set; } } public struct RenderBoundaries<T> { public BoundingBox Box; public T Items; } when I load a map: foreach (Ground ground in CurrentMap.Ground) { Boundaries.Add(new RenderBoundaries<Ground>() { Box = BoundingBox.CreateFromPoints(new Vector3[] { ground.TopLeft, ground.TopRight, ground.BottomLeft, ground.BottomRight }), Items = ground }); } TopLeft, TopRight, BottomLeft, BottomRight are simply the locations of each corner that the shape make. A rectangle. When I try to loop through only the objects that are visible I do this in my Draw method: public int Draw(GraphicsDevice device, ICamera camera) { BoundingFrustum frustum = new BoundingFrustum(camera.View * camera.Projection); // Visible count int count = 0; EffectTexture.World = camera.World; EffectTexture.View = camera.View; EffectTexture.Projection = camera.Projection; foreach (EffectPass pass in EffectTexture.CurrentTechnique.Passes) { pass.Apply(); foreach (RenderBoundaries<Ground> render in Boundaries.Where(m => frustum.Contains(m.Box) != ContainmentType.Disjoint)) { // Draw ground count++; } } return count; } When I try adding just one ground, then moving the camera so the ground is out of frame it still returns 1 which means it still gets draw even though it's not within the camera's view. Am I doing something or wrong or can it be because of my Camera? Any ideas why it doesn't work?

    Read the article

  • Is it possible to extract textures or sprites from compiled game files?

    - by Brian Reindel
    For instance, every map in Portal has what appear to be sprites over a texture indicating the obstacles you'll face (see screenshot). Are these resources compiled into the source as byte code, or is it possible to extract them from installation files? Obviously I understand copyright implications, and I am only interested in using it for a recreational project. Instead of recreating them, I wonder if they can be extracted.

    Read the article

  • Fastest way to group units that can see each other?

    - by mac
    In the 2D game I'm working with, the game engine is able to give me, for each unit, the list of other units that are in its view range. I would like to know if there is an established algorithm to sort the units into groups, where each group would be defined by all those units which are "connected" to each other (even through others). An example might help understand the question better (E = enemy, O = own unit). First, the data that I would get from the game engine:
    E1 can see E2, E3, O5
    E2 can see E1
    E3 can see E1
    E4 can see O5
    E5 can see O2
    E6 can see E7, O9, O1
    E7 can see E6
    O1 can see E6
    O2 can see O5, E5
    O5 can see E1, E4, O2
    O9 can see E6
    Then I should compute the groups as follows:
    G1 = E1, E2, E3, E4, E5, O2, O5
    G2 = O1, O9, E6, E7
    It can be safely assumed that visibility is symmetric: if A sees B, then B sees A. Just to clarify: I already wrote a naïve implementation that loops over each row of the game engine info, but from the look of it, it seems a problem general enough to have been studied in depth and to have various established algorithms (maybe passing through some tree-like structure?). My problem is that I couldn't find a way to describe it that returned useful Google hits. Thank you in advance for your help!
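    This is the classic connected-components problem on an undirected graph: units are nodes and every "can see" pair is an edge, so either a flood fill (BFS/DFS) over the visibility lists or a disjoint-set (union-find) structure produces the groups in near-linear time. A minimal union-find sketch in C++, assuming units are identified by indices 0..N-1:

    ```cpp
    #include <numeric>
    #include <utility>
    #include <vector>

    // Disjoint-set (union-find) with path halving and union by size.
    struct DisjointSet {
        std::vector<int> parent, size;

        explicit DisjointSet(int n) : parent(n), size(n, 1) {
            std::iota(parent.begin(), parent.end(), 0);
        }

        int find(int x) {
            while (parent[x] != x) {
                parent[x] = parent[parent[x]];  // path halving
                x = parent[x];
            }
            return x;
        }

        void unite(int a, int b) {
            a = find(a); b = find(b);
            if (a == b) return;
            if (size[a] < size[b]) std::swap(a, b);
            parent[b] = a;
            size[a] += size[b];
        }
    };

    // Usage: for every "A can see B" pair reported by the engine, call ds.unite(A, B);
    // afterwards, units sharing the same ds.find(id) belong to the same group.
    ```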

    Read the article

  • DirectX 10: How to use constant buffers

    - by schnozzinkobenstein
    I'm trying to access some variables in my shader, but I think I'm doing this wrong. Say I have a constant buffer that looks like this: cbuffer perFrame { float foo; float bar; }; I got an ID3D10EffectConstantBuffer reference to it, and I can get a specific index by calling GetMemberByIndex, but how can I figure out how many members perFrame has so that I can get each member without going out of bounds?
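    One approach that avoids knowing the count up front is to rely on the effect framework returning an "invalid" variable object rather than a null pointer: keep calling GetMemberByIndex until the result reports IsValid() == FALSE. (Querying the buffer's type description via GetType()->GetDesc() should also expose a member count, but the loop below is the simpler sketch.) A hedged C++ example:

    ```cpp
    #include <d3d10.h>

    // Walk the members of a constant buffer without knowing the count in advance.
    // Once the index runs past the last member, the returned dummy variable's
    // IsValid() is FALSE, which is the stop condition.
    void DumpConstantBufferMembers(ID3D10EffectConstantBuffer* cbuffer)
    {
        for (UINT i = 0; ; ++i)
        {
            ID3D10EffectVariable* member = cbuffer->GetMemberByIndex(i);
            if (!member->IsValid())
                break;

            D3D10_EFFECT_VARIABLE_DESC desc;
            member->GetDesc(&desc);
            // desc.Name now holds "foo", "bar", ...; use member->AsScalar() etc. to read or set it.
        }
    }
    ```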

    Read the article

  • Grid-Based 2D Lighting Problems

    - by Lemoncreme
    I am aware this question has been asked before, but unfortunately I am new to the language, so the complicated explanations I've found do not help me in the least. I need a lighting engine for my game, and I've tried some procedural lighting systems. This method works the best:
    if (light[xx - 1, yy] > light[xx, yy]) light[xx, yy] = light[xx - 1, yy] - lightPass;
    if (light[xx, yy - 1] > light[xx, yy]) light[xx, yy] = light[xx, yy - 1] - lightPass;
    if (light[xx + 1, yy] > light[xx, yy]) light[xx, yy] = light[xx + 1, yy] - lightPass;
    if (light[xx, yy + 1] > light[xx, yy]) light[xx, yy] = light[xx, yy + 1] - lightPass;
    (This takes each adjacent value, subtracts the 'lightPass' variable, and uses it if the neighbour is brighter than the current cell. It runs inside a for() loop.) This is all fine and dandy except for an obvious reason: the system favors whatever comes first in the for() loop. This is what the above code looks like applied to my game: if I could get some help on creating a new procedural (or other) lighting system I would really appreciate it!
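    A common fix for that directional bias is to stop depending on a single sweep order: either repeat the relaxation until no cell changes, or do two sweeps per update, one scanning top-left to bottom-right pulling light from the left/top neighbours and one scanning back pulling from the right/bottom neighbours. A hedged C# sketch of the two-sweep version, reusing the light / lightPass names from the snippet above:

    ```csharp
    using System;

    static class Lighting
    {
        // Propagate light with a forward and a backward sweep so no direction is favored.
        public static void Propagate(float[,] light, float lightPass)
        {
            int w = light.GetLength(0);
            int h = light.GetLength(1);

            // Forward sweep: pull light from the left and top neighbours.
            for (int x = 0; x < w; x++)
                for (int y = 0; y < h; y++)
                {
                    if (x > 0) light[x, y] = Math.Max(light[x, y], light[x - 1, y] - lightPass);
                    if (y > 0) light[x, y] = Math.Max(light[x, y], light[x, y - 1] - lightPass);
                }

            // Backward sweep: pull light from the right and bottom neighbours.
            for (int x = w - 1; x >= 0; x--)
                for (int y = h - 1; y >= 0; y--)
                {
                    if (x < w - 1) light[x, y] = Math.Max(light[x, y], light[x + 1, y] - lightPass);
                    if (y < h - 1) light[x, y] = Math.Max(light[x, y], light[x, y + 1] - lightPass);
                }
        }
    }
    ```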

    Read the article

  • Why doesn't my GameMaker step event work?

    - by exceltior
    So I want to make some kind of floor which reduces the player's movement when the player walks on it, but I'm having a thousand different issues implementing this, since it doesn't appear to do anything... So I have tried different ways:
    1 - A Step Event with the following script:
    if keyboard_check(ord('A')) { player.x = -5; }
    if keyboard_check(ord('D')) { player.x = -5; }
    if keyboard_check(ord('W')) { player.x = -5; }
    if keyboard_check(ord('S')) { player.x = -5; }
    2 - A Collision Event with the same code
    3 - A Step Event with collision detection in a script
    None of these options seems to work at all... Can you help me?

    Read the article

  • How should I invoke a physics engine?

    - by ymfoi
    I'm new to writing games. I'm planning to write a 2D battle game which may require a physics engine. Suppose I've written one: how can I combine it with the main routine of my game? Should I attach it directly to the graphics render routine or put it in its own thread? I've spent much time looking for a common approach, but found nothing. So can you explain the basic ideas to me, a newbie? Thanks! P.S. There are many other problems I would have to deal with if I chose to run the physics engine on a separate thread, for example locking, yet my intuition says I'd better separate the rendering and the physics engine.
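    A common starting point is to keep the physics on the same thread as rendering but advance it with a fixed timestep and an accumulator (the "fix your timestep" pattern): the simulation stays deterministic, there is no locking, and rendering can interpolate between the last two physics states. A minimal C++ sketch; gameIsRunning, physicsStep and render are placeholders for the game's own routines:

    ```cpp
    #include <chrono>

    // Placeholders provided elsewhere by the game.
    bool gameIsRunning();
    void physicsStep(double dt);
    void render(double interpolationAlpha);

    void runGameLoop()
    {
        using clock = std::chrono::steady_clock;
        const double fixedDt = 1.0 / 60.0;   // physics rate: 60 Hz
        double accumulator = 0.0;
        auto previous = clock::now();

        while (gameIsRunning()) {
            auto now = clock::now();
            accumulator += std::chrono::duration<double>(now - previous).count();
            previous = now;

            // Step the physics zero or more times with a fixed dt.
            while (accumulator >= fixedDt) {
                physicsStep(fixedDt);
                accumulator -= fixedDt;
            }

            // Render once per frame; accumulator / fixedDt can be used to blend
            // between the last two physics states for smooth motion.
            render(accumulator / fixedDt);
        }
    }
    ```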

    Read the article

  • Algorithmically generating neon layers on pixel grid

    - by user190929
    In a screensaver I am attempting to make, I am a fan of neon-like graphics, which, of course, look great against a black background. As I understand it, neon, graphically speaking, is essentially a gradient of a color, brightest in the center and getting darker proceeding outward. A more accurate description is similar, but separates it into tubes and glow. The tubes are mostly white, while the glow is where most of the color is seen. Well... the tubes could also be a light variant of the color, you could say. The glow is darker. Anyhow, my question is: how could you generate such things given an initial pattern of pixels that would be the tubes? For example, let's say I want to make a neon 'H'. Via the libraries I can obtain the rectangles of pixels which represent it, but I want to make it look neonized. How could I algorithmically achieve such an effect given a base tube shape and base color? EDIT: OK, I misstated that. Got a bit distracted. My purpose for this was similar to a neon effect, but not quite. Sorry about that. What I am looking for is something like this. Start with a pattern of pixels:
    [!][!][!][!][!][!][!][!]
    [!][!][O][!][!][!][!][!]
    [!][!][O][O][!][!][!][!]
    [!][!][!][!][O][!][!][!]
    [!][!][!][!][!][!][!][!]
    How do I find the E pixels?
    [!][E][E][E][!][!][!][!]
    [!][E][O][E][E][!][!][!]
    [!][E][O][O][E][E][!][!]
    [!][E][E][E][O][E][!][!]
    [!][!][!][E][E][E][!][!]
    Sorry if that looks bad.
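    Finding the E cells is a single step of morphological dilation: a cell belongs to the glow ring if it is empty but has at least one filled neighbour (8-connected in the example above, since the diagonals are marked too). Running the same pass again on the union of tube and ring cells produces the next, darker ring. A minimal C++ sketch:

    ```cpp
    #include <vector>

    // Mark every empty cell that touches a filled cell (8-connected).
    // grid: true = tube pixel ("O"), false = empty ("!").  Returns the "E" mask.
    std::vector<std::vector<bool>> glowRing(const std::vector<std::vector<bool>>& grid)
    {
        int h = static_cast<int>(grid.size());
        int w = h > 0 ? static_cast<int>(grid[0].size()) : 0;
        std::vector<std::vector<bool>> ring(h, std::vector<bool>(w, false));

        for (int y = 0; y < h; ++y)
            for (int x = 0; x < w; ++x) {
                if (grid[y][x]) continue;                   // already a tube pixel
                for (int dy = -1; dy <= 1 && !ring[y][x]; ++dy)
                    for (int dx = -1; dx <= 1; ++dx) {
                        int ny = y + dy, nx = x + dx;
                        if (ny >= 0 && ny < h && nx >= 0 && nx < w && grid[ny][nx]) {
                            ring[y][x] = true;              // empty cell next to a tube cell
                            break;
                        }
                    }
            }
        return ring;
    }
    // Repeating this on (grid OR ring) yields the next, darker ring of the glow.
    ```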

    Read the article

  • 2D Skeletal Animation Transformations

    - by Brad Zeis
    I have been trying to build a 2D skeletal animation system for a while, and I believe that I'm fairly close to finishing. Currently, I have the following data structures:
    struct Bone {
        Bone *parent;
        int child_count;
        Bone **children;
        double x, y;
    };
    struct Vertex {
        double x, y;
        int bone_count;
        Bone **bones;
        double *weights;
    };
    struct Mesh {
        int vertex_count;
        Vertex **vertices;
        Vertex **tex_coords;
    }
    Bone->x and Bone->y are the coordinates of the end point of the Bone. The starting point is given by (bone->parent->x, bone->parent->y) or (0, 0). Each entity in the game has a Mesh, and Mesh->vertices is used as the bounding area for the entity. Mesh->tex_coords are texture coordinates. In the entity's update function, the position of the Bone is used to change the coordinates of the Vertices that are bound to it. Currently what I have is:
    void Mesh_update(Mesh *mesh)
    {
        int i, j;
        double sx, sy;
        for (i = 0; i < vertex_count; i++) {
            if (mesh->vertices[i]->bone_count == 0) {
                continue;
            }
            sx, sy = 0;
            for (j = 0; j < mesh->vertices[i]->bone_count; j++) {
                sx += (/* ??? */) * mesh->vertices[i]->weights[j];
                sy += (/* ??? */) * mesh->vertices[i]->weights[j];
            }
            mesh->vertices[i]->x = sx;
            mesh->vertices[i]->y = sy;
        }
    }
    I think I have everything I need, I just don't know how to apply the transformations to the final mesh coordinates. What transformations do I need here? Or is my approach just completely wrong?
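    For linear blend skinning, the piece that goes in the "???" spots is each bone's current transform applied to the vertex's bind-pose (rest) position, blended by weight. The Bone struct above only stores the current endpoint, so the sketch below assumes extra per-bone data that is not in the original code: a current world-space origin/angle and the matching bind-pose origin/angle, plus the vertex's bind-pose coordinates.

    ```c
    #include <math.h>

    /* Assumed extra per-bone data (not present in the original struct):
       the bone's transform this frame and its transform at bind time. */
    typedef struct {
        double cur_x, cur_y, cur_angle;
        double bind_x, bind_y, bind_angle;
    } BonePose;

    /* Transform one bind-pose vertex (vx, vy) by a single bone. */
    static void transform_by_bone(const BonePose *b, double vx, double vy,
                                  double *out_x, double *out_y)
    {
        /* Move the vertex into the bone's local (bind) space... */
        double lx = vx - b->bind_x;
        double ly = vy - b->bind_y;
        /* ...rotate by how much the bone has turned since bind time... */
        double c = cos(b->cur_angle - b->bind_angle);
        double s = sin(b->cur_angle - b->bind_angle);
        double rx = lx * c - ly * s;
        double ry = lx * s + ly * c;
        /* ...and place the result at the bone's current position. */
        *out_x = rx + b->cur_x;
        *out_y = ry + b->cur_y;
    }

    /* Weighted blend over all bones influencing the vertex (the "???" part). */
    void skin_vertex(const BonePose *bones, const double *weights, int bone_count,
                     double bind_vx, double bind_vy, double *out_x, double *out_y)
    {
        double sx = 0.0, sy = 0.0;
        for (int j = 0; j < bone_count; ++j) {
            double tx, ty;
            transform_by_bone(&bones[j], bind_vx, bind_vy, &tx, &ty);
            sx += tx * weights[j];
            sy += ty * weights[j];
        }
        *out_x = sx;
        *out_y = sy;
    }
    ```

    Note that this needs the untouched bind-pose vertex every frame; overwriting Vertex->x/y in place, as Mesh_update does above, would lose that reference pose after the first update.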

    Read the article

  • Cocos2d sprite's parent not reflecting true scale value

    - by Paul Renton
    I am encountering issues with determining a CCSprite's parent node's scale value. In my game I have a class that extends CCLayer and scales itself based on game triggers. Certain child sprites of this CCLayer have mathematical calculations that become inaccurate once I scale the parent CCLayer. For instance, I have a tank sprite that needs to determine its firing point within the parent node. Whenever I scale the layer and ask the layer for its scale values, they are accurate. However, when I poll the sprites contained within the layer for their parent's scale values, they always appear as one. // From within the sprite CCLOG(@"ChildSprite-> Parent's scale values are scaleX: %f, scaleY: %f", self.parent.scaleX, self.parent.scaleY); // Outputs 1.0,1.0 // From within the layer CCLOG(@"Layer-> ScaleX : %f, ScaleY: %f , SCALE: %f", self.scaleX, self.scaleY, self.scale); // Output is 0.80,0.80 Could anyone explain to me why this is the case? I don't understand why these values are different. Maybe I don't understand the inner design of Cocos2d fully. Any help is appreciated.

    Read the article

  • Collision filtering techniques

    - by Griffin
    I was wondering what efficient techniques are out there for mapping collision filtering between various bodies, sub-bodies, and so forth. I'm familiar with the simple idea of having different layers of 2D bodies, but this is not sufficient for more complex mapping: (Think of having sub-bodies of a body, such as limbs, collide with each other by placing them on the same layer, and then wanting to only have the legs collide with the ground while the arms would not) This can be solved with a multidimensional layer setup, but I would probably end up just creating more and more layers to the point where the simplicity and efficiency of layer filtering would be gone. Are there any more complex ways to solve even more complex situations than this?
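    The usual generalization of plain layers is the category/mask bit-field scheme used by several physics engines (Box2D's fixture filtering works like this): every collider gets a category bit and a mask of the categories it is willing to hit, and two colliders interact only if each one's category appears in the other's mask; an optional group index overrides the rule for sub-bodies of the same character. A minimal sketch:

    ```cpp
    #include <cstdint>

    // Category bits: one bit per "kind" of collider.
    enum Category : std::uint16_t {
        CAT_GROUND = 1 << 0,
        CAT_LEG    = 1 << 1,
        CAT_ARM    = 1 << 2,
        CAT_ENEMY  = 1 << 3,
    };

    struct Filter {
        std::uint16_t category;  // what this collider is
        std::uint16_t mask;      // what it is allowed to hit
        std::int16_t  group;     // same positive group -> always collide,
                                 // same negative group -> never collide, 0 -> ignore
    };

    bool shouldCollide(const Filter& a, const Filter& b)
    {
        if (a.group != 0 && a.group == b.group)
            return a.group > 0;
        return (a.category & b.mask) != 0 && (b.category & a.mask) != 0;
    }

    // Example: legs collide with the ground and other limbs, arms skip the ground.
    //   Filter leg{ CAT_LEG, CAT_GROUND | CAT_LEG | CAT_ARM, 0 };
    //   Filter arm{ CAT_ARM, CAT_LEG | CAT_ARM, 0 };
    ```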

    Read the article

  • Increase animation speed according to the swipe speed in Unity for Android

    - by rohit
    I have the animation done in Maya and brought the FBX file into Unity. Here is my code to calculate the speed of the swipe:
    Vector2 speedMeasuredInScreenWidthsPerSecond = (Input.touches[0].deltaPosition / Screen.width) * Input.touches[0].deltaTime;
    Now I want to take speedMeasuredInScreenWidthsPerSecond and use it to increase the animation speed accordingly, like this:
    animation["gmeChaAnimMiddle"].speed = Mathf.Round(speedMeasuredInScreenWidthsPerSecond);
    However, this results in an error saying that I need to convert Vector2 to float. So how do I overcome it?
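    The compile error is just the Vector2: its magnitude has to be taken before the value can drive a float animation speed, and dividing by deltaTime (rather than multiplying) is what turns a per-frame delta into a per-second rate. A hedged C# sketch; the clip name comes from the question, while the scale and minimum-speed values are assumptions to tune:

    ```csharp
    using UnityEngine;

    public class SwipeAnimationSpeed : MonoBehaviour
    {
        public float speedScale = 5f;   // assumed tuning factor
        public float minSpeed = 0.1f;   // keeps the clip from stalling completely

        void Update()
        {
            if (Input.touchCount == 0)
                return;

            Touch touch = Input.touches[0];
            if (touch.phase != TouchPhase.Moved || touch.deltaTime <= 0f)
                return;

            // Screen widths per second: magnitude collapses the Vector2 to a float,
            // and dividing by deltaTime turns the per-frame delta into a rate.
            float swipeSpeed = (touch.deltaPosition.magnitude / Screen.width) / touch.deltaTime;

            // Legacy Animation component, as used in the question
            // (use GetComponent<Animation>() in newer Unity versions).
            animation["gmeChaAnimMiddle"].speed = Mathf.Max(minSpeed, swipeSpeed * speedScale);
        }
    }
    ```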

    Read the article
