Search Results

Search found 44501 results on 1781 pages for 'software development life'.

Page 583 of 1781

  • Multiple Sprites using foreach Collision Detection in XNA (C#)

    - by Bradley Kreuger
    Back again from my last question. I'm now using a foreach statement to reuse the same shot class, and I'm curious how I would go about doing collision detection. I used the tutorial here on how to shoot a fireball: http://www.xnadevelopment.com/tutorials.shtml. I tried putting a foreach in several places to check whether any of the shots have reached the borders of my hero sprite, but it doesn't seem to do anything. As before, if someone knows of a good site with tutorials that explain collision detection a little better, that would be appreciated.
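
    Not part of the original post, but a minimal sketch of the usual XNA approach, assuming each shot and the hero expose a bounding Rectangle built from their position and texture size (Shot, Hero and their members here are hypothetical stand-ins for the asker's classes):

      // Rectangle.Intersects is XNA's built-in overlap test for axis-aligned boxes.
      private void CheckShotCollisions(List<Shot> shots, Hero hero)
      {
          foreach (Shot shot in shots)
          {
              if (shot.BoundingBox.Intersects(hero.BoundingBox))
              {
                  hero.TakeDamage(shot.Damage);   // hypothetical reaction
                  shot.IsVisible = false;         // mark the shot for removal
              }
          }

          // Remove spent shots after the loop so the list is not modified
          // while it is being enumerated.
          shots.RemoveAll(s => !s.IsVisible);
      }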

    Read the article

  • List of bounding boxes?

    - by Christian Frantz
    When I create a bounding box for each object in my chunk, would it be better to store them in a list? List<BoundingBox> cubeBoundingBox Or can I just use a single variable? BoundingBox cubeBoundingBox The bounding boxes will be used for all types of things so they will be moving around. In any case, I'd be adding it to a method that gets called 2500+ times for each chunk, so either I have a giant list of them or 2500+ individual boxes. Is there any advantage to using one or the other?

    Read the article

  • Using a shader causes the triangle to disappear

    - by invisal
    The following is my rendering code. Private Sub GameRender() GL.Clear(ClearBufferMask.ColorBufferBit + ClearBufferMask.DepthBufferBit) GL.ClearColor(Color.SkyBlue) GL.UseProgram(theProgram) GL.EnableClientState(ArrayCap.VertexArray) GL.EnableClientState(ArrayCap.ColorArray) GL.BindBuffer(BufferTarget.ArrayBuffer, vertexPositionID) GL.DrawArrays(BeginMode.Triangles, 0, 3) GL.DisableClientState(ArrayCap.ColorArray) GL.DisableClientState(ArrayCap.VertexArray) GlControl1.SwapBuffers() End Sub This is screenshot without GL.UseProgram(theProgram) This is screenshot with GL.UseProgram(theProgram) Here are my shader code that I picked from online tutorial. Vertex Shader #version 330 layout(location = 0) in vec4 position; void main() { gl_Position = position; } Fragment Shader #version 330 out vec4 outputColor; void main() { outputColor = vec4(1.0f, 1.0f, 1.0f, 1.0f); } These are my shader creation code. '' Initialize Shader Dim shaderList(1) As Integer shaderList(0) = CreateShader(ShaderType.VertexShader, strVertexShader) shaderList(1) = CreateShader(ShaderType.FragmentShader, strFragShader) theProgram = CreateProgram(shaderList) GL.DeleteShader(shaderList(0)) GL.DeleteShader(shaderList(1)) Here are my helper functions Private Function CreateShader(ByVal shaderType As ShaderType, ByVal code As String) Dim shader As Integer = GL.CreateShader(shaderType) GL.ShaderSource(shader, code) GL.CompileShader(shader) Dim status As Integer GL.GetShader(shader, ShaderParameter.CompileStatus, status) If status = False Then MsgBox(GL.GetShaderInfoLog(shader)) End If Return shader End Function Private Function CreateProgram(ByVal shaderList() As Integer) As Integer Dim program As Integer = GL.CreateProgram() For i As Integer = 0 To shaderList.Length - 1 GL.AttachShader(program, shaderList(i)) Next GL.LinkProgram(program) Dim status As Integer GL.GetProgram(program, ProgramParameter.LinkStatus, status) If status = False Then MsgBox(GL.GetProgramInfoLog(program)) End If For i As Integer = 0 To shaderList.Length - 1 GL.DetachShader(program, shaderList(i)) Next Return program End Function

    Read the article

  • More Changes...

    - by MOSSLover
    Stuff has changed drastically for me in the past two to three years. I moved over 1,000 miles from Saint Louis. I go outside and get up in front of crowds with fewer issues. Now I'm changing jobs again. I'm not really sure what to say here. I was obviously unhappy and I needed to do something different. So I quit two days ago, and I guess it worked out: I end with B&R this Friday, then head to TEC and SPS Huntsville, and a week from this Monday I start my new job at Gig Werks. I'm not sure what to expect or where I'm heading, but I think it's a step in the right direction. I won't really know what kind of impact this will have on my life for at least another six months to a year. For some reason I can't sleep tonight, and I think it's really a reflection of my last day. Tomorrow is an ending and a beginning at the same time, so it's both kind of sad and exciting. I don't know why, but I'm really excited to go to Disneyland for the second time ever in my lifetime. I get to ride the Teacups. For the longest time when I was a kid I wanted to go to Disneyland. I wanted to ride the teacups. In 2007, at the age of 25, I rode the teacups on my first ever visit to LA. That was the start of finally syncing up with my childhood goals. I wanted to live near a major city. I wanted to visit all the major cities in the world. I wanted to see everything and meet everyone. This job change will probably turn into something great; I just don't know it yet. I'm walking outside my comfort zone again and stepping into uncharted territory. In two to three years I'll probably write another blog post about how this week led to something great. It just stinks when you have to leave behind something you know and love. I will miss all my current colleagues, but I'm sure I'll gain some new ones and keep in touch with the old. To 2010 being a great year for change, and hopefully by the end of the year I can say I went to Europe. To reaching my goals and my dreams. Don't let anyone stop you from getting what you want in life (unless you are an axe murderer; please don't kill anyone, that's just wrong). Have a good weekend, everyone!

    Read the article

  • How to implement a simple bullet trajectory

    - by AirieFenix
    I searched and searched, and although it's a fairly simple question, I can't find a proper answer, only general ideas (which I already have). I have a top-down game and I want to implement a gun which shoots bullets that follow a simple path (no physics and no change of trajectory, just a go-from-A-to-B thing). a: vector of the position of the gun/player. b: vector of the mouse position (cross-hair). w: the vector of the bullet's trajectory. So w = b - a, and the position of the bullet = [x = x0 + speed*time*normalized w.x, y = y0 + speed*time*normalized w.y]. I have the constructor: public Shot(int shipX, int shipY, int mouseX, int mouseY) { //I get mouse with Gdx.input.getX()/getY() ... this.shotTime = TimeUtils.millis(); this.posX = shipX; this.posY = shipY; //I used aVector = aVector.nor() here before but for some reason didn't work float tmp = (float) (Math.pow(mouseX-shipX, 2) + Math.pow(mouseY-shipY, 2)); tmp = (float) Math.sqrt(Math.abs(tmp)); this.vecX = (mouseX-shipX)/tmp; this.vecY = (mouseY-shipY)/tmp; } And here I update the position and draw the shot: public void drawShot(SpriteBatch batch) { this.lifeTime = TimeUtils.millis() - this.shotTime; //position = positionBefore + v*t this.posX = this.posX + this.vecX*this.lifeTime*speed*Gdx.graphics.getDeltaTime(); this.posY = this.posY + this.vecY*this.lifeTime*speed*Gdx.graphics.getDeltaTime(); ... } Now, the behavior of the bullet seems very awkward: it doesn't go exactly where my mouse is (it's like the mouse is 30px off) and it moves at a seemingly random speed. I know I probably need to open the old algebra book from college, but I'd like somebody to say whether I'm on the right track (or point me to it), and whether it's a calculation problem, a code problem, or both. Also, is it possible that Gdx.input.getX() gives me an imprecise position? When I draw the cross-hair it also draws offset from the cursor position. Sorry for the long post and sorry if it's a very basic question. Thanks!
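
    Not from the thread, just a minimal sketch of one common libGDX-style shape for this under the same assumptions: normalize the direction once, then advance the position by speed * delta each frame (the extra multiplication by the total lifeTime in the code above is one likely source of the runaway speed):

      import com.badlogic.gdx.math.Vector2;

      // Illustrative only; not the asker's Shot class.
      public class SimpleShot {
          private final Vector2 position;
          private final Vector2 direction;   // unit vector toward the target
          private final float speed;         // world units per second

          public SimpleShot(float startX, float startY, float targetX, float targetY, float speed) {
              this.position = new Vector2(startX, startY);
              // nor() normalizes the vector in place and returns it.
              this.direction = new Vector2(targetX - startX, targetY - startY).nor();
              this.speed = speed;
          }

          public void update(float delta) {
              // position += direction * speed * delta  (no lifetime factor here)
              position.add(direction.x * speed * delta, direction.y * speed * delta);
          }

          public Vector2 getPosition() {
              return position;
          }
      }

    If the cross-hair itself draws offset from the cursor, it is also worth converting the mouse coordinates into world space (for example through the camera's unproject), since Gdx.input.getY() is measured from the top of the screen while world coordinates usually grow upward.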

    Read the article

  • Modern techniques for spriting

    - by DevilWithin
    Hello, I would like to know the workflow for making modern 2D game artwork. How are the assets made nowadays? Bitmap? Vector-based? Hand-drawn and painted? Drawn digitally? Modeled in 3D and exported to bitmaps? I would also like some information on programs for producing fine-looking art. Why does Flash's vector art style look good in most games? How do I make equivalent graphics with external tools, or graphics that are equally good but not vector-based? Any special hints for animating? An answer oriented towards a one-man-army indie developer with little experience but some artistic sense would be appreciated! I'm not a complete dummy with paint programs, but not a master either; I just need efficient ways to achieve results. Thanks. NOTE: Pixel art is not the goal of this question, and nothing related to direct pixel manipulation needs to be brought up here, but you're free to do exactly that :)

    Read the article

  • Power & Sleep Management

    - by Espressofa
    I'm running 12.10 with xmonad and trying to ensure that the right things happen when I close the laptop lid, etc. Internet search results for similar issues mostly point towards gnome-power-manager. I have the package installed, but gnome-power-manager is not in my path anywhere. The behavior I'm looking for is as follows: sleep on lid close, awaken on lid open, turn off the screen after 10 idle minutes, and, most importantly, better battery life. I'm supposed to be getting 9 hours and I haven't seen the battery life estimate above 2.5 hours yet. Any tips on where to look or how to configure this would be much appreciated.

    Read the article

  • Rotate model using quaternion

    - by ChocoMan
    Currently I have this to rotate my 3D model on its local axis, independent of the world's axis: // Rotate model with Right Thumbstick modelRotation -= pController.ThumbSticks.Right.X * mRotSpeed; // float value What I'm trying to do is rotate the model using a quaternion rather than a matrix. I've searched for tutorials, but have found none that explain thoroughly how to achieve this. Does anyone know how I can use quaternions to rotate my model, or of a complete tutorial?
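
    Not a full tutorial, but a minimal sketch of the common XNA pattern: accumulate the orientation in a Quaternion and convert it to a matrix only when drawing. The thumbstick input is reused from the snippet above; modelPosition and the choice of Vector3.Up as the local axis are assumptions:

      // Accumulated orientation; start with no rotation.
      Quaternion orientation = Quaternion.Identity;

      // Per frame: build a small turn about the model's up axis from the stick input
      // and multiply it onto the current orientation. Swapping the multiplication
      // order changes whether the turn is applied in local or world space.
      float turnAngle = -pController.ThumbSticks.Right.X * mRotSpeed;
      Quaternion turn = Quaternion.CreateFromAxisAngle(Vector3.Up, turnAngle);
      orientation = Quaternion.Normalize(orientation * turn);

      // When drawing, convert the quaternion into the world matrix.
      Matrix world = Matrix.CreateFromQuaternion(orientation)
                   * Matrix.CreateTranslation(modelPosition);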

    Read the article

  • How can I locate empty space next to polygon regions?

    - by Stephen
    Let's say I have the following area in a top-down map: The circle is the player, the black square is an obstacle, and the grey polygons with red borders are walk-able areas that will be used as a navigation mesh for enemies. Obstacles and grey polygons are always convex. The grey regions were defined using an algorithm when the world was generated at runtime. Notice the little white column. I need to figure out where any empty space like this is, if at all, after the algorithm builds the grey regions, so that I can fill the space with another region. Basically what I'm hoping for is an algorithm that can detect empty space next to a polygon.
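
    Since every region and obstacle is convex, one rough way to find the leftover space is to sample the area on a grid and flag cells whose centers fall inside neither a walkable region nor an obstacle; clusters of flagged cells are the gaps to fill. A sketch of that idea (all the types here are assumptions, not the asker's engine):

      using System.Collections.Generic;

      // A polygon is its vertex list in consistent winding order.
      static bool InsideConvex(List<(float x, float y)> poly, float px, float py)
      {
          // Inside a convex polygon means the point is on the same side of every
          // edge (all edge cross products share a sign).
          bool anyNeg = false, anyPos = false;
          for (int i = 0; i < poly.Count; i++)
          {
              var a = poly[i];
              var b = poly[(i + 1) % poly.Count];
              float cross = (b.x - a.x) * (py - a.y) - (b.y - a.y) * (px - a.x);
              if (cross < 0) anyNeg = true;
              if (cross > 0) anyPos = true;
          }
          return !(anyNeg && anyPos);
      }

      // Returns the centers of grid cells covered by no region and no obstacle.
      static List<(float x, float y)> FindUncoveredCells(
          List<List<(float x, float y)>> regions,
          List<List<(float x, float y)>> obstacles,
          float minX, float minY, float maxX, float maxY, float cellSize)
      {
          var empty = new List<(float x, float y)>();
          for (float y = minY + cellSize / 2; y < maxY; y += cellSize)
              for (float x = minX + cellSize / 2; x < maxX; x += cellSize)
              {
                  bool covered = regions.Exists(r => InsideConvex(r, x, y))
                              || obstacles.Exists(o => InsideConvex(o, x, y));
                  if (!covered)
                      empty.Add((x, y));
              }
          return empty;
      }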

    Read the article

  • Tools for creating sprite sheets, and tips

    - by Spooks
    I am looking for a tool that I can use to create sprite sheets easily. Right now I am using Illustrator, but I can never get the center of the character in the exact same position, so the character looks like it is moving around (even though it's always in one place) while the sprite sheet is looped through. Are there any better tools I could be using? Also, what tips would you give for working with a sprite sheet? Should I create each part of the character on individual layers (left arm, right arm, body, etc.) or draw everything at once? Any other tips would also be helpful! Thank you.

    Read the article

  • Normal map applied as diffuse texture looks wrong

    - by KaiserJohaan
    Diffuse textures works fine, but I am having problem with normal maps, so I thought I'd tried to apply the normal maps as the diffuse map in my fragment shader so I could see everything is OK. I comment-out my normal map code and just set the diffuse map to the normal map and I get this: http://postimg.org/image/j9gudjl7r/ Looks like a smurf! This is the actual normal map of the main body: http://postimg.org/image/sbkyr6fg9/ Here is my fragment shader, notice I commented out normal map code so I could debug the normal map as a diffuse texture "#version 330 \n \ \n \ layout(std140) uniform; \n \ \n \ const int MAX_LIGHTS = 8; \n \ \n \ struct Light \n \ { \n \ vec4 mLightColor; \n \ vec4 mLightPosition; \n \ vec4 mLightDirection; \n \ \n \ int mLightType; \n \ float mLightIntensity; \n \ float mLightRadius; \n \ float mMaxDistance; \n \ }; \n \ \n \ uniform UnifLighting \n \ { \n \ vec4 mGamma; \n \ vec3 mViewDirection; \n \ int mNumLights; \n \ \n \ Light mLights[MAX_LIGHTS]; \n \ } Lighting; \n \ \n \ uniform UnifMaterial \n \ { \n \ vec4 mDiffuseColor; \n \ vec4 mAmbientColor; \n \ vec4 mSpecularColor; \n \ vec4 mEmissiveColor; \n \ \n \ bool mHasDiffuseTexture; \n \ bool mHasNormalTexture; \n \ bool mLightingEnabled; \n \ float mSpecularShininess; \n \ } Material; \n \ \n \ uniform sampler2D unifDiffuseTexture; \n \ uniform sampler2D unifNormalTexture; \n \ \n \ in vec3 frag_position; \n \ in vec3 frag_normal; \n \ in vec2 frag_texcoord; \n \ in vec3 frag_tangent; \n \ in vec3 frag_bitangent; \n \ \n \ out vec4 finalColor; " " \n \ \n \ void CalcGaussianSpecular(in vec3 dirToLight, in vec3 normal, out float gaussianTerm) \n \ { \n \ vec3 viewDirection = normalize(Lighting.mViewDirection); \n \ vec3 halfAngle = normalize(dirToLight + viewDirection); \n \ \n \ float angleNormalHalf = acos(dot(halfAngle, normalize(normal))); \n \ float exponent = angleNormalHalf / Material.mSpecularShininess; \n \ exponent = -(exponent * exponent); \n \ \n \ gaussianTerm = exp(exponent); \n \ } \n \ \n \ vec4 CalculateLighting(in Light light, in vec4 diffuseTexture, in vec3 normal) \n \ { \n \ if (light.mLightType == 1) // point light \n \ { \n \ vec3 positionDiff = light.mLightPosition.xyz - frag_position; \n \ float dist = max(length(positionDiff) - light.mLightRadius, 0); \n \ \n \ float attenuation = 1 / ((dist/light.mLightRadius + 1) * (dist/light.mLightRadius + 1)); \n \ attenuation = max((attenuation - light.mMaxDistance) / (1 - light.mMaxDistance), 0); \n \ \n \ vec3 dirToLight = normalize(positionDiff); \n \ float angleNormal = clamp(dot(normalize(normal), dirToLight), 0, 1); \n \ \n \ float gaussianTerm = 0.0; \n \ if (angleNormal > 0.0) \n \ CalcGaussianSpecular(dirToLight, normal, gaussianTerm); \n \ \n \ return diffuseTexture * (attenuation * angleNormal * Material.mDiffuseColor * light.mLightIntensity * light.mLightColor) + \n \ (attenuation * gaussianTerm * Material.mSpecularColor * light.mLightIntensity * light.mLightColor); \n \ } \n \ else if (light.mLightType == 2) // directional light \n \ { \n \ vec3 dirToLight = normalize(light.mLightDirection.xyz); \n \ float angleNormal = clamp(dot(normalize(normal), dirToLight), 0, 1); \n \ \n \ float gaussianTerm = 0.0; \n \ if (angleNormal > 0.0) \n \ CalcGaussianSpecular(dirToLight, normal, gaussianTerm); \n \ \n \ return diffuseTexture * (angleNormal * Material.mDiffuseColor * light.mLightIntensity * light.mLightColor) + \n \ (gaussianTerm * Material.mSpecularColor * light.mLightIntensity * light.mLightColor); \n \ } \n \ else if 
(light.mLightType == 4) // ambient light \n \ return diffuseTexture * Material.mAmbientColor * light.mLightIntensity * light.mLightColor; \n \ else \n \ return vec4(0.0); \n \ } \n \ \n \ void main() \n \ { \n \ vec4 diffuseTexture = vec4(1.0); \n \ if (Material.mHasDiffuseTexture) \n \ diffuseTexture = texture(unifDiffuseTexture, frag_texcoord); \n \ \n \ vec3 normal = frag_normal; \n \ if (Material.mHasNormalTexture) \n \ { \n \ diffuseTexture = vec4(normalize(texture(unifNormalTexture, frag_texcoord).xyz * 2.0 - 1.0), 1.0); \n \ // vec3 normalTangentSpace = normalize(texture(unifNormalTexture, frag_texcoord).xyz * 2.0 - 1.0); \n \ //mat3 tangentToWorldSpace = mat3(normalize(frag_tangent), normalize(frag_bitangent), normalize(frag_normal)); \n \ \n \ // normal = tangentToWorldSpace * normalTangentSpace; \n \ } \n \ \n \ if (Material.mLightingEnabled) \n \ { \n \ vec4 accumLighting = vec4(0.0); \n \ \n \ for (int lightIndex = 0; lightIndex < Lighting.mNumLights; lightIndex++) \n \ accumLighting += Material.mEmissiveColor * diffuseTexture + \n \ CalculateLighting(Lighting.mLights[lightIndex], diffuseTexture, normal); \n \ \n \ finalColor = pow(accumLighting, Lighting.mGamma); \n \ } \n \ else { \n \ finalColor = pow(diffuseTexture, Lighting.mGamma); \n \ } \n \ } \n"; Here is my wrapper around a texture OpenGLTexture::OpenGLTexture(const std::vector<uint8_t>& textureData, uint32_t textureWidth, uint32_t textureHeight, TextureFormat textureFormat, TextureType textureType, Logger& logger) : mLogger(logger), mTextureID(gNextTextureID++), mTextureType(textureType) { glGenTextures(1, &mTexture); CHECK_GL_ERROR(mLogger); glBindTexture(GL_TEXTURE_2D, mTexture); CHECK_GL_ERROR(mLogger); GLint glTextureFormat = (textureFormat == TextureFormat::TEXTURE_FORMAT_RGB ? GL_RGB : textureFormat == TextureFormat::TEXTURE_FORMAT_RGBA ? GL_RGBA : GL_RED); glTexImage2D(GL_TEXTURE_2D, 0, glTextureFormat, textureWidth, textureHeight, 0, glTextureFormat, GL_UNSIGNED_BYTE, &textureData[0]); CHECK_GL_ERROR(mLogger); glGenerateMipmap(GL_TEXTURE_2D); CHECK_GL_ERROR(mLogger); glBindTexture(GL_TEXTURE_2D, 0); CHECK_GL_ERROR(mLogger); } OpenGLTexture::~OpenGLTexture() { glDeleteBuffers(1, &mTexture); CHECK_GL_ERROR(mLogger); } And here is the sampler I create which is shared between Diffuse and normal textures // texture sampler setup glGenSamplers(1, &mTextureSampler); CHECK_GL_ERROR(mLogger); glSamplerParameteri(mTextureSampler, GL_TEXTURE_MAG_FILTER, GL_LINEAR); CHECK_GL_ERROR(mLogger); glSamplerParameteri(mTextureSampler, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST); CHECK_GL_ERROR(mLogger); glSamplerParameteri(mTextureSampler, GL_TEXTURE_WRAP_S, GL_REPEAT); CHECK_GL_ERROR(mLogger); glSamplerParameteri(mTextureSampler, GL_TEXTURE_WRAP_T, GL_REPEAT); CHECK_GL_ERROR(mLogger); glSamplerParameterf(mTextureSampler, GL_TEXTURE_MAX_ANISOTROPY_EXT, mCurrentAnisotropy); CHECK_GL_ERROR(mLogger); glUniform1i(glGetUniformLocation(mDefaultProgram.GetHandle(), "unifDiffuseTexture"), OpenGLTexture::TEXTURE_UNIT_DIFFUSE); CHECK_GL_ERROR(mLogger); glUniform1i(glGetUniformLocation(mDefaultProgram.GetHandle(), "unifNormalTexture"), OpenGLTexture::TEXTURE_UNIT_NORMAL); CHECK_GL_ERROR(mLogger); glBindSampler(OpenGLTexture::TEXTURE_UNIT_DIFFUSE, mTextureSampler); CHECK_GL_ERROR(mLogger); glBindSampler(OpenGLTexture::TEXTURE_UNIT_NORMAL, mTextureSampler); CHECK_GL_ERROR(mLogger); SetAnisotropicFiltering(mCurrentAnisotropy); The diffuse textures looks like they should, but the normal looks so wierd. Why is this?

    Read the article

  • How to utilize the minimax algorithm in a checkers game

    - by engineer
    I am sorry, as there are already many articles about this, but I simply can't get it. I am confused about the implementation of the AI. I have generated all possible moves for the computer's pieces. Now I can't decide on the flow: do I need to loop over the possible moves of each piece and assign a score to each, or does something else need to be done? Kindly tell me the proper flow/algorithm for this. Thanks.
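
    The usual flow, sketched below: for each legal move, apply it to a copy of the board, recurse for the opponent with reduced depth, and back the scores up, keeping the move with the best score at the top level. C# is used only for illustration; Board, Move, GetLegalMoves, Apply, Evaluate and IsGameOver are assumed helpers, not a specific library:

      using System;

      // Plain depth-limited minimax without pruning. Positive scores favour the
      // computer, negative scores favour the human.
      static int Minimax(Board board, int depth, bool computerToMove)
      {
          if (depth == 0 || board.IsGameOver())
              return board.Evaluate();          // static score of the position

          int best = computerToMove ? int.MinValue : int.MaxValue;
          foreach (Move move in board.GetLegalMoves(computerToMove))
          {
              Board next = board.Apply(move);   // apply to a copy, not the real board
              int score = Minimax(next, depth - 1, !computerToMove);
              best = computerToMove ? Math.Max(best, score) : Math.Min(best, score);
          }
          return best;
      }

      // Top level: try each of the computer's moves and keep the highest-scoring one.
      static Move ChooseMove(Board board, int depth)
      {
          Move bestMove = null;
          int bestScore = int.MinValue;
          foreach (Move move in board.GetLegalMoves(true))
          {
              int score = Minimax(board.Apply(move), depth - 1, false);
              if (score > bestScore) { bestScore = score; bestMove = move; }
          }
          return bestMove;
      }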

    Read the article

  • Typical Applications of Linear System Solvers in Game Development

    - by craftsman.don
    I am going to write a custom solver for linear systems. I would like to survey the typical problems in games that involve solving a linear system, so that I can apply custom optimizations based on the shape of the matrix. Currently I am focused on these problems: B-spline editing (I use a linear solve to enforce C0, C1, and C2 continuity) and constraints in simulation (especially position constraints, e.g. cloth). Both of these produce banded matrices. I want to hear about some other applications of linear systems in games. Thank you.
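
    As a small illustration of why the banded shape is worth exploiting: a tridiagonal system (the simplest banded case, roughly what uniform B-spline continuity conditions produce) can be solved in O(n) with the Thomas algorithm instead of general O(n^3) elimination. A C# sketch, not tied to any engine; it assumes the usual diagonal dominance and does no pivoting:

      // Solves a[i]*x[i-1] + b[i]*x[i] + c[i]*x[i+1] = d[i].
      // a[0] and c[n-1] are unused; the arrays are modified in place for brevity.
      static double[] SolveTridiagonal(double[] a, double[] b, double[] c, double[] d)
      {
          int n = d.Length;

          // Forward sweep: eliminate the sub-diagonal.
          for (int i = 1; i < n; i++)
          {
              double m = a[i] / b[i - 1];
              b[i] -= m * c[i - 1];
              d[i] -= m * d[i - 1];
          }

          // Back substitution.
          var x = new double[n];
          x[n - 1] = d[n - 1] / b[n - 1];
          for (int i = n - 2; i >= 0; i--)
              x[i] = (d[i] - c[i] * x[i + 1]) / b[i];
          return x;
      }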

    Read the article

  • Cocos2d sprite's parent not reflecting true scale value

    - by Paul Renton
    I am encountering issues with determining a CCSprite's parent node's scale value. In my game I have a class that extends CCLayer and scales itself based on game triggers. Certain child sprites of this CCLayer have mathematical calculations that become inaccurate once I scale the parent CCLayer. For instance, I have a tank sprite that needs to determine its firing point within the parent node. Whenever I scale the layer and ask the layer for its scale values, they are accurate. However, when I poll the sprites contained within the layer for their parent's scale values, they always appear as one. // From within the sprite CCLOG(@"ChildSprite-> Parent's scale values are scaleX: %f, scaleY: %f", self.parent.scaleX, self.parent.scaleY); // Outputs 1.0,1.0 // From within the layer CCLOG(@"Layer-> ScaleX : %f, ScaleY: %f , SCALE: %f", self.scaleX, self.scaleY, self.scale); // Output is 0.80,0.80 Could anyone explain to me why this is the case? I don't understand why these values are different. Maybe I don't understand the inner design of Cocos2d fully. Any help is appreciated.

    Read the article

  • Why does the order of uniforms get changed by the compiler?

    - by Aybe
    I have the following shader. Everything works fine when setting the value of one of the matrices, but I've discovered that getting a value back is incorrect for View and Projection: they are in reverse order. #version 430 precision highp float; layout (location = 0) uniform mat4 Model; layout (location = 1) uniform mat4 View; layout (location = 2) uniform mat4 Projection; layout (location = 0) in vec3 in_position; layout (location = 1) in vec4 in_color; out vec4 out_color; void main(void) { gl_Position = Projection * View * Model * vec4(in_position, 1.0); out_color = in_color; } When querying their locations they are effectively reversed. I did a small test: renaming View to Piew, which puts it before Projection when sorted alphabetically, makes the order correct. And if I remove layout (location = ...) from the uniforms, the problem disappears!? I am starting to think that this is a driver bug, as explained in the wiki. Do you know why the order of the uniforms is changed whenever the shader is compiled? (I'm using an AMD HD7850.)
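
    One way to take the guessing out of the application side is to query each uniform by name instead of assuming sequential locations; with explicit layout(location = N) qualifiers the queried values should match the shader, and if they do not, that points at the compiler/driver. A C-style sketch (the asker's host code isn't shown, so the matrix pointers here are placeholders):

      // Ask the driver which locations it actually assigned.
      GLint modelLoc      = glGetUniformLocation(program, "Model");
      GLint viewLoc       = glGetUniformLocation(program, "View");
      GLint projectionLoc = glGetUniformLocation(program, "Projection");
      printf("Model=%d View=%d Projection=%d\n", modelLoc, viewLoc, projectionLoc);

      // Upload through the queried locations rather than hard-coded 0/1/2.
      glUniformMatrix4fv(modelLoc,      1, GL_FALSE, modelMatrix);
      glUniformMatrix4fv(viewLoc,       1, GL_FALSE, viewMatrix);
      glUniformMatrix4fv(projectionLoc, 1, GL_FALSE, projectionMatrix);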

    Read the article

  • I can't figure out how to animate my loaded model with Assimp

    - by Brendan Webster
    I have loaded a model into my C++ OpenGL game. It is a COLLADA file, and I set up an animation for it in Blender. The problem is I don't know how to animate the model. The Assimp documentation didn't really help me out, their source code didn't use animations, and I can't seem to find anywhere online where someone explains how to animate a loaded model... I'm sort of wondering if someone could link me to a helpful website, or maybe just help me out, so that I can understand how to do animations with Assimp.

    Read the article

  • In a tower defense game, how do I do buffs/debuffs?

    - by Gabe
    The question is at the very bottom. If you understand buffs/debuffs in tower defense games then you should skip the bulk of this question and go to the bottom (separated with the long line). I plan on making an iPhone TD game. The fact that it's an iPhone game isn't relevant, but I am coding in Objective-C with Cocos2D. I am relatively inexperienced in the field of game design, so I'm looking for some advice from someone experienced in this field. In tower defense, there are two things that are relevant to my question: towers and enemies (both have their own classes/children). They each have stats like HP, damage, speed, etc. I want to add buffs/debuffs, for instance: towers A, B and C each have 15 base damage. Tower D would be a buff tower with no damage, a tower with an AOE (area of effect) aura that gives 10% damage to all towers in range. Tower E might slow enemies in its AOE, a debuff. Stuff like that. The same could go for enemies. Enemy A is a boss that has a slow aura that affects towers and slows their base attack speed, or something along those lines. So the question is: what would be the most effective way to implement this? If it was just towers then I would just mess around with the tower classes, but since tower classes and enemy classes are both affected, should I make a buff class? TD games can consume quite a bit of memory with large amounts of creeps and towers, and I feel like buffs would also consume quite a bit... so I'm trying to be as efficient as possible.
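
    One minimal sketch of the "buff class" idea, assuming towers and enemies both keep base stats plus a list of active buffs (none of this is Cocos2D API; the names are made up):

      // A buff is just a bundle of multipliers plus a remaining duration, so the
      // same object can serve as a tower aura or an enemy debuff.
      @interface Buff : NSObject
      @property (nonatomic) float damageMultiplier;   // 1.10f = +10% damage
      @property (nonatomic) float speedMultiplier;    // 0.80f = 20% slow
      @property (nonatomic) float remainingTime;      // seconds; <= 0 means expired
      @end

      @implementation Buff
      @end

      // On the unit side, effective stats are recomputed from the base values and
      // whatever buffs are currently active, for example:
      //
      //   float damage = self.baseDamage;
      //   for (Buff *buff in self.activeBuffs)
      //       damage *= buff.damageMultiplier;
      //
      // An aura tower then only adds or removes a single shared Buff instance from
      // units entering or leaving its radius, instead of editing each unit's stats.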

    Read the article

  • 2D Skeletal Animation Transformations

    - by Brad Zeis
    I have been trying to build a 2D skeletal animation system for a while, and I believe that I'm fairly close to finishing. Currently, I have the following data structures: struct Bone { Bone *parent; int child_count; Bone **children; double x, y; }; struct Vertex { double x, y; int bone_count; Bone **bones; double *weights; }; struct Mesh { int vertex_count; Vertex **vertices; Vertex **tex_coords; } Bone->x and Bone->y are the coordinates of the end point of the Bone. The starting point is given by (bone->parent->x, bone->parent->y) or (0, 0). Each entity in the game has a Mesh, and Mesh->vertices is used as the bounding area for the entity. Mesh->tex_coords are texture coordinates. In the entity's update function, the position of the Bone is used to change the coordinates of the Vertices that are bound to it. Currently what I have is: void Mesh_update(Mesh *mesh) { int i, j; double sx, sy; for (i = 0; i < vertex_count; i++) { if (mesh->vertices[i]->bone_count == 0) { continue; } sx, sy = 0; for (j = 0; j < mesh->vertices[i]->bone_count; j++) { sx += (/* ??? */) * mesh->vertices[i]->weights[j]; sy += (/* ??? */) * mesh->vertices[i]->weights[j]; } mesh->vertices[i]->x = sx; mesh->vertices[i]->y = sy; } } I think I have everything I need, I just don't know how to apply the transformations to the final mesh coordinates. What tranformations do I need here? Or is my approach just completely wrong?

    Read the article

  • Finite state machine in C++

    - by Electro
    So, I've read a lot about using FSMs for game state management: things like what an FSM is, and using a stack or set of states to build one. I've gone through all that, but I'm stuck at writing an actual, well-designed implementation of an FSM for that purpose. Specifically, how does one cleanly resolve the problem of transitioning between states, (how) should a state be able to use data from other states, and so on? Does anyone have any tips on designing and writing an implementation in C++, or better yet, code examples?
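
    Since code examples were asked for, here is one small, common shape for it (a sketch, not from any particular engine): states are polymorphic objects owned by the machine, a state requests a transition by returning the next state from update, and data that several states need lives in a shared context passed to every call rather than inside the states themselves.

      #include <memory>

      struct GameContext;   // shared data (score, input, resources) passed to states

      class State {
      public:
          virtual ~State() = default;
          virtual void enter(GameContext&) {}
          virtual void exit(GameContext&) {}
          // Returns the next state, or nullptr to stay in the current one.
          virtual std::unique_ptr<State> update(GameContext& ctx, float dt) = 0;
      };

      class StateMachine {
      public:
          explicit StateMachine(std::unique_ptr<State> initial) : current(std::move(initial)) {}

          void start(GameContext& ctx) { current->enter(ctx); }

          void update(GameContext& ctx, float dt) {
              if (auto next = current->update(ctx, dt)) {
                  current->exit(ctx);        // old state cleans up,
                  current = std::move(next); // then the new one takes over
                  current->enter(ctx);
              }
          }

      private:
          std::unique_ptr<State> current;
      };

    Because states only ever see the context, they never reach into each other; anything two states both need gets promoted into the context object.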

    Read the article

  • Making entire scene fade to grayscale

    - by Fibericon
    When the player loses all of their lives, I want the entire game screen to go grayscale, but not stop updating immediately. I'd also prefer it to fade to grayscale instead of suddenly losing all color. Everything I've found so far is either about taking a screenshot and making it grayscale, or about making a specific texture grayscale. Is there a way to change the entire playing field and all the objects within it to grayscale without iterating through everything?
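
    One common XNA approach (a sketch, not the only way): render the scene to a RenderTarget2D as usual, then draw that texture with SpriteBatch through a pixel shader that blends each pixel toward its luminance by a fade amount you increase a little every frame; the game keeps updating, only the final presentation changes. The effect below is illustrative, and FadeAmount is a made-up parameter name set from C# with effect.Parameters["FadeAmount"].SetValue(fade):

      // FadeAmount: 0 = full colour, 1 = fully grayscale.
      float FadeAmount;
      sampler SceneSampler : register(s0);

      float4 GrayscalePS(float2 texCoord : TEXCOORD0) : COLOR0
      {
          float4 color = tex2D(SceneSampler, texCoord);
          // Standard luminance weights for perceived brightness.
          float gray = dot(color.rgb, float3(0.299, 0.587, 0.114));
          color.rgb = lerp(color.rgb, float3(gray, gray, gray), FadeAmount);
          return color;
      }

      technique GrayscaleFade
      {
          pass P0
          {
              PixelShader = compile ps_2_0 GrayscalePS();
          }
      }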

    Read the article

  • Vector Graphics in DirectX

    - by Doug
    I'm curious about people's thoughts on the best way to use vector graphics in a DirectX game instead of rasterized textures (think Super Meat Boy). I want to remain resolution independent and don't want to downscale/upscale rasterized graphics. The idea would be for all assets to be vector graphics (again, think Super Meat Boy). I've looked at Valve's paper "Improved Alpha-Tested Magnification for Vector Textures and Special Effects" and also at using shaders: http://http.developer.nvidia.com/GPUGems3/gpugems3_ch25.html. I'm wondering if anyone has done something similar, or has an alternative approach. Cheers

    Read the article

  • Physics like Asteroids

    - by user2933016
    I try to make a ship that has the physic properties like asteroides. I have this for now(All in Java): Ship.class public class Ship { public static final float sMaxHealth = 0.1F; public static final float sMaxMoveVelocity = 5.0F; public static final float sMaxAngleVelocity = 20.0F; public static final float sRadius = 1.0F; public static final float sMoveDeceleration = 10.0F; public static final float sMoveAcceleration = 2.0F; public static final float sAngleDeceleration = 15.0F; public static final float sAngleAcceleration = 20.0F; private float mHealth; private float mXVelocity; private float mYVelocity; private float mAngleVelocity; private float mX; private float mY; private float mAngle; } (I let the getter and setter away for now) Controller code // Player input if(Gdx.input.isKeyPressed(Keys.UP)) { mPlayer.setXVelocity(mPlayer.getXVelocity() + (float) Math.cos(mPlayer.getAngle()) * Ship.sMoveAcceleration); mPlayer.setYVelocity(mPlayer.getYVelocity() + (float) Math.sin(mPlayer.getAngle()) * Ship.sMoveAcceleration); } if(Gdx.input.isKeyPressed(Keys.LEFT)) { mPlayer.setAngleVelocity(mPlayer.getAngleVelocity() + Ship.sAngleAcceleration * pDeltaTime); } if(Gdx.input.isKeyPressed(Keys.RIGHT)) { mPlayer.setAngleVelocity(mPlayer.getAngleVelocity() - Ship.sAngleAcceleration * pDeltaTime); } // X velocity if(mPlayer.getXVelocity() < 0) { if(-mPlayer.getXVelocity() > Ship.sMaxMoveVelocity) { mPlayer.setXVelocity(-Ship.sMaxMoveVelocity); } mPlayer.setXVelocity(mPlayer.getXVelocity() + Ship.sMoveDeceleration * pDeltaTime); if(mPlayer.getXVelocity() > 0) { mPlayer.setXVelocity(0); } } else if(mPlayer.getXVelocity() > 0) { if(mPlayer.getXVelocity() > Ship.sMaxMoveVelocity) { mPlayer.setXVelocity(Ship.sMaxMoveVelocity); } mPlayer.setXVelocity(mPlayer.getXVelocity() - Ship.sMoveDeceleration * pDeltaTime); if(mPlayer.getXVelocity() < 0) { mPlayer.setXVelocity(0); } } // Y velocity if(mPlayer.getYVelocity() < 0) { if(-mPlayer.getYVelocity() > Ship.sMaxMoveVelocity) { mPlayer.setYVelocity(-Ship.sMaxMoveVelocity); } mPlayer.setYVelocity(mPlayer.getYVelocity() + Ship.sMoveDeceleration * pDeltaTime); if(mPlayer.getYVelocity() > 0) { mPlayer.setYVelocity(0); } } else if(mPlayer.getYVelocity() > 0) { if(mPlayer.getYVelocity() > Ship.sMaxMoveVelocity) { mPlayer.setYVelocity(Ship.sMaxMoveVelocity); } mPlayer.setYVelocity(mPlayer.getYVelocity() - Ship.sMoveDeceleration * pDeltaTime); if(mPlayer.getYVelocity() < 0) { mPlayer.setYVelocity(0); } } // Angle velocity if(mPlayer.getAngleVelocity() < 0) { if(-mPlayer.getAngleVelocity() > Ship.sMaxAngleVelocity) { mPlayer.setAngleVelocity(-Ship.sMaxAngleVelocity); } mPlayer.setAngleVelocity(mPlayer.getAngleVelocity() + Ship.sAngleDeceleration * pDeltaTime); if(mPlayer.getAngleVelocity() > 0) { mPlayer.setAngleVelocity(0); } } else if(mPlayer.getAngleVelocity() > 0) { if(mPlayer.getAngleVelocity() > Ship.sMaxAngleVelocity) { mPlayer.setAngleVelocity(Ship.sMaxAngleVelocity); } mPlayer.setAngleVelocity(mPlayer.getAngleVelocity() - Ship.sAngleDeceleration * pDeltaTime); if(mPlayer.getAngleVelocity() < 0) { mPlayer.setAngleVelocity(0); } } mPlayer.setX(mPlayer.getX() + mPlayer.getXVelocity() * pDeltaTime); mPlayer.setY(mPlayer.getY() + mPlayer.getYVelocity() * pDeltaTime); mPlayer.setAngle(mPlayer.getAngle() + mPlayer.getAngleVelocity() * pDeltaTime); Why the ship does not behave like in asteroides ? What do I wrong?
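
    For comparison, the classic Asteroids feel comes from very simple integration: thrust accelerates along the facing angle, velocity simply persists between frames (with at most a tiny damping factor), and nothing is clamped or hard-decelerated per axis. A stripped-down Java sketch, not the asker's classes:

      // Minimal Asteroids-style ship integration. Angle is in radians; if input
      // angles are in degrees, convert with Math.toRadians first.
      public class AsteroidsShip {
          double x, y;            // position
          double vx, vy;          // velocity, carried over between frames
          double angle;           // facing, radians

          static final double THRUST = 120.0;   // units / s^2
          static final double TURN_SPEED = 4.0; // radians / s
          static final double DAMPING = 0.995;  // per-frame drag; 1.0 = pure inertia

          void update(boolean thrusting, int turnDirection, double dt) {
              angle += turnDirection * TURN_SPEED * dt;   // turnDirection is -1, 0 or +1

              if (thrusting) {
                  vx += Math.cos(angle) * THRUST * dt;
                  vy += Math.sin(angle) * THRUST * dt;
              }

              // No per-axis clamping or hard deceleration: the ship keeps drifting.
              vx *= DAMPING;
              vy *= DAMPING;

              x += vx * dt;
              y += vy * dt;
          }
      }

    Two things worth double-checking against the code above: Math.cos and Math.sin expect radians, and the hard per-axis deceleration blocks work against the drifting, inertia-driven feel.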

    Read the article

  • Is it possible to extract textures or sprites from compiled game files?

    - by Brian Reindel
    For instance, every map in Portal has what appear to be sprites over a texture indicating the obstacles you'll face (see screenshot). Are these resources compiled into the game as byte code, or is it possible to extract them from the installation files? Obviously I understand the copyright implications, and I am only interested in using them for a recreational project. Instead of recreating them, I wonder if they can be extracted.

    Read the article

  • Algorithmically generating neon layers on a pixel grid

    - by user190929
    In an attempt at a screensaver I am making, I am a fan of neo-like graphics, which, of course, look great against a black background. As I understand it, neon, graphically speaking, is essentially a gradient of a color, brightest in the center, and gets darker proceeding outward. Although, more accurate is similar, but separating it into tubes and glow. The tubes are mostly white, while the glow is where most of the color is seen. Well... the tubes could also be a light variant of the color, you could say. The glow is darker. Anyhow, my question is, how could you generate such things given an initial pattern of pixels that would be the tubes? For example, let's say I want to make a neon 'H'. I, via the libraries, can attain the rectangles of pixels which represent it, but I want to make it look neonized. How could I algorithmically achieve such an effect given a base tube shape and base color? EDIT: ok, I mistated that. Got a bit distracted. My purpose for this was similar to a neon effect, but not. Sorry about that. What I am looking for is something like this: Start with a pattern of pixels: [!][!][!][!][!][!][!][!] [!][!][O][!][!][!][!][!] [!][!][O][O][!][!][!][!] [!][!][!][!][O][!][!][!] [!][!][!][!][!][!][!][!] How to I find the U pixels? [!][E][E][E][!][!][!][!] [!][E][O][E][E][!][!][!] [!][E][O][O][E][E][!][!] [!][E][E][E][O][E][!][!] [!][!][!][E][E][E][!][!] Sorry if that looks bad.

    Read the article

  • How important is Programming for a Level Designer?

    - by WryGrin
    I'm currently attending school in a Level Design program, and I was wondering how important programming really is to being a Level Designer. I'm apparently incapable of learning programming (despite my best efforts), and tend to do very well in all the other courses: 3D modelling, story/character design, narrative and dialogue writing, environmental and conceptual design, etc. I'm wondering if my strengths in those other areas are enough (with practice) to let me become a Level Designer, or whether I'm wasting my time if I can't program. I really want to be a Designer, but I just can't seem to wrap my head around the "language" of programming in general (Java kicks my teeth in even with tutoring and additional work on my own).

    Read the article
