Search Results

  • Architecture of an action multiplayer game from scratch

    - by lcf
    Not sure whether this is a good place to ask (do point me to a better one if it's not), but since what we're developing is a game - here goes. So this is a "real-time" action multiplayer game. I have familiarized myself with concepts like lag compensation, view interpolation, input prediction and pretty much everything that I need for this. I have also prepared a set of prototypes to confirm that I understood everything correctly. My question is about the situation when the game engine must be rewound to the past to find out whether there was a "hit" (sometimes this may involve a whole 'recomputation' of the world from that moment in the past up to the present moment). I already have a piece of code that does it, but it's not as neat as I need it to be. The domain logic of the app (the physics of the game) must be separated from the presentation (render) and infrastructure tools (e.g. the remote server interaction specifics). How do I organize all this? :) Is there any worthwhile open-source implementation I can take a look at? What I'm thinking is something like this: -> Render / User Input -> Game Engine (this is the so-called service layer) -> Processing User Commands & Remote Server -> Domain (Physics) How would you add to this scheme the concept of "ticks" or "interactions" with the possibility to rewind and recalculate "the game"? Remember, I cannot change the Domain/Physics, only the Game Engine. Should I store an array of "World States"? Should they be just some representations of the world, optimized for this purpose somehow (how?), or should they be actual instances of the world (i.e. including behavior and all that)? Has anybody had a similar experience? (I've never worked on a game before, if that matters.)
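
    One common pattern for the rewind part (a minimal sketch, not from the original post; WorldState, InputCommand and simulateTick are illustrative placeholders for whatever the domain layer actually exposes) is to keep a ring buffer of per-tick snapshots plus the inputs applied on each tick, restore the snapshot at or before the reported hit time, and re-simulate forward to the present. The snapshots are plain data copies of the world, not behavior-bearing world instances:

        #include <cstdint>
        #include <cstddef>
        #include <deque>
        #include <vector>

        // Illustrative placeholder types; the real game's domain layer supplies these.
        struct WorldState { /* positions, velocities, ... (value-copyable) */ };
        struct InputCommand { uint32_t tick; int playerId; /* buttons, aim, ... */ };

        // Assumed to be provided by the domain layer: advance one fixed tick.
        WorldState simulateTick(const WorldState& s, const std::vector<InputCommand>& inputs);

        class TickHistory {
        public:
            explicit TickHistory(std::size_t capacity) : capacity_(capacity) {}

            // Called once per fixed tick, after the world has been advanced.
            void record(uint32_t tick, const WorldState& state,
                        const std::vector<InputCommand>& inputs) {
                if (frames_.size() == capacity_) frames_.pop_front();
                frames_.push_back({tick, state, inputs});
            }

            // Restore the snapshot at (or just before) 'tick' and replay forward,
            // returning the corrected present-time state.
            WorldState rewindAndReplay(uint32_t tick) const {
                if (frames_.empty()) return WorldState{};
                std::size_t i = frames_.size();
                while (i > 1 && frames_[i - 1].tick > tick) --i;   // snapshot <= tick
                WorldState s = frames_[i - 1].state;
                for (std::size_t j = i; j < frames_.size(); ++j)   // re-simulate to "now"
                    s = simulateTick(s, frames_[j].inputs);
                return s;
            }

        private:
            struct Frame { uint32_t tick; WorldState state; std::vector<InputCommand> inputs; };
            std::deque<Frame> frames_;
            std::size_t capacity_;
        };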

    Read the article

  • Best way to blend colors in tile lighting? (XNA)

    - by Lemoncreme
    I have made a decent, fast, recursive, colored tile lighting system in my game. It does everything I need except one thing: different colors are not blended at all. Here is my color blend code: return (new Color( (byte)MathHelper.Clamp(color.R / factor, 0, 255), (byte)MathHelper.Clamp(color.G / factor, 0, 255), (byte)MathHelper.Clamp(color.B / factor, 0, 255))); As you can see, it does not take the color already in place into account. color is the color of the previous light, which is weakened by the above code by factor. If I wanted to blend using the color already in place, I would use the variable blend. Here is an example of a blend that I tried that failed, using blend: return (new Color( (byte)MathHelper.Clamp(((color.R + blend.R) / 2) / factor, 0, 255), (byte)MathHelper.Clamp(((color.G + blend.G) / 2) / factor, 0, 255), (byte)MathHelper.Clamp(((color.B + blend.B) / 2) / factor, 0, 255))); This color blend produces inaccurate and strange results. I need a blend that is accurate, like the first example, and that blends the two colors together. What is the best way to do this?
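
    One possibility (a minimal sketch, not from the original post; ColorRGB is a stand-in for XNA's Color, and the example is in C++ for illustration): averaging two lights darkens both of them, so light maps are usually combined with a per-channel maximum or a "screen" blend instead of an average:

        #include <algorithm>
        #include <cstdint>

        struct ColorRGB { uint8_t r, g, b; };   // stand-in for XNA's Color

        // Per-channel maximum: the brighter light wins, lights never darken each other.
        ColorRGB blendMax(ColorRGB a, ColorRGB b) {
            return { std::max(a.r, b.r), std::max(a.g, b.g), std::max(a.b, b.b) };
        }

        // "Screen" blend: 255 - (255-a)*(255-b)/255. Stacked lights approach white,
        // which usually looks more natural than averaging them.
        ColorRGB blendScreen(ColorRGB a, ColorRGB b) {
            auto screen = [](int x, int y) {
                return static_cast<uint8_t>(255 - (255 - x) * (255 - y) / 255);
            };
            return { screen(a.r, b.r), screen(a.g, b.g), screen(a.b, b.b) };
        }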

    Read the article

  • Reversi/Othello early-game evaluation function

    - by Vladislav Il'ushin
    I've written my own Reversi player, based on the MiniMax algorithm with Alpha-Beta pruning, but in the first 10 moves my evaluation function is too slow. I need a good early-game evaluation function. I'm trying to do it with this matrix (corresponding to the board), which determines how favourable each square is to hold: { 30, -25, 10, 5, 5, 10, -25, 30,}, {-25, -25, 1, 1, 1, 1, -25, -25,}, { 10, 1, 5, 2, 2, 5, 1, 10,}, { 5, 1, 2, 1, 1, 2, 1, 5,}, { 5, 1, 2, 1, 1, 2, 1, 5,}, { 10, 1, 5, 2, 2, 5, 1, 10,}, {-25, -25, 1, 1, 1, 1, -25, -25,}, { 30, -25, 10, 5, 5, 10, -25, 30,},}; But it doesn't work well. Have you ever written an early-game evaluation function for Reversi?
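
    A common refinement (a minimal sketch, not from the original post; countLegalMoves and the board encoding are assumed to exist in the engine, and the mobility weight is a guess to tune) is to combine the positional weight matrix with mobility, since the number of available moves matters more than square ownership in the opening:

        // Positional weights as in the question; rows/cols index board squares.
        static const int WEIGHTS[8][8] = {
            { 30, -25, 10,  5,  5, 10, -25,  30},
            {-25, -25,  1,  1,  1,  1, -25, -25},
            { 10,   1,  5,  2,  2,  5,   1,  10},
            {  5,   1,  2,  1,  1,  2,   1,   5},
            {  5,   1,  2,  1,  1,  2,   1,   5},
            { 10,   1,  5,  2,  2,  5,   1,  10},
            {-25, -25,  1,  1,  1,  1, -25, -25},
            { 30, -25, 10,  5,  5, 10, -25,  30},
        };

        // board[r][c]: +1 = my disc, -1 = opponent disc, 0 = empty.
        // countLegalMoves() is assumed to be provided by the existing engine.
        int countLegalMoves(const int board[8][8], int player);

        int evaluateEarlyGame(const int board[8][8]) {
            int positional = 0;
            for (int r = 0; r < 8; ++r)
                for (int c = 0; c < 8; ++c)
                    positional += WEIGHTS[r][c] * board[r][c];

            // Mobility: having more legal moves than the opponent dominates the
            // opening; the factor 8 is a tunable guess.
            int mobility = countLegalMoves(board, +1) - countLegalMoves(board, -1);

            return positional + 8 * mobility;
        }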

    Read the article

  • Vector vs Scalar velocity?

    - by Serguei Fedorov
    I am revamping an engine I have been working on, on and off, for the last few weeks to use a directional vector to dictate direction; this way I can dictate the displacement based on a direction. However, the issue I am trying to overcome is the following: the speed along X and the speed along Y are unrelated to one another. If gravity pulls the object down with an increasing velocity, my velocity along X should not change. This is very easy to implement if my speed is broken into a Vector datatype: Vector.X dictates one direction and Vector.Y dictates the other (assuming we are not concerned about the Z axis). However, this defeats the purpose of the directional vector, because: SpeedX = 10 SpeedY = 15 [1, 1] normalized = ~[0.7, 0.7] [0.7, 0.7] * [10, 15] = [7, 10.5] As you can see, my direction is now "scaled" to my speed, which is no longer the direction that I want to be moving in. I am very new to vector math and this is a learning project for me. I looked around a little bit on the internet, but I still want to figure things out on my own (not just look at an example and copy off it). Is there a way around this? Using a directional vector is extremely useful, but I am a little bit stumped by this problem. I am sorry if my mathematical understanding is completely wrong.
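
    One way to keep the axes independent (a minimal sketch, not from the original post; all names are illustrative) is to store velocity itself as a vector and let each force add to it component-wise. The direction vector is only used when applying an input such as thrust; it never rescales the whole velocity, so gravity keeps accumulating on Y while X is untouched:

        struct Vec2 { float x, y; };

        struct Body {
            Vec2 position{0, 0};
            Vec2 velocity{0, 0};     // X and Y evolve independently
        };

        // Gravity touches only velocity.y; velocity.x is unchanged.
        void applyGravity(Body& b, float g, float dt) {
            b.velocity.y += g * dt;
        }

        // Player input pushes along a normalized facing direction; this adds to the
        // existing velocity instead of replacing it.
        void applyThrust(Body& b, Vec2 facing, float thrust, float dt) {
            b.velocity.x += facing.x * thrust * dt;
            b.velocity.y += facing.y * thrust * dt;
        }

        void integrate(Body& b, float dt) {
            b.position.x += b.velocity.x * dt;
            b.position.y += b.velocity.y * dt;
        }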

    Read the article

  • My rhythm game runs choppy even with high frame rate

    - by felipedrl
    I'm coding a rhythm game and the game runs smoothly with uncapped fps. But when I try to cap it around 60, the game updates in little chunks, like hiccups, as if it was skipping frames or running at a very low frame rate. The reason I need to cap the frame rate is because on some computers I tested, the fps varies a lot (from ~80 to ~250 fps) and those drops are noticeable and degrade response time. Since this is a rhythm game, this is very important. This issue is driving me crazy. I've spent a few weeks already on it and still can't figure out the problem. I hope someone more experienced than me can shed some light on it. I'll try to put here all the hints I've tried along with two pseudocode game loops I tried, so I apologize if this post gets too lengthy. 1st GameLoop: const uint UPDATE_SKIP = 1000 / 60; uint nextGameTick = SDL_GetTicks(); while(isNotDone) { // only false when a QUIT event is generated! if (processEvents()) { if (SDL_GetTicks() > nextGameTick) { update(UPDATE_SKIP); render(); nextGameTick += UPDATE_SKIP; } } } 2nd Game Loop: const uint UPDATE_SKIP = 1000 / 60; while (isNotDone) { LARGE_INTEGER startTime; QueryPerformanceCounter(&startTime); // process events will return false in case of a QUIT event processed if (processEvents()) { update(frameTime); render(); } LARGE_INTEGER endTime; do { QueryPerformanceCounter(&endTime); frameTime = static_cast<uint>((endTime.QuadPart - startTime.QuadPart) * 1000.0 / frequency.QuadPart); } while (frameTime < UPDATE_SKIP); } [1] At first I thought it was a timer resolution problem. I was using SDL_GetTicks, but even when I switched to QueryPerformanceCounter, supposedly finer-grained, I saw no difference. [2] Then I thought it could be due to a rounding error in my position computation, and since game updates are smaller at high FPS that would be less noticeable. Indeed there is a small error, but from my tests I realized that it is not enough to produce the position jumps I'm getting. Also, another intriguing factor is that if I enable vsync I get smooth updates @60fps regardless of the frame cap code. So why not rely on vsync? Because some computers can force it off in the graphics card config. [3] I started printing the maximum and minimum frame time measured over a 1-second span, in the hope that every few frames one would take a long time but still not enough to drop my fps computation. It turns out that, with the frame cap code, I always get frame times in the range of [16, 18] ms, and still, the game "does not move like Jagger". [4] My process's priority is set to HIGH (Windows doesn't allow me to set REALTIME for some reason). As far as I know there is only one thread running along with the game (a sound callback, which I don't really have access to). I'm using AudiereLib. I then disabled Audiere by removing it from the project and still got the issue. Maybe there are some other threads running and one of them is taking too long to come back right in between my frame time measurements, I don't know. Is there a way to know which threads are attached to my process? [5] There is some dynamic data being created while the game runs, but it is a little bit hard to remove it to test. Maybe I'll have to try harder on this one. Well, as I told you, I really don't know what to try next. Anything, I mean, anything would be of great help. What bugs me most is why at 60fps with vsync enabled I get a smooth update and at 60fps without vsync I don't. Is there a way to implement software vsync? I mean, query display sync info? Thanks in advance.
    I appreciate those who got this far, and yet again I apologize for the long post. Best regards from a fellow coder.
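
    The usual cure for exactly this symptom (frame times of 16-18 ms but visible stutter) is a fixed-timestep loop with an accumulator plus interpolation of the rendered state between the last two updates, the widely described "fix your timestep" pattern. A minimal sketch, not from the original post; the timing and game functions are placeholders for the existing engine's:

        #include <cstdint>

        uint64_t nowMicroseconds();                 // e.g. a QueryPerformanceCounter wrapper
        bool processEvents();                       // false on quit
        void update(double dtSeconds);              // advances simulation by a fixed step
        void render(double alpha);                  // draws state blended between the
                                                    // previous and current update by alpha

        void runLoop() {
            const double dt = 1.0 / 60.0;           // fixed simulation step
            double accumulator = 0.0;
            uint64_t previous = nowMicroseconds();

            while (processEvents()) {
                uint64_t current = nowMicroseconds();
                accumulator += (current - previous) / 1e6;
                previous = current;

                if (accumulator > 0.25) accumulator = 0.25;   // avoid spiral of death

                while (accumulator >= dt) {          // run 0..N fixed updates per frame
                    update(dt);
                    accumulator -= dt;
                }

                // Render between the last two simulation states; without this blend,
                // a 60 Hz simulation shown at a different refresh rate stutters even
                // though every frame takes ~16 ms.
                render(accumulator / dt);
            }
        }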

    Read the article

  • Where to implement storable items

    - by James Hay
    I'm creating a multiplayer online trading game. The things that are traded range from raw items to complex products. For example, Steel is a raw item. Mechanical Assembly is a more complex item that requires 2x Steel and maybe 1x Rubber. Then Hydraulics is an item that contains 2x Mechanical Assemblies and 1x Electronics (which is another complex item). So on and so forth. These items will be created by me; players can't create their own items, so it doesn't need to handle arbitrary layers of complexity for items. If my example isn't clear, think Minecraft. You have wooden planks, which can be made into sticks. From there the sticks - combined with metals - can be made into tools. My game has nothing to do with Minecraft or any sandbox building game, but it uses a similar progressive complexity in creating items that I want to have in my game. My question is basically: how do you store something like this, assuming that I will want to add more items in the future? Do you store it in a database or in a separate library that the game uses? EDIT None of the items actually "do" anything; they are simply there to either sell, purchase, or combine with other items to make a more complex item, which can then be sold, purchased or combined... you get the idea. The items themselves would not have any properties, but the instances of the items would. For example, an item that one player has would have a certain "quality" and, if they were selling it, a certain "price". An instance of that same item that a different player had would need to have a different "quality" and "price" if they were selling it. I think the price part will not be required on an individual item, because instead I would have a "sale" object which has a price and contains certain items.
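
    One data-driven sketch (not from the original post; all names and the example recipes are illustrative): define items and their recipes as plain records keyed by id, stored in a database table or a data file the game loads at startup, and keep per-player values such as quality on the owned instance rather than on the shared definition:

        #include <string>
        #include <unordered_map>
        #include <utility>
        #include <vector>

        // Static definition, authored by the designer and shared by everyone.
        struct ItemDef {
            int id;
            std::string name;
            // Recipe: (ingredient item id, quantity). Empty for raw items like Steel.
            std::vector<std::pair<int, int>> recipe;
        };

        // A concrete item owned by a player; per-instance state lives here.
        struct ItemInstance {
            int defId;
            int ownerId;
            float quality;
        };

        // The catalogue could equally be loaded from a DB table or JSON/XML file.
        std::unordered_map<int, ItemDef> makeCatalogue() {
            std::unordered_map<int, ItemDef> defs;
            defs[1] = {1, "Steel", {}};
            defs[2] = {2, "Rubber", {}};
            defs[3] = {3, "Mechanical Assembly", {{1, 2}, {2, 1}}};   // 2x Steel + 1x Rubber
            defs[4] = {4, "Electronics", {}};
            defs[5] = {5, "Hydraulics", {{3, 2}, {4, 1}}};            // 2x Assembly + 1x Electronics
            return defs;
        }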

    Read the article

  • Ignore collisions with some objects in certain contexts

    - by Paul Manta
    I'm making a racing game with cars in Unity. The car has a boost/nitro powerup. While boosting, I don't want the car to be deflected when colliding with zombies, but I do want it to be deflected when colliding with walls. On the other hand, I don't want to ignore collisions with zombies entirely, because I still want to hit them on impact. How should I handle this? Basically, what I want is for the car to not rotate when colliding with certain objects.

    Read the article

  • Interacting with scene cocos2d

    - by cjroebuck
    I'm attempting to make my first cocos2d (for iPhone) multiplayer game and I'm having difficulty understanding how to interact with a scene once it is running. The game is a simple turn-based one, so I have a GameController class which co-ordinates the rounds. I also have a GameScene class which is the actual scene that is displayed during a round of the game. The basic interaction I need is for the GameController to be able to pass messages to the GameScene class, such as StartRound/StopRound etc. The thing that complicates this is that I am loading the GameScene with a LoadingScene class which simply initialises the scene and replaces the current scene with this one, so there is no reference from GameController to GameScene, and passing messages is quite tricky. Does anyone have any ways to get around this? Ideally I would still like to use a loading class, as it smooths out the memory hit when replacing scenes.
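
    One way around the missing reference (a framework-agnostic sketch, not from the original post; all names are illustrative) is to decouple the two with a small observer/notification mechanism: the GameScene subscribes to named messages when it loads, and the GameController posts messages without ever holding a pointer to the scene. Cocoa's NSNotificationCenter can serve the same purpose in a cocos2d-iphone project:

        #include <functional>
        #include <string>
        #include <unordered_map>
        #include <vector>

        // A very small message bus; listeners are plain callbacks.
        class MessageBus {
        public:
            using Handler = std::function<void()>;

            void subscribe(const std::string& message, Handler handler) {
                handlers_[message].push_back(std::move(handler));
            }

            void post(const std::string& message) {
                for (auto& h : handlers_[message]) h();
            }

        private:
            std::unordered_map<std::string, std::vector<Handler>> handlers_;
        };

        // The scene's init/onEnter would do something like:
        //     bus.subscribe("StartRound", [this] { startRound(); });
        //     bus.subscribe("StopRound",  [this] { stopRound(); });
        // The controller, which never sees the scene, just posts:
        //     bus.post("StartRound");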

    Read the article

  • cocos2d-x simple shader usage [on hold]

    - by Narek
    I want to obtain color ramp effect from this tutorial: http://www.raywenderlich.com/10862/how-to-create-cool-effects-with-custom-shaders-in-opengl-es-2-0-and-cocos2d-2-x Here is my code in cocos2d-x 3: bool HelloWorld::init() { ////////////////////////////// // 1. super init first if ( !Layer::init() ) { return false; } Vec2 origin = Director::getInstance()->getVisibleOrigin(); sprite = Sprite::create("HelloWorld.png"); sprite->setAnchorPoint(Vec2(0, 0)); sprite->setRotation(3); sprite->setPosition(origin); addChild(sprite); std::string str = FileUtils::getInstance()->getStringFromFile("CSEColorRamp.fsh"); const GLchar * fragmentSource = str.c_str(); GLProgram* p = GLProgram::createWithByteArrays(ccPositionTextureA8Color_vert, fragmentSource); p->bindAttribLocation(GLProgram::ATTRIBUTE_NAME_POSITION, GLProgram::VERTEX_ATTRIB_POSITION); p->bindAttribLocation(GLProgram::ATTRIBUTE_NAME_TEX_COORD, GLProgram::VERTEX_ATTRIB_TEX_COORD); p->link(); p->updateUniforms(); sprite->setGLProgram(p); // 3 colorRampUniformLocation = glGetUniformLocation(sprite->getGLProgram()->getProgram(), "u_colorRampTexture"); glUniform1i(colorRampUniformLocation, 1); // 4 colorRampTexture = Director::getInstance()->getTextureCache()->addImage("colorRamp.png"); colorRampTexture->setAliasTexParameters(); // 5 sprite->getGLProgram()->use(); glActiveTexture(GL_TEXTURE1); glBindTexture(GL_TEXTURE_2D, colorRampTexture->getName()); glActiveTexture(GL_TEXTURE0); return true; } And here is the fragment shader as it is in the tutorial: #ifdef GL_ES precision mediump float; #endif // 1 varying vec2 v_texCoord; uniform sampler2D u_texture; uniform sampler2D u_colorRampTexture; void main() { // 2 vec3 normalColor = texture2D(u_texture, v_texCoord).rgb; // 3 float rampedR = texture2D(u_colorRampTexture, vec2(normalColor.r, 0)).r; float rampedG = texture2D(u_colorRampTexture, vec2(normalColor.g, 0)).g; float rampedB = texture2D(u_colorRampTexture, vec2(normalColor.b, 0)).b; // 4 gl_FragColor = vec4(rampedR, rampedG, rampedB, 1); } As a result I get a black screen with 2 draw calls. What is wrong? Do I miss something?

    Read the article

  • 2D Tile-based terrain generation

    - by a240
    As a summer project I decided it would be fun to make a Flash game. Right now I'm going for something like the look of http://www.terraria.org/. It's been a lot of fun, but today I've hit a snag. I need a way to generate my worlds. I've read up on Perlin noise ( http://freespace.virgin.net/hugo.elias/models/m_perlin.htm ) as a possibility, but my attempts have given me sporadic-looking results. What are some techniques used to generate these 2D tile-based worlds? Ideally I would like to be able to generate mountains, plains, and caves.
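
    A common approach for this style of world (a minimal sketch, not from the original post; all parameters are illustrative) is to build a 1D heightmap first by summing a few octaves of interpolated noise, fill every tile below it with ground, and then carve caves in a second pass (for example with 2D noise or random walks):

        #include <cmath>
        #include <vector>

        // Deterministic pseudo-random value in [0,1) for an integer coordinate.
        static float noise1D(int x, unsigned seed) {
            unsigned n = static_cast<unsigned>(x) * 374761393u + seed * 668265263u;
            n = (n ^ (n >> 13)) * 1274126177u;
            return (n & 0xFFFFFF) / float(0x1000000);
        }

        // Cosine-interpolated noise sampled at a fractional position.
        static float smoothNoise(float x, unsigned seed) {
            int xi = static_cast<int>(std::floor(x));
            float t = x - xi;
            float a = noise1D(xi, seed), b = noise1D(xi + 1, seed);
            float f = (1.0f - std::cos(t * 3.14159265f)) * 0.5f;
            return a * (1.0f - f) + b * f;
        }

        // Sum several octaves: low frequencies give mountains/plains, high ones detail.
        std::vector<int> buildHeightmap(int width, int baseHeight, unsigned seed) {
            std::vector<int> heights(width);
            for (int x = 0; x < width; ++x) {
                float h = 0.0f;
                h += smoothNoise(x / 64.0f, seed)     * 40.0f;   // large hills
                h += smoothNoise(x / 16.0f, seed + 1) * 10.0f;   // medium bumps
                h += smoothNoise(x / 4.0f,  seed + 2) * 3.0f;    // surface roughness
                heights[x] = baseHeight + static_cast<int>(h);
            }
            return heights;   // tiles below heights[x] in column x become ground
        }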

    Read the article

  • How to get a point to the left/right of a vector

    - by MulletDevil
    I have a position vector of a point in space and a quaternion for its rotation. What I'm trying to calculate is a point to the left and a point to the right. I have the position and rotation (quaternion) of the red dot. What I want is to get the position of the green dots. I have a float value for the distance I want these points to be from it. With only the position and rotation, is it possible to get a unit direction vector pointing left/right which I can multiply by my float value? Edit: I also know the original direction vector.
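
    Yes: rotate a world-space right axis (for example +X) by the point's quaternion to get the local right vector, then add and subtract it scaled by the distance. A minimal sketch, not from the original post, with the quaternion-vector rotation v' = q v q^-1 written out without a math library (which local axis counts as "right" is an assumption to adjust to your convention):

        struct Vec3 { float x, y, z; };
        struct Quat { float w, x, y, z; };   // assumed unit-length

        // Rotate v by unit quaternion q: v' = v + 2*cross(u, cross(u, v) + w*v), u = (x,y,z).
        Vec3 rotate(const Quat& q, const Vec3& v) {
            Vec3 u{q.x, q.y, q.z};
            Vec3 c1{u.y * v.z - u.z * v.y + q.w * v.x,
                    u.z * v.x - u.x * v.z + q.w * v.y,
                    u.x * v.y - u.y * v.x + q.w * v.z};
            Vec3 c2{u.y * c1.z - u.z * c1.y,
                    u.z * c1.x - u.x * c1.z,
                    u.x * c1.y - u.y * c1.x};
            return {v.x + 2 * c2.x, v.y + 2 * c2.y, v.z + 2 * c2.z};
        }

        // Points to the left/right of 'position', 'distance' units away.
        void sidePoints(const Vec3& position, const Quat& rotation, float distance,
                        Vec3& outLeft, Vec3& outRight) {
            Vec3 right = rotate(rotation, Vec3{1, 0, 0});   // local right axis
            outRight = {position.x + right.x * distance,
                        position.y + right.y * distance,
                        position.z + right.z * distance};
            outLeft  = {position.x - right.x * distance,
                        position.y - right.y * distance,
                        position.z - right.z * distance};
        }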

    Read the article

  • [JOGL] My program is too slow, how can I profile it with Eclipse?

    - by nkint
    Hi guys, my simple OpenGL program is really too slow and not fluid. I'm rendering 30 spheres with simple illumination and simple materials. The only hard(?) computing I do is collision detection between the mouse ray and the spheres (that works OK, and I only do it in mouseMoved). I have no threads, only an Animator to move the spheres. How can I profile my JOGL project? Or maybe (most probably) I have some OpenGL instruction that I don't understand and that makes the rendering particularly accurate (or back-face rendering that I don't need, or whatever; I don't know exactly, I've just entered the OpenGL world).

    Read the article

  • How to make Pokémon White 3D effect?

    - by Pipo
    I'm just wondering how to create a 3D effect similar to Pokémon White/Black. It seems not to be polygon-based, but created just with sprites. If the perspective changes, the sprites stay sharp and don't get blurred. How can I achieve this? Source: https://www.youtube.com/watch?v=fZEPUPYOnRc&feature=youtube_gdata_player Edit: Wow, two downvotes because I used a video instead of screenshots? Don't get me wrong, I thank you, because you want to help me, but the 3D effect can be better understood in motion. Anyway, here is a screenshot: http://wearearcade.com/wp-content/uploads/2011/03/pokemon-black-white-starter-town.jpg So, if this is a hardware limitation, how can I achieve this on different hardware, e.g. in an HTML5 game? Thank you.

    Read the article

  • Physics from other games

    - by Carlosrdz1
    I'm making a platform engine with XNA Game Studio, and I've solved almost all of my collision issues. But now I'm searching for good physics for the player: I'm trying to emulate characters from other games, like Mario from Super Mario World, or Mega Man X... do you know a website or something where the physics of those games are documented? I remember seeing a page with something like that. Or what process do you think is best for emulating physics from other games? Just trial and error? Thank you.

    Read the article

  • One Step-Ahead A-Star

    - by Jonathan Dickinson
    I am attempting to create a server-centric RTS (as opposed to the usual parallel synchronised simulation route of most RTS games today) - however I am still leveraging the discrete N-turns-ahead paradigm discussed by one of the AOE developers on Gamasutra. I have [possibly questionably?] decided that the pathfinding should only ever find the next cell the entity needs to move to, and was wondering if anyone has any clever ideas on how to optimize the algorithm for this specific scenario - or any other ideas on how to keep the pathfinding as lean as possible on the server. I have investigated a few possible algorithms but could only come up with one adaptation: Tiered A-Star - relatively large T1 tiles, work out (and cache) each cell as you enter it. Other than that: doing the full A-Star pass and caching the entire path, which might use too much memory if a large number of units are present. I know about the existence of naive progressive pathfinding algorithms (if you hit a block, turn in the direction closest to your target, etc.) but they suffer from infinite feedback loops - and very poor pathing even if visited blocks are memorised. Not an option. Many thanks.
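
    For many units converging on a few targets, a flow field fits this "next cell only" model well: the server runs a single BFS/Dijkstra flood from each target over the grid, and every unit just steps to the neighbouring cell with the lowest cost, which is O(1) per unit per tick. A minimal sketch, not from the original post, on a uniform-cost grid (swap in Dijkstra if cells have weights):

        #include <queue>
        #include <vector>

        // Grid of walkable flags, row-major; returns, for every cell, its distance to
        // the goal (or -1 if unreachable). Units move to the neighbour with the
        // smallest distance.
        std::vector<int> buildFlowField(const std::vector<bool>& walkable,
                                        int width, int height, int goalX, int goalY) {
            std::vector<int> dist(walkable.size(), -1);
            std::queue<int> frontier;
            int goal = goalY * width + goalX;
            dist[goal] = 0;
            frontier.push(goal);

            const int dx[4] = {1, -1, 0, 0};
            const int dy[4] = {0, 0, 1, -1};

            while (!frontier.empty()) {
                int cur = frontier.front(); frontier.pop();
                int cx = cur % width, cy = cur / width;
                for (int i = 0; i < 4; ++i) {
                    int nx = cx + dx[i], ny = cy + dy[i];
                    if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
                    int n = ny * width + nx;
                    if (!walkable[n] || dist[n] != -1) continue;
                    dist[n] = dist[cur] + 1;
                    frontier.push(n);
                }
            }
            return dist;
        }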

    Read the article

  • Java: Reflection Packet Builder using getField()

    - by Matchlighter
    So I just finished writing a packet builder that dynamically loads data into a data stream which is then sent out. Each builder operates by finding fields in its class (and its superclasses) that are marked with an @data annotation. Upon finishing the builder, I remembered that getFields() does not return fields in "any specific order". I quite like my builder because it allows for quite simple, yet hard-typed packets. Could this implementation be a problem? What would be the best next step to keep the simplicity - sort the fields alphabetically?

    Read the article

  • In concept, how is animation done?

    - by sharethis
    My first approaches to animation in my game relied mostly on sine and cosine functions with time as the parameter. For a jump, a perfect sine function is acceptable, but for motions of arms, weapons or faces it would look quite unnatural. Moreover, piecing every animation together out of sines and cosines soon reaches its limits. I have already heard of skeletons and rigging. Although I haven't been able to implement skeletal animation myself, I can't imagine that the quite natural animations in major games are made of static predefined motion states. So, in general, how is animation done today?
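
    In most engines today, animation is keyframed: artists author a handful of poses at specific times (per bone, in skeletal animation), and at runtime the engine interpolates between the two keyframes that bracket the current time. A minimal sketch of that core idea for a single animated value, not from the original post (a full skeleton does the same per joint, using quaternion slerp for rotations):

        #include <vector>

        struct Keyframe {
            float time;    // seconds from the start of the clip
            float value;   // e.g. one joint angle; real clips store position/rotation/scale
        };

        // Sample a clip at time t by linearly interpolating the surrounding keyframes.
        // Assumes the clip has at least one keyframe, sorted by time.
        float sample(const std::vector<Keyframe>& clip, float t) {
            if (t <= clip.front().time) return clip.front().value;
            if (t >= clip.back().time)  return clip.back().value;

            for (std::size_t i = 1; i < clip.size(); ++i) {
                if (t < clip[i].time) {
                    const Keyframe& a = clip[i - 1];
                    const Keyframe& b = clip[i];
                    float alpha = (t - a.time) / (b.time - a.time);
                    return a.value + (b.value - a.value) * alpha;   // lerp
                }
            }
            return clip.back().value;
        }

        // Example: an arm swing authored with three keyframes instead of a sine curve.
        // std::vector<Keyframe> swing = {{0.0f, -30.0f}, {0.4f, 45.0f}, {1.0f, -30.0f}};
        // float angle = sample(swing, animationTime);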

    Read the article

  • Examples of good Javascript/HTML5 based games

    - by Zuch
    Now that Flash is largely being replaced with HTML5 elements (video, audio, canvas, etc.), are there any good examples of web-based games built on completely open standards (meaning JavaScript, HTML and CSS)? I see a lot of examples of pure HTML5 implementations of what was once only possible in Flash (like the stuff here: http://www.html5rocks.com/) but not many games, a domain which still seems dominated by Flash. I'm curious what's possible and what the limitations are.

    Read the article

  • Music for Kids Game!

    - by Dane
    I'm developing multimedia software for kindergarten kids. It introduces them to animals, the alphabet, simple math and colors, and it contains some simple games. Music is crucial for my project, and it is very important to choose the right sort of music for different sections. But unfortunately I know nothing about music. Is there a music consulting firm which can help me choose melodies and rhythms for my project from the free music available on the internet? My budget is limited, but as this is mandatory and I have no knowledge of or taste in music, I think I can afford to pay for this.

    Read the article

  • What are the pro/cons of Unity3D as a choice to make games?

    - by jokoon
    We are doing our school project with Unity3D, since they were using Shiva the previous year (which seems horrible to me), and I wanted to know your point of view on this tool. Pros: multi-platform (I even heard Google is going to implement it in Chrome); everything you need is there; scripting languages make it a good choice for people who are not programming gurus. Cons: multiplayer?; it's proprietary, so you are totally dependent on Unity and its limits and can't extend it; it's less "making a game from scratch"; C++ would have been a cool thing. I really think this kind of tool is interesting, but is it worth using at school for a project that involves more than 3 programmers? What do we really learn in terms of programming from using this kind of tool (I'm OK with Python and JS, but I hate C#)? We could have used Ogre instead, even if we were only going to start learning DirectX in January...

    Read the article

  • Scaling along an arbitrary axis (Dealing with non-uniform scale)

    - by Jon
    I'm trying to build my own little engine to get more familiar with the concepts of 3D programming. I have a transform class that, on each frame, creates a Scaling Matrix (S) and a Rotation Matrix from a Quaternion (R) and concatenates them together (S*R). Once I have SR, I insert the translation values into the bottom row of the first three columns. So I end up with a transformation matrix that looks like: [SR SR SR 0] [SR SR SR 0] [SR SR SR 0] [tx ty tz 1] This works perfectly in all cases except when rotating an object that has a non-uniform scale. For example, a unit cube with ScaleX = 4, ScaleY = 2, ScaleZ = 1 will give me a rectangular box that is 4 times as wide as the depth and twice as high as the depth. If I then translate this around, the box stays the same and looks normal. The problem happens whenever I try to rotate this scaled box. The shape itself becomes distorted, and it appears as though the scale factors are affecting the object along the world X, Y, Z axes rather than the local X, Y, Z axes of the object. I've done some pretty extensive research through a variety of textbooks (Eberly, Moller/Hoffman, Pharr etc.) and there isn't a ton there to go off of. Online, most of the answers say to avoid non-uniform scaling; I understand the desire to avoid it, but I'd still like to figure out how to support it. The only thing I can think of is that when constructing a Scale Matrix: [sx 0 0 0] [0 sy 0 0] [0 0 sz 0] [0 0 0 1] this is scaling along the world axes instead of the object's local Direction, Up and Right vectors, or its local Z, Y, X axes. Does anyone have any tips or ideas on how to construct a transformation matrix that allows for non-uniform scaling and rotation? Thanks!

    Read the article

  • Kinect Click counter function

    - by Sweta Dwivedi
    So I have the following Kinect click function, which checks whether the hand is within the bounds and then clicks using a counter. However, there is a slight problem: the first few button clicks work fine, but after it clicks one of the buttons it changes the game state and immediately clicks the other button without the counter reaching 200. KinectClick is a method in the button class, and each button inside a list can access the KinectClick method. public bool KinectClick(int x,int y) { if ((x >= position.X && x <= position.X + position.Width) && (y >= position.Y && y <= position.Y + position.Height)) { counter++; if (counter > 200) { counter = 0; return true; } } else { counter = 0; } return false; } I check whether this method returns true in the game's Update method, to act as a button click: foreach(Button g_t in Game_theme) { if ((g_t.KinectClick(x_c, y_c) == true || g_t.ButtonClicked() == true) && g_t.name == "animoe") { Selected_anim = true; currentGameState = GameState.InGame; } if ((g_t.KinectClick(x_c, y_c) == true || g_t.ButtonClicked() == true) && g_t.name == "planet") { Selected_planet = true; currentGameState = GameState.InGame; } }

    Read the article

  • GLSL: Strange light reflections

    - by Tom
    According to this tutorial I'm trying to make a normal mapping using GLSL, but something is wrong and I can't find the solution. The output render is in this image: Image1 in this image is a plane with two triangles and each of it is different illuminated (that is bad). The plane has 6 vertices. In the upper left side of this plane are 2 identical vertices (same in the lower right). Here are some vectors same for each vertice: normal vector = 0, 1, 0 (red lines on image) tangent vector = 0, 0,-1 (green lines on image) bitangent vector = -1, 0, 0 (blue lines on image) here I have one question: The two identical vertices does need to have the same tangent and bitangent? I have tried to make other values to the tangents but the effect was still similar. Here are my shaders Vertex shader: #version 130 // Input vertex data, different for all executions of this shader. in vec3 vertexPosition_modelspace; in vec2 vertexUV; in vec3 vertexNormal_modelspace; in vec3 vertexTangent_modelspace; in vec3 vertexBitangent_modelspace; // Output data ; will be interpolated for each fragment. out vec2 UV; out vec3 Position_worldspace; out vec3 EyeDirection_cameraspace; out vec3 LightDirection_cameraspace; out vec3 LightDirection_tangentspace; out vec3 EyeDirection_tangentspace; // Values that stay constant for the whole mesh. uniform mat4 MVP; uniform mat4 V; uniform mat4 M; uniform mat3 MV3x3; uniform vec3 LightPosition_worldspace; void main(){ // Output position of the vertex, in clip space : MVP * position gl_Position = MVP * vec4(vertexPosition_modelspace,1); // Position of the vertex, in worldspace : M * position Position_worldspace = (M * vec4(vertexPosition_modelspace,1)).xyz; // Vector that goes from the vertex to the camera, in camera space. // In camera space, the camera is at the origin (0,0,0). vec3 vertexPosition_cameraspace = ( V * M * vec4(vertexPosition_modelspace,1)).xyz; EyeDirection_cameraspace = vec3(0,0,0) - vertexPosition_cameraspace; // Vector that goes from the vertex to the light, in camera space. M is ommited because it's identity. vec3 LightPosition_cameraspace = ( V * vec4(LightPosition_worldspace,1)).xyz; LightDirection_cameraspace = LightPosition_cameraspace + EyeDirection_cameraspace; // UV of the vertex. No special space for this one. UV = vertexUV; // model to camera = ModelView vec3 vertexTangent_cameraspace = MV3x3 * vertexTangent_modelspace; vec3 vertexBitangent_cameraspace = MV3x3 * vertexBitangent_modelspace; vec3 vertexNormal_cameraspace = MV3x3 * vertexNormal_modelspace; mat3 TBN = transpose(mat3( vertexTangent_cameraspace, vertexBitangent_cameraspace, vertexNormal_cameraspace )); // You can use dot products instead of building this matrix and transposing it. See References for details. LightDirection_tangentspace = TBN * LightDirection_cameraspace; EyeDirection_tangentspace = TBN * EyeDirection_cameraspace; } Fragment shader: #version 130 // Interpolated values from the vertex shaders in vec2 UV; in vec3 Position_worldspace; in vec3 EyeDirection_cameraspace; in vec3 LightDirection_cameraspace; in vec3 LightDirection_tangentspace; in vec3 EyeDirection_tangentspace; // Ouput data out vec3 color; // Values that stay constant for the whole mesh. 
uniform sampler2D DiffuseTextureSampler; uniform sampler2D NormalTextureSampler; uniform sampler2D SpecularTextureSampler; uniform mat4 V; uniform mat4 M; uniform mat3 MV3x3; uniform vec3 LightPosition_worldspace; void main(){ // Light emission properties // You probably want to put them as uniforms vec3 LightColor = vec3(1,1,1); float LightPower = 40.0; // Material properties vec3 MaterialDiffuseColor = texture2D( DiffuseTextureSampler, vec2(UV.x,-UV.y) ).rgb; vec3 MaterialAmbientColor = vec3(0.1,0.1,0.1) * MaterialDiffuseColor; //vec3 MaterialSpecularColor = texture2D( SpecularTextureSampler, UV ).rgb * 0.3; vec3 MaterialSpecularColor = vec3(0.5,0.5,0.5); // Local normal, in tangent space. V tex coordinate is inverted because normal map is in TGA (not in DDS) for better quality vec3 TextureNormal_tangentspace = normalize(texture2D( NormalTextureSampler, vec2(UV.x,-UV.y) ).rgb*2.0 - 1.0); // Distance to the light float distance = length( LightPosition_worldspace - Position_worldspace ); // Normal of the computed fragment, in camera space vec3 n = TextureNormal_tangentspace; // Direction of the light (from the fragment to the light) vec3 l = normalize(LightDirection_tangentspace); // Cosine of the angle between the normal and the light direction, // clamped above 0 // - light is at the vertical of the triangle -> 1 // - light is perpendicular to the triangle -> 0 // - light is behind the triangle -> 0 float cosTheta = clamp( dot( n,l ), 0,1 ); // Eye vector (towards the camera) vec3 E = normalize(EyeDirection_tangentspace); // Direction in which the triangle reflects the light vec3 R = reflect(-l,n); // Cosine of the angle between the Eye vector and the Reflect vector, // clamped to 0 // - Looking into the reflection -> 1 // - Looking elsewhere -> < 1 float cosAlpha = clamp( dot( E,R ), 0,1 ); color = // Ambient : simulates indirect lighting MaterialAmbientColor + // Diffuse : "color" of the object MaterialDiffuseColor * LightColor * LightPower * cosTheta / (distance*distance) + // Specular : reflective highlight, like a mirror MaterialSpecularColor * LightColor * LightPower * pow(cosAlpha,5) / (distance*distance); //color.xyz = E; //color.xyz = LightDirection_tangentspace; //color.xyz = EyeDirection_tangentspace; } I have replaced the original color value by EyeDirection_tangentspace vector and then I got other strange effect but I can not link the image (not eunogh reputation) Is it possible that with this shaders is something wrong, or maybe in other place in my code e.g with my matrices? SOLVED Solved... 3 days needed for changing one letter from this: glBindBuffer(GL_ARRAY_BUFFER, vbo); glVertexAttribPointer ( 4, // attribute 3, // size GL_FLOAT, // type GL_FALSE, // normalized? sizeof(VboVertex), // stride (void*)(12*sizeof(float)) // array buffer offset ); to this: glBindBuffer(GL_ARRAY_BUFFER, vbo); glVertexAttribPointer ( 4, // attribute 3, // size GL_FLOAT, // type GL_FALSE, // normalized? sizeof(VboVertex), // stride (void*)(11*sizeof(float)) // array buffer offset ); see difference? :)

    Read the article

  • How to utilize the minimax algorithm in a Checkers game

    - by engineer
    I am sorry... I know there are too many articles about it, but I simply can't get this. I am confused about the implementation of the AI. I have generated all possible moves for the computer's pieces. Now I can't decide on the flow: whether I need to loop over the possible moves of each piece and assign a score to each, or whether something else needs to be done. Kindly tell me the proper flow/algorithm for this. Thanks
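
    The usual flow is: generate all moves for the side to play, apply each move to a copy of the board, recurse with minimax to a fixed depth, and only score the board at the leaves; the loop over moves and the scoring are not separate phases. A minimal sketch, not from the original post (the board/move representation and the three helper functions are assumed to exist in the existing engine):

        #include <algorithm>
        #include <limits>
        #include <vector>

        struct Board { int squares[8][8]; };                   // placeholder representation
        struct Move  { int fromRow, fromCol, toRow, toCol; };  // placeholder representation

        std::vector<Move> generateMoves(const Board&, bool computerToPlay);
        Board applyMove(const Board&, const Move&);
        int evaluate(const Board&);                            // + is good for the computer

        int minimax(const Board& board, int depth, bool computerToPlay) {
            std::vector<Move> moves = generateMoves(board, computerToPlay);
            if (depth == 0 || moves.empty())
                return evaluate(board);                        // score only at the leaves

            int best = computerToPlay ? std::numeric_limits<int>::min()
                                      : std::numeric_limits<int>::max();
            for (const Move& m : moves) {
                int score = minimax(applyMove(board, m), depth - 1, !computerToPlay);
                best = computerToPlay ? std::max(best, score) : std::min(best, score);
            }
            return best;
        }

        // Top level: pick the move whose resulting position scores best.
        // Assumes the computer has at least one legal move.
        Move chooseMove(const Board& board, int depth) {
            std::vector<Move> moves = generateMoves(board, true);
            Move bestMove = moves.front();
            int bestScore = std::numeric_limits<int>::min();
            for (const Move& m : moves) {
                int score = minimax(applyMove(board, m), depth - 1, false);
                if (score > bestScore) { bestScore = score; bestMove = m; }
            }
            return bestMove;
        }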

    Read the article

  • Platform game collisions with Block

    - by Sri Harsha Chilakapati
    I am trying to create a platform game, and my collision detection with the blocks is wrong. Here's my code // Variables GTimer jump = new GTimer(1000); boolean onground = true; // The update method public void update(long elapsedTime){ MapView.follow(this); // Add the gravity if (!onground && !jump.active){ setVelocityY(4); } // Jumping if (isPressed(VK_SPACE) && onground){ jump.start(); setVelocityY(-4); onground = false; } if (jump.action(elapsedTime)){ // jump expired jump.stop(); } // Horizontal movement setVelocityX(0); if (isPressed(VK_LEFT)){ setVelocityX(-4); } if (isPressed(VK_RIGHT)){ setVelocityX(4); } } // The collision method public void collision(GObject other){ if (other instanceof Block){ // Determine the horizontal distance between centers float h_dist = Math.abs((other.getX() + other.getWidth()/2) - (getX() + getWidth()/2)); // Now the vertical distance float v_dist = Math.abs((other.getY() + other.getHeight()/2) - (getY() + getHeight()/2)); // If h_dist > v_dist horizontal collision else vertical collision if (h_dist > v_dist){ // Are we moving right? if (getX()<other.getX()){ setX(other.getX()-getWidth()); } // Are we moving left? else if (getX()>other.getX()){ setX(other.getX()+other.getWidth()); } } else { // Are we moving up? if (jump.active){ jump.stop(); } // We are moving down else { setY(other.getY()-getHeight()); setVelocityY(0); onground = true; } } } } The problem is that the object jumps well but does not fall when it moves off the platform. Here's an image describing the problem. I know I'm not checking underneath the object, but I don't know how. The map is a list of objects; should I iterate over all the objects? Thanks
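
    One common fix for this symptom (a small illustrative sketch, not from the original post; the rectangle/overlap helpers stand in for whatever the engine provides) is to decide onground every frame from a probe under the feet instead of only setting it inside the collision callback: before applying gravity, test a thin rectangle just below the player against the blocks, and clear onground if nothing is hit:

        #include <vector>

        struct Rect { float x, y, w, h; };

        bool overlaps(const Rect& a, const Rect& b) {
            return a.x < b.x + b.w && a.x + a.w > b.x &&
                   a.y < b.y + b.h && a.y + a.h > b.y;
        }

        // Call at the start of update(), before gravity is applied.
        bool standingOnSomething(const Rect& player, const std::vector<Rect>& blocks) {
            // A 2-pixel-tall strip just under the player's feet.
            Rect probe{player.x, player.y + player.h, player.w, 2.0f};
            for (const Rect& b : blocks)
                if (overlaps(probe, b)) return true;
            return false;                      // walked off the edge -> start falling
        }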

    Read the article
