Search Results

Search found 35343 results on 1414 pages for 'development tools'.


  • Do open world games need less backstory?

    - by Raceimaztion
    I've played a few open-world games and really enjoyed them, though the ones I've really enjoyed have generally received complaints about how little story there is to them. The Saboteur is one example of this. Fully open-world, good enough story (for me, anyway), engaging gameplay, and still has received complaints in reviews about not having enough story. Do open-world games actually need a full, all-encompassing story? Or can fun and engaging gameplay fill in the gap and let the designer get away with a slightly less complete story?

    Read the article

  • System Requirement Checking

    - by gl3829
    I am working on a game and want to strengthen its requirement checking to ensure that it can run successfully, so I am looking for information on what is useful to check before starting the game. As a simple example: why check for a specific amount of memory? Should I, as a game developer, ensure a minimum amount of memory? I feel this information is usually skipped in many books and resources, yet it is critical for delivering a game that will run on many machines. I would appreciate answers that describe what you check in the system, why you check it, and any good resource you have on the topic. To be a bit more specific, I'm developing on Windows.

    Read the article

  • Does glBindAttribLocation silently ignore names not found in a shader?

    - by rwols
    Does glBindAttribLocation silently ignore names that are not found? For example, in a shader:

        // Some vertex shader
        in vec3 position;
        in vec3 normal;
        // ...

    And in some set-up code:

        // While setting up the shader
        GLuint program = glCreateProgram();
        glBindAttribLocation(program, 0, "position");
        glBindAttribLocation(program, 1, "normal");
        glBindAttribLocation(program, 2, "color"); // What about this one?
        glLinkProgram(program);
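
    For what it's worth, binding a name the linked shader never declares or uses is not a GL error; the binding simply has no effect. A quick way to see which attribute names actually ended up active is to query them after linking. The sketch below uses Android's GLES20 bindings purely for illustration (the desktop calls are analogous); the class and method names are made up for the example.

        import android.opengl.GLES20;

        final class AttribCheck {
            // After glLinkProgram: attributes the shader actually uses report a location >= 0,
            // while names the shader never declares (or that were optimized out) report -1.
            static void logAttribLocations(int program) {
                for (String name : new String[] {"position", "normal", "color"}) {
                    int loc = GLES20.glGetAttribLocation(program, name);
                    System.out.println(name + " -> " + loc);
                }
            }
        }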

    Read the article

  • (Android) How are OpenGL ES 1 framebuffers and textures sized?

    - by jens
    I am trying to draw to a texture through a framebuffer using OpenGL ES 1.1 on Android, in Java. Afterwards I want to overlay this texture full-screen over my game. In theory this works like a charm, but somehow the coordinates are off. For testing I drew something at (0,0) with width and height 200, and it is partly off-screen. This is how I create the framebuffer:

        fb = new int[1];
        depthRb = new int[1];
        renderTex = new int[1];
        gl11ep.glGenFramebuffersOES(1, fb, 0);
        gl11ep.glGenRenderbuffersOES(1, depthRb, 0); // the depth buffer
        gl.glGenTextures(1, renderTex, 0);           // generate texture
        gl.glBindTexture(GL10.GL_TEXTURE_2D, renderTex[0]);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S, GL10.GL_REPEAT);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T, GL10.GL_REPEAT);
        texBuffer = ByteBuffer.allocateDirect(buf.length * 4).order(ByteOrder.nativeOrder()).asIntBuffer();
        gl.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_LUMINANCE, texW, texH, 0, GL10.GL_LUMINANCE, GL10.GL_UNSIGNED_BYTE, texBuffer);
        gl11ep.glBindRenderbufferOES(GL11ExtensionPack.GL_RENDERBUFFER_OES, depthRb[0]);
        gl11ep.glRenderbufferStorageOES(GL11ExtensionPack.GL_RENDERBUFFER_OES, GL11ExtensionPack.GL_DEPTH_COMPONENT16, texW, texH);

    Before I draw, I do this:

        gl11ep.glBindFramebufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, fb[0]);
        gl.glClearColor(0f, 0f, 0f, 0f);
        // specify texture as color attachment
        gl11ep.glFramebufferTexture2DOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, GL11ExtensionPack.GL_COLOR_ATTACHMENT0_OES, GL10.GL_TEXTURE_2D, renderTex[0], 0);
        // attach render buffer as depth buffer
        gl11ep.glFramebufferRenderbufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, GL11ExtensionPack.GL_DEPTH_ATTACHMENT_OES, GL11ExtensionPack.GL_RENDERBUFFER_OES, depthRb[0]);

    I set texW = 1024 and texH = 512. When I render this texture fullscreen with a lightmask (size 200x200) placed at (0, 0) and at (texW/2, texH/2), it seems as if the coordinate system does not start at (0,0): that light overlaps the screen edge, and the images are not drawn as squares (my light-cone texture is a circle, not an ellipse). So, how is the coordinate system of this offscreen-drawn texture defined? Thanks
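
    One thing worth checking (a hedged sketch, not from the question): the viewport and projection stay at the screen's size unless they are reset to the FBO texture's size while it is bound, which would put (0,0) somewhere unexpected and stretch circles into ellipses. Something along these lines, reusing the same gl / gl11ep handles as above; screenW and screenH are assumed names:

        // While the framebuffer is bound, match viewport and projection to the texture size.
        gl11ep.glBindFramebufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, fb[0]);
        gl.glViewport(0, 0, texW, texH);
        gl.glMatrixMode(GL10.GL_PROJECTION);
        gl.glLoadIdentity();
        gl.glOrthof(0, texW, 0, texH, -1, 1);   // pixel coordinates inside the texture
        gl.glMatrixMode(GL10.GL_MODELVIEW);
        gl.glLoadIdentity();

        // ... draw the lightmask here ...

        // Restore the screen viewport/projection before drawing the fullscreen overlay.
        gl11ep.glBindFramebufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, 0);
        gl.glViewport(0, 0, screenW, screenH);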

    Read the article

  • How many threads should an Android game use?

    - by kvance
    At minimum, an OpenGL Android game has a UI thread and a renderer thread created by GLSurfaceView. Renderer.onDrawFrame() should do a minimum of work to get the highest FPS. The physics, AI, etc. don't need to run every frame, so we can put those in another thread. Now we have:

    • Renderer thread - update animations and draw polys
    • Game thread - logic and periodic physics, AI, etc. updates
    • UI thread - Android UI interaction only

    Since you don't ever want to block the UI thread, I run one more thread for the game logic. Maybe that's not necessary, though? Is there ever a reason to run game logic in the renderer thread?
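
    If the game logic does get its own thread, a minimal fixed-rate loop might look like the sketch below. The class name, the 50 ms tick, and updatePhysicsAndAi() are assumptions for illustration, not from the question.

        // Hypothetical fixed-rate logic loop running off the UI and renderer threads.
        public final class GameLogicThread extends Thread {
            private static final long TICK_MS = 50;   // 20 logic updates per second
            private volatile boolean running = true;

            @Override
            public void run() {
                while (running) {
                    long start = System.currentTimeMillis();
                    updatePhysicsAndAi();              // assumed game-state update
                    long elapsed = System.currentTimeMillis() - start;
                    try {
                        Thread.sleep(Math.max(0, TICK_MS - elapsed));
                    } catch (InterruptedException e) {
                        running = false;
                    }
                }
            }

            public void shutdown() { running = false; interrupt(); }

            private void updatePhysicsAndAi() { /* placeholder */ }
        }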

    Read the article

  • Background animation algorithm for a single screen

    - by becool_max
    I'm writing a simple strategy game (in XNA) and would like to have an animated background. In my game all the action happens inside one screen, so a standard parallax effect does not look appropriate. However, I found a video of a game with a background animation that would suit my game: http://www.youtube.com/watch?v=Vcxdbjulf90&feature=share&list=PLEEF9ABAB913946E6 (from 3s to 6s, while the main character stays in the same place). What algorithm is used to do this? It would be nice if someone could provide a reference to a similar example (the language is not important).

    Read the article

  • How can I plot a radius of all reachable points with pathfinding for a Mob (XNA)?

    - by PugWrath
    I am designing a tactical turn-based game. The maps are 2D, but they do have varying level layers and blocking objects/terrain. I'm looking for a pathfinding algorithm that will allow me to show an opaque shape representing all of the possible max-distance pixels that a mob can move to, knowing the mob's max pixel distance. Any thoughts on this, or do I just need to write a good pathfinding algorithm and use it to find the cutoff points for any direction in which an obstacle exists?
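
    One common approach (a sketch, not from the question) is to approximate the map as a grid of small cells and run a uniform-cost search (Dijkstra) outward from the mob's cell, stopping expansion once the accumulated movement cost exceeds the maximum distance; every cell reached within budget is reachable, and the outline of that set is the shape to draw. The blocked[][] array and the unit step cost below are hypothetical stand-ins for real map data.

        import java.awt.Point;
        import java.util.*;

        public final class ReachableTiles {
            // Dijkstra / uniform-cost search limited to maxCost; diagonals omitted for brevity.
            public static Set<Point> reachable(boolean[][] blocked, Point start, double maxCost) {
                int[][] dirs = {{1, 0}, {-1, 0}, {0, 1}, {0, -1}};
                Map<Point, Double> best = new HashMap<>();
                // Queue entries are {cost, x, y}, ordered by cost.
                PriorityQueue<double[]> open = new PriorityQueue<>(Comparator.comparingDouble((double[] e) -> e[0]));
                best.put(start, 0.0);
                open.add(new double[] {0.0, start.x, start.y});
                while (!open.isEmpty()) {
                    double[] cur = open.poll();
                    double cost = cur[0];
                    Point at = new Point((int) cur[1], (int) cur[2]);
                    if (cost > best.getOrDefault(at, Double.MAX_VALUE)) continue; // stale entry
                    for (int[] d : dirs) {
                        Point next = new Point(at.x + d[0], at.y + d[1]);
                        if (next.x < 0 || next.y < 0
                                || next.x >= blocked.length || next.y >= blocked[0].length
                                || blocked[next.x][next.y]) continue;
                        double nextCost = cost + 1.0; // swap in terrain-dependent cost here
                        if (nextCost <= maxCost && nextCost < best.getOrDefault(next, Double.MAX_VALUE)) {
                            best.put(next, nextCost);
                            open.add(new double[] {nextCost, next.x, next.y});
                        }
                    }
                }
                return best.keySet(); // every cell reachable within maxCost
            }
        }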

    Read the article

  • Existing JS libs for tileset / map loading and rendering?

    - by ylluminate
    I'm building an RTS-style overhead tileset game with JavaScript (particularly using the Ember.js framework as a base). The map is so large that I'd very much like to be able to load and render the board and layered items in a Google Maps-esque fashion. I'm curious whether there are existing libraries that would be helpful and are already well thought out in this regard, versus trying to reinvent the wheel. Are there any such libraries or code examples that would be useful in this area of board/map management?

    Read the article

  • What's the difference between Pygame's Sound and Music classes?

    - by Southpaw Hare
    What are the key differences between the Sound and Music classes in Pygame? What are the limitations of each? In what situation would one use one or the other? Is there a benefit to using them in an unintuitive way, such as using Sound objects to play music files or vice versa? Are there specific issues with channel limitations, and do one or both have the potential to be dropped from their channel unreliably? What are the risks of playing music as a Sound?

    Read the article

  • Change density of the body dynamically

    - by Siddharth
    In my game, I want to change the density of my body object when it collides with other objects. I found something like the following to change the density, but beyond that I was not able to find any hints, so please help.

        Fixture fixture = goldenBoxArrayList.get(i)
                .getGoldenBoxBody()
                .getFixtureList().get(0);
        fixture.setDensity(0.5f);

    After setting the fixture's density, I was not able to apply it to the body.
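
    A hedged sketch of what usually follows in Box2D-style bindings (JBox2D, libGDX, and the AndEngine extension built on it): a density change only feeds into the body's mass once the body's mass data is recomputed. This assumes getGoldenBoxBody() returns such a Body object.

        Body body = goldenBoxArrayList.get(i).getGoldenBoxBody();
        Fixture fixture = body.getFixtureList().get(0);

        fixture.setDensity(0.5f);
        // Density affects the body's mass only when the mass data is recomputed.
        body.resetMassData();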

    Read the article

  • How do I connect the seams between my terrain?

    - by gnomgrol
    I'm using C++ and D3D11, and I'm trying to create a (pretty) large terrain, let's say 4096x4096, maybe larger. I've got the basics of terrain creation and have already split it up into chunks. But when I render them (every chunk has its own vertex and index buffer, as well as its own heightmap), there are still little pieces missing between them. I have read a lot about LOD (Level Of Detail) and GMM (Geometry Mipmapping), but I can't really implement the theory I read. I could really use some help; everything is welcome. If you have some good tutorials on any of this, please share them.

    Read the article

  • What are the challenges and benefits of writing games with a functional language?

    - by McMuttons
    While I know that functional languages aren't the most commonly used for writing games, there are a lot of benefits associated with them that seem like they would be interesting in any programming context. The ease of parallelization in particular seems like it could be very useful as the focus moves toward more and more processors. Also, with F# as a new member of the .NET family, it can be used directly with XNA, for example, which lowers the threshold quite a bit, as opposed to going with Lisp, Haskell, Erlang, etc. If anyone has experience writing games with functional code, what have turned out to be the positives and negatives? What was it suited for, and what not? Edit: Finding it hard to decide that there's a single good answer for this, so it's probably better suited as a community wiki post.

    Read the article

  • Looking for feedback on design pattern for simple 2D environment

    - by Le Mot Juiced
    I'm working in iOS. I am trying to make a very simple 2D environment where there are some basic shapes you can drag around with your finger. These shapes should interact in various ways when dropped on each other, or when single-tapped versus double-tapped, etc. I don't know the name for the design pattern I'm thinking of. Basically, you have a bunch of arrays named after attributes, such as "double-tappable" or "draggable" or "stackable". You assign these attributes to the shapes by putting the shapes in the arrays. So, if there's a double-tap event, the code gets the location of it, then iterates through the "double-tappable" array to see if any of its members are in that location. And so on: every interactive event causes a scan through the appropriate array or arrays. It seems like that should work, but I'm wondering if there's a better pattern for the purpose.
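
    A minimal sketch of the pattern described above, using sets keyed by capability instead of parallel arrays (the Shape interface, containsPoint, and onDoubleTap names are hypothetical placeholders for whatever the shapes already provide):

        import java.util.HashSet;
        import java.util.Set;

        // Hypothetical shape interface; containsPoint is whatever hit test the shapes already have.
        interface Shape {
            boolean containsPoint(float x, float y);
            void onDoubleTap();
        }

        final class InteractionSets {
            final Set<Shape> doubleTappable = new HashSet<>();
            final Set<Shape> draggable = new HashSet<>();
            final Set<Shape> stackable = new HashSet<>();

            // On a double-tap event, scan only the shapes that opted in to that capability.
            void handleDoubleTap(float x, float y) {
                for (Shape s : doubleTappable) {
                    if (s.containsPoint(x, y)) {
                        s.onDoubleTap();
                        break; // stop at the topmost hit; drop the break if overlapping shapes should all react
                    }
                }
            }
        }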

    Read the article

  • Android Java: Way to effectively pause system time while debugging?

    - by TheMaster42
    In my project, I call nanoTime and use that to get a deltaTime which I pass to my entities and animations. However, while debugging (for example, stepping through my code), the system time on my phone is happily chugging along, so it's impossible to look at, say, two sequential frames of data in the debugger (since by the time I'm done looking at the first frame, the system time has continued to move ahead by seconds or even minutes). Is there a programming practice or method to pause the system clock (or a way for my code to intercept and fake my deltaTime) whenever I pause execution from the debugger? Additional Information: I'm using Eclipse Classic with the ADT plugin and a Samsung SII, coding in Java. My code invoking nanoTime: http://pastebin.com/0ZciyBtN I do all display via a Canvas object (2D sprites and animations).
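
    One common practice (a sketch under assumed names, not part of the question's code) is to stop reading System.nanoTime() directly and route all timing through a small clock object. The clock can clamp the delta so that a long pause in the debugger shows up as at most one normal frame, and it can be paused explicitly:

        // Hypothetical game clock: wraps System.nanoTime() so a debugger pause
        // cannot produce a huge deltaTime.
        public final class GameClock {
            private static final long MAX_DELTA_NS = 100_000_000L; // clamp to 100 ms
            private long lastNs = System.nanoTime();
            private boolean paused = false;

            public void setPaused(boolean paused) {
                this.paused = paused;
                this.lastNs = System.nanoTime(); // discard time spent while paused
            }

            // Returns the frame delta in seconds, clamped, and zero while paused.
            public float tick() {
                long now = System.nanoTime();
                long delta = now - lastNs;
                lastNs = now;
                if (paused) return 0f;
                return Math.min(delta, MAX_DELTA_NS) / 1_000_000_000f;
            }
        }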

    Read the article

  • Unity, Unrealistic Sphere On Inclined Plane

    - by user1086516
    I am trying to model a ball rolling down an inclined surface in Unity, based on what I observe in real life, but the result is still quite off:

    • In Unity it takes the ball about 3 seconds to travel from one place to another specified place, where in real life it only takes 1 second.
    • The ball isn't as quick to react to the incline as in real life (even though I have tried giving the ball and surface low or zero friction values).
    • The ball does not accelerate nearly as fast as it does in real life.

    What do I do to give the ball more realistic behavior? I have tried messing around with mass, physics materials, drag, and angular drag on the ball and surface, but it doesn't seem to be helping.

    Read the article

  • Rendering projectiles with DirectX and C++

    - by Chris
    I'm working on a simple game in which the user controls a spaceship that shoots small circular projectiles. However, I'm not sure how to render these. Right now I know how to make an LPDIRECT3DSURFACE9 for a sprite and render it onto an LPDIRECT3DDEVICE9, but that's only for a single sprite. I assume I don't need to constantly create new surfaces and devices. How should projectile generation/rendering be handled? Thanks in advance.

    Read the article

  • How do I convert screen coordinates to between -1 and 1?

    - by bbdude95
    I'm writing a function that allows me to click on my tiles. The origin for my tiles is the center, but the mouse's origin is the top left. I need a way to transform my mouse coordinates into my tile coordinates. Here is what I already have (but it is not working):

        void mouseClick(int button, int state, int x, int y) {
            x -= 400;
            y -= 300;
            float xx = x / 100; // This gets me close, but the number is still too high.
            float yy = y / 100; // It needs to be between -1 and 1.
        }
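
    For reference, the arithmetic is: divide by half the window size, as a float, so the result spans -1..1, and flip y because screen y grows downward while normalized device y grows upward. A small sketch of the same math in Java, purely for illustration (an 800x600 window is assumed, as the 400/300 offsets suggest):

        // Hypothetical helper: screen pixels -> normalized device coordinates in [-1, 1].
        static float[] toNdc(int mouseX, int mouseY, int windowWidth, int windowHeight) {
            float xx = (mouseX - windowWidth / 2f) / (windowWidth / 2f);
            float yy = -(mouseY - windowHeight / 2f) / (windowHeight / 2f); // flip y
            return new float[] { xx, yy };
        }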

    Read the article

  • How do I find a unit vector of another in Java?

    - by Shijima
    I'm writing a Java formula based on this tutorial: 2-D elastic collisions without Trigonometry. I am in the section "Elastic Collisions in 2 Dimensions". Part of step 1 says: "Next, find the unit vector of n, which we will call un. This is done by dividing by the magnitude of n." The code below represents the normal vector n between two objects (I'm using a simple array to represent the normal vector):

        int[] normal = new int[2];
        normal[0] = ball2.x - ball1.x;
        normal[1] = ball2.y - ball1.y;

    I am unsure what the tutorial means by dividing by the magnitude of n to get un. What is un? How can I calculate it with my Java array?
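
    For reference, un is simply n scaled to length 1: divide each component by the magnitude sqrt(nx*nx + ny*ny). A small sketch (note the components need to be doubles or floats, since dividing inside an int[] would truncate):

        double nx = ball2.x - ball1.x;
        double ny = ball2.y - ball1.y;
        double magnitude = Math.sqrt(nx * nx + ny * ny);

        // Unit normal un: same direction as n, length 1 (guard against a zero-length vector).
        double[] un = (magnitude == 0) ? new double[] {0, 0}
                                       : new double[] {nx / magnitude, ny / magnitude};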

    Read the article

  • OpenGL: Keeping alpha in a render buffer

    - by Cyan
    In my current task, I need to render a texture into a render buffer in order to work on it (apply special filters) there. The result is then considered a "new texture", which is later displayed. This works fine, except when the texture contains some transparent or semi-transparent parts. My current guess is that, within the render buffer, the texture is "merged" with a kind of grey background, which obviously affects the R, G, B color components of transparent pixels. I've yet to find a way around this. Even manually assigning alpha after the rendering process doesn't save the day for semi-transparent pixels, whose RGB values are "tainted" by the grey background.

    Read the article

  • How to pause and unpause the animation of a sprite?

    - by user1609578
    My game has a sprite representing a character. When the character picks up an item, the sprite should stop moving for a period of time. I use a CCBezier action to make the sprite move, like this:

        sprite->runAction(x);

    Now I want the sprite to stop its current action (moving) and later resume it. I can make the sprite stop by using:

        sprite->stopAction(x);

    but if I do that, I can't resume the movement. How can I pause the action and resume it later?

    Read the article

  • Knockback enemy based off of direction sprite is facing

    - by pengume
    Hey everyone, today I am trying to make it so that if I hit the enemy, the enemy will be knocked backwards in the direction the sprite is facing. I am rotating the sprite through 360 degrees using an on-screen joystick, and I wanted to know the best practice or ways to accomplish this. I have come up with a few ideas, but none of them make use of the angle the sprite is facing, just a check to see if I hit the bottom, then move him upward, and so forth. I am just stumped on how to apply the sprite's angle to the enemy's x and y coordinates and move him accordingly. Has anyone tried this and have suggestions or things to look for? Thanks in advance.
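
    The conversion from a facing angle to a knockback offset is just cos/sin of that angle. A small sketch, with made-up field names, assuming the angle is in degrees with 0 degrees pointing along +x (if the y axis grows downward on screen, the sin term may need its sign flipped):

        // Push the enemy away along the direction the player sprite is facing.
        double radians = Math.toRadians(playerSprite.angleDegrees); // assumed field
        float knockbackDistance = 40f;                              // tune to taste

        enemy.x += (float) (Math.cos(radians) * knockbackDistance);
        enemy.y += (float) (Math.sin(radians) * knockbackDistance);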

    Read the article

  • Really weird GL Behaviour, uniform not "hitting" proper mesh? LibGdx

    - by HaMMeReD
    OK, I've got some code where you select blocks on a grid. The selection works: I can modify the blocks to be raised when selected, and the correct one shows. I set a color which I use in the shader. However, I am trying to change the color before rendering the geometry, and the last rendered geometry (in the sequence) is the one that is rendered light. To debug the logic I decided to move the selected block up and make it white, in which case one block moves up and another block becomes white. I checked all my logic and it knows the correct one is selected; it is shown in the correct place and rendered correctly. When there is only one entity it works properly. There is a video of the bug in action; note how the highlighted and elevated blocks are not the same block. The code for the color and my renderer (for the items being drawn) is here:

        public void render(Renderer renderer) {
            mGrid.render(renderer, mGameState);
            for (Entity e : mGameEntities) {
                UnitTypes ut = UnitTypes.valueOf((String) e.getObject(D.UNIT_TYPE.ordinal()));
                if (ut == UnitTypes.Soldier) {
                    renderer.testShader.begin();
                    renderer.testShader.setUniformMatrix("u_mvpMatrix", mEntityMatrix);
                    renderer.texture_soldier.bind(0);
                    Vector2 pos = (Vector2) e.getObject(D.COORDS.ordinal());
                    mEntityMatrix.set(renderer.mCamera.combined);
                    if (mSelectedEntities.contains(e)) {
                        mEntityMatrix.translate(pos.x, 1f, pos.y);
                        renderer.testShader.setUniformf("v_color", 0.5f, 0.5f, 0.5f, 1f);
                    } else {
                        mEntityMatrix.translate(pos.x, 0f, pos.y);
                        renderer.testShader.setUniformf("v_color", 1f, 1f, 1f, 1f);
                    }
                    mEntityMatrix.scale(0.2f, 0.2f, 0.2f);
                    renderer.model_soldier.render(renderer.testShader, GL20.GL_TRIANGLES);
                    renderer.testShader.end();
                } else if (ut == UnitTypes.Enemy_Infiltrator) {
                    renderer.testShader.begin();
                    renderer.testShader.setUniformMatrix("u_mvpMatrix", mEntityMatrix);
                    renderer.testShader.setUniformf("v_color", 1.0f, 1, 1, 1.0f);
                    renderer.texture_enemy_infiltrator.bind(0);
                    Vector2 pos = (Vector2) e.getObject(D.COORDS.ordinal());
                    mEntityMatrix.set(renderer.mCamera.combined);
                    mEntityMatrix.translate(pos.x, 0f, pos.y);
                    mEntityMatrix.scale(0.2f, 0.2f, 0.2f);
                    renderer.model_enemy_infiltrator.render(renderer.testShader, GL20.GL_TRIANGLES);
                    renderer.testShader.end();
                }
            }
        }
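
    One possible cause worth ruling out, sketched below (a guess against the code above, not a confirmed fix): in the Soldier branch, u_mvpMatrix is uploaded from mEntityMatrix before that matrix is rebuilt for the current entity, so each block could be drawn with the previous entity's transform while v_color is set for the current one. Building the matrix fully (set, translate, scale) before calling setUniformMatrix keeps the elevation and the tint on the same block:

        Vector2 pos = (Vector2) e.getObject(D.COORDS.ordinal());
        boolean selected = mSelectedEntities.contains(e);
        float tint = selected ? 0.5f : 1f;

        mEntityMatrix.set(renderer.mCamera.combined);
        mEntityMatrix.translate(pos.x, selected ? 1f : 0f, pos.y); // raise only the selected block
        mEntityMatrix.scale(0.2f, 0.2f, 0.2f);

        renderer.testShader.begin();
        renderer.testShader.setUniformMatrix("u_mvpMatrix", mEntityMatrix); // uploaded after the matrix is final
        renderer.testShader.setUniformf("v_color", tint, tint, tint, 1f);
        renderer.texture_soldier.bind(0);
        renderer.model_soldier.render(renderer.testShader, GL20.GL_TRIANGLES);
        renderer.testShader.end();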

    Read the article

  • Modular spaceship control

    - by SSS
    I am developing a physics-based game with spaceships. A spaceship is constructed from circles connected by joints. Some of the circles have engines attached. Engines can rotate around the center of their circle and create thrust. I want to be able to move the ship in a direction, or rotate it around a point, by setting the rotation and thrust for each of the ship's engines. How can I find the rotation and thrust needed for each engine to achieve this?

    Read the article

  • What are some of the more commonly used projectile rendering techniques?

    - by KlashnikovKid
    I couldn't find a duplicate question (a bit surprising to me), but anyhow, I'm getting close to implementing the rendering of projectiles for my game. My question is: what are some good techniques for efficiently rendering projectiles? I would like the emphasis on techniques that leave room for the projectiles to be "rich" and dynamic (cool to look at!). I'm also using DX11 for my rendering engine, so bleeding-edge techniques that can make use of that would be much appreciated too. Thanks!

    Read the article

  • How can a pygame image be colored?

    - by Juicy
    I'm writing a 2d particle system for a game in Pygame[1]. For the particles, I have an image surface loaded from a file -- basically a white primitive drawn over a transparent background. I'd like the particle engine to emit variously colored particles, but I'm not sure how to tell Pygame to color the surface. I've looked through what passes for documentation, but I'm having trouble finding anything. [1] Yeah, I don't really like Pygame, but my course insists I write this project in Python.

    Read the article
