Search Results

Search found 26043 results on 1042 pages for 'development trunk'.


  • Best way to render card images

    - by user1065145
    I have high-quality SVG card images, but they drastically lose their quality when I downsize them. I have tried two ways of rendering cards (using Inkscape and ImageMagick): 1) render the SVG to a high-res PNG and resize it afterwards; 2) render the SVG to an image of the proper size directly. Both approaches generate blurry card images, which look even worse than the old Windows cards. What is the best way to generate smaller card images from SVG sources without losing too much of their quality?
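
    One commonly suggested approach is to supersample: rasterize the SVG at several times the target size and then downscale with a high-quality filter such as Lanczos. A minimal sketch, assuming the cairosvg and Pillow packages and a hypothetical card.svg file:

        import cairosvg
        from PIL import Image

        TARGET_W, TARGET_H = 90, 130   # final card size in pixels (example values)
        OVERSAMPLE = 4                 # rasterize at 4x the target size, then downscale

        # Rasterize the SVG at 4x the target resolution.
        cairosvg.svg2png(url="card.svg", write_to="card_big.png",
                         output_width=TARGET_W * OVERSAMPLE,
                         output_height=TARGET_H * OVERSAMPLE)

        # Downscale with a Lanczos filter to keep edges crisp.
        big = Image.open("card_big.png")
        small = big.resize((TARGET_W, TARGET_H), Image.LANCZOS)
        small.save("card_small.png")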

    Read the article

  • Render full-screen gradient or texture

    - by Filip Skakun
    What's the simplest way to fill the background of the screen with a gradient or a texture in Direct3D 10/11? I'm building a Windows 8 Metro app in which the camera never moves and I render some content in D3D, but I need to fill the background with something other than a solid color. Do I need to figure out the size and position of a rectangle and position it in 3D space, or is there a simpler solution? I don't care about depth at all, and I don't use any depth buffer since all my content is sorted back to front, so I could simply draw the background first.

    Read the article

  • Logging library for (c++) games

    - by Klaim
    I know of a lot of logging libraries but haven't tested many of them (GoogleLog, Pantheios, the upcoming boost::log library...). In games, especially in remote multiplayer and multithreaded games, logging is vital to debugging, even if you remove all logs in the end. Let's say I'm making a PC game (not console) that needs logs (multiplayer and multithreaded and/or multiprocess) and I have good reasons for looking for a logging library (for example, I don't have time, or I'm not confident in my ability to write one correctly for my case). Assuming that I need: performance; ease of use (allows streaming or formatting or something like that); reliability (doesn't leak or crash!); and cross-platform support (at least Windows, Mac OS X, Linux/Ubuntu). Which logging library would you recommend? Currently I think boost::log is the most flexible one (you can even log remotely!), but it doesn't have good performance. Pantheios is often cited, but I don't have comparison points on its performance and usage. I've used my own lib for a long time, but I know it doesn't handle multithreading, which is a big problem even if it's fast enough. Google Log seems interesting; I just need to test it, but if you have already compared these libs and more, your advice might be of good use. Games are often performance-demanding and complex to debug, so it would be good to know which logging libraries have clear advantages in our specific case.

    Read the article

  • Animation API vs frame animation

    - by Max
    I'm pretty far down the road in my game right now, closing in on the end, and I'm adding little tweaks here and there. I used custom frame animation of a single image with many versions of my sprite on it, and controlled which part of the image to show using rectangles. But I'm starting to think that maybe I should've used the Animation API that comes with Android instead. Will this affect my performance in a negative way? Can I still use rectangles to draw my bitmap? Could I add effects from the Animation API, like the fade-out effect, to my current frame-controlled animation? That would mean I won't have to change my current code. I want some of my animations to fade out, and I just noticed that using the Animation API makes things a lot easier. But needless to say, I would prefer not having to change all my animation code. I'm bad at explaining, so I'll show a bit of how I do my animation:

        private static final int BMP_ROWS = 1; // I use top-view, so my sprite only needs 1 direction
        private static final int BMP_COLUMNS = 3;

        public void update(GameControls controls) {
            if (sprite.isMoving) {
                currentFrame = ++currentFrame % BMP_COLUMNS;
            } else {
                this.setFrame(1);
            }
        }

        public void draw(Canvas canvas, int x, int y, float angle) {
            this.x = x;
            this.y = y;
            canvas.save();
            canvas.rotate(angle, x + width / 2, y + height / 2);
            int srcX = currentFrame * width;
            int srcY = 0 * height;
            Rect src = new Rect(srcX, srcY, srcX + width, srcY + height);
            Rect dst = new Rect(x, y, x + width, y + height);
            canvas.drawBitmap(bitmap, src, dst, null);
            canvas.restore();
        }

    Read the article

  • Pygame: Save a list of objects/classes/surfaces

    - by Sam Tubb
    I am working on a game, in which you can create mazes. You place blocks on a 16x16 grid, while choosing from a variety of block to make the level with. Whenever you create a block, it adds this class: class Block(object): def __init__(self,x,y,spr): self.x=x self.y=y self.sprite=spr self.rect=self.sprite.get_rect(x=self.x,y=self.y) to a list called instances. I tried shelving it to a .bin file, but it returns some error dealing with surfaces. How can I go about saving and loading levels? Any help is appreciated! :) Here is the whole code for reference: import pygame from pygame.locals import * #initstuff pygame.init() screen=pygame.display.set_mode((640,480)) pygame.display.set_caption('PiMaze') instances=[] #loadsprites menuspr=pygame.image.load('images/menu.png').convert() b1spr=pygame.image.load('images/b1.png').convert() b2spr=pygame.image.load('images/b2.png').convert() currentbspr=b1spr curspr=pygame.image.load('images/curs.png').convert() curspr.set_colorkey((0,255,0)) #menu menuspr.set_alpha(185) menurect=menuspr.get_rect(x=-260,y=4) class MenuItem(object): def __init__(self,pos,spr): self.x=pos[0] self.y=pos[1] self.sprite=spr self.pos=(self.x,self.y) self.rect=self.sprite.get_rect(x=self.x,y=self.y) class Block(object): def __init__(self,x,y,spr): self.x=x self.y=y self.sprite=spr self.rect=self.sprite.get_rect(x=self.x,y=self.y) while True: #menu items b1menu=b1spr.get_rect(x=menurect.left+32,y=48) b2menu=b2spr.get_rect(x=menurect.left+64,y=48) menuitems=[MenuItem(b1menu,b1spr),MenuItem(b2menu,b2spr)] screen.fill((20,30,85)) mse=pygame.mouse.get_pos() key=pygame.key.get_pressed() placepos=((mse[0]/16)*16,(mse[1]/16)*16) if key[K_q]: if mse[0]<260: if menurect.right<255: menurect.right+=1 else: if menurect.left>-260: menurect.left-=1 else: if menurect.left>-260: menurect.left-=1 for e in pygame.event.get(): if e.type==QUIT: exit() if menurect.right<100: if e.type==MOUSEBUTTONUP: if e.button==1: to_remove = [i for i in instances if i.rect.collidepoint(placepos)] for i in to_remove: instances.remove(i) if not to_remove: instances.append(Block(placepos[0],placepos[1],currentbspr)) for i in instances: screen.blit(i.sprite,i.rect) if not key[K_q]: screen.blit(curspr,placepos) screen.blit(menuspr,menurect) for item in menuitems: screen.blit(item.sprite,item.pos) if item.rect.collidepoint(mse): if pygame.mouse.get_pressed()==(1,0,0): currentbspr=item.sprite pygame.draw.rect(screen, ((255,0,0)), item, 1) pygame.display.flip()
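
    One widely used approach is to avoid pickling Surfaces altogether: save only plain data for each block (its position and a sprite name), then rebuild the Block objects and look the Surfaces up again on load. A minimal sketch, with hypothetical helper names, assuming every sprite used by the blocks appears in a sprites dictionary keyed by name:

        import pickle

        # sprites maps a name to an already-loaded Surface, e.g. {"b1": b1spr, "b2": b2spr}
        def save_level(filename, instances, sprites):
            # Store plain (x, y, sprite_name) tuples only -- these pickle cleanly.
            names = {id(surf): name for name, surf in sprites.items()}
            data = [(b.x, b.y, names[id(b.sprite)]) for b in instances]
            with open(filename, "wb") as f:
                pickle.dump(data, f)

        def load_level(filename, sprites, block_class):
            with open(filename, "rb") as f:
                data = pickle.load(f)
            # Rebuild Block objects, resolving sprite names back to Surfaces.
            return [block_class(x, y, sprites[name]) for x, y, name in data]

    Usage would then look something like instances = load_level("level1.bin", {"b1": b1spr, "b2": b2spr}, Block).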

    Read the article

  • Implementing top view physics using box2D

    - by humbleBee
    How can top-view physics games be done in Box2D? One idea I have is to set the linear velocity of an object manually, or to alter the linear and angular damping as my object moves over different surfaces. For example, if my object is over a wet surface it'll have less linear damping, and if it is over a rough surface it'll have more damping. And to see if my object has fallen over an edge, I'll try to use an AABB and check if it's still inside, or manually check if object.x > boundary.x etc. Is there any better way?
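
    A common top-down setup is to run the world with zero gravity and let damping stand in for surface friction, adjusting linearDamping (and angularDamping) as the body moves from one surface type to another. A rough sketch, assuming the pybox2d binding and made-up damping values:

        from Box2D import b2World

        # Top-down view: no global gravity, friction is faked with damping.
        world = b2World(gravity=(0, 0), doSleep=True)

        car = world.CreateDynamicBody(position=(0, 0))
        car.CreateCircleFixture(radius=0.5, density=1.0)

        # Hypothetical per-surface damping table (higher = slows down faster).
        SURFACE_DAMPING = {"ice": 0.2, "wet": 0.8, "asphalt": 2.0, "gravel": 4.0}

        def update(surface_under_car, time_step=1.0 / 60.0):
            car.linearDamping = SURFACE_DAMPING[surface_under_car]
            car.angularDamping = car.linearDamping
            world.Step(time_step, 8, 3)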

    Read the article

  • What is UVIndex and how do I use it on OpenGL?

    - by Delta
    I am a noob in OpenGL ES 2.0 (for WebGL) and I'm trying to draw a simple model I've made with a 3D tool and exported to the .fbx format. I've been able to draw models that only have a vertex buffer, an index buffer for the vertices, a normal buffer and a texture coordinate buffer, but this model also has a "UVIndex" and I'm not sure where I am supposed to put it. My code looks like this:

        GL.bindBuffer(GL.ARRAY_BUFFER, this.Model.House.VertexBuffer);
        GL.vertexAttribPointer(this.Shader.TextureAndLighting.Attribute["vPosition"], 3, GL.FLOAT, false, 0, 0);
        GL.bindBuffer(GL.ARRAY_BUFFER, this.Model.House.NormalBuffer);
        GL.vertexAttribPointer(this.Shader.TextureAndLighting.Attribute["vNormal"], 3, GL.FLOAT, false, 0, 0);
        GL.bindBuffer(GL.ARRAY_BUFFER, this.Model.House.TexCoordBuffer);
        GL.vertexAttribPointer(this.Shader.TextureAndLighting.Attribute["TexCoord"], 2, GL.FLOAT, false, 0, 0);
        GL.bindBuffer(GL.ELEMENT_ARRAY_BUFFER, this.Model.House.IndexBuffer);
        GL.bindTexture(GL.TEXTURE_2D, this.Texture.HTex1);
        GL.activeTexture(GL.TEXTURE0);
        GL.drawElements(GL.TRIANGLES, this.Model.House.IndexBuffer.Length, GL.UNSIGNED_SHORT, 0);

    But my model renders totally incorrectly, and I think it has to do with the fact that I am ignoring this "UVIndex" in the .fbx file. Since I've never drawn a model that uses a UVIndex, I really have no clue what to do with it. This is the JSON file containing the model's data: http://pastebin.com/raw.php?i=G294TVmz
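
    Formats like FBX often index positions and UVs separately (the UVIndex maps each face corner to its own UV), while WebGL's drawElements supports only a single index buffer shared by all attributes. The usual fix is to expand the data so every unique (position index, UV index) pair becomes one vertex. A small illustrative sketch of that de-indexing step, written in Python for brevity and assuming flat input lists:

        def deindex(positions, uvs, pos_index, uv_index):
            """Merge separately indexed positions/UVs into one vertex stream
            with a single index buffer (what drawElements expects).
            Input names are illustrative; positions/uvs are flat float lists."""
            vertices = []          # interleaved [x, y, z, u, v] per vertex
            indices = []           # one combined index per face corner
            seen = {}              # (pi, ui) -> new vertex index
            for pi, ui in zip(pos_index, uv_index):
                key = (pi, ui)
                if key not in seen:
                    seen[key] = len(vertices) // 5
                    vertices.extend(positions[pi * 3: pi * 3 + 3])
                    vertices.extend(uvs[ui * 2: ui * 2 + 2])
                indices.append(seen[key])
            return vertices, indices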

    Read the article

  • Rendering oily/polluted water?

    - by Fraser
    Any shader wizards out there have an idea of how to achieve an oily/polluted water effect, similar to this: Ideally, the water would not be uniformly oily; instead, the oil could be generated from some source (such as a polluting drain from a chemical plant) and then diffuse throughout the water body. My thought for this part would be to keep an "oil map" as a 2D texture that determines the density of oil at each point on the water surface. It would diffuse and move naturally with the water velocity at that point (I have a wave-particle simulation for dynamic waves, and am already doing something similar for foam on the water surface). However, I'm not sure how physically correct that would be, since oil might not move at the same velocity as the water. And I have no idea how to make all those trippy colors :-). Thoughts?
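
    For the oil map idea, a simple diffusion-plus-advection update on a 2D density grid is often enough to get the spreading behaviour; the iridescent colours are a separate shading problem (commonly faked with a thin-film or ramp lookup driven by view angle and oil density). A rough CPU-side sketch of the density update, assuming NumPy and entirely made-up constants:

        import numpy as np

        N = 256                              # oil map resolution (example)
        oil = np.zeros((N, N), dtype=np.float32)
        DIFFUSION = 0.15                     # how fast oil spreads (arbitrary)
        SOURCE = (40, 40)                    # grid cell of the polluting drain

        def step(oil, vel_x, vel_y, dt=1.0 / 30.0):
            # The pollution source keeps injecting oil.
            oil[SOURCE] = min(1.0, oil[SOURCE] + 0.5 * dt)

            # Diffusion: discrete Laplacian via neighbour averaging.
            lap = (np.roll(oil, 1, 0) + np.roll(oil, -1, 0) +
                   np.roll(oil, 1, 1) + np.roll(oil, -1, 1) - 4.0 * oil)
            oil += DIFFUSION * lap * dt

            # Crude advection: shift density along the water velocity field
            # (whole-cell offsets here, purely to show the idea).
            oil = np.roll(np.roll(oil, int(vel_y), axis=0), int(vel_x), axis=1)
            return np.clip(oil, 0.0, 1.0)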

    Read the article

  • Workaround the flip queue (AKA pre-rendered frames) in OpenGL?

    - by user41500
    It appears that some drivers implement a "flip queue" such that, even with vsync enabled, the first few calls to swap buffers return immediately (queuing those frames for later use). It is only after this queue is filled that buffer swaps will block to synchronize with vblank. This behavior is detrimental to my application. It creates latency. Does anyone know of a way to disable it or a workaround for dealing with it? The OpenGL Wiki on Swap Interval suggests a call to glFinish after the swap but I've had no such luck with that trick.
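
    Besides glFinish, one workaround that gets suggested is to put a fence right after the swap and block on it yourself, so the driver can never queue more than roughly one frame ahead; whether it actually defeats a given driver's flip queue varies. A minimal sketch, assuming a GL 3.2+ context and the PyOpenGL bindings, where swap_buffers stands in for whatever windowing call you use:

        from OpenGL.GL import (glFenceSync, glClientWaitSync, glDeleteSync,
                               GL_SYNC_GPU_COMMANDS_COMPLETE,
                               GL_SYNC_FLUSH_COMMANDS_BIT)

        def swap_without_queueing(swap_buffers):
            # swap_buffers is a placeholder for your SwapBuffers/eglSwapBuffers call.
            swap_buffers()
            # Fence after the swap, then wait until the GPU has actually reached it.
            fence = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0)
            glClientWaitSync(fence, GL_SYNC_FLUSH_COMMANDS_BIT, 1_000_000_000)  # 1 s timeout, in ns
            glDeleteSync(fence)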

    Read the article

  • LIBGDX "parsing error emitter" with 2 or more emitters [on hold]

    - by flow969
    I have a problem with the use of particle effect of LIBGDX with 2 or more emitters. After using ParticleEditor to create my .p file, I use it in my code BUT...when I use only 1 emitter it's fine but with more than 1, not fine ! :( Here is my error code in java console : Exception in thread "LWJGL Application" java.lang.RuntimeException: Error parsing emitter: - Delay - at com.badlogic.gdx.graphics.g2d.ParticleEmitter.load(ParticleEmitter.java:910) at com.badlogic.gdx.graphics.g2d.ParticleEmitter.<init>(ParticleEmitter.java:95) at com.badlogic.gdx.graphics.g2d.ParticleEffect.loadEmitters(ParticleEffect.java:154) at com.badlogic.gdx.graphics.g2d.ParticleEffect.load(ParticleEffect.java:138) at com.fasgame.fishtrip.android.screens.GameScreen.show(GameScreen.java:313) at com.badlogic.gdx.Game.setScreen(Game.java:61) at com.fasgame.fishtrip.android.screens.MainMenuScreen.render(MainMenuScreen.java:71) at com.badlogic.gdx.Game.render(Game.java:46) at com.badlogic.gdx.backends.lwjgl.LwjglApplication.mainLoop(LwjglApplication.java:206) at com.badlogic.gdx.backends.lwjgl.LwjglApplication$1.run(LwjglApplication.java:114) Caused by: java.lang.NumberFormatException: For input string: "- Count -" at sun.misc.FloatingDecimal.readJavaFormatString(Unknown Source) at sun.misc.FloatingDecimal.parseFloat(Unknown Source) at java.lang.Float.parseFloat(Unknown Source) at com.badlogic.gdx.graphics.g2d.ParticleEmitter.readFloat(ParticleEmitter.java:929) at com.badlogic.gdx.graphics.g2d.ParticleEmitter$RangedNumericValue.load(ParticleEmitter.java:1062) at com.badlogic.gdx.graphics.g2d.ParticleEmitter.load(ParticleEmitter.java:866) ... 9 more And here is my particle effect .p file : Blanc - Delay - active: false - Duration - lowMin: 3000.0 lowMax: 3000.0 - Count - min: 0 max: 200 - Emission - lowMin: 0.0 lowMax: 0.0 highMin: 250.0 highMax: 250.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 0.0 - Life - lowMin: 500.0 lowMax: 500.0 highMin: 500.0 highMax: 500.0 relative: false scalingCount: 3 scaling0: 1.0 scaling1: 0.47058824 scaling2: 0.0 timelineCount: 3 timeline0: 0.0 timeline1: 0.51369864 timeline2: 1.0 - Life Offset - active: false - X Offset - active: false - Y Offset - active: false - Spawn Shape - shape: point - Spawn Width - lowMin: 0.0 lowMax: 0.0 highMin: 0.0 highMax: 0.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 0.0 - Spawn Height - lowMin: 0.0 lowMax: 0.0 highMin: 0.0 highMax: 0.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 0.0 - Scale - lowMin: 0.0 lowMax: 0.0 highMin: 70.0 highMax: 70.0 relative: true scalingCount: 2 scaling0: 1.0 scaling1: 0.0 timelineCount: 2 timeline0: 0.0 timeline1: 1.0 - Velocity - active: true lowMin: 0.0 lowMax: 0.0 highMin: 30.0 highMax: 300.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 0.0 - Angle - active: true lowMin: 220.0 lowMax: 320.0 highMin: 220.0 highMax: 320.0 relative: false scalingCount: 2 scaling0: 0.0 scaling1: 0.98039216 timelineCount: 2 timeline0: 0.0 timeline1: 1.0 - Rotation - active: false - Wind - active: false - Gravity - active: true lowMin: 0.0 lowMax: 0.0 highMin: 0.0 highMax: 0.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 0.0 - Tint - colorsCount: 3 colors0: 0.50980395 colors1: 0.7647059 colors2: 0.7921569 timelineCount: 1 timeline0: 0.0 - Transparency - lowMin: 0.0 lowMax: 0.0 highMin: 1.0 highMax: 1.0 relative: false scalingCount: 4 scaling0: 1.0 scaling1: 1.0 scaling2: 1.0 scaling3: 1.0 
timelineCount: 4 timeline0: 0.0 timeline1: 0.36301368 timeline2: 0.6164383 timeline3: 1.0 - Options - attached: false continuous: true aligned: false additive: true behind: false premultipliedAlpha: false pre_particle.png Bleu - Delay - active: false - Duration - lowMin: 3000.0 lowMax: 3000.0 - Count - min: 0 max: 200 - Emission - lowMin: 0.0 lowMax: 0.0 highMin: 250.0 highMax: 250.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 0.0 - Life - lowMin: 500.0 lowMax: 500.0 highMin: 500.0 highMax: 500.0 relative: false scalingCount: 3 scaling0: 1.0 scaling1: 0.47058824 scaling2: 0.0 timelineCount: 3 timeline0: 0.0 timeline1: 0.51369864 timeline2: 1.0 - Life Offset - active: false - X Offset - active: false - Y Offset - active: false - Spawn Shape - shape: point - Spawn Width - lowMin: 0.0 lowMax: 0.0 highMin: 0.0 highMax: 0.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 0.0 - Spawn Height - lowMin: 0.0 lowMax: 0.0 highMin: 0.0 highMax: 0.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 0.0 - Scale - lowMin: 0.0 lowMax: 0.0 highMin: 70.0 highMax: 70.0 relative: true scalingCount: 2 scaling0: 1.0 scaling1: 0.0 timelineCount: 2 timeline0: 0.0 timeline1: 1.0 - Velocity - active: true lowMin: 0.0 lowMax: 0.0 highMin: 30.0 highMax: 300.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 0.0 - Angle - active: true lowMin: 220.0 lowMax: 320.0 highMin: 220.0 highMax: 320.0 relative: false scalingCount: 2 scaling0: 0.0 scaling1: 0.98039216 timelineCount: 2 timeline0: 0.0 timeline1: 1.0 - Rotation - active: false - Wind - active: false - Gravity - active: true lowMin: 0.0 lowMax: 0.0 highMin: 0.0 highMax: 0.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 0.0 - Tint - colorsCount: 3 colors0: 0.0 colors1: 0.7254902 colors2: 0.7921569 timelineCount: 1 timeline0: 0.0 - Transparency - lowMin: 0.0 lowMax: 0.0 highMin: 1.0 highMax: 1.0 relative: false scalingCount: 6 scaling0: 0.0 scaling1: 1.0 scaling2: 1.0 scaling3: 1.0 scaling4: 1.0 scaling5: 0.0 timelineCount: 6 timeline0: 0.0 timeline1: 0.047945205 timeline2: 0.34246576 timeline3: 0.6712329 timeline4: 0.94520545 timeline5: 1.0 - Options - attached: false continuous: true aligned: false additive: true behind: false premultipliedAlpha: false pre_particle.png BleuFonce - Delay - active: false - Duration - lowMin: 3000.0 lowMax: 3000.0 - Count - min: 0 max: 200 - Emission - lowMin: 0.0 lowMax: 0.0 highMin: 250.0 highMax: 250.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 0.0 - Life - lowMin: 500.0 lowMax: 500.0 highMin: 500.0 highMax: 500.0 relative: false scalingCount: 3 scaling0: 1.0 scaling1: 0.47058824 scaling2: 0.0 timelineCount: 3 timeline0: 0.0 timeline1: 0.51369864 timeline2: 1.0 - Life Offset - active: false - X Offset - active: false - Y Offset - active: false - Spawn Shape - shape: point - Spawn Width - lowMin: 0.0 lowMax: 0.0 highMin: 0.0 highMax: 0.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 0.0 - Spawn Height - lowMin: 0.0 lowMax: 0.0 highMin: 0.0 highMax: 0.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 0.0 - Scale - lowMin: 0.0 lowMax: 0.0 highMin: 70.0 highMax: 70.0 relative: true scalingCount: 2 scaling0: 1.0 scaling1: 0.0 timelineCount: 2 timeline0: 0.0 timeline1: 1.0 - Velocity - active: true lowMin: 0.0 lowMax: 0.0 highMin: 30.0 highMax: 300.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 
0.0 - Angle - active: true lowMin: 220.0 lowMax: 320.0 highMin: 220.0 highMax: 320.0 relative: false scalingCount: 2 scaling0: 0.0 scaling1: 0.98039216 timelineCount: 2 timeline0: 0.0 timeline1: 1.0 - Rotation - active: false - Wind - active: false - Gravity - active: true lowMin: 0.0 lowMax: 0.0 highMin: 0.0 highMax: 0.0 relative: false scalingCount: 1 scaling0: 1.0 timelineCount: 1 timeline0: 0.0 - Tint - colorsCount: 3 colors0: 0.0 colors1: 0.7294118 colors2: 1.0 timelineCount: 1 timeline0: 0.0 - Transparency - lowMin: 0.0 lowMax: 0.0 highMin: 1.0 highMax: 1.0 relative: false scalingCount: 4 scaling0: 1.0 scaling1: 0.0 scaling2: 0.0 scaling3: 1.0 timelineCount: 4 timeline0: 0.0 timeline1: 0.001 timeline2: 0.5753425 timeline3: 0.79452056 - Options - attached: false continuous: true aligned: false additive: true behind: false premultipliedAlpha: false pre_particle.png For the "- Image Path -" missing it's normal if I let them in it doesn't work even with only 1 emitter PS : I've already updated my lib to the last release

    Read the article

  • Blender: How to "meshify" an object I made from Bezier curves

    - by capcom
    I made a star shape using Bezier curves, and extruded it (see pic below): What I want to do is give it a rounder look - not just around the edges by using beveling. I want it to kind of look like this (well, that shape anyway): How would I go about doing this? Please keep in mind that I am extremely new to Blender. I thought that I could somehow turn this star into those default shapes that have tonnes of squares which I could pull out, and apply a mirror to it so that the same thing happens on both sides. I really don't know how to do it, and would appreciate your help.

    Read the article

  • Using MVC with a retained mode renderer

    - by David Gouveia
    I am using a retained mode renderer similar to the display lists in Flash. In other words, I have a scene graph data structure called the Stage to which I add the graphical primitives I would like to see rendered, such as images, animations and text. For simplicity I'll refer to them as Sprites. Now I'm implementing an architecture which is becoming very similar to MVC, but I feel that, instead of having to create View classes, the sprites already behave pretty much like Views (except for not being explicitly connected to the Model). And since the Model is only changed through the Controller, I could simply update the view together with the Model in the controller, as in the example below:

    Example 1

        class Controller {
            Model model;
            Sprite view;

            void TeleportTo(Vector2 position) {
                model.Position = view.Position = position;
            }
        }

    The alternative, I think, would be to create View classes that wrap the sprites, make the model observable, and make the view react to changes on the model. This seems like a lot of extra work and boilerplate code, and I'm not seeing the benefits if I'm just going to have one view per controller.

    Example 2

        class Controller {
            Model model;
            View view;

            void TeleportTo(Vector2 position) {
                model.Position = position;
            }
        }

        class View {
            Model model;
            Sprite sprite;

            View() {
                model.PropertyChanged += UpdateView;
            }

            void UpdateView() {
                sprite.Position = model.Position;
            }
        }

    So, how is MVC, or more specifically the View, usually implemented when using a retained-mode renderer? And is there any reason why I shouldn't stick with example 1?

    Read the article

  • Map and fill texture using PBO (OpenGL 3.3)

    - by NtscCobalt
    I'm learning OpenGL 3.3 trying to do the following (as it is done in D3D)... Create Texture of Width, Height, Pixel Format Map texture memory Loop write pixels Unmap texture memory Set Texture Render Right now though it renders as if the entire texture is black. I can't find a reliable source for information on how to do this though. Almost every tutorial I've found just uses glTexSubImage2D and passes a pointer to memory. Here is basically what my code does... (In this case it is generating an 1-byte Alpha Only texture but it is rendering it as the red channel for debugging) GLuint pixelBufferID; glGenBuffers(1, &pixelBufferID); glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pixelBufferID); glBufferData(GL_PIXEL_UNPACK_BUFFER, 512 * 512 * 1, nullptr, GL_STREAM_DRAW); glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0); GLuint textureID; glGenTextures(1, &textureID); glBindTexture(GL_TEXTURE_2D, textureID); glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, 512, 512, 0, GL_RED, GL_UNSIGNED_BYTE, nullptr); glBindTexture(GL_TEXTURE_2D, 0); glBindTexture(GL_TEXTURE_2D, textureID); glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pixelBufferID); void *Memory = glMapBuffer(GL_PIXEL_UNPACK_BUFFER, GL_WRITE_ONLY); // Memory copied here, I know this is valid because it is the same loop as in my working D3D version glUnmapBuffer(GL_PIXEL_UNPACK_BUFFER); glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0); And then here is the render loop. // This chunk left in for completeness glUseProgram(glProgramId); glBindVertexArray(glVertexArrayId); glBindBuffer(GL_ARRAY_BUFFER, glVertexBufferId); glEnableVertexAttribArray(0); glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 20, 0); glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 20, 12); GLuint transformLocationID = glGetUniformLocation(3, 'transform'); glUniformMatrix4fv(transformLocationID , 1, true, somematrix) // Not sure if this is all I need to do glBindTexture(GL_TEXTURE_2D, pTex->glTextureId); GLuint textureLocationID = glGetUniformLocation(glProgramId, "texture"); glUniform1i(textureLocationID, 0); glDrawArrays(GL_TRIANGLES, Offset*3, Triangles*3); Vertex Shader #version 330 core in vec3 Position; in vec2 TexCoords; out vec2 TexOut; uniform mat4 transform; void main() { TexOut = TexCoords; gl_Position = vec4(Position, 1.0) * transform; } Pixel Shader #version 330 core uniform sampler2D texture; in vec2 TexCoords; out vec4 fragColor; void main() { // Output color fragColor.r = texture2D(texture, TexCoords).r; fragColor.g = 0.0f; fragColor.b = 0.0f; fragColor.a = 1.0; }
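
    One thing worth double-checking in a setup like this is that the texture is actually updated from the PBO after the buffer is unmapped; writing into the mapped PBO by itself doesn't touch the texture, and it's a glTexSubImage2D call with the PBO bound (taking a byte offset instead of a client pointer) that starts the transfer. A rough sketch of that upload path, written with the PyOpenGL bindings and NumPy purely for illustration (the GL calls are the same in C++; sizes and names are examples):

        import ctypes
        import numpy as np
        from OpenGL.GL import *

        W, H = 512, 512
        pixels = np.zeros((H, W), dtype=np.uint8)   # fill with your alpha data

        def upload_via_pbo(texture_id, pbo_id):
            # Copy the new pixel data into the PBO.
            glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo_id)
            glBufferSubData(GL_PIXEL_UNPACK_BUFFER, 0, pixels.nbytes, pixels)

            # With the PBO still bound, the last argument of glTexSubImage2D is
            # interpreted as a byte offset into the PBO, not a client pointer.
            glBindTexture(GL_TEXTURE_2D, texture_id)
            glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, W, H,
                            GL_RED, GL_UNSIGNED_BYTE, ctypes.c_void_p(0))

            glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0)
            glBindTexture(GL_TEXTURE_2D, 0)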

    Read the article

  • box2d resize bodies around point

    - by philipp
    I have a compound object consisting of a b2Body, vector graphics and a list of polygons which describe the b2Body's shapes. This object has its own transformation matrix to centralize the storage of transformations. So far everything is working quite fine; even scaling works, but not if I scale around a point. In the initialization phase the object is scaled around a point, in this order: transform the main matrix, transform the vector graphics and the polygons, recreate the b2Body. After this function has run, the shapes and all the graphics are exactly where they should be, BUT after the first steps of the b2World the graphical stuff moves away from the body. When I ran the debugger I found out that the position of the body is 0/0 (the red dot shows the center of scaling; the first image shows the basic setup and the second the final position of the graphics). This distance stays constant for the rest of the simulation. If I set the position via myBody.SetPosition( sx, sy ); the whole scenario just plays out a bit more distant from the origin. Any idea how to fix this? EDIT: I dug deeper into the problem, and it lies in the fact that I must not scale the transform matrix for the b2Body shapes around the center, but instead set the b2Body's position back to the point after scaling. But how can I calculate that point? EDIT 2: I dug even deeper and solved it, but my solution is slow and I hope somebody understands what formula I actually need. Assuming a set of polygons relative to an origin as basis shapes for a b2Body, scaling the whole object around a certain point is done in the following steps: I scale everything around the center except the polygons; I create a clone of the polygons' matrix; I scale this clone around the point; I calculate dx, dy as the differences clone.tx - original.tx and clone.ty - original.ty; I scale the original polygon matrix NOT around the point; I recreate the body; I create the fixture; I set the position of the body to dx, dy. Done! So what I am interested in is a formula for dx and dy that avoids cloning matrices, scaling the clone around a point, reading off dx and dy, and finally scaling the vertex matrix.
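
    For a pure, uniform scale that offset has a closed form: scaling around point p by factor s moves the object's origin o to p + s*(o - p), so the translation needed is (1 - s)*(p - o). A tiny sketch of that calculation (uniform scale, no rotation assumed; names are illustrative):

        def scale_offset(origin_x, origin_y, px, py, s):
            """Translation (dx, dy) to apply to the body after its local shapes
            are scaled by s about the body origin, so the result matches scaling
            the whole object by s around the point (px, py)."""
            dx = (1.0 - s) * (px - origin_x)
            dy = (1.0 - s) * (py - origin_y)
            return dx, dy

        # Example: body at (2, 0), scaled by 0.5 around (0, 0) -> moves by (-1, 0) to (1, 0).
        print(scale_offset(2.0, 0.0, 0.0, 0.0, 0.5))   # (-1.0, 0.0)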

    Read the article

  • Implementing RLE into a tilemap, or how to create a large 3D array?

    - by Smallbro
    Currently I've been using a 3D array for my tiles in a 2D world, but the 3D side comes in when moving down into caves and whatnot. This is not memory efficient, so I switched over to a 2D array and can now have much larger maps. The only issue I'm having now is that it seems my tiles cannot occupy the same space as a tile on the same z level. My current structure means that each block has its own z variable. This is what it used to look like: map.blockData[x][y][z] = new Block(); however, now it works like this: map.blockData[x][y] = new Block(z); I'm not sure why, but if I decide to use the same space on, say, the floor below, it won't allow me to. Does anyone have any ideas on how I can add a z-axis to my 2D array? I'm using Java, but I reckon the concept carries across different languages. Edit: As Will posted, RLE sounds like the best method for achieving a fast 3D array. However, I'm struggling to understand how I would even start to implement it. Would I create a 4D array, with the 4th dimension being something which controls how many tiles to skip? Or would the x-axis simply change altogether and have large gaps in between - for example, [5][y][z] would skip 5 tiles? Is there something really obvious here which I am missing? The number of z levels I'm trying to have is around 66, and it would be preferable to have up to or more than 1000 in x and y.
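
    One way to picture RLE here is per column: for every (x, y) you store a short list of (run_length, block_id) pairs going down the z-axis, so 60 identical layers of stone cost one entry instead of 60, and random access just walks the runs. A small illustrative sketch (Python used only to keep it short; the same structure translates directly to Java):

        AIR = 0

        class Column:
            """All z levels of one (x, y) cell, stored as (run_length, block_id) runs."""
            def __init__(self, depth):
                self.runs = [(depth, AIR)]          # the whole column starts empty

            def get(self, z):
                for length, block in self.runs:
                    if z < length:
                        return block
                    z -= length
                raise IndexError("z out of range")

            def set(self, z, block):
                # Expand to a flat list, change one cell, re-compress.  Simple but
                # clear; a real implementation would split/merge runs in place.
                flat = [b for length, b in self.runs for _ in range(length)]
                flat[z] = block
                self.runs = []
                for b in flat:
                    if self.runs and self.runs[-1][1] == b:
                        self.runs[-1] = (self.runs[-1][0] + 1, b)
                    else:
                        self.runs.append((1, b))

        # world[x][y] is a Column; 1000 x 1000 columns of depth 66 stay small
        # as long as most columns contain long runs of the same block.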

    Read the article

  • How can I downsample a texture using FBOs?

    - by snape
    I am rendering a scene to an FBO as my render target whose size is 8 times the size of the original screen in OpenGL. Now I want to downsample the texture generated by the FBO to the size of the screen so as to achieve spatial anti-aliasing. How do I achieve the downsampling? Please provide implementation details. Note: if there is a better way of doing anti-aliasing with FBOs, please mention that too. I am trying to remove the aliasing in the image attached below.
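
    The cheapest route is usually a framebuffer blit with linear filtering (for an 8x ratio, several successive half-size blits give better quality, since a single linear blit only averages a 2x2 footprint). A minimal PyOpenGL-flavoured sketch of one blit from the oversized FBO to the default framebuffer, with the FBO handle and sizes taken as given:

        from OpenGL.GL import (glBindFramebuffer, glBlitFramebuffer,
                               GL_READ_FRAMEBUFFER, GL_DRAW_FRAMEBUFFER,
                               GL_COLOR_BUFFER_BIT, GL_LINEAR)

        def resolve_to_screen(big_fbo, big_w, big_h, screen_w, screen_h):
            # big_fbo and the sizes are whatever your setup uses.
            # Read from the supersampled FBO, write to the default framebuffer (0).
            glBindFramebuffer(GL_READ_FRAMEBUFFER, big_fbo)
            glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0)
            glBlitFramebuffer(0, 0, big_w, big_h,          # source rectangle
                              0, 0, screen_w, screen_h,    # destination rectangle
                              GL_COLOR_BUFFER_BIT, GL_LINEAR)

    A multisampled renderbuffer (glRenderbufferStorageMultisample) resolved with the same kind of blit is the more common way to get anti-aliasing without the 8x memory cost.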

    Read the article

  • How do I draw a single Triangle with XNA and fill it with a Texture?

    - by Deukalion
    I'm trying to wrap my head around: http://msdn.microsoft.com/en-us/library/bb196409.aspx I'm trying to create a method in XNA that renders a single Triangle, and later a method that takes a list of Triangles and renders them as well. But it isn't working; I don't understand what all the parameters do, and there's not enough information. My methods (Triangle is a struct with A, B, C as Vector3, not included):

        public static void Render(GraphicsDevice device, List<Triangle> triangles, Texture2D texture)
        {
            foreach (Triangle triangle in triangles)
            {
                Render(device, triangle, texture);
            }
        }

        public static void Render(GraphicsDevice device, Triangle triangle, Texture2D texture)
        {
            BasicEffect _effect = new BasicEffect(device);
            _effect.Texture = texture;
            _effect.VertexColorEnabled = true;

            VertexPositionColor[] _vertices = new VertexPositionColor[3];
            _vertices[0].Position = triangle.A;
            _vertices[1].Position = triangle.B;
            _vertices[2].Position = triangle.B;

            foreach (var pass in _effect.CurrentTechnique.Passes)
            {
                pass.Apply();
                device.DrawUserIndexedPrimitives<VertexPositionColor>
                (
                    PrimitiveType.TriangleList,
                    _vertices,
                    0,
                    _vertices.Length,
                    new int[] { 0, 1, 2 }, // example has something similar, no idea what this is
                    0, 3 // 3 = gives me an error, 1 = works but no results
                );
            }
        }

    Read the article

  • Not getting desired results with SSAO implementation

    - by user1294203
    After having implemented deferred rendering, I tried my luck with a SSAO implementation using this Tutorial. Unfortunately, I'm not getting anything that looks like SSAO, you can see my result below. You can see there is some weird pattern forming and there is no occlusion shading where there needs to be (i.e. in between the objects and on the ground). The shaders I implemented follow: #VS #version 330 core uniform mat4 invProjMatrix; layout(location = 0) in vec3 in_Position; layout(location = 2) in vec2 in_TexCoord; noperspective out vec2 pass_TexCoord; smooth out vec3 viewRay; void main(void){ pass_TexCoord = in_TexCoord; viewRay = (invProjMatrix * vec4(in_Position, 1.0)).xyz; gl_Position = vec4(in_Position, 1.0); } #FS #version 330 core uniform sampler2D DepthMap; uniform sampler2D NormalMap; uniform sampler2D noise; uniform vec2 projAB; uniform ivec3 noiseScale_kernelSize; uniform vec3 kernel[16]; uniform float RADIUS; uniform mat4 projectionMatrix; noperspective in vec2 pass_TexCoord; smooth in vec3 viewRay; layout(location = 0) out float out_AO; vec3 CalcPosition(void){ float depth = texture(DepthMap, pass_TexCoord).r; float linearDepth = projAB.y / (depth - projAB.x); vec3 ray = normalize(viewRay); ray = ray / ray.z; return linearDepth * ray; } mat3 CalcRMatrix(vec3 normal, vec2 texcoord){ ivec2 noiseScale = noiseScale_kernelSize.xy; vec3 rvec = texture(noise, texcoord * noiseScale).xyz; vec3 tangent = normalize(rvec - normal * dot(rvec, normal)); vec3 bitangent = cross(normal, tangent); return mat3(tangent, bitangent, normal); } void main(void){ vec2 TexCoord = pass_TexCoord; vec3 Position = CalcPosition(); vec3 Normal = normalize(texture(NormalMap, TexCoord).xyz); mat3 RotationMatrix = CalcRMatrix(Normal, TexCoord); int kernelSize = noiseScale_kernelSize.z; float occlusion = 0.0; for(int i = 0; i < kernelSize; i++){ // Get sample position vec3 sample = RotationMatrix * kernel[i]; sample = sample * RADIUS + Position; // Project and bias sample position to get its texture coordinates vec4 offset = projectionMatrix * vec4(sample, 1.0); offset.xy /= offset.w; offset.xy = offset.xy * 0.5 + 0.5; // Get sample depth float sample_depth = texture(DepthMap, offset.xy).r; float linearDepth = projAB.y / (sample_depth - projAB.x); if(abs(Position.z - linearDepth ) < RADIUS){ occlusion += (linearDepth <= sample.z) ? 1.0 : 0.0; } } out_AO = 1.0 - (occlusion / kernelSize); } I draw a full screen quad and pass Depth and Normal textures. Normals are in RGBA16F with the alpha channel reserved for the AO factor in the blur pass. I store depth in a non linear Depth buffer (32F) and recover the linear depth using: float linearDepth = projAB.y / (depth - projAB.x); where projAB.y is calculated as: and projAB.x as: These are derived from the glm::perspective(gluperspective) matrix. z_n and z_f are the near and far clip distance. As described in the link I posted on the top, the method creates samples in a hemisphere with higher distribution close to the center. It then uses random vectors from a texture to rotate the hemisphere randomly around the Z direction and finally orients it along the normal at the given pixel. Since the result is noisy, a blur pass follows the SSAO pass. Anyway, my position reconstruction doesn't seem to be wrong since I also tried doing the same but with the position passed from a texture instead of being reconstructed. I also tried playing with the Radius, noise texture size and number of samples and with different kinds of texture formats, with no luck. 
For some reason when changing the Radius, nothing changes. Does anyone have any suggestions? What could be going wrong?

    Read the article

  • Tips for communication between JS browser game and node.js server?

    - by Petteri Hietavirta
    I am tinkering around with a simple Canvas-based cave-flyer game and I would like to make it multiplayer eventually. The plan is to use Node.js on the server side. The data sent over would consist of the position of each player, direction, velocity and such. The player movements are simple force physics, so I should be able to extrapolate movements before the next update from the server. Any tips or best practices on the communications side? I guess web sockets are the way to go. Should I send information on every pass of the game loop or at specified intervals? Also, I don't mind if it doesn't work with older browsers.

    Read the article

  • What's the difference between Canvas and WebGL?

    - by gadr90
    I'm thinking about using CAAT as part of an HTML5 game engine. One of its features is the ability to render to Canvas and WebGL without changing anything in the client code. That is a good thing, but I haven't found a precise answer to this: what are the differences between those two technologies? I would especially like to know the differences between Canvas and WebGL in the following regards: Framerate Desktop browser support Mobile browser support Futureproofability (TM)

    Read the article

  • How to monetize and/or protect framework rights?

    - by Arthur Wulf White
    I made a game engine/framework for ActionScript 3 that allows very efficient and flexible level design for platformers, tower defense games, RPGs, RTS games and racing games. The algorithms I used are new and are not available in any other level editor I've seen. What are the best ways to benefit myself and others with my new framework? It is written for ActionScript 3, so unless I translate it to C# I'm guessing it will be decompiled and used by others. I want to have some license allowing me to share the framework and still benefit from it. Any advice would be appreciated. This issue has been on my mind a lot this year, and I am hoping to find a solution that will bring me some relief.

    Read the article

  • What is a good way to measure game virality?

    - by Chris Garrett
    I have added some social features to an iPhone game (Lexitect if you're curious), such as email, Twitter, and Facebook integration for sharing high scores. Along with these features, I am measuring how many times users make it to each step. The goal of these features is to make the game more viral, and I am trying to arrive at a measure of game virality. I would think that a game virality metric would produce a number centered on 1.0, where 1.0 = zero viral growth and 1.01 would represent 1% viral growth over some unit of time. How is virality normally measured, and in what units? How is time capped on the metric? I.e., if I gave each player a year to determine how many recommendations they make, I wouldn't get any real numbers for a year from the time I start tracking it. Are there any standards for tracking virality in a meaningful way?
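
    The usual starting point is the K-factor from viral marketing: K = invites sent per user times the conversion rate of those invites, measured over a fixed cycle length (say, the first 7 days after install) so the number doesn't stay open-ended for a year; K > 1 means self-sustaining growth. A tiny sketch of the bookkeeping, with made-up numbers:

        def k_factor(new_users, invites_sent, installs_from_invites):
            """Average invites per user times invite->install conversion rate."""
            invites_per_user = invites_sent / new_users
            conversion = installs_from_invites / invites_sent
            return invites_per_user * conversion

        def project_growth(seed_users, k, cycles):
            """Total users after n viral cycles, ignoring churn and overlap."""
            total, current = seed_users, seed_users
            for _ in range(cycles):
                current = current * k
                total += current
            return total

        # Example: 1000 players, 400 shares, 60 resulting installs in the 7-day window.
        k = k_factor(1000, 400, 60)        # 0.4 * 0.15 = 0.06
        print(k, project_growth(1000, k, 5))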

    Read the article

  • Sprites rendering blurry with velocity

    - by ashes999
    After adding velocity to my game, I feel like my textures are twitching. I thought it was just my eyes, until I finally captured it in a screenshot: The one on the left is what renders in my game; the one on the right is the original sprite, pasted over. (This is a screenshot from Photoshop, zoomed in 6x.) Notice the edges are aliasing -- it looks almost like sub-pixel rendering. In fact, if I had not forced my sprites (which have position and velocity as ints) to draw using integer values, I would swear that MonoGame is drawing with floating point values. But it isn't. What could be the cause of these things appearing blurry? It doesn't happen without velocity applied. To be precise, my SpriteComponent class has a Vector2 Position field. When I call Draw, I essentially use new Vector2((int)Math.Round(this.Position.X), (int)Math.Round(this.Position.Y)) for the position. I had a bug before where even stationary objects would jitter -- that was due to me using the straight Position vector and not rounding the values to ints. If I use Floor/Ceiling instead of round, the sprite sinks/hovers (one pixel difference either way) but still draws blurry.

    Read the article

  • Develop a 3D game for iPhone & Android from a single codebase

    - by lajpat
    I have to make a 3D game along the lines of this app: 3D Chess. I want to make this app for both Android and iPhone by writing a single codebase. Of course, a little native code will be required. I want to ensure that the entire logic and animation code is written only once. What software would one suggest to achieve this, since I am not on a tight schedule? I came across Corona, but I am not sure if such a game can be made using it. Others I found are Unity and Shiva. I am not experienced in 3D games, so I'd appreciate it if someone could help. Thanks, Lajpat

    Read the article

  • How does Starcraft 2 load its metadata?

    - by chobok
    Let's say you are playing a Starcraft 2 melee map. The game loads the map. Melee maps have the following dependencies: Liberty (Mod) and Liberty Multi (Mod). I think the game engine will load the data from Liberty (Mod) first, then from Liberty Multi (Mod). For data that exists in both dependencies, the engine will use the one from Liberty Multi (Mod). Is this correct? Liberty Multi (Mod) is updated with each patch of Starcraft 2. Does the game engine load just the latest version of Liberty Multi (Mod)? Or does the game engine load all the versions and overwrite duplicate data with the latest version?

    Read the article
