Search Results

Search found 38203 results on 1529 pages for 'library development'.


  • How does the AlphaBlend BlendState work in XNA when accumulating light into a RenderTarget?

    - by cubrman
    I am using a deferred rendering engine based on Catalin Zima's tutorial. His lighting shader returns the color of the light in the RGB channels and the specular component in the alpha channel. Here is how light gets accumulated:

        Game.GraphicsDevice.SetRenderTarget(LightRT);
        Game.GraphicsDevice.Clear(Color.Transparent);
        Game.GraphicsDevice.BlendState = BlendState.AlphaBlend;
        // Continuously draw 3d spheres with the lighting pixel shader.
        ...
        Game.GraphicsDevice.BlendState = BlendState.Opaque;

    MSDN states that the AlphaBlend field of the BlendState class uses the following formula for alpha blending: (source × Blend.SourceAlpha) + (destination × Blend.InvSourceAlpha), where "source" is the color of the pixel returned by the shader and "destination" is the color of the pixel in the render target. My question is: why are my colors accumulated correctly in the light render target even when the new pixels' alphas equal zero? As a quick sanity check I ran the following code in the light's pixel shader:

        float specularLight = 0;
        float4 light4 = attenuation * lightIntensity * float4(diffuseLight.rgb, specularLight);
        if (light4.a == 0) light4 = 0;
        return light4;

    This prevents lighting from being accumulated and, subsequently, drawn on the screen. But when I do the following:

        float specularLight = 0;
        float4 light4 = attenuation * lightIntensity * float4(diffuseLight.rgb, specularLight);
        return light4;

    the light is accumulated and drawn exactly where it needs to be. What am I missing? According to the formula above, (source × 0) + (destination × 1) should equal destination, so the LightRT render target must not change when I draw light spheres into it! It feels like the GPU is using additive blending instead: (source × Blend.One) + (destination × Blend.One).
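
    A detail worth checking here: in XNA 4.0, BlendState.AlphaBlend is a premultiplied-alpha state whose color source blend is Blend.One rather than Blend.SourceAlpha, which would let the source color through even at zero alpha and explain the additive-looking result. If pure additive accumulation is the intent, a custom state makes it explicit; a minimal sketch using the stock XNA 4.0 BlendState properties:

        // Explicitly additive accumulation: (source × 1) + (destination × 1).
        BlendState lightAccumulation = new BlendState
        {
            ColorSourceBlend = Blend.One,
            ColorDestinationBlend = Blend.One,
            AlphaSourceBlend = Blend.One,
            AlphaDestinationBlend = Blend.One
        };

        // Use it while the light spheres are drawn:
        Game.GraphicsDevice.BlendState = lightAccumulation;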

  • PyLucene in Python 2.6 + Mac OS Snow Leopard

    - by jbastos
    Greetings, I'm trying to install PyLucene on my 32-bit Python running on Snow Leopard. I compiled JCC successfully, but I get warnings while making PyLucene:

        ld: warning: in build/temp.macosx-10.6-i386-2.6/build/_lucene/__init__.o, file is not of required architecture
        ld: warning: in build/temp.macosx-10.6-i386-2.6/build/_lucene/__wrap01__.o, file is not of required architecture
        ld: warning: in build/temp.macosx-10.6-i386-2.6/build/_lucene/__wrap02__.o, file is not of required architecture
        ld: warning: in build/temp.macosx-10.6-i386-2.6/build/_lucene/__wrap03__.o, file is not of required architecture
        ld: warning: in build/temp.macosx-10.6-i386-2.6/build/_lucene/functions.o, file is not of required architecture
        ld: warning: in build/temp.macosx-10.6-i386-2.6/build/_lucene/JArray.o, file is not of required architecture
        ld: warning: in build/temp.macosx-10.6-i386-2.6/build/_lucene/JObject.o, file is not of required architecture
        ld: warning: in build/temp.macosx-10.6-i386-2.6/build/_lucene/lucene.o, file is not of required architecture
        ld: warning: in build/temp.macosx-10.6-i386-2.6/build/_lucene/types.o, file is not of required architecture
        ld: warning: in /Developer/SDKs/MacOSX10.4u.sdk/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/site-packages/JCC-2.3-py2.6-macosx-10.3-fat.egg/libjcc.dylib, file is not of required architecture
        ld: warning: in /Developer/SDKs/MacOSX10.4u.sdk/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/site-packages/JCC-2.3-py2.6-macosx-10.3-fat.egg/libjcc.dylib, file is not of required architecture
        build of complete

    Then I try to import lucene:

        MacBookPro:~/tmp/trunk python
        Python 2.6.3 (r263:75184, Oct 2 2009, 07:56:03)
        [GCC 4.0.1 (Apple Inc. build 5493)] on darwin
        Type "help", "copyright", "credits" or "license" for more information.
        >>> import pylucene
        Traceback (most recent call last):
          File "<stdin>", line 1, in <module>
        ImportError: No module named pylucene
        >>> import lucene
        Traceback (most recent call last):
          File "<stdin>", line 1, in <module>
          File "/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/site-packages/lucene-2.9.0-py2.6-macosx-10.6-i386.egg/lucene/__init__.py", line 7, in <module>
            import _lucene
        ImportError: dlopen(/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/site-packages/lucene-2.9.0-py2.6-macosx-10.6-i386.egg/lucene/_lucene.so, 2): Symbol not found: __Z8getVMEnvP7_object
          Referenced from: /Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/site-packages/lucene-2.9.0-py2.6-macosx-10.6-i386.egg/lucene/_lucene.so
          Expected in: flat namespace
         in /Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/site-packages/lucene-2.9.0-py2.6-macosx-10.6-i386.egg/lucene/_lucene.so
        >>>

    Any hints?

  • Importing a Windows project into Android using cocos2d-x

    - by Ef Es
    What I am trying to do today is import a full project to Android, but I have not seen any tutorials available for that. My approach was to create a new Android project, copy all the classes and resources into the folders and call ./build_native.sh, but I get an error because most of the files are not being included in the project. I tried opening the Android.mk and I can see why:

        LOCAL_SRC_FILES := AppDelegate.cpp \
                           HelloWorldScene.cpp

    are the only files linked. Should I manually modify the makefile, or can it be automated in some way I don't know about? Thank you. UPDATE: I manually added all files and headers to the makefile, and now I get errors linking the Box2D and CocosDenshion libraries.

  • Send less Server Data with "AFK"

    - by Oliver Schöning
    I am working on a 2D (realtime) multiplayer game with Construct2 and a Socket.IO JavaScript server. Right now the code does not include the array for each player.

        var io = require("socket.io").listen(80);
        var x = 10;
        io.sockets.on("connection", function (socket) {
            socket.on("message", function(data) {
                x = x + 1;
            });
        });
        setInterval(function() {
            io.sockets.emit("message", 'Pos,' + x);
        }, 100);

    I noticed a very annoying problem with my server today: it sends my X coordinate every 100 milliseconds. When I went to another browser tab, the browser stopped the game from running, and when I came back, I think the game had to work through all the queued packets, because my offline debugging button still worked immediately while the online button only responded after some seconds. So I changed my code so that it would only send out an update when the server receives player input:

        var io = require("socket.io").listen(80);
        var x = 10;
        io.sockets.on("connection", function (socket) {
            socket.on("message", function(data) {
                x = x + 1;
                io.sockets.emit("message", 'Pos,' + x);
            });
        });

    Now it updates immediately, even when I have been inactive on the browser tab for a long time, confirming my suspicion that it had to get through all the data. Confirm, please! It would be insane to only send information on client input in a realtime game, but how would I write an AFK function? I would think it is easier to run an AFK boolean loop on the server. Here is what I need help with:

        playerArray[Me]
        if ( /* not given any input for X amount of seconds */ ) {
            // don't send data
        } else {
            // send data
        }

  • Strange JavaScript error when using Kongregate's API

    - by Phil
    In the hope of finding a fellow Unity3D developer also aiming for the Kongregate contest: I've implemented the Kongregate API and can see that the game receives a call with my username and presents it in-game. I'm using

        Application.ExternalCall("kongregate.stats.submit", type, amount);

    where type is a string ("Best Score") and amount is an int (1000 or something). This is the error I'm getting:

        You are trying to call recursively into the Flash Player which is not allowed. In most cases the JavaScript setTimeout function, can be used as a workaround. callASFunction:function(a,b){if(FABrid...tion, can be used as a workaround.");

    I'm wondering, has anyone else had this error, or am I somehow doing something stupid? Thanks!
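
    Since the error itself points at setTimeout, one workaround sketch is to defer the submit on the JavaScript side so the Flash player is not re-entered synchronously. Application.ExternalEval is the stock Unity API; the deferred call below is an assumption about Kongregate's JS object:

        // Defer the stats call by one JS tick instead of calling into it directly.
        Application.ExternalEval(
            "setTimeout(function() { kongregate.stats.submit('Best Score', 1000); }, 0);");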

  • XNA: calculating normals for line segments

    - by Gerhman
    I am quite new to 3D graphics programming and thus far only understand that normals somehow define the direction in which a vertex faces, and therefore the direction in which light is reflected. I have no idea how they are calculated though, only that they are defined by a Vector3. For a visualizer that I am creating I am importing a bunch of coordinates which represent layer upon layer of line segments. At the moment I am only using a vertex buffer, adding the start and end point of each line and then rendering a line list. The thing is that I now need to calculate the normals for the vertices of these line segments so that I can get some realistic lighting. I have no idea how to calculate these normals, but I know they all face sideways and not up or down. All I have to calculate them are the start and end positions of each line segment. The image below is a representation of what I think I need to do in the case of an example layer: the red arrows represent the normals that should be calculated, the blue text represents the coordinates of the vertices and the green numbers represent their indices. I would greatly appreciate it if someone could explain to me how I should calculate these normals.
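
    For what it's worth, a minimal sketch of the usual construction, assuming each layer lies in the XZ plane (Y up) so that a segment's normal is just its in-plane perpendicular (negate the result to face the other side):

        // Hypothetical helper: sideways normal for one line segment.
        static Vector3 SegmentNormal(Vector3 start, Vector3 end)
        {
            Vector3 d = end - start;                // segment direction
            Vector3 n = new Vector3(-d.Z, 0f, d.X); // rotated 90 degrees in the XZ plane
            n.Normalize();                          // unit length, as lighting expects
            return n;                               // assign to both of the segment's vertices
        }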

  • Joystick example problem for Android 2D

    - by iQue
    I've searched all over the web for an answer to this, and there are similar topics, but nothing works for me and I have no idea why. I just want to move my sprite using a joystick. Since I'm useless at math when it comes to angles etc., I used an example; I'll post the code here:

        public float initx = 50;  // og 425;
        public float inity = 300; // og 267;
        public Point _touchingPoint = new Point(50, 300); // og (425, 267);
        public Point _pointerPosition = new Point(100, 170);
        private Boolean _dragging = false;
        private MotionEvent lastEvent;

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            if (event == null && lastEvent == null) {
                return _dragging;
            } else if (event == null && lastEvent != null) {
                event = lastEvent;
            } else {
                lastEvent = event;
            }

            // drag drop
            if (event.getAction() == MotionEvent.ACTION_DOWN) {
                _dragging = true;
            } else if (event.getAction() == MotionEvent.ACTION_UP) {
                _dragging = false;
            }

            if (_dragging) {
                // get the pos
                _touchingPoint.x = (int) event.getX();
                _touchingPoint.y = (int) event.getY();

                // bound to a box
                if (_touchingPoint.x < 25)  { _touchingPoint.x = 25; }  // og 400
                if (_touchingPoint.x > 75)  { _touchingPoint.x = 75; }  // og 450
                if (_touchingPoint.y < 275) { _touchingPoint.y = 275; } // og 240
                if (_touchingPoint.y > 325) { _touchingPoint.y = 325; } // og 290

                // get the angle
                double angle = Math.atan2(_touchingPoint.y - inity,
                        _touchingPoint.x - initx) / (Math.PI / 180);

                // Move the beetle in proportion to how far
                // the joystick is dragged from its center
                _pointerPosition.y += Math.sin(angle * (Math.PI / 180))
                        * (_touchingPoint.x / 70);
                _pointerPosition.x += Math.cos(angle * (Math.PI / 180))
                        * (_touchingPoint.x / 70);

                // stop the sprite from going through
                if (_pointerPosition.x + happy.getWidth() >= getWidth()) {
                    _pointerPosition.x = getWidth() - happy.getWidth();
                }
                if (_pointerPosition.x < 0) { _pointerPosition.x = 0; }
                if (_pointerPosition.y + happy.getHeight() >= getHeight()) {
                    _pointerPosition.y = getHeight() - happy.getHeight();
                }
                if (_pointerPosition.y < 0) { _pointerPosition.y = 0; }
            }
            return _dragging;
        }

        public void render(Canvas canvas) {
            canvas.drawColor(Color.BLUE);
            canvas.drawBitmap(joystick.get_joystickBg(), initx - 45, inity - 45, null);
            canvas.drawBitmap(happy, _pointerPosition.x, _pointerPosition.y, null);
            canvas.drawBitmap(joystick.get_joystick(),
                    _touchingPoint.x - 26, _touchingPoint.y - 26, null);
        }

        public void update() {
            this.onTouchEvent(null);
        }

    (og = original position.) As you can see I'm trying to move the joystick, but when I do, it stops working correctly. I mean, it still works like a joystick, but the sprite doesn't move accordingly: if I for example push the joystick down, the sprite moves up, and if I push it up, it moves left. Can anyone please help me? I've been stuck here for so long and it's really frustrating.

  • I'm using OpenAL, trying to load a .ogg file and having .dll troubles

    - by Brendan Webster
    I'm using OpenAL for my game's music. It loads .wav files by default, but to load Ogg files I had to download and set up a few .dlls and .lib files. I have fixed all the DLL errors except this one: I need vorbis.dll, and it says it's missing vorbis_window. I just can't find a vorbis.dll anywhere online that includes vorbis_window. Does anyone have suggestions on how I should fix this problem with my DLL?

  • Tangent basis calculation problem

    - by Kirill Daybov
    I have a problem with seams when calculating the tangent basis in my application. I'm using what seems to be the right algorithm, but it gives wrong results along the seams. What am I doing wrong? Is there a problem with the algorithm, or with the model? The designer says that our models with our normal maps are rendered correctly by the Xoliul Shader plugin in 3ds Max, so there should be a way to calculate a correct tangent basis programmatically. Here's an example of the problem I'm talking about. Steps I've already taken:
    - Tried a different algorithm (from Gamasutra; I can't post the link because I don't have enough reputation yet). I got wrong, much worse, results.
    - Tried to average the basis vectors for vertices that are used in multiple faces.
    - Tried to average the basis vectors for vertices that have the same world coordinates (this would obviously be a wrong solution, but I tried it anyway).
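
    For reference, a minimal per-triangle sketch of the standard UV-gradient construction (XNA-style Vector2/Vector3 types are an assumption here; note that seam vertices whose UVs diverge generally need to be duplicated rather than averaged):

        // Hypothetical inputs: triangle positions p0..p2 and texture coords uv0..uv2.
        static void ComputeTangentBasis(
            Vector3 p0, Vector3 p1, Vector3 p2,
            Vector2 uv0, Vector2 uv1, Vector2 uv2,
            out Vector3 tangent, out Vector3 bitangent)
        {
            Vector3 e1 = p1 - p0, e2 = p2 - p0;         // position edges
            Vector2 d1 = uv1 - uv0, d2 = uv2 - uv0;     // UV edges
            float r = 1f / (d1.X * d2.Y - d1.Y * d2.X); // inverse UV-area determinant
            tangent   = (e1 * d2.Y - e2 * d1.Y) * r;
            bitangent = (e2 * d1.X - e1 * d2.X) * r;
            tangent.Normalize();
            bitangent.Normalize();
        }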

  • Distance between two 3D objects' faces

    - by Arthur Gibraltar
    I'm really a newbie at programming and I'm making some tests. I couldn't find anywhere on the Internet how I could calculate the distance between two 3D objects' faces. Is there any way? To give some detail: as an example, I have two 3D cubes. Each one has a Vector3 position designating its center in 3D space, and an orientation matrix. And each cube has a size (float width, float height and float length). I could get a simple distance between them by calling Vector3.Distance(), but that doesn't consider their sizes, just the positions, so the distance would be between their centers. Is there any way to calculate the distance between the faces? Thanks for any reply.
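
    Ignoring orientation for a moment (axis-aligned boxes only), a sketch of the face-to-face gap: take the center-to-center offset along each axis, subtract the half-sizes, clamp at zero, and combine. Oriented boxes need heavier machinery (separating axes or closest-point queries such as GJK) instead:

        // Sketch for axis-aligned boxes only; size holds full width/height/length.
        static float AabbGap(Vector3 centerA, Vector3 sizeA, Vector3 centerB, Vector3 sizeB)
        {
            Vector3 d = centerB - centerA;
            float gx = Math.Max(0f, Math.Abs(d.X) - (sizeA.X + sizeB.X) * 0.5f);
            float gy = Math.Max(0f, Math.Abs(d.Y) - (sizeA.Y + sizeB.Y) * 0.5f);
            float gz = Math.Max(0f, Math.Abs(d.Z) - (sizeA.Z + sizeB.Z) * 0.5f);
            return (float)Math.Sqrt(gx * gx + gy * gy + gz * gz); // 0 when touching/overlapping
        }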

  • libgdx - removing the circle outline rendered on Box2d CircleShape

    - by Brett
    How can I remove the outline on the CircleShape below?

        CircleShape circle = new CircleShape();
        circle.setRadius(1f);
        ...

    using

        batch.draw(textureRegion, position.x - 1, position.y - 1, 1f, 1f, 2, 2, 1, 1, angle);

    I use this to set the body for a Box2D collision, but I get a silly circle shape around my texture in libGDX; i.e., my textured sprite (ball) has a circle over the top of it, with a line running from the center along the radius. Any ideas on how to remove the overlying circle lines?

  • Collisions between moving ball and polygons

    - by miguelSantirso
    I know this is a very typical problem and that there are a lot of similar questions, but I have been looking for a while and I have not found anything that fits what I want. I am developing a 2D game in which I need to perform collisions between a ball and simple polygons. The polygons are defined as arrays of vertices. I have implemented collisions with the bounding boxes of the polygons (that was easy) and I need to refine that collision in the cases where the ball collides with the bounding box. The ball can move quite fast and the polygons are not too big, so I need to perform continuous collision detection. I am looking for a method that allows me to detect whether the ball collides with a polygon and, at the same time, calculate the new direction for the ball after bouncing off the polygon. (I am using XNA, in case that helps.)
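
    One building block for this, sketched with XNA types: the closest point on a polygon edge to the ball's center gives both a hit test (distance less than the radius) and a contact normal to reflect the velocity with. This is the discrete version; a fast-moving ball additionally needs its per-frame movement swept against each edge:

        // Closest point to c on segment ab (projection clamped to the segment; assumes a != b).
        static Vector2 ClosestOnSegment(Vector2 a, Vector2 b, Vector2 c)
        {
            Vector2 ab = b - a;
            float t = Vector2.Dot(c - a, ab) / ab.LengthSquared();
            return a + ab * MathHelper.Clamp(t, 0f, 1f);
        }

        // If the edge is hit, reflect the velocity about the contact normal.
        static bool BounceOffEdge(Vector2 a, Vector2 b, Vector2 center, float radius,
                                  ref Vector2 velocity)
        {
            Vector2 p = ClosestOnSegment(a, b, center);
            Vector2 n = center - p;
            if (n.LengthSquared() > radius * radius) return false; // no contact
            n.Normalize();
            velocity = Vector2.Reflect(velocity, n); // XNA's built-in reflection
            return true;
        }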

  • Cocos2d and a body with a few collision shapes using Chipmunk

    - by Eimantas
    I'm using Cocos2d (0.99.5) with the Chipmunk physics engine. Currently I'm trying to place a body into the space which is combined from a few circle shapes. Let's say I have a corresponding sprite image which displays an atom (a nucleus plus 3 electrons around it; something like this, without the orbit lines). In its simplest form, only one circle shape at the center should be enough, which would detect collisions of other objects with the nucleus. Now I'd like to add further circle shapes, one for each electron. How can I do that? Currently, when I add those shapes to the body and add the body into the Chipmunk space, the shapes (together with the body/sprite) start flickering and spinning with no recognizable pattern (or reason, for that matter).

  • Calculating the position of an object with regard to the current position using OpenGL-like matrices

    - by spartan2417
    I have a first-person camera that collides with walls, and I also have a small sphere in front of the camera, at the camera position plus a distance ahead. I cannot get the position of the sphere, but I have the position of my camera; I need to find the position of that point, or at the very least a way of calculating it from the camera position. Code:

        static Float P_z = 0;
        P_z = -15;
        PushMatrix();
        LoadMatrix(&Inv);
        Material(SCEGU_AMBIENT, 0x00000066);
        TranslateXYZ(0, 0, P_z);
        ScaleXYZ(0.1f, 0.1f, 0.1f);
        pointer.Render();
        PopMatrix();

    where Inv holds the camera position (Inv.w.x, Inv.w.z) and pointer is the sphere.
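
    In matrix terms, the point's world position is the camera position plus the camera's forward axis scaled by the offset (15 units here, since the sphere is translated by -15 along camera-space Z). A sketch of the same idea with XNA-style matrices (an analogy, not the GU API used above), assuming Inv is the camera's world transform, i.e. the inverse of the view matrix:

        static Vector3 SpherePosition(Matrix cameraWorld, float distance)
        {
            // Translation is the camera position; Forward is the camera's -Z axis.
            return cameraWorld.Translation + cameraWorld.Forward * distance;
        }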

  • Per-pixel displacement mapping GLSL

    - by Chris
    I'm trying to implement a per-pixel displacement shader in GLSL. I read through several papers and "tutorials" I found, and ended up trying to implement the approach NVIDIA used in their Cascades demo (http://www.slideshare.net/icastano/cascades-demo-secrets), starting at slide 82. At the moment I am completely stuck on the following problem: when I am far away, the displacement seems to work, but the closer I move to the surface, the more the texture gets bent along the x-axis, and there seems to be a slight general bend in one direction. EDIT: I added a video: click. I added some screens to illustrate the problem. Well, I have tried lots of things already and I am starting to get a bit frustrated as my ideas run out. I added my full VS and FS code.

    VS:

        #version 400

        layout(location = 0) in vec3 IN_VS_Position;
        layout(location = 1) in vec3 IN_VS_Normal;
        layout(location = 2) in vec2 IN_VS_Texcoord;
        layout(location = 3) in vec3 IN_VS_Tangent;
        layout(location = 4) in vec3 IN_VS_BiTangent;

        uniform vec3 uLightPos;
        uniform vec3 uCameraDirection;
        uniform mat4 uViewProjection;
        uniform mat4 uModel;
        uniform mat4 uView;
        uniform mat3 uNormalMatrix;

        out vec2 IN_FS_Texcoord;
        out vec3 IN_FS_CameraDir_Tangent;
        out vec3 IN_FS_LightDir_Tangent;

        void main( void )
        {
            IN_FS_Texcoord = IN_VS_Texcoord;

            vec4 posObject      = uModel * vec4(IN_VS_Position, 1.0);
            vec3 normalObject   = (uModel * vec4(IN_VS_Normal, 0.0)).xyz;
            vec3 tangentObject  = (uModel * vec4(IN_VS_Tangent, 0.0)).xyz;
            //vec3 binormalObject = (uModel * vec4(IN_VS_BiTangent, 0.0)).xyz;
            vec3 binormalObject = normalize(cross(tangentObject, normalObject));

            // uCameraDirection is the camera position, just badly named
            vec3 fvViewDirection  = normalize( uCameraDirection - posObject.xyz);
            vec3 fvLightDirection = normalize( uLightPos.xyz - posObject.xyz );

            IN_FS_CameraDir_Tangent.x = dot( tangentObject, fvViewDirection );
            IN_FS_CameraDir_Tangent.y = dot( binormalObject, fvViewDirection );
            IN_FS_CameraDir_Tangent.z = dot( normalObject, fvViewDirection );

            IN_FS_LightDir_Tangent.x = dot( tangentObject, fvLightDirection );
            IN_FS_LightDir_Tangent.y = dot( binormalObject, fvLightDirection );
            IN_FS_LightDir_Tangent.z = dot( normalObject, fvLightDirection );

            gl_Position = (uViewProjection*uModel) * vec4(IN_VS_Position, 1.0);
        }

    The VS just builds the TBN matrix from the incoming normal, tangent and binormal in world space, calculates the light and eye directions in world space, and finally transforms the light and eye directions into tangent space.

    FS:

        #version 400

        // uniforms
        uniform Light {
            vec4 fvDiffuse;
            vec4 fvAmbient;
            vec4 fvSpecular;
        };

        uniform Material {
            vec4 diffuse;
            vec4 ambient;
            vec4 specular;
            vec4 emissive;
            float fSpecularPower;
            float shininessStrength;
        };

        uniform sampler2D colorSampler;
        uniform sampler2D normalMapSampler;
        uniform sampler2D heightMapSampler;

        in vec2 IN_FS_Texcoord;
        in vec3 IN_FS_CameraDir_Tangent;
        in vec3 IN_FS_LightDir_Tangent;

        out vec4 color;

        vec2 TraceRay(in float height, in vec2 coords, in vec3 dir, in float mipmap){
            vec2 NewCoords = coords;
            vec2 dUV = - dir.xy * height * 0.08;
            float SearchHeight = 1.0;
            float prev_hits = 0.0;
            float hit_h = 0.0;

            for(int i=0;i<10;i++){
                SearchHeight -= 0.1;
                NewCoords += dUV;
                float CurrentHeight = textureLod(heightMapSampler,NewCoords.xy, mipmap).r;
                float first_hit = clamp((CurrentHeight - SearchHeight - prev_hits) * 499999.0,0.0,1.0);
                hit_h += first_hit * SearchHeight;
                prev_hits += first_hit;
            }

            NewCoords = coords + dUV * (1.0-hit_h) * 10.0f - dUV;

            vec2 Temp = NewCoords;
            SearchHeight = hit_h+0.1;
            float Start = SearchHeight;
            dUV *= 0.2;
            prev_hits = 0.0;
            hit_h = 0.0;
            for(int i=0;i<5;i++){
                SearchHeight -= 0.02;
                NewCoords += dUV;
                float CurrentHeight = textureLod(heightMapSampler,NewCoords.xy, mipmap).r;
                float first_hit = clamp((CurrentHeight - SearchHeight - prev_hits) * 499999.0,0.0,1.0);
                hit_h += first_hit * SearchHeight;
                prev_hits += first_hit;
            }

            NewCoords = Temp + dUV * (Start - hit_h) * 50.0f;
            return NewCoords;
        }

        void main( void )
        {
            vec3 fvLightDirection = normalize( IN_FS_LightDir_Tangent );
            vec3 fvViewDirection  = normalize( IN_FS_CameraDir_Tangent );

            float mipmap = 0;
            vec2 NewCoord = TraceRay(0.1,IN_FS_Texcoord,fvViewDirection,mipmap);
            //vec2 ddx = dFdx(NewCoord);
            //vec2 ddy = dFdy(NewCoord);

            vec3 BumpMapNormal = textureLod(normalMapSampler, NewCoord.xy, mipmap).xyz;
            BumpMapNormal = normalize(2.0 * BumpMapNormal - vec3(1.0, 1.0, 1.0));

            vec3 fvNormal = BumpMapNormal;
            float fNDotL = dot( fvNormal, fvLightDirection );

            vec3 fvReflection = normalize( ( ( 2.0 * fvNormal ) * fNDotL ) - fvLightDirection );
            float fRDotV = max( 0.0, dot( fvReflection, fvViewDirection ) );

            vec4 fvBaseColor = textureLod( colorSampler, NewCoord.xy,mipmap);

            vec4 fvTotalAmbient  = fvAmbient * fvBaseColor;
            vec4 fvTotalDiffuse  = fvDiffuse * fNDotL * fvBaseColor;
            vec4 fvTotalSpecular = fvSpecular * ( pow( fRDotV, fSpecularPower ) );

            color = ( fvTotalAmbient + (fvTotalDiffuse + fvTotalSpecular) );
        }

    The FS implements the displacement technique in the TraceRay method, always using mipmap level 0. Most of the code is from the NVIDIA sample and another paper I found on the web, so I guess there cannot be much wrong here. At the end it uses the modified UV coordinates to fetch the displaced normal from the normal map and the color from the color map. I'm looking forward to some ideas. Thanks in advance!

    Edit: Here is the code loading the heightmap:

        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, mWidth, mHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, mImageData);
        glGenerateMipmap(GL_TEXTURE_2D);
        //glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR_MIPMAP_LINEAR);
        //glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
        //glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
        //glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

    Maybe something is wrong in here?

  • Rendering different materials in a voxel terrain

    - by MaelmDev
    Each voxel data point in my terrain model is made up of two properties: density and material type. Each is stored as an unsigned integer value (but the density is interpreted as a decimal value between 0 and 1). My current idea for rendering these different materials on the terrain mesh is to store eleven extra attributes in each vertex: six material values corresponding to the materials of the voxels that the vertices lie between, three decimal values that correspond to the interpolation each vertex has between each voxel, and two decimal values that are used to determine where the fragment lies on the triangle. The material and interpolation attributes are exactly the same for each vertex in the triangle. The fragment shader samples each texture that corresponds to each material and then uses the aforementioned pair of decimal values to interpolate between these samples and obtain the final textured color of the fragment. It should work fine, but it seems like a big memory hog: I won't be able to reuse vertices in the mesh with indexing, and each vertex will have a lot of data associated with it. It also seems pretty slow. What are some ways to improve or replace this technique for drawing materials on a voxel terrain mesh?

  • Sound not playing on Windows XP - SoundEffect or Song: MonoGame

    - by ashes999
    I'm trying to integrate sound into my MonoGame game. I don't have the content pipeline hack -- just straight MonoGame (beta 3) at this point. (I tried adding the content pipeline, but ran into some issues.) I added a .wav file to my /Content directory, and I can create and instantiate both SoundEffect and Song classes. However, both show durations of 00:00:00 (on a ten-second-long file), and neither plays. I can call LoadContent without any issue, but when I call Play, nothing plays. I've tried a couple of different sounds and different formats (MP3 and WAV) to rule that out; only WAV even loads without crashing, but it doesn't play. There seems to be a GitHub issue that fixes this problem in 2.5.1. Downgrading to 2.5.1 doesn't fix the problem; it seems like it's fixed in 3.0 (_data is set in the SoundEffect instance). This issue only occurs on Windows XP. I tested it on a Windows 7 laptop, and the sound plays fine.
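
    For reference, a minimal load-and-play sketch that bypasses the content pipeline entirely (SoundEffect.FromStream and Play are the stock XNA/MonoGame API; the file path is a made-up example, and the stream must be an uncompressed PCM .wav):

        using Microsoft.Xna.Framework.Audio;
        using System.IO;

        // Load a raw PCM .wav straight from disk, no pipeline involved.
        SoundEffect effect;
        using (var stream = File.OpenRead("Content/jump.wav")) // hypothetical file
        {
            effect = SoundEffect.FromStream(stream);
        }
        effect.Play(); // fire-and-forget; use CreateInstance() for pause/volume control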

  • How to create an executable JAR

    - by Siddharth
    When I try to create a JAR file I get the following error; could someone please help me out?

        Exception in thread "LWJGL Application" com.badlogic.gdx.utils.GdxRuntimeException: Couldn't load file: img/black_ring.png
            at com.badlogic.gdx.graphics.Pixmap.<init>(Pixmap.java:137)
            at com.badlogic.gdx.graphics.glutils.FileTextureData.prepare(FileTextureData.java:55)
            at com.badlogic.gdx.graphics.Texture.load(Texture.java:175)
            at com.badlogic.gdx.graphics.Texture.create(Texture.java:159)
            at com.badlogic.gdx.graphics.Texture.<init>(Texture.java:133)
            at com.badlogic.gdx.graphics.Texture.<init>(Texture.java:122)
            at com.badlogic.runningball.UserBall.<init>(UserBall.java:19)
            at com.badlogic.runningball.GameScreen.<init>(GameScreen.java:25)
            at com.badlogic.runningball.RunningBall.create(RunningBall.java:12)
            at com.badlogic.gdx.backends.lwjgl.LwjglApplication.mainLoop(LwjglApplication.java:126)
            at com.badlogic.gdx.backends.lwjgl.LwjglApplication$1.run(LwjglApplication.java:113)
        Caused by: com.badlogic.gdx.utils.GdxRuntimeException: File not found: img/black_ring.png (Internal)
            at com.badlogic.gdx.files.FileHandle.read(FileHandle.java:108)
            at com.badlogic.gdx.files.FileHandle.length(FileHandle.java:364)
            at com.badlogic.gdx.files.FileHandle.readBytes(FileHandle.java:156)
            at com.badlogic.gdx.graphics.Pixmap.<init>(Pixmap.java:134)
            ... 10 more

  • When does depth testing happen?

    - by Utkarsh Sinha
    I'm working with 2D sprites, and I want to do 3D-style depth testing with them. When writing a pixel shader for them, I get access to the semantic DEPTH0. Would writing to this value help? It seems it doesn't. Maybe it's done before the pixel shader step? Or is depth testing only done when drawing 3D things (I'm using SpriteBatch)? Any links/articles/topics to read/search for would be appreciated.
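
    A hedged sketch of the SpriteBatch route: the depth test is a device state, so it only applies if the batch is begun with a depth-enabled DepthStencilState and the sprites carry distinct layerDepth values (standard XNA 4.0 API; the texture, positions and depths below are placeholders):

        // Depth-test sprites by sorting front-to-back with the depth buffer enabled.
        spriteBatch.Begin(SpriteSortMode.FrontToBack, BlendState.AlphaBlend,
                          null, DepthStencilState.Default, null);
        spriteBatch.Draw(texture, new Vector2(10, 10), null, Color.White,
                         0f, Vector2.Zero, 1f, SpriteEffects.None, 0.3f); // layerDepth 0..1
        spriteBatch.Draw(texture, new Vector2(20, 20), null, Color.White,
                         0f, Vector2.Zero, 1f, SpriteEffects.None, 0.7f);
        spriteBatch.End();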

  • Where should I organize my matrices in a 3D game engine?

    - by Need4Sleep
    I'm working with a group of people from around the world to create a game engine (and hopefully a game with it) within the next few years. My first task was writing a camera class for the engine, used to add cameras to the scene and to position and follow points in the scene. The problem I have is with using matrices for transformations in the class: should I keep matrices separate in each class, such as having the model matrix in the model class and the camera matrix in the camera class, or should all matrices be placed in one class/chunk? I can see pros and cons for each method, but I wanted to hear some input from a more professional standpoint.
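
    As one data point, a minimal sketch of the "keep each matrix with the class that owns it" option, composing world, view and projection in a single place at draw time (XNA-style Matrix calls; the class shapes are hypothetical):

        // Each object owns only the matrix it can compute locally.
        class Camera
        {
            public Vector3 Position, Target;
            public Matrix View => Matrix.CreateLookAt(Position, Target, Vector3.Up);
            public Matrix Projection =>
                Matrix.CreatePerspectiveFieldOfView(MathHelper.PiOver4, 16f / 9f, 0.1f, 1000f);
        }

        class ModelInstance
        {
            public Vector3 Position;
            public Matrix World => Matrix.CreateTranslation(Position);
        }

        // Composition happens in one place, at render time:
        // effect.Parameters["WorldViewProj"].SetValue(model.World * camera.View * camera.Projection);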

  • What's the most efficient way to find barycentric coordinates?

    - by bobobobo
    In my profiler, finding barycentric coordinates is apparently somewhat of a bottleneck, and I am looking to make it more efficient. It follows the method in Shirley, where you compute the areas of the triangles formed by embedding the point P inside the triangle. Code:

        Vector Triangle::getBarycentricCoordinatesAt( const Vector & P ) const
        {
            Vector bary ;

            // The area of a triangle is
            real areaABC = DOT( normal, CROSS( (b - a), (c - a) ) ) ;
            real areaPBC = DOT( normal, CROSS( (b - P), (c - P) ) ) ;
            real areaPCA = DOT( normal, CROSS( (c - P), (a - P) ) ) ;

            bary.x = areaPBC / areaABC ; // alpha
            bary.y = areaPCA / areaABC ; // beta
            bary.z = 1.0f - bary.x - bary.y ; // gamma

            return bary ;
        }

    This method works, but I'm looking for a more efficient one!
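
    A common alternative, sketched here in C# with XNA-style types (translating to the Vector/real types above is mechanical): solve the 2x2 system with edge dot products, as in Ericson's Real-Time Collision Detection. It trades the three cross products for five dot products, and for a static triangle d00, d01, d11 and the inverse denominator can all be cached per triangle:

        // Barycentric coordinates (alpha, beta, gamma) of p in triangle (a, b, c).
        static Vector3 Barycentric(Vector3 a, Vector3 b, Vector3 c, Vector3 p)
        {
            Vector3 v0 = b - a, v1 = c - a, v2 = p - a;
            float d00 = Vector3.Dot(v0, v0);
            float d01 = Vector3.Dot(v0, v1);
            float d11 = Vector3.Dot(v1, v1);
            float d20 = Vector3.Dot(v2, v0);
            float d21 = Vector3.Dot(v2, v1);
            float invDenom = 1f / (d00 * d11 - d01 * d01); // cacheable per triangle
            float beta  = (d11 * d20 - d01 * d21) * invDenom;
            float gamma = (d00 * d21 - d01 * d20) * invDenom;
            return new Vector3(1f - beta - gamma, beta, gamma);
        }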

  • 2D Collision masks for handling slopes

    - by JiminyCricket
    I've been looking at the example at http://create.msdn.com/en-US/education/catalog/tutorial/collision_2d_perpixel and am trying to figure out how to adjust the sprite once a collision has been detected. As David suggested at "XNA 4.0 2D sidescroller variable terrain heightmap for walking/collision", I made a few sensor points (feet, sides, bottom center, etc.) and can easily detect when these points actually collide with non-transparent portions of a second texture (a simple slope). I'm having trouble with the algorithm for how I would actually adjust the sprite position based on a collision. Say I detect a collision with the slope at the sprite's right foot. How can I scan the slope texture data to find the Y position at which to place the sprite's foot so it is no longer inside the slope? The way the data is stored as a 1D array in the example is a bit confusing; should I try to store it as a 2D array instead?

    For test purposes, I'm thinking of just using the slope texture's alpha itself as a primitive and easy collision mask (no grass bits or anything besides a simple non-linear slope). Then, as in the example, I find the coordinates of any collisions between the slope texture and the sprite's sensors and mark these special sensor collisions as having occurred. Finally, in the case of moving up a slope, I would scan for the first transparent pixel above the right-foot collision point (in the texture's Y values at that X) and set that as the new height of the sprite.

    I'm also a little unclear on when I should make these adjustments. Collisions are checked on every game Update(), so should I quickly change the position of the sprite before the next Update is called? I also noticed several people mention that it's best to separate collision checks horizontally and vertically; why is that exactly? I'm open to any suggestions if this is an inefficient or inaccurate way of handling this. I wish MSDN had provided an example of something like this; I didn't know it would be so much more complex than NES-Mario-style pure box platforming!
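
    On the 1D-array question, a minimal sketch of the upward scan against the flat color array that GetData returns (row-major, so index = y * width + x; the names are hypothetical):

        // mask = new Color[texture.Width * texture.Height]; texture.GetData(mask);
        // Returns the y of the first transparent pixel at column x, scanning up
        // from startY, or -1 if the column is solid all the way to the top.
        static int FindSurfaceY(Color[] mask, int width, int x, int startY)
        {
            for (int y = startY; y >= 0; y--)
            {
                if (mask[y * width + x].A == 0) // transparent: the foot can rest just below
                    return y;
            }
            return -1;
        }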

  • Terrain square loading

    - by AndroidXTr3meN
    Games like Skyrim and Morrowind, if I'm correct, use quads or squares to divide the terrain. The player is always at #5:

        1 | 2 | 3
        4 | 5 | 6
        7 | 8 | 9

    So whenever you cross a border you unload and load the new areas. But if the user steps just over the edge and a second later goes back to the previous area, a lot of unnecessary loading and unloading is done. Is there a general approach to this? Because I don't think games like Skyrim have this issue. Cheers!
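
    The usual answer is hysteresis: re-center the 3×3 grid only once the player has moved some margin past the border, so stepping back and forth across the line never triggers a reload. A sketch (the cell size and margin are made-up numbers):

        const float CellSize = 512f;
        const float Margin = 32f; // must leave the old cell by this much before re-centering

        static Point currentCell; // cell the 3x3 grid is centered on

        static void UpdateStreaming(Vector2 playerPos)
        {
            // Player position relative to the center of the current cell.
            Vector2 center = new Vector2(
                (currentCell.X + 0.5f) * CellSize,
                (currentCell.Y + 0.5f) * CellSize);
            Vector2 d = playerPos - center;

            // Only re-center once we are Margin beyond the cell edge.
            if (Math.Abs(d.X) > CellSize / 2 + Margin || Math.Abs(d.Y) > CellSize / 2 + Margin)
            {
                currentCell = new Point(
                    (int)Math.Floor(playerPos.X / CellSize),
                    (int)Math.Floor(playerPos.Y / CellSize));
                // load the new 3x3 neighborhood, unload cells that left it
            }
        }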

  • Pixel alignment algorithm

    - by user42325
    I have a set of square blocks and I want to draw them in a window. I am sure the coordinate calculation is correct, but on the screen some squares' edges overlap others and some do not. I remember the problem is caused by pixel accuracy, and I remember there's a specific topic related to this kind of problem in 2D image rendering, but I don't remember what exactly it is or how to solve it. Look at this screenshot: each block should have a fixed-width margin, but in the image the vertical white lines have different widths, though the horizontal lines look fine.
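
    The usual fix, sketched below: derive both edges of every block from its grid index with a single rounding at the end, instead of accumulating a floating-point width per block, so rounding error cannot build up across columns and adjacent blocks always share an edge:

        // Per-block edges computed independently; block i's right edge is exactly
        // block i+1's left edge, so margins stay a constant pixel width.
        static void BlockBounds(int i, int j, float cellW, float cellH, int margin,
                                out int x0, out int y0, out int x1, out int y1)
        {
            x0 = (int)Math.Round(i * cellW) + margin;
            y0 = (int)Math.Round(j * cellH) + margin;
            x1 = (int)Math.Round((i + 1) * cellW) - margin;
            y1 = (int)Math.Round((j + 1) * cellH) - margin;
        }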

  • C# creating a simple snake game

    - by Guy David
    I was thinking about creating a snake game with C#, so I ran ideas through my head, and some problems came up. How can I track, and output at the correct locations, the blocks that follow the snake's head? If the snake is built of five blocks and the user starts going in a circle, how can I print the snake's body in the right locations? Also, how can I create an action that runs in the background and moves the snake forward, no matter what the user does? What structure should my code have (code design structure)? This should be a console application, since that's the only framework I am familiar with. I am not looking for finished code; I want to really understand how it should work.
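
    For the two mechanics asked about, a minimal sketch: the body as a queue of coordinates (enqueue the new head each tick, dequeue the tail, so the blocks trail the head automatically), plus a fixed-rate loop that advances the snake whether or not the user pressed anything (bounds and self-collision checks omitted):

        using System;
        using System.Collections.Generic;
        using System.Threading;

        class SnakeSketch
        {
            static void Main()
            {
                Console.CursorVisible = false;
                var body = new Queue<(int X, int Y)>(); // oldest = tail, newest = head
                var head = (X: 10, Y: 5);
                for (int i = 0; i < 5; i++) body.Enqueue(head); // 5-block snake, coiled up
                int dx = 1, dy = 0;

                while (true)
                {
                    if (Console.KeyAvailable) // non-blocking input poll
                    {
                        switch (Console.ReadKey(true).Key)
                        {
                            case ConsoleKey.UpArrow:    dx = 0;  dy = -1; break;
                            case ConsoleKey.DownArrow:  dx = 0;  dy = 1;  break;
                            case ConsoleKey.LeftArrow:  dx = -1; dy = 0;  break;
                            case ConsoleKey.RightArrow: dx = 1;  dy = 0;  break;
                        }
                    }

                    head = (head.X + dx, head.Y + dy);
                    body.Enqueue(head);        // grow at the head...
                    var tail = body.Dequeue(); // ...shrink at the tail: length stays constant

                    Console.SetCursorPosition(tail.X, tail.Y); Console.Write(' ');
                    Console.SetCursorPosition(head.X, head.Y); Console.Write('O');

                    Thread.Sleep(150); // the snake moves every tick, input or not
                }
            }
        }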
