Search Results

Search found 31839 results on 1274 pages for 'plugin development'.

Page 509/1274 | < Previous Page | 505 506 507 508 509 510 511 512 513 514 515 516  | Next Page >

  • How do I generate terrain like that of Scorched Earth?

    - by alex
    Hi, I'm a web developer and I am keen to start writing my own games. For familiarity, I've chosen JavaScript and the canvas element for now. I want to generate some terrain like that in Scorched Earth. My first attempt made me realise I couldn't just randomise the y value; there had to be some sanity in the peaks and troughs. I have Googled around a bit, but either I can't find something simple enough or I am using the wrong keywords. Can you please show me what sort of algorithm I would use to generate something like the example, keeping in mind that I am completely new to games programming (I haven't made anything since a Breakout clone in Visual Basic back in 2003)?
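
    A minimal sketch of one common answer is midpoint displacement: keep subdividing the height line and nudge each midpoint by a random amount that shrinks at every level, which is what keeps the peaks and troughs sane. The question targets JavaScript, but the algorithm is language-agnostic; the C# below is purely illustrative and all names and parameters are invented for the example.

        using System;

        // 1D midpoint-displacement terrain sketch (illustrative names only).
        // Each subdivision perturbs the midpoint by a shrinking random offset,
        // so neighbouring columns stay coherent instead of being pure noise.
        static class TerrainSketch
        {
            public static float[] Generate(int width, float roughness, int seed)
            {
                var rng = new Random(seed);
                var heights = new float[width];
                heights[0] = 0.5f;
                heights[width - 1] = 0.5f;
                Subdivide(heights, 0, width - 1, 0.5f, roughness, rng);
                return heights; // roughly 0..1; scale by the canvas height when drawing
            }

            static void Subdivide(float[] h, int left, int right, float displacement, float roughness, Random rng)
            {
                if (right - left < 2) return;
                int mid = (left + right) / 2;
                // Midpoint = average of the endpoints plus a random offset that shrinks per level.
                h[mid] = (h[left] + h[right]) / 2f + ((float)rng.NextDouble() * 2f - 1f) * displacement;
                h[mid] = Math.Max(0f, Math.Min(1f, h[mid]));
                Subdivide(h, left, mid, displacement * roughness, roughness, rng);
                Subdivide(h, mid, right, displacement * roughness, roughness, rng);
            }
        }

    With a roughness around 0.5 the result rolls gently; values closer to 1.0 give jagged, mountainous terrain.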

    Read the article

  • How to draw textures on a model

    - by marc wellman
    The following code is a complete XNA 3.1 program, almost unaltered from the code skeleton Visual Studio creates for a new project. The only things I have changed are: I imported a .x model into the content folder of the VS solution (the model is a simple square with a texture spanning it, made in Google SketchUp and exported with several .x exporters); in the LoadContent() method I load the .x model into the game; and the Draw() method uses a BasicEffect to render the model. Apart from these three things I haven't added any code. Why does the model not show the texture? What can I do to make the texture visible? This is the texture file (a standard SketchUp texture from the palette), and this is what my program looks like - as you can see: no texture! Below are the complete source code of the program and the complete .x file:

        namespace WindowsGame1
        {
            /// <summary>
            /// This is the main type for your game
            /// </summary>
            public class Game1 : Microsoft.Xna.Framework.Game
            {
                GraphicsDeviceManager graphics;
                SpriteBatch spriteBatch;

                public Game1()
                {
                    graphics = new GraphicsDeviceManager(this);
                    Content.RootDirectory = "Content";
                }

                /// <summary>
                /// Allows the game to perform any initialization it needs to before starting to run.
                /// This is where it can query for any required services and load any non-graphic
                /// related content. Calling base.Initialize will enumerate through any components
                /// and initialize them as well.
                /// </summary>
                protected override void Initialize()
                {
                    // TODO: Add your initialization logic here
                    base.Initialize();
                }

                Model newModel;

                /// <summary>
                /// LoadContent will be called once per game and is the place to load
                /// all of your content.
                /// </summary>
                protected override void LoadContent()
                {
                    // Create a new SpriteBatch, which can be used to draw textures.
                    spriteBatch = new SpriteBatch(GraphicsDevice);

                    // TODO: use this.Content to load your game content here
                    newModel = Content.Load<Model>(@"aau3d");
                    foreach (ModelMesh mesh in newModel.Meshes)
                    {
                        foreach (ModelMeshPart meshPart in mesh.MeshParts)
                        {
                            meshPart.Effect = new BasicEffect(this.GraphicsDevice, null);
                        }
                    }
                }

                /// <summary>
                /// UnloadContent will be called once per game and is the place to unload
                /// all content.
                /// </summary>
                protected override void UnloadContent()
                {
                    // TODO: Unload any non ContentManager content here
                }

                /// <summary>
                /// Allows the game to run logic such as updating the world,
                /// checking for collisions, gathering input, and playing audio.
                /// </summary>
                /// <param name="gameTime">Provides a snapshot of timing values.</param>
                protected override void Update(GameTime gameTime)
                {
                    // Allows the game to exit
                    if (GamePad.GetState(PlayerIndex.One).Buttons.Back == ButtonState.Pressed)
                        this.Exit();

                    // TODO: Add your update logic here
                    base.Update(gameTime);
                }

                /// <summary>
                /// This is called when the game should draw itself.
                /// </summary>
                /// <param name="gameTime">Provides a snapshot of timing values.</param>
                protected override void Draw(GameTime gameTime)
                {
                    if (newModel != null)
                    {
                        GraphicsDevice.Clear(Color.CornflowerBlue);

                        Matrix[] transforms = new Matrix[newModel.Bones.Count];
                        newModel.CopyAbsoluteBoneTransformsTo(transforms);

                        foreach (ModelMesh mesh in newModel.Meshes)
                        {
                            foreach (BasicEffect effect in mesh.Effects)
                            {
                                effect.EnableDefaultLighting();
                                effect.TextureEnabled = true;
                                effect.World = transforms[mesh.ParentBone.Index]
                                    * Matrix.CreateRotationY(0)
                                    * Matrix.CreateTranslation(new Vector3(0, 0, 0));
                                effect.View = Matrix.CreateLookAt(new Vector3(200, 1000, 200), Vector3.Zero, Vector3.Up);
                                effect.Projection = Matrix.CreatePerspectiveFieldOfView(MathHelper.ToRadians(45.0f), 0.75f, 1.0f, 10000.0f);
                            }
                            mesh.Draw();
                        }
                    }
                    base.Draw(gameTime);
                }
            }
        }

    This is the model I am using (.x):

        xof 0303txt 0032
        // SketchUp 6 -> DirectX (c)2008 edecadoudal, supports: faces, normals and textures
        Material Default_Material{
            1.0;1.0;1.0;1.0;;
            3.2;
            0.000000;0.000000;0.000000;;
            0.000000;0.000000;0.000000;;
        }
        Material _Groundcover_RiverRock_4inch_{
            0.568627450980392;0.494117647058824;0.427450980392157;1.0;;
            3.2;
            0.000000;0.000000;0.000000;;
            0.000000;0.000000;0.000000;;
            TextureFilename { "aau3d.xGroundcover_RiverRock_4inch.jpg"; }
        }
        Mesh mesh_0{
            4;
            -81.6535;0.0000;74.8031;,
            -0.0000;0.0000;0.0000;,
            -81.6535;0.0000;0.0000;,
            -0.0000;0.0000;74.8031;;
            2;
            3;0,1,2,
            3;1,0,3;;
            MeshMaterialList {
                2;
                2;
                1,
                1;
                { Default_Material }
                { _Groundcover_RiverRock_4inch_ }
            }
            MeshTextureCoords {
                4;
                -2.1168,-3.4022;
                1.0000,-0.0000;
                1.0000,-3.4022;
                -2.1168,-0.0000;;
            }
            MeshNormals {
                4;
                0.0000;1.0000;-0.0000;
                0.0000;1.0000;-0.0000;
                0.0000;1.0000;-0.0000;
                0.0000;1.0000;-0.0000;;
                2;
                3;0,1,2;
                3;1,0,3;;
            }
        }
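
    For what it's worth, the likely culprit is in LoadContent(): replacing each ModelMeshPart's pipeline-built effect with a bare new BasicEffect(GraphicsDevice, null) throws away the Texture the content pipeline assigned, so TextureEnabled = true has nothing to show. A minimal sketch of one fix, assuming the texture was imported alongside the model (the cast and the Texture/TextureEnabled properties are standard BasicEffect members):

        protected override void LoadContent()
        {
            spriteBatch = new SpriteBatch(GraphicsDevice);
            newModel = Content.Load<Model>(@"aau3d");

            // The content pipeline already attaches a BasicEffect (with its Texture set)
            // to every ModelMeshPart, so the simplest fix is to not replace it at all.
            // If a fresh BasicEffect really is needed, carry the old texture across:
            foreach (ModelMesh mesh in newModel.Meshes)
            {
                foreach (ModelMeshPart meshPart in mesh.MeshParts)
                {
                    BasicEffect oldEffect = meshPart.Effect as BasicEffect;
                    BasicEffect newEffect = new BasicEffect(GraphicsDevice, null);
                    if (oldEffect != null)
                    {
                        newEffect.Texture = oldEffect.Texture;   // keep the pipeline-loaded texture
                        newEffect.TextureEnabled = true;
                    }
                    meshPart.Effect = newEffect;
                }
            }
        }

    If the texture still does not show, it is worth checking that the image named in the .x file's TextureFilename sits next to aau3d.x in the content project, so the pipeline can find and build it.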

    Read the article

  • Constructive criticism on my linear sampling Gaussian blur

    - by Aequitas
    I've been attempting to implement a Gaussian blur that uses linear sampling. I've come across a few articles on the web and a question posed here which dealt with the topic, and I've now attempted to implement my own Gaussian function and pixel shader, drawing on those articles. This is how I'm currently calculating my weights and offsets:

        int support = int(sigma * 3.0);
        weights.push_back(exp(-(0*0)/(2*sigma*sigma))/(sqrt(2*pi)*sigma));
        total += weights.back();
        offsets.push_back(0);
        for (int i = 1; i <= support; i++)
        {
            float w1 = exp(-(i*i)/(2*sigma*sigma))/(sqrt(2*pi)*sigma);
            float w2 = exp(-((i+1)*(i+1))/(2*sigma*sigma))/(sqrt(2*pi)*sigma);
            weights.push_back(w1 + w2);
            total += 2.0f * weights[i];
            offsets.push_back(w1 / weights[i]);
        }
        for (int i = 0; i < support; i++)
        {
            weights[i] /= total;
        }

    Here is an example of my vertical pixel shader:

        vec3 acc = texture2D(tex_object, v_tex_coord.st).rgb * weights[0];
        vec2 pixel_size = vec2(1.0 / tex_size.x, 1.0 / tex_size.y);
        for (int i = 1; i < NUM_SAMPLES; i++)
        {
            acc += texture2D(tex_object, (v_tex_coord.st + (vec2(0.0, offsets[i]) * pixel_size))).rgb * weights[i];
            acc += texture2D(tex_object, (v_tex_coord.st - (vec2(0.0, offsets[i]) * pixel_size))).rgb * weights[i];
        }
        gl_FragColor = vec4(acc, 1.0);

    Am I taking the correct route with this? Any criticism or potential tips for improving my method would be much appreciated.
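
    For comparison, a minimal sketch of the usual linear-sampling derivation, written in C# purely to show the arithmetic: each pair of discrete taps i and i+1 is merged into one bilinear tap whose weight is w1 + w2 and whose offset is the weighted average of the two tap positions, (i*w1 + (i+1)*w2) / (w1 + w2). Everything below (names, the support heuristic) is illustrative rather than a reference implementation.

        using System;
        using System.Collections.Generic;

        // Linear-sampling Gaussian weights/offsets: two discrete taps are merged
        // into a single bilinear fetch placed between them.
        static class LinearGaussian
        {
            public static (List<float> weights, List<float> offsets) Build(float sigma)
            {
                float Gauss(float x) =>
                    (float)(Math.Exp(-(x * x) / (2 * sigma * sigma)) / (Math.Sqrt(2 * Math.PI) * sigma));

                var weights = new List<float> { Gauss(0) };
                var offsets = new List<float> { 0f };
                float total = weights[0];

                int support = (int)(sigma * 3.0f);
                for (int i = 1; i <= support; i += 2)   // step by 2: each merged tap covers taps i and i+1
                {
                    float w1 = Gauss(i);
                    float w2 = Gauss(i + 1);
                    weights.Add(w1 + w2);
                    offsets.Add((i * w1 + (i + 1) * w2) / (w1 + w2)); // weighted average position, in pixels
                    total += 2f * (w1 + w2);                          // each merged tap is sampled on both sides
                }

                for (int k = 0; k < weights.Count; k++)
                    weights[k] /= total;                              // normalise so the kernel sums to 1
                return (weights, offsets);
            }
        }

    Two things stand out when comparing this with the question's version: the loop there advances one tap at a time, so each discrete tap after the first is counted twice, and w1 / weights[i] gives a fraction between 0 and 1 rather than the merged tap's actual pixel position between i and i+1.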

    Read the article

  • Farseer: How can I crush Mario? [on hold]

    - by Homer_Simpson
    I want Mario to die when a box crushes him. My game has a level similar to this one from New Super Mario Bros. Wii: http://www.youtube.com/watch?v=tYyu6tFAa2M At the beginning of the level, you see some boxes falling to the ground. If a box crushes Mario, he dies. I want to do exactly the same in my game, but I don't know how to do that in Farseer. How can I do it in Farseer 3.3.1? Do you have any suggestions? I don't know how to do the collision detection. I use rectangles for the boxes and the ground in Farseer; Mario is a polygon.
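
    A hedged sketch of one common approach, which is mostly bookkeeping on top of the physics engine: treat Mario as crushed when he is in contact with the ground below and, at the same time, with a box above him that is still moving downward. The OnCollision/OnSeparation fixture callbacks exist in Farseer 3.3.x, but the UserData tags, the Y-down assumption and the velocity threshold below are invented for illustration.

        using FarseerPhysics.Dynamics;
        using FarseerPhysics.Dynamics.Contacts;

        // Illustrative crush detection: Mario counts as crushed when the ground
        // supports him from below while a box above him is still descending.
        public class CrushSensor
        {
            private readonly Body _mario;
            private bool _touchingGround;
            private Body _boxAbove;

            public bool IsCrushed { get; private set; }

            public CrushSensor(Fixture marioFixture)
            {
                _mario = marioFixture.Body;
                marioFixture.OnCollision += HandleCollision;
                marioFixture.OnSeparation += HandleSeparation;
            }

            private bool HandleCollision(Fixture marioFixture, Fixture other, Contact contact)
            {
                string tag = other.UserData as string;              // "ground" / "box" tags are an assumption
                if (tag == "ground")
                    _touchingGround = true;
                else if (tag == "box" && other.Body.Position.Y < _mario.Position.Y)
                    _boxAbove = other.Body;                         // box is overhead (screen-style Y-down world assumed)
                return true;                                        // keep the contact solid
            }

            private void HandleSeparation(Fixture marioFixture, Fixture other)
            {
                if ((other.UserData as string) == "ground") _touchingGround = false;
                if (other.Body == _boxAbove) _boxAbove = null;
            }

            public void Update() // call once per physics step
            {
                // Pinned between the ground and a still-descending box => crushed.
                if (_touchingGround && _boxAbove != null && _boxAbove.LinearVelocity.Y > 0.1f)
                    IsCrushed = true;
            }
        }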

    Read the article

  • How does one escape the GPL?

    - by tehtros
    DISCLAIMER: I don't pretend to know anything about licensing. In fact, everything I say below may be completely false! Backstory: Recently, I've been looking for a decent game engine, and I think I've found one that I really like, Cafu Engine. However, they have a dual licensing plan, where everything you make with the engine is forced under the GPL unless you pay for a commercial license. I'm not saying that it's a bad engine; they even say that they are very relaxed about the licensing fees. However, the fact that it even involves the GPL scares me. So my question is basically: how does one escape the GPL? Here's an example: the id Tech engine, also known as the Quake engine or the Doom engine, was the base for the popular Source engine. However, the id Tech engine has been released under the GPL, and the Source engine is proprietary. Did Valve get a different license, or did they do something to escape the GPL? Is there a way to escape the GPL? Or, if you use GPL'd source code as a base for another project, are you forced to use the GPL and make your source code available to the world? Could some random person take the id Tech engine, modify it past the point of recognition, then use it as a proprietary engine for commercial products? Or are they required to make it open source? One last thing: I generally have no problem whatsoever with open source. However, I feel that open source has its place, and that place is not in the business world.

    Read the article

  • Multi Pass Blend

    - by Kirk Patrick
    I am seeking the simplest working example of a two-pass HLSL pixel shader. It can do anything really, but the main idea is to perform "ping-ponging": take the output of the first pass and then feed it to the second pass. In my example I want to draw to the R channel, then draw to the G channel, and produce a simple Venn diagram in the shader, but I need to detect the overlap. I can currently detect one circle or the other, but not the overlap. There are a red and a green circle overlapping, and I want to put a dynamic texture map in the overlap region; at the moment I can only put it in one or the other. Below is how it looks in the shader:

        Texture2D shaderTexture;
        SamplerState SampleType;

        //////////////
        // TYPEDEFS //
        //////////////
        struct PixelInputType
        {
            float4 position : SV_POSITION;
            float2 tex0 : TEXCOORD0;
            float2 tex1 : TEXCOORD1;
            float4 color : COLOR;
        };

        ////////////////////////////////////////////////////////////////////////////////
        // Pixel Shader
        ////////////////////////////////////////////////////////////////////////////////
        float4 main(PixelInputType input) : SV_TARGET
        {
            float4 textureColor0;
            float4 textureColor1;

            // Sample the pixel color from the texture using the sampler at this texture coordinate location.
            textureColor0 = shaderTexture.Sample(SampleType, input.tex0);
            textureColor1 = shaderTexture.Sample(SampleType, input.tex1);

            if (input.color[0] == 1.0f && input.color[1] == 1.0f) // Requires multi-pass
                textureColor0 = textureColor1;

            return textureColor0;
        }

    Here is the calling code (that needs to be modified):

        m_d3dContext->IASetVertexBuffers(0, 2, vbs, strides, offsets);
        m_d3dContext->IASetIndexBuffer(m_indexBuffer.Get(), DXGI_FORMAT_R32_UINT, 0);
        m_d3dContext->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
        m_d3dContext->IASetInputLayout(m_inputLayout.Get());
        m_d3dContext->VSSetShader(m_vertexShader.Get(), nullptr, 0);
        m_d3dContext->VSSetConstantBuffers(0, 1, m_constantBuffer.GetAddressOf());
        m_d3dContext->PSSetShader(m_pixelShader.Get(), nullptr, 0);
        m_d3dContext->PSSetShaderResources(0, 1, m_SRV.GetAddressOf());
        m_d3dContext->PSSetSamplers(0, 1, m_QuadsTexSamplerState.GetAddressOf());
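
    The ping-pong part itself is API plumbing rather than shader code: render pass one into an off-screen target, then bind that target as the input texture for pass two (in the Direct3D 11 code above that would be OMSetRenderTargets for the first pass, then PSSetShaderResources with that target's view for the second). As a hedged illustration of the pattern only, here is the same idea expressed with XNA's render-target API, since that keeps it short; the effect parameter name and content are placeholders, not part of the asker's setup.

        using Microsoft.Xna.Framework;
        using Microsoft.Xna.Framework.Graphics;

        // Render-target ping-pong sketch (XNA 4.0 style, illustrative only).
        static class PingPongSketch
        {
            public static void DrawTwoPass(GraphicsDevice device, SpriteBatch spriteBatch,
                                           RenderTarget2D passOneTarget, Texture2D circles,
                                           Effect passTwoEffect)
            {
                // Pass 1: draw the red/green circles into an off-screen target.
                device.SetRenderTarget(passOneTarget);
                device.Clear(Color.Black);
                spriteBatch.Begin(SpriteSortMode.Immediate, BlendState.Additive);
                spriteBatch.Draw(circles, Vector2.Zero, Color.White);
                spriteBatch.End();

                // Pass 2: back to the back buffer; pass 1's output becomes a texture,
                // so the second shader can test for R == 1 && G == 1 to find the overlap.
                device.SetRenderTarget(null);
                device.Clear(Color.CornflowerBlue);
                passTwoEffect.Parameters["OverlapInput"].SetValue(passOneTarget); // parameter name is a placeholder
                spriteBatch.Begin(SpriteSortMode.Immediate, BlendState.Opaque,
                                  null, null, null, passTwoEffect);
                spriteBatch.Draw(passOneTarget, Vector2.Zero, Color.White);
                spriteBatch.End();
            }
        }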

    Read the article

  • Long delays in Unity3D substance generation

    - by Josh Buhler
    Currently working on an iOS/Android project in Unity3D, and we're seeing some incredibly long times for generating substances between testing runs. We can run the game, but once we stop playback, Unity begins to re-import all of the substances built using Substance Designer. As we've got a lot of these in our game, it's starting to lead to five-minute delays between testing runs just to test a small change. Any suggestions or parameters we should check that could prevent Unity from needing to regenerate these substances every time? Shouldn't it be caching these things somewhere?

    Read the article

  • Game editor integration with the engine?

    - by Daniel
    What I am trying to figure out is the best way to integrate the editor (level, effects, model, etc.) with the engine. My first thought was to make the game engine(*) extremely modular. Take game states as an example: you could have multiple game states that all have their own update() and draw() methods, among others, and each game state class would inherit from a base GameState class. This allows for a more modular approach, and a useful one at that. Would the most efficient approach be to implement the editor on top of the same modular engine, or to create two different designs, one for the game and one for the editor? I thought of taking the game state example and extending it to window states, and it could well be used for a lot more systems. Is there a better implementation of this design (game states) for use in the other systems in the engine? *: I know the term "game engine" is vague and misused in many situations; what I am referring to as the "game engine" is, in short, the combination of the systems that the game must interact with. Also, this is more of a theory/design question than an implementation one. Even though the two mix, I'd rather get a general idea of how the editor can be built efficiently while still using the same engine code as the game. Thanks, Daniel. P.S. If you need more clarification or extra details, just leave a comment.
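
    A minimal sketch of the game-state idea the question describes, with the editor treated as just another state on top of the same engine; the class and method names are illustrative, not taken from any particular engine.

        using System.Collections.Generic;

        // Illustrative state machine: the editor is simply another state that shares
        // the same engine services (renderer, input, content) as the gameplay states.
        public abstract class GameState
        {
            public abstract void Update(float dt);
            public abstract void Draw();
        }

        public sealed class PlayState : GameState
        {
            public override void Update(float dt) { /* simulate the world */ }
            public override void Draw()           { /* render the world   */ }
        }

        public sealed class EditorState : GameState
        {
            public override void Update(float dt) { /* handle editor input, move gizmos */ }
            public override void Draw()           { /* render the world + editor overlays */ }
        }

        public sealed class StateStack
        {
            private readonly Stack<GameState> _states = new Stack<GameState>();

            public void Push(GameState state) => _states.Push(state);
            public void Pop()                 => _states.Pop();

            public void Update(float dt) { if (_states.Count > 0) _states.Peek().Update(dt); }
            public void Draw()           { if (_states.Count > 0) _states.Peek().Draw(); }
        }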

    Read the article

  • How to link subprograms to a main program's game loop?

    - by Jim
    I recently discovered Crobot, which is (briefly) a game where each player codes a virtual robot in a pseudo-C language. Each robot is then put in an arena where it fights against the other robots. A robot's source code has this shape:

        /* Beginning file robot.r */
        main()
        {
            while (1)
            {
                /* Do whatever you want */
                ...
                move();
                ...
                fire();
            }
        }
        /* End file robot.r */

    You can see that the code is totally independent of any library or include, that some predefined functions are available (move, fire, etc.), and that the program has its own game loop and consequently is not called every frame. My question is roughly: how does it work? It seems that each robot's code is compiled by the main program and then used in a way I cannot understand. I thought it could spawn a thread for each robot, but I have no proof of this, and it seems a bit complicated to achieve. Any idea how it could work?
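
    For what it's worth, the usual answer for Crobot-style games is not one OS thread per robot: each robot's pseudo-C is compiled into bytecode for a small virtual machine, and the arena's game loop steps every robot's VM a fixed number of instructions per tick, so move() and fire() are just opcodes that read or write that robot's state. A hedged sketch of that scheduling idea, with an invented interface and instruction budget:

        using System.Collections.Generic;

        // Round-robin VM scheduling: each robot runs on its own tiny interpreter,
        // and the main loop hands every robot a fixed instruction budget per frame.
        public interface IRobotVm
        {
            bool Alive { get; }
            void Step();          // execute exactly one bytecode instruction
        }

        public sealed class Arena
        {
            private readonly List<IRobotVm> _robots;
            private const int InstructionsPerTick = 100;   // the "time slice" each robot gets

            public Arena(List<IRobotVm> robots) { _robots = robots; }

            public void Tick()
            {
                // An infinite while(1) in robot code is harmless: it just uses up the slice.
                foreach (IRobotVm robot in _robots)
                {
                    for (int i = 0; i < InstructionsPerTick && robot.Alive; i++)
                        robot.Step();
                }

                // After all slices, resolve the world: movement, missiles, collisions, scans.
            }
        }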

    Read the article

  • Integration error at high velocity

    - by Elektito
    I've implemented a simple simulation of two planets (simple 2D disks, really) in which the only force is gravity, and there is also collision detection/response (collisions are completely elastic). I can launch one planet into orbit around the other just fine. The collision detection code, though, does not work so well: I noticed that when one planet hits the other in free fall, it bounces back and ends up much higher than its original position. Some poking around convinced me that the simplistic Euler integration is causing the error. Consider this case: one object has a mass of 1 kg and the other has a mass equal to Earth's. Say the object is 10 meters above the ground, and our dt (delta t) is 1 second. The object falls to a height of 9 meters at the end of the first iteration, 7 at the end of the second, 4 at the end of the third, and 0 at the end of the fourth iteration. At this point it hits the ground and bounces back with a speed of 10 meters per second. The problem is that with dt = 1, on the first iteration after the bounce it goes right back up to a height of 10, and it takes several more steps for the object to change course. So my question is: what integration method can I use to fix this problem? Should I split dt into smaller pieces when the velocity is high, or should I use another method altogether? What method do you suggest? EDIT: You can see the source code here on GitHub: https://github.com/elektito/diskworld/
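
    A hedged sketch of the two usual remedies, assuming a simple 1D fall onto a ground plane: switch from explicit to semi-implicit (symplectic) Euler, and split the step into substeps when an object would move too far in one step, so the collision is resolved at roughly the right time instead of a full dt late. Names and thresholds below are illustrative.

        // Semi-implicit Euler with adaptive substepping (illustrative 1D sketch).
        // Velocity is updated before position, and fast objects get smaller substeps
        // so they cannot shoot far past the collision point within a single dt.
        public sealed class FallingBody
        {
            public double Y;                              // height above ground, metres
            public double Vy;                             // vertical velocity, m/s (up is positive)
            private const double Gravity = -10.0;         // m/s^2, toy value
            private const double MaxMovePerStep = 0.5;    // metres a body may travel per substep

            public void Advance(double dt)
            {
                double remaining = dt;
                while (remaining > 0)
                {
                    // Pick a substep small enough that we move at most MaxMovePerStep.
                    double speed = System.Math.Max(System.Math.Abs(Vy), 1e-6);
                    double h = System.Math.Min(remaining, MaxMovePerStep / speed);

                    Vy += Gravity * h;   // semi-implicit: velocity first...
                    Y  += Vy * h;        // ...then position uses the *new* velocity

                    if (Y <= 0)          // elastic bounce against the ground
                    {
                        Y = -Y;          // reflect the small penetration instead of keeping it
                        Vy = -Vy;
                    }
                    remaining -= h;
                }
            }
        }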

    Read the article

  • Capitalizing on JavaScript's prototypal inheritance

    - by keithjgrant
    JavaScript has a class-free object system in which objects inherit properties directly from other objects. This is really powerful, but it is unfamiliar to classically trained programmers. If you attempt to apply classical design patterns directly to JavaScript, you will be frustrated. But if you learn to work with JavaScript's prototypal nature, your efforts will be rewarded. ... It is Lisp in C's clothing. -Douglas Crockford What does this mean for a game developer working with canvas and HTML5? I've been looking over this question on useful design patterns in gaming, but prototypal inheritance is very different than classical inheritance, and there are surely differences in the best way to apply some of these common patterns. For example, classical inheritance allows us to create a moveableEntity class, and extend that with any classes that move in our game world (player, monster, bullet, etc.). Sure, you can strongarm JavaScript to work that way, but in doing so, you are kind of fighting against its nature. Is there a better approach to this sort of problem when we have prototypal inheritance at our fingertips?

    Read the article

  • Simplest way to use Steam Leaderboards from C# [on hold]

    - by Miau
    We are about to integrate Steamworks for leaderboards and achievements into our game. I see there are many open and closed source libraries that can be used to access Steamworks from C#. Rolling our own wrapper can be done, but if the existing libraries are reliable it would be better to use one, and perhaps contribute back if we see any obvious gaps. Have you used any of them, and if so, what was your experience with the different libraries, specifically for leaderboards and achievements? The ones I found are: SteamWorks.net, Steam4Net, and Ludosity (which can apparently be used outside of Unity).

    Read the article

  • Bitmap to Texture2D problem with colors

    - by xnaNewbie89
    I have a small problem with converting a bitmap to a Texture2D. The resulting image has the red channel switched with the blue channel :/ I don't know why, because the pixel formats are the same. If someone can help me I will be very happy :)

        System.Drawing.Image image = System.Drawing.Bitmap.FromFile(ImageFileLoader.filename);
        System.Drawing.Bitmap bitmap = new System.Drawing.Bitmap(image);
        Texture2D mapTexture = new Texture2D(Screen.Game.GraphicsDevice, bitmap.Width, bitmap.Height, false, SurfaceFormat.Color);
        System.Drawing.Imaging.BitmapData data = bitmap.LockBits(
            new System.Drawing.Rectangle(0, 0, bitmap.Width, bitmap.Height),
            System.Drawing.Imaging.ImageLockMode.ReadOnly,
            System.Drawing.Imaging.PixelFormat.Format32bppArgb);
        byte[] bytes = new byte[data.Height * data.Width * 4];
        System.Runtime.InteropServices.Marshal.Copy(data.Scan0, bytes, 0, bytes.Length);
        mapTexture.SetData<byte>(bytes, 0, data.Height * data.Width * 4);
        bitmap.UnlockBits(data);
        bitmap.Dispose();
        image.Dispose();
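
    The pixel formats are not actually the same, which is worth spelling out: GDI+'s Format32bppArgb is laid out in memory as B, G, R, A, while XNA 4.0's SurfaceFormat.Color expects R, G, B, A, so copying the bytes straight across swaps red and blue. A minimal sketch of the usual fix, swapping the two channels before uploading (variable names follow the question's code):

        // Swap the B and R bytes so GDI+'s BGRA layout matches XNA's RGBA layout.
        // Run this after Marshal.Copy and before the SetData call shown above.
        for (int i = 0; i < bytes.Length; i += 4)
        {
            byte b = bytes[i];       // GDI+ stores blue first...
            bytes[i] = bytes[i + 2]; // ...so move red into its place
            bytes[i + 2] = b;        // ...and blue into red's old slot
        }

        mapTexture.SetData<byte>(bytes, 0, bytes.Length);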

    Read the article

  • How can I simulate objects floating on water without a physics engine?

    - by user1075940
    In my game the water movement is done in a shader using the Gerstner equations. The water looks realistic enough for a school project, but I ran into a serious problem when I wanted to make a ship sail on the waves (similar to this). I managed to do collision with land by calculating the vertices and normals of the quad beneath the ship; however, the same method cannot be applied to water, because XZ are displaced and Y is calculated in the shader :( How should I approach this problem? Is it possible to retrieve the transformed grid from the shader? Unfortunately, no external physics libraries can be used.
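
    One hedged approach that avoids reading anything back from the GPU: evaluate the same wave sum on the CPU for the handful of points under the hull, since the shader's displacement is a closed-form function of position and time. The sketch below uses a plain sum of sine waves for brevity; for true Gerstner waves the same idea applies, but the horizontal XZ displacement also has to be accounted for (for example with a few fixed-point iterations to find which undisplaced grid point ends up over the query position). All names and wave parameters here are illustrative.

        using System;

        // CPU-side copy of the water height function, so buoyancy points under the
        // ship can sample the surface without any GPU read-back. Keep the wave
        // parameters in one shared place so the shader and this code never drift apart.
        public struct Wave
        {
            public float Amplitude;
            public float Frequency;   // spatial frequency, e.g. 2*pi / wavelength
            public float Speed;       // phase speed
            public float DirX, DirZ;  // normalised direction in the XZ plane
        }

        public static class WaterSampler
        {
            public static float HeightAt(float x, float z, float time, Wave[] waves)
            {
                float y = 0f;
                foreach (Wave w in waves)
                {
                    float phase = (w.DirX * x + w.DirZ * z) * w.Frequency + time * w.Speed;
                    y += w.Amplitude * (float)Math.Sin(phase); // same formula the vertex shader uses
                }
                return y;
            }
        }

    The ship can then sample HeightAt at a few points along its hull and either apply buoyancy forces at those points or simply align its position and pitch/roll to the sampled heights.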

    Read the article

  • Rotation based on x coordinate and x velocity?

    - by Lewis
    -(void) accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
        {
            float deceleration = 0.3f, sensitivity = 8.0f, maxVelocity = 150;

            // adjust velocity based on current accelerometer acceleration
            playerVelocity.x = playerVelocity.x * deceleration + acceleration.x * sensitivity;

            // we must limit the maximum velocity of the player sprite, in both directions (positive & negative values)
            playerVelocity.x = fmaxf(fminf(playerVelocity.x, maxVelocity), -maxVelocity);
        }

    Hi, I want to rotate my sprite based on the velocity and accelerometer input. My sprite can move along the X axis like so: <--------- sprite ----------- It always faces forwards; if it is moving left I want it to point slightly to the left, with the amount of tilt judged from the velocity, and the same should work for the right. I tried using atan, but since the y velocity and position are always the same the function returns 0, which doesn't rotate it at all. Any ideas? Regards, Lewis.
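
    Since the sprite only moves along X, there is no meaningful direction vector for atan to measure; a simpler approach is to map the horizontal velocity directly onto a tilt angle, clamped to some maximum lean. A minimal sketch of that mapping (C# for brevity; the maximum angle and the degrees convention are assumptions):

        using System;

        // Map horizontal velocity straight onto a lean angle: full speed leans the
        // sprite maxTiltDegrees to that side, zero speed keeps it upright.
        static class SpriteTilt
        {
            public static float TiltDegrees(float velocityX, float maxVelocity, float maxTiltDegrees = 30f)
            {
                float t = Math.Max(-1f, Math.Min(1f, velocityX / maxVelocity)); // -1 .. +1
                return t * maxTiltDegrees; // feed this to the sprite's rotation property
            }
        }

    Lerping the current rotation toward this target each frame keeps the lean from snapping.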

    Read the article

  • Why does the location of my vehicle spawner change when I open a matinee?

    - by Gareth Jones
    I'm doing work with InterActors and vehicle spawners in Unreal Tournament 3's editor, and have it set up like so (the walkway it's on is the InterActor). However, if I go into Kismet and open the matinee that handles the InterActor, this happens: it looks to me like the editor is moving the vehicle out of the way so I can see the InterActor (which would be very clever), because only the image of the vehicle moves, not the gizmo, nor does the vehicle spawn in that location in game. Is this the case?

    Read the article

  • How do I apply skeletal animation from a .x (Direct X) file?

    - by Byte56
    Using the .x format to export a model from Blender, I can load a mesh, armature and animation. I have no problems generating the mesh and viewing models in game, and I have the animations and the armature properly loaded into appropriate data structures. My problem is properly applying the animation to the models. I have the framework for applying the models and the code for selecting animations and stepping through frames. From what I understand, the AnimationKeys inside the AnimationSet supply the transformations that take the bind pose to the pose in the animated frame. As a small example:

        Animation {
            {Armature_001_Bone}
            AnimationKey {
                2;      // Position
                121;    // number of frames
                0;3; 0.000000, 0.000000, 0.000000;;,
                1;3; 0.000000, 0.000000, 0.005524;;,
                2;3; 0.000000, 0.000000, 0.022217;;,
                ...
            }
            AnimationKey {
                0;      // Quaternion Rotation
                121;
                0;4; -0.707107, 0.707107, 0.000000, 0.000000;;,
                1;4; -0.697332, 0.697332, 0.015710, 0.015710;;,
                2;4; -0.684805, 0.684805, 0.035442, 0.035442;;,
                ...
            }
            AnimationKey {
                1;      // Scale
                121;
                0;3; 1.000000, 1.000000, 1.000000;;,
                1;3; 1.000000, 1.000000, 1.000000;;,
                2;3; 1.000000, 1.000000, 1.000000;;,
                ...
            }
        }

    So, to apply frame 2, I would take the position, rotation and scale from frame 2, create a transformation matrix (call it Transform_A) from them, and apply that matrix to the vertices controlled by Armature_001_Bone at their weights. So I'd stuff Transform_A into my shader and transform the vertex, something like:

        vertexPos = vertexPos * bones[ int(bfs_BoneIndices.x) ] * bfs_BoneWeights.x;

    where bfs_BoneIndices and bfs_BoneWeights are values specific to the current vertex. When loading the mesh vertices, I transform them by the rootTransform and the meshTransform; this ensures they're oriented and scaled correctly for viewing the bind pose. The problem is that when I create that transformation matrix (using the position, rotation and scale from the animation), it doesn't properly transform the vertex. There's likely more to it than just using the animation data. I also tried applying the bone transform hierarchies - still no dice. Basically I end up with some twisted models. It should also be noted that I'm working in OpenGL, so any matrix transposes that might need to be applied should be considered. What data do I need, and how do I combine it, to apply .x animations to models?
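
    The piece most often missing in this situation is the inverse bind-pose (offset) matrix together with the parent hierarchy: an animation key gives a bone's local pose, which has to be chained up through its parents to get a world-space bone matrix, and the vertex has to be pulled out of bind space first with the bone's inverse bind matrix before that world matrix is applied. A hedged sketch of that composition, with invented structure names (row-vector convention, so transposes may be needed on the OpenGL side):

        using Microsoft.Xna.Framework; // Matrix/Quaternion types used purely for illustration

        // Illustrative skinning composition. For each bone:
        //   local = Scale * Rotation * Translation   (from the AnimationKey frame)
        //   world = local * parent.world             (walk up the hierarchy)
        //   final = inverseBindPose * world          (what the shader multiplies vertices by)
        public sealed class Bone
        {
            public Bone Parent;
            public Matrix InverseBindPose;   // inverse of the bone's bind-pose world matrix
            public Matrix LocalAnimated;     // built from the keyframe's position/rotation/scale
            public Matrix WorldAnimated;
            public Matrix FinalSkinning;
        }

        public static class Skinner
        {
            public static Matrix LocalFromKeys(Vector3 position, Quaternion rotation, Vector3 scale) =>
                Matrix.CreateScale(scale) *
                Matrix.CreateFromQuaternion(rotation) *
                Matrix.CreateTranslation(position);

            // Bones must be ordered parent-before-child for this single pass to work.
            public static void Update(Bone[] bonesParentFirst)
            {
                foreach (Bone bone in bonesParentFirst)
                {
                    bone.WorldAnimated = bone.Parent == null
                        ? bone.LocalAnimated
                        : bone.LocalAnimated * bone.Parent.WorldAnimated;

                    bone.FinalSkinning = bone.InverseBindPose * bone.WorldAnimated;
                    // FinalSkinning is what goes into the shader's bones[] array.
                }
            }
        }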

    Read the article

  • Resultant Vector Algorithm for 2D Collisions

    - by John
    I am making a Pong-based game where a puck hits a paddle and bounces off. Both the puck and the paddles are circles. I came up with an algorithm to calculate the resultant vector of the puck once it meets a paddle. The game seems to function correctly, but I'm not entirely sure my algorithm is correct. Here are my variables for the algorithm.

    Given:
        velocity = the magnitude of the initial velocity of the puck before the collision
        x = the x coordinate of the puck
        y = the y coordinate of the puck
        moveX = the horizontal speed of the puck
        moveY = the vertical speed of the puck
        otherX = the x coordinate of the paddle
        otherY = the y coordinate of the paddle
        piece.horizontalMomentum = the horizontal speed of the paddle before it hits the puck
        piece.verticalMomentum = the vertical speed of the paddle before it hits the puck
        slope = the direction, in radians, of the puck's velocity
        distX = the horizontal distance between the center of the puck and the center of the paddle
        distY = the vertical distance between the center of the puck and the center of the paddle

    The algorithm solves for:
        impactAngle = the angle, in radians, of the impact
        newSpeedX = the speed of the resultant vector in the X direction
        newSpeedY = the speed of the resultant vector in the Y direction

    Here is the code for my algorithm:

        int otherX = piece.x;
        int otherY = piece.y;
        double velocity = Math.sqrt((moveX * moveX) + (moveY * moveY));
        double slope = Math.atan(moveX / moveY);
        int distX = x - otherX;
        int distY = y - otherY;
        double impactAngle = Math.atan(distX / distY);
        double newAngle = impactAngle + slope;
        int newSpeedX = (int)(velocity * Math.sin(newAngle)) + piece.horizontalMomentum;
        int newSpeedY = (int)(velocity * Math.cos(newAngle)) + piece.verticalMomentum;

    For those who are not program-savvy, here it is simplified:

        velocity = √(moveX² + moveY²)
        slope = arctan(moveX / moveY)
        distX = x - otherX
        distY = y - otherY
        impactAngle = arctan(distX / distY)
        newAngle = impactAngle + slope
        newSpeedX = velocity * sin(newAngle) + piece.horizontalMomentum
        newSpeedY = velocity * cos(newAngle) + piece.verticalMomentum

    My question: is this algorithm correct? Is there an easier/simpler way to do what I'm trying to do?
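
    A hedged alternative that avoids angles entirely: for two circles, the collision normal is simply the vector between their centres, and an elastic bounce reflects the puck's velocity about that normal (with the paddle's own velocity added back in if the paddle is moving). The sketch below reuses the question's variable names where possible; the reflection formula v' = v - 2(v·n)n is standard.

        using System;

        // Circle-vs-circle bounce without trigonometry: reflect the relative velocity
        // about the collision normal (centre-to-centre direction). Variable names echo
        // the question; the structure itself is illustrative.
        static class PuckBounce
        {
            public static (double newSpeedX, double newSpeedY) Resolve(
                double x, double y, double moveX, double moveY,
                double otherX, double otherY,
                double paddleVelX, double paddleVelY)
            {
                // Collision normal: from the paddle's centre toward the puck's centre.
                double nx = x - otherX, ny = y - otherY;
                double len = Math.Sqrt(nx * nx + ny * ny);
                nx /= len; ny /= len;

                // Work in the paddle's frame so a moving paddle imparts momentum.
                double relX = moveX - paddleVelX;
                double relY = moveY - paddleVelY;

                // Reflect: v' = v - 2 (v . n) n, but only if the puck is moving into the paddle.
                double dot = relX * nx + relY * ny;
                if (dot < 0)
                {
                    relX -= 2 * dot * nx;
                    relY -= 2 * dot * ny;
                }
                return (relX + paddleVelX, relY + paddleVelY);
            }
        }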

    Read the article

  • Math major as a viable degree

    - by Zak O'Keefe
    While I realize there are many topics about CS vs. software engineering vs. game school programs, I haven't found anything on whether a pure math degree (with a CS minor and electives) would also be a viable program. By this I mean: would having a math major and CS minor put one at a competitive disadvantage compared to a pure CS program? This relates specifically to game engine programming, more on the graphics side. Background (for those who care): I'm currently a math major, CS minor at school and looking to land a career doing graphics engine programming. Admittedly, I love math, and if at all possible I would like to stay in my current program as long as it doesn't put me at a competitive disadvantage when trying to land a job post-graduation. That being said, I'm strong in the traditional C/C++ languages, have strong concurrent programming skills, and currently produce self-made games for iOS. From an employer's point of view, how badly is the math major hurting me? I just want to get some advice from people already in the field!

    Read the article

  • Setting Anchor Point

    - by Siddharth
    I want to set an anchor point for a sprite, the way cocos2d does in its implementation, but I haven't found anything like that in AndEngine, so please give me some guidance on it. I want to move the sprite on touch, so I use the following code, but it does not work for me:

        super.setPosition(pX - this.getWidthScaled() / 2, pY - this.getHeightScaled() / 2);

    I touch the corner of the image, but because of the code above the sprite immediately jumps so that the touch is at its center. I want the touch to stay at the position where it started and drag the sprite from there. An anchor point would be useful for this, but I can't find anything like it in AndEngine.
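
    One hedged workaround that does not need engine support: remember the offset between the touch and the sprite's position when the drag starts, and keep applying it while dragging, which behaves like a per-drag anchor point. The sketch is plain C# structure, not AndEngine API.

        // Drag a sprite without snapping its centre to the finger: remember the offset
        // between the touch point and the sprite's position on touch-down, then keep
        // applying it during the drag. Plain fields stand in for the engine's sprite API.
        public sealed class DragController
        {
            private float _offsetX, _offsetY;

            public void OnTouchDown(float spriteX, float spriteY, float touchX, float touchY)
            {
                // Offset from the finger to the sprite's current position (the "anchor" for this drag).
                _offsetX = spriteX - touchX;
                _offsetY = spriteY - touchY;
            }

            // Returns the new sprite position; the grabbed point stays under the finger.
            public (float x, float y) OnTouchMove(float touchX, float touchY)
            {
                return (touchX + _offsetX, touchY + _offsetY);
            }
        }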

    Read the article

  • Android - Rendering HUD View to SurfaceView

    - by Jon
    I have developed a relatively simple game on Android, to get my head around it all, and on the back of it developed a crude game engine (in the loosest sense!). I use a SurfaceView and canvas (no OpenGL - I'll cross that bridge another time!). I have implemented a game HUD, title screens etc. by overlaying standard Android view widgets over my SurfaceView. This all works reasonably well, maintaining an acceptable frame rate, but it is a simple game with not a lot happening on or off screen. What I am wondering now is whether I could (and whether there would be any advantage in) drawing all my views to the one SurfaceView, all controlled by the main game thread. At the moment I have handlers flinging messages around and runOnUiThread calls here, there and everywhere, which is quite cumbersome. Any thoughts on this would be much appreciated (before I perhaps waste time trying to do it!).

    Read the article

  • Timing Calculations for OpenGL ES 2.0 draw calls

    - by Arun AC
    I am drawing a cube in OpenGL ES 2.0 on Linux, and I am calculating the time taken for each frame using the function below:

        #define NANO 1000000000
        #define NANO_TO_MICRO(x) ((x)/1000)

        uint64_t getTick()
        {
            struct timespec stCT;
            clock_gettime(CLOCK_MONOTONIC, &stCT);
            uint64_t iCurrTimeNano = (1000000000 * stCT.tv_sec + stCT.tv_nsec); // in nanoseconds
            uint64_t iCurrTimeMicro = NANO_TO_MICRO(iCurrTimeNano);             // in microseconds
            return iCurrTimeMicro;
        }

    I am running my code for 100 frames with a simple x-axis rotation, and I am getting around 200 to 220 microseconds per frame. Does that mean I am getting around 4545 FPS (1 / 220 µs)? Is my GPU really that fast? I strongly doubt this result. What went wrong in the code? Regards, Arun AC

    Read the article

  • Why does calling CreateDXGIFactory prevent my program from exiting?

    - by smoth190
    I'm using CreateDXGIFactory to get the graphics adapters and display modes. When I call it, it works fine and I get all the data. However, when I exit my program, the main Win32 thread exits but something stays open, because the debugger keeps running. Does CreateDXGIFactory create an extra thread that I'm not closing? I don't understand. The only thing I would suspect is that the documentation says it doesn't work if it's called from DllMain. It is in a DLL, but it's not called from DllMain, and it doesn't fail, either. I'm using DirectX 11. Here is the function that initializes DirectX; I haven't gotten past retrieving the refresh rate because of this problem, and I commented everything out to pinpoint it.

        bool CGraphicsManager::InitDirectX(HWND hWnd, int width, int height)
        {
            HRESULT result;
            IDXGIFactory* factory;
            IDXGIOutput* output;
            IDXGIAdapter* adapter;
            DXGI_MODE_DESC* displayModes;
            DXGI_ADAPTER_DESC adapterDesc;
            unsigned int modeCount = 0;
            unsigned int refreshNum = 0;
            unsigned int refreshDen = 0;

            //First, we need to get the monitor's refresh rate
            result = CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory);
            //if(FAILED(result))
            //{
            //    MemoryUtil::MessageBoxError(TEXT("InitDirectX"), 0, 0, TEXT("Failed to create DXGI factory\nError:\n%s"), DXGetErrorDescription(result));
            //    return false;
            //}

            /*//Create a graphics card adapter
            result = factory->EnumAdapters(0, &adapter);
            if(FAILED(result))
            {
                MemoryUtil::MessageBoxError(TEXT("InitDirectX"), 0, 0, TEXT("Failed to get graphics adapters\nError:\n%s"), DXGetErrorDescription(result));
                return false;
            }

            //Get the output
            result = adapter->EnumOutputs(0, &output);
            if(FAILED(result))
            {
                MemoryUtil::MessageBoxError(TEXT("InitDirectX"), 0, 0, TEXT("Failed to get adapter output\nError:\n%s"), DXGetErrorDescription(result));
                return false;
            }

            //Get the modes
            result = output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, DXGI_ENUM_MODES_INTERLACED, &modeCount, 0);
            if(FAILED(result))
            {
                MemoryUtil::MessageBoxError(TEXT("InitDirectX"), 0, 0, TEXT("Failed to get mode count\nError:\n%s"), DXGetErrorDescription(result));
                return false;
            }

            displayModes = new DXGI_MODE_DESC[modeCount];
            result = output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, DXGI_ENUM_MODES_INTERLACED, &modeCount, displayModes);
            if(FAILED(result))
            {
                MemoryUtil::MessageBoxError(TEXT("InitDirectX"), 0, 0, TEXT("Failed to get display modes\nError:\n%s"), DXGetErrorDescription(result));
                return false;
            }

            //Now we need to find one for our screen size
            for(unsigned int i = 0; i < modeCount; i++)
            {
                if(displayModes[i].Width == (unsigned int)width)
                {
                    if(displayModes[i].Height == (unsigned int)height)
                    {
                        refreshNum = displayModes[i].RefreshRate.Numerator;
                        refreshDen = displayModes[i].RefreshRate.Denominator;
                        break;
                    }
                }
            }

            //Store the video card data
            result = adapter->GetDesc(&adapterDesc);
            if(FAILED(result))
            {
                MemoryUtil::MessageBoxError(TEXT("InitDirectX"), 0, 0, TEXT("Failed to get adapter description\nError:\n%s"), DXGetErrorDescription(result));
                return false;
            }

            m_videoCard = new CVideoCard();
            MemoryUtil::CreateGameObject(m_videoCard);
            m_videoCard->VideoCardMemory = (unsigned int)(adapterDesc.DedicatedVideoMemory);
            wcstombs_s(0, m_videoCard->VideoCardDescription, 128, adapterDesc.Description, 128);*/

            //ReleaseCOM(output);
            //ReleaseCOM(adapter);
            ReleaseCOM(factory);
            //DeletePointerArray(displayModes);

            return true;
        }

    Also, I don't know if this means anything, but this is some of the output log when the function call is commented out:

        //...
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\msvcr100d.dll', Symbols loaded.
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\imm32.dll', Cannot find or open the PDB file
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\msctf.dll', Cannot find or open the PDB file
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\uxtheme.dll', Cannot find or open the PDB file
        'LostRock.exe': Loaded 'C:\Program Files (x86)\Common Files\microsoft shared\ink\tiptsf.dll', Cannot find or open the PDB file
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\ole32.dll', Cannot find or open the PDB file
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\oleaut32.dll', Cannot find or open the PDB file
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\clbcatq.dll', Cannot find or open the PDB file
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\oleacc.dll', Cannot find or open the PDB file
        The program '[6560] LostRock.exe: Native' has exited with code 0 (0x0).

    And when it isn't commented out:

        //...
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\cfgmgr32.dll', Cannot find or open the PDB file
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\devobj.dll', Cannot find or open the PDB file
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\wintrust.dll', Cannot find or open the PDB file
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\crypt32.dll', Cannot find or open the PDB file
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\msasn1.dll', Cannot find or open the PDB file
        'LostRock.exe': Unloaded 'C:\Windows\SysWOW64\setupapi.dll'
        'LostRock.exe': Unloaded 'C:\Windows\SysWOW64\devobj.dll'
        'LostRock.exe': Unloaded 'C:\Windows\SysWOW64\cfgmgr32.dll'
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\clbcatq.dll', Cannot find or open the PDB file
        'LostRock.exe': Loaded 'C:\Windows\SysWOW64\oleacc.dll', Cannot find or open the PDB file
        The thread 'Win32 Thread' (0xb94) has exited with code 0 (0x0).
        The program '[8096] LostRock.exe: Native' has exited with code 0 (0x0).
        //This is called when I click "Stop Debugging"

    P.S. I know it is CreateDXGIFactory, because if I comment it out the program exits correctly.

    Read the article

  • Efficient mapping layout in 2D side-scroller, and collisions between character and the world

    - by Jack
    I haven't touched Visual Studio for a couple of months now, but I was playing a game from the '90s today and had an epiphany: I was looking for something I didn't need, and I wasn't using what I knew correctly. One of those realizations was about collision, so let me tell you a bit about the project I was working on. The project's graphics look like Mario or Dangerous Dave, etc. - you get the idea, old-school pixels. Anyway, I remember trying to think of something other than an AABB for the character's shape, but I couldn't come up with anything; perhaps I could get a suggestion for this? Another thing is the world: I don't want it to be just a linear world, I want mountains, etc. My idea is to use triangles, but I have no idea yet what to do if I want just part of a cube, say 3/4 or 2/4 or whatever; hard-coding such things seems inefficient. P.S. I am not looking for the precision level offered by Box2D. I actually remember trying to implement it at first, but I failed because my understanding of C++ wasn't advanced enough, as mentioned below. P.P.S. I am programming in C++, and I haven't done it for a couple of months now. I have no means of testing either, as my PC is broken down and this one can barely run games from the late '90s, let alone a compiler or a program with inefficient resource management... I am also not an expert (obviously); I don't even know if I can consider myself an average programmer. In short, I am simply curious about my thoughts and my past experience from programming the game. I may come back to it when my PC is fixed; I'm already keeping notes about these things.

    Read the article

  • How to create a texture for a ModelMesh?

    - by Berend
    Is there a possibility to create a texture from a mesh part in XNA, by getting a flat (unwrapped) version of the mesh, so I can create a texture for it and edit that texture (via a render target)? I need to get the texture (which is not yet a texture) so I can put another texture on it. I can create a texture and put it on a certain mesh, but I just can't figure out how to create a texture with the right size. I also found out that I can use tex2Dproj in HLSL, but when I do this I get a gray stripe in the result. Is there a better solution?

    Read the article
