Search Results

Search found 9286 results on 372 pages for 'physics engine'.


  • Game Physics: Implementing Normal Reaction from ground correctly

    - by viraj
    I am implementing a simple side-scrolling platform game. I am using the following strategy while coding the physics: gravity constantly acts on the character, and when the character is touching the floor, the floor exerts a normal reaction. I face the following problem: if the character starts at a height, he acquires velocity in the -Y direction, so when he hits the floor he falls through it even though the normal force is being exerted. I could fix this by setting the Y velocity to 0 and placing him above the floor once a collision is detected, but this often leads to the character getting stuck in the floor or bouncing around it. Is there a better approach?
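
    One common fix, sketched below in plain C# (the names, constants, and the screen-space convention of +Y pointing down are assumptions, not the asker's code): resolve the collision positionally by snapping the feet onto the floor, and only cancel the velocity component pointing into the floor, with a grounded flag standing in for the explicit normal force.

        // Minimal sketch: positional correction plus selective velocity reset.
        struct Character
        {
            public float Y;         // vertical position of the character's feet
            public float VelocityY; // positive = downward (screen coordinates)
            public bool IsGrounded;
        }

        static class FloorCollision
        {
            const float Gravity = 980f; // pixels per second^2, tune to taste

            public static void Step(ref Character c, float floorY, float dt)
            {
                c.VelocityY += Gravity * dt;  // gravity always applies
                c.Y += c.VelocityY * dt;      // integrate position

                if (c.Y >= floorY)            // penetrated the floor this frame
                {
                    c.Y = floorY;             // snap feet exactly onto the surface
                    if (c.VelocityY > 0f)     // only cancel motion *into* the floor
                        c.VelocityY = 0f;
                    c.IsGrounded = true;
                }
                else
                {
                    c.IsGrounded = false;
                }
            }
        }

    Snapping exactly onto the surface (rather than just "above the floor") and only zeroing the downward component is usually enough to avoid the sticking and bouncing described above.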

    Read the article

  • Continuous Physics Engine's Collision Detection Techniques

    - by Griffin
    I'm working on a purely continuous physics engine, and I need to choose algorithms for broad- and narrow-phase collision detection. "Purely continuous" means I never do intersection tests; instead, I want to catch every collision before it happens and put each one into a "planned collisions" stack ordered by TOI. Broad Phase The only continuous broad-phase method I can think of is encasing each body in a circle and testing whether each circle will ever overlap another. This seems horribly inefficient, however, and lacks any culling. I also have no idea what continuous analogs might exist for today's discrete collision culling methods such as quad-trees. How might I go about preventing inappropriate and pointless broad tests the way a discrete engine does? Narrow Phase I've managed to adapt the narrow-phase SAT to a continuous check rather than a discrete one, but I'm sure there are other, better algorithms out there in papers or on sites you might have come across. What fast or accurate algorithms do you suggest I use, and what are the advantages/disadvantages of each? Final Note: I say techniques and not algorithms because I have not yet decided how I will store different polygons, which might be concave, convex, round, or even have holes. I plan to make that decision based on what the algorithm requires (for instance, if I choose an algorithm that breaks a polygon down into triangles or convex shapes, I will simply store the polygon data in that form).
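
    For the broad phase described above, the circle-vs-circle sweep does have a cheap closed-form test. A sketch in C# (assuming constant velocities within one step; names are illustrative):

        // Earliest time of impact of two moving circles within one step,
        // or null if they never touch. Solves |p + v t| = rA + rB for t.
        using System;
        using System.Numerics;

        static class SweptCircle
        {
            public static float? TimeOfImpact(
                Vector2 pA, Vector2 vA, float rA,
                Vector2 pB, Vector2 vB, float rB)
            {
                Vector2 p = pB - pA;         // relative position
                Vector2 v = vB - vA;         // relative velocity
                float r = rA + rB;

                // |p + v t|^2 = r^2  ->  (v.v) t^2 + 2(p.v) t + (p.p - r^2) = 0
                float a = Vector2.Dot(v, v);
                float b = 2f * Vector2.Dot(p, v);
                float c = Vector2.Dot(p, p) - r * r;

                if (c <= 0f) return 0f;      // already overlapping
                if (a < 1e-12f) return null; // no relative motion
                float disc = b * b - 4f * a * c;
                if (disc < 0f) return null;  // paths never come close enough

                float t = (-b - MathF.Sqrt(disc)) / (2f * a); // earlier root
                return (t >= 0f && t <= 1f) ? t : (float?)null;
            }
        }

    A continuous analogue of spatial culling is to insert each body's swept bounding box (the box covering its start and end positions for the step) into a grid or quad-tree, so only pairs whose swept boxes overlap reach the test above.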

    Read the article

  • How to restrict paddle movement using Farseer Physics engine 3.2

    - by brainydexter
    I am new to using Farseer Physics Engine 3.2 (FPE), so please bear with my questions. Also, since FPE 3.2 is based on Box2D, I have been reading the Box2D manual and pieces of code scattered in samples to better understand the terminology and usage. Pong is usually my testbed whenever I try something new. Here are the issues I am running into: How can I restrict the paddles to move only along the Y axis? The ball comes in, knocks the paddles off, and everything floats in space afterwards (box = rectangle and ball = circle). I know MKS is the unit system, but is there a recommendation for the sizes/positions to be used? I know this is a very generic question, but it would be good to know a simple set of values one could use for a game as simple as Pong. Between Box2D and FPE, I have some doubts: what is the recommended way of making a body in FPE? world.CreateBody() does not exist in FPE. The Box2D manual recommends never to "new" a body (since Box2D uses small-object allocators), so is there a recommended way in Farseer to create a body (apart from factories)? In Box2D, it is recommended to keep track of the body object, since it is also the parent of the fixture(s). Why is it that in most of the examples the fixture object is tracked instead? Is there a reason why the body is not tracked? Thanks
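
    On the first question, one pragmatic fix (a sketch assuming the standard Farseer 3.x Body properties; the class and method names here are made up) is to pin the paddle back onto its rail once per update, after World.Step:

        using FarseerPhysics.Dynamics;
        using Microsoft.Xna.Framework;

        static class PaddleConstraint
        {
            // Call once per Update, after World.Step.
            public static void Apply(Body paddle, float paddleX)
            {
                paddle.Position = new Vector2(paddleX, paddle.Position.Y); // lock X
                paddle.LinearVelocity = new Vector2(0f, paddle.LinearVelocity.Y);
                paddle.Rotation = 0f;          // keep the paddle upright
                paddle.AngularVelocity = 0f;   // (or set FixedRotation = true once)
            }
        }

    A prismatic joint between the paddle and a static anchor is the joint-based alternative. On sizes: the Box2D guidance that Farseer inherits is to keep moving objects roughly between 0.1 and 10 units (meters), so a paddle around 0.3 x 2 and a ball of radius 0.25 are comfortable starting values.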

    Read the article

  • Physics not synchronizing correctly over the network when using Bullet

    - by Lucas
    I'm trying to implement a client/server physics system using Bullet, but I'm having problems getting things to sync up. I've implemented a custom motion state which reads and writes the transform from my game objects, and it works locally, but I've tried two different approaches for networked games: Dynamic objects on the client that also exist on the server (e.g. not random debris and other unimportant stuff) are made kinematic. This works correctly, but the objects don't move very smoothly. Objects are dynamic on both, but after each message from the server saying that the object has moved, I set the linear and angular velocity to the values from the server and call btRigidBody::proceedToTransform with the transform from the server. I also call btCollisionObject::activate(true); to force the object to update. My intent with method 2 was basically to do method 1, but hijacking Bullet into doing a poor man's prediction instead of writing my own to smooth out method 1. This doesn't seem to work (for reasons that are not 100% clear to me even when stepping through Bullet), and the objects sometimes end up in different places. Am I heading in the right direction? Bullet seems to have its own interpolation code built in. Can that help me make method 1 work better? Or is my method 2 code not working because I am accidentally stomping on that?
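
    One engine-agnostic way to make method 1 smooth (a sketch; it is not Bullet-specific, and the blend factor is illustrative) is to blend the kinematic proxy toward the latest server snapshot each tick instead of teleporting it:

        using System.Numerics;

        struct Snapshot
        {
            public Vector3 Position;
            public Quaternion Orientation;
        }

        static class NetSmoothing
        {
            public static void BlendTowardServer(
                ref Vector3 position, ref Quaternion orientation,
                Snapshot server, float blend /* e.g. 0.1f - 0.3f per tick */)
            {
                position = Vector3.Lerp(position, server.Position, blend);
                orientation = Quaternion.Slerp(orientation, server.Orientation, blend);
            }
        }

    A fuller scheme would buffer two snapshots and interpolate between them by timestamp, but even the simple blend above removes most of the visible snapping.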

    Read the article

  • Importance of scripting engine at Cocos2d Game Engine

    - by Mahbubur R Aaman
    Each game engine is different and solves different problems in different ways, so engine design varies greatly from engine to engine (even though a lot of principles are shared between them). Cocos2D is a great product on its own, but it doesn't expose engine functionality to a scripting language like Lua or JavaScript. My question: how important is it to integrate a scripting engine with Cocos2d?

    Read the article

  • Moving a body in a specific direction using XNA with Farseer Physics

    - by Code Assasssin
    I have a custom polygon attached to a body, which looks like this: What I am trying to accomplish is getting the body to move in whatever direction the tip of the body is pointing. So far this is what I've tried: if (ks.IsKeyDown(Keys.Up)) { body.ApplyForce(new Vector2(0, -20),body.GetLocalPoint(new Vector2(0,0))); } if (ks.IsKeyDown(Keys.Left)) { body.ApplyTorque(-500); } if (ks.IsKeyDown(Keys.Right)) { body.ApplyTorque(500); } The body rotates fine, but when I try making the body accelerate according to the tip of the body - assuming I have specified the tip correctly (I am pretty sure I haven't) - it just spins around, as if I had applied torque to it. Can anyone point me in the right direction of how to fix this problem?
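
    The spinning usually comes from applying the force at a point away from the center of mass, which adds torque. A sketch of the alternative (assuming Farseer's Body.Rotation and ApplyForce, and that the tip points "up", i.e. -Y in screen coordinates, when Rotation is zero; adjust the formula if your art faces elsewhere):

        using System;
        using FarseerPhysics.Dynamics;
        using Microsoft.Xna.Framework;

        static class TipThrust
        {
            public static void Apply(Body body, float thrust)
            {
                // Direction the tip is facing, derived from the body's rotation.
                Vector2 facing = new Vector2(
                    (float)Math.Sin(body.Rotation),
                   -(float)Math.Cos(body.Rotation));

                // Applying at the center of mass moves the body without adding
                // any torque; steering stays with ApplyTorque.
                body.ApplyForce(facing * thrust);
            }
        }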

    Read the article

  • Circle physics and collision using vectors

    - by Joe Hearty
    This is a problem I've been having: when making a set number of filled circles at random locations on a JPanel and applying gravity (a constant downward pull on the y coordinate), the circles collide with each other. I want them to have collision detection and push apart in the opposite direction using vectors, but I don't know how to apply that to my scenario. Could someone help? public void drawballs(Graphics g){ g.setColor(Color.white); //displays circles for(int i = 0; i<xlocationofcircles.length-1; i++){ g.fillOval((int) xlocationofcircles[i], (int) ylocationofcircles[i], 16, 16); ylocationofcircles[i] += .2; //gravity if(ylocationofcircles[i] > 550) //stops gravity at bottom of screen ylocationofcircles[i] -= .2; //squared distance between circles i and i+1 float distance = (xlocationofcircles[i+1]-xlocationofcircles[i])*(xlocationofcircles[i+1]-xlocationofcircles[i]) + (ylocationofcircles[i+1]-ylocationofcircles[i])*(ylocationofcircles[i+1]-ylocationofcircles[i]); if (Math.sqrt(distance) < 16) ...
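
    A sketch of the missing response step, written in C# to match the other sketches on this page but mapping one-to-one to Java (equal masses and radii are assumed, and a real version would keep velocity arrays alongside the position arrays):

        using System.Numerics;

        struct Ball
        {
            public Vector2 Position;
            public Vector2 Velocity;
        }

        static class CircleCollision
        {
            // Push the balls apart along the line of centres, then swap the
            // velocity components along that line (elastic, equal masses).
            public static void Resolve(ref Ball a, ref Ball b, float radius)
            {
                Vector2 delta = b.Position - a.Position;
                float dist = delta.Length();
                if (dist == 0f || dist >= 2f * radius) return; // not touching

                Vector2 normal = delta / dist;

                // 1. Positional correction: remove the overlap half-and-half.
                float overlap = 2f * radius - dist;
                a.Position -= normal * (overlap * 0.5f);
                b.Position += normal * (overlap * 0.5f);

                // 2. Velocity response along the normal; tangential parts stay.
                float va = Vector2.Dot(a.Velocity, normal);
                float vb = Vector2.Dot(b.Velocity, normal);
                if (va - vb <= 0f) return;                     // already separating
                a.Velocity += normal * (vb - va);
                b.Velocity += normal * (va - vb);
            }
        }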

    Read the article

  • Keep basic game physics separate from basic game object? [on hold]

    - by metamorphosis
    If anybody has dealt with a similar situation, I'd be interested in your experience/wisdom. I'm developing a 2D game library in C++. I have game objects which have very basic physics; they also have movement classes attached to differing states, for example a different movement type based on whether the character is jumping, on ice, or whatever. In terms of storing velocity and acceleration impulses, are they best held by the object, or by the associated movement class? The reason I ask is that I can see advantages to both approaches: if you store the physics data in the movement class, you have to pass physics information between class instances when a state change occurs (i.e. impulses, gravity, etc.), but the class has total control over whether those physics are updated or not. An obvious example of how this would be useful is an object affected by something which causes it to ignore gravity, or something like that. On the other hand, if you store the physics data in the object class, it feels more logical and you don't have to go around passing physics impulses and gravity, but the control that the movement class has over the object's physics becomes more convoluted. Basically the difference is between: object->physics stacks (acceleration impulses etc.) ->physics functions ->movement type <-movement type makes physics function calls through object and object->movement type->physics stacks ->physics functions ->object forwards external physics calls onto movement type ->object transfers physics stacks between movement types when state change occurs Are there best practices here?
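
    A sketch of the second layout (physics data owned by the object, movement classes as swappable strategies that decide how that data is updated). All names are illustrative, not from the asker's library, and it is written in C# to match the other sketches on this page:

        using System.Numerics;

        class PhysicsState
        {
            public Vector2 Velocity;
            public Vector2 PendingImpulse;
            public bool AffectedByGravity = true;
        }

        interface IMovementState
        {
            void Update(PhysicsState physics, float dt); // full control over the update
        }

        class AirborneMovement : IMovementState
        {
            public void Update(PhysicsState physics, float dt)
            {
                if (physics.AffectedByGravity)
                    physics.Velocity += new Vector2(0f, 980f) * dt;
                physics.Velocity += physics.PendingImpulse;
                physics.PendingImpulse = Vector2.Zero;
            }
        }

        class GameObject
        {
            public PhysicsState Physics = new PhysicsState();
            public IMovementState Movement = new AirborneMovement();
            public Vector2 Position;

            public void Update(float dt)
            {
                Movement.Update(Physics, dt);      // the movement state decides what happens
                Position += Physics.Velocity * dt;
            }

            // Swapping movement states carries no physics data across,
            // because the data never leaves the object.
            public void SetMovement(IMovementState next) { Movement = next; }
        }

    The movement class still gets full control here (it can ignore gravity, damp impulses, and so on) without any hand-off of physics stacks between states.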

    Read the article

  • Ogre 3d and bullet physics interaction

    - by Tim
    I have been playing around with Ogre3d and trying to integrate bullet physics. I have previously somewhat successfully got this functionality working with irrlicht and bullet and I am trying to base this on what I had done there, but modifying it to fit with Ogre. It is working but not correctly and I would like some help to understand what it is I am doing wrong. I have a state system and when I enter the "gamestate" I call some functions such as setting up a basic scene, creating the physics simulation. I am doing that as follows. void GameState::enter() { ... // Setup Physics btBroadphaseInterface *BroadPhase = new btAxisSweep3(btVector3(-1000,-1000,-1000), btVector3(1000,1000,1000)); btDefaultCollisionConfiguration *CollisionConfiguration = new btDefaultCollisionConfiguration(); btCollisionDispatcher *Dispatcher = new btCollisionDispatcher(CollisionConfiguration); btSequentialImpulseConstraintSolver *Solver = new btSequentialImpulseConstraintSolver(); World = new btDiscreteDynamicsWorld(Dispatcher, BroadPhase, Solver, CollisionConfiguration); ... createScene(); } In the createScene method I add a light and try to setup a "ground" plane to act as the ground for things to collide with.. as follows. I expect there is issues with this as I get objects colliding with the ground but half way through it and they glitch around like crazy on collision. void GameState::createScene() { m_pSceneMgr->createLight("Light")->setPosition(75,75,75); // Physics // As a test we want a floor plane for things to collide with Ogre::Entity *ent; Ogre::Plane p; p.normal = Ogre::Vector3(0,1,0); p.d = 0; Ogre::MeshManager::getSingleton().createPlane( "FloorPlane", Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME, p, 200000, 200000, 20, 20, true, 1, 9000,9000,Ogre::Vector3::UNIT_Z); ent = m_pSceneMgr->createEntity("floor", "FloorPlane"); ent->setMaterialName("Test/Floor"); Ogre::SceneNode *node = m_pSceneMgr->getRootSceneNode()->createChildSceneNode(); node->attachObject(ent); btTransform Transform; Transform.setIdentity(); Transform.setOrigin(btVector3(0,1,0)); // Give it to the motion state btDefaultMotionState *MotionState = new btDefaultMotionState(Transform); btCollisionShape *Shape = new btStaticPlaneShape(btVector3(0,1,0),0); // Add Mass btVector3 LocalInertia; Shape->calculateLocalInertia(0, LocalInertia); // CReate the rigid body object btRigidBody *RigidBody = new btRigidBody(0, MotionState, Shape, LocalInertia); // Store a pointer to the Ogre Node so we can update it later RigidBody->setUserPointer((void *) (node)); // Add it to the physics world World->addRigidBody(RigidBody); Objects.push_back(RigidBody); m_pNumEntities++; // End Physics } I then have a method to create a cube and give it rigid body physics properties. I know there will be errors here as I get the items colliding with the ground but not with each other properly. So I would appreciate some input on what I am doing wrong. 
void GameState::CreateBox(const btVector3 &TPosition, const btVector3 &TScale, btScalar TMass) { Ogre::Vector3 size = Ogre::Vector3::ZERO; Ogre::Vector3 pos = Ogre::Vector3::ZERO; Ogre::Vector3 scale = Ogre::Vector3::ZERO; pos.x = TPosition.getX(); pos.y = TPosition.getY(); pos.z = TPosition.getZ(); scale.x = TScale.getX(); scale.y = TScale.getY(); scale.z = TScale.getZ(); Ogre::Entity *entity = m_pSceneMgr->createEntity( "Box" + Ogre::StringConverter::toString(m_pNumEntities), "cube.mesh"); entity->setCastShadows(true); Ogre::AxisAlignedBox boundingB = entity->getBoundingBox(); size = boundingB.getSize(); //size /= 2.0f; // Only the half needed? //size *= 0.96f; // Bullet margin is a bit bigger so we need a smaller size entity->setMaterialName("Test/Cube"); Ogre::SceneNode *node = m_pSceneMgr->getRootSceneNode()->createChildSceneNode(); node->attachObject(entity); node->setPosition(pos); //node->scale(scale); // Physics btTransform Transform; Transform.setIdentity(); Transform.setOrigin(TPosition); // Give it to the motion state btDefaultMotionState *MotionState = new btDefaultMotionState(Transform); btVector3 HalfExtents(TScale.getX()*0.5f,TScale.getY()*0.5f,TScale.getZ()*0.5f); btCollisionShape *Shape = new btBoxShape(HalfExtents); // Add Mass btVector3 LocalInertia; Shape->calculateLocalInertia(TMass, LocalInertia); // CReate the rigid body object btRigidBody *RigidBody = new btRigidBody(TMass, MotionState, Shape, LocalInertia); // Store a pointer to the Ogre Node so we can update it later RigidBody->setUserPointer((void *) (node)); // Add it to the physics world World->addRigidBody(RigidBody); Objects.push_back(RigidBody); m_pNumEntities++; } Then in the GameState::update() method which which runs every frame to handle input and render etc I call an UpdatePhysics method to update the physics simulation. void GameState::UpdatePhysics(unsigned int TDeltaTime) { World->stepSimulation(TDeltaTime * 0.001f, 60); btRigidBody *TObject; for(std::vector<btRigidBody *>::iterator it = Objects.begin(); it != Objects.end(); ++it) { // Update renderer Ogre::SceneNode *node = static_cast<Ogre::SceneNode *>((*it)->getUserPointer()); TObject = *it; // Set position btVector3 Point = TObject->getCenterOfMassPosition(); node->setPosition(Ogre::Vector3((float)Point[0], (float)Point[1], (float)Point[2])); // set rotation btVector3 EulerRotation; QuaternionToEuler(TObject->getOrientation(), EulerRotation); node->setOrientation(1,(Ogre::Real)EulerRotation[0], (Ogre::Real)EulerRotation[1], (Ogre::Real)EulerRotation[2]); //node->rotate(Ogre::Vector3(EulerRotation[0], EulerRotation[1], EulerRotation[2])); } } void GameState::QuaternionToEuler(const btQuaternion &TQuat, btVector3 &TEuler) { btScalar W = TQuat.getW(); btScalar X = TQuat.getX(); btScalar Y = TQuat.getY(); btScalar Z = TQuat.getZ(); float WSquared = W * W; float XSquared = X * X; float YSquared = Y * Y; float ZSquared = Z * Z; TEuler.setX(atan2f(2.0f * (Y * Z + X * W), -XSquared - YSquared + ZSquared + WSquared)); TEuler.setY(asinf(-2.0f * (X * Z - Y * W))); TEuler.setZ(atan2f(2.0f * (X * Y + Z * W), XSquared - YSquared - ZSquared + WSquared)); TEuler *= RADTODEG; } I seem to have issues with the cubes not colliding with each other and colliding strangely with the ground. I have tried to capture the effect with the attached image. I would appreciate any help in understanding what I have done wrong. Thanks. EDIT : Solution The following code shows the changes I made to get accurate physics. 
void GameState::createScene() { m_pSceneMgr->createLight("Light")->setPosition(75,75,75); // Physics // As a test we want a floor plane for things to collide with Ogre::Entity *ent; Ogre::Plane p; p.normal = Ogre::Vector3(0,1,0); p.d = 0; Ogre::MeshManager::getSingleton().createPlane( "FloorPlane", Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME, p, 200000, 200000, 20, 20, true, 1, 9000,9000,Ogre::Vector3::UNIT_Z); ent = m_pSceneMgr->createEntity("floor", "FloorPlane"); ent->setMaterialName("Test/Floor"); Ogre::SceneNode *node = m_pSceneMgr->getRootSceneNode()->createChildSceneNode(); node->attachObject(ent); btTransform Transform; Transform.setIdentity(); // Fixed the transform vector here for y back to 0 to stop the objects sinking into the ground. Transform.setOrigin(btVector3(0,0,0)); // Give it to the motion state btDefaultMotionState *MotionState = new btDefaultMotionState(Transform); btCollisionShape *Shape = new btStaticPlaneShape(btVector3(0,1,0),0); // Add Mass btVector3 LocalInertia; Shape->calculateLocalInertia(0, LocalInertia); // CReate the rigid body object btRigidBody *RigidBody = new btRigidBody(0, MotionState, Shape, LocalInertia); // Store a pointer to the Ogre Node so we can update it later RigidBody->setUserPointer((void *) (node)); // Add it to the physics world World->addRigidBody(RigidBody); Objects.push_back(RigidBody); m_pNumEntities++; // End Physics } void GameState::CreateBox(const btVector3 &TPosition, const btVector3 &TScale, btScalar TMass) { Ogre::Vector3 size = Ogre::Vector3::ZERO; Ogre::Vector3 pos = Ogre::Vector3::ZERO; Ogre::Vector3 scale = Ogre::Vector3::ZERO; pos.x = TPosition.getX(); pos.y = TPosition.getY(); pos.z = TPosition.getZ(); scale.x = TScale.getX(); scale.y = TScale.getY(); scale.z = TScale.getZ(); Ogre::Entity *entity = m_pSceneMgr->createEntity( "Box" + Ogre::StringConverter::toString(m_pNumEntities), "cube.mesh"); entity->setCastShadows(true); Ogre::AxisAlignedBox boundingB = entity->getBoundingBox(); // The ogre bounding box is slightly bigger so I am reducing it for // use with the rigid body. size = boundingB.getSize()*0.95f; entity->setMaterialName("Test/Cube"); Ogre::SceneNode *node = m_pSceneMgr->getRootSceneNode()->createChildSceneNode(); node->attachObject(entity); node->setPosition(pos); node->showBoundingBox(true); //node->scale(scale); // Physics btTransform Transform; Transform.setIdentity(); Transform.setOrigin(TPosition); // Give it to the motion state btDefaultMotionState *MotionState = new btDefaultMotionState(Transform); // I got the size of the bounding box above but wasn't using it to set // the size for the rigid body. This now does. 
btVector3 HalfExtents(size.x*0.5f,size.y*0.5f,size.z*0.5f); btCollisionShape *Shape = new btBoxShape(HalfExtents); // Add Mass btVector3 LocalInertia; Shape->calculateLocalInertia(TMass, LocalInertia); // CReate the rigid body object btRigidBody *RigidBody = new btRigidBody(TMass, MotionState, Shape, LocalInertia); // Store a pointer to the Ogre Node so we can update it later RigidBody->setUserPointer((void *) (node)); // Add it to the physics world World->addRigidBody(RigidBody); Objects.push_back(RigidBody); m_pNumEntities++; } void GameState::UpdatePhysics(unsigned int TDeltaTime) { World->stepSimulation(TDeltaTime * 0.001f, 60); btRigidBody *TObject; for(std::vector<btRigidBody *>::iterator it = Objects.begin(); it != Objects.end(); ++it) { // Update renderer Ogre::SceneNode *node = static_cast<Ogre::SceneNode *>((*it)->getUserPointer()); TObject = *it; // Set position btVector3 Point = TObject->getCenterOfMassPosition(); node->setPosition(Ogre::Vector3((float)Point[0], (float)Point[1], (float)Point[2])); // Convert the bullet Quaternion to an Ogre quaternion btQuaternion btq = TObject->getOrientation(); Ogre::Quaternion quart = Ogre::Quaternion(btq.w(),btq.x(),btq.y(),btq.z()); // use the quaternion with setOrientation node->setOrientation(quart); } } The QuaternionToEuler function isn't needed so that was removed from code and header files. The objects now collide with the ground and each other appropriately.

    Read the article

  • Farseer Physics: Ways to create a Body?

    - by EdgarT
    I want to create something similar to this using Farseer and Kinect: https://vimeo.com/33500649 This is my implementation so far: http://www.youtube.com/watch?v=GlIvJRhco4U I have the outline vertices and the triangulation of the user. Following the Texture to Polygon sample, I used this line to create the shape, where farseerObject is a list of vertices of the triangles: _compound = BodyFactory.CreateCompoundPolygon(World, farseerObject, 1f, BodyType.Dynamic); But I have to update the body each frame (around 30 fps), and this is very slow; I get just 2 or 3 fps. Is there another (faster) way to create the body from a list of triangles or from the contour vertices?

    Read the article

  • Idea for a physics–computer science joint curriculum and textbook

    - by Ami
    (I apologize in advance if this question is off topic or too vague.) I want to write (and have started outlining) a physics textbook which assumes its reader is a competent computer programmer. Normal physics textbooks teach physical formulas and give problems that are solved with pen, paper, and calculator. I want to provide a book that emphasizes computational physics and how computers can model physical systems, and gives problems of the kind: write a program that can solve a set of physics problems based on user input. Third-party open source libraries would be used to handle most of the computation, and I want to use a high-level language like Java or C#. Besides the fact that I'd enjoy working on this, I think a physics-computer science joint curriculum should be offered in schools, and this is part of a larger agenda to make that happen. I think physics students (like myself) should be learning how to use and leverage computers to solve abstract problems and sets of problems. I think programming languages should be thought of as a useful medium for engaging in many areas of inquiry. Is this an idea worth pursuing? Is the merger of these two subjects in the form of an undergraduate college curriculum feasible? Are there any specific tools I should be leveraging or pitfalls I should be aware of? Has anyone heard of college courses or otherwise that assume this methodology? Are there any books/textbooks out there like the one I'm describing (for physics or any other subject)?

    Read the article

  • How is Basic Physics applied in CS/SE?

    - by Wulf
    What basic physics principles do software engineers and/or computer scientists use to help solve specific or common problems? The first one that came to mind was creating a physics engine for a game; physics is involved, as it requires knowledge of Forces and Motion: Kinematics, Dynamics, Circular Motion. However, I need another example, but haven't come across one that involves basic physics. Please consider the following basic physics (grade 12 level) concepts: Energy and Momentum: Work and Energy, Momentum and Collisions, Gravitational and Celestial Mechanics Electric, Gravitational & Magnetic Fields: Electric Charges and Electric Field, Magnetic Fields and Electromagnetism The Wave Nature of Light: Waves and Light, Wave Effects of Light Matter-Energy Interface: Einstein’s Special Theory of Relativity, Waves, Photons and Matter, Radioactivity and Elementary Particles I will be happy with any response: keywords for Google, names of methods like raycasting, etc.

    Read the article

  • XNA C# Platformer - physics engine or tile based?

    - by Hugh
    I would like to get some opinions on whether I should develop my game using a physics engine (Farseer Physics seems to be the best option) or follow the traditional tile-based method. Quick background: it's a college project and my first game, but I have 4 years of academic programming experience. I just want a basic platformer with a few levels, nothing fancy; I want a shooting mechanic, run and gun, just like Contra or Metal Slug for example; possibly some simple puzzles. I have made a basic prototype with Farseer; the level is hardcoded with collisions and not really tiled, more like big full-screen-sized tiles, with collision bodies drawn manually along the ground and walls etc. My main problem is that I want a simple retro feel to the jumping and physics, but because it's a physics simulation engine it's going to be realistic, whereas the typical in-air controllable physics of platformers isn't realistic. I have to make a box with a wheel body fixture under it to get this effect, and it's glitchy and doesn't feel right. I chose to use a physics engine because I tried the tile method initially and found it very hard to understand; the engine took care of a lot of things to save me time. Mainly, being able to do slopes easily was nice, as was the freedom to draw collision bounds wherever I liked rather than being restricted to a grid, which also gave me more freedom for art design. In conclusion, I don't know which method to pick. I want to use the method that will be the most straightforward to implement and won't give me a headache later on, preferably one with an abundance of tutorials and resources so I don't get "stuck" doing something which has been done a million times before! Let me know if I haven't provided enough information for you to help me! Thanks in advance, Hugh.

    Read the article

  • Need Guidance Making HTML5 Canvas Game Engine

    - by Scriptonaut
    So I have some free time this winter break and want to build a simple 2D HTML5 canvas game engine, mostly a physics engine that will dictate the way objects move and interact (collisions, etc.). I made a basic game here: http://caidenhome.com/HTML%205/pong.html and would like to make more, and thought that this would be a good reason to make a simple framework for this stuff. Here are some questions: Does the scripting language have to be JavaScript? What about Ruby? I will probably write it with jQuery because of its selecting powers, but I'm curious either way. Are there any great guides you guys know of? I want a fast guide that will help me bust out this engine sometime in the next 2 weeks, hopefully sooner. What are some good conventions I should be aware of? What's the best way to get sound? At the moment I'm using something like this: var audioElement = document.createElement('audio'); audioElement.setAttribute('src', 'paddle_col.wav'); audioElement.load(); I'm interested in making this engine lightweight and extremely efficient, and I will do whatever it takes to get great speeds and processing power. I know this question is fairly vague, but I just need a push in the right direction. Thanks :)

    Read the article

  • Separating physics and game logic from UI code

    - by futlib
    I'm working on a simple block-based puzzle game. The game play consists pretty much of moving blocks around in the game area, so it's a trivial physics simulation. My implementation, however, is in my opinion far from ideal and I'm wondering if you can give me any pointers on how to do it better. I've split the code up into two areas: Game logic and UI, as I did with a lot of puzzle games: The game logic is responsible for the general rules of the game (e.g. the formal rule system in chess) The UI displays the game area and pieces (e.g. chess board and pieces) and is responsible for animations (e.g. animated movement of chess pieces) The game logic represents the game state as a logical grid, where each unit is one cell's width/height on the grid. So for a grid of width 6, you can move a block of width 2 four times until it collides with the boundary. The UI takes this grid, and draws it by converting logical sizes into pixel sizes (that is, multiplies it by a constant). However, since the game has hardly any game logic, my game logic layer [1] doesn't have much to do except collision detection. Here's how it works: Player starts to drag a piece UI asks game logic for the legal movement area of that piece and lets the player drag it within that area Player lets go of a piece UI snaps the piece to the grid (so that it is at a valid logical position) UI tells game logic the new logical position (via mutator methods, which I'd rather avoid) I'm not quite happy with that: I'm writing unit tests for my game logic layer, but not the UI, and it turned out all the tricky code is in the UI: Stopping the piece from colliding with others or the boundary and snapping it to the grid. I don't like the fact that the UI tells the game logic about the new state, I would rather have it call a movePieceLeft() method or something like that, as in my other games, but I didn't get far with that approach, because the game logic knows nothing about the dragging and snapping that's possible in the UI. I think the best thing to do would be to get rid of my game logic layer and implement a physics layer instead. I've got a few questions regarding that: Is such a physics layer common, or is it more typical to have the game logic layer do this? Would the snapping to grid and piece dragging code belong to the UI or the physics layer? Would such a physics layer typically work with pixel sizes or with some kind of logical unit, like my game logic layer? I've seen event-based collision detection in a game's code base once, that is, the player would just drag the piece, the UI would render that obediently and notify the physics system, and the physics system would call a onCollision() method on the piece once a collision is detected. What is more common? This approach or asking for the legal movement area first? [1] layer is probably not the right word for what I mean, but subsystem sounds overblown and class is misguiding, because each layer can consist of several classes.

    Read the article

  • What different ways are there to model restitution in a physics engine?

    - by Mikael Högström
    In my physics engine I give a body a restitution value between 0 and 1. When two bodies collide, there seem to be different views on how the restitution of the collision should be calculated. To me the most intuitive approach seems to be taking the average of the two, but some engines seem to take only the larger one. Are there other ways to do it? Also, could the closing velocity or some other parameter come into effect?
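
    The common combination rules side by side, as a sketch (the enum and method names are illustrative). Box2D, for example, uses the maximum, and it also treats restitution as zero below a small closing-velocity threshold so resting contacts stop micro-bouncing, which is one way the closing velocity does come into effect:

        using System;

        enum RestitutionMix { Maximum, Average, Minimum, Product }

        static class Restitution
        {
            public static float Combine(float a, float b, RestitutionMix mix)
            {
                switch (mix)
                {
                    case RestitutionMix.Average: return 0.5f * (a + b);
                    case RestitutionMix.Minimum: return Math.Min(a, b);
                    case RestitutionMix.Product: return a * b;
                    default:                     return Math.Max(a, b);
                }
            }

            // The combined value e scales the impulse along the contact normal
            // by (1 + e); below a small closing speed it is treated as zero.
            public static float EffectiveRestitution(float e, float closingSpeed,
                                                     float velocityThreshold = 1f)
            {
                return closingSpeed < velocityThreshold ? 0f : e;
            }
        }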

    Read the article

  • How to implement physical effect, perspective effect on Android

    - by asedra_le
    I'm researching 2D games for Android in order to implement an Android game project. My project looks a lot like Paper Toss: instead of throwing a page, my game will throw a coin. Suppose I have a coin in three-dimensional space with coordinates A(x,y,z). I throw the coin forward; after 1/100 of a second, it moves from A(x,y,z) to A'(x',y',z'). This gives me two problems to solve. First, determine the formulas that can be used to compute the coordinates of the coin at time t. I am still researching this and have no idea how to solve it. Second, map the three-dimensional points to two dimensions and use those new (two-dimensional) coordinates to draw the coin on screen. I have found two solutions for this problem: orthographic projection and perspective projection. However, an old friend told me that OpenGL supports solving problems like these. Does anybody have experience with these problems? Please help me :) Thanks for reading my question.
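
    A sketch covering both halves of the question: constant-acceleration motion for the coin, then a simple perspective projection. The focal length, the gravity value, and the camera-at-the-origin convention (+Z pointing away from the camera) are illustrative assumptions:

        using System.Numerics;

        static class CoinToss
        {
            static readonly Vector3 Gravity = new Vector3(0f, -9.8f, 0f);

            // Position after time t from initial position p0 and throw velocity v0:
            // p(t) = p0 + v0 t + 0.5 g t^2
            public static Vector3 PositionAt(Vector3 p0, Vector3 v0, float t)
                => p0 + v0 * t + 0.5f * Gravity * t * t;

            // Perspective projection: points farther away (larger z) move toward
            // the screen centre and shrink by the same factor.
            public static Vector2 Project(Vector3 p, float focalLength,
                                          float screenCx, float screenCy)
            {
                float scale = focalLength / (focalLength + p.Z);
                return new Vector2(screenCx + p.X * scale,
                                   screenCy - p.Y * scale); // screen Y grows downward
            }
        }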

    Read the article

  • How can I stop my Jitter physics meshes being offset?

    - by ben1066
    I'm developing a C# game engine and have hit a snag trying to add physics. I'm using XNA for graphics and Jitter for physics. I am trying to split the XNA model into its meshes, then create a ConvexHull for each mesh. I then attempt to combine those into a CompoundObject; this, however, isn't working, and depending on the model the meshes are offset by different amounts. This is the code I'm currently using and it gives me: Any ideas?

    Read the article

  • IndexOutOfRangeException on World.Step after enabling/disabling a Farseer physics body?

    - by WilHall
    Earlier, I posted a question asking how to swap fixtures on the fly in a 2D side-scroller using Farseer Physics Engine. The ultimate goal being that the player's physical body changes when the player is in different states (I.e. standing, walking, jumping, etc). After reading this answer, I changed my approach to the following: Create a physical body for each state when the player is loaded Save those bodies and their corresponding states in parallel lists Swap those physical bodies out when the player state changes (which causes an exception, see below) The following is my function to change states and swap physical bodies: new protected void SetState(object nState) { //If mBody == null, the player is being loaded for the first time if (mBody == null) { mBody = mBodies[mStates.IndexOf(nState)]; mBody.Enabled = true; } else { //Get the body for the given state Body nBody = mBodies[mStates.IndexOf(nState)]; //Enable the new body nBody.Enabled = true; //Disable the current body mBody.Enabled = false; //Copy the current body's attributes to the new one nBody.SetTransform(mBody.Position, mBody.Rotation); nBody.LinearVelocity = mBody.LinearVelocity; nBody.AngularVelocity = mBody.AngularVelocity; mBody = nBody; } base.SetState(nState); } Using the above method causes an IndexOutOfRangeException when calling World.Step: mWorld.Step(Math.Min((float)nGameTime.ElapsedGameTime.TotalSeconds, (1f / 30f))); I found that the problem is related to changing the .Enabled setting on a body. I tried the above function without setting .Enabled, and there was no error thrown. Turning on the debug views, I saw that the bodies were updating positions/rotations/etc properly when the state was changes, but since they were all enabled, they were just colliding wildly with each other. Does Enabling/Disabling a body remove it from the world's body list, which then causes the error because the list is shorter than expected? Update: For such a straightforward issue, I feel this question has not received enough attention. Has anyone else experienced this? Would anyone try a quick test case? I know this issue can be sidestepped - I.e. by not disabling a body during the simulation - but it seems strange that this issue would exist in the first place, especially when I see no mention of it in the documentation for farseer or box2d. I can't find any cases of the issue online where things are more or less kosher, like in my case. Any leads on this would be helpful.
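
    One commonly suggested workaround (a sketch, not Farseer documentation) is to defer Enabled changes so they happen at a single, known point in the frame - outside any collision callback and immediately before World.Step - by queuing the requests:

        using System.Collections.Generic;
        using FarseerPhysics.Dynamics;

        class BodySwapQueue
        {
            readonly Queue<(Body body, bool enabled)> _pending = new Queue<(Body, bool)>();

            // Call from SetState instead of touching Body.Enabled directly.
            public void Request(Body body, bool enabled) => _pending.Enqueue((body, enabled));

            // Call once per frame, immediately before World.Step.
            public void Apply()
            {
                while (_pending.Count > 0)
                {
                    var (body, enabled) = _pending.Dequeue();
                    body.Enabled = enabled;
                }
            }
        }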

    Read the article

  • Getting into game/game engine programming

    - by Darkslash
    So I am interested in learning game programming, but I really have an interest in the lower-level engineering in games. I have OpenGL experience, and I am really interested in learning more about implementing AI, physics, etc. I have a computer science degree, so I really like getting into technical stuff. Many times when I ask about this sort of thing, I get a lot of "Use an engine", "Use Unity3D", "Why waste your time writing code that already exists", etc. My idea was to use simpler libraries such as SFML or XNA so that I could learn how to implement the more complex systems. The thing is, although I do want to write games, I want to learn things that using something like Unity simply doesn't teach you. My goal is not to make a current-generation-quality 3D game to sell; I just want to make some cool smaller games and learn all I can about the programming side of game development. Is this something that people just do not do anymore? It seems like everywhere I turn people are using Unity or UDK or GameMaker. I fully understand why you would use tools like these, but I can't see how they would suit my purposes. So where does someone like myself turn? Am I trying to learn something that people just do not bother doing anymore? Is the innovation in this area gone, and is it all about gameplay now? I'm sorry if this question seems silly, but I am genuinely interested in knowing more about this and meeting more people who are interested in this sort of thing.

    Read the article

  • Game physics presentation by Richard Lord, some questions

    - by Steve
    I have been implementing (in XNA) the examples from this physics presentation by Richard Lord, in which he discusses various integration techniques. Bearing in mind that I am a newcomer to game physics (and physics in general), I have some questions. Fifteen slides in, he shows ActionScript code for a gravity example and an animation showing a bouncing ball. The ball bounces higher and higher until it is out of control. I implemented the same in C#/XNA, but my ball appeared to bounce at a constant height. The same applies to the next example, where the ball bounces lower and lower. After some experimentation I found that if I switched to a fixed timestep, and then on the first iteration of Update() set the time variable equal to the elapsed milliseconds (16.6667), I would see the same behaviour. Doing this essentially set the framerate, velocity and acceleration to zero for the first update and introduced errors(?) into the algorithm, causing the ball's velocity to increase (or decrease) over time. I think! My question is: does this make the integration method used poor? Or does it demonstrate that the method is poor when used with a variable timestep, because you can't pass in a valid value for the first lot of calculations (since you cannot know the framerate in advance)? I will continue my research into physics, but can anyone suggest a good method to get my feet wet? I would like to experiment with variable timesteps, acceleration that changes over time, and probably friction. Would the Time-Corrected Verlet be OK for this?
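
    For reference, a sketch of the Time-Corrected Verlet step using the formula usually quoted for it (names are illustrative). Note that it depends on the previous position and the previous timestep, which is exactly why the first frame needs seeding; seeding it badly injects a fake initial velocity, which is consistent with the behaviour described above:

        using System.Numerics;

        class TcvBody
        {
            public Vector2 Position;
            Vector2 _previousPosition;
            float _previousDt;
            bool _initialised;

            public void Step(Vector2 acceleration, float dt)
            {
                if (!_initialised)
                {
                    _previousPosition = Position; // zero implied velocity on frame one
                    _previousDt = dt;
                    _initialised = true;
                }

                // x_{i+1} = x_i + (x_i - x_{i-1}) * (dt_i / dt_{i-1}) + a * dt_i^2
                Vector2 next = Position
                             + (Position - _previousPosition) * (dt / _previousDt)
                             + acceleration * dt * dt;

                _previousPosition = Position;
                Position = next;
                _previousDt = dt;
            }
        }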

    Read the article

  • Physics-based dynamic audio generation in games

    - by alexc
    I wonder if it is possible to generate audio dynamically without any (!) audio assets, using pure mathematics/physics and some input values like material properties and the spatial distribution of content in scene space. What I have in mind is something like a scene with a concrete floor, a wooden table, and a glass on it. Now let's assume a force pushes the glass towards the edge of the table, and the glass falls onto the floor and shatters. Near-realistic glass destruction itself would be possible using voxels and a good physics engine, but what about the sound the glass makes while shattering? I believe there is a way to generate that sound, because the physics of sound is fairly well understood these days, but how computationally costly would that be? Consumer hardware or supercomputers? Do any of you know some good resources/videos of such an experiment?
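
    The usual starting point here is modal synthesis: an impact is rendered as a sum of exponentially decaying sine "modes" whose frequencies and decay rates come from the material (glass rings long and high, concrete is dull and short), and that much is cheap enough for consumer hardware. A sketch (the mode values in the usage comment are illustrative, not measured glass data):

        using System;

        static class ModalImpact
        {
            // Returns mono samples at the given sample rate for one impact.
            public static float[] Synthesize(
                (float freqHz, float amp, float decayPerSec)[] modes,
                float seconds, int sampleRate = 44100)
            {
                int n = (int)(seconds * sampleRate);
                var samples = new float[n];
                for (int i = 0; i < n; i++)
                {
                    float t = i / (float)sampleRate;
                    float s = 0f;
                    foreach (var m in modes)
                        s += m.amp * MathF.Exp(-m.decayPerSec * t)
                                   * MathF.Sin(2f * MathF.PI * m.freqHz * t);
                    samples[i] = s;
                }
                return samples;
            }
        }

        // Example "glassy" modes (illustrative): a few sharp, slowly decaying partials.
        // var clink = ModalImpact.Synthesize(
        //     new[] { (1800f, 0.6f, 6f), (2950f, 0.4f, 8f), (4700f, 0.25f, 12f) }, 1.5f);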

    Read the article

  • Physics from other games

    - by Carlosrdz1
    I'm making a platform engine with XNA Game Studio, and I've solved almost everything related to collisions. But now I'm searching for good physics for the player. I'm trying to emulate characters from other games, like Mario from Super Mario World or Mega Man from Mega Man X. Do you know a website or something where the physics of those games are documented? I remember seeing a page with something like that. Or what do you think is the best process for emulating physics from other games? Just trial and error? Thank you.
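
    Rather than copying exact constants, the classic-platformer feel is usually reverse-engineered from two numbers that can be read straight off gameplay footage: peak jump height and time to reach the peak. A sketch (names and the example values are illustrative):

        using System;

        static class JumpTuning
        {
            // From h = v0*t - 0.5*g*t^2 at the apex (where vertical velocity is zero):
            //   gravity      g  = 2h / t^2
            //   jump speed   v0 = 2h / t
            public static (float gravity, float jumpSpeed) FromFeel(
                float apexHeight, float timeToApex)
            {
                float gravity = 2f * apexHeight / (timeToApex * timeToApex);
                float jumpSpeed = 2f * apexHeight / timeToApex;
                return (gravity, jumpSpeed);
            }
        }

        // E.g. a jump that peaks at 4 tiles (4 * 16 px) in 0.4 s (values illustrative):
        // var (g, v0) = JumpTuning.FromFeel(apexHeight: 64f, timeToApex: 0.4f);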

    Read the article

  • 2D Physics in a networked game (iOS)?

    - by Pedro
    I am researching the possibilities for a new iOS game. It's going to be a run-n-gun type platformer, and I'm looking into the possibility of co-op multiplayer. The game itself wouldn't be very physics intensive; there will most likely be 20-30 physics bodies at any given time. For the multiplayer, I think I would have one player "hosting" and up to 3 others connecting via the Internet. Here's my first question: are there any 2D physics engines that work over a network (preferably open source)? My second question: does anyone have any thoughts on using a non-networked engine (like Box2D or Chipmunk) and adding the networking component? Since there would not be very much information sent, do you think it would cause a lot of lag?

    Read the article
