Search Results

Search found 26043 results on 1042 pages for 'development trunk'.

Page 508 of 1042

  • How should I access frame buttons from a controller in an MVC approach?

    - by Loris
    I'm developing an Italian card game using the MVC pattern. I have a class GameFrame that contains the view; the user's cards are buttons (JButton objects). I have three controllers: GameController controls the game in general and contains the game loop; HumanPlayerController handles the user input; ComputerPlayerController contains the computer's AI. PlayerController is an interface with a makeTurn() method, implemented by HumanPlayerController and ComputerPlayerController. HumanPlayerController also implements ActionListener. What is the right way to access the GameFrame buttons? I need them to understand which card was chosen. GameFrame and HumanPlayerController are in different packages. Should I make the JButtons public?
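
    A common alternative to making the buttons public is to have the view raise a higher-level "card chosen" event that the controller subscribes to, so the JButtons stay private. A minimal sketch of that idea in C# (the pattern maps directly onto Java Swing with a listener interface); GameFrame, Card and CardChosen here are illustrative names, not the poster's actual API:

      // Sketch: the view raises a "card chosen" event instead of exposing its buttons.
      // GameFrame, Card and CardChosen are illustrative names, not the original API.
      using System;

      public class Card
      {
          public string Name { get; set; }
      }

      public class GameFrame
      {
          // Raised when the user clicks one of the (private) card buttons.
          public event Action<Card> CardChosen;

          // Each button's click handler calls this; the buttons never leave the view.
          private void OnCardButtonClicked(Card card) => CardChosen?.Invoke(card);
      }

      public class HumanPlayerController
      {
          public HumanPlayerController(GameFrame view)
          {
              // The controller learns which card was chosen without touching any button.
              view.CardChosen += card => Console.WriteLine($"Player chose {card.Name}");
          }
      }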

    Read the article

  • C# XNA: Make the rendered screen a Texture2D

    - by redcodefinal
    I am working on a cool little city generator that builds cities in an isometric perspective. However, a problem arose: once the grid size went over a certain limit, there was awful lag. I found the main problem to be in the draw method, so I took the step of rendering only the items that were on screen. That helped, but not by much. The idea I have is to render the frame once, take a snapshot, and then display that as a Texture2D on screen. This way I don't have to render 1,000,000 objects every frame, since they don't change anyway. TL;DR - I want to take a snapshot of an already rendered frame, turn it into a Texture2D, and render that to the screen instead of all the objects. Any help appreciated.
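
    In XNA 4.0 the usual tool for this is RenderTarget2D, which is itself a Texture2D: draw the static city into it once, then draw that single texture every frame. A minimal sketch, assuming a hypothetical DrawCity() routine standing in for whatever currently draws all the tiles:

      // Draw the static city into a RenderTarget2D once, reuse it as a Texture2D each frame.
      using Microsoft.Xna.Framework;
      using Microsoft.Xna.Framework.Graphics;

      public class CityGame : Game
      {
          GraphicsDeviceManager graphics;
          SpriteBatch spriteBatch;
          RenderTarget2D cityCache;

          public CityGame() { graphics = new GraphicsDeviceManager(this); }

          protected override void LoadContent()
          {
              spriteBatch = new SpriteBatch(GraphicsDevice);
              cityCache = new RenderTarget2D(GraphicsDevice,
                  GraphicsDevice.PresentationParameters.BackBufferWidth,
                  GraphicsDevice.PresentationParameters.BackBufferHeight);

              // Render the whole city once into the off-screen target.
              GraphicsDevice.SetRenderTarget(cityCache);
              GraphicsDevice.Clear(Color.Transparent);
              DrawCity();                            // hypothetical tile-drawing routine
              GraphicsDevice.SetRenderTarget(null);  // back to the back buffer
          }

          protected override void Draw(GameTime gameTime)
          {
              GraphicsDevice.Clear(Color.Black);
              spriteBatch.Begin();
              // A RenderTarget2D is a Texture2D, so it can be drawn directly.
              spriteBatch.Draw(cityCache, Vector2.Zero, Color.White);
              spriteBatch.End();
              base.Draw(gameTime);
          }

          private void DrawCity() { /* draw all the isometric tiles here */ }
      }

    Note that render target contents can be lost on a device reset, so the cache may occasionally need to be redrawn, and this only works as long as the cached part of the scene really is static.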

    Read the article

  • Units in tile world

    - by Vilzow
    I've started making a 2D sidescroller. The camera and world rendering work as I expect, but now comes the physics part of the world. What I need is for one tile in the x direction (or y direction) to correspond to 1 meter. Since I have a variable time step (it's an Android mobile game), I can't figure out how, because the timing and velocity will always depend on the device. So, is there a good way to make one tile correspond to 1 meter? Otherwise the physics implementation would get weird later on.
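
    One common approach is to keep all physics in meters and seconds (1 tile = 1 m), scale every velocity by the frame's delta time, and convert to pixels only when drawing. A rough sketch, where TileSizePx (pixels per tile, i.e. pixels per meter) is an assumed constant:

      // Physics runs in meters and seconds; rendering converts to pixels at the end.
      public class PlayerPhysics
      {
          public const float TileSizePx = 32f;    // assumed pixels per tile (= per meter)

          public float PositionMeters;            // x position in meters (= tiles)
          public float VelocityMeters = 4.5f;     // 4.5 m/s, device independent

          public void Update(float dtSeconds)
          {
              // dtSeconds is the variable frame time; scaling by it keeps the distance
              // moved per real second the same on every device.
              PositionMeters += VelocityMeters * dtSeconds;
          }

          // Only the renderer cares about pixels.
          public float PositionInPixels() => PositionMeters * TileSizePx;
      }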

    Read the article

  • How to keep the display tick rate steady when using continuous collision detection?

    - by nas Ns
    (I've just found this forum, so I hope it's OK to repost my question here; I posted it on Stack Overflow, but it looks like I might get better help here.) I've implemented a basic particle motion simulation with continuous collision detection, but there is a small display issue. Take the simple case of circles moving inside a square: all collisions are elastic, there is no friction and no gravity, so every particle moves at constant speed between collisions. What I do now is this: let the simulation time step be 1 second (for example); that is how far the simulation is advanced before the new state is displayed, unless there is a collision sooner. At the start of each step, the time of the next collision between any two particles, or between a particle and a wall, is determined; call this the TOC, and say it is 0.5 seconds here. Since the TOC is smaller than the standard time step, the system is advanced by the TOC and displayed, so the new frame shows the collision just taking place (say two circles just touching each other, or a circle just touching a wall). Next, the collision(s) are resolved (speeds updated, directions changed, etc.) and a new step begins. Now assume no collision is detected within the next 1 second (the two circles above are no longer colliding; even though they are still touching, their speeds show they are moving apart), so the simulation is advanced by the full one-second standard step, the particles are moved, and the new state is displayed. You can see what has just happened: one frame covered 0.5 seconds, the next covers 1 second, maybe the third frame is displayed after 2 seconds, maybe the fourth after 2.8 seconds (because the TOC was 0.8 seconds then), and so on. The result is that the motion of a particle on screen appears to speed up or slow down even though it is moving at constant speed and was not even involved in a collision; looking at one particle on its own, I see it suddenly speeding up or slowing down because another particle hit a wall. The display tick is not uniform, so the changing frame rate gives the false illusion that a particle is moving at non-constant speed, and the motion is not smooth because the screen is not updated at a constant rate. I am not able to figure out how to fix this. If I want to show two particles at the moment of collision, I must draw the screen at irregular times; drawing the screen at a fixed tick interval shows the two particles before the collision and then after it, never at the moment they collide, which looked bad when I tried it. So how do real games handle this? How do I display collisions at the instant they happen while keeping the display tick constant? These two requirements seem to contradict each other.
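
    A common answer is to decouple the two clocks: keep the display tick fixed and, inside each tick, advance the physics in sub-steps of min(TOC, time left in the tick), resolving collisions as they occur, then render exactly once at the end of the tick. A rough sketch of that inner loop, where TimeOfNextCollision(), Advance(), ResolveCollisions() and Render() are stubs standing in for the routines described above:

      using System;

      // Fixed display tick; variable-length physics sub-steps inside it.
      public class Simulation
      {
          const float FrameTime = 1f / 60f;   // constant display tick, in seconds

          public void RunOneDisplayFrame()
          {
              float remaining = FrameTime;
              while (remaining > 0f)
              {
                  float toc = TimeOfNextCollision();      // earliest collision from now
                  float step = Math.Min(toc, remaining);  // never step past the frame end

                  Advance(step);                          // move every particle by 'step'
                  if (step >= toc)
                      ResolveCollisions();                // the collision fell inside this frame

                  remaining -= step;
              }

              Render();   // exactly one draw per fixed tick, so on-screen speed stays steady
          }

          // Stubs for the simulation routines described in the question.
          float TimeOfNextCollision() => float.PositiveInfinity;
          void Advance(float dt) { }
          void ResolveCollisions() { }
          void Render() { }
      }

    The frame drawn at the end of the tick shows the particles a fraction of a tick after the exact contact time, but because every collision is still resolved at its exact TOC inside the sub-steps, nothing is missed and on-screen speeds stay visually constant.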

    Read the article

  • Avatar creation / dressing feature

    - by milesmeow
    What is the effort required to use a game engine such as Unreal or Unity and create an avatar customization feature, complete with clothes? The user should be able to customize the body features, and the clothes then need to fit onto the customized body. What is needed? Can you create one set of 3D models for the clothes and somehow programmatically have them adapt to the body shape, i.e. so that the same shirt model fits a skinny person as well as someone with a big beer belly? How difficult is this? What are the steps needed to implement this avatar creation/dressing feature? I'm basically talking about something like in Rock Band 3.

    Read the article

  • Gosu Ruby on Windows: no allocator for Image [on hold]

    - by user2812818
    I am trying to run the Gosu tutorial on Windows XP with Ruby 1.9.3. It quits with `new': allocator undefined for Gosu::Image (TypeError) when trying to initialize a new Image: require 'gosu' require 'rubygems' class GameWindow < Gosu::Window def initialize super(640, 480, false) self.caption = "Gosu Tutorial Game" @background_image = Gosu::Image.new(self, "/media/123.bmp", true) end end I made sure the image is there and is a PNG/BMP. I know it is something simple, maybe to do with the required DLLs? I'm just not sure what... thanks, sgv

    Read the article

  • Problems texture mapping in modern OpenGL 3.3 using GLSL #version 150

    - by RubyKing
    I'm trying to do texture mapping using modern OpenGL and GLSL #version 150. The problem is that the texture shows, but with a weird flicker; I can show a video here: http://www.youtube.com/watch?v=xbzw_LMxlHw. I have everything set up as best I can: my texture coordinates are in my vertex array and sent up to OpenGL, my fragment color is set from the texture and texel values, my vertex shader passes the texture coordinates on to the fragment shader, and my ins and outs are set up, yet I still don't know what I'm missing that could be causing the flicker. Here is my code. FRAGMENT SHADER: #version 150 uniform sampler2D texture; in vec2 texture_coord; varying vec3 texture_coordinate; void main(void){ gl_FragColor = texture(texture, texture_coord); } VERTEX SHADER: #version 150 in vec4 position; out vec2 texture_coordinate; out vec2 texture_coord; uniform vec3 translations; void main() { texture_coord = (texture_coordinate); gl_Position = vec4(position.xyz + translations.xyz, 1.0); } Last is my vertex array with texture coordinates: GLfloat vVerts[] = { 0.5f, 0.5f, 0.0f, 0.0f, 1.0f , 0.0f, 0.5f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.5f, 0.0f, 0.0f, 1.0f, 0.0f}; //tex x and y If you need to see all the code in its fullest glory, here is a link to every file: http://ideone.com/7kQN3 Thank you for your help.

    Read the article

  • How can I read a portion of one Minecraft world file and write it into another?

    - by RapierMother
    I'm looking to read block data from one Minecraft world and write the data into certain places in another. I have a Minecraft world, let's say "TemplateWorld", and a 2D list of Point objects. I'm developing an application that should use the x and y values of these Points as x and z reference coordinates from which to read constant-sized areas of blocks from the TemplateWorld. It should then write these blocks into another Minecraft world at constant y coordinates, with x and z coordinates determined by each Point's index in the 2D list. The issue is that, while I've found a decent amount of information online about Minecraft world formats, I haven't found what I really need: a breakdown, by hex address, of where and what everything is. For example, the TemplateWorld could actually be a .schematic file rather than a world; I just need to be able to read the bytes of the file, know that the actual block data always starts at a certain address (or after a certain instance of FF, etc.), and know how it's stored. Once I know that, it's easy as pie to just read the bytes and store them.

    Read the article

  • Mobile: Physics and movement actions

    - by meganegora
    I've been using SpriteKit for a while for a few small games. One thing I've noticed is that SpriteKit is the first game framework I've used that allows me to apply move actions to physics bodies (without anything screwing up, at least). Are there any cross-platform game frameworks that allow move actions on physics bodies, not impulses? I've used cocos2d in the past, and when I tried CCMoveBy on physics bodies the simulation would get totally confused; I'd rather not use cocos2d anyway. I'm asking because I want to make cross-platform games and SpriteKit is iOS only.

    Read the article

  • Client-side prediction/simulation question

    - by Legendre
    I found a related question, but it doesn't have what I need. Client A sends input to move at T0. The server receives the input at T1. All clients receive the change at T2. Question: with client-side prediction, client A would start moving at T0, client-side. All other clients receive the change at T2, so to them client A only started moving at T2. If I understand correctly, client B will always see client A's past position and not his current position. How do I keep client B and client A in sync?
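
    In general the delay cannot be eliminated; the usual approach is to accept that remote players are shown slightly in the past and to interpolate between the last two server snapshots so their motion stays smooth (optionally extrapolating a little to hide latency), while client A keeps predicting itself and is corrected by the server. A rough sketch of snapshot interpolation for client B's view of client A, where the Snapshot fields and the 100 ms delay are illustrative choices:

      using System.Collections.Generic;

      // Render remote players a fixed delay in the past, blending between the two
      // server snapshots that bracket that time.
      public struct Snapshot
      {
          public double Time;      // server timestamp, seconds
          public float X, Y;       // remote player position
      }

      public class RemotePlayerView
      {
          const double InterpolationDelay = 0.1;            // render 100 ms in the past
          readonly List<Snapshot> buffer = new List<Snapshot>();

          public void OnServerSnapshot(Snapshot s) => buffer.Add(s);

          public (float x, float y) PositionAt(double clientTime)
          {
              if (buffer.Count == 0) return (0f, 0f);
              double renderTime = clientTime - InterpolationDelay;

              // Find the two snapshots that bracket renderTime and blend them.
              for (int i = buffer.Count - 1; i > 0; i--)
              {
                  if (buffer[i - 1].Time <= renderTime && renderTime <= buffer[i].Time)
                  {
                      Snapshot a = buffer[i - 1], b = buffer[i];
                      float t = (float)((renderTime - a.Time) / (b.Time - a.Time));
                      return (a.X + (b.X - a.X) * t, a.Y + (b.Y - a.Y) * t);
                  }
              }

              // No bracketing pair yet: fall back to the most recent known position.
              Snapshot last = buffer[buffer.Count - 1];
              return (last.X, last.Y);
          }
      }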

    Read the article

  • Which 3D file formats support multiple animations? [on hold]

    - by Justin
    I'm working on a 3D application that uses Assimp to import 3D models with animations. Personally, I use Blender to create the models and animations. I'm having trouble exporting multiple animations, however. For example, I'd like to have an idle animation, a walk animation, a run animation, etc. So far I've tried COLLADA and DirectX without much success: the COLLADA export includes the first animation but none of the others, and the DirectX export doesn't include any animation at all. Which 3D file formats support multiple animations? (Preferably one that Assimp can import. Also, the Assimp website says that it doesn't support .blend files with animation, otherwise I'd just do that.)

    Read the article

  • Using normals in DirectX 10

    - by Dave
    I've got a working OBJ loader that loads vertices, indices, texture coordinates, and normals. As of right now it doesn't process the texture coordinates or normals, but it stores them in arrays and creates a valid mesh from the vertices and indices. Now I am trying to figure out how I can make the shader use the correct normal from the array for the current vertex, given that I can't call setnormals() on my mesh. If I were to just use an index into my array of normals corresponding to the vertex index, how would I retrieve the index of the vertex the shader is currently processing? By the way, I am trying to write a Blinn-Phong shader technique. Also, when I create the input layout and I've added the NORMAL semantic to it, how would I list the multiple semantics in that single parameter? Would I just separate them with a space? PS: If you need to see any code, just let me know.

    Read the article

  • What's the best way to access Neo4j from Django?

    - by abdel
    It seems that I've found something that left me confused: there are two Neo4j packages to download for Python. The first one is http://pypi.python.org/pypi/neo4j-embedded and the second one is https://svn.neo4j.org/components/neo4j.py/trunk/ What's the difference between the two? The first one seems to be big (in size), so does this mean that if I use it I won't need the Neo4j community release (milestone)? When I installed the first one and tried to test a Django example, it seemed that the directory named "model" (https://svn.neo4j.org/components/neo4j.py/trunk/src/main/python/neo4j/model/) was missing. So what's the difference, and which will be better to use with Django? And what about this one: http://pypi.python.org/pypi/neo4django/

    Read the article

  • Unity: changing gravity and stopping the character when he hits a wall

    - by Sylario
    I am currently working on a 2D puzzle game in the Unity engine. One aspect of this game is the ability to rotate the level by 90°, which also rotates gravity. The main character is not directly controlled by the player; instead he falls when the level is rotated. When the main character hits a wall, he should stop moving; if I don't stop him, he sort of blinks and shakes against the wall. To stop him I detect the collision and, depending on the current rotation state, stop the player on "vertical" or "horizontal" tags when an OnCollisionEnter occurs. I have to do that because when the player falls onto his relative ground, he must not stop as if he had touched a wall. My problem is that the 'side' of a platform and the 'top' of a wall use the same tag, and thus don't give the correct tag to my character. I tried putting a very small invisible box on the top/side of the elements, but the collision occurs nevertheless: it seems that when the player falls and hits something, he goes through it a bit before being put back at the correct position by Unity. Is there a way 1) to not stop my character but make him appear immobile on screen, or 2) to detect an "I cannot move any more" collision other than by using collisions?
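
    One alternative to tags is to compare the collision contact normal with the current gravity direction: a surface whose normal points against gravity is "ground" for the current rotation, anything else is a wall. A hedged Unity sketch (3D physics, matching the OnCollisionEnter in the question; the 0.5 threshold is an arbitrary choice):

      using UnityEngine;

      // Classify each collision by its contact normal relative to the current gravity
      // direction instead of by tag, so it keeps working after the level is rotated.
      public class WallOrGroundDetector : MonoBehaviour
      {
          void OnCollisionEnter(Collision collision)
          {
              Vector3 up = -Physics.gravity.normalized;        // "up" for the current rotation
              Vector3 normal = collision.contacts[0].normal;   // surface we just hit

              if (Vector3.Dot(normal, up) > 0.5f)
              {
                  // The surface faces against gravity: relative ground, keep the falling rules.
                  Debug.Log("Landed on relative ground");
              }
              else
              {
                  // Mostly sideways (or overhead) surface: treat it as a wall and stop.
                  Debug.Log("Hit a wall, stop moving");
              }
          }
      }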

    Read the article

  • Making a clone of a game legal?

    - by user782220
    My question is similar to a previous question. Consider the following clone of StarCraft: change the artwork, sound, music, and the names of units, but leave unit hit points, damage, and movement speed unchanged, and change ability names but not ability effects. Is that considered illegal? In other words, is copying the unit HP, damage, etc. considered illegal even if everything else is changed?

    Read the article

  • Why is C++ used for game engines? How about its future in game engines?

    - by kasperov
    C++, as I have seen, is heavily used in 3D video game engines. Is it because of performance, legacy code, or libraries such as DirectX? If performance, libraries, and code infrastructure are the reasons, doesn't that make C++ indispensable, at least for game engines (i.e. we have no other option even in the very distant future)? I'm asking because I'd like to know the upcoming trends in game engines.

    Read the article

  • Using Lua in Kobold2D to control parameters

    - by nycynik
    Is there a tutorial on using Lua in Kobold2D? I want to know if it's possible to use it to control game behavior (like maximum speed, timer decrease, and bonus points) by uploading a new script to the app. I found this link in the FAQ: http://www.kobold2d.com/pages/viewpage.action?pageId=917888 but it does not mention whether I can replace the Lua script from within the game and reload it. Is that possible? Or should I just have a parameter file instead that I can download and replace?

    Read the article

  • Non-random enemy movement implementation

    - by user601836
    I would like to implement enemy movement on an X-Y grid. Would it be a good idea to have a predefined table with an initial X-Y position and a predefined "surveillance path"? Each enemy would follow its path until it detects a player, at which point it starts chasing the player using a chasing algorithm. According to a friend of mine this implementation is good because a well-designed path gives the player a sense of realism.
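
    For what it's worth, that design is very common: a waypoint list per enemy plus a simple state switch to chasing once the player comes into range. A rough grid-based sketch, where the detection range and the one-step movement are illustrative placeholders for a real chasing algorithm:

      using System.Collections.Generic;

      public struct Cell { public int X, Y; public Cell(int x, int y) { X = x; Y = y; } }

      // Patrol a predefined surveillance path; switch to chasing when the player is close.
      public class Enemy
      {
          const int DetectionRange = 4;               // grid cells, illustrative value

          readonly List<Cell> path;                   // predefined surveillance path
          int nextWaypoint;
          public Cell Position;

          public Enemy(List<Cell> surveillancePath)
          {
              path = surveillancePath;
              Position = path[0];
          }

          public void Update(Cell player)
          {
              bool playerSpotted =
                  System.Math.Abs(player.X - Position.X) +
                  System.Math.Abs(player.Y - Position.Y) <= DetectionRange;

              Cell target = playerSpotted ? player : path[nextWaypoint];
              StepToward(target);

              // Advance along the looped path once the current waypoint is reached.
              if (!playerSpotted && Position.X == target.X && Position.Y == target.Y)
                  nextWaypoint = (nextWaypoint + 1) % path.Count;
          }

          // One axis-aligned grid step toward the target; a stand-in for a real
          // pathfinding/chasing algorithm such as A*.
          void StepToward(Cell target)
          {
              if (Position.X != target.X) Position.X += System.Math.Sign(target.X - Position.X);
              else if (Position.Y != target.Y) Position.Y += System.Math.Sign(target.Y - Position.Y);
          }
      }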

    Read the article

  • How do I access the PhysicalMaterial from the Actor class?

    - by EmAdpres
    I use Projectile for my weapon system, and UDKProjectile has two main functions for handling projectile hits (the bullets of my weapon): simulated function ProcessTouch(Actor Other, Vector HitLocation, Vector HitNormal) // for Actors, and simulated event HitWall(vector HitNormal, actor Wall, PrimitiveComponent WallComp) // for everything except Actors (I guess). In the first method, the function just gives me the actor that I hit, and my question is: how can I get that actor's physical material from the first parameter (Other), in order to react to it properly (for example, play a proper collision sound)? A tricky (but hateful) way which I know works is to make a Trace from a little behind that actor to the actor and use the HitInfo parameter, which includes the physical material, but there should be a more standard way!

    Read the article

  • 2D Topdown Shooter - Player Movement Relative to Mouse

    - by Jarmo
    I'm trying to make a top-down 2D space game for my school project. I'm almost done, but I just want to add a few little things to make the game more fun to play. if (keystate.IsKeyDown(Keys.W)) { vPlayerPos += Vector2.Normalize(new Vector2(Mouse.GetState().X - vPlayerPos.X, Mouse.GetState().Y - vPlayerPos.Y)) * 3; rPlayer.X = (int)vPlayerPos.X; rPlayer.Y = (int)vPlayerPos.Y; } if (keystate.IsKeyDown(Keys.S)) { vPlayerPos += Vector2.Normalize(new Vector2(Mouse.GetState().X - vPlayerPos.X, Mouse.GetState().Y - vPlayerPos.Y)) * -3; rPlayer.X = (int)vPlayerPos.X; rPlayer.Y = (int)vPlayerPos.Y; } This is what I use to move towards and away from my mouse crosshair. I tried to make a somewhat similar function to strafe with "A" and "D", but for some reason I just couldn't get it done. Any thoughts?
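
    Strafing is usually done with the perpendicular of that same normalized direction: for a 2D vector (x, y), the perpendiculars are (-y, x) and (y, -x). A sketch in the same style as the posted code, assuming keystate, vPlayerPos and rPlayer are the existing fields from the question (KeyboardState, Vector2 and Rectangle):

      // Strafe left/right relative to the mouse direction.
      void UpdateStrafe(KeyboardState keystate)
      {
          Vector2 toMouse = Vector2.Normalize(new Vector2(
              Mouse.GetState().X - vPlayerPos.X,
              Mouse.GetState().Y - vPlayerPos.Y));

          // The perpendicular of (x, y) is (-y, x); that is the strafe axis.
          Vector2 strafe = new Vector2(-toMouse.Y, toMouse.X);

          if (keystate.IsKeyDown(Keys.A))
              vPlayerPos += strafe * -3;   // strafe one way
          if (keystate.IsKeyDown(Keys.D))
              vPlayerPos += strafe * 3;    // strafe the other way

          rPlayer.X = (int)vPlayerPos.X;
          rPlayer.Y = (int)vPlayerPos.Y;
      }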

    Read the article

  • keyPressed is not working after adding ActionListener to JButton

    - by Yehonatan
    I have a serious problem while trying to build a menu for my game. I've added two JButtons to a main JPanel and added an ActionListener to each of them. The main JPanel also contains the game JPanel, which has the keyPressed method inside a KeyController. That's how it looks: Main - JPanel - JButton, JButton, and a JPanel which contains the game and the keyPressed function inside the KeyController class, which worked fine before I added the ActionListeners for the JButtons. For some reason, after I added an ActionListener to each of the buttons, the game JPanel is not getting any keyPressed or keyReleased events. Does anyone know the solution for my situation? Thank you very much! Main window - Scanner in = new Scanner(System.in); JFrame f = new JFrame("Square V.S Circles"); f.setUndecorated(true); f.setResizable(false); f.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE); f.add(new JPanelHandler()); f.pack(); f.setVisible(true); f.setLocationRelativeTo(null); JPanelHandler (main JPanel) - super.setFocusable(true); JButton mybutton = new JButton("Quit"); JButton sayhi = new JButton("Say hi"); sayhi.addActionListener(new ActionListener() { @Override public void actionPerformed(ActionEvent e) { System.out.println("Hi"); } }); mybutton.addActionListener(new ActionListener() { @Override public void actionPerformed(ActionEvent e) { System.exit(0); } }); add(mybutton); add(sayhi); add(new Board(2)); Board KeyController (the code inside is working, so it's unnecessary to put it here) - private class KeyController extends KeyAdapter { public KeyController() { ..Code } @Override public void keyPressed(KeyEvent e) { ...Code } @Override public void keyReleased(KeyEvent e){ ...Code } }

    Read the article

  • No idea how to simulate car crashes

    - by user2332868
    I have a simple car game with all the basic collision detection and movement, in Unity with C#. But when I was building the "engine" for the game I didn't include a detail I later decided to add: I want cars to be able to get damaged and the model to change accordingly. For example, the car looks new at first, and after a crash it looks like a wreck. Please help me by pointing me to some resources or by describing approximate ways of implementing this new feature. Thanks!
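
    A simple starting point is to keep two (or more) visual models on the car and swap them once a collision is hard enough; more advanced approaches deform the mesh vertices near the impact point. A hedged Unity sketch of the swap approach, where the field names and thresholds are illustrative and would be tuned per game:

      using UnityEngine;

      // Swap the car's visual model once enough collision damage has accumulated.
      // intactModel/wreckedModel are child GameObjects assigned in the Inspector.
      public class CarDamage : MonoBehaviour
      {
          public GameObject intactModel;
          public GameObject wreckedModel;
          public float impactThreshold = 10f;   // minimum impact speed that counts as a crash
          public float wreckDamage = 25f;       // accumulated damage at which the car looks wrecked

          float accumulatedDamage;

          void OnCollisionEnter(Collision collision)
          {
              // relativeVelocity.magnitude is a rough measure of how hard we hit.
              float impact = collision.relativeVelocity.magnitude;
              if (impact < impactThreshold) return;

              accumulatedDamage += impact;
              if (accumulatedDamage >= wreckDamage)
              {
                  intactModel.SetActive(false);   // hide the pristine model
                  wreckedModel.SetActive(true);   // show the wreck
              }
          }
      }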

    Read the article

  • Should developers make their games easier with new versions?

    - by Gil Kalai
    It seems that the game Angry Birds is becoming gradually easier with new versions, maybe so that people get the illusion of progress and the satisfaction of breaking new records? I would like to know whether making gradual small modifications to a game to enhance players' sense of improvement and learning is a known/common/standard practice in game development. (I don't mean to say that there is anything wrong with such a practice.)

    Read the article

  • Entity System with C++

    - by Dono
    I'm working on a game engine using an entity system and I have some questions. How I see the entity system: Components: a class with attributes and getters/setters (Sprite, PhysicsBody, SpaceShip, ...). System: a class with a list of components (the component logic): EntityManager, Renderer, Input, Camera, ... Entity: just an empty class with a list of components. What I've done: currently I have a program that allows me to do this: // Create a new entity. Entity* entity = game.createEntity(); // Add some components. entity->addComponent( new TransformableComponent() ) ->setPosition( 15, 50 ) ->setRotation( 90 ) ->addComponent( new PhysicComponent() ) ->setMass( 70 ) ->addComponent( new SpriteComponent() ) ->setTexture( "name.png" ) ->addToSystem( new RendererSystem() ); My question: should the system store a list of components or a list of entities? If it stores a list of entities, I need to fetch the components of those entities every frame, and that is probably heavy, isn't it?
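
    In most implementations the system keeps (or is handed) the list of components it operates on, not the entities, precisely to avoid looking components up every frame. A short sketch of that idea in C# (it carries over directly to the C++ design above; the names are illustrative):

      using System.Collections.Generic;

      // Each system owns the list of components it updates, so the per-frame
      // loop never has to search an entity for its components.
      public abstract class Component { public int EntityId; }

      public class SpriteComponent : Component { public string Texture; }

      public class RendererSystem
      {
          readonly List<SpriteComponent> sprites = new List<SpriteComponent>();

          // Called once, when the component is added to an entity.
          public void Register(SpriteComponent sprite) => sprites.Add(sprite);

          public void Update()
          {
              // Iterate the components directly; no per-frame entity lookups.
              foreach (SpriteComponent sprite in sprites)
              {
                  // draw sprite.Texture at the owning entity's transform...
              }
          }
      }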

    Read the article

  • Setting a leader from a sprite array

    - by Craig
    I'm looking to set a leader from an array of sprites, but I keep getting a "NullReferenceException was unhandled" error from within my main game class when calling the UpdateMouse method. What have I done wrong here? class MouseSprite { Random random = new Random(); private MouseSprite leader; public void UpdateBoundaryBox() { mouseBounds.X = (int)mousePosition.X - mouseTexture.Width / 2; mouseBounds.Y = (int)mousePosition.Y - mouseTexture.Height / 2; } public void UpdateMouse(Vector2 position, MouseSprite [] mice, int numberMice, int index) { Vector2 catPosition = position; int enemies = numberMice; this.alive = true; mice[random.Next(0, mice.Length)] = leader;
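
    From the excerpt, one likely culprit is the last line shown: it assigns the still-null leader field into the array (overwriting one of the mice with null) rather than picking a mouse to be the leader, so a later call on that array element throws. A hedged sketch of the reversed assignment:

      // The posted line stores the (null) leader into the array:
      //     mice[random.Next(0, mice.Length)] = leader;
      // If the intent is to pick a random mouse as the leader, the assignment
      // most likely needs to go the other way around:
      leader = mice[random.Next(0, mice.Length)];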

    Read the article
