Search Results

Search found 28031 results on 1122 pages for 'personal development'.


  • Issue with DFS implementation in Objective-C

    - by Hemant
    i am trying to to do something like this Below is my code: -(id) init{ if( (self=[super init]) ) { bubbles_Arr = [[NSMutableArray alloc] initWithCapacity: 9]; [bubbles_Arr insertObject:[NSMutableArray arrayWithObjects:@"1",@"1",@"1",@"1",@"1",nil] atIndex:0]; [bubbles_Arr insertObject:[NSMutableArray arrayWithObjects:@"3",@"3",@"5",@"5",@"1",nil] atIndex:1]; [bubbles_Arr insertObject:[NSMutableArray arrayWithObjects:@"5",@"3",@"5",@"3",@"1",nil] atIndex:2]; [bubbles_Arr insertObject:[NSMutableArray arrayWithObjects:@"5",@"3",@"5",@"3",@"1",nil] atIndex:3]; [bubbles_Arr insertObject:[NSMutableArray arrayWithObjects:@"1",@"1",@"1",@"1",@"1",nil] atIndex:4]; [bubbles_Arr insertObject:[NSMutableArray arrayWithObjects:@"5",@"5",@"3",@"5",@"1",nil] atIndex:5]; [bubbles_Arr insertObject:[NSMutableArray arrayWithObjects:@"5",@"5",@"5",@"5",@"5",nil] atIndex:6]; [bubbles_Arr insertObject:[NSMutableArray arrayWithObjects:@"5",@"5",@"5",@"5",@"5",nil] atIndex:7]; [bubbles_Arr insertObject:[NSMutableArray arrayWithObjects:@"5",@"5",@"5",@"5",@"5",nil] atIndex:8]; NOCOLOR = @"-1"; R = 9; C = 5; [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(testting) userInfo:Nil repeats:NO]; } return self; } -(void)testting{ // NSLog(@"dataArray---- %@",dataArray.description); int startR = 0; int startC = 0; int color = 1 ;// red // NSString *color = @"5"; //reset visited matrix to false. for(int i = 0; i < R; i++) for(int j = 0; j < C; j++) visited[i][j] = FALSE; //reset count count = 0; [self dfs:startR :startC :color :false]; NSLog(@"count--- %d",count); NSLog(@"test--- %@",bubbles_Arr); } -(void)dfs:(int)ro:(int)co:(int)colori:(BOOL)set{ for(int dr = -1; dr <= 1; dr++) for(int dc = -1; dc <= 1; dc++) if((dr == 0 ^ dc == 0) && [self ok:ro+dr :co+dc]) // 4 neighbors { int nr = ro+dr; int nc = co+dc; NSLog(@"-- %d ---- %d",[[[bubbles_Arr objectAtIndex:nr] objectAtIndex:nc] integerValue],colori); if ((([[[bubbles_Arr objectAtIndex:nr] objectAtIndex:nc] integerValue]==1 || [[[bubbles_Arr objectAtIndex:nr] objectAtIndex:nc] isEqualToString:@"1"]) && !visited[nr][nc])) { visited[nr][nc] = true; count++; [self dfs:nr :nc :colori :set]; if(count>2) { [[bubbles_Arr objectAtIndex:nr] replaceObjectAtIndex:nc withObject:NOCOLOR]; [bubbles[nc+1][nr+1] setTexture:[[CCTextureCache sharedTextureCache] addImage:@"gray_tiger.png"]]; } } } } -(BOOL)ok:(int)r:(int)c{ return r >= 0 && r < R && c >= 0 && c < C; } But it's only working for left to right,not working for right to left. And it is also skipping first object. Thanks in advance.
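
    Two things stand out in the snippet above, offered as observations rather than a full diagnosis: dfs only ever looks at the four neighbours, so the start cell itself is never visited, counted, or recoloured (which would explain the skipped first object), and the colori parameter is never actually compared against the cell values (the check is hard-coded to "1"). A structure that avoids most of these pitfalls is to collect the whole connected region first and only recolour it afterwards. Below is a minimal Java sketch of that shape; it uses a plain int grid and hypothetical names rather than the NSMutableArray board above:

        import java.util.ArrayList;
        import java.util.List;

        public class FloodFill {
            // 4-neighbour offsets: right, left, down, up.
            private static final int[][] DIRS = { {1, 0}, {-1, 0}, {0, 1}, {0, -1} };

            /** Collects every cell connected to (startR, startC) that shares its colour. */
            public static List<int[]> collectRegion(int[][] board, int startR, int startC) {
                boolean[][] visited = new boolean[board.length][board[0].length];
                List<int[]> region = new ArrayList<>();
                dfs(board, startR, startC, board[startR][startC], visited, region);
                return region;
            }

            private static void dfs(int[][] board, int r, int c, int colour,
                                    boolean[][] visited, List<int[]> region) {
                if (r < 0 || r >= board.length || c < 0 || c >= board[0].length) return;
                if (visited[r][c] || board[r][c] != colour) return;
                visited[r][c] = true;           // the start cell is visited and counted too
                region.add(new int[] { r, c });
                for (int[] d : DIRS) {
                    dfs(board, r + d[0], c + d[1], colour, visited, region);
                }
            }
        }

    Only after collectRegion returns would you check region.size() > 2 and recolour (and retexture) the collected cells, which keeps the result independent of the direction the search happens to walk in.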

    Read the article

  • Sending state diffs (deltas) and unreliable connections

    - by spaceOwl
    We're building a realtime multiplayer game in which each player is responsible for reporting its state on every iteration of the game loop. The state updates are broadcast using unreliable UDP. To minimize the amount of state data sent, we've come up with a system that sends only deltas (whatever state data has changed). This method is flawed, however, since a lost packet means that other players never receive that delta, making the game behave in an unexpected way. For example, assume the state is comprised of { positionX, positionY, health }:
    Frame 1 - positionX changed --> send a packet with positionX only.
    Frame 2 - health changed // lost!
    Frame 3 - positionY changed --> send a packet with positionY only. // Other players don't know about the health change.
    How can one overcome this issue, then? Sending the entire state every time is not always feasible.
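
    One widely used answer is to stop thinking of a delta as "what changed since the last packet I sent" and instead send "everything that differs from the last state the receiver has acknowledged" (the snapshot/ack model popularised by the Quake networking writeups). A lost packet then takes care of itself: the fields it carried still differ from the acknowledged state, so they are simply included again in the next packet. A minimal sketch of that bookkeeping in Java, with hypothetical names and plain int fields standing in for the real state:

        import java.util.HashMap;
        import java.util.Map;

        /** Tracks which fields still differ from the last state the receiver acknowledged. */
        public class DeltaTracker {
            private final Map<String, Integer> current   = new HashMap<>();
            private final Map<String, Integer> lastAcked = new HashMap<>();

            public void set(String field, int value) {
                current.put(field, value);
            }

            /** Everything that differs from the acknowledged state goes into the next packet. */
            public Map<String, Integer> buildDelta() {
                Map<String, Integer> delta = new HashMap<>();
                for (Map.Entry<String, Integer> e : current.entrySet()) {
                    if (!e.getValue().equals(lastAcked.get(e.getKey()))) {
                        delta.put(e.getKey(), e.getValue());
                    }
                }
                return delta;
            }

            /** Call this when the receiver acknowledges the packet built from that delta. */
            public void onAck(Map<String, Integer> ackedDelta) {
                lastAcked.putAll(ackedDelta);
            }
        }

    The cost is a sequence number per packet and an ack channel back from each receiver; a blunter alternative is to keep sending per-frame deltas but broadcast a full state snapshot every N frames, so any missed change is repaired within a bounded time.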

    Read the article

  • fragment shader directional light positioning with camera

    - by meWantToLearn
    I'm trying to set up directional lighting in the fragment shader, so the direction of my light moves with the camera position.

        #version 150 core
        uniform sampler2D diffuseTex;
        uniform vec4 lightColour;
        uniform vec3 lightDirection;

        vec3 LNorm = normalize(lightDirection);
        vec3 normal = normalize(IN.normal);
        vec3 calColour = lightColour[i].rgb * intensity;
        gl_FragColor = vec4(diffuse.rbg * calColour, diffuse.a);

    It lights the entire scene.

    Read the article

  • Creating my own kill cam

    - by DalexL
    I plan on creating my own kill cam system for a sandbox tool set. After thinking about the mechanics of the kill cam itself, however, I'm quite lost. I'm trying to recreate the ones commonly seen in Call of Duty games that show, from the view of the killer, the actual killing scene.
    My thoughts: I can't just keep in memory when people kill others, because I wouldn't know when to start the 'recording process'; there is no way for me to accurately determine when somebody is 'about' to kill someone. My only real idea so far is to have a complete duplicate of everything loaded off to the side, copying all the movement from the original world but with a 10 second delay. That way, all the kill cams would be 10 seconds long and the person's camera would just be moved to their killer in the second world.
    My questions: Is there already an accepted way to do this? Does anybody have any good ideas for something like this? Thanks if you can!
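
    The second idea is close to how this is usually done, except that rather than simulating a delayed duplicate world, games typically just record a rolling history: every entity appends a small snapshot (position, orientation, timestamp) to a fixed-length buffer each tick, and the kill cam replays the killer's last few seconds from that history. That also answers the "when do I start recording" problem, because you are always recording the most recent window. A minimal sketch, with hypothetical names and fields:

        import java.util.ArrayDeque;
        import java.util.Deque;

        /** Rolling history of an entity's recent state, trimmed to the last N milliseconds. */
        public class SnapshotHistory {
            public static class Snapshot {
                public final long timeMs;
                public final float x, y, z, yaw, pitch;
                public Snapshot(long timeMs, float x, float y, float z, float yaw, float pitch) {
                    this.timeMs = timeMs;
                    this.x = x; this.y = y; this.z = z;
                    this.yaw = yaw; this.pitch = pitch;
                }
            }

            private final Deque<Snapshot> buffer = new ArrayDeque<>();
            private final long windowMs;

            public SnapshotHistory(long windowMs) {
                this.windowMs = windowMs;    // e.g. 10000 for a ten-second kill cam
            }

            /** Record the current state and drop anything older than the replay window. */
            public void record(Snapshot s) {
                buffer.addLast(s);
                while (!buffer.isEmpty() && s.timeMs - buffer.peekFirst().timeMs > windowMs) {
                    buffer.removeFirst();
                }
            }

            /** The kill cam interpolates a camera along these snapshots after a kill. */
            public Iterable<Snapshot> replay() {
                return buffer;
            }
        }

    Recording one compact snapshot per tick is far cheaper than running a second copy of the world, and on death you simply hand the victim's camera the killer's history and play it back.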

    Read the article

  • Changing DisplayMode seems not to update the Input and Graphics dimensions

    - by coding.mof
    I'm writing a small game using Slick and Nifty-GUI. At the program startup I set the DisplayMode using the following lines: AppGameContainer app = new ... app.setDisplayMode( 800, 600, false ); app.start(); I wrote a Nifty-ScreenController for my settings dialog in which the user can select the desired DisplayMode. When I try to set the new DisplayMode within this controller class the game window gets resized correctly but the Graphics and Input objects aren't updated accordingly. Therefore my rendering code just uses a part of the new window. I tried to set different DisplayModes in the main method to test if it's generally possible to invoke this method multiple times. It seems that changing the DisplayMode only works before I call app.start(). Furthermore I tried to update the Graphics & Input object manually but the init and setDimensions methods are package private. :( Does someone know what I'm doing wrong and how to change the DisplayMode correctly?
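
    I can't say for certain what Slick does internally, but one pattern that often helps with this class of problem is to avoid changing the display from inside the GUI callback at all: have the ScreenController only record the requested resolution, and apply it at a well-defined point in the game loop. The hedged sketch below uses only the setDisplayMode call from the snippet above plus hypothetical field and method names; if the Graphics and Input dimensions still come out stale after that, recreating the AppGameContainer with the new mode is the heavier fallback:

        // Inside your BasicGame (adjust the signature if you use a StateBasedGame state).
        private int pendingWidth = -1, pendingHeight = -1;

        /** Called from the Nifty ScreenController when the user picks a resolution. */
        public void requestDisplayMode(int width, int height) {
            pendingWidth = width;
            pendingHeight = height;
        }

        @Override
        public void update(GameContainer container, int delta) throws SlickException {
            if (pendingWidth > 0 && container instanceof AppGameContainer) {
                ((AppGameContainer) container).setDisplayMode(pendingWidth, pendingHeight, false);
                pendingWidth = pendingHeight = -1;
            }
            // ... rest of the update logic
        }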

    Read the article

  • Any interesting thesis topic?

    - by revers
    Hi, I study Computer Science at the Technical University of Lodz (in Poland) with a Computer Game and Simulation Technology specialization. I'm going to defend my BSc thesis next year and I was wondering what topic I could choose, but nothing really interesting is coming to mind. Maybe you could help me and suggest some subjects related to graphics, game, or simulation programming? (or maybe something else that is interesting enough :) ). I would be very grateful for any suggestion!

    Read the article

  • What are the parameters of an APEX destructible asset / actor, and what are their effects?

    - by Semih Kekül
    There are parameters of the NxDestructibleAsset such as: defaultBehaviorGroup.damageToRadius, destructibleParameters.fractureImpulseScale, p3BodyDescTemplate.density, structureSettings.useStressSolver, destructibleParameters.runtimeFracture.glass.firstSegmentSize, etc. However, I cannot find any documentation explaining these parameters. Are there any documents, videos, or code (anything) that explain them?

    Read the article

  • How do I manipulate the URL of my Silverlight testpage.aspx?

    - by Daniel
    I am making an XNA game using Silverlight over the web. My testpage.aspx is linked to from a previous page where the client selects certain elements. The testpage.aspx URL changes depending on what I have sent to it. Now in my mainpage.cs file I would like to call certain functions depending on what was passed, but I am unsure how to manipulate or even access the URL. Is there a specific class in the Silverlight library I can use? Thank you for your time.

    Read the article

  • What is the best type of C# timer to use with a Unity game that uses many timers simultaneously?

    - by Kyle Seidlitz
    I am developing a stand-alone 3d game in Unity that will have anywhere from 1 to 200 timers running simultaneously. There will be a GameObject containing 1 timer. For this game timer durations will range from 5 minutes to 4 days. There will not be any countdown displays or any UI for the timers. Each object is a prefab, with all the necessary materials included. An attached script will handle the timer and all the necessary code to change the materials and make any sound effects. Once the timer is expired, the user will then click on the object again, and the object will be destroyed, and the user's inventory will be adjusted. If the user wants to save or end the game before all the timers are done, the start value of the still running timers is to be saved to an XML file such that when the game is started again, any still running timers will be checked to see if they have expired, where the object's materials will be changed appropriately. I am still trying to figure out what type of timer to use, and see also if there are any suggestions for saving and calculating times over several days. What class(es) of timers should I use? Are there any special issues I should look out for in terms of performance?
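
    With durations measured in hours or days and up to 200 of them, the usual approach is not to run any timer object at all: store each timer's absolute expiry time, compare it against the current time whenever you need to know whether it has finished, and persist that one value in the XML save. The check works identically whether the game kept running or was closed in the meantime. A minimal, engine-agnostic sketch of the idea (written in Java here; the same shape maps directly onto C#'s DateTime.UtcNow and TimeSpan):

        /** A timer that is just an absolute expiry instant; nothing has to tick while it "runs". */
        public class ExpiryTimer {
            private final long expiresAtMs;   // UTC epoch milliseconds, safe across save/load

            private ExpiryTimer(long expiresAtMs) { this.expiresAtMs = expiresAtMs; }

            public static ExpiryTimer startingNow(long durationMs) {
                return new ExpiryTimer(System.currentTimeMillis() + durationMs);
            }

            /** Rebuilds a timer from the single value stored in the save file. */
            public static ExpiryTimer fromSaved(long expiresAtMs) {
                return new ExpiryTimer(expiresAtMs);
            }

            public boolean isExpired() { return System.currentTimeMillis() >= expiresAtMs; }

            public long savedValue()  { return expiresAtMs; }   // the only thing the XML needs
        }

    Checking a couple of hundred of these once per frame (or once per second) costs essentially nothing, and there is no timer state to keep alive across save and load beyond one number per object.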

    Read the article

  • Circle collision detection and vector math: help?

    - by Griffin
    Hey, so I'm currently going through the Wildbunny blog to learn about collision detection, but I'm a bit confused about how the vectors he's talking about come into play. Quoted from the blog:

        p = ||A-B|| - (r1+r2)

    "The two spheres are penetrating by distance p. We would also like the penetration vector so that we can correct the penetration once we discover it. This is the vector that moves both circles to the point where they just touch, correcting the penetration. Importantly it is not only just a vector that does this, it is the only vector which corrects the penetration by moving the minimum amount. This is important because we only want to correct the error, not introduce more by moving too much when we correct, or too little."

        N = (A-B) / ||A-B||
        P = N*p

    "Here we have calculated the normalised vector N between the two centres and the penetration vector P by multiplying our unit direction by the penetration distance."

    OK, so I understand that p is the distance by which the circles penetrate each other, but I don't get what exactly N and P are. It seems to me that N is just the coordinates of the third point of the right triangle formed by points A and B (A-B), divided by the hypotenuse of that triangle, i.e. the distance between A and B (||A-B||). What's the significance of this? Also, what is the penetration vector used for? It seems to me like a movement that one of the circles would perform to get un-penetrated.
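
    For what it's worth: N is not a third point. A-B is the vector from B's centre to A's centre, and dividing it by its own length ||A-B|| leaves a unit vector (length 1) that carries only the direction between the centres. Multiplying that unit direction by the penetration distance p gives P, the smallest displacement that will separate the circles; you then move one circle by the whole of it, or each circle by half, along that direction. A small numeric sketch in Java (plain doubles, hypothetical values):

        public class CirclePenetration {
            public static void main(String[] args) {
                double ax = 0, ay = 0, r1 = 3;      // circle A
                double bx = 4, by = 0, r2 = 2;      // circle B

                double dx = ax - bx, dy = ay - by;             // A - B
                double dist = Math.sqrt(dx * dx + dy * dy);    // ||A - B||   -> 4
                double p = dist - (r1 + r2);                   // -1: overlapping by 1 unit

                double nx = dx / dist, ny = dy / dist;         // N, unit direction from B to A -> (-1, 0)
                double px = nx * p, py = ny * p;               // P = N * p   -> (1, 0)

                // With this sign convention, moving A by -P (or A by -P/2 and B by +P/2)
                // pushes the circles apart by exactly the overlap and no more.
                System.out.printf("p = %.2f, N = (%.2f, %.2f), P = (%.2f, %.2f)%n",
                        p, nx, ny, px, py);
            }
        }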

    Read the article

  • Will making players pay a virtual currency before entering a match discourage them from playing?

    - by Bane
    I'm making a multiplayer match-making game, and by my current design, people will need to pay a small fee before joining a match. At the end of the match, the team that won gets the money. It will be a virtual currency, but still, will it discourage people from entering matches? I introduced it to make the matches matter more, because there's always the fear that you will lose your investment. I'm not talking about anything big here, but even a small amount might have a similar psychological effect to a bigger one.

    Read the article

  • Where can I find affordable legal advice for game software related inquiries?

    - by Steven Lu
    I am working on simulation middleware which is applicable for game engine implementations. What I would like to do is to make it freely available for use for all non-commercial purposes, while at the same time imposing some percentage of royalty on revenue (above a certain threshold) that is derived from my work. Something very similar to Epic's UDK licensing model. To facilitate the use of my software, I plan to offer binaries (static libs) for several platforms, as well as obfuscated source code which I will freely distribute, in addition to documentation of the API. I simply want to impose the restriction that if you try to make money from it, I get a cut eventually. I'm wondering if there are online forums and such where I am likely to find people who are willing to assist me in terms of learning what sort of things I have to do to get things down on the right kinds of documents. So far a site like this seems to be the most promising.

    Read the article

  • What is the purpose of the bit depths of the framebuffer components in GLFW3's glfwWindowHint function?

    - by Rui d'Orey
    I would like to know what the following "framebuffer related hints" of the GLFW3 function glfwWindowHint do: GLFW_RED_BITS, GLFW_GREEN_BITS, GLFW_BLUE_BITS, GLFW_ALPHA_BITS, GLFW_DEPTH_BITS, GLFW_STENCIL_BITS. What is their purpose? Are their default values usually enough? Where are those bits stored? In a buffer on the GPU? What do they affect, and in what way? Thank you in advance!

    Read the article

  • Draw a bullet at the end of the barrel

    - by Alberto
    Excuse my awkwardness, I have this code:
    [syntax="java"]
    int x2 = (int) (canon.getSceneCenterCoordinates()[0] + LENGTH_SPRITE/2 * Math.cos(canon.getRotation()));
    int y2 = (int) (canon.getSceneCenterCoordinates()[1] + LENGTH_SPRITE/2 * Math.sin(canon.getRotation()));
    projectile = new Sprite((float) x2, (float) y2, mProjectileTextureRegion, this.getVertexBufferObjectManager());
    mMainScene.attachChild(projectile);
    [/syntax]
    The bullets are drawn around the cannon in a circle, but not from the end of the cannon. Help!
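
    Two things are worth checking here, offered as guesses since only this fragment is visible: Math.cos and Math.sin expect radians, while sprite rotations in AndEngine are given in degrees, so the angle needs converting; and the Sprite constructor takes the top-left corner, so the projectile should be shifted by half its own size if its centre is meant to sit on the muzzle. A sketch along those lines (it assumes a rotation of 0 means the barrel points along +X; adjust the offset to match your artwork):

        float angleRad = (float) Math.toRadians(canon.getRotation());
        float muzzleX = canon.getSceneCenterCoordinates()[0]
                + (LENGTH_SPRITE / 2f) * (float) Math.cos(angleRad);
        float muzzleY = canon.getSceneCenterCoordinates()[1]
                + (LENGTH_SPRITE / 2f) * (float) Math.sin(angleRad);

        // Centre the projectile on the muzzle point rather than anchoring its top-left there.
        projectile = new Sprite(muzzleX - mProjectileTextureRegion.getWidth() / 2f,
                muzzleY - mProjectileTextureRegion.getHeight() / 2f,
                mProjectileTextureRegion, this.getVertexBufferObjectManager());
        mMainScene.attachChild(projectile);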

    Read the article

  • Why do I get an exception when playing multiple sound instances?

    - by Boreal
    Right now, I'm adding a rudimentary sound engine to my game. So far, I am able to load in a WAV file and play it once, then free up the memory when I close the game. However, the game crashes with a nice ArgumentOutOfBoundsException when I try to play another sound instance. Specified argument was out of the range of valid values. Parameter name: readLength I'm following this tutorial pretty much exactly, but I still keep getting the aforementioned error. Here's my sound-related code. /// <summary> /// Manages all sound instances. /// </summary> public static class Audio { static XAudio2 device; static MasteringVoice master; static List<SoundInstance> instances; /// <summary> /// The XAudio2 device. /// </summary> internal static XAudio2 Device { get { return device; } } /// <summary> /// Initializes the audio device and master track. /// </summary> internal static void Initialize() { device = new XAudio2(); master = new MasteringVoice(device); instances = new List<SoundInstance>(); } /// <summary> /// Releases all XA2 resources. /// </summary> internal static void Shutdown() { foreach(SoundInstance i in instances) i.Dispose(); master.Dispose(); device.Dispose(); } /// <summary> /// Registers a sound instance with the system. /// </summary> /// <param name="instance">Sound instance</param> internal static void AddInstance(SoundInstance instance) { instances.Add(instance); } /// <summary> /// Disposes any sound instance that has stopped playing. /// </summary> internal static void Update() { List<SoundInstance> temp = new List<SoundInstance>(instances); foreach(SoundInstance i in temp) if(!i.Playing) { i.Dispose(); instances.Remove(i); } } } /// <summary> /// Loads sounds from various files. /// </summary> internal class SoundLoader { /// <summary> /// Loads a .wav sound file. /// </summary> /// <param name="format">The decoded format will be sent here</param> /// <param name="buffer">The data will be sent here</param> /// <param name="soundName">The path to the WAV file</param> internal static void LoadWAV(out WaveFormat format, out AudioBuffer buffer, string soundName) { WaveStream wave = new WaveStream(soundName); format = wave.Format; buffer = new AudioBuffer(); buffer.AudioData = wave; buffer.AudioBytes = (int)wave.Length; buffer.Flags = BufferFlags.EndOfStream; } } /// <summary> /// Manages the data for a single sound. /// </summary> public class Sound : IAsset { WaveFormat format; AudioBuffer buffer; /// <summary> /// Loads a sound from a file. /// </summary> /// <param name="soundName">The path to the sound file</param> /// <returns>Whether the sound loaded successfully</returns> public bool Load(string soundName) { if(soundName.EndsWith(".wav")) SoundLoader.LoadWAV(out format, out buffer, soundName); else return false; return true; } /// <summary> /// Plays the sound. /// </summary> public void Play() { Audio.AddInstance(new SoundInstance(format, buffer)); } /// <summary> /// Unloads the sound from memory. /// </summary> public void Unload() { buffer.Dispose(); } } /// <summary> /// Manages a single sound instance. /// </summary> public class SoundInstance { SourceVoice source; bool playing; /// <summary> /// Whether the sound is currently playing. /// </summary> public bool Playing { get { return playing; } } /// <summary> /// Starts a new instance of a sound. 
/// </summary> /// <param name="format">Format of the sound</param> /// <param name="buffer">Buffer holding sound data</param> internal SoundInstance(WaveFormat format, AudioBuffer buffer) { source = new SourceVoice(Audio.Device, format); source.BufferEnd += (s, e) => playing = false; source.Start(); source.SubmitSourceBuffer(buffer); // THIS IS WHERE THE EXCEPTION IS THROWN playing = true; } /// <summary> /// Releases memory used by the instance. /// </summary> internal void Dispose() { source.Dispose(); } } The exception occurs on line 156 when I am playing the sound: source.SubmitSourceBuffer(buffer);

    Read the article

  • How to make an Actor follow my finger

    - by user48352
    I'm back with another question that may be really simple. I have a texture drawn on my SpriteBatch and I'm making it move up or down (y-axis only) with libGDX's input handler methods touchDown and touchUp:

        @Override
        public boolean touchDown(int screenX, int screenY, int pointer, int button) {
            myWhale.touchDownY = screenY;
            myWhale.isTouched = true;
            return true;
        }

        @Override
        public boolean touchUp(int screenX, int screenY, int pointer, int button) {
            myWhale.isTouched = false;
            return false;
        }

    myWhale is an object of the Whale class, where I move my texture's position:

        public void update(float delta) {
            this.delta = delta;
            if (isTouched) {
                dragWhale();
            }
        }

        public void dragWhale() {
            if (Gdx.input.getY(0) - touchDownY < 0) {
                if (Gdx.input.getY(0) < position.y + height/2) {
                    position.y = position.y - velocidad*delta;
                }
            } else {
                if (Gdx.input.getY(0) > position.y + height/2) {
                    position.y = position.y + velocidad*delta;
                }
            }
        }

    So the object moves to the center of the position where the person is pressing their finger, and most of the time it works fine, but the object seems to take about half a second to move up or down, and sometimes when I press my finger it won't move. Maybe there's a simpler way to do this. I'd highly appreciate it if someone pointed me in the right direction.
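
    A couple of things that may explain the lag and the occasional missed press, offered as guesses from the fragment above: the code steers towards the finger at a fixed speed, which produces the slow catch-up and the jitter around the target; and reading pointer 0 only can miss touches that arrive on another pointer index. One simpler shape is to track the target in touchDragged and close a fraction of the remaining distance each frame. A sketch with a hypothetical targetY field, assuming the same screen-space convention as the code above:

        @Override
        public boolean touchDragged(int screenX, int screenY, int pointer) {
            myWhale.targetY = screenY;      // keep the target fresh while the finger moves
            return true;
        }

        // In Whale.update(float delta): move by a fraction of the remaining distance,
        // so the sprite settles on the finger instead of oscillating around it.
        public void update(float delta) {
            if (isTouched) {
                float desiredY = targetY - height / 2f;     // centre the sprite on the finger
                float smoothing = 10f;                      // higher = snappier
                position.y += (desiredY - position.y) * Math.min(1f, smoothing * delta);
            }
        }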

    Read the article

  • OpenGL camera setup for zoom in/out centered at the point under the cursor

    - by user3228921
    I am trying to implement a zoom in/out navigation mode in an OpenGL 3D viewer. I was able to implement zoom centered on the screen center just by moving the eye towards the center in perspective mode. Now I am trying to do the zoom centered at an arbitrary position under the cursor. I am unable to figure out how I should move my camera forward and backward so that the point under the cursor remains at the same screen coordinates after a zoom in/out. Any help would be appreciated. (The original post includes images showing the desired effect.) Just to mention, I am working in perspective mode with eye, target and up vectors to control the camera. The same effect can be found in Google SketchUp and in the 'zoom to mouse position' setting in Blender.
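
    The usual trick is to work with the world-space point under the cursor rather than with screen coordinates: unproject the mouse position once to get a point P (via gluUnProject or your own inverse view-projection, against the scene or a focus plane), then scale the eye and target positions about P. Because the eye moves along the line through P and the camera's orientation does not change, the direction from the eye to P is preserved and P stays under the cursor. A minimal sketch with plain float arrays; obtaining cursorWorld is assumed to happen elsewhere:

        /** Moves the camera towards/away from the world point under the cursor.
         *  zoomFactor in (0, 1) zooms in, above 1 zooms out. */
        public static void zoomAboutCursor(float[] eye, float[] target,
                                           float[] cursorWorld, float zoomFactor) {
            for (int i = 0; i < 3; i++) {
                // The cursor point is the fixed point of this scaling, so it keeps its
                // screen position; eye and target slide along the ray towards it.
                eye[i]    = cursorWorld[i] + (eye[i]    - cursorWorld[i]) * zoomFactor;
                target[i] = cursorWorld[i] + (target[i] - cursorWorld[i]) * zoomFactor;
            }
        }

    The up vector is left untouched, so the orientation is identical before and after the zoom, which is exactly what keeps the cursor point fixed on screen.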

    Read the article

  • Resolving collisions between dynamic game objects

    - by TheBroodian
    I've been building a 2D platformer for some time now, I'm getting to the point where I am adding dynamic objects to the stage for testing. This has prompted me to consider how I would like my character and other objects to behave when they collide. A typical staple in many 2D platformer type games is that the player takes damage upon touching an enemy, and then essentially becomes able to pass through enemies during a period of invulnerability, and at the same time, enemies are able to pass through eachother freely. I personally don't want to take this approach, it feels strange to me that the player should receive arbitrary damage for harmless contact to an enemy, despite whether the enemy is attacking or not, and I would like my enemies' interactions between each other (and my player) to be a little more organic, so to speak. In my head I sort of have this idea where a game object (player, or non player) would be able to push other game objects around by manner of 'pushing' each other out of one anothers' bounding boxes if there is an intersection, and maybe correlate the repelling force to how much their bounding boxes are intersecting. The problem I'm experiencing is I have no idea what the math might look like for something like this? I'll show what work I've done so far, it sort of works, but it's jittery, and generally not quite what I would pass in a functional game: //Clears the anti-duplicate buffer collisionRecord.Clear(); //pick a thing foreach (GameObject entity in entities) { //pick another thing foreach (GameObject subject in entities) { //check to make sure both things aren't the same thing if (!ReferenceEquals(entity, subject)) { //check to see if thing2 is in semi-near proximity to thing1 if (entity.WideProximityArea.Intersects(subject.CollisionRectangle) || entity.WideProximityArea.Contains(subject.CollisionRectangle)) { //check to see if thing2 and thing1 are colliding. if (entity.CollisionRectangle.Intersects(subject.CollisionRectangle) || entity.CollisionRectangle.Contains(subject.CollisionRectangle) || subject.CollisionRectangle.Contains(entity.CollisionRectangle)) { //check if we've already resolved their collision or not. if (!collisionRecord.ContainsKey(entity.GetHashCode())) { //more duplicate resolution checking. if (!collisionRecord.ContainsKey(subject.GetHashCode())) { //if thing1 is traveling right... if (entity.Velocity.X > 0) { //if it isn't too far to the right... if (subject.CollisionRectangle.Contains(new Microsoft.Xna.Framework.Rectangle(entity.CollisionRectangle.Right, entity.CollisionRectangle.Y, 1, entity.CollisionRectangle.Height)) || subject.CollisionRectangle.Intersects(new Microsoft.Xna.Framework.Rectangle(entity.CollisionRectangle.Right, entity.CollisionRectangle.Y, 1, entity.CollisionRectangle.Height))) { //Find how deep thing1 is intersecting thing2's collision box; float offset = entity.CollisionRectangle.Right - subject.CollisionRectangle.Left; //Move both things in opposite directions half the length of the intersection, pushing thing1 to the left, and thing2 to the right. entity.Velocities.Add(new Vector2(-(((offset * 4) * (float)gameTime.ElapsedGameTime.TotalMilliseconds)), 0)); subject.Velocities.Add(new Vector2((((offset * 4) * (float)gameTime.ElapsedGameTime.TotalMilliseconds)), 0)); } } //if thing1 is traveling left... if (entity.Velocity.X < 0) { //if thing1 isn't too far left... 
if (entity.CollisionRectangle.Contains(new Microsoft.Xna.Framework.Rectangle(subject.CollisionRectangle.Right, subject.CollisionRectangle.Y, 1, subject.CollisionRectangle.Height)) || entity.CollisionRectangle.Intersects(new Microsoft.Xna.Framework.Rectangle(subject.CollisionRectangle.Right, subject.CollisionRectangle.Y, 1, subject.CollisionRectangle.Height))) { //Find how deep thing1 is intersecting thing2's collision box; float offset = subject.CollisionRectangle.Right - entity.CollisionRectangle.Left; //Move both things in opposite directions half the length of the intersection, pushing thing1 to the right, and thing2 to the left. entity.Velocities.Add(new Vector2((((offset * 4) * (float)gameTime.ElapsedGameTime.TotalMilliseconds)), 0)); subject.Velocities.Add(new Vector2(-(((offset * 4) * (float)gameTime.ElapsedGameTime.TotalMilliseconds)), 0)); } } //Make record that thing1 and thing2 have interacted and the collision has been solved, so that if thing2 is picked next in the foreach loop, it isn't checked against thing1 a second time before the next update. collisionRecord.Add(entity.GetHashCode(), subject.GetHashCode()); } } } } } } } } One of the biggest issues with my code aside from the jitteriness is that if one character were to land on top of another character, it very suddenly and abruptly resolves the collision, whereas I would like a more subtle and gradual resolution. Any thoughts or ideas are incredibly welcome and helpful.
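
    A pattern that usually gets rid of both the jitter and the hard snap is to resolve along one axis only (the axis of least overlap), split the correction between the two bodies, and optionally scale it down so deep overlaps relax over a few frames instead of in one step. That also makes "landing on top" resolve vertically instead of flinging bodies sideways. A minimal sketch with a hypothetical Box type standing in for CollisionRectangle, written in Java for brevity:

        /** Hypothetical axis-aligned box; x, y is the top-left corner. */
        class Box {
            float x, y, width, height;
            float centerX() { return x + width / 2f; }
            float centerY() { return y + height / 2f; }
        }

        class Separator {
            /** Pushes a and b apart along the axis of least overlap, half each. */
            static void separate(Box a, Box b, float strength) {
                float overlapX = (a.width + b.width) / 2f - Math.abs(a.centerX() - b.centerX());
                float overlapY = (a.height + b.height) / 2f - Math.abs(a.centerY() - b.centerY());
                if (overlapX <= 0 || overlapY <= 0) return;   // not intersecting

                if (overlapX < overlapY) {
                    float dir  = a.centerX() < b.centerX() ? -1f : 1f;
                    float push = overlapX * 0.5f * strength;  // strength < 1 resolves gradually
                    a.x += dir * push;
                    b.x -= dir * push;
                } else {
                    float dir  = a.centerY() < b.centerY() ? -1f : 1f;
                    float push = overlapY * 0.5f * strength;
                    a.y += dir * push;
                    b.y -= dir * push;
                }
            }
        }

    Because the push is derived from the measured overlap rather than from velocity and elapsed time, it does not overshoot, which is likely where most of the jitter in the posted version comes from.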

    Read the article

  • Designing a "Grid"-like object that contains game objects

    - by liortal
    I am working on a 2D game where there's a game "board" on which other game objects are placed. Since this is 2D, my starting point was to design a class that internally uses a 2D array for the stored game objects. This class can simply be accessed by two indices, (i, j), to get the game objects on it. My problem is that I have no idea how to make the game "board" "propagate" its data onto its children. Design questions I ran into are: Should the children placed on the board have display properties such as size and screen position? Should the board itself dictate this information? How do I update children when the board changes some of its properties (position, etc.)? Should the board be aware of the types of objects stored in it? I have no idea how things such as WPF or other UI frameworks go about organizing a "container-like" object that can arrange or apply certain UI properties to its children.
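
    One split that answers most of those questions is: the board owns layout (origin, cell size, grid-to-world conversion) and the children own only their appearance, so a child never stores a screen position of its own; it asks the board where its cell currently is. Moving or resizing the board then updates everything implicitly, and the board never needs to know the concrete child types. A small generic sketch with hypothetical names, in Java:

        /** The board owns layout; children only know which cell they occupy. */
        public class Board<T> {
            private final T[][] cells;
            private final float cellSize;
            private float originX, originY;      // world position of cell (0, 0)

            @SuppressWarnings("unchecked")
            public Board(int columns, int rows, float cellSize) {
                this.cells = (T[][]) new Object[columns][rows];
                this.cellSize = cellSize;
            }

            public void put(int i, int j, T item) { cells[i][j] = item; }
            public T    get(int i, int j)         { return cells[i][j]; }

            /** Children (or the renderer) ask the board where a cell is, instead of caching it. */
            public float worldX(int i) { return originX + i * cellSize; }
            public float worldY(int j) { return originY + j * cellSize; }

            /** Moving the board moves everything on it; no per-child update is needed. */
            public void moveTo(float x, float y) { originX = x; originY = y; }
        }

    This mirrors what container controls in UI frameworks do: the parent performs layout during its own arrange/draw pass, and the children are only asked how they want to look, not where they are.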

    Read the article

  • Intersection points of plane set forming convex hull

    - by Toji
    Mostly looking for a nudge in the right direction here. Given a set of planes (defined as a normal and a distance from the origin) that form a convex hull, I would like to find the intersection points that form the corners of that hull. More directly, I'm looking for a way to generate a point cloud appropriate to provide to Bullet. Bonus points if someone knows of a way I could give Bullet the plane list directly, since I somewhat suspect that's what it's building on the backend anyway.
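
    Each corner of the hull is the intersection of (at least) three of the planes, so the direct route is: intersect every combination of three planes, throw away combinations that are parallel or whose point lies outside any plane in the set, and deduplicate what is left; that point cloud is what btConvexHullShape wants. A sketch of the three-plane intersection, assuming planes satisfy dot(n, x) = d (flip the sign of d if yours use dot(n, x) + d = 0):

        public final class PlaneHull {
            static double[] cross(double[] a, double[] b) {
                return new double[] {
                    a[1] * b[2] - a[2] * b[1],
                    a[2] * b[0] - a[0] * b[2],
                    a[0] * b[1] - a[1] * b[0]
                };
            }
            static double dot(double[] a, double[] b) {
                return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
            }

            /** Intersection point of three planes, or null if two of them are (near) parallel. */
            static double[] intersect(double[] n1, double d1, double[] n2, double d2,
                                      double[] n3, double d3) {
                double[] c23 = cross(n2, n3);
                double denom = dot(n1, c23);
                if (Math.abs(denom) < 1e-9) return null;      // no unique intersection
                double[] c31 = cross(n3, n1);
                double[] c12 = cross(n1, n2);
                return new double[] {
                    (d1 * c23[0] + d2 * c31[0] + d3 * c12[0]) / denom,
                    (d1 * c23[1] + d2 * c31[1] + d3 * c12[1]) / denom,
                    (d1 * c23[2] + d2 * c31[2] + d3 * c12[2]) / denom
                };
            }
        }

    A candidate point p is kept only if dot(n, p) <= d + epsilon for every plane in the set, and the O(n^3) combination loop is perfectly fine for a handful of planes. As for handing Bullet the planes directly: if memory serves, Bullet ships a helper (btGeometryUtil::getVerticesFromPlaneEquations) that performs exactly this enumeration for you, so it is worth checking before rolling your own.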

    Read the article

  • Projecting onto different size screens by cropping

    - by Jason
    Hi, I am building a phone application which will display a shape on screen. The shape should look the same on different screen sizes. I decided the best way to do this is to show more of the background on larger screens, keeping the shape's proportions the same on all screens. My problem is that I am not sure how to achieve this: I can query the screen size at runtime and calculate how different it is from the size I designed for, but I am not sure what to do with this value. What kind of projection should I use for my orthographic matrix, and how will I display more on larger screens without losing information on smaller screens? Thanks, Jason.
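
    If you pick one axis of a virtual coordinate system and keep it fixed (say the height), the shape renders at exactly the same proportion everywhere, and wider screens simply see more world to the sides, which is the cropping/extending behaviour described above. The value computed from the screen size is just the aspect ratio, and it goes straight into the orthographic bounds. A small sketch in Java with hypothetical names:

        /** Returns {left, right, bottom, top} for an orthographic projection that keeps
         *  a fixed virtual height and widens the visible area on wider screens. */
        public static float[] orthoBounds(int screenWidthPx, int screenHeightPx,
                                          float virtualHeight) {
            float aspect = (float) screenWidthPx / (float) screenHeightPx;
            float virtualWidth = virtualHeight * aspect;     // grows on wide screens
            return new float[] {
                -virtualWidth / 2f,  virtualWidth / 2f,      // left, right
                -virtualHeight / 2f, virtualHeight / 2f      // bottom, top
            };
        }

    If some screens can be narrower than the aspect ratio you designed for, apply the same idea with a fixed virtual width on those screens instead, so the shape itself is never cropped.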

    Read the article

  • Behavior tree implementation details

    - by angryInsomniac
    I have been looking around for implementation details of behavior trees. The best descriptions I found were by Alex Champandard and parts of Damian Isla's talk about the AI in Halo 2 (the video of which is sadly locked up in the GDC Vault). However, both descriptions fall short of helping one actually create a BT, and one particular question has been bugging me for a while: when is the tree in a behavior tree evaluated? Furthermore: if the tree is in the middle of executing a sequence of actions (patrolling waypoints) and a higher-priority impulse comes in (a distraction sound), how do I switch to that side of the tree seamlessly without resorting to a state-machine-like system? And if it is decided that the impulse was irrelevant (the distraction is too far away to affect this guard), how do I go back to the last thing the guard was doing? I have quite a few questions like this and I don't wish to flood the board with separate queries, so if you know of any resource where questions like these are answered I would be very grateful.
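
    In the implementations I have seen described, the answer to "when" is simply that the whole tree is ticked from the root every AI update (or every few frames). Higher-priority branches sit earlier under a priority selector, so a distraction can pre-empt a running patrol on the very next tick, and "going back" needs no special machinery: when the distraction branch starts failing, the selector falls through to the patrol branch again, which resumes from whatever waypoint index it kept in its blackboard. A bare-bones sketch of that selector, with hypothetical names:

        enum Status { SUCCESS, FAILURE, RUNNING }

        interface Node {
            Status tick();
        }

        /** Ticks children in priority order each update; the first one that does not fail wins. */
        class PrioritySelector implements Node {
            private final Node[] children;    // ordered highest priority first

            PrioritySelector(Node... children) {
                this.children = children;
            }

            @Override
            public Status tick() {
                for (Node child : children) {
                    Status s = child.tick();
                    if (s != Status.FAILURE) {
                        return s;             // RUNNING or SUCCESS: lower branches are skipped
                    }
                }
                return Status.FAILURE;
            }
        }

    Re-evaluating from the root every tick is what gives behavior trees their reactivity; more elaborate schemes only remember which node was RUNNING last tick so they can resume it cheaply, but conceptually the traversal still starts at the root.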

    Read the article

  • How do I find which isometric tiles are inside the cameras current view?

    - by Steve
    I'm putting together an isometric engine and need to cull the tiles that aren't in the camera's current view. My tile coordinates go from left to right on the X and top to bottom on the Y with (0,0) being the top left corner. If I have access to say the top left, top right, bottom left and bottom right corner coordinates, is there a formula or something I could use to determine which tiles fall in range? This is a screenshot of the layout of the tiles for reference. If there isn't one, or there's a better way to determine which tiles are on screen and which to cull, I'm all ears and am grateful for any ideas. I've got a few other methods I may be able to try such as checking the position of the tile against a rectangle. I pretty much just need something quick. Thanks for giving this a read =)
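
    If the tiles use the common 2:1 diamond mapping (screenX = (tx - ty) * tileWidth/2, screenY = (tx + ty) * tileHeight/2), one quick answer is to invert that mapping for each of the four screen corners, take the min/max of the results, pad by a tile, and draw only that range. The sketch below assumes that mapping and that the camera offset has already been subtracted from the corner coordinates; if your projection differs, the same idea applies with your own inverse transform:

        /** Returns {minTileX, minTileY, maxTileX, maxTileY} for the visible region. */
        static int[] visibleTileBounds(float[] cornersX, float[] cornersY,
                                       float tileWidth, float tileHeight,
                                       int mapWidth, int mapHeight) {
            int minX = Integer.MAX_VALUE, minY = Integer.MAX_VALUE;
            int maxX = Integer.MIN_VALUE, maxY = Integer.MIN_VALUE;
            for (int i = 0; i < cornersX.length; i++) {
                // Inverse of the isometric transform: screen corner -> tile coordinates.
                float tx = (cornersX[i] / (tileWidth / 2f) + cornersY[i] / (tileHeight / 2f)) / 2f;
                float ty = (cornersY[i] / (tileHeight / 2f) - cornersX[i] / (tileWidth / 2f)) / 2f;
                minX = Math.min(minX, (int) Math.floor(tx));
                minY = Math.min(minY, (int) Math.floor(ty));
                maxX = Math.max(maxX, (int) Math.ceil(tx));
                maxY = Math.max(maxY, (int) Math.ceil(ty));
            }
            // Pad by one tile and clamp to the map so nothing pops at the screen edges.
            return new int[] {
                Math.max(0, minX - 1), Math.max(0, minY - 1),
                Math.min(mapWidth - 1, maxX + 1), Math.min(mapHeight - 1, maxY + 1)
            };
        }

    Two nested loops over that returned range replace the per-tile rectangle test and stay cheap no matter how large the map gets.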

    Read the article

  • Implementing invisible bones

    - by DeadMG
    I suddenly have the feeling that I have absolutely no idea how to implement invisible objects/bones. Right now, I use hardware instancing to store the world matrix of every bone in a vertex buffer, and then send them all to the pipeline. But when dealing with frustrum culling, or having them set to invisible by my simulation for other reasons, means that some of them will be randomly invisible. Does this mean I effectively need to re-fill the buffer from scratch every frame with only the visible unit's matrices? This seems to me like it would involve a lot of wasted bandwidth.

    Read the article

  • Write to the depth buffer while using multiple render targets

    - by DocSeuss
    Presently my engine is set up to use deferred shading. My pixel shader output struct is as follows:

        struct GBuffer
        {
            float4 Depth    : DEPTH0;  // depth render target
            float4 Normal   : COLOR0;  // normal render target
            float4 Diffuse  : COLOR1;  // diffuse render target
            float4 Specular : COLOR2;  // specular render target
        };

    This works fine for flat surfaces, but I'm trying to implement relief mapping which requires me to manually write to the depth buffer to get correct silhouettes. MSDN suggests doing what I'm already doing to output to my depth render target - however, this has no impact on z culling. I think it might be because XNA uses a different depth buffer for every RenderTarget2D. How can I address these depth buffers from the pixel shader?

    Read the article
