Search Results

Search found 5786 results on 232 pages for 'fast'.

Page 7 of 232

  • How to prevent overlapping of gunshot sounds when using fast-firing weapons

    - by G3tinmybelly
    So I am now trying to find sounds for my guns, but when I grab a gun sound effect and play it in my game, a lot of the sounds either sound terrible or have this horrible echoing effect, because as the gun shoots, the previous sound is sometimes still playing.

        public void shoot(float x, float y, float direction) {
            if (empty) {
                PlayHUD.message = "No more bullets!";
                return;
            }
            if (reloading) {
                return;
            }
            if (System.currentTimeMillis() - lastShot < fireRate) {
                //AssetsLoader.lmgSound.stop();
                return;
            }
            float dx = (float) (-13 * Math.cos(direction) + 75 * Math.sin(direction));
            float dy = (float) (-14 * -Math.sin(direction) + 75 * Math.cos(direction));
            float dx1 = (float) (-13 * Math.cos(direction) + 75 * Math.sin(direction));
            float dy1 = (float) (-14 * -Math.sin(direction) + 75 * Math.cos(direction));
            PlayState.effects.add(new MuzzleFlashEffect(x + dx1, y + dy1, (float) Math.toDegrees(-direction)));
            PlayState.projectiles.add(new Bullet(this, x + dx, y + dy, (float) (direction + (Math.toRadians(MathUtils.random(-accuracy, accuracy))))));
            if (OptionState.soundOn) {
                AssetsLoader.lmgSound.play(OptionState.volume);
            }
            bulletsInClip--;
            lastShot = System.currentTimeMillis();
        }

    Here is the code where the sound plays. The sound is started every time this method is called, and it happens so often that the result is this terrible echoing. Any idea how to fix this?
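
    One common fix is to rate-limit the sound itself, independently of the fire rate, and to stop the previous instance before starting a new one so the tails never stack up. Below is a minimal libGDX-style sketch of that idea; GunAudio, minSoundGapMs and the constructor arguments are hypothetical names rather than the asker's actual classes, and the gap value would need tuning by ear.

        import com.badlogic.gdx.audio.Sound;

        public class GunAudio {
            private final Sound shotSound;       // e.g. AssetsLoader.lmgSound
            private final long minSoundGapMs;    // smallest gap between two audible shots
            private long lastSoundId = -1;       // id returned by Sound.play()
            private long lastSoundTime = 0;

            public GunAudio(Sound shotSound, long minSoundGapMs) {
                this.shotSound = shotSound;
                this.minSoundGapMs = minSoundGapMs;
            }

            /** Play the shot sound without letting instances pile up into an echo. */
            public void playShot(float volume) {
                long now = System.currentTimeMillis();
                if (now - lastSoundTime < minSoundGapMs) {
                    return;                          // too soon: skip this instance entirely
                }
                if (lastSoundId != -1) {
                    shotSound.stop(lastSoundId);     // cut the previous instance's tail
                }
                lastSoundId = shotSound.play(volume);
                lastSoundTime = now;
            }
        }

    Calling playShot(OptionState.volume) from shoot() instead of playing the sound directly keeps the firing logic unchanged while capping how many overlapping instances can be heard.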

    Read the article

  • How to compile and build fast in Visual Studio 2010

    - by anirudha
    Sometimes a solution includes many projects. For example, an ASP.NET MVC project may come with a test project or other projects attached to the one you actually run. Debugging then takes a long time, because all of the subprojects and attached projects are compiled together. The solution is to build and debug only the current project instead of all of them, which saves compilation time in Visual Studio. To configure this, open the build drop-down, select Configuration Manager, check the project you are currently working on, and uncheck all the others. After that, Visual Studio starts debugging much faster.

    Read the article

  • A Fast and FUN Way to Google First Page Formula

    Internet traffic is a numbers game: more traffic means more money, and you can get traffic from Google absolutely free. It costs you nothing but some knowledgeable effort that will put you ahead of the competition. The actionable strategies discussed in this article will certainly help you rank on the first page of Google.

    Read the article

  • Fast, accurate 2d collision

    - by Neophyte
    I'm working on a 2D top-down shooter and now need to go beyond my basic rectangular bounding box collision system. I have large levels with many different sprites, all of different shapes and sizes. The textures for the sprites are all square PNG files with transparent backgrounds, so I also need collisions to happen only when the player walks into the coloured part of the texture, not the transparent background. I plan to handle collision as follows: 1) check whether any sprites are in range of the player, 2) do a rectangular bounding box collision test, 3) do an accurate collision test (where I need help). I don't mind advanced techniques, as I want to get this right with all my requirements in mind, but I'm not sure how to approach it, or which techniques or even libraries to try. I know that I will probably need to create and store some kind of shape that accurately represents each sprite minus the transparent background. I've read that per-pixel testing is slow, so given my large levels and number of objects I don't think that would be suitable. I've also looked at Box2D, but haven't been able to find much documentation, or any examples of how to get it up and running with SFML.
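
    Per-pixel testing is usually affordable if it only runs inside the overlap of the two bounding boxes after the broad phase, against a boolean alpha mask precomputed once per texture. A minimal sketch of that narrow-phase test for unrotated sprites follows; the names are hypothetical, and rotated sprites would instead need convex hulls or a transformed mask lookup.

        /** Per-pixel collision between two unrotated sprites, tested only inside the AABB overlap. */
        public final class PixelCollision {

            /** mask[y][x] is true where the sprite's texture is opaque (alpha above a threshold). */
            public static boolean collide(boolean[][] maskA, int ax, int ay,
                                          boolean[][] maskB, int bx, int by) {
                int aw = maskA[0].length, ah = maskA.length;
                int bw = maskB[0].length, bh = maskB.length;

                // Compute the overlapping rectangle of the two bounding boxes.
                int left   = Math.max(ax, bx);
                int right  = Math.min(ax + aw, bx + bw);
                int top    = Math.max(ay, by);
                int bottom = Math.min(ay + ah, by + bh);
                if (left >= right || top >= bottom) {
                    return false;                       // bounding boxes do not even overlap
                }

                // Only the overlap region needs a per-pixel check.
                for (int y = top; y < bottom; y++) {
                    for (int x = left; x < right; x++) {
                        if (maskA[y - ay][x - ax] && maskB[y - by][x - bx]) {
                            return true;                // both sprites are opaque at this pixel
                        }
                    }
                }
                return false;
            }
        }

    An alternative that scales better is tracing each sprite's opaque region once into a convex polygon and using a polygon-polygon test, which is also the kind of shape a Box2D fixture expects.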

    Read the article

  • 6 Guidelines For You to Create a Fast Loading Website

    As you may have heard, Google has implemented a new criterion for determining your rankings in search results: website speed. Increasing your website's loading speed is not only important for getting a high search ranking; it will also affect your visitors' perception of your website.

    Read the article

  • How to Make My Website Come Up First in Google Fast

    Many people wrongly believe that it is very difficult to get a website to come up on the first page of Google. In fact, provided you follow a very simple step-by-step method, it is really quite easy to get even a brand new website to the top of the results. Follow along and I will show you how to get your site to number one.

    Read the article

  • Firefox 4: fast, powerful, and empowering

    beltzner: "Today, I presented an early product plan for Firefox 4 to the Mozilla community (live, over the web!) to share our vision for the next version of Firefox, and what projects are underway to realize it. Then I invited everyone to get involved by joining our engineering or product development efforts."

    Read the article

  • The architecture and technologies to use for a secure, fast, reliable and easily scalable web application

    - by DSoul
    ^ For actual questions, skip to the list down below.

    I understand that this is a vague topic, but please, before you turn the other way and disregard me, hear me out. I am currently doing research for a web application (I don't know if application is the correct word for it, but I will proceed w/ that for now) that one day might need to be everything mentioned in the title. I am bound by nothing: every language, OS and framework is acceptable, but only if it proves its usefulness. And if you are going to say that scalability and speed depend on the code I write for this application, then I agree, but I am just trying to find something that wouldn't stand in my way later on. I have done quite a bit of reading on this subject, but I still don't have a clear picture of what suits my needs, so I come to you, StackOverflow, to give me directions. I know you all must be wondering what I'm building, but I assure you that it doesn't matter. I have heard of the 12-factor app, though; if you have any similar guidelines to suggest, please go ahead. For the sake of keeping your answers as open as possible, I'm not going to share my experience regarding anything written in this question.

    ^ Skippers, start here

    First off, the weights of the requirements are probably something like this (on a scale of 10): Security - 10, Speed - 5, Reliability (concurrency) - 7.5, Scalability - 10. Speed and concurrency are not a top priority, in the sense that the program can be CPU intensive, and therefore slow, and only accept a not-that-high number of concurrent users, but both of these factors must be improvable by scaling the system.

    Anyway, here are my questions:

    1. How many layers should the application have, so that it is future-proof and can best fulfill the aforementioned requirements? For now, what I have in mind is the most common version: a completely separated front end, which might be a web page or an MMI application or even both; some middleware handling communication between the front and the back end, probably a server that communicates w/ the front end via HTTP (how the communication w/ the back end should be handled probably depends on the back end); and the back end itself, something that handles data through resources like a DB and does various computations w/ the data. The back end, as the highest-priority part of the software, must be easily spread to multiple computers later on and have no known security holes. I think ideally the middleware should send a request to a queue, from where one of the back-end processes takes the request, chops it up into smaller parts and puts these parts back onto the same queue as the initial request, after which the parts are handled by other back-end processes. Something map-reduce-y, so to say (see the sketch after this entry).

    2. What frameworks, languages and so on should these layers use? The technologies used here are not that important at this moment, so you can ignore this part for now. I've been pointed to node.js for this part. Do you know any better alternatives, or have any reasons why I should (not) use node.js for this particular job? I actually have no good idea what to use here; there are too many options out there, so please direct me. This part (and the second one also, I think) depends a lot on the OS, so suggest any OSs alongside the technologies/frameworks. Initially, all computers (or one, for starters) hosting the back end are going to be virtual machines.

    Please do give suggestions on any part of the question that you feel you have comprehensive knowledge and/or experience of, and also point out if you feel that any part of the current set-up means an instant (or even distant) failure, or if I missed a very important aspect to consider. I'm not looking for a definitive answer on how to achieve my goals, because there certainly isn't one, since I haven't provided you w/ all the required information; I'm just looking for recommendations and directions on what to look into. Also, bear in mind that this isn't something that I have to get done quickly, to sell and let it be rewritten by the new owner (which, I've been told multiple times, is what I should aim for). I have all the time in the world and I really just want to learn to do something really high-end. Also, excuse me if my language isn't the best; I'm not a native speaker. Anyway, thanks in advance to anyone who takes the time to help me out here.

    PS. When I do come up w/ a good architecture/design for this project, I will certainly make it an open project and keep you up to date w/ its development (including what you could have told me earlier). For obvious reasons the very same question got closed on SO, but could you still help me?
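
    A minimal, single-process sketch of the queue pattern described in question 1 (a whole request is split into parts that go back onto the same queue, and any free worker picks the parts up) is shown below, using a plain Java BlockingQueue. In a real deployment the queue would be an external broker and the workers separate machines; all the names here are hypothetical.

        import java.util.concurrent.BlockingQueue;
        import java.util.concurrent.LinkedBlockingQueue;

        public class QueueSketch {

            /** A work item: either a whole request or one of its smaller parts. */
            record Task(String requestId, String payload, boolean isPart) {}

            public static void main(String[] args) {
                BlockingQueue<Task> queue = new LinkedBlockingQueue<>();

                // Back-end workers: split whole requests into parts, then process the parts.
                for (int i = 0; i < 4; i++) {
                    Thread worker = new Thread(() -> {
                        try {
                            while (true) {
                                Task task = queue.take();
                                if (!task.isPart()) {
                                    // Chop the request up and put the parts back on the same queue.
                                    for (String part : task.payload().split(",")) {
                                        queue.put(new Task(task.requestId(), part, true));
                                    }
                                } else {
                                    // Any worker (possibly on another machine) handles a part.
                                    System.out.println(task.requestId() + " processed part: " + task.payload());
                                }
                            }
                        } catch (InterruptedException e) {
                            Thread.currentThread().interrupt();
                        }
                    });
                    worker.setDaemon(true);
                    worker.start();
                }

                // The middleware submits one whole request and returns immediately.
                queue.add(new Task("req-1", "alpha,beta,gamma", false));

                try { Thread.sleep(500); } catch (InterruptedException ignored) {} // let the demo workers run
            }
        }

    The design point this illustrates is that the middleware only ever enqueues and returns; scaling then means adding workers behind the queue rather than changing the middleware.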

    Read the article

  • Simple (and fast) dice physics

    - by Markus von Broady
    I'm programming a throw of 5 dice in ActionScript 3 + AwayPhysics (a Bullet physics port). I had a lot of fun tweaking frictions, masses etc., and in the end I found the best results with more physics ticks per frame. Currently I use 10 ticks per frame (1/60 s) and it's OK, though I see a further improvement with 20 ticks. Even though it's only 5 cubes (dice) in a box (really a floor with 3 walls), I can't simulate 20 ticks per frame and keep the FPS at 60 on a medium-aged PC. That's why I decided to precompute the frames of the animation, finishing around 1700 ticks in 2 seconds. The Flash player is frozen for those 2 seconds, and I'm afraid the result will be more like 5 seconds or even worse once I simulate multi-threading and compute the frames in the background of other heavy processing and CPU drawing (the dice are only a part of this game). Because I want both players to see the dice roll in the same way, I can't compute the physics whenever I have free resources and build a buffer of at least one throw of each type (where type is the number of dice thrown). I'm afraid players will see a "preparing dices........." message too often and for too long. I think the only solution to this problem is replacing the physics engine with something simpler, or writing my own. Do you have any formulas for cube-cube and cube-wall collision detection, and for calculating how the angular and linear velocities should change after a collision occurs?
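
    If you do roll your own engine, box collision detection is usually done with a separating-axis test, and the velocity change comes from the standard single-contact impulse formula. A cube is a convenient special case because its inertia tensor is isotropic (m·s²/6 about any axis through the centre), so the response stays scalar. Below is a minimal Java sketch of the cube-vs-static-wall response with hypothetical numbers; friction and resting contacts are ignored, and the contact point and normal are assumed to come from the detection step.

        /** Impulse response for one contact point between a die and a static wall (no friction). */
        public final class DieWallImpulse {

            // Minimal 3D vector helpers.
            record Vec3(double x, double y, double z) {
                Vec3 add(Vec3 o)     { return new Vec3(x + o.x, y + o.y, z + o.z); }
                Vec3 scale(double s) { return new Vec3(x * s, y * s, z * s); }
                double dot(Vec3 o)   { return x * o.x + y * o.y + z * o.z; }
                Vec3 cross(Vec3 o)   { return new Vec3(y * o.z - z * o.y, z * o.x - x * o.z, x * o.y - y * o.x); }
            }

            // State that the impulse modifies: linear and angular velocity of the die.
            static Vec3 linVel = new Vec3(0, -3, 0);
            static Vec3 angVel = new Vec3(1, 0, 0);

            public static void main(String[] args) {
                double mass = 0.02;                         // kg (hypothetical)
                double side = 0.016;                        // m; die edge length (hypothetical)
                double inertia = mass * side * side / 6.0;  // cube inertia is isotropic: I = m*s^2/6
                double restitution = 0.4;                   // 0 = dead stop, 1 = perfectly bouncy

                Vec3 n = new Vec3(0, 1, 0);                 // contact normal, pointing away from the wall
                Vec3 r = new Vec3(side / 2, -side / 2, side / 2); // contact point relative to the die's centre

                // Velocity of the contact point and its component along the normal.
                Vec3 pointVel = linVel.add(angVel.cross(r));
                double vn = pointVel.dot(n);
                if (vn < 0) {                               // only respond if the point moves into the wall
                    // j = -(1+e) * vn / (1/m + n . ((r x n)/I x r))
                    Vec3 rxn = r.cross(n);
                    double denom = 1.0 / mass + n.dot(rxn.scale(1.0 / inertia).cross(r));
                    double j = -(1 + restitution) * vn / denom;

                    linVel = linVel.add(n.scale(j / mass));                        // dv = (j/m) n
                    angVel = angVel.add(r.cross(n.scale(j)).scale(1.0 / inertia)); // dw = (r x j*n) / I
                }
                System.out.println("linVel=" + linVel + " angVel=" + angVel);
            }
        }

    Cube-cube contacts use the same impulse, just with both bodies' 1/m and inertia terms in the denominator and the impulse applied with opposite signs to the two dice.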

    Read the article

  • fast 3d point -> cuboid volume intersection test

    - by user1130477
    I'm trying to test whether a point lies within a 3D volume defined by 8 points. I know I can use the plane equation of each face to check that the signed distance has the same (negative) sign for all 6 sides, but does anyone know of a faster way, or could you point me to some code? Thanks. EDIT: I should add that ideally the test would also produce 3 linear interpolation parameters in the range 0..1, indicating where the point sits within the volume along each axis (since I will have to calculate these later if the point is found to be in the volume).
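
    If the 8 points form a parallelepiped (a box, possibly sheared), one approach that yields the three interpolation parameters directly is to express the point in the basis of the three edge vectors leaving one corner and check each coefficient against [0,1]. A minimal sketch follows; the names are hypothetical, and a general non-planar hexahedron would need a trilinear inversion instead.

        /** Point-in-parallelepiped test that also yields the three interpolation parameters. */
        public final class CuboidTest {

            record Vec3(double x, double y, double z) {
                Vec3 sub(Vec3 o)   { return new Vec3(x - o.x, y - o.y, z - o.z); }
                double dot(Vec3 o) { return x * o.x + y * o.y + z * o.z; }
                Vec3 cross(Vec3 o) { return new Vec3(y * o.z - z * o.y, z * o.x - x * o.z, x * o.y - y * o.x); }
            }

            /**
             * origin is one corner; e1, e2, e3 are the three edges leaving it.
             * Returns {u, v, w} in [0,1]^3 if p is inside, or null otherwise.
             */
            static double[] params(Vec3 p, Vec3 origin, Vec3 e1, Vec3 e2, Vec3 e3) {
                Vec3 d = p.sub(origin);

                // Solve d = u*e1 + v*e2 + w*e3 with Cramer's rule (scalar triple products).
                double det = e1.dot(e2.cross(e3));
                if (Math.abs(det) < 1e-12) return null;      // degenerate box

                double u = d.dot(e2.cross(e3)) / det;
                double v = e1.dot(d.cross(e3)) / det;
                double w = e1.dot(e2.cross(d)) / det;

                boolean inside = u >= 0 && u <= 1 && v >= 0 && v <= 1 && w >= 0 && w <= 1;
                return inside ? new double[]{u, v, w} : null;
            }

            public static void main(String[] args) {
                Vec3 origin = new Vec3(0, 0, 0);
                Vec3 e1 = new Vec3(2, 0, 0), e2 = new Vec3(0, 3, 0), e3 = new Vec3(0, 0, 1);
                double[] uvw = params(new Vec3(1, 1.5, 0.25), origin, e1, e2, e3);
                System.out.println(uvw == null ? "outside" : "u=" + uvw[0] + " v=" + uvw[1] + " w=" + uvw[2]);
            }
        }

    Since det(e1, d, e3) = d·(e3×e1) and det(e1, e2, d) = d·(e1×e2), the three cross products and the determinant can be precomputed per cuboid, leaving only three dot products and a few comparisons per point.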

    Read the article

  • SEO Techniques - How to Get Quality Backlinks Fast

    Backlinks are like votes in a page rank election: you are the candidate and the votes are your backlinks. The more votes you receive, and the more important the people who cast those votes for you, the better your popularity, and that is how Google looks at it as well.

    Read the article

  • Fast, easy, and secure method to perform DB actions with GET

    - by rob - not a robber
    Hey all, this is sort of a methods/best practices question that I am sure has been addressed, yet I can't find a solution based on the vague search terms I enter. I know starting the question with "fast and easy" will probably draw a few sighs, so my apologies.

    Here is the deal. I have a logged-in area where an ADMIN can do a whole host of POST operations to input data relating to their profile. The data is structured pretty distinctly and is well segmented in most tables as it relates to the ID of the admin. Now, I have one table where a single type of data is dumped for all ADMINs, and each record is differentiated only by the ADMIN's unique ID attached to it.

    I was planning on letting the ADMIN remove these records by clicking on a link with a query string - obviously using GET. Since the query structure is in the link, any logged-in admin could then exploit the URL and delete a competitor's records. Is the only way to safely do this through POST, or should I pass through the session info that includes the password and validate it against the ADMIN ID that is requesting the delete? This is obviously much more work for me.

    As they said in the auto repair business I used to work in, there are three ways to do a job: fast, good, and cheap, and you can only have two at a time. Fast and cheap will not be good; good and cheap will not have fast turnaround; fast and good will not be cheap. haha I guess that applies here... you can never have fast, easy and secure all at once ;) Thanks in advance...
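
    Whatever the HTTP method (POST plus a CSRF token is still the right choice for anything state-changing), the safeguard that actually stops the competitor scenario is scoping the delete to the admin ID held in the server-side session instead of trusting anything in the URL. A minimal JDBC sketch of that check follows; the table and column names are assumptions, and the in-memory H2 database is only there to make the example self-contained.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.SQLException;

        /** Delete a record only if it belongs to the admin who is logged in. */
        public final class RecordDeleter {

            /**
             * recordId comes from the request (e.g. the link's query string);
             * adminId must come from the server-side session, never from the URL.
             * Returns true if a row was actually deleted.
             */
            public static boolean deleteOwnRecord(Connection db, long recordId, long adminId) throws SQLException {
                String sql = "DELETE FROM records WHERE id = ? AND admin_id = ?";
                try (PreparedStatement stmt = db.prepareStatement(sql)) {
                    stmt.setLong(1, recordId);
                    stmt.setLong(2, adminId);
                    // If the record belongs to another admin, the WHERE clause matches nothing.
                    return stmt.executeUpdate() == 1;
                }
            }

            public static void main(String[] args) throws SQLException {
                // Hypothetical connection string; needs an H2 (or other JDBC) driver on the classpath.
                try (Connection db = DriverManager.getConnection("jdbc:h2:mem:demo")) {
                    db.createStatement().execute(
                        "CREATE TABLE records(id BIGINT PRIMARY KEY, admin_id BIGINT, data VARCHAR(255))");
                    db.createStatement().execute("INSERT INTO records VALUES (1, 42, 'belongs to admin 42')");

                    System.out.println(deleteOwnRecord(db, 1, 99)); // false: not admin 99's record
                    System.out.println(deleteOwnRecord(db, 1, 42)); // true: the owner may delete it
                }
            }
        }

    With the ownership check in the WHERE clause there is no need to re-validate the password on every delete; the session already establishes who is logged in, and the query guarantees the row can only be the caller's own.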

    Read the article

  • Fast Ethernet module for Cisco 2620

    - by Kenny Rasschaert
    I have a Cisco 2620 router. It comes with one Fast Ethernet port built in (circled in red), and one old AUI Ethernet module is installed (circled in blue). I figure I can put a transceiver on the AUI interface to get a second RJ45 connector, but what I'd really like to have is a second Fast Ethernet connector. The ideal candidate would be the NM-1FE-TX module. Cisco claims on their website that this module is not suitable for the Cisco 2620 and Cisco 2620XM; it says so in "Table 2 Physical Limitation of Serial Modules per Chassis". Indeed, this module was designed for the 3600 series of routers. I've seen claims on the internet, however, of people having this module fully functional on a 2620XM. This claim gains some credibility from the fact that in Cisco's own Packet Tracer software you can install this module on the 2620XM router. I'm looking for a definitive answer: will this module work on a Cisco 2620? Is there perhaps another way to get a second Fast Ethernet port on this device?

    Read the article

  • Windows Explorer slow to open networked computer, fast to navigate once opened

    - by Scott Noyes
    I open Windows Explorer and enter an IP for a computer on my home network (\\192.168.1.101). It takes 30 seconds or more to present a list of the shared folders. It does not appear to be an initial handshaking/authentication issue; even if I allow the view to load and then immediately load it again, it is always slow. Once the folders appear, navigating through them and opening files is fast. Also, navigating directly to a folder (\\192.168.1.101\My Music) is fast, even if it's the first connection since a restart. Using \\computerName instead of the IP address gives exactly the same results. Pings return in 1 ms, and net view \\computerName (or \\ipAddress) returns the list of shared folders quickly. This makes me suspect an Explorer issue rather than a network issue.

    Suspecting that the remote computer was being automatically indexed or something, I went into Tools - Folder Options - View and unchecked "Automatically search for network folders and printers," but that made no difference. De-selecting the "Folders" icon near the address bar makes no difference. Adding the IP address and computer name to the hosts file makes no difference.

    Both computers involved are laptops running Windows XP. Both have WiFi and cable adapters; mine is not connected via cable. The result is the same whether the target is plugged in to the cable or not (although the IP address changes - 192.168.1.101 over cable, 192.168.1.103 over WiFi). We are using DHCP assigned by the router.

    Read the article

  • OS X Server - setting up a share for network home directories + fast user switching

    - by sohocoke
    I'm nearly finished setting up my Open Directory master to allow users to be managed centrally and logged in on any of the client machines at home. I found discussion suggesting that using an AFP share for the 'Users' automount would leave network users (as opposed to users defined locally, or Portable Home Directory users) unable to use fast user switching, because the first user logging in triggers the automount and mounts the 'Users' share with his permissions, preventing further users from using the mount. I've also found some suggestions to configure autofs so that the 'Users' share is mounted before any user logs in, but not in great detail. I'd greatly appreciate instructions for setting up autofs - ideally on the Open Directory server rather than in each client's /etc/fstab or equivalent location, so that it isn't required on every client machine - to get fast user switching working with network users.

    Read the article

  • APC serving old code intermittently running with Lighttpd and PHP Fast CGI

    - by APZ
    I recently started facing this problem: APC shows old code when we upload an HTML template file to fix or change something on our websites. We run APC with apc.stat = 0 and want to keep it that way, because we seldom make changes to templates. Every time we upload a template we make sure to flush APC, and we execute this script (only part of the script is shown here) to clear the cache:

        apc_clear_cache();
        apc_clear_cache('user');
        apc_clear_cache('system');
        apc_clear_cache('opcode');

    We use lighttpd and PHP FastCGI, and FastCGI has "max-procs" = 2, "PHP_FCGI_CHILDREN" = "5". Even after flushing APC once the upload is complete, it serves the old template intermittently. Any help would be appreciated.

    Read the article

  • Archive software for big files and fast index

    - by AkiRoss
    I'm currently using tar for archiving some files. The problem is that the archives are pretty big, contain a lot of data, and tar is very slow when listing and extracting. I often need to extract single files or folders from the archive, but I don't currently have an external index of files. So, is there an alternative for Linux that lets me build uncompressed archive files, preserving the file attributes AND having a fast access list table? I'm talking about archives of 10 to 100 GB, and it's pretty impractical to wait several minutes to access a single file. Anyway, any trick to solve this problem is welcome (but a single archive file is a requirement, so no rsync or similar). Thanks in advance!

    EDIT: I'm not compressing the archives, and even so I think tar is too slow. To be precise about "slow", I'd like that: listing the archive contents takes time linear in the file count inside the archive, but with a very small constant (e.g. if a list of all the files is included at the head of the archive, it could be very fast); and extraction of a target file/directory takes (filesystem permitting) time linear in the target size (e.g. if I'm extracting a 2 MB PDF file in a 40 GB directory, I'd really like it to take less than a few minutes, if not seconds). Of course, this is just my idea and not a requirement. I guess such performance could be achievable if the archive contained an index of all the files with their respective offsets, and the index were well organized (e.g. a tree structure).
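
    The idea in the EDIT (an index of names and byte offsets kept next to, or at the head of, the archive) works with plain uncompressed tar, because each member's data sits at a fixed offset that one sequential pass over the 512-byte headers can record. Below is a minimal Java sketch of such an indexer plus a seek-based single-file extraction; it only handles plain ustar file entries (no GNU long names, sparse files, or base-256 sizes), and the class and argument names are made up for the example.

        import java.io.IOException;
        import java.io.RandomAccessFile;
        import java.nio.charset.StandardCharsets;
        import java.util.LinkedHashMap;
        import java.util.Map;

        /** Build a name -> (offset, size) index over an uncompressed tar, then extract one file by seeking. */
        public final class TarIndexer {

            record Entry(long dataOffset, long size) {}

            /** One sequential pass over the 512-byte tar headers; the data blocks are skipped, not read. */
            static Map<String, Entry> index(RandomAccessFile tar) throws IOException {
                Map<String, Entry> idx = new LinkedHashMap<>();
                byte[] header = new byte[512];
                long pos = 0;
                while (pos + 512 <= tar.length()) {
                    tar.seek(pos);
                    tar.readFully(header);
                    String name = cstr(header, 0, 100);
                    if (name.isEmpty()) break;                       // zero blocks mark the end of the archive
                    long size = Long.parseLong(cstr(header, 124, 12).trim(), 8);
                    char type = (char) header[156];
                    if (type == '0' || type == '\0') {               // plain file entries only in this sketch
                        idx.put(name, new Entry(pos + 512, size));
                    }
                    long padded = (size + 511) / 512 * 512;          // data is padded to whole 512-byte blocks
                    pos += 512 + padded;
                }
                return idx;
            }

            static String cstr(byte[] buf, int off, int len) {
                int end = off;
                while (end < off + len && buf[end] != 0) end++;
                return new String(buf, off, end - off, StandardCharsets.US_ASCII);
            }

            public static void main(String[] args) throws IOException {
                try (RandomAccessFile tar = new RandomAccessFile(args[0], "r")) {
                    Map<String, Entry> idx = index(tar);             // could be saved next to the archive
                    Entry e = idx.get(args[1]);
                    if (e == null) { System.out.println("not in archive"); return; }
                    tar.seek(e.dataOffset());                        // random access: no scan of earlier data
                    byte[] data = new byte[(int) e.size()];
                    tar.readFully(data);
                    System.out.write(data);
                    System.out.flush();
                }
            }
        }

    Persisting the map to a small side file after the first pass gives listing in time proportional to the index and extraction proportional to the target size, which matches the two requirements above.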

    Read the article

  • memcache fast-cgi php apache 2.2 windows 7 creating problems

    - by Ahmad
    Hi, I am trying to run memcache and FastCGI with Apache 2.2 + PHP on a Windows 7 machine. If I don't use memcache everything works fine: the moment I disable extension=php_memcache.dll in php.ini everything returns to normal. Once I start Apache, the Apache logs say:

        [Wed Jan 12 18:19:23 2011] [notice] Apache/2.2.17 (Win32) mod_fcgid/2.3.6 configured -- resuming normal operations
        [Wed Jan 12 18:19:23 2011] [notice] Server built: Oct 18 2010 01:58:12
        [Wed Jan 12 18:19:23 2011] [notice] Parent: Created child process 412
        [Wed Jan 12 18:19:23 2011] [notice] Child 412: Child process is running
        [Wed Jan 12 18:19:23 2011] [notice] Child 412: Acquired the start mutex.
        [Wed Jan 12 18:19:23 2011] [notice] Child 412: Starting 64 worker threads.
        [Wed Jan 12 18:19:23 2011] [notice] Child 412: Starting thread to listen on port 80.

    And after accessing the page (the page just has echo phpinfo();), I get this error in error.log:

        [Wed Jan 12 18:20:54 2011] [warn] [client 127.0.0.1] (OS 109)The pipe has been ended. : mod_fcgid: get overlap result error
        [Wed Jan 12 18:20:54 2011] [error] [client 127.0.0.1] Premature end of script headers: index.php

    I have php_memcache.dll in my ext directory, and httpd.conf is like this:

        LoadModule fcgid_module modules/mod_fcgid.so
        FcgidInitialEnv PHPRC "c:/php"
        FcgidInitialEnv PATH "c:/php;C:/WINDOWS/system32;C:/WINDOWS;C:/WINDOWS/System32/Wbem;"
        FcgidInitialEnv SystemRoot "C:/Windows"
        FcgidInitialEnv SystemDrive "C:"
        FcgidInitialEnv TEMP "C:/WINDOWS/Temp"
        FcgidInitialEnv TMP "C:/WINDOWS/Temp"
        FcgidInitialEnv windir "C:/WINDOWS"
        FcgidIOTimeout 64
        FcgidConnectTimeout 32
        FcgidMaxRequestsPerProcess 500
        <Files ~ "\.php$">
            AddHandler fcgid-script .php
            FcgidWrapper "c:/php/php-cgi.exe" .php
        </Files>

    So the problem has to be related to memcache, because if I disable it, FastCGI seems to work fine. Any possible reasons for this? The memcache service is running; I can check it through Control Panel - Services.

    Read the article
