Search Results

Search found 35970 results on 1439 pages for 'javascript performance'.


  • Separating JavaScript and HTML when dynamically adding HTML via JavaScript

    - by optician
    I am currently building a very dynamic table for a list application, which will perform basic CRUD functions via AJAX. What I would like to do is separate the visual design and the JavaScript to the point where I can change the design side without touching the JS side. This would only work where the design stays roughly the same (I would like to use it for rapid prototyping). Here is an example:

        <table>
        <tr><td>record-123</td><td>I am line 123</td><td>delete row</td></tr>
        <tr><td>record-124</td><td>I am line 124</td><td>delete row</td></tr>
        <tr><td>record-125</td><td>I am line 125</td><td>delete row</td></tr>
        <tr><td>add new record</td></tr>
        </table>

    Now, when I add a new record, I would like to insert a new row of HTML, but I would rather not put this HTML into the JavaScript file. What I am considering is creating a row like this on the page, near the table:

        <tr style='display:none;' id='template-row'><td>record-id</td><td>content-area</td><td>delete row</td></tr>

    When I come to add the new row, I search the page for the tag with id='template-row', grab it, do a string replace on it, and then put it in the right place in the page. As long as the design doesn't shift radically and I keep the placeholder strings the same, designs can be quickly modified without touching the JS. Can anyone give any advice on a methodology like this?
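
    A minimal sketch of the string-replace idea above, assuming jQuery (the page already does its CRUD via AJAX) and the hidden template row and placeholder strings exactly as shown; the addRow function and newRecord object are illustrative names, not from the post:

        // Sketch of the template-row / string-replace approach described above.
        // Assumes <tr id='template-row'> and the placeholders 'record-id' and
        // 'content-area' exactly as in the question.
        function addRow(newRecord) {
          var markup = $('#template-row')
            .clone()                          // work on a copy, keep the template intact
            .removeAttr('id')                 // the copy must not reuse the template's id
            .removeAttr('style')              // drop the inline style that hides the template
            .wrap('<div/>').parent().html()   // cross-browser way to read the row's outer HTML
            .replace('record-id', 'record-' + newRecord.id)
            .replace('content-area', newRecord.content);

          // Insert the finished row above the "add new record" row.
          $(markup).insertBefore('table tr:last');
        }

        // Example usage after a successful AJAX create:
        // addRow({ id: 126, content: 'I am line 126' });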

    Read the article

  • if/else in JavaScript with the value of a select box (pure JavaScript)

    - by user983248
    I'm working on a select box that has images instead of text (set as backgrounds with CSS).

        <script type="text/javascript">
        function onChange(element) {
          element.style.backgroundColor = "Yellow";
          element.className = "on_change";
        }
        </script>

        <select onchange="onChange(this);">
          <option value="1" style="background: url(/one.png) no-repeat scroll 0 0 transparent; width:32px; height:32px;"></option>
          <option value="2" style="background: url(/two.png) no-repeat scroll 0 0 transparent; width:32px; height:32px;"></option>
          <option value="3" style="background: url(/three.png) no-repeat scroll 0 0 transparent; width:32px; height:32px;"></option>
        </select>

    The problem is: how do I get the value of the selected option and, if it is 1, set one image, and if it is 2, set another image as the background, using pure JavaScript (no jQuery)? I know that selectedIndex is the key to my problem, but I'm clueless about how to use it, or how to use it in an if/else statement. The script at the top is just one of my attempts; I actually use the following to perform a similar task:

        <select onchange="this.style.backgroundColor=this.options[this.selectedIndex].style.backgroundColor; this.style.color=this.options[this.selectedIndex].style.color">
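
    A minimal sketch of the selectedIndex / if-else idea in plain JavaScript, reusing the onChange hook and the image paths from the question:

        // Reads the selected option's value and swaps the select's background image.
        // Pure JavaScript, no jQuery; wired up with <select onchange="onChange(this);">.
        function onChange(selectEl) {
          var value = selectEl.options[selectEl.selectedIndex].value;

          if (value === "1") {
            selectEl.style.backgroundImage = "url(/one.png)";
          } else if (value === "2") {
            selectEl.style.backgroundImage = "url(/two.png)";
          } else {
            selectEl.style.backgroundImage = "url(/three.png)";
          }
        }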

    Read the article

  • Replacing certain words with links to definitions using Javascript

    - by adharris
    I am trying to create a glossary system which will get a list of common words and their definitions via AJAX, then replace any occurrence of those words in certain elements (those with the useGlossary class) with a link to the full definition, and provide a short definition on mouse hover. The way I am doing it works, but for large pages it takes 30-40 seconds, during which the page hangs. I would like to either decrease the time it takes to do the replacement or make the replacement run in the background without hanging the page. I am using jQuery for most of the JavaScript, and qTip for the mouse hover. Here is my existing slow code:

        $(document).ready(function () {
          $.get("fetchGlossary.cfm", null, glossCallback, "json");
        });

        function glossCallback(data) {
          $(".useGlossary").each(function() {
            var $this = $(this);
            for (var i in data) {
              $this.html($this.html().replace(
                new RegExp("\\b" + data[i].term + "\\b", "gi"),
                function(m) { return makeLink(m, data[i].def); }));
            }
            $this.find("a.glossary").qtip({ style: { name: 'blue', tip: true } });
          });
        }

        function makeLink(m, def) {
          return "<a class='glossary glossary" + m.replace(/\s/gi, "").toUpperCase() +
            "' href='reference/glossary.cfm' title='" + def + "'>" + m + "</a>";
        }

    Thanks for any feedback/suggestions!
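
    One way to keep the page responsive (a sketch, not a drop-in fix) is to rewrite each element's HTML once using a single combined regex, and to yield to the browser between elements with setTimeout; this reuses makeLink and the qTip setup from the code above and assumes the terms contain no regex metacharacters:

        // Sketch: combine all terms into one alternation so each element is rewritten
        // once, and process elements one at a time so the browser can repaint.
        function glossCallbackChunked(data) {
          var defs = {};
          var terms = [];
          for (var i = 0; i < data.length; i++) {
            defs[data[i].term.toLowerCase()] = data[i].def;
            terms.push(data[i].term);
          }
          var pattern = new RegExp("\\b(" + terms.join("|") + ")\\b", "gi");
          var elements = $(".useGlossary").get();

          (function processNext() {
            if (elements.length === 0) { return; }
            var $el = $(elements.shift());

            $el.html($el.html().replace(pattern, function (m) {
              return makeLink(m, defs[m.toLowerCase()]);
            }));
            $el.find("a.glossary").qtip({ style: { name: 'blue', tip: true } });

            setTimeout(processNext, 0);   // let the browser breathe before the next element
          })();
        }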

    Read the article

  • Building an HTML5 App with ASP.NET

    - by Stephen Walther
    I'm teaching several JavaScript and ASP.NET workshops over the next couple of months (thanks everyone!) and I thought it would be useful for my students to have a really easy-to-use JavaScript reference. I wanted a simple interactive JavaScript reference and I could not find one, so I decided to put together one of my own. I decided to use the latest features of JavaScript, HTML5, and jQuery, such as local storage, offline manifests, and jQuery templates. What could be more appropriate than building a JavaScript reference with JavaScript?

    You can try out the application by visiting: http://Superexpert.com/JavaScriptReference

    Because the app takes advantage of several advanced features of HTML5, it won't work with Internet Explorer 6 (but really, you should stop using that browser). I have tested it with IE 8, Chrome 8, Firefox 3.6, and Safari 5. You can download the source for the JavaScript Reference application at the end of this article.

    Superexpert JavaScript Reference

    Let me provide you with a brief walkthrough of the app. When you first open the application, you see a lookup screen. As you type the name of something from the JavaScript language, matching results are displayed, and you can click the details link for any entry to view its details in a modal dialog. Alternatively, you can click any of the tabs -- Objects, Functions, Properties, Statements, Operators, Comments, or Directives -- to filter results by type of syntax; for example, you might want to see a list of all JavaScript built-in objects. You can also log in to the application to make modifications: after you log in, you can add, update, or delete entries in the reference database.

    HTML5 Local Storage

    The application takes advantage of HTML5 local storage to store all of the reference entries in the local browser. IE 8, Chrome 8, Firefox 3.6, and Safari 5 all support local storage. When you open the application for the first time, all of the reference entries are transferred to the browser. The data is stored persistently: even if you shut down your computer and return to the application many days later, the data does not need to be transferred again.

    Whenever you open the application, the app checks with the server to see if any of the entries have been updated on the server. If there have been updates, then only the updates are transferred to the browser and merged with the existing entries in local storage. After the reference database has been transferred to your browser once, only changes are transferred in the future.

    You get two benefits from using local storage. First, the application loads and runs very fast after the data has been loaded once; it does not query the server whenever you filter or view entries, because all of the data is persisted in the browser. Second, you can browse the JavaScript reference even when you are not connected to the Internet (when you are on the proverbial airplane). The JavaScript Reference works as an offline application for browsers that support offline applications (unfortunately, not IE).

    When using Google Chrome, you can easily view the contents of local storage by selecting Tools, Developer Tools (CTRL-SHIFT-I) and then selecting Storage, Local Storage. The JavaScript Reference app stores two items in local storage: entriesLastUpdated and entries.
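
    A rough sketch of the "transfer once, then merge only the changes" pattern described above; the two localStorage keys come from the post, while the endpoint URL, the shape of the server response, and the merge-by-id rule are illustrative assumptions:

        // Load cached entries, then ask the server only for entries changed since the last sync.
        function loadEntries(callback) {
          var entries = JSON.parse(localStorage.getItem("entries") || "[]");
          var lastUpdated = localStorage.getItem("entriesLastUpdated") || "0";

          $.getJSON("EntryService.svc/Entries", { changedSince: lastUpdated }, function (changes) {
            for (var c = 0; c < changes.length; c++) {
              var change = changes[c], found = false;
              for (var i = 0; i < entries.length; i++) {
                if (entries[i].id === change.id) { entries[i] = change; found = true; break; }
              }
              if (!found) { entries.push(change); }   // brand-new entry from the server
            }
            localStorage.setItem("entries", JSON.stringify(entries));
            localStorage.setItem("entriesLastUpdated", String(new Date().getTime()));
            callback(entries);
          });
        }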
    HTML5 Offline App

    For browsers that support HTML5 offline applications -- Chrome 8 and Firefox 3.6, but not Internet Explorer -- you do not need to be connected to the Internet to use the JavaScript Reference. The JavaScript Reference can execute entirely on your machine, just like any other desktop application. When you first open the application with Firefox, you are presented with a notification bar that asks whether you want to accept offline content. If you click the Allow button, then all of the files (generated ASPX, images, CSS, JavaScript) needed by the JavaScript Reference will be stored on your local computer.

    Automatic Script Minification and Combination

    All of the custom JavaScript files are combined and minified automatically whenever the application is built with Visual Studio. All of the custom scripts are contained in a folder named App_Scripts. When you perform a build, the combine.js and combine.debug.js files are generated. The Combine.config file contains the list of files that should be combined (importantly, it specifies the order in which the files should be combined). Here are the contents of the Combine.config file:

        <?xml version="1.0"?>
        <combine>
          <scripts>
            <file path="compat.js" />
            <file path="storage.js" />
            <file path="serverData.js" />
            <file path="entriesHelper.js" />
            <file path="authentication.js" />
            <file path="default.js" />
          </scripts>
        </combine>

    jQuery and jQuery UI

    The JavaScript Reference application takes heavy advantage of jQuery and jQuery UI. In particular, the application uses jQuery templates to format and display the reference entries. Each template is stored in a separate ASP.NET user control in a folder named Templates, and the contents of the user controls (and therefore the templates) are combined in the default.aspx page:

        <!-- Templates -->
        <user:EntryTemplate runat="server" />
        <user:EntryDetailsTemplate runat="server" />
        <user:BrowsersTemplate runat="server" />
        <user:EditEntryTemplate runat="server" />
        <user:EntryDetailsCloudTemplate runat="server" />

    When the default.aspx page is requested, all of the templates are retrieved in a single page.

    WCF Data Services

    The JavaScript Reference application uses WCF Data Services to retrieve and modify database data. The application exposes a server-side WCF Data Service named EntryService.svc that supports querying, adding, updating, and deleting entries. jQuery Ajax calls are made against the WCF Data Service to perform the database operations from the browser; the OData protocol makes this easy. Authentication is handled on the server with a ChangeInterceptor: only authenticated users are allowed to update the JavaScript Reference entry database.

    JavaScript Unit Tests

    In order to build the JavaScript Reference application, I depended on JavaScript unit tests. I needed the unit tests, in particular, to write the JavaScript merge functions which merge entry change sets from the server with the existing entries in browser local storage. In order for unit tests to be useful, they need to run fast, and I ran my unit tests after each build. For this reason, I did not want to run the unit tests within the context of a browser; instead, I ran them using server-side JavaScript (the Microsoft Script Control). The source code that you can download at the end of this blog entry includes a project named JavaScriptReference.UnitTests that contains all of the JavaScript unit tests.
    JavaScript Integration Tests

    Because not every feature of an application can be tested by unit tests, the JavaScript Reference application also includes integration tests. I wrote the integration tests using Selenium RC in combination with ASP.NET unit tests. The Selenium tests run against all of the target browsers for the JavaScript Reference application: IE 8, Chrome 8, Firefox 3.6, and Safari 5. For example, here is the Selenium test that checks whether authenticating with a valid user name and password correctly switches the application to Admin Mode:

        [TestMethod]
        [HostType("ASP.NET")]
        [UrlToTest("http://localhost:26303/JavaScriptReference")]
        [AspNetDevelopmentServerHost(@"C:\Users\Stephen\Documents\Repos\JavaScriptReference\JavaScriptReference\JavaScriptReference", "/JavaScriptReference")]
        public void TestValidLogin() {
            // Run test for each controller
            foreach (var controller in this.Controllers) {
                var selenium = controller.Value;
                var browserName = controller.Key;

                // Open reference page.
                selenium.Open("http://localhost:26303/JavaScriptReference/default.aspx");

                // Click login button displays login form
                selenium.Click("btnLogin");
                Assert.IsTrue(selenium.IsVisible("loginForm"), "Login form appears after clicking btnLogin");

                // Enter user name and password
                selenium.Type("userName", "Admin");
                selenium.Type("password", "secret");
                selenium.Click("btnDoLogin");

                // Should set adminMode == true
                selenium.WaitForCondition("selenium.browserbot.getCurrentWindow().adminMode==true", "30000");
            }
        }

    The results of running the Selenium tests appear in the Test Results window just like the unit tests. The Selenium tests take much longer to execute than the unit tests; however, they provide test coverage for actual browsers. Furthermore, if you are using Visual Studio ALM, you can run the tests automatically every night as part of your standard nightly build. You can view the Selenium tests by opening the JavaScriptReference.QATests project.

    Summary

    I plan to write more detailed blog entries about this application over the next week. I want to discuss each of the features -- HTML5 local storage, HTML5 offline apps, jQuery templates, automatic script combining and minification, JavaScript unit tests, Selenium tests -- in more detail.

    You can download the source code for the JavaScript Reference application by clicking the following link: Download

    You need Visual Studio 2010 and ASP.NET 4 to build the application. Before running the JavaScript unit tests, install the Microsoft Script Control. Before running the Selenium tests, start the Selenium server by running the StartSeleniumServer.bat file located in the JavaScriptReference.QATests project.

    Read the article

  • Replace Javascript click event with timed event?

    - by Rik
    Hi, I've found some JavaScript code that layers photos on top of each other when you click on them. Rather than having to click, I'd like the function to run automatically every 5 seconds. How can I change this click event to a timed one?

        $('a#nextImage, #image img').click(function(event){

    Full code below. Thanks.

        <script type="text/javascript">
        $(document).ready(function(){
          $('#description').css({'display':'block'});

          $('#image img').hover(function() {
            $(this).addClass('hover');
          }, function() {
            $(this).removeClass('hover');
          });

          $('a#nextImage, #image img').click(function(event){
            event.preventDefault();
            $('#description p:first-child').css({'visibility':'hidden'});

            if($('#image img.current').next().length){
              $('#image img.current').removeClass('current').next().fadeIn('normal').addClass('current').css({'position':'absolute'});
            }else{
              $('#image img').removeClass('current').css({'display':'none'});
              $('#image img:first-child').fadeIn('normal').addClass('current').css({'position':'absolute'});
            }

            if($('#image img.current').width()>=($('#page').width()-100)){
              xPos=170;
            }else{
              do{
                xPos = 120 + (Math.floor(Math.random()*($('#page').width()-100)));
              }while(xPos+$('#image img.current').width()>$('#page').width());
            }

            if($('#image img.current').height()>=300){
              yPos=0;
            }else{
              do{
                yPos = Math.floor(Math.random()*300);
              }while(yPos+$('#image img.current').height()>300);
            }

            $('#image img.current').css({'left':xPos,'top':yPos});
          });
        });
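
    One simple way (a sketch, leaving the existing handler untouched) is to keep the click binding exactly as it is and trigger it on an interval:

        // Fire the existing click handler every 5 seconds instead of waiting for a click.
        // Assumes the click binding from the question stays in place.
        $(document).ready(function () {
          setInterval(function () {
            $('a#nextImage').trigger('click');   // runs the same handler a real click would
          }, 5000);                              // 5000 ms = 5 seconds
        });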

    Read the article

  • ATI proprietary driver performance?

    - by Axel
    I'm about to buy (or at least want to buy) a laptop with an ATI Radeon HD 4250, and I don't have a good opinion of ATI's drivers. How is the actual performance of the open and proprietary drivers (currently I have nVidia, and I'm very satisfied)? The intended use for the laptop is watching videos and programming in Java/PHP/maybe Qt... but I would like to know if Compiz runs well. Yes, I'm a hardcore (?) programmer who uses Compiz. :P Does anyone have this GPU? Experiences? Thoughts? Thanks! :D

    Read the article

  • Intel Xeon 5600 (Westmere-EP) and AMD Magny-Cours Performance Update

    - by jchang
    HP has just released TPC-C and TPC-E results for the ProLiant DL380G7 with 2 Xeon 5680 3.33GHz 6-core processors, allowing a direct comparison with their DL385G7 with 2 Opteron 6176 2.3GHz 12-core processors. Last month I complained about the lack of performance results for the Intel Xeon 5600 6-core 32nm processor line for 2-way systems. This might have been deliberate, to avoid complicating the message for the Xeon 7500 8-core 45nm (for 4-way+ systems) launch two weeks later. http://sqlblog.com/blogs/joe_chang/archive/2010/04/07/intel-xeon-5600-westmere-ep-and-7500-nehalem-ex.aspx ...(read more)

    Read the article

  • How does ecryptfs impact harddisk performance?

    - by Freddi
    I have my home directory encrypted with ecryptfs. Does ecryptfs lead to fragmentation? I have the feeling that reading files, displaying folders, and logging in have become slower and slower (although it was not noticeably slow at the beginning). The hard disk makes a lot of seek noise even if I open only a text file. In /home/.ecryptfs I see many big archives (which probably contain the encrypted files), so I'm wondering whether Linux online filesystem defragmentation gains anything here. What options do I have to increase performance? Or should I conclude that I would be better off without encryption?

    Read the article

  • Quick ways to boost performance and scalability of ASP.NET, WCF and Desktop Clients

    - by oazabir
    There are some simple configuration changes that you can make in machine.config and IIS to give your web applications a significant performance boost. These are simple, harmless changes, but they make a lot of difference in terms of scalability. By tweaking system.net settings, you can increase the number of parallel calls that can be made from the services hosted on your servers as well as on desktop computers, and thus increase scalability. By changing the WCF throttling config you can increase the number of simultaneous calls WCF can accept and thus make the most of your hardware power. By changing the ASP.NET process model, you can increase the number of concurrent requests that can be served by your website. And finally, by turning on IIS caching and dynamic compression, you can dramatically increase the page download speed in browsers and the overall responsiveness of your applications. Read the CodeProject article for more details. http://www.codeproject.com/KB/webservices/quickwins.aspx Please vote for me if you find the article useful.

    Read the article

  • Java performance of StringBuilder append chains

    - by ultimate_guy
    In Java, if I am building a significant number of strings, is there any difference in performance between the following two examples?

        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < largeNumber; i++) {
          sb.append(var[i]);
          sb.append('=');
          sb.append(value[i]);
          sb.append(',');
        }

    or

        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < largeNumber; i++) {
          sb.append(var[i]).append('=').append(value[i]).append(',');
        }

    Thanks!

    Read the article

  • Misunderstanding Scope in JavaScript?

    - by Jeff
    I've seen a few other developers talk about "binding scope" in JavaScript, but it has always seemed to me that this is an inaccurate phrase. Function.prototype.call and Function.prototype.apply don't pass scope around between two methods; they change the caller of the function -- two very different things. For example:

        function outer() {
          var item = { foo: 'foo' };
          var bar = 'bar';
          inner.apply(item, null);
        }

        function inner() {
          console.log(this.foo); // foo
          console.log(bar);      // ReferenceError: bar is not defined
        }

    If the scope of outer were really passed into inner, I would expect inner to be able to access bar, but it can't: bar was in scope in outer and it is out of scope in inner. Hence, the scope wasn't passed. Even the Mozilla docs don't mention anything about passing scope: "Calls a function with a given this value and arguments provided as an array." Am I misunderstanding scope, or specifically scope as it applies to JavaScript? Or is it these other developers who are misunderstanding it?
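
    A small illustration of the distinction being discussed (an editorial sketch, not from the original question): call/apply only supply the this value, whereas lexical scope travels with closures.

        // call/apply set 'this'; they do not bring any outer variables along.
        var obj = { foo: 'foo' };

        function standalone() {
          console.log(this.foo);   // 'foo' when invoked as standalone.call(obj)
        }
        standalone.call(obj);

        // Lexical scope, by contrast, is captured by a closure at definition time.
        function makeInner() {
          var bar = 'bar';
          return function inner() {
            console.log(bar);      // 'bar' -- the closure carries makeInner's scope
          };
        }
        makeInner()();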

    Read the article

  • SQL SERVER – Video – Performance Improvement in Columnstore Index

    - by pinaldave
    I earlier wrote an article about SQL SERVER – Fundamentals of Columnstore Index and it was very well received by the community. However, one suggestion I keep receiving about that article is that many readers wanted to see the columnstore index in action but were not able to: some had not installed SQL Server 2012, and some did not have a machine powerful enough to recreate the big table involved in the demo. For that reason, I have created a short video. I have also written two more articles on the columnstore index; please read them as follow-ups to the video: SQL SERVER – How to Ignore Columnstore Index Usage in Query and SQL SERVER – Updating Data in A Columnstore Index. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Index, SQL Performance, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology, Video

    Read the article

  • How can I improve overall system performance?

    - by Decio Lira
    What are your tips for improving overall system performance on Ubuntu? Inspired by this question, I realized that some default settings may be rather conservative on Ubuntu and that it's possible to tweak them with little or no risk if you wish to make the system faster. This is not meant to be application specific (e.g. make Firefox load pages faster), but system wide. Preferably one tip per answer, with enough detail for people to implement it. A couple of mine would be: install Preload (via the Software Center or sudo apt-get install preload), and change the swappiness value, which "controls the degree to which the kernel prefers to swap when it tries to free memory". What are yours? PS: Since this is not intended to have a unique answer but rather several useful tips, I'm making this a community wiki out of the box.

    Read the article

  • Fixed JavaScript Warning - Pin to Top of Page Using CSS Position [migrated]

    - by nicorellius
    I am new to this site, but it seems like the right place to ask this question. I am working on a noscript chunk of code whereby I add a <p> at the top of the page that alerts users that they have JavaScript disabled. The end result should look like the Stack Exchange sites when JavaScript is disabled (mine looks similar, except that theirs sits at the very top of the page). I have it working OK, but I would love it if the red bar stayed fixed along the top when scrolling. I tried using the position: fixed; method, but it ends up moving the <p> element and I can't get it to look exactly the same as it does without the position: fixed; modification. I tried fiddling with CSS top and left and other positioning, but it never looks the way I want it to. Here is a CSS snippet:

        <noscript>
          <style type="text/css">
            p.noscript_warning { position: fixed; }
          </style>
        </noscript>

    Read the article

  • Web Application: Combining View Layer Between PHP and Javascript-AJAX

    - by wlz
    I'm developing a web application using PHP with the CodeIgniter MVC framework, with a huge amount of real-time client-side functionality. This is my first time building a large-scale client-side app, so I am combining PHP with a large amount of JavaScript in one project. As you already know, an MVC framework separates application modules into Model-View-Controller. My concern is about the View layer. I could display the data in the DOM with PHP's built-in script tags by loading some data in the Controller. Otherwise, I could use AJAX to pull the data -- treating the Controller like a service only -- and display it with JavaScript. Here is some visualization. I could put the data in directly from the Controller:

        <label>Username</label>
        <input type="text" id="username" value="<?=$userData['username'];?>"><br />
        <label>Date of birth</label>
        <input type="text" id="dob" value="<?=$userData['dob'];?>"><br />
        <label>Address</label>
        <input type="text" id="address" value="<?=$userData['address'];?>">

    Or pull it using AJAX:

        $.ajax({
          type: "POST",
          url: config.indexURL + "user",
          dataType: "json",
          success: function(data) {
            $('#username').val(data.username);
            $('#dateOfBirth').val(data.dob);
            $('#address').val(data.address);
          }
        });

    So, which approach is better given that my application has complex client-side functionality? On the other hand, PHP-CI has a default mechanism to put the data in directly from the Controller, so why use AJAX?

    Read the article

  • Coded ui to measure performance

    - by Mike Weber
    I have been tasked with using Coded UI to measure performance of a proprietary Windows desktop application. The need is to measure how long it takes for the next page/screen to display after a user clicks on a control. For example, a user enters their ID and password and clicks sign-in; the need is to measure how long it takes for the next screen to display after the user clicks the sign-in button. I understand the need to define what indicates that the screen is loaded and ready for use. One approach is to use control.WaitForControlReady together with BeginTimer/EndTimer. Is Coded UI a dependable and accurate way of measuring time? Is WaitForControlReady the best method to determine when a control is ready for use?

    Read the article

  • Poor mobile performance when running from Eclipse

    - by Yajirobe_LOL
    So after weeks of thinking my rendering code was bad, I accidentally discovered the following when running my game on a Nexus S:

        From Eclipse (Debug as - Android application): 12 fps
        From the device while still attached to USB (still getting log info in Eclipse): 24 fps
        From the device while not attached via USB: 56 fps

    I was wondering if anyone else has issues like this? I mean, it really isn't a problem, since the final release build will likely have good performance, but for the time being I don't want to have to keep (un)plugging my device in and out when testing code all day long. Is there some remedy for this, or does anyone have any input/advice? Thanks.

    Read the article

  • Will having many timers affect my game performance?

    - by iQue
    I'm making a game for Android, and earlier today I was trying to add some cool stuff to my game. The problem is that this thing needs about 5 timers. I build my timers like this:

        timer += deltaTime;
        if (timer >= 2.0f) {
            doStuff();
            timer -= 2.0f;
        }
        // this timer gets stuff done every 2 secs

    Will having too many timers like this, checked every frame, hurt my game's performance? The effect I wanted to add was a crosshair every 2 seconds, then remove it after 2 seconds and run a timed animation -- so an array of crosshairs dependent on a bunch of timers, to be exact. This caused my game to shut down when used, so that's why I'm wondering if using that many timers causes my game to flip out.
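
    For scale, a handful of accumulator timers checked once per frame is a trivial amount of work; a rough sketch of keeping them all in one list (illustrative names, written here in JavaScript rather than the asker's Android/Java code):

        // Sketch: a single list of accumulator timers updated once per frame.
        // Checking even dozens of these per frame is negligible work.
        var timers = [];

        function addTimer(interval, action) {
          timers.push({ interval: interval, elapsed: 0, action: action });
        }

        function updateTimers(deltaTime) {
          for (var i = 0; i < timers.length; i++) {
            var t = timers[i];
            t.elapsed += deltaTime;
            if (t.elapsed >= t.interval) {
              t.action();                 // e.g. spawn or remove a crosshair
              t.elapsed -= t.interval;
            }
          }
        }

        // Example: spawn a crosshair every 2 seconds.
        // addTimer(2.0, spawnCrosshair);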

    Read the article

  • Setter Validation can affect performance?

    - by TiagoBrenck
    Within a scenario where you use an ORM to map your entities to the DB, and you have setter validations (nullable checks, date-lower-than-today validation, etc.), every time the ORM gets a result it passes through the setters to instantiate the object. If I have a grid that usually returns 500 records, I assume that each record passes through all the validations. If my entity has 5 setter validations, then I have executed 2,500 validations. Will those 2,500 validations affect performance? If it were 15,000 validations, would it be different? In my opinion, and according to this answer (http://stackoverflow.com/questions/4893558/calling-setters-from-a-constructor/4893604#4893604), setter validation is more useful than constructor validation. Is there a way to avoid unnecessary validation, since I am sure that the values I send to the DB when saving the entity won't change until I edit them in my system?

    Read the article

  • Performance tracking/monitoring in games

    - by vitaliy kotik
    Let's say I have an online game with a downloadable client / browser plugin. I want to track the performance of my software and automatically send a summary to the server. Let it be fps, latency, load time, physics step calculation time, whatever... I also want tools to perform data analysis: per-session stats, per-hardware stats, averages, totals, diagrams, etc., so that I can see where the real-world hotspots / bottlenecks are. Is there any common out-of-the-box / SaaS solution?
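
    Absent an off-the-shelf service, a rough sketch of the client-side half for a JS/browser client (the /metrics endpoint and the field names are illustrative assumptions, not a known product):

        // Accumulate simple per-session metrics and post a summary to the server.
        var metrics = { frames: 0, frameTimeMs: 0, worstFrameMs: 0 };

        function recordFrame(frameMs) {
          metrics.frames += 1;
          metrics.frameTimeMs += frameMs;
          if (frameMs > metrics.worstFrameMs) { metrics.worstFrameMs = frameMs; }
        }

        function sendSummary() {
          var summary = {
            avgFps: metrics.frames > 0 ? 1000 / (metrics.frameTimeMs / metrics.frames) : 0,
            worstFrameMs: metrics.worstFrameMs,
            userAgent: navigator.userAgent        // rough stand-in for per-hardware stats
          };
          $.post("/metrics", summary);            // jQuery assumed, as elsewhere on this page
          metrics = { frames: 0, frameTimeMs: 0, worstFrameMs: 0 };
        }

        // Report once a minute while the game runs.
        setInterval(sendSummary, 60000);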

    Read the article

  • Using a subset of GetHashCode() to increase AzureTable performance through partitioning

    - by makerofthings7
    Generally speaking, Azure Table IO performance improves as more partitions are used (with some tradeoffs in continuation tokens and batch updates that I won't go into). Since the partition key is always a string, I am considering using a "natural" load-balancing technique based on a subset of the GetHashCode() of the partition key, appending this subset to the partition key itself. This will allow all direct PK/RK queries to be computed with little overhead and with ease. Batch updates may just need an intermediate step to group similar PKs together prior to submission. Questions: Should I use GetHashCode() to compute the partition key? Is a better function available? If I use GetHashCode(), does it matter which characters I use for my PK? Is there an abstraction for Azure Table and Blob storage that does this for me already?

    Read the article

  • Improving grepping over a huge file performance

    - by rogerio_marcio
    I have FILE_A, which has over 300K lines, and FILE_B, which has over 30M lines. I created a bash script that greps each line of FILE_A in FILE_B and writes the result of the grep to a new file. This whole process takes over 5 hours. I'm looking for suggestions on whether you see any way of improving the performance of my script. I'm using grep -F -m 1 as the grep command.

    FILE_A looks like this:

        123456789
        123455321

    and FILE_B is like this:

        123456789,123456789,730025400149993,
        123455321,123455321,730025400126097,

    So with bash I have a while loop that picks the next line in FILE_A and greps it in FILE_B. When the pattern is found in FILE_B, I write it to result.txt.

        while read -r line; do
          grep -F -m1 $line 30MFile
        done < 300KFile

    Thanks a lot in advance for your help.

    Read the article

  • Video capture Performance

    - by volting
    I have noticed high CPU utilization in a number of applications (except mplayer) which read from the embedded webcam on my laptop. Bizarrely, CPU utilization varies proportionally to the level of illumination present. I know that the high CPU usage has nothing to do with rendering the video, as I have written a simple app using the OpenCV library that simply grabs frames from the webcam, and CPU usage is still high. I think that mplayer might be using my GPU (and the other apps aren't), but since it's not an issue with rendering, I don't think this explains anything.

        Cheese: low light ~12% CPU, bright light ~63% CPU
        Camorama: low light ~7% CPU, bright light ~30% CPU
        OpenCV C++ library (display in a single highgui window): low light ~13% CPU, bright light ~40% CPU (same test on Windows 7: 4-9%)
        Mplayer: no problem, 1-2% regardless of light levels

    Note: if all I wanted to do was capture a feed from my webcam, I would use mplayer and forget about it, but I'm developing an application which uses OpenCV to capture a video feed among other things, so performance is important.

    Read the article

  • Find an element in a JavaScript array

    - by Aligned
    Originally posted on: http://geekswithblogs.net/Aligned/archive/2014/08/22/find-an-element-in-a-javascript-array.aspx

    I needed a C# Dictionary-like data structure in JavaScript and then a way to find an object by a key. I had forgotten how to do this, so I did some searching, talked to a colleague, and came up with this jsFiddle. See the code in my jsFiddle or below:

        var processingProgressTimeoutIds = [];

        var file = { name: 'test', timeId: 1 };
        var file2 = { name: 'test2', timeId: 2 };
        var file3 = { name: 'test3', timeId: 3 };

        processingProgressTimeoutIds.push({ name: file.name, timerId: file.timeId });
        processingProgressTimeoutIds.push({ name: file2.name, timerId: file2.timeId });
        processingProgressTimeoutIds.push({ name: file3.name, timerId: file3.timeId });

        console.log(JSON.stringify(processingProgressTimeoutIds));

        var keyName = 'test';
        var match = processingProgressTimeoutIds.filter(function (item) {
          return item.name === keyName;
        })[0];
        console.log(JSON.stringify(match));

        // optimization
        var match2 = processingProgressTimeoutIds.some(function (element, index, array) {
          return element.name === keyName;
        });
        console.log(JSON.stringify(match2));

        // if you have the full object
        var match3 = processingProgressTimeoutIds.indexOf(file);
        console.log(JSON.stringify(match3));

        // http://jsperf.com/array-find-equal -- from Dave
        // indexOf is faster, but I need to find it by the key, so I can't use it here

        // ES6 will rock though, array comprehension! -- also from Dave
        // var ys = [x of xs if x == 3];
        // var y = ys[0];

    Here's a good blog post on array comprehension.
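
    As a side note (an editorial sketch, not from the original post): modern JavaScript has Array.prototype.find, which does the filter-then-take-first lookup directly:

        // Array.prototype.find returns the first element matching the predicate
        // (or undefined), so no [0] indexing is needed.
        var match4 = processingProgressTimeoutIds.find(function (item) {
          return item.name === keyName;
        });
        console.log(JSON.stringify(match4));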

    Read the article
