Search Results

Search found 80052 results on 3203 pages for 'data load performance'.


  • jquery data selector

    - by Tauren
    I need to select elements based on values stored in an element's .data() object. At a minimum, I'd like to select top-level data properties using selectors, perhaps like this:

        $('a').data("category","music");
        $('a:data(category=music)');

    Or perhaps the selector would be in regular attribute selector format:

        $('a[category=music]');

    Or in attribute format, but with a specifier to indicate it is in .data():

        $('a[:category=music]');

    I've found James Padolsey's implementation, which looks simple yet good. The selector formats above mirror methods shown on that page. There is also this Sizzle patch. For some reason, I recall reading a while back that jQuery 1.4 would include support for selectors on values in the jQuery .data() object. However, now that I'm looking for it, I can't find it. Maybe it was just a feature request that I saw. Is there support for this and I'm just not seeing it?

    Ideally, I'd like to support sub-properties in data() using dot notation, like this:

        $('a').data("user",{name: {first:"Tom",last:"Smith"},username: "tomsmith"});
        $('a[:user.name.first=Tom]');

    I also would like to support multiple data selectors, where only elements matching ALL specified data selectors are found. The regular jQuery multiple selector does an OR operation (for instance, $('a.big, a.small') selects a tags with either class big or small). I'm looking for an AND, perhaps like this:

        $('a').data("artist",{id: 3281, name: "Madonna"});
        $('a').data("category","music");
        $('a[:category=music && :artist.name=Madonna]');

    Lastly, it would be great if comparison operators and regex features were available on data selectors, so $('a[:artist.id>5000]') would be possible. I realize I could probably do much of this using filter(), but it would be nice to have a simple selector format.

    What solutions are available to do this? Is James Padolsey's the best solution at this time? My concern is primarily performance, but also the extra features such as sub-property dot notation and multiple data selectors. Are there other implementations that support these things or are better in some way?
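    For illustration, a minimal sketch of a custom :data pseudo-selector along the lines described above (this is not James Padolsey's implementation; it assumes the jQuery 1.x Sizzle extension point where the argument arrives as match[3]). It handles dot notation, and AND semantics fall out of chaining two :data() filters on one selector:

        // illustrative sketch of a custom :data pseudo-selector, not a library
        jQuery.expr[':'].data = function (el, index, match) {
            var expr = match[3];                     // e.g. "category=music" or "user.name.first=Tom"
            if (!expr) { return !!jQuery(el).data(); }
            var parts = expr.split('='),
                path  = parts[0].split('.'),         // dot notation for sub-properties
                value = jQuery(el).data(path.shift());
            while (value != null && path.length) {
                value = value[path.shift()];
            }
            return parts.length > 1 ? String(value) === parts[1] : value !== undefined;
        };

        // AND semantics by chaining filters on a single selector:
        // $('a:data(category=music):data(artist.name=Madonna)')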


  • Apache maximum request number 256?

    - by victor hugo
    I have a very good server running an Apache instance with mod_jk for proxying requests to an application server. I'm doing a load test, and although I'm sending over 600 requests, the status worker keeps showing this:

        256 requests currently being processed, 0 idle workers

    I'm using the prefork MPM:

        <IfModule prefork.c>
          ServerLimit          2048
          StartServers            5
          MinSpareServers         5
          MaxSpareServers        10
          MaxClients           1000
          MaxRequestsPerChild     0
        </IfModule>

    Is there a compiled-in limit of 256 requests in Apache, or what would I be missing?
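    One hedged check worth doing before assuming a compiled-in cap: the <IfModule prefork.c> block only takes effect if prefork really is the active MPM, and a ServerLimit change needs a full stop/start of Apache rather than a graceful restart. Something along these lines confirms which MPM is actually running:

        # static builds list the compiled-in MPM (prefork.c, worker.c, ...)
        httpd -l
        # most builds also report it directly as "Server MPM:"
        apachectl -V | grep -i mpm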


  • Sync Vs. Async Sockets Performance in .NET

    - by Michael Covelli
    Everything that I read about sockets in .NET says that the asynchronous pattern gives better performance (especially with the new SocketAsyncEventArgs, which saves on allocation). I think this makes sense if we're talking about a server with many client connections, where it's not possible to allocate one thread per connection. Then I can see the advantage of using the ThreadPool threads and getting async callbacks on them.

    But in my app, I'm the client and I just need to listen to one server sending market tick data over one TCP connection. Right now, I create a single thread, set the priority to Highest, and call Socket.Receive() with it. My thread blocks on this call and wakes up once new data arrives. If I were to switch this to an async pattern so that I get a callback when there's new data, I see two issues:

    1. The threadpool threads will have default priority, so it seems they will be strictly worse than my own thread, which has Highest priority.

    2. I'll still have to send everything through a single thread at some point. Say that I get N callbacks at almost the same time on N different threadpool threads notifying me that there's new data. The N byte arrays that they deliver can't be processed on the threadpool threads, because there's no guarantee that they represent N unique market data messages, since TCP is stream based. I'll have to lock, put the bytes into an array anyway, and signal some other thread that can process what's in the array. So I'm not sure what having N threadpool threads is buying me.

    Am I thinking about this wrong? Is there a reason to use the async pattern in my specific case of one client connected to one server?
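    For reference, a minimal sketch of the dedicated blocking-receive baseline described above (TickReceiver and ProcessTick are placeholder names introduced here, not from the question):

        // minimal sketch of a single high-priority blocking receive loop
        using System.Net.Sockets;
        using System.Threading;

        class TickReceiver
        {
            private readonly Socket _socket;                 // assumed to be already connected
            private readonly byte[] _buffer = new byte[64 * 1024];

            public TickReceiver(Socket connectedSocket) { _socket = connectedSocket; }

            public void Start()
            {
                var thread = new Thread(ReceiveLoop)
                {
                    IsBackground = true,
                    Priority = ThreadPriority.Highest        // the priority the async pattern gives up
                };
                thread.Start();
            }

            private void ReceiveLoop()
            {
                while (true)
                {
                    int read = _socket.Receive(_buffer);     // blocks until data arrives
                    if (read == 0) break;                    // remote side closed the connection
                    ProcessTick(_buffer, read);              // TCP is a stream: message framing happens here
                }
            }

            private void ProcessTick(byte[] data, int count)
            {
                // parse/reassemble market data messages from the raw bytes
            }
        }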


  • What's the best way to set up an IIS7 web farm for an ASP.NET application?

    - by sontek
    We are looking to set up an IIS7 web farm. We have two IIS7/Windows Server 2008 boxes that will act as the load-balanced web servers. How do you set up IIS/Windows Server 2008 to handle balancing the requests between the two servers? What is the best way to sync deployments so we only have to deploy to one place? Can we just have them sync their directory structures, or do we have to use a NAS/network share?
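    For the deployment-sync part, one commonly used low-tech option is to deploy to one box and mirror the application folder to the other. A sketch with placeholder server and path names:

        rem mirror the deployed app folder from WEB01 to WEB02 after each deployment
        robocopy \\WEB01\wwwroot\MyApp \\WEB02\wwwroot\MyApp /MIR /Z /R:2 /W:5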


  • Problems with jQuery load and getJSON only when using Chrome

    - by leftend
    I'm having an issue with two jQuery calls. The first is a "load" that retrieves HTML and displays it on the page (the returned code does include some JavaScript and CSS). The second is a "getJSON" that returns JSON; the JSON returned is valid. Everything works fine in every other browser I've tried, except Chrome on either Windows or Mac.

    The page in question is here: http://urbanistguide.com/category/Contemporary.aspx

    When you click on a restaurant name in IE/FF, you should see that item expand with more info, and a map displayed to the right. However, if you do this in Chrome, all you get is the JSON data printed to the screen.

    The first problem spot is when the "load" function is called here:

        var fulllisting = top.find(".listingfull");
        fulllisting.load(href2, function() {
            fulllisting.append("<div style=\"width:99%;margin-top:10px;text-align:right;\"><a href=\"#\" class=\"" + obj.attr("id") + "\">X</a>");
            itemId = fulllisting.find("a.listinglink").attr("id");
            ...

    In the above code, the callback function doesn't seem to get invoked. The second problem spot is when the "getJSON" function is called:

        $.getJSON(href, function(data) {
            if (data.error.length > 0) {
                //display error message
            } else {
                ...
            }

    In this case it just seems to follow the link instead of performing the callback. And yes, I am doing a "return false;" at the end of all of this to prevent the link from executing. All of the rest of the code is inline on that page if you want to view the source code. Any ideas? Thanks
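    One defensive variant worth trying (a sketch that reuses the listingfull/listinglink class names from the snippet above): call preventDefault() before any Ajax work, so that even if a script error in the handler stops execution before "return false" is reached in Chrome, the browser still cannot fall through to following the link:

        $('a.listinglink').click(function (event) {
            event.preventDefault();                      // block navigation up front, not via return false
            var fulllisting = $(this).closest('.listingfull');
            fulllisting.load(this.href, function (response, status) {
                if (status === 'error') {
                    fulllisting.text('Could not load listing details.');
                }
            });
        });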


  • Cost of exception handlers in Python

    - by Thilo
    In another question, the accepted answer suggested replacing a (very cheap) if statement in Python code with a try/except block to improve performance. Coding style issues aside, and assuming that the exception is never triggered, how much difference does it make (performance-wise) to have an exception handler, versus not having one, versus having a compare-to-zero if-statement?
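    A quick way to put numbers on this for a specific case is a timeit micro-benchmark. This is a minimal sketch of exactly the comparison described above (an if-test versus a try/except whose handler never fires), not a general benchmark:

        # minimal timeit sketch: "if" guard vs. try/except that never triggers
        import timeit

        setup = "x = 1"
        if_version  = "if x != 0:\n    y = 10 / x"
        try_version = "try:\n    y = 10 / x\nexcept ZeroDivisionError:\n    y = 0"

        print("if statement:", timeit.timeit(if_version, setup=setup, number=1000000))
        print("try/except:  ", timeit.timeit(try_version, setup=setup, number=1000000))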


  • How does glClear() improve performance?

    - by Jon-Eric
    Apple's Technical Q&A on addressing flickering (QA1650) includes the following paragraph. (Emphasis mine.) You must provide a color to every pixel on the screen. At the beginning of your drawing code, it is a good idea to use glClear() to initialize the color buffer. A full-screen clear of each of your color, depth, and stencil buffers (if you're using them) at the start of a frame can also generally improve your application's performance. On other platforms, I've always found it to be an optimization to not clear the color buffer if you're going to draw to every pixel. (Why waste time filling the color buffer if you're just going to overwrite that clear color?) How can a call to glClear() improve performance?


  • How to manage service failover?

    - by Jader Dias
    I am using Windows Network Load Balancing to keep my apps available even when one of the servers is down. But when all servers are up and one instance of a service on one of them is down, I would like not to send requests to that server, because those requests will be lost. Is there any solution that addresses this problem?


  • VSTS load test data source issues

    - by ashish.s
    Hello, I have a simple VSTS load test that uses a data source. The connection string for the source is as follows:

        <connectionStrings>
          <add name="MyExcelConn"
               connectionString="Driver={Microsoft Excel Driver (*.xls)};Dsn=Excel Files;dbq=loginusers.xls;defaultdir=.;driverid=790;maxbuffersize=4096;pagetimeout=20;ReadOnly=False"
               providerName="System.Data.Odbc" />
        </connectionStrings>

    With the data source configured, I am getting the following error:

        TestError  1,000  The unit test adapter failed to connect to the data source or to read the data. For more information on troubleshooting this error, see "Troubleshooting Data-Driven Unit Tests" (http://go.microsoft.com/fwlink/?LinkId=62412) in the MSDN Library. Error details:
        ERROR [42000] [Microsoft][ODBC Excel Driver] Cannot update. Database or object is read-only.
        ERROR [IM006] [Microsoft][ODBC Driver Manager] Driver's SQLSetConnectAttr failed
        ERROR [42000] [Microsoft][ODBC Excel Driver] Cannot update. Database or object is read-only.

    I wrote a test just to check whether I could create an ODBC connection, and that works:

        [TestMethod]
        public void TestExcelFile()
        {
            string connString = ConfigurationManager.ConnectionStrings["MyExcelConn"].ConnectionString;
            using (OdbcConnection con = new OdbcConnection(connString))
            {
                con.Open();
                System.Data.Odbc.OdbcCommand objCmd = new OdbcCommand("SELECT * FROM [loginusers$]");
                objCmd.Connection = con;
                OdbcDataAdapter adapter = new OdbcDataAdapter(objCmd);
                DataSet ds = new DataSet();
                adapter.Fill(ds);
                Assert.IsTrue(ds.Tables[0].Rows.Count > 1);
            }
        }

    Any ideas?
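    One way to narrow it down is to drive the same Excel source through the unit test adapter itself (the path the load test uses) rather than through a hand-built OdbcConnection. This is a hedged sketch only; the "username" column name is an assumption:

        using Microsoft.VisualStudio.TestTools.UnitTesting;

        [TestClass]
        public class LoginUsersDataTests
        {
            // MSTest injects the runtime context (including the current data row) here
            public TestContext TestContext { get; set; }

            [TestMethod]
            [DataSource("System.Data.Odbc",
                "Driver={Microsoft Excel Driver (*.xls)};Dsn=Excel Files;dbq=loginusers.xls;defaultdir=.;driverid=790;maxbuffersize=4096;pagetimeout=20",
                "loginusers$", DataAccessMethod.Sequential)]
            public void DataDrivenLoginUserRow()
            {
                string user = TestContext.DataRow["username"].ToString();   // "username" column is assumed
                Assert.IsFalse(string.IsNullOrEmpty(user));
            }
        }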


  • onComplete for Ajax.Request sometimes does not hide the load animation in Prototype

    - by TenJack
    I have an Ajax.Request in which I use onLoading and onComplete to show and hide a loading animation gif. The problem is that every 10 clicks or so, the loading animation fails to hide and just stays there animating, even though the Ajax request has returned successfully.

    I have a number of div elements, each with its own respective loading animation and an onclick with the Ajax.Request, that look like this:

        <div id="word_block_<%= word_obj.word %>" class="word_block" >
          <%= image_tag("ajax-loader_word_block.gif", :id => "load_animation_#{word_obj.word}", :style => 'display:none') %>
          <a href="#" onclick="new Ajax.Request('/test/ajax_load', {asynchronous:true, evalScripts:true,
              onLoading:function() { Element.show('load_animation_<%= word_obj.word %>')},
              onComplete:function(){ Element.hide('load_animation_<%= word_obj.word %>')}});
              return false;">Click Here</a>
        </div>

    Does it look like anything could be wrong with this? Maybe I should try removing the inline onclick and adding an onclick programmatically with JavaScript? I really have no idea why this keeps happening. I am using the Prototype library with Ruby on Rails.
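    A hedged sketch of the unobtrusive variant mentioned above, using Prototype's own event helpers so the spinner lookup and the show/hide stay together (the selectors assume the markup shown, with the spinner being the img inside each word_block div):

        document.observe('dom:loaded', function() {
          $$('div.word_block a').each(function(link) {
            link.observe('click', function(event) {
              event.stop();                                   // keep the '#' href from firing
              var spinner = link.up('div.word_block').down('img');
              new Ajax.Request('/test/ajax_load', {
                method: 'get',
                evalScripts: true,
                onLoading:  function() { spinner.show(); },
                onComplete: function() { spinner.hide(); }    // fires for success and failure
              });
            });
          });
        });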


  • Unable to load a huge XML document (I incorrectly supposed it was due to the XSLT processing)

    - by krisvandenbergh
    I'm trying to match certain elements using XSLT. My input document is very large, and the source XML fails to load after processing the following code (consider especially the first line):

        <xsl:template match="XMI/XMI.content/Model_Management.Model/Foundation.Core.Namespace.ownedElement/Model_Management.Package/Foundation.Core.Namespace.ownedElement">
          <rdf:RDF>
            <rdf:Description rdf:about="">
              <xsl:for-each select="Foundation.Core.Class">
                <xsl:for-each select="Foundation.Core.ModelElement.name">
                  <owl:Class rdf:ID="@Foundation.Core.ModelElement.name" />
                </xsl:for-each>
              </xsl:for-each>
            </rdf:Description>
          </rdf:RDF>
        </xsl:template>

    Apparently the XSLT fails to load after "Model_Management.Model". The PHP code is as follows:

        if ($xml->loadXML($source_xml) == false) {
            die('Failed to load source XML: ' . $http_file);
        }

    It fails in loadXML and immediately dies. I think there are two options now: 1) set a maximum execution time (frankly, I don't know how to do this for the built-in PHP 5 XSLT processor), or 2) think about another way to match.

    What would be the best way to deal with this? The input document can be found at http://krisvandenbergh.be/uml_pricing.xml

    Any help would be appreciated! Thanks.
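    On the execution-time question, a hedged sketch of the usual knobs around the loadXML() call shown above; LIBXML_PARSEHUGE only exists on newer PHP/libxml builds (hence the defined() guard), and the libxml error dump replaces the silent die():

        set_time_limit(0);                        // lift the max execution time for this request
        ini_set('memory_limit', '512M');          // a large DOM tree is usually memory-bound
        libxml_use_internal_errors(true);

        $ok = $xml->loadXML($source_xml, defined('LIBXML_PARSEHUGE') ? LIBXML_PARSEHUGE : 0);
        if (!$ok) {
            foreach (libxml_get_errors() as $error) {
                echo trim($error->message), ' (line ', $error->line, ")\n";
            }
            die('Failed to load source XML: ' . $http_file);
        }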


  • Virtual Earth Shape Rendering Performance

    - by Mike
    I am overlaying a transparent image on my VEMap control by rendering it as a single VEShape. The shape changes size dynamically depending on the zoom level of my map and can be as large as 4000*4000px. In older browsers such as IE6 and early versions of Firefox 2.x, map control performance degrades rapidly when my shape gets larger than 1500*1500px. The mouse pointer moves slowly and the map responds very slowly to events. I don't see this issue at all in newer browsers (IE7+). Are there any workarounds to boost performance of rendering a large shape for IE6 users?


  • Core Data Image Won't Load Into NSTableView Image Cell

    - by jcady
    In my code I am storing an image into my Core Data model (works fine). If I set up my view to have an NSImageView and bind its Data to Controller Key: selection and modelKeyPath: myImagePath, it works. It will show each image for the selected row. I then create a new column in my NSTableView and drag an image cell onto the column. However, I cannot get my Core Data bindings to have the image show up in the cell. I have tried binding both value and data, but no luck. Since I am sure the image is stored properly, what am I doing wrong in my binding to prevent the image from showing up in the table cell? Thanks so much. (My background: new Cocoa developer who has recently gone through the entire Hillegass book.)
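    If the attribute actually holds archived image data rather than a ready-made NSImage, one standard pattern (a hedged sketch; whether it applies depends on how the image is stored) is to route the column's Value binding through a value transformer that turns NSData into NSImage:

        // a sketch of an NSData-to-NSImage transformer for the table column's Value binding
        #import <Cocoa/Cocoa.h>

        @interface DataToImageTransformer : NSValueTransformer
        @end

        @implementation DataToImageTransformer
        + (Class)transformedValueClass { return [NSImage class]; }
        + (BOOL)allowsReverseTransformation { return NO; }
        - (id)transformedValue:(id)value
        {
            return value ? [[[NSImage alloc] initWithData:value] autorelease] : nil;
        }
        @end

        // register it once, early on, then enter "DataToImageTransformer" as the
        // Value Transformer for the column binding in Interface Builder:
        // [NSValueTransformer setValueTransformer:[[[DataToImageTransformer alloc] init] autorelease]
        //                                 forName:@"DataToImageTransformer"];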


  • TDD test data loading methods

    - by Dave Hanson
    I am a TDD newbie and I would like to figure out how to test the following code. I am trying to write my tests first, but I am having trouble creating a test that touches my DataAccessor; I can't figure out how to fake it. I've done the "extend the Shipment class and override the Load() method" trick to continue testing the object, but I feel as though I end up unit testing my mock objects/stubs and not my real objects. I thought that in TDD the unit tests were supposed to hit ALL of the methods on the object; however, I can never seem to test the real Load() code, only the overridden mock Load().

    My tests were to write an object that contains a list of orders based off of a shipment number. I have an object that loads itself from the database:

        public class Shipment
        {
            //member variables
            protected List<string> _listOfOrders = new List<string>();
            protected string _id = "";

            //public properties
            public List<string> ListOrders
            {
                get { return _listOfOrders; }
            }

            public Shipment(string id)
            {
                _id = id;
                Load();
            }

            //PROBLEM METHOD
            // whenever I write code that needs this Shipment object, this method tries
            // to hit the DB and fubars my tests
            // the only way to get around it is to have all my tests run on a fake Shipment object.
            protected void Load()
            {
                _listOfOrders = DataAccessor.GetOrders(_id);
            }
        }

    I created my fake Shipment class to test the rest of the class's methods, but I can't ever test the real Load method without having an actual DB connection:

        public class FakeShipment : Shipment
        {
            protected new void Load()
            {
                _listOfOrders = new List<string>();
            }
        }

    Any thoughts? Please advise.

    Dave
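    One way to make the real loading logic testable without a database (a hedged sketch; IDataAccessor and FakeDataAccessor are names introduced here, not from the original code) is to pass the data access dependency into the constructor instead of calling the static DataAccessor:

        // inject the data access dependency so the real Shipment, not a subclass, is tested
        using System.Collections.Generic;

        public interface IDataAccessor
        {
            List<string> GetOrders(string id);
        }

        public class Shipment
        {
            private readonly string _id;
            private readonly List<string> _listOfOrders;

            public List<string> ListOrders
            {
                get { return _listOfOrders; }
            }

            public Shipment(string id, IDataAccessor accessor)
            {
                _id = id;
                _listOfOrders = accessor.GetOrders(_id);   // same Load() logic, now reachable from a test
            }
        }

        // in the test project: a fake that never touches the database
        public class FakeDataAccessor : IDataAccessor
        {
            public List<string> GetOrders(string id)
            {
                return new List<string> { "order-1", "order-2" };
            }
        }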


  • Entity Framework Performance Problem

    - by Steve Horn
    I'm hoping that someone can help me understand how to overcome a performance problem I'm running into with the latest version of the Entity Framework. In my test, I created my model from a database consisting of around 80 tables.

    The problem is that the cost of the very first query I run on a thread is very expensive. If I run without pre-compiling views, the first query takes anywhere from 5800 to 6600 milliseconds. If I pre-compile the views (see this article), I can get the initial query cost down to about 2800 to 3200 milliseconds. Three seconds for each request is still unacceptable for my needs. Subsequent queries are very fast.

    Can you please help me understand how to eliminate the poor performance of the initial query? I'm using the version of Entity Framework that ships with Visual Studio 2010 RC.
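    If the first-query hit cannot be removed entirely, one mitigation (a hedged sketch with placeholder names; MyEntities and SomeTable are not from the question) is to pay it once at application start-up on a background thread, so the first real request does not:

        using System.Linq;
        using System.Threading;

        public static class EfWarmUp
        {
            public static void Start()
            {
                ThreadPool.QueueUserWorkItem(_ =>
                {
                    using (var context = new MyEntities())
                    {
                        // any trivial query forces metadata loading and view generation
                        var touch = context.SomeTable.FirstOrDefault();
                    }
                });
            }
        }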


  • Switching from Microsoft's STL to STLport

    - by Laserallan
    Hi! I'm using the STL quite heavily in performance-critical C++ code under Windows. One possible "cheap" way to get some extra performance would be to change to a faster STL library. According to this post, STLport is faster and uses less memory, but the post is a few years old. Has anyone made this change recently, and what were your results?


  • How to create data entry user interfaces in ASP.NET?

    - by Wael Dalloul
    Suppose that you have a big data entry web application like Microsoft CRM. What strategies and technologies would you follow to build a website like it? I don't want to use any dynamic web page generation software, because it has a lot of limitations. I also don't want to design every page and repeat everything. What's the best approach? Any ideas or guidelines on this issue? Thanks in advance.


  • Import XML to a database with high performance and an audit log: a best practice

    - by karthik
    Hi, I have to import big XML files into an MS SQL 2005 database using C#, with high performance. Even if a record fails in the middle, I have to move on to the next record and log the failed record for audit. I don't want to put an insert query inside a for loop. Could you please suggest the best way to do this? Bulk copy methods or DataAdapter update methods would be very nice, but if any record fails, doesn't execution of that statement break and roll back everything? Are there any alternatives and best practices, with examples? Would multi-threading help me improve performance? Please give an example. Thanks, Karthikeyan
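    A hedged sketch of the bulk-copy route, with per-record validation up front so a bad record is logged and skipped instead of rolling back the whole import. Table and column names ("Orders", "OrderId", "Amount") are placeholders, and for truly huge files an XmlReader stream would replace XDocument.Load:

        using System;
        using System.Data;
        using System.Data.SqlClient;
        using System.Xml.Linq;

        public static class XmlImporter
        {
            public static void Import(string xmlPath, string connectionString)
            {
                var table = new DataTable("Orders");
                table.Columns.Add("OrderId", typeof(int));
                table.Columns.Add("Amount", typeof(decimal));

                foreach (var record in XDocument.Load(xmlPath).Descendants("Order"))
                {
                    int id;
                    decimal amount;
                    if (int.TryParse((string)record.Attribute("id"), out id) &&
                        decimal.TryParse((string)record.Element("Amount"), out amount))
                    {
                        table.Rows.Add(id, amount);
                    }
                    else
                    {
                        // audit log hook: record skipped, import continues
                        Console.Error.WriteLine("Audit: skipped invalid record: {0}", record);
                    }
                }

                using (var bulk = new SqlBulkCopy(connectionString))
                {
                    bulk.DestinationTableName = "dbo.Orders";
                    bulk.BatchSize = 5000;      // each batch commits separately by default
                    bulk.WriteToServer(table);
                }
            }
        }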


  • iPhone plist data, large amounts of text and return key?

    - by user278342
    Basically, I've built my app using a plist. But within the data there are a few times when I need to press return and start a new line. The return key doesn't work in the plist editor. If I did it the old way it would be \n\n, but that doesn't work either. Is there an obvious way I'm overlooking? Or will it be a case of just pressing the space bar a lot? Thanks
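    One workaround often used when "\n" typed into the plist arrives as the two literal characters backslash-n (a hedged sketch; the dictionary and key names are placeholders) is to convert them into real newlines after loading:

        // convert literal backslash-n pairs from the plist into real newlines
        NSString *raw  = [dictionary objectForKey:@"bodyText"];
        NSString *text = [raw stringByReplacingOccurrencesOfString:@"\\n" withString:@"\n"];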


  • IE8 (compatibility mode) won't load my Ajax content

    - by Jens Roland
    Hi all, I am working on a jQuery script on http://www.qxl.dk/ and I can't seem to get IE7 (or more accurately, IE8 in IE7 compatibility mode) to load my content. The sidebar box on the right named "QXL Aktuelt" loads its HTML content from an external file using Ajax load(), then triggers a custom jQuery event ("aktuelt_loaded") that starts a carousel script (like a scrolling news ticker). Several other content sections on the same page are loaded through Ajax and they work just fine, so I'm wondering what's going wrong. Everything works as expected in Firefox 3.6 and IE8, but not in IE8's compatibility mode.

    The script that loads the Ajax content is (inline on the page):

        <div id="qxlaktueltHolder"></div>
        <script type="text/javascript">
        $("#qxlaktueltHolder").load("/contents/dk/modul/qxlaktuelt/qxlaktuelt.htm", function() {
            $("#qxlaktueltHolder").trigger("qxlaktuelt_loaded", []);
        });
        </script>
        <script type='text/javascript' src='http://www.qxl.dk/contents/dk/js/jcarousellite_1.0.1.min.js'></script>
        <script type='text/javascript' src='http://www.qxl.dk/contents/dk/js/qxlaktuelt_liveload.js'></script>

    The external script that responds to the event is in the following file: http://www.qxl.dk/contents/dk/js/qxlaktuelt_liveload.js

    All ideas are very welcome.
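    One thing that may or may not be the IE7-specific cause, but is cheap to rule out (a hedged sketch): if the inline load() completes before qxlaktuelt_liveload.js has registered its listener, the trigger fires before anyone is listening. Deferring the load until the document is ready guarantees the later script tags have already run:

        $(document).ready(function () {
            $('#qxlaktueltHolder').load(
                '/contents/dk/modul/qxlaktuelt/qxlaktuelt.htm',
                function () {
                    $('#qxlaktueltHolder').trigger('qxlaktuelt_loaded', []);
                }
            );
        });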


  • Load a MySQL innodb database into memory

    - by jack
    I have a MySQL InnoDB database at 1.9GB, as shown by the following query:

        SELECT table_schema "Data Base Name",
               sum( data_length + index_length ) / 1024 / 1024 "Data Base Size in MB",
               sum( data_free ) / 1024 / 1024 "Free Space in MB"
        FROM information_schema.TABLES
        GROUP BY table_schema;

        +--------------------+----------------------+------------------+
        | Data Base Name     | Data Base Size in MB | Free Space in MB |
        +--------------------+----------------------+------------------+
        | database_name      |        1959.73437500 |   31080.00000000 |
        +--------------------+----------------------+------------------+

    My questions are:

    1. Does it mean that if I set innodb_buffer_pool_size to 2GB or larger, the whole database can be loaded into memory, so far fewer reads from disk are needed?
    2. What does the free space of 31GB mean?
    3. If the maximum RAM that can be allocated to innodb_buffer_pool_size is 1GB, is it possible to specify which tables are loaded into memory while others are always read from disk?

    Thanks in advance.
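    On question 1, a hedged sketch of how to check whether the pool is actually big enough once it has been resized; whether a 2GB pool holds the whole working set still depends on what else competes for it:

        -- in my.cnf under [mysqld], then restart the server:
        --   innodb_buffer_pool_size = 2G

        -- configured size and current page usage (InnoDB pages are 16KB by default)
        SHOW GLOBAL VARIABLES LIKE 'innodb_buffer_pool_size';
        SHOW GLOBAL STATUS LIKE 'Innodb_buffer_pool_pages_%';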


  • THE FASTEST Smarty Cache Handler

    - by rob.effect
    Does anyone know if there is an overview of the performance of different cache handlers for Smarty? I compared the Smarty file cache with a memcache handler, but memcache seemed to have a negative impact on performance. I figured there would be a faster way to cache than through the filesystem... am I wrong?

