Search Results

Search found 6805 results on 273 pages for 'fast formula'.

Page 188/273 | < Previous Page | 184 185 186 187 188 189 190 191 192 193 194 195  | Next Page >

  • Oracle, slow performance when using sub select

    - by Wyass
    I have a view that is very slow if you fetch all rows, but if I select a subset (providing an ID in the WHERE clause) the performance is very good. I cannot hardcode the ID, so I use a subselect to get it from another table. The subselect returns only one ID, yet now the performance is very slow again, and it seems Oracle evaluates the whole view before applying the WHERE clause. Can I somehow help Oracle so that queries 2 and 3 below have the same performance? I'm using Oracle 10g.

        1. slow: select * from ci.my_slow_view
        2. fast: select * from ci.my_slow_view where id = 1;
        3. slow: select * from ci.my_slow_view where id in (select id from active_ids)

    Read the article

  • Help with concept - filters and number of items

    - by dreamer
    Please check http://www.alibaba.com/catalogs/cid/702/Laptops.html - they have a nice filter there with the number of items for each entry. Note one detail: they also show locations. The same thing appears on olx.com - location and number of items for each category. Now imagine I have these tables: [products] (Id, Name, CategoryId, LocationId), [Categories] (Id, Name), [Location] (Id, Name). My question: how can I do the same, given that counting items, even with caching, looks expensive? And yet they return results pretty fast... Please advise on possible ways to do this in ASP.NET, C#, MVC and MS SQL, but avoid simple answers like "count and change". Thank you in advance.

    Read the article

  • Retrieving data from database. Retrieve only when needed or get everything?

    - by RHaguiuda
    I have a simple application to store Contacts. It uses a simple relational database to store contact information such as Name, Address and other data fields. While designing it, a question came to my mind: when designing programs that use databases, should I retrieve all database records and store them in objects in my program, so that performance is very fast, or should I always fetch data only when it is required? Of course, retrieving all data is only an option if there isn't too much of it, but do you use this approach when you are sure the database will stay small (fewer than 300 records, for example)? I once designed a similar application that fetched data only when needed, but it was slow (using an Access database). Thanks for all help. (See the sketch after this entry.)

    Read the article
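
    Regarding the question above, a minimal sketch of the load-everything-once approach, written in Java with illustrative Contact/ContactRepository types (the asker's actual stack isn't specified beyond the Access database). For a few hundred rows the whole table fits comfortably in memory; the main thing to remember is to refresh the cache after writes.

        import java.util.List;
        import java.util.Map;
        import java.util.concurrent.ConcurrentHashMap;

        // Illustrative sketch: Contact and ContactRepository stand in for the asker's own types.
        public class ContactCache {
            public interface Contact { int getId(); }
            public interface ContactRepository { List<Contact> findAll(); }

            private final Map<Integer, Contact> byId = new ConcurrentHashMap<>();
            private final ContactRepository repository;

            public ContactCache(ContactRepository repository) {
                this.repository = repository;
                refresh();                                  // one SELECT loads all ~300 rows up front
            }

            public final void refresh() {                   // call again after inserts/updates
                byId.clear();
                for (Contact c : repository.findAll()) {
                    byId.put(c.getId(), c);
                }
            }

            public Contact get(int id) {                    // served from memory, no database round trip
                return byId.get(id);
            }
        }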

  • Two Android threads and unsynchronized data

    - by Sponge
    I have a (perhaps stupid) question: I'm using two threads, one that writes floats and one that reads those floats permanently. My question is: what is the worst that could happen if I don't synchronize them? It would be no problem if some of the values were not quite current, because they only change a little with each write. I'm running the application this way at the moment and don't have any problems, so I want to know what the worst case is. Would a read/write conflict mean that a number like 12345 that is being overwritten with 54321 could be read at the same moment as, say, 54345, or could something worse happen? (I don't want to use synchronization, to keep the code as fast as possible; see the sketch after this entry.)

    Read the article
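
    For the Android/Java situation described above, a minimal sketch (class and method names are illustrative): a 32-bit float cannot be observed half-written in Java, so a value like 12345 being replaced by 54321 will never be read as 54345, but without synchronization or volatile the reading thread has no visibility guarantee and may keep seeing a stale value. Marking the field volatile gives visibility without the cost of locks.

        // Minimal sketch: one thread calls write(), the other calls read().
        public class SharedFloat {
            // volatile guarantees the reader sees the latest write; float accesses
            // are already atomic in Java, so no "mixed" values can appear either way.
            private volatile float value;

            public void write(float v) {
                value = v;
            }

            public float read() {
                return value;
            }
        }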

  • MPMoviePlayerController seems to make 2 calls for each movie

    - by user76328
    I seem to have an issue where an iPhone app using MPMoviePlayerController makes two calls to the server for each video it plays back. This occurs with the iPhone 3.x OS and libraries but not with iPhone 2.x. I know the iPhone does progressive download and will make multiple 206 requests, etc., but as far as our back end is concerned the player appears to create two separate sessions. This only seems to affect iPhone native apps, not videos played through Safari. Additional info from Apple: iPhone OS 3.0 added support for streaming audio and video over HTTP, and MPMoviePlayerController must validate the media before playback to determine whether it is streaming content or progressively downloaded content. This is the delay you are experiencing. On a fast network, the delay should be minimized. Is this double check causing two sessions to be created for each video request? Is anyone else seeing the same issue? Is there a remedy?

    Read the article

  • Is Python good for highload web projects?

    - by Vitali Fokin
    Hello! I have decided to start my own web project. It will be a high-load project, and I can't decide which technologies I should use. I'm good at ASP.NET MVC, but I like languages like Python more than C#. I have read a lot about Python and Django/Pylons/etc., but I didn't find any good examples of high-load projects in Python. So, the question is: is Python good for a high-load project? Is it fast enough? And if it is, are Python frameworks like Django/Pylons/etc. good for this, or would ASP.NET MVC be the better choice? PS: I'm not interested in Java, Ruby or PHP :) So I'm choosing only between Python + Django/Pylons/etc. and ASP.NET MVC. Thanks in advance. Please don't start holy wars :)

    Read the article

  • How to handle a large table in MySQL?

    - by Frantz Miccoli
    I have a database used to store items and properties about these items. The number of properties is extensible, so there is a join table that stores each property value associated with an item:

        CREATE TABLE `item_property` (
            `property_id` int(11) NOT NULL,
            `item_id` int(11) NOT NULL,
            `value` double NOT NULL,
            PRIMARY KEY (`property_id`,`item_id`),
            KEY `item_id` (`item_id`)
        ) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;

    This database has two goals: storing (which has first priority and has to be very quick; I would like to perform many inserts, hundreds, in a few seconds) and retrieving data (selects using item_id and property_id; this is second priority and can be slower, but not too much, because that would ruin my usage of the DB). Currently this table holds 1.6 billion entries and a simple count can take up to 2 minutes... Inserting isn't fast enough to be usable. I'm using Zend_Db to access my data and would really appreciate it if you didn't suggest that I develop any PHP-side part. Thanks for your advice!

    Read the article

  • Best indexing strategy for several varchar columns in Postgres

    - by Corey
    I have a table with 10 columns that need to be searchable (the table itself has about 20 columns). The user will enter query criteria for at least one of the columns, but possibly all ten; every non-empty criterion is then combined into an AND condition. Suppose the user provided non-empty criteria for column1, column4 and column8; the query would be:

        select * from the_table
        where column1 like '%column1_query%'
          and column4 like '%column4_query%'
          and column8 like '%column8_query%'

    So my question is: am I better off creating one index with 10 columns, or 10 indexes with one column each? Or do I need to find out which sets of columns are queried together frequently and create indexes for them (an index on columns 1, 4 and 8 in the case above)? If my understanding is correct, a single index of 10 columns would only work effectively if all 10 columns are in the condition. I'm open to any suggestions here; the row count of the table is only expected to be around 20-30K rows, but I want to make sure any and all searches on the table are fast. Thanks!

    Read the article

  • Treating a fat webservice in .net 3.5 c#

    - by Chris M
    I'm dealing with an obese third-party webservice that returns about 3 MB of data for a simple search result, and about 50% of the data in that response is junk. Would it make sense, then, to remap this data to my own result object and ditch the response, so I'm storing 1-2 MB in memory for filtering and sorting rather than keeping the web response's own object and using 2-4 MB? Or am I missing a point? So far I've been accessing the webservice from a separate project and using a new class to provide the interaction and handle the persistence, so my solution looks like this:

        |- Web (mvc2 proj)
        |- DAL (database/storage fluent-nhibernate)
        |- SVCGateway (interaction layer + webservice related models)
        |- Services
        --------------
        |- Tests
        |- Specs

    I'm trying to make the application feel fast, and I also need to store the result set temporarily in case a customer goes to view a product and wants to go back to the results (the service returns only 500 of a possible 14K results). So basically I'm looking for confirmation that pushing the results into my own objects is the right thing to do, or whether I'm breaking some rule, or whether there's a better way of handling it (see the sketch after this entry). Thanks

    Read the article
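
    A sketch of the remapping idea discussed above, with hypothetical types standing in for the third-party service's classes (the real project is C#/MVC2, but the shape of the pattern is the same): copy only the fields that filtering, sorting and the results page actually need into a slim object, and let the 3 MB response become garbage.

        import java.util.ArrayList;
        import java.util.List;

        // Hypothetical third-party response item: most of its payload is never used downstream.
        class VendorSearchItem {
            String sku;
            String title;
            double price;
            String bulkyRawData;   // the junk half of the response
        }

        // Slim internal result: only what the UI and sorting/filtering need.
        public class SlimResult {
            final String sku;
            final String title;
            final double price;

            SlimResult(String sku, String title, double price) {
                this.sku = sku;
                this.title = title;
                this.price = price;
            }

            static List<SlimResult> fromVendorItems(List<VendorSearchItem> items) {
                List<SlimResult> results = new ArrayList<>(items.size());
                for (VendorSearchItem item : items) {
                    results.add(new SlimResult(item.sku, item.title, item.price));
                }
                return results;    // cache this list between requests; drop the vendor objects
            }
        }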

  • Anyone Know a Great Sparse One Dimensional Array Library in Python?

    - by TheJacobTaylor
    I am working on an algorithm in Python that uses arrays heavily. The arrays are typically sparse and are read from and written to constantly. I am currently using relatively large native arrays; the performance is good, but the memory usage is high (as expected). I would like an array implementation that does not waste space on values that are never used and that allows an index offset other than zero. For example, if my indices start at 1,000,000, I would like to index my array starting at 1,000,000 and not be required to waste memory on a million unused values. Array reads and writes need to be fast. Expanding into new territory can incur a small delay, but reads and writes should be O(1) if possible. Does anybody know of a library that can do this? Thanks!

    Read the article

  • Annoying Bug in banner cycle

    - by Davemof
    I have created a rotating banner script for a new site I'm developing (view the banner here; it rotates every 10 seconds). Unfortunately the transition seems to be a bit buggy: the image fades out, shows the same image again and then fades in the new one. I think I have made a simple error somewhere, but I can't figure out where it is. The code used to cycle the banners is, inside document ready:

        if ($('.home').length > 0){
            $('<img width="100%" />').attr('src', '/assets/img/backgrounds/home/hero'+homecount+'.jpg').load(function(){
                $('.hero').append( $(this) );
                $('.hero img').fadeIn('medium').delay(10000).fadeOut('slow', loopImages);
                setHeroHeight();
            });
        }

    and outside document ready:

        function loopImages(){
            homecount = homecount+1;
            if (homecount > 5){
                homecount = 1;
            }
            $('.hero img')
                .attr('src', '/assets/img/backgrounds/home/hero'+homecount+'.jpg')
                .load(function(){ $('.hero img').fadeIn('fast') }).delay(10000).fadeOut('slow', loopImages);
        }

    Any help would be greatly appreciated. Thanks, Dave

    Read the article

  • Dynamically generating high performance functions in clojure

    - by mikera
    I'm trying to use Clojure to dynamically generate functions that can be applied to large volumes of data - i.e. a requirement is that the functions be compiled to bytecode in order to execute fast, but their specification is not known until run time. For example, suppose I specify functions with a simple DSL like:

        (def my-spec [:add [:multiply 2 :param0] 3])

    I would like to create a function compile-spec such that:

        (compile-spec my-spec)

    would return a compiled function of one parameter x that returns 2x+3. What is the best way to do this in Clojure?

    Read the article

  • Is memcached a dinosaur in comparison to Redis?

    - by Industrial
    Hi everyone, I have worked quite a bit with memcached over the last few weeks and just found out about Redis. When I read this part of their readme, I suddenly got a warm, cozy feeling in my stomach: "Redis can be used as a memcached on steroids because is as fast as memcached but with a number of features more. Like memcached, Redis also supports setting timeouts to keys so that this key will be automatically removed when a given amount of time passes." This sounds amazing. I also found this page with benchmarks: http://www.ruturaj.net/redis-memcached-tokyo-tyrant-mysql-comparison So, honestly, is memcached really such an old dinosaur that it is a bad choice from a performance perspective when compared to this newcomer called Redis? I hadn't heard a lot about Redis before, hence this question!

    Read the article

  • C# TCP First Message Delay

    - by ikurtz
    Greetings, I am writing a socket program in C# using asynchronous sockets. The issue is: when a client connects to the server, the connection itself happens quite fast, but when the first message is sent there is a delay in the response. This only happens with the very first piece of data sent over the connection, and both client and server suffer from this behaviour. What is this delay? Is there a way to get rid of it? (See the sketch after this entry.) Many thanks.

    Read the article
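
    One classic cause of a delay on the very first small send over a fresh TCP connection is Nagle's algorithm interacting with delayed ACKs; whether that is what is happening here cannot be told from the description alone. If it is, disabling Nagle is a one-line change. A minimal sketch in Java follows (in .NET the equivalent is the Socket.NoDelay / TcpClient.NoDelay property); the host and port are placeholders.

        import java.io.OutputStream;
        import java.net.Socket;
        import java.nio.charset.StandardCharsets;

        public class NoDelayClient {
            public static void main(String[] args) throws Exception {
                try (Socket socket = new Socket("example.com", 9000)) {   // placeholder endpoint
                    socket.setTcpNoDelay(true);    // disable Nagle's algorithm on this socket
                    OutputStream out = socket.getOutputStream();
                    out.write("first message".getBytes(StandardCharsets.US_ASCII));
                    out.flush();                   // the small first packet is sent immediately
                }
            }
        }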

  • jquery: animating html5 video element?

    - by mathiregister
    Is it possible to animate an HTML5 video element?

        <div id="cont">
          <video id="movie" width="320" height="240" preload controls>
            <source src="pr6.ogv" type='video/ogg; codecs="theora, vorbis"'>
            <source src="pr6.mp4" type='video/mp4; codecs="avc1.42E01E, mp4a.40.2"'>
            <source src="pr6.webm" type='video/webm; codecs="vp8, vorbis"'>
          </video>
        </div>
        <script type="text/javascript">
          $(document).ready(function(){
            $('#cont').animate({"left": "500px"}, "fast");
            //$('#movie').css("left", "300px");
          });
        </script>

    This does not seem to work. Thank you for your help.

    Read the article

  • What good approaches do C++ programmers use for storing error messages?

    - by Narek
    Say I have a huge codebase with different kinds of error messages, and I want a separate place where I store the error codes and error messages. For example, for an error that occurred because the program could not open a file, I store:

        F001 "Cannot open a file."
             "The same error message in another language"
             "The same error message in a third language"

    What is the best way for a C++ programmer to store the different kinds of error messages and codes in a file so that they can be used in a program quickly and easily? FYI, I am working with the Qt library.

    Read the article

  • Is it possible to upload directly to a remote server using SFTP in ASP.NET MVC?

    - by DucDigital
    Hi! I am currently developing something with ASP.NET MVC. I'm still not very experienced with it, so please help me out. I have a form for users to upload video. The current concept for getting a file onto the remote server is to upload it to the current server and then use FTP to push it to the remote server. To me this is not very fast, since you upload to the current server (time x1) and then the current server pushes it to the new server (time x2), so it doubles the time. So my idea is to have the user upload to the current server and, WHILE the user is uploading, have the current server add the file to the DB and also send the file to the remote server at the same time using SFTP... Is this possible, and are there any security holes in this concept? Thank you very much.

    Read the article

  • Playing animations in sequence in Objective-C

    - by Ohmnastrum
    I'm trying to play animations in sequence, but I'm having issues playing them as a for loop iterates through the list of objects in an array: it will move through the array, but it won't play each one, it just plays the last...

        -(void) startGame {
            gramma.animationDuration = 0.5;
            // Repeat forever
            gramma.animationRepeatCount = 1;
            int r = arc4random() % 4;
            [colorChoices addObject:[NSNumber numberWithInt:r]];
            int anInt = [[colorChoices objectAtIndex:0] integerValue];
            NSLog(@"%d", anInt);
            for (int i = 0; i < colorChoices.count; i++) {
                [self StrikeFrog:[[colorChoices objectAtIndex:i] integerValue]];
                //NSLog(@"%d", [[colorChoices objectAtIndex:i] integerValue]);
                sleep(1);
            }
        }

    It moves through the whole cycle really fast, and sleep isn't doing anything to allow each animation to play... Any suggestions?

    Read the article

  • How to build up a lookup table in a microcontroller?

    - by GiGi
    Hi, I am confused about how to build up a three-dimensional lookup table. Is the template as follows: create a three-dimensional array to store the data, then create a linked list, then create an 'insert' function to put all the data into the array? Some books say the linked list should be static const; do I need to create another function to expand the list? Because the lookup table will be used in a microcontroller, it only needs to support putting the data into the array once; after that, whenever I want to find a value, the search should be fast and easy. Could you help me with this and give me some suggestions? Thank you.

    Read the article

  • How do I use Declarations (type, inline, optimize) in Scheme?

    - by kunjaan
    How do I declare the types of the parameters in order to circumvent type checking? How do I tell the compiler to optimize for speed, so that the function runs as fast as possible, like (optimize speed (safety 0))? How do I make an inline function in Scheme? How do I use an unboxed representation of a data object? And finally, are any of these important or necessary? Can I depend on my compiler to make these optimizations? Thanks, kunjaan.

    Read the article

  • What is the technological basis for Azure DocumentDB?

    - by user1703840
    Microsoft announced the availability of Azure DocumentDB as follows: a fully-managed, highly-scalable, NoSQL document database service with

        - rich query over a schema-free JSON data model
        - transactional execution of JavaScript logic
        - scalable storage and throughput
        - tunable consistency
        - rapid development with familiar technologies
        - a blazingly fast and write-optimized database service

    I really like the "transactional execution of JavaScript logic"; it sounds like an approach similar to that of PostgreSQL NoSQL. Does anyone know what the technological basis for the Azure DocumentDB service is? SQL Server?

    Read the article

  • Where is the chink in Google Chrome's armor?

    - by kudlur
    While browsing with Chrome, I noticed that it responds extremely fast (in comparison with IE and Firefox on my laptop) in terms of rendering pages, including JavaScript-heavy sites like Gmail. This is what the Google book on Chrome has to say:

        - tabs are hosted in a process rather than a thread
        - JavaScript is compiled using the V8 engine as opposed to interpreted
        - a new virtual machine was introduced to support JavaScript-heavy apps
        - "hidden class transitions" and dynamic optimization are applied to speed things up
        - the inefficient "conservative garbage collection" scheme is replaced with a more precise garbage collection scheme
        - they introduce their own task scheduler and memory manager to manage the browser environment

    All this sounds so familiar; Microsoft has been doing such things for a long time: the Windows OS, the C++ and C# compilers, the CLR, and so on. So why isn't Microsoft, or any other browser vendor, taking Chrome's approach? Is there a flaw in Chrome's approach? If not, has the rest of the browser vendor community been caught unaware by Google's approach?

    Read the article

  • Twitter search c#

    - by Lily
    Hi, I have implemented a method that manually scrapes the Twitter search page and gets the tweets from the different result pages. But since there is a fast refresh rate, the method triggers an exception, so I have decided to use the TweetSharp API instead:

        var search = FluentTwitter.CreateRequest()
            .AuthenticateAs(TWITTER_USERNAME, TWITTER_PASSWORD)
            .Users().SearchFor("dumbledore");
        var result = search.Request();
        var users = result.AsUsers();

    This code was on the site. Does anyone know how I can avoid giving my credentials and retrieve tweets from all users, not just the ones I have as friends? Thanks!

    Read the article

  • I need an efficient protocol between webservices that is more or less supported by all major languages

    - by corgrath
    Hey all. I am looking for a fast and efficient protocol that can be used between different web services to send text data (not binary data). It doesn't matter whether the protocol itself is binary or text based. Some conditions:

        - It has to be more "efficient" than normal XML, which adds a lot of extra data and whose read/write tooling is too heavy.
        - It has to be "supported" by most major languages, meaning it cannot be available for only one specific language. At the moment, both Java and PHP have to be able to talk to each other using this protocol.

    I have already looked at:

        - XML, which I am currently using
        - Hessian 2, which works perfectly in Java, but the PHP support is out of date
        - JSON, where the difference from XML is only minor

    Any suggestions are welcome! (See the sketch after this entry.)

    Read the article
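
    For reference, since the question above names Java and PHP specifically: JSON (already on the list) does meet the "supported by most major languages" condition with very small tooling on both sides (PHP ships json_encode()/json_decode(); Java has several small libraries). A minimal Java-side round trip using the org.json library might look like this sketch; the field names are only illustrative.

        import org.json.JSONObject;

        public class JsonRoundTrip {
            public static void main(String[] args) throws Exception {
                // Encode a small text payload to send to the PHP service.
                String payload = new JSONObject()
                        .put("action", "search")          // illustrative fields
                        .put("query", "fast formula")
                        .toString();

                // Decode a response the PHP side could have produced with json_encode().
                JSONObject response = new JSONObject("{\"status\":\"ok\",\"count\":3}");
                System.out.println(payload + " -> " + response.getString("status")
                        + " (" + response.getInt("count") + " hits)");
            }
        }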

  • More elegant way to parse inline variables in strings

    - by Tom
    Currently I have this:

        function parse_string($string, $variables){
            extract($variables);
            return eval('return "'. addcslashes($string, '"') .'";');
        }

    So I can input this string:

        'Hi {$name}, my name is {$own_name}'

    together with this array:

        array('name' => 'John', 'own_name' => 'Tom')

    and get this back:

        'Hi John, my name is Tom'

    I've never liked this eval() approach, but it works and it's fast (faster than regex at least). Question: is there a more elegant way to do this (faster than using regex) in PHP5?

    Read the article
