Search Results

Search found 11362 results on 455 pages for 'big o analysis'.


  • PyQt4, QThread and opening big files without freezing the GUI

    - by jmrbcu
    Hi, I would like to ask how to read a big file from disk while keeping the PyQt4 UI responsive (not blocked). I have moved the loading of the file into a QThread subclass, but my GUI thread still freezes. Any suggestions? I think it must be something to do with the GIL, but I don't know how to sort it out. EDIT: I am using vtkGDCMImageReader from the GDCM project to read a multiframe DICOM image and display it with VTK and PyQt4. I do the load in a different thread (QThread), but my app freezes until the image is loaded. Here is some example code:

        class ReadThread(QThread):
            def __init__(self, file_name):
                super(ReadThread, self).__init__()
                self.file_name = file_name
                self.reader = vtkgdcm.vtkGDCMImageReader()

            def run(self):
                self.reader.SetFileName(self.file_name)
                self.reader.Update()
                self.emit(QtCore.SIGNAL('image_loaded'), self.reader.GetOutput())
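    A minimal, self-contained sketch of the same worker-thread pattern without VTK can help isolate whether the freeze comes from the thread wiring or from the reader itself (plain file I/O releases the GIL, so this version should never block the GUI). The file name, signal name and chunk size below are illustrative assumptions, not taken from the question:

        from PyQt4 import QtCore

        class FileLoader(QtCore.QThread):
            """Reads a file in chunks on a worker thread and signals the GUI when done."""

            def __init__(self, file_name, parent=None):
                super(FileLoader, self).__init__(parent)
                self.file_name = file_name
                self.data = None

            def run(self):
                chunks = []
                with open(self.file_name, 'rb') as f:
                    while True:
                        chunk = f.read(1024 * 1024)   # 1 MB at a time; plain file I/O releases the GIL
                        if not chunk:
                            break
                        chunks.append(chunk)
                self.data = b''.join(chunks)
                # short-circuit (Python-only) signal, mirroring the question's emit
                self.emit(QtCore.SIGNAL('file_loaded'), self.file_name)

        # Wiring from the GUI thread (on_loaded is whatever slot updates the UI):
        #   loader = FileLoader('/path/to/big.dcm')
        #   QtCore.QObject.connect(loader, QtCore.SIGNAL('file_loaded'), on_loaded)
        #   loader.start()   # never call loader.run() directly, or it runs in the GUI thread

    If this version stays responsive while the vtkGDCMImageReader version still freezes, the blocking is happening inside the reader (which holds the GIL), not in the QThread setup.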

    Read the article

  • What is the big deal with SSL?

    - by xarzu
    What is the big deal with SSL? My internet website hosting provider has sold me an SSL line. All I know is that for what I want to do with PayPal, I need to have a folder whose address begins with https://, and that this is what an SSL line is. But it seems that they are having a hard time setting it up. I wonder if I can just go ahead and do it myself. I mean, some third party has sent me a confirmation and even what seems to be some sort of long numeric certificate.

    Read the article

  • How to save big "database-like" class in python

    - by Rafal
    Hi there, I'm doing a project with a reasonably big database. It's not a proper DB file, but a class with a format like this: DataBase.Nodes.Data = [[] for i in range(1, 1000)]. All together this database is something like a few thousand rows. First question - is the way I'm doing it efficient, or is it better to use SQL or some other "proper" DB, which I've never actually used? And the main question - I'd like to save my DataBase class with all records and then re-open it with Python in another session. Is that possible, and what tool should I use? cPickle - it seems to be only for strings - or something else? In MATLAB there's a very useful feature called "save workspace" - it saves all your variables to a file that you can open in another session - this would be very useful in Python!
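    A minimal sketch of the usual approach: the pickle/cPickle module serialises arbitrary Python objects (not just strings), so the whole class instance can be dumped to disk and loaded back in a later session. The class body and file name below are made up for illustration:

        import pickle

        class DataBase(object):
            """Toy stand-in for the question's class; any picklable attributes work."""
            def __init__(self):
                self.nodes_data = [[] for i in range(1000)]

        db = DataBase()
        db.nodes_data[0].append('some record')

        # Save the whole object to disk...
        with open('database.pkl', 'wb') as f:
            pickle.dump(db, f, protocol=pickle.HIGHEST_PROTOCOL)

        # ...and load it back in another session.
        with open('database.pkl', 'rb') as f:
            db_again = pickle.load(f)

        print(db_again.nodes_data[0])   # ['some record']

    The shelve module builds on the same machinery when a persistent dictionary (closer to MATLAB's "save workspace") is a better fit.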

    Read the article

  • Handling "Big" Integers in C#

    - by priyanka.sarkar
    How do I handle big integers in C#? I have a function that gives me the product of divisors:

        private static int GetDivisorProduct(int N, int product)
        {
            for (int i = 1; i < N; i++)
            {
                if (N % i == 0)
                {
                    Console.WriteLine(i.ToString());
                    product *= i;
                }
            }
            return product;
        }

    The calling code is GetDivisorProduct(N, 1). If the result is bigger than 4 digits, I should obtain only the last 4 digits (e.g. for an input of 957 the output is 7493, the last four digits of the actual result 876467493; for an input of 10000 the output is 0). The BigInteger class has been removed from the C# library! How can I get the last four digits?
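    Since only the last four digits are wanted, the usual trick is to reduce the running product modulo 10000 at every step; the intermediate values then never overflow and no big-integer type is needed at all. A quick sketch of the idea, written in Python purely for illustration (the same product = (product * i) % 10000 goes inside the C# loop):

        def divisor_product_last4(n):
            """Product of the proper divisors of n, keeping only the last 4 digits."""
            product = 1
            for i in range(1, n):
                if n % i == 0:
                    # (a*b) % m == ((a%m) * (b%m)) % m, so reducing at every step is safe
                    product = (product * i) % 10000
            return product

        print(divisor_product_last4(957))     # 7493
        print(divisor_product_last4(10000))   # 0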

    Read the article

  • Handling big user IDs returned by FQL in PHP

    - by ggambett
    I'm using FQL to retrieve a list of users from Facebook. For consistency I get the result as JSON. This causes a problem - since the returned JSON encodes the user IDs as numbers, json_decode() converts these numbers to floating point values, because some are too big to fit in an int; of course, I need these IDs as strings. Since json_decode() does its own thing without accepting any behavior flags, I'm at a loss. Any suggestions on how to resolve this?

    Read the article

  • Immutability of big objects

    - by Malax
    Hi StackOverflow! I have some big (more than 3 fields) objects which can and should be immutable. Every time I run into that case I tend to create constructor abominations with long parameter lists. It doesn't feel right, it is hard to use, and readability suffers. It is even worse if the fields are some sort of collection type, like lists. A simple addSibling(S s) would ease object creation so much, but it renders the object mutable. What do you use in such cases? I'm on Scala and Java, but I think the problem is language agnostic as long as the language is object oriented. Solutions I can think of: constructor abominations with long parameter lists, or the Builder pattern. Thanks for your input!
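    The Builder pattern the question mentions keeps all mutation inside a throwaway helper and hands back a fully immutable object at the end. A small sketch of the idea, written in Python for brevity (the shape translates directly to Java or Scala; the Person fields are invented for illustration):

        from dataclasses import dataclass
        from typing import Tuple

        @dataclass(frozen=True)
        class Person:
            """Immutable value object; every field is fixed at construction time."""
            name: str
            age: int
            siblings: Tuple[str, ...] = ()

        class PersonBuilder:
            """Mutable helper that accumulates fields, then emits an immutable Person."""
            def __init__(self):
                self._name = None
                self._age = None
                self._siblings = []

            def name(self, name):
                self._name = name
                return self          # returning self lets calls be chained

            def age(self, age):
                self._age = age
                return self

            def add_sibling(self, sibling):
                self._siblings.append(sibling)   # mutation stays inside the builder
                return self

            def build(self):
                return Person(self._name, self._age, tuple(self._siblings))

        p = PersonBuilder().name('Ada').age(36).add_sibling('Charles').build()
        # p.age = 40   # would raise FrozenInstanceError: the built object is immutable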

    Read the article

  • Best approach to show big amount of "grid" data

    - by Jorge Ramírez
    Hello all. I am building an application for Android (1.5) that, after querying a web service, shows the user a large amount of data that should be displayed in a "grid" or "table" style. I must show a result of about 7 columns and 50 rows (for example a customer list with names, addresses, telephone numbers, sales amount last year and so on). Obviously, the 7 columns will not fit on the screen, and I would like the user to be able to scroll up/down and LEFT/RIGHT (important because of the number of columns) to explore the grid results. Cell-level selection is NOT necessary; at most I would need row-level selection. What is the best approach to get this interface element? ListView / GridView / TableLayout? Thanks

    Read the article

  • asp.net C# uploading big file and processing it

    - by JewelThief
    I want to be able to upload a file from my .aspx page to my web server so that it can be processed into a different format; e.g. the user uploads a doc and in a few seconds sees a PDF version of the doc on the web page. I have a web service available which can convert doc to PDF. Now: 1 - how do I automate the upload + conversion process? 2 - how do I handle big files here? 3 - how do I avoid making the user wait for all of this to happen?

    Read the article

  • Silverlight Isolated Storage and loading big files

    - by Thomas Joulin
    In a Windows Phone 7 application, I would like to query a big XML file (a list of cities) stored using Isolated Storage. If I do it this way, will the file be loaded into memory (5 MB+)? If so, what other solution do I have? Edit: more details. I want to use AutoCompleteBox (http://www.jeff.wilcox.name/2008/10/introducing-autocompletebox/), but instead of using a web service (this is fixed data, no need to be online), I want to query a file/database/isolated storage... I have a fixed list of cities. I said in the comments it's 40k, but it finally seems closer to 1k rows.

    Read the article

  • Programming language for fast calculations with big integers

    - by sub
    I'm doing Project Euler problems at the moment, and I can solve most of them using my own programming language, which uses native C++ integers (so they are bound to 2^32 on my machine). However, at times there are problems which require me to work with very large numbers, and I can't do that with native integers. So I implemented a BigInt library in my language, which unfortunately gets extremely slow at times. Is there a programming language suitable for very efficient handling of big numbers? I mean that I want to do the things I could do in other programming languages (variables, loops, etc.), but in a faster way. If you have tips for working around the 2^32 limit in my language/C++/other languages, please tell me too!
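    One commonly suggested route for Project Euler is a language (or library) with built-in arbitrary-precision integers, so the 2^32 ceiling simply disappears. As a small illustration, Python's plain int grows as large as memory allows (GMP-based libraries give C++ the same ability):

        # Python integers are arbitrary precision, so values far beyond 2^32 "just work".
        factorial_100 = 1
        for i in range(2, 101):
            factorial_100 *= i

        print(factorial_100.bit_length())   # 525 bits, far beyond any native integer
        print(2 ** 1000 % 1000000007)       # modular arithmetic on huge values is also built in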

    Read the article

  • A RAM error of big array

    - by flint
    I have a big file, more than 400 MB. In that file there are 13496*13496 numbers, i.e. 13496 rows and 13496 columns. I want to read them into an array. This is my code:

        _L1 = [[0 for col in range(13496)] for row in range(13496)]
        _L1file = open('distanceCMD.function.txt')
        i = 0
        while i < 13496:
            print "i=" + str(i)
            _strlf = _L1file.readline()
            _strlf = _strlf.split('\t')
            _strlf = _strlf[:-1]
            _L1[i] = _strlf
            i += 1
        _L1file.close()

    And this is my error message:

        MemoryError:
        File "D:\research\space-function\ART3.py", line 30, in <module>
            _strlf = _strlf.split('\t')
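    The 13496x13496 grid of Python string objects is what exhausts the memory: each cell carries the overhead of a full string object plus the list that holds it. Storing the values in one typed array is far more compact. A sketch of that approach using NumPy (an extra dependency, and it assumes the file really holds tab-separated numeric values):

        import numpy as np

        N = 13496

        # One 4-byte float per cell instead of a Python string object per cell:
        # N * N * 4 bytes is roughly 0.7 GB, versus several GB for nested lists of strings.
        data = np.empty((N, N), dtype=np.float32)

        with open('distanceCMD.function.txt') as f:
            for i, line in enumerate(f):
                fields = line.split()                    # whitespace split; the trailing tab is harmless
                data[i, :] = np.array(fields, dtype=np.float32)

        # If even 0.7 GB does not fit in RAM, a memory-mapped array keeps the bulk on disk:
        # data = np.lib.format.open_memmap('distance.npy', mode='w+',
        #                                  dtype=np.float32, shape=(N, N))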

    Read the article

  • In MYSQL is it better to have one big table or many smaller tables

    - by user307922
    Hi all, I am making a database of my clients' customers to send email promotions to. The database will include about 12 of my clients, and each of them has an average of 2100 customers. I was wondering if it would be better to have a table in the db for each one of my clients that contains a list of their customers, or if I should just make one big table... The customers will be queried daily. I know it is a broad question, but any advice would be appreciated. Cheers, Chuck

    Read the article

  • How often do you implement the big three?

    - by Neil Butterworth
    I was just musing about the number of questions here that either are about the "big three" (copy constructor, assignment operator and destructor) or about problems caused by them not being implemented correctly, when it occurred to me that I could not remember the last time I had implemented them myself. A swift grep on my two most active projects indicates that I implement all three in only one class out of about 150. That's not to say I don't implement/declare one or more of them - obviously base classes need a virtual destructor, and a large number of my classes forbid copying using the private copy ctor & assignment op idiom. But fully implemented, there is this single lonely class, which does some reference counting. So I was wondering: am I unusual in this? How often do you implement all three of these functions? Is there any pattern to the classes where you do implement them?

    Read the article

  • HTTP Download very Big File

    - by Luca
    I'm working on a web application in Python/Twisted. I want the user to be able to download a very big file (100 MB+). I don't want to load the whole file into memory (on the server), of course. Server side, I have this idea:

        request.setHeader('Content-Type', 'text/plain')
        fp = open(fileName, 'rb')
        try:
            r = None
            while r != '':
                r = fp.read(1024)
                request.write(r)
        finally:
            fp.close()
            request.finish()

    I expected this to work, but I have problems: I'm testing with Firefox... It seems the browser makes me wait until the file is completely downloaded, and only then do I get the open/save dialog box. I expected the dialog box immediately, and then the progress bar in action... Maybe I have to add something to the HTTP headers... something like the size of the file?
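    The browser can only show the save dialog and a meaningful progress bar once it knows the response is an attachment and how big it will be, so the usual fix is to send Content-Length and Content-Disposition up front, and to let Twisted stream the body in chunks instead of looping over request.write inside render (which buffers everything before the reactor gets a chance to flush it). A rough sketch, with the resource name and file path as illustrative assumptions:

        import os
        from twisted.internet import reactor
        from twisted.web import server, static
        from twisted.web.resource import Resource

        class Download(Resource):
            """Serves one large file as an attachment without reading it all into memory."""
            isLeaf = True

            def __init__(self, path):
                Resource.__init__(self)
                self.path = path

            def render_GET(self, request):
                # The size and the attachment hint are what make the save dialog
                # appear immediately and give the progress bar a total to count towards.
                request.setHeader('Content-Length', str(os.path.getsize(self.path)))
                request.setHeader('Content-Disposition',
                                  'attachment; filename="%s"' % os.path.basename(self.path))
                # static.File streams the body through a producer, chunk by chunk,
                # instead of blocking the reactor with a read/write loop.
                return static.File(self.path, defaultType='application/octet-stream').render_GET(request)

        root = Resource()
        root.putChild('download', Download('/srv/files/big_file.bin'))
        reactor.listenTCP(8080, server.Site(root))
        reactor.run()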

    Read the article

  • Developing applications on big devices vs mobile phones: Similarities/Dissimilarities

    - by Richard77
    Hello, I saw a news report about applications running on mobile devices, and I believe that might be interesting for people where I live (the Internet is not well developed, but the cellphone networks are much better). So here are my questions: Where can I find documentation for beginners on that matter? (And most importantly) Am I going to be able to take advantage of my acquired knowledge of the .NET framework (C#, MVC, jQuery, XHTML, ...)? Am I going to need my laptop or a special device to develop applications? Am I going to need Visual Studio? And so on... In short, what are the similarities/dissimilarities between developing applications that run on big machines and those that run on mobile phones? Thanks for helping

    Read the article

  • How to convert big-endian numbers to native numbers in Delphi

    - by steve0
    Hi all, I want to know how to convert big-endian numbers to native numbers in Delphi. I am porting some C++ code and came across this part:

        unsigned long blockLength = *blockLengthPtr++ << 24;
        blockLength |= *blockLengthPtr++ << 16;
        blockLength |= *blockLengthPtr++ << 8;
        blockLength |= *blockLengthPtr;

        unsigned long dataLength = *dataLengthPtr++ << 24;
        dataLength |= *dataLengthPtr++ << 16;
        dataLength |= *dataLengthPtr++ << 8;
        dataLength |= *dataLengthPtr;

    I am not familiar with C++, so I didn't understand what those operators are doing. Can anyone help? Regards
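    What that C++ does is take four consecutive bytes, most significant first, and assemble them into one 32-bit value by shifting each byte into position and OR-ing the results together; *ptr++ just means "read the byte the pointer points at, then advance the pointer". A small sketch of the same arithmetic in Python, purely to illustrate what the shifts produce (in Delphi the equivalent is a shl/or expression, or a byte swap of a 4-byte read):

        import struct

        raw = bytes([0x00, 0x00, 0x01, 0x2C])     # four big-endian bytes as they arrive

        # The same shifting and OR-ing the C++ snippet performs:
        value = (raw[0] << 24) | (raw[1] << 16) | (raw[2] << 8) | raw[3]
        print(value)                               # 300

        # struct does the identical conversion in one call ('>I' = big-endian uint32):
        print(struct.unpack('>I', raw)[0])         # 300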

    Read the article

  • Analyze big human database

    - by Neir0
    Let's say we have a big database of people. Each person has many attributes: age, weight, favorite music, favorite films, education, etc. I want to know how one feature is associated with other features. For example, if a person has a good education, what does that mean for their musical preferences? Or how do film preferences change with age? I know about association rule algorithms like Apriori, but I don't just want to find association rules; I want to know how one specific feature affects the others. Which keywords should I use on Google?

    Read the article

  • Storing a big xml string in a xml document in java

    - by shyam R
    Hi all, I have a Java object which I am converting into an XML file, and after the conversion I get a big XML string. I am capturing the converted XML in a string like this: String outputXML = xmlfile;. Now, if I print outputXML to the console of IBM RSA it prints in the proper format, but my requirement is to redirect outputXML into an XML file instead of printing it to the RSA console. I am able to do this, but the problem is that when I open the created XML file the XML structure is not proper and it shows many special characters. Please help!

    Read the article

  • Bit/Byte addressing - Little/Big-endian

    - by code8230
    Consider the 16-byte data packet below, which is sent through the network in network byte order, i.e. big-endian:

        Byte num: 0  1  2  3  4  5  6  7  8  9  10 11 12 13 14 15
        Value:    34 67 89 45 90 AB FF 23 65 37 56 C6 56 B7 00 00

    Let's say 8945 is a 16-bit value; all the others are 8-bit data bytes. On my system, which is little-endian, how would the data be received and stored? Let's say we are configured to receive 8 bytes at a time. RxBuff is the Rx buffer where data will be received, and Buff is the storage buffer where the data will be stored. Please point out which case is correct for data storage after reading 8 bytes at a time:

        1) Buff[] = {0x34, 0x67, 0x45, 0x89, 0x90, 0xAB, ......., 0x00};
        2) Buff[] = {0x00, 0x00, ......., 0x67, 0x89, 0x45, 0x34};

    Would the whole 16 bytes of data be reversed, or only the 2-byte value contained in this packet?
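    Reception just copies bytes into the buffer in the order they arrive off the wire, so nothing is reversed wholesale; endianness only matters at the moment the receiver interprets the two bytes 0x89 0x45 as one 16-bit number. A small illustration using Python's struct module with the packet bytes from the question:

        import struct

        packet = bytes([0x34, 0x67, 0x89, 0x45, 0x90, 0xAB, 0xFF, 0x23,
                        0x65, 0x37, 0x56, 0xC6, 0x56, 0xB7, 0x00, 0x00])

        # The bytes sit in memory exactly in arrival order; only interpretation differs.
        field = packet[2:4]                        # the two bytes of the 16-bit value

        big    = struct.unpack('>H', field)[0]     # read as big-endian (network order)
        little = struct.unpack('<H', field)[0]     # read the same bytes as little-endian

        print(hex(big))      # 0x8945 - the value the sender intended
        print(hex(little))   # 0x4589 - what you get if you forget to byte-swap

    So neither of the two Buff[] layouts happens by itself: the stored bytes match the arrival order, and only the 16-bit field needs an explicit byte swap (e.g. ntohs) before use.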

    Read the article

  • merge three file into one big file

    - by davit-datuashvili
    Suppose that we have three arrays:

        int a[] = new int[]{4, 6, 8, 9, 11, 12};
        int b[] = new int[]{3, 5, 7, 13, 14};
        int c[] = new int[]{1, 2, 15, 16, 17};

    and we want to merge them into one big array d, where d.length = a.length + b.length + c.length. But we have a memory constraint, meaning we may only use this one extra d array for the merge. Of course we could use merge sort, but can we use a merge algorithm without a sorting step? Just as two sorted arrays can be merged into one sorted array, what about three or more arrays?
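    Yes - the two-way merge generalizes to a k-way merge: at every step take the smallest of the current front elements of the arrays and append it to d, so no sorting pass is ever needed and only the output array is written to. A sketch of the idea in Python with the question's data (the same loop is easy to rewrite in Java with three index variables):

        import heapq

        a = [4, 6, 8, 9, 11, 12]
        b = [3, 5, 7, 13, 14]
        c = [1, 2, 15, 16, 17]

        def kway_merge(*arrays):
            """Merge already-sorted arrays by repeatedly taking the smallest front element."""
            pos = [0] * len(arrays)
            merged = []
            total = sum(len(arr) for arr in arrays)
            while len(merged) < total:
                # pick the array whose current front element is the smallest
                best = min((k for k in range(len(arrays)) if pos[k] < len(arrays[k])),
                           key=lambda k: arrays[k][pos[k]])
                merged.append(arrays[best][pos[best]])
                pos[best] += 1
            return merged

        print(kway_merge(a, b, c))          # [1, 2, 3, 4, 5, 6, 7, 8, 9, 11, 12, 13, 14, 15, 16, 17]
        print(list(heapq.merge(a, b, c)))   # library version of the same idea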

    Read the article

  • Magento: server requirements for a quite big shop to run smoothly

    - by david parloir
    Hi, I'm working on a quite big Magento site: it will have 50 different shops (1 Magento install, 1 admin to rule them all) for a start, this number is expected to rise in the future, and there is a catalog of more than 1k products. This catalog will be shared by all shops. I'm concerned about the server requirements I need for this to run smoothly. So far this is what I've found to get the most out of it: Caching: use Magento's cache with APC, plus the MySQL query cache; image sprites in the theme; use FastCGI instead of mod_php; database clustering: I don't think it will be necessary for 1k products, what do you think?; using Zend Server. Are there other things I can do in order to improve Magento's performance? I'd like to know everything I need from the beginning so I can find the right server. Thanks in advance.

    Read the article

  • Load big XML files to mySQL database (PHP)

    - by Kees
    Hello there, for a new project I need to load big XML files (200 MB+) into a MySQL database. There are +-20 feeds I need to match against that (not all fields are the same). Now when I try to read the XML I get this error:

        Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 171296569 bytes)
        in E:\UsbWebserver\Root\****\application\libraries\MY_xml.php on line 21

    Is there an easy solution for this? It's not possible to get the feed in parts of a few MB each. Thank you very much! P.S. Does anybody have an idea for matching XML feeds easily?

    Read the article

  • SSAS – Synchronisation performance

    - by ACALVETT
    I've always thought of SSAS synchronisation as a clever file-mirroring utility built into SSAS, and I had never considered the technology as bringing any performance gains to the table. So, it's a good job I like to revisit areas... :) I decided to compare the performance of robocopy and SSAS synchronisation between 2 Windows 2003 servers running SSAS 2008 SP1 CU7 with 1 Gb network links. For the robocopy of the data directory I used the SQLCat Robocopy Script. The results are shown below. SSAS Sync...(read more)

    Read the article

  • Sublinear Extra Space MergeSort

    - by hulkmeister
    I am reviewing basic algorithms from a book called Algorithms by Robert Sedgewick, and I came across a problem in MergeSort that I am, sad to say, having difficulty solving. The problem is below: Sublinear Extra Space. Develop a merge implementation that reduces the extra space requirement to max(M, N/M), based on the following idea: Divide the array into N/M blocks of size M (for simplicity in this description, assume that N is a multiple of M). Then, (i) considering the blocks as items with their first key as the sort key, sort them using selection sort; and (ii) run through the array merging the first block with the second, then the second block with the third, and so forth. The problem I have with this is that, based on the idea Sedgewick recommends, the following set of arrays will not be sorted: {0, 10, 12}, {3, 9, 11}, {5, 8, 13}. The algorithm I use is the following: Divide the full array into subarrays of size M. Run selection sort on each of the subarrays. Merge each of the subarrays using the method Sedgewick recommends in (ii). (This is where I encounter the problem of where to store the results after the merge.) This leads to wanting to increase the size of the auxiliary space to handle at least two subarrays at a time (for merging), but based on the specifications of the problem, that is not allowed. I have also considered using the original array as space for one subarray and using the auxiliary space for the second subarray. However, I can't envision a solution that does not end up overwriting the entries of the first subarray. Any ideas on other ways this can be done? NOTE: If this is supposed to be on StackOverflow.com, please let me know how I can move it. I posted here because the question was academic.
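    For step (ii), the standard way to merge two adjacent size-M blocks with only M extra slots is to copy the left block into the auxiliary buffer first; that frees the left block's positions in the array, so nothing gets overwritten before it is consumed. A sketch of just that sub-step in Python (not the whole exercise, and assuming N is a multiple of M as the problem statement does):

        def merge_adjacent_blocks(a, lo, M, aux):
            """Merge the sorted blocks a[lo:lo+M] and a[lo+M:lo+2*M] in place,
            using the size-M buffer aux as scratch space."""
            aux[:M] = a[lo:lo + M]            # left block moves to aux, freeing a[lo:lo+M]
            i, j, k = 0, lo + M, lo
            hi = min(lo + 2 * M, len(a))
            while i < M and j < hi:
                if aux[i] <= a[j]:
                    a[k] = aux[i]; i += 1
                else:
                    a[k] = a[j]; j += 1
                k += 1
            while i < M:                      # leftovers from aux; leftovers in a[j:hi] are already in place
                a[k] = aux[i]; i += 1; k += 1

        # The example blocks from the question, M = 3:
        a = [0, 10, 12, 3, 9, 11, 5, 8, 13]
        aux = [0] * 3
        merge_adjacent_blocks(a, 0, 3, aux)    # first block with the second
        merge_adjacent_blocks(a, 3, 3, aux)    # then the second with the third
        print(a)   # [0, 3, 9, 5, 8, 10, 11, 12, 13] - one pass is not a full sort, which is the crux of the question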

    Read the article

  • SSAS 2008 R2 – Little Gems

    - by ACALVETT
    I have spent the last few days working with SSAS 2008 R2 and noticed a few small enhancements which many people probably won't notice, but I will list them here along with why they are important to me. New profiler events - Commit: this is a new subclass event for "progress report end". It represents the elapsed time taken for the server to commit your data. It is important because for the duration of this event a server-level lock is in place, blocking all incoming connections and causing time out...(read more)

    Read the article
