Search Results

Search found 9124 results on 365 pages for 'big sal'.

Page 244/365 | < Previous Page | 240 241 242 243 244 245 246 247 248 249 250 251  | Next Page >

  • Can't subtract in a for loop in C/Objective-C

    - by user1612935
    I'm going through the Big Nerd Ranch book on Objective-C, which takes you through some early C material. I've played with C before and am pretty experienced in PHP. Anyhow, I'm doing the challenges and this one is not working the way I think it should. It's pretty simple: start at 99, loop down by three until you get to zero, and every time you get a number that is divisible by 5, print "Found one." Pretty straightforward. However, subtracting three in the for loop is not working:

        #include <stdio.h>

        int main(int argc, const char * argv[]) {
            int i;
            for (i = 99; i > 0; i-3) {
                printf("%d\n", i);
                if (i % 5 == 0) {
                    printf("Found one!\n");
                }
            }
            return 0;
        }

    It creates an endless loop at 99, and I'm not sure why.
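
    In C, the update expression i-3 computes a value and throws it away, so i never changes; the loop needs an assignment such as i -= 3. Below is a minimal sketch of the corrected countdown, not taken from the post itself and written in Java (the for-loop semantics are the same):

        // Sketch: counting down by three with an explicit assignment in the update
        // clause, so the loop variable actually decreases each pass.
        public class CountdownByThree {
            public static void main(String[] args) {
                for (int i = 99; i > 0; i -= 3) {   // i -= 3, not i - 3
                    System.out.println(i);
                    if (i % 5 == 0) {
                        System.out.println("Found one!");
                    }
                }
            }
        }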

    Read the article

  • Select rows where column LIKE dictionary word

    - by Gerve
    I have two tables. Dictionary contains roughly 36,000 words:

        CREATE TABLE IF NOT EXISTS `dictionary` (
          `word` varchar(255) NOT NULL,
          PRIMARY KEY (`word`)
        ) ENGINE=InnoDB DEFAULT CHARSET=latin1;

    Datas contains roughly 100,000 rows:

        CREATE TABLE IF NOT EXISTS `datas` (
          `ID` int(11) NOT NULL AUTO_INCREMENT,
          `hash` varchar(32) NOT NULL,
          `data` varchar(255) NOT NULL,
          `length` int(11) NOT NULL,
          `time` int(11) NOT NULL,
          PRIMARY KEY (`ID`),
          UNIQUE KEY `hash` (`hash`),
          KEY `data` (`data`),
          KEY `length` (`length`),
          KEY `time` (`time`)
        ) ENGINE=InnoDB DEFAULT CHARSET=latin1 AUTO_INCREMENT=105316 ;

    I would like to select all the rows from datas where the data column contains one or more dictionary words. I understand this is a big ask: it would need to match all of these rows together in every possible combination, so it needs the best optimization. I have tried the query below, but it just hangs for ages:

        SELECT `datas`.*, `dictionary`.`word`
        FROM `datas`, `dictionary`
        WHERE `datas`.`data` LIKE CONCAT('%', `dictionary`.`word`, '%')
          AND LENGTH(`dictionary`.`word`) > 3
        ORDER BY `length` ASC
        LIMIT 15

    I have also tried something similar with a LEFT JOIN and an ON clause that contained the LIKE condition.
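
    For illustration only, here is a brute-force application-side equivalent of what that LIKE join computes, sketched in Java with hypothetical in-memory copies of the two tables. It makes the rows-times-words cost of the substring matching explicit, which is why the SQL version hangs; a full-text or inverted index over the data column is the usual way to avoid testing every pair.

        import java.util.ArrayList;
        import java.util.List;

        public class DictionaryMatchSketch {
            // Hypothetical in-memory copies of the two tables, loaded elsewhere.
            static List<String> dictionaryWords = new ArrayList<>(); // ~36,000 words
            static List<String> dataRows = new ArrayList<>();        // ~100,000 data values

            public static void main(String[] args) {
                // Same predicate as LIKE CONCAT('%', word, '%') AND LENGTH(word) > 3:
                // every (row, word) pair is tested, i.e. rows * words substring checks.
                for (String data : dataRows) {
                    for (String word : dictionaryWords) {
                        if (word.length() > 3 && data.contains(word)) {
                            System.out.println(data + " -> " + word);
                            break; // one match is enough to report the row
                        }
                    }
                }
            }
        }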

    Read the article

  • How can I make a view get bigger again, as soon as the status bar goes back to normal height after a

    - by Thanks
    In the simulator I went to the Hardware menu and activated the simulation of the taller status bar shown during a phone call. Now, I tried to make a view in my nib that takes up the whole screen. As soon as the status bar gets smaller, I want my view to get bigger so it uses that space up there. But regardless of any autoresizing settings, my view stays pushed down after the status bar gets smaller. There is an empty slot left where the status bar was after hanging up the call. What is that actually supposed to be? Is my app recognizing the status bar as a view, or is the status bar indeed making my screen smaller? I mean, does it mess around with my views as if it were a view itself, or do my views not know about a status bar, only about a smaller screen size when the status bar gets bigger? How do you get your views big again when the status bar returns to its normal height?

    Read the article

  • is there a good reason to fear closed-source code *inside* of open-source libraries?

    - by jcollum
    Here's the situation. At work, I hear there is resistance to using open source code (NAnt in particular) because there might be copyrighted code in there, meaning somewhere in that open source tool or library there might be a chunk of code that was lifted directly from copyrighted code. In theory, this means our company (which is quite large) gets sued for big money because it used an open source library. We don't ship any software, so how this theoretical plaintiff would find out is a mystery. I have also heard that some group of people came through a year or two ago and actually found instances of this in our codebase. That's hearsay, of course, so who knows. Is this simple paranoia? Didn't something similar happen with Linux a while ago? Wouldn't the burden of checking for copyrighted code lie with the people who made the code, not the people who use it?

    Read the article

  • Concatenate integer arrays iteratively

    - by Ojtwist
    I have a method in2.getImagesOneDim() which gives me an array of integers, more precisely the pixel values of an image. Now I want to create one big array with all the pixel values of all the images, so I have to call this method several times and concatenate the previous output with the current output until all images are read. In some kind of pseudocode, where + is concatenation:

        for (int i = 1; i < 25; i++) {
            ConArray = ConArray + in2.getImagesOneDim("../images/" + i);
        }

    How would I do this in Java?
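
    A minimal sketch of one way to do this concatenation in Java; getImagesOneDim is stubbed with dummy data here, since the poster's in2 object is not shown. Growing the array with Arrays.copyOf plus System.arraycopy re-copies everything on each pass, so for many large images collecting the pieces first and copying once would be cheaper.

        import java.util.Arrays;

        public class ConcatSketch {
            // Stand-in for the poster's in2.getImagesOneDim(path); returns dummy pixels here.
            static int[] getImagesOneDim(String path) {
                return new int[] {1, 2, 3};
            }

            public static void main(String[] args) {
                int[] all = new int[0];
                for (int i = 1; i < 25; i++) {
                    int[] next = getImagesOneDim("../images/" + i);
                    // Concatenate: copy the old contents into a larger array, then append the new block.
                    int[] grown = Arrays.copyOf(all, all.length + next.length);
                    System.arraycopy(next, 0, grown, all.length, next.length);
                    all = grown;
                }
                System.out.println("total pixels: " + all.length);
            }
        }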

    Read the article

  • Best practice for avoiding locks on a heavily read table?

    - by Luiggi
    Hi, I have a big database (~4 GB) with 2 large tables (~3M records) handling ~180K SELECTs/hour, ~2K UPDATEs/hour and ~1K INSERTs+DELETEs/hour. What would be the best practice to guarantee no locks for the reading tasks while inserting/updating/deleting? I was thinking about using a NOLOCK hint, but there is so much discussion about it (it's good, it's bad, it depends) that I'm a bit lost. I must say I've tried this in a dev environment and didn't find any problems, but I don't want to put it into production until I get some feedback... Thank you! Luiggi

    Read the article

  • minifying patched javascript files

    - by Stacia
    I'm writing a Rails app and I've partially integrated this nice little patch to the inline AJAX editor: http://inplacericheditor.box.re/ The problem is, on that page I have TinyMCE, Prototype and Scriptaculous included, and in Firefox at least there's a big lag while all this stuff loads. I was hoping to fix it by compressing the files, so I checked out a plugin for Rails called Smurf. It seemed to do what it was supposed to do nicely, but it choked on the little patch files that are included with the AJAX editor. The patch files look like this:

        Object.extend(Ajax.InPlaceEditor.prototype, {
            handleAJAXFailure: function(transport)

    Alternatively, should I just be caching them instead of worrying about minifying them? I know I'm running on development and that Apache would maybe handle serving the JS files differently... It just seems like a lot of things to serve on one page.

    Read the article

  • Create a picture with GD containing other images

    - by Jensen
    Hi, I would like to create a picture in PHP with GD composed of several other pictures. For example, I have 6 pictures (or more) and I would like to create ONE picture that contains them all. The difficulty is that my final picture must have a fixed width and height (304x179), so if the source pictures are too big they must be cropped. This is an example from IconFinder: the picture is composed of 6 images, but the 3rd bird (green) is cut off, and the 4th, 5th and 6th are cut off at the bottom. This is what I want; can you give me some help writing this code in PHP? Thanks
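
    The question asks for PHP/GD; purely as an illustration of the compositing-and-cropping idea, here is a Java sketch that draws several images onto a fixed 304x179 canvas and lets anything past the edge be clipped. The file names and the grid cell size are made up.

        import java.awt.Graphics2D;
        import java.awt.image.BufferedImage;
        import java.io.File;
        import javax.imageio.ImageIO;

        public class CompositeSketch {
            public static void main(String[] args) throws Exception {
                BufferedImage canvas = new BufferedImage(304, 179, BufferedImage.TYPE_INT_RGB);
                Graphics2D g = canvas.createGraphics();
                int x = 0, y = 0, cell = 110;              // hypothetical grid cell size
                for (int i = 1; i <= 6; i++) {
                    BufferedImage img = ImageIO.read(new File("icon" + i + ".png")); // hypothetical files
                    g.drawImage(img, x, y, null);          // anything outside 304x179 is simply clipped
                    x += cell;
                    if (x >= 304) { x = 0; y += cell; }    // wrap to the next row
                }
                g.dispose();
                ImageIO.write(canvas, "png", new File("composite.png"));
            }
        }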

    Read the article

  • What's the recommended way to create an HTML element and bind a listener to it using jQuery?

    - by Bytecode Ninja
    At the moment I achieve this using something like this:

        var myElem = "<tr id='tr-1'><td>content</td></tr>";
        $("#myTable").append(myElem);
        $("#tr-1").click(function() {
            // blah blah
        });

    Traditionally, when I wasn't using jQuery, I used to do something like this:

        var myElem = document.createElement(...);
        var myTable = document.getElementById("myTable");
        myTable.appendChild(myElem);
        myElem.onclick = function() {
            // blah blah
        }

    The thing is, in the second approach I already have a reference to myElem and I don't have to scan the DOM ($("#tr-1")) to find it, like the jQuery approach, and hence it should be much faster especially in big pages. Isn't there a better jQuery-ish way to accomplish this task?

    Read the article

  • HTML/CSS - Image inside a li element from the navigation bar

    - by musicvicious
    I have a navigation bar, and underneath it a black div onto which the drop-down elements from the navigation bar drop. This is not the main function of the black div; it is just for design, but it works really well. You can see what I am talking about here: http://www.ecoloc.ro/interior/test/regeneration . Now, what I want is that every time a main element of the navigation bar is hovered, an image big enough to cover the main element and a part of the black div beneath it appears. You can see in the link I posted that I want the image on that black gap. Can this be done? Thank you!

    Read the article

  • how to construct a long string

    - by david
    I need to construct a long string in JavaScript. This is how I tried to do it:

        var html = '<div style="balbalblaba">&nbsp;</div>';
        for (i = 1; i <= 400; i++) {
            html=+html;
        };

    When I execute that in Firefox it takes ages or makes it crash. What is the best way to do this? What is generally the best way to construct big strings in JS? Can someone help me?
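
    The =+ in the loop body assigns the unary-plus of html rather than appending to it; the usual intent is +=. As a general illustration of building a big string by appending many pieces into one growing buffer, here is a sketch in Java (not JavaScript) using StringBuilder; in JS the analogous approach is pushing pieces onto an array and joining once at the end.

        public class BigStringSketch {
            public static void main(String[] args) {
                StringBuilder sb = new StringBuilder();
                // Append 400 copies of the fragment; one buffer grows in place instead of
                // creating a new immutable string on every iteration.
                for (int i = 1; i <= 400; i++) {
                    sb.append("<div style=\"balbalblaba\">&nbsp;</div>");
                }
                String html = sb.toString();
                System.out.println(html.length());
            }
        }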

    Read the article

  • Coalesce and Case-When with To_Date not working as expected (Postgres bug?)

    - by ADTC
    I'm using Postgres 9.1. The following query does not work as expected. COALESCE should return the first non-null value; however, this query returns null (1) instead of the date (2).

        select COALESCE(
            TO_DATE('','yyyymmdd'),          --(1)
            TO_DATE('20130201','yyyymmdd')   --(2)
        );

        --(1) this evaluates independently to null
        --(2) this evaluates independently to the date, and is therefore the first non-null value

    What am I doing wrong? Any workaround? Edit: This may have nothing to do with COALESCE at all. I tried some experiments with CASE WHEN constructs; it turns out Postgres has this big ugly bug where it treats TO_DATE('','yyyymmdd') as not null, even though selecting it returns null.

    Read the article

  • Question about inserting/updating rows with SQL Server (ASP.NET MVC)

    - by Alex
    I have a very big table with a lot of rows; every row holds stats for every user for certain days, and obviously I don't have any stats for the future. So to update the stats I use:

        UPDATE Stats SET Visits=@val WHERE ... a lot of conditions ... AND Date=@Today

    But what if the row doesn't exist? I'd have to use:

        INSERT INTO Stats (...) VALUES (Visits=@val, ..., Date=@Today)

    How can I check whether the row exists or not? Is there any way other than doing a COUNT(*)? If I pre-filled the table with empty cells, it would take hundreds of thousands of rows, taking up megabytes while storing no data.

    Read the article

  • question on finding a value in an array

    - by davit-datuashvili
    A few days ago I saw this problem: given two arrays, find the elements common to both. One solution was to sort the big array and then use the binary search algorithm; there is also the brute-force algorithm:

        for (int i = 0; i < array1.length; i++) {
            for (int j = 0; j < array2.length; j++) {
                if (array1[i] == array2[j]) {
                    // code here
                }
            }
        }

    Its complexity is O(array1.length * array2.length). I am interested in whether the first method's complexity is the same. We should sort the array first and then use the search method; the binary search algorithm's complexity is log_2(n), so the total search time will be array.length * log_2(n), but what about the sort? Please explain which is better.
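
    A minimal sketch (not part of the question) of the sort-then-binary-search approach in Java: sorting one array costs O(n log n) and each of the m lookups costs O(log n), so the total is O((m + n) log n), versus O(m * n) for the nested loops.

        import java.util.Arrays;

        public class CommonElementsSketch {
            public static void main(String[] args) {
                int[] array1 = {4, 8, 15, 16, 23, 42};
                int[] array2 = {9, 16, 25, 42, 49};

                Arrays.sort(array2);                                    // O(n log n)
                for (int value : array1) {                              // m lookups ...
                    if (Arrays.binarySearch(array2, value) >= 0) {      // ... each O(log n)
                        System.out.println("common: " + value);
                    }
                }
            }
        }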

    Read the article

  • is there a limit on merge tables with MySQL?

    - by sysko
    I'm working on a database with MySQL 5.0 for an open source project. It is used to store sentences in specific languages and their translations into other languages. I used to have one big "sentences" table and a "sentences_translations" table (used to join sentences to sentences), but as we now have nearly one million entries this is starting to get a bit slow; moreover, most requests are made with a "WHERE lang =" clause. So I decided to create one table per language, sentences_LANGUAGECODE and sentences_translation_LANGSOURCE_LANGTARGET, and to create MERGE tables such as sentences_ENG_OTHERS (which merges sentences_ENG_ARA, sentences_ENG_DEU, etc.) for when we want the translations of an English sentence in all languages, and sentences_OTHERS_ENG for when we want only the English translations of some sentences. I created a script to build all these tables (there are around 31 languages, so more than 60 merge tables) and tested it; it works really well. A request that used to take 160 ms now takes only 30 :) But I discovered that all my merge tables after the 15th have "NULL" as the storage engine type instead of MRG_MYISAM, and if I delete one I can then create another; using FLUSH TABLES between creations also lets me create more merge tables. So is this a limitation in MySQL? Can we override it? Thanks for your answers

    Read the article

  • Object for storing strings in Python

    - by evg
        class MyWriter:
            def __init__(self, stdout):
                self.stdout = stdout
                self.dumps = []

            def write(self, text):
                self.stdout.write(smart_unicode(text).encode('cp1251'))
                self.dumps.append(text)

            def close(self):
                self.stdout.close()

        writer = MyWriter(sys.stdout)
        save = sys.stdout
        sys.stdout = writer

    I use the self.dumps list to store the data obtained from prints. Is there a more convenient object for storing string lines in memory? Ideally I want to dump it all into one big string; I can get that with "\n".join(self.dumps) from the code above. Maybe it's better to just concatenate strings, as in self.dumps += text?

    Read the article

  • Can a programming language without arrays be turing-complete?

    - by Ring
    My question is simple: no arrays are possible. That means you can address variables only "statically", by directly using their unique names. (This already throws out the default array syntax variable[index] and variable variables.) "Emulated arrays" count as arrays and are excluded too. Examples: you could basically simulate arrays using strings (quite easily, actually) or use variable variables as in PHP. Can such a language be Turing-complete? Brainf*ck, for example, has arrays; in fact, it is one big array, isn't it?

    Read the article

  • Sharing base object with inheritance

    - by max
    I have a class Base. I'd like to extend its functionality in a class Derived. I was planning to write:

        class Derived(Base):
            def __init__(self, base_arg1, base_arg2, derived_arg1, derived_arg2):
                super().__init__(base_arg1, base_arg2)
                # ...
            def derived_method1(self):
                # ...

    Sometimes I already have a Base instance, and I want to create a Derived instance based on it, i.e., a Derived instance that shares the Base object (doesn't re-create it from scratch). I thought I could write a static method to do that:

        b = Base(arg1, arg2)  # very large object, expensive to create or copy
        d = Derived.from_base(b, derived_arg1, derived_arg2)  # reuses existing b object

    but it seems impossible. Either I'm missing a way to make this work, or (more likely) I'm missing a very big reason why it can't be allowed to work. Can someone explain which one it is? [Of course, if I used composition rather than inheritance, this would all be easy to do. But I was hoping to avoid delegating all the Base methods to Derived through __getattr__.]

    Read the article

  • Sending large XML from Silverlight to SVC (WCF)

    - by alexbf
    Hi! I want to send a big XML string to a WCF SVC service from Silverlight. It looks like anything under about 50k is sent correctly, but if I try to send something over that limit, my request reaches the server (BeginRequest is called) but never reaches my SVC, and I get the classic "NotFound" exception. Any idea how to raise that limit? And if I can't raise it, what are my other options? Thanks, Alex

    Read the article

  • Read a file into multiple byte[] arrays

    - by hankol
    I have an encryption algorithm (AES) that accepts a file converted to a byte array and encrypts it. Since I am going to process very large files, the JVM may go out of memory. I am planning to read the file into multiple byte arrays, each containing some part of the file, then iteratively feed them to the algorithm and finally merge the results to produce the encrypted file. So my question is: is there any way to read a file part by part into multiple byte arrays? I thought I could read the whole file into a byte array with IOUtils.toByteArray(InputStream input) and then split the array into multiple parts using Arrays.copyOfRange(), but I am afraid that the first step, which reads the whole file into a byte array, will already make the JVM go out of memory. Any suggestions please? Thanks
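
    A sketch, not from the original post, of reading a file chunk by chunk with a plain InputStream so that only one buffer-sized piece is in memory at a time; the chunk size and file name are made up. For AES specifically, javax.crypto's CipherInputStream and CipherOutputStream wrap a stream in the same way, so the whole file never has to be held in memory at once.

        import java.io.FileInputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.util.Arrays;

        public class ChunkedReadSketch {
            public static void main(String[] args) throws IOException {
                int chunkSize = 1 << 20;                       // hypothetical 1 MiB chunks
                byte[] buf = new byte[chunkSize];
                try (InputStream in = new FileInputStream("bigfile.bin")) {   // hypothetical file
                    int n;
                    while ((n = in.read(buf)) != -1) {
                        // Copy only the bytes actually read; the last chunk is usually shorter.
                        byte[] chunk = Arrays.copyOf(buf, n);
                        // feed 'chunk' to the encryption step here, then drop the reference
                        System.out.println("read chunk of " + chunk.length + " bytes");
                    }
                }
            }
        }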

    Read the article

  • Doesn't Matlab optimize the following?

    - by kloop
    I have a very long 1xr vector v and a very long 1xs vector w, and an rxs matrix A which is sparse (but very big in dimensions). I was expecting Matlab to optimize the following so I wouldn't run into trouble with memory:

        A./(v'*w)

    but it seems Matlab actually tries to generate the full v'*w matrix, because I am running into an Out of memory issue. Is there a way to overcome this? Note that there is no need to calculate all of v'*w, because many values of A are 0. EDIT: If it were possible, one way to do it would be

        A(find(A)) ./ (v'*w)(find(A));

    but you can't select a subset of a matrix (v'*w in this case) without first calculating it and putting it in a variable.
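
    To illustrate the compute-only-where-A-is-nonzero idea outside Matlab, here is a Java sketch with a hypothetical sparse matrix kept in coordinate (COO) form: each stored value is divided by v[row]*w[col] directly, so the dense r-by-s outer product v'*w is never built.

        public class SparseDivideSketch {
            public static void main(String[] args) {
                // Hypothetical sparse matrix A in coordinate (COO) form: three nonzeros.
                int[] rows = {0, 1, 2};
                int[] cols = {2, 0, 1};
                double[] vals = {6.0, 10.0, 12.0};

                double[] v = {2.0, 5.0, 3.0};   // length r
                double[] w = {1.0, 4.0, 3.0};   // length s

                // Divide only at the stored nonzeros; the full outer product is never materialized.
                for (int k = 0; k < vals.length; k++) {
                    vals[k] /= v[rows[k]] * w[cols[k]];
                }
                for (double x : vals) System.out.println(x);
            }
        }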

    Read the article

  • freeing malloc and checking whether it is empty or not

    - by gcc
        char *p;
        p = "kjkjk";
        .
        .   // there is code here that checks another command
        .
        if (.....)        // I used pointer p only in that area
            free(p);
        .
        .   // there is code here that checks another command
        .
        if (p == NULL)    // I check whether it is empty
            ....
        if (p == -1)      // can we use "EOF == p" in an if statement? Is there any usage like that?
            ...
        else
            ....

    I think there is a big error somewhere, but where?

    Read the article

  • The largest prime factor with PHP

    - by Tom
    So, I wrote a PHP program to find the largest prime factor, and I think it is quite optimal because it runs quite fast. But there is a problem: it doesn't handle very big numbers' prime factors. Here is the program:

        function is_even($s) {
            $sk_sum = 0;
            for ($i = 1; $i <= $s; $i++) {
                if ($s % $i == 0) {
                    $sk_sum++;
                }
            }
            if ($sk_sum == 2) {
                return true;
            }
        }

        $x = 600851475143; // x is the number
        $i = 2;
        while ($i <= $x) {
            if ($x % $i == 0) {
                if (is_even($i)) {
                    $sk = $i;
                    $x = $x / $i;
                }
            }
            $i++;
        }
        echo $sk;
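
    A sketch of the standard trial-division approach, written here in Java rather than PHP and not taken from the post: dividing out each factor as soon as it is found shrinks the number quickly, and no separate primality check like is_even() is needed, because only primes can divide what remains.

        public class LargestPrimeFactorSketch {
            static long largestPrimeFactor(long x) {
                long largest = 1;
                for (long f = 2; f * f <= x; f++) {
                    while (x % f == 0) {      // divide out f completely; only primes get this far
                        largest = f;
                        x /= f;
                    }
                }
                return x > 1 ? x : largest;   // whatever remains above 1 is itself prime
            }

            public static void main(String[] args) {
                System.out.println(largestPrimeFactor(600851475143L));   // prints 6857
            }
        }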

    Read the article

  • Converting a byte array to an image in Xcode?

    - by arizah
    I am getting what I reckon is a byte array inside an NSMutableArray object. The NSMutableArray comes from XML parsing of a service URL. Now I need to convert that byte array to a UIImage. I've searched Google and found that I should convert to NSData and then to UIImage, but I couldn't understand how. How can I do it? My array looks like this:

        (
            {
                Name = "John";
                Image = "/9j/4AAQSkZJRgABAQAAAQABAAD//gA7Q1JFQVRPUjogZ2QtanBlZyB2MS4wICh1c2luZyBJS0cgSlBFRyB2NjIpLCBxdWFsaXR5ID0gODAK/9sAQwAGBAUGBQQGBgUGBwcGCAoQCgoJCQoUDg8MEBcUGBgXFBYWGh0lHxobIxwWFiAsICMmJykqKRkfLTAtKDAlKCko/9sAQwEHBwcKCAoTCgoTKBoWGigoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgo/8AAEQgEKwZAAwEiAAIRAQMRAf/........ // big byte array
            }
        )

    First of all, I need to know whether this really is a byte array, and how can I extract an image from it?
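
    The string in the Image field looks like Base64-encoded JPEG data (the /9j/ prefix is how a JPEG header comes out in Base64). As an illustration of the decoding step only, here is a Java sketch; the question itself is about Objective-C, where NSData and UIImage would play the corresponding roles, and the encoded variable below stands in for the posted string.

        import java.awt.image.BufferedImage;
        import java.io.ByteArrayInputStream;
        import java.util.Base64;
        import javax.imageio.ImageIO;

        public class Base64ImageSketch {
            public static void main(String[] args) throws Exception {
                String encoded = "";   // placeholder for the Base64 string from the Image field
                byte[] jpegBytes = Base64.getDecoder().decode(encoded);                    // text -> raw bytes
                BufferedImage image = ImageIO.read(new ByteArrayInputStream(jpegBytes));   // bytes -> image
                System.out.println(image == null ? "not image data"
                                                 : image.getWidth() + "x" + image.getHeight());
            }
        }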

    Read the article

  • How to match data between columns to do the comparison

    - by NCC
    I don't really know how to explain this clearly; please see the attached image. I have a table with 4 columns, two pairs of which are identical in structure (NAME and QTY). The goal is to compare the differences between the QTY values; however, in order to do that I must: 1. sort the data, and 2. match the data item by item. This is not a big deal with a small table, but with 10 thousand rows it takes me a few days to do it. Please help me, I appreciate it. My logic is: 1. Sort the first two columns (NAME and QTY). 2. For each value in the second two columns (NAME and QTY), check whether it matches the first two columns; if true, insert the value. 3. For values that are not matched, insert new rows, offset from the rows that are in the first two columns but not in the second two.
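
    One way to make the match-item-by-item step fast, sketched in Java with made-up column data (not from the original question): index the first NAME/QTY pair in a HashMap, so each row of the second pair is matched by key instead of by scanning, which is what keeps 10,000 rows quick.

        import java.util.HashMap;
        import java.util.Map;

        public class QtyCompareSketch {
            public static void main(String[] args) {
                // Hypothetical data standing in for the two NAME/QTY column pairs.
                Map<String, Integer> first = new HashMap<>();
                first.put("bolt", 10);
                first.put("nut", 25);
                first.put("washer", 7);

                Map<String, Integer> second = new HashMap<>();
                second.put("bolt", 12);
                second.put("screw", 40);

                for (Map.Entry<String, Integer> e : second.entrySet()) {
                    Integer qty = first.get(e.getKey());          // constant-time match by NAME
                    if (qty == null) {
                        System.out.println(e.getKey() + ": only in second list");
                    } else {
                        System.out.println(e.getKey() + ": difference " + (e.getValue() - qty));
                    }
                }
            }
        }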

    Read the article
