Search Results

Search found 24201 results on 969 pages for 'andrew case'.

Page 359 of 969

  • Theory: Can JIT Compiler be used to parse the whole program first, then execute later?

    - by unknownthreat
    Normally, a JIT compiler works by reading the byte code, translating it into machine code, and executing it. This is what I understand, but in theory, is it possible to make the JIT compiler parse the whole program first, then execute the program later as machine code? I do not know exactly how a JIT compiler works technically, so I don't know whether this is feasible. But theoretically, is it possible? Or am I doing it wrong?

    Read the article

  • R: getting "inside" environments

    - by Mike Dewar
    Given an environment object e: > e <environment: 0x10f0a6e98> > class(e) [1] "environment" How do you access the variables inside the environment? Just in case you're curious, I have found myself with this environment object. I didn't make it, a package in Bioconductor made it. You can make it, too, using these commands: library('GEOquery') eset <- getGEO("GSE4142")[[1]] e <- assayData(eset)

    Read the article

  • Drupal: assign block to a specific content type

    - by bert
    I made a customized template called node-mynode.tpl.php. Whenever a node of type mynode is requested, node-mynode.tpl.php is automatically used. However, now the user wants to see a specific menu block in this case. Question: how can I assign a block to a specific content type? Hint: I have started to look at URL aliases with Pathauto. I suspect one solution may lie in this direction.

    Read the article

  • Optimising speeds in HDF5 using Pytables

    - by Sree Aurovindh
    The problem concerns the write speed of the computers (10 × 32-bit machines) and PostgreSQL query performance. I will explain the scenario in detail. I have about 80 GB of data (with appropriate database indexes in place). I am reading it from a PostgreSQL database and writing it into HDF5 using PyTables. I have 1 table and 5 variable arrays in one HDF5 file. The HDF5 implementation is not multithreaded or enabled for symmetric multiprocessing. I have rented about 10 computers for a day and am trying to write the files on them in order to speed up my data handling. As far as the PostgreSQL tables are concerned, the overall record count is 140 million, and I have 5 tables related by primary/foreign keys. I am not using joins as they do not scale, so for a single lookup I do 6 lookups without joins and write the results into HDF5 format; for each lookup I do 6 inserts, one into the table and one into each of its corresponding arrays. The queries are really simple: select * from x.train where tr_id=1 (primary key, indexed); select q_t from x.qt where q_id=2 (non-primary key but indexed); and similarly for the other five. Each computer writes two HDF5 files, so the total comes to around 20 files. Some calculations and statistics: total number of records: 143,700,000; records per file: 143,700,000 / 20 = 7,185,000; total number of rows written per file: 7,185,000 × 5 = 35,925,000. Current PostgreSQL configuration: my current machine has 8 GB RAM with a 2nd-generation i7 processor, and I changed the following in the PostgreSQL configuration file: shared_buffers: 2 GB, effective_cache_size: 4 GB. Note on current performance: I have run it for about ten hours, and only about 621,000 × 5 = 3,105,000 rows have been written per file. The bottleneck is that I can only rent the machines for 10 hours per day (overnight), and at this speed it will take about 11 days, which is too long for my experiments. Please suggest how to improve. Questions: 1. Should I use symmetric multiprocessing on those desktops (each has 2 cores with about 2 GB of RAM)? In that case, what is suggested or preferable? 2. If I change my PostgreSQL configuration file and increase the RAM, will it speed up the process? 3. Should I use multithreading? In that case, any links or pointers would be of great help. Thanks, Sree Aurovindh V

    Read the article
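
    A minimal sketch of the batched-read idea (an assumption about the fix, not taken from the linked article), using a psycopg2 server-side cursor and PyTables; the connection string, column set, and file names are illustrative only, and open_file/create_table are the PyTables 3.x names (older releases use openFile/createTable):

        import psycopg2
        import tables

        # Hypothetical column layout; adjust to the real x.train schema.
        class TrainRecord(tables.IsDescription):
            tr_id = tables.Int64Col()
            q_id = tables.Int64Col()

        conn = psycopg2.connect("dbname=x user=postgres")
        cur = conn.cursor(name="train_stream")   # server-side cursor: streams rows
        cur.itersize = 10000                     # fetch in batches, not one row per query
        cur.execute("SELECT tr_id, q_id FROM x.train ORDER BY tr_id")

        h5 = tables.open_file("train.h5", mode="w")
        tbl = h5.create_table("/", "train", TrainRecord, expectedrows=7185000)
        row = tbl.row
        for tr_id, q_id in cur:                  # one streamed scan per table
            row["tr_id"] = tr_id
            row["q_id"] = q_id
            row.append()
        tbl.flush()
        h5.close()
        conn.close()

    The point of the sketch is to replace millions of single-row SELECTs with one streamed scan per table; at this row count the per-query round trips, rather than the HDF5 writes, usually dominate.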

  • Symfony : ajax call cause server to queue next queries

    - by Remiz
    Hello, I have a problem with my application when an AJAX call to the server takes too much time: it queues all the other requests from the user until it is done server-side (I realized that cancelling the call client-side has no effect and the user still has to wait). Here is my test case: <script type="text/javascript" src="jquery-1.4.1.min.js"></script> <a href="another-page.php">Go to another page on the same server</a> <script type="text/javascript"> url = 'http://localserver/some-very-long-complex-query'; $.get(url); </script> So when the GET is fired and I then click on the link, the server finishes serving the first call before bringing me to the other page. I want to avoid this behavior. I'm on a LAMP server and I'm looking for a way to inform the server that the user aborted the query, with a function like connection_aborted(); do you think that's the way to go? Also, I know that the longest part of this PHP script is a MySQL query, so even if connection_aborted() can detect that the user cancelled the call, I would still need to check this during the MySQL query... I'm not really sure that PHP can handle this kind of "event". So if you have any better idea, I can't wait to hear it. Thank you. Update: after further investigation, I found that the problem happens only with the Symfony framework (which I omitted to mention, my bad). It seems that an AJAX call locks any other future call. It may be related to the controller or the routing system; I'm looking into it. For those interested in the problem, here is my new test case: a new project with Symfony 1.4.3, default configuration (I just created an app and a default module), and jQuery 1.4 for the AJAX query. Here is my actions.class.php (in my only module): class defaultActions extends sfActions { public function executeIndex(sfWebRequest $request) { //Do nothing } public function executeNewpage() { //Do also nothing } public function executeWaitingaction(){ // Wait sleep(30); return false; } } Here is my indexSuccess.php template file: <script type="text/javascript" src="jquery-1.4.1.min.js"></script> <a href="<?php echo url_for('default/newpage');?>">Go to another symfony action</a> <script type="text/javascript"> url = '<?php echo url_for('default/waitingaction');?>'; $.get(url); </script> The new page template is not very relevant... But with this, I am able to reproduce the lock problem I have in my real application. Is somebody else having the same issue? Thanks.

    Read the article

  • ASP.NET mvc 2 Validation always shows errors on initial page load.

    - by gt124
    I've searched around a lot, and honed this problem down to this case: I'm using the PRG pattern, pragmatically I'm using the same DTO for my post/get actions. It looks like when I have the dto with the data annotation attributes in the get action parameter list, the validation is always displaying errors, every time on initial page load. In some cases this could be desired behavior if you put asterisks in the error message, but how do I get rid of it? Thanks in advance.

    Read the article

  • Spotting similarities and patterns within a string - Python

    - by RadiantHex
    Hi folks, here is the use case I'm trying to figure this out for. I have a list of spam subscriptions to a service, and they are killing the conversion rate and other usability studies. The emails inserted look like the following: [email protected] [email protected] [email protected] roger[...]_surname[...]@hotmail.com What would be your suggestions for spotting these entries with an automated script? It feels a little more complicated than it looks. Help would be very much appreciated!

    Read the article
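
    One possible heuristic, sketched in Python (the thresholds and function names are assumptions, and the redacted sample addresses above do not reveal the real pattern): score the local part of each address for digit density and randomness, then flag outliers for manual review.

        import math
        import re
        from collections import Counter

        def entropy(s):
            """Bits per character; random gibberish scores higher than real names."""
            counts = Counter(s)
            total = float(len(s))
            return -sum(c / total * math.log(c / total, 2) for c in counts.values())

        def looks_spammy(email, max_entropy=3.5, max_digit_ratio=0.3):
            local = email.split("@", 1)[0].lower()
            local = re.sub(r"[._+-]", "", local)          # ignore common separators
            if not local:
                return True
            digit_ratio = sum(ch.isdigit() for ch in local) / float(len(local))
            return digit_ratio > max_digit_ratio or entropy(local) > max_entropy

        signups = ["real.name@example.com", "xk93bq27w1@example.com"]   # illustrative
        suspects = [e for e in signups if looks_spammy(e)]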

  • Can I pull the next element from within a Perl foreach loop?

    - by Thilo
    Can I do something like the following in Perl? foreach (@tokens) { if (/foo/){ # simple case, I can act on the current token alone # do something next; } if (/bar/) { # now I need the next token, too # I want to read/consume it, advancing the iterator, so that # the next loop iteration will not also see it my $nextToken = ..... # do something next; } }

    Read the article
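
    The same idea sketched in Python rather than Perl (a language swap for illustration only): loop over an explicit iterator so the body can consume the following token itself, which makes the next iteration skip it. In Perl the usual equivalents are a C-style index loop or a while loop that shifts tokens off the array.

        tokens = ["x", "foo", "bar", "payload", "y"]
        it = iter(tokens)
        for tok in it:
            if tok == "foo":
                continue                    # simple case: act on the current token alone
            if tok == "bar":
                nxt = next(it, None)        # consume the next token, advancing the iterator
                print("bar takes:", nxt)    # the loop will not see "payload" again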

  • Split comma separated string to count duplicates

    - by josepv
    I have the following data in my database (comma separated strings): "word, test, hello" "test, lorem, word" "test" ... etc How can I transform this data into a Dictionary whereby each string is separated into each distinct word together with the number of times that it occurs, i.e. {"test", 3}, {"word", 2}, {"hello", 1}, {"lorem", 1} I will have approximately 3000 rows of data in case this makes a difference to any solution offered. Also I am using .NET 3.5 (and would be interested to see any solution using linq)

    Read the article
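
    For comparison, the same counting expressed in Python (a language swap for illustration; the question asks for .NET 3.5, where the equivalent shape is a LINQ SelectMany/GroupBy over the split words):

        from collections import Counter

        rows = ["word, test, hello", "test, lorem, word", "test"]
        counts = Counter(
            word.strip()
            for row in rows
            for word in row.split(",")
            if word.strip()
        )
        # Counter({'test': 3, 'word': 2, 'hello': 1, 'lorem': 1})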

  • Weighted Average and Ratings

    - by Danten
    Maths isn't my strong point and I'm at a loss here. Basically, all I need is a simple formula that will give a weighted rating on a scale of 1 to 5. If there are very few votes, they carry less influence and the rating presses more towards the average (in this case I want it to be 3, not the average of all other ratings). I've tried a few different Bayesian implementations but these haven't worked out. I believe the graphical representation I am looking for could be shown as: ___ / ___/ Cheers

    Read the article
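
    One common answer to this shape of problem is a damped (Bayesian-style) mean: give the prior a fixed number of pseudo-votes, so sparsely rated items stay near 3 while heavily voted items approach their raw mean. A small Python sketch, with the pseudo-vote count as an assumed tuning knob rather than anything from the linked article:

        def weighted_rating(ratings, prior=3.0, prior_votes=10):
            """Few votes -> stays near `prior`; many votes -> approaches the raw mean."""
            n = len(ratings)
            if n == 0:
                return prior
            mean = sum(ratings) / float(n)
            return (n * mean + prior_votes * prior) / float(n + prior_votes)

        weighted_rating([5, 5])        # ~3.33: two five-star votes barely move it
        weighted_rating([5] * 200)     # ~4.90: close to the raw mean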

  • Why do condition variables sometimes erroneously wake up?

    - by aspo
    I've known for eons that the way you use a condition variable is: lock; while not task_done, wait on the condition variable; unlock. Because sometimes condition variables will spontaneously wake. But I've never understood why that's the case. In the past I've read that it's expensive to make a condition variable that doesn't have that behavior, but nothing more than that. So... why do you need to worry about being falsely woken up when waiting on a condition variable?

    Read the article
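
    A minimal illustration of the quoted lock / while-not-done / wait pattern, using Python's threading.Condition (the variable names are arbitrary). The while loop is what makes spurious or stolen wake-ups harmless, because the predicate is re-checked every time wait() returns.

        import threading

        cond = threading.Condition()
        task_done = False

        def waiter():
            with cond:                    # lock
                while not task_done:      # re-check the predicate after every wake-up
                    cond.wait()           # may return after notify() or spuriously
                print("task finished")

        def finisher():
            global task_done
            with cond:
                task_done = True
                cond.notify_all()         # wake all waiters; each re-checks the flag

        t = threading.Thread(target=waiter)
        t.start()
        finisher()
        t.join()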

  • How can I pre-compress files with mod_deflate in Apache 2.x?

    - by Otto
    I am serving all content through Apache with Content-Encoding: gzip, but that compresses on the fly. A good amount of my content is static files on disk. I want to gzip the files beforehand rather than compressing them every time they are requested. This is something that, I believe, mod_gzip did in Apache 1.x automatically, just by having the file with .gz next to it. That's no longer the case with mod_deflate.

    Read the article
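
    A small Python sketch of the pre-compression step itself (the directory and extensions are assumptions): write a .gz sibling next to each static file so the server can hand out the ready-made copy. Serving those files still requires separate Apache configuration, for example mod_rewrite plus AddEncoding, or content negotiation.

        import gzip
        import os
        import shutil

        STATIC_ROOT = "/var/www/static"          # assumption: point at the real docroot
        EXTENSIONS = (".html", ".css", ".js")

        for dirpath, _, filenames in os.walk(STATIC_ROOT):
            for name in filenames:
                if not name.endswith(EXTENSIONS):
                    continue
                src = os.path.join(dirpath, name)
                with open(src, "rb") as f_in:
                    with gzip.open(src + ".gz", "wb", compresslevel=9) as f_out:
                        shutil.copyfileobj(f_in, f_out)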

  • Azure table storage: maximum variable size ?

    - by Egon
    I will be using table storage to store a lot of blob names in a single string, appended to each other using a special character. This string will skyrocket in size pretty soon. But is there a maximum length for a property of a particular entity, in my case the string?

    Read the article

  • Does Hibernate always need a setter when there is a getter?

    - by Marcus
    We have some Hibernate getter methods annotated with both @Column and @Basic. We get an exception if we don't have the corresponding setter. Why is this? In our case we derive the value returned from the getter (to be stored in the DB) and the setter has no functional purpose, so we just have an empty method to get around the error condition.

    Read the article

  • Simple 2 column CSS layout with nested divs

    - by Dan
    Hello, I have good familiarity with CSS but for some reason I am unable to achieve the result I want in this case. Here is the link to my test site http://danberinger.com/test.html Keep in mind that I want a 2 column layout and that the background wrapper color (gray) is not showing. Instead the body background color image is being put in place of where the wrapper background should be. Any suggestions would be greatly appreciated.

    Read the article

  • sql to compare two strings in MS access

    - by tksy
    I am trying to compare a series of strings like the following: rodeo rodas, carrot crate, GLX GLX 1.1, GLX glxs. The comparison need not be case sensitive. I am trying to write SQL where I update the first string with the second string if they match approximately. Here, except for the second pair, all the other examples match. I would like to write a query which updates the strings except the second one. Is this possible directly in a query in Access? Thanks

    Read the article

  • regex to match letters, numbers, certain symbols

    - by Hintswen
    I need to validate a username in php, it can be: Letters (upper and lower case) Numbers Any of these symbols :.,?!@ up to 15 characters OR 16 if the last character is one of the following #$^ (it can also be 15 or less with one of these 3 characters at the end only) How do I do this?

    Read the article
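
    One reading of those rules as a regular expression, sketched in Python (an interpretation, not the accepted answer; in PHP the same pattern would be passed to preg_match): 1 to 15 characters from the allowed set, optionally followed by a single trailing #, $ or ^, giving a maximum length of 16.

        import re

        USERNAME_RE = re.compile(r"^[A-Za-z0-9:.,?!@]{1,15}[#$^]?$")

        tests = ["alice", "bob!42", "a" * 15 + "#", "a" * 16, "bad#middle"]
        for name in tests:
            print(name, bool(USERNAME_RE.match(name)))   # the last two should fail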

  • Has the role of the Business Analyst become redundant on true Agile projects?

    - by Joanne
    On a truly agile project where the business is performing the role of the product owner, is there still a role for the Business Analyst? The product owner would do the functional testing as soon as the user story is developed, and would document and prioritise the user stories. In this case (which, I must add, I haven't experienced yet), and with high-performing, self-motivated developers, I am struggling to see the role of the traditional business analyst.

    Read the article

  • Can't find mistake -- 'segmentation fault' - in C

    - by Mosh
    Hello all! I wrote this function but can't pin down the problem that gives me a 'segmentation fault' message. Thank you for any help, guys!
    /* This function extracts all header files in a *.c1 file */
    void includes_extractor(FILE *c1_fp, char *c1_file_name, int c1_file_str_len)
    {
        int i = 0;
        FILE *c2_fp, *header_fp;
        char ch, *c2_file_name, header_name[80]; /* we can assume line length 80 chars MAX */
        char inc_name[] = "include";
        char inc_chk[INCLUDE_LEN+1];             /* INCLUDE_LEN is defined | +1 for null */

        /* making the c2 file name */
        c2_file_name = (char *) malloc((c1_file_str_len) * sizeof(char));
        if (c2_file_name == NULL) {
            printf("Out of memory !\n");
            exit(0);
        }
        strcpy(c2_file_name, c1_file_name);
        c2_file_name[c1_file_str_len-1] = '\0';
        c2_file_name[c1_file_str_len-2] = '2';

        /* Open source & destination files + ERR check */
        if (!(c1_fp = fopen(c1_file_name, "r"))) {
            fprintf(stderr, "\ncannot open *.c1 file !\n");
            exit(0);
        }
        if (!(c2_fp = fopen(c2_file_name, "w+"))) {
            fprintf(stderr, "\ncannot open *.c2 file !\n");
            exit(0);
        }

        /* next code lines copy char by char from c1 to c2, but if we meet a header file, copy its content */
        ch = fgetc(c1_fp);
        while (!feof(c1_fp)) {
            i = 0;                                   /* zero i */
            if (ch == '#')                           /* potential #include case */
            {
                fgets(inc_chk, INCLUDE_LEN+1, c1_fp);   /* 8 places for "include" + null */
                if (strcmp(inc_chk, inc_name) == 0)     /* case #include */
                {
                    ch = fgetc(c1_fp);
                    while (ch == ' ')                   /* stop when head with a '<' or '"' */
                    {
                        ch = fgetc(c1_fp);
                    }                                   /* while(2) */
                    ch = fgetc(c1_fp);                  /* start read header file name */
                    while ((ch != '"') || (ch != '>'))  /* until we get the end of header name */
                    {
                        header_name[i] = ch;
                        i++;
                        ch = fgetc(c1_fp);
                    }                                   /* while(3) */
                    header_name[i] = '\0';              /* close the header_name array */
                    if (!(header_fp = fopen(header_name, "r")))   /* open *.h for read + ERR chk */
                    {
                        fprintf(stderr, "cannot open header file !\n");
                        exit(0);
                    }
                    while (!feof(header_fp))            /* copy header file content to *.c2 file */
                    {
                        ch = fgetc(header_fp);
                        fputc(ch, c2_fp);
                    }                                   /* while(4) */
                    fclose(header_fp);
                }
            }                                           /* frst if */
            else {
                fputc(ch, c2_fp);
            }
            ch = fgetc(c1_fp);
        }                                               /* while(1) */
        fclose(c1_fp);
        fclose(c2_fp);
        free(c2_file_name);
    }

    Read the article

  • MongoDB ruby dates

    - by MB
    I have a collection with an index on :created_at (which in this particular case should be a date). From Rails, what is the proper way to save an entry and then retrieve it by the date? I'm trying something like: Model: field :created_at, :type => Time script: Col.create(:created_at => Time.parse(another_model.created_at).to_s and Col.find(:all, :conditions => { :created_at => Time.parse(same thing) }) and it's not returning anything

    Read the article
