Search Results

Search found 11100 results on 444 pages for 'xt 20'.

Page 281/444

  • MySQL Query to retrieve random combinations from two tables.

    - by Michael
    Alright, here is my issue: I have two tables, one named firstnames and the other named lastnames. I am trying to generate 100 name combinations from these tables for test data. The firstnames table has 5494 entries in a single column, and the lastnames table has 88799 entries in a single column. The only query I have been able to come up with that gives any results is:

        select * from (select * from firstnames order by rand()) f
        LEFT JOIN (select * from lastnames order by rand()) l on 1=1 limit 10;

    The problem with this query is that it selects one firstname and pairs it with every lastname that could go with it. While this is plausible, I would have to set the limit to 500000000 in order to get all the possible combinations instead of only 20 first names (and I'd rather not kill my server). However, I only need 100 randomly generated entries for test data, and I will not be able to get that with this code. Can anyone please give me any advice?

    Read the article
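
    For test data, one alternative to a pure-SQL cross join is to pull a small random sample from each table with two cheap queries and pair the results in application code. A minimal sketch in Python, with placeholder name lists standing in for the query results (the lists are the only assumption here):

        import random

        # Placeholders: in practice these would come from
        #   SELECT firstname FROM firstnames ORDER BY RAND() LIMIT 100
        # and the equivalent query against lastnames.
        firstnames = ['Alice', 'Bob', 'Carol', 'Dave']
        lastnames = ['Smith', 'Jones', 'Brown', 'Taylor']

        # 100 random pairings; duplicates are possible, which is usually fine for test data.
        combinations = [(random.choice(firstnames), random.choice(lastnames))
                        for _ in range(100)]
        for first, last in combinations[:5]:
            print(first, last)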

  • Problem with onsubmit ID

    - by Liso22
    I need an onsubmit handler to return false if what's written in the first field equals a certain PHP value, but the field has a strange id format and I'm not quite sure how to reference it in the onsubmit. This is the form (I didn't add the element ID to it):

        <form class="questionform" name="questionform-0" id="questionform-0"
              onsubmit="if (document.getElementById('').value == '<?php echo $casi; ?>') return false;" >
        <textarea class="question-box" style="width:97%;" cols="20" rows="4"
                  id="question-box-' . $questionformid . '" name="title" type="text"
                  maxlength="80" size="28" value=""></textarea>

    I tried many times but couldn't get it working. How should I do it? Thanks

    Read the article

  • php array count

    - by user295189
    I have a var_dump of my SQL query result, shown below. I want to count how many rows in the array have myID = 5. How would I do that? I am using PHP. Thanks in advance.

        array
          0 = object(stdClass)[17]
              public 'myID' => string '5' (length=1)
              public 'data' => string '123' (length=3)
          1 = object(stdClass)[18]
              public 'myID' => string '5' (length=1)
              public 'data' => string '123' (length=3)
          2 = object(stdClass)[19]
              public 'relativeTypeID' => string '2' (length=1)
              public 'data' => string '256' (length=3)
          3 = object(stdClass)[20]
              public 'myID' => string '4' (length=1)
              public 'data' => string '786' (length=3)
          object(stdClass)[21]
              public 'myID' => string '4' (length=1)
              public 'data' => string '786' (length=3)

    Read the article
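
    The count itself is a single pass over the rows. Sketched here in Python for illustration (the question is PHP, where a foreach or array_filter over the result objects does the same thing); the rows list is a stand-in for the dumped result set:

        # Stand-in for the var_dump'd result set above.
        rows = [
            {'myID': '5', 'data': '123'},
            {'myID': '5', 'data': '123'},
            {'relativeTypeID': '2', 'data': '256'},
            {'myID': '4', 'data': '786'},
            {'myID': '4', 'data': '786'},
        ]
        count = sum(1 for row in rows if row.get('myID') == '5')
        print(count)   # 2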

  • Mechanize Submit Form Error: Insufficient items with name '10427'

    - by maneh
    I'm trying to submit a form with Mechanize. I have tried different ways, but the problem persists. Can anyone help me on this? Thank you in advance! This is the form I want to submit: http://www.stpairways.st/ and this is the code that I'm using:

        def stp_airways(url):
            import re
            import mechanize
            br = mechanize.Browser()
            br.open(url)
            print br.title()
            br.select_form(name = "frmbook")
            br.form['TypeTrajet'] = ["1"]
            br.form['id_depart'] = ["11967"]
            br.form['id_arrivee'] = ["10427"]
            br.form['txtDateAller'] = "5/7/2014"
            br.form['txtDateRetour'] = "12/7/2014"
            br.form['TypePassager1u1000r0b1'] = ["1"]
            br.form['TypePassager2u1000r0b1'] = ["0"]
            br.form['TypePassager3u1000r0b1'] = ["0"]
            br.form['CodeIsoDeviseClient'] = ["17,20,23,24,25,26,27,28,29,30,31,33,34,36,37,64,65,67,68,70,73,80,81,95,96,103,147,151,152,159,160,162,169,170TP1TPF"]
            br.form['CodeIsoDeviseClient'] = ["EUR"]
            # submit
            response1 = br.submit()
            print response1.read()

    Read the article
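
    Mechanize typically raises the "Insufficient items" message when a list control is assigned a value that is not among its options, so a first step is to see what options the form actually exposes. A short sketch of that check (it assumes the form name from the question, a reachable site, and Python 2 print statements to match the posted code):

        import mechanize

        br = mechanize.Browser()
        br.open("http://www.stpairways.st/")
        br.select_form(name="frmbook")
        # Print every control of the selected form; the repr of a list control shows
        # its valid items, which can be compared against values such as "10427".
        for control in br.form.controls:
            print control.type, control.name
            print control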

  • Create a JavaScript Chrome extension that stops executing after 6 months

    - by user1907657
    I have just started learning programming and I would like to turn a script into a Chrome extension. It's a basic script, and I hope to practice more, develop bigger projects and set myself bigger tasks. This script has to do the following:

    1. reload a page every 20 seconds (say google.com)
    2. after 6 months the script must not run (maybe prompt a window saying "it's over 6 months")

    The code should be able to go into a small Chrome extension, and the 6-month time period should be absolute, not relative to the time the script was started; for example, should the browser crash and I have to turn the extension on again, it should not restart the 6-month counter. Also, can anyone recommend any good sources for learning JavaScript (preferably books; nothing I read online ever seems to stick)?

    Read the article
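
    The key design point is that the cut-off is an absolute date baked into the code, not a counter started when the extension first runs, so crashes and re-installs change nothing. The check itself is tiny; a sketch in Python purely for illustration (the extension would do the same comparison in JavaScript with Date, and the expiry date below is just an example):

        from datetime import datetime

        EXPIRY = datetime(2025, 6, 1)   # absolute deadline, hard-coded

        def should_run():
            if datetime.now() >= EXPIRY:
                print("it's over 6 months")   # the prompt described in the question
                return False
            return True

        if should_run():
            pass   # here: the reload-every-20-seconds behaviour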

  • render json data returned from mvc controller

    - by user1765862
    I have a JS function that calls an MVC controller action method, which returns a list of data as JSON:

        function FillCountryCities(countryId) {
            $.ajax({
                type: 'GET',
                url: '/User/FillCityCombo',
                data: { countryId: countryId },
                contentType: 'application/json',
                success: function (data) {
                    alert(data[0].Name);
                }
                error: function () {
                    alert('something bad happened');
                }
        ....

    The format of the data sent back from the controller is Name (string) and Id (Guid). Now I just want to alert the Name of the first item in the collection on success. I double-checked: the controller sends 20 records, so it should alert the first one from the collection, but instead I get the error alert "something bad happened". Update:

        public JsonResult FillCityCombo(Guid countryId)
        {
            var cities = repository.GetData()
                .Where(x => x.Country.Id == countryId).ToList();
            if (Request.IsAjaxRequest())
            {
                return new JsonResult { Data = cities, JsonRequestBehavior = JsonRequestBehavior.AllowGet };
            }
            else
            {
                return new JsonResult { Data = "Not Valid Request", JsonRequestBehavior = JsonRequestBehavior.AllowGet };
            }
        }

    Read the article

  • Seeking assistance with Escaping Data for MySQL queries

    - by JM4
    Please don't send me a link to php.net referencing mysql_real_escape_string as the only response. I have read through the page, and while I understand the general concepts, I am having some trouble because of how my INSERT statement is currently built. Today I am using the following:

        $sql = "INSERT INTO tablename VALUES ('',
            '$_SESSION['Member1FirstName'],
            '$_SESSION['Member1LastName'],
            '$_SESSION['Member1ID'],
            '$_SESSION['Member2FirstName'],
            '$_SESSION['Member2LastName'],
            '$_SESSION['Member2ID']
        ....

    and the list goes on for 20+ members, with some other values entered. It seems most people in the examples already have all their data stored in an array. On my site I accept form inputs, action="" is set to self, PHP validation takes place, and if validation passes the data is stored in SESSION variables on page 2, then redirected to the next page in the process (page 3) (approximately 8-10 pages in the whole process). Thanks!

    Read the article
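
    Whatever the escaping function, the usual alternative to interpolating session values into the SQL string is parameter binding: placeholders in the statement, values passed separately, so the driver handles the quoting. A minimal sketch of the idea using Python's DB-API with sqlite3 (chosen only so the example runs anywhere; MySQLi or PDO prepared statements are the PHP equivalents, and the table and column names are made up):

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE members (first TEXT, last TEXT, member_id TEXT)")

        # Stand-ins for the $_SESSION values from the question.
        session = {'Member1FirstName': "O'Neil", 'Member1LastName': 'Smith', 'Member1ID': '42'}

        # Placeholders instead of string interpolation: no manual escaping needed.
        conn.execute("INSERT INTO members VALUES (?, ?, ?)",
                     (session['Member1FirstName'], session['Member1LastName'], session['Member1ID']))
        print(conn.execute("SELECT * FROM members").fetchone())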

  • Take advantage of multiple cores executing SQL statements

    - by willvv
    I have a small application that reads XML files and inserts the information into a SQL DB. There are ~300,000 files to import, each one with ~1,000 records. I started the application on 20% of the files and it has been running for 18 hours now; I hope I can improve this time for the rest of the files. I'm not using a multi-threaded approach, but since the computer I'm running the process on has 4 cores, I was thinking of doing so to get some improvement in performance (although I guess the main problem is the I/O, not just the processing). I was thinking of using the BeginExecuteNonQuery() method on the SqlCommand object I create for each insertion, but I don't know if I should limit the maximum number of simultaneous threads (nor do I know how to do it). What's your advice for getting the best CPU utilization? Thanks

    Read the article
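
    On capping the number of simultaneous operations: the common pattern is a fixed-size worker pool, so no more than N imports are in flight no matter how many files are queued. A sketch of the pattern in Python (the .NET equivalent would bound the outstanding BeginExecuteNonQuery calls, for example with a semaphore; the file names and worker count below are illustrative):

        from concurrent.futures import ThreadPoolExecutor

        def import_file(path):
            # Placeholder for: parse one XML file and run the INSERTs for its ~1,000 records.
            return path

        files = ["file_%05d.xml" % i for i in range(100)]   # stand-in for the real file list

        # max_workers caps concurrency; with I/O-bound work a few more workers than
        # cores can help, but the database itself is often the real bottleneck.
        with ThreadPoolExecutor(max_workers=4) as pool:
            for _ in pool.map(import_file, files):
                pass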

  • Is it the filename or the whole URL used as a key in browser caches?

    - by Richard Turner
    It's common to want browsers to cache resources - JavaScript, CSS, images, etc. until there is a new version available, and then ensure that the browser fetches and caches the new version instead. One solution is to embed a version number in the resource's filename, but will placing the resources to be managed in this way in a directory with a revision number in it do the same thing? Is the whole URL to the file used as a key in the browser's cache, or is it just the filename itself and some meta-data? If my code changes from fetching '/r20/example.js' to '/r21/example.js', can I be sure that revision 20 of example.js was cached, but now revision 21 has been fetched instead and it is now cached?

    Read the article

  • Permissions for Large Variables to Be Sent Via Stored Procedures (SQL Server)

    - by Joe Majewski
    I can't figure out a way to allow more than 4000 bytes to be received at once via a call to a stored procedure. I am storing images in the table that are around 15 - 20 kilobytes each, but upon getting them and displaying them to the page, they are always exactly 3.91 KB in size (or 4000 bytes). Do stored procedures have a limit on how much data can be sent at once? I double-checked my data, and I am indeed only receiving the first 4000 characters from the varbinary(MAX) field. Is there a permission setting to allow more than 4k bytes at once?

    Read the article

  • UITabBar set title color, iPhone

    - by FirstTimer
    I am trying to change the color of the tab bar item's text programmatically. I am using:

        [[UITabBar appearance] setTitleTextAttributes:[NSDictionary dictionaryWithObjectsAndKeys:
            [UIFont fontWithName:@"AmericanTypewriter" size:20.0f], UITextAttributeFont,
            [UIColor blackColor], UITextAttributeTextColor,
            [UIColor grayColor], UITextAttributeTextShadowColor,
            [NSValue valueWithUIOffset:UIOffsetMake(0.0f, 1.0f)], UITextAttributeTextShadowOffset,
            nil]];

    which should work on iOS 5 and above. But my app crashes with this error in the console:

        *** Terminating app due to uncaught exception 'NSInvalidArgumentException',
        reason: '-[_UIAppearance setTitleTextAttributes:]: unrecognized selector sent to instance 0x79f5790'
        *** First throw call stack:

    I'm not sure why I am getting a crash. Also, please suggest if there is any other way to change the font color of the tab bar items. Thanks

    Read the article

  • What is the difference between these JavaScript approaches?

    - by dhaliwaljee
    <input type='text' id='txt' name='txtName' size='20' value='testing'/>
        <script type="text/javascript" language='javascript'>
            var val = document.getElementsByName('txtName');
            alert(val[0].value);
            alert(window.txtName.value);
        </script>

    In the above code we use alert(val[0].value); and alert(window.txtName.value); as two ways of getting the value from the object. What is the difference between the two ways, and which one is best?

    Read the article

  • Choose between multiple options with defined probability

    - by Sijin
    I have a scenario where I need to show a different page to a user for the same URL based on a probability distribution, so e.g. for 3 pages the distribution might be:

        page 1 - 30% of all users
        page 2 - 50% of all users
        page 3 - 20% of all users

    When deciding what page to load for a given user, what technique can I use to ensure that the overall distribution matches the above? I am thinking I need a way to choose an object at "random" from a set X { x1, x2....xn }, except that instead of all objects being equally likely, the probability of an object being selected is defined beforehand.

    Read the article
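
    One standard technique is to draw a uniform random number and walk the cumulative distribution; in Python 3.6+ this is built in as random.choices. A minimal sketch (the page names and weights are taken from the question):

        import random
        from collections import Counter

        pages = ["page1", "page2", "page3"]
        weights = [0.30, 0.50, 0.20]

        # One request:
        chosen = random.choices(pages, weights=weights, k=1)[0]
        print(chosen)

        # Sanity check over many simulated requests: proportions approach 30/50/20.
        print(Counter(random.choices(pages, weights=weights, k=100000)))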

  • iphone - Images (slide show) and audio synchronization

    - by Qaiser
    I have 20 images and some audio. I would like to show a single image at a time and change the images at (unequal) intervals. For example, I want to show image 1 for 1.44 seconds and image 2 for 1.67 seconds, and so on. Can someone suggest how to go about doing this, please? What I have seen are examples that show how to set up an array of images with one field that denotes the total time, which causes each image to show for an equal amount of time... and that's not what I am looking for...

    Read the article

  • MVC: should more specific models be populated by more precise queries too?

    - by KevinUK
    If you have a Car model with 20 or so properties (and several table joins) for a carDetail page then your LINQ to SQL query will be quite large. If you have a carListing page which uses under 5 properties (all from 1 table) then you use a CarSummary model. Should the CarSummary model be populated using the same query as the Car model? Or should you use a separate LINQ to SQL query which would be more precise? I am just thinking of performance but LINQ uses lazy loading anyway so I am wondering if this is an issue or not.

    Read the article

  • vb.net checkboxes. Need to populate from database and also help in designing

    - by redr
    I have this requirement and, since I'm new to VB.NET, I don't really have much of an idea how to do this. I have 20 checkboxes, each with a dropdown and a textbox next to it. The layout is roughly:

        <table>
          <tr><td> checkbox -- textbox -- dropdownlist </td></tr>
          <tr><td> chk1     txtbox1     ddl1           </td></tr>
          <tr><td> chk2     txtbox2     ddl2           </td></tr>
        </table>

    and so on. The above structure should be in one row of a table. Does anyone know how to build this in code (recursively), and also how to take the checkbox data from here and send it to a DB table for record insert, update and select? Thanks

    Read the article

  • app engine's back referencing is too slow. How can I make it faster?

    - by Ray Yun
    Google App Engine has a smart feature named back references, and I usually iterate over them where a traditional SQL computed column would be used. Imagine needing to accumulate a specific force's total hp:

        class Force(db.Model):
            hp = db.IntegerProperty()

        class UnitGroup(db.Model):
            force = db.ReferenceProperty(reference_class=Force, collection_name="groups")
            hp = db.IntegerProperty()

        class Unit(db.Model):
            group = db.ReferenceProperty(reference_class=UnitGroup, collection_name="units")
            hp = db.IntegerProperty()

    When I coded it as follows, it was horribly slow (almost 3 s) with 20 forces, each with a single group containing a single unit. (I guess back-referencing forces a reload of the sub-entities. Am I right?)

        def get_hp(self):
            hp = 0
            for group in self.groups:
                group_hp = 0
                for unit in group.units:
                    group_hp += unit.hp
                hp += group_hp
            return hp

    How can I optimize this code? Please consider that there are more properties to be computed for each force/unit-group, and I don't want to save these collective properties on each entity. :)

    Read the article
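
    The nested back-reference loops issue one datastore query per group on top of one for the groups themselves, which is where most of the 3 s goes. One way to cut the round trips, assuming the models above and the old db API's IN filter (which fans out to at most 30 keys per query), is to fetch all units of a force's groups in a single query; a sketch only, not tested:

        def get_hp(force):
            groups = force.groups.fetch(1000)                   # 1 query for the groups
            group_keys = [g.key() for g in groups]
            units = Unit.all().filter('group IN', group_keys)   # 1 (fanned-out) query for the units
            return sum(unit.hp for unit in units)

    Caching the computed total in memcache is another common option when denormalized properties are off the table.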

  • Good assembly tutorial for Windows with either FASM or NASM

    - by The new guy
    I have spent the last 3 hours on Google and Bing looking for a good asm tutorial (x86 variety). I have been on this site for many of the results, but none have really been sufficient, mainly because while I like learning theory, I learn best via practice, whereas, for example, pc-asm has about 20-30 pages of theory before anything practical. I was wondering if there is a tutorial or online PDF (or cheap book, under £20) that suits my style of learning. Please say if this is not possible. Thank you for your time

    Read the article

  • PHP Mail Function

    - by vgathan
    Is it possible to send mail in PHP without any external packages or tools? If so, is there any requirement to configure the php.ini file? My code is as follows:

        $to = $row->EMail_ID;
        $subject = "Reset your password";
        $body = "Hi ".$row->Username.", \n\t\t\tA request to reset your password was received from you. \n\n\n";
        $headers = "From: [email protected]\r\n"."X-Mailer: php/";
        mail($to, $subject, $body, $headers);

    The error I get:

        Warning: mail() [function.mail]: Failed to connect to mailserver at "localhost" port 25,
        verify your "SMTP" and "smtp_port" setting in php.ini or use ini_set()
        in F:\wamp\www\pwd.php on line 20

    Read the article

  • Apriori Algorithm- what to do with small min.support?

    - by user3707650
    I have a question about the table beneath my question. If I am told that the given min. support = 10%, how do I know what support count to use during the exercise? What I know is that you take the number of transactions (8) and multiply it by the min. support: 8 * (10/100) = 0.8. The problem is that I get this number, 0.8; how can I use it as a support count in this example? 0.8 is a number that will make me prune every combination set that I build... please help me!!!

        TID  A B C D E F G
        10   1 0 1 0 0 0 1
        20   1 1 1 1 0 1 1
        30   0 0 0 0 0 0 1
        40   0 0 1 0 0 1 1
        50   0 0 0 1 1 0 0
        60   0 1 1 0 1 1 0
        70   0 0 0 0 1 1 0
        80   0 0 1 0 1 1 1

    Read the article
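
    The usual convention is that an itemset is frequent when its absolute support count is at least N x min_support; since counts are whole transactions, the effective threshold is the ceiling of that product. A quick check of the arithmetic:

        import math

        n_transactions = 8
        min_support = 0.10
        threshold = n_transactions * min_support   # 0.8
        min_count = int(math.ceil(threshold))      # 1: an itemset is frequent if it appears
        print(min_count)                           # in at least 1 of the 8 transactions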

  • How do I split the output from mysqldump into smaller files?

    - by lindelof
    I need to move entire tables from one MySQL database to another. I don't have full access to the second one, only phpMyAdmin access, and I can only upload (compressed) SQL files smaller than 2 MB, but the compressed output of a mysqldump of the first database's tables is larger than 10 MB. Is there a way to split the output from mysqldump into smaller files? I cannot use split(1), since I cannot cat(1) the files back together on the remote server. Or is there another solution I have missed?

    Edit: The --extended-insert=FALSE option to mysqldump suggested by the first poster yields a .sql file that can then be split into importable files, provided that split(1) is called with a suitable --lines option. By trial and error I found that bzip2 compresses the .sql files by a factor of 20, so I needed to figure out how many lines of SQL code correspond roughly to 40 MB.

    Read the article
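
    If split(1) is not available, the same line-based chunking takes only a few lines of script; a minimal sketch in Python (the filenames and chunk size are illustrative, and it assumes --extended-insert=FALSE so every INSERT sits on its own line and each chunk stays valid SQL):

        lines_per_chunk = 500000   # tune so each chunk compresses to under 2 MB
        chunk = count = 0
        out = open('dump_part_%03d.sql' % chunk, 'w')
        with open('dump.sql') as dump:
            for line in dump:
                out.write(line)
                count += 1
                if count >= lines_per_chunk:
                    out.close()
                    chunk += 1
                    count = 0
                    out = open('dump_part_%03d.sql' % chunk, 'w')
        out.close()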

  • [CakePHP] Pagination after inserting or updating record

    - by user198003
    One more question related to CakePHP... Let's say that I have 20+ records in my table, sorted by some criterion, e.g. by title. The list view shows 10 records per page, with pagination. How can I arrange that, when I insert a new record, I am redirected to the proper page, where I can see the record that was just inserted? How can I work out which page I have to be redirected to? Hope my question is clear enough... thanks in advance!

    Read the article
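
    The target page can be derived from the new record's rank in the sort order: count how many records sort before it, add one, and divide by the page size. The arithmetic is framework-independent; a tiny sketch in Python (the function name and the 10-per-page default are illustrative, not CakePHP API):

        import math

        def page_for(rank, per_page=10):
            # rank = (number of records sorting before the new record) + 1
            return int(math.ceil(rank / float(per_page)))

        print(page_for(14))   # a record ranked 14th lands on page 2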

  • Xpath help function

    - by NA
    Hi, I have a document from which I am trying to extract a date, but the problem is that within the node, along with the date, there is some text too. Something like:

        <div class="postHeader">
            Posted on July 20, 2009 9:22 PM PDT
        </div>

    From this tag I just want the date, not the "Posted on" text. Something like ./xhtml:div[@class = 'postHeader'] gets everything. To be precise, the document I have is basically a node list of these elements, e.g. I will get 10 such nodes with different date values. Worse, sometimes other random tags, like anchors, also pop up inside these tags. Can I write a universal XPath that will just get the date out of the div tag?

    Read the article
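
    XPath 1.0 alone can only approximate this (for example with substring-after on the text), so post-processing the node text is often simpler. A minimal sketch with lxml and a regular expression; the sample markup and the assumed "Month DD, YYYY" date format come from the question, and a real XHTML document would need a namespaces= mapping in the xpath() call:

        import re
        from lxml import html

        doc = html.fromstring(
            '<div class="postHeader"> Posted on July 20, 2009 9:22 PM PDT </div>')
        for node in doc.xpath("//div[@class='postHeader']"):
            text = node.text_content()               # flattens nested tags such as anchors
            match = re.search(r'[A-Z][a-z]+ \d{1,2}, \d{4}', text)
            if match:
                print(match.group(0))                # July 20, 2009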

  • What's the fastest way to get directory and subdirs size on unix using Perl?

    - by ivicas
    I am using the Perl stat() function to get the size of a directory and its subdirectories. I have a list of about 20 parent directories which have a few thousand recursive subdirectories, and every subdirectory has a few hundred files. The main computing part of the script looks like this:

        sub getDirSize {
            my $dirSize = 0;
            my @dirContent = <*>;
            my $sizeOfFilesInDir = 0;
            foreach my $dirContent (@dirContent) {
                if (-f $dirContent) {
                    my $size = (stat($dirContent))[7];
                    $dirSize += $size;
                }
                elsif (-d $dirContent) {
                    $dirSize += getDirSize($dirContent);
                }
            }
            return $dirSize;
        }

    The script has been executing for more than one hour and I want to make it faster. I tried the shell du command, but the output of du (converted to bytes) is not accurate, and it is also quite time consuming. I am working on HP-UX 11i v1.

    Read the article

  • scikit-learn ExtraTreesClassifier hanging

    - by denson
    I'm running scikit-learn on some rather large training datasets: ~1,600,000,000 rows with ~500 features. The platform is Ubuntu Server 14.04, and the hardware has 100 GB of RAM and 20 CPU cores. The test datasets are about half as many rows. I set n_jobs = 10, and forest_size = 3 * number_of_features, so about 1700 trees. If I reduce the number of features to about 350 it works fine, but it never completes the training phase with the full feature set of 500+. The process is still executing and using about 20 GB of RAM, but 0% of the CPU. I have also successfully completed training on datasets with ~400,000 rows but twice as many features, which finishes after only about 1 hour. I am being careful to delete any arrays/objects that are not in use. Does anyone have any ideas I might try?

    Read the article
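
    Two cheap things to try, sketched below with a stand-in dataset: turn on verbose output so it is visible whether tree building has stalled or is still progressing, and feed float32 data to roughly halve the memory the trees are built from (n_estimators and n_jobs mirror the question; everything else is illustrative):

        import numpy as np
        from sklearn.ensemble import ExtraTreesClassifier

        # Stand-in data; the real X would be the ~500-feature matrix cast to float32.
        X = np.random.rand(10000, 500).astype(np.float32)
        y = np.random.randint(0, 2, size=10000)

        clf = ExtraTreesClassifier(
            n_estimators=1700,   # ~3 * number_of_features, as in the question
            n_jobs=10,
            verbose=2,           # per-tree progress; long silences with 0% CPU point at memory pressure
        )
        clf.fit(X, y)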
