Search Results

Search found 1968 results on 79 pages for 'pickle dump'.

Page 51 of 79

  • SQL LEFT JOIN help

    - by Stolz
    My scenario: there are three tables for storing TV show information: season, episode and episode_translation.
    My data: there are three seasons, with three episodes each, but only one episode has a translation.
    My objective: I want to get a list of all the seasons and episodes for a show. If a translation is available in the specified language, show it; otherwise show NULL.
    My attempt to get series 1 information in language 1:

        SELECT season_number AS season, number AS episode, name
        FROM season
        NATURAL JOIN episode
        NATURAL LEFT JOIN episode_trans
        WHERE id_serie = 1 AND id_lang = 1
        ORDER BY season_number, number

    Result:

        +--------+---------+--------------------------------+
        | season | episode | name                           |
        +--------+---------+--------------------------------+
        |      3 |       3 | Episode translated into lang 1 |
        +--------+---------+--------------------------------+

    Expected result:

        +--------+---------+--------------------------------+
        | season | episode | name                           |
        +--------+---------+--------------------------------+
        |      1 |       1 | NULL                           |
        |      1 |       2 | NULL                           |
        |      1 |       3 | NULL                           |
        |      2 |       1 | NULL                           |
        |      2 |       2 | NULL                           |
        |      2 |       3 | NULL                           |
        |      3 |       1 | NULL                           |
        |      3 |       2 | NULL                           |
        |      3 |       3 | Episode translated into lang 1 |
        +--------+---------+--------------------------------+

    Full DB dump: http://pastebin.com/Y8yXNHrH
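
    A likely explanation is that the WHERE clause's id_lang = 1 test runs after the LEFT JOIN and discards the NULL-extended rows, effectively turning it back into an inner join. A hedged sketch of the usual fix, moving the language test into the join condition; the explicit join columns (id_season, id_episode) and the table holding id_serie are guesses, since the original relies on NATURAL joins:

        SELECT s.season_number AS season, e.number AS episode, t.name
        FROM season s
        JOIN episode e
             ON e.id_season = s.id_season
        LEFT JOIN episode_trans t
             ON t.id_episode = e.id_episode
            AND t.id_lang = 1            -- language filter belongs in the ON clause
        WHERE s.id_serie = 1             -- series filter can stay in WHERE (left side)
        ORDER BY s.season_number, e.number;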

    Read the article

  • Best practice to send secure information over e-mail?

    - by Zolomon
    I have to send sensitive information (name, address, social security number, etc.) that a user has entered on a website to an e-mail address. What is the best course of action to keep the information secure and easy to extract on the receiver's side?
    Edit: I will be using ASP.NET for the website; I'm not sure what capabilities it has in this area.
    Edit: Would it be better to store the information in a database, send a mail only when a new entry has been made, and create some secure way to dump the information instead?

    Read the article

  • `php -v` segmentation fault

    - by John
    I'm getting an odd segmentation fault in PHP. Every few times, when I run:

        php -v

    I see:

        PHP 5.2.6 (cli) (built: Aug 19 2009 16:59:56)
        Copyright (c) 1997-2008 The PHP Group
        Zend Engine v2.2.0, Copyright (c) 1998-2008 Zend Technologies
        Segmentation fault (core dumped)

    Analyzing the core dump (backtrace with gdb):

        #0  0x00002ba6412f6c6c in ?? ()
        #1  0x0000003f90c06367 in start_thread () from /lib64/libpthread.so.0
        #2  0x0000003f904d2f7d in clone () from /lib64/libc.so.6
        #3  0x0000000000000000 in ?? ()

    Any ideas?
    OS: Linux version 2.6.18-92.el5 ([email protected]) (gcc version 4.1.2 20071124 (Red Hat 4.1.2-42)) #1 SMP Tue Jun 10 18:51:06 EDT 2008

    Read the article

  • How to combine "|" character in run () command in powerbuilder in order to read an txt file as metad

    - by sgian76
    Could you please tell me how to use "pdftk mypdf.pdf dump_data | findstr NumberOfPages" in PowerBuilder's Run() command and save this metadata to a file? I'm using code like this:

        string ls_runinput, ls_outputfile
        ls_outputfile = "c:\test.exe"
        ls_runinput = "c:\pdftk\pdftk.exe mypdf.pdf dump_data | findstr NumberOfPages >" + ls_outputfile
        Run(ls_runinput, Minimized!)
        li_fileopen = FileOpen(ls_outputfile, TextMode!, Read!, Shared!)

    The problem is that the Run command executes and the file is created, but FileOpen returns -1. Could it be that Run cannot recognize the "|" character? How should I write this correctly? I am using PowerBuilder 10.5.2. Thanks very much in advance.
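
    Pipes and redirection are features of the command shell, not of pdftk itself, so when Run() launches pdftk.exe directly the "|" and ">" are just passed along as ordinary arguments. An untested sketch of the usual workaround, wrapping the whole pipeline in cmd.exe /c (the output path is a placeholder); note also that Run() returns immediately, so the FileOpen() may need to wait or retry until the command has finished writing the file:

        string ls_outputfile, ls_runinput
        ls_outputfile = "c:\pagecount.txt"
        // cmd.exe interprets the pipe and the redirection; pdftk alone does not
        ls_runinput = "cmd.exe /c c:\pdftk\pdftk.exe mypdf.pdf dump_data | findstr NumberOfPages > " + ls_outputfile
        Run(ls_runinput, Minimized!)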

    Read the article

  • I need to debug my BrowserHelperObject (BHO) (in C++ with Visual Studio 2008) after an Internet Explorer 8 crash in Release mode

    - by BHOdevelopper
    Hi, here is the situation: I'm developing a Browser Helper Object (BHO) in C++ with Visual Studio 2008, and I learned that memory isn't managed the same way in Debug mode as in Release mode. When I run my BHO in Debug mode, Internet Explorer 8 works just fine; I get no errors at all and the browser stays alive forever. But as soon as I compile it in Release mode, I get no errors, no message, nothing, yet after 5 minutes I can see through Task Manager that the Internet Explorer instances are just eating memory, and then the browser stops responding every time. I really need a hint on how to get feedback about what the error could be. I've heard this often happens because of memory mismanagement. I need a tool that grabs a memory dump or something when iexplore.exe crashes, to help me find the problem. Any help is appreciated; I'll be looking for responses every single day. Thank you.
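
    One option, sketched below under the assumption that a crash (rather than just memory growth) eventually occurs, is to have the BHO itself write a minidump with DbgHelp's MiniDumpWriteDump when an unhandled exception reaches it; the resulting .dmp file can then be opened in Visual Studio or WinDbg. The dump path and the choice of MiniDumpNormal are arbitrary. For pure memory growth without a crash, a leak tool such as DebugDiag, or capturing a dump externally with a utility like ProcDump, may be more direct.

        // Minimal sketch (untested): write C:\temp\bho_crash.dmp when an
        // unhandled exception occurs in the process hosting the BHO.
        #include <windows.h>
        #include <dbghelp.h>
        #pragma comment(lib, "dbghelp.lib")

        static LONG WINAPI DumpOnCrash(EXCEPTION_POINTERS* pep)
        {
            HANDLE hFile = CreateFileW(L"C:\\temp\\bho_crash.dmp", GENERIC_WRITE, 0, NULL,
                                       CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
            if (hFile != INVALID_HANDLE_VALUE)
            {
                MINIDUMP_EXCEPTION_INFORMATION mei;
                mei.ThreadId = GetCurrentThreadId();
                mei.ExceptionPointers = pep;
                mei.ClientPointers = FALSE;
                MiniDumpWriteDump(GetCurrentProcess(), GetCurrentProcessId(), hFile,
                                  MiniDumpNormal, &mei, NULL, NULL);
                CloseHandle(hFile);
            }
            return EXCEPTION_EXECUTE_HANDLER;
        }

        // Install once when the BHO is loaded, e.g. in SetSite():
        //     SetUnhandledExceptionFilter(DumpOnCrash);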

    Read the article

  • How does the Mach-O loader load different NSString objects?

    - by overboming
    I know that if you define a bunch of @"" NSString objects in source code on Mac OS, these NSStrings are stored in a section of the Mach-O binary:

        Section
          sectname __ustring
           segname __TEXT
              addr 0x000b3b54
              size 0x000001b7
            offset 731988
             align 2^1 (2)
            reloff 0
            nreloc 0
             flags 0x00000000
         reserved1 0
         reserved2 0

    If I hex-dump the binary, the strings are packed one after another with a 0x0000 separator. What I want to know is: how does the loader in Mac OS X load these NSStrings when the program runs? Are they loaded simply by recognizing the 0x0000 separator, or is there a string offset table elsewhere in the binary pointing to the separate NSString objects? Thanks. (What I really want to do is increase the length of one of the NSStrings, so I have to know how the loader recognizes these separate objects.)
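
    A hedged sketch of where to look: the raw character data in __TEXT,__ustring is not what the runtime walks. Each @"" literal also gets a fixed-size constant CFString structure (class pointer, flags, a pointer into __ustring or __cstring, and an explicit length field) in the __DATA,__cfstring section, so string boundaries come from those length fields rather than from the 0x0000 separators. Both sections can be inspected with otool:

        # the per-string structures (isa, flags, data pointer, length):
        otool -v -s __DATA __cfstring /path/to/binary
        # the raw UTF-16 character data:
        otool -v -s __TEXT __ustring /path/to/binary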

    Read the article

  • How to profile object creation in Java?

    - by gooli
    The system I work with creates a whole lot of objects and garbage-collects them all the time, which results in a very steeply jagged graph of heap consumption. I would like to know which objects are being generated so I can tune the code, but I can't figure out a way to dump the heap at the moment garbage collection starts. When I tried to initiate dumpHeap via JConsole manually at random times, I always got results after the GC had finished its run and didn't get any useful data. Any notes on how to track down excessive temporary object creation are welcome.
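
    One approach, sketched below, is to trigger the dump programmatically around the suspect code path instead of racing the collector from JConsole; the HotSpotDiagnosticMXBean that backs JConsole's dumpHeap button can be called directly on a HotSpot JVM. For the "which objects are being generated" part, an allocation profiler (for example the JDK's hprof agent with heap=sites) is often the more direct tool. The file names below are placeholders.

        // Sketch: programmatic heap dumps before and after the suspect workload.
        import com.sun.management.HotSpotDiagnosticMXBean;
        import java.lang.management.ManagementFactory;

        public class HeapDumper {
            static void dump(String path) throws Exception {
                HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
                        ManagementFactory.getPlatformMBeanServer(),
                        "com.sun.management:type=HotSpotDiagnostic",
                        HotSpotDiagnosticMXBean.class);
                bean.dumpHeap(path, true);   // true = dump only live objects
            }

            public static void main(String[] args) throws Exception {
                dump("before.hprof");
                // ... run the object-churning workload here ...
                dump("after.hprof");
            }
        }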

    Read the article

  • Why is Scaffolding Not Working in Ruby on Rails?

    - by Timmy
    I created a controller and a model. The controller is called "Admin" and the model is called "Album". I edited database.yml with the proper info and ran rake db:migrate, which didn't return any errors and did migrate the database (it shows up in schema.rb). Inside the controller I wrote:

        class AdminController < ApplicationController
          scaffold :album
        end

    Next I started my server and went to http://localhost:3000/admin, but instead of seeing the typical CRUD page I get the following error:

        app/controllers/admin_controller.rb:3
        Request Parameters: None
        Show session dump
        --- flash: !map:ActionController::Flash::FlashHash {}
        Response Headers: {"cookie"=>[], "Cache-Control"=>"no-cache"}

    Any idea why?
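
    For what it's worth, if this app is on Rails 2.x, dynamic scaffolding (calling scaffold :album inside a controller) was removed in Rails 2.0, which would explain the error pointing at that line; the replacement is the scaffold generator. A sketch with placeholder attribute names (the generator will prompt before overwriting the existing model/migration):

        script/generate scaffold Album title:string artist:string
        rake db:migrate

    That creates an AlbumsController with the CRUD actions and views, so the pages appear under /albums rather than /admin.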

    Read the article

  • Copying a mysql database from localhost to remote server using mysqldump.exe

    - by Ankur
    I want to copy a MySQL database from my local computer to a remote server using the mysqldump command. All the examples on the internet suggest doing something like the following (the initial mysql> is just the prompt I get after logging in):

        mysql> mysqldump -u user -p pass myDBName | NewDBName.out;

    But when I do this I get:

        You have an error in your SQL syntax; check the manual that corresponds ... to use near 'mysqldump -u user -p pass myDBName | NewDBName.out'

    Since I have already logged in, do I need to use -u and -p? Not doing so gives me the same error. Can you see what is wrong?
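
    The sketch below assumes the usual cause: mysqldump is a standalone command-line program, so it has to be run from the operating-system shell, not from the mysql> prompt, and its output goes to a file via ">" rather than a pipe. Host and file names are placeholders; note also that -p followed by a space makes the next word a database name, so either write the password with no space (-pMyPass) or let it prompt.

        # on the local machine, from the OS shell:
        mysqldump -u user -p myDBName > myDBName.sql

        # load it into the remote server (NewDBName must already exist there):
        mysql -u user -p -h remote.example.com NewDBName < myDBName.sql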

    Read the article

  • Do I need to install GlassFish?

    - by Ayusman
    Hi, I am new to the GlassFish server and have a question about its usage: can I use GlassFish like a Tomcat server, without needing an installation? That is, can I just take a folder containing the GlassFish folders, jars, etc., drop it in some location, set up a few environment variables, and have it run, just like Tomcat? Is that possible with GlassFish? Also, does the GlassFish installation do any other background things, like creating registry entries, beyond creating the GlassFish folder structure?
    TIA, Ayusman
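
    For reference, a sketch of the ZIP-style setup (archive name and paths are placeholders): GlassFish is also distributed as a plain archive that you unpack and start with the bundled asadmin script, much like Tomcat's startup scripts, with no registry entries beyond whatever you create yourself.

        # unpack the ZIP distribution and start the default domain:
        unzip glassfish-3.1.zip -d /opt
        export JAVA_HOME=/usr/lib/jvm/java-6-sun      # placeholder path
        /opt/glassfish3/bin/asadmin start-domain domain1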

    Read the article

  • Is there a way to send tracking info to Google Analytics from PHP?

    - by seatoskyhk
    I have PHP code that returns an image. The link is given to a third party, so I need to keep track of where the PHP requests are coming from. Because the PHP only returns the image, I cannot use the JavaScript code for Google Analytics. I know I can get the information from the access.log, but I don't think I can dump the access.log into GA for analysis, right? So is there a way, in PHP (e.g. by sending a cURL request), to send something to Google Analytics for tracking?

    Read the article

  • How few files does it take to load a program on Linux?

    - by BCS
    The (hypothetical, for now) situation is that a user of my system will supply a chunk of C code, and my system needs to compile and run it in a chroot sandbox that is generated on the fly; I want to require as few files in the sandbox as possible. I'm only willing to play with compiler and linker settings (e.g. statically link everything I can expect to find) and to place some moderate restrictions on what the code can use (e.g. it can't use arbitrary libraries). The question is: how simple can I make the sandbox? Clearly I need the executable, but what about an ELF loader and a .so for the system calls? Can I drop either of them, and is there anything else I'll need?
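
    A hedged sketch of the check: if the program is fully statically linked, the kernel needs no ELF interpreter (ld-linux) and no shared objects, so the executable can be the only file in the chroot. One caveat with glibc is that some features (notably NSS name lookups) still dlopen libnss modules at run time. File names below are placeholders.

        # build a fully static binary:
        gcc -static -O2 -o sandboxed prog.c

        # verify nothing dynamic is required:
        ldd ./sandboxed          # should report "not a dynamic executable"
        readelf -l ./sandboxed   # should show no PT_INTERP program header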

    Read the article

  • Drupal 6 hook_form_FORM_ID_alter adding upload file field

    - by kristian nissen
    I'm trying to extend a form and add a file-upload field from within a module. I can see the file field just fine, but it's empty when I submit the form, even though the enctype is set:

        $form['#attributes'] = array(
          'enctype' => "multipart/form-data",
        );
        $form['file_upload'] = array(
          '#type' => 'file',
          '#title' => 'Attach Image',
        );

    My custom form submit hook:

        $form['#submit'][] = 'user_images_handler';

    It is being called, but when I dump the form, the file field is empty, and when I try to access it, it's empty as well.
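
    For what it's worth, in Drupal 6 the uploaded file does not arrive in $form_state['values']; it has to be pulled out of the request with file_save_upload() (or read from $_FILES) inside the validate or submit handler. A sketch of the submit handler under that assumption; the validators and destination are placeholders:

        <?php
        function user_images_handler($form, &$form_state) {
          $validators = array('file_validate_is_image' => array());
          if ($file = file_save_upload('file_upload', $validators, file_directory_path())) {
            // $file->filepath now points at the stored image.
            drupal_set_message(t('Saved @name', array('@name' => $file->filename)));
          }
          else {
            drupal_set_message(t('No file was uploaded.'), 'warning');
          }
        }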

    Read the article

  • How to get the Get-Item cmdlet's output into a variable as a string

    - by aeon
    When I call Get-Item on a directory, it dumps to the console like this:

        Mode                LastWriteTime         Length Name
        ----                -------------         ------ ----
        d----        2/16/2011   8:27 PM                 2011-2-16
        -a---        2/13/2011   8:24 PM     3906877184  SWP-Full Database Backup_2011-02-13 0
        -a---        2/16/2011   8:23 PM     3919766476  SWP-Full Database Backup_2011-02-16.bak 8
        -a---        2/12/2011   8:18 PM     3906877747  SWP-Full Database Backup_2011-02-12 2
        -a---        2/14/2011   8:21 PM     3875484467  SWP-Full Database Backup_2011-02-14 2

    But when I convert it to a string it becomes:

        \\192.168.2.89\BwLive\2011-2-16
        \\192.168.2.89\BwLive\SWP-Full Database Backup_2011-02-13
        \\192.168.2.89\BwLive\SWP-Full Database Backup_2011-02-16.bak
        \\192.168.2.89\BwLive\SWP-Full Database Backup_2011-02-12
        \\192.168.2.89\BwLive\SWP-Full Database Backup_2011-02-14

    The length, size and time attributes are omitted. How can I keep these attributes while converting to a string? Thanks.
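
    The sketch below assumes the cause: converting a FileInfo/DirectoryInfo object to a string just calls its ToString(), which yields the path, while the table on the console comes from PowerShell's formatting system. To keep the table layout as text, run the output through Format-Table and Out-String; to keep the attributes as data, select the properties instead. The UNC path is taken from the question; Get-ChildItem is used because the output shown is a listing of the directory's contents.

        # capture the console-style table as one string
        $text = Get-ChildItem '\\192.168.2.89\BwLive' |
                Format-Table Mode, LastWriteTime, Length, Name -AutoSize |
                Out-String

        # or keep the attributes as objects/fields rather than flattened text
        $rows = Get-ChildItem '\\192.168.2.89\BwLive' |
                Select-Object Mode, LastWriteTime, Length, FullName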

    Read the article

  • How to make django test framework read from live database?

    - by lfborjas
    I realize there's a similar question here, but this one has a different approach: I have a Django app that runs queries over data indexed with Djapian. I'd like to write unit tests for this app's search component, and obviously I'd need the Django settings module and all connections with the database active, so the test runner that Django provides seems ideal. However, the Django testing framework creates a dummy database, and I'd hate to dump all my data to a fixture and then index it (the tests would take forever!). My data isn't at risk because the tests would only read from the database, so how could this be achieved? I'm new at this whole unit-testing thing, so the solution of writing a new test runner that I read about in that similar question doesn't enlighten me a bit, at least not without some details.
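
    Since the question ends at "a new test runner, but how?", here is a minimal sketch of that idea, assuming Django 1.2+ where the class-based DjangoTestSuiteRunner exists: override the two database hooks so the existing database is reused instead of a test database being created and destroyed. The module path in TEST_RUNNER is a placeholder, and this only makes sense if the tests really are read-only.

        # myapp/testrunner.py  (path is a placeholder)
        from django.test.simple import DjangoTestSuiteRunner

        class LiveDatabaseTestRunner(DjangoTestSuiteRunner):
            """Reuse the configured (live) database instead of creating a test one."""

            def setup_databases(self, **kwargs):
                # Do nothing: connections keep pointing at the real database.
                return None

            def teardown_databases(self, old_config, **kwargs):
                # Nothing was created, so there is nothing to destroy.
                pass

        # settings.py
        # TEST_RUNNER = 'myapp.testrunner.LiveDatabaseTestRunner'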

    Read the article

  • How do you save a Neural Network to file using Ruby's ai4r gem?

    - by Jaime Bellmyer
    I'm using Ruby's ai4r gem to build a neural network. Version 1.1 of the gem allowed me to simply Marshal.dump(network) to a file, and I could load the network back up whenever I wanted. With version 1.9, a couple of years later, I'm no longer able to do this. It generates this error when I try:

        no marshal_dump is defined for class Proc

    I know the reason for the error: Marshal can't handle Procs inside an object. Fair enough. So is there something built in to ai4r for this? I've been searching with no luck. I can't imagine any practical use for a neural network you have to rebuild from scratch every time you want to use it.

    Read the article

  • alter mysqldump file before import

    - by julio
    Hi -- I have a mysqldump file created from an earlier version of a product that can't be imported into a new version of the product, since the DB structure has changed slightly (mainly, a column that was NOT NULL DEFAULT 0 is now UNIQUE KEY DEFAULT NULL). If I just import the old dump file, it errors out, since the column full of 0 default values now breaks the UNIQUE constraint. It would be easy enough to either manually alter the mysqldump file, or import into a temp table, change it, and then copy to the new table. However, is there a way to do this programmatically, so it will be repeatable and not manual? (This will need to happen for many instances of this product.) I'm thinking something like disabling key constraints for the import, then setting all values that equal 0 to NULL, then re-enabling the key constraints. Is this possible? Any help appreciated.
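
    One repeatable route, sketched below with placeholder table and column names, is to load the dump into the old-shaped table and script the fix-up afterwards; since a MySQL UNIQUE index allows any number of NULLs, converting the 0s to NULL before adding the key avoids the constraint errors.

        -- load the old dump into a schema without the UNIQUE key, then:
        UPDATE mytable SET mycolumn = NULL WHERE mycolumn = 0;
        ALTER TABLE mytable
          MODIFY mycolumn INT NULL DEFAULT NULL,
          ADD UNIQUE KEY uniq_mycolumn (mycolumn);

    The whole upgrade can live in one .sql file run with mysql < fixup.sql, so it repeats identically for every instance of the product.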

    Read the article

  • identify documents from results of mahout clustering

    - by Tejas
    I am using Mahout to cluster text documents indexed with Solr. I used the "text" field in the documents to form vectors, then ran Mahout's k-means driver for clustering, and then the clusterdumper utility to dump the results. I am having difficulty understanding the dumper's output. I can see the clusters that were formed, with term vectors in those clusters, but how do I extract the documents from these clusters? I want the result to be the input documents appearing in the different clusters.

    Read the article

  • Is a program compiled with -g gcc flag slower than the same program compiled without -g?

    - by e271p314
    I'm compiling a program with -O3 for performance and -g for debug symbols (so that in case of a crash I can use the core dump). One thing bothers me a lot: does the -g option result in a performance penalty? When I look at the compilation output with and without -g, I see that the output without -g is 80% smaller than the output with -g. If the extra space goes to the debug symbols, I don't care about it (I guess), since that part is not used during runtime. But if for each instruction in the output without -g there are four more instructions in the output with -g, then I would certainly prefer to stop using -g, even at the cost of not being able to process core dumps. How can I know the size of the debug-symbol section inside the program, and in general, does compiling with -g create a program that runs slower than the same code compiled without -g?
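
    A few commands that answer the size part of the question, assuming a GNU toolchain (binary names are placeholders). With the same optimization level, -g only adds .debug_* sections that are not loaded at run time, so the loadable sections can be compared directly, and the debug info can even be split into a separate file:

        # per-section sizes: the text/data sections should match with and without -g
        size -A prog_with_g prog_without_g

        # split the debug info out and ship a small binary
        objcopy --only-keep-debug prog prog.debug
        strip --strip-debug prog
        objcopy --add-gnu-debuglink=prog.debug prog   # lets gdb find the symbols again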

    Read the article

  • Why is rails setting ":null => false" on all my columns in schema.rb?

    - by ryeguy
    Even if I never specify :null => false in the migrations that initially add columns to tables, Rails still generates code in schema.rb that marks the columns as :null => false. Why is this? If I develop on my box and then use rake db:schema:load on my production box, I'm going to get very different behavior!
    Edit: Even if I delete schema.rb and run rake db:schema:dump, it still puts :null => false in the new schema, even though the columns aren't defined like that in the actual database. It seems it can't tell whether or not a column is marked as allowing NULLs. I'm using SQLite, if that helps.

    Read the article

  • Cache an FTP connection via session variables for use via AJAX?

    - by Chad Johnson
    I'm working on a Ruby web application that uses the Net::FTP library. One part of it allows users to interact with an FTP site via AJAX. When the user does something, an AJAX call is made, and then Ruby reconnects to the FTP server, performs an action, and outputs information. Every time the AJAX call is made, Ruby has to reconnect to the FTP server, and that's slow. Is there a way I could cache this FTP connection? I've tried caching it in the session hash, but "We're sorry, but something went wrong" is displayed and a TCP dump is written to my logs whenever I attempt to store it there. I haven't tried memcached yet. Any suggestions?
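
    A live Net::FTP object wraps an open socket, so it can't be marshalled into the session store or memcached. One workaround, sketched below, is to keep the connections inside the application process itself, keyed by something per-user; this assumes a single long-lived Ruby process and ignores idle-timeout handling. Host, credentials and the session key are placeholders.

        require 'thread'
        require 'net/ftp'

        # Very small in-process cache of FTP connections.
        module FtpPool
          @connections = {}
          @mutex = Mutex.new

          def self.for(key, host, user, pass)
            @mutex.synchronize do
              ftp = @connections[key]
              if ftp.nil? || ftp.closed?
                ftp = Net::FTP.new(host)
                ftp.login(user, pass)
                @connections[key] = ftp
              end
              ftp
            end
          end
        end

        # in the controller action handling the AJAX call (key is a placeholder):
        #   ftp = FtpPool.for(session[:user_id], 'ftp.example.com', user, pass)
        #   entries = ftp.nlst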

    Read the article

  • Programmatically grabbing text from a web page that is dynamically generated

    - by bstullkid
    There is a website I am trying to pull information from in Perl; however, the section of the page I need is generated using JavaScript, so all you see in the source is:

        <div id="results"></div>

    I need to somehow pull out the contents of that div and save it to a file using Perl/proxies/whatever. For example, the information I want to save would be:

        document.getElementById('results').innerHTML;

    I am not sure if this is possible, or if anyone has any ideas or a way to do this. I was using a lynx source dump for other pages, but since I can't screen-scrape this page in a straightforward way, I came here to ask about it!
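
    One common approach, sketched in Perl below, is not to scrape the rendered div at all: the JavaScript that fills #results usually fetches its data from some URL (visible in the browser's network tools), and that URL can be requested directly. The endpoint here is a placeholder; if the page can't be untangled that way, driving a real browser (e.g. WWW::Mechanize::Firefox or Selenium) is the heavier fallback.

        use strict;
        use warnings;
        use LWP::UserAgent;

        my $ua  = LWP::UserAgent->new(agent => 'Mozilla/5.0');
        # placeholder: the URL the page's JavaScript actually calls to fill #results
        my $res = $ua->get('http://example.com/ajax/results?query=foo');
        die $res->status_line unless $res->is_success;

        open my $fh, '>', 'results.html' or die "can't write results.html: $!";
        print {$fh} $res->decoded_content;
        close $fh;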

    Read the article

  • How to replicate this screenshot in WebForms?

    - by AngryHacker
    I need to replicate the following in ASP.NET WebForms using a GridView, but I am not sure where to start. Basically I need three columns: a checkbox (which sometimes needs to be disabled) and two standard text columns. I've gone through the tutorial and I can see how to basically dump text data into a GridView, but it's not clear how to implement checkboxes, particularly ones that need to be disabled once in a while. I also have to replicate the style of the screenshot (e.g. the border on the bottom), and I'm having trouble with that as well. How do I swing something like that?
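
    A sketch of the usual pattern (column names, field names and the row model are made up): put the checkbox in a TemplateField and flip its Enabled flag per row in RowDataBound, where the bound data item is available; the text columns can stay as BoundFields, and the bottom border can be approximated with GridLines or a CssClass on the row style.

        <asp:GridView ID="gvItems" runat="server" AutoGenerateColumns="False"
                      OnRowDataBound="gvItems_RowDataBound" GridLines="Horizontal">
            <Columns>
                <asp:TemplateField>
                    <ItemTemplate>
                        <asp:CheckBox ID="chkSelect" runat="server"
                                      Checked='<%# (bool)Eval("IsSelected") %>' />
                    </ItemTemplate>
                </asp:TemplateField>
                <asp:BoundField DataField="Title" HeaderText="Title" />
                <asp:BoundField DataField="Description" HeaderText="Description" />
            </Columns>
        </asp:GridView>

        // code-behind: disable the checkbox for rows that shouldn't be editable
        protected void gvItems_RowDataBound(object sender, GridViewRowEventArgs e)
        {
            if (e.Row.RowType == DataControlRowType.DataRow)
            {
                var item = (MyRowModel)e.Row.DataItem;        // MyRowModel is hypothetical
                var chk = (CheckBox)e.Row.FindControl("chkSelect");
                chk.Enabled = item.CanSelect;                 // CanSelect is hypothetical
            }
        }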

    Read the article
