Search Results

Search found 4705 results on 189 pages for 'export to csv'.

Page 26/189

  • Is it good practice to export variables in Perl?

    - by gvkv
    I'm finding it very convenient to pass configuration and other data that is read or calculated once but then used many times throughout a program by using Perl's use mechanism. I do this by exporting a hash into the caller's namespace. For example:

        package MyConfiguration;

        my %config;

        sub import {
            my $callpkg = caller(0);
            my $expsym  = $_[1];
            configure() unless %config;
            *{"$callpkg\::$expsym"} = \%config;
        }

    and then in other modules:

        use MyConfiguration qw(loc_config_sym);

        if ( $loc_config_sym{parameter} ) {
            # ... do stuff ...
        }

    However, I'm not sure this is a best practice. Is it better to add a method that returns a hash ref with the data? Something else?
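
    A hedged sketch of the accessor alternative, from the editor rather than the thread (configure() here is a stub, not the poster's API): return a reference instead of writing into the caller's symbol table, which avoids the glob tricks entirely:

        package MyConfiguration;
        use strict;
        use warnings;

        my %config;

        sub configure { %config = ( parameter => 1 ); }    # stub; real setup goes here

        # Callers share one lazily built, read-mostly hash.
        sub config {
            configure() unless %config;
            return \%config;
        }

        1;

    and in the caller:

        use MyConfiguration;

        my $cfg = MyConfiguration::config();
        print "set\n" if $cfg->{parameter};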

    Read the article

  • mysqldump --where with = operator doesn't get all rows - help!

    - by JonathanLIVE
    I have a situation with a particular table that now thinks it contains 4 petabytes of data. I know that sounds cool, but I assure you, it is only on a 60GB partition. This table has 9 fields in it. One of them is a domain_id field. It is the best field to identify the rows by, as there are only approximately 6300 distinct values. The only other candidate field has over 2 million distinct values, and that's just more difficult. I cannot do a straight mysqldump because it will attempt to output all 4PB of data and fill the drive long before it gets close, so I need to surgically remove the good stuff, destroy the db, and recreate it. I believe if I can do a dump for each domain_id value, then I will get most of the usable data out of it. This is what I am trying to use:

        mysqldump -u root --skip-opt -q --no-create-info --skip-add-drop-table \
            --max_allowed_packet=1000000000 database table \
            --where="domain_id=10" > domains10.sql

    Using this I expect every row with domain_id 10 to be exported. However, when I check the export, I am only getting 1 row, even though when I look at the db there are many, many rows. It is as though the operator just finds one, then gives up. I have tried various operators. Using < or > I am able to get more of the data, but the export stops short at certain rows where the data has been compromised. With over 6000 ids to go through, I can't easily narrow down which rows are breaking the export. So, what I need is an operator that will do what I thought = would do: simply give me an export of all records that match the specific field. Also note, the only way I got this DB even accessible is through innodb_force_recovery = 3. So I need to get this right, because after this is done, I have to drop the db in order to make MySQL functional again. Looking forward to any helpful answers.
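
    For what it's worth, a hedged sketch of the per-id loop (bash assumed; ids.txt, holding one domain_id per line, is the editor's invention):

        # dump each domain_id into its own file so one corrupted range
        # does not abort the entire export
        while read -r id; do
            mysqldump -u root --skip-opt -q --no-create-info --skip-add-drop-table \
                database table --where="domain_id=$id" > "domains$id.sql"
        done < ids.txt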

    Read the article

  • Rake doesn't know how to build task?

    - by Schroedinger
    I'm using a rake task to import data into a database. The file is as follows:

        namespace :db do
          desc "load imported data from csv"
          task :load_csv_data => :environment do
            require 'fastercsv'
            require 'chronic'
            FasterCSV.foreach("input.csv", :headers => true) do |row|
              Trade.create(
                :name     => row[0],
                :type     => row[4],
                :price    => row[6].to_f,
                :volume   => row[7].to_i,
                :bidprice => row[10].to_f,
                :bidsize  => row[11].to_i,
                :askprice => row[14].to_f,
                :asksize  => row[15].to_i
              )
            end
          end
        end

    When attempting to use this, with the right CSV files and the other elements in place, it will say: Don't know how to build task 'db:import_csv_data'. I know this structure works because I've tested it; I'm just trying to get it to convert to the new values on the fly. Suggestions?
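
    An editor's hedged observation, not the thread's answer: the file defines db:load_csv_data while the error mentions db:import_csv_data, so the invoked name may simply not match the defined one:

        rake -T db              # list the tasks actually defined under the db namespace
        rake db:load_csv_data   # the name the file above defines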

    Read the article

  • Dynamically generating a file with javascript?

    - by gct
    I'm working on a web app for my company that will let us do some image tagging stuff, and I'd like to be able to generate results in the form of a CSV file. I can do this easily enough by dumping the CSV data to a div or something on the page and having the user copy it out. I'd rather have them hit a generate button and have a CSV file downloaded as though they clicked on a link to the result, so they can more easily just save the file somewhere convenient. Is it possible to simulate this kind of thing with javascript? I basically want to dynamically generate the file and then let them download it, client side.
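
    One client-side sketch, assuming a browser with the Blob and object-URL APIs (newer than this question; function and file names are the editor's):

        // build a CSV in memory and trigger a download via a temporary link
        function downloadCsv(csvText, filename) {
            var blob = new Blob([csvText], { type: 'text/csv' });
            var a = document.createElement('a');
            a.href = URL.createObjectURL(blob);
            a.download = filename;        // suggested name for the save dialog
            document.body.appendChild(a);
            a.click();
            a.remove();
            URL.revokeObjectURL(a.href);
        }

        downloadCsv('tag,count\ncat,3\n', 'tags.csv');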

    Read the article

  • How to view a DataTable while debugging

    - by Eric
    I'm just getting started using ADO.NET and DataSets and DataTables. One problem I'm having is that it seems pretty hard to tell what values are in the data table when trying to debug. What are some of the easiest ways of quickly seeing what values have been saved in a DataTable? Is there some way to see the contents in Visual Studio while debugging, or is the only option to write the data out to a file? I've created a little utility function that will write a DataTable out to a CSV file, yet the resulting CSV file was cut off: about 3 lines from what should have been the last line, in the middle of writing out a System.Guid, the file just stops. I can't tell if this is an issue with my CSV conversion method or with the original population of the DataTable. Update: forget the last part; I just forgot to flush my stream writer.
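
    For what it's worth, the debugger itself can show a DataTable: hovering over the variable and clicking the magnifying-glass icon opens Visual Studio's built-in DataSet visualizer. Failing that, a hedged C# sketch of a quick console dump helper (independent of the poster's CSV utility):

        // print column names, then each row's values, tab-separated
        static void DumpTable(System.Data.DataTable table)
        {
            foreach (System.Data.DataColumn col in table.Columns)
                Console.Write(col.ColumnName + "\t");
            Console.WriteLine();

            foreach (System.Data.DataRow row in table.Rows)
                Console.WriteLine(string.Join("\t", row.ItemArray));
        }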

    Read the article

  • Efficient way to access a mapping of identifiers in Python

    - by sixbelo
    I am writing an app to do a file conversion, and part of that is replacing old account numbers with new account numbers. Right now I have a CSV file mapping the old and new account numbers, with around 30K records. I read this in, store it as a dict, and when writing the new file grab the new account from the dict by key. My question is: what is the best way to do this if the CSV file increases to 100K+ records? Would it be more efficient to convert the account mappings from a CSV to a sqlite database rather than storing them as a dict in memory?
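
    A hedged sketch of the sqlite variant (the file names and the two-column old,new layout are assumptions):

        import csv
        import sqlite3

        conn = sqlite3.connect('mapping.db')
        conn.execute('CREATE TABLE IF NOT EXISTS map (old TEXT PRIMARY KEY, new TEXT)')

        # load the CSV once; lookups then run off disk instead of a big in-memory dict
        with open('mapping.csv') as f:
            conn.executemany('INSERT OR REPLACE INTO map VALUES (?, ?)', csv.reader(f))
        conn.commit()

        def lookup(old_account):
            row = conn.execute('SELECT new FROM map WHERE old = ?', (old_account,)).fetchone()
            return row[0] if row else None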

    Read the article

  • Creating an interruptible process in Python

    - by Glycerine
    I'm creating a python script that parses a large (but simple) CSV. It'll take some time to process. I would like the ability to interrupt the parsing of the CSV so I can continue at a later stage. Currently I have this, which lives in a larger class: (unfinished) Edit: I have some changed code, but the system will parse over 3 million rows.

        def parseData(self):
            reader = csv.reader(open(self.file))
            for id, title, disc in reader:
                print "%-5s %-50s %s" % (id, title, disc)
                l = LegacyData()
                l.old_id = int(id)
                l.name = title
                l.disc_number = disc
                l.parsed = False
                l.save()

    This is the old code:

        def parseData(self):
            # first line contains the field names
            fields = self.data.next()
            for row in self.data:
                items = zip(fields, row)
                item = {}
                for (name, value) in items:
                    item[name] = value.strip()
                self.save(item)

    Thanks guys.
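
    One hedged approach to resumability, sketched by the editor (handle() and the file names are placeholders, not the poster's code): record how many rows are done and skip them on restart:

        import csv
        import itertools
        import os

        def parse(path, state_path='progress.txt'):
            done = int(open(state_path).read()) if os.path.exists(state_path) else 0
            reader = csv.reader(open(path))
            for n, row in enumerate(itertools.islice(reader, done, None), start=done + 1):
                handle(row)                # placeholder for the per-row work
                if n % 1000 == 0:          # checkpoint every 1000 rows
                    with open(state_path, 'w') as state:
                        state.write(str(n))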

    Read the article

  • When I use SharePoint's export to spreadsheet (Excel), not all the columns appear

    - by MichaelKay
    We have several SharePoint (MOSS) lists with hundreds of items, so we use 'export to spreadsheet' to do the heavy editing. But in the spreadsheet, not all of the list columns appear. For example, no columns of the 'publishing HTML' type can be edited (or even seen) in either Excel 2003 or the web datasheet view, yet an SSIS package can export/import these columns without issue. Is there a way to use Excel 2003/2007 or Access 03/07 to edit these columns? Is there another way to connect to these columns?

    Read the article

  • How to export all wordpress.com posts to Windows Live Writer

    - by Ieyasu Sawada
    Is it possible to export existing wordpress posts to windows live writer? I have to edit some posts, and I need to make use of the code snippet plugin that is only available in live writer. There is actually a feature which allows me to do that, but it only allows 1 post at a time, and every time I go to this screen it fetches the blog posts from wordpress again, which makes it very slow. What I need is something that will cache the retrieved posts to make it faster, or something that will let me export wordpress posts into live writer documents.

    Read the article

  • Microsoft Entourage/Exchange Server problem: all objects disappeared from server - still in some form on the client

    - by splattne
    One of our employees works with Entourage on his MacBook Pro (OS X 10.6), accessing Exchange Server 2007. Last Friday morning, I think while working over a VPN, Entourage (I think it was Entourage) deleted all his objects (mail, calendar, contacts) on the server while creating a lot of strange folders (starting with underscores) on the client. The local data seems to be there, but not in a consistent form. Since the user's mailbox is rather big, I suspect that there was some kind of "move" operation which did not complete. I tried to export the data, but the export stops because of a corrupted object. Is there a tool or another way to export or retrieve the local data? Edit - FYI: we solved the problem by restoring his data from the previous night's backup.

    Read the article

  • Export a single layer as an image in Photoshop

    - by wrburgess
    I have a lot of designers send me layered PSDs of their designs and I need to break out the pieces of the designs to place on web pages. I can do a decent number of things in Photoshop, but I'm hardly efficient with it. My old way of just copying the image that's in a layer and pasting into a new image seems to take forever as I screw around with cropping and such. I've got Photoshop CS5, so I don't need external software to do anything, but I just need to figure out how to take a single layer, that may hold something small like an icon, and export it as a PNG or JPG. I am aware of the script called "Export Layers to Files" but it took about an hour and exported ALL of my layers to a huge number of files. I wasn't looking for a solution that broad. Is there an easy way to do this?

    Read the article

  • Google Docs not importing CSVs consistently

    - by nick
    Hey everyone, I'm trying to import some csv data into a Google Docs spreadsheet. The data I am entering is all made up of 16-digit integers. About 90% of them are imported perfectly, but 10% are automatically rewritten into scientific notation. How do I turn this feature off? I just want all the numbers kept in their standard form. Kind regards, Nick

    Read the article

  • Batch copy multiple folders and their subfolders to another folder

    - by DjLenny
    I have a folder X:\Export that has several folders:

        X:\Export\Export1
        X:\Export\Export2
        X:\Export\Export3

    etc. (names vary by a large factor). Each Export folder has the same subdirectory structure but different files. I would like to copy all the subfolders and files of X:\Export\Export1, X:\Export\Export2, X:\Export\Export3 to a folder X:\Export\mergedExports, keeping the subdirectory structure. This is pseudocode of what I would like to do but cannot get working properly:

        create new folder "merged"
        for (every folder X in a given directory Y)
            copy every file in X keeping directory structure to "merged"
            if conflict then overwrite
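
    A hedged batch-file sketch of the same idea (robocopy assumed available; by default it overwrites destination files whenever the source copy differs):

        rem merge every X:\Export\* folder into X:\Export\mergedExports
        for /D %%F in ("X:\Export\*") do (
            if /I not "%%~nxF"=="mergedExports" (
                robocopy "%%F" "X:\Export\mergedExports" /E
            )
        )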

    Read the article

  • Convert Public Folders to a PST

    - by TrueDuality
    Alrighty, so I've got a tricky one. I currently have a public folder database (edb & stm) residing on an Exchange 2003 server. I need to export it to a pst file, or otherwise make it so that I can manually get the data in it to end-users. I can not use the export feature built into Outlook, as some of the folders refer to another server which doesn't have the data; trying only results in the Outlook client hanging for close to an hour before giving an error about not finding the data. So this will need to be a server-side export. There are a few tools out there that seem to be able to convert edb & stm files to psts, but they are quite expensive. Does anybody have any ideas?

    Read the article

  • Excel 2007: Exporting more than 100 columns to a .prn file but data is concatenated

    - by Don1
    I want to export an Excel worksheet to a space-delimited (.prn) file. The worksheet is pretty big (187 columns), and when I set the column widths and try to export the worksheet to a .prn file, the data gets cut at the 98th column (i.e. about 200 characters wide for my data) and the rest is placed directly underneath. It's as though I ripped a page in half from top to bottom and placed the right-hand side directly under the left-hand side. How do I get it to export everything without being wrapped like this?

    Read the article

  • Export a block device over the network without root

    - by dschatz
    I'm trying to export a file as a block device over the network. I do not have root access on the machine where the file exists. I do have root access on the machine(s) where I will mount the block device. I've seen ATA-over-Ethernet and iSCSI, but there don't seem to be any implementations which allow me to export the block device without root (some even require kernel modules). Is there an implementation of either of these, or of some other protocol, that doesn't require root? Perhaps I can tunnel ethernet over IP to do this?
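
    One hedged possibility from the editor (syntax from the classic port-based invocation of the Linux nbd userspace tools; verify against your version's man pages): the server side needs no root on an unprivileged port, while attaching on the client does:

        # on the file host, as an ordinary user, serve the file on an unprivileged port
        nbd-server 10809 /path/to/file.img

        # on the mounting machine, as root, attach it as a block device
        nbd-client filehost 10809 /dev/nbd0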

    Read the article

  • sql server: losing identity column on export/import

    - by Y.G.J
    Recently I started dealing with SQL Server; my previous experience was with MS Access. When I do an import/export of a db, from the server to my computer or even within the server, all columns with a primary key lose the key, Identity is set to false, and even bit columns don't keep their defaults. How can I use an import/export job to make an exact copy of the db and its data? I don't want to have to perform a backup and restore every time I want the same db somewhere else, for another project, etc. I have read about "edit mapping" and the checkbox, but that did not help with the identity specification... and what about the primary keys of the tables and the rest of the things?
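
    For the identity values specifically, a hedged T-SQL sketch (table and column names invented) of copying rows while preserving them:

        -- allow explicit values into the identity column for the duration of the copy
        SET IDENTITY_INSERT dbo.MyTable ON;

        INSERT INTO dbo.MyTable (Id, Name)
        SELECT Id, Name FROM SourceDb.dbo.MyTable;

        SET IDENTITY_INSERT dbo.MyTable OFF;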

    Read the article

  • mysql to excel: exporting data using php, getting html code too

    - by pmms
    Hi all, the following is code for generating an Excel sheet from MySQL using PHP:

        if ($_POST['Submit'] == 'generateexcel') {
            $tblname = $_GET['generateexcel'];
            global $obj_mysql;
            $table = "tbl_js_login"; // this is the table that you want to export to csv from mysql

            function exportMysqlToCsv($table, $filename = 'export.csv')
            {
                $csv_terminated = "\n";
                $csv_separator  = ",";
                $csv_enclosed   = '"';
                $csv_escaped    = "\\";
                $sql_query = "select fld_id, fld_fname, fld_lname from $table";

                // Gets the data from the database
                $result = mysql_query($sql_query);
                $fields_cnt = mysql_num_fields($result);

                $schema_insert = '';
                for ($i = 0; $i < $fields_cnt; $i++) {
                    $l = $csv_enclosed
                        . str_replace($csv_enclosed, $csv_escaped . $csv_enclosed,
                              stripslashes(mysql_field_name($result, $i)))
                        . $csv_enclosed;
                    $schema_insert .= $l;
                    $schema_insert .= $csv_separator;
                } // end for

                $out = trim(substr($schema_insert, 0, -1));
                $out .= $csv_terminated;

                // Format the data
                while ($row = mysql_fetch_array($result)) {
                    $schema_insert = '';
                    for ($j = 0; $j < $fields_cnt; $j++) {
                        if ($row[$j] == '0' || $row[$j] != '') {
                            if ($csv_enclosed == '') {
                                $schema_insert .= $row[$j];
                            } else {
                                $schema_insert .= $csv_enclosed
                                    . str_replace($csv_enclosed, $csv_escaped . $csv_enclosed, $row[$j])
                                    . $csv_enclosed;
                            }
                        } else {
                            $schema_insert .= '';
                        }
                        if ($j < $fields_cnt - 1) {
                            $schema_insert .= $csv_separator;
                        }
                    } // end for
                    $out .= $schema_insert;
                    $out .= $csv_terminated;
                    $out1 = strip_tags($out);
                } // end while

                header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
                header("Content-Length: " . strlen($out1));
                // Output to browser with appropriate mime type, you choose ;)
                header("Content-type: text/x-csv");
                //header("Content-type: text/csv");
                //header("Content-type: application/csv");
                header("Content-Disposition: attachment; filename=$filename");
                echo $out1;
                exit;
            }

            exportMysqlToCsv($table);
        }

        include_once $path . "includes/jobseeker_form.php";

        /*
        function is_duplicate($login_name)
        {
            global $obj_mysql;
            $sql = "SELECT * FROM tbl_admin_details WHERE fld_login = '$login_name'";
            $num = $obj_mysql->get_num_rows($sql);
            if ($num == 0) return false;
            else return true;
        }
        */
        ?>

    We are using the above code to generate the Excel sheet, but along with the sheet we are getting HTML at the top (shown in the screenshot of the sheet). Please provide some help on how to remove the HTML code from the sheet.
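
    An editor's hedged guess at the cause, not the thread's accepted fix: anything the page prints before the header() calls is sent ahead of the CSV, so it shows up at the top of the sheet. Discarding any buffered output just before emitting the download usually cleans it up:

        // drop whatever has already been written to PHP's output buffers
        while (ob_get_level() > 0) {
            ob_end_clean();
        }
        header("Content-type: text/x-csv");
        header("Content-Disposition: attachment; filename=export.csv");
        echo $out1;
        exit;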

    Read the article

  • Combine loading from a database and from a CSV file in Java Swing in 5 minutes, by Thierry Leriche-Dessirier

    Hello everyone, I would like to share a short article, entitled "Load and display data from the database and from a simple CSV file in 5 minutes", available at http://thierry-leriche-dessirier.dev...-db-csv-5-min/ Synopsis: This short article shows (by example) how to load data from a simple CSV file (with Open-CSV) and from a MySql database (with JDBC), merging the values to display them in a (Swing) interface as a table (JTable and table model) and as graphs, all in just a few minutes. You can also find the other articles in the seri...
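
    A hedged sketch in the article's spirit (Open-CSV's reader plus plain JDBC feeding one table model; the file name, URL, credentials, and column names here are the editor's inventions, not the article's):

        import com.opencsv.CSVReader;
        import javax.swing.*;
        import javax.swing.table.DefaultTableModel;
        import java.io.FileReader;
        import java.sql.*;

        public class CsvDbTableDemo {
            public static void main(String[] args) throws Exception {
                DefaultTableModel model = new DefaultTableModel(new Object[]{"id", "name"}, 0);

                // rows from a simple CSV file
                try (CSVReader csv = new CSVReader(new FileReader("data.csv"))) {
                    for (String[] row : csv.readAll()) {
                        model.addRow(row);
                    }
                }

                // rows from MySql via JDBC
                try (Connection cn = DriverManager.getConnection(
                         "jdbc:mysql://localhost/demo", "user", "pass");
                     Statement st = cn.createStatement();
                     ResultSet rs = st.executeQuery("SELECT id, name FROM items")) {
                    while (rs.next()) {
                        model.addRow(new Object[]{rs.getInt(1), rs.getString(2)});
                    }
                }

                JFrame frame = new JFrame("CSV + DB");
                frame.add(new JScrollPane(new JTable(model)));
                frame.pack();
                frame.setVisible(true);
            }
        }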

    Read the article

  • JMeter: how to assign a single distinct value from CSV Data Set Config to each thread in a thread group?

    - by JohnnyM
    I have to make a load test with a relatively large number of users, so I can't really use the User Parameters pre-processor to parametrize each thread with custom user data. I've read that I should use CSV Data Set Config instead. However, I run into a problem with how JMeter interprets the input of this Config. Example: I have a thread group of 3 threads and Loop Count: 10, with one HTTP request sampler with server www.example.com and path \${user}. The csv file for CSV Data Set Config to extract the user parameter contains one value per line:

        1
        2
        3
        4
        5

    Expected output is that for thread 1-x the path of the request should be \x. So the output file should consist of 10 samples per thread, namely:

        for thread 1-1: 10 requests to www.example.com\1
        for thread 1-2: 10 requests to www.example.com\2
        for thread 1-3: 10 requests to www.example.com\3

    but instead I get requests to each of \1 - \5 and then to EOF. Does anyone know how to achieve the expected effect with CSV Data Set Config in jmeter 2.9?
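
    One hedged workaround from the editor, not necessarily the thread's answer: if each thread only needs its own distinct number rather than an arbitrary value from the file, JMeter's built-in __threadNum function already provides it, so a path of \${__threadNum} would send thread 1-1 to \1, thread 1-2 to \2, and so on, on every iteration.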

    Read the article

  • Does Chrome always add an XLS extension on a vnd.ms-excel mime type?

    - by Claudio
    It seems that a simple download of a PHP-generated CSV file (served with a vnd.ms-excel mime type for the sake of opening it with (Open)Office directly from the "Save as..." dialog, when present) always gets the unwanted .xls extension when the UA is Google Chrome. The file then ends up named myfile.csv.xls. Firefox behaves correctly. I wonder if it is a bug, a feature, or a misunderstanding of some references. Thank you.
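
    A hedged workaround sketch (PHP assumed, as in the question): serving the generic CSV type and pinning the name in Content-Disposition usually keeps browsers from appending their own extension:

        header('Content-Type: text/csv');
        header('Content-Disposition: attachment; filename="myfile.csv"');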

    Read the article

  • BULK INSERT problem in MySQL

    - by kartiku
    Hi, I get an error with the following sql command for bulk insert... any help would be appreciated.

        BULK INSERT libra.faculty
        FROM 'd:\faculty.csv'
        WITH (
            FIELDTERMINATOR = ',',
            ROWTERMINATOR = '\n'
        );

    Here's the error message:

        ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that
        corresponds to your MySQL server version for the right syntax to use near 'BULK
        INSERT libra.faculty FROM 'd:\faculty.csv' WITH ( FIELDTERMINATOR = ',', RO' at line 1
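
    A hedged note from the editor: BULK INSERT is Transact-SQL (SQL Server), which is why MySQL rejects the syntax; the rough MySQL equivalent is LOAD DATA INFILE, along these lines:

        LOAD DATA INFILE 'd:/faculty.csv'
        INTO TABLE libra.faculty
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\n';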

    Read the article
