Search Results

Search found 1639 results on 66 pages for 'csv'.

Page 8/66 | < Previous Page | 4 5 6 7 8 9 10 11 12 13 14 15  | Next Page >

  • dump csv from sqlalchemy

    - by afilatun
    For some reason, I want to dump a table from a database (sqlite3) in the form of a csv file. I'm using a python script with elixir (based on sqlalchemy) to modify the database. I was wondering if there is any way to dump the table I use to csv. I've seen sqlalchemy serializer but it doesn't seem to be what I want. Am I doing it wrong? Should I call the sqlite3 python module after closing my sqlalchemy session to dump to a file instead? Or should I use something homemade?
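
    A minimal sketch of one way to do this (not from the original question), assuming a hypothetical SQLite file mydata.db and table my_table — it bypasses Elixir and simply pairs a raw SQLAlchemy query with Python's csv module:

      import csv

      from sqlalchemy import create_engine, text

      # Hypothetical names: point the engine at your SQLite file and table.
      engine = create_engine("sqlite:///mydata.db")

      with engine.connect() as conn:
          result = conn.execute(text("SELECT * FROM my_table"))
          with open("my_table.csv", "w", newline="") as out:
              writer = csv.writer(out)
              writer.writerow(result.keys())  # header row from the column names
              writer.writerows(result)        # each result row becomes one CSV row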

    Read the article

  • Regular expression for parsing CSV in PHP

    - by Discodancer
    I already managed to split the CSV file using this regex: "/,(?=(?:[^\"]\"[^\"]\")(?![^\"]\"))/" But I ended up with an array of strings that still contain the opening and closing double quotes. Now I need a regex that would strip those strings of the delimiter double quotes. As far as I know, the CSV format can encapsulate strings in double quotes, and all the double quotes that are already part of the string are doubled. For example: My "other" cat becomes "My ""other"" cat". What I basically need is a regex that will replace every sequence of N double quotes with a sequence of N/2 (rounded down) double quotes. Or is there a better way? Thanks in advance.

    Read the article

  • Spooling data to CSV truncates

    - by Steve
    Hi, I am using the below script to output data to a CSV file:

      set heading off
      set linesize 10000
      set pagesize 0
      set echo off
      set verify off
      spool D:\OVERNIGHT\TEMP_FILES\PFRA_DETAIL_VIXEN_OUTPUT.txt
      SELECT TRIM(T4.S_ORG_ID)||','||
             TRIM(T4.NAME)||','||
             TRIM(T3.CREATION_TIME)||','||
             TRIM(T5.X_HOUSE_NUMBER)||','||
             TRIM(T5.X_FLAT_NUMBER)||','||
             TRIM(T5.ADDRESS)||','||
             TRIM(T5.CITY)||','||
             TRIM(T5.ZIPCODE)||','||
             TRIM(T3.NOTES)
      FROM TABLE_CASE T1
      INNER JOIN TABLE_QUEUE T2 ON T1.CASE_CURRQ2QUEUE = T2.OBJID
      INNER JOIN TABLE_PHONE_LOG T3 ON T1.OBJID = T3.CASE_PHONE2CASE
      INNER JOIN TABLE_BUS_ORG T4 ON T1.X_CASE2X_BUS_ORG = T4.OBJID
      INNER JOIN TABLE_ADDRESS T5 ON T1.CASE2ADDRESS = T5.OBJID
      WHERE case_currq2queue IN(422);
      /
      spool off;
      exit;

    However the data is being truncated to 80 characters. The T3.NOTES field is in CLOB format. Does anyone know how I can spool this out to CSV? I only have access to SQL*Plus. Thanks in advance, Steve

    Read the article

  • iPhone Accelerometer > csv > email

    - by Bradley Powers
    Hi all, I'm trying to collect data for a machine learning project I'm working on. What I'd like to do is collect accelerometer data from an iPhone, save it to a csv and email it to myself. My app currently is able to acquire data from the accelerometer, but I'm at a bit of a loss as to how to proceed. First of all, I'd like to acquire data for a preset amount of time (after playing a sound to the user) which I don't really know how to do, and I can't find good documentation for. Also, I'd like to save that to a csv, which there is some documentation on (specifically using the NSString writeToFile method). Any recommendations/ ideas? Thanks!

    Read the article

  • How to serve a View as CSV in ASP.NET Web Forms

    - by ChessWhiz
    Hi, I have an MS SQL view that I want to make available as a CSV download in my ASP.NET Web Forms app. I am using Entity Framework for other views and tables in the project. What's the best way to enable this download? I could add a HyperLink whose click handler iterates over the view, writes its CSV form to the disk, and then serves that file. However, I'd prefer not to write to the disk if it can be avoided, and that involves iteration code that might be avoided with some other solution. Any ideas?

    Read the article

  • Parsing a CSV File to a Rails Database

    - by Schroedinger
    G'day guys, I'm using FasterCSV and a rake script to parse a CSV with about 30 columns into my Rails db for a 'Trade' item. The script works fine when all of the values are set to strings, but when I change a column to a decimal, int or other type, everything goes to hell. Wondering if FasterCSV has built-in int (etc.) parsing or whether I'll have to manage these within my model. Basically, I'm given a giant amount of trades data, need to import it, and then need to provide feedback with, say, the average trade volume, the times, etc. I understand I can do all that with the wonderful records provided to me by ActiveRecord, but I wondered if there was an easier way to populate a rather large database from a given CSV. Several of the fields don't have values for certain rows; FasterCSV seems to work perfectly when they're all strings, but not when I try to get decimal or other types.

    Read the article

  • Using Regex to remove Carriage Returns in a CSV file in Notepad++

    - by Barry
    I have a CSV file I need to clean up. This is a one-time thing, so I'd like to do it in Notepad++ if possible. The CSV file has two fields, one of which is wrapped in quotes. I'd like to remove any carriage returns from within the quoted field. I was trying to use this pattern but can't get it quite right:

      (.*)\"(.*)\n(.*)\"(.*)

    Also correct me if I am wrong, but I presume the "replace with" value would be something along the lines of: \1\2\3\4 Thanks in advance. I'm also open to alternate solutions such as a quick and dirty Perl script.
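
    Since the poster is open to a quick-and-dirty script, here is a hedged Python sketch (rather than Perl), under the assumption the file is an ordinary quoted CSV — the csv module keeps newlines that fall inside quoted fields as part of the field, so they can simply be replaced before rewriting (file names are hypothetical):

      import csv

      # Replace carriage returns / newlines inside each field, then rewrite the row.
      with open("input.csv", newline="") as src, open("output.csv", "w", newline="") as dst:
          writer = csv.writer(dst)
          for row in csv.reader(src):
              writer.writerow([field.replace("\r", " ").replace("\n", " ") for field in row])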

    Read the article

  • Dealing with a badly formatted CSV file

    - by Josh K
    I have an exceptionally bad CSV file. Although I "solved" the problem in the end by manually writing scripts to process and reprocess this specific file, I wanted to know if there were any other solutions out there. You have a CSV file that has all the fields terminated by | (pipe) characters. Running a quick check shows you that there are 53 fields in the file. The person who gave you the file claims that there are only 28 fields. Not all of the fields have information in them. For example, there are five custom_field_{num} fields which may or may not have data. How would you get this into a database nicely? The ideal solution (and one I searched high and low for) would be to just throw it all into a table with no column names or specifications. Then remove any columns that were completely blank and then give them titles and specifications.
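
    One hedged sketch of the "throw it all into a table, prune later" idea the question describes, assuming a pipe-delimited file named messy.csv and a throwaway SQLite staging database (both names hypothetical):

      import csv
      import sqlite3

      with open("messy.csv", newline="") as f:
          rows = list(csv.reader(f, delimiter="|"))

      width = max(len(row) for row in rows)  # 53 columns in the example above
      columns = ", ".join("col_%d TEXT" % i for i in range(width))
      placeholders = ", ".join("?" for _ in range(width))

      conn = sqlite3.connect("staging.db")
      conn.execute("CREATE TABLE raw_import (%s)" % columns)
      conn.executemany(
          "INSERT INTO raw_import VALUES (%s)" % placeholders,
          (row + [""] * (width - len(row)) for row in rows),  # pad short rows
      )
      conn.commit()

    Completely blank columns can then be inspected, dropped, and the survivors renamed.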

    Read the article

  • CSV File Content Display Issue

    - by Pankaj Khurana
    Hi, I want to retrieve contents from a csv file for that i am using following code: <?php $fo = fopen("record.csv", "rb+"); while(!feof($fo)) { $contents[] = fgetcsv($fo,0,';'); } print_r($contents); fclose($fo); ?> But my records are displayed in the following format: ????††???????†††??†††††????"Search Transactions Results" ††††??†???????††††?††††††??? ???????????? ?????????????? ?????????? My csv file format: "Search Transactions Results" "Transaction ID","Reference Transaction ID","Date","Type","Subject","Item Number","Item Name","Invoice ID","Name","Email","Shipping Name","Shipping Address Line 1","Shipping Address Line 2","Shipping Address City","Shipping State/Province","Shipping Zip/Postal Code","Shipping Address Country","Shipping Method","Address Status","Contact Phone Number","Gross Amount","Receipt ID","Custom Field","Option 1 Name","Option 1 Value","Option 2 Name","Option 2 Value","Note","Auction Site","Auction User ID","Item URL","Auction Closing Date","Insurance Amount","Currency","Fees","Net Amount","Shipping & Handling Amount","Sales Tax Amount","To Email","Time","Time Zone" "1T","",5/5/2010 2:10:44 PM,"Payment Processed","CFP Self Study Kit","1","CFP Self Study Kit","","User1","[email protected]","","","","","","","","","N","","68.18","R1","","","","","","","","","",,"","USD","-2.62","65.56","0","0","[email protected]","01:40","Asia/Calcutta" "2T","",5/19/2010 4:04:08 PM,"Payment Processed","CFP Self Study Kit","1","CFP Self Study Kit","","User2","[email protected]","","","","","","","","","N","","68.18","R2","","","","","","","","","",,"","USD","-2.62","65.56","0","0","[email protected]","03:34","Asia/Calcutta" "3T","1RT",5/19/2010 5:28:45 PM,"Currency Conversion Completed","","","",""," ","","","","","","","","","","N","","17492.6","","","","","","","","","","",,"","INR","0","17492.6","0","0","","04:58","Asia/Calcutta" "4T","2RT",5/19/2010 5:28:45 PM,"Currency Conversion Completed","","","",""," ","","","","","","","","","","N","","-393.36","","","","","","","","","","",,"","USD","0","-393.36","0","0","","04:58","Asia/Calcutta" "5T","",5/19/2010 5:28:45 PM,"Transfer to Bank Initiated","P1006","","P1006",""," ","","","","","","","","","","N","","-17492.6","","","","","","","","","","",,"","INR","0","-17492.6","0","0","","04:58","Asia/Calcutta" "6T","",5/20/2010 5:38:02 PM,"Transfer to Bank Completed","P1006","","P1006",""," ","","","","","","","","","","N","","-17492.6","","","","","","","","","","",,"","INR","0","-17492.6","0","0","","05:08","Asia/Calcutta" "7T","",5/21/2010 12:32:37 PM,"Payment Processed","FP - LVC Plus","","FP - LVC Plus","","User3","[email protected]","User3","NEW DELHI","BEHIND KARNATAKA BANK LD","SOUTH","NEW DELHI","110023","IN","","N","","283.96","","","","","","","","","","",,"","USD","-9.95","274.01","0","0","[email protected]","00:02","Asia/Calcutta" "8T","",5/25/2010 4:40:48 PM,"Transfer to Bank Initiated","P1006","","P1006",""," ","","","","","","","","","","N","","-12569.85","","","","","","","","","","",,"","INR","0","-12569.85","0","0","","04:10","Asia/Calcutta" "9T","3RT",5/25/2010 4:40:48 PM,"Currency Conversion Completed","","","",""," ","","","","","","","","","","N","","-274.01","","","","","","","","","","",,"","USD","0","-274.01","0","0","","04:10","Asia/Calcutta" "10T","4RT",5/25/2010 4:40:48 PM,"Currency Conversion Completed","","","",""," ","","","","","","","","","","N","","12569.85","","","","","","","","","","",,"","INR","0","12569.85","0","0","","04:10","Asia/Calcutta" "11T","",5/26/2010 4:57:39 PM,"Transfer to Bank 
Completed","P1006","","P1006",""," ","","","","","","","","","","N","","-12569.85","","","","","","","","","","",,"","INR","0","-12569.85","0","0","","04:27","Asia/Calcutta" "Total","-247.05 USD","-15.19","-262.24" "Total","0.00 INR","0.00","0.00" I want to retrieve the records where "Type"="Payment Processed". I want to retrieve the content in a key-value format, e.g. Transaction ID - 1T, as I have to store these values in a database, but the display is not proper. I am unable to find out the reason for this; please help me with it. Thanks

    Read the article

  • How to Practically Split Values from CSV File into MySQL Database

    - by Ryan
    Let's suppose I have the following line in a CSV file (I removed the header row in this example):

      "500,000",2,50,2,90000

    I have a PHP script that reads the CSV file, breaks the file into individual lines, and stores each line in an array called $linearray. Then, I use a foreach loop to look at each line individually. Within the foreach loop, I break the line into separate variables using the following function: $line = str_replace("'","\'",$line); From here, I insert the values into separate columns within a MySQL database. The script works. The values are inserted into a database, but I run into a problem. I want:

      "500,000" | 2 | 50 | 2 | 90000

    But I get this:

      "500 | 000" | 2 | 50 | 2 | 90000

    The script isn't smart enough to understand that it should skip commas within quotation marks. Do you know how I can alter my script to make sure I get the output I'm looking for? Thanks.

    Read the article

  • Regex to match CSV file nested quotes

    - by user361970
    Hi, I know this has been discussed a million times. I tried searching through the forums and have seen some close regex expressions and tried to modify them but to no avail. Say there is a line in a csv file like this: "123", 456, "701 "B" Street", 910 Is there an easy regex to detect "B" (since its a non-escaped set of quotes within the normal CSV quotes) and replace it with something like \"B\" ? The final string would end up looking like this: "123", 456, "701 \"B\" Street", 910 Help would be greatly appreciated!
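
    A format-specific sketch in Python (an assumption on my part, since no language was named in the question): treat a double quote as a real field delimiter only when it sits at the start/end of the line or next to a comma separator, and backslash-escape every other quote:

      import re

      line = '"123", 456, "701 "B" Street", 910'

      # Quotes not at a field boundary (start/end of line, or beside a comma)
      # are treated as embedded quotes and escaped.
      fixed = re.sub(r'(?<!^)(?<!, )(?<!,)"(?!\s*(?:,|$))', r'\\"', line)

      print(fixed)  # "123", 456, "701 \"B\" Street", 910

    This leans on the exact layout shown above (comma-plus-space between fields), so it is a starting point rather than a general CSV fixer.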

    Read the article

  • Python Parse CSV Correctly

    - by cornerstone
    I am very new to Python. I want to parse a CSV file such that it will recognize quoted values. For example,

      1997,Ford,E350,"Super, luxurious truck"

    should be split as

      ('1997', 'Ford', 'E350', 'Super, luxurious truck')

    and NOT

      ('1997', 'Ford', 'E350', '"Super', ' luxurious truck"')

    The above is what I get if I use something like str.split(). How do I do this? Also, would it be best to store these values in an array or some other data structure? Because after I get these values from the CSV, I want to be able to easily choose, let's say, any two of the columns and store them as another array or some other data structure. Thanks in advance.
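
    The standard library's csv module already understands quoted fields, so a hedged sketch (file name hypothetical) would be:

      import csv

      # csv.reader keeps "Super, luxurious truck" together because it honours the quotes.
      with open("cars.csv", newline="") as f:
          rows = [tuple(row) for row in csv.reader(f)]

      # rows[0] == ('1997', 'Ford', 'E350', 'Super, luxurious truck')

      # Picking out, say, the second and third columns of every row:
      make_and_model = [(row[1], row[2]) for row in rows]

    A list of tuples (or csv.DictReader for named columns) is usually enough structure for slicing out a couple of columns later.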

    Read the article

  • Java BufferedReader behavior in CSV vs TXT file

    - by Gabriel
    I am trying to read a CSV file called csv_file.csv. The problem is that when I read lines with BufferedReader.readLine() it skips the first line with months. But when I rename the file to csv_file.txt it reads it all right and doesn't skip the first line. Is there an undocumented "feature" of BufferedReader that I'm not aware of? Example of file:

      Months, SEP2010, OCT2010, NOV2010
      col1, col2, col3, col4, col5
      aaa,,sdf,"12,456",bla bla bla, xsaffadfafda
      and so on, and so on, "10,00", xxx, xxx

    The code:

      FileInputStream stream = new FileInputStream(UploadSupport.TEMPORARY_FILES_PATH+fileName);
      BufferedReader br = new BufferedReader(new InputStreamReader(stream, "UTF-8"));
      String line = br.readLine();
      String months[] = line.split(",");
      while ((line=br.readLine())!=null) {
          /*parse other lines*/
      }

    Read the article

  • excluding a column in csv file with regex

    - by JPro
    Is there any way to exclude/delete/replace one field from a CSV file with some regexp in Notepad++? I have a CSV file with some data like this:

      '1','data1','data2','data3','data4','data5','data6','data7','data8','data9',
      'data10','data11','data12','data13','data14','data15','data16','data17','data18',
      'data19','data20','data21','data22','data23','\'data24 with some commas, here and there and some "double quotes", and fullstops.','data25','data26'

    The only problem I am facing is with data24, where I encounter \' and then "" and some wild characters like , and .. This always happens at field 24. For the purpose of clarity, I have entered newlines here, but the entire text above is on just one line. Any ideas on how to solve this? Thanks.
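
    If a script is acceptable instead of a Notepad++ regex, one hedged Python sketch (file names hypothetical) is to let the csv module do the parsing — single quotes as the quote character, backslash as the escape character — and simply drop field 24:

      import csv

      with open("data.csv", newline="") as src, open("out.csv", "w", newline="") as dst:
          reader = csv.reader(src, quotechar="'", escapechar="\\")
          writer = csv.writer(dst, quotechar="'", quoting=csv.QUOTE_ALL)
          for row in reader:
              del row[23]  # field 24 is index 23
              writer.writerow(row)

    This sidesteps writing a regex that has to survive the embedded commas, quotes and full stops in that field.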

    Read the article

  • Excel isn't reading sql exported csv properly

    - by mhopkins321
    I have a batch file that calls sqlcmd to run a command and then exports the data as a CSV. When viewed in a cell, the transacted date for example shows 35:30.0, but if you click on it the formula bar shows 1/1/1900 2:45:00 PM. I need the full timestamp to show in the cell. Any ideas? The batch file is the following:

      sqlcmd -S server -U username -P password -d database -i "D:\path\sqlScript.sql" -s "," > D:\path\report.csv -I -W -k 1

    The script is the following. Now I currently have the columns cast as varchars, but that's simply because I've tried to change it a bit. Varchar doesn't work either.

      SET NOCOUNT ON;
      select top(10) BO.Status,
          cast(tradeDate AS varchar) AS Trade_Date,
          CAST(closingTime AS varchar) AS Closing_Time,
          CAST(openingTime AS varchar) AS openingTime
      FROM GIANT COMPLICATED JOINS OF ALL SORTS OF TABLES

    Read the article

  • Weird characters in exported csv files when converting

    - by Ahue
    Hey guys, I came across a problem I cannot solve on my own concerning the downloadable csv formatted trends data files from Google Insights for Search. I'm to lazy to reformat the files I4S gives me manually what means: Extracting the section with the actual trends data and reformatting the columns so that I can use it with a modelling program I do for school. So I wrote a tiny script the should do the work for me: Taking a file, do some magic and give me a new file in proper format. What it's supposed to do is reading the file contents, extracting the trends section, splitting it by newlines, splitting each line and then reorder the columns and maybe reformat them. When looking at a untouched I4S csv file it looks normal containing CR LF caracters at line breaks (maybe thats only because I'm using Windows). When just reading the contents and then writing them to a new file using the script wierd asian characters appear between CR and LF. I tried the script with a manually written similar looking file and even tried a csv file from Google Trends and it works fine. I use Python and the script (snippet) I used for the following example looks like this: # Read from an input file file = open(file,"r") contents = file.read() file.close() cfile = open("m.log","w+") cfile.write(contents) cfile.close() Has anybody an idea why those characters appear??? Thank you for you help! I'll give you and example: First few lines of I4S csv file: Web Search Interest: foobar Worldwide; 2004 - present Interest over time Week foobar 2004-01-04 - 2004-01-10 44 2004-01-11 - 2004-01-17 44 2004-01-18 - 2004-01-24 37 2004-01-25 - 2004-01-31 40 2004-02-01 - 2004-02-07 49 2004-02-08 - 2004-02-14 51 2004-02-15 - 2004-02-21 45 2004-02-22 - 2004-02-28 61 2004-02-29 - 2004-03-06 51 2004-03-07 - 2004-03-13 48 2004-03-14 - 2004-03-20 50 2004-03-21 - 2004-03-27 56 2004-03-28 - 2004-04-03 59 Output file when reading and writing contents: Web Search Interest: foobar ??????????? ? ? ? ????????? ????????? ???? ?????? Week foobar ?? ?? ?? ? ? ? ?? ??? ????? 2004-01-11 - 2004-01-17 44 ?? ?? ???? ? ? ?? ????????? 2004-01-25 - 2004-01-31 40 ?? ?? ?? ? ? ? ?? ?? ?????? 2004-02-08 - 2004-02-14 51 ?? ?? ???? ? ? ?? ????????? 2004-02-22 - 2004-02-28 61 ?? ?? ???? ? ? ?? ?? ?????? 2004-03-07 - 2004-03-13 48 ?? ?? ???? ? ? ?? ??? ?? ?? 2004-03-21 - 2004-03-27 56 ?? ?? ???? ? ? ?? ?? ?????? 2004-04-04 - 2004-04-10 69 ?? ?? ???? ? ? ?? ????????? 2004-04-18 - 2004-04-24 51 ?? ?? ???? ? ? ?? ?? ?????? 2004-05-02 - 2004-05-08 56 ?? ?? ?? ? ? ? ?? ????????? 2004-05-16 - 2004-05-22 54 ?? ?? ???? ? ? ?? ????????? 2004-05-30 - 2004-06-05 74 ?? ?? ?? ? ? ? ?? ????????? 2004-06-13 - 2004-06-19 50 ?? ?? ??? ? ? ?? ????????? 2004-06-27 - 2004-07-03 58 ?? ?? ?? ? ? ? ?? ??? ????? 2004-07-11 - 2004-07-17 59 ?? ?? ???? ? ? ?? ?????????
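
    A guess, not a confirmed diagnosis: the I4S download is probably UTF-16 encoded (the stray characters look like UTF-16 bytes copied through verbatim), so decoding it explicitly and writing UTF-8 back out may be all that is needed — a minimal sketch with hypothetical file names:

      import codecs

      # Assumption: the downloaded report is UTF-16 (with a BOM); adjust if not.
      with codecs.open("report.csv", "r", encoding="utf-16") as src:
          contents = src.read()

      with codecs.open("m.log", "w", encoding="utf-8") as dst:
          dst.write(contents)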

    Read the article

  • jqGrid Export to CSV Missing Column Names

    - by user561557
    I have a jqGrid that works perfectly. It contains a pager button to export the grid to a csv file which works and exports the data. However, I also need to have the column names exported with the data and I can't seem to get that to work. My working code follows.

      jQuery("#detail").jqGrid('navGrid','#pager2',
          {height:520,width:500,savekey:[true,13],navkeys:[true,38,40],reloadAfterSubmit:false, jqModal:false, closeOnEscape:true, bottominfo:"Fields marked with () are required"}, // edit options
          {height:520, width:500,savekey:[true,13],reloadAfterSubmit:false,jqModal:false, closeOnEscape:true,bottominfo:"Fields marked with () are required", closeAfterAdd: true}, // add options
          {reloadAfterSubmit:false,jqModal:false, closeOnEscape:true}, // del options
          {closeOnEscape:true}, // search options
          {height:250,width:500,jqModal:false,closeOnEscape:true},
          {view:true} // view options
      );

      // add custom button to export the data to excel
      jQuery("#detail").jqGrid('navButtonAdd','#pager2',{
          caption:"",
          title:"Export to CSV",
          onClickButton : function () {
              exportExcel();
          },
          position:"last"
      });

      // add custom button to print grid
      jQuery("#detail").jqGrid('navButtonAdd','#pager2',{
          caption:"",
          title:"Print",
          buttonicon:"ui-icon-print",
          onClickButton : function () {
              jQuery('#detail_table').jqprint({ operaSupport: true });
              return false;
          }
      });

      function exportExcel() {
          var mya=new Array();
          mya=jQuery("#detail").getDataIDs();  // Get All IDs
          var data=jQuery("#detail").getRowData(mya[0]);     // Get First row to get the labels
          var colNames=new Array();
          var ii=0;
          for (var i in data){colNames[ii++]=i;}    // capture col names
          var html="";
          for(i=0;i }
          html=html+"\\n";   // end of line at the end
          document.forms[0].method='POST';
          document.forms[0].action='ajax/csvExport.php';  // send it to server which will open this contents in excel file
          document.forms[0].target='_blank';
          document.forms[0].csvBuffer.value=html;
          document.forms[0].submit();
      }

    Read the article

  • how to import csv data into django models

    - by little_fish
    I have some CSV data and I want to import it into Django models. An example of the CSV data:

      1;"02-01-101101";"Worm Gear HRF 50";"Ratio 1 : 10";"input shaft, output shaft, direction A, color dark green";
      2;"02-01-101102";"Worm Gear HRF 50";"Ratio 1 : 20";"input shaft, output shaft, direction A, color dark green";
      3;"02-01-101103";"Worm Gear HRF 50";"Ratio 1 : 30";"input shaft, output shaft, direction A, color dark green";
      4;"02-01-101104";"Worm Gear HRF 50";"Ratio 1 : 40";"input shaft, output shaft, direction A, color dark green";
      5;"02-01-101105";"Worm Gear HRF 50";"Ratio 1 : 50";"input shaft, output shaft, direction A, color dark green";

    I have a Django model named Product with some fields like name, description and price, and I want to do something like this:

      product = Product()
      product.name = "Worm Gear HRF 70(02-01-101116)"
      product.description = "input shaft, output shaft, direction A, color dark green"
      product.price = 100
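
    A rough sketch of the import loop (not from the original question), assuming a Product model living in a hypothetical app called myapp and mapping the columns the way the example above builds its name — note the CSV has no price column, so the price is hard-coded here just as in the example:

      import csv

      from myapp.models import Product  # hypothetical import path

      with open("products.csv", newline="") as f:
          for row in csv.reader(f, delimiter=";"):
              if not row or not row[0].strip():
                  continue  # skip blank lines
              Product.objects.create(
                  name="%s (%s)" % (row[2], row[1]),  # e.g. "Worm Gear HRF 50 (02-01-101101)"
                  description=row[4],
                  price=100,
              )

    Run it from a management command or the Django shell so the project settings are loaded.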

    Read the article

  • Reading CSV files in numpy where delimiter is ","

    - by monch1962
    Hello all, I've got a CSV file with a format that looks like this:

      "FieldName1", "FieldName2", "FieldName3", "FieldName4"
      "04/13/2010 14:45:07.008", "7.59484916392", "10", "6.552373"
      "04/13/2010 14:45:22.010", "6.55478493312", "9", "3.5378543"
      ...

    Note that there are double quote characters at the start and end of each line in the CSV file, and the "," string is used to delimit fields within each line. When I try to read this into numpy via:

      import numpy as np
      data = np.genfromtxt(csvfile, dtype=None, delimiter=',', names=True)

    all the data gets read in as string values, surrounded by double-quote characters. Not unreasonable, but not much use to me as I then have to go back and convert every column to its correct type. When I use delimiter='","' instead, everything works as I'd like, except for the 1st and last fields. As the start-of-line and end-of-line characters are a single double-quote character, this isn't seen as a valid delimiter for the 1st and last fields, so they get read in as e.g. "04/13/2010 14:45:07.008 and 6.552373" - note the leading and trailing double-quote characters respectively. Because of these redundant characters, numpy assumes the 1st and last fields are both String types; I don't want that to be the case. Is there a way of instructing numpy to read in files formatted in this fashion as I'd like, without having to go back and "fix" the structure of the numpy array after the initial read?
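
    One hedged workaround (an assumption, not the only answer): since the fields themselves contain no commas once the quotes are gone, strip the quote characters before the lines reach genfromtxt and let it split on plain commas:

      import numpy as np

      # csvfile is the same path used above; genfromtxt accepts a generator of lines.
      with open(csvfile) as f:
          cleaned = (line.replace('"', "") for line in f)
          data = np.genfromtxt(cleaned, dtype=None, delimiter=",", names=True, autostrip=True)

    With the quotes removed, dtype=None can infer the numeric columns while the timestamp column stays a string.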

    Read the article

  • Preserving hierarchy when converting .csv file to xml or json

    - by Simon Levinson
    Hello, I have a question concerning translating data from a CSV into XML or JSON where it is essential to preserve the hierarchy of the data. For example, if I have CSV data like this:

      type,brand,country,quantity
      apple,golden_delicious,english,1
      apple,golden_delicious,french,2
      apple,cox,,4
      apple,braeburn,,1
      banana,,carribean,6
      banana,,central_america,7
      clememtine,,,3

    What I want is to preserve the hierarchy in the XML so that I get something like:

      <fruit>
        <type = "apple">
          <brand = "golden_delicious">
            <country = "english" quantity = "1">
            <country = "french" quantity = "2">
          </brand>
          <brand = "cox">
            <quantity = "4">
          </brand>
          <brand = "braeburn">
            <quantity = "1">
          </brand>
        </type>
        <type = "banana">
          <country = "carribean" quantity = "6">
          <country = "central_america" quantity = "7">
        </type>
        <type = "clementine">
          <quantity = "3">
        </type>
      <fruit />

    Is it best to try to use JAXP, or to convert the above into a simple parent/child table and then write the data to an array of strings for processing? Like this:

      parent,child
      fruit,apple
      apple,golden_delicious
      golden_delicious,english
      golden_delicious,french
      english,1
      french,2
      apple,cox
      cox,4
      apple,braeburn
      braeburn,1

    And so on. Or is there a better way? Thanks, Simon Levinson
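
    One hedged alternative to JAXP (in Python, purely for illustration): fold the rows into nested dictionaries keyed by type, then brand, then country — skipping whichever levels are blank — and serialize that to JSON:

      import csv
      import json

      tree = {}
      with open("fruit.csv", newline="") as f:  # hypothetical file name
          for row in csv.DictReader(f):
              node = tree.setdefault(row["type"], {})
              if row["brand"]:
                  node = node.setdefault(row["brand"], {})
              if row["country"]:
                  node = node.setdefault(row["country"], {})
              node["quantity"] = row["quantity"]

      print(json.dumps(tree, indent=2))

    The same nested structure could then be walked to emit XML if JSON is not the final target.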

    Read the article

  • Read alphanumeric characters from csv file in C#

    - by Prasad
    I am using the following code to read my CSV file:

      public DataTable ParseCSV(string path)
      {
          if (!File.Exists(path))
              return null;

          string full = Path.GetFullPath(path);
          string file = Path.GetFileName(full);
          string dir = Path.GetDirectoryName(full);

          //create the "database" connection string
          string connString = "Provider=Microsoft.ACE.OLEDB.12.0;"
              + "Data Source=\"" + dir + "\\\";"
              + "Extended Properties=\"text;HDR=Yes;FMT=Delimited;IMEX=1\"";

          //create the database query
          string query = "SELECT * FROM " + file;

          //create a DataTable to hold the query results
          DataTable dTable = new DataTable();

          //create an OleDbDataAdapter to execute the query
          OleDbDataAdapter dAdapter = new OleDbDataAdapter(query, connString);

          //fill the DataTable
          dAdapter.Fill(dTable);
          dAdapter.Dispose();
          return dTable;
      }

    But the above doesn't read the alphanumeric values from the CSV file; it reads only values that are either all numeric or all alpha. What's the fix I need to make to read the alphanumeric values? Please suggest.

    Read the article

  • Exporting to CSV from MySQL via PHP in FireFox

    - by typoknig
    Hi all, I am pulling some info from a database with the following code:

      <input type="button" value="Export to Excel" onClick="window.navigate('breakfast_service.php?action=export')">

    Here is the code for that action:

      <?php
      if ($_GET['action'] == 'export') {
          // Get the registration data
          $user = 'root';
          $pass = 'billiards';
          $server = 'localhost';
          $link = mysql_connect($server, $user, $pass);
          if (!$link) {
              die('Could not connect to database!' . mysql_error());
          }
          mysql_select_db('breakfast', $link);
          $query = "SELECT * FROM registration";
          $result = mysql_query($query);
          mysql_close($link);

          // format into CSV
          $contents = "id, school_id, first_name, last_name, email, attending, created_on\n";
          $num = mysql_num_rows($result);
          for ($i = 0; $i < $num; $i++) {
              $row = mysql_fetch_array($result);
              $id = $row['id'];
              $school_id = $row['school_id'];
              $fname = $row['first_name'];
              $lname = $row['last_name'];
              $email = $row['email'];
              $attending = ($row['attending'] == 0) ? 'No' : 'Yes';
              $date = $row['created_on'];
              $contents = $contents . "$id, $school_id, $fname, $lname, $email, $attending, $date\n";
          }

          // return as excel file
          $filename = "export.csv";
          header('Content-type: application/ms-excel');
          header('Content-Disposition: attachment; filename='.$filename);
          echo $contents;
      }
      ?>

    This combination of code works great in IE, but fails to create/download a file in Firefox or Chrome. Why?

    Read the article

< Previous Page | 4 5 6 7 8 9 10 11 12 13 14 15  | Next Page >