Search Results

Search found 1639 results on 66 pages for 'csv'.

Page 7 of 66

  • How to change particular column entries in a mysql table when uploading data from csv file?

    - by understack
    I upload data into a MySQL table from a csv file in the standard way:

        TRUNCATE TABLE table_name;
        load data local infile '/path/to/file/file_name.csv'
            into table table_name
            fields terminated by ',' enclosed by '"'
            lines terminated by '\r\n'
            (id, name, type, deleted);

    Every 'deleted' column entry in the csv file has either a 'current' or a 'deleted' value.

    Question: while the csv data is being loaded into the table, I want to store the current date for all the corresponding 'deleted' entries, and null for the 'current' entries. How can I do this?

    Example csv file:

        id_1, name_1, type_1, current
        id_2, name_1, type_2, deleted
        id_3, name_3, type_3, current

    The table after loading this data should look like this:

        id_1, name_1, type_1, null
        id_2, name_1, type_2, 2010-05-10
        id_3, name_3, type_3, null

    Edit: I could probably run a separate query after loading the csv file, but I'm wondering if it can be done in the same query.
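
    A sketch of the single-statement route, using MySQL's LOAD DATA ... SET feature (read the flag into a user variable, derive the real column from it), shown here through Python's mysql.connector driver; the connection details are placeholders:

        import mysql.connector  # third-party driver; any DB-API driver would do

        conn = mysql.connector.connect(host="localhost", user="me",
                                       password="secret", database="mydb",
                                       allow_local_infile=True)  # placeholders
        cur = conn.cursor()
        cur.execute("TRUNCATE TABLE table_name")
        cur.execute("""
            LOAD DATA LOCAL INFILE '/path/to/file/file_name.csv'
            INTO TABLE table_name
            FIELDS TERMINATED BY ',' ENCLOSED BY '"'
            LINES TERMINATED BY '\\r\\n'
            (id, name, type, @flag)
            SET deleted = IF(@flag = 'deleted', CURDATE(), NULL)
        """)
        conn.commit()
        conn.close()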

  • Specifying formatting for csv.writer in Python

    - by user248237
    I am using csv.DictWriter to output csv files from a set of dictionaries. I use the following function:

        def dictlist2file(dictrows, filename, fieldnames,
                          delimiter='\t', lineterminator='\n'):
            out_f = open(filename, 'w')
            # Write out header
            header = delimiter.join(fieldnames) + lineterminator
            out_f.write(header)
            # Write out dictionary
            data = csv.DictWriter(out_f, fieldnames, delimiter=delimiter,
                                  lineterminator=lineterminator)
            data.writerows(dictrows)
            out_f.close()

    where dictrows is a list of dictionaries, and fieldnames provides the headers that should be serialized to the file.

    Some of the values in my dictionary list (dictrows) are numeric -- e.g. floats -- and I'd like to specify the formatting of these. For example, I might want floats to be serialized with "%.2f" rather than full precision. Ideally, I'd like to specify some kind of mapping that says how to format each type, e.g. {float: "%.2f"}, meaning: if you see a float, format it with %.2f. Is there an easy way to do this? I don't want to subclass DictWriter or anything complicated like that -- this seems like very generic functionality. How can this be done?

    The only other solution I can think of is: instead of messing with the formatting of DictWriter, use the decimal package to set the decimal precision of the floats to two places, so they are serialized that way. I don't know if that's a better solution. Thanks very much for your help.
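
    A minimal sketch of that type-to-format mapping, applied to each row before it reaches DictWriter; the field names and sample data below are invented:

        import csv

        FORMATS = {float: "%.2f"}  # type -> format string

        def format_row(row, formats=FORMATS):
            # Format any value whose exact type has an entry in the map.
            return {key: (formats[type(value)] % value)
                         if type(value) in formats else value
                    for key, value in row.items()}

        rows = [{"name": "alpha", "score": 1.2345}]
        with open("out.tsv", "w", newline="") as out_f:
            writer = csv.DictWriter(out_f, ["name", "score"], delimiter="\t")
            writer.writeheader()
            writer.writerows(format_row(r) for r in rows)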

  • SQL Server Bulk insert of CSV file with inconsistent quotes

    - by mattstuehler
    Is it possible to BULK INSERT (SQL Server) a CSV file in which the fields are only OCCASIONALLY surrounded by quotes? Specifically, quotes only surround those fields that contain a ",".

    In other words, I have data that looks like this (the first row contains headers):

        id, company, rep, employees
        729216,INGRAM MICRO INC.,"Stuart, Becky",523
        729235,"GREAT PLAINS ENERGY, INC.","Nelson, Beena",114
        721177,GEORGE WESTON BAKERIES INC,"Hogan, Meg",253

    Because the quotes aren't consistent, I can't use '","' as a delimiter, and I don't know how to create a format file that accounts for this.

    I tried using ',' as a delimiter and loading it into a temporary table where every column is a varchar, then using some kludgy processing to strip out the quotes, but that doesn't work either, because the fields that contain ',' are split into multiple columns.

    Unfortunately, I don't have the ability to manipulate the CSV file beforehand. Is this hopeless? Many thanks in advance for any advice.

    By the way, I saw this post: SQL bulk import from csv. But in that case, EVERY field was consistently wrapped in quotes, so he could use ',' as a delimiter and strip out the quotes afterwards.
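
    If a preprocessing step ever becomes possible, here is a sketch of one: a CSV-aware reader copes with the occasional quoting, and the file is rewritten with a delimiter that is assumed never to occur in the data:

        import csv

        with open("input.csv", newline="") as src, \
             open("normalized.csv", "w", newline="") as dst:
            reader = csv.reader(src)  # handles occasionally-quoted fields
            writer = csv.writer(dst, delimiter="|", quoting=csv.QUOTE_NONE)
            for row in reader:
                writer.writerow(row)  # raises if "|" shows up in a field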

  • PHP Streaming CSV always adds UTF-8 BOM

    - by Mustafa Ashurex
    The following code gets a 'report line' as an array and uses fputcsv to transform it into CSV. Everything is working great except for the fact that, regardless of the charset I use, it puts a UTF-8 BOM at the beginning of the file. This is exceptionally annoying because A) I am specifying ISO and B) we have lots of users using tools that show the UTF-8 BOM as garbage characters.

    I have even tried writing the results to a string, stripping the UTF-8 BOM, and then echo'ing it out -- and I still get it. Is it possible that the issue resides with Apache? If I change the fopen to a local file, it writes just fine, without the UTF-8 BOM.

        header("Content-type: text/csv; charset=iso-8859-1");
        header("Cache-Control: no-store, no-cache");
        header("Content-Disposition: attachment; filename=\"report.csv\"");

        $outstream = fopen("php://output", 'w');
        for ($i = 0; $i < $report->rowCount; $i++) {
            fputcsv($outstream, $report->getTaxMatrixLineValues($i), ',', '"');
        }
        fclose($outstream);
        exit;
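
    A small diagnostic sketch for narrowing down where the BOM enters the pipeline: fetch the response and inspect the first three bytes (the URL is a placeholder):

        import urllib.request

        raw = urllib.request.urlopen("http://example.com/report.csv").read()
        print(raw[:3] == b"\xef\xbb\xbf")  # True means the BOM is in the payload
                                           # itself, not added by the viewing tool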

  • Execute SQL on CSV files via JDBC

    - by Markos Fragkakis
    Dear all,

    I need to apply an SQL query to CSV files (comma-separated text files). My SQL is predefined by another tool and is not eligible for change. It may contain embedded selects and table aliases in the FROM part.

    For my task I have found these options (open source is a project requirement); the first two are libraries that provide JDBC drivers:

    1. CsvJdbc
    2. XlSQL
    3. JBoss Teiid
    4. Create an Apache Derby DB, load all CSVs as tables and execute the query.

    These are the problems I encountered:

    1. It does not accept the syntax of the SQL (it uses internal selects and table aliases). Furthermore, it has not been maintained since 2004.
    2. I could not get it to work, as it has as a dependency a SAX parser that causes an exception when parsing other documents. Similarly, no change since 2004.
    3. Have not checked if it supports the syntax, but it seems like overhead. It needs several entities defined (Virtual Databases, Bindings). From the mailing list they told me that the last release supports runtime creation of the required objects. Has anyone used it for such a simple task? (Normally it can connect to several types of data, like CSV, XML or other DBs, and create a virtual, unified one.)
    4. Can this even be done easily?

    Of the 4 things I considered/tried, only 3 and 4 seem viable to me. Any advice on these, or any other way in which I can query my CSV files?

    Cheers
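
    Option 4 in miniature, sketched with Python's stdlib sqlite3 rather than Derby, just to illustrate the load-then-query shape; the file name and columns are invented:

        import csv, sqlite3

        conn = sqlite3.connect(":memory:")
        with open("orders.csv", newline="") as f:
            rows = list(csv.reader(f))
        header, data = rows[0], rows[1:]
        conn.execute("CREATE TABLE orders (%s)" % ", ".join(header))
        conn.executemany("INSERT INTO orders VALUES (%s)" % ",".join("?" * len(header)),
                         data)
        # the predefined SQL -- embedded selects and aliases included -- runs as-is
        for row in conn.execute("SELECT o.* FROM orders o "
                                "WHERE o.qty > (SELECT AVG(qty) FROM orders)"):
            print(row)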

  • Working with a CSV file with odd encapsulation // PHP

    - by Patrick
    I have a CSV file that I'm working with, and all the fields are comma separated. But some of the fields themselves contain commas. In the raw CSV file, the fields that contain commas are encapsulated with quotes, as seen here:

        "Doctor Such and Such, Medical Center","555 Scruff McGruff, Suite 103, Chicago IL 60652",(555) 555-5555,,,,something else

    The code I'm using is below:

        <?PHP
        $file_handle = fopen("file.csv", "r");
        $i = 0;
        while (!feof($file_handle)) {
            $line = fgetcsv($file_handle, 1024);
            $c = 0;
            foreach ($line AS $key => $value) {
                if ($i != 0) {
                    if ($c == 0) {
                        echo "[ROW $i][COL $c] - $value"; // First field in row, show row #
                    } else {
                        echo "[COL $c] - $value"; // Remaining fields in row
                    }
                }
                $c++;
            }
            echo "<br>"; // Line break to next line
            $i++;
        }
        fclose($file_handle);
        ?>

    The problem is I'm getting the fields with the commas split into two fields, which messes up the number of columns I'm supposed to have. Is there any way I could search for commas within quotes and convert them, or another way to deal with this?
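
    For what it's worth, a quick way to sanity-check the sample row is to run it through a CSV-aware parser; here is that check sketched with Python's csv module, which keeps quoted commas inside their fields:

        import csv, io

        sample = ('"Doctor Such and Such, Medical Center",'
                  '"555 Scruff McGruff, Suite 103, Chicago IL 60652",'
                  '(555) 555-5555,,,,something else')
        fields = next(csv.reader(io.StringIO(sample)))
        print(len(fields))  # 7 -- the quoted commas did not split anything
        print(fields[0])    # Doctor Such and Such, Medical Center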

  • Mailing Address is forcing 2nd Record in CSV (DevExpress Controls)

    - by Gerhard Weiss
    My DevExpress CSV export has two lines per record:

        Rank Score,Prog.,Full Address
        63.30 ,JIW ,1234 Whispering Pines Dr
         , ,Sometown MI 48316
        62.80 ,JIW ,9876 Beagle Dr
         , ,Sometown Twp MI 48382

    I would like to change it to one line because I want to do a Word merge. (Unless Word can merge these two lines back together, which I do not think it can do.) Does the DevExpress XtraGrid ExportToText allow for this?

    A MemoEdit is being used by the XtraGrid and is initialized using the following code (notice the ControlChars.NewLine):

        Public ReadOnly Property FullAddress() As String
            Get
                Dim strAddress As System.Text.StringBuilder = New StringBuilder()
                If Not IsNullEntity Then
                    strAddress.Append(Line1 + ControlChars.NewLine)
                    If Not Line2 Is Nothing AndAlso Me.Line2.Length > 0 Then
                        strAddress.Append(Line2 + ControlChars.NewLine)
                    End If
                    If Not Line3 Is Nothing AndAlso Me.Line3.Length > 0 Then
                        strAddress.Append(Line3 + ControlChars.NewLine)
                    End If
                    strAddress.Append(String.Format("{0} {1} {2}", City, RegionCode, PostalCode))
                    If Me.Country <> "UNITED STATES" Then
                        strAddress.Append(ControlChars.NewLine + Me.Country)
                    End If
                End If
                Return strAddress.ToString
            End Get
        End Property

    Here is the export to CSV code:

        Dim saveFile As New SaveFileDialog
        With saveFile
            '.InitialDirectory = ClientManager.CurrentClient.Entity.StudentPictureDirectory
            .FileName = "ApplicantList.csv"
            .CheckPathExists = True
            .CheckFileExists = False
            .Filter = "All Files (*.*)|*.*"
        End With
        If saveFile.ShowDialog() = Windows.Forms.DialogResult.OK Then
            Dim PrintingTextExportOptions As New DevExpress.XtraPrinting.TextExportOptions(",", Encoding.ASCII)
            PrintingTextExportOptions.QuoteStringsWithSeparators = True
            ApplicantRankListViewsGridControl.ExportToText(saveFile.FileName, PrintingTextExportOptions)
        End If
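
    If the export itself can't be changed, one fallback is repairing the file afterwards. A sketch that assumes exactly two physical lines per record, as in the sample above, and joins the two address fragments with a comma:

        import csv

        with open("ApplicantList.csv", newline="") as f:
            rows = list(csv.reader(f))
        header, body = rows[0], rows[1:]
        records = [[a[0].strip(), a[1].strip(),
                    "%s, %s" % (a[2].strip(), b[2].strip())]
                   for a, b in zip(body[0::2], body[1::2])]
        with open("ApplicantListFlat.csv", "w", newline="") as f:
            out = csv.writer(f)
            out.writerow(header)
            out.writerows(records)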

  • Python and csv help

    - by user353064
    I'm trying to create a script that will check the computer host name, then search a master list for that value and return the corresponding value from the csv file, then open another file and do a find and replace. I know this should be easy, but I haven't done much in Python before. Here is what I have so far...

    masterlist.txt (tab delimited):

        Name                    UID
        Bob-Smith.local         bobs
        Carmen-Jackson.local    carmenj
        David-Kathman.local     davidk
        Jenn-Roberts.local      jennr

    Here is the script that I have created thus far:

        #GET CLIENT HOST NAME
        import socket
        host = socket.gethostname()
        print host

        #IMPORT MASTER DATA
        import csv, sys
        filename = "masterlist.txt"
        reader = csv.reader(open(filename, "rU"))

        #PRINT MASTER DATA
        for row in reader:
            print row

        #SEARCH ON HOSTNAME AND RETURN UID

        #REPLACE VALUE IN FILE WITH UID
        #import fileinput
        #for line in fileinput.FileInput("filetoreplace", inplace=1):
        #    line = line.replace("replacethistext", "UID")
        #    print line

    Right now, it's just set to print the master list. I'm not sure if the list needs to be parsed and placed into a dictionary or what. I really need to figure out how to search the first field for the hostname and then return the field in the second column.

    Thanks in advance for your help,
    Aaron
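
    A dictionary is one natural fit here. A sketch of the lookup half (Python 3, with the tab delimiter spelled out and a two-column file assumed):

        import csv, socket

        host = socket.gethostname()
        with open("masterlist.txt", newline="") as f:
            reader = csv.reader(f, delimiter="\t")
            next(reader)                     # skip the Name/UID header row
            uid_by_host = {name: uid for name, uid in reader}
        uid = uid_by_host.get(host)          # None if this machine isn't listed
        print(host, uid)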

  • define keys in multidimensional array from csv

    - by mourique
    I want to compare two arrays: one coming from a shopping cart and the other one parsed from a csv file.

    The array from the shopping cart looks like this:

        Array
        (
            [0] => Array
                (
                    [id] => 7
                    [qty] => 1
                    [price] => 07.39
                    [name] => walkthebridge
                    [subtotal] => 7.39
                )
            [1] => Array
                (
                    [id] => 2
                    [qty] => 1
                    [price] => 07.39
                    [name] => milkyway
                    [subtotal] => 7.39
                )
        )

    The array from my csv file, however, looks like this:

        Array
        (
            [0] => Array
                (
                    [0] => 1
                    [1] => walkthebridge
                    [2] => 07.39
                )
            [1] => Array
                (
                    [0] => 2
                    [1] => milkyway
                    [2] => 07.39
                )
        )

    and is built using this code:

        $checkitems = array();
        $file = fopen('checkitems.csv', 'r');
        while (($result = fgetcsv($file)) !== false) {
            $checkitems[] = $result;
        }
        fclose($file);

    How can I get the keys in the second array to match those in the first one? (So that 0 would be id, 1 would be name, and so on.)

    Thanks in advance
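
    The underlying move is zipping a fixed key list onto each row; sketched in Python below. The key order id/name/price is read off the sample above and is an assumption:

        import csv

        KEYS = ("id", "name", "price")
        with open("checkitems.csv", newline="") as f:
            checkitems = [dict(zip(KEYS, row)) for row in csv.reader(f)]
        # e.g. [{'id': '1', 'name': 'walkthebridge', 'price': '07.39'}, ...]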

  • Showing errors in updatepanel after processing a CSV file

    - by Younes
    I have a csv import routine which imports my CSV values into Sitecore. After this process is done, I want to show the errors in an asp:Literal. This is not working, and I think it is because I need an UpdatePanel to be able to update text after the first postback (the csv upload / import). I made this:

        <asp:ScriptManager ID="ScriptManager1" runat="server">
        </asp:ScriptManager>
        <asp:UpdatePanel ID="UpdatePanel1" runat="server">
            <ContentTemplate>
                <asp:PlaceHolder ID="PlaceHolder1" runat="server"></asp:PlaceHolder>
                <asp:Button ID="Button1" runat="server" Text="Button" OnClick="Button1_Click" />
            </ContentTemplate>
        </asp:UpdatePanel>

    and coded this:

        string melding = string.Format("Er zijn {0} objecten geïmporteerd.{1}", nrOfItemsImported, errors);
        ViewState["Melding"] = melding;

    And I have a button. In the OnClick of this button I have:

        Literal literal = new Literal();
        literal.Text = (string)ViewState["Melding"];
        literal.ID = DateTime.Now.Ticks.ToString();
        UpdatePanel1.ContentTemplateContainer.Controls.Add(literal);
        PlaceHolder1.Controls.Add(literal);

    When I now press the button, I want the panel to update and show my Literal with the error message on it. This, however, isn't happening. How can this be? I'm guessing it has something to do with my ViewState; I don't see keys in the ViewState after I press the button...

  • csv to hash data structure conversion using perl

    - by Kavya S
    1. Convert a .csv file to a Perl hash data structure.

    Format of the .csv file:

        sw,s1,s2,s3,s4
        ver,v1,v2,v3,v4
        msword,v2,v3,v1,v1
        paint,v4,v2,v3,v3
        outlook,v1,v1,v3,v2

    My Perl script:

        #!/usr/local/bin/perl
        use strict;
        use warnings;
        use Data::Dumper;

        my %hash;
        open my $fh, '<', 'some_file.csv' or die "Cannot open: $!";
        while (my $line = <$fh>) {
            $line =~ s/,,/-/;
            chomp ($line);
            my @array = split /,/, $line;
            my $key = shift @array;
            $hash{$key} = $line;
            $hash{$key} = \@array;
        }
        print Dumper(\%hash);
        close $fh;

    The Perl hash, i.e. the output, should look like:

        $sw_ver_db = {
            s1 => {
                msword  => {ver => v2},
                paint   => {ver => v4},
                outlook => {ver => v1},
            },
            s2 => {
                msword  => {ver => v3},
                paint   => {ver => v2},
                outlook => {ver => v1},
            },
            s3 => {
                msword  => {ver => v1},
                paint   => {ver => v3},
                outlook => {ver => v3},
            },
            s4 => {
                msword  => {ver => v1},
                paint   => {ver => v3},
                outlook => {ver => v2},
            },
        };
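
    The pivot itself, sketched in Python to show the shape of the transformation; the first header cell names the key column, and the 'ver' row is skipped to match the desired output above:

        import csv

        with open("some_file.csv", newline="") as f:
            rows = list(csv.reader(f))
        servers = rows[0][1:]                                  # s1..s4
        programs = {r[0]: r[1:] for r in rows[1:] if r[0] != "ver"}
        sw_ver_db = {s: {prog: {"ver": vals[i]}
                         for prog, vals in programs.items()}
                     for i, s in enumerate(servers)}
        # sw_ver_db["s1"]["msword"] == {"ver": "v2"}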

  • Ruby CSV in Array

    - by mattrock
    I read a CSV file. This file looks like this:

        1.00 cm; 2.00 cm; 3.00 cm; ... ; 100 cm
        2.00 cm; 4.00 cm; 6.00 cm; ... ; 100 cm
        4.00 cm; 8.00 cm; 12.00 cm; ... ; 100 cm
        8.00 cm; 16.00 cm; 24.00 cm; ... ; 100 cm

    I have already written the following code:

        CSV.foreach("/Users/testUser/Entwicklung/coverrechner/CoverPaperDE.csv", col_sep: ';') do |row|
          puts row[0]
        end

    This produces the following output:

        1.00 cm
        2.00 cm
        4.00 cm
        8.00 cm

    Example: my matrix is constructed

        1.1 1.2 1.3 1.4
        2.1 2.2 2.3 2.4
        3.1 3.2 3.3 3.4
        4.1 4.2 4.3 4.4

    I want the following output:

        1.1 2.1 3.1 4.1
        1.2 2.2 3.2 4.2
        ...
        4.4

    How does it work?
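
    What's wanted is a transpose: read the whole matrix, flip rows and columns, then walk the flipped rows. Sketched in Python, where zip(*rows) does the flip (the Ruby equivalent would be along the lines of CSV.read(...).transpose):

        import csv

        with open("CoverPaperDE.csv", newline="") as f:
            rows = list(csv.reader(f, delimiter=";"))
        for column in zip(*rows):        # each tuple is one original column
            for value in column:
                print(value.strip())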

  • [MySQL] Load data from .csv applying regex before insert into table

    - by Gabriel L. Oliveira
    I know that there is a statement to import .csv data into a MySQL table, and I'm using this one:

        LOAD DATA INFILE "file.csv"
        INTO TABLE foo
        FIELDS TERMINATED BY ","
        LINES TERMINATED BY "\\r\\n";

    The data inside this .csv consists of lines like these examples:

        08/e0/Breast_Cancer_Res_2001_Nov_2_3(1)_55-60.tar.gz  Breast Cancer Res. 2001 Nov 2; 3(1):55-60  PMC13900
        b0/ac/Breast_Cancer_Res_2001_Nov_9_3(1)_61-65.tar.gz  Breast Cancer Res. 2001 Nov 9; 3(1):61-65  PMC13901

    I just want the first part (the .tar.gz path), always on the pattern

        (letter or number)(letter or number) / (letter or number)(letter or number) / ...

    and the part starting with 'PMC', always on the pattern PMC(number...), where 'number' means a digit from 0 to 9 and 'letter' means a letter from a to z (both upper and lower case).

    So, applying the LOAD DATA and the regex, and inserting the resulting entries into my SQL table, the result table should be:

        1  08/e0/Breast_Cancer_Res_2001_Nov_2_3(1)_55-60.tar.gz  PMC13900
        2  b0/ac/Breast_Cancer_Res_2001_Nov_9_3(1)_61-65.tar.gz  PMC13901

    What should the SQL command be to do all this?
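
    If a preprocessing pass is acceptable, a sketch that extracts the two fields with a regex and writes a clean two-column file for LOAD DATA; the whitespace between the original columns is an assumption:

        import csv, re

        pat = re.compile(r"^([0-9A-Za-z]{2}/[0-9A-Za-z]{2}/\S+\.tar\.gz)\s.*?(PMC\d+)\s*$")
        with open("file.csv") as src, open("clean.csv", "w", newline="") as dst:
            writer = csv.writer(dst)
            for line in src:
                m = pat.match(line)
                if m:
                    writer.writerow(m.groups())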

  • How to store sorted records in a csv file?

    - by Harikrishna
    I sort the records of a DataTable by date, via the column TradingDate, which is of type DateTime:

        TableWithOnlyFixedColumns.DefaultView.Sort = "TradingDate asc";

    Now I want to write these sorted records to a csv file, but it does not write the records sorted by date:

        TableWithOnlyFixedColumns.DefaultView.Sort =
            "[" + TableWithOnlyFixedColumns.Columns["TradingDate"].ColumnName + "] asc";
        DataTable newTable = TableWithOnlyFixedColumns.Clone();
        newTable.DefaultView.Sort = TableWithOnlyFixedColumns.DefaultView.Sort;
        foreach (DataRow oldRow in TableWithOnlyFixedColumns.Rows)
        {
            newTable.ImportRow(oldRow);
        }
        // we'll use these to check for rows with nulls
        var columns = newTable.DefaultView.Table.Columns.Cast<DataColumn>();
        using (var writer = new StreamWriter(@"C:\Documents and Settings\Administrator\Desktop\New Text Document (3).csv"))
        {
            for (int i = 0; i < newTable.DefaultView.Table.Rows.Count; i++)
            {
                DataRow row = newTable.DefaultView.Table.Rows[i];
                // check for any null cells
                if (columns.Any(column => row.IsNull(column)))
                    continue;
                string[] textCells = row.ItemArray
                    .Select(cell => cell.ToString())
                    // may need to pick a text qualifier here
                    .ToArray();
                // check for non-null but EMPTY cells
                if (textCells.Any(text => string.IsNullOrEmpty(text)))
                    continue;
                writer.WriteLine(string.Join(",", textCells));
            }
        }

    So how do I store the sorted records in the csv file?
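
    The underlying rule, sketched in Python: the sort has to be applied to the sequence that is actually written, because the writer preserves whatever order it receives (the field names here are invented):

        import csv
        from datetime import date

        records = [{"TradingDate": date(2010, 5, 12), "Close": 10.1},
                   {"TradingDate": date(2010, 5, 10), "Close": 9.8}]
        records.sort(key=lambda r: r["TradingDate"])   # sort the rows themselves
        with open("sorted.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, ["TradingDate", "Close"])
            writer.writeheader()
            writer.writerows(records)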

  • PHP Array to CSV

    - by JohnnyFaldo
    I'm trying to convert an array of products into a CSV file, but it doesn't seem to be going to plan. The CSV file is one long line. Here is my code:

        for ($i = 0; $i < count($prods); $i++) {
            $sql = "SELECT * FROM products WHERE id = '" . $prods[$i] . "'";
            $result = $mysqli->query($sql);
            $info = $result->fetch_array();
        }

        $header = '';
        for ($i = 0; $i < count($info); $i++) {
            $row = $info[$i];
            $line = '';
            for ($b = 0; $b < count($row); $b++) {
                $value = $row[$b];
                if ((!isset($value)) || ($value == "")) {
                    $value = "\t";
                } else {
                    $value = str_replace('"', '""', $value);
                    $value = '"' . $value . '"' . "\t";
                }
                $line .= $value;
            }
            $data .= trim($line) . "\n";
        }
        $data = str_replace("\r", "", $data);
        if ($data == "") {
            $data = "\n(0) Records Found!\n";
        }

        header("Content-type: application/octet-stream");
        header("Content-Disposition: attachment; filename=your_desired_name.xls");
        header("Pragma: no-cache");
        header("Expires: 0");
        print "$data";

    Also, the header doesn't force a download; I've been copying and pasting the output and saving it as .csv.
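
    For comparison, the delegate-everything version of the same export, sketched in Python: a csv writer handles the quoting and the row terminators, so no separators are built by hand (the sample data is invented):

        import csv

        products = [["1", "widget", "9.99"], ["2", "gadget, large", "12.50"]]
        with open("export.csv", "w", newline="") as f:
            writer = csv.writer(f, quoting=csv.QUOTE_ALL)
            writer.writerow(["id", "name", "price"])   # header row
            writer.writerows(products)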

  • what is the proper way to do logging in csv file?

    - by user2003548
    I want to log some information about every single request sent to a busy HTTP server, in a formatted form. Using the log module would create something I don't want:

        [I 131104 15:31:29 Sys:34]

    I thought of the CSV format, but I don't know how to customize it. Python has the csv module, but the manual shows:

        import csv
        with open('some.csv', 'w', newline='') as f:
            writer = csv.writer(f)
            writer.writerows(someiterable)

    Since this would open and close the file each time, I am afraid it would slow down the whole server's performance. What could I do?
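
    One sketch: open the log once at startup and hand every request the same writer; line buffering keeps rows from sitting in memory (the field names are invented):

        import csv

        log_file = open("requests.csv", "a", newline="", buffering=1)  # line-buffered
        log_writer = csv.writer(log_file)

        def log_request(method, path, status, millis):
            # one row per request; no open/close cost on the hot path
            log_writer.writerow([method, path, status, millis])

        log_request("GET", "/index.html", 200, 12)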

  • Handling extra newlines in csv files parsed with Python?

    - by rmihalyi
    I have a CSV file that contains extra newlines in some fields, e.g.:

        A, B, C, D, E, F
        123, 456, tree
         , very, bla, indigo

    I tried the following:

        import csv
        catalog = csv.reader(open('test.csv', 'rU'), delimiter=",", dialect=csv.excel_tab)
        for row in catalog:
            print "Length: ", len(row), row

    and the result I got was this:

        Length:  6 ['A', ' B', ' C', ' D', ' E', ' F']
        Length:  3 ['123', ' 456', ' tree']
        Length:  4 [' ', ' very', ' bla', ' indigo']

    Does anyone have any idea how I can quickly remove extraneous newlines? Thanks!
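
    A sketch of one quick repair: keep appending parsed fragments until the expected six columns are reached, gluing the split field back together at the seam:

        import csv

        EXPECTED = 6
        rows, pending = [], []
        with open("test.csv", newline="") as f:
            for fragment in csv.reader(f):
                if pending:
                    pending[-1] += fragment[0]   # rejoin the broken field
                    pending.extend(fragment[1:])
                else:
                    pending = fragment
                if len(pending) >= EXPECTED:
                    rows.append([c.strip() for c in pending])
                    pending = []
        print(rows)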

  • How to transform a csv to combine matching rows?

    - by Christian Wolf
    I have a CSV file with some transaction data: let's say date, volume, price and direction (sell/buy). Additionally, there is an ID for each transaction, and each closing transaction (the newer one) carries a reference to the corresponding opening transaction -- classical database referencing.

    Now I want to do some statistics and draw some plots. This could be done via Octave, LaTeX/TikZ, Gnuplot or whatever. To do this I need both the buy and the sell price in one row. My thought was to preprocess the CSV into another CSV containing the needed information and then do the statistics. In the end I'd like a solution based on scripts and not on a spreadsheet, as the data might change often (it is exported from an online DB).

    My actual solution (see http://paste.ubuntu.com/6262822/ ) is a bash script that parses the CSV line by line and checks if a corresponding transaction exists. If one is found, a new row is written to the destination CSV; if not, a warning is printed. The bad news: for each row in the source file, I have to read the whole file a few times. This causes running times of 10 s for 300 lines. As the line count might rise soon (10k lines), this is not perfect. I am aware that many shells are opened in the script, which might cause the performance problems.

    Now my questions:

    - Is bash/awk/sed/... a good way to do this?
    - Should I first import all the data into a "real" local database and use SQL?
    - Is there an easy way to achieve the desired result?
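
    One sketch of the quadratic-to-linear fix, in Python terms: index the opening transactions by ID in a dict, then emit one combined row per closing transaction in a single pass (the column names are assumptions about the export):

        import csv

        opens = {}
        with open("transactions.csv", newline="") as src, \
             open("pairs.csv", "w", newline="") as dst:
            out = csv.writer(dst)
            out.writerow(["open_date", "close_date", "buy", "sell", "volume"])
            for row in csv.DictReader(src):
                if row["ref"]:                       # a closing transaction
                    o = opens.pop(row["ref"], None)
                    if o is None:
                        print("no opening transaction for", row["id"])
                        continue
                    out.writerow([o["date"], row["date"],
                                  o["price"], row["price"], row["volume"]])
                else:
                    opens[row["id"]] = row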

  • Convert a CSV file to a XLS file on the linux command line?

    - by Rory
    I'm using Debian Linux and I want to convert a CSV file to an Excel XLS spreadsheet file. The catdoc package includes the xls2csv command, which converts from XLS to CSV -- but it doesn't do the reverse. Since I just have a CSV file, I don't care about formatting or anything like that. I'm not worried if it only generates a very simple XLS file and doesn't support the fancy new versions, just so long as it's an XLS spreadsheet.
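
    For the script-based route, a sketch using the third-party xlwt package -- one of several CSV-to-XLS options, with placeholder file names:

        import csv
        import xlwt  # pip install xlwt

        workbook = xlwt.Workbook()
        sheet = workbook.add_sheet("Sheet1")
        with open("data.csv", newline="") as f:
            for r, row in enumerate(csv.reader(f)):
                for c, value in enumerate(row):
                    sheet.write(r, c, value)   # every cell lands as text
        workbook.save("data.xls")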

  • How to insert a list of data files (described in a CSV file) from the client location into a database using PHP?

    - by Golam Mustafa
    We have some DVDs. Each of them contains:

    - a CSV file containing some information about the documents, and
    - the list of pdf files (scanned documents).

    Example of the CSV file:

        Title,Author,FileName
        Design Document 0455, Eric Clipton,ds0455.pdf
        Tesign Document 0511,Johanson E,td0511.pdf

    I want to write PHP code that would read the CSV file and insert each piece of information into a database table as a record. Can anybody give me an idea of how to select the individual files from the client location, on the basis of the file names in the CSV file, using a PHP script?

    Thanks,
    Golam
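
    The database half of the task is straightforward; a sketch with Python's stdlib sqlite3 standing in for the real table (the mount path is a placeholder -- picking files off the client's machine, by contrast, requires a browser upload and can't be done by a server-side script alone):

        import csv, sqlite3

        conn = sqlite3.connect("docs.db")
        conn.execute("CREATE TABLE IF NOT EXISTS docs"
                     " (title TEXT, author TEXT, filename TEXT)")
        with open("/media/dvd/index.csv", newline="") as f:
            reader = csv.DictReader(f)   # header: Title,Author,FileName
            conn.executemany("INSERT INTO docs VALUES (:Title, :Author, :FileName)",
                             reader)
        conn.commit()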

  • CSV string handling

    - by Christian Hagelid
    The typical way of creating a CSV string (pseudocode):

    1. Create a CSV container object (like a StringBuilder in C#).
    2. Loop through the strings you want to add, appending a comma after each one.
    3. After the loop, remove that last superfluous comma.

    Code sample:

        public string ReturnAsCSV(ContactList contactList)
        {
            StringBuilder sb = new StringBuilder();
            foreach (Contact c in contactList)
            {
                sb.Append(c.Name + ",");
            }
            sb.Remove(sb.Length - 1, 1);
            //sb.Replace(",", "", sb.Length - 1, 1)
            return sb.ToString();
        }

    I feel that there should be an easier / cleaner / more efficient way of removing that last comma. Any ideas?

    Update: I like the idea of adding the comma by checking if the container is empty, but doesn't that mean more processing, since it needs to check the length of the string on each occurrence?
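
    The join-based alternative, sketched in Python: build the sequence first and let join place the separators, so there is no trailing comma to trim and no per-item length check:

        class Contact:
            def __init__(self, name):
                self.name = name

        def return_as_csv(contact_list):
            # join only puts commas *between* items -- nothing to strip afterwards
            return ",".join(c.name for c in contact_list)

        print(return_as_csv([Contact("Ann"), Contact("Bob")]))  # Ann,Bob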

  • Exporting info from PS script to csv

    - by George
    Hi,

    This is a PowerShell/AD/Exchange question. I'm running a script against a number of users to check some of their attributes, but I'm having trouble getting it to output to CSV. The script runs well and does exactly what I need it to do, and the output on screen is fine; I'm just having trouble getting it to export directly to CSV. The input is a comma-separated txt file of usernames (e.g. "username1,username2,username3").

    I've experimented with creating custom PS objects, adding to them and then exporting those, but it's not working... Any suggestions gratefully received.

    Thanks,
    George

        $array = Get-Content $InputPath
        #split the comma delimited string into an array
        $arrayb = $array.Split(",");
        foreach ($User in $arrayb)
        {
            #find group membership
            Write-Host "AD group membership for $User"
            Get-QADMemberOf $User

            #Get Mailbox Info
            Write-Host "Mailbox info for $User"
            Get-Mailbox $User | select ServerName, Database, EmailAddresses, PrimarySmtpAddress, WindowsEmailAddress

            #get profile details
            Write-Host "Home drive info for $User"
            Get-QADUser $User | select HomeDirectory, HomeDrive

            #add space between users
            Write-Host ""
            Write-Host "******************************************************"
        }
        Write-Host "End Script"
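
    The shape of the usual fix, sketched in Python terms: build one flat record per user and write the whole collection in a single pass at the end, instead of printing as you go (in PowerShell the equivalent ending would be piping the collected objects to Export-Csv; all field values here are invented):

        import csv

        users = ["username1", "username2", "username3"]
        records = [{"User": u, "HomeDrive": "H:", "Database": "DB01"} for u in users]
        with open("report.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, ["User", "HomeDrive", "Database"])
            writer.writeheader()
            writer.writerows(records)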

  • Mathematica - Import CSV and process columns?

    - by Casey
    I have a CSV file that is formatted like:

        0.0023709,8.5752e-007,4.847e-008

    and I would like to import it into Mathematica and then have each column separated into a list, so I can do some math on the selected column. I know I can import the data with:

        Import["data.csv"]

    Then I can separate the columns with this:

        StringSplit[data[[1, 1]], ","]

    which gives:

        {"0.0023709", "8.5752e-007", "4.847e-008"}

    The problem now is that I don't know how to get the data into individual lists, and also Mathematica does not accept scientific notation in the form 8.5e-007. Any help in how to break the data into columns and handle the scientific notation would be great. Thanks in advance.
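
    The same two steps sketched in Python for reference: float() accepts the 8.5752e-007 form directly, and transposing the parsed rows yields per-column lists:

        import csv

        with open("data.csv", newline="") as f:
            rows = [[float(x) for x in row] for row in csv.reader(f)]
        columns = [list(col) for col in zip(*rows)]
        print(columns[1])   # e.g. [8.5752e-07, ...]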
