Search Results

Search found 1639 results on 66 pages for 'csv'.

Page 21 of 66

  • Check the encoding of text in SQlite

    - by JJG
    I'm having a nightmare dealing with non-European text in SQLite. I think the problem is that SQLite isn't storing the text as UTF-8, so I want to check what the encoding is and, if necessary, change it to UTF-8. I encoded a CSV file as UTF-8 and simply imported it into SQLite, but the non-Roman text comes out garbled. I would like to know: 1) how to check the database encoding, and 2) how to change the encoding if it is not UTF-8. I've been reading about PRAGMA encoding, but I'm not sure how to use it.
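
    A minimal sketch of the check from Python's sqlite3 module (file names are placeholders). One caveat worth knowing: PRAGMA encoding can only set the encoding of a new, empty database; an existing database keeps whatever it was created with, so already-garbled rows usually need to be re-imported from a correctly decoded source.

        import sqlite3

        conn = sqlite3.connect("example.db")   # placeholder file name

        # Report the encoding the database was created with: UTF-8, UTF-16le or UTF-16be.
        print(conn.execute("PRAGMA encoding;").fetchone()[0])

        # Setting the encoding only works on a brand-new, empty database,
        # so it has to be issued before any tables are created.
        fresh = sqlite3.connect("fresh.db")    # placeholder file name
        fresh.execute('PRAGMA encoding = "UTF-8";')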

    Read the article

  • AppEngine GeoPt Data Upload

    - by Eric Landry
    I'm writing a GAE app in Java and only using Python for the data upload. I'm trying to import a CSV file that looks like this:

        POSTAL_CODE_ID,PostalCode,City,Province,ProvinceCode,CityType,Latitude,Longitude
        1,A0E2Z0,Monkstown,Newfoundland,NL,D,47.150300000000001,-55.299500000000002

    I was able to import this file into my datastore by loading Latitude and Longitude as floats, but I'm having trouble figuring out how to import lat and lng as a GeoPt. Here is my loader.py file:

        import datetime
        from google.appengine.ext import db
        from google.appengine.tools import bulkloader

        class PostalCode(db.Model):
            id = db.IntegerProperty()
            postal_code = db.PostalAddressProperty()
            city = db.StringProperty()
            province = db.StringProperty()
            province_code = db.StringProperty()
            city_type = db.StringProperty()
            lat = db.FloatProperty()
            lng = db.FloatProperty()

        class PostalCodeLoader(bulkloader.Loader):
            def __init__(self):
                bulkloader.Loader.__init__(self, 'PostalCode',
                                           [('id', int),
                                            ('postal_code', str),
                                            ('city', str),
                                            ('province', str),
                                            ('province_code', str),
                                            ('city_type', str),
                                            ('lat', float),
                                            ('lng', float)])

        loaders = [PostalCodeLoader]

    I think the two db.FloatProperty() lines should be replaced with a db.GeoPtProperty(), but that's where my trail ends. I'm very new to Python, so any help would be greatly appreciated.
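
    For what it's worth, one possible workaround (a sketch only, not a confirmed recipe): each bulkloader converter only sees a single CSV column, so the latitude and longitude could be pre-joined into one column (for example "47.1503;-55.2995") and converted into a GeoPt for a single db.GeoPtProperty(). The joined column format and the to_geopt helper below are assumptions for illustration.

        from google.appengine.ext import db
        from google.appengine.tools import bulkloader

        class PostalCode(db.Model):
            # ...other properties unchanged...
            location = db.GeoPtProperty()   # replaces the two FloatProperty fields

        def to_geopt(value):
            # 'value' is assumed to be a pre-joined "lat;lng" column in the CSV.
            lat, lng = value.split(';')
            return db.GeoPt(float(lat), float(lng))

        class PostalCodeLoader(bulkloader.Loader):
            def __init__(self):
                bulkloader.Loader.__init__(self, 'PostalCode',
                                           [('id', int),
                                            ('postal_code', str),
                                            ('city', str),
                                            ('province', str),
                                            ('province_code', str),
                                            ('city_type', str),
                                            ('location', to_geopt)])

        loaders = [PostalCodeLoader]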

    Read the article

  • Copy one column over another in a delimited file

    - by user275455
    For instance, I needed to remove column 25 and replace it with a copy of column 22 in a simple CSV file with no embedded delimiters. The best I could come up with was the awkward-looking:

        awk -F, '{ for(x=1;x<25;x++){printf("%s,", $x)}; printf("%s,", $22); for(x=26;x<59;x++){printf("%s,", $x)}; print $59 }'

    I would expect something like cut -d, -f1-24,22,26-59 to work, but cut doesn't seem to want to print the same column twice... Is there a more elegant way to do it using anything typically available in a Linux shell environment?
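
    For the record, awk can also do this with a plain field assignment (setting $25 = $22 and re-printing the record with the output separator set to a comma). If Python counts as "typically available", a small sketch using the csv module (the script name and I/O redirection are up to you):

        import csv
        import sys

        # Read comma-delimited rows from stdin, copy field 22 over field 25
        # (1-based column numbers, as in the question), and write the result to stdout.
        reader = csv.reader(sys.stdin)
        writer = csv.writer(sys.stdout)
        for row in reader:
            row[24] = row[21]   # list indices are 0-based
            writer.writerow(row)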

    Read the article

  • speeding up parsing of files

    - by user248237
    The following function parses a CSV file into a list of dictionaries, where each element in the list is a dictionary whose values are indexed by the header of the file (assumed to be the first line). This function is very, very slow, taking ~6 seconds for a file that's relatively small (less than 30,000 lines). How can I speed it up?

        def csv2dictlist_raw(filename, delimiter='\t'):
            f = open(filename)
            header_line = f.readline().strip()
            header_fields = header_line.split(delimiter)
            dictlist = []
            # convert data to list of dictionaries
            for line in f:
                values = map(tryEval, line.strip().split(delimiter))
                dictline = dict(zip(header_fields, values))
                dictlist.append(dictline)
            return (dictlist, header_fields)

    Thanks.
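
    A sketch of one common alternative using the standard library's csv module. Whether it helps here depends mostly on what tryEval does: calling it on every single field is the likeliest bottleneck, so converting only the columns that actually need it (or leaving values as strings, as below) tends to matter more than the reader itself. pandas.read_csv is another option for files this size.

        import csv

        def csv2dictlist(filename, delimiter='\t'):
            """Parse a delimited file into (list of row dicts, header fields)."""
            with open(filename) as f:
                reader = csv.DictReader(f, delimiter=delimiter)
                rows = list(reader)              # values are left as strings
                return rows, reader.fieldnames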

    Read the article

  • Indexing CSV file contents in Python

    - by Hossein
    Hi, I have a very large CSV file containing only two fields (id, url). I want to do some indexing on the url field with Python. I know there are tools like Whoosh or PyLucene, but I can't get the examples to work. Can someone help me with this?
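
    For Whoosh, a minimal index-and-search sketch might look like the following; the directory and file names are placeholders, and depending on how the URLs should be matched, an ID field or a custom analyzer may be a better fit than TEXT.

        import csv
        import os

        from whoosh.fields import ID, TEXT, Schema
        from whoosh.index import create_in
        from whoosh.qparser import QueryParser

        # Build a Whoosh index over the url field of an id,url CSV file.
        schema = Schema(id=ID(stored=True), url=TEXT(stored=True))
        if not os.path.exists("url_index"):
            os.mkdir("url_index")
        ix = create_in("url_index", schema)

        writer = ix.writer()
        with open("urls.csv") as f:          # placeholder file name
            for row_id, url in csv.reader(f):
                writer.add_document(id=row_id, url=url)
        writer.commit()

        # Query the index.
        with ix.searcher() as searcher:
            query = QueryParser("url", ix.schema).parse("example.com")
            for hit in searcher.search(query):
                print(hit["id"], hit["url"])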

    Read the article

  • need a regex to parse a csv file with double quotes in php

    - by Brandon G
    I'm trying to parse a CSV file that has all the data wrapped in double quotes, because there may be commas inside the quotes. It looks like this:

        $songs = '"1, 2, 3, 4 (I Love You)","Plain White T's","CBE10-22",15,"CBE10-22","","","CB",984,"","10/05/10"';
        $regResult = preg_match( "", $songs, $matches );

    I can't figure out a regex that will return the data between the quotes as the matches. I'm sure there is some regex master who can help me with this.
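
    Regex aside, a dedicated CSV parser handles quoted fields and embedded commas for free; in PHP the built-in fgetcsv()/str_getcsv() functions do exactly this. Purely to illustrate the behaviour, the same idea in Python:

        import csv

        line = '"1, 2, 3, 4 (I Love You)","Plain White T\'s","CBE10-22",15,"CBE10-22","","","CB",984,"","10/05/10"'

        # The reader splits on commas but keeps commas inside quoted fields intact.
        fields = next(csv.reader([line]))
        print(fields[0])   # 1, 2, 3, 4 (I Love You)
        print(fields[1])   # Plain White T's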

    Read the article

  • Upload Excel or CSV file to MySQL with PHP

    - by Tony
    I'm looking to allow users to upload an Excel or CSV file to MySQL for a contact management system. I need to let users map their columns so that the data is imported into the correct columns of the table. Does anyone know of a good site or tutorial on this?
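
    The column-mapping step itself is language-agnostic; here is a very rough sketch of the idea in Python, with sqlite3 standing in for a MySQL connection. The table, the uploaded file, and the user-chosen mapping are all hypothetical.

        import csv
        import sqlite3   # stand-in for a MySQL connection; only the placeholder style differs

        # Mapping chosen by the user in the upload UI: CSV header -> table column.
        column_map = {"Full Name": "name", "E-mail": "email", "Phone #": "phone"}

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE contacts (name TEXT, email TEXT, phone TEXT)")

        with open("upload.csv", newline="") as f:
            reader = csv.DictReader(f)
            headers = [h for h in reader.fieldnames if h in column_map]
            sql = "INSERT INTO contacts ({}) VALUES ({})".format(
                ", ".join(column_map[h] for h in headers),
                ", ".join("?" for _ in headers))
            conn.executemany(sql, ([row[h] for h in headers] for row in reader))
        conn.commit()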

    Read the article

  • Send comma delimited CSV through SFTP?

    - by JM4
    I am collecting registration information on my site and need to figure out how to export all the data stored in the MySQL DB (or just portions of it) as a comma-delimited CSV file and send it over SFTP so our partners can access the information. The pages are built using PHP. I literally have no idea how to do this and am hoping somebody has experience doing so. Thanks ahead of time!
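
    The job splits into two steps: write the query results out as CSV, then push the file over SFTP (in PHP this is usually done with the ssh2 extension or phpseclib). A rough sketch of the same two steps in Python, with paramiko for the SFTP leg; the host, credentials, table and file names are all placeholders.

        import csv
        import sqlite3      # stand-in for a MySQL driver such as MySQLdb / mysql.connector
        import paramiko

        # 1. Dump the query results to a CSV file.
        conn = sqlite3.connect("site.db")
        cur = conn.execute("SELECT name, email, created_at FROM registrations")
        with open("registrations.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([d[0] for d in cur.description])   # header row
            writer.writerows(cur)

        # 2. Upload the file over SFTP.
        transport = paramiko.Transport(("sftp.partner.example.com", 22))
        transport.connect(username="partner_user", password="secret")
        sftp = paramiko.SFTPClient.from_transport(transport)
        sftp.put("registrations.csv", "/incoming/registrations.csv")
        sftp.close()
        transport.close()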

    Read the article

  • Report generation in PHP (formats required pdf,xls,doc,csv)

    - by Ish Kumar
    I need to generate reports on my PHP website (in Zend Framework). Formats required:

      - PDF (with tables & images); presently using Zend_Pdf
      - XLS (with tables & images)
      - DOC (with tables & images)
      - CSV (only tables)

    Please recommend a robust and fast solution for generating reports in PHP. Platform: Zend Framework on LAMP. I know there are some tricky solutions for creating such reports; I wonder if there is any open-source report generation utility that can be used in a LAMP environment.

    Read the article

  • Retrieving & Displaying data from csv files using AJAX

    - by JJ
    I need to provide a feature such that the user is able to upload a CSV file. Once the upload is done, I need to retrieve each value and show it on a grid implemented using FarPoint Spread (http://www.fpoint.com/products/spread/spread.aspx). All of this has to be done without the page being refreshed. I use ASP.NET 2.0 & Ajax Pro. Remember, I cannot use the built-in AJAX feature provided by Microsoft. To be precise, I need something along the lines of attaching a file in Gmail. Thanks & Regards, Bikram

    Read the article

  • Using a CSV/Text-File as a RecordSource for a Report

    - by Falcon
    I need an Access report to use a CSV file as its RecordSource. I have searched and tried many things, yet I've found no way to achieve this. A temporary table in some other database is not an option. I've been trying to use a DAO Recordset, but while I can read the Recordset just fine, I can neither set it as the report's Recordset nor use its Name as the RecordSource property; both approaches lead to an error. Please help me find a way!

    Read the article

  • pandas read rotated csv files

    - by EricCoding
    Is there any function in pandas that can directly read a rotated CSV file? To be specific, the header information is in the first column instead of the first row. For example:

        A 1 2
        B 3 5
        C 6 7

    and I would like the final DataFrame to look this way:

        A B C
        1 3 6
        2 5 7

    Of course we can get around this problem with some data wrangling techniques like transposing and slicing. I am wondering whether there is a quick way in the API, but I could not find it.
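
    A minimal sketch of the usual one-liner, assuming the file is whitespace-delimited as shown (drop the sep argument for a real comma-separated file; the file name is a placeholder): read it with no header, treat the first column as the index, and transpose.

        import pandas as pd

        # Read the "rotated" file: no header row, first column holds the field names.
        df = pd.read_csv("rotated.csv", sep=r"\s+", header=None, index_col=0).T

        print(df)   # columns are now A, B, C; rows are labelled 1 and 2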

    Read the article

  • Shortening large CSV on debian

    - by Unkwntech
    I have a very large CSV file and need to write an app that will parse it, but using the 6 GB file to test against is painful. Is there a simple way to extract the first hundred or two hundred lines without having to load the entire file into memory? The file resides on a Debian server.
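
    The standard head utility does exactly this without reading the rest of the file (e.g. head -n 200 bigfile.csv > sample.csv). If it has to happen inside the app itself, a small Python sketch with placeholder file names:

        from itertools import islice

        # Copy the first 200 lines of the big file into a small sample file
        # without reading anything beyond them.
        with open("big.csv") as src, open("sample.csv", "w") as dst:
            dst.writelines(islice(src, 200))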

    Read the article

  • Dump to CSV/Postgres memory

    - by alex
    I have a large table (300 million rows) that I would like to dump to a CSV file; I need to do some processing that cannot be done in SQL. Right now I am using SQuirreL SQL as a client, and it apparently does not deal very well with large result sets, at least as far as I can tell from my own (limited) experience. If I run the query on the actual host, will it use less memory? Thanks for any help.
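
    Since the title mentions Postgres: a server-side COPY (or \copy from psql on the host) streams the rows straight to a file, so nothing the size of the result set is ever held in client memory. A sketch using psycopg2, with connection details, table and file names as placeholders:

        import psycopg2

        # Stream the table straight to a CSV file with COPY, so no client-side
        # result set is ever materialised in memory.
        conn = psycopg2.connect("dbname=mydb user=me host=localhost")
        with conn.cursor() as cur, open("big_table.csv", "w") as f:
            cur.copy_expert("COPY (SELECT * FROM big_table) TO STDOUT WITH CSV HEADER", f)
        conn.close()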

    Read the article

  • MySQL upload CSV file as new table

    - by Brian
    I frequently upload CSV files to a MySQL database. It is very convenient to use LOAD DATA LOCAL INFILE to upload the data, but I can't use it to create the table itself. As of now, the best method I have is to use PHP to read the field titles from the first row of the file and then put together a CREATE TABLE query. Is there a more convenient way to do this?
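
    For comparison, the same header-driven approach sketched in Python; the file and table names are placeholders, and the catch-all TEXT column type is an assumption (columns can always be ALTERed afterwards).

        import csv
        import re

        def create_table_sql(csv_path, table_name):
            """Build a CREATE TABLE statement from a CSV header row (all columns TEXT)."""
            with open(csv_path, newline="") as f:
                header = next(csv.reader(f))
            # Sanitise header names into safe column identifiers.
            cols = [re.sub(r"\W+", "_", h).strip("_") or "col" for h in header]
            return "CREATE TABLE `{}` ({})".format(
                table_name, ", ".join("`{}` TEXT".format(c) for c in cols))

        # Prints something like: CREATE TABLE `contacts` (`name` TEXT, `email` TEXT, ...)
        print(create_table_sql("contacts.csv", "contacts"))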

    Read the article

  • converting a csv into text

    - by user349418
    I have a (large) CSV file of IP addresses and wish to convert it into one IP address per line in bash:

        aa.bb.cc.dd,aa.bb.cc.dd,aa.bb.cc.dd,..

    into

        aa.bb.cc.dd
        aa.bb.cc.dd
        aa.bb.cc.dd
        [..]

    The list of IPs in question: http://www.stopforumspam.com/downloads/bannedips.zip
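
    In bash, tr handles this directly (tr ',' '\n' < bannedips.csv > bannedips.txt). For completeness, the same conversion as a short Python sketch with placeholder file names:

        # Stream the file and write one IP address per line.
        with open("bannedips.csv") as src, open("bannedips.txt", "w") as dst:
            for line in src:
                for ip in line.strip().split(","):
                    if ip:
                        dst.write(ip + "\n")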

    Read the article
