Search Results

Search found 1639 results on 66 pages for 'csv'.

Page 54/66 | < Previous Page | 50 51 52 53 54 55 56 57 58 59 60 61  | Next Page >

  • Remove objects from different environments

    - by Fred
    I have a master R script that executes a second R script via:

        source("../scripts/second_file.R")

    That second file has the following lines:

        myfiles <- list.files(".", pattern = "*.csv")
        ...
        rm(myfiles)

    When I run the master R file I get:

        > source("../scripts/second_file.R")
        Error in file.remove(myfiles) : object 'myfiles' not found

    and the program aborts. I think this has something to do with environments. I looked at the ?rm help page, but it was less than illuminating. I figure I have to give rm() a position argument, but I'm not sure which.
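
    A hedged sketch of one way to make the cleanup robust, assuming the object really is created in the environment the script is evaluated in: exists() and rm() both take an explicit envir argument, so the call stops depending on where source() evaluates the file. (Note, too, that list.files() expects a regular expression, so the pattern is better written as "\\.csv$".)

        # a minimal sketch: name the environment explicitly instead of
        # relying on rm()'s default position argument
        myfiles <- list.files(".", pattern = "\\.csv$")
        ...
        if (exists("myfiles", envir = environment(), inherits = FALSE)) {
          rm("myfiles", envir = environment())
        }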

    Read the article

  • How to insert a null value for a numeric field of a DataTable in C#?

    - by Pandiya Chendur
    Consider my dynamically generated DataTable, which contains the fields Id, Name, Mob1, Mob2. If the data looks like this, it gets inserted successfully:

        Id  Name  Mob1        Mob2
        1   acp   9994564564  9568848526

    But when it is like this, it fails:

        Id  Name  Mob1        Mob2
        1   acp   9994564564

    saying: "The given value of type String from the data source cannot be converted to type decimal of the specified target column." I generate my DataTable by reading a CSV file:

        CSVReader reader = new CSVReader(CSVFile.PostedFile.InputStream);
        string[] headers = reader.GetCSVLine();
        DataTable dt = new DataTable();
        foreach (string strHeader in headers)
        {
            dt.Columns.Add(strHeader);
        }
        string[] data;
        while ((data = reader.GetCSVLine()) != null)
        {
            dt.Rows.Add(data);
        }

    Any suggestions on how to insert a null value for a numeric field during the bulk copy in C#?

    EDIT: I tried dt.Columns["Mob2"].AllowDBNull = true; but it doesn't seem to work.
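
    A hedged sketch of one likely fix, reusing the question's own CSVReader loop: an empty string cannot be converted to decimal, but DBNull.Value can be bulk-copied into a nullable numeric column, so map empty CSV fields to DBNull while filling each row.

        // a minimal sketch: empty CSV fields become database nulls
        string[] data;
        while ((data = reader.GetCSVLine()) != null)
        {
            DataRow row = dt.NewRow();
            for (int i = 0; i < data.Length; i++)
            {
                row[i] = string.IsNullOrEmpty(data[i]) ? (object)DBNull.Value : data[i];
            }
            dt.Rows.Add(row);
        }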

    Read the article

  • Rails fixtures seem to be adding extra unexpected data

    - by Mason Jones
    Hello, all. I've got a dynamic fixture CSV file that generates predictable data for a table so my unit tests can do their thing. It works as expected and fills the table with the data, but when I check the table after the tests run, I see a number of additional rows of "blank" data (all zeros, etc.). Those aren't being created by the fixture, and the unit tests are read-only, just doing selects, so I can't blame the code. There doesn't seem to be any logging during fixture setup, so I can't see when the "blank" data is inserted. Has anyone run across this before, or have any ideas on how to log or otherwise trace what the fixture setup is doing, in order to track down the source of the blank data?

    Read the article

  • List of items with same values

    - by user559780
    I'm creating a list of items from a file:

        BufferedReader reader = new BufferedReader(
                new InputStreamReader(new FileInputStream("H:/temp/data.csv")));
        try {
            List<Item> items = new ArrayList<Item>();
            Item item = new Item();
            String line = null;
            while ((line = reader.readLine()) != null) {
                String[] split = line.split(",");
                item.name = split[0];
                item.quantity = Integer.valueOf(split[1]);
                item.price = Double.valueOf(split[2]);
                item.total = item.quantity * item.price;
                items.add(item);
            }
            for (Item item2 : items) {
                System.out.println("Item: " + item2.name);
            }
        } catch (IOException e) {
            reader.close();
            e.printStackTrace();
        }

    The problem is that the list displays the last line of the file as the value of every item.
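
    A hedged sketch of the likely culprit: a single Item is constructed once, outside the loop, and mutated on every iteration, so every slot in the list points at the same object. Constructing the Item inside the loop gives each CSV line its own object.

        while ((line = reader.readLine()) != null) {
            String[] split = line.split(",");
            Item item = new Item();            // a new object per CSV line
            item.name = split[0];
            item.quantity = Integer.valueOf(split[1]);
            item.price = Double.valueOf(split[2]);
            item.total = item.quantity * item.price;
            items.add(item);
        }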

    Read the article

  • In C, is it possible to free only an array's first or last position?

    - by user354959
    Hi there! I have an array, but I don't need its first (or last) position. So I point a new variable at the rest of the array, but then I should free the array's first/last position. For instance:

        p = read_csv_file();
        q = p + 1; // I don't need the first CSV file field
        // Here I'd like to free only the first position of p
        return q;

    Otherwise I have to memcpy the array into another variable, excluding the first position, and then free the original array, like this:

        p = read_csv_file();
        q = (int*) malloc(sizeof(int) * (SOME_SIZE - 1));
        memcpy(q, p + 1, sizeof(int) * (SOME_SIZE - 1));
        free(p);
        return q;

    But then I have the overhead of copying the whole array. Is it possible to free only a single position of an array?
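
    A hedged sketch of the zero-copy alternative: free() must be handed the exact pointer malloc returned, so a single element of the block cannot be released on its own. What can be done is to keep the base pointer around and work through an offset view of it.

        /* a minimal sketch, assuming read_csv_file() returns one malloc'd block */
        int *base = read_csv_file();
        int *view = base + 1;   /* skip the first CSV field, no copy */
        /* ... use view wherever q was used ... */
        free(base);             /* release the whole block exactly once */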

    Read the article

  • How do C or .NET programmers store and load strings in their programs?

    - by Ivan Ivkovic
    I've been doing PHP and related work for the last year; I just got into a bit of C and C++. In the book I'm reading, all the strings are written directly in the code (I realize this is just for the examples, but I'm curious). What interests me is: is there a common way for programmers to store and display strings? Does .NET have a predefined way of doing this, like Android does with its strings file? (In PHP, I keep them all in CSV files, completely separate from the code.)
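
    For the .NET half of the question, a hedged sketch: the usual counterpart to Android's strings.xml is a .resx resource file compiled into the assembly and read through ResourceManager (or the generated Resources class). "MyApp.Strings" and "Greeting" below are hypothetical names.

        using System.Reflection;
        using System.Resources;

        // a minimal sketch: strings live in Strings.resx as key/value pairs
        var rm = new ResourceManager("MyApp.Strings", Assembly.GetExecutingAssembly());
        string greeting = rm.GetString("Greeting");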

    Read the article

  • C#: split a string into runs of characters, numbers and delimited strings and process it

    - by nrkn
    OK, my regex is a bit rusty and I've been struggling with this particular problem. I need to split and process a string containing any number of the following, in any order:

    - chars (lowercase letters only)
    - quote-delimited strings
    - ints

    The strings are pretty weird (I don't have control over them). When there's more than one number in a row, they're separated by commas. The tokens need to be processed in the same order they appeared in the original string. For example, a string might look like:

        abc20a"Hi""OK"100,20b

    With this particular string, the resulting call stack would look a bit like:

        ProcessLetters( new[] { 'a', 'b', 'c' } );
        ProcessInts( 20 );
        ProcessLetters( 'a' );
        ProcessStrings( new[] { "Hi", "OK" } );
        ProcessInts( new[] { 100, 20 } );
        ProcessLetters( 'b' );

    I could treat it a bit like CSV, building tokens by processing the characters one at a time, but I think it could be done more easily with a regex?
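
    A hedged sketch of the regex route, assuming the three token classes above are exhaustive: one alternation with a named group per class, visited in match order. Batching consecutive matches of the same class into a single ProcessX call is a small buffering step around this loop.

        using System.Text.RegularExpressions;

        // a minimal sketch: commas between numbers simply never match
        var pattern = @"""(?<str>[^""]*)""|(?<num>\d+)|(?<ch>[a-z])";
        foreach (Match m in Regex.Matches(input, pattern))
        {
            if (m.Groups["str"].Success)      { /* one quoted string    */ }
            else if (m.Groups["num"].Success) { /* one integer          */ }
            else                              { /* one lowercase letter */ }
        }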

    Read the article

  • Perl: print back to beginning of line

    - by Pmarcoen
    Okay, what I'm trying to do is print a percentage-complete figure to my command line; I would like it to simply update the number shown on the screen, so it somehow has to go back to the beginning of the line and change it. For example, the Windows relog.exe command-line utility (which can convert a .blg file to a .csv file) does this: if you run it, it displays a percentage complete. That one is probably written in C++; I don't know if this is possible in Perl as well?
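
    It is; a hedged sketch: a carriage return ("\r") moves the cursor back to the start of the current line without advancing to the next one, and turning on autoflush makes each overwrite visible immediately.

        # a minimal sketch of an in-place percentage display
        $| = 1;                                # autoflush STDOUT
        for my $pct (0 .. 100) {
            printf "\rProcessing: %3d%%", $pct;
            # ... do one unit of work ...
        }
        print "\n";                            # step past the status line when done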

    Read the article

  • Best tools to create valid XML files from an Excel file

    - by systempuntoout
    I need to create a script that extracts some data from a complex Excel 2003 file (with multiple sheets and different tables inside a single sheet) and produces different XML files that need to be validated against a given XSD file. My preferred language is Python; to create and validate the XML files I would go with lxml. What do you suggest for parsing the XLS files? Is xlrd the right tool for complex Excel files, or do I need to convert all the sheets to CSV manually and read the files line by line, splitting each line to get the data? I accept C# and VB6 suggestions too.
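
    A hedged sketch of the xlrd route: it reads .xls workbooks directly, sheet by sheet and cell by cell, so no CSV round-trip is needed; "complex.xls" below is a hypothetical file name.

        # a minimal sketch: walk every sheet and row of the workbook
        import xlrd

        book = xlrd.open_workbook("complex.xls")
        for sheet in book.sheets():
            for rowx in range(sheet.nrows):
                values = sheet.row_values(rowx)   # list of cell values for this row
                # ... pick out the table cells you need and feed them to lxml ...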

    Read the article

  • WatiN downloading a file from a webpage

    - by user541597
    I have the following code, which performs a couple of page redirections and then clicks an <a> tag that has a JavaScript function as its href; when it is called, a file is downloaded. My problem is that I want to download the file without being prompted to cancel, save or open it. Please help; I am using IE9.

        using (var browser = new IE("http:url.aspx"))
        {
            browser.TextField(Find.ByName("ctl00$ContentPlaceHolder1$Login1$UserName")).TypeText("cpereyra");
            browser.TextField(Find.ByName("ctl00$ContentPlaceHolder1$Login1$Password")).TypeText("Maxipereyra15");
            browser.Button(Find.ByName("ctl00$ContentPlaceHolder1$Login1$LoginButton")).Click();
            browser.GoTo("http://it-motivity-cmc/Movation/MyPage/MyDashboard.aspx?dynamicdashboardid=ab000000-7dea-11c9-b596-d01e04bebb94");
            while (browser.Eval("document.readyState") != "complete")
            {
                Thread.Sleep(1000);
            }
            Div div = browser.Div("ctl00_ContentPlaceHolder1_wrapper_vis_zone1_1");
            div.Link(link => link.Text == "Export to CSV").Click();
        }

    Read the article

  • What are the best practices for importing large datasets into MongoDB?

    - by snl
    We are just giving MongoDB a test run and have set up a Rails 3 app with Mongoid. What are the best practices for inserting large datasets into MongoDB? To flesh out the scenario: say I have a Book model and want to import several million records from a CSV file. I suppose this needs to be done in the console, so this may not be a Ruby-specific question. Edited to add: I assume it makes a huge difference whether the imported data includes associations or goes into one model only. Any comments on either scenario are welcome.
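
    A hedged sketch of the console route, assuming the CSV maps onto a single collection and has a header row: mongoimport ships with MongoDB and bypasses Ruby entirely (data with associations would still need an app-level import). Database and file names below are placeholders.

        mongoimport --db myapp_development --collection books \
                    --type csv --headerline --file books.csv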

    Read the article

  • Parse file to hash in Ruby

    - by Taschetto
    I'm a Ruby newcomer trying to read a text file (a Valgrind simulation output) like this:

        --------------------------------------------------------------------------------
        Profile data file 'temp/gt_1024_2_16.out'
        --------------------------------------------------------------------------------
        I1 cache:         1024 B, 16 B, 2-way associative
        D1 cache:         32768 B, 64 B, 8-way associative
        LL cache:         3145728 B, 64 B, 12-way associative
        Profiled target:  bash run.sh
        Events recorded:  Ir I1mr ILmr Dr D1mr DLmr Dw D1mw DLmw
        Events shown:     Ir I1mr ILmr Dr D1mr DLmr Dw D1mw DLmw
        Event sort order: Ir I1mr ILmr Dr D1mr DLmr Dw D1mw DLmw
        Thresholds:       99 0 0 0 0 0 0 0 0
        Include dirs:
        User annotated:
        Auto-annotation:  off
        --------------------------------------------------------------------------------
               Ir    I1mr  ILmr      Dr  D1mr  DLmr      Dw  D1mw  DLmw
        --------------------------------------------------------------------------------
        1,894,017 246,981 2,448 519,124 4,691 2,792 337,817 1,846 1,672  PROGRAM TOTALS

        // other data

    I want to extract the PROGRAM TOTALS line and put it into a hash, something like:

        myHash = { :Ir => 1894017, :I1mr => 246981, :ILmr => 2448, ..., :DLmw => 1672 }

    What are the best options for doing this? Could the CSV classes help me out? Thanks a bunch.
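
    A hedged sketch without CSV, assuming the totals line is the only one containing "PROGRAM TOTALS": a scan for comma-grouped numbers is enough, since the header order is fixed.

        # a minimal sketch: zip the fixed event names with the scanned numbers
        fields = %w[Ir I1mr ILmr Dr D1mr DLmr Dw D1mw DLmw].map(&:to_sym)
        totals = File.readlines("temp/gt_1024_2_16.out").find { |l| l =~ /PROGRAM TOTALS/ }
        numbers = totals.scan(/[\d,]+/).map { |n| n.delete(",").to_i }
        my_hash = Hash[fields.zip(numbers)]
        # => {:Ir=>1894017, :I1mr=>246981, ..., :DLmw=>1672}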

    Read the article

  • Extracting Data Daily from MySQL to a Local MySQL DB

    - by Sunny Juneja
    I'm doing some experiments locally that require some data from a production MySQL DB to which I only have read access. The schemas are nearly identical, with the exception of one omitted column. My goal is to write a script I can run every day that extracts the previous day's data and imports it into my local table. The part I'm most confused about is how to download the data. I've seen names like mysqldump tossed around, but that seems to be a way to replicate the entire database. I would love to avoid using PHP, since I have no experience with it. I've been creating CSVs, but I'm worried about data integrity (what if a field contains a comma or a \n?) as well as the size of the CSV (there are several hundred thousand rows per day).
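
    A hedged sketch of a narrower mysqldump, assuming a created_at-style timestamp column exists (host, table and column names below are placeholders): --where restricts the dump to one day of one table, and SQL-format output sidesteps the CSV escaping worries. Since the local schema omits a column, importing into a staging table and then selecting into the real one avoids column-count mismatches.

        # a minimal sketch: dump yesterday's rows only, data only
        mysqldump -h prod-host -u readonly_user -p \
          --no-create-info --skip-lock-tables \
          --where="created_at >= CURDATE() - INTERVAL 1 DAY AND created_at < CURDATE()" \
          production_db events > yesterday.sql

        mysql -u local_user -p local_db < yesterday.sql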

    Read the article

  • Storing an arbitrary R object on the HDD?

    - by Harokitty
    I understand that we can export data matrices to CSV or XLSX files, but what about complex objects like lm fits? For example, in my work I might have a list of length 1000, each element holding a single lm() object. Each time I start R I have to wait a long time while a for loop or lapply repopulates that 1000-element list with the lm objects. I would rather just save the list somewhere on my HDD at the end of a session and open it at the start of the next session.
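
    A hedged sketch: saveRDS() serializes any single R object, a list of lm fits included, and readRDS() restores it; 'fits' below stands for the 1000-element list.

        # a minimal sketch: persist the whole list between sessions
        saveRDS(fits, file = "fits.rds")   # end of session
        fits <- readRDS("fits.rds")        # start of the next session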

    Read the article

  • How to dump a JS array... (bookmarklet?)

    - by Soulhuntre
    A page on a site I use is holding some of my data hostage. Once I have logged into the site and navigated to the right page, the data I need is in the array eeData[]; it is 720 elements long (one reading every 2 minutes of a given day). Rather than simulate the requests to the underlying JSON supplier, and since it's only once a day, I am happy to simply develop a bookmarklet to grab the data, preferably as an XML or CSV file. Any pointers to sample code or hints would help. I found a bookmarklet here that is based on this script and does part of this, but I am not up to speed on whether JS file I/O can induce a file "download" of the data, or pop it open in a new window I can copy/paste from.
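
    A hedged sketch of the copy/paste variant, assuming eeData holds plain values: browser JS can't freely write local files, but a new window is easy to fill and copy from.

        // a minimal sketch of a bookmarklet body: one element per line
        javascript:(function () {
          var w = window.open("", "_blank");
          w.document.write("<pre>" + eeData.join("\n") + "</pre>");
          w.document.close();
        })();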

    Read the article

  • Exporting ActiveRecord objects into POROs

    - by Lucas d. Prim
    Hello, I'm developing a "script generator" to automate some processes at work. It has a Rails application running on a server that stores all the data needed to build the script and generates the script itself at the end of the process. The problem I'm having is how to export the data from the ActiveRecord format to Plain Old Ruby Objects (POROs), so I can deal with them in my script with no database support and a pure-Ruby implementation. I thought about YAML, CSV or something like that to export the data, but it would be a painful process to update those structures if the process changes. Is there a simpler way? Ty!
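
    A hedged sketch of one direct route, with hypothetical model and attribute names: copy each record's attributes into a Struct-based PORO, so the generated script carries plain Ruby objects and no ActiveRecord.

        # a minimal sketch: Step is the PORO; ScriptStep is the AR model
        Step = Struct.new(:name, :command, :position)

        poros = ScriptStep.all.map do |rec|
          Step.new(rec.name, rec.command, rec.position)
        end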

    Read the article

  • Transform LINQ Dataset into a Matrix for export

    - by Mad Halfling
    Hi folks, I've got a data table whose columns include Item, Category and Value (among others, but those are the only ones relevant to this problem), which I access via LINQ in a C# ASP.NET MVC app. I want to transform it into a matrix and output that as a CSV file to pull into Excel: items down the side, categories across the top, values in the cells. However, I don't know in advance how many categories there will be, or what they are, and there won't always be a record for every item/category combination. I've written this by looping round: get the "master category" list, then look again for each item, filling in either a blank or the Value depending on whether the item/category record exists. But with 27,000 records currently in the table, this isn't as fast as I'd like. Is there a slicker, faster way, maybe via LINQ (compiling to a quicker SQL statement so the DB server does the leg-work), or will any method essentially come back to what I'm doing? Thx MH
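
    A hedged sketch of the in-memory pivot, assuming one record per (Item, Category) pair and string values ('rows' stands for the LINQ result set): two passes build a lookup and then the CSV, instead of one search per cell.

        // a minimal sketch: dictionary lookup replaces the per-cell search
        var categories = rows.Select(r => r.Category).Distinct().OrderBy(c => c).ToList();
        var cells = rows.ToDictionary(r => Tuple.Create(r.Item, r.Category), r => r.Value);

        var sb = new StringBuilder();
        sb.AppendLine("Item," + string.Join(",", categories));
        foreach (var item in rows.Select(r => r.Item).Distinct().OrderBy(i => i))
        {
            var values = categories.Select(c =>
                cells.ContainsKey(Tuple.Create(item, c)) ? cells[Tuple.Create(item, c)] : "");
            sb.AppendLine(item + "," + string.Join(",", values));
        }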

    Read the article

  • In which document do file specifications belong?

    - by Andrew
    In which document would a file specification belong? Perhaps the file is used as an input to a third-party system. Should the specification go in its own document, or is it better placed in the functional or design spec, or somewhere else? By file specification I mean a description of the file's format (CSV, fixed width, etc.), its columns, data types, and so on. Also, where should you document how the file is generated, i.e. the business rules/algorithms used to produce it?

    Read the article

  • What is a good way to code a file processing program that accepts multi-source data in Java?

    - by jjepsuomi
    I'm building a data processing system which currently uses CSV data as its input and output format. In the future I might want to add support for input and output in database, XML, and other forms. How should I design my program so that it is easy to add support for new types of data source? Should I simply make, say, an abstract data class (which would contain the basic file processing methods) and then inherit from that class for the database, XML, etc. cases? I hope my question is clear =) In other words: how do I design a file processing system that can easily be extended to accept input data from different sources (database, XML, Excel, etc.)?
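
    A hedged sketch of the usual shape, with hypothetical names: the processing code depends on a small interface, and each source is one implementation; an abstract base class is only needed once the sources share real logic.

        import java.io.IOException;
        import java.nio.file.*;
        import java.util.*;

        // a minimal sketch: one interface, one implementation per source
        interface RecordSource {
            List<String[]> readRecords() throws IOException;
        }

        class CsvRecordSource implements RecordSource {
            private final Path file;

            CsvRecordSource(Path file) { this.file = file; }

            public List<String[]> readRecords() throws IOException {
                List<String[]> records = new ArrayList<>();
                for (String line : Files.readAllLines(file)) {
                    records.add(line.split(","));   // naive split; real CSV needs quoting rules
                }
                return records;
            }
        }
        // XmlRecordSource, DatabaseRecordSource, ExcelRecordSource plug in the same way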

    Read the article

  • Which is the best and most appropriate way to structure the code in WinForms?

    - by Harikrishna
    What is the best way to organize code? (1) Write the code directly in the button_Click() event handler, or (2) put the code from button_Click() into a method on a separate class and call that method from the button_Click() handler; is that what's called a three-tiered approach? For example, in a button_Click() event I have code that saves records from a DataTable to a CSV file. Should I write that code in the button_Click() handler itself, or create a new method in a new class and call that method from button_Click()? This is only one example, but I'm asking about all the code in my application: which is the appropriate, best way to structure it, and what are the benefits? Note that I write WinForms code in C#.
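
    A hedged sketch of option (2), with hypothetical names throughout: the click handler only wires the UI to the logic, while the CSV export lives in a plain class that can be reused and tested on its own.

        using System.Data;
        using System.IO;
        using System.Linq;

        // a minimal sketch: the export logic knows nothing about the form
        public class CsvExporter
        {
            public void Export(DataTable table, string path)
            {
                var lines = table.Rows.Cast<DataRow>()
                                      .Select(row => string.Join(",", row.ItemArray));
                File.WriteAllLines(path, lines);
            }
        }

        // inside the form, the handler becomes one line:
        private void btnSave_Click(object sender, EventArgs e)
        {
            new CsvExporter().Export(myDataTable, @"C:\export\records.csv");
        }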

    Read the article

  • preg_match_all confusingly failing

    - by James
        $string = (string) file_get_contents($_FILES['file']['tmp_name']);
        echo $string;
        preg_match_all("/[\._a-zA-Z0-9-]+@[\._a-zA-Z0-9-]+/i", $string, $matches);
        print_r($matches);

    I am parsing text/CSV files and grabbing email addresses from uploaded files. It weirdly fails when parsing a Google Contacts file I exported. But when I simply copy the string that is echoed and paste it in place of the file_get_contents result, it parses and works. Any idea why it refuses to take the file_get_contents string, but works if I paste in the raw data myself?
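
    A hedged guess at the cause: Google's contact exports have often been UTF-16 with a BOM, and the NUL bytes between characters defeat an ASCII-oriented regex even though the echoed output looks fine in a browser; pasting the text back in re-enters it as UTF-8. A minimal sketch, with the source encoding as an assumption to verify:

        $raw = file_get_contents($_FILES['file']['tmp_name']);
        $string = mb_convert_encoding($raw, 'UTF-8', 'UTF-16');  // assumed source encoding
        preg_match_all("/[\._a-zA-Z0-9-]+@[\._a-zA-Z0-9-]+/i", $string, $matches);
        print_r($matches);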

    Read the article

  • How can I do a 'where' clause in Linux shell?

    - by Hoa
    I have a CSV file and I would like to filter it down to the lines where the 19th column has two or more characters. I know the individual pieces but can't figure out how to glue them together. First I have to cat the file. The following prints the 19th column:

        awk -F "," '{print $19}' file.txt

    awk also has a length function and if statements, and I know it all has to be glued together with pipes. I'm just stuck on the exact syntax, since I haven't done much bash programming before.
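
    A hedged one-liner, assuming no quoted fields containing commas: in awk the pattern plays the role of the where clause, and the default action prints the whole matching line, so no cat or pipes are needed.

        awk -F, 'length($19) >= 2' file.txt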

    Read the article

  • How to look for different types of files in a directory?

    - by herrow
        public List<string> MapMyFiles()
        {
            List<FileInfo> batchaddresses = new List<FileInfo>();
            foreach (object o in lstViewAddresses.Items)
            {
                try
                {
                    string[] files = Directory.GetFiles(o.ToString(), "*-E.esy");
                    files.ToList().ForEach(f => batchaddresses.Add(new FileInfo(f)));
                }
                catch
                {
                    if (MessageBox.Show(o.ToString() + " does not exist. Process anyway?",
                                        "Continue?", MessageBoxButtons.YesNo) == DialogResult.Yes)
                    {
                    }
                    else
                    {
                        Application.Exit();
                    }
                }
            }
            return batchaddresses.OrderBy(f => f.CreationTime)
                                 .Select(f => f.FullName).ToList();
        }

    I would like to add to the list not only the .esy files but also the "p-.csv" files. How do I do this?
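
    A hedged sketch: Directory.GetFiles accepts only a single pattern, so run it once per pattern and pool the results ("p-.csv" is kept as written in the question; add a wildcard if one was intended).

        // a minimal sketch: one GetFiles call per pattern
        string[] patterns = { "*-E.esy", "p-.csv" };
        var files = patterns.SelectMany(p => Directory.GetFiles(o.ToString(), p));
        foreach (string f in files)
        {
            batchaddresses.Add(new FileInfo(f));
        }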

    Read the article

  • Python sqlite3 and concurrency

    - by RexE
    I have a Python program that uses the threading module. Once every second, my program starts a new thread that fetches some data from the web and stores it on my hard drive. I would like to use sqlite3 to store these results, but I can't get it to work. The issue seems to be with the following line:

        conn = sqlite3.connect("mydatabase.db")

    If I put this line inside each thread, I get an OperationalError telling me that the database file is locked. I guess this means another thread has mydatabase.db open through a sqlite3 connection and has locked it. If I put this line in the main program and pass the connection object (conn) to each thread, I get a ProgrammingError saying that SQLite objects created in a thread can only be used in that same thread. Previously I stored all my results in CSV files and had none of these file-locking issues. Hopefully this will be possible with sqlite. Any ideas?
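
    A hedged sketch of one common pattern, with a hypothetical table schema: give a single thread ownership of the connection and have the fetch threads hand rows to it through a queue, so neither error can arise.

        import queue
        import sqlite3
        import threading

        write_q = queue.Queue()

        def writer():
            conn = sqlite3.connect("mydatabase.db")   # created and used in this thread only
            while True:
                row = write_q.get()
                if row is None:                       # sentinel: shut the writer down
                    break
                conn.execute("INSERT INTO results VALUES (?, ?)", row)  # hypothetical schema
                conn.commit()
            conn.close()

        threading.Thread(target=writer, daemon=True).start()
        # each fetch thread just does: write_q.put((url, payload))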

    Read the article

  • SharePoint 2010 - Managed Keywords

    - by Audioillity
    Hi, is it possible to import managed keywords into SharePoint 2010? Where are the keywords stored, and in which database? Background: I'm currently working on a migration from a legacy system to SharePoint 2010. So far everything is going well, and I can even bring the managed metadata across along with most other data. The process I use was built for SharePoint 2007 to update lists over SOAP; with a few manual tweaks I've managed to get the metadata to come across. To bring across either managed metadata or managed keywords I need to know the ID of the existing label/keyword. I have this for the managed metadata, but not for the managed keywords. Currently I create a CSV file to be imported for managed metadata before working out the relevant GUID for the source label. Many thanks, Luke
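
    A hedged sketch using the SharePoint 2010 taxonomy object model, with a hypothetical site URL: managed keywords live in the Managed Metadata Service term store as the special, flat Keywords term set, and terms created there carry the GUIDs the import needs.

        using (SPSite site = new SPSite("http://sharepoint"))
        {
            TaxonomySession session = new TaxonomySession(site);
            TermStore store = session.DefaultKeywordsTermStore;
            Term keyword = store.KeywordsTermSet.CreateTerm("legacy keyword", 1033);
            store.CommitAll();
            Guid id = keyword.Id;   // the ID to use in the SOAP update
        }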

    Read the article
