Search Results

Search found 1639 results on 66 pages for 'csv'.

Page 36/66

  • write.table in R misaligns the header when the matrix has row names

    - by Yannick Wurm
    Hello, check this example: > a = matrix(1:9, nrow = 3, ncol = 3, dimnames = list(LETTERS[1:3], LETTERS[1:3])) > a A B C A 1 4 7 B 2 5 8 C 3 6 9 The table displays correctly. There are two different ways of writing it to a file... write.csv(a, 'a.csv') gives the expected output: "","A","B","C" "A",1,4,7 "B",2,5,8 "C",3,6,9 while write.table(a, 'a.txt') misaligns the header: "A" "B" "C" "A" 1 4 7 "B" 2 5 8 "C" 3 6 9 Indeed, the leading empty field is missing, which is a pain for downstream tools. Is this a bug or a feature? Is there a workaround (other than write.table(cbind(rownames(a), a), 'a.txt', row.names=FALSE))? Cheers, yannick

    Read the article

  • Is there a work-around that allows missing data to equal NULL for LOAD DATA INFILE in MySQL?

    - by richardh
    I have a lot of large CSV files with NULL values stored as ,, (i.e., no entry). After a lot of searching I found that this is a known "bug", although it may be a feature for some users. Is there a way I can fix this on the fly without pre-processing? These data are all numeric, so a zero value is very different from NULL. And if I do have to pre-process, which approach is most promising for dealing with tens of CSV files of 100 MB to 1 GB each? Thanks!

    Read the article

  • I need to rename a file in VBA, but I'm getting "File Not Found" when it's clearly there!

    - by Karl
    Help! I'm getting a "File Not Found" error when trying to rename a file with a variable. The variable is a String. I can inspect the variable and it holds the exact filename that is there, but when I run the code, it says the file is not found! Dim filePath, fileName, absPath, newPath As String filePath = "P:\Automated\" fileName = MySite.GetResult absPath = filePath & fileName newPath = "P:\Automated\NEW.csv" 'The following is a rename from the CuteFTP Pro COM object: '(Getting the same result from this and the "Name" statement below.) 'MySite.LocalRename "P:\Automated\" & fileName, "P:\Automated\NEW.csv" Name absPath As newPath

    Read the article

  • Streaming data to the browser as a file of unknown size

    - by Sir Psycho
    I have some data which is queried from the database, and I'd like to send it to the client as a CSV file. The file size varies each time because the DB data returned can be of any size. Instead of saving this file to disk, I'd like to send it to the browser while it's being processed into CSV by my algorithm. Response.Write seems useless. For some reason, the file download dialog is only displayed once my processing is finished. This seems odd as I'm writing all my output to the Response.Output stream. I have downloaded files on the web before where the file size is not known and the browser just keeps on downloading. Is there any way to achieve this? The following Stack Overflow thread did not offer any good advice. http://stackoverflow.com/questions/873995/asp-net-downloading-large-files-of-unknown-size Thanks
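
    A minimal sketch of one common approach, assuming an ASP.NET Web Forms page: turn off response buffering and flush after each row so the browser shows the download dialog while the CSV is still being generated. The ExportPage class and csvLines parameter are illustrative, not code from the question.

      using System.Collections.Generic;
      using System.Web.UI;

      public partial class ExportPage : Page
      {
          // Streams already-formatted CSV rows to the client as they are produced.
          protected void ExportCsv(IEnumerable<string> csvLines)
          {
              Response.Clear();
              Response.BufferOutput = false;   // send bytes as they are written instead of buffering the whole file
              Response.ContentType = "text/csv";
              Response.AddHeader("Content-Disposition", "attachment; filename=export.csv");

              foreach (string line in csvLines)
              {
                  Response.Output.WriteLine(line);
                  Response.Flush();            // push this chunk so the download starts before processing finishes
              }
              Response.End();
          }
      }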

    Read the article

  • Can I use Linq to project a new typed datarow?

    - by itchi
    I currently have a csv file that I'm parsing with an example from here: http://alexreg.wordpress.com/2009/05/03/strongly-typed-csv-reader-in-c/ I then want to loop the records and insert them using a typed dataset xsd to an Oracle database. It's not that difficult, something like: foreach (var csvItem in csvfile) { DataSet.MYTABLEDataTable DT = new DataSet.MYTABLEDataTable(); DataSet.MYTABLERow row = DT.NewMYTABLERow(); row.FIELD1 = csvItem.FIELD1; row.FIELD2 = csvItem.FIELD2; } I was wondering how I would do something with LINQ projection: var test = from csvItem in csvfile select new MYTABLERow { FIELD1 = csvItem.FIELD1, FIELD2 = csvItem.FIELD2 } But I don't think I can create datarows like this without the use of a rowbuilder, or maybe a better constructor for the datarow?
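
    A hedged sketch of one way to keep the LINQ projection: a typed row can only be created by its table, so the Select calls the table's generated NewMYTABLERow factory instead of a constructor. The DataSet.MYTABLE* and FIELD names are from the question; csvfile is assumed to be an IEnumerable of the CSV record type.

      // Assumes: DataSet.MYTABLEDataTable is the typed table generated from the question's XSD,
      // and csvfile enumerates records exposing FIELD1 and FIELD2 (requires System.Linq).
      var table = new DataSet.MYTABLEDataTable();

      var rows = csvfile.Select(csvItem =>
      {
          var row = table.NewMYTABLERow();   // typed rows must be created by their table
          row.FIELD1 = csvItem.FIELD1;
          row.FIELD2 = csvItem.FIELD2;
          return row;
      });

      foreach (var row in rows)
      {
          table.AddMYTABLERow(row);          // attach each projected row before the Oracle insert
      }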

    Read the article

  • Extract dates from filename

    - by Newbie
    I have a situation where I need to extract dates from file names whose general pattern is [filename_]YYYYMMDD[.fileExtension], e.g. "xxx_20100326.xls" or x2v_20100326.csv. The program below does the job: //Number of characters in the substring is set to 8 //since the length of YYYYMMDD is 8 public static string ExtractDatesFromFileNames(string fileName) { return fileName.Substring(fileName.IndexOf("_") + 1, 8); } Is there a better way to achieve the same thing? I am basically looking for standard practice. I am using C# 3.0 and .NET Framework 3.5. Edit: I like LC's solution and way of answering. I have used his program like this: string regExPattern = "^(?:.*_)?([0-9]{4})([0-9]{2})([0-9]{2})(?:\\..*)?$"; string result = Regex.Match(fileName, @regExPattern).Groups[1].Value; The input to the function is "x2v_20100326.csv", but the output is 2010 instead of 20100326 (which is the expected result). Can anyone please help?
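
    The posted pattern has three capture groups (year, month, day), which is why Groups[1] returns only "2010". A hedged sketch that captures the whole eight-digit date in a single group instead:

      using System;
      using System.Text.RegularExpressions;

      class DateFromFileName
      {
          static string ExtractDate(string fileName)
          {
              // one capture group for the whole YYYYMMDD block
              Match m = Regex.Match(fileName, @"^(?:.*_)?(\d{8})(?:\..*)?$");
              return m.Success ? m.Groups[1].Value : string.Empty;
          }

          static void Main()
          {
              Console.WriteLine(ExtractDate("x2v_20100326.csv"));   // prints 20100326
          }
      }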

    Read the article

  • R problems using rpart with 4000 records and 13 attributes

    - by josh
    I have attempted to email the author of this package without success; just wondering if anybody else has experienced this. I am having an issue using rpart on 4000 rows of data with 13 attributes. I can run the same test on 300 rows of the same data with no issue. When I run on 4000 rows, Rgui.exe runs consistently at 50% CPU and the UI hangs... it will stay like this for at least 4-5 hours if I let it run, and never exits or becomes responsive. Here is the code I am using on both the 300- and 4000-row subsets: train<-read.csv("input.csv",header=T) y<-train[,18] x<-train[,3:17] library(rpart) fit<-rpart(y~.,x) Is this a known limitation of rpart? Am I doing something wrong? Potential workarounds? Any assistance appreciated.

    Read the article

  • Anyone using a web service as a data source in Excel 2007?

    - by Scott
    Can I use a web service as a data source for Excel pivot tables? Currently, the source data for the pivot table is exported from the DB to a CSV file. Then the CSV file is loaded into a worksheet in the workbook. From there, a pivot table is created in the same workbook. We are looking to streamline this process. The SQL DB and pivot tables are the constants. The pivot tables are generated dynamically from a public-facing website. This is not an internal app, so the preference is not to connect directly to the DB.

    Read the article

  • ob_start() -> ob_flush() doesn't work

    - by MB34
    I am using ob_start()/ob_flush() to, hopefully, give me some progress during a long import operation. Here is a simple outline of what I'm doing: <?php ob_start (); echo "Connecting to download Inventory file.<br>"; $conn = ftp_connect($ftp_site) or die("Could not connect"); echo "Logging into site download Inventory file.<br>"; ftp_login($conn,$ftp_username,$ftp_password) or die("Bad login credentials for ". $ftp_site); echo "Changing directory on download Inventory file.<br>"; ftp_chdir($conn,"INV") or die("could not change directory to INV"); // connection, local, remote, type, resume $localname = "INV"."_".date("m")."_".date('d').".csv"; echo "Downloading Inventory file to:".$localname."<br>"; ob_flush(); flush(); sleep(5); if (ftp_get($conn,$localname,"INV.csv",FTP_ASCII)) { echo "New Inventory File Downloaded<br>"; $datapath = $localname; ftp_close($conn); } else { ftp_close($conn); die("There was a problem downloading the Inventory file."); } ob_flush(); flush(); sleep(5); $csvfile = fopen($datapath, "r"); // open csv file $x = 1; // skip the header line $line = fgetcsv($csvfile); $y = (feof($csvfile) ? 2 : 5); while ((!$debug) ? (!feof($csvfile)) : $x <= $y) { $x++; $line = fgetcsv($csvfile); // do a lot of import stuff here with $line ob_flush(); flush(); sleep(1); } fclose($csvfile); // important: close the file ob_end_clean(); However, nothing is being output to the screen at all. I know the data file is getting downloaded because I watch the directory where it is being placed. I also know that the import is happening, meaning that it is in the while loop, because I can monitor the DB and records are being inserted. Any ideas as to why I am not getting output to the screen?

    Read the article

  • Design of a function: underlying structure to store a list of results for printing to a file

    - by forest.peterson
    Is this a good approach for printing a list of items to a CSV file with a sublist attached to each item? The gist of the function is that when an item is found that does not exactly match, a list of close matches is generated - this currently works, writing out one list at a time to a command window. For export to a CSV file I think all the lists must be generated, stored, and then written at once. Right now I use a struct to store the attributes printed; each struct is an item on the list - these structs are then added to a sorted stack, and when printed they pop off into the write-out. Is a stack of stacks of structs a good design?
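
    A hedged sketch of one simpler alternative, assuming C# (the question does not name a language): each result carries its own list of close matches, and a single pass writes everything to the CSV. All names are illustrative.

      using System.Collections.Generic;
      using System.IO;

      class MatchResult
      {
          public string Item;                                     // the item that was looked up
          public List<string> CloseMatches = new List<string>();  // sublist shown under the item
      }

      class ResultWriter
      {
          public static void WriteCsv(IEnumerable<MatchResult> results, string path)
          {
              using (var writer = new StreamWriter(path))
              {
                  foreach (var result in results)
                  {
                      writer.WriteLine(result.Item);
                      foreach (var match in result.CloseMatches)
                      {
                          writer.WriteLine("," + match);          // put close matches in the second column
                      }
                  }
              }
          }
      }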

    Read the article

  • How to compare DateTime Objects while looping through a list?

    - by Taniq
    I'm trying to loop through a list (csv) containing two fields; a name and a date. There are various duplicated names and various dates in the list. I'm trying to deduce for each name in the list, where there are multiple instances of the same name, which corresponding date is the latest. I realise, from looking at another answer, that I need to use the DateTime.Compare method which is fine, but my problem is working out which date is later. Once I know this I need to produce a file with unique names and the latest date relating to it. This is my first question which makes me a newbie. EDIT: Initially I thought it would be 'ok' to set the LatestDate object to a date that wouldn't show up in my file, therefore making any later dates in the file the LatestDate. Here's my coding so far: using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.IO; namespace flybe_overwriter { class Program { static DateTime currentDate; static DateTime latestDate = new DateTime(1000,1,1); static HashSet<string> uniqueNames = new HashSet<string>(); static string indexpath = @"e:\flybe test\indexing.csv"; static string[] indexlist = File.ReadAllLines(indexpath); static StreamWriter outputfile = new StreamWriter(@"e:\flybe test\match.csv"); static void Main(string[] args) { foreach (string entry in indexlist) { uniqueNames.Add(entry.Split(',')[0]); } HashSet<string>.Enumerator fenum = new HashSet<string>.Enumerator(); fenum = uniqueNames.GetEnumerator(); while (fenum.MoveNext()) { string currentName = fenum.Current; foreach (string line in indexlist) { currentDate = new DateTime(Convert.ToInt32(line.Split(',')[1].Substring(4, 4)), Convert.ToInt32(line.Split(',')[1].Substring(2, 2)), Convert.ToInt32(line.Split(',')[1].Substring(0, 2))); if (currentName == line.Split(',')[0]) { if(DateTime.Compare(latestDate.Date, currentDate.Date) < 1) { // Console.WriteLine(currentName + " " + latestDate.ToShortDateString() + " is earlier than " + currentDate.ToShortDateString()); } else if (DateTime.Compare(latestDate.Date, currentDate.Date) > 1) { // Console.WriteLine(currentName + " " + latestDate.ToShortDateString() + " is later than " + currentDate.ToShortDateString()); } else if (DateTime.Compare(latestDate.Date, currentDate.Date) == 0) { // Console.WriteLine(currentName + " " + latestDate.ToShortDateString() + " is the same as " + currentDate.ToShortDateString()); } } } } } } } Any help appreciated. Thanks.
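
    Note that DateTime.Compare returns a value less than, equal to, or greater than zero, so tests against 1 can miss cases. A hedged sketch of an alternative that groups the rows by name and keeps the latest date per name; the file paths and the ddMMyyyy layout are taken from the posted code, everything else is illustrative:

      using System;
      using System.Globalization;
      using System.IO;
      using System.Linq;

      class LatestDatePerName
      {
          static void Main()
          {
              var latestByName = File.ReadAllLines(@"e:\flybe test\indexing.csv")
                  .Select(line => line.Split(','))
                  .Select(parts => new
                  {
                      Name = parts[0],
                      Date = DateTime.ParseExact(parts[1], "ddMMyyyy", CultureInfo.InvariantCulture)
                  })
                  .GroupBy(row => row.Name)
                  .Select(group => new { Name = group.Key, Latest = group.Max(row => row.Date) });

              using (var output = new StreamWriter(@"e:\flybe test\match.csv"))
              {
                  foreach (var entry in latestByName)
                  {
                      // one line per unique name with its latest date
                      output.WriteLine("{0},{1:ddMMyyyy}", entry.Name, entry.Latest);
                  }
              }
          }
      }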

    Read the article

  • MySQL to XLS sheet generation problem (getting HTML code along with records, unable to get column names)

    - by pmms
    <?php if($_POST['Submit']=='Generatexml') { $tblname=$_GET['genratexml']; //mysql_connect("localhost","root",""); //mysql_select_db("hitnrunf_db"); global $obj_mysql; $result = mysql_query("SELECT * FROM tbl_js_login"); while($row = mysql_fetch_array($result)) { $csv_output .= "$row[fld_id],$row[fld_fname],$row[fld_lname]"; $csv_output .="\015\012"; } header("Content-type: application/vnd.ms-excel"); header("Content-disposition: csv; filename= Student_Data_". date("Y-m-d") . ".csv"); print $csv_output; exit; } include_once $path."includes/jobseeker_form.php"; ?> With the above we are getting HTML code in the output along with the id, firstname, and lastname columns, and we are not getting the column headings either. How do we remove the HTML code from the XLS file and also get the headers?

    Read the article

  • list(of byte) to Picturebox

    - by michael
    I have a JPEG file that is being held as a List(Of Byte). Currently I have code that I can use to load and save the JPEG file as either a binary (.jpeg) or a CSV of bytes (asadsda.csv). I would like to be able to take the List(Of Byte) and convert it directly to a PictureBox without saving it to disk and then loading it into the PictureBox. If you are curious, the reason I get the picture file as a list of bytes is that it gets transferred over serial via an industrial byte-oriented protocol as just a bunch of bytes. I am using VB.NET, but a C# example is fine too.
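
    A hedged C# sketch (the question says a C# example is fine): decode the JPEG from a MemoryStream and copy it into a Bitmap so the stream can be closed safely, with no temporary file on disk. The names are illustrative.

      using System.Collections.Generic;
      using System.Drawing;
      using System.IO;
      using System.Windows.Forms;

      static class PictureBoxLoader
      {
          public static void ShowJpeg(PictureBox pictureBox, List<byte> jpegBytes)
          {
              using (var stream = new MemoryStream(jpegBytes.ToArray()))
              using (var decoded = Image.FromStream(stream))
              {
                  pictureBox.Image = new Bitmap(decoded);   // copy so the stream can be disposed safely
              }
          }
      }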

    Read the article

  • Setting up scripts in Amazon EC2 Cloud

    - by racket99
    Hello, I am currently running a few Perl and Python scripts on a Windows PC and would like to port them over to the Amazon EC2 servers running 64-bit Linux. The scripts are basic web scrapers that go to a variety of websites, get data, and then save it daily as CSV files. I would like to install these in the cloud and get them running in an automated way so that they will run without my intervention. Also, given that I don't want to lose all the data if the instance crashes, I should also upload the CSV files to Amazon S3. Any idea how I can do this? I am not terribly versed in Linux, nor do I know Perl/Python well. What is the best way for me to tackle this?

    Read the article

  • working with files on "start without debugging"

    - by user1472066
    I'm programming in C, and I have the following problem: I use fopen and try to read from a CSV file that is currently stored in the folder of the program's exe file. The program works fine in debug mode and release mode, but when I try to run it with "start without debugging" in Visual Studio 2008 Express Edition, the program stops working and Windows shows a message: "*.exe has stopped working. A problem caused the program to stop working correctly. Windows will close the program and notify you if a solution is available". I've tried running the program on several computers, and it's the same. Another piece of information I can give you is that if I enter the full path of the file (C:....file.csv), then it works just fine, without any problem. I know I didn't include any code, but I hope someone will have an idea why this can happen. Thanks in advance.

    Read the article

  • MVC4 Web API file download with jQuery

    - by Ray
    I want to download a file on the client side from a Web API ApiController: public HttpResponseMessage PostOfficeSupplies() { string csv = string.Format ("D:\\Others\\Images/file.png"); HttpResponseMessage result = new HttpResponseMessage(HttpStatusCode.OK); result.Content = new StringContent(csv); result.Content.Headers.ContentType = new MediaTypeHeaderValue ("application/octet-stream"); result.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment"); result.Content.Headers.ContentDisposition.FileName = "file.png"; return result; } 1. How can I pop up a download with jQuery (octet-stream) from the API controller? My client-side code: $(document).ready(function () { $.ajax( { url: 'api/MyAPI' , type: "post" , contentType: "application/octet-stream" , data: '' , success: function (retData) { $("body").append("<iframe src='" + retData + "' style='display: none;' ></iframe>"); } }); }); but it does not work! Thanks!
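
    One point worth noting: a jQuery $.ajax success handler receives the response body as text and cannot hand it to the browser's save dialog, so file downloads are usually triggered by navigating to the URL (for example window.location = 'api/MyAPI') or by posting a form to it. Below is a hedged C# sketch of a server side that streams the file bytes rather than returning the path as a string; the controller and action names are illustrative, only the path comes from the question.

      using System.IO;
      using System.Net;
      using System.Net.Http;
      using System.Net.Http.Headers;
      using System.Web.Http;

      public class MyAPIController : ApiController
      {
          public HttpResponseMessage Get()
          {
              string path = @"D:\Others\Images\file.png";

              var result = new HttpResponseMessage(HttpStatusCode.OK)
              {
                  // stream the file contents rather than returning its path as a string
                  Content = new StreamContent(File.OpenRead(path))
              };
              result.Content.Headers.ContentType =
                  new MediaTypeHeaderValue("application/octet-stream");
              result.Content.Headers.ContentDisposition =
                  new ContentDispositionHeaderValue("attachment") { FileName = "file.png" };
              return result;
          }
      }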

    Read the article

  • Analyze your IIS Log Files - Favorite Log Parser Queries

    - by The Official Microsoft IIS Site
    The other day I was asked if I knew about a tool that would allow users to easily analyze the IIS Log Files, to process and look for specific data that could easily be automated. My recommendation was that if they were comfortable with using a SQL-like language that they should use Log Parser . Log Parser is a very powerful tool that provides a generic SQL-like language on top of many types of data like IIS Logs, Event Viewer entries, XML files, CSV files, File System and others; and it allows you...(read more)

    Read the article

  • SSIS Design Pattern: Loading Variable-Length Rows

    - by andyleonard
    Introduction I encounter flat file sources with variable-length rows on occasion. Here, I supply one SSIS Design Pattern for loading them. What's a Variable-Length Row Flat File? Great question - let's start with a definition. A variable-length row flat file is a text source of some flavor - comma-separated values (CSV), tab-delimited file (TDF), or even fixed-length, positional-, or ordinal-based (where the location of the data on the row defines its field). The major difference between a "normal"...(read more)

    Read the article

  • Automate Log Parser to Find Your Data Faster

    - by The Official Microsoft IIS Site
    Microsoft’s Log Parser is a really powerful tool for searching log files. With just a few simple SQL commands you can pull more data than you ever imagined out of your logs. It can be used to read web site log files, csv files, and even Windows event logs. Log Parser isn’t intended to compete with stats products such as Webtrends Log Analyzer or SmarterStats by Smartertools.com which I feel is the best on the market. I primarily use Log Parser for troubleshooting problems with a site...(read more)

    Read the article

  • Removing hard-coded values and defensive design vs YAGNI

    - by Ben Scott
    First a bit of background. I'm coding a lookup from Age - Rate. There are 7 age brackets so the lookup table is 3 columns (From|To|Rate) with 7 rows. The values rarely change - they are legislated rates (first and third columns) that have stayed the same for 3 years. I figured that the easiest way to store this table without hard-coding it is in the database in a global configuration table, as a single text value containing a CSV (so "65,69,0.05,70,74,0.06" is how the 65-69 and 70-74 tiers would be stored). Relatively easy to parse then use. Then I realised that to implement this I would have to create a new table, a repository to wrap around it, data layer tests for the repo, unit tests around the code that unflattens the CSV into the table, and tests around the lookup itself. The only benefit of all this work is avoiding hard-coding the lookup table. When talking to the users (who currently use the lookup table directly - by looking at a hard copy) the opinion is pretty much that "the rates never change." Obviously that isn't actually correct - the rates were only created three years ago and in the past things that "never change" have had a habit of changing - so for me to defensively program this I definitely shouldn't store the lookup table in the application. Except when I think YAGNI. The feature I am implementing doesn't specify that the rates will change. If the rates do change, they will still change so rarely that maintenance isn't even a consideration, and the feature isn't actually critical enough that anything would be affected if there was a delay between the rate change and the updated application. I've pretty much decided that nothing of value will be lost if I hard-code the lookup, and I'm not too concerned about my approach to this particular feature. My question is, as a professional have I properly justified that decision? Hard-coding values is bad design, but going to the trouble of removing the values from the application seems to violate the YAGNI principle. EDIT To clarify the question, I'm not concerned about the actual implementation. I'm concerned that I can either do a quick, bad thing, and justify it by saying YAGNI, or I can take a more defensive, high-effort approach, that even in the best case ultimately has low benefits. As a professional programmer does my decision to implement a design that I know is flawed simply come down to a cost/benefit analysis?
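
    For reference, a hedged C# sketch of what the "quick" hard-coded option might look like; the two tiers shown are the ones quoted above, the remaining legislated tiers are elided.

      using System;

      static class AgeRateLookup
      {
          private struct Bracket { public int From; public int To; public decimal Rate; }

          private static readonly Bracket[] Brackets =
          {
              new Bracket { From = 65, To = 69, Rate = 0.05m },
              new Bracket { From = 70, To = 74, Rate = 0.06m },
              // ... the remaining legislated tiers
          };

          public static decimal RateFor(int age)
          {
              foreach (var bracket in Brackets)
                  if (age >= bracket.From && age <= bracket.To)
                      return bracket.Rate;
              throw new ArgumentOutOfRangeException("age");
          }
      }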

    Read the article

  • Two BULK INSERT issues I worked around recently

    - by AaronBertrand
    Since I am still afraid of SSIS, and because I am dealing mostly with CSV files and table structures that are relatively simple and require only one of the three letters in the acronym "ETL," I find myself using BULK INSERT a lot. I have been meaning to switch to using CLR, since I am doing a lot of file system querying using xp_cmdshell, but I haven't had the chance to really explore it yet. I know, a lot of you are probably thinking, wow, look at all those bad habits. But for every person thinking...(read more)

    Read the article

  • CodePlex Daily Summary for Thursday, September 06, 2012

    CodePlex Daily Summary for Thursday, September 06, 2012Popular Releasesmenu4web: menu4web 0.4.1 - javascript menu for web sites: This release is for those who believe that global variables are evil. menu4web has been wrapped into m4w singleton object. Added "Vertical Tabs" example which illustrates object notation.WinRT XAML Toolkit: WinRT XAML Toolkit - 1.2.1: WinRT XAML Toolkit based on the Windows 8 RTM SDK. Download the latest source from the SOURCE CODE page. For compiled version use NuGet. You can add it to your project in Visual Studio by going to View/Other Windows/Package Manager Console and entering: PM> Install-Package winrtxamltoolkit Features AsyncUI extensions Controls and control extensions Converters Debugging helpers Imaging IO helpers VisualTree helpers Samples Recent changes NOTE: Namespace changes DebugConsol...iPDC - Free Phasor Data Concentrator: iPDC-v1.3.1: iPDC suite version-1.3.1, Modifications and Bug Fixed (from v 1.3.0) New User Manual for iPDC-v1.3.1 available on websites. Bug resolved : PMU Simulator TCP connection error and hang connection for client (PDC). Now PMU Simulator (server) can communicate more than one PDCs (clients) over TCP and UDP parallely. PMU Simulator is now sending the exact data frames as mentioned in data rate by user. PMU Simulator data rate has been verified by iPDC database entries and PMU Connection Tes...Microsoft SQL Server Product Samples: Database: AdventureWorks OData Feed: The AdventureWorks OData service exposes resources based on specific SQL views. The SQL views are a limited subset of the AdventureWorks database that results in several consuming scenarios: CompanySales Documents ManufacturingInstructions ProductCatalog TerritorySalesDrilldown WorkOrderRouting How to install the sample You can consume the AdventureWorks OData feed from http://services.odata.org/AdventureWorksV3/AdventureWorks.svc. You can also consume the AdventureWorks OData fe...Desktop Google Reader: 1.4.6: Sorting feeds alphabetical is now optional (see preferences window)DotNetNuke® Community Edition CMS: 06.02.03: Major Highlights Fixed issue where mailto: links were not working when sending bulk email Fixed issue where uses did not see friendship relationships Problem is in 6.2, which does not show in the Versions Affected list above. Fixed the issue with cascade deletes in comments in CoreMessaging_Notification Fixed UI issue when using a date fields as a required profile property during user registration Fixed error when running the product in debug mode Fixed visibility issue when...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.65: Fixed null-reference error in the build task constructor.Active Forums for DotNetNuke CMS: Active Forums 5.0.0 RC: RC release of Active Forums 5.0.Droid Explorer: Droid Explorer 0.8.8.7 Beta: Bug in the display icon for apk's, will fix with next release Added fallback icon if unable to get the image/icon from the Cloud Service Removed some stale plugins that were either out dated or incomplete. 
Added handler for *.ab files for restoring backups Added plugin to create device backups Backups stored in %USERPROFILE%\Android Backups\%DEVICE_ID%\ Added custom folder icon for the android backups directory better error handling for installing an apk bug fixes for the Runn...BI System Monitor: v2.1: Data Audits report and supporting SQL, and SSIS package Environment Overview report enhancements, improving the appearance, addition of data audit finding indicators Note: SQL 2012 version coming soon.Hidden Capture (HC): Hidden Capture 1.1: Hidden Capture 1.1 by Mohsen E.Dawatgar http://Hidden-Capture.blogfa.comExt Spec: Ext Spec 0.2.1: Refined examples and improved distribution options.The Visual Guide for Building Team Foundation Server 2012 Environments: Version 1: --Nearforums - ASP.NET MVC forum engine: Nearforums v8.5: Version 8.5 of Nearforums, the ASP.NET MVC Forum Engine. New features include: Built-in search engine using Lucene.NET Flood control improvements Notifications improvements: sync option and mail body View Roadmap for more details webdeploy package sha1 checksum: 961aff884a9187b6e8a86d68913cdd31f8deaf83WiX Toolset: WiX Toolset v3.6: WiX Toolset v3.6 introduces the Burn bootstrapper/chaining engine and support for Visual Studio 2012 and .NET Framework 4.5. Other minor functionality includes: WixDependencyExtension supports dependency checking among MSI packages. WixFirewallExtension supports more features of Windows Firewall. WixTagExtension supports Software Id Tagging. WixUtilExtension now supports recursive directory deletion. Melt simplifies pure-WiX patching by extracting .msi package content and updating .w...Iveely Search Engine: Iveely Search Engine (0.2.0): ????ISE?0.1.0??,?????,ISE?0.2.0?????????,???????,????????20???follow?ISE,????,??ISE??????????,??????????,?????????,?????????0.2.0??????,??????????。 Iveely Search Engine ?0.2.0?????????“??????????”,??????,?????????,???????,???????????????????,????、????????????。???0.1.0????????????: 1. ??“????” ??。??????????,?????????,???????????????????。??:????????,????????????,??????????????????。??????。 2. ??“????”??。?0.1.0??????,???????,???????????????,?????????????,????????,?0.2.0?,???????...GmailDefaultMaker: GmailDefaultMaker 3.0.0.2: Add QQ Mail BugfixSmart Data Access layer: Smart Data access Layer Ver 3: In this version support executing inline query is added. Check Documentation section for detail.DotNetNuke® Form and List: 06.00.04: DotNetNuke Form and List 06.00.04 Don't forget to backup your installation before upgrade. Changes in 06.00.04 Fix: Sql Scripts for 6.003 missed object qualifiers within stored procedures Fix: added missing resource "cmdCancel.Text" in form.ascx.resx Changes in 06.00.03 Fix: MakeThumbnail was broken if the application pool was configured to .Net 4 Change: Data is now stored in nvarchar(max) instead of ntext Changes in 06.00.02 The scripts are now compatible with SQL Azure, tested in a ne...Coevery - Free CRM: Coevery 1.0.0.24: Add a sample database, and installation instructions.New ProjectsAny-Service: AnyService is a .net 4.0 Windows service shell. It hosts any windows application in non-gui mode to run as a service.BabyCloudDrives - the multi cloud drive desktop's application: wpf ????BLACK ORANGE: Download The HPAD TEXT EDITOR and use it Wisely.. CodePlex New Release Checker: CodePlex New Release Checker is a small library that makes it easy to add, "New Version Available!" 
functionality to your CodePlex project.Collect: ????????!CSVManager: CSV??CSV?????,????CSV??,??????Exam Project: My Exam Project. Computer Vision, C and OpenCV-FTP: Hey guys thanks for checking out my ftp!Haushaltsbuch: 1ModMaker.Lua: ModMaker.Lua is an open source .NET library that parses and executes Lua code.MyJabbr: MyJabbr netduinoscope: Design shield and software to use netduino as oscilloscopeNetSurveillance Web Application: Net Surveillance Web ApplicationNiconicoApiHelper: ????API?????????OStega: A simple library for encrypt text into an bmp or png image.OURORM: ormTFS Cloud Deployment Toolkit: The TFS Cloud Deployment Toolkit is a set of tools that integrate with TFS 2010 to help manage configuration and deployment to various remote environments.The Visual Guide for Building Team Foundation Server 2012 Environments: A step-by-step guide for building Team Foundation Server 2012 environments that include SharePoint Server 2010, SQL Server 2012, Windows Server 2012 and more!WinRT LineChart: An attempt at creating an usable LineChart for everyone to use in his/her own Windows 8 Apps

    Read the article

  • How to process large files in NetLogo? [closed]

    - by user65597
    I am running into problems in NetLogo with large *.csv / *.txt files. The files can contain about 1 million records, and I need to read them (to eventually create a diagram based on the data). With the most straightforward source code, my program needs about 2 minutes to process these files. How should I approach reading such large data files faster in NetLogo? Is NetLogo even suitable for such tasks (as it seems to be designed more for teaching and learning)?

    Read the article

  • Looking for free, specific Ip2Location Database

    - by Andresch Serj
    I am searching for a free db (like an updated XML or CSV file) that relates IP addresses to specific locations. I want more information than just the country. I want some sort of region or city reference, even if that ends up to be a number that makes no sense to me. Doesn't have to be super correct or always up to date either. It is just to distinguish between user groups and not to monitor or spy on them.

    Read the article
