Search Results

Search found 69971 results on 2799 pages for 'file hosts'.


  • unzip file on small drive

    - by David Oneill
    I have a zip file that contains many files. Each file in the archive is about 100MB and is compressed by about 10% (i.e. it takes up roughly 90MB inside the zip). The whole zip file is 20GB, and I'm trying to unzip it onto a drive that has only 30GB free. On Windows 7, how can I unzip this archive on my laptop's hard drive when there isn't enough space for both the zipped and unzipped copies at once? In other words, can I tell Windows to remove entries from the zip file as they are extracted?

    Read the article

  • Cannot open Pivot Table source file

    - by Ken
    The Excel PivotTable error is: Cannot open PivotTable source file C:\Users\UserName\AppData\Roaming\Microsoft\Excel\DatabaseName (version1).TableName. I’ve seen other questions and answers on the same topic, but I think this is different. I believe I know why the error occurs: Excel closed unexpectedly and autosaved the workbook with (version1) appended to the original file name, saving it in the C:\Users\... path above, which is the default recovery location. I opened the recovered file in Excel, saved it as version1 on the server where the original file was located, deleted the original file, and renamed version1 to the original name. When I go to PivotTable Tools > Options > Change Data Source, it shows only the Table and Range, which are correct, but it does not show the file name or path. The version1 file and the renamed file both had the same structure, so the same source table was in both, but they were different files. How do I change the source file from the one it is looking for to my renamed file? PS: the (version1) file it says it is looking for is not in the autosave location, i.e. it is not at the path where it claims to be looking. Thank you for any help. Ken

    Read the article

  • Looping through a batch file only if the response is "everything is okay"

    - by PeanutsMonkey
    I have a batch file that loops through the contents of a directory and compresses the files in it as follows:

        for %%a in (c:\data\*.*) do if "%%~xa" == "" "C:\Program Files\7-Zip\7za.exe" a -tzip -mx9 "%%a.zip" "%%a"

    Since I am using 7-Zip to compress the files, it returns the message "everything is okay" when it has successfully compressed a file, and it then moves on to the next file, if any. What I would like to do is the following:

    1. Only move to the next file if the response is "everything is okay".
    2. If the response is anything but "everything is okay", log the error.
    3. Since an error has occurred, attempt to compress the file again.
    4. Once it has succeeded, i.e. "everything is okay", move on to the next file.
    5. Steps 3 and 4 only occur a maximum of 3 times before the script gives up and moves on to the next file.

    How can I achieve this?
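    A hedged sketch (untested, and not part of the original question) of one way to implement this: rather than parsing the "everything is okay" text, check 7-Zip's exit code (7za returns 0 on success) and retry up to three times, logging each failure. The log-file path, label names, and overall structure of the script are assumptions.

        @echo off
        setlocal enabledelayedexpansion
        set SEVENZIP="C:\Program Files\7-Zip\7za.exe"
        set LOGFILE=c:\data\compress-errors.log

        for %%a in (c:\data\*.*) do if "%%~xa" == "" call :compress "%%a"
        goto :eof

        :compress
        set tries=0
        :retry
        set /a tries+=1
        rem exit code 0 means 7-Zip reported no errors for this file
        %SEVENZIP% a -tzip -mx9 "%~1.zip" "%~1"
        if not errorlevel 1 goto :eof
        echo %date% %time% attempt !tries! failed for "%~1" >> %LOGFILE%
        if !tries! lss 3 goto retry
        goto :eof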

    Read the article

  • Config file (App.config) does not update on new installation

    - by Muhammad Kashif Nadeem
    I am creating a setup for my project using Visual Studio 2008, and I am facing a problem with the setup installation. If I uninstall the old setup (application) and then install the new one, the config file (App.config) is updated with the new attributes (it is certainly a new file). But if I install the new setup without uninstalling the old one, the config file is not updated. By config file I mean MyProject.exe.config. Why does the config file behave this way? Shouldn't it be replaced when the new setup is installed? Is it possible to delete and copy the config file from the new setup? Is there a way to forcefully update only the config file during installation? Thanks for your help!

    Read the article

  • What could cause the file command in Linux to report a text file as data?

    - by Jonah Bishop
    I have a couple of C++ source files (one .cpp and one .h) that are being reported as type data by the file command in Linux. When I run the file -bi command against these files, I get this output (the same for each file): application/octet-stream; charset=binary. Each file is clearly plain text (I can view them in vi). What's causing file to misreport the type of these files? Could it be some sort of Unicode thing? Both of these files were created in Windows-land (using Visual Studio 2005), but they're being compiled on Linux (it's a cross-platform application). Any ideas would be appreciated. Update: I don't see any null characters in either file. I found some extended characters in the .cpp file (in a comment block) and removed them, but file still reports the same encoding. I've tried forcing the encoding in SlickEdit, but that didn't seem to have an effect. When I open the file in vim, I see a [converted] line as soon as I open it. Perhaps I can get vim to force the encoding?
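    A hedged diagnostic sketch, not from the original post: a UTF-16 BOM or embedded NUL bytes near the start of a file are common reasons for file to report charset=binary, and both can be checked for and converted from the shell. The file name and the assumption that the source encoding is UTF-16 are placeholders.

        # look at the first bytes: a UTF-16 BOM shows up as "ff fe" or "fe ff",
        # and UTF-16 text has NUL bytes between the ASCII characters
        hexdump -C foo.cpp | head -n 4

        # count NUL bytes in the file (0 for ordinary ASCII/UTF-8 text)
        tr -cd '\000' < foo.cpp | wc -c

        # if the file turns out to be UTF-16, convert a copy to UTF-8
        iconv -f UTF-16 -t UTF-8 foo.cpp > foo.utf8.cpp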

    Read the article

  • rename/delete a folder from multipart rar file

    - by kikio
    Hello. I have a question (I sent it in the past as well). I have a multipart RAR file. The parts' contents are:

        file.part01.rar: myfolder (a folder), data.cab
        file.part02.rar: myfolder (a folder), data.cab
        file.part03.rar: myfolder (a folder), data.cab
        file.part04.rar: difffolder (a folder), anfolder (a folder), data.cab
        file.part05.rar: myfolder (a folder), data.cab

    I want to extract it, so I right-click on "file.part01.rar" and select "Extract to ...". It extracts 3 files, but at part 4 WinRAR says: "CRC. This file is corrupt." I think the problem is the folder names in part04.rar. Is there any way to rename the folders in part04.rar, and to move "data.cab" from "anfolder" to "difffolder"? I really need this; it is urgent! Thank you.

    Read the article

  • How to restore from file using Symantec NetBackup 7.5

    - by Tony
    I have an install of Symantec NetBackup 7.5 and I want to restore the server from a NetBackup image file. The file was created using NetBackup before I arrived. We had a hardware failure that corrupted this server and it needed to be rebuilt; now we want to restore from this image file. I can't for the life of me figure out how to restore from that file. I've installed the NetBackup application, but it can't find the file when I use the restore command within the application. If I double-click the file, it opens the application and then gives me the same "can't find any NetBackup files" error. I also can't simply drag the file into the NetBackup window. Any advice on how I can restore from this file would be appreciated, thank you.

    Read the article

  • What is the difference between the BIN file generated by ImgBurn and UltraISO

    - by user275517
    I have a CD from which I would like to generate a BIN file (with a CUE file to accompany it). I used ImgBurn and UltraISO to generate two BIN files. However, I have found that the BIN files generated by these programs are not identical (they have different file sizes). So, what is the difference between the BIN file formats, and which one should I use to back up the CD? The same applies to ISO file generation by these two programs: the file sizes do not match.

    Read the article

  • SQL Error (1064) when importing data from SQL file

    - by mejpark
    I have a MySQL database which was originally set up with the default latin1 character set and latin1_swedish_ci collation. I used the database like this for some time, until I noticed strange characters on my production web site, which is powered by a database exported from my development machine. At that point, I changed the default character set of the database and tables to utf8 and the collation to utf8_unicode_ci, converted the latin1 data inside each table to utf8 (using the 'convert data' option), and exported the database as a single SQL file using HeidiSQL. When the resulting SQL file is opened in Notepad++, several characters are rendered incorrectly. For example, en dashes (-) are displayed as – and e with accent (é) is displayed as é. I changed the encoding of the file from ANSI to UTF-8 (using the Encoding menu in Notepad++) and the offending characters are rendered correctly. I saved the new UTF-8-encoded SQL file and attempted to import the contents into the MySQL database on my production server. The import process fails with the following error:

        /* SQL Error (1064): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '?# -------------------------------------------------------- # Host: ' at line 1 */
        /* Error with snippets directory: The specified path was not found */

    The head of the SQL file:

        # --------------------------------------------------------
        # Host:                         127.0.0.1
        # Server version:               5.1.33-community
        # Server OS:                    Win32
        # HeidiSQL version:             6.0.0.3773
        # Date/time:                    2011-04-20 09:48:36
        # --------------------------------------------------------

    It chokes on the first line of the file, which is commented out. Why is this happening? I didn't have a problem loading data from SQL files until I changed the character set and collation of the database. I came up with an ugly workaround by performing the following steps:

    1. Export the database as a single SQL file using HeidiSQL.
    2. Open the resulting file in Notepad++ and convert it from ANSI to UTF-8 encoding.
    3. Create a new empty file in Notepad++, paste in the UTF-8 text, and save the file normally.

    What am I missing here?
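    A hedged note with a sketch, not from the original question: the '?#' in the error message often points to a UTF-8 byte-order mark (BOM) at the start of the dump, which the MySQL parser does not skip. Saving the file as "UTF-8 without BOM" in Notepad++, or stripping the three BOM bytes before importing, is one commonly suggested fix; the file and database names below are placeholders.

        # check for the three BOM bytes EF BB BF at the start of the dump
        head -c 3 dump.sql | hexdump -C

        # if they are present, write a copy without the first three bytes and import that
        tail -c +4 dump.sql > dump-nobom.sql
        mysql --default-character-set=utf8 -u user -p database_name < dump-nobom.sql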

    Read the article

  • rkhunter warns of inode change but no file modification date changes

    - by Nicholas Tolley Cottrell
    I have several systems running CentOS 6 with rkhunter installed, and a daily cron job that runs rkhunter and reports back via email. I very often get reports like:

        ---------------------- Start Rootkit Hunter Scan ----------------------
        Warning: The file properties have changed:
        File: /sbin/fsck
        Current inode: 6029384 Stored inode: 6029326
        Warning: The file properties have changed:
        File: /sbin/ip
        Current inode: 6029506 Stored inode: 6029343
        Warning: The file properties have changed:
        File: /sbin/nologin
        Current inode: 6029443 Stored inode: 6029531
        Warning: The file properties have changed:
        File: /bin/dmesg
        Current inode: 13369362 Stored inode: 13369366

    From what I understand, rkhunter will usually report a changed hash and/or modification date on the scanned files too, so this leads me to think there is no real change. My question: is there some other activity on the machine that could make the inode change (running ext4), or is this really yum making regular (roughly once a week) changes to these files as part of normal security updates?
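    A hedged verification sketch, not part of the original post: on an RPM-based system the package database can confirm whether the flagged binaries were actually replaced by updates, and rkhunter's stored file properties can be refreshed afterwards if so.

        # find which packages own the flagged binaries
        rpm -qf /sbin/fsck /sbin/ip /sbin/nologin /bin/dmesg

        # verify that the on-disk files still match their owning packages
        rpm -Vf /sbin/fsck /sbin/ip /sbin/nologin /bin/dmesg

        # check recent yum transactions for updates that touched them
        yum history list

        # if the warnings are explained by package updates, refresh rkhunter's baseline
        rkhunter --propupd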

    Read the article

  • Gifsicle: How to set it to not overwrite the original GIF file if the resulting modified GIF file is larger than the original?

    - by galacticninja
    About Gifsicle: Gifsicle is a command-line tool for creating, editing, and getting information about GIF images and animations. One of its features is (from its website): Optimize your animations! This stores only the changed portion of each frame, and can radically shrink your GIFs. You can also use transparency to make them even smaller. Gifsicle’s optimizer is pretty powerful, and usually reduces animations to within a couple bytes of the best commercial optimizers. I call Gifsicle through this .BAT file in the right-click 'Send to' menu:

        @echo off
        :compressFile
        "C:\Programs\Compression Scripts\gifsicle\bin\gifsicle.exe" --batch -V -O3 %1%
        echo.
        echo.
        SHIFT
        if exist %1% goto compressFile
        PAUSE

    However, when this animated GIF file (http://i.minus.com/i7WdodY5Zwot3.gif) is optimized with Gifsicle using the above commands, the result is a larger GIF file, and Gifsicle overwrites the original GIF with it. Initial file size: 7.57 MiB (7,942,886 bytes). After running it through the above commands: 7.64 MiB (8,017,622 bytes). Is there a way to prevent Gifsicle from overwriting the original file when its output file is larger than the original, while still overwriting the original when the output is smaller? Details: OS: Windows 7. Gifsicle version: 1.63, from the binary provided at http://www.lcdf.org/gifsicle/ (see the Gifsicle manual).
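    A hedged sketch of one way to get keep-only-if-smaller behaviour (untested, and not from the original post): drop --batch, write the optimized GIF to a temporary file, and replace the original only when the temporary file is smaller. The paths and the temporary-file naming are assumptions.

        @echo off
        set GIFSICLE="C:\Programs\Compression Scripts\gifsicle\bin\gifsicle.exe"

        :compressFile
        %GIFSICLE% -V -O3 %1 -o "%~1.tmp.gif"
        if not exist "%~1.tmp.gif" goto next

        rem keep whichever file is smaller
        for %%A in (%1) do set ORIGSIZE=%%~zA
        for %%A in ("%~1.tmp.gif") do set NEWSIZE=%%~zA
        if %NEWSIZE% LSS %ORIGSIZE% (
            move /y "%~1.tmp.gif" %1 >nul
        ) else (
            del "%~1.tmp.gif"
        )

        :next
        echo.
        SHIFT
        if not "%~1"=="" goto compressFile
        PAUSE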

    Read the article

  • Browser not parsing PAC file properly?

    - by mfinni
    I have a long PAC file. The browsers (IE and Chrome) are configured to use it and it generally does what it says on the tin. However, I have a domain that continues to go through the proxy although it should be going direct.

        // Match specific hosts and IPs entered as hosts
        if (buncha stuff ||
            shExpMatch(host,"(*.newmarketinc.com)") ||
            shExpMatch(host,"(newmarketinc.com)") ||
            buncha stuff )
            return "DIRECT";

    Pactester shows that anything in the domain should go direct:

        h:\pacparser\pactester.exe -p h:\pacfile -u http://daas.newmarketinc.com
        DIRECT

    But we continue to pass traffic to hosts in this domain via the proxy; Wireshark and Fiddler both show this. How do I figure out how my browser has gotten brain damage? Traffic to other sites in this stanza does properly go direct, as confirmed by Fiddler and Wireshark.
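    A hedged sketch, not taken from the original PAC file: one thing sometimes worth testing is replacing the shExpMatch() calls, whose patterns here contain literal parentheses that different PAC engines may treat differently, with a plain dnsDomainIs()/host comparison, then re-running pactester and the browsers against the modified file. The surrounding conditions are placeholders.

        // match the bare domain and any subdomain without shell-expression patterns
        if (dnsDomainIs(host, ".newmarketinc.com") ||
            host == "newmarketinc.com") {
            return "DIRECT";
        }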

    Read the article

  • Alternative Windows Offline Files + Windows Backup + Previous Version Setup

    - by Herson
    Currently our documents are all hosted on a Windows 7 box. Users access the files over a Windows share, and the documents are available offline (a Windows 7 feature). The documents are backed up daily by the Windows 7 Backup and Restore utility, and users can access previous versions of a file (from the backups) using Windows Explorer's "Previous Versions" feature. This setup is currently working well, except for the following:

    1. We would prefer to have access to hourly versions of each file, not daily.
    2. The previous-versions mechanism is tied to the backup mechanism. Windows 7 performs a full backup every week and an incremental backup every day, and the previous versions of a file are simply what is available in the backups. If you have 20GB of documents and want to keep at least three (3) years of history, you will use at minimum 3 years * 52 weeks * 20GB, or about 3TB, even if the documents change very little. That is a pretty inefficient use of space.
    3. Looking up previous versions of a file is very slow (tens of minutes). This is probably related to the previous issue: Windows has to traverse all of its backups.

    I am considering using SVN with auto-commit/auto-update via TortoiseSVN. It would have the following advantages:

    1. Backups are easy and also capture the whole history of each document (just back up the repository).
    2. Previous versions can be created frequently; I think an svn commit/update cycle could run every two minutes or so.
    3. Users can sync over the net.

    However, I can see the following issues:

    1. More conflicts than in the original setup, because multiple users can now edit the same file even when both are online, i.e. able to connect to the SVN repo. Users could of course lock a file before editing, but that would mean they have to adjust.
    2. Delay in the propagation of file changes. With Windows 7 file sharing, changes made by one online user are instantaneously available to other online users. With the SVN setup, changes are only propagated when users execute the svn add/commit/update sequence; the delay will probably be a few minutes. This workflow will no longer work: "Hi, I just edited document X, can you have a quick look?"

    I would like to ask the opinion of the community for alternative setups, or improvements on the above setup to work out the kinks.
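    For the SVN idea, a minimal sketch of the auto-commit/auto-update cycle as a scheduled task (this assumes a working copy at C:\docs and the Subversion command-line client on the PATH; deleted files and conflict handling are deliberately ignored, and none of this comes from the original post):

        @echo off
        cd /d C:\docs

        rem pick up changes committed by other users first
        svn update --non-interactive

        rem schedule any new files for version control, then commit everything
        svn add --force . --quiet
        svn commit --non-interactive -m "automatic commit %date% %time%"

    Run every few minutes via Task Scheduler, this approximates the two-minute cycle described above; a real setup would still need to handle svn delete for removed files and to resolve conflicts.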

    Read the article

  • Sync Local ICS File with Android via Exchange/Outlook

    - by sinDizzy
    At my company we have a 3rd-party app which tracks off-hours duty for all of our engineers. The app is not web-enabled and we cannot make any changes to it. It does write a simple text file, and I have created an app that translates that into an ICS file. My goal is to have that calendar appear on my Android phone. Here is the path I am working on:

        DutyApp -> text file -> ICS file -> Outlook (Exchange) -> Android (via Exchange sync)

    My problems: If I place the ICS file on our file server and then, in Outlook, go to Calendar > Open Calendar > From Internet, it shows up in Outlook and looks pretty good. After a couple of minutes it shows up on my Android phone as well. But if I change the original ICS file, those changes never appear in Outlook and never sync to my Android phone; this seems to be a one-shot deal, almost like an import. Now, if I place the ICS file on our web server and then, in Outlook, go to Calendar > Open Calendar > From Internet and use a webcal:// address, it shows up in Outlook and also looks pretty good, and any changes I make to the original ICS file appear in Outlook. However, the calendar never shows up on Android. This calendar is a subscription, and it seems, although I am not sure, that Android doesn't display Exchange subscription calendars. (Yes, I know this works with Gmail subscription calendars, but this is Exchange.) So my question is: what other options are there? We are behind a firewall, so I can't link the ICS file to a Gmail account, and I can't put the ICS file anywhere other than our file or web server.

    Read the article

  • Overwriting output to a text file

    - by Naveen Gamage
    I'm trying to write wget command output to a text file, but it always appends to the text file.

        #!/bin/sh

        download()
        {
            local url=$1
            echo -n " "
            wget --progress=dot $url 2>&1 | grep --line-buffered "%" | \
                sed -u -e "s,\.,,g" | awk '{printf("\b\b\b\b%4s", $2)}'
            echo " DONE"
        }

        file="$1"
        echo -n "Downloading $file:"
        download "$file" > file.log

    I tried using >, but it won't work. Where am I going wrong?
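    For reference, a minimal illustration of the two redirection operators involved here (generic shell behaviour, not specific to this script): > truncates the target file before writing, while >> appends to it.

        echo "first run"  > file.log   # file.log now contains only "first run"
        echo "second run" > file.log   # truncated again: only "second run" remains
        echo "third run" >> file.log   # appended: "second run" then "third run"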

    Read the article

  • Full File Path in Reference to Another Workbook in Excel

    - by SHARIQ MUSANI
    I have two Excel files, one on D:\ and the other on E:\. I reference one from the other, for example using VLOOKUP in the E: file to search the D: file, like this:

        VLOOKUP(A1,'D:\SHARIQ\[FILE NAME.XLS]SHEETNAME'!A1:10,3,FALSE)

    As long as D:\SHARIQ\FILE NAME.XLS is open, the formula in the E: file is displayed like this:

        VLOOKUP(A1,'[FILE NAME.XLS]SHEETNAME'!A1:10,3,FALSE)

    Why does it remove the whole path?

    Read the article

  • "pdf_open: Not a PDF 1.[1-5] file." when typesetting TeX file in TextMate

    - by Manti
    I have a TeX file which is typeset with XeLaTeX. The file has \includegraphics{coverimage.eps} on the first page, and coverimage.eps is located in the same directory as the TeX file. OS: Mac OS X 10.6.5, TextMate 1.5.10, xelatex 3.1415926-2.2-0.9997.4 (TeX Live 2010). If I run xelatex doc.tex in a console, the file compiles without errors and the EPS file is included. If I typeset it in TextMate (the xelatex engine is selected in the preferences), I get the following error:

        ** WARNING ** pdf_open: Not a PDF 1.[1-5] file.
        ** WARNING ** Failed to include image file "./coverimage.eps"
        ** WARNING ** >> Please check if
        ** WARNING ** >> rungs -q -dNOPAUSE -dBATCH -sPAPERSIZE=a0 -sDEVICE=pdfwrite -dCompatibilityLevel=%v -dAutoFilterGrayImages=false -dGrayImageFilter=/FlateEncode -dAutoFilterColorImages=false -dColorImageFilter=/FlateEncode -sOutputFile=%o %i -c quit
        ** WARNING ** >> %o = output filename, %i = input filename, %b = input filename without suffix
        ** WARNING ** >> can really convert "./coverimage.eps" to PDF format image.
        ** WARNING ** pdf: image inclusion failed for "coverimage.eps".
        ** WARNING ** Failed to read image file: coverimage.eps
        ** WARNING ** Interpreting special command PSfile (ps:) failed.
        ** WARNING ** >> at page="1" position="(107.149, 124.566)" (in PDF)
        ** WARNING ** >> xxx "PSfile="coverimage.eps" llx=0 lly=0 urx=408 ury=526 rwi=3809

    There is no image in the resulting file (but the text looks fine). I am not sure whether this is an error in xelatex or in the TextMate LaTeX bundle. What I have tried:

    1. As I said, xelatex doc.tex from the console works. The exact command line that TextMate uses is xelatex -interaction=nonstopmode -file-line-error-style -synctex=1, but that works from the console too.
    2. If I convert the image to PDF (and fix the .tex file accordingly), typesetting from TextMate works too.
    3. I tried running rungs with the parameters given in the error message and got a valid .pdf file with the image as a result.
    4. I compared the .log files from typesetting in TextMate and in the console; they are absolutely identical except for this error message (in particular, the version of xelatex is the same).

    Does anyone know what could cause this? Please tell me if you need any additional information. Thank you in advance.
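    A hedged workaround sketch corresponding to item 2 above (it sidesteps rather than fixes whatever differs between the TextMate and console environments): convert the EPS once with epstopdf, which ships with TeX Live, and include the graphic without an extension so the PDF is picked up.

        # one-time conversion next to the .tex file
        epstopdf coverimage.eps        # writes coverimage.pdf
        # then in the TeX source, drop the extension so the PDF is used:
        #   \includegraphics{coverimage}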

    Read the article

  • Any tool available to make renaming in Windows XP more like Mac OS X?

    - by alex
    I've noticed one cool thing in OS X: when you rename a file, it automatically selects everything up to the extension. For example, attempting to rename this.is.a.file.png would preselect this.is.a.file, allowing you to quickly rename it whilst preserving the extension. I know I could turn off showing extensions for known file types, but I like to have them visible. Is there any software that can do this? Thanks

    Read the article

  • Any cloud storage service that lets us authenticate the file when we serve the file to our visitors?

    - by TORr0t
    Let's say I want to restrict a file to my visitors. I mean, I have an xx.avi file to be streamed/downloaded, and the visitor has paid me for the bandwidth and the size of the file. In Amazon S3 I can't really control access to the file (there is a very basic control mechanism, which is not enough for me). The only way is for my server to proxy the file: it fetches the file from the Amazon S3 storage node and sends it to the visitor after a PHP script approves the authentication. But this way I would double the bandwidth usage, and again there would be a latency problem, since my server needs to get the file from Amazon S3 first. So I was wondering if there is a better solution, or any cloud storage service that lets us control which visitors can access the file. Thanks

    Read the article

  • How to know if a file has an 'access' monitor in Linux

    - by J L
    I'm a noob and have some questions about seeing who accessed a file. I found that there are ways to see whether a file was accessed (not modified/changed) through the audit subsystem and inotify. However, from what I have read online (for example http://www.cyberciti.biz/tips/linux-audit-files-to-see-who-made-changes-to-a-file.html), to watch/monitor a file I have to set a watch with a command like:

        # auditctl -w /etc/passwd -p war -k password-file

    So if I create a new file or directory, do I have to use an audit/inotify command to set a watch first in order to see who accessed the new file? Also, is there a way to know whether a directory is already being watched through the audit subsystem or inotify? And how/where can I check the access log for a file?
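    A hedged sketch of the audit-side commands that usually accompany a rule like the one above (this assumes auditd is running and reuses the same key name as the example; it is not from the original question):

        # list the watches/rules currently loaded in the kernel
        auditctl -l

        # search the audit log for events tagged with the rule's key ...
        ausearch -k password-file

        # ... or for events on a specific path
        ausearch -f /etc/passwd

        # summary report of file-related audit events
        aureport -f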

    Read the article

  • Windows symbolic file links aren't followed through network access

    - by fpdragon
    I have a server running Windows 7 with a big share on C:\share. In this share there are many symbolic links to files on other local disks, e.g. C:\share\file.txt (a symlink) pointing to D:\file.txt. I can access the file locally as C:\share\file.txt and as \\server\share\file.txt, but if I try to access it from another PC I can't open the file. I am able to delete, rename, etc. the link, but it seems that the symbolic link isn't resolved by the server. Can I change something in CIFS to make this work? I already checked the ACLs of the link and of the target file and set them to allow everybody everything. I can also access the target file directly via d$. I hope there is a solution...
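    A hedged sketch of the setting usually checked for this (run in an elevated command prompt on the server, and on the client for some combinations; which of the four flags matters depends on where the link and its target live; this is not from the original post):

        :: show how local/remote symlink evaluation is currently configured
        fsutil behavior query SymlinkEvaluation

        :: allow all four combinations of local/remote link -> local/remote target
        fsutil behavior set SymlinkEvaluation L2L:1 L2R:1 R2R:1 R2L:1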

    Read the article

  • Move hibernate file to a different drive

    - by Kamarey
    Is it possible to move the Windows hibernation file to a different drive? E.g. if I have Windows installed on C:, I want its hibernation file to be on D:. Eee... I just realised that I wanted to ask this question about the hibernation file, not the page file; I don't know where the page file came from, sorry :) So the question is about the hibernation file, but answers about the page file are welcome too. (Edit: the original question title was "Move page file to a different drive".)

    Read the article

  • creating objects from trivial graph format text file. java. dijkstra algorithm.

    - by user560084
    I want to create objects, Vertex and Edge, from a Trivial Graph Format text file. One of the programmers here suggested that I use Trivial Graph Format to store the data for Dijkstra's algorithm. The problem is that at the moment all of the information (e.g. weights and links) is in the source code; I want to have a separate text file for that and read it into the program. I thought about scanning through the text file with a Scanner, but I am not quite sure how to create the different objects from the same file. Could I have some help please? The file is:

        v0 Harrisburg
        v1 Baltimore
        v2 Washington
        v3 Philadelphia
        v4 Binghamton
        v5 Allentown
        v6 New York
        #
        v0 v1 79.83
        v0 v5 81.15
        v1 v0 79.75
        v1 v2 39.42
        v1 v3 103.00
        v2 v1 38.65
        v3 v1 102.53
        v3 v5 61.44
        v3 v6 96.79
        v4 v5 133.04
        v5 v0 81.77
        v5 v3 62.05
        v5 v4 134.47
        v5 v6 91.63
        v6 v3 97.24
        v6 v5 87.94

    and the Dijkstra algorithm code is:

        /* Downloaded from: http://en.literateprograms.org/Special:Downloadcode/Dijkstra%27s_algorithm_%28Java%29 */

        import java.util.PriorityQueue;
        import java.util.List;
        import java.util.ArrayList;
        import java.util.Collections;

        class Vertex implements Comparable<Vertex>
        {
            public final String name;
            public Edge[] adjacencies;
            public double minDistance = Double.POSITIVE_INFINITY;
            public Vertex previous;

            public Vertex(String argName) { name = argName; }
            public String toString() { return name; }
            public int compareTo(Vertex other)
            {
                return Double.compare(minDistance, other.minDistance);
            }
        }

        class Edge
        {
            public final Vertex target;
            public final double weight;
            public Edge(Vertex argTarget, double argWeight)
            {
                target = argTarget;
                weight = argWeight;
            }
        }

        public class Dijkstra
        {
            public static void computePaths(Vertex source)
            {
                source.minDistance = 0.;
                PriorityQueue<Vertex> vertexQueue = new PriorityQueue<Vertex>();
                vertexQueue.add(source);

                while (!vertexQueue.isEmpty())
                {
                    Vertex u = vertexQueue.poll();

                    // Visit each edge exiting u
                    for (Edge e : u.adjacencies)
                    {
                        Vertex v = e.target;
                        double weight = e.weight;
                        double distanceThroughU = u.minDistance + weight;
                        if (distanceThroughU < v.minDistance)
                        {
                            vertexQueue.remove(v);
                            v.minDistance = distanceThroughU;
                            v.previous = u;
                            vertexQueue.add(v);
                        }
                    }
                }
            }

            public static List<Vertex> getShortestPathTo(Vertex target)
            {
                List<Vertex> path = new ArrayList<Vertex>();
                for (Vertex vertex = target; vertex != null; vertex = vertex.previous)
                    path.add(vertex);
                Collections.reverse(path);
                return path;
            }

            public static void main(String[] args)
            {
                Vertex v0 = new Vertex("Nottinghill_Gate");
                Vertex v1 = new Vertex("High_Street_kensignton");
                Vertex v2 = new Vertex("Glouchester_Road");
                Vertex v3 = new Vertex("South_Kensignton");
                Vertex v4 = new Vertex("Sloane_Square");
                Vertex v5 = new Vertex("Victoria");
                Vertex v6 = new Vertex("Westminster");

                v0.adjacencies = new Edge[]{ new Edge(v1, 79.83), new Edge(v6, 97.24) };
                v1.adjacencies = new Edge[]{ new Edge(v2, 39.42), new Edge(v0, 79.83) };
                v2.adjacencies = new Edge[]{ new Edge(v3, 38.65), new Edge(v1, 39.42) };
                v3.adjacencies = new Edge[]{ new Edge(v4, 102.53), new Edge(v2, 38.65) };
                v4.adjacencies = new Edge[]{ new Edge(v5, 133.04), new Edge(v3, 102.53) };
                v5.adjacencies = new Edge[]{ new Edge(v6, 81.77), new Edge(v4, 133.04) };
                v6.adjacencies = new Edge[]{ new Edge(v0, 97.24), new Edge(v5, 81.77) };

                Vertex[] vertices = { v0, v1, v2, v3, v4, v5, v6 };
                computePaths(v0);
                for (Vertex v : vertices)
                {
                    System.out.println("Distance to " + v + ": " + v.minDistance);
                    List<Vertex> path = getShortestPathTo(v);
                    System.out.println("Path: " + path);
                }
            }
        }

    and the code for scanning the file is:

        import java.util.Scanner;
        import java.io.File;
        import java.io.FileNotFoundException;

        public class DataScanner1
        {
            //private int total = 0;
            //private int distance = 0;
            private String vector;
            private String stations;
            private double [] Edge = new double [];

            /*
            public int getTotal(){
                return total;
            }
            */

            /*
            public void getMenuInput(){
                KeyboardInput in = new KeyboardInput;
                System.out.println("Enter the destination? ");
                String val = in.readString();
                return val;
            }
            */

            public void readFile(String fileName)
            {
                try
                {
                    Scanner scanner = new Scanner(new File(fileName));
                    scanner.useDelimiter(System.getProperty("line.separator"));
                    while (scanner.hasNext())
                    {
                        parseLine(scanner.next());
                    }
                    scanner.close();
                }
                catch (FileNotFoundException e)
                {
                    e.printStackTrace();
                }
            }

            public void parseLine(String line)
            {
                Scanner lineScanner = new Scanner(line);
                lineScanner.useDelimiter("\\s*,\\s*");
                vector = lineScanner.next();
                stations = lineScanner.next();
                System.out.println("The current station is " + vector + " and the destination to the next station is " + stations + ".");
                //total += distance;
                //System.out.println("The total distance is " + total);
            }

            public static void main(String[] args)
            {
                /*
                if (args.length != 1) {
                    System.err.println("usage: java TextScanner2" + "file location");
                    System.exit(0);
                }
                */
                DataScanner1 scanner = new DataScanner1();
                scanner.readFile(args[0]);
                //int total =+ distance;
                //System.out.println("");
                //System.out.println("The total distance is " + scanner.getTotal());
            }
        }

    Read the article

  • File Encryption Operation

    - by kiruthika
    Hi all, I have a question about the gpg command. We are using gpg to encrypt a file. File.txt contains the following:

        Testing
        hello world
        My security things.

    Now I encrypt File.txt:

        gpg --symmetric File.txt

    This gives me File.txt.gpg, which is encrypted. My problem: if someone opens that file and makes changes to it, I am no longer able to get the file content back. It says the following:

        $ gpg --decrypt File.txt.gpg
        gpg: no valid OpenPGP data found.
        gpg: decrypt_message failed: eof

    I want to recover my file content even though somebody has made changes to the encrypted file. What should I do about this problem?

    Read the article

  • Compile a binary file for linking on OS X

    - by Satpal
    I'm trying to compile a binary file into a Mach-O object file so that it can be linked into a dylib. The dylib is written in C/C++. On Linux the following command is used:

        ld -r -b binary -o foo.o foo.bin

    I have tried various options on OS X but to no avail:

        ld -r foo.bin -o foo.o

    gives:

        ld: warning: -arch not specified
        ld: warning: ignoring file foo.bin, file was built for unsupported file format which is not the architecture being linked (x86_64)

    and an empty .o file is created.

        ld -arch x86_64 -r foo.bin -o foo.o

    gives:

        ld: warning: ignoring file foo.bin, file was built for unsupported file format which is not the architecture being linked (x86_64)

    Again, an empty .o file is created. Checking the files with nm gives:

        nm foo.o
        nm: no name list

    The binary file is actually firmware that will be downloaded to an external device. Thanks for looking
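    A hedged alternative sketch, not from the original question: the -b binary option is a GNU ld feature, so on OS X a common workaround is to generate a C array from the firmware with xxd and compile that into the object file. The symbol names below are the ones xxd derives from the input filename.

        # turn the raw firmware into a C source file defining
        #   unsigned char foo_bin[]   and   unsigned int foo_bin_len
        xxd -i foo.bin > foo_bin.c

        # compile it into a Mach-O object that can be linked into the dylib
        clang -c -arch x86_64 foo_bin.c -o foo.o

        # the symbols are now visible to nm and to the C/C++ code
        nm foo.o

    From the C/C++ side the data is then reached with extern unsigned char foo_bin[]; and extern unsigned int foo_bin_len;.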

    Read the article
