Search Results

Search found 40479 results on 1620 pages for 'binary files'.

Page 369/1620 | < Previous Page | Next Page >

  • What's the fastest and automatic way to transfer 2GB of data between 2 PCs every night?

    - by phan
    While it's fast (less than 2 minutes), I hate having to copy files from PC #1 onto a USB stick and then manually plug it into PC #2 to copy the files across. Dropbox is too slow at uploading and then downloading (syncing) 2 GB; it can take hours. Copying 2 GB over the network is also slow, because we're dealing with 10,000 little files that total 2 GB, not just one giant 2 GB file. Not sure why, but dealing with 10,000 little files makes the copy process much longer. Is there any other method that I'm missing? Any ideas? I'm using Windows 7 on both PCs. Edit: These files change every single night.
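
    Since both machines run Windows 7, one approach worth trying (not something from the original post) is a scheduled robocopy mirror over the network; robocopy's multithreaded mode tends to cope with large numbers of small files far better than a plain drag-and-drop copy. A minimal sketch, assuming PC #2 exposes a share called NightlyData (the paths, log location and share name are made up for illustration):

        rem /MIR mirrors the folder, /MT:32 copies 32 files in parallel, /R and /W keep retries short
        robocopy C:\NightlyData \\PC2\NightlyData /MIR /MT:32 /R:2 /W:5 /LOG:C:\Logs\nightly-copy.log

    Run it nightly from Task Scheduler. Note that /MIR also deletes destination files that no longer exist on the source, which suits a "files change every night" workflow but is destructive if pointed at the wrong folder.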

    Read the article

  • How can Windows defragmentation tools cause internal fragmentation in SQL Server?

    - by Martin
    I was just reading this article where the author talks about the file system fragmentation that can be caused by growing database files. There was one bit that I didn't quite follow. What about Windows defragmentation tools? Although you can use a Windows defragmentation tool to defragment your database files, these tools simply move chunks of files around to get them contiguous. This moving of chunks of files can cause internal fragmentation that you might not be able to resolve easily. Is the author saying here that the disc defragmenter makes no attempt to put the chunks of files in the correct sequence, or have I misunderstood? If he is saying that, then is this a limitation of all disc defragmenter utilities, even commercial ones?

    Read the article

  • Which version control system to use?

    - by deshmukh
    I am looking at using a version control system to ensure that I can go back in time to a particular version of any document in ~ if I have to. What is the best-suited tool for this, given that I don't have much experience with version control systems, and that most files will be plain text but there will also be some LibreOffice files? The tool should: be easy to set up, run and maintain; have easy-to-understand configuration options (which directories to track, for how long, how frequently changes should be captured, etc.); ideally also have a GUI front-end; and be able to recover deleted files. What is the best / most widely used tool that will suit me?
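
    Not part of the original question, but one frequently suggested setup for exactly this is a plain git repository in the home directory with a whitelist-style .gitignore, committed on a schedule. A rough sketch (the whitelisted directory and the cron schedule are only illustrative):

        cd ~
        git init
        # ignore everything at the top level except what is explicitly whitelisted
        printf '/*\n!/Documents\n!/.gitignore\n' > .gitignore
        git add -A && git commit -m "initial snapshot"
        # take an automatic snapshot every night at 01:00
        (crontab -l; echo '0 1 * * * cd $HOME && git add -A && git commit -qm "nightly snapshot"') | crontab -

    Deleted files can be recovered from any earlier commit with git checkout, and GUI front-ends such as gitg exist. LibreOffice documents are binary ZIP containers, so diffs won't be readable, but versioning and recovery still work.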

    Read the article

  • How can I autoclean my gnome main menu?

    - by Bruce Connor
    I like to experiment with lots of different software in my Ubuntu install. Then, every time Ubuntu reaches a new release cycle, I simply do a clean install (instead of upgrading) to get rid of all the extra software (and their respective config files/folders). The only things I always back up and carry to the next install (besides personal files) are the config files for GNOME, so my desktop is always the way I like it. =) The problem with that is that the different packages I test out never get properly uninstalled, so my GNOME main menu is full of broken links referring to software I had in previous installations (which got carried over because I kept the GNOME config files). Is there any automated way to go through my GNOME main menu and remove any broken links? I know how to manually edit the menu, and I could go through it myself, but I'm looking for some script or package that will do the cleaning for me so I don't have to do it manually every release cycle.
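
    Not an answer from the original thread, but as a starting point: menu entries are just .desktop files, so a short script can flag the ones whose Exec command no longer exists on the system. A rough sketch, assuming the carried-over launchers live under ~/.local/share/applications (adjust the path if yours are elsewhere) and treating only the first word of Exec= as the command, which is a heuristic rather than a full parser:

        #!/bin/sh
        # report .desktop launchers whose Exec binary can no longer be found
        for f in "$HOME"/.local/share/applications/*.desktop; do
            [ -e "$f" ] || continue
            cmd=$(grep -m1 '^Exec=' "$f" | sed 's/^Exec=//' | awk '{print $1}')
            [ -n "$cmd" ] && ! command -v "$cmd" >/dev/null 2>&1 && echo "broken: $f ($cmd)"
        done

    Review the list it prints and delete the offending .desktop files by hand, or swap the echo for rm once you trust the output.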

    Read the article

  • Avoiding connection timeouts on first connection to LocalDB edition of SQL Server Express

    - by Greg Low
    When you first make a connection to the new LocalDB edition of SQL Server Express, the system files, etc., that are required for a new instance are spun up. (The system files, such as the master database files, end up in C:\Users\<username>\AppData\Local\Microsoft\Microsoft SQL Server Local DB\Instances\LocalDBApp1.) That can take a while on a slower machine, so this means that the default connection timeout of 30 seconds (in most client libraries) could be exceeded. To avoid this hit on the...(read more)
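
    The excerpt is cut off above; one obvious mitigation (not necessarily the one the full article goes on to recommend) is simply to lengthen the client-side timeout for that first connection. For example, with sqlcmd the login timeout can be raised like this (the instance name is just the SQL Server 2012 default):

        sqlcmd -S "(localdb)\v11.0" -l 60 -Q "SELECT @@VERSION"

    In ADO.NET the equivalent is a larger Connect Timeout value in the connection string.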

    Read the article

  • Using Computer Management (MMC) with the Solaris CIFS Service (August 25, 2009)

    - by user12612012
    One of our goals for the Solaris CIFS Service is to provide seamless Windows interoperability: not just to deliver ubiquitous, multi-protocol file sharing, which is obviously a major part of this project, but to support Windows services at a fundamental level. It's an ongoing mission and our latest update includes support for Windows remote management.

    Remote management is extremely important to Windows administrators, and one of the mainstay tools is Computer Management. Computer Management is a Windows administration application, actually a collection of Microsoft Management Console (MMC) tools, that can be used to configure, monitor and manage local and remote services and resources. The MMC is an extensible framework of registered components, known as snap-ins, which allows Computer Management to provide comprehensive management features for both the local system and remote systems on the network.

    Supported Computer Management features include:

    - Share Management: Support for share management is relatively complete. You can create, delete, list and configure shares. It's not yet possible to change the maximum allowed or number of users properties, but other properties, including the Share Permissions, can be managed via the MMC.

    - Users, Groups and Connections: You can view local SMB users and groups, monitor user connections and see the list of open files. If necessary, you can also disconnect users and/or close files.

    - Services: You can view the SMF services running on an OpenSolaris system. This is a read-only view; we don't support service management (the ability to start or stop SMF services) from Computer Management (yet).

    To ensure that only the appropriate users have access to administrative operations, there are some access restrictions on these remote management features.

    Regular users can:
    - List shares

    Only members of the Administrators or Power Users groups can:
    - Manage shares
    - List connections

    Only members of the Administrators group can:
    - List open files and close files
    - Disconnect users
    - View SMF services
    - View the EventLog

    Here's a screenshot from when I was using Computer Management and Server Manager (another Windows remote management application) on Windows XP to view some open files on an OpenSolaris system, while preparing a slide presentation on MMC support.

    Read the article

  • Google I/O 2012 - Writing Efficient Drive Apps for Android

    Presented by Alain Vongsouvanh and Claudio Cherubino. This session goes through how to write Drive apps that synchronize files with Android devices. We'll also go into how to open files on Android devices, or create new files from this environment. For all I/O 2012 sessions, go to developers.google.com. From: GoogleDevelopers | Views: 234 | Ratings: 5 | Time: 52:45 | More in Science & Technology

    Read the article

  • SCP command Clarification

    - by david.colais
    I'm using scp to pull some files from a remote server, and one variation of the command is not working. I have 2 files named one.xml and two.xml on the remote server, and I'm pulling them into the current dir using the following commands: scp [email protected]:/student/class/Intermediate/one.xml . scp [email protected]:/student/class/Intermediate/two.xml . The above commands work fine, but if I use a wildcard to pull all the xml files in a single shot, as shown below, it returns scp: No match. scp [email protected]:/student/class/Intermediate/*.xml . Why does it work if I pull the files individually but not when I try to pull them with a wildcard?
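
    Not part of the original question, but for reference: that error usually means the local shell (typically csh/tcsh) tried to expand the wildcard against local files, found nothing, and aborted before scp ever ran. Quoting or escaping the glob passes it through so the remote side expands it instead. A sketch, reusing the path from the question:

        scp '[email protected]:/student/class/Intermediate/*.xml' .
        # or escape just the asterisk
        scp [email protected]:/student/class/Intermediate/\*.xml .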

    Read the article

  • Simple Merging Of PDF Documents with iTextSharp 5.4.5.0

    - by Mladen Prajdic
    As we were working on our first SQL Saturday in Slovenia, we came to a point where we had to print out the so-called SpeedPASSes for attendees. This SpeedPASS file is a PDF and contains their raffle, lunch and admission tickets. The problem is we have to download one PDF per attendee and print it out, and printing more than 10 docs at once is a pain. So I decided to make a little console app that would merge multiple PDF files into a single file that would be much easier to print. I used an open source PDF manipulation library called iTextSharp, version 5.4.5.0. This is the console program I used. It's brilliantly named MergeSpeedPASS. It only has two methods and is really short. Don't let the name fool you; it can be used to merge any PDF files. The first parameter is the name of the target PDF file that will be created. The second parameter is the directory containing the PDF files to be merged into a single file.

        using iTextSharp.text;
        using iTextSharp.text.pdf;
        using System;
        using System.IO;

        namespace MergeSpeedPASS
        {
            class Program
            {
                static void Main(string[] args)
                {
                    if (args.Length == 0 || args[0] == "-h" || args[0] == "/h")
                    {
                        Console.WriteLine("Welcome to MergeSpeedPASS. Created by Mladen Prajdic. Uses iTextSharp 5.4.5.0.");
                        Console.WriteLine("Tool to create a single SpeedPASS PDF from all downloaded generated PDFs.");
                        Console.WriteLine("");
                        Console.WriteLine("Example: MergeSpeedPASS.exe targetFileName sourceDir");
                        Console.WriteLine(" targetFileName = name of the new merged PDF file. Must include .pdf extension.");
                        Console.WriteLine(" sourceDir = path to the dir containing downloaded attendee SpeedPASS PDFs");
                        Console.WriteLine("");
                        Console.WriteLine(@"Example: MergeSpeedPASS.exe MergedSpeedPASS.pdf d:\Downloads\SQLSaturdaySpeedPASSFiles");
                    }
                    else if (args.Length == 2)
                        CreateMergedPDF(args[0], args[1]);

                    Console.WriteLine("");
                    Console.WriteLine("Press any key to exit...");
                    Console.Read();
                }

                static void CreateMergedPDF(string targetPDF, string sourceDir)
                {
                    using (FileStream stream = new FileStream(targetPDF, FileMode.Create))
                    {
                        Document pdfDoc = new Document(PageSize.A4);
                        PdfCopy pdf = new PdfCopy(pdfDoc, stream);
                        pdfDoc.Open();
                        var files = Directory.GetFiles(sourceDir);
                        Console.WriteLine("Merging files count: " + files.Length);
                        int i = 1;
                        foreach (string file in files)
                        {
                            Console.WriteLine(i + ". Adding: " + file);
                            pdf.AddDocument(new PdfReader(file));
                            i++;
                        }
                        if (pdfDoc != null)
                            pdfDoc.Close();
                        Console.WriteLine("SpeedPASS PDF merge complete.");
                    }
                }
            }
        }

    Hope it helps you and have fun.

    Read the article

  • What is a good Amazon S3 client?

    - by Eyal
    I've been using the Amazon S3 Management console to browse my S3 files. Unfortunately, it doesn't seem to be able to sort files (in a given bucket) by anything other than whatever its default is (which seems to be by name). I'd like a nice GUI client for seeing these files which will let me sort them by date, so the newest will appear on top. I did find a Firefox plug-in - S3Fox - but it doesn't work for the current version of Firefox.

    Read the article

  • Extract large zip file (50G) on Mac OS X

    - by chingjun
    I was trying to move my files to another hard drive, so I archived all my photos into one large zip file using the Mac OS X built-in compress function. But the file fails to extract. I've tried many programs, but none of them was able to extract it: Mac OS X's own extract utility, StuffIt Expander and 7zip (command line) all failed. The Archive Utility and StuffIt don't seem to support large files, and 7zip's command-line version gave an error saying the archive is unsupported. I had no luck on Windows either, as many of my files have Chinese filenames and wouldn't extract with the correct names there. Could anyone please suggest a program that supports large files, can handle archives created with Mac OS X's compress function, and supports UTF-8 filenames? With or without a GUI is fine. Thank you in advance.
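
    Not from the original post, but worth a try before anything else: the archive was made by OS X's own Archive Utility, and the bundled ditto tool understands that format (including 64-bit sizes and UTF-8 filenames). A sketch (the archive and destination names are placeholders):

        # -x extract, -k treat the source as a PKZip archive
        ditto -x -k huge-photos.zip /Volumes/NewDrive/photos-extracted

    If ditto also fails, The Unarchiver is another commonly suggested free option that copes with large archives and non-ASCII filenames.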

    Read the article

  • Changing default application

    - by ragebunny
    Hey, I have recently installed Sublime Text 2 and I must say it's one of the best text editors I've ever used or even seen in use. I managed to install everything, added it to the Unity menu, and also changed all the entries from gedit.desktop to sublime.desktop, and it works just fine for most files. But I realised that some files still open in gedit, for example PHP files; I checked the default list for PHP but there isn't anything in there. So how would I set the default application for opening PHP files? Update: Sublime also doesn't show up in the Properties menu where I can usually select the default program. Thanks for the help. I'm using 12.10.
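
    Not part of the original question, but one way to set this from the command line is xdg-mime; the exact MIME type reported for .php files and the name of Sublime's .desktop file depend on your system, so check both first. A sketch using the sublime.desktop name mentioned in the question:

        xdg-mime query filetype some-script.php         # see which MIME type .php maps to (often application/x-php)
        xdg-mime default sublime.desktop application/x-php
        xdg-mime query default application/x-php        # confirm the association took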

    Read the article

  • How to determine if a file has been backed up?

    - by Console
    I try to consolidate old drives to new ones of larger capacity. Sometimes files have been renamed, but are otherwise identical. Sometimes an old directory has just a few more files in it than a newer directory with the same name. Sometimes a file has the same name but the size differs. So I often find myself asking the question: Are there any files on this old drive or directory that I haven't already copied to the new drive? I just want to know that I have the files, I don't want to try and sync stuff automatically (Syncing tools tend to just sync, creating duplicate folder structures and other problems, so I prefer to do it by hand). Basically, if an old drive has a file called "foo.bar" ten directories deep, and my new big drive has an identical file called "oldstuff.zip" in the root, I just want a "yes you have it" or "no, unique files exist". Is there a free tool, a script or a quick and easy method (Mac/Unix or Windows) to get the answer?
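
    Not from the original question, but since the files may have been renamed or moved, comparing content hashes rather than names or paths answers exactly this "do I already have it somewhere?" question. A rough sketch using standard Unix tools (the mount points are made up; on a Mac, md5sum is available via coreutils, or substitute another hashing tool):

        # hash every file on both drives (content only; names and locations are ignored)
        find /mnt/old-drive -type f -print0 | xargs -0 md5sum > old.md5
        find /mnt/new-drive -type f -print0 | xargs -0 md5sum > new.md5

        # print the old-drive files whose content does not exist anywhere on the new drive
        awk 'NR==FNR {have[$1]; next} !($1 in have)' new.md5 old.md5

    If the last command prints nothing, every file on the old drive already exists, byte for byte, somewhere on the new drive.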

    Read the article

  • Git for Application Settings

    - by devians
    I use a lot of tools at work and at home, and I'm constantly tweaking them in one location or the other. It's somewhat common practice for people to use Git to version their .vim, .vimrc and other dot files, since you can host your config files on GitHub and get the share-ability and all the other advantages that implies. Being able to version and branch my configs sounds like a grand idea, since I'm always messing about with them. I'd like to discuss the best practice for doing this on a slightly wider scope. How would you implement it? Have your config-files repo in ~/Library/Configs or similar, and symlink the appropriate files? How do you handle preference files for applications, e.g. iTerm2? These files are recreated every time, so would you have to symlink 'backwards' and put a link in the repo, rather than symlinking to the repo, since the application would just delete the symlink?
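
    Not part of the original question, just a minimal sketch of the repo-plus-symlinks layout described above (the ~/Library/Configs path is simply the one the question suggests; any directory works):

        mkdir -p ~/Library/Configs && cd ~/Library/Configs && git init
        # move a real config into the repo, then symlink it back into place
        mv ~/.vimrc ~/Library/Configs/vimrc
        ln -s ~/Library/Configs/vimrc ~/.vimrc
        git add vimrc && git commit -m "track vimrc"

    For preference files that an application rewrites in place (and would therefore clobber a symlink), the copy in the repo can be treated as a snapshot instead: copy the live file into the repo and commit whenever it changes.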

    Read the article

  • how to create java zip archives with a max file size limit [closed]

    - by Marci Casvan
    I need to write an algorithm in Java (for an Android app) to read a folder containing more folders, each of those containing images and audio files, so the structure is: mainDir/subfolders/myFile1.jpg. It must be in Java; something like a Perl script is not an option. It would preferably work on the compressed archive, in order to squeeze as many files as possible in before mailing the zip. Just a normal zip (no jar). My problem is that I need to limit the size of each archive to 16 MB and, at runtime, create as many archives as needed to contain all the files from my mainDir folder. I tried several examples from the net and read the Java documentation, but I can't manage to understand it all and put it together the way I need it. Has someone done this before, or does anyone have a link or an example for me? I resolved the reading of the files with a recursive method, but I can't write the logic for the zip creation. I'm open to suggestions or, better, a working example. EDIT: FileNotFoundException (no such file or directory): this was my initial post at Stack Overflow. I got an answer to it, but I can't set the size of the ZipEntry, the logic doesn't work, and when extracting my files from the zip I get a "compression method not supported" error.
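
    Not an answer from the thread, just a rough sketch of one way to do the splitting in plain java.util.zip: walk the tree, track how many bytes have gone into the current archive, and start a new archive before the next file would push it past the limit. Images and audio barely compress, so the uncompressed sizes are used as an approximate budget (a single file larger than the limit still ends up alone in an oversized archive):

        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;
        import java.util.List;
        import java.util.stream.Collectors;
        import java.util.zip.ZipEntry;
        import java.util.zip.ZipOutputStream;

        public class ZipSplitter {
            static final long MAX_BYTES = 16L * 1024 * 1024;   // 16 MB budget per archive

            public static void main(String[] args) throws IOException {
                Path mainDir = Paths.get(args[0]);
                List<Path> files = Files.walk(mainDir)
                                        .filter(Files::isRegularFile)
                                        .collect(Collectors.toList());
                int part = 1;
                long used = 0;
                ZipOutputStream zos = newArchive(part);
                for (Path file : files) {
                    long size = Files.size(file);
                    if (used > 0 && used + size > MAX_BYTES) {   // start a fresh archive
                        zos.close();
                        zos = newArchive(++part);
                        used = 0;
                    }
                    zos.putNextEntry(new ZipEntry(mainDir.relativize(file).toString()));
                    Files.copy(file, zos);                       // stream the file into the entry
                    zos.closeEntry();
                    used += size;
                }
                zos.close();
            }

            static ZipOutputStream newArchive(int part) throws IOException {
                return new ZipOutputStream(new FileOutputStream("archive-" + part + ".zip"));
            }
        }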

    Read the article

  • How can I delete a specific file from a set of results using the find command in Linux?

    - by PeanutsMonkey
    I have the following command that lists all files with the extension .doc, .docx, etc.: find . -maxdepth 1 -iname \*.doc\* The command returns numerous files, some of which I would like to delete. So, for example, the results returned are: Example.docx Dummydata.doc Sample.doc I would like to delete Sample.doc and Dummydata.docx. How do I delete the files using the -exec option? Am I able to pass in the names of the files, e.g. rm Dummydata.docx Sample.doc, so that the command would look as follows: find . -maxdepth 1 -iname \*.doc\* -exec rm Dummydata.docx Sample.doc Can I pass the names of the files within {} after rm? e.g. find . -maxdepth 1 -iname \*.doc\* -exec rm {Dummydata.docx} Sample.doc Is there a better way of doing it?
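
    Not from the original question, but for reference: {} isn't a place to type filenames; find substitutes each matched path into it. To delete only specific names, make the name test part of the find expression (or skip find entirely for two known files). A sketch:

        # let find do the matching, then hand every match to rm
        find . -maxdepth 1 \( -iname 'Dummydata.doc*' -o -iname 'Sample.doc' \) -exec rm {} +

        # for two known files in the current directory, plain rm is simpler
        rm Dummydata.docx Sample.doc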

    Read the article

  • On Windows Server 2003, permissions are not propagating to a group that is a member of a group

    - by Joshua K
    Windows Server 2003 on i386. The FTP server is running as the SYSTEM user/group. Some files we want served (read and write) are owned by the group 'ftp'; ftp has full read/write/whatever permissions on those files and directories. SYSTEM can't read/write those directories, so I added SYSTEM to the 'ftp' group. Windows happily complied, but even after restarting FileZilla, it still could not read/write those files. Is there any way to do what we want without "re-permissioning" all those files? Running the FTP server as 'ftp' isn't really an option, because it also serves files that are owned by SYSTEM (and not ftp). Sigh... :) Any insights?

    Read the article

  • Is there a RAR extractor (for multiple rar files like .r00 etc.) that will use all of my quad cores?

    - by Christopher Done
    I've got a quad-core Intel processor. I've got a big file split into little ones as RAR files (foo.r00, foo.r01, etc.) which the RAR program extracts into one file/directory. Is there a RAR program where I can specify something like "use four cores" for the extraction process? At the moment it sits there using 100% of one core. I recognise the bottleneck might be my hard drive anyway, but I don't see a lot of HD usage and suspect the decompression process is more intensive than waiting on I/O. For example, GNU Make accepts a (-j, I think) argument to tell it how many cores to use, which I used to compile PHP 6 really quickly.

    Read the article

  • File store: CouchDB vs SQL Server + file system

    - by Andrey
    I'm exploring different ways of storing user-uploaded files (all are MS Office documents or similar) on our high-load web site. It's currently designed to store documents as files and have a SQL database store all the metadata for those files. I'm concerned about outgrowing the storage server, and about SQL Server performance once the number of documents reaches hundreds of millions. I've been reading a lot of good information about CouchDB, including its built-in scalability and performance, but I'm not sure how storing files as attachments in CouchDB would compare to storing files on a file system in terms of performance. Has anybody used CouchDB clusters for storing LARGE numbers of documents in a high-load environment?

    Read the article

  • Gzip compress offline?

    - by shoosh
    I've configured my site to serve compressed content by putting this line in .htaccess: AddOutputFilterByType DEFLATE text/html text/plain text/xml text/javascript text/css application/javascript application/json This works perfectly for almost all files, except a few large JSON files that are above 200 KB. For some reason they are not being compressed; I can see that they aren't by using the Net tab in Firebug and the Network section in Chrome. So as a workaround I thought I could compress these files offline and have Apache serve them pre-compressed. What tool should I use to compress them? Is the Linux gzip the one? Any special flags or anything else I should use? What should I put in .htaccess so that the server knows to serve these files with Content-Encoding: gzip?
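
    Not from the original question; a sketch of the usual pre-compression setup, assuming mod_rewrite and mod_headers are enabled (the file names are illustrative):

        # compress offline with plain gzip; -9 for maximum compression, -c keeps the original file
        gzip -9 -c big-data.json > big-data.json.gz

        # .htaccess: serve the .gz copy to clients that accept gzip
        RewriteEngine On
        RewriteCond %{HTTP:Accept-Encoding} gzip
        RewriteCond %{REQUEST_FILENAME}.gz -f
        RewriteRule ^(.*\.json)$ $1.gz [L]
        <FilesMatch "\.json\.gz$">
            ForceType application/json
            Header set Content-Encoding gzip
            Header append Vary Accept-Encoding
        </FilesMatch>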

    Read the article

  • Ubuntu 13.04 Rename Computer

    - by Sourabh
    How can I rename my computer? Renaming it in /etc/hosts and /etc/hostname does something weird. Before renaming it, I am able to open these files in Sublime using sudo subl /etc/hosts, but when I rename my computer (using nano) and then open either of these files using subl, I get this message: No protocol specified (sublime_text:20071): Gtk-WARNING **: cannot open display: :0.0 So I guess renaming in the above files is not the only thing I have to do. PS: If I rename using Sublime, after renaming one of the files I get the same message when I try to open the other file.
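
    Not from the original question, but for reference: on 13.04 the rename really is just those two files; the Gtk "cannot open display" warning typically appears because the running X session is still authorised under the old hostname, and it goes away after logging out and back in (or rebooting). A sketch (old-name and new-name are placeholders):

        sudo sed -i 's/old-name/new-name/g' /etc/hostname /etc/hosts   # or edit both files by hand
        sudo hostname new-name      # apply the new name without waiting for a reboot
        # then log out and back in (or reboot) so graphical apps pick up the new hostname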

    Read the article

  • Optimal Compression for Speech

    - by ashes999
    I'm designing a game that depends heavily on audio; I will have some 300+ speech files (most of them just a word or two long). This can very quickly escalate the size of my final game. What's the optimal way to encode/compress speech files to keep the size minimal without introducing audio artifacts? Please address both per-file compression/encoding and zipping/compressing the set of all speech files together, because I'm not sure which factor (or a combination of both) will give me the best results. Edit: I need this to run in Silverlight and on Android, so I'm presumably stuck with MP3 as my only option (other than uncompressed WAV files).
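
    Not part of the original question; since MP3 is the common denominator for Silverlight and Android, a typical starting point for short speech clips is low-bitrate mono MP3 (speech needs far less bandwidth than music). A sketch with ffmpeg (the bitrate and sample rate are reasonable starting values to tune by ear):

        # mono, 22.05 kHz, 48 kbit/s MP3
        ffmpeg -i hello.wav -ac 1 -ar 22050 -b:a 48k hello.mp3

    Zipping the encoded files afterwards gains almost nothing, since MP3 data is already compressed; an archive is mostly useful for packaging, not for further size reduction.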

    Read the article
