Search Results

Search found 37883 results on 1516 pages for 'sparse files'.


  • U1 music shows unknown artist, how can I make it recognize m4a files?

    - by bisi
    Hello all, I am in the process of uploading my music library to U1 as a backup, but I figured, why not also enjoy Ubuntu One Music on my iPhone? After a few difficulties getting started, the upload is now in progress, but I've noticed that a huge percentage of the files land in the unknown artist folder, and I believe it is all of my m4a files. They play fine, but without any information. Coming from an iTunes background, and having bought the majority of my music from the iTunes Store, I wonder how I could make this work, easily? I am on Maverick (afaik), but About Ubuntu shows 11.04. I use Banshee as my music manager, and I monitor my sync using Ubuntu One preferences, ubuntuone-indicator and magicicada. The total file size of my music folder is 38.9GB. Thank you for your help!! And apologies if I couldn't find a thread where this was already covered...

    Read the article

  • Rar.exe CLI - add files from directory without directory structure?

    - by Vercas
    I need to grab every file in a folder and shove it into a RAR archive. This is my current method: "C:\Program Files\WinRAR\Rar.exe" a -r -md2m -s -m5 -ma4 -t ..\Releases\vCommands.rar bin\ ... where bin is my folder. I tried a suggestion written for another program too, and the results are the same. To be clear, here's a picture: to the top-left, in the .rar file, there is a bin directory which contains all the files; to the bottom-right, in the .7z file, all those files are in the archive root. What I need is to put all those files in the .rar archive root, instead of in a folder, without having to execute my batch file inside that bin folder.
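
    Rar.exe has a documented switch for exactly this: -ep1, "exclude base directory from names". A minimal sketch reusing your other switches; treat it as a starting point rather than a verified command line:

        "C:\Program Files\WinRAR\Rar.exe" a -r -ep1 -md2m -s -m5 -ma4 -t ..\Releases\vCommands.rar bin\*

    With -ep1 the bin\ prefix is stripped, so the files land in the archive root; plain -ep would additionally flatten any subdirectory structure.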

    Read the article

  • Why are the 'libgnomevfs' files under /usr/include/gnome-vfs-2.0?

    - by George Edison
    Most applications, including the gnomevfs headers themselves, expect the files to be under /usr/include/libgnomevfs, but Ubuntu has them under /usr/include/gnome-vfs-2.0/libgnomevfs. Why? The package I'm referring to is called libgnomevfs2. Inside /usr/include/gnome-vfs-2.0/libgnomevfs/gnome-vfs.h we find:

        #include <libgnomevfs/gnome-vfs-acl.h>
        #include <libgnomevfs/gnome-vfs-address.h>
        #include <libgnomevfs/gnome-vfs-async-ops.h>
        #include <libgnomevfs/gnome-vfs-cancellation.h>
        ...

    meaning that even the headers themselves expect the files to be in that location, and nothing that includes this file will work. Am I missing something, or is this a glitch?
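
    This is normal for GNOME libraries: the extra gnome-vfs-2.0 directory is a versioned include root that consumers are expected to discover through pkg-config rather than hard-code. A sketch, assuming the libgnomevfs2-dev package (which ships the .pc file) is installed:

        $ pkg-config --cflags gnome-vfs-2.0
        -I/usr/include/gnome-vfs-2.0 ...

        $ gcc myprog.c $(pkg-config --cflags --libs gnome-vfs-2.0)

    The -I/usr/include/gnome-vfs-2.0 flag is what makes #include <libgnomevfs/gnome-vfs.h> resolve.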

    Read the article

  • How can I protect files on my NGiNX server?

    - by Jean-Nicolas Boulay Desjardins
    I am trying to protect files on my server (multiple types) with NGiNX and PHP. Basically I want people to have to sign in to the website if they want to access static files like images. Dropbox does this very well: it forces you to sign in before you can access any static file you put on their server. I thought about using the NGiNX Perl module and writing a Perl script that would check the session, to see whether the user is signed in, before giving them access to a static file. But I would prefer using PHP, because all my code runs under PHP and I am not sure how to check a session created by PHP from Perl. So basically my question is: how can I protect static files of any type so that accessing them requires the user to be signed in, with a valid session, checked by a PHP script?
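
    One common pattern for this is NGiNX's internal locations combined with the X-Accel-Redirect header, so PHP does the session check and NGiNX does the file serving. A minimal sketch; the paths and session key are assumptions, not drop-in config:

        # nginx.conf: files under /protected/ can only be reached
        # via an X-Accel-Redirect issued by the backend
        location /protected/ {
            internal;
            alias /var/www/protected/;
        }

        <?php
        // download.php (hypothetical gatekeeper script)
        session_start();
        if (empty($_SESSION['user_id'])) {     // assumed session key
            header('HTTP/1.1 403 Forbidden');
            exit;
        }
        $file = basename($_GET['file']);       // basename() blocks path traversal
        header('X-Accel-Redirect: /protected/' . $file);

    The browser always requests download.php; NGiNX then serves the static file itself, so you keep its efficient file handling without exposing the files directly.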

    Read the article

  • Why are some recovery tools still able to find deleted files after I purge the Recycle Bin, defragment the disk and zero-fill the free space?

    - by Ivan
    As far as I understand, when I delete a file (without using the Recycle Bin), its record is removed from the file system's table of contents (FAT/MFT/etc.), but the contents of the disk sectors the file occupied remain intact until those sectors are reused to write something else. When I use a file-recovery tool, it reads those sectors directly and tries to rebuild the original file. What I can't understand is why recovery tools are still able to find deleted files (with a reduced chance of rebuilding them, though) after I defragment the drive and overwrite all the free space with zeros. Can you explain this? I thought zero-overwritten deleted files could only be found with special forensic-lab magnetic scanning hardware, and that the complex wiping algorithms (overwriting free space multiple times with random and non-random patterns) only made sense to prevent such a physical scan from succeeding, but in practice it seems that a plain zero-fill is not enough to wipe all traces of deleted files. How can this be?

    Read the article

  • How can I send super large files directly to another computer over the Internet for free?

    - by Cruise
    I regularly need to transfer very large files (30 GB) of financial statistics to my friend. I don't have any problem with bandwidth: it is very broad here. I did some research in the area:

    1. I would not use FTP, as it is very tricky to get working behind NAT.
    2. I would not use Skype/MSN/ICQ, as they are not designed for file transfer and underperform on huge files.
    3. I would not use file-sharing services, as I would need to pay for big files (30 GB is a problem here) and I don't like holding any piece of my data on a third-party server.

    So I need some smart tool that will do what I need: sending files directly browser-to-browser, not browser-server-browser. Is that so complex? Is there some web application on the Internet that can do this?
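
    If either machine can accept one inbound port (for example via a router port-forward), plain netcat already does direct machine-to-machine transfer with no third-party server. A sketch, with host and port as placeholders (some netcat variants want nc -l 9000, without -p):

        # on the receiving side, listening on port 9000
        nc -l -p 9000 > statistics.tar.gz

        # on the sending side
        nc receiver.example.com 9000 < statistics.tar.gz

    It is not browser-to-browser, but it is direct, free, and indifferent to file size.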

    Read the article

  • On Linux, how can I make a list of files owned by a particular owner and then fix the group and owner?

    - by Stuart Woodward
    I have a deep and complex file system where some files have been accidentally written by root. I want to change the ownership of those files back to the original owner in one go. I am playing with commands like: find /folder -type f | xargs ls -l | grep "root root" but a lot of garbage comes out too. I want to make a list first, and then change only the files on that list, after confirmation.
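
    find can match on ownership directly, which avoids parsing ls output and the garbage that comes with it. A sketch; originaluser:originalgroup is a placeholder for the real owner:

        # 1. build the list and review it
        find /folder -type f -user root > /tmp/root-owned.txt
        less /tmp/root-owned.txt

        # 2. after confirmation, fix owner and group for everything on the list
        xargs -d '\n' chown originaluser:originalgroup < /tmp/root-owned.txt

    The -d '\n' flag (GNU xargs) keeps file names with spaces intact; it would still break on names containing newlines.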

    Read the article

  • What scripting program would you choose to recover deleted and missing files?

    - by Steven Graf
    For a private project I'm looking for a command-line tool to scan for and recover files. I'm working on GNOME 3 (but I could also change my OS if it helps me reach my goal) and must be able to find and recover files on attached devices with formats such as NTFS, FAT32, Mac OS Extended and ext3. Is there a command-line tool that covers all of them, or do I need different programs? Can you recommend command-line tools for this kind of task? Is one of you willing and able to show me some examples and teach me further?
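
    One candidate worth testing is photorec from the testdisk package: it carves files by content rather than by file system structure, so it handles NTFS, FAT32, HFS+ and ext3 alike. It is menu-driven rather than fully scriptable, though; a sketch:

        sudo apt-get install testdisk
        sudo photorec /dev/sdb1        # device name is an example

    For ext3 specifically, extundelete is another command-line option.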

    Read the article

  • How to read data from a large number of files in a folder? [closed]

    - by Gary Dhillon
    I seem to be having some trouble figuring out a solution for a problem. My code is supposed to read a lot of data from a bunch of files. I've been thinking of two different approaches:

    1. Ask the user whether they would like to examine the next file or quit the program. (I believe this is simpler and would take less time to run through.)
    2. Read through all the files and output the results for each of them, then a shared result for all of them. (I think this would be better for what I've been asked to do, and it saves the user some hassle.)

    If anyone can tell me how to code either of these in C++, I would be very grateful. Here is a sample of a file:

        0 -- 19 weight
        0 -- 20 weight

    I use this to determine density; the weight is a number, which I may ignore.
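
    Here is a sketch of the second approach in C++17 using <filesystem>. It assumes, based on your sample, that each line has the form "u -- v w" with a numeric weight; the density formula depends on how you count vertices, so this only tallies edges per file and overall:

        #include <filesystem>
        #include <fstream>
        #include <iostream>
        #include <string>

        namespace fs = std::filesystem;

        int main(int argc, char* argv[]) {
            if (argc != 2) { std::cerr << "usage: edges <folder>\n"; return 1; }

            long long total = 0;
            for (const auto& entry : fs::directory_iterator(argv[1])) {
                if (!entry.is_regular_file()) continue;

                std::ifstream in(entry.path());
                long long edges = 0;
                int u, v;
                std::string dashes;   // consumes the "--" token
                double weight;        // read, then ignored
                while (in >> u >> dashes >> v >> weight)
                    ++edges;

                std::cout << entry.path().filename().string()
                          << ": " << edges << " edges\n";
                total += edges;
            }
            std::cout << "all files combined: " << total << " edges\n";
        }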

    Read the article

  • Can I make Windows open Excel XML files with Excel without opening Internet Explorer?

    - by Sorin Sbarnea
    I want to be able to open Excel XML files in Excel, but without assigning the XML extension directly to Excel. There are lots of XML files that are not Excel files, and I don't want to open all of them in Excel. The file has the proper header for opening in Excel, but currently it opens Internet Explorer, which asks me whether I want to open the file with Excel, save it, or cancel. I just want to open it without those two extra annoying windows.
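
    One workaround is to give the spreadsheet XML files their own extension and associate only that with Excel, since Windows associates by extension, not by content. A sketch from an elevated command prompt; the Office path is an assumption for your install:

        ren report.xml report.xlsxml
        assoc .xlsxml=Excel.XmlSpreadsheet
        ftype Excel.XmlSpreadsheet="C:\Program Files\Microsoft Office\Office12\EXCEL.EXE" "%1"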

    Read the article

  • Is there any method of backing up Google Drive files in some sort of versioning system?

    - by VictorKilo
    Backstory: My company is utilizing Google Drive for our shared files. Each user has their own Drive account. In addition, we have a corporate Drive account which holds documents that are shared to each user. Each folder is shared to different users depending on their permissions and positions in the company. Many users are able to add files and update folders within this shared Drive account. This is fine. What is not fine is when someone deletes something that they shouldn't. I have little to no way of knowing when a file is deleted wrongfully. Furthermore, anything that gets deleted goes into the trash bin of the file's creator, so I can't just restore it from the trash. Question: Is there any method of backing up Google Drive files in some sort of versioning system that would allow me to revert files back to defined points in time? What I have tried: I currently have this corporate Drive account synced to my personal computer through the Google Drive application. Each night, I run a backup of the files using Windows "Backup and Restore." This at least lets me get back files that are lost, but I would like a cleaner method: it's very possible that I don't have the very latest version of a document on my computer when the utility runs.
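
    Until a real versioning tool is in place, date-stamped robocopy snapshots of the synced folder give crude point-in-time restores without overwriting the previous night's copy. A sketch; the paths are assumptions, and the %date% parsing is locale-dependent, so adjust for your system:

        set STAMP=%date:/=-%
        robocopy "C:\Users\victor\Google Drive" "D:\DriveSnapshots\%STAMP%" /E /R:1 /W:1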

    Read the article

  • How can I find which files an Apache process is writing to?

    - by Haluk
    We have an Apache process that becomes I/O-bound from time to time. Using atop, we can see it is a write operation. Using lsof -p <PID>, we can see the list of files opened by the httpd process. First we thought the log files must be the problem, so we turned them off just to test; however, the write operations still continue. We will continue testing a few other things. For instance, we use PHP session variables a lot, so maybe the PHP session files are receiving all the writes. But is there a way to quickly identify the files being written to by the httpd process? This way we can focus our efforts on those files. UPDATE: We used the strace command as suggested. Here are two lines from the output:

        write(23, "\27\0\0\0\3SET CHARACTER SET utf8", 27) = 27
        write(23, "\17\0\0\0\3SET NAMES utf8", 19) = 19

    We do not have a MySQL process on this server. So does strace also show what is being written to a network socket? UPDATE 2: During high I/O load, the process which consumes most of the write resources gives the following output to strace -e trace=write -p <PID>:

        --- SIGCHLD (Child exited) @ 0 (0) ---
        write(9, "!", 1) = 1
        write(19, "OPTIONS * HTTP/1.0\r\nUser-Agent: Apache (internal dummy connection)\r\n\r\n", 70) = 70

    However, I cannot figure out where these are being written to.
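
    The first number in write(23, ...) is a file descriptor, and it can be resolved to a file or socket directly; the "SET NAMES utf8" strings are MySQL wire protocol, so fd 23 is almost certainly a TCP connection to a MySQL server elsewhere. Two ways to check, with <PID> as in your commands:

        # show only descriptor 23 of that process (-a ANDs the -p and -d selections)
        lsof -p <PID> -a -d 23

        # or read the symlink the kernel keeps for every open descriptor
        ls -l /proc/<PID>/fd/23

    The same trick applied to fd 9 and fd 19 in the second trace should show the pipe and socket behind the internal dummy connections.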

    Read the article

  • Is it possible to recover older, previously deleted files with R-Studio?

    - by SteveO
    The files and directories on one of my NTFS partitions were recently wiped out. I used R-Studio to scan the partition, and it did find many files, actually more than the capacity of the partition; this is because R-Studio also found files that had been deleted even earlier. So I wonder: is it possible to single out the files and directories deleted most recently, rather than those deleted earlier, for recovery? R-Studio has a free demo version, for which scanning is free but recovery isn't. It is downloadable from http://www.data-recovery-software.net/Data_Recovery_Download.shtml and its manual is here: http://www.r-tt.com/downloads/Recovery_Manual.pdf. I have tried my best to search the manual for answers, but failed to find one. Their technical support is not as good as their software, and usually unhelpful, in my opinion. Thanks!

    Read the article

  • How to move and delete all files and subdirectories with the command line in Windows 7?

    - by user1285419
    I am looking for a way to move all files and subfolders within a given directory somewhere else and, after the move, delete the original folder. For example, suppose the current path contains a folder called FOLDERA; I am trying to move all files and subfolders from FOLDERA to the current path and then remove FOLDERA, and I need to do this from the command line. I tried the MOVE command, but I find that it only moves files. Any way to do this? Thanks.
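
    robocopy, which ships with Windows 7, can move a whole tree and delete the source in one pass. A sketch, run from the current path:

        robocopy FOLDERA . /E /MOVE

    /E includes empty subfolders and /MOVE deletes files and folders from the source after copying, so FOLDERA disappears on its own. An xcopy FOLDERA . /E /H /Y followed by rmdir /S /Q FOLDERA would do the same in two steps.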

    Read the article

  • How to give Apache access to files in my home directory?

    - by Mark Smith
    I'm an Ubuntu Linux user (Lucid Lynx) running Apache. I have a collection of zip files in a folder in my home directory (~/zip_files) that I would like to link to through Apache, so that when a visitor to the website I'm hosting clicks a link to one of the zip files, they can download it through the web. How can I give Apache access to the files and set the permissions? Thanks, I'm new to Linux!
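
    Two things are needed: Apache's user must be able to descend into the folder, and Apache must map a URL onto it. A sketch for the Apache 2.2 shipped with Lucid; the paths and URL are assumptions:

        chmod o+x /home/mark                 # lets Apache traverse into your home
        chmod -R o+rX /home/mark/zip_files   # read files, enter subdirectories

        # in a vhost, or a file under /etc/apache2/conf.d/
        Alias /zips /home/mark/zip_files
        <Directory /home/mark/zip_files>
            Order allow,deny
            Allow from all
        </Directory>

    After sudo service apache2 reload, links of the form http://yourhost/zips/file.zip should work.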

    Read the article

  • Team Foundation Server: debug symbols (pdb files) generated in a Release build? Fix it.

    - by Gopinath
    Yesterday I set up TFS for my .NET playground website to implement continuous integration and deployments. After a successful build, I noticed that debug symbols (pdb files) were generated even though TFS is configured to build in Release mode. A bit of analysis showed this to be TFS's default behavior: it generates debug symbols (pdb files) unless we pass the argument DebugType=none. Here are the steps to pass the DebugType parameter to TFS's MSBuild: go to Team Explorer; select Build Definition >> Edit Build Definition; switch to the Process tab; navigate to the Advanced section and locate MSBuild Arguments; add the following: /p:Configuration=Release /p:DebugType=none

    Read the article

  • How to Hide or Delete Files created by Vim in Windows?

    - by Hayek
    Whenever I open a file with Vim, the program automatically creates a copy of the file ending with a tilde~ When I'm done editing a few files, the folder is littered with extraneous files ending in ~ Is it possible to have Vim automatically remove said files? Or as an alternative, is it possible to have Windows hide them?
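
    Those tilde files are Vim's backup copies, and Vim can be told not to write them, or to collect them in one place, from your vimrc (_vimrc on Windows). A sketch:

        " stop writing file~ backups entirely
        set nobackup
        set nowritebackup

        " or keep them, but herd them into one directory instead
        " set backupdir=$TEMP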

    Read the article

  • Why does lighttpd keep static files in cache, even when they are modified on disk?

    - by Pixelastic
    I am using lighttpd to serve static files. I have a bunch of images in a directory that I update regularly. This changes the file content (and file size) as well as the modification date, but not the filenames. When I access the files over HTTP, the updates are not taken into account and lighty serves the old file. If I manually rename a file to something different, lighttpd returns a 404 error, and if I rename it back, I get the correct, updated version. So it seems lighty is using some caching mechanism of its own (which is fine) to return static files; unfortunately, that mechanism doesn't seem to notice when files are modified. I checked with Wireshark, and my browser really is requesting the file, so this is not a browser caching issue. The server returns a 200 OK when the file is requested from an empty cache, and a 304 Not Modified otherwise, as expected, but the file comes back with a wrong Last-Modified header that does not reflect the real last modification date. Maybe there is some config directive that I am not aware of? I would like the files returned by lighty to reflect the changes made on disk directly, or at least to be able to invalidate its cache.
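
    This matches lighttpd's stat cache: it caches stat() results (including mtime, which feeds Last-Modified) and can keep serving the old metadata. The engine is configurable; a sketch, assuming lighttpd 1.4.x:

        # lighttpd.conf: re-stat files on every request
        server.stat-cache-engine = "disable"

        # or, where FAM/gamin is available, invalidate on change notification:
        # server.stat-cache-engine = "fam"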

    Read the article

  • Looking for a Unix tool/script that, given an input path, will compress every batch of uncompressed text files into a single ~100MB gzip file

    - by newToFlume
    I have a dump of thousands of small text files (1-5 MB each), each containing lines of text. I need to "batch" them up, so that each batch has a fixed size, say 100 MB, and compress that batch. A batch could be either a single file that is just a 'cat' of the contents of the individual text files, or the individual text files themselves. Caveats:

    - unix split -b will not work here, as I need to keep lines of text intact; using its lines option is complicated, as there is a large variance in the number of bytes per line.
    - The batches need not be strictly fixed in size, as long as they are within 5% of the requested size.
    - The lines are critical and should not be lost: I need to confirm that the input made its way to the output without loss, ideally with a rolling checksum (something like CRC32, but better/"stronger" in the face of collisions).

    A script should do nicely, but this seems like a task someone has done before, and it would be nice to see some code (preferably Python or Ruby) that does at least something similar.
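
    A sketch in Python of the "cat then gzip" variant: it walks the input path, cuts a new batch whenever the next file would push the batch past 100 MB, and prints a SHA-256 digest of the exact bytes written (a cryptographic digest rather than a rolling checksum, on the assumption that collision resistance is what matters):

        import gzip
        import hashlib
        import os
        import sys

        BATCH_BYTES = 100 * 1024 * 1024   # target size; small inputs keep us near it

        def text_files(root):
            for dirpath, _, names in os.walk(root):
                for name in sorted(names):
                    yield os.path.join(dirpath, name)

        def write_batch(paths, prefix, index):
            digest = hashlib.sha256()
            out_name = '%s-%04d.gz' % (prefix, index)
            with gzip.open(out_name, 'wb') as out:
                for path in paths:
                    with open(path, 'rb') as f:
                        for line in f:        # line-oriented copy keeps lines intact
                            digest.update(line)
                            out.write(line)
            print(out_name, digest.hexdigest())

        def main(root, prefix):
            batch, size, index = [], 0, 0
            for path in text_files(root):
                n = os.path.getsize(path)
                if batch and size + n > BATCH_BYTES:
                    write_batch(batch, prefix, index)
                    batch, size, index = [], 0, index + 1
                batch.append(path)
                size += n
            if batch:
                write_batch(batch, prefix, index)

        if __name__ == '__main__':
            main(sys.argv[1], sys.argv[2])

    Running the same digest over the input files in the same order verifies the copy end to end.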

    Read the article

  • Improve backup performance by watching files added/modified in given directories?

    - by OverTheRainbow
    I use SyncBackSE on Windows to back up files between two hard drives daily. Every time the application starts, it scans every single file in the directories it watches before copying the files that were added or modified. To improve performance, I was wondering whether there is a Windows backup application that hooks into Windows to keep track of files added or modified in given directories, so that it only needs to go through that list when it comes time for a backup. Thank you.
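
    For what it's worth, NTFS already records every file change in its USN change journal; backup tools that advertise "continuous" or "delta" detection typically read that journal instead of rescanning, so it is the feature to look for. You can confirm a volume has a journal from an administrator prompt:

        fsutil usn queryjournal C: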

    Read the article

  • Is there a good, free way to fix broken/corrupt .wmv files?

    - by chbtn
    I've recovered some files from an HDD; they weren't supposed to be deleted in the first place, but now they have seeking problems or crash the players. Since they have the right size, I'm thinking it might be a problem of a corrupt index/header, so I'm trying to find a way to fix them. It's easy to find examples of fixing corrupt .avi files with mencoder, but .wmv seems trickier. Also, I realize there might not be a way to fix these files, but I figure I might as well try. As far as players go, I've tried opening them with VLC, MPlayer and Windows Media Player. I can use anything on Windows XP/7 and Ubuntu, as long as it's free. Since the files are 200 MB+ and there are quite a few, I don't think trial software would work.
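
    A free first attempt is remuxing with ffmpeg, which rebuilds the container (and thus the index/header) without re-encoding, so it is fast even on large files; whether it helps depends on how damaged the streams themselves are:

        ffmpeg -i broken.wmv -acodec copy -vcodec copy remuxed.wmv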

    Read the article

  • How to recover disk and files after 10.04 boot failure?

    - by K R Jawaharlal
    I have a 1TB HDD with four Windows XP partitions and a 120GB HDD with Ubuntu 10.04. While working in Ubuntu, after a hang and a failed shutdown, I switched off the system. It then failed to boot into Ubuntu and stopped at the initramfs prompt. After that, I tried to repair it from the boot stage, but by mistake I used the partition number instead of the HDD number, which damaged Windows as well. Windows XP has since been reinstalled and is running. When I boot with 12.04, it detects the 120GB HDD but is unable to mount it, so I am unable to access the files. I would like to revive the disk and recover the files. I would appreciate any help.
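
    From the 12.04 session, it is worth identifying the partition, repairing the filesystem, and then mounting read-only to copy the files off; /dev/sdb1 below is only an example device name:

        sudo fdisk -l                      # find the 120GB disk's partition
        sudo fsck -f /dev/sdb1             # repair the ext3/ext4 filesystem
        sudo mount -o ro /dev/sdb1 /mnt    # mount read-only, then copy from /mnt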

    Read the article

  • In Windows XP Professional, is there a limit on the number of files that can be contained in a single folder? [duplicate]

    - by Andrew
    This question already has an answer here: How many files can a Windows folder contain? I am running Windows XP Professional, Service Pack 3. Right now I have 4,398 files in a single folder, and Windows XP seems to read it fine. How many more files can I place in this same folder, either theoretically or practically? Thanks for your time.

    Read the article

  • How do I prevent my swf files from being hotlinked, downloaded, etc.?

    - by undefined
    I have swf files that are embedded in a PHP page using SWFObject. These swf files are in the same directory as my PHP files. For example, www.myurl.com/index.php embeds www.myurl.com/flashfile.swf; index.php and flashfile.swf are in the same directory. However, I want to prevent people from being able to type in www.myurl.com/flashfile.swf and view the swf directly. I want the browser to deny access to this file unless it has been embedded by the PHP page. Should I move my swfs to another folder and protect that folder somehow, perhaps with an .htaccess file? I am running Apache on a Linux machine. While my main concern is the swf files, I would like to protect the graphics used on the site too. All help appreciated, thanks!
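
    The usual deterrent is a mod_rewrite Referer check in .htaccess, which refuses requests that do not come from your own pages. Note that the browser still downloads the swf in order to play it, and the Referer header can be empty or forged, so this blocks casual hotlinking rather than determined downloading. A sketch, with your domain substituted in:

        RewriteEngine On
        RewriteCond %{HTTP_REFERER} !^$
        RewriteCond %{HTTP_REFERER} !^https?://(www\.)?myurl\.com/ [NC]
        RewriteRule \.(swf|gif|jpe?g|png)$ - [F]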

    Read the article
