Search Results

Search found 37883 results on 1516 pages for 'sparse files'.


  • Good and easy way to share files on local machine

    - by jb
    I would like to have a directory with the following properties:

    - Many users can copy files into it.
    - These files can then be deleted or changed by any of those users (user A can delete or modify a file that user B copied into the directory).
    - It can't be done with normal file permissions alone, because permissions are retained on copy.

    Here is what I found on the net: brainstorm idea blueprint. Some use cases:

    - Sharing music on a local machine.
    - Simple git repository sharing (just make a bare repository writeable by many people); I know there are solutions like gitosis.
    - Allowing many developers to modify a test instance of a PHP app without giving them root (I guess they would copy files in); I'm leading a team of nonprofit junior developers and I need to keep this one simple!

    EDIT: AFAIK setting the SGID bit is not enough; it only affects newly created files, and the basic workflow for these use cases involves copying and other operations, which leave a file's gid unchanged. See the sketch below.
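
    One common partial approach is a setgid directory combined with default POSIX ACLs; a minimal sketch, assuming a hypothetical group named "share" (the caveat above still applies: files moved, rather than copied, into the directory keep their original group and mode):

        sudo groupadd share                            # hypothetical group name
        sudo mkdir -p /srv/shared
        sudo chgrp share /srv/shared
        sudo chmod 2775 /srv/shared                    # setgid bit: new files inherit the group
        # default ACLs so files created or copied here become group-writable
        sudo setfacl -d -m group:share:rwx /srv/shared
        sudo setfacl -m group:share:rwx /srv/shared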

    Read the article

  • Eclipse: always keep files updated

    - by AK01
    I keep lots of files/editors open in Eclipse. I also love using git stash and other git commands that essentially change the contents of my open files. Is there an Eclipse feature or plugin that will always keep the contents of my open files up to date and live? Currently if I put focus in an out of sync editor, I get an awkwardly worded dialog that I have to parse carefully every time. I wish it would just keep me synced like Textmate does.

    Read the article

  • OSB and Ubuntu 10.04 - Too Many Open Files

    - by jeff.x.davies
    When installing the latest Oracle Service Bus (11gR1PS3) onto my Ubuntu 10.04 system, the Eclipse IDE was complaining about there being too many open files. The Oracle Service Bus and the Oracle Enterprise Pack for Eclipse (aka OEPE) do make use of a lot of files. By default, Ubuntu restricts each user to 1024 open files; a much more realistic number for OSB development is 4096. Changing the file limit in Ubuntu is fairly simple (if arcane). You will need to modify two different files and then restart your machine. First, you need to modify the limits.conf file as the root user. Open a terminal window and enter the following command:

        sudo gedit /etc/security/limits.conf

    Add the following two lines to the file. The asterisk simply means that the rule will apply to all users.

        * soft nofile 4096
        * hard nofile 4096

    Save your changes and close gedit. The second file to change is the common-session file. Use the following command:

        sudo gedit /etc/pam.d/common-session

    Add the following line:

        session required pam_limits.so

    Save the file, exit gedit, and restart your machine. You shouldn't have any more problems with too many open files.
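
    After rebooting, the new limit can be verified from a fresh terminal:

        ulimit -n     # soft limit; should now report 4096
        ulimit -Hn    # hard limit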

    Read the article

  • How to open files located in a VirtualBox guest machine from NetBeans on the host machine

    - by Bakhtiyor
    I have Ubuntu 10.04 installed on my Host Machine, and it runs VirtualBox. I have a Guest Machine which runs Ubuntu 10.10. I have NetBeans installed on the Host Machine and need to open my project files, which are located on the Guest Machine. The reason is that at my workplace I am not allowed to install any applications; that is why I have a Guest Machine with a web server installed on it, along with the web application I am developing. I need to open the Guest Machine's web application files in the Host Machine's NetBeans in order to modify and create files for my web application. I have configured the Guest Machine's SSH server and added a port redirection in VirtualBox, so I can now connect to it from the Host Machine. But I could not find any way to open those files from NetBeans. Could anybody give me advice on how to do that? UPDATE: I forgot to say that I don't want to use Shared Folders.
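
    One possible approach (a sketch, not the asker's confirmed setup) is to mount the guest's project directory on the host with sshfs and point NetBeans at the mount point like any local folder. This assumes the VirtualBox port redirection forwards host port 2222 to the guest's SSH port 22, and a hypothetical project path of /var/www/myapp:

        sudo apt-get install sshfs
        mkdir -p ~/guest-www
        sshfs -p 2222 user@localhost:/var/www/myapp ~/guest-www
        # ... open ~/guest-www as a NetBeans project ...
        fusermount -u ~/guest-www    # unmount when done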

    Read the article

  • Recover files from NTFS drive with bad sectors

    - by Martin
    A few nights ago I created a backup of my data on an external 500 GB NTFS USB hard drive. I then formatted my computer, reinstalled Ubuntu and started transferring the data back from the external HDD. Unfortunately some files have become corrupted and Ubuntu is unable to copy them over; the same issue happens if I log in using Windows 7. Disk Utility detects with SMART that there are "a few bad sectors". Some of the files are perfectly intact, but other files cannot be accessed (read or copied) although they are displayed within Nautilus and show the correct file size. Is there anything I can do to recover this data? I have thought of using TestDisk, but this utility seems more useful for repairing lost partitions or deleted files. I have also thought of using ddrescue so I could at least have a low-level copy of the disk, but I am not sure what use to make of it in order to recover the data.
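
    For the ddrescue route, a minimal sketch, assuming the external drive shows up as /dev/sdb1 (check with sudo fdisk -l first). The map/log file lets ddrescue resume and retry bad areas without re-reading the good ones, and the resulting image can then be mounted read-only to salvage whatever is intact:

        sudo apt-get install gddrescue
        sudo ddrescue -d /dev/sdb1 ntfs-image.img rescue.log
        sudo mkdir -p /mnt/rescue
        sudo mount -o loop,ro ntfs-image.img /mnt/rescue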

    Read the article

  • DropSpace Syncs Android Files to Dropbox

    - by ETC
    DropSpace is a free Android application that fixes the primary issue that plagues the official Dropbox app for Android: the lack of true file synchronization. Grab a copy of DropSpace and start enjoying true file syncing on the go. The official Dropbox app is limited to grabbing files from your Dropbox account or pushing files from your phone to your Dropbox account. Actual file synchronization, this manual push/pull model aside, is nowhere to be found. DropSpace fills that gap by enabling file synchronization between your SD card directories and your Dropbox directories. It’s packed with handy features including restricting file syncing to Wi-Fi connections only (great if you don’t want to chew up your very limited data plan) as well as numerous toggles for various settings like whether it should delete remote files if the local file is deleted, how often it should run the sync service, and more. Hit up the link below to grab a copy and take it for a test drive. DropSpace is free and works wherever Android does; Dropbox account required. DropSpace [via Addictive Tips]

    Read the article

  • dual boot missing files on ntfs

    - by yehuda
    I have 3 partitions: one for Win7 (NTFS), one for Ubuntu (ext4) and one just for data (NTFS, so both operating systems can see it). My problem is that I stored some files on the data partition using Ubuntu, and when I booted Win7 all that data was gone! After that I couldn't find the files even when using Ubuntu. My files were simply GONE :( Is there something I can do in Ubuntu, or is it just a Windows problem?
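
    One thing worth checking (a guess, not a confirmed diagnosis): if Windows was hibernated rather than fully shut down, writes made from Ubuntu can be discarded when Windows resumes its cached view of the NTFS volume. Running chkdsk /f on the data drive from Windows, or clearing the NTFS dirty flag from Ubuntu, may surface the lost files:

        sudo ntfsfix /dev/sdXn    # replace sdXn with the data partition, e.g. sdb2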

    Read the article

  • I'm in a group but can't create or modify files

    - by dac
    I have two user accounts; let's say one is User1 and the other is User2. Both of these accounts are in the "root" group. I made a folder with some files in it; the owner is User1 and the group is root. The permissions are set so the group "root" can create and delete files. However, when I log in as User2, I can only access files. User2 is in the "root" group for sure, and when I right-click on the folder in Nautilus and then go to Properties → Permissions, it says there that the "root" group can create and delete files. What's going on? EDIT: Logged out and then back in, and, I don't know why, everything works now...
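
    A likely explanation for why re-logging helped: supplementary group membership is read at login, so a session started before the user was added to the group does not carry it. A sketch for checking and working around this without logging out (User2 is the asker's placeholder name):

        groups          # groups active in the current session
        id User2        # groups as currently configured on disk
        newgrp root     # start a subshell with the new group active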

    Read the article

  • How to find corrupted files?

    - by rafalcieslak
    Some files on my hard drive are corrupted (no worries, nothing system-related, just a bunch of data files, mp3s etc.). I found that out when I tried to burn them all to a DVD: the burning application showed a message that it cannot read the files as they are corrupted. (This is probably a drive issue; it has happened to me once or twice already.) I don't care about recovering them, but I have to determine which ones are corrupted. I cannot check by manually opening them all, as there are thousands of them. Is there any tricky way to check all the files and list the ones that may cause problems when opened?
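
    A minimal sketch: force a full read of every file and log the ones the drive cannot deliver. dd exits non-zero on an I/O error, which is exactly what a bad sector produces (the path is a placeholder):

        find /path/to/data -type f -print0 |
        while IFS= read -r -d '' f; do
            dd if="$f" of=/dev/null bs=1M 2>/dev/null || echo "$f" >> corrupted.txt
        done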

    Read the article

  • How can I add the version of a file to the file name with Tortoise-SVN?

    - by Eric Belair
    I would like to start giving unique names to "cache-able" files - i.e. *.css and *.js - so that stale cached copies are never served, without requiring changes to the web-server settings (as is currently done in IIS). For instance, let's say I have a JavaScript file called global.js. Going forward I would like it to have the name global.123.js when revision 123 is checked in. This would also require the following:

    - The previous version of the file - perhaps it was global.115.js - is removed when the file is deployed.
    - All references to the file are updated with the new file name.

    How do I go about doing this? What concerns do I need to consider?
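
    As far as I know this is not a built-in TortoiseSVN feature, so here is a sketch of a deploy-time script instead (the file names and the *.html glob are assumptions; note svnversion can also print suffixed values such as 123M for a modified working copy):

        REV=$(svnversion -n .)                   # working-copy revision, e.g. "123"
        rm -f global.[0-9]*.js                   # drop the previously deployed copy
        cp global.js "global.$REV.js"
        # rewrite references like global.js or global.115.js to global.123.js
        sed -i -E "s/global(\.[0-9]+)?\.js/global.$REV.js/g" *.html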

    Read the article

  • How to create a deb package that installs a series of files

    - by fossfreedom
    I would like to create a brand-new deb package that installs a series of files. If at all possible, I would like to untar the folder containing these files into a known folder location as part of the installation. Failing that, some knowledge of how to package the source folders and files would be very useful. The question is: is this possible, and if so, how? Let's give an example: ~/mypluginfolder/ contains the files x, y, a subfolder called abc, and inside that another file called z. I want to tar this folder:

        tar -cvf myfiles.tar ~/mypluginfolder

    I presume my debian package would look like:

        myfiles.tar.gz
        myfiles+ppafoss_0.1-1/
            myfiles.tar
            DEBIAN/
                changelog, compat, control, install, rules
                source/

    Is it possible to somehow untar myfiles.tar to a known folder location, for example /usr/share/rhythmbox/plugins/? Thus the final result would be:

        /usr/share/rhythmbox/plugins/mypluginfolder
        /usr/share/rhythmbox/plugins/mypluginfolder/x
        /usr/share/rhythmbox/plugins/mypluginfolder/y
        /usr/share/rhythmbox/plugins/mypluginfolder/abc/z

    Advice is also sought as to where I should drop the source folders and files into the deb package structure, presuming Launchpad needs source. This will eventually become a series of individual Launchpad PPA packages. What I would prefer (but may not be able to achieve) is to keep my packaging to a minimum: create a series of packages from a template and adjust the bare minimum (changelog etc. plus the tar file / file-and-folder structure).
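
    It should be possible to skip the tarball step entirely: with debhelper, ship the unpacked folder inside the source package, list it in debian/install, and dpkg places it at install time. A minimal sketch, using the paths from the question:

        # debian/install (one line: source path, then target directory)
        mypluginfolder usr/share/rhythmbox/plugins/

        # debian/rules (minimal debhelper style; the recipe line must start with a real tab)
        #!/usr/bin/make -f
        %:
        	dh $@

        # then build from the package root:
        dpkg-buildpackage -us -uc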

    Read the article

  • Wiped data, and duplicated folders into files.

    - by Kaustubh P
    Something weird happened today, and I don't know how. Within a folder, every folder has a file with the same name as the folder, with a colon appended to it. And all the files from the innermost directory in my home have been dumped into ~, with a size of 0 bytes. I have not executed any scripts or anything. I was just checking out some easter eggs, namely "gegls from outer space" and "free the fish", then was away from the computer and got logged out because of the screensaver. I couldn't log back in with my password, so I just reset the PC, and while booting, the PC went into a drive check. BUT, IIRC, I saw the duplicate "folder files" before I had logged out, so that's not the reason! All the files have a timestamp of 14 Jan. Also, the contents of my eclipse folder have been dumped into ~, right down to the jars and ini files. HELP!

    Read the article

  • Finding duplicate files?

    - by ub3rst4r
    I am going to be developing a program that detects duplicate files, and I was wondering what the best/fastest method would be. I am most interested in which hash algorithm is best suited for this. For example, I was thinking of having it compute the hash of each file's contents and then group the hashes that are the same. Also, should there be a limit on the maximum file size, or is there a hash that is suitable for large files?
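
    A sketch of the usual strategy: group files by a fast digest and inspect the groups with more than one member. MD5 is plenty for grouping (speed matters more than cryptographic strength here), there is no practical file-size limit since files are hashed in streaming fashion, and candidates can be confirmed byte-for-byte with cmp if hash collisions are a worry. For speed, group by file size first and hash only the sizes that collide, so unique files are never read:

        find . -type f -exec md5sum {} + \
            | sort \
            | uniq -w32 --all-repeated=separate    # 32 = length of an MD5 hex digest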

    Read the article

  • Incorrect Dates for Downloadable files in Google Snippets

    - by alds
    We have a website which creates publications and newsletters. In most (if not all) of the search results for our downloadable files, the Google snippets show dates earlier than when those files were actually published, by one to three months. That should be impossible, since those files did not even exist before the dates mentioned. The dates themselves do not seem to have any significance on our site. Any suggestions as to where the dates come from?

    Read the article

  • Unity Dashboard won't find local files, rearrange icons on two computers

    - by Stanton.Sculpture
    Suddenly I can't move icons around my Unity launcher, and the Dash won't search for my local files and folders. Both worked when I first installed 13.10, but now the Dash won't search for local files, and it won't let me rearrange the icons in any way. I've tried turning all the scopes (lenses?) on and off in multiple combinations, but it won't find any files unless I use Nautilus to find them; it's mostly unresponsive. I can't see my recently used files, or the files and folders scope, at all. Dragging and dropping icons on the side dock doesn't work; they only stick to my mouse until I put them back where they were. I cannot unlock any icons from the launcher; clicking just doesn't do anything. I tried rebooting both of my computers and it still won't function normally. I used ubuntu-bug -w to report a bug; no one has gotten back to me. Is there some option I changed to cause this? This is a problem on both my laptop and desktop. Please help. Alex

    Read the article

  • How to process large files in NetLogo? [closed]

    - by user65597
    I am running into problems in NetLogo with large *.csv / *.txt files. The documents can consist of about 1 million data sets and I need to read them (to eventually create a diagram based on the data). With the most straightforward source code, my program needs about 2 minutes to process these files. How should I approach reading such large data files faster in NetLogo? Is NetLogo even suitable for such tasks (as it seems to be designed more for teaching and learning)?

    Read the article

  • Deleting "undeletable" files in Vista

    - by Nik Reiman
    I recently upgraded my workstation from XP SP3 to Vista Business, and during the upgrade Windows moved my old C:\Windows directory to C:\Windows.old. I got all of the stuff I needed out of that folder, but there are six "undeletable" files there, so I cannot remove it. They are:

        Windows.old\Program1\Adobe\Reader 9.0\Resource\CMap\Identity-H
        Windows.old\Program1\Adobe\Reader 9.0\Resource\CMap\Identity-V
        Windows.old\Program1\Common Files\Adobe\Acrobat\ActiveX\AcroIEHelper.dll
        Windows.old\Program1\Common Files\Adobe\Acrobat\ActiveX\AcroIEHelperShim.dll
        Windows.old\Program1\Common Files\Adobe\Acrobat\ActiveX\AcroPDF.dll
        Windows.old\Program1\Common Files\Adobe\Acrobat\ActiveX\pdfshell.dll

    Whenever I try to delete the files, either through Explorer or a command line, I get a permission-denied error. I have tried to grant myself full permission on the files, but again, permission denied. I don't even have Acrobat installed on my Vista machine, and I uninstalled Adobe Updater. However, I still can't manage to get rid of these files. How do I nuke them for good? Edit: I was able to take ownership of the files, but I still can't delete them. Renaming them did not work, as I was denied permission to do that as well. I'll try booting up in safe mode and getting rid of them there. Edit II: Booting up into safe mode did not allow me to delete the files. Bummer.

    Read the article

  • Hiding recent files in Unity dashboard

    - by Eric
    Ubuntu 13.04 (though I had the same issue in both 12.04 LTS and 12.10), Unity desktop (yes, I like it, shush). Anyway, when clicking on the dashboard there is a tab for 'Files and Folders'. I don't have any files on this computer that aren't porn. In other words, it displays the images there (as it's supposed to), but I can't have it displaying the porn, for obvious reasons. I have disabled 'recent activity' and even added the folder it's all in to 'do not record activity in the following folders'. I assume that works, but as I don't actually have any other files, it still displays them. I don't want to make it a hidden folder, because it's on an external HDD and that causes issues when moving from computer to computer (I have other movies on it as well). TL;DR: Get rid of the 'Files and Folders' tab in the dashboard. Is it possible?

    Read the article

  • No Sync of Files in Android U1 folder

    - by Oldbwl
    My desktop (Ubuntu 12.10) and laptop (Win 7) happily sync files; the Ubuntu One folders are perfectly in sync. I also have an Asus Transformer with the Ubuntu One app installed. I can read files from the cloud, and they download to a U1 folder. But if I edit the files on the Asus (say, a spreadsheet in Kingsoft), it appears they never go back to the cloud unless I manually select the file to be uploaded. Is this correct?

    Read the article

  • What do Windows 7 encrypted files look like?

    - by Sean Farrell
    OK, this is kind of an odd question: what do Windows 7 (Home Premium) encrypted files look like "from the outside"? Now here is the story. An acquaintance of a friend of mine got a nasty virus / scareware, so I put on my PC technician cap and went to work on it. What I did was remove the drive from the laptop and put it into my external drive bay. I scanned the drive and yes, it was loaded with stuff. That basically cured the infection and I could start the system back up. To check whether it had cured the problem I wanted to see the system while running. There were two user accounts, one with a password and one without (both admin users!?). So I logged into the unprotected user and cleaned up the residual issues, like the proxy server set to localhost in the browser config. Now I wanted to do the same for the password-protected user. I noticed that from my system and from the unprotected user account, the files of the protected user looked garbled: the file names are something like 12 random alphanumeric characters each, but the folders looked OK. Naive as I was, I thought this might be how encrypted files look "from the outside". (I never use Microsoft's own security features, so how would I know? TrueCrypt is one big blob.) Since the second user could not be reached, I thought "sod it" and removed the password from the account. (That might have been a mistake, I know.) Now I did the same cleanup tasks, and all was nice and fine except for the files, which were still "encrypted". So I looked into many Windows encrypted-file recovery posts, and not all hope is lost, since I should be able to extract the certificate and, with the password, regain access to the files. Also note that Windows "only" prompted me that removing the password would be insecure, not that access to encrypted files would be lost, as is claimed in most recovery articles. Resetting the password did not help, and I gave up for the night. The question that nagged me half of last night was: what if the files are not encrypted, but the scareware encrypted / destroyed them? I don't want to spend hours of work trying to recover files that are not recoverable. The thing is that the user does not remember turning encryption on, and aren't encrypted files supposed to be marked in blue, with their file names still readable? Many thanks for input from users who have more knowledge about EFS...

    Read the article

  • Tool or script to detect moved or renamed files on Linux prior to a backup

    - by Pharaun
    Basically I am searching to see if there exists a tool or script that can detect moved or renamed files, so that I can get a list of renamed/moved files and apply the same operations on the other end of the network to conserve bandwidth. Disk storage is cheap but bandwidth isn't, and the problem is that the files often get reorganized or moved around into a better directory structure. When you use rsync to do the backup, rsync won't notice that a file has been renamed or moved, and will re-transmit it over the network all over again despite the same file existing on the other end. So I am wondering if there exists a script or tool that can record where all the files are and their names, then, just prior to a backup, rescan and detect moved or renamed files, so I can take that list and re-apply the move/rename operations on the other side. The "general" features of the files:

    - Large, unchanging files
    - They can be renamed or moved around

    EDIT: These are all good answers; what I ended up doing in the end was looking at all of the answers, and I will be writing some code to deal with this. Basically what I am thinking/working on now is:

    1. Use something like AIDE for the "initial" scan, which lets me keep checksums on the files; they are supposed to never change, so this also aids in detecting corruption.
    2. Create an inotify daemon that monitors these files/directories and records any changes relating to renames and moves to a log file.
    3. There are some edge cases where inotify might fail to record that something happened to the file system, so there is a final step of using find to search the file system for files with a change time later than the last backup.

    This has several benefits:

    - Checksums etc. from AIDE make it possible to check that some media did not get corrupted
    - inotify keeps resource usage low, with no need to re-scan the filesystem over and over
    - No need to patch rsync; if I have to patch things I can, but I would prefer to avoid it to keep the burden lower (i.e. no need to re-patch every time there is an update)

    I've used Unison before and it's really nice, but I could've sworn that Unison keeps copies around on the filesystem and that its "archive" files can grow rather large.
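
    A minimal sketch under the stated assumptions (large, unchanging files within one filesystem): snapshot inode-to-path mappings before and after, and join on the inode number. Renames and moves keep the inode, so the join exposes old-path/new-path pairs that can be replayed as mv commands on the far end (paths without whitespace assumed for brevity):

        find /data -type f -printf '%i\t%p\n' | sort > index.new
        join -t "$(printf '\t')" index.old index.new \
            | awk -F'\t' '$2 != $3 { printf "mv %s %s\n", $2, $3 }' > replay.sh
        mv index.new index.old    # becomes the baseline for the next run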

    Read the article

  • How do .so files avoid the problems with header-only templates that MS DLL files have?

    - by Doug T.
    Based on the discussion around this question, I'd like to know how .so files / the ELF format / the GCC toolchain avoid problems when passing around classes defined purely in header files (like the std library). According to Jan in that answer, the dynamic linker/loader picks only one version of such a class to load if it is defined in two .so files. So if two .so files have two definitions, perhaps built with different compiler options etc., the dynamic linker picks one to use. Is this correct? How does this work with inlining? For example, MSVC inlines templates aggressively, which makes the solution described above untenable for DLLs. Does GCC never inline header-only templates like the std library the way MSVC does? If it does, wouldn't that make the ELF functionality described above ineffective in these cases?
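
    The mechanism can be observed directly from a shell: GCC emits out-of-line template instantiations into the ELF dynamic symbol table as weak symbols (W or V in nm output), and at load time the dynamic linker binds every reference to the first definition it resolves, collapsing the duplicates to one. A sketch, with hypothetical library names:

        nm -DC libone.so | grep -E ' [WV] '    # weak symbols: the template instantiations
        nm -DC libtwo.so | grep -E ' [WV] '    # the same mangled names appear here too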

    Read the article
