Search Results

Search found 40229 results on 1610 pages for 'deleted files'.


  • Thumbnailers of text and ogg files

    - by David López
    I use Ubuntu 12.04, and in Nautilus I can see thumbnails for Ogg files (with embedded artwork) and for text files. It's a nice feature. I have a slow machine on which I've installed Arch with LXDE and PCManFM, and I would like the same thumbnails there, but I can only see a few of them. I've installed Nautilus, Thunar, SpaceFM and lots of different thumbnailers on my Arch machine, but I haven't been able to get thumbnails for text and Ogg files. I think that maybe Ubuntu uses a patched Nautilus version with extended capabilities, or something like that. Any ideas? Thanks.
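    One avenue worth trying, offered as a sketch rather than a confirmed fix: newer libfm builds of PCManFM read the freedesktop thumbnailer entries under /usr/share/thumbnailers, so registering an external thumbnailer there may be enough. The file name below, and whether your ffmpegthumbnailer build extracts embedded Ogg artwork, are assumptions to verify:

        # install a thumbnailer and register it for Ogg audio
        sudo pacman -S ffmpegthumbnailer
        sudo tee /usr/share/thumbnailers/ogg-artwork.thumbnailer <<'EOF'
        [Thumbnailer Entry]
        TryExec=ffmpegthumbnailer
        Exec=ffmpegthumbnailer -i %i -o %o -s %s
        MimeType=audio/ogg;audio/x-vorbis+ogg;
        EOF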

    Read the article

  • Share files - Ubuntu 12.04 and Windows 7 - one network - password not accepted

    - by gotqn
    I asked this question on Super User but no one helped me, so I hope to get more attention here. I have three computers connected to one network through a modem, and I want to share files between them in the easiest way possible (I have read about solutions using Samba). The three machines are: one with Windows 7, one with Windows XP, and one with Ubuntu 12.04. The situation is as follows: the Windows PCs can share files with each other, and they can see that the Ubuntu machine is on the network. The Ubuntu PC can see only the Windows 7 PC, but when I click on a folder it asks for the network password and does not accept it (I am 100% sure it is the correct one). Is there a way to fix this, at least to enable file sharing between the Ubuntu and Windows 7 PCs, or should I choose a different approach? Please advise.
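    A common first step, sketched here without any claim that it matches this exact setup: when Ubuntu prompts for the Windows 7 share's password, try giving the user name in the WORKGROUP\username form; and for sharing in the other direction, create a dedicated Samba account and share on the Ubuntu machine. The user and share names below are placeholders:

        sudo apt-get install samba
        sudo smbpasswd -a youruser          # Samba password, separate from the login password
        sudo tee -a /etc/samba/smb.conf <<'EOF'
        [shared]
           path = /home/youruser/shared
           read only = no
           valid users = youruser
        EOF
        sudo service smbd restart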

    Read the article

  • Nautilus tags/labels/marks/columns for folders/files

    - by madox2
    Is there any way to mark folders or files with tags (or labels, new columns, or whatever) in Nautilus? It would be nice to sort marked folders or files by these tags. My first idea was to mark folders in my Movie directory with tags like seen, not seen and must see. Then I realized it would be useful in any other workspace, with any custom tags. Is there any Nautilus extension for this? Or any other file manager which can do this?
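    There is no stock Nautilus column for this, but as a low-tech sketch: GIO can attach arbitrary metadata strings to files, which scripts can then filter on, even though Nautilus won't display them. The attribute name here is an arbitrary choice, not a standard one:

        # tag a movie and read the tag back
        gvfs-set-attribute -t string ~/Movies/some-film.mkv metadata::tag "seen"
        gvfs-info -a metadata::tag ~/Movies/some-film.mkv
        # list everything tagged "seen" (simple scan, illustrative only)
        for f in ~/Movies/*; do
          gvfs-info -a metadata::tag "$f" | grep -q ': seen$' && echo "$f"
        done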

    Read the article

  • rm on a directory with millions of files

    - by BMDan
    Background: physical server, about two years old, 7200 RPM SATA drives connected to a 3Ware RAID card, ext3 FS mounted noatime and data=ordered, not under crazy load, kernel 2.6.18-92.1.22.el5, uptime 545 days. The directory doesn't contain any subdirectories, just millions of small (~100 byte) files, with some larger (a few KB) ones.

    We have a server that has gone a bit cuckoo over the course of the last few months, but we only noticed it the other day when it started being unable to write to a directory due to it containing too many files. Specifically, it started throwing this error in /var/log/messages:

        ext3_dx_add_entry: Directory index full!

    The disk in question has plenty of inodes remaining:

        Filesystem            Inodes   IUsed      IFree IUse% Mounted on
        /dev/sda3           60719104 3465660   57253444    6% /

    So I'm guessing that means we hit the limit of how many entries can be in the directory file itself. No idea how many files that would be, but it can't be more, as you can see, than three million or so. Not that that's good, mind you! But that's part one of my question: exactly what is that upper limit? Is it tunable? Before I get yelled at: I want to tune it down; this enormous directory caused all sorts of issues.

    Anyway, we tracked down the issue in the code that was generating all of those files, and we've corrected it. Now I'm stuck with deleting the directory. A few options here:

    1. rm -rf (dir). I tried this first. I gave up and killed it after it had run for a day and a half without any discernible impact.

    2. unlink(2) on the directory: definitely worth consideration, but the question is whether it'd be faster to delete the files inside the directory via fsck than to delete via unlink(2). That is, one way or another, I've got to mark those inodes as unused. This assumes, of course, that I can tell fsck not to drop the entries to the files into /lost+found; otherwise, I've just moved my problem. In addition to all the other concerns, after reading about this a bit more, it turns out I'd probably have to call some internal FS functions, as none of the unlink(2) variants I can find would allow me to just blithely delete a directory with entries in it. Pooh.

    3. A shell loop:

        while [ true ]; do ls -Uf | head -n 10000 | xargs rm -f 2>/dev/null; done

    This is actually the shortened version; the real one I'm running, which just adds some progress reporting and a clean stop when we run out of files to delete, is:

        export i=0;
        time ( while [ true ]; do
          ls -Uf | head -n 3 | grep -qF '.png' || break;
          ls -Uf | head -n 10000 | xargs rm -f 2>/dev/null;
          export i=$(($i+10000));
          echo "$i...";
        done )

    This seems to be working rather well. As I write this, it has deleted 260,000 files in the past thirty minutes or so.

    Now, for the questions:

    1. As mentioned above, is the per-directory entry limit tunable?

    2. Why did it take "real 7m9.561s / user 0m0.001s / sys 0m0.001s" to delete a single file which was the first one in the list returned by ls -U, and it took perhaps ten minutes to delete the first 10,000 entries with the command in #3, but now it's hauling along quite happily? For that matter, it deleted 260,000 in about thirty minutes, but it has now taken another fifteen minutes to delete 60,000 more. Why the huge swings in speed?

    3. Is there a better way to do this sort of thing? Not "don't store millions of files in a directory"; I know that's silly, and it wouldn't have happened on my watch. Googling the problem and looking through SF and SO offers a lot of variations on find that obviously have the wrong idea; it's not going to be faster than my approach for several self-evident reasons. But does the delete-via-fsck idea have any legs? Or something else entirely? I'm eager to hear out-of-the-box (or inside-the-not-well-known-box) thinking.

    Thanks for reading the small novel; feel free to ask questions, and I'll be sure to respond. I'll also update the question with the final number of files and how long the delete script ran once I have that. Final script output:

        2970000...
        2980000...
        2990000...
        3000000...
        3010000...

        real    253m59.331s
        user    0m6.061s
        sys     5m4.019s

    So, three million files deleted in a bit over four hours.
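    For what it's worth, one widely cited alternative not tried above, offered as a sketch since its speed on this particular filesystem is untested: let rsync empty the directory against an empty source, which avoids both per-file shell invocations and any sorting of directory entries. The target path is illustrative:

        mkdir /tmp/empty
        rsync -a --delete /tmp/empty/ /path/to/huge-dir/
        rmdir /tmp/empty /path/to/huge-dir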

    Read the article

  • Source control products that support linked/shared files?

    - by Ian Boyd
    We're interested in moving away from our current source control system, which supports the concept of shared or linked files, to another system that supports them too. A shared file means: a file modified in one project is automatically updated in every other project that uses that same file. It does this without a developer having to request it, reverse-integrate it, ask for it, or even want it. We're trying to see if any other commonly used source-control systems can meet our needs and include linked or shared files. My limited research shows that:

    - Team Foundation Server doesn't support sharing files
    - Subversion doesn't support sharing files (including Externals)
    - CVS doesn't support sharing files (including Modules)

    Anything else? (Besides our current source control product, obviously.)

    References:
    - Subversion and shared files across repositories/projects?
    - How to share files between CVS projects?
    - Will TFS ever support shared files for projects under source control?
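    For comparison, a sketch of the nearest Git pattern, with the caveat that it does not give the automatic, no-action propagation described above: keep the shared files in their own repository and attach it to each project as a submodule. The URL and paths are hypothetical:

        # in each consuming project
        git submodule add https://example.com/shared-files.git shared
        git commit -m "link shared files"
        # picking up upstream changes remains an explicit step per project:
        git submodule update --remote shared
        git commit -am "bump shared files"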

    Read the article

  • How to keep "dot files" under version control?

    - by andrewsomething
    Etckeeper is a great tool for keeping track of changes to your configuration files in /etc. A few key things about it really stand out: it can be used with a wide variety of VCSs (git, mercurial, darcs, or bzr); it auto-commits daily and whenever you install, remove, or upgrade a package; and it keeps track of file permissions and user/group ownership metadata. I would like to keep the "dot files" in my home directory under version control as well, preferably with Bazaar. Does anyone know if a tool like etckeeper exists for this purpose? Worst case, I imagine a simple cron job running bzr add && bzr ci once or twice a day, along with adding ~/Documents, ~/Music, etc. to .bzrignore. Is anyone already doing something similar with a script? While I'd prefer Bazaar, other options might be interesting.
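    A minimal sketch of that worst-case cron approach, assuming a fresh Bazaar branch in $HOME; the ignore patterns and schedule are illustrative:

        # one-time setup
        cd ~
        bzr init
        bzr ignore Documents Music Downloads
        bzr add && bzr commit -m "initial import of dot files"
        # crontab -e entry: snapshot twice a day; '|| true' keeps cron quiet
        # when there is nothing new to commit
        # 0 9,21 * * * cd $HOME && bzr add -q && bzr commit -q -m "auto snapshot" || true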

    Read the article

  • How to Transfer Files Between Your PC and Android Phone Wirelessly

    - by Zainul Franciscus
    Mounting your Android phone to transfer files is fast and efficient, but nothing beats the convenience of a wireless file transfer. Today, we’ll show you how to transfer files between Android and your computer without a USB cable.

    Read the article

  • Backup files with rsync: error 23

    - by maria
    Hi, I'm trying to make a backup of my /home in order to transfer all data from one computer to another. I wanted to save the backup on the same computer and then transfer it to the other one. For safety reasons, I'm learning how it works on the computer without a lot of data (the new one), to be sure I won't delete something instead of copying it. I ran in a terminal: sudo rsync -avz /home/maria /home/guest/backup and the result was:

        sent 58797801 bytes  received 23050 bytes  4705668.08 bytes/sec
        total size is 100202958  speedup is 1.70
        rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1060) [sender=3.0.7]

    I tried once again, with the same result. I have no idea which files were not transferred, which makes the whole backup useless for me (I wanted to do it automatically, in order not to forget about something and lose it). On both computers I have the same system (Ubuntu 10.04). rsync version: 3.0.7-1ubuntu1. Thanks for any tips.
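    Exit code 23 only says that some files were skipped; the per-file reasons scroll past earlier in the output. A sketch for pinning them down, using the same paths as above; --itemize-changes and the stderr redirect are the only additions:

        sudo rsync -avz --itemize-changes /home/maria /home/guest/backup 2> ~/rsync-errors.log
        cat ~/rsync-errors.log   # each line names a file rsync could not read or copy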

    Read the article

  • How to FTP transfer files to /var/www?

    - by jc.yin
    I'm new to Linux, and I've set up a web server with Ubuntu Desktop edition so I can practice with the GUI a bit before transitioning to Ubuntu Server. I've already set up a LAMP stack as well as FTP. Now I just need to know how to transfer my web files to the /var/www folder in Ubuntu. Previously I've worked on Mac OS, where there was a central server for all the web files that I could FTP to. Can anyone help me understand how to FTP to the /var/www folder in Ubuntu? Thanks.
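    The usual obstacle is that /var/www is owned by root, so an FTP user cannot write there. One sketch, assuming your login is youruser (a placeholder) and that OpenSSH is installed, which gives you SFTP with no separate FTP daemon:

        # allow your user to write to the web root
        sudo chown -R youruser:www-data /var/www
        # then, from any machine, copy the site over SFTP
        sftp youruser@server-address
        sftp> put -r mysite /var/www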

    Read the article

  • How do I export customized Libreoffice config files?

    - by carestad
    Is this possible? I want to make my own config file with my customizations that I can apply whenever I reinstall my system. For example, Ubuntu's default font color is just stupid: I want it to be BLACK and not dark grey. And I want to turn on autosave every 3rd minute, and backup files. Is there a config file that I can change? The .libreoffice/* folders and XML files don't make sense to me, and they don't seem to change when I change settings in LibreOffice. Could someone please help me out with this? Thanks.
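    Those options live in the LibreOffice user profile, mostly in a single file called registrymodifications.xcu, which LibreOffice only writes out when it exits; that may be why the files seem not to change. The profile path varies by version (~/.libreoffice/3/user on older releases, ~/.config/libreoffice/4/user on newer ones), so treat the path below as an assumption to adjust. A sketch of carrying the profile across a reinstall:

        # after customising LibreOffice and closing it, archive the whole profile
        tar czf ~/lo-profile.tar.gz -C ~ .config/libreoffice
        # on the fresh system, with LibreOffice closed, restore it
        tar xzf ~/lo-profile.tar.gz -C ~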

    Read the article

  • cannot rename files when saving

    - by Botond Vajna
    I am using Xubuntu 13.10. Since the last update, when I am saving files I cannot rename other files from the save dialog; buttons like rename, cut and copy are missing from the context menu. Let's say I edited the file named style.css in Firefox and I want to save it, but I don't want to overwrite the original: I want to rename the original to, for example, style_orig.css and save the edited file under the name style.css. To do that I must open Thunar, navigate to the file and rename it; only then can I save the file from Firefox. To reformulate the question: if I want to overwrite a file but preserve the original, I must first make a copy of the original, and I cannot do this from the application's save dialog. Opening Thunar and navigating to the file is an additional task and a waste of time.

    Read the article

  • Okular can't read pdf files

    - by hoang anh Nguyen
    I recently installed Okular on my Ubuntu 14.04. The problem is that when I open PDF files, Okular gives me the error "Can not find a plugin which is able to handle the document being passed." When I run Okular from a terminal, this is the output I get:

        okular(14100)/kdeui (KIconLoader): Error: standard icon theme "oxygen" not found!
        okular(14100)/kdeui (KIconLoader): Error: standard icon theme "oxygen" not found!
        okular(14100) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14100) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14100) KPixmapSequence::frameSize: No frame loaded
        okular(14100) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14100) KPixmapSequence::frameSize: No frame loaded
        okular(14100) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14100) KPixmapSequence::frameSize: No frame loaded
        okular(14100) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14100) KPixmapSequence::frameSize: No frame loaded
        okular(14100) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14100) KPixmapSequence::frameSize: No frame loaded
        okular(14100) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14100) KPixmapSequence::frameSize: No frame loaded
        okular(14100): No ksycoca4 database available!
        okular(14100)/kdecore (trader) KServiceTypeTrader::defaultOffers: KServiceTypeTrader: serviceType "okular/Generator" not found
        okular(14100)/kdecore (KConfigSkeleton) KCoreConfigSkeleton::writeConfig:
        okular(14100)/kdecore (KConfigSkeleton) KCoreConfigSkeleton::writeConfig:
        okular(14100)/kdecore (KConfigSkeleton) KCoreConfigSkeleton::writeConfig:
        okular(14100)/kdecore (KConfigSkeleton) KCoreConfigSkeleton::writeConfig:
        okular(14100)/kdecore (KConfigSkeleton) KCoreConfigSkeleton::writeConfig:
        okular(14100): No ksycoca4 database available!
        okular(14100)/kdecore (trader) mimeTypeSycocaServiceOffers: KMimeTypeTrader: mimeType "application/pdf" not found
        okular(14100): No ksycoca4 database available!
        okular(14100)/kdecore (trader): KMimeTypeTrader: couldn't find service type "okular/Generator"
        Please ensure that the .desktop file for it is installed; then run kbuildsycoca4.
        okular(14100)/okular (app) Okular::Document::openDocument: No plugin for mimetype '"application/pdf"'.
        okular(14100): Couldn't start knotify from knotify4.desktop: "KLauncher could not be reached via D-Bus. Error when calling start_service_by_desktop_path: The name org.kde.klauncher was not provided by any .service files"
        okular(14100)/kdeui (KNotification) KNotification::slotReceivedIdError: Error while contacting notify daemon "The name org.kde.knotify was not provided by any .service files"
        X Error: BadWindow (invalid Window parameter) 3
          Major opcode: 20 (X_GetProperty)
          Resource id:  0x2a0002e
        okular(14110) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14110) KPixmapSequence::frameSize: No frame loaded
        okular(14110) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14110) KPixmapSequence::frameSize: No frame loaded
        okular(14110) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14110) KPixmapSequence::frameSize: No frame loaded
        X Error: BadWindow (invalid Window parameter) 3
          Major opcode: 20 (X_GetProperty)
          Resource id:  0x2a0001d
        X Error: BadWindow (invalid Window parameter) 3
          Major opcode: 20 (X_GetProperty)
          Resource id:  0x2a0001d

    I would much appreciate any suggestion to solve this problem. Thanks a lot :)
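    The log itself points at the likely culprit: the KDE 4 service cache (ksycoca4) is missing, so Okular cannot find its PDF generator plugin. A sketch of the commonly suggested fix; the package names are assumptions based on Ubuntu 14.04:

        # make sure Okular's document backends and the KDE tools are present
        sudo apt-get install okular-extra-backends kdelibs-bin
        # rebuild the service type cache the log asks for
        kbuildsycoca4 --noincremental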

    Read the article

  • How to Stream Media Files From any PC to Your PlayStation 3

    - by Zainul Franciscus
    Have you ever wished that you could stream video files from your computer over to your TV without actually hooking the two directly together? If you’ve got a PlayStation 3, you’re in luck, because that’s today’s geek lesson. If you’re wondering how to rip DVDs to your PC, we’ve got you covered with an article on the subject, but you can stream video files that you’ve recorded yourself, or downloaded from somewhere.

    Read the article

  • make inference rule for files in a folder

    - by Gauthier
    I use GNU make, and I want my source files and object files to be in different folders. As a first step, I want the source files at the root of my project folder and the object files in a subfolder (say Debug/). The inference rule would be:

        .ss.obj:
            echo Assembling $*.ss...
            $(ASSEMBLY) $(2210_ASSEMBLY_FLAGS) $*.ss -o Debug\$*.obj

    but in that case make rebuilds all files all the time, since there are no .obj files in the root folder. Is there a way to include a folder for the target in the .ss.obj line? I also tried:

        $(OBJS_WITH_PATH): $(SRC)
            echo Assembling $<...
            $(ASSEMBLY) $(ASSEMBLY_FLAGS) $< -o $@

    with $(SRC) as a list of all my sources, and $(OBJS_WITH_PATH) built this way:

        OBJS_WITH_PATH = $(patsubst %.ss,Debug\\%.obj,$(SRC))

    but that makes every object file depend on all source files. What I would like is to modify the first inference rule so that it produces Debug/*.obj files. As it stands, make says there is no rule to make target Debug/asdfasdf.obj.
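    Old-style suffix rules cannot place the target in another directory, but a GNU make pattern rule can. A sketch, wrapped in a shell heredoc as a reminder that the recipe lines must begin with a tab; the variable names follow the question, and the mkdir step is an assumption about the build setup:

        cat > pattern-rule.mk <<'EOF'
        OBJS_WITH_PATH = $(patsubst %.ss,Debug/%.obj,$(SRC))

        all: $(OBJS_WITH_PATH)

        # one object per source: Debug/foo.obj depends only on foo.ss
        Debug/%.obj: %.ss
        	@mkdir -p Debug
        	@echo Assembling $<...
        	$(ASSEMBLY) $(ASSEMBLY_FLAGS) $< -o $@
        EOF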

    Read the article

  • Extracting GPS Data from JPG files

    - by Peter W. DeBetta
    I have been very remiss in posting lately. Unfortunately, much of what I do now involves client work that I cannot post. Fortunately, someone asked me how he could get a formatted list (e.g. tab-delimited) of files with GPS data from those files. He also added the constraint that this could not be a new piece of software (company security) and had to be scriptable. I did some searching around, and found some techniques for extracting GPS data, but was unable to find a complete solution. So, I did...(read more)
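    The post is cut off here, but given the constraints (scriptable, nothing beyond a command-line tool), the sketch below shows the shape of a typical solution, assuming ExifTool is available on the machine:

        # tab-delimited file name, latitude and longitude (decimal degrees, via -n)
        exiftool -T -filename -gpslatitude -gpslongitude -n *.jpg > gps-list.tsv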

    Read the article

  • g++ cannot find include files (qt3)

    - by Allan
        allan@allan-VirtualBox:~/blackjack_for_the_hopelessly_luckless$ make
        g++ -c -pipe -g -Wall -W -O2 -D_REENTRANT -DQT_NO_DEBUG -DQT_THREAD_SUPPORT -DQT_SHARED -DQT_TABLET_SUPPORT -I/usr/share/qt3/mkspecs/default -I. -I. -I/usr/include/qt3 -o advicewindow.o advicewindow.cpp
        advicewindow.cpp:32:19: fatal error: QWidget: No such file or directory
        compilation terminated.
        make: *** [advicewindow.o] Error 1

    qt3 was installed using apt-get, and the header files are located in /usr/include/qt3/. Is there a g++ config file or something I need to update? I'm new to compiling from source and not sure what to do. The Makefile was created from the project file with qmake. The files in the include directory are all lower case; should I change the code in advicewindow.cpp to include qwidget.h? Any help appreciated. Thanks.

    Read the article

  • Ubuntu One Windows application only accessing gpg files

    - by Boomer Kuwanger
    I'm on a Windows 7 (64-bit) machine right now, with the Ubuntu One Windows application. I'm synced to my online account, and the folder I am accessing is deja-dup\My-desktop. When I tick the 'Sync Locally' checkbox and explore the folder, I only see three files of the form: duplicity-full.#######.manifest.gpg, duplicity-full.#######.vol1.difftar.gpg, duplicity-full-signatures.#######.sigtar.gpg. How do I access the content of these files? I put them on a Linux server and decrypted/extracted them, but something went wrong. Note: I cannot use apt-get on the Linux server I'm using. Is there a way to access these files using the Ubuntu One software for Windows? Many thanks, Boomerkuwanger
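    Those file names are the giveaway, offered here as a pointer rather than a certain diagnosis: the deja-dup folder holds Déjà Dup backups stored as encrypted duplicity volumes, so the Ubuntu One client can only download them, not open them. On a Linux machine where the duplicity tool already exists (no apt-get needed in that case), a restore sketch looks like this, with the local path as a placeholder; it will prompt for the backup passphrase:

        # restore the entire backup set into ~/restored
        duplicity restore file:///path/to/deja-dup/My-desktop ~/restored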

    Read the article

  • Upgraded to 11.10, lost personal folders; Ubuntu One shows no files

    - by Kevin
    I upgraded from 10.10 to 11.04; the system would only come up in terminal mode, but it told me that an additional upgrade was available and asked if I wanted to do that. Foolishly thinking that might fix the problem, I said yes. This time it did not make it all the way through the upgrade; when I came back to the computer over an hour later, the screen was filled with the error message "could not open display", and I had to reboot. I went to recovery mode on reboot to install the nvidia module, and when I rebooted the system came up fine, but without my personal folders: I have the home folder, but no personally named folder in it. I opened Ubuntu One, but it gives the error message: File Sync error. (org.freedesktop.DBus.Error.NoReply: Did not receive a reply. Possible causes include: the remote application did not send a reply, the message bus security policy blocked. Is there a way around this in order to restore my files? I know my files existed on Ubuntu One as of a few months ago.
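    The D-Bus "NoReply" error usually means the Ubuntu One sync daemon never came up properly after the upgrade. A sketch for prodding it from a terminal, assuming the ubuntuone-client tools survived the upgrade:

        u1sdtool --quit      # stop a wedged sync daemon, if one is running
        u1sdtool --connect   # start it again and connect to the server
        u1sdtool --status    # confirm it now answers over D-Bus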

    Read the article

  • Ubuntu One syncing changed files

    - by Mark
    I have several folders in the Ubuntu One folder, and if I add a new file (on my PC or my mobile) the file shows up on the other device and in the web access. However, if I change one of the files (like a spreadsheet I change almost daily on my PC), Ubuntu One does not update the changed file; the old file remains on the other devices. Does Ubuntu One sync only new files and not changed ones, or do I have to change some settings? Could someone be so nice as to help me? Thanks in advance! Mark. PS: I am using Ubuntu 11.10 64-bit.

    Read the article

  • I see files in FileZilla, but the internet denies their existence

    - by Zach L.
    I am doing some updates to a 10-year-old site, and I am baffled. Everything worked great locally, so I uploaded a bunch of stuff to the server using FileZilla. Within FileZilla I can see all of the files, but for some reason I get a 404 when trying to view them. It seems as though (at least for the folder I'm currently checking) this is happening for items that are farther down the list alphabetically. I re-uploaded one file individually, but it didn't change anything. Is this an indication that I hit some sort of limit with the hosting company? And if so, why can I still see the files in FileZilla? Please offer guidance; I am at a loss.
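    Two checks worth running before blaming a hosting limit, sketched with placeholder host and paths: see exactly what the web server returns for one of the "missing" files, and verify the uploads are world-readable, since FTP transfers sometimes land with permissions the web server refuses to serve:

        # exact status the server sends for one affected file
        curl -I http://example.com/css/zz-late-alphabet.css
        # if permissions are the problem and you have shell access:
        chmod 644 /path/to/site/css/*.css
        # (FileZilla can also do this: right-click a file > File permissions... > 644)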

    Read the article
