Search Results

Search found 59060 results on 2363 pages for 'dummy data'.


  • Adding EXIF lens data for a manual lens (e.g. "Lensbaby")

    - by dbr
    I have a Lensbaby Composer, which is an entirely mechanical lens (no electronics in it), so the camera body cannot determine what lens is attached, and the metadata therefore contains no lens info. Is there any way to set this metadata manually, so the photos don't show up as "Unknown Lens"? It's a Canon 5D Mark II (so the native files are .cr2), and I convert them to DNG with Lightroom.
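
    One possible approach, assuming the open-source exiftool command-line utility is available (the tag name below is the standard EXIF LensModel tag; check exiftool's documentation for your version):

        # Write the lens name into a converted DNG (a sketch, not Lightroom-specific)
        exiftool -LensModel="Lensbaby Composer" -overwrite_original photo.dng

        # Or tag every DNG in the current folder in one pass
        exiftool -LensModel="Lensbaby Composer" -ext dng -overwrite_original .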

    Read the article

  • Best format for backing up data on Blu-ray

    - by Arrieta
    We are in the process of backing up our hard drives to Blu-ray discs. I am creating tar.gz files and burning them to Blu-ray. Is it possible to use a simple (preferably Python-based) solution for creating images of those tar.gz files, of a predetermined size (to fit on a Blu-ray disc), and simply burn these images to the disc? Do you have any other approach for creating a physical backup of your hard drives?
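
    For what it's worth, a minimal sketch using standard shell tools instead of Python (assumes GNU tar and split, and single-layer 25GB discs; the 23G chunk size leaves headroom for filesystem overhead):

        # Stream the archive straight into fixed-size chunks
        tar czf - /path/to/data | split -b 23G - backup.tar.gz.part-

        # Restore by concatenating the chunks back into one stream
        cat backup.tar.gz.part-* | tar xzf -

    Each chunk can then be burned to its own disc; only the chunk naming (backup.tar.gz.part-aa, -ab, ...) is assumed here.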

    Read the article

  • Robocopy with local catalog of remote data for incremental backup

    - by Bill
    I am currently using robocopy to copy to an extremely slow destination. The compare between source and destination files can take a while to run through. Since the destination will never change (apart from the robocopy changes), is there any program that works like robocopy but keeps a local list of the files (attributes and timestamps) on the destination to compare against? I know there are expensive solutions that may do this, but I'm looking for something free if possible. Hopefully this makes sense.

    Read the article

  • Weird PCI bug: lots of missed packets, or data comes in "bursts"

    - by Thomas O
    I have an ABIT KN9 motherboard. It has one PCI-e x16 slot, three PCI-e x4 slots and two legacy PCI slots. My problem is with the legacy PCI (which I shall just call "PCI"). I currently have an Nvidia GeForce 8600 GT (a low-end card) installed in the x16 slot and a TV card in PCI #1; the x4 slots are unused, as is PCI #2. I plan to upgrade the graphics card soon; the current card was spare. I sometimes install a USB expander in PCI #2, but it causes a lot of problems - see below.

    The problem shows up under Linux (Ubuntu 10.10, Linux 2.6.35-22-generic), but probably affects all operating systems. I have not yet been able to test Windows, but I suspect it will behave the same, since the problems occur on the BIOS/POST side too: when using a USB keyboard on the expander, the keyboard will not work at all.

    PCI has an enormous delay, and packets arrive in large chunks. For example, when using the USB expander, my USB mouse lags and jumps in large steps every second or so, while using the motherboard USB does not present this problem. My TV card will only do one or two frames per second, and the program (xawtv) usually times out and crashes. In dmesg, I'm getting messages like:

        bttv0: timeout: drop=74, irq=154/100476, risc=31f6256c, bits: VSYNC HSYNC OFLOW RISCI

    for my TV card, and similar timeout issues for my USB expander with a mouse.

    I received the motherboard, processor and RAM second hand and have only just got around to building it, so I don't know if this problem has always existed, or if it's a result of my set-up. If anyone has any hints or solutions it would be appreciated - this is kind of a show-stopper for me.

    Read the article

  • Disaster recovery backup of files/photos for personal use

    - by Renesis
    I'm looking for the best method to store a backup of important files and 5+ years of digital photos that is safe from some type of fire/flood disaster in my home. I'm looking for:

    - Affordable: less than $100/yr or first-time cost.
    - Reliable: at least a smaller chance of failing than there is of fire or flood.
    - Easy for initial backup and to add to, and at least semi-easy to recover.

    I recently purchased a small home safe for physical vitals. It was inexpensive, solid, and is fire/water safe. If I had a physical copy of the digital files, the safe would work fine for this, but I don't know what to store in it that adequately meets the requirements above.

    - Hard drive - I read that the danger of it not spinning up makes a hard drive a bad choice for this type of storage, although it was my first thought and would definitely be the simplest choice - very easy to take out once a month and add files to.
    - DVDs - way too much of a hassle for both backup and restore.
    - Tape - no idea on the affordability of this option.
    - Online - given that I have at least 300GB already and ever-increasing megapixels mean ever-bigger files, and my ISP upload is about 2Mb at best, this just doesn't sound like a good option for me, but I could be convinced.
    - Other - have I missed something?

    Also, I'm already covered both for sync between computers (Dropbox) and a nightly backup of these files (external HDD). The problem with the nightly backup is obviously that it's always with the computer and in a disaster would be destroyed along with it. Is anyone else doing something similar? Is the HDD as poor a choice as I read, or is it a feasible option? Maybe two, to reduce the likelihood of failure?

    Read the article

  • Recovering text files in terminal using grep on Mac OS X Snow Leopard

    - by littlejim84
    I foolishly removed some source code from my Mac OS X Snow Leopard machine with rm -rf when doing something with buildout. I want to try to recover these files. I haven't touched the system since, while seeking an answer. I found an article suggesting that the grep method is the way to go, but when running it on my machine I'm getting "Resource busy" when trying to run it on the disk. I'm using this command:

        sudo grep -a -B1000 -A1000 'video_output' /dev/disk0s2 > file.txt

    where /dev/disk0s2 is what came up when I ran df. I get this when running it:

        grep: /dev/disk0s2: Resource busy

    I'm not an expert with this stuff; I'm trying my best. Please can anyone help me further? I'm on the verge of losing two days of source code work! Thank you
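
    For context, "Resource busy" usually means the volume is still mounted. A minimal sketch of the usual workaround (assumes /dev/disk0s2 is not the boot volume - a boot volume can't be unmounted, so you would need to boot from another disk or use Target Disk Mode - and that the output file lives on a different volume, so the recovery doesn't overwrite the deleted data):

        # Unmount the volume but keep the device node accessible for raw reads
        sudo diskutil unmount /dev/disk0s2

        # Re-run the recovery grep, writing the output to another volume
        sudo grep -a -B1000 -A1000 'video_output' /dev/disk0s2 > /Volumes/Other/file.txt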

    Read the article

  • Methods for preventing large-scale data scraping from a REST API

    - by Simon Kenyon Shepard
    I know the immediate answer to this is going to be that there is no 100% reliable method of doing it. But I'd like to create a question that details the different possibilities, the difficulty of implementing them, and their success rates. I would like to go from simple software IP/request-rate analysis to high-end, sophisticated software/hardware tools, e.g. neural networks, with the goal of predicting and preventing bogus requests and attempts to scrape the service. Many thanks.
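
    At the simple end of that spectrum, a hedged sketch of per-IP rate limiting with iptables and its "recent" module (the port, window and hit count here are illustrative only):

        # Track each client IP that opens a new connection to the API port
        iptables -A INPUT -p tcp --dport 443 -m state --state NEW -m recent --name api --set

        # Drop clients that open more than 20 new connections in 60 seconds
        iptables -A INPUT -p tcp --dport 443 -m state --state NEW \
                 -m recent --name api --update --seconds 60 --hitcount 20 -j DROP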

    Read the article

  • Can I list file names (or their parent directories) that were recently deleted using rm in OS X?

    - by Andrew Grimm
    Is it possible to find out which files and directories have recently been deleted by rm in OS X? Or failing that, is it possible to find which parent directories have had files or directories within them deleted? The OS version is Snow Leopard.

    Background: last night, rvm (Ruby Version Manager) did rm -rf of the ~/ruby directory from the home directory. (This bug has since been fixed.) Ideally, I'd like to know what files within the ~/ruby directory were deleted, but failing that, I'd like to know if rvm deleted anything outside of ~/ruby.

    In case anyone's wondering about backups: just about everything within ~/ruby is a git project that has a remote repo, and I have a fairly recent Time Machine backup (only 20 days old).

    Read the article

  • How best to copy an SD card with corrupt filesystem to attempt recovery?

    - by pdbartlett
    I have an SD card with a corrupt filesystem, so I want to clone it and attempt recovery on the copy (just in case of problems). I was thinking that dd-ing it under Linux would be a sensible way to go, but I don't really want to experiment in this situation. So if anyone has done this before, it would be good to know the exact approach that works. In case it helps, I have Ubuntu, OS X and Windows machines available. TIA, Paul.
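
    A minimal sketch of the dd route under Linux (the device name /dev/sdX is a placeholder - confirm it first, e.g. via dmesg after inserting the card - and make sure the card is not mounted):

        # Clone the raw card; pad unreadable sectors with zeros rather than aborting
        sudo dd if=/dev/sdX of=sdcard.img bs=4M conv=noerror,sync

        # Keep the clone pristine and run recovery tools against a working copy
        cp sdcard.img sdcard-work.img

    If the card has physical read errors, GNU ddrescue (sudo ddrescue /dev/sdX sdcard.img sdcard.log) handles retries and logging more gracefully than plain dd.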

    Read the article

  • Excel removes leading zeros when displaying CSV data

    - by Velika Kudac
    I have a CSV text file with the following content:

        "Col1","Col2"
        "01",A
        "2",B
        "10",C

    When I open it with Excel, cell A2 displays "01" as a plain number, without the leading 0. When I format rows 2 through 4 as "Text", the display changes, but the leading "0" is still gone. Is there a way to open a CSV file in Excel and see all of the leading zeros in the file by flipping some option? I do not want to have to retype '01 in every cell that should have a leading zero. Furthermore, using a leading apostrophe necessitates saving the changes in XLS format when CSV is desired. My goal is simply to use Excel to view the actual content of the file as text, without Excel trying to do me any formatting favors.
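
    The commonly suggested workaround is to import the file rather than open it, so Excel never guesses the types - a sketch of the steps (menu names vary by Excel version):

        1. Rename file.csv to file.txt, then open it from Excel.
        2. In the Text Import Wizard, choose Delimited and tick Comma.
        3. Select each column in the preview and set its format to Text.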

    Read the article

  • Filter any mailing list in GMail using the "list:" meta-data

    - by Binary255
    Hi. If I ask GMail to create a filter for a mailing list, it creates a rule containing list:mailing-list-identifier; in the case of the NAnt mailing list it wrote:

        Has the words: list: "nant-users.lists.sourceforge.net"

    Is there a way to filter any mailing list? I would like to filter conversations from any mailing list containing answers to things I've previously asked (or answered). Part of that filter is identifying "anything which is part of a mailing list", and I'm wondering if there is a better way than adding another label to all mailing-list posts (which is cumbersome).

    Read the article

  • KeepLevelReg settings to eliminate sync prompts - errors occurred while Windows was synchronizing your data

    - by Detritus Maximus
    We have 2 XP Pro VMs (Citrix) that both have problems with sync prompts appearing during logout ("errors occurred while Windows was synchronizing your data"). Users are closing the RDC session before these appear. The Microsoft solution involves creating the KeepProgressLevel registry entry with a value of 1 for "pause on errors". I have implemented this across the domain, yet these 2 VMs continue to show the prompts. Today, I experimented by changing the KeepProgressLevel option to 0. This is not one of the options given by MS, yet I stopped getting the prompts. Can anyone tell me what I've done by setting the value to 0? Have I basically turned off the feature, as if the KeepProgressLevel entry were gone? If so, why no more prompts? I did notice during logoff that there is a red X and an error message, yet no prompt.

    Read the article

  • Complete Public Folder Migration from Exchange 2007 to Exchange 2010

    - by Michael Todd
    We were in the process of migrating from Exchange 2007 to Exchange 2010 and hit a brick wall when trying to migrate public folders. After resolving issues with connectivity (and another issue with an old Exchange 2003 server listed in AD that was causing replication to fail), it finally appeared that messages were migrating from one server to the other. However, we appear to have jumped the gun and ran MoveAllReplicas before the process was complete. We are now stuck with about 210MB of public folders on the new server, from a 7GB public folder store on the old server. The messages appear to still exist on the old server, since running Get-PublicFolderStatistics shows them there. We have waited several days for the move to continue, but we are stuck at 210MB. Is there something we can do to complete the replication so that all of the messages move from the old server to the new one?

    Read the article

  • Recovering files using Recuva

    - by Nev Meek
    I'm currently using Recuva to recover some files from an external NTFS disk. It finds the files I'm interested in during its analysis phase (when tools like TestDisk fail to find them at all) and reports them as "Not deleted", with a big green marker to signify a 100% chance of recovery. However, when it tries to recover the files I get a "the system could not find the file specified" message. Is there any easy way to recover non-deleted files from a disk that I can no longer simply access through Explorer?

    Read the article

  • chrooted sftp user with write permissions to /var/www

    - by matthew
    I am getting confused about this setup that I am trying to deploy. I hope some of you folks can lend me a hand: much appreciated.

    Background info: the server is Debian 6.0, ext3, with Apache2/SSL and Nginx at the front as a reverse proxy. I need to provide sftp access to the Apache root directory (/var/www), making sure that the sftp user is chrooted to that path with RWX permissions. All this without modifying any default permission in /var/www:

        drwxr-xr-x 9 root root 4096 Nov 4 22:46 www

    Inside /var/www:

        -rw-r----- 1 www-data www-data 177 Mar 11 2012 file1
        drwxr-x--- 6 www-data www-data 4096 Sep 10 2012 dir1
        drwxr-xr-x 7 www-data www-data 4096 Sep 28 2012 dir2
        -rw------- 1 root root 19 Apr 6 2012 file2
        -rw------- 1 root root 3548528 Sep 28 2012 file3
        drwxr-x--- 6 www-data www-data 4096 Aug 22 00:11 dir3
        drwxr-x--- 5 www-data www-data 4096 Jul 15 2012 dir4
        drwxr-x--- 2 www-data www-data 536576 Nov 24 2012 dir5
        drwxr-x--- 2 www-data www-data 4096 Nov 5 00:00 dir6
        drwxr-x--- 2 www-data www-data 4096 Nov 4 13:24 dir7

    What I have tried: created a new group secureftp; created a new sftp user, joined to the secureftp and www-data groups, with a nologin shell and home directory /; edited sshd_config with:

        Subsystem sftp internal-sftp
        AllowTcpForwarding no
        Match Group secureftp
            ChrootDirectory /var/www
            ForceCommand internal-sftp

    I can log in with the sftp user and list files, but no write action is allowed. The sftp user is in the www-data group, but the permissions in /var/www are read/read+x for the group bit, so... it doesn't work.

    I've also tried ACLs, but as I apply ACL RWX permissions for the sftp user to /var/www (dirs and files recursively), the unix permissions shown change as well, which is what I don't want.

    What can I do here? I was thinking I could enable the user www-data to log in as sftp, so that it would be able to modify the files/dirs that www-data owns in /var/www. But for some reason I think this would be a stupid move security-wise.
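
    For reference, a minimal ACL sketch (assumes an acl-enabled mount and an sftp user named sftpuser, both hypothetical here; note that setfacl recalculates the ACL "mask", which ls then shows in place of the group bits, rather than touching the real owner/group/other permission bits):

        # Give the sftp user rwx on existing content (X = dirs and already-executable files only)
        sudo setfacl -R -m u:sftpuser:rwX /var/www

        # Make newly created files and dirs inherit the same access via default ACLs
        sudo find /var/www -type d -exec setfacl -m d:u:sftpuser:rwX {} +

    Note also that OpenSSH requires the ChrootDirectory itself (/var/www) to be root-owned and not group- or world-writable, which the listing above already satisfies.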

    Read the article

  • How to scroll up and look at data in GNU screen

    - by dorelal
    I am using a Mac (Snow Leopard). I am a Ruby on Rails developer; I watched a screencast on GNU screen and am trying it out. So far I like it. In a window where I start the server, I get to see the log messages. However, I can't seem to scroll up. I do get a scroll bar, but when I use it and scroll up I don't see anything. How do people use GNU screen and scroll up?
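
    For reference, screen keeps its own scrollback, reached through copy mode rather than the terminal's scroll bar - a quick sketch of the default key bindings:

        Ctrl-a [        # enter copy/scrollback mode (Ctrl-a Esc also works)
        # ...then move with the arrow keys, or Ctrl-b / Ctrl-f for a page up / down
        Esc             # leave copy mode

    Adding a line like "defscrollback 10000" to ~/.screenrc increases how much history screen keeps per window.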

    Read the article

  • Permissions changed after Mail.app and associated data crash

    - by Olivier
    Hi there. I experienced a major Mail.app crash on Snow Leopard a couple of days ago. It took me hours to make the folder structure usable again by Mail. I changed permissions back to 755 for all subfolders, starting from and including ~/Library/Mail. Mail now works again, but settings such as the folder order in the left sidebar and mail ordered by date in some folders no longer persist. Any idea? Thanks for the help

    Read the article

  • Finding optimal ddrescue command line options where Accuracy > Speed

    - by gav
    I've read up a bit about this tool and obviously looked at the man pages. The trouble is that ddrescue takes so long that I need to get the command right the first time. I wasn't sure how to improve on the vanilla:

        $ sudo ./ddrescue -v /dev/disk0s5 MyVolImage.dmg MyVolRescue.log
        $ sudo ./ddrescue -v MyVolImage.dmg /dev/disk1s3 MyVolRestore.log

    Both drives are HFS+. The source (broken) HDD is connected via USB 2.0; the destination HDD is inside the MacBook. I would choose accuracy over speed. There seem to be a lot of options, but I'm not sure how they impact the quality and speed of recovery. Thanks, Gav
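
    A possible refinement for the rescue step, favouring accuracy (option names per the GNU ddrescue manual - worth checking against your version; drop -d if direct disc access isn't supported on your build):

        # Pass 1: copy everything that reads cleanly, skipping the slow retrying of bad areas
        sudo ./ddrescue -v -n /dev/disk0s5 MyVolImage.dmg MyVolRescue.log

        # Pass 2: same logfile, so it resumes and retries only the bad areas, 3 times each
        sudo ./ddrescue -v -d -r3 /dev/disk0s5 MyVolImage.dmg MyVolRescue.log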

    Read the article

  • Backing up data from server to laptop?

    - by Patrick
    I need a tool to automatically back up my Drupal installations from my server to my laptop. In other terms, I need to copy one folder (all my Drupal sites are inside this folder) and all databases. So I was wondering if I just need to write a script on my laptop that connects to the server every week, copies the folder along with all MySQL databases, and informs me by email if the backup has been successful. Do you know if I can find some tutorial for it, or download such a script? Thanks
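
    A minimal sketch of such a script (the host, paths and address are placeholders; assumes SSH key authentication, a ~/.my.cnf on the server holding the MySQL credentials, and a working mail command on the laptop):

        #!/bin/sh
        set -e  # abort on the first failed step so the success mail is honest

        # Dump all databases on the server, then pull the dump and the Drupal folder
        ssh user@server 'mysqldump --all-databases | gzip > /tmp/all-dbs.sql.gz'
        rsync -az user@server:/tmp/all-dbs.sql.gz ~/backups/
        rsync -az --delete user@server:/var/www/ ~/backups/www/

        echo "Backup completed $(date)" | mail -s "Drupal backup OK" me@example.com

    A crontab entry such as "0 3 * * 0 /home/me/backup.sh" would run it weekly; failures would surface via cron's own mail rather than the success message.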

    Read the article

  • Populating cells with data from another spreadsheet after just keying in a few letters

    - by Wendy Griffin
    I have one workbook with two spreadsheets. Spreadsheet 2, column A, contains a long list of company names, and columns B - H contain critical information about each company. Spreadsheet 1 contains all of the same columns as Spreadsheet 2, plus some other columns. What I'm trying to achieve: when you type the first 3 characters of a company name on Spreadsheet 1, it should show a drop-down of the companies (as listed on Spreadsheet 2) that share the first 3-5 letters, and you would select one. Upon selecting a company name, all of the corresponding company information would populate the other columns on Spreadsheet 1 automatically. This is to avoid copying a row from Spreadsheet 2 and pasting it into Spreadsheet 1. Any help with this would be greatly appreciated. Cheers!
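
    A common way to approximate this pairs Excel's Data Validation (List) for the drop-down of company names with VLOOKUP formulas in the other columns - a sketch, assuming the chosen name lands in column A of Spreadsheet 1 and Sheet2 holds names in A and details in B - H:

        =VLOOKUP($A2, Sheet2!$A:$H, 2, FALSE)

    placed in column B of Spreadsheet 1, with the column index 2 raised to 3, 4, ... for columns C, D, and so on. Data Validation restricts entry to known names but only autocompletes as you type, so the "first 3 letters" filtering is only approximated.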

    Read the article

  • Is an NTBackup of a MySQL data directory reliable?

    - by Justin Dearing
    This question was asked on the MySQL forums in 2004 with no answers. I'm installing MySQL 5.0.x on a Windows 2003 server for use with Drupal. I began to configure the backup with mysqldump when it occurred to me that an NTBackup taken using shadow copy should be reliable enough for backing up the database. Is there any flaw in my logic?
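
    One hedged caveat: a file-level snapshot is only guaranteed consistent if MySQL's tables are quiesced when the shadow copy is taken, so many setups keep a logical dump alongside it, e.g.:

        mysqldump -u root -p --all-databases --lock-all-tables > drupal-backup.sql

    --lock-all-tables suits the MyISAM tables a default MySQL 5.0/Drupal install uses; for InnoDB tables, --single-transaction gives a consistent dump without blocking writes.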

    Read the article

  • Are there any disadvantages of having a "free fall sensor" on a hard disk drive?

    - by therobyouknow
    This is a general question that came out of a specific comparison between the Western Digital Scorpio WD3200BEKT and the Western Digital Scorpio WD3200BJKT (which is the same as the former, but with a free-fall sensor). Note: I'm not asking for a review or appraisal of these specific drives, as the general question applies to other brands as well, though your input would help my decision. To break down the general question in order to answer it, I would be looking for comments on things like:

    - Is it necessary for drives with free-fall sensors to have different physical dimensions from those without? E.g. does the sensor make the drive any thicker, and therefore reduce the range of systems where it can be installed - particularly smaller laptops?
    - Does it actually make the system less reliable, because of false alarms whereby the drive thought the laptop was falling but it wasn't?

    I suppose that the fact that a manufacturer produces both drives with and without free-fall sensors says something about possible disadvantages. Or it could be a standard marketing technique, whereby making drives both with and without the feature results in larger sales volume than making only drives with the feature.

    Read the article
