Search Results

Search found 10242 results on 410 pages for 'stored proc'.

Page 225 of 410

  • Extract photo stills from .vob files

    - by Eric Rath
    My parents had all the family slides scanned by a photo lab. The lab returned the digital photos on two DVDs as movies; there's some stock music over a slideshow with fades between each photo. The discs contain only a handful of files, including some very large VOB files. I'd like to extract these photos and import them into iPhoto. I saw this answer about capturing stills, and that might work if I can figure out the right offset from the beginning and the right capture rate. But this approach seems very error-prone for this purpose. Is there a better way? I wish the individual photo files were stored in a directory on the discs, but they're not there.
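
    For reference, the capture-stills approach the question describes can be sketched with ffmpeg (a hedged example, assuming ffmpeg is installed; the VOB file name and the five-second interval are placeholders that would need to match the disc and the slideshow's per-photo timing):

        # grab one frame every 5 seconds from the VOB and save numbered JPEGs
        ffmpeg -i VTS_01_1.VOB -vf fps=1/5 -q:v 2 still_%04d.jpg

    Since fades sit between photos, some captured frames will land mid-fade; oversampling (say, one frame per second) and discarding the blurry duplicates is one way to compensate.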

    Read the article

  • Puppet: hanging at Schedule[weekly]

    - by Andrei Serdeliuc
    Why would puppet hang at Schedule[weekly]? I'm running puppet in a masterless setup, so to apply my manifest I just run puppet apply /etc/puppet/manifests/site.pp. In debug mode, these are the last things it says before it hangs:

        debug: /Schedule[never]: Skipping device resources because running on a host
        debug: /Schedule[daily]: Skipping device resources because running on a host
        debug: /Schedule[monthly]: Skipping device resources because running on a host
        debug: /Schedule[puppet]: Skipping device resources because running on a host
        debug: /Schedule[hourly]: Skipping device resources because running on a host
        debug: /Schedule[weekly]: Skipping device resources because running on a host

    If I send a SIGINT, it says:

        Exiting
        debug: Storing state
        debug: Stored state in 0.03 seconds
        debug: Finishing transaction 69992657242500

    Thanks
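
    To see where the run is actually stuck, one hedged diagnostic (assuming strace is installed; the pgrep pattern is a placeholder for however the process shows up on your system) is to attach to the hung process and watch which system call it is blocked on:

        # attach to the oldest process whose command line matches 'puppet apply';
        # a block on wait4/select usually means puppet is waiting on an exec'd command or a lock
        strace -f -p "$(pgrep -of 'puppet apply')"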

    Read the article

  • Drupal7 doesn't detect MySQL on CentOS, but Wordpress3 does?

    - by jyaworski
    Hey guys. I'm running CentOS 5.5 here with Apache2, PHP5, and MySQL 5. My wordpress install on the same system runs perfectly, but the drupal7 install script only detects SQLite. The mysql module is enabled in php.ini, so that isn't the problem. Do you think it could be something with Drupal 7, or my PHP install? I tested it on localhost (I'm essentially running ArchLinux with Apache) and it installs just fine. I don't see a difference between my local php.ini and my server php.ini. I get this when accessing install.php on the server:

        SQLite
        The type of database your Drupal data will be stored in. Your PHP configuration only supports a single database type, so it has been automatically selected.

    Edit: The mysql PDO module is installed already.
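
    One quick check worth sketching here (hedged: Drupal 7 specifically needs the pdo_mysql extension, and the PHP that Apache runs can load a different php.ini than the CLI does):

        # list the database-related modules the CLI PHP sees
        php -m | grep -iE 'pdo|mysql'
        # show exactly which ini files the CLI loads; compare against phpinfo() output in the browser
        php --ini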

    Read the article

  • Alternative to Canned Response in Gmail

    - by Stuck
    I have a mailbox that I share with a colleague. We want a good way to store templates for e-mails that we send often, like answers to common questions and so on. We have tried using Canned Responses to store these templates, but that GUI really sucks and is pretty much unusable for anything other than signatures and the like. Is anyone aware of a good alternative? We need to be able to share these templates, so they must be stored "in the cloud". We want access to be as easy as possible, directly in Gmail. A Firefox plugin would be fine since we both use Firefox. We use both Mac and PC, so the solution must be cross-platform. Anyone have any ideas on how to solve this?

    Read the article

  • How to copy the results from a grep command to the bash clipboard?

    - by avilella
    If I type something in a Linux bash terminal with no X and then press Ctrl+U, whatever I typed is stored in the bash "clipboard" (the kill ring, for lack of a better term), and I can type it again with Ctrl+Y. How can I copy the results of a grep command on a text file to that bash clipboard? For example, if I have an INSTALL file like this:

        ./installprocedure --do-some-long-and-complicated-operation-on-dir dir1

    how can I copy the output of a grep command so that it's available via Ctrl+Y? For example:

        # copy content to bash clipboard
        grep installprocedure INSTALL
        # press Ctrl+Y
        ./installprocedure --do-some-long-and-complicated-operation-on-dir dir1    # cursor available here
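
    Bash offers no direct command for loading the kill ring, but pushing the grep output into the shell history gets the same effect one keystroke away (a sketch; recall is with the Up arrow rather than Ctrl+Y):

        # append each matching line to the interactive history
        while IFS= read -r line; do history -s "$line"; done < <(grep installprocedure INSTALL)
        # now press Up to recall the line, ready to edit or run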

    Read the article

  • Leaving job, PC personal info deletion guide?

    - by bangoker
    I'm quitting my current job, and I want to make sure all my personal stuff gets wiped off the computer. I know everybody stores stuff in different places, but is there any quick guide to the locations (folders) and apps that might need deletion? You know, just to be sure I've erased it from all the places and haven't forgotten any (like the places where apps save stuff automatically, their data folders and such). I have erased most of my docs in the My Documents folder, and I am going to erase all the cookies and stored history in my browsers (Opera, IE, Firefox, Chrome). By the way, is there any tool for this?

    Read the article

  • Performance monitoring

    - by Sunny
    I want to monitor CPU usage and disk read/write usage for a particular process, say ./myprocess. For monitoring CPU, the top command seems a nice option, and for reads and writes, iotop seems a handy one. For example, to monitor read/write activity every second I use the command iotop -tbod1 | grep "myprocess". My difficulty is that I want to store just three variables: reads/sec, writes/sec, and CPU usage/sec. Could you help me with a script that combines the output of top and iotop into those three variables and stores them in a log file? Thanks!
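
    A hedged sketch of such a script (the awk column positions are assumptions based on typical top/iotop batch output and may need adjusting for your versions; myprocess and the log path are placeholders, and iotop generally needs root):

        #!/bin/bash
        # log cpu%, read rate, and write rate for one process, once per second
        proc=myprocess
        log=/var/log/myprocess-usage.log
        while sleep 1; do
            cpu=$(top -b -n 1 | awk -v p="$proc" '$NF == p {print $9; exit}')
            line=$(iotop -b -n 1 -qqq 2>/dev/null | grep -m1 "$proc")
            read_rate=$(awk '{print $4 $5}' <<< "$line")
            write_rate=$(awk '{print $6 $7}' <<< "$line")
            echo "$(date '+%F %T') cpu=${cpu}% read=${read_rate} write=${write_rate}" >> "$log"
        done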

    Read the article

  • Writing to network share failed

    - by Unreason
    I have Outlook files stored on a network share and accessed by clients directly. Outlook is version 2003, the clients run Windows XP, and the server is Windows Server 2003. The files are quite big, at around 3 GB. A common problem is that I get 'delayed write failed' errors, and this happens only on these PST files. When it happens I have to run scanpst.exe to fix the PST file. I did not find any entries in the event logs that I could relate to the issue. What would you suggest changing to fix the issue, or where should I look to diagnose it further? EDIT: No loss on ping, and ping times are within normal range for a LAN.

    Read the article

  • Do you lose everything when you have a hard disk failure in a multi-hard disk LVM that does NOT use RAID?

    - by user72630
    I'm debating whether to use LVM for a media/file server because I would like to combine multiple physical hard disks into one volume. I do not wish to use any RAID in my LVM, so my question is: if one of the multiple hard disks in my volume were to go down, would I lose all my data, or would I just lose the data that was stored on that individual disk? Also, if I were to lose just the data on the individual disk, would recovery be as simple as replacing that disk and restoring what was on it from a backup? Thanks everyone.
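
    Worth noting: what survives depends on which physical volumes each logical volume's extents actually live on, and LVM can show that mapping directly (a small sketch, assuming the standard LVM2 tools; the device path is a placeholder):

        # show which device(s) back each logical volume, segment by segment
        lvs -o +devices
        # show what is allocated on one particular physical volume
        pvdisplay -m /dev/sdb1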

    Read the article

  • VPS showing low disk space despite there being nothing major on it

    - by SheoNarayyan
    Hello experts. On my VPS server I was trying to see the used disk space. When I open My Computer it shows 17.9 GB free out of 39.8 GB, which means that 21.9 GB is used. However, when I select all files and folders on C: and check the total size, it only counts approximately 11 GB. The difference is around 10 GB. Where is this 10 GB going if I have not stored anything else here? I asked my VPS provider the above question, and he responded:

        Check hidden files/system files/etc. This is default windows OS and its utilization and not specific to setup. If you want specifics of usage, you can go ahead and get in touch with Microsoft support team and they'll provide you with exact specification of the same.

    I am sure that Windows must not be taking up 10 GB of space for hidden files and folders. My VPS has Windows Server 2008 R2 installed. Can anyone help me figure out who is right?

    Read the article

  • My Quicken 401(K) account has changed to Checking. How do I fix this?

    - by user36492
    This is actually the second time this has happened to me, but I don't remember what I did last time (nor can I find the original forum post that helped then). I'm using Quicken Mac 2007. My 401(k) account, previously properly set up, has changed, seemingly irrevocably, to a Checking account. When I click "Edit" and try to change the account type, the 401(k) option is grayed out. I've got years of data stored in this account, so I am really hoping there's a way to salvage this data file!

    Read the article

  • How to sort time column by value instead of alphabetically

    - by Turch
    I'm creating a pivot table by connecting to an SSAS tabular model (Data - From Other Sources - From Analysis Services). The model has a "time" column that I want to sort by. The default (database) sort order is earliest to latest, but when I click the triangle next to 'Row Labels' and select "Sort A to Z", I get alphabetically sorted times. How can I get the times to sort chronologically? Changing the number format from "General" to "Time" does nothing, and the times aren't stored as text either; the data type of the column in the SSAS model is Auto (Date).

    Read the article

  • ProFTPD Virtual User Directory

    - by Nik
    Alright, I'm trying to replicate a web hosting company's basic setup here by authenticating virtual users via SQL and redirecting/jailing them into their own directories. I've accomplished most of the goals, with the exception of the redirect/jail. The directories are stored in /home/ftp and that's what DefaultRoot is set to. I want each individual user to have, and be jailed into, their own directory. It doesn't appear that setting homedir in SQL has any effect: upon logging into FTP as any user, the session lands in the DefaultRoot with no directory jailing or redirect. How do I accomplish this last task?
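
    For comparison, the usual shape of this setup points DefaultRoot at the tilde rather than at the shared parent, so each login is jailed into whatever homedir the SQL lookup returns. A minimal hedged sketch for proftpd.conf (directive names per stock mod_sql; the table and column names are placeholders for your schema):

        # jail every user into their own home directory
        DefaultRoot ~
        RequireValidShell off
        # table, username field, password field, uid, gid, homedir, shell
        SQLUserInfo ftpusers username password uid gid homedir shell
        # create the per-user directory on first login
        CreateHome on 700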

    Read the article

  • Nagios and rrd on an old server

    - by Pier
    I have an old server (P4-based) on which nagios (and all the other monitoring tools) is running. In the last few weeks we have been seeing strange behavior. In /var/spool/pnp4nagios (where temporary files are stored before being processed by the pnp4nagios daemon) we have many files like perfdata.1274949941-PID-18839, and we get errors in npcd.log:

        [05-27-2010 11:17:46] NPCD: ThreadCounter 0/15 File is perfdata.1274951306-PID-27849
        [05-27-2010 11:17:46] NPCD: File 'perfdata.1274951306-PID-27849' is an already in process PNP file. Leaving it untouched.

    Sometimes some graphs are not drawn. The server is pretty loaded (load average around 5-6 normally) and I suspect that npcd hits a timeout and leaves those files behind. What could I do (apart from changing the server)? A few facts about the system: CentOS 5.5, nagios 3.2.1, pnp4nagios 0.6 (built from source). Thanks
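
    A quick way to confirm the backlog theory before blaming the hardware (a sketch; the spool path is taken from the question):

        # how many perfdata files are queued right now?
        ls /var/spool/pnp4nagios | wc -l
        # are any of them stale (untouched for over an hour)?
        find /var/spool/pnp4nagios -name 'perfdata.*' -mmin +60 -ls

    If the count only ever grows, npcd genuinely cannot keep up; if it drains between checks, the occasional "already in process" message is harmless.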

    Read the article

  • Valid path to deploy vm on host

    - by ELSheepO
    I've recently added a VM to the library in SCVMM to use with MS Test Pro 2010 and have run into a problem. I cannot import it using Test Pro; it just won't give me any option to import it in the lab manager. Also, if I try to deploy it back to the host it came from, it gives me an error saying the path is not valid. Does anyone have any insight into this? SCVMM also seems to freeze every time I try to create a new VM from the template of the VM I've stored in the library, the same one that's giving me the problem when I try to deploy it onto the host. Thanks.

    Read the article

  • Ways to deduplicate files

    - by User1
    I want to simply back up and archive the files on several machines. Unfortunately, the collection includes some large files that are the same file but stored differently on different machines. For instance, there may be a few hundred photos that were copied from one computer to the other as an ad-hoc backup. Now that I want to make a common repository of files, I don't want several copies of the same photo. If I copy all of these files to a single directory, is there a tool that can go through and recognize duplicate files and give me a list, or even delete one copy of each duplicate?
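
    fdupes is the classic tool for exactly this, and plain coreutils can do it too (a hedged sketch; the repository path is a placeholder):

        # list duplicate sets recursively (add -d to delete interactively)
        fdupes -r /path/to/repository

        # or, with coreutils only: group files whose md5 checksums match
        find /path/to/repository -type f -print0 | xargs -0 md5sum \
            | sort | uniq -w32 --all-repeated=separate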

    Read the article

  • Database implementation question?

    - by gundam
    Consider a disk with a sector size of 512 bytes, 2000 tracks per surface, 50 sectors per track, 5 double-sided platters, and an average seek time of 10 ms. Assume a block size of 1024 bytes is selected, and that a file containing 100,000 records of 100 bytes each is to be stored on the disk, with no record allowed to span two blocks. How many blocks are needed to store the entire file? If the file is arranged sequentially on disk, how many surfaces are required? Now, I have calculated that 10,000 blocks are needed to store the 100,000 records, but I am not sure how to find the number of surfaces required. I have only worked out that the capacity of a track is 25 KB and the capacity of a surface is 50,000 KB. Could anyone help me get to the answer? Thanks a lot!
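
    A worked check of the arithmetic, under the usual textbook assumption that a sequential file simply fills consecutive tracks:

        records per block  = floor(1024 / 100)      = 10
        blocks needed      = 100,000 / 10           = 10,000   (matches the figure above)
        blocks per track   = (50 × 512) / 1024      = 25
        blocks per surface = 2,000 tracks × 25      = 50,000
        surfaces needed    = ceil(10,000 / 50,000)  = 1

    So the 10,000 blocks occupy 400 tracks, a fifth of a single surface.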

    Read the article

  • Syncing Large Directories/Filesystems using USB Drive

    - by Alan Lue
    Does anyone have a solution for syncing large directories/filesystems using just a USB flash drive (and specifically without using a network connection)? The objective is simply to sync a user directory between two computers. The contents of the user directory could amount to a large quantity of data, say a quantity larger than could be stored on any single USB drive, but the aggregate size of changes that must be propagated by a single sync could easily fit on a USB drive. As an example, suppose a user directory is already synchronized between a desktop and a laptop computer. Here's a use case:

        1. Some changes are made in the user directory on the desktop.
        2. We mount a USB drive onto the desktop and copy onto it whatever changes need to be applied to the laptop user directory in order to synchronize the desktop and laptop user directories.
        3. We now mount the USB drive onto the laptop and apply the changes. The desktop and laptop user directories are now synchronized.

    Any ideas? Alan
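
    rsync's batch mode was built for exactly this pattern (a hedged sketch, assuming the desktop keeps a mirror directory representing the last-synced state; all paths are placeholders):

        # on the desktop: write the delta against the last-synced mirror onto the USB drive
        # (this also brings the mirror up to date for the next round)
        rsync -a --write-batch=/mnt/usb/sync-batch /home/user/ /home/user.last-sync/

        # on the laptop: replay the recorded delta
        rsync -a --read-batch=/mnt/usb/sync-batch /home/user/

    The mirror costs extra disk on the desktop, but it is what lets the delta be computed with no network connection to the laptop.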

    Read the article

  • Is rsync corrupting my RAR?

    - by Mark Henderson
    We have two qnap devices, one in our datacentre and one off-site. We have hundreds of password-protected RAR files stored on the qnap that contain virtual machine image snapshots, with approximately 20 of them being created each day. We synchronise the two devices using rsync, and it looks like all the files are being rsynced OK: they come over with the same file size, and all the files are present and accounted for. However, when I try to open the RAR files at the remote site, I get:

        Cannot open \\qnap01\FromDatacentre\Snapshots\DB001SQL1-20110626.rar

    I can open the RAR files at the local site just fine, so I assume something is getting mangled during the rsync procedure. However, the older files (pre 2011-06-20) work just fine; this only started happening in the last week. There haven't been (as far as I know) any changes to any of the devices, setup, or configuration in that time. Obviously something has changed, though. Where should I start investigating?
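
    One way to start: a checksum-mode dry run, which lists every file whose content differs between the two sides regardless of matching sizes and timestamps (a sketch; the paths and the remote spec are placeholders for however the two qnaps reach each other):

        # -n dry run, -c compare full checksums, -v list what differs
        rsync -avnc /share/Snapshots/ user@qnap02:/share/FromDatacentre/Snapshots/

        # or hash one suspect file on each device and compare by eye
        md5sum DB001SQL1-20110626.rar

    If the dry run flags recent files, adding -c to the real transfer (at the cost of reading everything) would also repair them on the next pass.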

    Read the article

  • QNAP TS-419p as a VPN Gateway?

    - by heisenberg
    Hello, I am hoping one of you might be able to help. I want to make files stored in shared folders on a QNAP TS-409p available to users over a VPN link. How is this possible? Can someone explain what I need to do at the router and what I need to do on the QNAP NAS? Effectively, what I want to do is use the built-in Windows VPN client to connect to my home network and then be able to browse the shared folders. Thanks in advance.

    Read the article

  • TPM had to be reinitialized: Does a new recovery password have to be uploaded to AD?

    - by MDMoore313
    Somehow, a user's machine couldn't read the BitLocker password off the TPM chip, and I had to enter the recovery key (stored in AD) to get in. No big deal, but once in the machine, I tried to suspend BitLocker per the recovery documentation and got an error message about the TPM not being initialized. I knew the TPM was on and activated in the BIOS, but Windows still made me reinitialize the TPM chip, and in the process it created a new TPM owner password. I found that odd because it prompted me to save or print this password (there wasn't an option not to), but it made no reference to a recovery password, nor did it back this password up to AD. After the user took her laptop and left, I started wondering: if the TPM owner password changes, does the recovery password change too? If so, the new recovery password will need to be uploaded to AD, but MS's documentation doesn't make that clear, and Windows didn't automatically back the new recovery key (if one exists) up to AD even though group policy says it must, and AD was reachable from the network.
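
    For the follow-up worry, the recovery password can be re-escrowed by hand; a hedged sketch from an elevated command prompt (manage-bde ships with Windows 7 / Server 2008 R2; the ID in braces is a placeholder copied from the -get output):

        :: list the protectors and note the numerical password ID of the recovery password
        manage-bde -protectors -get C:
        :: push that recovery password to AD by ID
        manage-bde -protectors -adbackup C: -id {ID-FROM-THE-OUTPUT-ABOVE}

    Comparing the listed recovery password against the one stored in AD also answers whether reinitializing the TPM changed it.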

    Read the article

  • Site's performance slows over time until Apache is restarted

    - by udbhav
    I'm running a Django app with Nginx and Apache. All our static media is stored on S3, and basically it takes a while for the app to check whether thumbnails have been created every time a page is loaded. To alleviate this problem, I'm caching the template output with memcached. Over the course of an hour or two, the site's speed drops significantly until I restart Apache, and then all is good for a little while. I have very little sysadmin experience and was hoping somebody could at least point me in the right direction.
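
    Until the real leak is found, one common stopgap (hedged: this assumes Apache's prefork MPM hosting the Python processes, and the number is only a starting point) is to recycle workers before they bloat:

        # apache2.conf / httpd.conf: retire each child after N requests
        <IfModule mpm_prefork_module>
            MaxRequestsPerChild 500
        </IfModule>

    If recycling a worker cures it, the slowdown is almost certainly per-process state (leaked memory or stale connections) rather than anything in Nginx or memcached.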

    Read the article

  • RAID-3-like software backup tool

    - by Chronial
    I have a lot of data (about 7 TB) stored across multiple hard drives of varying sizes. I would like to have a backup of that data to be safe against drive failure. A RAID is not a good option for me, as I want to keep my costs low and be able to extend the storage capacity of my setup easily by buying an additional HD. I remember seeing a piece of software that generates parity data over all drives and stores it on an extra drive. That solution protects the setup from hard drive failure and works with varying drive sizes (as long as the parity drive is the biggest one). But I can't seem to find that software again. Does anybody know what I'm talking about, or have any other solution for my situation?
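
    One tool matching this description is SnapRAID: it computes parity across data drives of mixed sizes and stores it on a dedicated drive that must be at least as large as the biggest data drive. A minimal snapraid.conf sketch (all paths are placeholders; parity is refreshed by running snapraid sync):

        parity  /mnt/parity/snapraid.parity
        content /var/snapraid/snapraid.content
        content /mnt/disk1/snapraid.content
        data d1 /mnt/disk1/
        data d2 /mnt/disk2/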

    Read the article

  • Securing internal data accessed by a website on the big, bad internet

    - by aehiilrs
    A close relative of this question on Stack Overflow: when you have a web site in your DMZ that needs to access production data stored on an internal DB, what strategies do you recommend for lowering the risks that come with accessing live data? Is it even considered acceptable to have a connection initiated from the DMZ come inside your network? An extra detail about the nature of the site that throws a monkey wrench into the machinery: people using the web site will be competing for "spots" on a first-come, first-served basis with others using the internal software. Because of this, as close to zero lag time between the two applications as possible is ideal.
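
    Whatever the policy decision, the standing advice is to make the inner firewall enforce "only this host, only this port"; a hedged iptables sketch (all addresses and the port are placeholders):

        # permit only the DMZ web host to reach the internal DB host on the DB port
        iptables -A FORWARD -s 203.0.113.10 -d 10.0.0.20 -p tcp --dport 3306 -j ACCEPT
        # everything else from the DMZ toward the inside is dropped
        iptables -A FORWARD -s 203.0.113.0/24 -d 10.0.0.0/8 -j DROP

    A read-only DB account, or a small internal API that exposes just the "spots" operations instead of raw table access, narrows the exposure further without adding much latency.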

    Read the article

  • Import EML emails into Outlook 2010 64-bit

    - by nness
    Evening everyone. I'm helping set up a small office network where a number of old PCs are being replaced with new ones running a 64-bit copy of Outlook 2010. The old emails were stored in Windows Live Mail and were exported as .eml files (since we were replacing the machines). All the support I can find indicates that .eml files can simply be dragged and dropped into a folder in Outlook 2010, which will import them correctly. However, it seems this is not the case in the 64-bit version, where dropping in .eml files results in a new message being created with those files as attachments. We can re-download most of the emails from the server if need be, but there were user folders that were not on the server which we were hoping to import. Any advice would be fantastic at this point!

    Read the article
