Search Results

Search found 92226 results on 3690 pages for 'file access'.


  • FFmpeg - No Such File or Directory?

    - by Lynda
    I am attempting to use FFmpeg to extract audio from an MP4, and I keep running into this error: C:\Files\ffmpeg\video.mp4: No such file or directory. I am in the command prompt (in Windows 7) with the path as C:\Files\ffmpeg (where ffmpeg is). I run this command line: ffmpeg -i C:\Files\ffmpeg\video.mp4 -f mp3 -ab 320000 -vn music.mp3. The file is in the same folder as ffmpeg. I know I am missing something simple here, but what is it?
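
    A minimal command-line sketch of the usual first checks, using the paths from the question (the cd step and the quoting are assumptions about the fix, not something the post confirms):

        :: run from the folder that actually holds the file, and quote the path
        cd /d C:\Files\ffmpeg
        dir video.mp4
        ffmpeg -i "C:\Files\ffmpeg\video.mp4" -vn -ab 320000 -f mp3 "music.mp3"

    If dir cannot see video.mp4 either, the real file name or extension differs from what the command assumes (Explorer hides known extensions by default).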


  • Create or Open an .xlsx file having >256 columns in MS Excel 2003

    - by Daredev
    I'm using Microsoft Office 2003. I have installed the 'Microsoft Office Compatibility Pack for Word, Excel, PowerPoint 2007' to support the new XML-based formats (.docx, .xlsx, .pptx). Given that I have installed the Compatibility Pack, can I create or open a Microsoft Excel 2007 file (.xlsx) with more than 256 columns in Excel 2003? If not, how can I achieve this? My observation: when I open an .xlsx file in Excel 2003 with the Compatibility Pack, I can't see more than 256 columns (up to column IV).


  • Network issues with DNS not being found

    - by Anriëtte Combrink
    Hi there. This is exactly how our network looks: a single server with a network router. Everything is set up, but I cannot connect our Macs to this server under Login Options - Join... Our server's name is Toolbox and I have tried Toolbox.local and Toolbox.private, and prepended the afp:// protocol to the name, but nothing; our Macs just don't want to connect this way. Our router has DHCP and naturally gives out all the IP addresses. Would I have to add Toolbox.local to the DNS on the router and link it via a static internal IP to the server? Our Macs keep giving the following error while trying to join the Network Account Server: Unable to add server. Could not resolve the address (2200). What am I doing wrong?
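
    A minimal diagnostic sketch from one of the Macs; the Toolbox.local name is the one from the question, while the router address 192.168.1.1 is only an illustrative assumption:

        # can the Mac resolve and reach the server by name at all?
        ping -c 3 Toolbox.local
        # what the Mac's own resolver returns for the name
        dscacheutil -q host -a name Toolbox.local
        # does the router's DNS (assumed to be at 192.168.1.1) have a record for the server?
        dig @192.168.1.1 toolbox.private

    If ping by name fails but ping by the server's IP works, adding an A record for the server (plus a static IP or DHCP reservation) on the router is the usual next step.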


  • Redhat | Fuse | SSH file system

    - by MMRUSer
    I did manage to set up and configure FUSE and sshfs on my Red Hat EL 5.4, but when I run sshfs it outputs an error: sshfs: error while loading shared libraries: libfuse.so.2: cannot open shared object file: No such file or directory. I couldn't figure out the exact reason; requesting a helping hand. Thanks.
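
    A minimal sketch of the usual checks on RHEL 5; the /usr/local/lib path assumes FUSE was built from source, and the package names are the stock RHEL ones:

        # is libfuse.so.2 known to the dynamic linker at all?
        ldconfig -p | grep libfuse
        # if FUSE came from packages, make sure the runtime library is installed
        yum install fuse fuse-libs
        # if FUSE was built from source into /usr/local/lib, tell the linker about it
        echo /usr/local/lib > /etc/ld.so.conf.d/fuse.conf
        ldconfig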


  • Alternative to Amazon's S3 service?

    - by Cory
    Just wondering if there is a good alternative to Amazon's S3 service? I like S3, but the bandwidth cost is high. I looked at Cloud Files from Rackspace, but the cost is even higher. I don't mind prepaying or having a monthly payment in order to reduce the bandwidth cost greatly. Thank you for any help.


  • How to use a custom .bashrc file on SSH login

    - by gsingh2011
    I've found that with the new company I'm working with I often have to access Linux servers with relatively short lifetimes. On each of these servers I have an account, but whenever a new one is created, I have to go through the hassle of transferring over my .bashrc, and in about a month's time that server may not be around anymore. I also have to access many other servers for short periods of time (minutes), where it's just not worth transferring my .bashrc over; since I'm working on a lot of servers, this adds up to a lot of wasted time. I don't want to change anything on the servers, but I was wondering if there was a way to have a "per-connection" .bashrc, so whenever I SSH to a server my settings would be used for that session. If this is possible, it would be nice if I could do the same thing with other configuration files, like gitconfig.
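
    A minimal sketch of one common approach: copy a personal rc file over at connection time and start bash with it via --rcfile. The sshb name and the ~/.bashrc_remote file are purely illustrative, and the sketch does leave one file in the remote /tmp, so it is not strictly change-free:

        # usage: sshb user@shortlived-server
        sshb() {
          scp -q ~/.bashrc_remote "$1:/tmp/.bashrc_$USER" &&
          ssh -t "$1" "bash --rcfile /tmp/.bashrc_$USER -i"
        }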


  • Online Storage and security concerns

    - by Megge
    I plan to set up a small file server. I already own a small server at HostEurope (VirtualServer L, 250GB space), but they don't offer enough space (there is the HostEurope Cloud, but paying for bandwidth isn't an option here, and video streaming should be possible). Requirements summarized: 2TB of storage, ~15 users, file sizes under 100GB, and it should be easily reachable (mountable as a network drive, or at least with solid "communication" software).

    My first question: where can I get halfway affordable online storage, and how should I connect it to my server? Getting an additional server is a bit overkill, as I know of no hoster that allows 2TB on a small 2GHz dual-core, 2GB RAM machine (that would be enough by far, I just need a lot of space), and connecting it via NFS or FTP over the Internet seems a bit strange and cripples performance. Do you have any advice on where I could get that storage service? (I sent HostEurope a custom request today, but they haven't answered yet. If they can provide me with that space, this question becomes irrelevant, but the second one is the more important one anyway; just recommend something based on experience, you don't have to crawl for hours through hosting services.) Livedrive, for example, offers 5TB for 17 €/month; I'd be happy with 2TB for 20 €. The caveat: it doesn't allow multiple users, which leads me to my second question.

    Where are the security problems? Which protocol is sufficient (I want private and "public" folders, the usual "every user has their own space plus a public space" thing), secure and fast? I'd tend towards (S)FTP; the problem with FTP is that most of those hosting services don't even allow FTP with multiple users, and a single user leads me into "hacking" a solution (you could map the basic folder structure on the main server and just mount every subfolder from the storage, though things get difficult with a public folder with 644 permissions). Is using something like PKI or 802.1X overkill for private use?
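
    A minimal sketch of the "single account on the storage, multiple users on your own server" approach mentioned above, using sshfs; the host name, account and paths are illustrative assumptions, not a recommendation for a specific provider:

        # mount the remote SFTP space once on the existing HostEurope server (as root,
        # or with user_allow_other enabled in /etc/fuse.conf), then share subfolders from there
        sshfs -o allow_other,reconnect,ServerAliveInterval=15 \
              storageuser@storage.example.com:/data /srv/storage

    Per-user permissions are then enforced on your own server (filesystem permissions, Samba/Netatalk shares, or per-user SFTP chroots) rather than by the storage provider.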


  • Windows Home Server style redundancy/multi-disk-support on Windows Server 2008 R2?

    - by user19597
    I'm setting up a file server for our department. It'll be connected to the domain. I want it to have a very large amount of storage (several TB). Ideally, it should also preserve disk space by identifying identical files and only storing them once. It should be fault tolerant, so that if one of the drives fails, that drive can be replaced without losing any data. All of these features are available in Microsoft's consumer offering, Windows Home Server. However, I can't find these kinds of features within the enterprise Windows Server 2008 R2. Am I missing something? I know that I could buy a Drobo, or similar, and use that instead. However, I would prefer to use a built-in feature of Windows Server, should it exist. It seems surprising to me that these features should be available in Home Server but not in an enterprise file server.


  • Cyberduck Replacement?

    - by bLee
    Now that I've upgraded my Mac OS X to Snow Leopard, Cyberduck does not work. What are some good replacements? I would like support for both FTP and SFTP.


  • Fastest SFTP client

    - by Stan
    Protocol: SFTP (port 22). I've tested CuteFTP, FileZilla, SecureCRT and several others. It looks like CuteFTP has the best throughput, usually 200%-400% of the others. I've read here that SecureFTP may have a slower rate. Can anyone explain why CuteFTP has better throughput? And is there any other FTP client even faster than CuteFTP? Thanks a lot!
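
    One likely factor, offered as an assumption rather than something stated in the question: SFTP throughput on a high-latency link is dominated by how large the read/write requests are and how many the client keeps in flight. With OpenSSH's command-line sftp you can see the effect yourself (the values below are illustrative):

        # a larger request buffer (-B, bytes) and more outstanding requests (-R)
        # usually raise throughput on high-latency links
        sftp -B 262144 -R 64 user@host:/path/to/bigfile .

    Clients that pipeline aggressively (or open several parallel transfers) will look much faster than ones issuing one small request at a time.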


  • mkisofs - Floppy Image to Disk Image

    - by CommunistPancake
    I'm trying to compile MikeOS on Windows. I've successfully (I think) created a floppy image (.flp) of the operating system. I want to convert it to a disc image (.iso) so I can run it in VirtualBox. I've tried mkisofs -quiet -V 'MIKEOS' -input-charset iso8859-1 -o disk_images/mikeos.iso -b mikeos.flp disk_images/ which is the command in the Linux build script. It does create an ISO image, but when I try to run it in VirtualBox, I get a black screen. What am I doing wrong? Here's my build script.
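
    One thing worth checking, as an assumption rather than a confirmed fix: the path given to -b is interpreted relative to the directory being imaged, so the floppy image has to sit inside disk_images/ for El Torito floppy emulation to find it:

        :: make sure the boot image is inside the tree being imaged, then build the ISO
        copy mikeos.flp disk_images\
        mkisofs -quiet -V "MIKEOS" -input-charset iso8859-1 -o mikeos.iso -b mikeos.flp disk_images/

    If the ISO still boots to a black screen, the .flp itself may not be bootable; attaching it directly as a floppy image in VirtualBox would separate the two problems.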


  • Removing duplicate files, keeping only the newest file

    - by pinkie_d_pie_0228
    I'm trying to clean up a photo dump folder, in which several files are duplicated but with different filenames or lost in subfolders. I've looked at tools like rmlint, duff and fdupes, but I can't seem to find a way to have them keep only the file with the most recent timestamp. I suspect I have to postprocess the results, but I don't even know where to start to do this. Can anyone guide me on how to get the duplicate files list and delete everything but the newest file?
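
    A minimal post-processing sketch around fdupes, assuming filenames contain no newlines; the ~/photo-dump path is illustrative. fdupes prints each group of duplicates separated by a blank line, so the loop collects one group at a time, keeps its newest file and deletes the rest:

        { fdupes -r ~/photo-dump; echo; } | {
          group=()
          while IFS= read -r f; do
            if [ -z "$f" ]; then
              # end of one duplicate group: list newest first, delete everything after it
              [ "${#group[@]}" -gt 1 ] &&
                ls -t -- "${group[@]}" | tail -n +2 | while IFS= read -r dup; do rm -v -- "$dup"; done
              group=()
            else
              group+=("$f")
            fi
          done
        }

    Running it once with echo in place of rm -v is a cheap way to review what would be deleted before committing.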


  • Copy and Paste without needing to use the Mouse

    - by Paul Farry
    In previous versions of Windows, when you were copying files (Ctrl-C), Alt-Tabbing to the appropriate window and pasting (Ctrl-V), everything could be driven by the keyboard. With Vista and 7, it seems that if you are copying and pasting and the files already exist in the destination location, you have to click "Copy and Replace" with the mouse; there doesn't seem to be a keyboard way to do it. "Do this for the next {n} conflicts" can be driven by Alt-D, but "Copy and Replace" and "Don't Copy" cannot. Or is there a way, and have I just missed something? I'm more than happy to change the registry or something to enable keyboard shortcuts for this.


  • Replicate / SYNC / Copy thousands of files

    - by rihatum
    Windows 32-bit box (Server OS), 4GB RAM, with an 800GB LUN mapped to it as a local drive. Around 700GB of text files (yes, text files, plus a few thousand Word documents) nested in thousands of directories. I need to move this to new storage and map another server to it. What would be the best way to go about it? What I did was map the existing LUN to our new box, map a LUN from the new storage to the new box too, and try copying (Windows copy), but that wasn't good/fast enough considering the downtime. I am now looking for either a script that will do this or a utility (preferably open source/free) to move this amount of data at a good speed. The NICs are 2 x 1Gb, teamed (EtherChannel, 2Gbps). Any suggestions or pointers would be of great help. Thanks!
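
    A minimal sketch using robocopy, which ships with Server 2008 and later (and is available in the resource kit for older versions); the drive letters, thread count and log path are illustrative:

        :: /E copies subfolders including empty ones, /COPYALL preserves security and timestamps,
        :: /MT:32 needs the Server 2008 R2 / Windows 7 robocopy - drop it on older versions
        robocopy D:\ E:\ /E /COPYALL /DCOPY:T /R:1 /W:1 /MT:32 /NP /TEE /LOG:C:\migrate.log

    Because robocopy can be re-run and only copies what has changed, one common pattern is to do the bulk copy while the old share is still live and a short final pass during the downtime window.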


  • While in CMD shell, copying files from host OS to guest VM locks files (VMware Player/Workstation)

    - by Malcolm
    We're running the latest versions of VMWare Player and Workstation for Windows. The following behavior is identical across both products. Problem: We open a CMD prompt in our guest OS (XP, Vista, Windows 7) and copy files from our host OS using the standard CMD shell copy command: copy z:\C$\testfiles The copy completes successfully, but from that point forward, all the files that were copied to our guest OS are now LOCKED on our host OS. This does not happen if we use Windows Explorer to copy files - it only happens when files are copied via the CMD shell. As mentioned at the start of this question, this behavior is reproducible in both VMWare Player and VMWare Workstation across multiple machines and multiple guest OS's. I've googled for a workaround, but without success. Any ideas appreciated. Malcolm
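
    A small diagnostic sketch, offered as an assumption rather than a known fix: Sysinternals handle.exe run on the host can show which process is keeping the copied files open (typically the vmware-vmx process for that VM). The C:\testfiles path is the one implied by the question's copy command:

        :: run on the HOST from an elevated prompt
        handle.exe C:\testfiles

    Knowing which process holds the handles at least narrows the report to VMware's shared-folder/network layer rather than to the guest OS itself.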


  • How to place a Dropbox file into another Dropbox

    - by Nrew
    There is this file that I really want to download: the Kahel OS live CD. I tried downloading it at a 45kbps download rate and it would take 3 hours, and the connection is intermittent, so my download was cut off. Is it possible to treat the file as your own and put it into your own Dropbox, and continue the download from there?

