Search Results

Search found 5793 results on 232 pages for 'ftp sync'.

Page 55/232 | < Previous Page | 51 52 53 54 55 56 57 58 59 60 61 62  | Next Page >

  • Sound out of sync after merging multiple mp4 files with Avidemux

    - by Goto10
    I am trying to join (merge) two or more .mp4 files together, without re-encoding. Here is what I did:

    1. Started Avidemux 2.5.5.
    2. With File > Open, selected Input1.mp4. I received this message: "H.264 detected. If the file is using B-frames as reference it can lead to a crash or stuttering. Avidemux can use another mode which is safe but YOU WILL LOOSE SOME FRAME ACCURACY. Do you want to use that mode?". I chose "No".
    3. With File > Append, selected Input2.mp4. I received the same "H.264 detected" message again and chose "No".
    4. Changed the output Format from AVI to MP4.
    5. Saved the output file (called Output.mp4) with File > Save > Save Video.

    Unfortunately, when I play Output.mp4 in VLC, the sound is out of sync with the second video. How can I correct this?
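    If Avidemux keeps producing out-of-sync audio, one workaround (a hedged sketch, assuming a reasonably recent ffmpeg build with the concat demuxer; the file names are the ones from the question) is to do the join with ffmpeg instead, also without re-encoding:

        # list.txt - the inputs, one per line, in playback order
        file 'Input1.mp4'
        file 'Input2.mp4'

        # -c copy joins the streams without re-encoding
        ffmpeg -f concat -i list.txt -c copy Output.mp4

    Note that if the inputs differ in codec parameters, joining without re-encoding will misbehave in any tool, which is one possible cause of the drift here.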

    Read the article

  • One-Way Sync with Dropbox?

    - by user244805
    Is there any way I can mirror a Dropbox folder to my C drive by just running a portable file? Extra background information, because I know you guys hate it when you don't get the entire situation: I go back to university in the fall and I need a new storage solution. I decided to use Dropbox to sync my tiny university files (< 5 MB). I need to access these files from 4 machines:

    1. Windows 7 home machine
    2. Windows 7 university A machine
    3. Windows 7 university B machine
    4. Android tablet

    1 and 4 are a non-issue. The problem lies with 2 and 3. I want to be able to edit my files on 2 and 3, but those machines are not mine. There is an easy fix: run a portable version of the Dropbox syncer off a USB drive. But the problem is that I don't want to carry a USB drive around with me all the time. In that case, I can just run the small portable Dropbox syncer off the internet. But where will it store the files? A temporary directory on the C drive. There is only one issue left: there are hundreds of machines that I will randomly use that fit in categories 2 and 3. My portable Dropbox syncer will notice that the temporary directory is empty on each new PC I use and, instead of downloading my Dropbox folder to the machine, it syncs the other way around, i.e. it deletes my entire Dropbox. The solution is to mirror my Dropbox onto the temporary directory before running the Dropbox syncer, as sketched below.
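    A minimal sketch of that mirroring step, assuming robocopy (bundled with Windows 7) and placeholder paths for wherever a clean copy of the Dropbox folder lives:

        rem One-way mirror: make the temp directory an exact copy of the
        rem source BEFORE the portable syncer runs, so it never sees an
        rem empty folder. /MIR also deletes destination-only extras.
        robocopy "D:\DropboxMasterCopy" "C:\Temp\Dropbox" /MIR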

    Read the article

  • Customer won't provide SSH access - FTP only

    - by Max
    Eh, here is my problem: I am working in a web development agency (that's a problem, but not the real problem, read on). Most of the time I choose the live server myself when creating a new website project. But now the customer already has a "server" (10 GB on a cheapo host!) and the "admin" refuses to give me SSH access to it. But I need to access the server via shell because many files will be transferred (I need to be able to upload and extract a tar) and I need to insert or create MySQL dumps via the command line. He argues FTP and phpMyAdmin should be enough... As far as I know, the webspace was just ordered to host the website, so no security-critical apps are running there. How can I either convince the admin to give me the SSH login or tell management that we need our own server? Anyone with similar experiences? This is really annoying, as this is a very small project that should be done fast, and now one has to fight just to get the work done...

    Read the article

  • Online FTP or file sharing service [on hold]

    - by Frede
    We need to share large files with clients, e.g. clients upload a large file, we modify it and later make it available for download. Up until now we've used FTP, but this has a number of drawbacks: a lot of management of files, setting up accounts, etc. We are therefore considering online alternatives. Requirements:

    - Cheap 8-)
    - Easy to use, ideally just requiring a web browser, but also possible for power users to connect e.g. via FTPS/SFTP.
    - No registration required for users to upload/download files. We ourselves of course need to be able to log in to view uploaded files and upload new files.
    - No per-user fee.
    - High bandwidth. As files may be GBs in size, neither upload nor download speed can be too slow.
    - Secure. Encryption during upload/download. No way for users to access uploaded files: once a user has uploaded a file, they (and anyone else besides us) should not be able to access it. To download files, users get a link with a password. Ideally the link expires after a set time.
    - No software installation.

    We do NOT need any sync features, backup, versioning etc. Just a quick, easy, secure way for us to share files with our clients. Services like JustCloud, DriveHQ etc. seem bloated and "too much" for what we need. What other alternatives exist? Thanks!

    Read the article

  • MySQL replication out of sync? What commands do I run to sync it back up?

    - by Alex
    I have a master-master replication system. However, due to an auto-increment issue, I got an error in replication... and it stopped replicating. Someone told me to do:

        stop slave;
        SET GLOBAL SQL_SLAVE_SKIP_COUNTER = 1;
        start slave;

    It didn't work. Then they told me to do:

        SET GLOBAL SQL_SLAVE_SKIP_COUNTER = 2;

    It didn't work. Then, to test it out, I did:

        SET GLOBAL SQL_SLAVE_SKIP_COUNTER = 99999;

    It starts, but it is not updating. I created a table on DB1... and it is not showing up on DB2... Below is the status output for both DB1 and DB2 (I ran the commands at the same time):

        mysql> show master status\G
        *************************** 1. row ***************************
        File: mysql-bin.000605
        Position: 2019727
        Binlog_Do_DB:
        Binlog_Ignore_DB:
        1 row in set (0.00 sec)

        mysql> show slave status\G
        *************************** 1. row ***************************
        Slave_IO_State: Waiting for master to send event
        Master_Host:
        Master_User:
        Master_Port:
        Connect_Retry: 60
        Master_Log_File: mysql-bin.000605
        Read_Master_Log_Pos: 2008810
        Relay_Log_File: mysqld-relay-bin.001731
        Relay_Log_Pos: 10176595
        Relay_Master_Log_File: mysql-bin.000470
        Slave_IO_Running: Yes
        Slave_SQL_Running: Yes
        Replicate_Do_DB:
        Replicate_Ignore_DB:
        Replicate_Do_Table:
        Replicate_Ignore_Table:
        Replicate_Wild_Do_Table:
        Replicate_Wild_Ignore_Table:
        Last_Errno: 0
        Last_Error:
        Skip_Counter: 4255373725
        Exec_Master_Log_Pos: 10176458
        Relay_Log_Space: 135062517347
        Until_Condition: None
        Until_Log_File:
        Until_Log_Pos: 0
        Master_SSL_Allowed: No
        Master_SSL_CA_File:
        Master_SSL_CA_Path:
        Master_SSL_Cert:
        Master_SSL_Cipher:
        Master_SSL_Key:
        Seconds_Behind_Master: 1376343
        1 row in set (0.00 sec)

    How do I fix it so that they sync back up again? Thank you.
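    With the Skip_Counter blown out to 4255373725 and Relay_Master_Log_File stuck at mysql-bin.000470 while the master is on mysql-bin.000605, skipping statements will never catch up. The usual recovery is to reload the stale side from a fresh dump and re-point it; a hedged sketch (host and credentials are placeholders):

        -- On the up-to-date server:
        FLUSH TABLES WITH READ LOCK;
        SHOW MASTER STATUS;  -- note File and Position
        -- in another session: mysqldump --all-databases > dump.sql
        UNLOCK TABLES;

        -- On the stale server, after loading dump.sql:
        STOP SLAVE;
        CHANGE MASTER TO
          MASTER_HOST='db1.example.com',      -- placeholder
          MASTER_USER='repl',                 -- placeholder
          MASTER_PASSWORD='secret',           -- placeholder
          MASTER_LOG_FILE='mysql-bin.000605', -- File noted above
          MASTER_LOG_POS=2019727;             -- Position noted above
        START SLAVE;
        SHOW SLAVE STATUS\G

    For master-master setups, giving each node distinct auto_increment_increment / auto_increment_offset values prevents the original auto-increment collision from recurring.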

    Read the article

  • Remote desktop is slow when connecting to a computer which is part of a domain

    - by Peuge
    Hey all, we have two Windows 2003 machines; one is a DC and the other is joined to the DC's domain. These machines are not locally available to us, so we have to Remote Desktop into them. When we first got the machines, Remote Desktop was blazing fast, as the machines are only a couple of miles away. I then installed AD and set up Routing and Remote Access, and I also set up DNS on the DC. Now when I Remote Desktop into the machine which is part of the domain (not the DC), it is painfully slow! Remote Desktop onto the DC is also noticeably slower! Another problem is that FTP to the DC has also become slow. I don't know what other information I can provide, as I am new to system administration (moving over from development). The connection should be fast, as these machines are only a couple of miles away. Any help / suggestions are greatly appreciated! Thanks, Peuge
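    Slowness that appears right after promoting a DC and adding DNS/RRAS very often comes down to name resolution. A few hedged, non-destructive checks to run from the connecting side (the host names and address are placeholders):

        rem Does the name resolve, and via which DNS server?
        nslookup memberserver.example.local

        rem Slow RDP and FTP logins frequently trace back to slow or
        rem failing reverse DNS lookups of the addresses involved.
        ping -a 192.0.2.10

        rem Is traffic taking the expected route, or detouring via RRAS?
        tracert memberserver.example.local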

    Read the article

  • Setting up Pure-FTPd with admin/user permissions for same directory

    - by modulaaron
    I need to set up two Pure-FTPd accounts: ftpuser and ftpadmin. Both will have access to a directory that contains two subdirectories: upload and download. The permission criteria are as follows:

    - ftpuser can upload to /upload but cannot view the contents (blind drop).
    - ftpuser can download from /download but cannot write to it.
    - ftpadmin has full read/write permissions to both, including file deletion.

    Currently, the first two are not a problem: disabling /upload read access and /download write access for ftpuser did the job. The problem is that when a file is uploaded by ftpuser, its permissions are set to 644, meaning that user ftpadmin can only read it (note that all FTP directories are chown'd to ftpuser:ftpadmin). How can I give ftpadmin the power he so rightfully deserves?
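    One approach (a sketch, assuming a Debian-style Pure-FTPd package layout) is to loosen the server's umask so uploads arrive group-writable, and let the existing ftpuser:ftpadmin group ownership do the rest:

        # /etc/pure-ftpd/conf/Umask holds "<file umask> <dir umask>".
        # 117 -> files created 660, 007 -> directories created 770,
        # so the ftpadmin group gets read/write on ftpuser's uploads.
        echo "117 007" > /etc/pure-ftpd/conf/Umask
        service pure-ftpd restart

        # Equivalent when launching pure-ftpd by hand: pure-ftpd -U 117:007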

    Read the article

  • Copying files: SSH vs. SFTP

    - by jackquack
    I'm a bit of a Unix noob, and this question seems super basic, yet I can't find an answer anywhere. Basically, to my knowledge, SFTP is just FTP over SSH. So why can't I drag and drop files from one folder to another on the server side, like I can over SSH? Why, when I want to extract a .tar in a server folder, does the client first want to copy it to my machine and then back? Why can't it just extract it in place, like it can when I'm using the command line? I know that when I use the command line it is using the resources of the remote machine, but why can't SFTP do that too? Is there a way to execute commands which I would normally do over SSH, but in a GUI? I've tried mapping the drive to my own machine; I've tried so many SFTP clients that it's silly. Is there another class of program that I just don't know of?
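    For the concrete examples in the question, running the command remotely over SSH avoids the copy round-trip entirely; a sketch with a placeholder host and paths:

        # Extract an archive in place on the server - no download/upload:
        ssh user@server 'tar -xf /var/www/site/archive.tar -C /var/www/site'

        # Copy files between two server-side folders:
        ssh user@server 'cp -r /srv/data/folderA/. /srv/data/folderB/'

    The underlying reason GUI clients copy files back and forth is that the SFTP protocol has no "execute a command" operation; it only offers file operations, so anything else has to happen client-side or over a separate SSH channel.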

    Read the article

  • FileZilla Server Configuration Problems

    - by LiamB
    I've set up FileZilla Server on a Windows 2008 machine. I then created the user and password and added a shared folder, which I set as the home directory. I then connect to the server from the client computer:

        Status:   Connecting to {IP}
        Status:   Connection established, waiting for welcome message...
        Response: 220-Welcome To {NAME} FTP
        Response: 220 {DOMAIN}
        Command:  USER {USER}
        Response: 331 Password required for {USER}
        Command:  PASS *********
        Response: 230 Logged on
        Status:   Connected
        Status:   Retrieving directory listing...
        Command:  PWD
        Response: 257 "/" is current directory.
        Command:  TYPE I
        Response: 200 Type set to I
        Command:  PASV
        Response: 227 Entering Passive Mode ({}DATA)
        Command:  MLSD

    The connection works fine; however, no remote directory listing ever comes back. The current directory shows as "/", and uploading any file fails. Any suggestions on how to debug this further?
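    The trace stalling right after PASV/MLSD is the classic sign of blocked passive data ports. A hedged sketch for Windows Server 2008 (the port range is an arbitrary example): configure the same custom passive range in FileZilla Server's passive-mode settings, then open it in the firewall:

        rem Allow the control port and the passive data range
        netsh advfirewall firewall add rule name="FTP control" dir=in action=allow protocol=TCP localport=21
        netsh advfirewall firewall add rule name="FTP passive" dir=in action=allow protocol=TCP localport=50000-51000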

    Read the article

  • All my uploaded files have unusable permissions

    - by cosmicbdog
    I've just moved to a new server and have come across some strange permissions issues. Every file I upload has permissions of 600, owned by the user account and in that account's group. With these permissions, the server is unable to make changes to the files. The folder I'm uploading to (via regular FTP) has permissions of 755. Why is every new file I upload here given permissions of 600? And how do I change it so that new files are given permissions that allow the web server to modify them?
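    A 600 mode on every upload points at the FTP daemon's umask (077) rather than anything you are doing. As a hedged example, if the server happens to run vsftpd, the fix is one line in its configuration; other daemons have an equivalent setting:

        # /etc/vsftpd.conf - uploads land as 644 instead of 600
        local_umask=022
        # then restart the daemon, e.g. service vsftpd restart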

    Read the article

  • Scriptable FTPS client able to send Keep Alive to control port?

    - by schultkl
    We need an FTP client that satisfies the following constraints:

    1. Windows
    2. Command-line scriptable, so we can automate it... sorry, FileZilla (?)
    3. FTPS, as it seems to perform better than SFTP
    4. The ability to send keep-alive commands to the FTPS control port
    5. No passwords sent on the command line... sorry, curl

    Number 4, above, is critical: we have set keep-alive in some other clients (e.g., CoreFTP LE), but we seem to have some routing equipment in the server environment which drops our connection when transferring a 7 GB+ file. We have also set passive mode, and "resume transfer" functionality currently seems broken with this secure file transport server... so we need to download the file in one go. What FTPS clients might meet our needs?
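    One note on item 5: curl was ruled out for passwords on the command line, but it can read credentials from a .netrc file instead, which may put it back in play for everything except the control-channel keep-alives (a hedged sketch; the host and path are placeholders):

        # ~/.netrc, permissions 600:
        #   machine ftps.example.com login bob password s3cret

        # -n reads ~/.netrc; --ssl-reqd refuses to proceed without TLS
        curl -n --ssl-reqd -O ftp://ftps.example.com/big/file.bin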

    Read the article

  • Vantec NexStar NAS Enclosures - Writing large files

    - by peter
    I have one of these 'Vantec NexStar LX - NST-475LX-BK' drive enclosures. It is a NAS device. When I write a file to the device using eSATA or an SMB share, I cannot write files over 4 GB. I think this is because the drive is formatted with FAT32. But when I access the device using FTP, it doesn't matter: I can write files of any size. E.g., I wrote one on there last night which was 30 GB. Does this make any sense? Why? I guess the most important thing for me is data integrity.

    Read the article

  • Other users on my Server?

    - by Jennifer Weinberg
    I'm buying a server from a person (whom I don't know really well) and I want to make sure that the previous owner no longer has any access. It's an Ubuntu virtual server and I have already received admin access (via shell). How can I find out whether there are still other accounts left that can access my server (e.g. a still-existing shell account, FTP, or another type of user account)? And how can I delete them if they exist? Best regards, Jennifer
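    A few hedged, non-destructive commands for that audit ('olduser' is a placeholder):

        # Accounts that still have a usable login shell:
        awk -F: '$7 !~ /(false|nologin)$/ {print $1, $7}' /etc/passwd

        # SSH keys that would grant access without any password:
        ls -l /root/.ssh/authorized_keys /home/*/.ssh/authorized_keys 2>/dev/null

        # Recent logins:
        last -a | head

        # Lock an unwanted account (keep a working shell open while testing):
        usermod --lock --shell /usr/sbin/nologin olduser
        # or remove it entirely: deluser --remove-home olduser

    It is also worth changing every password you were given; and if you cannot fully trust the previous owner, reinstalling from a clean image is the only certain cure, since prior root access could have left backdoors beyond user accounts.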

    Read the article

  • How to schedule a backup of the database and configuration in Plesk?

    - by Dilip Rajkumar
    I'd like to set up a scheduled backup in Plesk 10.4. My goal is to take a backup of the database and configuration and put it in an FTP location, but I don't see such an option in the Backup Manager in Plesk. Also, there is a setting "Suspend domain until backup task is completed"; if I uncheck it, will that be a problem? And is there a way to back up a specific database of a domain in Plesk? Any help is greatly appreciated. Thanks in advance.
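    If the Backup Manager will not do it, a cron job outside Plesk can at least cover the database half; a hedged sketch with placeholder names and credentials (the upload uses curl's FTP support):

        #!/bin/sh
        # Nightly dump of one database, pushed to FTP storage.
        STAMP=$(date +%F)
        mysqldump -u dbuser -p'dbpass' example_db | gzip > /tmp/example_db-$STAMP.sql.gz
        curl -T /tmp/example_db-$STAMP.sql.gz ftp://backup.example.com/plesk/ --user ftpuser:ftppass
        rm /tmp/example_db-$STAMP.sql.gz

    On the "Suspend domain until backup task is completed" box: unchecking it keeps the site up during the backup, at the cost of a small chance of an inconsistent snapshot if files change while the backup runs.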

    Read the article

  • Distribute terabyte files to the public from a web server

    - by MarkJ
    Hi, we need to set up a website which makes two or three large files publicly available; the files will be 1 or 2 terabytes each. Although they will be public, in practice I expect only a relatively small number of scientists will want to download them. What is the best way to allow this? I've had a quick talk to a web-hosting provider (Rackspace) and they suggested a hybrid solution:

    - An entry-level managed server (we predict fairly low traffic for the website, but we do need to install some custom CGI software).
    - Some cloud storage which hooks into Limelight Networks. This would host the large files, for download by FTP.

    It sounded OK to me, but I know relatively little about server administration. Does it make sense? Thanks in advance, Mark

    Read the article

  • Windows file sharing connects over WiFi instead of LAN

    - by zacaj
    I have a laptop and a desktop computer, and I need to sync lots of files to the laptop and back whenever I go on a trip, etc. I've got a LAN cable connected to an extra port on the desktop that I plug into the laptop, so I can get gigabit file transfers instead of wireless G. They connect fine. If I do an FTP transfer, for instance, using the LAN IP addresses, it goes at ~40 MB/s, as it should. However, when I copy files using Explorer and native Windows file sharing, it finds the other computer by name, not IP (e.g. \\DESKTOP-PC instead of \\192.168.0.100), and always connects to it by its wireless IP address instead of the faster LAN address. Both computers are running Windows 7. I have tried editing the priorities of the adapters in Advanced Settings, putting the LAN adapters above the WiFi ones, but this didn't have any effect.
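    One low-tech fix (a sketch; the name and address are placeholders for your desktop's) is to pin the desktop's name to its LAN address in the laptop's hosts file, so \\DESKTOP-PC resolves over the cable:

        # C:\Windows\System32\drivers\etc\hosts  (edit as Administrator)
        192.168.0.100    DESKTOP-PC

    The trade-off is that the name stops resolving when the cable is unplugged; the zero-configuration alternative is simply to address the share as \\192.168.0.100\share for big transfers.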

    Read the article

  • Experience with MQ File Transfer Edition?

    - by mfinni
    We've got several processes that move files across servers: SFTP, FTP, SCP; Windows, Linux, AIX. There is a workflow component (usually requiring a control file with filenames and hash values to move a batch of related files). The action is often initiated on our servers to get the files, so we need to make sure they're done being written. We have some homegrown scripts to do this, but they don't always work properly, and troubleshooting, maintenance, and log review are not easy this way. There are a lot of servers, and our scripts don't have central logging or a dashboard/console/etc. We're looking into commercial products to do this. Has anyone used MQ File Transfer Edition? Another team in our company is using Aspera; does anyone have any thoughts on that, or other favored products? I have no idea what our budget is for this yet. Just trying to get a handle on the product space from the perspective of other admins.

    Read the article

  • Change permissions of files owned by 'apache'

    - by Dotty
    Hey, I have some files on my server with the owner set to "apache"; I'm not quite sure how this happened. Anyway, I need to change the permissions of these files to 0777 so I can download/edit them. However, I cannot. I'm using a 1and1 Linux server and use Plesk to administrate it. I have the ability to log in via SSH. However, if I run chmod or chown I get a "permission denied" error, and if I try to sudo chmod or chown, it says the command cannot be found. When I go to edit my domain details, I get the option "Shell access to server with FTP user's credentials" with these choices:

    - /bin/sh
    - /bin/bash
    - /sbin/nologin
    - /bin/bash (chrooted)
    - /bin/rbash

    Any ideas how I should go about changing the permissions or changing the owner? Thanks
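    Without root you cannot chmod files you do not own, but the web server can change its own files. A common workaround (a hedged sketch; the file name and domain are placeholders, and the script must be deleted immediately afterwards) is to have Apache do the chmod for you via a throwaway PHP file uploaded over FTP:

        # fixperms.php, uploaded via FTP into the affected directory:
        #   <?php foreach (glob(__DIR__ . '/*') as $f) chmod($f, 0666); echo "done"; ?>
        # Trigger it once, then delete it over FTP:
        wget -qO- http://www.example.com/fixperms.php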

    Read the article

  • Transfer code from one server to another

    - by Kamlesh Bhure
    I want to transfer new code to my new production server. I have the code files on my development server. Instead of uploading files via FTP from my local machine, is there another way to transfer code from one server to the other? What I am thinking: I will make an archive of the whole codebase to transfer and place it in the webroot, so that it is accessible on the internet at a link like

        http://www.mydomain.com/code.tar.gz

    Then on the other server I will just run the command:

        wget http://www.mydomain.com/code.tar.gz

    Will this transfer be done in a few seconds? And is this a correct approach?
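    It will usually be fast, since the transfer runs server-to-server, but while the archive sits in the webroot anyone who guesses the URL can download the entire codebase. A hedged sketch that adds an unguessable name, a checksum, and immediate cleanup (the paths are placeholders):

        # On the source server:
        NAME=code-$(openssl rand -hex 8).tar.gz
        tar -czf /var/www/html/$NAME -C /var/www/myapp .
        sha256sum /var/www/html/$NAME    # note the hash and the random name

        # On the destination server (set NAME to the same value):
        wget http://www.mydomain.com/$NAME
        sha256sum $NAME                  # compare with the hash noted above
        tar -xzf $NAME -C /var/www/myapp

        # Back on the source: remove the archive from the webroot right away
        rm /var/www/html/$NAME

    If the two servers can reach each other over SSH, scp or rsync between them avoids exposing the archive at all.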

    Read the article

  • Deny directory browsing in a ProFTPD / Ubuntu installation

    - by skylarking
    I used this guide to set up a ProFTPD installation on an Ubuntu 8.04 server. It works well, but the generic user (userftp) can run ls and is able to change to any directory and browse freely on the server, from the root / on up. I added the line /bin/false to /etc/shells in the hope that that would prevent this. I really only want the userftp account to be able to upload to the generic /home/FTP-Shared directory, and be able to do nothing else on the server. How is this accomplished? This is a headless Ubuntu box and I am using the CLI only, no GUI admin tools.
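    A note on that /etc/shells edit: listing /bin/false there makes a false shell count as valid for login checks, so it loosens rather than tightens things. What actually stops the browsing in ProFTPD is jailing the user with DefaultRoot; a hedged sketch for /etc/proftpd/proftpd.conf:

        # Jail FTP logins to the shared upload area:
        DefaultRoot /home/FTP-Shared

        # Let accounts whose shell is /bin/false still log in over FTP:
        RequireValidShell off

    Restart proftpd afterwards; userftp then sees /home/FTP-Shared as / and cannot ascend out of it.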

    Read the article

  • Setting up a DNS port redirect?

    - by Svetlana
    I have a domain using CloudFlare's DNS, and I want to make it point to my server's IP (dynamic IP, port 21 blocked by my ISP), which at the moment uses a No-IP DNS name. The current setup is that I have a subdomain as a CNAME targeting the No-IP domain, but that only works for things like the Minecraft server (which looks for a set port that isn't blocked by my ISP). I'd like a solution that lets me redirect port 21 on the CloudFlare domain to port 2121 on the No-IP domain, or something else that points to my dynamic IP, where an FTP server is already set up and running. SRV records have been mentioned to me, but without any further help, and that only made me more confused. Thanks in advance for the help.
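    For context on the SRV suggestion: an SRV record just advertises a host and port for a named service, and ordinary FTP clients do not look SRV records up, so it cannot transparently unblock port 21. A hedged example of what such a record looks like (the names are placeholders):

        ; _service._proto.name        TTL   class  pri weight port  target
        _ftp._tcp.example.com.        3600  IN SRV 10  0      2121  myhost.no-ip.org.

    In practice, clients that support nonstandard ports will happily take ftp://example.com:2121, so publishing the port in the URL (or handing out the No-IP name directly) is the realistic route.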

    Read the article

  • VSFTPD: Cannot figure this thing out...

    - by A Wizard Did It
    Alright, I've been giving this the best that I can, reading through various tutorials on Google, but I cannot seem to get vsftpd running the way I want. For a short while I had it working with one account, but then that stopped and I haven't been able to get it to work since. I've since reformatted and reinstalled Ubuntu 10.04 LTS. I ran apt-get install vsftpd and that's where I am now... I'd really appreciate it if anyone could help me understand exactly how this is supposed to work: how do I add FTP accounts and set their home directory to something like /var/www/public_html?
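    A minimal sketch of one way to do it with local users, assuming the stock Ubuntu 10.04 vsftpd package (the user name is a placeholder):

        # An FTP-only account whose home is the web root:
        sudo useradd -d /var/www/public_html -s /bin/false webftp
        sudo passwd webftp
        sudo chown webftp /var/www/public_html
        # If logins then fail, add /bin/false to /etc/shells
        # (vsftpd's PAM stack checks for a valid shell).

        # Relevant lines in /etc/vsftpd.conf:
        #   local_enable=YES
        #   write_enable=YES
        #   chroot_local_user=YES   # jail users to their home directory
        sudo service vsftpd restart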

    Read the article

  • What is the proper way to set up the Apache document root in terms of privileges?

    - by racl101
    I have just installed Ubuntu 9.10 Server Edition on my machine and I wish to run my own personal local server with other users on the same LAN. First, what directory structure is best for the web root? Should I just use /var/www/ and start throwing web documents there, or should I create a folder elsewhere (maybe under a home directory)? Second, in the /var/www/ directory only the root user can create documents; however, I wish to have other users be able to create files in the document root and upload them via FTP. Should I change the permissions of the www/ folder? Or, again, should I create the document root elsewhere with different permissions? What is the safest way of doing this?
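    A common pattern (a sketch, assuming the stock www-data group used by Apache on Ubuntu; 'alice' is a placeholder): keep /var/www but hand it to a shared group, with the setgid bit so new files inherit that group:

        sudo adduser alice www-data
        sudo chown -R root:www-data /var/www
        # setgid on directories: files created inside inherit www-data
        sudo find /var/www -type d -exec chmod 2775 {} \;
        sudo find /var/www -type f -exec chmod 664 {} \;

    Users in www-data can then create and edit files in the web root without anything being world-writable.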

    Read the article

  • WebRequest using SSL

    - by pm_2
    I have the following code to retrieve a file using FTP (which works fine):

        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(svrPath);
        request.KeepAlive = true;
        request.UsePassive = true;
        request.UseBinary = true;
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.Credentials = new NetworkCredential(uname, passw);

        using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
        using (Stream responseStream = response.GetResponseStream())
        using (StreamReader reader = new StreamReader(responseStream))
        using (StreamWriter destination = new StreamWriter(destinationFile))
        {
            destination.Write(reader.ReadToEnd());
            destination.Flush();
        }

    However, when I try to do this using SSL, I am unable to access the file:

        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(svrPath);
        request.KeepAlive = true;
        request.UsePassive = true;
        request.UseBinary = true;
        // The following line causes the download to fail
        request.EnableSsl = true;
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.Credentials = new NetworkCredential(uname, passw);

        using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
        using (Stream responseStream = response.GetResponseStream())
        using (StreamReader reader = new StreamReader(responseStream))
        using (StreamWriter destination = new StreamWriter(destinationFile))
        {
            destination.Write(reader.ReadToEnd());
            destination.Flush();
        }

    Can anyone tell me why the latter would not work?
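    A frequent cause of exactly this symptom is the server presenting a certificate that fails .NET's validation (self-signed, or issued for a different host name). As a diagnostic sketch only (accepting every certificate must not ship in production), a validation callback will confirm whether that is the problem:

        // Hook certificate validation before creating the request.
        // WARNING: for diagnosis only; this accepts any certificate.
        ServicePointManager.ServerCertificateValidationCallback +=
            (sender, certificate, chain, sslPolicyErrors) => true;

        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(svrPath);
        request.EnableSsl = true;
        // ... the rest as in the SSL version above

    Note also that FtpWebRequest only speaks explicit FTPS (AUTH TLS on port 21); if the server expects implicit FTPS on port 990, EnableSsl alone will never connect.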

    Read the article
