Search Results

Search found 5793 results on 232 pages for 'ftp sync'.


  • Easiest solution to sync an offline (local desktop application) database with a central server and multiple PCs?

    - by tyfius
    I have a desktop application which uses a local database. (This could be SQLite, SQL CE, PostgreSQL or any other database I am able to install locally; I haven't decided which one to use yet.) The plan is to achieve the following: a user can subscribe to some kind of cloud service. If he does, his local database should be synced with the online database (one for all users, or one per user, whichever is easiest) so that he can sync his local data between multiple PCs and access it online, much like Dropbox does for files. What is the best, easiest (and preferably cheapest) solution to achieve this? I am looking into DataObjects.net but I can't find much documentation about their Sync feature. Or are there other alternatives? For example, could I start with some kind of cloud service which allows local caching, and use the local cache for users who do not subscribe to the service? Any pointers, tips or experiences are welcome.
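
    For illustration, a minimal sketch of the simplest shape this could take, assuming a hypothetical HTTPS endpoint and a local table with a last_modified column; authentication and conflict handling are deliberately omitted:

        import json, sqlite3, urllib.request

        SERVER = "https://sync.example.com/push"   # hypothetical endpoint

        def push_changes(db_path, last_sync_ts):
            # Collect rows changed since the last successful sync.
            conn = sqlite3.connect(db_path)
            rows = conn.execute(
                "SELECT id, payload, last_modified FROM items"
                " WHERE last_modified > ?", (last_sync_ts,)).fetchall()
            body = json.dumps(
                [{"id": i, "payload": p, "ts": t} for i, p, t in rows]).encode()
            req = urllib.request.Request(
                SERVER, data=body, headers={"Content-Type": "application/json"})
            urllib.request.urlopen(req)   # server merges by (id, ts): newest wins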


  • SQL SERVER – Extending SQL Azure with Azure worker role – Guest Post by Paras Doshi

    - by pinaldave
    This is a guest post by Paras Doshi. Paras Doshi is a research intern at SolidQ.com and a Microsoft Student Partner. He is currently working in the domain of SQL Azure.

    SQL Azure is, in essence, SQL Server in the cloud. SQL Azure provides benefits such as on-demand rapid provisioning, cost-effective scalability, high availability and reduced management overhead. For an introduction to SQL Azure, check out the post by Pinal here.

    In this article, we are going to discuss how to extend SQL Azure with the Azure worker role. In other words, we will write custom code and host it in the Azure worker role; the aim is to add some features that are not currently available with SQL Azure, or features that need to be customized for flexibility. This way we extend SQL Azure's capability by building solutions that run on Azure as worker roles. To understand the Azure worker role, think of it as a Windows service in the cloud: it can perform background processes, which makes it an ideal tool for processes such as synchronization and backup.

    First, we will focus on writing worker role code that synchronizes SQL Azure databases. Before we do so, let's see some scenarios in which synchronization between SQL Azure databases is beneficial:

    - Scaling out access over multiple databases enables us to handle workload efficiently.
    - As of now, a SQL Azure database can be hosted in any one of six datacenters. By synchronizing databases located in different datacenters, one can extend the data by enabling access to geographically distributed data.

    And some scenarios in which SQL Server to SQL Azure synchronization is beneficial:

    - Backing up a SQL Azure database on local infrastructure.
    - Rather than investing in local infrastructure for increased workloads, such workloads could be handled by the cloud.
    - The ability to extend data to different datacenters located across the world, enabling efficient data access from remote locations.

    Now, let us develop a cloud-based app that synchronizes SQL Azure databases. For an introduction to developing cloud-based apps, click here. In this article, I aim to provide a bird's-eye view of what code that synchronizes SQL Azure databases looks like, and then list resources that can help you develop the solution from scratch. If you newly add a worker role to the cloud-based project, this is what the code will look like. (Note: I have added comments to the skeleton code to point out the modifications that will be required to carry out the SQL Azure synchronization. Note the placement of the Setup() and Sync() functions.) Click here (http://parasdoshi1989.files.wordpress.com/2011/06/code-snippet-1-for-extending-sql-azure-with-azure-worker-role1.pdf).

    Enabling SQL Azure database synchronization through the Sync Framework is a two-step process. In the first step, the database is provisioned: the Sync Framework creates tracking tables, stored procedures, triggers, and tables to store metadata that enable synchronization. This is a one-time step, and its code goes in the Setup() function, which is called once when the worker role starts. The second step is the continuous (or on-demand) synchronization of the SQL Azure databases by propagating changes between them. This is done on a continuous basis by calling the Sync() function in the while loop; the logic to synchronize changes between the SQL Azure databases goes there. Discussing the coding part step by step is out of the scope of this article.
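
    For readers who cannot open the linked PDFs: the actual worker role is .NET, but the pattern described here is small enough to sketch in Python; setup() and sync() below are placeholders for the provisioning and change-propagation logic discussed in this article:

        import time

        def setup():
            # One-time provisioning: the Sync Framework creates tracking
            # tables, stored procedures, triggers and metadata tables.
            pass

        def sync():
            # Propagates changes between the SQL Azure databases.
            pass

        setup()          # runs once when the worker role starts
        while True:      # the worker role's run loop
            sync()       # continuous (or on-demand) synchronization
            time.sleep(60)
            # For the scheduled variant discussed below (daily at 10:00 pm),
            # replace the unconditional sync() with a clock check such as
            # "if datetime.now().hour == 22: sync()".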
    Therefore, let me point you to a resource, given here. Also, note that before you start developing the code, you will need to install the Sync Framework 2.1 SDK (download here), and you will need to reference some libraries before you start coding; details are available in the article I just pointed to. You will be charged for data transfers if the databases are not in the same datacenter. For pricing information, go here.

    Currently, a tool named Data Sync, which is built on top of the Sync Framework, is available in CTP; it allows SQL Azure <-> SQL Server and SQL Azure <-> SQL Azure synchronization without writing a single line of code. However, in some cases the custom code shown in this blog post provides flexibility that is not available with Data Sync. For instance, filtering is not supported in the SQL Azure Data Sync CTP2; if you wish to have such functionality now, you have the option of developing custom code using the Sync Framework.

    This code can easily be extended to synchronize on a schedule. Let us say we want the databases to get synchronized every day at 10:00 pm; this is what the code will look like now: (http://parasdoshi1989.files.wordpress.com/2011/06/code-snippet-2-for-extending-sql-azure-with-azure-worker-role.pdf)

    Don't you think that by writing such code, we are imitating the functionality provided by the SQL Server Agent for SQL Server? Think about it: we are scheduling our administrative task by writing custom code; in other words, we have developed a "lightweight SQL Server Agent for SQL Azure!" Since the SQL Server Agent is not currently available in the cloud, we have developed a solution that enables us to schedule tasks, and thus we have extended SQL Azure with the Azure worker role. If you wish to track jobs, you can do so by storing this data in SQL Azure (or Azure tables); Windows Azure is a stateless platform, so we need to store the state of the job ourselves, and the choices are SQL Azure or Azure tables. Note that this solution requires custom code and is not UI-driven; however, for now, it can act as a temporary solution until the SQL Server Agent is made available in the cloud. It does not encompass all the functionality that the SQL Server Agent provides, but it does open up an interesting avenue for scheduling tasks such as backup and synchronization of SQL Azure databases by writing some custom code in the Azure worker role.

    Now, let us see one more possibility: running BCP through a worker role in Azure-hosted services and then uploading the backup files either locally or to blobs. If you upload locally, consider the data transfer cost. If you upload to blobs residing in the same datacenter, no transfer cost applies, but the cost of the blob storage does. So, before choosing an option, evaluate your preferences, keeping the cost associated with each option in mind.

    In this article, I have shown that an Azure worker role solution can be developed to synchronize SQL Azure databases, and that a lightweight SQL Server Agent for SQL Azure can be built the same way. We also discussed the possibility of running BCP through a worker role in Azure-hosted services to back up our precious SQL Azure data. Thus, we can extend SQL Azure with the Azure worker role. But remember: you will be charged for running Azure worker roles. So at the end of the day, you need to ask: am I willing to build custom code and pay money to achieve this functionality?

    I hope you found this blog post interesting. If you have any questions/feedback, you can comment below or mail me at Paras[at]student-partners[dot]com.

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Azure, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology


  • Access FTPS from behind Forefront TMG

    - by Maxim V. Pavlov
    I have a web server on which IIS 7 hosts an SSL-enabled FTP site. The client I am trying to connect with is behind a corporate Forefront TMG. The app is Total Commander, a file manager that can connect over FTPS by checking SSL/TLS in the FTP connection settings. When the FTP Access Filter in Forefront is enabled, my connection attempt fails at the "Negotiating TLS" step of the FTP connection. The same happens even if I enable Allow Active FTP in the filter's settings. But when I disable the FTP Access Filter in Forefront completely, I am able to connect fine. How do I configure Forefront TMG to allow FTPS?
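
    For context, the filter trips because in explicit FTPS the client sends AUTH TLS and the control channel is encrypted from then on, so an FTP application-layer filter can no longer read the PORT/PASV commands it needs to track data connections. The same negotiation can be reproduced outside Total Commander with Python's standard library (host and credentials are placeholders):

        from ftplib import FTP_TLS

        ftps = FTP_TLS("ftp.example.com")   # placeholder host
        ftps.login("user", "password")      # issues AUTH TLS before the credentials
        ftps.prot_p()                       # encrypt the data channel as well
        print(ftps.nlst())                  # stalls the same way if the filter blocks TLS
        ftps.quit()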


  • Force initial Google Drive sync with a non-empty folder?

    - by Terrance Shaw
    I upgraded my iMac with an SSD last night and restored from a Time Capsule backup. Everything now feels substantially zippier and overall better, with one exception: Google Drive refuses to continue syncing with the Google Drive folder it had been using before the upgrade. I ultimately ended up having to delete the folder and let it resync from scratch to get past its stubborn error (alternatively, I suppose I could've simply moved the contents out, set the path to the now-empty folder, then moved them back). Is there any way to get past this particular issue (for future reference), or is it something Google put in place to ensure that a new user doesn't go and specify their root drive as the backup destination?


  • CentOS 6.3 vsftpd unable to upload file to Apache web server

    - by user148648
    I am new to CentOS; I worked with Sun Solaris before and uploaded files to the Apache web server there. I created an end-user account and I can FTP to the server from a command prompt, but uploads fail with the error message '226 Transfer Done (but failed to open directory)'. The content of my vsftpd.conf is below:

        # Example config file /etc/vsftpd/vsftpd.conf
        #
        # The default compiled in settings are fairly paranoid. This sample file
        # loosens things up a bit, to make the ftp daemon more usable.
        # Please see vsftpd.conf.5 for all compiled in defaults.
        #
        # READ THIS: This example file is NOT an exhaustive list of vsftpd options.
        # Please read the vsftpd.conf.5 manual page to get a full idea of vsftpd's
        # capabilities.
        #
        # Allow anonymous FTP? (Beware - allowed by default if you comment this out).
        anonymous_enable=YES  # ** may need to comment it back
        #
        # Uncomment this to allow local users to log in.
        local_enable=YES
        #
        # Uncomment this to enable any form of FTP write command.
        write_enable=YES
        #
        # Default umask for local users is 077. You may wish to change this to 022,
        # if your users expect that (022 is used by most other ftpd's)
        #local_umask=022
        local_umask=077
        #
        # Uncomment this to allow the anonymous FTP user to upload files. This only
        # has an effect if the above global write enable is activated. Also, you will
        # obviously need to create a directory writable by the FTP user.
        anon_upload_enable=YES  # *** maybe to comment it back!!!
        #
        # Uncomment this if you want the anonymous FTP user to be able to create
        # new directories.
        anon_mkdir_write_enable=YES  # ** may need to comment it back!!!
        #
        # Activate directory messages - messages given to remote users when they
        # go into a certain directory.
        dirmessage_enable=YES
        #
        # The target log file can be vsftpd_log_file or xferlog_file.
        # This depends on setting xferlog_std_format parameter
        xferlog_enable=YES
        #
        # Make sure PORT transfer connections originate from port 20 (ftp-data).
        connect_from_port_20=YES
        #
        # If you want, you can arrange for uploaded anonymous files to be owned by
        # a different user. Note! Using "root" for uploaded files is not
        # recommended!
        #chown_uploads=YES
        #chown_username=whoever
        #
        # The name of log file when xferlog_enable=YES and xferlog_std_format=YES
        # WARNING - changing this filename affects /etc/logrotate.d/vsftpd.log
        xferlog_file=/var/log/xferlog
        #
        # Switches between logging into vsftpd_log_file and xferlog_file files.
        # NO writes to vsftpd_log_file, YES to xferlog_file
        xferlog_std_format=YES
        #
        # You may change the default value for timing out an idle session.
        #idle_session_timeout=600
        #
        # You may change the default value for timing out a data connection.
        #data_connection_timeout=120
        #
        # It is recommended that you define on your system a unique user which the
        # ftp server can use as a totally isolated and unprivileged user.
        #nopriv_user=ftpsecure
        #
        # Enable this and the server will recognise asynchronous ABOR requests. Not
        # recommended for security (the code is non-trivial). Not enabling it,
        # however, may confuse older FTP clients.
        #async_abor_enable=YES
        #
        # By default the server will pretend to allow ASCII mode but in fact ignore
        # the request. Turn on the below options to have the server actually do ASCII
        # mangling on files when in ASCII mode.
        # Beware that on some FTP servers, ASCII support allows a denial of service
        # attack (DoS) via the command "SIZE /big/file" in ASCII mode. vsftpd
        # predicted this attack and has always been safe, reporting the size of the
        # raw file.
        # ASCII mangling is a horrible feature of the protocol.
        ascii_upload_enable=YES
        ascii_download_enable=YES
        #
        # You may fully customise the login banner string:
        ftpd_banner=Warning, only for authorize login.
        #
        # You may specify a file of disallowed anonymous e-mail addresses. Apparently
        # useful for combatting certain DoS attacks.
        #deny_email_enable=YES
        # (default follows)
        #banned_email_file=/etc/vsftpd/banned_emails
        #
        # You may specify an explicit list of local users to chroot() to their home
        # directory. If chroot_local_user is YES, then this list becomes a list of
        # users to NOT chroot().
        chroot_local_user=YES
        chroot_list_enable=YES
        # (default follows)
        #chroot_list_file=/etc/vsftpd/chroot_list
        local_root=/var/www
        #
        # You may activate the "-R" option to the builtin ls. This is disabled by
        # default to avoid remote users being able to cause excessive I/O on large
        # sites. However, some broken FTP clients such as "ncftp" and "mirror" assume
        # the presence of the "-R" option, so there is a strong case for enabling it.
        ls_recurse_enable=YES
        #
        # When "listen" directive is enabled, vsftpd runs in standalone mode and
        # listens on IPv4 sockets. This directive cannot be used in conjunction
        # with the listen_ipv6 directive.
        listen=YES
        #
        # This directive enables listening on IPv6 sockets. To listen on IPv4 and IPv6
        # sockets, you must run two copies of vsftpd with two configuration files.
        # Make sure, that one of the listen options is commented !!
        #listen_ipv6=YES
        pam_service_name=vsftpd
        userlist_enable=YES
        tcp_wrappers=YES


  • Nagios3: Conditional operators for service checks?

    - by Dave
    I'm trying to set up Nagios to monitor my various servers, using hostgroups to define "machine roles" and running service checks against the machines by role. However, I'd like conditional operators that would let me run a service check against the intersection of two hostgroups rather than their union, i.e. using &&, ||, or () operators. For example, imagine I have the following servers:

    - www-eu: Linux WWW (Apache) server, in the EU
    - www-us: Windows WWW (IIS) server, in the US (West coast)
    - ftp-eu: Linux FTP server, in the EU
    - ftp-us: Windows FTP server, in the US

    I would want to create the following hostgroups:

    - US-Servers: www-us, ftp-us
    - EU-Servers: www-eu, ftp-eu
    - WWW-Servers: www-us, www-eu
    - FTP-Servers: ftp-us, ftp-eu

    Now say I'm interested in checking the HTTP response time for my web servers. Let's say this particular Nagios service runs from the US (West Coast), and that I have a command called check_http_response_time. This command checks the responsiveness of the HTTP server, and I can pass it an argument defining the max response time before raising critical. My command might look like:

        check_http_response_time $HOSTNAME$ 50

    Traditionally, I can run my checks by specifying a list of hosts or hostgroups:

        define service{
            use                 local-service
            hostgroup_name      WWW-Servers   # Servers = www-us, www-eu
            servicegroups       WWW Checks
            service_description Check HTTP Response Time
            check_command       check_http_response_time!50
        }

    However, with the above service definition, given that my Nagios service is in US West, I could reasonably expect my EU server to return critical. Really, I want different thresholds for each region (50 for US West, 200 for EU). I would have to duplicate my service for each host and set its custom threshold, or alternatively split my servicegroups out by role and region (i.e. WWW-Servers-EU) and run my specific thresholds against those. Though the latter is better, both are much messier than I'd like. What I would love, and what this post is asking for, is a way to use hostgroups to perform an intersection using conditional logic, rather than a simple union. It might look like:

        define service{
            use                 local-service
            hostgroup_name      WWW-Servers && US-Servers
            servicegroups       WWW Checks
            service_description Check HTTP Response Time
            check_command       check_http_response_time!50
        }

    It would then run the check only against servers that are in both WWW-Servers and US-Servers; in my example, just www-us. The benefits of such a feature would be significant for large-scale Nagios configurations. Is this feature available? If it isn't, will it be available in the future? Is there an alternative way to accomplish this in the most recent Nagios version? Any tips/suggestions are most appreciated! Dave
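
    For what it's worth, Nagios 3 itself treats comma-separated hostgroup lists as unions and only offers ! for exclusion, so one common workaround is to precompute the intersections outside Nagios and generate flat definitions; a minimal generator sketch, assuming the hostgroup memberships are mirrored in the script:

        # Emit one service definition per (role, region) intersection.
        hostgroups = {
            "US-Servers":  {"www-us", "ftp-us"},
            "EU-Servers":  {"www-eu", "ftp-eu"},
            "WWW-Servers": {"www-us", "www-eu"},
        }
        thresholds = {("WWW-Servers", "US-Servers"): 50,    # ms, checked from US West
                      ("WWW-Servers", "EU-Servers"): 200}

        for (role, region), limit in thresholds.items():
            hosts = sorted(hostgroups[role] & hostgroups[region])   # the intersection
            print("define service{\n"
                  "    use                 local-service\n"
                  "    host_name           %s\n"
                  "    service_description Check HTTP Response Time\n"
                  "    check_command       check_http_response_time!%d\n"
                  "}\n" % (",".join(hosts), limit))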


  • Are there any home/SOHO NAS devices that will back up/sync to the cloud?

    - by 3rdparty
    Looking for a home-office (SOHO) priced network hard drive (NAS) that will sync some or all of its content to a cloud-based backup service. The only option I've been able to find so far is NetGear's ReadyNAS Vault; however, from what I've read it's not as secure as it could be, and the service is quite expensive ($200/yr for 50GB of cloud storage) - it's "powered" by ElephantDrive. Ideally I would love to see something like Wuala integrated into a LaCie network HDD - conveniently, I suspect this is in the works, as LaCie recently acquired Wuala, but nothing has come of it yet. I know there are options to use rsync with a customizable NAS (such as the very versatile and hackable D-Link DNS-323), but the easier this is to set up and maintain, the better. Thanks! ps. I had many links posted within this question, but was limited to posting only one due to anti-spam restrictions - gotta get my "reputation" higher!


  • How can I sync Nokia smartphones' calendars with Snow Leopard Server's iCal Server?

    - by Joe Carroll
    I've just installed and configured a Mac Mini Server for a customer who wanted to stop using Google Apps for their email. The plan was to also use the new server's calendaring service, but we've hit a small snag: the staff all use iCal on Macs as well as Nokia E71s, and there doesn't seem to be a way to give each person a single calendar that syncs with both. iSync/iCal doesn't appear to allow their manually-synced phones to sync with the CalDAV calendars on the server, only with local calendars on their Macs. Nor does Nokia's organiser software support CalDAV; it uses SyncML for live networked syncing. I was wondering if there's a plugin for iCal Server that would provide a bridge to a SyncML service I could run on the server... or anything else that would work (preferably FOSS)!


  • How do I keep multiple copies of Outlook in sync when using RPC over HTTP?

    - by Don
    I use Outlook 2007 at work with our Exchange 2003 server. I just set up my home system with Outlook 2007 so that I could use RPC over HTTP to access Exchange without having to use a VPN. It works fine: I can get mail, send mail, etc. What it doesn't seem to be doing is staying in sync. For example, I read a few messages at home and moved them from the Inbox into different folders. That all seemed fine. But when I log in to my work machine and look at the copy of Outlook there, the mail is still unread and nothing has been moved. Am I missing something simple here? I would have assumed that my home machine should be telling Exchange where these messages belong and that they've been read. Both machines are running Windows 7, if that matters. Ideas?


  • iTunes - Duplicate "Purchased on iPhone" playlists

    - by War Chester Shire Sauce
    Whenever I purchase music on my phone and then sync with iTunes later, a duplicate "Purchased on iPhone" playlist is created. The first one will have everything I've purchased in the past, and the duplicate playlist will have the one or two songs that I just purchased. In the sync options I have tried both checking and unchecking the box to sync the playlist with my phone; no matter which option I choose, I always end up with duplicate playlists when I sync with iTunes. How can I fix this? I don't want a duplicate playlist to be created every time I sync. If I purchase a song on my phone, I want it to be added to the existing "Purchased on iPhone" playlist.


  • Password manager that can sync a BlackBerry and Mac OS X

    - by pdhoven
    I use a BlackBerry Bold and a MacBook Pro, and I am looking for a solution to keep a password manager synchronized between the two devices. All the commercial ones I have discovered won't work between a BlackBerry and a Mac. The closest I came was KeePass: I like the application on the BlackBerry, but I could not get the sync to the Mac working reliably. As well, I had to run the PC application using Mono on the Mac, and it was pretty slow. I am happy to pay for a good solution.


  • Does any sync software find files that have moved paths within a folder?

    - by kpierce8
    Say I have a pictures folder which I reorganized on one computer. I'd like to use that directory as the base and compare it with another version on a backup drive. Will any sync/compare program detect that a file has moved to a different location within the compared folder? For instance, say I reorganized my pictures from trips into folders by year, with the trip folders inside each year folder. If I use a regular compare utility, I wind up with two copies of everything that moved, in different locations.
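
    For what it's worth, any tool that matches by content rather than by path can do this; the core of it is just indexing files by checksum. A minimal sketch of the idea, with the two directory paths as placeholders:

        import hashlib, os

        def index_by_hash(root):
            # Map content hash -> relative path, so a moved or renamed file
            # can be matched to its old location by its bytes.
            idx = {}
            for dirpath, _, files in os.walk(root):
                for name in files:
                    path = os.path.join(dirpath, name)
                    with open(path, "rb") as f:
                        idx[hashlib.sha256(f.read()).hexdigest()] = \
                            os.path.relpath(path, root)
            return idx

        new = index_by_hash("/pictures/reorganized")   # placeholder paths
        old = index_by_hash("/backup/pictures")
        for digest, new_path in new.items():
            old_path = old.get(digest)
            if old_path and old_path != new_path:
                print("moved:", old_path, "->", new_path)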


  • vsftpd with pam_winbind.so

    - by David
    I'm trying to set up vsftpd to use logins from our domain. I want FTP users to be able to log in using their Active Directory username/password and to have full access to /media/storage/ftp/username. I set up PPTP using winbind and it is working fine, so I believe the issue is with vsftpd and PAM. The FTP server runs but answers every login with 530. I turned on debug for the PAM module, but I see nothing in syslog; vsftpd only logs the failed login in its own log.

    /etc/pam.d/vsftpd:

        auth required pam_winbind.so debug

    /etc/vsftpd.conf:

        listen=YES
        listen_ipv6=NO
        connect_from_port_20=YES
        anonymous_enable=NO
        local_enable=YES
        write_enable=YES
        xferlog_enable=YES
        idle_session_timeout=600
        data_connection_timeout=120
        nopriv_user=ftp
        ftpd_banner=Welcome to Scantiva! Authorized access only!
        local_umask=022
        local_root=/media/storage/ftp/$USER
        user_sub_token=$USER
        chroot_local_user=YES
        secure_chroot_dir=/var/run/vsftpd/empty
        pam_service_name=vsftpd
        guest_enable=YES
        guest_username=ftp
        ssl_enable=YES
        allow_anon_ssl=NO
        force_local_data_ssl=NO
        force_local_logins_ssl=NO
        ssl_tlsv1=YES
        ssl_sslv2=YES
        ssl_sslv3=YES
        rsa_cert_file=/etc/ssl/private/vsftpd.pem


  • Root directory - www or public_html?

    - by Phil Jackson
    Is the root directory where all the files are kept (when accessing via FTP) always "www" or "public_html", depending on the OS? Or is it possible to rename this folder? And if so, what would be unique about this folder that would let me identify it? Currently I have written this:

        my $root;
        my $ftp = Net::FTP->new($DB_ftpserver, Debug => 0)
            or die "Cannot connect to some.host.name: $@";
        $ftp->login($DB_ftpuser, $DB_ftppass)
            or die "Cannot login ", $ftp->message;
        my @list = $ftp->dir;
        if( scalar @list != 0 ) {
            foreach( @list ){
                if( $_ =~ m/www$/g ){
                    $root = "www";
                    last;
                }elsif( $_ =~ m/public_html$/g ){
                    $root = "public_html";
                    last;
                }
            }
        }

    but this would not work if the folder has a different name. Any help much appreciated.


  • How to sync passwords one-way between Windows domains without a trust relationship?

    - by Franco C.
    We're migrating from Windows 2003 to 2008 SBS, and will run both concurrently for a short period of time. I cannot establish a trust relationship between Server 2003 and Server 2008 SBS, and I would like to know if there is a way to sync the passwords between 2003 and 2008. For example, I would like to dump the pre-encrypted passwords to a file on 2003 and then use this to update the passwords for the corresponding usernames in 2008 SBS. Is this possible? I have no need to ever see the clear-text version of the passwords. I see one commercial product, but it hardly seems worth it given the temporary nature of my project... Thanks, Franco


  • HTTP, HTTPS and FTP are not working, but SMTP and IMAP are

    - by nWorx
    Yesterday a strange thing happened on a friend's computer: after booting, the ports for HTTP, HTTPS and FTP are closed, but e-mail is still working. In the Control Panel the Windows Firewall appears active, even when he tries to deactivate it. I suspect Norton Internet Security 2010 is at fault. We have tried to uninstall it, but the uninstallation did not work; when using the removal tool from Symantec, it just goes to 23% and then crashes. The process ccSvcHst.exe is still running. How can I safely remove the rest of Norton Internet Security? Edit: Norton Internet Security 2010 has been successfully removed, but still no connectivity...


  • Which is the best way to sync and share contacts and calendar between Thunderbird, iPhone and Android?

    - by bensch
    I would like to keep my contacts and a calendar synchronized between several desktops and cellphones. Is there a way to achieve this without using Google or similar organisations? I want to keep my data protected and safe, so encrypted transfer would be useful. Do I need to install a service on my own root server, or are there any trustworthy hosted services available? I read this post, but avoiding Google is not a requirement there: Thunderbird contacts sync - so no solutions with SOGo or LDAP. Maybe Zimbra is a solution? Or Funambol? I tried Kolab, but had some unsolvable problems.


  • Is Samba "remote browse sync" possible across an OpenVPN tunnel?

    - by John Reynolds
    I'm connecting two TomatoUSB (Shibby build on WNR3500L v2) routers with an OpenVPN routed connection:

        -----------------------                -----------------------
        | Router 1, subnet 20 | <--tunnel-->   | Router 2, subnet 21 |
        -----------------------                -----------------------

    Router 1 is the OpenVPN server and Router 2 is a client. Clients attached to the routers on both subnets can ping clients on the other subnet, so the tunnel and routing work. I've enabled file sharing on both in order to get their Samba WINS servers running. Is it possible to get name resolution across the tunnel? I've tried remote browse sync = 192.168.21.1 in /etc/smb.conf on the server side, to no avail. I also tried using the IP address that the client gets from the OpenVPN address pool (usually 10.8.0.something), but still no joy.
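
    For reference, a sketch of the server-side [global] settings this usually depends on; the workgroup name is an assumption, and note that remote browse sync only works between Samba servers that each win the local master browser election on their own subnet:

        [global]
            workgroup = WORKGROUP                  ; assumption: must match both subnets
            local master = yes
            preferred master = yes
            os level = 65
            remote browse sync = 192.168.21.1      ; the far Samba server's address
            remote announce = 192.168.21.1/WORKGROUP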


  • WordPress and Debian VPS: downloading plugins from the dashboard - what are the FTP credentials?

    - by jw60660
    I'm pretty much a newbie when it comes to a VPS. I successfully installed a WordPress/nginx setup, but I am lost when it comes to downloading plugins from the WordPress dashboard. What credentials do I use? I am still using root on the VPS, although I have disabled root password login and am using key pairs to log in. Do I have to change to another user from root, or can I use root as the FTP credentials from the dashboard to download plugins? Thanks.
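
    For reference, WordPress only asks for FTP credentials when it cannot write to its own directory; if the files are writable by the web-server user, it installs plugins directly and never shows the prompt. A sketch, assuming a Debian-typical layout with nginx/PHP running as www-data (verify both assumptions first):

        # in wp-config.php: write files directly instead of prompting for FTP
        define('FS_METHOD', 'direct');

        # and, from a shell, make the WordPress tree writable by the web server:
        # chown -R www-data:www-data /var/www/wordpress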


  • Times on files differ by 1 sec, Robocopy sync fails

    - by csmba
    I am trying to use Robocopy to sync (/IMG) a folder on my PC with a shared network drive. The problem is that the file timestamps differ by 1 sec between the two locations (creation, modified and access). So every time I run Robocopy, it syncs the files again... BTW, the problem is the same if I delete the target file and copy it fresh with Robocopy: the new file's timestamps still differ by 1 sec. Environment details: Source: Win 7 64-bit. Target: WD My Book World Edition NAS 1TB, which takes its time from the online NTP pool pool.ntp.org. (I don't know whether its file system is FAT or not.)
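
    For reference, the usual workaround for this class of mismatch is Robocopy's /FFT switch, which assumes FAT file times and treats timestamps as equal if they are within 2 seconds, e.g. (paths are placeholders):

        robocopy C:\source \\mybookworld\share /MIR /FFT

    (/MIR mirrors the tree; /FFT compensates for the coarser timestamp granularity many NAS file systems report.)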


  • How do I use rsync to sync my repo on my Mac to a university server where I have SSH access?

    - by snihalani
    I need to use my university's SSH access and run my programs there for testing. I don't have sudo access there, and it doesn't have a VNC server either. I would work with vim and make, but I need git at least. Now I am looking into rsync to sync my current source directory into a remote directory; then I'll ssh into it and run my makefile to test. The rsync man page looks very complicated, and the commands I've found by googling on Super User all seem different for different cases. Can anyone please help me with this?
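
    For what it's worth, a single rsync invocation covers this case; the hostname and paths below are placeholders:

        rsync -avz --delete -e ssh ~/myrepo/ user@ssh.university.edu:~/myrepo/

    (-a recurses and preserves permissions and times, -v is verbose, -z compresses over the wire, --delete removes remote files you've deleted locally, and the trailing slashes mean "sync the contents of" rather than the directory itself.)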


  • Good free way to clone a hard drive or a partition and send the image over the network (through FTP, Windows file sharing, "anything")?

    - by Deleted
    What I ideally would like is a free software solution which can:

    - boot from a CD/DVD/USB stick, and
    - clone a complete hard drive or a partition, and
    - send the resulting image file over the network through Windows file sharing (SMB; I could use Samba on my server to receive the image), or through FTP, SFTP or SCP.

    It should work with Linux and Windows file systems (where specific file-system support is necessary). Is there anything good out there like this? I know Wikipedia lists a lot of cloning software, but I'm looking for a personal recommendation which you have used yourself, as I find that more credible (I'll see from the upvotes whether the answer is liked by a lot of visitors).
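
    Not a recommendation, but for scale: the bare-bones version of this from any Linux live CD is dd piped over ssh (device, host and path are placeholders, and the source disk must not be mounted while imaging):

        dd if=/dev/sda bs=4M | gzip -c | ssh user@server 'cat > /backup/sda.img.gz'

    Purpose-built imaging tools layer file-system awareness, per-partition compression and restore menus on top of the same idea.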


  • How can I tell when an FTP upload is complete?

    - by identry
    I have a cron job that processes files that my clients upload via FTP to my FreeBSD server. The cron job runs once an hour, and normally processing each file takes only a few seconds. The job looks in the client's upload directory and moves any new files to a tmp directory; it then processes the file(s) and moves them into a final directory, where they are available to the public through a website. The problem is that every once in a while the cron job runs just as a new file is being uploaded. It moves the half-uploaded file to the tmp directory, tries to process it, and of course fails. Question: how can I determine whether an uploaded file is complete? The only thing I can think of is checking the file size to see if it's still changing, but that seems like a kludge. Is there some sort of flag that is set when the upload is complete?
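
    For what it's worth, the size-stability kludge can be made concrete in a few lines; the 5-second wait below is an arbitrary assumption. The more robust convention is to have clients upload under a temporary name and rename on completion, since the rename is atomic:

        import os, time

        def looks_complete(path, wait=5):
            # Heuristic: if the size hasn't changed for `wait` seconds,
            # the upload has probably finished.
            before = os.path.getsize(path)
            time.sleep(wait)
            return os.path.getsize(path) == before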


  • Ubuntu One music streaming

    - by iknowshall
    I'm running Ubuntu 11.10 and just signed up for Ubuntu One and Ubuntu One Music. Generally file sync is working fine, but music has been a complete failure so far. Here's what I'm looking at:

    - After 24 hours, not a single mp3 or ogg file from my laptop has synced to the cloud Music folder.
    - I have 4 GB of data used, which is definitely not enough to include music files. That's about the right amount for my docs and photos; with music, it should be more like 13 GB.
    - No music shows up in the Ubuntu One Music app on my phone, nor in the web view of my Ubuntu One cloud folder.

    I followed all the online instructions I could find. Basically, on my laptop:

    - right-click on the Music folder and select "Ubuntu One Synchronize this folder";
    - the Music folder now has a green check mark on it, indicating it's synced;
    - the Ubuntu One app on my laptop says "File Sync is up to date".

    I did provide credit card information, and on the Ubuntu One website "Music Streaming" is listed as one of my services. So what am I doing wrong? I just read another post describing the same problem; the answer there says that due to the large number of files being uploaded, it could take up to a week. However, as above, I am also being told that "File sync is up to date". Can somebody confirm that this is a bug and that I just need to wait for file sync to actually be up to date? Cheers

