Search Results

Search found 5793 results on 232 pages for 'ftp sync'.

Page 65 of 232

  • Fixing permissions after FTPing ASP.NET code to a Linux system

    - by dnord
    First off, I'm running Mono to host ASP.NET on Linux, but that's not the question. Every time I clear out my application directory and re-upload, I have to go back in and fix the permissions. What I'm doing is chmod -R -c 755 /var/www/*, and that raises a few questions. Why do I have to do this after every FTP upload? It feels flaky. Is there a better permission set than 755? Do I want different permissions on the /bin directory? Or can I fix all of this in one fell swoop with chown?
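    A common convention on Linux web roots is 755 for directories and 644 for files; below is a minimal Python sketch of re-applying that split after an upload. The /var/www path comes from the question; the 755/644 split and whether /bin needs anything different are assumptions that depend on how Mono is launched.

      import os

      WEB_ROOT = "/var/www"  # path from the question; adjust as needed

      for root, dirs, files in os.walk(WEB_ROOT):
          for d in dirs:
              # directories need the execute bit so they can be traversed
              os.chmod(os.path.join(root, d), 0o755)
          for f in files:
              # plain content files normally do not need the execute bit
              os.chmod(os.path.join(root, f), 0o644)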

    Read the article

  • Understand ACTV mode and the PORT command

    - by Ramy
    Hello, I'm the part-time FTP server administrator (there is no full-time admin). We currently only allow active (ACTV) mode connections. Some of our clients have had issues with this, but for the most part they've been fine using active mode; for the few who aren't, we've been able to push the data over to their servers from ours. One client in particular, however, is currently having trouble. He is using FileZilla and issuing a PORT command. First, does using the PORT command imply that you are in active mode? Second, is there a way in FileZilla to explicitly switch to active mode? Thanks for the help, _Ramy
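    On the protocol side, PORT is how a client requests an active-mode data connection (PASV is the passive-mode equivalent), so a client sending PORT is using active mode. A minimal Python ftplib sketch of forcing active mode, purely as an illustration (the host and credentials are placeholders):

      from ftplib import FTP

      ftp = FTP("ftp.example.com")   # placeholder host
      ftp.login("user", "password")  # placeholder credentials
      ftp.set_pasv(False)            # False = active mode: the client sends PORT and the server connects back
      ftp.retrlines("LIST")          # the directory listing arrives over the active-mode data connection
      ftp.quit()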

    Read the article

  • VSFTPD: Cannot figure this thing out...

    - by A Wizard Did It
    Alright, I've been giving this my best, reading through various tutorials on Google, but I cannot seem to get vsftpd running the way I want. For a short while I had it working with one account, but then it stopped and I haven't been able to get it working since. I've since reformatted and reinstalled Ubuntu 10.04 LTS, run apt-get install vsftpd, and that's where I am now. I'd really appreciate it if anyone could help me understand exactly how this is supposed to work. How do I add FTP accounts and set their home directory to something like /var/www/public_html?

    Read the article

  • Is it possible to upload directly to a remote server using SFTP in ASP.NET MVC?

    - by DucDigital
    Hi! I am currently developing something using ASP.NET MVC, and I'm still not very experienced with it, so please help me out. I have a form for users to upload video. The usual concept for getting the file to a remote server is to upload it to the current server and then use FTP to push it to the remote server. For me this is not very fast, since you upload to the current server (time x1) and then the current server pushes to the new server (time x2), so it takes double the time. My idea is to have the user upload to the current server and, WHILE the user is uploading, have the current server add the file to the DB and also send the file on to the remote server at the same time using SFTP. Is this possible, and are there any security holes in this concept? Thank you very much.
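    As a rough illustration of the server-side push (in Python rather than ASP.NET, and only a sketch), this is what sending an already-received file on to the remote server over SFTP can look like; host, credentials and paths are all placeholders:

      import paramiko

      # placeholder connection details for the remote video server
      host, user, password = "remote.example.com", "user", "secret"

      ssh = paramiko.SSHClient()
      ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # fine for a sketch; pin host keys in production
      ssh.connect(host, username=user, password=password)

      sftp = ssh.open_sftp()
      # push the file the web server has just finished receiving
      sftp.put("/uploads/incoming/video.mp4", "/var/media/video.mp4")
      sftp.close()
      ssh.close()

    Starting the push while the client upload is still in progress is the tricky part; most setups queue the transfer once the upload has completed rather than streaming both at once.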

    Read the article

  • Is there an easy way to add a secure file upload form (username, password, select file) to a website

    - by user346602
    Hi, I am very new to website design. I have an architect client who wants to let his clients upload files (plans etc.) to him through the website I'm designing for him (FTP, though I don't know whether HTTP would be a better alternative). I have seen similar things on printers' websites. I have looked at Uploadify, but it requires Flash (I can only code HTML, CSS and a tiny bit of PHP) and I don't think it is a secure option. I have also seen net2ftp, but I don't really understand how it works. Any direction would be sincerely appreciated.
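    If HTTP turns out to be acceptable, here is a hedged sketch of the idea in Python/Flask (not the language of the site, just an illustration of the shape): the form posts the file over HTTPS and the handler checks a username/password before saving. Every name here is made up.

      from flask import Flask, request, abort
      from werkzeug.utils import secure_filename

      app = Flask(__name__)
      UPLOAD_DIR = "/srv/uploads"          # hypothetical destination directory
      USERS = {"client1": "change-me"}     # hypothetical credentials; store hashes in practice

      @app.route("/upload", methods=["POST"])
      def upload():
          auth = request.authorization     # HTTP Basic auth sent by the browser/form
          if not auth or USERS.get(auth.username) != auth.password:
              abort(401)
          f = request.files.get("file")
          if f is None:
              abort(400)
          f.save(f"{UPLOAD_DIR}/{secure_filename(f.filename)}")
          return "OK"

      if __name__ == "__main__":
          # serve behind HTTPS in any real deployment so the password is not sent in the clear
          app.run()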

    Read the article

  • Uploading to a remote server periodically?

    - by user1048138
    I have been working on an app that takes screenshots, kind of like http://puush.me/, and I would like to be able to upload the screenshots to a remote server. What protocols can I use to do so? It needs to be cross-platform and secure. I know that SSH, SFTP and FTP are options; however, they all require logins that I don't want to hand to the end user. Nor do I want to sign a key for them, as it would still allow their machines to log in to the server remotely.
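    One common pattern for this is to put a small HTTPS endpoint in front of the storage, so the client only carries a revocable API token (or nothing at all) rather than a real server login. A minimal Python sketch of the client side, with a made-up URL and token:

      import requests

      API_URL = "https://screenshots.example.com/upload"   # hypothetical endpoint
      API_TOKEN = "per-user-token"                          # revocable token, not a shell login

      with open("shot.png", "rb") as f:
          resp = requests.post(
              API_URL,
              headers={"Authorization": f"Bearer {API_TOKEN}"},
              files={"file": ("shot.png", f, "image/png")},
              timeout=30,
          )
      resp.raise_for_status()
      print(resp.text)   # e.g. the public URL the server hands back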

    Read the article

  • git repository sync between computers, when moving around?

    - by Johan
    Hi. Let's say that I have a desktop PC and a laptop, and sometimes I work on the desktop and sometimes on the laptop. What is the easiest way to move a git repository back and forth? I want the git repositories to be identical, so that I can continue where I left off on the other computer, and I would like to have the same branches and tags on both machines. Thanks, Johan

    Note: I know how to do this with Subversion, but I'm curious how this would work with git. If it is easier, I can use a third PC as a classical server that the two PCs sync against.

    Note: Both computers are running Linux.

    Update: So let's try XANI's idea of a bare git repo on a server, with the push command syntax from KingCrunch. In this example there are two clients and one server. Let's create the server part first:

      ssh user@server
      mkdir -p ~/git_test/workspace
      cd ~/git_test/workspace
      git --bare init

    Then, from one of the other computers, I try to get a copy of the repo with clone:

      git clone user@server:~/git_test/workspace/
      Initialized empty Git repository in /home/user/git_test/repo1/workspace/.git/
      warning: You appear to have cloned an empty repository.

    Then go into that repo and add a file:

      cd workspace/
      echo "test1" > testfile1.txt
      git add testfile1.txt
      git commit testfile1.txt -m "Added file testfile1.txt"
      git push origin master

    Now the server is updated with testfile1.txt. Anyway, let's see if we can get this file from the other computer:

      mkdir -p ~/git_test/repo2
      cd ~/git_test/repo2
      git clone user@server:~/git_test/workspace/
      cd workspace/
      git pull

    And now we can see the test file. At this point we can edit it with some more content and update the server again:

      echo "test2" >> testfile1.txt
      git add testfile1.txt
      git commit -m "Test2"
      git push origin master

    Then we go back to the first client and do a git pull to see the updated file. Now I can move back and forth between the two computers, and add a third if I like.

    Read the article

  • Problem using FtpWebRequest to append to file on a mainframe

    - by MusiGenesis
    I am using FtpWebRequest to append data to a mainframe file. Each record appended is 50 characters long, and I am adding them one record at a time. In our development environment we do not have a mainframe, so my code was written and tested against a Windows-based FTP site instead. Initially I wrote each record with a StreamWriter (wrapping the stream from the FtpWebRequest), calling WriteLine, which automatically adds a CR/LF to the end. When we ran this for the first time in the test environment, where we're writing to an actual MVS mainframe, our mainframe contact said his program could not read the records because of the CR/LFs (it's a green-screen mainframe program of some sort; he's sent me screen captures, which is all I know of it). I changed our code to use Write instead of WriteLine, and now the code executes successfully (i.e. no exceptions are thrown) when writing multiple records, but no matter how many records we append, he can only "see" the first record: according to his mainframe program, there is only one 50-character record in the file. I'm guessing that to fix this I need to write some other record delimiter into the stream (instead of CR/LF) that the mainframe will recognize. Does anybody know what that is, or how else I can fix this problem?
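    For what it's worth, a hedged sketch (in Python rather than C#) of the usual direction for this kind of fix: MVS datasets with RECFM=FB don't use any delimiter at all, so each record is written as exactly LRECL bytes, space-padded, with no CR/LF. The host, dataset name and SITE parameters below are placeholders and assumptions; whether they match the target dataset depends on how it was allocated, and whether ASCII/EBCDIC translation is needed is a separate question.

      import io
      from ftplib import FTP

      ftp = FTP("mainframe.example.com")   # placeholder host
      ftp.login("user", "password")        # placeholder credentials

      # Ask the server to treat the data as fixed-length 50-byte records.
      # SITE parameters are server-specific and may already be set on the host.
      ftp.sendcmd("SITE LRECL=50 RECFM=FB")

      records = ["FIRST RECORD", "SECOND RECORD"]
      payload = b"".join(r.ljust(50).encode("ascii") for r in records)   # pad to 50 bytes, no CR/LF

      # APPE = FTP append; storbinary streams the raw bytes without adding line endings
      ftp.storbinary("APPE 'MY.DATASET.NAME'", io.BytesIO(payload))
      ftp.quit()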

    Read the article

  • Publishing to a live website

    - by Alienfluid
    Hey there, my friend and I are collaborating on an ASP.NET-powered website. To develop it locally we use Visual Web Developer Express (good enough for our needs). Subversion (via TortoiseSVN) is our source control of choice, with the repository hosted on Unfuddle.com. We run into problems when we need to update the live site, since there's no version control on it. Currently we use the "Copy to Website" feature in VWD, which copies the files over FTP. Here are some of the problems:

    1. VWD only keeps track of files uploaded by one user, so if the other user uploads a newer version of a file to the live site, VWD on my side cannot tell whether the live version of the file is newer than mine.
    2. There's no way to tell whether all the latest changes are on the live site.
    3. We have to be careful not to clobber the shared web.config file, since the other user's local DB settings are different from mine, and of course the live DB settings are a whole other story!

    What do you use to publish to a live site? Does anything out there tie into Subversion so that we can automate the process and always guarantee that the live site is synced to a known revision number? Also, how do you manage the different web.config settings? Thanks!
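    One way to get the "synced to a known revision" guarantee is to deploy from the repository rather than from either developer's working copy. A rough Python sketch of that idea (svn export of a specific revision, then upload over FTP); the repository URL, credentials and paths are all placeholders, and remote directory creation is omitted for brevity:

      import os
      import subprocess
      from ftplib import FTP

      REPO_URL = "https://svn.example.com/project/trunk"   # placeholder repository URL
      REVISION = "1234"                                     # the revision being released
      EXPORT_DIR = "/tmp/site-r" + REVISION

      # Export a clean copy of exactly one revision (no .svn folders, no local edits)
      subprocess.run(["svn", "export", "-r", REVISION, REPO_URL, EXPORT_DIR], check=True)

      ftp = FTP("ftp.example.com")   # placeholder live host
      ftp.login("deploy", "password")
      for root, _dirs, files in os.walk(EXPORT_DIR):
          remote_dir = "/public_html" + root[len(EXPORT_DIR):]
          for name in files:
              # assumes the matching directories already exist on the server (ftp.mkd() otherwise)
              with open(os.path.join(root, name), "rb") as fh:
                  ftp.storbinary(f"STOR {remote_dir}/{name}", fh)
      ftp.quit()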

    Read the article

  • Handling asynchronous responses

    - by James P.
    I'm building an FTP client from scratch and I've noticed that the response codes aren't immediate (which is no surprise). What would be a good approach for matching each response code to the command that produced it? Below is an example of the output of FileZilla Server; the response code is the three-digit number after the ">" in each line.

      (000057) 23/05/2010 19:43:10 - (not logged in) (127.0.0.1)> Connected, sending welcome message...
      (000057) 23/05/2010 19:43:10 - (not logged in) (127.0.0.1)> 220-FileZilla Server version 0.9.12 beta
      (000057) 23/05/2010 19:43:10 - (not logged in) (127.0.0.1)> 220-written by Tim Kosse ([email protected])
      (000057) 23/05/2010 19:43:10 - (not logged in) (127.0.0.1)> 220 Please visit http://sourceforge.net/projects/filezilla/
      (000057) 23/05/2010 19:43:10 - (not logged in) (127.0.0.1)> user anonymous
      (000057) 23/05/2010 19:43:10 - (not logged in) (127.0.0.1)> 331 Password required for anonymous
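    FTP control replies always start with a three-digit code, and a hyphen after the code ("220-") marks a multi-line reply that ends with the same code followed by a space; since the client sends one command at a time and replies come back in order, responses can be matched to commands FIFO. A small Python sketch of collecting one complete reply, assuming lines are already arriving as text:

      def read_reply(readline):
          """Collect one full FTP reply; `readline` returns one text line per call."""
          first = readline()
          code = first[:3]                     # three-digit reply code, e.g. "220"
          lines = [first]
          if first[3:4] == "-":                # "220-" means more lines follow
              while True:
                  line = readline()
                  lines.append(line)
                  if line.startswith(code + " "):   # "220 " terminates the reply
                      break
          return code, "\n".join(lines)

      # usage sketch: pair each sent command with the next reply, in order
      # code, text = read_reply(sock_file.readline)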

    Read the article

  • Using libcurl to check if a file exists on an SFTP site

    - by Snazzer
    I'm using C++ with libcurl to do SFTP/FTPS transfers. Before uploading a file, I need to check whether the file exists without actually downloading it. If the file doesn't exist, I run into the following problem:

      //set up curlhandle for the public/private keys and whatever else first.
      curl_easy_setopt(CurlHandle, CURLOPT_URL, "sftp://user:pass@host/nonexistent-file");
      curl_easy_setopt(CurlHandle, CURLOPT_NOBODY, 1);
      curl_easy_setopt(CurlHandle, CURLOPT_FILETIME, 1);
      int result = curl_easy_perform(CurlHandle);
      //result is CURLE_OK, not CURLE_REMOTE_FILE_NOT_FOUND
      //using curl_easy_getinfo to get the file time will return -1 for filetime,
      //regardless of whether the file is there or not.

    If I don't use CURLOPT_NOBODY, it works: I get CURLE_REMOTE_FILE_NOT_FOUND. However, if the file does exist, it gets downloaded, which wastes time, since I just want to know whether it's there or not. Are there any other techniques/options I'm missing? Note that it should work for FTPS as well. Edit: This error occurs with SFTP. With FTPS/FTP I get CURLE_FTP_COULDNT_RETR_FILE, which I can work with.
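    As a point of comparison outside libcurl (a swapped-in technique, not a libcurl fix): over SFTP an existence check is just a stat of the remote path, which never transfers the file body. A Python/paramiko sketch with placeholder connection details:

      import paramiko

      ssh = paramiko.SSHClient()
      ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
      ssh.connect("host.example.com", username="user", password="pass")   # placeholders

      sftp = ssh.open_sftp()
      try:
          sftp.stat("/path/to/maybe-missing-file")   # metadata only, no download
          exists = True
      except IOError:                                # raised when the path does not exist
          exists = False
      print(exists)
      ssh.close()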

    Read the article

  • Creating a zip archive in PHP

    - by Kishan
    I am trying to create a zip archive of my downloaded images. Here is my code. I am not getting any errors, but the zip is not being downloaded. The code runs and I get output up to the part that displays the current directory, but after that it seems to go wrong somewhere and I never get a zip archive.

      <?php
      $conn_id=ftp_connect("localhost") or die("Could not connect");
      ftp_login($conn_id,"kishan","ubuntu"); //login to ftp localhost
      echo "The current directory is " . ftp_pwd($conn_id); //display current directory
      ftp_chdir($conn_id,'/var/www/test1'); //change to the directory where my images are downloaded
      echo "<br/><p> Changing to directory" . ftp_pwd($conn_id);
      $file_folder=".";
      echo "<br/> The content of the directory is <br/>";
      print_r(ftp_rawlist($conn_id,".")); //display the contents of the directory
      if(extension_loaded('zip')) //check whether the zip extension is loaded
      {
          $zip=new ZipArchive();
          $zip_name="a.zip"; //some name for my zip file
          if($zip->open($zip_name,ZIPARCHIVE::CREATE)!==TRUE)
          {
              $error="Sorry ZIP creation failed at this time";
          }
          $contents=ftp_nlist($conn_id,".");
          foreach($contents as $file) //add files to the zip one by one
          {
              $zip->addFile($file_folder.$file);
          }
          $zip->close(); //seal the zip
      }
      if(file_exists($zip_name)) //Content-Disposition of my zip file
      {
          header('Content-type:application/zip');
          header('Content-Disposition:attachment; filename="'.$zip_name.'"');
          readfile($zip_name);
          unlink($zip_name);
      }
      ?>

    Read the article

  • Develop multiple very similar projects at once

    - by Raveren
    I am developing a semi-complicated site that is available in several countries at once. Much effort has been put into making the code bases as similar as possible to one another; ultimately only the config file and some representational data will differ between them. Each project has its own SVN repository, which maps directly to a live test site; that part is handled by the IDE we work in. Now I need to create some sort of system to keep all these projects in sync. The best theoretical solution so far is a local hook script that fires on commit and:

    1. merges the committed files from the project being committed into all the other projects, and
    2. optionally uploads them to the live site, replacing the previous files.

    The first problem is that I don't know how I would do the merging; I guess it would be like applying an SVN patch or something. The second is that, if I do not want to upload the changes to the live server, how would I go about syncing the live and local code bases (replace older files?). The reason I am posting this question instead of going through the potentially huge trouble of solving these problems myself is that I believe this is a pretty common situation, someone will already have a solution, and others may benefit from the answers in the future. Lastly, I'm on Windows 7, I develop PHP, and I use TortoiseSVN.
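    A rough Python sketch of the merging half of that hook idea: take the diff of the just-committed revision and apply it to each sibling working copy with svn patch (available in SVN 1.7+). The repository URL, the way the revision number reaches the script, and the working-copy paths are all assumptions; per-country config files would still need to be excluded or reviewed by hand.

      import subprocess
      import sys

      REPO_URL = "https://svn.example.com/project-base/trunk"        # hypothetical source repo
      SIBLINGS = [r"C:\work\project-de", r"C:\work\project-fr"]       # hypothetical sibling working copies

      rev = sys.argv[1]   # revision number handed to the hook (simplified)

      # produce a patch for exactly this commit
      patch = subprocess.run(
          ["svn", "diff", "-c", rev, REPO_URL],
          check=True, capture_output=True, text=True,
      ).stdout

      with open("commit.patch", "w", encoding="utf-8") as fh:
          fh.write(patch)

      for wc in SIBLINGS:
          # apply the same change to each sibling project's working copy
          subprocess.run(["svn", "patch", "commit.patch"], cwd=wc, check=True)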

    Read the article

  • In .NET, how do you send multiple arguments into a DOS command prompt?

    - by donde
    I am trying to execute DOS commands in ASP.NET 2.0. What I have now calls a BAT file which, in turn, calls a CMD file. It works (with the end result being that a file gets FTP'ed). However, I'd like to dump the BAT and CMD files and run everything in .NET. What is the format of sending multiple arguments into the command window? Here is what I have now.

    The .NET code:

      System.Diagnostics.Process proc = new System.Diagnostics.Process();
      proc.EnableRaisingEvents = false;
      proc.StartInfo.FileName = "C:\\MyBat.BAT";
      proc.Start();
      proc.WaitForExit();

    The BAT file looks like this (all it does is run the CMD file):

      ftp.exe -s:C:\MyCMD.cmd

    And here is the content of the CMD file:

      open <my host>
      <my user name>
      <my pw>
      quote site cyl pri=1 sec=1 lrecl=1786 blksize=0 recfm=fb retpd=30
      put C:\MyDTLFile.dtl 'MyDTLFile.dtl'
      quit

    Read the article

  • Trying to update Debian, but it's not working

    - by Sean
    As root I type the command apt-get update and get these error messages:

      Err http://security.debian.org lenny/updates Release.gpg  Could not resolve 'security.debian.org'
      Err http://security.debian.org lenny/updates/main Translation-en_US  Could not resolve 'security.debian.org'
      Err http://security.debian.org lenny/updates/contrib Translation-en_US  Could not resolve 'security.debian.org'
      Err http://security.debian.org lenny/updates/non-free Translation-en_US  Could not resolve 'security.debian.org'
      Err http://www.backports.org lenny-backports Release.gpg  Could not resolve 'www.backports.org'
      Err http://www.backports.org lenny-backports/main Translation-en_US  Could not resolve 'www.backports.org'
      Err http://www.backports.org lenny-backports/contrib Translation-en_US  Could not resolve 'www.backports.org'
      Err http://www.backports.org lenny-backports/non-free Translation-en_US  Could not resolve 'www.backports.org'
      Err http://ftp.us.debian.org lenny Release.gpg  Could not resolve 'ftp.us.debian.org'
      Err http://ftp.us.debian.org lenny/main Translation-en_US  Could not resolve 'ftp.us.debian.org'
      Err http://ftp.us.debian.org lenny/contrib Translation-en_US  Could not resolve 'ftp.us.debian.org'
      Err http://ftp.us.debian.org lenny/non-free Translation-en_US  Could not resolve 'ftp.us.debian.org'
      Err http://http.us.debian.org stable Release.gpg  Could not resolve 'http.us.debian.org'
      Err http://http.us.debian.org stable/main Translation-en_US  Could not resolve 'http.us.debian.org'
      Err http://http.us.debian.org stable/contrib Translation-en_US  Could not resolve 'http.us.debian.org'
      Err http://http.us.debian.org stable/non-free Translation-en_US  Could not resolve 'http.us.debian.org'
      Reading package lists... Done
      W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/Release.gpg  Could not resolve 'ftp.us.debian.org'
      W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/main/i18n/Translation-en_US.gz  Could not resolve 'ftp.us.debian.org'
      W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/contrib/i18n/Translation-en_US.gz  Could not resolve 'ftp.us.debian.org'
      W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/non-free/i18n/Translation-en_US.gz  Could not resolve 'ftp.us.debian.org'
      W: Failed to fetch http://http.us.debian.org/debian/dists/stable/Release.gpg  Could not resolve 'http.us.debian.org'
      W: Failed to fetch http://http.us.debian.org/debian/dists/stable/main/i18n/Translation-en_US.gz  Could not resolve 'http.us.debian.org'
      W: Failed to fetch http://http.us.debian.org/debian/dists/stable/contrib/i18n/Translation-en_US.gz  Could not resolve 'http.us.debian.org'
      W: Failed to fetch http://http.us.debian.org/debian/dists/stable/non-free/i18n/Translation-en_US.gz  Could not resolve 'http.us.debian.org'
      W: Failed to fetch http://security.debian.org/dists/lenny/updates/Release.gpg  Could not resolve 'security.debian.org'
      W: Failed to fetch http://security.debian.org/dists/lenny/updates/main/i18n/Translation-en_US.gz  Could not resolve 'security.debian.org'
      W: Failed to fetch http://security.debian.org/dists/lenny/updates/contrib/i18n/Translation-en_US.gz  Could not resolve 'security.debian.org'
      W: Failed to fetch http://security.debian.org/dists/lenny/updates/non-free/i18n/Translation-en_US.gz  Could not resolve 'security.debian.org'
      W: Failed to fetch http://www.backports.org/debian/dists/lenny-backports/Release.gpg  Could not resolve 'www.backports.org'
      W: Failed to fetch http://www.backports.org/debian/dists/lenny-backports/main/i18n/Translation-en_US.gz  Could not resolve 'www.backports.org'
      W: Failed to fetch http://www.backports.org/debian/dists/lenny-backports/contrib/i18n/Translation-en_US.gz  Could not resolve 'www.backports.org'
      W: Failed to fetch http://www.backports.org/debian/dists/lenny-backports/non-free/i18n/Translation-en_US.gz  Could not resolve 'www.backports.org'
      W: Some index files failed to download, they have been ignored, or old ones used instead.
      W: You may want to run apt-get update to correct these problems

    This is on a DreamPlug Linux server. It is configured so that my network starts at 192.168.1.2, and my router is port-forwarding SSH to the server at 192.168.1.6.

    Read the article

  • Trouble with DNS and Debian update

    - by Sean
    I tried to update my Debian DreamPlug server by running apt-get update as root and received these errors:

      Err http://security.debian.org lenny/updates Release.gpg  Could not resolve 'security.debian.org'
      Err http://security.debian.org lenny/updates/main Translation-en_US  Could not resolve 'security.debian.org'
      Err http://security.debian.org lenny/updates/contrib Translation-en_US  Could not resolve 'security.debian.org'
      Err http://security.debian.org lenny/updates/non-free Translation-en_US  Could not resolve 'security.debian.org'
      Err http://www.backports.org lenny-backports Release.gpg  Could not resolve 'www.backports.org'
      Err http://www.backports.org lenny-backports/main Translation-en_US  Could not resolve 'www.backports.org'
      Err http://www.backports.org lenny-backports/contrib Translation-en_US  Could not resolve 'www.backports.org'
      Err http://www.backports.org lenny-backports/non-free Translation-en_US  Could not resolve 'www.backports.org'
      Err http://ftp.us.debian.org lenny Release.gpg  Could not resolve 'ftp.us.debian.org'
      Err http://ftp.us.debian.org lenny/main Translation-en_US  Could not resolve 'ftp.us.debian.org'
      Err http://ftp.us.debian.org lenny/contrib Translation-en_US  Could not resolve 'ftp.us.debian.org'
      Err http://ftp.us.debian.org lenny/non-free Translation-en_US  Could not resolve 'ftp.us.debian.org'
      Err http://http.us.debian.org stable Release.gpg  Could not resolve 'http.us.debian.org'
      Err http://http.us.debian.org stable/main Translation-en_US  Could not resolve 'http.us.debian.org'
      Err http://http.us.debian.org stable/contrib Translation-en_US  Could not resolve 'http.us.debian.org'
      Err http://http.us.debian.org stable/non-free Translation-en_US  Could not resolve 'http.us.debian.org'
      Reading package lists... Done
      W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/Release.gpg  Could not resolve 'ftp.us.debian.org'
      W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/main/i18n/Translation-en_US.gz  Could not resolve 'ftp.us.debian.org'
      W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/contrib/i18n/Translation-en_US.gz  Could not resolve 'ftp.us.debian.org'
      W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/non-free/i18n/Translation-en_US.gz  Could not resolve 'ftp.us.debian.org'
      W: Failed to fetch http://http.us.debian.org/debian/dists/stable/Release.gpg  Could not resolve 'http.us.debian.org'
      W: Failed to fetch http://http.us.debian.org/debian/dists/stable/main/i18n/Translation-en_US.gz  Could not resolve 'http.us.debian.org'
      W: Failed to fetch http://http.us.debian.org/debian/dists/stable/contrib/i18n/Translation-en_US.gz  Could not resolve 'http.us.debian.org'
      W: Failed to fetch http://http.us.debian.org/debian/dists/stable/non-free/i18n/Translation-en_US.gz  Could not resolve 'http.us.debian.org'
      W: Failed to fetch http://security.debian.org/dists/lenny/updates/Release.gpg  Could not resolve 'security.debian.org'
      W: Failed to fetch http://security.debian.org/dists/lenny/updates/main/i18n/Translation-en_US.gz  Could not resolve 'security.debian.org'
      W: Failed to fetch http://security.debian.org/dists/lenny/updates/contrib/i18n/Translation-en_US.gz  Could not resolve 'security.debian.org'
      W: Failed to fetch http://security.debian.org/dists/lenny/updates/non-free/i18n/Translation-en_US.gz  Could not resolve 'security.debian.org'
      W: Failed to fetch http://www.backports.org/debian/dists/lenny-backports/Release.gpg  Could not resolve 'www.backports.org'
      W: Failed to fetch http://www.backports.org/debian/dists/lenny-backports/main/i18n/Translation-en_US.gz  Could not resolve 'www.backports.org'
      W: Failed to fetch http://www.backports.org/debian/dists/lenny-backports/contrib/i18n/Translation-en_US.gz  Could not resolve 'www.backports.org'
      W: Failed to fetch http://www.backports.org/debian/dists/lenny-backports/non-free/i18n/Translation-en_US.gz  Could not resolve 'www.backports.org'
      W: Some index files failed to download, they have been ignored, or old ones used instead.
      W: You may want to run apt-get update to correct these problems

    I am able to ping IP addresses but not hostnames, and I can't seem to figure out the problem. My /etc/resolv.conf file contains nameserver 192.168.1.2, which is my router.
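    All of those failures are name resolution rather than apt itself; a quick Python check of whether the box can resolve the mirrors through the nameserver configured in /etc/resolv.conf (hostnames taken from the error output):

      import socket

      for host in ("ftp.us.debian.org", "security.debian.org", "www.backports.org"):
          try:
              print(host, "->", socket.gethostbyname(host))
          except socket.gaierror as err:   # the same failure apt is reporting
              print(host, "-> cannot resolve:", err)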

    Read the article

  • What happens to an ad hoc installed iPhone/iPad app when the device is synced against a new iTunes profile?

    - by user363100
    I'm currently involved in a project where a number of iPads loaded with a special app are given away to people at a certain event. Both because of time constraints and because we want to give these people a really exclusive app, we decided to prepare the devices using ad hoc installs of the app. What will happen to the app when the recipients of a device sync it with their existing iTunes account instead of one of our "recipient x" accounts?

    Read the article

  • Syncing Online Content with iPhone Application

    - by PF1
    Hi everyone: I am looking for some way to sync an online XML file with my iPhone application and only download the newest changed items. Each item is marked with a date attribute, so I assume this is possible. I have heard that Core Data can accomplish this, but I am unsure of the suggested method and how to approach implementing it. Thanks for any help.
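    Whatever the storage layer ends up being, the "only fetch what changed" part is just a comparison of each item's date attribute against the timestamp of the last successful sync. A small Python sketch of that filter, purely as pseudocode for the logic (the real implementation would be Objective-C/Core Data, and the XML layout and date format here are made up):

      from datetime import datetime
      from xml.etree import ElementTree

      last_sync = datetime(2010, 5, 1)          # persisted timestamp of the previous sync

      tree = ElementTree.parse("feed.xml")       # hypothetical downloaded feed
      new_items = [
          item for item in tree.iter("item")
          if datetime.strptime(item.get("date"), "%Y-%m-%d %H:%M:%S") > last_sync
      ]

      for item in new_items:
          print("changed:", item.get("id"))      # insert/update these locally, then store the newest date as last_sync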

    Read the article

  • Outlook 2010 hung on updating outlook.com email account

    - by warren
    For the past week, through restarts of Outlook and even a reboot of my machine, Outlook has been hung synchronizing my outlook.com account after I moved messages from a different email account's inbox into the outlook.com inbox. The old account no longer has the moved messages (those synced out correctly). The new account does not have the moved messages when accessed via the web UI; i.e., they are stuck in Outlook only. My problem? I need to reimage this laptop for a friend, so I need Outlook to finish syncing all those messages out to the hosted mailbox. How can I force this to happen?

    Read the article

  • How do I export the address book from an N97?

    - by mplungjan
    Hi, I need to copy my numbers from my private Nokia to my office BlackBerry. I have not found a way to export my phone numbers from Ovi or elsewhere. On the Mac, iSync stopped working with Snow Leopard, and Ovi on Windows does not export. I don't mind a Windows suggestion. I lost a description of how to use the Ovi backup files in another program. What I have done so far:

    In Terminal: sudo open -a iSync.app. It launched, but iSync said "this device is not supported by iSync".
    Went to http://europe.nokia.com/support/product-support/isync/compatibility-and-download and found a plugin (I am sure that was not there a while ago :| ).
    Checked the software version (22.0.110) and installed the plugin.
    Ran iSync, which found and installed my N97 device successfully, and synced. It stopped with "The connection was lost while talking to the phone." See http://discussions.europe.nokia.com/t5/Nseries-and-S60-Smartphones/N97-iSync-Multimedia-Transfer-Modem/m-p/568560 (no news since Jan 2010).
    Tried to download and install http://best-vcard.en.softonic.com/symbian, but the installer fails :(

    I simply do not understand why Nokia is giving us such a hard time. I would not have considered switching from Nokia if the Mac had been better supported. It is so frustrating that they just don't seem to care about losing Nokia fanbois like me, especially since I am this outspoken on the net and what I say on popular forums gets indexed by Google fast. I am very close to just going iPhone here. I hope someone has Nokia's ear.

    UPDATE: I downloaded NbuExplorer from SourceForge. It extracts everything from an Ovi backup into VCF, VCS and VMG files. Very useful software, and free.

    Read the article

  • Box.com file sharing - How are you managing concurrent document access and file locks? [closed]

    - by Matt
    My company is evaluating Box.com as a file server replacement. Its file-locking behavior for concurrent access to files seems incomplete. Specifically, files are not locked* (either exclusively or as read-only) while they are being edited in Office or similar programs. This inevitably results in multiple versions of documents, as concurrent access leads to change conflicts. *The exception is when the file is edited using Zoho Docs, and perhaps other web-based office suites as well. Box provides multiple options for editing documents, including Google Docs, a local copy of Office or similar, Zoho Docs and others. If you are using Box, how have you managed or worked around this behavior?

    Read the article

  • Bookmarks folder on two machines not syncing

    - by AsheeshR
    I have Chromium installed on an Ubuntu 12.04 machine and Chrome on a Windows 7 machine. Both are updated to the newest stable release, and I am logged in on both machines with the same Google account. I created a bookmarks folder A on my Windows machine and bookmarked some pages. A few hours later, before the folder had appeared on the other machine, I created a folder named A on the Ubuntu machine and continued adding links to it. Now the two folders are not syncing, and each contains only the links added locally on that machine. Is there any way to resolve this?

    Read the article
