Search Results

Search found 265 results on 11 pages for 'filezilla'.


  • Blocking of certain file downloads

    - by Philip Fourie
    I have a problem where I cannot completely download a certain file from a server. The file is 1.9 MB in size, but only 68% is downloaded and then the transfer hangs. I tried these cases, which all failed:
    - Downloaded the file with HTTP
    - Downloaded the file with FTP
    - Moved the file to different FTP and web servers behind the ISA firewall
    - Tried with IIS 6.0 and IIS 7.0
    - Multiple download clients, including Firefox, FileZilla (on Windows) and wget (on Linux)
    This worked:
    - Downloading other files from the same location on the server, both bigger and smaller in size than the original. Both FTP and HTTP worked.
    - An earlier version of this file (a .DLL) works.
    It is as if the content of this file has an influence on whether it gets served. Network architecture: Client Machine - Internet (ISP) - ISA Server - IIS 7.0. The only constants are the ISP, the Cisco router and the ISA server. Is it possible that something is rejecting the download because of the contents of the file? I am hoping ISA is the culprit... I am not an ISA expert; is there somewhere I can look to establish whether it is indeed ISA causing this? Update: Splitting the file into two parts with a hex editor results in one half of the file being served correctly and the other half not. Zipping the file results in the file being downloaded successfully, but that is not an option for this particular scenario. Renaming the file and its extension also doesn't work. Update 2009/10/22: It does NOT seem to be ISA that is causing this problem. We connected a laptop (running IIS) on an available public IP and still the file downloaded to 68% before it hung. The two remaining components are the ISP and the Cisco 800 series router. Does anyone know of an issue on the router, perhaps?
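
    Since splitting the file in half already isolated the fault to one half, the offending region can be narrowed further with byte-range requests. A hedged sketch, assuming the web server honors HTTP Range and that the stall is triggered by a specific byte pattern; the URL and size are placeholders:

        #!/bin/sh
        # Bisect the offset at which the transfer starts to stall.
        URL="http://example.com/files/library.dll"   # placeholder
        SIZE=1992294                                 # placeholder: file size in bytes
        lo=0; hi=$SIZE
        while [ $((hi - lo)) -gt 4096 ]; do
            mid=$(( (lo + hi) / 2 ))
            # --max-time turns a stalled range into a failed exit code
            if curl -s --max-time 15 -r "$lo-$mid" -o /dev/null "$URL"; then
                lo=$mid   # this range passed, so the trigger lies further in
            else
                hi=$mid   # this range stalled, so the trigger is before $mid
            fi
        done
        echo "transfer stalls somewhere around bytes $lo-$hi"

    Comparing the bytes around that offset between the working earlier .DLL and the broken one would show what the ISP or router is objecting to.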

    Read the article

  • IIS FTP 7.5 Data Channel Problem (SSL)

    - by user59050
    Hey there, I wonder if anyone can point me in the right direction. I am setting up both an FTPS client and server, the server being Microsoft's IIS FTP 7.5. The client side runs on Linux, and I am using M2Crypto for the OpenSSL wrapping (Python). I am worried the problem is on the server side (IIS 7.5) due to the following discovery: if I host using FileZilla with BOTH the control and data channel forced to be encrypted, it works 100% (100% file transmission). If I use IIS as the server, everything works up to the point where the data channel takes over, i.e. all data of the retrieved file is already received correctly in my basket! The FTP server just won't send the final '226 Transfer complete.' on the command socket. Why? If I force the client or server to close the connection, the file is 100% intact. If I use IIS 7.5 with forced encryption on the control channel only, everything works 100%, as long as I don't force it on the data channel. Here are some screenshots to demonstrate this. Client view after killing the client: pics @ http://forums.iis.net/p/1172936/1960994.aspx#1960994 Summary: we can establish the connection, do directory listings, start the upload and see the file (0 bytes) created on the server, but then the client hangs. If we terminate the client, the uploaded file on the server suddenly jumps up to full size.
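
    One way to rule the M2Crypto client in or out is to repeat the upload with a stock client and watch the control-channel dialogue for the 226. A hedged sketch using curl; the host, credentials and file are placeholders, and --ssl-reqd makes curl negotiate TLS on the control connection and request a protected data channel (PBSZ/PROT P):

        # Upload over explicit FTPS and print only the FTP dialogue lines.
        curl -v --ssl-reqd --user user:password \
             -T test.bin ftp://ftp.example.com/test.bin 2>&1 | grep -E '^[<>*] '
        # If the '226 Transfer complete.' never arrives here either, the
        # problem sits on the IIS side rather than in the M2Crypto wrapper.

    A common culprit in this exact scenario is how the client ends the TLS data connection: some servers hold back the 226 until they see a proper TLS close_notify shutdown, so it is worth checking that the client shuts down the data socket's TLS layer instead of just closing the socket.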

    Read the article

  • User permissions on Linux (proftpd / nginx)

    - by user55745
    I've been having a complete nightmare trying to configure proftpd. I've got the ProFTPD server working with an SQL database. However, I want any uploaded files to be viewable by the web server running on the same box. The folders get created in /var/tmp/ as:

        drwx------ 2 ftpuser ftpgroup 4096 Oct 8 20:35 50730c4346512
        drwx------ 2 ftpuser ftpgroup 4096 Oct 8 20:38 50730f3a811ca

    I've tried adding www-data to the group with the following:

        usermod -g www-data ftpuser

    But this doesn't allow the web server access. In proftpd.conf I have the following umask set:

        Umask 0022

    It doesn't seem to make a difference what I set that value to. /etc/group (I'm sure I've messed up one of these two, but I'm getting desperate):

        ftpgroup:x:2001:www-data
        www-data:x:33:ftpgroup

    /etc/passwd:

        www-data:x:33:33:www-data:/var/www:/bin/sh
        proftpd:x:108:65534::/var/run/proftpd:/bin/false
        ftp:x:109:65534::/srv/ftp:/bin/false
        ftpuser:x:2001:33:proftpd user www-data:/bin/null:/bin/false

    The ftpuser table in the database has uid/gid set to 2001 for both. I'm going absolutely crazy trying to solve this; any help would be greatly appreciated. P.S. If I connect to the FTP server manually I can upload files via FileZilla; it just isn't working for the web camera, although there is talky-talky going on between the server and the camera.
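
    For reference, the usual pattern for this situation is a shared group plus the setgid bit, so the web server only needs group read access to whatever the FTP user creates. A hedged sketch using the names above; treat it as an outline to adapt, not a drop-in fix:

        # Put www-data in the FTP group without changing its primary group
        # (-aG appends; the -g form above replaces the primary group).
        usermod -aG ftpgroup www-data
        chgrp -R ftpgroup /var/tmp
        chmod g+rxs /var/tmp        # setgid: new subdirs inherit ftpgroup
        # In proftpd.conf, a directory umask of 022 keeps g+rx on new dirs;
        # the drwx------ listings suggest the effective umask is 077 now:
        #     Umask 022 022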

    Read the article

  • Plesk FTP not working but SFTP and Shell is working

    - by shamittomar
    I am facing a strange problem. FTP on my Plesk VPS is not working. Whenever I try to connect, the FileZilla FTP client says:

        Status: Resolving address of xxxxxxxxxxxxx.com
        Status: Connecting to xxx.xxx.xxx.xxx:21...
        Status: Connection established, waiting for welcome message...
        Error: Could not connect to server

    So it's not even getting to the step of asking for a username/password; it's something else. SFTP on port 22 is working fine, and I can successfully get shell access and run commands. But I NEED FTP access on port 21 too. I have searched everywhere but cannot find any setting to enable it. This is the Plesk version info:

        Parallels Plesk Panel version 9.5.2
        Operating system Linux 2.6.26.8-57.fc8
        CPU GenuineIntel, Intel(R) Pentium(R) 4 CPU 3.00GHz

    Any help is appreciated. [EDIT]: The firewall is not blocking it. I have checked on the server and there are absolutely no blocking rules; the firewall states that all incoming/outgoing connections are accepted on FTP. On the client side (my PC), I can connect to other FTP servers, so this is not an issue with my PC's firewall. Moreover, I cannot even connect to the FTP from online FTP clients like net2ftp.
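
    Since shell access works, the first server-side question is whether anything is listening on port 21 at all. A hedged sketch; on Plesk 9.x the FTP daemon (ProFTPD) is commonly started on demand by xinetd, and the service file name below is an assumption to verify:

        netstat -tlnp | grep ':21 '           # is anything bound to port 21?
        service xinetd status                 # proftpd is usually run via xinetd
        ls /etc/xinetd.d/                     # look for the FTP service file
        grep disable /etc/xinetd.d/ftp_psa    # "disable = yes" would explain it
        tcpdump -ni any port 21               # watch a live connection attempt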

    Read the article

  • Mac OS X will only upload zero-byte files through FTP

    - by tabacitu
    I'm using Mac OS X Lion and I've been having this problem with FTP (any FTP client, mind you; I tried Transmit, FileZilla, Cyberduck and the Terminal, all with the same result). I can browse files in my FTP client, but when I upload files, the client hangs for a few seconds, then thinks it uploaded the files successfully, but it only creates a new file with one blank line in it. Sometimes it manages to upload 4-5 lines. It then returns:

        226 - Error during read from data connection
        226 Transfer aborted

    But 2xx is a success code. It is not a server issue, since any Windows machine uploads just fine over the same network. Can anybody figure out what the problem is? It renders my Mac useless for web development. The problem persists with SFTP and FTP with SSL/TLS. Later edit: Solved! OK, it turns out the problem goes away when I take out my router and connect directly through PPPoE. So the problem is with the router, I thought. But no: the problem is with a Mac that connects through a router that connects through PPPoE and tries to upload using FTP. Pretty specific, I know. The problem is with the MTU (maximum transmission unit). Apparently Mac OS X breaks the file into chunks that are too large for the router to send, because the router's MTU was set lower than Mac OS X's. My router's was 1492, which is OK, but my Mac's MTU was 1500, which is unacceptable. I don't even understand why it works directly with PPPoE. Anyway, if you encounter the same problem, this is how you diagnose and fix it. In Terminal, run:

        ifconfig | grep mtu

    to see what the MTU is for en0 (or en1; mine was en0). If it's 1500, run:

        sudo ifconfig en0 mtu 1300

    This should solve it. If so, it may only last until the next restart. You can also change the MTU in System Preferences / Network / Ethernet - Advanced / Hardware.
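
    Rather than dropping straight to 1300, the largest MTU the path will carry can be probed with ping and the don't-fragment bit. A hedged sketch in macOS/BSD ping syntax; the host is a placeholder, and 28 bytes of IP+ICMP headers ride on top of the payload size:

        # 1472 payload + 28 header bytes = 1500; -D sets don't-fragment.
        ping -D -s 1472 -c 3 example.com
        # PPPoE adds an 8-byte header, so 1464 (+28 = 1492) is the usual fit:
        ping -D -s 1464 -c 3 example.com
        # The largest payload that still gets replies, plus 28, is the MTU
        # to set, e.g.: sudo ifconfig en0 mtu 1492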

    Read the article

  • Euro character messed up during FTP transfer

    - by djechelon
    My customer is using a very outdated ecommerce management system on my hosting service. For that product, no support is provided by the vendor anymore. Brief explanation: the shop website, which claims to run under a LAMP stack, is built by an old Visual Basic Windows application running on MS Access. The user constructs the shop, defines the HTML template, adds products and categories, etc. Then the VB exe builds the PHP pages (one for each template page) and the SQL script to run on MySQL. It also uploads everything via FTP and runs the installation/upgrade script on its own.
    The problem: browsing the website, many products' descriptions are cut off before the euro sign. For example, what was supposed to be "Product price €1000" becomes "Product price".
    The analysis:
    - MySQL contains a description cut off before the € sign, so it's not PHP's fault.
    - The Access databases contain the full descriptions with the € sign, so it's not the webmaster writing bad descriptions or eDisplay cutting them.
    - The SQL that will run once the site gets uploaded, stored on my local machine before upload, contains the € sign.
    - The same script, after being FTPed by eDisplay and opened with nano over SSH, shows the € sign messed up like this: ^À
    - The vsftpd log reports (obfuscated for privacy): Sat Dec 15 11:16:57 2012 22 xxx.xxx.128.13 1112727 /srv/www/domains/xxxxxx.it/htdocs/db.sql b _ i r xxxxxxx ftp 0 * c - which indicates a binary transfer (and also a huge security vulnerability, because you can download the whole database over unauthenticated HTTP).
    - The eDisplay internal FTP client provides no option for ASCII/binary transfer modes.
    - [Add] Manually uploading the SQL file via SFTP messes up the euro as well.
    - [Add2] Manually uploading with the Xftp client in explicit ASCII mode doesn't fix it either.
    It looks like the file gets uploaded as binary. Perhaps on the customer's previous host it all worked fine because that was a Windows host.
    The server: an Azure virtual machine running openSUSE 12.2 with both vsftpd and OpenSSH.
    The question: without asking the customer to manually upload files using FileZilla or to replace € with &euro; (he refuses), what can I do on the server side to prevent vsftpd from screwing up the euro sign?
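
    The fact that the euro survives as a single mangled byte even through binary-faithful transfers suggests a character-set mismatch rather than FTP corruption: in Windows-1252 the euro is the single byte 0x80, while the LAMP side presumably expects UTF-8. A hedged server-side sketch to confirm and recode before the import runs; the source charset is an assumption to verify first:

        file db.sql                          # rough guess at the encoding
        grep -o -b -a -m 1 $'\x80' db.sql    # 0x80 is € in Windows-1252
        # If it really is Windows-1252, recode it to UTF-8:
        iconv -f WINDOWS-1252 -t UTF-8 db.sql > db.utf8.sql \
            && mv db.utf8.sql db.sql

    A hook that runs this after each upload (or setting the MySQL connection charset to latin1 to match the file) would avoid touching the customer's workflow.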

    Read the article

  • Why does my ftp(e)s server fail about half of the time

    - by user1092608
    We are having a discussion at work regarding our FTP server, which runs via vsftpd. Initially we opted to serve FTPES instead of SFTP because this seemed the most flexible and straightforward way for our server to offer secure file transmission. Since then, our FTP server has been a source of issues for our end users: half of the time, users complain about FTP connections not working. I must say, I tested our FTP through different infrastructures (in the field, at random times and random places) and indeed, behind some configurations (no idea how they are set up, because of the 'field' testing), I receive errors, such as:

        Error: Failed to retrieve directory listing (FileZilla)

    Furthermore, behind my basic home configuration everything runs fine. I (think I) did all the basic configuration checks (passive mode? firewall open for all ports? ...) and can't seem to find the source. Being a bunch of techies at our small office, yet knowing nothing about infrastructure, some have started suggesting that the FTPS protocol could be the source of the issues ("No, I've only known SFTP so far", "FTPS is not widespread"). I, however, strongly doubt this hypothesis, since from reading around on the web and asking questions on Server Fault, everyone seems to deny it. So, as I would like to avoid reconfiguring (it involves messing around in our SSH service, our virtual user setup and the FTP service), I need some advice on: 1) what could potentially be the general cause? 2) do you have some general tips? 3) would you mind having a look at my configuration file?

        ----- General Settings -----
        write_enable=YES
        dirmessage_enable=YES
        nopriv_user=ftpsecure
        ftpd_banner="Welcome to XXXX FTP!"
        hide_ids=YES
        hide_file=.*
        max_per_ip=10
        max_clients=10
        local_enable=YES
        local_umask=022
        chroot_local_user=YES
        secure_chroot_dir=/usr/share/empty
        userlist_enable=NO
        userlist_deny=YES
        userlist_file=/etc/vsftp_deny_users
        guest_enable=YES
        guest_username=ftpvirtual
        virtual_use_local_privs=YES
        user_sub_token=$USER
        local_root=/srv/ftp/ftpvirtual/$USER
        anonymous_enable=NO
        syslog_enable=NO
        xferlog_enable=YES
        xferlog_file=/var/log/vsftpd_xfer.log
        connect_from_port_20=YES
        pam_service_name=vsftpd
        listen=YES
        listen_port=21
        pasv_enable=YES
        pasv_min_port=30000
        pasv_max_port=30030
        pasv_address=foo
        ssl_enable=YES
        rsa_cert_file=/etc/vsftpd.pem
        rsa_private_key_file=/etc/vsftpd.pem
        force_local_data_ssl=YES
        force_local_logins_ssl=YES
        ssl_tlsv1=YES
        ssl_sslv2=YES
        ssl_sslv3=YES
        ssl_ciphers=HIGH
        anon_mkdir_write_enable=NO
        anon_root=/srv/ftp
        anon_upload_enable=NO
        idle_session_timeout=900
        log_ftp_protocol=NO
        dsa_cert_file=/etc/vsftpd.pem

    Thanks
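
    One structural point worth knowing: with FTPES the control channel is encrypted, so NAT routers and stateful firewalls along the path can no longer read the PASV reply and open the data port on the fly, which is exactly the kind of fault that appears on some networks and not others. On the server side, the advertised passive range at least has to be reachable directly. A hedged iptables sketch matching the config above:

        # Port 21 plus the advertised passive range must be open explicitly;
        # the ftp connection-tracking helper cannot follow a TLS-encrypted
        # control channel, so nothing will open these ports dynamically.
        iptables -A INPUT -p tcp --dport 21 -j ACCEPT
        iptables -A INPUT -p tcp --dport 30000:30030 -j ACCEPT

    Client-side NATs remain out of your control, which is why "half of the time" failures tend to track the client's network rather than the server.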

    Read the article

  • FTP Server upload and filesystem questions

    - by Alex
    I'm a photographer who mainly does event photography. A while ago I bought myself a Nikon WT-4 wireless transmitter, a small device which connects via USB to my Nikon D700 DSLR and then establishes a WiFi connection to an existing WLAN. It can then upload any pictures I take via FTP to an FTP server somewhere on the network. On my laptop I have a piece of software which checks a given folder on the disk regularly; this software is smart enough to look at the modified-file timestamp, and if that timestamp is less than 10 seconds old, it skips the file in that iteration of the import scan. The problem I've discovered seems to be inherent to the FTP protocol, as I have the same problem with Windows 7's built-in IIS server as I do with FileZilla Server. When the transmitter starts to upload a file, the FTP server creates a small 300-500 KB file with the correct filename on the disk, but then does nothing with the file until it has completely received it via FTP. So it seems to create this small dummy file, buffer the remainder of the upload until it's finished, and then dump the rest into the dummy file, making it the correct size. Problem is, these uploads take about 15-30 seconds depending on reception, but since the folder-watch tool already tries to import any file older than 10 seconds, it always tries to import the small dummy files, which obviously fails as they're not complete yet. Is there any way to 'disable' this behaviour? Ideally I would like my file to show up only once it's been completely uploaded. Or perhaps someone knows another FTP server application (it has to run on Win7) which does not show this behaviour?
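
    If the server's write behaviour can't be changed, the check can be made robust on the watcher side: treat a file as finished only when its size has stopped changing, not when its timestamp looks old. A hedged sketch of that logic in shell form (GNU stat syntax; the folder is a placeholder, and the same idea ports to whatever the import tool scripts in):

        #!/bin/sh
        # Import only files whose size has been stable for 10 seconds.
        WATCH_DIR=/path/to/ftp/incoming          # placeholder
        for f in "$WATCH_DIR"/*; do
            [ -f "$f" ] || continue
            size1=$(stat -c %s "$f")             # BSD/macOS: stat -f %z
            sleep 10
            size2=$(stat -c %s "$f")
            [ "$size1" -eq "$size2" ] && echo "import $f"   # real import step here
        done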

    Read the article

  • FTP Scripting Capabilities

    - by lmg
    I am looking for an FTP client that will allow me to do the following:
    - Include a GUI for setting up a number of FTP connections
    - Support FTPS
    - Run unattended on Windows Server 2008
    - Retry failed transactions
    - Support email
    - Support custom scripts
    I need to pull files from a few different servers, and there are certain calculations that need to be done depending on which server the files come from. I've looked at SmartFTP, and it looks like pretty much what I need, except that I can't get it to run as a Windows scheduled task (I currently have some support threads open on their forum). I've also looked at a few other FTP clients (FileZilla, RoboFTP, and AutoFTP (you can find the Windows 7 BSOD in this one!)) that didn't have the capabilities I'm looking for. Right now I'm looking at WS_FTP and its scripting capabilities. It appears I can create a script to run as a scheduled task, but I can't add the script to a file transfer task. Does anyone have suggestions on how I can do post-transfer processing on the files, or better yet, how to integrate scripting into the file transfer task? I'm also open to other suggestions for FTP clients if you have them! If I can't find a suitable FTP client, custom scripting will just have to do the trick.
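
    For what it's worth, this combination (unattended FTPS pulls, retries via rerun, per-server post-processing) is commonly scripted with lftp, which runs on Windows via Cygwin; whether that beats a GUI client is a judgment call. A hedged sketch with placeholder hosts, paths and processing step:

        #!/bin/sh
        # Pull new files from one server over FTPS, then post-process them.
        HOST=ftp.example.com                 # placeholder per-server settings
        LOCAL=/data/incoming/example
        lftp -u user,password -e "
            set ftp:ssl-force true;
            set ftp:ssl-protect-data true;
            mirror --only-newer /outgoing $LOCAL;
            quit" "$HOST" || exit 1          # nonzero exit lets the scheduler alert
        do_example_calculations "$LOCAL"     # hypothetical per-server hook

    Because mirror --only-newer is idempotent, simply rerunning the job on a schedule doubles as the retry mechanism.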

    Read the article

  • Are my web server permissions for uploading correct?

    - by user1699176
    I'm on Debian and I have my website in the directory /srv/www/mysite.com/public_html. I set chown www-data:www-data on /srv/www. I have root disabled and created a sudo user whose id is 1000:1000. I would also like to use this user to upload to /srv/www, so I added my sudo user to the www-data group. I originally got a message saying that I didn't have permission to upload a file to that directory. After playing around with multiple permissions for a while, I was finally able to upload properly, but I'm not sure if this setup is correct. I'm hesitant to change it for now since it actually works, so I thought I'd ask for advice. I think what I ended up doing was this:

        sudo chown -R www-data:www-data /srv/www
        sudo chmod g+s /srv/www
        sudo usermod -aG www-data myuser
        sudo chgrp -R www-data /srv/www
        sudo chmod -R g+w /srv/www

    When I was finally able to successfully upload a file (with FileZilla), it showed the owner as myuser myuser. Shouldn't it have been www-data myuser? My question is whether this is correct and whether there are any potential security issues. For example, I wasn't sure if I was actually supposed to use "myuser" to own the /srv/www directory instead:

        sudo chown -R myuser:myuser /srv/www

    or maybe:

        sudo chown -R www-data:myuser /srv/www

    If you need more info, let me know. Thanks.
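
    Two facts help to interpret that result: new files always belong to the user who creates them (so myuser as owner is expected), and the setgid bit only makes new files inherit the directory's group, and only in directories that themselves carry the bit. A hedged sanity check along those lines:

        # What does a fresh upload actually get?
        sudo -u myuser touch /srv/www/mysite.com/public_html/probe.txt
        ls -l /srv/www/mysite.com/public_html/probe.txt
        # Expected with this setup: owner myuser, group www-data. Seeing
        # "myuser myuser" means g+s was only set on /srv/www itself, so
        # apply it to every directory in the tree:
        sudo find /srv/www -type d -exec chmod g+s {} +
        sudo rm /srv/www/mysite.com/public_html/probe.txt

    Security-wise, the usual advice is the reverse ownership of what you have: let myuser own the files and give www-data read access via the group, so a web-server compromise cannot rewrite your site.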

    Read the article

  • Windows XP VM on VMWare ESXi 4.1 "pausing" / blocked occasionally

    - by FelixD
    We have an issue with Windows XP SP3 VMs on VMware ESXi 4.1.0 (the free version): they sometimes seem to "pause" for several minutes. This happens rarely (maybe once a month per VM, at least only noticed that often), but it is still an issue for us. It happens for three different but similar VMs on three quite different hosts (different hardware). I have the feeling that the "pausing" is not actually the CPU blocking, but probably the hard disks, though I'm not 100% sure. The servers have one IDE disk (C:) and one SCSI (D:) and it might be either of the two. I have seen scheduled tasks simply not starting for up to 9 minutes and then running normally again at normal speed; they were totally blocked. This is not a load issue: the VMware hosts have average load, and the VMs in question already have reserved CPU resources plus high priorities for CPU and disk. The Windows boxes run mainly MySQL, Tomcat, FileZilla Server, Cygwin stuff, Java + R applications, the VMware client, Elusiva Terminal Server Pro, and the Nagios client. Not sure if this might be related to any of that software (e.g. Elusiva). Trying to debug this, there was nothing visible in the Windows event log, the other logs in C:\Windows, VMware events, etc. Unfortunately the vmware.log file ends with "Log throttled". We found that we ran into two VMware bugs there: the VMware client writes lots of bogus messages to vmware.log, which we have now disabled (log level error setting), plus the bug that VMware does not unthrottle the log (at least so far, despite VM reboots). I know there is not much to go on, and that may also be the reason why I so far haven't found anything related on the web or on Server Fault, but maybe some of this rings a bell with someone? Or please direct me to what more info to post. I hope the vmware.logs get unthrottled eventually (we can't easily restart the hosts at the moment). Thanks for any input!

    Read the article

  • Notepad++ FTP best solution

    - by yoda
    Hi, I've been using Notepad++ for more than 2 years now, and there's only one thing it needs to be perfect for me: an actually-working FTP plugin. It has an FTP plugin, written by someone who left the project a long time ago, and since then nobody has had the courage to improve it. The problem is that it doesn't handle connections very well. Sometimes it loses the connection to the server and literally "blocks", sometimes it doesn't save files properly, sometimes it only loads half of the FTP files, etc. My question is: is there a way to use FTP with Notepad++ (without using its built-in FTP plugin or an FTP client like FileZilla)? I've tried using NetDrive, but it gets stuck sometimes (makes the editor crash), and every time the temporary file is refreshed by Windows / NetDrive, it loads the new file without asking and skips the pointer to the end of the file (very, very annoying). In case you know how to make the built-in Notepad++ FTP plugin work at 100%, I'd be even happier! I'd like some feedback from you guys :) (I'm using Windows Vista) Thanks in advance!

    Read the article

  • Unwanted blank lines when committing from SVN

    - by Alon_A
    I'm using CentOS Linux 5.8 as a web server and TortoiseSVN for synchronizing versions of our code. We write the code in Windows 7 Professional 64-bit with NetBeans and Notepad++. I'm pulling the code files (.php) onto the server from the Linux shell with this command:

        svn co svnFolder serverFolder --username **** --password ****

    The problem is that afterwards, when I open the files directly from the server (for debugging) with Notepad++ (I do View/Edit from FileZilla), I have extra blank lines. Code that looks like this on localhost (in Notepad++):

        private $producer;
        private $account;
        private $admin;
        private $producerEvents;
        private $accountProducers;
        private $adminAccounts;

    looks like this on the server (again, in Notepad++):

        private $producer;

        private $account;

        private $admin;

        private $producerEvents;

        private $accountProducers;

        private $adminAccounts;

    If I upload the files by FTP, no blank lines are added. How can I solve it? Thanks.
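
    Doubled lines on a Linux box from Windows-authored files are the classic CRLF-versus-LF symptom: if a file already contains CRLF endings and another layer translates them again, every line gains a blank one. Subversion's svn:eol-style property is the usual way to take that variable out. A hedged sketch, run from a working copy; the glob and paths are placeholders:

        # Store files with normalized endings; each client then checks out
        # its platform's native form:
        find . -name '*.php' -exec svn propset svn:eol-style native {} +
        svn commit -m "normalize line endings"
        # One-off cleanup of files already sitting on the CentOS box:
        sed -i 's/\r$//' serverFolder/*.php    # dos2unix does the same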

    Read the article

  • How to remove window applet from Gnome3?

    - by Filip Nowak
    I installed the window applet for Gnome 3 today, from this WebUpd8 post. The effect of the installation is shown in this picture: http://i.stack.imgur.com/D1s9b.jpg I tried apt-get remove --purge and nothing happens. How do I remove this window applet? When I try metacity --replace & unity, I get:

        [1] 3171
        Checking if settings need to be migrated ...no
        Checking if internal files need to be migrated ...no
        Backend : gconf
        Integration : true
        Profile : default
        Adding plugins
        Skipping upgrade com.canonical.unity.unity.01.upgrade
        Skipping upgrade com.canonical.unity.unity.02.upgrade
        Initializing core options...done
        Initializing bailer options...done
        Initializing detection options...done
        Initializing composite options...done
        Initializing opengl options...done
        Initializing decor options...done
        Initializing move options...done
        Initializing vpswitch options...done
        Initializing gnomecompat options...done
        Initializing grid options...done
        Initializing mousepoll options...done
        Initializing place options...done
        Initializing resize options...done
        Initializing animation options...done
        Initializing wall options...done
        Initializing session options...done
        Initializing workarounds options...done
        Initializing wobbly options...done
        compiz (expo) - Warn: failed to bind image to texture
        Initializing expo options...done
        Initializing ezoom options...done
        Initializing staticswitcher options...done
        Initializing fade options...done
        Initializing scale options...done
        Screen geometry changed: 0x0x1920x1080
        Initializing unityshell options...done
        DEBUG 2012-02-19 21:22:40 glib <unknown>:0 Setting to primary screen rect: x=0 y=0 w=1920 h=1080
        WARN 2012-02-19 21:22:40 unity.favorites FavoriteStoreGSettings.cpp:138 Unable to load GDesktopAppInfo for 'bluefish.desktop'
        WARN 2012-02-19 21:22:40 unity.favorites FavoriteStoreGSettings.cpp:138 Unable to load GDesktopAppInfo for 'filezilla.desktop'
        WARN 2012-02-19 21:22:40 unity.favorites FavoriteStoreGSettings.cpp:138 Unable to load GDesktopAppInfo for 'gimp.desktop'
        WARN 2012-02-19 21:22:40 glib.glib-gobject <unknown>:0 invalid cast from `BamfWindow' to `BamfApplication'
        (the invalid-cast warning above is repeated about 20 more times)
        Setting Update "texture_filter"
        Setting Update "sync_to_vblank"
        Setting Update "fullscreen_visual_bell"
        Setting Update "panel_opacity"
        Setting Update "launcher_opacity"
        Setting Update "icon_size"
        WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method InfoRequest proxy /com/canonical/unity/lens/applications does not exist
        WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method SetActive proxy /com/canonical/unity/lens/applications does not exist
        WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method InfoRequest proxy /com/canonical/unity/lens/commands does not exist
        WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method SetActive proxy /com/canonical/unity/lens/commands does not exist
        WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method InfoRequest proxy /com/canonical/unity/lens/files does not exist
        WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method SetActive proxy /com/canonical/unity/lens/files does not exist
        WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method InfoRequest proxy /com/canonical/unity/lens/music does not exist
        WARN 2012-02-19 21:23:32 unity.glib.dbusproxy GLibDBusProxy.cpp:255 Cannot call method SetActive proxy /com/canonical/unity/lens/music does not exist
        WARN 2012-02-19 21:23:33 unity.iconloader IconLoader.cpp:509 Unable to load contents of file:///usr/share/icons/unity-icon-theme/places/svg/category-available.svg: Error opening file: No such file or directory
        WARN 2012-02-19 21:23:33 unity.iconloader IconLoader.cpp:509 Unable to load contents of file:///usr/share/icons/unity-icon-theme/places/svg/category-installed.svg: Error opening file: No such file or directory

    (The last two error messages were in Polish in the original log: "Blad podczas otwierania pliku: Nie ma takiego pliku ani katalogu".)
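
    Since apt-get remove --purge needs the exact package name, the first step is finding what the WebUpd8 post actually installed. A hedged sketch; the package and PPA names are unknown here, so these are generic discovery commands:

        # What was installed recently?
        grep " install " /var/log/dpkg.log | tail -20
        dpkg -l | grep -i -e applet -e window
        # Then purge the package found above:
        sudo apt-get remove --purge the-package-name-found-above
        # If it came from a PPA, ppa-purge (a separate package) reverts it:
        sudo ppa-purge ppa:the-webupd8-ppa/name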

    Read the article

  • Things to install on a new machine – revisited

    - by RoyOsherove
    As I prepare to get a new dev machine at work, I'm writing down the things I am going to install on it before writing the first line of code on that machine:
    Control Freak Tools:
    - Everything Search Engine – a free and amazingly fast search engine for files all over your machine (just file names, not inside files). This is so fast I use it almost as a replacement for my start menu, but it's also great for finding those files that get hidden and tucked away in dark places on my system. Ever had a situation where you needed to see exactly how many copies of X.dll were hiding on your machine and where? This tool is perfect for that.
    - Google Chrome. It's just fast. Very fast. And Firefox has become the IE of alternative browsers in terms of speed and memory. Don't even get me started on IE.
    - TweetDeck – get a complete view of what's up on Twitter
    - Total Commander – still my favorite file manager, over five years now.
    - KatMouse – scrolls whatever window you're hovering over, even if it's not the active window, when you use the scroll wheel.
    - PowerISO or Daemon Tools – for loading up ISO images of discs
    - LogMeIn Ignition – quick access to your LogMeIn computers
    - For online backup: JungleDisk or BackBlaze
    - KeePass – save important passwords
    - MS Security Essentials – free antivirus that's quiet and doesn't make a mess of your system.
    - For home: uTorrent – a torrent client that can read RSS feeds (like the ones from ezrss.it)
    - Camtasia Studio and SnagIt – for recording and capturing the screen, then adding cool effects on top.
    - Foxit PDF Reader – much faster than Adobe Reader.
    - Toddler Keys (for home) – for when your baby wants to play with your keyboard.
    - Live Writer – for writing blog posts
    - For Lenovo ThinkPads – Lenovo System Update – if you have a "custom" system instead of the one that came built in, this will keep all your Lenovo drivers up to date.
    - FileZilla – for FTP stuff
    - All the utils from Sysinternals (or try the live links), especially: AutoRuns for deciding what's really going to load at startup, and procmon to see what's really going on with processes in your system
    Developer stuff:
    - Reflector. Pure magic. Time saver. See the source code of any compiled assembly.
    - Resharper. Great for productivity and navigation across your source code
    - FinalBuilder – a commercial build automation tool. Love it. Much better than any XML-based time hog out there.
    - TeamCity – a great visual and friendly server to manage continuous integration. Powerful features.
    - Test Lint – a free add-in for VS 2010 I helped create, which checks your unit tests for possible problems and hints you about them.
    - TestDriven.NET – a great test runner for VS 2008 and 2010 with some powerful features.
    - VisualSVN – a commercial tool if you use Subversion. Very reliable add-in for VS 2008 and 2010
    - Beyond Compare – a powerful file and directory comparison tool. I love the fact that you can right-click in Windows Explorer on any file and select "Select left side to compare", then right-click on another file and select "Compare with left side". Great usability thought!
    - PostSharp 2.0 – for adding system-wide concepts into your code (tracing, exception management). Goes great hand in hand with...
    - SmartInspect – a powerful framework and viewer for tracing in your application. Lots of hidden features.
    - Crypto Obfuscator – a relatively new obfuscation tool for .NET that seems to do the job very well.
    - Crypto Licensing – from the same company – finally a licensing solution that seems to really fit what I needed. And it works.
    - Fiddler 2 – great for debugging and tracing HTTP traffic to and from your app.
    - Debugging Tools for Windows and DebugDiag – great for debugging scenarios.
    Still wanting more? I think this should keep you busy for a while.
    - Regulator and Regulazy – for testing and generating regular expressions
    - Notepad2 – for quick editing and viewing with syntax highlighting

    Read the article

  • Azure Web Sites FTP credentials

    - by Bertrand Le Roy
    A quick tip for all you new enthusiastic users of the amazing new Azure. I struggled for a few minutes finding this, so I thought I'd share. The Azure dashboard doesn't seem to give easy access to your FTP credentials, and they are not the login and password you use everywhere else. What Azure does give you, though, is a publish profile that you can download. This is a plain XML file that should look something like this:

        <publishData>
          <publishProfile profileName="nameofyoursite - Web Deploy" publishMethod="MSDeploy"
              publishUrl="waws-prod-blu-001.publish.azurewebsites.windows.net:443"
              msdeploySite="nameofyoursite" userName="$NameOfYourSite"
              userPWD="sOmeCrYPTicL00kIngStr1nG"
              destinationAppUrl="http://nameofyoursite.azurewebsites.net"
              SQLServerDBConnectionString="" mySQLDBConnectionString=""
              hostingProviderForumLink="" controlPanelLink="http://windows.azure.com">
            <databases/>
          </publishProfile>
          <publishProfile profileName="nameofyoursite - FTP" publishMethod="FTP"
              publishUrl="ftp://waws-prod-blu-001.ftp.azurewebsites.windows.net/site/wwwroot"
              ftpPassiveMode="True" userName="nameofyoursite\$nameofyoursite"
              userPWD="sOmeCrYPTicL00kIngStr1nG"
              destinationAppUrl="http://nameofyoursite.azurewebsites.net"
              SQLServerDBConnectionString="" mySQLDBConnectionString=""
              hostingProviderForumLink="" controlPanelLink="http://windows.azure.com">
            <databases/>
          </publishProfile>
        </publishData>

    The FTP server name, user name and password in the second (FTP) profile are what you need to use in FileZilla or whatever you use to access your site remotely. Notice how the password looks encrypted. Well, it's not really encrypted: this is your password in clear text. It's just crypto-random gibberish, which is the best kind of password. UPDATE: About 2 minutes after I posted that, David Ebbo mentioned to me on Twitter that if you've configured publishing credentials (for Git, typically) those will work too. Don't forget to include the full user name though, which should be of the form nameofthesite\username. The password is the one you defined. That's it. Enjoy.
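
    For scripting against it, the FTP entry can be pulled straight out of the downloaded profile. A hedged sketch assuming xmllint (from libxml2) is available; the file name is a placeholder for whatever the portal gives you:

        PROFILE="nameofyoursite.PublishSettings"
        xmllint --xpath 'string(//publishProfile[@publishMethod="FTP"]/@publishUrl)' "$PROFILE"; echo
        xmllint --xpath 'string(//publishProfile[@publishMethod="FTP"]/@userName)' "$PROFILE"; echo
        xmllint --xpath 'string(//publishProfile[@publishMethod="FTP"]/@userPWD)' "$PROFILE"; echo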

    Read the article

  • emacs tramp performance

    - by Oleg Pavliv
    Is there a way to improve Emacs TRAMP performance? For me it's faster to open an external FTP client (FileZilla), transfer files to the local disk and open them in an external editor (Notepad) than to open them with Emacs. I use Emacs 23.1 under Windows XP. I tried different values of tramp-default-method (telnet, pscp, ftp); all of them have the same performance. Profiling results with elp-instrument-package are the following (I opened 3 remote files of 1.5 MB each):

        tramp-file-name-handler            1461  350.41599999  0.2398466803
        tramp-sh-file-name-handler         1461  350.02699999  0.2395804243
        tramp-send-command                  227  179.63400000  0.7913392070
        tramp-send-command-and-check        205  177.77600000  0.8672000000
        tramp-wait-for-regexp               227  176.47800000  0.7774361233
        tramp-wait-for-output               226  176.40000000  0.7805309734
        tramp-barf-unless-okay               18  133.46699999  7.4148333333
        tramp-handle-insert-file-contents     3  132.046       44.015333333
        tramp-handle-file-local-copy          3  131.281       43.760333333
        tramp-accept-process-output        2375  112.95100000  0.0475583157

    So the actual file transfer takes 132 seconds, about 1/3 of the total time. Why does it spend so much time in tramp-sh-file-name-handler? I tried advising the function tramp-sh-file-name-handler to store and return cached results, but it does not work; probably this function has some side effects. Any ideas how to improve TRAMP performance?

    Read the article

  • I installed XAMPP in a virtual drive and now I can't run its services. Why?

    - by Haris
    Hi, the description is quite long; please spend some time reading it. ^:)^ I have an old PHP application and I'm trying to test and debug it. Unfortunately, the application uses important data, so I can't just click this and that. What I'm trying to do is create a copy of the application on a different computer. From now on, I will call the computer running my original PHP application 'Computer A' and the computer I'm going to use to run the copy 'Computer B'. To prevent missing-link problems, since the application contains static paths (in images, tags, etc.), I have to copy all files and folders related to my PHP application from Computer A to the same path on Computer B. Unfortunately, Computer B only has a drive C, while Computer A has a drive D, and the files of my PHP application are located in 'D:\xampp\htdocs' on Computer A. OK, so I have to create a drive D on Computer B. At first I tried to create a second partition on Computer B using PowerQuest Partition Magic 8, but somehow Partition Magic doesn't run on Computer B; I have tried to reinstall it, but it still doesn't run. So the alternative was to create a virtual drive, which is what I did: I created a virtual drive by running the 'subst' command in the Command Prompt. The virtual drive is D and it refers to the directory 'C:\Virtual'. After I had drive D on Computer B, I installed XAMPP there. The installation was successful. However, when I run the Apache, MySQL, or FileZilla service, I receive an error message:

        Error 3: The system cannot find the file specified.

    On Computer B there is no IIS or other process using port 80. What should I do? Please help me. Many thanks in advance, Haris

    Read the article

  • Alternative to 'Dispatch for ASP' deployment plug-in?

    - by Django Reinhardt
    Hi there, we've recently stumbled across the excellent Dispatch for ASP deployment plug-in. It looks great apart from one thing: it doesn't work with Visual Studio 2010, at least for us, anyway. (It's supposed to work fine.) Yes, we've tried everything: we've managed to get Dispatch working for another FTP site, but not the main one we regularly deploy to. We have managed to connect to our main site through FileZilla FTP, so the site itself is configured correctly. All settings have been triple-checked, but the software still throws up weird errors (always to do with its internal libraries). So does anyone know of any other comparable FTP-based deployment plug-ins for Visual Studio? Here's what Dispatch does (and so any suggested replacement must do):
    - Monitor any altered files in the project.
    - When a file is changed, it's added to a list of files to be deployed.
    - To deploy these files to the live site, all we need to do is click "Upload" and the plugin will connect via FTP to our live site and upload all the files.
    - We can filter out any filenames we don't want to be monitored/uploaded (e.g. .cs or web.config or /Images/, etc.)
    I think that's all the features we need. Thanks for any suggestions!

    Read the article

  • Compiling a click-once app that requires administrator?

    - by Assimilater
    Hi, a lot of my programs require the ability to write files to the hard drive. When I first made these programs for XP, they worked great. Now I'm less ignorant about UAC (I got a new laptop recently), and for future customers I've noticed the potential for a LOT of annoying error messages... and quite frankly, if the program can't write data to the hard drive or the thumb drive it's on, there's no point in running it. I've tried multiple times to build a requirement for administrator (or appropriate user) access into the manifest - I'm not sure if anything less would solve the problem - but have failed because ClickOnce has security features in place that prevent me from doing so. I'd rather not have to tell my customers how to make the program run as administrator by editing the file's properties; I'd much rather have a convenient pop-up like the one you'd see from new programs such as iTunes or FileZilla when they conflict with UAC, requesting the privileges they need. I'd really like to do this but have had little success. Any and all advice that can remedy this grievous problem is appreciated. Thanks.
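
    For context, the UAC prompt described here is driven by the requestedExecutionLevel element in the application manifest (app.manifest in a VB/C# project). A hedged fragment of what that looks like; note that ClickOnce only accepts level="asInvoker", which is exactly the restriction the poster is running into, so this works for conventionally installed builds only:

        <!-- app.manifest fragment: asks Windows for the UAC elevation
             prompt at startup. Not accepted by ClickOnce deployments. -->
        <trustInfo xmlns="urn:schemas-microsoft-com:asm.v2">
          <security>
            <requestedPrivileges xmlns="urn:schemas-microsoft-com:asm.v3">
              <requestedExecutionLevel level="requireAdministrator" uiAccess="false" />
            </requestedPrivileges>
          </security>
        </trustInfo>

    The usual alternatives are writing to per-user locations (such as AppData) so no elevation is needed, or switching from ClickOnce to an installer that can register the elevated manifest.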

    Read the article

  • Deal with update location for click-once.

    - by Assimilater
    I'm not sure how many people here are experts with Visual Studio, but I'd imagine a handful (not to raise expectations, but to appeal to your egos :P). I'm working primarily in Visual Basic for now (though I hope to switch to C# in the near future, and maybe a Java or web app). Basically, I'm trying to create an update feature that will work similarly to how common programs such as Firefox or iTunes update automatically. There is supposed to be functionality for this provided in what is called ClickOnce. I carry out the following procedure and get the following error when trying to change the update URL of my program to a password-protected FTP location:
    1. Go to project properties
    2. Go to Publish
    3. Click Updates
    4. Click Browse
    5. Click FTP Site
    6. Under Server put: web###.opentransfer.com
    7. Under Port: 21
    8. Under Directory put: CMSOFT
    9. Passive mode is selected (which is what FileZilla tells me the server is accessed with)
    10. Anonymous User is unselected and a username and password are typed in
    11. Push OK
    12. Under Update location it shows: ftp://web###.opentransfer.com/CMSOFT
    13. I push OK
    I then see a message box titled Microsoft Visual Basic 2010 Express with an X icon:

        Publish.UpdateUrl: The string must be a fully qualified URL or UNC path, for example "http://www.microsoft.com/myapplication" or "\\server\myapplication".

    I've tried changing the directory to "CMSOFT/PQCM.exe" and the results are the same... hope this was descriptive enough.

    Read the article

  • FTP exception 501 "pathname" more than 8 characters

    - by BigMac66
    I am trying to access a file via a URI using the FTP protocol. For obvious security reasons I had to make some changes, but this is where the problems seem to be coming from. My URI is as follows:

        ftp://user:[email protected]/u/Bigpathname/XYZ/ABC/BigPathname/bigpathname/xyz/abc/MY_LOG.LOG

    And I see this exception:

        sun.net.ftp.FtpProtocolException: CWD Bigpathname:501 A qualifier in "Bigpathname" is more than 8 characters

    This is really confusing, as I can access the file from a Windows 7 command line with the CD command just fine, both one directory at a time and as a full path. I found one article mentioning that MVS file names must be 8 or fewer characters, but this does not explain how I can get to these same files from my command line! They do exist; there is data there that I can download manually, but I cannot get there via a URI in Java. P.S. I use .toURL().openStream() to get files on my local machine just fine; it only fails when I try to get them from my server. EDIT October 1st: I am able to access files on the MVS host using FileZilla and the basic FTP client from the Windows 7 command line, but I still cannot get them from a URI/URL. I downloaded a very basic Java-built FTP client and tried accessing the same file in my program from there; the path works, but because my file name has a dot in it ("MY_LOG.LOG") I am getting:

        File does not exist 501 Invalid data set name "MY_LOG.LOG". Use MVS Dsname conventions.

    I am utterly perplexed by this...

    Read the article

  • PC to Macbook Pro Transition - Getting (re)started?

    - by Torus Linvald
    I'm in my second computer science course right now. I've enjoyed programming so far, but really have just scraped by. I've not done much programming outside of required class work. For similar reasons, I never really invested in downloading/learning software to help me program (IDEs, editors, compilers, etc.). I know it sounds tedious, but my current setup is: Notepad++ for coding; FileZilla to transfer .cpp and .h files to the school's aludra/Unix server for compiling; Unix tells me where my bugs are and I go back to Notepad++ to debug; repeat until done. This isn't fun, and I know it could be easier. But I put it off knowing that I was soon going to switch to a Mac. And tomorrow, I'm switching. So... how should I set up my MacBook for the best programming experience? What IDEs, editors, debuggers and so on should I download? How will Mac programming differ from PC? I'm open to all ideas and comments, even the most basic. (Background: I'm learning/programming in C++ right now. Next semester my classes switch to Java. I'm also going to take a class in web development, with HTML/CSS/JavaScript/PHP. My new laptop will be a late-2009 MacBook Pro with Leopard, or maybe Snow Leopard. Free would be preferable for all programs.) Thank you all.

    Read the article

  • D-Link DIR-300 slows down / loses network

    - by basic6
    Hi there, there are two buildings (A and B). In bldg A is an open WLAN (which I'm allowed to use, btw). In bldg B is a computer that I want to connect to that network. So I flashed an old D-Link DIR-300 AP with DD-WRT, mounted it to the wall (bldg B) near a window, attached a 13 dBi directional antenna (pointing to bldg A) and configured it as an AP client in that wireless network. Then there's another AP, connected to the D-Link AP, acting as a standard access point, which the computer is connected to. That's basically working so far, but: every now and then the connection is lost. Not the connection between the computer and the D-Link (I can access the DD-WRT admin page normally) or the connection between the D-Link and the WLAN (in Status - Wireless it says it's still connected to the network), but when I want to access a web page (which only works if I'm connected to the wireless network from bldg A), my Firefox keeps "Looking for" (name resolution) without finding anything. When I reset the D-Link (power off, power on) in this situation, after some moments everything works fine again (Internet access). I have no idea why this is happening, but usually it's at most every few weeks (most times when nobody was using the computer, so no traffic). Compared to the connection speed when I connect directly to the WLAN in bldg A (laptop), the speed in bldg B is rather slow, but I have the impression that this difference has become worse in the last few days. A few minutes ago, I got 582 KB/s down and 911 KB/s up in bldg A (directly, on the laptop) and 84 KB/s down and 9 KB/s up in bldg B. The speed in bldg B used to be way higher (I remember 200 KB/s up), while the actual network speed in bldg A was lower than it is now (close to those 200). I'm aware that the wireless connection between the buildings should slow things down, but I'm wondering why the difference has become that extreme. Thanks for any tips...
    Update: I currently want to upload a large file (1.5 GB) via FTP (FileZilla). Since that caused the D-Link to disconnect (as described above), I took my laptop to bldg A, connected directly to the original WLAN (bypassing my D-Link) and tried the same upload. Guess what: same issue. At some point the connection is dead (at this point I would have reset my D-Link if I were connected to it). Just like the D-Link, my laptop is still connected, but not even name resolution is working ("Looking for..." in Firefox). After reconnecting, it works again. Maybe my D-Link isn't the problem at all...

    Read the article

  • TCP/IP & throughput between FreeNAS (BSD) server & other LAN machines

    - by Tim Dickerson
    I have a question for someone who knows BSD a bit better than I do, regarding my LAN setup at home/work here outside Chicago. I can't seem to fully optimize my network's (LAN) throughput via my FreeNAS (BSD-based) file server. It runs the latest FreeBSD release, modified to support several protocols for file transfers and more. Every machine behind my Smoothwall (Linux-based) router is on the usual 192.168.0.x subnet and for the most part works just fine. Behind the Smoothwall box, all machines are connected to a GB HP unmanaged switch. I host a large WISP here and have an OC-3 connection at home/work, and I have no issues with downloading/uploading from/to the net. My problem is with throughput. When I try to transfer large files (really any, for that matter) between any of the machines and the FreeNAS server via FTP, the max throughput I can achieve, say between a Win 7 or a Linux box, is ~65 Mbit/sec. All machines are running Intel Pro 1000 GB NICs and all cable is CAT6. Each is set to auto-negotiation and each shows 1500 MTU, full duplex @ 1 GB, so I know the hardware is okay. I have not adjusted the MTU on any machine, as I understand it to be pointless unless certain configurations are used (I assume I am not one of those). My settings for the FreeNAS machine are the following:

        # FreeNAS /etc/sysctl.conf - pertinent settings shown
        kern.ipc.maxsockbuf=262144
        kern.ipc.nmbclusters=32768
        kern.ipc.somaxconn=8192
        kern.maxfiles=65536
        kern.maxfilesperproc=32768
        net.inet.tcp.delayed_ack=0
        net.inet.tcp.inflight.enable=0
        net.inet.tcp.path_mtu_discovery=0
        net.inet.tcp.recvbuf_auto=1
        net.inet.tcp.recvbuf_inc=524288
        net.inet.tcp.recvbuf_max=16777216
        net.inet.tcp.recvspace=65536
        net.inet.tcp.rfc1323=1
        net.inet.tcp.sendbuf_inc=16384
        net.inet.tcp.sendbuf_max=16777216
        net.inet.tcp.sendspace=65536
        net.inet.udp.recvspace=65536
        net.local.stream.recvspace=65536
        net.local.stream.sendspace=65536
        net.inet.tcp.hostcache.expire=1

    From what I can tell, that looks to be a somewhat optimized profile for a typical BSD machine acting as a LAN server. I might be wrong and just wanted to find out from someone who knows BSD better than I do whether that is indeed OK, or whether something is out of tune. Are there other ways I would find better for P2P file transfers? I honestly do not know what I SHOULD be looking for with respect to throughput between the NAS box and another client when transferring files via FTP, but I am told that what I get on average (40-70 MB/sec) is too low for what it could be. I have thought about adding another NIC in both the FreeNAS box and the Win 7 machine and using a crossover cable via a static route, but wanted to check with someone first to see whether that might be worth it. I don't know if doing that would bypass the HP GB switch and allow for machine-to-machine transfers anyway. The FTP client I use is FileZilla, and I have tried both active and passive modes with no real gain over each other. The NAS box runs ProFTPD.
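
    Before tuning sysctls further, it helps to separate raw TCP throughput from FTP and disk behaviour; iperf is the usual tool for that. A hedged sketch, assuming iperf is installed on both ends; the address is a placeholder for the FreeNAS box:

        # On the FreeNAS box: run a TCP sink.
        iperf -s
        # On the Windows/Linux client: push TCP for 10 s with a large window.
        iperf -c 192.168.0.10 -t 10 -w 256k
        # Near-940 Mbit/s here but far less over FTP points at the FTP/disk
        # path; low numbers here too point at NICs, cabling or the switch.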

    Read the article
