Search Results

Search found 14175 results on 567 pages for 'home entertainment'.


  • What can I do about rsync of large files killing my laptop's wifi connection?

    - by David Dean
    When I run an rsync to back up my home folder over the network, like so:
      rsync -avhz --progress --delete /home/dbdean/ [email protected]:/home/backups/david/
    I seem to have problems with my quite large .VirtualBox/HardDisks/Windows XP.vdi file. Occasionally the wifi will silently fail (the transfer stops, and all other network access is broken). If I reconnect the wifi to my network before the transfer times out, it happily keeps going (and other network access is back), but that means I can't leave it unattended; I have to keep an eye on it. I'm guessing this is probably a bug in the wireless card or its driver, triggered by a particularly high sustained volume of network usage, but I'm not really sure where to start with diagnosing this problem so that I can provide a good bug report. Or it could be something else, I guess. Any help would be appreciated. My network card is an Atheros Communications Inc. AR9285, as lspci -k shows:
      43:00.0 Network controller: Atheros Communications Inc. AR9285 Wireless Network Adapter (PCI-Express) (rev 01)
              Subsystem: Hewlett-Packard Company Device 3040
              Kernel driver in use: ath9k
              Kernel modules: ath9k
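
    If the failures really are tied to sustained throughput, one workaround worth trying while the underlying bug is investigated (an untested sketch; the 4000 KB/s cap is an arbitrary placeholder) is to throttle rsync so the link never saturates:
      # cap rsync's bandwidth so the wifi link is never fully saturated
      rsync -avhz --progress --delete --bwlimit=4000 \
          /home/dbdean/ [email protected]:/home/backups/david/
    Lowering --bwlimit until the drops stop would also help pin down the threshold for the bug report.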

  • How do I enable write access for an SFTP-only user under Ubuntu?

    - by Jon Cage
    I'm running Ubuntu 12.04 and am trying to configure a user to allow chroot'd SFTP connections to another section of the filesystem. I've added the following to my /etc/ssh/sshd_config file:
      Match Group mygroup
          X11Forwarding no
          AllowTcpForwarding no
          ForceCommand internal-sftp
          ChrootDirectory /home/%u
    I've set their home directory so that it's owned by root but has their group. I've created a mount --bind from /home/myuser/transfers to /my/filesystem which appears to be navigable. The problem I'm having is that I'm not able to write to any part of the filesystem, which makes this pretty useless as an FTP server. What am I missing? What can I check?
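
    One thing worth checking (a sketch of the usual layout; the paths follow the question, the group name is an assumption): sshd insists that the ChrootDirectory itself, and every path component above it, be owned by root and not group- or world-writable, so the chroot can never be directly writable by the user. The usual pattern is a root-owned chroot with a writable subdirectory inside it:
      # the chroot itself: root-owned, not writable by the user's group
      sudo chown root:root /home/myuser
      sudo chmod 755 /home/myuser
      # a directory inside the chroot that the user can actually write to
      sudo mkdir -p /home/myuser/transfers
      sudo chown myuser:mygroup /home/myuser/transfers
    With that in place, writes should work under /transfers (and through the bind mount, provided the underlying filesystem's permissions allow them).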

  • How do I tar dot files but not dot directories?

    - by bjackfly
    The following tar command will exclude all dot files and dot directories:
      tar -cvzf /media/bjackfly/bkup/bkup.gz --exclude '.*' --one-file-system /home/bjackfly
    In my case I want the dot files in the home directory (.vimrc, .bashrc, etc.) to be backed up, but not the dot directories (.config, .cache, .eclipse, etc.). Any Linux gurus with a command for this? Or do I need to run a find into a tar, or run two different tar commands (one for dot files in the home directory and one for everything else), which is non-ideal?
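
    One possible approach (an untested bash sketch): let find collect only the top-level dot directories and turn them into --exclude flags, so the dot files survive while the dot directories are skipped:
      # build an --exclude flag for every dot directory at the top of the home dir
      excludes=()
      while IFS= read -r -d '' d; do
          excludes+=( "--exclude=$d" )
      done < <(find /home/bjackfly -maxdepth 1 -type d -name '.*' -print0)
      # dot files are still included, because only the directories were excluded
      tar -cvzf /media/bjackfly/bkup/bkup.gz "${excludes[@]}" --one-file-system /home/bjackfly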

  • Apache mod_wsgi error: ImportError: No module named django.core.handlers.wsgi

    - by bigmac
    I am using Python 2.7 with mod_python 3.3.1 and mod_wsgi 3.3. I get an Internal Server Error and this stack trace in the apache logs:
      [Thu Apr 21 10:25:37 2011] [error] [client 83.244.243.242] import django.core.handlers.wsgi
      [Thu Apr 21 10:25:37 2011] [error] [client 83.244.243.242] ImportError: No module named django.core.handlers.wsgi
      [Thu Apr 21 10:25:37 2011] [error] [client 83.244.243.242] mod_wsgi (pid=4463): Target WSGI script '/home/one/codebase/campman/wsgi_handler.py' cannot be loaded as Python module.
      [Thu Apr 21 10:25:37 2011] [error] [client 83.244.243.242] mod_wsgi (pid=4463): Exception occurred processing WSGI script '/home/one/codebase/campman/wsgi_handler.py'.
      [Thu Apr 21 10:25:37 2011] [error] [client 83.244.243.242] Traceback (most recent call last):
      [Thu Apr 21 10:25:37 2011] [error] [client 83.244.243.242]   File "/home/one/codebase/campman/wsgi_handler.py", line 13, in <module>
      [Thu Apr 21 10:25:37 2011] [error] [client 83.244.243.242]     import django.core.handlers.wsgi
      [Thu Apr 21 10:25:37 2011] [error] [client 83.244.243.242] ImportError: No module named django.core.handlers.wsgi
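
    This ImportError usually means the Python interpreter embedded by mod_wsgi cannot see Django on its sys.path -- for instance, Django was installed for a different Python than the one mod_wsgi was built against. A minimal handler sketch for that Django-1.x era (the appended path and settings module name are assumptions about the project layout):
      # wsgi_handler.py -- minimal sketch; adjust paths to the real layout
      import os
      import sys

      # hypothetical: make the project importable by the embedded interpreter
      sys.path.append('/home/one/codebase')
      os.environ['DJANGO_SETTINGS_MODULE'] = 'campman.settings'

      import django.core.handlers.wsgi
      application = django.core.handlers.wsgi.WSGIHandler()
    Running "python -c 'import django'" with the exact interpreter version mod_wsgi was compiled against is a quick way to confirm which installation is missing the package.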

  • Ranking hit after WP site migration

    - by Ben
    I migrated my site from its old domain over a month ago. I followed WMT guidance completely, including 301 redirects from every existing URL to the new domain, followed by submitting a change of address. Traffic continued as normal, but a few days after submitting the change of address, traffic plummeted to about 20-30% of what it was previously. Most of my traffic comes from organic search, and I can see that I am now ranking much lower for the keywords I had previously targeted and performed well with. For some low-competition keywords I've only lost a few places; for higher-competition terms I have really suffered. This has started to pick up a bit (for one of my keywords I have risen from position 195 to 100 in the last week), but it seems to be a very slow process. How seamless is this process normally? I was under the impression that it would not affect my rankings too severely, but it has now been a month since the move and recovery seems to be very slow, if it is happening at all. Is it likely that I've missed something? The only change is that I have moved what was the home page to be more of a sub-page, and in its place is now a magazine-style home page. I understand that links to the old site now point to the latter, which means that rankings for some keywords attributed to the old home page will take a hit, but even on other pages that fit exactly the same page structure as the previous site I have seen a drop in rankings. Any help would be greatly appreciated. Thanks!

  • Is the difference between sudo and gksu the same as the difference between sudo -i and sudo -s?

    - by fred.bear
    Is the difference between sudo cmd and gksu cmd the same as the difference between starting a shell with sudo -i and sudo -s? Or, put another way, is sudo cmd the same as sudo -i cmd, and gksu cmd the same as sudo -s cmd? EDIT: This is based on what I read on an Ubuntu documentation page, which says: "You should never use normal sudo to start graphical applications as root. You should use gksudo (kdesudo on Kubuntu) to run such programs. gksudo sets HOME=~root, and copies .Xauthority to a tmp directory. This prevents files in your home directory becoming owned by root. (AFAICT, this is all that's special about the environment of the started process with gksudo vs. sudo)." The "AFAICT" doesn't really give me full confidence that there is nothing more to it. (A belated UPDATE: two months later, I tested the claim that "This prevents files in your home directory becoming owned by root": all the files I created via sudo/gksu were owned by user root and group root.) I've read parts of info sudo and noticed that -i and -s seem to be doing the same thing as the AFAICT environment issue... but I hit overload, so I've asked my question here. PS: My question is not about sudo vs gksu. It is more about: is gksu the same as sudo -s, and if not, how do they differ?
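
    A quick way to see the practical difference (a sketch; the exact result varies with the sudo version and its env_reset settings) is to compare the environment each variant hands to the started process:
      # -i simulates a full root login, so HOME becomes /root;
      # -s historically kept more of the caller's environment, including HOME
      sudo -s env | grep -E '^(HOME|USER|LOGNAME)='
      sudo -i env | grep -E '^(HOME|USER|LOGNAME)='
    If the two print different HOME values, anything the command writes under $HOME lands in different places -- which is exactly the file-ownership trap the documentation page warns about for graphical apps.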

  • Use AssignUserId as a variable in Apache MPM-ITK

    - by Robert Hanson
    I heard that the MPM-ITK module for Apache can change Apache's behaviour so that it accesses folders and files under a UID or GID given in the configuration, instead of the default one (www-data). For example:
      <IfModule mpm_itk_module>
          AssignUserId user group
      </IfModule>
    Is it possible to make the username and group a variable? I want to make Apache access each /home folder as its owner. For example, /home/me should only be accessed as the user me, while /home/you should only be accessed as you.
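
    AssignUserId itself takes literal names, so the usual pattern is one VirtualHost per user (a sketch with hypothetical hostnames below). Newer mpm-itk releases reportedly also provide AssignUserIDExpr for expression-based assignment, which is worth checking in the documentation for your version:
      <VirtualHost *:80>
          ServerName me.example.com
          DocumentRoot /home/me
          <IfModule mpm_itk_module>
              # every request for this vhost runs as "me"
              AssignUserId me me
          </IfModule>
      </VirtualHost>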

  • Apache, Rewrite Rule and Directories

    - by milo5b
    My sites-available/ file looks something like the following:
      <VirtualHost *:80>
          ServerAdmin webmaster@mysite
          ServerName mysite.co.uk
          ServerAlias www.mysite.co.uk
          DocumentRoot /home/mysite.co.uk/htdocs/
          <Directory /home/mysite.co.uk/htdocs/>
              Options -Indexes FollowSymlinks MultiViews
              AllowOverride All
              Order allow,deny
              allow from all
          </Directory>
          ErrorLog ${APACHE_LOG_DIR}/mysite.co.uk/error.log
          LogLevel warn
          CustomLog ${APACHE_LOG_DIR}/mysite.co.uk/access.log combined
      </VirtualHost>
    In .htaccess (in htdocs/), I have (amongst others) the following rewrite rule:
      RewriteRule ^enquiries$ /enquiries.php
    Somehow I also have a directory named "enquiries" (/home/mysite.co.uk/htdocs/enquiries/), and when I hit the URL www.mysite.co.uk/enquiries I get:
      HTTP/1.1 301 Moved Permanently
      Date: Mon, 10 Dec 2012 18:53:37 GMT
      Server: Apache/2.2.16 (Debian)
      Location: http://www.mysite.co.uk/enquiries/
      Vary: Accept-Encoding
      Content-Type: text/html; charset=iso-8859-1
    and a browser then displays the directory's contents. Now, I could easily rename the folder and get it sorted, but I would like to understand what's going on here. What would be the correct way to configure Apache so that it won't behave this way and instead obeys the rewrite rule? If I did not explain myself clearly, please feel free to ask more questions; I'd be happy to answer them. Thanks!
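
    The 301 is mod_dir at work: when a request matches an existing directory, its DirectorySlash behaviour redirects to the trailing-slash form, which here wins out over the rewrite. One possible fix (an untested sketch) is to switch that redirect off where the rewrite should apply:
      # in the .htaccess in htdocs/
      DirectorySlash Off
      RewriteEngine On
      RewriteRule ^enquiries$ /enquiries.php [L]
    Note that with DirectorySlash Off, requests for real directories no longer get their canonical trailing slash, so it is best paired with Options -Indexes (already set here) to avoid exposing listings by accident.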

  • I just moved, have no internet on my laptop or iPod, but everyone else who lives here does

    - by Kay
    I just moved to Thunder Bay, and my laptop as well as my iPod cannot connect to the internet. My laptop lets me enter the password for the wifi, but I still have no internet connection. When I try to use the cable, the computer tells me I have a perfect connection, and even the icon shows that it's working, but I can't open any web pages or use any internet functions. When I try to use MSN it sends me to a troubleshooting option and informs me that there's some kind of problem with the "gateway". I have unplugged the modem and the router and plugged them back in; this did not help. I am living in a home where all the people are using wifi on the same system as me, and no one has ever had any problems. Back home, both my laptop and my iPod worked without a problem, both in my home and on campus. Since this problem seems to be limited to me, that would indicate a problem on my end -- with my laptop. But in that case my iPod would presumably still be working, and it has never failed to connect before. Any suggestions would be much appreciated.
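
    Since the troubleshooter points at the gateway, a few quick checks from the laptop can narrow things down (a sketch, assuming the laptop runs Windows; the gateway address printed by the first command is what the second should target):
      ipconfig /all          (was an IP address and default gateway actually assigned?)
      ping 192.168.1.1       (replace with the gateway address ipconfig reported)
      nslookup example.com   (if the gateway answers but this fails, it is a DNS problem)
    A static IP or DNS server left configured from the old network is a common culprit after a move, and would explain why only these two devices are affected.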

  • How to make FileZilla open all the required files with one click

    - by Omar Tariq
    Is there any way of configuring FileZilla so that I can open all the files I need to edit on a server with just one click? For example, if the files are like this:
      /home/abc/def/one.txt
      /home/abc/def/yet/another/directory/two.txt
      /home/abc/def/ghi/yet/another/directory/three.txt
    then it is very time-consuming to navigate through each directory and open the required files. These are only 3 files, but what if there were around 10 to 20? Yes, copying the paths of the directories is one option, but something built in -- a button like "open all the required files of this connection" that opens all the files in the editor (as set in the FileZilla preferences) -- would be great!
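
    FileZilla itself has no such batch-open feature as far as I know, so one workaround sketch is to bypass the FTP client entirely: mount the remote tree locally and open everything in one command (this assumes SSH access to the server and an installed sshfs; "editor" stands for whatever editor is configured):
      # mount the remote directory over SSH, then open all files at once
      mkdir -p ~/remote-def
      sshfs user@server:/home/abc/def ~/remote-def
      editor ~/remote-def/one.txt \
             ~/remote-def/yet/another/directory/two.txt \
             ~/remote-def/ghi/yet/another/directory/three.txt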

  • How to create public_html (apache2) with LDAP authentication?

    - by borjamf
    I'm running Apache2 on Ubuntu 12.04 Server, and I want to create a home directory for each LDAP user. I'm using LDAP for authentication and it's working OK. I've also done some tests with the LDAP module for Apache2, and it's working OK. The problem with this LDAP authentication is that any successful login can access any ~user/public_html, even if the user is not the owner of that home directory: for example, userldap2 can access userldap1/public_html. I don't know how to control that; I want only userldap1 to be able to access userldap1's directory. Could anybody tell me how to control that with LDAP authentication? I hope that you'll understand me. My config (auth_ldap.conf):
      <Directory /home/disco2/*/public_html>
          AuthName "Authentication"
          AuthType basic
          AuthBasicProvider ldap
          AuthzLDAPAuthoritative off
          AuthLDAPURL ldap://prueba.borja/dc=prueba,dc=borja?uid?
          Require ldap-filter objectClass=posixAccount
      </Directory>
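
    The Require line above accepts any posixAccount, which is why every authenticated user gets into every directory. The simplest (if repetitive) fix sketch is an explicit block per user, each requiring exactly that account -- the block below is hypothetical and would typically be generated by a script looping over the user list:
      <Directory /home/disco2/userldap1/public_html>
          AuthName "Authentication"
          AuthType basic
          AuthBasicProvider ldap
          AuthzLDAPAuthoritative off
          AuthLDAPURL ldap://prueba.borja/dc=prueba,dc=borja?uid?
          # only this one account may access this directory
          Require ldap-user userldap1
      </Directory>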

  • How to rebuild fstab automatically

    - by yvoyer
    I accidentally removed all the entries from the fstab file while doing a backup (yeah, I know ;)). I would like to know if there is a way to rebuild it with the current mount options, as I have not restarted the server since the deletion. If there is no such program, could anybody help me rebuild it? Using this, I have found the commands to show the current setup, but I don't know what to do with their output.
      $ sudo blkid
      /dev/sda1: UUID="3fc55e0f-a9b3-4229-9e76-ca95b4825a40" TYPE="ext4"
      /dev/sda5: UUID="718e611d-b8a3-4f02-a0cc-b3025d8db54d" TYPE="swap"
      /dev/sdb1: LABEL="Files_Server_Int" UUID="02fc2eda-d9fb-47fb-9e60-5fe3073e5b55" TYPE="ext4"
      /dev/sdc1: UUID="41e60bc2-2c9c-4104-9649-6b513919df4a" TYPE="ext4"
      /dev/sdd1: LABEL="Expansion Drive" UUID="782042B920427E5E" TYPE="ntfs"
      $ cat /etc/mtab
      /dev/sda1 / ext4 rw,errors=remount-ro 0 0
      proc /proc proc rw,noexec,nosuid,nodev 0 0
      none /sys sysfs rw,noexec,nosuid,nodev 0 0
      none /sys/fs/fuse/connections fusectl rw 0 0
      none /sys/kernel/debug debugfs rw 0 0
      none /sys/kernel/security securityfs rw 0 0
      none /dev devtmpfs rw,mode=0755 0 0
      none /dev/pts devpts rw,noexec,nosuid,gid=5,mode=0620 0 0
      none /dev/shm tmpfs rw,nosuid,nodev 0 0
      none /var/run tmpfs rw,nosuid,mode=0755 0 0
      none /var/lock tmpfs rw,noexec,nosuid,nodev 0 0
      none /lib/init/rw tmpfs rw,nosuid,mode=0755 0 0
      none /var/lib/ureadahead/debugfs debugfs rw,relatime 0 0
      /dev/sdc1 /home ext4 rw 0 0
      /dev/sdb1 /media/Files_Server ext4 rw 0 0
      binfmt_misc /proc/sys/fs/binfmt_misc binfmt_misc rw,noexec,nosuid,nodev 0 0
      /dev/sdd1 /media/Expansion\040Drive fuseblk rw,nosuid,nodev,allow_other,blksize=4096,default_permissions 0 0
      gvfs-fuse-daemon /home/yvoyer/.gvfs fuse.gvfs-fuse-daemon rw,nosuid,nodev,user=yvoyer 0 0
      /dev/sdd1 /media/Backup500 fuseblk rw,nosuid,nodev,sync,allow_other,blksize=4096,default_permissions 0 0
      /dev/sr0 /media/DIR-615 iso9660 ro,nosuid,nodev,uhelper=udisks,uid=1000,gid=1000,iocharset=utf8,mode=0400,dmode=0500 0 0
      gvfs-fuse-daemon /home/cdrapeau/.gvfs fuse.gvfs-fuse-daemon rw,nosuid,nodev,user=cdrapeau 0 0
      $ sudo du -sh /tmp/
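
    Since everything is still mounted, /etc/mtab plus blkid together contain all the needed information. A rough bash sketch that drafts UUID-based entries for the real block devices (print-only; review the output by hand before writing it to /etc/fstab -- swap, for instance, never appears in mtab and must be added from the blkid output):
      # draft fstab lines for every mounted /dev/* device; prints to stdout
      while read -r dev mnt fs opts _ _; do
          case "$dev" in
              /dev/*)
                  uuid=$(blkid -s UUID -o value "$dev")
                  printf 'UUID=%s  %s  %s  %s  0  2\n' "$uuid" "$mnt" "$fs" "$opts"
                  ;;
          esac
      done < /etc/mtab
    The root filesystem conventionally gets fsck pass number 1 instead of 2, and the virtual filesystems (proc, sysfs, the tmpfs mounts) are set up by the init system on a release of this vintage, so they can usually be left out.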

  • Linux server cannot be pinged

    - by misamisa
    I have set up a Linux server in a DMZ. There is another, Windows server running in the same DMZ. Both of these servers can be pinged via the internet from my home PC. However, a Linux server rented from a hosting service provider can only be pinged from the Windows server, not from the DMZ Linux server. So the situation is:
      Windows server (DMZ) ---ping--- Rented server ..... successful
      Linux server (DMZ) ---ping--- Rented server ....... unreachable
      Home PC ---ping--- Linux server (DMZ) ............. successful
      Home PC ---ping--- Windows server (DMZ) ........... successful
    When I ran tcpdump on my Linux server (DMZ) and started a ping from the rented server, it showed that the Linux server (DMZ) was receiving the pings and replying. There is no restriction defined in the hosts.deny or hosts.allow files that might cause this problem. What else should I check to get this working?
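
    Given that tcpdump shows requests arriving and replies being sent, the replies are most likely dropped somewhere on the way back. A few checks worth running on the DMZ Linux server (a sketch; 203.0.113.10 is a placeholder for the rented server's real address):
      # is a local firewall rule silently dropping or rejecting ICMP?
      sudo iptables -L -n -v
      # is the kernel configured to ignore echo requests altogether?
      sysctl net.ipv4.icmp_echo_ignore_all
      # which route and source address are used towards the rented server?
      ip route get 203.0.113.10
    If all of that looks sane, the firewall in front of the DMZ is the next suspect, since it evidently treats the two DMZ hosts differently.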

  • Session management error: None of the authentication protocols specified are supported

    - by JBWhitmore
    The title is the first error that sent me on a mission to fix things. Motivation: I was trying to install the new Enthought Python Distribution when the error above first showed up. The install finished, and it looked like it flagged dcopserver problems a few more times:
      Please check that "dcopserver" program is running!
      Could not read network connection list: ~/home/user/.DCOPserver_host__0
    When running ipython from the distribution, it claims that readline (the ability to up-arrow through history or tab-complete) is not available for my system. It is available, though: if I run the ipython sitting in /usr/bin/ipython, all readline features work perfectly. So I tried to repair the install by fixing what I thought could be causing the problems. Bad things that are happening that I want fixed:
      - When restarting I get the error: Could not update ICEauthority file /home/username/.ICEauthority.
      - readline doesn't work with Enthought's ipython.
    Things I have tried:
      - changed the owner of my ~/.ICEauthority to be me
      - changed the owner of my home directory (and all nested files and folders) to be me
      - double-checked that /var/lib/gdm was owned by Gnome (yep)
      - attempted to reinstall the DCOP/kbuildsycoca stuff (fail)
      - removed nautilus; rebooted; reinstalled; rebooted; removed ubuntu-desktop; rebooted; reinstalled; rebooted
    Any suggestions on how to fix the Bad Things would be greatly appreciated! Computer: Ubuntu 10.04 x86
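
    The ICEauthority error usually means the file ended up owned by root at some point, often from starting a graphical application with plain sudo. A common fix sketch (chown's trailing colon assigns the user's login group):
      # hand .ICEauthority back to the login user and tighten its mode
      sudo chown "$USER": ~/.ICEauthority
      chmod 600 ~/.ICEauthority
      # .Xauthority suffers the same way, so it is worth the same treatment
      sudo chown "$USER": ~/.Xauthority
    If the error persists after a reboot, "ls -la ~ | grep root" will show whether any other dotfiles in the home directory are still root-owned.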

  • How can I avoid repeating DocumentRoot in this Apache virtual host?

    - by David Faux
    I have an Apache virtual host configured for a website powered by Wordpress:
      <VirtualHost *:80>
          ServerName 67.178.132.253
          DocumentRoot /home/david/wordpressWebsite

          # BEGIN WordPress
          <IfModule mod_rewrite.c>
              RewriteEngine On
              RewriteRule ^index\.php$ - [L]
              RewriteCond /home/david/wordpressWebsite%{REQUEST_FILENAME} !-f
              RewriteCond %{REQUEST_FILENAME} !-d
              RewriteRule . /index.php [L]
          </IfModule>
          # END WordPress
      </VirtualHost>
    How can I avoid hard-coding /home/david/wordpressWebsite twice? I don't want to use REQUEST_URI, since that involves an extra request.
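
    One sketch: declare the path once with Define and reference it as a variable elsewhere (Define and ${...} interpolation need Apache 2.4; on 2.2, moving the rewrite rules into a <Directory> block has a similar effect, because there %{REQUEST_FILENAME} is already a full filesystem path and needs no prefix):
      Define wp_root /home/david/wordpressWebsite
      <VirtualHost *:80>
          ServerName 67.178.132.253
          DocumentRoot ${wp_root}
          <Directory ${wp_root}>
              RewriteEngine On
              RewriteRule ^index\.php$ - [L]
              RewriteCond %{REQUEST_FILENAME} !-f
              RewriteCond %{REQUEST_FILENAME} !-d
              RewriteRule . /index.php [L]
          </Directory>
      </VirtualHost>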

  • Apache file negotiation failed

    - by lorenzo.marcon
    Hi, I'm having the following issue on a host using Apache 2.2.22 + PHP 5.4.0. I need to serve the file /home/server1/htdocs/admin/contents.php when a user makes the request http://server1/admin/contents, but I get this message in the server error_log:
      Negotiation: discovered file(s) matching request: /home/server1/htdocs/admin/contents (None could be negotiated)
    Notice that I have mod_negotiation enabled and MultiViews among the options for the related virtual host:
      <Directory "/home/server1/htdocs">
          Options Indexes Includes FollowSymLinks MultiViews
          Order allow,deny
          Allow from all
          AllowOverride All
      </Directory>
    I also use mod_rewrite, with the following .htaccess rules:
      RewriteEngine On
      RewriteBase /
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteRule ^([^\./]*)$ index.php?t=$1 [L]
      </IfModule>
    It seems very strange, but on the same box with PHP 5.3.6 it used to work correctly. I'm just trying an upgrade to PHP 5.4.0, but I cannot solve this negotiation issue. Any idea why Apache cannot match contents.php when asking for contents (which should be exactly what mod_negotiation is supposed to do)?
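
    A plausible explanation: MultiViews only negotiates among candidate files whose media type it recognizes, and if the PHP 5.4 upgrade changed how .php is wired up (a handler rather than a MIME type), contents.php silently drops out of the candidate list. mod_mime's MultiviewsMatch directive relaxes that rule -- a sketch:
      <Directory "/home/server1/htdocs">
          # let MultiViews also consider files that only have a handler
          # associated with them (such as .php scripts), not just a MIME type
          MultiviewsMatch Handlers
      </Directory>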

  • Install Mouse Driver on Windows 7 Professional 64 bit

    - by Soren
    I have a mouse that I absolutely love (I've been using them for years), an A4Tech WOP-35. It has dual scrollers and 5 buttons, 3 of which are programmable. I use them at work and at home. At work I am using Windows 7 Enterprise (32-bit); at home I am using Windows 7 Professional (64-bit). The drivers installed easily on my machine at work. Unfortunately, they will not install on my computer at home. When I double-click on the Setup.exe, it asks me if I want to install it, and of course I click "Yes", but nothing happens. When I say nothing happens, I mean nothing happens: it appears that it doesn't even try to install. The same thing happens when I right-click on the setup.exe and select "Run as administrator". How can I get around this? I am guessing it is because I am running the 64-bit version of Windows.

  • How to change the document root to public_html from the root directory

    - by manish
    For testing, I hosted my website on a free server from 000webhost.com. They have this directory structure:
      (root folder)   /
      (public folder) /public_html
    This structure makes it possible to keep all the library files in the root folder and all public data in /public_html, so I developed my website accordingly, and my final structure looked like:
      /
      /include            (library files)
      /logs               (log files)
      /public_html
      /public_html/index.php
      /public_html/home.php
      ...and other public files
    000webhost makes only the public_html folder accessible via URL, so my URLs looked neat and clean, like www.example.com/index.php or www.example.com/home.php. After completing development I moved the website to a shared host purchased from go-daddy.com, which has no such directory separation: all the files are kept in the root folder and all are accessible via URL, so my URLs have become:
      www.example.com/public_html/home.php or www.example.com/public_html/index.php
    How can I redirect URL requests to the public_html folder again, so that the library files are unavailable to public access and the URLs are neat and clean?
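
    When a host only provides a single web root, one common workaround sketch is an .htaccess in that root which silently maps clean URLs into public_html and refuses direct access to the library folders (the folder names are taken from the question; mod_rewrite must be enabled on the host):
      RewriteEngine On
      # block direct requests to the private folders
      RewriteRule ^(include|logs)(/|$) - [F,L]
      # serve the site root from public_html
      RewriteRule ^$ public_html/index.php [L]
      # internally map everything else into public_html; the -f/-d checks
      # stop the rewriting once the target resolves to a real file
      RewriteCond %{REQUEST_URI} !^/public_html/
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteRule ^(.*)$ public_html/$1 [L]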

  • Serving static files fails - nginx

    - by Sergei
    Hi, I've been looking and trying around all night, but without success. I configured nginx to serve my static files and proxy all the other traffic:
      server {
          listen 80;
          server_name mydomain.com;

          access_log /home/boudewijn/www/bbt/brouwers/logs/access.log;
          error_log /home/boudewijn/www/bbt/brouwers/logs/error.log;

          location / {
              proxy_pass http://127.0.0.1:8080;
              include /etc/nginx/proxy.conf;
          }

          location /media/ {
              root /home/boudewijn/www/bbt/brouwers/;
          }
      }
    The proxy passing is no problem, but when I go to mydomain.com/media/ or try to access any test file there, I have no success. I paid attention to the difference between root and alias, my media folder exists, and I paid attention to the trailing slashes, but I still get a 404 when trying to access my static media files. Any help?
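
    Worth double-checking (a sketch, not a confirmed diagnosis): with root, nginx appends the full location prefix to the path, so the files must physically live under /home/boudewijn/www/bbt/brouwers/media/. If they actually live somewhere else, alias maps the prefix away instead:
      location /media/ {
          # root  => /home/boudewijn/www/bbt/brouwers/media/<file>
          # alias => /home/boudewijn/www/bbt/static/<file>   (hypothetical path)
          alias /home/boudewijn/www/bbt/static/;
      }
    The error log configured above records the exact filesystem path of each failed open, which settles the root-versus-alias question quickly.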

  • Virtualmin deactivating PHP on new virtual servers

    - by Josh
    This is related to my other question... but the situation is much worse now. After updating to the most recent version of Virtualmin, when I create new accounts, Virtualmin sets up their VirtualHost entries as follows:
      <Directory /home/username/public_html>
          Options -Indexes +IncludesNOEXEC +FollowSymLinks +ExecCGI
          allow from all
          AllowOverride All
          AddHandler fcgid-script .php
          FCGIWrapper /home/username/fcgi-bin/php.fcgi .php
      </Directory>
      <Directory /home/username/cgi-bin>
          allow from all
      </Directory>
      [...]
      RemoveHandler .php
    Now, not only is it specifically inserting AddHandler fcgid-script and FCGIWrapper, which I do not want because I am using mod_fastcgi, but it's also setting up PHP in such a way that it will never work: it adds a RemoveHandler .php after setting up the handler for PHP! Where is this behavior configured, and how can I stop it? Better yet, how can I make Virtualmin not include any PHP commands at all in the VirtualHost section?

  • How to selectively route network traffic through VPN on Mac OS X Leopard?

    - by newtonapple
    I don't want to send all my network traffic down the VPN when I'm connected to my company's network (via VPN) from home. For example, when I'm working from home, I would like to be able to back up all my files to the Time Capsule at home and still be able to access the company's internal network. I'm using Leopard's built-in VPN client. I've tried unchecking "Send all traffic over VPN connection", but if I do that I lose access to my company's internal websites, be it via curl or the web browser (though internal IPs are still reachable). It'd be ideal if I could selectively choose a set of IPs or domains to be routed through the VPN and keep the rest on my own network. Is this achievable with Leopard's built-in VPN client? If you have any software recommendations, I'd like to hear them as well.
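
    The detail that internal IPs still work while names fail suggests the split routing is already mostly in place and only DNS is going astray. A sketch of the usual manual approach after connecting with "Send all traffic" unchecked (the subnet, interface, domain and DNS address here are all assumptions):
      # route just the company's subnet over the VPN interface
      sudo route add -net 10.0.0.0/8 -interface ppp0
    For internal hostnames, pointing lookups for the company domain at the company's DNS server usually restores access to the internal websites -- on Leopard that can be done with a file like /etc/resolver/corp.example.com containing a single line such as "nameserver 10.0.0.2".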

  • /tmp shows 690 MB full, actual size 72 K -- why?

    - by Ankit
    Why is the /tmp directory on my system showing 690 MB full, whereas du -sh /tmp shows only 72K?
      drwxrwxrwt 2 lightdm lightdm  4096 Aug 29 21:49 at-spi2
      drwx------ 2 ankit   ankit    4096 Aug 29 21:50 keyring-0JTfoY
      drwx------ 2 ankit   ankit    4096 Aug 29 21:44 keyring-rChLLL
      drwx------ 2 root    root    16384 Jul 22 02:10 lost+found
      drwx------ 2 ankit   ankit    4096 Jan  1  1970 orbit-ankit
      drwx------ 2 lightdm lightdm  4096 Aug 29 21:50 pulse-2L9K88eMlGn7
      drwx------ 2 root    root     4096 Aug 29 21:44 pulse-PKdhtXMmr18n
      drwx------ 2 ankit   ankit    4096 Aug 29 21:50 pulse-zR1TZUAZfmQW
      drwx------ 2 ankit   ankit    4096 Aug 29 21:44 ssh-dlslOXOq2203
      drwx------ 2 ankit   ankit    4096 Aug 29 21:50 ssh-MrQQVRyy3316
      -rw------- 1 ankit   ankit       0 Aug 29 21:45 tmp0qnNG4
      -rw------- 1 ankit   ankit       0 Aug 29 21:50 tmpVvSMt6
      -rw------- 1 ankit   ankit       0 Aug 29 21:49 tmpy9Gadz
      -rw-rw-r-- 1 lightdm lightdm     0 Aug 29 21:44 unity_support_test.0
      ankit@duster:/tmp$ df -h
      df: `/home/ankit/.gvfs': Transport endpoint is not connected
      Filesystem      Size  Used Avail Use% Mounted on
      /dev/sda1        79G   11G   65G  14% /
      udev            2.9G  4.0K  2.9G   1% /dev
      tmpfs           1.2G  868K  1.2G   1% /run
      none            5.0M     0  5.0M   0% /run/lock
      none            2.9G  220K  2.9G   1% /run/shm
      /dev/sda7        38G  690M   35G   2% /tmp
      /dev/sda5        93G   26G   63G  30% /home
      /dev/sda6        93G  1.6G   87G   2% /boot
      /dev/sda3       154G   69G   78G  48% /home/mount_150
      ankit@duster:/tmp$ sudo du -sh /tmp/
      72K
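
    Two usual suspects are worth ruling out here (a sketch of the checks; /dev/sda7 comes from the df output above): space held by deleted-but-still-open files, and plain ext4 overhead (journal, pre-allocated inode tables), which on a freshly made 38G filesystem can easily account for a few hundred MB that du never sees:
      # is any process holding deleted files open on the /tmp filesystem?
      sudo lsof +L1 /tmp
      # how much of the usage is filesystem metadata rather than data?
      sudo dumpe2fs -h /dev/sda7
    If lsof turns up nothing, the df-versus-du gap is almost certainly just the ext4 metadata counted as "used" blocks.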

  • How to disable "N" Wireless Mode RTL8192 (Thinkpad Edge 15 Core i5) in natty

    - by Gustavo Rubio
    I've seen that many owners of ThinkPad Edges, which are supposed to be Linux-friendly, have problems with the wireless adapter. I've found several links on Ask Ubuntu and in ubuntuforums with a lot of workarounds for those problems; mine seems to be weird, though. I use my laptop both at the office and at home. At home I have a router which is A/B/G, and there the wireless connection works just fine, using a WEP key. But at work I have a B/G/N wireless router and it doesn't work. My guess is that this adapter works in N mode, but somehow this is buggy in the driver bundled with Natty. I've tried to disable the "N" mode on the router, but that didn't work. Later I went to the Realtek website, downloaded their driver, and compiled it myself. That kinda seems to work most of the time, but sometimes websites keep trying to load, or load only partially, and images start to look like their links are broken, and so on -- much like what you get when a page is loading and the connection is suddenly lost. This problem, as I said, happens only when using the Realtek driver from their website. dmesg gives me a lot of these:
      [ 5869.049454] rtl8192se_update_ratr_table: ratr_index=0 ratr_table=0x00000ff5
      [ 5879.240563] DHCP pkt src port:68, dest port:67!!
    So I thought I might as well switch back to the original driver, which seems to work just fine on A/B/G wireless networks but not on N networks. If anybody knows how to disable that mode from within the driver, please let us know :) Thanks in advance! PS: I did find a link to a similar question, and it was answered, but let me remind you that I'm NOT using the Intel wireless version for my ThinkPad but the Realtek (RTL8192SvB).
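
    The driver's module parameters are the place to look for an N-mode switch, and they differ between the in-kernel and vendor drivers. A sketch of how to find and persist one (the ht_enable name below is purely hypothetical -- use whatever modinfo actually lists for your module):
      # list the parameters the loaded wireless module accepts
      modinfo rtl8192se | grep -i parm
      # if an HT/N switch exists, persist it, e.g. (hypothetical parameter name):
      # echo 'options rtl8192se ht_enable=0' | sudo tee /etc/modprobe.d/rtl8192se.conf
    After adding an option, reload the module (or reboot) for it to take effect.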

  • Sharing an external hard drive in Ubuntu using Samba

    - by cambraca
    /media/MYDISK is where my hard drive is mounted automatically. I created a symlink using:
      ln -s /media/MYDISK /home/camilo/MYDISK
      chmod 777 /home/camilo/MYDISK
    I'm setting up smb.conf like this:
      [myshare1]
          comment = external disk
          browsable = yes
          path = /home/camilo/MYDISK
          guest ok = yes
          read only = no
          create mask = 0775
    Also, in the [global] section I tried adding the following lines:
      follow symlinks = yes
      wide links = yes
      unix extensions = no
    The problem is that when browsing the shared folder in Windows 7, I get a "\\etc\myshare1 is not accessible" error. When pointing the path to a regular folder, it works fine. Also, when I point it directly to /media/MYDISK, it shows the same error. EDIT: To make it more interesting, I have no graphical interface, so I need to touch the config files directly.
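
    Since pointing the share straight at /media/MYDISK fails too, the symlink settings are probably not the culprit; the mount's own permissions are the next thing to check, and a config check plus a local connection test usually localizes the failure (a sketch; the share name comes from the question):
      # does Samba itself accept the configuration?
      testparm -s
      # can the share be opened locally as guest, taking Windows out of the picture?
      smbclient //localhost/myshare1 -N
      # what ownership and permissions does the automounted disk actually have?
      ls -ld /media/MYDISK
    If the disk is FAT or NTFS, it is typically mounted with fixed root ownership, and remounting it with uid/gid (or umask) options is the usual fix for Samba guest access.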

  • Renting a Linux server just to make backups of my personal data?

    - by Matthieu
    Hi all, I would like to be able to back up ALL my computers' data on a Linux server. For now I have a home server, but soon I will be travelling, without a home (so no home server). I was thinking of renting a dedicated Linux web server, but this is expensive, and I don't need a fast, "web-oriented" machine with a MySQL server and all that; I just need full SSH access (full control, so I can then install my own programs). Do "backup servers" exist? Or am I doing it wrong (maybe this is not a good solution)? Note: I run Mac OS, Windows and Linux, I back up through rsync, and I want full control over my backups, not an automated "magic" backup like MobileMe or anything like that. Edit: I need around 500 GB of storage.
