Search Results

Search found 14079 results on 564 pages for 'folding at home'.

Page 170/564

  • setting up a WGR614v7 behind a linux box

    - by commodore fancypants
    Here's the setup: I have an openSUSE box with 2 NICs; one goes to my home network router, the other has DHCP running and is attached to a wireless router. I'm trying to get this setup to work before I switch to the Linux box as my home network router. My DHCP will offer the wireless router (a WGR614v7) an address, but anything that connects to the wireless router ends up with an APIPA address. I have all the firewalls on the wireless network turned off, as well as the wireless router's own DHCP. The Linux box isn't offering addresses to anything past the wireless router. Is this a problem with the router or my DHCP setup? For testing purposes, I have both NICs set in the internal zone, and I've tried both wireless and wired connections to the WGR614v7, to no avail.
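
    Not an answer from the thread, but a sketch of what a working setup on the openSUSE side might look like, assuming ISC dhcpd and that the second NIC is eth1 on 192.168.2.0/24 (interface names and addresses are illustrative). One common gotcha worth checking first: when the WGR614v7 is used as a plain access point with its own DHCP off, the Linux box has to plug into one of its LAN ports, not the Internet/WAN port, or the clients' DHCP broadcasts never reach dhcpd.

      # /etc/dhcpd.conf -- minimal subnet declaration for the NIC facing the WGR614v7
      subnet 192.168.2.0 netmask 255.255.255.0 {
        range 192.168.2.100 192.168.2.200;
        option routers 192.168.2.1;            # the Linux box's eth1 address
        option domain-name-servers 192.168.2.1;
      }

      # /etc/sysconfig/dhcpd -- make sure dhcpd actually listens on that interface
      DHCPD_INTERFACE="eth1"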

    Read the article

  • "The daemon is being inhibited" error message when mounting volumes on a partitioned external HD [closed]

    - by Todd
    I'm having a great deal of difficulty with an external hard drive. I'm currently running a dual boot system (XP Service Pack 3 and Ubuntu 11.04 Natty Narwhal) on a Dell Inspiron B120. I'm trying to set up a new 80 GB Hitachi external HD. Using GParted, I formatted the drive and set up the partitions. The partitioning scheme is as follows: 10GB NTFS Primary, 2GB Linux-Swap Primary, 50GB FAT32 Primary, 12GB Unallocated. After applying those changes, I went into Disk Utility and the HD appears along with the correct partitions. When I try to mount the volumes for partitions 1 and 3, I get a pop-up stating: Error Mounting Volume - An error occurred while performing an operation on "Home" (Partition 3 of HTS548080m9AT00): The daemon is being inhibited. When I try to check the filesystem I get a pop-up stating: Error Checking filesystem on volume - An error occurred while performing an operation on "Home" (Partition 3 of HTS548080m9AT00): The daemon is being inhibited. Throughout the time that I'm attempting to troubleshoot the problem, the external drive light is on and blinking. With my frustration hitting a boiling point, I try to shut down the drive and remove it so that I can plug in a different external HD that works PERFECTLY. However, when I try to shut down and safely remove the drive, I get a pop-up stating: Error Detaching Drive - An error occurred while performing an operation on "80GB Hard Disk" (HTS548080m9AT00): The daemon is being inhibited. Can anyone tell me what I'm doing wrong? I'm a newbie and not that skilled with terminal commands, so please dumb it down for me if you request specific command output.
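
    The question stands as asked; as a hedged aside, this particular message usually means another disk tool (GParted, or a pending operation in Disk Utility itself) is still holding the udisks daemon "inhibited", so closing those tools and retrying is the first thing to try. Failing that, a manual mount from a terminal sidesteps the daemon entirely. A sketch, assuming the FAT32 partition shows up as /dev/sdb3 (the device name is a guess; check Disk Utility or dmesg for the real one):

      # close GParted and any pending Disk Utility operation first, then:
      sudo mkdir -p /mnt/external
      sudo mount /dev/sdb3 /mnt/external    # mount the FAT32 partition by hand
      # ...use the drive...
      sudo umount /mnt/external             # detach cleanly when finished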

    Read the article

  • GPO: Disable WiFi when computer is connected to LAN with wire

    - by r0ca
    Hi all! Does anyone know how to disable, through a GPO, the wireless connection as soon as the computer is connected to the LAN with a wire? For example, a user is at home with his laptop, connected to his home wifi. He leaves and comes to the office, but the wifi is still enabled. He plugs the ethernet cable into the computer and then both the wifi and wired connections are enabled. I would like to apply a GPO that does this... Thanks a bunch!

    Read the article

  • Analytics - Where do my drop-offs go?

    - by BadCash
    I have a website set up with Google Analytics (through the WordPress plugin "Google Analytics for WordPress" by Joost de Valk). When I check out the visitors flow in Google Analytics, it shows something like this: (home) - 43% drop-offs /page-2/ - 10% drop-offs ... etc ... I have also set up events for external links. The main "goal" of the website is to drive traffic to my Android app on Google Play, so I have a couple of different links to it, all set up as events. Everything seems to be working; my events show up when I go to Content - Events in Google Analytics. However, it seems to me that some percentage of the users reported as "drop-offs" have in fact clicked on one of the external links. But there's no info about the reason for those drop-offs in the Visitors Flow chart. I can of course check out each specific event category and event action and set "other" to Content/Page, which (I guess) shows the number of visitors who triggered a specific event on a specific page. It just seems like such a complicated way of going about this! So, is there a way to get a more detailed picture, including events, in the Visitors Flow chart? Something like: (home) - 43% drop-offs, Event Action: "Google Play"=50%, "Youtube"=10%, (not set)=40%

    Read the article

  • Is it a bad idea to make roaming profile share available offline?

    - by Bryan
    This is regarding a Windows 2008 R2 domain. The Documents, Desktop and Application Data folders are all redirected to the users' home directories (mapped as Z:). The users' home directories are configured to be available offline for mobile users. User profiles are configured as roaming and located on a separate share (not mapped as a network drive), just accessed via a UNC path. Would it be a good or bad idea to make the roaming profile share available offline for mobile users using the caching option "All files and programs that users open from the share will be automatically available offline"?

    Read the article

  • How to perform a fresh Linux install while preserving software RAID and user accounts

    - by slayton
    I have a system with two software RAID arrays. The OS is Ubuntu 9.04 and is no longer receiving updates. I'd like to update the system to 12.04 rather than trying to do the automatic upgrade from 9.04 -> 9.10 -> ... -> 12.04. My main drive has 2 partitions that are mounted at / and /home. Is it possible to do a fresh install of Linux to the partition where / is mounted while preserving user accounts and preferences (such as passwords, home dir locations, etc.)? Additionally, what do I need to do to keep my software RAID arrays intact following the OS re-install?
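
    Not from the thread, but the usual approach is to record the RAID layout before reinstalling, tell the 12.04 installer to reuse (not format) the existing /home partition, recreate the accounts with the same usernames/UIDs, and then re-assemble the arrays. A sketch, hedged in that device names and paths are illustrative:

      # before the reinstall: keep a copy of the current array definitions and fstab
      sudo mdadm --detail --scan > ~/mdadm-backup.conf
      cp /etc/fstab ~/fstab-backup

      # during the install: choose manual partitioning, format / only,
      # and mount the old /home partition at /home without formatting it

      # after the reinstall:
      sudo apt-get install mdadm               # if not already installed
      sudo mdadm --assemble --scan             # re-detect the existing arrays
      sudo mdadm --detail --scan | sudo tee -a /etc/mdadm/mdadm.conf
      sudo update-initramfs -u                 # so the arrays assemble at boot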

    Read the article

  • How to make ssh-agent automatically add the key on demand?

    - by Vi.
    I want to run ssh-agent (with a maximum lifetime option) and not add any keys at startup, but instead add them on demand. For example, the first time I log in to some server it should ask for the passphrase; the next time (unless I waited for more than an hour) it should connect cleanly:

      ssh server1
      Enter passphrase for key '/home/vi/.ssh/id_dsa':
      server1> ...
      ssh server2
      server2>      # no passphrase this time
      # wait for lifetime
      ssh server2
      Enter passphrase for key '/home/vi/.ssh/id_dsa':

    I don't want to have to remember to run 'ssh-add' manually each time (e.g. entering the passphrase just for ssh and then: "Oh, it didn't remember it, need to retype"). How do I configure ssh to automatically add the key to ssh-agent when the user provides the passphrase?
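
    As a hedged note: OpenSSH releases newer than the one this question was written against support exactly this through the AddKeysToAgent option in ~/.ssh/config, combined with a key lifetime on the agent itself. A minimal sketch:

      # start the agent with a one-hour key lifetime (e.g. from ~/.profile)
      eval "$(ssh-agent -t 1h)"

      # ~/.ssh/config -- ask for the passphrase on first use, then cache the key
      Host *
          AddKeysToAgent yes

    With that in place, the first ssh to a host prompts for the passphrase and hands the decrypted key to the agent; later connections reuse it until the agent's lifetime expires.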

    Read the article

  • bash-like features in sqlplus, rman and other Oracle command line tools

    - by Gilles Haro
    As far as I can remember, I have always been complaining about the lack of "recall last command" from within sqlplus. Such a basic thing, available in any bash shell or Windows cmd terminal, remains missing from Oracle's command-line tools. Thanks to davidw, who published a post on the French blog EASYTEAM, it is now possible to use a simple rpm package, rlwrap, to enhance sqlplus, dgmgrl, rman, ... and give them bash-style recall/completion capabilities. I installed it in a few minutes and I am already wondering how people can work without it. The steps are: Get the rpm file from sites like RPM PBone. As root, install the package: rpm -ivh rlwrap-0.37-1.el5.x86_64.rpm. As oracle, create a dictionary file for autocompletion, $HOME/.oracle_keywords; this file is made of a series of words to be used for autocompletion: put in it the list of dictionary tables, the list of SQL commands, the list of sqlplus commands, whatever you like, and use the <tab> key as you would in a bash shell. Create an alias for sqlplus: alias sqlplus='/usr/bin/rlwrap -if $HOME/.oracle_keywords $ORACLE_HOME/bin/sqlplus'. And enjoy it!!! Thank you DavidW. Gilles Haro, Technical Expert - Core Technology, Oracle Consulting
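
    A condensed sketch of those steps (the package filename and tool paths are as given in the post; adjust the version to whatever rpm you actually download, and seed the keyword file with whatever words you want completed):

      # as root: install the rlwrap package downloaded from RPM PBone (or similar)
      rpm -ivh rlwrap-0.37-1.el5.x86_64.rpm

      # as oracle: seed a keyword file for tab-completion
      echo 'SELECT INSERT UPDATE DELETE FROM WHERE DBA_TABLES DBA_USERS V$SESSION' > $HOME/.oracle_keywords

      # wrap the Oracle tools (e.g. in ~/.bash_profile)
      alias sqlplus='/usr/bin/rlwrap -if $HOME/.oracle_keywords $ORACLE_HOME/bin/sqlplus'
      alias rman='/usr/bin/rlwrap -if $HOME/.oracle_keywords $ORACLE_HOME/bin/rman'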

    Read the article

  • .desktop shortcuts aren't working for java applications in LXDE

    - by chaz
    I just installed Minecraft on my LXDE desktop/Lubuntu machine and I'm trying to create a .desktop file on the desktop that executes java -jar ~/minecraftlauncher.jar. The command works in bash scripts and the terminal, but refuses to work when I click on my .desktop shortcut, which is supposed to execute the same command. I've experimented with other jars and they don't seem to start either. Here is my xsession log:

      ** (pcmanfm:1572): DEBUG: launch command: <java -jar ~/Downloads/minecraft_server.jar>
      ** (pcmanfm:1572): DEBUG: sn_id = pcmanfm-1572-administrator-Dimension-3000-java-14_TIME14031891
      Unable to access jarfile ~/Downloads/minecraft_server.jar
      ** (pcmanfm:1572): DEBUG: launch command: <java -jar ~/minecraftlauncher.jar>
      ** (pcmanfm:1572): DEBUG: sn_id = pcmanfm-1572-administrator-Dimension-3000-java-15_TIME14070158
      Unable to access jarfile ~/minecraftlauncher.jar

    UPDATE: Whoops, it seems to work when I give an absolute path. I guess the home path is something else. UPDATE: I guess X doesn't resolve the home specifier. I ran a .desktop file that executed a script that outputs the current directory, and it seems to be correct.
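
    For reference, a hedged sketch of a launcher that avoids this: the Exec line of a .desktop file is not run through a shell, so ~ is never expanded; either hard-code the absolute path or wrap the command in sh -c so $HOME gets expanded (the username below is a placeholder):

      [Desktop Entry]
      Type=Application
      Name=Minecraft
      Terminal=false
      # either use an absolute path...
      Exec=java -jar /home/youruser/minecraftlauncher.jar
      # ...or let a shell expand $HOME instead of the line above:
      # Exec=sh -c "java -jar $HOME/minecraftlauncher.jar"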

    Read the article

  • Network connection fails when downloading data

    - by Guus
    In the past few weeks, I've noticed my network connection becoming unstable while downloading files. I'm not sure how to diagnose this issue. I've found that my network connections appear to be temporarily unresponsive (for periods up to a minute or so) while I'm downloading a file (one that's large enough to not be downloaded instantly). The problem occurs when downloading data through a web browser, but also when using SCP to download data from a remote location. During the period in which network connections are unresponsive, every resource that I try to access over the network is unavailable. This includes:

      - The download itself (SCP reports a 'stalled' download)
      - Web pages (won't load - browsers report 'resource unavailable')
      - SSH sessions (CLI freezes)
      - VPN connections (connections terminate)
      - IM client connections (client starts reconnection attempts)
      - ... (everything is pretty much dead)

    I've noticed that during such a period, I cannot even access the (web-based) administrative console of the router on my LAN at home (although it remains reachable for other devices). The problem occurs when connected to my home network, but also when I'm in the office. Other devices than my laptop are not affected. Given those two characteristics, I assume the cause of the problem lies within my laptop, not the network infrastructure. I'm running Ubuntu 11.10. Apart from applying automatic updates from Ubuntu, I can't think of a change that I applied to the OS that could have started the problems. I'm absolutely positive that this issue did not occur up to a few weeks ago (as it's very noticeable, and only started to annoy me in the last few days). When the problem occurs, applications that make use of a network connection fail visibly (I get popups telling me that a VPN connection is broken, for instance). The network manager does not report any issues related to my wifi connection, though. Help?
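
    The question stands; as a hedged diagnostic sketch, running something like the following in other terminals while a download stalls at least narrows it down (a dead link shows up as lost pings to the local router, and wireless driver resets or power-management events tend to show up in dmesg; addresses and interface names are examples):

      # terminal 1: keep pinging the local router while the download runs
      ping 192.168.1.1

      # terminal 2: when the stall happens, look for driver resets / disconnects
      dmesg | tail -n 50
      iwconfig 2>/dev/null | grep -i "power management"

      # wifi power management is a common culprit on laptops; to test, try:
      # sudo iwconfig wlan0 power off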

    Read the article

  • Renaming Outlook 2007 categories not working

    - by Bob Rivers
    Hi, I'm new to Outlook 2007 and I'm trying to rename a category. According to MS, I can achieve it by just renaming it: http://office.microsoft.com/en-us/outlook/HA012316361033.aspx But as soon as I click the OK button, confirming the change, it returns to its original value. For instance, if I rename the "Blue Category" to "Home", it will accept it without errors. But when I click again on the Categorize icon, I'll see that the "Blue Category" is still there, and my "Home" category does not exist. Has anyone experienced this kind of problem? TIA, Bob

    Read the article

  • Configure Ubuntu 12.10 to share internet through a second NIC with a subnetwork

    - by Dan Smith
    I have an Ubuntu 12.10 box with 2 NICs. eth0 is connected to my 192.168.1.x network, which is connected to my home router and out to the internet. I want the other interface, eth1, to support 10.0.0.x and allow devices on that network to access the internet through my Ubuntu box. I do not need DHCP on the 10.0.0.x network. Here's a schematic: Internet --- home router --- ubuntu[eth0:192.168.1.x, eth1:10.0.0.x] --- [10.0.0.x device] How do I configure the Ubuntu box to share the internet with devices on that subnetwork? Thanks!
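
    Not from the thread, but the usual recipe is IP forwarding plus NAT on the Ubuntu box. A minimal sketch, assuming eth0 faces the home router and 10.0.0.1 is given to eth1 (addresses are examples; these commands don't persist across reboots unless added to the interfaces/sysctl/iptables-persistent configs):

      # give eth1 a static address on the 10.0.0.x network
      sudo ip addr add 10.0.0.1/24 dev eth1
      sudo ip link set eth1 up

      # enable forwarding (uncomment net.ipv4.ip_forward=1 in /etc/sysctl.conf to persist)
      sudo sysctl -w net.ipv4.ip_forward=1

      # NAT traffic from 10.0.0.x out through eth0
      sudo iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE
      sudo iptables -A FORWARD -i eth1 -o eth0 -j ACCEPT
      sudo iptables -A FORWARD -i eth0 -o eth1 -m state --state RELATED,ESTABLISHED -j ACCEPT

    Since there's no DHCP on that side, each 10.0.0.x device then needs a static address, 10.0.0.1 as its gateway, and some DNS server (the home router's address works) configured by hand.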

    Read the article

  • GNOME session not starting after filesystem corruption

    - by user3215
    I'm running Ubuntu 9.10 desktop edition. Suddenly today /home became corrupted and I was prompted to run fsck manually. I ran fsck -y /home and rebooted the system. The system booted, but I got no GUI (GNOME session), just a black screen with a login prompt instead. Any tricks to start my system normally? Any help is greatly appreciated. EDIT 1: The errors were similar to the following (maybe with some mistakes, as I had to type them by hand):

      machine1 login: root
      password:
      at login Sun Jan 16 15:30:46 IST 2011 on tty1
      EXT3-fs error (device sda1): ext3_lookup: deleted inode referenced
      aborting journal on device sda1
      Remounting filesystem read-only
      root@machine1:~# startx
      mktemp: failed to create file via template `/tmp/serverauth.xxxxxxxxxxx': Read-only file system
      /usr/bin/startx: line 157: cannot create temp file for here-document: Read-only file system
      xauth: error in locking authority file /root/.Xauthority
      /usr/bin/startx: line 173: cannot create temp file for here-document: Read-only file system
      xauth: error in locking authority file /root/.Xauthority
      /usr/bin/startx: line 173: cannot create temp file for here-document: Read-only file system
      X: cannot stat /tmp/.X11-unix (No such file or directory), aborting
      giving up.
      xinit: No such file or directory (errno 2): unable to connect to X server
      xinit: No such process (errno 3): Server error
      xauth: error in locking authority file /root/.Xauthority
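
    As a hedged note: the transcript shows the root filesystem itself (sda1) hitting an ext3 error and being remounted read-only, not just /home, which is why X cannot create its temp and auth files. The usual recovery sketch is to fsck the root partition from somewhere it isn't mounted read-write, e.g. recovery mode or a live CD (device name taken from the error above):

      # from a live CD or a recovery-mode root shell, with / not mounted read-write:
      sudo fsck -y /dev/sda1

      # if you only need temporary write access on the broken boot itself:
      sudo mount -o remount,rw /

      # then reboot and let the system come up normally
      sudo reboot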

    Read the article

  • How to block users in Apache httpd from accessing *.php files inside a directory, so that they access it via the directory name instead

    - by Oxi
    My requirement looks simple, but Googling did not help me yet. I want to throw a 404 page at a user (not redirect to another folder or file) who is trying to access *.php files on my website. For example: when a client asks for www.example.com/home/ I want to show the content, but when the user says www.example.com/home/index.php I want to show a 404 page. I tried different methods and nothing worked for me; one of the attempts is shown below:

      <Directory "C:/xampp/htdocs/*">
        <FilesMatch "^\.php">
          Order Deny,Allow
          Deny from all
          ErrorDocument 403 /test/404/
          ErrorDocument 404 /test/404/
        </FilesMatch>
      </Directory>

    Thanks in Advance
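
    Keeping the attempt above as posted, a hedged sketch of a variant that should behave as described: the FilesMatch pattern ^\.php can never match a name like index.php (filenames don't start with ".php"), and Deny produces a 403 rather than a 404 anyway. Matching on the client's original request line with mod_rewrite returns a real 404 for direct *.php requests while still letting DirectoryIndex serve index.php for /home/ internally:

      # inside the relevant <Directory> block in httpd.conf, or in a .htaccess file
      RewriteEngine On
      # 404 only when the client's request line itself names a .php file;
      # the internal DirectoryIndex subrequest for /home/index.php is unaffected
      RewriteCond %{THE_REQUEST} \.php[\s?] [NC]
      RewriteRule ^ - [R=404,L]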

    Read the article

  • Switch monitor configurations on Windows 7

    - by Horst Walter
    I use one of my PCs for flight simulation as well as for home theater. In case 1 it has 2 monitors attached, and in case 2 (home theater) an HD TV is used. All 3 displays are attached to the graphics card at the same time. How can I best switch between the different configurations? In case 1 I'd like the configuration with monitors 1 and 2; alternatively, I'd like to quickly switch to another config with only the HD TV as the primary screen. A similar question was asked 6 months back with no full solution yet, so I'm bringing it up again. Darius's comment there (Windows + P key) is the best so far.

    Read the article

  • All files on automounted NTFS partition are marked as executable

    - by MHC
    I have set up an NTFS partition to automount via fstab:

      # /etc/fstab: static file system information.
      #
      # Use 'blkid' to print the universally unique identifier for a
      # device; this may be used with UUID= as a more robust way to name devices
      # that works even if disks are added and removed. See fstab(5).
      #
      # <file system> <mount point> <type> <options> <dump> <pass>
      proc /proc proc nodev,noexec,nosuid 0 0
      # / was on /dev/sda7 during installation
      UUID=e63fa8a2-432f-4749-b9db-dab328807d04 / ext4 errors=remount-ro 0 1
      # /boot was on /dev/sda4 during installation
      UUID=e9ad1bb4-7c1f-4ea9-a6a5-799dfad71c0a /boot ext4 defaults 0 2
      # /home was on /dev/sda8 during installation
      UUID=eda8c755-5448-4de8-b58c-9cb75823c22d /home ext4 defaults 0 2
      # swap was on /dev/sda9 during installation
      UUID=804ff3a7-e5dd-406a-b63c-e8f3c635fbc5 none swap sw 0 0
      #Windows-Partition
      UUID=368CEBC57807FDCD /media/Share ntfs defaults,uid=1000,gid=1000,noexec 0 0

    As you can see, I have added the noexec option to the configuration. Why? Because any file I create on or move to the partition is automatically marked as executable. The problem is that there is no way of changing that through Nautilus. I cannot uncheck the "Allow executing file as program" option. The noexec option doesn't help, unfortunately. It only prevents Nautilus from displaying the "run" or "read" dialog but doesn't change the executable flag. Is there any way I can fix this?
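
    As a hedged aside: NTFS has no per-file Unix permission bits, so the execute bit is whatever the mount options dictate for every file at once; it can't be toggled per file from Nautilus. The usual fix is fmask/dmask on the mount rather than noexec. A sketch of the fstab line (same UUID and ownership as above):

      # files become 644 (not executable), directories 755 (still traversable)
      UUID=368CEBC57807FDCD /media/Share ntfs defaults,uid=1000,gid=1000,dmask=022,fmask=133 0 0

    Remounting with sudo mount -o remount /media/Share (or rebooting) picks the new options up.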

    Read the article

  • Set custom mount point and mount options for USB stick

    - by kayahr
    Hello, I have a USB stick which contains private stuff like my SSH key. I want to mount this stick in my own home directory with 0700 permissions. Currently I do this with this line in /etc/fstab: LABEL=KAYSTICK /home/k/.kaystick auto rw,user,noauto,umask=077,fmask=177 0 0 This works great, but there is one minor problem: in Nautilus (the GNOME file manager) the mount point ".kaystick" is displayed. I guess Nautilus simply scans the /etc/fstab file and displays everything it finds there. This mount point is pretty useless because it can't be clicked when the device is not present, and it can't be clicked when the device is present either (because then it is already mounted). I know this is a really minor problem and I could simply ignore it, but I'm a perfectionist, so I want to get rid of this useless mount point in Nautilus. Is there another way to customize the mount point and mount options for a specific USB device? Maybe it can be configured in udev? If yes, how?
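
    The question stands; as a hedged aside, later GVFS/udisks releases understand an x-gvfs-hide mount option, which keeps a perfectly valid fstab entry out of the file manager's sidebar (whether the release in use here is new enough for it is another matter):

      # /etc/fstab -- same mount, but ask GVFS not to show it in the file manager
      LABEL=KAYSTICK /home/k/.kaystick auto rw,user,noauto,umask=077,fmask=177,x-gvfs-hide 0 0

      # on older udisks1 systems a udev rule is the rough equivalent, e.g. in
      # /etc/udev/rules.d/99-hide-kaystick.rules:
      #   ENV{ID_FS_LABEL}=="KAYSTICK", ENV{UDISKS_PRESENTATION_HIDE}="1"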

    Read the article

  • User accounts in FTP

    - by Brad
    I have an FTP server(proftpd on debian) that I'm going to allow a couple friends access to, and I want some safety nets in place, just in case. These are some of the things I'd like to do: Jail the accounts to their home directories and impose a cap on the amount of data they can upload Allow them access to a shared folder(via symlink or something) where they have full access(Also with a storage cap, but larger) Allow my own account full access to the system(Using groups I guess) Not allow anonymous access, or allow it with its own folder, separate from the shared user folder Currently, I've got the accounts set up and jailed, but it seems like the symlink that I put in is not allowing them to visit the shared folder. I suppose this has to do with them not having read permissions anywhere but their own home directories, or maybe it's something else, I'll continue to look into it and provide any information that is requested. Is what I'm trying to do possible? Any tips or resources that you can share are appreciated. Thanks.
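
    On the shared-folder point, a hedged aside: once DefaultRoot jails a user, a symlink pointing outside the jail can't be followed, so the usual trick is a bind mount of the shared directory into each jailed home (the group name and paths below are made up). Upload caps are normally handled by proftpd's mod_quotatab, which needs its own configuration not shown here.

      # /etc/proftpd/proftpd.conf -- jail everyone in group "ftpusers" to their home
      DefaultRoot ~ ftpusers

      # symlinks can't escape the jail, so bind-mount the shared folder instead
      # (repeat per user, or add equivalent entries to /etc/fstab to persist)
      sudo mkdir -p /home/friend1/shared
      sudo mount --bind /srv/ftp-shared /home/friend1/shared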

    Read the article

  • Error encountered compiling kernel 2.6.35-25.44

    - by Matt
    I downloaded the linux-source-2.6.35 package and tried to compile it using the command "fakeroot make-kpkg --append-to-version=.dbg kernel_image kernel_source kernel_headers --initrd" after "make menuconfig". The image .deb file is produced and installs fine, but an error stops the build process while trying to make the source package:

      scripts/Makefile.clean:17: /home/ade/linux-source-2.6.35/debian/linux-source-2.6.35.10.dbg/usr/src/linux-source-2.6.35.10.dbg/crypto/Makefile: No such file or directory
      make[1]: *** No rule to make target `/home/ade/linux-source-2.6.35/debian/linux-source-2.6.35.10.dbg/usr/src/linux-source-2.6.35.10.dbg/crypto/Makefile'. Stop.
      make: *** [_clean_crypto] Error 2

    Sure enough, the folder linux-source-2.6.35/debian/linux-source-2.6.35.10.dbg/usr/src/linux-source-2.6.35.10.dbg/crypto does not exist (although all of the other kernel source folders appear to be there). I haven't even been able to determine where the folder is supposed to be copied over, or what's supposed to invoke clean. Am I doing something wrong here? It should be noted that I am running 10.04.

    Read the article

  • Connect to NFS on availability

    - by berkes
    What would be a good way to automatically mount an NFS share when it gets/is available? I have the following: a media server at home, running Ubuntu 10.10 with GUI *), and a laptop, often at home, often on the road or at clients, also running Ubuntu 10.10 with GUI. What I'd like is my laptop connecting to the NFS share (or any other mountable networked filesystem) so that Banshee sees all the music, new podcast entries (and video) from that media server. I already have Firefly (mt-daapd) running, which works, but is flaky on both the server side and the client side. Its biggest downside, though, is that I cannot easily fix metadata on files on the media server this way; DAAP is read-only by design. I can mount the NFS share manually through sudo mount /media/nfsmultimedia/. I am not looking for a manual or howto on setting up an NFS client and server, merely a way to have this work more transparently. Obviously I'd like the NFS share to be unmounted if the network is no longer available (i.e. when I open my laptop lid at a client's office). It may be that NFS is not suited for this; in that case, I'd love to hear other options. :) *) Actually, I also have a fileserver, a backup server and a webserver to which I'd like to connect in a somewhat similar way. Right now I connect to these over SSH, using gvfs.
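
    Not from the thread, but autofs is the usual answer to "mount it when it's reachable, let it go when it isn't": the share is mounted on first access and unmounted again after an idle timeout, so a laptop that wanders off the home network simply stops touching it. A minimal sketch, with the server name and export path made up:

      # /etc/auto.master -- manage mounts under /media/autofs, expire after 60s idle
      /media/autofs  /etc/auto.media  --timeout=60

      # /etc/auto.media -- one line per share ("mediaserver" and the export are examples)
      multimedia  -fstype=nfs,soft,intr  mediaserver:/export/multimedia

    After sudo service autofs restart, pointing Banshee at /media/autofs/multimedia triggers the mount whenever the server is actually reachable.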

    Read the article

  • rsync - How to exclude one .htaccess but not all of them

    - by Cory Gagliardi
    I have an rsync command for copying my files from dev to production. I don't want to copy the .htaccess file that's in the root of the HTML directory, but I do want to copy the few .htaccess files that are in its subdirectories. I'm using the argument --exclude .htaccess, which is stopping all of the .htaccess files from getting copied. The other arguments I'm including are -a --recursive --times --perms. Is it possible to configure rsync to do this? Edit: Here is my full command:

      rsync -a --recursive --times --perms \
          --exclude prop_images --exclude tracking --exclude vtours \
          --exclude .htaccess --exclude .htaccess_backup --exclude "*~" \
          /home/user/dev_html/* /home/user/public_html/
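
    A hedged sketch of the usual fix: an exclude pattern that begins with a slash is anchored to the top of the transfer, so /.htaccess matches only the root copy while .htaccess files in subdirectories still transfer. Using a trailing slash on the source (instead of /*) keeps that anchoring predictable, and -a already implies --recursive --times --perms:

      rsync -a \
          --exclude prop_images --exclude tracking --exclude vtours \
          --exclude /.htaccess --exclude .htaccess_backup --exclude "*~" \
          /home/user/dev_html/ /home/user/public_html/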

    Read the article

  • Trying to compile the newest Apache from source with the newest OpenSSL

    - by AlexMA
    I need to install Apache 2.4.10 built against OpenSSL 1.0.1i. I compiled OpenSSL from source with:

      $ ./config \
          --prefix=/opt/openssl-1.0.1i \
          --openssldir=/opt/openssl-1.0.1i
      $ make
      $ sudo make install

    and Apache with:

      ./configure --prefix=/etc/apache2 \
          --enable-access_compat=shared --enable-actions=shared --enable-alias=shared \
          --enable-allowmethods=shared --enable-auth_basic=shared --enable-authn_core=shared \
          --enable-authn_file=shared --enable-authz_core=shared --enable-authz_groupfile=shared \
          --enable-authz_host=shared --enable-authz_user=shared --enable-autoindex=shared \
          --enable-dir=shared --enable-env=shared --enable-headers=shared \
          --enable-include=shared --enable-log_config=shared --enable-mime=shared \
          --enable-negotiation=shared --enable-proxy=shared --enable-proxy_http=shared \
          --enable-rewrite=shared --enable-setenvif=shared --enable-ssl=shared \
          --enable-unixd=shared --enable-ssl \
          --with-ssl=/opt/openssl-1.0.1i \
          --enable-ssl-staticlib-deps \
          --enable-mods-static=ssl
      make

    (I would run sudo make install next, but I get an error.) I'm essentially following the guide here, except with slightly newer versions. My problem is that I get a linker error when I run make for Apache:

      Making all in support
      make[1]: Entering directory `/home/developer/downloads/httpd-2.4.10/support'
      make[2]: Entering directory `/home/developer/downloads/httpd-2.4.10/support'
      /usr/share/apr-1.0/build/libtool --silent --mode=link x86_64-linux-gnu-gcc -std=gnu99 -pthread -L/opt/openssl-1.0.1i/lib -lssl -lcrypto \
          -o ab ab.lo /usr/lib/x86_64-linux-gnu/libaprutil-1.la /usr/lib/x86_64-linux-gnu/libapr-1.la -lm
      /usr/bin/ld: /opt/openssl-1.0.1i/lib/libcrypto.a(dso_dlfcn.o): undefined reference to symbol 'dlclose@@GLIBC_2.2.5'

    I tried the answer here, but no luck. I would prefer to just use aptitude, but unfortunately the versions I need aren't available yet. If anyone knows how to fix the linker problem (or what I think is a linker problem), or knows of a better way to tell Apache to use a newer OpenSSL, it would be greatly appreciated; I've got apache 1.0.1i working otherwise.
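
    For what it's worth, that undefined dlclose reference usually just means the static libcrypto needs libdl on the link line. A hedged sketch is to re-run Apache's configure with LIBS set (everything else as above) and rebuild; configuring OpenSSL with its "shared" option instead of static libraries is the other common way around it:

      # same configure invocation as before, with libdl (and pthread) added so the
      # static libcrypto's dlopen/dlclose references resolve
      LIBS="-ldl -lpthread" ./configure --prefix=/etc/apache2 \
          --with-ssl=/opt/openssl-1.0.1i \
          --enable-ssl --enable-ssl-staticlib-deps --enable-mods-static=ssl
          # ...plus the remaining --enable-*=shared flags from the original command
      make
      sudo make install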

    Read the article

  • MediaTemple Django Bad Gateway

    - by Eeyore
    I have a site running on a GS server at MediaTemple. It's a Django/PostgreSQL setup. For some reason, from time to time I get a Bad Gateway error and I can't figure out what's causing it. What can cause this error? What else can I do to find the cause of the problem?

      url.access-deny = ( "~", ".inc" )
      fastcgi.server = (
          "/main.fcgi" => (
              "main" => (
                  "socket" => "/var/tmp/" + appname + ".sock",  # don't change this
                  "check-local" => "disable",
              )
          )
      )
      alias.url = (
          "/media/" => "/home/xxx/data/python/django/django/contrib/admin/media/",
          "/static/" => "/home/xxx/containers/django/site/static/",
      )
      url.rewrite-once = (
          "^(/media.*)$" => "$1",
          "^(/static.*)$" => "$1",
          "^/favicon\.ico$" => "/media/favicon.ico",
          "^(/.*)$" => "/main.fcgi$1",
      )
      server.error-handler-404 = "/main.fcgi"
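
    The question is open-ended, but as a hedged note: with lighttpd in front of a FastCGI backend, a Bad Gateway usually means the main.fcgi Django process has died or stopped answering on its socket (on (mt) GS containers, hitting the memory limit is a common way for that to happen). Some first checks, with paths that are guesses based on the config above and a typical GS layout:

      # is the Django FastCGI process still alive?
      ps aux | grep -i fcgi

      # look at the error logs around the time of the 502 (locations vary per setup)
      tail -n 100 ~/logs/error.log
      tail -n 100 ~/containers/django/site/django.log

      # a stale socket left behind by a crash also produces Bad Gateway; the socket
      # path comes from the fastcgi.server block above ("appname" is whatever yours is)
      ls -l /var/tmp/*.sock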

    Read the article

  • HDMI Audio drops out when display enters powersave

    - by Jared Tritsch
    I have a Windows 8 machine with an AMD APU attached to my home theater system through HDMI (the HDMI routes through a home theater amp, then into the TV). Here's my problem: whenever the display is interrupted, usually by the TV being turned off or going into power save mode, the audio device is listed as "Disconnected" in Windows audio devices, and I can't get it to re-recognize that the HDMI audio is, in fact, plugged in. The only solution I have found so far is to restart the machine, which will then recognize the device without any problems, until the next time the TV turns off and the problem resurfaces again. Has anyone else seen this phenomenon? I have no idea if it's the GPU, the HDMI interface, the amp, or even the TV itself, as there really isn't much of a way to tell...

    Read the article

  • Need to detect the same application open on another computer on the network. Any software around that can do this?

    - by Joe Schmoe
    I have a time management application that I use at home quite a lot and have running most of the time. At home, I have a desktop PC and a couple of laptops scattered around the house...all networked together. Unfortunately, the application I use is not multi-user and I risk losing/corrupting data if it has been left running on one computer inadvertently while I start using it on another one in another part of the house. I use Live Mesh to automatically keep the application's database synced across the different computers and I just need some way of making sure that I don't start using the application on another computer before closing it down on the previous one. Anyone know of any Windows software that can detect if an application is running simultaneously on different computers on my network, and warn me if I am about to have two open at the same time?

    Read the article
