Search Results

Search found 24624 results on 985 pages for 'linux rrt'.


  • Ubuntu/Linux version recommendation for HP dv6 3122TX?

    - by sanjayav
    I purchased an HP dv6 3122TX recently, and after installing Ubuntu 10.10 64-bit I ran into multiple issues: the wireless driver is not supported by Ubuntu (the chipset is a RaLink RT3090); the Ethernet stopped working sometimes for no apparent reason (the chipset is a Realtek RTL8111/8168B); and a "Corrupted low memory at ..." issue, which is described as a kernel bug in the Ubuntu support forums (it started dropping me to a terminal instead of the GUI, and the X server couldn't start after that). As I'm not an expert Ubuntu user, I got fed up with all these issues and went back to Windows 7. But I need an Ubuntu installation up and running for my development work. What are your suggestions for a reasonable Ubuntu version I should try, or a different Linux variant? Should I stick to a 32-bit version? It'd be great if anyone could give some advice on this.

    Read the article

  • Linux: Automatically switch to external monitor (VGA)

    - by peoro
    I've got an eeePC with a really tiny monitor, so wherever I go (home, faculty, parents' home, friends' homes, ...) I attach it to any external monitor I can find. If it matters, my system is: Arch Linux, Linux 2.6.36, Xorg 7.6, X server 1.9.2, Intel Corporation Mobile 945GME Express Integrated Graphics Controller (fully accelerated by the intel modules). When I boot up the system, it uses the integrated monitor (LVDS1) only, and I have to manually switch to the external monitor (VGA1) using xrandr. Is it possible to configure my Xorg (or whatever) so that it uses the VGA1 output if present?
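    As a reference point, a minimal sketch of the xrandr logic involved (the output names LVDS1/VGA1 are taken from the question; hooking it into a udev rule or the session startup is left as an assumption):

        #!/bin/sh
        # Switch to the external output when xrandr reports it as connected.
        if xrandr --query | grep -q "^VGA1 connected"; then
            xrandr --output VGA1 --auto --output LVDS1 --off
        else
            xrandr --output LVDS1 --auto
        fi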

    Read the article

  • configuring linux server to send traffic to local machines using local IP address

    - by gkdsp
    Two linux servers, server1 and server2, are on the same local network (they also have access to an external network). Server2 has a local IP of 192.168.0.2 and a host name of host2.mydomain.com. QUESTION 1: If an application on server1 sends traffic to server2 using a host name of host2.mydomain.com, what determines whether this traffic is routed to server2 using the local or external network? QUESTION 2: To ensure that all traffic sent from server1 to server2 always uses the local network, could I simply include in the server1 /etc/hosts file the following? 192.168.0.2 host2.mydomain.com ...the thinking being, if the servers are always on the same network there should never be a need for server2 to send traffic to server1 via the external network (that I can think of anyway). Is this done in practice, or is some other method preferred?
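    For reference, a quick way to check on server1 (a sketch assuming the usual glibc and iproute2 tools) which address the name resolves to and which interface the kernel will use to reach it:

        getent hosts host2.mydomain.com   # what the resolver returns after editing /etc/hosts
        ip route get 192.168.0.2          # the interface and source address chosen for that destination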

    Read the article

  • Account not getting completely deleted within Linux

    - by lbanz
    I've got a NAS box running some flavour of Linux (2.6.31.8.nv+v2) with an ARM processor. It has a Samba share called 'all' that has full read/write access for everyone. However, one Windows machine cannot access it without prompting for authentication, and I found out from the logs that the Windows account matches a local account on the NAS box. So I went and deleted the local account on the NAS. I can see that the account no longer exists in /home, /etc/passwd or /etc/shadow. However, the Samba logs show that Samba thinks it is still there, as it says the account is disabled. I've tried rebooting both the NAS and the Windows box. Is there somewhere else that it stores account information? I logged on with a different account on that Windows machine and I can access the share fine; the smb logs show that it can't find the user and then allows anonymous access.
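    Samba typically keeps its own account database separate from /etc/passwd, which would explain the stale entry; a hedged check, assuming the standard Samba tools are present on the NAS:

        pdbedit -L                  # list the accounts Samba itself still knows about
        pdbedit -x -u <username>    # remove the stale Samba account (smbpasswd -x <username> also works)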

    Read the article

  • Ethernet port sleeping on PS3 running linux

    - by Doug
    My lab has a PS3 running Ubuntu Linux 9.04 Server Edition. After a period of a few hours with no use, the Ethernet connection (eth0) seems to go to sleep, causing the connection to be lost. Pinging or trying to SSH into the machine results in no response. The fix I've been using is to access the machine locally and restart it (trying to bring eth0 down then up doesn't seem to correct it). I've tried setting up an hourly cron job that runs on the PS3 and pings another machine just to create network activity, but this doesn't seem to solve the problem either. Update: The solution was to run the above cron job much more frequently: every 10 minutes works.
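    For reference, a crontab entry along these lines implements the ten-minute keep-alive described in the update (a sketch; the target host 192.168.0.1 is a placeholder):

        # /etc/crontab
        */10 * * * * root ping -c 1 192.168.0.1 > /dev/null 2>&1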

    Read the article

  • Using Arch Linux computer as a server for Rack Apps

    - by wxl
    What would be the best way to go about using an Arch Linux computer as a Rack (as in Ruby Rack, not an actual rack server) server? Here's what I want to be able to do: Automatically deploy on a git push to the server. (I already have this worked out, on post-receive the server checks out the app to /home/git/app from /home/git/app.git.) Run a Rack server application to serve up this app, one that can be restarted on demand. Run a MongoDB server Be able to access the app by going to my-server.local/app or something similar. (It's really only going to be used on the local network, no port forwarding or outside use) Any ideas would be greatly appreciated. I apologize if this seems too "do it for me".
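    A minimal sketch of the serving side (it assumes Ruby plus the rack and puma gems are available; the port and paths are examples, not a recommendation):

        cd /home/git/app
        gem install rack puma    # or the pacman/AUR equivalents
        rackup -p 9292 -D        # serves config.ru in the background so it can be restarted on demand

    A small nginx or Apache virtual host proxying my-server.local/app to that port, plus MongoDB started from its own service or init script, would cover the remaining points.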

    Read the article

  • CPU and HD degradation on source-based Linux distributions

    - by danilo2
    I have been wondering for a long time whether source-based Linux distributions, like Gentoo or Funtoo, "destroy" your system faster than binary ones (like Fedora or Debian). I'm talking about CPU and hard drive degradation. Of course, when you're updating your system it has to compile everything from source, so updates take longer and the CPU is worked harder (it runs hotter and under heavier load). Such systems compile hundreds of packages weekly, so does it really matter? Does such a system degrade faster than binary-based ones?

    Read the article

  • [linux] preventing access in shared hosting

    - by jack
    Hi Linux admins, I set up a small shared hosting server that contains some sites. For each site there is a user; for example, for abcd.com I created an abcd.com user and put its htdocs there for web hosting. I have no idea how to prevent the abcd.com user from accessing xyzd.com's data. I have chmoded the directories so that "others" get no permissions (0), but then Apache is denied access when I view the site in a browser. How can I secure access between the sites? Thanks.
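    One common arrangement is to keep each site's files owned by the site user but set the group to the web server's group, then strip "others" entirely (a sketch; the user, group and path names are examples for a Debian-style Apache):

        chown -R abcd:www-data /home/abcd/htdocs
        chmod -R o-rwx /home/abcd/htdocs    # no access for other site users
        chmod -R g+rX  /home/abcd/htdocs    # web server group can read files and traverse directories

    Note that PHP or CGI running as the shared Apache user can still read other sites this way; closing that hole usually means something like suexec or per-user PHP-FPM pools.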

    Read the article

  • recommendations for a lightweight linux distribution for a test server

    - by Jack
    I'm planning on setting up a test server to experiment with some application servers (Tomcat/JBoss/...) and with some portals. The machine I've set aside for this is lightweight CPU/GPU-wise (Atom D510, 4 GB RAM, 500 GiB HDD, onboard GPU), but it should suffice for most things; I'm more interested in the stability of JBoss/Tomcat for my purposes than in raw performance. However, I'm having a bit of trouble picking an appropriate distribution size-wise, performance-wise, setup-time-wise and security-wise, since it seems I can't sneeze without another distribution popping up. I've been thinking about going for Fedora, since I've read that that distribution has been optimized for the Atom, but I'm not really familiar with it. My experience with Linux has mostly been limited to Ubuntu and some tinkering with Puppy Linux. I'm not afraid to get my hands dirty on the command line. I'm not trying to start a discussion per se; I'm mostly after the pros/cons people have encountered with particular distributions.

    Read the article

  • How to remove large number of files/folders in linux

    - by user1745713
    We are using Hadoop to split a table into smaller files to feed to Mahout, but in the process we created a huge amount of _temporary logs. We have an NFS mount for the Hadoop volume, so we can use all the Linux commands to delete folders and files, but we just can't get them deleted. Here's what I've tried so far:

        hadoop fs -rmr /.../_temporary        # hangs for hours and does nothing
        rm -rf /.../_temporary                # on the NFS mount: hangs for hours and does nothing
        find . -name '*.*' -type f -delete    # same as above

    The folders look like this (38 of these folders inside _temporary):

        drwxr-xr-x 319324 user user 319322 Oct 24 12:12 _attempt_201310221525_0404_r_000000_0

    The contents of these are actually folders, not files; each one of those 319322 folders has exactly one file inside. Not sure why they do the logging this way. Any help is appreciated.
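    One workaround that often copes better with very large directory trees than rm or find is syncing against an empty directory (a sketch; the target path is a placeholder, and running it directly on the file server rather than over the NFS mount will be far faster than any client-side method):

        mkdir /tmp/empty
        rsync -a --delete /tmp/empty/ /path/to/_temporary/   # removes everything below the target
        rmdir /path/to/_temporary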

    Read the article

  • Bandwidth-Hogging Linux Server Causing Trouble

    - by BlairHippo
    We have a Linux server (2.6.28-11-generic #42-Ubuntu) that's misbehaving on a client site, gobbling up an entirely unacceptable percentage of the client's bandwidth, and we're trying to figure out what the heck it's doing. And the guy who had the sysadmin skillset has yet to be replaced. We're at a loss for what could be causing all that network traffic, and need to figure it out SOON. What log files should I be looking at to find this information? What analysis tools would you recommend for this task? Please note that I'm not looking for a tool that will allow me to analyze FUTURE traffic. The client is on the verge of shutting the machine off entirely; I need to figure out what it's been doing with the data I already have, if that's at all possible. My thanks in advance for helping a development monkey play sysadmin.

    Read the article

  • Linux NetSec/IDS Bridge

    - by Blackninja543
    What I am looking to build is a Linux system that acts as a bridge: it simply forwards any data received on one interface over to the other. It does not attempt to block incoming attacks or redirect any traffic. What it does do is perform an IDS role on the network: any suspicious activity is logged and reported. Snort would be one such piece of software, but I was wondering what other solutions and ideas the rest of the community has.
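    For context, a sketch of the transparent-bridge part using iproute2, with an IDS listening on the bridge (the interface names are examples, and the Snort flags shown are just the common ones):

        ip link add name br0 type bridge
        ip link set eth0 master br0
        ip link set eth1 master br0
        ip link set dev eth0 up
        ip link set dev eth1 up
        ip link set dev br0 up
        snort -i br0 -c /etc/snort/snort.conf -D    # passive IDS on the bridged traffic, daemonized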

    Read the article

  • Linux IO scheduler on databases with RAID

    - by Raghu
    Hi, I have a Linux database (MySQL) server (Dell 2950) with a 6-disk RAID 10. The default I/O scheduler on it is CFQ. However, from what I have read and heard, there is no need for a scheduler like CFQ when reordering/scheduling is also done by the underlying RAID controller; on the contrary, since CFQ does not take the underlying RAID configuration into account, performance may actually degrade. The primary concern is to reduce CPU usage and improve throughput. Also, I have seen recommendations to use the noop/deadline I/O schedulers for databases, primarily because of the nature of their read/write access.
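    For reference, the scheduler can be inspected and switched per block device at runtime (a sketch; sda is a placeholder for the RAID volume's device node):

        cat /sys/block/sda/queue/scheduler                 # the active scheduler is shown in brackets
        echo deadline > /sys/block/sda/queue/scheduler     # switch without a reboot

    Making the change permanent is usually done via the elevator= kernel boot parameter or a udev rule.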

    Read the article

  • Install Linux with two hard drives

    - by rdecourt
    I have a machine with two hard drives. The first one has 80 GB and the second has 120 GB. I'm about to format this machine and install Linux, and I want to put all the main partitions (/, /boot, /usr, etc.) on the first hard disk drive (sda) and mount the /home and /var partitions on the second disk (sdb). Is this possible, and do I have to do something after the installation, or is the second hard disk drive mounted automatically? How can I do it? I won't do it, but is there any problem with putting /boot on the second hard disk drive? I'm using Ubuntu 12.04.
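    For reference, the installer records whatever layout you choose in /etc/fstab, so nothing extra is needed afterwards; a sketch of what such a file might look like for this layout (device names are illustrative, and the installer normally uses UUIDs instead):

        # /etc/fstab
        /dev/sda1   /boot   ext4   defaults   0 2
        /dev/sda2   /       ext4   defaults   0 1
        /dev/sdb1   /home   ext4   defaults   0 2
        /dev/sdb2   /var    ext4   defaults   0 2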

    Read the article

  • Tracking the linux config with git: how?

    - by Pierre
    I'd like to track my Linux configuration with git. My idea is to have a branch for each server. /etc is not the only directory to be tracked (so I won't just git init in /etc). As far as I could see, it is possible to init a git repository for a separate directory. I tried this:

        # mkdir -p /git/.git
        # cd /git
        # git --work-tree=/ --git-dir=/git/.git init
        Initialized empty Git repository in /git/.git/

    1) Creating a new branch before anything else is not possible:

        # git branch server1
        fatal: Not a valid object name: 'HEAD'.

    2) Adding a file in master/HEAD is not possible:

        # touch README.md
        # git add README.md
        fatal: Unable to create '//.git/index.lock': No such file or directory

    How should I properly set up git to track my system config? Thanks. P.
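    A hedged sketch of one way this is commonly set up: a bare repository plus an alias that supplies the work-tree on every call (the repository path, alias name and tracked files are examples; etckeeper is a ready-made alternative if only /etc needs tracking):

        git init --bare /git/config.git
        alias cfg='git --git-dir=/git/config.git --work-tree=/'
        cfg config status.showUntrackedFiles no    # otherwise "status" lists the whole filesystem
        cfg add /etc/fstab /etc/network/interfaces
        cfg commit -m "initial config snapshot"
        cfg branch server1                         # branching works once HEAD points at a commit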

    Read the article

  • Making audio CDs en masse - Linux based solutions?

    - by The Journeyman geek
    My mom sings and gives away CDs to people. Invariably it falls to me to burn the CDs for her, and burning 50-100 CDs on a single drive is a pain. I do have a handful of CD burners and a slightly geriatric old PIII 450. This is what I want to be able to do: either point an application at a folder of WAVs or MP3s, say how many copies I need on the CLI (so I can SSH into the system and use it headless), and feed CDs to two or more burners until it's done; OR pop a single CD into a master drive and have its contents duplicated to two or more burners. I'd rather have it running on Linux, be command-line based, and be as little work as possible; almost automatic, short of telling it how many copies I want, would be ideal. I'm sure people will wonder about legality: my mom sings her own music, and it's classical and older than copyright law, so that's a non-issue. I just want a way to make this chore a little easier, short of telling my mom to do it herself.
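    A rough sketch of the headless, one-burner-at-a-time approach using wodim (the device name, WAV folder and copy count are placeholders; running one such loop per burner device would cover multiple drives):

        #!/bin/sh
        for i in $(seq 1 "$1"); do                      # $1 = number of copies requested
            wodim dev=/dev/sr0 -audio -pad speed=8 /path/to/wavs/*.wav
            eject /dev/sr0
            printf 'Copy %s done. Insert the next blank CD and press Enter...' "$i"
            read dummy
        done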

    Read the article

  • Best window manager for Linux for Virtual Desktop / Multimon

    - by mattcodes
    I previously used Ubuntu GNOME with Compiz, but for my basic-spec Intel MacBook (4 years old) it's a little too heavyweight. So for now I'm back on my MacBook with OS X, but I'm considering going back to Linux. I'm looking for a window manager with the following properties: 1) supports virtual desktops (I need 4 minimum); 2) works well with multiple monitors, i.e. can move an app with a shortcut from one monitor to the other (on the same virtual desktop); 3) can remember window position (e.g. open vim on the second monitor), but must coerce everything back to the first screen when the second screen is unplugged; 4) keyboard-shortcut friendly; 5) not too hard to install; 6) works well with minimal hardware such as integrated graphics. Please suggest and share your experiences.

    Read the article

  • How to compare differences between directories (linux)

    - by Phil
    I have two directories: one from an earlier backup and a second from the newest backup. How do I compare, on Linux, what changes were made to the files in the newest backup? Also, how do I display the changes in, for example, text and PHP files? I'm thinking of something like the revision history on Wikipedia, where you see the old version on one side of the screen and the newest version on the other, with the changes highlighted. How do I achieve something like that? Edit: how do I also compare a remote directory with a local one?
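    A few standard commands that cover the scenarios above (a sketch; the paths, file names and remote host are placeholders):

        diff -qr /backup/old /backup/new                      # list which files differ, were added or removed
        diff -u /backup/old/index.php /backup/new/index.php   # unified diff of one file
        diff -y /backup/old/notes.txt /backup/new/notes.txt   # side-by-side view with both versions on screen
        rsync -avnc user@remote:/path/ /local/path/           # dry-run (-n) checksum comparison of remote vs local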

    Read the article

  • Linux program unable to access files in group

    - by user1064665
    I'm having trouble configuring things on linux so that a program can access certain files. Let's call it pgm A. It has uid uA and gid gA. In addition, uid uA is listed in /etc/group as a member of group gX. The problem is that pgm A cannot access files for which the uid is root and the gid is gX, but only when pgm A is called from another program, pgm B, which also runs as user uA. If I su as user uA and run pgm A from bash, it has no problem accessing files in group gX. But if another program, pgm B, which also runs as user uA, forks and execs pgm A, pgm A cannot access the files. I've verified that pgm A is indeed running as user uA, group gA, when launched from pgm B. So, if uA is a member of group gX, why can't the program access files which are readable by group gX? It's as if the operating system is ignoring the fact that user uA is also in group gX.
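    One detail worth checking: supplementary groups are assigned when a session starts, so a parent process that was started before uA was added to gX will not pass that group on to anything it forks. A quick comparison (a sketch; uA and the PID are placeholders from the question):

        id uA                            # the groups the user would get at a fresh login
        grep Groups /proc/<pid>/status   # the groups the already-running pgm B (or pgm A) actually has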

    Read the article

  • Ping6 fail on linux

    - by michelemarcon
    I have two Linux boxes configured with IPv4 and have tried adding IPv6 to them. I issued this command on box1:

        ip -6 addr add fd32:2d7f:f3c1::1/48 dev eth0

    and I get this:

        inet6 addr: fd32:2d7f:f3c1::1/48 Scope:Global

    Then I issued this command on box2:

        ip -6 addr add fd32:2d7f:f3c2::1/48 dev eth0

    Back on box1 (command/response):

        ping6 fd32:2d7f:f3c1::1
        is alive!
        ping6 fd32:2d7f:f3c2::1
        ping6: sendto: Network is unreachable

    Why can't box1 ping box2 (and, of course, box2 can't ping box1 either)?
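    Worth noting when reading the question: the two addresses sit in different /48 prefixes (fd32:2d7f:f3c1::/48 vs fd32:2d7f:f3c2::/48), so neither box has an on-link route to the other. A sketch of how to check, and of the two obvious fixes (use a shared prefix, or add an explicit route):

        ip -6 route show dev eth0                      # shows only the locally assigned /48 as on-link
        ip -6 route add fd32:2d7f:f3c2::/48 dev eth0   # on box1: make the other prefix reachable via eth0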

    Read the article

  • Best network tuning variables for a Linux proxy

    - by smarthall
    What are the best settings to tune so that Linux can handle a very large number of TCP connections, such as would be seen by a proxy server or a web server? I'm using CentOS 6 and Squid, and am seeing a large number of TIME_WAIT connections backing up until finally the machine stops responding. The machine isn't loaded at the time, but has trouble making incoming and outgoing connections. I've had several suggestions to tune /proc/sys/net/ipv4/tcp_tw_reuse and /proc/sys/net/ipv4/tcp_tw_recycle, but they mention bad interactions with load balancers and NAT, both of which are used in my situation.
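    For reference, a few of the sysctls that usually come up for this workload (the values are illustrative examples, not recommendations; tcp_tw_recycle in particular is the one known to break clients behind NAT and is best left off):

        sysctl -w net.ipv4.ip_local_port_range="1024 65000"   # widen the ephemeral port range
        sysctl -w net.ipv4.tcp_fin_timeout=30                 # shorten the FIN-WAIT-2 timeout
        sysctl -w net.ipv4.tcp_tw_reuse=1                      # reuse TIME_WAIT sockets for new outbound connections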

    Read the article

  • Why a Java application is running slow for clients on Linux

    - by Darshani
    We have two Linux servers which have Apache Tomcat installed. One is running the database; the other is running the Java application. More than 100 users are connected to the servers. The server machines are ordinary quad-core PCs with 4 GB of memory. Everything ran properly for the last 6 months, but the application has recently started to run slowly. Suddenly the Java application gets stuck and users cannot work for some time. There is no network issue. I am trying to identify the reason for this, i.e. whether it is a machine problem or a problem in the Java application. Can anybody help me with this?
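    Some first-line checks that typically narrow this kind of problem down (a sketch; the PID is a placeholder for the Tomcat process):

        top -H -p <pid>                   # per-thread CPU inside the JVM; a single spinning thread stands out
        jstack <pid> > /tmp/threads.txt   # thread dump; several dumps a few seconds apart show where threads are stuck
        jstat -gcutil <pid> 1000          # GC activity; long full-GC pauses often look like the application freezing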

    Read the article

  • Most secure way to have IPtables auto-loaded using Debian / Linux

    - by networkIT
    I'd like to know the safest way to load iptables rules on Debian. Of course, I can use a script that calls iptables-restore:

        #!/bin/sh
        iptables-restore < /etc/firewall.conf

    but: 1) where is the safest place to have it loaded? /etc/network/if-up.d? I'm concerned about the script being run early enough at boot time, and reliably enough when plugging/unplugging interfaces... 2) Is this script method using iptables-restore the most secure way? 3) Additionally, how well does the answer carry over to other Linux distros (Ubuntu, Fedora, CentOS)? Thanks ^^
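    For context, a sketch of the approach usually recommended on Debian: put the restore call in a small executable script under /etc/network/if-pre-up.d/ so the rules are in place before any interface comes up (the iptables-persistent package automates essentially the same thing, keeping rules in /etc/iptables/rules.v4):

        #!/bin/sh
        # /etc/network/if-pre-up.d/iptables  (must be executable)
        iptables-restore < /etc/firewall.conf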

    Read the article

  • configuring linux server firewall to allow access from a certain range of IP addresses

    - by eggman20
    Hi guys, I'm new to Linux servers. I'm currently trying to get an Ubuntu 10.10 server up and running for the first time, and I'm using Webmin for administration. I'm stuck on setting up the firewall. What I need is to ONLY allow a range of IPs (e.g. 128.171.21.1 - 128.171.21.100) to access the HTTP server and Webmin. I've seen a lot of tutorials, but none of them fits what I need. Thanks in advance!
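    A hedged iptables sketch for exactly that range (port 80 for HTTP and 10000 for Webmin's default; adjust if Webmin was moved to another port):

        iptables -A INPUT -p tcp --dport 80    -m iprange --src-range 128.171.21.1-128.171.21.100 -j ACCEPT
        iptables -A INPUT -p tcp --dport 10000 -m iprange --src-range 128.171.21.1-128.171.21.100 -j ACCEPT
        iptables -A INPUT -p tcp -m multiport --dports 80,10000 -j DROP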

    Read the article

  • Accessing subfolders of a windows share from linux

    - by Born2Smile
    Hi, at my work they have a funny setup: my home folder is a subfolder to a share, as such: \\server\share\subfolder Now I have full permissions to the subfolder, but no permissions to share. From windows I can connect to the VPN of my work place, type the above address into any address field, and voila: I see the contents of my home folder. In Linux (using Ubuntu) however, I can't figure out how to connect directly to the subfolder. Every attempt I can think of keeps returning "Access denied", because I don't have permission to view the share. Any help on how to connect to the subfolder would be greatly appreciated :) Cheers, Born2Smile
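    One thing that often works in this situation is pointing the CIFS mount (or the file manager) at the subfolder itself rather than the share root (a sketch; the server, share, mount point and credentials are placeholders):

        sudo mount -t cifs //server/share/subfolder /mnt/workhome \
             -o username=myuser,domain=MYDOMAIN,uid=$(id -u),gid=$(id -g)

    In GNOME the equivalent is entering smb://server/share/subfolder directly in the location bar.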

    Read the article
