Search Results

Search found 7935 results on 318 pages for 'aka nice'.


  • limits.conf to set memory limits

    - by Rupert Jipe
    I would like to limit any process from using more than 500 MB of RAM. AFAIK this is done using RSS in /etc/security/limits.conf, but the process called gnome-panel is apparently using 618436 kB of VmRSS. How can this be?

    /etc/security/limits.conf:

        *    hard    rss    512000

    username@debian:~$ cat /proc/3002/status

        Name:   gnome-panel
        State:  S (sleeping)
        Tgid:   3002
        Pid:    3002
        PPid:   2910
        TracerPid:  0
        Uid:    1000 1000 1000 1000
        Gid:    1000 1000 1000 1000
        FDSize: 64
        Groups: 20 24 25 29 44 46 112 116 117 1000 1002 1003
        VmPeak:  916636 kB
        VmSize:  916636 kB
        VmLck:        0 kB
        VmHWM:   618436 kB
        VmRSS:   618436 kB
        VmData:  601972 kB
        VmStk:      104 kB
        VmExe:      516 kB
        VmLib:    29232 kB
        VmPTE:     1760 kB
        Threads: 1
        SigQ:   0/14001
        SigPnd: 0000000000000000
        ShdPnd: 0000000000000000
        SigBlk: 0000000000000000
        SigIgn: 0000000020001000
        SigCgt: 0000000180000000
        CapInh: 0000000000000000
        CapPrm: 0000000000000000
        CapEff: 0000000000000000
        CapBnd: ffffffffffffffff
        Cpus_allowed:   3
        Cpus_allowed_list:  0-1
        Mems_allowed:   00000000,00000001
        Mems_allowed_list:  0
        voluntary_ctxt_switches:    871965
        nonvoluntary_ctxt_switches: 47553
        PaX:    PeMRs

    username@debian:~$ cat /proc/3002/limits

        Limit                     Soft Limit           Hard Limit           Units
        Max cpu time              unlimited            unlimited            seconds
        Max file size             unlimited            unlimited            bytes
        Max data size             unlimited            unlimited            bytes
        Max stack size            8388608              unlimited            bytes
        Max core file size        0                    0                    bytes
        Max resident set          524288000            524288000            bytes
        Max processes             100                  100                  processes
        Max open files            1024                 1024                 files
        Max locked memory         65536                65536                bytes
        Max address space         unlimited            unlimited            bytes
        Max file locks            unlimited            unlimited            locks
        Max pending signals       14001                14001                signals
        Max msgqueue size         819200               819200               bytes
        Max nice priority         0                    0
        Max realtime priority     0                    0
        Max realtime timeout      unlimited            unlimited            us
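
    A hedged aside on why the cap may not bite: on current Linux kernels the RSS rlimit (RLIMIT_RSS) is accepted but not enforced, which is consistent with the limit showing up in /proc while the process sails past it. A cgroup memory limit is the usual working alternative. A minimal sketch, assuming a cgroup-v1 memory controller mounted at /sys/fs/cgroup/memory (the group name is illustrative):

        # create a group with a 500 MB cap and move the target PID into it
        sudo mkdir /sys/fs/cgroup/memory/cap500
        echo $((500 * 1024 * 1024)) | sudo tee /sys/fs/cgroup/memory/cap500/memory.limit_in_bytes
        echo 3002 | sudo tee /sys/fs/cgroup/memory/cap500/tasks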

    Read the article

  • Does AMD Cool n Quiet Slow Down Your System?

    - by Software Monkey
    I discovered today that having AMD Cool'n'Quiet enabled in my BIOS appears to be slowing down my Windows XP SP2 system by about 29% on memory- and CPU-intensive workloads. I was wondering if (a) anyone else has encountered this, (b) anyone can offer an explanation, and (c) there are any negatives I need to be aware of if I keep AMD CnQ disabled. With some superficial testing so far, I don't immediately notice any difference with CnQ off (other than the performance being what I expected from this new hardware). It seems to ramp up the CPU fan a little bit as my program maxes out one core, but that's the same as with CnQ on. And when I let the system idle, the CPU fan slows down and the system's as quiet as a mouse (after years of six small fans churning like they want to go into orbit, it's nice to again have a system where I can hear the HDDs seeking). Bonus question: does CnQ cause issues with system stability? I ask because the reason I disabled it was that I have had a few freezes and one spontaneous reboot with my new hardware.

    Read the article

  • My desktop has started overheating -- how hot is hot?

    - by Jerry
    I have a two-year-old desktop, some random quad-core HP desktop. It used to run very quietly, but in the past month the fans start up anytime anything "serious" is being done - compiles, playing video, etc. Right now, SpeedFan and Speccy report the cores are between 50C and 70C. SpeedFan reports this as hot (nice flame icon). Well, the system does sit on my carpet, so two weeks ago I took off the lid, and cough *cough* it was pretty filled with dust. I got out an air can, turned on a vacuum, and carefully got out all the dust that I saw on the CPU fan, the case fans, and any fan I saw (graphics board), and blew out all the dust I could from all the circuit boards. And then I closed the case back up. It has definitely run cooler since then, but it still runs hot, and I hear high-speed fan noise I never heard before. How hot is too hot? At what temps do consumer-grade CPUs die? What should I be looking to do? Replace the CPU fan? (It seems to work.) Replace the power supply fan? Assuming the dust problem is gone, where should I be looking to determine why the machine is heating up? Epilogue: after following the various pieces of advice given here, the system did run cooler, but it was still noticeably running louder (hotter) than just a few months prior. I ended up purchasing a new CPU heatsink and fan, and during installation found the cooling grease from the original heatsink was just a dried, cracked layer, probably more of an insulator than a heat transfer agent. With the new fan AND the new heatsink compound, the system runs much, much cooler and the fan rarely turns on.

    Read the article

  • Rsync: Only preserve meta (time, group, etc) on files and sub-directories, not root directory

    - by Svish
    I am copying some files (all except hidden ones) using rsync from one place to another with this command:

        rsync -Cav --delete --exclude=.* /Some/Directory/ other-host:/Other/Directory

    It works nicely, except that I get the following errors:

        rsync: chgrp "/Other/Directory/." failed: Operation not permitted (1)
        rsync: failed to set times on "/Other/Directory/.": Permission denied (13)

    That is understandable, because I do in fact not have those permissions, and I also do not want to change the group of that directory. I only want to do this for all the files and directories that are in that directory. Is there any way to solve this? I tried --exclude=. and --exclude=./, but those didn't work. Any ideas? I have no idea how to fix this... More details: this is on Mac OS X, and the directories I am syncing are from a local mounted volume to the /Users/Shared/ directory on the other host. That directory has user root and group wheel. The files inside it have user admin and group staff, and so does the local source directory.
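
    One hedged workaround sketch: sync the entries inside the directory individually, so rsync never tries to set ownership or times on the destination root itself. The caveat is that --delete then only prunes inside each top-level entry; top-level entries that vanish from the source are no longer removed:

        cd /Some/Directory && for f in *; do
            rsync -Cav --delete "$f" other-host:/Other/Directory/
        done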

    Read the article

  • Home media storage solution

    - by Dan
    I record lots of personal HD film footage and am looking for a cheap way to store all of this. I take ~120 GB of footage each month, so something expandable would be nice... something that might be able to hold 6+ SATA drives. There is a low load requirement, as there is never more than a user or two... but it should be able to keep up with streaming two simultaneous HD videos. I don't really want to spend more than $200-$300 on top of the $900 I am thinking of spending for 6x2TB SATA drives at $150 apiece, but I am willing to pay extra for a quality solution. Should I get a cheap NAS server? A cheap multi-drive external enclosure? Should I just get some used systems off Craigslist? If it is an independent system I'll probably just throw Ubuntu on it, since I can maintain that well. It's easy to do software RAID from Ubuntu too, if I choose to go that way. Thanks
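
    Since the question floats Ubuntu software RAID, a minimal sketch of one way to build the array, assuming the six data drives show up as /dev/sdb through /dev/sdg (device names, RAID level, and mount point are illustrative):

        sudo apt-get install mdadm
        # RAID5 over six 2TB drives: ~10TB usable, tolerates one drive failure
        sudo mdadm --create /dev/md0 --level=5 --raid-devices=6 /dev/sd[b-g]
        sudo mkfs.ext4 /dev/md0
        sudo mount /dev/md0 /srv/media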

    Read the article

  • How to tunnel specific domains through another server

    - by Peter Smit
    I am working at a university as a research assistant. Often I would like to connect from home to university resources over http or ssh, but they are blocked from outside access. Therefore, they have a front-end ssh server we can ssh into, and from there reach other hosts. For http access they advise setting up an ssh tunnel like this:

        ssh -L 1234:proxyserver.university.fi:8080 publicsshserver.university.fi

    and pointing the proxy settings of your browser at port 1234. All nice and working, but I would not like to let all my other internet traffic go over this proxy server, and every time I want to connect to the university I have to do these steps again. What would I like:

    - Set up an ssh tunnel every time I log in to my computer. I have a certificate, so no passwords are needed.
    - Have a way to redirect some wildcard domains always through the ssh server first, so that when I type intra.university.fi in my browser, the request transparently goes through the tunnel. Same when I want to ssh into another resource within the university.

    Is this possible? For the http part I think maybe I should set up my own local transparent proxy to have this easily done. How about the ssh part?
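
    For the ssh half, a hedged sketch using ~/.ssh/config, assuming OpenSSH 5.4 or newer for the -W option (host patterns are illustrative):

        # hop transparently through the front-end for university hosts
        Host *.university.fi !publicsshserver.university.fi
            ProxyCommand ssh -W %h:%p publicsshserver.university.fi

    With this in place, "ssh intra.university.fi" from home first connects to the public server and then forwards straight to the target, with no manual tunnel setup.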

    Read the article

  • Dedicated server automatic backup solution

    - by Luigi
    I have a dedicated Ubuntu web server in a cloud environment, and I am looking for a nice way to do automated backups. I would like to back up some directories with web apps, and all my MySQL databases. As for destination: make snapshots every two hours locally, and every six hours to a remote ftp server. Also delete backup archives older than seven days (locally and on the ftp server), and notify me of any problems by email. Now to achieve some of this functionality I use cron + a shell script, and http://www.mysqldumper.net/, but really that doesn't answer my needs. Mysqldumper doesn't automatically know about new databases, and the shell script does not notify on problems. It's something I have to check on from time to time, and I don't trust it. I googled a while, and it seems like most people solve this stuff with shell scripts. Is this a method you can trust? Are there any web-GUI tools I'm missing? Maybe there is a smarter strategy for doing this? I'm a little bit confused.
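
    On the two pain points named above, a minimal cron-driven sketch; it assumes credentials in ~/.my.cnf, a working local MTA for the mail command, and illustrative paths and addresses throughout. mysqldump --all-databases picks up new databases automatically, and any failed step triggers the notification:

        #!/bin/bash
        # /usr/local/sbin/backup-site.sh (hypothetical path), run from cron
        set -o pipefail
        DEST=/var/backups/site
        STAMP=$(date +%F-%H%M)
        mkdir -p "$DEST" \
          && tar czf "$DEST/webapps-$STAMP.tar.gz" /var/www \
          && mysqldump --all-databases | gzip > "$DEST/mysql-$STAMP.sql.gz" \
          && find "$DEST" -name '*.gz' -mtime +7 -delete \
          || echo "backup failed on $(hostname)" | mail -s "backup FAILED" admin@example.com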

    Read the article

  • /etc/security/limits.conf for setting program limits in Linux

    - by Flavius Akerele
    I have the following inside /etc/security/limits.conf (I have specified root separately because * will not include it):

        user2   -   core        unlimited
        *       -   core        0
        root    -   core        0
        *       -   rss         512000
        root    -   rss         512000
        *       -   nproc       100
        root    -   nproc       100
        *       -   maxlogins   1
        root    -   maxlogins   1

    I run a program as user2 (./programname) but /proc/3498/limits says cores are disabled:

        Limit                     Soft Limit           Hard Limit           Units
        Max cpu time              unlimited            unlimited            seconds
        Max file size             unlimited            unlimited            bytes
        Max data size             unlimited            unlimited            bytes
        Max stack size            8388608              unlimited            bytes
        Max core file size        0                    0                    bytes
        Max resident set          524288000            524288000            bytes
        Max processes             100                  100                  processes
        Max open files            1024                 1024                 files
        Max locked memory         65536                65536                bytes
        Max address space         unlimited            unlimited            bytes
        Max file locks            unlimited            unlimited            locks
        Max pending signals       14001                14001                signals
        Max msgqueue size         819200               819200               bytes
        Max nice priority         0                    0
        Max realtime priority     0                    0
        Max realtime timeout      unlimited            unlimited            us

    Both ulimit -Sa and ulimit -Ha output that cores are disabled:

        core file size          (blocks, -c) 0
        data seg size           (kbytes, -d) unlimited
        scheduling priority             (-e) 0
        file size               (blocks, -f) unlimited
        pending signals                 (-i) 14001
        max locked memory       (kbytes, -l) 64
        max memory size         (kbytes, -m) 512000
        open files                      (-n) 1024
        pipe size            (512 bytes, -p) 8
        POSIX message queues     (bytes, -q) 819200
        real-time priority              (-r) 0
        stack size              (kbytes, -s) unlimited
        cpu time               (seconds, -t) unlimited
        max user processes              (-u) 100
        virtual memory          (kbytes, -v) unlimited
        file locks                      (-x) unlimited

    Why are cores disabled?
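
    One hedged thing worth checking before debugging the entries themselves: limits.conf only takes effect if pam_limits runs in the PAM session stack for however the user2 session was started, and some start paths (cron, su without a login shell, display managers) can skip it:

        # typical check on Debian/Ubuntu-style PAM layouts
        grep pam_limits /etc/pam.d/common-session /etc/pam.d/common-session-noninteractive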

    Read the article

  • Can't connect to research.microsoft.com on home Qwest DSL connection

    - by rakingleaves
    I have a puzzling issue regarding accessing research.microsoft.com from my home Qwest DSL connection. By default, I frequently get timeouts when accessing research.microsoft.com from Firefox, Safari, or Chrome on my Mac. I also cannot access the site from Internet Explorer in a Windows VM. However, I am able to access the site through proxify.com, so I know the site is not down. Furthermore, I haven't noticed problems accessing other sites (in particular, www.microsoft.com works fine). Also, I can access research.microsoft.com when I'm connected to networks other than my home Qwest DSL connection. Together, the above make me suspect a problem with either my router (Airport Express) or, more likely, my ISP. Anyone have any thoughts on how I can narrow down the problem further? I could call my ISP and tell them the above, but my feeling is that probably won't get me very far. I can get by browsing research.microsoft.com through a proxy, but it would be nice to figure out what's going on here and fix the problem. Oh, the only relevant discussion I found via Google was here: http://forums.whirlpool.net.au/forum-replies-archive.cfm/1311734.html

    Update: Thanks to those who have tried to help! I found one other thing while Googling that may be vaguely relevant: http://thedaneshproject.com/posts/supportmicrosoftcom-not-working-behind-squid/ Disabling the Accept-Encoding headers in Firefox didn't actually make a difference for me; I just thought the above might spark some other ideas about how mishandling of HTTP headers somewhere might be causing this problem. Thanks again!

    Another update: In case anyone is still thinking about this: I've found that I can't surf research.microsoft.com using the links text-based browser, but I can reliably download individual files with wget. Maybe that helps?
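
    The wget-works/links-fails split smells like a packet-size-dependent problem, so one hedged diagnostic is a path-MTU test from the Mac (PPPoE DSL links often clamp the MTU below 1500, and some paths break MTU discovery):

        # 1472 data bytes + 28 bytes of headers = 1500; -D sets don't-fragment on OS X
        ping -D -s 1472 research.microsoft.com
        # if that blackholes, step the size down until replies appear
        ping -D -s 1400 research.microsoft.com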

    Read the article

  • Verizon 4G LTE vs. a LAN

    - by n8wrl
    I have been having quite a bit of trouble getting my new Verizon 4G LTE service running on a Windows 7 desktop. My desktop is on a LAN here at home with two other PCs. We all share printers, files, media, etc. Until yesterday, we also shared a Verizon 3G modem via a NetGear 3G broadband WAP. That isn't compatible with the 4G, so now I am just trying to get the 4G modem working directly connected to one of the desktops. After some USB wrangling, it seems to work. Except, every 7-10 minutes the connection would drop. After some time on the phone with a very nice Verizon technician, it seems to be staying up - it's been up for 20 minutes now. He told me that my LAN was causing the 4G to drop: that traffic on my LAN, even though it is not destined for the internet (ICS not working yet), was causing the cell tower to detect an "IP change" and a "security violation" in my modem and drop my connection. Is this Verizon's way of forbidding more than one computer from sharing a modem? I have my computer running now without a LAN connection and the 4G is still up. But this isn't practical. Has anyone heard of this?

    Read the article

  • There's no sound on Ubuntu with an Intel HDA onboard chip and Realtek ALC1200 codec.

    - by Hanno Fietz
    For a while now, my sound has not been working in Ubuntu. It used to play OK, but after some upgrade (might have been the distro upgrade to 9.10), it stopped working. I'm currently running 10.04 on an amd64 architecture. I'm using the builtin audio on a Foxconn motherboard; it's an ATI/Intel HDA chip with an Azalia controller, apparently using the Realtek ALC1200 codec. All the gory details here. I found a nice sound troubleshooting tutorial here, which is well-written and pretty extensive; however, I failed to look up the supported "models" for my sound card. The troubleshooting page says to look for a section giving the codec used by your soundcard, which looks like this for me:

        !!HDA-Intel Codec information
        !!---------------------------
        --startcollapse--
        Codec: Realtek ALC1200

    Then, I'm supposed to look up the models for that codec in the file Documentation/ALSA-Configuration.txt in the appropriate directory of ALSA's git repository. Mine actually pointed me to a separate file, Documentation/HD-Audio-Models.txt, which, for my driver version, is located here and contains no section related to ALC1200 codecs. I tried putting the driver options probe-mask=1 and model=auto in a config file for modprobe, as suggested elsewhere, but this just led to snd-hda-intel not being able to load at all anymore. I also tried installing the linux-backports-modules-alsa package for my kernel, because the description sounded promising, but that didn't change anything either.
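
    A hedged detail on the modprobe attempt: the snd-hda-intel parameter is spelled probe_mask with an underscore, and a malformed option line can be enough to keep the module from loading at all, which would match the symptom above. A sketch of the line as it would appear in /etc/modprobe.d/alsa-base.conf (whether these values help the ALC1200 is a separate question):

        options snd-hda-intel model=auto probe_mask=1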

    Read the article

  • Recommendations for hosting large videos

    - by Clinton Blackmore
    I recently created and put a 45-minute, 300 MB video file on my website and told a mailing list about it. Checking my site stats, I see that I've used 20% of my "unlimited" bandwidth for the month. As I want to be able to have several videos like this, clearly I need to consider other options. The appeal of hosting files on my own site (aside from the supposedly unlimited disk space and bandwidth) is being able to control the format, resolution, and quality of the videos, as well as to make clear that I'm the copyright holder (although the videos will be under a Creative Commons license). I find that for the screencasts I'm making, having a high resolution (say 3/4 of 1024 x 768) really makes seeing what is going on on the screen easier. It is also always a plus to not have the experience marred by advertisements. One more wrench to throw in: while the videos are non-commercial, they do promote a club, and it seems that that falls afoul of some terms of service (especially for free services; while free is very nice, I will certainly consider putting up some money). What recommendations do you have for (fairly) long, high-resolution videos? Should I look in depth at sites like YouTube and Vimeo, should I be considering a file-sharing site (I have no qualms with someone downloading the entire video first - I wouldn't want to watch 45 minutes in my browser!), hosting files with BitTorrent (ugh - I think that'd reduce my audience), or should I be looking into other web hosts (and if so, who)?

    Read the article

  • Somewhat powerful server needed for computationally expensive stuff

    - by Dane Larsen
    So here's my problem. My Dad runs a company that does some rather computationally expensive stuff. This is not supercomputer-level stuff, but it does take several hours to run the average job on his Core i7 desktop. He asked me to look into a way to have his customers use the code on an hourly basis, namely via a server. Ideally he'd be able to buy a box for about $1000 and hook it right up to our home connection. Unfortunately, the data that needs to be both sent and received is on the order of several hundred megs. We live in a rural area, and the fastest connection offered is 1.5 Mbit/s down and about 0.3 Mbit/s up. Not workable. What are the options for this kind of thing? Ideally, we'd have about 2 GB of RAM, 300-500 GB of storage, and a nice dual core, and it has to run some flavor of Linux. Any suggestions? Thanks in advance. EDIT: Also, ideally the monthly price would be under $100.

    Read the article

  • installing lots of perl modules

    - by Colin Pickard
    Hi, I've been landed with the job of documenting how to install a very complicated application onto a clean server. Part of the application requires a lot of Perl scripts, each of which seems to require lots of different Perl modules. I don't know much about Perl, and I only know one way to install the required modules. This means my documentation now looks like this:

    Type each of these commands and accept all the defaults:

        sudo perl -MCPAN -e 'install JSON'
        sudo perl -MCPAN -e 'install Date::Simple'
        sudo perl -MCPAN -e 'install Log::Log4perl'
        sudo perl -MCPAN -e 'install Email::Simple'
        (.... continues for 2 more pages... )

    Is there any way I can do all this on one line like I can with aptitude, i.e.:

    Type the following command and go get a coffee:

        sudo aptitude install openssh-server libapache2-mod-perl2 build-essential ...

    Thank you (on behalf of the long-suffering people who will be reading my document)

    EDIT: The best way to do this is to use the packaged versions. For the modules which were not packaged for Ubuntu 10.10, I ended up with a little Perl script which I found here:

        #!/usr/bin/perl -w
        use CPANPLUS;
        use strict;
        CPANPLUS::Backend->new( conf => { prereqs => 1 } )->install(
            modules => [ qw(
                Date::Simple File::Slurp LWP::Simple
                MIME::Base64 MIME::Parser MIME::QuotedPrint
            ) ]
        );

    This means I can put a nice one-liner in my document:

        sudo perl installmodules.pl
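
    For completeness, a hedged pair of alternatives: the CPAN shell's install accepts a list of modules, and App::cpanminus gives an apt-like one-liner if installing one extra tool is acceptable (module names are the ones from the question; package availability varies by release):

        # stock CPAN, one line
        sudo perl -MCPAN -e 'install(qw(JSON Date::Simple Log::Log4perl Email::Simple))'
        # or, with cpanminus installed
        sudo cpanm JSON Date::Simple Log::Log4perl Email::Simple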

    Read the article

  • Backup solution, or, how Duplicati duped me

    - by blarghmaster
    TL/DR version: Mono + Duplicati.commandline.exe restore etc. etc. spits this out for several files regardless of what I try. I am able to list sets, list files in said sets, even do a verify, but each time I do a restore of any kind, I get errors to the effect of:

        Failed to restore file: "snapshot/blahblah/2005-11-07.tar.gz",
        Error message: The partial file record for snapshot/blahblah/2005-11-07.tar.gz does not match the file

    Any advice here, or an idea of where to look for a better solution?

    FULL STORY: I've recently put together a nice, clean, friendly backup solution for several servers, predominantly Linux, but occasionally a Windows box is added too. The solution as is meets all my requirements and does it well... save one: cross-compatibility. The solution is based on a combination of several elements, but eventually comes down to using Duplicity and Duplicati for the actual storage of files. The entire solution was ready to go before I realized that Duplicati does not, in fact, allow me to restore my files to a Linux box, regardless of what the command line under Mono might tell you. It just spits out errors on random zip and image files, for apparently no good reason, as I have tried several options to get it to restore, and several versions of Mono, including installing it pretty much lib-for-lib. There is no effective log file for the reasons for these errors, and even the "--debug-output=true" flag does nothing. I am able to list sets, list files in said sets, even do a verify, but each time I do a restore of any kind, I get the errors quoted above. Now I could most likely use the friendly instructions on Duplicati's site and script a bash equivalent of the restore, but that's not exactly ideal. Any advice on this? Or possibly an alternative solution that presents the same benefits of Duplicati/Duplicity but actually works across platforms?
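
    On the "script a bash equivalent" route: early Duplicati advertised an on-disk format compatible with duplicity, so one hedged experiment is attempting the restore with duplicity directly. This assumes the compatible format was actually used, and the backend URL and paths here are illustrative:

        duplicity restore --file-to-restore snapshot/blahblah/2005-11-07.tar.gz \
            sftp://backup@backuphost/backups /tmp/2005-11-07.tar.gz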

    Read the article

  • Securely executing system commands as sudo from PHP

    - by Aydin Hassan
    Is it possible? I have written a command line tool in PHP for creating new environments for our company. It creates system users, directories, databases, VHosts, and restarts Apache, amongst other things. These commands require sudo privileges. I thought it might be a nice idea to have a web interface for it, to make it easier for other non-developers to use. The web app would be behind authentication. When running from the command line I just run sudo tool.php; obviously I can't do this from a web app. How could I do this securely? Giving the apache user sudo access seems silly, as this would mean all sites hosted on the box (e.g. all our environments) would have sudo access. Is it possible to make this tool run under a different user? This user could have sudo privileges for only the commands I need? How do things like Plesk and cPanel do this? Any thoughts?
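
    A hedged sketch of that "narrow privileges" idea, with all names and paths illustrative: restrict sudo to the one exact command line rather than granting the web user blanket rights. The residual risk, worth noting, is that any code running as the web user can still invoke that one command, so the authentication layer still matters:

        # /etc/sudoers.d/envtool -- the web user may run only this command, as root
        www-data ALL=(root) NOPASSWD: /usr/bin/php /usr/local/bin/tool.php

    and from the PHP side, something like:

        exec('sudo /usr/bin/php /usr/local/bin/tool.php 2>&1', $output, $rc);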

    Read the article

  • Edit-text-files-over-SSH using a local text editor

    - by Mikko Ohtamaa
    I am working in various Linux and UNIX environments. I'd like to elegantly solve the problem of editing remote configuration files over SSH. Instead of using terminal editors (nano), I'd like to open the file in a local text editor on my desktop (Sublime Text 2). CyberDuck, WinSCP and various other SFTP apps can do this. Using editors over X11 forwarding has also proven to be problematic. Archaic text editors like Vim or Emacs do not serve my needs well either; they could do this, but I prefer other text editing software. SSH mounts (FUSE) are also problematic unless they can happen on demand, triggered by the remote site. So what I hope to achieve:

    - Have some kind of easily deployable shell script etc. which I can copy to the remote server (let's call it mooedit)
    - I run the mooedit command on the remote server to which I have connected over SSH
    - mooedit sends some kind of signal (over SSH) to my local desktop
    - On my local desktop this signal is captured and it determines "a ha! moo wants to edit a file on server X in folder Y"
    - The file is SFTP-transferred to the local desktop (/tmp)
    - The file is opened in a nice GUI text editor on the local desktop
    - When Save is pressed, the local desktop notices the changes in the file and SFTP sends the resulting file back to the server

    The question is: what signaling mechanisms does SSH provide for this? Any other methods to trigger a local text editor for a remote SSH file?
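
    A hedged sketch of the signaling piece: SSH remote port forwarding (-R) can carry exactly this kind of editor channel, and it is how the existing rmate/rsub helpers work with TextMate and Sublime. The port number follows the rmate convention; mooedit stands in for any such client script:

        # local side: an editor listener on 52698, forwarded out with the connection
        ssh -R 52698:localhost:52698 user@remote-server
        # remote side: the helper streams the file over the forwarded socket
        mooedit /etc/nginx/nginx.conf

    The listener on the desktop receives the file contents, opens the editor, and streams the buffer back over the same socket on save.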

    Read the article

  • IIS6 Virtual SMTP server isn't coming back up automatically after a system restart

    - by Julian James
    I've got a virtual server running Win2008 R2. I've set up IIS6 with a virtual SMTP server on it to be the mail provider for the websites I'm hosting there. It all works great, but if for some reason the server reboots (auto updates are still enabled - I'm trying to make this as little work as possible, as we've got a lot of clients), IIS6 doesn't restart the SMTP server. The failure causes 500 errors on the current setup, so I'm spending half the day apologising. Any ideas? In Services I've set everything to come back up automatically, but still no dice. As soon as I restart the SMTP, no problems, all the mail gets sent. It's working perfectly, it just won't restart on its own. I'd really rather not turn auto updates off, as we're such a small company I just can't spare the time to be manually updating 15 copies of Windows every time MS decide there's a security patch. All advice appreciated! BTW, I am a complete newb to these forums. I searched but couldn't find an answer, so please be nice. But firm. I've got to learn here.
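
    A hedged pair of checks from an elevated command prompt, assuming the SMTP service carries its usual name SMTPSVC: confirm it starts automatically rather than manually, and give it recovery actions so a failed start retries on its own:

        sc config SMTPSVC start= auto
        sc failure SMTPSVC reset= 86400 actions= restart/60000/restart/60000/restart/60000

    (The space after each = is required by sc's syntax.)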

    Read the article

  • Linux/Unix in Windows

    - by Dmitriy Nagirnyak
    Hi, what would be the best way to get a full-blown Unix/Linux bash inside Windows? I don't mean a virtual machine, but rather only the terminal, with the NTFS drives mounted. This way I could use the power of Unix/Linux while still being on Windows. The things I want to be able to do from the terminal:

    - Package management (apt-get in Debian).
    - SSH.
    - File operations (including grub and similar).
    - Run a web server (Apache, nginx) for testing purposes.
    - Easy to use: start terminal - Linux is on, end terminal - Linux is shut down.
    - Would be nice to be able to copy-paste from Windows into the terminal and vice versa.

    This really feels like a separate OS, and I realize that a VM would probably be the best thing. But I guess it should be possible to have a lighter installation. THE NOTE: I cannot just use Linux, because I still need to do development on Windows. Also I am a Linux noobie - just getting started with it, so sorry if I'm asking something obvious/stupid. Thanks, Dmitriy.

    Read the article

  • How can I write automated tests for iptables?

    - by Phil Frost
    I am configuring a Linux router with iptables. I want to write acceptance tests for the configuration that assert things like: traffic from some guy on the internet is not forwarded, and TCP to port 80 on the webserver in the DMZ from hosts on the corporate LAN is forwarded. An ancient FAQ alludes to an iptables -C option which allows one to ask something like, "given a packet from X, to Y, on port Z, would it be accepted or dropped?" Although the FAQ suggests it works like this for iptables (but maybe not ipchains, which it uses in the examples), the -C option seems not to simulate a test packet running through all the rules, but rather to check for the existence of an exactly matching rule. This has little value as a test. I want to assert that the rules have the desired effect, not just that they exist. I've considered creating yet more test VMs and a virtual network, then probing with tools like nmap for effects. However, I'm avoiding this solution due to the complexity of creating all those additional virtual machines, which is really quite a heavy way to generate some test traffic. It would also be nice to have an automated testing methodology which can also work on a real server in production. How else might I solve this problem? Is there some mechanism I might use to generate or simulate arbitrary traffic, then know if it was (or would be) dropped or accepted by iptables?
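
    One hedged way to get test traffic without full VMs: network namespaces give throwaway "hosts" on the same kernel, cheap enough to create and destroy inside a test script. A sketch, with addresses and names illustrative (needs root and a reasonably recent iproute2):

        ip netns add client
        ip link add veth0 type veth peer name veth1
        ip link set veth1 netns client
        ip addr add 10.0.0.1/24 dev veth0 && ip link set veth0 up
        ip netns exec client ip addr add 10.0.0.2/24 dev veth1
        ip netns exec client ip link set veth1 up
        # probe the ruleset from the fake host, then assert on the result
        ip netns exec client nmap -p 80 10.0.0.1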

    Read the article

  • Remote desktop software where the client need not install anything...

    - by allentown
    I am primarily a Macintosh user, and can usually walk a client through any troubles they may have because I have a Macintosh in front of me. If they are on a different OS, things are close enough, or I can remember enough, that I can get by. When trying to help clients on Windows, I get stuck. I do not have access to Windows, and even if I did, there are far too many versions of Outlook, all with their various esoteric settings and checkboxes, that I could never see exactly what they are seeing. I mostly need to just help them with email setup. Something like copilot.com may do the trick. What is the simplest remote control software out there? Ideally, it would accomplish these:

    - No software needed on the remote end, or a single .exe that they can toss when done.
    - I need Mac-based software on my end. I do have ARD, which supports VNC.
    - Free :) if possible, that would be really nice.
    - Needs a port-forwarding proxy run by the company. There is no way I can get the user to alter their router, or to even plug directly into their WAN for a short time.

    On the Mac, I just have them open iChat, and this is all built in, proxying through AIM; I'm looking for the same for Windows and Mac.

    Read the article

  • Suggestions on the best home server rack cabinet

    - by allentown
    I have a lot of gear in a colocation facility right now. Some of it is going to come home with me now. I do not know anything about the "rack mount" side of the industry. I lease a rack, and I put my stuff in it. I have a few 1U boxes, a few 2U boxes, a few 4U boxes, and a 1U switch. One is a new Xserve, which means it is deep. I think I can get by with around 12U to 18U. I want to keep it as small as possible, since I do not have a lot of spare space at my home. I will not be able to bolt to the wall, floor, etc., so it should not be tall. This is something I would love to more or less just be a box that sits on the floor but gives me the ability to mount nicely, do nice cable management, etc. Are the "post" style racks junk? I am liking the open space, and the lack of limitations on depth, of something like this: http://www.rackmountsolutions.net/images/products/Martin-relay-rack.jpg However, that thing is way too tall, and probably way too expensive. I am looking to be around $300.00 or less. More if I have to, though I would prefer not to. These look near perfect: (see comment for this link, the system will not let me post a second url) but I am worried the Xserve will not fit in it. If anyone has any good links, or website recommendations from good past experience, I would appreciate it. I am almost considering that I may be able to build something with random scraps of stuff at Home Depot as well.

    Read the article

  • OSX: sync Documents folder to Dropbox with version control

    - by James Porter
    I have ample storage in Dropbox to sync my entire OSX Documents folder, and I'd like to do this just so that I have it anywhere I go. I found this question, which describes a method for doing this with symlinks. Seems good; the only problem is that it would be nice also to have everything under version control. I thought perhaps a better solution would be to set up my Documents folder as a git repo with a remote that I would push to in my Dropbox folder. Alternatively, just set up Documents as a git repo with no remote and then symlink it to Dropbox. Which of these two alternatives is preferable? What are some pitfalls I might not be thinking of with each? It has also occurred to me that some of the subdirectories of Documents are themselves git repos with GitHub remotes. Would it cause problems for these subdirectories if I made Documents a git repo? If so, how do I get around this? Would making Documents an svn repo instead help? Is there a way to set up git so that this is not an issue?
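
    A minimal sketch of the first alternative, a bare remote inside Dropbox (paths are illustrative). On the nested-repo worry: git does not track the contents of a subdirectory that is itself a git repo - it records only a pointer to that repo's current commit - so the inner GitHub-backed repos stay independent, but their files are not versioned by the outer repo:

        cd ~/Documents
        git init && git add -A && git commit -m "initial snapshot"
        git init --bare ~/Dropbox/documents.git
        git remote add dropbox ~/Dropbox/documents.git
        git push dropbox master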

    Read the article

  • Does NetworkSolutions have a good DNS service?

    - by joxl
    I'm recovering from a DNS disaster and I need some good advice on an alternate solution. My company owns a domain name through NetworkSolutions. Our website is hosted by another company, who also maintain our DNS records. Our email is hosted by Google Apps, and the MX records are maintained through the aforementioned website/DNS host. Yesterday our website/DNS host had a serious hiccup in some software and completely overwrote all of our DNS records with invalid values, successfully pointing our domain and MX records at the wrong servers. Unfortunately it wasn't caught until it had had time to significantly propagate. On top of that, it wasn't fixed until several hours later; combine that with a long TTL on the records, and we have customers who are still bouncing emails. Anyhow, I am now completely terrified of this company's ability to do a good job, so I am considering switching to NetworkSolutions for our DNS service. I need the ability to configure A, CNAME, MX, and TXT records, preferably with a nice user interface (our current provider has a poor UI and doesn't support TXT records). Is NetworkSolutions a recommended DNS host? I am a little biased in their direction because the service will be free, since we already pay them for our domain name. However, I'm curious what others have experienced with their service.

    Read the article

  • Ubuntu questions - important

    - by asdasd
    They have installed some modified Edubuntus at school... So I have some questions about setting some things up:

    1. How can we play HD videos? They are made for Windows machines and are in .wmv format, but we need to play them in our multimedia class and don't know how - which player, which codecs, etc.
    2. How do we properly edit the /etc/apt/sources file? Anything we try to install via apt-get just fails with an "E: ... is not available" error. Please tell me which repositories to put in there so we can install some tools.
    3. Where do viruses/trojans usually hide in Ubuntu? I mean, in which directories? Our computers are behaving really slowly and we need to check for malware manually - we are not even allowed to install any kind of AV software. So tell me the usual directories and places for hiding such files, how they are hidden, how to recognize them, etc.
    4. Any other nice tricks/tips that we need to know?

    Thank you very much in advance.
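
    On the sources question, a hedged sketch of what /etc/apt/sources.list (the usual path) typically contains; the release name is illustrative and must match the installed version:

        deb http://archive.ubuntu.com/ubuntu lucid main restricted universe multiverse
        deb http://archive.ubuntu.com/ubuntu lucid-updates main restricted universe multiverse
        deb http://security.ubuntu.com/ubuntu lucid-security main restricted universe multiverse

    followed by sudo apt-get update to refresh the package index.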

    Read the article
