Search Results

Search found 28590 results on 1144 pages for 'best practice'.


  • ZFS and SAN -- best practices?

    - by chris
    Most discussions of ZFS suggest that hardware RAID be turned off and that ZFS talk directly to the disks, managing the RAID on the host instead of on the RAID controller. This makes sense on a computer with 2-16 or even more local disks, but what about an environment with a large SAN? For example, the enterprise I work for has what I would consider a modest-sized SAN with two full racks of disks, something like 400 spindles. I've seen SAN shelves far denser than ours, and SAN deployments far larger than ours. Do people expose 100 disks directly to big ZFS servers? 300 disks? 3,000 disks? Do SAN management tools facilitate automated management of this sort of thing?

    Read the article

  • Best "Keep your music with you" setup

    - by solomongaby
    I really enjoy listening to music, and because of this I have a lot of sources for it. That is not always a good thing, since my music is stored on a lot of devices (home computer, work computer, online storage, iPod) and it's sometimes difficult to have the same music everywhere: I don't have the same songs in all places, I make a playlist in one place and want it in another, and so on. How do you keep your music in sync across all your devices? PS: I use both Windows and Linux.
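
    A minimal one-way approach, assuming SSH access between the machines and one designated master library (the hostname and paths here are hypothetical):

        # Mirror the master library to another machine; --delete propagates removals too
        rsync -av --delete ~/Music/ me@work-pc:/home/me/Music/

    This covers the two computers; the iPod and the online storage still need their own sync step.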

    Read the article

  • Apple - best way to clean the Mighty Mouse?

    - by lostInTransit
    As much as I love the design of Apple products, I still don't like the Mighty Mouse: the scrolling stops working very frequently. I tried the instructions provided on Apple's site, and they do make the scrolling smooth again, but only momentarily. Isn't there any way to open it up and clean it like a normal mouse? Or any other way to clean it more thoroughly? Thanks.

    Read the article

  • best data+partition recovery software

    - by Pennf0lio
    I accidentally formatted my drive D:, which contained all my backups and documents. I had moved my files to drive D: hoping to keep them safe. When I used Acronis Recovery to install a new OS (with some pre-installed applications) to my HDD, I didn't realize I had also formatted/erased drive D:. Now drive D: is unpartitioned. I am in deep trouble and need urgent help; please recommend software that can at least restore the old partition with the files it contained. I assume most of you will think this is a duplicate of older questions here, but I'm not looking for plain file recovery: I need to recover the whole partition, folders and all. I used to use Recuva, but it only recovers loose files, not whole folders with the files in them. Please advise. Thank you!
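
    One widely used free option for exactly this case (a lost partition rather than individual files) is TestDisk, which scans the disk for the old partition boundaries and can write the recovered table back. A sketch, assuming the formatted disk shows up as /dev/sdb on a Linux rescue system:

        # TestDisk is interactive: choose the disk, scan for lost partitions, then Write
        sudo testdisk /dev/sdb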

    Read the article

  • Best way to grow Linux software RAID 1 to RAID 10

    - by Hans Malherbe
    mdadm does not seem to support growing an array from level 1 to level 10. I have two disks in RAID 1. I want to add two new disks and convert the array to a four-disk RAID 10. My current strategy: (1) make a good backup; (2) create a degraded four-disk RAID 10 array with two missing disks; (3) rsync the RAID 1 array to the RAID 10 array; (4) fail and remove one disk from the RAID 1 array; (5) add the freed disk to the RAID 10 array and wait for the resync to complete; (6) destroy the RAID 1 array and add the last disk to the RAID 10 array. The problem is the lack of redundancy at step 5. Is there a better way? A sketch of the strategy in mdadm terms follows.
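
    A sketch of those steps with hypothetical device names (sda1/sdb1 as the existing RAID 1 members, sdc1/sdd1 as the new disks); check the names against your own layout before running anything:

        # Step 2: degraded four-disk RAID 10; 'missing' reserves the slots to fill later
        mdadm --create /dev/md1 --level=10 --raid-devices=4 /dev/sdc1 missing /dev/sdd1 missing
        mkfs.ext4 /dev/md1 && mount /dev/md1 /mnt/new
        # Step 3: copy everything, preserving hard links, ACLs and extended attributes
        rsync -aHAX /mnt/old/ /mnt/new/
        # Steps 4-5: pull one disk out of the RAID 1 and donate it to the RAID 10
        mdadm /dev/md0 --fail /dev/sda1 --remove /dev/sda1
        mdadm /dev/md1 --add /dev/sda1    # redundancy is reduced until this resync finishes
        # Step 6: once resynced, retire the RAID 1 and hand over its last disk
        mdadm --stop /dev/md0
        mdadm --zero-superblock /dev/sdb1
        mdadm /dev/md1 --add /dev/sdb1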

    Read the article

  • Securing SSH/SFTP and best practices on security

    - by MultiformeIngegno
    I'm on a fresh VPS with Ubuntu Server 12.04. I wanted to ask about good practices for hardening a stock Ubuntu server. This is what I have done so far: I added Google Authenticator to SSH, then created a new user (which I'll use instead of 'root' for SSH and SFTP access) and added it to /etc/sudoers below 'root', so it now reads:

        # User privilege specification
        root     ALL=(ALL:ALL) ALL
        new_user ALL=(ALL:ALL) ALL

    Then I edited sshd_config, set PermitRootLogin to 'no', and restarted the ssh service. Is this OK? There are a few things I'd like to ask, though: 1) What is the point of adding a new (sudoer) user while the root user still exists (OK, it can no longer log in directly, but it's still there)? 2) System files are owned by 'root'. I want to use new_user for SFTP access, but then I can't edit those files! Should I mass-chmod them so that new_user has write permissions too? What is the good practice here? Thanks in advance; I hope you'll tell me if I did something wrong and/or about other ways to secure the system. :)
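
    On question 2, mass-chmodding system files is generally considered a step backwards for security. The more common pattern is group ownership for the one tree you actually edit over SFTP, plus sudo for everything else. A sketch, assuming a hypothetical web root at /var/www owned by the www-data group:

        # Give new_user group write access to the web tree only, not to system files
        sudo usermod -aG www-data new_user
        sudo chgrp -R www-data /var/www
        sudo chmod -R g+w /var/www
        # For the occasional file outside that tree, edit through sudo instead
        sudoedit /etc/hosts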

    Read the article

  • Best tools for "ssh tail -f" style log file monitoring and analysis

    - by dougnukem
    I'm looking for a tool to monitor custom PHP error logs, Apache logs, and possibly Java logs on remote development servers. I'm not looking for a full production log system like Splunk, just something a little more flexible than an ssh terminal running "tail -f". Perhaps something that will:

      * mirror multiple log files to my local machine for later searching/analysis;
      * raise "alerts" when certain strings appear in a log;
      * provide some kind of tabbed/dashboard view of the logs being monitored (fewer than 10 in total).
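
    For comparison, the bare ssh-plus-tail baseline can at least be stretched across several hosts at once; a minimal sketch, with hypothetical hostnames and log paths:

        # Tag each stream with its origin and merge them into one local view;
        # tee a copy to a file if you want to search it later
        ssh dev1 'tail -F /var/log/apache2/error.log' | sed 's/^/[dev1] /' &
        ssh dev2 'tail -F /var/log/php_errors.log'    | sed 's/^/[dev2] /' &
        wait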

    Read the article

  • Best method of transferring files over the internet?

    - by EsotericHabit
    I have a seedbox (running Ubuntu 9.10) at my parents' house and will be leaving it there once I go to college this fall. Currently I'm using Samba to transfer files between computers, but I was wondering whether, once I am on my university's network, FTP would be a better option than Samba over a VPN. The files range from 100 MB to 17 GB, if that matters. Would one be more efficient than the other? Have I overlooked any other options?
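
    One option worth adding to the list is rsync over SSH, which handles interrupted transfers of very large files gracefully; a sketch, with a hypothetical hostname and paths:

        # -P implies --partial, so a dropped 17 GB transfer can resume where it stopped
        rsync -avP me@home.example.com:/srv/seedbox/done/ ~/downloads/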

    Read the article

  • The best software for monitoring users' internet usage

    - by nikospkrk
    Hi, we are a small business using a Vigor 2820 as our internet router, and we'd like to install software that can report on our users' internet usage. I have already tried the "official" Draytek tool, SmartMonitor, but its reliability is a real issue: it stops capturing packets after three to six hours of operation (at random), whereas Wireshark keeps capturing well beyond that. As I'm really fed up with this tool, I'm looking for other solutions with the same features: per-user statistics, website rankings, per-user traffic, and so on. I have already enabled the port mirroring feature, so it would be perfect if you could suggest port-mirroring-based software (ideally freeware). I thought I had found the right one in EtherScout, but it simply doesn't launch. I am even open to a tool that would "just" produce reports from Wireshark capture files (*.pcap). Thank you for any suggestions, Nicolas.
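
    On that last point, Wireshark's command-line companion tshark can already produce simple per-host reports from a saved capture; a sketch:

        # Print per-host conversation totals (packets and bytes) from an existing capture
        tshark -r capture.pcap -q -z conv,ip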

    Read the article

  • Best format for backing up data on Blu-ray

    - by Arrieta
    We are in the process of backing up our hard drives to Blu-ray. I am creating tar.gz files and burning them to disc. Is there a simple (preferably Python-based) way to split those tar.gz files into images of a predetermined size (to fit on a Blu-ray disc) and simply burn the images? Do you have any other approach for creating physical backups of your hard drives?
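
    Even without Python, coreutils can handle the size-capping step; a sketch, assuming single-layer 25 GB discs and a conservative 23 GiB per chunk:

        # Stream the archive straight into fixed-size chunks, one per disc
        tar czf - /data | split -b 23G - backup.tar.gz.part-
        # Restore by concatenating the chunks back in order
        cat backup.tar.gz.part-* | tar xzf -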

    Read the article

  • What's the best self-tracking software for Linux?

    - by trench
    I'm looking for a way to track myself and collect quality data that I can write future scripts/programs against. For example, I use Google Reader a lot, and I'd like to track the hrefs that garner my clicks. Further, I'd like to drop all the words of each href into a database where they can be stacked hierarchically. At the end of the week I want to know that "Ubuntu" garnered 448 clicks and "Cheetos" garnered 2. :) That's just one example; I'd like this tracking and data collecting to extend beyond my browser. I know writing something to do this myself wouldn't be too difficult, but if something already exists I'd happily use it. Thanks in advance. Primary OS: Ubuntu 10.04

    Read the article

  • Best way to keep configuration for server reinstallation?

    - by Gunnar
    I have a server at home running Ubuntu 12.04 which has grown messy over the years. I have fiddled with various packages, desktop environments (for VNC), etc., and I would like to reinstall it to start again and have better control over what goes into the box. But I want to keep much of the configuration after reinstallation: the LVM setup, apache2, Samba, and so on. Ideally there would be a program that could analyze /etc and the installed packages, store that information, and selectively put it back into the new installation. I am even considering installing Ubuntu Server in a virtual machine, just to be able to compare the contents of /etc with a clean installation, and even to perform a migration to the virtual machine first, to verify that the transfer process works. How does one go about performing this kind of reinstallation? Has anyone seen any resources on the topic?
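
    A low-tech baseline for this (etckeeper is the more polished tool for keeping /etc under version control, but this sketch covers the essentials with stock Debian/Ubuntu commands):

        # Before the wipe: record the installed package set and snapshot /etc
        dpkg --get-selections > packages.list
        sudo tar czf etc-backup.tar.gz /etc
        # On the fresh install: replay the package list, then restore configs selectively
        sudo dpkg --set-selections < packages.list
        sudo apt-get dselect-upgrade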

    Read the article

  • Best way to handle PHP sessions across Apache vhost wildcard domains

    - by joshholat
    I'm currently running a site that lets users use custom domains (i.e. instead of mysite.com/myaccount, they can have myaccount.com). They just change the A record of their domain, and we use a wildcard vhost on Apache to catch the requests for the custom domains. The setup is shown below: the first vhost catches the mysite.com/myaccount requests, and the second is used for myaccount.com. As you can see, they have the exact same path and PHP cookie domain. I've noticed some weird behavior around the line marked "The line below me". When it is active, the custom domains get a new session_id on every page load (which isn't the same as the non-custom-domain session). When I comment that line out, the user keeps the same session_id across page loads, but that session_id is still not the same as the one they'd see on the non-custom-domain site, despite everything being entirely on the same server. There is a sort of "hack" workaround: redirect the user to mysite.com/myaccount, get the session ID, redirect back to myaccount.com, and then use that ID on myaccount.com. But that can get messy (e.g. if the user logs out of mysite.com/myaccount, how does myaccount.com know?). For what it's worth, I'm using a database to manage the sessions (so there would be no issues with being on different servers, though that's irrelevant since we currently use one server to handle all requests anyway). I'm fairly certain this is related to some sort of CSRF browser protection, but shouldn't it be smart enough to know it's on the same server? Note: these are not subdomains; they're entirely separate domains (but on the same server).

        <VirtualHost *:80>
            DocumentRoot "/opt/local/www/mysite.com"
            ServerName mysite.local
            ErrorLog "/opt/local/apache2/logs/mysite.com-error.log"
            CustomLog "/opt/local/apache2/logs/mysite.com-access.log" common
            <Directory "/opt/local/www/mysite.com">
                AllowOverride All
                #php_value session.save_path "/opt/local/www/mysite.com/sessions"
                php_value session.cookie_domain "mysite.local"
                php_value auto_prepend_file "/opt/local/www/mysite.com/core.php"
            </Directory>
        </VirtualHost>

        #Wildcard (custom domain) vhost
        <VirtualHost *:80>
            DocumentRoot "/opt/local/www/mysite.com"
            ServerName default
            ServerAlias *
            ErrorLog "/opt/local/apache2/logs/mysite.com-error.log"
            CustomLog "/opt/local/apache2/logs/mysite.com-access.log" common
            <Directory "/opt/local/www/mysite.com">
                AllowOverride All
                #php_value session.save_path "/opt/local/www/mysite.com/sessions"
                # The line below me
                php_value session.cookie_domain "mysite.local"
                php_value auto_prepend_file "/opt/local/www/mysite.com/core.php"
            </Directory>
        </VirtualHost>
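
    One observation, offered as a sketch rather than a confirmed fix: browsers only return a cookie to hosts that match its domain, so a session cookie scoped to mysite.local is silently dropped on requests to myaccount.com, which would produce exactly the new-session-per-load symptom (no CSRF protection involved). Leaving the cookie domain unset in the wildcard vhost lets PHP default it to whichever host was requested:

        <Directory "/opt/local/www/mysite.com">
            AllowOverride All
            # No session.cookie_domain here: PHP falls back to the requesting host,
            # so myaccount.com gets a cookie the browser will actually send back
            php_value auto_prepend_file "/opt/local/www/mysite.com/core.php"
        </Directory>

    Sharing a login across the two domains is then a separate problem; no cookie can span unrelated domains, so the redirect hand-off described above (ideally with a signed, expiring token rather than the raw session ID) is the usual shape of the solution.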

    Read the article

  • What is the best nginx gzip compression level?

    - by Chamnap
    I'm using an nginx reverse proxy cache with gzip enabled. However, I've had problems with HTTP requests made by an Android application to my Rails JSON web service. When I turn the reverse proxy cache off, it seems to work, because the response then comes back without gzip; I therefore think the problem is caused by gzip. What is the most appropriate gzip compression level?

        gzip on;
        gzip_http_version 1.0;
        gzip_vary on;
        gzip_comp_level 6;
        gzip_proxied any;
        gzip_types text/plain text/css text/javascript application/javascript
                   application/json application/x-javascript text/xml
                   application/xml application/xml+rss;
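
    On the level itself, the usual rule of thumb (a heuristic, not an nginx requirement) is that CPU cost climbs steeply in the upper levels while the size savings flatten out, so values around 4-6 are common defaults:

        # Diminishing returns: each step past ~6 costs noticeably more CPU
        # for only a small further reduction in payload size
        gzip_comp_level 5;

    Note, though, that a client which fails only when gzip is enabled usually points to a decoding problem on the client side (or broken Accept-Encoding negotiation) rather than to the compression level.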

    Read the article

  • .htaccess https redirect best method

    - by Douglas Cottrell
    I have searched through all the redirects posted by others and can't quite find the answer to my problem. I have a website with over 3,000 pages, and we are getting duplicate-content issues in Google. We want everything in the parent directory to stay on http except our contact.php and login.php pages. We then have three folders that must be secured: admin, clients, and customers. I have tried using the following code in separate .htaccess files for each folder, but I keep getting a conflict when I do, and I am still trying to find a good solution for the home directory:

        RewriteEngine On
        RewriteCond %{SERVER_PORT} 80
        RewriteCond %{REQUEST_URI} admin
        RewriteRule ^(.*)$ https://www.website.com/$1 [R,L]

    Any help would be greatly appreciated.
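
    A sketch of how the per-folder rules could be consolidated into a single .htaccess in the document root, avoiding the conflicting per-directory files (the domain is kept from the question; test before deploying):

        RewriteEngine On
        # Force HTTPS for the secured folders and the two sensitive pages
        RewriteCond %{HTTPS} off
        RewriteCond %{REQUEST_URI} ^/(admin|clients|customers)/ [OR]
        RewriteCond %{REQUEST_URI} ^/(contact|login)\.php$
        RewriteRule ^(.*)$ https://www.website.com/$1 [R=301,L]
        # Send everything else back to HTTP so search engines see one canonical scheme
        RewriteCond %{HTTPS} on
        RewriteCond %{REQUEST_URI} !^/(admin|clients|customers)/
        RewriteCond %{REQUEST_URI} !^/(contact|login)\.php$
        RewriteRule ^(.*)$ http://www.website.com/$1 [R=301,L]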

    Read the article

  • Best-practice RAID groups for EqualLogic PS6510X

    - by 20th Century Boy
    We are thinking about purchasing 4 x EqualLogic PS6510X SANs (the "Sumo" boxes). Each has 48 x 600 GB 10k SAS drives. They will be stacked to form a single logical pool of storage, all in the same location. I understand that RAID groups are created on a per-box basis, so one box could be RAID 50, another RAID 10, and so on. My question is: should I make one box a "performance" box (RAID 10) and the other boxes "standard" (RAID 50)? How do people configure their EQL arrays in the real world?

    Read the article

  • Best RAID setup for multimedia fileserver?

    - by Mr. Schwabe
    I'm building a file server for my small office. We do film and multimedia design, with only three clients connected. The server is primarily for local access to graphic assets and video files. I'm looking for advice on the hardware and software required, particularly for the RAID. I have the following objectives:

    A) Merged capacity. I'd like all other systems to access the data as a single mapped network drive with an initial capacity of 10 TB, so perhaps 5 x 2 TB drives (plus mirror drives for redundancy).

    B) An easy way to increase capacity. Thinking long term, I'd like to 'easily' add more drives to the array for a potential two- or three-fold increase in capacity. So theoretically it could grow to a 30 TB array consisting of maybe 15 x 2 TB drives of capacity (plus mirror drives for redundancy).

    C) Maximum fault tolerance. I want at least one mirror drive per capacity drive (in layman's terms). So if I start with 10 TB across 5 x 2 TB capacity drives, I suppose I need another 5 x 2 TB drives as mirrors, for 10 drives total. I'd also like the potential for even more redundancy, with up to two additional mirrors per capacity drive (and the ability to add them to the array easily, at any time).

    D) An easy way to monitor drive health. I'd like an intuitive interface for managing the RAID and monitoring drive health.

    The other systems accessing this network drive will run Windows, but also the odd Ubuntu and macOS system. Are these objectives attainable? What type of RAID setup do you recommend, and what hardware will be required? Also, what OS should this system run, and does it really matter? I'm no network admin, just a long-time Windows user without much Linux experience. That said, I'm not opposed to a Linux solution if it's easy enough and more practical than Windows for this server, or perhaps something like Openfiler. The budget should hit the sweet spot between value and performance (hence my preference for 2 TB drives). The biggest focus is storage; aside from that, the system just needs to keep the drives running optimally, with perhaps two or three clients accessing/writing files at any given time. The hardware quote would start with something like 10 x 2 TB WD Caviar Blacks, about $1900 for the storage plus the remaining parts: http://ncix.com/products/index.php?sku=42775&vpn=WD2001FASS&manufacture=Western%20Digital%20WD Your advice is appreciated, thanks!

    Read the article

  • Blocking a country (mass IP ranges): best practice for the actual block

    - by kwiksand
    Hi all, this question has obviously been asked many times in many different forms, but I can't find an actual answer for the specific plan I've got. We run a popular European commercial deals site and are getting a large amount of incoming registrations/traffic from countries that cannot even take part in the deals we offer (and many of the retailers aren't known outside Western Europe). I've identified the ranges needed to block a lot of this traffic, but (as expected) there are thousands of IP ranges required. Now my question, finally: on a test server, I created a script to block each range within iptables, but adding the rules took a long time, and iptables was unresponsive afterwards (especially when attempting an iptables -L). What is the most efficient way of blocking large numbers of IP ranges? iptables? A plugin that can preload them efficiently? hosts.deny? .htaccess (nasty, as I'd be running it in Apache on every load-balanced web server)? Cheers
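
    The usual answer to the thousands-of-rules problem is ipset, which backs a single iptables rule with a hash of networks instead of a sequential chain; a sketch, assuming a hypothetical country_ranges.txt with one CIDR block per line. (As an aside, plain iptables -L also attempts a reverse DNS lookup for every rule; iptables -nL is far more responsive on large rule sets.)

        # One hash-backed set, one rule: lookups stay fast however many ranges are loaded
        ipset create blocked_ranges hash:net
        while read -r cidr; do
            ipset add blocked_ranges "$cidr"
        done < country_ranges.txt
        iptables -I INPUT -m set --match-set blocked_ranges src -j DROP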

    Read the article

  • where is the best place to store portable apps in Windows 7

    - by CoreyH
    I have a number of small portable apps and utilities. Where does Microsoft recommend we keep these? Lots of installed apps put themselves into \users\{username}\appdata\local, but it doesn't make much sense for me to put things there myself. \program files\ doesn't seem right either, because non-admin users don't have write access there. Is it recommended to create a \users\{username}\applications directory?

    Read the article

  • How to best migrate one Windows 2008 R2 / SharePoint / Exchange / Terminal Services (all-in-one) into virtual machines

    - by MadBoy
    Hello, my client has one machine with Windows 2008 R2 and everything on it. By everything I mean AD, DNS, SharePoint 2010 Standard, Exchange 2010 Standard, Terminal Services, Office 2010, and a bunch of additional apps. Everything runs on 2 x i7 and 36 GB of RAM, for 7 people in total. I've decided that we should virtualize it, splitting things into four VMs and keeping the host with only Hyper-V installed to run the machines. What problems should I expect, and what advice can you give? My plan is that once I've moved everything to VMs, I will move the VMs somewhere safe and format the host, as a lot of really bad things are happening on it. But this also means that everything will be wiped from the current installation, so I have to be sure that Exchange etc. will work when the host gets wiped. MadBoy

    Read the article

  • Best Photo Sharing Website for Teams/Organizations [closed]

    - by Patrick Cuff
    I manage the website for my daughter's U-10 soccer team. I'd like to set up a place where team members can both view and upload pictures. It needs to be secure, so that only authorized/registered members can view and upload pics. I've looked into Picasa and Flickr; while both are good for sharing photos, only one registered user can upload. Are there any good sites that allow multiple people to upload photos?

    Read the article

  • Best Performing Remote Desktop Software for WAN

    - by Dave
    I've been tasked with connecting a computer in one of our branches in the Midlands to one in South Wales. We have been using Windows Remote Desktop but find it too slow. The ADSL at the computer we're connecting to is about 6 Mb down and 470 Kb up, so not hugely fast, and that connection is shared with about 10 other internet users. I'm trying to find out whether any remote desktop software performs better than Microsoft's Remote Desktop. Or would I be better off using a KVM over IP? I've looked into connecting the offices via BT fibre, but at £21k a year rental we're trying to find a cheaper solution! Any help would be great. Thanks, Dave

    Read the article
