Search Results

Search found 31207 results on 1249 pages for 'atg best practice in industries'.

Page 168/1249 | < Previous Page | 164 165 166 167 168 169 170 171 172 173 174 175  | Next Page >

  • The best software for monitoring users' internet usage

    - by nikospkrk
    Hi, we are a small business using a Vigor 2820 as our internet router, and we'd like to install software that can report on our users' internet usage. I have already tried the "official" Draytek tool, "SmartMonitor", but its reliability is a real issue: it stops capturing packets after 3 to 6 hours of operation (at random), whereas Wireshark keeps capturing well past that point. As I'm really fed up with this tool, I'm looking for other solutions with the same features: per-user statistics, website rankings, per-user traffic, and so on. I have already enabled the port mirroring feature, so it would be perfect if you could suggest a port mirroring-based tool (ideally freeware). I thought I had found the right one in Etherscout, but it just doesn't launch. I am even open to a tool that would "just" produce reports from Wireshark capture files (*.pcap). Thank you for any suggestions, Nicolas.
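
    For the last option (reporting on existing *.pcap files), a short script already goes a long way. Below is a minimal sketch, assuming the third-party scapy library is installed (dpkt would work just as well); the capture file name is hypothetical. It tallies bytes per source IP and prints the top talkers:

        # pcap_report.py - a minimal sketch: per-host traffic totals from a pcap.
        # Assumes scapy is installed (pip install scapy); the file name is made up.
        from collections import Counter

        from scapy.all import IP, rdpcap

        totals = Counter()
        for pkt in rdpcap("capture.pcap"):       # loads the whole capture into memory
            if IP in pkt:
                totals[pkt[IP].src] += len(pkt)  # bytes, keyed by source address

        for host, nbytes in totals.most_common(20):
            print(f"{host:15} {nbytes / 1e6:8.1f} MB")

    Note that rdpcap reads the whole file into RAM; for multi-gigabyte captures, scapy's PcapReader iterates over packets instead.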

    Read the article

  • What's the best self-tracking software for Linux?

    - by trench
    I'm looking for a way to track myself and receive quality data upon which I can write future scripts/programs. For example, I use Google Reader a lot. I'd like to track the hrefs that garner my clicks. Further, I'd like to drop all of the words of each href into a database where they can be stacked in a hierarchical manner. At the end of the week I want to know that "Ubuntu" garnered 448 clicks and "Cheetos" garnered 2. :) That's just one example... I'd like this tracking and data-collecting to extend beyond my browser. I know writing something to do this myself wouldn't be too awfully difficult but if something already exists I'd happily use it. Thanks in advance. Primary OS: Ubuntu 10.04
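
    As a rough sketch of the data store described above (all names are hypothetical, and capturing the clicks themselves, say via a browser extension, is a separate problem not solved here), the weekly word counts could be kept in SQLite with nothing beyond the standard library:

        # click_log.py - a sketch of the word-count store described above.
        import sqlite3
        import time

        db = sqlite3.connect("clicks.db")
        db.execute("CREATE TABLE IF NOT EXISTS words (word TEXT, ts REAL)")

        def record_click(link_text: str) -> None:
            now = time.time()
            rows = [(w.lower(), now) for w in link_text.split()]
            db.executemany("INSERT INTO words VALUES (?, ?)", rows)
            db.commit()

        def weekly_report():
            week_ago = time.time() - 7 * 86400
            return db.execute(
                "SELECT word, COUNT(*) AS n FROM words "
                "WHERE ts > ? GROUP BY word ORDER BY n DESC",
                (week_ago,),
            ).fetchall()

        record_click("Ubuntu 10.04 released")
        print(weekly_report())  # e.g. [('ubuntu', 1), ('10.04', 1), ('released', 1)]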

    Read the article

  • Best tools for "ssh tail -f" style log file monitoring and analysis

    - by dougnukem
    I'm looking for a tool to monitor custom PHP error logs, Apache logs, and possibly Java logs on remote development servers. I'm not looking for a full production log system like Splunk, but something a little more flexible than a "tail -f" in an ssh terminal. Ideally something that will:

    - monitor multiple remote log files from my local machine, for searching and analysis later;
    - raise alerts when certain strings appear in a log;
    - provide some kind of tabbed or dashboard view of the logs being monitored (fewer than 10 logs in total).
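
    The alerting part, at least, is simple to prototype. A minimal sketch using the system ssh client (host, log path, and pattern are all hypothetical):

        # log_alert.py - a minimal sketch of the "alert on string" part.
        import re
        import subprocess

        HOST, LOG = "dev.example.com", "/var/log/apache2/error.log"
        ALERT = re.compile(r"PHP (Fatal|Parse) error")

        proc = subprocess.Popen(
            ["ssh", HOST, "tail", "-F", LOG],   # -F keeps following across rotation
            stdout=subprocess.PIPE,
            text=True,
        )
        for line in proc.stdout:
            if ALERT.search(line):
                print(f"ALERT from {HOST}: {line.rstrip()}")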

    Read the article

  • Best method of transferring files over the internet?

    - by EsotericHabit
    I have a seedbox (running Ubuntu 9.10) at my (parents') house and will be leaving it there once I go to college this fall. Currently I'm using Samba to transfer files between computers, but I was wondering whether, once I am on my university's network, FTP would be a better option than Samba over a VPN. The files will range from 100 MB to 17 GB, if that matters. Would one be more efficient than the other? Are there any other options I've forgotten?
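
    One further option worth weighing against FTP and Samba is rsync over SSH, mainly because it can resume interrupted transfers, which matters at 17 GB. A sketch, with a hypothetical host and path (-z compression is left out, since large media files are usually compressed already):

        # pull a large file with resume support; re-running picks up where it stopped
        rsync -av --partial --progress user@seedbox.example.com:/srv/files/video.mkv .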

    Read the article

  • Best way to keep configuration for server reinstallation?

    - by Gunnar
    I have a server at home running Ubuntu 12.04 which has grown messy over the years. I have fiddled with various packages, desktop environments (for VNC), and so on, and I would like to reinstall it to start again and have better control over what goes into the box. But I want to keep much of the configuration after reinstallation: the LVM setup, apache2, Samba, etc. Ideally there would be a program that could analyze /etc and the installed packages, store the information, and selectively put it back into the new installation. I am even considering installing Ubuntu Server on a virtual machine, just to be able to compare the contents of /etc with a clean installation, and even to perform a migration to the virtual machine first, to verify that the transfer process works. How does one go about this kind of reinstallation? Has anyone seen any resources on the topic?
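
    Short of a dedicated tool, a common low-tech baseline (a sketch, assuming a Debian/Ubuntu system) is to snapshot the package selections and archive /etc before wiping, then replay the selections on the fresh install and restore config files by hand:

        # before the reinstall: record installed packages and archive /etc
        dpkg --get-selections > packages.list
        sudo tar czf etc-backup.tar.gz /etc

        # on the fresh install: replay the package selections...
        sudo dpkg --set-selections < packages.list
        sudo apt-get -y dselect-upgrade
        # ...then restore files from etc-backup.tar.gz selectively, not wholesale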

    Read the article

  • Best format for backing up data to Blu-ray

    - by Arrieta
    We are in the process of backing up our hard drives to Blu-ray. I am creating tar.gz files and burning them to disc. Is there a simple (preferably Python-based) way of cutting those tar.gz files into images of a predetermined size (to fit on a Blu-ray disc) and simply burning these images to disc? Do you have any other approach for creating physical backups of your hard drives?
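
    The splitting step needs nothing beyond the Python standard library. A minimal sketch (the file name and sizes are hypothetical; 23 GB leaves headroom on a 25 GB single-layer disc), with reassembly being a plain cat archive.tar.gz.part* > archive.tar.gz:

        # split_archive.py - a sketch: split a large archive into disc-sized parts.
        BUF = 64 * 1024 * 1024     # copy in 64 MiB buffers
        PART_SIZE = 23 * 1000**3   # ~23 GB per part, headroom for the disc

        def split(path: str, part_size: int = PART_SIZE) -> None:
            with open(path, "rb") as src:
                part = 0
                while True:
                    data = src.read(min(BUF, part_size))
                    if not data:
                        break
                    written = 0
                    with open(f"{path}.part{part:03d}", "wb") as dst:
                        while data:
                            dst.write(data)
                            written += len(data)
                            if written >= part_size:
                                break
                            data = src.read(min(BUF, part_size - written))
                    part += 1

        split("archive.tar.gz")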

    Read the article

  • .htaccess https redirect best method

    - by Douglas Cottrell
    I have searched through all the redirects posted by others and can't quite find the answer to my problem. I have a website with over 3000 pages and we are getting duplicate-content issues within Google. We want to keep everything in the parent directory on http except our contact.php and login.php pages. We then have three folders that must be secured: admin, clients, and customers. I have tried using the following code in separate .htaccess files for each folder, but I keep getting a conflict when I try it, and I am still trying to find a good solution for the home directory:

        RewriteEngine On
        RewriteCond %{SERVER_PORT} 80
        RewriteCond %{REQUEST_URI} admin
        RewriteRule ^(.*)$ https://www.website.com/$1 [R,L]

    Any help would be greatly appreciated.
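
    For comparison, here is a sketch of the whole policy expressed as a single rule set in the document root (assuming mod_rewrite and the www.website.com host from the question; worth testing with R=302 before committing to 301):

        RewriteEngine On
        # force HTTPS for the three secured folders and the two secure pages
        RewriteCond %{HTTPS} off
        RewriteCond %{REQUEST_URI} ^/(admin|clients|customers)(/|$) [OR]
        RewriteCond %{REQUEST_URI} ^/(contact|login)\.php$
        RewriteRule ^(.*)$ https://www.website.com/$1 [R=301,L]
        # and force plain HTTP everywhere else, to kill duplicate https:// URLs
        RewriteCond %{HTTPS} on
        RewriteCond %{REQUEST_URI} !^/(admin|clients|customers)(/|$)
        RewriteCond %{REQUEST_URI} !^/(contact|login)\.php$
        RewriteRule ^(.*)$ http://www.website.com/$1 [R=301,L]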

    Read the article

  • What is the best nginx gzip compression level?

    - by Chamnap
    I'm using an nginx reverse-proxy cache with gzip enabled. However, I'm getting problems with HTTP requests made by Android applications to my Rails JSON web service. It seems that when I turn off the reverse-proxy cache it works fine, because the response then comes back without gzip; I therefore think the problem is caused by gzip. What is the most appropriate level of gzip compression?

        gzip on;
        gzip_http_version 1.0;
        gzip_vary on;
        gzip_comp_level 6;
        gzip_proxied any;
        gzip_types text/plain text/css text/javascript application/javascript
                   application/json application/x-javascript text/xml
                   application/xml application/xml+rss;
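
    As an aside, whether a given level is worth its CPU is easy to measure per endpoint from the command line; a sketch with a hypothetical URL, comparing the transferred body size with and without compression:

        # body size without, then with, gzip negotiation
        curl -s -o /dev/null -w '%{size_download}\n' http://example.com/service.json
        curl -s -o /dev/null -w '%{size_download}\n' \
             -H 'Accept-Encoding: gzip' http://example.com/service.json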

    Read the article

  • Best way to handle PHP sessions across Apache vhost wildcard domains

    - by joshholat
    I'm currently running a site that allows users to use custom domains (i.e. so instead of mysite.com/myaccount, they could have myaccount.com). They just change the A record of their domain, and we then use a wildcard vhost on Apache to catch the requests from the custom domains. The setup is basically as seen below: the first vhost catches the mysite.com/myaccount requests and the second is used for myaccount.com. As you can see, they have the exact same path and PHP cookie_domain. I've noticed some weird behaviour around the line marked "The line below me". When it is active, the custom domains get a new session_id on every page load (one that isn't the same as the non-custom-domain session). When I comment that line out, the user keeps the same session_id across page loads, but that session_id is still not the same as the one they'd see on the non-custom-domain site, despite it being entirely the same server. There is a sort of "hack" workaround involving redirecting the user to mysite.com/myaccount, getting the session ID, redirecting back to myaccount.com, and then using that ID on myaccount.com. But that can get messy (i.e. if the user logs out of mysite.com/myaccount, how does myaccount.com know?). For what it's worth, I'm using a database to manage the sessions (so there are no issues with being on different servers, etc., though that's irrelevant since we currently use one server to handle all requests anyway). I'm fairly certain it is related to some sort of CSRF browser protection, but shouldn't it be smart enough to know it's on the same server? Note: these aren't subdomains; they're entirely separate domains (but on the same server).

        <VirtualHost *:80>
            DocumentRoot "/opt/local/www/mysite.com"
            ServerName mysite.local
            ErrorLog "/opt/local/apache2/logs/mysite.com-error.log"
            CustomLog "/opt/local/apache2/logs/mysite.com-access.log" common
            <Directory "/opt/local/www/mysite.com">
                AllowOverride All
                #php_value session.save_path "/opt/local/www/mysite.com/sessions"
                php_value session.cookie_domain "mysite.local"
                php_value auto_prepend_file "/opt/local/www/mysite.com/core.php"
            </Directory>
        </VirtualHost>

        #Wildcard (custom domain) vhost
        <VirtualHost *:80>
            DocumentRoot "/opt/local/www/mysite.com"
            ServerName default
            ServerAlias *
            ErrorLog "/opt/local/apache2/logs/mysite.com-error.log"
            CustomLog "/opt/local/apache2/logs/mysite.com-access.log" common
            <Directory "/opt/local/www/mysite.com">
                AllowOverride All
                #php_value session.save_path "/opt/local/www/mysite.com/sessions"
                # The line below me
                php_value session.cookie_domain "mysite.local"
                php_value auto_prepend_file "/opt/local/www/mysite.com/core.php"
            </Directory>
        </VirtualHost>
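
    One detail worth isolating (a sketch, not a tested fix): in the wildcard vhost, session.cookie_domain is pinned to mysite.local, so a browser visiting myaccount.com is handed a cookie scoped to a domain it will never send back, hence the fresh session on every load. Dropping the directive in that vhost lets PHP scope the session cookie to whichever host the request arrived on:

        <Directory "/opt/local/www/mysite.com">
            AllowOverride All
            # no session.cookie_domain here: PHP defaults the session cookie
            # to the hostname the request came in on (e.g. myaccount.com)
            php_value auto_prepend_file "/opt/local/www/mysite.com/core.php"
        </Directory>

    Sharing one logged-in session across unrelated domains remains a separate problem: browsers never send cookies across domains, which is exactly what the redirect hand-off described above works around.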

    Read the article

  • Best RAID setup for multimedia fileserver?

    - by Mr. Schwabe
    I'm building a fileserver for my small office. We do film and multimedia design. Only 3 clients are connected. The server is primarily for local access to graphic assets and video files. I'm looking for advice on the hardware and software required, particularly for the RAID. I have the following objectives:

    A) Merged capacity. I'd like all other systems to access the data as a single mapped network drive with an initial capacity of 10 TB, so perhaps 5x 2 TB drives (plus mirror drives for redundancy).

    B) An easy way to increase capacity. Thinking long term, I'd like to 'easily' add more drives to the array for a potential two- or three-fold increase in capacity: theoretically up to a 30 TB array consisting of maybe 15x 2 TB drives of capacity (plus mirror drives for redundancy).

    C) Maximum fault tolerance. I want at least 1 mirror drive per capacity drive (in layman's terms). So if I start with 10 TB / 5x 2 TB of capacity, I suppose I would need another 5x 2 TB drives to be mirrors, so 10 drives total. But I'd also like the potential for even more redundancy, with up to 2 additional mirrors per 'capacity drive' (and to be able to add them to the array at any time, with ease).

    D) An easy way to monitor drive health. I'd like an intuitive interface for managing the RAID and monitoring drive health.

    The other systems accessing this network drive will be running Windows, but also the odd Ubuntu and MacOS system as well. Are these objectives attainable? What type of RAID setup do you recommend? What hardware will be required? Also, what OS do you think this system should be running, and does it really matter? I'm no network admin, just a long-time Windoze user without much Linux experience. That said, I'm not opposed to a Linux solution if it's easy enough and more practical than a Windows OS for this server, or maybe something such as Openfiler. The budget should hit the sweet spot for value and performance (hence my preference for 2 TB drives). The biggest focus is storage; aside from that, the system just needs to keep the drives running optimally, with perhaps 2 or 3 clients accessing or writing files at any given time. The hardware quote would start with something like 10x 2 TB WD Caviar Blacks, about $1900 for the storage + $x for the remaining parts. http://ncix.com/products/index.php?sku=42775&vpn=WD2001FASS&manufacture=Western%20Digital%20WD Your advice is appreciated, thanks!

    Read the article

  • Where is the best place to store portable apps in Windows 7

    - by CoreyH
    I have a number of small portable apps and utilities. Where does Microsoft recommend we keep these? Lots of installed apps put themselves into \Users\{username}\AppData\Local, but it doesn't make much sense for me to put stuff there myself. \Program Files\ doesn't seem right either, because non-admin users don't have write access there. Is it recommended to create a \Users\{username}\Applications directory?

    Read the article

  • How to best migrate one Windows 2008 R2 / SharePoint / Exchange / Terminal Services (All-in-one) into Hyper-V virtual machines

    - by MadBoy
    Hello, my client has one machine with Windows 2008 R2 and everything on it. By everything I mean AD, DNS, SharePoint 2010 Standard, Exchange 2010 Standard, Terminal Services, Office 2010, and a bunch of additional apps. Everything runs on an i7 x 2 with 36 GB of RAM, for 7 people in total. I've decided that we should virtualize it, split things into 4 VMs, and keep the host with only Hyper-V installed, to run all the machines. What problems should I expect? What good advice can you give? My plan is that once I move everything to VMs, I will move the VMs somewhere safe and format the host, as it has a lot of really bad things happening on it. But this also means that everything will be wiped from the current installation, so I have to be sure that Exchange etc. will still work once the host gets wiped. MadBoy

    Read the article

  • Best Photo Sharing Website for Teams/Organizations [closed]

    - by Patrick Cuff
    I manage the website for my daughter's U-10 soccer team. I'd like to set up a place where team members can both view and upload pictures. It needs to be secure, so that only authorized/registered members can view and upload pics. I've looked into Picasa and Flickr, and while both are good for sharing photos, only one registered user can upload. Are there any good sites that allow multiple people to upload photos?

    Read the article

  • Best Performing Remote Desktop Software for WAN

    - by Dave
    I've been tasked with connecting a computer in one of our branches in the Midlands to one in South Wales. We have been using Windows Remote Desktop but find it too slow. The ADSL at the site we're connecting to is about 6 Mb download and 470 Kb upload, so not majorly fast, and that connection is also shared with about 10 other internet users. I'm trying to find out if there is any remote desktop software that performs better than Microsoft's Remote Desktop. Or would I be better off using a KVM over IP? I've looked into connecting the offices through BT fibre optic, but at £21k a year rental we're trying to find a cheaper solution! Any help would be great. Thanks, Dave

    Read the article

  • Best filesystem choices for NFS storing VMware disk images

    - by mlambie
    Currently we use an iSCSI SAN as storage for several VMware ESXi servers. I am investigating the use of an NFS target on a Linux server for additional virtual machines. I am also open to the idea of using an alternative operating system (like OpenSolaris) if it will provide significant advantages. What Linux-based filesystem favours very large contiguous files (like VMware's disk images)? Alternatively, how have people found ZFS on OpenSolaris for this kind of workload? (This question was originally asked on SuperUser; feel free to migrate answers here if you know how).
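
    Whichever filesystem wins, the export options matter as well: ESXi mounts NFS datastores as root, so no_root_squash is required. A sketch of /etc/exports, with a hypothetical path and subnet:

        # /etc/exports - sync and no_root_squash are the ESXi-relevant options
        /srv/vmstore  10.0.0.0/24(rw,sync,no_subtree_check,no_root_squash)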

    Read the article

  • Best shortcut in Total Commander

    - by life-warrior
    So, what's your favourite TC shortcut or shortcut combination? Which ones do you use, and for what purpose? Among my most often used:

    - Ctrl-Left (or Ctrl-Right): open the archive or folder under the cursor in the opposite tab.
    - Ctrl-Shift-Enter, Alt-F8, Ctrl-X: copy the full file path to the clipboard.
    - Shift-F6, Shift-End (if needed), Ctrl-C: copy only the file name, without the path.
    - Select files, Ctrl-M: multi-rename, for example to remove "DVDrip" from file names.
    - Ctrl-\: go to the root directory.
    - Ctrl-D, letter: go to the directory with the highlighted letter specified. For example, name a downloads directory "&Downloads" in favourites, and the letter after the ampersand will be highlighted.
    - Alt-F7, feed to listbox, Ctrl-A, Mark (menu), Save selection to file: creates a file listing all files and directories inside the current one, with full paths.
    - Ctrl-[3-6]: sort files by name (3), extension (4), date (5), or size (6). By name, when you want movies and soundtracks with the same name but different extensions grouped together; by extension, when you need to find EXEs in the Windows directory; by date, when you need the latest file downloaded into your dir; by size, when you need to delete the largest files to free space.

    Read the article

  • What is the Best Free Linux Gateway?

    - by rockinthesixstring
    I'm looking at moving away from using my DIR-825 as a gateway and towards a Linux box that does it all for me. I've found IPCop, but I'm looking for something with a little more power. My main goal is basically to be able to point different external domain names at different internal servers:

    backup.example.com -> 192.168.0.5
    home.example.com -> 192.168.0.1

    I host my DNS on my own dedicated server (Windows), so I don't know much about doing the gateway thing in my home (my hosting provider does it all for me). Do any of you know of a free Linux distro that can accomplish what I'm looking for?
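
    One constraint to know up front: an ordinary port forward works at the IP level and never sees the requested domain name, so per-domain routing to different internal servers only works for protocols that carry the hostname, such as HTTP(S). On any distro, that piece is typically handled by a reverse proxy; a minimal nginx sketch using the names from the question:

        # name-based reverse proxying - HTTP only; other protocols can't
        # be split by domain name on a single external IP
        server {
            listen 80;
            server_name backup.example.com;
            location / { proxy_pass http://192.168.0.5; }
        }
        server {
            listen 80;
            server_name home.example.com;
            location / { proxy_pass http://192.168.0.1; }
        }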

    Read the article

  • How to best tune my SAN/initiators for performance?

    - by Disco
    Recent owner of a Dell PowerVault MD3600i, I'm experiencing some weird results. I have a dedicated 24-port 10GbE switch (a PowerConnect 8024) set up with 9K jumbo frames. The MD3600 has 2 RAID controllers, each with 2x 10GbE NICs. There's nothing else on the switch: one VLAN for SAN traffic. Here's my multipath.conf:

        defaults {
            udev_dir                /dev
            polling_interval        5
            selector                "round-robin 0"
            path_grouping_policy    multibus
            getuid_callout          "/sbin/scsi_id -g -u -s /block/%n"
            prio_callout            none
            path_checker            readsector0
            rr_min_io               100
            max_fds                 8192
            rr_weight               priorities
            failback                immediate
            no_path_retry           fail
            user_friendly_names     yes
            # prio                  rdac
        }
        blacklist {
            device {
                vendor  "*"
                product "Universal Xport"
            }
            # devnode "^sd[a-z]"
        }
        devices {
            device {
                vendor                  "DELL"
                product                 "MD36xxi"
                path_grouping_policy    group_by_prio
                prio                    rdac
                # polling_interval      5
                path_checker            rdac
                path_selector           "round-robin 0"
                hardware_handler        "1 rdac"
                failback                immediate
                features                "2 pg_init_retries 50"
                no_path_retry           30
                rr_min_io               100
                prio_callout            "/sbin/mpath_prio_rdac /dev/%n"
            }
        }

    And iscsid.conf:

        node.startup = automatic
        node.session.timeo.replacement_timeout = 15
        node.conn[0].timeo.login_timeout = 15
        node.conn[0].timeo.logout_timeout = 15
        node.conn[0].timeo.noop_out_interval = 5
        node.conn[0].timeo.noop_out_timeout = 10
        node.session.iscsi.InitialR2T = No
        node.session.iscsi.ImmediateData = Yes
        node.session.iscsi.FirstBurstLength = 262144
        node.session.iscsi.MaxBurstLength = 16776192
        node.conn[0].iscsi.MaxRecvDataSegmentLength = 262144

    After my tests, I can barely reach 200 Mb/s read/write. Should I expect more than that? Given the dual 10 GbE links, my expectation was around 400 Mb/s. Any ideas? Guidelines? Troubleshooting tips?
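
    Before tuning further, it helps to pin down what the path can actually do with a repeatable benchmark rather than ad-hoc copies. A sketch using fio (the multipath device name and parameters are hypothetical; adjust before running):

        # sequential read against the multipath device, bypassing the page cache
        fio --name=seqread --filename=/dev/mapper/mpatha --rw=read --bs=1M \
            --direct=1 --ioengine=libaio --iodepth=32 --numjobs=4 \
            --runtime=60 --time_based --group_reporting

    Comparing a single-path run against the full multipath run also shows whether round-robin across the controller ports is actually engaging.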

    Read the article

  • Best Hardware for a server attached to LCD TV displaying a Flash frontend

    - by DomingoSL
    Hello, I need a recommendation on what hardware (PC) to buy in order to achieve this task: I have a web server (WAMP) running on my laptop, and this server hosts a web app that manages information from users. In a few words: a user visits my web server and a PHP script asks them for a message, which gets stored in a MySQL database; meanwhile, on the same PC, a Flash front end takes these messages and displays them. My English is bad; maybe this diagram will help: http://img689.imageshack.us/img689/2371/diagramav.png Well, I want to connect the PC that runs everything (web server, MySQL server, Flash front end) to an LCD TV in order to create something like an information spot that will be on almost all day (sometimes all day). Do you recommend having the web server and the front end on the same PC, or separating them onto two? What hardware do you recommend? I mean, what type of PC, with fans or without? Please suggest cheap but good solutions. Thanks for your help.

    Read the article


    - by Myles Gray
    We have 3x ESX hosts and 2x SANs that we wish to move to a redundant 10G networking infrastructure. We have 4x Dell PowerConnect 8024Fs to provide our backbone, configured as follows (only the core switches are relevant to this question). So the questions are:

    1) Do the interconnects between the 4x 8024Fs need to be LAG'd, or just left to STP?
    2) As the NICs on the servers are split across 2 switches, does any special configuration need to be done here or on the switches?
    3) If a link or switch fails, will the switches automatically find a new path to the server/SAN?

    Read the article
