Search Results

Search found 892 results on 36 pages for 'mirror'.

Page 23/36

  • Microsoft Forefront Threat Management Gateway (TMG) 2010 - Which topology should I choose for a monitoring-only server?

    - by MadBoy
    Hello, I've installed Forefront and want to use it as a traffic-monitoring solution until we decide to deploy it as a router. This virtual machine has two NICs assigned. One NIC is connected to a switch port configured as a "mirror port" of our WAN link, so it sees all the network traffic flying by. The other NIC provides internet access. The server is located inside our LAN. Which topology should I choose, and which options should I look at, to be able to see which traffic is in use (SMTP, WWW, etc.) and who is doing what? We have had cases of machines infected and sending spam, and we want to be able to see that some machine is sending large amounts of mail. Is that possible?

    Read the article

  • Free-flow Alternative to PowerPoint?

    - by Nick Klauer
    So I've been digging around the net trying to find a good set of alternatives to PowerPoint. Part of my interest is that I found one, Prezi, that I liked for its free-form style. Part of its power is that I can zoom out and select any part of the presentation to continue from, and it feels much like a mind map or an association of thoughts. Are there any other tools that offer anything in a similar vein to this way of presenting material? I'm looking for something that just pops differently than a death-by-PowerPoint-style presentation, so I would be happy to find tools that help present information in more fluid styles. It doesn't have to mirror Prezi, and I wouldn't want that, but after seeing what Prezi does, I have to think there are other ways of presenting information to a group of people than one square slide at a time.

    Read the article

  • Downloading a large site with wget

    - by Evan Gill
    Hi, I'm trying to mirror a very large site, but wget never seems to finish properly. I am using the command:

        wget -r -l inf -nc -w 0.5 {the-site}

    I have downloaded a good portion of the site, but not the whole thing. The content does not change fast enough to bother using time-stamping. After running overnight, this message appears:

        File `{filename}.html' already there; not retrieving.
        File `{filename}.html' already there; not retrieving.
        File `{filename}.html' already there; not retrieving.
        File `{filename}.html' already there; not retrieving.
        Killed

    Does anyone know what is happening and how I can fix it?
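
    The trailing "Killed" usually means the wget process was terminated from outside, often by the kernel's out-of-memory killer on very large recursive crawls. A hedged sketch of checking for that and simply restarting the crawl until it finishes cleanly; the URL and log path below are placeholders:

        # did the kernel OOM killer end the last run? (assumes a Linux host)
        dmesg | grep -i -E 'killed process|out of memory'

        # restart the mirror until wget exits cleanly; -nc skips files already on disk
        until wget -r -l inf -nc -w 0.5 -o mirror.log http://example.com/; do
            echo "wget exited non-zero, restarting in 60s" >&2
            sleep 60
        done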

    Read the article

  • One Way Sync of a Bucket With Local Directory

    - by user48651
    I have a local directory that I would like to synchronize with an S3 bucket. I have two specific requirements: (1) if a local file is the same as the remote one, do not re-transfer it to the bucket; (2) if files or directories exist in the bucket but do not exist locally, delete them. Basically, the bucket should mirror the local copy and not vice versa. I looked into the s3cmd sync command, but unfortunately requirement 2 is not fulfilled: if files exist in the bucket but not in the local copy, they are copied to the local side instead of being deleted.
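
    A minimal sketch of how this is commonly handled with s3cmd's delete option, assuming a reasonably recent s3cmd; the paths and bucket name are placeholders, and a dry run is worth keeping until the planned deletions look right:

        # push local -> bucket; unchanged files are skipped, remote-only objects are removed
        s3cmd sync --delete-removed --dry-run /data/local-dir/ s3://my-bucket/backup/

        # re-run without --dry-run once the planned transfers and deletions look correct
        s3cmd sync --delete-removed /data/local-dir/ s3://my-bucket/backup/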

    Read the article

  • Splitting a drive whose layout is mirrored and whose type is dynamic

    - by shiva
    I have a C drive/volume on my server with layout = Mirror, type = Dynamic, and status = Healthy (Boot, Page File, Crash Dump). I have some questions regarding this configuration. I think it is a RAID configuration; please correct me if I am wrong. I read that mirroring is simply a RAID-1 configuration. All my software and the OS are on this drive. I want my software to be on a separate drive, but I am not sure whether I can create a separate drive out of the C drive described above. I want to know: (a) whether I can do it, and how (using Disk Management)? (b) whether this is the right approach?

    Read the article

  • Reliable Linux RAID Card

    - by Chris S
    Can anyone recommend an appropriate RAID card for use under Linux (Ubuntu)? I have three Western Digital SATA drives and would like to set up either RAID-1 or RAID-5. Software RAID has been very flaky and unreliable for me (e.g. it doesn't seem to mirror the boot partition). Unfortunately, most RAID cards I've seen online don't mention any Linux driver support. Would most cards work in Linux, or do I have to look at a select few that explicitly support it?
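
    For what it's worth, Linux software RAID can mirror the boot partition as well; a minimal mdadm sketch, assuming two disks partitioned identically with a boot partition (sdX1) and a root partition (sdX2), where the device names are placeholders:

        # mirror the boot partitions and the root partitions as separate arrays
        mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 /dev/sdb1
        mdadm --create /dev/md1 --level=1 --raid-devices=2 /dev/sda2 /dev/sdb2

        # record the arrays so they assemble at boot, then put GRUB on both disks
        mdadm --detail --scan >> /etc/mdadm/mdadm.conf
        grub-install /dev/sda
        grub-install /dev/sdb

    With the bootloader on both disks, the machine can still come up if either drive fails, which addresses the usual "software RAID doesn't cover boot" complaint.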

    Read the article

  • Varnish doesn't seem to be caching

    - by Charlie Somerville
    I've set up a Varnish cache mirror to sit in front of a file server, but it seems to be endlessly re-downloading data from the file server. There is about 100GB of data in total, but so far Varnish has downloaded 800GB from the file server. I'm using the default VCL file that comes with Varnish, and the response headers for files served by the file server are similar to the following:

        HTTP/1.1 200 OK
        Cache-Control: max-age=290304000, public
        Content-Type: image/jpeg
        Expires: Wed, 29 Dec 2010 21:38:33 GMT
        Server: Microsoft-IIS/7.0
        E-Tag: "8b4723296ab697530768f18b1378b269"
        Content-Disposition: inline; filename=image046.jpg;
        X-AspNet-Version: 4.0.30319
        X-Powered-By: ASP.NET
        Date: Thu, 23 Dec 2010 05:38:33 GMT
        Content-Length: 100592

    I'm starting varnishd with the following options:

        varnish/sbin/varnishd -a 0.0.0.0:80 -f varnish/etc/varnish/default.vcl -s file,varnish/var/lib/varnish/varnish_storage.bin,100G
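
    A hedged first step is to confirm whether requests are hitting the cache at all and what TTL Varnish assigns to fetched objects; varnishstat and varnishlog ship with Varnish, and the counter names below are the Varnish 2.x ones:

        # hit/miss counters: cache_hit should grow faster than cache_miss once warm
        varnishstat -1 | grep -E 'cache_(hit|miss)'

        # watch the TTL Varnish computes for each backend fetch
        varnishlog | grep -E 'TTL|ObjHeader'

    If objects are getting a short or zero TTL, the usual fix in Varnish 2.1 is to set a floor on beresp.ttl in vcl_fetch.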

    Read the article

  • How to manage a home-grown YUM package repo?

    - by TomOnTime
    There are plenty of websites that explain how to manage a mirror of YUM repos. I want to run a repo for my home-grown packages. Is there a good way to manage such repos? What I need to do:
    - Manage 3 repos: unstable, testing, stable.
    - Self-service functions that let users add/remove/promote packages (promote means moving a package from unstable to testing, or from testing to stable).
    - ACLs that control which users/groups may add/remove/promote packages.
    - Automatically re-sign packages as they move from repo to repo (since the GPG key for "stable" should be different from the one for "unstable").
    - Automatically run "createrepo" to update the repodata when needed.
    Suggestions?
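
    The individual steps are small even if no single tool covers the whole workflow; a hedged sketch of what a promotion script might do, assuming the repo layout below is a placeholder and that packages are re-signed with rpm --resign (rpmsign on newer distributions), with the destination key selected via the _gpg_name macro:

        #!/bin/sh
        # usage: promote.sh <package.rpm> <from-repo> <to-repo>
        set -e
        PKG=$1; FROM=$2; TO=$3
        REPOBASE=/srv/yum

        mv "$REPOBASE/$FROM/$PKG" "$REPOBASE/$TO/$PKG"

        # re-sign with the destination repo's GPG key
        rpm --resign --define "_gpg_name releases@example.com" "$REPOBASE/$TO/$PKG"

        # refresh repodata in both repos
        createrepo --update "$REPOBASE/$FROM"
        createrepo --update "$REPOBASE/$TO"

    The ACL and self-service pieces would have to wrap something like this, e.g. behind sudo rules or a small web front end.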

    Read the article

  • Booting FreeBSD 9 from a USB stick: boot error

    - by ssc
    I am trying to boot FreeBSD 9 from a USB stick that I created following the official guidelines:

        dd if=FreeBSD-9.0-RELEASE-i386-memstick.img of=/dev/da0 bs=64k

    Booting fails with a simple 'boot error'. I have used this USB stick for quite a while for the very purpose of booting and installing new OSes, but I tried a different one anyway: same problem. I have also reproduced the issue on a different machine. I acquired the image file over BitTorrent, which AFAIK has an MD5 check built in, but I downloaded it again anyway, directly from a FreeBSD mirror. Same result. Has anyone had success with this? I did not find anything about it online, which seems to suggest this is not a well-known problem. Does anyone have a thought on where else to look for the cause of the problem?
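
    A hedged first step is to rule out a corrupt image or a bad write explicitly, assuming the SHA256 list published next to the image on the FreeBSD mirror and that /dev/da0 really is the stick (as in the question):

        # checksum the downloaded image and compare it against the mirror's SHA256 list
        IMG=FreeBSD-9.0-RELEASE-i386-memstick.img
        sha256 "$IMG"

        # write the image, then read the same number of bytes back off the stick;
        # the two digests should match if the write to /dev/da0 was clean
        dd if="$IMG" of=/dev/da0 bs=64k
        dd if=/dev/da0 bs=64k | head -c "$(stat -f %z "$IMG")" | sha256

    If both checks pass, the stick and image are fine and the BIOS's USB boot handling becomes the more likely suspect.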

    Read the article

  • Redundant/multi-site terminal server

    - by Adam
    Hi. We have a Hyper-V cluster running 5 virtual terminal servers using HA. We need to make this system redundant, so that if this site were to fail our users could log into the backup system at another location and access their data via the terminal servers. Any ideas? We were thinking of maybe using a NAS that replicates the data to the other location in real time (pass-through disks), and having a similar Hyper-V cluster set up in the backup location. However, we would need to create the users in both locations and create a virtual mirror without the data, i.e. applications, directories, settings, etc. Is this the best way to achieve this? We have read that using Hyper-V pass-through disks comes with a big performance hit.

    Read the article

  • KDE on Windows won't download/install

    - by endolith
    No matter which mirror I use, I get this error. I've tried a few times over the last several weeks.

        Download failed
        ---------------------------
        The download of ftp://kde.mirrors.tds.net/pub/kde/stable/4.5.4/win32/libopensp-vc100-1.5.2-bin.tar.bz2
        failed with error: archive downloaded from
        ftp://kde.mirrors.tds.net/pub/kde/stable/4.5.4/win32/libopensp-vc100-1.5.2-bin.tar.bz2
        checksum error
        ---------------------------
        Retry   Ignore   Cancel

    Should I just ignore this and let it continue? Update: I ran as administrator, changed the install directory to C:\KDE, and ignored this error, and it seemed to install, but then gave me a different error for the same file:

        Error
        ---------------------------
        Internal Error - File C:/Temp/KDE/libopensp-vc100-1.5.2-bin.tar.bz2 does not exist
        ---------------------------
        Cancel

    But now programs seem to work! Should I just ignore this error? I can't even find a plain-English explanation of what libopensp is.

    Read the article

  • What is the best solution for letting both Macs and PCs access digital images and videos on a network?

    - by Gertbert
    I'm looking for a simple way to share digital images and videos over a network to both Macs and PCs. I'm currently looking into three options: a NAS, WD MyBook Mirror drives attached to a router, or an HP WHS product like the Data Vault. I'm looking for something easy to implement that allows drive mirroring but also has good performance for both Macs and PCs. I've read that the HP WHS devices rebalance on their own schedule, making them useless for streaming video, and I hope someone can definitively confirm or deny this, as it's a dealbreaker if true. Any other suggestions are appreciated. Thanks!

    Read the article

  • DB auto failover in C# does not work when the principal server physically goes offline

    - by user62521
    I'm setting up DB auto-failover in C# with SQL Server 2008, using a 'high safety with automatic failover' mirror with a witness, and my connection string looks like:

        Server=tcp:DC01; Failover Partner=tcp:DC02; database=dbname; uid=sewebsite; pwd=somerndpwd; Connect Timeout=10; Pooling=True;

    During testing, when I turn off the SQL Server service on the principal server, the auto failover works like a charm, but if I take the principal server offline (by shutting down the server or killing the network card), auto failover does not work and my website just times out. I found this article, where the second-to-last post suggests that it's because we are using named pipes, which do not work when the principal goes offline, but we force TCP in our connection string. What am I missing to get this DB auto failover working?

    Read the article

  • LVM mirroring VS RAID1

    - by syrenity
    Hi. Having learned a bit about LVM mirroring, I thought about replacing my current RAID-1 scheme with it to gain some flexibility. The problem is that, according to what I found on the Internet, LVM is: (1) slower than RAID-1, at least for reads (as only a single volume is used for reading); and (2) unreliable across power interruptions, requiring the disk cache to be disabled to prevent data loss (http://www.joshbryan.com/blog/2008/01/02/lvm2-mirrors-vs-md-raid-1/). It also seems, at least according to several setup guides I read (http://www.tcpdump.com/kb/os/linux/lvm-mirroring/intro.html), that one actually needs a third disk for storing the LVM mirror log. This makes the setup completely unusable on two-disk installations and reduces the number of usable mirror disks in larger setups. Can anyone comment on the above points and share their experience with LVM mirroring? Thanks.
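
    On the third-disk point specifically, the mirror log does not have to live on a separate disk; a minimal lvcreate sketch, assuming a volume group vg0 built from the two PVs, where the names and sizes are placeholders:

        # keep the mirror log in memory: no third disk, but a full resync on each activation
        lvcreate -L 100G -m 1 --mirrorlog core -n lv_data vg0

        # or keep a mirrored on-disk log alongside the data, still on just the two PVs
        lvcreate -L 100G -m 1 --mirrorlog mirrored -n lv_data2 vg0

    The trade-off is between the extra disk, the resync-on-activation cost of a core log, and the small write overhead of a mirrored log.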

    Read the article

  • How do I rewrite *.example.com to www.example.com?

    - by Lekensteyn
    In my network I have some Ubuntu machines that need to download files from nl.archive.ubuntu.com. Since it's quite a waste of time to download everything multiple times, I've set up a Squid proxy for caching the data. Another use for this proxy was rewriting requests for archive.ubuntu.com or *.archive.ubuntu.com to nl.archive.ubuntu.com, because this mirror is faster than the US mirrors. This worked quite well, but after a recent reinstall of my caching machine the configuration was lost. I remember having a separate Perl program for handling this rewrite. How do I set up such a Squid proxy that rewrites the host *.example.com to www.example.com and caches the result of the latter?
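
    A hedged sketch of the usual shape of this: a small helper hooked in with Squid's url_rewrite_program directive, which reads one request line at a time and prints back the (possibly rewritten) URL. The path is a placeholder and the helper reply format varies between Squid versions:

        #!/bin/sh
        # /etc/squid/rewrite-mirror.sh -- send any *.archive.ubuntu.com host to nl.archive.ubuntu.com
        while read url rest; do
            echo "$url" | sed -r 's#^(http://)([a-z.]*\.)?archive\.ubuntu\.com#\1nl.archive.ubuntu.com#'
        done

    It would then be wired up in squid.conf with something like "url_rewrite_program /etc/squid/rewrite-mirror.sh" plus a url_rewrite_children line sized for the request volume.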

    Read the article

  • Setting up a RAID 1 Array for a Home Server

    - by user1048116
    I'm not sure if this is even possible, but it's worth asking on here! Essentially I have an old machine at home (well, not old hardware-wise, but I recently built a new gaming rig), on which I decided to install a copy of W2008 R2 and use as a file/backup server and media-center'ish machine. As of now it has a single drive partitioned into C and D, with D being the data partition. I happen to have an old 1TB SATA drive lying around at home, and was wondering if it's possible to set up a RAID 1 array in my rig within Windows without needing to lose everything on my first drive (or maybe even just mirror a specific partition, say the data partition, as this is what stores my photos etc.). Maybe this isn't possible, but you never know :) Regards, T.C
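
    In principle Windows can add a mirror to an existing volume without wiping it, as long as both disks are converted to dynamic; a hedged diskpart script sketch (run with "diskpart /s mirror-d.txt"), where the disk numbers are placeholders and a backup first is still the safe move since the conversion to dynamic is effectively one-way:

        rem mirror-d.txt -- mirror the existing D volume onto the spare 1TB disk
        select disk 0
        convert dynamic
        select disk 1
        convert dynamic
        select volume D
        add disk=1

    The same steps are available in Disk Management as "Convert to Dynamic Disk" and "Add Mirror..." on the volume's context menu.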

    Read the article

  • Mirroring svn repository

    - by cardy
    I have an svn repository and I'd like to have it duplicated across multiple machines for availability purposes. Right now, when my VPS goes down I'm unable to connect to the repository, and this is very annoying. The easiest (and most expensive) solution is to set up two identical machines and make them work like clones. I'd like to know if there are any alternatives (involving two machines). Ideally I would have two VPSs in different datacenters, so if one goes down I can rely on the other. Thanks. I need a mirror for both read and write, not only for read. The svn repos are Berkeley DB based.
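
    A hedged sketch of the common pattern: svnsync keeps a read-only replica up to date, and a write-through proxy lets clients commit against the replica host as well. Hostnames and paths are placeholders, and since the repositories are Berkeley DB based it may be worth dumping/loading them to FSFS first:

        # on the replica: create an empty repository and allow svnsync's revprop changes
        svnadmin create /var/svn/mirror
        printf '#!/bin/sh\nexit 0\n' > /var/svn/mirror/hooks/pre-revprop-change
        chmod +x /var/svn/mirror/hooks/pre-revprop-change

        # seed the mirror, then keep it current (e.g. from the master's post-commit hook or cron)
        svnsync initialize file:///var/svn/mirror https://master.example.com/svn/repo
        svnsync synchronize file:///var/svn/mirror

    Write access on the replica can then be forwarded to the master with mod_dav_svn's SVNMasterURI directive (Subversion 1.5+), so commits land on the master and flow back through svnsync.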

    Read the article

  • How do you monitor the health of a mirrored disk in Windows?

    - by NitroxDM
    I have a Mirrored Dynamic disk on my Windows 2003 Server. How do you monitor the health of the volume? Is there a way to have the server send an email when there is an issue with the volume? Is there a way to have the server run S.M.A.R.T. tests? EDIT: Nothing says WTF like logging into a client server, running DISKPART LIST VOLUME, and seeing this:

        Volume ###  Ltr  Label        Fs     Type        Size     Status    Info
        ----------  ---  -----------  -----  ----------  -------  --------  --------
        Volume 0     X   xDrive       NTFS   Mirror      233 GB   Failed Rd
        Volume 1     C                NTFS   Simple       57 GB   Healthy   System
        Volume 2     D                       DVD-ROM        0 B   Healthy
        Volume 3     F                RAW    Partition   466 GB   Healthy
        Volume 4     E   New Volume   NTFS   Partition   932 GB   Healthy

    Read the article

  • SharePoint MOSS - Serve HTTP content on an HTTPS page without Mixed Content Warning?

    - by kcb263
    Our "portal-like" SharePoint site is served using HTTPS/SSL. So a user goes to https://web.company.com and sees content and different Web Parts. So far, no problem. The desire now is to have new Web Parts added that either frame HTTP content (such as Weather Bug) or HTTP RSS feeds. The issue that arises is that by doing this, results in a "Mixed Content" warning in the browser. Has anybody successfully been able to implement such a scenario, or one similar to it? The options we have looked at, unsuccessfully, have been: using Apache Reverse Proxy Server mirror an external site Custom Web Parts

    Read the article

  • Can I take an HDD in RAID 1 and plug it straight into a different machine?

    - by jacko
    I would assume that I can just take my HDD out of my NAS (in a RAID 1 mirror), plug it into another enclosure, and have it work off the bat, but I'd like to make sure... Any ideas? Edit: My current setup is a Netgear ReadyNAS in (hardware) RAID 1. I'm hoping to replace this with a home-theatre-type PC (possibly running Ubuntu), and would like to migrate my data without having to do a bulk transfer over my network between the two machines. Can anyone confirm whether this works for the Netgear ReadyNAS?
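
    Whether a pulled member mounts directly depends on how the ReadyNAS lays out its disks (typically Linux md with its own partitioning rather than a bare filesystem). A hedged sketch of inspecting a pulled disk read-only on an Ubuntu box before trusting it; the device and partition names are placeholders:

        # look for md/RAID superblocks and the partition layout on the pulled disk
        sudo apt-get install mdadm
        sudo fdisk -l /dev/sdb
        sudo mdadm --examine /dev/sdb /dev/sdb1 /dev/sdb2 /dev/sdb3 2>/dev/null

        # if a data member is found, start it degraded and mount it read-only
        sudo mdadm --assemble --run /dev/md0 /dev/sdb3
        sudo mount -o ro /dev/md0 /mnt

    Depending on the ReadyNAS model there may also be LVM or a non-standard filesystem layout on top of md, so the mount step can need extra work.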

    Read the article

  • Need an alerting system if my cloning script fails

    - by rahum
    I've configured a nightly rsync job to mirror one server to a standby offsite backup server. The total datastore on the primary is 1.5TB. In the course of getting this working I ran into numerous instabilities in the environment, which I seem to have sorted out, but even though it's now working I am still nervous. This is intended to be a disaster-scenario standby server, and if disaster strikes and the standby does not have all the proper data synchronized, I'm out of a job. Thus, I want to script a system that will confirm, after each nightly sync, that the destination data matches the source. I realize that rsync does this, but if rsync doesn't complete fully (which was happening during the setup troubleshooting), I need to know. Any suggestions? I'm best with Ruby, if that is relevant to the solution.
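
    A hedged sketch of a cron wrapper that treats anything other than a clean rsync exit as an alert and then runs a checksum-only dry run as an independent verification pass; the hostnames, paths, and alert address are placeholders, and the checksum pass is deliberately heavy (it rereads the data on both sides):

        #!/bin/sh
        # nightly-mirror.sh -- run from cron on the standby
        SRC=primary.example.com:/data/
        DST=/srv/standby/data/
        ALERT=ops@example.com
        LOG=/var/log/nightly-mirror.log

        rsync -a --delete "$SRC" "$DST" >"$LOG" 2>&1
        STATUS=$?
        if [ "$STATUS" -ne 0 ]; then
            mail -s "standby sync FAILED (rsync exit $STATUS)" "$ALERT" < "$LOG"
            exit 1
        fi

        # verification: checksum comparison as a dry run; any output means a mismatch
        DIFFS=$(rsync -anc --delete --itemize-changes "$SRC" "$DST")
        if [ -n "$DIFFS" ]; then
            printf '%s\n' "$DIFFS" | mail -s "standby sync verification found differences" "$ALERT"
            exit 1
        fi

    The same logic ports to Ruby easily; the important parts are checking rsync's exit status and doing a second, independent comparison.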

    Read the article

  • 389 DS Achitecture for Multiple Sites

    - by Kyle Flavin
    I'm looking to deploy 389 Directory Server in my environment to replace an existing iPlanet installation. I would be using it primarily to store user account data for authentication purposes. I have two physically separate data centers that I would like to share the same directory tree. My initial thinking is to set up 389 DS as follows:
    - a master/consumer pair in data center A
    - a master/consumer pair in data center B
    - a replication agreement between both masters, to mirror the directory tree in both environments
    Does this sound like a reasonable approach? Is there a better way to do it (e.g. four masters)? Is there documentation on best practices for setting up 389 DS in situations such as this? Thanks.

    Read the article

  • Best way to troubleshoot apache not starting?

    - by lowgain
    We have recently gotten a backup server to mirror all our data onto in case the primary server goes down. I've gotten all the site data updated through rsync, and all the Apache config and databases updated. Both machines are on Ubuntu 9.x (9.04 on the primary, 9.10 on the backup). Everything seems synced up for the most part at this point (I still need to figure out user syncing), so I try to start Apache and get:

        * Starting web server apache2                                          [fail]

    Nothing else indicates what the problem could be. I know I don't have enough info to expect a solution from you guys, so I'd just like to know where I can go from here to further investigate this issue. Would there be any error logs for this? Thanks!
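
    A hedged checklist of the usual first stops on Ubuntu; the paths are the stock ones and may differ for a config copied over from another machine:

        # syntax-check the copied configuration and list the vhosts/ports it parsed
        sudo apache2ctl configtest
        sudo apache2ctl -S

        # look for the actual startup error
        tail -n 50 /var/log/apache2/error.log
        tail -n 50 /var/log/syslog

        # a port already in use is a common culprit after cloning a config
        sudo netstat -tlnp | grep -E ':80|:443'

    Missing modules, SSL certificates, or log directories referenced by the copied config are the other usual suspects.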

    Read the article

  • How to take a CSS animation from a browser, and export a GIF of it?

    - by Truth
    I have the following CSS3 animation: http://dabblet.com/gist/2884702. It's basically a simulation of a mirror rotating on its x-axis. Now, I wish to present it in a PowerPoint presentation. Since PowerPoint doesn't have the WebKit engine, I want to extract an animated GIF of that animation and embed it into my presentation. The problem? No matter what I've tried, I couldn't make a reasonably smooth animated GIF. I've Googled and found much free software claiming to do the job, and tried several, but none worked as expected. I've tried IrfanView: same issue (also, their site makes me want to vomit). So, is there a solution? Or am I doomed to not being able to display it?
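
    One hedged route is to screen-record the animation playing in the browser and convert the capture with ffmpeg; the input filename, frame rate, and width below are placeholders, and the GIF can only be as smooth as the capture:

        # turn a screen recording of the animation into a looping GIF
        ffmpeg -i capture.mp4 -vf "fps=15,scale=640:-1" -loop 0 mirror-animation.gif

    Raising the fps value makes the GIF smoother at the cost of file size.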

    Read the article

  • Serve mirrored (static) web-page with original headers

    - by aioobe
    I have a dynamic web page that I want to create a "frozen" copy of. Typically I would do something like wget -m http://example.com and then put the files in the document root of the web server. This site, however, has some dynamic content, including dynamically generated images, for instance http://example.com/company/123/logo. This means that in order to mirror the page, I need to: (1) save whatever headers the server currently serves for each URL, which can be done using the wget option --save-headers; and (2) serve the static pages together with the proper headers for each file (this I have no idea how to do). What is the best way to solve this? Any suggestions are welcome.
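
    One hedged option for the serving side is Apache's mod_asis ("send as is"), which emits a file's leading header block before its body; a sketch under the assumption that wget's saved status line is first rewritten into the Status: header form mod_asis expects, with paths as placeholders:

        # 1. fetch the frozen copy with each response's headers prepended to the body
        wget -m --save-headers http://example.com/

        # 2. turn the saved "HTTP/1.1 200 OK" line into a "Status:" header, in place
        find example.com -type f -exec sed -i '1s#^HTTP/[0-9.]* #Status: #' {} +

        # 3. serve the mirror through mod_asis, e.g. in the vhost:
        #      <Directory /var/www/frozen/example.com>
        #          SetHandler send-as-is
        #      </Directory>

    Hop-by-hop headers that wget recorded (Connection, Transfer-Encoding, and possibly Content-Length) may also need stripping before the copies are served.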

    Read the article
