Search Results

Search found 5298 results on 212 pages for 'automated deploy'.

Page 104 of 212

  • Can I make Apache drop a connection when matching a URL?

    - by PP
    Using mod_rewrite I can construct a rule to respond with a clean error code (e.g. 404 Not Found, 410 Gone, or 403 Forbidden) when a page is requested that I don't want to serve. But frequently I get completely erroneous requests from hackers scanning my website for vulnerabilities, or possible cross-site scripting attempts. For these clients I do not want to return a clean error; I'd rather do something else, like immediately drop the connection with no response, or alternatively hold the connection open for a lengthy period of time to frustrate the automated process. Any ideas how to accomplish this with Apache? I've read that nginx has the ability to immediately terminate a connection when a particular pattern is matched.

  • Allow users to view Word documents only and not be able to edit, copy or save them.

    - by Alexander
    Hello. In a traditional Windows Server 2003 environment with AD, we have shared a folder for our policy documents (MS Word). These documents get edited/updated now and then by the administrator (the principal of the college). Users only have read-only access to the folder, but they can still Save As and then change the content. SharePoint is a possible solution but not easy to implement. We also thought of using a CMS on Linux and installing Joomla to let users only view the docs with a document management system... but is it possible to automatically pick up the policy folder on the network and convert it, or put it in a format, that users can only view and not copy? We also thought of saving the docs to secure PDF format, but the principal wants an automated system. Basically she just wants to work in Word, and the policies must be available to staff members on the network. Any ideas? Much appreciated.
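
    One way to approach the "automated PDF" idea mentioned above is a scheduled conversion job. This is a minimal sketch under stated assumptions, not the poster's environment: it assumes LibreOffice (soffice) is installed on a machine that can reach the share, and the share paths are hypothetical. It converts each Word policy document to a PDF copy in a separate read-only share, so staff open the PDFs while the principal keeps working in Word.

```python
# Hedged sketch: convert each Word policy file on the share to a PDF copy in a
# separate read-only share; schedule it (e.g. every 15 minutes) from Task Scheduler.
# Assumes LibreOffice (soffice) is installed and on PATH; both paths are hypothetical.
import subprocess
from pathlib import Path

POLICY_SHARE = Path(r"\\fileserver\policies")        # where the principal edits (hypothetical)
READ_ONLY_OUT = Path(r"\\fileserver\policies-pdf")   # what staff read (hypothetical)

def convert_updated_documents():
    for doc in POLICY_SHARE.glob("*.doc*"):
        pdf = READ_ONLY_OUT / (doc.stem + ".pdf")
        # Re-convert only when the Word file is newer than its PDF copy.
        if pdf.exists() and pdf.stat().st_mtime >= doc.stat().st_mtime:
            continue
        subprocess.run(
            ["soffice", "--headless", "--convert-to", "pdf",
             "--outdir", str(READ_ONLY_OUT), str(doc)],
            check=True,
        )

if __name__ == "__main__":
    convert_updated_documents()
```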

  • Sending an Email from 2 Mail Servers

    - by Ted Smith
    We are currently attempting to move away from using a "local" mail (Exchange) server to a cloud-based offering for all our automated emails. The problem is that we send and receive thousands of emails a day and uptime is quite critical, so the business does not want to put all its eggs in one basket: if we use a cloud-based offering (Mailgun), they would like a backup in case it goes down. So my question is: would it be possible to set up multiple A, TXT and CNAME records pointing to multiple IP addresses, so that if one mail server goes down we can automatically start sending emails from the failover (without them being blocked by a reverse DNS lookup)? I know we will still need to adjust the MX record for incoming emails, but it is acceptable not to receive emails for a short time (1-2 hours). Does this make sense?

  • What does the arxiv.org anti-bot "search and destroy" actually do?

    - by Brian Campbell
    The lanl.arxiv.org math and scientific preprint service (formerly known as xxx.lanl.gov) has a strict policy against bots that ignore its robots.txt: Robots Beware. On that page, they have a link labelled "Click here to initiate automated 'seek-and-destroy' against your site", which is forbidden by their robots.txt but which badly behaved robots will presumably follow, and reap the consequences. The question is: what are the actual consequences? I have never had the guts to actually click on that link to see what it does. What can they be doing that is both effective and legal?

  • How to create a password-less service account in AD?

    - by Andrew White
    Is it possible to create domain accounts that can only be accessed via a domain administrator or similar access? The goal is to create domain users that have certain network access based on their task, but these users are only meant for automated jobs. As such, they don't need passwords, and a domain admin can always do a run-as to drop down to the correct user to run the job. No password means no chance of someone guessing it, or of it being written down or lost. This may belong on SuperUser or ServerFault, but I am going to try here first since it's on the fuzzy border to me. I am also open to constructive alternatives.

  • How to connect a public web server to an internal LAN

    - by DefSol
    I have a VPS which is my public web server for all my clients. It's running Server 2008, and I would like to have it connect via a secure connection to my internal LAN. I would like this to be a route so access is bidirectional. I have read about Server and Domain Isolation, but am concerned this may prevent public access to the websites on the server. I currently have a PPTP tunnel, but I want better security (IPsec or SSL etc.) and it's not giving me bidirectional access. (In fact my backups aren't copying across, but this could be an ACL issue.) The goal is to provide easy/automated backups of data and SQL DBs to my internal LAN, as well as a means to provision new sites and DBs from a workflow occurring internally. The internal LAN is Windows-based with ISA 2006 at the perimeter. Thanks

  • Creating a journal/blog

    - by DijnsK
    Hi, I'm trying to replace our current journal (an Excel sheet) with a web-portal-based journal. Something kind of like Twitter, but with more options; it also needs to have a login so we can track the people entering. It could also be some sort of tool, but I can't find anything that meets our demands... I could use an ITIL ticketing service, but that has way too much functionality for our use. I'm basically looking for a web portal with a shared blog, where users can log in and create new entries with: a topic name, an automated follow-up ID number, a field where they can enter specific info, a reply field, and a status field with predefined statuses. Can anyone help me with this? Thanks in advance. Koen

  • Monitoring bandwidth/latency/jitter between 2 sites?

    - by TheCleaner
    I have 2 sites connected via an MPLS network and I'd like to do the following: set up a host on each end that can "talk" back and forth to the other and somehow report/log what kind of throughput, jitter, latency, etc. they are experiencing between each other at 5-minute intervals. Something similar to Qcheck, but something that can be automated. Bottom line is I'm trying to determine whether the WAN is "stable" throughout the day or whether something is wrong. We have video conferences between these sites, and even at 1024 kbps we are experiencing delays and jitter on calls. I'm hoping to exonerate the network with some testing.
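
    A minimal sketch of the "host on each end" idea, under stated assumptions: a Linux host at one site pings a host at the other site every 5 minutes and logs latency, jitter (mdev) and packet loss to CSV. The remote address and log path are placeholders, and throughput would still need a separate tool such as iperf run between the two hosts.

```python
# Hedged sketch: ping the host at the far end of the MPLS link every 5 minutes and
# append latency, jitter (mdev) and packet loss to a CSV file. REMOTE_HOST and the
# log path are placeholders; throughput needs a separate tool such as iperf.
import csv
import re
import subprocess
import time
from datetime import datetime

REMOTE_HOST = "10.1.2.10"      # host at the far end of the MPLS link (placeholder)
LOG_FILE = "wan_latency_log.csv"
INTERVAL = 300                 # 5 minutes

def sample():
    # 20 pings; Linux iputils prints "rtt min/avg/max/mdev = a/b/c/d ms" and "x% packet loss"
    out = subprocess.run(["ping", "-c", "20", REMOTE_HOST],
                         capture_output=True, text=True).stdout
    rtt = re.search(r"= ([\d.]+)/([\d.]+)/([\d.]+)/([\d.]+) ms", out)
    loss = re.search(r"([\d.]+)% packet loss", out)
    return (rtt.groups() if rtt else ("", "", "", "")), (loss.group(1) if loss else "")

while True:
    (rtt_min, rtt_avg, rtt_max, jitter), loss_pct = sample()
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow([datetime.now().isoformat(),
                                rtt_min, rtt_avg, rtt_max, jitter, loss_pct])
    time.sleep(INTERVAL)
```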

  • Test/Dummy SMTP server for Windows

    - by geoaxis
    I would like to install a test/dummy SMTP server on a Windows 2008 server (virtual box). I just want to test my web application on the machine itself, so I don't need the mails to go out on the Internet, just to be written to disk (so that I can verify that the mail function was indeed called and the correct data was handed over to SMTP). Can you recommend a tool? I guess starting your own SMTP server in Python is an option. I am looking for a simple (ready-to-use) solution targeted at test systems. I will need to integrate it into automated tests (Selenium) at a later stage.
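
    A minimal sketch of the "SMTP server in Python" option the question mentions: a server that accepts every message and writes it to disk instead of relaying it. It uses the third-party aiosmtpd package (pip install aiosmtpd); the port and output folder are arbitrary choices for the test box, and on older Python versions the now-removed stdlib smtpd module served the same purpose.

```python
# Hedged sketch: accept every message on localhost:2525 and write it to disk as an
# .eml file instead of relaying it. Requires the aiosmtpd package; port and folder
# are arbitrary test choices.
import time
from datetime import datetime
from pathlib import Path

from aiosmtpd.controller import Controller

OUTBOX = Path("received_mail")
OUTBOX.mkdir(exist_ok=True)

class DiskHandler:
    async def handle_DATA(self, server, session, envelope):
        name = datetime.now().strftime("%Y%m%d-%H%M%S-%f") + ".eml"
        (OUTBOX / name).write_bytes(envelope.content)
        print(f"stored mail from {envelope.mail_from} to {envelope.rcpt_tos}")
        return "250 Message accepted for delivery"

if __name__ == "__main__":
    # Point the web application's SMTP settings at 127.0.0.1:2525.
    controller = Controller(DiskHandler(), hostname="127.0.0.1", port=2525)
    controller.start()
    try:
        while True:
            time.sleep(3600)
    except KeyboardInterrupt:
        controller.stop()
```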

  • How to go about scheduling a task in Windows 7 to change wireless connection

    - by Skindeep2366
    This may or may not be something that can be done. I cannot find anything on the wireless connection manager built into Windows 7, let alone methods for passing parameters into it. The problem is as follows: I have 2 wireless routers. One provides internet access, the other provides sole access to the local network. Every day at 4am the main system creates a backup in 2 locations. One is an external USB drive, the other is a location on the network. This is all fine if someone remembers to change over to the local network router before leaving. But if it is forgotten, the roof will collapse, the walls will burn, and I will be... well, you get the idea. Solution: there is already a custom event that fires an automated backup program at 4am every day. I need some way to force the wireless network to use the correct connection at, say, 3:58am every day. Any ideas?
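
    One possible approach, sketched under stated assumptions: Windows keeps saved wireless profiles for both routers, and Task Scheduler runs this script at 3:58am to switch to the profile that reaches the local network before the 4am backup fires. The profile name is a hypothetical placeholder, and a second scheduled run after the backup could switch back the same way.

```python
# Hedged sketch: run from Task Scheduler at 3:58am to switch Windows 7 onto the
# router that reaches the local network. Relies on a saved wireless profile;
# "LocalLanRouter" is a hypothetical profile name.
import subprocess
import sys

BACKUP_NETWORK_PROFILE = "LocalLanRouter"   # hypothetical saved profile name

result = subprocess.run(
    ["netsh", "wlan", "connect", f"name={BACKUP_NETWORK_PROFILE}"],
    capture_output=True, text=True,
)
print(result.stdout or result.stderr)
sys.exit(result.returncode)   # non-zero exit lets Task Scheduler record a failure
```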

  • Mailer Daemon greeting failed

    - by Xelluloid
    I wrote a tool that sends automated mails to a couple of addresses. This worked for a couple of weeks, but since yesterday I get Mailer-Daemon responses like this: "Hi. This is the qmail-send program at test.test2.net. I'm afraid I wasn't able to deliver your message to the following addresses. This is a permanent error; I've given up. Sorry it didn't work out. testuser@domain.com: Connected to 123.456.789.10 but greeting failed. Remote host said: 554 foo.bar.com I'm not going to try again; this message has been in the queue too long." Does someone have an idea what I can do now?

  • Use xrandr to set the absolute position of the screen?

    - by Eli
    I am running XFCE on Fedora 15. I use xrandr to set the secondary display (HDMI-0) to be to the right of the primary (DVI-0); however, it always ends up at the top-right. Is it possible to set the absolute position of the display (e.g. DVI-0 at 0,0 and HDMI-0 at 1920,56), or even set the display to be at the bottom-right? I cannot modify the Xorg.conf, which would be the easy way, as that would mean generating an Xorg.conf file (there is none right now), and I do not know of any automated tool to do that (other than the fglrx driver). The reason I need this is that I want to extend the XFCE panel across both monitors, but with a 56-pixel dead zone at the bottom I cannot do this.
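
    For what it's worth, xrandr itself can place an output at an absolute offset with --pos, so no Xorg.conf should be needed. A minimal sketch, reusing the output names and offsets from the question (adjust for the real setup) and run from an XFCE autostart entry or a login script:

```python
# Hedged sketch: position both outputs at absolute offsets with xrandr's --pos flag.
# Output names and offsets are taken from the question; adjust to the real setup.
import subprocess

subprocess.run(
    ["xrandr",
     "--output", "DVI-0", "--auto", "--pos", "0x0",
     "--output", "HDMI-0", "--auto", "--pos", "1920x56"],
    check=True,
)
```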

  • Change the default userid for connecting to a local AFP share?

    - by Stew
    I've got Netatalk and Avahi running on a local Ubuntu server. I use two different userids: "afp" for Time Machine and "stew" to access my media files etc. In order to mount a shared directory on my server, I have to click "Connect As..." and enter my userid/password every time, because it always tries to log in using Time Machine's userid. I'm not sure if this is because that userid is set as the default, or just because it's the last userid that logged in to that server. Either way: is there a way to change the default userid for connecting to a given server? Mega extra credit: I'd love to have this automated, such that my userid, "stew", is always logged in (and heck, it'd be great to have the directories always mounted, too!) whenever the server is available. Thanks!

  • Automating SQL Express backup via VSS

    - by Ornus
    I need to set up automated daily SQL database backups on my server (SQL Express, so no maintenance plans). To keep things simple I'm going to use a backup solution (JungleDisk) that uses VSS to back up the DB file. SQL Server fully supports VSS and freezes DB I/O on request, so I understand I'm taking consistent snapshots. JungleDisk supports differential backups and compression, so it simplifies things and keeps the cost/bandwidth down. Is it enough to just back up the DB file (.mdf), or do I need to back up the transaction log (.ldf) file as well? I'm OK with losing a day's worth of work (since the last backup). If I go this route, what's the best way to restore the database? Are there any issues with this approach I'm not aware of?
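
    Not the JungleDisk/VSS route itself, but a possible complement to it: a scheduled native backup sidesteps the .mdf/.ldf question because the resulting .bak file restores on its own with RESTORE DATABASE. A minimal sketch with pyodbc; the instance, database and path names are hypothetical, and autocommit is required because BACKUP cannot run inside a transaction.

```python
# Hedged sketch: a scheduled native SQL Server backup via pyodbc. Instance,
# database and path names are hypothetical. autocommit=True is required because
# BACKUP DATABASE cannot run inside a transaction.
import datetime
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    r"SERVER=.\SQLEXPRESS;DATABASE=master;Trusted_Connection=yes;",
    autocommit=True,
)
backup_file = r"C:\Backups\MyAppDb_" + datetime.date.today().strftime("%Y%m%d") + ".bak"
cursor = conn.cursor()
cursor.execute(f"BACKUP DATABASE [MyAppDb] TO DISK = N'{backup_file}' WITH INIT")
while cursor.nextset():        # drain the informational messages so the backup completes
    pass
conn.close()
```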

  • MySQL: Auto-increment value: 0 is smaller than max used value: xx

    - by Rhodri
    Increasingly I'm getting tables that have to be repaired, with the message returned being: Auto-increment value: 0 is smaller than max used value: xx. This has happened on tables with 200 rows and on tables with ~3 million rows, but so far the same few tables have had the problem. I'm running MySQL 5.0.22. The repairs are run by a script which checks every minute whether MySQL tables need repairing. I also have an automated backup of the 6-gigabyte database running every two hours, and the repairs always get triggered around the time of the backup. Any ideas?

  • How to remove the Ubuntu Gnome desktop after making the switch to KDE?

    - by codeLes
    This is the opposite of this question. Basically, I've been using Ubuntu for a while but decided to give KDE a shot, so I went through the process of getting the latest KDE installed. I'm very impressed with KDE, and the KWin window manager seems like a better WM than Compiz, which is what I was using for GNOME (sure, that's an opinion). This was an Ubuntu Jaunty install. So how do I go about removing the GNOME desktop? Is there an automated way similar to what my previous question covered? UPDATE: Are there any packages I should NOT remove in the process?

  • Mail Merge in Microsoft Word with images from Sharepoint

    - by Ian Turner
    Is there any way of doing a mail merge in Microsoft Word 2007 taking data, including images, from a SharePoint site? It's a bit crude, but I've managed to merge text by taking the data off the SharePoint site as an Excel sheet and then merging that. My problem is what to do with the images. I can set up references to the images in the SharePoint site; however, all I can find is a way of mail merging when the images are in the same folder as the document being merged, and I can't find a sensible automated way to pull these images together into one single folder.
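
    A minimal sketch of the missing "pull the images into one folder" step, under stated assumptions: the exported data (saved here as CSV) has a column of image URLs, those URLs are reachable without extra authentication (a real SharePoint site may need e.g. requests-ntlm), and the file and column names are hypothetical. It downloads each image into a single folder next to the merge document so the merge fields can reference local files.

```python
# Hedged sketch: read image URLs from the exported data (CSV here) and download
# each file into one folder next to the merge document. CSV path and column name
# are hypothetical; a real SharePoint site may require authentication.
import csv
from pathlib import Path
from urllib.parse import urlparse

import requests

CSV_EXPORT = "merge_data.csv"     # hypothetical export of the SharePoint list
IMAGE_COLUMN = "PhotoUrl"         # hypothetical column holding the image URLs
OUT_DIR = Path("merge_images")
OUT_DIR.mkdir(exist_ok=True)

with open(CSV_EXPORT, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = row.get(IMAGE_COLUMN, "").strip()
        if not url:
            continue
        target = OUT_DIR / Path(urlparse(url).path).name
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        target.write_bytes(response.content)
```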

  • How do I get Safari to ignore the SSL certificate error?

    - by Tangopop
    In IE 6, 7 and 8 and Firefox 3.6.3 and 3.0.5, I have installed a local SSL certificate on the machine I am testing on, and I have gotten the browser to ignore the SSL error (which comes from one of my web test servers). Now I am trying to do the same thing in Safari 4, with no luck. Basically I am running some automated scripts to test my website before changes go live, and I need to be able to ignore these errors as the scripts will all run autonomously. This is the error screen I am trying to avoid: http://library.bowdoin.edu/news/images/ezproxy-err/safari.jpg As I say, I have installed the certificate locally, and IE 7 on the same machine works fine.

  • Generic tool to configure startup applications on Unix

    - by srid
    Is there an automated deployment tool that manages startup applications on a variety of machines, especially the Unices? Or is the only hope to study the nuts and bolts of each Unix (OS X, Linux, Solaris, HP-UX, AIX) to learn how to configure applications to launch on system startup? I want to run them as a specific user instead of root. At the moment I run them all within a screen session, which is a hassle, as it requires manual intervention every time the machine is rebooted for some reason. Ideally, I am looking for a tool that would read, say, ~/.startup-programs, a file containing on each line the command line to launch one of the needed daemons. And this tool should work on OS X, Linux, Solaris, HP-UX and AIX, writing the appropriate startup scripts for each platform.
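
    A minimal, portable sketch of the launcher half of that idea: read ~/.startup-programs (the file name proposed in the question), one command per line, and start each entry as a detached background process under the invoking user. Hooking a single call to this script into each platform's native startup mechanism (launchd, init/rc scripts, SMF, etc.) would still have to be done once per system; that per-platform wiring is not shown here.

```python
# Hedged sketch: read ~/.startup-programs (the question's proposed file name), one
# command per line, and start each as a detached background process under the
# invoking user. Per-platform startup wiring (launchd/init/SMF) is not included.
import shlex
import subprocess
from pathlib import Path

STARTUP_FILE = Path.home() / ".startup-programs"

def launch_all():
    if not STARTUP_FILE.exists():
        return
    for line in STARTUP_FILE.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                      # skip blanks and comments
        subprocess.Popen(
            shlex.split(line),
            stdin=subprocess.DEVNULL,
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
            start_new_session=True,       # detach from the launcher's session
        )

if __name__ == "__main__":
    launch_all()
```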

  • Make Windows Task Scheduler alert me on fail

    - by acidzombie24
    I have an automated script that pulls backups from my website to my local computer. Once my server was down; another time I accidentally moved my script. How do I make Windows Task Scheduler tell me when the script fails (or doesn't run / isn't found)? I don't care whether it's a prompt, an email or something that appears on my desktop; I just want to be notified if something goes wrong. On my server, cron emails me about errors, which is great. I want something like that on my local Windows 7 computer.
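
    One cron-like workaround, sketched under stated assumptions: schedule a small wrapper instead of the backup script itself, and let the wrapper email when the script is missing, crashes or exits non-zero. The script path, SMTP host and addresses are hypothetical placeholders, and the SMTP server is assumed to accept mail from this machine without authentication.

```python
# Hedged sketch: a cron-style wrapper to schedule in Task Scheduler in place of the
# backup script; it emails when the script is missing, crashes or exits non-zero.
# Script path, SMTP host and addresses are hypothetical placeholders.
import smtplib
import subprocess
from email.message import EmailMessage

BACKUP_CMD = [r"C:\Scripts\pull_backups.bat"]     # hypothetical backup script
SMTP_HOST = "smtp.example.net"
MAIL_FROM, MAIL_TO = "backup@example.net", "me@example.net"

def notify(subject, body):
    msg = EmailMessage()
    msg["Subject"], msg["From"], msg["To"] = subject, MAIL_FROM, MAIL_TO
    msg.set_content(body)
    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)

try:
    result = subprocess.run(BACKUP_CMD, capture_output=True, text=True)
    if result.returncode != 0:
        notify("Backup script failed",
               f"Exit code {result.returncode}\n\n{result.stdout}\n{result.stderr}")
except FileNotFoundError:
    notify("Backup script not found", f"Could not run {BACKUP_CMD!r}")
```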

  • Symantec Protection Suite and System Recovery 2011 Desktop Edition

    - by rihatum
    I am re-posting this as my previous question was treated as if I were "shopping or seeking product recommendations", even though I was not (they also deleted my comments, which were not offensive in nature). Anyway, I have re-phrased some parts of my question and hope the SF admins do not modify or edit this one; I will be most grateful for that. I have a lot of respect for the people who visit this site and help others! Just to clarify, and to keep within SF rules: I am not asking someone to design this solution for me, and I am not asking for storage capacity planning. We have done some research, and I am simply seeking real-world examples, experiences, technical expert opinions and suggestions, any tips or tricks, or any problems people may have faced while doing something similar with these products.
    We (our company) are planning to deploy Symantec Endpoint Protection and Symantec Desktop Recovery 2011 Desktop Edition to our 3000-4000 workstations (Windows 7, 32- and 64-bit), with a few hundred on Windows XP 32/64-bit. I have read the implementation guide for SEP and the tech notes for Desktop Recovery 2011. Our team has planned to deploy this as follows:
    - 1 x dedicated SQL 2008 R2 for Symantec Endpoint Protection (instead of using the embedded database)
    - 1 x dedicated SQL 2008 R2 for Symantec Desktop Recovery 2011 (instead of using the embedded database)
    - 1 x dedicated W2K8 R2 box for the SEPM (Symantec Endpoint Protection Manager management application)
    - 1 x dedicated W2K8 R2 box for the Symantec Desktop Recovery 2011 management application
    Agent deployment: per the Symantec documentation for both products, an agent can be pushed via the management application (provided no firewalls are blocking the required ports; we already have the Windows firewall disabled). Server hardware: each SQL server gets 16GB RAM, SAS disks and dual Xeons, with RAID-10 for the SQL DB, or I can always mount a LUN from our existing Hitachi or EMC SAN; the SEPM server and the System Recovery management server each get 16GB RAM, SAS disks and dual Xeons. That is the initial plan we have for 3000-4000 Windows client workstations. Now my questions :-)
    a) If these users were distributed across two sites, with an AD DC/GC in each site, how would I restrict the SEPM and desktop management solutions to only check for users in their respective site?
    b) At present all users are in one building, but we are going to move some departments to a new location (with dedicated connectivity). How would we control which SEPM/management server is responsible for which site?
    c) We have NetBackup in our environment backing up other servers. I am planning to protect these four boxes (2 x SQL, 1 x SEPM, 1 x System Recovery management server) via NetBackup, or I can use System Recovery 2011 Server Edition on all four of them as well (licensing is not an issue, as the complete Symantec portfolio is included in our licence).
    d) Saving desktop backups: what strategies have you implemented? Any best-practice recommendations for a large user base? I was thinking of either mounting a LUN from our Hitachi SAN on the Symantec Recovery Server itself, or backing up to the user's hard drive locally and then copying it over to a network location.
    Suggestions welcome :-) If you have anything to add or correct, that will be really helpful before we dive into the actual implementation phase. I will be most grateful for your suggestions, recommendations and corrections. Many thanks!

  • Outlook 2010: When sending message on behalf of someone else, store that message in the other person's Sent Items folder

    - by Helge Klein
    We have an e-mail account for support purposes which is tended to by multiple members of the team. When answering a support e-mail we obviously choose the support account as sender. Still, the answer is not stored in the support account's Sent Items folder, but in the Sent Items of the person actually answering. This behavior, which seems to be by design, prevents others from gaining access to the entire conversation and potentially causes multiple answers. I am looking for an automated way of moving e-mails sent on behalf of someone else to that person's Sent Items folder. I tried to create a rule for this but could not find the right setting.

  • Easier way to create floppy disk images?

    - by Bryan
    I'm using Vyatta routers with KVM and want to attach a floppy drive with a config file for Vyatta when I boot the image. I'll be doing this over and over again, and as such am looking for an automated way of creating the floppy images. Right now, I'm doing the following:
    1. Create the floppy image with qemu-img create
    2. Format the floppy image with mkdosfs
    3. Mount the floppy image with mount -t fat /tmp/floppy.img /media/floppy
    4. Populate the floppy image with cp -r /tmp/configs/ /media/floppy/
    5. Unmount the floppy image with umount /media/floppy
    6. Save the floppy image with mv /tmp/floppy.img ~/floppies/
    Any chance there's an easier way to do this? Perhaps a shortcut application that I can give a directory to, and it will do all this for me without having to mount the image?
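
    A minimal sketch that simply wraps those six manual steps into one script, parameterised on the config directory and the output image path. It still mounts the image, so it needs root (run with sudo); it also uses the more usual "vfat" filesystem type and an explicit loop mount. Avoiding the mount entirely would be possible with mtools (mcopy), but that is not shown here.

```python
# Hedged sketch: automate the six manual steps from the question. Still mounts the
# image, so run with sudo; uses "vfat" and an explicit loop mount in place of "-t fat".
import subprocess
import sys
from pathlib import Path

def run(*cmd):
    subprocess.run(cmd, check=True)

def make_floppy(config_dir, out_image,
                tmp_image="/tmp/floppy.img", mount_point="/media/floppy"):
    run("qemu-img", "create", "-f", "raw", tmp_image, "1440K")
    run("mkdosfs", tmp_image)
    run("mount", "-t", "vfat", "-o", "loop", tmp_image, mount_point)
    try:
        run("cp", "-r", config_dir, mount_point)
    finally:
        run("umount", mount_point)
    out = Path(out_image).expanduser()
    out.parent.mkdir(parents=True, exist_ok=True)
    run("mv", tmp_image, str(out))

if __name__ == "__main__":
    # usage: sudo python make_floppy.py /tmp/configs ~/floppies/router1.img
    make_floppy(sys.argv[1], sys.argv[2])
```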

  • CLI-Based monitoring tool for KVM

    - by Pinnacle
    I am developing a scheduler for running VMs on KVM. The scheduler over-commits resources like memory and CPU. For this, I need a CLI-based monitoring tool that keeps giving me information about the resource usage of each VM, because it might be the case that, due to over-provisioning of resources, VMs on a particular host are running very slowly (depending on the benchmarks/programs each VM is running), and then I need to migrate a VM to another host, and so on. I looked into libvirt-based tools like collectd, Munin, Nagios-virt, etc. ( http://libvirt.org/apps.html#monitoring ) I also looked into the Ubuntu utility perf-kvm ( http://manpages.ubuntu.com/manpages/maverick/man1/perf-kvm.1.html ) I want to ask which CLI-based tool would be recommended by the community, so that I can make an automated scheduler that takes care of the above situation.
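
    Since a custom scheduler is being written anyway, one option is to poll libvirt directly rather than wrap an external CLI tool. A minimal sketch with the libvirt Python bindings, sampling each running domain twice a few seconds apart and printing per-VM CPU utilisation and resident memory; the connection URI and interval are assumptions.

```python
# Hedged sketch: poll libvirt directly via its Python bindings; sample each running
# domain twice, INTERVAL seconds apart, and print per-VM CPU utilisation and RSS.
# Connection URI and interval are assumptions.
import time

import libvirt

INTERVAL = 5   # seconds between the two samples

conn = libvirt.openReadOnly("qemu:///system")
domains = [d for d in conn.listAllDomains() if d.isActive()]

first_cpu = {d.name(): d.info()[4] for d in domains}   # info()[4] = cumulative CPU time (ns)
time.sleep(INTERVAL)

print(f"{'domain':20} {'vCPUs':>5} {'CPU %':>7} {'RSS MiB':>8}")
for d in domains:
    state, max_mem, mem, vcpus, cpu_time = d.info()
    cpu_pct = (cpu_time - first_cpu[d.name()]) / (INTERVAL * vcpus * 1e9) * 100
    rss_kib = d.memoryStats().get("rss", mem)           # falls back to balloon size
    print(f"{d.name():20} {vcpus:>5} {cpu_pct:>7.1f} {rss_kib / 1024:>8.0f}")
```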

  • SVN command that returns whether a user has a valid login for a repository?

    - by Zárate
    Hi there, I'm trying to find an SVN command that would return some kind of true/false value depending on whether the user running it has access to a given repository. I'm building a tool for automated deployment, and part of the process is checking out the code from the SVN repository. I'd like to find out if the user running the tool already has a valid login. If there's no valid login, just show a message and exit the tool (handling the SVN login internally is not an option at the moment). Plan B would be parsing the files in svn.simple looking for the repo URL, but I thought I'd ask first. Thanks, Juan
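
    A minimal sketch of that true/false check, assuming the Subversion command-line client is installed: run svn info against the repository with --non-interactive, so it can only succeed with credentials already cached for the current user, and treat a non-zero exit code as "no valid login" (or no access). The repository URL is a placeholder.

```python
# Hedged sketch: "svn info --non-interactive" succeeds only with already-cached
# credentials, so a non-zero exit code means no valid login (or no access).
# The repository URL is a placeholder.
import subprocess

REPO_URL = "https://svn.example.com/repos/project/trunk"   # placeholder

def has_valid_svn_login(url):
    result = subprocess.run(
        ["svn", "info", "--non-interactive", url],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

if __name__ == "__main__":
    if not has_valid_svn_login(REPO_URL):
        raise SystemExit("No valid SVN login for the deployment repository; "
                         "log in once with the svn client to cache credentials.")
```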
