Search Results

Search found 19074 results on 763 pages for 'secure government government cloud security'.


  • What are the requirements for Windows Remote Assistance over Teredo?

    - by Jens
    I am trying to get the Windows 7 (or Vista) Remote Assistance feature to work without using UPnP on the novice's computer. After enabling Teredo on the expert's computer (which is in a corporate network, and therefore has Teredo disabled by default), I tried to connect to the novice using both Easy Connect and the invitation file, with no success. My troubleshooting so far:

      - A connection to the novice from my home PC was successful, hinting at a misconfiguration on the expert's side.
      - Both computers have a "qualified" connection to the Teredo server.
      - Both computers have a valid Teredo IP, have access to the Global_ PNRP cloud, and can resolve names registered with PNRP on the other computer.
      - The expert can resolve the PNRP ID automatically generated by an Easy Connect help request.
      - Both computers can ping the other's PNRP name.
      - Both computers can ping the other's Teredo IP address using ping -6.

    Now I am a little stumped. I expected Remote Assistance to work at this point, since my corporate firewall has no Teredo filtering. What could cause RA not to work in this setting? Thanks in advance!
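
    A quick way to compare Teredo and PNRP state on both machines (a diagnostic sketch using the standard built-in netsh commands on Vista/7):

        netsh interface teredo show state
        netsh p2p pnrp cloud show list

    The first should report a "qualified" state and the configured Teredo server; the second should show the Global_ cloud as active. If the expert's machine sits behind a managed corporate network, it may also be worth trying "netsh interface teredo set state enterpriseclient", which is the mode intended for clients behind enterprise NATs and firewalls.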

    Read the article

  • Why do hosts prefer Linux to Windows Server?

    - by iconiK
    So far I see a HUGE majority of hosts providing only Linux shared hosting, offering Windows only for VPS (or even only for dedicated servers). Why is that? While Windows is a lot more expensive than Linux (though it depends on a lot of factors, not just initial and support license cost), it also provides ASP.NET, IIS and, of course, Microsoft SQL Server. I know in the past it might have been because cPanel was Linux-only, but now it has a Windows version. So why is Linux still predominantly used for shared hosting?

      - PHP works on both systems.
      - IIS can be (and probably is) faster.
      - MySQL runs on both systems as well.
      - cPanel has a Windows version.
      - Python, Perl and Ruby all run on Windows as well.
      - You even have MS SQL Server Express, which I find superior to MySQL in both speed and features. Access is there for low-usage requirements, as is SQLite (which is so great for quick small stuff).
      - With PowerShell you have a good alternative to the Unix shell.

    EDIT: I am looking for common reasons; I realize each hosting company (and/or its clients) may have different needs. This becomes very important when you get to VPS or cloud offerings, which give you a full operating system to use.

    Read the article

  • Is Ksplice production-ready?

    - by faultyserver
    I would be interested to hear the Server Fault community's experiences with Ksplice in production. A quick blurb from Wikipedia:

        Ksplice is a free and open source extension of the Linux kernel which allows system administrators to apply security patches to a running kernel without having to reboot the operating system.

    and:

        Ksplice can, without restarting the kernel, apply any source code patch that only needs to modify the kernel code. Unlike other hot update systems, Ksplice takes as input only a unified diff and the original kernel source code, and it updates the running kernel correctly, with no further human assistance required. Additionally, taking advantage of Ksplice does not require any preparation before the system is originally booted (the running kernel does not need to have been specially compiled, for example). In order to generate an update, Ksplice must determine what code within the kernel has been changed by the source code patch.

    So, a few questions: How has the stability been? Have you encountered any odd issues with its "rebootless live patching" of the kernel? Any kernel panics or horror stories? I have been running it on a few test systems and so far it has been working as advertised, but I am interested in what other sysadmins' experiences with Ksplice have been before going all in and deploying it on our production servers. So, anybody using Ksplice in production?

    UPDATE: Hmm, not seeing any real activity on this question after a couple of hours (besides some kind upvotes and favs). Maybe to spark some activity I'll ask a few more questions and see if we can get this discussion going:

      - If you are aware of Ksplice, is there a reason you are not using it?
      - Do you feel it is still too bleeding-edge, unproven or untested?
      - Does Ksplice not fit well within your current patch-management system?
      - Do you hate having systems with long (and secure) uptimes? ;-)
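
    For anyone evaluating it, the day-to-day workflow is small (a sketch assuming Ksplice's Uptrack client is installed and registered):

        uptrack-upgrade -y    # download and apply all available rebootless kernel updates
        uptrack-show          # list the patches currently applied to the running kernel

    uptrack-uname -r is also handy, since plain uname -r keeps reporting the version of the kernel that was originally booted.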

    Read the article

  • Add single sign-on into existing web app

    - by EvilDr
    Apologies if this isn't the best site; I've searched for an answer but can't find anything quite right. I don't actually know the correct terminology I should be using here, so any pointers will be appreciated. I have a web application that is accessed by many different users across different organisations. Access is provided by each user having a unique username/password, which is stored in SQL (the database fields are customerID, userID, username). Some organisations are now asking if we can change this to allow "Active Directory single sign-on", so that users don't need to remember yet another set of login details. From research I can see how this is achieved using OAuth and Google (etc.), but I know hardly anything about AD and can't find much information on this (again, I'm sure it helps when you know the terminology). Is this request even possible to achieve, given that most users will be from different (and unrelated) organisations? I saw on a Microsoft Build video not long ago that there is some kind of replication service for AD to allow cloud authentication. Is this what I should be aiming for?

    Read the article

  • Running $ORIGIN linked binaries from setuid scripts on linux

    - by drscroogemcduck
    I'm using suidperl to run some programs that require root permissions. However, the runtime linker won't expand library paths which contain $ORIGIN entries, so the programs I want to run (jstack from Java) won't run. More info here:

        There is one exception to the advice to make heavy use of $ORIGIN. The runtime linker will not expand tokens like $ORIGIN for secure (setuid) applications. This should not be a problem in the vast majority of cases.

    My program looks something like this:

        #!/usr/bin/perl
        $ENV{PATH} = "/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/java/jdk1.6.0_12/bin:/root/bin";
        $ENV{JAVA_HOME} = "/usr/java/jdk1.6.0_12";
        open(FILE, '/var/run/kil.pid');
        $pid = <FILE>;
        close(FILE);
        chomp($pid);
        if ($pid =~ /^(\d+)/) {
            $pid = $1;
        } else {
            die 'nopid';
        }
        system("/usr/java/jdk1.6.0_12/bin/jstack", "$pid");

    Is there any way to fork off a child process so that the linker will work correctly?
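
    One approach worth trying (a sketch, not verified under suidperl): the child that system() spawns inherits the script's mismatched real/effective UIDs, so the linker flags it for secure execution too and refuses to expand $ORIGIN. Making the real IDs match the effective IDs before spawning jstack should clear that flag:

        # make real IDs match effective IDs so the child spawned by system()
        # is not marked for secure execution, letting the runtime linker
        # expand $ORIGIN in jstack's library path normally
        $( = $);    # real gid := effective gid
        $< = $>;    # real uid := effective uid
        system("/usr/java/jdk1.6.0_12/bin/jstack", "$pid");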

    Read the article

  • A single AD user can't log into a single Mac bound to the domain (DirectoryServices error). How can I resolve this?

    - by Ben Wyatt
    On our campus, we have about 60 Macs joined to our Active Directory domain. Most users have no problems logging into the Macs, as long as their accounts are configured correctly. However, we have one particular user who is unable to log in to just some of them. He has no problem with most of them, but there is one group (all built from the same image) that he can't log in to. The machine in question is running OS X 10.6.2. The relevant entries from secure.log are below, with the hostname and username redacted.

        Aug 16 10:32:43 hostname SecurityAgent[4411]: Could not get the user record for username from DirectoryServices.
        Aug 16 10:32:43 hostname SecurityAgent[4411]: Will sleep 1 seconds and try again (retryCount = 4)
        Aug 16 10:32:44 hostname SecurityAgent[4411]: Could not get the user record for username from DirectoryServices.
        Aug 16 10:32:44 hostname SecurityAgent[4411]: Will sleep 2 seconds and try again (retryCount = 3)
        Aug 16 10:32:46 hostname SecurityAgent[4411]: Could not get the user record for username from DirectoryServices.
        Aug 16 10:32:46 hostname SecurityAgent[4411]: Will sleep 4 seconds and try again (retryCount = 2)
        Aug 16 10:33:10 hostname SecurityAgent[4411]: Could not get the user record for username from DirectoryServices.
        Aug 16 10:33:10 hostname SecurityAgent[4411]: Will sleep 8 seconds and try again (retryCount = 1)
        Aug 16 10:33:18 hostname SecurityAgent[4411]: User info context values set for username
        Aug 16 10:33:18 hostname SecurityAgent[4411]: unknown-user (username) login attempt PASSED for auditing

    Everything I've found online suggests that our use of Mobile Accounts is causing the issue. I turned that feature off, but I still can't log in as that user. id returns a record for his account, and nothing looks out of the ordinary. Has anyone here run into this before?
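
    A useful check directly on an affected machine (a sketch; the Active Directory node name is a placeholder for your actual domain):

        id username                                    # resolves through the full Directory Services search path
        dscl /Search -read /Users/username             # does the search path return the record?
        dscl "/Active Directory/YOURDOMAIN" -read /Users/username    # query the AD node directly

    If id succeeds but the dscl reads stall or fail intermittently, the retry pattern in secure.log points at DirectoryServices timing out against AD on those machines rather than at a bad user record.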

    Read the article

  • Knowledge and user-generated content management system to track files, research, proposals, etc.?

    - by Eshwar
    I'll try to keep it short. Here's the scenario: we have employees all over the world performing similar work, i.e. research, generating PowerPoint slides, Word documents, graphics, etc. A lot of this previous work can often be reused for a future project. The current arrangement is email and phone calls, which, as you would agree, is quick if you know where to look but otherwise archaic and very, very inefficient. So I am looking for software that will allow me to do the following:

      - Tag files, e.g. an investor presentation on cellphone usage in Kenya would be tagged investor, cellphone, kenya.
      - Manage references, e.g. if we read something on the internet, we should be able to paste that link in some fashion and tag it as above.
      - Preferably cloud-based, so that it can be accessed by anybody; additionally it would be nice (though NOT a must) to have access levels (director, manager, everyone).
      - A nice interface that non-technically-savvy folks can warm up to. ;)
      - A desktop app would be handy, so that people don't always have to click upload or something.
      - Not tree-based: a tree-based system is inefficient in this case because content is usually linked across branches, and people might not agree on one format of a tree. Tagging works around this very nicely.

    What I have considered so far:

      - Evernote (for its more professional look)
      - Springpad (for its versatility with content)
      - Mendeley (a research manager and in some ways ideal, but I fear it is limited to PDFs)

    The goal is that when somebody wants to look for a document, they don't have to ask a colleague; they can just search with keywords and all relevant information shows up. Thanks!

    Read the article

  • Can any iSCSI NAS appliance replicate / clone a LUN to an external drive?

    - by Boden
    I would like to back up using Windows Imaging to some kind of NAS appliance. I believe this will require the NAS to support iSCSI. I would then like the appliance to support replication of the iSCSI LUN to an external eSATA or USB disk connected directly to the appliance. I've found plenty of NAS appliances that can do iSCSI, and plenty that can replicate to an external drive, but none so far that can do both at once: the devices can do iSCSI, but then the replication feature doesn't work. The idea here is to back up to an appliance located in a secure office far away from the server room. Offsite backups to an external hard drive could then be managed from the appliance. The benefits of such a setup would be:

      1) It is very unlikely that fire or random theft would affect both the server-room backup and the "remote" backup appliance.
      2) Offsite backups could be managed by multiple trusted people without granting access to the server room.
      3) Windows Imaging provides a poor man's deduplication, so each backup volume can contain a decent backup history.

    I understand why this would be a non-trivial thing to implement, but I'm wondering if such a thing exists? Preferably a tabletop, low- to medium-cost device. Alternative solutions welcome. NOTE: I'm backing up very few but very large files, so file replication is not a good option.

    Read the article

  • Setting up SSL on JBoss 5

    - by socal_javaguy
    How can I enable SSL on JBoss 5 on a Linux (Red Hat - Fedora 8) box? What I've done so far:

    (1) Created a test keystore.

    (2) Placed the newly generated server.keystore in $JBOSS_HOME/server/default/conf.

    (3) Made the following change to server.xml in $JBOSS_HOME/server/default/deploy/jbossweb.sar:

        <!-- SSL/TLS Connector configuration using the admin devl guide keystore -->
        <Connector protocol="HTTP/1.1" SSLEnabled="true"
            port="8443" address="${jboss.bind.address}"
            scheme="https" secure="true" clientAuth="false"
            keystoreFile="${jboss.server.home.dir}/conf/server.keystore"
            keystorePass="mypassword" sslProtocol="TLS" />

    (4) The problem is that when JBoss starts it logs this exception (during start-up), though I am still able to view everything under http://localhost:8080/:

        03:59:54,780 ERROR [Http11Protocol] Error initializing endpoint
        java.io.IOException: Cannot recover key
            at org.apache.tomcat.util.net.jsse.JSSESocketFactory.init(JSSESocketFactory.java:456)
            at org.apache.tomcat.util.net.jsse.JSSESocketFactory.createSocket(JSSESocketFactory.java:139)
            at org.apache.tomcat.util.net.JIoEndpoint.init(JIoEndpoint.java:498)
            at org.apache.coyote.http11.Http11Protocol.init(Http11Protocol.java:175)
            at org.apache.catalina.connector.Connector.initialize(Connector.java:1029)
            at org.apache.catalina.core.StandardService.initialize(StandardService.java:683)
            at org.apache.catalina.core.StandardServer.initialize(StandardServer.java:821)
            at org.jboss.web.tomcat.service.deployers.TomcatService.startService(TomcatService.java:313)

    I do know that there's more to be done to enable full SSL client authentication...
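
    For what it's worth, "Cannot recover key" from JSSESocketFactory typically means the private key's password inside the keystore differs from the keystore password; the connector above supplies only keystorePass, which JBossWeb/Tomcat then uses for both. A hedged fix is to set the key password equal to the store password (the alias below is a placeholder; keytool's default alias is "mykey"):

        keytool -keypasswd -alias mykey -keystore server.keystore \
                -storepass mypassword -new mypassword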

    Read the article

  • Setting up MySQL Linux slave with a Windows master

    - by philwilks
    I'm running a MySQL 5.0 database server on Windows Server 2008. The total size of the database is about 1 GB. I make daily backups, but I'd like to step up to having a slave server for extra protection. My thinking was that I wouldn't need the expense of a Windows machine to do this; a Linux "cloud server" from Rackspace would do the job well for quite a low cost. However, I have little experience with Linux, so I have a few questions:

      - Does this sound like a good idea?
      - Is there anything wrong with linking Windows and Linux MySQL servers?
      - Does Linux have the equivalent of Remote Desktop Connection? If so, can I use it from a Windows machine?
      - Would a particular Linux distro be well suited to this task? Rackspace offers Arch Linux, CentOS, Debian, Fedora and Ubuntu. My immediate thinking is to go with Ubuntu, as I've heard it's friendlier for people coming from a Windows background.

    Any comments you have would be very appreciated! Phil
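
    Mixed-platform replication is supported; MySQL's binary log format does not depend on the operating system. A minimal sketch of pointing the Linux slave at the Windows master (host, credentials and binlog coordinates are placeholders; the two servers need distinct server-id values and the master needs log-bin enabled in my.ini):

        mysql -u root -p <<'SQL'
        CHANGE MASTER TO
            MASTER_HOST='master.example.com',
            MASTER_USER='repl',
            MASTER_PASSWORD='secret',
            MASTER_LOG_FILE='mysql-bin.000001',
            MASTER_LOG_POS=4;
        START SLAVE;
        SQL

    As for remote administration, SSH (e.g. via PuTTY from Windows) is the usual Linux equivalent of Remote Desktop for server work.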

    Read the article

  • Cannot Access Shared Folder From IIS

    - by Tim Scott
    From IIS I need to access a folder on another computer. Both servers are Windows 2008 SP2, and they live in a Virtual Private Cloud on Amazon EC2. They reach one another by private IP; they are in WORKGROUP, not a domain. I can access the shared folder manually when logged in to the client as Administrator, but IIS gets "access denied". Here's what I have done:

      - Set File Sharing = ON
      - Set Password Protected Sharing = OFF
      - Set Public Folder Sharing = ON
      - Shared the folder
      - Added permission to the share: Everyone, Full Control
      - Added permission to the share: NETWORK SERVICE, Full Control
      - Verified that File & Printer Sharing is checked in Windows Firewall
      - Opened port 445 to inbound traffic from local sources

    I tried adding <remote-machine-name>\NETWORK SERVICE to the share, but it says it does not recognize the machine, which makes sense, I guess. As I said, from the other computer I have no trouble accessing the shared folder from my user account, but IIS is shut out. How does the file server even know the difference? I would assume that with Everyone given full control and password-protected sharing turned off, it would not matter what the client user account is. In any case, how do I solve this? UPDATE: To clarify, I am not trying to serve up files on the share directly through IIS. Rather, I am writing files to the share from my code (System.IO).

    Read the article

  • Someone used or hacked my computer to commit a crime? What defense do I have?

    - by srguws
    Hello, I need IMMEDIATE help on a computer crime that I was arrested for. It may involve my computer, my IP, and my ex-girlfriend being the true criminal. The police do not tell you much; they are very vague. I was charged, though! So my questions are: if someone did use my computer at my house and business to post a rude Craigslist ad about a friend of my girlfriend at the time, from a fake email address, how can I be the ONLY suspect? And how can I be charged? I noticed over the last few days that there are many ways to use other people's computers, connections, etc. Here are a few things I found: you can steal or illegally use an IP address or MAC address; a dynamic IP is less secure and more vulnerable than a static one; people can sidejack and spoof your MAC, IP, etc.; and there is another thing called ARP spoofing. I am sure there are more, but how can I prove that this happened to me, or didn't happen to me? The police contacted Craigslist, the victim, AOL, and the two ISP companies. They say they traced the IPs to my business and my home. My ex, who I lived with and had a business with, has access to the computers and the keys to both buildings. My brother also lives and works with me. My business has many teenagers who use the computer and Wi-Fi. My brother is a college kid and also has friends over the house, and they use the computer freely. So how can they say it was me, because of an angry ex-girlfriend?

    Read the article

  • What are the problems and pitfalls of a public-facing Active Directory?

    - by Ralph Shillington
    The situation I'm faced with is this: we plan on using a number of server applications hosted on Amazon EC2 machines, mainly Microsoft Team Foundation Server. These services rely heavily on Active Directory. Since our servers are in the Amazon cloud, it should go without saying (but I will) that all our users are remote. It seems that we can't set up VPN on our EC2 instance, so the users will have to join the domain directly over the internet; then they'll be able to authenticate and, once authenticated, use that token for accessing resources such as TFS. On the DC instance, I can shut down all ports except those needed for joining/authenticating to the domain. I can also filter the IPs on that machine to just the addresses we expect our users to be at (it's a small group). On the web-based application servers, I imagine all we need to open is port 80 (or 8080 in the case of TFS). One of the problems I'm faced with is what domain name to use for this Active Directory: should I go with "ourDomainName.com" or "ourDomainName.local"? If I choose the latter, does that not mean I'll have to get all our users to change their DNS address to point to our server, so it can resolve the domain name? (I guess I could also distribute a hosts file.) Perhaps there is another alternative that I'm completely missing.

    Read the article

  • Blocking an IP in Webmin

    - by Dan J
    I've been checking my /var/log/secure log recently and have seen the same bot trying to brute-force onto my CentOS server running Webmin. I created a chain and rule in Networking - Linux Firewall: Drop if source is 113.106.88.146. But I'm still seeing the attempted logins in the log:

        Jun 6 10:52:18 CentOS5 sshd[9711]: pam_unix(sshd:auth): check pass; user unknown
        Jun 6 10:52:18 CentOS5 sshd[9711]: pam_succeed_if(sshd:auth): error retrieving information about user larry
        Jun 6 10:52:19 CentOS5 sshd[9711]: Failed password for invalid user larry from 113.106.88.146 port 49328 ssh2

    Here is the contents of /etc/sysconfig/iptables:

        # Generated by webmin
        *filter
        :banned-ips - [0:0]
        -A INPUT -p udp -m udp --dport ftp-data -j ACCEPT
        -A INPUT -p udp -m udp --dport ftp -j ACCEPT
        -A INPUT -p udp -m udp --dport domain -j ACCEPT
        -A INPUT -p tcp -m tcp --dport 20000 -j ACCEPT
        -A INPUT -p tcp -m tcp --dport 10000 -j ACCEPT
        -A INPUT -p tcp -m tcp --dport https -j ACCEPT
        -A INPUT -p tcp -m tcp --dport http -j ACCEPT
        -A INPUT -p tcp -m tcp --dport imaps -j ACCEPT
        -A INPUT -p tcp -m tcp --dport imap -j ACCEPT
        -A INPUT -p tcp -m tcp --dport pop3s -j ACCEPT
        -A INPUT -p tcp -m tcp --dport pop3 -j ACCEPT
        -A INPUT -p tcp -m tcp --dport ftp-data -j ACCEPT
        -A INPUT -p tcp -m tcp --dport ftp -j ACCEPT
        -A INPUT -p tcp -m tcp --dport domain -j ACCEPT
        -A INPUT -p tcp -m tcp --dport smtp -j ACCEPT
        -A INPUT -p tcp -m tcp --dport ssh -j ACCEPT
        -A banned-ips -s 113.106.88.146 -j DROP
        COMMIT
        # Completed
        # Generated by webmin
        *mangle
        :FORWARD ACCEPT [0:0]
        :INPUT ACCEPT [0:0]
        :OUTPUT ACCEPT [0:0]
        :PREROUTING ACCEPT [0:0]
        :POSTROUTING ACCEPT [0:0]
        COMMIT
        # Completed
        # Generated by webmin
        *nat
        :OUTPUT ACCEPT [0:0]
        :PREROUTING ACCEPT [0:0]
        :POSTROUTING ACCEPT [0:0]
        COMMIT
        # Completed
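
    Looking at the ruleset, the banned-ips chain is defined and holds the DROP rule, but nothing in the INPUT chain ever jumps to it, so the ssh ACCEPT rule matches first and the ban never applies. A sketch of the likely fix (run as root, then re-save the firewall in Webmin):

        # send all inbound traffic through the banned-ips chain before any ACCEPT rules
        iptables -I INPUT 1 -j banned-ips
        service iptables save    # persist on CentOS, assuming the stock iptables init script

    In Webmin terms, this corresponds to adding a rule at the top of "Incoming packets (INPUT)" whose action is to run the banned-ips chain.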

    Read the article

  • Software to monitor bill payment to mission-critical IT service providers (ISP, DNS, etc.)

    - by Sholom
    Hi all. The problem: our very likable but absent-minded bookkeeper keeps neglecting to pay our IT vendors on time. Just this past week our internet service was disconnected. The same could happen to many other mission-critical accounts (domain registrar, backup MX, anti-virus license, HackerSafe (McAfee Secure) service and even an 800 number, to name a few). As the sysadmin, I monitor my servers to make sure they are plugged into the power outlet. I believe I should also monitor my services to make sure they are plugged into their money outlet. To compound the problem, when the power goes out someone else will likely notice and notify me, but if a bill is not paid, no one will notice until service is lost. Lost as in losing our domain name, which would cause a lot more damage than the power failing on our server.

    [Solution] = [Doesn't work because]:

      - Retrain the bookkeeper = wishful thinking.
      - Notify my manager = already have (via email); protects me, does not solve the problem.
      - Fire the bookkeeper = what makes you so sure the next one will never forget?

    Bottom line: humans are humans, and sooner or later something critical will be royally messed up. We need to partner with a machine to help us out here. Anybody have the same problem? What software/solution do you use? I would like software that emails me when a bill is past due, just like I get an email when the power outlet fails. Anyone heard of anything like that? Thanks
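
    Absent a dedicated product, even a crude scheduled check covers the basics. A minimal sketch (the file layout, path and address are hypothetical: one line per vendor, formatted as YYYY-MM-DD<TAB>description of the next due date):

        #!/bin/bash
        # mail a reminder for every bill whose due date is within the next week
        today=$(date +%s)
        while IFS=$'\t' read -r due desc; do
            if [ "$(date -d "$due" +%s)" -le $((today + 7*86400)) ]; then
                echo "Bill due $due: $desc" | mail -s "Upcoming bill: $desc" [email protected]
            fi
        done < /etc/bills.tsv

    Run daily from cron; updating the date on a line after each payment becomes the bookkeeper's one obligation.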

    Read the article

  • Having Trouble Ripping Some CDs

    - by James
    Hi. When I buy CDs I tend to rip them to FLAC right away. When ripping I use foobar2000 or Exact Audio Copy and enable secure ripping, which uses error correction. Recently I bought a two-CD compilation album brand new, but when I tried to rip the second CD on my laptop using foobar2000 it struggled with the last two tracks and was unable to finish. EAC was also unable to get an accurate rip and reports read errors. Ripping in fast mode results in audible errors in the output track. I have tried another computer and am having similar problems. I cannot see any damage to the disc, and it has not been dropped or anything. The weird thing is that I had similar problems with a different album and a different PC a while back. That other CD was also a compilation disc, so it was also right up to the CD capacity limit, and again it was the last few tracks that would not rip. Dozens of other discs have ripped fine. So I am wondering if the CD is simply defective, or whether it is something else. How common are defective CDs? Do some CD drives struggle with CDs of this capacity? Or is this some kind of copy protection? I'm thinking of asking Amazon for a replacement, but it would be annoying if I get the same problem again.

    Read the article

  • Need a recommendation for shared storage on auto-scaling EC2 with Scalr

    - by john h.
    I have come across so many answers to this question that I am completely lost! I am moving our two sites to a load-balanced EC2 system with Scalr as our cloud manager. Now the question comes up of persistent storage for users' uploaded content and other files. Could someone please give me a suggestion, and possibly a link to a tutorial, for the following setup and goals:

      - 2 websites (1 forum, 1 e-commerce)
      - 1 load balancer
      - 1 app server (to scale out to as many as needed)
      - 1 DB server (to scale out to as many as needed)

    Our sites will need to autoscale, and according to what I am learning about Scalr, that means as new instances load up I need to run a script to set the basics up on each server (git, PHP mods, pull the site from git, move keys, etc.). What I don't understand is how I should handle user-uploaded content like profile pictures, avatars, product images, themes, etc. Do I mount an EBS or s3fs folder to hold the websites (maybe /var/www/websitefolder), or do I do something like mount just the avatar folders (/var/www/websitefolder/images/avatars)? I am not sure where to go with this. Could someone give me some detailed help? -John
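
    One common pattern (a sketch assuming the s3fs-fuse tool, an existing S3 bucket, and credentials already configured in ~/.passwd-s3fs; all names are placeholders) is to keep the code on each instance and mount only the upload directories from S3, so every autoscaled instance sees the same files:

        # in the Scalr bootstrap script, after the site is pulled from git
        s3fs my-uploads-bucket /var/www/websitefolder/images/avatars \
            -o allow_other -o use_cache=/tmp/s3fs

    EBS, by contrast, attaches to only one instance at a time, which makes it awkward for content shared across an autoscaling group.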

    Read the article

  • Production deployment to EC2 with minimal downtime

    - by jensendarren
    I have a simple web application deployed on a large instance with EC2. I now want to deploy the latest code to this server, but I want to do it in a way which minimizes downtime and is as smooth as possible for the end user. Here is my plan:

      1. Fire up another large instance.
      2. Install all the software layers on that instance.
      3. Restore and attach an EBS drive to the instance.
      4. Deploy our latest production-ready code on the new instance.
      5. Run all tests (including manual testing of the application).
      6. (If tests pass) Put a "Site Under Maintenance" notice on the live site.
      7. Back up the EBS volume on the live site.
      8. Detach the EBS volume from the new server and replace it with the latest backup.
      9. Use ec2-associate-address to move the IP address to the new instance.
      10. Sit back and wait for traffic to start flowing through the new instance.
      11. Terminate the old instance.

    Does this seem like a good strategy? Are there any tutorials or books that might cover this topic? I have already read Cloud Application Architectures by George Reese, which is an excellent book, but it does not cover deployment. Additionally, I know there are tools that can help with this, like RightScale or enStratus, which I will use when I start using more than one instance.
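
    Step 9 is the pivotal moment and is pleasantly small (a sketch using the classic EC2 API tools; the Elastic IP and instance ID are placeholders):

        ec2-associate-address 203.0.113.10 -i i-0123abcd

    Because Elastic IP reassociation takes effect within seconds and the old instance keeps running until you terminate it, this gives a near-instant cutover with an equally easy rollback: associate the address back to the old instance.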

    Read the article

  • Backup script to FTP with timed subfolders

    - by Frederik Nielsen
    I want to make a backup script, that makes a .tar.gz of a folder I define, say fx /root/tekkit/world This .tar.gz file should then be uploaded to a FTP server, named by the time it was uploaded, for example: 07-10-2012-13-00.tar.gz How should such backup script be written? I already figured out the .tar.gz part - just need the naming and the uploading to FTP. I know that FTP is not the most secure way to do it, but as it is non-sensitive data, and FTP is the only option I have, it will do. Edit: I ended up with this script: #!/bin/bash # have some path predefined for backup unless one is provided as first argument BACKUP_DIR="/root/tekkit/world/" TMP_DIR="/tmp/tekkitbackup/" FINISH_DIR="/tmp/tekkitfinished/" # construct name for our archive TIME=$(date +%d-%m-%Y-%H-%M) if [ $1 ]; then BACKUP_DIR="$1" fi echo "Backing up dir ... $BACKUP_DIR" mkdir $TMP_DIR cp -R $BACKUP_DIR $TMP_DIR cd $FINISH_DIR tar czvfp tekkit-$TIME.tar.gz -C $TMP_DIR . # create upload script for lftp cat <<EOF> lftp.upload.script open server user user password lcd $FINISH_DIR mput tekkit-$TIME.tar.gz exit EOF # start backup using lftp and script we created; if all went well print simple message and clean up lftp -f lftp.upload.script && ( echo Upload successfull ; rm lftp.upload.script )

    Read the article

  • Exchange 2010 sends out spam.

    - by Magnus Gladh
    Hi. I have an Exchange Server 2010 that uses a smart host to send out mail. A day ago the owner of the smart host contacted us and told us that we are sending out spam. I have tried different open-relay tests on the net, and all of them come back saying that this server is secured and cannot be used as a relay server. But I can see in my Exchange Queue Viewer that new messages keep coming in. Here is an example of how it looks:

        Identity: mailserver\3874\13128
        Subject: Olevererbart:: [email protected] Pfizer -75% now
        Internet Message ID: <[email protected]>
        From Address: <>
        Status: Ready
        Size (KB): 6
        Message Source Name: DSN
        Source IP: 255.255.255.255
        SCL: -1
        Date Received: 2010-12-09 21:46:22
        Expiration Time: 2010-12-11 21:46:22
        Last Error:
        Queue ID: mailserver\3874
        Recipients: [email protected]

    How can I secure our Exchange server more, to stop this from happening? Could I have got a virus that hooks into our Exchange server and sends mail through it? As I can see, the From Address is always <>; is there some way I can stop sending mails that don't have a from address like that? Please help
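
    The clues in that queue entry (Message Source Name: DSN and the empty <> sender) suggest these are not relayed messages but backscatter: non-delivery reports the server itself generates for spam sent to non-existent local addresses with forged senders. A hedged mitigation is to reject mail for unknown recipients at SMTP time instead of accepting and then bouncing it; assuming the anti-spam agents are installed, in the Exchange Management Shell:

        Set-RecipientFilterConfig -RecipientValidationEnabled $true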

    Read the article

  • Which CPU for a Xen LAMP testbed on a budget?

    - by deploymonkey
    Dear Server Fault knowledgeables, I'm in a decision dilemma right now which I can't resolve, due to a lack of hands-on experience. I need to build a testbed for virtualizing a LAMP application (OSes not yet decided), including server-side calculations. I'll opt for Xen, since it seems better supported by cloud hosters at the moment. The hardware is for a proof of concept for a startup doing SaaS, and might be used for a closed live alpha/beta later on. After testing, the testbed might be (a) deployed as a colocated white-box server or (b) used as a workstation. Requirements:

      - A single socket is enough.
      - We want ECC memory for reliability; this excludes most of the consumer line at Intel.
      - If an Intel CPU, then a threaded CPU (HT) is preferred.
      - At least 16 GB of RAM.
      - If justified by price, and reliability is not too bad, a high-quality desktop motherboard instead of a server board would be worth a try.

    It came down to the Opteron 6128 vs. the Xeon 5620 for me after a lot of research, but I don't necessarily have to be right. Which CPU is preferable concerning TCO (motherboard price, 24/7 power requirements, ...), the Opteron 6128 or the Xeon 5620? Which one offers better performance in real-world applications? (Do you have any other suggestions I probably overlooked?) Thank you for your consideration.

    Read the article

  • What are the "least legally restrictive" well-connected countries to host a website?

    - by monster
    NB: I am aware that this question is subjective, as it can't be defined precisely, but the answers should still be "objective": a country name, and what makes it legally safer. EDIT: (A) I am located in Germany. (B) I am NOT looking for a place to offer pirated software/media; there will be no binaries on my site except profile icons. Hello! I want to start publishing "social" websites/apps, and I found that the biggest initial problem is this: any and all services I have to depend on, including domain registrar, DNS provider, server/cloud provider, CDN provider, ... even my insurance agent, basically say that they can "throw me out" if my website contains "unacceptable" content. It's always phrased in such a way that basically anything can fall under "unacceptable" content. This is very frustrating, because you just can't fully control what users post on a "social website", and so you basically have to expect, when you go to bed, that your site may be gone when you wake up. I've heard a lot of horror stories about this. Since the Terms of Service of all those providers exist foremost to protect them from legal action, and those legal actions depend on the country where they are located, it seems the first step is to find which country is the "safest" in which to locate a site. "Safest" being defined as: where I am least likely to get in legal trouble with the local authorities if some user posts something unacceptable in some way. The main restriction is that it should also be a well-connected country, because there is no point in being "safe" if my users can't get to my sites or the latency is unacceptable. I am targeting English-speaking people in any country as my future users.

    Read the article

  • Server Clustering (Django, Apache, Nginx, Postgres)

    - by system-matrix
    I have a project deployed with Django, Apache, Nginx and Postgres. The project has a requirement of live data viewable to customers. The project's main points are:

      1. Devices in the field send data to the server (the devices are also website users, in effect) after login.
      2. A background import process imports the uploaded data into Postgres.
      3. The web users of the system use this data and can send commands to the devices, which the devices read when they log in.
      4. There are also background analysis routines running on the data.

    All of the above is deployed on one Amazon EC2 cloud machine. The project currently supports over 600 devices and 400 users, but as the number of devices increases, the performance of the server is going down. We want to extend this project so that it can support more and more devices. My initial thinking is that we will create one more server like the current one and divide the devices between the two, but again, we need a central user and device management point through the Django admin. Any ideas? What are the best possible ways to create a scalable architecture? How can I create a Postgres cluster and use it with Django, if possible?

    Read the article

  • Immediate logout after login with PAM, Kerberos, and LDAP

    - by Dylan Klomparens
    I've set up remote login on a computer using Kerberos and LDAP. I've also configured NFS to mount onto /home so that a user's home directory is the same wherever they log in. Kerberos authentication seems to work fine: I can get a ticket using kinit user1 (assuming user1 is a remote user) and see the ticket with klist. I'm pretty sure LDAP is working, because I see the proper output from getent passwd, which lists all the remote users. The contents of /home are present when I list the files. The problem is: when I try to log in as a remote user, the session is immediately ended. Why is it not letting me stay logged in? Here is the output from /var/log/messages after a login attempt:

        # /var/log/messages:
        Oct 9 10:57:53 tophat login[6472]: pam_krb5[6472]: authentication succeeds for 'user1' ([email protected])
        Oct 9 10:57:53 tophat login[6472]: pam_krb5[6472]: pam_setcred (establish credential) called
        Oct 9 10:57:53 tophat login[6472]: pam_krb5[6472]: pam_setcred (delete credential) called

    EDIT: The distro is openSUSE. Here are the common-* files in /etc/pam.d:

        # /etc/pam.d/common-account
        account required pam_unix.so

        # /etc/pam.d/common-auth
        auth sufficient pam_krb5.so minimum_uid=1000
        auth required pam_unix.so nullok_secure try_first_pass

        # /etc/pam.d/common-session
        session optional pam_umask.so umask=002
        session sufficient pam_krb5.so minimum_uid=1000
        session required pam_unix.so

    There doesn't appear to be a /var/log/auth.log file, nor a /var/log/secure file.
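
    One low-risk diagnostic step (a sketch; the debug option is supported by pam_krb5 and pam_unix on most distros, but treat that as an assumption) is to turn on PAM debugging so the reason for the session teardown lands in /var/log/messages:

        # /etc/pam.d/common-session (temporarily, while debugging)
        session sufficient pam_krb5.so minimum_uid=1000 debug
        session required pam_unix.so debug

    It may also be worth checking whether the account stack can fully resolve remote users, since with only pam_unix there, an account-phase failure would end the login right after authentication succeeds.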

    Read the article

  • How can I get Haproxy to not log local requests?

    - by coneybeare
    I am trying to clean some of the log clutter out of my machines, starting by removing requests that are generated by the servers themselves. I have cache warmers running around the clock, and I don't want these polluting the logs. I was able to get Apache to stop logging local requests by adding a dontlog env var for the local IP:

        SetEnvIf Remote_Addr "RE\.DA\.CT\.ED" dontlog
        CustomLog "|logger -p local3.info -t http" combined env=!dontlog

    Now I am looking for something similar to put in the configuration for the HAProxy log. How can I prevent 127.0.0.1 requests from writing to the HAProxy log?

    UPDATE 2/15/11: I use the excellent Loggly service to pull out logs in the cloud, but I am seeing tons of logs like this:

        2011 Feb 15 06:09:42.000 ip-10-251-194-96 http: RE.DA.CT.ED - - [15/Feb/2011:06:09:42 -0500] "HEAD /search/Nevad/predictive/txt HTTP/1.0" 200 - "-" "Wget/1.10.2 (Red Hat modified)"
        2011 Feb 15 06:09:42.000 127.0.0.1 haproxy[10390]: 127.0.0.1:58408 [15/Feb/2011:06:09:42] www i-5dd7a331.0 0/0/0/8/8 200 210 - - --NI 0/0/0 0/0 "HEAD /search/Nevad/predictive/txt HTTP/1.1"

    and I want them gone. This question focuses on how to stop that haproxy log line from being written to the server-side log in the first place.
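
    For reference, a sketch of the frontend-level approach on HAProxy versions that support it (the set-log-level action arrived in the 1.5 series; treat availability as an assumption for your build):

        frontend www
            acl from_localhost src 127.0.0.1
            http-request set-log-level silent if from_localhost

    On the 1.3/1.4 series, which this question's era suggests, there is no per-source log suppression beyond dontlognull, so the usual workaround is to filter at the syslog layer instead, much like the Apache logger pipe above.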

    Read the article
