Search Results

Search found 22139 results on 886 pages for 'security testing'.

  • Firewall GPO not applying despite being enumerated by gpresult

    - by jshin47
    I need to open up the admin$ share on all of my domain's client PCs, and I am trying to do so using Group Policy. I defined computer policy for Windows Firewall with Advanced Security in a policy object linked to the appropriate container and added the appropriate rules. However, they are not being applied! I feel like I have tried all of the obvious steps: I've checked gpresult, and the resulting set of policy looks the way I would expect; I've run gpupdate /force and gpupdate /sync on a few client computers; but no matter what I do, the clients don't seem to respond to my changes. I know that other computer policies in the GPO are being applied, so it is strange that these are not. I have also disabled exceptions on clients in the firewall GPO, but that doesn't seem to be applying either. A screenshot of firewall.cpl from a client shows the same thing: although other options in the same GPO ARE applied for computer policy, the firewall settings seem to be ignored.
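
    A minimal sketch of the usual next diagnostics, assuming Vista-or-later clients; these are standard Windows tools, and the interpretation is only a guess at this situation:

        :: check which GPO wins for the firewall settings (another GPO or local policy may override)
        gpresult /scope computer /v > C:\rsop.txt

        :: see what the firewall service actually loaded, per profile and per rule
        netsh advfirewall show allprofiles
        netsh advfirewall firewall show rule name=all

    A common culprit is a conflicting legacy firewall setting under Administrative Templates (Network > Network Connections > Windows Firewall) in another GPO; the two policy families can fight each other.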

  • Remote kill, upload, execute file

    - by Masoud M.
    I'm developing a program, and I need to upload my xyz.exe file to many host machines and execute it frequently. I need a client-server tool that, after an update signal from my PC, performs these steps on each host: kill any running process named xyz.exe, download my new xyz.exe, then execute the new xyz.exe. I know about tools like PsExec, but I need one that is more powerful and has a better user interface. Is there any tool to do this? UPDATE: The systems are on the same LAN, the OS is Windows (XP or 7), and no full remote access is needed. I'm a developer, my program runs on the remote hosts, and I'm testing my application.
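
    For reference, a minimal batch-file sketch of the kill/copy/run cycle with PsExec itself, assuming hosts.txt lists the machines and admin$ is reachable (the file names are this poster's; the loop itself is hypothetical):

        for /F %%H in (hosts.txt) do (
            psexec \\%%H taskkill /F /IM xyz.exe
            copy /Y xyz.exe \\%%H\admin$\xyz.exe
            psexec -d \\%%H C:\Windows\xyz.exe
        )

    Any tool with a friendlier UI would still be doing essentially this under the hood.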

  • Why do I get "Permission denied (publickey)" when trying to SSH from local Ubuntu to an Amazon EC2 server

    - by Vorleak Chy
    I have an instance of an application running in the cloud on an Amazon EC2 instance, and I need to connect to it from my local Ubuntu. It works fine from one local Ubuntu machine and also from a laptop, but I get "Permission denied (publickey)" when trying to SSH to EC2 from another local Ubuntu machine. It's strange to me. I'm thinking it is some sort of problem with the security settings on the Amazon EC2 instance (which may have limited IP access), or that the certificate may need to be regenerated. Does anyone know a solution?
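
    A minimal first check, assuming the standard EC2 key-pair setup (the key file name and host are placeholders):

        chmod 400 mykey.pem        # ssh refuses keys that are group/world readable
        ssh -v -i mykey.pem ubuntu@ec2-xx-xx-xx.compute-1.amazonaws.com

    The -v output shows which keys the client actually offers; the usual causes are a missing -i on the new machine, wrong key permissions, or the wrong login user for the AMI.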

  • Configure Raid On Red Hat 5

    - by Sopolin
    Hi all, I have a problem configuring RAID on Red Hat Enterprise Linux. I create a RAID array across two hard disks, and it works successfully; it still works after I remove one hard disk (I unplugged a disk to test the RAID). But after I plug both hard disks back in and create another file, the RAID is cleared. My question is: why, when I turn off the server machine, does it clear the RAID that I configured before powering off? Could anyone help to solve this problem? Thanks, Ung Sopolin
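
    If this is software RAID, a hedged sketch of the step that is most often missed, assuming an mdadm mirror on /dev/sda1 and /dev/sdb1 (device names hypothetical):

        mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 /dev/sdb1
        mkfs.ext3 /dev/md0
        mdadm --detail --scan >> /etc/mdadm.conf   # without this, the array is not reassembled at boot

    An array that exists only in the running kernel's state disappears on power-off unless /etc/mdadm.conf (and a matching /etc/fstab entry) lets the system reassemble and mount it at boot.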

  • Mapping capslock to control on Mac OS X: works for some things, but not others?

    - by keflavich
    I've mapped my capslock key to control using the Modifier Keys mapping in System Preferences: Keyboard. I've also tried mapping to "right control" instead of "left control" as per http://hints.macworld.com/article.php?story=20060825072451882 using a plist editor. The mapping seems to work in all cases except one: I can't use capslock with left-shift to make key mappings or apparently do anything else. capslock (as control) with right-shift works. I'm primarily testing by using control-tab / control-shift-tab to switch between tabs. Using the on-screen-keyboard viewer, I can get capslock-shift-(just about anything) to work, but not capslock-leftshift-tab. My best guess is that somehow the particular keyboard I'm working on is faulty, but I'm curious whether anyone else can reproduce this or has any ideas.

  • Unresponsive virtual OS

    - by confusedGeek
    Hopefully someone has a suggestion on how to resolve this.

    Configuration:
        Host:     Win 2003 R2 w/ Virtual Server 2005 R2
        Virtual1: Win 2003 R2 w/ SQL Server 2005
        Virtual2: Win 2003 R2 w/ WSS 3.0

    Situation: This past weekend the power went out and took down the servers (no UPS; it's a desktop standing in as a dev testing server). Since the servers went down, Virtual2 becomes unresponsive via HTTP after running WSS fairly heavily for an hour or two. If I log in via Virtual Server's remote control, I don't get anything beyond a background screen. The CPU counter on the Virtual Server master status page shows that the VM isn't doing anything, and shutdown commands issued from the master status page are ignored. The only thing I have been able to do is turn off Virtual2, which loses any state changes. After restarting Virtual2, the event logs and application logs don't indicate what caused the problem. Anyone have an idea as to how to repair the OS, or what the problem could be? Thanks ahead of time.

  • SSL certificates with password encrypted key at hosting provider

    - by Jurian Sluiman
    We are a software company and offer hosting to our clients. We have a VPS at a large Dutch datacenter. For some of the applications we need an SSL certificate whose key file is encrypted with a password. Our VPS reboots now and then because of updates and the like, which means our Apache doesn't start right away: the key passwords have to be entered. This results in downtime and is of course a real big problem. We could give the passwords to our VPS datacenter, or create certificates based on key files without passwords, but neither solution seems right, because both compromise the security of our certificates. What's the best solution for this issue?
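
    One hedged middle ground, assuming Apache with mod_ssl (the script path is hypothetical): have Apache read the passphrase from a root-only script instead of a terminal, so unattended reboots work:

        # in the SSL config:
        SSLPassPhraseDialog exec:/usr/local/sbin/ssl-pass.sh

        # /usr/local/sbin/ssl-pass.sh  (chown root:root, chmod 0500)
        #!/bin/sh
        echo "the-passphrase"

    Note the trade-off: the passphrase now lives on disk readable by root, so this mainly protects against the key file leaking on its own (for example in backups), not against a full server compromise.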

  • IE 8 doesn't appear to clear cache on demand. Is anyone else seeing this?

    - by Steve
    I have a client who uploads updated PDF files to her Concrete5 CMS through the file manager, replacing the old file with the same name. She then does a CMS "clear cache" and exits, as she should. Then, in testing, she finds that the old file still comes up when clicking on the link. On further review, the CMS file manager version tracking shows that the file has been updated, and, for me, the new file comes up as it should when clicking the link. My client has also refreshed her browser cache, and still she only gets the old file when clicking on the link. She says that while she can't seem to force an immediate cache update, overnight it appears to update. My client is also part of a large company-wide LAN and intranet. Is it possible that there is a cache outside of her local browser and CMS cache, such as a corporate proxy, that is not updating?

  • Do I need a VPN to secure communication over a T1 line?

    - by Seth
    I have a dedicated T1 line that runs between my office and my data center, and both ends have public IP addresses. On both ends we have T1 routers which connect to SonicWall firewalls. The SonicWalls do a site-to-site VPN and handle the network translation, so the computers on the office network (10.0.100.x) can access the servers in the rack (10.0.103.x). So the question: can I just add a static route to the SonicWalls so each network can access the other without the VPN? Are there security problems (such as someone else adding the appropriate static route and being able to access either the office or the datacenter)? Is there another, better way to do it? The reason I'm looking at this is that the T1 is already a pretty small pipe, and the VPN overhead makes connectivity really slow.

  • Apache on Mac Mavericks issue

    - by Michael
    Trying to run Apache so that I can create a testing server on my Mac. When I start Apache it starts, but it doesn't run (no connection to localhost). I'll paste the terminal session below; you'll see that after starting there are no httpd processes, and I did a check to show what was running on my port 80, though I don't entirely know what that means.

        Michaels-MacBook-Pro-3:~ michaelramos$ sudo apachectl start
        Michaels-MacBook-Pro-3:~ michaelramos$ ps aux | grep httpd
        michaelramos   348   0.0  0.0  2442000    624 s000  S+  8:51AM  0:00.00 grep httpd
        Michaels-MacBook-Pro-3:~ michaelramos$ sudo apachectl start
        org.apache.httpd: Already loaded
        Michaels-MacBook-Pro-3:~ michaelramos$ sudo lsof -i ':80'
        COMMAND PID USER  FD  TYPE             DEVICE SIZE/OFF NODE NAME
        ocspd    96 root  18u IPv4 0x8402f926599c58df      0t0  TCP dhcp-92-67.radford.edu:49267->108.162.232.196:http (ESTABLISHED)
        ocspd    96 root  20u IPv4 0x8402f926599c58df      0t0  TCP dhcp-92-67.radford.edu:49267->108.162.232.196:http (ESTABLISHED)
        ocspd    96 root  21u IPv4 0x8402f926599c50f7      0t0  TCP dhcp-92-67.radford.edu:49268->108.162.232.206:http (ESTABLISHED)
        ocspd    96 root  23u IPv4 0x8402f926599c50f7      0t0  TCP dhcp-92-67.radford.edu:49268->108.162.232.206:http (ESTABLISHED)
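
    A hedged next step, assuming the stock Apache that ships with Mavericks: launchd reports "Already loaded" even when httpd exits immediately, so check the config and the error log:

        sudo apachectl configtest                 # a syntax error in httpd.conf aborts startup silently
        tail -n 20 /var/log/apache2/error_log     # shows why the last start attempt died

    (The ocspd lines in the lsof output are outgoing certificate checks to remote port 80, not a process listening locally.)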

  • Downgrade 'local' packages in Debian/Ubuntu

    - by Matt Joiner
    I recently unticked the "pre-released updates" option in Software Sources on my Ubuntu Lucid 10.04.1 installation. The Ubuntu wiki states the following regarding this source: "The proposed updates are updates which are waiting to be moved into the recommended updates queue after some testing. They may never reach recommended or they may be replaced with a more recent update." Roughly 20 installed packages have indeed not made it into recommended updates, and they occasionally cause conflicts when I install new software, as related packages of the newer version are unavailable now that I've disabled the source. How can I force a downgrade of all packages for which an earlier version exists in an enabled repository?
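
    A hedged sketch using apt pinning, which is the stock mechanism for mass downgrades (archive names assumed for Lucid):

        # /etc/apt/preferences
        Package: *
        Pin: release a=lucid-updates
        Pin-Priority: 1001          # a priority above 1000 permits downgrades

        sudo apt-get update
        sudo apt-get dist-upgrade   # should now propose the downgrades; review before confirming

    For a handful of packages, sudo apt-get install pkg=version does the same thing one package at a time.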

  • Run a command from Windows 7 Start Menu

    - by Abhijeet Patel
    I'm trying to run commands such as "cmd.exe" and "appwiz.cpl" by typing them into the Search box of the Start Menu in Windows 7 (x86); I'm able to do this just fine in Vista. After typing in "cmd" I see a "Programs" link in the Start Menu, so "cmd" is being recognized, but when I click on the "Programs" link I get the following message: "These files can't be opened. Your Internet security settings prevented one or more files from being opened." P.S. I'm not looking to enable the "Run" command in the Start Menu. Any help would be much appreciated.

  • How do i get Safari to ignore the SSL Certificate error?

    - by Tangopop
    In IE 6, 7, 8 and Firefox 3.6.3 and 3.0.5 I have installed a local SSL certificate on the machine I am testing on, and I have gotten the browser to ignore the SSL error (which comes from one of my web test servers). Now I am trying to do the same thing in Safari 4, with no luck. Basically I am running some automated scripts to test my website before it goes live, and I need to be able to ignore these errors as the scripts all run autonomously. This is the error screen I am trying to avoid: http://library.bowdoin.edu/news/images/ezproxy-err/safari.jpg As I say, I have installed the certificate locally, and the IE 7 browser on the same machine works fine.
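
    Safari defers to the system keychain rather than keeping its own exception list, so a hedged sketch (the certificate file name is hypothetical) is to mark the test certificate trusted system-wide:

        sudo security add-trusted-cert -d -r trustRoot \
             -k /Library/Keychains/System.keychain testserver.crt

    Once the certificate is trusted in the keychain, Safari should load the page without the warning sheet.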

  • Securing php on a shared apache

    - by Jack
    I'm going to install Apache + PHP on a server where two users, A and B, will deploy their websites. I'm trying to achieve isolation of the users' space for security reasons: that is, no script from site A should be able to read files in site B. To achieve this I installed suphp. Website files of user A are owned by A:A with perm=700, and those of user B are owned by B:B with perm=700. suphp works great, but Apache complains about permission to read .htaccess. How can I let Apache read .htaccess in every directory of A and B while keeping the isolation between site A and site B? I played with ownership (group = www-data) and permissions (750), but I found no way to keep the isolation guaranteed. Any idea? Maybe by running Apache as root, but in that case are there any drawbacks?
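
    A hedged sketch of the group-based layout, assuming suphp executes each script as its file's owner (the docroot paths are hypothetical):

        chown -R A:www-data /var/www/siteA && chmod -R 750 /var/www/siteA
        chown -R B:www-data /var/www/siteB && chmod -R 750 /var/www/siteB

    Apache (group www-data) can then read .htaccess in both trees, while PHP code runs as A or B and is blocked by the 750 mode on the other user's tree. The isolation holds only if A and B are not themselves members of www-data. Running Apache as root is exactly the drawback one would fear: any Apache exploit then owns the whole box.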

  • Virtualhost Wildcard Subdomains

    - by Khuram
    We have one static IP on which we have routed our company website, and we have set up a local Windows machine with WAMP as our testing server. We want virtual hosts to test our different apps. However, we have a new project which uses wildcard subdomains, and we cannot get wildcard subdomains to work in VirtualHosts. We use:

        NameVirtualHost *

        <VirtualHost *>
            ServerAdmin admin@test
            DocumentRoot "E:/Wamp/www/corporate"
            ServerName companysite.com
        </VirtualHost>

        <VirtualHost *>
            ServerAdmin admin@test
            DocumentRoot "E:/Wamp/www/project"
            ServerName project.companysite.com
        </VirtualHost>

        <VirtualHost *>
            ServerAdmin admin@test
            DocumentRoot "E:/Wamp/www/project"
            ServerName *.project.companysite.com
        </VirtualHost>

    However, the last * wildcard does not work. Any help?
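
    A hedged note: ServerName does not accept wildcards; Apache matches wildcard hosts through ServerAlias, so the last two blocks would usually be combined as:

        <VirtualHost *>
            ServerAdmin admin@test
            DocumentRoot "E:/Wamp/www/project"
            ServerName project.companysite.com
            ServerAlias *.project.companysite.com
        </VirtualHost>

    Each test subdomain also has to resolve to the machine; the Windows hosts file does not support wildcards, so every name must be listed there or served by a local DNS that does.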

  • PHP + IIS Application Pool Identity Windows\Temp permissions

    - by Matt Boothman
    I am currently running PHP (5.3) on IIS 7.5 on a Win2k8 R2 Web Edition server, and I would like to know what problems or security vulnerabilities, if any, I may introduce into the system by assigning Read, Write, Modify & Execute permissions to either the IUSR account or the IIS_IUSRS group for %SystemRoot%\Temp. Should I be altering permissions on that folder at all (Windows reminds me I probably shouldn't when I attempt to change them)? Should I create a temp folder somewhere else and set permissions accordingly? The problem is that when I set Anonymous Authentication (I'm guessing the more secure option?) to use the App Pool identity, PHP gets stuck in a loop when starting sessions, because it is unable to create session files in %SystemRoot%\Temp due to lack of permission for the application pool user or IIS_IUSRS group. Another problem is that ImageMagick (PHP extension) is denied access to %SystemRoot%\Temp to write temporary files, so it throws exceptions. I have tried searching Google but have not found anything that touches on this subject specifically. Any help greatly appreciated.
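
    The usual low-risk answer is the second option the poster suggests: a dedicated temp directory instead of widening %SystemRoot%\Temp. A hedged sketch (paths and pool name are hypothetical):

        mkdir C:\php-temp
        icacls C:\php-temp /grant "IIS AppPool\MySitePool":(OI)(CI)M

        ; php.ini
        session.save_path = "C:\php-temp"
        upload_tmp_dir = "C:\php-temp"

    ImageMagick generally honors the TMP/TEMP environment of the worker process, which can be pointed at the same folder, and %SystemRoot%\Temp stays untouched.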

  • rkhunter warns of inode change by no file modification date changes

    - by Nicholas Tolley Cottrell
    I have several systems running CentOS 6 with rkhunter installed, and a daily cron running rkhunter and reporting back via email. I very often get reports like:

        ---------------------- Start Rootkit Hunter Scan ----------------------
        Warning: The file properties have changed:
                 File: /sbin/fsck
                 Current inode: 6029384     Stored inode: 6029326
        Warning: The file properties have changed:
                 File: /sbin/ip
                 Current inode: 6029506     Stored inode: 6029343
        Warning: The file properties have changed:
                 File: /sbin/nologin
                 Current inode: 6029443     Stored inode: 6029531
        Warning: The file properties have changed:
                 File: /bin/dmesg
                 Current inode: 13369362    Stored inode: 13369366

    From what I understand, rkhunter will usually report a changed hash and/or modification date on the scanned files too, so this leads me to think that there is no real change. My question: is there some other activity on the machine that could make the inode change (running ext4), or is this really yum making regular (~once a week) changes to these files as part of normal security updates?
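
    A hedged verification, assuming stock CentOS tooling (package names assumed): check the binaries against the RPM database before re-baselining rkhunter:

        rpm -Vf /sbin/fsck /sbin/ip /sbin/nologin /bin/dmesg   # silence means contents match the packages
        rpm -q --last util-linux-ng iproute                    # did yum update the owning packages recently?
        rkhunter --propupd                                     # store the new properties once satisfied

    Package updates replace files rather than rewriting them in place, so a new inode with an otherwise unchanged file is consistent with a yum update (and on some systems with prelink touching binaries).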

  • Prevent users from creating / copying / moving anything except .exe

    - by webnoob
    We have a program that compiles executables into the folder C:\bin. Ideally I would like to share this folder so users can access the exes within it, but stop them creating any other files in there. The reason for this is to stop users grabbing source code, putting it in a shared drive, and taking it. We have a domain controller set up, and all the users belong to a specific security group. Is there any way to achieve this? EDIT: To clarify, I need to stop users from creating or moving files into the C:\bin folder which are not executables.
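
    NTFS permissions can't filter by file type, but File Server Resource Manager can screen a folder by name pattern. A hedged sketch with the filescrn tool (group and screen names hypothetical; requires the FSRM role on the machine hosting the share):

        filescrn filegroup add /filegroup:"NonExecutables" /members:"*.*" /nonmembers:"*.exe"
        filescrn screen add /path:"C:\bin" /add-filegroup:"NonExecutables" /type:active

    An active screen blocks saving any file that is not *.exe, though note it matches on file name only, so renaming a source file to .exe would still slip through.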

  • Setup IIS 7 as FTP Server that is connectable outside of my local network

    - by Usta
    I was able to set up an FTP site that I can access via ftp://127.0.0.1/ or my local (static) IP. To do this I followed these instructions (with the exception that I did not bind to 127.0.0.1 as suggested): http://learn.iis.net/page.aspx/301/creating-a-new-ftp-site-in-iis-7/ I have created firewall exceptions for ports 20 and 21 and set up port forwarding on my wireless router, but I can only access the site via localhost, and I need a friend to have read access to it. How do I enable remote access? (I'd rather not purchase a domain name.) My setup: IIS 7.5, Windows 7 Professional, wireless network, Norton Internet Security 2012, an internal static IP address.
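
    A hedged checklist in command form: passive-mode FTP also needs a data port range opened and forwarded, not just 20/21 (the range below is an assumption):

        netsh advfirewall firewall add rule name="FTP control" dir=in action=allow protocol=TCP localport=21
        netsh advfirewall firewall add rule name="FTP passive data" dir=in action=allow protocol=TCP localport=49152-65535

    In IIS Manager, "FTP Firewall Support" is where the same data channel range and the router's external IP are set; the friend then connects with ftp://your-public-IP/ so no domain name is needed. Norton's own firewall may also need matching exceptions.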

  • OS X clients ignoring Windows print server permissions

    - by Ilumiari
    I'm in the process of testing a Windows Server 2008 R2 print server for a mixed OS X/Windows environment. Any security permissions (AD groups) I set for the printers on the print server are not honoured by the OS X clients: only if I remove absolutely all permissions for a given printer will an OS X client not print to it. The Windows clients honour the permissions as expected. The PrintService log doesn't record any activity when an unprivileged Windows client attempts to print, and records a typical print job when an unprivileged OS X client attempts to print. Has anyone encountered this problem before and found a fix? With 600-700 clients, a number of which are dual-booting, restricting by IP address is not viable. EDIT: The jobs are definitely going through the print server; they show up in the logs with their AD credentials.

  • No HTTP Response from Tomcat 7 EC2 instance

    - by David Kaczynski
    I am new to EC2 (and Tomcat, for that matter), and I am trying to deploy a vanilla Tomcat 7 server to an Ubuntu 12.04.1 EC2 instance and access the default test site over HTTP. My EC2 instance is running, and its Security Group includes port 80. My /etc/tomcat7/server.xml config has been edited so the HTTP Connector listens on port 80 instead of 8080, and I have restarted Tomcat via sudo service tomcat7 restart. However, according to sudo netstat -lnp, Tomcat is not listed as listening on port 80, and I am unable to get any response from the ...amazonaws.com public DNS in a web browser. What am I missing?
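
    A hedged explanation: on Ubuntu the tomcat7 package runs as the unprivileged tomcat7 user, which cannot bind ports below 1024, so the port-80 connector fails quietly (the failure usually lands in /var/log/tomcat7/catalina.out). Two common workarounds:

        # enable the authbind support offered by the package, keeping the Connector on 80:
        #   set AUTHBIND=yes in /etc/default/tomcat7 (plus the matching authbind port grant)
        # or leave the Connector on 8080 and redirect at the firewall:
        sudo iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 8080

    With the iptables approach, the Security Group rule for port 80 stays exactly as it is.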

  • Powershell SQL query--connection string

    - by sean
    I am trying to query several different SQL servers and run a command on each of them, but I am unable to get the connection string right. Code below. I receive the following error: "Login failed. The login is from an untrusted domain and cannot be used with Windows authentication." I thought if I passed it the credentials it wouldn't care about the domain. How do I get around this? Thanks in advance.

        $serverList = @(Get-Content "c:\AllServers.txt")   # Read a file
        $query = "SELECT COUNT(thing) AS [RowCount] FROM My_table"
        $Database = "My_DB"

        foreach ( $svr in $serverList ) {
            $conn = new-object System.Data.SqlClient.SQLConnection
            $ConnectionString = "Server={0};Database={1};User ID=sa;Password=Password;Integrated Security=True" -f $svr, $Database
            $conn.ConnectionString = $ConnectionString
            $conn.Open()
            $cmd = new-object system.Data.SqlClient.SqlCommand($Query, $conn)
            $conn.Close()
        }
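
    A hedged diagnosis: Integrated Security=True tells the client to use Windows authentication and ignore the supplied User ID/Password pair, which is what produces the untrusted-domain error. A minimal corrected loop body (the sa password is this poster's placeholder; note the original also never executed the command):

        $ConnectionString = "Server=$svr;Database=$Database;User ID=sa;Password=Password;Integrated Security=False"
        $conn.ConnectionString = $ConnectionString
        $conn.Open()
        $cmd = New-Object System.Data.SqlClient.SqlCommand($query, $conn)
        $rowCount = $cmd.ExecuteScalar()          # actually run the query and fetch the count
        Write-Output ("{0}: {1}" -f $svr, $rowCount)
        $conn.Close()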

  • Ubuntu Server 10.10 vs. Fedora Server 14 for Mono.NET app hosting in VM

    - by Abbas
    I want to create a web server running Mono, MySQL 5.5 and OpenLDAP as a VM (on VMware Workstation). Searching "Ubuntu Server vs. Fedora Server" mostly yields flame wars and noise; the few good articles available are either out of date or don't offer very convincing arguments. I know the answer is most likely to be "it depends", but I wanted to harness the collective wisdom on ServerFault and get opinions, experiences and factual information to the extent possible. My selection criteria (other than what is mentioned above) would be: ease of use, ease of development, reliability and security.

  • Sending large files - do any vendors sell their solution?

    - by Rob Nicholson
    We currently have an account with www.mailbigfile.com to allow us to send and receive files which exceed our clients' email limits; in our industry, a 10 MB limit is not unknown. MailBigFile works fine for what it is, but increasingly our clients are starting to block it as a security risk. A solution would be for us to license the software and run it from our own web server, which is far less likely to be blocked. Does anyone know of vendors in this market? We are looking at web collaboration systems, but that's a much bigger project. The technology behind www.mailbigfile.com isn't that complex (HTTP upload, email notification and then HTTP download), so I'm hoping it won't be very expensive. Cheers, Rob.

  • Maximum MTU size

    - by user192702
    I thought one of the issues I'm experiencing in the following question was due to MTU, rightfully so: "ESXi 5 VM Putty session hangs, vSphere client timing out". However, when I tried testing the maximum MTU size, it seems there's just no limit. I thought Ethernet only allows an MTU of 1500, but I'm up to 54450:

        ping -l 54450 192.168.10.7

        Pinging 192.168.10.7 with 54450 bytes of data:
        Reply from 192.168.10.7: bytes=54450 time=1081ms TTL=62
        Reply from 192.168.10.7: bytes=54450 time=1079ms TTL=62
        Reply from 192.168.10.7: bytes=54450 time=1079ms TTL=62
        Reply from 192.168.10.7: bytes=54450 time=1079ms TTL=62

        Ping statistics for 192.168.10.7:
            Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
        Approximate round trip times in milli-seconds:
            Minimum = 1079ms, Maximum = 1081ms, Average = 1079ms
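
    A hedged explanation in command form: without the Don't Fragment flag, Windows splits the 54450-byte payload into MTU-sized fragments, so the test says nothing about the link MTU. Adding -f makes the limit visible:

        :: 1472 data + 28 bytes of IP/ICMP headers = 1500-byte MTU; should succeed
        ping -f -l 1472 192.168.10.7
        :: one byte more should fail with "Packet needs to be fragmented but DF set."
        ping -f -l 1473 192.168.10.7

    Binary-searching the -l size with -f set finds the true path MTU.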
