Search Results

Search found 22139 results on 886 pages for 'security testing'.


  • Thoughts on Apache log file sizes?

    - by Nathan Long
    Do you place any limits on the size of Apache log files - access.log and error.log? Specifically, can you give:
    - Reasons to limit log file sizes: disk space; any other?
    - Reasons NOT to limit log file sizes: research into performance issues or security breaches; any other?
    - Methods of doing so: a cron job that periodically deletes the file, or its first N lines? Any other?
    - Anything you might salvage before deleting: for example, grep out how many times a file was downloaded before deleting the access logs.

    I'd like to get the thoughts of experienced sysadmins before I do anything. (Marking as community wiki since this may be a matter of opinion.)
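
    To illustrate the last two bullets together, a minimal cron-style sketch (the log path and download pattern are hypothetical; the stock logrotate utility is the usual tool for the rotation half of this):

        #!/bin/sh
        # Hypothetical nightly cron job: salvage a download count, then
        # truncate access.log in place so Apache keeps its open file handle.
        LOG=/var/log/apache2/access.log
        SUMMARY=/var/log/apache2/download-counts.log

        COUNT=$(grep -c 'GET /files/report\.pdf' "$LOG")
        echo "$(date +%F) report.pdf: $COUNT" >> "$SUMMARY"

        : > "$LOG"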

  • Slow Starting DHCP Client Service - HP Thin Clients

    - by Ryan
    We have recently begun adding XPe thin clients to our domain in preparation for a new Citrix environment. One thing that has been picked up in testing is that they appear slow to boot. The issue manifests itself as the classic "Applying Computer Settings..." screen we are all used to seeing. After digging into the issue, it appears the DHCP Client service is taking some time to start on boot; this varies, but I would estimate it can take around 1 minute in some cases. I've eliminated the classic issues: DHCP is responding correctly and quickly, DNS is not the cause, and GPOs are applying promptly. A simple workaround is to assign the client a static IP, which works great, so the TCP/IP services are obviously firing up quickly - just not DHCP Client. Does anyone have any ideas on how I might improve the service start time? I'm keen to find a better solution before I get my arm twisted into setting up 250 thin clients with static addressing!
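
    If static addressing does become the fallback, it can at least be scripted rather than clicked through on 250 clients; a hedged sketch using the stock XP-era netsh tool (the addresses and interface name are hypothetical):

        netsh interface ip set address "Local Area Connection" static 192.168.1.50 255.255.255.0 192.168.1.1 1
        netsh interface ip set dns "Local Area Connection" static 192.168.1.10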

  • Windows global shortcut hijacked by Opera

    - by Balint
    I have Chrome 37 installed as my main browser. Recently I needed to test a design in a new, Chromium-based Opera, version 21.0.1432.67. This latter one somehow hijacked my global shortcut: if I press Ctrl+Shift+N to start a new session for testing, the shortcut opens a new Opera tab, even when Chrome is running and is the active window - and even if Opera itself is not running. It is highly annoying. Even if I uninstall Opera, the aforementioned shortcut does not work at all. Any hints on how to restore the original shortcut?

  • Are animated GIFs supported in Google Chrome?

    - by user30852
    I have recently been testing a website and found animated GIF images that seem to show fine in IE and Firefox, but in Google Chrome they only show briefly and then disappear! This happens whether I view the image on the page or view the file directly. Are there any reported problems displaying GIFs in Chrome, or is it just being fussy? There seem to have been some problems in older versions of Chrome, but it's hard to believe something as simple as this wouldn't have been fixed by now. The version of Google Chrome I am using is 4.1.249.1021. Not sure if this is relevant, but some info about the image:
    - Width: 216 pixels
    - Height: 36 pixels
    - Horizontal resolution: 96dpi
    - Vertical resolution: 96dpi
    - Bit depth: 32
    - Frame count: 3

    EDIT: This seems to be a problem in the latest beta version of Chrome, as it works fine in 4.0.249.

  • File ownership and permissions on web site PHP files

    - by columbo
    Hello, I am learning the basics of Linux servers, so I am green. I have an Ubuntu server with websites on it that I have inherited. In a fit of security worry, I decided to check the ownership of the website files. They are all 2016:sites. If I run the command 'cat /etc/group | more' I can see that the group exists, but when I run 'lastlog' the user 2016 does not appear. I started to worry that 2016 might be the username that web users connect as, so I set the permissions on a testfile to 600 with chmod, giving read permission to only the file owner. Sure enough, I could still access the file from the web. Can anyone suggest what is going on here? I tried creating a new user and giving them ownership of the file, but then when I access the file from the web it wants all directories upstream to be owned by the same person. Thanks
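
    A few commands that would narrow this down (the testfile name comes from the question; all of these are stock Ubuntu tools). If the web server process itself runs as uid 2016, a mode-600 file owned by 2016 is still readable by it; and a uid with no /etc/passwd entry shows up numerically, exactly as described:

        # Which user are the web server / PHP processes actually running as?
        ps aux | egrep 'apache2|httpd|php' | grep -v grep

        # Does uid 2016 map to a named account at all?
        getent passwd 2016

        # Numeric owner and mode of the test file
        ls -ln testfile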

  • How to remove $data stream from file in Windows 8

    - by chris.w.mclean
    Windows has for a while now added an extra hidden stream to files that were downloaded from the internet. If you attempt to use these files, you can get all kinds of odd behavior, as Windows detects this additional stream and then prevents the app/exe from getting various security clearances. In previous versions of Windows you could right-click a file, go to Properties, then click 'Unblock', which removed the extra stream. Windows 8 seems to be doing the additional-streams trick, but I haven't yet found a way to remove them using the Windows 8 UI. Anyone know how to do this?
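
    For reference, the stream in question is named Zone.Identifier, and it can be inspected and removed from a command prompt; a sketch using the Sysinternals streams tool (the filename is hypothetical):

        rem List, then delete, alternate data streams such as Zone.Identifier
        streams.exe downloaded-file.exe
        streams.exe -d downloaded-file.exe

        rem The stream contents can also be viewed directly
        more < downloaded-file.exe:Zone.Identifier

    Windows 8 also ships PowerShell 3.0, whose Unblock-File cmdlet removes the same stream.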

  • VMWare Converter recommendations

    - by Tank Szuba
    I have two Ghost 14 backups of my machine: one of the machine fully configured with apps after an XP install, and one from the last update before I re-imaged it (it's XP; I re-image about once every six months). I recently wanted to try simply using my initial image in a virtual environment to do the testing that generally causes me to need to re-image. I used VMware Converter to convert the Ghost images to a virtual machine for use in VirtualBox, but they fail to boot properly. They get stuck after the BIOS loads and Windows begins loading. If I power down the machine and fire it up again, it goes to the Windows error screen that asks if you would like to boot in a different mode; no selection makes any difference. What are some possible errors I should look for in the conversion process or in my settings for the converter?

  • How do you manage large web farms?

    - by Andrew Katz
    I have a quickly growing web farm running IIS 7 (30+ servers). All servers are identical copies of each other, and all are physical. We update the software about once a month; the current process follows these steps:
    1. Disable the server in the pool on the F5 load balancer.
    2. Disable HTTP keep-alives in IIS so connections drop quickly.
    3. Change the default directory of the website to the new folder containing the new binaries.
    4. Test the server.
    5. Re-enable HTTP keep-alives.
    6. Enable the server in the F5 pool.
    7. Move on to server 2.

    Microsoft used to have Application Center, which was abandoned a while ago. They have made a second attempt with the Web Farm Framework, but this adds as much QA time testing the release package as it saves in deployment. Has anyone seen a commercial off-the-shelf application tailored for managing and deploying to large web farms? Thanks!
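
    Of these, the directory swap in step 3 is the piece most easily scripted with stock IIS tooling; a hedged sketch using appcmd (the site name and build path are hypothetical):

        %windir%\system32\inetsrv\appcmd set vdir "Default Web Site/" -physicalPath:"D:\builds\2012-10"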

  • Not able to log in to new AMI on EC2 - moving from micro to small instance

    - by zengr
    I had a t1.micro Linux instance (old_server) on EC2 and now I need to upgrade my server to an m1.small Linux instance (new_server). So here is what I did:
    1. Shut down old_server and create an AMI from it.
    2. Launch the new AMI with the m1.small configuration (I kept the key and security group the same as old_server).

    I tried to log in with:

        ssh -i my_key.pem [email protected]

    But it gives a connection timeout error. My login to old_server works fine. So my questions are:
    - What is the correct way to scale up (vertically) an EC2 instance?
    - Where am I going wrong in the steps above?
    - When I create an AMI in step 1, is the EBS (data) also copied?
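
    On the first question: for an EBS-backed instance, the type can also be changed in place rather than by baking a new AMI. A sketch with the AWS CLI (the instance id is hypothetical; the ec2-api-tools of the same era had equivalent commands):

        aws ec2 stop-instances --instance-ids i-0123456789abcdef0
        aws ec2 modify-instance-attribute --instance-id i-0123456789abcdef0 \
            --attribute instanceType --value m1.small
        aws ec2 start-instances --instance-ids i-0123456789abcdef0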

  • Outlook 2007 Clients attachments not opening correctly

    - by Az
    Hello, thanks for taking the time to read this. We are having an issue with our Terminal Server clients that has me perplexed. We are running a Server 2003 x64 and Exchange 2007 environment. When we attempt to open an attachment, for example a .jpeg, and the user chooses to open the file, it prompts them to select a program to open the file with. The file associations appear to be fine: if I save the document to the desktop and open it, it opens with the correct program automatically. If I select "always open using this program" it will then open automatically, but I don't want to have to do this for every file type we open regularly on each client. Is this some sort of Exchange server security setting that is forcing them to associate the file? Does Outlook or Exchange maintain its own file association database? Thanks for reading!

  • How do I install and run Tomcat on port 80 as my only web server? (Rooted Ubuntu box)

    - by gav
    Hi all, tl;dr - I have a rooted Linux box that I want to run Tomcat on as a server (no Apache web server); how would you set this up while avoiding common security pitfalls? I've written a Grails app that I want to run on a VPS I rent. The VPS has very little memory and I am using it for the sole purpose of running this application, so I don't need the Apache web server. This is my first venture into server administration and I'm sure to fall into some well-known traps.
    - Should I use iptables to redirect requests from port 80 to 8080?
    - Should I run Tomcat as root or as its own user?
    - What configuration settings would be good for a low-memory system expecting fewer than 10 concurrent users?

    Hopefully an easy one for you! Anyone who could link to a tutorial would be a personal hero destined for great things, no doubt. Gav
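
    On the first two bullets, one common pattern is exactly that combination: Tomcat stays on 8080 running as its own unprivileged user, and the kernel redirects port 80 to it. A minimal sketch (run as root):

        iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 8080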

  • Nginx ignoring client's HTTP 1.0 request and responding with HTTP 1.1

    - by Yoga
    I am testing using nginx/php5-fpm, with the code:

        <?php
        header($_SERVER["SERVER_PROTOCOL"]." 404 Not Found");
        // also tested: header("Status: 404 Not Found");
        echo $_SERVER["SERVER_PROTOCOL"];

    and forcing HTTP 1.0 with the curl command:

        curl -0 -v 'http://www.example.com/test.php'
        > GET /test.php HTTP/1.0
        < HTTP/1.1 404 Not Found
        < Server: nginx
        < Date: Sat, 27 Oct 2012 08:51:27 GMT
        < Content-Type: text/html
        < Connection: close
        <
        * Closing connection #0
        HTTP/1.0

    As you can see, I am already requesting with HTTP 1.0, but nginx replies with HTTP 1.1.

  • SQL 2008 R2 3rd Party Peer-to-Peer Replication, Global Site Distribution

    - by gombala
    We are looking at hosting three globally distributed SQL Server installations at different data centers. The intent is that Site A will serve web traffic and data for a specific region, and likewise Sites B and C. If the Site A data center goes down, loses connectivity, etc., Site A's users will fail over to Site B or C (whichever is up). Also, if a user from Site A travels to Site C, they should be able to access their data as it was on Site A. My question is: what SQL replication technology (SQL Server replication or third party) can support this scenario? We are using SQL 2008 R2 Enterprise at each site; each site runs on top of VMware with a NetApp filer. Would something like distributed caching help in this scenario as well? We have looked at and tested peer-to-peer replication, but have encountered issues with conflicts during our testing. I imagine there are other global data centers that have encountered and solved this issue.

  • Odd domain switching behavior in Firefox and Chrome

    - by Jeremy Detrempe
    We have different development servers and a production server. Testing is done on the development servers. As a QA engineer, I'm switching between these servers quite often throughout the day. In Chrome, sometimes I need to reload a page a few times to get it to pull from the newly switched server. In Firefox, sometimes I need to quit the browser to get it to pull from the newly switched server. (We have small tags that indicate which server you are pulling from, which is how I know in-browser.) Why does this happen? I'd love to know how it happens (maybe what it's called?) and the best way to deal with it. (I know that Firefox has an extension for domain switching; is that the best solution?)

  • Installing SSL certs with nginx on Amazon EC2

    - by Ethan
    I finally got a cert from an authority and am struggling to get things working. I've created the appropriate combined certificate (personal + intermediate + root) and nginx is pointing to it. I got an Elastic IP and connected it to my EC2 instance, and my DNS records point to that IP. But when I point the browser to the hostname, I get the standard "Connection Untrusted" warning, with ssl_error_bad_cert_domain. Port 443 is open - I can get to the site over https if I ignore the warning. The weird thing is, under technical details, it lists the domain I tried to access as valid! When I try to diagnose with SSL testing sites, they don't even detect a certificate! What am I missing here? The domain is yanlj.coinculture.info. Note I've got coinculture.info running on a home server without a dedicated IP and have the same problem, but I'll be moving that to the same EC2 instance as soon as I figure this out. I thought the Elastic IP would solve things, but it hasn't.
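
    One way to see exactly which certificate (and chain) the server presents, independent of any browser, is openssl's built-in client; a sketch using the hostname from the question:

        openssl s_client -connect yanlj.coinculture.info:443 \
            -servername yanlj.coinculture.info </dev/null 2>/dev/null |
        openssl x509 -noout -subject -issuer -dates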

  • "Brute force attempt" on sending multiple emails

    - by bretddog
    While testing sending multiple emails, I successfully sent about 100 emails (with a 20KB PDF attachment) to the same email address (my own), and they were all received. But on the next attempt, my cPanel account was blocked due to a "brute force attempt". Are there any special precautions I need to take when sending bulk emails? I simply looped through the code below without pausing between emails. What type of alert could that give on the email server, and how should I avoid it?

        client = New SmtpClient(smtp, Convert.ToInt32(port))
        AddHandler client.SendCompleted, AddressOf OnAsyncSendComplete
        client.Credentials = New System.Net.NetworkCredential(usn, psw)
        client.SendAsync(mail, token)

    Should I wait for the SendCompleted event for each email before sending the next?
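
    On the last question: a hedged sketch of the synchronous, paced alternative it describes (reusing the question's variable names; "mails" stands in for whatever collection holds the messages, and the 2-second gap is an arbitrary choice):

        ' Assumes Imports System.Net.Mail and the question's smtp/port/usn/psw.
        ' Send one message at a time, with a pause between sends, so the
        ' server sees a steady trickle of traffic rather than a burst.
        Dim client As New SmtpClient(smtp, Convert.ToInt32(port))
        client.Credentials = New System.Net.NetworkCredential(usn, psw)
        For Each mail As MailMessage In mails
            client.Send(mail)                   ' blocks until accepted
            System.Threading.Thread.Sleep(2000) ' hypothetical 2-second gap
        Next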

  • Add separate domain name to Wordpress admin area with htaccess

    - by Marc
    I have a WordPress installation in a separate folder on my server (meaning it is not in the root folder). I have an htaccess rewrite rule that maps Domain A to folder A. Inside folder A is the WordPress admin folder; let's call it folder A.B. I tried mapping Domain B to folder A.B, but I can't get it to work properly. When you log in to WordPress via /admin, you get redirected to /wp-login.php (so from folder A.B to folder A); maybe that is where I get into trouble. So what I would like to do is this:
    - Domain A -> folder A
    - Domain B -> folder A.B

    Note that this is not for security purposes; I just like the idea of www.domainb.com instead of www.domaina.com/wp-admin. Can this be done with WordPress?
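
    For reference, the host-based half of such a mapping looks something like this in mod_rewrite terms (a sketch only; "folder-a" is a stand-in for the real folder name, and the wp-login.php redirect back into folder A is exactly the part this does not solve):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?domainb\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/folder-a/wp-admin/
        RewriteRule ^(.*)$ /folder-a/wp-admin/$1 [L]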

  • LinkSys WRT54GL + AM200 in half-bridge mode - UK setup guide recommendations?

    - by Peter Mounce
    I am basically looking for a good guide on how to set up my home network with this set of hardware. I need:
    - Dynamic DNS
    - Firewall + port-forwarding
    - VPN
    - Wake-on-LAN from outside the firewall
    - VOIP would be nice
    - QoS would be nice (make torrents take lower priority than other services when those services are in use)
    - DHCP
    - Wireless + WPA2 security
    - Ability to play multiplayer computer games

    I am not a networking or computing neophyte, but the last time I messed with network gear was a few years ago, so I need to dust off knowledge I only half have. I have read that I should set up the AM200 in half-bridge mode so that the WRT54GL gets the WAN IP - this sounds like a good idea, but I'd still like to be advised. I have read that the dd-wrt firmware will meet my needs (though I gather I'll need the VPN-specific build, which appears to preclude supporting VOIP), but I'm not wedded to using it. I live in the UK and my ISP supplies me with:
    - a block of 8 static IPs, of which 5 are usable to me
    - a PPPoA ADSL2+ connection

  • Advice on new hardware firewall for a small company server-environment

    - by Mestika
    Hi everyone, my company's current hardware firewall (an old ZyXEL ZyWALL) is in need of replacement. It is a small company with a similarly small server environment, so there is no need for a huge, complex and expensive solution; rather, a more straightforward firewall that can provide the necessary security for our systems and block unwanted traffic from the core servers, accepting access only through the one server which is used as a "gateway" between the Internet and our internal network. I don't have that much experience with hardware firewalls, so I'm requesting any good advice and/or knowledge on which products would suit our specific needs. If you need more information about our requirements, please let me know and I'll provide it. Sincerely - Mestika

  • Setting up a WGR614v7 behind a Linux box

    - by commodore fancypants
    Here's the setup: I have an openSUSE box with 2 NICs; one goes to my home network router, the other has DHCP running on it and is attached to a wireless router. I'm trying to get this setup to work before I switch to the Linux box as my home network router. My DHCP server will offer the wireless router (a WGR614v7) an address, but anything that connects through the wireless router ends up with an APIPA address. I have all the firewalls on the wireless network turned off, as well as the wireless router's own DHCP. The Linux box isn't offering addresses to anything past the wireless router. Is this a problem with the router or my DHCP setup? For testing purposes, I have both NICs set in the internal zone, and I've tried wireless and wired connections to the WGR614v7, both to no avail.
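
    For comparison, a hypothetical ISC dhcpd subnet block for the wireless-side NIC (all addresses invented). It is also worth confirming that the upstream cable sits in one of the WGR614's LAN ports rather than its WAN port, since DHCP broadcasts will not cross the router's WAN/NAT side:

        # /etc/dhcpd.conf - fragment for the NIC facing the wireless router
        subnet 192.168.10.0 netmask 255.255.255.0 {
            range 192.168.10.100 192.168.10.200;
            option routers 192.168.10.1;
            option domain-name-servers 192.168.10.1;
        }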

  • Is it possible to modify a color scheme and windows decorations in Xfce4?

    - by Juhele
    I'm just testing PCLinuxOS Phoenix XFCE Edition 2011-07, which I would like to install on my grandpa's PC in place of the older PCLOS 2009 with KDE3 (which is almost impossible to upgrade). The PC is a relatively old one (Sempron 2200+ CPU, MSI K7N2GM2 board with integrated GeForce 440MX-series graphics, 1GB RAM and an 80GB IDE HDD) and I thought it was too weak for KDE4. I have used Xfce in the past (Sam Linux 2006 and 2007), but in the new Xfce4 I cannot find a way to change the window color scheme and the window decorations - the settings manager lets me switch between preinstalled themes, but is it possible to modify them with some GUI?

  • Firewall GPO not applying despite being enumerated by gpresult

    - by jshin47
    I need to open up the admin$ share on all of my domain's client PCs, and I am trying to do so using Group Policy. I defined computer policy for Windows Firewall with Advanced Security in a policy object linked to the appropriate container and added the appropriate rules. However, they are not being applied! I feel like I have tried all of the obvious steps: I've checked gpresult, and the resulting set of policy is the way I would expect it to look. I've run gpupdate /force and gpupdate /sync on a few client computers, but no matter what I do they don't seem to respond to my changes. I know that other computer policies in the GPO are being applied, so it is strange that these are not. I have also disabled exceptions on clients in the firewall GPO, but that doesn't seem to be applying either. Here is a screenshot of firewall.cpl from a client: [screenshot]. Basically, although other options in the same GPO are applied as computer policy, the firewall settings seem to be ignored.
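
    A couple of client-side checks that show what the firewall engine actually loaded, as opposed to what gpresult says was delivered (stock Windows tools; the report filename is arbitrary):

        gpupdate /force
        gpresult /h gpreport.html

        rem Ask the firewall itself which policy it is enforcing
        netsh advfirewall show allprofiles
        netsh advfirewall firewall show rule name=all dir=in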

  • Failure to copy files with ownership/ACL information on a Windows Server 2008 R2 machine

    - by darklion
    I'm attempting to copy a directory tree, maintaining its ownership information, using the command:

        XCOPY S:\ProjectsDefault\Tempalte\admin S:\Projects\00\111\admin /S /E /I /O

    The command gives an "Access denied" error message, and while it does create the directory tree, the ownership and ACL information is not copied. This is being done on a Windows Server 2008 R2 machine which has mounted a share from a Windows 2003 R2 domain controller. The user has been granted full access to the share and is a member of the Domain Admins security group. Oddly enough, the command does work if performed on a different server (Windows 2003 R2). (It also works if done using the Domain Administrator account on the 2008 server.)
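
    One avenue worth testing is robocopy, which ships with Server 2008 R2 and can copy security information in backup mode, sidestepping ACL read restrictions (/COPYALL copies data, attributes, timestamps, ACLs, owner and auditing; /B needs an elevated prompt, and UAC token filtering may well be the difference between the failing account and the Domain Administrator):

        robocopy S:\ProjectsDefault\Tempalte\admin S:\Projects\00\111\admin /E /COPYALL /B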

  • can't execute scripts compiled with shc

    - by serilain
    I'm trying to use shc to compile a shell script so that I can set the SUID bit on it and obfuscate what it's doing (I'm attempting to have it run as part of all new users' .bashrc). As a test, I wrote a script that is simply:

        #!/bin/bash
        env

    and compiled it using:

        shc -r -f script.sh

    However, when I try to run the resulting binary by simply doing ./script.sh.x, even after setting its mode to 777 (just for testing purposes), I get "Operation not permitted; killed" unless I run it with sudo (which I don't want to have to do). Am I running afoul of some Ubuntu permissions that won't let me run binaries created by shc? Thanks!
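
    Two quick checks that would narrow down where the "Operation not permitted" comes from (a sketch, not a diagnosis):

        # Is the filesystem holding the binary mounted noexec or nosuid?
        mount | grep " $(df -P . | awk 'NR==2 {print $6}') "

        # Which system call actually fails?
        strace -f ./script.sh.x 2>&1 | tail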
