Search Results

Search found 4099 results on 164 pages for 'party'.

  • Transfer all 1&1 web and e-mail services to own Synology NAS using No-IP for DDNS

    - by Neo
    I have a domain, x-treem.net. The registrar is DomainDiscover, and I have a hosting package with 1&1 which includes web and e-mail. I also have an additional package with 1&1 - Microsoft Exchange - which centralises all my e-mails, tasks, contacts, notes, etc.; I connect to it from my PC (Outlook) and my Android phone. I have just purchased a Synology NAS (DS213) and I can see I can run a web server (Web Station) and an e-mail server (Mail Server) on it, amongst other things. I am behind a dynamic IP.

    So, I'm looking for some clarification on what I must do to consolidate my services, make use of my NAS to do as much as possible, and save third-party hosting costs. My registrar specifies the nameservers as NS45.1AND1.CO.UK and NS46.1AND1.CO.UK. The MX records are mx00.1and1.co.uk and mx01.1and1.co.uk. I'm aware of the concept of DDNS and I am looking at using No-IP.com for this. This is where I need clarification: if I registered with the No-IP paid service, pointed my registrar to No-IP's nameservers, and used the DDNS support on my NAS (which supports No-IP), then any requests to x-treem.net would go to my NAS. Is that correct? Web requests would therefore hit the web server on my NAS, and e-mails would hit the mail server on my NAS?

    Given all of the above, I could then drop 1&1 completely and use my NAS for everything. I use MySQL, phpMyAdmin and phpBB on 1&1, all of which the Synology NAS appears to support in its available packages. As for Microsoft Exchange, Synology offers Zarafa, which appears to be a drop-in replacement for Exchange. Am I on the right track here, or is there anything I am missing?
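
    For a sense of what that setup looks like, here is a minimal sketch of the zone No-IP would host once the registrar delegates x-treem.net to their nameservers - illustrative only, with a documentation IP standing in for the dynamic address and a hypothetical mail hostname:

        ; x-treem.net zone as hosted at No-IP (sketch)
        x-treem.net.        IN  A      203.0.113.27    ; dynamic IP, kept current by the NAS DDNS client
        www.x-treem.net.    IN  CNAME  x-treem.net.
        x-treem.net.        IN  MX 10  mail.x-treem.net.
        mail.x-treem.net.   IN  A      203.0.113.27    ; same dynamic address

    The MX targets a hostname rather than the bare IP so that mail delivery follows address changes along with the A record; web (80/443) and SMTP (25) then need forwarding from the router to the NAS.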

  • Windows Server 2003 Trial Activation Issue

    - by Adam Batkin
    I have a Windows Server 2003 (R2 Enterprise with SP2) VM, originally installed with a trial license. We forgot about the server, more than 120 days have passed, and now I can't do anything with it. I seem to be at a dead end with the existing installation. When I log in, I get:

    "The evaluation period for this copy of Windows has ended. Windows cannot start. To continue using Windows, please purchase and install a retail copy of the product."

    Fine, I'll do that with my MSDN media. I should add that safe mode works, but I didn't find anything obvious there to help me.

    Next up, I tried repairing my installation: boot from the Server 2003 R2 Enterprise with SP2 media, tell it I want to install (as opposed to the recovery console), then let it repair the existing install. Once that completes and reboots, I log in:

    "This copy of Windows must be activated with Microsoft before you can continue. You cannot log on until you activate Windows. Do you want to activate Windows now? To shut down the computer, click Cancel."

    Great! I click "Yes" and am left with a big blue screen. Not a blue screen of death, just a blue screen (i.e. the default Windows desktop background color). No Ctrl+Alt+Del. All I can do is power cycle.

    I have some complex third-party software on there that I can't reinstall, which is why I haven't already built a fresh Windows VM and copied everything over. I have a backup of the VM from after the trial period expired but before I installed anything. Ideas?

  • Windows 8 DLNA streaming of MKV files

    - by dbb
    I have a Samsung TV that is capable of playing MKV files. The Windows DLNA "Play To" menu that appears when you right-click a media file does not support MKV files, but a simple trick has been to change the file extension from .mkv to .avi so the Play To context menu item would appear. At that point I could successfully stream from my computer to my TV.

    However, this does not appear to work in Windows 8. Doing the same thing in Windows 8 causes the Play To window to open, but the file does not get played. Dragging and dropping the file into the Play To window causes it to be silently ignored. Using actual AVI, MP4, etc. files works, of course. It appears Windows 8 is now doing some kind of validation on the file that Windows 7 wasn't doing previously. The Play To window does not show any kind of obvious error message or warning, and there is nothing in the Windows event log.

    So, is there a way in Windows 8 to stream MKV files to a DLNA device without converting them to another container format? I would rather not use extra third-party software, but I would consider it if it's purposefully designed for this simple case rather than a more robust "media library/server" solution.

  • How to disable Windows Server 2008 timestamp response

    - by Cal
    I posted this question on Stack Overflow but was directed to post it here. I was using Rapid7's Nexpose to scan one of our web servers (Windows Server 2008), and got a vulnerability for timestamp response. According to Rapid7, the timestamp response should be disabled: http://www.rapid7.com/db/vulnerabilities/generic-tcp-timestamp

    So far I have tried several things:

    1. Edited the registry, adding a "Tcp1323Opts" value to HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters and setting it to 0 (http://technet.microsoft.com/en-us/library/cc938205.aspx).
    2. Used this command: netsh int tcp set global timestamps=disabled
    3. Tried the PowerShell command Set-NetTCPSetting -SettingName InternetCustom -Timestamps disabled, which gave the error: "Set-netTCPsetting : The term 'Set-netTCPsetting' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again."

    None of the above attempts was successful; after a re-scan we still got the same alert. Rapid7 suggested using a firewall that's capable of blocking it, but we want to know if there is a setting on Windows to achieve it. Is it through a specific port? If yes, what is the port number? If not, could you suggest a third-party firewall that is capable of blocking it? Thank you very much.
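
    For reference, a consolidated sketch of the two settings above (untested against Nexpose here). Note that Tcp1323Opts is a bitmask - bit 0 is window scaling, bit 1 is RFC 1323 timestamps - so a value of 1 disables timestamps while keeping window scaling, whereas 0 disables both. The Set-NetTCPSetting cmdlet is not recognized because the NetTCPIP module only shipped with Windows Server 2012 and later.

        rem run elevated; the registry change requires a reboot to take effect
        reg add HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters /v Tcp1323Opts /t REG_DWORD /d 1 /f
        netsh int tcp set global timestamps=disabled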

  • Router for creating site-to-site VPN to server provider using Cisco ASA 5540

    - by fondie
    We have dedicated servers hosted for us by a third party, and we connect to these over a VPN. Our server provider uses a Cisco ASA 5540 as the VPN device. Currently we're using software clients on individual machines to connect to this VPN, either:

    - Cisco VPN Client
    - Shrew Soft VPN Connect

    However, I'm looking to purchase a new load-balancing router for our office and thought this could be an opportunity to have VPN client duties taken over by hardware. We could then create a permanent VPN tunnel that could be used by anyone on the network, with no software client necessary. Sadly I'm not the most knowledgeable on this kind of stuff, so:

    1. Is this a realizable goal?

    Next I need to know what kind of hardware I will need. I'm not looking to spend lots of money on this (~$500), so it's doubtful I can afford any Cisco kit. Therefore, this is the most promising candidate I've seen (as far as my limited knowledge goes): Draytek Vigor 2955 - http://www.draytek.co.uk/products/vigor2955.html

    2. Would this be compatible with the Cisco kit my server provider uses?
    3. If not, are there any alternatives I should consider?

    Many thanks in advance.
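
    For what it's worth, an IPsec LAN-to-LAN tunnel mostly comes down to both ends agreeing on parameters. A sketch of what the provider's ASA side of such a tunnel typically looks like - hypothetical values to confirm with the provider, and the syntax shown is the ASA 8.4+ flavor (older versions use "crypto isakmp" instead of "crypto ikev1"):

        ! Phase 1 (IKE) policy the office end must match
        crypto ikev1 policy 10
         authentication pre-share
         encryption aes-256
         hash sha
         group 2
         lifetime 86400
        ! Phase 2 (IPsec) transform set
        crypto ipsec ikev1 transform-set OFFICE-TS esp-aes-256 esp-sha-hmac
        ! LAN-to-LAN tunnel group keyed to the office's public IP
        tunnel-group 203.0.113.10 type ipsec-l2l
        tunnel-group 203.0.113.10 ipsec-attributes
         ikev1 pre-shared-key ********

    Any router that speaks standard IKEv1/IPsec with pre-shared keys can terminate such a tunnel; the office end just needs the same proposals, the same key, and matching definitions of the two LANs.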

  • How to shrink Windows 7 boot partition with unmovable files

    - by Alex Che
    I have just bought an HP laptop with Windows 7 (64-bit). It has a 500 GB HDD with three partitions: a small hidden system partition, a 12 GiB HP recovery partition, and a 450 GiB C: boot partition. I would like to split this large C: partition into two, leaving only 100 GiB for the system and giving the rest to a new data partition.

    Although the Windows built-in Disk Management utility has an option to shrink the bootable partition, it only allows me to shrink it by roughly half, even though only 20 GiB of the partition is used. As far as I understand, unmovable system files lie in the middle of the partition, preventing the Disk Management utility from doing what I want. And since new HP laptops don't come with OS installation disks (they only allow you to create recovery disks yourself), I can't just repartition the HDD and then reinstall the OS.

    So, is there any way to shrink the C: bootable partition and keep Windows 7 working?

    P.S.: I tried the 3rd-party GParted utility, and after shrinking the partition Windows 7 stopped booting with a BSOD. System recovery didn't work, and I had to do a factory recovery. Since this is a long process, I would like to avoid doing it again :) So please suggest only proven solutions.
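
    One commonly used approach - a sketch, on the assumption that the unmovable files are the usual suspects (pagefile, hiberfil.sys, and System Restore shadow storage) rather than the $MFT itself - is to remove those files temporarily, shrink, and then put everything back:

        rem elevated command prompt
        powercfg -h off                  rem removes hiberfil.sys
        rem also disable the pagefile and System Restore via System Properties,
        rem then reboot so the files are actually released
        diskpart
        DISKPART> select volume C
        DISKPART> shrink querymax        rem shows how much can now be reclaimed
        DISKPART> shrink desired=358400  rem amount in MB - hypothetical figure
        rem afterwards, re-enable the pagefile, System Restore and hibernation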

  • Bind ADFS 2.0 service to a specific IP address

    - by ccellar
    I have one server with ADFS 2.0 and a few websites on it. One of the websites is Dynamics CRM, which listens on a specific IP address on port 443. Dynamics CRM provides a metadata file for configuration purposes which can be used to configure a relying party trust with ADFS. It is accessible at the URL https://auth.contoso.com/FederationMetadata/2007-06/federationmetadata.xml

    The problem is that ADFS 2.0 installs a service which registers the following urlacl:

        https://+:443/FederationMetadata/2007-06/

    This means the result of accessing the URL https://auth.contoso.com/FederationMetadata/2007-06/federationmetadata.xml is the metadata file of ADFS, not the one of Dynamics CRM. I've tried to delete the default urlacl and added (one of them at a time):

        https://192.168.1.2:443/FederationMetadata/2007-06/
        https://adfs.mydomain.com:443/FederationMetadata/2007-06/

    but neither of them worked. Instead, the ADFS service failed to start up completely. Is there any way to bind this service to an IP address? At the moment I see only two alternatives:

    1. Bind the service to a non-standard port. This leads to problems because it means the ADFS website also has to use a non-standard HTTPS port.
    2. Install ADFS 2.0 on a different server (this is my favorite alternative; however, it is not possible in every situation...)
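
    For anyone retracing the steps, this is the netsh syntax involved - shown as a sketch only, since as described above the ADFS service may refuse to start once the wildcard reservation it expects is gone; the service account name is hypothetical:

        rem list the reservations ADFS registered
        netsh http show urlacl

        rem what was attempted: drop the wildcard, reserve a specific address
        netsh http delete urlacl url=https://+:443/FederationMetadata/2007-06/
        netsh http add urlacl url=https://192.168.1.2:443/FederationMetadata/2007-06/ user="CONTOSO\svc-adfs"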

  • Scripting an automated SQL Server 2008 DR move

    - by ItsAMystery
    Hi all. We use the built-in log shipping in SQL Server to log-ship to our DR site, but once a month we do a DR test which requires us to move back and forth between our live and backup servers. We run multiple (30) databases on the system, so manually backing up the final logs and disabling the jobs is too much work and takes too long. I thought "no problem, I will script it", but have run into trouble with it always complaining that the final log-ship is too early to apply, even though I don't export the final log until putting the database into norecovery mode.

    Firstly, does anyone know a simple and reliable way of doing this? I have looked at some 3rd-party software (Red Gate SQL Backup, I think it was) but that didn't make it easy in this situation either. What I want to be able to do is basically run a script (a series of stored procedures) to get me to DR and run another to get me back, with no data loss.

    My scripts are very simplistic at the moment, but here they are. There are two servers: primary PARIS, secondary PARIST. StartAgentJobAndWait is a script written by someone else (thanks) that just checks the jobs have finished, or quits them if they never end. At the moment I am just using a test database called BOB2, but if I can get it working I will pass in the database and job names.

    From PARIS:

        /* Disable backup job */
        exec msdb..sp_update_job @job_name = 'LSBackup_BOB2', @enabled = 0
        exec PARIST.msdb..sp_update_job @job_name = 'LSCopy_PARIS_BOB2', @enabled = 0
        exec PARIST.msdb..sp_update_job @job_name = 'LSRestore_PARIS_BOB2', @enabled = 0
        exec PARIST.master.dbo.DRStage2

    DRStage2 (on PARIST):

        DECLARE @RetValue varchar(10)
        EXEC @RetValue = StartAgentJobAndWait 'LSCopy_PARIS_BOB2', 2
        SELECT ReturnValue = @RetValue
        IF @RetValue = 1
            PRINT 'The Copy task completed successfully'
        ELSE
            PRINT 'The Copy task failed; this may or may not be a problem, check the restore state of the database'

        SELECT @RetValue = 0
        EXEC @RetValue = StartAgentJobAndWait 'LSRestore_PARIS_BOB2', 2
        SELECT ReturnValue = @RetValue
        IF @RetValue = 1
            PRINT 'The Restore task completed successfully'
        ELSE
            PRINT 'The Restore task failed; this may or may not be a problem, check the restore state of the database'

        exec PARIS.master.dbo.DRStage3

    DRStage3 (on PARIS - does the last log-ship and moves it to Trumpington):

        BACKUP LOG "BOB2" TO DISK = 'c:\drlogshipping\BOB2.bak' WITH COMPRESSION, NORECOVERY
        EXEC xp_cmdshell 'copy c:\drlogshipping \\192.168.7.11\drlogshipping'
        EXEC PARIST.master.dbo.DRTransferFinish

    DRTransferFinish (on PARIST):

        RESTORE DATABASE "BOB2" FROM DISK = 'c:\drlogshipping\bob2.bak' WITH RECOVERY

  • Windows 8.1 upgrade fails with error code 0xC1900101 - 0x20017

    - by cmorse
    I just tried to install Windows 8.1 on my laptop, but it fails with the message:

    "Sorry we couldn't complete the update to Windows 8.1. We restored your previous version of Windows to this PC. 0xC1900101 - 0x20017"

    It looks like on the first boot the laptop is going to the PC restore screen (it asks what kind of keyboard I have, and then what repair options I would like to take). Thus far I have just been selecting "Continue to Windows 8."

    I'm running a Lenovo x220i tablet with 43 GB of free disk space. The upgrade installed just fine on my desktop. The primary differences between the two machines are that the desktop has Media Center installed and isn't using TrueCrypt.

    Full WindowsUpdate.log: http://pastebin.com/hGmAW4Q1

    Most important portion of WindowsUpdate.log:

        2013-10-17 10:41:06:671 964 694 Agent *************
        2013-10-17 10:41:06:671 964 694 Agent ** START ** Agent: Finding updates [CallerId = AutomaticUpdates]
        2013-10-17 10:41:06:671 964 694 Agent *********
        2013-10-17 10:41:06:671 964 694 Agent * Online = No; Ignore download priority = No
        2013-10-17 10:41:06:671 964 694 Agent * Criteria = "IsInstalled=0 and DeploymentAction='Installation' or IsPresent=1 and DeploymentAction='Uninstallation' or IsInstalled=1 and DeploymentAction='Installation' and RebootRequired=1 or IsInstalled=0 and DeploymentAction='Uninstallation' and RebootRequired=1"
        2013-10-17 10:41:06:671 964 694 Agent * ServiceID = {7971F918-A847-4430-9279-4A52D1EFE18D} Third party service
        2013-10-17 10:41:06:671 964 694 Agent * Search Scope = {Machine & All Users}
        2013-10-17 10:41:06:671 964 694 Agent * Caller SID for Applicability: S-1-5-18
        2013-10-17 10:41:07:233 964 870 Report REPORT EVENT: {AD47FBDC-F7F9-4E7F-BAF4-DBA3784C7101} 2013-10-17 10:41:06:436-0600 1 202 [AU_REBOOT_COMPLETED] 102 {00000000-0000-0000-0000-000000000000} 0 0 AutomaticUpdates Success Content Install Reboot completed.
        2013-10-17 10:41:07:233 964 870 Report REPORT EVENT: {8D4E7A67-9526-4702-A897-5BE5F97497AF} 2013-10-17 10:41:06:639-0600 1 204 [AGENT_INSTALLING_FAILED_POST_REBOOT] 101 {8951E70D-4332-4F7C-B92D-D9362E384959} 1 c1900101 WSAcquisition Failure Content Install Installation Failure Post Reboot.
        2013-10-17 10:41:07:249 964 870 Report CWERReporter::HandleEvents - WER report upload completed with status 0x8
        2013-10-17 10:41:07:249 964 870 Report WER Report sent: 7.8.9200.16715 0xc1900101(0x20017) 8951E70D-4332-4F7C-B92D-D9362E384959 Install 101 Unmanaged
        2013-10-17 10:41:07:249 964 870 Report CWERReporter finishing event handling. (00000000)

  • Nginx dynamic upstream configuration / routing

    - by Dan Sosedoff
    I was experimenting with dynamic upstream configuration for nginx and can't find any good way to implement upstream configuration from a third-party source like Redis or MySQL. The idea behind it is to have a single-file configuration on the primary server and proxy requests to various app servers based on environment conditions. Think of dynamic deployments where you have X servers that are running Y workers on different ports.

    For instance, I create a new app and deploy. The app manager selects a server, rolls out a worker (Ruby/PHP/Python), and then reports the ip:port to the central database with status "up". From that point, when I go to the given URL, nginx should proxy all requests to the specified ip:port upstream. The whole thing is pretty similar to what Heroku does, except this proof-of-concept is not supposed to be production-ready; it's mostly for internal needs.

    The easiest solution I found was using a resolver with a Ruby-based DNS server. It works, and nginx gets the IP address correctly, but the only problem is that you can't define a port number for that IP. A second solution (which I haven't tried yet) is to roll something else as a proxy server, maybe written in Erlang. In that case we would need something else to serve static content.

    Any ideas how to implement this in a more flexible and stable way?

    P.S. Some research options:
    http://openresty.org/#DynamicRoutingBasedOnRedis
    https://github.com/nodejitsu/node-http-proxy
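
    The OpenResty link above boils down to something like the following - a minimal sketch assuming the lua-resty-redis library and a (hypothetical) convention of storing "ip:port" in Redis under the request's Host header. Because the target comes back as a full address, it sidesteps the port problem that plain DNS-based resolution has:

        # nginx.conf fragment (OpenResty) - illustrative, not production-ready
        server {
            listen 80;
            location / {
                set $target '';
                access_by_lua '
                    local redis = require "resty.redis"
                    local red = redis:new()
                    red:set_timeout(1000)  -- ms
                    assert(red:connect("127.0.0.1", 6379))
                    local upstream = red:get(ngx.var.http_host)
                    if not upstream or upstream == ngx.null then
                        return ngx.exit(502)
                    end
                    ngx.var.target = upstream  -- e.g. "10.0.0.12:9001"
                ';
                proxy_pass http://$target;
            }
        }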

  • Trouble with IIS SMTP relaying to Gmail

    - by saille
    I appreciate that similar questions have been asked about how to set up SMTP relaying with IIS's virtual SMTP server, but I'm still completely stumped by this problem. Here's the setup: an IIS 6.0 SMTP server running on a Win2k3 box with a NAT'ed IP. The company uses Gmail for all email services. An app on the box needs to send email, so normally we'd just set the app up to talk to smtp.gmail.com directly, but this app doesn't support TLS. Easy, we just set up a local SMTP relay, right? So I thought.

    What we have done so far:

    - Set up the IIS SMTP server to relay to smtp.gmail.com, as per these excellent instructions: http://fmuntean.wordpress.com/2008/10/26/how-to-configure-iis-smtp-server-to-forward-emails-using-a-gmail-account/
    - The local SMTP relay allows anonymous access. Both the local IP and the loopback IP have been explicitly allowed in the Connection... and Relay... dialogs.
    - Tried sending email from 2 different apps via the local SMTP server, but failed (the emails end up in the Queue folder, but never get sent).
    - The IIS logs show the conversation with the local app, but zero conversation happening with smtp.gmail.com.
    - The port used by Gmail is open outbound, and indeed the apps we have that support TLS can send email directly via smtp.gmail.com, so there is no problem with the network.

    At this point I changed the SMTP settings in the IIS SMTP server to use a different external SMTP server and hey presto, the local apps can send email via the local IIS SMTP relay. So smtp.gmail.com fails to work with our IIS SMTP relay, but another 3rd-party SMTP service works fine. We need to use smtp.gmail.com, so how do I troubleshoot this one?
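
    A quick way to see what the relay is up against is to talk to Gmail by hand from the Win2k3 box - a diagnostic sketch only. The usual sticking point (an assumption about this case) is that smtp.gmail.com insists on STARTTLS before accepting mail, while the IIS 6 SMTP service cannot do implicit SSL on port 465 at all:

        telnet smtp.gmail.com 25
        rem then type:  EHLO mybox.local
        rem the reply should include 250-STARTTLS; without completing TLS,
        rem Gmail answers MAIL FROM with "530 Must issue a STARTTLS command
        rem first", which would leave messages sitting in the Queue folder
        telnet smtp.gmail.com 587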

  • Certain banking pages not loading

    - by Joseph Lee
    For some unknown reason, I am suddenly unable to access my accounts at several banking and credit sites. I have been a registered user at each site for several years and know I am using the correct user ID and password. Yet, after entering the data, answering security questions, and clicking the submit button, I land on a page with an error message saying there is a technical problem preventing me from accessing my account. On one site, I end up at the sign-in page repeatedly. I am never told that my ID/password are incorrect.

    I believe this may be firewall-related. Windows Firewall was damaged after a recent malware attack, and I am now using a third-party firewall (Fort Knox). I am not seeing a pop-up indicating sites are blocked or asking me to indicate yes or no.

    I am using Windows 7 Home Premium. I get the same result regardless of the browser; I switched to Maxthon last night and am getting the same result. This is not happening at other sites, and I am able to access some banking sites normally. This is frustrating because I need to make payments and have gone paperless. Any feedback will be appreciated. ---- Joe ----

  • How to back up non-standard directories in my user profile with Windows Backup?

    - by James Johnston
    I'm using Windows Backup to back up my Win7 Pro laptop. I'd like to use it to back up my complete user profile, but I only see standard profile directories (e.g. C:\Users\JohnstonJ\Documents) in the list. Non-standard ones aren't there (e.g. C:\Users\JohnstonJ\MyCustomDirectory).

    What's the best way to handle this? The only thing I can think of is to browse under the "Computer" entry, navigate directly to C:\Users\JohnstonJ, and check off the entire profile (to get what's in there, plus any new directories that come up). But is that going to back up the profile twice? Or cause other unforeseen problems, given that I checked it off by navigating through the computer rather than picking it under the "Data Files" category (e.g. backing up temporary file garbage, files-in-use problems, etc., that the "Data Files" category might be handling better)?

    I'm looking for solutions that other people use, that are known to work well, and that still use the Windows Backup software - I don't really want to fuss with 3rd-party backup software.

    Example - as you can see, I have two directories in my profile that Windows Backup is not offering to back up: "Dropbox" and "New folder". (Link to images album because I don't have enough reputation to directly embed them: http://imgur.com/a/Xyv5u)
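
    One route that stays Microsoft-only is the command-line engine underneath Windows Backup, which takes explicit paths instead of the wizard's category list - a sketch, on the assumptions that this Win7 edition exposes wbadmin for file-level backups and that E: is the backup target (both hypothetical):

        rem elevated command prompt; back up the whole profile by path
        wbadmin start backup -backupTarget:E: -include:C:\Users\JohnstonJ -quiet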

  • Is there a simple LDAP-to-HTTP gateway out there?

    - by larsks
    We have a local LDAP directory that provides basic contact information about our user community. We would like to integrate this into some third-party hosted services that allow us to implement widgets that run arbitrary JavaScript. In order to connect JavaScript to our LDAP directory, I would like to set up a simple LDAP-to-HTTP proxy that would accept HTTP GET requests, translate them into an appropriate LDAP query, and respond with directory information as JSON-encoded data.

    In an ideal world, something like this:

        GET /[email protected]

    would get me something like this:

        {
          "cn": "Bob Person",
          "title": "System Administrator",
          "sn": "Person",
          "mail": "[email protected]",
          "telephoneNumber": "617-555-1212",
          "givenName": "Bob"
        }

    (And this obviously assumes that the web application has locally configured information about what base DN to use, how to authenticate, etc.) I guess I could write one... but surely something like this already exists?

    UPDATE: The consensus seems to be that there isn't a pre-existing solution out there and that I should just get off my lazy derriere and write one. So I did, and it's here. It's not especially pretty, but it works for my prototyping and I figure maybe someone else will find it useful someday.
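
    For anyone landing here with the same need, the gateway really is only a couple of dozen lines. A minimal sketch (not the poster's actual code) using Flask and the ldap3 package - hostname, base DN, and route shape are all hypothetical, and a real deployment would add authentication and an attribute allow-list:

        # ldap2json.py - LDAP-to-HTTP/JSON gateway sketch
        from flask import Flask, abort, jsonify, request
        from ldap3 import Connection, Server
        from ldap3.utils.conv import escape_filter_chars

        app = Flask(__name__)
        LDAP_HOST = 'ldap.example.com'             # assumption
        BASE_DN = 'ou=people,dc=example,dc=com'    # assumption
        ATTRS = ['cn', 'sn', 'givenName', 'title', 'mail', 'telephoneNumber']

        @app.route('/lookup')
        def lookup():
            mail = request.args.get('mail')
            if not mail:
                abort(400)
            conn = Connection(Server(LDAP_HOST), auto_bind=True)  # anonymous bind
            conn.search(BASE_DN, '(mail=%s)' % escape_filter_chars(mail),
                        attributes=ATTRS)
            if not conn.entries:
                abort(404)
            return jsonify(conn.entries[0].entry_attributes_as_dict)

        if __name__ == '__main__':
            app.run()

    Put behind HTTPS, a widget's JavaScript can then call GET /lookup?mail=... and receive the JSON shown above.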

  • How to Protect Sensitive (HIPAA) SQL Server Standard Data and Log Files

    - by Quesi
    I am dealing with electronic personal health information (ePHI or PHI), and HIPAA regulations require that only authorized users can access ePHI. Column-level encryption may be of value for some of the data, but I need the ability to do LIKE searches on some of the PHI fields, such as name.

    Transparent Data Encryption (TDE) is a feature of SQL Server 2008 for encrypting database and log files. As I understand it, this prevents someone who gains access to the MDF, LDF, or backup files from being able to do anything with them, because they are encrypted. But TDE is only on the Enterprise and Developer editions of SQL Server, and Enterprise is cost-prohibitive for my particular scenario.

    How can I get similar protection on SQL Server Standard? Is there a way to encrypt the database and backup files (is there a third-party tool)? Or, just as good, is there a way to prevent the files from being used if the disk were attached to another machine (Linux or Windows)? Administrator access to the files from the same machine is fine; I just want to prevent any issues if the disk were removed and hooked up to another machine.
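
    One operating-system-level answer to the removed-disk scenario is volume encryption rather than database encryption - a sketch, assuming the server's Windows edition includes BitLocker and that the MDF/LDF/backup files live on a dedicated D: volume (hypothetical):

        rem the volume key is protected by this machine's TPM or a password,
        rem so the raw disk is unreadable when attached to another machine
        manage-bde -on D: -RecoveryPassword
        manage-bde -status D: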

  • "Cannot perform a differential backup for database "myDb", because a current database backup does no

    - by krimerd
    Hi there. I have what seems to be a pretty common problem when trying to take a differential backup. We have a SQL Server 2008 Standard (64-bit) and we use LiteSpeed v5.0.2.0 to take our backups. We take full backups once a week and a differential on a daily basis. The problem is, every time I try to take a diff backup I get the following error:

    "VDI open failed due to requested abort. BACKUP DATABASE is terminating abnormally. Cannot perform a differential backup for database "myDb", because a current database backup does not exist. Perform a full database backup by reissuing BACKUP DATABASE, omitting the WITH DIFFERENTIAL option."

    The problem is that I know 100% I have a full backup, because I just double-checked. Only once was I able to take a diff backup, and that was when I took it immediately after a full backup.

    I have searched around and noticed that this is pretty common (although mostly with SQL 2005), and a solution a lot of people suggest, which I haven't tried yet, is to disable the SQL Server VSS Writer service. The problem with this is that #1, I think I might need this service since I am using third-party backup software, and #2, I am not sure exactly what the service does and don't want to disable it just like that.

    Has any of you ever experienced this problem, and how did you go about fixing it? Thank you.
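
    Before disabling anything, it is worth checking what the server itself considers the most recent full backup - a diagnostic sketch; if the latest full turns out to be a copy-only backup or a VSS snapshot taken behind LiteSpeed's back (is_snapshot = 1), that is a plausible reason the differential has no base:

        -- type: D = full, I = differential, L = log
        SELECT TOP (5)
               database_name, type, is_copy_only, is_snapshot, backup_start_date
        FROM   msdb.dbo.backupset
        WHERE  database_name = N'myDb'
        ORDER  BY backup_start_date DESC;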

  • Sharing large (multi-GB) files with clients

    - by Tim Long
    I wasn't sure if this was the best place for this question, but I think it is squarely in the realm of the IT admin, so that's the reason I put it here.

    We need to share large files (several gigabytes) with external clients. We need a simple way of reliably and automatically publishing these files so that clients can then download them. Our organization has Windows desktops and a Windows SBS 2011 server.

    Sharing from our server is probably suboptimal from the client's perspective, because of the low upstream bandwidth of typical ADSL (around 1 Mbps) - it would take all day (9 hours for a 4 GB file) for the client to download the file. Uploading to a 3rd-party server is good for the client but painful for us, because we then have to deal with a multi-hour upload. Uploading to a third-party server would be less problematic if it could be made reliable and automatic, e.g. something like a Groove/SharePoint Workspace - simply drop the file in and wait for it to synchronize - but Groove has a 2 GB limit, which is not big enough.

    So ideally I'd like a service with the following attributes:

    - Must work for files of at least 5 GB, preferably 10 GB
    - Once the transfer is started, it must be reliable (i.e. not sensitive to disconnections and service outages) and completely automatic
    - Ideally, the sender would get a notification when the transfer completes
    - Has to work with Windows-based systems

    Any suggestions?

  • SQL Server 2008 Remote Access

    - by Timothy Strimple
    I'm having problems connecting to my SQL Server 2008 database from my computer. I have enabled remote connections as described in this answer (http://serverfault.com/questions/7798/how-to-enable-remote-connections-for-sql-server-2008), and I have added the ports listed on the Microsoft support page to our Cisco ASA firewall, but I'm still unable to connect. The error I'm getting from SQL Management Studio is:

    "A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 - A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.) (Microsoft SQL Server, Error: 10060)"

    Once again, I have double- and triple-checked that remote connections are enabled under the database properties and that TCP is enabled on the configuration page. I've added TCP ports 135, 1433, 1434, 2382, 2383, and 4022, as well as UDP 1434, to the firewall. I've also checked that 1433 is the static port set in the TCP section of the database server configuration. The ports should be configured correctly in the firewall, since HTTP/HTTPS and RDP are all working and the SQL Server ports are set up the same way. What am I missing here? Any help you could offer would be greatly appreciated.

    Edit: I can connect to the server via TCP on the internal network. The servers are colocated in a datacenter and I can connect from my production box to my development box and vice versa. To me that indicates a firewall issue, but I've no idea what else to open. I've even tried allowing all TCP ports to that server without success.
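
    From outside the firewall, it is worth confirming what actually answers on the SQL ports before changing the ASA again - a sketch using Microsoft's free PortQry tool, with a hypothetical server name:

        portqry -n sql.example.com -p tcp -e 1433
        portqry -n sql.example.com -p udp -e 1434
        rem LISTENING     = reached the service end to end
        rem FILTERED      = dropped in transit, typically by a firewall
        rem NOT LISTENING = reached the host, but nothing bound to the port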

  • How To Remove Bottleneck with Squid Caching Proxy

    - by Volomike
    I'm more of a LAMP web developer trying to help the sysop. When I joined the project, I inherited some old PHP spaghetti code. Part of what that code does is go out to a third-party website (let's call it thirdparty.com) and pull down content with an HTTP GET request. Unfortunately, the way the code is designed, it needs to do this several times a minute. When we looked at the bottlenecks on the server with 'netstat -a', we saw that connections to thirdparty.com were constantly running, when this content would be plenty fine to be gathered once a day.

    What I need to know is whether the Squid caching proxy server is the solution we need. I'm guessing that it might let us have it pretend to be thirdparty.com on the network: if the web server needs to query thirdparty.com, it hits Squid instead, and Squid can then determine whether it needs to supply content from cache or go to thirdparty.com for fresh content.

    Is this the solution we need? And second, is it easily configured so as to cache only thirdparty.com requests?
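
    That is essentially the job Squid was built for. A minimal squid.conf sketch of the "cache only this one domain, and hold it for a day" idea - directive names from the Squid 2.7/3.x era, and the override options assume thirdparty.com sends cache-unfriendly headers (worth verifying first):

        # cache thirdparty.com and nothing else
        acl thirdparty dstdomain .thirdparty.com
        cache allow thirdparty
        cache deny all

        # treat fetched objects as fresh for 24 hours (1440 minutes),
        # even if the origin's headers say otherwise
        refresh_pattern -i thirdparty\.com 1440 100% 1440 override-expire ignore-no-cache ignore-private

    The PHP side then only needs its HTTP client pointed at the proxy (e.g. CURLOPT_PROXY or the http_proxy environment variable) - no redesign of the spaghetti required.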

  • Apache 2.2: present RSS HTTP 410 pages with application/rss+xml content type

    - by Mark Bakker
    I have a problem sending HTTP 410 for very old RSS feeds. Functionally this can happen in one of two cases:

    1. Very old RSS feeds whose content is not updated anymore / the subject could not move to another feed
    2. Migration from a 3rd-party site to our site, where the RSS feed is no longer functionally supported

    I have tried several things in my site config, see below:

        <VirtualHost *:80>
            DocumentRoot /opt/tomcat/webapps/ROOT/

            ErrorDocument 500 /error/static/error-500.html
            ErrorDocument 503 /error/static/error-500.html
            ErrorDocument 404 /error/static/rss/error-404.html
            ErrorDocument 410 /error/static/rss/error-410.html

            # When error pages need to be served by apache,
            # exclude the files to serve as below (in comment)
            SetEnvIf Request_URI "/error/static/*" no-jk

            # force all files to be image/gif:
            <Location *.rss>
            #<Location *>
                #ForceType application/rss+xml
            </Location>
            #AddType application/rss+xml .rss
            #AddType application/rss+xml .xml
            #AddType application/rss+xml .html

            JkMount /* rss;use_server_errors=402
            # JkMount /* rss

            RewriteEngine on
            JkMount /news.rss rss
            JkMount /documenten-en-publicaties.rss rss

            RewriteEngine on
            RewriteRule ^/news.rss$ - [NC,T=application/rss+xml,G,L]
            RewriteRule ^/documenten-en-publicaties.rss$ - [NC,G,L]

            # Possible values include: debug, info, notice, warn, error, crit,
            # alert, emerg.
            LogLevel warn
            ErrorLog "|/usr/bin/logger -s -p local3.err -t 'Apache'"
            CustomLog "|/usr/bin/logger -s -p local2.info -t 'Apache'" combined
            ServerSignature Off
        </VirtualHost>

    The desired end result is that /news.rss and /documenten-en-publicaties.rss return a 410 page with content in the error page, served with content type 'application/rss+xml'.
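
    One avenue that keeps things out of mod_jk entirely (a sketch, untested against the JkMount setup above): let mod_alias answer 410 outright, and give the error document itself an .rss extension so that AddType should determine the Content-Type of the 410 response body:

        Redirect gone /news.rss
        Redirect gone /documenten-en-publicaties.rss
        ErrorDocument 410 /error/static/rss/error-410.rss
        AddType application/rss+xml .rss
        # keep the error document away from mod_jk, as above
        SetEnvIf Request_URI "^/error/static/" no-jk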

  • Windows roaming profile when creating a new user profile

    - by molecule
    When a particular user is having a lot of problems with Windows XP - e.g. applications crashing or becoming unresponsive (ones which used to work) - and as a general troubleshooting practice for a domain user, I normally rename that user's old profile and have him/her log on to create a "fresh" profile (on the same PC). More often than not, this will solve the problem, albeit with some reconfiguration, i.e. Outlook, Excel add-ins, etc.

    As I took over the systems admin role from another administrator, I would like to know the easiest way to find out (either through a third party or some Windows administrative tool) what settings are carried over if the profile is a roaming profile. I tested creating a new user profile for one of my users, and it seems basic Outlook settings such as the user's mailbox and PSTs are carried over automatically when I create a new user profile. I suspect this is done through a batch file loaded as part of the login script. However, my knowledge of scripting is limited and I don't want any corruption to be carried over to the new profile.

    Can someone share their experiences on this? Thanks in advance.

  • Joining two routers together, but I have no access to the second router, although I know its IP address and gateway

    - by JohnnyVegas
    I have temporarily moved into a rented apartment for 4 months, which has wireless. The trouble I am having is that the access points here are Wi-Fi only, with no RJ45 ports, and I need RJ45 to connect some equipment that I am working with.

    I have purchased an RT-N66U and installed Tomato (Shibby ver. 1.28) and successfully replaced the existing access point, but now I want to re-enable the access point that I replaced, as it links wirelessly to 3 others. Can I plug a cable from the access point into my RT-N66U and get it to access the internet via my router? I have no access to the existing wireless access point, and don't want to reset it as it's not mine. There is another router situated in the roof somewhere, which I also have no access to, but it's supplying my RT-N66U with internet, so I most definitely have a double NAT. Although that isn't the best way of doing things, I am limited in what I can do.

    Any suggestions on routing tables, VLANs etc. would be helpful, but I have no experience in these fields - though I know the Tomato firmware can cater for this. My router is set to IP 10.0.1.1 and DHCP is 10.0.1.100-200. The wireless access point's address was 192.168.1.2, but this was assigned by the router in the roof, which has the address 192.168.1.1. There is a cable from this router going to a wall socket, which I now have my RT-N66U attached to via the WAN port.

    I understand it's scruffy and it isn't the way to do things, but I have tried to ask for the admin details, and as the wireless network is looked after by a third party and nobody knows their details, I am stuck with this dilemma. I could buy three wireless access points and replace the existing ones, but this isn't what I want to do; and although I have installed plenty of DD-WRT wireless repeater bridges, they simply don't work here for some unknown reason. The phone line here is very noisy too, I don't have the rights to install ADSL in a building that isn't mine, and 3G coverage isn't good enough either. Thanks for your time.

  • Apache+PHP on Windows Server 2008

    - by Álvaro G. Vicario
    I've installed Apache 2.2 and PHP 5.3 lots of times under Windows XP, Windows Vista and Windows Server 2003. The official *.msi installers work fine and configure everything. Now I need to install them on a Windows Server 2008 R2 Standard 64-bit box, and I'm facing nothing but problems:

    - There are no official 64-bit binaries for Apache and no binaries at all for PHP (official or third-party). It's alright, I'll make do with good old 32 bits, but it's kind of surprising.
    - Official documentation is vague, generic and completely unaware of UAC or any recent Windows security feature.
    - The PHP installer is unable to configure mod_php, and the Apache installer is unable to configure... well, Apache.

    After three hours I've finally reached the point where I'm installing everything in the root folder and assigning full-control access to all users on all files and directories, and all I've got is a PHP-less Apache server that's able to serve static pages. So I guess it's time to stop and think.

    My question is: has anyone installed an Apache+PHP production server under Windows Server 2008 in a serious, secure and reliable way, and documented the whole process? Or should I just find a bundle like XAMPP and the like that requires no installation?

    === EDIT ===

    I've installed XAMPP Lite 1.7.3 and everything was working in 5 minutes. I'd still like to find some documentation about installing the original packages: XAMPP installs tons of stuff I don't need and offers no tool to enable and disable PHP extensions.
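
    For the record, when the installers fall down, wiring mod_php up by hand is only three lines of httpd.conf - a sketch assuming the 32-bit thread-safe PHP 5.3 zip build unpacked to C:\php (path hypothetical):

        LoadModule php5_module "C:/php/php5apache2_2.dll"
        AddHandler application/x-httpd-php .php
        PHPIniDir "C:/php"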

  • Terminal server performance over high latency links

    - by holz
    Our datacenter and head office are currently in Brisbane, Australia, and we have a branch office in the UK. We have a private WAN with a 768k link to our UK office, and the latency is about 350 ms. The terminal server performance is really bad. Applications that don't have too much animation or any images seem to be okay, but as soon as they do, the session is almost unusable. PowerPoint and Internet Explorer are good examples of apps that make it run slowly. And if there is an image in your email signature, Outlook will hang for about 10 seconds each time a new line is inserted, while the image gets moved down a few pixels.

    We are currently running Server 2003. I have tried Server 2008 R2 RDS, and also a third-party solution called Blaze by a company called Ericom, but it is still not much better. We currently have a 5-level dynamic class of service, with priority in the following order:

    1. VoIP
    2. Video
    3. Terminal Services
    4. Printing
    5. Everything else

    When testing the terminal server performance, the link was monitored using NetFlow, and we have plenty of bandwidth available, so I believe that it is a latency issue rather than bandwidth. Is there anything that can be done to improve performance? Would Citrix help at all?
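
    Short of new products, the RDP client itself can be made considerably less chatty over a 350 ms link - a sketch of .rdp connection-file settings (standard keywords) that cut animation and bitmap traffic, worth trying before buying anything:

        disable themes:i:1
        disable full window drag:i:1
        disable menu anims:i:1
        bitmapcachepersistenable:i:1
        allow font smoothing:i:0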

  • What do I need in order to extract and combine text files from multiple ZIP files, via command line?

    - by Iszi
    I've got an interesting scripting challenge in front of me. I'm fairly certain there's a way to do it, but I feel like I'm probably lacking some particular tools and/or functional knowledge. There are some fifty-plus ZIP files that each contain, among other things, text files that need to be merged with one another. The structure is something like this:

        C:\Reports\FirstJob-1.zip
        |-MyName
          |-FirstJob
            |-1
              |-[Some other folders]
              |-TXTReports
                |-English
                  |-[Some other files]
                  |-Report.txt

        C:\Reports\FirstJob-2.zip
        |-MyName
          |-FirstJob
            |-1
              |-[Some other folders]
              |-TXTReports
                |-English
                  |-[Some other files]
                  |-Report.txt

        C:\Reports\SecondJob-1.zip
        |-MyName
          |-SecondJob
            |-1
              |-[Some other folders]
              |-TXTReports
                |-English
                  |-[Some other files]
                  |-Report.txt

    If I had all the Report.txt files in one regular folder, and uniquely named, I could probably just write a FOR statement that targets *.txt and runs something like type filename.txt >> Consolidated.txt on each. However, these all have the same file name and are embedded deep within separate ZIP files.

    The potentially useful tools I currently have at my disposal are Windows XP Professional SP3, PowerShell, and WinZip. I'd rather not download or install anything else, but I do understand that third-party tools (or additional tools from Microsoft or WinZip) may be necessary. Whatever tools I use should run natively in Windows; I really don't want to have to mess with Cygwin or other emulators on this system.

    At the very least, I need a tool that will allow me to analyze and manipulate ZIP files from the command line. Also, are there any other particular complications to this that I've not yet thought of?
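
    PowerShell can do this with nothing extra installed, because the Shell COM object that Explorer uses presents a ZIP archive as a browsable folder. A sketch under the assumption that each archive holds exactly one Report.txt (paths hypothetical); WinZip's command-line add-on would be the other no-new-downloads route:

        # consolidate.ps1 - merge every Report.txt found inside C:\Reports\*.zip
        $shell = New-Object -ComObject Shell.Application
        $work  = 'C:\Reports\work'            # scratch folder for extraction
        $out   = 'C:\Reports\Consolidated.txt'
        New-Item $work -ItemType Directory -Force | Out-Null

        function Find-Reports($folder) {
            # walk the archive's virtual folder tree looking for Report.txt
            foreach ($item in $folder.Items()) {
                if ($item.IsFolder) { Find-Reports $item.GetFolder }
                elseif ($item.Name -eq 'Report.txt') { $item }
            }
        }

        Get-ChildItem 'C:\Reports\*.zip' | ForEach-Object {
            foreach ($report in (Find-Reports $shell.NameSpace($_.FullName))) {
                $shell.NameSpace($work).CopyHere($report, 0x14)  # 4 = no UI, 16 = yes to all
                Start-Sleep -Milliseconds 500                    # CopyHere is asynchronous
                "===== $($_.Name) =====" | Add-Content $out
                Get-Content (Join-Path $work 'Report.txt') | Add-Content $out
            }
        }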
