Search Results

Search found 42887 results on 1716 pages for 'page header'.

  • IE does not send NTLM domain

    - by Buddy Casino
    Hi! I have a problem with NTLM single sign-on in IE8. We've got multiple domain controllers and users from multiple domains that we try to authenticate to a web application via NTLMv1 pass-through. Somehow IE fails to send the user's domain in the NTLM Type 1 message. As a result the web app cannot match users to their domain controllers, and logons fail because a user from domain X ends up trying to authenticate against domain controller Y. This problem does not occur with Firefox, which always sends the correct domain. So: how do I get IE to send the domain in the NTLM header? Grateful for any help, Michael
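
    A minimal sketch of how the web-application side could confirm the symptom, assuming it can see the raw Authorization header; the Type 1 field layout is standard NTLM, but the function name and calling convention here are illustrative only:

        <?php
        // Hedged sketch: inspect an "Authorization: NTLM <base64>" Type 1 message
        // and report the domain the browser claims to belong to (empty string for
        // the IE behaviour described above).
        function ntlm_type1_domain($authHeader) {
            $raw = base64_decode(substr($authHeader, strlen('NTLM ')));
            if (substr($raw, 0, 8) !== "NTLMSSP\0") {
                return null;                               // not an NTLM message at all
            }
            $head = unpack('Vtype/Vflags', substr($raw, 8, 8));   // type 1 = negotiate
            $dom  = unpack('vlen/vmax/Voff', substr($raw, 16, 8)); // domain security buffer
            // 0x00001000 = NTLMSSP_NEGOTIATE_OEM_DOMAIN_SUPPLIED
            if ($head['type'] !== 1 || !($head['flags'] & 0x1000) || $dom['len'] === 0) {
                return '';                                 // no domain supplied by the client
            }
            return substr($raw, $dom['off'], $dom['len']);
        }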

  • How to display "Comic Sans MS" in Linux?

    - by Roman
    I use "Comic Sans MS" font on my web page. The web page looks OK if I open it under Windows and MAC. But it does not work under Linux. How can I solve this problem? May be I can put the font on my web server? Is this font available for free? Can it slow down my page? Or may be I can replace "Comic Sans MS" by another font which is similar and is available on the 3 operation systems?

  • php redirect pages under folder to different domain

    - by matt wilkie
    Using PHP, how might I redirect all pages under a folder to a different domain?

    Current site:

        http://www.example.org/dept
        http://www.example.org/dept/stuff
        http://www.example.org/dept/more
        http://www.example.org/dept/more/stuff

    New site:

        http://www.example-too.org/pets/stuff
        http://www.example-too.org/pets/more
        http://www.example-too.org/pets/more/stuff

    I've learned how to redirect a single page:

        <?
        Header( "HTTP/1.1 301 Moved Permanently" );
        Header( "Location: http://www.example-too.org/pets/more/stuff" );
        ?>

    but how do I apply this to dozens of pages without creating a PHP redirect for each one? I can't use Apache mod_rewrite, and .htaccess is disabled. Thanks.
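
    One possible direction, sketched with the paths from the question but otherwise illustrative: compute the new URL from the requested path instead of hard-coding a redirect per page. How this script gets invoked for every old URL (a 404 handler, auto_prepend_file, or similar) is an assumption:

        <?php
        // Hedged sketch: map /dept/anything on the old host to /pets/anything
        // on the new one with a single 301.
        $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
        if (strpos($path, '/dept') === 0) {
            $target = 'http://www.example-too.org/pets' . substr($path, strlen('/dept'));
            header('HTTP/1.1 301 Moved Permanently');
            header('Location: ' . $target);
            exit;
        }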

  • Nginx ignoring a client's HTTP 1.0 request and responding with HTTP 1.1

    - by Yoga
    I am testing with nginx/php5-fpm, using this code:

        <?php
        header($_SERVER["SERVER_PROTOCOL"]." 404 Not Found");
        // also tested: header("Status: 404 Not Found");
        echo $_SERVER["SERVER_PROTOCOL"];

    and forcing HTTP 1.0 with the curl command:

        curl -0 -v 'http://www.example.com/test.php'
        > GET /test.php HTTP/1.0
        < HTTP/1.1 404 Not Found
        < Server: nginx
        < Date: Sat, 27 Oct 2012 08:51:27 GMT
        < Content-Type: text/html
        < Connection: close
        <
        * Closing connection #0
        HTTP/1.0

    As you can see, I am requesting with HTTP 1.0, but nginx replies with HTTP 1.1.

  • How to avoid printing nearly blank pages?

    - by joelarson
    How many times have you printed an email only to have the last page contain 2 or 3 lines of someone's signature (or worse, the "This is confidential" text inserted by corporate mail servers)? How many times has the last page contained just the footer of a website? Does anyone know of a utility or print driver that can help avoid printing blank or nearly blank pages? I am not looking for techniques for avoiding this in specific programs -- if I take the trouble to do a print preview and then adjust the pages to be printed, then of course I can avoid it. What I want is something I can install so that, whenever I hit "Print" to any of the various printers I use with my laptop, it automatically says "hmmm... I bet he doesn't really want that page which is 95% empty" and possibly prompts me with "do you really want to waste paper on this?"

  • Force caching of handler output which actively resists caching

    - by deceze
    I'm trying to force caching of a very obnoxious piece of PHP script which actively resists caching for no good reason by setting all the anti-cache headers:

        Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
        Content-Type: text/html; charset=UTF-8
        Date: Thu, 22 May 2014 08:43:53 GMT
        Expires: Thu, 19 Nov 1981 08:52:00 GMT
        Last-Modified:
        Pragma: no-cache
        Set-Cookie: ECSESSID=...; path=/
        Vary: User-Agent,Accept-Encoding
        Server: Apache/2.4.6 (Ubuntu)
        X-Powered-By: PHP/5.5.3-1ubuntu2.3

    If at all avoidable I do not want to modify this 3rd-party piece of code at all; I just want Apache to cache the page for a while. I'm doing this very selectively, only for specific pages which have no real impact on session cookies or the like, i.e. which do not contain any personalised information.

        CacheDefaultExpire 600
        CacheMinExpire 600
        CacheMaxExpire 1800
        CacheHeader On
        CacheDetailHeader On
        CacheIgnoreHeaders Set-Cookie
        CacheIgnoreCacheControl On
        CacheIgnoreNoLastMod On
        CacheStoreExpired On
        CacheStoreNoStore On
        CacheLock On
        CacheEnable disk /the/script.php

    Apache is caching the page all right:

        [cache:debug] AH00698: cache: Key for entity /the/script.php?(null) is http://example.com:80/the/script.php?
        [cache_disk:debug] AH00709: Recalled cached URL info header http://example.com:80/the/script.php?
        [cache_disk:debug] AH00720: Recalled headers for URL http://example.com:80/the/script.php?
        [cache:debug] AH00695: Cached response for /the/script.php isn't fresh. Adding conditional request headers.
        [cache:debug] AH00750: Adding CACHE_SAVE filter for /the/script.php
        [cache:debug] AH00751: Adding CACHE_REMOVE_URL filter for /the/script.php
        [cache:debug] AH00769: cache: Caching url: /the/script.php
        [cache:debug] AH00770: cache: Removing CACHE_REMOVE_URL filter.
        [cache_disk:debug] AH00737: commit_entity: Headers and body for URL http://example.com:80/the/script.php? cached.

    However, it always insists that the "cached response isn't fresh" and never serves the cached version. I guess this has to do with the Expires header, which marks the document as already expired (but I don't know whether that's the correct assumption). I've tried to overwrite and unset headers using mod_headers, but this doesn't help; whatever combination I try, the cache is not impressed at all. I'm guessing that the order of operations is wrong and the headers are being rewritten after the cache sees them. Early header processing doesn't help either. I've experimented with CacheQuickHandler Off and with setting explicit filter chains, but nothing is helping. I'm really mostly poking in the dark, as I do not have a lot of experience with configuring Apache filter chains. Is there a straightforward solution for how to cache this obnoxious piece of code?

  • NGINX 301 and 302 serving small nginx document body. Any way to remove this behaviour?

    - by anonymous-one
    We have noticed that when nginx handles 301 and 302 responses internally, it serves a small document body along with the appropriate Location: ... header -- something along the lines of (in HTML): "301 redirect - nginx". Along with that body, Content-Type: text/html and Content-Length headers are also sent. We do a lot of 302 and some 301 redirects, and in our opinion this body is wasted bandwidth. Is there any way to disable this behaviour? One idea that crossed our mind was to point error_page 301 302 at an empty text file. We have not tested this yet, but I am assuming that even then, the Content-Type and Content-Length (0) headers will still be sent. So, is there a clean way to send a "body-less" 301/302 redirect with nginx?
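
    For what it's worth, the error_page idea mentioned above might look something like this untested sketch (paths are placeholders); whether nginx still emits the Content-Type and Content-Length: 0 headers for the empty body would need checking:

        # Hedged sketch of the idea from the question: serve an empty file as the
        # body of internally generated 301/302 responses.
        error_page 301 302 /empty.html;
        location = /empty.html {
            internal;
            root /var/www/empty;    # hypothetical directory holding a zero-byte empty.html
        }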

  • getting a 404/403 error for payment gateway

    - by Obay Ouano
    We are setting up an online payment facility using a payment gateway. After the payment gateway finishes processing the credit card details for a payment, the user is redirected to a "403 Forbidden" page. The logs show:

        [MY_IP_ADDRESS_HERE] - - [SOME_DATE_HERE] "GET /POSTBACK_URL.php?txnid=1338434567&result=failure&reason=The+remote+server+returned+an+error%3a+(404)+Not+Found.&digest=7a115270c56df5945c43ad86e56b2e930a3cfd50 HTTP/1.1" 404 - "PAYMENT_GATEWAY_URL_HERE" "BROWSER_DETAILS_HERE"

    Does that mean that when the PAYMENT_GATEWAY_URL attempts to open our POSTBACK_URL, it gets a 404 error? And if so, why does the page say "403 Forbidden"? Anyway, we tried to copy-paste that same URL into the browser window, and the page opens successfully, showing our programmed error notification message. So why couldn't it be opened when the payment gateway tried to redirect to it, while we could open it ourselves? Is this some sort of permissions issue? If so, the postback URL's file permissions are already 755. What am I missing?

  • PC is very slow

    - by Appoos
    Hi All, My Windows XP system is very slow. I have tried all possible ways of improving the performance, but no luck. I have 4 GB of RAM and an AMD Phenom XII B53 processor. I don't see any applications consuming CPU resources, but the page file usage is 4.18 GB (the system-managed size shown in My Computer > Properties > Performance). There is enough RAM available, so why is the OS still using the page file? How can I improve the page file usage? Please help me.

  • mod_rewrite RewriteRule is not working

    - by buggy1985
    Hi, this is a follow-up to this question: Rewrite URL - how to get the hostname and the path? And a copy of this: mod_rewrite RewriteRule is not working. I have this rewrite rule:

        RewriteEngine On
        RewriteRule ^(http://[-A-Za-z0-9+&@#/%=~_|!:,.;]*)/([-A-Za-z0-9+&@#/%=~_|!:,.;]*)\?([A-Za-z0-9+&@#/%=~_|!:,.;]*)$ http://http://www.xmldomain.com/bla/$2?$3&rtype=xslt&xsl=$1/$2.xsl

    It seems to be correct, and exactly what I need, but it doesn't work on my server: I get a 404 "page not found" error. mod_rewrite is enabled, as the following simple rule works fine:

        RewriteEngine On
        RewriteRule ^page/([^/\.]+)/?$ index.php?page=$1 [L]

    Can you help? Thanks
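
    One piece of context that may matter: mod_rewrite only ever matches the pattern against the URL path, so the scheme/host and the query string can never appear in it; they have to come from variables. A heavily hedged sketch of that idea (the intended URL shape in the question is not entirely clear, so treat this as illustrative only):

        # Hedged sketch: take the query string from a RewriteCond back-reference
        # and the host from %{HTTP_HOST}; the pattern itself only sees the path.
        RewriteEngine On
        RewriteCond %{QUERY_STRING} ^(.+)$
        RewriteRule ^/?(.+)$ http://www.xmldomain.com/bla/$1?%1&rtype=xslt&xsl=http://%{HTTP_HOST}/$1.xsl [R,L]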

  • Why are my Google and Bing search result pages locking up?

    - by Cyberherbalist
    I've got some really weird behavior going on. I can't do any web searching using Google or Bing because when the search result page shows up, every single link on the page is completely unresponsive. That is, every link to a search result. The links to page functions other than search results work fine. This happens in both IE9 and FF13. It doesn't happen to Yahoo! results, though. Any ideas?

  • Setting up an nginx site-down page that responds differently to Ajax?

    - by dave mankoff
    I am trying to set up an automatic site-down page for nginx. So far I have this:

        location / {
            try_files /sitedown.html @myapp;
        }
        location @myapp {
            ...
        }

    That works well enough: if sitedown.html is present, it is served; otherwise the app is. What I'd like to do, however, is respond differently to Ajax requests so that they don't error out the JavaScript. I believe that, using the rewrite module, I can do something like

        if ($http_x_requested_with = XMLHttpRequest) {

    but it's unclear to me how to use this to do what I want. I'd like requests that come with that header to get a simple JSON response like "sitedown", with the appropriate JSON content-type header. Barring that, it would be nice to return a 503 response code that the JavaScript could react to.
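
    A rough sketch of one way to wire that up (untested against the original setup; the chained set trick works around nginx's single-condition if, and the JSON body and 503 choice are illustrative):

        # Hedged sketch: keep the existing sitedown fallback, but answer Ajax
        # requests with a small JSON body while sitedown.html exists.
        # (The Content-Type of the returned body follows default_type.)
        location / {
            set $down "";
            if (-f $document_root/sitedown.html) {
                set $down "down";
            }
            if ($http_x_requested_with = XMLHttpRequest) {
                set $down "${down}-ajax";
            }
            if ($down = "down-ajax") {
                return 503 '{"status": "sitedown"}';
            }
            try_files /sitedown.html @myapp;
        }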

  • How to convert Kindle books into PDF format?

    - by verve
    I'm new to digital books, and I use the Kindle app for Windows to read the books I've bought. What I hate is that I can't keep the bottom paragraph of a book in the centre of the monitor in the Kindle app; I have to bend my neck down, and it gets sore fast. The problem is that I can't move a Kindle book page up or halfway, as I can when reading a PDF document; if you try to move the page in Kindle, it skips to the next page. So I thought converting my books to PDF might solve the problem. How do I convert Kindle books into PDF format? Does anyone have another solution -- perhaps a fancier reader that lets me scroll Kindle book pages? Windows 7 64-bit, IE 8.

  • iPlanet Authentication provider

    - by Travis
    Good day. I have stepped into a project that requires a server migration that will change the means of authentication for our CAC/PKI SSL-enabled website. We are using iPlanet 7 and Oracle Directory Server Enterprise Edition 7 as our LDAP server. The situation is that the site is still CAC/PKI enabled, but at the firewall; the information we want to authenticate against is now in the HTTP header. How do I configure iPlanet and LDAP to authenticate against the header instead of SSL? Thanks. Edit: Can this be done with IIS while keeping the Directory Server EE LDAP intact, or is the ACL iPlanet-only?

  • dovecot rhel 5 installation fails because of newer libraries

    - by kayhan yüksel
    To whom it may concern: we are trying to install dovecot (dovecot-2.2.10-1_14.el5.x86_64) on a RHEL 5.4 server and we get this error:

        [root@asgfkm /]# rpm -i dovecot-2.1.17-0_136.el5.x86_64.rpm
        warning: dovecot-2.1.17-0_136.el5.x86_64.rpm: Header V4 DSA/SHA1 Signature, key ID 66534c2b: NOKEY
        error: Failed dependencies:
            libcrypto.so.6()(64bit) is needed by dovecot-1:2.1.17-0_136.el5.x86_64
            libldap-2.3.so.0()(64bit) is needed by dovecot-1:2.1.17-0_136.el5.x86_64
            libmysqlclient.so.15()(64bit) is needed by dovecot-1:2.1.17-0_136.el5.x86_64
            libmysqlclient.so.15(libmysqlclient_15)(64bit) is needed by dovecot-1:2.1.17-0_136.el5.x86_64
            libssl.so.6()(64bit) is needed by dovecot-1:2.1.17-0_136.el5.x86_64
        [root@asgfkm /]#

    But when we try to install the requested libraries, they conflict with the newer libraries already present:

        warning: openssl-0.9.8e-27.el5_10.1.x86_64.rpm: Header V3 DSA/SHA1 Signature, key ID e8562897: NOKEY
        package openssl-1.0.0-20.el6.x86_64 is already installed (newer than openssl-0.9.8e-27.el5_10.1.x86_64)

    The same thing happens with the other libraries: libldap, libmysql, etc. Do you recommend the --force option to install it, or is there a proper way around this? Thank you for your time.

  • Awstats showing strange 404 referrers

    - by Marco Demaio
    When I look at the 404 errors in Awstats I sometimes see strange referrers. For example, on www.mydomain.com I might see a 404 error reported in Awstats like this:

        URL (not found): some-file.jpg
        Referrer: http://www.mydomain.com/some-page.html

    some-file.jpg is a file that does not exist, so it's not strange that someone who tried to reach it got a 404 back from the server. The strange part is that the referring page DOES NOT EXIST either -- I mean, http://www.domain.com/some-page.html does not exist, so how could it be the referrer? Is some client faking the referrer? Thanks!

  • Confused with DKIM, SPF and Exim Configs

    - by 0pt1m1z3
    I've now spent 2 hours trying to figure out this issue and I am about to give up and go to bed. I've been having issues with Gmail rejecting emails from my VPS server because of false spam alerts (probably caused by lfd sending too many emails). So I changed my Exim config to send emails from a different IP (my VPS comes with 3) and that fixed the issue. I also enabled DKIM and SPF on my domains for added measure. But now, all my emails appear as ("From: Sender Name via server.domain1.com") where server.domain1.com is my VPS hostname. I previously had the same issue in Outlook and turning off "Set SMTP Sender: headers" solved that problem. But I believe adding the DKIM and SPF now makes Gmail add "via server.domain1.com" to my messages. How do I fix this? This is a typical header for a message (as it appears at gmail):

        Delivered-To: [email protected]
        Received: by 10.60.44.163 with SMTP id f3csp248622oem; Thu, 29 Mar 2012 21:23:18 -0700 (PDT)
        Received: by 10.50.106.200 with SMTP id gw8mr452788igb.10.1333081398523; Thu, 29 Mar 2012 21:23:18 -0700 (PDT)
        Return-Path: <[email protected]>
        Received: from domain2.com ([X.X.X.X]) by mx.google.com with ESMTPS id y1si810998igb.3.2012.03.29.21.23.18 (version=TLSv1/SSLv3 cipher=OTHER); Thu, 29 Mar 2012 21:23:18 -0700 (PDT)
        Received-SPF: pass (google.com: domain of [email protected] designates X.X.X.X as permitted sender) client-ip=X.X.X.X;
        Authentication-Results: mx.google.com; spf=pass (google.com: domain of [email protected] designates X.X.X.X as permitted sender) [email protected]; dkim=pass header[email protected]
        DKIM-Signature: v=1; a=rsa-sha256; q=dns/txt; c=relaxed/relaxed; d=server.domain1.com; s=default; h=Date:Message-Id:From:Content-type:MIME-Version:Subject:To; bh=wF8bBRgh01EYg4t5DAeVPv1Ps906UVIeRnQCb/HvSYw=; b=k/Pg7lnrO+Ud/z1mOTv+O/3DiJzzQgyBhfIizIaFHM8tF/eNJt5P2k+9yQB224sxYstZIWwVRBJmiqvcM1QhARv1HWqWma0crppZ3JOn+LRHANan634OBi+58SIRA+gu;
        Received: (Exim 4.77) id 1SDTVE-0005HA-9Y for [email protected]; Fri, 30 Mar 2012 00:31:56 -0400
        To: [email protected]
        Subject: Password Reset Request
        MIME-Version: 1.0
        Content-type: text/html; charset=iso-8859-1
        From: Sender Name <[email protected]>
        Message-Id: <[email protected]>
        Date: Fri, 30 Mar 2012 00:31:56 -0400
        X-AntiAbuse: This header was added to track abuse, please include it with any abuse report
        X-AntiAbuse: Primary Hostname - server.domain1.com
        X-AntiAbuse: Original Domain - domain2.com
        X-AntiAbuse: Originator/Caller UID/GID - [507 504] / [47 12]
        X-AntiAbuse: Sender Address Domain - server.domain1.com
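
    If it helps frame the question: Gmail tends to show "via" when the DKIM d= domain (here server.domain1.com) differs from the From: domain, so one direction is to sign with the sender's domain instead of the server hostname. A heavily hedged sketch of how that might look on Exim's smtp transport; the selector, key path, and expansion are assumptions, not taken from the actual config:

        # Hedged sketch for the smtp transport: derive the DKIM signing domain
        # from the From: header instead of the server hostname. Key location
        # and selector are placeholders.
        remote_smtp:
          driver = smtp
          dkim_domain = ${lc:${domain:${address:$h_from:}}}
          dkim_selector = default
          dkim_private_key = /etc/exim/dkim/${dkim_domain}.key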

  • How can I diagnose cache misses when using Apache as a reverse proxy?

    - by johnstok
    I have set up Apache 2.2 as a reverse proxy with the following configuration:

        # jBoss proxying
        ProxyRequests Off
        <Proxy *>
            Order deny,allow
            Allow from all
        </Proxy>
        ProxyPass /foo http://localhost:9080/foo
        ProxyPassReverse /foo http://localhost:9080/foo
        ProxyPassReverseCookiePath /foo /foo

        # Reverse proxy caching
        CacheEnable disk /foo

        # Compression
        SetOutputFilter DEFLATE
        BrowserMatch ^Mozilla/4 gzip-only-text/html
        BrowserMatch ^Mozilla/4\.0[678] no-gzip
        BrowserMatch \bMSIE\s(7|8) !no-gzip !gzip-only-text/html
        DeflateCompressionLevel 9
        Header append Vary User-Agent env=!dont-vary

    However, in a number of cases where I expect a cached response to be returned, the request is sent through to the origin server at localhost:9080. Responses have an HTTP Vary header of 'Accept-Encoding,User-Agent', which is to be expected given the mod_deflate configuration. How can I determine why Apache is unable to serve a response from the cache?

  • Retaining expanded/contracted nodes in Word 2010's navigation pane

    - by msorens
    I have a lengthy document in Word 2010 which makes copious use of the standard heading styles. These display hierarchically (and very usefully!) in the navigation pane. But the list of sections and subsections in the navigation pane is somewhat lengthy, and all nodes are expanded by default. I am now working near the end of the document, so every time I open it I collapse the first several Heading 1 nodes to compress the list to fit on one screen. That gives me direct access to the later sections while still letting me reach the earlier sections if needed. The problem is that as soon as I close the document, all the customization I did in the navigation pane is lost. Is there any option or setting somewhere that will retain the expanded or collapsed state of nodes in the navigation pane?

  • Blocking IPs in Nginx behind a proxy

    - by FunkyChicken
    I'm running an Nginx 1.2.4 web server here, and I'm behind a proxy run by my hoster to prevent DDoS attacks. The downside of being behind this proxy is that I need to get the REAL IP information from an extra header. In PHP it works great, for example by reading $_SERVER['HTTP_X_REAL_IP']. Before I was behind my hoster's proxy, I had a very effective way of blocking certain IPs: include /etc/nginx/block.conf and allow/deny IPs there. But now, due to the proxy, Nginx sees all traffic coming from one IP. Is there a way I can get Nginx to read the IPs the way PHP does, from the X-Real-IP header?
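
    For reference, the usual nginx answer to this shape of problem is the realip module, which rewrites the client address from a trusted header before allow/deny rules run. A short sketch, assuming ngx_http_realip_module is compiled in and using a placeholder proxy address:

        # Hedged sketch: trust the hoster's proxy and take the real client IP
        # from X-Real-IP, so the allow/deny rules in block.conf see it.
        set_real_ip_from 203.0.113.10;   # placeholder for the proxy's address
        real_ip_header   X-Real-IP;
        include /etc/nginx/block.conf;   # the existing allow/deny list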

  • How much RAM does 64-bit Windows 8 reserve for internal OS use?

    - by Barleyman
    Windows reserves some memory for its internal use which is not normally allocated to applications. This reserve is seen most easily if you run without a page file or limit the page file to a relatively small size (such as 3 GB). Windows will allocate primarily RAM up to the limit, fill up the remaining free space in the page file (if any), and issue a low-memory warning when there is no page file space left and the allocated RAM limit is exceeded. The limit appears to be a percentage of the total system RAM. The Windows 7 x64 limit is discussed here, and methods for circumventing the low-memory warning are discussed here. Disabling the low-memory warning has an advantage: you can use some 600 MB more RAM on an 8 GB machine. But there is a serious disadvantage: when you're out of RAM, programs will crash. How much RAM can you allocate on 8 GB Windows 8 x64 before you get the low-memory warning? Is it possible to adjust the warning threshold?

  • DNS Spoofing and Xampp as a proxy, how to configure it?

    - by Angelo
    I have a server running Apache with mod_proxy, a module that lets my localhost act as a proxy server. When somebody on the same LAN visits my server (my localhost through my LAN IP), they can only see the .html pages loaded onto my server. Because DNS spoofing on the client points the domains at my localhost, if they click a link that refers to something not on my server, Apache correctly says "Object not found", since the client cannot request the page from the Internet (remember, the DNS is spoofed to my localhost). The question is: how do I configure Apache to fetch the page on behalf of the client?
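
    One hedged sketch of the "Apache fetches it for the client" idea, using mod_proxy together with mod_rewrite; hostnames, IPs, and module paths are placeholders, the server itself needs un-spoofed DNS, and proxying should stay restricted to the LAN:

        LoadModule rewrite_module modules/mod_rewrite.so
        LoadModule proxy_module modules/mod_proxy.so
        LoadModule proxy_http_module modules/mod_proxy_http.so

        # Only the local LAN may use this box as a proxy (placeholder subnet).
        <Proxy *>
            Order deny,allow
            Deny from all
            Allow from 192.168.1.0/24
        </Proxy>

        RewriteEngine On
        # Requests whose Host header is not this machine get fetched upstream,
        # using Apache's own (un-spoofed) DNS resolution.
        RewriteCond %{HTTP_HOST} !^192\.168\.1\.10$
        RewriteRule ^/(.*)$ http://%{HTTP_HOST}/$1 [P,L]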

  • Nginx proxy to Apache - resolve HTTP ORIGIN

    - by Fratyr
    I have a server set up with nginx serving static content and proxying all PHP/dynamic requests to Apache on 127.0.0.1. I'm building an API for my databases, and I need to allow clients by their origin (domain name) rather than just by IP, based on CORS rules. So when I send an HTTP header like header("Access-Control-Allow-Origin: www.client-requesting.myapi.com"); from my API server, I have to tell it which origin I allow; otherwise client-side requests to my API won't work, due to the same-origin policy. The question is: how can I know which domain name (if any) called my API? What should the nginx and Apache configuration be to pass the origin parameter? I tried to Google, and all I found was a possible solution with mod_rpaf, but I wanted to be sure. Thanks!
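
    Since proxy_pass forwards the client's request headers by default, the Origin header the browser sends on cross-site requests is already visible to PHP behind the proxy. A minimal sketch of checking it against a whitelist (the allowed list, and silently omitting the header otherwise, are assumptions):

        <?php
        // Hedged sketch: echo back the Origin only if it is on an allowed list.
        $allowed = array('http://www.client-requesting.myapi.com');  // illustrative whitelist
        $origin  = isset($_SERVER['HTTP_ORIGIN']) ? $_SERVER['HTTP_ORIGIN'] : '';
        if (in_array($origin, $allowed, true)) {
            header('Access-Control-Allow-Origin: ' . $origin);
            header('Vary: Origin');
        }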

  • “yourdomain/start is not the same thing as yourIP/start in Apache”

    - by user1883050
    Let's say you're trying to get a CMS up and running, and you're supposed to find a start page at "www.yourdomain.com/start". But you don't have a domain name yet; you only have an IP address (yourIPaddress). Apache is visibly running at yourIPaddress, so you look at "yourIPaddress/start" -- and you don't find anything there, just a 404 page. The person who installed it for you tells you: "In Apache, yourdomain/start is not the same thing as yourIP/start. Please read up on Apache server configuration to figure this out. That's all the help I can give." My question is: what concepts (regarding Apache configuration) should I read up on so that I can find the start page? Thoughts?
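
    For what it's worth, the usual reason the two URLs behave differently is name-based virtual hosting: Apache picks the vhost from the Host header, so a bare-IP request can land in a different (default) vhost than the domain would. An illustrative Apache 2.2-style sketch, with placeholder paths and names:

        # Hedged illustration: the first vhost answers requests that arrive by
        # bare IP; the named vhost (where the CMS and its /start page live)
        # only answers requests whose Host header matches ServerName.
        NameVirtualHost *:80
        <VirtualHost *:80>
            DocumentRoot /var/www/default        # hypothetical default site
        </VirtualHost>
        <VirtualHost *:80>
            ServerName www.yourdomain.com
            DocumentRoot /var/www/cms            # hypothetical CMS docroot
        </VirtualHost>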
