Search Results

Search found 3111 results on 125 pages for 'mod gzip'.


  • Use of backreferences in fail2ban filters possible?

    - by Izzy
    From time to time, I see collections of suspect "File not found" errors in my Apache logs, basically following the pattern:

        File does not exist: /var/www/file, referer: http://my.server.com/file

    In human terms: the file was not found, though the request claims that very file as its own referer. A clear hacking attempt, as that's hardly possible otherwise (and the REQUEST_URIs often enough suggest the same). In my eyes, a clear case for fail2ban – if I could get backreferences to work here:

        failregex = ^%(_apache_error_client)s File does not exist: /var/www(.+), referer: http://.+\1$

    (Just in case: the above examples assume the DocumentRoot of that webserver is /var/www.) I googled for hours and searched the fail2ban wiki up and down, but nowhere could I find a statement concerning backreferences in its filters. Are they not supported, or did I do it the wrong way? Any hints on how to make it work, apart from "dirty hacks" like first sending the request to another fake URL using mod_rewrite and then catching on that (if anyone is interested, I can elaborate on that approach in an answer), or doing something similar using mod_security? As an entire log line was requested:

        [Fri Nov 08 14:57:28 2013] [error] [client 50.67.234.213] File does not exist: /var/www/text/files.htm++++++++++++++++++++++++++Result:+using+proxy+27.34.142.47:9090;+no+post+sending+forms+are+found;, referer: http://www.myserver.com/text/files.htm++++++++++++++++++++++++++Result:+using+proxy+27.34.142.47:9090;+no+post+sending+forms+are+found;

    (Sorry, logs were just rotated, so this long candidate was the only one left; minor adjustments were made for privacy reasons.)
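
    For what it's worth, fail2ban compiles failregex with Python's re module, which does support backreferences, so a filter along these lines should at least be testable. A hedged sketch (hypothetical filter name; assumes a fail2ban version whose apache-common.conf defines _apache_error_client):

        # /etc/fail2ban/filter.d/apache-selfreferer.conf -- hypothetical name
        [INCLUDES]
        before = apache-common.conf

        [Definition]
        # \1 refers back to the path captured after /var/www
        failregex = ^%(_apache_error_client)s File does not exist: /var/www(.+), referer: http://.+\1$
        ignoreregex =

    It can be dry-run against a log before enabling a jail: fail2ban-regex /var/log/apache2/error.log /etc/fail2ban/filter.d/apache-selfreferer.conf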


  • Can't access folder on server - Permission denied

    - by Michal Korzeniowski
    I am running a VPS with Ubuntu 11.04. After a clean MODX install I've tried to access http://www.encepence.pl/manager and got a permission denied error from my server. The thing is that I can easily access any other folder under that domain, and I can modify this folder's (manager) content via FTP. I've tried modifying the virtual host with:

        <Directory /var/www/blackflow/data/www/encepence.pl/manager/>
            Options Indexes FollowSymLinks ExecCGI
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>

    But it didn't work. The rest of the configuration:

        <Directory /var/www/blackflow/data/www/encepence.pl>
            Options -ExecCGI -Includes
            php_admin_value open_basedir "/var/www/blackflow/data:."
            php_admin_flag engine on
        </Directory>

        <VirtualHost 192.166.219.34:80>
            ServerName encepence.pl
            CustomLog /var/www/httpd-logs/encepence.pl.access.log combined
            DocumentRoot /var/www/blackflow/data/www/encepence.pl
            ErrorLog /var/www/httpd-logs/encepence.pl.error.log
            ServerAdmin [email protected]
            ServerAlias www.encepence.pl
            SuexecUserGroup blackflow blackflow
            AddType application/x-httpd-php .php .php3 .php4 .php5 .phtml
            AddType application/x-httpd-php-source .phps
            php_admin_value open_basedir "/var/www/blackflow/data:."
            php_admin_value sendmail_path "/usr/sbin/sendmail -t -i -f [email protected]"
            php_admin_value upload_tmp_dir "/var/www/blackflow/data/mod-tmp"
            php_admin_value session.save_path "/var/www/blackflow/data/mod-tmp"
            VirtualDocumentRoot /var/www/blackflow/data/www/%0
        </VirtualHost>

    Any ideas on what might have gone wrong?
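
    Since the vhost uses SuexecUserGroup and a VirtualDocumentRoot, a first step could be to check whether the web server (or the blackflow user) can actually traverse and read that directory, and what Apache itself logs; a hedged diagnostic sketch using the log path from the vhost above:

        # can the web server traverse and read the folder?
        ls -ld /var/www/blackflow/data/www/encepence.pl/manager
        ls -l /var/www/blackflow/data/www/encepence.pl/manager
        # what does Apache itself report when the 403 happens?
        tail -n 20 /var/www/httpd-logs/encepence.pl.error.log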


  • How to install ia32-libs on Wheezy?

    - by javano
    I have seen a couple of questions on ServerFault relating to installing ia32-libs on a 64-bit machine, but the solutions aren't working for me (I don't think any of those questions were for Wheezy specifically, so I'm not sure how to proceed):

        root@server:/home/# apt-get install -f ia32-libs
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming. The following information may help to resolve the situation:
        The following packages have unmet dependencies:
        ia32-libs : Depends: ia32-libs-i386
        php5 : Depends: libapache2-mod-php5 (>= 5.4.4-14+deb7u2) but it is not going to be installed or libapache2-mod-php5filter (>= 5.4.4-14+deb7u2) but it is not going to be installed or php5-cgi (>= 5.4.4-14+deb7u2) but it is not going to be installed or php5-fpm (>= 5.4.4-14+deb7u2) but it is not going to be installed
        php5-mysql : Depends: phpapi-20100525
        E: Error, pkgProblemResolver::Resolve generated breaks, this may be caused by held packages.

        root@server:/home/# sudo apt-get install ia32-libs-i386
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming. The following information may help to resolve the situation:
        The following packages have unmet dependencies:
        ia32-libs-i386:i386 : Depends: freeglut3:i386 (>= 2.6.0-1) but it is not going to be installed Depends: lesstif2:i386 (>= 1:0.95.2-1) but it is not going to be installed Depends: libacl1:i386 (>= 2.2.49-4) but it is not going to be installed Depends: libasyncns0:i386 (>= 0.3-1.1) but it is not going to be installed Depends: libattr1:i386 (>= 1:2.4.44-2) but it is not going to be installed Depends: libaudio2:i386 (>= 1.9.2-4) but it is not going to be installed Depends: libaudiofile1:i386 (>= 0.2.6-8) but it is not going to be installed Depends: libavahi-client3:i386 (>= 0.6.27-2+squeeze1) but it is not going to be installed Depends: libavahi-common3:i386 (>= 0.6.27-2+squeeze1) but it is not going to be installed Depends: libbsd0:i386 (>= 0.2.0-1) but it is not going to be installed Depends: libcap2:i386 (>= 1:2.19-3) but it is not going to be installed Depends: libcomerr2:i386 (>= 1.41.12-4stable1) but it is not going to be installed Depends: libcups2:i386 (>= 1.4.4-7+squeeze1) but it is not going to be installed Depends: libcurl3:i386 (>= 7.21.0-2) but it is not going to be installed Depends: libdbus-1-3:i386 (>= 1.2.24-4+squeeze1) but it is not going to be installed Depends: libdirectfb-1.2-9:i386 (>= 1.2.10.0-4) but it is not going to be installed Depends: libdrm-intel1:i386 (>= 2.4.21-1~squeeze3) but it is not going to be installed Depends: libdrm-radeon1:i386 (>= 2.4.21-1~squeeze3) but it is not going to be installed Depends: libdrm2:i386 (>= 2.4.21-1~squeeze3) but it is not going to be installed Depends: libedit2:i386 (>= 2.11-20080614-2) but it is not going to be installed Depends: libesd0:i386 (>= 0.2.41-8) but it is not going to be installed Depends: libexif12:i386 (>= 0.6.19-1) but it is not going to be installed Depends: libexpat1:i386 (>= 2.0.1-7) but it is not going to be installed Depends: libflac8:i386 (>= 1.2.1-2+b1)
but it is not going to be installed Depends: libfltk1.1:i386 (>= 1.1.10-2+b1) but it is not going to be installed Depends: libfontconfig1:i386 (>= 2.8.0-2.1) but it is not going to be installed Depends: libfreetype6:i386 (>= 2.4.2-2.1+squeeze3) but it is not going to be installed Depends: libgcrypt11:i386 (>= 1.4.5-2) but it is not going to be installed Depends: libgdbm3:i386 (>= 1.8.3-9) but it is not going to be installed Depends: libgl1-mesa-dri:i386 (>= 7.7.1-5) but it is not going to be installed Depends: libgl1-mesa-glx:i386 (>= 7.7.1-5) but it is not going to be installed Depends: libglu1-mesa:i386 (>= 7.7.1-5) but it is not going to be installed Depends: libgnutls26:i386 (>= 2.8.6-1) but it is not going to be installed Depends: libgpg-error0:i386 (>= 1.6-1) but it is not going to be installed Depends: libgphoto2-2:i386 (>= 2.4.6-3) but it is not going to be installed Depends: libgphoto2-port0:i386 (>= 2.4.6-3) but it is not going to be installed Depends: libgssapi-krb5-2:i386 (>= 1.8.3+dfsg-4squeeze2) but it is not going to be installed Depends: libice6:i386 (>= 2:1.0.6-2) but it is not going to be installed Depends: libidn11:i386 (>= 1.15-2) but it is not going to be installed Depends: libieee1284-3:i386 (>= 0.2.11-6) but it is not going to be installed Depends: libjack-jackd2-0:i386 (>= 1.9.5~dfsg-14) but it is not going to be installed or libjack0:i386 (>= 1:0.118+svn3796-7) but it is not going to be installed Depends: libjpeg62:i386 (>= 6b1-1) but it is not going to be installed Depends: libjpeg8:i386 (>= 8b-1) but it is not going to be installed Depends: libk5crypto3:i386 (>= 1.8.3+dfsg-4squeeze2) but it is not going to be installed Depends: libkeyutils1:i386 (>= 1.4-1) but it is not going to be installed Depends: libkrb5-3:i386 (>= 1.8.3+dfsg-4squeeze2) but it is not going to be installed Depends: libkrb5support0:i386 (>= 1.8.3+dfsg-4squeeze2) but it is not going to be installed Depends: liblcms1:i386 (>= 1.18.dfsg-1.2+b3) but it is not going to be installed Depends: libltdl7:i386 (>= 2.2.6b-2) but it is not going to be installed Depends: liblzo2-2:i386 (>= 2.03-2) but it is not going to be installed Depends: libmpg123-0:i386 (>= 1.12.1-3) but it is not going to be installed Depends: libnspr4-0d:i386 (>= 4.8.6-1) but it is not going to be installed Depends: libnss3-1d:i386 (>= 3.12.8-1+squeeze4) but it is not going to be installed Depends: libogg0:i386 (>= 1.2.0~dfsg-1) but it is not going to be installed Depends: libopenal1:i386 (>= 1:1.12.854-2) but it is not going to be installed Depends: libpam0g:i386 (>= 1.1.1-6.1+squeeze1) but it is not going to be installed Depends: libpng12-0:i386 (>= 1.2.44-1+squeeze1) but it is not going to be installed Depends: libpopt0:i386 (>= 1.16-1) but it is not going to be installed Depends: libpulse0:i386 (>= 0.9.21-3+squeeze1) but it is not going to be installed Depends: libsamplerate0:i386 (>= 0.1.7-3) but it is not going to be installed Depends: libsane:i386 (>= 1.0.21-9) but it is not going to be installed Depends: libsasl2-2:i386 (>= 2.1.23.dfsg1-7) but it is not going to be installed Depends: libsdl1.2debian:i386 (>= 1.2.15) but it is not going to be installed Depends: libselinux1:i386 (>= 2.0.96-1) but it is not going to be installed Depends: libsigc++-2.0-0c2a:i386 (>= 2.2.4.2-1) but it is not going to be installed Depends: libsm6:i386 (>= 2:1.1.1-1) but it is not going to be installed Depends: libsndfile1:i386 (>= 1.0.21-3+squeeze1) but it is not going to be installed Depends: libsqlite3-0:i386 (>= 3.7.3-1) but it is not going to be 
installed Depends: libssh2-1:i386 (>= 1.2.6-1) but it is not going to be installed Depends: libssl1.0.0:i386 (>= 1) but it is not going to be installed Depends: libstdc++5:i386 (>= 1:3.3.6-20) but it is not going to be installed Depends: libsvga1:i386 (>= 1:1.4.3-29) but it is not going to be installed Depends: libsysfs2:i386 (>= 2.1.0+repack-1) but it is not going to be installed Depends: libtasn1-3:i386 (>= 2.7-1) but it is not going to be installed Depends: libtdb1:i386 (>= 1.2.1-2+b1) but it is not going to be installed Depends: libtiff4:i386 (>= 3.9.4-5+squeeze3) but it is not going to be installed Depends: libts-0.0-0:i386 (>= 1.0-7) but it is not going to be installed Depends: libusb-0.1-4:i386 (>= 2:0.1.12-16) but it is not going to be installed Depends: libuuid1:i386 (>= 2.17.2-9) but it is not going to be installed Depends: libvorbis0a:i386 (>= 1.3.1-1) but it is not going to be installed Depends: libvorbisenc2:i386 (>= 1.3.1-1) but it is not going to be installed Depends: libvorbisfile3:i386 (>= 1.3.1-1) but it is not going to be installed Depends: libwrap0:i386 (>= 7.6.q-19) but it is not going to be installed Depends: libx11-6:i386 (>= 2:1.3.3-4) but it is not going to be installed Depends: libx86-1:i386 (>= 1.1+ds1-6) but it is not going to be installed Depends: libxau6:i386 (>= 1:1.0.6-1) but it is not going to be installed Depends: libxaw7:i386 (>= 2:1.0.7-1) but it is not going to be installed Depends: libxcb-render-util0:i386 (>= 0.3.6-1) but it is not going to be installed Depends: libxcb-render0:i386 (>= 1.6-1) but it is not going to be installed Depends: libxcb1:i386 (>= 1.6-1) but it is not going to be installed Depends: libxcomposite1:i386 (>= 1:0.4.2-1) but it is not going to be installed Depends: libxcursor1:i386 (>= 1:1.1.10-2) but it is not going to be installed Depends: libxdamage1:i386 (>= 1:1.1.3-1) but it is not going to be installed Depends: libxdmcp6:i386 (>= 1:1.0.3-2) but it is not going to be installed Depends: libxext6:i386 (>= 2:1.1.2-1) but it is not going to be installed Depends: libxfixes3:i386 (>= 1:4.0.5-1) but it is not going to be installed Depends: libxft2:i386 (>= 2.1.14-2) but it is not going to be installed Depends: libxi6:i386 (>= 2:1.3-6) but it is not going to be installed Depends: libxinerama1:i386 (>= 2:1.1-3) but it is not going to be installed Depends: libxml2:i386 (>= 2.7.8.dfsg-2+squeeze1) but it is not going to be installed Depends: libxmu6:i386 (>= 2:1.0.5-2) but it is not going to be installed Depends: libxmuu1:i386 (>= 2:1.0.5-2) but it is not going to be installed Depends: libxp6:i386 (>= 1:1.0.0.xsf1-2) but it is not going to be installed Depends: libxpm4:i386 (>= 1:3.5.8-1) but it is not going to be installed Depends: libxrandr2:i386 (>= 2:1.3.0-3) but it is not going to be installed Depends: libxrender1:i386 (>= 1:0.9.6-1) but it is not going to be installed Depends: libxslt1.1:i386 (>= 1.1.26-6) but it is not going to be installed Depends: libxss1:i386 (>= 1:1.2.0-2) but it is not going to be installed Depends: libxt6:i386 (>= 1:1.0.7-1) but it is not going to be installed Depends: libxtst6:i386 (>= 2:1.1.0-3) but it is not going to be installed Depends: libxv1:i386 (>= 2:1.0.5-1) but it is not going to be installed Depends: libxxf86vm1:i386 (>= 1:1.1.0-2) but it is not going to be installed Depends: odbcinst1debian2:i386 (>= 2.2.14p2-1) but it is not going to be installed Depends: libodbc1:i386 but it is not going to be installed Depends: xaw3dg:i386 (>= 1.5+E-18) but it is not going to be installed php5 : Depends: 
libapache2-mod-php5 (>= 5.4.4-14+deb7u2) but it is not going to be installed or libapache2-mod-php5filter (>= 5.4.4-14+deb7u2) but it is not going to be installed or php5-cgi (>= 5.4.4-14+deb7u2) but it is not going to be installed or php5-fpm (>= 5.4.4-14+deb7u2) but it is not going to be installed
php5-mysql : Depends: phpapi-20100525
E: Error, pkgProblemResolver::Resolve generated breaks, this may be caused by held packages.

        root@server:/home/# dpkg --print-architecture
        amd64
        root@server:/home/# dpkg --print-foreign-architectures
        i386
        root@server:/home/# lsb_release -a
        No LSB modules are available.
        Distributor ID: Debian
        Description:    Debian GNU/Linux 7.1 (wheezy)
        Release:        7.1
        Codename:       wheezy
        root@server:/home/# uname -a
        Linux servername 3.2.0-4-amd64 #1 SMP Debian 3.2.46-1 x86_64 GNU/Linux
        root@server:/home/# cat /etc/apt/sources.list
        deb http://ftp.us.debian.org/debian stable main contrib non-free
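
    Multiarch itself looks enabled here (dpkg --print-foreign-architectures already lists i386), but the long Depends list quotes squeeze-era version numbers, which hints that the i386 package indexes may be stale or mixed. One hedged line of attack is to make sure the indexes were actually re-fetched after the architecture was added:

        # harmless if the architecture is already registered
        dpkg --add-architecture i386
        # re-fetch package indexes for all architectures, then retry
        apt-get update
        apt-get install ia32-libs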


  • Nginx order of servers

    - by scrat
    I have 3 sites on my server. All are running on gunicorn and use unix sockets to communicate with nginx, which routes requests. I have three records in nginx.conf like:

        server {
            listen 80;
            server_name site1.com;
            location / {
                proxy_pass http://unix:/tmp/site1.sock;
                proxy_redirect off;
                proxy_set_header Host $host;
                proxy_set_header X-Real-IP $remote_addr;
                proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            }
        }

    for site1, site2 and site3. If they are ordered so that the config for site1 goes first, followed by the configs for site2 and site3, everything works fine. But when I change the order, for example to site2, site1, site3, then site1 becomes routed to site2. What am I doing wrong? Full server nginx.conf before the server configs:

        user www-data;
        worker_processes 4;
        pid /var/run/nginx.pid;

        events {
            worker_connections 768;
            # multi_accept on;
        }

        http {
            ##
            # Basic Settings
            ##
            sendfile on;
            tcp_nopush on;
            tcp_nodelay on;
            keepalive_timeout 65;
            types_hash_max_size 2048;
            include /etc/nginx/mime.types;
            default_type application/octet-stream;

            ##
            # Logging Settings
            ##
            access_log /var/log/nginx/access.log;
            error_log /var/log/nginx/error.log;

            ##
            # Gzip Settings
            ##
            gzip on;
            gzip_types text/css application/x-javascript text/x-component text/richtext image/svg+xml text/plain text/xsd text/xsl text/xml image/x-icon;
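
    That symptom (whichever block comes first answering for the wrong name) is what happens when a request's Host header matches none of the server_name values: nginx then falls back to the first server block on that listen socket as the implicit default. A hedged sketch of making the default explicit, so that file ordering stops mattering:

        # catch-all block: requests whose Host matches no server_name land here
        server {
            listen 80 default_server;
            server_name _;
            return 444;   # close the connection instead of serving the wrong site
        }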


  • default domain and first domain in apache2 causing trouble

    - by acidzombie24
    I have 3 sites and a default/test site using Mono's test page. I created aFirst, c, d, e, zLast. zLast has rewrite rules that should be evaluated last. Since the first VirtualHost seen is the default, I set it to this:

        --aFirst--
        <VirtualHost *:80>
            ServerName www.domain.tld
            ServerAdmin webmaster@localhost
            DocumentRoot /var/www/test
            DirectoryIndex index.html index.aspx index.php
            MonoDocumentRootDir "/var/www/test"
            MonoServerPath rootsite "/usr/local/bin/mod-mono-server2"
            MonoApplications rootsite "/:/var/www/test"
            <Directory /var/www/test>
                MonoSetServerAlias rootsite
                SetHandler mono
                AddHandler mod_mono .aspx .ascx .asax .ashx .config .cs .asmx
            </Directory>
        </VirtualHost>

    The problem is that my default page (the IP address of my server) and the first website (csite.ddomain.net) have problems (even though csite is defined in c and is not the first virtual host). The IP address of my server and csite.ddomain.net ALWAYS load the same site: either Mono's test page or the csite. It flips every time I restart Apache. Why isn't the server IP address always loading the default page (the Mono test page), and why isn't csite.ddomain.net always loading the site I want!?! Here's the config for --csite--:

        <VirtualHost *:80>
            ServerName csite.testdomain.net
            ServerAdmin webmaster@localhost
            ServerAlias s.csite.testdomain.net
            DocumentRoot /var/www/prjname
            DirectoryIndex index.html index.aspx
            MonoDocumentRootDir "/var/www/prjname"
            MonoServerPath rootsite "/usr/local/bin/mod-mono-server2"
            MonoApplications rootsite "/:/var/www/prjname"
            <Directory /var/www/prjname>
                MonoSetServerAlias rootsite
                SetHandler mono
                AddHandler mod_mono .aspx .ascx .asax .ashx .config .cs .asmx
            </Directory>
        </VirtualHost>

    aFirst, c, d, e and zLast are all enabled.
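
    One way to see exactly which vhost Apache treats as the default (and whether name-based hosting is active at all on *:80) is the -S switch; on Apache 2.2 a missing NameVirtualHost *:80 directive can produce exactly this kind of flip-flopping, because the vhosts are then treated as IP-based. A hedged check:

        # dump the parsed vhost table; the first entry per address is the default
        apache2ctl -S        # "apachectl -S" on non-Debian layouts
        # on Apache 2.2, name-based vhosts on *:80 also need this directive:
        # NameVirtualHost *:80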


  • NGINX: How do I calculate an optimal no. of worker processes and worker connections?

    - by bodacious
    Our web app is running on a Linode 2048 server at the moment (~2048 MB of RAM). The MySQL database is on another Linode of its own, so this server is really only handling nginx and the Rails application. The application itself uses about 185976 kB of memory per instance (RSS). Our traffic is < 1000 per day and the pages are mostly cached, so there are fewer hits to the Rails app itself. My question is: how can I calculate optimal nginx config settings for my app? Below is the current config:

        worker_processes 1;

        # pid of nginx master process
        pid /var/run/nginx.pid;

        events {
            worker_connections 1024;
        }

        http {
            access_log /var/log/nginx/access.log;
            error_log /var/log/nginx/error.log;

            passenger_root /home/user/.rvm/gems/ree-1.8.7-2011.01@URTV/gems/passenger-3.0.3;
            passenger_ruby /home/user/.rvm/rubies/ree-1.8.7-2011.01/bin/ruby;

            include mime.types;
            default_type application/octet-stream;
            sendfile on;
            tcp_nopush on;
            tcp_nodelay on;

            # gzip settings
            gzip on;
            gzip_http_version 1.0;
            gzip_comp_level 2;
            gzip_vary on;
            gzip_proxied any;
            gzip_types text/css application/x-javascript text/xml application/xml application/xml+rss text/javascript;

            # load extra modules from the vhosts directory
            include /opt/nginx/vhosts/*.conf;
        }

    Any advice would be appreciated! :)
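
    A common rule of thumb is one worker process per CPU core, with worker_connections sized so that worker_processes × worker_connections comfortably exceeds the peak number of concurrent connections; a hedged sketch for a small VPS:

        # count cores first:  grep -c ^processor /proc/cpuinfo
        worker_processes 4;            # e.g. one per core on a 4-core VPS
        events {
            worker_connections 1024;   # theoretical ceiling: 4 x 1024 concurrent clients
        }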


  • Strange issue in header location redirect

    - by hd01
    I have three websites (example1.com, example2.com, example3.com) hosted on a server. There is a page (test.php) on example1.com with just this code inside it:

        <?php
        header('Location: http://example2.com/a.php');
        ?>

    When I browse test.php it goes to http://example1.com/a.php. It doesn't understand that the URL is on another domain; it tries to find the page on itself. But when I put http://google.com instead of http://example2.com/a.php, it works correctly. I am really confused. What is the problem? Should I set some configuration on the server? (I am the administrator of the hosting server.) P.S. The server is behind a Pound server. Here's the Firebug Net output for example1.com/test.php:

        Response headers:
        HTTP/1.1 302 Found
        Date: Tue, 09 Oct 2012 09:03:34 GMT
        Server: Apache/2.2.16 (Debian)
        Location: http://example1.com/a.php
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 21
        Keep-Alive: timeout=15, max=100
        Connection: Keep-Alive
        Content-Type: text/html; charset=utf-8

        Request headers:
        Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
        Accept-Encoding: gzip, deflate
        Accept-Language: en-us,en;q=0.5
        Connection: keep-alive
        Cookie: mycookie
        Host: example1.com
        User-Agent: Mozilla/5.0 (X11; Linux i686; rv:14.0) Gecko/20100101 Firefox/14.0.1
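
    Since the Firebug dump shows the Location header already rewritten to example1.com by the time it reaches the browser, it can help to compare what Apache emits directly with what comes back through Pound; if the two differ, the proxy layer is rewriting the header. A hedged check from the server itself (the backend port Pound forwards to is hypothetical here):

        # directly against Apache behind Pound:
        curl -sI http://127.0.0.1:8080/test.php | grep -i '^Location'
        # through the Pound frontend:
        curl -sI http://example1.com/test.php | grep -i '^Location'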


  • Concatenating gzipped Apache logs

    - by markdrayton
    We rotate and compress our Apache logs each day, but it's become apparent that this isn't frequent enough. An uncompressed log is about 6G, which is getting close to filling our log partition (yep, we'll make it bigger in the future!) as well as taking a lot of time and CPU to compress each day. We have to produce a gzipped log for each day for our stats processing. Obviously we could move our logs to a partition with more space, but I also want to spread the compression overhead throughout the day. Using Apache's rotatelogs we can rotate and compress the log more often -- hourly, say -- but how can I concatenate all the hourly compressed logs into a running compressed log for the day, without decompressing the previous logs? I don't want to uncompress 24 hours' worth of data and recompress it, because that has all the disadvantages of our current solution. Gzip doesn't seem to offer any append or concatenate option, but perhaps I've missed something obvious. This question suggests straight shell concatenation "works", in that the archive can be decompressed, but the fact that gzip -l doesn't work seems a bit dodgy. Alternatively, perhaps this is still a bad way to do things. Other suggestions are welcome -- our only constraints are our relatively small log partition and the need to provide a daily compressed log.
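
    For what it's worth, the gzip format (RFC 1952) explicitly allows multiple members in one file, so plain shell concatenation produces a valid archive; it is only gzip -l that misbehaves, because it takes its uncompressed-size figure from the last member's trailer. A hedged sketch of the hourly-to-daily roll-up (hypothetical file names):

        # each hourly file is a complete gzip member; cat-ing them is valid gzip
        cat access.log.2013-11-08-*.gz > access.log.2013-11-08.gz
        # decompression sees all members:
        zcat access.log.2013-11-08.gz | wc -l
        # (gzip -l on the result only describes the final member -- known limitation)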


  • Linux RHEL: Making a disk image efficiently

    - by TheProfoundGeek
    I have a Linux box running RHEL. Its disk (hda1) has about 25 GB of free space. I have another disk (hda2), 250 GB in size and partitioned for 200 GB, holding another RHEL instance. Data on that disk occupies about 21 GB. An image of hda2 needs to be taken and restored on another disk of the same specs. What is the best way to make an image file of hda2? Ideally the image's size should be around 25 GB, as the actual data on the disk is just 21 GB. I am aware of the following two methods.

    Method 1: Raw image

        dd if=/dev/hda2 of=/path/to/image
        dd if=/path/to/image of=/dev/hda3

    Question 1: Will the above method make a gigantic image of 250 GB? Is it efficient?

    Method 2: Compressed image

        dd if=/dev/hda2 | gzip > /path/to/image.gz
        gzip -dc /path/to/image.gz | dd of=/dev/hda2

    Question 2: I tried method 2; it's taking too long. What are the pitfalls of this method? Which of the above methods is efficient, and why? Is there any other Linux utility that can do the job? Third-party tools are a no-no.
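
    To question 1: yes, a raw dd copies every sector, so the image is the full partition size regardless of how much data it holds. A hedged sketch of getting a small compressed image faster: zero the free space first (so gzip sees long runs of zeros), then use a fast compression level and a larger block size (the mount point below is hypothetical):

        # 1. fill free space on the mounted source filesystem with zeros
        dd if=/dev/zero of=/mnt/hda2/zero.fill bs=1M; sync; rm /mnt/hda2/zero.fill
        # 2. image with a big block size and the fastest gzip level
        dd if=/dev/hda2 bs=4M | gzip -1 > /path/to/image.gz
        # 3. restore
        gzip -dc /path/to/image.gz | dd of=/dev/hda3 bs=4M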


  • How and where to implement basic authentication in Kibana 3

    - by Jabb
    I have put my elasticsearch server behind a Apache reverse proxy that provides basic authentication. Authenticating to Apache directly from the browser works fine. However, when I use Kibana 3 to access the server, I receive authentication errors. Obviously because no auth headers are sent along with Kibana's Ajax calls. I added the below to elastic-angular-client.js in the Kibana vendor directory to implement authentication quick and dirty. But for some reason it does not work. $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password'); What is the best approach and place to implement basic authentication in Kibana? /*! elastic.js - v1.1.1 - 2013-05-24 * https://github.com/fullscale/elastic.js * Copyright (c) 2013 FullScale Labs, LLC; Licensed MIT */ /*jshint browser:true */ /*global angular:true */ 'use strict'; /* Angular.js service wrapping the elastic.js API. This module can simply be injected into your angular controllers. */ angular.module('elasticjs.service', []) .factory('ejsResource', ['$http', function ($http) { return function (config) { var // use existing ejs object if it exists ejs = window.ejs || {}, /* results are returned as a promise */ promiseThen = function (httpPromise, successcb, errorcb) { return httpPromise.then(function (response) { (successcb || angular.noop)(response.data); return response.data; }, function (response) { (errorcb || angular.noop)(response.data); return response.data; }); }; // check if we have a config object // if not, we have the server url so // we convert it to a config object if (config !== Object(config)) { config = {server: config}; } // set url to empty string if it was not specified if (config.server == null) { config.server = ''; } /* implement the elastic.js client interface for angular */ ejs.client = { server: function (s) { if (s == null) { return config.server; } config.server = s; return this; }, post: function (path, data, successcb, errorcb) { $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password'); console.log($http.defaults.headers); path = config.server + path; var reqConfig = {url: path, data: data, method: 'POST'}; return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb); }, get: function (path, data, successcb, errorcb) { $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password'); path = config.server + path; // no body on get request, data will be request params var reqConfig = {url: path, params: data, method: 'GET'}; return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb); }, put: function (path, data, successcb, errorcb) { $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password'); path = config.server + path; var reqConfig = {url: path, data: data, method: 'PUT'}; return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb); }, del: function (path, data, successcb, errorcb) { $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password'); path = config.server + path; var reqConfig = {url: path, data: data, method: 'DELETE'}; return promiseThen($http(angular.extend(reqConfig, config)), successcb, errorcb); }, head: function (path, data, successcb, errorcb) { $http.defaults.headers.common.Authorization = 'Basic ' + Base64Encode('user:Password'); path = config.server + path; // no body on HEAD request, data will be request params var reqConfig = {url: path, params: data, method: 'HEAD'}; return 
$http(angular.extend(reqConfig, config)) .then(function (response) { (successcb || angular.noop)(response.headers()); return response.headers(); }, function (response) { (errorcb || angular.noop)(undefined); return undefined; }); } }; return ejs; }; }]); UPDATE 1: I implemented Matts suggestion. However, the server returns a weird response. It seems that the authorization header is not working. Could it have to do with the fact, that I am running Kibana on port 81 and elasticsearch on 8181? OPTIONS /solar_vendor/_search HTTP/1.1 Host: 46.252.46.173:8181 User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:25.0) Gecko/20100101 Firefox/25.0 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Language: de-de,de;q=0.8,en-us;q=0.5,en;q=0.3 Accept-Encoding: gzip, deflate Origin: http://46.252.46.173:81 Access-Control-Request-Method: POST Access-Control-Request-Headers: authorization,content-type Connection: keep-alive Pragma: no-cache Cache-Control: no-cache This is the response HTTP/1.1 401 Authorization Required Date: Fri, 08 Nov 2013 23:47:02 GMT WWW-Authenticate: Basic realm="Username/Password" Vary: Accept-Encoding Content-Encoding: gzip Content-Length: 346 Connection: close Content-Type: text/html; charset=iso-8859-1 UPDATE 2: Updated all instances with the modified headers in these Kibana files root@localhost:/var/www/kibana# grep -r 'ejsResource(' . ./src/app/controllers/dash.js: $scope.ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}}); ./src/app/services/querySrv.js: var ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}}); ./src/app/services/filterSrv.js: var ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}}); ./src/app/services/dashboard.js: var ejs = ejsResource({server: config.elasticsearch, headers: {'Access-Control-Request-Headers': 'Accept, Origin, Authorization', 'Authorization': 'Basic XXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='}}); And modified my vhost conf for the reverse proxy like this <VirtualHost *:8181> ProxyRequests Off ProxyPass / http://127.0.0.1:9200/ ProxyPassReverse / https://127.0.0.1:9200/ <Location /> Order deny,allow Allow from all AuthType Basic AuthName “Username/Password” AuthUserFile /var/www/cake2.2.4/.htpasswd Require valid-user Header always set Access-Control-Allow-Methods "GET, POST, DELETE, OPTIONS, PUT" Header always set Access-Control-Allow-Headers "Content-Type, X-Requested-With, X-HTTP-Method-Override, Origin, Accept, Authorization" Header always set Access-Control-Allow-Credentials "true" Header always set Cache-Control "max-age=0" Header always set Access-Control-Allow-Origin * </Location> ErrorLog ${APACHE_LOG_DIR}/error.log </VirtualHost> Apache sends back the new response headers but the request header still seems to be wrong somewhere. Authentication just doesn't work. 
Request Headers OPTIONS /solar_vendor/_search HTTP/1.1 Host: 46.252.26.173:8181 User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:25.0) Gecko/20100101 Firefox/25.0 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Language: de-de,de;q=0.8,en-us;q=0.5,en;q=0.3 Accept-Encoding: gzip, deflate Origin: http://46.252.26.173:81 Access-Control-Request-Method: POST Access-Control-Request-Headers: authorization,content-type Connection: keep-alive Pragma: no-cache Cache-Control: no-cache Response Headers HTTP/1.1 401 Authorization Required Date: Sat, 09 Nov 2013 08:48:48 GMT Access-Control-Allow-Methods: GET, POST, DELETE, OPTIONS, PUT Access-Control-Allow-Headers: Content-Type, X-Requested-With, X-HTTP-Method-Override, Origin, Accept, Authorization Access-Control-Allow-Credentials: true Cache-Control: max-age=0 Access-Control-Allow-Origin: * WWW-Authenticate: Basic realm="Username/Password" Vary: Accept-Encoding Content-Encoding: gzip Content-Length: 346 Connection: close Content-Type: text/html; charset=iso-8859-1 SOLUTION: After doing some more research, I found out that this is definitely a configuration issue with regard to CORS. There are quite a few posts available regarding that topic but it appears that in order to solve my problem, it would be necessary to to make some very granular configurations on apache and also make sure that the right stuff is sent from the browser. So I reconsidered the strategy and found a much simpler solution. Just modify the vhost reverse proxy config to move the elastisearch server AND kibana on the same http port. This also adds even better security to Kibana. This is what I did: <VirtualHost *:8181> ProxyRequests Off ProxyPass /bigdatadesk/ http://127.0.0.1:81/bigdatadesk/src/ ProxyPassReverse /bigdatadesk/ http://127.0.0.1:81/bigdatadesk/src/ ProxyPass / http://127.0.0.1:9200/ ProxyPassReverse / https://127.0.0.1:9200/ <Location /> Order deny,allow Allow from all AuthType Basic AuthName “Username/Password” AuthUserFile /var/www/.htpasswd Require valid-user </Location> ErrorLog ${APACHE_LOG_DIR}/error.log </VirtualHost>
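
    For completeness, the AuthUserFile referenced in the vhost above can be created with htpasswd from the apache2-utils package; a minimal sketch:

        htpasswd -c /var/www/.htpasswd someuser   # -c creates the file; omit it when adding further users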


  • Update packages on very old Ubuntu

    - by meewoK
    I want to add MySQLi support to a machine running:

        Server Version: Apache/2.2.4 (Ubuntu) PHP/5.2.3-1ubuntu6.3

    I would rather not update more things than I need to. I run the following:

        sudo apt-get install php5-mysql

    However, as the Ubuntu version is old, I get the following:

        WARNING: The following packages cannot be authenticated! php5-cli php5-mysql php5-mhash php5-xsl php5-pspell php5-snmp php5-curl php5-xmlrpc php5-sqlite php5-gd libapache2-mod-php5 php5-common
        Install these packages without verification [y/N]? Y
        Err http://gr.archive.ubuntu.com gutsy-updates/main php5-cli 5.2.3-1ubuntu6.4 404 Not Found
        Err http://security.ubuntu.com gutsy-security/main php5-cli 5.2.3-1ubuntu6.4 404 Not Found
        Err http://security.ubuntu.com gutsy-security/main php5-mysql 5.2.3-1ubuntu6.4 404 Not Found
        Err http://security.ubuntu.com gutsy-security/main php5-mhash 5.2.3-1ubuntu6.4 404 Not Found
        Err http://security.ubuntu.com gutsy-security/main php5-xsl 5.2.3-1ubuntu6.4 404 Not Found
        Err http://security.ubuntu.com gutsy-security/main php5-pspell 5.2.3-1ubuntu6.4 404 Not Found
        Err http://security.ubuntu.com gutsy-security/main php5-snmp 5.2.3-1ubuntu6.4 404 Not Found
        Err http://security.ubuntu.com gutsy-security/main php5-curl 5.2.3-1ubuntu6.4 404 Not Found
        Err http://security.ubuntu.com gutsy-security/main php5-xmlrpc 5.2.3-1ubuntu6.4 404 Not Found
        Err http://security.ubuntu.com gutsy-security/main php5-sqlite 5.2.3-1ubuntu6.4 404 Not Found
        Err http://security.ubuntu.com gutsy-security/main php5-gd 5.2.3-1ubuntu6.4 404 Not Found
        Err http://security.ubuntu.com gutsy-security/main libapache2-mod-php5 5.2.3-1ubuntu6.4 404 Not Found
        Err http://security.ubuntu.com gutsy-security/main php5-common 5.2.3-1ubuntu6.4 404 Not Found
        Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-cli_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
        Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-mysql_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
        Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-mhash_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
        Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-xsl_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
        Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-pspell_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
        Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-snmp_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
        Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-curl_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
        Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-xmlrpc_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
        Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-sqlite_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
        Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-gd_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
        Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/libapache2-mod-php5_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
        Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-common_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
        E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing?

    Questions: Can I add the MySQLi feature using another method instead of apt-get? Even if successful, can this break something on the system?
Update: I have tried to add additional sources using the instructions from: http://superuser.com/questions/339537/where-can-i-get-therepositories-for-old-ubuntu-versions I have the following in the /etc/apt/sources.list file: # deb cdrom:[Ubuntu-Server 7.10 _Gutsy Gibbon_ - Release i386 (20071016)]/ gutsy main restricted #deb cdrom:[Ubuntu-Server 7.10 _Gutsy Gibbon_ - Release i386 (20071016)]/ gutsy main restricted # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to # newer versions of the distribution. deb http://gr.archive.ubuntu.com/ubuntu/ gutsy main restricted universe multiverse deb http://gr.archive.ubuntu.com/ubuntu/ gutsy-backports main restricted universe multiverse deb-src http://gr.archive.ubuntu.com/ubuntu/ gutsy main restricted ## Major bug fix updates produced after the final release of the ## distribution. deb http://gr.archive.ubuntu.com/ubuntu/ gutsy-updates main restricted deb-src http://gr.archive.ubuntu.com/ubuntu/ gutsy-updates main restricted ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team, and may not be under a free licence. Please satisfy yourself as to ## your rights to use the software. Also, please note that software in ## universe WILL NOT receive any review or updates from the Ubuntu security ## team. deb http://gr.archive.ubuntu.com/ubuntu/ gutsy-updates universe deb-src http://gr.archive.ubuntu.com/ubuntu/ gutsy-updates universe ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team, and may not be under a free licence. Please satisfy yourself as to ## your rights to use the software. Also, please note that software in ## multiverse WILL NOT receive any review or updates from the Ubuntu ## security team. #deb http://gr.archive.ubuntu.com/ubuntu/ gutsy multiverse deb-src http://gr.archive.ubuntu.com/ubuntu/ gutsy multiverse deb http://gr.archive.ubuntu.com/ubuntu/ gutsy-updates multiverse deb-src http://gr.archive.ubuntu.com/ubuntu/ gutsy-updates multiverse ## Uncomment the following two lines to add software from the 'backports' ## repository. ## N.B. software from this repository may not have been tested as ## extensively as that contained in the main release, although it includes ## newer versions of some applications which may provide useful features. ## Also, please note that software in backports WILL NOT receive any review ## or updates from the Ubuntu security team. # deb http://gr.archive.ubuntu.com/ubuntu/ gutsy-backports main restricted universe multiverse # deb-src http://gr.archive.ubuntu.com/ubuntu/ gutsy-backports main restricted universe multiverse ## Uncomment the following two lines to add software from Canonical's ## 'partner' repository. This software is not part of Ubuntu, but is ## offered by Canonical and the respective vendors as a service to Ubuntu ## users. 
# deb http://archive.canonical.com/ubuntu gutsy partner # deb-src http://archive.canonical.com/ubuntu gutsy partner deb http://security.ubuntu.com/ubuntu gutsy-security main restricted deb-src http://security.ubuntu.com/ubuntu gutsy-security main restricted deb http://security.ubuntu.com/ubuntu gutsy-security universe deb-src http://security.ubuntu.com/ubuntu gutsy-security universe deb http://security.ubuntu.com/ubuntu gutsy-security multiverse deb-src http://security.ubuntu.com/ubuntu gutsy-security multiverse # Required deb http://old-releases.ubuntu.com/ubuntu/gutsy main restricted universe multiverse deb http://old-releases.ubuntu.com/ubuntu/gutsy-updates main restricted universe multiverse deb http://old-releases.ubuntu.com/ubuntu/gutsy-security main restricted universe multiverse
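
    One detail that stands out in the old-releases lines above: they appear to be missing the space between the archive URL and the suite name, so apt would treat e.g. /ubuntu/gutsy as part of the URL. A corrected sketch (old-releases.ubuntu.com is where end-of-life releases such as gutsy live), to be followed by apt-get update:

        deb http://old-releases.ubuntu.com/ubuntu/ gutsy main restricted universe multiverse
        deb http://old-releases.ubuntu.com/ubuntu/ gutsy-updates main restricted universe multiverse
        deb http://old-releases.ubuntu.com/ubuntu/ gutsy-security main restricted universe multiverse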


  • CodePlex Daily Summary for Friday, August 31, 2012

    CodePlex Daily Summary for Friday, August 31, 2012Popular ReleasesStartComp: Beta Release 1.0.0: Beta Release 1 Featured Content Bing-Search has been removed Window anchor implemented The listview can now be configured to be shown in details view or tile view through the context menu The listview now allows sorting through the context menu The view, sort order and sort column are now saved for each repository The listview now shows the background image in the lower right The listview now shows a background image for the user defined repositories Added a "Tell-A-Friend" bu...SharePoint Column & View Permission: SharePoint Column and View Permission v1.2: Version 1.2 of this project. If you will find any bugs please let me know at enti@zoznam.sk or post your findings in Issue TrackerDotNetNuke® Form and List: 06.00.04: DotNetNuke Form and List 06.00.04 Don't forget to backup your installation before upgrade. Changes in 06.00.04 Fix: Sql Scripts for 6.003 missed object qualifiers within stored procedures Fix: added missing resource "cmdCancel.Text" in form.ascx.resx Changes in 06.00.03 Fix: MakeThumbnail was broken if the application pool was configured to .Net 4 Change: Data is now stored in nvarchar(max) instead of ntext Changes in 06.00.02 The scripts are now compatible with SQL Azure, tested in a ne...DotNetNuke Translator: 01.00.00 Beta: First release of the project.Audio Pitch & Shift: Audio Pitch And Shift 5.1.0.2: fixed several issues with streaming modeUrlPager: UrlPager 1.2: Fixed bug in which url parameters will lost after paging; ????????url???bug;EntLib.com????????: EntLib.com???????? v3.0: EntLib eCommerce Solution ???Microsoft .Net Framework?????????????????????。Coevery - Free CRM: Coevery 1.0.0.24: Add a sample database, and installation instructions.NicAudio: NicAudio 2.0.6: ac3,dts Solved some initialization issues with no-linear decode.ExpressProfiler: Initial release of ExpressProfiler v1.2: This is initial release of ExpressProfilerNabu Library: 2012-08-29, 14: .Net Framework 4.0, .Net Framework 4.5 Debug and Release builds.Math.NET Numerics: Math.NET Numerics v2.2.1: Major linear algebra rework since v2.1, now available on Codeplex as well (previous versions were only available via NuGet). Since v2.2.0: Student-T density more robust for very large degrees of freedom Sparse Kronecker product much more efficient (now leverages sparsity) Direct access to raw matrix storage implementations for advanced extensibility Now also separate package for signed core library with a strong name (we dropped strong names in v2.2.0) Also available as NuGet packages...Microsoft SQL Server Product Samples: Database: AdventureWorks Databases – 2012, 2008R2 and 2008: About this release This release consolidates AdventureWorks databases for SQL Server 2012, 2008R2 and 2008 versions to one page. Each zip file contains an mdf database file and ldf log file. This should make it easier to find and download AdventureWorks databases since all OLTP versions are on one page. There are no database schema changes. For each release of the product, there is a light-weight and full version of the AdventureWorks sample database. The light-weight version is denoted by ...DotNetNuke® Blog: 05.00.00: Version 5.0.0 - Final This version of the module requires DotNetNuke Core 6.2 or greater. FYI: Developers should be aware that the module uses Visual Studio 2010 only. Release Highlights: Corrected blog comment sorting problem. 20228 - Integrated with the core Journal API. 
20789, 21988 - wired in fix submitted by J Sheely around blank author names. 20210 - Updated manifest to 5.0 format (from 3.0). Automated packaging and made project structure more inline with other DotNetNuke m...Christoc's DotNetNuke Module Development Template: DotNetNuke Project Templates V1.1 for VS2012: This release is specifically for Visual Studio 2012 Support, distributed through the Visual Studio Extensions gallery at http://visualstudiogallery.msdn.microsoft.com/ After you build in Release mode the installable packages (source/install) can be found in the INSTALL folder now, within your module's folder, not the packages folder anymore Check out the blog post for all of the details about this release. http://www.dotnetnuke.com/Resources/Blogs/EntryId/3471/New-Visual-Studio-2012-Projec...Home Access Plus+: v8.0: v8.0828.1800 RELEASE CHANGED TO BETA Any issues, please log them on http://www.edugeek.net/forums/home-access-plus/ This is full release, NO upgrade ZIP will be provided as most files require replacing. To upgrade from a previous version, delete everything but your AppData folder, extract all but the AppData folder and run your HAP+ install Documentation is supplied in the Web Zip The Quota Services require executing a script to register the service, this can be found in there install di...Phalanger - The PHP Language Compiler for the .NET Framework: 3.0.0.3406 (September 2012): New features: Extended ReflectionClass libxml error handling, constants DateTime::modify(), DateTime::getOffset() TreatWarningsAsErrors MSBuild option OnlyPrecompiledCode configuration option; allows to use only compiled code Fixes: ArgsAware exception fix accessing .NET properties bug fix ASP.NET session handler fix for OutOfProc mode DateTime methods (WordPress posting fix) Phalanger Tools for Visual Studio: Visual Studio 2010 & 2012 New debugger engine, PHP-like debugging ...MabiCommerce: MabiCommerce 1.0.1: What's NewSetup now creates shortcuts Fix spelling errors Minor enhancement to the Map window.ScintillaNET: ScintillaNET 2.5.2: This release has been built from the 2.5 branch. Version 2.5.2 is functionally identical to the 2.5.1 release but also includes the XML documentation comments file generated by Visual Studio. It is not 100% comprehensive but it will give you Visual Studio IntelliSense for a large part of the API. Just make sure the ScintillaNET.xml file is in the same folder as the ScintillaNET.dll reference you're using in your projects. (The XML file does not need to be distributed with your application)....BlackJumboDog: Ver5.7.1: 2012.08.25 Ver5.7.1 (1)?????·?????LING?????????????? (2)SMTP???(????)????、?????\?????????????????????New ProjectsAbcLibrary: A Library of methods and class types used for ABCAprendendo Windows 8: Não foi feito nada ainda...Auto fill template generator (word): This program was designed to help the automate generation of files using keywords.ClarkTestCodePlex2: clark test Code Razor: This tools translates Razor files to code. This allows the Razor views to be compiled and shared across projects.Contrib.Mod.ChangePassword: It is an evil module that abuses users rights and lets you change anyone's password.CurrentConsumption: CurrentConsumptionDbSettings - An API to store settings in a database: This stores settings in an OleDb/Sql database using an API similar to ApplicationSettingsBase. 
Settings vary by app, version, user.JCI prototipos: summaryMemberAdminService: This is a test projectMeteor Rendering Engine: The Meteor rendering engine is developed in C# with XNA 4.0, and provides various rendering outputs for 3D scenes.Mod.Colorbox: Orchard module for Mod.ColorboxMogulTestProject1: papaMogulTestTRY: papaosmm: this is a sample test projectServer Survey: Server Survey ScriptShops' Cloud: This project is a Cloud Platform for Mini Shops' Daily Management.Simple Grocery 5: This is a very simple application to help me (or you) out setting up a grocery list and use it on the food market using ALL smart phones or tablets.Tikun Korim: Community site to help people learn to read in Sefer Torah. This project is going to use ASP.NET MVC 4 and as much as open source project as we can. TreeCreeper: TreeCreeper programs (Spatial and NonSpatial) support the taxonomic analysis of species assemblagesVisual Studio Icon Patcher: Visual Studio Icon Patcher allows you to update Visual Studio 2012 with the Solution Explorer icons from Visual Studio 2010.WPT Generator: WPT Generator HTML5 , Google API 3.0, Javascript and CSS 3.0 Web Application for generating a WPT file (Ozi Explorer Format).


  • How to invalidate nginx reverse proxy cache in front of other nginx servers?

    - by Olivier Lance
    I'm running a Proxmox server on a single IP address, that will dispatch HTTP requests to containers depending on the requested host. I am using nginx on the Proxmox side to listen to HTTP requests and I am using the proxy_pass directive in my different server blocks to dispatch requests according to the server_name. My containers run on Ubuntu and are also running a nginx instance. I'm having troubles with caching on a particular website that is fully static: nginx keeps on serving me stale content after files updates, until I: Clear /var/cache/nginx/ and restart nginx or set proxy_cache off for this server and reload the config Here's the detail of my configuration: On the server (proxmox): /etc/nginx/nginx.conf: user www-data; worker_processes 8; pid /var/run/nginx.pid; events { worker_connections 768; # multi_accept on; use epoll; } http { ## # Basic Settings ## sendfile on; #tcp_nopush on; tcp_nodelay on; #keepalive_timeout 65; types_hash_max_size 2048; server_tokens off; # server_names_hash_bucket_size 64; # server_name_in_redirect off; include /etc/nginx/mime.types; default_type application/octet-stream; client_body_buffer_size 1k; client_max_body_size 8m; large_client_header_buffers 1 1K; ignore_invalid_headers on; client_body_timeout 5; client_header_timeout 5; keepalive_timeout 5 5; send_timeout 5; server_name_in_redirect off; ## # Logging Settings ## access_log /var/log/nginx/access.log; error_log /var/log/nginx/error.log; ## # Gzip Settings ## gzip on; gzip_disable "MSIE [1-6]\.(?!.*SV1)"; gzip_vary on; gzip_proxied any; gzip_comp_level 6; # gzip_buffers 16 8k; gzip_http_version 1.1; gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript; limit_conn_zone $binary_remote_addr zone=gulag:1m; limit_conn gulag 50; ## # Virtual Host Configs ## include /etc/nginx/conf.d/*.conf; include /etc/nginx/sites-enabled/*; } /etc/nginx/conf.d/proxy.conf: proxy_redirect off; proxy_set_header Host $host; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_hide_header X-Powered-By; proxy_intercept_errors on; proxy_buffering on; proxy_cache_key "$scheme://$host$request_uri"; proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=cache:10m inactive=7d max_size=700m; /etc/nginx/sites-available/my-domain.conf: server { listen 80; server_name .my-domain.com; access_log off; location / { proxy_pass http://my-domain.local:80/; proxy_cache cache; proxy_cache_valid 12h; expires 30d; proxy_cache_use_stale error timeout invalid_header updating; } } On the container (my-domain.local): nginx.conf: (everything is inside the main config file -- it's been done quickly...) user www-data; worker_processes 1; error_log logs/error.log; events { worker_connections 1024; } http { include mime.types; default_type application/octet-stream; sendfile on; #tcp_nopush on; keepalive_timeout 65; gzip off; server { listen 80; server_name .my-domain.com; root /var/www; access_log logs/host.access.log; } } I've read many blog posts and answers before resolving to posting my own questions... most answers I can see suggest setting sendfile off; but that didn't work for me. I have tried many other things, double checked my settings and all seems fine. So I'm wondering whether I am not expecting nginx's cache to do something it's not meant to...? 
Basically, I thought that if one of the static files in my container was updated, the cache in my reverse proxy would be invalidated and my browser would get the new version of the file when it requested it... But I now have the feeling I misunderstood many things. Above all, I now wonder how nginx on the host can know that a file in the container has changed. I have seen a directive proxy_header_pass (or something similar); should I use this to let the nginx instance in the container somehow inform the one on Proxmox about updated files? Is this expectation just a dream, or can I do it with nginx on my current architecture?
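
    To the question itself: the proxying nginx has no way of knowing that a file changed inside the container; proxy_cache only honours its TTLs (proxy_cache_valid, inactive=) and explicit bypass or purge mechanisms. Note also that the expires 30d line above tells browsers to cache files for a month, which can look like the same symptom. A hedged sketch of a manual escape hatch without third-party purge modules, using a request header to skip the cache and store a fresh copy:

        location / {
            proxy_pass http://my-domain.local:80/;
            proxy_cache cache;
            proxy_cache_valid 10m;                 # shorter TTL so edits surface sooner
            proxy_cache_bypass $http_x_refresh;    # e.g.: curl -H 'X-Refresh: 1' http://...
        }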


  • Nginx &amp; Apache: Cannot get try_files to work with permalinks

    - by tcherokee
    I have been working on this for the past two weeks now, and for some reason I cannot seem to get nginx's try_files to work with my WordPress permalinks. I am hoping someone will be able to tell me where I am going wrong, and also hopefully tell me if I made any major errors with my configurations as well (I am an nginx newbie... but learning :) ). Here are my configuration files.

    nginx.conf:

        user www-data;
        worker_processes 4;
        pid /var/run/nginx.pid;

        events {
            worker_connections 768;
            # multi_accept on;
        }

        http {
            ##
            # Basic Settings
            ##
            sendfile on;
            tcp_nopush on;
            tcp_nodelay on;
            keepalive_timeout 65;
            types_hash_max_size 2048;
            # server_tokens off;
            # server_names_hash_bucket_size 64;
            # server_name_in_redirect off;
            include /etc/nginx/mime.types;
            default_type application/octet-stream;

            ##
            # Logging Settings
            ##
            # Defines the cache log format, cache log location
            # and the main access log location.
            log_format cache '***$time_local '
                             '$upstream_cache_status '
                             'Cache-Control: $upstream_http_cache_control '
                             'Expires: $upstream_http_expires '
                             '$host '
                             '"$request" ($status) '
                             '"$http_user_agent" ';
            access_log /var/log/nginx/access.log;
            error_log /var/log/nginx/error.log;
            include /etc/nginx/conf.d/*.conf;
            include /etc/nginx/sites-enabled/*;
        }

    mydomain.com.conf:

        server {
            listen 123.456.78.901:80; # IP goes here.
            server_name www.mydomain.com mydomain.com;
            #root /var/www/mydomain.com/prod;
            index index.php;

            ## mydomain.com -> www.mydomain.com (301 - Permanent)
            if ($host !~* ^(www|dev)) {
                rewrite ^/(.*)$ $scheme://www.$host/$1 permanent;
            }

            # Add trailing slash to */wp-admin requests.
            rewrite /wp-admin$ $scheme://$host$uri/ permanent;

            # All media (including uploaded) is under wp-content/ so
            # instead of caching the response from apache, we're just
            # going to use nginx to serve directly from there.
            location ~* ^/(wp-content|wp-includes)/(.*)\.(jpg|png|gif|jpeg|css|js|m$
                root /var/www/mydomain.com/prod;
            }

            # Don't cache these pages.
            location ~* ^/(wp-admin|wp-login.php) {
                proxy_pass http://backend;
            }

            location / {
                if ($http_cookie ~* "wordpress_logged_in_[^=]*=([^%]+)%7C") {
                    set $do_not_cache 1;
                }
                proxy_cache_key "$scheme://$host$request_uri $do_not_cache";
                proxy_cache main;
                proxy_pass http://backend;
                proxy_cache_valid 30m; # 200, 301 and 302 will be cached.
                # Fallback to stale cache on certain errors.
                # 503 is deliberately missing, if we're down for maintenance
                # we want the page to display.
                #try_files $uri $uri/ /index.php?q=$uri$args;
                #try_files $uri =404;
                proxy_cache_use_stale error timeout invalid_header http_500 http_502 http_504 http_404;
            }

            # Cache purge URL - works in tandem with WP plugin.
            # location ~ /purge(/.*) {
            #     proxy_cache_purge main "$scheme://$host$1";
            # }

            # No access to .htaccess files.
            location ~ /\.ht {
                deny all;
            }
        } # End server

    gzip.conf:

        # Gzip Configuration.
        gzip on;
        gzip_disable msie6;
        gzip_static on;
        gzip_comp_level 4;
        gzip_proxied any;
        gzip_types text/plain text/css application/x-javascript text/xml application/xml application/xml+rss text/javascript;

    proxy.conf:

        # Set proxy headers for the passthrough
        proxy_redirect off;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_max_temp_file_size 0;
        client_max_body_size 10m;
        client_body_buffer_size 128k;
        proxy_connect_timeout 90;
        proxy_send_timeout 90;
        proxy_read_timeout 90;
        proxy_buffer_size 4k;
        proxy_buffers 4 32k;
        proxy_busy_buffers_size 64k;
        proxy_temp_file_write_size 64k;
        add_header X-Cache-Status $upstream_cache_status;

    backend.conf:

        upstream backend {
            # Defines backends.
            # Extracting here makes it easier to load balance
            # in the future. Needs to be specific IP as Plesk
            # doesn't have Apache listening on localhost.
            ip_hash;
            server 127.0.0.1:8001; # IP goes here.
        }

    cache.conf:

        # Proxy cache and temp configuration.
        proxy_cache_path /var/www/nginx_cache levels=1:2 keys_zone=main:10m max_size=1g inactive=30m;
        proxy_temp_path /var/www/nginx_temp;
        proxy_cache_key "$scheme://$host$request_uri";
        proxy_redirect off;

        # Cache different return codes for different lengths of time
        # We cached normal pages for 10 minutes
        proxy_cache_valid 200 302 10m;
        proxy_cache_valid 404 1m;

    The two commented-out try_files lines in location / of the mydomain config are the ones I tried. The error I found in the error log is below:

        ...rewrite or internal redirection cycle while internally redirecting to "/index.php"

    Thanks in advance
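
    One detail that may matter here: the final parameter of try_files is an internal fallback, and mixing it with proxy_pass in the same location does not do what the commented-out lines intend; the redirection-cycle error is consistent with /index.php being re-tried against the same location over and over. The usual pattern is to fall through to a named location that does the proxying; a hedged sketch:

        location / {
            try_files $uri $uri/ @backend;   # serve static if present, else hand off
        }

        location @backend {
            proxy_cache main;
            proxy_pass http://backend;       # WordPress resolves the permalink itself
        }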


  • Problem simulating HTTP POST using HttpClient

    - by user560904
    I am trying to programmatically send an HTTP POST request using HttpClient to http://ojp.nationalrail.co.uk/en/s/planjourney/query, but the server is rejecting the request I send. I copied the headers and body from what the Chrome browser sends, so the request should be identical, yet the response HTML reports an error:

      <div class="padding">
        <h1 class="sifr"><strong>Sorry</strong>, something went wrong</h1>
        <div class="error-message">
          <div class="error-message-padding">
            <h2>There is a problem with the page you are trying to access.</h2>
            <p>It is possible that it was either moved, it doesn't exist or we are experiencing some technical difficulties.</p>
            <p>We are sorry for the inconvenience.</p>
          </div>
        </div>
      </div>

    Here is my Java program, which uses HttpClient:

      package com.tixsnif;

      import org.apache.http.*;
      import org.apache.http.client.HttpClient;
      import org.apache.http.client.entity.UrlEncodedFormEntity;
      import org.apache.http.client.methods.HttpPost;
      import org.apache.http.impl.client.DefaultHttpClient;
      import org.apache.http.message.BasicNameValuePair;
      import org.apache.http.protocol.HTTP;
      import java.io.*;
      import java.util.*;
      import java.util.zip.GZIPInputStream;

      public class WebScrapingTesting {
          public static void main(String[] args) throws Exception {
              String target = "http://ojp.nationalrail.co.uk/en/s/planjourney/query";
              HttpClient client = new DefaultHttpClient();
              HttpPost httpPost = new HttpPost(target);
              BasicNameValuePair[] params = {
                      new BasicNameValuePair("jpState", "single"),
                      new BasicNameValuePair("commandName", "journeyPlannerCommand"),
                      new BasicNameValuePair("from.searchTerm", "Basingstoke"),
                      new BasicNameValuePair("to.searchTerm", "Reading"),
                      new BasicNameValuePair("timeOfOutwardJourney.arrivalOrDeparture", "DEPART"),
                      new BasicNameValuePair("timeOfOutwardJourney.monthDay", "Today"),
                      new BasicNameValuePair("timeOfOutwardJourney.hour", "10"),
                      new BasicNameValuePair("timeOfOutwardJourney.minute", "15"),
                      new BasicNameValuePair("timeOfReturnJourney.arrivalOrDeparture", "DEPART"),
                      new BasicNameValuePair("timeOfReturnJourney.monthDay", "Today"),
                      new BasicNameValuePair("timeOfReturnJourney.hour", "18"),
                      new BasicNameValuePair("timeOfReturnJourney.minute", "15"),
                      new BasicNameValuePair("_includeOvertakenTrains", "on"),
                      new BasicNameValuePair("viaMode", "VIA"),
                      new BasicNameValuePair("via.searchTerm", "Station name / code"),
                      new BasicNameValuePair("offSetOption", "0"),
                      new BasicNameValuePair("_reduceTransfers", "on"),
                      new BasicNameValuePair("operatorMode", "SHOW"),
                      new BasicNameValuePair("operator.code", ""),
                      new BasicNameValuePair("_lookForSleeper", "on"),
                      new BasicNameValuePair("_directTrains", "on")};

              httpPost.setHeader("Host", "ojp.nationalrail.co.uk");
              httpPost.setHeader("User-Agent", "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.231 Safari/534.10");
              httpPost.setHeader("Accept-Encoding", "gzip,deflate,sdch");
              httpPost.setHeader("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8");
              httpPost.setHeader("Accept-Language", "en-us,en;q=0.8");
              httpPost.setHeader("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7");
              httpPost.setHeader("Origin", "http://www.nationalrail.co.uk/");
              httpPost.setHeader("Referer", "http://www.nationalrail.co.uk/");
              httpPost.setHeader("Content-Type", "application/x-www-form-urlencoded");
              httpPost.setHeader("Cookie", "JSESSIONID=B2A3419B79C5D999CA4806B459675CCD.app201; Path=/");

              UrlEncodedFormEntity urlEncodedFormEntity = new UrlEncodedFormEntity(Arrays.asList(params));
              urlEncodedFormEntity.setContentEncoding(HTTP.UTF_8);
              httpPost.setEntity(urlEncodedFormEntity);

              HttpResponse response = client.execute(httpPost);
              InputStream input = response.getEntity().getContent();
              GZIPInputStream gzip = new GZIPInputStream(input);
              InputStreamReader isr = new InputStreamReader(gzip);
              BufferedReader br = new BufferedReader(isr);
              String line = null;
              while ((line = br.readLine()) != null) {
                  System.out.printf("\n%s", line);
              }
              client.getConnectionManager().shutdown();
          }
      }

    I keep the JSESSIONID updated if it expires, but there seems to be another problem that I cannot see. Am I missing something rather obvious?
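    A likely culprit with requests like this is the hard-coded JSESSIONID cookie: that token is only valid for the browser session it was copied from. A more robust pattern is to let an HTTP session object visit the site first, pick up fresh cookies, and then submit the POST. Below is a minimal sketch of that pattern in Python with the requests library (shown only to illustrate the idea; the form field names are copied from the question and may have changed since):

      import requests

      QUERY_URL = "http://ojp.nationalrail.co.uk/en/s/planjourney/query"

      form = {
          "jpState": "single",
          "commandName": "journeyPlannerCommand",
          "from.searchTerm": "Basingstoke",
          "to.searchTerm": "Reading",
          "timeOfOutwardJourney.arrivalOrDeparture": "DEPART",
          "timeOfOutwardJourney.monthDay": "Today",
          "timeOfOutwardJourney.hour": "10",
          "timeOfOutwardJourney.minute": "15",
      }

      with requests.Session() as s:
          # Visit the site first so the server issues a fresh session cookie,
          # instead of replaying a JSESSIONID copied from a browser.
          s.get("http://www.nationalrail.co.uk/")
          resp = s.post(QUERY_URL, data=form,
                        headers={"Referer": "http://www.nationalrail.co.uk/"})
          print(resp.status_code)
          print(resp.text[:500])   # gzip decoding is handled automatically

    The same fetch-the-form-first approach carries over to HttpClient: do a GET, keep the client's cookie store, then POST without setting the Cookie header by hand.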

    Read the article

  • Python Mechanize unable to avoid redirect when Post

    - by Enric Geijo
    I am trying to crawl a site using mechanize. The site provides search results in different pages. When posting to get the next set of results, something is wrong and the server redirects me to the first page, asking mechanize to update the SearchSession Cookie. I have been debugging the requests using Firefox and they look quite the same, and I am unable to find the problem. Any suggestion? Below the requests: ----------- FIRST THE RIGHT SEQUENCE, USING TAMPER IN FIREFOX ------------------------- POST XXX/JobSearch/Results.aspx?Keywords=Python&LTxt=London%2c+South+East&Radius=0&LIds2=ZV&clid=1621&cltypeid=2&clName=London Load Flags[LOAD_DOCUMENT_URI LOAD_INITIAL_DOCUMENT_URI ] Content Size[-1] Mime Type[text/html] Request Headers: Host[www.cwjobs.co.uk] User-Agent[Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.9) Gecko/20100401 Ubuntu/9.10 (karmic) Firefox/3.5.9] Accept[text/html,application/xhtml+xml,application/xml;q=0.9,/;q=0.8] Accept-Language[en-us,en;q=0.5] Accept-Encoding[gzip,deflate] Accept-Charset[ISO-8859-1,utf-8;q=0.7,*;q=0.7] Keep-Alive[300] Connection[keep-alive] Referer[XXX/JobSearch/Results.aspx?Keywords=Python&LTxt=London%2c+South+East&Radius=0&LIds2=ZV&clid=1621&cltypeid=2&clName=London] Cookie[ecos=774803468-0; AnonymousUser=MemberId=acc079dd-66b6-4081-9b07-60d6955ee8bf&IsAnonymous=True; PJBIPPOPUP=; WT_FPC=id=86.181.183.106-2262469600.30073025:lv=1272812851736:ss=1272812789362; SearchSession=SessionGuid=71de63de-3bd0-4787-895d-b6b9e7c93801&LogSource=NAT] Post Data: __EVENTTARGET[srpPager%24btnForward] __EVENTARGUMENT[] hdnSearchResults[BV%2CA%2CC0P5x%2COou-%2CB4S-%2CBuC-%2CDzx-%2CHwn-%2CKPP-%2CIVA-%2CC9D-%2CH6X-%2CH7x-%2CJ0x-%2CCvX-%2CCra-%2COHa-%2CHhP-%2CCoj-%2CBlM-%2CE9W-%2CIm8-%2CBqG-%2CPFy-%2CN%2Fm-%2Ceaa%2CCvj-%2CCtJ-%2CCr7-%2CBpu-%2Cmh%2CMb6-%2CJ%2Fk-%2CHY8-%2COJ7-%2CNtF-%2CEya-%2CErT-%2CEo4-%2CEKU-%2CDnL-%2CC5M-%2CCyB-%2CBsD-%2CBrc-%2CBpU-%2Col%2C30%2CC1%2Cd4N%2COo8-%2COi0-%2CLz%2F-%2CLxP-%2CFyp-%2CFVR-%2CEHL-%2CPrP-%2CLmE-%2CK3H-%2CKXJ-%2CFyn%2CIcq-%2CIco-%2CIK4-%2CIIg-%2CH2k-%2CH0N-%2CHwp-%2CHvF-%2CFij-%2CFhl-%2CCwj-%2CCb5-%2CCQj-%2CCQh-%2CB%2B2-%2CBc6-%2ChFo%2CNLq-%2CNI%2F-%2CFzM-%2Cdu-%2CHg2-%2CBug-%2CBse-%2CB9Q-] 
__VIEWSTATE[%2FwEPDwUKLTkyMzI2ODA4Ng9kFgYCBA8WBB4EaHJlZgWJAWh0dHA6Ly93d3cuY3dqb2JzLmNvLnVrL0pvYlNlYXJjaC9SU1MuYXNweD9LZXl3b3Jkcz1QeXRob24mTFR4dD1Mb25kb24lMmMrU291dGgrRWFzdCZSYWRpdXM9MCZMSWRzMj1aViZjbGlkPTE2MjEmY2x0eXBlaWQ9MiZjbE5hbWU9TG9uZG9uHgV0aXRsZQUkTGF0ZXN0IFB5dGhvbiBqb2JzIGZyb20gQ1dKb2JzLmNvLnVrZAIGDxYCHgRUZXh0BV48bGluayByZWw9ImNhbm9uaWNhbCIgaHJlZj0iaHR0cDovL3d3dy5jd2pvYnMuY28udWsvSm9iU2Vla2luZy9QeXRob25fTG9uZG9uX2wxNjIxX3QyLmh0bWwiIC8%2BZAIIEGRkFg4CBw8WAh8CBV9Zb3VyIHNlYXJjaCBvbiA8Yj5LZXl3b3JkczogUHl0aG9uOyBMb2NhdGlvbjogTG9uZG9uLCBTb3V0aCBFYXN0OyA8L2I%2BIHJldHVybmVkIDxiPjg1PC9iPiBqb2JzLmQCCQ8WAh4HVmlzaWJsZWhkAgsPFgIfAgUoVGhlIG1vc3QgcmVsZXZhbnQgam9icyBhcmUgbGlzdGVkIGZpcnN0LmQCEw8PFgIeC05hdmlnYXRlVXJsBQF%2BZGQCFQ9kFgYCBQ8PFgYfAgUGUHl0aG9uHgtEZWZhdWx0VGV4dAUMZS5nLiBhbmFseXN0HhNEZWZhdWx0VGV4dENzc0NsYXNzZWRkAgsPDxYGHwIFEkxvbmRvbiwgU291dGggRWFzdB8FBQllLmcuIEJhdGgfBmVkZAIRDxAPFgYeDURhdGFUZXh0RmllbGQFClJhZGl1c05hbWUeDkRhdGFWYWx1ZUZpZWxkBQZSYWRpdXMeC18hRGF0YUJvdW5kZ2QQFREHMCBtaWxlcwcyIG1pbGVzBzUgbWlsZXMIMTAgbWlsZXMIMTUgbWlsZXMIMjAgbWlsZXMIMjUgbWlsZXMIMzAgbWlsZXMIMzUgbWlsZXMINDAgbWlsZXMINDUgbWlsZXMINTAgbWlsZXMINjAgbWlsZXMINzAgbWlsZXMIODAgbWlsZXMIOTAgbWlsZXMJMTAwIG1pbGVzFREBMAEyATUCMTACMTUCMjACMjUCMzACMzUCNDACNDUCNTACNjACNzACODACOTADMTAwFCsDEWdnZ2dnZ2dnZ2dnZ2dnZ2dnZGQCFw9kFgQCAQ9kFgQCBA8QZA8WA2YCAQICFgMQBQhBbGwgam9icwUBMGcQBRlEaXJlY3QgZW1wbG95ZXIgam9icyBvbmx5BQEyZxAFEEFnZW5jeSBqb2JzIG9ubHkFATFnZGQCBg8QZA8WA2YCAQICFgMQBQlSZWxldmFuY2UFATFnEAUERGF0ZQUBMmcQBQZTYWxhcnkFATNnZGQCBQ8PFgYeClBhZ2VOdW1iZXICAh4PTnVtYmVyT2ZSZXN1bHRzAlUeDlJlc3VsdHNQZXJQYWdlAhRkZAIZDxYCHwNoZGQ%3D] Refinesearch%24txtKeywords[Python] Refinesearch%24txtLocation[London%2C+South+East] Refinesearch%24ddlRadius[0] ddlCompanyType[0] ddlSort[1] Response Headers: Cache-Control[private] Date[Sun, 02 May 2010 16:09:27 GMT] Content-Type[text/html; charset=utf-8] Expires[Sat, 02 May 2009 16:09:27 GMT] Server[Microsoft-IIS/6.0] X-SiteConHost[P310] X-Powered-By[ASP.NET] X-AspNet-Version[2.0.50727] Set-Cookie[SearchSession=SessionGuid=71de63de-3bd0-4787-895d-b6b9e7c93801&LogSource=NAT; path=/] Content-Encoding[gzip] Vary[Accept-Encoding] Transfer-Encoding[chunked] -------- NOW WHAT I'AM SENDING USING MECHANIZE, SOME HEADERS ADDED, ETC ----------- POST /JobSearch/Results.aspx?Keywords=Python&LTxt=London%2c+South+East&Radius=0&LIds2=ZV&clid=1621&cltypeid=2&clName=London HTTP/1.1\r\nContent-Length: 2424\r\n Accept-Language: en-us,en;q=0.5\r\n Accept-Encoding: gzip\r\n Host: www.cwjobs.co.uk\r\n Accept: text/html,application/xhtml+xml,application/xml;q=0.9,/;q=0.8\r\n Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7\r\n Connection: keep-alive\r\n Cookie: AnonymousUser=MemberId=8fa5ddd7-17ed-425e-b189-82693bfbaa0c&IsAnonymous=True; SearchSession=SessionGuid=33e4e439-c2d6-423f-900f-574099310d5a&LogSource=NAT\r\n Referer: XXX/JobSearch/Results.aspx?Keywords=Python&LTxt=London%2c+South+East&Radius=0&LIds2=ZV&clid=1621&cltypeid=2&clName=London\r\n Content-Type: application/x-www-form-urlencoded\r\n\r\n' '__EVENTTARGET=srpPager%24btnForward& __EVENTARGUMENT=& 
hdnSearchResults=BV%2CA%2CC0eif%2CMwc%2CM6s%2COou%2CK09%2CG4H%2CEZf%2CGTu%2CLrr%2CGuX%2CGs9%2CEz9%2CL5X%2CL9U%2ChU%2CHHf%2CMAL%2CNDi%2CJrY%2CGBy%2CM%2Bo%2CdE-%2CpI%2CtDI%2CL5L%2CL7l%2CL8z%2CM%2FA%2CPPP%2CCM0%2CEpK%2CHPy%2Cez%2C7p%2CJ2U%2CJ9b%2CJ%2F2%2CKea%2CLBj%2CLvi%2CL2t%2CM8r%2CM9S%2CM%2Fa%2CPRT%2CPgi%2Csg7%2CF6%2CI2F%2CJTd%2CO-%2CC0v%2CC3f%2CDCq%2CDxn%2CERl%2CUbV%2CGME%2CGMG%2CGd2%2CGgO%2CGyK%2CG0h%2CG4F%2CG5p%2CJGL%2CJHJ%2CKhj%2CL4L%2CMM1%2CMYL%2CMYN%2CMp4%2CNL0%2COrj%2CvuW%2CBdE%2CBfv%2CI1i%2CBCh-%2COLA%2CHH4%2CM6O%2CM8Q%2CMre& __VIEWSTATE=%2FwEPDwUKLTkyMzI2ODA4Ng9kFgYCBA8WBB4EaHJlZgWJAWh0dHA6Ly93d3cuY3dqb2JzLmNvLnVrL0pvYlNlYXJjaC9SU1MuYXNweD9LZXl3b3Jkcz1QeXRob24mTFR4dD1Mb25kb24lMmMrU291dGgrRWFzdCZSYWRpdXM9MCZMSWRzMj1aViZjbGlkPTE2MjEmY2x0eXBlaWQ9MiZjbE5hbWU9TG9uZG9uHgV0aXRsZQUkTGF0ZXN0IFB5dGhvbiBqb2JzIGZyb20gQ1dKb2JzLmNvLnVrZAIGDxYCHgRUZXh0BV48bGluayByZWw9ImNhbm9uaWNhbCIgaHJlZj0iaHR0cDovL3d3dy5jd2pvYnMuY28udWsvSm9iU2Vla2luZy9QeXRob25fTG9uZG9uX2wxNjIxX3QyLmh0bWwiIC8%2BZAIIEGRkFg4CBw8WAh8CBV9Zb3VyIHNlYXJjaCBvbiA8Yj5LZXl3b3JkczogUHl0aG9uOyBMb2NhdGlvbjogTG9uZG9uLCBTb3V0aCBFYXN0OyA8L2I%2BIHJldHVybmVkIDxiPjg1PC9iPiBqb2JzLmQCCQ8WAh4HVmlzaWJsZWhkAgsPFgIfAgUoVGhlIG1vc3QgcmVsZXZhbnQgam9icyBhcmUgbGlzdGVkIGZpcnN0LmQCEw8PFgIeC05hdmlnYXRlVXJsBQF%2BZGQCFQ9kFgYCBQ8PFgYfAgUGUHl0aG9uHgtEZWZhdWx0VGV4dAUMZS5nLiBhbmFseXN0HhNEZWZhdWx0VGV4dENzc0NsYXNzZWRkAgsPDxYGHwIFEkxvbmRvbiwgU291dGggRWFzdB8FBQllLmcuIEJhdGgfBmVkZAIRDxAPFgYeDURhdGFUZXh0RmllbGQFClJhZGl1c05hbWUeDkRhdGFWYWx1ZUZpZWxkBQZSYWRpdXMeC18hRGF0YUJvdW5kZ2QQFREHMCBtaWxlcwcyIG1pbGVzBzUgbWlsZXMIMTAgbWlsZXMIMTUgbWlsZXMIMjAgbWlsZXMIMjUgbWlsZXMIMzAgbWlsZXMIMzUgbWlsZXMINDAgbWlsZXMINDUgbWlsZXMINTAgbWlsZXMINjAgbWlsZXMINzAgbWlsZXMIODAgbWlsZXMIOTAgbWlsZXMJMTAwIG1pbGVzFREBMAEyATUCMTACMTUCMjACMjUCMzACMzUCNDACNDUCNTACNjACNzACODACOTADMTAwFCsDEWdnZ2dnZ2dnZ2dnZ2dnZ2dnZGQCFw9kFgQCAQ9kFgQCBA8QZA8WA2YCAQICFgMQBQhBbGwgam9icwUBMGcQBRlEaXJlY3QgZW1wbG95ZXIgam9icyBvbmx5BQEyZxAFEEFnZW5jeSBqb2JzIG9ubHkFATFnZGQCBg8QZA8WA2YCAQICFgMQBQlSZWxldmFuY2UFATFnEAUERGF0ZQUBMmcQBQZTYWxhcnkFATNnZGQCBQ8PFgYeClBhZ2VOdW1iZXICAR4PTnVtYmVyT2ZSZXN1bHRzAlUeDlJlc3VsdHNQZXJQYWdlAhRkZAIZDxYCHwNoZGQ%3D& Refinesearch%24txtKeywords=Python& Refinesearch%24txtLocation=London%2CSouth+East& Refinesearch%24ddlRadius=0& Refinesearch%24btnSearch=Search& ddlCompanyType=0& ddlSort=1'
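    One concrete difference is visible in the two dumps: the mechanize body carries Refinesearch%24btnSearch=Search, which the browser's pager postback does not send; an ASP.NET page can interpret that extra submit-button value as a brand-new search instead of a "next page" event, which would explain the redirect back to page one. When comparing captures like these, a mechanical diff of the decoded form fields is more reliable than eyeballing. Here is a small sketch in Python (the file names are placeholders; it assumes each file holds one raw url-encoded body):

      from urllib.parse import parse_qs

      browser = open("browser_post.txt").read()      # body captured with Tamper Data
      script = open("mechanize_post.txt").read()     # body sent by mechanize

      a = parse_qs(browser, keep_blank_values=True)
      b = parse_qs(script, keep_blank_values=True)

      for key in sorted(set(a) | set(b)):
          if a.get(key) != b.get(key):
              print(f"{key}:\n  browser:   {a.get(key)}\n  mechanize: {b.get(key)}")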

    Read the article

  • CURL - HTTPS Weird error

    - by Vincent
    All, I am having trouble requesting info from an HTTPS site using cURL and PHP. I am using Solaris 10. It so happens that sometimes it works and sometimes it doesn't, and I am not sure what the cause is. If it doesn't work, this is the entry recorded in the verbose log:

      * About to connect() to 10.10.101.12 port 443 (#0)
      * Trying 10.10.101.12... * connected
      * Connected to 10.10.101.12 (10.10.101.12) port 443 (#0)
      * error setting certificate verify locations, continuing anyway:
      *   CAfile: /etc/opt/webstack/curl/curlCA
          CApath: none
      * error:80089077:lib(128):func(137):reason(119)
      * Closing connection #0

    If it works, this is the entry recorded in the verbose log:

      * About to connect() to 10.10.101.12 port 443 (#0)
      * Trying 10.10.101.12... * connected
      * Connected to 10.10.101.12 (10.10.101.12) port 443 (#0)
      * error setting certificate verify locations, continuing anyway:
      *   CAfile: /etc/opt/webstack/curl/curlCA
          CApath: none
      * SSL connection using DHE-RSA-AES256-SHA
      * Server certificate:
      *   subject: C=CA, ST=British Columnbia, L=Vancouver, O=google, OU=FDN, CN=g.googlenet.com, [email protected]
      *   start date: 2007-07-24 23:06:32 GMT
      *   expire date: 2027-09-07 23:06:32 GMT
      *   issuer: C=US, ST=California, L=Sunnyvale, O=Google, OU=Certificate Authority, CN=support, [email protected]
      *   SSL certificate verify result: unable to get local issuer certificate (20), continuing anyway.
      > POST /gportal/gpmgr HTTP/1.1^M
        Host: 10.10.101.12^M
        Accept: */*^M
        Accept-Encoding: gzip,deflate^M
        Content-Length: 1623^M
        Content-Type: application/x-www-form-urlencoded^M
        Expect: 100-continue^M
        ^M
      < HTTP/1.1 100 Continue^M
      < HTTP/1.1 200 OK^M
      < Date: Wed, 28 Apr 2010 21:56:15 GMT^M
      < Server: Apache^M
      < Cache-Control: no-cache^M
      < Pragma: no-cache^M
      < Vary: Accept-Encoding^M
      < Content-Encoding: gzip^M
      < Content-Length: 1453^M
      < Content-Type: application/json^M
      < ^M
      * Connection #0 to host 10.10.101.12 left intact
      * Closing connection #0

    My cURL options are as follows:

      $ch = curl_init();
      $devnull = fopen('/tmp/curlcookie.txt', 'w');
      $fp_err = fopen('/tmp/verbose_file.txt', 'ab+');
      fwrite($fp_err, date('Y-m-d H:i:s')."\n\n");
      curl_setopt($ch, CURLOPT_STDERR, $devnull);
      curl_setopt($ch, CURLOPT_POST, 1);
      curl_setopt($ch, CURLOPT_URL, $desturl);
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
      curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
      curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
      curl_setopt($ch, CURLOPT_HEADER, false);
      curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 0);
      curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 120);
      curl_setopt($ch, CURLOPT_AUTOREFERER, true);
      curl_setopt($ch, CURLOPT_ENCODING, 'gzip,deflate');
      curl_setopt($ch, CURLOPT_POSTFIELDS, $postdata);
      curl_setopt($ch, CURLOPT_VERBOSE, 1);
      curl_setopt($ch, CURLOPT_FAILONERROR, true);
      curl_setopt($ch, CURLOPT_STDERR, $fp_err);
      $ret = curl_exec($ch);

    Does anybody have any idea why it works sometimes but fails mostly? Thanks
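    Both logs show the same underlying warning: cURL cannot load its CA bundle at /etc/opt/webstack/curl/curlCA. Intermittent failures like this can come from that file being unreadable at times (permissions, concurrent rotation) rather than from the remote host. A quick way to check the bundle independently of PHP is a few lines of Python (a diagnostic sketch; the path is taken from the verbose log above):

      import os
      import ssl

      CA_FILE = "/etc/opt/webstack/curl/curlCA"   # path from the verbose log

      print("exists:  ", os.path.exists(CA_FILE))
      print("readable:", os.access(CA_FILE, os.R_OK))
      try:
          ctx = ssl.create_default_context(cafile=CA_FILE)
          print("loaded OK, CA certs:", len(ctx.get_ca_certs()))
      except Exception as exc:
          print("failed to load:", exc)

    If the bundle turns out to be broken or unreadable, pointing CURLOPT_CAINFO at a known-good CA bundle is the usual fix.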

    Read the article

  • BasicAuthProvider in ServiceStack

    - by Per
    I've got an issue with the BasicAuthProvider in ServiceStack. POST-ing to the CredentialsAuthProvider (/auth/credentials) is working fine. The problem is that when GET-ing (in Chrome): http://foo:pwd@localhost:81/tag/string/list the following is the result Handler for Request not found: Request.HttpMethod: GET Request.HttpMethod: GET Request.PathInfo: /login Request.QueryString: System.Collections.Specialized.NameValueCollection Request.RawUrl: /login?redirect=http%3a%2f%2flocalhost%3a81%2ftag%2fstring%2flist which tells me that it redirected me to /login instead of serving the /tag/... request. Here's the entire code for my AppHost: public class AppHost : AppHostHttpListenerBase, IMessageSubscriber { private ITagProvider myTagProvider; private IMessageSender mySender; private const string UserName = "foo"; private const string Password = "pwd"; public AppHost( TagConfig config, IMessageSender sender ) : base( "BM App Host", typeof( AppHost ).Assembly ) { myTagProvider = new TagProvider( config ); mySender = sender; } public class CustomUserSession : AuthUserSession { public override void OnAuthenticated( IServiceBase authService, IAuthSession session, IOAuthTokens tokens, System.Collections.Generic.Dictionary<string, string> authInfo ) { authService.RequestContext.Get<IHttpRequest>().SaveSession( session ); } } public override void Configure( Funq.Container container ) { Plugins.Add( new MetadataFeature() ); container.Register<BeyondMeasure.WebAPI.Services.Tags.ITagProvider>( myTagProvider ); container.Register<IMessageSender>( mySender ); Plugins.Add( new AuthFeature( () => new CustomUserSession(), new AuthProvider[] { new CredentialsAuthProvider(), //HTML Form post of UserName/Password credentials new BasicAuthProvider(), //Sign-in with Basic Auth } ) ); container.Register<ICacheClient>( new MemoryCacheClient() ); var userRep = new InMemoryAuthRepository(); container.Register<IUserAuthRepository>( userRep ); string hash; string salt; new SaltedHash().GetHashAndSaltString( Password, out hash, out salt ); // Create test user userRep.CreateUserAuth( new UserAuth { Id = 1, DisplayName = "DisplayName", Email = "[email protected]", UserName = UserName, FirstName = "FirstName", LastName = "LastName", PasswordHash = hash, Salt = salt, }, Password ); } } Could someone please tell me what I'm doing wrong with either the SS configuration or how I am calling the service, i.e. why does it not accept the supplied user/pwd? Update1: Request/Response captured in Fiddler2when only BasicAuthProvider is used. No Auth header sent in the request, but also no Auth header in the response. GET /tag/string/AAA HTTP/1.1 Host: localhost:81 Connection: keep-alive User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8,sv;q=0.6 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 Cookie: ss-pid=Hu2zuD/T8USgvC8FinMC9Q==; X-UAId=1; ss-id=1HTqSQI9IUqRAGxM8vKlPA== HTTP/1.1 302 Found Location: /login?redirect=http%3a%2f%2flocalhost%3a81%2ftag%2fstring%2fAAA Server: Microsoft-HTTPAPI/2.0 X-Powered-By: ServiceStack/3,926 Win32NT/.NET Date: Sat, 10 Nov 2012 22:41:51 GMT Content-Length: 0 Update2 Request/Response with HtmlRedirect = null . 
SS now answers with the Auth header, which Chrome then issues a second request for and authentication succeeds GET http://localhost:81/tag/string/Abc HTTP/1.1 Host: localhost:81 Connection: keep-alive User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8,sv;q=0.6 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 Cookie: ss-pid=Hu2zuD/T8USgvC8FinMC9Q==; X-UAId=1; ss-id=1HTqSQI9IUqRAGxM8vKlPA== HTTP/1.1 401 Unauthorized Transfer-Encoding: chunked Server: Microsoft-HTTPAPI/2.0 X-Powered-By: ServiceStack/3,926 Win32NT/.NET WWW-Authenticate: basic realm="/auth/basic" Date: Sat, 10 Nov 2012 22:49:19 GMT 0 GET http://localhost:81/tag/string/Abc HTTP/1.1 Host: localhost:81 Connection: keep-alive Authorization: Basic Zm9vOnB3ZA== User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11 Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8,sv;q=0.6 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 Cookie: ss-pid=Hu2zuD/T8USgvC8FinMC9Q==; X-UAId=1; ss-id=1HTqSQI9IUqRAGxM8vKlPA==
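    The two captures tell the story: a browser never sends the user:pwd@ credentials from the URL up front; it waits for a 401 challenge, and the AuthFeature's default HTML redirect was converting that challenge into a 302 to /login. With HtmlRedirect = null the 401 with WWW-Authenticate gets through, and Chrome retries with the Authorization header. The header value is nothing more than base64("user:password"), and non-browser clients can simply send it preemptively. A small sketch in Python (requests' basic-auth support does exactly this):

      import base64
      import requests

      # The header Chrome sent on its retry is just base64("foo:pwd"):
      print(base64.b64encode(b"foo:pwd").decode())   # -> Zm9vOnB3ZA==, matching the capture

      # requests attaches the same header on the very first request (preemptive auth):
      resp = requests.get("http://localhost:81/tag/string/list", auth=("foo", "pwd"))
      print(resp.status_code)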

    Read the article

  • How to convert number(16,10) to date in oracle

    - by Elad
    Hi, I'm trying to read the Borland StarTeam application's Oracle database, and I noticed that they represent their dates as a number(16,10) column. I don't think it is a timestamp or an epoch value. For instance, I have the number 37137.4347569444; how can I read it as a date? I saw that the database has a stored procedure, CONVERT_DATE:

      CREATE OR REPLACE procedure STARBASE.convert_date
        ( number_of_days IN integer , nDate OUT number) is
        nDateOffset number;
        CurrentDate date;
        Month integer;
        Day integer;
        year number;
        success boolean := false;
        bLeapYear boolean:=false;
        nDaysInMonths number;
        nLeapDays integer;
        fDate number (16,10);
        rgMonthDays number(5,0);
      begin
        select sysdate - number_of_days into CurrentDate from dual;
        nDateOffset := 693959;
        select to_number(substr((TO_CHAR (CurrentDate, 'MM-DD-YYYY')) , 1, 2), '99') - 1 into month from dual;
        select to_number(substr((TO_CHAR (CurrentDate, 'MM-DD-YYYY')) , 4, 2), '99') - 1 into day from dual;
        select to_number(substr((TO_CHAR (CurrentDate, 'MM-DD-YYYY')) , 7, 4), '9999') into year from dual;
        if ( mod(year , 4) = 0 ) and ( ( mod(year , 400) = 0) or ( mod(year , 100) < 0 )) then
          bLeapYear := true;
        end if;
        nLeapDays := 0;
        if ( bLeapYear = true) and ( Day = 28) and ( Month = 1 ) then
          nLeapDays := 1;
        end if;
        select substr(to_char(last_day(CurrentDate) , 'DD-MM-YYYY') , 1 , 2) into nDaysInMonths from dual;
        if Month = 0 then rgMonthDays := 0;
        elsif Month = 1 then rgMonthDays := 31;
        elsif Month = 2 then rgMonthDays := 59;
        elsif Month = 3 then rgMonthDays := 90;
        elsif Month = 4 then rgMonthDays := 120;
        elsif Month = 5 then rgMonthDays := 151;
        elsif Month = 6 then rgMonthDays := 181;
        elsif Month = 7 then rgMonthDays := 212;
        elsif Month = 8 then rgMonthDays := 243;
        elsif Month = 9 then rgMonthDays := 273;
        elsif Month = 10 then rgMonthDays := 304;
        elsif Month = 11 then rgMonthDays := 334;
        elsif Month = 12 then rgMonthDays := 365;
        end if;
        nDate := Year*365 + Year/4 - Year/100 + Year/400 + rgMonthDays + Day + 1;
        if ( Month < 2 ) and ( bLeapYear = true) then
          nDate := nDate - 1;
        end if;
        nDate := nDate - nDateOffset;
      exception
        when others then
          raise;
      end convert_date;

    I don't know how to use it. How can I read it anyway? Please help. Thank you
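    The 693959 offset in the procedure is the tell: the day serial it computes lands its zero point at the end of 1899, which matches the OLE automation date convention (epoch 1899-12-30) used by COM and Delphi-era tools. If that assumption holds for StarTeam, the whole part of the number is days since 1899-12-30 and the fraction is the time of day. A quick check in Python (the epoch is the assumption here):

      from datetime import datetime, timedelta

      OLE_EPOCH = datetime(1899, 12, 30)   # assumption: StarTeam stores OLE automation dates

      def ole_to_datetime(value: float) -> datetime:
          return OLE_EPOCH + timedelta(days=value)

      print(ole_to_datetime(37137.4347569444))   # -> 2001-09-03 10:26:03 (approx)

    If the epoch guess is right, the same conversion directly in Oracle would be DATE '1899-12-30' + your_column, since adding a number to a DATE adds days including the fractional part.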

    Read the article

  • HttpWebRequest response produces HTTP 422. Why?

    - by Simon
    Hi there. I'm trying to programmatically send a POST request to a web server in order to log in and then perform other requests that require a login. This is my code:

      byte[] data = Encoding.UTF8.GetBytes(
          String.Format("login={0}&password={1}&authenticity_token={2}",
              username, password, token));

      //Create HTTP-request for login
      HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create("http://www.xxx.xx/xx/xx");
      request.Method = "POST";
      request.ContentType = "application/x-www-form-urlencoded";
      request.ContentLength = data.Length;
      request.CookieContainer = new CookieContainer();
      request.Accept = "application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5";
      request.Referer = "http://www.garzantilinguistica.it/it/session";
      request.Headers.Add("Accept-Language", "de-DE");
      request.Headers.Add("Origin", "http://www.xxx.xx");
      request.UserAgent = "C#";
      request.Headers.Add("Accept-Encoding", "gzip, deflate");

    After sending the request:

      //Send post request
      var requestStream = request.GetRequestStream();
      requestStream.Write(data, 0, data.Length);
      requestStream.Flush();
      requestStream.Close();

    ... I want to get the server's response:

      //Get Response
      StreamReader responseStreamReader = new StreamReader(
          request.GetResponse().GetResponseStream());
      //WebException: HTTP 422!
      string content = responseStreamReader.ReadToEnd();

    This piece of code fires the WebException, which tells me the server responded with HTTP 422 (unprocessable entity due to semantic errors). Then I compared (using a TCP/IP sniffer) the requests of my program and the browser (which of course produces a valid POST request and gets the right response).

    (1) My program's request:

      POST /it/session HTTP/1.1
      Content-Type: application/x-www-form-urlencoded
      Accept: application/xml,application/xhtml+xml,text/html;q=0.9,text/plain; q=0.8,image/png,*/*;q=0.5
      Referer: http://www.garzantilinguistica.it/it/session
      Accept-Language: de-DE
      Origin: http://www.garzantilinguistica.it
      User-Agent: C#
      Accept-Encoding: gzip, deflate
      Host: www.garzantilinguistica.it
      Content-Length: 111
      Expect: 100-continue
      Connection: Keep-Alive

      HTTP/1.1 100 Continue

      [email protected]&password=machivui&authenticity_token=4vLgtwP3nFNg4NeuG4MbUnU7sy4z91Wi8WJXH0POFmg=

      HTTP/1.1 422 Unprocessable Entity

    (2) The browser's request:

      POST /it/session HTTP/1.1
      Host: www.garzantilinguistica.it
      Referer: http://www.garzantilinguistica.it/it/session
      Accept: application/xml,application/xhtml+xml,text/html;q=0.9, text/plain;q=0.8,image/png,*/*;q=0.5
      Accept-Language: de-DE
      Origin: http://www.garzantilinguistica.it
      User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; de-DE) AppleWebKit/531.22.7 (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7
      Accept-Encoding: gzip, deflate
      Content-Type: application/x-www-form-urlencoded
      Cookie: __utma=244184339.652523587.1275208707.1275208707.1275211298.2; __utmb=244184339.20.10.1275211298; __utmc=244184339; __utmz=244184339.1275208707.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none); _garzanti2009_session=BAh7CDoPc2Vzc2lvbl9pZCIlZDg4MWZjNjg2YTRhZWE0NDQ0ZTJmMTU2YWY4ZTQ1NGU6EF9jc3JmX3Rva2VuIjFqRWdLdll3dTYwOTVVTEpNZkt6dG9jUCtaZ0o4V0FnV2V5ZnpuREx6QUlZPSIKZmxhc2hJQzonQWN0aW9uQ29udHJvbGxlcjo6Rmxhc2g6OkZsYXNoSGFzaHsGOgplcnJvciIVbG9naW4gbm9uIHZhbGlkbwY6CkB1c2VkewY7CFQ%3D--4200fa769898dd156faa49e457baf660cf068d08
      Content-Length: 144
      Connection: keep-alive

      authenticity_token=jEgKvYwu6095ULJMfKztocP%2BZgJ8WAgWeyfznDLzAIY%3D&login=thespider14%40hotmail.com&password=machivui&remember_me=1&commit=Entra

      HTTP/1.1 302 Found

    Can someone help me understand which part of the request I am missing, or what the main difference between the browser's request and mine is? Why am I getting that 422?
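    The key difference is in the captures themselves: the browser sends a Rails session cookie (_garzanti2009_session) together with an authenticity_token that belongs to that session, while the program sends a token with no matching cookie. Rails rejects a token/session mismatch with exactly this 422, since the token is its CSRF protection. The fix is to GET the login page first with a cookie-keeping session, scrape a fresh token out of the form, and POST with the same cookie jar. A minimal sketch in Python's requests (the token regex is an assumption about the page markup, and the credentials are placeholders):

      import re
      import requests

      LOGIN_URL = "http://www.garzantilinguistica.it/it/session"

      with requests.Session() as s:
          # Fetch the form first: this sets the Rails session cookie and
          # exposes the authenticity_token that belongs to it.
          page = s.get(LOGIN_URL).text
          token = re.search(r'name="authenticity_token"[^>]*value="([^"]+)"', page).group(1)
          resp = s.post(LOGIN_URL, data={
              "login": "you@example.com",      # placeholder credentials
              "password": "secret",
              "authenticity_token": token,
          })
          print(resp.status_code, resp.url)

    The same flow in C# means reusing one CookieContainer across the GET and the POST instead of constructing the token by hand.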

    Read the article

  • XSLT: a variation on the pagination problem

    - by MarcoS
    I must transform some XML data into a paginated list of fields. Here is an example. Input XML:

      <?xml version="1.0" encoding="UTF-8"?>
      <data>
        <books>
          <book title="t0"/>
          <book title="t1"/>
          <book title="t2"/>
          <book title="t3"/>
          <book title="t4"/>
        </books>
        <library name="my library"/>
      </data>

    Desired output:

      <?xml version="1.0" encoding="UTF-8"?>
      <pages>
        <page number="1">
          <field name="library_name" value="my library"/>
          <field name="book_1" value="t0"/>
          <field name="book_2" value="t1"/>
        </page>
        <page number="2">
          <field name="book_1" value="t2"/>
          <field name="book_2" value="t3"/>
        </page>
        <page number="3">
          <field name="book_1" value="t4"/>
        </page>
      </pages>

    In the above example I assume that I want at most 2 fields named book_n (with n ranging between 1 and 2) per page. Tags <page> must have an attribute number. Finally, the field named library_name must appear only in the first <page>. Here is my current solution using XSLT:

      <?xml version="1.0" encoding="UTF-8"?>
      <xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="2.0" exclude-result-prefixes="trx xs">
        <xsl:output method="xml" indent="yes" omit-xml-declaration="no" />
        <xsl:variable name="max" select="2"/>
        <xsl:template match="//books">
          <xsl:for-each-group select="book" group-ending-with="*[position() mod $max = 0]">
            <xsl:variable name="pageNum" select="position()"/>
            <page number="{$pageNum}">
              <xsl:for-each select="current-group()">
                <xsl:variable name="idx" select="if (position() mod $max = 0) then $max else position() mod $max"/>
                <field value="{@title}">
                  <xsl:attribute name="name">book_<xsl:value-of select="$idx"/></xsl:attribute>
                </field>
              </xsl:for-each>
              <xsl:if test="$pageNum = 1">
                <xsl:call-template name="templateFor_library"/>
              </xsl:if>
            </page>
          </xsl:for-each-group>
        </xsl:template>
        <xsl:template name="templateFor_library">
          <xsl:for-each select="//library">
            <field name="library_name" value="{@name}" />
          </xsl:for-each>
        </xsl:template>
      </xsl:stylesheet>

    Is there a better/simpler way to perform this transformation?
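    One detail worth noting: the posted stylesheet emits the library field after the book fields on page 1 (the xsl:call-template comes after the inner xsl:for-each), while the desired output lists library_name first, so the call should be moved before the loop. The core of the transform is simply "chunk into groups of $max, number the groups, and prepend one extra field to the first group". A sketch of that same logic in Python can make it easy to sanity-check the expected output before tweaking the XSLT (illustration only, not a replacement for the stylesheet):

      def paginate(titles, library_name, page_size=2):
          pages = []
          for start in range(0, len(titles), page_size):
              number = start // page_size + 1
              fields = [("library_name", library_name)] if number == 1 else []
              fields += [(f"book_{i}", t)
                         for i, t in enumerate(titles[start:start + page_size], start=1)]
              pages.append((number, fields))
          return pages

      print(paginate(["t0", "t1", "t2", "t3", "t4"], "my library"))
      # [(1, [('library_name', 'my library'), ('book_1', 't0'), ('book_2', 't1')]),
      #  (2, [('book_1', 't2'), ('book_2', 't3')]),
      #  (3, [('book_1', 't4')])]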

    Read the article

  • Php template caching design

    - by Thomas
    Hello to all, I want to include caching in my app design, starting with template caching. The design I have used so far is very modular. I have created an ORM implementation for all my tables, and each table is represented by the corresponding class. All the requests are handled by one controller, which routes them to the appropriate webmethod functions. I am using a template class for handling UI parts. What I have in mind for caching is the implementation of a separate Cache class, with the flexibility to store either in files, APC or memcache. Right now I am testing with file caching.

    Some thoughts: Should I include the logic of checking for cached versions in the Template class, or in the webmethods which handle the incoming requests and which eventually call the Template class? In the first case, things are pretty simple, as I will not have to change anything more than pass the template class an extra argument (whether to load from cache or not). In the second case, however, I am thinking of checking for a cached version immediately in the webmethod and, if found, returning it. This will save all the processing done until the logic reaches the template (first-case scenario). Both scenarios, however, rely on an accurate mechanism for invalidating caches, which brings us to:

    Invalidating caches. As I see it (and you can add your input freely), a cached template file becomes invalid if: a. the expiration set is reached; b. the template file itself is updated (i.e. by the developer when adding a new line); c. the webmethod that handles the request changes (i.e. the developer adds/deletes something in the code); d. content coming from the db and ending up in the template file is modified.

    I am thinking of storing a JSON-encoded array inside the cached file. The first value will be the expiration timestamp of the cache. The second value will be the modification time of the PHP file with the code handling the request (to cope with option c above). The third will be the content itself.

    The validation process I am considering, according to the above scenarios, is: a. If the expiration of the cached file (stored in the array) is reached, delete the cache file. b. If the cached file's mod time is smaller than the template's skeleton file mod time, delete the cached file. c. If the mod time of the PHP file is greater than the one stored in the cache, delete the cached file. d. This is tricky. In the ORM implementation I have added event handlers (which fire when adding, updating or deleting objects). I could delete the cache file every time an object that provides content to the template is modified. The problem is how to keep track of which cached files correspond to each schema object. Take this example: a user has his short-profile page and a full profile page (2 templates). These templates can be cached. Now, every time the user modifies his profile, the event handler would need to know which templates or cached files correspond to the user, so that those files can be deleted. I could store the mapping in the db, but I am looking for a better approach.
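    The bookkeeping problem in (d) is what tag-based caches solve: every cached entry is stored under one or more tags (e.g. user:42), and the ORM event handler invalidates the tag instead of hunting for file names. A minimal sketch of the idea in Python (illustration only; the same structure maps directly onto a PHP class, with the tag index kept in APC/memcache or a small index file):

      import time

      class TagCache:
          """Minimal tag-based cache: entries carry tags such as 'user:42';
          invalidating a tag drops every entry that carries it."""

          def __init__(self):
              self.entries = {}   # key -> (expires_at, value)
              self.tags = {}      # tag -> set of keys

          def set(self, key, value, ttl, tags=()):
              self.entries[key] = (time.time() + ttl, value)
              for tag in tags:
                  self.tags.setdefault(tag, set()).add(key)

          def get(self, key):
              item = self.entries.get(key)
              if item is None:
                  return None
              expires_at, value = item
              if time.time() > expires_at:        # rule (a): expiration reached
                  del self.entries[key]
                  return None
              return value

          def invalidate_tag(self, tag):          # rule (d): db content changed
              for key in self.tags.pop(tag, set()):
                  self.entries.pop(key, None)

      cache = TagCache()
      cache.set("profile_short:42", "<html>...</html>", ttl=300, tags=("user:42",))
      cache.set("profile_full:42", "<html>...</html>", ttl=300, tags=("user:42",))
      cache.invalidate_tag("user:42")   # the ORM event handler calls this on user update

    Rules (b) and (c) stay as mod-time comparisons at read time, exactly as described above; only (d) needs the tag index.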

    Read the article

  • jQuery.ajax call to Twitter succeeds but returns null for Firefox

    - by Zhami
    I've got code that makes a simple GET request to Twitter (search) using jQuery's ajax method. The code works fine on Safari, but fails on Firefox (3.6.3). In the Firefox case, my jQuery.ajax 'success' method is invoked, but the supplied data is null. (In Safari, I receive a boatload of JSON data.) My ajax call is:

      $.ajax({
          url: 'http://search.twitter.com/search.json?q=' + searchTerm,
          dataType: 'json',
          async: true,
          beforeSend: function(request) {
              window.console.log('starting AJAX request to get Twitter data');
          },
          success: function(data, textStatus, request) {
              window.console.log('AJAX request to get Twitter succeeded: status=' + textStatus);
              callback(data);
          },
          error: function(request, status, error) {
              window.console.log('AJAX request to get user data --> Error: ' + status);
              errback(request, status, error);
          }
      });

    Firebug shows these response headers:

      Date Sun, 11 Apr 2010 22:30:26 GMT
      Server hi
      Status 200 OK
      X-Served-From b021
      X-Runtime 0.23841
      Content-Type application/json; charset=utf-8
      X-Served-By sjc1o024.prod.twitter.com
      X-Timeline-Cache-Hit Miss
      Cache-Control max-age=15, must-revalidate, max-age=300
      Expires Sun, 11 Apr 2010 22:35:26 GMT
      Vary Accept-Encoding
      X-Varnish 1827846877
      Age 0
      Via 1.1 varnish
      X-Cache-Svr sjc1o024.prod.twitter.com
      X-Cache MISS
      Content-Encoding gzip
      Content-Length 2126
      Connection close

    The HTTP status is OK (200), the Content-Type is properly application/json, and the Content-Length of 2126 (gzip'd) implies data came back. Yet Firebug shows the response to be empty, and a test of the supplied data shows it to be 'null'. I am aware of a similar post on Stack Overflow: http://stackoverflow.com/questions/1188976/jquery-get-function-succeeds-with-200-but-returns-no-content-in-firefox and from that I would assume this problem is possibly related to cross-domain security, but... I know there are many JS widgets and whatnots that fetch data from Twitter via ajax. Is there something I need to enable to allow this?
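    This is the classic same-origin restriction: Firefox lets the request go out but withholds the cross-domain response body, so 'success' fires with null. The widgets that work around it use JSONP, which jQuery enables when the URL carries callback=? and dataType is set to 'jsonp'. You can also confirm the endpoint itself is fine by fetching it outside the browser, where no same-origin policy applies; a quick check in Python (note that the old search.twitter.com API has long since been retired, so treat this as a historical sketch):

      import json
      from urllib.request import urlopen

      # Outside the browser there is no same-origin policy, so the JSON arrives intact.
      with urlopen("http://search.twitter.com/search.json?q=python") as resp:
          data = json.load(resp)
      print(len(data.get("results", [])), "results")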

    Read the article

  • Binary serialization/de-serialization in C++ and C#

    - by 6pack kid
    Hello. I am working on a distributed application which has two components. One is written in standard C++ (not managed C++) and the other is written in C#. Both communicate via a message bus. I have a situation in which I need to pass objects from the C++ application to the C# application, and for this I need to serialize those objects in C++ and de-serialize them in C# (something like marshaling/un-marshaling in .NET). I need to perform this serialization in binary, not in XML (for performance reasons). I have used Boost.Serialization to do this when both ends were implemented in C++, but now that I have a .NET application on one end, Boost.Serialization is not a viable solution. I am looking for a solution that allows me to perform (de)serialization across the C++/.NET boundary, i.e., cross-platform binary serialization. I know I can implement the (de)serialization code in a C++ DLL and use P/Invoke in the .NET application, but I want to keep that as a last resort. Also, I want to know: if I use some standard like gzip, will that be efficient? Are there any other alternatives to gzip? What are their pros and cons? Thanks
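    One clarification first: gzip is a compression format, not a serialization format. It can shrink an already-serialized byte stream, but it does not define how objects map to bytes. For the actual wire format, a schema-based library such as Protocol Buffers (official C++ support, C# via protobuf-net) is the usual cross-language answer; the hand-rolled alternative is to agree on a fixed binary layout on both sides. A toy sketch of such a layout, written in Python purely to illustrate the byte-level contract (the field set is invented for the example; C++ and C# would read and write the same bytes):

      import struct

      # Wire format (assumed, for illustration): little-endian u32 id, f64 value,
      # u16 name length, then the UTF-8 name bytes.
      HEADER = "<IdH"

      def pack_record(rec_id: int, value: float, name: str) -> bytes:
          raw = name.encode("utf-8")
          return struct.pack(HEADER, rec_id, value, len(raw)) + raw

      def unpack_record(buf: bytes):
          rec_id, value, n = struct.unpack_from(HEADER, buf, 0)
          name = buf[struct.calcsize(HEADER):][:n].decode("utf-8")
          return rec_id, value, name

      assert unpack_record(pack_record(7, 3.14, "widget")) == (7, 3.14, "widget")

    Whatever format is chosen, gzip (or zlib) can then be layered on top of the serialized bytes if message size on the bus matters more than CPU.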

    Read the article

  • empty response body in ajax (or 206 Partial Content)

    - by Nikita Rybak
    Hi guys, I'm feeling completely stupid because I've spent two hours on a task which should be very simple and which I have solved many times before. But now I'm not even sure in which direction to dig. I fail to fetch static content using ajax from local servers (Apache and Mongrel). I get responses 200 and 206 (depending on the server) with empty response text (although the Content-Length header is always correct), and Firebug shows the request in red. The JavaScript is very generic; I'm getting the same results even here: http://www.w3schools.com/ajax/tryit.asp?filename=tryajax_first (just change the document location to 'http://localhost:3000/whatever'), so the script is probably not the cause. Well, now I'm out of ideas. Here are the HTTP headers, in case they help. Thanks!

      Response Headers
        Connection close
        Date Sat, 01 May 2010 21:05:23 GMT
        Last-Modified Sun, 18 Apr 2010 19:33:26 GMT
        Content-Type text/html
        Content-Length 7466

      Request Headers
        Host localhost:3000
        User-Agent Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3
        Accept text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
        Accept-Language en-us,en;q=0.5
        Accept-Encoding gzip,deflate
        Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7
        Keep-Alive 115
        Connection keep-alive
        Referer http://www.w3schools.com/ajax/tryit_view.asp
        Origin http://www.w3schools.com

      Response Headers
        Date Sat, 01 May 2010 21:54:59 GMT
        Server Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 mod_jk/1.2.28
        Etag "3d5cbdb-fb4-4819c460d4a40"
        Accept-Ranges bytes
        Content-Length 4020
        Cache-Control max-age=7200, public, proxy-revalidate
        Expires Sat, 01 May 2010 23:54:59 GMT
        Content-Range bytes 0-4019/4020
        Keep-Alive timeout=5, max=100
        Connection Keep-Alive
        Content-Type application/javascript

      Request Headers
        Host localhost
        User-Agent Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3
        Accept text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
        Accept-Language en-us,en;q=0.5
        Accept-Encoding gzip,deflate
        Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7
        Keep-Alive 115
        Connection keep-alive
        Origin null
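    The giveaway is in the request headers: Origin http://www.w3schools.com in the first capture and Origin null in the second. The page issuing the XHR lives on a different origin than localhost, so the browser performs the request but refuses to hand the body to the script, which is exactly the "correct Content-Length, empty responseText" symptom. Either serve the test page from the same host and port as the content, or have the local server opt in with a CORS header. A minimal sketch of the latter in Python's standard library (test-only; the wildcard origin is deliberately permissive, and whether a 2010-era browser honors it aside, the diagnosis stands):

      from http.server import SimpleHTTPRequestHandler, HTTPServer

      class CORSHandler(SimpleHTTPRequestHandler):
          def end_headers(self):
              # Allow pages from other origins to read the response body.
              self.send_header("Access-Control-Allow-Origin", "*")
              super().end_headers()

      if __name__ == "__main__":
          HTTPServer(("localhost", 3000), CORSHandler).serve_forever()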

    Read the article

< Previous Page | 81 82 83 84 85 86 87 88 89 90 91 92  | Next Page >