Search Results

Search found 50839 results on 2034 pages for 'http 404'.

Page 196 of 2034

  • Tutorials - Getting Started

    - by Rodney Vinyard
    Walkthrough: Creating And Using Ajax-Enabled Web Service http://msdn.microsoft.com/en-us/library/bb532367(v=vs.90).aspx  Getting Started with jQuery in Visual Studio 2008 http://blogs.microsoft.co.il/blogs/bursteg/archive/2009/06/05/getting-started-with-jquery-in-visual-studio-2008.aspx  jQuery Examples http://www.w3schools.com/jquery/jquery_examples.asp  Ajax Control Toolkit http://www.asp.net/ajaxlibrary/act.ashx

    Read the article

  • Free HTML5 & CSS3 Fundamentals course

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2013/10/13/free-html5--css3-fundamentals-course.aspx. At http://www.microsoftvirtualacademy.com/training-courses/html5-css3-fundamentals-development-for-absolute-beginners there is a free course on HTML5 & CSS3 Fundamentals. This is not a course on pretty web design but on writing good, standards-compliant HTML. Please note that to get the work files for the course you need to go to http://channel9.msdn.com/Series/HTML5-CSS3-Fundamentals-Development-for-Absolute-Beginners/Series-Introduction-01, as the Microsoft Virtual Academy downloads do not seem to work! The course is given by Bob Tabor, who runs http://www.learnvisualstudio.net

    Read the article

  • ECMA International adopts JSON as a standard; the data interchange format continues its rise

    ECMA International adopts JSON as a standard; the data interchange format continues its rise. JSON (JavaScript Object Notation) has been adopted as an ECMA standard following a vote of the General Assembly. The new standard was assigned the number 404, which inevitably brings to mind the error code of the same number in the HTTP communication protocol on the Internet, returned by an HTTP server to indicate that the requested resource (usually a web page) does not exist. As a reminder...

    Read the article

  • Google search results are invalid

    - by Rufus
    I'm writing a program that lets a user perform a Google search. When the results come back, all of the links in them point not to the other sites but back to Google, and if the user clicks on one, the page is fetched from Google rather than from the other site. Can anyone explain how to fix this problem? My Google URL consists of this: http://google.com/search?q=gargle

    But this is what I get back when the user clicks on the Wikipedia search result, which was http://www.google.com/url?q=http://en.wikipedia.org/wiki/Gargling&sa=U&ei=_4vkT5y555Wh6gGBeOzECg&ved=0CBMQejAe&usg=AFQjeNHd1eRV8Xef3LGeH6AvGxt-AF-Yjw

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html lang="en" dir="ltr" class="client-nojs" xmlns="http://www.w3.org/1999/xhtml">
        <head>
        <title>Gargling - Wikipedia, the free encyclopedia</title>
        <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
        <meta http-equiv="Content-Style-Type" content="text/css" />
        <meta name="generator" content="MediaWiki 1.20wmf5" />
        <meta http-equiv="last-modified" content="Fri, 09 Mar 2012 12:34:19 +0000" />
        <meta name="last-modified-timestamp" content="1331296459" />
        <meta name="last-modified-range" content="0" />
        <link rel="alternate" type="application/x-wiki" title="Edit this page" >
        <link rel="edit" title="Edit this page" >
        <link rel="apple-touch-icon" >
        <link rel="shortcut icon" >
        <link rel="search" type="application/opensearchdescription+xml" >
        <link rel="EditURI" type="application/rsd+xml" >
        <link rel="copyright" >
        <link rel="alternate" type="application/atom+xml" title="Wikipedia Atom feed" >
        <link rel="stylesheet" href="//bits.wikimedia.org/en.wikipedia.org/load.php?debug=false&amp;lang=en&amp;modules=ext.gadget.teahouse%7Cext.wikihiero%7Cmediawiki.legacy.commonPrint%2Cshared%7Cskins.vector&amp;only=styles&amp;skin=vector&amp;*" type="text/css" media="all" />
        <style type="text/css" media="all">#mwe-lastmodified { display: none; }</style><meta name="ResourceLoaderDynamicStyles" content="" />
        <link rel="stylesheet" href="//bits.wikimedia.org/en.wikipedia.org/load.php?debug=false&amp;lang=en&amp;modules=site&amp;only=styles&amp;skin=vector&amp;*" type="text/css" media="all" />
        <style type="text/css" media="all">a:lang(ar),a:lang(ckb),a:lang(fa),a:lang(kk-arab),a:lang(mzn),a:lang(ps),a:lang(ur){text-decoration:none} /* cache key: enwiki:resourceloader:filter:minify-css:7:d5a1bf6cbd05fc6cc2705e47f52062dc */</style>

    Read the article

  • New Videos in the Tailspin Spyworks series live.

    We've posted a few new videos in the Tailspin Spyworks tutorial series. Many more are in the works! http://www.asp.net/web-forms/fundamentals http://www.asp.net/aspnet-4/videos http://www.asp.net/aspnet-4/videos/tailspin-spyworks-display-the-product-list http://www.asp.net/aspnet-4/videos/tailspin-spyworks-display-per-product-details... Did you know that DotNetSlackers also publishes .NET articles written by well-known .NET authors? We already have over 80 articles in several categories, including Silverlight. Take a look: here.

    Read the article

  • How do I install Myunity on 12.10?

    - by Brenton Horne
    Basically, as the title says: how do you install MyUnity on 12.10? I've tried adding the repository ppa:myunity/ppa and doing:
        sudo add-apt-repository ppa:myunity/ppa
        sudo apt-get update
        sudo apt-get install myunity
    at which point I got the error:
        W: Failed to fetch http://ppa.launchpad.net/myunity/ppa/ubuntu/dists/quantal/main/binary-i386/Packages 404 Not Found
        E: Some index files failed to download. They have been ignored, or old ones used instead.
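
    The 404 usually means that this particular PPA has no packages published for quantal (12.10), so apt-get update will keep failing until the source is removed. A minimal sketch of that cleanup, assuming the PPA added above is the only broken one:

        sudo add-apt-repository --remove ppa:myunity/ppa
        sudo apt-get update

    After that, MyUnity would have to come from a PPA that does publish quantal packages, or from the Software Centre if it is available there.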

    Read the article

  • Curious about the cached old domain

    - by jogesh_p
    I am a bit curious about my new domain. I had a domain before, let's say http://example.com. Before that domain expired I bought a new one, http://another-domain.com, and uploaded all of my content to the second domain. Now when I search Google for a query related to another-domain.com, I also find my old domain, http://example.com. Does this cause a duplicate content problem for http://another-domain.com, or any kind of penalty from Google?
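
    If the old domain still resolves to a server you control, the usual way to consolidate the two in search engines is a site-wide 301 redirect from the old host to the new one. A hedged .htaccess sketch, assuming Apache with mod_rewrite and the example hostnames from the question:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
        RewriteRule ^(.*)$ http://another-domain.com/$1 [R=301,L]

    If the old domain has simply expired and is out of your control, the stale results should drop out of the index on their own as Google recrawls it.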

    Read the article

  • Visual Studio "14" CTP

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2014/06/04/visual-studio-quot14quot-ctp.aspx. Somasegar has blogged about the Visual Studio "14" CTP at http://blogs.msdn.com/b/somasegar/. As a CTP, it should be installed on a dedicated PC with no other version of Visual Studio installed. There are important updates for both C# (see http://blogs.msdn.com/b/csharpfaq/archive/2014/06/03/visual-studio-14-ctp-now-available.aspx) and VB.NET (see http://blogs.msdn.com/b/vbteam/archive/2014/06/03/visual-studio-14-ctp-now-available.aspx).

    Read the article

  • IE9 Released - Download Now

    - by kennedysteve
    The final release of IE9 has "hit the stores" (just download it free from http://www.beautyoftheweb.com/). Enjoy new features like pinning, jump lists, and Aero Snap. Of course, IE9 is also focused on standards, so you can now get the best of HTML5 in IE9. By the way, if you're still using IE6, you might want to check out http://www.theie6countdown.com/ and http://www.ie6death.com/.

    Read the article

  • Restrict access to apache2 web root but allow it to subfolders

    - by razor7
    I need to restrict access by password to the web root of my Apache test server (i.e. http://localhost) but allow access to subfolders (e.g. http://localhost/testsite). I created the .htpasswd and .htaccess files and put the .htaccess in the web root (http://localhost), so when trying to access the web root it asks for a user and password, but it also does so in subfolders (e.g. when trying to access http://localhost/testsite). I want to be asked for a password on the web root, but not on subfolders. Is that possible?
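
    One common pattern for this, assuming Apache 2.2-style access control with AllowOverride enabled, is to keep the password rules in the web root's .htaccess and then opt each public subfolder back out with its own .htaccess (every subfolder that should stay open needs one):

        # /var/www/.htaccess - password-protect the root
        AuthType Basic
        AuthName "Restricted"
        AuthUserFile /etc/apache2/.htpasswd
        Require valid-user

        # /var/www/testsite/.htaccess - leave this subfolder open
        Satisfy Any
        Allow from all

    The paths and realm name here are assumptions; on Apache 2.4 the equivalent override in the subfolder would be "Require all granted".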

    Read the article

  • Cannot install Cinnamon 2.xx

    - by user207587
        Fetched 841 kB in 28s (29.3 kB/s)
        W: GPG error: http://packages.mate-desktop.org raring Release: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY 68980A0EA10B4DE8
        W: Failed to fetch http://ppa.launchpad.net/pmcenery/ppa/ubuntu/dists/raring/main/binary-i386/Packages 404 Not Found
        E: Some index files failed to download. They have been ignored, or old ones used instead.
    How do I fix that? How do I add the PUBKEY to get the files needed to complete the install of Cinnamon?
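
    Those look like two separate problems: a missing signing key for the MATE repository and a PPA that has no packages built for raring. A hedged sketch of the usual fixes, using the key ID from the warning above:

        # import the missing public key for packages.mate-desktop.org
        sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 68980A0EA10B4DE8
        # drop the PPA that returns 404 for raring, then refresh the lists
        sudo add-apt-repository --remove ppa:pmcenery/ppa
        sudo apt-get update

    Removing the pmcenery PPA assumes nothing you still need comes from it; if something does, that PPA would have to be replaced with a source that actually publishes raring packages.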

    Read the article

  • Same product name in different categories

    - by Shawn
    I have two questions about URL structure for SEO. I have one product that belongs to multiple categories. Is this OK for SEO, or should I give them different product names? http://example.com/shirts/denim/poloshirt.html http://example.com/shirts/pinkcolor/poloshirt.html Also, is it good to put a numeric product code in the URL, like this: http://example.com/shirts/denim/shirt-st397.html http://example.com/shirts/denim/shirt-sw160.html Will Google regard these as one product or as different products?
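
    When the same product legitimately lives under several category paths, one widely used option is to pick a single preferred URL and point the duplicates at it with rel="canonical", so search engines treat them as one page. A sketch using the example URLs from the question (which URL to prefer is your choice):

        <link rel="canonical" href="http://example.com/shirts/denim/poloshirt.html" />

    The numeric code in the file name is unlikely to matter much either way; a consistent URL plus a canonical tag matters more than whether "st397" appears in the path.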

    Read the article

  • New Version of Sandcastle - 2.7.0.0

    - by TATWORTH
    There is a new version of Sandcastle at http://sandcastlestyles.codeplex.com/. This is a free tool for producing help files from the XML documentation within C# or VB.NET code. It is a guided installer that prompts you to install the correct components in the correct order. The reason why the download is at http://sandcastlestyles.codeplex.com/ instead of http://sandcastle.codeplex.com/releases/view/47665 is discussed at http://sandcastle.codeplex.com/discussions/288079

    Read the article

  • Website directory structure regarding subdomains, www, and "global" content

    - by Pawnguy7
    I am trying to make a homemade HTTP server. It occurs to me, though, that I never fully understood what you might call "relativity" among web pages.

    I have come across the fact that www. is a subdomain, and I understand its original purpose. It sounds like, in general, you would redirect it (is that a 301 or a 302?) to the non-subdomain form, as in redirecting www.example.com to example.com. I am not entirely sure how to make this work when retrieving files for an HTTP server, though. I would assume that example.com would be the root and that www manifests as a folder within it, but I am unsure.

    There is also the question of multi-level subdomains, e.g. subdomain2.subdomain1.example.com. It seems to me they are structured "backwards", where you go leftwards from the root in the folder structure. In this situation, subdomain2 is a directory within subdomain1, which is a directory in the root.

    Finally, it occurs to me I might want a sort of global location. For example, maybe all subdomains still use one image as a logo; it makes more sense to me that there is one image rather than each subdomain having its own copy. In the same way, albeit more doubtfully, you might have global CSS (though that is a bit contrary to the idea of a subdomain in the first place), or a JavaScript file that is commonly used (more efficient than each having its own copy, and better for organization). Finally, maybe you have a global 404 page. I think this might be the case where you have user-created subdomains (say bloggername.example.com), where example.com still has a default 404 when either the subdomain does not exist or the page does not exist under a valid blogger.

    I am confused about what the directory structure for this should be. To summarize: should there be global files outside any subdirectory, and how; how should www. be handled (or a non-www or other subdomain); and what is the pattern for root/subdomain, as well as subdomains within subdomains (order-wise)? Sorry this is multiple questions, but I feel at the root they are all related to the directory structure.
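
    For what it's worth, on most web servers none of this is implied by the folder tree: each hostname is bound to a document root explicitly, and shared assets can be aliased into every site from one place. A hedged Apache-style sketch (hostnames and paths are placeholders, not a recommendation of any particular layout):

        <VirtualHost *:80>
            ServerName www.example.com
            # 301 the www form to the bare domain
            Redirect permanent / http://example.com/
        </VirtualHost>

        <VirtualHost *:80>
            ServerName example.com
            DocumentRoot /srv/www/example.com/root
            # shared logo/CSS/JS served from one directory
            Alias /assets /srv/www/example.com/shared
            ErrorDocument 404 /404.html
        </VirtualHost>

        <VirtualHost *:80>
            ServerName subdomain2.subdomain1.example.com
            DocumentRoot /srv/www/example.com/subdomain1/subdomain2
            Alias /assets /srv/www/example.com/shared
        </VirtualHost>

    Whether subdomain2 physically nests inside subdomain1's directory is purely a housekeeping choice; the server only cares about the ServerName-to-DocumentRoot mapping.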

    Read the article

  • Where can I obtain a first-generation copy of the classic “Sorting out Sorting”?

    - by TML
    I cannot even figure out who made it - even the IMDB page is mostly blank, and Wikipedia does not seem to have any information about it. For such a useful film in CompSci, it strikes me as odd that the only meaningful Internet presence I can find is horrible-quality copies or excerpts on YouTube. [Note: SO claims this question was migrated to here, but the URL they provide gives a 404, and I can't find it by searching Programmers.SE, so I'm re-asking...]

    Read the article

  • What's special about July 26th, and why is it used so often in examples for the Expires header?

    - by zerkms
    I've noticed that July 26th (my birthday) is used really often in various examples related to preventing HTTP caching with the Expires header, like: http://stackoverflow.com/questions/12398714/cache-issue-with-private-networking-stream http://stackoverflow.com/questions/2833305/how-to-expire-page-in-php-when-user-logout http://expressionengine.com/archived_forums/viewthread/81945/ What's special about that date? PS: I couldn't add a conspiracy tag because of a lack of rep points.
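
    For context, the date in most of those snippets appears to trace back to the long-standing anti-caching example in the PHP manual, which just needs any date that is safely in the past; the block that gets copied around typically looks something like this (reproduced here as plain HTTP headers, not taken from any one of the linked posts):

        Expires: Mon, 26 Jul 1997 05:00:00 GMT
        Cache-Control: no-store, no-cache, must-revalidate
        Pragma: no-cache

    So the specific day is almost certainly an accident of copy-and-paste rather than anything meaningful.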

    Read the article

  • Subdomains on WampServer

    - by MohamedKadri
    I'm working on WampServer for development. I've set up the domain tuniguide.local and it works fine with this configuration:
        DocumentRoot "D:\www\tuniguide"
        ServerName tuniguide.local
    But when I wanted to add a subdomain fr.tuniguide.local, I get a 404 Not Found with this configuration:
        DocumentRoot "D:\www\tuniguide\fr"
        ServerName fr.tuniguide.local
    It gives me this message: The requested URL /www/tuniguide/index.php was not found on this server. Is there something that I missed? Thanks.
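
    That error message suggests the request is being handled by the default configuration rather than by the fr vhost, which on WampServer usually comes down to an incomplete virtual-host block or a missing hosts-file entry. A hedged sketch of what the pair of vhosts often looks like (the Directory block and the NameVirtualHost line are assumptions for an Apache 2.2-era WAMP; fr.tuniguide.local also needs its own 127.0.0.1 line in C:\Windows\System32\drivers\etc\hosts):

        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerName tuniguide.local
            DocumentRoot "D:/www/tuniguide"
        </VirtualHost>

        <VirtualHost *:80>
            ServerName fr.tuniguide.local
            DocumentRoot "D:/www/tuniguide/fr"
            <Directory "D:/www/tuniguide/fr">
                Order Allow,Deny
                Allow from all
            </Directory>
        </VirtualHost>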

    Read the article

  • Why does my sub-domain redirect return a blank page?

    - by Tom Brito
    I have the domain http://dropbox.tombrito.com/ (on GoDaddy) forwarding with masking to www.dropbox.com/sh/k6ypvx4y4kf0gu6/rdjxQ1b1OL. It was working fine some time ago, but now the result is a blank page (although the Dropbox favicon appears correctly in the browser's tab title). The DNS manager shows me a single entry with the name "dropbox": A dropbox 64.202.189.170. Any idea what's wrong? Related: "Why my domain redirect on Google Apps is returning 404?"

    Read the article

  • How to run WordPress and a Java web app on Tomcat on the same server?

    - by Chantz
    I have to run a WordPress site served via Apache2 and a Java-based webapp using Tomcat on the same server. When users come to example.com or example.com/public-pages they need to be served by WordPress, but when they come to example.com/private-pages they need to be served by Tomcat. I asked this question on Server Fault, where they suggested using a different port, a different IP, or a sub-domain. I want to go for the different-port solution, since it means I need to buy only one SSL certificate. I tried the reverse proxy method by having the following in my default-ssl.conf:

        <VirtualHost _default_:443>
            ServerAdmin webmaster@localhost
            ServerName localhost:443
            DocumentRoot /var/www
            <Directory /var/www>
                # For Wordpress
                Options FollowSymLinks
                AllowOverride All
            </Directory>
            <Proxy *>
                Order deny,allow
                Allow from all
            </Proxy>
            ProxyRequests Off
            ProxyPass /private-pages ajp://localhost:8009/
            ProxyPassReverse /private-pages ajp://localhost:8009/
            SSLEngine on
            SSLProxyEngine On
            SSLCertificateFile /etc/apache2/ssl/apache.crt
            SSLCertificateKeyFile /etc/apache2/ssl/apache.key
        </VirtualHost>

    As you have noticed, I am using mod_proxy_ajp in Apache2 for this, and my Tomcat is listening on port 8009 and serving content. So now when I go to example.com/private-pages I see the content from my Tomcat, but two issues are happening:

    1. All my static resources get a 404, so none of my images, CSS, or JS are loaded. I see that the browser requests the resources using URLs like example.com/css/*. This will clearly not work, because it translates to example.com:80/css/* instead of example.com:8009/css/*, and there are no such resources in the WordPress directory.

    2. If I go to example.com/private-pages/abcd I am somehow kicked back to the WordPress site (which obviously displays a 404 page).

    I can understand why #1 is happening, but have no clue why #2 is happening. Regardless, if there is another clean solution for resolving this, I would appreciate y'all's help.
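
    Both symptoms are consistent with the Tomcat context path being dropped: the webapp is proxied from the ROOT context, so the pages it generates link to /css/..., /private-pages/abcd has no meaning on the Tomcat side, and anything that does not match /private-pages exactly falls through to WordPress. One hedged option, assuming the webapp can be deployed under a context actually named private-pages, is to keep the public path and the Tomcat context identical so the links it emits survive the proxy unchanged:

        # webapp deployed as private-pages.war (context /private-pages) on Tomcat
        ProxyPass        /private-pages ajp://localhost:8009/private-pages
        ProxyPassReverse /private-pages ajp://localhost:8009/private-pages

    The alternative is to keep the ROOT deployment and add ProxyPass lines for every static prefix the app uses (/css, /js, /images, ...), which tends to be more fragile.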

    Read the article

  • 139 updates won't update

    - by amnesia
    I can't seem to update the 139 files I need to. It says I am not connected to the internet, yet I am typing this to you on the same computer. I have deleted most of the errors, as they show the same IP all the time and I am not sure whether that is mine or not. Any ideas would be greatly appreciated.
        Failed to fetch http://au.archive.ubuntu.com/ubuntu/pool/main/i/icedtea-web/icedtea-netx-common_1.2-2ubuntu1.2_all.deb 404 Not Found
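
    A 404 on a single .deb usually means the local package lists are stale and the mirror has since replaced that exact version, rather than a real connectivity problem. A minimal sketch of the usual first step, assuming the network itself is fine:

        sudo apt-get update
        sudo apt-get dist-upgrade

    If au.archive.ubuntu.com itself keeps misbehaving, switching to the main server or another mirror in Software Sources is a reasonable next step.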

    Read the article

  • Backlinks to non-existent pages

    - by Michal
    I've bought a domain which was previously used by somebody else back in 2007. Now I've realized that the internet is full of backlinks that point to non-existent pages under my domain (for example mypage.com/whatever, where whatever is not present on my website, so a 404 error shows). I want to ask: are these links counted by Google (for PageRank) and other search engines or not? And do I have to redirect these links to existing pages in order for them to be counted?
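
    Links that land on a 404 generally pass little or no value, so for any old URLs that still attract visitors or links worth keeping, the usual move is a 301 redirect to the closest live page. A hedged .htaccess sketch, reusing the example path from the question:

        # send a known old URL to its nearest current equivalent
        Redirect 301 /whatever http://mypage.com/

    Redirecting everything blindly to the home page is usually not worth it; picking out the handful of old URLs with real backlinks is enough.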

    Read the article

  • Does Googlebot (and/or search engines) index a forwarded page? [duplicate]

    - by user2889419
    This question already has an answer here: HTTP and HTTPS impacts on SEO (1 answer). Let's say I have the domain example.com, and I force the user to use HTTPS over HTTP. Since browsers just accept and load the forwarded/new page (when a request for http://example.com is redirected to https://example.com), does Googlebot (or another search engine) accept the forwarded page, index the new page, and just ignore the old one? In other words, do search engines accept HTTPS besides HTTP?
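
    Crawlers follow redirects much as browsers do, and a permanent (301) redirect is the usual signal that the HTTPS URL is the one to keep in the index. A hedged Apache sketch of forcing HTTPS that way, using example.com as in the question:

        <VirtualHost *:80>
            ServerName example.com
            Redirect permanent / https://example.com/
        </VirtualHost>

    With a 301 in place (and ideally a canonical tag pointing at the HTTPS URL), the HTTP version is normally dropped in favour of the HTTPS one rather than indexed alongside it.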

    Read the article

  • Install perl module with dependencies

    - by AlxAlx
    I'm trying to install a Perl module like this:
        pi@raspbmc:~$ sudo cpan HTTP::Date
    cpan gets the file, the checksum is OK, and it uncompresses successfully, but then I get this error:
        Using Tar:/bin/tar xf "HTTP-Date-6.02.tar": Couldn't untar HTTP-Date-6.02.tar: 'Cannot allocate memory'
    Any ideas? My filesystem:
        Filesystem      Size  Used Avail Use% Mounted on
        /dev/mmcblk0p2   15G  2.1G   12G  16% /
        /dev/mmcblk0p1   69M  8.1M   61M  12% /boot
    EDIT: I tried
        curl -L http://cpanmin.us | perl - App::cpanminus
    but when I do sudo cpanm HTTP::Date I get this error:
        -bash: cpanm: command not found
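
    The 'Cannot allocate memory' failure points at RAM/swap pressure on the Pi while cpan unpacks the tarball, rather than at the module itself. A hedged workaround is to add a small swap file before retrying (the size and path here are arbitrary choices):

        sudo dd if=/dev/zero of=/swapfile bs=1M count=256
        sudo mkswap /swapfile
        sudo swapon /swapfile
        sudo cpan HTTP::Date

    The later 'cpanm: command not found' is a separate issue: the cpanminus bootstrap likely installed cpanm somewhere that is not on root's PATH, so it may need to be invoked by its full path or reinstalled system-wide.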

    Read the article

  • Could not bind socket when restarting HAProxy

    - by shreyas
    I'm restarting HAProxy with the following command:
        haproxy -f /etc/haproxy/haproxy.cfg -p /var/run/haproxy.pid -sf $(cat /var/run/haproxy.pid)
    but I get the following messages:
        [ALERT] 183/225022 (9278) : Starting proxy appli1-rewrite: cannot bind socket
        [ALERT] 183/225022 (9278) : Starting proxy appli2-insert: cannot bind socket
        [ALERT] 183/225022 (9278) : Starting proxy appli3-relais: cannot bind socket
        [ALERT] 183/225022 (9278) : Starting proxy appli4-backup: cannot bind socket
        [ALERT] 183/225022 (9278) : Starting proxy ssl-relay: cannot bind socket
        [ALERT] 183/225022 (9278) : Starting proxy appli5-backup: cannot bind socket
    My haproxy.cfg file looks like the following:
        global
            log 127.0.0.1 local0
            log 127.0.0.1 local1 notice
            #log loghost local0 info
            maxconn 4096
            #chroot /usr/share/haproxy
            user haproxy
            group haproxy
            daemon
            #debug
            #quiet
        defaults
            log global
            mode http
            option httplog
            option dontlognull
            retries 3
            option redispatch
            maxconn 2000
            contimeout 5000
            clitimeout 50000
            srvtimeout 50000
        listen appli1-rewrite 0.0.0.0:10001
            cookie SERVERID rewrite
            balance roundrobin
            server app1_1 192.168.34.23:8080 cookie app1inst1 check inter 2000 rise 2 fall 5
            server app1_2 192.168.34.32:8080 cookie app1inst2 check inter 2000 rise 2 fall 5
            server app1_3 192.168.34.27:8080 cookie app1inst3 check inter 2000 rise 2 fall 5
            server app1_4 192.168.34.42:8080 cookie app1inst4 check inter 2000 rise 2 fall 5
        listen appli2-insert 0.0.0.0:10002
            option httpchk
            balance roundrobin
            cookie SERVERID insert indirect nocache
            server inst1 192.168.114.56:80 cookie server01 check inter 2000 fall 3
            server inst2 192.168.114.56:81 cookie server02 check inter 2000 fall 3
            capture cookie vgnvisitor= len 32
            option httpclose # disable keep-alive
            rspidel ^Set-cookie:\ IP= # do not let this cookie tell our internal IP address
        listen appli3-relais 0.0.0.0:10003
            dispatch 192.168.135.17:80
        listen appli4-backup 0.0.0.0:10004
            option httpchk /index.html
            option persist
            balance roundrobin
            server inst1 192.168.114.56:80 check inter 2000 fall 3
            server inst2 192.168.114.56:81 check inter 2000 fall 3 backup
        listen ssl-relay 0.0.0.0:8443
            option ssl-hello-chk
            balance source
            server inst1 192.168.110.56:443 check inter 2000 fall 3
            server inst2 192.168.110.57:443 check inter 2000 fall 3
            server back1 192.168.120.58:443 backup
        listen appli5-backup 0.0.0.0:10005
            option httpchk *
            balance roundrobin
            cookie SERVERID insert indirect nocache
            server inst1 192.168.114.56:80 cookie server01 check inter 2000 fall 3
            server inst2 192.168.114.56:81 cookie server02 check inter 2000 fall 3
            server inst3 192.168.114.57:80 backup check inter 2000 fall 3
            capture cookie ASPSESSION len 32
            srvtimeout 20000
            option httpclose # disable keep-alive
            option checkcache # block response if set-cookie & cacheable
            rspidel ^Set-cookie:\ IP= # do not let this cookie tell our internal IP address
            #errorloc 502 http://192.168.114.58/error502.html
            #errorfile 503 /etc/haproxy/errors/503.http
            errorfile 400 /etc/haproxy/errors/400.http
            errorfile 403 /etc/haproxy/errors/403.http
            errorfile 408 /etc/haproxy/errors/408.http
            errorfile 500 /etc/haproxy/errors/500.http
            errorfile 502 /etc/haproxy/errors/502.http
            errorfile 503 /etc/haproxy/errors/503.http
            errorfile 504 /etc/haproxy/errors/504.http
    What is wrong with my approach?
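
    'cannot bind socket' on every frontend usually means something is still listening on those ports, often a previous HAProxy instance that never received the -sf signal (for example because the pid file was stale or empty). A hedged way to check before digging into the config:

        # see which process currently owns the listen ports
        sudo netstat -tlnp | grep -E ':(10001|10002|10003|10004|10005|8443) '
        # if an orphaned haproxy shows up, pass its real PID to -sf
        haproxy -f /etc/haproxy/haproxy.cfg -p /var/run/haproxy.pid -sf <old_pid>

    The config itself looks like the stock example, so the bind failures are more likely environmental than a syntax problem.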

    Read the article

  • Apache/JBoss issue - is this a connection timeout?

    - by user115391
    We have an application with the following architecture: one load balancer (Apache), which forwards to two app servers (JBoss). The site is working and I am able to access it fine, but sometimes, randomly, the homepage takes a while (like 30-40 seconds) to load. I tried checking the logs but could not figure out why. I used the HTTP traffic analyzer Fiddler to look at the traffic, but it just says the request/response took 30 seconds or so. I checked the Apache access logs and mod_jk.log. My configurations are below.

    mod-jk.conf:
        LoadModule jk_module modules/mod_jk.so
        JkWorkersFile conf/workers.properties
        JkLogFile logs/mod_jk.log
        #JkLogLevel info
        #JkLogLevel debug
        JkLogLevel error
        # Select the log format
        JkLogStampFormat "[%a %b %d %H:%M:%S %Y]"
        JkOptions +ForwardKeySize +ForwardURICompatUnparsed -ForwardDirectories
        JkRequestLogFormat "%w %V %T %P %{tid}P %D"
        JkMount /__application__/* loadbalancer
        JkUnMount /__application__/images/* loadbalancer
        <VirtualHost *:8080 >
            JkMountFile conf/uriworkermap.properties
        </VirtualHost>
        JkShmFile run/jk.shm
        <Location /jkstatus>
            JkMount status
            Order deny,allow
            Deny from all
            Allow from 127.0.0.1
        </Location>

    uriworkermap.properties:
        # Simple worker configuration file
        # Mount the Servlet context to the ajp13 worker
        /=loadbalancer
        /*=loadbalancer

    workers.properties:
        worker.list=loadbalancer,status
        worker.template.port=8009
        worker.template.type=ajp13
        worker.template.lbfactor=1
        worker.template.prepost_timeout=10000
        worker.template.connect_timeout=10000
        worker.template.ping_mode=A
        worker.worker1.reference=worker.template
        worker.worker1.host=hostname1
        worker.worker2.reference=worker.template
        worker.worker2.host=hostname2
        worker.loadbalancer.type=lb
        worker.loadbalancer.balance_workers=worker1,worker2
        worker.status.type=status

    My JBoss server.xml is at $JBOSS_HOME/server/default/deploy/jbossweb.sar/server.xml.

    The entries from the access log are below. The request where it took time (look at the seconds column):
        [23/Mar/2012:12:10:38 -0400] "GET / HTTP/1.1" 200 138
        x.x.x.x - - [23/Mar/2012:12:10:49 -0400] "GET /index.jsp HTTP/1.1" 302 -
        x.x.x.x - - [23/Mar/2012:12:11:10 -0400] "GET /home.jsp HTTP/1.1" 200 936
        x.x.x.x - - [23/Mar/2012:12:11:31 -0400] "POST /login/ HTTP/1.1" 200 8895
        x.x.x.x - - [23/Mar/2012:12:11:52 -0400] "GET /login/includes/login-style.css HTTP/1.1" 304 -
    The one after the issue:
        x.x.x.x - - [23/Mar/2012:12:12:18 -0400] "GET / HTTP/1.1" 200 138
        x.x.x.x - - [23/Mar/2012:12:12:18 -0400] "GET /index.jsp HTTP/1.1" 302 -
        x.x.x.x - - [23/Mar/2012:12:12:18 -0400] "GET /home.jsp HTTP/1.1" 200 936
        x.x.x.x - - [23/Mar/2012:12:12:18 -0400] "POST /login/ HTTP/1.1" 200 8895
        x.x.x.x - - [23/Mar/2012:12:12:18 -0400] "GET /login/includes/login-style.css HTTP/1.1" 304 -
    Could it be a cache or timeout issue? Any help is appreciated. Thanks.
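
    Intermittent 30-40 second stalls on an otherwise healthy AJP setup are often a connection-reuse problem: a firewall or the JBoss side silently drops idle AJP connections, and mod_jk only notices when it tries to reuse a dead one. Hedged additions to workers.properties that are commonly tried for this (the values are guesses to be tuned, and the AJP connector's connectionTimeout on the JBoss side should be kept in line with them):

        worker.template.socket_keepalive=true
        worker.template.connection_pool_timeout=600
        worker.template.reply_timeout=30000

    If the stalls persist, temporarily enabling JkLogLevel debug and correlating the %D (duration) field already present in JkRequestLogFormat with the slow requests should show whether the time is lost in Apache, in mod_jk, or inside JBoss.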

    Read the article
