Search Results

Search found 128199 results on 5128 pages for 'http status code 404'.

  • Update Manager unable to get updates

    - by dPEN
    In last few days my Ubuntu 11.10 update manager is unable to get new updates. When I checked update log I saw that for couple of updates it says "Network isn't available". For other updates it downloaded logs and and internet connection also works fine. Unable to attached screenshot due to SPAM prevention policy. But for below Release gpgv:/var/lib/apt/lists/partial/extras.ubuntu.com_ubuntu_dists_oneiric_Release.gpg it says "Network isn't available" For all other Releases it is downloading fine. And due to this I dont see any update available in last 10 days. LOG OF sudo apt-get update: dipen@EIDLCPU1018:~$ sudo apt-get update [sudo] password for dipen: Ign http:/extras.ubuntu.com oneiric InRelease Ign http:/archive.canonical.com oneiric InRelease Ign http:/archive.canonical.com lucid InRelease Get:1 http:/extras.ubuntu.com oneiric Release.gpg [72 B] Get:2 http:/archive.canonical.com oneiric Release.gpg [198 B] Hit http:/extras.ubuntu.com oneiric Release Get:3 http:/archive.canonical.com lucid Release.gpg [198 B] Err http:/extras.ubuntu.com oneiric Release Hit http:/archive.canonical.com oneiric Release Ign http:/archive.canonical.com oneiric Release Hit http:/archive.canonical.com lucid Release Ign http:/archive.canonical.com lucid Release Ign http:/archive.canonical.com oneiric/partner i386 Packages/DiffIndex Ign http:/archive.canonical.com oneiric/partner TranslationIndex Ign http:/archive.canonical.com lucid/partner i386 Packages/DiffIndex Ign http:/archive.canonical.com lucid/partner TranslationIndex Hit http:/archive.canonical.com oneiric/partner i386 Packages Hit http:/archive.canonical.com lucid/partner i386 Packages Ign http:/dl.google.com stable InRelease Ign http:/archive.canonical.com oneiric/partner Translation-en_IN Ign http:/archive.canonical.com oneiric/partner Translation-en Ign http:/archive.canonical.com lucid/partner Translation-en_IN Ign http:/archive.canonical.com lucid/partner Translation-en Ign http:/in.archive.ubuntu.com oneiric InRelease Ign http:/in.archive.ubuntu.com oneiric-updates InRelease Ign http:/in.archive.ubuntu.com oneiric-security InRelease Get:4 http//dl.google.com stable Release.gpg [198 B] Get:5 http//in.archive.ubuntu.com oneiric Release.gpg [198 B] Get:6 http//in.archive.ubuntu.com oneiric-updates Release.gpg [198 B] Get:7 http//in.archive.ubuntu.com oneiric-security Release.gpg [198 B] Hit http:/in.archive.ubuntu.com oneiric Release Ign http:/in.archive.ubuntu.com oneiric Release Hit http:/in.archive.ubuntu.com oneiric-updates Release Err http:/in.archive.ubuntu.com oneiric-updates Release Hit http:/in.archive.ubuntu.com oneiric-security Release Ign http:/in.archive.ubuntu.com oneiric-security Release Ign http:/in.archive.ubuntu.com oneiric/main i386 Packages/DiffIndex Ign http:/in.archive.ubuntu.com oneiric/restricted i386 Packages/DiffIndex Ign http:/in.archive.ubuntu.com oneiric/universe i386 Packages/DiffIndex Ign http:/in.archive.ubuntu.com oneiric/multiverse i386 Packages/DiffIndex Hit http:/in.archive.ubuntu.com oneiric/main TranslationIndex Hit http:/in.archive.ubuntu.com oneiric/multiverse TranslationIndex Hit http:/in.archive.ubuntu.com oneiric/restricted TranslationIndex Hit http:/in.archive.ubuntu.com oneiric/universe TranslationIndex Ign http:/in.archive.ubuntu.com oneiric-security/main i386 Packages/DiffIndex Ign http:/in.archive.ubuntu.com oneiric-security/restricted i386 Packages/DiffIndex Ign http:/in.archive.ubuntu.com oneiric-security/universe i386 Packages/DiffIndex Ign http:/in.archive.ubuntu.com 
oneiric-security/multiverse i386 Packages/DiffIndex Hit http:/in.archive.ubuntu.com oneiric-security/main TranslationIndex Hit http:/in.archive.ubuntu.com oneiric-security/multiverse TranslationIndex Hit http:/in.archive.ubuntu.com oneiric-security/restricted TranslationIndex Hit http:/in.archive.ubuntu.com oneiric-security/universe TranslationIndex Hit http:/in.archive.ubuntu.com oneiric/main i386 Packages Hit http:/in.archive.ubuntu.com oneiric/restricted i386 Packages Hit http:/in.archive.ubuntu.com oneiric/universe i386 Packages Hit http:/in.archive.ubuntu.com oneiric/multiverse i386 Packages Hit http:/in.archive.ubuntu.com oneiric/main Translation-en Hit http:/in.archive.ubuntu.com oneiric/multiverse Translation-en Hit http:/in.archive.ubuntu.com oneiric/restricted Translation-en Hit http:/in.archive.ubuntu.com oneiric/universe Translation-en Hit http:/in.archive.ubuntu.com oneiric-security/main i386 Packages Hit http:/in.archive.ubuntu.com oneiric-security/restricted i386 Packages Hit http:/in.archive.ubuntu.com oneiric-security/universe i386 Packages Hit http:/in.archive.ubuntu.com oneiric-security/multiverse i386 Packages Hit http:/in.archive.ubuntu.com oneiric-security/main Translation-en Hit http:/in.archive.ubuntu.com oneiric-security/multiverse Translation-en Get:8 http//dl.google.com stable Release [1,347 B] Hit http:/in.archive.ubuntu.com oneiric-security/restricted Translation-en Hit http:/in.archive.ubuntu.com oneiric-security/universe Translation-en Get:9 http//dl.google.com stable/main i386 Packages [1,214 B] Ign http:/dl.google.com stable/main TranslationIndex Ign http:/dl.google.com stable/main Translation-en_IN Ign http:/dl.google.com stable/main Translation-en Fetched 3,821 B in 41s (91 B/s) Reading package lists... Done W: A error occurred during the signature verification. The repository is not updated and the previous index files will be used. GPG error: http:/extras.ubuntu.com oneiric Release: The following signatures were invalid: BADSIG 16126D3A3E5C1192 Ubuntu Extras Archive Automatic Signing Key <[email protected]> W: GPG error: http:/archive.canonical.com oneiric Release: The following signatures were invalid: BADSIG 40976EAF437D05B5 Ubuntu Archive Automatic Signing Key <[email protected]> W: GPG error: http:/archive.canonical.com lucid Release: The following signatures were invalid: BADSIG 40976EAF437D05B5 Ubuntu Archive Automatic Signing Key <[email protected]> W: GPG error: http:/in.archive.ubuntu.com oneiric Release: The following signatures were invalid: BADSIG 40976EAF437D05B5 Ubuntu Archive Automatic Signing Key <[email protected]> W: A error occurred during the signature verification. The repository is not updated and the previous index files will be used. GPG error: http:/in.archive.ubuntu.com oneiric-updates Release: The following signatures were invalid: BADSIG 40976EAF437D05B5 Ubuntu Archive Automatic Signing Key <[email protected]> W: GPG error: http:/in.archive.ubuntu.com oneiric-security Release: The following signatures were invalid: BADSIG 40976EAF437D05B5 Ubuntu Archive Automatic Signing Key <[email protected]> W: Failed to fetch http:/extras.ubuntu.com/ubuntu/dists/oneiric/Release W: Failed to fetch http:/in.archive.ubuntu.com/ubuntu/dists/oneiric-updates/Release W: Some index files failed to download. They have been ignored, or old ones used instead. dipen@EIDLCPU1018:~$ LOG of sudo apt-get upgrade: dipen@EIDLCPU1018:~$ sudo apt-get upgrade Reading package lists... Done Building dependency tree Reading state information... 
Done The following packages have been kept back: ghc6-doc haskell-zlib-doc libghc6-zlib-doc 0 upgraded, 0 newly installed, 0 to remove and 3 not upgraded. dipen@EIDLCPU1018:~$ /etc/apt/sources.list: deb http://in.archive.ubuntu.com/ubuntu/ oneiric main restricted deb http://in.archive.ubuntu.com/ubuntu/ oneiric-updates main restricted deb http://in.archive.ubuntu.com/ubuntu/ oneiric universe deb http://in.archive.ubuntu.com/ubuntu/ oneiric-updates universe deb http://in.archive.ubuntu.com/ubuntu/ oneiric multiverse deb http://in.archive.ubuntu.com/ubuntu/ oneiric-updates multiverse deb http://archive.canonical.com/ubuntu oneiric partner deb http://in.archive.ubuntu.com/ubuntu/ oneiric-security main restricted deb http://in.archive.ubuntu.com/ubuntu/ oneiric-security universe deb http://in.archive.ubuntu.com/ubuntu/ oneiric-security multiverse deb http://extras.ubuntu.com/ubuntu oneiric main #Third party developers repository deb http://archive.canonical.com/ lucid partner
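    BADSIG failures like the ones in this log are usually caused by corrupted cached index files rather than by a real network problem. A minimal recovery sketch in shell (an assumption, not a confirmed fix for this machine: it presumes the mirrors themselves are healthy and that moving the cached lists aside is acceptable):

        # Move the cached package lists out of the way so apt has to fetch fresh copies,
        # then clear the archive cache and refresh the indexes.
        sudo mv /var/lib/apt/lists /var/lib/apt/lists.bak
        sudo mkdir -p /var/lib/apt/lists/partial
        sudo apt-get clean
        sudo apt-get update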

    Read the article

  • Is code maintenance typically a special project, or is it considered part of daily work?

    - by blueberryfields
    Earlier, I asked to find out which tools are commonly used to monitor methods and code bases, to find out whether the methods have been getting too long. Most of the responses there suggested that, beyond maintenance on the method currently being edited, programmers don't, in general, keep an eye on the rest of the code base. So I thought I'd ask the question in general: Is code maintenance, in general, considered part of your daily work? Do you find that you're spending at least some of your time cleaning up, refactoring, rewriting code in the code base, to improve it, as part of your other assigned work? Is it expected of you/do you expect it of your teammates? Or is it more common to find that cleanup, refactoring, and general maintenance on the codebase as a whole, occurs in bursts (for example, mostly as part of code reviews, or as part of refactoring/cleaning up projects)?

    Read the article

  • Need assistance with Kohana 3 and catch all route turning into a 404 error

    - by alex
    Based on this documentation, I've implemented a catch all route which routes to an error page. Here is the last route in my bootstrap.php Route::set('default', '<path>', array('path' => '.+')) ->defaults(array( 'controller' => 'errors', 'action' => '404', )); However I keep getting this exception thrown when I try and go to a non existent page Kohana_Exception [ 0 ]: Required route parameter not passed: path If I make the <path> segment optional (i.e. wrap it in parenthesis) then it just seems to load the home route, which is... Route::set('home', '') ->defaults(array( 'controller' => 'home', 'action' => 'index', )); The home route is defined first. I execute my main request like so $request = Request::instance(); try { // Attempt to execute the response $request->execute(); } catch (Exception $e) { if (Kohana::$environment === Kohana::DEVELOPMENT) throw $e; // Log the error Kohana::$log->add(Kohana::ERROR, Kohana::exception_text($e)); // Create a 404 response $request->status = 404; $request->response = Request::factory(Route::get('default')->uri())->execute(); } $request->send_headers(); echo $request->response; This means that the 404 header is sent to the browser, but I assumed by sending the request to the capture all route then it should show the 404 error set up in my errors controller. <?php defined('SYSPATH') or die('No direct script access.'); class Controller_Errors extends Controller_Base { public function before() { parent::before(); } public function action_404() { $this->bodyClass[] = '404'; $this->internalView = View::factory('internal/not_found'); $longTitle = 'Page Not Found'; $this->titlePrefix = $longTitle; } } Why won't it show my 404 error page?

    Read the article

  • Handling site not found and page not found with dynamic mass virtual hosting

    - by Rick Moynihan
    I have recently set up mass virtual hosting in Apache so that all we need to do to create a new vhost is create a directory. We're also using wildcard DNS to map all subdomains to the server running our Apache instance. This works excellently; however, I'm now having trouble configuring it to fall back to an appropriate default/error page when the vhost directory does not exist. The problem is complicated by my desire to handle two error conditions: 1) vhost not found, i.e. there was no directory found matching the host supplied in the HTTP Host header - I'd like this to display an appropriate site-not-found error page; and 2) the ordinary 404 page-not-found condition within a vhost. Additionally I have a specialised "api" vhost in its own vhost block. I've tried a number of variations and none seem to exhibit the behaviour I want. Here's what I'm working with right now: NameVirtualHost *:80 <VirtualHost *:80> DocumentRoot /var/www/site-not-found ServerName sitenotfound.mydomain.org ErrorDocument 500 /500.html ErrorDocument 404 /500.html </VirtualHost> <VirtualHost *:80> ServerName api.mydomain.org DocumentRoot /var/www/vhosts/api.mydomain.org/current # other directives, e.g. setting up passenger/rails etc... </VirtualHost> <VirtualHost *:80> # get the server name from the Host: header UseCanonicalName Off VirtualDocumentRoot /var/www/vhosts/%0/current # other directives ... e.g proxy passing to api etc... ErrorDocument 404 /404.html </VirtualHost> My understanding is that the first vhost block is used as the default, so I have it here as my catch-all site. Next I have my API vhost, and then finally my mass-vhost block. So for a domain that doesn't match the first two ServerNames and has no corresponding directory in /var/www/vhosts/, I'd expect it to fall over to the first vhost; however, with this setup, all domains resolve to my default site-not-found. Why is this? By putting the mass-vhost block first, I can get the mass vhosts to resolve properly, but not my site-not-found vhost... and in this case I can't seem to find a way to distinguish between a page-level 404 in the vhost and the case where VirtualDocumentRoot fails to find a vhost directory (this appears to use the 404 as well). Any help out of this bind is much appreciated!
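    A quick way to see which vhost actually answers a given name is to exercise the server directly with explicit Host headers (a diagnostic sketch only; the hostnames are the ones from the configuration above and 127.0.0.1 assumes you are testing from the web server itself):

        # Dump the parsed virtual-host list and which block is the default for *:80.
        apachectl -S

        # Compare what comes back for a host with no directory versus the API host.
        curl -s -o /dev/null -w '%{http_code}\n' -H 'Host: no-such-site.mydomain.org' http://127.0.0.1/
        curl -s -o /dev/null -w '%{http_code}\n' -H 'Host: api.mydomain.org' http://127.0.0.1/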

    Read the article

  • Problem with apt-get update: failed to fetch error

    - by user171447
    I run an Ubuntu Server 12.04.3 LTS. Today, I wanted to update it, but I did not managed it (yes...), however upgrading worked well. I don't want you to solve my problem but it would be greatful if you could give me some hints. I googled hours, I fould a lot of this kind of errors, but not exactly this. Here is the output of apt-get update: Hit http://filepile.fastit.net precise Release.gpg Hit http://filepile.fastit.net precise Release Hit http://filepile.fastit.net precise/main amd64 Packages Hit http://filepile.fastit.net precise/restricted amd64 Packages Hit http://filepile.fastit.net precise/universe amd64 Packages Hit http://filepile.fastit.net precise/multiverse amd64 Packages Hit http://filepile.fastit.net precise/main i386 Packages Hit http://filepile.fastit.net precise/restricted i386 Packages Hit http://filepile.fastit.net precise/universe i386 Packages Hit http://filepile.fastit.net precise/multiverse i386 Packages Ign http://filepile.fastit.net precise/main TranslationIndex Ign http://filepile.fastit.net precise/multiverse TranslationIndex Ign http://filepile.fastit.net precise/restricted TranslationIndex Ign http://filepile.fastit.net precise/universe TranslationIndex Ign http://filepile.fastit.net precise/main Translation-en_GB Ign http://filepile.fastit.net precise/main Translation-en Ign http://filepile.fastit.net precise/main Translation-en_GB.UTF-8 Hit http://archive.canonical.com precise Release.gpg Ign http://filepile.fastit.net precise/multiverse Translation-en_GB Ign http://filepile.fastit.net precise/multiverse Translation-en Ign http://filepile.fastit.net precise/multiverse Translation-en_GB.UTF-8 Ign http://filepile.fastit.net precise/restricted Translation-en_GB Ign http://filepile.fastit.net precise/restricted Translation-en Ign http://filepile.fastit.net precise/restricted Translation-en_GB.UTF-8 Ign http://filepile.fastit.net precise/universe Translation-en_GB Ign http://filepile.fastit.net precise/universe Translation-en Ign http://filepile.fastit.net precise/universe Translation-en_GB.UTF-8 Hit http://archive.ubuntu.com precise Release.gpg Hit http://archive.canonical.com precise Release Hit http://archive.ubuntu.com precise Release Hit http://archive.ubuntu.com precise/main Sources Hit http://archive.ubuntu.com precise/restricted Sources Hit http://archive.ubuntu.com precise/main i386 Packages Hit http://archive.ubuntu.com precise/restricted i386 Packages Hit http://archive.ubuntu.com precise/multiverse i386 Packages Hit http://archive.ubuntu.com precise/main TranslationIndex Hit http://archive.ubuntu.com precise/multiverse TranslationIndex Hit http://archive.ubuntu.com precise/restricted TranslationIndex Hit http://archive.ubuntu.com precise/main Translation-en_GB Hit http://archive.ubuntu.com precise/main Translation-en Hit http://archive.ubuntu.com precise/multiverse Translation-en_GB Hit http://archive.ubuntu.com precise/multiverse Translation-en Hit http://archive.ubuntu.com precise/restricted Translation-en_GB Hit http://archive.ubuntu.com precise/restricted Translation-en Hit http://fr.archive.ubuntu.com precise-security Release.gpg Hit http://fr.archive.ubuntu.com precise-updates Release.gpg Hit http://fr.archive.ubuntu.com precise-security Release Hit http://fr.archive.ubuntu.com precise-updates Release Hit http://fr.archive.ubuntu.com precise-security/main amd64 Packages Hit http://fr.archive.ubuntu.com precise-security/restricted amd64 Packages Hit http://fr.archive.ubuntu.com precise-security/universe amd64 Packages Hit 
http://fr.archive.ubuntu.com precise-security/multiverse amd64 Packages :W: Failed to fetch http://archive.canonical.com/dists/precise/Release Unable to find expected entry 'main/binary-amd64/Packages' in Release file (Wrong sources.list entry or malformed file) E: Some index files failed to download. They have been ignored, or old ones used instead. Hit http://fr.archive.ubuntu.com precise-security/main i386 Packages Hit http://fr.archive.ubuntu.com precise-security/restricted i386 Packages Hit http://fr.archive.ubuntu.com precise-security/universe i386 Packages Hit http://fr.archive.ubuntu.com precise-security/multiverse i386 Packages Hit http://fr.archive.ubuntu.com precise-security/main TranslationIndex Hit http://fr.archive.ubuntu.com precise-security/multiverse TranslationIndex Hit http://fr.archive.ubuntu.com precise-security/restricted TranslationIndex Hit http://fr.archive.ubuntu.com precise-security/universe TranslationIndex Hit http://fr.archive.ubuntu.com precise-updates/main amd64 Packages Hit http://fr.archive.ubuntu.com precise-updates/restricted amd64 Packages Hit http://fr.archive.ubuntu.com precise-updates/universe amd64 Packages Hit http://fr.archive.ubuntu.com precise-updates/multiverse amd64 Packages Hit http://fr.archive.ubuntu.com precise-updates/main i386 Packages Hit http://fr.archive.ubuntu.com precise-updates/restricted i386 Packages Hit http://fr.archive.ubuntu.com precise-updates/universe i386 Packages Hit http://fr.archive.ubuntu.com precise-updates/multiverse i386 Packages Hit http://fr.archive.ubuntu.com precise-updates/main TranslationIndex Hit http://fr.archive.ubuntu.com precise-updates/multiverse TranslationIndex Hit http://fr.archive.ubuntu.com precise-updates/restricted TranslationIndex Hit http://fr.archive.ubuntu.com precise-updates/universe TranslationIndex Hit http://fr.archive.ubuntu.com precise-security/main Translation-en Hit http://fr.archive.ubuntu.com precise-security/multiverse Translation-en Hit http://fr.archive.ubuntu.com precise-security/restricted Translation-en Hit http://fr.archive.ubuntu.com precise-security/universe Translation-en Hit http://fr.archive.ubuntu.com precise-updates/main Translation-en_GB Hit http://fr.archive.ubuntu.com precise-updates/main Translation-en Hit http://fr.archive.ubuntu.com precise-updates/multiverse Translation-en_GB Hit http://fr.archive.ubuntu.com precise-updates/multiverse Translation-en Hit http://fr.archive.ubuntu.com precise-updates/restricted Translation-en_GB Hit http://fr.archive.ubuntu.com precise-updates/restricted Translation-en Hit http://fr.archive.ubuntu.com precise-updates/universe Translation-en_GB Hit http://fr.archive.ubuntu.com precise-updates/universe Translation-en And here is my /etc/apt/sources.list: ###### Ubuntu Main Repos deb http://filepile.fastit.net/ubuntu/ precise main restricted universe multiverse # deb http://de.archive.ubuntu.com/ubuntu/ precise main restricted universe multiverse deb http://archive.canonical.com/ precise main restricted universe multiverse ###### Ubuntu Update Repos deb http://fr.archive.ubuntu.com/ubuntu/ precise-security main restricted universe multiverse deb http://fr.archive.ubuntu.com/ubuntu/ precise-updates main restricted universe multiverse deb http://archive.ubuntu.com/ubuntu/ precise main restricted multiverse deb-src http://archive.ubuntu.com/ubuntu precise main restricted multiverse Thanks for your help!
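    The "Unable to find expected entry 'main/binary-amd64/Packages'" message points at the archive.canonical.com line in sources.list: the Canonical partner archive lives under /ubuntu and only publishes a partner component, so a line asking it for main restricted universe multiverse cannot be satisfied. A hedged sketch of the usual correction (back up the file and check it against your own sources.list before applying):

        sudo cp /etc/apt/sources.list /etc/apt/sources.list.bak
        # Rewrite the malformed Canonical line to the partner-repository layout.
        sudo sed -i 's|^deb http://archive.canonical.com/ precise .*|deb http://archive.canonical.com/ubuntu precise partner|' /etc/apt/sources.list
        sudo apt-get update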

    Read the article

  • Weird .ASP pages from my non-ASP site generating 404s

    - by Amanda
    In Google Webmaster Tools, I have a huge list of "Not Found" crawl errors (404) with URLs that look like this: http://www.exclusivevillas.co.za/villa_view.asp?vSeq=82&activitySeq=3&page=3, seemingly originating from URLs very similar to that (e.g. http://www.exclusivevillas.co.za/villa_view.asp?vSeq=82&activitySeq=3&page=4). The thing is, the site is WordPress, and has been for almost a year now; it was plain HTML before that. I don't know where these ASP requests are coming from. Furthermore, the dates on which these supposed ASP pages requested the other ASP pages, resulting in 404s, are very recent. What's going on?
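    Before deciding how to handle these, it can help to confirm what the server currently returns for one of the reported URLs (a quick check only; the URL is taken verbatim from the report above):

        curl -s -o /dev/null -w '%{http_code}\n' \
          'http://www.exclusivevillas.co.za/villa_view.asp?vSeq=82&activitySeq=3&page=3'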

    Read the article

  • JQuery + WCF + HTTP 404 Error

    - by hangar18
    HI All, I've searched high and low and finally decided to post a query here. I'm writing a very basic HTML page from which I'm trying to call a WCF service using jQuery and parse it using JSON. Service: IMyDemo.cs [ServiceContract] public interface IMyDemo { [WebInvoke(Method = "POST", BodyStyle = WebMessageBodyStyle.WrappedRequest, ResponseFormat = WebMessageFormat.Json)] Employee DoWork(); [OperationContract] [WebInvoke(Method = "POST", BodyStyle = WebMessageBodyStyle.WrappedRequest, ResponseFormat = WebMessageFormat.Json)] Employee GetEmp(int age, string name); } [DataContract] public class Employee { [DataMember] public int EmpId { get; set; } [DataMember] public string EmpName { get; set; } [DataMember] public int EmpSalary { get; set; } } MyDemo.svc.cs public Employee DoWork() { // Add your operation implementation here Employee obj = new Employee() { EmpSalary = 12, EmpName = "SomeName" }; return obj; } public Employee GetEmp(int age, string name) { Employee emp = new Employee(); if (age > 0) emp.EmpSalary = 12 + age; if (!string.IsNullOrEmpty(name)) emp.EmpName = "Server" + name; return emp; } WEb.Config <system.serviceModel> <services> <service behaviorConfiguration="EmployeesBehavior" name="MySample.MyDemo"> <endpoint address="" binding="webHttpBinding" contract="MySample.IMyDemo" behaviorConfiguration="EmployeesBehavior"/> </service> </services> <behaviors> <serviceBehaviors> <behavior name="EmployeesBehavior"> <serviceMetadata httpGetEnabled="true" /> <serviceDebug includeExceptionDetailInFaults="true" /> </behavior> </serviceBehaviors> <endpointBehaviors> <behavior name="EmployeesBehavior"> <webHttp/> </behavior> </endpointBehaviors> </behaviors> <serviceHostingEnvironment multipleSiteBindingsEnabled="true" /> </system.serviceModel> MyDemo.htm <head> <title></title> <script type="text/javascript" language="javascript" src="Scripts/jquery-1.4.1.js"></script> <script type="text/javascript" language="javascript" src="Scripts/json.js"></script> <script type="text/javascript"> //create a global javascript object for the AJAX defaults. debugger; var ajaxDefaults = {}; ajaxDefaults.base = { type: "POST", timeout : 1000, dataFilter: function (data) { //see http://encosia.com/2009/06/29/never-worry-about-asp-net-ajaxs-d-again/ data = JSON.parse(data); //use the JSON2 library if you aren’t using FF3+, IE8, Safari 3/Google Chrome return data.hasOwnProperty("d") ? 
data.d : data; }, error: function (xhr) { //see if (!xhr) return; if (xhr.responseText) { var response = JSON.parse(xhr.responseText); //console.log works in FF + Firebug only, replace this code if (response) alert(response); else alert("Unknown server error"); } } }; ajaxDefaults.json = $.extend(ajaxDefaults.base, { //see http://encosia.com/2008/03/27/using-jquery-to-consume-aspnet-json-web-services/ contentType: "application/json; charset=utf-8", dataType: "json" }); var ops = { baseUrl: "/MyService/MySample/MyDemo.svc/", doWork: function () { //see http://api.jquery.com/jQuery.extend/ var ajaxOptions = $.extend(ajaxDefaults.json, { url: ops.baseUrl + "DoWork", data: "{}", success: function (msg) { console.log("success"); console.log(typeof msg); if (typeof msg !== "undefined") { console.log(msg); } } }); $.ajax(ajaxOptions); return false; }, getEmp: function () { var ajaxOpts = $.extend(ajaxDefaults.json, { url: ops.baseUrl + "GetEmp", data: JSON.stringify({ age: 12, name: "NameName" }), success: function (msg) { $("span#lbl").html("age: " + msg.Age + "name:" + msg.Name); } }); $.ajax(ajaxOpts); return false; } } </script> </head> <body> <span id="lbl">abc</span> <br /><br /> <input type="button" value="GetEmployee" id="btnGetEmployee" onclick="javascript:ops.getEmp();" /> </body> I'm just not able to get this running. When I debug, I see the error being returned from the call is " Server Error in '/jQuerySample' Application. <h2> <i>HTTP Error 404 - Not Found.</i> </h2></span> " Looks like I'm missing something basic here. My sample is based on this I've been trying to fix the code for sometime now so I'd like you to take a look and see if you can figure out what is it that I'm doing wrong here. I'm able to see that the service is created when I browse the service in IE. I've also tried changing the setting as mentioned here Appreciate your help. I'm gonna blog about this as soon as the issue is resolved for the benefit of other devs Thanks -Soni
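    One way to separate a hosting/routing problem from a jQuery problem is to call the service endpoint directly, outside the browser. A sketch (the path is simply the baseUrl used in the script above joined to the operation name, and is only a guess at how the site is actually hosted):

        # POST an empty JSON body to the DoWork operation and show the raw status and headers.
        curl -i -X POST \
          -H 'Content-Type: application/json; charset=utf-8' \
          -d '{}' \
          'http://localhost/MyService/MySample/MyDemo.svc/DoWork'

    If this also returns 404, the problem is in the .svc hosting or the virtual directory path rather than in the JavaScript.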

    Read the article

  • I see files in FileZilla, but the internet denies their existence

    - by Zach L.
    I am doing some updates to a 10-year-old site, and I am baffled. Everything worked great locally, so I uploaded a bunch of stuff to the server using FileZilla. Within FileZilla I can see all of the files, but for some reason I get a 404 when trying to view them. It seems as though (at least for the folder I'm currently checking) this is happening for items that are "farther down the list" alphabetically. I tried re-uploading a file individually, but it didn't change anything. Is this an indication that I hit some sort of limit with the hosting company? And if so, why can I still see the files in FileZilla? Please offer guidance; I am at a loss.

    Read the article

  • Redirecting non existing post to homepage; is that good for SEO?

    - by BlackEagle
    I am checking my website in Google Webmaster Tools and I am seeing an astonishing 5000 links that could not be found by Google's crawlers. That's normal, because my website is built in such a way that users can drop their own content, which also leads to 404 pages. Not a problem at all if I can find a workaround, of course... So my question is: what if I made a function or a mod_rewrite rule that checks whether the link (a post, for example) exists and, if not, redirects it to the home page? Is this good for SEO? Will Google see this as 'link found'? How should I look at this problem?

    Read the article

  • How to prevent a 404 error when installing a subdomain using a WordPress multi-site installation

    - by Chris
    I have installed a multi-site installation of WordPress on my domain. I then added the necessary code to the wp-config.php file and .htaccess as instructed by WordPress. I also installed a plugin called Quick Page/Post Redirect Plugin, which allowed me to place a 301 redirect on the main domain, as I only want to use the subdomain and not the main domain. Then I also added the following line of code to the wp-config.php file to redirect the main domain: define( 'NOBLOGREDIRECT', 'URL Redirect Address' ); The site works fine with a redirect on the main domain, and my subdomain runs fine when you type in subdomain.domain.com or http://subdomain.domain.com. However, when I enter www.subdomain.domain.com or http://www.subdomain.domain.com the following error message is returned: Not Found The requested URL / was not found on this server. Apache/2.4.9 (Unix) Server at www.subdomain.domain.com Port 80 Any help with this would be much appreciated.
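    Since an Apache error page is being served for www.subdomain.domain.com, the request is reaching the server but nothing is claiming that hostname. One possible workaround is to strip the www from subdomain requests before WordPress sees them; a sketch only, assuming the www request lands in the same document root as the multi-site install and that mod_rewrite is available (the domain names are the placeholders from the question - if the www host lands in a different vhost entirely, a ServerAlias there would be needed instead):

        # Prepend a www-stripping rule so it runs before the existing WordPress rules.
        printf '%s\n' \
          'RewriteEngine On' \
          'RewriteCond %{HTTP_HOST} ^www\.(.+\.domain\.com)$ [NC]' \
          'RewriteRule ^(.*)$ http://%1/$1 [R=301,L]' \
          | cat - .htaccess > .htaccess.new && mv .htaccess.new .htaccess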

    Read the article

  • URL parameter names being changed by user agents

    - by Mike Deck
    In reviewing one of our site's web logs I'm seeing instances where we are returning a 404 to requests because we're expecting an id parameter to be sent, but instead we're seeing a di parameter. The resource in question is an image, but which image file actually gets served depends on the id parameter. The expected URL is something like http://images.mysite.com/photo.gif?id=123&width=200&height=300 but what I'm seeing in the logs is requests for http://images.mysite.com/photo.gif?di=123&width=200&height=300 The only parameter we are seeing this on is id. It seems unlikely that this is due to a server-side or JavaScript bug, since it only seems to be affecting a small percentage of our traffic. We are seeing this across a wide variety of user agents (both mobile and desktop) and IPs. Has anyone else seen this? Is there a browser plugin or other software you're aware of that could be causing this, and if so, is there a good way to work around the issue?

    Read the article

  • Joomla url issue with sh404SEF

    - by user5858
    I've been using sh404SEF with my site for a couple of months, but I'm getting URLs in the form: http://www.downloadformsindia.com/Income-Tax-Forms/all-income-tax-return-itr-forms-2010-2011.html?task=view If I remove the suffix (?task=view), it takes us to the same page. I raised this issue in the sh404SEF forum and was told that this data is treated as a parameter by search engines and hence ignored. I want to redirect, using RewriteMatch in .htaccess, all such URLs to the ones without ?task=view: ....downloadformsindia.com/Income-Tax-Forms/all-income-tax-return-itr-forms-2010-2011.html?task=view should be redirected to http://www.downloadformsindia.com/Income-Tax-Forms/all-income-tax-return-itr-forms-2010-2011.html So my question is: will this redirection create 404s in Google Webmaster Tools? I have thousands of pages on the site.
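    For what it's worth, a rewrite pattern on its own cannot match the query string; the usual approach is a RewriteCond on %{QUERY_STRING} plus a trailing ? in the substitution to drop it. A sketch of one way to put such a rule ahead of the existing Joomla/sh404SEF rules (an assumption, not a tested recipe for this site - rule order matters, so try it on a copy of the .htaccess first):

        # Prepend a rule that 301-redirects any URL carrying task=view to the same URL
        # without its query string (the trailing '?' strips it).
        printf '%s\n' \
          'RewriteEngine On' \
          'RewriteCond %{QUERY_STRING} (^|&)task=view(&|$)' \
          'RewriteRule ^(.*)$ /$1? [R=301,L]' \
          | cat - .htaccess > .htaccess.new && mv .htaccess.new .htaccess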

    Read the article

  • nginx + php fpm -> 404 php pages - file not found

    - by Mahesh
    *2037 FastCGI sent in stderr: "Primary script unknown" while reading response header from upstream server { listen 80; ## listen for ipv4; this line is default and implied #listen [::]:80 default ipv6only=on; ## listen for ipv6 server_name .site.com; root /var/www/site; error_page 404 /404.php; access_log /var/log/nginx/site.access.log; index index.html index.php; if ($http_host != "www.site.com") { rewrite ^ http://www.site.com$request_uri permanent; } location ~* \.php$ { fastcgi_index index.php; fastcgi_pass 127.0.0.1:9000; #fastcgi_pass unix:/var/run/php-fpm/php-fpm.sock; fastcgi_buffer_size 128k; fastcgi_buffers 256 4k; fastcgi_busy_buffers_size 256k; fastcgi_temp_file_write_size 256k; fastcgi_read_timeout 240; include /etc/nginx/fastcgi_params; fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name; fastcgi_param SCRIPT_NAME $fastcgi_script_name; } location ~ /\. { access_log off; log_not_found off; deny all; } location ~ /(libraries|setup/frames|setup/libs) { deny all; return 404; } location ~ ^/uploads/(\d+)/(\d+)/(\d+)/(\d+)/(.*)$ { alias /var/www/site/images/missing.gif; #i need to modify this to show only missing files. right now it is showing missing for all the files. } location ~* \.(jpg|jpeg|png|gif|ico|css|js)$ { access_log off; expires 20d; } location /user_uploads/ { location ~ .*\.(php)?$ { deny all; } } location ~ /\.ht { deny all; } } php-fpm config is default and is not touched. The problem is little strange for me. Error pages are showing File not found only if they are .php files. Other error files are clearly calling the 404.php file site.com/test = calls 404.php site.com/test.php = File not found. I am searching and making changes. but it hasn't solved the problem.
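    "Primary script unknown" means php-fpm could not open the file at the SCRIPT_FILENAME it was handed, so a first check is whether $document_root$fastcgi_script_name really points at an existing file that the php-fpm user can read. A small diagnostic sketch (the paths come from the config above; cgi-fcgi ships with the libfcgi tools and is optional):

        # Do the files nginx would pass for /404.php and /index.php actually exist,
        # and which user is php-fpm running as?
        ls -l /var/www/site/404.php /var/www/site/index.php
        ps aux | grep '[p]hp-fpm'

        # Talk to php-fpm directly, bypassing nginx, with an explicit SCRIPT_FILENAME.
        SCRIPT_FILENAME=/var/www/site/404.php REQUEST_METHOD=GET \
          cgi-fcgi -bind -connect 127.0.0.1:9000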

    Read the article

  • HTTP 1.0 vs 1.1

    - by Jason Baker
    Could somebody give me a brief overview of the differences between HTTP 1.0 and HTTP 1.1? I've spent some time with both of the RFCs, but haven't been able to pull out many differences between them. Wikipedia says this: HTTP/1.1 (1997-1999) Current version; persistent connections enabled by default and works well with proxies. Also supports request pipelining, allowing multiple requests to be sent at the same time, allowing the server to prepare for the workload and potentially transfer the requested resources more quickly to the client. But that doesn't mean a lot to me. I realize this is a somewhat complicated subject, so I'm not expecting a full answer, but can someone give me a brief overview of the differences at a slightly lower level? By this I mean the information I would need in order to implement either an HTTP server or an HTTP application. I'm really just looking for a nudge in the right direction so that I can figure it out on my own.
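    One low-level difference is easy to see from a terminal: HTTP/1.1 makes the Host header mandatory (which is what enables name-based virtual hosting) and keeps connections open by default, while HTTP/1.0 does neither. A small sketch for comparing the two against any public site (example.com is just a placeholder; some nc variants need -q 1 to wait for the reply):

        # An HTTP/1.0 request is accepted without a Host header ...
        printf 'GET / HTTP/1.0\r\n\r\n' | nc example.com 80 | head -n 1

        # ... but the same request labelled HTTP/1.1 without Host is rejected, typically with 400.
        printf 'GET / HTTP/1.1\r\n\r\n' | nc example.com 80 | head -n 1

        # curl can force either version so you can compare headers such as Connection:.
        curl -sI --http1.0 http://example.com/ | head -n 5
        curl -sI http://example.com/ | head -n 5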

    Read the article

  • IIS 7.5 returning 404 for unknown host names

    - by WaldenL
    This just doesn't seem correct to me, so I'm looking for someone to tell me how I've misconfigured IIS... The configuration is IIS 7.5 (2008 R2), without SP1. I have IIS 7.5 configured with several sites. ALL sites have hostnames defined in their bindings; there is NO site without a hostname. However, if I request an unknown hostname from the server, IIS (technically Microsoft-HTTPAPI/2.0) returns a 404 error, not a 400 error. I would expect a 400 (or some other major error) rather than a lowly 404. This causes a problem when I have nginx in front of multiple IIS servers and want to stop a site so nginx takes it out of rotation: since IIS still returns a 404 for the request even when there is no active site for that name, nginx doesn't know the server is dead. NB: IIS returns the 404 regardless of whether the site exists but is stopped, or there is no site at all. Thoughts? Solutions? -- Additional info: I added a site on a port other than 80 (5000) and then, on a connection to that port, asked for a site that doesn't exist, and I get the expected error 400 (Invalid hostname). So, while IIS isn't listening for generic (no host name) connections on port 80, it would seem that something is. Any ideas how to get HTTP.sys to dump the list of what it's listening for?
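    HTTP.sys does have a built-in way to dump what it has registered. A sketch of the commands to try from an elevated command prompt on the server (Windows netsh rather than a Unix shell; the first lists every URL registration per process, the second lists the configured URL reservations):

        netsh http show servicestate
        netsh http show urlacl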

    Read the article

  • Upgrading from 12.10 to 13.04 -> dpkg: error processing sudo (--configure)

    - by Korrigan Nagirrok
    Here's the deal and reason I'm asking for your help. Last night I went on upgrading my Xubuntu 12.10 installation to 13.04, so at tty1 I run the command sudo do-release-upgrade and everything seemed to went well except that after rebooting and when I run sudo apt-get update && sudo apt-get upgrade I get this error: sudo apt-get update && sudo apt-get upgrade Hit http://pt.archive.ubuntu.com raring Release.gpg Hit http://pt.archive.ubuntu.com raring-updates Release.gpg Hit http://dl.google.com stable Release.gpg Hit http://pt.archive.ubuntu.com raring-backports Release.gpg Hit http://pt.archive.ubuntu.com raring Release Hit http://archive.canonical.com raring Release.gpg Hit http://ppa.launchpad.net raring Release.gpg Hit http://pt.archive.ubuntu.com raring-updates Release Hit http://extras.ubuntu.com raring Release.gpg Hit http://pt.archive.ubuntu.com raring-backports Release Hit http://dl.google.com stable Release Hit http://pt.archive.ubuntu.com raring/main Sources Hit http://pt.archive.ubuntu.com raring/restricted Sources Hit http://extras.ubuntu.com raring Release Hit http://archive.canonical.com raring Release Hit http://ppa.launchpad.net raring Release.gpg Hit http://pt.archive.ubuntu.com raring/universe Sources Hit http://pt.archive.ubuntu.com raring/multiverse Sources Hit http://dl.google.com stable/main i386 Packages Get:1 http://security.ubuntu.com raring-security Release.gpg [933 B] Hit http://pt.archive.ubuntu.com raring/main i386 Packages Hit http://extras.ubuntu.com raring/main Sources Hit http://ppa.launchpad.net raring Release Hit http://archive.canonical.com raring/partner i386 Packages Hit http://pt.archive.ubuntu.com raring/restricted i386 Packages Hit http://pt.archive.ubuntu.com raring/universe i386 Packages Hit http://extras.ubuntu.com raring/main i386 Packages Hit http://pt.archive.ubuntu.com raring/multiverse i386 Packages Hit http://ppa.launchpad.net raring Release Hit http://pt.archive.ubuntu.com raring/main Translation-en Hit http://ppa.launchpad.net raring/main Sources Hit http://ppa.launchpad.net raring/main i386 Packages Hit http://pt.archive.ubuntu.com raring/multiverse Translation-en Hit http://pt.archive.ubuntu.com raring/restricted Translation-en Hit http://pt.archive.ubuntu.com raring/universe Translation-en Hit http://pt.archive.ubuntu.com raring-updates/main Sources Hit http://pt.archive.ubuntu.com raring-updates/restricted Sources Hit http://ppa.launchpad.net raring/main Sources Hit http://pt.archive.ubuntu.com raring-updates/universe Sources Hit http://pt.archive.ubuntu.com raring-updates/multiverse Sources Hit http://pt.archive.ubuntu.com raring-updates/main i386 Packages Hit http://ppa.launchpad.net raring/main i386 Packages Hit http://pt.archive.ubuntu.com raring-updates/restricted i386 Packages Hit http://pt.archive.ubuntu.com raring-updates/universe i386 Packages Hit http://pt.archive.ubuntu.com raring-updates/multiverse i386 Packages Ign http://dl.google.com stable/main Translation-en_US Hit http://pt.archive.ubuntu.com raring-updates/main Translation-en Ign http://archive.canonical.com raring/partner Translation-en_US Ign http://extras.ubuntu.com raring/main Translation-en_US Ign http://dl.google.com stable/main Translation-en Ign http://archive.canonical.com raring/partner Translation-en Hit http://pt.archive.ubuntu.com raring-updates/multiverse Translation-en Ign http://extras.ubuntu.com raring/main Translation-en Hit http://pt.archive.ubuntu.com raring-updates/restricted Translation-en Hit http://pt.archive.ubuntu.com 
raring-updates/universe Translation-en Hit http://pt.archive.ubuntu.com raring-backports/main Sources Hit http://pt.archive.ubuntu.com raring-backports/restricted Sources Hit http://pt.archive.ubuntu.com raring-backports/universe Sources Hit http://pt.archive.ubuntu.com raring-backports/multiverse Sources Hit http://pt.archive.ubuntu.com raring-backports/main i386 Packages Hit http://pt.archive.ubuntu.com raring-backports/restricted i386 Packages Hit http://pt.archive.ubuntu.com raring-backports/universe i386 Packages Hit http://pt.archive.ubuntu.com raring-backports/multiverse i386 Packages Hit http://pt.archive.ubuntu.com raring-backports/main Translation-en Hit http://pt.archive.ubuntu.com raring-backports/multiverse Translation-en Get:2 http://security.ubuntu.com raring-security Release [40.8 kB] Hit http://pt.archive.ubuntu.com raring-backports/restricted Translation-en Hit http://pt.archive.ubuntu.com raring-backports/universe Translation-en Ign http://ppa.launchpad.net raring/main Translation-en_US Ign http://ppa.launchpad.net raring/main Translation-en Get:3 http://security.ubuntu.com raring-security/main Sources [2,109 B] Ign http://ppa.launchpad.net raring/main Translation-en_US Ign http://ppa.launchpad.net raring/main Translation-en Get:4 http://security.ubuntu.com raring-security/restricted Sources [14 B] Get:5 http://security.ubuntu.com raring-security/universe Sources [14 B] Get:6 http://security.ubuntu.com raring-security/multiverse Sources [14 B] Get:7 http://security.ubuntu.com raring-security/main i386 Packages [3,670 B] Get:8 http://security.ubuntu.com raring-security/restricted i386 Packages [14 B] Get:9 http://security.ubuntu.com raring-security/universe i386 Packages [2,824 B] Get:10 http://security.ubuntu.com raring-security/multiverse i386 Packages [14 B] Ign http://pt.archive.ubuntu.com raring/main Translation-en_US Ign http://pt.archive.ubuntu.com raring/multiverse Translation-en_US Ign http://pt.archive.ubuntu.com raring/restricted Translation-en_US Ign http://pt.archive.ubuntu.com raring/universe Translation-en_US Ign http://pt.archive.ubuntu.com raring-updates/main Translation-en_US Ign http://pt.archive.ubuntu.com raring-updates/multiverse Translation-en_US Hit http://security.ubuntu.com raring-security/main Translation-en Ign http://pt.archive.ubuntu.com raring-updates/restricted Translation-en_US Ign http://pt.archive.ubuntu.com raring-updates/universe Translation-en_US Ign http://pt.archive.ubuntu.com raring-backports/main Translation-en_US Ign http://pt.archive.ubuntu.com raring-backports/multiverse Translation-en_US Ign http://pt.archive.ubuntu.com raring-backports/restricted Translation-en_US Hit http://security.ubuntu.com raring-security/multiverse Translation-en Ign http://pt.archive.ubuntu.com raring-backports/universe Translation-en_US Hit http://security.ubuntu.com raring-security/restricted Translation-en Hit http://security.ubuntu.com raring-security/universe Translation-en Ign http://security.ubuntu.com raring-security/main Translation-en_US Ign http://security.ubuntu.com raring-security/multiverse Translation-en_US Ign http://security.ubuntu.com raring-security/restricted Translation-en_US Ign http://security.ubuntu.com raring-security/universe Translation-en_US Fetched 50.4 kB in 6s (7,454 B/s) Reading package lists... Done Reading package lists... Done Building dependency tree Reading state information... Done 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 2 not fully installed or removed. Need to get 0 B/373 kB of archives. 
After this operation, 0 B of additional disk space will be used. Do you want to continue [Y/n]? Y dpkg: error processing sudo (--configure): Package is in a very bad inconsistent state - you should reinstall it before attempting configuration. No apport report written because MaxReports is reached already dpkg: dependency problems prevent configuration of ubuntu-minimal: ubuntu-minimal depends on sudo; however: Package sudo is not configured yet. dpkg: error processing ubuntu-minimal (--configure): dependency problems - leaving unconfigured No apport report written because MaxReports is reached already Errors were encountered while processing: sudo ubuntu-minimal E: Sub-process /usr/bin/dpkg returned an error code (1) I've tried everything I thought logical, like sudo dpkg --configure -a dpkg: error processing sudo (--configure): Package is in a very bad inconsistent state - you should reinstall it before attempting configuration. dpkg: dependency problems prevent configuration of ubuntu-minimal: ubuntu-minimal depends on sudo; however: Package sudo is not configured yet. dpkg: error processing ubuntu-minimal (--configure): dependency problems - leaving unconfigured Errors were encountered while processing: sudo ubuntu-minimal sudo apt-get install -f Reading package lists... Done Building dependency tree Reading state information... Done 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 2 not fully installed or removed. Need to get 0 B/373 kB of archives. After this operation, 0 B of additional disk space will be used. dpkg: error processing sudo (--configure): Package is in a very bad inconsistent state - you should reinstall it before attempting configuration. dpkg: dependency problems prevent configuration of ubuntu-minimal: ubuntu-minimal depends on sudo; however: Package sudo is not configured yet. dpkg: error processing ubuntu-minimal (--configure): dependency problems - leaving unconfigured No apport report written because MaxReports is reached already No apport report written because MaxReports is reached already Errors were encountered while processing: sudo ubuntu-minimal E: Sub-process /usr/bin/dpkg returned an error code (1) Can someone help me, please. Edit: Here's some more info that could be of help for anyone. The output of apt-cache policy linux-image-generic-pae linux-generic-pae is linux-image-generic-pae: Installed: (none) Candidate: 3.8.0.19.35 Version table: 3.8.0.19.35 0 500 http://pt.archive.ubuntu.com/ubuntu/ raring/main i386 Packages linux-generic-pae: Installed: (none) Candidate: 3.8.0.19.35 Version table: 3.8.0.19.35 0 500 http://pt.archive.ubuntu.com/ubuntu/ raring/main i386 Packages
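    "Package is in a very bad inconsistent state - you should reinstall it" generally has to be cleared by pulling a fresh copy of the package back through dpkg. A hedged sketch of the commonly suggested sequence (no guarantee for this particular machine; if apt refuses because of the broken state, the .deb can be fetched with apt-get download sudo and installed with sudo dpkg -i):

        sudo apt-get clean
        sudo apt-get update
        sudo apt-get install --reinstall sudo
        sudo dpkg --configure -a
        sudo apt-get install -f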

    Read the article

  • Git push over http (using git-http-backend) and Apache is not working

    - by Ole_Brun
    I have desperately been trying to get push for git working through the "smart-http" mode using git-http-backend. However after many hours of testing and troubleshooting, I am still left with error: Cannot access URL http://localhost/git/hello.git/, return code 22 fatal: git-http-push failed` I am using latest versions of Ubuntu (12.04), Apache2 (2.2.22) and Git (1.7.9.5) and have followed different tutorials found on the Internet, like this one http://www.parallelsymmetry.com/howto/git.jsp. My VHost file currently looks like this: <VirtualHost *:80> SetEnv GIT_PROJECT_ROOT /var/www/git SetEnv GIT_HTTP_EXPORT_ALL SetEnv REMOTE_USER=$REDIRECT_REMOTE_USER DocumentRoot /var/www/git ScriptAliasMatch \ "(?x)^/(.*?)\.git/(HEAD | \ info/refs | \ objects/info/[^/]+ | \ git-(upload|receive)-pack)$" \ /usr/lib/git-core/git-http-backend/$1/$2 <Directory /var/www/git> Options +ExecCGI +SymLinksIfOwnerMatch -MultiViews AllowOverride None Order allow,deny allow from all </Directory> </VirtualHost> I have changed the ownership of the /var/www/git folder to root.www-data and for my test repositories I have enabled anonymous push by doing git config http.receivepack true. I have also tried with authenticated users but with the same outcome. The repositories were created using: sudo git init --bare --shared [repo-name] While looking at the apache2 access.log, it appears to me that WebDAV is trying to be used, and that git-http-backend is never fired: 127.0.0.1 - - [20/May/2012:23:04:53 +0200] "GET /git/hello.git/info/refs?service=git-receive-pack HTTP/1.1" 200 207 "-" "git/1.7.9.5" 127.0.0.1 - - [20/May/2012:23:04:53 +0200] "GET /git/hello.git/HEAD HTTP/1.1" 200 232 "-" "git/1.7.9.5" 127.0.0.1 - - [20/May/2012:23:04:53 +0200] "PROPFIND /git/hello.git/ HTTP/1.1" 405 563 "-" "git/1.7.9.5" What am I doing wrong? Is it an issue with the version of git and/or apache that I am using perhaps? BTW: I have read all the git http related questions on ServerFault and StackOverflow, and none of them provided me with a solution, so please don't mark this as duplicate.
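    The PROPFIND in the access log backs up the suspicion above: the client fell back to the dumb WebDAV-based protocol, which means the ScriptAliasMatch never handed the request to git-http-backend. A quick check of whether the smart endpoint is wired up is to request the ref advertisement directly and inspect the Content-Type (the URL is the one from the question):

        curl -si 'http://localhost/git/hello.git/info/refs?service=git-receive-pack' | head -n 20
        # A working smart-HTTP setup answers with
        #   Content-Type: application/x-git-receive-pack-advertisement
        # A plain text/plain response means the file was served statically instead.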

    Read the article

  • Bingbot seems to be adding "ForceRecrawl: 0" to URLs when crawling my sites

    - by Louis Somers
    I'm seeing this in the IIS logs of two websites that I maintain: GET /an/existing/page/on/my/site+ForceRecrawl:+0 - 80 - 207.46.195.105 HTTP/1.1 Mozilla/5.0+(compatible;+bingbot/2.0;++http://www.bing.com/bingbot.htm) I get about one or two of these per day from these IP addresses: 207.46.195.105, 65.52.110.190 and more, all belonging to msnbot-ip.search.msn.com. Perhaps Microsoft has a bug in their crawler? Anyway, doing a search on "ForceRecrawl: 0" in the major search engines comes up with a bunch of random sites, and searching on StackOverflow or here gave no results (to my amazement). Am I the only one seeing this? I first noticed these on the 9th of this month, and I've been seeing them almost daily since... Another thing that I think is crazy is that the URL http://www.bing.com/bingbot.htm redirects to mail.live.com (Hotmail). Currently I'm returning 404s, but I'm considering catching these, stripping the trailing " ForceRecrawl: 0" and processing the request as if it were a legitimate URL. Could anyone shed some light on this? Could it have to do with some configuration in Bing's Webmaster Tools?

    Read the article

  • Copyrights concerning code snippets and larger amounts of code

    - by JustcallmeDrago
    I am designing a public code repository. Users will be allowed to post and edit whatever amount of code they want, from code snippets to entire multi-file projects. I have a few major legal concerns about this: Not getting sued/shut down - I feel the site would be a much easier target than tracking down an individual user to sue. I have looked around a bit and see links to legal info in the footer of each page is common. What specific things should I do--and what does does a site such as YouTube (which I see copyrighted material on all the time) do--for protection? Citing sources and editing sourced code - If a user wants to post code that isn't theirs, what concerns/safeguards should I have? Will a link suffice, and what do I need further to allow the code to be edited (to improve it for example)? What can happen if a user posts copyrighted code without citing it? Large chunks of code - What legal differences should I look out for as the amount grows? Not having a mess of licenses for the site - I would like to have a single license (like RosettaCode) that keeps things simple for interaction on the site. I want the code to be postable and editable. I have looked into StackOverflow's CreativeCommons license a little and it says that If you alter, transform, or build upon this work, you may distribute the resulting work only under the same or similar license to this one. And on RosettaCode: All software found on Rosetta Code should be considered potentially hazardous. Use at your own risk. Be aware that all code on Rosetta Code is under the GNU Free Documentation License, as are any edits made by contributors. See Rosetta Code:Copyrights for details. What other licenses are like this? Commercializing the site - In what ways can I and can't I make money off of a site that contains code like this? All code will be publicly visible. Initial thoughts are having ads or making money by charging for advanced features.

    Read the article

  • Google Webmasters tools crawl error caused by URL split into two lines

    - by Shiro
    I am looking into the Crawl Errors section of Google Webmaster Tools. How should I handle URLs where someone else's system or application produced an invalid URL? E.g. the real URL is http://www.example/images/products/s_=enlarge_16gb.jpg but, for whatever reason, Yahoo Groups broke the link into http://www.example/images/products/s_= enlarge_16gb.jpg and only the top part became a hyperlink, which is http://www.example/images/products/s_= Because of that URL, Google shows a crawl error; I get a few errors because of this kind of result, or because of other people's typos. How do I prevent this? I am sure I don't have the right to go and change other people's posts. What is the solution for this? Thanks!

    Read the article

  • New Trusted Status awarded to first Mobile Java Developer

    - by Jacob Lehrbaum
    Java Verified has just announced that Gameloft is the first developer to receive its new Trusted Status! Java Verified is an industry-recognized Java testing and signing program backed and funded by companies such as AT&T, LG, Motorola, Nokia, Oracle, Orange, Samsung and Vodafone, and chartered with making it easier for mobile developers to certify and deploy applications for use across the billions of mobile handsets that run Java ME. Because of its breadth and diversity, Java ME provides an unmatched opportunity to reach more than 3 billion consumers, but at the same time developers are faced with the challenge of working with multiple distribution channels and a range of handsets. To this end, the Java Verified program provides a suite of tests that help to validate identity, functionality, integrity, and quality. Since its rebirth in 2010 as an independent organization, the Java Verified program has been actively working to make it even easier to create and distribute Java ME apps. Example initiatives include updates to the Unified Testing Criteria to make it easier to test "Simple Apps," community outreach to better understand and address developer pain points, and a new "Trusted Status." In the words of the Java Verified program, Trusted Status is: a privileged status to be granted to developers who will have proven that the quality of their Java ME apps is of a consistently high standard. These are developers who will have earned the trust of Java Verified by demonstrating unfailingly that testing to the UTC standard is a crucial part of their product development activity. The first developer to be awarded this status is Gameloft. By achieving Trusted Status, Gameloft can now test their applications to the Java Verified standard without needing to provide Java Verified with the evidence. The apps then automatically get signed with the Java Verified signature, enabling Gameloft to benefit from reduced costs and time-to-market for their new Java ME applications from here on out. Learn more about the exciting news or apply now for Trusted Status!

    Read the article

  • Apache, modifying response codes from 404 to 301

    - by user72539
    Hi, I'm running a Magento installation on an Apache server. There are many pages indexed in Google and linked to from external sites. I can't use 301 redirects in a .htaccess file, as I can't be sure I will catch all the links. At the moment all requests are rewritten through Magento, and if a request isn't found Magento returns a 404 Not Found. Is there a way of using one of the Apache modules to filter the response* from Magento and, if a 404 Not Found is being sent back, replace the response with a standard 301 redirect to the home page? E.g. request to Magento -- Apache -- rewrite to Magento index.php page -- page processed. Response: if the request exists -- return results (200); if the request doesn't exist -- return 404 -- Apache filter changes the response -- return a 301 redirect to /. I appreciate any help. Thanks, Jon. *As far as I am aware, mod_rewrite is only used to rewrite requests and doesn't allow the modification of responses.

    Read the article

  • One subdomain as CNAME for another domain, can I have different custom 404 pages

    - by lucky cool
    Actually I have a domain - domainone.com - and I have created a subdomain of another domain, abc.domaintwo.com, as a CNAME pointing to domainone.com, so that I can use the JS and CSS files, e.g. domainone.com/js/jquery.js, as abc.domaintwo.com/js/jquery.js. So far everything is FINE, no issues at all. Problem: I have a custom 404 page for domainone.com, and now when abc.domaintwo.com hits a 404 the same page appears, which I don't want. Any help is appreciated. .htaccess of domainone.com: ErrorDocument 404 /404/ All I want is to have a different 404 page for each..... Notes: I don't have shell access for symlinks or aliases, as I am on shared hosting.

    Read the article
