Search Results

Search found 127718 results on 5109 pages for 'http status code 401'.


  • Silverlight 3: long-running WCF call triggers 401.1 (access denied)

    - by sympatric greg
    I have a WCF service consumed by a Silverlight 3 control. The Silverlight client uses a basicHttpBinding that is constructed at runtime from the control's initialization parameters, like this:

        public static T GetServiceClient<T>(string serviceURL)
        {
            BasicHttpBinding binding = new BasicHttpBinding(
                Application.Current.Host.Source.Scheme.Equals("https", StringComparison.InvariantCultureIgnoreCase)
                    ? BasicHttpSecurityMode.Transport
                    : BasicHttpSecurityMode.None);
            binding.MaxReceivedMessageSize = int.MaxValue;
            binding.MaxBufferSize = int.MaxValue;
            binding.Security.Mode = BasicHttpSecurityMode.TransportCredentialOnly;
            return (T)Activator.CreateInstance(typeof(T), new object[] { binding, new EndpointAddress(serviceURL) });
        }

    The service implements Windows security. Calls were returning as expected until the result set increased to several thousand rows, at which time HTTP 401.1 errors were received. The service's HttpBinding defines closeTimeout, openTimeout, receiveTimeout and sendTimeout of 10 minutes. If I limit the size of the result set, the call succeeds.

    Additional observations from Fiddler: when Method2 is modified to return a smaller result set (and avoid the problem), control initialization consists of 4 calls:

        Service1/Method1 -- result: 401
        Service1/Method1 -- result: 401 (this time the header includes "Authorization: Negotiate TlRMTV...")
        Service1/Method1 -- result: 200
        Service1/Method2 -- result: 200 (1.25 seconds)

    When Method2 is configured to return the larger result set, we get:

        Service1/Method1 -- result: 401
        Service1/Method1 -- result: 401 (this time the header includes "Authorization: Negotiate TlRMTV...")
        Service1/Method1 -- result: 200
        Service1/Method2 -- result: 401.1 (7.5 seconds)
        Service1/Method2 -- result: 401.1 (15 ms)
        Service1/Method2 -- result: 401.1 (7.5 seconds)
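
    One way to narrow down a 401.1 like this, outside of the Silverlight stack, is to hit the failing endpoint without credentials and look at which authentication schemes the server offers in its challenge. This is a minimal Python diagnostic sketch; the service URL is a placeholder, not the one from the question:

        import http.client
        from urllib.parse import urlparse

        url = urlparse("http://myserver/Service1.svc")   # placeholder WCF endpoint
        conn = http.client.HTTPConnection(url.hostname, url.port or 80)
        conn.request("GET", url.path or "/")             # deliberately unauthenticated
        resp = conn.getresponse()

        print(resp.status, resp.reason)                  # Windows auth normally answers 401 here
        # Each WWW-Authenticate header names a scheme the server will accept
        # (e.g. Negotiate, NTLM); comparing the challenge on the fast and slow
        # calls shows whether the 401.1 is really a fresh auth challenge.
        for name, value in resp.getheaders():
            if name.lower() == "www-authenticate":
                print(name + ":", value)
        conn.close()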

    Read the article

  • Should we enforce code style in our large codebase?

    - by eighttrackmind
    By "code style" I mean 2 things: Style, eg. // bad if(foo){ ... } // good if (foo) { ... } Conventions and idiomaticity, where two ways of writing the same thing are functionally equivalent, but one is more idiomatic. eg. // bad if (fooLib.equals(a, b)) { ... } // good if (a == b) { ... } I think it makes sense to use an auto-formatter to enforce #1 automatically. So my question is specifically about #2. I like to break things down into pros and cons, here's what I've come up with so far: Pros: Used by many large codebases (eg. Google, jQuery) Helps make it a bit easier to work on new areas of the codebase Helps make code more portable (this is not necessarily true) Code style is automatic once you get used to it Makes it easier to fast-decline pull requests Cons: Takes engineers’ and code reviewers’ time away from more important things (like developing features) Code should ideally be rewritten every 2-3 years anyway, so it’s more important to focus on getting the architecture right, and achieving high test coverage Adds strain to code reviews (eg. “don’t do it this way, I like this other way better”) Even if I’ve been using a code style for a while, I still sometime have to pause and think about how to write a line better Having an enforced, uniform code style makes it hard to experiment with potentially better styles Maintaining a style guide takes a lot of incremental effort Engineers rarely read through the style guide. More often, it's cited in code reviews And as a secondary question: we also have many smaller repositories - should the same code style be enforced there?

    Read the article

  • Jetty - 401 Unauthorized when using basic authentication

    - by JP.
    I am running Solr on Jetty in Ubuntu (a Bitnami VM, if that helps) and am trying to lock down access to both the admin pages and the update/delete/etc. pages using basic authentication. When I attempt to connect to the admin console via a web browser I am prompted for a username and password, but the credentials I enter simply do not work. For test purposes I am using foo:bar as the credentials, but I receive a '401 Unauthorized' response. I see the following in my request log:

        127.0.0.1 - - [10/Nov/2013:05:35:46 +0000] "GET /solr/ HTTP/1.1" 401 1376

    Am I doing something wrong, and/or is there anything obviously incorrect with the configuration below? Any help is greatly appreciated.

    jetty.xml:

        <Call name="addBean">
          <Arg>
            <New class="org.eclipse.jetty.security.HashLoginService">
              <Set name="name">solr</Set>
              <Set name="config"><SystemProperty name="jetty.home" default="."/>/etc/realm.properties</Set>
              <Set name="refreshInterval">5</Set>
            </New>
          </Arg>
        </Call>

    /etc/realm.properties:

        foo: bar, solr_admin

    webdefault.xml:

        <security-constraint>
          <web-resource-collection>
            <url-pattern>/</url-pattern>
          </web-resource-collection>
          <auth-constraint>
            <role-name>solr_admin</role-name>
          </auth-constraint>
        </security-constraint>
        <login-config>
          <auth-method>BASIC</auth-method>
          <realm-name>solr</realm-name>
        </login-config>
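
    Before digging further into the Jetty configuration, it can help to take the browser out of the loop and send the exact credentials from a script, since the WWW-Authenticate header also reveals which realm Jetty thinks it is protecting. A minimal Python sketch, assuming Solr is listening on port 8983 (adjust host and port to the actual setup):

        import base64
        import http.client

        HOST, PORT, PATH = "localhost", 8983, "/solr/"   # adjust to the actual Jetty port

        # 1) Anonymous request: the WWW-Authenticate header shows the realm Jetty
        #    believes it is protecting (it should match <realm-name> in webdefault.xml).
        conn = http.client.HTTPConnection(HOST, PORT)
        conn.request("GET", PATH)
        resp = conn.getresponse()
        print("anonymous:", resp.status, resp.getheader("WWW-Authenticate"))
        conn.close()

        # 2) Same request with the test credentials from realm.properties.
        token = base64.b64encode(b"foo:bar").decode("ascii")
        conn = http.client.HTTPConnection(HOST, PORT)
        conn.request("GET", PATH, headers={"Authorization": "Basic " + token})
        resp = conn.getresponse()
        print("with credentials:", resp.status, resp.reason)
        conn.close()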

    Read the article

  • How to test an HTTP 301 redirect?

    - by NoozNooz42
    How can one easily test HTTP return codes, like, say, a 301 redirect? For example, if I want to "see what's going on", I can use telnet to do something like this:

        $ telnet nytimes.com 80
        Trying 199.239.136.200...
        Connected to nytimes.com.
        Escape character is '^]'.
        GET / HTTP/1.0
        (enter)
        (enter)
        HTTP/1.1 200 OK
        Server: Sun-ONE-Web-Server/6.1
        Date: Mon, 14 Jun 2010 12:18:04 GMT
        Content-type: text/html
        Set-cookie: RMID=007af83f42dd4c161dfcce7d; expires=Tuesday, 14-Jun-2011 12:18:04 GMT; path=/; domain=.nytimes.com
        Set-cookie: adxcs=-; path=/; domain=.nytimes.com
        Set-cookie: adxcs=-; path=/; domain=.nytimes.com
        Set-cookie: adxcs=-; path=/; domain=.nytimes.com
        Expires: Thu, 01 Dec 1994 16:00:00 GMT
        Cache-control: no-cache
        Pragma: no-cache
        Connection: close

        <!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
        <html>
        <head>
        ...

    which is an easy way to get at quite a lot of information. But now I want to test that a 301 redirect is indeed a 301 redirect. How can I do so? Basically, instead of getting an HTTP/1.1 200 OK, how can I get the 301? I know that I can enter the URL in a browser and "see" that I'm redirected, but I'd like to know what tool(s) can be used to actually "see" the 301 redirect. By the way, I did test with telnet, but when I enter www.example.org, which I redirect to example.org (without the www), all I see is a "200 OK"; I don't get to see the 301.
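
    One thing worth noting: if the redirect rule is host-based (www to non-www), the request has to carry the right Host header, which a hand-typed HTTP/1.0 request omits, so the server may simply answer 200 from its default site. A client that sends the Host header but does not follow redirects shows the 301 directly; the Python sketch below is one option (the hostname is a placeholder), and from the command line curl -I http://www.example.org prints the same status line and Location header:

        import http.client

        # http.client never follows redirects on its own, so a 301 is reported as a 301.
        conn = http.client.HTTPConnection("www.example.org")    # placeholder hostname
        conn.request("GET", "/")                 # the Host header is sent automatically
        resp = conn.getresponse()

        print(resp.status, resp.reason)          # e.g. 301 Moved Permanently
        print("Location:", resp.getheader("Location"))
        conn.close()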

    Read the article

  • Cadaver with Kerberos: 401 Unauthorized

    - by Nicolas Raoul
    How do I make Cadaver connect to a WebDAV server that uses Kerberos authentication? Usually cadaver http://localhost:8080/alfresco/webdav works and I can browse files, but on a network with Kerberos I get:

        Could not open collection: 401 Unauthorized

    This happens even though I have logged in with kinit successfully and have a valid ticket. I can see that Kerberos support was added to Cadaver in 2005. Is there a special syntax to use? There is no information in the man page.
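
    A quick way to check whether the server is actually offering Kerberos (Negotiate) at that URL, independent of cadaver, is to look at the WWW-Authenticate challenge it sends back. A small Python sketch against the URL from the question:

        import http.client

        conn = http.client.HTTPConnection("localhost", 8080)   # the endpoint from the question
        conn.request("OPTIONS", "/alfresco/webdav")
        resp = conn.getresponse()

        print(resp.status, resp.reason)
        # If SPNEGO/Kerberos is enabled, a "Negotiate" challenge should appear here;
        # if only "Basic" is offered, a Kerberos ticket from kinit will not help.
        for name, value in resp.getheaders():
            if name.lower() == "www-authenticate":
                print(name + ":", value)
        conn.close()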

    Read the article

  • Reporting Services Error 401: Unauthorized

    - by JJ
    I am trying to set up Reporting Services on a SQL Server 2008 box and get this error message when I try to connect to http://localhost/reports:

        The request failed with HTTP status 401: Unauthorized.

    I asked the server administrator and IIS should be running (I can't find it under Administrative Tools, though). I used the Reporting Services Configuration Manager to set up Reporting Services and am pretty sure it is set up correctly.

    Read the article

  • Turning off ASP.Net WebForms authentication for one sub-directory

    - by Keith
    I have a large enterprise application containing both WebForms and MVC pages. It has existing authentication and authorisation settings that I don't want to change. The WebForms authentication is configured in the web.config:

        <authentication mode="Forms">
          <forms blah... blah... blah />
        </authentication>
        <authorization>
          <deny users="?" />
        </authorization>

    Fairly standard so far. I have a REST service that is part of this big application, and I want to use HTTP authentication instead for this one service. So, when a user attempts to get JSON data from the REST service, it returns an HTTP 401 status and a WWW-Authenticate header. If they respond with a correctly formed HTTP Authorization response, it lets them in. The problem is that WebForms overrides this at a low level - if you return 401 (Unauthorized), it overrides that with a 302 (redirection to the login page). That's fine in the browser but useless for a REST service. I want to turn off the authentication setting in the web.config:

        <location path="rest">
          <system.web>
            <authentication mode="None" />
            <authorization><allow users="?" /></authorization>
          </system.web>
        </location>

    The authorisation bit works fine, but when I try to change the authentication I get an exception:

        It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level.

    I'm configuring this at application level, though - it's in the root web.config. How do I override the authentication so that all of the rest of the site uses WebForms authentication and this one directory uses none? This is similar to another question, "401 response code for json requests with ASP.NET MVC", but I'm not looking for the same solution - I don't want to just remove the WebForms authentication and add new custom code globally; there's far too much risk and work involved. I want to change just the one directory in configuration.

    Read the article

  • HTTP Builder/Groovy - lost 302 (redirect) handling?

    - by Misha Koshelev
    Dear All: I am reading the following at http://groovy.codehaus.org/modules/http-builder/doc/handlers.html: "In cases where a response sends a redirect status code, this is handled internally by Apache HttpClient, which by default will simply follow the redirect by re-sending the request to the new URL. You do not need to do anything special in order to follow 302 responses." This seems to work fine when I simply use the get() or post() methods without a closure. However, when I use a closure, I seem to lose the 302 handling. Is there some way I can handle this myself? Thank you.

    P.S. Here is my log output showing it is a 302 response:

        [java] FINER: resp.statusLine: "HTTP/1.1 302 Found"

    Here is the relevant code:

        // Copyright (C) 2010 Misha Koshelev. All Rights Reserved.
        package com.mksoft.fbbday.main

        import groovyx.net.http.ContentType
        import java.util.logging.Level
        import java.util.logging.Logger

        class HTTPBuilder {
            def dataDirectory
            HTTPBuilder(dataDirectory) {
                this.dataDirectory=dataDirectory
            }

            // Main logic
            def logger=Logger.getLogger(this.class.name)
            def closure={resp,reader->
                logger.finer("resp.statusLine: \"${resp.statusLine}\"")
                if (logger.isLoggable(Level.FINEST)) {
                    def respHeadersString='Headers:';
                    resp.headers.each() { header->respHeadersString+="\n\t${header.name}=\"${header.value}\"" }
                    logger.finest(respHeadersString)
                }
                def text=reader.text
                def lastHtml=new File("${dataDirectory}${File.separator}last.html")
                if (lastHtml.exists()) { lastHtml.delete() }
                lastHtml<<text
                new XmlSlurper(new org.cyberneko.html.parsers.SAXParser()).parseText(text)
            }
            def processArgs(args) {
                if (logger.isLoggable(Level.FINER)) {
                    def argsString='Args:';
                    args.each() { arg->argsString+="\n\t${arg.key}=\"${arg.value}\"" }
                    logger.finer(argsString)
                }
                args.contentType=groovyx.net.http.ContentType.TEXT
                args
            }

            // HTTPBuilder methods
            def httpBuilder=new groovyx.net.http.HTTPBuilder()
            def get(args) { httpBuilder.get(processArgs(args),closure) }
            def post(args) {
                args.contentType=groovyx.net.http.ContentType.TEXT
                httpBuilder.post(processArgs(args),closure)
            }
        }

    Here is a specific tester:

        #!/usr/bin/env groovy
        import groovyx.net.http.HTTPBuilder
        import groovyx.net.http.Method
        import static groovyx.net.http.ContentType.URLENC

        import java.util.logging.ConsoleHandler
        import java.util.logging.Level
        import java.util.logging.Logger

        // MUST ENTER VALID FACEBOOK EMAIL AND PASSWORD BELOW !!!
        def email=''
        def pass=''

        // Remove default loggers
        def logger=Logger.getLogger('')
        def handlers=logger.handlers
        handlers.each() { handler->logger.removeHandler(handler) }

        // Log ALL to Console
        logger.setLevel Level.ALL
        def consoleHandler=new ConsoleHandler()
        consoleHandler.setLevel Level.ALL
        logger.addHandler(consoleHandler)

        // Facebook - need to get main page to capture cookies
        def http = new HTTPBuilder()
        http.get(uri:'http://www.facebook.com')

        // Login
        def html=http.post(uri:'https://login.facebook.com/login.php?login_attempt=1',body:[email:email,pass:pass])
        assert html==null // Why null?
        html=http.post(uri:'https://login.facebook.com/login.php?login_attempt=1',body:[email:email,pass:pass]) { resp,reader->
            assert resp.statusLine.statusCode==302 // Shouldn't we be redirected???
            // http://groovy.codehaus.org/modules/http-builder/doc/handlers.html
            // "In cases where a response sends a redirect status code, this is handled internally by Apache HttpClient, which by default will simply follow the redirect by re-sending the request to the new URL. You do not need to do anything special in order to follow 302 responses.
" } Here are relevant logs: FINE: Receiving response: HTTP/1.1 302 Found Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.DefaultClientConnection receiveResponseHeader FINE: << HTTP/1.1 302 Found Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.DefaultClientConnection receiveResponseHeader FINE: << Cache-Control: private, no-store, no-cache, must-revalidate, post-check=0, pre-check=0 Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.DefaultClientConnection receiveResponseHeader FINE: << Expires: Sat, 01 Jan 2000 00:00:00 GMT Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.DefaultClientConnection receiveResponseHeader FINE: << Location: http://www.facebook.com/home.php? Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.DefaultClientConnection receiveResponseHeader FINE: << P3P: CP="DSP LAW" Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.DefaultClientConnection receiveResponseHeader FINE: << Pragma: no-cache Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.DefaultClientConnection receiveResponseHeader FINE: << Set-Cookie: datr=1275687438-9ff6ae60a89d444d0fd9917abf56e085d370277a6e9ed50c1ba79; expires=Sun, 03-Jun-2012 21:37:24 GMT; path=/; domain=.facebook.com Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.DefaultClientConnection receiveResponseHeader FINE: << Set-Cookie: lxe=koshelev%40post.harvard.edu; expires=Tue, 28-Sep-2010 15:24:04 GMT; path=/; domain=.facebook.com; httponly Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.DefaultClientConnection receiveResponseHeader FINE: << Set-Cookie: lxr=deleted; expires=Thu, 04-Jun-2009 21:37:23 GMT; path=/; domain=.facebook.com; httponly Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.DefaultClientConnection receiveResponseHeader FINE: << Set-Cookie: pk=183883c0a9afab1608e95d59164cc7dd; path=/; domain=.facebook.com; httponly Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.DefaultClientConnection receiveResponseHeader FINE: << Content-Type: text/html; charset=utf-8 Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.DefaultClientConnection receiveResponseHeader FINE: << X-Cnection: close Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.DefaultClientConnection receiveResponseHeader FINE: << Date: Fri, 04 Jun 2010 21:37:24 GMT Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.DefaultClientConnection receiveResponseHeader FINE: << Content-Length: 0 Jun 4, 2010 4:37:22 PM org.apache.http.client.protocol.ResponseProcessCookies processCookies FINE: Cookie accepted: "[version: 0][name: datr][value: 1275687438-9ff6ae60a89d444d0fd9917abf56e085d370277a6e9ed50c1ba79][domain: .facebook.com][path: /][expiry: Sun Jun 03 16:37:24 CDT 2012]". Jun 4, 2010 4:37:22 PM org.apache.http.client.protocol.ResponseProcessCookies processCookies FINE: Cookie accepted: "[version: 0][name: lxe][value: koshelev%40post.harvard.edu][domain: .facebook.com][path: /][expiry: Tue Sep 28 10:24:04 CDT 2010]". Jun 4, 2010 4:37:22 PM org.apache.http.client.protocol.ResponseProcessCookies processCookies FINE: Cookie accepted: "[version: 0][name: lxr][value: deleted][domain: .facebook.com][path: /][expiry: Thu Jun 04 16:37:23 CDT 2009]". Jun 4, 2010 4:37:22 PM org.apache.http.client.protocol.ResponseProcessCookies processCookies FINE: Cookie accepted: "[version: 0][name: pk][value: 183883c0a9afab1608e95d59164cc7dd][domain: .facebook.com][path: /][expiry: null]". 
Jun 4, 2010 4:37:22 PM org.apache.http.impl.client.DefaultRequestDirector execute FINE: Connection can be kept alive indefinitely Jun 4, 2010 4:37:22 PM groovyx.net.http.HTTPBuilder doRequest FINE: Response code: 302; found handler: post302$_run_closure2@7023d08b Jun 4, 2010 4:37:22 PM groovyx.net.http.HTTPBuilder doRequest FINEST: response handler result: null Jun 4, 2010 4:37:22 PM org.apache.http.impl.conn.SingleClientConnManager releaseConnection FINE: Releasing connection org.apache.http.impl.conn.SingleClientConnManager$ConnAdapter@605b28c9 You can see there is clearly a location argument. Thank you Misha
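
    For comparison, here is what "see the 302 yourself and follow it by hand" looks like in Python with the requests library. This is only an illustration of the general pattern (disable automatic redirects, read the Location header, follow it while keeping the session cookies), not the HTTPBuilder/HttpClient fix, and the URL and form fields are placeholders:

        import requests

        session = requests.Session()

        # Ask the library NOT to follow redirects so the 302 stays visible.
        resp = session.post(
            "https://example.com/login.php",          # placeholder URL
            data={"email": "user@example.com", "pass": "secret"},
            allow_redirects=False,
        )
        print(resp.status_code)                        # e.g. 302
        print(resp.headers.get("Location"))            # where the server wants us to go

        # Follow the redirect manually, reusing the session cookies.
        if resp.is_redirect:
            followup = session.get(resp.headers["Location"])
            print(followup.status_code)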

    Read the article

  • Update Manager unable to get updates

    - by dPEN
    In last few days my Ubuntu 11.10 update manager is unable to get new updates. When I checked update log I saw that for couple of updates it says "Network isn't available". For other updates it downloaded logs and and internet connection also works fine. Unable to attached screenshot due to SPAM prevention policy. But for below Release gpgv:/var/lib/apt/lists/partial/extras.ubuntu.com_ubuntu_dists_oneiric_Release.gpg it says "Network isn't available" For all other Releases it is downloading fine. And due to this I dont see any update available in last 10 days. LOG OF sudo apt-get update: dipen@EIDLCPU1018:~$ sudo apt-get update [sudo] password for dipen: Ign http:/extras.ubuntu.com oneiric InRelease Ign http:/archive.canonical.com oneiric InRelease Ign http:/archive.canonical.com lucid InRelease Get:1 http:/extras.ubuntu.com oneiric Release.gpg [72 B] Get:2 http:/archive.canonical.com oneiric Release.gpg [198 B] Hit http:/extras.ubuntu.com oneiric Release Get:3 http:/archive.canonical.com lucid Release.gpg [198 B] Err http:/extras.ubuntu.com oneiric Release Hit http:/archive.canonical.com oneiric Release Ign http:/archive.canonical.com oneiric Release Hit http:/archive.canonical.com lucid Release Ign http:/archive.canonical.com lucid Release Ign http:/archive.canonical.com oneiric/partner i386 Packages/DiffIndex Ign http:/archive.canonical.com oneiric/partner TranslationIndex Ign http:/archive.canonical.com lucid/partner i386 Packages/DiffIndex Ign http:/archive.canonical.com lucid/partner TranslationIndex Hit http:/archive.canonical.com oneiric/partner i386 Packages Hit http:/archive.canonical.com lucid/partner i386 Packages Ign http:/dl.google.com stable InRelease Ign http:/archive.canonical.com oneiric/partner Translation-en_IN Ign http:/archive.canonical.com oneiric/partner Translation-en Ign http:/archive.canonical.com lucid/partner Translation-en_IN Ign http:/archive.canonical.com lucid/partner Translation-en Ign http:/in.archive.ubuntu.com oneiric InRelease Ign http:/in.archive.ubuntu.com oneiric-updates InRelease Ign http:/in.archive.ubuntu.com oneiric-security InRelease Get:4 http//dl.google.com stable Release.gpg [198 B] Get:5 http//in.archive.ubuntu.com oneiric Release.gpg [198 B] Get:6 http//in.archive.ubuntu.com oneiric-updates Release.gpg [198 B] Get:7 http//in.archive.ubuntu.com oneiric-security Release.gpg [198 B] Hit http:/in.archive.ubuntu.com oneiric Release Ign http:/in.archive.ubuntu.com oneiric Release Hit http:/in.archive.ubuntu.com oneiric-updates Release Err http:/in.archive.ubuntu.com oneiric-updates Release Hit http:/in.archive.ubuntu.com oneiric-security Release Ign http:/in.archive.ubuntu.com oneiric-security Release Ign http:/in.archive.ubuntu.com oneiric/main i386 Packages/DiffIndex Ign http:/in.archive.ubuntu.com oneiric/restricted i386 Packages/DiffIndex Ign http:/in.archive.ubuntu.com oneiric/universe i386 Packages/DiffIndex Ign http:/in.archive.ubuntu.com oneiric/multiverse i386 Packages/DiffIndex Hit http:/in.archive.ubuntu.com oneiric/main TranslationIndex Hit http:/in.archive.ubuntu.com oneiric/multiverse TranslationIndex Hit http:/in.archive.ubuntu.com oneiric/restricted TranslationIndex Hit http:/in.archive.ubuntu.com oneiric/universe TranslationIndex Ign http:/in.archive.ubuntu.com oneiric-security/main i386 Packages/DiffIndex Ign http:/in.archive.ubuntu.com oneiric-security/restricted i386 Packages/DiffIndex Ign http:/in.archive.ubuntu.com oneiric-security/universe i386 Packages/DiffIndex Ign http:/in.archive.ubuntu.com 
oneiric-security/multiverse i386 Packages/DiffIndex Hit http:/in.archive.ubuntu.com oneiric-security/main TranslationIndex Hit http:/in.archive.ubuntu.com oneiric-security/multiverse TranslationIndex Hit http:/in.archive.ubuntu.com oneiric-security/restricted TranslationIndex Hit http:/in.archive.ubuntu.com oneiric-security/universe TranslationIndex Hit http:/in.archive.ubuntu.com oneiric/main i386 Packages Hit http:/in.archive.ubuntu.com oneiric/restricted i386 Packages Hit http:/in.archive.ubuntu.com oneiric/universe i386 Packages Hit http:/in.archive.ubuntu.com oneiric/multiverse i386 Packages Hit http:/in.archive.ubuntu.com oneiric/main Translation-en Hit http:/in.archive.ubuntu.com oneiric/multiverse Translation-en Hit http:/in.archive.ubuntu.com oneiric/restricted Translation-en Hit http:/in.archive.ubuntu.com oneiric/universe Translation-en Hit http:/in.archive.ubuntu.com oneiric-security/main i386 Packages Hit http:/in.archive.ubuntu.com oneiric-security/restricted i386 Packages Hit http:/in.archive.ubuntu.com oneiric-security/universe i386 Packages Hit http:/in.archive.ubuntu.com oneiric-security/multiverse i386 Packages Hit http:/in.archive.ubuntu.com oneiric-security/main Translation-en Hit http:/in.archive.ubuntu.com oneiric-security/multiverse Translation-en Get:8 http//dl.google.com stable Release [1,347 B] Hit http:/in.archive.ubuntu.com oneiric-security/restricted Translation-en Hit http:/in.archive.ubuntu.com oneiric-security/universe Translation-en Get:9 http//dl.google.com stable/main i386 Packages [1,214 B] Ign http:/dl.google.com stable/main TranslationIndex Ign http:/dl.google.com stable/main Translation-en_IN Ign http:/dl.google.com stable/main Translation-en Fetched 3,821 B in 41s (91 B/s) Reading package lists... Done W: A error occurred during the signature verification. The repository is not updated and the previous index files will be used. GPG error: http:/extras.ubuntu.com oneiric Release: The following signatures were invalid: BADSIG 16126D3A3E5C1192 Ubuntu Extras Archive Automatic Signing Key <[email protected]> W: GPG error: http:/archive.canonical.com oneiric Release: The following signatures were invalid: BADSIG 40976EAF437D05B5 Ubuntu Archive Automatic Signing Key <[email protected]> W: GPG error: http:/archive.canonical.com lucid Release: The following signatures were invalid: BADSIG 40976EAF437D05B5 Ubuntu Archive Automatic Signing Key <[email protected]> W: GPG error: http:/in.archive.ubuntu.com oneiric Release: The following signatures were invalid: BADSIG 40976EAF437D05B5 Ubuntu Archive Automatic Signing Key <[email protected]> W: A error occurred during the signature verification. The repository is not updated and the previous index files will be used. GPG error: http:/in.archive.ubuntu.com oneiric-updates Release: The following signatures were invalid: BADSIG 40976EAF437D05B5 Ubuntu Archive Automatic Signing Key <[email protected]> W: GPG error: http:/in.archive.ubuntu.com oneiric-security Release: The following signatures were invalid: BADSIG 40976EAF437D05B5 Ubuntu Archive Automatic Signing Key <[email protected]> W: Failed to fetch http:/extras.ubuntu.com/ubuntu/dists/oneiric/Release W: Failed to fetch http:/in.archive.ubuntu.com/ubuntu/dists/oneiric-updates/Release W: Some index files failed to download. They have been ignored, or old ones used instead. dipen@EIDLCPU1018:~$ LOG of sudo apt-get upgrade: dipen@EIDLCPU1018:~$ sudo apt-get upgrade Reading package lists... Done Building dependency tree Reading state information... 
Done The following packages have been kept back: ghc6-doc haskell-zlib-doc libghc6-zlib-doc 0 upgraded, 0 newly installed, 0 to remove and 3 not upgraded. dipen@EIDLCPU1018:~$ /etc/apt/sources.list: deb http:/in.archive.ubuntu.com/ubuntu/ oneiric main restricted deb http:/in.archive.ubuntu.com/ubuntu/ oneiric-updates main restricted deb http:/in.archive.ubuntu.com/ubuntu/ oneiric universe deb http:/in.archive.ubuntu.com/ubuntu/ oneiric-updates universe deb http:/in.archive.ubuntu.com/ubuntu/ oneiric multiverse deb http:/in.archive.ubuntu.com/ubuntu/ oneiric-updates multiverse deb http:/archive.canonical.com/ubuntu oneiric partner deb http:/in.archive.ubuntu.com/ubuntu/ oneiric-security main restricted deb http:/in.archive.ubuntu.com/ubuntu/ oneiric-security universe deb http:/in.archive.ubuntu.com/ubuntu/ oneiric-security multiverse deb http:/extras.ubuntu.com/ubuntu oneiric main #Third party developers repository deb http:/archive.canonical.com/ lucid partner

    Read the article

  • Is code maintenance typically a special project, or is it considered part of daily work?

    - by blueberryfields
    Earlier, I asked to find out which tools are commonly used to monitor methods and code bases, to find out whether the methods have been getting too long. Most of the responses there suggested that, beyond maintenance on the method currently being edited, programmers don't, in general, keep an eye on the rest of the code base. So I thought I'd ask the question in general: Is code maintenance, in general, considered part of your daily work? Do you find that you're spending at least some of your time cleaning up, refactoring, rewriting code in the code base, to improve it, as part of your other assigned work? Is it expected of you/do you expect it of your teammates? Or is it more common to find that cleanup, refactoring, and general maintenance on the codebase as a whole, occurs in bursts (for example, mostly as part of code reviews, or as part of refactoring/cleaning up projects)?

    Read the article

  • "(401)Authorization Required" when making a web service call using Axis

    - by Arun P Johny
    Hi, I'm using Apache Axis to connect to my SugarCRM instance. When I try to connect to the instance, it throws the following exception:

        Exception in thread "main" AxisFault
         faultCode: {http://xml.apache.org/axis/}HTTP
         faultSubcode:
         faultString: (401)Authorization Required
         faultActor:
         faultNode:
         faultDetail:
            {}:return code: 401
            <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
            <html><head>
            <title>401 Authorization Required</title>
            </head><body>
            <h1>Authorization Required</h1>
            <p>This server could not verify that you are authorized to access the document requested. Either you supplied the wrong credentials (e.g., bad password), or your browser doesn't understand how to supply the credentials required.</p>
            </body></html>
            {http://xml.apache.org/axis/}HttpErrorCode:401

        (401)Authorization Required
            at org.apache.axis.transport.http.HTTPSender.readFromSocket(HTTPSender.java:744)
            at org.apache.axis.transport.http.HTTPSender.invoke(HTTPSender.java:144)
            at org.apache.axis.strategies.InvocationStrategy.visit(InvocationStrategy.java:32)
            at org.apache.axis.SimpleChain.doVisiting(SimpleChain.java:118)
            at org.apache.axis.SimpleChain.invoke(SimpleChain.java:83)
            at org.apache.axis.client.AxisClient.invoke(AxisClient.java:165)
            at org.apache.axis.client.Call.invokeEngine(Call.java:2784)
            at org.apache.axis.client.Call.invoke(Call.java:2767)
            at org.apache.axis.client.Call.invoke(Call.java:2443)
            at org.apache.axis.client.Call.invoke(Call.java:2366)
            at org.apache.axis.client.Call.invoke(Call.java:1812)
            at org.beanizer.sugarcrm.SugarsoapBindingStub.get_server_info(SugarsoapBindingStub.java:1115)
            at com.greytip.sugarcrm.GreytipCrm.main(GreytipCrm.java:42)

    This basically says that I do not have authorization for the resource. The same code works fine in my testing environment:

        Sugarsoap service = new SugarsoapLocator();
        SugarsoapPortType port = service.getsugarsoapPort(new java.net.URL(SUGAR_CRM_LOCATION + "/soap.php"));
        System.out.println(port.get_server_info().getVersion());

        User_auth userAuth = new User_auth();
        userAuth.setUser_name("user_name");
        MessageDigest md = MessageDigest.getInstance("MD5");
        String password = getHexString(md.digest("password".getBytes()));
        userAuth.setPassword(password);
        // userAuth.setVersion("0.1");

        Entry_value login = port.login(userAuth, "myAppName", null);
        String sessionID = login.getId();

    The code above is used to connect to the SugarCRM installation; the line "System.out.println(port.get_server_info().getVersion());" is the one throwing the exception. One difference I noticed between the test and production environments: when I open the SOAP URL in a browser, the production site pops up an 'Authentication Required' dialog. When I give my proxy username and password in this popup, it shows the SOAP request details. The same applies to the login URL: first it asks for authentication, then it takes me to the SugarCRM login page. Is this a server security setting? If it is, how do I supply this username and password from Java when making the web service call? The authentication popup is the same kind you get when you try to access the Tomcat manager through a browser. Thanks
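
    One way to confirm whether the front-end web server itself (rather than SugarCRM) is demanding HTTP Basic credentials on soap.php - which would explain both the browser popup and the Axis 401 - is a quick probe from a script. A hedged Python sketch; the host and credentials are placeholders:

        import base64
        import http.client

        HOST, PATH = "crm.example.com", "/soap.php"    # placeholder production host

        # Unauthenticated request: a 401 with a Basic challenge here means the web
        # server in front of SugarCRM wants credentials before the SOAP layer is reached.
        conn = http.client.HTTPConnection(HOST)
        conn.request("GET", PATH)
        resp = conn.getresponse()
        print(resp.status, resp.getheader("WWW-Authenticate"))
        conn.close()

        # Same request with the credentials that work in the browser popup (placeholders).
        token = base64.b64encode(b"webuser:webpass").decode("ascii")
        conn = http.client.HTTPConnection(HOST)
        conn.request("GET", PATH, headers={"Authorization": "Basic " + token})
        resp = conn.getresponse()
        print(resp.status, resp.reason)
        conn.close()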

    Read the article

  • Problem with apt-get update: failed to fetch error

    - by user171447
    I run an Ubuntu Server 12.04.3 LTS. Today, I wanted to update it, but I did not managed it (yes...), however upgrading worked well. I don't want you to solve my problem but it would be greatful if you could give me some hints. I googled hours, I fould a lot of this kind of errors, but not exactly this. Here is the output of apt-get update: Hit http://filepile.fastit.net precise Release.gpg Hit http://filepile.fastit.net precise Release Hit http://filepile.fastit.net precise/main amd64 Packages Hit http://filepile.fastit.net precise/restricted amd64 Packages Hit http://filepile.fastit.net precise/universe amd64 Packages Hit http://filepile.fastit.net precise/multiverse amd64 Packages Hit http://filepile.fastit.net precise/main i386 Packages Hit http://filepile.fastit.net precise/restricted i386 Packages Hit http://filepile.fastit.net precise/universe i386 Packages Hit http://filepile.fastit.net precise/multiverse i386 Packages Ign http://filepile.fastit.net precise/main TranslationIndex Ign http://filepile.fastit.net precise/multiverse TranslationIndex Ign http://filepile.fastit.net precise/restricted TranslationIndex Ign http://filepile.fastit.net precise/universe TranslationIndex Ign http://filepile.fastit.net precise/main Translation-en_GB Ign http://filepile.fastit.net precise/main Translation-en Ign http://filepile.fastit.net precise/main Translation-en_GB.UTF-8 Hit http://archive.canonical.com precise Release.gpg Ign http://filepile.fastit.net precise/multiverse Translation-en_GB Ign http://filepile.fastit.net precise/multiverse Translation-en Ign http://filepile.fastit.net precise/multiverse Translation-en_GB.UTF-8 Ign http://filepile.fastit.net precise/restricted Translation-en_GB Ign http://filepile.fastit.net precise/restricted Translation-en Ign http://filepile.fastit.net precise/restricted Translation-en_GB.UTF-8 Ign http://filepile.fastit.net precise/universe Translation-en_GB Ign http://filepile.fastit.net precise/universe Translation-en Ign http://filepile.fastit.net precise/universe Translation-en_GB.UTF-8 Hit http://archive.ubuntu.com precise Release.gpg Hit http://archive.canonical.com precise Release Hit http://archive.ubuntu.com precise Release Hit http://archive.ubuntu.com precise/main Sources Hit http://archive.ubuntu.com precise/restricted Sources Hit http://archive.ubuntu.com precise/main i386 Packages Hit http://archive.ubuntu.com precise/restricted i386 Packages Hit http://archive.ubuntu.com precise/multiverse i386 Packages Hit http://archive.ubuntu.com precise/main TranslationIndex Hit http://archive.ubuntu.com precise/multiverse TranslationIndex Hit http://archive.ubuntu.com precise/restricted TranslationIndex Hit http://archive.ubuntu.com precise/main Translation-en_GB Hit http://archive.ubuntu.com precise/main Translation-en Hit http://archive.ubuntu.com precise/multiverse Translation-en_GB Hit http://archive.ubuntu.com precise/multiverse Translation-en Hit http://archive.ubuntu.com precise/restricted Translation-en_GB Hit http://archive.ubuntu.com precise/restricted Translation-en Hit http://fr.archive.ubuntu.com precise-security Release.gpg Hit http://fr.archive.ubuntu.com precise-updates Release.gpg Hit http://fr.archive.ubuntu.com precise-security Release Hit http://fr.archive.ubuntu.com precise-updates Release Hit http://fr.archive.ubuntu.com precise-security/main amd64 Packages Hit http://fr.archive.ubuntu.com precise-security/restricted amd64 Packages Hit http://fr.archive.ubuntu.com precise-security/universe amd64 Packages Hit 
http://fr.archive.ubuntu.com precise-security/multiverse amd64 Packages :W: Failed to fetch http://archive.canonical.com/dists/precise/Release Unable to find expected entry 'main/binary-amd64/Packages' in Release file (Wrong sources.list entry or malformed file) E: Some index files failed to download. They have been ignored, or old ones used instead. Hit http://fr.archive.ubuntu.com precise-security/main i386 Packages Hit http://fr.archive.ubuntu.com precise-security/restricted i386 Packages Hit http://fr.archive.ubuntu.com precise-security/universe i386 Packages Hit http://fr.archive.ubuntu.com precise-security/multiverse i386 Packages Hit http://fr.archive.ubuntu.com precise-security/main TranslationIndex Hit http://fr.archive.ubuntu.com precise-security/multiverse TranslationIndex Hit http://fr.archive.ubuntu.com precise-security/restricted TranslationIndex Hit http://fr.archive.ubuntu.com precise-security/universe TranslationIndex Hit http://fr.archive.ubuntu.com precise-updates/main amd64 Packages Hit http://fr.archive.ubuntu.com precise-updates/restricted amd64 Packages Hit http://fr.archive.ubuntu.com precise-updates/universe amd64 Packages Hit http://fr.archive.ubuntu.com precise-updates/multiverse amd64 Packages Hit http://fr.archive.ubuntu.com precise-updates/main i386 Packages Hit http://fr.archive.ubuntu.com precise-updates/restricted i386 Packages Hit http://fr.archive.ubuntu.com precise-updates/universe i386 Packages Hit http://fr.archive.ubuntu.com precise-updates/multiverse i386 Packages Hit http://fr.archive.ubuntu.com precise-updates/main TranslationIndex Hit http://fr.archive.ubuntu.com precise-updates/multiverse TranslationIndex Hit http://fr.archive.ubuntu.com precise-updates/restricted TranslationIndex Hit http://fr.archive.ubuntu.com precise-updates/universe TranslationIndex Hit http://fr.archive.ubuntu.com precise-security/main Translation-en Hit http://fr.archive.ubuntu.com precise-security/multiverse Translation-en Hit http://fr.archive.ubuntu.com precise-security/restricted Translation-en Hit http://fr.archive.ubuntu.com precise-security/universe Translation-en Hit http://fr.archive.ubuntu.com precise-updates/main Translation-en_GB Hit http://fr.archive.ubuntu.com precise-updates/main Translation-en Hit http://fr.archive.ubuntu.com precise-updates/multiverse Translation-en_GB Hit http://fr.archive.ubuntu.com precise-updates/multiverse Translation-en Hit http://fr.archive.ubuntu.com precise-updates/restricted Translation-en_GB Hit http://fr.archive.ubuntu.com precise-updates/restricted Translation-en Hit http://fr.archive.ubuntu.com precise-updates/universe Translation-en_GB Hit http://fr.archive.ubuntu.com precise-updates/universe Translation-en And here is my /etc/apt/sources.list: ###### Ubuntu Main Repos deb http://filepile.fastit.net/ubuntu/ precise main restricted universe multiverse # deb http://de.archive.ubuntu.com/ubuntu/ precise main restricted universe multiverse deb http://archive.canonical.com/ precise main restricted universe multiverse ###### Ubuntu Update Repos deb http://fr.archive.ubuntu.com/ubuntu/ precise-security main restricted universe multiverse deb http://fr.archive.ubuntu.com/ubuntu/ precise-updates main restricted universe multiverse deb http://archive.ubuntu.com/ubuntu/ precise main restricted multiverse deb-src http://archive.ubuntu.com/ubuntu precise main restricted multiverse Thanks for your help!

    Read the article

  • 401 - Unauthorized On Server 2008 R2 IIS 7.5

    - by mxmissile
    I have a web application deployed to Server 2008 IIS 7.5 box. From remote it gives this error: 401 - Unauthorized: Access is denied due to invalid credentials. (remote = desktops on the same LAN) Have tried several remote clients using different browsers, all the same result. (IE, FF, and Chrome) Hitting the application from the desktop of the server itself works flawlessly. The application is using Anonymous Authentication. The application is written in .NET 4.0 Asp.Net using the MVC framework. Sysinternals procmon returns these 2 results for each request: FAST IO DISALLOWED and PATH NOT FOUND. I have 2 other MVC apps running fine on the same server. I have checked the security on the folders and they all match. App runs fine on a Server 2008 IIS 7.0 box. Nothing shows up in the Event log on the server related to this. Pulling my hair out here, any troubleshooting tips?

    Read the article

  • Reporting Services Returning HTTP 401 Unauthorized

    - by Chris Arnold
    I have just ported an existing ASP.NET application to a new web server (Windows Server 2008 R2 and SQL Server 2008). It is successfully running on 4 other servers of varying O/S (which I also set up). My ASP.NET app calls into the Reporting Services Web Service (ReportExecution2005.asmx) to generate a report and save it as a PDF to the file system. I consistently receive "System.Net.WebException - The request failed with HTTP status 401: Unauthorized." In utter desperation I have done the following:

    - Granted all users complete access to SSRS via the Reports web page.
    - Granted all users 'Full control' to %ProgramFiles%\Microsoft SQL Server\MSRS10.MSSQLSERVER

    I am not a network/server specialist, but I'm the only one who can deal with this and it's driving me batty. Help!

    Read the article

  • Tomcat repeated 401 and the client nonce cache

    - by PaulNBN
    I've got a Tomcat 6.0.35 service with a SOAP-based webapp protected by Digest authentication. We have been seeing various users get repeated 401 responses since we upgraded to 6.0.35. Additionally, we are getting the following entries in the Catalina log:

        WARNING: A valid entry has been removed from client nonce cache to make room for new entries. A replay attack is now possible. To prevent the possibility of replay attacks, reduce nonceValidity or increase cnonceCacheSize. Further warnings of this type will be suppressed for 5 minutes.

    Any idea what is going on?

    Read the article

  • Could 11.5 Million 401's be causing bottlenecks?

    - by roviuser
    I'm going to preface this with a warning: my knowledge about servers and networking is VERY limited, and if you provide me with technical answers, I probably won't understand much until I research your answer further. I'm trying to expand my knowledge and learn about it, though. If the information that I am able to provide in this question is insufficient to answer the question, I understand, and it can be closed. We have a SharePoint 2007 system that is extremely slow, mostly from huge amounts of use. We've been told that the main speed bottleneck is access to the SQL databases. However, they do provide a statistics dashboard, so I did some poking around and noticed that we have 11.5 million or more 401 (access denied) errors every month. Could this be causing major speed/performance decreases? Authentication for SharePoint uses Active Directory.

    Read the article

  • HTTP 1.0 vs 1.1

    - by Jason Baker
    Could somebody give me a brief overview of the differences between HTTP 1.0 and HTTP 1.1? I've spent some time with both of the RFCs, but haven't been able to pull out a lot of differences between them. Wikipedia says this:

        HTTP/1.1 (1997-1999) Current version; persistent connections enabled by default and works well with proxies. Also supports request pipelining, allowing multiple requests to be sent at the same time, allowing the server to prepare for the workload and potentially transfer the requested resources more quickly to the client.

    But that doesn't mean a lot to me. I realize this is a somewhat complicated subject, so I'm not expecting a full answer - I'm looking for the differences at a slightly lower level, meaning the information I would need in order to implement either an HTTP server or application. Really, I just want a nudge in the right direction so that I can figure it out on my own.
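
    Two of the most visible low-level differences - the mandatory Host header and persistent connections - can be seen by sending raw requests by hand. A small Python sketch, using example.com as a placeholder:

        import socket

        def raw_request(host, request_bytes):
            # Send a hand-written HTTP request and return everything the server sends back.
            with socket.create_connection((host, 80), timeout=10) as sock:
                sock.sendall(request_bytes)
                chunks = []
                while True:
                    data = sock.recv(4096)
                    if not data:
                        break
                    chunks.append(data)
            return b"".join(chunks)

        host = "example.com"   # placeholder host

        # HTTP/1.0: no Host header required; the server closes the connection
        # after the response, which is why reading until EOF works.
        r10 = raw_request(host, b"GET / HTTP/1.0\r\n\r\n")
        print(r10.split(b"\r\n", 1)[0])

        # HTTP/1.1: Host is mandatory, and the connection is persistent by default,
        # so we explicitly ask the server to close it to avoid waiting on the socket.
        r11 = raw_request(host, b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        print(r11.split(b"\r\n", 1)[0])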

    Read the article

  • Upgrading from 12.10 to 13.04 -> dpkg: error processing sudo (--configure)

    - by Korrigan Nagirrok
    Here's the deal and reason I'm asking for your help. Last night I went on upgrading my Xubuntu 12.10 installation to 13.04, so at tty1 I run the command sudo do-release-upgrade and everything seemed to went well except that after rebooting and when I run sudo apt-get update && sudo apt-get upgrade I get this error: sudo apt-get update && sudo apt-get upgrade Hit http://pt.archive.ubuntu.com raring Release.gpg Hit http://pt.archive.ubuntu.com raring-updates Release.gpg Hit http://dl.google.com stable Release.gpg Hit http://pt.archive.ubuntu.com raring-backports Release.gpg Hit http://pt.archive.ubuntu.com raring Release Hit http://archive.canonical.com raring Release.gpg Hit http://ppa.launchpad.net raring Release.gpg Hit http://pt.archive.ubuntu.com raring-updates Release Hit http://extras.ubuntu.com raring Release.gpg Hit http://pt.archive.ubuntu.com raring-backports Release Hit http://dl.google.com stable Release Hit http://pt.archive.ubuntu.com raring/main Sources Hit http://pt.archive.ubuntu.com raring/restricted Sources Hit http://extras.ubuntu.com raring Release Hit http://archive.canonical.com raring Release Hit http://ppa.launchpad.net raring Release.gpg Hit http://pt.archive.ubuntu.com raring/universe Sources Hit http://pt.archive.ubuntu.com raring/multiverse Sources Hit http://dl.google.com stable/main i386 Packages Get:1 http://security.ubuntu.com raring-security Release.gpg [933 B] Hit http://pt.archive.ubuntu.com raring/main i386 Packages Hit http://extras.ubuntu.com raring/main Sources Hit http://ppa.launchpad.net raring Release Hit http://archive.canonical.com raring/partner i386 Packages Hit http://pt.archive.ubuntu.com raring/restricted i386 Packages Hit http://pt.archive.ubuntu.com raring/universe i386 Packages Hit http://extras.ubuntu.com raring/main i386 Packages Hit http://pt.archive.ubuntu.com raring/multiverse i386 Packages Hit http://ppa.launchpad.net raring Release Hit http://pt.archive.ubuntu.com raring/main Translation-en Hit http://ppa.launchpad.net raring/main Sources Hit http://ppa.launchpad.net raring/main i386 Packages Hit http://pt.archive.ubuntu.com raring/multiverse Translation-en Hit http://pt.archive.ubuntu.com raring/restricted Translation-en Hit http://pt.archive.ubuntu.com raring/universe Translation-en Hit http://pt.archive.ubuntu.com raring-updates/main Sources Hit http://pt.archive.ubuntu.com raring-updates/restricted Sources Hit http://ppa.launchpad.net raring/main Sources Hit http://pt.archive.ubuntu.com raring-updates/universe Sources Hit http://pt.archive.ubuntu.com raring-updates/multiverse Sources Hit http://pt.archive.ubuntu.com raring-updates/main i386 Packages Hit http://ppa.launchpad.net raring/main i386 Packages Hit http://pt.archive.ubuntu.com raring-updates/restricted i386 Packages Hit http://pt.archive.ubuntu.com raring-updates/universe i386 Packages Hit http://pt.archive.ubuntu.com raring-updates/multiverse i386 Packages Ign http://dl.google.com stable/main Translation-en_US Hit http://pt.archive.ubuntu.com raring-updates/main Translation-en Ign http://archive.canonical.com raring/partner Translation-en_US Ign http://extras.ubuntu.com raring/main Translation-en_US Ign http://dl.google.com stable/main Translation-en Ign http://archive.canonical.com raring/partner Translation-en Hit http://pt.archive.ubuntu.com raring-updates/multiverse Translation-en Ign http://extras.ubuntu.com raring/main Translation-en Hit http://pt.archive.ubuntu.com raring-updates/restricted Translation-en Hit http://pt.archive.ubuntu.com 
raring-updates/universe Translation-en Hit http://pt.archive.ubuntu.com raring-backports/main Sources Hit http://pt.archive.ubuntu.com raring-backports/restricted Sources Hit http://pt.archive.ubuntu.com raring-backports/universe Sources Hit http://pt.archive.ubuntu.com raring-backports/multiverse Sources Hit http://pt.archive.ubuntu.com raring-backports/main i386 Packages Hit http://pt.archive.ubuntu.com raring-backports/restricted i386 Packages Hit http://pt.archive.ubuntu.com raring-backports/universe i386 Packages Hit http://pt.archive.ubuntu.com raring-backports/multiverse i386 Packages Hit http://pt.archive.ubuntu.com raring-backports/main Translation-en Hit http://pt.archive.ubuntu.com raring-backports/multiverse Translation-en Get:2 http://security.ubuntu.com raring-security Release [40.8 kB] Hit http://pt.archive.ubuntu.com raring-backports/restricted Translation-en Hit http://pt.archive.ubuntu.com raring-backports/universe Translation-en Ign http://ppa.launchpad.net raring/main Translation-en_US Ign http://ppa.launchpad.net raring/main Translation-en Get:3 http://security.ubuntu.com raring-security/main Sources [2,109 B] Ign http://ppa.launchpad.net raring/main Translation-en_US Ign http://ppa.launchpad.net raring/main Translation-en Get:4 http://security.ubuntu.com raring-security/restricted Sources [14 B] Get:5 http://security.ubuntu.com raring-security/universe Sources [14 B] Get:6 http://security.ubuntu.com raring-security/multiverse Sources [14 B] Get:7 http://security.ubuntu.com raring-security/main i386 Packages [3,670 B] Get:8 http://security.ubuntu.com raring-security/restricted i386 Packages [14 B] Get:9 http://security.ubuntu.com raring-security/universe i386 Packages [2,824 B] Get:10 http://security.ubuntu.com raring-security/multiverse i386 Packages [14 B] Ign http://pt.archive.ubuntu.com raring/main Translation-en_US Ign http://pt.archive.ubuntu.com raring/multiverse Translation-en_US Ign http://pt.archive.ubuntu.com raring/restricted Translation-en_US Ign http://pt.archive.ubuntu.com raring/universe Translation-en_US Ign http://pt.archive.ubuntu.com raring-updates/main Translation-en_US Ign http://pt.archive.ubuntu.com raring-updates/multiverse Translation-en_US Hit http://security.ubuntu.com raring-security/main Translation-en Ign http://pt.archive.ubuntu.com raring-updates/restricted Translation-en_US Ign http://pt.archive.ubuntu.com raring-updates/universe Translation-en_US Ign http://pt.archive.ubuntu.com raring-backports/main Translation-en_US Ign http://pt.archive.ubuntu.com raring-backports/multiverse Translation-en_US Ign http://pt.archive.ubuntu.com raring-backports/restricted Translation-en_US Hit http://security.ubuntu.com raring-security/multiverse Translation-en Ign http://pt.archive.ubuntu.com raring-backports/universe Translation-en_US Hit http://security.ubuntu.com raring-security/restricted Translation-en Hit http://security.ubuntu.com raring-security/universe Translation-en Ign http://security.ubuntu.com raring-security/main Translation-en_US Ign http://security.ubuntu.com raring-security/multiverse Translation-en_US Ign http://security.ubuntu.com raring-security/restricted Translation-en_US Ign http://security.ubuntu.com raring-security/universe Translation-en_US Fetched 50.4 kB in 6s (7,454 B/s) Reading package lists... Done Reading package lists... Done Building dependency tree Reading state information... Done 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 2 not fully installed or removed. Need to get 0 B/373 kB of archives. 
After this operation, 0 B of additional disk space will be used. Do you want to continue [Y/n]? Y dpkg: error processing sudo (--configure): Package is in a very bad inconsistent state - you should reinstall it before attempting configuration. No apport report written because MaxReports is reached already dpkg: dependency problems prevent configuration of ubuntu-minimal: ubuntu-minimal depends on sudo; however: Package sudo is not configured yet. dpkg: error processing ubuntu-minimal (--configure): dependency problems - leaving unconfigured No apport report written because MaxReports is reached already Errors were encountered while processing: sudo ubuntu-minimal E: Sub-process /usr/bin/dpkg returned an error code (1) I've tried everything I thought logical, like sudo dpkg --configure -a dpkg: error processing sudo (--configure): Package is in a very bad inconsistent state - you should reinstall it before attempting configuration. dpkg: dependency problems prevent configuration of ubuntu-minimal: ubuntu-minimal depends on sudo; however: Package sudo is not configured yet. dpkg: error processing ubuntu-minimal (--configure): dependency problems - leaving unconfigured Errors were encountered while processing: sudo ubuntu-minimal sudo apt-get install -f Reading package lists... Done Building dependency tree Reading state information... Done 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 2 not fully installed or removed. Need to get 0 B/373 kB of archives. After this operation, 0 B of additional disk space will be used. dpkg: error processing sudo (--configure): Package is in a very bad inconsistent state - you should reinstall it before attempting configuration. dpkg: dependency problems prevent configuration of ubuntu-minimal: ubuntu-minimal depends on sudo; however: Package sudo is not configured yet. dpkg: error processing ubuntu-minimal (--configure): dependency problems - leaving unconfigured No apport report written because MaxReports is reached already No apport report written because MaxReports is reached already Errors were encountered while processing: sudo ubuntu-minimal E: Sub-process /usr/bin/dpkg returned an error code (1) Can someone help me, please. Edit: Here's some more info that could be of help for anyone. The output of apt-cache policy linux-image-generic-pae linux-generic-pae is linux-image-generic-pae: Installed: (none) Candidate: 3.8.0.19.35 Version table: 3.8.0.19.35 0 500 http://pt.archive.ubuntu.com/ubuntu/ raring/main i386 Packages linux-generic-pae: Installed: (none) Candidate: 3.8.0.19.35 Version table: 3.8.0.19.35 0 500 http://pt.archive.ubuntu.com/ubuntu/ raring/main i386 Packages

    Read the article

  • Git push over http (using git-http-backend) and Apache is not working

    - by Ole_Brun
    I have desperately been trying to get git push working through "smart HTTP" mode using git-http-backend. However, after many hours of testing and troubleshooting, I am still left with:

        error: Cannot access URL http://localhost/git/hello.git/, return code 22
        fatal: git-http-push failed

    I am using the latest versions of Ubuntu (12.04), Apache2 (2.2.22) and Git (1.7.9.5) and have followed different tutorials found on the Internet, like this one: http://www.parallelsymmetry.com/howto/git.jsp. My VHost file currently looks like this:

        <VirtualHost *:80>
          SetEnv GIT_PROJECT_ROOT /var/www/git
          SetEnv GIT_HTTP_EXPORT_ALL
          SetEnv REMOTE_USER=$REDIRECT_REMOTE_USER
          DocumentRoot /var/www/git

          ScriptAliasMatch \
            "(?x)^/(.*?)\.git/(HEAD | \
            info/refs | \
            objects/info/[^/]+ | \
            git-(upload|receive)-pack)$" \
            /usr/lib/git-core/git-http-backend/$1/$2

          <Directory /var/www/git>
            Options +ExecCGI +SymLinksIfOwnerMatch -MultiViews
            AllowOverride None
            Order allow,deny
            allow from all
          </Directory>
        </VirtualHost>

    I have changed the ownership of the /var/www/git folder to root.www-data, and for my test repositories I have enabled anonymous push by doing git config http.receivepack true. I have also tried with authenticated users, with the same outcome. The repositories were created using:

        sudo git init --bare --shared [repo-name]

    Looking at the apache2 access.log, it appears to me that WebDAV is being used and that git-http-backend is never invoked:

        127.0.0.1 - - [20/May/2012:23:04:53 +0200] "GET /git/hello.git/info/refs?service=git-receive-pack HTTP/1.1" 200 207 "-" "git/1.7.9.5"
        127.0.0.1 - - [20/May/2012:23:04:53 +0200] "GET /git/hello.git/HEAD HTTP/1.1" 200 232 "-" "git/1.7.9.5"
        127.0.0.1 - - [20/May/2012:23:04:53 +0200] "PROPFIND /git/hello.git/ HTTP/1.1" 405 563 "-" "git/1.7.9.5"

    What am I doing wrong? Is it an issue with the version of git and/or Apache that I am using, perhaps? BTW: I have read all the git-over-http related questions on ServerFault and StackOverflow, and none of them provided me with a solution, so please don't mark this as a duplicate.
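
    A quick check, independent of the git client, is whether the info/refs endpoint is really being answered by git-http-backend: the smart HTTP protocol advertises itself with a Content-Type of application/x-git-receive-pack-advertisement, whereas a plain file or DAV response comes back as something like text/plain. A small Python sketch against the URL from the question:

        import http.client

        conn = http.client.HTTPConnection("localhost")
        conn.request("GET", "/git/hello.git/info/refs?service=git-receive-pack")
        resp = conn.getresponse()

        print(resp.status, resp.reason)
        # Smart HTTP (git-http-backend) announces itself in the Content-Type;
        # "text/plain" here suggests the ScriptAliasMatch CGI rule never matched.
        print("Content-Type:", resp.getheader("Content-Type"))
        print(resp.read(200))
        conn.close()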

    Read the article

  • Copyrights concerning code snippets and larger amounts of code

    - by JustcallmeDrago
    I am designing a public code repository. Users will be allowed to post and edit whatever amount of code they want, from code snippets to entire multi-file projects. I have a few major legal concerns about this:

    - Not getting sued/shut down - I feel the site would be a much easier target than tracking down an individual user to sue. I have looked around a bit and see that links to legal info in the footer of each page are common. What specific things should I do--and what does a site such as YouTube (where I see copyrighted material all the time) do--for protection?
    - Citing sources and editing sourced code - If a user wants to post code that isn't theirs, what concerns/safeguards should I have? Will a link suffice, and what do I need beyond that to allow the code to be edited (to improve it, for example)? What can happen if a user posts copyrighted code without citing it?
    - Large chunks of code - What legal differences should I look out for as the amount grows?
    - Not having a mess of licenses for the site - I would like to have a single license (like RosettaCode) that keeps things simple for interaction on the site. I want the code to be postable and editable. I have looked into StackOverflow's Creative Commons license a little, and it says that "If you alter, transform, or build upon this work, you may distribute the resulting work only under the same or similar license to this one." And on RosettaCode: "All software found on Rosetta Code should be considered potentially hazardous. Use at your own risk. Be aware that all code on Rosetta Code is under the GNU Free Documentation License, as are any edits made by contributors. See Rosetta Code:Copyrights for details." What other licenses are like this?
    - Commercializing the site - In what ways can I and can't I make money off of a site that contains code like this? All code will be publicly visible. Initial thoughts are having ads or charging for advanced features.

    Read the article

  • New Trusted Status awarded to first Mobile Java Developer

    - by Jacob Lehrbaum
    Java Verified has just announced that Gameloft is the first developer to receive its new Trusted Status! Java Verified is an industry-recognized Java testing and signing program backed and funded by companies such as AT&T, LG, Motorola, Nokia, Oracle, Orange, Samsung and Vodafone, and chartered with making it easier for mobile developers to certify and deploy applications for use across the billions of mobile handsets that run Java ME. Because of its breadth and diversity, Java ME provides an unmatched opportunity to reach more than 3 billion consumers, but at the same time developers are faced with the challenge of working with multiple distribution channels and a range of handsets. To this end, the Java Verified program provides a suite of tests that help to validate identity, functionality, integrity, and quality. Since its rebirth in 2010 as an independent organization, the Java Verified program has been actively working to make it even easier to create and distribute Java ME apps. Example initiatives include updates to the Unified Testing Criteria (UTC) to make it easier to test "Simple Apps," community outreach to better understand and address developer pain points, and a new "Trusted Status." In the words of the Java Verified program, Trusted Status is:

        a privileged status to be granted to developers who will have proven that the quality of their Java ME apps is of a consistently high standard. These are developers who will have earned the trust of Java Verified by demonstrating unfailingly that testing to the UTC standard is a crucial part of their product development activity

    The first developer to be awarded this status is Gameloft. By achieving Trusted Status, Gameloft can now test its applications to the Java Verified standard without needing to provide Java Verified with the evidence. The apps then automatically get signed with the Java Verified signature, enabling Gameloft to benefit from reduced costs and time-to-market for its new Java ME applications from here on out. Learn more about the exciting news or apply now for Trusted Status!

    Read the article

  • Getting 401 when using client certificate with IIS 7.5

    - by Jacob
    I'm trying to configure a web site hosted under IIS 7.5 so that requests to a specific location require client certificate authentication. With my current setup, I still get a "401 - Unauthorized: Access is denied due to invalid credentials" when accessing the location with my client cert. Here's the web.config fragment that sets things up:
    <location path="MyWebService.asmx">
      <system.webServer>
        <security>
          <access sslFlags="Ssl, SslNegotiateCert"/>
          <authentication>
            <windowsAuthentication enabled="false"/>
            <anonymousAuthentication enabled="false"/>
            <digestAuthentication enabled="false"/>
            <basicAuthentication enabled="false"/>
            <iisClientCertificateMappingAuthentication enabled="true" oneToOneCertificateMappingsEnabled="true">
              <oneToOneMappings>
                <add enabled="true" certificate="MIICFDCCAYGgAwIBAgIQ+I0z6z8OWqpBIJt2lJHi6jAJBgUrDgMCHQUAMCQxIjAgBgNVBAMTGURldiBDZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHhcNMTAxMjI5MjI1ODE0WhcNMzkxMjMxMjM1OTU5WjAaMRgwFgYDVQQDEw9kZXYgY2xpZW50IGNlcnQwgZ8wDQYJKoZIhvcNAQEBBQADgY0AMIGJAoGBANJi10hI+Zt0OuNr6eduiUe6WwPtyMxh+hZtr/7eY3YezeJHC95Z+NqJCAW0n+ODHOsbkd3DuyK1YV+nKzyeGAJBDSFNdaMSnMtR6hQG47xKgtUphPFBKe64XXTG+ueQHkzOHmGuyHHD1fSli62i2V+NMG1SQqW9ed8NBN+lmqWZAgMBAAGjWTBXMFUGA1UdAQROMEyAENGUhUP+dENeJJ1nw3gR0NahJjAkMSIwIAYDVQQDExlEZXYgQ2VydGlmaWNhdGUgQXV0aG9yaXR5ghB6CLh2g6i5ikrpVODj8CpBMAkGBSsOAwIdBQADgYEAwwHjpVNWddgEY17i1kyG4gKxSTq0F3CMf1AdWVRUbNvJc+O68vcRaWEBZDo99MESIUjmNhjXxk4LDuvV1buPpwQmPbhb6mkm0BNIISapVP/cK0Htu4bbjYAraT6JP5Km5qZCc0iHZQJZuch7Uy6G9kXQXaweJMiHL06+GHx355Y="/>
              </oneToOneMappings>
            </iisClientCertificateMappingAuthentication>
          </authentication>
        </security>
      </system.webServer>
    </location>
    The client certificate I'm using in my web browser matches what I've placed in the web.config. What am I doing wrong here?
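    One thing worth double-checking is that the value of the certificate attribute really is the base64 of the client certificate's raw DER bytes (no PEM header/footer, no line breaks); if it differs, the one-to-one mapping cannot match and the request stays unauthorized. The following is a minimal sketch, not the definitive troubleshooting step, and it assumes the client certificate has been exported from the browser as a DER-encoded .cer file; the path below is hypothetical:
    import base64

    # Hypothetical path: the client certificate exported from the browser
    # as a DER-encoded .cer file (public certificate only).
    CERT_PATH = r"C:\temp\dev-client-cert.cer"

    with open(CERT_PATH, "rb") as f:
        der_bytes = f.read()

    # Print the base64 of the raw DER bytes, with no line breaks, so it can
    # be compared against the certificate attribute in web.config.
    print(base64.b64encode(der_bytes).decode("ascii"))
    If the printed string matches the web.config value, the mapping itself is probably fine and attention shifts to whether the security and authentication sections are actually being applied for that location.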

    Read the article

  • LDAP Authentication fails with 500 or 401 depending on bind for Apache2

    - by Erik
    I'm setting up LDAP authentication for our Subversion repository hosted through Apache on a RHEL 5 system. I run into two different issues when I try to authenticate against Active Directory.
    <Location /svn/>
      Dav svn
      SvnParentPath /srv/subversion
      SVNListParentPath On
      AuthType Basic
      AuthName "Subversion Repository"
      AuthBasicProvider ldap
      AuthLDAPBindDN "cn=userfoo,ou=Service Accounts,ou=User Accounts,dc=my,dc=example,dc=com"
      AuthLDAPBindPassword "mypass"
      AuthLDAPUrl "ldap://my.example.com:389/ou=User Accounts,dc=my,dc=example,dc=com?sAMAccountName?sub?(objectClass=user)" NONE
      Require valid-user
    </Location>
    If I use the above configuration it continually prompts me with the Basic prompt and I have to eventually select Cancel, which returns a 401 (Authorization Required). If I comment out the bind parts it returns 500 (Internal Server Error), griping that authentication failed:
    [Mon Nov 02 12:00:00 2009] [warn] [client x.x.x.x] [10744] auth_ldap authenticate: user myuser authentication failed; URI /svn [ldap_search_ext_s() for user failed][Operations error]
    When I perform the bind using ldapsearch and filter for a simple attribute it returns correctly:
    ldapsearch -h my.example.com -p 389 -D "cn=userfoo,ou=Service Accounts,ou=User Accounts,dc=my,dc=example,dc=com" -b "ou=User Accounts,dc=my,dc=example,dc=com" -w - "&(objectClass=user)(cn=myuser)" sAMAccountName
    Unfortunately I have no control or insight into the AD part of the system, only the RHEL server. Does anyone know what the hang up is here?
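    The "Operations error" from ldap_search_ext_s() is what Active Directory commonly returns when a search reaches it without an authenticated bind, which hints that the bind directives may not be taking effect for this location. One way to confirm that the bind DN, password, and filter are sound independently of Apache is the sketch below; it assumes the third-party Python ldap3 package is available (an assumption, not part of the question's setup) and reuses the placeholder values from the configuration:
    from ldap3 import Server, Connection, ALL

    # Placeholder values copied from the question; substitute real credentials.
    BIND_DN = "cn=userfoo,ou=Service Accounts,ou=User Accounts,dc=my,dc=example,dc=com"
    BIND_PW = "mypass"
    BASE_DN = "ou=User Accounts,dc=my,dc=example,dc=com"

    server = Server("my.example.com", port=389, get_info=ALL)

    # Bind with the service account, then run the same kind of subtree search
    # that mod_authnz_ldap performs when a user tries to authenticate.
    conn = Connection(server, user=BIND_DN, password=BIND_PW, auto_bind=True)
    conn.search(BASE_DN,
                "(&(objectClass=user)(sAMAccountName=myuser))",
                attributes=["sAMAccountName", "distinguishedName"])
    for entry in conn.entries:
        print(entry)
    conn.unbind()
    If this bind-then-search succeeds, the credentials and filter are fine and the focus moves back to the Apache side (which modules are loaded, which provider actually handles the request, and whether the AuthLDAPBind directives are in scope for /svn/).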

    Read the article

  • Should I Return "500" or "404" if a Requested Image is not Found?

    - by Michael Robinson
    I work with code written by other people, and occasionally I am left somewhat confused; at these times Stack Overflow saves me. Please, save me again. Our site allows people to upload images and later embed them within text on our site, like so: <img src="http://site.com/image_script.php?p=some_image_identifier"/> My question is: if the identifier "p" does not lead us to an image, should the server return "500" or "404"? I would have thought it should be "404", but that's not what is happening right now.
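    A 404 fits the case where the request is understood but the identifier simply does not resolve to an image, while 500 signals a failure inside the script itself. One way to express that distinction, sketched here in Python with Flask rather than the site's PHP (the route, lookup table, and paths are hypothetical):
    from flask import Flask, abort, request, send_file

    app = Flask(__name__)

    def find_image_path(identifier):
        # Hypothetical lookup: return the file path for an image id, or None.
        known = {"some_image_identifier": "/var/images/example.png"}
        return known.get(identifier)

    @app.route("/image_script")
    def image_script():
        identifier = request.args.get("p", "")
        path = find_image_path(identifier)
        if path is None:
            # The URL was understood but no such image exists: 404 Not Found.
            abort(404)
        # Genuine failures below this point (missing file on disk, I/O errors,
        # bugs) surface as 500s, which is the case 500 is meant for.
        return send_file(path, mimetype="image/png")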

    Read the article

< Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >