Search Results

Search found 57458 results on 2299 pages for 'http response codes'.


  • Reconstructing data from PCAP sniff

    - by Ishi
    Hi everyone! I am trying to sniff HTTP data with libpcap and get all the HTTP content (headers plus payload) after processing the TCP payload. As per my discussion at http://stackoverflow.com/questions/2905430/writing-an-http-sniffer-or-any-other-application-level-sniffer , I am facing problems because messages are split across TCP segments: I need to reconstruct (reassemble) the whole stream to get a complete HTTP message, and this is where I need some help. Thanks in advance!
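
    A minimal sketch of the reassembly idea, shown in C# purely for illustration (the question's libpcap code would be C; nothing here depends on libpcap, and retransmissions, overlapping segments and 32-bit sequence wrap-around are ignored): buffer each segment by its sequence number, then concatenate contiguous segments before handing the bytes to an HTTP parser.

      using System;
      using System.Collections.Generic;

      // Sketch only: segments are assumed to arrive as (sequence number, payload)
      // pairs already demultiplexed per TCP connection.
      class TcpStreamBuffer
      {
          private readonly SortedDictionary<uint, byte[]> segments =
              new SortedDictionary<uint, byte[]>();

          public void AddSegment(uint sequenceNumber, byte[] payload)
          {
              if (payload != null && payload.Length > 0)
                  segments[sequenceNumber] = payload; // duplicates simply overwrite
          }

          // Returns the contiguous byte stream from the lowest buffered sequence
          // number up to the first gap; only then is it safe to parse the HTTP
          // message (headers plus body) out of the bytes.
          public byte[] GetContiguousData()
          {
              var data = new List<byte>();
              uint? expected = null;
              foreach (var segment in segments)
              {
                  if (expected.HasValue && segment.Key != expected.Value)
                      break; // gap: a segment is still missing
                  data.AddRange(segment.Value);
                  expected = segment.Key + (uint)segment.Value.Length;
              }
              return data.ToArray();
          }
      }

    Flows would typically be keyed by the (source address, source port, destination address, destination port) tuple taken from the IP and TCP headers that libpcap hands you, with one such buffer per direction of each connection.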

    Read the article

  • Parameterized GET request in Ruby?

    - by Horace Loeb
    How do I make an HTTP GET request with parameters in Ruby? It's easy to do when you're POSTing: require 'net/http' require 'uri' Net::HTTP.post_form URI.parse('http://www.example.com/search.cgi'), { "q" => "ruby", "max" => "50" } But I see no way of passing GET parameters as a hash using net/http.

    Read the article

  • Problems with unique links in database: www.domain/ or domain/

    - by Thomas
    On my website anyone can submit links to other nice websites. All links in my database must be unique, but some links come with a 'www.' prefix and some without, and some end with '/' and some do not. For example: http://www.domain.com | http://domain.com | http://domain.com/ ; the same problem comes up with https versus http. I know that I should normalize the address before saving it to the database, but what standard should I use?
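
    RFC 3986 defines syntax-based normalization (lowercase the scheme and host, remove the default port, resolve dot-segments); whether to also treat www/non-www or trailing-slash variants as the same link is an application policy, not part of any standard. A minimal C# sketch of one such policy (the specific rules below are assumptions to adjust to your site):

      using System;

      static class LinkNormalizer
      {
          // Sketch: canonicalize a submitted link before storing it. Which rules
          // to apply (strip "www.", drop the trailing slash, assume http when no
          // scheme is given) is a site policy decision.
          public static string Normalize(string input)
          {
              if (!input.Contains("://"))
                  input = "http://" + input;               // assume http when no scheme given

              var uri = new Uri(input);                    // throws UriFormatException on garbage
              string host = uri.Host.ToLowerInvariant();
              if (host.StartsWith("www."))
                  host = host.Substring(4);                // treat www and non-www as the same site

              string path = uri.AbsolutePath.TrimEnd('/'); // drop a trailing slash
              return uri.Scheme.ToLowerInvariant() + "://" + host + path + uri.Query;
          }
      }

    Under these rules http://www.domain.com/, http://domain.com and domain.com all normalize to http://domain.com, so a unique constraint on the stored value works; https and http variants stay distinct unless you decide to fold the scheme as well.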

    Read the article

  • Uncompress GZIPed HTTP Response in Java

    - by bill0ute
    Hi, I'm trying to decompress a gzipped HTTP response using GZIPInputStream. However, I always get the same exception when I try to read the stream: java.util.zip.ZipException: invalid bit length repeat. My HTTP request header: GET www.myurl.com HTTP/1.0\r\n User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; fr; rv:1.9.2) Gecko/20100115 Firefox/3.6\r\n Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8\r\n Accept-Language: fr,fr-fr;q=0.8,en-us;q=0.5,en;q=0.3\r\n Accept-Encoding: gzip,deflate\r\n Accept-Charset: ISO-8859-1,UTF-8;q=0.7,*;q=0.7\r\n Keep-Alive: 115\r\n Connection: keep-alive\r\n X-Requested-With: XMLHttpRequest\r\n Cookie: Some Cookies\r\n\r\n At the end of the HTTP response headers, I get path=/Content-Encoding: gzip, followed by the gzipped response. I tried two similar pieces of code to decompress it: GZIPInputStream gzip = new GZIPInputStream (new ByteArrayInputStream (tBytes)); StringBuffer szBuffer = new StringBuffer (); byte tByte [] = new byte [1024]; while (true) { int iLength = gzip.read (tByte, 0, 1024); // <-- Error comes here if (iLength < 0) break; szBuffer.append (new String (tByte, 0, iLength)); } And this one, which I found on this forum: InputStream gzipStream = new GZIPInputStream (new ByteArrayInputStream (tBytes)); Reader decoder = new InputStreamReader (gzipStream, "UTF-8");//<- I tried ISO-8859-1 and got the same exception BufferedReader buffered = new BufferedReader (decoder); I guess this is an encoding error. Best regards, bill0ute

    Read the article

  • Elmah for non-HTTP protocol applications OR Elmah without HttpContext

    - by Josh
    We are working on a 3-tier application, and we've been allowed to use the latest and greatest (MVC2, IIS7.5, WCF, SQL2k8, etc). The application tier is exposed to the various web applications by WCF services. Since we control both the service and client side, we've decided to use net.tcp bindings for their performance advantage over HTTP. We would like to use ELMAH for the error logging, both on the web apps and services. Here's my question. There's lots of information about using ELMAH with WCF, but it is all for HTTP bindings. Does anyone know if/how you can use ELMAH with WCF services exposing non-HTTP endpoints? My guess is no, because ELMAH wants the HttpContext, which requires the AspNetCompatibilityEnabled flag to be true in the web.config. From MSDN: IIS 7.0 and WAS allows WCF services to communicate over protocols other than HTTP. However, WCF services running in applications that have enabled ASP.NET compatibility mode are not permitted to expose non-HTTP endpoints. Such a configuration generates an activation exception when the service receives its first message. If it is true that you cannot use ELMAH with WCF services having non-HTTP endpoints, then the follow-up question is: Can we use ELMAH in such a way that doesn't need HttpContext? Or more generally (so as not to commit the thin metal ruler error), is there ANY way to use ELMAH with WCF services having non-HTTP endpoints?
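
    One commonly suggested workaround, sketched below in C#, is to write to the ELMAH error log directly instead of going through the HttpContext-based modules; treat the exact calls as an assumption to verify against your ELMAH version, and note that this only covers logging, not ELMAH's signaling or filtering pipeline.

      using System;
      using Elmah;

      // Sketch: log an exception from a WCF service with non-HTTP endpoints by
      // calling the configured ELMAH error log directly. Assumes an <elmah>
      // <errorLog> section (e.g. XmlFileErrorLog or SqlErrorLog) exists in the
      // service's config file; passing null skips the per-request HttpContext.
      public static class ServiceErrorLogger
      {
          public static void Log(Exception exception)
          {
              try
              {
                  ErrorLog.GetDefault(null).Log(new Error(exception));
              }
              catch
              {
                  // Never let a logging failure take down the service operation.
              }
          }
      }

    In a WCF service you would typically call ServiceErrorLogger.Log(ex) from an IErrorHandler implementation so every unhandled exception gets recorded.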

    Read the article

  • How to parse a raw HTTP response?

    - by Ed
    If I have a raw HTTP response as a string: HTTP/1.1 200 OK Date: Tue, 11 May 2010 07:28:30 GMT Expires: -1 Cache-Control: private, max-age=0 Content-Type: text/html; charset=UTF-8 Server: gws X-XSS-Protection: 1; mode=block Connection: close <!doctype html><html>...</html> Is there an easy way I can parse it into an HttpListenerResponse object? Or at least into some kind of .NET object, so I don't have to work with raw responses. What I'm doing currently is extracting the header key/value pairs and setting them on the HttpListenerResponse. But some headers can't be set, and then I have to cut out the body of the response and write it to the OutputStream. But the body could be gzipped, or it could be an image, which I can't get to work yet. And some responses contain random characters everywhere, which looks like an encoding problem. It's a lot of trouble. I'm getting a raw response because I'm using SOCKS to send an HTTP request. The program I'm working on is basically an HTTP proxy that can route requests through a SOCKS proxy, like Privoxy does.
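
    A minimal C# parsing sketch (illustrative only; the names are placeholders and it does not produce an HttpListenerResponse): split at the first blank line, take the first line as the status line, read the remaining lines into a header dictionary, and treat the rest as the body, which may still need decompressing according to Content-Encoding.

      using System;
      using System.Collections.Generic;

      // Sketch: split a raw HTTP response string into status line, headers and
      // body. A real proxy should keep the body as bytes rather than a string,
      // since it may be gzipped or binary (an image, for example), and repeated
      // headers such as Set-Cookie overwrite each other in this dictionary.
      class ParsedResponse
      {
          public string StatusLine;
          public Dictionary<string, string> Headers =
              new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
          public string Body;

          public static ParsedResponse Parse(string raw)
          {
              var parsed = new ParsedResponse();

              // Headers end at the first blank line (CRLF CRLF).
              int headerEnd = raw.IndexOf("\r\n\r\n", StringComparison.Ordinal);
              string headerBlock = headerEnd >= 0 ? raw.Substring(0, headerEnd) : raw;
              parsed.Body = headerEnd >= 0 ? raw.Substring(headerEnd + 4) : string.Empty;

              string[] lines = headerBlock.Split(new[] { "\r\n" }, StringSplitOptions.None);
              parsed.StatusLine = lines.Length > 0 ? lines[0] : string.Empty; // e.g. "HTTP/1.1 200 OK"

              for (int i = 1; i < lines.Length; i++)
              {
                  int colon = lines[i].IndexOf(':');
                  if (colon > 0)
                      parsed.Headers[lines[i].Substring(0, colon).Trim()] =
                          lines[i].Substring(colon + 1).Trim();
              }
              return parsed;
          }
      }

    HttpListenerResponse itself is designed to be created by HttpListener for an incoming request, so mapping a parsed upstream response onto it still means copying the status, headers and body across by hand.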

    Read the article

  • Validate that a URI is a valid HTTP URI

    - by Alfred
    Hi all, my problem: first of all, hopefully this is not a duplicate, but I could not find the right answer (right away). I would like to validate in Java that a URI is a valid http URI. I came up with the following tests but I can't get them all to pass. First I used getPort(), but then http://www.google.nl returns -1 from getPort(). These are the tests I want to pass: @Test public void testURI_Isvalid() throws Exception { assertFalse(HttpUtils.validateHTTP_URI("ttp://localhost:8080")); assertFalse(HttpUtils.validateHTTP_URI("ftp://localhost:8080")); assertFalse(HttpUtils.validateHTTP_URI("http://localhost:8a80")); assertTrue(HttpUtils.validateHTTP_URI("http://localhost:8080")); final String justWrong = "/schedule/get?uri=http://localhost:8080&time=1000000"; assertFalse(HttpUtils.validateHTTP_URI(justWrong)); assertTrue(HttpUtils.validateHTTP_URI("http://www.google.nl")); } This is what I came up with after I removed the getPort() part, but it does not pass all my unit tests. Production code: public static boolean validateHTTP_URI(String uri) { final URI u; try { u = URI.create(uri); } catch (Exception e1) { return false; } return "http".equals(u.getScheme()); } This is the first test that is failing, because I am no longer validating the getPort() part. Hopefully somebody can help me out; I think I am not using the right class to validate URLs? P.S.: I don't want to connect to the server to validate that the URI is correct, at least not yet in this step. I only want to validate the scheme.

    Read the article

  • Paypal sandbox account in .NET: IPN response INVALID

    - by Sam
    Hi, I am integrating PayPal with my site. I use a sandbox account, one buyer account and one seller account, and I downloaded the code below from the PayPal site: string strSandbox = "https://www.sandbox.paypal.com/cgi-bin/webscr"; HttpWebRequest req = (HttpWebRequest)WebRequest.Create(strSandbox); //Set values for the request back req.Method = "POST"; req.ContentType = "application/x-www-form-urlencoded"; byte[] param = Request.BinaryRead(HttpContext.Current.Request.ContentLength); string strRequest = Encoding.ASCII.GetString(param); strRequest += "&cmd=_notify-validate"; req.ContentLength = strRequest.Length; //for proxy //WebProxy proxy = new WebProxy(new Uri("http://url:port#")); //req.Proxy = proxy; //Send the request to PayPal and get the response StreamWriter streamOut = new StreamWriter(req.GetRequestStream(), System.Text.Encoding.ASCII); streamOut.Write(strRequest); streamOut.Close(); StreamReader streamIn = new StreamReader(req.GetResponse().GetResponseStream()); string strResponse = streamIn.ReadToEnd(); streamIn.Close(); if (strResponse == "VERIFIED") { //check the payment_status is Completed //check that txn_id has not been previously processed //check that receiver_email is your Primary PayPal email //check that payment_amount/payment_currency are correct //process payment } else if (strResponse == "INVALID") { //log for manual investigation } else { //log response/ipn data for manual investigation } When I add this snippet to the Page_Load event of my success page, the IPN response comes back INVALID even though the amount is paid successfully. Any help? PayPal's docs are not clear. Thanks in advance

    Read the article

  • Handling Erlang inets http client errors

    - by Justin
    I have an Erlang app which makes a large number of http calls to external sites using inets, using the code below case http:request(get, {Url, []}, [{autoredirect, false}], []) of {ok, {{_, Code, _}, _, Body}}-> case Code of 200 -> HandlerFn(Body); _ -> {error, io:format("~s returned HTTP ~p", [Broker, Code])} end; Response -> %% block to handle unexpected responses from inets {error, io:format("~s returned ~p", [Broker, Response])} end. There is an explicit block to handle anything strange inets might return [Response]. Despite this, I still get what look like inets error reports dumped to the console [sample below]. What am I doing wrong here ? Do I need to configure some kind of inets error handler elsewhere ? Thanks. -- =ERROR REPORT==== 24-Apr-2010::06:49:47 === ** Generic server <0.6618.0 terminating ** Last message in was {connect_and_send, {request,#Ref<0.0.0.139358,<0.6613.0,0,http, {"***",80}, "****************", [],get, {http_request_h,undefined,"keep-alive", undefined,undefined,undefined,undefined, undefined,undefined,undefined,undefined, undefined,undefined,undefined,undefined, undefined,undefined,"news.bbc.co.uk", undefined,undefined,undefined,undefined, undefined,undefined,undefined,undefined, undefined,[],undefined,undefined,undefined, undefined,"0",undefined,undefined, undefined,undefined,undefined,undefined,[]}, {[],[]}, {http_options,"HTTP/1.1",infinity,false,[], undefined,false,infinity}, "*******************", [],none,[],1272088179114,undefined,undefined}} * When Server state == {state, {request,#Ref<0.0.0.139358,<0.6613.0,0,http, {"********",80}, "***************", [],get, {http_request_h,undefined,"keep-alive", undefined,undefined,undefined,undefined, undefined,undefined,undefined,undefined, undefined,undefined,undefined,undefined, undefined,undefined,"news.bbc.co.uk", undefined,undefined,undefined,undefined, undefined,undefined,undefined,undefined, undefined,[],undefined,undefined, undefined,undefined,"0",undefined, undefined,undefined,undefined,undefined, undefined,[]}, {[],[]}, {http_options,"HTTP/1.1",infinity,false,[], undefined,false,infinity}, "**********************", [],none,[],1272088179114,undefined,undefined}, undefined,undefined,undefined,undefined,undefined, {[],[]}, {[],[]}, undefined,[],nolimit,nolimit, {options, {undefined,[]}, 0,2,5,120000,2,disabled,false,inet,default, default,[]}, {timers,[],undefined}, httpc_manager,undefined} ** Reason for termination == ** {error,{connect_failed,{#Ref<0.0.0.139358,{error,nxdomain}}}} =ERROR REPORT==== 24-Apr-2010::06:49:47 === HTTPC-MANAGER handler (<0.6618.0, started) failed to connect and/or send request #Ref<0.0.0.139358 Result: {error,{connect_failed,{#Ref<0.0.0.139358,{error,nxdomain}}}}

    Read the article

  • Paypal sandbox account in dotnet: "IPN Response invalid"

    - by Sam
    I am integrating Paypal with my website. I use a sandbox account, one buyer account and one seller account. I downloaded the code below from Paypal: string strSandbox = "https://www.sandbox.paypal.com/cgi-bin/webscr"; HttpWebRequest req = (HttpWebRequest)WebRequest.Create(strSandbox); //Set values for the request back req.Method = "POST"; req.ContentType = "application/x-www-form-urlencoded"; byte[] param = Request.BinaryRead(HttpContext.Current.Request.ContentLength); string strRequest = Encoding.ASCII.GetString(param); strRequest += "&cmd=_notify-validate"; req.ContentLength = strRequest.Length; //for proxy //WebProxy proxy = new WebProxy(new Uri("http://url:port#")); //req.Proxy = proxy; //Send the request to PayPal and get the response StreamWriter streamOut = new StreamWriter(req.GetRequestStream(), System.Text.Encoding.ASCII); streamOut.Write(strRequest); streamOut.Close(); StreamReader streamIn = new StreamReader(req.GetResponse().GetResponseStream()); string strResponse = streamIn.ReadToEnd(); streamIn.Close(); if (strResponse == "VERIFIED") { //check the payment_status is Completed //check that txn_id has not been previously processed //check that receiver_email is your Primary PayPal email //check that payment_amount/payment_currency are correct //process payment } else if (strResponse == "INVALID") { //log for manual investigation } else { //log response/ipn data for manual investigation } When I add this snippet in my pageload event of my success page, I show the IPN response as INVALID, but amount is paid successfully. Why is this? Paypal's docs are not clear.

    Read the article

  • Any HTTP proxies with explicit, configurable support for request/response buffering and delayed connections?

    - by Carlos Carrasco
    When dealing with mobile clients it is very common to see multi-second delays during the transmission of HTTP requests. If you are serving pages or services out of a prefork Apache, the child processes will be tied up for seconds serving a single mobile client, even if your app server logic is done in 5 ms. I am looking for an HTTP server, balancer or proxy server that supports the following: (1) A request arrives at the proxy. (2) The proxy starts buffering the request, in RAM or on disk, including headers and POST/PUT bodies; the proxy DOES NOT open a connection to the backend server. This is probably the most important part. (3) The proxy stops buffering the request when a size limit has been reached (say, 4KB), or the request has been received completely, headers and body. (4) Only now, with (part of) the request in memory, a connection is opened to the backend and the request is relayed. (5) The backend sends back the response. Again the proxy starts buffering it immediately (up to a more generous size, say 64KB); since the proxy has a big enough buffer, the backend response is stored completely in the proxy in a matter of milliseconds, and the backend process/thread is free to process more requests. The backend connection is immediately closed. (6) The proxy sends the response back to the mobile client, as fast or as slow as the client is capable of, without a connection to the backend tying up resources. I am fairly sure you can do 4-6 with Squid, and nginx appears to support 1-3 (and looks fairly unique in this respect). My question is: is there any proxy server that emphasizes these buffering and not-opening-connections-until-ready capabilities? Maybe there is just a bit of Apache config-fu that makes this buffering behaviour trivial? Is there one that is not a dinosaur like Squid and that supports a lean single-process, asynchronous, event-based execution model? (Side rant: I would be using nginx but it doesn't support chunked POST bodies, making it useless for serving stuff to mobile clients. Yes, cheap 50$ handsets love chunked POSTs... sigh)

    Read the article

  • Problem loading scripts from Ajax Response

    - by konathamrajesh
    The problem is that I am using get_info() to make an Ajax call to Result.lasso and paste the response into the div with id 'test'. I am unable to use the sendForm() function from the page where I am calling get_info(). I have also tried different versions of jQuery: 1.1.1.3 is working fine, but I am facing the problem with higher versions of jQuery. The error with higher versions is as follows: missing } in XML expression [Break on this error] alert('hi');\n test.lasso (line 3) sendForm is not defined [Break on this error] sendForm(); The get_info() function definition: <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js" type="text/javascript"></script> <SCRIPT> function get_info() { $.ajax({url: "Result.lasso", context: document.body, success: function(response){ document.getElementById('test').innerHTML = response ;},dataType:"script"}); } </SCRIPT> The code in Result.lasso is as follows: [Content_Type: 'text/html; charset=UTF-8'] <script type="text/javascript"> function sendForm() { alert('hi'); } </script> [Date] <form name="abc" method="get" action="abcd.lasso"> <input type="text" name="element1"/> <input type="button" value="Click" onClick="javascript: sendForm();"/> </form> Please help me out in resolving this problem. Thanks, Rajesh Konatham

    Read the article

  • Ajax response takes time and status is 503

    - by Suresh S
    Guys, I have an HTML page where clicking a button sends an Ajax request to the server. The request calls a JSP page which runs an Oracle procedure; the procedure runs the logic and places the result in a temp table. Once the procedure is completed, the values are returned to the client by selecting from the temp table. Because the response arrives too late, the data is not received on the client side. My attempted solution: I run the procedure in a separate thread using an Ajax call, and when the procedure is completed a global flag is set to indicate that the data has been generated. If the response is 500, a second Ajax call is invoked by a timeout function after 10000 ms; the second call checks the global flag, and if it is true it reads from the table and sends the response, otherwise another timeout is set on the client side. This solution is not mature enough, as the procedure may take a long time to respond. Please let me know a good solution for this problem.
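
    The general pattern being described is "start the work, return immediately, then poll for completion". A minimal C# sketch of that pattern (a simplification, not tied to the JSP/Oracle stack in the question; a real version would also expire old entries and report failures):

      using System;
      using System.Collections.Concurrent;
      using System.Threading.Tasks;

      // Sketch: the first request starts the long-running work and returns a job
      // id at once; later requests ask whether that job id has finished instead
      // of waiting on a single slow response.
      static class JobTracker
      {
          private static readonly ConcurrentDictionary<string, bool> done =
              new ConcurrentDictionary<string, bool>();

          public static string Start(Action longRunningWork)
          {
              string jobId = Guid.NewGuid().ToString();
              done[jobId] = false;
              Task.Run(() =>
              {
                  longRunningWork();    // e.g. run the procedure that fills the temp table
                  done[jobId] = true;   // flag checked by the polling request
              });
              return jobId;             // hand this back to the client immediately
          }

          public static bool IsDone(string jobId)
          {
              return done.TryGetValue(jobId, out bool finished) && finished;
          }
      }

    The client calls the start endpoint once, keeps the returned job id, and polls an endpoint backed by IsDone until it returns true before fetching the result from the temp table.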

    Read the article

  • asp.net dynamic HTML form

    - by user204588
    Hi, I want to create an HTML page inside an ASP.NET page using C# and then request that HTML page. The flow is: I'll be creating a request that will give me a response with some values. Those values will be stored in hidden fields in the HTML page I'm creating on the fly and then requesting. I figure it would be something like the code below, but I'm not sure it would work, and I've also received some "Thread Aborting" errors. So, does anyone know the proper way to do this, or can you at least direct me to a nice article or something? StringBuilder builder = new StringBuilder(); builder.Append("<html><head></head>"); builder.Append("<body onload=\"document.aButton.submit();\">"); builder.Append("<input type=\"hidden\" name=\"something\" value=\"" + aValue + "\">"); HttpContext.Current...Response.Write(builder.ToString()); ... end response
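
    A minimal C# sketch of one way to do this (the target URL and field name are placeholders; the "Thread Aborting" errors are the documented side effect of Response.End, which aborts the thread, so the sketch ends the response with CompleteRequest instead):

      using System.Text;
      using System.Web;

      // Sketch: write a self-submitting HTML form carrying values in hidden
      // fields back to the browser; the browser then POSTs them to the target
      // URL as soon as the page loads.
      public static class AutoPostForm
      {
          public static void Send(string targetUrl, string fieldName, string value)
          {
              var html = new StringBuilder();
              html.Append("<html><head></head>");
              html.Append("<body onload=\"document.forms[0].submit();\">");
              html.Append("<form method=\"post\" action=\"" + HttpUtility.HtmlEncode(targetUrl) + "\">");
              html.Append("<input type=\"hidden\" name=\"" + HttpUtility.HtmlEncode(fieldName) + "\"");
              html.Append(" value=\"" + HttpUtility.HtmlEncode(value) + "\"/>");
              html.Append("</form></body></html>");

              var context = HttpContext.Current;
              context.Response.Clear();
              context.Response.Write(html.ToString());
              context.ApplicationInstance.CompleteRequest(); // ends the response without Thread.Abort
          }
      }

    Calling something like AutoPostForm.Send("https://example.com/target", "something", aValue) from the page's code-behind replaces the Response.Write / Response.End pair sketched in the question.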

    Read the article

  • ServiceStack Razor view with request and response DTO

    - by user7398
    I'm having a go with the Razor functionality in ServiceStack. I have a Razor cshtml view working for one of my response DTOs. I need to access, in the Razor view, some values from the request DTO that were filled in from fields of the REST route, so I can construct a URL to put into the response HTML page and also set some form labels. Is there any way of doing this? I don't want to duplicate the property from the request DTO into the response DTO just for this HTML view. Because I'm trying to emulate an existing REST service of another product, I do not want to emit extra data just for the HTML view. E.g. the route is http://localhost/rest/{Name}/details/{Id} and the view is: @inherits ViewPage<DetailsResponse> @{ ViewBag.Title = "todo title"; Layout = "HtmlReport"; } This needs to come from the request DTO, NOT @Model: <a href="/rest/@Model.Name">link to user</a> <a href="/rest/@Model.Name/details/@Model.Id">link to user details</a>

    Read the article

  • How to get a response from a remote server

    - by ruhit
    I have made a desktop application in ASP.NET using C# that connects to a remote server. I am able to connect, but how do I tell whether my login was successful or not? After that I want to retrieve data from the remote server, so please help me. I have written the code below; is there a better way? try { string strId = UserId_TextBox.Text; string strpasswrd = Password_TextBox.Text; ASCIIEncoding encoding = new ASCIIEncoding(); string postData = "UM_email=" + strId; postData += ("&UM_password=" + strpasswrd); byte[] data = encoding.GetBytes(postData); MessageBox.Show(postData); // Prepare web request... //HttpWebRequest myRequest = (HttpWebRequest)WebRequest.Create("http://localhost/ruhit/basic_framework/index.php?menu=login=" + postData); HttpWebRequest myRequest =(HttpWebRequest)WebRequest.Create("http://www.facebook.com/login.php=" + postData); myRequest.Method = "POST"; myRequest.ContentType = "application/x-www-form-urlencoded"; myRequest.ContentLength = data.Length; Stream newStream = myRequest.GetRequestStream(); // Send the data. newStream.Write(data, 0, data.Length); MessageBox.Show("u r now connected"); HttpWebResponse response = (HttpWebResponse)myRequest.GetResponse(); // WebResponse response = myRequest.GetResponse(); StreamReader reader = new StreamReader(response.GetResponseStream()); string str = reader.ReadLine(); while (str != null) { str = reader.ReadLine(); MessageBox.Show(str); } reader.Close(); newStream.Close(); } catch { MessageBox.Show("error connecting"); }
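
    A minimal C# sketch of the usual pattern (the endpoint is taken from the commented-out URL in the question, the credentials are placeholders, and how "login successful" is signalled is entirely server-specific: it may be a status code, a cookie, a redirect, or a marker in the returned page):

      using System;
      using System.IO;
      using System.Net;
      using System.Text;

      class LoginCheck
      {
          static void Main()
          {
              // Placeholder endpoint and credentials; replace with the real ones.
              string url = "http://localhost/ruhit/basic_framework/index.php?menu=login";
              string postData = "UM_email=" + Uri.EscapeDataString("user@example.com")
                              + "&UM_password=" + Uri.EscapeDataString("secret");
              byte[] data = Encoding.ASCII.GetBytes(postData);

              HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
              request.Method = "POST";
              request.ContentType = "application/x-www-form-urlencoded";
              request.ContentLength = data.Length;

              using (Stream body = request.GetRequestStream())
                  body.Write(data, 0, data.Length);          // form data goes in the body, not the URL

              using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
              using (StreamReader reader = new StreamReader(response.GetResponseStream()))
              {
                  string page = reader.ReadToEnd();
                  // What counts as "logged in" is application-specific; checking the
                  // status code and looking for a known marker in the page are common.
                  Console.WriteLine("HTTP status: " + (int)response.StatusCode);
                  Console.WriteLine(page.Contains("Logout") ? "Login looks successful" : "Login probably failed");
              }
          }
      }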

    Read the article

  • How to get email id from Google API response

    - by user1726508
    I am able to get user information from the Google API response using OAuth2, but I don't know how to get those fields individually. The response I am getting from the Google API: * Access token: ya29.AHES6ZQ3QxKxnfAzpZasdfd23423NuxJs29gMa39MXV551yMmyM5IgA { "id": "112361893525676437860", "name": "Ansuman Singh", "given_name": "Ansuman", "family_name": "Singh", "link": "https://plus.google.com/112361893525676437860", "gender": "male", "birthday": "0000-03-18", "locale": "en" } Original Token: ya29.AHES6ZQ3QxKxnfAzpZu0lYHYu8sdfsdafdgMa39MXV551yMmyM5IgA New Token: ya29.AHES6ZQ3QxKxnfdsfsdaYHYu8TNuxJs29gMa39MXV551yMmyM5IgA But I want only "id" and "name" individually, to save them in my database table. How can I do this? I got the above response/output by using the code below. public static void main(String[] args) throws IOException { ------------------------- ------------------------- ------------------------- String accessToken = authResponse.accessToken; GoogleAccessProtectedResource access = new GoogleAccessProtectedResource(accessToken, TRANSPORT, JSON_FACTORY, CLIENT_ID, CLIENT_SECRET, authResponse.refreshToken); HttpRequestFactory rf = TRANSPORT.createRequestFactory(access); System.out.println("Access token: " + authResponse.accessToken); String url = "https://www.googleapis.com/oauth2/v1/userinfo?alt=json&access_token=" + authResponse.accessToken; final StringBuffer r = new StringBuffer(); final URL u = new URL(url); final URLConnection uc = u.openConnection(); final int end = 1000; InputStreamReader isr = null; BufferedReader br = null; isr = new InputStreamReader(uc.getInputStream()); br = new BufferedReader(isr); final int chk = 0; while ((url = br.readLine()) != null) { if ((chk >= 0) && ((chk < end))) { r.append(url).append('\n'); } } System.out.print(""); System.out.println(); System.out.print(" "+ r ); //this is printing at once but i want them individually access.refreshToken(); System.out.println("Original Token: " + accessToken + " New Token: " + access.getAccessToken()); }

    Read the article

  • How to understand these lines in apache.log

    - by chefnelone
    I just get 19000 lines like these in the apache.log file for my site example.com. My hosting provider shut down the hosting and notified me that I need to avoid to activate my hosting again. I understand that I got a big amount of visits but I don't know how to avoid this. 88.190.47.233 - - [27/Jun/2013:09:51:34 +0200] "GET / HTTP/1.0" 403 389 "http://example.com/" "Opera/9.80 (Windows NT 6.1; U; ru) Presto/2.10.289 Version/12.02" 417 88.190.47.233 - - [27/Jun/2013:09:51:34 +0200] "GET / HTTP/1.0" 403 389 "http://example.com/" "Opera/9.80 (Windows NT 6.1; U; ru) Presto/2.10.289 Version/12.02" 417 175.44.28.155 - - [27/Jun/2013:09:51:44 +0200] "GET /en/user/register HTTP/1.1" 403 503 "http://example.com/en/" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1;)" 248 175.44.29.140 - - [27/Jun/2013:09:53:19 +0200] "GET /en/node/1557?page=2 HTTP/1.0" 403 517 "http://example.com/en/node/1557?page=2" "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.12 Safari/535.11" 491 These are the lines from apache-error.log. There are more than 35000 lines like this. [Thu Jun 27 09:50:58 2013] [error] [client 5.39.19.183] (13)Permission denied: access to /index.php denied, referer: http://example.com/ [Thu Jun 27 09:51:03 2013] [error] [client 125.112.29.105] (13)Permission denied: access to /index.php denied, referer: http://example.com/en/ [Thu Jun 27 09:51:34 2013] [error] [client 88.190.47.233] (13)Permission denied: access to /index.php denied, referer: http://example.com/en/node/1557?page=1#comment-701 [Thu Jun 27 09:51:34 2013] [error] [client 88.190.47.233] (13)Permission denied: access to /index.php denied, referer: http://example.com/en/node/1557?page=1#comment-701 [Thu Jun 27 09:51:34 2013] [error] [client 88.190.47.233] (13)Permission denied: access to /index.html denied, referer: http://example.com/en/node/1557?page=1#comment-701 [Thu Jun 27 09:51:34 2013] [error] [client 88.190.47.233] (13)Permission denied: access to /index.htm denied, referer: http://example.com/en/node/1557?page=1#comment-701 [Thu Jun 27 09:51:34 2013] [error] [client 88.190.47.233] (13)Permission denied: access to /index.php denied, referer: http://example.com/ [Thu Jun 27 09:51:34 2013] [error] [client 88.190.47.233] (13)Permission denied: access to /index.html denied, referer: http://example.com/ [Thu Jun 27 09:51:34 2013] [error] [client 88.190.47.233] (13)Permission denied: access to /index.htm denied, referer: http://example.com/ [Thu Jun 27 09:51:34 2013] [error] [client 88.190.47.233] (13)Permission denied: access to /index.php denied, referer: http://example.com/ [Thu Jun 27 09:51:34 2013] [error] [client 88.190.47.233] (13)Permission denied: access to /index.html denied, referer: http://example.com/ [Thu Jun 27 09:51:34 2013] [error] [client 88.190.47.233] (13)Permission denied: access to /index.htm denied, referer: http://example.com/ [Thu Jun 27 09:51:44 2013] [error] [client 175.44.28.155] (13)Permission denied: access to /index.php denied, referer: http://example.com/en/ [Thu Jun 27 09:53:19 2013] [error] [client 175.44.29.140] (13)Permission denied: access to /index.php denied, referer: http://example.com/en/node/1557?page=2 [Thu Jun 27 09:53:20 2013] [error] [client 175.44.29.140] (13)Permission denied: access to /index.php denied, referer: http://example.com/en/node/1557?page=2 [Thu Jun 27 09:53:20 2013] [error] [client 175.44.29.140] (13)Permission denied: access to /index.html denied, referer: http://example.com/en/node/1557?page=2 [Thu Jun 27 09:53:20 2013] 
[error] [client 175.44.29.140] (13)Permission denied: access to /index.htm denied, referer: http://example.com/en/node/1557?page=2 [Thu Jun 27 09:53:21 2013] [error] [client 175.44.29.140] (13)Permission denied: access to /index.php denied, referer: http://example.com/ [Thu Jun 27 09:53:21 2013] [error] [client 175.44.29.140] (13)Permission denied: access to /index.html denied, referer: http://example.com/ [Thu Jun 27 09:53:21 2013] [error] [client 175.44.29.140] (13)Permission denied: access to /index.htm denied, referer: http://example.com/ [Thu Jun 27 09:53:22 2013] [error] [client 175.44.29.140] (13)Permission denied: access to /index.php denied, referer: http://example.com/ [Thu Jun 27 09:53:22 2013] [error] [client 175.44.29.140] (13)Permission denied: access to /index.html denied, referer: http://example.com/ [Thu Jun 27 09:53:22 2013] [error] [client 175.44.29.140] (13)Permission denied: access to /index.htm denied, referer: http://example.com/ [Thu Jun 27 09:56:53 2013] [error] [client 113.246.6.147] (13)Permission denied: access to /index.php denied, referer: http://example.com/en/ [Thu Jun 27 09:58:58 2013] [error] [client 108.62.71.180] (13)Permission denied: access to /index.php denied, referer: http://example.com/

    Read the article

  • Intel Rapid Storage / Smart Response SSD caching issue

    - by goober
    Background Recently built my own PC. It works! Almost. It's been a while since getting into the guts of these things, so I'm familiar but may be missing something simple. FYI, I don't care about blowing the OS away -- it's brand new and we can go back from scratch as many times as necessary. Goal / Issue I'd like to use the SSD to take advantage of Intel's Smart Response technology (allows the SSD to act as a cache for HDDs) I would like the SSD cache to act as a cache for my HDDs, which I would like to be in a RAID1 array (so I get the speed from the SSD and the redundancy from the RAID1) However, Windows only sees the drive in device manager (not as a drive), so I'm unsure what to do about that. Related: as far as I know, for this to work, the drives all have to be in a single RAID array (i.e. a RAID0 pairing of the SSD and the RAID1 HDD array). However, when attempting this at the BIOS level, I am told there is not enough space for an array. Steps so Far Moved the SSD onto the Intel controller (I'd had it on the Marvel 6.0 controller instead of the Intel controller, so the BIOS was only seeing it in a strange way) Updated the BIOS of the motherboard to the latest version Reinstalled Intel's RST (iRST?) software several times, as some forums reported it working after reinstalling 3 times (which does not inspire confidence). Checked Intel storage: it does see the SSD as a physical, non-RAID device. However, it says no space exists if I try to create an array. Checked the BIOS: it does not show up in the boot order, but is an option that can be selected under boot options. Tried the firmware update for that model. Issue: the firmware CD doesn't detect a drive; maybe the Intel storage controller is making it difficult? moved the ssd to the marvel controller. The firmware update cd appeared to hang while searching for drives. swapped out the SATA cable for the manufacturer's and moved back to the intel storage controller. Noticed at this point that in the Intel RST software, a device DOES show up in addition to the RAID set -- only shown as a "60 GB internal disk". Windows doesn't appear to see it as a drive, but it does still show in device manager. Move SSD to port from 0-3 on MOBO and set SATA mode to IDE (after disconnecting RAID1 config) to allow the firmware update to work. Firmware was already at the latest version. Next Steps ? Components involved ASUS P8Z68-V PRO motherboard (Intel Z68 Chipset) Intel i7 2600k Processor 2 x 1TB 7200 RPM HDDs 64 GB Crucial M4 SSD (M4-CT064M4SSD2) For Reference -- Storage Configuration Intel 3 gbps Intel 3gbps Intel 6gbps Marvel 6gbps +----------+ +----------+ +----------+ +----------+ | | <----+ | | +-+ | | | |----------| | |----------| |-|--------| |----------| | | | | + | | | | | | +----------+ | +--|-------+ +-|--------+ +----------+ | | | + v v | 1 TB HDD 64 GB SSD + +> 1 TB HDD For Reference -- Intel RST (v10.8.0.1003) Screenshot Don't mind the "rebuilding" -- knocked a power cable out at one point; it's doing its job, not an indicator of a bad HDD. Any thoughts? Thanks in advance for any help!

    Read the article

  • Apache sends plain-text response when accessing SSL-enabled site without HTTPS

    - by animuson
    I've never encountered something such as this before. I was attempting to simply redirect the page to the HTTPS version if it determined that HTTPS was off, but instead it's displaying an HTML page rather than actually redirecting; and even odder, it's displaying it as text/plain! The VirtualHost Declaration (Sort of): ServerAdmin [email protected] DocumentRoot "/path/to/files" ServerName example.com SSLEngine On SSLCertificateFile /etc/ssh/certify/example.com.crt SSLCertificateKeyFile /etc/ssh/certify/example.com.key SSLCertificateChainFile /etc/ssh/certify/sub.class1.server.ca.pem <Directory "/path/to/files/"> AllowOverride All Options +FollowSymLinks DirectoryIndex index.php Order allow,deny Allow from all </Directory> RewriteEngine On RewriteCond %{HTTPS} off RewriteRule .* https://example.com:6161 [R=301] The Page Output: <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN"> <html><head> <title>301 Moved Permanently</title> </head><body> <h1>Moved Permanently</h1> <p>The document has moved <a href="https://example.com:6161">here</a>.</p> <hr> <address>Apache/2.2.21 (Unix) mod_ssl/2.2.21 OpenSSL/1.0.0e DAV/2 Server at example.com Port 443</address> </body></html> I've tried moving the Rewrite stuff up above the SSL stuff hoping it'd do something and nothing happens. If I view the page with via HTTPS, it displays fine like it should. It's obviously detecting that I'm trying to rewrite the path, but it's not acting. The Apache error log does not indicate anything to me that might have gone wrong. When I remove the RewriteRules: <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN"> <html><head> <title>400 Bad Request</title> </head><body> <h1>Bad Request</h1> <p>Your browser sent a request that this server could not understand.<br /> Reason: You're speaking plain HTTP to an SSL-enabled server port.<br /> Instead use the HTTPS scheme to access this URL, please.<br /> <blockquote>Hint: <a href="https://example.com/"><b>https://example.com/</b></a></blockquote></p> <p>Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.</p> <hr> <address>Apache/2.2.21 (Unix) mod_ssl/2.2.21 OpenSSL/1.0.0e DAV/2 Server at example.com Port 443</address> </body></html> I get the standard "you can't do this because you're not using SSL" response, which is also provided in text/plain rather than being rendered as HTML. This would make sense, it should only work for HTTPS-enabled connections, but I still want to redirect them to the HTTPS connection when it determines that it is not enabled. Thinking I could circumvent the system: I tried adding a ErrorDocument 400 https://example.com:6161 to the config file instead of using RewriteRules, and that just gave me a new message, still no cheese. <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN"> <html><head> <title>302 Found</title> </head><body> <h1>Found</h1> <p>The document has moved <a href="https://example.com:6161">here</a>.</p> <hr> <address>Apache/2.2.21 (Unix) mod_ssl/2.2.21 OpenSSL/1.0.0e DAV/2 Server at example.com Port 443</address> </body></html> How can I force Apache to actually redirect rather than displaying a "301" page that shows HTML in plain-text format?

    Read the article

  • Google I/O 2010: Google TV Keynote - Introducing Google TV

    Google I/O 2010: Google TV Keynote - Introducing Google TV Due to licensing and permissions issues, we are unable to show the full Google TV demonstration from the Day 2 keynote at Google I/O. Until we are able to get these permissions, please check out these clips. For Google I/O session videos, presentations, developer interviews and more, go to: code.google.com/io From: GoogleDevelopers Views: 30 1 ratings Time: 06:55 More in Science & Technology

    Read the article

  • Google I/O 2010: Google TV Keynote - Android Apps On Google TV

    Google I/O 2010: Google TV Keynote - Android Apps On Google TV Due to licensing and permissions issues, we are unable to show the full Google TV demonstration from the Day 2 keynote at Google I/O. Until we are able to get these permissions, please check out these clips. For Google I/O session videos, presentations, developer interviews and more, go to: code.google.com/io From: GoogleDevelopers Views: 8 0 ratings Time: 03:18 More in Science & Technology

    Read the article

  • Google Chrome Extensions: Launch Event (part 5)

    Google Chrome Extensions: Launch Event (part 5) Video Footage from the Google Chrome Extensions launch event on 12/09/09. Xmarks, ebay and Google Translate present their experience developing an extension for Google Chrome. From: GoogleDevelopers Views: 3037 18 ratings Time: 10:30 More in Science & Technology

    Read the article

  • What is PubSubHubbub?

    What is PubSubHubbub? An overview of pubsubhubbub, a simple, open, web-hook-based pubsub protocol & open source reference implementation. Brett Slatkin and Brad Fitzpatrick demonstrate what pubsubhubbub is and how it works. For more information, visit pubsubhubbub.googlecode.com From: GoogleDevelopers Views: 23753 112 ratings Time: 03:20 More in Science & Technology

    Read the article

  • Google I/O 2010 - Tech, innovation, CS, & more: A VC panel

    Google I/O 2010 - Tech, innovation, CS, & more: A VC panel Google I/O 2010 - Technology, innovation, computer science, and more: A VC panel Tech Talks Albert Wenger, Chris Dixon, Dave McClure, Brad Feld, Paul Graham, Dick Costolo What do notable tech-minded VCs think about big trends happening today? In this session, you'll get to hear from and ask questions to a panel of well-respected investors, all of whom are programmers by trade. Albert Wenger, Chris Dixon, Dave McClure, Paul Graham, and Brad Feld will duke it out on a number of hot tech topics with Dick Costolo moderating. For all I/O 2010 sessions, please go to code.google.com From: GoogleDevelopers Views: 329 5 ratings Time: 01:00:20 More in Science & Technology

    Read the article
