Search Results

Search found 50475 results on 2019 pages for 'rpc over http'.

  • mount: mount to NFS server 'IPADDRESS' failed: RPC Error: Program not registered

    - by matt74tm
    I've got two RedHat 5/CentOS systems which share a folder. I'm trying to change the shared folder location, but I ran into this error on the machine on which the folder is mounted. How can I correct this? I rebooted the computer, but to no avail.

    Server1 - where it's "mounted" - /etc/fstab:

      IPADDRESS2:/opt/programA/common/files /srv/server2-share nfs rw,intr 0 0

    Server2 - where it's "shared" - /etc/exports:

      /opt/programA/common/files IPADDRESS1/28(rw,insecure,sync,no_root_squash)

    Ran the following on Server2:

      root@server2 [~]# /etc/init.d/nfs start
      root@server2 [~]# rpcinfo -p
         program vers proto   port
          100000    2   tcp    111  portmapper
          100000    2   udp    111  portmapper
          100011    1   udp    875  rquotad
          100011    2   udp    875  rquotad
          100011    1   tcp    875  rquotad
          100011    2   tcp    875  rquotad
          100005    1   udp    892  mountd
          100005    1   tcp    892  mountd
          100005    2   udp    892  mountd
          100005    2   tcp    892  mountd
          100005    3   udp    892  mountd
          100005    3   tcp    892  mountd
      root@server2 [~]# /etc/init.d/nfs status
      rpc.mountd (pid 10204) is running...
      nfsd (pid 10201 10200 10199 10198 10197 10196 10195 10194) is running...
      rpc.rquotad (pid 10189) is running...

  • NFS compound failed for server foosrv: error 7 (RPC: Authentication error)

    - by automatthias
    I'm setting up an Ubuntu NFS server with a Solaris 10 client. The basic configuration looks okay to me, and it was also working for some time. I'm getting an "RPC: Authentication error" message on the client.

    Server, /etc/exports:

      /export/opencsw-future 192.168.3.0/24(rw,nohide,insecure,no_subtree_check,async)
      /export/opencsw-current 192.168.3.0/24(rw,nohide,insecure,no_subtree_check,async)

      $ ls -ld /export/opencsw-current
      drwxr-xr-x 7 maciej maciej 4096 2012-02-05 14:55 /export/opencsw-current

    Client:

      $ grep opencsw /etc/vfstab
      foosrv:/opencsw-current - /export/opencsw-current nfs - yes -

      $ sudo mount /export/opencsw-current
      NFS compound failed for server foosrv: error 7 (RPC: Authentication error)
      (...repeated...)
      nfs mount: mount: /export/opencsw-current: Permission denied

    My server host name resolves to both IPv4 and IPv6 addresses.

  • Someone tried to hack my Node.js server, need to understand a GET request in the logs

    - by Akay
    Alright, so I left my Node.js server alone for a while and came back to find some really interesting stuff in the logs. Apparently some moron from China or Poland tried to hack my server using directory traversal and what not, while it seems though he did not succeed I am unable understand few entries in the log. This is the output of a "hohup.out" file. The attack starts, apparently he is trying to find out some console entry in my server. All of which fail and return a 404. [90mGET /../../../../../../../../../../../ [31m500 [90m6ms - 2b[0m [90mGET /<script>alert(53416)</script> [33m404 [90m7ms[0m [90mGET / [32m200 [90m2ms - 240b[0m [90mGET / [32m200 [90m1ms - 240b[0m [90mGET / [32m200 [90m2ms - 240b[0m [90mGET /pz3yvy3lyzgja41w2sp [33m404 [90m1ms[0m [90mGET /stylesheets/style.css [33m404 [90m0ms[0m [90mGET /index.html [33m404 [90m1ms[0m [90mGET /index.htm [33m404 [90m0ms[0m [90mGET /default.html [33m404 [90m0ms[0m [90mGET /default.htm [33m404 [90m1ms[0m [90mGET /default.asp [33m404 [90m1ms[0m [90mGET /index.php [33m404 [90m0ms[0m [90mGET /default.php [33m404 [90m1ms[0m [90mGET /index.asp [33m404 [90m0ms[0m [90mGET /index.cgi [33m404 [90m0ms[0m [90mGET /index.jsp [33m404 [90m1ms[0m [90mGET /index.php3 [33m404 [90m0ms[0m [90mGET /index.pl [33m404 [90m0ms[0m [90mGET /default.jsp [33m404 [90m0ms[0m [90mGET /default.php3 [33m404 [90m0ms[0m [90mGET /index.html.en [33m404 [90m0ms[0m [90mGET /web.gif [33m404 [90m34ms[0m [90mGET /header.html [33m404 [90m1ms[0m [90mGET /homepage.nsf [33m404 [90m1ms[0m [90mGET /homepage.htm [33m404 [90m1ms[0m [90mGET /homepage.asp [33m404 [90m1ms[0m [90mGET /home.htm [33m404 [90m0ms[0m [90mGET /home.html [33m404 [90m1ms[0m [90mGET /home.asp [33m404 [90m1ms[0m [90mGET /login.asp [33m404 [90m0ms[0m [90mGET /login.html [33m404 [90m0ms[0m [90mGET /login.htm [33m404 [90m1ms[0m [90mGET /login.php [33m404 [90m0ms[0m [90mGET /index.cfm [33m404 [90m0ms[0m [90mGET /main.php [33m404 [90m1ms[0m [90mGET /main.asp [33m404 [90m1ms[0m [90mGET /main.htm [33m404 [90m1ms[0m [90mGET /main.html [33m404 [90m2ms[0m [90mGET /Welcome.html [33m404 [90m1ms[0m [90mGET /welcome.htm [33m404 [90m1ms[0m [90mGET /start.htm [33m404 [90m1ms[0m [90mGET /fleur.png [33m404 [90m0ms[0m [90mGET /level/99/ [33m404 [90m1ms[0m [90mGET /chl.css [33m404 [90m0ms[0m [90mGET /images/ [33m404 [90m0ms[0m [90mGET /robots.txt [33m404 [90m2ms[0m [90mGET /hb1/presign.asp [33m404 [90m1ms[0m [90mGET /NFuse/ASP/login.htm [33m404 [90m0ms[0m [90mGET /CCMAdmin/main.asp [33m404 [90m1ms[0m [90mGET /TiVoConnect?Command=QueryServer [33m404 [90m1ms[0m [90mGET /admin/images/rn_logo.gif [33m404 [90m1ms[0m [90mGET /vncviewer.jar [33m404 [90m1ms[0m [90mGET / [32m200 [90m2ms - 240b[0m [90mGET / [32m200 [90m2ms - 240b[0m [90mGET / [32m200 [90m7ms - 240b[0m [90mOPTIONS / [32m200 [90m1ms - 3b[0m [90mTRACE / [33m404 [90m0ms[0m [90mPROPFIND / [33m404 [90m0ms[0m [90mGET /\./ [33m404 [90m1ms[0m But here is when things start getting fishy. 
[90mGET http://www.google.com/ [32m200 [90m2ms - 240b[0m [90mGET http://www.google.com/ [32m200 [90m1ms - 240b[0m [90mGET http://www.google.com/ [32m200 [90m1ms - 240b[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET http://www.google.com/ [32m200 [90m1ms - 240b[0m [90mGET / [32m200 [90m2ms - 240b[0m [90mGET / [32m200 [90m1ms - 240b[0m [90mGET /robots.txt [33m404 [90m1ms[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET http://www.google.com/ [32m200 [90m1ms - 240b[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET /manager/html [33m404 [90m0ms[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET /manager/html [33m404 [90m3ms[0m [90mGET /manager/html [33m404 [90m0ms[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET /manager/html [33m404 [90m0ms[0m [90mGET http://www.google.com/ [32m200 [90m1ms - 240b[0m [90mGET http://37.28.156.211/sprawdza.php [33m404 [90m1ms[0m [90mGET http://www.google.com/ [32m200 [90m1ms - 240b[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET http://www.google.com/ [32m200 [90m2ms - 240b[0m [90mHEAD / [32m200 [90m1ms - 240b[0m [90mGET http://www.daydaydata.com/proxy.txt [33m404 [90m19ms[0m [90mHEAD / [32m200 [90m1ms - 240b[0m [90mGET /manager/html [33m404 [90m2ms[0m [90mGET / [32m200 [90m4ms - 240b[0m [90mGET http://www.google.pl/search?q=wp.pl [33m404 [90m1ms[0m [90mGET /manager/html [33m404 [90m0ms[0m [90mHEAD / [32m200 [90m2ms - 240b[0m [90mGET http://www.google.pl/search?q=onet.pl [33m404 [90m1ms[0m [90mHEAD / [32m200 [90m2ms - 240b[0m [90mGET http://www.google.com/ [32m200 [90m1ms - 240b[0m [90mGET http://www.google.pl/search?q=ostro%C5%82%C4%99ka [33m404 [90m1ms[0m [90mGET http://www.google.pl/search?q=google [33m404 [90m1ms[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET http://www.google.com/ [32m200 [90m2ms - 240b[0m [90mHEAD / [32m200 [90m2ms - 240b[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET /manager/html [33m404 [90m0ms[0m [90mGET / [32m200 [90m2ms - 240b[0m [90mGET http://www.baidu.com/ [32m200 [90m2ms - 240b[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mPOST /api/login [32m200 [90m1ms - 28b[0m [90mGET /web-console/ServerInfo.jsp [33m404 [90m2ms[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET http://www.google.com/ [32m200 [90m10ms - 240b[0m [90mGET http://www.google.com/ [32m200 [90m1ms - 240b[0m [90mGET / [32m200 [90m2ms - 240b[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET http://proxyjudge.info [32m200 [90m2ms - 240b[0m [90mGET / [32m200 [90m2ms - 240b[0m [90mGET / [32m200 [90m1ms - 240b[0m [90mGET http://www.google.com/ [32m200 [90m3ms - 240b[0m [90mGET http://www.google.com/ [32m200 [90m3ms - 240b[0m [90mGET http://www.baidu.com/ [32m200 [90m1ms - 240b[0m [90mGET /manager/html [33m404 [90m0ms[0m [90mGET /manager/html [33m404 [90m1ms[0m [90mGET http://www.google.com/ [32m200 [90m2ms - 240b[0m [90mHEAD / [32m200 [90m1ms - 240b[0m [90mGET http://www.google.com/ [32m200 [90m1ms - 240b[0m [90mGET http://www.google.com/search?tbo=d&source=hp&num=1&btnG=Search&q=niceman [33m404 [90m2ms[0m So my questions are, how come my server is returning a "200" OK for root level domains? How did the hacker even manage to send a GET request to my server such that "http://www.google.com" shows up in the log while my server is simply an API that works on relative URLs such as "/api/login". 
    And, while I looked up the OPTIONS, TRACE and PROPFIND HTTP requests that my server has logged, it would be great if someone could explain what exactly the hacker was trying to achieve by using these verbs. Also, what in the world does "[90m [32m [90m1ms - 240b[0m" mean? The "ms" makes sense, probably milliseconds for the request; the rest I am unable to understand. Thank you!

  • Using Fiddler with BizTalk's HTTP Adapter

    - by Christopher House
    I'm working on an orchestration that's retrieving some data from a Java servlet. The servlet takes a parameter string via HTTP POST and returns POX (plain old XML, no SOAP here). I was having trouble getting a valid response from the servlet when I was sending some test messages, and wanted to see what my messages looked like as they went across the wire. If I were using WCF, I'd normally set up message logging, but since that's obviously not an option with the HTTP adapter, my thoughts turned to Fiddler. A quick Google search turned up some promising results. The posts I read all referred to using Fiddler with the SOAP adapter, but I thought I could apply the same ideas to the HTTP adapter. This led me to try setting the following context properties:

      HttpRequestMessage(HTTP.UseProxy) = true;
      HttpRequestMessage(HTTP.ProxyName) = "127.0.0.1";
      HttpRequestMessage(HTTP.ProxyPort) = 8888;

    I rebuilt my orch, gac'd it, bounced my host and tried submitting a test message. Fiddler was running, but I didn't see any traffic show up. I tried fully undeploying/redeploying my application and still, no traffic in Fiddler. I was starting to think that BizTalk was ignoring the proxy settings. To confirm this, I closed Fiddler and submitted a test message. Sure enough, the orch ran to completion, proving that BizTalk was ignoring the proxy settings.

    I went back to my orch to see if there could be any other context properties I needed to set. I saw one that looked promising: HTTP.UseHandlerProxySettings. I set this to false, rebuilt my orch, and this time when I submitted, I got an error message, which made sense: I didn't have Fiddler running. I started up Fiddler, submitted another message and there it was, my HTTP traffic, just as I hoped. And I was quickly able to figure out what the problem was... I had forgotten to set HTTP.ContentType to application/x-www-form-urlencoded.
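
    For reference, a minimal expression-shape sketch pulling together the combination that finally sent traffic through Fiddler, assuming Fiddler's default port of 8888 and a message named HttpRequestMessage as above:

      // Route the outbound HTTP adapter call through the local Fiddler instance
      HttpRequestMessage(HTTP.UseProxy) = true;
      HttpRequestMessage(HTTP.ProxyName) = "127.0.0.1";
      HttpRequestMessage(HTTP.ProxyPort) = 8888;
      // Without this, the send handler's own proxy settings take precedence and
      // the per-message proxy properties above are ignored
      HttpRequestMessage(HTTP.UseHandlerProxySettings) = false;
      // The servlet expects a form-encoded parameter string
      HttpRequestMessage(HTTP.ContentType) = "application/x-www-form-urlencoded";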

  • Specifying culture for HTTP request/response

    - by Akash
    I have a RESTful web service which needs to parse culture-sensitive data from the request. This data could either be in an XML body or part of the query string. Is there any accepted way of determining which culture the data is being sent in (and, by extension, the culture in which the response should be sent)? One option is simply to specify to the clients the culture in which all requests should be sent. A friendlier option seems to be to allow the client to specify the culture. I've considered:

    a) using the Accept-Language HTTP header to encode this information
    b) using the xml:lang attribute for XML POSTs, and an extra field for query strings (e.g. ...&culture=en-GB)

    http://www.w3.org/International/questions/qa-accept-lang-locales warns of limitations in using the Accept-Language header, but most of the warnings seem to center on requests originating from browsers. In my case the requests will come from other applications. All advice greatly appreciated!
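
    For illustration, a sketch of the two request shapes being weighed; the resource path, body and the culture parameter name are placeholders, not part of any existing API:

      Option (a) - standard header:

        POST /orders HTTP/1.1
        Host: api.example.com
        Content-Type: application/xml
        Accept-Language: en-GB

        <order xml:lang="en-GB">
          <deliveryDate>01/02/2011</deliveryDate>
          <total>1,250.75</total>
        </order>

      Option (b) - explicit query-string field:

        GET /orders?deliveryDate=01/02/2011&culture=en-GB HTTP/1.1
        Host: api.example.com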

  • Getting an internal server error using rest-client in Ruby to make an HTTP POST

    - by Angela
    Hi, this is my code and I don't know how to debug it because I just get an "internal server error". I am trying to HTTP POST to an external ASPX:

      def upload
        uri = 'https://api.postalmethods.com/2009-02-26/PostalWS.asmx' # PostalMethods URI
        # https://api.postalmethods.com/2009-02-26/PostalWS.asmx?op=UploadFile
        # http://www.postalmethods.com/method/2009-02-26/UploadFile
        @postalcard = Postalcard.find(:last)
        # Username=string&Password=string&MyFileName=string&FileBinaryData=string&FileBinaryData=string&Permissions=string&Description=string&Overwrite=string
        filename = @postalcard.postalimage.original_filename
        filebinarydata = File.open("#{@postalcard.postalimage.path}", 'rb')
        body = "Username=me&Password=sekret&MyFileName=#{filename}&FileBinaryData=#{filebinarydata}"
        @response = RestClient.post(uri, body, # body as string
          { "Content-Type" => 'application/x-www-form-urlencoded',
            "Content-Length" => @postalcard.postalimage.size } # end headers
        ) # close arguments to RestClient.post
      end
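
    As a sketch only - the field names come from the parameter list quoted in the comment above, everything else is an assumption: interpolating the File object into the string puts something like "#<File:0x...>" into the body rather than the file's bytes. Reading the contents and handing rest-client a hash lets it form-encode the body and set Content-Type/Content-Length itself:

      def upload
        uri = 'https://api.postalmethods.com/2009-02-26/PostalWS.asmx'
        @postalcard = Postalcard.find(:last)

        # Actually read the file's bytes instead of interpolating the File object
        filebinarydata = File.open(@postalcard.postalimage.path, 'rb') { |f| f.read }

        payload = {
          'Username'       => 'me',
          'Password'       => 'sekret',
          'MyFileName'     => @postalcard.postalimage.original_filename,
          'FileBinaryData' => filebinarydata
        }

        # rest-client form-encodes a hash payload and sets the headers itself
        @response = RestClient.post(uri, payload)
      end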

  • Track http domain referer

    - by tony noriega
    Can I track the HTTP referer with JavaScript and append a variable to the URL string to store in a database? Or could I track a cookie that the user gets? (Very layman's terms here, sorry.) If the HTTP referrer is domain.com, add '&referer=google' to the URL, which should stay with them during their session. Or: when a user clicks my Google AdWords ad, they get a cookie with a referring domain in it; try to read that cookie and append the same variable. Any thoughts?
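
    For illustration, a minimal client-side sketch of the cookie idea, assuming a session cookie named 'referer' (the name and the matching rule are placeholders):

      // Read the referring URL once, when the visitor lands on the site
      var ref = document.referrer;   // e.g. "https://www.google.com/search?q=..."
      var source = /google\./i.test(ref) ? 'google' : '';

      // Store it in a session cookie (no expires, so it lasts for the browser session)
      if (source && !/(^|; )referer=/.test(document.cookie)) {
        document.cookie = 'referer=' + encodeURIComponent(source) + '; path=/';
      }

      // Later, read it back and append it to a URL or a form submission
      var match = document.cookie.match(/(?:^|; )referer=([^;]*)/);
      var referer = match ? decodeURIComponent(match[1]) : '';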

  • HTTP redirect fallback

    - by Ondrej Stastny
    Hi, is there a way to provide a fallback URL when an HTTP redirect times out? What I'm trying to achieve: when the original URL is hit, I would respond with an HTTP redirect (300, 307?) giving the browser the new URL plus a fallback URL in case the new URL times out. I was also considering doing this client side, but it just does not seem to be very effective (in terms of speed and client support). What I would do is probably have a tiny 1x1px image on each server, try loading it, and then check with JavaScript which server is up and redirect there. Any other ideas? Thanks
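
    A rough sketch of the client-side probe described above, assuming each candidate server exposes a tiny image at a made-up path /ping.gif and that a few seconds is an acceptable wait:

      // Try to load a 1x1 probe image from the preferred server; if it fails
      // or takes too long, fall back to the secondary server.
      function probeAndRedirect(primary, fallback, timeoutMs) {
        var img = new Image();
        var done = false;

        var timer = setTimeout(function () {
          if (!done) { done = true; window.location.href = fallback; }
        }, timeoutMs);

        img.onload = function () {
          if (!done) { done = true; clearTimeout(timer); window.location.href = primary; }
        };
        img.onerror = function () {
          if (!done) { done = true; clearTimeout(timer); window.location.href = fallback; }
        };

        // Cache-buster so a cached copy doesn't hide a dead server
        img.src = primary + '/ping.gif?' + new Date().getTime();
      }

      probeAndRedirect('http://server-a.example.com', 'http://server-b.example.com', 3000);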

  • HTTP header for sending PDF, problem in Firefox

    - by David
    On Windows, when I save a PDF with the Firefox Adobe Reader plugin, this problem occurs. The file is saved as:

      http://www.example.com/opendocument.php_doc=._docs_doc01

    My headers are:

      header('Content-type: application/pdf');
      //header('Content-Disposition: inline; filename=doc01.pdf');
      header("Content-Transfer-Encoding: binary");
      header("Content-Length: ".filesize($pdf));

    The original call is:

      http://www.example.com/opendocument.php?doc=./docs/doc01.pdf

    I'm not interested in the attachment header. It must open inside the website, not download or open in an external window. Any idea?
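
    A sketch of the header set that usually gives the plugin a sane default file name while still displaying inline; the quoted filename value is the only real change from the headers above, and readfile() stands in for however the file is actually streamed:

      <?php
      // Serve the PDF inline, but give the viewer an explicit, quoted file name
      // so "Save as" doesn't fall back to the mangled script URL.
      header('Content-Type: application/pdf');
      header('Content-Disposition: inline; filename="doc01.pdf"');
      header('Content-Transfer-Encoding: binary');
      header('Content-Length: ' . filesize($pdf));
      readfile($pdf);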

  • HTTP Content-type header for cached files

    - by Brian
    Hello, Using Apache with mod_rewrite, when I load a .css or .js file and view the HTTP headers, the Content-type is only set correctly the first time I load it - subsequent refreshes are missing Content-type altogether and it's creating some problems for me. I can get around this by appending a random query string value to the end of each filename, eg. http://www.site.com/script.js?12345 However, I don't want to have to do that, since caching is good and all I want is for the Content-type to be present. I've tried using a RewriteRule to force the type but still didn't solve the problem. Any ideas? Thanks, Brian
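
    Not a fix for whatever is dropping the header on the cached responses, but one way to rule out the rewrite layer is to force the types explicitly, e.g. in .htaccess (the extensions and types here are examples only):

      # Map the extensions to explicit MIME types
      AddType text/css .css
      AddType application/javascript .js

      # Or force the type for everything matching a pattern
      <FilesMatch "\.css$">
          ForceType text/css
      </FilesMatch>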

  • Jaxer and HTTP proxy requests...

    - by rakhavan
    Thanks to everyone in advance. I'm using Jaxer.sandbox and making requests just fine. I'd like these requests to go through my HTTP proxy (Squid, for example). Here is the code that is currently working for me:

      window.onload = function() {
        // the url to scrape
        var url = "http://www.cnn.com/";

        // our sandboxed browser
        var sandbox = new Jaxer.Sandbox();

        // open options
        var openOptions = new Jaxer.Sandbox.OpenOptions();
        openOptions.allowJavaScript = false;
        openOptions.allowMetaRedirects = false;
        openOptions.allowSubFrames = false;
        openOptions.allowSubFrames = false;
        openOptions.onload = function() {
          // do something onload
        };

        // make the call
        sandbox.open(url, null, openOptions);

        // write the response
        Jaxer.response.setContents(sandbox.toHTML());
      };

    How can I send this request through a proxy server? Thanks, Reza.

  • Are Rails timers reliable when using Net::HTTP?

    - by Frank
    Hi all. When reading data from a potentially slow website, I want to ensure that get_response cannot hang, and so I added a timer to time out after x seconds. So far, so good. I then read http://ph7spot.com/musings/system-timer which illustrates that in certain situations timer.rb doesn't work due to Ruby's implementation of threads. Does anyone know if this is one of those situations?

      url = URI.parse(someurl)
      begin
        Timeout::timeout(30) do
          response = Net::HTTP.get_response(url)
          @responseValue = CGI.unescape(response.body)
        end
      rescue Exception => e
        dosomething
      end
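
    One option worth sketching: lean on Net::HTTP's own open_timeout and read_timeout; the read timeout in particular is enforced with select on the socket rather than by a separate watchdog thread. The 30-second values are just examples:

      require 'net/http'
      require 'cgi'
      require 'uri'

      url = URI.parse(someurl)

      http = Net::HTTP.new(url.host, url.port)
      http.open_timeout = 30   # seconds allowed for establishing the TCP connection
      http.read_timeout = 30   # seconds allowed for each read of the response

      begin
        response = http.request(Net::HTTP::Get.new(url.request_uri))
        @responseValue = CGI.unescape(response.body)
      rescue Timeout::Error, SystemCallError => e
        dosomething
      end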

  • How to display characters in an HTTP GET response correctly with the right encoding

    - by DixieFlatline
    Hello! Does anyone know how to read the c, š, ž characters in an HTTP GET response properly? When I make my request in a browser, the browser displays all characters correctly. But in a Java program with the Apache jars I don't know how to set the encoding right. I tried with

      client.getParams().setParameter(CoreProtocolPNames.HTTP_CONTENT_CHARSET, "UTF-8");

    but it's not working. My code:

      HttpClient client = new DefaultHttpClient();
      String getURL = "http://www.google.com";
      HttpGet get = new HttpGet(getURL);
      try {
        HttpResponse responseGet = client.execute(get);
        HttpEntity resEntityGet = responseGet.getEntity();
        if (resEntityGet != null) {
          Log.i("GET RESPONSE", EntityUtils.toString(resEntityGet));
        }
      } catch (Exception e) {
        e.printStackTrace();
      }
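
    A sketch of one thing worth trying (not a guaranteed fix): EntityUtils.toString() decodes with the charset from the response's Content-Type header and falls back to ISO-8859-1 when none is declared, so forcing the charset at that point often helps:

      HttpClient client = new DefaultHttpClient();
      HttpGet get = new HttpGet("http://www.google.com");
      try {
        HttpResponse response = client.execute(get);
        HttpEntity entity = response.getEntity();
        if (entity != null) {
          // Decode the body explicitly as UTF-8 instead of relying on the
          // (possibly missing) charset in the response headers.
          String body = EntityUtils.toString(entity, "UTF-8");
          Log.i("GET RESPONSE", body);
        }
      } catch (Exception e) {
        e.printStackTrace();
      }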

  • HTTP Data chunks over multiple packets?

    - by myforwik
    What is the correct way for an HTTP server to send data over multiple packets? For example, I want to transfer a file. The first packet I send is:

      HTTP/1.1 200 OK
      Content-type: application/force-download
      Content-Type: application/download
      Content-Type: application/octet-stream
      Content-Description: File Transfer
      Content-disposition: attachment; filename=test.dat
      Content-Transfer-Encoding: chunked

      400
      <first 1024 bytes here>
      400
      <next 1024 bytes here>
      400
      <next 1024 bytes here>

    Now I need to make a new packet. If I just send:

      400
      <next 1024 bytes here>

    all the clients close their connections on me and the files are cut short. What headers do I put in a second packet to continue on with the data stream?
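
    For reference, a sketch of what a chunked response looks like on the wire: the relevant header is Transfer-Encoding (Content-Transfer-Encoding is a MIME header, not HTTP), each chunk is a hexadecimal size, CRLF, exactly that many bytes, CRLF, and the stream ends with a zero-length chunk. No extra headers are needed for later packets, since HTTP doesn't care how the byte stream is split across TCP packets:

      HTTP/1.1 200 OK
      Content-Type: application/octet-stream
      Content-Disposition: attachment; filename=test.dat
      Transfer-Encoding: chunked
      <CRLF>
      400<CRLF>
      <exactly 1024 bytes of file data><CRLF>
      400<CRLF>
      <exactly 1024 bytes of file data><CRLF>
      0<CRLF>
      <CRLF>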

  • Ruby on Rails: reducing complexity of parameters in a RESTful HTTP POST request (multi-model)

    - by randombits
    I'm using cURL to test a RESTful HTTP web service. The problem is I'm normally submitting a bunch of values like this:

      curl -d "firstname=bob&lastname=smith&age=25&from=kansas&someothermodelattr=val" -H "Content-Type: application/x-www-form-urlencoded" http://mysite/people.xml -i

    The problem with this is my controller will then have code like this:

      unless params[:firstname].nil?
      end
      unless params[:lastname].nil?
      end
      # FINALLY
      @person = People.new(params[:firstname], params[:lastname], params[:age], params[:from])

    etc. What's the best way to simplify this? My Person model has all the validations it needs. Is there a way (assuming the request has multi-model parameters) that I can just do @person = People.new(params[:person]) and then the initializer can take care of the rest?
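
    A sketch of the conventional Rails shape, assuming the model really is called People as above: nest the form fields under a person key and let the hash constructor do the work (the controller action and responses are illustrative only):

      curl -d "person[firstname]=bob&person[lastname]=smith&person[age]=25&person[from]=kansas" \
           -H "Content-Type: application/x-www-form-urlencoded" \
           http://mysite/people.xml -i

      # PeopleController
      def create
        # params[:person] arrives as a hash of the nested fields
        @person = People.new(params[:person])
        if @person.save
          render :xml => @person, :status => :created
        else
          render :xml => @person.errors, :status => :unprocessable_entity
        end
      end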

  • How do I make an HTTP Post with HTTP Basic Authentication, using POCO?

    - by Alyoshak
    I'm trying to make an HTTP POST with HTTP Basic Authentication (cleartext username and password), using POCO. I found an example of a GET and have tried to modify it, but being a rookie I think I've mangled it beyond usefulness. Does anyone know how to do this? Yes, I've already seen the other SO question on this: POCO C++ - NET SSL - how to POST HTTPS request, but I can't make sense of how it is trying to implement the username and password part. I also don't understand the use of "x-www-form-urlencoded". Is this required for a POST? I don't have a form; I just want to POST to the server with username and password parameters.
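
    A minimal sketch of the Basic-auth part with POCO's Poco::Net classes (the URL and body are placeholders, and SSL context/certificate setup is omitted). HTTPBasicCredentials does nothing more than add the "Authorization: Basic ..." header, and x-www-form-urlencoded is only needed if the body itself consists of form fields:

      #include <iostream>
      #include "Poco/URI.h"
      #include "Poco/Net/HTTPSClientSession.h"
      #include "Poco/Net/HTTPRequest.h"
      #include "Poco/Net/HTTPResponse.h"
      #include "Poco/Net/HTTPBasicCredentials.h"
      #include "Poco/StreamCopier.h"

      int main()
      {
          Poco::URI uri("https://example.com/api/resource");   // placeholder URL
          Poco::Net::HTTPSClientSession session(uri.getHost(), uri.getPort());

          std::string body = "whatever the server expects";     // placeholder body

          Poco::Net::HTTPRequest request(Poco::Net::HTTPRequest::HTTP_POST,
                                         uri.getPathAndQuery(),
                                         Poco::Net::HTTPMessage::HTTP_1_1);
          request.setContentType("text/plain");                 // no form, so no x-www-form-urlencoded
          request.setContentLength(body.length());

          // The Basic-auth part: base64-encodes "username:password" into an
          // Authorization header on the request.
          Poco::Net::HTTPBasicCredentials creds("username", "password");
          creds.authenticate(request);

          session.sendRequest(request) << body;

          Poco::Net::HTTPResponse response;
          std::istream& rs = session.receiveResponse(response);
          std::cout << response.getStatus() << " " << response.getReason() << std::endl;
          Poco::StreamCopier::copyStream(rs, std::cout);
          return 0;
      }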

  • Java HTTP Client Request with defined timeout

    - by Maxim Veksler
    Hello, I would like to run BITs (built-in tests) against a number of servers in my cloud. I need the request to fail rather than hang on a long timeout. How should I do this with Java? Trying something like the below does not seem to work:

      public class TestNodeAliveness {
        public static NodeStatus nodeBIT(String elasticIP) throws ClientProtocolException, IOException {
          HttpClient client = new DefaultHttpClient();
          client.getParams().setIntParameter("http.connection.timeout", 1);

          HttpUriRequest request = new HttpGet("http://192.168.20.43");
          HttpResponse response = client.execute(request);

          System.out.println(response.toString());
          return null;
        }

        public static void main(String[] args) throws ClientProtocolException, IOException {
          nodeBIT("");
        }
      }

    -- EDIT: Clarify what library is being used --

    I'm using httpclient from Apache; here is the relevant pom.xml section:

      <dependency>
        <groupId>org.apache.httpcomponents</groupId>
        <artifactId>httpclient</artifactId>
        <version>4.0.1</version>
        <type>jar</type>
      </dependency>
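
    A sketch of setting both timeouts through HttpConnectionParams (HttpClient 4.0.x). The connection timeout only covers establishing the TCP connection, so the socket (read) timeout usually matters as well, and the values are in milliseconds - the 1 above means one millisecond:

      import java.io.IOException;

      import org.apache.http.HttpResponse;
      import org.apache.http.client.HttpClient;
      import org.apache.http.client.methods.HttpGet;
      import org.apache.http.impl.client.DefaultHttpClient;
      import org.apache.http.params.HttpConnectionParams;

      public class TestNodeAliveness {
          public static void nodeBIT(String elasticIP) throws IOException {
              HttpClient client = new DefaultHttpClient();

              // Fail if the TCP connection can't be established within 5 seconds...
              HttpConnectionParams.setConnectionTimeout(client.getParams(), 5000);
              // ...or if the server goes silent for 10 seconds mid-response.
              HttpConnectionParams.setSoTimeout(client.getParams(), 10000);

              HttpResponse response = client.execute(new HttpGet("http://" + elasticIP + "/"));
              System.out.println(response.getStatusLine());
          }

          public static void main(String[] args) throws IOException {
              nodeBIT("192.168.20.43");   // example address from the code above
          }
      }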

  • How to communicate/share a session between pages over HTTP and HTTPS

    - by spirytus
    What is common practice for coding web applications where part of the site has to be secured (e.g. the checkout section) and part not necessarily, let's say the homepage? As far as I know, sharing sessions between the HTTP and HTTPS parts of the site is not easily possible (or is it?). What would be a common approach if I wanted to display, on an HTTP page like the homepage, the shopping cart data (items) that users ordered on HTTPS pages? How would those two parts of the site communicate if necessary? Also, isn't it a security flaw in popular shopping carts that many of them seem to have only the checkout pages secured (SSL) and the rest not? I'm using PHP, if it makes any difference.

  • WCF client using basic HTTP authentication

    - by AZ
    I'm trying to connect to a service that uses basic HTTP authentication. I've configured my binding like this:

      <bindings>
        <basicHttpBinding>
          <binding name="binding">
            <security mode="TransportCredentialOnly">
              <transport clientCredentialType="Basic"/>
            </security>
          </binding>
        </basicHttpBinding>
      </bindings>

    and I'm setting the credentials like this:

      client.ClientCredentials.UserName.UserName = Settings.UserName;
      client.ClientCredentials.UserName.Password = Settings.Password;

    Still, when I make a request I get a "The HTTP request is unauthorized with client authentication scheme 'Basic'" fault back. What am I doing wrong? (I don't have control over the service, so all solutions must relate to the client configuration.)

  • J2ME's extra annoying HTTP permission prompt

    - by Hans Malherbe
    Some phones only prompt the user for permission the first time a connection is made. Others pop up the permission prompt whenever the MIDlet attempts to make an HTTP connection! What are the options if we want to suppress the prompt? Can we sign the JAR using only one CA (Certificate Authority) and have it work on all devices? Do we have to pay for a signature on every release? Is it an option to create our own CA certificate and tell our customers to install it on their device? Alternatively, it seems that plain socket connections do not suffer from this. Is there a free implementation of HTTP on top of TCP for J2ME?

  • failed to open stream: HTTP request failed! HTTP/1.1 400 Bad Request

    - by muralikalpana
    I am accessing images from another website. I am getting a "failed to open stream: HTTP request failed! HTTP/1.1 400 Bad Request" error when copying some (not all) images. Here is my code:

      $img = $_GET['img']; // another website's URL

      $file = $img;
      function getFileextension($file) {
        return end(explode(".", $file));
      }
      $fileext = getFileextension($file);

      if ($fileext == 'jpg' || $fileext == 'gif' || $fileext == 'jpeg' || $fileext == 'png' || $fileext == 'x-png' || $fileext == 'pjpeg') {
        if ($img != '') {
          $rand_variable1 = rand(10000, 100000);
          $node_online_name1 = $rand_variable1 . "image." . $fileext;
          $s = copy($img, "images/" . $node_online_name1);
        }
      }
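
    One thing that commonly produces a 400 for only some remote files is unencoded characters (spaces and the like) in the source URL. A quick sketch of encoding the obvious offender before calling copy() - a fuller fix would rebuild the URL with parse_url() and rawurlencode():

      // Percent-encode spaces in the remote URL so the HTTP request line is valid
      $safe_img = str_replace(' ', '%20', $img);
      $s = copy($safe_img, "images/" . $node_online_name1);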

  • Redirected wikipedia request

    - by Le_Coeur
    Hi people, I need to write a program that redirects http://localhost:8080 to en.wikipedia.org. It seems easy, but I have some problems (only with Wikipedia; other sites work fine). I build the URL to Wikipedia:

      URL url = new URL("http", "en.wikipedia.org", 80, "/wiki");

    then open a URLConnection, extract the headers, and when I call connection.getInputStream() I receive a 404 Not Found. So I tried a hack for the Host header, because this way the Host header is localhost:8080; therefore I changed the Host header to Wikipedia, and it works. But after requesting http://localhost:8080 in the browser, Wikipedia opens and the URL in the browser changes to en.wikipedia.org, while I want to proceed with localhost :)
