Search Results

Search found 57810 results on 2313 pages for 'http delete'.

Page 141/2313

  • How do you redirect https to http

    - by mauriciopastrana
    that is, the opposite of what (seemingly) everyone teaches. I have a server on https for which I paid for an SSL cert, and a mirror for which I haven't and which I keep around just for emergencies, so it doesn't merit getting a cert. On my clients' desktops I have SOME shortcuts which point to http://production_server and https://productionserver (both work); however, I know that if my prod. server goes down, then DNS forwarding kicks in and those clients which have https in their shortcut will be staring at https://mirrorserver (which doesn't work) and a big fat IE7 red screen of uneasiness for my company. Unfortunately, I can't just switch this around at the client level. These users are very computer illiterate and are very likely to freak out from seeing https "insecurity" errors (especially the way FFX3 and IE7 handle them nowadays: FULL STOP, kinda thankfully, but not helping me here LOL). It's very easy to find Apache solutions for http-to-https redirection, but for the life of me I can't do the opposite. Ideas? Cheers, /mp

    Read the article

  • SQL Server Reporting Services proxy timeout (ASP.NET)

    - by Philip
    Morning. We are using SSRS (2005) and have an ASP.NET frontend using the SSRS WebControl. I've boiled the problem down to this: the time it takes for one particular report to be generated is greater than the timeout on the proxy server. It looks like the way the SSRS web control tries to do things is by performing an HTTP request for the report; the problem is that this request can potentially time out before the report has been generated. Looking at the HTTP traffic, the response is a 504 (gateway timeout). Is there a way to increase the timeout, or to change the SSRS WebControl to use a more robust polling mechanism (one which isn't dependent on the timeout of the HTTP request)? I could be wrong, but I don't think the ServerReport.Timeout property would resolve the issue we are seeing. Any thoughts? Philip

    Read the article

  • Can't access web service when connected to the network :: HTTP 407

    - by Ian
    Hi All, I have a console application that communicates with a web service. Both of them are on the same machine. When I access the web service with the LAN disabled, it connects without a problem. But if the LAN is enabled and connected to our office network, I receive this error: "HTTP 407 Proxy Authentication required - The ISA Server requires authorization to fulfill the request. Access to the Web Proxy service is denied." We've been hunting the source of the problem for three days now and have tried everything we can think of. Any ideas what's causing it? Additional notes: the machine is in a Workgroup setup but with a DNS suffix (computer.local), and when accessing the web service we type the address as "http://machine.computer.local/service.asmx". I talked to the IT guys and they said that we don't have an ISA Server installed. There is no proxy set in IE. The machine is in mint condition.
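
    One thing worth checking before blaming ISA: on Windows, HTTP clients often pick up a machine-level proxy (WinHTTP or IE settings, or environment variables) even when nothing shows in the IE dialog, and .NET web service proxies typically use the system proxy by default. As a quick diagnostic, the sketch below (Python 2; the service URL is the one from the question) prints whatever proxy the OS exposes and then forces a proxy-free request:

        # Diagnostic sketch: is a system-level proxy being auto-detected?
        import urllib
        import urllib2

        print urllib.getproxies()   # proxies picked up from the environment/registry, if any

        # Force a direct connection (an empty ProxyHandler bypasses all proxies)
        # and see whether the 407 disappears.
        direct = urllib2.build_opener(urllib2.ProxyHandler({}))
        print direct.open("http://machine.computer.local/service.asmx").getcode()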

    Read the article

  • Why is wget so much faster than Firefox at some downloads?

    - by Earlz
    Recently, I needed to do an update of Xilinx WebPack, and mind you, this is one hefty piece of software. It weighs in at 6 gigs, which definitely isn't "quick" on any internet connection I've ever had available to me. So, when I went to download it (using Firefox, of course), I was very... unsettled by the fact that the download was only going at 110 kByte/s. My internet connection is capable of about 2200 kByte/s download, so what gives!? My workaround in the past for this issue has been to take the link to my Linode Linux server and download it there with wget, where the download will zip along at 14 MByte/s, and then either copy it to my website directory and download it that way over HTTP, or use sftp. Both ways work about equally well and will sufficiently max out my connection. However, I recently figured out the missing variable: I tried doing the download locally with wget and was able to max out my connection! TL;DR: Now, my question. Why is wget so much faster than Firefox at downloading this file? I hardly ever see such a difference in download speeds except with this one file.

    Read the article

  • How does NHibernate handle cascade="all-delete-orphan"?

    - by Johannes Rudolph
    I've been digging around the NHibernate sources a little, trying to understand how NHibernate implements removing child elements from a collection. I think I've already found the answer, but I'd ideally like this to be confirmed by someone familiar with the matter. So far I've found that AbstractPersistentCollection (the base class for all collection proxies) has a static helper method called GetOrphans to find orphans by comparing the current collection with a snapshot. The existence of this method suggests NHibernate tries to find all orphaned elements and then deletes them by key. Is this correct, in terms of the generated SQL?
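
    The snapshot comparison described above is easier to see in a few lines of code. This is only a conceptual sketch in Python, not NHibernate's actual implementation: the persistent collection keeps the set of child keys it was loaded with, and at flush time anything missing from the current collection is deleted by key.

        # Conceptual sketch only -- not NHibernate source. Orphans are the keys that
        # were in the snapshot (taken when the collection was loaded) but are gone now.
        def get_orphans(snapshot_keys, current_keys):
            return set(snapshot_keys) - set(current_keys)

        def flush_orphans(snapshot_keys, current_keys, execute_sql):
            # execute_sql is an injected callable standing in for the session/connection
            for key in get_orphans(snapshot_keys, current_keys):
                # one DELETE per orphan, by primary key (subject to batching)
                execute_sql("DELETE FROM child WHERE child_id = ?", (key,))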

    Read the article

  • Facebook publish HTTP Error 400 : bad request

    - by Abhishek
    Hey, I am trying to publish a score to Facebook through Python's urllib2 library:
        import urllib2, urllib

        url = "https://graph.facebook.com/USER_ID/scores"
        data = {}
        data['score'] = SCORE
        data['access_token'] = 'APP_ACCESS_TOKEN'
        data_encode = urllib.urlencode(data)
        request = urllib2.Request(url, data_encode)
        response = urllib2.urlopen(request)
        responseAsString = response.read()
    I am getting this error:
        response = urllib2.urlopen(request)
          File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 124, in urlopen
            return _opener.open(url, data, timeout)
          File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 389, in open
            response = meth(req, response)
          File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 502, in http_response
            'http', request, response, code, msg, hdrs)
          File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 427, in error
            return self._call_chain(*args)
          File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 361, in _call_chain
            result = func(*args)
          File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 510, in http_error_default
            raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
        urllib2.HTTPError: HTTP Error 400: Bad Request
    Not sure if this is related to Facebook's Open Graph or to improper urllib2 API use.
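
    One thing that helps with debugging here: urllib2 raises HTTPError for any non-2xx status, and the exception object still carries the response body, which for the Graph API is a JSON document explaining what was rejected. A small, hedged variation of the code above (SCORE, USER_ID and APP_ACCESS_TOKEN are dummy placeholders, as in the question):

        # Print Facebook's own error message instead of the bare traceback.
        import urllib, urllib2

        SCORE = 100   # dummy value standing in for the real score
        url = "https://graph.facebook.com/USER_ID/scores"
        data_encode = urllib.urlencode({"score": SCORE, "access_token": "APP_ACCESS_TOKEN"})
        try:
            response = urllib2.urlopen(urllib2.Request(url, data_encode))
            print response.read()
        except urllib2.HTTPError as e:
            print e.code      # 400
            print e.read()    # the Graph API's JSON error body, e.g. an OAuth/token complaint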

    Read the article

  • SVN Delete with wildcard?

    - by David Lively
    I'm migrating a VSS repository to SVN and inadvertently included all of the _vti_cnf, *.scc files in the first check-in. I'd like to remove these from SVN. (Not permanently, of course - just in the HEAD). The application in question is quite large, and finding and deleting these files on a folder-by-folder basis will take forever. Suggestions? There must be some obvious way to do this, but the proximity of the weekend is interfering with my higher brain functions.
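
    svn has no recursive wildcard delete of its own, but the repetitive part scripts easily. A hedged sketch (assumes a local working copy at WC_PATH and that only _vti_cnf directories and *.scc files should go):

        # Sketch: walk the working copy, schedule the VSS leftovers for deletion,
        # then commit once at the end. Requires the svn command-line client.
        import os
        import subprocess

        WC_PATH = "/path/to/working/copy"   # assumption: a local checkout of the repository

        targets = []
        for root, dirs, files in os.walk(WC_PATH):
            if ".svn" in dirs:
                dirs.remove(".svn")                       # never descend into svn metadata
            for d in list(dirs):
                if d == "_vti_cnf":
                    targets.append(os.path.join(root, d))
                    dirs.remove(d)                        # deleting the dir covers its contents
            targets += [os.path.join(root, f) for f in files if f.endswith(".scc")]

        for path in targets:
            subprocess.call(["svn", "rm", path])          # schedules the delete in the working copy

        subprocess.call(["svn", "commit", "-m", "Remove VSS artifacts (_vti_cnf, *.scc)", WC_PATH])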

    Read the article

  • Difficulty screen scraping http://www.momondo.com using nokogiri

    - by Khai Kiong
    I have some difficulty extracting the total price (CSS selector = '.total') from the flight results at http://www.momondo.com/multicity/?Search=true&TripType=oneway&SegNo=1&SO0=KUL&SD0=KBR&SDP0=31-12-2012&AD=2&CA=0,0&DO=false&NA=false#Search=true&TripType=oneway&SegNo=1&SO0=KUL&SD0=KBR&SDP0=31-12-2012&AD=2&CA=0,0&DO=false&NA=false
    I get the error "undefined method `text' for nil:NilClass" from Nokogiri. My code:
        desc "Fetch product prices"
        task :fetch_details => :environment do
          require 'nokogiri'
          require 'open-uri'
          include ERB::Util
          OneWayFlight.find_all_by_money(nil).each do |flight|
            url = "http://www.momondo.com/multicity/Search=true&TripType=oneway&SegNo=1&SO0=KUL&SD0=KBR&SDP0=31-12-2012&AD=2&CA=0,0&DO=false&NA=false#Search=true&TripType=oneway&SegNo=1&SO0=KUL&SD0=KBR&SDP0=31-12-2012&AD=2&CA=0,0&DO=false&NA=false"
            doc = Nokogiri::HTML(open(url))
            price = doc.at_css(".total").text[/[0-9\.]+/]
            flight.update_attribute(:price, price)
          end
        end

    Read the article

  • 404 not found in telnet, works fine in browser

    - by Viranch Mehta
    I am having a very irritating problem: when I open a URL ( http://celebs.widewallpapers.net/md/a/adriana-lima/1440/Adriana-Lima-1440x900-002.jpg ) in a browser, it works fine, but when I try to access it by telnet from bash, I get 404 Not Found!! My exact terminal session:
        $ telnet celebs.widewallpapers.net 80
        HEAD /md/a/adriana-lima/1440/Adriana-Lima-1440x900-002.jpg HTTP/1.0
        [enter]
        [enter]
        HTTP/1.1 404 Not Found
        Server: nginx
        Date: Sun, 23 May 2010 21:36:05 GMT
        Content-Type: text/html; charset=windows-1251
        Content-Length: 166
        Connection: close
    Please help me with this, as I'm trying to make a C batch-downloader which works almost the same way as the telnet session.
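
    A likely difference between the two cases is the headers: the browser sends Host, Referer and a User-Agent, while the hand-typed HTTP/1.0 request sends none, and image hosts frequently answer header-less requests with 404 as hotlink protection. Which header matters here is a guess; a quick way to test it programmatically (Python 2 sketch):

        # Re-issue the request that 404'd in telnet, adding browser-like headers.
        # Toggle Referer / User-Agent individually to see which one the site checks.
        import httplib

        conn = httplib.HTTPConnection("celebs.widewallpapers.net", 80)
        conn.request("HEAD", "/md/a/adriana-lima/1440/Adriana-Lima-1440x900-002.jpg",
                     headers={"Referer": "http://celebs.widewallpapers.net/",
                              "User-Agent": "Mozilla/5.0"})
        resp = conn.getresponse()
        print resp.status, resp.reason   # httplib speaks HTTP/1.1 and adds Host automatically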

    Read the article

  • delete all but minimal values, based on two columns in SQL Server table

    - by sqlill
    How do I write a statement to accomplish the following? Let's say a table has two columns (both nvarchar) with the following data:
        col1: 10000 10000 10001 10002 10002 10002
        col2: 10    20    10    30    40    50
    I'd like to keep only the following data:
        col1: 10000 10001 10002
        col2: 10    10    30
    thus removing the rows that duplicate a value in the first column (neither of the columns is a primary key), keeping for each col1 value only the record with the minimal value in the second column. How do I accomplish this?
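
    To make the intended result explicit, here is the grouping logic only, sketched in Python (it is not the T-SQL statement being asked for): group by col1, keep the minimal col2, discard the rest.

        # The dedup-keep-min logic, for illustration only.
        rows = [("10000", 10), ("10000", 20), ("10001", 10),
                ("10002", 30), ("10002", 40), ("10002", 50)]

        keep = {}
        for col1, col2 in rows:
            if col1 not in keep or col2 < keep[col1]:
                keep[col1] = col2          # minimal col2 per col1

        print sorted(keep.items())         # [('10000', 10), ('10001', 10), ('10002', 30)]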

    Read the article

  • how to get the http header in asynchronous request mode using ASIHHTTP

    - by user262325
    Hello everyone, I hope to display the download file size by reading the HTTP headers. I know there is a way to do this:
        ASIHTTPRequest *request = [ASIHTTPRequest requestWithURL:url];
        [request startSynchronous];
        NSString *poweredBy = [[request responseHeaders] objectForKey:@"X-Powered-By"];
        NSString *contentType = [[request responseHeaders] objectForKey:@"Content-Type"];
    but this is synchronous mode. In asynchronous mode it can be done as below:
        - (void)requestFinished:(ASIHTTPRequest *)request {
            unsigned long long contentLength = [request contentLength];
        }
    but requestFinished is only called at the end of the download. Is there an event that exposes the HTTP header info at the beginning of the download? Thanks, interdev

    Read the article

  • WPF/MVVM: Delegating a domain Model collection to a ViewModel

    - by msfanboy
    A domain model collection (normally a List or IEnumerable) is delegated to a ViewModel. That means my CustomerViewModel has an order collection of type List or IEnumerable. No change in the list is recognized by the bound control, but with ObservableCollection it is. This is a problem in the MVVM design pattern. How do you cope with it?

    Read the article

  • Logging raw HTTP request/response in ASP.NET MVC & IIS7

    - by Greg Beech
    I'm writing a web service (using ASP.NET MVC) and for support purposes we'd like to be able to log the requests and response in as close as possible to the raw, on-the-wire format (i.e including HTTP method, path, all headers, and the body) into a database. What I'm not sure of is how to get hold of this data in the least 'mangled' way. I can re-constitute what I believe the request looks like by inspecting all the properties of the HttpRequest object and building a string from them (and similarly for the response) but I'd really like to get hold of the actual request/response data that's sent on the wire. I'm happy to use any interception mechanism such as filters, modules, etc. and the solution can be specific to IIS7. However, I'd prefer to keep it in managed code only. Any recommendations? Edit: I note that HttpRequest has a SaveAs method which can save the request to disk but this reconstructs the request from the internal state using a load of internal helper methods that cannot be accessed publicly (quite why this doesn't allow saving to a user-provided stream I don't know). So it's starting to look like I'll have to do my best to reconstruct the request/response text from the objects... groan. Edit 2: Please note that I said the whole request including method, path, headers etc. The current responses only look at the body streams which does not include this information. Edit 3: Does nobody read questions around here? Five answers so far and yet not one even hints at a way to get the whole raw on-the-wire request. Yes, I know I can capture the output streams and the headers and the URL and all that stuff from the request object. I already said that in the question, see: I can re-constitute what I believe the request looks like by inspecting all the properties of the HttpRequest object and building a string from them (and similarly for the response) but I'd really like to get hold of the actual request/response data that's sent on the wire. If you know the complete raw data (including headers, url, http method, etc.) simply cannot be retrieved then that would be useful to know. Similarly if you know how to get it all in the raw format (yes, I still mean including headers, url, http method, etc.) without having to reconstruct it, which is what I asked, then that would be very useful. But telling me that I can reconstruct it from the HttpRequest/HttpResponse objects is not useful. I know that. I already said it. Please note: Before anybody starts saying this is a bad idea, or will limit scalability, etc., we'll also be implementing throttling, sequential delivery, and anti-replay mechanisms in a distributed environment, so database logging is required anyway. I'm not looking for a discussion of whether this is a good idea, I'm looking for how it can be done.

    Read the article

  • How to delete a user in MSCRM 4.0

    - by Chris Richner
    Do you know of a way to get rid of CRM user accounts in MSCRM 4.0? After some user accounts were deleted in AD, we're facing a lot of issues while importing the organisation to another server, with errors stating that there are problems with the user mapping. What is the idea behind the fact that no user accounts can be deleted from the CRM installation? Is there any tool or undocumented web service API call to get rid of CRM users? Thanks for sharing your insights.

    Read the article

  • Remove HTTP headers from a raw response

    - by Ed
    Let's say we make a request to a URL and get back the raw response, like this:
        HTTP/1.1 200 OK
        Date: Wed, 28 Apr 2010 14:39:13 GMT
        Expires: -1
        Cache-Control: private, max-age=0
        Content-Type: text/html; charset=ISO-8859-1
        Set-Cookie: PREF=ID=e2bca72563dfffcc:TM=1272465553:LM=1272465553:S=ZN2zv8oxlFPT1BJG; expires=Fri, 27-Apr-2012 14:39:13 GMT; path=/; domain=.google.co.uk
        Server: gws
        X-XSS-Protection: 1; mode=block
        Connection: close

        <!doctype html><html><head>...</head><body>...</body></html>
    What would be the best way to remove the HTTP headers from the response in C#? With regexes? Parsing it into some kind of HTTPResponse object and using only the body? EDIT: I'm using SOCKS to make the request, that's why I get the raw response.
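
    Whatever language does the work, the raw response always has the same shape: status line and headers, then one blank line (CRLF CRLF), then the body, so splitting once at the first blank line is usually all that is needed (chunked transfer encoding aside). The idea, sketched in Python rather than C#:

        # Split a raw HTTP response into status line, headers and body at the
        # first blank line. The sample response here is a trimmed stand-in.
        raw = ("HTTP/1.1 200 OK\r\n"
               "Content-Type: text/html; charset=ISO-8859-1\r\n"
               "Connection: close\r\n"
               "\r\n"
               "<!doctype html><html><head></head><body>hello</body></html>")

        head, _, body = raw.partition("\r\n\r\n")
        status_line = head.split("\r\n")[0]
        headers = dict(line.split(": ", 1) for line in head.split("\r\n")[1:])

        print status_line               # HTTP/1.1 200 OK
        print headers["Content-Type"]   # text/html; charset=ISO-8859-1
        print body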

    Read the article

  • jQuery code to delete all spans with same content but keep one

    - by Axel
    Hello. Say I have the following HTML:
        <span class="fruit">Apple</span>
        <span class="fruit">banana</span>
        <span class="fruit">Apple</span>
        <span class="fruit">Apple</span>
        <span class="fruit">orange</span>
    I tried different methods but they didn't work. I want jQuery code to remove all (.fruit) spans with the same content but keep one (the first, if possible), so I will end up with the following:
        <span class="fruit">Apple</span>
        <span class="fruit">banana</span>
        <span class="fruit">orange</span>
    Thank you

    Read the article

  • Sening a file from memory (rather than disk) over HTTP using libcurl

    - by cinek1lol
    Hi! I would like to send pictures via a program written in C++. This works:
        WinExec("C:\\curl\\curl.exe -H Expect: -F \"fileupload=@C:\\curl\\ok.jpg\" -F \"xml=yes\" -# \"http://www.imageshack.us/index.php\" -o data.txt -A \"Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.1) Gecko/20061204 Firefox/2.0.0.1\" -e \"http://www.imageshack.us\"", NULL);
    It works, but I would like to send the picture from a char variable already loaded in memory (you know what I mean? first I load the picture into a variable and then send the variable), because right now I have to specify the path of the picture on disk. I want to write this program in C++ using the curl library, not by calling the .exe. I have also found the following program (which I have modified a bit):
        #include <stdio.h>
        #include <string.h>
        #include <iostream>
        #include <curl/curl.h>
        #include <curl/types.h>
        #include <curl/easy.h>

        int main(int argc, char *argv[])
        {
            CURL *curl;
            CURLcode res;
            struct curl_httppost *formpost = NULL;
            struct curl_httppost *lastptr = NULL;
            struct curl_slist *headerlist = NULL;
            static const char buf[] = "Expect:";

            curl_global_init(CURL_GLOBAL_ALL);

            /* Fill in the file upload field */
            curl_formadd(&formpost, &lastptr,
                         CURLFORM_COPYNAME, "send",
                         CURLFORM_FILE, "nowy.jpg",
                         CURLFORM_END);
            curl_formadd(&formpost, &lastptr,
                         CURLFORM_COPYNAME, "nowy.jpg",
                         CURLFORM_COPYCONTENTS, "nowy.jpg",
                         CURLFORM_END);
            curl_formadd(&formpost, &lastptr,
                         CURLFORM_COPYNAME, "submit",
                         CURLFORM_COPYCONTENTS, "send",
                         CURLFORM_END);

            curl = curl_easy_init();
            headerlist = curl_slist_append(headerlist, buf);
            if (curl) {
                curl_easy_setopt(curl, CURLOPT_URL, "http://www.imageshack.us/index.php");
                if ((argc == 2) && (!strcmp(argv[1], "xml=yes")))
                    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headerlist);
                curl_easy_setopt(curl, CURLOPT_HTTPPOST, formpost);
                res = curl_easy_perform(curl);
                curl_easy_cleanup(curl);
                curl_formfree(formpost);
                curl_slist_free_all(headerlist);
            }
            system("pause");
            return 0;
        }
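
    For the "upload from memory" part specifically, libcurl's form API has CURLFORM_BUFFER / CURLFORM_BUFFERPTR, which attach an in-memory buffer under a chosen file name instead of reading from a path. As a hedged illustration of that option, shown through pycurl rather than the C++ code above (the field name "fileupload" and the target URL are taken from the question):

        # Post an image held in memory, not a file path, using the BUFFER/BUFFERPTR
        # form options (the pycurl spelling of CURLFORM_BUFFER / CURLFORM_BUFFERPTR).
        import pycurl

        image_bytes = open("C:\\curl\\ok.jpg", "rb").read()   # or bytes obtained from anywhere

        c = pycurl.Curl()
        c.setopt(pycurl.URL, "http://www.imageshack.us/index.php")
        c.setopt(pycurl.HTTPPOST, [
            ("fileupload", (pycurl.FORM_BUFFER, "ok.jpg",          # file name to report
                            pycurl.FORM_BUFFERPTR, image_bytes)),  # the data itself
            ("xml", "yes"),
        ])
        c.setopt(pycurl.HTTPHEADER, ["Expect:"])
        c.perform()
        c.close()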

    Read the article

  • net/http.rb:560:in `initialize': getaddrinfo: Name or service not known (SocketError)

    - by Sid
        @@timestamp = nil

        def generate_oauth_url
          @@timestamp = timestamp
          url = CONNECT_URL + REQUEST_TOKEN_PATH + "&oauth_callback=#{OAUTH_CALLBACK}&oauth_consumer_key=#{OAUTH_CONSUMER_KEY}&oauth_nonce=#{NONCE}&oauth_signature_method=#{OAUTH_SIGNATURE_METHOD}&oauth_timestamp=#{@@timestamp}&oauth_version=#{OAUTH_VERSION}"
          puts url
          url
        end

        def sign(url)
          Base64.encode64(HMAC::SHA1.digest((NONCE + url), OAUTH_CONSUMER_SECRET)).strip
        end

        def get_request_token
          url = generate_oauth_url
          signed_url = sign(url)
          request = Net::HTTP.new((CONNECT_URL + REQUEST_TOKEN_PATH), 80)
          puts request.inspect
          headers = {
            "Authorization" => "Authorization: OAuth oauth_nonce = #{NONCE}, oauth_callback = #{OAUTH_CALLBACK}, oauth_signature_method = #{OAUTH_SIGNATURE_METHOD}, oauth_timestamp=#{@@timestamp}, oauth_consumer_key = #{OAUTH_CONSUMER_KEY}, oauth_signature = #{signed_url}, oauth_version = #{OAUTH_VERSION}"
          }
          request.post(url, nil, headers)
        end

        def timestamp
          Time.now.to_i
        end
    I am trying to do what OAuth does in an attempt to understand how to use the Authorization headers. I am also getting the following error while trying to connect to the LinkedIn API:
        /usr/lib/ruby/1.8/net/http.rb:560:in `initialize': getaddrinfo: Name or service not known (SocketError)
    I would really appreciate it if someone could nudge me in the right direction.

    Read the article

  • Deleting object in function

    - by wrongusername
    Let's say I have created two objects from class foo and now want to combine the two. How, if at all possible, can I accomplish that within a function like this:
        def combine(first, second):
            first.value += second.value
            del second    # this doesn't work, though first.value *does* get changed
    instead of doing something like
        def combine(first, second):
            first.value += second.value
    in the function and putting del second immediately after the function call?
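
    For what it's worth, the reason the first version has no effect on the caller: del removes a name binding, not the object itself, and second is just a local name inside combine. A small demonstration (Foo is a stand-in for the question's class foo):

        # del unbinds a name; the object survives as long as someone still references it.
        class Foo(object):
            def __init__(self, value):
                self.value = value

        def combine(first, second):
            first.value += second.value
            del second            # only removes the local name 'second'

        a, b = Foo(1), Foo(2)
        combine(a, b)
        print a.value             # 3
        print b.value             # 2 -- b is untouched; it has to be deleted by the caller
        del b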

    Read the article

  • Python: deleting rows in a text file

    - by Jenny
    A sample of the text file I have:
        1    -4.6    -4.6    -7.6
        2    -1.7    -3.8    -3.1
        3    -1.6    -1.6    -3.1
    The data is separated by tabs in the text file, and the first column indicates the position. I need to iterate through every value in the text file apart from column 0 and find the lowest value. Once the lowest value has been found, that value needs to be written to a new text file along with its column name and position. Column 0 has the name "position", column 1 "fifteen", column 2 "sixteen" and column 3 "seventeen". For example, the lowest value in the above data is -7.6, which is in column 3, which has the name "seventeen"; therefore -7.6, "seventeen" and its position value, which in this case is 1, need to be written to the new text file. I then need a number of rows deleted from the above text file, e.g. the lowest value above is -7.6, found at position 1 in column 3, which has the name "seventeen"; I therefore need seventeen rows deleted from the text file, starting from and including position 1. So the column in which the lowest value is found denotes the number of rows that need to be deleted, and the position it is found at is the start point of the deletion.
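
    The description above is already an algorithm, so here is one hedged reading of it in Python (the file names, the tab separator, and the fifteen/sixteen/seventeen-to-15/16/17 mapping are assumptions taken from the description):

        # Sketch of the described algorithm; "input.txt" and "lowest.txt" are assumed names.
        col_names = {1: ("fifteen", 15), 2: ("sixteen", 16), 3: ("seventeen", 17)}

        with open("input.txt") as f:
            rows = [line.rstrip("\n").split("\t") for line in f if line.strip()]

        # find the lowest value across columns 1-3, remembering its position and column
        best = None                                   # (value, position, column_index)
        for row in rows:
            for col in (1, 2, 3):
                value = float(row[col])
                if best is None or value < best[0]:
                    best = (value, row[0], col)

        value, position, col = best
        name, count = col_names[col]
        with open("lowest.txt", "w") as out:
            out.write("%s\t%s\t%s\n" % (value, name, position))

        # delete 'count' rows starting from (and including) the row at 'position'
        start = next(i for i, row in enumerate(rows) if row[0] == position)
        del rows[start:start + count]
        with open("input.txt", "w") as f:
            for row in rows:
                f.write("\t".join(row) + "\n")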

    Read the article

  • Is my form password being passed in clear text?

    - by liinkas
    This is what my browser sent when logging into some site:
        POST http://www.some.site/login.php HTTP/1.0
        User-Agent: Opera/8.26 (X2000; Linux i686; Z; en)
        Host: www.some.site
        Accept: text/html, application/xml;q=0.9, application/xhtml+xml, image/png, image/jpeg, image/gif, image/x-xbitmap, */*;q=0.1
        Accept-Language: en-US,en;q=0.9
        Accept-Charset: iso-8859-1, utf-8, utf-16, *;q=0.1
        Accept-Encoding: deflate, gzip, x-gzip, identity, *;q=0
        Referer: http://www.some.site/
        Proxy-Connection: close
        Content-Length: 123
        Content-Type: application/x-www-form-urlencoded

        lots_of_stuff=here&e2ad811=my_login_name&e327696=my_password&lots_of_stuff=here
    Can I state that anyone can sniff my login name and password for that site? Maybe just on my LAN? If so (even if only on the LAN), then I'm shocked. I thought using <input type="password"> did something more than make all the characters look like '*'. P.S. If it matters, I played with netcat (on Linux) and made the connection browser <= netcat (logged here) <= proxy <= remote_site.

    Read the article
