Search Results

Search found 50115 results on 2005 pages for 'http referer'.


  • RESTful issue with data access when using HTTP DELETE method ...

    - by Wilhelm Murdoch
    I'm having an issue accessing raw request information from PHP when a script is accessed using the HTTP DELETE method. I'm using a JS front end which accesses the script via Ajax. This script is actually part of a RESTful API which I am developing. The endpoint in this example is: http://api.site.com/session

    This endpoint is used to generate an authentication token which can be used for subsequent API requests. Using the GET method on this URL along with a modified version of HTTP Basic Authentication will provide an access token for the client. This token must then be included in all other interactions with the service until it expires. Once a token is generated, it is passed back to the client in a format specified by an 'Accept' header which the client sends the service; in this case 'application/json'. Upon success it responds with an HTTP 200 OK status code. Upon failure, it throws an exception using the HTTP 401 Authorization Required code.

    Now, when you want to delete a session, or 'log out', you hit the same URL, but with the HTTP DELETE method. To verify access to this endpoint, the client must prove they were previously authenticated by providing the token they want to terminate. If they are 'logged in', the token and session are terminated and the service should respond with the HTTP 204 No Content status code; otherwise, they are greeted with the 401 exception again.

    Now, the problem I'm having is with removing sessions. With the DELETE method, using Ajax, I can't seem to access any parameters I've set once the request hits the service. In this case, I'm looking for the parameter entitled 'token'. Looking at the raw request headers in Firebug, I notice the 'Content-Length' header changes with the size of the token being sent, which tells me this data is indeed being sent to the server. The question is: using PHP, how the hell do I access that parameter information? It's not a POST or GET request, so I can't access it as you normally would in PHP. The parameters are within the content portion of the request. I've tried looking in $_SERVER, but that shows me a limited set of headers. I tried apache_request_headers(), which gives me more detailed information, but still only headers. I even tried file_get_contents('php://stdin'); and I get nothing. How can I access the content portion of a raw HTTP request? Sorry for the lengthy post, but I figured too much information is better than too little. :)
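
    One avenue worth noting here (not something the post itself confirms): PHP exposes the raw request entity through the php://input stream rather than php://stdin, so a sketch along these lines can recover form-encoded parameters from a DELETE body. The parameter name is taken from the question.

        <?php
        // Sketch: read the raw body of the DELETE request and parse the
        // form-encoded parameters it carries (php://input, not php://stdin,
        // holds the request entity).
        $raw = file_get_contents('php://input');
        parse_str($raw, $params);
        $token = isset($params['token']) ? $params['token'] : null;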


  • wget hangs in http request sent awaiting response in some sites

    - by gkr
    Using Ubuntu 12.04. wget hangs at "HTTP request sent, awaiting response..." on some sites, and browsers cannot open the sites that fail in wget either. On WinXP everything works. This works:

        gkr@gkr-desktop:~/Documents/curl$ wget google.com
        --2012-06-12 21:29:37-- http://google.com/
        Resolving google.com (google.com)... 74.125.236.174, 74.125.236.160, 74.125.236.161, ...
        Connecting to google.com (google.com)|74.125.236.174|:80... connected.
        HTTP request sent, awaiting response... 301 Moved Permanently
        Location: http://www.google.com/ [following]
        --2012-06-12 21:29:38-- http://www.google.com/
        Resolving www.google.com (www.google.com)... 74.125.236.179, 74.125.236.180, 74.125.236.176, ...
        Connecting to www.google.com (www.google.com)|74.125.236.179|:80... connected.
        HTTP request sent, awaiting response... 302 Found
        Location: http://www.google.co.in/ [following]
        --2012-06-12 21:29:38-- http://www.google.co.in/
        Resolving www.google.co.in (www.google.co.in)... 74.125.236.184, 74.125.236.191, 74.125.236.183, ...
        Connecting to www.google.co.in (www.google.co.in)|74.125.236.184|:80... connected.
        HTTP request sent, awaiting response... 200 OK
        Length: unspecified [text/html]
        Saving to: `index.html.3'
        [ ] 13,383 --.-K/s in 0.04s
        2012-06-12 21:29:39 (308 KB/s) - `index.html.3' saved [13383]

    This site just stops/hangs at awaiting response:

        gkr@gkr-desktop:~/Documents/curl$ wget grooveshark.com
        --2012-06-12 21:27:29-- http://grooveshark.com/
        Resolving grooveshark.com (grooveshark.com)... 8.20.213.76
        Connecting to grooveshark.com (grooveshark.com)|8.20.213.76|:80... connected.
        HTTP request sent, awaiting response... ^C

    Thanks
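
    Not from the original post, but a diagnostic sketch that may narrow it down (that the hang is header-related is only an assumption): compare wget's debug output for a working and a failing site, and try a browser-like User-Agent.

        # Diagnostic sketch -- flags are standard wget/curl options; the
        # User-Agent string is only an example.
        wget -d --header="User-Agent: Mozilla/5.0" http://grooveshark.com/
        curl -v -A "Mozilla/5.0" -o /dev/null http://grooveshark.com/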


  • Windows 2008 IIS 7.0 HTTP to HTTPS Redirect -- Versus IIS 6.0 Mechanism

    - by Dan7el
    Creating a mechanism for redirection from HTTP to HTTPS on a Windows 2008 server running IIS 7.0 is a much written-about topic on the Internet. How this is done is really not my issue. My issue is explaining why this can't be done with the standard HTTP Redirect module that ships with Windows 2008 IIS 7.0, and why other, more arduous methods are needed instead.

    First, the IIS 6.0 method requires no externally available modules, no modifications to web.config, and no other development effort. It's outlined here: http://blogs.microsoft.co.il/blogs/dorr/archive/2009/01/13/how-to-force-redirection-from-http-to-https-on-iis-6-0.aspx. The basic steps are to run the snap-in, open the properties on the site, and make a few modifications. Presto, you have the HTTP to HTTPS redirect set up.

    On the IIS 7.0 platform it doesn't seem this simple. An initial search turned up http://www.sslshopper.com/iis7-redirect-http-to-https.html, which describes two separate approaches:

    1. Installing a separately available Microsoft module, the URL Rewrite Module, and then adding XML to the web.config.
    2. A custom error page.

    There might be other methods, but these are the basic ones, and the first is listed as the primary method. But wait: IIS 7.0 does ship with an HTTP Redirect module. So why can't I use the HTTP Redirect module to do this very thing? That is really my big question. I need to know because my management is going to insist I use the HTTP Redirect module and set up the HTTP to HTTPS redirect the same way we do in IIS 6.0. Can someone please explain, in clean, simple, easy-to-understand terms that both I and my management can follow, why I need to install the URL Rewrite Module on the server and make the web.config changes suggested by the article instead of simply using the HTTP Redirect module that's already installed? Thanks a bunch.
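
    For reference, a sketch of the kind of rule the sslshopper article's first approach has in mind (this assumes the URL Rewrite Module is installed; the rule name is only illustrative):

        <rewrite>
          <rules>
            <!-- Redirect any request that arrives over plain HTTP to HTTPS. -->
            <rule name="HTTP to HTTPS redirect" stopProcessing="true">
              <match url="(.*)" />
              <conditions>
                <add input="{HTTPS}" pattern="off" ignoreCase="true" />
              </conditions>
              <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
            </rule>
          </rules>
        </rewrite>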


  • Is there a simple LDAP-to-HTTP gateway out there?

    - by larsks
    We have a local LDAP directory that provides basic contact information about our user community. We would like to integrate this into some third-party hosted services that allow us to implement widgets that run arbitrary Javascript. In order to connect Javascript to our LDAP directory, I would like to set up a simple LDAP-to-HTTP proxy that accepts HTTP GET requests, translates them into an appropriate LDAP query, and responds with directory information as JSON-encoded data. In an ideal world, something like this:

        GET /[email protected]

    would get me something like this:

        {
          "cn": "Bob Person",
          "title": "System Administrator",
          "sn": "Person",
          "mail": "[email protected]",
          "telephoneNumber": "617-555-1212",
          "givenName": "Bob"
        }

    (This obviously assumes that the web application has locally configured information about what base DN to use, how to authenticate, etc.) I guess I could write one... but surely something like this already exists?

    UPDATE: The consensus seems to be that there isn't a pre-existing solution out there and that I should just get off my lazy derriere and write one. So I did, and it's here. It's not especially pretty, but it works for my prototyping and I figure maybe someone else will find it useful someday.
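
    Not from the question, just a sketch of how small such a gateway can be, here using PHP's built-in LDAP extension (the host, base DN, and attribute list are illustrative and an anonymous bind is assumed):

        <?php
        // Minimal LDAP-to-HTTP sketch: look up one user by mail, return JSON.
        $conn = ldap_connect('ldap.example.com');
        ldap_set_option($conn, LDAP_OPT_PROTOCOL_VERSION, 3);
        ldap_bind($conn);  // anonymous bind assumed

        $filter = '(mail=' . ldap_escape($_GET['mail'], '', LDAP_ESCAPE_FILTER) . ')';
        $result = ldap_search($conn, 'ou=people,dc=example,dc=com', $filter,
                              array('cn', 'sn', 'givenName', 'title', 'mail', 'telephoneNumber'));
        $entries = ldap_get_entries($conn, $result);

        header('Content-Type: application/json');
        if ($entries['count'] === 0) {
            http_response_code(404);
            echo json_encode(array());
        } else {
            // ldap_get_entries() lower-cases attribute names.
            echo json_encode(array(
                'cn'              => $entries[0]['cn'][0],
                'sn'              => $entries[0]['sn'][0],
                'givenName'       => $entries[0]['givenname'][0],
                'title'           => $entries[0]['title'][0],
                'mail'            => $entries[0]['mail'][0],
                'telephoneNumber' => $entries[0]['telephonenumber'][0],
            ));
        }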


  • Simple Linux program that takes any HTTP/HTTPS request and returns a single page?

    - by ultrasawblade
    I have a Linux box operating as a router. There's a NIC that's connected to the internet (WAN), a NIC connected to an 8-port GbE switch (LAN), and a NIC connected to a Linksys wireless N-router (WLAN). Routing between everything is working perfectly.

    I have security completely disabled on the wireless router, but the WLAN NIC is firewalled such that it will only accept DNS queries and PPTP VPN connections. Currently HTTP/HTTPS traffic and everything else is blocked.

    I would like to run something that listens on ports 80/443 of the WLAN NIC and, for non-VPN'ed connections, returns a single webpage saying "Unauthenticated" for any HTTP/HTTPS request, explaining how to sign into the VPN. A transparent proxy seems to be what I need, but my searches all seem to direct me to Squid, which is already running on my server and seems overkill for this simple task.

    Is there a simpler, lightweight program out there that does just this, or should I just suck it up and run two instances of Squid (or figure out how to configure it)? Or is this entire VPN thing I'm doing complete nonsense and I should just enable encryption on the wireless router?
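
    One possibility, sketched below on the assumption that an extra nginx server block counts as "lightweight" (the certificate paths and page text are illustrative): answer every request on the WLAN-facing address with a single plain-text page.

        server {
            listen 80 default_server;
            listen 443 ssl default_server;
            ssl_certificate     /etc/ssl/certs/portal.crt;    # illustrative paths
            ssl_certificate_key /etc/ssl/private/portal.key;

            location / {
                default_type text/plain;
                return 200 "Unauthenticated - connect to the PPTP VPN first.\n";
            }
        }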


  • What is the recommended method of HTTP Redirection from multiple URLs to one URL?

    - by ChrisHDog
    I have a website that has a number of URLs that people use to connect to that site (uses the bindings on the IIS website and everything works as intended):

        http://www.sample.com
        http://sample.com
        https://www.sample.com
        http://xyz.sample.com
        http://oldurl.com

    Now what I want to do is have all of the URLs go to https://www.sample.com - so if you type in "http://xyz.sample.com" or "sample.com" you should go to https://www.sample.com. The question is: what is the best mechanism to do this? I have one possible solution (which I will put as an answer to this question), but I get the feeling that there might be another, better solution available.
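
    Not part of the question, but for comparison with whatever answer gets posted: a sketch of a single URL Rewrite rule that folds every non-canonical host or scheme into https://www.sample.com (assumes the URL Rewrite Module; the sample.com names are the question's own placeholders).

        <rule name="Canonical host and scheme" stopProcessing="true">
          <match url="(.*)" />
          <conditions logicalGrouping="MatchAny">
            <add input="{HTTP_HOST}" pattern="^www\.sample\.com$" negate="true" />
            <add input="{HTTPS}" pattern="off" />
          </conditions>
          <action type="Redirect" url="https://www.sample.com/{R:1}" redirectType="Permanent" />
        </rule>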


  • Errors when connecting to HTTPS using Net::HTTP routines (Ruby on Rails)

    - by jaycode
    Hi all, the code below explains the problem in detail:

        # this returns error Net::HTTPBadResponse
        url = URI.parse('https://sitename.com')
        response = Net::HTTP.start(url.host, url.port) {|http| http.get('/remote/register_device') }

        # this works
        url = URI.parse('http://sitename.com')
        response = Net::HTTP.start(url.host, url.port) {|http| http.get('/remote/register_device') }

        # this returns error Net::HTTPBadResponse
        response = Net::HTTP.post_form(URI.parse('https://sitename.com/remote/register_device'), {:text => 'hello world'})

        # this returns error Errno::ECONNRESET (Connection reset by peer)
        response = Net::HTTP.post_form(URI.parse('https://sandbox.itunes.apple.com/verifyReceipt'), {:text => 'hello world'})

        # this works
        response = Net::HTTP.post_form(URI.parse('http://sitename.com/remote/register_device'), {:text => 'hello world'})

    So... how do I send POST parameters to https://sitename.com or https://sandbox.itunes.apple.com/verifyReceipt in this example? Further information: I am trying to get this working in Rails: http://developer.apple.com/iphone/library/documentation/NetworkingInternet/Conceptual/StoreKitGuide/VerifyingStoreReceipts/VerifyingStoreReceipts.html#//apple_ref/doc/uid/TP40008267-CH104-SW1
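
    A sketch of the usual fix (not taken from the post): Net::HTTP.start and post_form as used above never enable SSL, so they speak plain HTTP to port 443. Building the connection by hand and switching use_ssl on avoids that; the host and path are the question's own placeholders.

        require 'net/https'
        require 'uri'

        url = URI.parse('https://sitename.com/remote/register_device')
        http = Net::HTTP.new(url.host, url.port)
        http.use_ssl = true                           # the piece post_form never sets
        http.verify_mode = OpenSSL::SSL::VERIFY_PEER  # relax only if you must

        request = Net::HTTP::Post.new(url.path)
        request.set_form_data('text' => 'hello world')
        response = http.request(request)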


  • Using HTTP status codes to reflect success/failure of Web service request?

    - by jgarbers
    I'm implementing a Web service that returns a JSON-encoded payload. If the service call fails -- say, due to invalid parameters -- a JSON-encoded error is returned. I'm unsure, however, what HTTP status code should be returned in that situation. On one hand, it seems like HTTP status codes are for HTTP: even though an application error is being returned, the HTTP transfer itself was successful, suggesting a 200 OK response. On the other hand, a RESTful approach would seem to suggest that if the caller is attempting to post to a resource, and the JSON parameters of the request are invalid somehow, that a 400 Bad Request is appropriate. I'm using Prototype on the client side, which has a nice mechanism for automatically dispatching to different callbacks based on HTTP status code (onSuccess and onFailure), so I'm tempted to use status codes to indicate service success or failure, but I'd be interested to hear if anyone has opinions or experience with common practice in this matter. Thanks!
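
    Not from the post, but a sketch of how the Prototype dispatch mentioned above plays out if the service does use 400 for application-level errors (the URL, field names, and error shape are illustrative):

        new Ajax.Request('/service/items', {
          method: 'post',
          parameters: { name: 'example' },
          // 2xx responses land here
          onSuccess: function(response) {
            var payload = response.responseText.evalJSON();
            // ... use payload ...
          },
          // 400 (and other non-2xx) responses land here
          onFailure: function(response) {
            var error = response.responseText.evalJSON();
            alert('Request failed (' + response.status + '): ' + error.message);
          }
        });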


  • Building a webserver, client doesn't acknowledge HTTP 200 OK frame.

    - by Evert
    Hi there, I'm building my own webserver based on a tutorial. I have found a simple way to initiate a TCP connection and send one segment of HTTP data (the webserver will run on a microcontroller, so it will be very small). Anyway, the following is the sequence I need to go through:

        1. receive SYN
        2. send SYN,ACK
        3. receive ACK (the connection is now established)
        4. receive ACK with HTTP GET command
        5. send ACK
        6. send FIN,ACK with HTTP data (e.g. 200 OK)
        7. receive FIN,ACK  <- I don't receive this packet!
        8. send ACK

    Everything works fine until I send my acknowledgement and the HTTP 200 OK message. The client won't send an acknowledgement for those two packets, and thus no webpage is displayed. I've added a pcap file of the sequence as I recorded it with Wireshark. Pcap file: http://cl.ly/5f5 (now it's the right data). All sequence and acknowledgement numbers are correct, the checksums are OK, and the flags are also right. I have no idea what is going wrong.


  • Custom http service responds fine to local IP address but NOT to localhost or 127.0.0.1

    - by Scrappydog
    I'm trying to connect to a custom HTTP service written by another developer. The service responds fine on a local IP address and port number, such as http://10.1.1.1:1234, but it does NOT respond to http://localhost:1234 or http://127.0.0.1:1234. The service is a simple single-function application written in VC++ that takes an HTTP POST string and returns another string. I'm trying to call it from C# using HttpWebRequest.GetResponse, but I can reproduce the same problem manually from a web browser... The test environment is Windows 2008 Server. Bottom line: I'm looking for some troubleshooting tips to help the other developer fix his code.
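
    A guess worth checking, not something the post establishes: netstat -an | findstr 1234 will show which local address the service listens on. If it shows 10.1.1.1:1234 rather than 0.0.0.0:1234, the VC++ code is binding to the NIC address only, and a change along these lines would also cover loopback:

        // Winsock sketch: bind the existing listening socket (listenSocket)
        // to all local addresses (INADDR_ANY) instead of one NIC address.
        sockaddr_in addr = {};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(1234);
        addr.sin_addr.s_addr = htonl(INADDR_ANY);   // instead of inet_addr("10.1.1.1")
        bind(listenSocket, (sockaddr*)&addr, sizeof(addr));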


  • Setting custom HTTP request headers in a URL object doesn't work.

    - by Blagovest Buyukliev
    I am trying to fetch an image from an IP camera using HTTP. The camera requires HTTP basic authentication, so I have to add the corresponding request header:

        URL url = new URL("http://myipcam/snapshot.jpg");
        URLConnection uc = url.openConnection();
        uc.setRequestProperty("Authorization",
            "Basic " + new String(Base64.encode("user:pass".getBytes())));

        // outputs "null"
        System.out.println(uc.getRequestProperty("Authorization"));

    I am later passing the url object to ImageIO.read(), and, as you can guess, I am getting an HTTP 401 Unauthorized, although user and pass are correct. What am I doing wrong? I've also tried new URL("http://user:pass@myipcam/snapshot.jpg"), but that doesn't work either.
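
    One thing the snippet above suggests (my reading, not confirmed in the post): ImageIO.read(URL) opens its own connection and never sees the header set on uc, and the HTTP connection implementation also tends to refuse to echo security-sensitive request headers back from getRequestProperty, which would explain the "null". A sketch that reads from the already-prepared connection instead, keeping the question's Base64 helper:

        import java.awt.image.BufferedImage;
        import java.net.URL;
        import java.net.URLConnection;
        import javax.imageio.ImageIO;

        URL url = new URL("http://myipcam/snapshot.jpg");
        URLConnection uc = url.openConnection();
        uc.setRequestProperty("Authorization",
            "Basic " + new String(Base64.encode("user:pass".getBytes())));

        // Read the image from this connection's stream, so the header is used.
        BufferedImage image = ImageIO.read(uc.getInputStream());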


  • Web.config WordPress rewrite rules next to Magento

    - by Flo
    I've installed Magento on IIS in folder E:\mydomain\wwwroot (I already have it all running correctly). I have no deeper magento folder; I placed all files directly in the wwwroot folder, so:

        wwwroot\app
        wwwroot\downloader
        wwwroot\errors
        wwwroot\includes
        etc...

    UPDATE: since I'm on IIS my .htaccess is ignored completely and my web.config rules are used instead. Here's my web.config in folder e:\mydomain\wwwroot:

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="Magento SEO: remove index.php from URL">
                  <match url="^(?!index.php)([^?#]*)(\\?([^#]*))?(#(.*))?" />
                  <conditions>
                    <add input="{URL}" pattern="^/(media|skin|js)/" ignoreCase="false" negate="true" />
                    <add input="{REQUEST_FILENAME}" matchType="IsFile" ignoreCase="false" negate="true" />
                    <add input="{REQUEST_FILENAME}" matchType="IsDirectory" ignoreCase="false" negate="true" />
                  </conditions>
                  <action type="Rewrite" url="index.php/{R:0}" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>

    Next, I wanted to install WordPress. I unzipped all files into folder e:\mydomain\wwwroot\wordpress, browsed to www.mydomain.com/wordpress/wp-admin/install.php, and configured everything for my database. Everything was installed correctly. I then navigate to http://www.mydomain.com/wordpress/wp-login.php where I type my credentials. I seem to be logged in and am redirected to http://www.mydomain.com/wordpress/wp-admin/, but there I receive an empty page. I enabled detailed error messages in IIS following this article: http://www.iis.net/learn/troubleshoot/diagnosing-http-errors/how-to-use-http-detailed-errors-in-iis. I also checked with Fiddler and see that I receive a 500 error:

        GET /wordpress/wp-admin/ HTTP/1.1
        Host: www.mydomain.com
        Connection: keep-alive
        Cache-Control: max-age=0
        Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
        User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.76 Safari/537.36
        Referer: http://www.mydomain.com/wordpress/wp-login.php
        Accept-Encoding: gzip,deflate,sdch
        Accept-Language: en-US,en;q=0.8,nl;q=0.6
        Cookie: wordpress_fabec4083cf12d8de89c98e8aef4b7e3=floran%7C1381236774%7C2d8edb4fc6618f290fadb49b035cad31; wordpress_test_cookie=WP+Cookie+check; wordpress_logged_in_fabec4083cf12d8de89c98e8aef4b7e3=floran%7C1381236774%7Cbf822163926b8b8df16d0f1fefb6e02e

        HTTP/1.1 500 Internal Server Error
        Content-Type: text/html
        Server: Microsoft-IIS/7.5
        X-Powered-By: PHP/5.4.14
        X-Powered-By: ASP.NET
        Date: Sun, 06 Oct 2013 12:56:03 GMT
        Content-Length: 0

    My WordPress web.config in folder e:\mydomain\wwwroot\wordpress contains:

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="wordpress" patternSyntax="Wildcard">
                  <match url="*"/>
                  <conditions>
                    <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true"/>
                    <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true"/>
                  </conditions>
                  <action type="Rewrite" url="index.php"/>
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>

    I also want my WordPress articles to be available on www.mydomain.com/blog instead of www.mydomain.com/wordpress. Of course my admin links for Magento and WordPress should also work. How can I configure my web.config files to achieve the above?
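
    Not from the post, but a quick way to see what is behind the blank 500 before touching the rewrite rules (these are standard WordPress debug switches; whether the 500 is PHP-side at all is my assumption): add them to wp-config.php and retry /wp-admin/.

        // wp-config.php debugging sketch
        define('WP_DEBUG', true);
        define('WP_DEBUG_LOG', true);      // errors also go to wp-content/debug.log
        define('WP_DEBUG_DISPLAY', true);  // and are shown in the response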


  • If I set the Expires HTTP header of a CSS file to 1 year and then modify that file, will the new version be ignored?

    - by user39511
    I'm using Rails with nginx/passenger. If I set the Expires HTTP header of a CSS file to 1 year and then modify that file, will it be ignored by the browser (i.e., it will not request the new version)? Given that Rails adds a different timestamp to each asset, such as foo.css?1270165626, every time I restart the server? This is the config I use right now (nginx/passenger):

        location ~* \.(ico|css|js|gif|jpe?g|png)(\?[0-9]+)?$ {
            expires max;
            break;
        }


  • Is there a security concern exposing NTLM authentication over http or should it only be https?

    - by Shane
    We are setting up a SharePoint 2010 site. Don't worry, this is not a Sharepoint question, just adding it for context. Most of the site will be anonymous, but some users are able to authenticate in and edit content. They use NTLM (users exist in AD). Is there any concern about exposing NTLM login for users that can modify content over the internet via http or should that only be exposed via https?


  • JAX-RS --- How to return JSON and HTTP Status code together?

    - by masato-san
    I'm writing a REST web app (NetBeans 6.9, JAX-RS, TopLink Essentials) and trying to return JSON and an HTTP status code together. I have code ready and working that returns JSON when the HTTP GET method is called from the client. Code snippet:

        @Path("get/id")
        @GET
        @Produces("application/json")
        public M_?? getMachineToUpdate(@PathParam("id") String id) {
            //some code to return JSON
            .
            .
            return myJson

    But I also want to return an HTTP status code (500, 200, 204, etc.) along with the JSON. I tried using the HttpServletResponse object, response.sendError("error message", 500); but this made the browser think it's a real 500, so the output web page was the regular HTTP 500 error page. What I want is just to return the status code so that my Javascript on the client side can handle some logic depending on what HTTP status code is returned (maybe just to display the error code and message on the HTML page). Is it possible to do so, or should the HTTP status code not be used for such a thing?
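
    A sketch of the standard JAX-RS way to do this (not taken from the post): have the resource method return javax.ws.rs.core.Response, which carries both the entity and the status code. The entity type and lookup call are illustrative.

        @Path("get/{id}")
        @GET
        @Produces("application/json")
        public Response getMachineToUpdate(@PathParam("id") String id) {
            MyJson myJson = findMachine(id);   // hypothetical lookup
            if (myJson == null) {
                // no body, just the status code
                return Response.status(Response.Status.NO_CONTENT).build();
            }
            // 200 plus the JSON entity
            return Response.ok(myJson, MediaType.APPLICATION_JSON).build();
        }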


  • Which HTTP redirect status code is best for this REST API scenario?

    - by Aseem Kishore
    I'm working on a REST API. The key objects ("nouns") are "items", and each item has a unique ID. E.g. to get info on the item with ID foo:

        GET http://api.example.com/v1/item/foo

    New items can be created, but the client doesn't get to pick the ID. Instead, the client sends some info that represents that item. So to create a new item:

        POST http://api.example.com/v1/item/
        hello=world&hokey=pokey

    With that command, the server checks if we already have an item for the info hello=world&hokey=pokey. So there are two cases here.

    Case 1: the item doesn't exist; it's created. This case is easy.

        201 Created
        Location: http://api.example.com/v1/item/bar

    Case 2: the item already exists. Here's where I'm struggling... not sure what's the best redirect code to use. 301 Moved Permanently? 302 Found? 303 See Other? 307 Temporary Redirect?

        Location: http://api.example.com/v1/item/foo

    I've studied the Wikipedia descriptions and RFC 2616, and none of these seem to be perfect. Here are the specific characteristics I'm looking for in this case:

    - The redirect is permanent, as the ID will never change. So for efficiency, the client can and should make all future requests to the ID endpoint directly. This suggests 301, as the other three are meant to be temporary.
    - The redirect should use GET, even though this request is POST. This suggests 303, as all others are technically supposed to re-use the POST method. In practice, browsers will use GET for 301 and 302, but this is a REST API, not a website meant to be used by regular users in browsers.
    - It should be broadly usable and easy to play with. Specifically, 303 is HTTP/1.1 whereas 301 and 302 are HTTP/1.0. I'm not sure how much of an issue this is.

    At this point, I'm leaning towards 303 just to be semantically correct (use GET, don't re-POST) and just suck it up on the "temporary" part. But I'm not sure if 302 would be better since in practice it's been the same behavior as 303, but without requiring HTTP/1.1. And if I go down that line, I wonder if 301 is even better for the same reason plus the "permanent" part. Thoughts appreciated!


  • HTTP Post requests using HttpClient take 2 seconds, why?

    - by pableu
    Update: You might want to hold off on this for a bit; I just noticed it could be my fault after all. Working on this all afternoon, and then I find a flaw ten minutes after posting here, ts.

    Hi, I'm currently coding an Android app that submits stuff in the background using HTTP POST and AsyncTask. I use the org.apache.http.client package for this. I based my code on this example. Basically, my code looks like this:

        public void postData() {
            // Create a new HttpClient and Post Header
            HttpClient httpclient = new DefaultHttpClient();
            HttpPost httppost = new HttpPost("http://192.168.1.137:8880/form");

            try {
                List<NameValuePair> nameValuePairs = new ArrayList<NameValuePair>(2);
                nameValuePairs.add(new BasicNameValuePair("id", "12345"));
                nameValuePairs.add(new BasicNameValuePair("stringdata", "AndDev is Cool!"));
                httppost.setEntity(new UrlEncodedFormEntity(nameValuePairs));

                // Execute HTTP Post Request
                HttpResponse response = httpclient.execute(httppost);
            } catch (ClientProtocolException e) {
                Log.e(TAG, e.toString());
            } catch (IOException e) {
                Log.e(TAG, e.toString());
            }
        }

    The problem is that the httpclient.execute(..) line takes around 1.5 to 3 seconds, and I do not understand why. Just requesting a page with HTTP GET takes around 80 ms or so, so the problem doesn't seem to be the network latency itself. The problem doesn't seem to be on the server side either; I have also tried POSTing data to http://www.disney.com/ with similarly slow results. And Firebug shows a 1 ms response time when POSTing data to my server locally. This happens on the emulator and with my Nexus One (both with Android 2.2).

    If you want to look at the complete code, I've put it on GitHub. It's just a dummy program that does an HTTP POST in the background using AsyncTask on the push of a button. It's my first Android app, and my first Java code for a long time. And incidentally, also my first question on Stack Overflow ;-) Any ideas why httpclient.execute(httppost) takes so long?
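
    Not something the post confirms (the author suspects their own code), but one commonly cited culprit for multi-second DefaultHttpClient POSTs is the Expect: 100-continue handshake; a sketch of switching it off on the request:

        // import org.apache.http.params.CoreProtocolPNames;
        HttpPost httppost = new HttpPost("http://192.168.1.137:8880/form");
        // Don't wait for the server's "100 Continue" before sending the body.
        httppost.getParams().setBooleanParameter(
                CoreProtocolPNames.USE_EXPECT_CONTINUE, false);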


  • Use HTTP PUT to create new cache (ehCache) running on the same Tomcat?

    - by socal_javaguy
    I am trying to send an HTTP PUT (in order to create a new cache and populate it with my generated JSON) to ehCache using my webservice, which is on the same local Tomcat instance. Am new to RESTful Web Services and am using JDK 1.6, Tomcat 7, ehCache, and JSON. I have my POJOs defined like this:

    Person POJO:

        import javax.xml.bind.annotation.XmlRootElement;

        @XmlRootElement
        public class Person {
            private String firstName;
            private String lastName;
            private List<House> houses;
            // Getters & Setters
        }

    House POJO:

        import javax.xml.bind.annotation.XmlRootElement;

        @XmlRootElement
        public class House {
            private String address;
            private String city;
            private String state;
            // Getters & Setters
        }

    Using a PersonUtil class, I hardcoded the POJOs as follows:

        public class PersonUtil {
            public static Person getPerson() {
                Person person = new Person();
                person.setFirstName("John");
                person.setLastName("Doe");

                List<House> houses = new ArrayList<House>();
                House house = new House();
                house.setAddress("1234 Elm Street");
                house.setCity("Anytown");
                house.setState("Maine");
                houses.add(house);
                person.setHouses(houses);

                return person;
            }
        }

    Am able to create a JSON response per a GET request:

        @Path("")
        public class MyWebService{
            @GET
            @Produces(MediaType.APPLICATION_JSON)
            public Person getPerson() {
                return PersonUtil.getPerson();
            }
        }

    When deploying the war to Tomcat and pointing the browser to http://localhost:8080/personservice/

    Generated JSON:

        {
            "firstName" : "John",
            "lastName" : "Doe",
            "houses": [
                {
                    "address" : "1234 Elmstreet",
                    "city" : "Anytown",
                    "state" : "Maine"
                }
            ]
        }

    So far, so good. However, I have a different app which is running on the same Tomcat instance (and has support for REST): http://localhost:8080/ehcache/rest/

    While Tomcat is running, I can issue a PUT like this:

        echo "Hello World" | curl -S -T - http://localhost:8080/ehcache/rest/hello/1

    When I "GET" it like this:

        curl http://localhost:8080/ehcache/rest/hello/1

    Will yield:

        Hello World

    What I need to do is create a POST which will put my entire Person generated JSON and create a new cache:

        http://localhost:8080/ehcache/rest/person

    And when I do a "GET" on this previous URL, it should look like this:

        {
            "firstName" : "John",
            "lastName" : "Doe",
            "houses": [
                {
                    "address" : "1234 Elmstreet",
                    "city" : "Anytown",
                    "state" : "Maine"
                }
            ]
        }

    So far, this is what my PUT looks like:

        @PUT
        @Path("/ehcache/rest/person")
        @Produces(MediaType.APPLICATION_JSON)
        @Consumes(MediaType.APPLICATION_JSON)
        public Response createCache() {
            ResponseBuilder response = Response.ok(PersonUtil.getPerson(), MediaType.APPLICATION_JSON);
            return response.build();
        }

    Question(s):

    (1) Is this the correct way to write the PUT?
    (2) What should I write inside the createCache() method to have it PUT my generated JSON into: http://localhost:8080/ehcache/rest/person
    (3) What would the command-line curl command look like to use the PUT?

    Thanks for taking the time to read this...
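
    Toward questions (2) and (3), a sketch of pushing the JSON to the ehcache REST endpoint with plain HttpURLConnection, plus the equivalent curl. The element key ".../person/1" and the use of Jackson's ObjectMapper for serialization are my assumptions, not something the post specifies.

        // Sketch for inside createCache(): serialize Person and PUT it.
        String json = new ObjectMapper().writeValueAsString(PersonUtil.getPerson());

        URL url = new URL("http://localhost:8080/ehcache/rest/person/1"); // element key "1" assumed
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("PUT");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.getOutputStream().write(json.getBytes("UTF-8"));
        int status = conn.getResponseCode();   // inspect what the cache server returns
        conn.disconnect();

    And the command line (payload shortened here):

        curl -X PUT -H "Content-Type: application/json" \
             -d '{"firstName":"John","lastName":"Doe"}' \
             http://localhost:8080/ehcache/rest/person/1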


  • ExpressionEngine Segment Variables Lost on Site Index Page

    - by Jesse Bunch
    Hey Everyone, I've been messing with this for days now and can't seem to figure it out. I am trying to pass a 2nd segment variable to my client's index page. The URL I'm trying is: http://www.compupay.com/site/CSCPA/. The problem is, rather than showing the site's index page with the segment variable of "CSCPA" still in the URL, it shows the index page with no segment variables. Initially, I thought it was a .htaccess problem but I couldn't find anything in it that seemed out of whack. Any ideas? I am posting the .htaccess file so another pair of eyes can see it. Thanks for the help!

        # -- LG .htaccess Generator Start --
        # .htaccess generated by LG .htaccess Generator v1.0.0
        # http://leevigraham.com/cms-customisation/expressionengine/addon/lg-htaccess-generator/

        # secure .htaccess file
        <Files .htaccess>
        order allow,deny
        deny from all
        </Files>

        # Dont list files in index pages
        IndexIgnore *

        #URL Segment Support
        AcceptPathInfo On
        Options +FollowSymLinks

        #Redirect old incoming links
        Redirect 301 /contactus.cfm http://www.compupay.com/about_compupay/contact_us/
        Redirect 301 /Internet_Payroll.cfm http://www.compupay.com/payroll_solutions/c/online_payroll/
        Redirect 301 /Internet_Payroll_XpressPayroll.cfm http://www.compupay.com/payroll_solutions/xpresspayroll/
        Redirect 301 /about_compupay.cfm http://www.compupay.com/about_compupay/news/
        Redirect 301 /after_payroll.cfm http://www.compupay.com/after_payroll_solutions/
        Redirect 301 /news101507.cfm http://www.compupay.com/about_compupay/news/
        Redirect 301 /quote.cfm http://www.compupay.com/payroll_solutions/get_a_free_quote/
        Redirect 301 /solution_finder_sm.cfm http://www.compupay.com/
        Redirect 301 /state_payroll/mississippi_payroll.cfm http://www.compupay.com/resource_center/state_resources/
        Redirect 301 /state_payroll/washington_payroll.cfm http://www.compupay.com/resource_center/state_resources/

        #Redirect for old top linked to pages
        Redirect 301 /Payroll_Services.cfm http://www.compupay.com/payroll_solutions/
        Redirect 301 /About_CompuPay.cfm http://www.compupay.com/about_compupay/
        Redirect 301 /Partnerships.cfm http://www.compupay.com/business_partner_solutions/
        Redirect 301 /about_compupay.cfm?subpage=393 http://www.compupay.com/about_compupay/
        Redirect 301 /quote.cfm http://www.compupay.com/payroll_solutions/get_a_free_quote/
        Redirect 301 /After_Payroll.cfm http://www.compupay.com/after_payroll_solutions/
        Redirect 301 /Accountant_Services.cfm http://www.compupay.com/accountant_solutions/
        Redirect 301 /careers/careers_payroll.cfm http://www.compupay.com/about_compupay/careers/
        Redirect 301 /Industry_Resources.cfm http://www.compupay.com/resource_center/
        Redirect 301 /Client_Resources.cfm http://www.compupay.com/resource_center/client_login/
        Redirect 301 /client_resources.cfm?subpage=375 http://www.compupay.com/resource_center/client_login/
        Redirect 301 /solution_finder_sm.cfm http://www.compupay.com/payroll_solutions/
        Redirect 301 /Internet_Payroll_PowerPayroll.cfm http://www.compupay.com/payroll_solutions/powerpayroll/
        Redirect 301 /Payroll_Outsourcing.cfm http://www.compupay.com/payroll_solutions/why_outsource/
        Redirect 301 /Phone_Payroll_Fax_Payroll.cfm http://www.compupay.com/payroll_solutions/phone_fax_payroll/
        Redirect 301 /contactus.cfm http://www.compupay.com/about_compupay/contact_us/
        Redirect 301 /state_payroll/iowa_payroll.cfm http://www.compupay.com/resource_center/state_resources/
        Redirect 301 /Construction_Payroll.cfm http://www.compupay.com/payroll_solutions/specialty_payroll/
        Redirect 301 /PC_Payroll.cfm http://www.compupay.com/payroll_solutions/c/pc_payroll/
        Redirect 301 /state_payroll/washington_payroll.cfm http://www.compupay.com/resource_center/state_resources/
        Redirect 301 /Internet_Payroll_XpressPayroll.cfm http://www.compupay.com/payroll_solutions/xpresspayroll/
        Redirect 301 /accountant_services.cfm?subpage=404 http://www.compupay.com/accountant_solutions/
        Redirect 301 /after_payroll.cfm http://www.compupay.com/after_payroll_solutions/
        Redirect 301 /after_payroll.cfm?subpage=361 http://www.compupay.com/after_payroll_solutions/
        Redirect 301 /after_payroll.cfm?subpage=362 http://www.compupay.com/after_payroll_solutions/
        Redirect 301 /after_payroll.cfm?subpage=363 http://www.compupay.com/after_payroll_solutions/
        Redirect 301 /after_payroll.cfm?subpage=364 http://www.compupay.com/after_payroll_solutions/
        Redirect 301 /after_payroll.cfm?subpage=365 http://www.compupay.com/after_payroll_solutions/
        Redirect 301 /after_payroll.cfm?subpage=366 http://www.compupay.com/after_payroll_solutions/
        Redirect 301 /after_payroll.cfm?subpage=367 http://www.compupay.com/after_payroll_solutions/
        Redirect 301 /after_payroll.cfm?subpage=368 http://www.compupay.com/after_payroll_solutions/
        Redirect 301 /after_payroll.cfm?subpage=369 http://www.compupay.com/after_payroll_solutions/
        Redirect 301 /after_payroll.cfm?subpage=416 http://www.compupay.com/after_payroll_solutions/
        Redirect 301 /payload_payroll.cfm http://www.compupay.com/payroll_solutions/payload/
        Redirect 301 /payroll_services.cfm?subpage=358 http://www.compupay.com/payroll_solutions/
        Redirect 301 /payroll_services.cfm?subpage=399 http://www.compupay.com/payroll_solutions/
        Redirect 301 /payroll_services.cfm?subpage=409 http://www.compupay.com/payroll_solutions/
        Redirect 301 /payroll_services.cfm?subpage=413 http://www.compupay.com/payroll_solutions/
        Redirect 301 /payroll_services.cfm?subpage=418 http://www.compupay.com/payroll_solutions/
        Redirect 301 /state_payroll/mississippi_payroll.cfm http://www.compupay.com/resource_center/state_resources/

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /

        # Remove the www
        # RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
        # RewriteRule ^ http://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

        # Force www
        RewriteCond %{HTTP_HOST} !^www.compupay.com$ [NC]
        RewriteRule ^(.*)$ http://www.compupay.com/$1 [R=301,L]

        # Add a trailing slash to paths without an extension
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_URI} !(\.[a-zA-Z0-9]{1,5}|/)$
        RewriteRule ^(.*)$ $1/ [L,R=301]

        #Legacy Partner Link Redirect
        RewriteCond %{QUERY_STRING} partnerCode=(.*) [NC]
        RewriteRule compupay_payroll.cfm site/%1? [R=301,L]

        # Catch any remaining requests for .cfm files
        RewriteCond %{REQUEST_URI} \.cfm
        RewriteRule ^.*$ http://www.compupay.com/ [R=301,L]

        #Expression Engine
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /index.php?/$1 [L]

        AcceptPathInfo On
        </IfModule>

        # Remove IE image toolbar
        <FilesMatch "\.(html|htm|php)$">
        Header set imagetoolbar "no"
        </FilesMatch>

        # enable gzip compression
        <FilesMatch "\.(js|css|php)$">
        SetOutputFilter DEFLATE
        </FilesMatch>

        #Deal with ETag
        <IfModule mod_headers.c>
        <FilesMatch "\.(ico|flv|jpg|jpeg|png|gif)$">
        Header unset Last-Modified
        </FilesMatch>
        <FilesMatch "\.(ico|flv|jpg|jpeg|png|gif|js|css|swf)$">
        Header unset ETag
        FileETag None
        Header set Cache-Control "public"
        </FilesMatch>
        </IfModule>

        <IfModule mod_expires.c>
        <FilesMatch "\.(ico|flv|jpg|jpeg|png|gif|css|js)$">
        ExpiresActive on
        ExpiresDefault "access plus 1 year"
        </FilesMatch>
        </IfModule>

        #Force Download PDFs
        <FilesMatch "\.(?i:pdf)$">
        ForceType application/octet-stream
        Header set Content-Disposition attachment
        </FilesMatch>

        #Increase Upload Size
        php_value upload_max_filesize 5M
        php_value post_max_size 5M

        # -- LG .htaccess Generator End --


  • How do I make some files on my machine accessible via HTTP using Apache?

    - by Lazer
    I did a wget on the source and built the apache binaries correctly. Now what do I need to do to get some documents accessible using HTTP (start some services?)? Also, do I need to group all the files I want to make accessible in some directory and make the directory and its contents accessible or can I just make the individual documents available? I will be providing these links to my colleagues and do not want them to be down, so need to make sure that the apache services are up automatically after a reboot. Does apache have some inbuilt support for this?
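
    A sketch of the usual pieces (not from the post; the install prefix below is the source-build default, and the shared directory path is illustrative): expose a directory in httpd.conf, start the server, and hook apachectl into the system's init/startup scripts so it comes back after reboots.

        # In /usr/local/apache2/conf/httpd.conf -- share one directory:
        Alias /docs "/home/lazer/shared-docs"
        <Directory "/home/lazer/shared-docs">
            Options Indexes
            Require all granted      # on Apache 2.2: "Order allow,deny" + "Allow from all"
        </Directory>

        # Start it now; add this command to the init scripts to survive reboots:
        /usr/local/apache2/bin/apachectl start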

