Search Results

Search found 12068 results on 483 pages for 'hudson api'.


  • API design and versioning using EJB

    - by broschb
    I have an API that is EJB based (i.e. there are remote interfaces defined) that most of the clients use. As the client base grows there are issues with updates to the API and forcing clients to have to update to the latest version and interface definition. I would like to look at possibly keeping a couple of versions of the API deployed at a time (i.e. have multiple EAR files deployed with different versions of the API) so that clients are not forced to update as frequently. I am not concerned about the actual deployment of this, but instead am looking for thoughts and experiences that others have had using EJBs as an API client. How do you support updating versions? Are clients required to update? Does anyone run multiple versions in a production environment? Are there pros and cons? Any other experiences or thoughts on this approach, and on having an EJB-centric API?
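
    One common way to run two API versions side by side is to give each version its own remote interface and its own JNDI/mapped name, so existing client stubs keep resolving while new clients bind to the newer contract. A minimal sketch of that idea follows; the interface names, the added method and the mapped name are illustrative, not from the question, and each type would live in its own source file:

      // Sketch only - each type below would normally live in its own source file.
      import javax.ejb.Remote;
      import javax.ejb.Stateless;

      // Hypothetical v1 contract that existing clients already compile against.
      @Remote
      public interface OrderServiceV1 {
          String findOrder(long id);
      }

      // Hypothetical v2 contract, deployed alongside v1 (e.g. in a second EAR).
      @Remote
      public interface OrderServiceV2 {
          String findOrder(long id);
          String findOrderSummary(long id); // the kind of addition that would otherwise force a client update
      }

      // The v2 bean is bound under its own mapped name, so v1 lookups stay untouched.
      @Stateless(mappedName = "api/OrderServiceV2")
      public class OrderServiceV2Bean implements OrderServiceV2 {
          public String findOrder(long id) { return "order-" + id; }
          public String findOrderSummary(long id) { return "summary for order " + id; }
      }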

    Read the article

  • Is there an API for determining congressional districts?

    - by ardavis
    I'm looking to determine the congressional district based on an address my user is providing. This will avoid having the user look it up themselves. Does an API of this sort exist? Note: through my attempts to find one, I've only come across these: http://www.govtrack.us/developers/api (not sure how to submit an address or zip code, however) The following resources are available in the API ...Bills and resolutions in the U.S. Congress since 1973 (the 93rd Congress). ...A (bill, person) pair indicating cosponsorship, with join and withdrawn dates. ...Members of Congress and U.S. Presidents since the founding of the nation. ...Terms held in office by Members of Congress and U.S. Presidents. Each term corresponds with an election, meaning each term in the House covers two years (one 'Congress'), as President four years, and in the Senate six years (three 'Congresses'). ...Roll call votes in the U.S. Congress since 1789. How people voted is accessed through the Vote_voter API. ...How people voted on roll call votes in the U.S. Congress since 1789. See the Vote API. Filter on the vote field to get the results of a particular vote... http://www.opencongress.org/api (seems to be a way to find congress information, but not districts) This API provides programmers with structured access to all the data on OpenCongress, everything from official bill info to news and blog coverage to user-generated votes on bills and much more... This API defaults to returning XML. All queries can also return JSON... https://groups.google.com/forum/?fromgroups=#!topic/opendems-discuss/CeKyi_aANaE (similar question, no resolution) I've been looking over Open Dems, and seeing what's exposed at this point and what isn't. I work with Democrats Abroad, and am interested in using stuff from the lab for their sites. I quickly looked over the Precinct API, which does both more and less than what I'd need. An ideal resource would be any way of translating addresses into CD at the very least (getting state district data would be good as well), since that would make it easier for DA's membership to make a difference in races like last month's NY26 race... Update: I'm looking at the source for the govtrack.us website and the 'doGeoCode' function may be useful. view-source:http://www.govtrack.us/congress/members If no one has any suggestions, I will try to go off of what they are doing.

    Read the article

  • hudson/jenkins: help needed to get started with customization work

    - by user64204
    I would like to customize Jenkins by adding links to the left-hand side panel and use the pages associated with these links to serve some custom content in place of the jobs/views table displayed by default. I managed to add links to the side-bar using the sidebar-links plugin. Now I'm trying to see how to replace the content of the <td id="main-panel"> element with some custom content. The custom content is generated by some PHP scripts which ideally should be called by Hudson every time the custom pages are requested, though if that is too complicated I can either create static content to be served by Jenkins by calling my PHP scripts in a crontab, or see if calls to the PHP scripts can be done by Apache itself before the page requests are sent to Jenkins. I'm not sure writing a plugin is the best way to proceed and I would like to have your thoughts as to how you think I should implement this.
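
    For reference, if the plugin route does turn out to be the way to go, the usual extension point for a new top-level page is a RootAction. A rough sketch is below; the class name, URL name, icon and the PHP URL it proxies are all made up for illustration, and the same doIndex could just as easily stream a pre-generated static file:

      import hudson.Extension;
      import hudson.model.RootAction;
      import org.kohsuke.stapler.StaplerRequest;
      import org.kohsuke.stapler.StaplerResponse;

      import java.io.IOException;
      import java.io.InputStream;
      import java.io.OutputStream;
      import java.net.URL;

      // Rough sketch: adds a link to the top-level navigation and serves whatever
      // the PHP script returns in place of the normal main-panel content.
      @Extension
      public class CustomContentAction implements RootAction {

          public String getIconFileName() { return "document.png"; } // returning null would hide the link
          public String getDisplayName() { return "Custom Content"; }
          public String getUrlName() { return "custom-content"; }    // served at <hudson-root>/custom-content

          // Stapler routes <hudson-root>/custom-content/ to doIndex.
          public void doIndex(StaplerRequest req, StaplerResponse rsp) throws IOException {
              rsp.setContentType("text/html;charset=UTF-8");
              InputStream in = new URL("http://localhost/custom-panel.php").openStream(); // illustrative URL
              OutputStream out = rsp.getOutputStream();
              byte[] buf = new byte[8192];
              for (int n; (n = in.read(buf)) != -1; ) {
                  out.write(buf, 0, n);
              }
              in.close();
          }
      }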

    Read the article

  • Google Maps API vs Multimap/Bing Maps API

    - by mdresser
    I want to know if anyone who has experience of using both the Google Maps API and the Multimap API can give a good reason as to why one is better than the other - or maybe a list of pros and cons? I will be working on a complete re-development of a site which currently uses the Multimap (Classic) API and want to consider the possibility of using the Google Maps API instead of Multimap (now MS Bing), but I need a compelling reason to justify this decision. The site currently provides a search mechanism allowing users to search for addresses using a postcode/partial postcode or city. The current system has a SQL Server database back-end containing full address details and also uploads (geocodes) this information to Multimap with a daily scheduled task. I'm wondering if it's possible with the Google API to avoid the need for the daily upload and just use its geocoding API instead (though this is limited by Google's restriction of a certain number of geocoding requests per day).
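
    On the geocoding point: Google's HTTP geocoding service can be called per search request, which may let you drop the daily upload entirely, subject to the per-day quota mentioned above. A hedged sketch of such a lookup - the endpoint reflects the v3 geocoding service, newer usage also requires an API key, and JSON parsing is left to whatever library the site already uses:

      import java.io.BufferedReader;
      import java.io.InputStreamReader;
      import java.net.URL;
      import java.net.URLEncoder;

      public class GeocodeLookup {
          // Fetches the raw geocoding response for a postcode or partial address.
          public static String geocode(String query) throws Exception {
              String url = "https://maps.googleapis.com/maps/api/geocode/json?address="
                      + URLEncoder.encode(query, "UTF-8");
              BufferedReader in = new BufferedReader(
                      new InputStreamReader(new URL(url).openStream(), "UTF-8"));
              StringBuilder json = new StringBuilder();
              for (String line; (line = in.readLine()) != null; ) {
                  json.append(line).append('\n');
              }
              in.close();
              // results[0].geometry.location in the response holds the lat/lng to feed into the map.
              return json.toString();
          }

          public static void main(String[] args) throws Exception {
              System.out.println(geocode("SW1A 1AA"));
          }
      }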

    Read the article

  • Sharing authentication methods across API and web app

    - by Snixtor
    I'm wanting to share an authentication implementation across a web application, and web API. The web application will be ASP.NET (mostly MVC 4), the API will be mostly ASP.NET WEB API, though I anticipate it will also have a few custom modules or handlers. I want to: Share as much authentication implementation between the app and API as possible. Have the web application behave like forms authentication (attractive log-in page, logout option, redirect to / from login page when a request requires authentication / authorisation). Have API callers use something closer to standard HTTP (401 - Unauthorized, not 302 - Redirect). Provide client and server side logout mechanisms that don't require a change of password (so HTTP basic is out, since clients typically cache their credentials). The way I'm thinking of implementing this is using plain old ASP.NET forms authentication for the web application, and pushing another module into the stack (much like MADAM - Mixed Authentication Disposition ASP.NET Module). This module will look for some HTTP header (implementation specific) which indicates "caller is API". If the header "caller is API" is set, then the service will respond differently than standard ASP.NET forms authentication, it will: 401 instead of 302 on a request lacking authentication. Look for username + pass in a custom "Login" HTTP header, and return a FormsAuthentication ticket in a custom "FormsAuth" header. Look for FormsAuthentication ticket in a custom "FormsAuth" header. My question(s) are: Is there a framework for ASP.NET that already covers this scenario? Are there any glaring holes in this proposed implementation? My primary fear is a security risk that I can't see, but I'm similarly concerned that there may be something about such an implementation that will make it overly restrictive or clumsy to work with.
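
    The dispatch being proposed (API callers get a bare 401, interactive users get the login redirect) is framework-neutral, so here is that branching sketched as a plain servlet filter purely for illustration - it is not an ASP.NET implementation, and the header name and login URL are invented:

      import java.io.IOException;
      import javax.servlet.Filter;
      import javax.servlet.FilterChain;
      import javax.servlet.FilterConfig;
      import javax.servlet.ServletException;
      import javax.servlet.ServletRequest;
      import javax.servlet.ServletResponse;
      import javax.servlet.http.HttpServletRequest;
      import javax.servlet.http.HttpServletResponse;

      // Illustration of the "caller is API" branching only; in ASP.NET the same check
      // would sit in an HttpModule or message handler rather than a servlet filter.
      public class MixedAuthFilter implements Filter {
          public void init(FilterConfig config) {}
          public void destroy() {}

          public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
                  throws IOException, ServletException {
              HttpServletRequest req = (HttpServletRequest) request;
              HttpServletResponse rsp = (HttpServletResponse) response;

              boolean authenticated = req.getSession(false) != null
                      && req.getSession(false).getAttribute("user") != null;

              if (authenticated) {
                  chain.doFilter(request, response);                       // already signed in, carry on
              } else if ("api".equals(req.getHeader("X-Caller"))) {
                  rsp.sendError(HttpServletResponse.SC_UNAUTHORIZED);      // API callers: 401, no redirect
              } else {
                  rsp.sendRedirect("/login?returnUrl=" + req.getRequestURI()); // browsers: forms-style redirect
              }
          }
      }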

    Read the article

  • Using an alternate JSON Serializer in ASP.NET Web API

    - by Rick Strahl
    The new ASP.NET Web API that Microsoft released alongside MVC 4.0 Beta last week is a great framework for building REST and AJAX APIs. I've been working with it for quite a while now and I really like the way it works and the complete set of features it provides 'in the box'. It's about time that Microsoft gets a decent API for building generic HTTP endpoints into the framework. DataContractJsonSerializer sucks As nice as Web API's overall design is, one thing still sucks: the built-in JSON serialization uses the DataContractJsonSerializer, which is just too limiting for many scenarios. The biggest issues I have with it are: No support for untyped values (object, dynamic, Anonymous Types) MS AJAX style Date Formatting Ugly serialization formats for types like Dictionaries To me the most serious issue is dealing with serialization of untyped objects. I have a number of applications with AJAX front ends that dynamically reformat data from business objects to fit a specific message format that certain UI components require. The most common scenario I have there is IEnumerable query results from a database with fields from the result set rearranged to fit the sometimes unconventional formats required for the UI components (like jqGrid for example). Creating custom types to fit these messages seems like overkill, and projections using LINQ make this much easier to code up. Alas, DataContractJsonSerializer doesn't support it. Neither does DataContractSerializer for XML output, for that matter. What this means is that you can't do stuff like this in Web API out of the box: public object GetAnonymousType() { return new { name = "Rick", company = "West Wind", entered = DateTime.Now }; } Basically, anything that doesn't have an explicit type DataContractJsonSerializer will not let you return. FWIW, the same is true for XmlSerializer, which also doesn't work with non-typed values for serialization. The example above is obviously contrived with a hardcoded object graph, but it's not uncommon to get dynamic values returned from queries that have anonymous types for their result projections. Apparently there's a good possibility that Microsoft will ship Json.NET as part of the Web API RTM release. Scott Hanselman confirmed this as a footnote in his JSON Dates post a few days ago. I've heard several other people from Microsoft confirm that Json.NET will be included and be the default JSON serializer, but no details yet in what capacity it will show up. Let's hope it ends up as the default in the box. Meanwhile this post will show you how you can use it today with the beta and get JSON that matches what you should see in the RTM version. What about JsonValue? To be fair, Web API DOES include the new JsonValue/JsonObject/JsonArray types that allow you to address some of these scenarios. JsonValue is a new type in the System.Json assembly that can be used to build up an object graph based on a dictionary. It's actually a really cool implementation of a dynamic type that allows you to create an object graph and spit it out to JSON without having to create a .NET type first. JsonValue can also receive a JSON string and parse it without having to actually load it into a .NET type (which is something that's been missing in the core framework). This is really useful if you get a JSON result from an arbitrary service and you don't want to explicitly create a mapping type for the data returned.
    For serialization you can create an object structure on the fly and pass it back as part of a Web API action method like this: public JsonValue GetJsonValue() { dynamic json = new JsonObject(); json.name = "Rick"; json.company = "West Wind"; json.entered = DateTime.Now; dynamic address = new JsonObject(); address.street = "32 Kaiea"; address.zip = "96779"; json.address = address; dynamic phones = new JsonArray(); json.phoneNumbers = phones; dynamic phone = new JsonObject(); phone.type = "Home"; phone.number = "808 123-1233"; phones.Add(phone); phone = new JsonObject(); phone.type = "Mobile"; phone.number = "808 123-1234"; phones.Add(phone); //var jsonString = json.ToString(); return json; } which produces the following output (formatted here for easier reading): { name: "rick", company: "West Wind", entered: "2012-03-08T15:33:19.673-10:00", address: { street: "32 Kaiea", zip: "96779" }, phoneNumbers: [ { type: "Home", number: "808 123-1233" }, { type: "Mobile", number: "808 123-1234" }] } If you need to build a simple JSON type on the fly these types work great. But if you have an existing type - or worse, a query result/list that's already formatted - JsonValue et al. become a pain to work with. As far as I can see there's no way to just throw an object instance at JsonValue and have it convert into a JsonValue dictionary. It's a manual process. Using alternate Serializers in Web API So, currently the default serializer in Web API is DataContractJsonSerializer and I don't like it. You may not either, but luckily you can swap the serializer fairly easily. If you'd rather use the JavaScriptSerializer built into System.Web.Extensions or Json.NET today, it's not too difficult to create a custom MediaTypeFormatter that uses these serializers and can replace or partially replace the native serializer.
    Here's a MediaTypeFormatter implementation using the ASP.NET JavaScriptSerializer: using System; using System.Net.Http.Formatting; using System.Threading.Tasks; using System.Web.Script.Serialization; using System.Json; using System.IO; namespace Westwind.Web.WebApi { public class JavaScriptSerializerFormatter : MediaTypeFormatter { public JavaScriptSerializerFormatter() { SupportedMediaTypes.Add(new System.Net.Http.Headers.MediaTypeHeaderValue("application/json")); } protected override bool CanWriteType(Type type) { // don't serialize JsonValue structure use default for that if (type == typeof(JsonValue) || type == typeof(JsonObject) || type == typeof(JsonArray)) return false; return true; } protected override bool CanReadType(Type type) { if (type == typeof(IKeyValueModel)) return false; return true; } protected override System.Threading.Tasks.Task<object> OnReadFromStreamAsync(Type type, System.IO.Stream stream, System.Net.Http.Headers.HttpContentHeaders contentHeaders, FormatterContext formatterContext) { var task = Task<object>.Factory.StartNew(() => { var ser = new JavaScriptSerializer(); string json; using (var sr = new StreamReader(stream)) { json = sr.ReadToEnd(); sr.Close(); } object val = ser.Deserialize(json, type); return val; }); return task; } protected override System.Threading.Tasks.Task OnWriteToStreamAsync(Type type, object value, System.IO.Stream stream, System.Net.Http.Headers.HttpContentHeaders contentHeaders, FormatterContext formatterContext, System.Net.TransportContext transportContext) { var task = Task.Factory.StartNew(() => { var ser = new JavaScriptSerializer(); var json = ser.Serialize(value); byte[] buf = System.Text.Encoding.Default.GetBytes(json); stream.Write(buf, 0, buf.Length); stream.Flush(); }); return task; } } } The formatter implementation is pretty simple: you override 4 methods to tell Web API which types you can handle, and then handle the input or output streams to create/parse the JSON data. Note that when creating output you want to take care to still allow the JsonValue/JsonObject/JsonArray types to be handled by the default serializer so those objects serialize properly - if you let either JavaScriptSerializer or JSON.NET handle them, they'd try to render the dictionaries, which is very undesirable.
    If you'd rather use Json.NET here's the JSON.NET version of the formatter: // this code requires a reference to JSON.NET in your project #if true using System; using System.Net.Http.Formatting; using System.Threading.Tasks; using System.Web.Script.Serialization; using System.Json; using Newtonsoft.Json; using System.IO; using Newtonsoft.Json.Converters; namespace Westwind.Web.WebApi { public class JsonNetFormatter : MediaTypeFormatter { public JsonNetFormatter() { SupportedMediaTypes.Add(new System.Net.Http.Headers.MediaTypeHeaderValue("application/json")); } protected override bool CanWriteType(Type type) { // don't serialize JsonValue structure use default for that if (type == typeof(JsonValue) || type == typeof(JsonObject) || type == typeof(JsonArray)) return false; return true; } protected override bool CanReadType(Type type) { if (type == typeof(IKeyValueModel)) return false; return true; } protected override System.Threading.Tasks.Task<object> OnReadFromStreamAsync(Type type, System.IO.Stream stream, System.Net.Http.Headers.HttpContentHeaders contentHeaders, FormatterContext formatterContext) { var task = Task<object>.Factory.StartNew(() => { var settings = new JsonSerializerSettings() { NullValueHandling = NullValueHandling.Ignore, }; var sr = new StreamReader(stream); var jreader = new JsonTextReader(sr); var ser = new JsonSerializer(); ser.Converters.Add(new IsoDateTimeConverter()); object val = ser.Deserialize(jreader, type); return val; }); return task; } protected override System.Threading.Tasks.Task OnWriteToStreamAsync(Type type, object value, System.IO.Stream stream, System.Net.Http.Headers.HttpContentHeaders contentHeaders, FormatterContext formatterContext, System.Net.TransportContext transportContext) { var task = Task.Factory.StartNew(() => { var settings = new JsonSerializerSettings() { NullValueHandling = NullValueHandling.Ignore, }; string json = JsonConvert.SerializeObject(value, Formatting.Indented, new JsonConverter[1] { new IsoDateTimeConverter() } ); byte[] buf = System.Text.Encoding.Default.GetBytes(json); stream.Write(buf, 0, buf.Length); stream.Flush(); }); return task; } } } #endif One advantage of the Json.NET serializer is that you can specify a few options on how things are formatted and handled. You get null value handling and you can plug in the IsoDateTimeConverter, which is nice to produce proper ISO dates that I would expect any JSON serializer to output these days. Hooking up the Formatters Once you've created the custom formatters you need to enable them for your Web API application. To do this, use the GlobalConfiguration.Configuration object and add the formatter to the Formatters collection. Here's what this looks like hooked up from Application_Start in a Web project: protected void Application_Start(object sender, EventArgs e) { // Action based routing (used for RPC calls) RouteTable.Routes.MapHttpRoute( name: "StockApi", routeTemplate: "stocks/{action}/{symbol}", defaults: new { symbol = RouteParameter.Optional, controller = "StockApi" } ); // WebApi Configuration to hook up formatters and message handlers // optional RegisterApis(GlobalConfiguration.Configuration); } public static void RegisterApis(HttpConfiguration config) { // Add JavaScriptSerializer formatter instead - add at top to make default //config.Formatters.Insert(0, new JavaScriptSerializerFormatter()); // Add Json.net formatter - add at the top so it fires first!
    // This leaves the old one in place so JsonValue/JsonObject/JsonArray still are handled config.Formatters.Insert(0, new JsonNetFormatter()); } One thing to remember here is the GlobalConfiguration object, which is Web API's static configuration instance. I think this thing is seriously misnamed, given that GlobalConfiguration could stand for anything and so is hard to discover if you don't know what you're looking for. How about WebApiConfiguration or something more descriptive? Anyway, once you know what it is you can use the Formatters collection to insert your custom formatter. Note that I insert my formatter at the top of the list so it takes precedence over the default formatter. I also am not removing the old formatter because I still want JsonValue/JsonObject/JsonArray to be handled by the default serialization mechanism. Since they process in sequence and I exclude processing for these types, JsonValue et al. still get properly serialized/deserialized. Summary Currently DataContractJsonSerializer in Web API is a pain, but at least we have the ability, with relatively limited effort, to replace the MediaTypeFormatter and plug in our own JSON serializer. This is useful for many scenarios - if you have existing client applications that used MVC JsonResult or ASP.NET AJAX results from ASMX AJAX services, you can plug in the JavaScript serializer and get exactly the same serializer you used in the past, so your results will be the same and won't potentially break clients. JSON serializers do vary a bit in how they serialize some of the more complex types (like Dictionaries and dates, for example), so if you're migrating it might be helpful to ensure your client code doesn't break when you switch to ASP.NET Web API. Going forward it looks like Microsoft is planning on plugging Json.NET into Web API and making that the default. I think that's an awesome choice, since Json.NET has been around forever, is fast and easy to use, and provides a ton of functionality as part of this great library. I just wish Microsoft would have figured this out sooner instead of integrating with it now at the last minute, especially given that Json.NET has a similar set of lower-level JSON objects (JsonValue/JsonObject etc.) which will now end up being duplicated by the native System.Json stuff. It's not like we don't already have enough confusion regarding which JSON serializer to use (JavaScriptSerializer, DataContractJsonSerializer, JsonValue/JsonObject/JsonArray and now Json.NET). For years I've been using my own JSON serializer because the built-in choices are both limited. However, with an official endorsement of Json.NET I'm happily moving on to use that in my applications. Let's see and hope Microsoft gets this right before ASP.NET Web API goes gold. © Rick Strahl, West Wind Technologies, 2005-2012. Posted in Web Api, AJAX, ASP.NET

    Read the article

  • automating hudson builds with ant throwing 403

    - by Christopher Dancy
    We have a Hudson server which deploys builds. We have a few services which we want to be able to remotely tell Hudson to deploy a certain build ... these services are using ant. So I'm trying to get it working but keep getting a 403 response when giving a build number like so... <ac:post to="http://hostname:8080/hudson/job/test_release_indexes/build?" verbose="true" wantresponse="true"> <prop name="token" value="indexes"/> <prop name="BUILDNUMBER" value="0354"/> </ac:post> This throws the 403. I've also tried passing it props for the username and password like so ... <ac:post to="http://srulesre2:8080/hudson/job/test_dartmouth_indexes/build?" verbose="true" wantresponse="true"> <prop name="token" value="indexes"/> <prop name="BUILDNUMBER" value="0354"/> <prop name="username" value="test"/> <prop name="password" value="test"/> </ac:post> I've tried a hundred different variations on username and password ... like j_username and j_password or user and pass ... but nothing is working ... I keep getting the same 403. And the username and password are valid because I can manually log in with admin privileges. Any ideas would be great.
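
    A 403 here is usually Hudson's security layer rejecting the request itself rather than the parameters: the credentials normally have to travel as preemptive HTTP Basic auth (username plus that user's API token), not as form fields named username/password. A hedged sketch of the same trigger call showing where the credentials go - the host, job, token and parameter values are copied from the <ac:post> above, the API token is a placeholder, and buildWithParameters may be plain /build on older Hudson versions:

      import java.io.OutputStream;
      import java.net.HttpURLConnection;
      import java.net.URL;
      import java.util.Base64;

      public class TriggerHudsonBuild {
          public static void main(String[] args) throws Exception {
              URL url = new URL("http://hostname:8080/hudson/job/test_release_indexes/buildWithParameters");
              HttpURLConnection conn = (HttpURLConnection) url.openConnection();
              conn.setRequestMethod("POST");

              // Preemptive Basic auth: Hudson will not always send a 401 challenge first.
              String credentials = "test:REPLACE_WITH_API_TOKEN";
              conn.setRequestProperty("Authorization",
                      "Basic " + Base64.getEncoder().encodeToString(credentials.getBytes("UTF-8")));
              conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

              // Same values the ant <ac:post> task was sending.
              String body = "token=indexes&BUILDNUMBER=0354";
              conn.setDoOutput(true);
              OutputStream out = conn.getOutputStream();
              out.write(body.getBytes("UTF-8"));
              out.close();

              System.out.println("Response: HTTP " + conn.getResponseCode());
          }
      }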

    Read the article

  • google spreadsheet api from android: google-api-java-client or handmade?

    - by yetanothercoderu
    For working with the Google Spreadsheet API from Android (2.2), Google suggests using google-api-java-client for Android. For that you have to include 5 jars in your Android application: guava-r09.jar google-http-client-extensions-android2-1.6.0-beta.jar google-api-client-extensions-android2-1.6.0-beta.jar google-http-client-1.6.0-beta.jar google-api-client-1.6.0-beta.jar and dig into the google-api-java-client javadocs for a fast-changing API. Is it worth the effort, in terms of Android specifics and device fragmentation? Isn't it reasonable to write your own simple HTTP response parser, or take a small existing library like google-spreadsheet-lib-android? Thanks! UPD: chose google-api-java-client in the end, as it has all the routine stuff (like parsing HTTP and XML) out of the box.
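
    For comparison, the handmade route is fairly small if the sheet only needs to be read and is published: fetch the feed and walk it with the XML pull parser that ships with Android. A rough sketch - the feed URL shape is only indicative, and authenticated access would add noticeably more work, which is where the client library starts to pay off:

      import java.io.InputStream;
      import java.net.URL;
      import org.xmlpull.v1.XmlPullParser;
      import org.xmlpull.v1.XmlPullParserFactory;

      public class SheetFeedReader {
          // Indicative URL only: substitute the actual list/values feed for your spreadsheet key.
          private static final String FEED_URL =
                  "https://spreadsheets.google.com/feeds/list/YOUR_SPREADSHEET_KEY/od6/public/values";

          public static void dumpColumns() throws Exception {
              InputStream in = new URL(FEED_URL).openStream();
              try {
                  XmlPullParser parser = XmlPullParserFactory.newInstance().newPullParser();
                  parser.setInput(in, "UTF-8");
                  for (int event = parser.getEventType();
                           event != XmlPullParser.END_DOCUMENT;
                           event = parser.next()) {
                      // With namespace processing off, column elements show up as gsx:<column-name>.
                      if (event == XmlPullParser.START_TAG && parser.getName().startsWith("gsx:")) {
                          System.out.println(parser.getName() + " = " + parser.nextText());
                      }
                  }
              } finally {
                  in.close();
              }
          }
      }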

    Read the article

  • Hudson leaving open sessions

    - by James Carr
    Does anyone have any experiences with Hudson leaving sessions open to a Subversion server? We've been increasing our job list and got ~50 which poll the SCM regularly. It's been working fine but recently our SCM has started acting up by refusing handshakes, which we suspect is down to the sessions left open by Hudson. Last count there were ~400 sessions with nothing building on Hudson. At the moment the only solution we've found is restarting the Subversion service but this is becoming increasingly frequent and not a long term solution. Any experiences/ideas would be appreciated.

    Read the article

  • Set environment variable for build in hudson

    - by pbreault
    I am trying to put a maven2 project under continuous integration in Hudson. The project uses Selenium for some integration testing. Hudson is running on a headless Linux box. I am using xvfb to start an X server session for Selenium. In order to run the tests, I need to export an environment variable named DISPLAY, e.g. export DISPLAY=:99 However, I don't want to set the variable on the box since it would affect all builds. I have tried to do a shell execute using the m2 extra steps plugin, but it doesn't work since it is executed in a separate bash file, meaning that environment variables are not persisted. Is there a way to register the environment variable from Hudson?

    Read the article

  • Hudson CI project doesn't run NetBeans JUnit tests of dependent projects

    - by Liron Yahdav
    I have a set of NetBeans Java projects with dependencies between them. I added the project at the top of the dependency tree to Hudson for continuous integration. Everything works fine, except that the unit tests of dependent projects don't get run by Hudson. This is because the ant scripts that NetBeans creates have dependent projects set up to run the "jar" target and not a target that also runs the unit tests. I could add ant build steps for each dependent project in Hudson to run the unit tests, but I was hoping there's a simpler solution.

    Read the article

  • Hudson trigger builds remotely gives a forbidden 403 error

    - by Ritesh M Nayak
    I have a shell script on the same machine that Hudson is deployed on, and upon executing it, it calls wget on a Hudson build trigger URL. Since it's the same machine, I access it as http://localhost:8080/hudson/job/jobname/build?token=sometoken Typically, this is supposed to trigger a build on the project. But I get a 403 Forbidden when I do this. Does anybody have any idea why? I have tried this using a browser and it triggers the build, but via the command line it doesn't seem to work. Any ideas?

    Read the article

  • Hudson XML error-- No module named dom.minidom

    - by Arnab Sen Gupta
    I am trying to send a simple XML file of the format given in http://wiki.hudson-ci.org/display/HUDSON/Monitoring+external+jobs . I was able to send it easily and was getting the desired result! Then I tried to build this XML file using a Python script, and it was giving me exactly the file that I wanted without any problem. But when I tried to run this and send it to Hudson, I got the error "No module named dom.minidom". I checked again by executing in Python IDLE and it was working fine, but when I tried to send it again, I got the same error. Please help.

    Read the article

  • Hudson, C++ and UnitTest++

    - by Gilad Naor
    Has anyone used Hudson as a Continuous-Integration server for a C++ project using UnitTest++ as a testing library? How exactly did you set it up? I know there have been several questions on Continuous Integration before, but I hope this one has a narrower scope. EDIT: I'll clarify a bit on what I'm looking for. I already have the build set to fail when the Unit-Tests fail. I'm looking for something like Hudson's JUnit support. UnitTest++ can create XML reports (See here). So, perhaps if someone knows how to translate these reports to be JUnit compatible, Hudson will know how to eat it up?

    Read the article

  • Sending Subversion Change Log Info Via Hudson

    - by GrumpyCanuck
    I'm trying to integrate Hudson into our development process, and everything is going smoothly except for one thing. I had been using Phing to do deployments, and one of the things that was being triggered was an email to our tech support email address containing a list of all the commit messages between the last time code was deployed and the present SVN revision. I was doing something like this: read in a file from the root directory of the currently deployed application that contains the SVN revision when the app was deployed; place that value in a Phing variable; insert that value into a command to send the SVN commit messages via email; create a file in the root directory of the newly deployed application that contains the current SVN revision. I'd like to be able to add that information to the email that gets sent out by Hudson when a successful build goes out. Any pointers on how to accomplish this task in Hudson would be greatly appreciated.
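
    If the plugin route doesn't pan out and you end up reproducing the Phing step inside a build step, the revision-to-revision part is a single svn call; a small sketch of just that piece (the revision file name, repository URL and the email hand-off are the question's own convention, not anything Hudson provides):

      import java.io.BufferedReader;
      import java.io.InputStreamReader;
      import java.nio.charset.StandardCharsets;
      import java.nio.file.Files;
      import java.nio.file.Paths;

      public class ChangeLogSinceLastDeploy {
          // Reads the revision recorded at the last deploy and returns the commit
          // messages from that revision up to HEAD, ready to drop into the email body.
          public static String changesSince(String revisionFile, String repoUrl) throws Exception {
              String lastRev = new String(Files.readAllBytes(Paths.get(revisionFile)),
                      StandardCharsets.UTF_8).trim();

              Process svn = new ProcessBuilder("svn", "log", "-r", lastRev + ":HEAD", repoUrl)
                      .redirectErrorStream(true)
                      .start();

              StringBuilder log = new StringBuilder();
              BufferedReader reader = new BufferedReader(
                      new InputStreamReader(svn.getInputStream(), StandardCharsets.UTF_8));
              for (String line; (line = reader.readLine()) != null; ) {
                  log.append(line).append('\n');
              }
              svn.waitFor();
              return log.toString();
          }
      }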

    Read the article

  • Build Pipelining and Continuous Integration with Maven and Hudson

    - by Brandon
    Currently my team is considering splitting our single CI build process into a more streamlined multi-stage process, to speed up basic build feedback and isolate different CI concerns. The idea we had was to have each stage exist in Hudson as a different build with the correct Maven goal or Maven plugin execution, then chain them together using the post-build hooks of Hudson. However, to my knowledge, Maven as a build tool mandates that running any lifecycle phase automatically runs every preceding lifecycle phase. This presents a number of problems, the most significant of which is that Maven recreates the build resources with each distinct call instead of using those of the previous stage. This not only breaks the consistency of the build lifecycle but also adds unnecessary processing overhead. Is there a way to accomplish pipelining with CI using Maven? Assuming there is, is there a way to let Hudson know to use the resources built from the previous stage in the next one?

    Read the article

  • April 14th Links: ASP.NET, ASP.NET MVC, ASP.NET Web API and Visual Studio

    - by ScottGu
    Here is the latest in my link-listing blog series: ASP.NET Easily overlooked features in VS 11 Express for Web: Good post by Scott Hanselman that highlights a bunch of easily overlooked improvements that are coming to VS 11 (and specifically the free express editions) for web development: unit testing, browser chooser/launcher, IIS Express, CSS Color Picker, Image Preview in Solution Explorer and more. Get Started with ASP.NET 4.5 Web Forms: Good 5-part tutorial that walks-through building an application using ASP.NET Web Forms and highlights some of the nice improvements coming with ASP.NET 4.5. What is New in Razor V2 and What Else is New in Razor V2: Great posts by Andrew Nurse, a dev on the ASP.NET team, about some of the new improvements coming with ASP.NET Razor v2. ASP.NET MVC 4 AllowAnonymous Attribute: Nice post from David Hayden that talks about the new [AllowAnonymous] filter introduced with ASP.NET MVC 4. Introduction to the ASP.NET Web API: Great tutorial by Stephen Walher that covers how to use the new ASP.NET Web API support built-into ASP.NET 4.5 and ASP.NET MVC 4. Comprehensive List of ASP.NET Web API Tutorials and Articles: Tugberk Ugurlu links to a huge collection of articles, tutorials, and samples about the new ASP.NET Web API capability. Async Mashups using ASP.NET Web API: Nice post by Henrik on how you can use the new async language support coming with .NET 4.5 to easily and efficiently make asynchronous network requests that do not block threads within ASP.NET. ASP.NET and Front-End Web Development Visual Studio 11 and Front End Web Development - JavaScript/HTML5/CSS3: Nice post by Scott Hanselman that highlights some of the great improvements coming with VS 11 (including the free express edition) for front-end web development. HTML5 Drag/Drop and Async Multi-file Upload with ASP.NET Web API: Great post by Filip W. that demonstrates how to implement an async file drag/drop uploader using HTML5 and ASP.NET Web API. Device Emulator Guide for Mobile Development with ASP.NET: Good post from Rachel Appel that covers how to use various device emulators with ASP.NET and VS to develop cross platform mobile sites. Fixing these jQuery: A Guide to Debugging: Great presentation by Adam Sontag on debugging with JavaScript and jQuery.  Some really good tips, tricks and gotchas that can save a lot of time. ASP.NET and Open Source Getting Started with ASP.NET Web Stack Source on CodePlex: Fantastic post by Henrik (an architect on the ASP.NET team) that provides step by step instructions on how to work with the ASP.NET source code we recently open sourced. Contributing to ASP.NET Web Stack Source on CodePlex: Follow-on to the post above (also by Henrik) that walks-through how you can submit a code contribution to the ASP.NET MVC, Web API and Razor projects. Overview of the WebApiContrib project: Nice post by Pedro Reys on the new open source WebApiContrib project that has been started to deliver cool extensions and libraries for use with ASP.NET Web API. Entity Framework Entity Framework 5 Performance Improvements and Performance Considerations for EF5:  Good articles that describes some of the big performance wins coming with EF5 (which will ship with both .NET 4.5 and ASP.NET MVC 4). Automatic compilation of LINQ queries will yield some significant performance wins (up to 600% faster). 
ASP.NET MVC 4 and EF Database Migrations: Good post by David Hayden that covers the new database migrations support within EF 4.3 which allows you to easily update your database schema during development - without losing any of the data within it. Visual Studio What's New in Visual Studio 11 Unit Testing: Nice post by Peter Provost (from the VS team) that talks about some of the great improvements coming to VS11 for unit testing - including built-in VS tooling support for a broad set of unit test frameworks (including NUnit, XUnit, Jasmine, QUnit and more) Hope this helps, Scott

    Read the article

  • REST Framework - MS Web Api vs the rest of the field

    - by Mike
    I am a .NET developer who is looking into the OSS world for a REST framework similar to Microsoft's Web API. I'll be starting a personal project soon and need to develop both a web site and an API, with the API coming first. I've ruled out Ruby on Rails just because I feel that with my background in C#, I can get up to speed quickly with either a Java or PHP based framework. So far I've looked at Slim (PHP) and JAX-RS with Jersey (Java). Would I want to consider any others? My API will be private at first, with a public one on the roadmap. I'll be hosting the API on Heroku or some cloud based service.
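
    For a sense of what the Java side looks like, a JAX-RS resource (which Jersey, among other implementations, runs) is roughly this much ceremony before the framework takes over routing and JSON mapping; the resource and type names are placeholders:

      import javax.ws.rs.Consumes;
      import javax.ws.rs.GET;
      import javax.ws.rs.POST;
      import javax.ws.rs.Path;
      import javax.ws.rs.PathParam;
      import javax.ws.rs.Produces;
      import javax.ws.rs.core.MediaType;
      import javax.ws.rs.core.Response;

      // Minimal JAX-RS resource sketch; "widgets" and the Widget type are placeholders.
      @Path("/widgets")
      @Produces(MediaType.APPLICATION_JSON)
      @Consumes(MediaType.APPLICATION_JSON)
      public class WidgetResource {

          @GET
          @Path("/{id}")
          public Widget get(@PathParam("id") long id) {
              return new Widget(id, "example");            // look up from storage in a real app
          }

          @POST
          public Response create(Widget widget) {
              // persist, then return 201 (a real app would also set a Location header)
              return Response.status(Response.Status.CREATED).entity(widget).build();
          }

          public static class Widget {
              public long id;
              public String name;
              public Widget() {}                           // needed by the JSON provider
              public Widget(long id, String name) { this.id = id; this.name = name; }
          }
      }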

    Read the article

  • ASP.NET Web API and Simple Value Parameters from POSTed data

    - by Rick Strahl
    In testing out various features of Web API I've found a few oddities in the way that the serialization is handled. These are probably not super common but they may throw you for a loop. Here's what I found. Simple Parameters from XML or JSON Content Web API makes it very easy to create action methods that accept parameters that are automatically parsed from XML or JSON request bodies. For example, you can send a JavaScript JSON object to the server and Web API happily deserializes it for you. This works just fine: public string ReturnAlbumInfo(Album album) { return album.AlbumName + " (" + album.YearReleased.ToString() + ")"; } However, if you have methods that accept simple parameter types like strings, dates, numbers etc., those methods don't receive their parameters from the XML or JSON body by default and you may end up with failures. Take the following two very simple methods: public string ReturnString(string message) { return message; } public HttpResponseMessage ReturnDateTime(DateTime time) { return Request.CreateResponse<DateTime>(HttpStatusCode.OK, time); } The first one accepts a string, and if called with a JSON string from the client like this: var client = new HttpClient(); var result = client.PostAsJsonAsync<string>("http://rasxps/AspNetWebApi/albums/rpc/ReturnString", "Hello World").Result; which results in a trace like this: POST http://rasxps/AspNetWebApi/albums/rpc/ReturnString HTTP/1.1 Content-Type: application/json; charset=utf-8 Host: rasxps Content-Length: 13 Expect: 100-continue Connection: Keep-Alive "Hello World" produces… wait for it: null. Sending a date in the same fashion: var client = new HttpClient(); var result = client.PostAsJsonAsync<DateTime>("http://rasxps/AspNetWebApi/albums/rpc/ReturnDateTime", new DateTime(2012, 1, 1)).Result; results in this trace: POST http://rasxps/AspNetWebApi/albums/rpc/ReturnDateTime HTTP/1.1 Content-Type: application/json; charset=utf-8 Host: rasxps Content-Length: 30 Expect: 100-continue Connection: Keep-Alive "\/Date(1325412000000-1000)\/" (yes, still the ugly MS AJAX date, yuk! This will supposedly change by RTM with Json.NET used for client serialization) produces an error response: The parameters dictionary contains a null entry for parameter 'time' of non-nullable type 'System.DateTime' for method 'System.Net.Http.HttpResponseMessage ReturnDateTime(System.DateTime)' in 'AspNetWebApi.Controllers.AlbumApiController'. An optional parameter must be a reference type, a nullable type, or be declared as an optional parameter. Basically any simple parameters are not parsed properly, resulting in null being sent to the method. For the string the call doesn't fail, but for the non-nullable date it produces an error because the method can't handle a null value. This behavior is a bit unexpected to say the least, but there's a simple solution to make this work using an explicit [FromBody] attribute: public string ReturnString([FromBody] string message) and public HttpResponseMessage ReturnDateTime([FromBody] DateTime time) which explicitly instructs Web API to read the value from the body. UrlEncoded Form Variable Parsing Another similar issue I ran into is with POST form variable binding. Web API can retrieve parameters from the QueryString and Route Values, but it doesn't explicitly map parameters from POST values either.
    Taking our same ReturnString function from earlier and posting a message POST variable like this: var formVars = new Dictionary<string,string>(); formVars.Add("message", "Some Value"); var content = new FormUrlEncodedContent(formVars); var client = new HttpClient(); var result = client.PostAsync("http://rasxps/AspNetWebApi/albums/rpc/ReturnString", content).Result; which produces this trace: POST http://rasxps/AspNetWebApi/albums/rpc/ReturnString HTTP/1.1 Content-Type: application/x-www-form-urlencoded Host: rasxps Content-Length: 18 Expect: 100-continue message=Some+Value When calling ReturnString: public string ReturnString(string message) { return message; } it unfortunately does not map the message value to the message parameter. This sort of mapping is simply not available in Web API. Web API does support binding to form variables, but only as part of model binding, which binds object properties to the POST variables. Sending the same message as in the previous example, you can use the following code to pick up POST variable data: public string ReturnMessageModel(MessageModel model) { return model.Message; } public class MessageModel { public string Message { get; set; } } Note that the model is bound and the message form variable is mapped to the Message property, as other variables would be mapped to properties if there were more. This works, but it's not very dynamic. There's no real easy way to retrieve form variables (or query string values for that matter) from Web API's Request object as far as I can discern. Well, only if you consider this easy: public string ReturnString() { var formData = Request.Content.ReadAsAsync<FormDataCollection>().Result; return formData.Get("message"); } Oddly, FormDataCollection does not allow indexers, so you have to use the .Get() method instead. If you're running under IIS/Cassini you can always resort to the old and trusty HttpContext access for request data: public string ReturnString() { return HttpContext.Current.Request.Form["message"]; } which works fine and is easier. It's kind of a bummer that HttpRequestMessage doesn't expose some sort of raw Request object that has access to dynamic data - given that it's meant to serve as a generic REST/HTTP API, that seems like a crucial missing piece. I don't see any way to read query string values either. To me personally HttpContext works, since I don't see myself using self-hosted code much. © Rick Strahl, West Wind Technologies, 2005-2012. Posted in Web Api

    Read the article

  • What is the best currency API out there?

    - by YouBook
    I may be asking an odd question, but Webmasters seems like a decent place to post this. I'm in search of an accurate, easy API (such as one using JSON or XML) that I can use in my web application. I've been trying Google's secret API, but its reliability isn't that good because of the string parsing (a weird JSON format where PHP sometimes returns incorrect data or truncates the string due to a parsing error). Google's API is fair but could be improved. So all I'm asking is: what is your current best currency API out there? I want one that includes a documented API so I can use it with PHP. Cheers.
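
    Whichever provider you pick, the consuming side is usually just "fetch JSON, read one number". A hedged sketch against an invented endpoint (api.example-rates.com is not a real service), shown in Java only to keep this page's examples in one language - the PHP equivalent is a file_get_contents() plus json_decode() of the same URL:

      import java.io.BufferedReader;
      import java.io.InputStreamReader;
      import java.net.URL;

      public class RateFetch {
          public static void main(String[] args) throws Exception {
              // Invented endpoint; substitute whichever provider you end up choosing.
              URL url = new URL("https://api.example-rates.com/latest?base=USD&symbols=EUR");
              BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream(), "UTF-8"));
              StringBuilder json = new StringBuilder();
              for (String line; (line = in.readLine()) != null; ) {
                  json.append(line);
              }
              in.close();
              // A real client would JSON-decode this; the exact shape depends entirely on the provider.
              System.out.println(json);
          }
      }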

    Read the article

  • Whatever happened to the Google AJAX Search API

    - by John
    I am looking to query the main Google search; however, all references, including Stack Overflow, point to the Google AJAX Search API. The odd thing is that it does not seem to exist any more - not even a note to say it is deprecated. The old links point to the main Google Code site. If I look at the list of APIs on that site, the API it replaced is there - Web Search API (Deprecated) - which links back to the same page, but not the Google AJAX Search API. Further Google searching is not being helpful either: many blog posts point to the same Google site (http://code.google.com/apis/ajaxsearch/) that has no content and redirects to the same place. Just to prove it did exist, I have found it on the Wayback Machine; however, the last snapshot did not show any special unusual message.

    Read the article

  • Build Environment setup - Using .net, java, hudson, and ruby - Could really use a critique

    - by Jeff D
    I'm trying to figure out the best way to stitch together a fast, repeatable, unbreakable build process for the following environment. I've got a plan for how to do it, but I'd really appreciate a critique. (I'd also appreciate some sample code, but more on that later) Ecosystem - Logical: Website - asp.net MVC 2, .net 3.5, Visual Studio 2010. IIS 6, Facebook iframe application. This website/facebook app uses a few services. An internal search api, an internal read/write api, facebook, and an IP geolocation service. More details on these below. Internal search api - .net, restful, built using old school .ashx handlers. The api uses lucene, and a sql server database behind the scenes. My project won't touch the lucene code, but does potentially touch the database and the web services. internal read/write api - java, restful, running on Tomcat Facebook web services A mocking site that emulates the internal read/write api, and parts of the facebook api Hudson - Runs unit tests on checkin, and creates some installers that behave inconsistently. Ecosystem - Physical: All of these machines can talk to one another, except for Hudson. Hudson can't see any of the target machines. So code must be pulled, rather than pushed. (Security thing) 1. Web Server - Holds the website, and the read/write api. (The api itself writes to a replicated sql server environment). 2. Search Server - Houses the search api. 3. Hudson Server - Does not have permissions to push to any environment. They have to pull. 4. Lucene Server 5. Database Server Problem I've been trying to set this site up to run in a stress environment, but the number of setup steps, the amount of time it takes to update a component, the black-box nature of the current installers, and the time it takes to generate data into the test system is absolutely destroying my productivity. I tweak one setting, have to redeploy, restart in a certain order, resetup some of the settings, and rebuild test data. Errors result in headscratching, and then basically starting over. Very bad. This problem is complicated further by my stress testing. I need to be able to turn on and off different external components, so that I can effectively determine the scalability of each piece. I've got strategies in place for how to do that for each dependency, but it further complicates my setup strategy, because now each component has 2 options. A mock version, or a real version. Configurations everywhere must be updated accordingly. Goals Fast - I want to drop this from a 20 minute exercise when things go perfectly, to a 3 minute one Stupid simple - I want to tell the environment what to do with as few commands as possible, and not have to remember how to stitch the environments together Repeatable - I want the script to be idempotent. Kind of a corollary to the Stupid Simple thing. The Plan So Far Here's what I've come up with so far, and what I've come looking for feedback on: Use VisualStudio's new web.config transformations to permit easily altering configs based on environment. This solution isn't really sufficient though. I will leave web.config set up to let the site run locally, but when deploying elsewhere, I have as many as 6 different possible outputs for the stress environment alone (because of the mocks of the various dependencies), let alone the settings for prod, QA, and dev. Each of these would then require its own setup, or a setup that would then post-process the configs.
    So I'm currently leaning toward just having the dev version, and a version that converts key configuration values into a ruby string interpolation syntax. (#{VAR_NAME} kinda thing) Create a ruby script for each server that is essentially a bootstrapping script. That is to say, it will do nothing but load the ruby code that does the 'real' work from hudson/subversion, so that the script's functionality can evolve with the application, making it easy to build the site at any point in time by referencing the appropriate version of the script. So in a nutshell, this script loads another script, and runs it. The 'real' ruby script will then accept commandline parameters that describe how the environment should look. From there, 1 configuration file can be used, and ruby will download the current installers, run them, post-process the configs, restart IIS/Tomcat, and kick off any data setup code that is needed. So that's it. I'm in a real time crunch to get this site stress-tested, so any feedback that you think could abbreviate the time this might take would be appreciated. That includes a shameless request for sample ruby code. I've not gotten too much further than puts "Hello World". :-) Just guidance would be helpful. Is this something that Rake would be useful for? How would you recommend I write tests for this animal? (I use interfaces and automocking frameworks to mock out things like http requests in .net. With ducktyping, it seems that this might be easier, but I don't know how to tell my code to use a fake duck in test, but a real one in practice) Thanks all. Sorry for such a long-winded, open-ended question.

    Read the article

  • Can I use google API to convert a PDF into PNGs?

    - by Ken
    I have noticed that when you view PDFs in Google Docs, the PDF viewer renders the PDF file into PNG images. I was wondering if you could use the Google Data API to upload a PDF and get the URLs of the rendered PNG files? I have never used the Google API or really had the extra time to learn it, but if it helps me do this it will be well worth the extra time.

    Read the article

  • API Auth vs User Auth

    - by user1626384
    I have read many posts and articles on this topic but still can't connect the dots. I want to make a Rails app that is strictly a JSON API, maybe using Sinatra or the rails-api gem. I also want to make both a web client app and an iPhone app which consume the API. There are no plans on letting third-party devs use it. So I could create a separate username/password combination for both the web and mobile client and use HTTP Basic over SSL. Each app would have these values as configs in the source and use them to authenticate to the API, so only these two can make a call. Anyone else trying would get a 401 error returned. This would be considered handling the API authentication. The web and mobile client apps allow end users to sign up and read/write data to the API. When each user is created, I create and save a token in their profile. If a user successfully signs in, I send back the token. On each future read/write, the client also sends along this token in the header. I get the token, look up the user in the database, and make the read/write. Does this sound like an appropriate way to handle it? For the web client, when I initially send back the token, where do I store it? In a cookie? Do I also drop a cookie to handle session state?
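
    The token flow described (a random token minted when the user record is created, handed back on sign-in, checked on every later call) is framework-neutral; a compact sketch of the two halves, with Java standing in for the Rails code and every name invented:

      import java.security.SecureRandom;
      import java.util.Map;
      import java.util.concurrent.ConcurrentHashMap;

      // Framework-neutral sketch of the token scheme described above; names are invented,
      // and the in-memory map stands in for the token column on the user profile.
      public class TokenAuth {
          private static final SecureRandom RANDOM = new SecureRandom();
          private static final Map<String, String> TOKENS_BY_USER = new ConcurrentHashMap<>();

          // Called when the user record is created; the token is returned to the client after sign-in.
          public static String issueToken(String username) {
              byte[] raw = new byte[32];
              RANDOM.nextBytes(raw);
              StringBuilder token = new StringBuilder();
              for (byte b : raw) {
                  token.append(String.format("%02x", b));
              }
              TOKENS_BY_USER.put(username, token.toString());
              return token.toString();
          }

          // Called on every API request: resolve the token from the header back to a user.
          // A real app would query by token instead of scanning; null means answer with 401.
          public static String authenticate(String tokenHeader) {
              for (Map.Entry<String, String> entry : TOKENS_BY_USER.entrySet()) {
                  if (entry.getValue().equals(tokenHeader)) {
                      return entry.getKey();
                  }
              }
              return null;
          }
      }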

    Read the article
