Search Results

Search found 24427 results on 978 pages for 'ec2 api tools'.


  • EC2 Image to start

    - by HD.
    I'm starting to test EC2 for a couple of new projects. I need to choose an AMI (Amazon Machine Image), and the first option Amazon offered me was Fedora Core 8, which is a very old release of one of my favorite distributions. There are a lot of choices, but it's not clear to me which one is the best option. I have my own reasons for picking a distro and a version when I set up a new server, but I don't know whether the same criteria apply to EC2. I know there is a beta for RHEL; how stable is it? And how do I choose between all the CentOS AMIs in the list? So this is my question: which AMI do you recommend to start with on EC2? Thanks

    Read the article

  • Google Maps API vs Multimap/Bing Maps API

    - by mdresser
    I want to know if anyone who has experience with both the Google Maps API and the Multimap API can give a good reason why one is better than the other - or maybe a list of pros and cons? I will be working on a complete redevelopment of a site which currently uses the Multimap (Classic) API and want to consider the possibility of using the Google Maps API instead of Multimap (now MS Bing), but I need a compelling reason to justify the decision. The site currently provides a search mechanism allowing users to search for addresses by postcode, partial postcode or city. The current system has a SQL Server database back-end containing full address details, which is geocoded and uploaded to Multimap by a daily scheduled task. I'm wondering if it's possible with the Google API to avoid the need for the daily upload and just use its geocoding API instead (though this is limited by Google's restriction on the number of geocoding requests per day).
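
    For what it's worth, an on-demand lookup against Google's geocoding web service looks roughly like the sketch below. This is a hedged example, not from the question: the v3 JSON endpoint and the "key" parameter are my assumptions, the address and key are placeholders, and the daily request quota still applies to this approach.

    ```bash
    # Geocode a postcode on demand instead of pre-geocoding rows with a nightly upload.
    curl -s -G "https://maps.googleapis.com/maps/api/geocode/json" \
         --data-urlencode "address=SW1A 1AA, London, UK" \
         --data-urlencode "key=YOUR_API_KEY"
    ```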

    Read the article

  • juju bootstrap fails with a local environment, why?

    - by Braiam
    Each time I try to bootstrap juju using a local environment, it fails when starting the juju-db-braiam-local job, as follows:

    $ sudo juju --debug --verbose bootstrap
    2013-10-20 02:28:53 INFO juju.provider.local environprovider.go:32 opening environment "local"
    2013-10-20 02:28:53 DEBUG juju.provider.local environ.go:210 found "10.0.3.1" as address for "lxcbr0"
    2013-10-20 02:28:53 DEBUG juju.provider.local environ.go:234 checking 10.0.3.1:8040 to see if machine agent running storage listener
    2013-10-20 02:28:53 DEBUG juju.provider.local environ.go:237 nope, start some
    2013-10-20 02:28:53 DEBUG juju.environs.tools storage.go:87 Uploading tools for [raring precise]
    2013-10-20 02:28:53 DEBUG juju.environs.tools build.go:109 looking for: juju
    2013-10-20 02:28:53 DEBUG juju.environs.tools build.go:150 checking: /usr/bin/jujud
    2013-10-20 02:28:53 INFO juju.environs.tools build.go:156 found existing jujud
    2013-10-20 02:28:53 INFO juju.environs.tools build.go:166 target: /tmp/juju-tools243949228/jujud
    2013-10-20 02:28:53 DEBUG juju.environs.tools build.go:217 forcing version to 1.14.1.1
    2013-10-20 02:28:53 DEBUG juju.environs.tools build.go:37 adding entry: &tar.Header{Name:"FORCE-VERSION", Mode:420, Uid:0, Gid:0, Size:8, ModTime:time.Time{sec:63517832933, nsec:278894120, loc:(*time.Location)(0x108fda0)}, Typeflag:0x30, Linkname:"", Uname:"ubuntu", Gname:"ubuntu", Devmajor:0, Devminor:0, AccessTime:time.Time{sec:63517832933, nsec:278894120, loc:(*time.Location)(0x108fda0)}, ChangeTime:time.Time{sec:63517832933, nsec:278894120, loc:(*time.Location)(0x108fda0)}}
    2013-10-20 02:28:53 DEBUG juju.environs.tools build.go:37 adding entry: &tar.Header{Name:"jujud", Mode:493, Uid:0, Gid:0, Size:19179512, ModTime:time.Time{sec:63517832933, nsec:274894120, loc:(*time.Location)(0x108fda0)}, Typeflag:0x30, Linkname:"", Uname:"ubuntu", Gname:"ubuntu", Devmajor:0, Devminor:0, AccessTime:time.Time{sec:63517832933, nsec:274894120, loc:(*time.Location)(0x108fda0)}, ChangeTime:time.Time{sec:63517832933, nsec:274894120, loc:(*time.Location)(0x108fda0)}}
    2013-10-20 02:28:55 INFO juju.environs.tools storage.go:106 built 1.14.1.1-raring-amd64 (4196kB)
    2013-10-20 02:28:55 INFO juju.environs.tools storage.go:112 uploading 1.14.1.1-precise-amd64
    2013-10-20 02:28:55 INFO juju.environs.tools storage.go:112 uploading 1.14.1.1-raring-amd64
    2013-10-20 02:28:55 INFO juju.environs.tools tools.go:29 reading tools with major version 1
    2013-10-20 02:28:55 INFO juju.environs.tools tools.go:34 filtering tools by version: 1.14.1.1
    2013-10-20 02:28:55 INFO juju.environs.tools tools.go:37 filtering tools by series: precise
    2013-10-20 02:28:55 DEBUG juju.environs.tools storage.go:41 reading v1.* tools
    2013-10-20 02:28:55 DEBUG juju.environs.tools storage.go:61 found 1.14.1.1-precise-amd64
    2013-10-20 02:28:55 DEBUG juju.environs.tools storage.go:61 found 1.14.1.1-raring-amd64
    2013-10-20 02:28:55 INFO juju.environs.boostrap bootstrap.go:57 bootstrapping environment "local"
    2013-10-20 02:28:55 INFO juju.environs.tools tools.go:29 reading tools with major version 1
    2013-10-20 02:28:55 INFO juju.environs.tools tools.go:34 filtering tools by version: 1.14.1.1
    2013-10-20 02:28:55 INFO juju.environs.tools tools.go:37 filtering tools by series: precise
    2013-10-20 02:28:55 DEBUG juju.environs.tools storage.go:41 reading v1.* tools
    2013-10-20 02:28:55 DEBUG juju.environs.tools storage.go:61 found 1.14.1.1-precise-amd64
    2013-10-20 02:28:55 DEBUG juju.environs.tools storage.go:61 found 1.14.1.1-raring-amd64
    2013-10-20 02:28:55 DEBUG juju.provider.local environ.go:395 create mongo journal dir: /home/braiam/.juju/local/db/journal
    2013-10-20 02:28:55 DEBUG juju.provider.local environ.go:401 generate server cert
    2013-10-20 02:28:55 INFO juju.provider.local environ.go:421 installing service juju-db-braiam-local to /etc/init
    2013-10-20 02:28:56 ERROR juju.provider.local environ.go:423 could not install mongo service: exec ["start" "juju-db-braiam-local"]: exit status 1 (start: Job failed to start)
    2013-10-20 02:28:56 ERROR juju supercommand.go:282 command failed: exec ["start" "juju-db-braiam-local"]: exit status 1 (start: Job failed to start)
    error: exec ["start" "juju-db-braiam-local"]: exit status 1 (start: Job failed to start)

    What is the reason for this error and how can I solve it?
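
    A hedged place to dig further (my suggestion, not part of the question): the failing piece is the upstart job juju installed under /etc/init to run its local MongoDB, so that job's log and a manual start usually expose the underlying mongod error.

    ```bash
    # Hypothetical debugging steps: inspect the generated upstart job and start it by hand.
    cat /etc/init/juju-db-braiam-local.conf
    sudo start juju-db-braiam-local || true
    sudo tail -n 50 /var/log/upstart/juju-db-braiam-local.log
    ```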

    Read the article

  • Can't launch glassfish on ec2 - can't open port

    - by orange80
    I'm trying to start glassfish on an EBS-based AMI of Ubuntu 10.04 64-bit. I have used glassfish on non-ec2 servers with no problems, but on ec2 I get this message:

    $ sudo -u glassfish bin/asadmin start-domain domain1
    There is a process already using the admin port 4848 -- it probably is another instance of a GlassFish server.
    Command start-domain failed.

    I know that ec2 requires firewall rules to be modified using ec2-authorize to let outside traffic through the firewall, as I had to do to make ssh work. But that still doesn't explain the port error when all I'm trying to do is start glassfish so I can try $ wget localhost:8080 and make sure it's working. This is very frustrating and I'd really appreciate any help. Thanks.

    FINAL UPDATE: Sorry if you came here looking for answers. I never figured out what was causing the problem. I created another fresh instance, installed the same stuff, and Glassfish worked perfectly. Something obviously got boned during installation, but I have no idea what. I guess it will remain a mystery.

    UPDATE: Here's what I get from netstat:

    # netstat -nuptl
    Active Internet connections (only servers)
    Proto Recv-Q Send-Q Local Address    Foreign Address  State   PID/Program name
    tcp   0      0      0.0.0.0:22       0.0.0.0:*        LISTEN  462/sshd
    tcp6  0      0      :::22            :::*             LISTEN  462/sshd
    udp   0      0      0.0.0.0:5353     0.0.0.0:*                483/avahi-daemon: r
    udp   0      0      0.0.0.0:1194     0.0.0.0:*                589/openvpn
    udp   0      0      0.0.0.0:37940    0.0.0.0:*                483/avahi-daemon: r
    udp   0      0      0.0.0.0:68       0.0.0.0:*                377/dhclient3

    UPDATE: One more thing... I know that the "net.ipv6.bindv6only" kernel option can cause problems with java networking, so I did set this:

    # sysctl -w net.ipv6.bindv6only=0

    UPDATE: I also verified that it has nothing at all to do with the port number (4848). As you can see here, when I changed the admin-listener port in domain.xml to 4949, I get a similar message:

    # sudo -u glassfish bin/asadmin start-domain domain1
    There is a process already using the admin port 4949 -- it probably is another instance of a GlassFish server.
    Command start-domain failed.

    UPDATE: Here are the contents of /etc/hosts:

    127.0.0.1 localhost
    # The following lines are desirable for IPv6 capable hosts
    ::1 ip6-localhost ip6-loopback
    fe00::0 ip6-localnet
    ff00::0 ip6-mcastprefix
    ff02::1 ip6-allnodes
    ff02::2 ip6-allrouters
    ff02::3 ip6-allhosts

    I should mention that I have another Ubuntu Lucid 10.04 64-bit slice that is NOT hosted on ec2, and set it up the exact same way with no problems whatsoever.
Also server.log doesn't offer much insight either: # cat ./server.log Nov 20, 2010 8:46:49 AM com.sun.enterprise.admin.launcher.GFLauncherLogger info INFO: JVM invocation command line: /usr/lib/jvm/java-6-sun-1.6.0.22/bin/java -cp /opt/glassfishv3/glassfish/modules/glassfish.jar -XX:+UnlockDiagnosticVMOptions -XX:MaxPermSize=192m -XX:NewRatio=2 -XX:+LogVMOutput -XX:LogFile=/opt/glassfishv3/glassfish/domains/domain1/logs/jvm.log -Xmx512m -client -javaagent:/opt/glassfishv3/glassfish/lib/monitor/btrace-agent.jar=unsafe=true,noServer=true -Dosgi.shell.telnet.maxconn=1 -Djdbc.drivers=org.apache.derby.jdbc.ClientDriver -Dfelix.fileinstall.dir=/opt/glassfishv3/glassfish/modules/autostart/ -Djavax.net.ssl.keyStore=/opt/glassfishv3/glassfish/domains/domain1/config/keystore.jks -Dosgi.shell.telnet.port=6666 -Djava.security.policy=/opt/glassfishv3/glassfish/domains/domain1/config/server.policy -Dfelix.fileinstall.poll=5000 -Dcom.sun.aas.instanceRoot=/opt/glassfishv3/glassfish/domains/domain1 -Dcom.sun.enterprise.config.config_environment_factory_class=com.sun.enterprise.config.serverbeans.AppserverConfigEnvironmentFactory -Dosgi.shell.telnet.ip=127.0.0.1 -Djava.endorsed.dirs=/opt/glassfishv3/glassfish/modules/endorsed:/opt/glassfishv3/glassfish/lib/endorsed -Dcom.sun.aas.installRoot=/opt/glassfishv3/glassfish -Djava.ext.dirs=/usr/lib/jvm/java-6-sun-1.6.0.22/lib/ext:/usr/lib/jvm/java-6-sun-1.6.0.22/jre/lib/ext:/opt/glassfishv3/glassfish/domains/domain1/lib/ext -Dfelix.fileinstall.bundles.new.start=true -Djavax.net.ssl.trustStore=/opt/glassfishv3/glassfish/domains/domain1/config/cacerts.jks -Dcom.sun.enterprise.security.httpsOutboundKeyAlias=s1as -Djava.security.auth.login.config=/opt/glassfishv3/glassfish/domains/domain1/config/login.conf -DANTLR_USE_DIRECT_CLASS_LOADING=true -Dfelix.fileinstall.debug=1 -Dorg.glassfish.web.rfc2109_cookie_names_enforced=false -Djava.library.path=/opt/glassfishv3/glassfish/lib:/usr/lib/jvm/java-6-sun-1.6.0.22/jre/lib/amd64/server:/usr/lib/jvm/java-6-sun-1.6.0.22/jre/lib/amd64:/usr/lib/jvm/java-6-sun-1.6.0.22/lib/amd64:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib com.sun.enterprise.glassfish.bootstrap.ASMain -domainname domain1 -asadmin-args start-domain,,,domain1 -instancename server -verbose false -debug false -asadmin-classpath /opt/glassfishv3/glassfish/modules/admin-cli.jar -asadmin-classname com.sun.enterprise.admin.cli.AsadminMain -upgrade false -domaindir /opt/glassfishv3/glassfish/domains/domain1 -read-stdin true
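
    One hedged way to cross-check asadmin's claim (my suggestion, not part of the original post) is to ask the OS directly whether anything owns the admin port, and to ask GlassFish itself which domains it thinks are running:

    ```bash
    # If these show nothing listening on 4848 and no running domain,
    # the "port in use" message comes from GlassFish's own probe, not a real listener.
    sudo lsof -nP -i :4848
    sudo netstat -antp | grep 4848
    bin/asadmin list-domains   # run from the GlassFish install directory
    ```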

    Read the article

  • Sharing authentication methods across API and web app

    - by Snixtor
    I'm wanting to share an authentication implementation across a web application, and web API. The web application will be ASP.NET (mostly MVC 4), the API will be mostly ASP.NET WEB API, though I anticipate it will also have a few custom modules or handlers. I want to: Share as much authentication implementation between the app and API as possible. Have the web application behave like forms authentication (attractive log-in page, logout option, redirect to / from login page when a request requires authentication / authorisation). Have API callers use something closer to standard HTTP (401 - Unauthorized, not 302 - Redirect). Provide client and server side logout mechanisms that don't require a change of password (so HTTP basic is out, since clients typically cache their credentials). The way I'm thinking of implementing this is using plain old ASP.NET forms authentication for the web application, and pushing another module into the stack (much like MADAM - Mixed Authentication Disposition ASP.NET Module). This module will look for some HTTP header (implementation specific) which indicates "caller is API". If the header "caller is API" is set, then the service will respond differently than standard ASP.NET forms authentication, it will: 401 instead of 302 on a request lacking authentication. Look for username + pass in a custom "Login" HTTP header, and return a FormsAuthentication ticket in a custom "FormsAuth" header. Look for FormsAuthentication ticket in a custom "FormsAuth" header. My question(s) are: Is there a framework for ASP.NET that already covers this scenario? Are there any glaring holes in this proposed implementation? My primary fear is a security risk that I can't see, but I'm similarly concerned that there may be something about such an implementation that will make it overly restrictive or clumsy to work with.
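
    To make the intended split concrete, here is a rough curl sketch of the API-caller flow described above. It is only an illustration: the header names ("X-Caller", "Login", "FormsAuth") and the URLs are placeholders standing in for the question's "implementation specific" custom headers, not an existing framework convention.

    ```bash
    # 1) Unauthenticated API request: expect 401 Unauthorized, not a 302 to the login page
    curl -i -H "X-Caller: api" https://example.com/api/orders

    # 2) Authenticate: credentials go in the custom "Login" header,
    #    and the forms-auth ticket comes back in a custom "FormsAuth" response header
    curl -i -H "X-Caller: api" -H "Login: alice:s3cret" https://example.com/api/login

    # 3) Subsequent requests replay the ticket in the "FormsAuth" request header
    curl -i -H "X-Caller: api" -H "FormsAuth: <ticket-from-step-2>" https://example.com/api/orders
    ```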

    Read the article

  • What Are Basic Tools For A New Project?

    - by Morgan Cheng
    For a long time, I thought that to start a new project we only need three basic tools: 1) a build system (e.g. Maven and CruiseControl), 2) a version control system (e.g. CVS, SVN or Git), and 3) a bug tracking system (e.g. Bugzilla). Yesterday, a senior colleague told me that we need at least one thing more: KPIs (Key Performance Indicators). Without KPIs, it is impossible to measure whether the project is progressing well or not. A KPI is a "soft" tool compared to Maven/SVN/Bugzilla, and since I missed that kind of soft tool, there are probably other tools I'm missing as well. So, does anybody have ideas about what other basic tools are necessary for a new project?

    Read the article

  • Free tools versus paid tools.

    - by Dennis Vroegop
    We live in a strange world. Information should be free. Tools should be free. Software should be free (and I mean free as in free beer, not as in free speech). Of course, since I make my living (and pay my mortgage) by writing software, I tend to disagree. Or rather: I want to get paid for the things I do in the daytime. Next to that I also spend time on projects I feel are valuable for the community, which I do for free. The reason I can do that is because I get paid enough in the daytime to afford that time. It gives me a good feeling, I help others and it's fun to do. But the baseline is: I get paid to write software. I am sure this goes for a lot of other developers. We get paid for what we do during the daytime and spend our free time giving back.

    So why does everyone always make a fuss when a company suddenly starts to charge for software? To me, this seems like a very reasonable decision. Companies need money: they have staff to pay, buildings to rent, coffee to buy, etc. None of this comes free, so it makes sense that they charge their customers for the things they produce. I know there's a very big open source market out there, where companies give away (parts of) their software and get revenue out of the services they provide. But this doesn't work if your product doesn't need services. If you build a great tool that is very easy to use and you give it away for free, you won't get any money by selling services that no user of your tool really needs. So what do you do? You charge money for your tool. It's either that or stop developing the tool and turn to other, more profitable projects. Like it or not, that's simple economics at work. You have something other people want, so you charge them for it.

    This week it was announced that what I believe is the most used tool for .NET developers (besides Visual Studio, of course), namely Red Gate's .NET Reflector, will stop being a free tool. They will charge you $35 for the next version. Suddenly Twitter was on fire and everyone was mad about it. But why? The tool is downloaded by so many developers that it must be valuable to them. I know of no serious .NET developer who hasn't got it on his or her machine. So apparently the tool gives them something they need. So why do they expect it to be free? There are developers out there maintaining and extending the tool, building new and better versions of it. And the price? $35 doesn't seem much. If I think of the time the tool has saved me, the 35 dollars would be earned back in a day. If by spending this amount of money I can rely on great software that helps me do my job better and faster, I have no problem spending it. I know that there is a great team behind it (the Red Gate tools are a must-have when developing SQL systems, for instance), and I do believe they are within their rights to charge for it.

    So... there you have it. This is, of course, my opinion. You may think otherwise. Please let me know in the comments what you think! Technorati tags: redgate, reflector, opensource

    Read the article

  • Using an alternate JSON Serializer in ASP.NET Web API

    - by Rick Strahl
    The new ASP.NET Web API that Microsoft released alongside MVC 4.0 Beta last week is a great framework for building REST and AJAX APIs. I've been working with it for quite a while now and I really like the way it works and the complete set of features it provides 'in the box'. It's about time that Microsoft gets a decent API for building generic HTTP endpoints into the framework. DataContractJsonSerializer sucks As nice as Web API's overall design is one thing still sucks: The built-in JSON Serialization uses the DataContractJsonSerializer which is just too limiting for many scenarios. The biggest issues I have with it are: No support for untyped values (object, dynamic, Anonymous Types) MS AJAX style Date Formatting Ugly serialization formats for types like Dictionaries To me the most serious issue is dealing with serialization of untyped objects. I have number of applications with AJAX front ends that dynamically reformat data from business objects to fit a specific message format that certain UI components require. The most common scenario I have there are IEnumerable query results from a database with fields from the result set rearranged to fit the sometimes unconventional formats required for the UI components (like jqGrid for example). Creating custom types to fit these messages seems like overkill and projections using Linq makes this much easier to code up. Alas DataContractJsonSerializer doesn't support it. Neither does DataContractSerializer for XML output for that matter. What this means is that you can't do stuff like this in Web API out of the box:public object GetAnonymousType() { return new { name = "Rick", company = "West Wind", entered= DateTime.Now }; } Basically anything that doesn't have an explicit type DataContractJsonSerializer will not let you return. FWIW, the same is true for XmlSerializer which also doesn't work with non-typed values for serialization. The example above is obviously contrived with a hardcoded object graph, but it's not uncommon to get dynamic values returned from queries that have anonymous types for their result projections. Apparently there's a good possibility that Microsoft will ship Json.NET as part of Web API RTM release.  Scott Hanselman confirmed this as a footnote in his JSON Dates post a few days ago. I've heard several other people from Microsoft confirm that Json.NET will be included and be the default JSON serializer, but no details yet in what capacity it will show up. Let's hope it ends up as the default in the box. Meanwhile this post will show you how you can use it today with the beta and get JSON that matches what you should see in the RTM version. What about JsonValue? To be fair Web API DOES include a new JsonValue/JsonObject/JsonArray type that allow you to address some of these scenarios. JsonValue is a new type in the System.Json assembly that can be used to build up an object graph based on a dictionary. It's actually a really cool implementation of a dynamic type that allows you to create an object graph and spit it out to JSON without having to create .NET type first. JsonValue can also receive a JSON string and parse it without having to actually load it into a .NET type (which is something that's been missing in the core framework). This is really useful if you get a JSON result from an arbitrary service and you don't want to explicitly create a mapping type for the data returned. 
For serialization you can create an object structure on the fly and pass it back as part of an Web API action method like this:public JsonValue GetJsonValue() { dynamic json = new JsonObject(); json.name = "Rick"; json.company = "West Wind"; json.entered = DateTime.Now; dynamic address = new JsonObject(); address.street = "32 Kaiea"; address.zip = "96779"; json.address = address; dynamic phones = new JsonArray(); json.phoneNumbers = phones; dynamic phone = new JsonObject(); phone.type = "Home"; phone.number = "808 123-1233"; phones.Add(phone); phone = new JsonObject(); phone.type = "Home"; phone.number = "808 123-1233"; phones.Add(phone); //var jsonString = json.ToString(); return json; } which produces the following output (formatted here for easier reading):{ name: "rick", company: "West Wind", entered: "2012-03-08T15:33:19.673-10:00", address: { street: "32 Kaiea", zip: "96779" }, phoneNumbers: [ { type: "Home", number: "808 123-1233" }, { type: "Mobile", number: "808 123-1234" }] } If you need to build a simple JSON type on the fly these types work great. But if you have an existing type - or worse a query result/list that's already formatted JsonValue et al. become a pain to work with. As far as I can see there's no way to just throw an object instance at JsonValue and have it convert into JsonValue dictionary. It's a manual process. Using alternate Serializers in Web API So, currently the default serializer in WebAPI is DataContractJsonSeriaizer and I don't like it. You may not either, but luckily you can swap the serializer fairly easily. If you'd rather use the JavaScriptSerializer built into System.Web.Extensions or Json.NET today, it's not too difficult to create a custom MediaTypeFormatter that uses these serializers and can replace or partially replace the native serializer. 
    Here's a MediaTypeFormatter implementation using the ASP.NET JavaScriptSerializer:

    using System;
    using System.Net.Http.Formatting;
    using System.Threading.Tasks;
    using System.Web.Script.Serialization;
    using System.Json;
    using System.IO;

    namespace Westwind.Web.WebApi
    {
        public class JavaScriptSerializerFormatter : MediaTypeFormatter
        {
            public JavaScriptSerializerFormatter()
            {
                SupportedMediaTypes.Add(new System.Net.Http.Headers.MediaTypeHeaderValue("application/json"));
            }

            protected override bool CanWriteType(Type type)
            {
                // don't serialize JsonValue structures - use the default formatter for those
                if (type == typeof(JsonValue) || type == typeof(JsonObject) || type == typeof(JsonArray))
                    return false;
                return true;
            }

            protected override bool CanReadType(Type type)
            {
                if (type == typeof(IKeyValueModel))
                    return false;
                return true;
            }

            protected override System.Threading.Tasks.Task<object> OnReadFromStreamAsync(Type type, System.IO.Stream stream,
                System.Net.Http.Headers.HttpContentHeaders contentHeaders, FormatterContext formatterContext)
            {
                var task = Task<object>.Factory.StartNew(() =>
                {
                    var ser = new JavaScriptSerializer();
                    string json;
                    using (var sr = new StreamReader(stream))
                    {
                        json = sr.ReadToEnd();
                        sr.Close();
                    }
                    object val = ser.Deserialize(json, type);
                    return val;
                });
                return task;
            }

            protected override System.Threading.Tasks.Task OnWriteToStreamAsync(Type type, object value, System.IO.Stream stream,
                System.Net.Http.Headers.HttpContentHeaders contentHeaders, FormatterContext formatterContext,
                System.Net.TransportContext transportContext)
            {
                var task = Task.Factory.StartNew(() =>
                {
                    var ser = new JavaScriptSerializer();
                    var json = ser.Serialize(value);
                    byte[] buf = System.Text.Encoding.Default.GetBytes(json);
                    stream.Write(buf, 0, buf.Length);
                    stream.Flush();
                });
                return task;
            }
        }
    }

    Formatter implementation is pretty simple: you override 4 methods to tell which types you can handle and then handle the input or output streams to create/parse the JSON data. Note that when creating output you want to take care to still allow JsonValue/JsonObject/JsonArray types to be handled by the default serializer so those objects serialize properly - if you let either JavaScriptSerializer or JSON.NET handle them they'd try to render the dictionaries, which is very undesirable.

    If you'd rather use Json.NET, here's the JSON.NET version of the formatter:

    // this code requires a reference to JSON.NET in your project
    #if true
    using System;
    using System.Net.Http.Formatting;
    using System.Threading.Tasks;
    using System.Web.Script.Serialization;
    using System.Json;
    using Newtonsoft.Json;
    using System.IO;
    using Newtonsoft.Json.Converters;

    namespace Westwind.Web.WebApi
    {
        public class JsonNetFormatter : MediaTypeFormatter
        {
            public JsonNetFormatter()
            {
                SupportedMediaTypes.Add(new System.Net.Http.Headers.MediaTypeHeaderValue("application/json"));
            }

            protected override bool CanWriteType(Type type)
            {
                // don't serialize JsonValue structures - use the default formatter for those
                if (type == typeof(JsonValue) || type == typeof(JsonObject) || type == typeof(JsonArray))
                    return false;
                return true;
            }

            protected override bool CanReadType(Type type)
            {
                if (type == typeof(IKeyValueModel))
                    return false;
                return true;
            }

            protected override System.Threading.Tasks.Task<object> OnReadFromStreamAsync(Type type, System.IO.Stream stream,
                System.Net.Http.Headers.HttpContentHeaders contentHeaders, FormatterContext formatterContext)
            {
                var task = Task<object>.Factory.StartNew(() =>
                {
                    var settings = new JsonSerializerSettings()
                    {
                        NullValueHandling = NullValueHandling.Ignore,
                    };
                    var sr = new StreamReader(stream);
                    var jreader = new JsonTextReader(sr);
                    var ser = new JsonSerializer();
                    ser.Converters.Add(new IsoDateTimeConverter());
                    object val = ser.Deserialize(jreader, type);
                    return val;
                });
                return task;
            }

            protected override System.Threading.Tasks.Task OnWriteToStreamAsync(Type type, object value, System.IO.Stream stream,
                System.Net.Http.Headers.HttpContentHeaders contentHeaders, FormatterContext formatterContext,
                System.Net.TransportContext transportContext)
            {
                var task = Task.Factory.StartNew(() =>
                {
                    var settings = new JsonSerializerSettings()
                    {
                        NullValueHandling = NullValueHandling.Ignore,
                    };
                    string json = JsonConvert.SerializeObject(value, Formatting.Indented,
                                                              new JsonConverter[1] { new IsoDateTimeConverter() });
                    byte[] buf = System.Text.Encoding.Default.GetBytes(json);
                    stream.Write(buf, 0, buf.Length);
                    stream.Flush();
                });
                return task;
            }
        }
    }
    #endif

    One advantage of the Json.NET serializer is that you can specify a few options on how things are formatted and handled. You get null value handling and you can plug in the IsoDateTimeConverter, which is nice to produce proper ISO dates that I would expect any Json serializer to output these days.

    Hooking up the Formatters

    Once you've created the custom formatters you need to enable them for your Web API application. To do this use the GlobalConfiguration.Configuration object and add the formatter to the Formatters collection. Here's what this looks like hooked up from Application_Start in a Web project:

    protected void Application_Start(object sender, EventArgs e)
    {
        // Action based routing (used for RPC calls)
        RouteTable.Routes.MapHttpRoute(
            name: "StockApi",
            routeTemplate: "stocks/{action}/{symbol}",
            defaults: new { symbol = RouteParameter.Optional, controller = "StockApi" }
        );

        // WebApi configuration to hook up formatters and message handlers - optional
        RegisterApis(GlobalConfiguration.Configuration);
    }

    public static void RegisterApis(HttpConfiguration config)
    {
        // Add JavaScriptSerializer formatter instead - add at top to make default
        //config.Formatters.Insert(0, new JavaScriptSerializerFormatter());

        // Add Json.net formatter - add at the top so it fires first!
        // This leaves the old one in place so JsonValue/JsonObject/JsonArray still are handled
        config.Formatters.Insert(0, new JsonNetFormatter());
    }

    One thing to remember here is the GlobalConfiguration object, which is Web API's static configuration instance. I think this thing is seriously misnamed given that GlobalConfiguration could stand for anything, and so is hard to discover if you don't know what you're looking for. How about WebApiConfiguration or something more descriptive? Anyway, once you know what it is you can use the Formatters collection to insert your custom formatter. Note that I insert my formatter at the top of the list so it takes precedence over the default formatter. I also am not removing the old formatter because I still want JsonValue/JsonObject/JsonArray to be handled by the default serialization mechanism. Since they process in sequence and I exclude processing for these types, JsonValue et al. still get properly serialized/deserialized.

    Summary

    Currently DataContractJsonSerializer in Web API is a pain, but at least we have the ability with relatively limited effort to replace the MediaTypeFormatter and plug in our own JSON serializer. This is useful for many scenarios - if you have existing client applications that used MVC JsonResult or ASP.NET AJAX results from ASMX AJAX services, you can plug in the JavaScript serializer and get exactly the same serializer you used in the past, so your results will be the same and won't potentially break clients. JSON serializers do vary a bit in how they serialize some of the more complex types (like Dictionaries and dates, for example), so if you're migrating it might be helpful to ensure your client code doesn't break when you switch to ASP.NET Web API. Going forward it looks like Microsoft is planning on plugging Json.Net into Web API and making that the default. I think that's an awesome choice since Json.net has been around forever, is fast and easy to use, and provides a ton of functionality as part of this great library. I just wish Microsoft would have figured this out sooner instead of integrating with it now at the last minute, especially given that Json.Net has a similar set of lower-level JSON objects (JsonValue/JsonObject etc.) which now will end up being duplicated by the native System.Json stuff. It's not like we don't already have enough confusion regarding which JSON serializer to use (JavaScriptSerializer, DataContractJsonSerializer, JsonValue/JsonObject/JsonArray and now Json.net). For years I've been using my own JSON serializer because the built-in choices are so limited. However, with an official endorsement of Json.Net I'm happily moving on to use that in my applications. Let's see and hope Microsoft gets this right before ASP.NET Web API goes gold.

    © Rick Strahl, West Wind Technologies, 2005-2012. Posted in Web Api, AJAX, ASP.NET

    Read the article

  • rsync to EC2 using ssh -i

    - by isomorphismes
    I'm able to ssh -i mykey.pem to EC2. I'm able to scp -i mykey.pem to EC2. But when I try rsync -avz -e "ssh -i mykey.pem" I get this error:

    Warning: Identity file mykey.pem not accessible: No such file or directory.
    Permission denied (publickey).
    rsync: connection unexpectedly closed (0 bytes received so far) [sender]
    rsync error: unexplained error (code 255) at io.c(605) [sender=3.0.9]

    Any suggestions about what I've done wrong?
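
    The warning means ssh can't find mykey.pem at that relative path from the directory rsync was started in. A hedged sketch of the usual fix (the key path, source directory and hostname below are placeholders):

    ```bash
    # Use an absolute path to the key and make sure its permissions are strict.
    chmod 600 ~/.ssh/mykey.pem
    rsync -avz -e "ssh -i $HOME/.ssh/mykey.pem" \
        ./localdir/ ubuntu@ec2-xx-xx-xx-xx.compute-1.amazonaws.com:/home/ubuntu/remotedir/
    ```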

    Read the article

  • Converting an EC2 AMI to vmdk image

    - by Reed G. Law
    I've come quite close to getting Amazon Linux to boot inside VirtualBox, thanks to this answer and these websites. A quick overview of the steps I've taken:

    1) Launch an EC2 instance with the Amazon Linux 2011.09 64-bit AMI.
    2) dd the contents of the EBS volume over ssh to a local image file.
    3) Mount the image file as a loopback device and then to a local mount point.
    4) Create a new empty disk image file, partition it with an offset for a bootloader, and create an ext4 filesystem.
    5) Mount the new image's partition and copy everything over from the EC2 image.
    6) Install grub (using Ubuntu's grub-legacy-ec2 package, not grub2).
    7) Convert the image file to vmdk using qemu-img.
    8) Create a new VirtualBox VM with the vmdk.

    Now the VM boots, grub loads, and the kernel is found. But it fails when it tries to mount the root device:

    dracut Warning: No root device "block:/dev/xvda1" found
    dracut Warning: Boot has failed. To debug this issue add "rdshell" to the kernel command line.
    dracut Warning: Signal caught!
    dracut Warning: Boot has failed. To debug this issue add "rdshell" to the kernel command line.
    Kernel panic - not syncing: Attempted to kill init!
    Pid: 1, comm: init Not tainted 2.6.35.14-107.1.39.amzn1.x86_64 #1

    I have tried changing /boot/grub/menu.lst to find the root device by label and UUID, but nothing works. I'm guessing the xen kernel is not compatible with VirtualBox. The reasoning behind all this effort is to make a Vagrant box that is as close as possible to the production environment, so deploys can be tested locally. I know it's cheap to do test runs on EC2, but poor connectivity often ruins the experience. Plus it would be really nice to have a virtual machine with the production environment so that co-workers don't have to install everything under the sun just to get up and running with app development. If I were to try running a different kernel, what kernel could I get that is as close as possible to Amazon Linux 2011.09?
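
    For reference, the copy and conversion steps above boil down to something like this hedged sketch. The key, hostname, device name and file names are placeholders, and the EBS root device may differ on a given instance:

    ```bash
    # Step 2: stream the raw EBS root volume to a local file over ssh
    ssh -i mykey.pem ec2-user@ec2-xx-xx-xx-xx.compute-1.amazonaws.com \
        "sudo dd if=/dev/xvda1 bs=1M | gzip -c" | gunzip -c > amazon-linux-ebs.img

    # Step 7: convert the re-partitioned raw image to vmdk for VirtualBox
    qemu-img convert -f raw -O vmdk amazon-linux-new.img amazon-linux.vmdk
    ```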

    Read the article

  • Mounting an Amazon EC2 instance on MAC OS X

    - by hinghoo
    What are some methods for transferring files to and from Amazon EC2 instances? I'm looking for solutions/tools for editing files as well as copying files to EC2 instances from both Mac and Windows. For example, what are some options for mounting a drive from an instance locally? Generally, what other methods are out there?
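
    One hedged example of the "mount a drive locally" approach is sshfs over the instance's ssh key. On OS X this assumes a FUSE implementation such as OSXFUSE/macFUSE is installed; the user, key path and hostname below are placeholders:

    ```bash
    # Mount the instance's home directory locally, edit files in place, then unmount.
    mkdir -p ~/ec2-home
    sshfs -o IdentityFile=~/.ssh/mykey.pem \
        ubuntu@ec2-xx-xx-xx-xx.compute-1.amazonaws.com:/home/ubuntu ~/ec2-home
    # ...work on the files...
    umount ~/ec2-home
    ```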

    Read the article

  • FTP Load Balancer on EC2

    - by inakiabt
    I need an EC2 instance to balance all incoming FTP connections across a list of FTP servers (EC2 instances too). This list will change dynamically based on the load of the FTP servers (launch a new FTP server when the existing ones are overloaded, or shut one down when the load is low). What do you recommend? An FTP proxy? A DNS server? A load balancer? Note: the FTP servers must support passive mode.

    Read the article

  • Upgrade Ubuntu linux-virtual on EC2

    - by James Ward
    I have an Ubuntu 10.10 EBS-boot server on EC2. There are new updates available for it: linux-image-virtual and linux-virtual. Is it even OK to upgrade those packages on an EC2 server? When I do try to upgrade them I get:

    The following packages have been kept back:
      linux-image-virtual linux-virtual

    Should I do a dist-upgrade or something to force it? Will my instance still be able to reboot?
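
    A hedged sketch of the usual way past "kept back" packages: they are held because the upgrade needs to pull in a new (kernel) package, so either install them explicitly or let dist-upgrade resolve it. Taking a snapshot or AMI of the instance first is a sensible precaution.

    ```bash
    sudo apt-get update
    # Installing the held packages explicitly lets apt bring in the new dependencies...
    sudo apt-get install linux-image-virtual linux-virtual
    # ...or let dist-upgrade resolve them, then reboot into the new kernel.
    sudo apt-get dist-upgrade
    sudo reboot
    ```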

    Read the article

  • Test Amazon Ec2 instance Small

    - by user102130
    I have some questions about Amazon EC2, especially about the small instance type. I want to host the beta version of my new website on this kind of instance, but first I want to know roughly how many simultaneous users one small instance can handle. You can see the characteristics of a small instance here: http://aws.amazon.com/ec2/instance-types/ My website is a kind of social network written in PHP. Has anyone already tested this type of instance?
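
    The honest answer is usually "measure it", since the limit depends on the PHP app, caching and the database far more than on the instance type. A hedged sketch of a quick load test against the beta (the URL and the numbers are placeholders):

    ```bash
    # ApacheBench: 1000 requests, 50 concurrent, against a representative page
    ab -n 1000 -c 50 http://beta.example.com/index.php

    # siege: keep 50 simulated users busy for two minutes
    siege -c 50 -t 2M http://beta.example.com/
    ```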

    Read the article

  • Add drupal modules on ec2 server

    - by CQM
    How do I add external Drupal modules to an EC2 server? The Drupal interface wants me to provide an FTP password, but Amazon EC2 uses a private key pair rather than a username/password (unless I enable that, which I don't want to). How would I install a module from a site like this: http://drupal.org/project/restws ? If the automated way is not feasible, do I just have to upload the individual module files to a particular Drupal folder via sftp?
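
    Yes, copying the module over ssh/scp (or using drush on the server, if it is installed) sidesteps the FTP prompt entirely. A hedged sketch; the docroot path, tarball name, key and hostname are placeholders for your setup:

    ```bash
    # Copy the module tarball up and unpack it into sites/all/modules
    scp -i mykey.pem restws-7.x-x.x.tar.gz ubuntu@ec2-host:/tmp/
    ssh -i mykey.pem ubuntu@ec2-host \
        "cd /var/www/html/sites/all/modules && sudo tar xzf /tmp/restws-7.x-x.x.tar.gz"

    # Or, with drush installed on the server:
    ssh -i mykey.pem ubuntu@ec2-host "cd /var/www/html && drush dl restws && drush en -y restws"
    ```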

    Read the article

  • Autoscaling EC2 with NFS mounts

    - by Jamie Taylor
    I'm trying to set up a shared filesystem on EC2 and I've read tutorials such as this: http://blog.ronaldmccollam.com/2012/07/configuring-nfs-on-ubuntu-in-amazon-ec2.html In step 2 it talks about configuring the exports; for this I need an IP range, but when I'm auto-scaling I can't predict what the IPs will be before it scales out. Is there any other way of doing this while still staying secure? Thanks. Edit: I just tried s3fs, but it didn't seem to work properly.
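
    One hedged workaround (my suggestion, not from the tutorial): export to the whole VPC or subnet CIDR instead of individual instance IPs, and rely on a security group to restrict which instances can actually reach the NFS port. The export path and CIDR below are placeholders:

    ```bash
    # /etc/exports on the NFS server: allow the subnet the autoscaled instances launch into
    echo "/srv/shared 10.0.0.0/16(rw,sync,no_subtree_check)" | sudo tee -a /etc/exports
    sudo exportfs -ra   # re-read /etc/exports
    ```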

    Read the article

  • How to monitor an Amazon EC2 instance

    - by sektor
    Is there a way to monitor EC2 instances without using CloudWatch? I ask because EC2 instances are basically VPSs, and the output of commands like top, vmstat and htop in scripts may not give a clear picture, since CPU cycles are shared with other instances on the same host. What should one keep in mind while monitoring CPU usage on a VPS? Should alerts be based on load average, or on the percentage of CPU used by user processes combined with other factors like processes waiting on disk I/O and hardware interrupts?
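
    For the shared-CPU concern specifically, the steal-time column is the useful signal: it shows cycles the hypervisor handed to other guests rather than to your instance. A short sketch of what to watch:

    ```bash
    vmstat 5 3              # the "st" column is CPU time stolen by the hypervisor
    top -b -n 1 | head -5   # "%st" appears in the Cpu(s) summary line
    uptime                  # compare load average against the number of vCPUs
    ```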

    Read the article

  • use Amazon EC2 tools in Capistrano to get the servers to push the code

    - by APZ
    I am trying to use the EC2 tools to get all the machines with a particular tag into some kind of array in the /config/deploy/prod.rb file in Capistrano. Something like this (untested pseudocode) in prod.rb:

    workers-array[]=$(ec2-describe-instances -F vpc-id=1234 -F tag:Env=prod -F tag:SystemType=worker)
    for(i=0;i<workers-array.len;i++){
        role :worker-A, workers-array[i]
    }

    I am not sure how to do this in Capistrano, and I'm new to Ruby too. Any help on this would be really appreciated.
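
    On the EC2-tools side, the tag filtering itself looks roughly like the sketch below. The VPC id and tag values come from the question; the state filter and the INSTANCE-line parsing are my assumptions (the column layout varies between tool versions, so check it on yours). The hostnames it prints are what would be fed to Capistrano's role calls.

    ```bash
    # List running instances matching the tags; the INSTANCE lines carry the DNS names.
    ec2-describe-instances \
        -F "vpc-id=1234" \
        -F "tag:Env=prod" \
        -F "tag:SystemType=worker" \
        -F "instance-state-name=running" | grep '^INSTANCE'
    ```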

    Read the article

  • I can't ping to my EC2 instance although icmp has been set

    - by user79356
    I wasn't able to ping my EC2 server although I've run the following command:

    ec2-authorize default -P icmp -t -1:-1 -s 0.0.0.0/0

    When I try it again it gives me:

    Client.InvalidPermission.Duplicate: The permission '0.0.0.0/0-3--1--1' has already been authorized on the specified group

    And when I ping from my laptop I get:

    PING 54.251.103.225 (54.251.103.225): 56 data bytes
    Request timeout for icmp_seq 0
    Request timeout for icmp_seq 1
    Request timeout for icmp_seq 2
    Request timeout for icmp_seq 3

    Any idea what to try next?
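
    A hedged next step (my suggestion): confirm the rule actually landed on the security group the instance uses, since it is often a group other than "default", and remember an OS-level firewall or a VPC network ACL on the instance side can also drop ICMP.

    ```bash
    # Show the rules on the "default" group, then check which group(s) the instance belongs to
    # (group names appear on the RESERVATION lines of the classic output; verify on your tool version).
    ec2-describe-group default
    ec2-describe-instances | grep -E '^(RESERVATION|INSTANCE)'
    ```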

    Read the article

  • Amazon EC2 + SSL Godaddy are not routing properly in HTTPS

    - by azngunit81
    I have an Amazon EC2 instance with an SSL certificate just purchased from GoDaddy. I have successfully managed to install it and get the green https on the main domain, https://www.example.com. However, it doesn't work for any route like https://www.example.com/something, even though the same route works under http://www.example.com. I am using an .htaccess file for some rewriting:

    Options -MultiViews
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^ index.php [L]

    The EC2 instance is Ubuntu, if that helps in any way.
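
    A hedged debugging sketch (my suggestion, not from the question): on Ubuntu/Apache a common cause is that the *:443 virtual host doesn't carry the same DocumentRoot, AllowOverride or mod_rewrite settings as the port-80 one, so the .htaccess rules never run for HTTPS requests.

    ```bash
    apachectl -S                                          # list vhosts; check a *:443 entry exists for the domain
    grep -R "AllowOverride" /etc/apache2/sites-enabled/   # the SSL vhost needs AllowOverride All too
    sudo a2enmod rewrite && sudo service apache2 restart
    ```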

    Read the article

  • EC2 Instance of Wordpress not mapping URLs correctly

    - by Benjamin
    I'm using an AWS EC2 micro instance to run a WordPress blog. I've successfully mapped a subdomain to the Elastic IP for the micro instance. After a few minor changes, the URL I mapped to the Elastic IP (blog.example.com) opens up the WordPress home page, but whenever I click on any of the WordPress links the domain changes to the AWS public DNS for that instance (http://ec2-123-45-678-910.compute-1.amazonaws.com/wordpress/). How do I fix the URLs so that they all follow the subdomain I have set up?
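
    WordPress builds its links from the "siteurl" and "home" options, which were presumably captured as the EC2 public DNS during installation. A hedged sketch of pointing them at the subdomain instead; the database name, credentials and table prefix are placeholders, and defining WP_HOME/WP_SITEURL in wp-config.php is an alternative:

    ```bash
    # Append /wordpress to the URL if the site lives in that subdirectory.
    mysql -u wpuser -p wordpress -e \
      "UPDATE wp_options SET option_value='http://blog.example.com' WHERE option_name IN ('siteurl','home');"
    ```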

    Read the article

  • Essential management tools for a small/medium software development shop

    - by mikera
    I've recently started work with an organisation that is rapidly expanding and is recruiting or growing several development teams (including two web-based products and a data warehouse/BI team). They are basically working to agile methodologies but haven't formalised a standard way of working yet. Despite the fact that it is early days, I've been surprised by the lack of tools being used to manage the development processes (e.g. no issue tracker, no tool to manage the product backlog etc.) Although it's not my primary responsibility, I'd like to help them out with some recommendations on the most important tools they should get in place. What are the 3-5 top priority tools to establish for management of a good development shop? Why are they necessary? How do they improve the software development process, and how do I justify them to my bosses?

    Read the article

  • Interesting/Innovative Open Source tools for indie games

    - by Gastón
    Just out of curiosity, I want to know opensource tools or projects that can add some interesting features to indie games, preferably those that could only be found on big-budget games. EDIT: As suggested by The Communist Duck and Joe Wreschnig, I'm putting the examples as answers. EDIT 2: Please do not post tools like PyGame, Inkscape, Gimp, Audacity, Slick2D, Phys2D, Blender (except for interesting plugins) and the like. I know they are great tools/libraries and some would argue essential to develop good games, but I'm looking for more rare projects. Could be something really specific or niche, like generating realistic trees and plants, or realistic AI for animals.

    Read the article

  • Using model tools as map editor

    - by cooky451
    I want to make a game which would require a 3D map editor. Of course, I would like to avoid creating such an editor. My idea is now to use modeling tools (3DS Max, Maya, Blender) to create the map, and to give game specific objects specified names. This way I'd just need to write an COLLADA - native map format converter. But I'm not sure if this is possible the way I imagine it, that's why I'd like to hear your thoughts on the matter. Are modeling tools suitable to create big open world maps? Can this "naming convention"-idea for game specific objects work? Are the modeling tools able to export a scene in chunks / in a way that occlusion culling and collision detection can be properly done? If not: Is there a way to build a suitable data structure from the exported data?

    Read the article

