Search Results

Search found 56255 results on 2251 pages for 'json net'.


  • How to convert datatable to json string using json.net?

    - by Pandiya Chendur
    How do I convert a DataTable to a JSON string using Json.NET? Any suggestions? I've downloaded the necessary binaries. Which class should I use to get my DataTable converted to JSON? So far I've built the JSON string by hand, passing my DataTable to this method:

        public string GetJSONString(DataTable table)
        {
            StringBuilder headStrBuilder = new StringBuilder(table.Columns.Count * 5); // pre-allocate some space, default is 16 bytes
            for (int i = 0; i < table.Columns.Count; i++)
            {
                headStrBuilder.AppendFormat("\"{0}\" : \"{0}{1}¾\",", table.Columns[i].Caption, i);
            }
            headStrBuilder.Remove(headStrBuilder.Length - 1, 1); // trim away last ,

            StringBuilder sb = new StringBuilder(table.Rows.Count * 5); // pre-allocate some space
            sb.Append("{\"");
            sb.Append(table.TableName);
            sb.Append("\" : [");
            for (int i = 0; i < table.Rows.Count; i++)
            {
                string tempStr = headStrBuilder.ToString();
                sb.Append("{");
                for (int j = 0; j < table.Columns.Count; j++)
                {
                    table.Rows[i][j] = table.Rows[i][j].ToString().Replace("'", "");
                    tempStr = tempStr.Replace(table.Columns[j] + j.ToString() + "¾", table.Rows[i][j].ToString());
                }
                sb.Append(tempStr + "},");
            }
            sb.Remove(sb.Length - 1, 1); // trim last ,
            sb.Append("]}");
            return sb.ToString();
        }

    Now I would like to use Json.NET instead, but I don't know where to get started.
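    For reference, Json.NET can serialize a DataTable directly through JsonConvert (it includes a built-in DataTable converter), so none of the hand-rolled string building above is needed. A minimal sketch, assuming a reference to the Newtonsoft.Json assembly:

        using System.Data;
        using Newtonsoft.Json;

        public static class DataTableJson
        {
            // Serializes a DataTable to a JSON array of row objects,
            // roughly [{"Col1":"a","Col2":1}, {"Col1":"b","Col2":2}, ...]
            public static string ToJson(DataTable table)
            {
                return JsonConvert.SerializeObject(table, Formatting.Indented);
            }
        }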

    Read the article

  • Decode sparse json array to php array

    - by Isaac Sutherland
    I can create a sparse PHP array (or map) using the command: $myarray = array(10=>'hi','test20'=>'howdy'); I want to serialize/deserialize this as JSON. I can serialize it using the command: $json = json_encode($myarray); which results in the string {"10":"hi","test20":"howdy"}. However, when I deserialize this and cast it to an array using the command: $mynewarray = (array)json_decode($json); I seem to lose any mappings whose keys were not valid PHP identifiers. That is, $mynewarray has the mapping 'test20'=>'howdy', but not 10=>'hi' nor '10'=>'hi'. Is there a way to preserve the numerical keys in a PHP map when converting to and back from JSON using the standard json_encode / json_decode functions?

    Read the article

  • Serializing Python bytestrings to JSON, preserving ordinal character values

    - by Doctor J
    I have some binary data produced as base-256 bytestrings in Python (2.x). I need to read these into JavaScript, preserving the ordinal value of each byte (char) in the string. If you'll allow me to mix languages, I want to encode a string s in Python such that ord(s[i]) == s.charCodeAt(i) after I've read it back into JavaScript. The cleanest way to do this seems to be to serialize my Python strings to JSON. However, json.dump doesn't like my bytestrings, despite fiddling with the ensure_ascii and encoding parameters. Is there a way to encode bytestrings to Unicode strings that preserves ordinal character values? Otherwise I think I need to encode the characters above the ASCII range into JSON-style \u1234 escapes; but a codec like this does not seem to be among Python's codecs. Is there an easy way to serialize Python bytestrings to JSON, preserving char values, or do I need to write my own encoder?

    Read the article

  • Receive JSON payload with ZEND framework and / or PHP

    - by kent3800
    I'm receiving a JSON payload from a web service at my site's internal page /asset/setjob. The following is the JSON payload being posted to /asset/setjob: [{"job": {"source_filename": "beer-drinking-pig.mpg", "current_step": "waiting_for_file", "encoding_profile_id": "nil", "resolution": "nil", "status_url": "http://example.com/api/v1/jobs/1.json", "id": 1, "bitrate": "nil", "current_status": "waiting for file", "current_progress": "nil", "remote_id": "my-own-remote-id"}}] This payload posts one time to this page. The page is not meant for viewing but for parsing the JSON object for the id and current_status so that I can insert them into a database. I'm using the Zend Framework. How do I receive this payload in Zend? Do I use $_GET['json']? $_POST['job']? None of these seem to work. I essentially need to assign this payload to a PHP variable so that I can then manipulate it. I've tried: $jsonStrGet = var_dump($_GET); $jsonStrPost = var_dump($_POST); And I've tried: $response = $this->getResponse(); $body = $response->getBody(); Any help would be much appreciated! Thanks.

    Read the article

  • JSON Returning Null in PHP

    - by kira423
    Here are the two scripts I have. Script 1:

        if(sha1($json+$secret) == $_POST['signature']) {
            $conversion_id = md5(($obj['amount']));
            echo "OK";
            echo $conversion_id;
            mysql_query("INSERT INTO completed (`id`,`uid`,`completedid`) VALUES ('','".$obj['uid']."','".$conversion_id."')");
        } else {
        }
        ?>

    Script 2:

        <?
        $json = $_POST['payload'];
        $secret = "78f12668216b562a79d46b170dc59f695070e532";
        $obj = json_decode($json);
        if(sha1($json+$secret) == $_POST['signature']) {
            print "OK";
        } else {
        }
        ?>

    The problem here is that it is returning all NULL values. I am not an expert with JSON, so I have no idea what is going on here. I really have no way of testing it because the information is coming from an outside website sending data such as this: { payload: { uid: "900af657a65e", amount: 50, adjusted_amount: 25 }, signature: "4dd0f5da77ecaf88628967bbd91d9506" } The site allows me to test the script, but because json_decode is producing NULL values it will not get through the signature block. Is there a way I can test it myself? Or is there a simple error in this script that I may have just overlooked?

    Read the article

  • Create JSON From C# using json Library

    Create JSON from C# using the JSON library available on CodePlex.
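    A minimal sketch of building a JSON document from C# with Json.NET's LINQ-to-JSON types (JObject, JProperty, JArray); the property names and values below are made up for illustration:

        using System;
        using Newtonsoft.Json.Linq;

        class Program
        {
            static void Main()
            {
                // Build a JSON object in memory, then write it out as an indented string.
                var person = new JObject(
                    new JProperty("Name", "Sample User"),            // illustrative values only
                    new JProperty("Age", 29),
                    new JProperty("Tags", new JArray("json", "net")));

                Console.WriteLine(person.ToString()); // ToString() pretty-prints by default
            }
        }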

    Read the article

  • Daily tech links for .net and related technologies - Apr 8-10, 2010

    - by SanjeevAgarwal
    Daily tech links for .net and related technologies - Apr 8-10, 2010 Web Development Using RIA DomainServices with ASP.NET and MVC 2 - geekswithblogs Using AntiXss As The Default Encoder For ASP.NET - Phil Haack New Syntax for HTML Encoding Output in ASP.NET 4 (and ASP.NET MVC 2) - Scott Gu Multi-Step Processing in ASP.NET - Dave M. Bush MvcContrib - Portable Area – Visual Studio project template - erichexter Encoding/Decoding URIs and HTML in the .NET 4 Client Profile - Pete Brown Jon Takes Five...(read more)

    Read the article

  • consume a .net webservice using jQuery

    - by Babunareshnarra
    This implementation shows one way to consume a web service using jQuery. Client-side AJAX with an HTTP POST request makes a significant difference to loading speed and responsiveness. The following service returns a string containing JSON.

        [WebMethod]
        [ScriptMethod(ResponseFormat = ResponseFormat.Json)]
        public string getData(string marks)
        {
            DataTable dt = retrieveDataTable("table", @"SELECT * FROM TABLE WHERE MARKS='" + marks.ToString() + "' ");
            List<object> RowList = new List<object>();
            foreach (DataRow dr in dt.Rows)
            {
                Dictionary<object, object> ColList = new Dictionary<object, object>();
                foreach (DataColumn dc in dt.Columns)
                {
                    ColList.Add(dc.ColumnName, (string.Empty == dr[dc].ToString()) ? null : dr[dc]);
                }
                RowList.Add(ColList);
            }
            JavaScriptSerializer js = new JavaScriptSerializer();
            string JSON = js.Serialize(RowList);
            return JSON;
        }

    Consuming the web service:

        $.ajax({
            type: "POST",
            data: '{ "marks": "' + val + '"}', // required when passing parameters
            contentType: "application/json",
            dataType: "json",
            url: "/dataservice.asmx/getData",
            success: function (response) {
                RES = JSON.parse(response.d);
                var obj = JSON.stringify(RES);
            },
            error: function (msg) {
                alert('failure');
            }
        });

    Remember to reference the jQuery library on the page.
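    The inline SQL above concatenates user input straight into the query, which leaves it open to SQL injection. A hedged sketch of the same lookup using a parameterized SqlCommand instead (the connection string, table and column names are placeholders, not from the original post):

        using System.Data;
        using System.Data.SqlClient;

        static class MarksData
        {
            static DataTable RetrieveMarks(string marks)
            {
                var dt = new DataTable();
                using (var conn = new SqlConnection("YourConnectionString"))                       // placeholder
                using (var cmd = new SqlCommand("SELECT * FROM Results WHERE MARKS = @marks", conn))
                {
                    cmd.Parameters.AddWithValue("@marks", marks);  // parameter, no string concatenation
                    using (var da = new SqlDataAdapter(cmd))
                    {
                        da.Fill(dt);                               // the adapter opens/closes the connection
                    }
                }
                return dt;
            }
        }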

    Read the article

  • Understanding Request Validation in ASP.NET MVC 3

    - by imran_ku07
         Introduction:             A fact that you must always remember "never ever trust user inputs". An application that trusts user inputs may be easily vulnerable to XSS, XSRF, SQL Injection, etc attacks. XSS and XSRF are very dangerous attacks. So to mitigate these attacks ASP.NET introduced request validation in ASP.NET 1.1. During request validation, ASP.NET will throw HttpRequestValidationException: 'A potentially dangerous XXX value was detected from the client', if he found, < followed by an exclamation(like <!) or < followed by the letters a through z(like <s) or & followed by a pound sign(like &#123) as a part of query string, posted form and cookie collection. In ASP.NET 4.0, request validation becomes extensible. This means that you can extend request validation. Also in ASP.NET 4.0, by default request validation is enabled before the BeginRequest phase of an HTTP request. ASP.NET MVC 3 moves one step further by making request validation granular. This allows you to disable request validation for some properties of a model while maintaining request validation for all other cases. In this article I will show you the use of request validation in ASP.NET MVC 3. Then I will briefly explain the internal working of granular request validation.       Description:             First of all create a new ASP.NET MVC 3 application. Then create a simple model class called MyModel,     public class MyModel { public string Prop1 { get; set; } public string Prop2 { get; set; } }             Then just update the index action method as follows,   public ActionResult Index(MyModel p) { return View(); }             Now just run this application. You will find that everything works just fine. Now just append this query string ?Prop1=<s to the url of this application, you will get the HttpRequestValidationException exception.           Now just decorate the Index action method with [ValidateInputAttribute(false)],   [ValidateInput(false)] public ActionResult Index(MyModel p) { return View(); }             Run this application again with same query string. You will find that your application run without any unhandled exception.           Up to now, there is nothing new in ASP.NET MVC 3 because ValidateInputAttribute was present in the previous versions of ASP.NET MVC. Any problem with this approach? Yes there is a problem with this approach. The problem is that now users can send html for both Prop1 and Prop2 properties and a lot of developers are not aware of it. This means that now everyone can send html with both parameters(e.g, ?Prop1=<s&Prop2=<s). So ValidateInput attribute does not gives you the guarantee that your application is safe to XSS or XSRF. This is the reason why ASP.NET MVC team introduced granular request validation in ASP.NET MVC 3. Let's see this feature.           Remove [ValidateInputAttribute(false)] on Index action and update MyModel class as follows,   public class MyModel { [AllowHtml] public string Prop1 { get; set; } public string Prop2 { get; set; } }             Note that AllowHtml attribute is only decorated on Prop1 property. Run this application again with ?Prop1=<s query string. You will find that your application run just fine. Run this application again with ?Prop1=<s&Prop2=<s query string, you will get HttpRequestValidationException exception. This shows that the granular request validation in ASP.NET MVC 3 only allows users to send html for properties decorated with AllowHtml attribute.            
Sometimes you may need to access Request.QueryString or Request.Form directly. You may change your code as follows,   [ValidateInput(false)] public ActionResult Index() { var prop1 = Request.QueryString["Prop1"]; return View(); }             Run this application again, you will get the HttpRequestValidationException exception again even you have [ValidateInput(false)] on your Index action. The reason is that Request flags are still not set to unvalidate. I will explain this later. For making this work you need to use Unvalidated extension method,     public ActionResult Index() { var q = Request.Unvalidated().QueryString; var prop1 = q["Prop1"]; return View(); }             Unvalidated extension method is defined in System.Web.Helpers namespace . So you need to add using System.Web.Helpers; in this class file. Run this application again, your application run just fine.             There you have it. If you are not curious to know the internal working of granular request validation then you can skip next paragraphs completely. If you are interested then carry on reading.             Create a new ASP.NET MVC 2 application, then open global.asax.cs file and the following lines,     protected void Application_BeginRequest() { var q = Request.QueryString; }             Then make the Index action method as,    [ValidateInput(false)] public ActionResult Index(string id) { return View(); }             Please note that the Index action method contains a parameter and this action method is decorated with [ValidateInput(false)]. Run this application again, but now with ?id=<s query string, you will get HttpRequestValidationException exception at Application_BeginRequest method. Now just add the following entry in web.config,   <httpRuntime requestValidationMode="2.0"/>             Now run this application again. This time your application will run just fine. Now just see the following quote from ASP.NET 4 Breaking Changes,   In ASP.NET 4, by default, request validation is enabled for all requests, because it is enabled before the BeginRequest phase of an HTTP request. As a result, request validation applies to requests for all ASP.NET resources, not just .aspx page requests. This includes requests such as Web service calls and custom HTTP handlers. Request validation is also active when custom HTTP modules are reading the contents of an HTTP request.             This clearly state that request validation is enabled before the BeginRequest phase of an HTTP request. For understanding what does enabled means here, we need to see HttpRequest.ValidateInput, HttpRequest.QueryString and HttpRequest.Form methods/properties in System.Web assembly. 
Here is the implementation of HttpRequest.ValidateInput, HttpRequest.QueryString and HttpRequest.Form methods/properties in System.Web assembly,     public NameValueCollection Form { get { if (this._form == null) { this._form = new HttpValueCollection(); if (this._wr != null) { this.FillInFormCollection(); } this._form.MakeReadOnly(); } if (this._flags[2]) { this._flags.Clear(2); this.ValidateNameValueCollection(this._form, RequestValidationSource.Form); } return this._form; } } public NameValueCollection QueryString { get { if (this._queryString == null) { this._queryString = new HttpValueCollection(); if (this._wr != null) { this.FillInQueryStringCollection(); } this._queryString.MakeReadOnly(); } if (this._flags[1]) { this._flags.Clear(1); this.ValidateNameValueCollection(this._queryString, RequestValidationSource.QueryString); } return this._queryString; } } public void ValidateInput() { if (!this._flags[0x8000]) { this._flags.Set(0x8000); this._flags.Set(1); this._flags.Set(2); this._flags.Set(4); this._flags.Set(0x40); this._flags.Set(0x80); this._flags.Set(0x100); this._flags.Set(0x200); this._flags.Set(8); } }             The above code indicates that HttpRequest.QueryString and HttpRequest.Form will only validate the querystring and form collection if certain flags are set. These flags are automatically set if you call HttpRequest.ValidateInput method. Now run the above application again(don't forget to append ?id=<s query string in the url) with the same settings(i.e, requestValidationMode="2.0" setting in web.config and Application_BeginRequest method in global.asax.cs), your application will run just fine. Now just update the Application_BeginRequest method as,   protected void Application_BeginRequest() { Request.ValidateInput(); var q = Request.QueryString; }             Note that I am calling Request.ValidateInput method prior to use Request.QueryString property. ValidateInput method will internally set certain flags(discussed above). These flags will then tells the Request.QueryString (and Request.Form) property that validate the query string(or form) when user call Request.QueryString(or Request.Form) property. So running this application again with ?id=<s query string will throw HttpRequestValidationException exception. Now I hope it is clear to you that what does requestValidationMode do. It just tells the ASP.NET that not invoke the Request.ValidateInput method internally before the BeginRequest phase of an HTTP request if requestValidationMode is set to a value less than 4.0 in web.config. Here is the implementation of HttpRequest.ValidateInputIfRequiredByConfig method which will prove this statement(Don't be confused with HttpRequest and Request. Request is the property of HttpRequest class),    internal void ValidateInputIfRequiredByConfig() { ............................................................... ............................................................... ............................................................... ............................................................... if (httpRuntime.RequestValidationMode >= VersionUtil.Framework40) { this.ValidateInput(); } }              Hopefully the above discussion will clear you how requestValidationMode works in ASP.NET 4. It is also interesting to note that both HttpRequest.QueryString and HttpRequest.Form only throws the exception when you access them first time. Any subsequent access to HttpRequest.QueryString and HttpRequest.Form will not throw any exception. 
Continuing with the above example, just update Application_BeginRequest method in global.asax.cs file as,   protected void Application_BeginRequest() { try { var q = Request.QueryString; var f = Request.Form; } catch//swallow this exception { } var q1 = Request.QueryString; var f1 = Request.Form; }             Without setting requestValidationMode to 2.0 and without decorating ValidateInput attribute on Index action, your application will work just fine because both HttpRequest.QueryString and HttpRequest.Form will clear their flags after reading HttpRequest.QueryString and HttpRequest.Form for the first time(see the implementation of HttpRequest.QueryString and HttpRequest.Form above).           Now let's see ASP.NET MVC 3 granular request validation internal working. First of all we need to see type of HttpRequest.QueryString and HttpRequest.Form properties. Both HttpRequest.QueryString and HttpRequest.Form properties are of type NameValueCollection which is inherited from the NameObjectCollectionBase class. NameObjectCollectionBase class contains _entriesArray, _entriesTable, NameObjectEntry.Key and NameObjectEntry.Value fields which granular request validation uses internally. In addition granular request validation also uses _queryString, _form and _flags fields, ValidateString method and the Indexer of HttpRequest class. Let's see when and how granular request validation uses these fields.           Create a new ASP.NET MVC 3 application. Then put a breakpoint at Application_BeginRequest method and another breakpoint at HomeController.Index method. Now just run this application. When the break point inside Application_BeginRequest method hits then add the following expression in quick watch window, System.Web.HttpContext.Current.Request.QueryString. You will see the following screen,                                              Now Press F5 so that the second breakpoint inside HomeController.Index method hits. When the second breakpoint hits then add the following expression in quick watch window again, System.Web.HttpContext.Current.Request.QueryString. You will see the following screen,                            First screen shows that _entriesTable field is of type System.Collections.Hashtable and _entriesArray field is of type System.Collections.ArrayList during the BeginRequest phase of the HTTP request. While the second screen shows that _entriesTable type is changed to Microsoft.Web.Infrastructure.DynamicValidationHelper.LazilyValidatingHashtable and _entriesArray type is changed to Microsoft.Web.Infrastructure.DynamicValidationHelper.LazilyValidatingArrayList during executing the Index action method. In addition to these members, ASP.NET MVC 3 also perform some operation on _flags, _form, _queryString and other members of HttpRuntime class internally. This shows that ASP.NET MVC 3 performing some operation on the members of HttpRequest class for making granular request validation possible.           Both LazilyValidatingArrayList and LazilyValidatingHashtable classes are defined in the Microsoft.Web.Infrastructure assembly. You may wonder why their name starts with Lazily. The fact is that now with ASP.NET MVC 3, request validation will be performed lazily. In simple words, Microsoft.Web.Infrastructure assembly is now taking the responsibility for request validation from System.Web assembly. See the below screens. 
The first screen depicting HttpRequestValidationException exception in ASP.NET MVC 2 application while the second screen showing HttpRequestValidationException exception in ASP.NET MVC 3 application.   In MVC 2:                 In MVC 3:                          The stack trace of the second screenshot shows that Microsoft.Web.Infrastructure assembly (instead of System.Web assembly) is now performing request validation in ASP.NET MVC 3. Now you may ask: where Microsoft.Web.Infrastructure assembly is performing some operation on the members of HttpRequest class. There are at least two places where the Microsoft.Web.Infrastructure assembly performing some operation , Microsoft.Web.Infrastructure.DynamicValidationHelper.GranularValidationReflectionUtil.GetInstance method and Microsoft.Web.Infrastructure.DynamicValidationHelper.ValidationUtility.CollectionReplacer.ReplaceCollection method, Here is the implementation of these methods,   private static GranularValidationReflectionUtil GetInstance() { try { if (DynamicValidationShimReflectionUtil.Instance != null) { return null; } GranularValidationReflectionUtil util = new GranularValidationReflectionUtil(); Type containingType = typeof(NameObjectCollectionBase); string fieldName = "_entriesArray"; bool isStatic = false; Type fieldType = typeof(ArrayList); FieldInfo fieldInfo = CommonReflectionUtil.FindField(containingType, fieldName, isStatic, fieldType); util._del_get_NameObjectCollectionBase_entriesArray = MakeFieldGetterFunc<NameObjectCollectionBase, ArrayList>(fieldInfo); util._del_set_NameObjectCollectionBase_entriesArray = MakeFieldSetterFunc<NameObjectCollectionBase, ArrayList>(fieldInfo); Type type6 = typeof(NameObjectCollectionBase); string str2 = "_entriesTable"; bool flag2 = false; Type type7 = typeof(Hashtable); FieldInfo info2 = CommonReflectionUtil.FindField(type6, str2, flag2, type7); util._del_get_NameObjectCollectionBase_entriesTable = MakeFieldGetterFunc<NameObjectCollectionBase, Hashtable>(info2); util._del_set_NameObjectCollectionBase_entriesTable = MakeFieldSetterFunc<NameObjectCollectionBase, Hashtable>(info2); Type targetType = CommonAssemblies.System.GetType("System.Collections.Specialized.NameObjectCollectionBase+NameObjectEntry"); Type type8 = targetType; string str3 = "Key"; bool flag3 = false; Type type9 = typeof(string); FieldInfo info3 = CommonReflectionUtil.FindField(type8, str3, flag3, type9); util._del_get_NameObjectEntry_Key = MakeFieldGetterFunc<string>(targetType, info3); Type type10 = targetType; string str4 = "Value"; bool flag4 = false; Type type11 = typeof(object); FieldInfo info4 = CommonReflectionUtil.FindField(type10, str4, flag4, type11); util._del_get_NameObjectEntry_Value = MakeFieldGetterFunc<object>(targetType, info4); util._del_set_NameObjectEntry_Value = MakeFieldSetterFunc(targetType, info4); Type type12 = typeof(HttpRequest); string methodName = "ValidateString"; bool flag5 = false; Type[] argumentTypes = new Type[] { typeof(string), typeof(string), typeof(RequestValidationSource) }; Type returnType = typeof(void); MethodInfo methodInfo = CommonReflectionUtil.FindMethod(type12, methodName, flag5, argumentTypes, returnType); util._del_validateStringCallback = CommonReflectionUtil.MakeFastCreateDelegate<HttpRequest, ValidateStringCallback>(methodInfo); Type type = CommonAssemblies.SystemWeb.GetType("System.Web.HttpValueCollection"); util._del_HttpValueCollection_ctor = CommonReflectionUtil.MakeFastNewObject<Func<NameValueCollection>>(type); Type type14 = typeof(HttpRequest); string str6 = 
"_form"; bool flag6 = false; Type type15 = type; FieldInfo info6 = CommonReflectionUtil.FindField(type14, str6, flag6, type15); util._del_get_HttpRequest_form = MakeFieldGetterFunc<HttpRequest, NameValueCollection>(info6); util._del_set_HttpRequest_form = MakeFieldSetterFunc(typeof(HttpRequest), info6); Type type16 = typeof(HttpRequest); string str7 = "_queryString"; bool flag7 = false; Type type17 = type; FieldInfo info7 = CommonReflectionUtil.FindField(type16, str7, flag7, type17); util._del_get_HttpRequest_queryString = MakeFieldGetterFunc<HttpRequest, NameValueCollection>(info7); util._del_set_HttpRequest_queryString = MakeFieldSetterFunc(typeof(HttpRequest), info7); Type type3 = CommonAssemblies.SystemWeb.GetType("System.Web.Util.SimpleBitVector32"); Type type18 = typeof(HttpRequest); string str8 = "_flags"; bool flag8 = false; Type type19 = type3; FieldInfo flagsFieldInfo = CommonReflectionUtil.FindField(type18, str8, flag8, type19); Type type20 = type3; string str9 = "get_Item"; bool flag9 = false; Type[] typeArray4 = new Type[] { typeof(int) }; Type type21 = typeof(bool); MethodInfo itemGetter = CommonReflectionUtil.FindMethod(type20, str9, flag9, typeArray4, type21); Type type22 = type3; string str10 = "set_Item"; bool flag10 = false; Type[] typeArray6 = new Type[] { typeof(int), typeof(bool) }; Type type23 = typeof(void); MethodInfo itemSetter = CommonReflectionUtil.FindMethod(type22, str10, flag10, typeArray6, type23); MakeRequestValidationFlagsAccessors(flagsFieldInfo, itemGetter, itemSetter, out util._del_BitVector32_get_Item, out util._del_BitVector32_set_Item); return util; } catch { return null; } } private static void ReplaceCollection(HttpContext context, FieldAccessor<NameValueCollection> fieldAccessor, Func<NameValueCollection> propertyAccessor, Action<NameValueCollection> storeInUnvalidatedCollection, RequestValidationSource validationSource, ValidationSourceFlag validationSourceFlag) { NameValueCollection originalBackingCollection; ValidateStringCallback validateString; SimpleValidateStringCallback simpleValidateString; Func<NameValueCollection> getActualCollection; Action<NameValueCollection> makeCollectionLazy; HttpRequest request = context.Request; Func<bool> getValidationFlag = delegate { return _reflectionUtil.GetRequestValidationFlag(request, validationSourceFlag); }; Func<bool> func = delegate { return !getValidationFlag(); }; Action<bool> setValidationFlag = delegate (bool value) { _reflectionUtil.SetRequestValidationFlag(request, validationSourceFlag, value); }; if ((fieldAccessor.Value != null) && func()) { storeInUnvalidatedCollection(fieldAccessor.Value); } else { originalBackingCollection = fieldAccessor.Value; validateString = _reflectionUtil.MakeValidateStringCallback(context.Request); simpleValidateString = delegate (string value, string key) { if (((key == null) || !key.StartsWith("__", StringComparison.Ordinal)) && !string.IsNullOrEmpty(value)) { validateString(value, key, validationSource); } }; getActualCollection = delegate { fieldAccessor.Value = originalBackingCollection; bool flag = getValidationFlag(); setValidationFlag(false); NameValueCollection col = propertyAccessor(); setValidationFlag(flag); storeInUnvalidatedCollection(new NameValueCollection(col)); return col; }; makeCollectionLazy = delegate (NameValueCollection col) { simpleValidateString(col[null], null); LazilyValidatingArrayList array = new LazilyValidatingArrayList(_reflectionUtil.GetNameObjectCollectionEntriesArray(col), simpleValidateString); 
_reflectionUtil.SetNameObjectCollectionEntriesArray(col, array); LazilyValidatingHashtable table = new LazilyValidatingHashtable(_reflectionUtil.GetNameObjectCollectionEntriesTable(col), simpleValidateString); _reflectionUtil.SetNameObjectCollectionEntriesTable(col, table); }; Func<bool> hasValidationFired = func; Action disableValidation = delegate { setValidationFlag(false); }; Func<int> fillInActualFormContents = delegate { NameValueCollection values = getActualCollection(); makeCollectionLazy(values); return values.Count; }; DeferredCountArrayList list = new DeferredCountArrayList(hasValidationFired, disableValidation, fillInActualFormContents); NameValueCollection target = _reflectionUtil.NewHttpValueCollection(); _reflectionUtil.SetNameObjectCollectionEntriesArray(target, list); fieldAccessor.Value = target; } }             Hopefully the above code will help you to understand the internal working of granular request validation. It is also important to note that Microsoft.Web.Infrastructure assembly invokes HttpRequest.ValidateInput method internally. For further understanding please see Microsoft.Web.Infrastructure assembly code. Finally you may ask: at which stage ASP NET MVC 3 will invoke these methods. You will find this answer by looking at the following method source,   Unvalidated extension method for HttpRequest class defined in System.Web.Helpers.Validation class. System.Web.Mvc.MvcHandler.ProcessRequestInit method. System.Web.Mvc.ControllerActionInvoker.ValidateRequest method. System.Web.WebPages.WebPageHttpHandler.ProcessRequestInternal method.       Summary:             ASP.NET helps in preventing XSS attack using a feature called request validation. In this article, I showed you how you can use granular request validation in ASP.NET MVC 3. I explain you the internal working of  granular request validation. Hope you will enjoy this article too.   SyntaxHighlighter.all()

    Read the article

  • Can I run asp.net mvc 1 on .net 4?

    - by Jenea
    I have an ASP.NET MVC site that references a couple of libraries. Recently I discovered that those DLLs need to be migrated to .NET 4 (that is, compiled for .NET 4). Can I run ASP.NET MVC 1 on .NET 4? Migration to ASP.NET MVC 2 has been postponed because of the removal of the Response.WriteSubstitution(...) method.

    Read the article

  • ASP.NET MVC3 checkbox dropdownlist create [migrated]

    - by user95381
    i'm new in asp.net MVC and I/m use view model to poppulate the dropdown list and group of checkboxes. I use SQL Server 2012, where have many to many relationships between Students - Books; Student - Cities. I need collect StudentName, one city and many books for one student. I have next questions: 1. How can I get the values from database to my StudentBookCityViewModel? 2. How can I save the values to my database in [HttpPost] Create method? Here is the code: MODEL public class Student { public int StudentId { get; set; } public string StudentName { get; set; } public ICollection<Book> Books { get; set; } public ICollection<City> Cities { get; set; } } public class Book { public int BookId { get; set; } public string BookName { get; set; } public bool IsSelected { get; set; } public ICollection<Student> Students { get; set; } } public class City { public int CityId { get; set; } public string CityName { get; set; } public bool IsSelected { get; set; } public ICollection<Student> Students { get; set; } } VIEW MODEL public class StudentBookCityViewModel { public string StudentName { get; set; } public IList<Book> Books { get; set; } public StudentBookCityViewModel() { Books = new[] { new Book {BookName = "Title1", IsSelected = false}, new Book {BookName = "Title2", IsSelected = false}, new Book {BookName = "Title3", IsSelected = false} }.ToList(); } public string City { get; set; } public IEnumerable<SelectListItem> CityValues { get { return new[] { new SelectListItem {Value = "Value1", Text = "Text1"}, new SelectListItem {Value = "Value2", Text = "Text2"}, new SelectListItem {Value = "Value3", Text = "Text3"} }; } } } Context public class EFDbContext : DbContext{ public EFDbContext(string connectionString) { Database.Connection.ConnectionString = connectionString; } public DbSet<Book> Books { get; set; } public DbSet<Student> Students { get; set; } public DbSet<City> Cities { get; set; } protected override void OnModelCreating(DbModelBuilder modelBuilder) { modelBuilder.Entity<Book>() .HasMany(x => x.Students).WithMany(x => x.Books) .Map(x => x.MapLeftKey("BookId").MapRightKey("StudentId").ToTable("StudentBooks")); modelBuilder.Entity<City>() .HasMany(x => x.Students).WithMany(x => x.Cities) .Map(x => x.MapLeftKey("CityId").MapRightKey("StudentId").ToTable("StudentCities")); } } Controller public ActionResult Create() { return View(); } [HttpPost] public ActionResult Create() { //I don't understand how I can save values to db context.SaveChanges(); return RedirectToAction("Index"); } View @model UsingEFNew.ViewModels.StudentBookCityViewModel @using (Html.BeginForm()) { Your Name: @Html.TextBoxFor(model = model.StudentName) <div>Genre:</div> <div> @Html.DropDownListFor(model => model.City, Model.CityValues) </div> <div>Books:</div> <div> @for (int i = 0; i < Model.Books.Count; i++) { <div> @Html.HiddenFor(x => x.Books[i].BookId) @Html.CheckBoxFor(x => x.Books[i].IsSelected) @Html.LabelFor(x => x.Books[i].IsSelected, Model.Books[i].BookName) </div> } </div> <div> <input id="btnSubmit" type="submit" value="Submit" /> </div> </div> }
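    For the [HttpPost] Create action, one common approach is to bind the posted StudentBookCityViewModel, map it onto a new Student, attach the chosen City and the checked Books, and then save. The sketch below is illustrative only: it assumes the controller has an EFDbContext instance named context, that matching Book and City rows already exist in the database, and that looking them up by name is acceptable for your data.

        [HttpPost]
        public ActionResult Create(StudentBookCityViewModel model)
        {
            if (!ModelState.IsValid)
                return View(model);

            var student = new Student
            {
                StudentName = model.StudentName,
                Books = new List<Book>(),
                Cities = new List<City>()
            };

            // Attach the existing City picked in the dropdown.
            var city = context.Cities.FirstOrDefault(c => c.CityName == model.City);
            if (city != null)
                student.Cities.Add(city);

            // Attach every Book whose checkbox was ticked.
            foreach (var selected in model.Books.Where(b => b.IsSelected))
            {
                var book = context.Books.FirstOrDefault(b => b.BookName == selected.BookName);
                if (book != null)
                    student.Books.Add(book);
            }

            context.Students.Add(student);
            context.SaveChanges();
            return RedirectToAction("Index");
        }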

    Read the article

  • JSON Formatting with Jersey, Jackson, & json.org/java Parser using Curl Command

    - by socal_javaguy
    Using Java 6, Tomcat 7, Jersey 1.15, Jackson 2.0.6 (from FasterXml maven repo), & www.json.org parser, I am trying to pretty print the JSON String so it will look indented by the curl -X GET command line. I created a simple web service which has the following architecture: My POJOs (model classes): Family.java import javax.xml.bind.annotation.XmlRootElement; @XmlRootElement public class Family { private String father; private String mother; private List<Children> children; // Getter & Setters } Children.java import javax.xml.bind.annotation.XmlRootElement; @XmlRootElement public class Children { private String name; private String age; private String gender; // Getters & Setters } Using a Utility Class, I decided to hard code the POJOs as follows: public class FamilyUtil { public static Family getFamily() { Family family = new Family(); family.setFather("Joe"); family.setMother("Jennifer"); Children child = new Children(); child.setName("Jimmy"); child.setAge("12"); child.setGender("male"); List<Children> children = new ArrayList<Children>(); children.add(child); family.setChildren(children); return family; } } My web service: import java.io.IOException; import javax.ws.rs.GET; import javax.ws.rs.Path; import javax.ws.rs.PathParam; import javax.ws.rs.Produces; import javax.ws.rs.core.MediaType; import org.codehaus.jackson.JsonGenerationException; import org.codehaus.jackson.map.JsonMappingException; import org.codehaus.jackson.map.ObjectMapper; import org.codehaus.jettison.json.JSONException; import org.json.JSONObject; import org.json.JSONTokener; import com.myapp.controller.myappController; import com.myapp.resource.output.HostingSegmentOutput; import com.myapp.util.FamilyUtil; @Path("") public class MyWebService { @GET @Produces(MediaType.APPLICATION_JSON) public static String getFamily() throws IOException, JsonGenerationException, JsonMappingException, JSONException, org.json.JSONException { ObjectMapper mapper = new ObjectMapper(); String uglyJsonString = mapper.writeValueAsString(FamilyUtil.getFamily()); System.out.println(uglyJsonString); JSONTokener tokener = new JSONTokener(uglyJsonString); JSONObject finalResult = new JSONObject(tokener); return finalResult.toString(4); } } When I run this using: curl -X GET http://localhost:8080/mywebservice I get this in my Eclipse's console: {"father":"Joe","mother":"Jennifer","children":[{"name":"Jimmy","age":"12","gender":"male"}]} But from the curl command on the command line (this response is more important): "{\n \"mother\": \"Jennifer\",\n \"children\": [{\n \"age\": \"12\",\n \"name\": \"Jimmy\",\n \"gender\": \"male\"\n }],\n \"father\": \"Joe\"\n}" This is adding newline escape sequences and placing double quotes (but not indenting like it should it does have 4 spaces after the new line but its all in one line). Would appreciate it if someone could point me in the right direction.

    Read the article

  • DataTable to JSON

    - by Joel Coehoorn
    I recently needed to serialize a datatable to JSON. Where I'm at we're still on .Net 2.0, so I can't use the JSON serializer in .Net 3.5. I figured this must have been done before, so I went looking online and found a number of different options. Some of them depend on an additional library, which I would have a hard time pushing through here. Others require first converting to List<Dictionary<>>, which seemed a little awkward and needless. Another treated all values like a string. For one reason or another I couldn't really get behind any of them, so I decided to roll my own, which is posted below. As you can see from reading the //TODO comments, it's incomplete in a few places. This code is already in production here, so it does "work" in the basic sense. The places where it's incomplete are places where we know our production data won't currently hit it (no timespans or byte arrays in the db). The reason I'm posting here is that I feel like this can be a little better, and I'd like help finishing and improving this code. Any input welcome.

        public static class JSONHelper
        {
            public static string FromDataTable(DataTable dt)
            {
                string rowDelimiter = "";

                StringBuilder result = new StringBuilder("[");
                foreach (DataRow row in dt.Rows)
                {
                    result.Append(rowDelimiter);
                    result.Append(FromDataRow(row));
                    rowDelimiter = ",";
                }
                result.Append("]");

                return result.ToString();
            }

            public static string FromDataRow(DataRow row)
            {
                DataColumnCollection cols = row.Table.Columns;
                string colDelimiter = "";

                StringBuilder result = new StringBuilder("{");
                for (int i = 0; i < cols.Count; i++)
                {
                    // use index rather than foreach, so we can use the index for both the row and cols collection
                    result.Append(colDelimiter).Append("\"")
                          .Append(cols[i].ColumnName).Append("\":")
                          .Append(JSONValueFromDataRowObject(row[i], cols[i].DataType));
                    colDelimiter = ",";
                }
                result.Append("}");

                return result.ToString();
            }

            // possible types:
            // http://msdn.microsoft.com/en-us/library/system.data.datacolumn.datatype(VS.80).aspx
            private static Type[] numeric = new Type[] { typeof(byte), typeof(decimal), typeof(double),
                                                         typeof(Int16), typeof(Int32), typeof(SByte), typeof(Single),
                                                         typeof(UInt16), typeof(UInt32), typeof(UInt64) };

            // I don't want to rebuild this value for every date cell in the table
            private static long EpochTicks = new DateTime(1970, 1, 1).Ticks;

            private static string JSONValueFromDataRowObject(object value, Type DataType)
            {
                // null
                if (value == DBNull.Value) return "null";

                // numeric
                if (Array.IndexOf(numeric, DataType) > -1)
                    return value.ToString(); // TODO: eventually want to use a stricter format

                // boolean
                if (DataType == typeof(bool))
                    return ((bool)value) ? "true" : "false";

                // date -- see http://weblogs.asp.net/bleroy/archive/2008/01/18/dates-and-json.aspx
                if (DataType == typeof(DateTime))
                    return "\"\\/Date(" + new TimeSpan(((DateTime)value).ToUniversalTime().Ticks - EpochTicks).TotalMilliseconds.ToString() + ")\\/\"";

                // TODO: add Timespan support
                // TODO: add Byte[] support
                // TODO: this would be _much_ faster with a state machine

                // string/char
                return "\"" + value.ToString().Replace(@"\", @"\\").Replace(Environment.NewLine, @"\n").Replace("\"", @"\""") + "\"";
            }
        }
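    To round out the two data-type TODOs above, one possible approach (a sketch, not part of the original code) is to emit TimeSpan values as quoted strings and byte arrays as base64-encoded strings. These branches could be merged into JSONValueFromDataRowObject just before the string/char fallback:

        // Sketch only: possible handling for the TimeSpan and Byte[] TODO cases.
        private static string JSONValueFromSpecialTypes(object value, Type dataType)
        {
            // TimeSpan -- emit as a quoted string such as "01:30:00"
            if (dataType == typeof(TimeSpan))
                return "\"" + ((TimeSpan)value).ToString() + "\"";

            // Byte[] -- emit as a base64-encoded string
            if (dataType == typeof(byte[]))
                return "\"" + Convert.ToBase64String((byte[])value) + "\"";

            return null; // not one of the special cases
        }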

    Read the article

  • Code and Slides from my Fall 2012 DevConnections Talks

    - by dwahlin
    Thanks to everyone who attended my sessions at the Fall 2012 DevConnections conference in Las Vegas. There was a ton of interest in different JavaScript and HTML5 topics. Here’s a picture taken after finishing up my first talk. The second one was packed (standing room only…forgot to take a picture though unfortunately) – thanks to everyone for the great questions and interest in the sessions! I really enjoyed talking with everyone that came up afterward.   As promised, here’s where you can find the code and slides I demonstrated during my talks on building an HTML5 application with a variety of technologies and structuring JavaScript code. Building the Account at a Glance ASP.NET MVC, HTML5 and jQuery Application Structuring JavaScript Code - Techniques, Strategies and Patterns If you’re on Twitter keep in touch with me through my DanWahlin alias.

    Read the article

  • Azure, don't give me multiple VMs, give me one elastic VM

    - by FransBouma
    Yesterday, Microsoft revealed new major features for Windows Azure (see ScottGu's post). It all looks shiny and great, but after reading most of the material describing the new features, I still find the overall idea behind all of it flawed: why should I care on how much VMs my web app runs? Isn't that a problem to solve for the Windows Azure engineers / software? And what if I need the file system, why can't I simply get a virtual filesystem ? To illustrate my point, let's use a real example: a product website with a customer system/database and next to it a support site with accompanying database. Both are written in .NET, using ASP.NET and use a SQL Server database each. The product website offers files to download by customers, very simple. You have a couple of options to host these websites: Buy a server, place it in a rack at an ISP and run the sites on that server Use 'shared hosting' with an ISP, which means your sites' appdomains are running on the same machine, as well as the files stored, and the databases are hosted in the same server as the other shared databases. Hire a VM, install your OS of choice at an ISP, and host the sites on that VM, basically the same as the first option, except you don't have a physical server At some cloud-vendor, either host the sites 'shared' or in a VM. See above. With all of those options, scalability is a problem, even the cloud-based ones, though not due to the same reasons: The physical server solution has the obvious problem that if you need more power, you need to buy a bigger server or more servers which requires you to add replication and other overhead Shared hosting solutions are almost always capped on memory usage / traffic and database size: if your sites get too big, you have to move out of the shared hosting environment and start over with one of the other solutions The VM solution, be it a VM at an ISP or 'in the cloud' at e.g. Windows Azure or Amazon, in theory allows scaling out by simply instantiating more VMs, however that too introduces the same overhead problems as with the physical servers: suddenly more than 1 instance runs your sites. If a cloud vendor offers its services in the form of VMs, you won't gain much over having a VM at some ISP: the main problems you have to work around are still there: when you spin up more than one VM, your application must be completely stateless at any moment, including the DB sub system, because what's in memory in instance 1 might not be in memory in instance 2. This might sounds trivial but it's not. A lot of the websites out there started rather small: they were perfectly runnable on a single machine with normal memory and CPU power. After all, you don't need a big machine to run a website with even thousands of users a day. Moving these sites to a multi-VM environment will cause a problem: all the in-memory state they use, all the multi-page transitions they use while keeping state across the transition, they can't do that anymore like they did that on a single machine: state is something of the past, you have to store every byte of state in either a DB or in a viewstate or in a cookie somewhere so with the next request, all state information is available through the request, as nothing is kept in-memory. Our example uses a bunch of files in a file system. Using multiple VMs will require that these files move to a cloud storage system which is mounted in each VM so we don't have to store the files on each VM. 
This might require different file paths, but this change should be minor. What's perhaps less minor is the maintenance procedure in place on the new type of cloud storage used: instead of ftp-ing into a VM, you might have to update the files using different ways / tools. All in all this makes moving an existing website which was written for an environment that's based around a VM (namely .NET with its CLR) overly cumbersome and problematic: it forces you to refactor your website system to be able to be used 'in the cloud', which is caused by the limited way how e.g. Windows Azure offers its cloud services: in blocks of VMs. Offer a scalable, flexible VM which extends with my needs Instead, cloud vendors should offer simply one VM to me. On that VM I run the websites, store my DB and my files. As it's a virtual machine, how this machine is actually ran on physical hardware (e.g. partitioned), I don't care, as that's the problem for the cloud vendor to solve. If I need more resources, e.g. I have more traffic to my server, way more visitors per day, the VM stretches, like I bought a bigger box. This frees me from the problem which comes with multiple VMs: I don't have any refactoring to do at all: I can simply build my website as if it runs on my local hardware server, upload it to the VM offered by the cloud vendor, install it on the VM and I'm done. "But that might require changes to windows!" Yes, but Microsoft is Windows. Windows Azure is their service, they can make whatever change to what they offer to make it look like it's windows. Yet, they're stuck, like Amazon, in thinking in VMs, which forces developers to 'think ahead' and gamble whether they would need to migrate to a cloud with multiple VMs in the future or not. Which comes down to: gamble whether they should invest time in code / architecture which they might never need. (YAGNI anyone?) So the VM we're talking about, is that a low-level VM which runs a guest OS, or is that VM a different kind of VM? The flexible VM: .NET's CLR ? My example websites are ASP.NET based, which means they run inside a .NET appdomain, on the .NET CLR, which is a VM. The only physical OS resource the sites need is the file system, however this too is accessed through .NET. In short: all the websites see is what .NET allows the websites to see, the world as the websites know it is what .NET shows them and lets them access. How the .NET appdomain is run physically, that's the concern of .NET, not mine. This begs the question why Windows Azure doesn't offer virtual appdomains? Or better: .NET environments which look like one machine but could be physically multiple machines. In such an environment, no change has to be made to the websites to migrate them from a local machine or own server to the cloud to get proper scaling: the .NET VM will simply scale with the need: more memory needed, more CPU power needed, it stretches. What it offers to the application running inside the appdomain is simply increasing, but not fragmented: all resources are available to the application: this means that the problem of how to scale is back to where it should be: with the cloud vendor. "Yeah, great, but what about the databases?" The .NET application communicates with the database server through a .NET ADO.NET provider. Where the database is located is not a problem of the appdomain: the ADO.NET provider has to solve that. 
I.o.w.: we can host the databases in an environment which offers itself as a single resource and is accessible through one connection string without replication overhead on the outside, and use that environment inside the .NET VM as if it was a single DB. But what about memory replication and other problems? This environment isn't simple, at least not for the cloud vendor. But it is simple for the customer who wants to run his sites in that cloud: no work needed. No refactoring needed of existing code. Upload it, run it. Perhaps I'm dreaming and what I described above isn't possible. Yet, I think if cloud vendors don't move into that direction, what they're offering isn't interesting: it doesn't solve a problem at all, it simply offers a way to instantiate more VMs with the guest OS of choice at the cost of me needing to refactor my website code so it can run in the straight jacket form factor dictated by the cloud vendor. Let's not kid ourselves here: most of us developers will never build a website which needs a truck load of VMs to run it: almost all websites created by developers can run on just a few VMs at most. Yet, the most expensive change is right at the start: moving from one to two VMs. As soon as you have refactored your website code to run across multiple VMs, adding another one is just as easy as clicking a mouse button. But that first step, that's the problem here and as it's right there at the beginning of scaling the website, it's particularly strange that cloud vendors refuse to solve that problem and leave it to the developers to solve that. Which makes migrating 'to the cloud' particularly expensive.

    Read the article

  • New Bundling and Minification Support (ASP.NET 4.5 Series)

    - by ScottGu
    This is the sixth in a series of blog posts I'm doing on ASP.NET 4.5. The next release of .NET and Visual Studio include a ton of great new features and capabilities.  With ASP.NET 4.5 you'll see a bunch of really nice improvements with both Web Forms and MVC - as well as in the core ASP.NET base foundation that both are built upon. Today’s post covers some of the work we are doing to add built-in support for bundling and minification into ASP.NET - which makes it easy to improve the performance of applications.  This feature can be used by all ASP.NET applications, including both ASP.NET MVC and ASP.NET Web Forms solutions. Basics of Bundling and Minification As more and more people use mobile devices to surf the web, it is becoming increasingly important that the websites and apps we build perform well with them. We’ve all tried loading sites on our smartphones – only to eventually give up in frustration as it loads slowly over a slow cellular network.  If your site/app loads slowly like that, you are likely losing potential customers because of bad performance.  Even with powerful desktop machines, the load time of your site and perceived performance can make an enormous customer perception. Most websites today are made up of multiple JavaScript and CSS files to separate the concerns and keep the code base tight. While this is a good practice from a coding point of view, it often has some unfortunate consequences for the overall performance of the website.  Multiple JavaScript and CSS files require multiple HTTP requests from a browser – which in turn can slow down the performance load time.  Simple Example Below I’ve opened a local website in IE9 and recorded the network traffic using IE’s built-in F12 developer tools. As shown below, the website consists of 5 CSS and 4 JavaScript files which the browser has to download. Each file is currently requested separately by the browser and returned by the server, and the process can take a significant amount of time proportional to the number of files in question. Bundling ASP.NET is adding a feature that makes it easy to “bundle” or “combine” multiple CSS and JavaScript files into fewer HTTP requests. This causes the browser to request a lot fewer files and in turn reduces the time it takes to fetch them.   Below is an updated version of the above sample that takes advantage of this new bundling functionality (making only one request for the JavaScript and one request for the CSS): The browser now has to send fewer requests to the server. The content of the individual files have been bundled/combined into the same response, but the content of the files remains the same - so the overall file size is exactly the same as before the bundling.   But notice how even on a local dev machine (where the network latency between the browser and server is minimal), the act of bundling the CSS and JavaScript files together still manages to reduce the overall page load time by almost 20%.  Over a slow network the performance improvement would be even better. Minification The next release of ASP.NET is also adding a new feature that makes it easy to reduce or “minify” the download size of the content as well.  This is a process that removes whitespace, comments and other unneeded characters from both CSS and JavaScript. The result is smaller files, which will download and load in a browser faster.  
The graph below shows the performance gain we are seeing when both bundling and minification are used together: Even on my local dev box (where the network latency is minimal), we now have a 40% performance improvement from where we originally started.  On slow networks (and especially with international customers), the gains would be even more significant. Using Bundling and Minification inside ASP.NET The upcoming release of ASP.NET makes it really easy to take advantage of bundling and minification within projects and see performance gains like in the scenario above. The way it does this allows you to avoid having to run custom tools as part of your build process –  instead ASP.NET has added runtime support to perform the bundling/minification for you dynamically (caching the results to make sure perf is great).  This enables a really clean development experience and makes it super easy to start to take advantage of these new features. Let’s assume that we have a simple project that has 4 JavaScript files and 6 CSS files: Bundling and Minifying the .css files Let’s say you wanted to reference all of the stylesheets in the “Styles” folder above on a page.  Today you’d have to add multiple CSS references to get all of them – which would translate into 6 separate HTTP requests: The new bundling/minification feature now allows you to instead bundle and minify all of the .css files in the Styles folder – simply by sending a URL request to the folder (in this case “styles”) with an appended “/css” path after it.  For example:    This will cause ASP.NET to scan the directory, bundle and minify the .css files within it, and send back a single HTTP response with all of the CSS content to the browser.  You don’t need to run any tools or pre-processor to get this behavior.  This enables you to cleanly separate your CSS into separate logical .css files and maintain a very clean development experience – while not taking a performance hit at runtime for doing so.  The Visual Studio designer will also honor the new bundling/minification logic as well – so you’ll still get a WYSWIYG designer experience inside VS as well. Bundling and Minifying the JavaScript files Like the CSS approach above, if we wanted to bundle and minify all of our JavaScript into a single response we could send a URL request to the folder (in this case “scripts”) with an appended “/js” path after it:   This will cause ASP.NET to scan the directory, bundle and minify the .js files within it, and send back a single HTTP response with all of the JavaScript content to the browser.  Again – no custom tools or builds steps were required in order to get this behavior.  And it works with all browsers. Ordering of Files within a Bundle By default, when files are bundled by ASP.NET they are sorted alphabetically first, just like they are shown in Solution Explorer. Then they are automatically shifted around so that known libraries and their custom extensions such as jQuery, MooTools and Dojo are loaded before anything else. So the default order for the merged bundling of the Scripts folder as shown above will be: Jquery-1.6.2.js Jquery-ui.js Jquery.tools.js a.js By default, CSS files are also sorted alphabetically and then shifted around so that reset.css and normalize.css (if they are there) will go before any other file. 
So the default sorting of the bundling of the Styles folder as shown above will be: reset.css content.css forms.css globals.css menu.css styles.css The sorting is fully customizable, though, and can easily be changed to accommodate most use cases and any common naming pattern you prefer.  The goal with the out of the box experience, though, is to have smart defaults that you can just use and be successful with. Any number of directories/sub-directories supported In the example above we just had a single “Scripts” and “Styles” folder for our application.  This works for some application types (e.g. single page applications).  Often, though, you’ll want to have multiple CSS/JS bundles within your application – for example: a “common” bundle that has core JS and CSS files that all pages use, and then page specific or section specific files that are not used globally. You can use the bundling/minification support across any number of directories or sub-directories in your project – this makes it easy to structure your code so as to maximize the bunding/minification benefits.  Each directory by default can be accessed as a separate URL addressable bundle.  Bundling/Minification Extensibility ASP.NET’s bundling and minification support is built with extensibility in mind and every part of the process can be extended or replaced. Custom Rules In addition to enabling the out of the box - directory-based - bundling approach, ASP.NET also supports the ability to register custom bundles using a new programmatic API we are exposing.  The below code demonstrates how you can register a “customscript” bundle using code within an application’s Global.asax class.  The API allows you to add/remove/filter files that go into the bundle on a very granular level:     The above custom bundle can then be referenced anywhere within the application using the below <script> reference:     Custom Processing You can also override the default CSS and JavaScript bundles to support your own custom processing of the bundled files (for example: custom minification rules, support for Saas, LESS or Coffeescript syntax, etc). In the example below we are indicating that we want to replace the built-in minification transforms with a custom MyJsTransform and MyCssTransform class. They both subclass the CSS and JavaScript minifier respectively and can add extra functionality:     The end result of this extensibility is that you can plug-into the bundling/minification logic at a deep level and do some pretty cool things with it. 2 Minute Video of Bundling and Minification in Action Mads Kristensen has a great 90 second video that shows off using the new Bundling and Minification feature.  You can watch the 90 second video here. Summary The new bundling and minification support within the next release of ASP.NET will make it easier to build fast web applications.  It is really easy to use, and doesn’t require major changes to your existing dev workflow.  It is also supports a rich extensibility API that enables you to customize it however you want. You can easily take advantage of this new support within ASP.NET MVC, ASP.NET Web Forms and ASP.NET Web Pages based applications. Hope this helps, Scott P.S. In addition to blogging, I use Twitter to-do quick posts and share links. My Twitter handle is: @scottgu
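    The code screenshots from the "Custom Rules" and "Custom Processing" sections above do not survive in this excerpt. As a rough sketch only, the bundling API that eventually shipped in the System.Web.Optimization package lets you register custom script and style bundles along these lines (the bundle names and file paths below are illustrative):

        using System.Web.Optimization;

        public class BundleConfig
        {
            // Typically called from Application_Start in Global.asax:
            //   BundleConfig.RegisterBundles(BundleTable.Bundles);
            public static void RegisterBundles(BundleCollection bundles)
            {
                // Bundle and minify a hand-picked set of scripts under one virtual URL.
                bundles.Add(new ScriptBundle("~/bundles/customscript").Include(
                    "~/Scripts/jquery-1.6.2.js",
                    "~/Scripts/jquery.tools.js",
                    "~/Scripts/a.js"));

                // Bundle and minify the shared stylesheets.
                bundles.Add(new StyleBundle("~/bundles/css").Include(
                    "~/Styles/reset.css",
                    "~/Styles/content.css"));
            }
        }

    In Razor views the bundles can then be referenced with @Scripts.Render("~/bundles/customscript") and @Styles.Render("~/bundles/css").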

    Read the article

  • json-framework: EXC_BAD_ACCESS on stringWithObject

    - by Vegar
    UPDATE: I found that the reason for the previous error was an error in the documentation. The method should be named proxyForJson, not jsonProxyObject... But I'm still stuck, though. I now get an EXC_BAD_ACCESS error inside stringWithObject somewhere. Any clues? UPDATE 2: My proxyForJson implementation is a cut-and-paste from the documentation: - (id)proxyForJson { return [NSDictionary dictionaryWithObjectsAndKeys: Navn, @"Navn", Adresse, @"Adresse", Alder, @"Alder", nil]; } Trying to make JSON serialization work for my custom Objective-C class. As I understand the documentation, json-framework can serialize custom objects if they implement the jsonProxyObject method. So I have this class: @interface MyObject : NSObject { NSString *Name; NSString *Addresse; NSInteger Age; } @property (nonatomic, retain) NSString *Name; @property (nonatomic, retain) NSString *Addresse; @property (nonatomic, assign) NSInteger Age; - (id)jsonProxyObject; @end And I try to serialize an array with some instances in it: [json stringWithObject:list error:&error]; But all I get is the following error: "JSON serialisation not supported for MyObject" I guess the jsonWriter can't find my jsonProxyObject method for some reason, but why? Regards.

    Read the article

  • How to get JSON code into MYSQL database?

    - by the_boy_za
    I've created this form with a jQuery autocomplete function. The selected brand from the autocomplete list needs to get sent to a PHP file using $.ajax function. My code doesn't seem to work but i can't find the error. I don't know why the data isn't getting inserted into MYSQL database. Here is my code: JQUERY: $(document).ready(function() { $("#autocomplete").autocomplete({ minLength: 2 }); $("#autocomplete").autocomplete({ source: ["Adidas", "Airforce", "Alpha Industries", "Asics", "Bikkemberg", "Birkenstock", "Bjorn Borg", "Brunotti", "Calvin Klein", "Cars Jeans", "Chanel", "Chasin", "Diesel", "Dior", "DKNY", "Dolce & Gabbana"] }); $("#add-brand").click(function() { var merk = $("#autocomplete").val(); $("#selected-brands").append(" <a class=\"deletemerk\" href=\"#\">" + merk + "</a>"); //Add your parameters here var param = JSON.stringify({ Brand: merk }); $.ajax({ type: "POST", async: true, url: "scripttohandlejson.php", contentType: "application/json", data: param, dataType: "json", success: function (good){ //handle success alert(good) }, failure: function (bad){ //handle any errors alert(bad) } }); return false; }); }); PHP FILE: scripttohandlejson.php <?PHP $getcontent = json_decode($json, true); $getcontent->{'Brand'}; $vraag ="INSERT INTO kijken (merk) VALUES ='$getcontent' "; $voerin = mysql_query($vraag) or die("couldnt put into db"); <?

    Read the article

  • Creating STA COM compatible ASP.NET Applications

    - by Rick Strahl
    When building ASP.NET applications that interface with old school COM objects like those created with VB6 or Visual FoxPro (MTDLL), it's extremely important that the threads that are serving requests use Single Threaded Apartment Threading. STA is a COM built-in technology that allows essentially single threaded components to operate reliably in a multi-threaded environment. STAs guarantee that COM objects instantiated on a specific thread stay on that specific thread and any access to a COM object from another thread automatically marshals that thread to the STA thread. The end effect is that you can have multiple threads, but a COM object instance lives on a fixed, never changing thread. ASP.NET by default uses MTA (multi-threaded apartment) threads which are truly free spinning threads that pay no heed to COM object marshaling. This is vastly more efficient than STA threading, which has a bit of overhead in determining whether it's OK to run code on a given thread or whether some sort of thread/COM marshaling needs to occur. MTA COM components can be very efficient, but STA COM components in a multi-threaded environment always tend to have a fair amount of overhead. It's amazing how much COM Interop I still see today, so while it seems really old school to be talking about this topic, it's actually quite apropos for me as I have many customers using legacy COM systems that need to interface with other .NET applications. In this post I'm consolidating some of the hacks I've used to integrate with various ASP.NET technologies when using STA COM components. STA in ASP.NET Support for STA threading in the ASP.NET framework is fairly limited. Specifically, only the original ASP.NET WebForms technology supports STA threading directly via its STA Page Handler implementation or what you might know as ASPCOMPAT mode. For WebForms, running STA components is as easy as specifying the ASPCOMPAT attribute in the @Page tag:<%@ Page Language="C#" AspCompat="true" %> which runs the page in STA mode. Removing it runs in MTA mode. Simple. Unfortunately all other ASP.NET technologies built on top of the core ASP.NET engine do not support STA natively. So if you want to use STA COM components in MVC or with classic ASMX Web Services, there's no automatic way like the ASPCOMPAT keyword available. So what happens when you run an STA COM component in an MTA application? In low volume environments - nothing much will happen. The COM objects will appear to work just fine as there are no simultaneous thread interactions and the COM component will happily run on a single thread or multiple single threads one at a time. So for testing, running components in MTA environments may appear to work just fine. However, as load increases and threads get re-used by ASP.NET, COM objects will end up getting created on multiple different threads. This can result in crashes or hangs, or data corruption in the STA components which store their state in thread local storage on the STA thread. If threads overlap this global store can easily get corrupted, which in turn causes problems. STA ensures that any COM object instance loaded always stays on the same thread it was instantiated on. What about COM+? COM+ is supposed to address the problem of STA in MTA applications by providing an abstraction with its own thread pool manager for COM objects. It steps into the COM instantiation pipeline and hands out COM instances from its own internally maintained STA thread pool.
This guarantees that the COM instantiation threads are STA threads if using STA components. COM+ works, but in my experience the technology is very, very slow for STA components. It adds a ton of overhead and reduces COM performance noticably in load tests in IIS. COM+ can make sense in some situations but for Web apps with STA components it falls short. In addition there's also the need to ensure that COM+ is set up and configured on the target machine and the fact that components have to be registered in COM+. COM+ also keeps components up at all times, so if a component needs to be replaced the COM+ package needs to be unloaded (same is true for IIS hosted components but it's more common to manage that). COM+ is an option for well established components, but native STA support tends to provide better performance and more consistent usability, IMHO. STA for non supporting ASP.NET Technologies As mentioned above only WebForms supports STA natively. However, by utilizing the WebForms ASP.NET Page handler internally it's actually possible to trick various other ASP.NET technologies and let them work with STA components. This is ugly but I've used each of these in various applications and I've had minimal problems making them work with FoxPro STA COM components which is about as dififcult as it gets for COM Interop in .NET. In this post I summarize several STA workarounds that enable you to use STA threading with these ASP.NET Technologies: ASMX Web Services ASP.NET MVC WCF Web Services ASP.NET Web API ASMX Web Services I start with classic ASP.NET ASMX Web Services because it's the easiest mechanism that allows for STA modification. It also clearly demonstrates how the WebForms STA Page Handler is the key technology to enable the various other solutions to create STA components. Essentially the way this works is to override the WebForms Page class and hijack it's init functionality for processing requests. Here's what this looks like for Web Services:namespace FoxProAspNet { public class WebServiceStaHandler : System.Web.UI.Page, IHttpAsyncHandler { protected override void OnInit(EventArgs e) { IHttpHandler handler = new WebServiceHandlerFactory().GetHandler( this.Context, this.Context.Request.HttpMethod, this.Context.Request.FilePath, this.Context.Request.PhysicalPath); handler.ProcessRequest(this.Context); this.Context.ApplicationInstance.CompleteRequest(); } public IAsyncResult BeginProcessRequest( HttpContext context, AsyncCallback cb, object extraData) { return this.AspCompatBeginProcessRequest(context, cb, extraData); } public void EndProcessRequest(IAsyncResult result) { this.AspCompatEndProcessRequest(result); } } public class AspCompatWebServiceStaHandlerWithSessionState : WebServiceStaHandler, IRequiresSessionState { } } This class overrides the ASP.NET WebForms Page class which has a little known AspCompatBeginProcessRequest() and AspCompatEndProcessRequest() method that is responsible for providing the WebForms ASPCOMPAT functionality. These methods handle routing requests to STA threads. Note there are two classes - one that includes session state and one that does not. If you plan on using ASP.NET Session state use the latter class, otherwise stick to the former. This maps to the EnableSessionState page setting in WebForms. This class simply hooks into this functionality by overriding the BeginProcessRequest and EndProcessRequest methods and always forcing it into the AspCompat methods. 
The way this works is that BeginProcessRequest() fires first to set up the threads and starts intializing the handler. As part of that process the OnInit() method is fired which is now already running on an STA thread. The code then creates an instance of the actual WebService handler factory and calls its ProcessRequest method to start executing which generates the Web Service result. Immediately after ProcessRequest the request is stopped with Application.CompletRequest() which ensures that the rest of the Page handler logic doesn't fire. This means that even though the fairly heavy Page class is overridden here, it doesn't end up executing any of its internal processing which makes this code fairly efficient. In a nutshell, we're highjacking the Page HttpHandler and forcing it to process the WebService process handler in the context of the AspCompat handler behavior. Hooking up the Handler Because the above is an HttpHandler implementation you need to hook up the custom handler and replace the standard ASMX handler. To do this you need to modify the web.config file (here for IIS 7 and IIS Express): <configuration> <system.webServer> <handlers> <remove name="WebServiceHandlerFactory-Integrated-4.0" /> <add name="Asmx STA Web Service Handler" path="*.asmx" verb="*" type="FoxProAspNet.WebServiceStaHandler" precondition="integrated"/> </handlers> </system.webServer> </configuration> (Note: The name for the WebServiceHandlerFactory-Integrated-4.0 might be slightly different depending on your server version. Check the IIS Handler configuration in the IIS Management Console for the exact name or simply remove the handler from the list there which will propagate to your web.config). For IIS 5 & 6 (Windows XP/2003) or the Visual Studio Web Server use:<configuration> <system.web> <httpHandlers> <remove path="*.asmx" verb="*" /> <add path="*.asmx" verb="*" type="FoxProAspNet.WebServiceStaHandler" /> </httpHandlers> </system.web></configuration> To test, create a new ASMX Web Service and create a method like this: [WebService(Namespace = "http://foxaspnet.org/")] [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)] public class FoxWebService : System.Web.Services.WebService { [WebMethod] public string HelloWorld() { return "Hello World. Threading mode is: " + System.Threading.Thread.CurrentThread.GetApartmentState(); } } Run this before you put in the web.config configuration changes and you should get: Hello World. Threading mode is: MTA Then put the handler mapping into Web.config and you should see: Hello World. Threading mode is: STA And you're on your way to using STA COM components. It's a hack but it works well! I've used this with several high volume Web Service installations with various customers and it's been fast and reliable. ASP.NET MVC ASP.NET MVC has quickly become the most popular ASP.NET technology, replacing WebForms for creating HTML output. MVC is more complex to get started with, but once you understand the basic structure of how requests flow through the MVC pipeline it's easy to use and amazingly flexible in manipulating HTML requests. In addition, MVC has great support for non-HTML output sources like JSON and XML, making it an excellent choice for AJAX requests without any additional tools. Unlike WebForms ASP.NET MVC doesn't support STA threads natively and so some trickery is needed to make it work with STA threads as well. MVC gets its handler implementation through custom route handlers using ASP.NET's built in routing semantics. 
To work in an STA handler requires working in the Page Handler as part of the Route Handler implementation. As with the Web Service handler the first step is to create a custom HttpHandler that can instantiate an MVC request pipeline properly:public class MvcStaThreadHttpAsyncHandler : Page, IHttpAsyncHandler, IRequiresSessionState { private RequestContext _requestContext; public MvcStaThreadHttpAsyncHandler(RequestContext requestContext) { if (requestContext == null) throw new ArgumentNullException("requestContext"); _requestContext = requestContext; } public IAsyncResult BeginProcessRequest(HttpContext context, AsyncCallback cb, object extraData) { return this.AspCompatBeginProcessRequest(context, cb, extraData); } protected override void OnInit(EventArgs e) { var controllerName = _requestContext.RouteData.GetRequiredString("controller"); var controllerFactory = ControllerBuilder.Current.GetControllerFactory(); var controller = controllerFactory.CreateController(_requestContext, controllerName); if (controller == null) throw new InvalidOperationException("Could not find controller: " + controllerName); try { controller.Execute(_requestContext); } finally { controllerFactory.ReleaseController(controller); } this.Context.ApplicationInstance.CompleteRequest(); } public void EndProcessRequest(IAsyncResult result) { this.AspCompatEndProcessRequest(result); } public override void ProcessRequest(HttpContext httpContext) { throw new NotSupportedException("STAThreadRouteHandler does not support ProcessRequest called (only BeginProcessRequest)"); } } This handler code figures out which controller to load and then executes the controller. MVC internally provides the information needed to route to the appropriate method and pass the right parameters. Like the Web Service handler the logic occurs in the OnInit() and performs all the processing in that part of the request. Next, we need a RouteHandler that can actually pick up this handler. Unlike the Web Service handler where we simply registered the handler, MVC requires a RouteHandler to pick up the handler. RouteHandlers look at the URL's path and based on that decide on what handler to invoke. The route handler is pretty simple - all it does is load our custom handler: public class MvcStaThreadRouteHandler : IRouteHandler { public IHttpHandler GetHttpHandler(RequestContext requestContext) { if (requestContext == null) throw new ArgumentNullException("requestContext"); return new MvcStaThreadHttpAsyncHandler(requestContext); } } At this point you can instantiate this route handler and force STA requests to MVC by specifying a route. 
The following sets up the ASP.NET Default Route:Route mvcRoute = new Route("{controller}/{action}/{id}", new RouteValueDictionary( new { controller = "Home", action = "Index", id = UrlParameter.Optional }), new MvcStaThreadRouteHandler()); RouteTable.Routes.Add(mvcRoute);   To make this code a little easier to work with and mimic the behavior of the routes.MapRoute() functionality extension method that MVC provides, here is an extension method for MapMvcStaRoute(): public static class RouteCollectionExtensions { public static void MapMvcStaRoute(this RouteCollection routeTable, string name, string url, object defaults = null) { Route mvcRoute = new Route(url, new RouteValueDictionary(defaults), new MvcStaThreadRouteHandler()); RouteTable.Routes.Add(mvcRoute); } } With this the syntax to add  route becomes a little easier and matches the MapRoute() method:RouteTable.Routes.MapMvcStaRoute( name: "Default", url: "{controller}/{action}/{id}", defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional } ); The nice thing about this route handler, STA Handler and extension method is that it's fully self contained. You can put all three into a single class file and stick it into your Web app, and then simply call MapMvcStaRoute() and it just works. Easy! To see whether this works create an MVC controller like this: public class ThreadTestController : Controller { public string ThreadingMode() { return Thread.CurrentThread.GetApartmentState().ToString(); } } Try this test both with only the MapRoute() hookup in the RouteConfiguration in which case you should get MTA as the value. Then change the MapRoute() call to MapMvcStaRoute() leaving all the parameters the same and re-run the request. You now should see STA as the result. You're on your way using STA COM components reliably in ASP.NET MVC. WCF Web Services running through IIS WCF Web Services provide a more robust and wider range of services for Web Services. You can use WCF over HTTP, TCP, and Pipes, and WCF services support WS* secure services. There are many features in WCF that go way beyond what ASMX can do. But it's also a bit more complex than ASMX. As a basic rule if you need to serve straight SOAP Services over HTTP I 'd recommend sticking with the simpler ASMX services especially if COM is involved. If you need WS* support or want to serve data over non-HTTP protocols then WCF makes more sense. WCF is not my forte but I found a solution from Scott Seely on his blog that describes the progress and that seems to work well. I'm copying his code below so this STA information is all in one place and quickly explain. Scott's code basically works by creating a custom OperationBehavior which can be specified via an [STAOperation] attribute on every method. Using his attribute you end up with a class (or Interface if you separate the contract and class) that looks like this: [ServiceContract] public class WcfService { [OperationContract] public string HelloWorldMta() { return Thread.CurrentThread.GetApartmentState().ToString(); } // Make sure you use this custom STAOperationBehavior // attribute to force STA operation of service methods [STAOperationBehavior] [OperationContract] public string HelloWorldSta() { return Thread.CurrentThread.GetApartmentState().ToString(); } } Pretty straight forward. The latter method returns STA while the former returns MTA. To make STA work every method needs to be marked up. The implementation consists of the attribute and OperationInvoker implementation. 
Here are the two classes required to make this work from Scott's post:public class STAOperationBehaviorAttribute : Attribute, IOperationBehavior { public void AddBindingParameters(OperationDescription operationDescription, System.ServiceModel.Channels.BindingParameterCollection bindingParameters) { } public void ApplyClientBehavior(OperationDescription operationDescription, System.ServiceModel.Dispatcher.ClientOperation clientOperation) { // If this is applied on the client, well, it just doesn’t make sense. // Don’t throw in case this attribute was applied on the contract // instead of the implementation. } public void ApplyDispatchBehavior(OperationDescription operationDescription, System.ServiceModel.Dispatcher.DispatchOperation dispatchOperation) { // Change the IOperationInvoker for this operation. dispatchOperation.Invoker = new STAOperationInvoker(dispatchOperation.Invoker); } public void Validate(OperationDescription operationDescription) { if (operationDescription.SyncMethod == null) { throw new InvalidOperationException("The STAOperationBehaviorAttribute " + "only works for synchronous method invocations."); } } } public class STAOperationInvoker : IOperationInvoker { IOperationInvoker _innerInvoker; public STAOperationInvoker(IOperationInvoker invoker) { _innerInvoker = invoker; } public object[] AllocateInputs() { return _innerInvoker.AllocateInputs(); } public object Invoke(object instance, object[] inputs, out object[] outputs) { // Create a new, STA thread object[] staOutputs = null; object retval = null; Thread thread = new Thread( delegate() { retval = _innerInvoker.Invoke(instance, inputs, out staOutputs); }); thread.SetApartmentState(ApartmentState.STA); thread.Start(); thread.Join(); outputs = staOutputs; return retval; } public IAsyncResult InvokeBegin(object instance, object[] inputs, AsyncCallback callback, object state) { // We don’t handle async… throw new NotImplementedException(); } public object InvokeEnd(object instance, out object[] outputs, IAsyncResult result) { // We don’t handle async… throw new NotImplementedException(); } public bool IsSynchronous { get { return true; } } } The key in this setup is the Invoker and the Invoke method which creates a new thread and then fires the request on this new thread. Because this approach creates a new thread for every request it's not super efficient. There's a bunch of overhead involved in creating the thread and throwing it away after each thread, but it'll work for low volume requests and insure each thread runs in STA mode. If better performance is required it would be useful to create a custom thread manager that can pool a number of STA threads and hand off threads as needed rather than creating new threads on every request. If your Web Service needs are simple and you need only to serve standard SOAP 1.x requests, I would recommend sticking with ASMX services. It's easier to set up and work with and for STA component use it'll be significantly better performing since ASP.NET manages the STA thread pool for you rather than firing new threads for each request. One nice thing about Scotts code is though that it works in any WCF environment including self hosting. It has no dependency on ASP.NET or WebForms for that matter. STA - If you must STA components are a  pain in the ass and thankfully there isn't too much stuff out there anymore that requires it. But when you need it and you need to access STA functionality from .NET at least there are a few options available to make it happen. 
    Each of these solutions is a bit hacky, but they work - I've used all of them in production with good results with FoxPro components. I hope compiling all of these in one place here makes STA consumption a little bit easier. I feel your pain :-) Resources: Download STA Handler Code Examples | Scott Seely's original STA WCF OperationBehavior Article. © Rick Strahl, West Wind Technologies, 2005-2012. Posted in FoxPro, ASP.NET, .NET, COM
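
    The WCF section above points out that a thread manager which pools STA threads would perform better than spinning up a new STA thread per request. Below is a minimal sketch of one such dedicated STA worker - it is not from the original article, the class and member names are made up, and a real pool would keep several of these workers and hand them out round-robin:

        using System;
        using System.Collections.Concurrent;
        using System.Threading;
        using System.Threading.Tasks;

        // A single long-lived STA thread that processes queued delegates.
        public sealed class StaWorker : IDisposable
        {
            private readonly BlockingCollection<Action> _queue = new BlockingCollection<Action>();
            private readonly Thread _thread;

            public StaWorker()
            {
                _thread = new Thread(() =>
                {
                    // Runs every queued work item on this one STA thread,
                    // so COM objects created inside the delegates never migrate.
                    foreach (var work in _queue.GetConsumingEnumerable())
                        work();
                });
                _thread.SetApartmentState(ApartmentState.STA); // must be set before Start()
                _thread.IsBackground = true;
                _thread.Start();
            }

            // Queues a delegate onto the STA thread and blocks until it returns.
            public T Invoke<T>(Func<T> func)
            {
                var tcs = new TaskCompletionSource<T>();
                _queue.Add(() =>
                {
                    try { tcs.SetResult(func()); }
                    catch (Exception ex) { tcs.SetException(ex); }
                });
                return tcs.Task.Result;
            }

            public void Dispose()
            {
                _queue.CompleteAdding();
                _thread.Join();
            }
        }

    Usage would look something like result = worker.Invoke(() => someComWrapper.DoWork()), where someComWrapper stands in for whatever COM interop class the application exposes (a hypothetical name here).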

    Read the article

  • Fastest way to parse big json android

    - by jem88
    I have a doubt that doesn't let me sleep! :D I'm currently working with big JSON files, with many levels. I parse these objects using the 'default' Android way, so I read the response with a ByteArrayOutputStream, get a string and create a JSONObject from the string. All fine here. Now, I have to parse the content of the JSON to get the objects of my interest, and I really can't find a better way than to parse it manually, like this: String status = jsonObject.getString("status"); Boolean isLogged = jsonObject.getBoolean("is_logged"); ArrayList<Genre> genresList = new ArrayList<Genre>(); // Get jsonObject with genres JSONObject jObjGenres = jsonObject.getJSONObject("genres"); // Instantiate an iterator on jsonObject keys Iterator<?> keys = jObjGenres.keys(); // Iterate on keys while( keys.hasNext() ) { String key = (String) keys.next(); JSONObject jObjGenre = jObjGenres.getJSONObject(key); // Create genre object Genre genre = new Genre( jObjGenre.getInt("id_genre"), jObjGenre.getString("string"), jObjGenre.getString("icon") ); genresList.add(genre); } // Get languages list JSONObject jObjLanguages = jsonObject.getJSONObject("languages"); Iterator jLangKey = jObjLanguages.keys(); List<Language> langList = new ArrayList<Language>(); while (jLangKey.hasNext()) { // Iterate on jlangKey obj String key = (String) jLangKey.next(); JSONObject jCurrentLang = (JSONObject) jObjLanguages.get(key); Language lang = new Language( jCurrentLang.getString("id_lang"), jCurrentLang.getString("name"), jCurrentLang.getString("code"), jCurrentLang.getString("active").equals("1") ); langList.add(lang); } I think this is really ugly, frustrating, time-wasting, and fragile. I've seen parsers like json-smart and Gson... but it seems difficult to parse a JSON with many levels and get the objects! But I guess there must be a better way... Any idea? Every suggestion will be really appreciated. Thanks in advance!

    Read the article

  • Python read multiline JSON

    - by Paul W
    I have been trying to use JSON to store settings for a program. I can't seem to get Python 2.6 's JSON Decoder to decode multi-line JSON strings... Here is example input: .settings file: """ {\ 'user':'username',\ 'password':'passwd',\ }\ """ I have tried a couple other syntaxes for this file, which I will specify below (with the traceback they cause). My python code for reading the file in is import json settings_text = open(".settings", "r").read() settings = json.loads(settings_text) The Traceback for this is: Traceback (most recent call last): File "json_test.py", line 4, in <module> print json.loads(text) File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/json/__init__.py", line 307, in loads return _default_decoder.decode(s) File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/json/decoder.py", line 322, in decode raise ValueError(errmsg("Extra data", s, end, len(s))) ValueError: Extra data: line 1 column 2 - line 7 column 1 (char 2 - 41) I assume the "Extra data" is the triple-quote. Here are the other syntaxes I have tried for the .settings file, with their respective Tracebacks: "{\ 'user':'username',\ 'pass':'passwd'\ }" Traceback (most recent call last): File "json_test.py", line 4, in <module> print json.loads(text) File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/json/__init__.py", line 307, in loads return _default_decoder.decode(s) File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/json/decoder.py", line 319, in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/json/decoder.py", line 336, in raw_decode obj, end = self._scanner.iterscan(s, **kw).next() File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/json/scanner.py", line 55, in iterscan rval, next_pos = action(m, context) File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/json/decoder.py", line 155, in JSONString return scanstring(match.string, match.end(), encoding, strict) ValueError: Invalid \escape: line 1 column 2 (char 2) '{\ "user":"username",\ "pass":"passwd",\ }' Traceback (most recent call last): File "json_test.py", line 4, in <module> print json.loads(text) File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/json/__init__.py", line 307, in loads return _default_decoder.decode(s) File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/json/decoder.py", line 319, in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/json/decoder.py", line 338, in raw_decode raise ValueError("No JSON object could be decoded") ValueError: No JSON object could be decoded If I put the settings all on one line, it decodes fine.
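
    For what it's worth, the root cause here is that the .settings content shown above isn't valid JSON: JSON has no backslash line continuations, single quotes are not allowed around strings, and a trailing comma before the closing brace is rejected. Assuming the same two keys used in the question, a file the json module will happily parse looks like this:

        {
            "user": "username",
            "password": "passwd"
        }

    With that file content, json.load(open(".settings")) (or json.loads on the string already read in) returns a regular dict, multi-line input included.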

    Read the article

  • Need help converting jquery, ajax, json and asp.net

    - by Haja Mohaideen
    I am tying out this tutorial, http://www.ezzylearning.com/tutorial.aspx?tid=5869127. It works perfectly. What I am now trying to do is to host the aspx contents as html file. This html file is hosted on my wampserver which is on my laptop. The asp.net code hosted on my test server. When I try to access, I get the following error, Resource interpreted as Script but transferred with MIME type text/html: "http://201.x.x.x/testAjax/Default.aspx/AddProductToCart?callback=jQuery17103264484549872577_1346923699990&{%20pID:%20%226765%22,%20qty:%20%22100%22,%20lblType:%20%2220%22%20}&_=1346923704482". jquery.min.js:4 Uncaught SyntaxError: Unexpected token < I am not sure how to solve this problem. index.html code $(function () { $('#btnAddToCart').click(function () { var result = $.ajax({ type: "POST", url: "http://202.161.45.124/testAjax/Default.aspx/AddProductToCart", crossDomain: true, data: '{ pID: "6765", qty: "100", lblType: "20" }', contentType: "application/json; charset=utf-8", dataType: "jsonp", success: succeeded, failure: function (msg) { alert(msg); }, error: function (xhr, err) { alert(err); } }); }); }); function succeeded(msg) { alert(msg.d); } function btnAddToCart_onclick() { } </script> </head> <body> <form name="form1" method="post"> <div> <input type="button" id="btnAddToCart" onclick="return btnAddToCart_onclick()" value="Button" /> </div> </form> aspx.vb Imports System.Web.Services Imports System.Web.Script.Services <ScriptService()> Public Class WebForm1 Inherits Page Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load Session("test") = "" End Sub <WebMethod()> <ScriptMethod(UseHttpGet:=False, ResponseFormat:=ResponseFormat.Json)> Public Shared Function AddProductToCart(pID As String, qty As String, lblType As String) As String Dim selectedProduct As String = String.Format("+ {0} - {1} - {2}", pID, qty, lblType) HttpContext.Current.Session("test") += selectedProduct Return HttpContext.Current.Session("test").ToString() End Function End Class
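
    A likely culprit in the setup above is that dataType: "jsonp" turns the call into a cross-domain GET that a Page WebMethod doesn't handle, so the server answers with a normal HTML page - hence the text/html MIME warning and the "Unexpected token <" error. Below is a hedged C# sketch of a small generic handler (e.g. an .ashx dropped next to Default.aspx) that wraps its JSON in the callback jQuery sends; the handler name and payload are illustrative, and the original page code is VB.NET rather than C#:

        using System.Web;
        using System.Web.Script.Serialization;

        public class AddProductToCartHandler : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                // jQuery appends ?callback=... for cross-domain JSONP requests
                // (visible in the error URL above).
                string callback = context.Request.QueryString["callback"];

                // Real code would read and validate pID/qty/lblType here.
                string json = new JavaScriptSerializer().Serialize(new { d = "ok" });

                context.Response.ContentType = "application/javascript";
                context.Response.Write(callback + "(" + json + ");");
            }

            public bool IsReusable { get { return true; } }
        }

    The jQuery call would then point its url at the handler and pass pID/qty/lblType as ordinary querystring data, since JSONP requests always go out as GET.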

    Read the article

< Previous Page | 4 5 6 7 8 9 10 11 12 13 14 15  | Next Page >