Search Results

Search found 214 results on 9 pages for 'datamember'.

Page 4/9 | < Previous Page | 1 2 3 4 5 6 7 8 9  | Next Page >

  • Performance Optimization – It Is Faster When You Can Measure It

    - by Alois Kraus
    Performance optimization in bigger systems is hard because the measured numbers can vary greatly depending on the measurement method of your choice. To measure the execution timing of specific methods in your application you usually use one of the following:

    Time measurement methods and their potential pitfalls:
    - Stopwatch: The most accurate method on recent processors. Internally it uses the RDTSC instruction. Since the counter is processor specific you can get greatly different values when your thread is scheduled to another core or the core goes into a power saving mode. But things do change, luckily. Intel's Software Developer's Manual, vol. 3B, section 16.11.1 ("Invariant TSC") states: "The time stamp counter in newer processors may support an enhancement, referred to as invariant TSC. Processor's support for invariant TSC is indicated by CPUID.80000007H:EDX[8]. The invariant TSC will run at a constant rate in all ACPI P-, C-, and T-states. This is the architectural behavior moving forward. On processors with invariant TSC support, the OS may use the TSC for wall clock timer services (instead of ACPI or HPET timers). TSC reads are much more efficient and do not incur the overhead associated with a ring transition or access to a platform resource."
    - DateTime.Now: Good, but it has a resolution of only 16 ms, which may not be enough if you want more accuracy.

    Reporting methods and their potential pitfalls:
    - Console.WriteLine: OK if not called too often.
    - Debug.Print: Are you really measuring performance with debug builds? Shame on you.
    - Trace.WriteLine: Better, but you need to plug in a good output listener such as a trace file. Be aware that the first time you call this method it reads your app.config and deserializes your system.diagnostics section, which also takes time.

    In general it is a good idea to use a tracing library that measures the timing for you, so that you only need to decorate some methods with tracing and can later verify whether something has changed for the better or worse.
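    As a minimal sketch of that idea (the TraceScope helper below is my own illustration, not taken from any particular tracing library), a disposable timing scope lets you decorate a method with a single line and reports on every exit path:

    using System;
    using System.Diagnostics;

    sealed class TraceScope : IDisposable
    {
        readonly string _method;
        readonly Stopwatch _watch = Stopwatch.StartNew(); // the most accurate timer available in .NET

        public TraceScope(string method) { _method = method; }

        public void Dispose()
        {
            // Runs when the using block exits, so every return path of the method is measured.
            _watch.Stop();
            Trace.WriteLine(string.Format("{0} took {1:N3} ms", _method, _watch.Elapsed.TotalMilliseconds));
        }
    }

    class Example
    {
        void DoWork()
        {
            using (new TraceScope("Example.DoWork"))
            {
                // method body being measured
            }
        }
    }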
    In my previous article I compared measuring performance with quantum mechanics. This analogy works surprisingly well. When you measure a quantum system there is a lower limit to how accurately you can measure it. The Heisenberg uncertainty relation tells us that you cannot measure the momentum and the location of a particle at the same time with infinite accuracy. For programmers the two variables are execution time and memory allocations. If you try to measure the timings of all methods in your application you will need to store them somewhere. The fastest storage space besides the CPU cache is memory. But if your timing values consume all available memory, there is no memory left for the actual application to run. On the other hand, if you try to record all memory allocations of your application you will also need to store the data somewhere, which will cost you memory and execution time. These constraints are always there, and regardless of how good the marketing of tool vendors for performance and memory profilers is: any measurement will disturb the system in a non-predictable way. Commercial tool vendors will tell you they calculate this overhead and subtract it from the measured values to give you the most accurate values, but in reality it is not entirely true. After falling into the trap of trusting the profiler timings several times, I have gotten into the habit of:

    1. Measure with a profiler to get an idea where potential bottlenecks are.
    2. Measure again, tracing only the specific methods, to check if this method is really worth optimizing.
    3. Optimize it.
    4. Measure again.
    5. Be surprised that your optimization has made things worse.
    6. Think harder.
    7. Implement something that really works.
    8. Measure again.
    9. Finished! Or look for the next bottleneck.

    Recently I have looked into issues with serialization performance. DataContractSerializer was used for serialization, and I was not sure whether XML is really the most optimal wire format. After looking around I found protobuf-net, which uses Google's Protocol Buffer format, a compact binary serialization format. What is good for Google should be good for us. A small sample app to check out performance was a matter of minutes:

    using ProtoBuf;
    using System;
    using System.Diagnostics;
    using System.IO;
    using System.Reflection;
    using System.Runtime.Serialization;

    [DataContract, Serializable]
    class Data
    {
        [DataMember(Order = 1)] public int IntValue { get; set; }
        [DataMember(Order = 2)] public string StringValue { get; set; }
        [DataMember(Order = 3)] public bool IsActivated { get; set; }
        [DataMember(Order = 4)] public BindingFlags Flags { get; set; }
    }

    class Program
    {
        static MemoryStream _Stream = new MemoryStream();

        // Reset and reuse one stream so per-iteration allocations stay out of the measurement.
        static MemoryStream Stream
        {
            get
            {
                _Stream.Position = 0;
                _Stream.SetLength(0);
                return _Stream;
            }
        }

        static void Main(string[] args)
        {
            DataContractSerializer ser = new DataContractSerializer(typeof(Data));
            Data data = new Data
            {
                IntValue = 100,
                IsActivated = true,
                StringValue = "Hi this is a small string value to check if serialization does work as expected"
            };

            var sw = Stopwatch.StartNew();
            int Runs = 1000 * 1000;
            for (int i = 0; i < Runs; i++)
            {
                //ser.WriteObject(Stream, data);
                Serializer.Serialize<Data>(Stream, data);
            }
            sw.Stop();
            Console.WriteLine("Did take {0:N0}ms for {1:N0} objects", sw.Elapsed.TotalMilliseconds, Runs);
            Console.ReadLine();
        }
    }

    The results are indeed promising:

    - protobuf-net: 807 ms for 1,000,000 objects
    - DataContract: 4,402 ms for 1,000,000 objects

    Nearly a factor of 5 faster and a much more compact wire format. Let's use it! After switching over to protobuf-net the transferred wire data dropped by a factor of two (good), but the performance worsened by nearly a factor of two. How is that possible? We measured it! Protobuf-net is much faster! As it turns out, protobuf-net is faster but it has a cost: the first time a type is de/serialized it uses some very smart code-gen, which does not come for free. Let's try to measure this by setting the Runs value of our performance test app not to one million but to 1:

    - protobuf-net: 85 ms for 1 object
    - DataContract: 24 ms for 1 object

    The code-gen overhead is significant and can take up to 200 ms for more complex types. The break-even point where the code-gen cost is amortized by the faster serialization performance is (assuming small objects) somewhere between 20,000 and 40,000 serialized objects. As it turned out, my specific scenario involved about 100 types and 1,000 serializations in total. That explains why the good old DataContractSerializer is not so easy to take out of business. The final approach I ended up with was to reduce the number of types and to serialize primitive types directly via BinaryWriter, which turned out to be a pretty good alternative.
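    A minimal sketch of that BinaryWriter approach for the Data class above (the DataWriter name and the member order are my own invention; the article does not show the real code):

    using System.IO;

    static class DataWriter
    {
        // Hand-rolled serializer: writes the members in a fixed order, with no
        // per-type code generation and no XML schema overhead on the wire.
        public static void WriteData(BinaryWriter writer, Data data)
        {
            writer.Write(data.IntValue);
            writer.Write(data.StringValue ?? string.Empty); // written length-prefixed
            writer.Write(data.IsActivated);
            writer.Write((int)data.Flags); // enums go out as their underlying integer
        }
    }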
    It sounded good until I measured again and found that my optimizations so far did not help much. After looking more deeply at the profiling data I found that one of the 1,000 calls took 50% of the time. So how do I find out which call it was? Normal profilers fall short at this discipline. A (totally undeservedly) relatively unknown profiler is SpeedTrace, which unlike normal profilers creates traces of your application by instrumenting your IL code at runtime. This way you can look at the full call stack of the one slow serializer call to find out whether this stack was something special. Unfortunately the call stack showed nothing special. But luckily I have my own tracing as well, and I could see that the slow serializer call happened during the serialization of a bool value. When, after much analysis, you encounter something unreasonable that you cannot explain, the chances are good that your thread was suspended by the garbage collector. Whether there is a problem with excessive GCs remains to be investigated, but so far the serialization performance seems to be mostly OK.

    When you profile a complex system with many interconnected processes you can never be sure that the timings you just measured are accurate at all. Some process might be hitting the disc, slowing things down for all other processes for some seconds as well. There is a big difference between warm and cold startup. If you restart all processes you can basically forget the first run, because the OS disc cache, JIT and GCs make the measured timings very flexible. When you are in need of a random number generator you should measure cold startup times of a sufficiently complex system. After the first run you can try again, getting different and much lower numbers. Now try again at least two times to get some feeling for how stable the numbers are. Oh, and try to do the same thing the next day. It might be that the bottleneck you found yesterday is gone today. Thanks to GC and other random stuff it can become pretty hard to find things worth optimizing once no big bottlenecks except bloatloads of code are left anymore.

    When I have found a spot worth optimizing I make the code changes and measure again to check if something has changed. If it has gotten slower and I am certain that my change should have made it faster, I can blame the GC again. The thing is that if you optimize code and allocate fewer objects, the GC times will shift to some other location. If you are unlucky it will make your faster working code slower, because you now see GCs at times where none were before. This is where things get really tricky. A safe escape hatch is to create a repro of the slow code in an isolated application so you can change things fast in a reliable manner. Then the normal profilers also start working again. As Vance Morrison points out, it is much more complex to profile a system against the wall clock than to optimize for CPU time. The reason is that for wall clock time analysis you need to understand how your system works and which threads (if you have not one but perhaps 20) are causing a visible delay to the end user, and which threads can wait a long time without affecting the user experience at all. Next time: Commercial profiler shootout.

    Read the article

  • ASP.NET Creating a Rich Repeater, DataBind wiping out custom added controls...

    - by tonyellard
    So...I had this clever idea that I'd create my own Repeater control that implements paging and sorting by inheriting from Repeater and extending its capabilities. I found some information and bits and pieces on how to go about this and everything seemed ok... I created a WebControlLibrary to house my custom controls. Along with the enriched repeater, I created a composite control that would act as the "pager bar", having forward, back and page selection. My pager bar works 100% on its own, properly firing a page-changed event when the user interacts with it. The rich repeater databinds without issue, but when the databind fires (when I call base.DataBind()), the control collection is cleared out and my pager bars are removed. This screws up the viewstate for the pager bars, making them unable to fire their events properly or maintain their state. I've tried adding the controls back to the collection after base.DataBind() fires, but that doesn't solve the issue. I start to get very strange results, including problems with altering the hierarchy of the control tree (resolved by adding [ViewStateModeById]). Before I go back to the drawing board and create a second composite control which contains a repeater and the pager bars (so that the repeater isn't responsible for the pager bars' viewstate; a sketch of that fallback follows the code below), are there any thoughts about how to resolve the issue? In the interest of share and share alike, the code for the repeater itself is below; the pager bars aren't as significant, as the issue is really the maintaining of state for any additional child controls. (Forgive the roughness of some of the code...it's still a work in progress.) using System; using System.Collections.Generic; using System.ComponentModel; using System.Text; using System.Data; using System.Web; using System.Web.UI; using System.Web.UI.WebControls; [ViewStateModeById] public class SortablePagedRepeater : Repeater, INamingContainer { private SuperRepeaterPagerBar topBar = new SuperRepeaterPagerBar(); private SuperRepeaterPagerBar btmBar = new SuperRepeaterPagerBar(); protected override void OnInit(EventArgs e) { Page.RegisterRequiresControlState(this); InitializeControls(); base.OnInit(e); EnsureChildControls(); } protected void InitializeControls() { topBar.ID = this.ID + "__topPagerBar"; topBar.NumberOfPages = this._currentProperties.numOfPages; topBar.CurrentPage = this.CurrentPageNumber; topBar.PageChanged += new SuperRepeaterPagerBar.PageChangedEventHandler(PageChanged); btmBar.ID = this.ID + "__btmPagerBar"; btmBar.NumberOfPages = this._currentProperties.numOfPages; btmBar.CurrentPage = this.CurrentPageNumber; btmBar.PageChanged += new SuperRepeaterPagerBar.PageChangedEventHandler(PageChanged); } protected override void CreateChildControls() { EnsureDataBound(); this.Controls.Add(topBar); this.Controls.Add(btmBar); //base.CreateChildControls(); } private void PageChanged(object sender, int newPage) { this.CurrentPageNumber = newPage; } public override void DataBind() { //pageDataSource(); //DataBind removes all controls from control collection... base.DataBind(); Controls.Add(topBar); Controls.Add(btmBar); } private void pageDataSource() { //Create paged data source PagedDataSource pds = new PagedDataSource(); pds.PageSize = this.ItemsPerPage; pds.AllowPaging = true; // first get a PagedDataSource going and perform sort if possible... 
if (base.DataSource is System.Collections.IEnumerable) { pds.DataSource = (System.Collections.IEnumerable)base.DataSource; } else if (base.DataSource is System.Data.DataView) { DataView data = (DataView)DataSource; if (this.SortBy != null && data.Table.Columns.Contains(this.SortBy)) { data.Sort = this.SortBy; } pds.DataSource = data.Table.Rows; } else if (base.DataSource is System.Data.DataTable) { DataTable data = (DataTable)DataSource; if (this.SortBy != null && data.Columns.Contains(this.SortBy)) { data.DefaultView.Sort = this.SortBy; } pds.DataSource = data.DefaultView; } else if (base.DataSource is System.Data.DataSet) { DataSet data = (DataSet)DataSource; if (base.DataMember != null && data.Tables.Contains(base.DataMember)) { if (this.SortBy != null && data.Tables[base.DataMember].Columns.Contains(this.SortBy)) { data.Tables[base.DataMember].DefaultView.Sort = this.SortBy; } pds.DataSource = data.Tables[base.DataMember].DefaultView; } else if (data.Tables.Count > 0) { if (this.SortBy != null && data.Tables[0].Columns.Contains(this.SortBy)) { data.Tables[0].DefaultView.Sort = this.SortBy; } pds.DataSource = data.Tables[0].DefaultView; } else { throw new Exception("DataSet doesn't have any tables."); } } else if (base.DataSource == null) { // don't do anything? } else { throw new Exception("DataSource must be of type System.Collections.IEnumerable. The DataSource you provided is of type " + base.DataSource.GetType().ToString()); } if (pds != null && base.DataSource != null) { //Make sure that the page doesn't exceed the maximum number of pages //available if (this.CurrentPageNumber >= pds.PageCount) { this.CurrentPageNumber = pds.PageCount - 1; } //Set up paging values... btmBar.CurrentPage = topBar.CurrentPage = pds.CurrentPageIndex = this.CurrentPageNumber; this._currentProperties.numOfPages = btmBar.NumberOfPages = topBar.NumberOfPages = pds.PageCount; base.DataSource = pds; } } public override object DataSource { get { return base.DataSource; } set { //init(); //reset paging/sorting values since we've potentially changed data sources. base.DataSource = value; pageDataSource(); } } protected override void Render(HtmlTextWriter writer) { topBar.RenderControl(writer); base.Render(writer); btmBar.RenderControl(writer); } [Serializable] protected struct CurrentProperties { public int pageNum; public int itemsPerPage; public int numOfPages; public string sortBy; public bool sortDir; } protected CurrentProperties _currentProperties = new CurrentProperties(); protected override object SaveControlState() { return this._currentProperties; } protected override void LoadControlState(object savedState) { this._currentProperties = (CurrentProperties)savedState; } [Category("Status")] [Browsable(true)] [NotifyParentProperty(true)] [DefaultValue("")] [Localizable(false)] public string SortBy { get { return this._currentProperties.sortBy; } set { //If sorting by the same column, swap the sort direction. 
if (this._currentProperties.sortBy == value) { this.SortAscending = !this.SortAscending; } else { this.SortAscending = true; } this._currentProperties.sortBy = value; } } [Category("Status")] [Browsable(true)] [NotifyParentProperty(true)] [DefaultValue(true)] [Localizable(false)] public bool SortAscending { get { return this._currentProperties.sortDir; } set { this._currentProperties.sortDir = value; } } [Category("Status")] [Browsable(true)] [NotifyParentProperty(true)] [DefaultValue(25)] [Localizable(false)] public int ItemsPerPage { get { return this._currentProperties.itemsPerPage; } set { this._currentProperties.itemsPerPage = value; } } [Category("Status")] [Browsable(true)] [NotifyParentProperty(true)] [DefaultValue(1)] [Localizable(false)] public int CurrentPageNumber { get { return this._currentProperties.pageNum; } set { this._currentProperties.pageNum = value; pageDataSource(); } } }
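    As a rough sketch of the composite-control fallback mentioned above (PagedRepeaterPanel is a hypothetical name; SuperRepeaterPagerBar is taken from the question, and the paging/sorting plumbing is omitted), making the pager bars siblings of the repeater keeps them out of the control collection that Repeater.DataBind() clears:

    using System.Web.UI;
    using System.Web.UI.WebControls;

    public class PagedRepeaterPanel : CompositeControl
    {
        private readonly Repeater _repeater = new Repeater();
        private readonly SuperRepeaterPagerBar _topBar = new SuperRepeaterPagerBar();
        private readonly SuperRepeaterPagerBar _btmBar = new SuperRepeaterPagerBar();

        protected override void CreateChildControls()
        {
            // The pager bars live beside the repeater, so their viewstate
            // survives the repeater's own DataBind().
            Controls.Clear();
            Controls.Add(_topBar);
            Controls.Add(_repeater);
            Controls.Add(_btmBar);
        }

        public object DataSource
        {
            get { return _repeater.DataSource; }
            set { _repeater.DataSource = value; }
        }

        public override void DataBind()
        {
            EnsureChildControls();
            _repeater.DataBind(); // only the inner repeater's children are rebuilt
        }
    }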

    Read the article

  • JQuery + WCF + HTTP 404 Error

    - by hangar18
    Hi all, I've searched high and low and finally decided to post a query here. I'm writing a very basic HTML page from which I'm trying to call a WCF service using jQuery and parse it using JSON. Service: IMyDemo.cs [ServiceContract] public interface IMyDemo { [WebInvoke(Method = "POST", BodyStyle = WebMessageBodyStyle.WrappedRequest, ResponseFormat = WebMessageFormat.Json)] Employee DoWork(); [OperationContract] [WebInvoke(Method = "POST", BodyStyle = WebMessageBodyStyle.WrappedRequest, ResponseFormat = WebMessageFormat.Json)] Employee GetEmp(int age, string name); } [DataContract] public class Employee { [DataMember] public int EmpId { get; set; } [DataMember] public string EmpName { get; set; } [DataMember] public int EmpSalary { get; set; } } MyDemo.svc.cs public Employee DoWork() { // Add your operation implementation here Employee obj = new Employee() { EmpSalary = 12, EmpName = "SomeName" }; return obj; } public Employee GetEmp(int age, string name) { Employee emp = new Employee(); if (age > 0) emp.EmpSalary = 12 + age; if (!string.IsNullOrEmpty(name)) emp.EmpName = "Server" + name; return emp; } Web.config <system.serviceModel> <services> <service behaviorConfiguration="EmployeesBehavior" name="MySample.MyDemo"> <endpoint address="" binding="webHttpBinding" contract="MySample.IMyDemo" behaviorConfiguration="EmployeesBehavior"/> </service> </services> <behaviors> <serviceBehaviors> <behavior name="EmployeesBehavior"> <serviceMetadata httpGetEnabled="true" /> <serviceDebug includeExceptionDetailInFaults="true" /> </behavior> </serviceBehaviors> <endpointBehaviors> <behavior name="EmployeesBehavior"> <webHttp/> </behavior> </endpointBehaviors> </behaviors> <serviceHostingEnvironment multipleSiteBindingsEnabled="true" /> </system.serviceModel> MyDemo.htm <head> <title></title> <script type="text/javascript" language="javascript" src="Scripts/jquery-1.4.1.js"></script> <script type="text/javascript" language="javascript" src="Scripts/json.js"></script> <script type="text/javascript"> //create a global javascript object for the AJAX defaults. debugger; var ajaxDefaults = {}; ajaxDefaults.base = { type: "POST", timeout : 1000, dataFilter: function (data) { //see http://encosia.com/2009/06/29/never-worry-about-asp-net-ajaxs-d-again/ data = JSON.parse(data); //use the JSON2 library if you aren’t using FF3+, IE8, Safari 3/Google Chrome return data.hasOwnProperty("d") ? 
    data.d : data; }, error: function (xhr) { //see if (!xhr) return; if (xhr.responseText) { var response = JSON.parse(xhr.responseText); //console.log works in FF + Firebug only, replace this code if (response) alert(response); else alert("Unknown server error"); } } }; ajaxDefaults.json = $.extend(ajaxDefaults.base, { //see http://encosia.com/2008/03/27/using-jquery-to-consume-aspnet-json-web-services/ contentType: "application/json; charset=utf-8", dataType: "json" }); var ops = { baseUrl: "/MyService/MySample/MyDemo.svc/", doWork: function () { //see http://api.jquery.com/jQuery.extend/ var ajaxOptions = $.extend(ajaxDefaults.json, { url: ops.baseUrl + "DoWork", data: "{}", success: function (msg) { console.log("success"); console.log(typeof msg); if (typeof msg !== "undefined") { console.log(msg); } } }); $.ajax(ajaxOptions); return false; }, getEmp: function () { var ajaxOpts = $.extend(ajaxDefaults.json, { url: ops.baseUrl + "GetEmp", data: JSON.stringify({ age: 12, name: "NameName" }), success: function (msg) { $("span#lbl").html("age: " + msg.Age + "name:" + msg.Name); } }); $.ajax(ajaxOpts); return false; } } </script> </head> <body> <span id="lbl">abc</span> <br /><br /> <input type="button" value="GetEmployee" id="btnGetEmployee" onclick="javascript:ops.getEmp();" /> </body> I'm just not able to get this running. When I debug, I see the error being returned from the call is " Server Error in '/jQuerySample' Application. <h2> <i>HTTP Error 404 - Not Found.</i> </h2></span> " Looks like I'm missing something basic here. My sample is based on this. I've been trying to fix the code for some time now, so I'd like you to take a look and see if you can figure out what it is that I'm doing wrong here. I'm able to see that the service is created when I browse the service in IE. I've also tried changing the setting as mentioned here. Appreciate your help. I'm gonna blog about this as soon as the issue is resolved, for the benefit of other devs. Thanks, Soni
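    Two things stand out in the code as posted, though without the full project this is only a hedged guess: the DoWork operation is missing its [OperationContract] attribute (GetEmp has one), and the error page names the application root '/jQuerySample' while ops.baseUrl points at "/MyService/MySample/MyDemo.svc/", so the request may simply be hitting a URL where no service lives. A sketch of the corrected contract:

    [ServiceContract]
    public interface IMyDemo
    {
        [OperationContract] // missing in the posted code; without it the operation is not exposed
        [WebInvoke(Method = "POST", BodyStyle = WebMessageBodyStyle.WrappedRequest, ResponseFormat = WebMessageFormat.Json)]
        Employee DoWork();

        [OperationContract]
        [WebInvoke(Method = "POST", BodyStyle = WebMessageBodyStyle.WrappedRequest, ResponseFormat = WebMessageFormat.Json)]
        Employee GetEmp(int age, string name);
    }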

    Read the article

  • DomainService method not compiling; claims "Return types must be an entity ..."

    - by Duncan Bayne
    I have a WCF RIA Domain Service that contains a method I'd like to invoke when the user clicks a button: [Invoke] public MyEntity PerformAnalysis(int someId) { return new MyEntity(); } However, when I try to compile I'm given the following error: Operation named 'PerformAnalysis' does not conform to the required signature. Return types must be an entity, collection of entities, or one of the predefined serializable types. The thing is, as far as I can tell, MyEntity is an entity: [Serializable] public class MyEntity: EntityObject, IMyEntity { [Key] [DataMember] [Editable(false)] public int DummyKey { get; set; } [DataMember] [Editable(false)] public IEnumerable<SomeOtherEntity> Children { get; set; } } I figure I'm missing something simple here. Could someone please tell me how I can create an invokable method that returns a single MyEntity object?
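    A hedged guess at the missing piece, based on how WCF RIA Services discovers entities (the service class below is a hypothetical sketch): a type is only treated as an entity once the domain service exposes it through at least one query method, and entity-typed members such as Children normally need an [Association] attribute rather than a plain IEnumerable. If that holds, adding a query method may be enough:

    using System.Linq;
    using System.ServiceModel.DomainServices.Server;

    public class AnalysisDomainService : DomainService
    {
        // Assumption: a conventional query method (public, returning IQueryable<T>)
        // is what registers MyEntity as an entity with the RIA type system.
        public IQueryable<MyEntity> GetMyEntities()
        {
            return Enumerable.Empty<MyEntity>().AsQueryable();
        }

        [Invoke]
        public MyEntity PerformAnalysis(int someId)
        {
            return new MyEntity();
        }
    }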

    Read the article

  • Catches communication exception instead of custom fault exception - WCF

    - by Ismail S
    On the server I'm throwing the exception like this: catch(SqlException exception) { if (exception.Message.Contains("Custom error from stored proc")) { //Exception to be thrown when authentication fails. throw new FaultException<MyServiceFault>(new MyServiceFault { MessageText = exception.Message }); } } And on the client end I'm catching the exception: catch(FaultException<MyServiceFault> faultException) { } Here is my MyServiceFault: [DataContract] public class MyServiceFault { [DataMember] public string MessageText { get; set; } [DataMember] public Guid Id { get; set; } } The problem is that on the client it doesn't go to the MyServiceFault catch block; instead it goes to the communication exception catch block and throws this error: System.ServiceModel.CommunicationException: The underlying connection was closed: The connection was closed unexpectedly. ---> System.Net.WebException I've also decorated my service method with [FaultContract(typeof(MyServiceFault))] in the interface which is implemented by my service. In my web.config the serviceBehaviors tag contains <serviceDebug includeExceptionDetailInFaults="true" /> Any idea where I'm going wrong?
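    For comparison, a minimal pattern that should reach the client as FaultException<MyServiceFault> (hedged: if this works in isolation, the problem lies elsewhere, e.g. the exception escaping before the FaultException is built, or the client proxy having been generated from an older contract without the fault):

    using System.Data.SqlClient;
    using System.ServiceModel;

    [ServiceContract]
    public interface IMyService
    {
        [OperationContract]
        [FaultContract(typeof(MyServiceFault))] // must sit on the exact operation the client calls
        void DoWork();
    }

    public class MyService : IMyService
    {
        public void DoWork()
        {
            try
            {
                // database work...
            }
            catch (SqlException exception)
            {
                // Supplying a FaultReason produces a well-formed SOAP fault even for
                // clients that do not know the typed detail.
                throw new FaultException<MyServiceFault>(
                    new MyServiceFault { MessageText = exception.Message },
                    new FaultReason(exception.Message));
            }
        }
    }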

    Read the article

  • Bidirectional Serialization with Linq

    - by Andy
    Anyone know why only unidirectional serialization is supported in the Linq designer? Consider the following example: say we have a Customer who requested an Order containing Products. We set the Serialization Mode in the Linq designer to Unidirectional to enable serialization. When looking at the code for the Order object, the DataMember attribute is added to all its internal properties such as ID, OrderNumber, etc. and also to the EntitySet of Products, but not to Customer. One can get around this by manually adding the DataMember attribute to Customer, but this becomes quite cumbersome when there are loads of entities in the database.

    Read the article

  • Is there a way to get programmatic access to the columns of a ActiveReports detail section?

    - by Seth Spearman
    Hello, I have a report in Data Dynamics ActiveReports for .NET. In this report I am programmatically setting the ColumnCount property of the detail section to X. The detail section has one databound textbox. The ColumnDirection property of the detail section is set to AcrossDown, and the data binding mechanism then automatically fills across with data after setting the DataSource and DataMember. Here is the code... Public Sub RunReport Dim count As Integer = 0 ' ... get count Detail1.ColumnCount = count Me.DataSource = ds Me.DataMember = ds.Tables(0).TableName End Sub That code works fine and the data is automatically filled across the report. Now I need to alter the report and circle or highlight one of the items that is auto-filled across columns in the report. I cannot find any way to programmatically access the auto-generated columns so I can turn on a border or draw a circle or something. Any ideas how I would do that? Seth

    Read the article

  • Populate a combo box with one data table and save the choice in another.

    - by Scott Chamberlain
    I have two data tables in a data set; for example, let's say Servers: | Server | Ip | and Users: | Username | Password | Server | and there is a foreign key on Users that points to Servers. How do I make a combo box that lists all of the servers? Then how do I make the selected server in that combo box be saved in the data set in the Server column of the Users table? EDIT -- I haven't tested it yet, but is this the right way? this.bsServers.DataMember = "Servers"; this.bsServers.DataSource = this.dataSet; this.bsUsers.DataMember = "FK_Users_Servers"; this.bsUsers.DataSource = this.bsServers; this.serverComboBox.DataBindings.Add(new System.Windows.Forms.Binding("SelectedValue", this.bsUsers, "Server", true)); this.serverComboBox.DataSource = this.bsServers; this.serverComboBox.DisplayMember = "Server"; this.serverComboBox.ValueMember = "Server";

    Read the article

  • Why does the proxy generated code create the wrong class namespace when a MessageContract is in my WCF Service?

    - by DaleyKD
    I have created two WCF Services (Shipping & PDFGenerator). They both, along with my ClientApp, share an assembly named Kyle.Common.Contracts. Within this assembly, I have three classes: namespace Kyle.Common.Contracts { [MessageContract] public class PDFResponse { [MessageHeader] public string fileName { get; set; } [MessageBodyMember] public System.IO.Stream fileStream { get; set; } } [MessageContract] public class PDFRequest { [MessageHeader] public Enums.PDFDocumentNameEnum docType { get; set; } [MessageHeader] public int? pk { get; set; } [MessageHeader] public string[] emailAddress { get; set; } [MessageBodyMember] public Kyle.Common.Contracts.TrackItResult[] trackItResults { get; set; } } [DataContract(Name = "TrackResult", Namespace = "http://kyle")] public class TrackResult { [DataMember] public int SeqNum { get; set; } [DataMember] public int ShipmentID { get; set; } [DataMember] public string StoreNum { get; set; } } } My PDFGenerator ServiceContract looks like: namespace Kyle.WCF.PDFDocs { [ServiceContract(Namespace="http://kyle")] public interface IPDFDocsService { [OperationContract] PDFResponse GeneratePDF(PDFRequest request); [OperationContract] void GeneratePDFAsync(Kyle.Common.Contracts.Enums.PDFDocumentNameEnum docType, int? pk, string[] emailAddress); [OperationContract] Kyle.Common.Contracts.TrackResult[] Test(); } } If I comment out the GeneratePDF stub, the proxy generated by VS2010 realizes that Test returns an array of Kyle.Common.Contracts.TrackResult. However, if I leave GeneratePDF there, the proxy refuses to use Kyle.Common.Contracts.TrackResult, and instead creates a new class, ClientApp.PDFDocServices.TrackResult, and uses that as the return type of Test. Is there a way to force the proxy generator to use Kyle.Common.Contracts.TrackResult whenever I use a MessageContract? Perhaps there's a better method for using a Stream and File Name as return types? I just don't want to have to create a Copy method to copy from ClientApp.PDFDocServices.TrackResult to Kyle.Common.Contracts.TrackResult, since they should be the exact same class. Thanks in advance, Kyle
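    One avenue worth trying, hedged because type reuse has several failure modes: generate the proxy with svcutil and point it explicitly at the shared contracts assembly. When any type in the graph cannot be reused, the generator can silently fall back to emitting fresh classes, which would match the behavior described above. (The URL below is a placeholder for the real service address.)

    svcutil.exe http://localhost/PDFDocsService.svc?wsdl /reference:Kyle.Common.Contracts.dll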

    Read the article

  • Windows Service Hosting WCF Objects over SSL (https) - Custom JSON Error Handling Doesn't Work

    - by bpatrick100
    I will first show the code that works in a non-SSL (HTTP) environment. This code uses a custom JSON error handler, and all thrown errors get bubbled up to the client JavaScript (AJAX). // Create webservice endpoint WebHttpBinding binding = new WebHttpBinding(); ServiceEndpoint serviceEndPoint = new ServiceEndpoint(ContractDescription.GetContract(Type.GetType(svcHost.serviceContract + ", " + svcHost.assemblyName)), binding, new EndpointAddress(svcHost.hostUrl)); // Add exception handler serviceEndPoint.Behaviors.Add(new FaultingWebHttpBehavior()); // Create host and add webservice endpoint WebServiceHost webServiceHost = new WebServiceHost(svcHost.obj, new Uri(svcHost.hostUrl)); webServiceHost.Description.Endpoints.Add(serviceEndPoint); webServiceHost.Open(); I'll also show you what the FaultingWebHttpBehavior class looks like: public class FaultingWebHttpBehavior : WebHttpBehavior { public FaultingWebHttpBehavior() { } protected override void AddServerErrorHandlers(ServiceEndpoint endpoint, EndpointDispatcher endpointDispatcher) { endpointDispatcher.ChannelDispatcher.ErrorHandlers.Clear(); endpointDispatcher.ChannelDispatcher.ErrorHandlers.Add(new ErrorHandler()); } public class ErrorHandler : IErrorHandler { public bool HandleError(Exception error) { return true; } public void ProvideFault(Exception error, MessageVersion version, ref Message fault) { // Build an object to return a json serialized exception GeneralFault generalFault = new GeneralFault(); generalFault.BaseType = "Exception"; generalFault.Type = error.GetType().ToString(); generalFault.Message = error.Message; // Create the fault object to return to the client fault = Message.CreateMessage(version, "", generalFault, new DataContractJsonSerializer(typeof(GeneralFault))); WebBodyFormatMessageProperty wbf = new WebBodyFormatMessageProperty(WebContentFormat.Json); fault.Properties.Add(WebBodyFormatMessageProperty.Name, wbf); } } } [DataContract] public class GeneralFault { [DataMember] public string BaseType; [DataMember] public string Type; [DataMember] public string Message; } The AddServerErrorHandlers() method gets called automatically once webServiceHost.Open() gets called. This sets up the custom JSON error handler, and life is good :-) The problem comes when we switch to an SSL (HTTPS) environment. I'll now show you the endpoint creation code for SSL: // Create webservice endpoint WebHttpBinding binding = new WebHttpBinding(); ServiceEndpoint serviceEndPoint = new ServiceEndpoint(ContractDescription.GetContract(Type.GetType(svcHost.serviceContract + ", " + svcHost.assemblyName)), binding, new EndpointAddress(svcHost.hostUrl)); // This exception handler code below (FaultingWebHttpBehavior) doesn't work with SSL communication for some reason, need to research... // Add exception handler serviceEndPoint.Behaviors.Add(new FaultingWebHttpBehavior()); //Add Https Endpoint WebServiceHost webServiceHost = new WebServiceHost(svcHost.obj, new Uri(svcHost.hostUrl)); binding.Security.Mode = WebHttpSecurityMode.Transport; binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.None; webServiceHost.AddServiceEndpoint(svcHost.serviceContract, binding, string.Empty); Now, with this SSL endpoint code, the service starts up correctly, and WCF-hosted objects can be communicated with just fine via client JavaScript. However, the custom error handler doesn't work. The reason is that the AddServerErrorHandlers() method never gets called when webServiceHost.Open() runs. 
    So, can anyone tell me what is wrong with this picture? And why is AddServerErrorHandlers() not getting called automatically, like it does when I'm using non-SSL endpoints? Thanks!
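    One observation from the SSL code path as posted, hedged since the full hosting code isn't shown: serviceEndPoint (the endpoint that carries FaultingWebHttpBehavior) is created but never added to the host, while webServiceHost.AddServiceEndpoint(...) creates a second endpoint without the behavior, which would explain why AddServerErrorHandlers() never runs. A sketch that attaches the behavior to the endpoint the host actually uses:

    // Create the https binding...
    WebHttpBinding binding = new WebHttpBinding();
    binding.Security.Mode = WebHttpSecurityMode.Transport;
    binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.None;

    WebServiceHost webServiceHost = new WebServiceHost(svcHost.obj, new Uri(svcHost.hostUrl));

    // ...then add the behavior to the endpoint returned by AddServiceEndpoint,
    // which is the endpoint the host will actually open.
    ServiceEndpoint endpoint = webServiceHost.AddServiceEndpoint(svcHost.serviceContract, binding, string.Empty);
    endpoint.Behaviors.Add(new FaultingWebHttpBehavior());

    webServiceHost.Open();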

    Read the article

  • Binding a dependency property to another dependency property

    - by Rire1979
    Take this scenario where I am working with a grid-like control: <RadGrid DataContext={Binding someDataContextObject, Mode=OneWay}> <RadGrid.columns> <RadGrid.Column Header="Column Header" DataMember="{Binding dataContextObjectProperty, Mode=OneWay}"> [...] <DataTemplate> <MyCustomControl Data="{Binding ???}" /> </DataTemplate> </RadGrid.Column> </RadGrid.columns> </RadGrid> I would like to bind the Data dependency property of MyCustomControl to the DataMember dependency property of the column, to avoid multiple bindings to the same data. How do I do it?

    Read the article

  • WCF issues with KnownType for Dictionary

    - by Tom Frey
    Hi, I have a service that implements the following DataMember: [DataMember] public Dictionary<string, List<IOptionQueryResult>> QueryResultItems { get; set; } I have the class "OptionQuerySingleResult" which implements IOptionQueryResult. Now, I understand that I need to make the OptionQuerySingleResult type "known" to the service and thus tried to add the KnownType in various ways: [KnownType(typeof(Dictionary<string, OptionQuerySingleResult[]>))] [KnownType(typeof(Dictionary<string, List<OptionQuerySingleResult>>))] [KnownType(typeof(OptionQuerySingleResult))] However, none of those approaches worked, and on the client side I'm either getting that deserialization failed or the server simply aborted the request, causing a connection aborted error. Does anyone have an idea of the proper way to get this to work? I'd like to add that if I change the QueryResultItems definition to use the concrete type instead of the interface, everything works just fine. Thanks, Tom
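    A hedged sketch of the combination that usually works for interface-typed members (the QueryResult and IQueryService names are placeholders for the real contract types): declare the concrete type as a known type on the data contract that holds the member, and additionally on the service contract via [ServiceKnownType]:

    [DataContract]
    [KnownType(typeof(OptionQuerySingleResult))] // lets the serializer resolve IOptionQueryResult instances
    public class QueryResult
    {
        [DataMember]
        public Dictionary<string, List<IOptionQueryResult>> QueryResultItems { get; set; }
    }

    [ServiceContract]
    [ServiceKnownType(typeof(OptionQuerySingleResult))] // belt and braces on the contract itself
    public interface IQueryService
    {
        [OperationContract]
        QueryResult GetResults();
    }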

    Read the article

  • WCF method that updates object passed in

    - by Georgia Brown
    Am I correct in thinking that if I have a WCF OperationContract that takes in an object and needs to set a property on that object so the client gets the update, I need to declare it to return the object? E.g. given a data contract: [DataContract] public class CompositeType { [DataMember] public int Key { get; set; } [DataMember] public string Something { get; set; } } this will not work with WCF: public void GetDataUsingDataContract(CompositeType composite) { composite.Key = 42; } this will work: public CompositeType GetDataUsingDataContract(CompositeType composite) { composite.Key = 42; return new CompositeType { Key = composite.Key, Something = composite.Something }; }

    Read the article

  • NHibernate FetchMode.Lazy

    - by RyanFetz
    I have an object which has a property that in turn has collections which I would like not to load in a couple of situations. 98% of the time I want those collections fetched, but in this one instance I do not. Here is the code I have... Why does it not set the fetch mode on the property's collections? [DataContract(Name = "ThemingJob", Namespace = "")] [Serializable] public class ThemingJob : ServiceJob { [DataMember] public virtual Query Query { get; set; } [DataMember] public string Results { get; set; } } [DataContract(Name = "Query", Namespace = "")] [Serializable] public class Query : LookupEntity<Query>, DAC.US.Search.Models.IQueryEntity { [DataMember] public string QueryResult { get; set; } private IList<Asset> _Assets = new List<Asset>(); [IgnoreDataMember] [System.Xml.Serialization.XmlIgnore] public IList<Asset> Assets { get { return _Assets; } set { _Assets = value; } } private IList<Theme> _Themes = new List<Theme>(); [IgnoreDataMember] [System.Xml.Serialization.XmlIgnore] public IList<Theme> Themes { get { return _Themes; } set { _Themes = value; } } private IList<Affinity> _Affinity = new List<Affinity>(); [IgnoreDataMember] [System.Xml.Serialization.XmlIgnore] public IList<Affinity> Affinity { get { return _Affinity; } set { _Affinity = value; } } private IList<Word> _Words = new List<Word>(); [IgnoreDataMember] [System.Xml.Serialization.XmlIgnore] public IList<Word> Words { get { return _Words; } set { _Words = value; } } } using (global::NHibernate.ISession session = NHibernateApplication.GetCurrentSession()) { global::NHibernate.ICriteria criteria = session.CreateCriteria(typeof(ThemingJob)); global::NHibernate.ICriteria countCriteria = session.CreateCriteria(typeof(ThemingJob)); criteria.AddOrder(global::NHibernate.Criterion.Order.Desc("Id")); var qc = criteria.CreateCriteria("Query"); qc.SetFetchMode("Assets", global::NHibernate.FetchMode.Lazy); qc.SetFetchMode("Themes", global::NHibernate.FetchMode.Lazy); qc.SetFetchMode("Affinity", global::NHibernate.FetchMode.Lazy); qc.SetFetchMode("Words", global::NHibernate.FetchMode.Lazy); pageIndex = Convert.ToInt32(pageIndex) - 1; // convert to 0 based paging index criteria.SetMaxResults(pageSize); criteria.SetFirstResult(pageIndex * pageSize); countCriteria.SetProjection(global::NHibernate.Criterion.Projections.RowCount()); int totalRecords = (int)countCriteria.List()[0]; return criteria.List<ThemingJob>().ToPagedList<ThemingJob>(pageIndex, pageSize, totalRecords); }

    Read the article

  • Setting the initial value of a property when using DataContractSerializer

    - by Eric
    If I am serializing and later deserializing a class using DataContractSerializer, how can I control the initial values of properties that were not serialized? Consider the Person class below. Its data contract is set to serialize the FirstName and LastName properties but not the IsNew property. I want IsNew to initialize to TRUE whether a Person is being instantiated as a new instance or being deserialized from a file. This is easy to do through the constructor, but as I understand it DataContractSerializer does not call the constructor, since constructors could require parameters. [DataContract(Name="Person")] public class Person { [DataMember(Name="FirstName")] public string FirstName { get; set; } [DataMember(Name = "LastName")] public string LastName { get; set; } public bool IsNew { get; set; } public Person(string first, string last) { this.FirstName = first; this.LastName = last; this.IsNew = true; } }
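    A sketch of the standard mechanism for this, applied to the Person class above: DataContractSerializer skips constructors, but it invokes a method marked with [OnDeserializing] before any members are set, which is the usual place to establish defaults:

    [DataContract(Name = "Person")]
    public class Person
    {
        // ...data members and constructor as above...

        public bool IsNew { get; set; }

        // Called by DataContractSerializer before the data members are populated,
        // even though no constructor runs during deserialization.
        [OnDeserializing]
        private void OnDeserializing(StreamingContext context)
        {
            this.IsNew = true;
        }
    }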

    Read the article

  • should I ever put a major version number into a C#/Java namespace?

    - by Andrew Patterson
    I am designing a set of 'service' layer objects (data objects and interface definitions) for a WCF web service (that will be consumed by third party clients, i.e. not in-house, so outside my direct control). I know that I am not going to get the interface definition exactly right, and want to prepare for the time when I know that I will have to introduce a breaking set of new data objects. However, the reality of the world I am in is that I will also need to run my first version simultaneously for quite a while. The first version of my service will have the URL http://host/app/v1service.svc and when the time comes my new version will live at http://host/app/v2service.svc However, when it comes to the data objects and interfaces, I am toying with putting the 'major' version of the interface number into the actual namespace of the classes. namespace Company.Product.V1 { [DataContract(Namespace = "company-product-v1")] public class Widget { [DataMember] string widgetName; } public interface IFunction { Widget GetWidgetData(int code); } } When the time comes for a fundamental change to the service, I will introduce some classes like namespace Company.Product.V2 { [DataContract(Namespace = "company-product-v2")] public class Widget { [DataMember] int widgetCode; [DataMember] int widgetExpiry; } public interface IFunction { Widget GetWidgetData(int code); } } The advantages as I see it are that I will be able to have a single set of code serving both interface versions, sharing functionality where possible. This is because I will be able to reference both interface versions as a distinct set of C# objects. Similarly, clients may use both interface versions simultaneously, perhaps using V1.Widget in some legacy code whilst new bits move on to V2.Widget. Can anyone tell why this is a stupid idea? I have a nagging feeling that this is a bit smelly... Notes: I am obviously not proposing every single new version of the service would be in a new namespace. Presumably I will do as many non-breaking interface changes as possible, but I know that I will hit a point where all the data modelling will probably need a significant rewrite. I understand assembly versioning etc., but I think this question is tangential to that type of versioning. But I could be wrong.

    Read the article

  • Is it possible to change WCF service without regenerating & recompiling client proxy?

    - by Buu Nguyen
    Let's say I have a WCF service which has a method returning a Person object. In one of the clients of this service, I can add a service reference to the service and start using its method. Now, let's say the Person class is changed on the server, having a new DataMember added. Other clients will make use of this new DataMember, but my client doesn't. Therefore, this client shouldn't even be aware that the service returns something "more" than what it needs. Is there any way that my client can still work with the service without having to update the service reference (which, as I understand, means regenerating the proxy and recompiling it)?
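    As far as I know, the DataContractSerializer is version-tolerant by default, so a client whose Person contract lacks the new member will simply ignore it during deserialization and keep working unchanged. A hedged sketch of the related round-tripping pattern (the Name member is a placeholder), useful if the client ever sends the object back and the server must not lose the unknown data:

    [DataContract]
    public class Person : IExtensibleDataObject
    {
        [DataMember]
        public string Name { get; set; }

        // Unknown members seen during deserialization are parked here and written
        // back out on serialization, so nothing is lost in a round trip.
        public ExtensionDataObject ExtensionData { get; set; }
    }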

    Read the article

  • WCF: generic list serialized to array

    - by OpticalDelusion
    So I am working with WCF and my services return types that contain generic lists. WCF is currently converting these to arrays over the wire. Is there a way to configure WCF to convert them back to lists afterwards? I know there is a way by clicking Advanced when you add a service reference, but I am looking for a solution in configuration files or something similar. [DataContract(IsReference = true)] public class SampleObject { [DataMember] public long ID { get; private set; } [DataMember] public ICollection<AnotherObject> Objects { get; set; } } It is very odd, also, because one service returns it as a list and the other as an array, and I am pretty sure they are configured identically.
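    For a scriptable route (hedged: this controls client-side proxy generation only; on the wire a collection is always effectively a sequence, so there is nothing to set on the service itself), svcutil's /collectionType switch maps all collections in the generated code; the URL below is a placeholder:

    svcutil.exe http://localhost/Service.svc?wsdl /collectionType:System.Collections.Generic.List`1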

    Read the article

  • How do I return clean JSON from a WCF Service?

    - by user208662
    I am trying to return some JSON from a WCF service. This service simply returns some content from my database. I can get the data. However, I am concerned about the format of my JSON. Currently, the JSON that gets returned is formatted like this: {"d":"[{\"Age\":35,\"FirstName\":\"Peyton\",\"LastName\":\"Manning\"},{\"Age\":31,\"FirstName\":\"Drew\",\"LastName\":\"Brees\"},{\"Age\":29,\"FirstName\":\"Tony\",\"LastName\":\"Romo\"}]"} In reality, I would like my JSON to be formatted as cleanly as possible. I believe (I may be incorrect), that the same collection of results, represented in clean JSON, should look like so: [{"Age":35,"FirstName":"Peyton","LastName":"Manning"},{"Age":31,"FirstName":"Drew","LastName":"Brees"},{"Age":29,"FirstName":"Tony","LastName":"Romo"}] I have no idea where the “d” is coming from. I also have no clue why the escape characters are being inserted. My entity looks like the following: [DataContract] public class Person { [DataMember] public string FirstName { get; set; } [DataMember] public string LastName { get; set; } [DataMember] public int Age { get; set; } public Person(string firstName, string lastName, int age) { this.FirstName = firstName; this.LastName = lastName; this.Age = age; } } The service that is responsible for returning the content is defined as: [ServiceContract(Namespace = "")] [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)] public class TestService { [OperationContract] [WebGet(ResponseFormat = WebMessageFormat.Json)] public string GetResults() { List<Person> results = new List<Person>(); results.Add(new Person("Peyton", "Manning", 35)); results.Add(new Person("Drew", "Brees", 31)); results.Add(new Person("Tony", "Romo", 29)); // Serialize the results as JSON DataContractJsonSerializer serializer = new DataContractJsonSerializer(results.GetType()); MemoryStream memoryStream = new MemoryStream(); serializer.WriteObject(memoryStream, results); // Return the results serialized as JSON string json = Encoding.Default.GetString(memoryStream.ToArray()); return json; } } How do I return “clean” JSON from a WCF service? Thank you!
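    The doubled quoting strongly suggests double serialization: the method hand-serializes the list to a JSON string, and WCF then serializes that string again as JSON (the "d" wrapper comes from the ASP.NET AJAX wrapped-response convention). A hedged sketch of the usual fix, returning the objects themselves so WCF serializes exactly once and requesting a bare body:

    [ServiceContract(Namespace = "")]
    [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
    public class TestService
    {
        [OperationContract]
        [WebGet(ResponseFormat = WebMessageFormat.Json, BodyStyle = WebMessageBodyStyle.Bare)]
        public List<Person> GetResults()
        {
            // No manual DataContractJsonSerializer, no MemoryStream, no string return:
            // WCF performs the one and only JSON serialization.
            return new List<Person>
            {
                new Person("Peyton", "Manning", 35),
                new Person("Drew", "Brees", 31),
                new Person("Tony", "Romo", 29)
            };
        }
    }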

    Read the article

  • What are the pros/cons to these 2 ways of defining parameters for a web service method

    - by Antony Scott
    I have an existing web service I need to expand, but it has not gone into production yet. So, I am free to change the contracts as I see fit. But I am not sure of the best way to define the methods. I am leaning towards Method 2 for no other reason than I cannot think of good names to give the parameters classes! Are there any major disadvantages to using Method 2 over Method 1? Method 1 [DataContract(Namespace = Constants.ServiceNamespace)] public class MyParameters { [DataMember(Order = 1, IsRequired = true)] public int CompanyID { get; set; } [DataMember(Order = 2, IsRequired = true)] public string Filter { get; set; } } [ServiceContract(Namespace = Constants.ServiceNamespace)] public interface IMyService { [OperationContract, FaultContract(MyServiceFault)] MyResult MyMethod(MyParameters params); } Method 2 public interface IMyService { [OperationContract, FaultContract(MyServiceFault)] MyResult MyMethod(int companyID, string filter); }

    Read the article

  • WCF returning custom types

    - by Gena Verdel
    Hi. I'm a newbie to WCF, trying to perform a relatively simple task. I'm trying to return a list of objects read from the database, but cannot overcome some really annoying exceptions. The question is very simple: what's wrong with this picture? [ServiceContract] public interface IDBService { [OperationContract] string Ping(string name); [OperationContract] InitBDResult InitBD(); } public InitBDResult InitBD() { _dc = new CentralDC(); InitBDResult result = new InitBDResult(); result.ord = _dc.Orders.First(); return result; } [DataContract] public class InitBDResult { //[DataMember] //public List<Order> Orders { get; set; } [DataMember] public Order ord { get; set; } }

    Read the article

  • AppFabric serialization problem.

    - by jandark
    I am trying to cache a class instance with AppFabric, but it returns the instance with empty members. The reason is the DataContract attribute. My class is marked with the [Serializable] and [DataContract(Name = "TestClass", Namespace = "CustomNameSpace.TestClass")] attributes. The problem is solved if I mark all properties with DataMember or remove the DataContract attribute. But I do not want to remove the DataContract attribute because of other serialization needs (such as JSON or something else), and I do not want to add the DataMember attribute to other classes (there are a lot of them). Do you have any idea how to solve that problem? Thanks.

    Read the article

  • Request/Response pattern in SOA implementation

    - by UserControl
    In an enterprise-like project (.NET, WCF) I saw that all service contracts accept a single Request parameter and always return a Response: [DataContract] public class CustomerRequest : RequestBase { [DataMember] public long Id { get; set; } } [DataContract] public class CustomerResponse : ResponseBase { [DataMember] public CustomerInfo Customer { get; set; } } where RequestBase/ResponseBase contain common stuff like ErrorCode, Context, etc. Bodies of both service methods and proxies are wrapped in try/catch, so the only way to check for errors is looking at ResponseBase.ErrorCode (which is an enumeration). I want to know what this technique is called and why it's better than passing what's needed as method parameters and using the standard WCF context-passing/fault mechanisms.

    Read the article

  • Why have private fields in data contracts modified through public ones?

    - by Nordvind
    When you make a new WCF project, a sample service is generated for you. The default data contract is (I've just changed the string-typed field's title): [DataContract] public class CompositeType { bool boolValue = true; string name = ""; [DataMember] public bool BoolValue { get { return boolValue; } set { boolValue = value; } } [DataMember] public string Name { get { return name; } set { name = value; } } } What is the point of having those private fields boolValue and name? Is it good practice to write data sanitizing or other manipulations into the contract, thus bloating it? That seems to me the only sane reason for not writing to the fields directly. So is it bloatware, or is there some reason behind it?

    Read the article
