Search Results

Search found 355 results on 15 pages for 'memorystream'.


  • Adding Message Part dynamically in Receive Pipeline

    - by Sean
    Hello, I tried to create a custom pipeline component that takes a message and attaches an additional part dynamically (during the Disassemble stage). I haven't set up a send port, so I can see what BizTalk is trying to process. I can see only the body part; the additional part doesn't show up. This is the code I used: var part = pc.GetMessageFactory().CreateMessagePart(); part.Data = new MemoryStream(new byte[] {1, 2, 3, 4, 5}); inmsg.AddPart("another_part", part, false); Thank you.
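
    A hedged aside, not from the post: with the standard BizTalk pipeline interfaces, a part added during Disassemble usually also needs a content type, and its data stream should be rewound before it is attached. The sketch below only illustrates that idea; the "application/octet-stream" value is an assumption.

        // Sketch: create the extra part, rewind its stream and give it a content type
        var part = pc.GetMessageFactory().CreateMessagePart();
        var data = new MemoryStream(new byte[] { 1, 2, 3, 4, 5 });
        data.Position = 0;                                  // make sure the stream starts at 0
        part.Data = data;
        part.ContentType = "application/octet-stream";      // assumed content type
        inmsg.AddPart("another_part", part, false);         // false = not the body part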

    Read the article

  • Create a Stream without having a physical file to create from.

    - by jhorton
    I need to create a zip file containing documents that exist on the server. I am using the .NET Package class to do so, and to create a new Package (which is the zip file) I have to provide either a path to a physical file or a stream. I am trying not to create an actual file for the zip; instead I want a stream that exists only in memory. My question is: how do you instantiate a new Stream (e.g. FileStream, MemoryStream, etc.) without having a physical file to instantiate it from?
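
    For reference, System.IO.Packaging.Package.Open has overloads that accept a Stream, so the zip package can be built entirely over a MemoryStream. A minimal sketch (the part URI, content type and payload are invented for illustration):

        using System.IO;
        using System.IO.Packaging;   // reference WindowsBase

        var ms = new MemoryStream();
        using (Package package = Package.Open(ms, FileMode.Create, FileAccess.ReadWrite))
        {
            Uri partUri = PackUriHelper.CreatePartUri(new Uri("/docs/report.txt", UriKind.Relative));
            PackagePart part = package.CreatePart(partUri, "text/plain");
            using (Stream partStream = part.GetStream())
            {
                byte[] content = System.Text.Encoding.UTF8.GetBytes("hello");
                partStream.Write(content, 0, content.Length);
            }
        }
        // ms now holds the zip bytes (MemoryStream.ToArray still works after the package is closed)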

    Read the article

  • Clone LINQ To SQL object Extension Method throws object dispose exception....

    - by gtas
    Hello all, I have this extension method for cloning my LINQ to SQL objects: public static T CloneObjectGraph<T>(this T obj) where T : class { var serializer = new DataContractSerializer(typeof(T), null, int.MaxValue, false, true, null); using (var ms = new System.IO.MemoryStream()) { serializer.WriteObject(ms, obj); ms.Position = 0; return (T)serializer.ReadObject(ms); } } But when I carry objects that don't have all references loaded (querying with DataLoadOptions), it sometimes throws an ObjectDisposedException, even though I don't touch the references that are not loaded (null). E.g. I have a Customer with many references and I only need to carry the Address reference (EntityRef<>) in memory; I don't load anything else. But when I clone the object, this exception forces me to load all the EntitySet<> references along with the Customer object, which might be too much and slows the application down. Any suggestions?
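
    Not from the question, but one hedged guess worth testing: DataContractSerializer walks every serializable member, which can trigger deferred loads against a disposed DataContext. Turning deferred loading off before querying leaves unloaded references as null instead of throwing; roughly (MyDataContext, Customers, Address and Id are placeholder names):

        using (var db = new MyDataContext())
        {
            db.DeferredLoadingEnabled = false;               // unloaded EntityRef/EntitySet stay null
            var loadOptions = new DataLoadOptions();
            loadOptions.LoadWith<Customer>(c => c.Address);  // load only what is needed
            db.LoadOptions = loadOptions;

            var customer = db.Customers.First(c => c.Id == 42);
            var clone = customer.CloneObjectGraph();         // extension method from the question
        }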

    Read the article

  • How to reload different objects by applicationsettingsbase?

    - by younevertell
    // TestASettingsString and TestBSettingsString are byte[] // TestASettings and TestBSettings are two objects to be saved My question is how to recover TestASettings and TestBSettings from TestASettingsString and TestBSettingsString separately in LoadSavedSettings? Thanks private void SettingsSaving(object sender, CancelEventArgs e) { try { var stream = new MemoryStream(); var formatter = new BinaryFormatter(); formatter.Serialize(stream, TestASettings); // TestASettingsString and TestBSettingsString are byte[] TestASettingsString = stream.ToArray(); stream.Flush(); formatter.Serialize(stream, TestBSettings); TestBSettingsString = stream.ToArray(); stream.Close(); } catch (Exception ex) { Debug.WriteLine(ex); } } private void LoadSavedSettings() { Reload(); // how to get TestASettings and TestBSettings from TestASettingsString and // TestBSettingsString separately? }
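
    A hedged sketch of the reverse direction, not from the post: each byte[] can be wrapped in its own MemoryStream and handed back to BinaryFormatter. Note that the saving code above writes both objects into the same stream, so TestBSettingsString as written contains both payloads; giving each object its own MemoryStream keeps the two arrays independent. The settings type names in the final comments are assumptions.

        // requires System.IO and System.Runtime.Serialization.Formatters.Binary
        private static byte[] SerializeToBytes(object value)
        {
            var formatter = new BinaryFormatter();
            using (var stream = new MemoryStream())
            {
                formatter.Serialize(stream, value);
                return stream.ToArray();
            }
        }

        private static T DeserializeFromBytes<T>(byte[] data)
        {
            var formatter = new BinaryFormatter();
            using (var stream = new MemoryStream(data))
            {
                return (T)formatter.Deserialize(stream);
            }
        }

        // in LoadSavedSettings():
        // TestASettings = DeserializeFromBytes<ASettingsType>(TestASettingsString);
        // TestBSettings = DeserializeFromBytes<BSettingsType>(TestBSettingsString);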

    Read the article

  • How to create and download an XML without storing it on the server

    - by CiccioMiami
    I have an ASP.NET Web Forms application. I would like to create a download link that lets the user download an XML file. However, the file does not have to be stored on the server. In my aspx file I have the download link (placed inside a GridView): <asp:HyperLinkField Text="Download" DataNavigateUrlFormatString="download.aspx?ProductId={0}" DataNavigateUrlFields="ProductId"> In the download.aspx.vb page: Dim productId As String = Request.QueryString("productId") Dim xmlDoc As String = _ProductServices.GetXmlDocPerId(productId) Dim xdoc As XmlDocument = New XmlDocument() xdoc.LoadXml(xmlDoc) Now I would like to create a file, place the XML content inside and deliver it to the user without saving it to the server. Shall I use a MemoryStream combined with a StreamReader?
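
    A minimal sketch (written in C# for brevity, and only a guess at one workable approach): the XML string can be streamed straight into the response from download.aspx without ever touching disk. The file name used below is illustrative.

        string productId = Request.QueryString["productId"];
        string xml = _ProductServices.GetXmlDocPerId(productId);   // helper from the question

        Response.Clear();
        Response.ContentType = "text/xml";
        Response.AddHeader("Content-Disposition", "attachment; filename=product-" + productId + ".xml");
        Response.Write(xml);                                       // no MemoryStream or temp file needed
        Response.End();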

    Read the article

  • Mocking a static method call to a C# library class

    - by Joe
    This seems like an easy enough issue, but I can't seem to find the right keywords for my searches. I'm trying to unit test by mocking out all objects within this method call. I am able to do so for all of my own creations except for this one: public void MyFunc(MyVarClass myVar) { Image picture; ... picture = Image.FromStream(new MemoryStream(myVar.ImageStream)); ... } FromStream is a static call on the Image class (part of the .NET Framework). So how can I refactor my code to mock this out, because I really don't want to provide an image stream to the unit test?
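
    One common refactoring, sketched here rather than taken from the post: hide the static call behind a small interface that the unit test can fake. The interface and class names below are invented for illustration.

        public interface IImageLoader
        {
            Image FromBytes(byte[] data);
        }

        public class GdiImageLoader : IImageLoader
        {
            public Image FromBytes(byte[] data)
            {
                // the only place that still touches the static Image.FromStream call
                return Image.FromStream(new MemoryStream(data));
            }
        }

        public class MyClass
        {
            private readonly IImageLoader _imageLoader;

            public MyClass(IImageLoader imageLoader) { _imageLoader = imageLoader; }

            public void MyFunc(MyVarClass myVar)
            {
                Image picture = _imageLoader.FromBytes(myVar.ImageStream);
                // ... rest of the method, with no static dependency left to mock
            }
        }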

    Read the article

  • Me As Child Type In General Function

    - by Steven
    I have a MustInherit Parent class with two Child classes which inherit from the Parent. How can I use (or cast) Me in a Parent function as the child type of that instance? EDIT: My actual goal is to be able to serialize (BinaryFormatter.Serialize(Stream, Object)) either of my child classes. However, "repeating the code" in each child "seems" wrong. EDIT2: This is my Serialize function. Where should I implement this function? Copying and pasting to each child doesn't seem right, but casting the parent to a child doesn't seem right either. Public Function Serialize() As Byte() Dim bFmt As New BinaryFormatter() Dim mStr As New MemoryStream() bFmt.Serialize(mStr, Me) Return mStr.ToArray() End Function
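
    For what it's worth, a hedged C# sketch of the same idea: BinaryFormatter serializes the runtime type of the object it is handed, so a single Serialize method on the abstract parent already writes the correct child payload; there should be no need to repeat the code in each child or to cast Me.

        [Serializable]
        public abstract class Parent
        {
            public byte[] Serialize()
            {
                var formatter = new BinaryFormatter();
                using (var stream = new MemoryStream())
                {
                    // "this" is the actual child instance at runtime,
                    // so the child type and its fields are what get written
                    formatter.Serialize(stream, this);
                    return stream.ToArray();
                }
            }
        }

        [Serializable]
        public class ChildA : Parent { /* child-specific members */ }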

    Read the article

  • Problem in serializing

    - by Harun
    Hello, I currently need to serialize one of my objects, which contains several objects of my own classes. The problem is I don't want to save it to a file and then read it back into a memory stream. Is there any way to serialize my object directly into a stream? I used BinaryFormatter for serializing. First I used a MemoryStream directly to hold the serialized output, but it gives an error at deserialization time. However, when I serialize it to a file, close it and reopen it, it works perfectly. I want to write it directly to a stream because my program needs to do this frequently to pass the data to a network client, and using a file repeatedly might slow down my software. I hope my problem is clear. Any suggestions?
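
    A hedged sketch of the in-memory round trip, not from the post: BinaryFormatter can write to and read from a MemoryStream directly, but the stream position has to be rewound to 0 before deserializing (or before copying the bytes to the network), which is a frequent cause of the error described.

        var formatter = new BinaryFormatter();
        using (var stream = new MemoryStream())
        {
            formatter.Serialize(stream, myObject);   // myObject: the object graph to send
            stream.Position = 0;                     // rewind before reading back or sending
            var copy = formatter.Deserialize(stream);
        }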

    Read the article

  • How do I automatically delete an Excel file after creating it on a server and returning it to the user

    - by David A Gibson
    Hello, I am creating an Excel file on a web server, using OleDb to connect to the physical (well, as physical as it can be) file and append records. I am then returning a FilePathResult to the user via MVC, and would like to delete the physical file afterwards due to data protection concerns over the appended records. I have tried using File.Delete in a Finally clause, but I get a File Not Found error, which must mean the file has gone by the time MVC tries to send it to the user. I thought about creating the file as a MemoryStream, but I think OleDb needs a physical file to connect to, so this isn't an option. Any suggestions on how to delete the file after returning it, in one operation? Thanks
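
    One workable pattern, offered as a sketch that assumes MVC's Controller.File helper and a hypothetical BuildWorkbook method standing in for the existing OleDb code: read the finished workbook into memory, delete the physical file immediately, and return the bytes, so nothing needs the file after the action completes.

        public ActionResult DownloadReport()
        {
            string path = BuildWorkbook();                   // existing OleDb code writes the file here
            byte[] bytes = System.IO.File.ReadAllBytes(path);
            System.IO.File.Delete(path);                     // safe: the response is served from memory

            return File(bytes, "application/vnd.ms-excel", "report.xls");
        }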

    Read the article

  • Change English numbers to Persian and vice versa in MVC (httpmodule)?

    - by Mohammad
    I wanna change all English numbers to Persian for showing to users. and change them to English numbers again for giving all requests (Postbacks) e.g: we have something like this in view IRQ170, I wanna show IRQ??? to users and give IRQ170 from users. I know, I have to use Httpmodule, But I don't know how ? Could you please guide me? Edit : Let me describe more : I've written the following http module : using System; using System.Collections.Specialized; using System.Diagnostics; using System.IO; using System.Text; using System.Text.RegularExpressions; using System.Web; using System.Web.UI; using Smartiz.Common; namespace Smartiz.UI.Classes { public class PersianNumberModule : IHttpModule { private StreamWatcher _watcher; #region Implementation of IHttpModule /// <summary> /// Initializes a module and prepares it to handle requests. /// </summary> /// <param name="context">An <see cref="T:System.Web.HttpApplication"/> that provides access to the methods, properties, and events common to all application objects within an ASP.NET application </param> public void Init(HttpApplication context) { context.BeginRequest += ContextBeginRequest; context.EndRequest += ContextEndRequest; } /// <summary> /// Disposes of the resources (other than memory) used by the module that implements <see cref="T:System.Web.IHttpModule"/>. /// </summary> public void Dispose() { } #endregion private void ContextBeginRequest(object sender, EventArgs e) { HttpApplication context = sender as HttpApplication; if (context == null) return; _watcher = new StreamWatcher(context.Response.Filter); context.Response.Filter = _watcher; } private void ContextEndRequest(object sender, EventArgs e) { HttpApplication context = sender as HttpApplication; if (context == null) return; _watcher = new StreamWatcher(context.Response.Filter); context.Response.Filter = _watcher; } } public class StreamWatcher : Stream { private readonly Stream _stream; private readonly MemoryStream _memoryStream = new MemoryStream(); public StreamWatcher(Stream stream) { _stream = stream; } public override void Flush() { _stream.Flush(); } public override int Read(byte[] buffer, int offset, int count) { int bytesRead = _stream.Read(buffer, offset, count); string orgContent = Encoding.UTF8.GetString(buffer, offset, bytesRead); string newContent = orgContent.ToEnglishNumber(); int newByteCountLength = Encoding.UTF8.GetByteCount(newContent); Encoding.UTF8.GetBytes(newContent, 0, Encoding.UTF8.GetByteCount(newContent), buffer, 0); return newByteCountLength; } public override void Write(byte[] buffer, int offset, int count) { string strBuffer = Encoding.UTF8.GetString(buffer, offset, count); MatchCollection htmlAttributes = Regex.Matches(strBuffer, @"(\S+)=[""']?((?:.(?![""']?\s+(?:\S+)=|[>""']))+.)[""']?", RegexOptions.IgnoreCase | RegexOptions.Multiline); foreach (Match match in htmlAttributes) { strBuffer = strBuffer.Replace(match.Value, match.Value.ToEnglishNumber()); } MatchCollection scripts = Regex.Matches(strBuffer, "<script[^>]*>(.*?)</script>", RegexOptions.Singleline | RegexOptions.IgnoreCase | RegexOptions.Multiline | RegexOptions.IgnorePatternWhitespace); foreach (Match match in scripts) { MatchCollection values = Regex.Matches(match.Value, @"([""'])(?:(?=(\\?))\2.)*?\1", RegexOptions.Singleline | RegexOptions.IgnoreCase | RegexOptions.Multiline | RegexOptions.IgnorePatternWhitespace); foreach (Match stringValue in values) { strBuffer = strBuffer.Replace(stringValue.Value, stringValue.Value.ToEnglishNumber()); } } MatchCollection styles = 
Regex.Matches(strBuffer, "<style[^>]*>(.*?)</style>", RegexOptions.Singleline | RegexOptions.IgnoreCase | RegexOptions.Multiline | RegexOptions.IgnorePatternWhitespace); foreach (Match match in styles) { strBuffer = strBuffer.Replace(match.Value, match.Value.ToEnglishNumber()); } byte[] data = Encoding.UTF8.GetBytes(strBuffer); _memoryStream.Write(data, offset, count); _stream.Write(data, offset, count); } public override string ToString() { return Encoding.UTF8.GetString(_memoryStream.ToArray()); } #region Rest of the overrides public override bool CanRead { get { throw new NotImplementedException(); } } public override bool CanSeek { get { throw new NotImplementedException(); } } public override bool CanWrite { get { throw new NotImplementedException(); } } public override long Seek(long offset, SeekOrigin origin) { throw new NotImplementedException(); } public override void SetLength(long value) { throw new NotImplementedException(); } public override long Length { get { throw new NotImplementedException(); } } public override long Position { get { throw new NotImplementedException(); } set { throw new NotImplementedException(); } } #endregion } } It works well, but It converts all numbers in css and scripts files to Persian and it causes error.
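
    Not part of the question, but one hedged suggestion for the closing problem: the response filter can be attached only for page requests, so .css and .js responses pass through untouched; for example, by checking the requested extension before installing the StreamWatcher.

        private void ContextBeginRequest(object sender, EventArgs e)
        {
            var context = sender as HttpApplication;
            if (context == null) return;

            // skip static resources so scripts and stylesheets are never rewritten
            string ext = System.IO.Path.GetExtension(context.Request.Path).ToLowerInvariant();
            if (ext == ".css" || ext == ".js") return;

            _watcher = new StreamWatcher(context.Response.Filter);
            context.Response.Filter = _watcher;
        }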

    Read the article

  • How to read a binary column in a database into an image on an ASP.NET page?

    - by marko
    I want to read from a database where I've stored an image in a binary field, and display the image. while (reader.Read()) { byte[] imgarr = (byte[])reader["file"]; Stream s = new MemoryStream(imgarr); System.Drawing.Image image = System.Drawing.Image.FromStream(s); Graphics g = Graphics.FromImage(image); g.DrawImage(image, new Point(400, 10)); image.Save(Response.OutputStream, ImageFormat.Jpeg); g.Dispose(); image.Dispose(); } con.Close(); This piece of code doesn't work: System.Drawing.Image image = System.Drawing.Image.FromStream(s); What am I doing wrong?
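
    A hedged aside: if the goal is only to show the stored image, the raw bytes can be written straight to the response (typically from a dedicated handler or page) without going through Graphics at all. This assumes the column holds JPEG data.

        byte[] imgarr = (byte[])reader["file"];
        Response.Clear();
        Response.ContentType = "image/jpeg";   // adjust if the stored format differs
        Response.BinaryWrite(imgarr);
        Response.End();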

    Read the article

  • error while converting the byte array to image, after modifying the byte array and the byte array is

    - by geehta
    Hi, this is my code. Here I have formed the byte array of an image, and I am trying to add some value (say 10) to this byte array, taking care that the value does not exceed 255. Later, when I try to redraw the image via the following code, I get an error at the indicated line... what can be the problem? Without the modification the image draws fine, but if I change some values it does not draw. public Image btoi(byte[] bt) { ms = new MemoryStream(bt, 0, bt.Length); img = Image.FromStream (ms, true); // error at this line ms.Close(); return img; }
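
    One hedged explanation, not confirmed by the post: the byte array here is the encoded file (JPEG/PNG), so adding 10 to those bytes corrupts the format, which would explain why Image.FromStream works before the change and fails after it. If the intent is to brighten pixels, decoding first and adjusting pixel values is safer; a rough sketch:

        using (var ms = new MemoryStream(bt))
        using (var bmp = new Bitmap(ms))
        {
            for (int y = 0; y < bmp.Height; y++)
            {
                for (int x = 0; x < bmp.Width; x++)
                {
                    Color c = bmp.GetPixel(x, y);
                    int r = Math.Min(255, c.R + 10);   // clamp instead of overflowing
                    int g = Math.Min(255, c.G + 10);
                    int b = Math.Min(255, c.B + 10);
                    bmp.SetPixel(x, y, Color.FromArgb(c.A, r, g, b));
                }
            }
            using (var outStream = new MemoryStream())
            {
                bmp.Save(outStream, System.Drawing.Imaging.ImageFormat.Png);
                byte[] modified = outStream.ToArray();   // re-encoded, valid image bytes
            }
        }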

    Read the article

  • Pass a datatable model to controller action in MVC

    - by user2906420
    Code: @model System.Data.DataTable <button type="submit" class="btn btn-primary" data-dismiss="modal" aria-hidden="true"> Download</button> [HttpPost] public void getCSV(DataTable dt) { MemoryStream stream = Export.GetCSV(dt); var filename = "ExampleCSV.csv"; var contenttype = "text/csv"; Response.Clear(); Response.ContentType = contenttype; Response.AddHeader("content-disposition", "attachment;filename=" + filename); Response.Cache.SetCacheability(HttpCacheability.NoCache); Response.BinaryWrite(stream.ToArray()); Response.End(); } I want to allow the user to export a CSV file when the button is clicked. The DataTable should be passed to the getCSV method. Can someone please help me? Thanks. I don't mind using TempData to store the DataTable and then accessing it in the controller.
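
    A hedged sketch of one alternative: rather than writing to Response by hand in a void action, the action can return a FileContentResult, and if the DataTable cannot be model-bound from the page it can be parked in TempData by the action that builds it. The TempData key and action name below are illustrative.

        [HttpPost]
        public ActionResult GetCsv()
        {
            var dt = TempData["ReportTable"] as DataTable;   // stored earlier: TempData["ReportTable"] = dt;
            if (dt == null) return HttpNotFound();

            MemoryStream stream = Export.GetCSV(dt);         // helper from the question
            return File(stream.ToArray(), "text/csv", "ExampleCSV.csv");
        }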

    Read the article

  • I have a custom type which I want to serialize; this custom type accepts input which might consist

    - by starz26
    I have a custom type which I want to serialize; this custom type accepts input which might consist of escape chars. M1_Serilizer(MyCustomType customTypeObj) {XmlSerializer serializer = new XmlSerializer(typeof(MyCustomType)); StringWriter sw = new StringWriter(CultureInfo.InvariantCulture); serializer.Serialize(sw, customTypeObj); string str= sw.ToString(); M2_Deserializer(str); } M2_Deserializer(string str) { XmlSerializer serializer = new XmlSerializer(typeof(MyCustomType)); StringReader sr = new StringReader(str); MyCustomType customTypeObj = (MyCustomType)serializer.Deserialize(sr); } When escape-type chars are part of customTypeObj, deserialization throws an exception. Q1) How do I overcome this? Q2) I want to use StringReader and StringWriter and not MemoryStream or other approaches; StringWriter/Reader will only serve my internal functionality. Q3) How can these escape chars be handled?
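
    A hedged observation, not from the question: XmlSerializer escapes markup characters such as < and & by itself, so a failure like this usually points at characters that are illegal in XML 1.0 altogether (for example control characters below 0x20). On .NET 4.0+ the input can be scrubbed with XmlConvert.IsXmlChar before serializing, roughly:

        // requires System.Text and System.Xml; drops characters XML 1.0 cannot represent
        static string RemoveInvalidXmlChars(string text)
        {
            if (string.IsNullOrEmpty(text)) return text;
            var sb = new StringBuilder(text.Length);
            foreach (char c in text)
            {
                if (XmlConvert.IsXmlChar(c))
                    sb.Append(c);
            }
            return sb.ToString();
        }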

    Read the article

  • ASP.NET Frameworks and Raw Throughput Performance

    - by Rick Strahl
    A few days ago I had a curious thought: With all these different technologies that the ASP.NET stack has to offer, what's the most efficient technology overall to return data for a server request? When I started this it was mere curiosity rather than a real practical need or result. Different tools are used for different problems and so performance differences are to be expected. But still I was curious to see how the various technologies performed relative to each just for raw throughput of the request getting to the endpoint and back out to the client with as little processing in the actual endpoint logic as possible (aka Hello World!). I want to clarify that this is merely an informal test for my own curiosity and I'm sharing the results and process here because I thought it was interesting. It's been a long while since I've done any sort of perf testing on ASP.NET, mainly because I've not had extremely heavy load requirements and because overall ASP.NET performs very well even for fairly high loads so that often it's not that critical to test load performance. This post is not meant to make a point  or even come to a conclusion which tech is better, but just to act as a reference to help understand some of the differences in perf and give a starting point to play around with this yourself. I've included the code for this simple project, so you can play with it and maybe add a few additional tests for different things if you like. Source Code on GitHub I looked at this data for these technologies: ASP.NET Web API ASP.NET MVC WebForms ASP.NET WebPages ASMX AJAX Services  (couldn't get AJAX/JSON to run on IIS8 ) WCF Rest Raw ASP.NET HttpHandlers It's quite a mixed bag, of course and the technologies target different types of development. What started out as mere curiosity turned into a bit of a head scratcher as the results were sometimes surprising. What I describe here is more to satisfy my curiosity more than anything and I thought it interesting enough to discuss on the blog :-) First test: Raw Throughput The first thing I did is test raw throughput for the various technologies. This is the least practical test of course since you're unlikely to ever create the equivalent of a 'Hello World' request in a real life application. The idea here is to measure how much time a 'NOP' request takes to return data to the client. So for this request I create the simplest Hello World request that I could come up for each tech. Http Handler The first is the lowest level approach which is an HTTP handler. public class Handler : IHttpHandler { public void ProcessRequest(HttpContext context) { context.Response.ContentType = "text/plain"; context.Response.Write("Hello World. Time is: " + DateTime.Now.ToString()); } public bool IsReusable { get { return true; } } } WebForms Next I added a couple of ASPX pages - one using CodeBehind and one using only a markup page. The CodeBehind page simple does this in CodeBehind without any markup in the ASPX page: public partial class HelloWorld_CodeBehind : System.Web.UI.Page { protected void Page_Load(object sender, EventArgs e) { Response.Write("Hello World. Time is: " + DateTime.Now.ToString() ); Response.End(); } } while the Markup page only contains some static output via an expression:<%@ Page Language="C#" AutoEventWireup="false" CodeBehind="HelloWorld_Markup.aspx.cs" Inherits="AspNetFrameworksPerformance.HelloWorld_Markup" %> Hello World. Time is <%= DateTime.Now %> ASP.NET WebPages WebPages is the freestanding Razor implementation of ASP.NET. 
Here's the simple HelloWorld.cshtml page:Hello World @DateTime.Now WCF REST WCF REST was the token REST implementation for ASP.NET before WebAPI and the inbetween step from ASP.NET AJAX. I'd like to forget that this technology was ever considered for production use, but I'll include it here. Here's an OperationContract class: [ServiceContract(Namespace = "")] [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)] public class WcfService { [OperationContract] [WebGet] public Stream HelloWorld() { var data = Encoding.Unicode.GetBytes("Hello World" + DateTime.Now.ToString()); var ms = new MemoryStream(data); // Add your operation implementation here return ms; } } WCF REST can return arbitrary results by returning a Stream object and a content type. The code above turns the string result into a stream and returns that back to the client. ASP.NET AJAX (ASMX Services) I also wanted to test ASP.NET AJAX services because prior to WebAPI this is probably still the most widely used AJAX technology for the ASP.NET stack today. Unfortunately I was completely unable to get this running on my Windows 8 machine. Visual Studio 2012  removed adding of ASP.NET AJAX services, and when I tried to manually add the service and configure the script handler references it simply did not work - I always got a SOAP response for GET and POST operations. No matter what I tried I always ended up getting XML results even when explicitly adding the ScriptHandler. So, I didn't test this (but the code is there - you might be able to test this on a Windows 7 box). ASP.NET MVC Next up is probably the most popular ASP.NET technology at the moment: MVC. Here's the small controller: public class MvcPerformanceController : Controller { public ActionResult Index() { return View(); } public ActionResult HelloWorldCode() { return new ContentResult() { Content = "Hello World. Time is: " + DateTime.Now.ToString() }; } } ASP.NET WebAPI Next up is WebAPI which looks kind of similar to MVC. Except here I have to use a StringContent result to return the response: public class WebApiPerformanceController : ApiController { [HttpGet] public HttpResponseMessage HelloWorldCode() { return new HttpResponseMessage() { Content = new StringContent("Hello World. Time is: " + DateTime.Now.ToString(), Encoding.UTF8, "text/plain") }; } } Testing Take a minute to think about each of the technologies… and take a guess which you think is most efficient in raw throughput. The fastest should be pretty obvious, but the others - maybe not so much. The testing I did is pretty informal since it was mainly to satisfy my curiosity - here's how I did this: I used Apache Bench (ab.exe) from a full Apache HTTP installation to run and log the test results of hitting the server. ab.exe is a small executable that lets you hit a URL repeatedly and provides counter information about the number of requests, requests per second etc. ab.exe and the batch file are located in the \LoadTests folder of the project. An ab.exe command line  looks like this: ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorld which hits the specified URL 100,000 times with a load factor of 20 concurrent requests. This results in output like this:   It's a great way to get a quick and dirty performance summary. Run it a few times to make sure there's not a large amount of varience. You might also want to do an IISRESET to clear the Web Server. 
Just make sure you do a short test run to warm up the server first - otherwise your first run is likely to be skewed downwards. ab.exe also allows you to specify headers and provide POST data and many other things if you want to get a little more fancy. Here all tests are GET requests to keep it simple. I ran each test: 100,000 iterations Load factor of 20 concurrent connections IISReset before starting A short warm up run for API and MVC to make sure startup cost is mitigated Here is the batch file I used for the test: IISRESET REM make sure you add REM C:\Program Files (x86)\Apache Software Foundation\Apache2.2\bin REM to your path so ab.exe can be found REM Warm up ab.exe -n100 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldJsonab.exe -n100 -c20 http://localhost/aspnetperf/api/HelloWorldJson ab.exe -n100 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorld ab.exe -n100000 -c20 http://localhost/aspnetperf/handler.ashx > handler.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/HelloWorld_CodeBehind.aspx > AspxCodeBehind.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/HelloWorld_Markup.aspx > AspxMarkup.txt ab.exe -n100000 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorld > Wcf.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldCode > Mvc.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorld > WebApi.txt I ran each of these tests 3 times and took the average score for Requests/second, with the machine otherwise idle. I did see a bit of variance when running many tests but the values used here are the medians. Part of this has to do with the fact I ran the tests on my local machine - result would probably more consistent running the load test on a separate machine hitting across the network. I ran these tests locally on my laptop which is a Dell XPS with quad core Sandibridge I7-2720QM @ 2.20ghz and a fast SSD drive on Windows 8. CPU load during tests ran to about 70% max across all 4 cores (IOW, it wasn't overloading the machine). Ideally you can try running these tests on a separate machine hitting the local machine. If I remember correctly IIS 7 and 8 on client OSs don't throttle so the performance here should be Results Ok, let's cut straight to the chase. Below are the results from the tests… It's not surprising that the handler was fastest. But it was a bit surprising to me that the next fastest was WebForms and especially Web Forms with markup over a CodeBehind page. WebPages also fared fairly well. MVC and WebAPI are a little slower and the slowest by far is WCF REST (which again I find surprising). As mentioned at the start the raw throughput tests are not overly practical as they don't test scripting performance for the HTML generation engines or serialization performances of the data engines. All it really does is give you an idea of the raw throughput for the technology from time of request to reaching the endpoint and returning minimal text data back to the client which indicates full round trip performance. But it's still interesting to see that Web Forms performs better in throughput than either MVC, WebAPI or WebPages. It'd be interesting to try this with a few pages that actually have some parsing logic on it, but that's beyond the scope of this throughput test. But what's also amazing about this test is the sheer amount of traffic that a laptop computer is handling. 
Even the slowest tech managed 5700 requests a second, which is one hell of a lot of requests if you extrapolate that out over a 24 hour period. Remember these are not static pages, but dynamic requests that are being served. Another test - JSON Data Service Results The second test I used a JSON result from several of the technologies. I didn't bother running WebForms and WebPages through this test since that doesn't make a ton of sense to return data from the them (OTOH, returning text from the APIs didn't make a ton of sense either :-) In these tests I have a small Person class that gets serialized and then returned to the client. The Person class looks like this: public class Person { public Person() { Id = 10; Name = "Rick"; Entered = DateTime.Now; } public int Id { get; set; } public string Name { get; set; } public DateTime Entered { get; set; } } Here are the updated handler classes that use Person: Handler public class Handler : IHttpHandler { public void ProcessRequest(HttpContext context) { var action = context.Request.QueryString["action"]; if (action == "json") JsonRequest(context); else TextRequest(context); } public void TextRequest(HttpContext context) { context.Response.ContentType = "text/plain"; context.Response.Write("Hello World. Time is: " + DateTime.Now.ToString()); } public void JsonRequest(HttpContext context) { var json = JsonConvert.SerializeObject(new Person(), Formatting.None); context.Response.ContentType = "application/json"; context.Response.Write(json); } public bool IsReusable { get { return true; } } } This code adds a little logic to check for a action query string and route the request to an optional JSON result method. To generate JSON, I'm using the same JSON.NET serializer (JsonConvert.SerializeObject) used in Web API to create the JSON response. WCF REST   [ServiceContract(Namespace = "")] [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)] public class WcfService { [OperationContract] [WebGet] public Stream HelloWorld() { var data = Encoding.Unicode.GetBytes("Hello World " + DateTime.Now.ToString()); var ms = new MemoryStream(data); // Add your operation implementation here return ms; } [OperationContract] [WebGet(ResponseFormat=WebMessageFormat.Json,BodyStyle=WebMessageBodyStyle.WrappedRequest)] public Person HelloWorldJson() { // Add your operation implementation here return new Person(); } } For WCF REST all I have to do is add a method with the Person result type.   ASP.NET MVC public class MvcPerformanceController : Controller { // // GET: /MvcPerformance/ public ActionResult Index() { return View(); } public ActionResult HelloWorldCode() { return new ContentResult() { Content = "Hello World. Time is: " + DateTime.Now.ToString() }; } public JsonResult HelloWorldJson() { return Json(new Person(), JsonRequestBehavior.AllowGet); } } For MVC all I have to do for a JSON response is return a JSON result. ASP.NET internally uses JavaScriptSerializer. ASP.NET WebAPI public class WebApiPerformanceController : ApiController { [HttpGet] public HttpResponseMessage HelloWorldCode() { return new HttpResponseMessage() { Content = new StringContent("Hello World. 
Time is: " + DateTime.Now.ToString(), Encoding.UTF8, "text/plain") }; } [HttpGet] public Person HelloWorldJson() { return new Person(); } [HttpGet] public HttpResponseMessage HelloWorldJson2() { var response = new HttpResponseMessage(HttpStatusCode.OK); response.Content = new ObjectContent<Person>(new Person(), GlobalConfiguration.Configuration.Formatters.JsonFormatter); return response; } } Testing and Results To run these data requests I used the following ab.exe commands:REM JSON RESPONSES ab.exe -n100000 -c20 http://localhost/aspnetperf/Handler.ashx?action=json > HandlerJson.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldJson > MvcJson.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorldJson > WebApiJson.txt ab.exe -n100000 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorldJson > WcfJson.txt The results from this test run are a bit interesting in that the WebAPI test improved performance significantly over returning plain string content. Here are the results:   The performance for each technology drops a little bit except for WebAPI which is up quite a bit! From this test it appears that WebAPI is actually significantly better performing returning a JSON response, rather than a plain string response. Snag with Apache Benchmark and 'Length Failures' I ran into a little snag with Apache Benchmark, which was reporting failures for my Web API requests when serializing. As the graph shows performance improved significantly from with JSON results from 5580 to 6530 or so which is a 15% improvement (while all others slowed down by 3-8%). However, I was skeptical at first because the WebAPI test reports showed a bunch of errors on about 10% of the requests. Check out this report: Notice the Failed Request count. What the hey? Is WebAPI failing on roughly 10% of requests when sending JSON? Turns out: No it's not! But it took some sleuthing to figure out why it reports these failures. At first I thought that Web API was failing, and so to make sure I re-ran the test with Fiddler attached and runiisning the ab.exe test by using the -X switch: ab.exe -n100 -c10 -X localhost:8888 http://localhost/aspnetperf/api/HelloWorldJson which showed that indeed all requests where returning proper HTTP 200 results with full content. However ab.exe was reporting the errors. After some closer inspection it turned out that the dates varying in size altered the response length in dynamic output. For example: these two results: {"Id":10,"Name":"Rick","Entered":"2012-09-04T10:57:24.841926-10:00"} {"Id":10,"Name":"Rick","Entered":"2012-09-04T10:57:24.8519262-10:00"} are different in length for the number which results in 68 and 69 bytes respectively. The same URL produces different result lengths which is what ab.exe reports. I didn't notice at first bit the same is happening when running the ASHX handler with JSON.NET result since it uses the same serializer that varies the milliseconds. Moral: You can typically ignore Length failures in Apache Benchmark and when in doubt check the actual output with Fiddler. Note that the other failure values are accurate though. Another interesting Side Note: Perf drops over Time As I was running these tests repeatedly I was finding that performance steadily dropped from a startup peak to a 10-15% lower stable level. IOW, with Web API I'd start out with around 6500 req/sec and in subsequent runs it keeps dropping until it would stabalize somewhere around 5900 req/sec occasionally jumping lower. 
For these tests this is why I did the IIS RESET and warm up for individual tests. This is a little puzzling. Looking at Process Monitor while the test are running memory very quickly levels out as do handles and threads, on the first test run. Subsequent runs everything stays stable, but the performance starts going downwards. This applies to all the technologies - Handlers, Web Forms, MVC, Web API - curious to see if others test this and see similar results. Doing an IISRESET then resets everything and performance starts off at peak again… Summary As I stated at the outset, these were informal to satiate my curiosity not to prove that any technology is better or even faster than another. While there clearly are differences in performance the differences (other than WCF REST which was by far the slowest and the raw handler which was by far the highest) are relatively minor, so there is no need to feel that any one technology is a runaway standout in raw performance. Choosing a technology is about more than pure performance but also about the adequateness for the job and the easy of implementation. The strengths of each technology will make for any minor performance difference we see in these tests. However, to me it's important to get an occasional reality check and compare where new technologies are heading. Often times old stuff that's been optimized and designed for a time of less horse power can utterly blow the doors off newer tech and simple checks like this let you compare. Luckily we're seeing that much of the new stuff performs well even in V1.0 which is great. To me it was very interesting to see Web API perform relatively badly with plain string content, which originally led me to think that Web API might not be properly optimized just yet. For those that caught my Tweets late last week regarding WebAPI's slow responses was with String content which is in fact considerably slower. Luckily where it counts with serialized JSON and XML WebAPI actually performs better. But I do wonder what would make generic string content slower than serialized code? This stresses another point: Don't take a single test as the final gospel and don't extrapolate out from a single set of tests. Certainly Twitter can make you feel like a fool when you post something immediate that hasn't been fleshed out a little more <blush>. Egg on my face. As a result I ended up screwing around with this for a few hours today to compare different scenarios. Well worth the time… I hope you found this useful, if not for the results, maybe for the process of quickly testing a few requests for performance and charting out a comparison. Now onwards with more serious stuff… Resources Source Code on GitHub Apache HTTP Server Project (ab.exe is part of the binary distribution)© Rick Strahl, West Wind Technologies, 2005-2012Posted in ASP.NET  Web Api   Tweet !function(d,s,id){var js,fjs=d.getElementsByTagName(s)[0];if(!d.getElementById(id)){js=d.createElement(s);js.id=id;js.src="//platform.twitter.com/widgets.js";fjs.parentNode.insertBefore(js,fjs);}}(document,"script","twitter-wjs"); (function() { var po = document.createElement('script'); po.type = 'text/javascript'; po.async = true; po.src = 'https://apis.google.com/js/plusone.js'; var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(po, s); })();

    Read the article

  • A Simple Entity Tagger

    - by Elton Stoneman
    In the REST world, ETags are your gateway to performance boosts by letting clients cache responses. In the non-REST world, you may also want to add an ETag to an entity definition inside a traditional service contract – think of a scenario where a consumer persists its own representation of your entity, and wants to keep it in sync. Rather than load every entity by ID and check for changes, the consumer can send in a set of linked IDs and ETags, and you can return only the entities where the current ETag is different from the consumer’s version.  If your entity is a projection from various sources, you may not have a persistent ETag, so you need an efficient way to generate an ETag which is deterministic, so an entity with the same state always generates the same ETag. I have an implementation for a generic ETag generator on GitHub here: EntityTagger code sample. The essence is simple - we get the entity, serialize it and build a hash from the serialized value. Any changes to either the state or the structure of the entity will result in a different hash. To use it, just call SetETag, passing your populated object and a Func<> which acts as an accessor to the ETag property: EntityTagger.SetETag(user, x => x.ETag); The implementation is all in at 80 lines of code, which is all pretty straightforward: var eTagProperty = AsPropertyInfo(eTagPropertyAccessor); var originalETag = eTagProperty.GetValue(entity, null); try { ResetETag(entity, eTagPropertyAccessor); string json; var serializer = new DataContractJsonSerializer(entity.GetType()); using (var stream = new MemoryStream()) { serializer.WriteObject(stream, entity); json = Encoding.UTF8.GetString(stream.GetBuffer(), 0, (int)stream.Length); } var guid = GetDeterministicGuid(json); eTagProperty.SetValue(entity, guid.ToString(), null); //... There are a couple of helper methods to check if the object has changed since the ETag value was last set, and to reset the ETag. This implementation uses JSON to do the serializing rather than XML. Benefit - should be marginally more efficient as your hashing a much smaller serialized string; downside, JSON doesn't include namespaces or class names at the root level, so if you have two classes with the exact same structure but different names, then instances which have the same content will have the same ETag. You may want that behaviour, but change to use the XML DataContractSerializer if you think that will be an issue. If you can persist the ETag somewhere, it will save you server processing to load up the entity, but that will only apply to scenarios where you can reliably invalidate your ETag (e.g. if you control all the entry points where entity contents can be updated, then you can calculate and persist the new ETag with each update).
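
    The GetDeterministicGuid helper referenced above lives in the linked sample; purely as a hedged illustration of the idea (not necessarily the author's implementation), a hash-based version can look like this, since MD5 produces exactly the 16 bytes a Guid needs:

        static Guid GetDeterministicGuid(string input)
        {
            using (var md5 = System.Security.Cryptography.MD5.Create())
            {
                byte[] hash = md5.ComputeHash(System.Text.Encoding.UTF8.GetBytes(input));
                return new Guid(hash);   // 16-byte hash -> 16-byte Guid
            }
        }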

    Read the article

  • DataSet.ReadXml() - Invalid character in the given encoding

    - by NLV
    Hello all I have saved a dataset in the sql database in an xml column using the following code. XmlDataDocument dd = new XmlDataDocument(dataset); and passing this xml document as sql parameter using param.value = new XmlNodeReader(dd); The XML is like <NewDataSet><SubContractChangeOrders><AGCol>1</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>006</Contract_x0020_Number><ContractID>30</ContractID><ChangeOrderID>211</ChangeOrderID><Amount>0.0000</Amount><Udf_CostReimbursableFlag>false</Udf_CostReimbursableFlag><Udf_CustomerCode /><Udf_SubChangeOrderStatus /></SubContractChangeOrders><SubContractChangeOrders><AGCol>2</AGCol><SCO_x0020_Number>002</SCO_x0020_Number><Contract_x0020_Number>006</Contract_x0020_Number><ContractID>30</ContractID><ChangeOrderID>212</ChangeOrderID><Amount>0.0000</Amount><Udf_CostReimbursableFlag>false</Udf_CostReimbursableFlag></SubContractChangeOrders><SubContractChangeOrders><AGCol>3</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>111</Contract_x0020_Number><ContractID>87</ContractID><ChangeOrderID>12</ChangeOrderID><Amount>300.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>4</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>222</Contract_x0020_Number><ContractID>80</ContractID><ChangeOrderID>6</ChangeOrderID><Amount>100.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>5</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>777</Contract_x0020_Number><ContractID>79</ContractID><ChangeOrderID>5</ChangeOrderID><Amount>200.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>6</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>786</Contract_x0020_Number><ContractID>77</ContractID><ChangeOrderID>3</ChangeOrderID><Amount>100.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>7</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>787</Contract_x0020_Number><ContractID>78</ContractID><ChangeOrderID>4</ChangeOrderID><Amount>500.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>8</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 009</Contract_x0020_Number><ContractID>219</ContractID><ChangeOrderID>78</ChangeOrderID><Amount>9000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>9</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 010</Contract_x0020_Number><ContractID>220</ContractID><ChangeOrderID>79</ChangeOrderID><Amount>13000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>10</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 012</Contract_x0020_Number><ContractID>222</ContractID><ChangeOrderID>83</ChangeOrderID><Amount>2300.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>11</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 020</Contract_x0020_Number><ContractID>226</ContractID><ChangeOrderID>86</ChangeOrderID><Amount>5400.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>12</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con 
021</Contract_x0020_Number><ContractID>227</ContractID><ChangeOrderID>87</ChangeOrderID><Amount>2300.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>13</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con001</Contract_x0020_Number><ContractID>208</ContractID><ChangeOrderID>72</ChangeOrderID><Amount>3000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>14</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con002</Contract_x0020_Number><ContractID>209</ContractID><ChangeOrderID>73</ChangeOrderID><Amount>400.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>15</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con003</Contract_x0020_Number><ContractID>210</ContractID><ChangeOrderID>74</ChangeOrderID><Amount>6000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>16</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con004</Contract_x0020_Number><ContractID>211</ContractID><ChangeOrderID>75</ChangeOrderID><Amount>9000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>17</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Con005</Contract_x0020_Number><ContractID>213</ContractID><ChangeOrderID>76</ChangeOrderID><Amount>17000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>18</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>Cont001</Contract_x0020_Number><ContractID>228</ContractID><ChangeOrderID>89</ChangeOrderID><Amount>2000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>19</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>PUR001</Contract_x0020_Number><ContractID>229</ContractID><ChangeOrderID>88</ChangeOrderID><Amount>1000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>20</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>PUR002</Contract_x0020_Number><ContractID>230</ContractID><ChangeOrderID>90</ChangeOrderID><Amount>3000.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>21</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>SC-002</Contract_x0020_Number><ContractID>2</ContractID><ChangeOrderID>7</ChangeOrderID><Amount>200.0000</Amount></SubContractChangeOrders><SubContractChangeOrders><AGCol>22</AGCol><SCO_x0020_Number>001</SCO_x0020_Number><Contract_x0020_Number>SC-004</Contract_x0020_Number><ContractID>7</ContractID><ChangeOrderID>65</ChangeOrderID><Amount>1000.0000</Amount></SubContractChangeOrders></NewDataSet> I'm trying to read it back as follows using (SqlConnection con = new SqlConnection("Server=#####;Initial Catalog=#####;User ID=####;Pwd=########")) { using (SqlCommand com = new SqlCommand("select * from dbo.tbl_#####", con)) { using (SqlDataAdapter ada = new SqlDataAdapter(com)) { ada.Fill(dt); MemoryStream ms = new MemoryStream(); object contractXML1 = dt.Rows[0]["SCOXML1"]; System.Runtime.Serialization.Formatters.Binary.BinaryFormatter bf = new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter(); bf.Serialize(ms, contractXML1); ms.Seek(0, SeekOrigin.Begin); ds.ReadXml(ms); } } } I'm getting the following error Data at the root level is invalid. Line 1, position 6. Any ideas? NLV
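
    A hedged guess at the failure, not from the post: running the retrieved value through BinaryFormatter prepends binary serialization headers to the XML text, which is a plausible reason DataSet.ReadXml complains at line 1, position 6. If the xml column already comes back as a string, loading it directly tends to be enough, along these lines:

        string xml = dt.Rows[0]["SCOXML1"].ToString();   // the xml column as text
        using (var reader = new StringReader(xml))
        {
            ds.ReadXml(reader);                          // no BinaryFormatter involved
        }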

    Read the article

  • .NET file Decryption - Bad Data

    - by Jon
    I am in the process of rewriting an old application. The old app stored data in a scoreboard file that was encrypted with the following code: private const String SSecretKey = @"?B?n?Mj?"; public DataTable GetScoreboardFromFile() { FileInfo f = new FileInfo(scoreBoardLocation); if (!f.Exists) { return setupNewScoreBoard(); } DESCryptoServiceProvider DES = new DESCryptoServiceProvider(); //A 64 bit key and IV is required for this provider. //Set secret key For DES algorithm. DES.Key = ASCIIEncoding.ASCII.GetBytes(SSecretKey); //Set initialization vector. DES.IV = ASCIIEncoding.ASCII.GetBytes(SSecretKey); //Create a file stream to read the encrypted file back. FileStream fsread = new FileStream(scoreBoardLocation, FileMode.Open, FileAccess.Read); //Create a DES decryptor from the DES instance. ICryptoTransform desdecrypt = DES.CreateDecryptor(); //Create crypto stream set to read and do a //DES decryption transform on incoming bytes. CryptoStream cryptostreamDecr = new CryptoStream(fsread, desdecrypt, CryptoStreamMode.Read); DataTable dTable = new DataTable("scoreboard"); dTable.ReadXml(new StreamReader(cryptostreamDecr)); cryptostreamDecr.Close(); fsread.Close(); return dTable; } This works fine. I have copied the code into my new app so that I can create a legacy loader and convert the data into the new format. The problem is I get a "Bad Data" error: System.Security.Cryptography.CryptographicException was unhandled Message="Bad Data.\r\n" Source="mscorlib" The error fires at this line: dTable.ReadXml(new StreamReader(cryptostreamDecr)); The encrypted file was created today on the same machine with the old code. I guess that maybe the encryption / decryption process uses the application name / file or something and therefore means I can not open it. Does anyone have an idea as to: A) Be able explain why this isn't working? B) Offer a solution that would allow me to be able to open files that were created with the legacy application and be able to convert them please? Here is the whole class that deals with loading and saving the scoreboard: using System; using System.Collections.Generic; using System.Text; using System.Security.Cryptography; using System.Runtime.InteropServices; using System.IO; using System.Data; using System.Xml; using System.Threading; namespace JawBreaker { [Serializable] class ScoreBoardLoader { private Jawbreaker jawbreaker; private String sSecretKey = @"?B?n?Mj?"; private String scoreBoardFileLocation = ""; private bool keepScoreBoardUpdated = true; private int intTimer = 180000; public ScoreBoardLoader(Jawbreaker jawbreaker, String scoreBoardFileLocation) { this.jawbreaker = jawbreaker; this.scoreBoardFileLocation = scoreBoardFileLocation; } // Call this function to remove the key from memory after use for security [System.Runtime.InteropServices.DllImport("KERNEL32.DLL", EntryPoint = "RtlZeroMemory")] public static extern bool ZeroMemory(IntPtr Destination, int Length); // Function to Generate a 64 bits Key. private string GenerateKey() { // Create an instance of Symetric Algorithm. Key and IV is generated automatically. DESCryptoServiceProvider desCrypto = (DESCryptoServiceProvider)DESCryptoServiceProvider.Create(); // Use the Automatically generated key for Encryption. return ASCIIEncoding.ASCII.GetString(desCrypto.Key); } public void writeScoreboardToFile() { DataTable tempScoreBoard = getScoreboardFromFile(); //add in the new scores to the end of the file. 
for (int i = 0; i < jawbreaker.Scoreboard.Rows.Count; i++) { DataRow row = tempScoreBoard.NewRow(); row.ItemArray = jawbreaker.Scoreboard.Rows[i].ItemArray; tempScoreBoard.Rows.Add(row); } //before it is written back to the file make sure we update the sync info if (jawbreaker.SyncScoreboard) { //connect to webservice, login and update all the scores that have not been synced. for (int i = 0; i < tempScoreBoard.Rows.Count; i++) { try { //check to see if that row has been synced to the server if (!Boolean.Parse(tempScoreBoard.Rows[i].ItemArray[7].ToString())) { //sync info to server //update the row to say that it has been updated object[] tempArray = tempScoreBoard.Rows[i].ItemArray; tempArray[7] = true; tempScoreBoard.Rows[i].ItemArray = tempArray; tempScoreBoard.AcceptChanges(); } } catch (Exception ex) { jawbreaker.writeErrorToLog("ERROR OCCURED DURING SYNC TO SERVER UPDATE: " + ex.Message); } } } FileStream fsEncrypted = new FileStream(scoreBoardFileLocation, FileMode.Create, FileAccess.Write); DESCryptoServiceProvider DES = new DESCryptoServiceProvider(); DES.Key = ASCIIEncoding.ASCII.GetBytes(sSecretKey); DES.IV = ASCIIEncoding.ASCII.GetBytes(sSecretKey); ICryptoTransform desencrypt = DES.CreateEncryptor(); CryptoStream cryptostream = new CryptoStream(fsEncrypted, desencrypt, CryptoStreamMode.Write); MemoryStream ms = new MemoryStream(); tempScoreBoard.WriteXml(ms, XmlWriteMode.WriteSchema); ms.Position = 0; byte[] bitarray = new byte[ms.Length]; ms.Read(bitarray, 0, bitarray.Length); cryptostream.Write(bitarray, 0, bitarray.Length); cryptostream.Close(); ms.Close(); //now the scores have been added to the file remove them from the datatable jawbreaker.Scoreboard.Rows.Clear(); } public void startPeriodicScoreboardWriteToFile() { while (keepScoreBoardUpdated) { //three minute sleep. Thread.Sleep(intTimer); writeScoreboardToFile(); } } public void stopPeriodicScoreboardWriteToFile() { keepScoreBoardUpdated = false; } public int IntTimer { get { return intTimer; } set { intTimer = value; } } public DataTable getScoreboardFromFile() { FileInfo f = new FileInfo(scoreBoardFileLocation); if (!f.Exists) { jawbreaker.writeInfoToLog("Scoreboard not there so creating new one"); return setupNewScoreBoard(); } else { DESCryptoServiceProvider DES = new DESCryptoServiceProvider(); //A 64 bit key and IV is required for this provider. //Set secret key For DES algorithm. DES.Key = ASCIIEncoding.ASCII.GetBytes(sSecretKey); //Set initialization vector. DES.IV = ASCIIEncoding.ASCII.GetBytes(sSecretKey); //Create a file stream to read the encrypted file back. FileStream fsread = new FileStream(scoreBoardFileLocation, FileMode.Open, FileAccess.Read); //Create a DES decryptor from the DES instance. ICryptoTransform desdecrypt = DES.CreateDecryptor(); //Create crypto stream set to read and do a //DES decryption transform on incoming bytes. 
CryptoStream cryptostreamDecr = new CryptoStream(fsread, desdecrypt, CryptoStreamMode.Read); DataTable dTable = new DataTable("scoreboard"); dTable.ReadXml(new StreamReader(cryptostreamDecr)); cryptostreamDecr.Close(); fsread.Close(); return dTable; } } public DataTable setupNewScoreBoard() { //scoreboard info into dataset DataTable scoreboard = new DataTable("scoreboard"); scoreboard.Columns.Add(new DataColumn("playername", System.Type.GetType("System.String"))); scoreboard.Columns.Add(new DataColumn("score", System.Type.GetType("System.Int32"))); scoreboard.Columns.Add(new DataColumn("ballnumber", System.Type.GetType("System.Int32"))); scoreboard.Columns.Add(new DataColumn("xsize", System.Type.GetType("System.Int32"))); scoreboard.Columns.Add(new DataColumn("ysize", System.Type.GetType("System.Int32"))); scoreboard.Columns.Add(new DataColumn("gametype", System.Type.GetType("System.String"))); scoreboard.Columns.Add(new DataColumn("date", System.Type.GetType("System.DateTime"))); scoreboard.Columns.Add(new DataColumn("synced", System.Type.GetType("System.Boolean"))); scoreboard.AcceptChanges(); return scoreboard; } private void Run() { // For additional security Pin the key. GCHandle gch = GCHandle.Alloc(sSecretKey, GCHandleType.Pinned); // Remove the Key from memory. ZeroMemory(gch.AddrOfPinnedObject(), sSecretKey.Length * 2); gch.Free(); } } }

    Read the article

  • IOException: Unable To Delete Images Due To File Lock

    - by Arslan Pervaiz
    I am Unable To Delete Image File From My Server Path It Gaves Error That The Process Cannot Access The File "FileName" Because it is being Used By Another Process. I Tried Many Methods But Still All In Vain. Please Help me Out in This Issue. Here is My Code Snippet. using System; using System.Data; using System.Web; using System.Data.SqlClient; using System.Web.UI; using System.Web.UI.HtmlControls; using System.Globalization; using System.Web.Security; using System.Text; using System.DirectoryServices; using System.Collections; using System.IO; using System.Drawing; using System.Drawing.Imaging; using System.Drawing.Drawing2D; //============ Main Block ================= byte[] data = (byte[])ds.Tables[0].Rows[0][0]; MemoryStream ms = new MemoryStream(data); Image returnImage = Image.FromStream(ms); returnImage.Save(Server.MapPath(".\\TmpImages\\SavedImage.jpg"), System.Drawing.Imaging.ImageFormat.Jpeg); returnImage.Dispose(); \\ I Tried this Dispose Method To Unlock The File But Nothing Done. ms.Close(); \\ I Tried The Memory Stream Close Method Also But Its Also Not Worked For Me. watermark(); \\ Here is My Water Mark Method That Print Water Mark Image on My Saved Image (Image That is Converted From Byte Array) DeleteImages(); \\ Here is My Delete Method That I Call To Delete The Images //===== ==== My Delete Method To Delete Files================== public void DeleteImages() { try { File.Delete(Server.MapPath(".\\TmpImages\\WaterMark.jpg")); \\This Image Deleted Fine. File.Delete(Server.MapPath(".\\TmpImages\\SavedImage.jpg")); \\ Exception Thrown On Deleting of This Image. } catch (Exception ex) { LogManager.LogException(ex, "Error in Deleting Images."); Master.ShowMessage(ex.Message, true); } } \ ==== Method Declartion That Make Watermark of One Image On Another Image.======= public void watermark() { //create a image object containing the photograph to watermark Image imgPhoto = Image.FromFile(Server.MapPath(".\\TmpImages\\SavedImage.jpg")); int phWidth = imgPhoto.Width; int phHeight = imgPhoto.Height; //create a Bitmap the Size of the original photograph Bitmap bmPhoto = new Bitmap(phWidth, phHeight, PixelFormat.Format24bppRgb); bmPhoto.SetResolution(imgPhoto.HorizontalResolution, imgPhoto.VerticalResolution); //load the Bitmap into a Graphics object Graphics grPhoto = Graphics.FromImage(bmPhoto); //create a image object containing the watermark Image imgWatermark = new Bitmap(Server.MapPath(".\\TmpImages\\PrintasWatermark.jpg")); int wmWidth = imgWatermark.Width; int wmHeight = imgWatermark.Height; //Set the rendering quality for this Graphics object grPhoto.SmoothingMode = SmoothingMode.AntiAlias; //Draws the photo Image object at original size to the graphics object. grPhoto.DrawImage( imgPhoto, // Photo Image object new Rectangle(0, 0, phWidth, phHeight), // Rectangle structure 0, // x-coordinate of the portion of the source image to draw. 0, // y-coordinate of the portion of the source image to draw. phWidth, // Width of the portion of the source image to draw. phHeight, // Height of the portion of the source image to draw. 
GraphicsUnit.Pixel); // Units of measure //------------------------------------------------------- //to maximize the size of the Copyright message we will //test multiple Font sizes to determine the largest posible //font we can use for the width of the Photograph //define an array of point sizes you would like to consider as possiblities //------------------------------------------------------- //Define the text layout by setting the text alignment to centered StringFormat StrFormat = new StringFormat(); StrFormat.Alignment = StringAlignment.Center; //define a Brush which is semi trasparent black (Alpha set to 153) SolidBrush semiTransBrush2 = new SolidBrush(Color.FromArgb(153, 0, 0, 0)); //define a Brush which is semi trasparent white (Alpha set to 153) SolidBrush semiTransBrush = new SolidBrush(Color.FromArgb(153, 255, 255, 255)); //------------------------------------------------------------ //Step #2 - Insert Watermark image //------------------------------------------------------------ //Create a Bitmap based on the previously modified photograph Bitmap Bitmap bmWatermark = new Bitmap(bmPhoto); bmWatermark.SetResolution(imgPhoto.HorizontalResolution, imgPhoto.VerticalResolution); //Load this Bitmap into a new Graphic Object Graphics grWatermark = Graphics.FromImage(bmWatermark); //To achieve a transulcent watermark we will apply (2) color //manipulations by defineing a ImageAttributes object and //seting (2) of its properties. ImageAttributes imageAttributes = new ImageAttributes(); //The first step in manipulating the watermark image is to replace //the background color with one that is trasparent (Alpha=0, R=0, G=0, B=0) //to do this we will use a Colormap and use this to define a RemapTable ColorMap colorMap = new ColorMap(); //My watermark was defined with a background of 100% Green this will //be the color we search for and replace with transparency colorMap.OldColor = Color.FromArgb(255, 0, 255, 0); colorMap.NewColor = Color.FromArgb(0, 0, 0, 0); ColorMap[] remapTable = { colorMap }; imageAttributes.SetRemapTable(remapTable, ColorAdjustType.Bitmap); //The second color manipulation is used to change the opacity of the //watermark. This is done by applying a 5x5 matrix that contains the //coordinates for the RGBA space. By setting the 3rd row and 3rd column //to 0.3f we achive a level of opacity float[][] colorMatrixElements = { new float[] {1.0f, 0.0f, 0.0f, 0.0f, 0.0f}, new float[] {0.0f, 1.0f, 0.0f, 0.0f, 0.0f}, new float[] {0.0f, 0.0f, 1.0f, 0.0f, 0.0f}, new float[] {0.0f, 0.0f, 0.0f, 0.3f, 0.0f}, new float[] {0.0f, 0.0f, 0.0f, 0.0f, 1.0f}}; ColorMatrix wmColorMatrix = new ColorMatrix(colorMatrixElements); imageAttributes.SetColorMatrix(wmColorMatrix, ColorMatrixFlag.Default, ColorAdjustType.Bitmap); //For this example we will place the watermark in the upper right //hand corner of the photograph. offset down 10 pixels and to the //left 10 pixles int xPosOfWm = ((phWidth - wmWidth) - 10); int yPosOfWm = 10; grWatermark.DrawImage(imgWatermark, new Rectangle(xPosOfWm, yPosOfWm, wmWidth, wmHeight), //Set the detination Position 0, // x-coordinate of the portion of the source image to draw. 0, // y-coordinate of the portion of the source image to draw. wmWidth, // Watermark Width wmHeight, // Watermark Height GraphicsUnit.Pixel, // Unit of measurment imageAttributes); //ImageAttributes Object //Replace the original photgraphs bitmap with the new Bitmap imgPhoto = bmWatermark; grPhoto.Dispose(); grWatermark.Dispose(); //save new image to file system. 
imgPhoto.Save(Server.MapPath(".\\TmpImages\\WaterMark.jpg"), ImageFormat.Jpeg); imgPhoto.Dispose(); imgWatermark.Dispose(); }
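    A possible cause worth flagging here (an editorial guess, not something confirmed by the poster): GDI+ keeps the source file locked for the lifetime of an Image created with Image.FromFile, and in watermark() the imgPhoto variable is reassigned to bmWatermark before Dispose() is called, so the Image originally loaded from SavedImage.jpg may never be released. A minimal sketch of a workaround, reusing the paths from the question, is to read the file into memory so no file handle stays open:
    // Sketch only: load SavedImage.jpg without holding a lock on the file,
    // so the later File.Delete can succeed.
    byte[] photoBytes = File.ReadAllBytes(Server.MapPath(".\\TmpImages\\SavedImage.jpg"));
    using (var photoStream = new MemoryStream(photoBytes))
    using (Image imgPhoto = Image.FromStream(photoStream))
    {
        // ... build and save the watermarked bitmap exactly as in watermark() ...
    }
    // Nothing references SavedImage.jpg at this point, so the delete no longer throws.
    File.Delete(Server.MapPath(".\\TmpImages\\SavedImage.jpg"));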

    Read the article

  • An Xml Serializable PropertyBag Dictionary Class for .NET

    - by Rick Strahl
    I don't know about you but I frequently need property bags in my applications to store and possibly cache arbitrary data. Dictionary<T,V> works well for this although I always seem to be hunting for a more specific generic type that provides a string key based dictionary. There's StringDictionary, but it only works with strings. There's HashSet<T>, but it uses the actual values as keys. In most key/value situations a string key is what I want to work with. Dictionary<T,V> works well enough, but there are some issues with serialization of dictionaries in .NET. The .NET framework doesn't do well serializing IDictionary objects out of the box. The XmlSerializer doesn't support serialization of IDictionary via its default serialization, and while the DataContractSerializer does support IDictionary serialization it produces some pretty atrocious XML. What doesn't work? First off, Dictionary serialization with the XmlSerializer doesn't work, so the following fails: [TestMethod] public void DictionaryXmlSerializerTest() { var bag = new Dictionary<string, object>(); bag.Add("key", "Value"); bag.Add("Key2", 100.10M); bag.Add("Key3", Guid.NewGuid()); bag.Add("Key4", DateTime.Now); bag.Add("Key5", true); bag.Add("Key7", new byte[3] { 42, 45, 66 }); TestContext.WriteLine(this.ToXml(bag)); } public string ToXml(object obj) { if (obj == null) return null; StringWriter sw = new StringWriter(); XmlSerializer ser = new XmlSerializer(obj.GetType()); ser.Serialize(sw, obj); return sw.ToString(); } The error you get with this is: System.NotSupportedException: The type System.Collections.Generic.Dictionary`2[[System.String, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089],[System.Object, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]] is not supported because it implements IDictionary. Got it! BTW, the same is true with binary serialization. 
Running the same code above against the DataContractSerializer does work: [TestMethod] public void DictionaryDataContextSerializerTest() { var bag = new Dictionary<string, object>(); bag.Add("key", "Value"); bag.Add("Key2", 100.10M); bag.Add("Key3", Guid.NewGuid()); bag.Add("Key4", DateTime.Now); bag.Add("Key5", true); bag.Add("Key7", new byte[3] { 42, 45, 66 }); TestContext.WriteLine(this.ToXmlDcs(bag)); } public string ToXmlDcs(object value, bool throwExceptions = false) { var ser = new DataContractSerializer(value.GetType(), null, int.MaxValue, true, false, null); MemoryStream ms = new MemoryStream(); ser.WriteObject(ms, value); return Encoding.UTF8.GetString(ms.ToArray(), 0, (int)ms.Length); } This DOES work but produces some pretty heinous XML (formatted with line breaks and indentation here): <ArrayOfKeyValueOfstringanyType xmlns="http://schemas.microsoft.com/2003/10/Serialization/Arrays" xmlns:i="http://www.w3.org/2001/XMLSchema-instance"> <KeyValueOfstringanyType> <Key>key</Key> <Value i:type="a:string" xmlns:a="http://www.w3.org/2001/XMLSchema">Value</Value> </KeyValueOfstringanyType> <KeyValueOfstringanyType> <Key>Key2</Key> <Value i:type="a:decimal" xmlns:a="http://www.w3.org/2001/XMLSchema">100.10</Value> </KeyValueOfstringanyType> <KeyValueOfstringanyType> <Key>Key3</Key> <Value i:type="a:guid" xmlns:a="http://schemas.microsoft.com/2003/10/Serialization/">2cd46d2a-a636-4af4-979b-e834d39b6d37</Value> </KeyValueOfstringanyType> <KeyValueOfstringanyType> <Key>Key4</Key> <Value i:type="a:dateTime" xmlns:a="http://www.w3.org/2001/XMLSchema">2011-09-19T17:17:05.4406999-07:00</Value> </KeyValueOfstringanyType> <KeyValueOfstringanyType> <Key>Key5</Key> <Value i:type="a:boolean" xmlns:a="http://www.w3.org/2001/XMLSchema">true</Value> </KeyValueOfstringanyType> <KeyValueOfstringanyType> <Key>Key7</Key> <Value i:type="a:base64Binary" xmlns:a="http://www.w3.org/2001/XMLSchema">Ki1C</Value> </KeyValueOfstringanyType> </ArrayOfKeyValueOfstringanyType> Ouch! That seriously hurts the eye! :-) Worse though it's extremely verbose with all those repetitive namespace declarations. It's good to know that it works in a pinch, but for a human readable/editable solution or something lightweight to store in a database it's not quite ideal. Why should I care? As a little background, in one of my applications I have a need for a flexible property bag that is used on a free form database field on an otherwise static entity. Basically what I have is a standard database record to which arbitrary properties can be added in an XML based string field. I intend to expose those arbitrary properties as a collection from field data stored in XML. The concept is pretty simple: When loading write the data to the collection, when the data is saved serialize the data into an XML string and store it into the database. When reading the data pick up the XML and if the collection on the entity is accessed automatically deserialize the XML into the Dictionary. (I'll talk more about this in another post). While the DataContext Serializer would work, it's verbosity is problematic both for size of the generated XML strings and the fact that users can manually edit this XML based property data in an advanced mode. A clean(er) layout certainly would be preferable and more user friendly. Custom XMLSerialization with a PropertyBag Class So… after a bunch of experimentation with different serialization formats I decided to create a custom PropertyBag class that provides for a serializable Dictionary. 
It's basically a custom Dictionary<TType,TValue> implementation with the keys always set as string keys. The result are PropertyBag<TValue> and PropertyBag (which defaults to the object type for values). The PropertyBag<TType> and PropertyBag classes provide these features: Subclassed from Dictionary<T,V> Implements IXmlSerializable with a cleanish XML format ToXml() and FromXml() methods to export and import to and from XML strings Static CreateFromXml() method to create an instance It's simple enough as it's merely a Dictionary<string,object> subclass but that supports serialization to a - what I think at least - cleaner XML format. The class is super simple to use: [TestMethod] public void PropertyBagTwoWayObjectSerializationTest() { var bag = new PropertyBag(); bag.Add("key", "Value"); bag.Add("Key2", 100.10M); bag.Add("Key3", Guid.NewGuid()); bag.Add("Key4", DateTime.Now); bag.Add("Key5", true); bag.Add("Key7", new byte[3] { 42,45,66 } ); bag.Add("Key8", null); bag.Add("Key9", new ComplexObject() { Name = "Rick", Entered = DateTime.Now, Count = 10 }); string xml = bag.ToXml(); TestContext.WriteLine(bag.ToXml()); bag.Clear(); bag.FromXml(xml); Assert.IsTrue(bag["key"] as string == "Value"); Assert.IsInstanceOfType( bag["Key3"], typeof(Guid)); Assert.IsNull(bag["Key8"]); //Assert.IsNull(bag["Key10"]); Assert.IsInstanceOfType(bag["Key9"], typeof(ComplexObject)); } This uses the PropertyBag class which uses a PropertyBag<string,object> - which means it returns untyped values of type object. I suspect for me this will be the most common scenario as I'd want to store arbitrary values in the PropertyBag rather than one specific type. The same code with a strongly typed PropertyBag<decimal> looks like this: [TestMethod] public void PropertyBagTwoWayValueTypeSerializationTest() { var bag = new PropertyBag<decimal>(); bag.Add("key", 10M); bag.Add("Key1", 100.10M); bag.Add("Key2", 200.10M); bag.Add("Key3", 300.10M); string xml = bag.ToXml(); TestContext.WriteLine(bag.ToXml()); bag.Clear(); bag.FromXml(xml); Assert.IsTrue(bag.Get("Key1") == 100.10M); Assert.IsTrue(bag.Get("Key3") == 300.10M); } and produces typed results of type decimal. The types can be either value or reference types the combination of which actually proved to be a little more tricky than anticipated due to null and specific string value checks required - getting the generic typing right required use of default(T) and Convert.ChangeType() to trick the compiler into playing nice. Of course the whole raison d'etre for this class is the XML serialization. You can see in the code above that we're doing a .ToXml() and .FromXml() to serialize to and from string. 
The XML produced for the first example looks like this: <?xml version="1.0" encoding="utf-8"?> <properties> <item> <key>key</key> <value>Value</value> </item> <item> <key>Key2</key> <value type="decimal">100.10</value> </item> <item> <key>Key3</key> <value type="___System.Guid"> <guid>f7a92032-0c6d-4e9d-9950-b15ff7cd207d</guid> </value> </item> <item> <key>Key4</key> <value type="datetime">2011-09-26T17:45:58.5789578-10:00</value> </item> <item> <key>Key5</key> <value type="boolean">true</value> </item> <item> <key>Key7</key> <value type="base64Binary">Ki1C</value> </item> <item> <key>Key8</key> <value type="nil" /> </item> <item> <key>Key9</key> <value type="___Westwind.Tools.Tests.PropertyBagTest+ComplexObject"> <ComplexObject> <Name>Rick</Name> <Entered>2011-09-26T17:45:58.5789578-10:00</Entered> <Count>10</Count> </ComplexObject> </value> </item> </properties>   The format is a bit cleaner than the DataContractSerializer. Each item is serialized into <key> <value> pairs. If the value is a string no type information is written. Since string tends to be the most common type this saves space and serialization processing. All other types are attributed. Simple types are mapped to XML types so things like decimal, datetime, boolean and base64Binary are encoded using their Xml type values. All other types are embedded with a hokey format that describes the .NET type preceded by a three underscores and then are encoded using the XmlSerializer. You can see this best above in the ComplexObject encoding. For custom types this isn't pretty either, but it's more concise than the DCS and it works as long as you're serializing back and forth between .NET clients at least. The XML generated from the second example that uses PropertyBag<decimal> looks like this: <?xml version="1.0" encoding="utf-8"?> <properties> <item> <key>key</key> <value type="decimal">10</value> </item> <item> <key>Key1</key> <value type="decimal">100.10</value> </item> <item> <key>Key2</key> <value type="decimal">200.10</value> </item> <item> <key>Key3</key> <value type="decimal">300.10</value> </item> </properties>   How does it work As I mentioned there's nothing fancy about this solution - it's little more than a subclass of Dictionary<T,V> that implements custom Xml Serialization and a couple of helper methods that facilitate getting the XML in and out of the class more easily. But it's proven very handy for a number of projects for me where dynamic data storage is required. Here's the code: /// <summary> /// Creates a serializable string/object dictionary that is XML serializable /// Encodes keys as element names and values as simple values with a type /// attribute that contains an XML type name. Complex names encode the type /// name with type='___namespace.classname' format followed by a standard xml /// serialized format. The latter serialization can be slow so it's not recommended /// to pass complex types if performance is critical. /// </summary> [XmlRoot("properties")] public class PropertyBag : PropertyBag<object> { /// <summary> /// Creates an instance of a propertybag from an Xml string /// </summary> /// <param name="xml">Serialize</param> /// <returns></returns> public static PropertyBag CreateFromXml(string xml) { var bag = new PropertyBag(); bag.FromXml(xml); return bag; } } /// <summary> /// Creates a serializable string for generic types that is XML serializable. /// /// Encodes keys as element names and values as simple values with a type /// attribute that contains an XML type name. 
Complex names encode the type /// name with type='___namespace.classname' format followed by a standard xml /// serialized format. The latter serialization can be slow so it's not recommended /// to pass complex types if performance is critical. /// </summary> /// <typeparam name="TValue">Must be a reference type. For value types use type object</typeparam> [XmlRoot("properties")] public class PropertyBag<TValue> : Dictionary<string, TValue>, IXmlSerializable { /// <summary> /// Not implemented - this means no schema information is passed /// so this won't work with ASMX/WCF services. /// </summary> /// <returns></returns> public System.Xml.Schema.XmlSchema GetSchema() { return null; } /// <summary> /// Serializes the dictionary to XML. Keys are /// serialized to element names and values as /// element values. An xml type attribute is embedded /// for each serialized element - a .NET type /// element is embedded for each complex type and /// prefixed with three underscores. /// </summary> /// <param name="writer"></param> public void WriteXml(System.Xml.XmlWriter writer) { foreach (string key in this.Keys) { TValue value = this[key]; Type type = null; if (value != null) type = value.GetType(); writer.WriteStartElement("item"); writer.WriteStartElement("key"); writer.WriteString(key as string); writer.WriteEndElement(); writer.WriteStartElement("value"); string xmlType = XmlUtils.MapTypeToXmlType(type); bool isCustom = false; // Type information attribute if not string if (value == null) { writer.WriteAttributeString("type", "nil"); } else if (!string.IsNullOrEmpty(xmlType)) { if (xmlType != "string") { writer.WriteStartAttribute("type"); writer.WriteString(xmlType); writer.WriteEndAttribute(); } } else { isCustom = true; xmlType = "___" + value.GetType().FullName; writer.WriteStartAttribute("type"); writer.WriteString(xmlType); writer.WriteEndAttribute(); } // Actual deserialization if (!isCustom) { if (value != null) writer.WriteValue(value); } else { XmlSerializer ser = new XmlSerializer(value.GetType()); ser.Serialize(writer, value); } writer.WriteEndElement(); // value writer.WriteEndElement(); // item } } /// <summary> /// Reads the custom serialized format /// </summary> /// <param name="reader"></param> public void ReadXml(System.Xml.XmlReader reader) { this.Clear(); while (reader.Read()) { if (reader.NodeType == XmlNodeType.Element && reader.Name == "key") { string xmlType = null; string name = reader.ReadElementContentAsString(); // item element reader.ReadToNextSibling("value"); if (reader.MoveToNextAttribute()) xmlType = reader.Value; reader.MoveToContent(); TValue value; if (xmlType == "nil") value = default(TValue); // null else if (string.IsNullOrEmpty(xmlType)) { // value is a string or object and we can assign TValue to value string strval = reader.ReadElementContentAsString(); value = (TValue) Convert.ChangeType(strval, typeof(TValue)); } else if (xmlType.StartsWith("___")) { while (reader.Read() && reader.NodeType != XmlNodeType.Element) { } Type type = ReflectionUtils.GetTypeFromName(xmlType.Substring(3)); //value = reader.ReadElementContentAs(type,null); XmlSerializer ser = new XmlSerializer(type); value = (TValue)ser.Deserialize(reader); } else value = (TValue)reader.ReadElementContentAs(XmlUtils.MapXmlTypeToType(xmlType), null); this.Add(name, value); } } } /// <summary> /// Serializes this dictionary to an XML string /// </summary> /// <returns>XML String or Null if it fails</returns> public string ToXml() { string xml = null; SerializationUtils.SerializeObject(this, 
out xml); return xml; } /// <summary> /// Deserializes from an XML string /// </summary> /// <param name="xml"></param> /// <returns>true or false</returns> public bool FromXml(string xml) { this.Clear(); // if xml string is empty we return an empty dictionary if (string.IsNullOrEmpty(xml)) return true; var result = SerializationUtils.DeSerializeObject(xml, this.GetType()) as PropertyBag<TValue>; if (result != null) { foreach (var item in result) { this.Add(item.Key, item.Value); } } else // null is a failure return false; return true; } /// <summary> /// Creates an instance of a propertybag from an Xml string /// </summary> /// <param name="xml"></param> /// <returns></returns> public static PropertyBag<TValue> CreateFromXml(string xml) { var bag = new PropertyBag<TValue>(); bag.FromXml(xml); return bag; } } } The code uses a couple of small helper classes, SerializationUtils and XmlUtils, for mapping Xml types to and from .NET, both of which are from the Westwind.Utilities project (which is the same project where PropertyBag lives) from the West Wind Web Toolkit. The code implements ReadXml and WriteXml for the IXmlSerializable implementation using old school XmlReaders and XmlWriters (because it's pretty simple stuff - no need for XLinq here). Then there are two helper methods .ToXml() and .FromXml() that basically allow your code to easily convert between XML and a PropertyBag object. In my code that's what I use to actually persist to and from the entity XML property during .Load() and .Save() operations. It's sweet to be able to have a string key dictionary and then be able to turn around with 1 line of code to persist the whole thing to XML and back. Hopefully some of you will find this class as useful as I've found it. It's a simple solution to a common requirement in my applications and I've used the hell out of it in the short time since I created it. Resources You can find the complete code for the two classes plus the helpers in the Subversion repository for Westwind.Utilities. You can grab the source files from there or download the whole project. You can also grab the full Westwind.Utilities assembly from NuGet and add it to your project if that's easier for you. PropertyBag Source Code SerializationUtils and XmlUtils Westwind.Utilities Assembly on NuGet (add from Visual Studio) © Rick Strahl, West Wind Technologies, 2005-2011

    Read the article

  • Serving up a RSS feed in MVC using WCF Syndication

    - by brian_ritchie
    With .NET 3.5, Microsoft added the SyndicationFeed class to WCF for generating ATOM 1.0 & RSS 2.0 feeds. In .NET 3.5, it lives in System.ServiceModel.Web but was moved into System.ServiceModel in .NET 4.0. Here's some sample code on constructing a feed: SyndicationFeed feed = new SyndicationFeed(title, description, new Uri(link)); feed.Categories.Add(new SyndicationCategory(category)); feed.Copyright = new TextSyndicationContent(copyright); feed.Language = "en-us"; feed.Copyright = new TextSyndicationContent(DateTime.Now.Year + " " + ownerName); feed.ImageUrl = new Uri(imageUrl); feed.LastUpdatedTime = DateTime.Now; feed.Authors.Add(new SyndicationPerson() { Name = ownerName, Email = ownerEmail }); var feedItems = new List<SyndicationItem>(); foreach (var item in Items) { var sItem = new SyndicationItem(item.title, null, new Uri(link)); sItem.Summary = new TextSyndicationContent(item.summary); sItem.Id = item.id; if (item.publishedDate != null) sItem.PublishDate = (DateTimeOffset)item.publishedDate; sItem.Links.Add(new SyndicationLink() { Title = item.title, Uri = new Uri(link), Length = item.size, MediaType = item.mediaType }); feedItems.Add(sItem); } feed.Items = feedItems; Then, we create a custom ContentResult to serialize the feed & stream it to the client: public class SyndicationFeedResult : ContentResult { public SyndicationFeedResult(SyndicationFeed feed) : base() { using (var memstream = new MemoryStream()) using (var writer = new XmlTextWriter(memstream, System.Text.UTF8Encoding.UTF8)) { feed.SaveAsRss20(writer); writer.Flush(); memstream.Position = 0; Content = new StreamReader(memstream).ReadToEnd(); ContentType = "application/rss+xml"; } } } Finally, we wire it up through the controller: public class RssController : Controller { public SyndicationFeedResult Feed() { var feed = new SyndicationFeed(); // populate feed... return new SyndicationFeedResult(feed); } } In the next post, I'll discuss how to add iTunes markup to the feed to publish it on iTunes as a Podcast. 
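    If the default {controller}/{action}/{id} route is registered, /Rss/Feed already reaches this action; an explicit route is only needed if you want a shorter feed URL. A minimal sketch of such a route (the route name and URL pattern below are assumptions, not taken from the post):
    // Hypothetical route registration in Global.asax (RegisterRoutes) exposing the feed at /rss:
    routes.MapRoute(
        "RssFeed",                                     // route name (assumed)
        "rss",                                         // URL pattern (assumed)
        new { controller = "Rss", action = "Feed" }    // maps to RssController.Feed()
    );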

    Read the article

  • WPF 3.5 RenderTargetBitmap memory hog

    - by kingRauk
    I have a 3.5 WPF application that uses RenderTargetBitmap. It eats memory like a big bear. It is a known problem in 3.5 that RenderTargetBitmap.Render has memory problems. I have found some solutions for it, but they don't help. https://connect.microsoft.com/VisualStudio/feedback/details/489723/rendertargetbitmap-render-method-causes-a-memory-leak Program takes too much memory And more... Does anyone have any more ideas to solve it? static Image Method(FrameworkElement element, int width, int height) { const int dpi = 192; element.Width = width; element.Height = height; element.Arrange(new Rect(0, 0, width, height)); element.UpdateLayout(); if (element is Graph) (element as Graph).UpdateComponents(); var bitmap = new RenderTargetBitmap((int)(width*dpi/96.0), (int)(height*dpi/96.0), dpi, dpi, PixelFormats.Pbgra32); bitmap.Render(element); var encoder = new PngBitmapEncoder(); encoder.Frames.Add(BitmapFrame.Create(bitmap)); using (var stream = new MemoryStream()) { encoder.Save(stream); element.Clip = null; Dispose(element); bitmap.Freeze(); DisposeRender(bitmap); bitmap.Clear(); GC.Collect(); GC.WaitForPendingFinalizers(); return System.Drawing.Image.FromStream(stream); } } public static void Dispose(FrameworkElement element) { GC.Collect(); GC.WaitForPendingFinalizers(); GC.Collect(); } public static void DisposeRender(RenderTargetBitmap bitmap) { if (bitmap != null) bitmap.Clear(); bitmap = null; GC.Collect(); GC.WaitForPendingFinalizers(); }

    Read the article

  • WCF Restful services getting error 400 (bad request) when post xml data

    - by Wayne Lo
    I am trying to self-host a WCF service and call the service via JavaScript. It works when I pass the request data as JSON but not as XML (400 Bad Request). Please help. Contract: public interface iSelfHostServices { [OperationContract] [WebInvoke(Method = "POST", UriTemplate = INFOMATO.RestTemplate.hello_post2, RequestFormat = WebMessageFormat.Xml, ResponseFormat = WebMessageFormat.Xml, BodyStyle = WebMessageBodyStyle.Wrapped)] Stream hello_post2(string helloString); } Server side code: public Stream hello_post2(string helloString) { if (helloString == null) { WebOperationContext.Current.OutgoingResponse.StatusCode = System.Net.HttpStatusCode.BadRequest; return null; } WebOperationContext.Current.OutgoingResponse.StatusCode = System.Net.HttpStatusCode.OK; return new MemoryStream(Encoding.UTF8.GetBytes(helloString)); } JavaScript: function testSelfHost_WCFService_post_Parameter() { var xmlString = "<helloString>'hello via Post'</helloString>"; Ajax_sendData("hello/post2", xmlString); } function Ajax_sendData(url, data) { var request = false; request = getHTTPObject(); if (request) { request.onreadystatechange = function() { parseResponse(request); }; request.open("post", url, true); request.setRequestHeader("Content-Type", "text/xml; charset=utf-8"); request.send(data); return true; } } function getHTTPObject() { var xhr = false; if (window.XMLHttpRequest) { xhr = new XMLHttpRequest(); } else if (window.ActiveXObject) {...} }
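    One common cause of a 400 in this setup (an editorial note, not confirmed from the question): with BodyStyle = WebMessageBodyStyle.Wrapped, WCF expects the XML body to be wrapped in an element named after the operation, in the contract's namespace (http://tempuri.org/ unless a Namespace is set on the ServiceContract). Under that assumption, the payload sent from JavaScript would need to look roughly like this rather than the bare <helloString> element above:
    <hello_post2 xmlns="http://tempuri.org/">
      <helloString>hello via Post</helloString>
    </hello_post2>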

    Read the article

  • .NET: Can I use DataContractJsonSerializer to serialize to a JSON associative array?

    - by Cheeso
    When using DataContractJsonSerializer to serialize a dictionary, like so: [CollectionDataContract] public class Clazz : Dictionary<String,String> {} .... var c1 = new Clazz(); c1["Red"] = "Rosso"; c1["Blue"] = "Blu"; c1["Green"] = "Verde"; Serializing c1 with this code: var dcjs = new DataContractJsonSerializer(c1.GetType()); var json = new Func<String>(() => { using (var ms = new System.IO.MemoryStream()) { dcjs.WriteObject(ms, c1); return Encoding.ASCII.GetString(ms.ToArray()); } })(); ...produces this JSON: [{"Key":"Red","Value":"Rosso"}, {"Key":"Blue","Value":"Blu"}, {"Key":"Green","Value":"Verde"}] But this isn't a JavaScript associative array. If I do the corresponding thing in JavaScript (produce a dictionary and then serialize it), like so: var a = {}; a["Red"] = "Rosso"; a["Blue"] = "Blu"; a["Green"] = "Verde"; // use utility class from http://www.JSON.org/json2.js var json = JSON.stringify(a); The result is: {"Red":"Rosso","Blue":"Blu","Green":"Verde"} How can I get DCJS to produce or consume a serialized string for a dictionary that is compatible with json2.js? I know about JavaScriptSerializer from ASP.NET. Not sure if it's very WCF friendly. Does it respect DataMember, DataContract attributes?
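    As a point of comparison with the JavaScriptSerializer mentioned at the end of the question, here is a minimal sketch (assuming a reference to System.Web.Extensions); it serializes a string-keyed dictionary as a plain JSON object, matching the JSON.stringify output shown above:
    // Sketch: JavaScriptSerializer writes a string-keyed dictionary as a JSON object.
    var jss = new System.Web.Script.Serialization.JavaScriptSerializer();
    string json2 = jss.Serialize(c1);
    // json2: {"Red":"Rosso","Blue":"Blu","Green":"Verde"}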

    Read the article

  • DynamicObject and WCF support

    - by rboarman
    Hi, I was wondering if anyone has had any luck getting a DynamicObject to serialize and work with WCF? Here’s my little test: [DataContract] class MyDynamicObject : DynamicObject { [DataMember] private Dictionary<string, object> _attributes = new Dictionary<string, object>(); public override bool TryGetMember(GetMemberBinder binder, out object result) { string key = binder.Name; result = null; if (_attributes.ContainsKey(key)) result = _attributes[key]; return true; } public override bool TrySetMember(SetMemberBinder binder, object value) { _attributes.Add(binder.Name, value); return true; } } var dy = new MyDynamicObject(); var ser = new DataContractSerializer(typeof(MyDynamicObject)); var mem = new MemoryStream(); ser.WriteObject(mem, dy); The error I get is: System.Runtime.Serialization.InvalidDataContractException was unhandled Message=Type 'ElasticTest1.MyDynamicObject' cannot inherit from a type that is not marked with DataContractAttribute or SerializableAttribute. Consider marking the base type 'System.Dynamic.DynamicObject' with DataContractAttribute or SerializableAttribute, or removing them from the derived type. Any suggestions? Thanks, Rick
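    One workaround worth sketching (an editorial suggestion, not a confirmed fix): since DynamicObject itself can't be marked with DataContractAttribute, serialize a plain data-contract type that carries the backing dictionary instead of the dynamic type itself. The carrier class and method names below are invented for illustration:
    // Hypothetical carrier type: a plain [DataContract] holding the dynamic members.
    [DataContract]
    public class DynamicState
    {
        [DataMember]
        public Dictionary<string, object> Attributes { get; set; }
    }

    // Members added to MyDynamicObject to export/import its private dictionary.
    // Primitive values serialize as-is; complex member values would still need
    // [KnownType] entries on DynamicState.
    public DynamicState ExportState()
    {
        return new DynamicState { Attributes = new Dictionary<string, object>(_attributes) };
    }
    public void ImportState(DynamicState state)
    {
        _attributes = new Dictionary<string, object>(state.Attributes);
    }

    // Usage sketch: serialize the carrier rather than MyDynamicObject itself.
    var ser = new DataContractSerializer(typeof(DynamicState));
    ser.WriteObject(mem, dy.ExportState());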

    Read the article

< Previous Page | 8 9 10 11 12 13 14 15  | Next Page >