Search Results

Search found 3489 results on 140 pages for 'summary'.

Page 51/140 | < Previous Page | 47 48 49 50 51 52 53 54 55 56 57 58  | Next Page >

  • Initialize preferences from XML in main Activity

    - by pixel
    My problem is that when I start the application and the user has not yet opened my PreferenceActivity, retrieving the preferences does not give me the default values defined in my preference.xml file. preference.xml file: <?xml version="1.0" encoding="utf-8"?> <PreferenceScreen xmlns:android="http://schemas.android.com/apk/res/android" android:key="applicationPreference" android:title="@string/config" > <ListPreference android:key="pref1" android:defaultValue="default" android:title="Title" android:summary="Summary" android:entries="@array/entry_names" android:entryValues="@array/entry_values" android:dialogTitle="@string/dialog_title" /> </PreferenceScreen> Snippet from my main Activity (onCreate method): SharedPreferences appPreferences = PreferenceManager.getDefaultSharedPreferences(this); String pref1 = appPreferences.getString("pref1", null); As a result I end up with a null value.

    Read the article

  • connect to web API from .NET

    - by Saif Khan
    How can I access and consume a web API from .NET? The API is not a .NET API. Here is sample code I have in Ruby require 'uri' require 'net/http' url = URI.parse("http://account.codebasehq.com/widgets/tickets") req = Net::HTTP::Post.new(url.path) req.basic_auth('dave', '6b2579a03c2e8825a5fd0a9b4390d15571f3674d') req.add_field('Content-type', 'application/xml') req.add_field('Accept', 'application/xml') xml = "<ticket><summary>My Example Ticket</summary><status-id>1234</status-id><priority-id>1234</priority-id><ticket-type>bug</ticket-type></ticket>" res = Net::HTTP.new(url.host, url.port).start {|http| http.request(req, xml)} case res when Net::HTTPCreated puts "Record was created successfully." else puts "An error occurred while adding this record" end Where can I find information on consuming API like this from .NET? I am aware how to use .NET webservices.
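
    Not from the original question, but here is a minimal C# sketch of issuing the same POST with HttpWebRequest (the URL, credentials and XML are simply the sample values from the Ruby snippet above; whether the server accepts them is another matter):

        using System;
        using System.IO;
        using System.Net;
        using System.Text;

        class CodebaseTicketPost
        {
            static void Main()
            {
                string xml = "<ticket><summary>My Example Ticket</summary>" +
                             "<status-id>1234</status-id><priority-id>1234</priority-id>" +
                             "<ticket-type>bug</ticket-type></ticket>";

                var request = (HttpWebRequest)WebRequest.Create(
                    "http://account.codebasehq.com/widgets/tickets");
                request.Method = "POST";
                request.ContentType = "application/xml";
                request.Accept = "application/xml";
                // Send the basic-auth header up front, like basic_auth does in the Ruby sample.
                request.Headers["Authorization"] = "Basic " + Convert.ToBase64String(
                    Encoding.ASCII.GetBytes("dave:6b2579a03c2e8825a5fd0a9b4390d15571f3674d"));

                byte[] body = Encoding.UTF8.GetBytes(xml);
                request.ContentLength = body.Length;
                using (Stream requestStream = request.GetRequestStream())
                {
                    requestStream.Write(body, 0, body.Length);
                }

                using (var response = (HttpWebResponse)request.GetResponse())
                {
                    Console.WriteLine(response.StatusCode == HttpStatusCode.Created
                        ? "Record was created successfully."
                        : "An error occurred while adding this record");
                }
            }
        }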

    Read the article

  • Asynchronous Sockets - Handling false socket.AcceptAsync values

    - by David
    The Socket class has a method .AcceptAsync which either returns true or false. I'd thought the false return value was an error condition, but in the samples Microsoft provide for Async sockets they call the callback function synchronously after checking for failure, as shown here: public void StartAccept(SocketAsyncEventArgs acceptEventArg) { if (acceptEventArg == null) { acceptEventArg = new SocketAsyncEventArgs(); acceptEventArg.Completed += new EventHandler<SocketAsyncEventArgs>(AcceptEventArg_Completed); } else { // socket must be cleared since the context object is being reused acceptEventArg.AcceptSocket = null; } m_maxNumberAcceptedClients.WaitOne(); bool willRaiseEvent = listenSocket.AcceptAsync(acceptEventArg); if (!willRaiseEvent) { ProcessAccept(acceptEventArg); } } /// <summary> /// This method is the callback method associated with Socket.AcceptAsync operations and is invoked /// when an accept operation is complete /// </summary> void AcceptEventArg_Completed(object sender, SocketAsyncEventArgs e) { ProcessAccept(e); } Why do they do this? It defeats the purpose of asynchronous sockets and stops the method from returning.

    Read the article

  • Question on SyndicationItem - ASP.NET

    - by bobsmith123
    I am trying to understand how to use SyndicationItem to display a feed which is RSS 2.0 or Atom compliant. What property of SyndicationItem gives me the entire description of the post? It appears that there is a Summary property, but per MSDN it gives only the summary. Also, I have noticed that in my RSS feed reader some RSS feeds show only a few lines of description and I have to click and go to the website for the full post, but in some feeds I can see the full post within the feed reader. Can someone explain how all this comes together? PS: My web page lets the user enter an RSS feed address and I need to validate that the feed exists. If it does, I need to grab the last x items and show the feed's title and full description.
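
    Not from the original question, but a minimal C# sketch of reading such a feed with SyndicationFeed (the feed URL is a placeholder). In practice the full post body, when a feed supplies one, tends to arrive in item.Content (an Atom content element) or in an RSS content:encoded extension, while item.Summary holds only the short description; readers that show the whole post are simply displaying that fuller element when it is present.

        using System;
        using System.ServiceModel.Syndication;
        using System.Xml;

        class FeedDump
        {
            static void Main()
            {
                // Loading the feed also doubles as a basic "does this feed exist" check:
                // a bad address or non-feed content will throw here.
                using (XmlReader reader = XmlReader.Create("http://example.com/feed"))
                {
                    SyndicationFeed feed = SyndicationFeed.Load(reader);
                    Console.WriteLine(feed.Title.Text);

                    foreach (SyndicationItem item in feed.Items)
                    {
                        Console.WriteLine(item.Title.Text);

                        var content = item.Content as TextSyndicationContent;
                        if (content != null)
                            Console.WriteLine(content.Text);      // full content, if the feed supplies it
                        else if (item.Summary != null)
                            Console.WriteLine(item.Summary.Text); // otherwise just the summary
                    }
                }
            }
        }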

    Read the article

  • ASP.Net MVC Where do you convert from Entities to ViewModels?

    - by Pino
    The title pretty much explains it all; it's the last thing I'm trying to work into our project. We are structured with a Service Library which contains a function like so. /// <summary> /// Returns a single category based on the specified ID. /// </summary> public Category GetCategory(int CategoryID) { var RetVal = _session.Single<Category>(x => x.ID == CategoryID); return RetVal; } Now, Category is an Entity (we are using Entity Framework) and we need to convert it to a CategoryViewModel. How would people structure this? Would you make the service function return a CategoryViewModel? Have the controller pull the data from the service and then call another function to convert it to a view model?
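
    One common arrangement, sketched below (my own illustration, not from the post; the view model shape, the Name property and the CategoryService name are assumptions), is to keep the service returning entities and to do the mapping at the controller boundary through a small dedicated mapper, so the conversion lives in one place and the views never see Entity Framework types.

        using System.Web.Mvc;

        public class CategoryViewModel
        {
            public int ID { get; set; }
            public string Name { get; set; }
        }

        public static class CategoryMapper
        {
            // Single place where the entity-to-view-model conversion happens.
            public static CategoryViewModel ToViewModel(Category category)
            {
                return new CategoryViewModel
                {
                    ID = category.ID,
                    Name = category.Name
                };
            }
        }

        public class CategoryController : Controller
        {
            private readonly CategoryService _service;

            public CategoryController(CategoryService service)
            {
                _service = service;
            }

            public ActionResult Details(int id)
            {
                Category category = _service.GetCategory(id);   // entity from the Service Library
                return View(CategoryMapper.ToViewModel(category));
            }
        }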

    Read the article

  • Visual Studio does not gray out properties with a ReadOnlyAttribute(true)

    - by Fire-Dragon-DoL
    I know it sounds silly, but Visual Studio (2010) doesn't gray out my properties tagged with ReadOnlyAttribute. I can't edit their values (if I try to, they simply revert to the previous value), but they aren't grayed out, which is really annoying when using the editor. Is there an option or an attribute that I'm forgetting? Thanks for any help. Example 1: /// <summary> /// Inform if the LcdDisplay has been already initiated /// </summary> [Description("Inform if the LcdDisplay has been already initiated")] [DefaultValue(false)] [ReadOnly(true)] public bool Initialized { get; private set; } Initialized is not grayed out.

    Read the article

  • StyleCop SA1638

    - by niaher
    I am using StyleCop in VS2008. I get this error: SA1638: The file attribute in the file header's copyright tag must contain the name of the file. Here is my header. // <copyright file="AssemblyInfo.cs" company="company"> // Copyright (c) company. All rights reserved. // </copyright> // <author>me</author> // <email>[email protected]</email> // <date>2010-03-04</date> // <summary>blah blah.</summary> I suspect the problem is that my AssemblyInfo.cs is located inside the Properties folder. Any clues to how I can fix this warning without silencing StyleCop?

    Read the article

  • Microsoft Unity, parameters in constructor

    - by raffaeu
    I am using Unity with MVC and NHibernate. Unfortunately, our UnitOfWork resides in a different .dll and it doesn't have a default empty .ctor. This is what I do to register NHibernate: var connectionString = ConfigurationManager.ConnectionStrings["jobManagerConnection"].ConnectionString; var assemblyMap = ConfigurationManager.AppSettings["assemblyMap"]; container .RegisterType<IUnitOfWork, UnitOfWork>(new ContainerControlledLifetimeManager()); In my WebController I have this: /// <summary>Gets or sets UnitOfWork.</summary> [Dependency] public IUnitOfWork UnitOfWork { get; set; } The problem is that the constructor of UnitOfWork expects 2 mandatory strings. How can I set up the RegisterType call for this interface so as to pass the two parameters retrieved from the web.config? Is it possible?
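
    A minimal sketch using Unity's InjectionConstructor (the parameter order below is a guess at the UnitOfWork signature): the values read from web.config are handed over at registration time, and Unity uses them whenever it constructs the UnitOfWork.

        var connectionString = ConfigurationManager
            .ConnectionStrings["jobManagerConnection"].ConnectionString;
        var assemblyMap = ConfigurationManager.AppSettings["assemblyMap"];

        container.RegisterType<IUnitOfWork, UnitOfWork>(
            new ContainerControlledLifetimeManager(),
            // Tells Unity which constructor to use and what values to pass to it.
            new InjectionConstructor(connectionString, assemblyMap));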

    Read the article

  • Design Time Attribute For CSS Class in ASP.net Custom Server Control

    - by Jon P
    Hopefully some custom control designers/builders can help. I'm attempting to build my first custom control, which is essentially a client detail collection form. There are a series of elements on this form that require various styles applied to them. Ideally I'd like the VS 2005/2008 properties interface to be able to apply the CSSClass as it does at the control level, i.e. with a dropdown list of available CSS classes. Take for example the class to be applied to the legend tag: /// <summary>Css Class for Legend</summary> [Category("Appearance")] [Browsable(true)] [DefaultValue("")] //I am at a loss as to what goes in [Editor] [Editor(System.Web.UI.CssStyleCollection), typeof(System.Drawing.Design.UITypeEditor))] public string LegendCSSClass { get { return _LegendCSSClass; } set { _LegendCSSClass = value; } } I have tried a couple of options, as you can see from above, without much luck. Hopefully there is something simple I am missing. I'd also be happy for references pertaining to the [Editor] attribute.
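
    Not an answer from the original post, but one attribute worth trying here is System.Web.UI.CssClassPropertyAttribute, which, as far as I know, is what the built-in WebControl.CssClass property uses to get its CSS class dropdown in the designer; a minimal sketch (the control name is hypothetical):

        using System.ComponentModel;
        using System.Web.UI;
        using System.Web.UI.WebControls;

        public class ClientDetailForm : WebControl
        {
            private string _legendCssClass = "";

            /// <summary>Css Class for Legend</summary>
            [Category("Appearance")]
            [Browsable(true)]
            [DefaultValue("")]
            [CssClassProperty]   // asks the designer for the CSS-class drop-down editor
            public string LegendCSSClass
            {
                get { return _legendCssClass; }
                set { _legendCssClass = value; }
            }
        }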

    Read the article

  • Is this code thread safe?

    - by Shawn Simon
    ''' <summary> ''' Returns true if a submission by the same IP address has not been submitted in the past n minutes. ''' </summary> Protected Function EnforceMinTimeBetweenSubmissions() As Boolean Dim minTimeBetweenRequestsMinutes As Integer = 0 Dim configuredTime = ConfigurationManager.AppSettings("MinTimeBetweenSchedulingRequestsMinutes") If String.IsNullOrEmpty(configuredTime) Then Return True If (Not Integer.TryParse(configuredTime, minTimeBetweenRequestsMinutes)) _ OrElse minTimeBetweenRequestsMinutes > 1440 _ OrElse minTimeBetweenRequestsMinutes < 0 Then Throw New ApplicationException("Invalid configuration setting for AppSetting 'MinTimeBetweenSchedulingRequestsMinutes'") End If If minTimeBetweenRequestsMinutes = 0 Then Return True End If If Cache("submitted-requests") Is Nothing Then Cache("submitted-requests") = New Dictionary(Of String, Date) End If ' Remove old requests. Dim submittedRequests As Dictionary(Of String, Date) = CType(Cache("submitted-requests"), Dictionary(Of String, Date)) Dim itemsToRemove = submittedRequests.Where(Function(s) s.Value < Now).Select(Function(s) s.Key).ToList For Each key As String In itemsToRemove submittedRequests.Remove(key) Next If submittedRequests.ContainsKey(Request.UserHostAddress) Then ' User has submitted a request in the past n minutes. Return False Else submittedRequests.Add(Request.UserHostAddress, Now.AddMinutes(minTimeBetweenRequestsMinutes)) End If Return True End Function

    Read the article

  • jquery anchor to html extract

    - by Benjamin Ortuzar
    I would like to implement something similar to the Google quick scroll extension with jquery for the extracts of a search result, so when the full document is opened (within the same website) it gives the user the opportunity to go straight to the extract location. Here is a sample of what I get returned from the search engine when I search for 'food'. <doc> <docid>129305</docid> <title><span class='highlighted'>Food</span></title> <summary> <summarytext>Papers subject to Negative Resolution: 4 <span class='highlighted'>Food</span> <span class='highlighted'>Food</span> Irradiation (England) Regulations 2009 (S.I., 2009, No. 1584), dated 24 June 2009 (by Act), </summarytext> </summary> <paras> <paraitemcount>2</paraitemcount> <para> <paraitem>1</paraitem> <paraid>42</paraid> <pararelevance>100</pararelevance> <paraweights>50</paraweights> <paratext>4 <span class='highlighted'>Food</span></paratext> </para> <para> <paraitem>2</paraitem> <paraid>54</paraid> <pararelevance>100</pararelevance> <paraweights>50</paraweights> <paratext><span class='highlighted'>Food</span> Irradiation (England) Regulations 2009 (S.I., 2009, No. 1584), dated 24 June 2009 (by Act), with an Explanatory Memorandum and an Impact Assessment (</paratext> </para> </paras> </doc> As you see the search engine has returned a document that contains one summary and two extracts. So let's say the user clicks on the second extract in the search resutls page, the browser would open the detailed document in the same website, and would offer the user the possibility to go to the extract as the Google quick scroll extension does. Is there an existing jquery script for this? If not, can you suggest any jquery/javascript code that would simplify my task to implement this. Notes: I can access the extracts from the document details page. I'm aware that the HTML in some cases could be slightly different in the extract than in the details page, finding no match. The search engine does not return where the extract was located. At the moment I'm trying to understand the JS code that the extension uses.

    Read the article

  • Is it possible to have "inherited partial structs" in c#?

    - by Balazs
    Is it possible to use partial structs to achieve something like this: Have an asp.net page base class with the following defined: public partial struct QuerystringKeys { public const string Username = "username"; public const string CurrentUser = "currentUser"; } Have an asp.net page, which inherits from the base class mentioned above, extend the partial declaration: public partial struct QuerystringKeys { /// <summary> /// The page number of the review instances list /// </summary> public const string Page = "page"; } The final goal is a QuerystringKeys struct with all three constant strings defined in it available to the asp.net page.
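
    A sketch of one workaround (my suggestion, not from the post): partial declarations of a type are merged at compile time within a single assembly, so there is no per-page "inheritance" of a partial struct, and structs cannot be inherited at all. Switching to classes gives roughly the effect described, since constants remain reachable through the derived type's name:

        public class QuerystringKeys
        {
            public const string Username = "username";
            public const string CurrentUser = "currentUser";
        }

        public class PageQuerystringKeys : QuerystringKeys
        {
            /// <summary>
            /// The page number of the review instances list
            /// </summary>
            public const string Page = "page";
        }

        // Usage: all three keys are available through the page-level type.
        // string a = PageQuerystringKeys.Username;
        // string b = PageQuerystringKeys.CurrentUser;
        // string c = PageQuerystringKeys.Page;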

    Read the article

  • Parsing xml with dom4j or jdom or anyhow

    - by c0mrade
    Hello, I wanna read feed entries and I'm just stuck now. Take this for example : http://stackoverflow.com/feeds/question/2084883 lets say I wanna read all the summary node value inside each entry node in document. How do I do that? I've changed many variations of code this one is closest to what I want to achieve I think : Element entryPoint = document.getRootElement(); Element elem; for(Iterator iter = entryPoint.elements().iterator(); iter.hasNext();){ elem = (Element)iter.next(); System.out.println(elem.getName()); } It goes trough all nodes in xml file and writes their name. Now what I wanted to do next is if(elem.getName() == "entry") to get only the entry nodes, how do I get elements of the entry nodes, and how to get let say summary and its value? tnx

    Read the article

  • Enclosing service execution in try-catch: bad practice?

    - by Sorin Comanescu
    Hi, Below is the usual Program.cs content for a windows service program: static class Program { /// <summary> /// The main entry point for the application. /// </summary> static void Main() { ServiceBase[] ServicesToRun; ServicesToRun = new ServiceBase[] { new MyService() }; ServiceBase.Run(ServicesToRun); } } Is it a bad practice to enclose the ServiceBase.Run(...) in a try-catch block? Thanks.
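
    Not part of the original question, but a sketch of how that entry point might look with the try-catch in place (the "MyService" event log source is hypothetical and would have to be registered). Note that a try-catch here only covers exceptions thrown on this thread while the Service Control Manager starts and stops the services; exceptions on the service's worker threads still need their own handling, for example via AppDomain.UnhandledException.

        using System;
        using System.Diagnostics;
        using System.ServiceProcess;

        static class Program
        {
            static void Main()
            {
                AppDomain.CurrentDomain.UnhandledException += (sender, e) =>
                    EventLog.WriteEntry("MyService",
                        "Unhandled exception: " + e.ExceptionObject,
                        EventLogEntryType.Error);

                try
                {
                    ServiceBase.Run(new ServiceBase[] { new MyService() });
                }
                catch (Exception ex)
                {
                    // Log and rethrow rather than silently swallowing start-up failures.
                    EventLog.WriteEntry("MyService", ex.ToString(), EventLogEntryType.Error);
                    throw;
                }
            }
        }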

    Read the article

  • Will this safely delete my record?

    - by Sergio Tapia
    I hate these three tables. Two tables have a many-to-many relationship, and as such a third table is generated. I'm using Linq-to-SQL, and in the .dbml file I've dragged them all onto the design surface. Here is the method I'm using to delete an Area safely. Remember that documents are associated with an Area, so I can't just delete it and leave documents hanging. ScansDataContext db = new ScansDataContext(); /// <summary> /// Deletes an Area object from the database along with all associations in the database. /// </summary> /// <param name="area">Area object to save</param> public void Delete(Area area) { db.DocumentAreaRelations.DeleteAllOnSubmit(area.DocumentAreaRelations); db.Areas.DeleteOnSubmit(area); db.SubmitChanges(System.Data.Linq.ConflictMode.FailOnFirstConflict); }

    Read the article

  • How can I avoid properties being reset at design-time in tightly bound user controls?

    - by David Anderson
    I have UserControl 'A' with a label, and this property: /// <summary> /// Gets or Sets the text of the control /// </summary> [ Browsable(true), EditorBrowsable(EditorBrowsableState.Always), Category("Appearance") ] public override string Text { get { return uxLabel.Text; } set { uxLabel.Text = value; } } I then have UserControl 'B' which has UserControl 'A' on it, and I set the Text property to "My Example Label" in the designer. Then, I have my MainForm, which has UserControl 'B' on it. Each time I do a build or run, the Text property of UserControl 'A' is reset to its default value. I suppose this is because, since I am doing a rebuild, it rebuilds both UserControl 'A' and 'B', thus causing the problem. What would be a better design approach to avoid this type of behavior when working with tightly bound controls and forms in an application?
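
    A sketch of the attribute combination I would try first (an assumption, not a verified fix for this exact setup): Control.Text is hidden from designer serialization by default, so the value assigned on UserControl 'B' is never written into its InitializeComponent and disappears on the next rebuild. Re-exposing the property for serialization lets the designer persist it:

        [Browsable(true)]
        [EditorBrowsable(EditorBrowsableState.Always)]
        [Category("Appearance")]
        [DesignerSerializationVisibility(DesignerSerializationVisibility.Visible)]
        public override string Text
        {
            get { return uxLabel.Text; }
            set { uxLabel.Text = value; }
        }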

    Read the article

  • R glm standard error estimate differences to SAS PROC GENMOD

    - by Michelle
    I am converting a SAS PROC GENMOD example into R, using glm in R. The SAS code was: proc genmod data=data0 namelen=30; model boxcoxy=boxcoxxy ~ AGEGRP4 + AGEGRP5 + AGEGRP6 + AGEGRP7 + AGEGRP8 + RACE1 + RACE3 + WEEKEND + SEQ/dist=normal; FREQ REPLICATE_VAR; run; My R code is: parmsg2 <- glm(boxcoxxy ~ AGEGRP4 + AGEGRP5 + AGEGRP6 + AGEGRP7 + AGEGRP8 + RACE1 + RACE3 + WEEKEND + SEQ , data=data0, family=gaussian, weights = REPLICATE_VAR) When I use summary(parmsg2) I get the same coefficient estimates as in SAS, but my standard errors are wildly different. The summary output from SAS is: Name df Estimate StdErr LowerWaldCL UpperWaldCL ChiSq ProbChiSq Intercept 1 6.5007436 .00078884 6.4991975 6.5022897 67911982 0 agegrp4 1 .64607262 .00105425 .64400633 .64813891 375556.79 0 agegrp5 1 .4191395 .00089722 .41738099 .42089802 218233.76 0 agegrp6 1 -.22518765 .00083118 -.22681672 -.22355857 73401.113 0 agegrp7 1 -1.7445189 .00087569 -1.7462352 -1.7428026 3968762.2 0 agegrp8 1 -2.2908855 .00109766 -2.2930369 -2.2887342 4355849.4 0 race1 1 -.13454883 .00080672 -.13612997 -.13296769 27817.29 0 race3 1 -.20607036 .00070966 -.20746127 -.20467944 84319.131 0 weekend 1 .0327884 .00044731 .0319117 .03366511 5373.1931 0 seq2 1 -.47509583 .00047337 -.47602363 -.47416804 1007291.3 0 Scale 1 2.9328613 .00015586 2.9325559 2.9331668 -127 The summary output from R is: Coefficients: Estimate Std. Error t value Pr(>|t|) (Intercept) 6.50074 0.10354 62.785 < 2e-16 AGEGRP4 0.64607 0.13838 4.669 3.07e-06 AGEGRP5 0.41914 0.11776 3.559 0.000374 AGEGRP6 -0.22519 0.10910 -2.064 0.039031 AGEGRP7 -1.74452 0.11494 -15.178 < 2e-16 AGEGRP8 -2.29089 0.14407 -15.901 < 2e-16 RACE1 -0.13455 0.10589 -1.271 0.203865 RACE3 -0.20607 0.09315 -2.212 0.026967 WEEKEND 0.03279 0.05871 0.558 0.576535 SEQ -0.47510 0.06213 -7.646 2.25e-14 The importance of the difference in the standard errors is that the SAS coefficients are all statistically significant, but the RACE1 and WEEKEND coefficients in the R output are not. I have found a formula to calculate the Wald confidence intervals in R, but this is pointless given the difference in the standard errors, as I will not get the same results. Apparently SAS uses a ridge-stabilized Newton-Raphson algorithm for its estimates, which are ML. The information I read about the glm function in R is that the results should be equivalent to ML. What can I do to change my estimation procedure in R so that I get the equivalent coefficents and standard error estimates that were produced in SAS? To update, thanks to Spacedman's answer, I used weights because the data are from individuals in a dietary survey, and REPLICATE_VAR is a balanced repeated replication weight, that is an integer (and quite large, in the order of 1000s or 10000s). The website that describes the weight is here. I don't know why the FREQ rather than the WEIGHT command was used in SAS. I will now test by expanding the number of observations using REPLICATE_VAR and rerunning the analysis.

    Read the article

  • How do you update the aspnetdb membership IsApproved value?

    - by Matt
    I need to update existing users IsApproved status in the aspnet_Membership table. I have the code below that does not seem to be working. The user.IsApproved property is updated but it is not saving it to the database table. Are there any additional calls I need to make? Any suggestions? Thanks. /// <summary> /// Updates a users approval status to the specified value /// </summary> /// <param name="userName">The user to update</param> /// <param name="isApproved">The updated approval status</param> public static void UpdateApprovalStatus(string userName, bool isApproved) { MembershipUser user = Membership.GetUser(userName); if (user != null) user.IsApproved = isApproved; }
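
    A minimal sketch of the step that appears to be missing (assuming the standard membership provider): changes made to a MembershipUser object are only persisted once it is passed back through Membership.UpdateUser.

        /// <summary>
        /// Updates a users approval status to the specified value
        /// </summary>
        /// <param name="userName">The user to update</param>
        /// <param name="isApproved">The updated approval status</param>
        public static void UpdateApprovalStatus(string userName, bool isApproved)
        {
            MembershipUser user = Membership.GetUser(userName);
            if (user != null)
            {
                user.IsApproved = isApproved;
                Membership.UpdateUser(user);   // pushes the change to the membership store
            }
        }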

    Read the article

  • Anyone succeeded at injecting Interfaces into Entity Framework 4 Entities, using T4?

    - by Ciel
    Hello: POCO sort of leaves me wanting: (how can I say I use DI/IoC, if the Repository is not the only place that is creating the entities?)...hence my desire to lock it down, get rid of the temptation of newing up POCOs or EntityObjects anywhere in the code, and just allowing entity interfaces above the Repository/Factory layer. For a second there, I nearly thought I had it...was editing EF4's T4 in order to inject in an Interface def. Was going swimmingly, compiled and worked, until I got to the Associations... I wrapped them with a ICollection, and renamed the underlying original collection with a prefix of Wrapped. Unfortunately, when run, throws an error: //The Member 'WrappedSubExamples' in the CLR type 'XAct.App.Data.Model.EF4.Example' is not present in the conceptual model type 'XAct.App.Data.Model.Entity.Example'. var examples = context2.CreateObjectSet(); My T4 segment I used was (this may not work, as it's the longest code snippet I've ever posted here...sorry): #region Generic Property Abstraction <# if (navProperty.ToEndMember.RelationshipMultiplicity == RelationshipMultiplicity.Many) {#> //XAct.App Generic Wrapper: <#=code.SpaceAfter(NewModifier(navProperty))#><#=Accessibility.ForProperty(navProperty)#> ICollection<I<#=MultiSchemaEscape(navProperty.ToEndMember.GetEntityType(), code)#>> <#=code.Escape(navProperty)#> { get { if (_X<#=code.Escape(navProperty)# == null){ _X<#=code.Escape(navProperty)# = new WrappedCollection,<#=MultiSchemaEscape(navProperty.ToEndMember.GetEntityType(), code)#(this.<#=(navProperty.ToEndMember.RelationshipMultiplicity == RelationshipMultiplicity.Many)?"Wrapped":""#<#=code.Escape(navProperty)#); } return _X<#=code.Escape(navProperty)#; } } private ICollection _X<#=code.Escape(navProperty)#; <# } else { # <#=code.SpaceAfter(NewModifier(navProperty))#<#=Accessibility.ForProperty(navProperty)# I<#=MultiSchemaEscape(navProperty.ToEndMember.GetEntityType(), code)# <#=code.Escape(navProperty)# { get { return (I<#=code.Escape(navProperty)#)this.Wrapped<#=code.Escape(navProperty)#; } set { this.Wrapped<#=code.Escape(navProperty)# = value as <#=code.Escape(navProperty)#; } } <# } # #endregion which then wraps the original collection, renamed with the prefix 'Wrapped': /// <summary> /// <#=SummaryComment(navProperty)#> /// </summary><#=LongDescriptionCommentElement(navProperty, region.CurrentIndentLevel) #> [XmlIgnoreAttribute()] [SoapIgnoreAttribute()] [DataMemberAttribute()] [EdmRelationshipNavigationPropertyAttribute("<#=navProperty.RelationshipType.NamespaceName#>", "<#=navProperty.RelationshipType.Name#>", "<#=navProperty.ToEndMember.Name#>")] <# if (navProperty.ToEndMember.RelationshipMultiplicity == RelationshipMultiplicity.Many) { #> <#=code.SpaceAfter(NewModifier(navProperty))#><#=Accessibility.ForProperty(navProperty)#> EntityCollection<<#=MultiSchemaEscape(navProperty.ToEndMember.GetEntityType(), code)#>> Wrapped<#=code.Escape(navProperty)#> { <#=code.SpaceAfter(Accessibility.ForGetter(navProperty))#>get { return ((IEntityWithRelationships)this).RelationshipManager.GetRelatedCollection<<#=MultiSchemaEscape(navProperty.ToEndMember.GetEntityType(), code)#>>("<#=navProperty.RelationshipType.FullName#>", "<#=navProperty.ToEndMember.Name#>"); } <#=code.SpaceAfter(Accessibility.ForSetter(navProperty))#>set { if ((value != null)) { ((IEntityWithRelationships)this).RelationshipManager.InitializeRelatedCollection<<#=MultiSchemaEscape(navProperty.ToEndMember.GetEntityType(), code)#>>("<#=navProperty.RelationshipType.FullName#>", 
"<#=navProperty.ToEndMember.Name#>", value); } } } <# } else { #> <#=code.SpaceAfter(NewModifier(navProperty))#><#=Accessibility.ForProperty(navProperty)#> <#=MultiSchemaEscape(navProperty.ToEndMember.GetEntityType(), code)#> Wrapped<#=code.Escape(navProperty)#> { <#=code.SpaceAfter(Accessibility.ForGetter(navProperty))#>get { return ((IEntityWithRelationships)this).RelationshipManager.GetRelatedReference<<#=MultiSchemaEscape(navProperty.ToEndMember.GetEntityType(), code)#>>("<#=navProperty.RelationshipType.FullName#>", "<#=navProperty.ToEndMember.Name#>").Value; } <#=code.SpaceAfter(Accessibility.ForSetter(navProperty))#>set { ((IEntityWithRelationships)this).RelationshipManager.GetRelatedReference<<#=MultiSchemaEscape(navProperty.ToEndMember.GetEntityType(), code)#>>("<#=navProperty.RelationshipType.FullName#>", "<#=navProperty.ToEndMember.Name#>").Value = value; } } <# string refPropertyName = navProperty.Name + "Reference"; if (entity.Members.Any(m => m.Name == refPropertyName)) { // 6017 is the same error number that EntityClassGenerator uses. Errors.Add(new System.CodeDom.Compiler.CompilerError(SourceCsdlPath, -1, -1, "6017", String.Format(CultureInfo.CurrentCulture, GetResourceString("Template_ConflictingGeneratedNavPropName"), navProperty.Name, entity.FullName, refPropertyName))); } #> /// <summary> /// <#=SummaryComment(navProperty)#> /// </summary><#=LongDescriptionCommentElement(navProperty, region.CurrentIndentLevel)#> [BrowsableAttribute(false)] [DataMemberAttribute()] <#=Accessibility.ForProperty(navProperty)#> EntityReference<<#=MultiSchemaEscape(navProperty.ToEndMember.GetEntityType(), code)#>> <#=refPropertyName#> { <#=code.SpaceAfter(Accessibility.ForGetter(navProperty))#>get { return ((IEntityWithRelationships)this).RelationshipManager.GetRelatedReference<<#=MultiSchemaEscape(navProperty.ToEndMember.GetEntityType(), code)#>>("<#=navProperty.RelationshipType.FullName#>", "<#=navProperty.ToEndMember.Name#>"); } <#=code.SpaceAfter(Accessibility.ForSetter(navProperty))#>set { if ((value != null)) { ((IEntityWithRelationships)this).RelationshipManager.InitializeRelatedReference<<#=MultiSchemaEscape(navProperty.ToEndMember.GetEntityType(), code)#>>("<#=navProperty.RelationshipType.FullName#>", "<#=navProperty.ToEndMember.Name#>", value); } } } <# } The point is...it bugs out. I've tried various solutions...none worked. Any ideas -- or is this just a wild goose chase, and time to give it up?

    Read the article

  • How to generate C# documentation to a CHM or HTML file?

    - by BrunoLM
    Is there a way to generate a readable document file from the documentation on the code directly from Visual Studio? (also considering 2010) If there isn't, what should I use to create a CHM or HTML file? Code example: /// <summary> /// Convert a number to string /// </summary> /// <param name="number">An integer number to be converted to string</param> /// <returns>Number as string</returns> /// <example> /// <code> /// var s = MyMethod(5); /// </code> /// </example> /// <exception cref="Exception">In case it can't convert</exception> /// <remarks> /// Whatever /// </remarks> public string MyMethod(int number) { return number.ToString(); }

    Read the article

  • Another serialization ? - c#

    - by ltech
    I had asked this Yesterday If my xsd schema changes to <xs:element name="Document" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> <xs:element name="MetaDoc" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> <xs:element name="ATTRIBUTES" minOccurs="0" maxOccurs="unbounded"> <xs:complexType> <xs:sequence> <xs:element name="author" type="xs:string" minOccurs="0" /> <xs:element name="max_versions" type="xs:string" minOccurs="0" /> <xs:element name="summary" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> </xs:sequence> </xs:complexType> </xs:element> </xs:sequence> </xs:complexType> </xs:element> My xsd - class generation becomes /// <remarks/> [System.Xml.Serialization.XmlArrayAttribute(Form=System.Xml.Schema.XmlSchemaForm.Unqualified)] [System.Xml.Serialization.XmlArrayItemAttribute("MetaDoc", typeof(DocumentMetaDocATTRIBUTES[]), Form=System.Xml.Schema.XmlSchemaForm.Unqualified, IsNullable=false)] [System.Xml.Serialization.XmlArrayItemAttribute("ATTRIBUTES", typeof(DocumentMetaDocATTRIBUTES), Form=System.Xml.Schema.XmlSchemaForm.Unqualified, IsNullable=false, NestingLevel=1)] public DocumentMetaDocATTRIBUTES[][][] Document { get { return this.documentField; } set { this.documentField = value; } } If I am deriving to CollectionBase, as shown in my previous post, how would I manage the XmlArrayItemAttribute ? so that I can read this part of my input xml into my strongly types object <Document> <MetaDoc> <ATTRIBUTES> <author>asas</author> <max_versions>1</max_versions> <summary>aasasqqqq</summary> </ATTRIBUTES> </MetaDoc> </Document>

    Read the article

  • MVVM (ICommand) in Silverlight

    - by Andrey Khataev
    Hello! Please don't judge too strictly if this question has been discussed previously or indirectly answered in the many Prism and MVVM blogs out there. In the WPF implementation of the RelayCommand or DelegateCommand classes there is an event handler like this: /// <summary> /// Occurs whenever the state of the application changes such that the result of a call to <see cref="CanExecute"/> may return a different value. /// </summary> public event EventHandler CanExecuteChanged { add { CommandManager.RequerySuggested += value; } remove { CommandManager.RequerySuggested -= value; } } but in the Silverlight subset of namespaces there is no CommandManager class, and this is where I'm stuck. I haven't yet found a workaround for this in the MVVM adaptations for Silverlight (Prism is still too complex for me). The various simple HelloWorldMVVM apps don't deal with it at all. Thanks in advance, and sorry for my English -)
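
    A sketch of the usual workaround as I understand it (my adaptation, not from the post): without CommandManager, the command owns its CanExecuteChanged event and the view model raises it explicitly whenever the state that CanExecute depends on has changed.

        using System;
        using System.Windows.Input;

        public class RelayCommand : ICommand
        {
            private readonly Action<object> _execute;
            private readonly Predicate<object> _canExecute;

            public RelayCommand(Action<object> execute) : this(execute, null) { }

            public RelayCommand(Action<object> execute, Predicate<object> canExecute)
            {
                if (execute == null) throw new ArgumentNullException("execute");
                _execute = execute;
                _canExecute = canExecute;
            }

            public event EventHandler CanExecuteChanged;

            public bool CanExecute(object parameter)
            {
                return _canExecute == null || _canExecute(parameter);
            }

            public void Execute(object parameter)
            {
                _execute(parameter);
            }

            // The view model calls this when the result of CanExecute may have changed,
            // taking over the role CommandManager.RequerySuggested plays in WPF.
            public void RaiseCanExecuteChanged()
            {
                EventHandler handler = CanExecuteChanged;
                if (handler != null) handler(this, EventArgs.Empty);
            }
        }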

    Read the article

  • When using delegates, need better way to do sequential processing

    - by Padawan
    I have a class WebServiceCaller that uses NSURLConnection to make asynchronous calls to a web service. The class provides a delegate property and when the web service call is done, it calls a method webServiceDoneWithXXX on the delegate. There are several web service methods that can be called, two of which are say GetSummary and GetList. The classes that use WebServiceCaller initially need both the summary and list so they are written like this: -(void)getAllData { [webServiceCaller getSummary]; } -(void)webServiceDoneWithGetSummary { [webServiceCaller getList]; } -(void)webServiceDoneWithGetList { ... } This works but there are at least two problems: The calls are split across delegate methods so it's hard to see the sequence at a glance but more important it's hard to control or modify the sequence. Sometimes I want to call just GetSummary and not also GetList so I would then have to use an ugly class-level state variable that tells webServiceDoneWithGetSummary whether to call GetList or not. Assume that GetList cannot be done until GetSummary completes and returns some data which is used as input to GetList. Is there a better way to handle this and still get asynchronous calls? Update based on Matt Long's answer: Using notifications instead of a delegate, it looks like I can solve problem #2 by setting a different selector depending on whether I want the full sequence (GetSummary+GetList) or just GetSummary. Both observers would still use the same notification name when calling GetSummary. I would have to write two separate methods to handle GetSummaryDone instead of using a single delegate method (where I would have needed some class-level variable to tell whether to then call GetList). -(void)getAllData { [[NSNotificationCenter defaultCenter] addObserver:self              selector:@selector(getSummaryDoneAndCallGetList:)                  name:kGetSummaryDidFinish object:nil];     [webServiceCaller getSummary]; } -(void)getSummaryDoneAndCallGetList { [NSNotificationCenter removeObserver] //process summary data [[NSNotificationCenter defaultCenter] addObserver:self              selector:@selector(getListDone:)                  name:kGetListDidFinish object:nil];     [webServiceCaller getList]; } -(void)getListDone { [NSNotificationCenter removeObserver] //process list data } -(void)getJustSummaryData { [[NSNotificationCenter defaultCenter] addObserver:self              selector:@selector(getJustSummaryDone:) //different selector but                  name:kGetSummaryDidFinish object:nil]; //same notification name     [webServiceCaller getSummary]; } -(void)getJustSummaryDone { [NSNotificationCenter removeObserver] //process summary data } I haven't actually tried this yet. It seems better than having state variables and if-then statements but you have to write more methods. I still don't see a solution for problem 1.

    Read the article

  • What's the purpose of GC.SuppressFinalize(this) in Dispose() method?

    - by mr.b
    I have code that looks like this: /// <summary> /// Dispose of the instance /// </summary> public void Dispose() { if (_instance != null) { _instance = null; // Call GC.SupressFinalize to take this object off the finalization // queue and prevent finalization code for this object from // executing a second time. GC.SuppressFinalize(this); } } Although there is a comment that explains purpose of that GC-related call, I still don't understand why it's there. Isn't object destined for garbage collection once all instances cease from existence (like, when used in using() block)? What's the use case scenario where this would play important role? Thanks!
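
    For context, a generic sketch of the full dispose pattern (not the poster's class; the unmanaged handle is hypothetical): GC.SuppressFinalize only matters when the type also declares a finalizer, and it tells the GC that the finalizer no longer needs to run because Dispose has already cleaned up.

        using System;

        public class ResourceHolder : IDisposable
        {
            private IntPtr _unmanagedHandle;   // stand-in for some unmanaged resource
            private bool _disposed;

            public void Dispose()
            {
                Dispose(true);
                GC.SuppressFinalize(this);     // the finalizer below no longer needs to run
            }

            protected virtual void Dispose(bool disposing)
            {
                if (_disposed) return;
                if (disposing)
                {
                    // release managed resources here
                }
                // release unmanaged resources here
                _unmanagedHandle = IntPtr.Zero;
                _disposed = true;
            }

            ~ResourceHolder()
            {
                Dispose(false);                // runs only if Dispose was never called
            }
        }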

    Read the article

  • Embedding mercurial revision information in Visual Studio c# projects automatically

    - by Mark Booth
    Original Problem In building our projects, I want the mercurial id of each repository to be embedded within the product(s) of that repository (the library, application or test application). I find it makes it so much easier to debug an application being run by customers 8 timezones away if you know precisely what went into building the particular version of the application they are using. As such, every project (application or library) in our systems implements a way of getting at the associated revision information. I also find it very useful to be able to see if an application has been compiled with clean (un-modified) changesets from the repository. 'Hg id' usefully appends a + to the changeset id when there are uncommitted changes in a repository, so this allows us to easily see if people are running a clean or a modified version of the code. My current solution is detailed below, and fulfills the basic requirements, but there are a number of problems with it. Current Solution At the moment, to each and every Visual Studio solution, I add the following "Pre-build event command line" commands: cd $(ProjectDir) HgID I also add an HgID.bat file to the Project directory: @echo off type HgId.pre > HgId.cs For /F "delims=" %%a in ('hg id') Do <nul >>HgId.cs set /p = @"%%a" echo ; >> HgId.cs echo } >> HgId.cs echo } >> HgId.cs along with an HgId.pre file, which is defined as: namespace My.Namespace { /// <summary> Auto generated Mercurial ID class. </summary> internal class HgID { /// <summary> Mercurial version ID [+ is modified] [Named branch]</summary> public const string Version = When I build my application, the pre-build event is triggered on all libraries, creating a new HgId.cs file (which is not kept under revision control) and causing the library to be re-compiled with the new 'hg id' string in 'Version'. Problems with the current solution The main problem is that since the HgId.cs is re-created at each pre-build, every time we need to compile anything, all projects in the current solution are re-compiled. Since we want to be able to easily debug into our libraries, we usually keep many libraries referenced in our main application solution. This can result in build times which are significantly longer than I would like. Ideally I would like the libraries to compile only if the contents of the HgId.cs file have actually changed, as opposed to having been re-created with exactly the same contents. The second problem with this method is its dependence on specific behaviour of the Windows shell. I've already had to modify the batch file several times, since the original worked under XP but not Vista, the next version worked under Vista but not XP and finally I managed to make it work with both. Whether it will work with Windows 7 however is anyone's guess and as time goes on, I see it more likely that contractors will expect to be able to build our apps on their Windows 7 boxen. Finally, I have an aesthetic problem with this solution: batch files and bodged-together template files feel like the wrong way to do this. My actual questions How would you solve/how are you solving the problem I'm trying to solve? What better options are out there than what I'm currently doing? Rejected Solutions to these problems Before I implemented the current solution, I looked at Mercurial's Keyword extension, since it seemed like the obvious solution. However the more I looked at it and read people's opinions, the more I came to the conclusion that it wasn't the right thing to do.
    I also remember the problems that keyword substitution has caused me in projects at previous companies (just the thought of ever having to use Source Safe again fills me with a feeling of dread *8'). Also, I don't particularly want to have to enable Mercurial extensions to get the build to complete. I want the solution to be self-contained, so that it isn't easy for the application to be accidentally compiled without the embedded version information just because an extension isn't enabled or the right helper software hasn't been installed. I also thought of writing this in a better scripting language, one where I would only write the HgId.cs file if the content had actually changed, but all of the options I could think of would require my co-workers, contractors and possibly customers to have to install software they might not otherwise want (for example cygwin). Any other options people can think of would be appreciated. Update Partial solution Having played around with it for a while, I've managed to get the HgId.bat file to only overwrite the HgId.cs file if it changes: @echo off type HgId.pre > HgId.cst For /F "delims=" %%a in ('hg id') Do <nul >>HgId.cst set /p = @"%%a" echo ; >> HgId.cst echo } >> HgId.cst echo } >> HgId.cst fc HgId.cs HgId.cst >NUL if %errorlevel%==0 goto :ok copy HgId.cst HgId.cs :ok del HgId.cst Problems with this solution Even though HgId.cs is no longer being re-created every time, Visual Studio still insists on compiling everything every time. I've tried looking for solutions and tried checking "Only build startup projects and dependencies on Run" in Tools|Options|Projects and Solutions|Build and Run but it makes no difference. The second problem also remains, and now I have no way to test if it will work with Vista, since that contractor is no longer with us. If anyone can test this batch file on a Windows 7 and/or Vista box, I would appreciate hearing how it went. Finally, my aesthetic problem with this solution is even stronger than it was before, since the batch file is more complex and thus there is now more to go wrong. If you can think of any better solution, I would love to hear about them.
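
    One more option, sketched below (my suggestion, not something from the thread): replace HgId.bat with a tiny C# pre-build console tool. It runs 'hg id', assembles the file from HgId.pre, and rewrites HgId.cs only when the content has actually changed, which sidesteps the cmd.exe quirks across Windows versions. (Whether Visual Studio then skips recompiling the unchanged projects still depends on the pre-build event itself not forcing a rebuild.)

        using System;
        using System.Diagnostics;
        using System.IO;

        class GenerateHgId
        {
            static int Main()
            {
                var psi = new ProcessStartInfo("hg", "id")
                {
                    RedirectStandardOutput = true,
                    UseShellExecute = false,
                    CreateNoWindow = true
                };

                string hgId;
                using (Process hg = Process.Start(psi))
                {
                    hgId = hg.StandardOutput.ReadToEnd().Trim();
                    hg.WaitForExit();
                    if (hg.ExitCode != 0) return hg.ExitCode;
                }

                // Same shape the batch file produces: template + @"<id>"; plus closing braces.
                string newContent = File.ReadAllText("HgId.pre")
                    + "@\"" + hgId + "\";" + Environment.NewLine
                    + "    }" + Environment.NewLine
                    + "}" + Environment.NewLine;

                // Only touch HgId.cs when the id has actually changed.
                if (!File.Exists("HgId.cs") || File.ReadAllText("HgId.cs") != newContent)
                    File.WriteAllText("HgId.cs", newContent);

                return 0;
            }
        }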

    Read the article

< Previous Page | 47 48 49 50 51 52 53 54 55 56 57 58  | Next Page >